JSON datatype support #2558

Status: Open. Wants to merge 54 commits into base: main. Changes shown from 18 commits.

Commits (54)
17e21c1
JSON feature extension
lilgreenbird Nov 28, 2024
b480490
Added JSON data type support in DataTypes
Dec 4, 2024
1f2b95b
JSON strategy in DTV
Dec 10, 2024
e102633
Fix Json character encoding and JSON constants.
Dec 18, 2024
d994fec
Added JSON datatype test case
Dec 18, 2024
c164ef9
Added JSON datatype test cases in ResultSet, TVP and Regression scena…
Dec 18, 2024
f9db4f7
Added JSON datatype support
Dec 19, 2024
dff9280
Added JSON support in IOBuffer.java
Dec 19, 2024
0e7a053
JSON datatype support: SQLServerCallableStatement
Ananya2 Jan 3, 2025
a8ebfd3
Updated callableStatementTest.java
Ananya2 Jan 3, 2025
952e37e
Merge branch 'main' into user/divang/json-datatype-support
Jan 9, 2025
44a185b
Added test cases for the TVP type in Callable Statements.
Ananya2 Jan 10, 2025
dc53208
Fixed JSON op.execute method mapping
Jan 22, 2025
508a1b9
Merge branch 'user/divang/json-datatype-support' of https://github.co…
Jan 22, 2025
fd21d1a
Fixed Stored procedure output param TDS metadata and fixed bulk copy t…
Jan 29, 2025
07bea45
Fixed BulkCopy JSON datatype issue.
Feb 11, 2025
afb3753
Removed duplicate code and used existing method to solve Bulk copy iss…
Feb 11, 2025
9c4a401
Added test cases for Bulk copy
Feb 11, 2025
961f557
Enhanced JSON datatype regression test
Feb 12, 2025
a9042bf
Fixed formatting in dtv.java
Feb 12, 2025
a719e0a
Removed additional empty line in dtv.java
Feb 12, 2025
9bac0eb
Added JSON metadata test case
Feb 12, 2025
9bf9ffa
Merge remote-tracking branch 'origin/main' into user/divang/json-data…
Feb 12, 2025
cd28b53
Added JSON support for BulkCopyCSV, included test cases to validate i…
Ananya2 Feb 14, 2025
b34bc80
Added test for Bulk Copy
muskan124947 Feb 17, 2025
6fe9854
Handled review comment: removed JSON type in Textual category and pre…
Feb 17, 2025
46b87a1
Merge branch 'user/divang/json-datatype-support' of https://github.co…
Feb 17, 2025
f1b305b
Add test for datatype conversions
muskan124947 Feb 17, 2025
3c00924
Removed local formatting done by default
muskan124947 Feb 17, 2025
fbec8a1
Validated nested JSON for bulkCopyCSV()
Ananya2 Feb 17, 2025
58f0d25
Merge branch 'user/divang/json-datatype-support' of https://github.co…
Ananya2 Feb 17, 2025
7e8b1af
Enhance parseString() to support nested JSON and multiple key-value p…
Ananya2 Feb 18, 2025
ac68998
Fixed JSON displaySize and precision value in TypeInfo
Feb 18, 2025
20d5b17
Merge branch 'user/divang/json-datatype-support' of https://github.co…
Feb 18, 2025
b67553e
Fixed JSON datatype setObject issue
Feb 20, 2025
22b230f
Updated regression test case for JSON datatype: mix of addBatch, exec…
Feb 20, 2025
04cadb8
Added more metadata validation check for JSON datatype
Feb 20, 2025
0b6b392
Updated as per the review comment to validate the getString() result …
Ananya2 Feb 20, 2025
c69dfb1
Updated and added more test scenarios for Bulk Copy
muskan124947 Feb 20, 2025
81a19ed
Added JSONFunctionTest to test JSON functions
muskan124947 Feb 20, 2025
3d07480
Optimized string parsing to avoid unnecessary allocations.
Ananya2 Feb 21, 2025
e2e4748
formatted as per the older version
Ananya2 Feb 21, 2025
8384c9d
Improve CSV parsing by correctly detecting JSON fields with double qu…
Ananya2 Feb 24, 2025
ccb6256
Remove reference to non-existent test file BulkCopyCSVTestInputNoColu…
Ananya2 Feb 24, 2025
31bbf91
Updated JSONFunction Test class
muskan124947 Feb 24, 2025
630c99e
Adding tag JSONTest for some test cases
muskan124947 Feb 25, 2025
30aec56
Added and validated test cases for JSON support in JOIN queries, glob…
Ananya2 Feb 26, 2025
56233bf
Added test case to validate UDF returning JSON output; included metad…
Ananya2 Feb 26, 2025
4c4f064
Added tests for OPENJSON()
muskan124947 Feb 27, 2025
6412517
Added tests for OPENJSON() function
muskan124947 Feb 27, 2025
530bd94
Added tests to handle large JSON files up to 2GB
muskan124947 Feb 27, 2025
6cc9392
Add test cases to perform operations using Prepared Statement and use…
muskan124947 Feb 27, 2025
bef793e
Updated test to insert large json data and verify no packet loss is t…
muskan124947 Mar 3, 2025
60b602e
Added JSONTest tag to all the test cases
muskan124947 Mar 7, 2025
2 changes: 1 addition & 1 deletion src/main/java/com/microsoft/sqlserver/jdbc/Column.java
@@ -336,7 +336,7 @@ else if (jdbcType.isBinary()) {

// Update of Unicode SSType from textual JDBCType: Use Unicode.
if ((SSType.NCHAR == ssType || SSType.NVARCHAR == ssType || SSType.NVARCHARMAX == ssType
|| SSType.NTEXT == ssType || SSType.XML == ssType || SSType.JSON == ssType) &&

(JDBCType.CHAR == jdbcType || JDBCType.VARCHAR == jdbcType || JDBCType.LONGVARCHAR == jdbcType
|| JDBCType.CLOB == jdbcType)) {
40 changes: 28 additions & 12 deletions src/main/java/com/microsoft/sqlserver/jdbc/DataTypes.java
@@ -65,6 +65,7 @@ enum TDSType {
NTEXT(0x63), // 99
UDT(0xF0), // -16
XML(0xF1), // -15
JSON(0xF4), // -12

// LONGLEN types
SQL_VARIANT(0x62); // 98
@@ -148,7 +149,8 @@ enum SSType {
XML(Category.XML, "xml", JDBCType.LONGNVARCHAR),
TIMESTAMP(Category.TIMESTAMP, "timestamp", JDBCType.BINARY),
GEOMETRY(Category.UDT, "geometry", JDBCType.GEOMETRY),
GEOGRAPHY(Category.UDT, "geography", JDBCType.GEOGRAPHY),
JSON(Category.JSON, "json", JDBCType.JSON);

final Category category;
private final String name;
@@ -204,7 +206,8 @@ enum Category {
TIMESTAMP,
UDT,
SQL_VARIANT,
XML,
JSON;

private static final Category[] VALUES = values();
}
@@ -266,7 +269,12 @@ enum GetterConversion {

SQL_VARIANT(SSType.Category.SQL_VARIANT, EnumSet.of(JDBCType.Category.CHARACTER, JDBCType.Category.SQL_VARIANT,
JDBCType.Category.NUMERIC, JDBCType.Category.DATE, JDBCType.Category.TIME, JDBCType.Category.BINARY,
JDBCType.Category.TIMESTAMP, JDBCType.Category.NCHARACTER, JDBCType.Category.GUID)),

JSON(SSType.Category.JSON, EnumSet.of(JDBCType.Category.CHARACTER, JDBCType.Category.LONG_CHARACTER,
JDBCType.Category.CLOB, JDBCType.Category.NCHARACTER, JDBCType.Category.LONG_NCHARACTER,
JDBCType.Category.NCLOB, JDBCType.Category.BINARY, JDBCType.Category.LONG_BINARY,
JDBCType.Category.BLOB, JDBCType.Category.JSON));

private final SSType.Category from;
private final EnumSet<JDBCType.Category> to;
@@ -452,7 +460,9 @@ JDBCType getJDBCType(SSType ssType, JDBCType jdbcTypeFromApp) {
case NTEXT:
jdbcType = JDBCType.LONGVARCHAR;
break;

case JSON:
jdbcType = JDBCType.JSON;
break;
case XML:
default:
jdbcType = JDBCType.LONGVARBINARY;
@@ -673,8 +683,9 @@ enum JDBCType {
SQL_VARIANT(Category.SQL_VARIANT, microsoft.sql.Types.SQL_VARIANT, Object.class.getName()),
GEOMETRY(Category.GEOMETRY, microsoft.sql.Types.GEOMETRY, Object.class.getName()),
GEOGRAPHY(Category.GEOGRAPHY, microsoft.sql.Types.GEOGRAPHY, Object.class.getName()),
LOCALDATETIME(Category.TIMESTAMP, java.sql.Types.TIMESTAMP, LocalDateTime.class.getName()),
JSON(Category.JSON, microsoft.sql.Types.JSON, Object.class.getName());

final Category category;
private final int intValue;
private final String className;
@@ -722,7 +733,8 @@ enum Category {
GUID,
SQL_VARIANT,
GEOMETRY,
GEOGRAPHY,
JSON;

private static final Category[] VALUES = values();
}
@@ -733,7 +745,7 @@ enum SetterConversion {
JDBCType.Category.TIME, JDBCType.Category.TIMESTAMP, JDBCType.Category.DATETIMEOFFSET,
JDBCType.Category.CHARACTER, JDBCType.Category.LONG_CHARACTER, JDBCType.Category.NCHARACTER,
JDBCType.Category.LONG_NCHARACTER, JDBCType.Category.BINARY, JDBCType.Category.LONG_BINARY,
JDBCType.Category.GUID, JDBCType.Category.SQL_VARIANT, JDBCType.Category.JSON)),

LONG_CHARACTER(JDBCType.Category.LONG_CHARACTER, EnumSet.of(JDBCType.Category.CHARACTER,
JDBCType.Category.LONG_CHARACTER, JDBCType.Category.NCHARACTER, JDBCType.Category.LONG_NCHARACTER,
@@ -795,7 +807,8 @@ enum SetterConversion {

GEOMETRY(JDBCType.Category.GEOMETRY, EnumSet.of(JDBCType.Category.GEOMETRY)),

GEOGRAPHY(JDBCType.Category.GEOGRAPHY, EnumSet.of(JDBCType.Category.GEOGRAPHY)),
JSON(JDBCType.Category.JSON, EnumSet.of(JDBCType.Category.JSON));

private final JDBCType.Category from;
private final EnumSet<JDBCType.Category> to;
@@ -832,7 +845,7 @@ enum UpdaterConversion {
SSType.Category.DATETIMEOFFSET, SSType.Category.CHARACTER, SSType.Category.LONG_CHARACTER,
SSType.Category.NCHARACTER, SSType.Category.LONG_NCHARACTER, SSType.Category.XML,
SSType.Category.BINARY, SSType.Category.LONG_BINARY, SSType.Category.UDT, SSType.Category.GUID,
SSType.Category.TIMESTAMP, SSType.Category.SQL_VARIANT, SSType.Category.JSON)),

LONG_CHARACTER(JDBCType.Category.LONG_CHARACTER, EnumSet.of(SSType.Category.CHARACTER,
SSType.Category.LONG_CHARACTER, SSType.Category.NCHARACTER, SSType.Category.LONG_NCHARACTER,
@@ -895,7 +908,9 @@ enum UpdaterConversion {
SSType.Category.DATETIMEOFFSET, SSType.Category.CHARACTER, SSType.Category.LONG_CHARACTER,
SSType.Category.NCHARACTER, SSType.Category.LONG_NCHARACTER)),

SQL_VARIANT(JDBCType.Category.SQL_VARIANT, EnumSet.of(SSType.Category.SQL_VARIANT)),

JSON(JDBCType.Category.JSON, EnumSet.of(SSType.Category.JSON));

private final JDBCType.Category from;
private final EnumSet<SSType.Category> to;
@@ -970,7 +985,7 @@ boolean isBinary() {
* @return true if the JDBC type is textual
*/
private final static EnumSet<Category> textualCategories = EnumSet.of(Category.CHARACTER, Category.LONG_CHARACTER,
Category.CLOB, Category.NCHARACTER, Category.LONG_NCHARACTER, Category.NCLOB, Category.JSON); //FIXME: JSON is textual?

boolean isTextual() {
return textualCategories.contains(category);
@@ -997,6 +1012,7 @@ int asJavaSqlType() {
return java.sql.Types.CHAR;
case NVARCHAR:
case SQLXML:
case JSON:
return java.sql.Types.VARCHAR;
case LONGNVARCHAR:
return java.sql.Types.LONGVARCHAR;
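The asJavaSqlType() hunk above is worth calling out: the new JSON type is surfaced to applications through java.sql.Types as plain VARCHAR, alongside NVARCHAR and SQLXML. A minimal sketch of that mapping; the JSON constant below is a hypothetical stand-in, since the driver defines the real one in microsoft.sql.Types rather than java.sql.Types:

```java
import java.sql.Types;

public class JsonTypeMapping {
    // Hypothetical stand-in for microsoft.sql.Types.JSON; the real constant
    // is supplied by the driver, not by java.sql.Types.
    static final int JSON = -156;

    // Mirrors the asJavaSqlType() switch above: NVARCHAR, SQLXML, and the
    // new JSON type are all reported to applications as generic VARCHAR.
    static int asJavaSqlType(int driverType) {
        switch (driverType) {
            case Types.NVARCHAR:
            case Types.SQLXML:
            case JSON:
                return Types.VARCHAR;
            case Types.LONGNVARCHAR:
                return Types.LONGVARCHAR;
            default:
                return driverType;
        }
    }
}
```

This matters for portable code: metadata calls will report the column as VARCHAR, so a JSON-unaware application can still read the value with getString().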
15 changes: 15 additions & 0 deletions src/main/java/com/microsoft/sqlserver/jdbc/IOBuffer.java
@@ -169,6 +169,11 @@ final class TDS {
static final byte TDS_FEATURE_EXT_AZURESQLDNSCACHING = 0x0B;
static final byte TDS_FEATURE_EXT_SESSIONRECOVERY = 0x01;

// JSON support
static final byte TDS_FEATURE_EXT_JSONSUPPORT = 0x0D;
static final byte JSONSUPPORT_NOT_SUPPORTED = 0x00;
static final byte MAX_JSONSUPPORT_VERSION = 0x01;

static final int TDS_TVP = 0xF3;
static final int TVP_ROW = 0x01;
static final int TVP_NULL_TOKEN = 0xFFFF;
@@ -237,6 +242,9 @@ static final String getTokenName(int tdsTokenType) {
return "TDS_FEATURE_EXT_AZURESQLDNSCACHING (0x0B)";
case TDS_FEATURE_EXT_SESSIONRECOVERY:
return "TDS_FEATURE_EXT_SESSIONRECOVERY (0x01)";
case TDS_FEATURE_EXT_JSONSUPPORT:
return "TDS_FEATURE_EXT_JSONSUPPORT (0x0D)";

default:
return "unknown token (0x" + Integer.toHexString(tdsTokenType).toUpperCase() + ")";
}
@@ -4856,6 +4864,11 @@ void writeRPCStringUnicode(String sValue) throws SQLServerException {
writeRPCStringUnicode(null, sValue, false, null);
}

void writeRPCJson(String sName, String sValue, boolean bOut) throws SQLServerException {
writeRPCNameValType(sName, bOut, TDSType.JSON);
writeLong(0xFFFFFFFFFFFFFFFFL);
}

/**
* Writes a string value as Unicode for RPC
*
@@ -5241,6 +5254,7 @@ private void writeInternalTVPRowValues(JDBCType jdbcType, String currentColumnSt
case LONGVARCHAR:
case LONGNVARCHAR:
case SQLXML:
case JSON:
isShortValue = (2L * columnPair.getValue().precision) <= DataTypes.SHORT_VARTYPE_MAX_BYTES;
isNull = (null == currentColumnStringValue);
dataLength = isNull ? 0 : currentColumnStringValue.length() * 2;
@@ -5476,6 +5490,7 @@ void writeTVPColumnMetaData(TVP value) throws SQLServerException {
case LONGVARCHAR:
case LONGNVARCHAR:
case SQLXML:
case JSON:
writeByte(TDSType.NVARCHAR.byteValue());
isShortValue = (2L * pair.getValue().precision) <= DataTypes.SHORT_VARTYPE_MAX_BYTES;
// Use PLP encoding on Yukon and later with long values
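Two details in the IOBuffer.java changes are easy to misread. First, TDSType.JSON is declared as 0xF4, which wraps to -12 when stored in a signed byte; that is what the `// -12` comment next to the enum constant refers to. Second, writeRPCJson() follows the name/type header with an 8-byte value of all ones, which appears to be the TDS PLP null/unknown-length marker (-1 as a signed long). A small standalone check of both, using the constants from the diff:

```java
public class TdsJsonConstants {
    // Constants copied from the IOBuffer diff above.
    static final byte TDS_FEATURE_EXT_JSONSUPPORT = 0x0D;
    static final byte MAX_JSONSUPPORT_VERSION = 0x01;

    // The value writeRPCJson() writes after the type header: all bits set,
    // which in TDS PLP encoding appears to be the null/unknown-length marker.
    static final long PLP_MARKER = 0xFFFFFFFFFFFFFFFFL;

    // 0xF4 does not fit in a signed byte, so the cast wraps to -12,
    // matching the "// -12" comment beside JSON(0xF4) in TDSType.
    static byte jsonTdsTypeByte() {
        return (byte) 0xF4;
    }
}
```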
4 changes: 3 additions & 1 deletion src/main/java/com/microsoft/sqlserver/jdbc/Parameter.java
@@ -899,7 +899,9 @@ private void setTypeDefinition(DTV dtv) {
case SQLXML:
param.typeDefinition = SSType.XML.toString();
break;

case JSON:
param.typeDefinition = SSType.JSON.toString();
break;
case TVP:
// definition should contain the TVP name and the keyword READONLY
String schema = param.schemaName;
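The Parameter.java change means a parameter bound with the JSON JDBC type is declared to the server with the type name json, the same way SQLXML is declared as xml. A hedged sketch of that name mapping; the enum here is illustrative, not the driver's actual JDBCType:

```java
public class JsonParamTypeDefinition {
    // Illustrative stand-in for the driver's JDBCType enum.
    enum JdbcKind { SQLXML, JSON }

    // Mirrors the setTypeDefinition() cases above: each JDBC-side type
    // selects the SQL Server type name used in the parameter declaration.
    static String typeDefinition(JdbcKind kind) {
        switch (kind) {
            case SQLXML:
                return "xml";
            case JSON:
                return "json";
            default:
                throw new IllegalArgumentException("unhandled: " + kind);
        }
    }
}
```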
@@ -545,6 +545,10 @@ else if ((null != columnNames) && (columnNames.length >= positionInSource))
columnMetadata.put(positionInSource,
new ColumnMetadata(colName, java.sql.Types.LONGNVARCHAR, precision, scale, dateTimeFormatter));
break;
case microsoft.sql.Types.JSON:
columnMetadata.put(positionInSource,
new ColumnMetadata(colName, microsoft.sql.Types.JSON, precision, scale, dateTimeFormatter));
break;
/*
* Redirecting Float as Double based on data type mapping
* https://msdn.microsoft.com/library/ms378878%28v=sql.110%29.aspx
22 changes: 15 additions & 7 deletions src/main/java/com/microsoft/sqlserver/jdbc/SQLServerBulkCopy.java
@@ -780,7 +780,8 @@ private void writeColumnMetaDataColumnData(TDSWriter tdsWriter, int idx) throws
collation = connection.getDatabaseCollation();

if ((java.sql.Types.NCHAR == bulkJdbcType) || (java.sql.Types.NVARCHAR == bulkJdbcType)
|| (java.sql.Types.LONGNVARCHAR == bulkJdbcType)
|| (microsoft.sql.Types.JSON == bulkJdbcType)) {
isStreaming = (DataTypes.SHORT_VARTYPE_MAX_CHARS < bulkPrecision)
|| (DataTypes.SHORT_VARTYPE_MAX_CHARS < destPrecision);
} else {
@@ -837,7 +838,8 @@ else if (((java.sql.Types.CHAR == bulkJdbcType) || (java.sql.Types.VARCHAR == bu
int baseDestPrecision = destCryptoMeta.baseTypeInfo.getPrecision();

if ((java.sql.Types.NCHAR == baseDestJDBCType) || (java.sql.Types.NVARCHAR == baseDestJDBCType)
|| (java.sql.Types.LONGNVARCHAR == baseDestJDBCType)
|| (microsoft.sql.Types.JSON == baseDestJDBCType))
isStreaming = (DataTypes.SHORT_VARTYPE_MAX_CHARS < baseDestPrecision);
else
isStreaming = (DataTypes.SHORT_VARTYPE_MAX_BYTES < baseDestPrecision);
@@ -997,6 +999,7 @@ private void writeTypeInfo(TDSWriter tdsWriter, int srcJdbcType, int srcScale, i

case java.sql.Types.LONGVARCHAR:
case java.sql.Types.VARCHAR: // 0xA7
case microsoft.sql.Types.JSON:
if (unicodeConversionRequired(srcJdbcType, destSSType)) {
tdsWriter.writeByte(TDSType.NVARCHAR.byteValue());
if (isStreaming) {
@@ -1025,7 +1028,6 @@ private void writeTypeInfo(TDSWriter tdsWriter, int srcJdbcType, int srcScale, i
}
collation.writeCollation(tdsWriter);
break;

case java.sql.Types.BINARY: // 0xAD
tdsWriter.writeByte(TDSType.BIGBINARY.byteValue());
tdsWriter.writeShort((short) (srcPrecision));
@@ -1127,7 +1129,7 @@ private void writeTypeInfo(TDSWriter tdsWriter, int srcJdbcType, int srcScale, i
case microsoft.sql.Types.SQL_VARIANT: // 0x62
tdsWriter.writeByte(TDSType.SQL_VARIANT.byteValue());
tdsWriter.writeInt(TDS.SQL_VARIANT_LENGTH);
break;
default:
MessageFormat form = new MessageFormat(SQLServerException.getErrString("R_BulkTypeNotSupported"));
String unsupportedDataType = JDBCType.of(srcJdbcType).toString().toLowerCase(Locale.ENGLISH);
@@ -1470,6 +1472,8 @@ private String getDestTypeFromSrcType(int srcColIndx, int destColIndx,
}
case microsoft.sql.Types.SQL_VARIANT:
return SSType.SQL_VARIANT.toString();
case microsoft.sql.Types.JSON:
return SSType.JSON.toString();
default: {
MessageFormat form = new MessageFormat(SQLServerException.getErrString("R_BulkTypeNotSupported"));
Object[] msgArgs = {JDBCType.of(bulkJdbcType).toString().toLowerCase(Locale.ENGLISH)};
@@ -2090,6 +2094,7 @@ private void writeNullToTdsWriter(TDSWriter tdsWriter, int srcJdbcType,
case java.sql.Types.LONGVARCHAR:
case java.sql.Types.LONGNVARCHAR:
case java.sql.Types.LONGVARBINARY:
case microsoft.sql.Types.JSON:
if (isStreaming) {
tdsWriter.writeLong(PLPInputStream.PLP_NULL);
} else {
@@ -2322,6 +2327,7 @@ else if (null != sourceCryptoMeta) {
case java.sql.Types.LONGVARCHAR:
case java.sql.Types.CHAR: // Fixed-length, non-Unicode string data.
case java.sql.Types.VARCHAR: // Variable-length, non-Unicode string data.
case microsoft.sql.Types.JSON:
if (isStreaming) // PLP
{
// PLP_BODY rule in TDS
@@ -2460,7 +2466,6 @@ else if (null != sourceCryptoMeta) {
}
}
break;

case java.sql.Types.LONGVARBINARY:
case java.sql.Types.BINARY:
case java.sql.Types.VARBINARY:
@@ -2986,6 +2991,7 @@ private Object readColumnFromResultSet(int srcColOrdinal, int srcJdbcType, boole
case java.sql.Types.LONGNVARCHAR:
case java.sql.Types.NCHAR:
case java.sql.Types.NVARCHAR:
case microsoft.sql.Types.JSON:
// PLP if stream type and both the source and destination are not encrypted
// This is because AE does not support streaming types.
// Therefore an encrypted source or destination means the data must not actually be streaming data
@@ -3060,7 +3066,8 @@ private void writeColumn(TDSWriter tdsWriter, int srcColOrdinal, int destColOrdi
destPrecision = destColumnMetadata.get(destColOrdinal).precision;

if ((java.sql.Types.NCHAR == srcJdbcType) || (java.sql.Types.NVARCHAR == srcJdbcType)
|| (java.sql.Types.LONGNVARCHAR == srcJdbcType)) {
|| (java.sql.Types.LONGNVARCHAR == srcJdbcType)
|| (microsoft.sql.Types.JSON == srcJdbcType)) {
isStreaming = (DataTypes.SHORT_VARTYPE_MAX_CHARS < srcPrecision)
|| (DataTypes.SHORT_VARTYPE_MAX_CHARS < destPrecision);
} else {
@@ -3771,6 +3778,7 @@ void setDestinationTableMetadata(SQLServerResultSet rs) {
private boolean unicodeConversionRequired(int jdbcType, SSType ssType) {
return ((java.sql.Types.CHAR == jdbcType || java.sql.Types.VARCHAR == jdbcType
|| java.sql.Types.LONGNVARCHAR == jdbcType)
&& (SSType.NCHAR == ssType || SSType.NVARCHAR == ssType || SSType.NVARCHARMAX == ssType
|| SSType.JSON == ssType));
}
}
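The last hunk extends unicodeConversionRequired() so that non-Unicode source data (CHAR, VARCHAR, LONGNVARCHAR) bound for a json destination column is converted to Unicode, since JSON travels like the nvarchar family on the wire. A standalone sketch of the predicate as changed; the enums are illustrative stand-ins for the driver's JDBC and SQL Server types:

```java
import java.util.EnumSet;
import java.util.Set;

public class UnicodeConversionCheck {
    // Illustrative stand-ins for the driver's java.sql.Types values and SSType enum.
    enum JdbcType { CHAR, VARCHAR, LONGNVARCHAR, NCHAR }
    enum SsType { NCHAR, NVARCHAR, NVARCHARMAX, JSON, VARCHAR }

    private static final Set<JdbcType> NON_UNICODE_SOURCES =
            EnumSet.of(JdbcType.CHAR, JdbcType.VARCHAR, JdbcType.LONGNVARCHAR);
    private static final Set<SsType> UNICODE_DESTINATIONS =
            EnumSet.of(SsType.NCHAR, SsType.NVARCHAR, SsType.NVARCHARMAX, SsType.JSON);

    // Mirrors SQLServerBulkCopy.unicodeConversionRequired() after this PR:
    // a JSON destination now triggers the same conversion as the nvarchar family.
    static boolean unicodeConversionRequired(JdbcType jdbc, SsType ss) {
        return NON_UNICODE_SOURCES.contains(jdbc) && UNICODE_DESTINATIONS.contains(ss);
    }
}
```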