
Conversation

tianhanhu (Contributor)

What changes were proposed in this pull request?

In https://issues.apache.org/jira/browse/SPARK-43263, Spark upgraded Jackson to 2.15.0. This brought in a breaking change: Jackson now enforces a default maximum string length (StreamReadConstraints.DEFAULT_MAX_STRING_LEN, currently 20,000,000 characters), which the JSON data source picked up. Before the upgrade there was no limit, and the new limit could only be modified through the JSON option "maxStringLen".

This PR attempts to restore the old unlimited behavior by introducing a new JSON_MAX_STRING_LENGTH conf that sets the string length limit globally.
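As a usage sketch (the conf key and option name below are taken from this PR's diff; the path is illustrative), the new setting would let users raise the limit session-wide while the per-read option keeps precedence:

```scala
// Sketch against a live SparkSession; not a self-contained program.
// Raise the global default string length limit for JSON reads:
spark.conf.set("spark.sql.json.defaultMaxStringLength", Int.MaxValue.toString)

// A per-read option still overrides the session-wide default:
val df = spark.read
  .option("maxStringLen", Int.MaxValue) // takes precedence over the conf
  .json("/illustrative/path/data.json") // illustrative path
```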

Why are the changes needed?

To restore the string length behavior of the JSON connector from before the upgrade to Jackson 2.15.0.

Does this PR introduce any user-facing change?

Yes, it adds a new user-configurable conf for specifying the maximum string length for the JSON connector.

How was this patch tested?

New unit tests.

Was this patch authored or co-authored using generative AI tooling?

No.

@github-actions github-actions bot added the SQL label Jun 20, 2025
@tianhanhu (Contributor, PR author)

cc @cloud-fan

@HyukjinKwon HyukjinKwon changed the title [SPARK-52544] Allow configuring Json datasource string length limit through SQLConf [SPARK-52544][SQL] Allow configuring Json datasource string length limit through SQLConf Jun 23, 2025
buildConf("spark.sql.json.defaultMaxStringLength")
.doc("Global default maximum string length limit when reading JSON data. It is " +
"overridden if the JSON option maxStringLen is provided.")
.version("3.5.0")
Member (on .version("3.5.0")):

4.1.0

Contributor:

This needs to be backported to 3.5 to fix a regression; how about 3.5.7?

.get("maxStringLen")
.map(_.toInt)
.getOrElse(StreamReadConstraints.DEFAULT_MAX_STRING_LEN)        // before this PR
.getOrElse(SQLConf.get.getConf(SQLConf.JSON_MAX_STRING_LENGTH)) // after this PR
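The snippet above is plain Option fallback. A self-contained sketch of the same precedence logic (with an ordinary parameter standing in for the SQLConf lookup, which requires a live session):

```scala
// Stand-alone model of the option resolution in JSONOptions above:
// an explicit "maxStringLen" entry wins; otherwise the session-wide
// default applies (modeled here as a plain argument, not SQLConf).
def resolveMaxStringLen(options: Map[String, String], sessionDefault: Int): Int =
  options.get("maxStringLen").map(_.toInt).getOrElse(sessionDefault)

// Per-read option present: it takes precedence.
assert(resolveMaxStringLen(Map("maxStringLen" -> "1000"), 20000000) == 1000)
// No option: fall back to the session default.
assert(resolveMaxStringLen(Map.empty, 20000000) == 20000000)
```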
Copy link
Contributor

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

can we follow ParquetOptions and pass a SQLConf instance to construct JSONOptions?


test("Test JSON data source maxStringLen option") {
// Create a JSON string that is way longer than DEFAULT_MAX_STRING_LEN.
val longStringSize = StreamReadConstraints.DEFAULT_MAX_STRING_LEN * 10
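For context, a hedged sketch of how the body of such a test might continue (withTempPath and toDF come from Spark's SQL test utilities; the names and exact assertion are illustrative, not the PR's actual test):

```scala
// Illustrative continuation only; relies on Spark's SQL test harness.
val longString = "a" * longStringSize
withTempPath { path =>
  // Write one JSON record whose string field exceeds the Jackson default.
  Seq(s"""{"str": "$longString"}""").toDF("value")
    .write.text(path.getAbsolutePath)

  // Reading with a raised maxStringLen should succeed.
  val df = spark.read
    .option("maxStringLen", longStringSize + 1)
    .json(path.getAbsolutePath)
  assert(df.select("str").head().getString(0).length == longStringSize)
}
```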
Contributor:

does * 2 work?

Reply:

If * 10 works, * 2 will; it's a test, not a setting of what the actual new max will be.

@DapengShi:

GetJsonObject is also affected by Jackson 2.15, and it does not use JSONOptions, so can we fix all affected JSON-related code in this PR? I think we may also need to improve SharedFactory in jsonExpressions.scala.

@cdagraca:

It looks like FasterXML has increased the limit on their end (FasterXML/jackson-core#1014, FasterXML/jackson-core#1019). Could updating to a more recent version of that library be a simpler fix?
