
[Snyk] Security upgrade org.apache.spark:spark-core_2.12 from 2.4.4 to 3.5.7#102

Open
snyk-io[bot] wants to merge 1 commit into master from snyk-fix-6db3ae4b7909d8656ec6d35a031881da

Conversation

snyk-io bot commented Mar 15, 2026


Snyk has created this PR to fix 1 vulnerability in the Maven dependencies of this project.

Snyk changed the following file(s):

  • pom.xml
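
The fix itself is a one-line version bump in pom.xml. A sketch of what the dependency entry should look like after the upgrade (the artifact coordinates and versions are taken from this PR; the surrounding pom structure is illustrative, not copied from the project):

```xml
<!-- Illustrative fragment of pom.xml after applying this PR -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.12</artifactId>
  <!-- was 2.4.4; upgraded to resolve SNYK-JAVA-ORGAPACHESPARK-15623151 -->
  <version>3.5.7</version>
</dependency>
```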

Vulnerabilities that will be fixed with an upgrade:

Issue:   Deserialization of Untrusted Data (high severity), SNYK-JAVA-ORGAPACHESPARK-15623151
Score:   243
Upgrade: org.apache.spark:spark-core_2.12 2.4.4 -> 3.5.7 (Major version upgrade; No Path Found; No Known Exploit)

Breaking Change Risk

Merge Risk: High

Notice: This assessment is enhanced by AI.


Important

  • Check the changes in this PR to ensure they won't cause issues with your project.
  • Max score is 1000. Note that the real score may have changed since the PR was raised.
  • This PR was automatically created by Snyk using the credentials of a real user.

Note: You are seeing this because you or someone else with access to this repository has authorized Snyk to open fix PRs.

For more information:
🧐 View latest project report
📜 Customise PR templates
🛠 Adjust project settings
📚 Read about Snyk's upgrade logic


Learn how to fix vulnerabilities with free interactive lessons:

🦉 Deserialization of Untrusted Data

snyk-io bot (Author) commented Mar 15, 2026

Merge Risk: High

This is a major version upgrade from Spark 2.4.4 to 3.5.7, which introduces significant breaking changes requiring code modifications, dependency updates, and thorough testing. This is a high-risk migration that cannot be treated as a simple drop-in replacement.

Key Breaking Changes

  • Scala 2.12 Requirement: Spark 3.x is built for Scala 2.12 (with Scala 2.13 builds available from Spark 3.2), whereas Spark 2.4 defaulted to Scala 2.11. This project already depends on the _2.12 artifact, but every Spark-facing dependency must also be compiled against a matching Scala version.
  • Runtime Environment Changes:
    • Support for Python 2.7, 3.4, and 3.5 was dropped in Spark 3.1. Support for Python 3.6 was dropped in Spark 3.2.
    • Java 8 versions older than 8u201 are no longer supported as of Spark 3.2. Spark 3.5 is compatible with Java 17.
    • The logging framework was migrated from Log4j 1.x to Log4j2 in Spark 3.3, which requires updating configuration files to the new format.
  • Spark SQL and DataFrame Behavior:
    • Date and Timestamp Parsing: Spark 3.0 switched to the Proleptic Gregorian calendar and enforces stricter parsing rules for date/timestamp strings. This is a common source of errors and may cause SparkUpgradeException. The behavior can be reverted by setting spark.sql.legacy.timeParserPolicy to LEGACY.
    • API Deprecations: The UserDefinedAggregateFunction (UDAF) API is deprecated and should be migrated to the Aggregator API.
    • UDF Null Handling: The behavior of untyped Scala UDFs with null inputs has changed. Instead of returning null, they may return the default value for the Java type (e.g., 0 for an Int).
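
For the Log4j 1.x to Log4j 2 migration noted above, configuration files must be rewritten in the Log4j 2 format. A minimal log4j2.properties sketch for console logging (the logger names and levels here are illustrative defaults, not taken from this project):

```properties
# log4j2.properties - minimal console setup (illustrative)
rootLogger.level = info
rootLogger.appenderRef.stdout.ref = console

appender.console.type = Console
appender.console.name = console
appender.console.target = SYSTEM_OUT
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```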
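
If the stricter Spark 3 parsing surfaces as SparkUpgradeException, the legacy behavior can be restored per session via the configuration key named above. A Scala sketch (requires a Spark 3.x runtime on the classpath, so it is not runnable standalone; the app name is hypothetical):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("legacy-time-parser-demo") // hypothetical name
  .master("local[*]")
  // Fall back to Spark 2.x-style date/timestamp parsing instead of
  // failing with SparkUpgradeException on patterns the new parser rejects.
  .config("spark.sql.legacy.timeParserPolicy", "LEGACY")
  .getOrCreate()

// The setting can also be toggled on an existing session:
spark.conf.set("spark.sql.legacy.timeParserPolicy", "LEGACY")
```

Treat LEGACY as a stopgap: the long-term fix is updating datetime patterns to the new parser's rules.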
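
A sketch of the recommended replacement for a deprecated UserDefinedAggregateFunction: a typed Aggregator, shown here as a simple Long sum (names are hypothetical; assumes Spark 3.x on the classpath):

```scala
import org.apache.spark.sql.{Encoder, Encoders, functions}
import org.apache.spark.sql.expressions.Aggregator

// Typed replacement for a UDAF that summed a Long column.
object LongSum extends Aggregator[Long, Long, Long] {
  def zero: Long = 0L                                // initial buffer value
  def reduce(buf: Long, value: Long): Long = buf + value
  def merge(b1: Long, b2: Long): Long = b1 + b2      // combine partial buffers
  def finish(reduction: Long): Long = reduction      // final result
  def bufferEncoder: Encoder[Long] = Encoders.scalaLong
  def outputEncoder: Encoder[Long] = Encoders.scalaLong
}

// Register for use from SQL or untyped DataFrames (Spark 3.0+):
// spark.udf.register("long_sum", functions.udaf(LongSum))
```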
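
The UDF null-handling change mainly bites UDFs declared over primitive parameters. A hedged Scala sketch of the safer pattern (assumes Spark 3.x; not runnable without a Spark session):

```scala
import org.apache.spark.sql.functions.udf

// Risky: with a primitive Int parameter, a null input cannot be
// represented and may surface as the type's default (0) instead of null.
val plusOne = udf((x: Int) => x + 1)

// Safer: accept Option[Int] so null inputs map to None and the
// result stays null, matching Spark 2.x expectations.
val plusOneSafe = udf((x: Option[Int]) => x.map(_ + 1))
```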

Notice 🤖: This content was augmented using artificial intelligence. AI-generated content may contain errors and should be reviewed for accuracy before use.

snyk-io bot (Author) commented Mar 15, 2026

Snyk checks have failed. 8 issues have been found so far.

Scan Engine            Critical  High  Medium  Low  Total
Open Source Security   0         8     0       0    8 issues
Licenses               0         0     0       0    0 issues

💻 Catch issues earlier using the plugins for VS Code, JetBrains IDEs, Visual Studio, and Eclipse.

