[Announcement]: master branch will move to support Spark 3.0.
The master branch is planned to take a dependency on Spark 3.0 after Hyperspace 0.5 is released; this means Spark 2.4 will be supported by the Hyperspace 0.5.x line. branch-0.5 will be created at that time, and any critical bugs will be back-ported to it, with new patch versions (e.g., 0.5.1, 0.5.2) released as necessary. However, there is no plan to back-port new features.
Please leave comments if you have any concerns.
Issue Analytics
- State:
- Created 2 years ago
- Comments: 6 (5 by maintainers)
If the changes between Spark 2 and 3 are not significant in terms of API, public or internal, then we can go with option 7, producing a JAR for each Spark version. I'll take a look at whether this is feasible.
We will go with option 7, thanks to @clee704 (#421). (I think we need to change the JAR names to conform to the Maven spec, but I will check and take care of that.)
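For reference, producing a separate JAR per Spark version can be sketched in sbt along these lines. This is a hypothetical sketch, not Hyperspace's actual build configuration: the module name, version values, and artifact suffix below are illustrative assumptions.

```scala
// build.sbt (sketch) -- pick the Spark dependency at build time and encode
// the Spark line in the artifact name so both JARs can coexist in a Maven repo.
val sparkVersion = sys.props.getOrElse("spark.version", "3.0.1") // e.g. -Dspark.version=2.4.2

// Shorten "3.0.1" to "spark3.0" for use as an artifact-name suffix.
val sparkSuffix = "spark" + sparkVersion.split('.').take(2).mkString(".")

lazy val root = (project in file("."))
  .settings(
    // Yields artifacts like hyperspace-core-spark3.0_2.12-<version>.jar
    name := s"hyperspace-core-$sparkSuffix",
    // Spark 2.4 builds against Scala 2.11; Spark 3.0 against Scala 2.12.
    scalaVersion := (if (sparkVersion.startsWith("2.")) "2.11.12" else "2.12.10"),
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-sql" % sparkVersion % "provided"
    )
  )
```

Under this scheme, publishing runs once per Spark version (e.g. `sbt -Dspark.version=2.4.2 publish`, then `sbt -Dspark.version=3.0.1 publish`), and the suffix keeps the resulting artifact IDs distinct and Maven-compliant.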