build fails for Spark 2.0 due to removal of org.apache.spark.Logging and Akka
See original GitHub issue.
We are planning to move to Spark 2.0 along with its dependencies. We currently use this connector, which internally imports org.apache.spark.Logging. In Spark 2.0, however, that trait has been made private. Moreover, support for streaming connectors such as Akka has also been removed. What would be a workaround to build the connector for Spark 2.0?
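A common workaround, shown here as a minimal sketch rather than the connector's actual fix, is to stop depending on Spark's removed Logging trait and define an equivalent thin wrapper over SLF4J, which Spark already ships on its classpath. The trait and method names below mirror the old org.apache.spark.Logging API so existing call sites need not change:

```scala
import org.slf4j.{Logger, LoggerFactory}

// Drop-in stand-in for the org.apache.spark.Logging trait that
// Spark 2.0 made private, backed directly by SLF4J.
trait Logging {
  @transient private lazy val log: Logger =
    LoggerFactory.getLogger(getClass.getName.stripSuffix("$"))

  // By-name parameters avoid building the message when the level is off.
  protected def logInfo(msg: => String): Unit =
    if (log.isInfoEnabled) log.info(msg)

  protected def logWarning(msg: => String): Unit =
    if (log.isWarnEnabled) log.warn(msg)

  protected def logError(msg: => String, e: Throwable = null): Unit =
    if (e == null) log.error(msg) else log.error(msg, e)
}
```

For the Akka side there is no in-tree workaround: the receiver moved out of Spark core entirely, and the Apache Bahir project publishes a spark-streaming-akka artifact that serves as the usual replacement.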
Issue Analytics
- Created: 7 years ago
- Comments: 11 (4 by maintainers)
In the coming weeks we will publish a version compatible with Spark 2.0. Thanks, all.
@compae Why does the current release still use import org.apache.spark.internal.Logging? The official Spark documentation recommends against using Spark's internal logging mechanism.
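A more portable pattern, sketched here under the assumption of an sbt build (the version numbers are illustrative), is to declare SLF4J as an explicit dependency and log through it directly, keeping Spark itself as a provided dependency so the connector never reaches into Spark internals:

```scala
// build.sbt -- illustrative coordinates and versions
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.0.0" % "provided",
  "org.slf4j"        %  "slf4j-api"  % "1.7.21"
)
```

Classes then obtain their logger from SLF4J directly (the class name below is hypothetical, purely for illustration):

```scala
import org.slf4j.LoggerFactory

class ReceiverWorker { // hypothetical connector class
  private val log = LoggerFactory.getLogger(classOf[ReceiverWorker])

  def start(): Unit = log.info("starting receiver worker")
}
```

This insulates the connector from private-API changes like the one that broke the Spark 1.x build.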