java.lang.NoClassDefFoundError: org/apache/atlas/ApplicationProperties
See original GitHub issue
I am getting the following error when running spark-shell:
$ spark-shell --jars spark-atlas-connector_2.11-0.1.0-SNAPSHOT.jar --conf spark.extraListeners=com.hortonworks.spark.atlas.SparkAtlasEventTracker --conf spark.sql.queryExecutionListeners=com.hortonworks.spark.atlas.SparkAtlasEventTracker
SPARK_MAJOR_VERSION is set to 2, using Spark2
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
java.lang.NoClassDefFoundError: org/apache/atlas/ApplicationProperties
at com.hortonworks.spark.atlas.AtlasClientConf.configuration$lzycompute(AtlasClientConf.scala:27)
at com.hortonworks.spark.atlas.AtlasClientConf.configuration(AtlasClientConf.scala:27)
at com.hortonworks.spark.atlas.AtlasClientConf.get(AtlasClientConf.scala:52)
at com.hortonworks.spark.atlas.AtlasClient$.atlasClient(AtlasClient.scala:88)
at com.hortonworks.spark.atlas.SparkAtlasEventTracker.<init>(SparkAtlasEventTracker.scala:39)
at com.hortonworks.spark.atlas.SparkAtlasEventTracker.<init>(SparkAtlasEventTracker.scala:43)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2743)
at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2732)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.mutable.ArraySeq.foreach(ArraySeq.scala:74)
at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
at org.apache.spark.util.Utils$.loadExtensions(Utils.scala:2732)
at org.apache.spark.SparkContext$$anonfun$setupAndStartListenerBus$1.apply(SparkContext.scala:2360)
at org.apache.spark.SparkContext$$anonfun$setupAndStartListenerBus$1.apply(SparkContext.scala:2359)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.SparkContext.setupAndStartListenerBus(SparkContext.scala:2359)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:554)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2493)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:930)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:921)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
at org.apache.spark.repl.Main$.createSparkSession(Main.scala:103)
... 55 elided
Caused by: java.lang.ClassNotFoundException: org.apache.atlas.ApplicationProperties
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 84 more
<console>:14: error: not found: value spark
import spark.implicits._
^
<console>:14: error: not found: value spark
import spark.sql
^
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 2.3.0.2.6.5.0-292
/_/
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_112)
Type in expressions to have them evaluated.
Type :help for more information.
scala>
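
A quick way to confirm the cause (a hedged check, not part of the original report) is to list the contents of the jar passed to --jars and look for the missing Atlas class; the thin connector jar typically does not bundle it:

$ jar tf spark-atlas-connector_2.11-0.1.0-SNAPSHOT.jar | grep org/apache/atlas/ApplicationProperties

No output here means org.apache.atlas.ApplicationProperties is not on the driver classpath, which is exactly what the NoClassDefFoundError above reports.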
Issue Analytics
- Created 5 years ago
- Comments: 10 (3 by maintainers)
Use the uber (assembly) jar rather than spark-atlas-connector_2.11-0.1.0-SNAPSHOT.jar; that will fix the issue. The thin jar contains only the connector's own classes, so Atlas client classes such as org.apache.atlas.ApplicationProperties never end up on the driver classpath.
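
For example, the corrected spark-shell invocation would look something like this (a sketch only: the assembly jar name and path are illustrative and depend on how the connector was built or packaged in your distribution; the listener settings are unchanged from the original command):

$ spark-shell --jars spark-atlas-connector-assembly-0.1.0-SNAPSHOT.jar \
    --conf spark.extraListeners=com.hortonworks.spark.atlas.SparkAtlasEventTracker \
    --conf spark.sql.queryExecutionListeners=com.hortonworks.spark.atlas.SparkAtlasEventTracker

The assembly jar bundles the Atlas client dependencies, so the listener classes can load org.apache.atlas.ApplicationProperties at startup.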
Hello, have you solved this problem?