
Failed to initialize Atlas client

See original GitHub issue

While running my basic word-count Spark job with spark-atlas-connector, I am getting the following error:

18/07/25 11:57:52 ERROR ClientResponse: A message body reader for Java class org.apache.atlas.model.typedef.AtlasTypesDef, and Java type class org.apache.atlas.model.typedef.AtlasTypesDef, and MIME media type application/json;charset=UTF-8 was not found    
18/07/25 11:57:52 ERROR ClientResponse: The registered message body readers compatible with the MIME media type are:    
*/* ->    
  com.sun.jersey.core.impl.provider.entity.FormProvider    
  com.sun.jersey.core.impl.provider.entity.StringProvider    
  com.sun.jersey.core.impl.provider.entity.ByteArrayProvider    
  com.sun.jersey.core.impl.provider.entity.FileProvider    
  com.sun.jersey.core.impl.provider.entity.InputStreamProvider    
  com.sun.jersey.core.impl.provider.entity.DataSourceProvider    
  com.sun.jersey.core.impl.provider.entity.XMLJAXBElementProvider$General    
  com.sun.jersey.core.impl.provider.entity.ReaderProvider    
  com.sun.jersey.core.impl.provider.entity.DocumentProvider    
  com.sun.jersey.core.impl.provider.entity.SourceProvider$StreamSourceReader    
  com.sun.jersey.core.impl.provider.entity.SourceProvider$SAXSourceReader    
  com.sun.jersey.core.impl.provider.entity.SourceProvider$DOMSourceReader    
  com.sun.jersey.core.impl.provider.entity.XMLRootElementProvider$General    
  com.sun.jersey.core.impl.provider.entity.XMLListElementProvider$General    
  com.sun.jersey.core.impl.provider.entity.XMLRootObjectProvider$General    
  com.sun.jersey.core.impl.provider.entity.EntityHolderReader    

18/07/25 11:57:52 ERROR SparkAtlasEventTracker: Fail to initialize Atlas client, stop this listener    
org.apache.atlas.AtlasServiceException: Metadata service API GET : api/atlas/v2/types/typedefs/ failed    
        at org.apache.atlas.AtlasBaseClient.callAPIWithResource(AtlasBaseClient.java:325)    
        at org.apache.atlas.AtlasBaseClient.callAPIWithResource(AtlasBaseClient.java:287)    
        at org.apache.atlas.AtlasBaseClient.callAPI(AtlasBaseClient.java:469)    
        at org.apache.atlas.AtlasClientV2.getAllTypeDefs(AtlasClientV2.java:131)    
        at com.hortonworks.spark.atlas.RestAtlasClient.getAtlasTypeDefs(RestAtlasClient.scala:58)    
        at com.hortonworks.spark.atlas.types.SparkAtlasModel$$anonfun$checkAndGroupTypes$1.apply(SparkAtlasModel.scala:107)    
        at com.hortonworks.spark.atlas.types.SparkAtlasModel$$anonfun$checkAndGroupTypes$1.apply(SparkAtlasModel.scala:104)    
        at scala.collection.immutable.HashMap$HashMap1.foreach(HashMap.scala:221)    
        at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)    
        at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)    
        at com.hortonworks.spark.atlas.types.SparkAtlasModel$.checkAndGroupTypes(SparkAtlasModel.scala:104)    
        at com.hortonworks.spark.atlas.types.SparkAtlasModel$.checkAndCreateTypes(SparkAtlasModel.scala:71)    
        at com.hortonworks.spark.atlas.SparkAtlasEventTracker.initializeSparkModel(SparkAtlasEventTracker.scala:108)    
        at com.hortonworks.spark.atlas.SparkAtlasEventTracker.<init>(SparkAtlasEventTracker.scala:48)    
        at com.hortonworks.spark.atlas.SparkAtlasEventTracker.<init>(SparkAtlasEventTracker.scala:39)    
        at com.hortonworks.spark.atlas.SparkAtlasEventTracker.<init>(SparkAtlasEventTracker.scala:43)    
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)    
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)    
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)    
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)    
        at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2743)    
        at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2732)    
        at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)    
        at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)    
        at scala.collection.mutable.ArraySeq.foreach(ArraySeq.scala:74)    
        at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)    
        at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)    
        at org.apache.spark.util.Utils$.loadExtensions(Utils.scala:2732)    
        at org.apache.spark.sql.util.ExecutionListenerManager$$anonfun$$lessinit$greater$1.apply(QueryExecutionListener.scala:83)    
        at org.apache.spark.sql.util.ExecutionListenerManager$$anonfun$$lessinit$greater$1.apply(QueryExecutionListener.scala:82)    
        at scala.Option.foreach(Option.scala:257)    
        at org.apache.spark.sql.util.ExecutionListenerManager.<init>(QueryExecutionListener.scala:82)    
        at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$listenerManager$2.apply(BaseSessionStateBuilder.scala:270)    
        at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$listenerManager$2.apply(BaseSessionStateBuilder.scala:270)    
        at scala.Option.getOrElse(Option.scala:121)    
        at org.apache.spark.sql.internal.BaseSessionStateBuilder.listenerManager(BaseSessionStateBuilder.scala:269)    
        at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:297)    
        at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1070)    
        at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:141)    
        at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:140)    
        at scala.Option.getOrElse(Option.scala:121)    
        at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:140)    
        at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:137)    
        at org.apache.spark.sql.Dataset.<init>(Dataset.scala:178)    
        at org.apache.spark.sql.Dataset$.apply(Dataset.scala:65)    
        at org.apache.spark.sql.SparkSession.createDataset(SparkSession.scala:470)    
        at org.apache.spark.sql.SQLContext.createDataset(SQLContext.scala:377)    
        at org.apache.spark.sql.SQLImplicits.localSeqToDatasetHolder(SQLImplicits.scala:228)    
        at com.oi.spline.main.SparkAtlasConnector$.main(SparkAtlasConnector.scala:20)    
        at com.oi.spline.main.SparkAtlasConnector.main(SparkAtlasConnector.scala)    
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)    
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)    
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)    
        at java.lang.reflect.Method.invoke(Method.java:498)    
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)    
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:906)    
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)    
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)    
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)    
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)    
Caused by: com.sun.jersey.api.client.ClientHandlerException: A message body reader for Java class org.apache.atlas.model.typedef.AtlasTypesDef, and Java type class org.apache.atlas.model.typedef.AtlasTypesDef, and MIME media type application/json;charset=UTF-8 was not found    
        at com.sun.jersey.api.client.ClientResponse.getEntity(ClientResponse.java:630)    
        at com.sun.jersey.api.client.ClientResponse.getEntity(ClientResponse.java:604)    
        at org.apache.atlas.AtlasBaseClient.callAPIWithResource(AtlasBaseClient.java:321)    
        ... 59 more    
18/07/25 11:57:53 INFO FileOutputCommitter: File Output Committer Algorithm version is 1

Mine is a Kerberized cluster. The Atlas version is 0.8.2 and the Spark version is 2.3.0. I have followed all the steps specified for a Kerberized environment. Any help will be highly appreciated. @jerryshao, @weiqingy
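The "message body reader ... was not found" error generally means the Jersey client inside the connector cannot find a JSON provider (jersey-json/Jackson) on the driver classpath, and on a Kerberized cluster the client also needs to see atlas-application.properties. As a hedged sketch only (the jar names, versions, paths, and application class here are assumptions for illustration, not taken from this issue), one way to ship both with the job:

```shell
# Sketch only: jar paths/versions and the main class are placeholders.
# Ship atlas-application.properties with the job and put a Jersey JSON
# provider on the driver classpath alongside the connector assembly.
spark-submit \
  --master yarn \
  --files /etc/atlas/conf/atlas-application.properties \
  --jars spark-atlas-connector-assembly.jar,jersey-core-1.19.4.jar,jersey-json-1.19.4.jar \
  --conf spark.extraListeners=com.hortonworks.spark.atlas.SparkAtlasEventTracker \
  --conf spark.sql.queryExecutionListeners=com.hortonworks.spark.atlas.SparkAtlasEventTracker \
  --class com.example.WordCount \
  word-count.jar
```

If the error persists, it is worth checking which jersey-core version actually wins on the classpath, since the assembly jar may bundle an older copy.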

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Comments: 5 (1 by maintainers)

Top GitHub Comments

2 reactions
srinucse9 commented, May 28, 2019

While running my basic Spark job with spark-atlas-connector, I am getting the following error. Environment: Spark 2.3, Atlas 1.0, HDP 3.0. Command:

spark-shell --jars /home/atlas/tt/spark-atlas-connector-assembly_2.11-0.1.0-SNAPSHOT.jar --master local --conf spark.extraListeners=com.hortonworks.spark.atlas.SparkAtlasEventTracker --conf spark.sql.queryExecutionListeners=com.hortonworks.spark.atlas.SparkAtlasEventTracker --conf spark.sql.streaming.streamingQueryListeners=com.hortonworks.spark.atlas.SparkAtlasStreamingQueryEventTracker

error:

spark-shell --jars /home/atlas/tt/spark-atlas-connector-assembly_2.11-0.1.0-SNAPSHOT.jar --master local --conf spark.extraListeners=com.hortonworks.spark.atlas.SparkAtlasEventTracker --conf spark.sql.queryExecutionListeners=com.hortonworks.spark.atlas.SparkAtlasEventTracker --conf spark.sql.streaming.streamingQueryListeners=com.hortonworks.spark.atlas.SparkAtlasStreamingQueryEventTracker
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
19/05/28 07:16:58 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
19/05/28 07:16:58 WARN Utils: Service 'SparkUI' could not bind on port 4041. Attempting port 4042.
19/05/28 07:16:58 WARN Utils: Service 'SparkUI' could not bind on port 4042. Attempting port 4043.
19/05/28 07:16:58 WARN Utils: Service 'SparkUI' could not bind on port 4043. Attempting port 4044.
19/05/28 07:16:58 WARN Utils: Service 'SparkUI' could not bind on port 4044. Attempting port 4045.
19/05/28 07:17:01 ERROR SparkAtlasEventTracker: Fail to initialize Atlas client, stop this listener
org.apache.atlas.AtlasServiceException: Metadata service API org.apache.atlas.AtlasClientV2$API_V2@398694a6 failed with status 401 (Unauthorized) Response Body ()
        at org.apache.atlas.AtlasBaseClient.callAPIWithResource(AtlasBaseClient.java:395)
        at org.apache.atlas.AtlasBaseClient.callAPIWithResource(AtlasBaseClient.java:323)
        at org.apache.atlas.AtlasBaseClient.callAPI(AtlasBaseClient.java:239)
        at org.apache.atlas.AtlasClientV2.getAllTypeDefs(AtlasClientV2.java:124)
        at com.hortonworks.spark.atlas.RestAtlasClient.getAtlasTypeDefs(RestAtlasClient.scala:58)
        at com.hortonworks.spark.atlas.types.SparkAtlasModel$$anonfun$checkAndGroupTypes$1.apply(SparkAtlasModel.scala:107)
        at com.hortonworks.spark.atlas.types.SparkAtlasModel$$anonfun$checkAndGroupTypes$1.apply(SparkAtlasModel.scala:104)
        at scala.collection.immutable.HashMap$HashMap1.foreach(HashMap.scala:221)
        at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)
        at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)
        at com.hortonworks.spark.atlas.types.SparkAtlasModel$.checkAndGroupTypes(SparkAtlasModel.scala:104)
        at com.hortonworks.spark.atlas.types.SparkAtlasModel$.checkAndCreateTypes(SparkAtlasModel.scala:71)
        at com.hortonworks.spark.atlas.SparkAtlasEventTracker.initializeSparkModel(SparkAtlasEventTracker.scala:108)
        at com.hortonworks.spark.atlas.SparkAtlasEventTracker.<init>(SparkAtlasEventTracker.scala:48)
        at com.hortonworks.spark.atlas.SparkAtlasEventTracker.<init>(SparkAtlasEventTracker.scala:39)
        at com.hortonworks.spark.atlas.SparkAtlasEventTracker.<init>(SparkAtlasEventTracker.scala:43)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2747)
        at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2736)
        at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
        at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
        at scala.collection.mutable.ArraySeq.foreach(ArraySeq.scala:74)
        at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
        at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
        at org.apache.spark.util.Utils$.loadExtensions(Utils.scala:2736)
        at org.apache.spark.SparkContext$$anonfun$setupAndStartListenerBus$1.apply(SparkContext.scala:2360)
        at org.apache.spark.SparkContext$$anonfun$setupAndStartListenerBus$1.apply(SparkContext.scala:2359)
        at scala.Option.foreach(Option.scala:257)
        at org.apache.spark.SparkContext.setupAndStartListenerBus(SparkContext.scala:2359)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:554)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2493)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:934)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:925)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:925)
        at org.apache.spark.repl.Main$.createSparkSession(Main.scala:103)
        at $line3.$read$$iw$$iw.<init>(<console>:15)
        at $line3.$read$$iw.<init>(<console>:43)
        at $line3.$read.<init>(<console>:45)
        at $line3.$read$.<init>(<console>:49)
        at $line3.$read$.<clinit>(<console>)
        at $line3.$eval$.$print$lzycompute(<console>:7)
        at $line3.$eval$.$print(<console>:6)
        at $line3.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
        at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
        at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
        at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
        at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
        at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
        at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
        at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:79)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:79)
        at scala.collection.immutable.List.foreach(List.scala:381)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(SparkILoop.scala:79)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:79)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:79)
        at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:91)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:78)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:78)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:78)
        at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:77)
        at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:110)
        at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
        at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
        at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
        at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
        at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
        at org.apache.spark.repl.Main$.doMain(Main.scala:76)
        at org.apache.spark.repl.Main$.main(Main.scala:56)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:904)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Can you please help me with this issue?
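The 401 (Unauthorized) above means the request reached Atlas but carried no accepted credentials. As a hedged sketch (host, port, keytab path, and principal are placeholders, not from this issue), the credentials can be verified outside Spark before re-running the job:

```shell
# Sketch only: host, port, keytab and principal are placeholders.
# On a Kerberized cluster: obtain a ticket, then hit the same typedefs
# endpoint the connector calls, using SPNEGO.
kinit -kt /etc/security/keytabs/myuser.keytab myuser@EXAMPLE.COM
curl --negotiate -u : "http://atlas-host:21000/api/atlas/v2/types/typedefs"

# On a cluster using basic auth instead, pass a username/password:
curl -u admin:'password' "http://atlas-host:21000/api/atlas/v2/types/typedefs"
```

If the curl call also returns 401, the problem is the Atlas credentials or ticket, not the connector itself.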

0 reactions
q330483012 commented, Jul 25, 2020

I have met the same problem as @vjbhakuni2 reported. I solved it by updating jersey-core-1.9.jar to the latest version, jersey-core-1.19.4.jar. You can try it in your project.
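One way to apply that fix without rebuilding the assembly jar is to pull the newer Jersey artifacts from Maven Central at launch time. This is a sketch under assumptions: the Maven coordinates follow from the version named in the comment above, and classpath ordering may still need checking on your cluster:

```shell
# Sketch only: coordinates assumed from the comment above (Jersey 1.19.4).
# --packages resolves the newer jersey-core (plus its JSON provider) from
# Maven Central so it can take precedence over the 1.9 copy.
spark-shell \
  --jars spark-atlas-connector-assembly_2.11-0.1.0-SNAPSHOT.jar \
  --packages com.sun.jersey:jersey-core:1.19.4,com.sun.jersey:jersey-json:1.19.4 \
  --conf spark.extraListeners=com.hortonworks.spark.atlas.SparkAtlasEventTracker \
  --conf spark.sql.queryExecutionListeners=com.hortonworks.spark.atlas.SparkAtlasEventTracker
```

If the assembly jar shades its own Jersey classes, replacing the jar inside the assembly (as the commenter did) is the more reliable route.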


Top Results From Across the Web

Failed to initialize Atlas client using spark-atlas-connector
You can fix it by making changes in the code (change underline password). In regard to this one can pass an atlas-application.properties (property ...
Apache Atlas: HTTP ERROR 503 Service Unavailable
The error is about failing to get /hbase zknode in zookeeper. Can you verify that HBase is up and running? HBase logs would...
Resolve failed to initialize Geostan. us.z9 file not found error in ...
Learn how to resolve Failed to initialize Geostan. us.z9 file not found in the path(s) specified error in Spectrum. A Geocoding Enterprise Module...
Problem determination for CDP Private Cloud Base and ... - IBM
NameNodes can start but Ranger policies do not work. NameNode log shows the following error message: org.apache.ranger.admin.client.
Re: Need Help!!! Atlas Installation Error - Apache Mail Archives
NoClassDefFoundError: Could not initialize class org.apache.atlas.hive.hook.HiveHook java.sql.SQLException: Error running query: java.lang.
