
Running SAC (Spark Atlas Connector) in PySpark fails with "Cannot find active or default SparkSession in the current context"

See original GitHub issue

I ran a SQL query in PySpark and got the following exception:

>>> spark.sql("create table tmp_case_q (key INT, value STRING)")
18/03/05 20:20:20 WARN SparkCatalogEventTracker:  Caught exception during parsing catalog event
java.lang.IllegalStateException: Cannot find active or default SparkSession in the current context
	at com.hortonworks.spark.atlas.utils.SparkUtils$.sparkSession(SparkUtils.scala:35)
	at com.hortonworks.spark.atlas.utils.SparkUtils$.getExternalCatalog(SparkUtils.scala:87)
	at com.hortonworks.spark.atlas.sql.SparkCatalogEventTracker$$anonfun$eventProcess$2.apply(SparkCatalogEventTracker.scala:120)
	at com.hortonworks.spark.atlas.sql.SparkCatalogEventTracker$$anonfun$eventProcess$2.apply(SparkCatalogEventTracker.scala:96)
	at scala.Option.foreach(Option.scala:257)
	at com.hortonworks.spark.atlas.sql.SparkCatalogEventTracker.eventProcess(SparkCatalogEventTracker.scala:96)
	at com.hortonworks.spark.atlas.sql.AbstractService$$anon$1.run(AbstractService.scala:24)
DataFrame[]

The exception is thrown from SparkUtils.sparkSession (SparkUtils.scala:35). @jerryshao Could you please take a look at this issue? Thanks.
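For reference, the lookup that throws plausibly reduces to the Scala sketch below, reconstructed from the error message and the top stack frame; it is not the verbatim SAC source. Spark keeps a thread-local "active" session and a JVM-wide "default" session, and the exception fires when neither is visible to SAC's event-processing thread:

import org.apache.spark.sql.SparkSession

// Reconstructed sketch (not the verbatim SAC source): return the thread-local
// active session if present, fall back to the JVM-wide default session, and
// fail with the exception seen above when neither is set.
def sparkSession: SparkSession =
  SparkSession.getActiveSession
    .orElse(SparkSession.getDefaultSession)
    .getOrElse(throw new IllegalStateException(
      "Cannot find active or default SparkSession in the current context"))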

cc: @dongjoon-hyun

Issue Analytics

  • State: closed
  • Created: 6 years ago
  • Comments: 5 (1 by maintainers)

Top GitHub Comments

2 reactions
srinivasugaddam commented, May 23, 2019

@weiqingy I'm hitting the same problem. Can you tell me how you resolved it? Thank you.

0 reactions
iamrjt04 commented, Aug 5, 2019

@SimonWan1029 HWX stands for Hortonworks. @weiqingy Instead of pyspark, use spark-shell or spark-submit (i.e. a Java/Scala SparkSession). SAC is written in Scala and, as far as I know, can only find an active or default SparkSession inside the JVM, so it cannot pick up a session driven from Python.
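A minimal sketch of that workaround, assuming a Hive-enabled Spark build (the app name is illustrative, not from the issue): run the same DDL from spark-shell, or from a Scala job launched with spark-submit, so the SparkSession lives in the JVM where SAC can find it.

import org.apache.spark.sql.SparkSession

// Inside spark-shell the session already exists as `spark`; in a standalone
// Scala job, build one explicitly. enableHiveSupport is needed so the
// `create table` below goes against the Hive catalog.
val spark = SparkSession.builder()
  .appName("sac-workaround")   // illustrative name, not from the issue
  .enableHiveSupport()
  .getOrCreate()

spark.sql("create table tmp_case_q (key INT, value STRING)")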

Read more comments on GitHub

Top Results From Across the Web

Spark Session — PySpark 3.3.1 documentation
Gets an existing SparkSession or, if there is no existing one, creates a new one based on the options set in this builder...

Spark Get the Current SparkContext Settings
In Spark/PySpark you can get the current active SparkContext and its configuration settings by accessing spark.sparkContext.getConf.getAll(), here spark...

Is it possible to get the current spark context settings in PySpark?
No - you can get the conf object, but not the things you're looking for. Defaults are not available through SparkConf (they're hardcoded...
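Taken together, the snippets above amount to something like the following Scala sketch: getOrCreate() returns the existing session (or builds one from the builder's options), and getConf.getAll exposes only the explicitly-set configuration, not the hardcoded defaults.

import org.apache.spark.sql.SparkSession

// Get or create a session, then print the explicitly-set Spark configuration.
// Defaults are not included, as the answer quoted above notes.
val spark = SparkSession.builder().getOrCreate()
spark.sparkContext.getConf.getAll.foreach { case (k, v) => println(s"$k = $v") }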
