Running SAC in pyspark fails with "Cannot find active or default SparkSession in the current context"
I ran a SQL query in pyspark and got the following exception:
>>> spark.sql("create table tmp_case_q (key INT, value STRING)")
18/03/05 20:20:20 WARN SparkCatalogEventTracker: Caught exception during parsing catalog event
java.lang.IllegalStateException: Cannot find active or default SparkSession in the current context
at com.hortonworks.spark.atlas.utils.SparkUtils$.sparkSession(SparkUtils.scala:35)
at com.hortonworks.spark.atlas.utils.SparkUtils$.getExternalCatalog(SparkUtils.scala:87)
at com.hortonworks.spark.atlas.sql.SparkCatalogEventTracker$$anonfun$eventProcess$2.apply(SparkCatalogEventTracker.scala:120)
at com.hortonworks.spark.atlas.sql.SparkCatalogEventTracker$$anonfun$eventProcess$2.apply(SparkCatalogEventTracker.scala:96)
at scala.Option.foreach(Option.scala:257)
at com.hortonworks.spark.atlas.sql.SparkCatalogEventTracker.eventProcess(SparkCatalogEventTracker.scala:96)
at com.hortonworks.spark.atlas.sql.AbstractService$$anon$1.run(AbstractService.scala:24)
DataFrame[]
The exception is from here. @jerryshao Could you please help to look at this issue? Thanks.
cc: @dongjoon-hyun
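
For reference, the check behind SparkUtils.scala:35 amounts to something like the sketch below. This is reconstructed from the stack trace and the public Spark Scala API, not copied from the SAC source; the method name sparkSession is taken from the trace. The catalog event tracker runs on its own thread and asks the JVM for an active or default SparkSession, throwing if neither is registered.

import org.apache.spark.sql.SparkSession

// Sketch of the failing lookup (assumption: mirrors SparkUtils.sparkSession).
// getActiveSession is thread-local and getDefaultSession is JVM-global, so a
// session that was never registered on the JVM side is invisible to both.
def sparkSession: SparkSession =
  SparkSession.getActiveSession
    .orElse(SparkSession.getDefaultSession)
    .getOrElse(throw new IllegalStateException(
      "Cannot find active or default SparkSession in the current context"))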

@weiqingy I met the same problem. Can you tell me how to resolve it? Thank you.
@SimonWan1029 HWX stands for Hortonworks. @weiqingy Instead of pyspark, use spark-shell or spark-submit (i.e., create the SparkSession from Java or Scala). SAC is written in Scala, and as far as I know it can only find the active or default SparkSession inside the JVM, so a session driven from Python is not discoverable from its listener thread. A sketch of the Scala route is below.
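
A minimal sketch of that workaround as a spark-submit job, assuming SAC is attached to the application in the usual way (e.g. via spark.extraListeners; check the SAC README for the exact listener class names for your version). The object name, app name, and enableHiveSupport() call are illustrative assumptions; the key point is that the Scala builder's getOrCreate() registers the session as the JVM default, so SAC's tracker thread can find it.

import org.apache.spark.sql.SparkSession

// Illustrative job; names are placeholders. enableHiveSupport() is an
// assumption based on the Hive-style CREATE TABLE in the issue.
object SacCreateTable {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sac-create-table")
      .enableHiveSupport()
      .getOrCreate()  // registers this session as the JVM default session

    spark.sql("create table tmp_case_q (key INT, value STRING)")
    spark.stop()
  }
}

In spark-shell the prebuilt spark session is registered the same way, so running the CREATE TABLE statement there should not hit this exception.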