
[Feature Request] Add a check to verify the required Spark confs

See original GitHub issue

Feature request

Overview

We have seen several reported issues caused by missing configs. We can add a check in DeltaLog to verify the following Spark confs.

--conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"

If these are not set, we can throw a user-friendly error.
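The proposed check could look something like the sketch below. This is a hypothetical illustration, not the actual DeltaLog implementation: the function name `check_required_confs` and the dict-based conf lookup are assumptions, with a plain dict standing in for the active Spark conf. Note that `spark.sql.extensions` can hold a comma-separated list of extensions, so the sketch tests for containment rather than exact equality.

```python
# Hypothetical sketch of the proposed check (not actual Delta Lake code):
# verify the required Spark confs are present and raise a clear error if not.

REQUIRED_CONFS = {
    "spark.sql.extensions": "io.delta.sql.DeltaSparkSessionExtension",
    "spark.sql.catalog.spark_catalog":
        "org.apache.spark.sql.delta.catalog.DeltaCatalog",
}


def check_required_confs(confs):
    """Raise a user-friendly error if any required Delta conf is missing.

    `confs` is a plain dict standing in for the active Spark conf.
    """
    missing = {
        key: expected
        for key, expected in REQUIRED_CONFS.items()
        # `spark.sql.extensions` may be a comma-separated list, so check
        # containment instead of strict equality.
        if expected not in confs.get(key, "")
    }
    if missing:
        flags = " ".join(f'--conf "{k}={v}"' for k, v in missing.items())
        raise ValueError(
            "This Delta operation requires the following Spark confs. "
            f"Please restart your session with: {flags}"
        )
```

With both confs present the check passes silently; with either one missing, the error message lists the exact `--conf` flags the user needs to add.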

Motivation

Provide a user-friendly error rather than surfacing confusing, seemingly unrelated errors.

Willingness to contribute

The Delta Lake Community encourages new feature contributions. Would you or another member of your organization be willing to contribute an implementation of this feature?

  • Yes. I can contribute this feature independently.
  • Yes. I would be willing to contribute this feature with guidance from the Delta Lake community.
  • No. I cannot contribute this feature at this time.

Issue Analytics

  • State: closed
  • Created a year ago
  • Comments: 8 (6 by maintainers)

Top GitHub Comments

1 reaction
ganeshchand commented, Jul 13, 2022

Thanks for the clarification. I am working on it and will send the PR soon.

1 reaction
ganeshchand commented, May 31, 2022

@zsxwing I can work on this feature. Please feel free to assign this to me.

Read more comments on GitHub >

Top Results From Across the Web

Is there a way to validate the values of spark configs?
Running spark.conf.get('spark.sql.shuffle.partition') will return a value ==> without ... Is there a way to check if what I have used as config parameter...
Read more >
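On the validation question above: in PySpark, `spark.conf.get(key, default)` returns the supplied default when the key is unset, which gives a simple existence check. The sketch below is a hedged illustration that uses a minimal stand-in class (`FakeRuntimeConf`, an assumption of this example) mimicking that `get` behavior, so it runs without a live SparkSession.

```python
# Sketch of checking whether a Spark conf is set. FakeRuntimeConf is a
# stand-in mimicking spark.conf's get(key, default) behavior, so this
# example runs without a live SparkSession.


class FakeRuntimeConf:
    def __init__(self, values):
        self._values = values

    def get(self, key, default=None):
        # Mirrors the runtime conf: returns `default` when the key is unset.
        return self._values.get(key, default)


conf = FakeRuntimeConf({"spark.sql.shuffle.partitions": "200"})

# A sentinel default distinguishes "unset" from any real value.
if conf.get("spark.sql.extensions", None) is None:
    print("spark.sql.extensions is not set")
```

Against a real session, the same pattern (`spark.conf.get(key, None)`) answers "is this conf set?" without triggering the exception that a bare `get(key)` raises for unset keys.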
Configuration - Spark 3.3.1 Documentation - Apache Spark
This is a useful place to check to make sure that your properties have been set correctly. Note that only values explicitly specified...
Read more >
Spark Get the Current SparkContext Settings
In the below Spark example, I have added additional configuration to Spark using SparkConf and retrieve all default config values from SparkContext along...
Read more >
Get and set Apache Spark configuration properties in a ...
However, there may be instances when you need to check (or set) the values of specific Spark configuration properties in a notebook. This ......
Read more >
How do I ensure that my Apache Spark setup code runs only ...
To confirm that, you can check whether you have 2 tables created after ... you need to change the code so that block...
Read more >
