Throwing runtime exception for delta commands
Issue Description
Spline currently does not support Delta commands such as MergeInto, Update, and Delete.

I added the io.delta package with provided scope (a Databricks cluster has io.delta available by default) and made changes to the WriteCommandExtractor.scala file:
import org.apache.spark.sql.delta.commands._
and added the line below to commandsToBeImplemented, just to see whether it throws the custom exception when run in Required mode:
classOf[MergeIntoCommand]
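For reference, the change described above amounts to something like the sketch below. Only the import and the added classOf entry come from this report; the exact shape of the commandsToBeImplemented collection inside WriteCommandExtractor.scala is approximated:

```scala
import org.apache.spark.sql.delta.commands._

// Approximation of the list inside WriteCommandExtractor.scala that makes
// the agent throw its "not implemented" exception in Required mode:
val commandsToBeImplemented: Seq[Class[_]] = Seq(
  // ... existing command classes ...
  classOf[MergeIntoCommand]  // the entry added for this experiment
)
```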
But it throws the following exception when I run the application:
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.delta.commands.MergeIntoCommand
I am trying to debug the issue, but your support on this would be highly appreciated.
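The ClassNotFoundException despite a clean compile usually means the class compiled against (org.apache.spark.sql.delta.commands.MergeIntoCommand from OSS Delta) is not what the Databricks runtime actually ships; as the maintainer comments below suggest, the runtime class there is MergeIntoCommandEdge. Here is a minimal sketch of resolving the class by name at runtime instead of via classOf, so that a missing class degrades to None rather than crashing (the Databricks-internal class name below is an assumption, not something verified here):

```scala
import scala.util.Try

// Return the first of several candidate class names that is actually
// loadable on the runtime classpath; None if none of them resolve.
def loadFirstAvailable(candidateNames: String*): Option[Class[_]] =
  candidateNames.iterator
    .map(name => Try(Class.forName(name)).toOption) // tolerates ClassNotFoundException
    .collectFirst { case Some(cls) => cls }

val mergeIntoCommandClass: Option[Class[_]] = loadFirstAvailable(
  "org.apache.spark.sql.delta.commands.MergeIntoCommand",              // OSS Delta
  "com.databricks.sql.transaction.tahoe.commands.MergeIntoCommandEdge" // assumed Databricks name
)
```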
Maintainer Comments

That’s why the Spline agent is full of dirty reflection hacks 😃 We’re living in wild waters…
How to enable logging is described here: https://github.com/AbsaOSS/spline-spark-agent/discussions/394 (especially the last paragraph). You need TRACE level for everything under the za.co.absa.spline package. You should then find the MergeIntoCommandEdge command in an “OBJECT DUMP BEGIN” log entry. This is a dump of the command’s class structure and fields, which should hopefully be enough to identify the fields you need to extract the dataSource data from, without seeing the actual code.
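As a convenience, assuming the cluster logs through log4j 1.x (which Databricks runtimes of that generation did), the TRACE level can also be set programmatically from a notebook cell; adjust this to whatever logging setup the linked discussion describes for your environment:

```scala
import org.apache.log4j.{Level, Logger}

// Enable TRACE for everything under the Spline agent package so that the
// "OBJECT DUMP BEGIN" entries appear in the driver log.
Logger.getLogger("za.co.absa.spline").setLevel(Level.TRACE)
```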
For the implementation, check the PR for Databricks’s CreateDeltaTableCommand: https://github.com/AbsaOSS/spline-spark-agent/pull/112/files. It might be enough to add some code to DatabricksPlugin, in a similar way to how it is done for CreateDeltaTableCommand; a rough sketch follows.
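To make that shape concrete, here is a hypothetical sketch of matching the Databricks command by class name (avoiding any compile-time dependency on Databricks-internal classes) and reading a field reflectively. The real DatabricksPlugin hooks and the actual MergeIntoCommandEdge field names must be taken from the linked PR and from the OBJECT DUMP log, not from this sketch:

```scala
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan

object MergeIntoHandlingSketch {

  // Match by class name suffix rather than classOf, since the
  // Databricks-internal class is not on the build classpath.
  def maybeExtractTarget(plan: LogicalPlan): Option[AnyRef] =
    if (plan.getClass.getName.endsWith(".MergeIntoCommandEdge"))
      Some(readField(plan, "target")) // "target" is a hypothetical field name
    else
      None

  // Minimal reflective field read; the Spline agent has its own
  // reflection utilities that a real implementation would use instead.
  private def readField(obj: AnyRef, name: String): AnyRef = {
    val field = obj.getClass.getDeclaredFields
      .find(_.getName == name)
      .getOrElse(throw new NoSuchFieldException(name))
    field.setAccessible(true)
    field.get(obj)
  }
}
```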