[QUESTION]: How to use SparkSession.ExecuteCommand
I have found this page in the documentation, but I would like to know how to use it.
I have tried the following example:
var optionsDictionary = new Dictionary<string, string>
{
    { "driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver" },
    { "jdbcUrl", $"jdbc:sqlserver://{server};database={db};user={user};password={password}" }
};
_sparkSession.ExecuteCommand("runner", $"DROP TABLE {temporaryTable};", optionsDictionary);
Stack trace:
System.Exception: JVM method execution failed: Nonstatic method 'executeCommand' failed for class '6' when called with 3 arguments ([Index=1, Type=String, Value=runner], [Index=2, Type=String, Value=DROP TABLE TemporaryTable;], [Index=3, Type=Dictionary`2, Value=System.Collections.Generic.Dictionary`2[System.String,System.String]], )
---> Microsoft.Spark.JvmException
--- End of inner exception stack trace ---
at Microsoft.Spark.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStatic, Object classNameOrJvmObjectReference, String methodName, Object[] args)
at Microsoft.Spark.Interop.Ipc.JvmBridge.CallNonStaticJavaMethod(JvmObjectReference objectId, String methodName, Object[] args)
at Microsoft.Spark.Interop.Ipc.JvmObjectReference.Invoke(String methodName, Object[] args)
at Microsoft.Spark.Sql.SparkSession.ExecuteCommand(String runner, String command, Dictionary`2 options)
Issue Analytics
- State:
- Created 3 years ago
- Comments: 7 (2 by maintainers)
Top Results From Across the Web
SparkSession.ExecuteCommand Method
Execute an arbitrary string command inside an external execution engine rather than Spark. This could be useful when user wants to execute some...

How to create SparkSession from existing SparkContext
I have a Spark application which is using the Spark 2.0 API with SparkSession. I am building this application on top of the...

Spark Shell Command Usage with Examples
In the Spark shell, Spark by default provides the spark and sc variables. spark is an object of SparkSession and sc is an object of...

Spark SQL, DataFrames and Datasets Guide
One use of Spark SQL is to execute SQL queries. Spark SQL can also be used to read ... To create a basic...

SparkSession, SparkContext, SQLContext in Spark [What's the ...
In this quick tutorial, let's answer: what are the differences between SparkSession vs. SparkContext vs. SQLContext, and how to choose?
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Do you have a runner class that implements ExternalCommandRunner on the JVM side? If you do, you can pass the JAR that contains the runner class as part of the spark-submit.

Hi @this-fifo. No problem man. Thanks WTB
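To illustrate the comment above: the runner argument is expected to name a JVM class implementing Spark 3's ExternalCommandRunner developer interface (org.apache.spark.sql.connector.ExternalCommandRunner), which is why passing the literal string "runner" fails. Below is a minimal sketch of what such a class might look like, assuming the goal is to forward the SQL string to an external JDBC engine; the package name, class name, and the "driver"/"jdbcUrl" option keys are hypothetical and taken from the question's own dictionary, not from any official sample.

```scala
// Hypothetical JVM-side runner; package and class names are illustrative.
package com.example.spark

import java.sql.DriverManager

import org.apache.spark.sql.connector.ExternalCommandRunner
import org.apache.spark.sql.util.CaseInsensitiveStringMap

class JdbcCommandRunner extends ExternalCommandRunner {
  // Spark invokes this with the command string and options passed to
  // SparkSession.executeCommand (ExecuteCommand on the .NET side).
  override def executeCommand(
      command: String,
      options: CaseInsensitiveStringMap): Array[String] = {
    Class.forName(options.get("driver")) // load the JDBC driver, e.g. SQL Server's
    val conn = DriverManager.getConnection(options.get("jdbcUrl"))
    try {
      conn.createStatement().execute(command) // run the command on the external engine
      Array("OK")
    } finally {
      conn.close()
    }
  }
}
```

You would then pass the fully qualified class name from .NET, e.g. _sparkSession.ExecuteCommand("com.example.spark.JdbcCommandRunner", $"DROP TABLE {temporaryTable};", optionsDictionary), and ship the compiled JAR with spark-submit --jars so the class is resolvable on the JVM side.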