Connection issues while using Databricks
Any idea how to fix this?

java.lang.ClassNotFoundException: Failed to find data source: com.microsoft.sqlserver.jdbc.spark. Please find packages at http://spark.apache.org/third-party-projects.html
from pyspark import SparkConf

conf = SparkConf() \
    .setAppName(appName) \
    .setMaster(master) \
    .set("spark.driver.extraClassPath", "C:/Users/XXXX/mssql-jdbc-8.3.1.jre14-preview.jar") \
    .set("spark.sql.execution.arrow.enabled", "true")
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
I switched to a local setup to try this with the possible combinations; the outcome is as follows:
Spark: 3.0.0
Environment: Windows
Code:
This works fine.
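Once the connector resolves, a read through it looks like the sketch below, given a SparkSession (spark) configured as in the earlier snippet; the URL, table, and credentials here are placeholders rather than values from the issue:

Code:
# Hypothetical connection details -- substitute your own server, database,
# and credentials. The format name is the data source from the error above.
df = (spark.read
      .format("com.microsoft.sqlserver.jdbc.spark")
      .option("url", "jdbc:sqlserver://localhost:1433;databaseName=testdb")
      .option("dbtable", "dbo.sample_table")
      .option("user", "sa")
      .option("password", "<password>")
      .load())
df.show()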
@zacqed Thank you very much! Your information helped me a lot!