
No Suitable driver error

See original GitHub issue

When trying to write to Azure SQL using the snippet below, we see the following error:

final_df.write.format("com.microsoft.sqlserver.jdbc.spark") \
    .mode("overwrite") \
    .option("url", url) \
    .option("dbtable", table_name) \
    .option("user", username) \
    .option("password", password) \
    .save()

java.sql.SQLException: No suitable driver
  at java.sql.DriverManager.getDriver(DriverManager.java:315)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.$anonfun$driverClass$2(JDBCOptions.scala:105)
  at scala.Option.getOrElse(Option.scala:189)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:105)
  at org.apache.spark.sql.execution.datasources.jdbc.JdbcOptionsInWrite.<init>(JDBCOptions.scala:194)
  at com.microsoft.sqlserver.jdbc.spark.SQLServerBulkJdbcOptions.<init>(SQLServerBulkJdbcOptions.scala:25)
  at com.microsoft.sqlserver.jdbc.spark.SQLServerBulkJdbcOptions.<init>(SQLServerBulkJdbcOptions.scala:27)
  at com.microsoft.sqlserver.jdbc.spark.DefaultSource.createRelation(DefaultSource.scala:55)
  at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:46)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:90)
  at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:180)
  at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:218)
  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
  at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:215)
  at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:176)
  at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:127)
  at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:126)
  at org.apache.spark.sql.DataFrameWriter.$anonfun$runCommand$1(DataFrameWriter.scala:962)
  at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:100)
  at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:160)
  at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:87)
  at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:767)
  at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
  at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:962)
  at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:414)
  at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:398)
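The trace shows where the failure originates: with no explicit driver option, Spark's JDBCOptions falls back to java.sql.DriverManager.getDriver(url), which throws when no registered driver accepts the URL. A rough Python sketch of that fallback logic (the function and registry here are illustrative stand-ins, not Spark's actual API):

```python
# Illustrative registry: driver class name -> URL prefix it accepts.
registered_drivers = {}

def resolve_driver(options):
    """Stand-in for Spark's driver lookup; not the real Spark API."""
    if "driver" in options:
        # An explicit "driver" option short-circuits the DriverManager scan.
        return options["driver"]
    for cls, prefix in registered_drivers.items():
        # DriverManager-style scan: first driver that accepts the URL wins.
        if options["url"].startswith(prefix):
            return cls
    raise RuntimeError("java.sql.SQLException: No suitable driver")

opts = {"url": "jdbc:sqlserver://myserver.database.windows.net:1433"}
# resolve_driver(opts) raises here, since nothing is registered for jdbc:sqlserver.
opts["driver"] = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
# Now resolve_driver(opts) returns the explicitly named class.
```

This mirrors why the fixes below work: either the driver jar registers itself with DriverManager, or the driver option bypasses the lookup entirely.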

Issue Analytics

  • State: closed
  • Created 2 years ago
  • Reactions: 1
  • Comments: 6

Top GitHub Comments

1 reaction
madhu-monetize commented, Aug 4, 2021

Already…here is what I could do to make it work. Something is either goofed up in the dependencies, or the docs need to be updated:

  1. Add the JDBC driver explicitly to Spark:

     bin/spark-shell --packages org.apache.hadoop:hadoop-azure:2.7.3,com.microsoft.azure:azure-storage:8.6.6,com.microsoft.azure:spark-mssql-connector_2.12:1.1.0,com.microsoft.sqlserver:mssql-jdbc:8.4.1.jre8

  2. Add the driver option:

     final_df.write.format("com.microsoft.sqlserver.jdbc.spark") \
         .mode("overwrite") \
         .option("url", url) \
         .option("dbtable", table_name) \
         .option("user", username) \
         .option("password", password) \
         .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver") \
         .save()
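Putting both steps together, a minimal sketch of the working write. The server, database, table, and credential values below are placeholders; the option names themselves come from the snippets in this thread:

```python
# Placeholders -- substitute real connection details before running.
jdbc_options = {
    "url": "jdbc:sqlserver://<server>.database.windows.net:1433;database=<db>",
    "dbtable": "dbo.my_table",
    "user": "<username>",
    "password": "<password>",
    # Naming the driver class explicitly avoids the DriverManager lookup
    # that raised "No suitable driver" in the stack trace above.
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
}

# With a live SparkSession and the connector + mssql-jdbc jars on the
# classpath (step 1), the write itself would be:
# final_df.write.format("com.microsoft.sqlserver.jdbc.spark") \
#     .mode("overwrite").options(**jdbc_options).save()
```

Both steps matter: --packages puts the driver jar on the classpath, and the driver option tells Spark which class to load from it.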

0 reactions
luxu1-ms commented, Aug 4, 2021

Glad to know it works now. The connector does have the JDBC dependency.

