
java.util.NoSuchElementException: key not found: path


I’m trying to test this code:

from pyspark.sql import SQLContext
from pyspark import SparkContext
sc = SparkContext(appName="Connect Spark with Redshift")
sql_context = SQLContext(sc)
sc._jsc.hadoopConfiguration().set("fs.s3n.awsAccessKeyId", "ACCESSID")
sc._jsc.hadoopConfiguration().set("fs.s3n.awsSecretAccessKey", "ACEESKEY")
df = sql_context.read \
    .option("url", "jdbc:redshift://example.coyf2i236wts.eu-central1.redshift.amazonaws.com:5439/agcdb?user=user&password=pwd") \
    .option("dbtable", "table_name") \
    .option("tempdir", "s3://bucket/path") \
    .load()

but I’m getting this error:

[Screenshot of the stack trace: java.util.NoSuchElementException: key not found: path]

Any ideas?

Issue Analytics

  • State: closed
  • Created: 7 years ago
  • Comments: 10 (5 by maintainers)

Top GitHub Comments

tokland commented, Sep 29, 2016 (3 reactions)

Note that you put --jars after the Python script, but it is an option of spark-submit and has to come before the script. For the record, this worked for me:

$ spark-submit --jars spark-redshift_2.10-1.1.0.jar,RedshiftJDBC.jar,minimal-json-0.9.4.jar test-redshift.py
JoshRosen commented, Jul 13, 2016 (1 reaction)

Okay, and you also added .format("com.databricks.spark.redshift") to your code?
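
Putting the two comments together, a minimal sketch of what the corrected read call could look like, reusing the placeholder credentials and URL from the question and assuming the spark-redshift and Redshift JDBC jars are supplied via --jars as shown above:

from pyspark.sql import SQLContext
from pyspark import SparkContext

sc = SparkContext(appName="Connect Spark with Redshift")
sql_context = SQLContext(sc)

# S3 credentials used by spark-redshift for the temp directory (placeholders)
sc._jsc.hadoopConfiguration().set("fs.s3n.awsAccessKeyId", "ACCESSID")
sc._jsc.hadoopConfiguration().set("fs.s3n.awsSecretAccessKey", "SECRETKEY")

# The missing piece: name the data source explicitly. Without .format(), Spark
# falls back to its default source, which looks for a "path" option and fails
# with the reported "key not found: path" error.
df = sql_context.read \
    .format("com.databricks.spark.redshift") \
    .option("url", "jdbc:redshift://example.coyf2i236wts.eu-central1.redshift.amazonaws.com:5439/agcdb?user=user&password=pwd") \
    .option("dbtable", "table_name") \
    .option("tempdir", "s3://bucket/path") \
    .load()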


Top Results From Across the Web

  • Spark throws java.util.NoSuchElementException: key not found
    I believe the issue is because of closure. When you run your application locally, everything might be running in the same memory/process.
  • key not found: path -- when writing new table · Issue #205 ...
    Testing in a spark-shell, I can successfully create tables, insertIntoTable and create from External successfully, however, when loading an ...
  • RE: SparkSql - java.util.NoSuchElementException: key not found
    NoSuchElementException : key not found: node when access JSON Array From: ... but looks like my syntax is off: sqlContext.sql( "SELECT path,`timestamp`, ...
  • java.util.NoSuchElementException: key not found: date
    I'm guessing the issue is that your WishCountTable class is not being properly set on the executor classpath. Unfortunately you haven't sent the...
  • java.util.NoSuchElementException: key not found
    java.util.NoSuchElementException: key not found: _PYSPARK_DRIVER_CALLBACK_HOST at scala.collection.MapLike$class.default(MapLike.scala:228)
