
Can't import sparkdl with spark-deep-learning-assembly-0.1.0-spark2.1.jar


First of all, thank you for a great library!

I tried to use sparkdl in PySpark, but couldn't import it. The detailed procedure is as follows:

# make sparkdl jar
build/sbt assembly

# run pyspark with sparkdl
pyspark --master local[4] --jars target/scala-2.11/spark-deep-learning-assembly-0.1.0-spark2.1.jar

# import sparkdl
import sparkdl
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: No module named sparkdl

After digging around in a few places, I found that it works if I extract the jar file as follows.

# copy the assembly jar into a scratch directory and extract it
cd target/scala-2.11
mkdir tmp
cp spark-deep-learning-assembly-0.1.0-spark2.1.jar tmp/
cd tmp
jar xf spark-deep-learning-assembly-0.1.0-spark2.1.jar

# run pyspark from the directory containing the extracted files
pyspark --jars spark-deep-learning-assembly-0.1.0-spark2.1.jar

import sparkdl
Using TensorFlow backend.

Edit 1: The second method only works when pyspark is launched from the directory where the jar file was extracted.

Best wishes, HanCheol
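
A plausible explanation for the directory dependence above: --jars only puts the assembly on the JVM classpath, while Python resolves import sparkdl through sys.path, so the import succeeds only because the interactive shell's working directory (which then contains the extracted sparkdl/ package) is searched. As a sketch of the same idea without extracting anything (this assumes the Python sources sit at the root of the assembly jar; the path is illustrative and not something tested in the thread), the jar itself can be appended to sys.path, since a jar is a zip archive and CPython's zipimport can load pure-Python packages from it:

>>> import sys
>>> # assumption: the sparkdl Python package sits at the root of the assembly jar
>>> sys.path.append("target/scala-2.11/spark-deep-learning-assembly-0.1.0-spark2.1.jar")
>>> import sparkdl  # should now resolve via zipimport if the layout assumption holds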

Issue Analytics

  • State: closed
  • Created: 6 years ago
  • Comments: 14 (3 by maintainers)

Top GitHub Comments

5 reactions
priancho commented, Jun 13, 2017

After a few hours of googling and testing, I finally found a complete solution to my problem.

When I run pyspark in local mode, the --packages option is enough to import sparkdl.

$ pyspark --master local[1] --packages databricks:spark-deep-learning:0.1.0-spark2.1-s_2.11

>>> import sparkdl
Using TensorFlow backend.

But it breaks when pyspark runs in yarn mode.

$ pyspark --master yarn --packages databricks:spark-deep-learning:0.1.0-spark2.1-s_2.11

>>> import sparkdl
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: No module named sparkdl

Interestingly, spark-submit runs a Python program containing the same import statement without any problem.

Based on the information in https://issues.apache.org/jira/browse/SPARK-5185, I manually added the paths of the downloaded jars to the sys.path variable (the runtime equivalent of PYTHONPATH), and it started to work.

$ pyspark --master yarn --packages databricks:spark-deep-learning:0.1.0-spark2.1-s_2.11

>>> import sys, glob, os
>>> sys.path.extend(glob.glob(os.path.join(os.path.expanduser("~"), ".ivy2/jars/*.jar")))
>>>
>>> import sparkdl
Using TensorFlow backend.
>>> my_images = sparkdl.readImages("data/flower_photos/daisy/*.jpg")
>>> my_images.show()
+--------------------+--------------------+
|            filePath|               image|
+--------------------+--------------------+
|hdfs://mycluster/...|[RGB,263,320,3,[B...|
|hdfs://mycluster/...|[RGB,313,500,3,[B...|
|hdfs://mycluster/...|[RGB,215,320,3,[B...|
...

Based on this trial and error, it seems that pyspark in yarn mode doesn't properly add the jar files pulled in by the --packages option to PYTHONPATH.

Best wishes, Hancheol
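
To check whether the same diagnosis applies to another setup, a quick probe inside the pyspark shell (a hypothetical session, not from the thread) shows what, if anything, pyspark put on the Python search path for the downloaded jars:

>>> import os, sys
>>> os.environ.get("PYTHONPATH", "")             # what pyspark exported, if anything
>>> [p for p in sys.path if p.endswith(".jar")]  # jar entries Python could import from

If the second list comes back empty, import sparkdl will keep failing until the jar paths are added manually, as in the sys.path.extend workaround above.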

2 reactions
sueann commented, Jun 9, 2017

If you don’t need to change the code, the easiest way to start pyspark with the library would be

pyspark --packages databricks:spark-deep-learning:0.1.0-spark2.1-s_2.11

If you need to use the jar instead, it should work if you also add the jar to the PYTHONPATH environment variable before running the pyspark command, e.g. for bash:

export PYTHONPATH=$PYTHONPATH:<path/to/assembly/jar>

Hope that works!
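
As a quick sanity check after exporting PYTHONPATH this way (a hypothetical session; the jar path is whatever was exported above), the jar should show up on sys.path when the shell starts, and the import should then go through:

>>> import sys
>>> [p for p in sys.path if p.endswith("spark-deep-learning-assembly-0.1.0-spark2.1.jar")]  # expect one entry if the export took effect
>>> import sparkdl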
