[BUG] Failed to install SynapseML on existing cluster (Windows OS), package commons-codec not found
SynapseML version
0.10.1
System information
Scala 2.12.15, Spark 3.2.2, local spark-shell on Windows 11 (the issue does not exist on Linux Ubuntu)
Describe the problem
Running spark-shell --packages com.microsoft.azure:synapseml_2.12:0.10.1
fails with: [NOT FOUND ] commons-codec#commons-codec;1.10!commons-codec.jar (0ms)
Code to reproduce issue
Execute: spark-shell --packages com.microsoft.azure:synapseml_2.12:0.10.1
Other info / logs
:: problems summary ::
:::: WARNINGS
[NOT FOUND ] commons-codec#commons-codec;1.10!commons-codec.jar (0ms)
==== local-m2-cache: tried
file:/C:/Users/<name>/.m2/repository/commons-codec/commons-codec/1.10/commons-codec-1.10.jar
::::::::::::::::::::::::::::::::::::::::::::::
:: FAILED DOWNLOADS ::
:: ^ see resolution messages for details ^ ::
::::::::::::::::::::::::::::::::::::::::::::::
:: commons-codec#commons-codec;1.10!commons-codec.jar
::::::::::::::::::::::::::::::::::::::::::::::
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [download failed: commons-codec#commons-codec;1.10!commons-codec.jar]
	at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1447)
	at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185)
	at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:308)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:898)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
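The resolution log shows Ivy probing only the local-m2-cache resolver. A quick way to confirm whether the jar it looked for is actually present at that path (a minimal sketch, assuming PowerShell on Windows; the path mirrors the local-m2-cache entry in the log above):

# Returns False if the jar Ivy looked for is missing from the local Maven cache
Test-Path "$env:USERPROFILE\.m2\repository\commons-codec\commons-codec\1.10\commons-codec-1.10.jar"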
What component(s) does this bug affect?
- area/cognitive: Cognitive project
- area/core: Core project
- area/deep-learning: DeepLearning project
- area/lightgbm: Lightgbm project
- area/opencv: Opencv project
- area/vw: VW project
- area/website: Website
- area/build: Project build system
- area/notebooks: Samples under notebooks folder
- area/docker: Docker usage
- area/models: models related issue
What language(s) does this bug affect?
- language/scala: Scala source code
- language/python: Pyspark APIs
- language/r: R APIs
- language/csharp: .NET APIs
- language/new: Proposals for new client languages
What integration(s) does this bug affect?
- integrations/synapse: Azure Synapse integrations
- integrations/azureml: Azure ML integrations
- integrations/databricks: Databricks integrations
Issue Analytics
- State:
- Created: a year ago
- Comments: 9 (3 by maintainers)
Top GitHub Comments
@dylanw-oss, there is a jar that's missing. Just download it from here: https://archive.apache.org/dist/commons/codec/binaries/commons-codec-1.10-bin.zip and put it into the appropriate location. For me, that was the /Users/username/.m2/repository/commons-codec/commons-codec/1.10/ folder. For you it should be this directory: C:/Users/<name>/.m2/repository/commons-codec/commons-codec/1.10/
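A minimal sketch of this workaround, assuming PowerShell on Windows; it pulls the single jar straight from Maven Central rather than unzipping the Apache archive bundle suggested above:

# Create the cache directory Ivy expects and fetch the missing jar into it
$dest = "$env:USERPROFILE\.m2\repository\commons-codec\commons-codec\1.10"
New-Item -ItemType Directory -Force -Path $dest | Out-Null
Invoke-WebRequest -Uri "https://repo1.maven.org/maven2/commons-codec/commons-codec/1.10/commons-codec-1.10.jar" -OutFile "$dest\commons-codec-1.10.jar"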
@mhamilton723, deleting all subfolders of the Ivy cache does not help; I still hit the same issue. @eerga's solution works for me: the .m2/repository/commons-codec/commons-codec/1.10/ folder contained no jar file, so I manually downloaded the zip, placed the jar into that folder, and the issue is gone. Btw, I tested on Ubuntu and there is no such issue at all. A speculative alternative, sketched below, is to remove the incomplete cache entry instead of adding to it.
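If the root cause is a stale .m2 entry (a pom present without its jar), another option is to delete that one directory and let spark-shell re-resolve the artifact from a remote repository on the next run. This is an untested assumption, not a confirmed fix (PowerShell):

# Speculative: remove the incomplete local Maven cache entry so the next
# spark-shell run can fall through to a remote repository during resolution
Remove-Item -Recurse -Force "$env:USERPROFILE\.m2\repository\commons-codec\commons-codec\1.10"
spark-shell --packages com.microsoft.azure:synapseml_2.12:0.10.1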