Unable to build Spark Client
I tried to build the Spark client, but installing the dependencies failed. I don't know much about Scala and sbt, but I did my best to fix it. First, it could not resolve sbt-assembly:
./build_client.sh
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: com.eed3si9n#sbt-assembly;0.14.3: not found
[warn] :: com.twitter#scrooge-sbt-plugin;4.20.0: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Some unresolved dependencies have extra attributes. Check that these dependencies exist with the requested attributes.
[warn] com.eed3si9n:sbt-assembly:0.14.3 (scalaVersion=2.12, sbtVersion=1.0)
[warn] com.twitter:scrooge-sbt-plugin:4.20.0 (scalaVersion=2.12, sbtVersion=1.0)
[warn]
[warn] Note: Unresolved dependencies path:
[warn] com.eed3si9n:sbt-assembly:0.14.3 (scalaVersion=2.12, sbtVersion=1.0) (/Users/nkreiling/dev/inovex/modeldb/modeldb/client/scala/libs/spark.ml/project/assembly.sbt#L1-2)
[warn] +- default:spark-ml-build:0.1.0-SNAPSHOT (scalaVersion=2.12, sbtVersion=1.0)
[warn] com.twitter:scrooge-sbt-plugin:4.20.0 (scalaVersion=2.12, sbtVersion=1.0) (/Users/nkreiling/dev/inovex/modeldb/modeldb/client/scala/libs/spark.ml/project/plugins.sbt#L5-6)
[warn] +- default:spark-ml-build:0.1.0-SNAPSHOT (scalaVersion=2.12, sbtVersion=1.0)
[error] sbt.librarymanagement.ResolveException: unresolved dependency: com.eed3si9n#sbt-assembly;0.14.3: not found
[error] unresolved dependency: com.twitter#scrooge-sbt-plugin;4.20.0: not found
[error] at sbt.internal.librarymanagement.IvyActions$.resolveAndRetrieve(IvyActions.scala:334)
[error] at sbt.internal.librarymanagement.IvyActions$.$anonfun$updateEither$1(IvyActions.scala:208)
[error] at sbt.internal.librarymanagement.IvySbt$Module.$anonfun$withModule$1(Ivy.scala:243)
[error] at sbt.internal.librarymanagement.IvySbt.$anonfun$withIvy$1(Ivy.scala:204)
[error] at sbt.internal.librarymanagement.IvySbt.sbt$internal$librarymanagement$IvySbt$$action$1(Ivy.scala:70)
[error] at sbt.internal.librarymanagement.IvySbt$$anon$3.call(Ivy.scala:77)
[error] at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:95)
[error] at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:80)
[error] at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:99)
[error] at xsbt.boot.Using$.withResource(Using.scala:10)
[error] at xsbt.boot.Using$.apply(Using.scala:9)
[error] at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:60)
[error] at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:50)
[error] at xsbt.boot.Locks$.apply0(Locks.scala:31)
[error] at xsbt.boot.Locks$.apply(Locks.scala:28)
[error] at sbt.internal.librarymanagement.IvySbt.withDefaultLogger(Ivy.scala:77)
[error] at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:199)
[error] at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:196)
[error] at sbt.internal.librarymanagement.IvySbt$Module.withModule(Ivy.scala:242)
[error] at sbt.internal.librarymanagement.IvyActions$.updateEither(IvyActions.scala:193)
[error] at sbt.librarymanagement.ivy.IvyDependencyResolution.update(IvyDependencyResolution.scala:20)
[error] at sbt.librarymanagement.DependencyResolution.update(DependencyResolution.scala:56)
[error] at sbt.internal.LibraryManagement$.resolve$1(LibraryManagement.scala:46)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$12(LibraryManagement.scala:99)
[error] at sbt.util.Tracked$.$anonfun$lastOutput$1(Tracked.scala:68)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$19(LibraryManagement.scala:112)
[error] at scala.util.control.Exception$Catch.apply(Exception.scala:224)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11(LibraryManagement.scala:112)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11$adapted(LibraryManagement.scala:95)
[error] at sbt.util.Tracked$.$anonfun$inputChanged$1(Tracked.scala:149)
[error] at sbt.internal.LibraryManagement$.cachedUpdate(LibraryManagement.scala:126)
[error] at sbt.Classpaths$.$anonfun$updateTask$5(Defaults.scala:2385)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:44)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:39)
[error] at sbt.std.Transform$$anon$4.work(System.scala:66)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:262)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
[error] at sbt.Execute.work(Execute.scala:271)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:262)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:174)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:36)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[error] at java.lang.Thread.run(Thread.java:745)
[error] (update) sbt.librarymanagement.ResolveException: unresolved dependency: com.eed3si9n#sbt-assembly;0.14.3: not found
[error] unresolved dependency: com.twitter#scrooge-sbt-plugin;4.20.0: not found
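The `(scalaVersion=2.12, sbtVersion=1.0)` attributes in the warnings are the key detail: sbt resolves plugins against the Scala and sbt versions of the build itself, and sbt-assembly 0.14.3 and scrooge-sbt-plugin 4.20.0 appear to have been published only for sbt 0.13 (Scala 2.10), so sbt 1.x cannot find them. One hedged alternative to upgrading the plugins, assuming the build was originally authored against sbt 0.13, is to pin the launcher to an sbt 0.13.x release:

```
# project/build.properties
# Pin sbt to a 0.13.x release so the old plugin artifacts resolve.
# 0.13.16 is one such release; any 0.13.x the plugins support should work.
sbt.version=0.13.16
```

The other route, taken below, is to bump the plugins to versions that are cross-published for sbt 1.x.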
I upgraded those to
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")
and
addSbtPlugin("com.twitter" % "scrooge-sbt-plugin" % "18.4.0")
but then I ran into:
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: com.twitter#finagle-thrift_2.12;6.36.0: not found
[warn] :: org.apache.spark#spark-core_2.12;2.1.0: not found
[warn] :: org.apache.spark#spark-sql_2.12;2.1.0: not found
[warn] :: org.apache.spark#spark-mllib_2.12;2.1.0: not found
[warn] :: org.apache.spark#spark-hive_2.12;2.1.0: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Unresolved dependencies path:
[warn] org.apache.spark:spark-hive_2.12:2.1.0 (/Users/nkreiling/dev/inovex/modeldb/modeldb/client/scala/libs/spark.ml/build.sbt#L13-14)
[warn] +- model-db-spark-client:model-db-spark-client_2.12:1.0
[warn] com.twitter:finagle-thrift_2.12:6.36.0 (/Users/nkreiling/dev/inovex/modeldb/modeldb/client/scala/libs/spark.ml/build.sbt#L17-18)
[warn] +- model-db-spark-client:model-db-spark-client_2.12:1.0
[warn] org.apache.spark:spark-core_2.12:2.1.0 (/Users/nkreiling/dev/inovex/modeldb/modeldb/client/scala/libs/spark.ml/build.sbt#L10-11)
[warn] +- model-db-spark-client:model-db-spark-client_2.12:1.0
[warn] org.apache.spark:spark-sql_2.12:2.1.0 (/Users/nkreiling/dev/inovex/modeldb/modeldb/client/scala/libs/spark.ml/build.sbt#L11-12)
[warn] +- model-db-spark-client:model-db-spark-client_2.12:1.0
[warn] org.apache.spark:spark-mllib_2.12:2.1.0 (/Users/nkreiling/dev/inovex/modeldb/modeldb/client/scala/libs/spark.ml/build.sbt#L12-13)
[warn] +- model-db-spark-client:model-db-spark-client_2.12:1.0
[error] sbt.librarymanagement.ResolveException: unresolved dependency: com.twitter#finagle-thrift_2.12;6.36.0: not found
[error] unresolved dependency: org.apache.spark#spark-core_2.12;2.1.0: not found
[error] unresolved dependency: org.apache.spark#spark-sql_2.12;2.1.0: not found
[error] unresolved dependency: org.apache.spark#spark-mllib_2.12;2.1.0: not found
[error] unresolved dependency: org.apache.spark#spark-hive_2.12;2.1.0: not found
[error] at sbt.internal.librarymanagement.IvyActions$.resolveAndRetrieve(IvyActions.scala:334)
[error] at sbt.internal.librarymanagement.IvyActions$.$anonfun$updateEither$1(IvyActions.scala:208)
[error] at sbt.internal.librarymanagement.IvySbt$Module.$anonfun$withModule$1(Ivy.scala:243)
[error] at sbt.internal.librarymanagement.IvySbt.$anonfun$withIvy$1(Ivy.scala:204)
[error] at sbt.internal.librarymanagement.IvySbt.sbt$internal$librarymanagement$IvySbt$$action$1(Ivy.scala:70)
[error] at sbt.internal.librarymanagement.IvySbt$$anon$3.call(Ivy.scala:77)
[error] at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:95)
[error] at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:80)
[error] at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:99)
[error] at xsbt.boot.Using$.withResource(Using.scala:10)
[error] at xsbt.boot.Using$.apply(Using.scala:9)
[error] at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:60)
[error] at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:50)
[error] at xsbt.boot.Locks$.apply0(Locks.scala:31)
[error] at xsbt.boot.Locks$.apply(Locks.scala:28)
[error] at sbt.internal.librarymanagement.IvySbt.withDefaultLogger(Ivy.scala:77)
[error] at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:199)
[error] at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:196)
[error] at sbt.internal.librarymanagement.IvySbt$Module.withModule(Ivy.scala:242)
[error] at sbt.internal.librarymanagement.IvyActions$.updateEither(IvyActions.scala:193)
[error] at sbt.librarymanagement.ivy.IvyDependencyResolution.update(IvyDependencyResolution.scala:20)
[error] at sbt.librarymanagement.DependencyResolution.update(DependencyResolution.scala:56)
[error] at sbt.internal.LibraryManagement$.resolve$1(LibraryManagement.scala:46)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$12(LibraryManagement.scala:99)
[error] at sbt.util.Tracked$.$anonfun$lastOutput$1(Tracked.scala:68)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$19(LibraryManagement.scala:112)
[error] at scala.util.control.Exception$Catch.apply(Exception.scala:224)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11(LibraryManagement.scala:112)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11$adapted(LibraryManagement.scala:95)
[error] at sbt.util.Tracked$.$anonfun$inputChanged$1(Tracked.scala:149)
[error] at sbt.internal.LibraryManagement$.cachedUpdate(LibraryManagement.scala:126)
[error] at sbt.Classpaths$.$anonfun$updateTask$5(Defaults.scala:2385)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:44)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:39)
[error] at sbt.std.Transform$$anon$4.work(System.scala:66)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:262)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
[error] at sbt.Execute.work(Execute.scala:271)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:262)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:174)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:36)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[error] at java.lang.Thread.run(Thread.java:745)
[error] (update) sbt.librarymanagement.ResolveException: unresolved dependency: com.twitter#finagle-thrift_2.12;6.36.0: not found
[error] unresolved dependency: org.apache.spark#spark-core_2.12;2.1.0: not found
[error] unresolved dependency: org.apache.spark#spark-sql_2.12;2.1.0: not found
[error] unresolved dependency: org.apache.spark#spark-mllib_2.12;2.1.0: not found
[error] unresolved dependency: org.apache.spark#spark-hive_2.12;2.1.0: not found
Environment: macOS, Scala 2.12.1, sbt 1.1.4
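The `_2.12` suffix on every unresolved artifact points at the same kind of mismatch as before: Spark 2.1.0 (and finagle-thrift 6.36.0) were published for Scala 2.10/2.11 only, not 2.12. A minimal sketch of the relevant `build.sbt` settings, assuming the dependency list shown in the warnings above (the exact layout of the repo's build.sbt may differ):

```scala
// build.sbt (sketch): cross-build for Scala 2.11, for which Spark 2.1.0
// and finagle-thrift 6.36.0 actually publish artifacts.
scalaVersion := "2.11.8" // any 2.11.x patch release should work here

libraryDependencies ++= Seq(
  // %% appends the Scala binary version, so these resolve as *_2.11
  "org.apache.spark" %% "spark-core"     % "2.1.0",
  "org.apache.spark" %% "spark-sql"      % "2.1.0",
  "org.apache.spark" %% "spark-mllib"    % "2.1.0",
  "org.apache.spark" %% "spark-hive"     % "2.1.0",
  "com.twitter"      %% "finagle-thrift" % "6.36.0"
)
```

Alternatively, keeping Scala 2.12 would require bumping Spark itself to a release that publishes 2.12 artifacts, which may not be compatible with the rest of the client.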
Issue Analytics
- Created 5 years ago
- Comments: 9
Okay, that helped. Now I run into:
Still no luck. I tried with the same config. Getting the below error.