
Unable to build Spark Client

See original GitHub issue

I tried to build the Spark client, but failed while installing dependencies. I don't know much about Scala and sbt, but I did my best to fix it. First it could not resolve sbt-assembly:

./build_client.sh
[warn] 	::::::::::::::::::::::::::::::::::::::::::::::
[warn] 	::          UNRESOLVED DEPENDENCIES         ::
[warn] 	::::::::::::::::::::::::::::::::::::::::::::::
[warn] 	:: com.eed3si9n#sbt-assembly;0.14.3: not found
[warn] 	:: com.twitter#scrooge-sbt-plugin;4.20.0: not found
[warn] 	::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] 	Note: Some unresolved dependencies have extra attributes.  Check that these dependencies exist with the requested attributes.
[warn] 		com.eed3si9n:sbt-assembly:0.14.3 (scalaVersion=2.12, sbtVersion=1.0)
[warn] 		com.twitter:scrooge-sbt-plugin:4.20.0 (scalaVersion=2.12, sbtVersion=1.0)
[warn]
[warn] 	Note: Unresolved dependencies path:
[warn] 		com.eed3si9n:sbt-assembly:0.14.3 (scalaVersion=2.12, sbtVersion=1.0) (/Users/nkreiling/dev/inovex/modeldb/modeldb/client/scala/libs/spark.ml/project/assembly.sbt#L1-2)
[warn] 		  +- default:spark-ml-build:0.1.0-SNAPSHOT (scalaVersion=2.12, sbtVersion=1.0)
[warn] 		com.twitter:scrooge-sbt-plugin:4.20.0 (scalaVersion=2.12, sbtVersion=1.0) (/Users/nkreiling/dev/inovex/modeldb/modeldb/client/scala/libs/spark.ml/project/plugins.sbt#L5-6)
[warn] 		  +- default:spark-ml-build:0.1.0-SNAPSHOT (scalaVersion=2.12, sbtVersion=1.0)
[error] sbt.librarymanagement.ResolveException: unresolved dependency: com.eed3si9n#sbt-assembly;0.14.3: not found
[error] unresolved dependency: com.twitter#scrooge-sbt-plugin;4.20.0: not found
[error] 	at sbt.internal.librarymanagement.IvyActions$.resolveAndRetrieve(IvyActions.scala:334)
[error] 	at sbt.internal.librarymanagement.IvyActions$.$anonfun$updateEither$1(IvyActions.scala:208)
[error] 	at sbt.internal.librarymanagement.IvySbt$Module.$anonfun$withModule$1(Ivy.scala:243)
[error] 	at sbt.internal.librarymanagement.IvySbt.$anonfun$withIvy$1(Ivy.scala:204)
[error] 	at sbt.internal.librarymanagement.IvySbt.sbt$internal$librarymanagement$IvySbt$$action$1(Ivy.scala:70)
[error] 	at sbt.internal.librarymanagement.IvySbt$$anon$3.call(Ivy.scala:77)
[error] 	at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:95)
[error] 	at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:80)
[error] 	at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:99)
[error] 	at xsbt.boot.Using$.withResource(Using.scala:10)
[error] 	at xsbt.boot.Using$.apply(Using.scala:9)
[error] 	at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:60)
[error] 	at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:50)
[error] 	at xsbt.boot.Locks$.apply0(Locks.scala:31)
[error] 	at xsbt.boot.Locks$.apply(Locks.scala:28)
[error] 	at sbt.internal.librarymanagement.IvySbt.withDefaultLogger(Ivy.scala:77)
[error] 	at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:199)
[error] 	at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:196)
[error] 	at sbt.internal.librarymanagement.IvySbt$Module.withModule(Ivy.scala:242)
[error] 	at sbt.internal.librarymanagement.IvyActions$.updateEither(IvyActions.scala:193)
[error] 	at sbt.librarymanagement.ivy.IvyDependencyResolution.update(IvyDependencyResolution.scala:20)
[error] 	at sbt.librarymanagement.DependencyResolution.update(DependencyResolution.scala:56)
[error] 	at sbt.internal.LibraryManagement$.resolve$1(LibraryManagement.scala:46)
[error] 	at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$12(LibraryManagement.scala:99)
[error] 	at sbt.util.Tracked$.$anonfun$lastOutput$1(Tracked.scala:68)
[error] 	at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$19(LibraryManagement.scala:112)
[error] 	at scala.util.control.Exception$Catch.apply(Exception.scala:224)
[error] 	at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11(LibraryManagement.scala:112)
[error] 	at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11$adapted(LibraryManagement.scala:95)
[error] 	at sbt.util.Tracked$.$anonfun$inputChanged$1(Tracked.scala:149)
[error] 	at sbt.internal.LibraryManagement$.cachedUpdate(LibraryManagement.scala:126)
[error] 	at sbt.Classpaths$.$anonfun$updateTask$5(Defaults.scala:2385)
[error] 	at scala.Function1.$anonfun$compose$1(Function1.scala:44)
[error] 	at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:39)
[error] 	at sbt.std.Transform$$anon$4.work(System.scala:66)
[error] 	at sbt.Execute.$anonfun$submit$2(Execute.scala:262)
[error] 	at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
[error] 	at sbt.Execute.work(Execute.scala:271)
[error] 	at sbt.Execute.$anonfun$submit$1(Execute.scala:262)
[error] 	at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:174)
[error] 	at sbt.CompletionService$$anon$2.call(CompletionService.scala:36)
[error] 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[error] 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[error] 	at java.lang.Thread.run(Thread.java:745)
[error] (update) sbt.librarymanagement.ResolveException: unresolved dependency: com.eed3si9n#sbt-assembly;0.14.3: not found
[error] unresolved dependency: com.twitter#scrooge-sbt-plugin;4.20.0: not found

I upgraded those to addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6") and addSbtPlugin("com.twitter" % "scrooge-sbt-plugin" % "18.4.0").
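Those two original plugin versions appear to have been published only for sbt 0.13, which matches the scalaVersion=2.12, sbtVersion=1.0 attributes the resolver is asking for in the log above. For reference, a minimal sketch of the two plugin files after that bump, using the file paths reported in the resolution log; the versions are just ones I believe have sbt 1.x builds, so treat them as a starting point rather than the project's official pins:

// project/assembly.sbt -- sbt-assembly gained sbt 1.x artifacts around the 0.14.5/0.14.6 line
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")

// project/plugins.sbt -- scrooge-sbt-plugin moved to date-based versions with sbt 1.x support
addSbtPlugin("com.twitter" % "scrooge-sbt-plugin" % "18.4.0")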

But then I ran into:

[warn] 	::::::::::::::::::::::::::::::::::::::::::::::
[warn] 	::          UNRESOLVED DEPENDENCIES         ::
[warn] 	::::::::::::::::::::::::::::::::::::::::::::::
[warn] 	:: com.twitter#finagle-thrift_2.12;6.36.0: not found
[warn] 	:: org.apache.spark#spark-core_2.12;2.1.0: not found
[warn] 	:: org.apache.spark#spark-sql_2.12;2.1.0: not found
[warn] 	:: org.apache.spark#spark-mllib_2.12;2.1.0: not found
[warn] 	:: org.apache.spark#spark-hive_2.12;2.1.0: not found
[warn] 	::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] 	Note: Unresolved dependencies path:
[warn] 		org.apache.spark:spark-hive_2.12:2.1.0 (/Users/nkreiling/dev/inovex/modeldb/modeldb/client/scala/libs/spark.ml/build.sbt#L13-14)
[warn] 		  +- model-db-spark-client:model-db-spark-client_2.12:1.0
[warn] 		com.twitter:finagle-thrift_2.12:6.36.0 (/Users/nkreiling/dev/inovex/modeldb/modeldb/client/scala/libs/spark.ml/build.sbt#L17-18)
[warn] 		  +- model-db-spark-client:model-db-spark-client_2.12:1.0
[warn] 		org.apache.spark:spark-core_2.12:2.1.0 (/Users/nkreiling/dev/inovex/modeldb/modeldb/client/scala/libs/spark.ml/build.sbt#L10-11)
[warn] 		  +- model-db-spark-client:model-db-spark-client_2.12:1.0
[warn] 		org.apache.spark:spark-sql_2.12:2.1.0 (/Users/nkreiling/dev/inovex/modeldb/modeldb/client/scala/libs/spark.ml/build.sbt#L11-12)
[warn] 		  +- model-db-spark-client:model-db-spark-client_2.12:1.0
[warn] 		org.apache.spark:spark-mllib_2.12:2.1.0 (/Users/nkreiling/dev/inovex/modeldb/modeldb/client/scala/libs/spark.ml/build.sbt#L12-13)
[warn] 		  +- model-db-spark-client:model-db-spark-client_2.12:1.0
[error] sbt.librarymanagement.ResolveException: unresolved dependency: com.twitter#finagle-thrift_2.12;6.36.0: not found
[error] unresolved dependency: org.apache.spark#spark-core_2.12;2.1.0: not found
[error] unresolved dependency: org.apache.spark#spark-sql_2.12;2.1.0: not found
[error] unresolved dependency: org.apache.spark#spark-mllib_2.12;2.1.0: not found
[error] unresolved dependency: org.apache.spark#spark-hive_2.12;2.1.0: not found
[error] 	at sbt.internal.librarymanagement.IvyActions$.resolveAndRetrieve(IvyActions.scala:334)
[error] 	at sbt.internal.librarymanagement.IvyActions$.$anonfun$updateEither$1(IvyActions.scala:208)
[error] 	at sbt.internal.librarymanagement.IvySbt$Module.$anonfun$withModule$1(Ivy.scala:243)
[error] 	at sbt.internal.librarymanagement.IvySbt.$anonfun$withIvy$1(Ivy.scala:204)
[error] 	at sbt.internal.librarymanagement.IvySbt.sbt$internal$librarymanagement$IvySbt$$action$1(Ivy.scala:70)
[error] 	at sbt.internal.librarymanagement.IvySbt$$anon$3.call(Ivy.scala:77)
[error] 	at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:95)
[error] 	at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:80)
[error] 	at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:99)
[error] 	at xsbt.boot.Using$.withResource(Using.scala:10)
[error] 	at xsbt.boot.Using$.apply(Using.scala:9)
[error] 	at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:60)
[error] 	at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:50)
[error] 	at xsbt.boot.Locks$.apply0(Locks.scala:31)
[error] 	at xsbt.boot.Locks$.apply(Locks.scala:28)
[error] 	at sbt.internal.librarymanagement.IvySbt.withDefaultLogger(Ivy.scala:77)
[error] 	at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:199)
[error] 	at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:196)
[error] 	at sbt.internal.librarymanagement.IvySbt$Module.withModule(Ivy.scala:242)
[error] 	at sbt.internal.librarymanagement.IvyActions$.updateEither(IvyActions.scala:193)
[error] 	at sbt.librarymanagement.ivy.IvyDependencyResolution.update(IvyDependencyResolution.scala:20)
[error] 	at sbt.librarymanagement.DependencyResolution.update(DependencyResolution.scala:56)
[error] 	at sbt.internal.LibraryManagement$.resolve$1(LibraryManagement.scala:46)
[error] 	at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$12(LibraryManagement.scala:99)
[error] 	at sbt.util.Tracked$.$anonfun$lastOutput$1(Tracked.scala:68)
[error] 	at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$19(LibraryManagement.scala:112)
[error] 	at scala.util.control.Exception$Catch.apply(Exception.scala:224)
[error] 	at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11(LibraryManagement.scala:112)
[error] 	at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11$adapted(LibraryManagement.scala:95)
[error] 	at sbt.util.Tracked$.$anonfun$inputChanged$1(Tracked.scala:149)
[error] 	at sbt.internal.LibraryManagement$.cachedUpdate(LibraryManagement.scala:126)
[error] 	at sbt.Classpaths$.$anonfun$updateTask$5(Defaults.scala:2385)
[error] 	at scala.Function1.$anonfun$compose$1(Function1.scala:44)
[error] 	at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:39)
[error] 	at sbt.std.Transform$$anon$4.work(System.scala:66)
[error] 	at sbt.Execute.$anonfun$submit$2(Execute.scala:262)
[error] 	at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
[error] 	at sbt.Execute.work(Execute.scala:271)
[error] 	at sbt.Execute.$anonfun$submit$1(Execute.scala:262)
[error] 	at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:174)
[error] 	at sbt.CompletionService$$anon$2.call(CompletionService.scala:36)
[error] 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[error] 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[error] 	at java.lang.Thread.run(Thread.java:745)
[error] (update) sbt.librarymanagement.ResolveException: unresolved dependency: com.twitter#finagle-thrift_2.12;6.36.0: not found
[error] unresolved dependency: org.apache.spark#spark-core_2.12;2.1.0: not found
[error] unresolved dependency: org.apache.spark#spark-sql_2.12;2.1.0: not found
[error] unresolved dependency: org.apache.spark#spark-mllib_2.12;2.1.0: not found
[error] unresolved dependency: org.apache.spark#spark-hive_2.12;2.1.0: not found

Environment: macOS, Scala 2.12.1, sbt 1.1.4
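The _2.12 suffixes in that second log look like the actual problem: as far as I know, Spark 2.1.0 and finagle-thrift 6.36.0 were only ever published for Scala 2.10/2.11, so they can never resolve against Scala 2.12. A sketch of the kind of pin that would sidestep this, with the dependency list abbreviated from the log above (the repo's real build.sbt may differ):

// build.sbt (sketch) -- Spark 2.1.0 has no Scala 2.12 artifacts, so pin a 2.11 Scala
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"     % "2.1.0",
  "org.apache.spark" %% "spark-sql"      % "2.1.0",
  "org.apache.spark" %% "spark-mllib"    % "2.1.0",
  "org.apache.spark" %% "spark-hive"     % "2.1.0",
  "com.twitter"      %% "finagle-thrift" % "6.36.0"  // likewise published for 2.10/2.11 only
)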

Issue Analytics

  • State: open
  • Created 5 years ago
  • Comments: 9

Top GitHub Comments

1 reaction
krlng commented, May 27, 2018

Okay, that helped. Now I run into:

[info] Loading project definition from /Users/nkreiling/dev/inovex/modeldb/modeldb/client/scala/libs/spark.ml/project
Error wrapping InputStream in GZIPInputStream: java.util.zip.ZipException: Not in GZIP format
	at sbt.ErrorHandling$.translate(ErrorHandling.scala:10)
	at sbt.WrapUsing.open(Using.scala:34)
	at sbt.Using.apply(Using.scala:23)
	at sbt.IO$$anonfun$gzipFileIn$1.apply(IO.scala:810)
	at sbt.IO$$anonfun$gzipFileIn$1.apply(IO.scala:809)
	at sbt.Using.apply(Using.scala:24)
	at sbt.IO$.gzipFileIn(IO.scala:809)
	at sbt.Sync$.readUncaught(Sync.scala:88)
	at sbt.Sync$.readInfo(Sync.scala:84)
	at sbt.Sync$$anonfun$apply$1.apply(Sync.scala:28)
	at sbt.Sync$$anonfun$apply$1.apply(Sync.scala:22)
	at sbt.Defaults$$anonfun$copyResourcesTask$1.apply(Defaults.scala:891)
	at sbt.Defaults$$anonfun$copyResourcesTask$1.apply(Defaults.scala:887)
	at scala.Function4$$anonfun$tupled$1.apply(Function4.scala:35)
	at scala.Function4$$anonfun$tupled$1.apply(Function4.scala:34)
	at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
	at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
	at sbt.std.Transform$$anon$4.work(System.scala:63)
	at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
	at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
	at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
	at sbt.Execute.work(Execute.scala:235)
	at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
	at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
	at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
	at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.zip.ZipException: Not in GZIP format
	at java.util.zip.GZIPInputStream.readHeader(GZIPInputStream.java:165)
	at java.util.zip.GZIPInputStream.<init>(GZIPInputStream.java:79)
	at sbt.Using$$anonfun$gzipInputStream$1.apply(Using.scala:84)
	at sbt.Using$$anonfun$gzipInputStream$1.apply(Using.scala:84)
	at sbt.Using$$anon$1.openImpl(Using.scala:51)
	at sbt.WrapUsing$$anonfun$open$2.apply(Using.scala:34)
	at sbt.ErrorHandling$.translate(ErrorHandling.scala:10)
	at sbt.WrapUsing.open(Using.scala:34)
	at sbt.Using.apply(Using.scala:23)
	at sbt.IO$$anonfun$gzipFileIn$1.apply(IO.scala:810)
	at sbt.IO$$anonfun$gzipFileIn$1.apply(IO.scala:809)
	at sbt.Using.apply(Using.scala:24)
	at sbt.IO$.gzipFileIn(IO.scala:809)
	at sbt.Sync$.readUncaught(Sync.scala:88)
	at sbt.Sync$.readInfo(Sync.scala:84)
	at sbt.Sync$$anonfun$apply$1.apply(Sync.scala:28)
	at sbt.Sync$$anonfun$apply$1.apply(Sync.scala:22)
	at sbt.Defaults$$anonfun$copyResourcesTask$1.apply(Defaults.scala:891)
	at sbt.Defaults$$anonfun$copyResourcesTask$1.apply(Defaults.scala:887)
	at scala.Function4$$anonfun$tupled$1.apply(Function4.scala:35)
	at scala.Function4$$anonfun$tupled$1.apply(Function4.scala:34)
	at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
	at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
	at sbt.std.Transform$$anon$4.work(System.scala:63)
	at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
	at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
	at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
	at sbt.Execute.work(Execute.scala:235)
	at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
	at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
	at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
	at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
0 reactions
jeganthirumeni commented, Jun 14, 2018

Still no luck. I tried with the same config and am getting the error below.

[error] /root/modeldb/client/scala/libs/spark.ml/target/scala-2.11/src_managed/main/thrift/modeldb/ModelDBService.scala:359: too many arguments for method apply: (method: com.twitter.scrooge.ThriftMethod, thriftService: com.twitter.finagle.Service[com.twitter.finagle.thrift.ThriftClientRequest,Array[Byte]], pf: org.apache.thrift.protocol.TProtocolFactory, stats: com.twitter.finagle.stats.StatsReceiver)com.twitter.finagle.Service[method.Args,method.Result] in object ThriftServiceIface
[error]           appendToVectorField = ThriftServiceIface(self.AppendToVectorField, binaryService, pf, stats, responseClassifier),
[error]                                                   ^
[error] /root/modeldb/client/scala/libs/spark.ml/target/scala-2.11/src_managed/main/thrift/modeldb/ModelDBService.scala:360: too many arguments for method apply: (method: com.twitter.scrooge.ThriftMethod, thriftService: com.twitter.finagle.Service[com.twitter.finagle.thrift.ThriftClientRequest,Array[Byte]], pf: org.apache.thrift.protocol.TProtocolFactory, stats: com.twitter.finagle.stats.StatsReceiver)com.twitter.finagle.Service[method.Args,method.Result] in object ThriftServiceIface
[error]           getModel = ThriftServiceIface(self.GetModel, binaryService, pf, stats, responseClassifier),
[error]                                        ^
[error] /root/modeldb/client/scala/libs/spark.ml/target/scala-2.11/src_managed/main/thrift/modeldb/ModelDBService.scala:361: too many arguments for method apply: (method: com.twitter.scrooge.ThriftMethod, thriftService: com.twitter.finagle.Service[com.twitter.finagle.thrift.ThriftClientRequest,Array[Byte]], pf: org.apache.thrift.protocol.TProtocolFactory, stats: com.twitter.finagle.stats.StatsReceiver)com.twitter.finagle.Service[method.Args,method.Result] in object ThriftServiceIface
[error]           getRunsInExperiment = ThriftServiceIface(self.GetRunsInExperiment, binaryService, pf, stats, responseClassifier),
[error]                                                   ^
[error] /root/modeldb/client/scala/libs/spark.ml/target/scala-2.11/src_managed/main/thrift/modeldb/ModelDBService.scala:362: too many arguments for method apply: (method: com.twitter.scrooge.ThriftMethod, thriftService: com.twitter.finagle.Service[com.twitter.finagle.thrift.ThriftClientRequest,Array[Byte]], pf: org.apache.thrift.protocol.TProtocolFactory, stats: com.twitter.finagle.stats.StatsReceiver)com.twitter.finagle.Service[method.Args,method.Result] in object ThriftServiceIface
[error]           getRunsAndExperimentsInProject = ThriftServiceIface(self.GetRunsAndExperimentsInProject, binaryService, pf, stats, responseClassifier),
[error]                                                              ^
[error] /root/modeldb/client/scala/libs/spark.ml/target/scala-2.11/src_managed/main/thrift/modeldb/ModelDBService.scala:363: too many arguments for method apply: (method: com.twitter.scrooge.ThriftMethod, thriftService: com.twitter.finagle.Service[com.twitter.finagle.thrift.ThriftClientRequest,Array[Byte]], pf: org.apache.thrift.protocol.TProtocolFactory, stats: com.twitter.finagle.stats.StatsReceiver)com.twitter.finagle.Service[method.Args,method.Result] in object ThriftServiceIface
[error]           getProjectOverviews = ThriftServiceIface(self.GetProjectOverviews, binaryService, pf, stats, responseClassifier),
[error]                                                   ^
[error] /root/modeldb/client/scala/libs/spark.ml/target/scala-2.11/src_managed/main/thrift/modeldb/ModelDBService.scala:364: too many arguments for method apply: (method: com.twitter.scrooge.ThriftMethod, thriftService: com.twitter.finagle.Service[com.twitter.finagle.thrift.ThriftClientRequest,Array[Byte]], pf: org.apache.thrift.protocol.TProtocolFactory, stats: com.twitter.finagle.stats.StatsReceiver)com.twitter.finagle.Service[method.Args,method.Result] in object ThriftServiceIface
[error]           getExperimentRunDetails = ThriftServiceIface(self.GetExperimentRunDetails, binaryService, pf, stats, responseClassifier),
[error]                                                       ^
[error] /root/modeldb/client/scala/libs/spark.ml/target/scala-2.11/src_managed/main/thrift/modeldb/ModelDBService.scala:365: too many arguments for method apply: (method: com.twitter.scrooge.ThriftMethod, thriftService: com.twitter.finagle.Service[com.twitter.finagle.thrift.ThriftClientRequest,Array[Byte]], pf: org.apache.thrift.protocol.TProtocolFactory, stats: com.twitter.finagle.stats.StatsReceiver)com.twitter.finagle.Service[method.Args,method.Result] in object ThriftServiceIface
[error]           originalFeatures = ThriftServiceIface(self.OriginalFeatures, binaryService, pf, stats, responseClassifier),
[error]                                                ^
[error] /root/modeldb/client/scala/libs/spark.ml/target/scala-2.11/src_managed/main/thrift/modeldb/ModelDBService.scala:366: too many arguments for method apply: (method: com.twitter.scrooge.ThriftMethod, thriftService: com.twitter.finagle.Service[com.twitter.finagle.thrift.ThriftClientRequest,Array[Byte]], pf: org.apache.thrift.protocol.TProtocolFactory, stats: com.twitter.finagle.stats.StatsReceiver)com.twitter.finagle.Service[method.Args,method.Result] in object ThriftServiceIface
[error]           storeTreeModel = ThriftServiceIface(self.StoreTreeModel, binaryService, pf, stats, responseClassifier),
[error]                                              ^
[error] /root/modeldb/client/scala/libs/spark.ml/target/scala-2.11/src_managed/main/thrift/modeldb/ModelDBService.scala:367: too many arguments for method apply: (method: com.twitter.scrooge.ThriftMethod, thriftService: com.twitter.finagle.Service[com.twitter.finagle.thrift.ThriftClientRequest,Array[Byte]], pf: org.apache.thrift.protocol.TProtocolFactory, stats: com.twitter.finagle.stats.StatsReceiver)com.twitter.finagle.Service[method.Args,method.Result] in object ThriftServiceIface
[error]           storePipelineTransformEvent = ThriftServiceIface(self.StorePipelineTransformEvent, binaryService, pf, stats, responseClassifier),
[error]                                                           ^
[error] /root/modeldb/client/scala/libs/spark.ml/target/scala-2.11/src_managed/main/thrift/modeldb/ModelDBService.scala:368: too many arguments for method apply: (method: com.twitter.scrooge.ThriftMethod, thriftService: com.twitter.finagle.Service[com.twitter.finagle.thrift.ThriftClientRequest,Array[Byte]], pf: org.apache.thrift.protocol.TProtocolFactory, stats: com.twitter.finagle.stats.StatsReceiver)com.twitter.finagle.Service[method.Args,method.Result] in object ThriftServiceIface
[error]           computeModelAncestry = ThriftServiceIface(self.ComputeModelAncestry, binaryService, pf, stats, responseClassifier),
[error]                                                    ^
[error] /root/modeldb/client/scala/libs/spark.ml/target/scala-2.11/src_managed/main/thrift/modeldb/ModelDBService.scala:369: too many arguments for method apply: (method: com.twitter.scrooge.ThriftMethod, thriftService: com.twitter.finagle.Service[com.twitter.finagle.thrift.ThriftClientRequest,Array[Byte]], pf: org.apache.thrift.protocol.TProtocolFactory, stats: com.twitter.finagle.stats.StatsReceiver)com.twitter.finagle.Service[method.Args,method.Result] in object ThriftServiceIface
[error]           extractPipeline = ThriftServiceIface(self.ExtractPipeline, binaryService, pf, stats, responseClassifier)
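This looks like a generator/runtime mismatch: the code generated by the newer scrooge-sbt-plugin passes a responseClassifier argument that the ThriftServiceIface.apply from the older finagle-thrift on the classpath does not accept. A sketch of keeping the runtime libraries on the same release train as the plugin; 18.4.0 is an assumption matching the plugin version mentioned above, and I have not verified these exact coordinates against this project:

// build.sbt (sketch) -- keep the scrooge runtime and finagle-thrift on the
// same release train as the scrooge-sbt-plugin that generates the code
libraryDependencies ++= Seq(
  "com.twitter" %% "scrooge-core"   % "18.4.0",
  "com.twitter" %% "finagle-thrift" % "18.4.0"
)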

Top Results From Across the Web

Hive on Spark CDH 5.7 - Failed to create spark client
We are getting the error while executing the hive queries with spark engine. Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.

Hive on Spark is not working - Failed to create spark client
I set the YARN Container memory and maximum to be greater than Spark Executor Memory + Overhead.

Building Spark - Spark 3.3.1 Documentation
Building Apache Spark. Apache Maven. Setting up Maven's Memory Usage; build/mvn. Building a Runnable Distribution; Specifying the Hadoop Version and Enabling ...

Troubleshooting issues with Apache Spark - IBM
Symptom: Spark worker daemon fails to create executors with the following error: ... For more information, see Configuring z/OS Spark client authentication.

error: Failed to create spark client. for hive on spark
Hi all, anyone met this error: HiveException(Failed to create spark client.) ... Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.
