
No Encoder found for Double


Hi,

I’ve modified the example in the Quick Start Guide to use the following main function:

@file:JvmName("SimpleApp")
import org.jetbrains.spark.api.*

data class LonLat(val lon: Double, val lat: Double)

fun main() {
    withSpark {
        spark.dsOf(LonLat(1.0, 2.0), LonLat(3.0, 4.0)).show()
    }
}

When I execute this program with:

spark-3.0.0-bin-hadoop2.7/bin/spark-submit --class "SimpleApp" --master local src/IdeaProjects/kotlin-spark-example/build/libs/kotlin-spark-example-1.0-SNAPSHOT-all.jar

I get the following error:

Exception in thread "main" java.lang.UnsupportedOperationException: No Encoder found for Double
- field (class: "scala.Double", name: "lat")
- root class: "LonLat"
	at org.apache.spark.sql.KotlinReflection$.$anonfun$deserializerFor$1(KotlinReflection.scala:426)
	at scala.reflect.internal.tpe.TypeConstraints$UndoLog.undo(TypeConstraints.scala:69)
	at org.apache.spark.sql.KotlinReflection.cleanUpReflectionObjects(KotlinReflection.scala:864)
	at org.apache.spark.sql.KotlinReflection.cleanUpReflectionObjects$(KotlinReflection.scala:863)
	at org.apache.spark.sql.KotlinReflection$.cleanUpReflectionObjects(KotlinReflection.scala:47)
	at org.apache.spark.sql.KotlinReflection$.deserializerFor(KotlinReflection.scala:202)
	at org.apache.spark.sql.KotlinReflection$.$anonfun$deserializerFor$7(KotlinReflection.scala:351)
	at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
	at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
	at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
	at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
	at scala.collection.TraversableLike.map(TraversableLike.scala:238)
	at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
	at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
	at org.apache.spark.sql.KotlinReflection$.$anonfun$deserializerFor$1(KotlinReflection.scala:340)
	at scala.reflect.internal.tpe.TypeConstraints$UndoLog.undo(TypeConstraints.scala:69)
	at org.apache.spark.sql.KotlinReflection.cleanUpReflectionObjects(KotlinReflection.scala:864)
	at org.apache.spark.sql.KotlinReflection.cleanUpReflectionObjects$(KotlinReflection.scala:863)
	at org.apache.spark.sql.KotlinReflection$.cleanUpReflectionObjects(KotlinReflection.scala:47)
	at org.apache.spark.sql.KotlinReflection$.deserializerFor(KotlinReflection.scala:202)
	at org.apache.spark.sql.KotlinReflection$.$anonfun$deserializerFor$20(KotlinReflection.scala:470)
	at org.apache.spark.sql.catalyst.DeserializerBuildHelper$.deserializerForWithNullSafetyAndUpcast(DeserializerBuildHelper.scala:54)
	at org.apache.spark.sql.KotlinReflection$.deserializerFor(KotlinReflection.scala:470)
	at org.apache.spark.sql.KotlinReflection.deserializerFor(KotlinReflection.scala)
	at org.jetbrains.spark.api.ApiV1Kt.kotlinClassEncoder(ApiV1.kt:103)
	at org.jetbrains.spark.api.ApiV1Kt.generateEncoder(ApiV1.kt:91)
	at SimpleApp.main(SimpleApp.kt:47)
	at SimpleApp.main(SimpleApp.kt)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
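The error names `scala.Double` because a non-nullable Kotlin `Double` compiles to the JVM primitive `double`, which the library's reflection (`KotlinReflection.deserializerFor`) maps to Scala's primitive type, and the version in use here had no encoder branch for it. A minimal, Spark-free sketch of the field-level view (plain Java reflection, no Spark required):

```kotlin
// Sketch: inspect the JVM type of a data-class field to see why the
// encoder error reports a primitive "scala.Double" for Kotlin's Double.
data class LonLat(val lon: Double, val lat: Double)

fun main() {
    // Non-nullable Kotlin Double compiles to the JVM primitive `double`;
    // the Kotlin Spark API's reflection saw that primitive and, at the
    // time of this report, could not build an encoder for it.
    val fieldType = LonLat::class.java.getDeclaredField("lat").type
    println(fieldType)             // double
    println(fieldType.isPrimitive) // true
}
```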

If I replace the Doubles in the data class with Ints, then the program executes correctly, printing:

+---+---+
|lat|lon|
+---+---+
|  2|  1|
|  4|  3|
+---+---+
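One workaround sometimes suggested for "No Encoder found" errors is to declare the fields as nullable, which boxes them to `java.lang.Double` on the JVM. Whether boxing avoided this particular bug is an assumption (the actual fix landed in the library itself); the sketch below only illustrates the primitive-vs-boxed distinction:

```kotlin
// Illustration only: nullable Kotlin types compile to boxed JVM classes.
// (That boxing worked around this specific encoder bug is an assumption,
// not confirmed by the thread; the real fix was a library change.)
data class LonLatBoxed(val lon: Double?, val lat: Double?)

fun main() {
    val fieldType = LonLatBoxed::class.java.getDeclaredField("lat").type
    println(fieldType)             // class java.lang.Double
    println(fieldType.isPrimitive) // false
}
```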

My build environment is identical to the one at https://github.com/MKhalusova/kotlin-spark-example, and I’m using Spark 3.0.0 with Java 1.8.0_252 (AdoptOpenJDK) on macOS 10.15.6.

Thank you,

Todd

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 5 (1 by maintainers)

Top GitHub Comments

1 reaction
ToddSmall commented, Aug 6, 2020

@asm0dey Yes, my issue is fixed. Thanks again!

Todd

0 reactions
asm0dey commented, Aug 5, 2020

@ToddSmall Thank you for the report! Could you test please?
