
java.lang.UnsupportedOperationException: newFileChannel

See original GitHub issue

It works great under Java 1.8, but when run under Java 11 (several releases tried) it throws the following:

io.github.classgraph.ClassGraphException: Uncaught exception during scan
  at io.github.classgraph.ClassGraph.scan(ClassGraph.java:1558)
  at io.github.classgraph.ClassGraph.scan(ClassGraph.java:1575)
  at io.github.classgraph.ClassGraph.scan(ClassGraph.java:1588)
  at za.co.absa.spline.harvester.plugin.registry.AutoDiscoveryPluginRegistry$.$anonfun$PluginClasses$2(AutoDiscoveryPluginRegistry.scala:78)
  at za.co.absa.commons.lang.ARM$.using(ARM.scala:30)
  at za.co.absa.commons.lang.ARM$ResourceWrapper.flatMap(ARM.scala:43)
  at za.co.absa.spline.harvester.plugin.registry.AutoDiscoveryPluginRegistry$.<init>(AutoDiscoveryPluginRegistry.scala:78)
  at za.co.absa.spline.harvester.plugin.registry.AutoDiscoveryPluginRegistry$.<clinit>(AutoDiscoveryPluginRegistry.scala)
  at za.co.absa.spline.harvester.plugin.registry.AutoDiscoveryPluginRegistry.<init>(AutoDiscoveryPluginRegistry.scala:48)
  at za.co.absa.spline.harvester.LineageHarvesterFactory.<init>(LineageHarvesterFactory.scala:40)
  at za.co.absa.spline.harvester.conf.DefaultSplineConfigurer.harvesterFactory(DefaultSplineConfigurer.scala:140)
  at za.co.absa.spline.harvester.conf.DefaultSplineConfigurer.queryExecutionEventHandler(DefaultSplineConfigurer.scala:104)
  at za.co.absa.spline.harvester.QueryExecutionEventHandlerFactory.initEventHandler(QueryExecutionEventHandlerFactory.scala:64)
  at za.co.absa.spline.harvester.QueryExecutionEventHandlerFactory.$anonfun$createEventHandler$6(QueryExecutionEventHandlerFactory.scala:43)
  at za.co.absa.spline.harvester.QueryExecutionEventHandlerFactory.withErrorHandling(QueryExecutionEventHandlerFactory.scala:55)
  at za.co.absa.spline.harvester.QueryExecutionEventHandlerFactory.createEventHandler(QueryExecutionEventHandlerFactory.scala:42)
  at za.co.absa.spline.harvester.listener.SplineQueryExecutionListener$.za$co$absa$spline$harvester$listener$SplineQueryExecutionListener$$constructEventHandler(SplineQueryExecutionListener.scala:67)
  at za.co.absa.spline.harvester.listener.SplineQueryExecutionListener.<init>(SplineQueryExecutionListener.scala:37)
  at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
  at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
  at org.apache.spark.util.Utils$.$anonfun$loadExtensions$1(Utils.scala:2788)
  at scala.collection.TraversableLike.$anonfun$flatMap$1(TraversableLike.scala:245)
  at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
  at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
  at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
  at scala.collection.TraversableLike.flatMap(TraversableLike.scala:245)
  at scala.collection.TraversableLike.flatMap$(TraversableLike.scala:242)
  at scala.collection.AbstractTraversable.flatMap(Traversable.scala:108)
  at org.apache.spark.util.Utils$.loadExtensions(Utils.scala:2777)
  at org.apache.spark.sql.util.ExecutionListenerManager.$anonfun$new$2(QueryExecutionListener.scala:88)
  at org.apache.spark.sql.util.ExecutionListenerManager.$anonfun$new$2$adapted(QueryExecutionListener.scala:87)
  at scala.Option.foreach(Option.scala:407)
  at org.apache.spark.sql.util.ExecutionListenerManager.<init>(QueryExecutionListener.scala:87)
  at org.apache.spark.sql.internal.BaseSessionStateBuilder.$anonfun$listenerManager$2(BaseSessionStateBuilder.scala:319)
  at scala.Option.getOrElse(Option.scala:189)
  at org.apache.spark.sql.internal.BaseSessionStateBuilder.listenerManager(BaseSessionStateBuilder.scala:319)
  at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:346)
  at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1145)
  at org.apache.spark.sql.SparkSession.$anonfun$sessionState$2(SparkSession.scala:159)
  at scala.Option.getOrElse(Option.scala:189)
  at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:155)
  at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:152)
  at org.apache.spark.sql.SparkSession.$anonfun$new$3(SparkSession.scala:112)
  at scala.Option.map(Option.scala:230)
  at org.apache.spark.sql.SparkSession.$anonfun$new$1(SparkSession.scala:112)
  at org.apache.spark.sql.internal.SQLConf$.get(SQLConf.scala:196)
  at org.apache.spark.sql.types.DataType.sameType(DataType.scala:97)
  at org.apache.spark.sql.catalyst.analysis.TypeCoercion$.$anonfun$haveSameType$1(TypeCoercion.scala:291)
  at org.apache.spark.sql.catalyst.analysis.TypeCoercion$.$anonfun$haveSameType$1$adapted(TypeCoercion.scala:291)
  at scala.collection.LinearSeqOptimized.forall(LinearSeqOptimized.scala:85)
  at scala.collection.LinearSeqOptimized.forall$(LinearSeqOptimized.scala:82)
  at scala.collection.immutable.List.forall(List.scala:89)
  at org.apache.spark.sql.catalyst.analysis.TypeCoercion$.haveSameType(TypeCoercion.scala:291)
  at org.apache.spark.sql.catalyst.expressions.ComplexTypeMergingExpression.dataTypeCheck(Expression.scala:1057)
  at org.apache.spark.sql.catalyst.expressions.ComplexTypeMergingExpression.dataTypeCheck$(Expression.scala:1052)
  at org.apache.spark.sql.catalyst.expressions.If.dataTypeCheck(conditionalExpressions.scala:36)
  at org.apache.spark.sql.catalyst.expressions.ComplexTypeMergingExpression.org$apache$spark$sql$catalyst$expressions$ComplexTypeMergingExpression$$internalDataType(Expression.scala:1063)
  at org.apache.spark.sql.catalyst.expressions.ComplexTypeMergingExpression.org$apache$spark$sql$catalyst$expressions$ComplexTypeMergingExpression$$internalDataType$(Expression.scala:1062)
  at org.apache.spark.sql.catalyst.expressions.If.org$apache$spark$sql$catalyst$expressions$ComplexTypeMergingExpression$$internalDataType$lzycompute(conditionalExpressions.scala:36)
  at org.apache.spark.sql.catalyst.expressions.If.org$apache$spark$sql$catalyst$expressions$ComplexTypeMergingExpression$$internalDataType(conditionalExpressions.scala:36)
  at org.apache.spark.sql.catalyst.expressions.ComplexTypeMergingExpression.dataType(Expression.scala:1067)
  at org.apache.spark.sql.catalyst.expressions.ComplexTypeMergingExpression.dataType$(Expression.scala:1067)
  at org.apache.spark.sql.catalyst.expressions.If.dataType(conditionalExpressions.scala:36)
  at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.isSerializedAsStruct(ExpressionEncoder.scala:309)
  at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.isSerializedAsStructForTopLevel(ExpressionEncoder.scala:319)
  at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.<init>(ExpressionEncoder.scala:248)
  at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder$.apply(ExpressionEncoder.scala:61)
  at org.apache.spark.sql.Encoders$.product(Encoders.scala:285)
  at org.apache.spark.sql.LowPrioritySQLImplicits.newProductEncoder(SQLImplicits.scala:251)
  at org.apache.spark.sql.LowPrioritySQLImplicits.newProductEncoder$(SQLImplicits.scala:251)
  at org.apache.spark.sql.SQLImplicits.newProductEncoder(SQLImplicits.scala:32)
  ... 47 elided
Caused by: java.lang.UnsupportedOperationException: newFileChannel
  at java.base/jdk.internal.jrtfs.JrtFileSystem.newFileChannel(JrtFileSystem.java:338)
  at java.base/jdk.internal.jrtfs.JrtPath.newFileChannel(JrtPath.java:702)
  at java.base/jdk.internal.jrtfs.JrtFileSystemProvider.newFileChannel(JrtFileSystemProvider.java:316)
  at java.base/java.nio.channels.FileChannel.open(FileChannel.java:292)
  at java.base/java.nio.channels.FileChannel.open(FileChannel.java:345)
  at nonapi.io.github.classgraph.fileslice.PathSlice.<init>(PathSlice.java:118)
  at nonapi.io.github.classgraph.fileslice.PathSlice.<init>(PathSlice.java:140)
  at io.github.classgraph.ClasspathElementPathDir$1.openClassfile(ClasspathElementPathDir.java:253)
  at io.github.classgraph.Classfile.<init>(Classfile.java:1925)
  at io.github.classgraph.Scanner$ClassfileScannerWorkUnitProcessor.processWorkUnit(Scanner.java:734)
  at io.github.classgraph.Scanner$ClassfileScannerWorkUnitProcessor.processWorkUnit(Scanner.java:657)
  at nonapi.io.github.classgraph.concurrency.WorkQueue.runWorkLoop(WorkQueue.java:246)
  at nonapi.io.github.classgraph.concurrency.WorkQueue.runWorkQueue(WorkQueue.java:161)
  at io.github.classgraph.Scanner.processWorkUnits(Scanner.java:342)
  at io.github.classgraph.Scanner.performScan(Scanner.java:970)
  at io.github.classgraph.Scanner.openClasspathElementsThenScan(Scanner.java:1112)
  at io.github.classgraph.Scanner.call(Scanner.java:1146)
  at io.github.classgraph.Scanner.call(Scanner.java:83)
  at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
  at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
  at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
  at java.base/java.lang.Thread.run(Thread.java:829)
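The root cause is visible in the Caused by section: ClassGraph called FileChannel.open() on a path inside the jrt: filesystem (the JDK 9+ runtime image), and JrtFileSystemProvider does not implement newFileChannel, so the call throws UnsupportedOperationException. A minimal sketch of the failure mode, assuming a JDK 9+ runtime (this only demonstrates why the call fails; it is not the actual ClassGraph fix):

```java
import java.net.URI;
import java.nio.channels.FileChannel;
import java.nio.channels.SeekableByteChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class JrtRepro {
    public static void main(String[] args) throws Exception {
        // Any class file inside the runtime image lives under the jrt: scheme.
        Path p = Paths.get(URI.create("jrt:/modules/java.base/java/lang/Object.class"));
        try {
            // This is the path ClassGraph's PathSlice took, per the stack trace:
            // FileChannel.open -> provider.newFileChannel -> UnsupportedOperationException.
            FileChannel.open(p, StandardOpenOption.READ);
            System.out.println("FileChannel.open unexpectedly succeeded");
        } catch (UnsupportedOperationException e) {
            System.out.println("FileChannel.open failed: " + e.getMessage());
        }
        // The jrt: provider does support ordinary byte channels, so the class
        // file bytes are still readable without a FileChannel.
        try (SeekableByteChannel ch = Files.newByteChannel(p, StandardOpenOption.READ)) {
            System.out.println("newByteChannel size = " + ch.size());
        }
    }
}
```

Note that a FileChannel on a jrt: path only fails under JDK 9+; on JDK 1.8 the jrt: scheme does not exist and ClassGraph takes a different code path, which matches the reporter seeing the problem only on Java 11.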

Issue Analytics

  • State: closed
  • Created 2 years ago
  • Comments: 9 (5 by maintainers)

Top GitHub Comments

1 reaction
wajda commented, Sep 9, 2021

Tested on the Scala REPL (2.12) with several different JDK 1.8 builds - both v115 and v116 work. On the Scala REPL with JDK 11, v116 works like a charm!

Confirmed: the issue is fixed. Thank you a million, Luke!
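Per the comment above, the fix shipped in ClassGraph v116. Assuming that refers to version 4.8.116 of the io.github.classgraph:classgraph artifact (an assumption worth verifying against the ClassGraph release notes), the transitive dependency could be pinned in a Maven build like so:

```xml
<!-- Assumed coordinates/version: pin ClassGraph to the fixed release so the
     transitive version pulled in by the Spark/Spline stack does not resurface
     the jrt: newFileChannel bug on JDK 11. -->
<dependency>
  <groupId>io.github.classgraph</groupId>
  <artifactId>classgraph</artifactId>
  <version>4.8.116</version>
</dependency>
```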

1 reaction
wajda commented, Sep 4, 2021

Thanks Luke, I’ll try to do it over the weekend. For now I can say that I run it from inside the Apache Spark shell (we are developing a Spark driver extension), so perhaps that will give you a hint.

Read more comments on GitHub >

Top Results From Across the Web

UnsupportedOperationException with MultiPart file upload ...
When using an alternative FileSystem with FilePart.transferTo I am seeing an UnsupportedOperationException thrown during multipart file upload.

[JENKINS-64922] java.lang.UnsupportedOperationException ...
We recently upgraded to Jenkins 2.263.4 (previous version 2.235.1); now we have an issue opening files from the workspace.

UnsupportedOperationException (Java Platform SE 8)
Constructs a new exception with the specified cause and a detail message of (cause==null ? null : cause.toString()) (which typically contains the class...

google-cloud-nio: CloudStorageFileSystemProvider throws ...
Because CloudStorageFileSystemProvider does not override the newFileChannel method, it throws an UnsupportedOperationException when used for ...

FileSystemProvider - People @EECS
java.nio.file.spi ... The newFileChannel and AsynchronousFileChannel methods are defined to open or create files ... Methods inherited from class java.lang.
