S3FileIO fails with java.lang.NoSuchMethodError: 'void software.amazon.awssdk.utils.IoUtils.closeQuietly(java.lang.AutoCloseable, software.amazon.awssdk.thirdparty.org.slf4j.Logger)'
See original GitHub issue

Apache Iceberg version
0.14.0 (latest release)
Query engine
Spark
Please describe the bug 🐞
Installed these packages as per the documentation:
"software.amazon.awssdk:bundle:2.17.131",
"software.amazon.awssdk:url-connection-client:2.17.131",
When using S3FileIO:
sc.set("spark.sql.extensions", "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
sc.set("spark.sql.catalog.iceberg", "org.apache.iceberg.spark.SparkCatalog")
...
sc.set("spark.sql.catalog.iceberg.io-impl", "org.apache.iceberg.aws.s3.S3FileIO")
I get the exception:
java.lang.NoSuchMethodError: 'void software.amazon.awssdk.utils.IoUtils.closeQuietly(java.lang.AutoCloseable, software.amazon.awssdk.thirdparty.org.slf4j.Logger)'
at software.amazon.awssdk.core.util.SdkUserAgent.kotlinVersion(SdkUserAgent.java:173)
at software.amazon.awssdk.core.util.SdkUserAgent.getAdditionalJvmLanguages(SdkUserAgent.java:123)
at software.amazon.awssdk.core.util.SdkUserAgent.getUserAgent(SdkUserAgent.java:98)
at software.amazon.awssdk.core.util.SdkUserAgent.initializeUserAgent(SdkUserAgent.java:81)
at software.amazon.awssdk.core.util.SdkUserAgent.<init>(SdkUserAgent.java:51)
at software.amazon.awssdk.core.util.SdkUserAgent.create(SdkUserAgent.java:58)
at software.amazon.awssdk.core.client.builder.SdkDefaultClientBuilder.lambda$mergeGlobalDefaults$0(SdkDefaultClientBuilder.java:211)
at software.amazon.awssdk.utils.builder.SdkBuilder.applyMutation(SdkBuilder.java:61)
at software.amazon.awssdk.core.client.config.SdkClientConfiguration.merge(SdkClientConfiguration.java:66)
at software.amazon.awssdk.core.client.builder.SdkDefaultClientBuilder.mergeGlobalDefaults(SdkDefaultClientBuilder.java:207)
at software.amazon.awssdk.core.client.builder.SdkDefaultClientBuilder.syncClientConfiguration(SdkDefaultClientBuilder.java:158)
at software.amazon.awssdk.services.s3.DefaultS3ClientBuilder.buildClient(DefaultS3ClientBuilder.java:27)
at software.amazon.awssdk.services.s3.DefaultS3ClientBuilder.buildClient(DefaultS3ClientBuilder.java:22)
at software.amazon.awssdk.core.client.builder.SdkDefaultClientBuilder.build(SdkDefaultClientBuilder.java:133)
at org.apache.iceberg.aws.AwsClientFactories$DefaultAwsClientFactory.s3(AwsClientFactories.java:106)
at org.apache.iceberg.aws.s3.S3FileIO.client(S3FileIO.java:290)
at org.apache.iceberg.aws.s3.S3FileIO.newOutputFile(S3FileIO.java:129)
at org.apache.iceberg.io.OutputFileFactory.newOutputFile(OutputFileFactory.java:105)
at org.apache.iceberg.io.RollingFileWriter.newFile(RollingFileWriter.java:111)
at org.apache.iceberg.io.RollingFileWriter.openCurrentWriter(RollingFileWriter.java:102)
at org.apache.iceberg.io.RollingDataWriter.<init>(RollingDataWriter.java:44)
at org.apache.iceberg.io.ClusteredDataWriter.newWriter(ClusteredDataWriter.java:51)
at org.apache.iceberg.io.ClusteredWriter.write(ClusteredWriter.java:87)
at org.apache.iceberg.io.ClusteredDataWriter.write(ClusteredDataWriter.java:32)
at org.apache.iceberg.spark.source.SparkWrite$PartitionedDataWriter.write(SparkWrite.java:713)
at org.apache.iceberg.spark.source.SparkWrite$PartitionedDataWriter.write(SparkWrite.java:689)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.$anonfun$run$1(WriteToDataSourceV2Exec.scala:442)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1538)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.run(WriteToDataSourceV2Exec.scala:480)
at org.apache.spark.sql.execution.datasources.v2.V2TableWriteExec.$anonfun$writeWithV2$2(WriteToDataSourceV2Exec.scala:381)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:136)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1504)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
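A NoSuchMethodError like the one above usually means two copies of a class are on the classpath (here, an unshaded and a bundle-shaded variant of software.amazon.awssdk.utils.IoUtils) and the wrong one was loaded first. One way to check which jar a class was actually loaded from is to inspect its code source — a minimal sketch (FindJar is a hypothetical helper; on the failing cluster you would query the class name from the trace):

```java
import java.security.CodeSource;

public class FindJar {
    /** Returns the jar/directory a class was loaded from, or a marker string. */
    static String locationOf(String className) {
        try {
            Class<?> c = Class.forName(className);
            CodeSource src = c.getProtectionDomain().getCodeSource();
            // Classes loaded by the bootstrap loader (JDK classes) have no code source
            return src == null ? "bootstrap classloader (JDK class)" : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "not on classpath";
        }
    }

    public static void main(String[] args) {
        // In the failing Spark job you would query the class from the trace:
        //   locationOf("software.amazon.awssdk.utils.IoUtils")
        // Here we demonstrate with a JDK class so the snippet runs anywhere.
        System.out.println(locationOf("java.util.ArrayList"));
    }
}
```

Running this on each executor (or via spark-shell) shows whether IoUtils resolves to the bundle jar or to a standalone utils jar.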
Issue Analytics
- Created: a year ago
- Reactions: 1
- Comments: 5 (1 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Looking at it a bit further, I see that url-connection-client depends on the utils package, which has un-shaded function signatures. Depending on class load order, can this overrule the shaded functions found in the bundle package, thereby causing no-such-method issues? I would advise against using bundle at all and just importing the individual AWS packages.

Even with the latest 2.17.257 I see the issue. It doesn't happen always, so it's hard to tell how to reliably reproduce it, but I saw it especially in long-running jobs (say >10 hr-ish).
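Following the advice to drop the bundle, a build could pull in the individual SDK modules instead — a hedged sketch in Gradle syntax (the module list is an assumption; the exact modules S3FileIO needs may vary with the Iceberg features in use):

```groovy
dependencies {
    // Individual AWS SDK v2 modules instead of software.amazon.awssdk:bundle,
    // so only one copy of the utils classes ends up on the classpath
    implementation "software.amazon.awssdk:s3:2.17.257"
    implementation "software.amazon.awssdk:url-connection-client:2.17.257"
    implementation "software.amazon.awssdk:sts:2.17.257"   // assumed: needed for assume-role credentials
}
```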