[SUPPORT] Hudi 0.11.0 HoodieDeltaStreamer failing to start with error: java.lang.NoSuchFieldError: DROP_PARTITION_COLUMNS
Describe the problem you faced
I pulled the 0.11.0 release branch and am trying to build and run Hudi. Previously I was running 0.10.1 with Spark 3.1.3 without any issues.
With 0.11.0, I am hitting the error shown in the stack trace below.
To Reproduce
Steps to reproduce the behavior:
- git pull origin release-0.11.0
- mvn clean install -DskipTests -Dspark3.2 -Dscala-2.12
- ./spark-submit --jars packaging/hudi-spark-bundle/target/hudi-spark3.2-bundle_2.12-0.11.0.jar --class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer hudi/packaging/hudi-utilities-slim-bundle/target/hudi-utilities-slim-bundle_2.12-0.11.0.jar --props file://hudi/properties/kafka.properties --schemaprovider-class org.apache.hudi.utilities.schema.FilebasedSchemaProvider --source-class org.apache.hudi.utilities.sources.JsonKafkaSource --target-base-path gs://xxx/ --target-table hudi.xxx --op INSERT --table-type COPY_ON_WRITE --source-ordering-field time --continuous --transformer-class org.apache.hudi.utilities.transform.AddDateHourColumnTransformer --source-limit 150
Expected behavior
HoodieDeltaStreamer starts, runs continuously, and consumes and writes data.
Environment Description
- Hudi version : 0.11.0
- Spark version : 3.2.1
- Hive version :
- Hadoop version : 3.3
- Storage (HDFS/S3/GCS…) : GCS
- Running on Docker? (yes/no) : no
Additional context
Built the Hudi project using Maven and am trying to run the jar. The line val DROP_PARTITION_COLUMNS: ConfigProperty[Boolean] = HoodieTableConfig.DROP_PARTITION_COLUMNS in DataSourceOptions.scala is failing to resolve the static field on HoodieTableConfig.java.
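One way to check what the classloader actually sees is to probe for the field reflectively. The sketch below is illustrative (FieldProbe is a hypothetical helper, not Hudi code): for the real diagnostic, call hasField with org.apache.hudi.common.table.HoodieTableConfig and DROP_PARTITION_COLUMNS on the exact classpath that spark-submit uses; here it probes a JDK class so it runs standalone.

```java
// Sketch (hypothetical helper, not Hudi code): reflectively check
// whether the class resolved on the current classpath declares a
// given public field. If the real check returns false for
// HoodieTableConfig / DROP_PARTITION_COLUMNS, an old class file is
// being picked up at runtime, which is exactly what surfaces as
// NoSuchFieldError during class initialization.
public class FieldProbe {
    static boolean hasField(String className, String fieldName) {
        try {
            Class.forName(className).getField(fieldName); // public fields only
            return true;
        } catch (ClassNotFoundException | NoSuchFieldException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // JDK stand-ins so the sketch is runnable as-is:
        System.out.println(hasField("java.lang.Integer", "MAX_VALUE"));     // true
        System.out.println(hasField("java.lang.Integer", "NO_SUCH_FIELD")); // false
    }
}
```

Note the difference in failure mode: reflection reports a recoverable NoSuchFieldException, while a direct field reference compiled against a newer class fails at link time with the unchecked NoSuchFieldError seen in the stack trace.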
Stacktrace
Exception in thread "main" java.lang.NoSuchFieldError: DROP_PARTITION_COLUMNS
at org.apache.hudi.DataSourceWriteOptions$.<init>(DataSourceOptions.scala:488)
at org.apache.hudi.DataSourceWriteOptions$.<clinit>(DataSourceOptions.scala)
at org.apache.hudi.DataSourceWriteOptions.RECONCILE_SCHEMA(DataSourceOptions.scala)
at org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.combineProperties(HoodieDeltaStreamer.java:160)
at org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.<init>(HoodieDeltaStreamer.java:130)
at org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.<init>(HoodieDeltaStreamer.java:115)
at org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.main(HoodieDeltaStreamer.java:549)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
22/05/04 10:21:34 INFO ShutdownHookManager: Shutdown hook called
Issue Analytics
- State:
- Created: a year ago
- Comments: 5 (1 by maintainers)
Top GitHub Comments
@alexeykudinkin Apologies for the trouble. I had accidentally placed hudi-hadoop-mr-bundle-0.10.1.jar on my Spark classpath, so HoodieTableConfig was being picked up from that jar instead of the freshly compiled class. I am closing this issue. Thanks for your comments.
I am trying to run 0.11.0 with the same config used for 0.10.1. Previously we ran hudi-utilities-bundle, but that has now changed to hudi-utilities-slim-bundle together with hudi-sparkx.y-bundle. My feeling is that the recompilation or library versions got mixed up somewhere, but it is proving hard to pin down.
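Given the resolution above (a stale hudi-hadoop-mr-bundle-0.10.1.jar shadowing the freshly built classes), a general diagnostic is to print the location a class was actually loaded from. ClassOrigin below is a hypothetical helper, not Hudi code; in spark-shell you would pass org.apache.hudi.common.table.HoodieTableConfig instead of the JDK classes used here for a standalone demo.

```java
// Sketch (hypothetical helper, not Hudi code): report the jar or
// directory a class was loaded from, to spot a stale bundle shadowing
// a freshly built one on the Spark classpath.
public class ClassOrigin {
    static String originOf(Class<?> cls) {
        java.security.CodeSource src = cls.getProtectionDomain().getCodeSource();
        // JDK bootstrap classes carry no CodeSource, so guard for null
        return src == null ? "<bootstrap classloader>" : src.getLocation().toString();
    }

    public static void main(String[] args) {
        System.out.println(originOf(String.class));      // <bootstrap classloader>
        System.out.println(originOf(ClassOrigin.class)); // where this class was loaded from
    }
}
```

If the printed location turns out to be an unexpected jar (here, the leftover 0.10.1 bundle), removing that jar from the Spark classpath resolves the NoSuchFieldError.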