Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

[SUPPORT] Exception in thread "main" java.lang.IllegalArgumentException: Can't find primaryKey `uuid` in root

See original GitHub issue

Describe the problem you faced

Using Hudi 0.10.0-rc3, creating an external COW table without primaryKey fails with

Can't find primaryKey `uuid` in root

Creating with a primaryKey does not fail.

I’m also wondering what this error means?

ERROR org.apache.hudi.common.config.DFSPropertiesConfiguration  - Error reading in properties from dfs

To Reproduce

Steps to reproduce the behavior:

  1. Create table
CREATE TABLE IF NOT EXISTS public.test_create (
    id bigint,
    name string,
    dt string
) USING hudi
LOCATION 's3a://<bucket>/<path>/test_create'
OPTIONS (
  type = 'cow'
);
  2. The error below is thrown (a variant of the DDL with an explicit primaryKey is sketched after these steps)
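
For reference, here is a minimal sketch of the same DDL with an explicit primaryKey, which is the workaround the thread converges on. Treat it as an illustration only: using `id` as the key column and submitting the statement through an existing SparkSession named `spark` (as in spark-shell) are assumptions, not details from the original report.

// Sketch only, not taken from the report: the table definition from step 1
// plus primaryKey = 'id'. Assumes a SparkSession named `spark` is in scope.
spark.sql(
  """
    |CREATE TABLE IF NOT EXISTS public.test_create (
    |    id bigint,
    |    name string,
    |    dt string
    |) USING hudi
    |LOCATION 's3a://<bucket>/<path>/test_create'
    |OPTIONS (
    |  type = 'cow',
    |  primaryKey = 'id'
    |)
  """.stripMargin)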

Expected behavior

I would expect the table to be correctly created.

Environment Description

  • Hudi version : 0.10.0-rc3

  • Spark version : 2.4.4

  • Hive version : 2.3.5

  • Hadoop version : 2.7.3

  • Storage (HDFS/S3/GCS…) : S3

  • Running on Docker? (yes/no) : No

Stacktrace

4575 [main] ERROR org.apache.hudi.common.config.DFSPropertiesConfiguration  - Error reading in properties from dfs
Exception in thread "main" java.lang.IllegalArgumentException: Can't find primaryKey `uuid` in root
 |-- _hoodie_commit_time: string (nullable = true)
 |-- _hoodie_commit_seqno: string (nullable = true)
 |-- _hoodie_record_key: string (nullable = true)
 |-- _hoodie_partition_path: string (nullable = true)
 |-- _hoodie_file_name: string (nullable = true)
 |-- id: long (nullable = true)
 |-- name: string (nullable = true)
 |-- dt: string (nullable = true)
.
	at org.apache.hudi.common.util.ValidationUtils.checkArgument(ValidationUtils.java:40)
	at org.apache.spark.sql.hudi.HoodieOptionConfig$$anonfun$validateTable$1.apply(HoodieOptionConfig.scala:201)
	at org.apache.spark.sql.hudi.HoodieOptionConfig$$anonfun$validateTable$1.apply(HoodieOptionConfig.scala:200)
	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
	at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
	at org.apache.spark.sql.hudi.HoodieOptionConfig$.validateTable(HoodieOptionConfig.scala:200)
	at org.apache.spark.sql.catalyst.catalog.HoodieCatalogTable.parseSchemaAndConfigs(HoodieCatalogTable.scala:213)
	at org.apache.spark.sql.catalyst.catalog.HoodieCatalogTable.initHoodieTable(HoodieCatalogTable.scala:156)
	at org.apache.spark.sql.hudi.command.CreateHoodieTableCommand.run(CreateHoodieTableCommand.scala:67)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
	at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:194)
	at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:194)
	at org.apache.spark.sql.Dataset$$anonfun$52.apply(Dataset.scala:3370)
	at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
	at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3369)
	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:194)
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:79)
	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
	at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:694)
	at com.twilio.optimustranformer.OptimusTranformer$$anonfun$main$1$$anonfun$apply$mcV$sp$1.apply(OptimusTranformer.scala:74)
	at com.twilio.optimustranformer.OptimusTranformer$$anonfun$main$1$$anonfun$apply$mcV$sp$1.apply(OptimusTranformer.scala:72)
	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
	at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
	at com.twilio.optimustranformer.OptimusTranformer$$anonfun$main$1.apply$mcV$sp(OptimusTranformer.scala:71)
	at scala.util.control.Breaks.breakable(Breaks.scala:38)
	at com.twilio.optimustranformer.OptimusTranformer$.main(OptimusTranformer.scala:70)
	at com.twilio.optimustranformer.OptimusTranformer.main(OptimusTranformer.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
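
For context on where `uuid` comes from: the frames at HoodieOptionConfig.scala:200-201 correspond to a validation step that checks every field named by the primaryKey table option against the table schema, and when no primaryKey is supplied the option appears to fall back to a default of uuid, which is not a column of the schema printed above. The following is a rough, simplified reconstruction of that check for illustration only, not the actual Hudi source:

import org.apache.spark.sql.types.StructType

// Simplified sketch of the validation: with no explicit primaryKey the key
// list falls back to "uuid", and every named key must exist in the table
// schema, otherwise an IllegalArgumentException like the one above is thrown.
def validatePrimaryKeys(primaryKeyOption: Option[String], schema: StructType): Unit = {
  val keys = primaryKeyOption.getOrElse("uuid").split(",").map(_.trim)
  keys.foreach { pk =>
    if (!schema.fieldNames.contains(pk)) {
      throw new IllegalArgumentException(
        s"Can't find primaryKey `$pk` in ${schema.treeString}.")
    }
  }
}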

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 7 (5 by maintainers)

Top GitHub Comments

1 reaction
BenjMaq commented, Dec 15, 2021

Thank you all. Passing a primaryKey is the way to go, then. Good to close on my side.

0 reactions
yanghua commented, Dec 16, 2021

> Thank you all. Passing a primaryKey is the way to go, then. Good to close on my side.

Thanks, closing now.

Read more comments on GitHub.

