
[QUESTION] Should the "hoodie.datasource.write.operation" property be filtered out when creating a table with Spark SQL?

See original GitHub issue

When I create a table with Spark SQL and set hoodie.datasource.write.operation=upsert, subsequent DELETE statements (see PR #5215), INSERT OVERWRITE statements, etc. still use the hoodie.datasource.write.operation from the table properties to upsert records, instead of performing the delete, insert_overwrite, etc.

For example: create a table with hoodie.datasource.write.operation set to upsert. When I then run a DELETE statement, the operation key set by the delete command is overwritten by hoodie.datasource.write.operation coming from the table properties or the environment, so OPERATION.key -> DataSourceWriteOptions.DELETE_OPERATION_OPT_VAL takes no effect and the operation becomes upsert:

// The delete command builds its options with OPERATION set to delete,
// but withSparkConf merges in catalogProperties, which can override it.
withSparkConf(sparkSession, hoodieCatalogTable.catalogProperties) {
  Map(
    "path" -> path,
    RECORDKEY_FIELD.key -> hoodieCatalogTable.primaryKeys.mkString(","),
    TBL_NAME.key -> tableConfig.getTableName,
    HIVE_STYLE_PARTITIONING.key -> tableConfig.getHiveStylePartitioningEnable,
    URL_ENCODE_PARTITIONING.key -> tableConfig.getUrlEncodePartitioning,
    KEYGENERATOR_CLASS_NAME.key -> classOf[SqlKeyGenerator].getCanonicalName,
    SqlKeyGenerator.ORIGIN_KEYGEN_CLASS_NAME -> tableConfig.getKeyGeneratorClassName,
    OPERATION.key -> DataSourceWriteOptions.DELETE_OPERATION_OPT_VAL, // overridden to upsert by table properties
    PARTITIONPATH_FIELD.key -> tableConfig.getPartitionFieldProp,
    HiveSyncConfig.HIVE_SYNC_MODE.key -> HiveSyncMode.HMS.name(),
    HiveSyncConfig.HIVE_SUPPORT_TIMESTAMP_TYPE.key -> "true",
    HoodieWriteConfig.DELETE_PARALLELISM_VALUE.key -> "200",
    SqlKeyGenerator.PARTITION_SCHEMA -> partitionSchema.toDDL
  )
}

So, when using SQL, how about not writing hoodie.datasource.write.operation into hoodie.properties at all: reject it during SQL validation, and let each command generate the operation itself at runtime.
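The override and the proposed fix can be sketched with plain Scala maps. This is an illustrative model of the config merge, not Hudi's actual withSparkConf implementation; the merge order and the helper names here are assumptions made for the sketch.

```scala
object OperationOverrideSketch {
  // The only real key here; everything else in this sketch is hypothetical.
  val OpKey = "hoodie.datasource.write.operation"

  // Models the reported bug: persisted table properties are merged last,
  // so they silently win over the operation the SQL command chose.
  def brokenMerge(command: Map[String, String], table: Map[String, String]): Map[String, String] =
    command ++ table

  // Models the proposed fix: strip the runtime-only operation key from the
  // persisted properties, so each command supplies its own operation.
  def fixedMerge(command: Map[String, String], table: Map[String, String]): Map[String, String] =
    (table - OpKey) ++ command

  def main(args: Array[String]): Unit = {
    val command = Map(OpKey -> "delete") // what a DELETE statement wants
    val table   = Map(OpKey -> "upsert") // persisted at CREATE TABLE time
    println(brokenMerge(command, table)(OpKey)) // upsert: the delete becomes an upsert
    println(fixedMerge(command, table)(OpKey))  // delete: the command wins
  }
}
```

With the filtered merge, the table-level default no longer clobbers the per-command operation, which matches the suggestion above of keeping the property out of hoodie.properties.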

Issue Analytics

  • State: closed
  • Created: a year ago
  • Comments: 6 (6 by maintainers)

Top GitHub Comments

2 reactions
dongkelun commented, Apr 9, 2022

Not only hoodie.datasource.write.operation, but also other properties such as hoodie.table.name have the same problem. I think we should find all the parameters with similar problems.

0 reactions
yihua commented, May 3, 2022

We’re going to track the fix in HUDI-4001. Closing this issue.
