[SUPPORT] Flink Hudi Quickstart - SqlClientException
Describe the problem you faced
Following the Flink quickstart guide (https://hudi.apache.org/docs/flink-quick-start-guide.html), I receive the error shown in the stack trace below when I reach the "Insert data" step. If anyone could give me a quick demo of how this works via Zoom or some other platform, that would be very helpful.
To Reproduce
Steps to reproduce the behavior:
- cd to the base directory of Flink
- Start Flink 1.12.2 cluster:
./bin/start-cluster.sh
- Download hudi-flink-bundle_2.11-0.8.0.jar
- Start the Flink SQL Client:
./bin/sql-client.sh embedded -j <path>/hudi-flink-bundle_2.11-0.8.0.jar shell
- Set the execution result mode:
set execution.result-mode=tableau;
- create the table:
CREATE TABLE t1(
uuid VARCHAR(20), -- you can use 'PRIMARY KEY NOT ENFORCED' syntax to mark the field as record key
name VARCHAR(10),
age INT,
ts TIMESTAMP(3),
`partition` VARCHAR(20)
)
PARTITIONED BY (`partition`)
WITH (
'connector' = 'hudi',
'path' = 's3://<bucket_name>/<folder>/',
'write.tasks' = '1', -- default is 4; more write tasks require more resources
'compaction.tasks' = '1', -- default is 10; more compaction tasks require more resources
'table.type' = 'MERGE_ON_READ' -- creates a MERGE_ON_READ table; the default is COPY_ON_WRITE
);
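As the comment on the uuid column notes, the record key can also be declared with Flink SQL's PRIMARY KEY ... NOT ENFORCED constraint instead of relying on the default key field. A sketch of the same DDL using that syntax (the s3 path placeholder is unchanged from above):

```sql
CREATE TABLE t1(
  uuid VARCHAR(20) PRIMARY KEY NOT ENFORCED,  -- marks uuid as the record key
  name VARCHAR(10),
  age INT,
  ts TIMESTAMP(3),
  `partition` VARCHAR(20)
)
PARTITIONED BY (`partition`)
WITH (
  'connector' = 'hudi',
  'path' = 's3://<bucket_name>/<folder>/',
  'write.tasks' = '1',
  'compaction.tasks' = '1',
  'table.type' = 'MERGE_ON_READ'
);
```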
- Insert data into the table:
INSERT INTO t1 VALUES
('id1','Danny',23,TIMESTAMP '1970-01-01 00:00:01','par1');
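When the insert succeeds, the quickstart verifies the write with a query in the same SQL client session; with the tableau result mode set earlier, the row should print directly in the shell:

```sql
-- read back the inserted row; on a MERGE_ON_READ table this is a snapshot query
SELECT * FROM t1;
```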
Expected behavior
I expected the data to be added to a table.
Environment Description
- Hudi version : 0.8.0
- Spark version : n/a
- Hive version : n/a
- Hadoop version : 2.10.1
- Storage (HDFS/S3/GCS…) : S3
- Running on Docker? (yes/no) : no
- Flink version : 1.12.2
Additional context
I have tried this code using hudi-flink-bundle_2.11-0.7.0.jar and that also fails.
Stacktrace
Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:215)
Caused by: java.lang.RuntimeException: Error running SQL job.
at org.apache.flink.table.client.gateway.local.LocalExecutor.lambda$executeUpdateInternal$4(LocalExecutor.java:514)
at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:256)
at org.apache.flink.table.client.gateway.local.LocalExecutor.executeUpdateInternal(LocalExecutor.java:507)
at org.apache.flink.table.client.gateway.local.LocalExecutor.executeUpdate(LocalExecutor.java:428)
at org.apache.flink.table.client.cli.CliClient.callInsert(CliClient.java:690)
Issue Analytics
- Created 2 years ago
- Comments: 8 (7 by maintainers)
Top GitHub Comments
@anikait-rao You may need to use the package built from the master branch manually, or wait for our 0.9 release which is coming soon.
Closing due to inactivity. Feel free to open a new issue if you are still facing problems. Thanks!