
master branch - flink sql create hive catalog error

See original GitHub issue

I built a jar from the master branch. When I create a Hive catalog in the Flink SQL client, the following problem occurs.

The SQL is:

drop catalog if exists iceberg_catalog;
CREATE CATALOG iceberg_catalog WITH (
  'type'='iceberg',
  'catalog-type'='hive',
  'uri'='thrift://xxx1:9083,thrift://xxx2:9083',
  'clients'='5',
  'property-version'='1',
  'warehouse'='hdfs://hacluster/user/hive/warehouse'
);

Error info:

java.lang.IllegalArgumentException: Cannot initialize Catalog, org.apache.iceberg.hive.HiveCatalog does not implement Catalog.
	at org.apache.iceberg.CatalogUtil.loadCatalog(CatalogUtil.java:176)
	at org.apache.iceberg.flink.CatalogLoader$HiveCatalogLoader.loadCatalog(CatalogLoader.java:112)
	at org.apache.iceberg.flink.FlinkCatalog.<init>(FlinkCatalog.java:110)
	at org.apache.iceberg.flink.FlinkCatalogFactory.createCatalog(FlinkCatalogFactory.java:130)
	at org.apache.iceberg.flink.FlinkCatalogFactory.createCatalog(FlinkCatalogFactory.java:117)
	at org.apache.flink.table.api.internal.TableEnvironmentImpl.createCatalog(TableEnvironmentImpl.java:1085)
	at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:1019)
	at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:690)
	at org.apache.zeppelin.flink.Flink111Shims.executeSql(Flink111Shims.java:374)
	at org.apache.zeppelin.flink.FlinkSqlInterrpeter.callCreateCatalog(FlinkSqlInterrpeter.java:391)
	at org.apache.zeppelin.flink.FlinkSqlInterrpeter.callCommand(FlinkSqlInterrpeter.java:244)
	at org.apache.zeppelin.flink.FlinkSqlInterrpeter.runSqlList(FlinkSqlInterrpeter.java:151)
	at org.apache.zeppelin.flink.FlinkSqlInterrpeter.internalInterpret(FlinkSqlInterrpeter.java:111)
	at org.apache.zeppelin.interpreter.AbstractInterpreter.interpret(AbstractInterpreter.java:47)
	at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:110)
	at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:808)
	at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:700)
	at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
	at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:132)
	at org.apache.zeppelin.scheduler.ParallelScheduler.lambda$runJobInScheduler$0(ParallelScheduler.java:46)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassCastException: org.apache.iceberg.hive.HiveCatalog cannot be cast to org.apache.iceberg.catalog.Catalog
	at org.apache.iceberg.CatalogUtil.loadCatalog(CatalogUtil.java:172)
	... 22 more

However, when I use iceberg-flink-runtime-0.11.1.jar, it seems to work fine. Could you tell me how to fix this?
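One way to narrow this down: a `ClassCastException` between a class and the very interface it implements almost always means the two classes were loaded by different classloaders, i.e. two copies of the Iceberg classes are on the classpath (for example the shaded iceberg-flink-runtime jar plus another unshaded Iceberg jar). A minimal sketch of a diagnostic (a hypothetical helper, not part of Iceberg or Flink) that reports where each class was actually loaded from:

```java
// Sketch: locate which classloader and jar a class came from.
// If the two names from the stack trace resolve to different jars,
// that duplicate is the cause of the ClassCastException.
public class ClassLoaderCheck {

    public static String locate(Class<?> cls) {
        ClassLoader cl = cls.getClassLoader(); // null for bootstrap classes
        java.security.CodeSource src = cls.getProtectionDomain().getCodeSource();
        return cls.getName() + " loaded by " + cl + " from "
                + (src == null ? "<bootstrap>" : src.getLocation());
    }

    public static void main(String[] args) {
        // On the Flink cluster, substitute the two names from the stack trace:
        //   org.apache.iceberg.hive.HiveCatalog
        //   org.apache.iceberg.catalog.Catalog
        // JDK classes are used here only so the sketch runs standalone.
        System.out.println(locate(java.util.ArrayList.class));
        System.out.println(locate(ClassLoaderCheck.class));
    }
}
```

Running `locate()` against the two Iceberg class names inside the Flink JVM (e.g. from a small job or a debugger) shows whether `HiveCatalog` and `Catalog` come from the same jar.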

Issue Analytics

  • State:open
  • Created 2 years ago
  • Comments:11 (5 by maintainers)

Top GitHub Comments

1 reaction
erictan90 commented, Aug 23, 2021

Having the same problem with Flink 1.12.1 and runtime 0.12.0; switching back to Flink 1.11.4 and runtime 0.11.1 works fine.
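Since the failure tracks the jar combination, it is worth checking whether more than one Iceberg jar ended up in Flink's lib directory. A sketch, assuming a standard Flink layout (`FLINK_LIB` and the jar names are assumptions; adjust to your install):

```shell
# Count Iceberg jars Flink will load. More than one (e.g. a shaded
# iceberg-flink-runtime jar plus an unshaded iceberg jar) puts two
# copies of the Iceberg classes on the classpath, which produces
# exactly this kind of ClassCastException.
FLINK_LIB="${FLINK_LIB:-/opt/flink/lib}"
count=$(ls "$FLINK_LIB" 2>/dev/null | grep -ci iceberg || true)
echo "iceberg jars in $FLINK_LIB: ${count:-0}"
if [ "${count:-0}" -gt 1 ]; then
  echo "WARNING: multiple Iceberg jars found; keep only one runtime jar"
fi
```

A healthy setup shows exactly one iceberg-flink-runtime jar, built for the same Flink minor version that is running.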

0 reactions
jtLiBrain commented, Jun 24, 2022

(quotes the original issue report above in full, including the same SQL and stack trace)

I got the same exception when I use Iceberg in Zeppelin.

Read more comments on GitHub >

Top Results From Across the Web

[GitHub] [iceberg] asnowfox edited a comment on issue #2468
[GitHub] [iceberg] asnowfox edited a comment on issue #2468: master branch - flink sql create hive catalog error.
Read more >
Hive Catalog | Apache Flink
Without a persistent catalog, users using Flink SQL CREATE DDL have to repeatedly create meta-objects like a Kafka table in each session, which...
Read more >
write apache iceberg table to azure ADLS / S3 without using ...
I would definitely try to use a catalog other than the HadoopCatalog / hdfs type for production workloads. As somebody who works on...
Read more >
Differences and considerations for Hive on Amazon EMR
Hive 2.1.0 on Amazon EMR release 5.x now creates, reads from, and writes to temporary files stored in Amazon S3. As a result,...
Read more >
MySQL CDC Connector
jar and put it under <FLINK_HOME>/lib/ . Note: flink-sql-connector-mysql-cdc-XXX-SNAPSHOT version is the code corresponding to the development branch. Users ...
Read more >
