Batch insert throws a NullPointerException
Environment
- OS version: macOS (latest)
- JDK version: JDK 11
- ClickHouse Server version: latest
- ClickHouse Native JDBC version: latest
- (Optional) Spark version: 3.0.2
- (Optional) Other components’ version: N/A
I suspect the Spark DataFrame's row count may be zero (i.e. an empty partition is being written).
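If an empty partition (or a null value in a non-nullable Int64 column) is the cause, a guard in the partition handler would avoid the NPE before the rows ever reach `addBatch()`. A minimal sketch, in Python for illustration (the handler name, row shape, and zero-default are hypothetical, not taken from `CKJDBCUtils.scala`):

```python
def insert_partition(rows):
    """Hypothetical foreachPartition handler: skip empty partitions and
    substitute a default for None in non-nullable Int64 columns."""
    rows = list(rows)
    if not rows:
        # An empty Spark partition would otherwise produce a zero-row batch.
        return 0
    # Replace None with 0 so serializeBinary never sees a null Int64.
    cleaned = [tuple(0 if v is None else v for v in row) for row in rows]
    # ... open the JDBC connection and addBatch()/executeBatch() here ...
    return len(cleaned)
```

The equivalent check in the Scala utils would be an `iter.nonEmpty` test at the top of the partition loop.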
Error logs
Caused by: java.lang.NullPointerException
at com.github.housepower.jdbc.data.type.DataTypeInt64.serializeBinary(DataTypeInt64.java:77)
at com.github.housepower.jdbc.data.Column.write(Column.java:31)
at com.github.housepower.jdbc.data.Block.appendRow(Block.java:93)
at com.github.housepower.jdbc.statement.ClickHousePreparedInsertStatement.addParameters(ClickHousePreparedInsertStatement.java:162)
at com.github.housepower.jdbc.statement.ClickHousePreparedInsertStatement.addBatch(ClickHousePreparedInsertStatement.java:95)
at com.hw.utils.CKJDBCUtils$.saveBatchRowInsertDfToCK(CKJDBCUtils.scala:724)
at com.hw.utils.CKJDBCUtils$.insertIterRowToCKTable(CKJDBCUtils.scala:691)
at com.hw.utils.CKJDBCUtils$.$anonfun$initTableWithInsertDataByRDDRows$1(CKJDBCUtils.scala:428)
at com.hw.utils.CKJDBCUtils$.$anonfun$initTableWithInsertDataByRDDRows$1$adapted(CKJDBCUtils.scala:425)
at org.apache.spark.rdd.RDD.$anonfun$foreachPartition$2(RDD.scala:994)
at org.apache.spark.rdd.RDD.$anonfun$foreachPartition$2$adapted(RDD.scala:994)
at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2154)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:127)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:462)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:465)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
Steps to reproduce
Other descriptions
Issue Analytics
- State:
- Created 2 years ago
- Comments:9 (4 by maintainers)
Hi Pan, thank you for your inputs. I was able to resolve the issue: I was using the ReplacingMergeTree() engine, which was not allowing duplicate records to be written.
Thank you for your support.
@TimmannaC Could you give me any advice about `registerDialect` in PySpark? In `JdbcDialects.registerDialect(ClickHouseDialect)`, the `ClickHouseDialect` is a Scala object (not a class), so I can't figure out how to register it. The code below doesn't work 😢 … it said `'JavaPackage' object is not callable`. I also found a way to get a reference to a Scala object, but that doesn't work either; the error message said `ClassNotFoundException: org.apache.spark.sql.jdbc.ClickHouseDialect$`.
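For reference, a Scala `object` compiles to a class named `ClickHouseDialect$` whose singleton instance lives in the static `MODULE$` field; since `$` is not valid in Python attribute syntax, it has to be reached with `getattr`. A sketch of a helper for this (assuming the jar containing the dialect is on the driver classpath, e.g. via `--jars`; otherwise you get exactly the `ClassNotFoundException` above):

```python
def scala_object(jvm, fqcn):
    """Return the singleton instance of a Scala `object` through py4j.

    A Scala object Foo compiles to class `Foo$` whose instance lives in
    the static MODULE$ field; `$` is invalid in Python attribute syntax,
    so walk the package path with getattr.
    """
    node = jvm
    parts = fqcn.split(".")
    for part in parts[:-1]:
        node = getattr(node, part)
    cls = getattr(node, parts[-1] + "$")
    return getattr(cls, "MODULE$")

# Usage with a live SparkSession (hypothetical, untested here):
#
#   jvm = spark.sparkContext._jvm
#   dialect = scala_object(jvm, "org.apache.spark.sql.jdbc.ClickHouseDialect")
#   jvm.org.apache.spark.sql.jdbc.JdbcDialects.registerDialect(dialect)
```

Calling `jvm.org.apache.spark.sql.jdbc.ClickHouseDialect()` directly fails with `'JavaPackage' object is not callable` because py4j treats the object's name as a package, not a constructor.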