
BigQuery Exception, cannot write data to BigQuery using JSON Converter

See original GitHub issue

Hi,

I am trying to write data from Kafka to BigQuery using the Kafka Connect JSON converter, but I always get an error. How can I solve this issue?

Here is the configuration:

{
    "name": "sink_bq_MyTable",
    "config": {
        "connector.class": "com.wepay.kafka.connect.bigquery.BigQuerySinkConnector",
        "key.converter": "org.apache.kafka.connect.json.JsonConverter",
        "key.converter.schema.registry.url": "http://shema-registry:8081",
        "key.converter.schemas.enable": "false",
        "value.converter": "org.apache.kafka.connect.json.JsonConverter",
        "value.converter.schema.registry.url": "http://schema-registry:8081",
        "value.converter.schemas.enable": "true",
        "transforms": "unwrap",
        "transforms.unwrap.type": "io.debezium.transforms.UnwrapFromEnvelope",
        "project": "my-bq-project-id",
        "keyfile": "/home/ubuntu/key.json",
        "datasets": ".*=my-bq-dataset",
        "autoCreateTables": "false",
        "autoUpdateSchemas": "false",
        "topics": "cdc.dbo.MyTable",
        "topicsToTables": "cdc.dbo.(.*)=$1"
    }
}

Here is the exception:

java.lang.ClassCastException: [B cannot be cast to java.nio.ByteBuffer
	at com.wepay.kafka.connect.bigquery.convert.BigQueryRecordConverter.convertObject(BigQueryRecordConverter.java:99)
	at com.wepay.kafka.connect.bigquery.convert.BigQueryRecordConverter.convertStruct(BigQueryRecordConverter.java:129)
	at com.wepay.kafka.connect.bigquery.convert.BigQueryRecordConverter.convertRecord(BigQueryRecordConverter.java:73)
	at com.wepay.kafka.connect.bigquery.convert.BigQueryRecordConverter.convertRecord(BigQueryRecordConverter.java:51)
	at com.wepay.kafka.connect.bigquery.BigQuerySinkTask.getRecordRow(BigQuerySinkTask.java:143)
	at com.wepay.kafka.connect.bigquery.BigQuerySinkTask.put(BigQuerySinkTask.java:165)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:538)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:321)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:224)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:192)
	at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:175)
	at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:219)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
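The message "[B cannot be cast to java.nio.ByteBuffer" means a raw byte[] ("[B" is the JVM's name for a byte array) was cast to ByteBuffer. One plausible reading of the stack trace: JsonConverter hands a BYTES-typed field to the sink as a byte[], while the record converter expects a ByteBuffer. A minimal, self-contained sketch of the failing cast (the class name here is illustrative, not the connector's actual code):

```java
import java.nio.ByteBuffer;

public class ByteCastDemo {
    public static void main(String[] args) {
        // A BYTES field can arrive from the converter as a raw byte[] ...
        Object fieldValue = new byte[] {1, 2, 3};

        try {
            // ... but a direct cast to ByteBuffer fails at runtime:
            ByteBuffer buf = (ByteBuffer) fieldValue;
        } catch (ClassCastException e) {
            // Same failure mode as in the stack trace above
            System.out.println("caught: " + e.getMessage());
        }

        // A byte[] has to be wrapped, not cast:
        ByteBuffer wrapped = ByteBuffer.wrap((byte[]) fieldValue);
        System.out.println(wrapped.remaining()); // prints 3
    }
}
```

This is why the error depends on the converter in use: a converter that produces ByteBuffer values for BYTES fields (as Avro-based converters do) would not trigger the cast failure.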

Thanks in advance

Issue Analytics

  • State: open
  • Created: 4 years ago
  • Comments: 12 (1 by maintainers)

Top GitHub Comments

1 reaction
Pangstar commented, Oct 19, 2022

+1

0 reactions
bmd-benitaclarissa commented, Oct 21, 2022

@bmd-benita Hi, were you able to write JSON data to BigQuery, or does this library only work with Avro format?

Yes, I use io.confluent.connect.json.JsonSchemaConverter to write the data in JSON format from Kafka (CDC) to BigQuery. The connector runs on Kafka Connect v7.1.1 and works properly now.
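Following that comment, the change would be limited to the value-converter lines of the original config. A hedged sketch of just those lines (the Schema Registry hostname is taken from the original config; JsonSchemaConverter ships with Confluent Platform and requires producers to write JSON Schema-serialized records, which this issue does not confirm for the poster's setup):

```json
{
    "value.converter": "io.confluent.connect.json.JsonSchemaConverter",
    "value.converter.schema.registry.url": "http://schema-registry:8081"
}
```

Unlike plain JsonConverter, a Schema Registry-backed converter carries a full schema, so typed fields such as BYTES can be materialized in the form the sink's record converter expects.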

Read more comments on GitHub

Top Results From Across the Web

Loading JSON data from Cloud Storage | BigQuery
You can load newline-delimited JSON data from Cloud Storage into a new table or partition, or append to or overwrite an existing...

JSON formatting Error when loading into Google Big Query
Yes, BigQuery only accepts newline-delimited JSON, which means one complete JSON object per line.

Using BigQuery with Python - Google Codelabs
In this codelab, you will learn how to use BigQuery with Python.

Working with JSON data in BigQuery - Medium
By calling PARSE_JSON on a string in JSON format. SAFE.PARSE_JSON tells BigQuery to use null for the object if there is a syntax...

BigQuery | Airbyte Documentation
BigQuery: Produces a normalized output by storing the JSON blob data in _airbyte_raw_* tables and then transforming and normalizing the data into...
