Deserialization exception: Unknown magic byte!
Hi, I'm using version 1.1.0 of this connector and I'm trying to export some data from a Kafka topic to Google BigQuery. I verified that I have a correct schema in my schema registry and that I can consume the topic using kafka-avro-console-consumer (from the Confluent Platform).
When starting the connector, however, I’m getting the following exceptions in the log:
Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id -1
Caused by: org.apache.kafka.common.errors.SerializationException: Unknown magic byte!
The table is created in BigQuery and the fields look correct, but no rows are being written as a result of the exception.
Any pointers?
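The usual cause of "Unknown magic byte!" in a sink connector is a converter mismatch: the connector is configured with Confluent's AvroConverter, but some of the bytes on the topic (often the record keys, which would also explain the "id -1" above) were not written in the Schema Registry wire format. A minimal sketch of the relevant Kafka Connect properties, assuming Confluent's AvroConverter and a local Schema Registry; the URL and the choice of key converter are placeholders to adapt:

# Values on the topic are wire-format Avro, so decode them with the Avro converter
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081
# If the keys are plain strings rather than wire-format Avro, an Avro
# key.converter will fail with "Unknown magic byte!" -- use a matching converter
key.converter=org.apache.kafka.connect.storage.StringConverter

The general rule: converters must match how the bytes were actually produced. Data written by a JsonConverter-configured source starts with '{' (0x7B), not the 0x00 magic byte, so AvroConverter can never read it back.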
Issue Analytics
- Created 5 years ago
- Comments: 7 (3 by maintainers)
Top Results From Across the Web
Kafka Streams - SerializationException: Unknown magic byte
"Unknown magic byte!" means your data does not adhere to the wire format that the Schema Registry serializers expect (see the wire-format sketch after these results).
Unknown magic byte even with AvroConverter #424 - GitHub
SerializationException: Unknown magic byte! The problem is that the data was written using a JDBC source connector with…
Magic byte error when trying to consume Avro data with Kafka ...
We're trying to use Kafka Connect to pull down data from Kafka, but we're having issues with the Avro deserialization.
Read more >Error "Unknown magic byte" occurred while deserializing Avro ...
Hi, We tried to produce and consume a AVRO message (zookeeper, broker and schema registry have been launched), error "Unknown magic byte" occurred...
org.apache.kafka.common.errors.SerializationException
SerializationException: Unknown magic byte! It often denotes that the topic data/message is not a valid Avro structure and hence could not be… (a quick verification command follows these results).
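For reference, the wire format the first result above refers to is Confluent's framing: one magic byte (0x00), a four-byte big-endian schema registry ID, then the Avro-encoded payload. A small Java sketch (the class and method names are my own, not part of any Kafka API) that classifies a raw record value and reproduces the exact check behind "Unknown magic byte!":

import java.nio.ByteBuffer;

// Classifies a raw Kafka record value against the Confluent wire format:
// byte 0 must be the magic byte 0x00, bytes 1-4 carry the schema registry
// ID as a big-endian int, and the Avro binary payload follows.
public class WireFormatCheck {
    public static void describe(byte[] value) {
        if (value == null || value.length < 5) {
            System.out.println("Too short to be wire-format Avro");
        } else if (value[0] != 0x00) {
            // This is the condition behind "Unknown magic byte!"
            System.out.printf("First byte is 0x%02X, not 0x00 -- not wire-format Avro%n",
                    value[0] & 0xFF);
        } else {
            int schemaId = ByteBuffer.wrap(value, 1, 4).getInt();
            System.out.println("Wire-format Avro, schema registry ID " + schemaId);
        }
    }
}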
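And since the last result points at the topic contents themselves, the quickest verification is the tool the reporter already used, run against the raw topic. Bootstrap server, topic name, and registry URL below are placeholders; print.key=true also reveals plain-string keys, which trigger this error whenever key.converter is set to Avro:

kafka-avro-console-consumer \
  --bootstrap-server localhost:9092 \
  --topic my-topic \
  --from-beginning \
  --property print.key=true \
  --property schema.registry.url=http://localhost:8081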
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Hi, I am getting the same error. I converted a JSON file into an Avro file.
@DediBar KCBQ currently works out of the box with Avro topics, not CSV/JSON topics. The reason is that we have to map incoming message schemas to BigQuery schemas (e.g. from an Avro schema to a BigQuery schema). We have made this mapper pluggable, but we only support Avro out of the box. If you want to use JSON/CSV input rather than Avro input, you'll have to write your own SchemaRetriever:
https://github.com/wepay/kafka-connect-bigquery/blob/master/kcbq-api/src/main/java/com/wepay/kafka/connect/bigquery/api/SchemaRetriever.java
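For anyone going that route, here is a hypothetical sketch of such a plug-in. The method names and signatures are assumptions based on the interface's name and purpose, not copied from the project, so check the linked SchemaRetriever file for the exact contract in your connector version:

import java.util.Map;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;

// Hypothetical custom schema retriever for KCBQ: given a topic, hand back the
// Kafka Connect schema that the BigQuery table should be derived from. The
// "implements SchemaRetriever" clause is deliberately omitted because the
// real interface (see the link above) may declare different methods.
public class FixedSchemaRetriever {
    public void configure(Map<String, String> properties) {
        // Read any custom settings from the connector configuration here.
    }

    public Schema retrieveSchema(String topic) {
        // Stand-in for a real lookup (e.g. against your own schema store):
        // a fixed two-field schema.
        return SchemaBuilder.struct()
                .field("id", Schema.INT64_SCHEMA)
                .field("name", Schema.STRING_SCHEMA)
                .build();
    }
}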