Kafka Connect Cassandra Sink
See original GitHub issue

I connected Kafka with Cassandra locally and it worked fine. But when another machine connects to mine and sends data, a serialization/deserialization error appears in the DataMountaineer Cassandra sink shell, as follows:
ERROR Task cassandra-sink-orders-0 threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:141)
org.apache.kafka.connect.errors.DataException: orders-topic
at io.confluent.connect.avro.AvroConverter.toConnectData(AvroConverter.java:92)
at org.apache.kafka.connect.runtime.WorkerSinkTask.convertMessages(WorkerSinkTask.java:401)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:249)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:179)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:148)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:139)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:182)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id -1
Caused by: org.apache.kafka.common.errors.SerializationException: Unknown magic byte!
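The "Unknown magic byte!" error comes from the Confluent wire format: every Avro-serialized message starts with a magic byte `0`, followed by a 4-byte big-endian Schema Registry id, followed by the Avro payload. If a producer writes plain JSON or strings to the topic, the first byte is not `0` and the `AvroConverter` fails exactly like this. A minimal sketch of that framing check (`inspect_message` is a hypothetical helper, not part of any Confluent library):

```python
import struct

MAGIC_BYTE = 0  # Confluent wire format: 1 magic byte + 4-byte schema id + Avro payload

def inspect_message(raw: bytes) -> int:
    """Return the embedded schema id if the message follows the Confluent
    wire format, otherwise raise, mirroring 'Unknown magic byte!'."""
    if len(raw) < 5 or raw[0] != MAGIC_BYTE:
        raise ValueError("Unknown magic byte!")  # not produced by the Avro serializer
    (schema_id,) = struct.unpack(">I", raw[1:5])  # big-endian Schema Registry id
    return schema_id

# A correctly framed message: magic byte 0, schema id 61, then the Avro body.
framed = bytes([MAGIC_BYTE]) + struct.pack(">I", 61) + b"\x02avro-body"
print(inspect_message(framed))  # -> 61

# A plain JSON message produced without the Avro serializer fails the check:
try:
    inspect_message(b'{"order": 1}')
except ValueError as e:
    print(e)  # -> Unknown magic byte!
```

In other words, this error almost always means the producer on the other machine is not using the Confluent Avro serializer at all.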
and these errors from the Kafka console consumer:
ERROR Unknown error when running consumer: (kafka.tools.ConsoleConsumer$:105)
org.apache.kafka.common.errors.SerializationException: Error retrieving Avro schema for id 61
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Schema not found; error code: 40403
at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:170)
at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:188)
at io.confluent.kafka.schemaregistry.client.rest.RestService.getId(RestService.java:330)
at io.confluent.kafka.schemaregistry.client.rest.RestService.getId(RestService.java:323)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getSchemaByIdFromRegistry(CachedSchemaRegistryClient.java:63)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getBySubjectAndID(CachedSchemaRegistryClient.java:118)
at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:121)
at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:92)
at io.confluent.kafka.formatter.AvroMessageFormatter.writeTo(AvroMessageFormatter.java:120)
at io.confluent.kafka.formatter.AvroMessageFormatter.writeTo(AvroMessageFormatter.java:112)
at kafka.tools.ConsoleConsumer$.process(ConsoleConsumer.scala:137)
at kafka.tools.ConsoleConsumer$.run(ConsoleConsumer.scala:75)
at kafka.tools.ConsoleConsumer$.main(ConsoleConsumer.scala:50)
at kafka.tools.ConsoleConsumer.main(ConsoleConsumer.scala)
and this in other trials:
ERROR Unknown error when running consumer: (kafka.tools.ConsoleConsumer$:105)
org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id -1
Caused by: org.apache.kafka.common.errors.SerializationException: Unknown magic byte!
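The "Schema not found; error code: 40403" variant is the complementary failure: the message is framed correctly, but the consumer asks *its* Schema Registry for id 61 and that registry has never registered such a schema. The fix on the Connect side is to point the worker's converters at the same registry the producer used. A sketch of the relevant worker properties (`shared-registry-host` is a placeholder, not a real hostname):

```properties
# Connect worker properties (sketch): both converters resolve schema ids
# against the same Schema Registry the producer registered them in.
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://shared-registry-host:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://shared-registry-host:8081
```

The console consumer's Avro formatter takes the same URL via its `schema.registry.url` property, so both must point at the one registry that actually holds id 61.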
I have been trying for more than 5 hours and nothing has changed. Could you help me solve this issue?
Issue Analytics
- Created: 6 years ago
- Comments: 6
Top Results From Across the Web

Cassandra Sink Connector for Confluent Platform
The Kafka Connect Cassandra Sink connector is a high-speed mechanism for writing data to Apache Cassandra and is compatible with Cassandra 2.1, 2.2, ...

Kafka to Cassandra | Open source sink kafka connector
A Kafka Connect sink connector for writing records from Kafka to Cassandra. Requires: Cassandra 2.2.4+ if you are on version 2.*, or 3.0.1+ ...

Installing DataStax Apache Kafka Connector 1.4.0
The cassandra-sink-distributed.json.sample file is located in the conf directory of the DataStax Apache Kafka Connector distribution package ...

Create an Apache Cassandra® stream reactor sink connector
The Apache Cassandra® stream reactor sink connector enables you to move data from an Aiven for Apache Kafka® cluster to an Apache Cassandra® ...

Integrating Kafka and Cassandra: A Comprehensive Guide
Cassandra as a Sink for Kafka: In this method, data is ingested into Cassandra from Kafka. It uses the DataStax Kafka Connector. This ...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
The other machine is using a different Schema Registry instance. The sink running on your machine uses its own Schema Registry instance, so the schema ids between the two won't match.

You can join our Slack channel and we can give more guidance on the setup. Your colleague connecting to your machine needs to use the same Schema Registry instance, otherwise you end up in trouble with deserialization. Also, make sure you push Avro to the topic.
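The id mismatch the comment describes can be illustrated with a toy sketch (plain Python dicts, not the real Schema Registry client API): the producer registers its schema in registry A and embeds the returned id in every message, but the sink looks that id up in registry B, which has never seen it.

```python
# Toy sketch: two independent in-memory "registries" show why an id
# embedded by the producer is meaningless to a consumer pointed at a
# different Schema Registry instance.

class ToyRegistry:
    def __init__(self):
        self._by_id = {}
        self._next_id = 61  # arbitrary starting id, echoing the error above

    def register(self, schema: str) -> int:
        schema_id = self._next_id
        self._by_id[schema_id] = schema
        self._next_id += 1
        return schema_id

    def get_by_id(self, schema_id: int) -> str:
        if schema_id not in self._by_id:
            raise LookupError("Schema not found; error code: 40403")
        return self._by_id[schema_id]

producer_registry = ToyRegistry()  # registry on the other machine
sink_registry = ToyRegistry()      # registry your local sink points at

schema_id = producer_registry.register('{"type": "record", "name": "Order"}')
print(producer_registry.get_by_id(schema_id))  # same registry: lookup succeeds

try:
    sink_registry.get_by_id(schema_id)  # different registry: id is unknown
except LookupError as e:
    print(e)  # -> Schema not found; error code: 40403
```

Pointing both machines at one shared `schema.registry.url` makes the ids resolve in a single namespace, which is exactly the fix suggested above.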