Redis Sink Connector error (QuickStart document)
I'd like to use the Redis sink connector and followed this instruction: http://docs.datamountaineer.com/en/latest/redis.html. However, the connector errors out and cannot sink to Redis (error below).
```
# kafka-avro-console-producer --broker-list localhost:9092 --topic redistopic --property value.schema='{"type":"record","name":"User","namespace":"com.datamountaineer.streamreactor.connect.redis","fields":[{"name":"firstName","type":"string"},{"name":"lastName","type":"string"},{"name":"age","type":"int"},{"name":"salary","type":"double"}]}'
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/confluent/share/java/kafka-serde-tools/slf4j-log4j12-1.7.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/confluent/share/java/schema-registry/slf4j-log4j12-1.7.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
{"firstName": "John", "lastName": "Smith", "age":30, "salary": 4830}
```
After running kafka-avro-console-producer and entering the sample data, the connector fails:
Task 0 is FAILED
```
connector.class                 com.datamountaineer.streamreactor.connect.redis.sink.RedisSinkConnector
connect.redis.sink.kcql         INSERT INTO TABLE1 SELECT * FROM redis-topic
task.class                      com.datamountaineer.streamreactor.connect.redis.sink.RedisSinkTask
topics                          redis-topic
tasks.max                       1
connect.redis.connection.port   6379
name                            RedisSinkConnector
connect.redis.connection.host   [Redis IP]
```
TRACE:
```
org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.
	at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:451)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:250)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:179)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:148)
	at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:139)
	at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:182)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
```
Could you please suggest how to solve this and how to use the Redis sink connector?
PS: The Cassandra sink works properly. I tried reconfiguring/reinstalling manually and also tried the fast-data-dev Docker image https://hub.docker.com/r/landoop/fast-data-dev/, but I still get the same error.
Thank you
Issue Analytics
- State:
- Created 6 years ago
- Comments: 6
There is a problem with the connector: it builds the key on the transformed struct, which in this case contains only `age` (`SELECT age FROM redistopic PK firstName`). Until we fix this, you need to add your key field to the SELECT fields. For example: `SELECT age, firstName FROM redistopic PK firstName`
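With that workaround applied, a corrected connector configuration might look like the sketch below (a Kafka Connect REST API payload; the host placeholder and table/topic names are taken from the report above, and the KCQL selects the `PK` field explicitly as suggested — adjust to your environment):

```json
{
  "name": "RedisSinkConnector",
  "config": {
    "connector.class": "com.datamountaineer.streamreactor.connect.redis.sink.RedisSinkConnector",
    "tasks.max": "1",
    "topics": "redis-topic",
    "connect.redis.connection.host": "[Redis IP]",
    "connect.redis.connection.port": "6379",
    "connect.redis.sink.kcql": "INSERT INTO TABLE1 SELECT age, firstName FROM redis-topic PK firstName"
  }
}
```

This can be posted to the Connect worker's `/connectors` endpoint; the key point is that `firstName` appears both in the SELECT list and in the `PK` clause.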
@stheppi @Antwnis Thank you for your advice. I can now sink to Redis with the Landoop container after adding `PK firstName` to the KCQL. I'll reinstall the production server again.
Thank you so much again!