Error when creating sink connector with `BytesToString$Value` transforms
See original GitHub issue.
I installed the plugin via the manual process described in the docs here, and then tried to create a new connector with the following body (note that the important fields are the last ones):
{
  "name": "bill_events",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "topics": "bill_events",
    "connection.url": "jdbc:postgresql://postgres_sink:5432/local?user=postgres&password=123&stringtype=unspecified",
    "transforms": "unwrap,created_at_converter,updated_at_converter,deleted_at_converter,insertTS,formatTS,type_field",
    "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
    "transforms.unwrap.drop.tombstones": "false",
    "auto.create": "true",
    "insert.mode": "upsert",
    "delete.enabled": "true",
    "pk.mode": "record_key",
    "errors.tolerance": "all",
    "errors.deadletterqueue.topic.name": "bill_events_dlq",
    "errors.deadletterqueue.context.headers.enable": true,
    "errors.deadletterqueue.topic.replication.factor": -1,
    "errors.log.enable": true,
    "transforms.created_at_converter.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
    "transforms.created_at_converter.field": "created_at",
    "transforms.created_at_converter.format": "yyyy-MM-dd'T'HH:mm:ss.SSS",
    "transforms.created_at_converter.target.type": "Timestamp",
    "transforms.updated_at_converter.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
    "transforms.updated_at_converter.field": "updated_at",
    "transforms.updated_at_converter.format": "yyyy-MM-dd'T'HH:mm:ss.SSS",
    "transforms.updated_at_converter.target.type": "Timestamp",
    "transforms.deleted_at_converter.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
    "transforms.deleted_at_converter.field": "deleted_at",
    "transforms.deleted_at_converter.format": "yyyy-MM-dd'T'HH:mm:ss.SSS",
    "transforms.deleted_at_converter.target.type": "Timestamp",
    "transforms.formatTS.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
    "transforms.formatTS.format": "yyyy-MM-dd HH:mm:ss.SSS",
    "transforms.formatTS.field": "kafka_timestamp",
    "transforms.formatTS.target.type": "Timestamp",
    "transforms.insertTS.type": "org.apache.kafka.connect.transforms.InsertField$Value",
    "transforms.insertTS.timestamp.field": "kafka_timestamp",
    "transforms.type_field.type": "com.github.jcustenborder.kafka.connect.transform.common.BytesToString$Value",
    "transforms.type_field.field": "type"
  }
}
Once I execute the POST to create the connector, I receive the following error:
<html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8" />
<title>Error 500 Request failed.</title>
</head>
<body>
<h2>HTTP ERROR 500 Request failed.</h2>
<table>
<tr>
<th>URI:</th>
<td>/connectors</td>
</tr>
<tr>
<th>STATUS:</th>
<td>500</td>
</tr>
<tr>
<th>MESSAGE:</th>
<td>Request failed.</td>
</tr>
<tr>
<th>SERVLET:</th>
<td>org.glassfish.jersey.servlet.ServletContainer-13278a41</td>
</tr>
</table>
<hr><a href="https://eclipse.org/jetty">Powered by Jetty:// 9.4.33.v20201020</a>
<hr />
</body>
</html>
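As a minimal sketch of the REST call being made (assuming Connect's REST API at localhost:8083, and showing only a subset of the config above; the request is built but not sent):

```python
import json
import urllib.request

# Hypothetical Connect REST endpoint; adjust host/port for your deployment.
CONNECT_URL = "http://localhost:8083"

def build_create_request(name, config):
    """Build the POST /connectors request Kafka Connect expects:
    a JSON body with "name" and a flat "config" map."""
    body = json.dumps({"name": name, "config": config}).encode("utf-8")
    return urllib.request.Request(
        f"{CONNECT_URL}/connectors",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

config = {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "bill_events",
    "transforms": "type_field",
    "transforms.type_field.type":
        "com.github.jcustenborder.kafka.connect.transform.common.BytesToString$Value",
    "transforms.type_field.field": "type",
}
req = build_create_request("bill_events", config)
# urllib.request.urlopen(req) would submit it to the worker.
```

An uncaught server-side exception while instantiating a transform class (for example, a `NoClassDefFoundError` for a missing dependency jar) can surface as exactly this kind of opaque HTML 500 from Jetty instead of a structured JSON error.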
and in the Kafka Connect logs we can see the entries below:
error: [log output from the original post not captured]
plugin being loaded by JdbcSink: [log output from the original post not captured]
It is important to note that we used the compiled files and put them into the plugin.path on Kafka Connect: [configuration screenshot not captured]
Can someone please help me with this? I have no clue what to do right now (I also tried installing an older version, but the build throws errors as well).
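For context, a minimal worker-configuration sketch of the plugin.path setup mentioned above (the directory path is a hypothetical example):

```properties
# connect-distributed.properties (or connect-standalone.properties)
# plugin.path must point at the directory *containing* the plugin folder,
# and every jar the plugin depends on must sit alongside the plugin jar.
plugin.path=/usr/share/kafka/plugins
```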
Issue Analytics
- State:
- Created a year ago
- Reactions: 2
- Comments:8 (3 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
I’m also experiencing the same issue.
I used version 0.1.0.58. There's no guava-*.jar in there; in fact, it isn't in the pom.xml either.
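A quick way to confirm a missing dependency like this is to list the jars actually shipped in the plugin directory; a minimal sketch (the directory path in the example comment is hypothetical):

```python
import pathlib

def find_jars(plugin_dir, name_prefix):
    """Return the names of jar files under plugin_dir (searched
    recursively) whose filenames start with name_prefix."""
    return sorted(
        p.name
        for p in pathlib.Path(plugin_dir).rglob(f"{name_prefix}*.jar")
    )

# e.g. find_jars("/usr/share/kafka/plugins/kafka-connect-transform-common", "guava")
# returning an empty list would confirm the missing Guava jar described above.
```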