
Error when creating sink connector with `BytesToString$Value` transforms

See original GitHub issue

I installed the plugin via the manual process described in the docs here, and then tried to create a new connector with the following body (note that the important fields are the last ones):

{
  "name": "bill_events",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "topics": "bill_events",
    "connection.url": "jdbc:postgresql://postgres_sink:5432/local?user=postgres&password=123&stringtype=unspecified",
    "transforms": "unwrap,created_at_converter,updated_at_converter,deleted_at_converter,insertTS,formatTS,type_field",
    "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
    "transforms.unwrap.drop.tombstones": "false",
    "auto.create": "true",
    "insert.mode": "upsert",
    "delete.enabled": "true",
    "pk.mode": "record_key",
    "errors.tolerance": "all",
    "errors.deadletterqueue.topic.name": "bill_events_dlq",
    "errors.deadletterqueue.context.headers.enable": true,
    "errors.deadletterqueue.topic.replication.factor": -1,
    "errors.log.enable": true,
    "transforms.created_at_converter.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
    "transforms.created_at_converter.field": "created_at",
    "transforms.created_at_converter.format": "yyyy-MM-dd'T'HH:mm:ss.SSS",
    "transforms.created_at_converter.target.type": "Timestamp",
    "transforms.updated_at_converter.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
    "transforms.updated_at_converter.field": "updated_at",
    "transforms.updated_at_converter.format": "yyyy-MM-dd'T'HH:mm:ss.SSS",
    "transforms.updated_at_converter.target.type": "Timestamp",
    "transforms.deleted_at_converter.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
    "transforms.deleted_at_converter.field": "deleted_at",
    "transforms.deleted_at_converter.format": "yyyy-MM-dd'T'HH:mm:ss.SSS",
    "transforms.deleted_at_converter.target.type": "Timestamp",
    "transforms.formatTS.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
    "transforms.formatTS.format": "yyyy-MM-dd HH:mm:ss.SSS",
    "transforms.formatTS.field": "kafka_timestamp",
    "transforms.formatTS.target.type": "Timestamp",
    "transforms.insertTS.type": "org.apache.kafka.connect.transforms.InsertField$Value",
    "transforms.insertTS.timestamp.field": "kafka_timestamp",
    "transforms.type_field.type": "com.github.jcustenborder.kafka.connect.transform.common.BytesToString$Value",
    "transforms.type_field.field": "type"
  }
}
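Before suspecting the plugin itself, it is worth ruling out a config-shape problem: Kafka Connect tends to return an opaque 500 when an alias listed in `transforms` has no matching `transforms.<alias>.type` entry. As a minimal local sanity check (a sketch, with `config` holding just the transform-related keys from the request body above), one could verify the aliases line up:

```python
# Sanity-check a Kafka Connect connector config: every alias listed in
# "transforms" must have a matching "transforms.<alias>.type" entry.
# Minimal sketch; "config" mirrors the relevant keys from the request body.

def missing_transform_types(config: dict) -> list:
    """Return the transform aliases that lack a .type entry."""
    aliases = [a.strip() for a in config.get("transforms", "").split(",") if a.strip()]
    return [a for a in aliases if "transforms.%s.type" % a not in config]

config = {
    "transforms": "unwrap,created_at_converter,insertTS,formatTS,type_field",
    "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
    "transforms.created_at_converter.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
    "transforms.insertTS.type": "org.apache.kafka.connect.transforms.InsertField$Value",
    "transforms.formatTS.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
    "transforms.type_field.type": "com.github.jcustenborder.kafka.connect.transform.common.BytesToString$Value",
}

print(missing_transform_types(config))  # → [] when every alias is configured
```

The Connect REST API also offers `PUT /connector-plugins/<connector-class>/config/validate`, which returns per-field validation errors instead of a bare 500, and can surface a misspelled or unloadable transform class directly.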

Once I execute the POST to create the connector, I receive the following error:

<html>

<head>
	<meta http-equiv="Content-Type" content="text/html;charset=utf-8" />
	<title>Error 500 Request failed.</title>
</head>

<body>
	<h2>HTTP ERROR 500 Request failed.</h2>
	<table>
		<tr>
			<th>URI:</th>
			<td>/connectors</td>
		</tr>
		<tr>
			<th>STATUS:</th>
			<td>500</td>
		</tr>
		<tr>
			<th>MESSAGE:</th>
			<td>Request failed.</td>
		</tr>
		<tr>
			<th>SERVLET:</th>
			<td>org.glassfish.jersey.servlet.ServletContainer-13278a41</td>
		</tr>
	</table>
	<hr><a href="https://eclipse.org/jetty">Powered by Jetty:// 9.4.33.v20201020</a>
	<hr />

</body>

</html>

The kafka-connect logs show the following:

error:

[screenshot: error log]

plugin being loaded by jdbcSink:

[screenshot: plugin loading log]

[screenshot: plugin loading log]

Note that we used the compiled files and put them into the plugin.path on Kafka Connect:

[screenshot: plugin directory contents]

Can someone please help me with this? I have no clue what to do right now (I also tried installing an older version, but the build throws errors as well).

Issue Analytics

  • State: open
  • Created a year ago
  • Reactions: 2
  • Comments: 8 (3 by maintainers)

Top GitHub Comments

3 reactions
felipemotarocha commented, Oct 24, 2022

I’m also experiencing the same issue.

3 reactions
moreiravictor commented, Oct 24, 2022

I used version 0.1.0.58. There's no guava-*.jar in there; actually, it isn't in the pom.xml either.

