
Avro converter does not work in Kafka Connect

See original GitHub issue

When the converter is used for the first time, it throws:

connect_1    | WARNING: An illegal reflective access operation has occurred
connect_1    | WARNING: Illegal reflective access by retrofit2.Platform (file:/kafka/external_libs/apicurio/retrofit-2.9.0.jar) to constructor java.lang.invoke.MethodHandles$Lookup(java.lang.Class,int)
connect_1    | WARNING: Please consider reporting this to the maintainers of retrofit2.Platform
connect_1    | WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
connect_1    | WARNING: All illegal access operations will be denied in a future release
connect_1    | 2020-09-21 08:27:56,001 ERROR  ||  Failed to start task inventory-connector-0   [org.apache.kafka.connect.runtime.Worker]
connect_1    | java.lang.VerifyError: Bad type on operand stack
connect_1    | Exception Details:
connect_1    |   Location:
connect_1    |     org/apache/avro/Schema.<clinit>()V @39: invokevirtual
connect_1    |   Reason:
connect_1    |     Type 'com/fasterxml/jackson/databind/ObjectMapper' (current frame, stack[1]) is not assignable to 'com/fasterxml/jackson/core/ObjectCodec'
connect_1    |   Current Frame:
connect_1    |     bci: @39
connect_1    |     flags: { }
connect_1    |     locals: { }
connect_1    |     stack: { 'com/fasterxml/jackson/core/JsonFactory', 'com/fasterxml/jackson/databind/ObjectMapper' }
connect_1    |   Bytecode:
connect_1    |     0000000: bb01 2359 b701 24b3 005a bb01 2559 b200
connect_1    |     0000010: 5ab7 0126 b300 f6b2 005a b201 27b6 0128
connect_1    |     0000020: 57b2 005a b200 f6b6 0129 57bb 012a 5910
connect_1    |     0000030: 0abd 00d1 5903 12b0 5359 0412 bc53 5905
connect_1    |     0000040: 12dd 5359 0612 b153 5907 12ae 5359 0812
connect_1    |     0000050: e253 5910 0612 d753 5910 0712 e053 5910
connect_1    |     0000060: 0812 6553 5910 0912 f053 b801 2bb7 012c
connect_1    |     0000070: b300 10bb 012a 59b2 0010 b701 2cb3 000f
connect_1    |     0000080: b200 0f12 c8b9 00f5 0200 57bb 012a 5910
connect_1    |     0000090: 06bd 00d1 5903 12c8 5359 0412 b053 5905
connect_1    |     00000a0: 12b1 5359 0612 c453 5907 1265 5359 0812
connect_1    |     00000b0: f053 b801 2bb7 012c b801 2db3 0009 ba01
connect_1    |     00000c0: 2e00 00b8 012f b300 05ba 0130 0000 b801
connect_1    |     00000d0: 2fb3 0004 bb01 0059 b701 21b3 00b5 b200
connect_1    |     00000e0: b513 0131 b201 32b9 010a 0300 57b2 00b5
connect_1    |     00000f0: 1301 33b2 0134 b901 0a03 0057 b200 b513
connect_1    |     0000100: 0135 b201 36b9 010a 0300 57b2 00b5 1301
connect_1    |     0000110: 37b2 0138 b901 0a03 0057 b200 b513 0139
connect_1    |     0000120: b200 c9b9 010a 0300 57b2 00b5 1301 3ab2
connect_1    |     0000130: 00ca b901 0a03 0057 b200 b513 013b b201
connect_1    |     0000140: 3cb9 010a 0300 57b2 00b5 1301 3db2 0070
connect_1    |     0000150: b901 0a03 0057 ba01 3e00 00b8 012f b300
connect_1    |     0000160: 03ba 013f 0000 b801 2fb3 0002 b1       
connect_1    | 
connect_1    | 	at io.apicurio.registry.utils.converter.avro.AvroData.<clinit>(AvroData.java:155)
connect_1    | 	at io.apicurio.registry.utils.converter.AvroConverter.configure(AvroConverter.java:64)
connect_1    | 	at org.apache.kafka.connect.runtime.isolation.Plugins.newConverter(Plugins.java:293)
connect_1    | 	at org.apache.kafka.connect.runtime.Worker.startTask(Worker.java:440)
connect_1    | 	at org.apache.kafka.connect.runtime.distributed.DistributedHerder.startTask(DistributedHerder.java:1147)
connect_1    | 	at org.apache.kafka.connect.runtime.distributed.DistributedHerder.access$1600(DistributedHerder.java:126)
connect_1    | 	at org.apache.kafka.connect.runtime.distributed.DistributedHerder$12.call(DistributedHerder.java:1162)
connect_1    | 	at org.apache.kafka.connect.runtime.distributed.DistributedHerder$12.call(DistributedHerder.java:1158)
connect_1    | 	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
connect_1    | 	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
connect_1    | 	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
connect_1    | 	at java.base/java.lang.Thread.run(Thread.java:834)

This is a regression from 1.3.0.Final.

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 6 (6 by maintainers)

Top GitHub Comments

2 reactions
famarting commented, Sep 28, 2020

btw @EricWittmann @carlesarnal, here is a test run in GH Actions reproducing the issue: https://github.com/Apicurio/apicurio-registry/runs/1165428129

1 reaction
EricWittmann commented, Sep 24, 2020

Here is the full log generated by @famartinrh's test: https://gist.github.com/famartinrh/8db30fcb680e14021903a414aeed2c61

@carlesarnal - this error is likely caused by having multiple copies of Jackson on the classpath.
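A quick way to check the duplicate-Jackson hypothesis is to list every Jackson jar the Connect worker can see. This is a diagnostic sketch; the directory paths are assumptions based on the log above (only /kafka/external_libs/apicurio appears there) and should be adjusted to your deployment:

```shell
#!/bin/sh
# Directories Kafka Connect typically loads jars from in this setup; the
# exact paths are assumptions -- adapt them to your install.
for dir in /kafka/libs /kafka/external_libs/apicurio /kafka/connect; do
  # Two different versions of jackson-core or jackson-databind showing up
  # here is exactly the condition that can make ObjectMapper fail to verify
  # against ObjectCodec in Schema.<clinit>, as in the log above.
  find "$dir" -name 'jackson-*.jar' 2>/dev/null
done | sort
```

If the output shows the same artifact at two versions, removing or aligning one of them is the usual fix.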

Read more comments on GitHub >

Top Results From Across the Web

Using Kafka Connect with Schema Registry
The Schema Registry URL configuration property is required for Avro, Protobuf, and JSON Schema. All connectors use the value.converter worker property org....
Read more >
Running Kafka Connect with Avro Converter : ConfigException
This is the correct answer. The avro converter used here requires a schema registry URL configuration for each instance that it is used...
Read more >
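The results above both come down to pointing the converter at a registry URL. A minimal sketch of the relevant worker settings, using the Apicurio converter class from the stack trace above (the registry URL and the `apicurio.registry.url` property name are illustrative assumptions; check the docs for your registry version):

```properties
# Converter class taken from the stack trace in this issue.
value.converter=io.apicurio.registry.utils.converter.AvroConverter
# Registry endpoint -- hypothetical host/path, adjust to your deployment.
value.converter.apicurio.registry.url=http://registry:8080/api
```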
Avro Serialization :: Debezium Documentation
Kafka Connect comes with a JSON converter that serializes the message keys and values into JSON documents. The JSON converter can be configured...
Read more >
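For comparison, the built-in JSON converter mentioned in the Debezium docs needs no registry; a minimal worker-config sketch:

```properties
# Kafka Connect's bundled JSON converter; schemas.enable controls whether
# each record is wrapped in a {"schema": ..., "payload": ...} envelope.
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
```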
When a Kafka Connect converter is not a converter
Regardless, you figure that since it's JSON then you must use the JsonConverter? Or, you want to take that JSON and write...
Read more >
22 Using the Kafka Connect Handler - Oracle Help Center
Confluent has solved this problem by using a schema registry and the Confluent schema converters. The following shows the configuration of the Kafka...
Read more >
