
java.lang.ClassCastException: class io.confluent.kafka.serializers.KafkaAvroDeserializer


Overview

I am trying to deserialize an Avro message using io.confluent.kafka.serializers.KafkaAvroDeserializer and receive the following error:

Error

java.lang.IllegalStateException: java.lang.ClassCastException: class io.confluent.kafka.serializers.KafkaAvroDeserializer
        at org.springframework.cloud.stream.binder.kafka.streams.KafkaStreamsStreamListenerSetupMethodOrchestrator.adaptAndRetrieveInboundArguments(KafkaStreamsStreamListenerSetupMethodOrchestrator.java:318) ~[spring-cloud-stream-binder-kafka-streams-3.0.3.RELEASE.jar:3.0.3.RELEASE]
        at org.springframework.cloud.stream.binder.kafka.streams.KafkaStreamsStreamListenerSetupMethodOrchestrator.orchestrateStreamListenerSetupMethod(KafkaStreamsStreamListenerSetupMethodOrchestrator.java:164) ~[spring-cloud-stream-binder-kafka-streams-3.0.3.RELEASE.jar:3.0.3.RELEASE]
        at org.springframework.cloud.stream.binding.StreamListenerAnnotationBeanPostProcessor.doPostProcess(StreamListenerAnnotationBeanPostProcessor.java:232) ~[spring-cloud-stream-3.0.3.RELEASE.jar:3.0.3.RELEASE]
        at org.springframework.cloud.stream.binding.StreamListenerAnnotationBeanPostProcessor.lambda$postProcessAfterInitialization$0(StreamListenerAnnotationBeanPostProcessor.java:202) ~[spring-cloud-stream-3.0.3.RELEASE.jar:3.0.3.RELEASE]
        at java.base/java.lang.Iterable.forEach(Iterable.java:75) ~[na:na]
        at org.springframework.cloud.stream.binding.StreamListenerAnnotationBeanPostProcessor.injectAndPostProcessDependencies(StreamListenerAnnotationBeanPostProcessor.java:336) ~[spring-cloud-stream-3.0.3.RELEASE.jar:3.0.3.RELEASE]
        at org.springframework.cloud.stream.binding.StreamListenerAnnotationBeanPostProcessor.afterSingletonsInstantiated(StreamListenerAnnotationBeanPostProcessor.java:118) ~[spring-cloud-stream-3.0.3.RELEASE.jar:3.0.3.RELEASE]
        at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:896) ~[spring-beans-5.2.4.RELEASE.jar:5.2.4.RELEASE]
        at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:878) ~[spring-context-5.2.4.RELEASE.jar:5.2.4.RELEASE]
        at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:550) ~[spring-context-5.2.4.RELEASE.jar:5.2.4.RELEASE]
        at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:141) ~[spring-boot-2.2.5.RELEASE.jar:2.2.5.RELEASE]
        at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:747) ~[spring-boot-2.2.5.RELEASE.jar:2.2.5.RELEASE]
        at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:397) ~[spring-boot-2.2.5.RELEASE.jar:2.2.5.RELEASE]
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:315) ~[spring-boot-2.2.5.RELEASE.jar:2.2.5.RELEASE]
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:1226) ~[spring-boot-2.2.5.RELEASE.jar:2.2.5.RELEASE]
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:1215) ~[spring-boot-2.2.5.RELEASE.jar:2.2.5.RELEASE]
        at io.karthik.cepengine.CepEngineApplication.main(CepEngineApplication.java:11) ~[main/:na]
Caused by: java.lang.ClassCastException: class io.confluent.kafka.serializers.KafkaAvroDeserializer
        at java.base/java.lang.Class.asSubclass(Class.java:3641) ~[na:na]
        at org.apache.kafka.common.utils.Utils.loadClass(Utils.java:348) ~[kafka-clients-2.3.1.jar:na]
        at org.apache.kafka.common.utils.Utils.newInstance(Utils.java:337) ~[kafka-clients-2.3.1.jar:na]
        at org.springframework.cloud.stream.binder.kafka.streams.KeyValueSerdeResolver.getKeySerde(KeyValueSerdeResolver.java:257) ~[spring-cloud-stream-binder-kafka-streams-3.0.3.RELEASE.jar:3.0.3.RELEASE]
        at org.springframework.cloud.stream.binder.kafka.streams.KeyValueSerdeResolver.getInboundKeySerde(KeyValueSerdeResolver.java:106) ~[spring-cloud-stream-binder-kafka-streams-3.0.3.RELEASE.jar:3.0.3.RELEASE]
        at org.springframework.cloud.stream.binder.kafka.streams.KafkaStreamsStreamListenerSetupMethodOrchestrator.adaptAndRetrieveInboundArguments(KafkaStreamsStreamListenerSetupMethodOrchestrator.java:272) ~[spring-cloud-stream-binder-kafka-streams-3.0.3.RELEASE.jar:3.0.3.RELEASE]
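The `Caused by` frames show the mechanism: `Utils.loadClass` calls `Class.asSubclass` to verify that the configured class implements the expected interface — here `org.apache.kafka.common.serialization.Serde`, since `KeyValueSerdeResolver.getKeySerde` is resolving a Serde — and `KafkaAvroDeserializer` implements `Deserializer`, not `Serde`, so the check fails. That is also why the exception message is just the class name: `asSubclass` reports the offending class. A stdlib-only sketch of the same mechanism, using hypothetical stand-in interfaces rather than the real Kafka types:

```java
// Minimal reproduction of the Class.asSubclass check that Kafka's
// Utils.loadClass performs. The interfaces here are hypothetical
// stand-ins, not the real org.apache.kafka.common.serialization types.
public class AsSubclassDemo {

    interface Serde {}          // stand-in for Kafka's Serde
    interface Deserializer {}   // stand-in for Kafka's Deserializer

    // Like KafkaAvroDeserializer: implements Deserializer only.
    static class AvroDeserializer implements Deserializer {}

    // Like a real Serde implementation (e.g. SpecificAvroSerde).
    static class AvroSerde implements Serde {}

    /** Returns true if the class can be used where a Serde is expected. */
    static boolean usableAsSerde(Class<?> configured) {
        try {
            configured.asSubclass(Serde.class); // throws ClassCastException on mismatch
            return true;
        } catch (ClassCastException e) {
            // The message is just the class name, matching the error above.
            System.out.println("ClassCastException: " + e.getMessage());
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("deserializer usable as Serde: " + usableAsSerde(AvroDeserializer.class));
        System.out.println("serde usable as Serde: " + usableAsSerde(AvroSerde.class));
    }
}
```

Running this prints a ClassCastException whose message is just the class name of the deserializer stand-in, mirroring the one-line message in the stack trace above.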

Dependencies

dependencies {
	implementation 'org.apache.kafka:kafka-streams'
	implementation 'org.springframework.cloud:spring-cloud-stream:3.0.3.RELEASE'
	implementation 'org.springframework.cloud:spring-cloud-stream-binder-kafka-streams:3.0.3.RELEASE'
	implementation 'org.springframework.cloud:spring-cloud-starter-stream-kafka:3.0.3.RELEASE'
	implementation 'io.confluent:kafka-avro-serializer:5.4.1'
	implementation 'io.confluent:kafka-streams-avro-serde:5.4.1'
	implementation 'org.apache.avro:avro:1.9.1'
	implementation 'org.springframework.boot:spring-boot-starter-actuator'
	implementation 'org.springframework.boot:spring-boot-starter-web'
	compileOnly 'org.projectlombok:lombok'
	annotationProcessor 'org.projectlombok:lombok'
	testImplementation('org.springframework.boot:spring-boot-starter-test') {
		exclude group: 'org.junit.vintage', module: 'junit-vintage-engine'
	}
	runtime group: 'org.springframework.cloud', name: 'spring-cloud-dependencies', version: 'Hoxton.SR3', ext: 'pom'
}

Application.yaml

spring:
  kafka:
    consumer:
      value-deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
      key-deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
    producer:
      value-serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
      key-serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
  cloud:
    stream:
      kafka:
        binder:
          brokers: localhost
          defaultBrokerPort: 9093
          producerProperties:
            client.id: Cep-Engine
            schema.registry.url: http://localhost:8081
            requiredAcks: 1
          consumerProperties:
            schema.registry.url: http://localhost:8081
        streams:
          binder:
            brokers: localhost:9093
            configuration:
              commit.interval.mms: 1000
          bindings:
            shipment: 
              consumer:
                applicationId: kstream
                valueSerde: io.confluent.kafka.serializers.KafkaAvroDeserializer
                keySerde: io.confluent.kafka.serializers.KafkaAvroDeserializer
              group: shipment-consumer-streamer
              content-type: application/*+avro

logging:
  level:
    root: INFO

Interface

package io.karthik.cepengine.streamers;

import org.apache.kafka.streams.kstream.KStream;
import org.springframework.cloud.stream.annotation.Input;

public interface KafkaStreamsIOInterface {

  String INPUT = "shipment";

  @Input(INPUT)
  KStream<?, ?> input();
}

Stream Processing class

package io.karthik.cepengine.streamers;

import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.streams.kstream.KStream;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.stereotype.Component;

@EnableBinding(KafkaStreamsIOInterface.class)
@Component
public class KStreamShipmentStreamer {

  private final Logger log = LoggerFactory.getLogger(KStreamShipmentStreamer.class);

  @StreamListener(KafkaStreamsIOInterface.INPUT)
  public void processStream(KStream<GenericRecord, GenericRecord> inputStream) {
    inputStream.foreach((k,v) -> log.info("Key: {}, value: {}", k, v));
  }
}

I am not sure why it throws a ClassCastException. Also, how can I specify other Kafka StreamsConfig-specific properties?

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 6 (3 by maintainers)

Top GitHub Comments

sobychacko commented on Mar 26, 2020 (2 reactions)

@karthik-reddivari I see that you are using the wrong classes for Serde. See here:

valueSerde: io.confluent.kafka.serializers.KafkaAvroDeserializer
keySerde: io.confluent.kafka.serializers.KafkaAvroDeserializer

You are not using a proper Serde class; you are passing the deserializer directly. You need to change it as below.

valueSerde: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde
keySerde: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde
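Put together, the corrected streams section of the application.yaml might look roughly like this (a sketch assuming the rest of the file is unchanged; note that since the listener consumes GenericRecord, io.confluent.kafka.streams.serdes.avro.GenericAvroSerde, from the same kafka-streams-avro-serde artifact, may be the closer match than SpecificAvroSerde):

```yaml
# Sketch of the corrected binding section. The Avro serdes need
# schema.registry.url, which can be supplied via the binder-level
# configuration map so it reaches every serde.
spring:
  cloud:
    stream:
      kafka:
        streams:
          binder:
            brokers: localhost:9093
            configuration:
              schema.registry.url: http://localhost:8081
          bindings:
            shipment:
              consumer:
                applicationId: kstream
                keySerde: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde
                valueSerde: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde
```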

You also possibly don’t need this unless you have some very specific needs.

consumer:
      value-deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
      key-deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
    producer:
      value-serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
      key-serializer: io.confluent.kafka.serializers.KafkaAvroSerializer

Also, how can I specify other Kafka StreamsConfig-specific properties?

Any Kafka Streams configuration property can be set directly on the binder configuration or at the binding level, e.g. spring.cloud.stream.kafka.streams.binder.configuration.schema.registry.url or spring.cloud.stream.kafka.streams.bindings.shipment.consumer.configuration...
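For instance (a sketch; the property names are standard Kafka Streams and consumer configs — note, too, that the question's YAML sets commit.interval.mms, which Kafka Streams would not recognize; the property is commit.interval.ms):

```yaml
# Binder-wide Kafka Streams properties apply to every binding; a
# binding-level configuration block overrides them for one binding.
spring:
  cloud:
    stream:
      kafka:
        streams:
          binder:
            configuration:
              commit.interval.ms: 1000          # note: .ms, not .mms
              schema.registry.url: http://localhost:8081
          bindings:
            shipment:
              consumer:
                configuration:
                  max.poll.records: 250         # example binding-level override
```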

You also don’t need to provide any content-type configuration (content-type: application/*+avro), since the Kafka Streams binder performs serialization and deserialization natively.

Hope this helps.

karthik-reddivari commented on Mar 27, 2020 (0 reactions)

Thanks @sobychacko! Closing this issue.
