
KTable doesn't auto-convert as input channel

See original GitHub issue

Hi! I’m using Spring Cloud Stream with Kafka Streams, and I have a question about KTable. When my input binding is a KStream, Spring Cloud Stream automatically converts my JSON payload (the value) into my object, but when I use KTable as the input binding I get this error:

“java.lang.ClassCastException: class java.lang.String cannot be cast to class br.com.wspot.accounting.streamprocessor.models.RawAccounting” (full stack trace below).

I’m using Java 11 + Spring Boot 2.1.7.

Is this a bug, or am I missing a concept?

Sorry if I submitted this question in the wrong place!

My code and application.properties follow:

## My application.properties

spring.cloud.stream.kafka.streams.binder.configuration.commit.interval.ms=1000
spring.cloud.stream.kafka.streams.binder.configuration.default.key.serde=org.apache.kafka.common.serialization.Serdes$StringSerde
spring.cloud.stream.kafka.streams.binder.configuration.default.value.serde=org.apache.kafka.common.serialization.Serdes$StringSerde

spring.cloud.stream.bindings.inputAcct.destination=wspot.accounting.raw
spring.cloud.stream.bindings.inputAcct.content-type=application/json
spring.cloud.stream.kafka.streams.bindings.inputAcct.consumer.application-id=wspot-acct-raw-processor-id

spring.cloud.stream.kafka.binder.brokers=localhost:9092
## This example with KStream<String, RawAccounting> works great!
@EnableBinding({Stream.AccountingProcessor.class })
public class Stream {

    @StreamListener
    public void process(@Input(AccountingProcessor.INPUT_ACCT) KStream<String, RawAccounting> rawAccountingKStream) {

        rawAccountingKStream.foreach((key, value) -> System.out.println(value));
    }

    public interface AccountingProcessor {
        String INPUT_ACCT = "inputAcct";

        @Input(AccountingProcessor.INPUT_ACCT)
        KStream<String, RawAccounting> rawAcct();
    }
}

But when I change the KStream to a KTable, I get an exception:

## This example with KTable throws the following exception: java.lang.ClassCastException: class java.lang.String cannot be cast to class br.com.wspot.accounting.streamprocessor.models.RawAccounting

@EnableBinding({Stream.AccountingProcessor.class })
public class Stream {

    @StreamListener
    public void process(@Input(AccountingProcessor.INPUT_ACCT) KTable<String, RawAccounting> rawAccountingKTable) {

        rawAccountingKTable.toStream().foreach((key, value) -> System.out.println(value));
    }

    public interface AccountingProcessor {
        String INPUT_ACCT = "inputAcct";

        @Input(AccountingProcessor.INPUT_ACCT)
        KTable<String, RawAccounting> rawAcct();
    }
}

The full stack trace:

Exception in thread "wspot-acct-raw-processor-id-12eae8ae-fcf4-4a54-9d98-fc3b86f815ed-StreamThread-1" org.apache.kafka.streams.errors.ProcessorStateException: task [0_0] Failed to flush state store wspot.accounting.raw-STATE-STORE-0000000000
	at org.apache.kafka.streams.processor.internals.ProcessorStateManager.flush(ProcessorStateManager.java:242)
	at org.apache.kafka.streams.processor.internals.AbstractTask.flushState(AbstractTask.java:202)
	at org.apache.kafka.streams.processor.internals.StreamTask.flushState(StreamTask.java:420)
	at org.apache.kafka.streams.processor.internals.StreamTask.commit(StreamTask.java:394)
	at org.apache.kafka.streams.processor.internals.StreamTask.commit(StreamTask.java:382)
	at org.apache.kafka.streams.processor.internals.AssignedTasks$1.apply(AssignedTasks.java:67)
	at org.apache.kafka.streams.processor.internals.AssignedTasks.applyToRunningTasks(AssignedTasks.java:362)
	at org.apache.kafka.streams.processor.internals.AssignedTasks.commit(AssignedTasks.java:352)
	at org.apache.kafka.streams.processor.internals.TaskManager.commitAll(TaskManager.java:401)
	at org.apache.kafka.streams.processor.internals.StreamThread.maybeCommit(StreamThread.java:1042)
	at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:845)
	at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:767)
	at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:736)
Caused by: java.lang.ClassCastException: class java.lang.String cannot be cast to class br.com.wspot.accounting.streamprocessor.models.RawAccounting (java.lang.String is in module java.base of loader 'bootstrap'; br.com.wspot.accounting.streamprocessor.models.RawAccounting is in unnamed module of loader 'app')
	at org.apache.kafka.streams.kstream.internals.KStreamPeek$KStreamPeekProcessor.process(KStreamPeek.java:42)
	at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:50)
	at org.apache.kafka.streams.processor.internals.ProcessorNode.runAndMeasureLatency(ProcessorNode.java:244)
	at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:133)
	at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:143)
	at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:126)
	at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:90)
	at org.apache.kafka.streams.kstream.internals.KStreamMapValues$KStreamMapProcessor.process(KStreamMapValues.java:41)
	at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:50)
	at org.apache.kafka.streams.processor.internals.ProcessorNode.runAndMeasureLatency(ProcessorNode.java:244)
	at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:133)
	at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:143)
	at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:126)
	at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:90)
	at org.apache.kafka.streams.kstream.internals.ForwardingCacheFlushListener.apply(ForwardingCacheFlushListener.java:42)
	at org.apache.kafka.streams.state.internals.CachingKeyValueStore.putAndMaybeForward(CachingKeyValueStore.java:101)
	at org.apache.kafka.streams.state.internals.CachingKeyValueStore.access$000(CachingKeyValueStore.java:38)
	at org.apache.kafka.streams.state.internals.CachingKeyValueStore$1.apply(CachingKeyValueStore.java:83)
	at org.apache.kafka.streams.state.internals.NamedCache.flush(NamedCache.java:141)
	at org.apache.kafka.streams.state.internals.NamedCache.flush(NamedCache.java:99)
	at org.apache.kafka.streams.state.internals.ThreadCache.flush(ThreadCache.java:125)
	at org.apache.kafka.streams.state.internals.CachingKeyValueStore.flush(CachingKeyValueStore.java:123)
	at org.apache.kafka.streams.state.internals.InnerMeteredKeyValueStore.flush(InnerMeteredKeyValueStore.java:284)
	at org.apache.kafka.streams.state.internals.MeteredKeyValueBytesStore.flush(MeteredKeyValueBytesStore.java:149)
	at org.apache.kafka.streams.processor.internals.ProcessorStateManager.flush(ProcessorStateManager.java:239)
	... 12 more

One important detail: if I use “KTable<String, String>” everything works fine, but when I use my own DTO type “RawAccounting”, the automatic conversion only works with KStream, not with KTable.

My model is below:

@Getter
@Setter
public class RawAccounting implements Serializable {
    private String username;
    private String statusType;
    private String sessionId;
    private String acctUniqueId;
    private String nasIpAddress;
    private String acctInputOctets;
    private String acctOutputOctets;
    private String nasIdentifier;
    private String calledStationId;
    private String callingStationId;
    private String framedIpAddress;
    private String acctDelayTime;
    private String acctSessionTime;
    private String timestamp;

    @JsonCreator
    public RawAccounting(
            @JsonProperty("username") String username,
            @JsonProperty("acct_status_type") String statusType,
            @JsonProperty("acct_session_id") String sessionId,
            @JsonProperty("acct_unique_session_id") String acctUniqueId,
            @JsonProperty("nas_ip_address") String nasIpAddress,
            @JsonProperty("acct_input_octets") String acctInputOctets,
            @JsonProperty("acct_output_octets") String acctOutputOctets,
            @JsonProperty("nas_identifier") String nasIdentifier,
            @JsonProperty("called_station_id") String calledStationId,
            @JsonProperty("calling_station_id") String callingStationId,
            @JsonProperty("framed_ip_address") String framedIpAddress,
            @JsonProperty("acct_delay_time") String acctDelayTime,
            @JsonProperty("acct_session_time") String acctSessionTime,
            @JsonProperty("timestamp") String timestamp
    ) {
        this.username = username;
        this.statusType = statusType;
        this.sessionId = sessionId;
        this.acctUniqueId = acctUniqueId;
        this.nasIpAddress = nasIpAddress;
        this.acctInputOctets = acctInputOctets;
        this.acctOutputOctets = acctOutputOctets;
        this.nasIdentifier = nasIdentifier;
        this.calledStationId = calledStationId;
        this.callingStationId = callingStationId;
        this.framedIpAddress = framedIpAddress;
        this.acctDelayTime = acctDelayTime;
        this.acctSessionTime = acctSessionTime;
        this.timestamp = timestamp;
    }

    @Override
    public String toString() {
        return "RawAccounting{" +
                "username='" + username + '\'' +
                '}';
    }
}
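
For reference, given the @JsonProperty mappings above, a message on the input topic would presumably look something like this (all field values here are invented for illustration):

{
  "username": "alice",
  "acct_status_type": "Start",
  "acct_session_id": "5a2b9c01",
  "acct_unique_session_id": "9f8e7d6c",
  "nas_ip_address": "10.0.0.1",
  "acct_input_octets": "0",
  "acct_output_octets": "0",
  "nas_identifier": "nas-01",
  "called_station_id": "AA-BB-CC-DD-EE-FF",
  "calling_station_id": "11-22-33-44-55-66",
  "framed_ip_address": "192.168.0.42",
  "acct_delay_time": "0",
  "acct_session_time": "3600",
  "timestamp": "2019-08-22T10:00:00Z"
}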

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 5 (3 by maintainers)

Top GitHub Comments

1 reaction
sobychacko commented on Aug 22, 2019

@felipearomani KTable is always converted using the native Serde feature of Kafka Streams. Framework-level conversion is not done on KTable (although there is an open issue to add it). Since you are using a custom type for the value, you need to specify a proper Serde instead of the default String serde. You can add these to the configuration:

spring.cloud.stream.kafka.streams.binder.configuration:
  default.value.serde: org.springframework.kafka.support.serializer.JsonSerde
  spring.json.value.default.type: br.com.wspot.accounting.streamprocessor.models.RawAccounting
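
Since the question uses a flat application.properties file rather than YAML, the equivalent lines would presumably be (note that spring.json.value.default.type takes the fully qualified class name):

spring.cloud.stream.kafka.streams.binder.configuration.default.value.serde=org.springframework.kafka.support.serializer.JsonSerde
spring.cloud.stream.kafka.streams.binder.configuration.spring.json.value.default.type=br.com.wspot.accounting.streamprocessor.models.RawAccounting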

If you don’t want to set that Serde at the default level, you can set it on the binding itself, which has higher priority (spring.cloud.stream.kafka.streams.bindings.inputAcct.consumer.valueSerde).
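
Spelled out in the application.properties style used earlier, that binding-level override would presumably be:

spring.cloud.stream.kafka.streams.bindings.inputAcct.consumer.valueSerde=org.springframework.kafka.support.serializer.JsonSerde

A plain JsonSerde configured this way still needs spring.json.value.default.type to know the target class. An alternative sketch, not from this thread, is a small subclass that pins the type (RawAccountingSerde is a hypothetical name; recent spring-kafka versions resolve the target type from the subclass’s generic parameter):

import org.springframework.kafka.support.serializer.JsonSerde;

// Hypothetical serde subclass: fixing the generic parameter lets
// JsonSerde resolve the target type without extra configuration.
public class RawAccountingSerde extends JsonSerde<RawAccounting> {
}

You would then point the valueSerde binding property at that class instead.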

This is a Kafka binder related issue. Once you confirm that the solution works, I will move it over there before closing. However, please keep in mind that Stack Overflow is a better place for questions like this.

By the way, in the latest version of the Spring Cloud Stream Kafka Streams binder, serialization is always delegated to Kafka Streams, and the binder tries to infer the types for you before delegating. If you can update to the latest milestone or snapshot (3.0.0), your application will work without providing these extra properties.
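
As a minimal sketch of what that looks like under the 3.0 functional programming model (this assumes the functional binding style; it is not code from the thread, and the class name is illustrative):

import java.util.function.Consumer;

import org.apache.kafka.streams.kstream.KTable;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class AccountingStreamConfig {

    // The binder inspects the generic signature, infers RawAccounting as
    // the value type, and configures a JSON Serde for it automatically.
    @Bean
    public Consumer<KTable<String, RawAccounting>> process() {
        return table -> table.toStream()
                .foreach((key, value) -> System.out.println(value));
    }
}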

Thank you!

0 reactions
sobychacko commented on Sep 11, 2019

@felipearomani Closing this issue. Please re-open or create a new one if you see any further issues. Thank you!
