
Error messages are never sent to the DLQ


Hi,

I created a Spring Cloud Stream application using the Kafka binder; here is my YAML configuration file:

spring:
  cloud:
    stream:
      kafka:
        binder:
          brokers: localhost
          consumerProperties:
            schema.registry.url: http://localhost:8081
            key.deserializer: org.apache.kafka.common.serialization.StringDeserializer
            value.deserializer: io.confluent.kafka.streams.serdes.avro.SpecificAvroDeserializer
          producerProperties:
            schema.registry.url: http://localhost:8081
            key.serializer: org.apache.kafka.common.serialization.StringSerializer
            value.serializer: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerializer
        bindings:
          input:
            consumer:
              startOffset: earliest
              enableDlq: true
              dlqName: dlq
      bindings:
        input:
          destination: foo
          group: fooGroup
          consumer:
            maxAttempts: 1
            useNativeDecoding: true
        output:
          destination: bar
          producer:
            useNativeEncoding: true
      function:
        definition: processor

Everything works great except the DLQ. While inspecting the broker I can see that the dlq topic gets created, yet no messages are ever sent to it. Here are the two processors I tried, one at a time:

    // Imperative variant: fails on every record
    @Bean
    public Function<EventA, EventB> processor() {
        return input -> {
            throw new RuntimeException("FAIL");
        };
    }

    // Reactive variant: fails on every record inside the Flux pipeline
    @Bean
    public Function<Flux<EventA>, Flux<EventB>> processor() {
        return input -> input.map(i -> {
            throw new RuntimeException("FAIL");
        });
    }

Libraries used:

    compile group: 'org.springframework.cloud', name: 'spring-cloud-stream', version: '2.2.0.RELEASE'
    compile group: 'org.springframework.cloud', name: 'spring-cloud-stream-binder-kafka', version: '2.2.0.RELEASE'
    compile group: 'org.springframework.cloud', name: 'spring-cloud-stream-schema', version: '2.2.0.RELEASE'

Did I miss something?

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 10 (6 by maintainers)

Top GitHub Comments

4 reactions · sobychacko commented, Aug 26, 2019

@antoin-m Apologies for the late response. I was able to triage your issue; it looks like you are missing some configuration. You are using native deserialization (useNativeDecoding: true), so when the failed messages are written to the DLQ, a corresponding serializer is needed: in this case, io.confluent.kafka.streams.serdes.avro.SpecificAvroSerializer. Here is an example of the configuration:

bindings:
  input:
    consumer:
      startOffset: earliest
      enableDlq: true
      dlqName: dlq
      dlqProducerProperties:
        configuration:
          key.serializer: org.apache.kafka.common.serialization.StringSerializer
          value.serializer: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerializer

If you are not using native decoding, then you don’t need to set configuration on dlqProducerProperties as it is handled by the framework using a converter.
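For clarity, here is how that snippet nests into the full configuration from the question. This is simply the question's YAML with the dlqProducerProperties block added; the assumption that the binder-level producerProperties (including schema.registry.url) also reach the DLQ producer is worth verifying against your binder version:

spring:
  cloud:
    stream:
      kafka:
        binder:
          brokers: localhost
          producerProperties:
            schema.registry.url: http://localhost:8081
        bindings:
          input:
            consumer:
              startOffset: earliest
              enableDlq: true
              dlqName: dlq
              dlqProducerProperties:
                configuration:
                  # serializers must mirror the deserializers used on the input;
                  # the Avro serializer is assumed to pick up schema.registry.url
                  # from the binder-level producerProperties above
                  key.serializer: org.apache.kafka.common.serialization.StringSerializer
                  value.serializer: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerializer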

I will use this issue to add some additional docs for this. Thank you!

1 reaction · garyrussell commented, Apr 15, 2020

Ah, sorry; I was referring to specific consumer/producer bindings, not the producer properties at the binder level.

Yes, I think this is a reasonable request, but it could be a breaking change so it will probably have to be done in the next release.
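To make the two levels Gary mentions concrete, here is a sketch using the property names from the question. The per-binding configuration map is the Kafka binder's mechanism for passing raw client properties to a single binding; treat the exact placement as something to confirm against the binder documentation for your version:

spring:
  cloud:
    stream:
      kafka:
        binder:
          # binder level: applies to every producer the binder creates
          producerProperties:
            value.serializer: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerializer
        bindings:
          output:
            producer:
              # binding level: applies only to the "output" binding
              configuration:
                value.serializer: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerializer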

