
Sleuth trace not working with KStream

See original GitHub issue

I have a stream processor written using spring-cloud-stream-binder-kafka-streams:

@Slf4j
@EnableBinding(KafkaStreamsProcessor.class)
@RequiredArgsConstructor
@Component
public class SomeWorker extends BaseStream<SomePojo> {

    @StreamListener(Sink.INPUT)
    @SendTo(Source.OUTPUT)
    public KStream<?, SomePojo> process(KStream<?, SomePojo> pojoObj) {
        // stream processing
        return pojoObj;
    }
}

pom.xml, with <spring-cloud.version>Greenwich.SR1</spring-cloud.version>:

        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-starter-sleuth</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-stream</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-stream-binder-kafka</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-stream-binder-kafka-streams</artifactId>
        </dependency>

properties

#kafka stream properties
spring.cloud.stream.kafka.binder.brokers=localhost
spring.cloud.stream.kafka.streams.binder.configuration.commit.interval.ms=1000
spring.cloud.stream.kafka.streams.binder.configuration.default.key.serde=org.springframework.kafka.support.serializer.JsonSerde
spring.cloud.stream.kafka.streams.binder.configuration.default.value.serde=org.springframework.kafka.support.serializer.JsonSerde
spring.cloud.stream.bindings.input.contentType=application/json
spring.cloud.stream.bindings.output.contentType=application/json
spring.cloud.stream.kafka.streams.bindings.input.consumer.application-id=dedup-metric-worker
spring.cloud.stream.bindings.input.destination=split-metrics3
spring.cloud.stream.bindings.output.destination=cleaned-metrics9

The Sleuth trace keys (X-B3-TraceId, X-B3-SpanId) are missing from the MDC when logging. Is KStream (the main feature of Kafka Streams) supported by Sleuth?

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 24 (11 by maintainers)

Top GitHub Comments

4 reactions
timtebeek commented, Jun 11, 2019

@t0il3ts0ap I'm not sure how to interpret the 😕 reaction on my comment above. You asked about log correlation with Kafka Streams; at present there are two implementations for Spring + Kafka:

  1. spring-cloud-stream-binder-kafka-streams, which you’re using in your examples
  2. spring-kafka, which uses @EnableKafkaStreams to bootstrap & support the StreamsBuilder DSL
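For reference, approach 2 bootstraps the topology roughly as sketched below. This is a minimal illustration, not code from the issue: the configuration class name is a placeholder, the topic names are taken from the properties above, and the broker setup (spring-kafka also requires a `KafkaStreamsConfiguration` bean named `defaultKafkaStreamsConfig`) is omitted.

```java
// Sketch of approach 2: bootstrapping the Kafka Streams DSL with spring-kafka.
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.KStream;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafkaStreams;

@Configuration
@EnableKafkaStreams
public class SomeWorkerStreamsConfig {

    // @EnableKafkaStreams provides the StreamsBuilder; the topology is
    // declared as a bean instead of a @StreamListener method.
    @Bean
    public KStream<String, SomePojo> process(StreamsBuilder builder) {
        KStream<String, SomePojo> stream = builder.stream("split-metrics3");
        // stream processing
        stream.to("cleaned-metrics9");
        return stream;
    }
}
```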

As you’ve found, approach 1 currently does not output your span correlation IDs on any logged messages. Gary Russell has suggested some hooks above to work around that, and that might be a valuable route to pursue.

Above you mentioned possibly migrating to spring-kafka (approach 2). At present that also lacks log correlation out of the box. However, Jorge Otoya has recently done some work to enable trace forwarding and log correlation through an explicit KafkaStreamsTracing wrapper in brave-instrumentation-kafka-streams.

For my part, I’m taking the wrapper created by Jorge and adding auto-configuration for it in Sleuth: https://github.com/spring-cloud/spring-cloud-sleuth/pull/1365. Add that configuration class to your classpath, and invoke the wrapper with something like:

kstream.transform(kafkaStreamsTracing.peek("my-named-span",
    (k, v) -> LOGGER.info("This message will have log correlation IDs; key: {}, value: {}", k, v)));

That will give your logged messages Sleuth trace/span correlation IDs in the log output!

There are more operations you can wrap, and an ongoing discussion on whether this should be part of Sleuth, but this should at least get you going. Hope it helps!
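If the Sleuth auto-configuration from the pull request above is not on your classpath, the wrapper can also be wired by hand. The sketch below assumes brave-instrumentation-kafka-streams is a dependency and that Sleuth has already auto-configured a brave `Tracing` bean; the configuration class name is a placeholder:

```java
// Manual wiring of the brave Kafka Streams tracing wrapper.
import brave.Tracing;
import brave.kafka.streams.KafkaStreamsTracing;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class KafkaStreamsTracingConfig {

    // KafkaStreamsTracing.create is the factory provided by
    // brave-instrumentation-kafka-streams; operations wrapped with it
    // (e.g. peek, transform) run inside a span, so MDC correlation
    // fields are populated for any logging done there.
    @Bean
    public KafkaStreamsTracing kafkaStreamsTracing(Tracing tracing) {
        return KafkaStreamsTracing.create(tracing);
    }
}
```

The bean is then injected into the stream processor and used exactly as in the `kstream.transform(kafkaStreamsTracing.peek(...))` example shown above.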

2 reactions
garyrussell commented, Jul 3, 2019

I opened an issue against the binder; link above.

Read more comments on GitHub >

Top Results From Across the Web

  • Sleuth with Spring Cloud Stream Binder Kafka Streams: The input KStream comes with sleuth headers, but they get lost in the kafka stream. flatMap splits one record into N records. I...
  • Spring Cloud Sleuth customization: Sleuth does not work with parallelStream() out of the box. If you want to have the tracing information propagated through the stream, you...
  • spring-cloud/spring-cloud-sleuth - Gitter: Hi, I’m using DefaultMessageListenerContainer to consume messages from a JMS queue. I’m interested in having span id and trace id in logs for messages...
  • Tracing a Reactive Kotlin App with Spring Cloud Sleuth: Distributed tracing with Spring Cloud Sleuth for reactive microservices. ... trace context in some cases, and may even cause MDC to not work...
  • Distributed tracing with Spring Cloud Sleuth and Zipkin: Because microservices are distributed by nature, trying to debug where a problem is occurring can be maddening. The distributed nature of the services...
