Existing TraceId not propagated in a Spring Cloud Stream Kafka Producer
See original GitHub issue

Describe the bug

I'm using Spring Boot 2.3.3 and Spring Cloud Hoxton.SR6. I've found that an existing traceId is not propagated in the message; a new one is generated instead. See the sample below for more details.
Sample
I’ve uploaded a sample project in this repo: https://github.com/codependent/sleuth-kafka
Just start a local Kafka broker (2.5.1 in my case) and then run the SleuthKafkaApplication class.
The application consists of a controller (EventRestController) that receives an event and sends it (EventProducer) to a Kafka topic. Then a consumer (KafkaConfiguration.consumer()) reads that same message.
At every step I print a log line to verify the correlation information.
Just execute this curl to see it in action:
curl -X POST http://localhost:8080/api/v1/events -d '{"id":"someid2", "body":"some event"}' -H "content-type: application/json"
The following logs show up:
2020-09-03 14:04:34.250 DEBUG [sleuth-kafka,,,] 9111 --- [ctor-http-nio-4] c.c.s.filter.RequestLoggingFilter : Request URI http://localhost:8080/api/v1/events - headers [Host:"localhost:8080", User-Agent:"curl/7.64.1", Accept:"*/*", content-type:"application/json", Content-Length:"37"]
2020-09-03 14:04:34.253 INFO [sleuth-kafka,1f321656197f4119,1f321656197f4119,true] 9111 --- [ctor-http-nio-4] c.c.s.a.v.e.c.EventRestController : registerEvent() - event Event(id=someid2, body=some event)
2020-09-03 14:04:34.253 INFO [sleuth-kafka,1f321656197f4119,1f321656197f4119,true] 9111 --- [ctor-http-nio-4] c.c.sleuthkafka.producer.EventProducer : send() - event Event(id=someid2, body=some event)
2020-09-03 14:04:34.259 INFO [sleuth-kafka,7c9ddc695caba01b,1daff4416e92b0fe,false] 9111 --- [container-0-C-1] uration$$EnhancerBySpringCGLIB$$8bb402c7 : consumer() - event Event(id=someid2, body=some event)
As you can see, the controller generated this traceId and spanId: 1f321656197f4119,1f321656197f4119
That same info is present right before sending the message: 2020-09-03 14:04:34.253 INFO [sleuth-kafka,1f321656197f4119,1f321656197f4119,true] 9111 --- [ctor-http-nio-4] c.c.sleuthkafka.producer.EventProducer : send() - event Event(id=someid2, body=some event)
However, the consumer gets a different traceId: 2020-09-03 14:04:34.259 INFO [sleuth-kafka,7c9ddc695caba01b,1daff4416e92b0fe,false] 9111 --- [container-0-C-1] uration$$EnhancerBySpringCGLIB$$8bb402c7 : consumer() - event Event(id=someid2, body=some event)
Using kafkacat I can see that these last values (7c9ddc695caba01b,1daff4416e92b0fe) were the ones written into the topic:
kafkacat -b localhost:9092 -t events -C \
-f '\nKey (%K bytes): %k
Value (%S bytes): %s
Timestamp: %T
Partition: %p
Offset: %o
Headers: %h\n'
Key (7 bytes): someid2
Value (36 bytes): {"id":"someid2","body":"some event"}
Timestamp: 1599134674253
Partition: 0
Offset: 22
Headers: b3=7c9ddc695caba01b-fe3b14495ab765cc-0,nativeHeaders={"b3":["7c9ddc695caba01b-fe3b14495ab765cc-0"]},spring_json_header_types={"b3":"java.lang.String","nativeHeaders":"org.springframework.util.LinkedMultiValueMap"},__TypeId__=com.codependent.sleuthkafka.api.v1.event.dto.Event
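For reference, the b3 header seen above packs the whole trace context into one dash-separated string (traceId-spanId-sampled). A minimal sketch of that layout in plain Java (this is not Sleuth's actual extractor, just an illustration of the format):

```java
// Illustrative only: unpacks the single-header B3 format "traceId-spanId-sampled".
public class B3HeaderSketch {

    static String[] parse(String b3) {
        // e.g. "7c9ddc695caba01b-fe3b14495ab765cc-0"
        return b3.split("-");
    }

    public static void main(String[] args) {
        String[] parts = parse("7c9ddc695caba01b-fe3b14495ab765cc-0");
        System.out.println("traceId=" + parts[0]);
        System.out.println("spanId=" + parts[1]);
        System.out.println("sampled=" + parts[2]); // "0" means not sampled
    }
}
```

Note that the traceId written into the topic (7c9ddc695caba01b) matches the consumer's log line, not the controller's, which is exactly the bug: the producer-side interceptor started a fresh trace instead of continuing the existing one.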
For some reason the existing context isn't being retrieved in TracingChannelInterceptor.
Issue Analytics
- State: Closed
- Created: 3 years ago
- Comments: 8 (2 by maintainers)
@marcingrzejszczak here is a repo that reproduces the issue: https://github.com/jarias/spring-cloud-sleuth-issue-1731. If you downgrade to Spring Boot 2.3.5.RELEASE and Spring Cloud Hoxton.SR2, things should work.
Thanks @jarias for trying it out. I’m pointing to your commit here https://github.com/jarias/spring-cloud-sleuth-issue-1731/commit/3fa92282d7031abe4748c8e44eaac27f24d3a23f
Yeah, there are issues with Stream and Reactor. Once you do things manually you're in full control of the context passing, so things work, even though it requires some manual operations.
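To make the "do things manually" idea concrete: the workaround amounts to capturing the trace context on the producing side and attaching it to the outgoing message headers yourself, rather than relying on the interceptor to pick it up after the reactive thread hop. A framework-free sketch of that idea (a plain header map stands in for the Kafka/Spring message headers; real code would read the ids from Sleuth's current span):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch only: manually attach the current trace context as a "b3" header
// before handing the message off, so the consumer can continue the trace.
public class ManualPropagationSketch {

    static Map<String, String> withTraceHeader(Map<String, String> headers,
                                               String traceId, String spanId,
                                               boolean sampled) {
        Map<String, String> out = new HashMap<>(headers);
        // Single-header B3 format: traceId-spanId-sampled
        out.put("b3", traceId + "-" + spanId + "-" + (sampled ? "1" : "0"));
        return out;
    }

    public static void main(String[] args) {
        // Using the controller's ids from the logs above as example values.
        Map<String, String> headers = withTraceHeader(new HashMap<>(),
                "1f321656197f4119", "1f321656197f4119", true);
        System.out.println(headers.get("b3"));
    }
}
```

With the header set explicitly by the producer, the consumer-side instrumentation extracts the same traceId instead of generating a new one.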
I'm closing this issue. Thanks again for providing the information on how you fixed it.