
TraceId and spanId do not propagate over Kafka messages because SleuthKafkaAspect.wrapProducerFactory() is not triggered

See original GitHub issue

Tracing information does not propagate over Kafka messages because the method SleuthKafkaAspect.wrapProducerFactory() is not triggered. On the producer side, the message is sent correctly and the tracing information is logged correctly. On the consumer side, however, a new traceId and spanId are created.

The following two log lines show different values for traceId and spanId (and parentId):

2021-03-23 11:42:30.158 [http-nio-9185-exec-2] INFO  my.company.Producer - /4afe07273872918b/4afe07273872918b// - Sending event='MyEvent'
2021-03-23 11:42:54.374 [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] INFO my.company.Consumer /1fec3bf6a3c91773/ff4bd26b2e509ed8/1fec3bf6a3c91773/ - Received new event='MyEvent'
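
To make the comparison explicit: the custom log pattern above appears to print the MDC fields as /traceId/spanId/parentId/ (an assumption based on these lines; the class and field order are not part of Sleuth itself). A tiny pure-Java parser shows the two traces do not match:

```java
// Hypothetical helper: extracts the traceId (first slash-delimited segment)
// from the custom MDC block printed in the logs above.
public class TraceLogCheck {

    static String traceId(String mdcBlock) {
        // "/4afe07273872918b/4afe07273872918b//" -> "4afe07273872918b"
        String[] parts = mdcBlock.split("/");
        return parts.length > 1 ? parts[1] : "";
    }

    public static void main(String[] args) {
        String producerMdc = "/4afe07273872918b/4afe07273872918b//";
        String consumerMdc = "/1fec3bf6a3c91773/ff4bd26b2e509ed8/1fec3bf6a3c91773/";
        // Propagation is broken: the consumer started a brand-new trace.
        System.out.println(traceId(producerMdc).equals(traceId(consumerMdc))); // false
    }
}
```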

First, using Kafdrop and also by debugging, I verified that the message headers don't contain any tracing information.
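
A quick way to perform that check programmatically is to look for the B3 propagation keys among the record headers. A minimal sketch, assuming the default Brave/Sleuth header names ("b3" single-header format, or the multi-header X-B3-* variant) and using a plain Map as a stand-in for Kafka's Headers type:

```java
import java.nio.charset.StandardCharsets;
import java.util.Map;

// Illustrative check for Sleuth/Brave tracing headers on a record.
// Header names are the B3 defaults; adjust if custom propagation is configured.
public class TraceHeaderCheck {

    static boolean hasTracingHeaders(Map<String, byte[]> headers) {
        return headers.containsKey("b3")
                || headers.containsKey("X-B3-TraceId");
    }

    // Dumps all headers as key=value pairs, e.g. for logging from a debugger.
    static String describe(Map<String, byte[]> headers) {
        StringBuilder sb = new StringBuilder();
        headers.forEach((k, v) -> sb.append(k).append('=')
                .append(new String(v, StandardCharsets.UTF_8)).append(' '));
        return sb.toString().trim();
    }
}
```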

After that, I figured out that the method SleuthKafkaAspect.wrapProducerFactory() is never triggered, while on the consumer side the method SleuthKafkaAspect.anyConsumerFactory() is.

The library versions used are the following:

  • spring boot: 2.3.7.RELEASE
  • spring cloud bom: Hoxton.SR10
  • spring cloud: 2.2.7.RELEASE (and 2.2.5.RELEASE)
  • spring kafka: 2.5.10.RELEASE
  • kafka client: 2.4.1
  • spring-cloud-starter-sleuth: 2.2.7.RELEASE
  • spring-cloud-sleuth-zipkin: 2.2.7.RELEASE

The Kafka client version is pinned to 2.4.1 because of a downgrade forced by a production bug in version 2.5.1 of the Kafka client that increased CPU usage. I also tried the following library version combinations, with no success:

  • spring boot: 2.3.7.RELEASE
  • spring cloud bom: Hoxton.SR10 (and Hoxton.SR8)
  • spring cloud: 2.2.7.RELEASE (and 2.2.5.RELEASE)
  • spring kafka: 2.5.10.RELEASE
  • kafka client: 2.5.1
  • spring-cloud-starter-sleuth: 2.2.7.RELEASE (and 2.2.5.RELEASE)
  • spring-cloud-sleuth-zipkin: 2.2.7.RELEASE (and 2.2.5.RELEASE)

  • spring boot: 2.3.7.RELEASE
  • spring cloud bom: Hoxton.SR10 (and Hoxton.SR8)
  • spring cloud: 2.2.7.RELEASE (and 2.2.5.RELEASE)
  • spring kafka: 2.5.10.RELEASE
  • kafka client: 2.6.0
  • spring-cloud-starter-sleuth: 2.2.7.RELEASE (and 2.2.5.RELEASE)
  • spring-cloud-sleuth-zipkin: 2.2.7.RELEASE (and 2.2.5.RELEASE)

  • spring boot: 2.3.7.RELEASE
  • spring cloud bom: Hoxton.SR10 (and Hoxton.SR8)
  • spring cloud: 2.2.7.RELEASE (and 2.2.5.RELEASE)
  • spring kafka: 2.6.x
  • kafka client: 2.6.0
  • spring-cloud-starter-sleuth: 2.2.7.RELEASE (and 2.2.5.RELEASE)
  • spring-cloud-sleuth-zipkin: 2.2.7.RELEASE (and 2.2.5.RELEASE)

We migrated our project to a different Spring Boot version, from 2.3.0.RELEASE to 2.3.7.RELEASE. Before that, everything was working correctly. Below are the old library versions:

  • spring-boot: 2.3.0.RELEASE
  • spring-kafka: 2.5.0.RELEASE
  • kafka-clients: 2.4.1
  • spring-cloud: 2.2.5.RELEASE
  • spring-cloud-starter-sleuth: 2.2.5.RELEASE
  • spring-cloud-sleuth-zipkin: 2.2.5.RELEASE

We also introduced log4j2 (before, we used slf4j with Logback).

Below the related libraries:

- org.springframework.boot:spring-boot-starter-log4j2:jar:2.3.7.RELEASE:compile
- org.slf4j:jul-to-slf4j:jar:1.7.30:compile
- io.projectreactor:reactor-test:jar:3.3.12.RELEASE:test
- io.projectreactor:reactor-core:jar:3.3.12.RELEASE:test
- org.reactivestreams:reactive-streams:jar:1.0.3:test

The properties configured are the following:

spring.sleuth.messaging.enabled=true
spring.kafka.consumer.auto-offset-reset=latest
spring.kafka.consumer.enable-auto-commit=false
spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.client-id=myClientIdentifier
spring.kafka.consumer.group-id=MyConsumerGroup
spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer

The configuration class for the ProducerFactory creation is the following:


@Configuration
@EnableTransactionManagement
public class KafkaProducerConfig {

	KafkaProperties kafkaProperties;

	@Autowired
	public KafkaProducerConfig(
			KafkaProperties kafkaProperties) {
		this.kafkaProperties = kafkaProperties;
	}

	@Bean
	public KafkaTemplate<String, Object> kafkaTemplate() {
		KafkaTemplate<String, Object> kafkaTemplate = new KafkaTemplate<>(producerFactory());
		return kafkaTemplate;
	}


	private ProducerFactory<String, Object> producerFactory() {
		DefaultKafkaProducerFactory<String, Object> defaultKafkaProducerFactory =
				new DefaultKafkaProducerFactory<>(producerConfigs());
		//defaultKafkaProducerFactory.transactionCapable();
		//defaultKafkaProducerFactory.setTransactionIdPrefix("tx-");
		return defaultKafkaProducerFactory;
	}

	private Map<String, Object> producerConfigs() {

		Map<String, Object> configs = kafkaProperties.buildProducerProperties();
		configs.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
		configs.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
		return configs;
	}

}

My spring boot application class:


@Profile("DEV")
@SpringBootApplication(
        scanBasePackages = {"my.company"},
        exclude = {
                DataSourceAutoConfiguration.class,
                DataSourceTransactionManagerAutoConfiguration.class,
                HibernateJpaAutoConfiguration.class
        }
)
@EnableSwagger2
@EnableFeignClients(basePackages = {"my.company.common", "my.company.integration"})
@EnableTransactionManagement
@EnableMongoRepositories(basePackages = {
        "my.company.repository"})
@EnableMBeanExport(registration = RegistrationPolicy.IGNORE_EXISTING)
@ServletComponentScan
public class DevAppStartup extends SpringBootServletInitializer {

    public static void main(String[] args) {
        SpringApplication.run(DevAppStartup.class, args);
    }

}

Here you can find the output of the command “mvn dependency:tree”: mvn_dependency_tree.txt

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 13 (6 by maintainers)

Top GitHub Comments

2 reactions
venraf commented, Apr 1, 2021

Thank you so much for your help! I know, you are absolutely right. When a project is large it becomes impossible to understand the real cause. We have already fixed the problem in our projects. There is no need to share the repository. Many thanks again.

1 reaction
jonatan-ivanov commented, Mar 31, 2021

@venraf Thanks for all of your efforts minimizing this. From our perspective, having a project that we can use to reproduce the issue with minimal moving parts is vital; it makes troubleshooting possible.

I was able to repro and fix your issue, here’s what I did:

  1. Since you removed your docker-compose file and the original contains a lot of things (including mongoDB), I used a different compose file (I think this is unrelated)
  2. Fixed a few things (put application name back, cleaned-up the controller a little) (I think this is unrelated)
  3. Created a ProducerFactory bean, as the documentation suggests; I think you need to do this if you use your own KafkaTemplate
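
The bean presumably matters because Sleuth instruments ProducerFactory beans via AOP: a factory created inside a private method is called directly and never passes through any Spring-managed proxy, so the aspect never sees it. The following pure-JDK sketch is an analogy for that dynamic, not Sleuth's actual mechanism; all names in it are hypothetical:

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;
import java.util.ArrayList;
import java.util.List;

// Analogy: an aspect can only add behavior to calls that go through the
// proxy it installed around a bean. Calls on the raw object bypass it.
public class ProxyBypassDemo {

    interface Factory {
        String create();
    }

    static final List<String> INSTRUMENTED_CALLS = new ArrayList<>();

    // Stand-in for the aspect: wraps a Factory in a proxy that records
    // every create() call (where tracing would be injected).
    static Factory wrap(Factory target) {
        InvocationHandler handler = (proxy, method, args) -> {
            INSTRUMENTED_CALLS.add(method.getName());
            return method.invoke(target, args);
        };
        return (Factory) Proxy.newProxyInstance(
                Factory.class.getClassLoader(), new Class<?>[] {Factory.class}, handler);
    }

    public static void main(String[] args) {
        Factory raw = () -> "producer";

        // Bean-style usage: the call goes through the proxy and is instrumented.
        Factory bean = wrap(raw);
        bean.create();

        // Private-method-style usage: the raw object is used directly,
        // so the "aspect" never sees the call.
        raw.create();

        System.out.println("instrumented calls = " + INSTRUMENTED_CALLS.size()); // 1, not 2
    }
}
```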

Here’s the whole class:

package my.company.mykafkaissue;

import org.springframework.boot.autoconfigure.kafka.KafkaProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
public class KafkaProducerConfig {

    @Bean
    public ProducerFactory<String, Object> producerFactory(KafkaProperties kafkaProperties) {
        return new DefaultKafkaProducerFactory<>(kafkaProperties.buildProducerProperties());
    }

    @Bean
    public KafkaTemplate<String, Object> kafkaTemplate(ProducerFactory<String, Object> producerFactory) {
        return new KafkaTemplate<>(producerFactory);
    }
}

Before:

2021-03-31 16:23:36.253  INFO [kafka-demo,82e24974b99b8aea,82e24974b99b8aea,true] 37474 --- [nio-8080-exec-3] my.company.mykafkaissue.MyController     : Sending '123' to: 'my.topic' 
2021-03-31 16:23:36.258  INFO [kafka-demo,11ec4a09f7130aab,0f33fbf28fdcf7e1,true] 37474 --- [ntainer#0-0-C-1] my.company.mykafkaissue.MyController     : Received '123'

After:

2021-03-31 16:24:25.293  INFO [kafka-demo,2b306123ede1dba2,2b306123ede1dba2,true] 37515 --- [nio-8080-exec-2] my.company.mykafkaissue.MyController     : Sending '123' to: 'my.topic' 
2021-03-31 16:24:25.298  INFO [kafka-demo,2b306123ede1dba2,a1d38ecab9313197,true] 37515 --- [ntainer#0-0-C-1] my.company.mykafkaissue.MyController     : Received '123'

I can push a fix branch to your repo if you are interested. Please check and let me know if it works.
