
Spring Kafka overriding custom header pattern and removes id from Producer Record Header

See original GitHub issue

Requirement: The application must generate a Spring Message<T> object that includes id as a header and send the message to a Kafka topic. We use Spring Cloud Stream to integrate the application with Kafka.
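
For illustration, a minimal sketch of the producer side, assuming a StreamBridge-based binding (the binding name and payload type are assumptions):

import java.util.UUID;
import org.springframework.cloud.stream.function.StreamBridge;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.MessageBuilder;
import org.springframework.stereotype.Component;

@Component
public class MessageProducer {

  private final StreamBridge streamBridge;

  public MessageProducer(StreamBridge streamBridge) {
    this.streamBridge = streamBridge;
  }

  public UUID publish(String payload) {
    Message<String> message = MessageBuilder.withPayload(payload).build();
    // "producer-out-0" is an assumed binding name
    streamBridge.send("producer-out-0", message);
    // Spring generates the "id" header automatically as a random UUID;
    // it is a read-only header and cannot be set through MessageBuilder
    return message.getHeaders().getId();
  }
}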

Expected Output The consumer application that consumes messages from the Kafka topic must be able to read the id message header and validate that its value is the same as the Kafka producer record key.

Actual Output The consumer application that consumes messages from the Kafka topic always receives the id header value as “None”.
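
For context, a hypothetical plain Spring Kafka listener illustrating the intended check (the actual application consumes through Spring Cloud Stream; the topic, group, and types here are assumptions):

import java.nio.charset.StandardCharsets;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.header.Header;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class IdHeaderValidator {

  // topic and group names are assumptions
  @KafkaListener(topics = "demo-topic", groupId = "demo-group")
  public void listen(ConsumerRecord<String, String> record) {
    Header idHeader = record.headers().lastHeader("id");
    String id = idHeader == null
        ? "None"
        : new String(idHeader.value(), StandardCharsets.UTF_8);
    // requirement: the header value should equal the producer record key;
    // in practice the "id" record header is missing because the binder
    // strips it before publishing (see the analysis below)
    if (!id.equals(record.key())) {
      throw new IllegalStateException("id header does not match the record key");
    }
  }
}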

Customization To avoid unwanted headers such as spring_json_header_types, which the default mapper class (BinderHeaderMapper) generates alongside the custom headers, and to override the default header patterns so that id is allowed, a custom header mapper is implemented:

import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.HashMap;
import java.util.Map;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.deser.std.StdNodeBasedDeserializer;
import com.fasterxml.jackson.databind.module.SimpleModule;
import com.fasterxml.jackson.databind.node.TextNode;
import com.fasterxml.jackson.databind.type.TypeFactory;
import org.apache.kafka.common.header.Headers;
import org.apache.kafka.common.header.internals.RecordHeader;

import org.springframework.kafka.support.AbstractKafkaHeaderMapper;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.MessageHeaders;
import org.springframework.util.MimeType;

import lombok.extern.slf4j.Slf4j;

@Slf4j
public class CustomKafkaHeaderMapper extends AbstractKafkaHeaderMapper {

  private final ObjectMapper objectMapper;

  public CustomKafkaHeaderMapper(String... patterns) {
    super(patterns);
    this.objectMapper = new ObjectMapper();
    this.objectMapper.registerModule(
        (new SimpleModule())
            .addDeserializer(
                MimeType.class, new CustomKafkaHeaderMapper.MimeTypeJsonDeserializer()));
  }

  @Override
  public void fromHeaders(MessageHeaders messageHeaders, Headers target) {
    messageHeaders.forEach(
        (key, value) -> {
          if (!KafkaHeaders.DELIVERY_ATTEMPT.equals(key) && this.matches(key, value)) {
            Object valueToAdd = this.headerValueToAddOut(key, value);
            if (valueToAdd instanceof byte[]) {
              target.add(new RecordHeader(key, (byte[]) valueToAdd));
            } else if (valueToAdd instanceof String) {
              target.add(new RecordHeader(key, ((String) valueToAdd).getBytes(this.getCharset())));
            } else {
              try {
                // reuse the configured ObjectMapper rather than creating a new one per header
                target.add(new RecordHeader(key, this.objectMapper.writeValueAsBytes(valueToAdd)));
              } catch (JsonProcessingException e) {
                logger.error(e,
                    () -> "Could not map " + key + " with type " + value.getClass().getName());
              }
            }
          }
        });
  }

  @Override
  public void toHeaders(Headers source, Map<String, Object> target) {
    source.forEach(
        header -> {
          if (KafkaHeaders.DELIVERY_ATTEMPT.equals(header.key())) {
            target.put(header.key(), ByteBuffer.wrap(header.value()).getInt());
          } else {
            target.put(header.key(), new String(header.value(), this.getCharset()));
          }
        });
  }

  private class MimeTypeJsonDeserializer extends StdNodeBasedDeserializer<MimeType> {
    private static final long serialVersionUID = 1L;

    MimeTypeJsonDeserializer() {
      super(MimeType.class);
    }

    @Override
    public MimeType convert(JsonNode root, DeserializationContext ctxt) throws IOException {
      if (root instanceof TextNode) {
        return MimeType.valueOf(root.asText());
      } else {
        JsonNode type = root.get("type");
        JsonNode subType = root.get("subtype");
        JsonNode parameters = root.get("parameters");
        Map<String, String> params =
            (Map)
                CustomKafkaHeaderMapper.this.objectMapper.readValue(
                    parameters.traverse(),
                    TypeFactory.defaultInstance()
                        .constructMapType(HashMap.class, String.class, String.class));
        return new MimeType(type.asText(), subType.asText(), params);
      }
    }
  }
}

This customKafkaHeaderMapper bean name is configured in the binder properties as shown below:

spring.cloud.stream.kafka.binder.headerMapperBeanName=customKafkaHeaderMapper
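
For reference, a minimal sketch of how such a bean could be registered; the bean name must match the property above, and the header pattern passed to the constructor is an assumption:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.support.KafkaHeaderMapper;

@Configuration
public class HeaderMapperConfig {

  @Bean("customKafkaHeaderMapper")
  public KafkaHeaderMapper customKafkaHeaderMapper() {
    // "*" (an assumed pattern) matches all headers, including "id"
    return new CustomKafkaHeaderMapper("*");
  }
}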

Analysis I thought the custom header mapper would serve the purpose and set the id header in the Kafka producer record. But in my initial analysis, I observed the following:

  • The custom header mapper bean is resolved and configured for use in the code below, inside the createProducerMessageHandler method of the KafkaMessageChannelBinder class:
KafkaHeaderMapper mapper = null;
if (this.configurationProperties.getHeaderMapperBeanName() != null) {
  mapper = applicationContext.getBean(
      this.configurationProperties.getHeaderMapperBeanName(), KafkaHeaderMapper.class);
}

However, at the end of this method, the handler's header mapper is set to a new anonymous KafkaHeaderMapper that delegates to the custom header mapper for mapping the Spring message headers:

KafkaHeaderMapper mapper = null;
if (this.configurationProperties.getHeaderMapperBeanName() != null) {
  mapper = applicationContext.getBean(
      this.configurationProperties.getHeaderMapperBeanName(), KafkaHeaderMapper.class);
}

if (mapper == null) {
  try {
    mapper = applicationContext.getBean("kafkaBinderHeaderMapper", KafkaHeaderMapper.class);
  } catch (BeansException ex) {
    // no binder-wide header mapper bean available; fall through
  }
}

final KafkaHeaderMapper customMapper = mapper;
KafkaHeaderMapper headerMapper;
if (producerProperties.getHeaderMode() != null
    && !HeaderMode.headers.equals(producerProperties.getHeaderMode())) {
  headerMapper = null;
} else if (customMapper == null) {
  String[] headerPatterns = ((KafkaProducerProperties) producerProperties.getExtension()).getHeaderPatterns();
  if (headerPatterns != null && headerPatterns.length > 0) {
    headerMapper = new BinderHeaderMapper(
        BinderHeaderMapper.addNeverHeaderPatterns(Arrays.asList(headerPatterns)));
  } else {
    headerMapper = new BinderHeaderMapper();
  }
} else {
  // the custom mapper is wrapped: fromHeaders delegates to it and then strips
  // the "never" headers (including "id") from the producer record
  headerMapper = new KafkaHeaderMapper() {

    public void toHeaders(Headers source, Map<String, Object> target) {
      customMapper.toHeaders(source, target);
    }

    public void fromHeaders(MessageHeaders headers, Headers target) {
      customMapper.fromHeaders(headers, target);
      BinderHeaderMapper.removeNeverHeaders(target);
    }
  };
}

handler.setHeaderMapper(headerMapper);

As can be seen in the fromHeaders method of the code above, the delegate (customMapper) is the custom header mapper instance that maps the Spring message headers onto the Apache Kafka headers, and up to that point the id header entry is still present in the Kafka headers. Immediately afterwards, BinderHeaderMapper's removeNeverHeaders is invoked on those headers and explicitly removes the id header from the record:

public static void removeNeverHeaders(Headers headers) {
    headers.remove("id");
    headers.remove("timestamp");
    headers.remove("deliveryAttempt");
    headers.remove("scst_nativeHeadersPresent");
}

As per my analysis, even if my application sends a simple Spring Message with an id header, the code above removes the id header from the Apache Kafka headers before the message is written to the Kafka topic, and the id value is then read as ‘None’ by the consumer.

I would like to know if there is a way to avoid the removal of the id header from the Kafka producer record headers, as our project requires it.

Please let me know if you need further information.

Issue Analytics

  • State: closed
  • Created 2 years ago
  • Comments: 9 (7 by maintainers)

Top GitHub Comments

1 reaction
arronharden-els commented, Jul 1, 2021

No worries, thanks for taking the time to look at this @sobychacko. We have a solution now, so good to close this issue. (A bit more detail of the solution is in https://github.com/spring-projects/spring-kafka/issues/1803).

1 reaction
sobychacko commented, May 20, 2021

@skkadium based on what @garyrussell commented above, why can’t you use some custom id field that is domain-specific for your application? That way, you can avoid clashing with the framework-level reserved id.
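
For illustration, a sketch of that suggestion: carry the identifier in a domain-specific header that is not on the binder's "never" list and therefore survives into the producer record (the header name below is hypothetical):

import java.util.UUID;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.MessageBuilder;

public final class DomainIdHeaders {

  public static Message<String> withDomainId(String payload) {
    return MessageBuilder.withPayload(payload)
        // "myapp_message_id" is a hypothetical, application-owned header;
        // unlike "id", it is not removed by BinderHeaderMapper.removeNeverHeaders
        .setHeader("myapp_message_id", UUID.randomUUID().toString())
        .build();
  }
}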

Read more comments on GitHub >

Top Results From Across the Web

Spring Kafka overriding custom header pattern and removes ...
The consumer application that consumes message from the Kafka Topic is always receiving the header id value as "None".
Read more >
Spring for Apache Kafka
You can now use a custom correlation header which will be echoed in any ... used as the Kafka consumer group.id property, overriding...
Read more >
Adding Custom Header to Kafka Message Example
In this tutorial we demonstrate how to add/read custom headers to/from a Kafka Message using Spring Kafka. We start by adding headers using ......
Read more >
Adding custom header using Spring Kafka - Stack Overflow
Here is my approach to add a custom header: var record = new ProducerRecord<String, String>(topicName, "Hello World"); ...
Read more >
Spring Boot and Kafka – Practical Example
Kafka Producer configuration in Spring Boot · The KafkaTemplate instance. This is the object we employ to send messages to Kafka. · The...
Read more >
