Serializer and deserializer configuration bug
See original GitHub issue

My English is not good; if the description is unclear, please help me translate it.
I built a Spring Boot project that depends on Spring Cloud Config and Spring Cloud Bus, and uses Kafka as the message queue.
The Kafka part of the configuration looks like this:
```yaml
spring:
  kafka:
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.apache.kafka.common.serialization.StringSerializer
    consumer:
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
```
But when I refresh the configuration, Spring Cloud Bus reports this error:
```
org.springframework.messaging.MessageHandlingException: error occurred in message handler [org.springframework.cloud.stream.binder.kafka.KafkaMessageChannelBinder$ProducerConfigurationMessageHandler@1efa1562]; nested exception is org.apache.kafka.common.errors.SerializationException: Can't convert value of class [B to class org.apache.kafka.common.serialization.StringSerializer specified in value.serializer
```
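The root cause of that `SerializationException` can be sketched without Kafka at all: generics are erased at runtime, so a serializer declared for `String` can be handed a `byte[]` and only fails when it tries the cast. A minimal sketch, where the `Serializer` interface and `trySend` helper are simplified stand-ins for Kafka's real classes, not the actual API:

```java
public class SerializationFailureSketch {

    // Stand-in for org.apache.kafka.common.serialization.Serializer
    // (assumption: simplified, not the real interface).
    interface Serializer<T> {
        byte[] serialize(String topic, T data);
    }

    // A serializer that, like StringSerializer, only knows about String.
    static class StringOnlySerializer implements Serializer<String> {
        @Override
        public byte[] serialize(String topic, String data) {
            return data.getBytes();
        }
    }

    // Mimics what the producer does: it trusts the configured serializer
    // and hands it whatever payload type the caller produced.
    static String trySend(Object payload) {
        @SuppressWarnings({"rawtypes", "unchecked"})
        Serializer<Object> serializer = (Serializer) new StringOnlySerializer();
        try {
            serializer.serialize("springCloudBus", payload);
            return "ok";
        } catch (ClassCastException e) {
            // Kafka reports this situation as the SerializationException above
            return "can't convert " + payload.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        // The binder always emits byte[], so a String-only serializer fails:
        System.out.println(trySend(new byte[] {1, 2, 3}));
    }
}
```

Running `main` prints `can't convert byte[]`, which mirrors the "Can't convert value of class [B" message in the stack trace.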
I think the bug may be in org.springframework.cloud.stream.binder.kafka.KafkaMessageChannelBinder#getProducerFactory:
```java
protected DefaultKafkaProducerFactory<byte[], byte[]> getProducerFactory(
		String transactionIdPrefix,
		ExtendedProducerProperties<KafkaProducerProperties> producerProperties) {
	Map<String, Object> props = new HashMap<>();
	props.put(ProducerConfig.RETRIES_CONFIG, 0);
	props.put(ProducerConfig.BUFFER_MEMORY_CONFIG, 33554432);
	// The serializers are set here
	props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class);
	props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
			ByteArraySerializer.class);
	props.put(ProducerConfig.ACKS_CONFIG,
			String.valueOf(this.configurationProperties.getRequiredAcks()));
	// This fetches the spring.kafka.* configuration; if it contains any
	// serializer settings, they are returned here and override the
	// serializers set above
	Map<String, Object> mergedConfig = this.configurationProperties
			.mergedProducerConfiguration();
	if (!ObjectUtils.isEmpty(mergedConfig)) {
		props.putAll(mergedConfig);
	}
	if (ObjectUtils.isEmpty(props.get(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG))) {
		props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG,
				this.configurationProperties.getKafkaConnectionString());
	}
	if (ObjectUtils.isEmpty(props.get(ProducerConfig.BATCH_SIZE_CONFIG))) {
		props.put(ProducerConfig.BATCH_SIZE_CONFIG,
				String.valueOf(producerProperties.getExtension().getBufferSize()));
	}
	if (ObjectUtils.isEmpty(props.get(ProducerConfig.LINGER_MS_CONFIG))) {
		props.put(ProducerConfig.LINGER_MS_CONFIG,
				String.valueOf(producerProperties.getExtension().getBatchTimeout()));
	}
	if (ObjectUtils.isEmpty(props.get(ProducerConfig.COMPRESSION_TYPE_CONFIG))) {
		props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG,
				producerProperties.getExtension().getCompressionType().toString());
	}
	if (!ObjectUtils.isEmpty(producerProperties.getExtension().getConfiguration())) {
		props.putAll(producerProperties.getExtension().getConfiguration());
	}
	// A ProducerFactory is built here with whatever serializer ended up in
	// props (e.g. StringSerializer, if I set it in application.yml)
	DefaultKafkaProducerFactory<byte[], byte[]> producerFactory = new DefaultKafkaProducerFactory<>(
			props);
	if (transactionIdPrefix != null) {
		producerFactory.setTransactionIdPrefix(transactionIdPrefix);
	}
	return producerFactory;
}
```
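The override the comments describe is just `Map.putAll` semantics, and can be sketched with the JDK alone. In this sketch the property name is the real Kafka one, but `buildProps` is a simplified stand-in for the merge order inside `getProducerFactory`, not the actual binder code:

```java
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Map;

public class ProducerConfigMergeSketch {

    // Same key as ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG
    static final String VALUE_SERIALIZER = "value.serializer";

    // Mirrors the merge order in getProducerFactory: binder default first,
    // then the merged spring.kafka.* configuration on top.
    static Map<String, Object> buildProps(Map<String, Object> mergedConfig) {
        Map<String, Object> props = new LinkedHashMap<>();
        props.put(VALUE_SERIALIZER,
                "org.apache.kafka.common.serialization.ByteArraySerializer");
        if (!mergedConfig.isEmpty()) {
            props.putAll(mergedConfig); // user-supplied values silently win
        }
        return props;
    }

    public static void main(String[] args) {
        // What spring.kafka.producer.value-serializer contributes:
        Map<String, Object> springKafka = new HashMap<>();
        springKafka.put(VALUE_SERIALIZER,
                "org.apache.kafka.common.serialization.StringSerializer");

        // The factory type is still DefaultKafkaProducerFactory<byte[], byte[]>,
        // so the StringSerializer that wins here will later receive byte[]
        // payloads and throw the SerializationException from the report.
        System.out.println(buildProps(springKafka).get(VALUE_SERIALIZER));
    }
}
```

So the factory's generic type promises `byte[]` values while the effective `value.serializer` is whatever the user configured, which is exactly the mismatch in the stack trace.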
I found that many places build the ProducerFactory / ConsumerFactory this way. I'm not sure whether this is a bug, but I don't understand why it is done like this. Is it simply impossible to set serialization parameters when using Spring Cloud Stream Binder Kafka?
Issue Analytics
- Created 4 years ago
- Comments: 9 (4 by maintainers)
Exactly what are you trying to do by specifying `StringSerializer` and `StringDeserializer`? It seems like you are sending rich objects, so you are relying on the framework to do the conversion for you. In that case, you should not specify these at all and use the default `ByteArraySerializer` and `ByteArrayDeserializer`.

Yes, the config now uses the defaults, and here is my code:
config
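The poster's actual follow-up config is truncated here. Following the maintainer's suggestion, a trimmed configuration might look like the sketch below; the broker address is a hypothetical placeholder, and it assumes no other `spring.kafka` producer/consumer settings are needed:

```yaml
spring:
  kafka:
    # No key/value (de)serializer entries: the Kafka binder keeps its
    # ByteArraySerializer / ByteArrayDeserializer defaults and Spring Cloud
    # Stream handles payload conversion.
    bootstrap-servers: localhost:9092   # hypothetical broker address
```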