Setting max.request.size for Producer
Similarly to #2592, I’m setting the value for `max.request.size` in the `KafkaConnect` CRD:
```yaml
apiVersion: kafka.strimzi.io/v1beta1
kind: KafkaConnect
metadata:
  name: connect-cluster
spec:
  version: 2.4.0
  replicas: 1
  bootstrapServers: kafka-cluster-kafka-bootstrap:9093
  ...
  config:
    group.id: connect-cluster
    offset.storage.topic: connect-cluster-offsets
    config.storage.topic: connect-cluster-configs
    status.storage.topic: connect-cluster-status
    config.storage.replication.factor: 1
    offset.storage.replication.factor: 1
    status.storage.replication.factor: 1
    offset.flush.timeout.ms: 50000
    max.request.size: 8388608
```
The log shows the correct value at first:

```
# Provided configuration
..
max.request.size=8388608
..
```
Until I start my Postgres connector via the REST interface:

```
2020-03-18 20:40:20,082 INFO Instantiated connector postgres-connector with version 1.0.2.Final of type class io.debezium.connector.postgresql.PostgresConnector (org.apache.kafka.connect.runtime.Worker) [pool-6-thread-1]
2020-03-18 20:40:20,086 INFO ProducerConfig values:
	...
	max.request.size = 1048576
	...
```
Is there any way to prevent `max.request.size` from being reset to the default value?
I’m using:

- Strimzi 0.16.2
- a custom Kafka Connect image based on `strimzi/kafka:0.16.2-kafka-2.4.0`
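
Note that even once the producer accepts 8 MiB requests, the brokers enforce their own record size limit, which defaults to roughly 1 MiB (`message.max.bytes`); oversized records are rejected with a `RecordTooLargeException`. Below is a minimal sketch of the matching broker-side change in the Strimzi `Kafka` CR — the resource name `kafka-cluster` is an assumption inferred from the bootstrap address above:

```yaml
apiVersion: kafka.strimzi.io/v1beta1
kind: Kafka
metadata:
  name: kafka-cluster   # assumed name, matching kafka-cluster-kafka-bootstrap above
spec:
  kafka:
    config:
      # Raise the broker-wide record size limit (default ~1 MiB) to 8 MiB;
      # replica.fetch.max.bytes must be at least as large so followers can
      # still replicate the oversized records.
      message.max.bytes: 8388608
      replica.fetch.max.bytes: 8388608
```

If only certain topics need to carry large records, the per-topic `max.message.bytes` setting can be raised instead of the broker-wide default.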
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Thanks for the quick response @scholzj. It turns out I just needed to set `producer.max.request.size` to ensure the worker’s producer picked it up. Not sure why my consumer doesn’t require a `consumer.max.request.size` counterpart, but it works.

@scholzj I can’t explain the observations in #2592, unless the config was being overridden for that connector. The producer configs get created in `Worker#producerConfigs`, which certainly doesn’t do anything that could explain what you saw.
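
For reference, a minimal sketch of the fix applied to the `KafkaConnect` spec from the question: Kafka Connect forwards worker config keys carrying the `producer.` (or `consumer.`) prefix to the clients it creates for connectors, so the prefixed key is the one that ends up in the connector’s `ProducerConfig`.

```yaml
spec:
  config:
    # Unprefixed keys configure the Connect worker itself; settings meant
    # for the connectors' Kafka clients need the producer./consumer. prefix.
    producer.max.request.size: 8388608
```

As for the missing consumer counterpart: `max.request.size` is a producer-only setting, so there is nothing to override on the consumer side — consumers bound their fetch sizes with `fetch.max.bytes` and `max.partition.fetch.size` instead.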