
MaxPollRecords Not Available on the ConsumerConfig


Description

We have a long-running process tied to our records that can sometimes result in an "Application maximum poll interval exceeded" message being thrown. There is currently a MaxPollIntervalMs property on the ConsumerConfig that we can set to help with this, but we only encounter it when fetching a large number of records at once. Based on some articles and documentation I have read, there is a configuration option called max.poll.records that can be set to limit the number of records per poll. The default is 500, and I would like to try a smaller value to see whether it helps with the error. The problem is that there is no property exposed on the ConsumerConfig for the max.poll.records setting. Is this by design, or am I missing something?
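For context, librdkafka-based clients take these settings as plain key/value configuration entries. The sketch below uses Python dict syntax purely for illustration (broker address and group id are hypothetical); the keys shown are standard librdkafka configuration names, and note that max.poll.records is not among them:

```python
# Configuration sketch for a librdkafka-based consumer.
# Keys are standard librdkafka settings; values here are illustrative.
consumer_config = {
    "bootstrap.servers": "localhost:9092",  # hypothetical broker address
    "group.id": "example-group",            # hypothetical consumer group
    # Raise this if processing between polls can take a long time.
    # The librdkafka default is 300000 (5 minutes).
    "max.poll.interval.ms": 600000,
    "enable.auto.commit": False,
}

# "max.poll.records" is a Java-client-only setting; librdkafka-based
# clients do not recognise it, which is why ConsumerConfig lacks it.
assert "max.poll.records" not in consumer_config
```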

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 6 (3 by maintainers)

Top GitHub Comments

3 reactions
mhowlett commented, Aug 25, 2021

It’s a design choice that mimics the librdkafka API. You could easily create a batch consume method, if you want one, that simply consumes up to N messages and then returns them in a collection. It’s not a design flaw or a performance limitation: librdkafka has higher consume throughput than the Java client.
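The batch consume method suggested above can be sketched as a small wrapper around any poll-style consumer. This is a minimal illustration, not the Confluent.Kafka API (which is C#): the `FakeConsumer` below is a stand-in so the example is self-contained, and with a real client you would call its poll/consume method with a timeout instead.

```python
from typing import Any, Callable, List, Optional

def consume_batch(poll: Callable[[], Optional[Any]], max_records: int) -> List[Any]:
    """Drain up to max_records messages from a poll-style source.

    Stops early when poll() returns None (nothing currently available),
    mirroring the batch-consume wrapper described in the comment above.
    """
    batch: List[Any] = []
    while len(batch) < max_records:
        msg = poll()
        if msg is None:  # no message waiting; return what we have so far
            break
        batch.append(msg)
    return batch

class FakeConsumer:
    """Stand-in for a real consumer: yields queued messages, then None."""
    def __init__(self, messages):
        self._messages = list(messages)

    def poll(self):
        return self._messages.pop(0) if self._messages else None

consumer = FakeConsumer(["m1", "m2", "m3", "m4", "m5"])
print(consume_batch(consumer.poll, max_records=3))  # ['m1', 'm2', 'm3']
```

Capping the batch at N messages gives the same back-pressure effect the questioner wanted from max.poll.records, while leaving the underlying one-message-at-a-time delivery untouched.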

3 reactions
mhowlett commented, Nov 5, 2020

max.poll.records isn’t relevant to the C# consumer (or any other librdkafka-based client) because messages are delivered one at a time to the application. In the Java client, they are delivered in batches.


Top Results From Across the Web

Apache Kafka "max.poll.records" configuration not working
I see that it reads all the unread messages from Kafka at once. So? Since you have enable.auto.commit = false, you're...

Kafka Consumer configuration reference
This topic provides the configuration parameters that are available for ... Note that max.poll.records does not impact the underlying fetching behavior.

Chapter 4. Kafka consumer configuration tuning
The consumer fetches from a given offset and consumes the messages in order, unless the offset is changed to skip or re-read messages...

KafkaConsumer (kafka 1.0.1 API)
A client that consumes records from a Kafka cluster. This client transparently handles the failure of Kafka brokers, and transparently adapts as topic ...

Optimizing Kafka consumers
We recently gave a few pointers on how you can fine-tune Kafka producers to improve message publication to Kafka.
