Consumer streams stop consuming
Hi,
I’m using the consumer stream in an ETL service. There are currently around 60 active consumers and everything works great, but every couple of days one of the streams stops consuming data and I have to restart it manually, yet no error is emitted when this happens. I use manual commits and don’t wait for the commit callbacks; I’m not sure whether that might be related to the issue.
I currently use stream.on('error', cb) for error handling. Is there a better way to get more data for debugging?
Thanks.
Here is the consumer configuration:
const Kafka = require('node-rdkafka'); // import assumed; config, groupId, topicsRegex and logger come from the surrounding service

const hosts = config.kafka.hosts.join(',');

const globalConfig = {
  'metadata.broker.list': hosts,
  'group.id': groupId,
  'enable.auto.commit': false,
  'offset.store.method': 'broker',
  'partition.assignment.strategy': 'roundrobin',
  'topic.metadata.refresh.interval.ms': 30000,
  'queued.max.messages.kbytes': 10000
};

const topicConfig = {
  'auto.offset.reset': 'latest'
};

// createReadStream is a static factory on KafkaConsumer, not a constructor, so it is called without new
const stream = Kafka.KafkaConsumer.createReadStream(globalConfig, topicConfig, {
  topics: [topicsRegex],
  waitInterval: 0,
  fetchSize: 1000
});

stream.on('error', (err) => {
  if (stream.consumer && stream.consumer._isDisconnecting) {
    logger.error('Error in kafka: Disconnecting in progress');
    return;
  }
  logger.error('Error in kafka', {
    error_message: err.stack,
    error_code: err.code
  });
});
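To get more data for debugging than the plain 'error' event provides, one option is to turn on librdkafka's internal logs and statistics and listen for the extra events on the underlying consumer. The sketch below assumes this is node-rdkafka (which the stream.consumer internals above suggest) and reuses the globalConfig, stream and logger names from the snippet:

// Sketch only. For this to emit anything, add these librdkafka properties to
// globalConfig before creating the stream:
//
//   'debug': 'consumer,cgrp,fetch',     // which librdkafka subsystems to log
//   'statistics.interval.ms': 60000     // emit 'event.stats' every 60 seconds
//
// stream.consumer is the underlying KafkaConsumer; it forwards these events,
// which can be logged next to the existing 'error' handler above.
stream.consumer
  .on('event.log', (log) => logger.debug('librdkafka log', log))
  .on('event.stats', (stats) => logger.debug('librdkafka stats', stats))
  .on('event.error', (err) => logger.error('librdkafka error', { error_message: err.message }))
  .on('disconnected', () => logger.warn('kafka consumer disconnected'));

The 'disconnected' event in particular makes a silent consumer stop visible, since the read stream itself will not necessarily emit 'error' when the underlying client drops out of the group.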
@webmakersteve @csaroff We now use far fewer consumers per CPU core and also made sure we gracefully close the consumers on restart, and things have worked just fine ever since.
Ok thanks, I’ll give it a try.
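For reference, a graceful close along the lines of the comment above could look roughly like the following. This is only a sketch, reusing the stream and logger names from the snippet in the issue; the SIGTERM handler and the 10-second fallback timeout are illustrative choices, not something from the original thread:

// Sketch only: disconnect the consumer cleanly on shutdown so the group can
// rebalance immediately instead of waiting for the session timeout to expire.
process.on('SIGTERM', () => {
  logger.info('SIGTERM received, disconnecting kafka consumer');

  // stream.consumer is the underlying node-rdkafka KafkaConsumer.
  stream.consumer.disconnect(() => {
    logger.info('kafka consumer disconnected, exiting');
    process.exit(0);
  });

  // Fallback in case the disconnect callback never fires.
  setTimeout(() => process.exit(1), 10000).unref();
});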