
Consumer streams stop consuming

See original GitHub issue

Hi, I’m using the consumer stream in an ETL service. There are currently around 60 active consumers and everything works great. Every couple of days, though, one of the streams stops consuming data and I have to restart it manually, yet no error is emitted when this happens. I use manual commits and don’t wait for the commit callbacks; I’m not sure whether that might be related to the issue. I currently use stream.on('error', cb) for error handling. Is there a better way to get more data for debugging? Thanks.

Here is the consumer configuration:

const Kafka = require('node-rdkafka');

const hosts = config.kafka.hosts.join(',');

const globalConfig = {
  'metadata.broker.list': hosts,
  'group.id': groupId,
  'enable.auto.commit': false, // offsets are committed manually
  'offset.store.method': 'broker',
  'partition.assignment.strategy': 'roundrobin',
  'topic.metadata.refresh.interval.ms': 30000,
  'queued.max.messages.kbytes': 10000
};

const topicConfig = {
  'auto.offset.reset': 'latest'
};

// createReadStream is a factory method on KafkaConsumer, not a constructor.
const stream = Kafka.KafkaConsumer.createReadStream(globalConfig, topicConfig, {
  topics: [topicsRegex],
  waitInterval: 0,
  fetchSize: 1000
});

stream.on('error', (err) => {
  if (stream.consumer && stream.consumer._isDisconnecting) {
    logger.error('Error in kafka: Disconnecting in progress');
    return;
  }

  logger.error('Error in kafka', {
    error_message: err.stack,
    error_code: err.code
  });
});
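One way to get more visibility here, sketched below under the assumption that this is node-rdkafka: librdkafka’s internal logging can be switched on with the 'debug' property (it is delivered as 'event.log' events on the underlying consumer), and an 'offset_commit_cb' in the global config surfaces the result of every manual commit, so failed commits are no longer silent. This is not from the original thread; the debug contexts and listener wiring are illustrative, and only globalConfig, topicConfig, topicsRegex and logger come from the snippet above.

const debugGlobalConfig = Object.assign({}, globalConfig, {
  // Enable librdkafka debug contexts: 'cgrp' traces rebalances,
  // 'fetch' shows why fetch requests stall.
  'debug': 'consumer,cgrp,topic,fetch',
  'event_cb': true,
  // Called for every commit, so failed manual commits become visible.
  'offset_commit_cb': (err, topicPartitions) => {
    if (err) {
      logger.error('Kafka offset commit failed', { error_code: err.code });
    }
  }
});

const debugStream = Kafka.KafkaConsumer.createReadStream(debugGlobalConfig, topicConfig, {
  topics: [topicsRegex]
});

// Debug output arrives as 'event.log' on the underlying consumer.
debugStream.consumer.on('event.log', (log) => {
  logger.info(`rdkafka ${log.fac}: ${log.message}`);
});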

Issue Analytics

  • State: closed
  • Created: 6 years ago
  • Comments: 13

Top GitHub Comments

1 reaction
bobrowadam commented, May 11, 2018

@webmakersteve @csaroff We now run far fewer consumers per CPU core and also made sure we gracefully close the consumers on restart; things have worked just fine ever since.
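A graceful close along those lines might look like the minimal sketch below (the SIGTERM hook is an assumption; stream is the consumer stream from the snippet above). Leaving the group cleanly matters because an abruptly killed member keeps its partitions assigned until session.timeout.ms expires, which looks exactly like a stream that has stopped consuming.

process.on('SIGTERM', () => {
  // Stop the readable side first so no new messages are pulled,
  // then disconnect so the group coordinator knows the member left.
  stream.destroy();
  stream.consumer.disconnect((err) => {
    if (err) {
      logger.error('Error while disconnecting consumer', { error_code: err.code });
    }
    process.exit(err ? 1 : 0);
  });
});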

1 reaction
bobrowadam commented, Mar 6, 2018

Ok thanks, I’ll give it a try.


Top Results From Across the Web

@StreamListener (kafka) : How to stop consumers in-order ...
The consumer will stop when all the messages from the previous poll() are processed, not immediately. To stop immediately you must set the...

Akka kafka streams app suddenly stops consuming #1204
Hello! We recently had an issue with two of our akka-kafka-streams apps in production, where they just suddenly stopped consuming.

Kafka Streams vs. Kafka Consumer
Single Kafka Stream to consume and produce; Perform complex processing; Do not support batch processing; Support stateless and stateful ...

How Kafka Streams Works: A Guide to Stream Processing
In Kafka Streams, state stores can either be persistent—using RocksDB—or in memory. Of course, because it is easy to lose a disk or...

Kafka-Streams - Tips on How to Decrease Re-Balancing ...
During rebalance, consumers stop processing messages for some period of time, and, as a result, processing of events from a topic happens with ...
