Unable to consume messages from a topic
See original GitHub issue

Issue Description
Hi,
I'm just testing the library, but I'm not able to consume any messages from my topic. I created a topic named parallelConsumerTest with 10 partitions and produced 1 million records into it. I have already checked that the topic is not empty.
Here is my Java consumer:
```java
Consumer<String, String> consumer = ConsumerCreator.createParallelConsumer();
ParallelConsumerOptions<String, String> options = ParallelConsumerOptions.<String, String>builder()
        .consumer(consumer)
        .ordering(ParallelConsumerOptions.ProcessingOrder.PARTITION)
        .maxConcurrency(10)
        .build();
ParallelStreamProcessor<String, String> parallelConsumer = ParallelStreamProcessor.createEosStreamProcessor(options);
parallelConsumer.subscribe(Collections.singletonList("parallelConsumerTest"));
System.out.println("PARALLEL CONSUMER SUBSCRIBED");

Map<String, String> mapRecord = new HashMap<>();
parallelConsumer.poll(record -> {
    System.out.println("Concurrently processing a record: " + record);
    mapRecord.put(record.key(), record.value());
});
System.out.println("RECORD COUNT: " + mapRecord.size());
```
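[Editor's note] Two details in the snippet above are worth flagging regardless of why the topic isn't being read: `parallelConsumer.poll(...)` registers the callback and returns immediately, so the `RECORD COUNT` line prints before any record has been processed; and a plain `HashMap` is not safe to mutate from concurrent callbacks. A minimal pure-JDK sketch of the same pitfall, with a hypothetical `AsyncPitfall` class and an executor standing in for the parallel consumer's worker pool:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class AsyncPitfall {
    // Submit n asynchronous "record handlers" and return the shared map
    // without waiting -- mirroring how poll(...) returns immediately.
    static Map<String, String> process(int n, ExecutorService pool) {
        // ConcurrentHashMap is safe for concurrent writes, unlike HashMap
        Map<String, String> mapRecord = new ConcurrentHashMap<>();
        for (int i = 0; i < n; i++) {
            final int id = i;
            pool.submit(() -> mapRecord.put("key-" + id, "value-" + id));
        }
        return mapRecord;
    }

    public static void main(String[] args) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        Map<String, String> mapRecord = process(100, pool);

        // Runs before the workers are guaranteed to have finished,
        // just like printing mapRecord.size() right after poll(...)
        System.out.println("IMMEDIATE COUNT: " + mapRecord.size());

        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
        // Only after the work completes does the map hold all entries
        System.out.println("FINAL COUNT: " + mapRecord.size());
    }
}
```

So even once consumption works, the size printed immediately after `poll(...)` can legitimately be zero.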
What happens is that my mapRecord is always empty, and I can't understand why. Please note that using a "standard" Kafka consumer, everything works fine.
What am I doing wrong?
Thanks in advance!
Mauro
Issue Analytics
- State:
- Created: 2 years ago
- Comments: 5 (2 by maintainers)
Thanks for the code, but can you add a test to it which reproduces the problem? If a unit test doesn't do it, try an integration test with Testcontainers. Happy to help where I can, but I don't have much time 😃 I'm curious to see what the issue is, though…
I solved the problem! Obviously it wasn't an issue with the parallel consumer library. Since I used two consumers, the standard one and the parallel one, I forgot to assign a different group.id to each. My bad! Thanks anyway for your availability!
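[Editor's note] For readers hitting the same symptom: when two consumers share one group.id they join the same consumer group, so the partitions (and the group's committed offsets) are shared between them, and one consumer can appear to receive nothing. A minimal sketch of giving each consumer its own configuration, using a hypothetical baseProps helper and illustrative broker address and group names (none of these appear in the issue):

```java
import java.util.Properties;

public class GroupIds {
    // Build a base consumer configuration with a caller-supplied group.id,
    // so each consumer instance joins its own consumer group.
    static Properties baseProps(String groupId) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("group.id", groupId);                   // distinct per consumer
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("auto.offset.reset", "earliest");       // read from the beginning
        return props;
    }

    public static void main(String[] args) {
        Properties standard = baseProps("standard-consumer-group");
        Properties parallel = baseProps("parallel-consumer-group");
        // With distinct group.ids, both consumers independently read the
        // whole topic instead of splitting it within one group.
        System.out.println(standard.getProperty("group.id"));
        System.out.println(parallel.getProperty("group.id"));
    }
}
```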