KafkaConsumer should not be sealed
KafkaConsumer is a sealed abstract class. That serves no purpose, since it is not an enumeration that gets pattern-matched somewhere and it has no concrete implementations. On the flip side, being sealed means you can't easily mock it for testing. KafkaProducer is also abstract but not sealed.
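To illustrate the mocking point, here is a minimal Scala sketch using hypothetical, stripped-down stand-ins for the classes being discussed (the names, type parameters, and methods are illustrative only, not the library's actual API):

```scala
// Hypothetical stand-ins; not the real library signatures.

// `sealed` means this class can only be extended in the file that defines it,
// so a test suite living in another file cannot provide a stub subclass.
sealed abstract class SealedConsumer[K, V] {
  def subscribeTo(topic: String): Unit
  def poll(): List[(K, V)]
}

// Abstract but not sealed (like KafkaProducer): any file can stub it.
abstract class OpenProducer[K, V] {
  def produceOne(topic: String, key: K, value: V): Unit
}

// In a separate test file this compiles fine:
//   class StubProducer extends OpenProducer[String, String] {
//     def produceOne(topic: String, key: String, value: String): Unit = ()
//   }
// whereas `class StubConsumer extends SealedConsumer[String, String]` is
// rejected with "illegal inheritance from sealed class SealedConsumer".
```

Hand-written test stubs for the sealed class would have to live in the library's own source file, which is exactly the friction the issue describes.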
Issue Analytics
- State:
- Created: 3 years ago
- Reactions: 1
- Comments: 8 (3 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
The only gain is that we can keep adding functions to the sealed KafkaConsumer without breaking binary compatibility.

Don't think I agree. KafkaConsumer has methods for:

There's a definite possibility that we'll want to support new features that are added to Kafka in the future, and/or add higher-level functionality. Splitting out traits also lets us restrict which capabilities we pass around, so it makes it easier to reason about what a program does and doesn't do.
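As a rough sketch of the trait-splitting idea from that comment (the trait and method names below are hypothetical, not the library's actual API), capabilities could be exposed as separate traits so callers only receive what they need:

```scala
// Hypothetical capability traits; names and signatures are illustrative only.
trait Subscription[F[_]] {
  def subscribe(topics: Set[String]): F[Unit]
}

trait Offsets[F[_]] {
  def commitOffsets(offsets: Map[String, Long]): F[Unit]
}

trait Records[F[_], K, V] {
  def poll(): F[List[(K, V)]]
}

// The full consumer is just the intersection of its capabilities...
abstract class Consumer[F[_], K, V]
    extends Subscription[F]
    with Offsets[F]
    with Records[F, K, V]

object CapabilityExample {
  // ...while a function that only reads records can ask for exactly that
  // capability, which documents (and limits) what it is able to do.
  def drain[F[_], K, V](records: Records[F, K, V]): F[List[(K, V)]] =
    records.poll()
}
```

Adding a method to one of these traits would still break binary compatibility for outside implementors, which is the trade-off the first comment points at; keeping the class sealed avoids that, but at the cost of the test ergonomics raised in the issue.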