Topic Subscription with offset info for exactly-once semantics
Consuming from a topic could also provide the topic offset, so that consumers can track it themselves and implement exactly-once semantics.
Kafka docs, for example, suggest that approach:
So what about exactly once semantics (i.e. the thing you actually want)? […] this can be handled more simply and generally by simply letting the consumer store its offset in the same place as its output.
This could require adding an exactlyOnce method to the subscriber API, which would take a Flow of (message, offset). We may have to model this offset as (offset: Long, partitionId: Int), since each partition of the same topic has its own offset.
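A minimal, purely illustrative sketch of what that API shape could look like — the exactlyOnce method, the Offset case class, and the Flow signature are hypothetical and not part of the existing subscriber API:

```scala
import akka.NotUsed
import akka.stream.scaladsl.Flow
import org.apache.kafka.clients.consumer.ConsumerRecord

// Hypothetical offset model: each partition of a topic has its own offset,
// so the offset alone is not enough to resume — the partition id is needed too.
final case class Offset(offset: Long, partitionId: Int)

// Hypothetical subscriber API: the consumer receives each record together with
// its offset, so it can store the offset atomically alongside its own output.
trait Subscriber[K, V] {
  def exactlyOnce(processing: Flow[(ConsumerRecord[K, V], Offset), Offset, NotUsed]): Unit
}
```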
Top GitHub Comments
There’s some example code at http://doc.akka.io/docs/akka-stream-kafka/current/consumer.html#external-offset-storage
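A rough sketch along the lines of that external offset storage example, assuming a recent Alpakka Kafka / Akka 2.6+ API; loadOffsetFromDb and saveOffsetToDb are hypothetical placeholders for the consumer’s own store:

```scala
import akka.actor.ActorSystem
import akka.kafka.{ConsumerSettings, Subscriptions}
import akka.kafka.scaladsl.Consumer
import akka.stream.scaladsl.Sink
import org.apache.kafka.common.TopicPartition
import org.apache.kafka.common.serialization.StringDeserializer

object ExternalOffsetStorageSketch extends App {
  // Akka 2.6+ style: the ActorSystem also provides the stream materializer.
  implicit val system: ActorSystem = ActorSystem("external-offset-example")

  val consumerSettings =
    ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
      .withBootstrapServers("localhost:9092")
      .withGroupId("group1")

  // Hypothetical placeholders for wherever the application keeps its offsets
  // (ideally the same store as its output, so both are written together).
  def loadOffsetFromDb(): Long = 0L
  def saveOffsetToDb(offset: Long): Unit = ()

  val partition0 = new TopicPartition("topic1", 0)

  Consumer
    // Start reading from the externally stored offset rather than the
    // offset committed to Kafka.
    .plainSource(consumerSettings, Subscriptions.assignmentWithOffset(partition0, loadOffsetFromDb()))
    .map { record =>
      // Process the record and persist its offset together with the output.
      saveOffsetToDb(record.offset())
      record
    }
    .runWith(Sink.ignore)
}
```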
We probably need https://github.com/akka/reactive-kafka/blob/776a6ec9e320415bb72486612cc90d33f3557fc4/core/src/main/scala/akka/kafka/ConsumerSettings.scala#L30-L30