Flow.chunked operator with size limit
It would be useful to have an optional transformation in the Flow.buffer method to aggregate the buffered items, similar to kotlin.sequences.Sequence.chunked.
I mean,
fun <T, R> Flow<T>.buffer(capacity: Int = BUFFERED, transform: suspend (List<T>) -> R): Flow<R>
Then we can write
runBlocking {
    (1..100).asFlow().buffer(capacity = 10) { it.sum() }.collect { println(it) }
}
which prints 55, 155, 255, …, 955.
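For reference, here's a minimal sketch of how such an operator could be written on top of the existing flow builder. The chunked name and the decision to emit a trailing partial chunk are my own assumptions, not a settled design:

import kotlinx.coroutines.flow.*

// Sketch: group upstream items into fixed-size chunks, apply a suspending
// transform to each chunk, and emit the transformed results downstream.
fun <T, R> Flow<T>.chunked(size: Int, transform: suspend (List<T>) -> R): Flow<R> = flow {
    require(size > 0) { "size must be positive, but was $size" }
    val buffer = ArrayList<T>(size)
    collect { value ->
        buffer += value
        if (buffer.size == size) {
            emit(transform(buffer.toList()))
            buffer.clear()
        }
    }
    // Emit whatever is left over as a final, possibly smaller chunk.
    if (buffer.isNotEmpty()) emit(transform(buffer.toList()))
}

With that, (1..100).asFlow().chunked(10) { it.sum() }.collect { println(it) } prints the same 55, 155, …, 955 sequence as above.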
I’m realizing that I actually want slightly different behavior from what I’ve seen discussed so far, because all I really want is to convert a stream of values into a batch operation when appropriate, something like the following. This is optimizing for a database that performs better with batch operations than with many small ones, and for which it’s safest to restrict writes to once a second.
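To make that concrete, here is a rough sketch of the shape I have in mind, assuming both a size cap and a time window. The chunked name and the maxSize/intervalMillis parameters are illustrative, not a proposed API, and edge cases such as cancellation races around the timeout are not handled:

import kotlinx.coroutines.*
import kotlinx.coroutines.channels.Channel
import kotlinx.coroutines.flow.*

// Sketch: emit a batch once it reaches maxSize items or once intervalMillis has
// elapsed, whichever comes first, then wait out the rest of the window so that
// batches are sent at most once per interval.
fun <T> Flow<T>.chunked(maxSize: Int, intervalMillis: Long): Flow<List<T>> = channelFlow {
    // Funnel upstream items into a channel so we can race "next item" against a deadline.
    val upstream = Channel<T>(Channel.BUFFERED)
    launch {
        try {
            this@chunked.collect { upstream.send(it) }
        } finally {
            upstream.close()
        }
    }
    var done = false
    while (!done) {
        val batch = ArrayList<T>(maxSize)
        val deadline = System.currentTimeMillis() + intervalMillis
        while (batch.size < maxSize && !done) {
            val remaining = deadline - System.currentTimeMillis()
            if (remaining <= 0) break
            val result = withTimeoutOrNull(remaining) { upstream.receiveCatching() }
            when {
                result == null -> break        // window elapsed before the batch filled
                result.isClosed -> done = true // upstream completed
                else -> batch += result.getOrThrow()
            }
        }
        if (batch.isNotEmpty()) send(batch)
        // Enforce at most one downstream emission per window.
        val leftover = deadline - System.currentTimeMillis()
        if (leftover > 0 && !done) delay(leftover)
    }
}

Downstream code would then collect lists and issue one batched write per list, e.g. events.chunked(500, 1_000).collect { batch -> db.insertAll(batch) }, where events, db, and insertAll stand in for your actual flow and database client.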
My implementation is here (I’m sure I’m using coroutines incorrectly somehow): https://gist.github.com/AWinterman/8516d4869f491176ebb270dafbb23199
Would this work?