[Request] Supported fs2 integration
I looked at the http4s example, but it appears to read all inputs into memory rather than streaming.
The code below appears to work so far, but it would be nice to have official support that is maintained over time (and hopefully improved!).
import cats.effect.implicits._
import cats.effect.{ConcurrentEffect, ContextShift, IO, Sync}
import cats.implicits._
import com.github.plokhotnyuk.jsoniter_scala.core.{
  JsonValueCodec,
  scanJsonValuesFromStream,
  writeToArray
}
import com.github.plokhotnyuk.jsoniter_scala.macros.{
  CodecMakerConfig,
  JsonCodecMaker
}
import fs2.concurrent.Queue
import fs2.{Chunk, Pipe, Stream}
import org.scalatest.{FlatSpec, Matchers}

import scala.concurrent.ExecutionContext

class JsoniterSpec extends FlatSpec with Matchers {
  import ExecutionContext.Implicits.global
  implicit val CS: ContextShift[IO] = IO.contextShift(global)

  behavior of "jsoniter"

  it should "parse streaming values" in {
    case class Foo(name: String, num: Int)
    implicit val codec: JsonValueCodec[Foo] =
      JsonCodecMaker.make[Foo](CodecMakerConfig())

    // Serialize a couple of values, then re-chunk them into a raw byte stream.
    val input = Stream(Foo("fred", 1), Foo("wilma", 2))
    val bytes = input
      .map(writeToArray(_))
      .flatMap(bs => Stream.chunk(Chunk.bytes(bs)))
      .covary[IO]

    val results = bytes.through(parse[IO, Foo](global))
    results.compile.toList
      .map(result => result shouldEqual input.toList)
      .unsafeRunSync()
  }

  def parse[F[_]: ConcurrentEffect: ContextShift, A: JsonValueCodec](
      blockingExecutionContext: ExecutionContext,
      maxBuffered: Int = 1,
      shouldParseNext: A => Boolean = (_: A) => true
  ): Pipe[F, Byte, A] = { in =>
    Stream.eval(Queue.boundedNoneTerminated[F, A](maxBuffered)).flatMap { q =>
      in.through(fs2.io.toInputStream[F]).flatMap { inputStream =>
        // jsoniter-scala's callback is synchronous, so each parsed value is
        // pushed onto the bounded queue with unsafeRunSync; returning false
        // stops the scan early.
        def eachA(a: A): Boolean =
          q.enqueue1(Some(a)).as(shouldParseNext(a)).toIO.unsafeRunSync()

        // Run the blocking scan on the dedicated pool, then terminate the
        // queue. Note: enqueue1(None) rather than offer1(None), so the
        // terminator is never dropped if the queue happens to be full.
        val parseBytes = ContextShift[F].evalOn(blockingExecutionContext) {
          Sync[F].delay(scanJsonValuesFromStream(inputStream)(eachA))
        } *> q.enqueue1(None)

        val emitValues = q.dequeue
        emitValues concurrently Stream.eval(parseBytes)
      }
    }
  }
}
@Daenyth you can use the InputStream-based methods for trusted input. For better throughput, pass ReaderConfig/WriterConfig to them with tuned preferred sizes for the internal read/write buffers.
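A minimal sketch of that tuning, assuming the case-class-style ReaderConfig/WriterConfig constructors of jsoniter-scala releases from around the time of this issue (newer releases expose withPreferredBufSize-style builders instead). The 32 KiB size is illustrative, and inputStream/eachA/outputStream stand in for the values from the parse pipe above:

import com.github.plokhotnyuk.jsoniter_scala.core._

// Larger internal buffers mean fewer calls against the underlying streams.
val readerConfig = ReaderConfig(preferredBufSize = 32768)
val writerConfig = WriterConfig(preferredBufSize = 32768)

// Both entry points accept the config as an extra argument:
scanJsonValuesFromStream(inputStream, readerConfig)(eachA)
writeToStream(foo, outputStream, writerConfig)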
@steven-lai Hi Steven! Please open a separate issue if your case is not related to fs2 integration. I’m happy to help you find the best solution. A good starting point would be samples of your input and examples of the data structures you use to handle them after parsing.
As an example, we can define a manually written codec that returns scala.Unit but accepts a callback function in its constructor and redirects each parsed data structure to it. That allows handling repetitive JSON structures that are nested in different ways (not just line-feed-separated values or JSON arrays of values); see the sketch below.
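A hypothetical, deliberately simplified sketch of that shape, reusing a Foo case class like the one in the test above (the FooCallbackCodec name and wiring are illustrative, not an API of the library; a real implementation for arbitrarily nested input would walk the nesting by hand inside decodeValue):

import com.github.plokhotnyuk.jsoniter_scala.core._
import com.github.plokhotnyuk.jsoniter_scala.macros.{CodecMakerConfig, JsonCodecMaker}

// Decodes Foo values but returns Unit, forwarding each decoded value to the
// callback instead of materializing a collection in memory.
final class FooCallbackCodec(onFoo: Foo => Unit) extends JsonValueCodec[Unit] {
  private[this] val fooCodec: JsonValueCodec[Foo] =
    JsonCodecMaker.make[Foo](CodecMakerConfig())

  override def decodeValue(in: JsonReader, default: Unit): Unit =
    onFoo(fooCodec.decodeValue(in, fooCodec.nullValue))

  override def encodeValue(x: Unit, out: JsonWriter): Unit =
    () // the write side is unused in this read-only scenario

  override def nullValue: Unit = ()
}

// Usage: readFromStream(inputStream)(new FooCallbackCodec(foo => println(foo)))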