Allow decoding more than one object from an InputStream
This is a meta issue for collecting use cases to inform the design of an API that would actually 'stream' objects from an InputStream, instead of decoding a single object with Json.decodeFromStream.
Please describe your use case in detail: whether it is an HTTP long-polling stream or a WebSocket, a first-party or third-party API, or simply a very large file.
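To make the request concrete, here is a minimal Java sketch (not part of kotlinx.serialization; names are hypothetical) that lazily yields newline-delimited records from an InputStream. A streaming decode API would conceptually wrap per-object deserialization around something like this, instead of requiring the whole stream to hold exactly one object:

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.stream.Stream;

public class StreamingRecords {
    // Lazily yields one raw record per non-blank line; each record would
    // then be handed to a single-object decoder (e.g. Json.decodeFromString).
    static Stream<String> records(InputStream in) {
        BufferedReader reader =
                new BufferedReader(new InputStreamReader(in, StandardCharsets.UTF_8));
        return reader.lines().filter(line -> !line.isBlank());
    }

    public static void main(String[] args) {
        byte[] data = "{\"id\":1}\n{\"id\":2}\n{\"id\":3}\n".getBytes(StandardCharsets.UTF_8);
        records(new ByteArrayInputStream(data)).forEach(System.out::println);
    }
}
```

The key property is laziness: records are produced as bytes arrive, so the source may be unbounded.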
Issue Analytics
- State:
- Created 2 years ago
- Comments: 9 (2 by maintainers)
Top GitHub Comments
My use case is using WebSocket streaming to drive the entire user interface.

The current implementation uses streams of Event objects (one stream per direction) and serializes/deserializes them into/from WebSocket frames via ProtoBuf. It also uses special synchronization events for backpressure (e.g. throttling the backend when it sends updates faster than the frontend can process them). So although this is a streaming scenario, serialization currently deals with single objects only (events enveloped in frames).

A drawback of the current approach is that a large event might have to be split into (artificially created) smaller chunks before serialization, to allow for fine-grained backpressure (in-flight synchronization events) and features like progress indicators. Chunking could occur at the binary level, though.
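The binary-level chunking the commenter describes can be sketched as follows (a minimal illustration; names are hypothetical, and real code would operate on serialized ProtoBuf frames):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class FrameChunker {
    // Splits one large serialized event into fixed-size chunks so that
    // synchronization (backpressure) events can be interleaved between them.
    static List<byte[]> chunk(byte[] payload, int maxChunkSize) {
        List<byte[]> chunks = new ArrayList<>();
        for (int offset = 0; offset < payload.length; offset += maxChunkSize) {
            int end = Math.min(offset + maxChunkSize, payload.length);
            chunks.add(Arrays.copyOfRange(payload, offset, end));
        }
        return chunks;
    }
}
```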
I’d like to tail a JSON lines file from a potentially infinite stream. Examples of this format can be found here: https://jsonlines.org/examples/ and look like this:
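A representative sample in that style (adapted from the jsonlines.org examples):

```json
{"name": "Gilbert", "session": "2013", "score": 24, "completed": true}
{"name": "Alexa", "session": "2013", "score": 29, "completed": true}
{"name": "May", "session": "2012B", "score": 14, "completed": false}
```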
Note that this format differs somewhat from the JSON spec: instead of one large array containing all the objects, multiple 'root' objects are written, each separated by a line separator.
In practice we often use logback with https://github.com/logstash/logstash-logback-encoder for our logging, which writes each log event as a single-line JSON to some output such as a file or socket.
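For illustration, a log event emitted by logstash-logback-encoder typically looks roughly like this (the exact field set depends on the encoder configuration):

```json
{"@timestamp":"2021-06-01T12:00:00.000Z","@version":"1","message":"User logged in","logger_name":"com.example.Auth","thread_name":"main","level":"INFO"}
```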