[recorder] binary responses encoded in 'hex' do not work well
When the response body is binary data, our recorder stores the data in hex.
When consuming the data in playback mode, we can't infer the encoding: response.readableStreamBody?.readableEncoding is null.
If I set the encoding manually with response.readableStreamBody?.setEncoding("hex"); the data gets decoded correctly, yet the length of the data is wrong. In live mode we expect binary data: with stream.read(size), size means a number of bytes (8 bits each), but each character of 'hex' data carries only 4 meaningful bits, so the hex representation is twice the length of the underlying bytes.
I ran into this with the tests for quick query and change feed. https://github.com/Azure/azure-sdk-for-js/blob/66799d5b0238c6d3870dca2173f99aea9145578a/sdk/storage/storage-blob/test/node/blobclient.spec.ts#L351
Issue Analytics
- State:
- Created 3 years ago
- Comments: 16 (16 by maintainers)
Top GitHub Comments
@ljian3377 That’s interesting.
I believe this is what you are referring to. https://github.com/Azure/azure-sdk-for-js/blob/317b9d825b4bc84de17529328340014124d6c7dd/sdk/storage/storage-blob-changefeed/recordings/node/blobchangefeedclient/recording_bypage.js#L382
I think I know what went wrong: the status code in the reply here is 206, and my previous fix only filters responses with a 200 status code (the intention behind that was to avoid greedy replacements).
The fix should be as simple as allowing the 206 status code too. I'll verify my theory and put up a fix for this tomorrow.

The API should support both browser and Node in the end; we are still working on it. The recorder utilities need to guarantee that we get the same data in live and playback mode. In live mode we expect binary data, 8 bits per byte, so the length matters.
I couldn't find a way to get the encoding of the ReadableStream, so I'm not able to work around it accordingly.
@HarshaNalluru Yes, the subscription needs to register for the quick query feature, and it currently only works in the Canada Central and France regions in production. I will ping you with the account info I am using.