
Nock record outputs array of hex values instead of plain JSON when response is encoded


Related: https://github.com/nock/nock/issues/457#issuecomment-419480974

What is the expected behavior?
When recording responses which are compressed and chunked, e.g. Content-Encoding: 'gzip' and Transfer-Encoding: 'chunked', I was expecting the generated fixture to decompress and combine the chunks into a human-readable response. This would then allow us to remove sensitive information from the response and modify it to satisfy scenarios which are hard to reproduce against the real APIs.

What is the actual behavior?
The fixture's response is an array of hex values.
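
For illustration, a recorded fixture entry ends up looking roughly like the following (the endpoint and hex strings are invented and truncated; a real first chunk starts with the gzip magic bytes 1f 8b 08), whereas the expectation was that response would contain the plain, decompressed JSON body:

[
  {
    "scope": "https://api.example.com:443",
    "method": "GET",
    "path": "/users",
    "status": 200,
    "response": [
      "1f8b08000000000000034d8e...",
      "3d4a0ac2301004d0bbcc3a48...",
      "..."
    ],
    "rawHeaders": ["Content-Encoding", "gzip", "Transfer-Encoding", "chunked"]
  }
]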

Possible solution
I'm working around this by modifying the nockDefs like this:

const { ungzip } = require('node-gzip');

try {
  if (Array.isArray(def.response)) { // NOTE: This is a v. naive check
    def.response = JSON.parse(
      (await ungzip(Buffer.from(def.response.join(''), 'hex'))).toString(
        'utf-8'
      )
    );
  }
} catch (ex) {
  console.warn('Failed to decode response');
}
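
For context, a minimal sketch of where a transform like this could run, assuming the raw definitions come from nock.recorder.play() with output_objects enabled (the helper name and file path are placeholders):

const fs = require('fs');
const nock = require('nock');
const { ungzip } = require('node-gzip');

async function saveDecodedFixture(fixturePath) {
  // Assumes nock.recorder.rec({ output_objects: true, dont_print: true }) was called before the requests
  const defs = nock.recorder.play();
  const decoded = await Promise.all(
    defs.map(async def => {
      try {
        if (Array.isArray(def.response)) {
          // Join the hex chunks, gunzip them and parse the result back into plain JSON
          def.response = JSON.parse(
            (await ungzip(Buffer.from(def.response.join(''), 'hex'))).toString('utf-8')
          );
        }
      } catch (ex) {
        console.warn('Failed to decode response');
      }
      return def;
    })
  );
  fs.writeFileSync(fixturePath, JSON.stringify(decoded, null, 2));
}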

If this is in fact a bug then it'd be good to fix it in nock itself. If it's as intended, i.e. nock is technically returning the exact same response as the real server, then perhaps an option could be passed to nock record / nock.back to have it output the decompressed response?

How to reproduce the issue
The issue can be reproduced by recording an API which returns compressed, chunked JSON.
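
As a rough sketch of such a recording (the local server and port are stand-ins for a real API; the linked test case below is the authoritative reproduction):

const http = require('http');
const zlib = require('zlib');
const nock = require('nock');

// Local stand-in for an API that returns gzipped JSON; no Content-Length is set,
// so Node sends the body with chunked transfer encoding
const server = http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'application/json', 'Content-Encoding': 'gzip' });
  const gzip = zlib.createGzip();
  gzip.pipe(res);
  gzip.end(JSON.stringify({ hello: 'world', secret: 'remove-me' }));
});

server.listen(3000, () => {
  nock.recorder.rec({ output_objects: true, dont_print: true });
  http.get('http://localhost:3000/', res => {
    res.resume();
    res.on('end', () => {
      // Give the recorder a tick to flush, then inspect the recorded definitions:
      // `response` comes out as an array of hex strings rather than plain JSON
      setImmediate(() => {
        console.log(JSON.stringify(nock.recorder.play(), null, 2));
        server.close();
      });
    });
  });
});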

Does the bug have a test case?
https://github.com/richardscarrott/nock-record-chunked-encoding

Versions

Software Version(s)
Nock 9.6.1
Node 10.5.0

Issue Analytics

  • State: open
  • Created: 5 years ago
  • Reactions: 12
  • Comments: 9 (2 by maintainers)

Top GitHub Comments

6 reactions
protoEvangelion commented, Jan 9, 2019

@richardscarrott thanks for pointing me in the right direction with this. The additional problem I had to figure out was that I needed to re-gzip after normalizing. This was because https://github.com/octokit/rest.js expects the response to be gzipped.

To work with nockBack, without changing nock internally, here is what I did in the afterRecord callback:

const zlib = require('zlib')
// `isArray` below is presumably lodash's; Array.isArray(fixture.response) would work just as well
const { isArray } = require('lodash')

function decodeBuffer(fixture) {
  // Decode the hex buffer that nock made
  const response = isArray(fixture.response) ? fixture.response.join('') : fixture.response

  try {
      const decoded = Buffer.from(response, 'hex')
      var unzipped = zlib.gunzipSync(decoded).toString('utf-8')
  } catch (err) {
      throw new Error(`Error decoding nock hex:\n${err}`)
  }

  return JSON.parse(unzipped)
}

function afterRecord(fixtures) {
  const normalizedFixtures = fixtures.map(fixture => {
      fixture.response = decodeBuffer(fixture)

      // do normalization stuff
      // Re-gzip to keep the @octokit/rest happy
      const stringified = JSON.stringify(fixture.response)
      const zipped = zlib.gzipSync(stringified)

      fixture.response = zipped

      return fixture
  })

  return normalizedFixtures
}
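
For reference, a sketch of how an afterRecord hook like the one above can be wired into nockBack (the fixture directory, file name and request under test are placeholders):

const path = require('path')
const nockBack = require('nock').back

nockBack.fixtures = path.join(__dirname, '__fixtures__')
nockBack.setMode('record')

// afterRecord receives the recorded fixtures before they are written to disk
nockBack('github-user.json', { afterRecord }, nockDone => {
  makeOctokitRequest() // stands in for whatever @octokit/rest call the test makes
    .then(() => nockDone())
})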

5 reactions
richardscarrott commented, Sep 10, 2018

FYI, this is the complete afterNock function I’m using

import * as nock from 'nock';
import { gunzipSync } from 'zlib';
// Assumption: `Dictionary<string>` below is lodash's string-keyed map type
import { Dictionary } from 'lodash';

const parseNockDefs = (
  nockDefs: (nock.NockDefinition & { rawHeaders: string[] })[]
) => {
  return nockDefs.map(def => {
    try {
      const headers = def.rawHeaders.reduce<Dictionary<string>>(
        (acc, curr, i, arr) => {
          if (i % 2 === 0) {
            acc[arr[i].toLowerCase()] = arr[i + 1].toLowerCase();
          }
          return acc;
        },
        {}
      );
      if (
        headers['transfer-encoding'] === 'chunked' &&
        headers['content-encoding'] === 'gzip' &&
        Array.isArray(def.response)
      ) {
        def.response = JSON.parse(
          gunzipSync(Buffer.from(def.response.join(''), 'hex')).toString(
            'utf-8'
          )
        );
        def.rawHeaders = Object.entries(headers).flatMap(([key, value]) => {
          if (key === 'transfer-encoding' || key === 'content-encoding') {
            return [];
          }
          return [key, value];
        });
      }
    } catch (ex) {
      console.warn('Failed to decode response');
    }
    return def;
  });
};