
Cannot pipe s3 object readstream to PUT request

See original GitHub issue

StackOverflow link. I am asking here as well because I suspect there might be a bug when piping an external read stream to put/post methods: the boundary of the multipart data may not be set properly.

I have a simple scenario: I need to read an object from S3 and pipe its output to a PUT request. Here is my client.js code, using the request module.

// client.js
let AWS = require('aws-sdk')
let request = require('request')

let bucket = 'my_bucket'
let filename = 'path/to/file.zip'

let host = 'localhost'
let port = 8080

let s3 = new AWS.S3({
  . . .
})

let readStream = s3.getObject({
  Bucket: bucket,
  Key: filename
}).createReadStream()

let formData = {
  applicationType: 'my_app_type',
  applicationName: 'my_app_name',
  upload: {
    value: readStream,
    options: {
      filename: 'my_file_name.zip',
      contentType: 'application/zip'
    }
  }
}

request.put({
  url: 'http://' + host + ':' + port + '/bootstrap',
  formData: formData
}, function (error, response, body) {
  if (error) throw error
  console.log(body)
})

And, here is my server.js code.

// server.js
let http = require('http')
let Busboy = require('busboy')
let events = require('events')
let fs = require('fs')

let host = 'localhost'
let port = 8080

let compressedCodeLocation = './code.zip'

let handleRequest = function (request, response) {
  let eventEmitter = new events.EventEmitter()
  let inputStreamWriter = fs.createWriteStream(compressedCodeLocation)
  inputStreamWriter.on('finish', function () {
    eventEmitter.emit('input.stream.saved')
  })
  let busboy = new Busboy({
    headers: request.headers
  })

  // Stream the uploaded file part to disk; log any plain form fields.
  busboy.on('file', function (field, file) {
    file.pipe(inputStreamWriter)
  })
  busboy.on('field', function (field, val) {
    console.log(field + ': ' + val)
  })
  // Respond with the saved file's stats once the write completes.
  eventEmitter.on('input.stream.saved', function () {
    let stats = fs.statSync(compressedCodeLocation)
    response.statusCode = 200
    response.end(JSON.stringify(stats))
  })

  request.pipe(busboy)
}

let server = http.createServer(handleRequest)
server.listen(port, host, function () {
  console.log('Server started on ' + host + ':' + port)
})

let handleShutdown = function () {
  server.close(function () {
    console.log('Server stopped on ' + host + ':' + port)
  })
}
process.on('SIGTERM', handleShutdown)
process.on('SIGINT', handleShutdown)

I am getting this error:

File [upload] got 58 bytes
events.js:160
      throw er; // Unhandled 'error' event
      ^

Error: Unexpected end of multipart data
    at /pots/cnc/node_modules/dicer/lib/Dicer.js:62:28
    at _combinedTickCallback (internal/process/next_tick.js:67:7)
    at process._tickCallback (internal/process/next_tick.js:98:9)

The funny thing is, if I save the file locally first and then call createReadStream on that local file, it works:

let formData = {
  ...
  upload: {
    value: fs.createReadStream(localPath + "/" + filename),
    options: {
      ...
    }
  }
};

Issue Analytics

  • State: closed
  • Created: 6 years ago
  • Reactions: 1
  • Comments: 6

Top GitHub Comments

6 reactions
d0b1010r commented on Oct 5, 2018

I had a similar problem. It worked for me once I added knownLength to the options:

upload: {
  value: readStream,
  options: {
    filename: 'my_file_name.zip',
    contentType: 'application/zip',
    knownLength: 423424
  }
}

You can get the file size, for example, by using the headObject function:

return s3.headObject({ Key: key, Bucket: bucket })
  .promise()
  .then(res => res.ContentLength);

My understanding is that for multipart requests you need the size in advance. form-data (https://github.com/form-data/form-data) automatically detects the size it needs to send when given fs.createReadStream, but that does not work for streams from S3, so you need to supply it yourself.
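Putting the thread's two pieces together, here is a minimal sketch of the full client-side fix: fetch the object's size with headObject, then pass it as knownLength so form-data can compute the multipart Content-Length up front. The bucket, key, endpoint, and field values come from the question; the helper names (buildFormData, uploadFromS3) are illustrative, not part of either library.

```javascript
// Pure helper: build the formData entry once the size is known.
function buildFormData (readStream, size) {
  return {
    applicationType: 'my_app_type',
    applicationName: 'my_app_name',
    upload: {
      value: readStream,
      options: {
        filename: 'my_file_name.zip',
        contentType: 'application/zip',
        knownLength: size // size fetched via headObject below
      }
    }
  }
}

// Upload flow: headObject first, then PUT the S3 read stream.
function uploadFromS3 (bucket, key, url) {
  let AWS = require('aws-sdk')
  let request = require('request')
  let s3 = new AWS.S3()
  let params = { Bucket: bucket, Key: key }

  return s3.headObject(params).promise().then(function (head) {
    return new Promise(function (resolve, reject) {
      request.put({
        url: url,
        formData: buildFormData(
          s3.getObject(params).createReadStream(),
          head.ContentLength
        )
      }, function (error, response, body) {
        if (error) reject(error)
        else resolve(body)
      })
    })
  })
}

// e.g. uploadFromS3('my_bucket', 'path/to/file.zip',
//                   'http://localhost:8080/bootstrap').then(console.log)
```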

1 reaction
rash805115 commented on Oct 27, 2017

I got answers to my StackOverflow question, but I am not sure of the reasoning behind why it works. Can someone provide an explanation?

