
Error uploading to S3 using read streams and putObject

See original GitHub issue

Bug report

I was trying to use the S3 adapter to upload files in the Admin API using the File field type. When attempting to upload a file I would get a ‘nested errors’ message, and nothing would appear in the CLI log where the app was running. I had to make my own version of the adapter in a stand-alone project locally to debug this. Essentially, the file comes in as a read stream, but the putObject method of the AWS SDK expects a binary string. This can be resolved by using the upload method, which is smart enough to handle read streams. I can prepare a PR if you would like.

To Reproduce

Steps to reproduce the behaviour. Please provide code snippets or a repository:

  1. Set up the S3 file adapter as demonstrated in the documentation.
  2. Try to upload a file.
  3. Observe the error.

Expected behaviour

Uploading a file should work.

Screenshots

https://dsh.re/7c44e

System information

  • OS: Linux (Docker)

Additional context

https://github.com/keystonejs/keystone/blob/1d7f2e62756f054f0b2ba1c34043387e9f945393/packages/file-adapters/lib/s3.js#L39

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Reactions: 2
  • Comments: 6 (3 by maintainers)

Top GitHub Comments

1 reaction
BitForger commented, Jan 16, 2020

@LiamAttClarke #2249 is exactly what I did to fix it.

1 reaction
LiamAttClarke commented, Jan 14, 2020

According to the docs, s3.putObject accepts “Buffer, Typed Array, Blob, String, ReadableStream”.

https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#putObject-property

That said, I agree that the s3.upload method would be better.


Top Results From Across the Web

S3.putObject only accepts streams that it can determine the ...
According to https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#putObject-property the Body element can be a ReadableStream , ...
node.js - Pipe a stream to s3.upload() - Stack Overflow
Wrap the S3 upload() function with the node.js stream.PassThrough() stream. Here's an example: inputStream .pipe(uploadFromStream(s3)); ...
Upload an object to an Amazon S3 bucket using an AWS SDK
Upload a file from local storage to a bucket. Upload the contents of a Swift Data object to a bucket. For API details,...
Programmatically Stream (Upload) Large Files to Amazon S3
It's also possible to pipe a data stream to it in order to upload very large objects. To do this, simply wrap the...
Uploading large files to S3 using streams
I'll use my admin credentials to run the code from my computer. The minimum permission you'll need is PutObject , which is a...
