
Browser s3.upload fails if file or contents is too large

See original GitHub issue

TLDR

I was able to get this to work by omitting the FileReader object and setting Body to the file input directly. However, I want to read the file first with FileReader in order to transform the data before sending it to S3. Reading the data with FileReader on its own works fine and does not crash the browser; it is only when I take the contents of the file and pass them to s3.upload() that the tab crashes. I'm not sure how to debug this issue.
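
For context, the flow that crashes looks roughly like this (a sketch only; fileInput and transformContents are placeholder names, not identifiers from the real code):

var s3 = new AWS.S3();
var reader = new FileReader();
reader.onload = function() {
  // reader.result holds the raw CSV text; transform it, then upload the result
  var fileContents = transformContents(reader.result);
  var params = {Bucket: 'my-personal-bucket', Key: 'my-key', Body: fileContents};
  s3.upload(params, function(err, data) {
    console.log(err, data); // with large files the tab crashes before this runs
  });
};
reader.readAsText(fileInput.files[0]);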

What I’ve tried

I'm seeing an issue similar to https://github.com/aws/aws-sdk-js/issues/1865 when using this library in Chrome.

In the browser I can upload up to 66 MB, but uploads between 66 MB and 132 MB fail. I found this while trying to upload a CSV with 1 million rows (roughly 322 MB), so I retried the upload starting with a 1 MB file and doubling the size each time, up to 66 MB and then 132 MB.
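
To narrow down the threshold without hunting for CSVs of exactly the right size, the payload can be synthesized at a given size and probed one size at a time (sketch only; assumes the same s3 client and placeholder bucket as the snippet below):

function probeUpload(mb) {
  // build an mb-megabyte string and hand it to the same upload call as below
  var body = 'a'.repeat(mb * 1024 * 1024);
  var params = {Bucket: 'my-personal-bucket', Key: 'probe-' + mb + 'mb.txt', Body: body};
  s3.upload(params, function(err, data) {
    console.log(mb + ' MB:', err || 'uploaded');
  });
}
probeUpload(66);  // succeeds; 132 MB crashes the tab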

The code I'm using is simple; Body is the string contents from the FileReader object.

var Bucket = 'my-personal-bucket';
var Key = 'my-key';
// Body is the full string produced by FileReader
var params = {Bucket, Key, Body: fileContents};
// 10 MB parts, up to 10 parts in flight at once
var options = {partSize: 10 * 1024 * 1024, queueSize: 10};
var s3 = new AWS.S3();
s3.upload(params, options, function(err, data) {
  console.log(err, data);
});

I also tried a concurrency of queueSize: 1, as the s3.upload() docs recommend.
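
That variant only changes queueSize (the memory reasoning in the comment is an assumption, not something the docs or the thread spell out):

// With queueSize: 1 the managed uploader keeps at most one part in flight, which
// should cap the extra buffered data at roughly one partSize (assumption, not
// verified against the SDK internals)
var options = {partSize: 10 * 1024 * 1024, queueSize: 1};
s3.upload(params, options, function(err, data) {
  console.log(err, data);
});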

Manually chunking seems to work, but then I wind up with multiple files that I'd like to combine, and there's no way to combine them using the SDK (https://github.com/aws/aws-sdk-js/issues/2084).

// https://stackoverflow.com/a/29202760
// Split a string into fixed-size chunks; the last chunk may be shorter.
function chunkSubstr(str, size) {
  const numChunks = Math.ceil(str.length / size)
  const chunks = new Array(numChunks)
  for (let i = 0, o = 0; i < numChunks; ++i, o += size) {
    chunks[i] = str.substr(o, size)
  }
  return chunks
}
// 50 MB
chunk = 50 * 1024 * 1024;
// split into parts
parts = chunkSubstr(fileContents, chunk);
// for each part, upload as s3://my-personal-bucket/my-key.idx.txt where idx is the index
parts.forEach((part, idx) => {
  var params = {Bucket: 'my-personal-bucket', Key: `my-key.${idx}.txt`, Body: part};
  s3.upload(params, function(err, data) {
    console.log(err, data);
  });
});
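
For completeness, here is a rough sketch of how the chunks could be combined into one object with the low-level multipart calls (not something tried in the original issue; error handling and abortMultipartUpload are left out):

// Sketch (untested): upload the pre-split chunks as parts of one multipart upload.
// Every part except the last must be at least 5 MB, which the 50 MB chunks satisfy.
async function uploadChunksAsOneObject(bucket, key, chunks) {
  const {UploadId} = await s3.createMultipartUpload({Bucket: bucket, Key: key}).promise();
  const completedParts = [];
  for (let i = 0; i < chunks.length; i++) {
    const {ETag} = await s3.uploadPart({
      Bucket: bucket,
      Key: key,
      UploadId: UploadId,
      PartNumber: i + 1, // part numbers are 1-based
      Body: chunks[i],
    }).promise();
    completedParts.push({ETag: ETag, PartNumber: i + 1});
  }
  return s3.completeMultipartUpload({
    Bucket: bucket,
    Key: key,
    UploadId: UploadId,
    MultipartUpload: {Parts: completedParts},
  }).promise();
}

uploadChunksAsOneObject('my-personal-bucket', 'my-key', parts)
  .then(data => console.log(data))
  .catch(err => console.log(err));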

Any thoughts on how I can get around this? Is this a bug with the SDK, or is there a workaround?

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Comments: 5 (3 by maintainers)

Top GitHub Comments

1 reaction
srchase commented, Aug 16, 2018

I’m going to close this issue. If you have a separate memory issue you feel needs addressing, please feel free to open that separately.

0 reactions
lock[bot] commented, Sep 28, 2019

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs and link to relevant comments in this thread.

