Browser s3.upload fails when the file or its contents are too large
TL;DR
I was able to get this to work if I omitted the FileReader object and set Body to the file input directly. However, I want to read the file first using FileReader in order to transform the data, and then send the transformed data to S3. Right now, reading the data with FileReader works fine and the browser does not crash; it is only once I take the contents of the file and send it to S3 with s3.upload() that the tab crashes. I'm not sure how to debug this.
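For context, the read-then-upload flow looks roughly like this; file-input and transformContents are placeholders for my actual element id and transformation:
// fileInput is an <input type="file"> element; transformContents is a
// placeholder for the transformation I apply to the raw text.
var file = document.getElementById('file-input').files[0];
var reader = new FileReader();

reader.onload = function () {
  // reader.result is the whole file as a string; this part works fine
  var fileContents = transformContents(reader.result);
  var s3 = new AWS.S3();
  // this is the call that crashes the tab for large files
  s3.upload({Bucket: 'my-personal-bucket', Key: 'my-key', Body: fileContents}, function (err, data) {
    console.log(err, data);
  });
};

reader.readAsText(file); // reading alone works, even for the 322 MB csv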
What I’ve tried
I'm seeing a similar issue to https://github.com/aws/aws-sdk-js/issues/1865 when using this library in Chrome. In the browser I can upload up to 66 MB, but the upload fails somewhere between 66 MB and 132 MB. I hit this while trying to upload a CSV with 1 million rows (roughly 322 MB), so I retried starting with a 1 MB file and doubled the size each time; 66 MB succeeded and 132 MB failed. The function I'm using is simple, where Body is the string contents from the FileReader object.
var Bucket = 'my-personal-bucket';
var Key = 'my-key';
// fileContents is the (transformed) string from the FileReader
var params = {Bucket, Key, Body: fileContents};
// 10 MB parts, up to 10 parts in flight at once
var options = {partSize: 10 * 1024 * 1024, queueSize: 10};
var s3 = new AWS.S3();
s3.upload(params, options, function(err, data) {
  console.log(err, data);
});
I also tried a concurrency of queueSize: 1, as the s3.upload() docs recommend.
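One other thing I'm wondering about, though I haven't confirmed it helps: wrapping the transformed string in a Blob before handing it to s3.upload(), so the Body is a Blob rather than one giant JavaScript string.
// Untested idea: pass the transformed contents as a Blob instead of a string.
var blobBody = new Blob([fileContents], {type: 'text/csv'});
var blobParams = {Bucket: 'my-personal-bucket', Key: 'my-key', Body: blobBody};
s3.upload(blobParams, {partSize: 10 * 1024 * 1024, queueSize: 1}, function(err, data) {
  console.log(err, data);
});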
Manually chunking the contents seems to work, but then I wind up with multiple objects that I'd like to combine, and there's no way to combine them using the SDK (https://github.com/aws/aws-sdk-js/issues/2084), unless the multipart APIs sketched further below would cover that.
// https://stackoverflow.com/a/29202760
function chunkSubstr(str, size) {
  const numChunks = Math.ceil(str.length / size)
  const chunks = new Array(numChunks)
  for (let i = 0, o = 0; i < numChunks; ++i, o += size) {
    chunks[i] = str.substr(o, size)
  }
  return chunks
}
// 50 MB per chunk
var chunkSize = 50 * 1024 * 1024;
// split the contents into parts
var parts = chunkSubstr(fileContents, chunkSize);
// upload each part as s3://my-personal-bucket/my-key.idx.txt where idx is the index
parts.forEach((part, idx) => {
  var params = {Bucket: 'my-personal-bucket', Key: `my-key.${idx}.txt`, Body: part};
  s3.upload(params, function(err, data) {
    console.log(err, data);
  });
});
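If manual chunking is the way to go, I assume the lower-level multipart calls (createMultipartUpload, uploadPart, completeMultipartUpload) would let me keep the chunks as parts of a single object instead of separate files. A rough, untested sketch of what I have in mind, reusing the parts array from above:
// Rough, untested sketch: upload each chunk as a part of one multipart upload
// so the result is a single s3://my-personal-bucket/my-key object.
var bucket = 'my-personal-bucket';
var key = 'my-key';

s3.createMultipartUpload({Bucket: bucket, Key: key}, function(err, mpu) {
  if (err) { return console.log(err); }

  var completedParts = [];
  var remaining = parts.length;

  parts.forEach(function(part, idx) {
    var partParams = {
      Bucket: bucket,
      Key: key,
      UploadId: mpu.UploadId,
      PartNumber: idx + 1, // part numbers are 1-based
      Body: part
    };
    s3.uploadPart(partParams, function(err, data) {
      if (err) { return console.log(err); }
      completedParts[idx] = {ETag: data.ETag, PartNumber: idx + 1};
      if (--remaining === 0) {
        // all parts uploaded; stitch them into one object
        s3.completeMultipartUpload({
          Bucket: bucket,
          Key: key,
          UploadId: mpu.UploadId,
          MultipartUpload: {Parts: completedParts}
        }, function(err, data) {
          console.log(err, data);
        });
      }
    });
  });
});
Every part except the last has to be at least 5 MB, which the 50 MB chunks satisfy, so my main question is whether this is the recommended path or whether the string Body itself is the problem.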
Any thoughts on how I can get around this? Is this a bug in the SDK, or is there a workaround?
I’m going to close this issue. If you have a separate memory issue you feel needs addressing, please feel free to open that separately.
This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs and link to relevant comments in this thread.