
Impossible to upload a big file (+200mb)

See original GitHub issue

Hi,

Following the examples shown in #80, I got some errors using this implementation:

const uploadWithSDK = file => {
  const dbx = new Dropbox({
    accessToken: 'mytoken'
  });

  return dbx.filesUploadSessionStart({
    contents: file,
    close: false,
  })
    .then(function (fileStart) {
      console.log('session started');

      return dbx.filesUploadSessionFinishBatch({
        entries: [
          {
            cursor: {
              session_id: fileStart.session_id,
              // a File has no .length; the offset is its size in bytes
              offset: file.size
            },
            commit: {
              path: `/${file.name}`
            }
          }
        ]
      });
    })
    .then(function (finishBatch) {
      console.log('on finish batch', finishBatch);

      return dbx.filesUploadSessionFinishBatchCheck({
        async_job_id: finishBatch.async_job_id
      });
    })
    .catch(function (err) {
      console.log(err, 'during session upload');
    });
};

I get the following errors in Chrome, for the reasons stated in #111:

Refused to set unsafe header "accept-encoding"
Refused to set unsafe header "user-agent"
Refused to set unsafe header "content-length"

Response to preflight request doesn't pass access control check: 
The value of the 'Access-Control-Allow-Origin' header in the response must not be the wildcard '*' 
when the request's credentials mode is 'include'. 

Origin 'http://localhost:3000' is therefore not allowed access. 
The credentials mode of requests initiated by the XMLHttpRequest is controlled by the withCredentials attribute.

Currently, the SDK doesn’t allow passing any request headers, so I can’t use session uploads at all.

My question is: if it is doable, how?
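For what it’s worth, one possible workaround (a sketch, not something confirmed in this thread) is to skip the SDK for the session calls and POST to the documented HTTP endpoint `https://content.dropboxapi.com/2/files/upload_session/start` directly with `fetch`, which only sets headers the browser permits; the auth token and API arguments travel in the `Authorization` and `Dropbox-API-Arg` headers. The helper name below is hypothetical:

```javascript
// Hypothetical helper: build the fetch options for the upload_session/start
// endpoint. Only browser-safe headers are set; the API arguments are
// JSON-encoded into the Dropbox-API-Arg header.
function buildSessionStartRequest(accessToken) {
  return {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer ' + accessToken,
      'Dropbox-API-Arg': JSON.stringify({ close: false }),
      'Content-Type': 'application/octet-stream'
    }
  };
}

// usage in the browser, with `chunk` being a Blob slice of the file:
// fetch('https://content.dropboxapi.com/2/files/upload_session/start',
//   Object.assign(buildSessionStartRequest('mytoken'), { body: chunk }))
//   .then(function (res) { return res.json(); })
//   .then(function (json) { console.log(json.session_id); });
```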

Issue Analytics

  • State: closed
  • Created: 6 years ago
  • Comments: 17 (7 by maintainers)

Top GitHub Comments

1 reaction
lgrrealag commented, Jun 6, 2017

There might be a better way to solve this, but here is how I solved it. It seemed the only way to get Dropbox not to error on a large upload was to upload each part one at a time, calling the function recursively. I am by no means a JavaScript expert, but this has been working for us on file uploads as large as 900 MB.

if (file.size > 100000000) {
	// file over 100mb: break it into parts and upload them one at a time
	var fileparts = filechunker(file);
	var fileoffset = 0;

	dbx.filesUploadSessionStart({
		contents: fileparts[0],
		close: false,
	})
	.then(function (response) {
		var fileid = response;
		fileoffset = fileoffset + fileparts[0].size;

		// once all appends are done, run the finish
		var filefinish = function () {
			dbx.filesUploadSessionFinish({
				contents: fileparts[fileparts.length - 1],
				cursor: {
					session_id: fileid.session_id,
					offset: (fileparts.length - 1) * 1000000
				},
				commit: {
					path: '/' + file.name,
					mode: 'overwrite'
				}
			})
			.then(function (response) {
				// get the sharing link of the file that was uploaded
				dbx.sharingCreateSharedLink({ path: response.path_display })
					.then(function (response) {
					})
					.catch(function (error) {
					});
			})
			.catch(function (err) {
			});
		};

		if (fileparts.length > 2) {
			// append the middle parts recursively, one at a time
			var fileappends = function (startkey) {
				var endkey = fileparts.length - 2;

				dbx.filesUploadSessionAppend({
					contents: fileparts[startkey],
					offset: startkey * 1000000,
					session_id: fileid.session_id
				})
				.then(function (response) {
					if (startkey == endkey) {
						// we have done all of them, so finish the session
						filefinish();
						return 'complete';
					}
					else {
						return fileappends(startkey + 1);
					}
				})
				.catch(function (err) {
					console.log(err, 'on append');
				});
			};
			// this starts recursively uploading the parts
			fileappends(1);
		}
		else {
			filefinish();
		}
	})
	.catch(function (err) {
	});
}

// modified from https://stackoverflow.com/questions/32898082/splitting-a-file-into-chunks-with-javascript
function filechunker(file) {
	var chunkSize = 1000000; // roughly 1mb per chunk
	var fileSize = file.size;
	var chunks = Math.ceil(fileSize / chunkSize);
	var chunk = 0;

	var fileparts = new Array();
	while (chunk < chunks) {
		var offset = chunk * chunkSize;
		fileparts[chunk] = file.slice(offset, offset + chunkSize);
		chunk++;
	}
	return fileparts;
}
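One detail worth making explicit in the code above: each append’s `offset` must equal the number of bytes already uploaded. Using `startkey * 1000000` works only because every chunk except the last is exactly `chunkSize` bytes (`file.slice` returns a shorter final chunk). A small hypothetical helper computes the same offsets directly from the file size, so the invariant is easy to check:

```javascript
// Hypothetical helper: the byte offset at which each chunk starts, for a
// file of `fileSize` bytes split into `chunkSize`-byte slices.
function chunkOffsets(fileSize, chunkSize) {
  var offsets = [];
  for (var pos = 0; pos < fileSize; pos += chunkSize) {
    offsets.push(pos);
  }
  return offsets;
}
```

For a 2.5 MB file with 1 MB chunks this yields offsets 0, 1000000, and 2000000, matching `startkey * 1000000` for the appended and final parts.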

If there is a way to make this more efficient or faster, I would love to know, especially if the experts at Dropbox have a better way.

Otherwise hope this helps someone.
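The recursive promise chain above can also be written as a plain loop with async/await (a sketch using the same three SDK calls and the same 1 MB chunks; `dbx` and `fileparts` are assumed to come from the surrounding code, and `fileparts` must have at least two entries):

```javascript
// Sketch: the same sequential upload-session flow with async/await.
// fileparts[0] starts the session, the middle parts are appended in
// order, and the last part finishes the session with the commit info.
async function uploadChunked(dbx, fileparts, filePath) {
  var chunkSize = 1000000; // must match the size used by filechunker

  var start = await dbx.filesUploadSessionStart({
    contents: fileparts[0],
    close: false
  });

  // append the middle parts one at a time, in order
  for (var i = 1; i < fileparts.length - 1; i++) {
    await dbx.filesUploadSessionAppend({
      contents: fileparts[i],
      offset: i * chunkSize,
      session_id: start.session_id
    });
  }

  // the final part closes the session and commits the file
  return dbx.filesUploadSessionFinish({
    contents: fileparts[fileparts.length - 1],
    cursor: {
      session_id: start.session_id,
      offset: (fileparts.length - 1) * chunkSize
    },
    commit: { path: filePath, mode: 'overwrite' }
  });
}
```

Because each `await` completes before the next append is issued, the parts go up strictly in order, which is what the recursive version was enforcing.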

1 reaction
vladejs commented, May 23, 2017

Hi @rymerrick, I have solved the problem; it only involves two steps:

  1. Remove your account from Dropbox
  2. Sign up on Amazon and use S3 (you can upload up to 5GB per request)

I am using it with zero problems in my Meteor app.

Read more comments on GitHub >

Top Results From Across the Web

Immediate 'Unable to upload' error for files over 200 MB
Solved: HELP! All of a sudden I cannot upload large files. I can still upload smaller files but anything over 200mb seems to...

Unable to upload a large file(200MB in size) in WebCenter ...
Hello, I am trying to upload a large file(200MB in size) in WebCenter Content 12c but getting the following error: Error.

Unable to upload a file of size 200mb to IPFS - Moralis Forum
I'm getting this error when I try to upload a 500MB video. Same code works fine with a smaller one (35MB). On IFPS...

How To Upload Large Files To Google Drive Quickly - MASV
Learn the best ways to upload to Google Drive large files quickly, including reasons why large uploads to Google Drive can fail. Read...

i cannot upload large files (i do have space for them!)
I cannot upload large file (like 700 mb movie file ) . at the starting time it is showing 1 minute ... Please...
