Manual chunk upload for GCS
Hello! Thank you for the work on this SDK; it has been working perfectly for us so far.
We are trying to manually upload chunks in the following way:

- The client calls our backend service which, in turn, calls the Google.Cloud.Storage.V1 APIs to initiate a chunked upload of a large file (multiple GBs). For this we use `client.CreateObjectUploader` and then `await uploader.InitiateSessionAsync`, which gives us a session URI:

```csharp
using MemoryStream emptyMemoryStream = new();
Google.Apis.Storage.v1.ObjectsResource.InsertMediaUpload uploader =
    client.CreateObjectUploader(command.BucketName, command.Key, GetMimeType(command.Key), emptyMemoryStream);
Uri uploadUri = await uploader.InitiateSessionAsync(cancellationToken);
```
- The client then uploads chunks of data, one by one, to our backend which, in turn, calls the Google.Cloud.Storage.V1 APIs. Every time the client uploads a chunk, we do the following:

```csharp
ResumableUpload actualUploader = ResumableUpload.CreateFromUploadUri(
    new Uri(request.ResumableUrl),
    new MemoryStream(Convert.FromBase64String(request.ContentAsBase64)));
IUploadProgress uploadProgress = await actualUploader.ResumeAsync(new Uri(request.ResumableUrl), cancellationToken);
```
Unfortunately, every time a new chunk is uploaded, it replaces the previous one. We are not sure what we are missing, as we have tried several approaches.
Can you please let us know if you have any idea what we are doing wrong?
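For context on what the SDK is doing under the hood: the underlying JSON API resumable protocol expects each chunk to be sent as a `PUT` to the session URI with a `Content-Range` header saying which bytes of the overall object the chunk covers, and the service answers 308 until the final chunk arrives. A chunk sent without that range information is treated as the whole object, which is a likely explanation for the replace-the-previous-chunk behaviour described above. A minimal sketch of the raw protocol (the `SendChunkAsync` helper and its parameters are illustrative, not part of the SDK):

```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

// Hypothetical helper: PUTs one chunk of a resumable upload session.
// `offset` is the zero-based position of the chunk's first byte within the
// whole object; `totalSize` is the full object length, or null if not yet
// known (in which case the range total is sent as "*").
static async Task<HttpStatusCode> SendChunkAsync(
    HttpClient http, Uri sessionUri, byte[] chunk, long offset, long? totalSize)
{
    var content = new ByteArrayContent(chunk);
    long to = offset + chunk.Length - 1;
    content.Headers.ContentRange = totalSize is long total
        ? new ContentRangeHeaderValue(offset, to, total)  // "bytes offset-to/total"
        : new ContentRangeHeaderValue(offset, to);        // "bytes offset-to/*"

    using var response = await http.SendAsync(
        new HttpRequestMessage(HttpMethod.Put, sessionUri) { Content = content });

    // 308 (Resume Incomplete) means the chunk was accepted and more bytes are
    // expected; 200/201 means the object is now complete.
    return response.StatusCode;
}
```

Note that every chunk except the last must be a multiple of 256 KiB, per the Cloud Storage resumable upload documentation.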
Issue Analytics
- State:
- Created: 2 months ago
- Comments: 11
Top Results From Across the Web

- Perform resumable uploads | Cloud Storage — Create a chunk of data from the overall data you want to upload. The chunk size should be a multiple of 256 KiB...
- Resumable uploads | Cloud Storage — Introduction; How tools and APIs use resumable uploads; Resumable uploads of unknown size; Upload performance. Choosing session regions; Uploading in chunks.
- How to send large file as chunked requests under ... — From documentation, the recommended way of sending a large file on GCS is via resumable uploads. Our use case would be sending a...
- How to Upload Large Files to Google Cloud Storage — Divide the file into smaller chunks. The ideal chunk size will depend on the size of the file and the available network bandwidth...
- Upload Large Files Directly to GCS with Dropzone and Signed ... — I leveraged the chunking feature from the dropzone and send files in chunks of 30MB (which GAE allowed), once all of the chunks...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
On a mobile, so briefly - you call Execute or ExecuteAsync on the request.
I suspect that client.Service.Objects.Compose(…) is a simpler way to get a request, too - but what you’ve got should work.
+1 to what @jskeet has said. You can use https://cloud.google.com/storage/docs/composing-objects#create-composite-client-libraries as a reference.
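The compose-based approach the comments point at would look roughly like this: upload each chunk as its own temporary object, then stitch the pieces together server-side with a compose request. A hedged sketch, assuming the chunk objects already exist (the bucket, key, and part names are illustrative):

```csharp
using System.Linq;
using Google.Apis.Storage.v1.Data;
using Google.Cloud.Storage.V1;

StorageClient client = StorageClient.Create();

// Assumes each chunk was already uploaded as its own object under names
// like these (illustrative), in the order they should be concatenated.
var tempObjectNames = new[] { "my-key.part-000", "my-key.part-001" };

var composeRequest = new ComposeRequest
{
    SourceObjects = tempObjectNames
        .Select(name => new ComposeRequest.SourceObjectsData { Name = name })
        .ToList(),
    Destination = new Google.Apis.Storage.v1.Data.Object
    {
        ContentType = "application/octet-stream",
    },
};

// client.Service exposes the underlying generated API surface; as noted in
// the comment above, you must call Execute/ExecuteAsync on the request.
Google.Apis.Storage.v1.Data.Object composed = await client.Service.Objects
    .Compose(composeRequest, "my-bucket", "my-key")
    .ExecuteAsync();
```

A single compose call accepts at most 32 source objects, so very large uploads may need to compose in stages, and the temporary chunk objects should be deleted once the composite exists.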