
Manual chunk upload for GCS


Hello! Thank you for the work on this SDK; it has been working perfectly for us so far.

We are trying to upload chunks manually in the following way:

  1. The client calls our backend service, which in turn calls the Google.Cloud.Storage.V1 APIs to initiate a chunked upload of a large file (multiple GBs). For this we use client.CreateObjectUploader and then await uploader.InitiateSessionAsync, which gives us a session URI:
using MemoryStream emptyMemoryStream = new();
// The stream is empty because we only need the session URI here; the actual
// bytes are sent later, chunk by chunk.
Google.Apis.Storage.v1.ObjectsResource.InsertMediaUpload uploader = client.CreateObjectUploader(command.BucketName, command.Key, GetMimeType(command.Key), emptyMemoryStream);

// The resumable session URI that subsequent chunk uploads will target.
Uri uploadUri = await uploader.InitiateSessionAsync(cancellationToken);
  2. The client uploads chunks of data manually, one by one, to our backend, which in turn calls the Google.Cloud.Storage.V1 APIs. Every time the client uploads a chunk of data, we do the following:
// Recreate an uploader from the stored session URI, wrapping only the newly
// received chunk in a fresh stream.
ResumableUpload actualUploader = ResumableUpload.CreateFromUploadUri(new Uri(request.ResumableUrl), new MemoryStream(Convert.FromBase64String(request.ContentAsBase64)));

// Resume the session, expecting this chunk to be appended to the object.
IUploadProgress uploadProgress = await actualUploader.ResumeAsync(new Uri(request.ResumableUrl), cancellationToken);

Unfortunately, every time a new chunk is uploaded, it replaces the previous one instead of being appended. We are not sure what we are missing, as we have tried a few variations of this approach.

Can you please let us know if you have any idea what we are doing wrong?
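
For reference, the raw protocol behind the session URI: each chunk is sent as a PUT to the session URI with a Content-Range header giving the chunk's byte offsets, and every non-final chunk must be a multiple of 256 KiB. ResumableUpload.CreateFromUploadUri appears to treat the stream you hand it as the entire object content, which is likely why each call overwrites the previous chunk. Below is a minimal sketch of sending one chunk by hand; SendChunkAsync, offset, and totalSize are illustrative names, not part of the SDK:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading;
using System.Threading.Tasks;

// Sends one chunk to an existing resumable session. offset is how many bytes
// the server has already accepted; totalSize is the final object size.
static async Task<HttpResponseMessage> SendChunkAsync(
    HttpClient http, Uri sessionUri, byte[] chunk, long offset, long totalSize, CancellationToken ct)
{
    var content = new ByteArrayContent(chunk);
    // e.g. "bytes 0-262143/1048576" for the first 256 KiB of a 1 MiB object.
    content.Headers.ContentRange = new ContentRangeHeaderValue(offset, offset + chunk.Length - 1, totalSize);
    var request = new HttpRequestMessage(HttpMethod.Put, sessionUri) { Content = content };
    // GCS answers 308 ("resume incomplete") while it expects more chunks.
    return await http.SendAsync(request, ct);
}

GCS replies 308 with a Range header while the upload is incomplete and 200 or 201 once the final chunk lands; if the total size is unknown up front, the length part of Content-Range can be sent as * until the last chunk.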

Top GitHub Comments

2 reactions
jskeet commented, Aug 1, 2023

On a mobile, so briefly - you call Execute or ExecuteAsync on the request.

I suspect that client.Service.Objects.Compose(…) is a simpler way to get a request, too - but what you’ve got should work.
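
To make that concrete, here is a minimal sketch of the compose route, assuming each chunk was first uploaded as its own object; ComposeChunksAsync, chunkNames, and contentType are illustrative names, not part of the thread:

using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Google.Cloud.Storage.V1;
using ApiComposeRequest = Google.Apis.Storage.v1.Data.ComposeRequest;
using ApiObject = Google.Apis.Storage.v1.Data.Object;

static async Task<ApiObject> ComposeChunksAsync(
    StorageClient client, string bucketName, string destinationName,
    IReadOnlyList<string> chunkNames, string contentType, CancellationToken ct)
{
    var body = new ApiComposeRequest
    {
        // Compose accepts at most 32 source objects per call; larger uploads
        // need intermediate composes.
        SourceObjects = chunkNames
            .Select(name => new ApiComposeRequest.SourceObjectsData { Name = name })
            .ToList(),
        Destination = new ApiObject { ContentType = contentType }
    };
    // client.Service is the underlying generated StorageService; as noted
    // above, nothing is sent until Execute/ExecuteAsync is called.
    return await client.Service.Objects.Compose(body, bucketName, destinationName).ExecuteAsync(ct);
}

After composing, the per-chunk objects can be deleted, e.g. with client.DeleteObjectAsync.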

1 reaction
hemanshv commented, Aug 1, 2023

