Batch uploading of files
See original GitHub issue.

I'm confused about this comment: https://github.com/GoogleCloudPlatform/google-cloud-python/blob/master/storage/google/cloud/storage/blob.py#L589
What is the correct way of uploading blobs in a batch?
client = storage.Client.from_service_account_json('client-secret.json', project=PROJECT_NAME)
bucket = client.get_bucket(BUCKET_NAME)
with client.batch():
    for i in range(10):
        with open('base.py', 'rb') as my_file:
            blob = storage.Blob('/test/{}'.format(i), bucket)
            blob.upload_from_file(my_file, client=client)
This, of course, won't work, because upload_from_file uses client._base_connection instead of client.current_batch.
Issue Analytics
- Created 7 years ago
- Reactions: 1
- Comments: 7 (3 by maintainers)
@arvindnrbt with consulting hat on: "It depends…". You could: fill a queue.Queue with the filenames and have worker threads upload items from the queue. You can't: batch the uploads themselves, because the Cloud Storage JSON API does not support media upload or download operations inside a batch request.
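A minimal sketch of that queue-and-workers pattern. The upload_blob helper here is a hypothetical stand-in for the real upload call; in actual use each worker would create its own storage.Client and call blob.upload_from_file (or upload_from_filename) with it:

```python
import queue
import threading

def upload_blob(filename):
    # Hypothetical stand-in: in real code, build a storage.Client per
    # worker and call bucket.blob(filename).upload_from_filename(filename).
    return filename

def worker(q, results):
    # Pull filenames off the queue until it is drained.
    while True:
        try:
            name = q.get_nowait()
        except queue.Empty:
            return
        results.append(upload_blob(name))
        q.task_done()

def upload_all(filenames, num_workers=4):
    q = queue.Queue()
    for name in filenames:
        q.put(name)
    results = []  # list.append is thread-safe in CPython
    threads = [threading.Thread(target=worker, args=(q, results))
               for _ in range(num_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

uploaded = upload_all(['file{}.txt'.format(i) for i in range(10)])
print(len(uploaded))  # 10
```

Because each worker only pulls from the shared queue, the number of workers can be tuned independently of the number of files.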
For anyone who might want to batch upload/download: I wanted to speed this process up, and it turned out that spawning a few threads and using a different google storage client instance in each improved the speed significantly. Tested on a 6-core machine. I just needed to partition the list of files to upload/download myself. Hope this is helpful for some.
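A sketch of that partitioning approach, with a hypothetical upload_chunk standing in for the real per-client upload loop (in real code it would construct a fresh storage.Client and upload every file in its chunk):

```python
from concurrent.futures import ThreadPoolExecutor

def partition(items, n):
    # Split items into n roughly equal chunks, one per worker/client.
    return [items[i::n] for i in range(n)]

def upload_chunk(chunk):
    # Hypothetical: in real code, create a storage.Client here and
    # upload each file in the chunk with that client.
    return len(chunk)

files = ['file{}.bin'.format(i) for i in range(25)]
chunks = partition(files, 6)  # e.g. one chunk per core on a 6-core machine
with ThreadPoolExecutor(max_workers=len(chunks)) as pool:
    counts = list(pool.map(upload_chunk, chunks))
print(sum(counts))  # 25
```

Giving each thread its own client avoids contention on a shared HTTP connection pool, which is likely why the per-client approach was faster.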