Google Cloud Storage failing when using threads
See original GitHub issue

- Ubuntu 16.04
- Python 2.7.6
- google-api-python-client>=1.6.2 and google-cloud-storage>=1.1.1
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python2.7/multiprocessing/pool.py", line 251, in map
return self.map_async(func, iterable, chunksize).get()
File "/usr/lib/python2.7/multiprocessing/pool.py", line 558, in get
raise self._value
ssl.SSLError: [Errno 1] _ssl.c:1429: error:1408F10B:SSL routines:SSL3_GET_RECORD:wrong version number
- The client is not thread safe (I think)
from multiprocessing.pool import ThreadPool
from google.cloud import storage
from functools import partial
def upload(bucket, i):
    blob = bucket.blob("file{}.png".format(i))
    blob.upload_from_string("blabla")
    blob.make_public()
    return blob.public_url

bucket = storage.Client().get_bucket("deepo-test")
pool = ThreadPool()
fct = partial(upload, bucket)
pool.map(fct, [i for i in range(2)])
Issue Analytics
- State:
- Created 6 years ago
- Reactions:4
- Comments:12 (9 by maintainers)
Top Results From Across the Web
Troubleshoot Dataflow errors - Google Cloud
If you run into problems with your Dataflow pipeline or job, this page lists error messages that you might see and provides suggestions...
Read more >

gsutil failing on worker_thread.start() - Stack Overflow
I have been using gsutil for the past 3 months. I noticed an error that occurs randomly from time to time, e.g. sometimes...
Read more >

Backup failing for Google Cloud Storage - Duplicacy Forum
The -threads option only sets the number of uploading threads. There is only one chunking thread that splits files into chunks and then ...
Read more >

Bottleneck while uploading lots of files to GCP bucket in a ...
google.cloud.storage.StorageException: Connect timed out," exception when I exceed 25 threads. Going beyond 60 threads, the programs don' ...
Read more >

Storageclient.ListObjects causes thread cancel error when ...
I have some code that fetches files from Google Cloud Storage. ... LogToConsole($" - Failed to download attachment with name {fileName}"); ...
Read more >
You need to create a new client connection for every pool / thread, inside def upload(i). That will work.

from multiprocessing.pool import ThreadPool
from google.cloud import storage

def upload(i):
    bucket = storage.Client().get_bucket("deepo-test")
    blob = bucket.blob("file{}.png".format(i))
    blob.upload_from_string("blabla")
    blob.make_public()
    return blob.public_url

pool = ThreadPool()
pool.map(upload, [i for i in range(2)])
@Alexis-Jacob That’s correct, the error you are seeing is caused by the lack of thread-safety in httplib2. We recommend (for now) creating an instance of Client that is local to your thread / process.