Upload models using transformers-cli fails
Environment info
- transformers version: 3.0.2
- Platform: Linux-4.15.0-112-generic-x86_64-with-glibc2.10
- Python version: 3.8.5
- PyTorch version (GPU?): 1.6.0 (False)
- Tensorflow version (GPU?): 2.3.0 (False)
- Using GPU in script?: No
- Using distributed or parallel set-up in script?: No
Who can help
Model Cards: @julien-c
T5: @patrickvonplaten
Information
Model I am using: T5
The problem arises when using:
- the official example scripts: (give details below)
- my own modified scripts: (give details below)
The task I am working on is:
- an official GLUE/SQuAD task: (give the name)
- my own task or dataset: (give details below)
To reproduce
Steps to reproduce the behavior:
Command:
transformers-cli upload ./prot_t5_xl_bfd/ --organization Rostlab
Error:
About to upload file /mnt/lsf-nas-1/lsf/job/repo/elnaggar/prot-transformers/models/transformers/prot_t5_xl_bfd/pytorch_model.bin to S3 under filename prot_t5_xl_bfd/pytorch_model.bin and namespace Rostlab
Proceed? [Y/n] y
Uploading... This might take a while if files are large
0%|▌ | 48242688/11276091454 [00:02<14:55, 12534308.31it/s]
Traceback (most recent call last):
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/site-packages/urllib3/connectionpool.py", line 670, in urlopen
httplib_response = self._make_request(
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/site-packages/urllib3/connectionpool.py", line 392, in _make_request
conn.request(method, url, **httplib_request_kw)
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/http/client.py", line 1255, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/http/client.py", line 1301, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/http/client.py", line 1250, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/http/client.py", line 1049, in _send_output
self.send(chunk)
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/http/client.py", line 971, in send
self.sock.sendall(data)
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/ssl.py", line 1204, in sendall
v = self.send(byte_view[count:])
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/ssl.py", line 1173, in send
return self._sslobj.write(data)
BrokenPipeError: [Errno 32] Broken pipe
Traceback (most recent call last):
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/site-packages/requests/adapters.py", line 439, in send
resp = conn.urlopen(
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/site-packages/urllib3/connectionpool.py", line 726, in urlopen
retries = retries.increment(
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/site-packages/urllib3/util/retry.py", line 403, in increment
raise six.reraise(type(error), error, _stacktrace)
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/site-packages/urllib3/packages/six.py", line 734, in reraise
raise value.with_traceback(tb)
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/site-packages/urllib3/connectionpool.py", line 670, in urlopen
httplib_response = self._make_request(
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/site-packages/urllib3/connectionpool.py", line 392, in _make_request
conn.request(method, url, **httplib_request_kw)
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/http/client.py", line 1255, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/http/client.py", line 1301, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/http/client.py", line 1250, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/http/client.py", line 1049, in _send_output
self.send(chunk)
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/http/client.py", line 971, in send
self.sock.sendall(data)
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/ssl.py", line 1204, in sendall
v = self.send(byte_view[count:])
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/ssl.py", line 1173, in send
return self._sslobj.write(data)
urllib3.exceptions.ProtocolError: ('Connection aborted.', BrokenPipeError(32, 'Broken pipe'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/bin/transformers-cli", line 8, in <module>
sys.exit(main())
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/site-packages/transformers/commands/transformers_cli.py", line 33, in main
service.run()
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/site-packages/transformers/commands/user.py", line 232, in run
access_url = self._api.presign_and_upload(
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/site-packages/transformers/hf_api.py", line 167, in presign_and_upload
r = requests.put(urls.write, data=data, headers={"content-type": urls.type})
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/site-packages/requests/api.py", line 134, in put
return request('put', url, data=data, **kwargs)
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/site-packages/requests/api.py", line 61, in request
return session.request(method=method, url=url, **kwargs)
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/site-packages/requests/sessions.py", line 530, in request
resp = self.send(prep, **send_kwargs)
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/site-packages/requests/sessions.py", line 643, in send
r = adapter.send(request, **kwargs)
File "/mnt/lsf-nas-1/lsf/job/repo/elnaggar/anaconda3/envs/transformers_covid/lib/python3.8/site-packages/requests/adapters.py", line 498, in send
raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', BrokenPipeError(32, 'Broken pipe'))
Expected behavior
I am trying to upload our T5-3B model using transformers-cli, but it always fails with a BrokenPipeError. Small files such as the configuration upload fine; only the large model weight files fail. I have tried two different machines, and both give the same error.
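Until the connection issue is resolved, one possible workaround is to wrap the upload call in a retry loop with exponential backoff. The sketch below is hypothetical and not part of transformers-cli; `upload_fn` is assumed to be a closure that re-creates the request (re-opening the file) on each attempt, since a partially consumed file object cannot be re-sent.

```python
import time

import requests


def put_with_retries(upload_fn, max_retries=5, base_delay=2.0):
    """Call upload_fn(), retrying on dropped connections with exponential backoff.

    upload_fn must be safe to call repeatedly, e.g. a closure that re-opens
    the file, because a half-consumed file object cannot be resent.
    """
    for attempt in range(max_retries):
        try:
            return upload_fn()
        except (requests.exceptions.ConnectionError, BrokenPipeError):
            if attempt == max_retries - 1:
                raise  # out of retries, surface the original error
            time.sleep(base_delay * 2 ** attempt)  # wait 2s, 4s, 8s, ...
```

For example, `put_with_retries(lambda: requests.put(url, data=open(path, "rb"), headers=headers))` would retry the whole PUT from the beginning on each failure; a truly resumable upload would need server-side support such as S3 multipart uploads.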
Issue Analytics
- State:
- Created: 3 years ago
- Comments: 14 (11 by maintainers)
No, it is indeed supposed to work as you describe: you can specify the directory from any point in your filesystem.
Let us know if that's not the case.
Let's leave it open 😃
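As a quick sanity check on the path handling discussed above, resolving the directory yourself shows that the relative and absolute forms point at the same place. This is plain stdlib path handling, not the CLI's internal logic; the directory name is taken from the command in the report.

```python
import os


def resolve_model_dir(path):
    """Expand ~ and normalize to an absolute path, so the result does not
    depend on the current working directory at upload time."""
    return os.path.abspath(os.path.expanduser(path))


# "./prot_t5_xl_bfd/" and its absolute form resolve identically
# when evaluated from the same working directory.
```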