
huggingface model download failed

See original GitHub issue

Hi, thanks for sharing this exciting work!

I’m having trouble downloading the model from Hugging Face.

When I download the tokenizer for hkunlp/from_all_T5_base_prefix_grailqa2, I get this error:

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("hkunlp/from_all_T5_base_prefix_grailqa2")

Traceback (most recent call last):
  File "<input>", line 4, in <module>
  File "/home2/xh/.conda/envs/skg/lib/python3.6/site-packages/transformers/models/auto/tokenization_auto.py", line 416, in from_pretrained
    return tokenizer_class_fast.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/home2/xh/.conda/envs/skg/lib/python3.6/site-packages/transformers/tokenization_utils_base.py", line 1705, in from_pretrained
    resolved_vocab_files, pretrained_model_name_or_path, init_configuration, *init_inputs, **kwargs
  File "/home2/xh/.conda/envs/skg/lib/python3.6/site-packages/transformers/tokenization_utils_base.py", line 1776, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/home2/xh/.conda/envs/skg/lib/python3.6/site-packages/transformers/models/t5/tokenization_t5_fast.py", line 136, in __init__
    **kwargs,
  File "/home2/xh/.conda/envs/skg/lib/python3.6/site-packages/transformers/tokenization_utils_fast.py", line 87, in __init__
    fast_tokenizer = TokenizerFast.from_file(fast_tokenizer_file)
Exception: Permission denied (os error 13)

The same problem happens with other checkpoints like from_all_T5_base_prefix_compwebq2, although downloading the model weights themselves works fine.

Looking forward to your reply, thanks!
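
A note on the traceback: Permission denied (os error 13) is an operating-system read error raised while TokenizerFast.from_file() opens the cached tokenizer file, so the files were resolved but are not readable by the current user. A minimal way to check for that (a sketch only, assuming the default cache location ~/.cache/huggingface/transformers; the TRANSFORMERS_CACHE or HF_HOME environment variables may point somewhere else):

import os

# Assumption: the default Hugging Face cache directory for this transformers
# version; adjust the path if TRANSFORMERS_CACHE or HF_HOME is set.
cache_dir = os.path.expanduser("~/.cache/huggingface/transformers")

for name in sorted(os.listdir(cache_dir)):
    path = os.path.join(cache_dir, name)
    # Any cached file the current user cannot read would surface as os error 13.
    if os.path.isfile(path) and not os.access(path, os.R_OK):
        print("not readable:", path)

If unreadable files show up (for example because the cache was populated by a different user), fixing their permissions or clearing the cache directory should let the fast tokenizer load again.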

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 5 (3 by maintainers)

Top GitHub Comments

2 reactions
Timothyxxx commented, Mar 12, 2022

(Or disable the fast tokenizer by passing use_fast=False.)
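
Spelled out, the workaround from the comment above (a sketch; use_fast=False makes AutoTokenizer fall back to the slow, SentencePiece-based T5 tokenizer, so TokenizerFast.from_file() and the permission error it raises are never reached):

from transformers import AutoTokenizer

# Load the slow (Python/SentencePiece) T5 tokenizer instead of the Rust-backed
# fast one; this skips reading the cached fast-tokenizer file that raised
# "Permission denied (os error 13)".
tokenizer = AutoTokenizer.from_pretrained(
    "hkunlp/from_all_T5_base_prefix_grailqa2",
    use_fast=False,
)

Note that the slow T5 tokenizer requires the sentencepiece package to be installed.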

2 reactions
cdhx commented, Mar 12, 2022

It works, thanks for your quick reply!

Read more comments on GitHub >

Top Results From Across the Web

  • Troubleshoot - Hugging Face: When your script attempts to download model weights or datasets, ... However, when you load the model file again, you may run into ...
  • Model can't be downloaded · Issue #8735 - GitHub: I used the following code to download the model: ... Could you try updating transformers to 3.5.0 to see if the error persists? ...
  • Unable to download huggingface model: Hello, I use Rasa v2.0.2. I have an error when I run rasa train nlu. Rasa can't download the huggingface model. I have this ...
  • How to download model from huggingface? - Stack Overflow: The accepted answer is good, but writing code to download the model is not always convenient. It seems git works fine for getting models from ...
  • huggingface-hub - PyPI: Client library to download and publish models, datasets and other repos on the huggingface.co hub.
