Can't load tokenizer for 'facebook/hubert-base-ls960'
Environment info
- transformers version: 4.12.2
- Platform: Mac
- Python version: 3.7
- PyTorch version (GPU?): 1.9
- Tensorflow version (GPU?): No
- Using GPU in script?: No
- Using distributed or parallel set-up in script?: No
I just ran this simple code to load the pretrained HuBERT base model:
from transformers import Wav2Vec2Processor, HubertForCTC
import torch
import librosa
PROCESSOR = Wav2Vec2Processor.from_pretrained('facebook/hubert-base-ls960')
model = HubertForCTC.from_pretrained('facebook/hubert-base-ls960')
And I got this error trace:
Traceback (most recent call last):
File "/Users/
PROCESSOR = Wav2Vec2Processor.from_pretrained('facebook/hubert-base-ls960')
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/transformers/models/wav2vec2/processing_wav2vec2.py", line 105, in from_pretrained
tokenizer = Wav2Vec2CTCTokenizer.from_pretrained(pretrained_model_name_or_path, **kwargs)
File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/transformers/tokenization_utils_base.py", line 1733, in from_pretrained
raise EnvironmentError(msg)
OSError: Can't load tokenizer for 'facebook/hubert-base-ls960'. Make sure that:
- 'facebook/hubert-base-ls960' is a correct model identifier listed on 'https://huggingface.co/models'
(make sure 'facebook/hubert-base-ls960' is not a path to a local directory with something else, in that case)
- or 'facebook/hubert-base-ls960' is the correct path to a directory containing relevant tokenizer files
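Since the error comes from the missing tokenizer files rather than a wrong model identifier, one workaround (a sketch, assuming the repo ships a preprocessor config and you only need hidden states rather than CTC transcription) is to load just the feature extractor and the bare `HubertModel`:

```python
# Sketch of a workaround: the checkpoint has no tokenizer files, but the
# bare acoustic model can still be loaded for feature extraction.
from transformers import Wav2Vec2FeatureExtractor, HubertModel

feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained("facebook/hubert-base-ls960")
model = HubertModel.from_pretrained("facebook/hubert-base-ls960")
```

`Wav2Vec2Processor` bundles a feature extractor with a tokenizer, which is why it fails here; the feature extractor alone has no such dependency.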
Issue Analytics
- Created: 2 years ago
- Reactions: 2
- Comments: 8 (4 by maintainers)
Hey @harrypotter90,

Note that facebook/hubert-base-ls960 is just the pretrained model and therefore does not have a tokenizer yet. You can create one yourself as shown in this blog post: https://huggingface.co/blog/fine-tune-wav2vec2-english

Hey @kroq-gar78,

Ah yeah, that's great feedback! I'll change that asap. The documentation should definitely use a model that has a tokenizer!
Also, I've updated the model card: https://huggingface.co/facebook/hubert-base-ls960, leaving a big note to make it clearer.
Thanks!
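Following the approach in that blog post, building your own tokenizer starts from a character vocabulary extracted from your training transcripts. A minimal sketch (the transcripts here are placeholders; in practice they come from your dataset):

```python
import json

# Placeholder transcripts; in practice these come from your training set,
# as in the fine-tune-wav2vec2-english blog post.
transcripts = ["HELLO WORLD", "HUBERT NEEDS A TOKENIZER"]

# Collect every character that appears in the transcripts.
vocab = sorted(set("".join(transcripts)))
vocab_dict = {ch: i for i, ch in enumerate(vocab)}

# Wav2Vec2CTCTokenizer represents the word delimiter as "|" rather than
# a space, and needs [UNK] and [PAD] entries.
vocab_dict["|"] = vocab_dict.pop(" ")
vocab_dict["[UNK]"] = len(vocab_dict)
vocab_dict["[PAD]"] = len(vocab_dict)

with open("vocab.json", "w") as f:
    json.dump(vocab_dict, f)

# Then (assuming transformers is installed):
# tokenizer = Wav2Vec2CTCTokenizer("vocab.json", unk_token="[UNK]",
#                                  pad_token="[PAD]", word_delimiter_token="|")
```

The resulting tokenizer can be combined with the model's feature extractor into a `Wav2Vec2Processor` for fine-tuning.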