Can't load feature extractor for 'facebook/wav2vec2-large-xlsr-53'
Hi @Gastron, @TParcollet, I was very happy to come across the SpeechBrain toolkit for my master's project on African ASR. However, I have been struggling for over 24 hours to train a model using the FONGBE dataset, which I downloaded and which is supported by SpeechBrain. Below is the actual error I am getting:
f"Can't load feature extractor for '{pretrained_model_name_or_path}'. If you were trying to load it "
OSError: Can't load feature extractor for 'facebook/wav2vec2-large-xlsr-53'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'facebook/wav2vec2-large-xlsr-53' is the correct path to a directory containing a preprocessor_config.json file
This happens when I run the following command to start training:
python train_with_wav2vec2.py hparams/train_fon_with_wav2vec.yaml --data_folder=/localscratch/ALFFA_PUBLIC/ASR/FONGBE/data/
According to the documentation on the SpeechBrain Hugging Face page, my YAML configuration is this:
# Url for xlsr wav2vec2
#wav2vec2_hub: C:\Users\nathk\Documents\TacoProject\brain\speechbrain\xlsr_53_56k.pt
wav2vec2_hub: facebook/wav2vec2-large-xlsr-53
#wav2vec2_hub: C:/Users/nathk/Documents/TacoProject/brain/speechbrain/recipes/DVoice/ASR/CTC/results/wav2vec2_ctc_FONGBE/1250/save/wav2vec2_checkpoint/config.json
# Data files
data_folder: /localscratch/ALFFA_PUBLIC/ASR/FONGBE/data
train_csv_file: !ref <data_folder>/train.csv
I have also manually downloaded the wav2vec2-large-xlsr-53 model and then replaced the path in the YAML file like this:
wav2vec2_hub: C:\Users\nathk\Documents\TacoProject\brain\speechbrain\xlsr_53_56k.pt
But still, nothing seems to work.
Could you please assist me? The FONGBE documentation link I am using: https://huggingface.co/speechbrain/asr-wav2vec2-dvoice-fongbe
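The OSError above says `from_pretrained` expects either a Hub model id or a local *directory* containing `preprocessor_config.json`; pointing `wav2vec2_hub` at a single checkpoint file such as `xlsr_53_56k.pt` cannot satisfy that. As a quick sanity check before editing the YAML, one could verify that a local path actually looks like a model directory. This is a minimal sketch; `check_model_dir` and `REQUIRED_FILES` are hypothetical helpers, not part of SpeechBrain or Transformers:

```python
import os

# Files the feature-extractor loader looks for in a local model directory
# (assumption based on the error message above, not an exhaustive list).
REQUIRED_FILES = ["preprocessor_config.json", "config.json"]

def check_model_dir(path):
    """Return the required files missing from a local model directory."""
    if not os.path.isdir(path):
        # A lone .pt file, or a non-existent path, is not a model directory.
        return list(REQUIRED_FILES)
    return [f for f in REQUIRED_FILES
            if not os.path.isfile(os.path.join(path, f))]
```

If `check_model_dir` returns a non-empty list for the path you put in `wav2vec2_hub`, the loader will fail with exactly this kind of OSError.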
Issue Analytics
- State:
- Created: a year ago
- Comments: 8
Top GitHub Comments
Yes, but make sure that it contains ALL the files from the repo 😃
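In other words, for a local copy the `wav2vec2_hub` setting should point at the folder holding the complete repo contents rather than at a single `.pt` checkpoint. A sketch of what that could look like (the path below is illustrative, reusing the asker's directory layout):

```yaml
# Illustrative local path: the folder must contain ALL the files from the
# repo, e.g. config.json, preprocessor_config.json, and the model weights.
wav2vec2_hub: C:/Users/nathk/Documents/TacoProject/brain/speechbrain/wav2vec2-large-xlsr-53
```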
Hello,
it seems there has been no activity for a long time, so I am closing this issue.
Feel free to reopen it if you still encounter the problem. Thank you. 🙂