No such file or directory while opening './4-gram.arpa.gz' KenLM
Hi @leo19941227, could you please help with this error? Is there anything to consider while installing KenLM? I followed this guide for the installation: https://medium.com/tekraze/install-kenlm-binaries-on-ubuntu-language-model-inference-tool-33507000f33. I have checked and the file exists on my system (I placed it in the same root directory to make sure the problem is not with the path).
Traceback (most recent call last):
File "run_downstream.py", line 206, in <module>
main()
File "run_downstream.py", line 201, in main
runner = Runner(args, config)
File "/home/ai-labs/Desktop/ASR/s3prl/s3prl/downstream/runner.py", line 52, in __init__
self.downstream = self._get_downstream()
File "/home/ai-labs/Desktop/ASR/s3prl/s3prl/downstream/runner.py", line 129, in _get_downstream
model = Downstream(
File "/home/ai-labs/Desktop/ASR/s3prl/s3prl/downstream/asr/expert.py", line 99, in __init__
self.decoder = get_decoder(decoder_args, self.dictionary)
File "/home/ai-labs/Desktop/ASR/s3prl/s3prl/downstream/asr/expert.py", line 36, in get_decoder
return W2lKenLMDecoder(decoder_args, dictionary)
File "/home/ai-labs/Desktop/ASR/s3prl/s3prl/downstream/asr/w2l_decoder.py", line 127, in __init__
self.lm = KenLM('./4-gram.arpa.gz', self.word_dict)
RuntimeError: /home/ai-labs/Desktop/ASR/kenlm/util/file.cc:76 in int util::OpenReadOrThrow(const char*) threw ErrnoException because `-1 == (ret = open(name, 00))'.
No such file or directory while opening ./4-gram.arpa.gz
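The `ErrnoException` from `util::OpenReadOrThrow` means the plain `open()` system call failed, so the relative path `./4-gram.arpa.gz` was resolved against the *current working directory of the Python process*, not against the directory the script or model lives in. A quick way to see what path is actually being opened (a diagnostic sketch, not part of s3prl):

```python
import os

# The relative path hard-coded in w2l_decoder.py.
lm_path = "./4-gram.arpa.gz"

# This is the file the OS actually tries to open: the relative
# path joined with the process's current working directory.
resolved = os.path.abspath(lm_path)
print("resolved path:", resolved)
print("exists:", os.path.exists(resolved))
```

If `exists` prints `False`, the process was launched from a directory that does not contain the file, which matches the traceback.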
Issue Analytics
- Created: a year ago
- Comments: 6 (3 by maintainers)
You should use --mode evaluate instead of --mode inference.
Can you try to fill the config field with the absolute path?