ImportError: cannot import name 'MBart50TokenizerFast' from 'transformers' (unknown location)
Environment info
- transformers version: 4.3.2
- Platform: Linux-4.19.121-linuxkit-x86_64-with-debian-10.1
- Python version: 3.7.4
- PyTorch version (GPU?): 1.7.1 (False)
- Tensorflow version (GPU?): 2.4.1 (False)
- Using GPU in script?: no
- Using distributed or parallel set-up in script?: no
Who can help
Model: https://huggingface.co/facebook/mbart-large-50-one-to-many-mmt @patrickvonplaten, @patil-suraj
Information
Model I am using (Bert, XLNet …): mBART-50 (facebook/mbart-large-50-one-to-many-mmt)
The problem arises when using:
- the official example scripts: (give details below)
- my own modified scripts: (give details below)
The task I am working on is:
- an official GLUE/SQuAD task: (translation)
- my own task or dataset: (give details below)
To reproduce
Steps to reproduce the behavior:
import os

from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

article_en = "The head of the United Nations says there is no military solution in Syria"

model = MBartForConditionalGeneration.from_pretrained(
    "facebook/mbart-large-50-one-to-many-mmt", cache_dir=os.getenv("cache_dir", "model"))
tokenizer = MBart50TokenizerFast.from_pretrained(
    "facebook/mbart-large-50-one-to-many-mmt", src_lang="en_XX")

model_inputs = tokenizer(article_en, return_tensors="pt")

# translate from English to Hindi
generated_tokens = model.generate(
    **model_inputs,
    forced_bos_token_id=tokenizer.lang_code_to_id["hi_IN"]
)
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => ['संयुक्त राष्ट्र के नेता कहते हैं कि सीरिया में कोई सैन्य समाधान नहीं है']

# translate from English to Chinese
generated_tokens = model.generate(
    **model_inputs,
    forced_bos_token_id=tokenizer.lang_code_to_id["zh_CN"]
)
decoded = tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => ['联合国首脑说,叙利亚没有军事解决办法']
print(decoded)
ERROR:
Traceback (most recent call last):
File "src/translation/run.py", line 7, in <module>
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast
ImportError: cannot import name 'MBart50TokenizerFast' from 'transformers' (unknown location)
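Before running the full script, one can verify whether the installed build of a package actually exposes a given name. A minimal, generic sketch (the helper name has_attr_in_module is ours; the stdlib json module is used only to demonstrate the check without requiring transformers to be installed):

```python
import importlib

def has_attr_in_module(module_name: str, attr: str) -> bool:
    """Return True if `module_name` is importable and exposes `attr`."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        # Module not installed at all.
        return False
    return hasattr(module, attr)

# Demonstration with a stdlib module:
print(has_attr_in_module("json", "dumps"))         # True
print(has_attr_in_module("json", "no_such_name"))  # False
```

Running the same check with `has_attr_in_module("transformers", "MBart50TokenizerFast")` tells you immediately whether your installed version ships the tokenizer, without tracing the ImportError through your own script.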
Expected behavior
no error
Issue Analytics
- State:
- Created 3 years ago
- Comments: 8 (7 by maintainers)
Top GitHub Comments
Ah, I think I have found the culprit! MBart-50 was only just released on the master branch and you seem to be using version v4.3.2, which does not have it yet. Could you install from source and let me know if you still have the issue?

To install from source, clone the repo and run

pip install .

from the root of the repo, or run

pip install git+https://github.com/huggingface/transformers.git

which will install the master branch.
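The diagnosis above can be turned into a small version gate in your own code. This is a sketch under the assumption that MBart-50 first shipped in the 4.4.0 release (at the time of the report it existed only on master); the helper names version_tuple and supports_mbart50 are ours, not part of the transformers API:

```python
def version_tuple(version: str) -> tuple:
    """Parse a dotted version string like '4.3.2' into a comparable tuple.
    Only leading digits of each component are kept, so suffixes such as
    '4.4.0.dev0' do not raise."""
    parts = []
    for piece in version.split("."):
        digits = ""
        for ch in piece:
            if ch.isdigit():
                digits += ch
            else:
                break
        parts.append(int(digits) if digits else 0)
    return tuple(parts)

# Assumption: the first release that shipped MBart-50.
MBART50_MIN = (4, 4, 0)

def supports_mbart50(installed: str) -> bool:
    """True if the given transformers version is new enough for MBart-50."""
    return version_tuple(installed) >= MBART50_MIN

print(supports_mbart50("4.3.2"))  # False -- the reporter's version
print(supports_mbart50("4.4.0"))  # True
```

In a real script you would pass `transformers.__version__` to `supports_mbart50` and fail early with a pointer to the install-from-source instructions instead of hitting the ImportError mid-run.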