Cannot import AutoModelForSeq2SeqLM
🐛 Bug
Has the AutoModelForSeq2SeqLM class changed?
I am trying to run the transformers examples, specifically the pytorch-lightning token-classification example, which imports AutoModelForSeq2SeqLM. However, I am getting an import error; see below.
Information
Model I am using (Bert, XLNet …): bert-base-multilingual-cased
Language I am using the model on (English, Chinese …): English
The problem arises when using:
- the official example scripts: (give details below)
- my own modified scripts: (give details below)
The tasks I am working on is:
- an official GLUE/SQuAD task: (give the name)
- my own task or dataset: (give details below)
To reproduce
Steps to reproduce the behavior:
cd transformers/examples/token-classification
./run_pl.sh
Traceback (most recent call last):
File "run_pl_ner.py", line 12, in <module>
from lightning_base import BaseTransformer, add_generic_args, generic_train
File "/transformers/examples/lightning_base.py", line 12, in <module>
from transformers import (
ImportError: cannot import name 'AutoModelForSeq2SeqLM' from 'transformers' (/anaconda3/envs/transformers/lib/python3.8/site-packages/transformers/__init__.py)
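For context, here is a minimal check (the hasattr probe is my addition, not part of the original report) showing why the import fails on this release:

```python
# On transformers 2.11.0 the class does not exist yet, matching the traceback above.
import transformers

print(transformers.__version__)                        # '2.11.0'
print(hasattr(transformers, "AutoModelForSeq2SeqLM"))  # False on 2.11.0
```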
Expected behavior
The example script runs without the import error.
Environment info
- transformers version: 2.11.0
- Platform: Linux
- Python version: 3.8.3
- PyTorch version (GPU?): 1.5.1
- Tensorflow version (GPU?):
- Using GPU in script?: Yes
- Using distributed or parallel set-up in script?:
Top Results From Across the Web
ImportError: cannot import name 'AutoModelWithLMHead ...
I solved it! Apparently AutoModelWithLMHead was removed in my version. Now you need to use AutoModelForCausalLM for causal language models, ...
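A hedged sketch of that renaming (the checkpoint names are illustrative, and this assumes a transformers release where the split has landed):

```python
# AutoModelWithLMHead was split into task-specific auto classes.
from transformers import AutoModelForCausalLM, AutoModelForSeq2SeqLM

causal_lm = AutoModelForCausalLM.from_pretrained("gpt2")        # decoder-only LM
seq2seq_lm = AutoModelForSeq2SeqLM.from_pretrained("t5-small")  # encoder-decoder LM
```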
Auto Classes - Hugging Face
This class cannot be instantiated directly using __init__() (throws an error). ... from transformers import AutoConfig, AutoModelForSeq2SeqLM >>> # Download ...
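The pattern the docs snippet is describing, as a short sketch (the checkpoint name is my illustrative choice):

```python
# Auto classes are built via from_pretrained()/from_config(), never __init__() directly.
from transformers import AutoConfig, AutoModelForSeq2SeqLM

config = AutoConfig.from_pretrained("t5-small")    # download the configuration
model = AutoModelForSeq2SeqLM.from_config(config)  # instantiate from config (random weights)
```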
Constrained Beam Search with Transformers
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM; tokenizer ... We can't just keep branching out, then the number of beams we'd have to keep ...
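A minimal beam-search generation sketch along the lines of that post (checkpoint, prompt, and beam count are my illustrative choices):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

inputs = tokenizer("translate English to German: How old are you?", return_tensors="pt")
# num_beams caps how many hypotheses are kept per step, since we
# can't keep branching out indefinitely.
outputs = model.generate(**inputs, num_beams=4, max_length=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```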
Trying to customize transformer (error) - nlp - PyTorch Forums
But when I try to import: ``from t…`` ... I am getting the error ``cannot import name 'AutoModelForSeq2SeqLM' from 'transformers' (unknown ...``
translate_opt - Kaggle
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM # from ... We can't record the data flow of Python values, so this value will ...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
I’m not sure I follow. 2.11.0 is the latest stable version, but the examples scripts on the main branch only work with an installation from source.
If you want a version of the examples compatible with 2.11.0, you should use their version in the 2.11.0 tagged repo.
Again, this is not an issue. The examples on the master branch are on par with the version of transformers on master, so you need an installation from source to run them, as is clearly indicated in the README.
If you want to execute the examples scripts as they were for v2.11.0, you should use the [v2.11.0 tagged repo](https://github.com/huggingface/transformers/tree/v2.11.0).
Closing this issue, please reopen if anything I said was unclear.
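In practice (my summary of the two options above, not part of the original thread), that means either installing from source, e.g. `pip install git+https://github.com/huggingface/transformers`, or running the example scripts from a checkout of the v2.11.0 tag. A quick check that a source install resolved the problem:

```python
# If the installed version is recent enough, this import succeeds
# instead of raising the ImportError from the report above.
from transformers import AutoModelForSeq2SeqLM

print(AutoModelForSeq2SeqLM)  # prints the class rather than raising ImportError
```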