
Problem running T5 (configuration) with text classification

See original GitHub issue

Environment info

  • transformers version: 4.3.2
  • Platform: Linux-4.18.0-193.el8.x86_64-x86_64-with-glibc2.10
  • Python version: 3.8.3
  • PyTorch version (GPU?): 1.5.1+cu101 (True)
  • Tensorflow version (GPU?): not installed (NA)
  • Using GPU in script?: yes
  • Using distributed or parallel set-up in script?: single gpu

Who can help

Perhaps @patrickvonplaten, @patil-suraj could help?

Information

Model I am using (Bert, XLNet …): T5

The problem arises when using:

  • the official example scripts: (give details below)
  • my own modified scripts: (give details below)

The task I am working on is:

  • an official GLUE/SQuAD task: (give the name)
  • my own task or dataset: (give details below)

To reproduce

I’m trying to run the T5 base model. I’m using the correct model identifier (i.e., t5-base), and the model is found and downloaded, but the script crashes when it tries to instantiate it. The problem seems to be that the configuration class is not recognized. This is what I get:

File "../../../models/tr-4.3.2/run_puppets.py", line 279, in main
    model = AutoModelForSequenceClassification.from_pretrained(
  File "/dccstor/redrug_ier/envs/last-tr/lib/python3.8/site-packages/transformers/models/auto/modeling_auto.py", line 1362, in from_pretrained
    raise ValueError(
ValueError: Unrecognized configuration class <class 'transformers.models.t5.configuration_t5.T5Config'> for this kind of AutoModel: AutoModelForSequenceClassification.
Model type should be one of ConvBertConfig, LEDConfig, DistilBertConfig, AlbertConfig, CamembertConfig, XLMRobertaConfig, MBartConfig, BartConfig, LongformerConfig, RobertaConfig, SqueezeBertConfig, LayoutLMConfig, BertConfig, XLNetConfig, MobileBertConfig, FlaubertConfig, XLMConfig, ElectraConfig, FunnelConfig, DebertaConfig, GPT2Config, OpenAIGPTConfig, ReformerConfig, CTRLConfig, TransfoXLConfig, MPNetConfig, TapasConfig.

I dug into it a bit and have a hunch why this happens. The config class is there: https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/configuration_t5.py#L32 but it’s not recorded here: https://github.com/huggingface/transformers/blob/master/src/transformers/models/auto/modeling_auto.py#L514

So the check here fails: https://github.com/huggingface/transformers/blob/master/src/transformers/models/auto/modeling_auto.py#L1389

And the ValueError is raised.
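The dispatch that fails is essentially a lookup keyed by the config class. A simplified, illustrative sketch of that pattern (stand-in names of mine, not the actual transformers internals):

```python
# Simplified sketch of how the AutoModel dispatch works: a mapping from
# config class to model class, consulted in from_pretrained(). All names
# here are illustrative stand-ins, not the real transformers internals.

class BertConfig:
    pass

class T5Config:
    pass

class BertForSequenceClassification:
    def __init__(self, config):
        self.config = config

# Analogous to MODEL_FOR_SEQUENCE_CLASSIFICATION_MAPPING in modeling_auto.py.
SEQUENCE_CLASSIFICATION_MAPPING = {
    BertConfig: BertForSequenceClassification,
}

def auto_model_for_sequence_classification(config):
    model_cls = SEQUENCE_CLASSIFICATION_MAPPING.get(type(config))
    if model_cls is None:
        # This is the branch the traceback above ends in: T5Config is
        # simply absent from the mapping, so the lookup misses.
        raise ValueError(f"Unrecognized configuration class {type(config)}")
    return model_cls(config)
```

So regardless of whether the checkpoint downloads correctly, instantiation fails as soon as the lookup misses.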

I hope this is it. It looks like an easy fix 😃 Thanks!

PS: I’m running the same scripts/files with other models without problems. This seems to be something specific to T5.
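For what it’s worth, the common workaround for classification with T5 is to keep it in its native text-to-text format with T5ForConditionalGeneration: serialize each example into an input string and a target label string. A minimal sketch of that serialization (the task prefix and label words here are arbitrary choices of mine, not part of any API):

```python
def to_text2text(sentence, label, task_prefix="classify: "):
    """Serialize one classification example into T5's text-to-text format.

    T5 only ever sees strings, so the label is rendered as a word the
    model learns to generate. The prefix and label words are arbitrary,
    as long as they are used consistently.
    """
    label_words = {0: "negative", 1: "positive"}
    return task_prefix + sentence, label_words[label]

source, target = to_text2text("a gripping, well-acted thriller", 1)
# source: "classify: a gripping, well-acted thriller"
# target: "positive"
```

Both strings then go through the T5 tokenizer, and the pair is trained with T5ForConditionalGeneration as an ordinary seq2seq example; at inference time the generated word is mapped back to a label.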

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 5 (4 by maintainers)

Top GitHub Comments

1 reaction
xhluca commented, Feb 23, 2022

@patrickvonplaten I just upgraded transformers to the latest version (4.16), and when I run this:

from transformers import AutoModelForConditionalGeneration

I get this error:

---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
/tmp/ipykernel_20/1334627133.py in <module>
----> 1 from transformers import AutoModelForConditionalGeneration

ImportError: cannot import name 'AutoModelForConditionalGeneration' from 'transformers' (/opt/conda/lib/python3.7/site-packages/transformers/__init__.py)

If this is supposed to work I can open an issue (let me know who I should tag). See kaggle notebook example
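(A quick, dependency-agnostic way to probe whether a given install exports a symbol before building on it; the transformers symbol names in the comments are the ones from the traceback above plus AutoModelForSeq2SeqLM, the seq2seq auto class that does exist:)

```python
import importlib

def has_symbol(module_name, symbol):
    """Return True if `symbol` is importable from `module_name`."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        return False
    return hasattr(module, symbol)

# Per the traceback above, 4.16 does not export
# AutoModelForConditionalGeneration; the seq2seq auto class that does
# exist is AutoModelForSeq2SeqLM:
#   has_symbol("transformers", "AutoModelForSeq2SeqLM")
#   has_symbol("transformers", "AutoModelForConditionalGeneration")
```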

0 reactions
patrickvonplaten commented, Feb 23, 2022

Should work yes 😃

Read more comments on GitHub >

Top Results From Across the Web

T5 - Hugging Face
T5 is an encoder-decoder model and converts all NLP problems into a text-to-text format. It is trained using teacher forcing. This means that...

Lack of funetune examples for T5 model · Issue #4426 - GitHub
I've setup T5 fine-tuning using lightning and also HF's new ... Plus, it is also the example for bert here in examples/text-classification.

The Guide to Multi-Tasking with the T5 Transformer
The T5 Transformer can perform any NLP task. It can perform multiple tasks, at the same time, with the same model. Here's how!...

T5-Base Model for Summarization, Sentiment Classification ...
Build a text pre-processing pipeline for a T5 model · Instantiate a pre-trained T5 model with base configuration · Read in the CNNDM,...

Classification: T5 - seekinginference
The guide proceeds by (1) preparing the data for text classification with T5 small – a small version of T5 base, and (2)...
