
Load Huggingface Transformers model using TFAutoModel

See original GitHub issue

Description of Problem: I want to use models provided by the HuggingFace transformers library that are not available when using the HFTransformersNLP Component.

Overview of the Solution: Right now, models are loaded using a dictionary of architectures in rasa/nlu/utils/hugging_face/registry.py. Using the AutoModel and AutoTokenizer features from the transformers library, we can remove these limitations and avoid having to define a model_name in the config file: the model_weights entry alone is enough to select the proper model and weights for the desired tokenizers and featurizers.
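A minimal sketch of what that could look like (hypothetical helper, not code from the Rasa codebase; assumes the transformers library with TensorFlow support is installed):

```python
def load_model(model_weights: str):
    """Hypothetical sketch: load a tokenizer/model pair from a single
    model_weights identifier, instead of looking up architecture classes
    in the hard-coded dictionaries of registry.py.
    """
    # Imported lazily so the sketch can be defined without the heavy deps.
    from transformers import AutoTokenizer, TFAutoModel

    # The "auto" classes infer the right architecture from the checkpoint's
    # config, so no separate model_name mapping is needed.
    tokenizer = AutoTokenizer.from_pretrained(model_weights)
    model = TFAutoModel.from_pretrained(model_weights)
    return tokenizer, model

# Usage (downloads weights on first call):
# tokenizer, model = load_model("bert-base-uncased")
```

Any checkpoint identifier that transformers recognizes would work as model_weights, which is the point of dropping the registry.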

Examples (if relevant): Docs for how to use the “auto” features are available here

Blockers (if relevant):

Definition of Done:

  • Changes are made to files in rasa/rasa/nlu/utils/hugging_face/
  • Tests are added
  • Feature is described in the docs
  • Feature is mentioned in the changelog

Issue Analytics

  • State: open
  • Created: 3 years ago
  • Comments: 10 (6 by maintainers)

Top GitHub Comments

1 reaction
Ghostvv commented, Mar 17, 2022

Shall we move it to blocked then?

0 reactions
sync-by-unito[bot] commented, Dec 19, 2022

➤ Maxime Verger commented:

💡 Heads up! We’re moving issues to Jira: https://rasa-open-source.atlassian.net/browse/OSS.

From now on, this Jira board is the place where you can browse (without an account) and create issues (you’ll need a free Jira account for that). This GitHub issue has already been migrated to Jira and will be closed on January 9th, 2023. Do not forget to subscribe to the corresponding Jira issue!

➡️ More information in the forum: https://forum.rasa.com/t/migration-of-rasa-oss-issues-to-jira/56569.

Read more comments on GitHub >

Top Results From Across the Web

AutoModels — transformers 3.0.2 documentation
Loading a model from its configuration file does not load the model weights. It only affects the model's configuration. Use from_pretrained() to load...
Read more >
Models - Hugging Face
PreTrainedModel takes care of storing the configuration of the models and handles methods for loading, downloading and saving models as well as a...
Read more >
Auto Classes - Hugging Face
The configuration class to instantiate is selected based on the model_type property of the config object that is loaded, or when it's missing,...
Read more >
Load pretrained instances with an AutoClass - Hugging Face
Generally, we recommend using the AutoTokenizer class and the TFAutoModelFor class to load pretrained instances of models.
Read more >
Quick tour - Hugging Face
Get up and running with Transformers! Whether you're a developer or an everyday user, this quick tour will help you get started and...
Read more >
