
How to load a pretrained TF model using AutoModel?

See original GitHub issue

Run the following code:

import tensorflow as tf
from transformers import AutoModel, TFBertModel

# AutoModel resolves to the PyTorch implementation of the checkpoint
auto_model = AutoModel.from_pretrained("bert-base-uncased")
# TFBertModel is the TensorFlow implementation
tfbert_model = TFBertModel.from_pretrained("bert-base-uncased")

print(auto_model.__class__)
print(tfbert_model.__class__)

Then the output is:

<class 'transformers.modeling_bert.BertModel'>
<class 'transformers.modeling_tf_bert.TFBertModel'>

It seems that AutoModel loads the pretrained PyTorch model by default, but how can I use it to load a pretrained TF model?

Issue Analytics

  • State: closed
  • Created 4 years ago
  • Comments: 6 (2 by maintainers)

Top GitHub Comments

8 reactions
juanmanuelvc commented, Oct 22, 2020

I can’t load the model with model = TFAutoModel.from_pretrained("emilyalsentzer/Bio_ClinicalBERT") or TFAutoModel.from_pretrained("microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext"). Anyone?

Hi @blmali, I had the same issue when trying to load "emilyalsentzer/Bio_Discharge_Summary_BERT". I solved it by passing the from_pt argument as True: model = TFAutoModel.from_pretrained("emilyalsentzer/Bio_Discharge_Summary_BERT", from_pt=True).

I hope this helps.
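For context, here is a minimal sketch of that workaround. Passing from_pt=True tells transformers to convert the PyTorch checkpoint on the fly (both torch and tensorflow must be installed); the local directory name below is just an example, and saving the converted weights is optional but avoids re-converting on every load.

from transformers import TFAutoModel

# The Hub repo only ships PyTorch weights, so convert them while loading
model = TFAutoModel.from_pretrained("emilyalsentzer/Bio_ClinicalBERT", from_pt=True)

# Optional: save the converted TensorFlow weights locally so later loads skip the conversion
model.save_pretrained("./bio_clinicalbert_tf")
model = TFAutoModel.from_pretrained("./bio_clinicalbert_tf")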

3 reactions
julien-c commented, Feb 7, 2020

Hi @erikchwang, you should use TFAutoModel instead
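For reference, a minimal sketch of the TensorFlow side of the original snippet, assuming the checkpoint provides TensorFlow weights on the Hub (bert-base-uncased does):

from transformers import TFAutoModel

# TFAutoModel resolves to the TensorFlow implementation (TFBertModel for BERT checkpoints)
tf_model = TFAutoModel.from_pretrained("bert-base-uncased")

print(tf_model.__class__)
# Expected: the TFBertModel class (the exact module path varies across transformers versions)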

