How to load a pretrained TF model using AutoModel?
Run the following code:
import tensorflow as tf
from transformers import AutoModel, TFBertModel
auto_model = AutoModel.from_pretrained("bert-base-uncased")
tfbert_model = TFBertModel.from_pretrained("bert-base-uncased")
print(auto_model.__class__)
print(tfbert_model.__class__)
Then the output is:
<class 'transformers.modeling_bert.BertModel'>
<class 'transformers.modeling_tf_bert.TFBertModel'>
It seems that AutoModel loads the pretrained PyTorch model by default, but how can I use it to load a pretrained TF model?
Issue Analytics
- Created: 4 years ago
- Comments: 6 (2 by maintainers)
Hi @blmali, I had the same issue when trying to load "emilyalsentzer/Bio_Discharge_Summary_BERT". I solved it by passing the from_pt argument as True:
model = TFAutoModel.from_pretrained("emilyalsentzer/Bio_Discharge_Summary_BERT", from_pt=True)
I hope this helps.
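As a follow-up, the from_pt=True conversion can be done once and the result saved in TensorFlow format with save_pretrained, so later loads no longer depend on the PyTorch weights. A minimal sketch (the local directory name "./bert-tf" is an arbitrary choice for illustration):

```python
from transformers import TFAutoModel

# Load PyTorch weights into a TF model via on-the-fly conversion (from_pt=True).
model = TFAutoModel.from_pretrained("bert-base-uncased", from_pt=True)

# Save the converted weights in TensorFlow format.
model.save_pretrained("./bert-tf")

# Subsequent loads read the saved TF weights directly, no conversion needed.
model = TFAutoModel.from_pretrained("./bert-tf")
print(model.__class__.__name__)
```

This avoids paying the PyTorch-to-TF conversion cost on every load.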
Hi @erikchwang, you should use TFAutoModel instead of AutoModel.
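To illustrate: TFAutoModel resolves the same checkpoint name to the TensorFlow model class, mirroring the class check from the original question:

```python
from transformers import TFAutoModel

# TFAutoModel picks the TensorFlow architecture (TFBertModel) for this
# checkpoint, just as AutoModel picks the PyTorch one (BertModel).
tf_model = TFAutoModel.from_pretrained("bert-base-uncased")
print(tf_model.__class__.__name__)  # TFBertModel
```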