[TF] Save finetuned-model without huggingface-hub login
Feature request
[TF] Save finetuned-model in local without huggingface-hub login
Motivation
In TF, we need to log in to the Hub to save a fine-tuned model:
```python
from transformers.keras_callbacks import PushToHubCallback

push_to_hub_callback = PushToHubCallback(
    output_dir="my_awesome_model",
    tokenizer=tokenizer,
)
```
But I don’t want to sync to my Hub yet. First, I want to save my models locally and test them. I checked that this works in PyTorch, but it doesn’t in TensorFlow.
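A minimal sketch of the PyTorch behaviour being referred to: `Trainer` can write a model to a local `output_dir` without any Hub login. The model id and directory name here are placeholders, not values from the issue.

```python
# Sketch (assumed names): save a model locally with the PyTorch Trainer,
# no huggingface-hub login required.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased")
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

args = TrainingArguments(
    output_dir="my_awesome_model",  # local directory, nothing is pushed
    push_to_hub=False,              # explicitly keep everything local
)
trainer = Trainer(model=model, args=args)

# Writes config.json plus the weights into the local directory
trainer.save_model("my_awesome_model")
tokenizer.save_pretrained("my_awesome_model")
```

With `push_to_hub=False` (the default), nothing touches the Hub; the saved directory can later be loaded with `from_pretrained("my_awesome_model")`.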
Your contribution
I think we need to add an argument that controls whether to log in or not.
Issue Analytics
- Created 9 months ago
- Comments: 5 (3 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Hi @Rocketknight1! Thanks for your comment.

I want to use Hugging Face’s `pipeline` API for inference. I think `pipeline` can perhaps only receive `.h5` models. When I tried the `ModelCheckpoint` callback, it returned `ckpt` files, which can’t be used in `pipeline`. To convert `ckpt` to `.h5`, I would need to write out the model architecture (in my case `ELECTRA`), but that is difficult and complex for me 😥. I tried to convert `ckpt` to `pth` (PyTorch), but it doesn’t work; maybe that code only works for converting TF1 to PyTorch. When I tried `model.save('my_model.h5')`, an error was raised; maybe some format doesn’t match. I haven’t tested `model.save_pretrained()` yet; does it return `.h5`?

Ah, yes. The `.ckpt` files from `ModelCheckpoint` are only useful for saving and resuming training, and you won’t be able to use them in pipelines. The way TF models on Hugging Face work is that they’re built on top of Keras models: `model.save()` and `ModelCheckpoint` are both part of Keras. However, if you want to save the model to load with other Hugging Face tools, you should use `save_pretrained()`. This is our method and doesn’t exist in base Keras models. It saves the model as `.h5`, but also adds a `config.json` that allows the `pipeline` API and other methods like `from_pretrained` to initialize the model correctly. Try just doing this:

Though of course, make sure to change `text-classification` to the task you want to do!
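The code snippet the maintainer attached is missing from this copy of the thread. A hedged reconstruction of what it likely showed, based on the surrounding explanation: save locally with `save_pretrained()`, then point `pipeline` at the directory. The model id, directory name, and task are placeholders.

```python
# Sketch (assumed names): save a TF model locally with save_pretrained(),
# then load it into a pipeline -- no Hub login involved.
from transformers import (
    AutoTokenizer,
    TFAutoModelForSequenceClassification,
    pipeline,
)

model = TFAutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased")
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

# Writes tf_model.h5 plus config.json to a local directory
model.save_pretrained("my_local_model")
tokenizer.save_pretrained("my_local_model")

# pipeline initializes directly from the local directory via config.json
classifier = pipeline("text-classification", model="my_local_model")
print(classifier("This works without logging in to the Hub."))
```

As the maintainer notes, replace `text-classification` with whichever task matches your model.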