TF models save_pretrained() failed when saved_model=True
Environment info
- transformers version: 4.13.0.dev0
- Platform: Windows-10-10.0.19042-SP0
- Python version: 3.9.5
- PyTorch version (GPU?): 1.9.0+cpu (False)
- Tensorflow version (GPU?): 2.6.0 (False)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using GPU in script?: None
- Using distributed or parallel set-up in script?: None
Who can help
TensorFlow: @Rocketknight1
To reproduce
from transformers import TFBertModel

model = TFBertModel.from_pretrained("bert-base-uncased")
model.save_pretrained("tmp", saved_model=True)

# This also fails with the same AttributeError:
for x in model.config.items():
    print(x)
Error messages:
Traceback (most recent call last):
File "C:\Users\33611\Desktop\Projects\transformers-dev-2\transformers\del.py", line 7, in <module>
model.save_pretrained("tmp", saved_model=True)
File "C:\Users\33611\Desktop\Projects\transformers-dev-2\transformers\src\transformers\modeling_tf_utils.py", line 1227, in save_pretrained
self.save(saved_model_dir, include_optimizer=False, signatures=self.serving)
File "C:\Users\33611\miniconda3\envs\py39\lib\site-packages\keras\engine\training.py", line 2145, in save
save.save_model(self, filepath, overwrite, include_optimizer, save_format,
File "C:\Users\33611\miniconda3\envs\py39\lib\site-packages\keras\saving\save.py", line 149, in save_model
saved_model_save.save(model, filepath, overwrite, include_optimizer,
File "C:\Users\33611\miniconda3\envs\py39\lib\site-packages\keras\saving\saved_model\save.py", line 94, in save
metadata = generate_keras_metadata(saved_nodes, node_paths)
File "C:\Users\33611\miniconda3\envs\py39\lib\site-packages\keras\saving\saved_model\save.py", line 123, in generate_keras_metadata
metadata=node._tracking_metadata) # pylint: disable=protected-access
File "C:\Users\33611\miniconda3\envs\py39\lib\site-packages\keras\engine\base_layer.py", line 3078, in _tracking_metadata
return self._trackable_saved_model_saver.tracking_metadata
File "C:\Users\33611\miniconda3\envs\py39\lib\site-packages\keras\saving\saved_model\base_serialization.py", line 54, in tracking_metadata
return json_utils.Encoder().encode(self.python_properties)
File "C:\Users\33611\miniconda3\envs\py39\lib\site-packages\keras\saving\saved_model\layer_serialization.py", line 37, in python_properties
return self._python_properties_internal()
File "C:\Users\33611\miniconda3\envs\py39\lib\site-packages\keras\saving\saved_model\model_serialization.py", line 31, in _python_properties_internal
metadata = super(ModelSavedModelSaver, self)._python_properties_internal()
File "C:\Users\33611\miniconda3\envs\py39\lib\site-packages\keras\saving\saved_model\layer_serialization.py", line 54, in _python_properties_internal
metadata.update(get_serialized(self.obj))
File "C:\Users\33611\miniconda3\envs\py39\lib\site-packages\keras\saving\saved_model\layer_serialization.py", line 113, in get_serialized
return generic_utils.serialize_keras_object(obj)
File "C:\Users\33611\miniconda3\envs\py39\lib\site-packages\keras\utils\generic_utils.py", line 510, in serialize_keras_object
for key, item in config.items():
File "C:\Users\33611\Desktop\Projects\transformers-dev-2\transformers\src\transformers\configuration_utils.py", line 237, in __getattribute__
return super().__getattribute__(key)
AttributeError: 'BertConfig' object has no attribute 'items'
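The traceback shows the root cause: Keras' `serialize_keras_object` assumes a layer's config is a plain dict and calls `.items()` on it, but here it receives the `BertConfig` object, which has no `items` method. A minimal self-contained sketch of this failure mode (the `FakeConfig` class below is a stand-in for illustration, not the real `BertConfig`):

```python
class FakeConfig:
    """Stand-in for BertConfig: carries attributes but is not a dict."""

    def __init__(self):
        self.hidden_size = 768

    def to_dict(self):
        # Plain-dict view of the config, analogous to PretrainedConfig.to_dict()
        return dict(self.__dict__)


config = FakeConfig()

# Keras' serializer effectively does this and crashes, because the
# object is not a dict:
try:
    for key, item in config.items():
        print(key, item)
except AttributeError as e:
    print("serialization fails:", e)

# Converting to a plain dict first avoids the failure:
for key, item in config.to_dict().items():
    print(key, item)
```

For the second snippet in the repro, iterating `model.config.to_dict().items()` should work, since `PretrainedConfig` does expose `to_dict()`; the `save_pretrained(..., saved_model=True)` path, however, needs the library itself to hand Keras a dict during serialization.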
Expected behavior
model.save_pretrained(…, saved_model=True) should work, since the same call is exercised by test_saved_model_creation_extended() in test_modeling_tf_common.py.
Issue Analytics
- State:
- Created: 2 years ago
- Comments: 12 (12 by maintainers)
@shabie Just to let you know, we refactored a lot of those tests quite urgently when we realized that lack of coverage was causing serious problems! This issue should hopefully be resolved now, but if people encounter further difficulties with saving TF models, please comment or file a new issue.
Alternatively, this may be caused by my PR here, which made changes to the saving/loading of TF models. I'll try a version of Transformers from before that PR to see whether the issue is still there.
EDIT: Still happens before my PR, so that’s not the problem.