
Failing to load saved TFBertModel

See original GitHub issue

TF version: 2.2.0-rc1
transformers version: 2.7.0

import tensorflow as tf
import transformers

print(tf.__version__)
print(transformers.__version__)

MAX_LEN = 10
model_path = 'saved_model/temp_model'

ids = tf.keras.layers.Input((MAX_LEN,), dtype=tf.int32)
mask = tf.keras.layers.Input((MAX_LEN,), dtype=tf.int32)
token_type_ids = tf.keras.layers.Input((MAX_LEN,), dtype=tf.int32)

base_model = transformers.TFBertModel.from_pretrained("bert-base-cased", output_hidden_states=False)
base_output = base_model([ids, mask, token_type_ids])
seq_out, _ = base_output[0], base_output[1]
base_model.trainable = False

model = tf.keras.models.Model(inputs=[ids, mask, token_type_ids], outputs=[seq_out])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
print(model.summary())

model.save(model_path)
model = tf.keras.models.load_model(model_path)  # this load fails; see the traceback below

Model load fails with the following error:

Traceback (most recent call last):
  File "/Users/sourabhmaity/anaconda3/lib/python3.7/site-packages/tensorflow/python/util/nest.py", line 378, in assert_same_structure
    expand_composites)
TypeError: The two structures don't have the same nested structure.

First structure: type=dict str={'input_ids': TensorSpec(shape=(None, 5), dtype=tf.int32, name='input_ids')}

Second structure: type=list str=[TensorSpec(shape=(None, 10), dtype=tf.int32, name='inputs/0'), TensorSpec(shape=(None, 10), dtype=tf.int32, name='inputs/1'), TensorSpec(shape=(None, 10), dtype=tf.int32, name='inputs/2')]

More specifically: The two namedtuples don't have the same sequence type. First structure type=dict str={'input_ids': TensorSpec(shape=(None, 5), dtype=tf.int32, name='input_ids')} has type dict, while second structure type=list str=[TensorSpec(shape=(None, 10), dtype=tf.int32, name='inputs/0'), TensorSpec(shape=(None, 10), dtype=tf.int32, name='inputs/1'), TensorSpec(shape=(None, 10), dtype=tf.int32, name='inputs/2')] has type list

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "temp.py", line 29, in <module>
    model = tf.keras.models.load_model(model_path)
  File "/Users/sourabhmaity/anaconda3/lib/python3.7/site-packages/tensorflow/python/keras/saving/save.py", line 190, in load_model
    return saved_model_load.load(filepath, compile)
  File "/Users/sourabhmaity/anaconda3/lib/python3.7/site-packages/tensorflow/python/keras/saving/saved_model/load.py", line 116, in load
    model = tf_load.load_internal(path, loader_cls=KerasObjectLoader)
  File "/Users/sourabhmaity/anaconda3/lib/python3.7/site-packages/tensorflow/python/saved_model/load.py", line 604, in load_internal
    export_dir)
  File "/Users/sourabhmaity/anaconda3/lib/python3.7/site-packages/tensorflow/python/keras/saving/saved_model/load.py", line 188, in __init__
    super(KerasObjectLoader, self).__init__(*args, **kwargs)
  File "/Users/sourabhmaity/anaconda3/lib/python3.7/site-packages/tensorflow/python/saved_model/load.py", line 123, in __init__
    self._load_all()
  File "/Users/sourabhmaity/anaconda3/lib/python3.7/site-packages/tensorflow/python/keras/saving/saved_model/load.py", line 215, in _load_all
    self._finalize_objects()
  File "/Users/sourabhmaity/anaconda3/lib/python3.7/site-packages/tensorflow/python/keras/saving/saved_model/load.py", line 506, in _finalize_objects
    _finalize_saved_model_layers(layers_revived_from_saved_model)
  File "/Users/sourabhmaity/anaconda3/lib/python3.7/site-packages/tensorflow/python/keras/saving/saved_model/load.py", line 677, in _finalize_saved_model_layers
    inputs = infer_inputs_from_restored_call_function(call_fn)
  File "/Users/sourabhmaity/anaconda3/lib/python3.7/site-packages/tensorflow/python/keras/saving/saved_model/load.py", line 921, in infer_inputs_from_restored_call_function
    spec = nest.map_structure(common_spec, spec, spec2)
  File "/Users/sourabhmaity/anaconda3/lib/python3.7/site-packages/tensorflow/python/util/nest.py", line 611, in map_structure
    expand_composites=expand_composites)
  File "/Users/sourabhmaity/anaconda3/lib/python3.7/site-packages/tensorflow/python/util/nest.py", line 385, in assert_same_structure
    % (str(e), str1, str2))
TypeError: The two structures don't have the same nested structure.

First structure: type=dict str={'input_ids': TensorSpec(shape=(None, 5), dtype=tf.int32, name='input_ids')}

Second structure: type=list str=[TensorSpec(shape=(None, 10), dtype=tf.int32, name='inputs/0'), TensorSpec(shape=(None, 10), dtype=tf.int32, name='inputs/1'), TensorSpec(shape=(None, 10), dtype=tf.int32, name='inputs/2')]

More specifically: The two namedtuples don't have the same sequence type. First structure type=dict str={'input_ids': TensorSpec(shape=(None, 5), dtype=tf.int32, name='input_ids')} has type dict, while second structure type=list str=[TensorSpec(shape=(None, 10), dtype=tf.int32, name='inputs/0'), TensorSpec(shape=(None, 10), dtype=tf.int32, name='inputs/1'), TensorSpec(shape=(None, 10), dtype=tf.int32, name='inputs/2')] has type list

Entire first structure: {'input_ids': .}
Entire second structure: [., ., .]

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Reactions: 20
  • Comments: 16 (3 by maintainers)

Top GitHub Comments

56 reactions
Souls362 commented, Jun 19, 2020

Changing
base_output = base_model([ids, mask, token_type_ids])
to
base_output = base_model.bert([ids, mask, token_type_ids])
should fix it.
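Applied to the original snippet, the full workaround would look like the sketch below (a minimal sketch, assuming the same TF 2.2 / transformers 2.7 setup as the report; the only change from the code in the issue is that the graph is built through base_model.bert instead of calling the wrapper model directly):

import tensorflow as tf
import transformers

MAX_LEN = 10
model_path = 'saved_model/temp_model'

ids = tf.keras.layers.Input((MAX_LEN,), dtype=tf.int32)
mask = tf.keras.layers.Input((MAX_LEN,), dtype=tf.int32)
token_type_ids = tf.keras.layers.Input((MAX_LEN,), dtype=tf.int32)

base_model = transformers.TFBertModel.from_pretrained("bert-base-cased", output_hidden_states=False)
base_model.trainable = False

# Build the graph through the underlying main layer (base_model.bert) instead of
# the wrapper model, as suggested in this comment.
seq_out = base_model.bert([ids, mask, token_type_ids])[0]  # last hidden states

model = tf.keras.models.Model(inputs=[ids, mask, token_type_ids], outputs=[seq_out])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

model.save(model_path)
restored = tf.keras.models.load_model(model_path)  # should now load without the nested-structure error
print(restored.summary())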

5 reactions
Jordy-VL commented, Sep 17, 2020

(Quoting the fix above) Changing base_output = base_model([ids, mask, token_type_ids]) to base_output = base_model.bert([ids, mask, token_type_ids]) should fix it.

One tip for sequence classification (TFBertForSequenceClassification-style heads): use base_model.bert([ids, mask, token_type_ids])[1], i.e. the pooled output.
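Building on that tip, a classification head could look like the sketch below (it reuses the Input layers and base_model from the snippet above; NUM_LABELS, the Dense head, and the save path are illustrative assumptions, not from the thread):

# Pooled [CLS] representation: index 1 of the main layer's output tuple.
pooled_out = base_model.bert([ids, mask, token_type_ids])[1]

NUM_LABELS = 2  # hypothetical label count
logits = tf.keras.layers.Dense(NUM_LABELS, activation='softmax')(pooled_out)

clf = tf.keras.models.Model(inputs=[ids, mask, token_type_ids], outputs=[logits])
clf.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])

clf.save('saved_model/temp_clf')  # hypothetical path
restored_clf = tf.keras.models.load_model('saved_model/temp_clf')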

Read more comments on GitHub >

Top Results From Across the Web

Trouble saving tf.keras model with Bert (huggingface) classifier
I am aware that huggingface provides a model.save_pretrained() method for TFBertModel, but I prefer to wrap it in tf.keras.Model as I plan to ......
Models - Hugging Face
PreTrainedModel takes care of storing the configuration of the models and handles methods for loading, downloading and saving models as well as a...
Save and load a model using a distribution strategy
Overview. This tutorial demonstrates how you can save and load models in a SavedModel format with tf.distribute.Strategy during or after training.
BERT Text Classification using Keras | by Swatimeena - Medium
Load the BERT tokenizer and the suitable BERT model. from transformers import * from transformers import BertTokenizer, TFBertModel, ...
[ML-News] Failing to load saved TFBertModel · Issue #3627 ...
Failing to load saved TFBertModel · Issue #3627 · huggingface/transformers · GitHub. TF version: 2.2.0-rc1 transformers version: 2.7.0 import tensorflow as ...
