
Converting model to frozen pb causes original model to go into an "Invalid State"

See original GitHub issue

  • Issue Type: Support
  • Source: binary
  • TensorFlow Version: 2.4 / 2.9.2 (occurs on both)
  • Custom Code: No
  • OS Platform and Distribution: Linux Ubuntu 18.04
  • Python Version: 3.8

Current Behaviour?

I am currently trying to convert a TensorFlow 2 Keras model into a TensorFlow 1 frozen pb. My code accomplishes this and freezes the model correctly: I create my model, save it as an h5, load that h5 as a separate model, and freeze the loaded copy.

However, if I load and freeze the model and then continue using the original model (the untouched one), it is put into an "invalid state". I've checked whether the Keras backend session is being confused or whether the two models share a reference, but they are all separate.

It's as if the original model and the loaded model were the same object. I'm not sure if this is a bug or, more likely, user error.

Standalone code to reproduce the issue

# create, compile, train original model | or load original model

original_model.save('original_model.h5', save_format='h5')
convert_to_pb('original_model.h5')  # convert script below

original_model.predict(inp)  # Error occurs here: the original model is now in an "invalid state"

-----------------------------------------
# Convert Script

import tensorflow as tf

def convert_to_pb(h5_file):
    # Enter the TF1-style Keras backend session and load the saved model into its graph.
    with tf.compat.v1.keras.backend.get_session() as sess:
        model = tf.keras.models.load_model(h5_file)

        graph = sess.graph
        output_names = [out.op.name for out in model.outputs]
        input_graph_def = graph.as_graph_def()

        # Clear device placements so the frozen graph is portable.
        for node in input_graph_def.node:
            node.device = ""

        # Fold variables into constants and strip training-only nodes.
        frozen_graph = tf.compat.v1.graph_util.convert_variables_to_constants(
            sess, input_graph_def, output_names)
        frozen_graph = tf.compat.v1.graph_util.remove_training_nodes(frozen_graph)

        tf.io.write_graph(frozen_graph, '.', 'frozen_model.pb', as_text=False)

A gist can be found here with the code.
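Not part of the original report, but for comparison: a TF2-native way to produce a frozen GraphDef without entering the v1 Keras backend session (which is what appears to disturb the eagerly built model) is convert_variables_to_constants_v2. The sketch below assumes a single-input Keras model; convert_to_pb_v2 is a hypothetical name, not something from the issue.

import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import (
    convert_variables_to_constants_v2,
)

def convert_to_pb_v2(h5_file, pb_name='frozen_model.pb'):
    # Load the model eagerly; no tf.compat.v1 session or graph is created.
    model = tf.keras.models.load_model(h5_file)

    # Wrap the forward pass in a concrete function with a fixed input signature.
    full_model = tf.function(lambda x: model(x))
    concrete_func = full_model.get_concrete_function(
        tf.TensorSpec(model.inputs[0].shape, model.inputs[0].dtype))

    # Fold the variables into constants entirely through the TF2 API.
    frozen_func = convert_variables_to_constants_v2(concrete_func)

    tf.io.write_graph(frozen_func.graph.as_graph_def(), '.', pb_name, as_text=False)

Because this path never touches a v1 graph or session, models built eagerly beforehand should remain usable afterwards, though that is not verified in this thread.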

Relevant log output

ValueError: Your Layer or Model is in an invalid state. This can happen for the following cases:
1. You might be interleaving estimator/non-estimator models or interleaving models/layers made in tf.compat.v1.Graph.as_default() with model/layers created outside of it. Converting a model to an estimator (via model_to_estimator) invalidates all models/layers made before the conversion (even if they were not the model converted to an estimator). Similarly, making a layer or a model inside a tf.compat.v1.Graph invalidates all layers/models you previously made outside of the graph.
2. You might be using a custom keras layer implementation with custom __init__ which didn't call super().__init__. Please check the implementation of <class 'tensorflow.python.keras.layers.convolutional.Conv2D'> and its bases.
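The first case in the message matches the reproduction above: loading the model inside tf.compat.v1.keras.backend.get_session() builds it in a v1 graph, which invalidates the model built eagerly outside it. Purely as a hedged sketch (not a fix suggested in the thread), one way to keep the v1 graph away from the original model is to run the freeze in a child process; freeze_in_subprocess is a hypothetical helper mirroring the convert script above.

import multiprocessing as mp

def freeze_in_subprocess(h5_file):
    # Import TensorFlow only inside the child so all v1 graph/session state
    # stays in this process and is discarded when it exits.
    import tensorflow as tf

    with tf.compat.v1.keras.backend.get_session() as sess:
        model = tf.keras.models.load_model(h5_file)
        output_names = [out.op.name for out in model.outputs]
        frozen = tf.compat.v1.graph_util.convert_variables_to_constants(
            sess, sess.graph.as_graph_def(), output_names)
        tf.io.write_graph(frozen, '.', 'frozen_model.pb', as_text=False)

if __name__ == '__main__':
    # 'spawn' gives the child a fresh interpreter and TensorFlow runtime.
    ctx = mp.get_context('spawn')
    p = ctx.Process(target=freeze_in_subprocess, args=('original_model.h5',))
    p.start()
    p.join()
    # original_model in the parent never sees the v1 graph, so predict() keeps working.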

Issue Analytics

  • State: closed
  • Created: a year ago
  • Comments: 10

Top GitHub Comments

1 reaction
sushreebarsa commented, Nov 10, 2022

@SuryanarayanaY I was able to replicate the issue on colab, please find the gist here. Thank you!

0 reactions
matthewfernst commented, Nov 18, 2022

@SuryanarayanaY Hi there. I finally got my config portion to work on my custom layers and this issue is still persisting. Do you have another solution? Thanks!
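
For context on what the "config portion" likely refers to (the custom layers themselves are not shown in the thread): case 2 of the error requires custom layers to call super().__init__() in their __init__, and round-tripping through an h5 with load_model additionally needs get_config(). A minimal sketch with a hypothetical ScaledDense layer:

import tensorflow as tf

class ScaledDense(tf.keras.layers.Layer):
    """Hypothetical custom layer; the layers from the original issue are not shown."""

    def __init__(self, units, scale=1.0, **kwargs):
        super().__init__(**kwargs)  # required; skipping this triggers case 2 of the error
        self.units = units
        self.scale = scale
        self.dense = tf.keras.layers.Dense(units)

    def call(self, inputs):
        return self.dense(inputs) * self.scale

    def get_config(self):
        # Lets load_model rebuild the layer from the saved h5.
        config = super().get_config()
        config.update({"units": self.units, "scale": self.scale})
        return config

# Loading then needs the class in scope:
# tf.keras.models.load_model('model.h5', custom_objects={'ScaledDense': ScaledDense})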

Read more comments on GitHub >

Top Results From Across the Web

  • How to export Keras .h5 to tensorflow .pb? - Stack Overflow
    The following simple example (XOR example) shows how to export Keras models (in both h5 format and pb format), and using the model...
  • Error in converting custom ssd model using Tensorflow2 ...
    Solved: Hi, I am trying to convert a custom SSD MobileNet V2 FPNLite 320x320 from TensorFlow2 model zoo to Openvino Intermediate Representation (IR)...
  • Converting a TensorFlow Model - OpenVINO™ Documentation
    This page provides general instructions on how to convert a model from a TensorFlow format to the OpenVINO IR format using Model Optimizer....
  • Convert .pb file into frozen .pb file - Google Groups
    You are using only the meta graph out of the saved model. This means you aren't loading the checkpoint. ... And if you...
  • A quick complete tutorial to save and restore Tensorflow models
    In this quick Tensorflow tutorial, you shall learn what's a Tensorflow model and how to save and restore Tensorflow models for fine-tuning and...
