SavedModel doesn't have SignatureDefs
Conversion of a TFJS model (PoseNet ResNet50) to SavedModel succeeds, but the produced SavedModel does not appear to contain any signature keys, inputs, or outputs:
saved_model_cli show --dir /posenet/savedmodel --all
MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:
# end of output (no SignatureDefs are listed)
This means that TensorFlow Serving cannot serve this model.
My idea was to load the produced model as a GraphDef:
import tensorflow as tf

def get_graph_def_from_saved_model(saved_model_dir):
    with tf.Session() as session:
        meta_graph_def = tf.saved_model.loader.load(
            session,
            tags=['serve'],
            export_dir=saved_model_dir
        )
    return meta_graph_def.graph_def

graph_def = get_graph_def_from_saved_model('/posenet/savedmodel')
then find the name of the input node:
input_nodes = [node.name for node in graph_def.node if node.op == 'Placeholder']
print(input_nodes)  # ['sub_2']
and then save it again
with tf.Session(graph=tf.Graph()) as session:
    tf.import_graph_def(graph_def, name='')
    inputs = {input_name: session.graph.get_tensor_by_name(f'{input_name}:0')
              for input_name in input_nodes}
    outputs = ...  # how do I find the outputs?
    tf.saved_model.simple_save(
        session,
        '/posenet/savedmodel_2',
        inputs=inputs,
        outputs=outputs
    )
but I don’t know how to find the names of the output nodes. Can you please suggest a way? Or can you think of a better way of generating SignatureDefs for the SavedModel?
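One way to answer this (a sketch, not from the original thread): in a frozen inference graph, the likely output nodes are those whose results are never consumed as an input by any other node. The helper below is a hypothetical name of my own and works on any `tf.GraphDef`-like object:

```python
def find_output_nodes(graph_def):
    """Return names of nodes whose results no other node consumes.

    In an inference graph these are the likely outputs. Bookkeeping ops
    (Const, Assign, NoOp) and the Placeholder inputs are excluded.
    """
    consumed = set()
    for node in graph_def.node:
        for inp in node.input:
            # Inputs appear as 'name', 'name:0', or '^name' (control dep).
            consumed.add(inp.split(':')[0].lstrip('^'))
    return [node.name for node in graph_def.node
            if node.name not in consumed
            and node.op not in ('Const', 'Assign', 'NoOp', 'Placeholder')]
```

The result could then fill the `outputs` dict, e.g. `outputs = {name: session.graph.get_tensor_by_name(f'{name}:0') for name in find_output_nodes(graph_def)}`. Alternatively, a graph visualizer such as Netron can be used to inspect the graph and read off the output names.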
Thank you for this useful tool!
Issue Analytics
- Created 3 years ago
- Comments:13 (3 by maintainers)
Top Results From Across the Web
SignatureDefs in SavedModel for TensorFlow Serving | TFX
A SignatureDef defines the signature of a computation supported in a TensorFlow graph. SignatureDefs aim to provide generic support to identify ...
{ "error": "Serving signature name: "serving_default" not found ...
Seems that you have identified a custom signature but not the default one. Just changing "serving_default" to your signature name would be fine.
Change output signatures - Support - OpenNMT Forum
How can we export a model with only tokens in the signatures? The following method produces an error when using saved_model_cli.
How to deploy TensorFlow models to production using TF ...
In most situations, to perform inference, your graph doesn't need some ... To create such signatures, we need to provide definitions for ...
Converting a TensorFlow Model to TensorFlow.js in Python
First things first: if you have a SavedModel on disk somewhere that ... It's essentially a reproduction that doesn't require a directory as ...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Additional comment: if you want to continue to convert the result to TFLite (which I did), then the following code snippet from the converter API guide may be helpful (this can basically be appended right below the code example from @glenvorel ):
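The referenced snippet is not reproduced on this page; a minimal sketch of the standard SavedModel-to-TFLite conversion (the paths and function name are placeholders, not the commenter's actual code) might look like:

```python
import tensorflow as tf

def convert_to_tflite(saved_model_dir, tflite_path):
    # Convert a SavedModel that has a valid serving signature to TFLite.
    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
    tflite_model = converter.convert()
    with open(tflite_path, 'wb') as f:
        f.write(tflite_model)

# e.g. convert_to_tflite('/posenet/savedmodel_2', '/posenet/posenet.tflite')
```

Note that the converter reads the model's SignatureDefs, so the re-export step above (which adds a serving signature) has to happen first.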
I only used the output node float_segments and classified everything above 0.65 as body part, nothing more.
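That thresholding step can be sketched as follows (assuming the float_segments node emits logits of shape (1, H, W, 1); if it already emits probabilities, drop the sigmoid; the function name is my own):

```python
import numpy as np

def segment_mask(float_segments, threshold=0.65):
    # Convert logits to probabilities with a sigmoid, then keep every
    # pixel whose probability exceeds the threshold as "body part".
    probs = 1.0 / (1.0 + np.exp(-np.asarray(float_segments)))
    return probs > threshold
```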