Models converted from `saved_model` to `tfjs_graph_model` lose output signature information

Models converted from `saved_model` to `tfjs_graph_model` lose their output signature information. This is not specific to any single model; it is a generic converter issue.
a) Saved model from https://tfhub.dev/tensorflow/efficientdet/d0/1?tf-hub-format=compressed
```javascript
console.log(await tf.node.getMetaGraphsFromSavedModel(modelPath));
```

outputs:

```
{
  detection_anchor_indices: { dtype: 'float32', name: 'StatefulPartitionedCall:0', shape: [Array] },
  detection_boxes: { dtype: 'float32', name: 'StatefulPartitionedCall:1', shape: [Array] },
  detection_classes: { dtype: 'float32', name: 'StatefulPartitionedCall:2', shape: [Array] },
  detection_multiclass_scores: { dtype: 'float32', name: 'StatefulPartitionedCall:3', shape: [Array] },
  detection_scores: { dtype: 'float32', name: 'StatefulPartitionedCall:4', shape: [Array] },
  num_detections: { dtype: 'float32', name: 'StatefulPartitionedCall:5', shape: [Array] },
  raw_detection_boxes: { dtype: 'float32', name: 'StatefulPartitionedCall:6', shape: [Array] },
  raw_detection_scores: { dtype: 'float32', name: 'StatefulPartitionedCall:7', shape: [Array] }
}
```
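Note that the saved-model signature already carries everything needed to match outputs by position: each tensor name ends in a `:N` output index. As a minimal sketch (plain Node.js, operating on a hypothetical subset of the signature printed above), the semantic keys can be recovered in output order from those suffixes:

```javascript
// Hypothetical subset of the signature printed above (shapes omitted for brevity).
const outputs = {
  detection_boxes:  { dtype: 'float32', name: 'StatefulPartitionedCall:1' },
  num_detections:   { dtype: 'float32', name: 'StatefulPartitionedCall:5' },
  detection_scores: { dtype: 'float32', name: 'StatefulPartitionedCall:4' },
};

// Sort the semantic keys by the ':N' output index embedded in the tensor name.
const orderedKeys = Object.entries(outputs)
  .map(([key, info]) => [key, Number(info.name.split(':')[1])])
  .sort((a, b) => a[1] - b[1])
  .map(([key]) => key);

console.log(orderedKeys); // → [ 'detection_boxes', 'detection_scores', 'num_detections' ]
```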
b) Same model converted to a TFJS graph model using:

```shell
tensorflowjs_converter --input_format=tf_saved_model --output_format=tfjs_graph_model . graph
```
```javascript
const model = await tf.loadGraphModel(`file://${path.join(__dirname, modelPath)}`);
console.log(model.executor._signature); // model.outputs works too, but carries even less information
```

outputs:

```
{
  'Identity_6:0': { name: 'Identity_6:0', dtype: 'DT_FLOAT', tensorShape: { dim: [ { size: '1' }, { size: '49104' }, { size: '4' } ] } },
  'Identity_1:0': { name: 'Identity_1:0', dtype: 'DT_FLOAT', tensorShape: { dim: [ { size: '1' }, { size: '100' }, { size: '4' } ] } },
  'Identity_3:0': { name: 'Identity_3:0', dtype: 'DT_FLOAT', tensorShape: { dim: [ { size: '1' }, { size: '100' }, { size: '90' } ] } },
  'Identity_2:0': { name: 'Identity_2:0', dtype: 'DT_FLOAT', tensorShape: { dim: [ { size: '1' }, { size: '100' } ] } },
  'Identity_5:0': { name: 'Identity_5:0', dtype: 'DT_FLOAT', tensorShape: { dim: [ { size: '1' } ] } },
  'Identity_7:0': { name: 'Identity_7:0', dtype: 'DT_FLOAT', tensorShape: { dim: [ { size: '1' }, { size: '49104' }, { size: '90' } ] } },
  'Identity_4:0': { name: 'Identity_4:0', dtype: 'DT_FLOAT', tensorShape: { dim: [ { size: '1' }, { size: '100' } ] } },
  'Identity:0': { name: 'Identity:0', dtype: 'DT_FLOAT', tensorShape: { dim: [ { size: '1' }, { size: '100' } ] } }
}
```
Functionally all outputs are still produced, but losing the semantic signature names makes converted models extremely difficult to use.
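One hedged workaround, assuming the converter's `Identity[_N]:0` nodes preserve the index order of the original `StatefulPartitionedCall:N` outputs (this appears to hold for this model, but the converter does not guarantee it), is to re-attach the saved-model keys by index:

```javascript
// Semantic names from the saved-model signature, keyed by output index.
const savedModelNames = {
  0: 'detection_anchor_indices',
  1: 'detection_boxes',
  2: 'detection_classes',
  3: 'detection_multiclass_scores',
  4: 'detection_scores',
  5: 'num_detections',
  6: 'raw_detection_boxes',
  7: 'raw_detection_scores',
};

// Extract N from 'Identity_N:0' ('Identity:0' counts as index 0).
function identityIndex(name) {
  const m = name.match(/^Identity(?:_(\d+))?:0$/);
  return m ? Number(m[1] || 0) : -1;
}

// Output names as reported by the converted graph model above.
const graphOutputNames = [
  'Identity_6:0', 'Identity_1:0', 'Identity_3:0', 'Identity_2:0',
  'Identity_5:0', 'Identity_7:0', 'Identity_4:0', 'Identity:0',
];

// Map each semantic key back to its graph-model tensor name.
const renamed = Object.fromEntries(
  graphOutputNames.map((n) => [savedModelNames[identityIndex(n)], n]));

console.log(renamed.detection_boxes); // → 'Identity_1:0'
```

With this map in place, the tensors returned by `model.executeAsync()` can be looked up by their original signature keys instead of opaque `Identity_N:0` names.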
Environment: Ubuntu 20.04 with Node.js 14.11.0 and TFJS 2.4.0
Issue Analytics
- Created: 3 years ago
- Reactions: 1
- Comments: 18 (8 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@rohanmuplara good stuff, but it doesn't help when I'm converting a pretrained model.