Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking at while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Models converted from `saved_model` to `tfjs_graph_model` lose output signature information

See original GitHub issue

Models converted from saved_model to tfjs_graph_model lose output signature information.

This is not specific to any single model, it’s a generic converter issue.

a) Saved model from https://tfhub.dev/tensorflow/efficientdet/d0/1?tf-hub-format=compressed

console.log(await tf.node.getMetaGraphsFromSavedModel(modelPath));
outputs: {
  detection_anchor_indices: { dtype: 'float32', name: 'StatefulPartitionedCall:0', shape: [Array] },
  detection_boxes: { dtype: 'float32', name: 'StatefulPartitionedCall:1', shape: [Array] },
  detection_classes: { dtype: 'float32', name: 'StatefulPartitionedCall:2', shape: [Array] },
  detection_multiclass_scores: { dtype: 'float32', name: 'StatefulPartitionedCall:3', shape: [Array] },
  detection_scores: { dtype: 'float32', name: 'StatefulPartitionedCall:4', shape: [Array] },
  num_detections: { dtype: 'float32', name: 'StatefulPartitionedCall:5', shape: [Array] },
  raw_detection_boxes: { dtype: 'float32', name: 'StatefulPartitionedCall:6', shape: [Array] },
  raw_detection_scores: { dtype: 'float32', name: 'StatefulPartitionedCall:7', shape: [Array] }
}

b) Same model converted to a TFJS graph model using `tensorflowjs_converter --input_format=tf_saved_model --output_format=tfjs_graph_model . graph`

const model = await tf.loadGraphModel(`file://${path.join(__dirname, modelPath)}`);
console.log(model.executor._signature); // can also use model.outputs, but that has even less info
outputs: {
  'Identity_6:0': { name: 'Identity_6:0', dtype: 'DT_FLOAT', tensorShape: { dim: [ { size: '1' }, { size: '49104' }, { size: '4' }, [length]: 3 ] } },
  'Identity_1:0': { name: 'Identity_1:0', dtype: 'DT_FLOAT', tensorShape: { dim: [ { size: '1' }, { size: '100' }, { size: '4' }, [length]: 3 ] } },
  'Identity_3:0': { name: 'Identity_3:0', dtype: 'DT_FLOAT', tensorShape: { dim: [ { size: '1' }, { size: '100' }, { size: '90' }, [length]: 3 ] } },
  'Identity_2:0': { name: 'Identity_2:0', dtype: 'DT_FLOAT', tensorShape: { dim: [ { size: '1' }, { size: '100' }, [length]: 2 ] } },
  'Identity_5:0': { name: 'Identity_5:0', dtype: 'DT_FLOAT', tensorShape: { dim: [ { size: '1' }, [length]: 1 ] } },
  'Identity_7:0': { name: 'Identity_7:0', dtype: 'DT_FLOAT', tensorShape: { dim: [ { size: '1' }, { size: '49104' }, { size: '90' }, [length]: 3 ] } },
  'Identity_4:0': { name: 'Identity_4:0', dtype: 'DT_FLOAT', tensorShape: { dim: [ { size: '1' }, { size: '100' }, [length]: 2 ] } },
  'Identity:0': { name: 'Identity:0', dtype: 'DT_FLOAT', tensorShape: { dim: [ { size: '1' }, { size: '100' }, [length]: 2 ] } }
}

This may look cosmetic since all outputs are still present, but losing the named signature makes converted models extremely difficult to use.
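Until the converter preserves names, one workaround is to record the saved_model signature before converting and re-associate the graph model's `Identity_N:0` outputs with their original names afterwards. The sketch below assumes (observed above, but not guaranteed by the converter) that the numeric suffix `N` in `Identity_N:0` matches the index in `StatefulPartitionedCall:N`; the `nameGraphOutputs` helper and the truncated `savedModelOutputs` map are illustrative, not part of any TFJS API.

```javascript
// savedModelOutputs would normally come from
// tf.node.getMetaGraphsFromSavedModel(); the two entries here are
// just for illustration.
const savedModelOutputs = {
  detection_anchor_indices: 'StatefulPartitionedCall:0',
  detection_boxes: 'StatefulPartitionedCall:1',
};

function nameGraphOutputs(graphOutputNames, savedModelOutputs) {
  // Invert the saved_model signature: output index -> friendly name.
  const byIndex = {};
  for (const [friendly, tensorName] of Object.entries(savedModelOutputs)) {
    byIndex[Number(tensorName.split(':')[1])] = friendly;
  }
  // Map each graph output back to its friendly name
  // ('Identity:0' is index 0, 'Identity_3:0' is index 3).
  const named = {};
  for (const graphName of graphOutputNames) {
    const m = graphName.match(/^Identity(?:_(\d+))?:0$/);
    if (m) named[byIndex[m[1] ? Number(m[1]) : 0]] = graphName;
  }
  return named;
}

console.log(nameGraphOutputs(['Identity:0', 'Identity_1:0'], savedModelOutputs));
// { detection_anchor_indices: 'Identity:0', detection_boxes: 'Identity_1:0' }
```

The recovered graph-output names could then be passed as the second argument of `model.executeAsync(input, outputNames)` and the results zipped back to the friendly keys, assuming the index correspondence holds for the model in question.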

Environment: Ubuntu 20.04 with NodeJS 14.11.0 and TFJS 2.4.0

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Reactions: 1
  • Comments: 18 (8 by maintainers)

Top GitHub Comments

1 reaction
vladmandic commented, Apr 2, 2021

@rohanmuplara good stuff, but doesn’t help when i’m converting a pretrained model.

0 reactions
google-ml-butler[bot] commented, May 7, 2021

Are you satisfied with the resolution of your issue?

Read more comments on GitHub >

Top Results From Across the Web

Using the SavedModel format | TensorFlow Core
You can save and load a model in the SavedModel format using the following APIs: ... and output dictionary keys, see Specifying signatures...
Read more >
How to Convert a TensorFlow.js Graph Model ... - Christian Mills
This post will cover how to use this library to convert a TFJS model to the standard SavedModel format. About the Tool. The...
Read more >
Tensorflow 2.0: How to change the output signature while ...
I figured out a way to define the output signature without using tf.Module by defining a tf.function that returns a dictionary of outputs...
Read more >
tfjs-graph-converter - PyPI
input_path, Path to the TFJS Graph Model directory containing the model.json. output_path, For output format "tf_saved_model", a SavedModel target directory ...
Read more >
Model saving & serialization APIs - Keras
X, and 'h5' in TF 1.X. signatures: Signatures to save with the SavedModel. Applicable to the 'tf' format only. Please see the signatures...
Read more >
