ValueError: Unsupported Ops in the model before optimization StatefulPartitionedCall
See original GitHub issue
TensorFlow.js version: 2.0.1.post1
Browser version: Chrome Version 83.0.4103.106 (Official Build) (64-bit)
Describe the problem or feature request
I am trying to convert a TensorFlow SavedModel to the TensorFlow.js format with a quantization flag. The model uses Hugging Face's DistilBERT (TFDistilBertMainLayer) as a layer, as follows:
import tensorflow as tf
import tensorflowjs as tfjs
from transformers import DistilBertConfig, TFDistilBertMainLayer

config = DistilBertConfig.from_pretrained('distilbert-base-uncased')
config.output_hidden_states = False
distillbert_main = TFDistilBertMainLayer(config=config)

# Note: no trailing comma here -- the original snippet had one, which
# silently wraps the input tensor in a one-element tuple.
input_word_ids = tf.keras.layers.Input(shape=(8,), dtype=tf.int32, name="input_word_ids")
x = distillbert_main(input_word_ids)[0]
x = tf.keras.layers.Lambda(lambda seq: seq[:, 0, :])(x)  # take the [CLS] token embedding
x = tf.keras.layers.BatchNormalization()(x)
x = tf.keras.layers.Dropout(0.2)(x)
out = tf.keras.layers.Dense(2)(x)

model = tf.keras.Model(inputs=input_word_ids, outputs=out)
for layer in model.layers[:3]:
    layer.trainable = False

model.summary()  # works fine
model.get_config()  # works fine
tf.saved_model.save(model, './models/model')  # works fine
tfjs.converters.convert_tf_saved_model('./models/model', './models/tfjs')  # raises ValueError
I know tfjs currently supports image-based models, and models with control-flow ops (e.g., RNNs) are also supported. It would be great if the tfjs team could add support for converting attention-based models like the one above.
Code to reproduce the bug / link to feature request
tensorflowjs_converter --quantize_float16 --input_format=tf_saved_model /Users/sabber/Desktop/node_test/model /Users/sabber/Desktop/node_test/model/quantized
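The --quantize_float16 flag stores each 32-bit float weight as a 16-bit float, roughly halving the weight shard size at some precision cost. A minimal, stdlib-only sketch of that size/precision trade-off (the weight values are made up for illustration and are not from the model above):

```python
import struct

# Hypothetical weight values, just for illustration.
weights = [0.1, -1.5, 3.25, 0.0]

# 'f' packs IEEE 754 single precision (4 bytes each),
# 'e' packs IEEE 754 half precision (2 bytes each).
fp32_bytes = struct.pack(f'{len(weights)}f', *weights)
fp16_bytes = struct.pack(f'{len(weights)}e', *weights)

print(len(fp32_bytes))  # 16
print(len(fp16_bytes))  # 8

# Half precision is lossy: 0.1 is not exactly representable,
# so the round-tripped value differs slightly.
roundtrip = struct.unpack(f'{len(weights)}e', fp16_bytes)
print(abs(roundtrip[0] - 0.1) > 0)  # True
```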
Issue Analytics
- State:
- Created 3 years ago
- Comments: 11
Top Results From Across the Web
Tensorflowjs Conversion Error: "ValueError: Unsupported Ops"
The short answer is yes, you will need to change them. TensorflowJS will change the ops for optimisation purposes, but not all the...
Why Unsupported Ops in the model before optimization ...
ValueError : Unsupported Ops in the model before optimization IteratorV2, IteratorGetNext. I tested TensorFlow 1.11 and 1.12, I use TFRecordDataset and an ...
Export OpenAI GPT-2 model into SavedModel.ipynb - Colaboratory
... line 143, in optimize_graph ', '.join(unsupported)) ValueError: Unsupported Ops in the model before optimization StatefulPartitionedCall ...
Converting a TensorFlow Model - OpenVINO™ documentation
If a model contains operations currently unsupported by OpenVINO, ... nodes in the StatefulPartitionedCall/\* subgraph of TensorFlow 2.x SavedModel format.
Top 5 tensorflowjs Code Examples - Snyk
tensorflow / tfjs / integration_tests / benchmarks / python / benchmarks.py ... raise ValueError('Unsupported Ops in the model before optimization\n' + ' ...
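The Snyk result above shows how the converter builds this error: it joins the unsupported op names with ', ' after the "Unsupported Ops in the model before optimization" header. A small, hypothetical helper (not part of the tensorflowjs API) to recover the op names from a caught ValueError's message, assuming that exact format:

```python
def unsupported_ops(message: str) -> list[str]:
    """Extract op names from an 'Unsupported Ops ...' converter error message."""
    header = 'Unsupported Ops in the model before optimization'
    _, _, tail = message.partition(header)
    # The op names follow the header, separated by ', '.
    return [op for op in tail.strip().split(', ') if op]

msg = 'Unsupported Ops in the model before optimization\nStatefulPartitionedCall'
print(unsupported_ops(msg))  # ['StatefulPartitionedCall']
```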
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Hi! I have the same error. @msahamed, did you manage to solve it or work around it somehow? (I need to run DistilBERT in a browser too.)
Thank you for helping. It works with release 2.1.0.
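The reporter hit the error on tensorflowjs 2.0.1.post1, and the comment above confirms the conversion works with release 2.1.0. Before attempting conversion, one could guard on the installed version; this is a hypothetical, stdlib-only helper for illustration, not part of the tensorflowjs API:

```python
def version_at_least(installed: str, required: str) -> bool:
    """Compare dotted version strings numerically, ignoring tags like 'post1'."""
    def numeric(v: str) -> list[int]:
        return [int(part) for part in v.split('.') if part.isdigit()]
    return numeric(installed) >= numeric(required)

print(version_at_least('2.0.1.post1', '2.1.0'))  # False: this release hits the bug
print(version_at_least('2.1.0', '2.1.0'))        # True: fixed per the comment above
```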