
Missing / unsupported op during conversion with tensorflowjs_converter


I was trying to follow the general AutoML documentation to use a trained model in a web-based application. As that approach wasn’t working (an issue was raised here), I tried a workaround: I exported and downloaded the saved model from my custom training.

My idea was to use tensorflowjs_converter to convert the SavedModel to a graph model. I used the following command:

tensorflowjs_converter --input_format=tf_saved_model --output_format=tfjs_graph_model --signature_name=serving_default --saved_model_tags=serve --control_flow_v2=true model-export/ converted/

The conversion stops with the following error:

tensorflowjs_converter --input_format=tf_saved_model --output_format=tfjs_graph_model --signature_name=serving_default --saved_model_tags=serve --control_flow_v2=true model-export/ converted/
2020-10-15 13:12:12.115746: I tensorflow/core/platform/cpu_feature_guard.cc:142] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2020-10-15 13:12:12.135884: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7fbc2ba2d150 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2020-10-15 13:12:12.135902: I tensorflow/compiler/xla/service/service.cc:176]   StreamExecutor device (0): Host, Default Version
2020-10-15 13:12:13.772835: I tensorflow/core/grappler/devices.cc:78] Number of eligible GPUs (core count >= 8, compute capability >= 0.0): 0 (Note: TensorFlow was not compiled with CUDA or ROCm support)
2020-10-15 13:12:13.773299: I tensorflow/core/grappler/clusters/single_machine.cc:356] Starting new session
2020-10-15 13:12:13.840628: E tensorflow/core/grappler/grappler_item_builder.cc:669] Init node index_to_string/table_init/LookupTableImportV2 doesn't exist in graph
WARNING:tensorflow:From /usr/local/lib/python3.8/site-packages/tensorflowjs/converters/tf_saved_model_conversion_v2.py:379: load (from tensorflow.python.saved_model.loader_impl) is deprecated and will be removed in a future version.
Instructions for updating:
This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.loader.load or tf.compat.v1.saved_model.load. There will be a new function for importing SavedModels in Tensorflow 2.0.
WARNING:tensorflow:From /usr/local/lib/python3.8/site-packages/tensorflowjs/converters/tf_saved_model_conversion_v2.py:383: convert_variables_to_constants (from tensorflow.python.framework.graph_util_impl) is deprecated and will be removed in a future version.
Instructions for updating:
Use `tf.compat.v1.graph_util.convert_variables_to_constants`
WARNING:tensorflow:From /usr/local/lib/python3.8/site-packages/tensorflow/python/framework/convert_to_constants.py:854: extract_sub_graph (from tensorflow.python.framework.graph_util_impl) is deprecated and will be removed in a future version.
Instructions for updating:
Use `tf.compat.v1.graph_util.extract_sub_graph`
Traceback (most recent call last):
  File "/usr/local/bin/tensorflowjs_converter", line 8, in <module>
    sys.exit(pip_main())
  File "/usr/local/lib/python3.8/site-packages/tensorflowjs/converters/converter.py", line 757, in pip_main
    main([' '.join(sys.argv[1:])])
  File "/usr/local/lib/python3.8/site-packages/tensorflowjs/converters/converter.py", line 761, in main
    convert(argv[0].split(' '))
  File "/usr/local/lib/python3.8/site-packages/tensorflowjs/converters/converter.py", line 690, in convert
    tf_saved_model_conversion_v2.convert_tf_saved_model(
  File "/usr/local/lib/python3.8/site-packages/tensorflowjs/converters/tf_saved_model_conversion_v2.py", line 607, in convert_tf_saved_model
    optimize_graph(frozen_graph, signature,
  File "/usr/local/lib/python3.8/site-packages/tensorflowjs/converters/tf_saved_model_conversion_v2.py", line 145, in optimize_graph
    raise ValueError('Unsupported Ops in the model before optimization\n' +
ValueError: Unsupported Ops in the model before optimization
DecodeJpeg

Note: there were more errors originally, but I realized an update/commit had added some new ops, so I reinstalled tensorflow; still, DecodeJpeg remains unsupported.

Now I’m not sure how I can make my custom AutoML-trained model work on the web, since the default export doesn’t work (see the issue linked above) and I cannot convert the model.

I tried to suppress the error by passing the --skip_op_check switch, but then I get an error while trying to load the model (the same kind of error, just in the browser console):

tfjs:17 Uncaught (in promise) TypeError: Unknown op 'HashTableV2'. File an issue at https://github.com/tensorflow/tfjs/issues so we can add it, or register a custom execution with tf.registerOp()
    at tfjs:17
    at UO (tfjs:17)
    at tfjs:17
    at tfjs:17
    at e.t.scopedRun (tfjs:17)
    at e.t.tidy (tfjs:17)
    at sx (tfjs:17)
    at e.t.execute (tfjs:17)
    at e.t.loadSync (tfjs:17)
    at e.<anonymous> (tfjs:17)
(anonymous) @ tfjs:17
UO @ tfjs:17
(anonymous) @ tfjs:17
(anonymous) @ tfjs:17
t.scopedRun @ tfjs:17
t.tidy @ tfjs:17
sx @ tfjs:17
t.execute @ tfjs:17
t.loadSync @ tfjs:17
(anonymous) @ tfjs:17
u @ tfjs:17
(anonymous) @ tfjs:17
forEach.e.<computed> @ tfjs:17
Um @ tfjs:17
o @ tfjs:17
async function (async)
run @ script.js:3
(anonymous) @ script.js:14
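Loading the model in the browser only reports the first unknown op it hits, so a useful step is to list every op the converted model references up front. A graph model’s model.json stores the frozen graph under modelTopology.node, one entry per node, with the op name in the "op" field. A minimal sketch (the sample topology and the supported-op list below are illustrative stand-ins, not the real converter output or the real tfjs op registry):

```javascript
// Tiny stand-in for a real model.json (the real file is produced by
// tensorflowjs_converter; load it with JSON.parse(fs.readFileSync(...))).
const sampleModelJson = {
  modelTopology: {
    node: [
      { name: 'image', op: 'Placeholder' },
      { name: 'decode', op: 'DecodeJpeg' },
      { name: 'table', op: 'HashTableV2' },
      { name: 'conv', op: 'Conv2D' },
    ],
  },
};

// Illustrative subset only -- the real supported-op list lives in the
// tfjs sources and changes between releases.
const SUPPORTED_OPS = new Set(['Placeholder', 'Conv2D', 'Relu', 'Add']);

// Return the sorted, de-duplicated op names the model uses but the
// given supported set does not contain.
function findUnsupportedOps(modelJson, supported) {
  const nodes = (modelJson.modelTopology && modelJson.modelTopology.node) || [];
  const missing = new Set();
  for (const node of nodes) {
    if (!supported.has(node.op)) missing.add(node.op);
  }
  return [...missing].sort();
}

console.log(findUnsupportedOps(sampleModelJson, SUPPORTED_OPS));
// [ 'DecodeJpeg', 'HashTableV2' ]
```

Running this against the actual converted model.json would show whether HashTableV2 is the only remaining blocker or just the first of several.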

I have tensorflow@2.3.1 installed and running Python 3.8.5 on OSX (Mojave).

Sample HTML I use:

<!DOCTYPE html>
<html lang="en">
  <head>
    <title>Hello World - TensorFlow.js</title>
    <meta charset="utf-8">
    <meta http-equiv="X-UA-Compatible" content="IE=edge">
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <link rel="stylesheet" href="/style.css">
    <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs"></script>
    <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-automl"></script>
    <script src="script.js"></script>
  </head>  
  <body>
    <img id="test" crossorigin="anonymous" src="test.jpg">
  </body>
</html>

Sample script I use:

async function run() {
  const MODEL_URL = 'model.json';
  const model = await tf.automl.loadObjectDetection(MODEL_URL);
  tf.enableDebugMode();
  const img = document.getElementById('test');
  const options = {score: 0.5, iou: 0.5, topk: 20};
  const predictions = await model.detect(img, options);
  console.log(predictions);
  const pre = document.createElement('pre');
  pre.textContent = JSON.stringify(predictions, null, 2);
  document.body.append(pre);
}
run();
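Since --skip_op_check only defers the failure to load time, it helps to surface which op the runtime rejected instead of letting the rejection disappear into an unhandled promise. The "Unknown op '…'" message above has a fixed shape, so a small helper can pull the op name out of the caught error. A sketch (the helper name extractUnknownOp is mine, not a tfjs API):

```javascript
// Extract the op name from a TF.js "Unknown op" error message, e.g.
// "Unknown op 'HashTableV2'. File an issue at ...".
// Returns null when the error is something else.
function extractUnknownOp(err) {
  const match = /Unknown op '([^']+)'/.exec(err && err.message ? err.message : '');
  return match ? match[1] : null;
}

// Hypothetical usage around the loader in the sample script:
// try {
//   const model = await tf.automl.loadObjectDetection(MODEL_URL);
// } catch (err) {
//   const op = extractUnknownOp(err);
//   if (op) console.error(`Model needs unsupported op: ${op}`);
//   throw err;
// }

console.log(extractUnknownOp(new Error("Unknown op 'HashTableV2'. File an issue at https://github.com/tensorflow/tfjs/issues")));
// 'HashTableV2'
```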

Is there a way to use the exported AutoML models with TensorflowJS?

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 5

Top GitHub Comments

1 reaction
rthadur commented, Oct 15, 2020

@peterschmiz would you be interested in contributing? Please check our contributing guidelines here.

0 reactions
google-ml-butler[bot] commented, Apr 30, 2021

Closing as stale. Please @mention us if this needs more attention.

