
Cannot convert AutoML Tables model to TF.JS model

See original GitHub issue

Please make sure that this is a bug. As per our GitHub Policy, we only address code/doc bugs, performance issues, feature requests and build/installation issues on GitHub.

System information

  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow.js): No
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04):
  • Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device:
  • TensorFlow.js installed from (npm or script link): pip
  • TensorFlow.js version (use command below): 2.4.0
  • Browser version: N/A
  • Tensorflow.js Converter Version: 2.4.0 ?

Describe the current behavior

I’ve exported an AutoML Tables model to a TF SavedModel. Now I’m trying to convert that model to TF.js and I get the error below. Is there any way to convert this model?

Installing collected packages: tensorflow-hub, wcwidth, prompt-toolkit, Pygments, regex, PyInquirer, tensorflow-cpu, tensorflowjs
Successfully installed PyInquirer-1.0.3 Pygments-2.7.1 prompt-toolkit-1.0.14 regex-2020.7.14 tensorflow-cpu-2.3.0 tensorflow-hub-0.7.0 tensorflowjs-2.4.0 wcwidth-0.2.5


+ tensorflowjs_converter --input_format=tf_saved_model --output_format=tfjs_graph_model --signature_name=serving_default --saved_model_tags=serve /tmp/inputs/Model/data /tmp/outputs/Model/data
Traceback (most recent call last):
  File "/usr/local/bin/tensorflowjs_converter", line 8, in <module>
    sys.exit(pip_main())
  File "/usr/local/lib/python3.6/dist-packages/tensorflowjs/converters/converter.py", line 757, in pip_main
    main([' '.join(sys.argv[1:])])
  File "/usr/local/lib/python3.6/dist-packages/tensorflowjs/converters/converter.py", line 761, in main
    convert(argv[0].split(' '))
  File "/usr/local/lib/python3.6/dist-packages/tensorflowjs/converters/converter.py", line 699, in convert
    experiments=args.experiments)
  File "/usr/local/lib/python3.6/dist-packages/tensorflowjs/converters/tf_saved_model_conversion_v2.py", line 481, in convert_tf_saved_model
    model = load(saved_model_dir, saved_model_tags)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/saved_model/load.py", line 603, in load
    return load_internal(export_dir, tags, options)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/saved_model/load.py", line 649, in load_internal
    root = load_v1_in_v2.load(export_dir, tags)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/saved_model/load_v1_in_v2.py", line 263, in load
    return loader.load(tags=tags)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/saved_model/load_v1_in_v2.py", line 209, in load
    signature=[])
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/eager/wrap_function.py", line 628, in wrap_function
    collections={}),
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/func_graph.py", line 986, in func_graph_from_py_func
    func_outputs = python_func(*func_args, **func_kwargs)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/eager/wrap_function.py", line 87, in __call__
    return self.call_with_variable_creator_scope(self._fn)(*args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/eager/wrap_function.py", line 93, in wrapped
    return fn(*args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/saved_model/load_v1_in_v2.py", line 90, in load_graph
    meta_graph_def)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/training/saver.py", line 1486, in _import_meta_graph_with_return_elements
    **kwargs))
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/meta_graph.py", line 799, in import_scoped_meta_graph_with_return_elements
    return_elements=return_elements)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/util/deprecation.py", line 507, in new_func
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/importer.py", line 405, in import_graph_def
    producer_op_list=producer_op_list)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/importer.py", line 497, in _import_graph_def_internal
    graph._c_graph, serialized, options)  # pylint: disable=protected-access
tensorflow.python.framework.errors_impl.NotFoundError: Converting GraphDef to Graph has failed. The binary trying to import the GraphDef was built when GraphDef version was 440. The GraphDef was produced by a binary built when GraphDef version was 518. The difference between these versions is larger than TensorFlow's forward compatibility guarantee. The following error might be due to the binary trying to import the GraphDef being too old: Op type not registered 'DecodeProtoSparseV2' in binary running on retail-product-stockout-prediction-pipeline-gcs-v765m-318467729. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
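The error says two things: the SavedModel was produced by a newer TensorFlow build (GraphDef version 518) than the one the converter is running against (version 440), and the graph uses the op DecodeProtoSparseV2, which is not registered in that binary. To check what the exported model actually contains, a minimal sketch along these lines (assuming the model directory from the command above, /tmp/inputs/Model/data) reads the SavedModel proto directly and prints its producer version and whether the missing op appears:

# Sketch: inspect the exported SavedModel without loading it into a TF runtime.
# The path below is assumed from the converter command above; adjust as needed.
from tensorflow.core.protobuf import saved_model_pb2

SAVED_MODEL_PB = "/tmp/inputs/Model/data/saved_model.pb"

saved_model = saved_model_pb2.SavedModel()
with open(SAVED_MODEL_PB, "rb") as f:
    saved_model.ParseFromString(f.read())

graph_def = saved_model.meta_graphs[0].graph_def
print("GraphDef producer version:", graph_def.versions.producer)

# Collect every op referenced by the main graph and by its function library.
ops = {node.op for node in graph_def.node}
for fn in graph_def.library.function:
    ops.update(node.op for node in fn.node_def)
print("Uses DecodeProtoSparseV2:", "DecodeProtoSparseV2" in ops)

If the producer version printed here is higher than the GraphDef version supported by the TensorFlow build the converter uses, that confirms the mismatch reported in the traceback.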

Issue Analytics

  • State: open
  • Created: 3 years ago
  • Comments: 8 (2 by maintainers)

Top GitHub Comments

1 reaction
tafsiri commented, Sep 21, 2020

cc @rthadur (updated tags)

0 reactions
pyu10055 commented, Sep 22, 2020

@Ark-kun I see that AutoML is producing a graph_def with the latest version of TensorFlow. After you have installed the tensorflowjs pip package, you can try the following:

  1. uninstall the tensorflow-cpu pip package
  2. install the tf-nightly-cpu pip package

Then try to convert the model again.
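In concrete terms, that suggestion comes down to something like the following (a sketch only, reusing the converter flags and paths from the original command, and assuming pip is the package manager in use):

pip uninstall -y tensorflow-cpu
pip install tf-nightly-cpu
tensorflowjs_converter \
    --input_format=tf_saved_model \
    --output_format=tfjs_graph_model \
    --signature_name=serving_default \
    --saved_model_tags=serve \
    /tmp/inputs/Model/data /tmp/outputs/Model/data

Note that this addresses the GraphDef version mismatch; whether DecodeProtoSparseV2 is registered in the nightly build is a separate question.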

