Optimizer crashes on onnx model converted from Tensorflow graph
Hi, I am trying to invoke the ONNX optimizer following the instructions in this post:
https://github.com/onnx/onnx/blob/master/docs/PythonAPIOverview.md#optimizing-an-onnx-model
The ONNX model is converted from a TensorFlow `.pb` file:
```python
onnx_model = tensorflow_graph_to_onnx_model(graph_def, outputs, opset=7,
                                            ignore_unimplemented=True)

all_passes = optimizer.get_available_passes()
print("Available optimization passes:")
for p in all_passes:
    print(p)
print()

# Apply the optimization on the original model
passes = ['fuse_consecutive_transposes']
optimized_onnx_model = optimizer.optimize(onnx_model, passes)
```
I am getting a crash in the optimizer:

```
File "/home/xxx/PycharmProjects/virtualenv/local/lib/python3.5/site-packages/onnx/optimizer.py", line 52, in optimize
    optimized_model_str = C.optimize(model_str, passes)
IndexError: _Map_base::at
```
onnx version: 1.3.0
onnx-tf version: 1.1.2
Thanks!
Issue Analytics
- State:
- Created 5 years ago
- Reactions:5
- Comments:15 (2 by maintainers)

Hi all, I finally resolved this issue: you have to add `keep_initializers_as_inputs=True` when exporting to ONNX, since this commit:
https://github.com/pytorch/pytorch/commit/7583519b870e33ee3182f330c1bb8663559697b6
I have the same problem. The error is `invalid unordered_map<K, T> key`.