TensorRT conversion failure
See original GitHub issue
Traceback (most recent call last):
File "to_onnx.py", line 38, in <module>
generate_onnx_file()
File "to_onnx.py", line 28, in generate_onnx_file
onnx_mxnet.export_model(net_symbol, net_params, [input_shape], numpy.float32, onnx_path, verbose=True)
File "/usr/local/lib/python3.5/dist-packages/mxnet/contrib/onnx/mx2onnx/export_model.py", line 87, in export_model
verbose=verbose)
File "/usr/local/lib/python3.5/dist-packages/mxnet/contrib/onnx/mx2onnx/export_onnx.py", line 309, in create_onnx_graph_proto
checker.check_graph(graph)
File "/usr/local/lib/python3.5/dist-packages/onnx/checker.py", line 52, in checker
proto.SerializeToString(), ctx)
onnx.onnx_cpp2py_export.checker.ValidationError: Node (slice_axis20) has input size 1 not in range [min=3, max=5].
==> Context: Bad node spec: input: "softmax0" output: "slice_axis20" name: "slice_axis20" op_type: "Slice" attribute { name: "axes" ints: 1 type: INTS } attribute { name: "ends" ints: 1 type: INTS } attribute { name: "starts" ints: 0 type: INTS }
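For context, below is a minimal sketch of the export script implied by the traceback. Only the generate_onnx_file name and the export_model call come from the traceback; the file names and input shape are placeholder assumptions.

import numpy
from mxnet.contrib import onnx as onnx_mxnet

def generate_onnx_file():
    # Placeholder paths to an exported MXNet model (assumed names).
    net_symbol = 'model-symbol.json'
    net_params = 'model-0000.params'
    onnx_path = 'model.onnx'
    input_shape = (1, 3, 112, 112)  # assumed NCHW input shape

    # The ONNX graph checker failure shown in the traceback is raised from
    # inside this call, after the graph proto has been built.
    onnx_mxnet.export_model(net_symbol, net_params, [input_shape],
                            numpy.float32, onnx_path, verbose=True)

if __name__ == '__main__':
    generate_onnx_file()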
Issue Analytics
- Created: 4 years ago
- Comments: 6 (2 by maintainers)
Top Results From Across the Web

ONNX/TensorRT conversion failure for Mask-RCNN model
Am getting shape issues when I try to convert Detectron Mask-RCNN model to onnx (and then to TensorRT), despite following the guide...

TensorRT conversion fails with DCHECK(!i->is_use_only())
I am trying to convert the following simple ONNX model to TensorRT: Both when using trtexec or the python API it fails...

How to Convert Your Custom Model into TensorRT
In this blog, we'll show you how to convert your model with custom operators into TensorRT and how to avoid these errors! Nvidia...

Face recognition: OnnX to TensorRT conversion of Arcface ...
In this blog post, I will explain the steps required in the model conversion of ONNX to TensorRT and the reason why my...

tf.experimental.tensorrt.Converter | TensorFlow v2.11.0
An offline converter for TF-TRT transformation for TF 2.0 ... Run inference with converted graph in order to build TensorRT engines.
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@prnvjb @gneworld @jinfagang I tried onnx==1.4 and it works OK.
Hey @YonghaoHe, could you share your mxnet and onnx versions? Thanks!
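The error is consistent with an onnx version mismatch: from opset 10 (onnx >= 1.5) the Slice operator takes starts/ends/axes as inputs (3 to 5 in total) rather than attributes, so the attribute-style slice_axis node emitted by the MXNet exporter fails the newer checker with "input size 1 not in range [min=3, max=5]". Pinning onnx to a 1.4.x release (e.g. pip install onnx==1.4.1), as reported above, avoids the stricter check. A small snippet to report the versions asked for above (it relies only on the standard __version__ attributes):

import mxnet
import onnx

# Print the versions requested in the comment above.
print('mxnet:', mxnet.__version__)
print('onnx :', onnx.__version__)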