Error when converting an ONNX model to TensorRT
[05/19/2022-11:11:38] [W] [TRT] onnx2trt_utils.cpp:220: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
ERROR: builtin_op_importers.cpp:2261 In function importPad:
[8] Assertion failed: mode == "constant" && value == 0.f && "This version of TensorRT only supports constant 0 padding!"
[05/19/2022-11:11:38] [E] Failed to parse onnx file
[05/19/2022-11:11:38] [E] Parsing model failed
[05/19/2022-11:11:38] [E] Engine creation failed
[05/19/2022-11:11:38] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec # trtexec --onnx=colorize_infer_model_CPU_sim.onnx --saveEngine=colorize_infer_model_CPU_sim.trt --best
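The assertion in importPad points at a Pad node whose mode or constant value the parser rejects, not at the INT64 warning above it. A minimal sketch to locate and inspect the Pad nodes in the graph, assuming the onnx Python package and the model file name from the trtexec command above:

# Sketch: list the Pad nodes in the ONNX graph to see which one uses a mode
# or padding value that the importPad assertion rejects (this TensorRT
# version only accepts constant mode with value 0).
import onnx

model = onnx.load("colorize_infer_model_CPU_sim.onnx")

for node in model.graph.node:
    if node.op_type == "Pad":
        # opset >= 11: pads / constant_value are inputs; older opsets: attributes
        attrs = {a.name: onnx.helper.get_attribute_value(a) for a in node.attribute}
        print(node.name or "<unnamed Pad>", "inputs:", list(node.input), "attrs:", attrs)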
How do I convert the INT64 weights to INT32, or otherwise fix this type error?
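On the INT64 question: the [W] line is informational, since the parser already casts INT64 weights down to INT32 on its own, so it is usually not the cause of the failure. If you still want to do the cast yourself, here is a minimal sketch, assuming every INT64 initializer's values fit in the INT32 range and noting that an indiscriminate cast can invalidate inputs the ONNX spec requires to be INT64 (for example, Reshape shape inputs); the output file name is made up for illustration:

# Sketch: downcast INT64 initializers to INT32 where the values fit.
import numpy as np
import onnx
from onnx import numpy_helper

model = onnx.load("colorize_infer_model_CPU_sim.onnx")
info = np.iinfo(np.int32)

for init in model.graph.initializer:
    if init.data_type == onnx.TensorProto.INT64:
        values = numpy_helper.to_array(init)
        # Only cast when every value fits into INT32.
        if values.size and info.min <= values.min() and values.max() <= info.max:
            init.CopyFrom(numpy_helper.from_array(values.astype(np.int32), init.name))

onnx.save(model, "colorize_infer_model_CPU_sim_int32.onnx")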
Thank you for your reply. Sorry, this turned out to be a TensorRT limitation. Someone said that upgrading TensorRT would solve this problem, so I upgraded to tensorrt==8.2.0 and that resolved it.
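For anyone hitting the same error: after upgrading, it is worth confirming that the environment is actually picking up the new TensorRT before rebuilding the engine. A tiny sketch, assuming the TensorRT Python bindings were upgraded alongside the libraries:

import tensorrt
print(tensorrt.__version__)  # expect 8.2.x or newer before re-running trtexec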
Sorry, I don't have permission on the onnx-tensorrt repository, so I cannot transfer this issue there.