ONNX export error, and can the ONNX model support TensorRT?
See original GitHub issue

When I export the detr_r50.pth model to ONNX, the export fails with:

RuntimeError: Failed to export an ONNX attribute 'onnx::Sub', since it's not constant, please try to make things (e.g., kernel size) static if possible
Issue Analytics
- State:
- Created 3 years ago
- Comments: 19 (5 by maintainers)
Top Results From Across the Web
ONNX and tensorRT: ERROR: Network must have at least one ...
Hi, I exported a model to ONNX from pytorch 1.0, and tried to load it to tensorRT using: def build_engine_onnx(model_file): with trt.

torch.onnx — PyTorch 1.13 documentation
The torch.onnx module can export PyTorch models to ONNX. The model can then be consumed by any of the many runtimes that support...

TensorRT Execution Provider - NVIDIA - ONNX Runtime
If some operators in the model are not supported by TensorRT, ONNX Runtime will partition the graph and only send supported subgraphs to...

Tutorial 9: ONNX to TensorRT (Experimental)
Try the new MMDeploy to deploy your model · How to convert models from ONNX to TensorRT · How to evaluate the exported...

Deployment techniques for PyTorch models using TensorRT
To deploy PyTorch models using TensorRT, we will export them in ONNX format. ONNX stands for Open Neural Network Exchange and is an...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
I have published a DETR TensorRT solution here: https://github.com/wang-xinyu/tensorrtx/tree/master/detr
DETR TensorRT: https://github.com/DataXujing/TensorRT-DETR