Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

question about model export to onnx

See original GitHub issue

I want to use TensorRT to deploy the model, so I tried to export the model to ONNX, but I hit this error when calling the export_onnx function:

RuntimeError: Only tuples, lists and Variables supported as JIT inputs/outputs. Dictionaries and strings are also accepted but their usage is not recommended. But got unsupported type numpy.ndarray

This seems to be because the forward function takes a dict as its input parameter. Is there a non-destructive way to support exporting to ONNX?

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 18

Top GitHub Comments

3 reactions
muzi2045 commented, Aug 19, 2020

PyTorch doesn't support dict inputs when exporting a model to ONNX, so the network input and output need to change: dict input --> list input. Here are three parts of the ONNX export from the OpenPCDet codebase:

(three code screenshots from the original comment, not reproduced here)

1 reaction
NLCharles commented, Nov 9, 2020

Thank you for the information. I only tested the PartA2 model in PyTorch, and it's not far from 10 FPS. I thought porting it to TensorRT could boost its speed.

Could you please tell me which ops are not GPU-friendly, e.g. sparse convolution?

Looking forward to your results.

Thank you for your reply. Did your runtime include preprocessing and post-processing?

Ops utilized by the PointNet++ modules, set abstraction and forward interpolation, involve intensive logical operations.

For VoxelNet, voxelization of the input costs a lot of time, and sparse convolution must be wrapped in a TensorRT plugin.

Read more comments on GitHub >

Top Results From Across the Web

Export to ONNX - Transformers - Hugging Face
In this guide, we'll show you how to export Transformers models to ONNX (Open Neural Network eXchange). Once exported, a model can be...
Read more >
Best Practices for Neural Network Exports to ONNX
Our experience shows that is easier to export PyTorch models. If possible, choose a PyTorch source and convert it using the built-in torch.onnx...
Read more >
ONNX Model: Export Using Pytorch, Problems, and Solutions
Always, use the latest ONNX while exporting the model. Also, always try to use the latest opset, for example, the current latest is...
Read more >
(optional) Exporting a Model from PyTorch to ONNX and ...
This tutorial will use as an example a model exported by tracing. To export a model, we call the torch.onnx.export() function. This will...
Read more >
Convert PyTorch Model to ONNX Model - Documentation
The PyTorch 'compiler' will correctly capture any control flow, and correctly export the model to ONNX format. This sounds like a proper solution...
Read more >
