Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

How to convert .pt model to .onnx model

See original GitHub issue

I have a similar problem converting a .pt model to a .onnx model. After loading the .pt model, I executed:

example = torch.ones(3, 1, 18, 1)
example = example.unsqueeze(0)
example = example.float().to(device).detach()
torch.onnx.export(model, example, "onnx_model_80.onnx")

Then I got the following error:

File "D:\ProgramData\Miniconda3\lib\site-packages\torch\onnx\__init__.py", line 25, in export
    return utils.export(*args, **kwargs)
File "D:\ProgramData\Miniconda3\lib\site-packages\torch\onnx\utils.py", line 131, in export
    strip_doc_string=strip_doc_string)
File "D:\ProgramData\Miniconda3\lib\site-packages\torch\onnx\utils.py", line 363, in _export
    _retain_param_name, do_constant_folding)
File "D:\ProgramData\Miniconda3\lib\site-packages\torch\onnx\utils.py", line 278, in _model_to_graph
    _disable_torch_constant_prop=_disable_torch_constant_prop)
File "D:\ProgramData\Miniconda3\lib\site-packages\torch\onnx\utils.py", line 188, in _optimize_graph
    graph = torch._C._jit_pass_onnx(graph, operator_export_type)
File "D:\ProgramData\Miniconda3\lib\site-packages\torch\onnx\__init__.py", line 50, in _run_symbolic_function
    return utils._run_symbolic_function(*args, **kwargs)
File "D:\ProgramData\Miniconda3\lib\site-packages\torch\onnx\utils.py", line 602, in _run_symbolic_function
    n.kindOf("value")))
RuntimeError: Unsupported prim::Constant kind: s. Send a bug report.

I am aware that torch.onnx.export takes parameters called input_names and output_names, but I had difficulty acquiring them from the net, as I used model.state_dict() to list all of the net's parameters. Could you give any advice on this? Thank you.
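For what it is worth, input_names and output_names are just string labels for the exported graph's inputs and outputs; they do not need to match keys from model.state_dict(), so any names you choose will work. Below is a minimal sketch of the export call with explicit names, assuming the same example shape as in the question; the file paths and names are placeholders, and since the prim::Constant error is often version-related, a newer PyTorch and a higher opset_version are worth trying first:

import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Placeholder path: this assumes the whole module was saved with torch.save(model, ...).
# If only a state_dict was saved, build the model class first and call load_state_dict.
model = torch.load("model.pt", map_location=device)
model.eval()  # export in eval mode so dropout/batchnorm behave deterministically

# Same example input shape as in the question: (1, 3, 1, 18, 1) after unsqueeze
example = torch.ones(3, 1, 18, 1).unsqueeze(0).float().to(device)

torch.onnx.export(
    model,
    example,
    "onnx_model_80.onnx",
    input_names=["input"],    # arbitrary labels, not state_dict keys
    output_names=["output"],
    opset_version=11,         # a higher opset often avoids unsupported-op errors
)

If the same error still appears on a recent PyTorch release, the "s" in the message refers to a string-typed constant in the traced graph, so the next place to look would be string-valued arguments inside the model's forward pass.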

Issue Analytics

  • State: closed
  • Created 4 years ago
  • Comments: 6

Top GitHub Comments

1 reaction
qualia0000 commented on Feb 2, 2021

@zhanghaowei01 Sorry, I didn't end up using the .onnx model, but I saw that PyTorch 1.7 supports the torch.einsum operator when exporting to .onnx. Maybe this could help.
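A toy sketch of that suggestion, assuming PyTorch 1.7 or newer (torch.einsum maps to the ONNX Einsum op, which requires opset 12 or later); the module below is purely illustrative, not the original network:

import torch
import torch.nn as nn

class EinsumToy(nn.Module):
    def forward(self, x, y):
        # batched matrix multiplication written with einsum
        return torch.einsum("bij,bjk->bik", x, y)

model = EinsumToy().eval()
x = torch.randn(2, 3, 4)
y = torch.randn(2, 4, 5)

torch.onnx.export(
    model, (x, y), "einsum_toy.onnx",
    input_names=["x", "y"], output_names=["out"],
    opset_version=12,  # Einsum is only available from opset 12 onward
)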

0 reactions
henbucuoshanghai commented on Oct 11, 2021

@zhanghaowei01 Have you solved it? How?

Read more comments on GitHub >

Top Results From Across the Web

How to Convert a PyTorch Model to ONNX in 5 Minutes
Converting deep learning models from PyTorch to ONNX is quite straightforward. Start by loading a pre-trained ResNet-50 model from PyTorch's ...
Read more >
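As a rough sketch of what that article describes (assuming torchvision is installed; the pretrained ResNet-50 weights download on first use, and the file and tensor names here are placeholders):

import torch
import torchvision.models as models

model = models.resnet50(pretrained=True).eval()
dummy = torch.randn(1, 3, 224, 224)  # standard ImageNet-sized input

torch.onnx.export(
    model, dummy, "resnet50.onnx",
    input_names=["input"], output_names=["logits"],
    dynamic_axes={"input": {0: "batch"}, "logits": {0: "batch"}},  # allow variable batch size
    opset_version=11,
)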
Convert PyTorch Model to ONNX Model - Documentation
To convert a PyTorch model to an ONNX model, you need both the PyTorch model and the source code that generates the PyTorch...
Read more >
Convert your PyTorch training model to ONNX
Export the model · Copy the following code into the PyTorchTraining.py file in Visual Studio, above your main function. · To run the...
Read more >
How-to-convert `.pt` to `.onnx` model file format.
Convert .pt to .onnx ... The function used in Scaled-YOLOv4; please refer to the Scaled-YOLOv4 repository. ... You should see output like this: Namespace(batch_size=1, ...
Read more >
(optional) Exporting a Model from PyTorch to ONNX and ...
To export a model, we call the torch.onnx.export() function. This will execute the model, recording a trace of what operators are used to...
Read more >
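After any of the exports above, the result can be sanity-checked along the lines of that tutorial. A minimal sketch, assuming the onnx and onnxruntime packages are installed and that a file named resnet50.onnx (a placeholder) exists with an input named "input", as in the ResNet-50 sketch earlier:

import numpy as np
import onnx
import onnxruntime as ort

# Structural validity check of the exported graph
onnx_model = onnx.load("resnet50.onnx")
onnx.checker.check_model(onnx_model)

# Run one inference through ONNX Runtime and inspect the output shape
session = ort.InferenceSession("resnet50.onnx", providers=["CPUExecutionProvider"])
dummy = np.random.randn(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {"input": dummy})  # key must match the export's input_names
print(outputs[0].shape)

If the ONNX Runtime output roughly matches what the PyTorch model produces for the same input, the export is usually in good shape.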
