How to convert .pt model to .onnx model
Similar problem converting a .pt model to a .onnx model. After loading the .pt model, I executed:
```python
example = torch.ones(3, 1, 18, 1)
example = example.unsqueeze(0)
example = example.float().to(device).detach()
torch.onnx.export(model, example, "onnx_model_80.onnx")
```
Then I got the following error:
```
File "D:\ProgramData\Miniconda3\lib\site-packages\torch\onnx\__init__.py", line 25, in export
  return utils.export(*args, **kwargs)
File "D:\ProgramData\Miniconda3\lib\site-packages\torch\onnx\utils.py", line 131, in export
  strip_doc_string=strip_doc_string)
File "D:\ProgramData\Miniconda3\lib\site-packages\torch\onnx\utils.py", line 363, in _export
  _retain_param_name, do_constant_folding)
File "D:\ProgramData\Miniconda3\lib\site-packages\torch\onnx\utils.py", line 278, in _model_to_graph
  _disable_torch_constant_prop=_disable_torch_constant_prop)
File "D:\ProgramData\Miniconda3\lib\site-packages\torch\onnx\utils.py", line 188, in _optimize_graph
  graph = torch._C._jit_pass_onnx(graph, operator_export_type)
File "D:\ProgramData\Miniconda3\lib\site-packages\torch\onnx\__init__.py", line 50, in _run_symbolic_function
  return utils._run_symbolic_function(*args, **kwargs)
File "D:\ProgramData\Miniconda3\lib\site-packages\torch\onnx\utils.py", line 602, in _run_symbolic_function
  n.kindOf("value")))
RuntimeError: Unsupported prim::Constant kind: s. Send a bug report.
```
I am aware that the function torch.onnx.export accepts parameters called input_names and output_names, but I had difficulty acquiring them from the net; I used model.state_dict() to list all of the net's parameters. Could you give any advice on this? Thank you.
Issue Analytics
- Created 4 years ago
- Comments: 6
@zhanghaowei01 Sorry, I didn't use the .onnx model eventually, but I saw that PyTorch 1.7 supports the torch.einsum operator when generating .onnx models. Maybe this could help.

@zhanghaowei01 Have you solved it? How?