
Failed to export onnx model

See original GitHub issue

Code version (Git Hash) and PyTorch version

st-gcn master@e7024ac and PyTorch 1.1.0a0+828a6a3

Dataset used

Demo

Expected behavior

Successfully export onnx model

Actual behavior

```
root@p4station:/workspace# python main.py demo --openpose openpose/build/ --device 1
/workspace/processor/io.py:39: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
  default_arg = yaml.load(f)
Starting OpenPose demo...
Auto-detecting all available GPUs...
Detected 1 GPU(s), using 1 of them starting at GPU 0.
Starting thread(s)...
OpenPose demo successfully finished. Total time: 47.039486 seconds.
Pose estimation complete.

Network forwad...
Prediction result: skateboarding
Done.
/workspace/net/utils/tgcn.py:58: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  assert A.size(0) == self.kernel_size
Traceback (most recent call last):
  File "main.py", line 31, in <module>
    p.start()
  File "/workspace/processor/demo.py", line 85, in start
    torch.onnx.export(self.model, dummy_input, "st-gcn_kinetics-skeleton.onnx")
  File "/opt/conda/lib/python3.6/site-packages/torch/onnx/__init__.py", line 24, in export
    return utils.export(*args, **kwargs)
  File "/opt/conda/lib/python3.6/site-packages/torch/onnx/utils.py", line 108, in export
    _retain_param_name=_retain_param_name)
  File "/opt/conda/lib/python3.6/site-packages/torch/onnx/utils.py", line 315, in _export
    _retain_param_name)
  File "/opt/conda/lib/python3.6/site-packages/torch/onnx/utils.py", line 245, in _model_to_graph
    graph = _optimize_graph(graph, operator_export_type)
  File "/opt/conda/lib/python3.6/site-packages/torch/onnx/utils.py", line 164, in _optimize_graph
    graph = torch._C._jit_pass_onnx(graph, operator_export_type)
  File "/opt/conda/lib/python3.6/site-packages/torch/onnx/__init__.py", line 49, in _run_symbolic_function
    return utils._run_symbolic_function(*args, **kwargs)
  File "/opt/conda/lib/python3.6/site-packages/torch/onnx/utils.py", line 550, in _run_symbolic_function
    n.kindOf("value")))
RuntimeError: Unsupported prim::Constant kind: s. Send a bug report.
```

Steps to reproduce the behavior

```python
dummy_input = torch.randn(1, 3, 300, 18, 2, device='cuda')
torch.onnx.export(self.model, dummy_input, "st-gcn_kinetics-skeleton.onnx")
```
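The string equation passed to torch.einsum in net/utils/tgcn.py is the most likely culprit: during tracing it becomes a string-valued prim::Constant, which the ONNX exporter in PyTorch 1.1 cannot convert, matching the "Unsupported prim::Constant kind: s" error above (and consistent with the workaround in the comments below). A minimal sketch that should reproduce the failure independently of st-gcn on that PyTorch version; the tensor sizes are illustrative assumptions:

```python
import torch

class EinsumOnly(torch.nn.Module):
    """Minimal module whose forward contains only the einsum in question."""
    def forward(self, x, A):
        return torch.einsum('nkctv,kvw->nctw', (x, A))

x = torch.randn(1, 3, 16, 30, 18)  # (n, k, c, t, v) - illustrative sizes
A = torch.randn(3, 18, 18)         # (k, v, w)

# On PyTorch 1.1 this is expected to fail with the same
# "Unsupported prim::Constant kind: s" error, because the equation string
# ends up as a string-valued constant in the traced graph.
torch.onnx.export(EinsumOnly(), (x, A), 'einsum_only.onnx')
```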

Other comments

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 5

Top GitHub Comments

1 reaction
nfeng0105 commented, Jul 17, 2019

@lxy5513 I replaced the einsum with other supported ops:

```python
# x = torch.einsum('nkctv,kvw->nctw', (x, A))

# Equivalent reshape + matrix multiply, built from ONNX-exportable ops
x = x.permute(0, 2, 3, 1, 4).contiguous()  # (n, c, t, k, v)
n, c, t, k, v = x.size()
k, v, w = A.size()
x = x.view(n * c * t, k * v)
A = A.view(k * v, w)
x = torch.mm(x, A)
x = x.view(n, c, t, w)
```
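A quick way to convince yourself the reshape-and-mm path matches the original einsum is to compare both on random tensors; the sizes below (k = 3 partitions, v = w = 18 joints) are illustrative assumptions rather than values taken from the issue:

```python
import torch

# Illustrative ST-GCN-like shapes (assumed, not from the issue)
n, k, c, t, v, w = 2, 3, 16, 30, 18, 18
x = torch.randn(n, k, c, t, v)
A = torch.randn(k, v, w)

# Original formulation
ref = torch.einsum('nkctv,kvw->nctw', x, A)

# Reshape + mm replacement from the comment above
y = x.permute(0, 2, 3, 1, 4).contiguous().view(n * c * t, k * v)
out = torch.mm(y, A.view(k * v, w)).view(n, c, t, w)

print(torch.allclose(ref, out, atol=1e-5))  # expected: True
```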

0 reactions
RobotTim commented, Jun 3, 2021

@nfeng0105 @lxy5513 Hi, have you ever exported the ONNX model successfully? Could you please share the export script? This problem has bothered me for several days.
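No complete script was posted in the thread. One possible way to apply nfeng0105's replacement at export time, without editing the library source, is to temporarily swap torch.einsum for a version that special-cases this one pattern; this is an untested sketch, and `model` stands in for however the trained ST-GCN network is loaded:

```python
import torch

_real_einsum = torch.einsum

def _export_friendly_einsum(equation, *operands):
    # st-gcn calls torch.einsum('nkctv,kvw->nctw', (x, A)) with the operands in a tuple
    if len(operands) == 1 and isinstance(operands[0], (list, tuple)):
        operands = tuple(operands[0])
    if equation == 'nkctv,kvw->nctw':
        x, A = operands
        n, k, c, t, v = x.size()
        k, v, w = A.size()
        x = x.permute(0, 2, 3, 1, 4).contiguous().view(n * c * t, k * v)
        return torch.mm(x, A.view(k * v, w)).view(n, c, t, w)
    return _real_einsum(equation, *operands)

model = ...  # load the trained ST-GCN model here, e.g. as done in processor/demo.py
dummy_input = torch.randn(1, 3, 300, 18, 2, device='cuda')

# Trace and export with the ONNX-friendly einsum in place, then restore the original
torch.einsum = _export_friendly_einsum
try:
    torch.onnx.export(model, dummy_input, 'st-gcn_kinetics-skeleton.onnx')
finally:
    torch.einsum = _real_einsum
```

Editing net/utils/tgcn.py directly, as in the comment above, achieves the same result and is simpler if modifying the repository is acceptable.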

Read more comments on GitHub >

Top Results From Across the Web

Failed to export an ONNX attribute, since it's not constant ...
Go to file symbolic.py in onnx directory (for its path see error on your terminal like in my case it was - /home/adam/anaconda3/lib/python3.6/ ......
Read more >
Can't convert Pytorch to ONNX - Stack Overflow
I used to have a similar error when exporting using. torch.onnx.export(model, x, ONNX_FILE_PATH). and I fixed it by specifying the ...
Read more >
Exporting an ONNX Model - Huawei Support
Before exporting the ONNX model using the .pth.tar file, you need to check the saved information. Sometimes, the saved node name may be...
Read more >
Attempting to export FullSubNet-plus: Failed to export an ONNX ...
Attempting to export FullSubNet-plus: Failed to export an ONNX ... torch/onnx/__init__.py", line 316, in export return utils.export(model, args, ...
Read more >
detectron2 torch.onnx.export gives device error for all options ...
I am trying to export an object detection model trained with detectron2 framework. I am using latest 0.6 version.
Read more >
