Error when running online demo main.py
See original GitHub issue
I'm getting this error when I try to execute python3 main.py inside the ~/temporal-shift-module/online_demo folder:
Open camera...
<VideoCapture 0x7f2f0b2370>
Build transformer...
/usr/local/lib/python3.6/dist-packages/torchvision-0.5.0a0+85b8fbf-py3.6-linux-aarch64.egg/torchvision/transforms/transforms.py:220: UserWarning: The use of the transforms.Scale transform is deprecated, please use transforms.Resize instead.
"please use transforms.Resize instead.")
Build Executor...
/home/bm/temporal-shift-module/online_demo/mobilenet_v2_tsm.py:95: TracerWarning: Converting a tensor to a Python index might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
x1, x2 = x[:, : c // 8], x[:, c // 8:]
Traceback (most recent call last):
File "main.py", line 386, in <module>
main()
File "main.py", line 282, in main
executor, ctx = get_executor()
File "main.py", line 96, in get_executor
return torch2executor(torch_module, torch_inputs, target)
File "main.py", line 52, in torch2executor
graph, tvm_module, params = torch2tvm_module(torch_module, torch_inputs, target)
File "main.py", line 31, in torch2tvm_module
torch.onnx.export(torch_module, torch_inputs, buffer, input_names=input_names, output_names=["o" + str(i) for i in range(len(torch_inputs))])
File "/home/bm/.local/lib/python3.6/site-packages/torch/onnx/__init__.py", line 148, in export
strip_doc_string, dynamic_axes, keep_initializers_as_inputs)
File "/home/bm/.local/lib/python3.6/site-packages/torch/onnx/utils.py", line 66, in export
dynamic_axes=dynamic_axes, keep_initializers_as_inputs=keep_initializers_as_inputs)
File "/home/bm/.local/lib/python3.6/site-packages/torch/onnx/utils.py", line 416, in _export
fixed_batch_size=fixed_batch_size)
File "/home/bm/.local/lib/python3.6/site-packages/torch/onnx/utils.py", line 296, in _model_to_graph
fixed_batch_size=fixed_batch_size, params_dict=params_dict)
File "/home/bm/.local/lib/python3.6/site-packages/torch/onnx/utils.py", line 135, in _optimize_graph
graph = torch._C._jit_pass_onnx(graph, operator_export_type)
File "/home/bm/.local/lib/python3.6/site-packages/torch/onnx/__init__.py", line 179, in _run_symbolic_function
return utils._run_symbolic_function(*args, **kwargs)
File "/home/bm/.local/lib/python3.6/site-packages/torch/onnx/utils.py", line 657, in _run_symbolic_function
return op_fn(g, *inputs, **attrs)
File "/home/bm/.local/lib/python3.6/site-packages/torch/onnx/symbolic_helper.py", line 129, in wrapper
return fn(g, *args)
File "/home/bm/.local/lib/python3.6/site-packages/torch/onnx/symbolic_opset9.py", line 1311, in slice
raise RuntimeError('Unsupported: ONNX export of Slice with dynamic inputs. DynamicSlice '
RuntimeError: Unsupported: ONNX export of Slice with dynamic inputs. DynamicSlice is a deprecated experimental op. Please use statically allocated variables or export to a higher opset version.
Issue Analytics
- Created 4 years ago
- Comments: 29
I have solved this problem. The solution is to export the torch model with opset 10 instead of the default opset 9; TVM does not fully support some of the operators in opset 9. Additionally, the tool "onnx-simplifier" is a great help.
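In main.py this amounts to adding opset_version=10 to the torch.onnx.export(...) call in torch2tvm_module (the call visible in the traceback above). Below is a self-contained toy sketch of the same fix, using an illustrative module with the same channel-slice pattern as mobilenet_v2_tsm.py; the module, shapes and names here are placeholders, not the demo code.

import io
import torch
import torch.nn as nn

class ChannelSlice(nn.Module):
    # Illustrative stand-in for the TSM shift module: slices along the
    # channel dimension, the pattern flagged by the TracerWarning above.
    def forward(self, x):
        c = x.size(1)
        return x[:, : c // 8], x[:, c // 8:]

module = ChannelSlice().eval()
dummy = torch.rand(1, 8, 56, 56)
buffer = io.BytesIO()
# opset_version=10 is the key change: with the default opset 9, the dynamic
# slicing in the real model is exported as the deprecated experimental
# DynamicSlice op, which produces the RuntimeError shown above and is
# rejected by TVM.
torch.onnx.export(module, (dummy,), buffer,
                  input_names=["i0"],
                  output_names=["o0", "o1"],
                  opset_version=10)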
Hi @poincarelee, how did you install onnx-simplifier? I get an error with the onnxruntime dependency; did you compile onnxruntime from source?
edit:
I did have to compile onnxruntime from source; I used the 1.4 branch and installed it from the wheel file produced by the build. I also compiled it with CUDA support, which meant changing onnx-simplifier to depend on onnxruntime-gpu instead of onnxruntime. After that I managed to install onnx-simplifier, applied the changes suggested above, and it finally worked. A rough sketch of this route is shown below.
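A sketch of that build-and-install route under the assumptions above. The branch name, CUDA paths and wheel filename are illustrative and depend on your platform and onnxruntime version, and the .onnx filenames are placeholders (the demo itself exports to an in-memory buffer):

# Build onnxruntime from source with CUDA support.
git clone --recursive --branch rel-1.4.0 https://github.com/microsoft/onnxruntime.git
cd onnxruntime
./build.sh --config Release --build_wheel --use_cuda \
    --cuda_home /usr/local/cuda --cudnn_home /usr/local/cuda
pip3 install build/Linux/Release/dist/onnxruntime_gpu-*.whl

# onnx-simplifier normally depends on the CPU-only onnxruntime package;
# clone it, edit its dependency list (setup.py / requirements) to point at
# onnxruntime-gpu as described above, then install from the local checkout.
git clone https://github.com/daquexian/onnx-simplifier.git
cd onnx-simplifier
pip3 install .

# Simplify the exported ONNX model before handing it to TVM.
python3 -m onnxsim model.onnx model_simplified.onnx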