Convert TSM pytorch model to onnx
I have succeeded in converting the TSM PyTorch model to ONNX, but the ONNX output differs from the PyTorch output. During conversion it showed:
TracerWarning: Converting a tensor to a Python index might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
out[:, :-1, :fold] = x[:, 1:, :fold] # shift left
/Users/zhanghongxing/vscode_project/tsm_inference/infer/../ops/temporal_shift.py:37: TracerWarning: There are 2 live references to the data region being modified when tracing in-place operator copy_ (possibly due to an assignment). This might cause the trace to be incorrect, because all other views that also reference this data will not reflect this change in the trace! On the other hand, if all other views use the same memory chunk, but are disjoint (e.g. are outputs of torch.split), this might still be safe.
out[:, :-1, :fold] = x[:, 1:, :fold] # shift left
/Users/zhanghongxing/vscode_project/tsm_inference/infer/../ops/temporal_shift.py:38: TracerWarning: Converting a tensor to a Python index might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
out[:, 1:, fold: 2 * fold] = x[:, :-1, fold: 2 * fold] # shift right
/Users/zhanghongxing/vscode_project/tsm_inference/infer/../ops/temporal_shift.py:38: TracerWarning: There are 2 live references to the data region being modified when tracing in-place operator copy_ (possibly due to an assignment). This might cause the trace to be incorrect, because all other views that also reference this data will not reflect this change in the trace! On the other hand, if all other views use the same memory chunk, but are disjoint (e.g. are outputs of torch.split), this might still be safe.
out[:, 1:, fold: 2 * fold] = x[:, :-1, fold: 2 * fold] # shift right
/Users/zhanghongxing/vscode_project/tsm_inference/infer/../ops/temporal_shift.py:39: TracerWarning: Converting a tensor to a Python index might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
out[:, :, 2 * fold:] = x[:, :, 2 * fold:] # not shift
/Users/zhanghongxing/vscode_project/tsm_inference/infer/../ops/temporal_shift.py:39: TracerWarning: There are 2 live references to the data region being modified when tracing in-place operator copy_ (possibly due to an assignment). This might cause the trace to be incorrect, because all other views that also reference this data will not reflect this change in the trace! On the other hand, if all other views use the same memory chunk, but are disjoint (e.g. are outputs of torch.split), this might still be safe.
**I did not use the in-place shift.** My shift code is:
out[:, :-1, :fold] = x[:, 1:, :fold]  # shift left
out[:, 1:, fold: 2 * fold] = x[:, :-1, fold: 2 * fold]  # shift right
out[:, :, 2 * fold:] = x[:, :, 2 * fold:]  # not shift
I think this may be caused by the shift operation, but I don't think it is an in-place op. It seems that ONNX export does not support assignment into tensor slices.
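The warnings above can be reproduced in isolation: slice assignment into a zero tensor lowers to an in-place `copy_` that `torch.jit.trace` cannot represent faithfully. A minimal standalone sketch (shapes and `fold` value are illustrative assumptions, not the repo's exact code):

```python
import torch

def shift(x, fold=2):
    # Out-of-place buffer, but filled via slice assignment, which lowers
    # to in-place copy_ ops -- the source of the TracerWarnings above.
    out = torch.zeros_like(x)
    out[:, :-1, :fold] = x[:, 1:, :fold]                  # shift left
    out[:, 1:, fold:2 * fold] = x[:, :-1, fold:2 * fold]  # shift right
    out[:, :, 2 * fold:] = x[:, :, 2 * fold:]             # not shifted
    return out

x = torch.randn(1, 8, 6, 4, 4)      # (N, T, C, H, W)
traced = torch.jit.trace(shift, x)  # emits the TracerWarnings shown above
print(torch.allclose(traced(x), shift(x)))  # True for the traced input shape
```

The trace matches for the traced shape, but the slice boundaries are recorded as constants, so the exported graph may silently diverge on other inputs.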
Issue Analytics: Created 3 years ago · Comments: 6
@ChiShao you can change https://github.com/mit-han-lab/temporal-shift-module/blob/f09f42db80f1dbeaf9c7448fbd491cd59043e711/ops/temporal_shift.py#L27 to this:
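The snippet from that reply is not preserved here. A common export-friendly rewrite (a sketch on my part, not necessarily the exact change suggested) replaces the slice assignments with `torch.cat` of shifted slices and zero padding, so no in-place copy is traced:

```python
import torch

def shift_cat(x, fold):
    """Temporal shift without slice assignment: shift each channel group
    along the segment (T) dimension via torch.cat with zero padding,
    then concatenate the groups back along the channel dimension."""
    # x: (N, T, C, H, W)
    left = torch.cat((x[:, 1:, :fold],
                      torch.zeros_like(x[:, -1:, :fold])), dim=1)    # shift left
    right = torch.cat((torch.zeros_like(x[:, :1, fold:2 * fold]),
                       x[:, :-1, fold:2 * fold]), dim=1)             # shift right
    keep = x[:, :, 2 * fold:]                                        # not shifted
    return torch.cat((left, right, keep), dim=2)

x = torch.randn(1, 8, 16, 7, 7)
print(shift_cat(x, 2).shape)  # same shape as the input
```

This produces the same result as the slice-assignment version but traces as pure `Slice`/`Concat` ops, which ONNX handles well.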
@Usernamezhx can you provide the code for converting to ONNX?