Error while converting complex models to ONNX (caused by view_as_complex)
🐛 Bug
While exporting some models (those using complex operations) to ONNX, an error occurs because the ONNX op set does not support casting to complex types (triggered by torch.view_as_complex(input)).
To Reproduce
Steps to reproduce the behavior (code sample and stack trace):

torch.onnx.export(model_dccrn, input_random, 'model_dccrn.onnx', verbose=True, opset_version=11)
~/lib/miniconda3/lib/python3.9/site-packages/torch/onnx/symbolic_registry.py in get_registered_op(opname, domain, version)
114 else:
115 msg += "Please feel free to request support or submit a pull request on PyTorch GitHub."
--> 116 raise RuntimeError(msg)
117 return _registry[(domain, version)][opname]
RuntimeError: Exporting the operator view_as_complex to ONNX opset version 12 is not supported. Please feel free to request support or submit a pull request on PyTorch GitHub.
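For context, a minimal toy module (my own stand-in for illustration, not the original model_dccrn) that hits the same RuntimeError during export:

```python
import torch

class ComplexView(torch.nn.Module):
    # Toy stand-in: views the trailing dim of size 2 as (real, imag)
    # pairs and takes the magnitude.
    def forward(self, x):
        return torch.view_as_complex(x).abs()

model = ComplexView()
x = torch.randn(4, 8, 2)
print(model(x).shape)  # eager mode works fine: torch.Size([4, 8])

# Exporting fails with the RuntimeError above, since view_as_complex
# has no symbolic function in any ONNX opset:
# torch.onnx.export(model, x, "complex_view.onnx", opset_version=11)
```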
Expected behavior
The conversion should proceed without errors and produce a valid ONNX model.
Environment
Package versions
Output of asteroid-versions:
Asteroid 0.5.1
PyTorch 1.9.0
PyTorch-Lightning 1.3.8
Additional info
I know this case is not handled in the PyTorch-ONNX op set. The currently supported ONNX operators are listed here:
- https://github.com/onnx/onnx/blob/master/docs/Operators.md
In particular, the Cast operator does not support casting to complex types.
Related issues:
- missing view_as_complex support when converting to an ONNX model
However, we could provide a wrapper that handles this conversion so that the ONNX model is created properly.
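One possible sketch of such a wrapper (my own illustration, not a proposed Asteroid API): keep the (real, imag) pair as a trailing dimension and express complex arithmetic with real-valued ops that ONNX does support, calling view_as_complex only outside the exported graph.

```python
import torch

def complex_mul(a, b):
    # a, b: (..., 2) tensors holding (real, imag) pairs.
    # (ar + i*ai) * (br + i*bi) = (ar*br - ai*bi) + i*(ar*bi + ai*br)
    ar, ai = a[..., 0], a[..., 1]
    br, bi = b[..., 0], b[..., 1]
    return torch.stack((ar * br - ai * bi, ar * bi + ai * br), dim=-1)

# Check against the genuine complex multiply:
a = torch.randn(3, 5, 2)
b = torch.randn(3, 5, 2)
ref = torch.view_as_complex(a) * torch.view_as_complex(b)
out = torch.view_as_complex(complex_mul(a, b))
print(torch.allclose(out, ref))
```

Since complex_mul uses only indexing, multiplication, and stack, a model written this way exports cleanly, while callers outside the graph can still convert the result to a true complex tensor.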
view_as_complex is implemented in the ATen library:
- https://github.com/pytorch/pytorch/blob/30e48bbeae545c3292c2ab3fed0cb2dba4a92fed/aten/src/ATen/native/ComplexHelper.h#L70
const auto new_strides = computeStrideForViewAsComplex(self.strides());
const auto complex_type = c10::toComplexType(self.scalar_type());
view_tensor(self, complex_type, new_storage_offset, new_sizes, new_strides);
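As the ATen code suggests, view_as_complex only recomputes strides and the dtype over the same storage, so the complex tensor is a view of the real one. A quick check (plain PyTorch, nothing ONNX-specific):

```python
import torch

x = torch.zeros(2, 3, 2)      # trailing dim holds (real, imag)
c = torch.view_as_complex(x)  # shape (2, 3), complex dtype, same storage
x[0, 0, 0] = 1.0              # write through the real tensor...
x[0, 0, 1] = 2.0
print(c[0, 0])                # ...and the change is visible in the complex view
```

This storage-sharing view semantics is exactly what has no counterpart on the ONNX side, where tensors carry only real element types.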
Issue Analytics
- State:
- Created: 2 years ago
- Comments: 6
Top Results From Across the Web
Can't convert Pytorch to ONNX - Stack Overflow
I used to have a similar error when exporting using torch.onnx.export(model, x, ONNX_FILE_PATH), and I fixed it by specifying the ...
Read more >
Scaling-up PyTorch inference: Serving billions of daily NLP ...
Serving complex transformer models in production for high-volume inferencing is not an easy task. This post shares how we tackled this problem ...
Read more >
torch.onnx — PyTorch 1.13 documentation
The torch.onnx module can export PyTorch models to ONNX. ... If the passed-in model is not already a ScriptModule, export() will use...
Read more >
ailia SDK tutorial (model conversion to ONNX) - Medium
This is a tutorial on exporting models trained with various learning frameworks such as Pytorch and TensorFlow to ONNX that can be used...
Read more >
Convert PyTorch Model to ONNX Model - Documentation
However, even with built-in ONNX conversion capability, some models are still difficult to export. In general, there are three possible road blockers:.
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
I guess for simple operations, it’s possible, but when solve and eigenvalue decompositions are computed, having the facade is more complicated, right?
I don’t really know what we should do about that.
Thanks a lot, I am debugging some of the models with Asteroid's complex representation and there are still some errors. I think all of the error locations are in complex_nn.