Outputs are different between ONNX and PyTorch
I export a PyTorch model to ONNX without any errors, but the outputs are different when running inference with the same inputs. What is a possible solution?
pytorch output: [-10.975916 -8.120772 -3.1923165 ... -10.976107 -11.38656272 -2.362452 ]
onnx output: [ -9.995668 -4.854749 -3.230622 ... -9.998256 -10.756512 -3.699711]
The PyTorch model, network, and ONNX export script are in this repo: https://github.com/mdztravelling/ASR — the conversion log is at https://github.com/mdztravelling/ASR/blob/master/log.txt
- PyTorch version: 1.3.0
- OS (e.g., Linux): CentOS 6.3
- Python version: 3.6.5
- ONNX opset version: 10
- CUDA/cuDNN version: 10.0/7.4
Issue Analytics
- Created: 4 years ago
- Reactions: 2
- Comments: 8 (2 by maintainers)
Top Results From Across the Web
Outputs are different between ONNX and pytorch
Problem solved by adding model.eval() before running inference of the PyTorch model in the test code. Solution is from the link model = models.

Inference result is different between Pytorch and ONNX model
Hi, I converted a Pytorch model to an ONNX model. However, the output is different between the two models, as shown below. inference_result. inference environment. Pytorch.

(optional) Exporting a Model from PyTorch to ONNX and ...
To export a model, we call the torch.onnx.export() function. This will execute the model, recording a trace of what operators are used to...

ONNX vs Torch Output Mismatch - PyTorch Forums
I have a 2D ConvNet that I export from PyTorch to ONNX and TorchScript. However, while the TorchScript output matches the native PyTorch code's ...

torch.onnx — PyTorch 1.13 documentation
A "symbolic function" is a function that decomposes a PyTorch operator into a composition of a series of ONNX operators. During export, each...
Add model.eval() before model.inference().
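The reason this fix works can be shown in a few lines. Layers such as `nn.Dropout` and `nn.BatchNorm*` behave differently in train and eval mode, so a model left in train mode produces outputs that cannot match an inference-mode ONNX graph. A sketch with a small stand-in model (the `Sequential` below is illustrative, not the poster's network):

```python
# Sketch: dropout makes train-mode outputs non-deterministic, so they will not
# match the exported ONNX graph; model.eval() disables that behavior.
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(nn.Linear(8, 8), nn.Dropout(0.5))
x = torch.randn(1, 8)

model.train()
a, b = model(x), model(x)   # dropout draws a fresh mask on each call

model.eval()                # the fix from this thread
c, d = model(x), model(x)
print(torch.equal(c, d))    # True: eval-mode inference is deterministic
```

The same `model.eval()` call should also be made before `torch.onnx.export()`, so the traced graph records the inference-mode behavior.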
PyTorch 1.10.0, same issue.