
outputs are different between ONNX and pytorch


I exported a PyTorch model to ONNX without any errors, but the outputs differ when running inference with the same inputs. What is a possible solution?

PyTorch output: [-10.975916   -8.120772   -3.1923165 ... -10.976107  -11.38656272   -2.362452 ]
ONNX output:    [ -9.995668  -4.854749  -3.230622 ...  -9.998256 -10.756512  -3.699711]

The PyTorch model, network, and ONNX export script are in this repo: https://github.com/mdztravelling/ASR. The conversion log is at https://github.com/mdztravelling/ASR/blob/master/log.txt.

  • PyTorch version: 1.3.0

  • OS (e.g., Linux): CentOS 6.3

  • Python version: 3.6.5

  • ONNX opset version: 10

  • CUDA/cuDNN version: 10.0/7.4
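
For reference, a minimal way to sanity-check parity between a PyTorch model and its exported ONNX file is to run both in inference mode on the same input and compare the arrays. The snippet below is a generic sketch, assuming onnxruntime is installed; the tiny stand-in network, file name, and tolerances are placeholders rather than the actual ASR model from the repo above.

    import numpy as np
    import onnxruntime as ort
    import torch

    # placeholder network standing in for the real model from the linked repo
    model = torch.nn.Sequential(torch.nn.Linear(16, 8), torch.nn.Dropout(0.5))
    model.eval()  # run both frameworks in inference mode

    dummy = torch.randn(1, 16)
    with torch.no_grad():
        torch_out = model(dummy).numpy()

    torch.onnx.export(model, dummy, "model.onnx", opset_version=10)

    sess = ort.InferenceSession("model.onnx")
    input_name = sess.get_inputs()[0].name
    onnx_out = sess.run(None, {input_name: dummy.numpy()})[0]

    # small floating-point drift is normal; large gaps usually point to a
    # train/eval mismatch or a nondeterministic layer left active
    np.testing.assert_allclose(torch_out, onnx_out, rtol=1e-3, atol=1e-5)
    print("PyTorch and ONNX outputs match within tolerance")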

Issue Analytics

  • State: open
  • Created: 4 years ago
  • Reactions: 2
  • Comments: 8 (2 by maintainers)

Top GitHub Comments

10 reactions
Zhang-O commented, Sep 23, 2020

Add model.eval() before model.inference().
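
For context, layers such as Dropout and BatchNorm behave differently in training and eval mode, and a freshly constructed or loaded module is in training mode by default, while torch.onnx.export produces an inference-mode graph by default. So if the PyTorch test code runs the model without calling eval(), its outputs will drift from the ONNX ones. A minimal illustration, using a hypothetical toy network rather than the ASR model from the issue:

    import torch

    torch.manual_seed(0)
    net = torch.nn.Sequential(torch.nn.Linear(4, 4), torch.nn.Dropout(0.5))
    x = torch.randn(1, 4)

    net.train()        # modules start (and load) in training mode by default
    print(net(x))      # Dropout is active: output changes from run to run

    net.eval()         # the mode the exported ONNX graph corresponds to
    with torch.no_grad():
        print(net(x))  # deterministic; this is what ONNX inference should reproduce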

3 reactions
lizezheng commented, Jan 11, 2022

Same issue with PyTorch 1.10.0.


Top Results From Across the Web

  • outputs are different between ONNX and pytorch
    Problem solved by adding model.eval() before running inference of the PyTorch model in the test code. Solution is from the link model = models.

  • Inference result is different between Pytorch and ONNX model
    Hi, I converted a PyTorch model to an ONNX model. However, the output is different between the two models, as shown below.

  • (optional) Exporting a Model from PyTorch to ONNX and ...
    To export a model, we call the torch.onnx.export() function. This will execute the model, recording a trace of what operators are used to...

  • ONNX vs Torch Output Mismatch - PyTorch Forums
    I have a 2D ConvNet that I export from PyTorch to ONNX and TorchScript. However, while the TorchScript output matches the native PyTorch code's ...

  • torch.onnx — PyTorch 1.13 documentation
    A “symbolic function” is a function that decomposes a PyTorch operator into a composition of a series of ONNX operators. During export, each...
