
tree_implementation parameter gets overwritten with onnx backend

See original GitHub issue

Hello,

I have been trying to use the gemm tree implementation with the onnx backend, but the exported onnx graph does not really look like it is using gemm:

from hummingbird.ml import convert

model = convert(skl_model, backend="onnx", test_input=X, extra_config={"tree_implementation": "gemm"})

It seems that the tree_implementation parameter gets overwritten deeper in the code:

        if backend == onnx.__name__:
            # vers = LooseVersion(torch.__version__)
            # allowed_min = LooseVersion("1.6.0")
            # Pytorch <= 1.6.0 has a bug with exporting GEMM into ONNX.
            # For the moment only tree_trav is enabled for pytorch <= 1.6.0
            # if vers < allowed_min:
            extra_config[constants.TREE_IMPLEMENTATION] = "tree_trav"

I am not sure why this is done, though; removing this block gives me back the expected behavior.

Could you give a bit more info on the reason for it? Is there anything I can do to fix this?
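For reference, here is how I checked what the exported graph contains: just counting the node op types. As far as I understand, a gemm implementation should be dominated by Gemm/MatMul nodes, while tree_trav shows mostly Gather-style nodes. The helper only assumes the standard ONNX ModelProto layout (model.graph.node, each node with an op_type):

```python
from collections import Counter

def op_histogram(onnx_model):
    # Count the op types appearing in an ONNX graph. Only relies on the
    # standard ModelProto layout, so it works on any onnx.ModelProto.
    return Counter(node.op_type for node in onnx_model.graph.node)

# e.g. op_histogram(model.model) on the Hummingbird ONNX container
# (attribute name assumed; check your Hummingbird version).
```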

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 8 (3 by maintainers)

Top GitHub Comments

1 reaction
jfrery commented, Feb 5, 2022

Yes, of course, I will prepare that. I think it is better to raise an exception than to silently overwrite the parameter given by the user when the pytorch version is too low.

Does this sound right? I will have a PR soon.

0 reactions
interesaaat commented, Feb 5, 2022

It looks like this error is coming from onnxruntime. To bypass it, we can force onnxml models not to use gemm, and open an issue about it.

You can set extra_config[constants.TREE_IMPLEMENTATION] = "tree_trav" somewhere in that method. This way sklearn models will use gemm (eventually), while onnxml models will still use tree_trav until we understand where this new error is coming from.
