tree_implementation parameter gets overwritten with onnx backend
Hello,
I have been trying to use the gemm tree implementation with the ONNX backend, but the resulting ONNX graph does not look like it actually uses gemm:
model = convert(skl_model, backend="onnx", test_input=X, extra_config={"tree_implementation": "gemm"})
It seems that the tree_implementation parameter gets overwritten deeper in the code:
if backend == onnx.__name__:
    # vers = LooseVersion(torch.__version__)
    # allowed_min = LooseVersion("1.6.0")
    # Pytorch <= 1.6.0 has a bug with exporting GEMM into ONNX.
    # For the moment only tree_trav is enabled for pytorch <= 1.6.0
    # if vers < allowed_min:
    extra_config[constants.TREE_IMPLEMENTATION] = "tree_trav"
I am not sure why this is done, though. Removing this block restores the expected behavior. Could you give a bit more info on why this is needed, and is there anything I can do to fix it?
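To make the reported behavior concrete, here is a minimal, self-contained sketch (with hypothetical names; the real library uses `constants.TREE_IMPLEMENTATION` inside its conversion pipeline) of how a user-supplied setting gets silently replaced on the ONNX path:

```python
# Stand-in for constants.TREE_IMPLEMENTATION in the real code base.
TREE_IMPLEMENTATION = "tree_implementation"


def convert_sketch(backend, extra_config):
    """Hypothetical reduction of the conversion path described above."""
    if backend == "onnx":
        # The user's choice (e.g. "gemm") is overwritten unconditionally.
        extra_config[TREE_IMPLEMENTATION] = "tree_trav"
    return extra_config


cfg = convert_sketch("onnx", {"tree_implementation": "gemm"})
# cfg["tree_implementation"] is now "tree_trav", not the requested "gemm".
```

This is why the exported graph never shows the GEMM-style operators even though the user explicitly asked for them.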
Issue Analytics
- State:
- Created 2 years ago
- Comments: 8 (3 by maintainers)

Yes, of course, I will prepare that. I think it is better to raise an exception than to silently overwrite the parameter the user provided when the PyTorch version is too low.
Does that sound right? I will have a PR soon.
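The proposed fix could look something like the sketch below (hypothetical helper name and message text; the real check would use the commented-out `LooseVersion` comparison against `torch.__version__`). Instead of silently overriding, it fails loudly:

```python
def check_gemm_supported(torch_version, requested_impl):
    """Sketch: raise instead of silently overriding the user's choice.

    Parses a "major.minor.patch" string; real PyTorch version strings may
    carry suffixes like "+cu101" that this simplified parser does not handle.
    """
    vers = tuple(int(p) for p in torch_version.split(".")[:3])
    # Mirrors the commented-out `vers < allowed_min` check in the source.
    if requested_impl == "gemm" and vers < (1, 6, 0):
        raise RuntimeError(
            "tree_implementation='gemm' cannot be exported to ONNX with "
            "PyTorch %s; upgrade PyTorch or use 'tree_trav'." % torch_version
        )
    return requested_impl
```

With this in place, a user who explicitly asks for gemm on an old PyTorch gets an actionable error rather than a surprising graph.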
It looks like this error is coming from onnxruntime. To bypass it, we can force onnxml models not to use gemm, and open an issue regarding this. You can set
extra_config[constants.TREE_IMPLEMENTATION] = "tree_trav"
somewhere in this method. This way sklearn models will use gemm (eventually), while onnxml models will still use tree_trav until we understand where this new error is coming from.
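The suggested workaround amounts to scoping the override to the model's origin rather than the backend. A minimal sketch, with a hypothetical `model_origin` argument standing in for however the library distinguishes onnxml inputs from sklearn inputs:

```python
def pick_tree_implementation(model_origin, extra_config):
    """Sketch: force tree_trav only for onnxml models, not for sklearn ones."""
    requested = extra_config.get("tree_implementation", "tree_trav")
    if model_origin == "onnxml":
        # The onnxruntime error with gemm is not yet understood; until it is,
        # onnxml models stay on tree_trav regardless of the user's request.
        return "tree_trav"
    return requested


# sklearn models keep the user's choice; onnxml models fall back.
impl = pick_tree_implementation("sklearn", {"tree_implementation": "gemm"})
```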