DirectML versions 1.7.0/1.8.0 cause some conv2d cases to produce wrong results
I find that microsoft.ai.directml 1.7.0 and microsoft.ai.directml 1.8.0 cause some conv2d cases to produce wrong results.
I provide one case to reproduce the issue; please refer to my branch for the sample code. It depends on PR #232, which fixes a build issue when running python setup.py install.
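The original sample lives on the reporter's branch and is not reproduced here. As a generic sketch of the kind of cross-check involved, the snippet below builds a single-node Conv model with the ONNX helper API and compares onnxruntime's DirectML execution provider (which loads DirectML under the hood) against the CPU provider. The shapes, weights, and padding are illustrative assumptions, not the parameters of the failing case.

```python
import numpy as np
import onnx
from onnx import helper, numpy_helper, TensorProto
import onnxruntime as ort

# Build a minimal single-node Conv model (illustrative shapes, random weights).
x_info = helper.make_tensor_value_info("x", TensorProto.FLOAT, [1, 3, 32, 32])
y_info = helper.make_tensor_value_info("y", TensorProto.FLOAT, [1, 8, 32, 32])
weights = numpy_helper.from_array(
    np.random.randn(8, 3, 3, 3).astype(np.float32), name="w")
conv = helper.make_node("Conv", ["x", "w"], ["y"], pads=[1, 1, 1, 1])
graph = helper.make_graph([conv], "conv_repro", [x_info], [y_info],
                          initializer=[weights])
model = helper.make_model(graph,
                          opset_imports=[helper.make_opsetid("", 13)])

x = np.random.randn(1, 3, 32, 32).astype(np.float32)

def run(providers):
    # Run the same model and input on the given execution provider(s).
    sess = ort.InferenceSession(model.SerializeToString(), providers=providers)
    return sess.run(None, {"x": x})[0]

y_dml = run(["DmlExecutionProvider"])   # DirectML-backed result
y_cpu = run(["CPUExecutionProvider"])   # reference result
print("max abs diff:", np.abs(y_dml - y_cpu).max())
```

A large maximum difference between the two providers on an affected DirectML version would point at the same class of problem described above.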
Top GitHub Comments
Can you provide us with some information about your hardware and drivers so we can reproduce the issue? Also, does the issue occur if you use DML_EXECUTION_FLAG_DISABLE_META_COMMANDS? Thanks!

Thanks so much! I have verified it; closing this issue. @Jamather @fdwr @huningxin
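DML_EXECUTION_FLAG_DISABLE_META_COMMANDS makes DirectML fall back from vendor-provided metacommands to its own shader implementations, which helps isolate whether a wrong result comes from a driver metacommand. For the hardware and driver details requested above, a minimal sketch like the one below (assuming a Windows machine with PowerShell available; the WMI query is one illustrative way to gather this, not a tool from the DirectML repo) prints the GPU name and driver version:

```python
import subprocess

# Query GPU name and driver version via WMI (Windows only).
# Illustrative helper, not part of the original repro branch.
cmd = [
    "powershell", "-NoProfile", "-Command",
    "Get-CimInstance Win32_VideoController | "
    "Select-Object Name, DriverVersion | Format-List",
]
result = subprocess.run(cmd, capture_output=True, text=True)
print(result.stdout)
```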