UNAVAILABLE: Internal: unable to create stream: the provided PTX was compiled with an unsupported toolchain
Description
I upgraded Triton Server to the NGC container 22.01-py3, and all of my models became unavailable.
A previous issue suggested that the NVIDIA driver was outdated, so I upgraded it; my host (Ubuntu 20.04) now reports Driver Version: 495.46, CUDA Version: 11.5. The server still fails with the same error.
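For context, this failure mode is consistent with the host driver being older than the CUDA toolkit that compiled the PTX inside the container: the 22.01 container ships CUDA 11.6, while a 495-series driver only covers CUDA 11.5, so the driver's JIT rejects the newer PTX. A minimal sketch of that check (the minimum-driver table is an assumption taken from NVIDIA's compatibility notes; verify against the release notes for your container tag):

```python
# Sketch: does the host NVIDIA driver meet the minimum required to
# JIT-compile PTX built with a given CUDA toolkit? The MIN_DRIVER
# table is an assumption based on NVIDIA's CUDA release notes.

# Minimum Linux driver (major, minor) per CUDA toolkit version,
# ignoring forward-compatibility packages for data-center GPUs.
MIN_DRIVER = {
    "11.5": (495, 29),
    "11.6": (510, 39),  # toolkit shipped in the 22.01 container
}

def driver_supports(driver_version: str, cuda_version: str) -> bool:
    """True if driver_version (e.g. "495.46") meets the minimum
    driver required for cuda_version (e.g. "11.6")."""
    major, minor = (int(x) for x in driver_version.split(".")[:2])
    return (major, minor) >= MIN_DRIVER[cuda_version]

# Driver 495.46 can run CUDA 11.5 PTX but not CUDA 11.6 PTX,
# which matches the failure reported in this issue:
print(driver_supports("495.46", "11.5"))  # True
print(driver_supports("495.46", "11.6"))  # False
print(driver_supports("510.54", "11.6"))  # True
```

In other words, upgrading to 495.46 was not enough; the 22.01 container needs a 510-series driver (or a forward-compatibility setup on supported data-center GPUs).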
Triton Information
What version of Triton are you using?
The one in the Docker image nvcr.io/nvidia/tritonserver:22.01-py3.
To Reproduce
docker run nvcr.io/nvidia/tritonserver:22.01-py3
Expected behavior
The server should start and serve the models without this error.
Issue Analytics
- State:
- Created 2 years ago
- Reactions: 1
- Comments: 9 (5 by maintainers)
Top GitHub Comments
I hit this issue when deploying tritonserver on Kubernetes with an A100 card: containerd://1.5.4, Driver Version: 470.82.01, CUDA Version: 11.6, nvcr tag: 22.03-py3.
I used TensorRT to build a model.plan, then ran it with tritonserver using the same nvcr container version tag, but it reported the error above.
I finally figured it out, though:
it can work with
it can't work with
I have no idea why this happened.
2022-02-14: Driver version 510.54 for the RTX 3090 is available for download at https://www.nvidia.com/Download/driverResults.aspx/186996/en-us