
UNAVAILABLE: Internal: unable to create stream: the provided PTX was compiled with an unsupported toolchain


Description

I upgraded the Triton server to the NGC container 22.01-py3 and all my models became unavailable.

This previous issue suggests that the NVIDIA driver is outdated, so I upgraded it; I now have Driver Version: 495.46, CUDA Version: 11.5 on my host (Ubuntu 20.04), but it still doesn't work.
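
When debugging this error, a useful first check is whether the host driver supports the CUDA toolkit the container was built with. A minimal sketch, assuming the NGC image exposes the usual CUDA_VERSION environment variable from the CUDA base images:

# On the host: the "CUDA Version" printed here is the newest CUDA the driver supports,
# not the toolkit version inside the container
nvidia-smi

# Inside the container: the CUDA toolkit the image was built against
docker run --rm nvcr.io/nvidia/tritonserver:22.01-py3 bash -c 'echo "container CUDA: $CUDA_VERSION"'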

Triton Information

What version of Triton are you using? The one in the docker image nvcr.io/nvidia/tritonserver:22.01-py3.

To Reproduce

docker run nvcr.io/nvidia/tritonserver:22.01-py3
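
For context, a more complete invocation that actually serves a model repository might look like the sketch below; the host path ./model_repository is illustrative and not from the original report:

# Serve a local model repository with GPU access and the standard HTTP/gRPC/metrics ports
docker run --rm --gpus=all \
  -p 8000:8000 -p 8001:8001 -p 8002:8002 \
  -v $(pwd)/model_repository:/models \
  nvcr.io/nvidia/tritonserver:22.01-py3 \
  tritonserver --model-repository=/models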

Expected behavior

Triton should start and serve the models without failing.

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Reactions: 1
  • Comments: 9 (5 by maintainers)

Top GitHub Comments

1 reaction
TaylorHere commented, Apr 28, 2022

I hit this issue when deploying tritonserver in Kubernetes on an A100 card: containerd://1.5.4, Driver Version: 470.82.01, CUDA Version: 11.6, nvcr tag: 22.03-py3.

I used TensorRT to build a model.plan, then ran it with tritonserver using the same nvcr container version tag, but it says:

UNAVAILABLE: Internal: unable to create stream: the provided PTX was compiled with an unsupported toolchain

But I finally figured it out.

It works with:

command:
  - bash
args:
  - -c
  - tritonserver --model-store=/models/

It doesn't work with:

command:
  - tritonserver
args:
  - --model-store=/models/

I have no idea why this happened.
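
For reference, folding the working pattern into a fuller container spec might look like the sketch below; the container name, image tag, GPU limit, and volume names are illustrative and not taken from the original deployment:

containers:
  - name: tritonserver
    image: nvcr.io/nvidia/tritonserver:22.03-py3
    # Wrapping the server in "bash -c" is what made it start in the report above
    command:
      - bash
    args:
      - -c
      - tritonserver --model-store=/models/
    resources:
      limits:
        nvidia.com/gpu: 1
    volumeMounts:
      - name: models
        mountPath: /models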


1 reaction
739982423 commented, Feb 14, 2022

@CoderHam Thanks for the pointers, I will give it a try in my local test environment. However, we would like to deploy to a K8s cluster, and according to the GPU Operator Component Matrix, the latest driver we can get is 470.82.01. Do you by chance know a workaround for this?

Update (2022-02-14): Version 510.54 for the RTX 3090 is available for download at https://www.nvidia.com/Download/driverResults.aspx/186996/en-us
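
If the cluster driver is managed by the NVIDIA GPU Operator, one possible workaround (assuming the chart version in use exposes a driver.version value and a matching driver container image exists for that version) is to pin a newer driver explicitly, e.g.:

# Pin the driver deployed by the GPU Operator to a specific version
helm upgrade --install gpu-operator nvidia/gpu-operator \
  --namespace gpu-operator --create-namespace \
  --set driver.version=510.54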


Top Results From Across the Web

"provided PTX was compiled with an unsupported toolchain ...
Hello all, This morning, I was suddenly facing a provided PTX was compiled with an unsupported toolchain error. I came across this post...

CUDA - the provided PTX was compiled with an unsupported ...
This indicates that the provided PTX was compiled with an unsupported toolchain. The most common reason for this is the PTX was generated...

the provided PTX was compiled with an unsupported toolchain
When I run a python file, containing a machine learning model, I get the following error. (pytorch) [s.1915438@cl1 aneurysm]$ srun python ...

the provided PTX was compiled with an unsupported toolchain
I am new to CUDA and recently I installed it on my Windows PC. I have been running a small test with some...

CUDA programming in Visual Studio fails with the error "the provided PTX was compiled ...
This indicates that the provided PTX was compiled with an unsupported toolchain. The most common reason ...
