Description

I’m trying to run Triton on the recently released WSL2 and the specific CUDA driver for it. The examples provided by NVIDIA work fine.

Details: https://docs.nvidia.com/cuda/wsl-user-guide/index.html

Triton Information

What version of Triton are you using? nvcr.io/nvidia/tritonserver:20.03-py3

Are you using the Triton container or did you build it yourself? nvcr.io/nvidia/tritonserver:20.03-py3

To Reproduce

sudo docker run -d --restart always --gpus all --shm-size=1g --ulimit memlock=-1 --ulimit stack=67108864 --name triton -p 8000:8000 -p 8001:8001 -p 8002:8002 nvcr.io/nvidia/tritonserver:20.03-py3

or

sudo nvidia-docker run -d --restart always --gpus all --shm-size=1g --ulimit memlock=-1 --ulimit stack=67108864 --name triton -p 8000:8000 -p 8001:8001 -p 8002:8002 nvcr.io/nvidia/tritonserver:20.03-py3

=============================
== Triton Inference Server ==
=============================

NVIDIA Release 20.03 (build 11042949)

Copyright (c) 2018-2019, NVIDIA CORPORATION.  All rights reserved.

Various files include modifications (c) NVIDIA CORPORATION.  All rights reserved.
NVIDIA modifications are covered by the license terms that apply to the underlying
project or file.

WARNING: The NVIDIA Driver was not detected.  GPU functionality will not be available.
   Use 'nvidia-docker run' to start this container; see
   https://github.com/NVIDIA/nvidia-docker/wiki/nvidia-docker ..

Expected behavior

I expect Triton to start up with GPU support available.
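Before digging into Triton itself, it can help to confirm that Docker inside WSL2 can reach the GPU at all. The commands below are a minimal sanity-check sketch, not part of the original report; the CUDA base image tag is only an example.

    # Run a throwaway CUDA container and ask it for the GPU; if this fails or
    # shows no devices, the problem is the WSL2/Docker GPU setup, not Triton.
    sudo docker run --rm --gpus all nvidia/cuda:11.0-base nvidia-smi

    # On WSL2 the Windows-side driver surfaces its Linux libraries under
    # /usr/lib/wsl/lib; their presence is another quick indicator that the
    # CUDA-on-WSL driver is installed.
    ls /usr/lib/wsl/lib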

Issue Analytics

  • State: closed
  • Created 3 years ago
  • Comments: 27 (25 by maintainers)

Top GitHub Comments

2 reactions
iamsellek commented, Dec 29, 2021

@turowicz Bless you for all the follow-up posts as you continued your research. The internet needs more people like you.

1 reaction
deadeyegoodwin commented, Jul 7, 2020

The warning message about GPU that is produced by the entrypoint code is something we are working on fixing for WSL. We haven’t spent much time with WSL and Triton so we can’t yet provide much detailed feedback on your other findings. Are you aware of the CUDA/WSL developer forum? You may find some quicker help there: https://forums.developer.nvidia.com/c/accelerated-computing/cuda/cuda-on-windows-subsystem-for-linux
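For context on why the banner can be misleading here: the warning is printed by the container entrypoint when it cannot find evidence of the host driver. The snippet below is only a rough illustration of that style of check, not the actual NVIDIA entrypoint script; it assumes a test against /proc/driver/nvidia/version, which does not exist under WSL2 because the driver is exposed through /dev/dxg and /usr/lib/wsl/lib instead, so a check like this can report a false negative even when CUDA works inside the container.

    # Illustrative sketch only -- not the real entrypoint code.
    if [ ! -e /proc/driver/nvidia/version ] && ! command -v nvidia-smi >/dev/null 2>&1; then
      echo "WARNING: The NVIDIA Driver was not detected.  GPU functionality will not be available."
    fi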

Read more comments on GitHub >

Top Results From Across the Web

  • CUDA on WSL User Guide - NVIDIA Documentation Center
    WSL 2 is a key enabler in making GPU acceleration to be seamlessly shared between Windows and Linux applications on the same system...
  • Enable NVIDIA CUDA on WSL 2 - Windows - Microsoft Learn
    In this article: Install Windows 11 or Windows 10, version 21H2; Install the GPU driver; Install WSL; Get started with NVIDIA CUDA.
  • Enabling GPU acceleration on Ubuntu on WSL2 with the NVIDIA CUDA Platform
    ... OpenGL, and CUDA that target Ubuntu while staying on Windows...
  • How to Install the NVIDIA CUDA Driver, Toolkit, cuDNN, and TensorRT in WSL2 - The Founder's Guide
    Download the NVIDIA CUDA Driver: ...
  • Install the CUDA Driver and Toolkit in WSL2 - Level Up Coding
    A condensed guide with instructions and screenshots · Join the NVIDIA Developer Program: ...
