Triton Server build with Docker fails
Description
Trying to build Triton Inference Server with Docker. The build fails with the message "make install failed". Build logs are attached: build_log.txt
Triton Information
Building on branch r22.04 (commit c049b389b411707de035a31e341ecdfbf532cbe2).
To Reproduce
The following script is used to build Triton:
mkdir -p $HOME/tmp
cd $HOME/tmp
if [ ! -d "tritonserver" ]; then
git clone https://github.com/triton-inference-server/server.git tritonserver
fi
build_type='Release'
tag='r22.04'
cd tritonserver
git checkout $tag
python3 build.py \
--cmake-dir=$HOME/tmp/tritonserver/build \
--build-dir=/tmp/citritonbuild \
--enable-logging \
--enable-stats \
--enable-tracing \
--enable-metrics \
--enable-gpu-metrics \
--enable-gpu \
--endpoint=http \
--repo-tag=common:$tag \
--repo-tag=core:$tag \
--repo-tag=backend:$tag \
--repo-tag=thirdparty:$tag \
--backend=ensemble \
--backend=tensorrt \
--backend=identity \
--backend=repeat \
--backend=square \
--backend=onnxruntime \
--backend=pytorch \
--backend=tensorflow1 \
--backend=tensorflow2 \
--backend=openvino \
--backend=python \
--backend=dali \
--backend=fil \
--repoagent=checksum \
--build-type=$build_type
build.py was modified as follows to work around the invalid CUDA apt repository key problem described here: https://forums.developer.nvidia.com/t/invalid-public-key-for-cuda-apt-repository/212901/27
diff --git a/build.py b/build.py
index 8867daaf..e017a342 100755
--- a/build.py
+++ b/build.py
@@ -759,6 +759,9 @@ ENV DCGM_VERSION {}
RUN wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/cuda-ubuntu2004.pin && \
mv cuda-ubuntu2004.pin /etc/apt/preferences.d/cuda-repository-pin-600 && \
apt-key adv --fetch-keys https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/7fa2af80.pub && \
+ apt-key del 7fa2af80 && \
+ wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/cuda-keyring_1.0-1_all.deb && \
+ dpkg -i cuda-keyring_1.0-1_all.deb && \
add-apt-repository "deb https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/ /" && \
apt-get update && apt-get install -y datacenter-gpu-manager=1:{}
'''.format(dcgm_version, dcgm_version)
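One possible reason the error persists: the patched RUN block installs cuda-keyring, but still keeps the old `apt-key adv --fetch-keys` line and adds the repository with `add-apt-repository`, which verifies against the legacy `apt-key` trust store rather than the keyring installed by the cuda-keyring package. Below is a sketch of a RUN block that relies solely on the keyring package. This is an assumption based on NVIDIA's published key-rotation workaround, not something verified against this build; the `{}` placeholder is the `dcgm_version` substituted by build.py.

```dockerfile
# Sketch (not verified against this build): drop the legacy apt-key fetch and
# the plain add-apt-repository line, and let cuda-keyring provide both the
# signing key and a signed-by sources entry for the CUDA repository.
RUN wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/cuda-ubuntu2004.pin && \
    mv cuda-ubuntu2004.pin /etc/apt/preferences.d/cuda-repository-pin-600 && \
    apt-key del 7fa2af80 || true && \
    wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/cuda-keyring_1.0-1_all.deb && \
    dpkg -i cuda-keyring_1.0-1_all.deb && \
    apt-get update && apt-get install -y datacenter-gpu-manager=1:{}
```

The key id in the log's NO_PUBKEY error (A4B469963BF863CC) is the rotated CUDA repository signing key, which is exactly what the cuda-keyring package ships; repository entries added outside of that package will not see it unless they carry a matching `signed-by` option.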
Looking at the log, the problem appears to be the same:
#11 9.248 W: GPG error: https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64 InRelease: The following signatures couldn't be verified because the public key is not available: NO_PUBKEY A4B469963BF863CC
#11 9.248 E: The repository 'https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64 InRelease' is not signed.
Python version is 3.9.12.
Expected behavior
Building Triton with Docker succeeds.
Issue Analytics
- Created a year ago
- Comments: 9
Top GitHub Comments
Hi @BrandonBocheng, you need to install the docker package for Python before running the script:
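The exact command was not quoted in the scraped comment; the Python Docker SDK is published on PyPI under the package name `docker`, so the install step is presumably:

```shell
# Install the Python Docker SDK that build.py imports (package 'docker' on PyPI).
pip3 install docker
```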
Hi @antonlukyanov,
This is a known issue for r22.04 and will be fixed in future releases. Please make sure that you made the appropriate changes in all of the given files. The quote below, from the Release 2.21.0 page corresponding to NGC container 22.04, refers customers to the commits/places that need to be updated.