
Triton Server build with Docker fails

See original GitHub issue

Description

Trying to build Triton Inference Server with Docker. The build fails with the message "make install failed". Build logs are attached: build_log.txt

Triton Information

Building on branch r22.04 (commit c049b389b411707de035a31e341ecdfbf532cbe2).

To Reproduce

Using the following script to build Triton:

mkdir -p $HOME/tmp
cd $HOME/tmp

if [ ! -d "tritonserver" ]; then
    git clone https://github.com/triton-inference-server/server.git tritonserver
fi

build_type='Release'
tag='r22.04'
cd tritonserver
git checkout $tag

python3 build.py \
    --cmake-dir=$HOME/tmp/tritonserver/build \
    --build-dir=/tmp/citritonbuild \
    --enable-logging \
    --enable-stats \
    --enable-tracing \
    --enable-metrics \
    --enable-gpu-metrics \
    --enable-gpu \
    --endpoint=http \
    --repo-tag=common:$tag \
    --repo-tag=core:$tag \
    --repo-tag=backend:$tag \
    --repo-tag=thirdparty:$tag \
    --backend=ensemble \
    --backend=tensorrt \
    --backend=identity \
    --backend=repeat \
    --backend=square \
    --backend=onnxruntime \
    --backend=pytorch \
    --backend=tensorflow1 \
    --backend=tensorflow2 \
    --backend=openvino \
    --backend=python \
    --backend=dali \
    --backend=fil \
    --repoagent=checksum \
    --build-type=$build_type

build.py was modified as follows to work around the CUDA apt repository signing-key problem described here: https://forums.developer.nvidia.com/t/invalid-public-key-for-cuda-apt-repository/212901/27

diff --git a/build.py b/build.py
index 8867daaf..e017a342 100755
--- a/build.py
+++ b/build.py
@@ -759,6 +759,9 @@ ENV DCGM_VERSION {}
 RUN wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/cuda-ubuntu2004.pin && \
     mv cuda-ubuntu2004.pin /etc/apt/preferences.d/cuda-repository-pin-600 && \
     apt-key adv --fetch-keys https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/7fa2af80.pub && \
+    apt-key del 7fa2af80 && \
+    wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/cuda-keyring_1.0-1_all.deb && \
+        dpkg -i cuda-keyring_1.0-1_all.deb && \
     add-apt-repository "deb https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/ /" && \
     apt-get update && apt-get install -y datacenter-gpu-manager=1:{}
 '''.format(dcgm_version, dcgm_version)

Judging by the log, the problem is still the same:

#11 9.248 W: GPG error: https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64  InRelease: The following signatures couldn't be verified because the public key is not available: NO_PUBKEY A4B469963BF863CC
#11 9.248 E: The repository 'https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64  InRelease' is not signed.

Python version is 3.9.12.
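A possible diagnosis, not confirmed in the thread: NO_PUBKEY A4B469963BF863CC refers to the rotated CUDA repository key, and the cuda-keyring package scopes its trust to the sources entry it installs itself, so the plain `deb` line added separately via add-apt-repository may still be unsigned from apt's point of view. A sketch of an alternative patch to the same RUN block, assuming 3bf863cc.pub is the rotated key NVIDIA published (fetch it into apt's trusted store so the add-apt-repository line is covered too):

```dockerfile
# Sketch, not the confirmed fix: fetch the rotated CUDA signing key directly
# instead of deleting the old key and relying on cuda-keyring alone.
RUN apt-key adv --fetch-keys https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/3bf863cc.pub && \
    add-apt-repository "deb https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/ /" && \
    apt-get update
```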

Expected behavior

Building Triton with Docker succeeds.

Issue Analytics

  • State: closed
  • Created: a year ago
  • Comments: 9

Top GitHub Comments

1 reaction
antonlukyanov commented on May 13, 2022

Hi @BrandonBocheng, you need to install the docker package for Python before running the script:

pip3 install docker
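build.py drives the container build through the Docker SDK for Python, so a missing package fails at import time. A minimal sketch of a pre-flight check (the helper name `has_module` is mine, not from build.py):

```python
import importlib.util


def has_module(name: str) -> bool:
    # find_spec locates a module on the import path without importing it,
    # so this check is safe even if the package itself is broken.
    return importlib.util.find_spec(name) is not None


if not has_module("docker"):
    print("Missing dependency: install it with `pip3 install docker`")
```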
1 reaction
mc-nv commented on May 12, 2022

Hi @antonlukyanov,

This is a known issue for r22.04 and will be fixed in future releases.

Please make sure that you have made the appropriate changes in all of the affected files.

The quote below, from the Release 2.21.0 page (corresponding to NGC container 22.04), refers users to the commit and the places that need to be updated.

  • To best ensure the security and reliability of our RPM and Debian package repositories, NVIDIA is updating and rotating the signing keys used by the apt, dnf/yum, and zypper package managers beginning April 27, 2022. Triton r22.04 and prior release branches have not updated these repository signing keys. Due to this users should expect package management errors when attempting to access or install packages from CUDA repositories. Please follow these recommendations to mitigate the issue. Please update your branches prior to building to include the updated signing key(s). These changes are captured in this commit.
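The rotation NVIDIA describes amounts to replacing the deprecated apt-key flow with the cuda-keyring package, which installs the new key under /usr/share/keyrings and its own signed-by sources entry. A sketch of what the corresponding Dockerfile change looks like, assuming the keyring package version at this URL is still current (version numbers may differ):

```dockerfile
# Sketch under the assumption above: cuda-keyring ships its own
# /etc/apt/sources.list.d entry, so no add-apt-repository line is needed.
RUN wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/cuda-keyring_1.0-1_all.deb && \
    dpkg -i cuda-keyring_1.0-1_all.deb && \
    rm cuda-keyring_1.0-1_all.deb && \
    apt-get update
```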

