Dockerfile for building Triton
See original GitHub issue
- Problem
Hello, I am using Triton Inference Server with the 21.07 Docker image, and I have to install an extra package to fix errors related to libGL.so.1 (see this related issue: https://github.com/conda-forge/pygridgen-feedstock/issues/10#issuecomment-365914605). To work around it, I rebuilt the Triton image with Docker, following https://github.com/triton-inference-server/server/blob/r21.07/docs/build.md#building-with-docker.
- Solution
Before building Triton with the ./build.py command, I added the following line to build.py at line 706, inside the function that creates the Dockerfile:

RUN apt-get -yqq update && apt-get -yqq install libgl1-mesa-glx

I think this is a common problem, so should the line above be added to the build.py file in the repo?
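An alternative to patching build.py is to layer the missing library on top of the released image instead of rebuilding Triton from source. The sketch below is an assumption about a typical setup, not something from the issue: it uses the 21.07 NGC image tag and assumes libgl1-mesa-glx is the only missing system package.

# Derived image: keep the stock 21.07 release and add only the libGL runtime library.
FROM nvcr.io/nvidia/tritonserver:21.07-py3

RUN apt-get update -yqq \
    && apt-get install -yqq --no-install-recommends libgl1-mesa-glx \
    && rm -rf /var/lib/apt/lists/*

Build it with docker build -t tritonserver-gl . and run it exactly as you would the stock image.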
Issue Analytics
- State:
- Created 2 years ago
- Comments: 5 (2 by maintainers)
Top Results From Across the Web
- server/build.md at main · triton-inference-server/server - GitHub: The easiest way to build Triton is to use Docker. The result of the build will be a Docker image called tritonserver that...
- Building NVIDIA Triton Inference Server from Scratch for ...: I have created my own docker image of it. I have build this for only for tensorflow backend. If you want for other...
- Triton Inference Server Release 21.08: The Triton Inference Server Docker image contains the inference server executable and related shared libraries in /opt/tritonserver.
- Work with Docker containers - Documentation: Our focus is on making the Triton Elastic Docker Host the best place to run your Docker images; building Docker images using the...
- Serving a Torch-TensorRT model with Triton - PyTorch: Let's first pull the NGC PyTorch Docker container. You may need to create an account and get the ... Step 3: Building a...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@gioipv If you install opencv through pip, you can install opencv-python-headless in order to avoid the dependency. conda-pack works just fine with pip packages as well.
Conda's OpenCV build also uses OpenMP for multithreading, which doesn't play nicely with many computing paradigms without a lot of extra configuration and debugging. The pip packaging of OpenCV uses vanilla pthreads, which are much easier to work with and interact better with other components.
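A sketch of the pip route described above, assuming you follow the conda-pack workflow for a custom Python environment; the environment name, Python version, and package list are placeholders:

# Create a conda environment whose OpenCV comes from the headless pip wheel,
# so the packed environment does not pull in libGL at runtime.
conda create -n triton-env python=3.8 -y
conda activate triton-env
pip install numpy opencv-python-headless conda-pack

# Pack the environment into an archive that can be shipped alongside the model.
conda-pack -n triton-env -o triton-env.tar.gz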
It looks like this is more a problem with the way conda packages the dependencies. We can't change the Triton container to include dependencies for Python packages. I think the only workaround for now is to create another Docker container that contains the dependencies required for your Python environment.
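A minimal sketch of that workaround: a container derived from the release image that carries the Python-level dependencies the model environment needs. The base image tag and package list are assumptions, and it presumes pip is available in the base image (install python3-pip via apt first if it is not):

FROM nvcr.io/nvidia/tritonserver:21.07-py3

# Install the model's Python dependencies directly in the derived image;
# the headless OpenCV wheel sidesteps the libGL requirement entirely.
RUN python3 -m pip install --no-cache-dir numpy opencv-python-headless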