Triton with Python backend: "Using Python execution env *.tar.gz" never appears in the logs
Hello. I am using Triton with the Python backend.
- I followed this issue: https://github.com/triton-inference-server/server/issues/3189
- This is the config.pbtxt file:

```
backend: "python"
...
parameters: {
  key: "EXECUTION_ENV_PATH",
  value: {string_value: "/home/gioipv/workspaces/ekyc_glasses/triton/model_repo2/model1/test2.tar.gz"}
}
```
- When I run:

```
docker run --gpus=1 --shm-size=5G -p8111:8111 -p8222:8222 -p8333:8333 --rm -v /home/gioipv/workspaces/ekyc_glasses/triton/model_repo2:/models --name tritonserver nvcr.io/nvidia/tritonserver:20.11-py3 tritonserver --model-repository=/models --log-verbose 20
```
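For reference, the `-v` flag in the command above bind-mounts the host directory `/home/gioipv/workspaces/ekyc_glasses/triton/model_repo2` at `/models` inside the container, so any host path under that directory has a different path from the server's point of view. A minimal sketch of that mapping (the helper `to_container_path` is hypothetical, for illustration only):

```python
from pathlib import PurePosixPath

def to_container_path(host_path: str, host_mount: str, container_mount: str) -> str:
    """Rewrite a host path under host_mount to the equivalent path under container_mount."""
    rel = PurePosixPath(host_path).relative_to(host_mount)
    return str(PurePosixPath(container_mount) / rel)

# The tar.gz path from config.pbtxt, as the container sees it:
print(to_container_path(
    "/home/gioipv/workspaces/ekyc_glasses/triton/model_repo2/model1/test2.tar.gz",
    "/home/gioipv/workspaces/ekyc_glasses/triton/model_repo2",
    "/models",
))  # → /models/model1/test2.tar.gz
```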
- It raises an error:

```
ModuleNotFoundError: No module named 'librosa'
```

- I checked my logs; they do not show the line:

```
Using Python execution env ***.tar.gz
```

Could you please help me with this?
Triton Information: Triton Docker image, 20.11 release
To Reproduce: follow this issue: https://github.com/triton-inference-server/server/issues/3189
Issue Analytics
- Created: 2 years ago
- Comments: 5 (2 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@gioipv You should set EXECUTION_ENV_PATH properly, because you mounted /home/gioipv/workspaces/ekyc_glasses/triton/model_repo2 at /models inside the Docker container, so the server cannot see the host path in your config.pbtxt.

I also think you should update the GPU driver version. The version of the Python backend and the server must match.
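Following that advice, here is a minimal sketch of the corrected config.pbtxt, assuming the tar.gz sits in the model1/ directory and the repository is mounted at /models as in the docker run command above (the path is an assumption based on those mount arguments):

```
backend: "python"
...
parameters: {
  key: "EXECUTION_ENV_PATH",
  value: {string_value: "/models/model1/test2.tar.gz"}
}
```

With the in-container path, the server should log `Using Python execution env /models/model1/test2.tar.gz` at startup if the environment is picked up.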