ONNX Backend Installation Error
Description
Hello,
I am trying to install ONNX backend on Jetson Nano (Jetpack 4.6).
With the following command, I download the backend source:
git clone https://github.com/triton-inference-server/onnxruntime_backend.git
Then I run these:
$ mkdir build
$ cd build
$ cmake -DCMAKE_INSTALL_PREFIX:PATH=`pwd`/install -DTRITON_BUILD_ONNXRUNTIME_VERSION=1.10.0 -DTRITON_BUILD_CONTAINER_VERSION=21.12 ..
$ make install
But I have got this error:
Step 19/27 : RUN ./build.sh ${COMMON_BUILD_ARGS} --update --build --use_cuda --cuda_home "/usr/local/cuda"
 ---> Running in 1c7c7a3f2f14
2022-02-18 16:06:11,773 tools_python_utils [INFO] - flatbuffers module is not installed. parse_config will not be available
2022-02-18 16:06:11,792 build [DEBUG] - Command line arguments: --build_dir /workspace/onnxruntime/build/Linux --config Release --skip_submodule_sync --parallel --build_shared_lib --build_dir /workspace/build --cmake_extra_defines 'CMAKE_CUDA_ARCHITECTURES='"'"'52;60;61;70;75;80;86'"'"'' --update --build --use_cuda --cuda_home /usr/local/cuda
2022-02-18 16:06:11,825 build [ERROR] - cuda_home and cudnn_home paths must be specified and valid. cuda_home='/usr/local/cuda' valid=True. cudnn_home='None' valid=False
The command '/bin/sh -c ./build.sh ${COMMON_BUILD_ARGS} --update --build --use_cuda --cuda_home "/usr/local/cuda"' returned a non-zero code: 1
CMakeFiles/ort_target.dir/build.make:72: recipe for target 'onnxruntime/lib/libonnxruntime.so' failed
make[2]: *** [onnxruntime/lib/libonnxruntime.so] Error 1
CMakeFiles/Makefile2:142: recipe for target 'CMakeFiles/ort_target.dir/all' failed
make[1]: *** [CMakeFiles/ort_target.dir/all] Error 2
Makefile:135: recipe for target 'all' failed
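The [ERROR] line shows the build received a valid cuda_home but no cudnn_home. For illustration only: ONNX Runtime's build.sh accepts a --cudnn_home flag, so a minimal sketch of a direct invocation on the Jetson, assuming cuDNN on JetPack 4.6 is installed under /usr/lib/aarch64-linux-gnu (verify the path on your device, e.g. with ldconfig -p | grep cudnn, before relying on it), would look like:
$ ./build.sh --config Release --update --build --build_shared_lib --parallel \
      --use_cuda \
      --cuda_home /usr/local/cuda \
      --cudnn_home /usr/lib/aarch64-linux-gnu  # assumed cuDNN location on JetPack 4.6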
When I try to run the Triton server as
bin/tritonserver --model-repository=triton_model_repository/ --backend-directory=/backends
I got this error:
Failed to load 'segmentation_model' version 1: Invalid argument: unable to find 'libtriton_onnxruntime.so' for model 'segmentation_model', searched: triton_model_repository/segmentation_model/1, triton_model_repository/segmentation_model, /backends/onnxruntime
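This message means Triton searched both the model directory and /backends/onnxruntime and found no backend shared library. As a sketch only, assuming make install had succeeded with the install prefix from the cmake step above (in which case the backend is typically laid out under build/install/backends/onnxruntime), the library could be copied into the directory Triton searches:
$ mkdir -p /backends/onnxruntime
$ cp -r build/install/backends/onnxruntime/* /backends/onnxruntime/
$ ls /backends/onnxruntime   # libtriton_onnxruntime.so should now be listed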
How can I solve this error?
Thanks
Triton Information
What version of Triton are you using? Release 2.17.0, corresponding to NGC container 21.12.
Are you using the Triton container or did you build it yourself? I built it myself.
Top GitHub Comments
Hello @GuanLuo, @CoderHam
I couldn’t solve the problem by doing what you said. I am sharing what I did step by step below.
But I got this error:
E0223 08:12:04.883918 13888 model_repository_manager.cc:1152] failed to load 'segmentation_model' version 1: Invalid argument: unable to find 'libtriton_onnxruntime.so' for model 'segmentation_model', searched: triton_model_repository/segmentation_model/1, triton_model_repository/segmentation_model, /backends/onnxruntime
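As a sanity check, the paths named in the error can be verified directly; the file names below assume Triton's default ONNX model file name (model.onnx) and the standard config.pbtxt fields, so treat them as illustrative rather than exact:
$ ls /backends/onnxruntime/libtriton_onnxruntime.so
$ ls triton_model_repository/segmentation_model/1/model.onnx
$ grep -E 'backend|platform' triton_model_repository/segmentation_model/config.pbtxt
  # expected: backend: "onnxruntime" (or platform: "onnxruntime_onnx")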
How should I complete the ONNX Backend Installation? What is the problem?
It has become an urgent problem; could you help me?
Thanks
Closing due to inactivity