
ONNX Backend Installation Error

See original GitHub issue

Description

Hello,

I am trying to install the ONNX Runtime backend on a Jetson Nano (JetPack 4.6).

First, I clone the backend repository with: git clone https://github.com/triton-inference-server/onnxruntime_backend.git

Then I run these:

$ mkdir build
$ cd build
$ cmake -DCMAKE_INSTALL_PREFIX:PATH=`pwd`/install -DTRITON_BUILD_ONNXRUNTIME_VERSION=1.10.0 -DTRITON_BUILD_CONTAINER_VERSION=21.12 ..
$ make install

But I have got this error:

Step 19/27 : RUN ./build.sh ${COMMON_BUILD_ARGS} --update --build --use_cuda --cuda_home "/usr/local/cuda"
 ---> Running in 1c7c7a3f2f14
2022-02-18 16:06:11,773 tools_python_utils [INFO] - flatbuffers module is not installed. parse_config will not be available
2022-02-18 16:06:11,792 build [DEBUG] - Command line arguments: --build_dir /workspace/onnxruntime/build/Linux --config Release --skip_submodule_sync --parallel --build_shared_lib --build_dir /workspace/build --cmake_extra_defines 'CMAKE_CUDA_ARCHITECTURES='"'"'52;60;61;70;75;80;86'"'"'' --update --build --use_cuda --cuda_home /usr/local/cuda
2022-02-18 16:06:11,825 build [ERROR] - cuda_home and cudnn_home paths must be specified and valid. cuda_home='/usr/local/cuda' valid=True. cudnn_home='None' valid=False
The command '/bin/sh -c ./build.sh ${COMMON_BUILD_ARGS} --update --build --use_cuda --cuda_home "/usr/local/cuda"' returned a non-zero code: 1
CMakeFiles/ort_target.dir/build.make:72: recipe for target 'onnxruntime/lib/libonnxruntime.so' failed
make[2]: *** [onnxruntime/lib/libonnxruntime.so] Error 1
CMakeFiles/Makefile2:142: recipe for target 'CMakeFiles/ort_target.dir/all' failed
make[1]: *** [CMakeFiles/ort_target.dir/all] Error 2
Makefile:135: recipe for target 'all' failed
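The key line is `cudnn_home='None' valid=False`: the containerized onnxruntime build was invoked without a cuDNN path, so build.py aborted before compiling anything. As a sketch (the cuDNN location is an assumption for JetPack 4.6; verify yours with `ldconfig -p | grep cudnn`), onnxruntime's build.sh accepts a `--cudnn_home` flag alongside `--cuda_home`:

```shell
# Assumed JetPack 4.6 locations; build.py aborts with the error above
# unless both paths exist and are valid directories.
CUDA_HOME=/usr/local/cuda
CUDNN_HOME=/usr/lib/aarch64-linux-gnu   # assumption: where JetPack places libcudnn

# Compose the build.sh invocation including the missing --cudnn_home flag:
BUILD_CMD="./build.sh --config Release --build_shared_lib --parallel \
--update --build --use_cuda --cuda_home $CUDA_HOME --cudnn_home $CUDNN_HOME"
echo "$BUILD_CMD"
```

The same paths would need to reach the Dockerized build that the backend's cmake drives; if the container build keeps failing, building onnxruntime directly on the device with a command like the one above is a possible workaround.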

When I try to start the Triton server with bin/tritonserver --model-repository=triton_model_repository/ --backend-directory=/backends

I get this error: Failed to load 'segmentation_model' version 1: Invalid argument: unable to find 'libtriton_onnxruntime.so' for model 'segmentation_model', searched: triton_model_repository/segmentation_model/1, triton_model_repository/segmentation_model, /backends/onnxruntime
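This error means Triton could not find the backend shared library in any of its search locations. A sketch of the fix, assuming a successful build placed the backend under `build/install/backends/onnxruntime` (that layout follows from the `CMAKE_INSTALL_PREFIX` used above, but check your actual install prefix):

```shell
# Triton resolves a backend as <backend-directory>/<backend>/libtriton_<backend>.so,
# i.e. /backends/onnxruntime/libtriton_onnxruntime.so given --backend-directory=/backends.
INSTALL_DIR=build/install/backends/onnxruntime   # assumption: install-prefix layout
TARGET_DIR=/backends/onnxruntime

sudo mkdir -p "$TARGET_DIR"
sudo cp "$INSTALL_DIR/libtriton_onnxruntime.so" "$TARGET_DIR/"
ls -l "$TARGET_DIR/libtriton_onnxruntime.so"
```

Note that because the `make install` above failed, the .so was never produced; the build error has to be fixed first before there is anything to copy.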

How can I solve this error?

Thanks

Triton Information

What version of Triton are you using? Release 2.17.0, corresponding to NGC container 21.12.

Are you using the Triton container or did you build it yourself? I built it myself.

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 7 (5 by maintainers)

Top GitHub Comments

1 reaction
sarperkilic commented, Feb 23, 2022

Hello @GuanLuo , @CoderHam

I couldn't solve the problem by doing what you suggested. I am sharing what I did, step by step, below.

 $ mkdir build
 $ cd build
 $ cmake -DCMAKE_INSTALL_PREFIX:PATH=`pwd`/install -DTRITON_ONNXRUNTIME_INCLUDE_PATHS=/home/User/Documents/tritonserver2.17.0-jetpack4.6/backends/onnxruntime/include -DTRITON_ONNXRUNTIME_LIB_PATHS=/home/User/Documents/tritonserver2.17.0-jetpack4.6/backends/onnxruntime/lib ..
 $ sudo make install

The installation completed successfully, and then I tried to load the models with: bin/tritonserver --model-repository=triton_model_repository --backend-directory=/backends

But I got this error: E0223 08:12:04.883918 13888 model_repository_manager.cc:1152] failed to load 'segmentation_model' version 1: Invalid argument: unable to find 'libtriton_onnxruntime.so' for model 'segmentation_model', searched: triton_model_repository/segmentation_model/1, triton_model_repository/segmentation_model, /backends/onnxruntime
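For reference, the error message spells out the order in which Triton searches for the backend library. A small Python sketch of that lookup (the function name and structure are mine, not Triton's actual code):

```python
import os

def find_backend_lib(model_repo, model, version, backend_dir, backend):
    """Search for libtriton_<backend>.so in the order the error message lists:
    model version directory, model directory, then the shared backend directory."""
    lib = f"libtriton_{backend}.so"
    for d in (
        os.path.join(model_repo, model, str(version)),
        os.path.join(model_repo, model),
        os.path.join(backend_dir, backend),
    ):
        candidate = os.path.join(d, lib)
        if os.path.isfile(candidate):
            return candidate
    return None  # none found -> "unable to find 'libtriton_onnxruntime.so'"
```

So the library must actually exist at /backends/onnxruntime/libtriton_onnxruntime.so (or inside the model directory) for the tritonserver command above to succeed; the error shows none of the three locations contained it.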

How should I complete the ONNX backend installation? What is the problem?

This is becoming an urgent problem; could you help me?

Thanks

0 reactions
CoderHam commented, Mar 10, 2022

Closing due to inactivity
