
error: libonnxruntime.so.1.11.1: cannot open shared object file: No such file or directory


Checklist

  • I have searched related issues but cannot get the expected help.
  • I have read the FAQ documentation but cannot get the expected help.
  • The bug has not been fixed in the latest version.

Describe the bug

When I run inference_model with a .onnx file, there is an error as follows:

File "/media/zjq/Data/Ubuntu_Project/svd/MMDeploy/mmdeploy/apis/inference.py", line 169, in inference_model
  model = task_processor.init_backend_model(backend_files)
File "/media/zjq/Data/Ubuntu_Project/svd/MMDeploy/mmdeploy/codebase/mmrotate/deploy/rotated_detection.py", line 101, in init_backend_model
  model_files, self.model_cfg, self.deploy_cfg, device=self.device)
File "/media/zjq/Data/Ubuntu_Project/svd/MMDeploy/mmdeploy/codebase/mmrotate/deploy/rotated_detection_model.py", line 264, in build_rotated_detection_model
  **kwargs)
File "/home/zjq/anaconda3/envs/svd/lib/python3.7/site-packages/mmcv/utils/registry.py", line 237, in build
  return self.build_func(*args, **kwargs, registry=self)
File "/media/zjq/Data/Ubuntu_Project/svd/MMDeploy/mmdeploy/codebase/mmrotate/deploy/rotated_detection_model.py", line 18, in __build_backend_model
  return registry.module_dict[cls_name](*args, **kwargs)
File "/media/zjq/Data/Ubuntu_Project/svd/MMDeploy/mmdeploy/codebase/mmrotate/deploy/rotated_detection_model.py", line 57, in __init__
  backend=backend, backend_files=backend_files, device=device)
File "/media/zjq/Data/Ubuntu_Project/svd/MMDeploy/mmdeploy/codebase/mmrotate/deploy/rotated_detection_model.py", line 76, in _init_wrapper
  deploy_cfg=self.deploy_cfg)
File "/media/zjq/Data/Ubuntu_Project/svd/MMDeploy/mmdeploy/codebase/base/backend_model.py", line 63, in _build_wrapper
  output_names=output_names)
File "/media/zjq/Data/Ubuntu_Project/svd/MMDeploy/mmdeploy/backend/onnxruntime/wrapper.py", line 46, in __init__
  session_options.register_custom_ops_library(ort_custom_op_path)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Failed to load library /media/zjq/Data/Ubuntu_Project/svd/MMDeploy/mmdeploy/lib/libmmdeploy_onnxruntime_ops.so with error: libonnxruntime.so.1.11.1: cannot open shared object file: No such file or directory
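
The last frame is the important one: onnxruntime's register_custom_ops_library() dlopen()s MMDeploy's custom-op plugin, and that plugin in turn needs libonnxruntime.so.1.11.1, which the dynamic loader cannot find. A minimal sketch that isolates just this step (the plugin path is the one from the traceback; everything else is an assumption for illustration):

import onnxruntime as ort

# Path to MMDeploy's prebuilt ONNX Runtime custom-op plugin, taken from the traceback above.
ops_lib = ('/media/zjq/Data/Ubuntu_Project/svd/MMDeploy/'
           'mmdeploy/lib/libmmdeploy_onnxruntime_ops.so')

session_options = ort.SessionOptions()
# This dlopen()s the plugin; it raises the same ONNXRuntimeError unless
# libonnxruntime.so.1.11.1 is on the loader's search path (e.g. via
# LD_LIBRARY_PATH) or is already loaded into the process.
session_options.register_custom_ops_library(ops_lib)
print('custom ops registered')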

Reproduction

I ran this code:

model_output = inference_model(model_cfg, deploy_cfg, backend_files, img, device)

with the configs:

  • oriented_rcnn_r50_fpn_3x_dota_le90.py
  • rotated-detection_onnxruntime_dynamic.py
  • oriented_rcnn_r50_fpn_1x_dota_le90-6d2b2ce0.onnx
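
For completeness, a runnable version of that reproduction might look like the sketch below; the config and ONNX file names are the ones listed above, while the image path and device are placeholders.

from mmdeploy.apis import inference_model

model_cfg = 'oriented_rcnn_r50_fpn_3x_dota_le90.py'
deploy_cfg = 'rotated-detection_onnxruntime_dynamic.py'
backend_files = ['oriented_rcnn_r50_fpn_1x_dota_le90-6d2b2ce0.onnx']
img = 'demo.jpg'    # placeholder test image
device = 'cpu'      # placeholder; a CUDA device string would also fit here

# The call from the report above; it fails inside init_backend_model when the
# ONNX Runtime custom-op plugin cannot be loaded.
model_output = inference_model(model_cfg, deploy_cfg, backend_files, img, device)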

Environment

2022-11-16 22:36:38,887 - mmdeploy - INFO - 

2022-11-16 22:36:38,888 - mmdeploy - INFO - **********Environmental information**********
2022-11-16 22:36:39,045 - mmdeploy - INFO - sys.platform: linux
2022-11-16 22:36:39,045 - mmdeploy - INFO - Python: 3.7.13 (default, Oct 18 2022, 18:57:03) [GCC 11.2.0]
2022-11-16 22:36:39,045 - mmdeploy - INFO - CUDA available: True
2022-11-16 22:36:39,045 - mmdeploy - INFO - GPU 0: NVIDIA GeForce RTX 3090
2022-11-16 22:36:39,045 - mmdeploy - INFO - CUDA_HOME: /usr/local/cuda
2022-11-16 22:36:39,045 - mmdeploy - INFO - NVCC: Cuda compilation tools, release 11.3, V11.3.109
2022-11-16 22:36:39,045 - mmdeploy - INFO - GCC: gcc (Ubuntu 9.4.0-1ubuntu1~20.04.1) 9.4.0
2022-11-16 22:36:39,045 - mmdeploy - INFO - PyTorch: 1.12.1
2022-11-16 22:36:39,045 - mmdeploy - INFO - PyTorch compiling details: PyTorch built with:
  - GCC 9.3
  - C++ Version: 201402
  - Intel(R) oneAPI Math Kernel Library Version 2021.4-Product Build 20210904 for Intel(R) 64 architecture applications
  - Intel(R) MKL-DNN v2.6.0 (Git Hash 52b5f107dd9cf10910aaa19cb47f3abf9b349815)
  - OpenMP 201511 (a.k.a. OpenMP 4.5)
  - LAPACK is enabled (usually provided by MKL)
  - NNPACK is enabled
  - CPU capability usage: AVX2
  - CUDA Runtime 11.3
  - NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_61,code=sm_61;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80;-gencode;arch=compute_86,code=sm_86;-gencode;arch=compute_37,code=compute_37
  - CuDNN 8.3.2  (built against CUDA 11.5)
  - Magma 2.5.2
  - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, CUDA_VERSION=11.3, CUDNN_VERSION=8.3.2, CXX_COMPILER=/opt/rh/devtoolset-9/root/usr/bin/c++, CXX_FLAGS= -fabi-version=11 -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_KINETO -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -DEDGE_PROFILER_USE_KINETO -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-unused-parameter -Wno-unused-function -Wno-unused-result -Wno-unused-local-typedefs -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-psabi -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Werror=cast-function-type -Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, TORCH_VERSION=1.12.1, USE_CUDA=ON, USE_CUDNN=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=OFF, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON, USE_ROCM=OFF, 

2022-11-16 22:36:39,045 - mmdeploy - INFO - TorchVision: 0.13.1
2022-11-16 22:36:39,045 - mmdeploy - INFO - OpenCV: 4.1.2
2022-11-16 22:36:39,045 - mmdeploy - INFO - MMCV: 1.6.0
2022-11-16 22:36:39,045 - mmdeploy - INFO - MMCV Compiler: GCC 9.3
2022-11-16 22:36:39,045 - mmdeploy - INFO - MMCV CUDA Compiler: 11.3
2022-11-16 22:36:39,045 - mmdeploy - INFO - MMDeploy: 0.9.0+4c872a4
2022-11-16 22:36:39,045 - mmdeploy - INFO - 

2022-11-16 22:36:39,045 - mmdeploy - INFO - **********Backend information**********
2022-11-16 22:36:40,511 - mmdeploy - INFO - onnxruntime: 1.11.1	ops_is_avaliable : True
2022-11-16 22:36:40,558 - mmdeploy - INFO - tensorrt: 8.2.3.0	ops_is_avaliable : True
2022-11-16 22:36:40,580 - mmdeploy - INFO - ncnn: None	ops_is_avaliable : False
2022-11-16 22:36:40,582 - mmdeploy - INFO - pplnn_is_avaliable: False
2022-11-16 22:36:40,584 - mmdeploy - INFO - openvino_is_avaliable: False
2022-11-16 22:36:40,607 - mmdeploy - INFO - snpe_is_available: False
2022-11-16 22:36:40,609 - mmdeploy - INFO - ascend_is_available: False
2022-11-16 22:36:40,611 - mmdeploy - INFO - coreml_is_available: False
2022-11-16 22:36:40,611 - mmdeploy - INFO - 

2022-11-16 22:36:40,611 - mmdeploy - INFO - **********Codebase information**********
2022-11-16 22:36:40,614 - mmdeploy - INFO - mmdet:	2.25.2
2022-11-16 22:36:40,614 - mmdeploy - INFO - mmseg:	0.29.0
2022-11-16 22:36:40,614 - mmdeploy - INFO - mmcls:	0.24.0
2022-11-16 22:36:40,614 - mmdeploy - INFO - mmocr:	None
2022-11-16 22:36:40,614 - mmdeploy - INFO - mmedit:	None
2022-11-16 22:36:40,614 - mmdeploy - INFO - mmdet3d:	None
2022-11-16 22:36:40,614 - mmdeploy - INFO - mmpose:	None
2022-11-16 22:36:40,614 - mmdeploy - INFO - mmrotate:	0.3.2

Error traceback

No response

Issue Analytics

  • State: open
  • Created: 10 months ago
  • Comments: 7 (2 by maintainers)

Top GitHub Comments

1 reaction
JinqingZhengTju commented, Nov 21, 2022

Hello, try setting ONNXRUNTIME_DIR to the location where you installed ONNX Runtime by adding the following lines:

export ONNXRUNTIME_DIR="path to onnx-runtime"
export LD_LIBRARY_PATH=$ONNXRUNTIME_DIR/lib:$LD_LIBRARY_PATH

That solved the problem for me last time.

I have set the environment path for onnxruntime exactly as you advised, but it still reports this error. I don't know the reason.

Is this issue solved? If not, would you please attach a screenshot of your echo $LD_LIBRARY_PATH output?

I have solved this issue by setting the environment path in PyCharm. This is one way to solve this kind of problem, but I don't know why setting the environment path in ~/.bashrc doesn't work.

Glad to hear that! BTW, have you run source ~/.bashrc after setting the path in ~/.bashrc?

Yes, I have run that command, but it still doesn't work. I have to set it in PyCharm.
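
If exporting LD_LIBRARY_PATH in ~/.bashrc does not reach a process launched from an IDE, another possible workaround (a sketch only, assuming the standalone ONNX Runtime 1.11.1 tarball pointed to by ONNXRUNTIME_DIR) is to pre-load the library by absolute path before any MMDeploy backend code runs, so that the plugin's libonnxruntime.so.1.11.1 dependency is already satisfied when it is dlopen()ed:

import ctypes
import os

# ONNXRUNTIME_DIR is assumed to point at the extracted onnxruntime-linux-x64-1.11.1
# release; the fallback path below is a placeholder for your own install location.
ort_dir = os.environ.get('ONNXRUNTIME_DIR', '/path/to/onnxruntime-linux-x64-1.11.1')
ctypes.CDLL(os.path.join(ort_dir, 'lib', 'libonnxruntime.so.1.11.1'),
            mode=ctypes.RTLD_GLOBAL)  # keep its symbols globally visible

# ...then run inference_model(...) as usual.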

1 reaction
JinqingZhengTju commented, Nov 16, 2022

Hello,

Try setting ONNXRUNTIME_DIR to the location where you installed ONNX Runtime by adding the following lines:

export ONNXRUNTIME_DIR="path to onnx-runtime"
export LD_LIBRARY_PATH=$ONNXRUNTIME_DIR/lib:$LD_LIBRARY_PATH

That solved the problem for me last time.

I have set the environment path for onnxruntime exactly as you advised, but it still reports this error. I don't know the reason.
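
One way to verify whether the exported path actually reached the Python process (a minimal check; the library name is the one from the error message):

import os

# Walk LD_LIBRARY_PATH as this process sees it and look for the library the
# MMDeploy plugin needs; if it is not found here, the export never reached
# this process (for example, it was launched from an IDE that does not read ~/.bashrc).
for d in os.environ.get('LD_LIBRARY_PATH', '').split(':'):
    candidate = os.path.join(d, 'libonnxruntime.so.1.11.1')
    if d and os.path.isfile(candidate):
        print('found:', candidate)
        break
else:
    print('libonnxruntime.so.1.11.1 is not on LD_LIBRARY_PATH for this process')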


