
[EXTERNAL] Resolving ONNX runtime fails


The following code

#r "nuget:Microsoft.ML.OnnxRuntime.Managed,1.5.2"
#r "nuget:Microsoft.ML.OnnxRuntime,1.5.2"

using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

var session = new InferenceSession("path to a valid onnx model", new SessionOptions
    {
        LogSeverityLevel = OrtLoggingLevel.ORT_LOGGING_LEVEL_ERROR
    });

produces the following error:

Error: System.TypeInitializationException: The type initializer for 'Microsoft.ML.OnnxRuntime.NativeMethods' threw an exception.
---> System.EntryPointNotFoundException: Unable to find an entry point named 'OrtGetApiBase' in DLL 'onnxruntime'.
at Microsoft.ML.OnnxRuntime.NativeMethods.OrtGetApiBase()
at Microsoft.ML.OnnxRuntime.NativeMethods..cctor() in C:\a\1\s\csharp\src\Microsoft.ML.OnnxRuntime\NativeMethods.cs:line 193
--- End of inner exception stack trace ---
at Microsoft.ML.OnnxRuntime.SessionOptions..ctor() in C:\a\1\s\csharp\src\Microsoft.ML.OnnxRuntime\SessionOptions.cs:line 50
at Submission#5.<<Initialize>>d__0.MoveNext()
--- End of stack trace from previous location ---
at Microsoft.CodeAnalysis.Scripting.ScriptExecutionState.RunSubmissionsAsync[TResult](ImmutableArray`1 precedingExecutors, Func`2 currentExecutor, StrongBox`1 exceptionHolderOpt, Func`2 catchExceptionOpt, CancellationToken cancellationToken)

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 9 (8 by maintainers)

Top GitHub Comments

4 reactions
michaelgsharp commented, Jun 25, 2021

So this is caused by the same problem as #75, but this case is slightly different: onnxruntime.dll is now shipped as part of Windows in the System32 folder. In #75 the native asset isn't found at all, whereas in this case it's finding the one in the System32 folder.

In my experience with ONNX Runtime, the error Unable to find an entry point named 'OrtGetApiBase' in DLL 'onnxruntime'. is thrown when there is a version mismatch somewhere: either between the native and managed code, or because the wrong version of CUDA is installed. That's what led me to discover it's picking up the System32 copy.
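
To confirm which copy is actually loaded, one option (a hypothetical diagnostic, not from the original thread) is to enumerate the native modules in the current process and look for onnxruntime:

```csharp
using System;
using System.Diagnostics;
using System.Linq;

// List every native module this process has loaded whose name mentions
// "onnxruntime", so you can see whether the System32 copy (rather than the
// NuGet package's copy) was picked up. Prints nothing if none is loaded yet.
var modules = Process.GetCurrentProcess().Modules
    .Cast<ProcessModule>()
    .Where(m => m.ModuleName.Contains("onnxruntime", StringComparison.OrdinalIgnoreCase));

foreach (var m in modules)
    Console.WriteLine($"{m.ModuleName} -> {m.FileName}");
```

If the printed path points into System32 instead of the NuGet cache, you are hitting this issue.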

I was able to create a workaround by adding a DllImportResolver to the assembly, and the same workaround should work for any native asset as long as you update the paths. This was quick and dirty, so I'm sure there are better ways to do it. The resolver needs to be registered after Microsoft.ML.OnnxRuntime has been imported from NuGet, but BEFORE you actually try to use it in code.

using System;
using System.Reflection;
using System.Runtime.InteropServices;

static class Library
{
    // Resolver that redirects P/Invoke loads of "onnxruntime" away from the
    // copy in System32 and toward the native asset in the NuGet cache.
    public static IntPtr OnnxRuntimeImportResolver(string libraryName, Assembly assembly, DllImportSearchPath? searchPath)
    {
        if (libraryName != "onnxruntime")
            return IntPtr.Zero;

        // Locate the NuGet cache from the managed assembly's path, then build
        // the path to the native asset for the current platform.
        var fullLoc = Assembly.Load("Microsoft.ML.OnnxRuntime").Location;
        var loc = fullLoc.Substring(0, fullLoc.IndexOf("microsoft.ml.onnxruntime.managed"));
        loc += @"microsoft.ml.onnxruntime/1.6.0/runtimes/";

        if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows))
            loc += "win-x64";
        else if (RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
            loc += "linux-x64";
        else
            loc += "osx-x64";

        loc += "/native/onnxruntime.dll";

        NativeLibrary.TryLoad(loc, out IntPtr libHandle);

        return libHandle;
    }
}

NativeLibrary.SetDllImportResolver(Assembly.Load("Microsoft.ML.OnnxRuntime"), Library.OnnxRuntimeImportResolver);

ONNX Runtime also supports Windows x86, which I didn't add here, but otherwise those are the only three RIDs it needs to handle. Other libraries may be more or less complex.

This also hard-codes version 1.6.0, since that is the version ML.NET 1.5.5 uses. You could derive that version programmatically as well; this was just the quickest way to unblock in my case.
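
As a sketch of deriving that version programmatically (an assumption on my part: the managed assembly's version tracks the native package version, which may not always hold):

```csharp
using System;
using System.Reflection;

// Hypothetical sketch: build the version segment of the resolver path from
// the loaded managed assembly instead of hard-coding "1.6.0".
var asm = Assembly.Load("Microsoft.ML.OnnxRuntime");
var version = asm.GetName().Version;        // e.g. 1.6.0.0
var packageVersion = version.ToString(3);   // keep major.minor.build, e.g. "1.6.0"
var nativeDir = $"microsoft.ml.onnxruntime/{packageVersion}/runtimes/";
Console.WriteLine(nativeDir);
```

If the assembly version and NuGet package version diverge, reading the package folder name from the assembly's Location path would be the more robust choice.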

1 reaction
KevinRansom commented, May 4, 2023

@colombod ,

I believe this is fixed. When I run your repro, I now see a different error, and it looks like the one I would expect to see: the model file doesn't exist.

Can we close this?

Microsoft (R) F# Interactive version 12.5.0.0 for F# 7.0
Copyright (c) Microsoft Corporation. All Rights Reserved.

For help type #help;;

> #r "nuget:Microsoft.ML.OnnxRuntime.Managed,1.5.2"
- #r "nuget:Microsoft.ML.OnnxRuntime,1.5.2"
-
- open Microsoft.ML.OnnxRuntime
- open Microsoft.ML.OnnxRuntime.Tensors
-
- let session = new InferenceSession("path to a valid onnx model", new SessionOptions(LogSeverityLevel = OrtLoggingLevel.ORT_LOGGING_LEVEL_ERROR));;
[Loading C:\Users\kevinr\.packagemanagement\nuget\Cache\951f9086aa245bd46c2a954539f5e1686dfbbf334bd24cb5709787e7931b4ecc.fsx]
module FSI_0002.
       951f9086aa245bd46c2a954539f5e1686dfbbf334bd24cb5709787e7931b4ecc

Microsoft.ML.OnnxRuntime.OnnxRuntimeException: [ErrorCode:NoSuchFile] Load model from path to a valid onnx model failed:Load model path to a valid onnx model failed. File doesn't exist
   at Microsoft.ML.OnnxRuntime.NativeApiStatus.VerifySuccess(IntPtr nativeStatus) in C:\a\1\s\csharp\src\Microsoft.ML.OnnxRuntime\NativeApiStatus.cs:line 32
   at Microsoft.ML.OnnxRuntime.InferenceSession.Init(String modelPath, SessionOptions options) in C:\a\1\s\csharp\src\Microsoft.ML.OnnxRuntime\InferenceSession.cs:line 667
   at Microsoft.ML.OnnxRuntime.InferenceSession..ctor(String modelPath, SessionOptions options) in C:\a\1\s\csharp\src\Microsoft.ML.OnnxRuntime\InferenceSession.cs:line 46
   at <StartupCode$FSI_0003>.$FSI_0003.main@() in c:\temp\stdin:line 7
   at System.RuntimeMethodHandle.InvokeMethod(Object target, Void** arguments, Signature sig, Boolean isConstructor)
   at System.Reflection.MethodInvoker.Invoke(Object obj, IntPtr* args, BindingFlags invokeAttr)
Stopped due to error
>
