
Wrong error for loading ONNX models


Description

I am trying to load an ONNX model using the following config, which declares the wrong type for the input:

platform: "onnxruntime_onnx"
max_batch_size: 1000
input [
    {
      name: "users"
      data_type: TYPE_FP32
      dims: [-1]
    }
]
output [
    {
      name: "output"
      data_type: TYPE_FP32 
      dims: [-1]
    }
]

The error I get:

UNAVAILABLE: Invalid argument: unable to load model 'recommendation', unexpected datatype INT64 for input 'users', expecting TYPE_FP32

As you might have noticed, TYPE_FP32 is the type I initially had. When I change the type to TYPE_INT64, everything works smoothly. I’ve looked into the code but couldn’t find where exactly this logging happens, so I am creating an issue instead of a PR.
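For reference, the config that loads successfully is the same as above with only the input's data_type changed to TYPE_INT64 (assuming, as the issue implies, that the output type was already correct):

```
platform: "onnxruntime_onnx"
max_batch_size: 1000
input [
    {
      name: "users"
      data_type: TYPE_INT64
      dims: [-1]
    }
]
output [
    {
      name: "output"
      data_type: TYPE_FP32
      dims: [-1]
    }
]
```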

Triton Information

docker run --rm -p8000:8000 -p8001:8001 -p8002:8002 -v $PWD:/models nvcr.io/nvidia/tritonserver:21.02-py3 tritonserver --model-repository=/models --model-control-mode=poll --repository-poll-secs=30

To Reproduce

Use any ONNX model of yours with the wrong data type on an input or output; you will see a confusing error message.

Expected behavior

I expected the error message to report the datatype the model actually expects as the expected one, and the datatype from my config as the unexpected one.

Issue Analytics

  • State: closed
  • Created 3 years ago
  • Comments: 5 (3 by maintainers)

Top GitHub Comments

1 reaction
GuanLuo commented, Mar 20, 2021

The validation is checking the model against the model config, so I guess the error message is intended to be: unexpected datatype INT64 for input 'users', [model config] expects TYPE_FP32. I think changing it to that will clear up the confusion, since in other places it is called out that the expectation comes from the model config.

@Raduan77 Yes, the ORT backend is in a separate GitHub repository, as you discovered. Do you want to create a PR to improve the error message? Otherwise I can do that.

1 reaction
CoderHam commented, Mar 18, 2021

Yes, the error reported by the backend is incorrect. It should be: unable to load model 'recommendation', unexpected datatype TYPE_FP32 for input 'users', expecting TYPE_INT64. @GuanLuo could you fix the same? The "expecting" datatype should be what the model expects, not what the model configuration specifies.
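The fix both maintainers describe amounts to swapping which side of the comparison fills each slot of the message. A hypothetical sketch for illustration only (this is not Triton's actual backend code; the function name is made up):

```python
# Hypothetical sketch, not the ORT backend's real code: compose the
# load-failure message so that "unexpected" names the datatype found in
# the model config and "expecting" names the datatype the model declares.
def dtype_mismatch_error(model_name: str, input_name: str,
                         config_dtype: str, model_dtype: str) -> str:
    return (
        f"unable to load model '{model_name}', "
        f"unexpected datatype {config_dtype} for input '{input_name}', "
        f"expecting {model_dtype}"
    )

# With the issue's values: the config said TYPE_FP32, the model wants INT64.
print(dtype_mismatch_error("recommendation", "users", "TYPE_FP32", "TYPE_INT64"))
```

With the arguments in this order the message blames the config's TYPE_FP32 and asks for the model's TYPE_INT64, which is what the reporter expected to see.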
