
Error when checking model with external data

See original GitHub issue

Bug Report

Describe the bug

I saved a simple model with save_as_external_data=True, then loaded it back into memory with load_external_data=False. Calling checker.check_model() on the loaded model raises an error.

System information

  • OS Platform and Distribution (e.g. Linux Ubuntu 16.04): win10
  • ONNX version (e.g. 1.7): 1.9.0
  • Python version: 3.8.0

Reproduction instructions

  • Describe the code to reproduce the behavior.
import numpy as np
import onnx
from onnx import helper
from onnx import TensorProto

#======== 1. create an onnx model
dtype = np.float32
x = np.random.randn(3, 5).astype(dtype)

data_w = np.random.randn(2, 5).astype(dtype)
w = helper.make_tensor(
    name='w',
    data_type=TensorProto.FLOAT,
    dims=data_w.shape,
    vals=data_w.flatten().astype(dtype).tobytes(),
    raw=True,
)

Z = helper.make_tensor_value_info('Z', TensorProto.FLOAT, [2, 3])

# Create node (NodeProto)
X = helper.make_node(
    'Constant',
    inputs=[],
    outputs=['X'],
    value=onnx.helper.make_tensor(
        name='const_tensor_x',
        data_type=onnx.TensorProto.FLOAT,
        dims=x.shape,
        vals=x.flatten().astype(float),
    ),
)
node_gemm = helper.make_node(
    'Gemm',        # op type
    ['w', 'X'],    # inputs
    ['Z'],         # outputs
    name='gemm1',
)
# Create the graph (GraphProto)
graph_def = helper.make_graph(
    [X, node_gemm],        # nodes
    'test-model',      # name
    [],  # inputs
    [Z],  # outputs
    initializer = [w],
)
model_def = helper.make_model(graph_def, producer_name='onnx-example')

#======== 2. save an onnx model
model_path = '../outs/model.onnx'
# save model with separate data files
onnx.save_model(model_def, model_path,
                save_as_external_data=True,
                all_tensors_to_one_file=False,
                size_threshold=0)

#======== 3. load and check
# (1) load with data, then check: will success
model_full = onnx.load(model_path)
onnx.checker.check_model(model_full)

# (2) load without data, then check: will fail
model_sep = onnx.load(model_path, load_external_data=False)
onnx.checker.check_model(model_sep)  # onnx.onnx_cpp2py_export.checker.ValidationError: Data of TensorProto ( tensor name: w) should be stored in w, but it doesn't exist or is not accessible.
  • Error:
"C:\Program Files\Python38\python.exe" D:/pythonProject/test/issue1_check_model.py
Traceback (most recent call last):
  File "D:/pythonProject/test/issue1_check_model.py", line 61, in <module>
    onnx.checker.check_model(model_sep)  # onnx.onnx_cpp2py_export.checker.ValidationError: Data of TensorProto ( tensor name: w) should be stored in w, but it doesn't exist or is not accessible.
  File "C:\Program Files\Python38\lib\site-packages\onnx\checker.py", line 104, in check_model
    C.check_model(protobuf_string)
onnx.onnx_cpp2py_export.checker.ValidationError: Data of TensorProto ( tensor name: w) should be stored in w, but it doesn't exist or is not accessible.

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 8 (5 by maintainers)

Top GitHub Comments

1 reaction
jcwchen commented, Jul 29, 2021

We are targeting the next release for this feature, but ONNX typically releases semi-annually, so official support is about six months out. However, once the feature is implemented and merged, you can use it right away by building the latest ONNX from source or installing the weekly TestPyPI package. I will post any updates in this thread. Thanks!

0 reactions
yjydfnhc commented, Sep 9, 2021

If you load the model first and then call the checker, it does not know the data directory, so it tries to find the external data in the current working directory and complains if it is not there. If your script and data are not in the same directory, use the checker API directly with the model path, i.e. check_model("D:/mytest/outs/model.onnx").

Great — following your advice, it indeed no longer fails when using checker.check_model(model_path) instead. Thanks!
