
Can't convert Core ML model to Onnx

See original GitHub issue

Hello,

I’m trying to convert a trained Core ML model (activity classification) to ONNX so that I can then convert it to TensorFlow Lite. The problem is that I get errors. I’ve tried different versions of Python, onnxmltools, and winmltools, and it doesn’t seem to work. I also tried the onnx-ecosystem Docker image, with the same result. Can anyone help me with this? Thanks in advance.

Script

import coremltools
import onnxmltools

input_coreml_model = '../model.mlmodel'
output_onnx_model = '../model.onnx'

# Load the Core ML spec
coreml_model = coremltools.utils.load_spec(input_coreml_model)
# Convert the CoreML model into ONNX
onnx_model = onnxmltools.convert_coreml(coreml_model)
# Save as protobuf
onnxmltools.utils.save_model(onnx_model, output_onnx_model)

Error Messages

IndexError                                Traceback (most recent call last)
<ipython-input-11-94a6dc527869> in <module>
      3 
      4 # Convert the CoreML model into ONNX
----> 5 onnx_model = onnxmltools.convert_coreml(coreml_model)
      6 
      7 # Save as protobuf

/usr/local/lib/python3.6/dist-packages/onnxmltools/convert/main.py in convert_coreml(model, name, initial_types, doc_string, target_opset, targeted_onnx, custom_conversion_functions, custom_shape_calculators)
     16     from .coreml.convert import convert
     17     return convert(model, name, initial_types, doc_string, target_opset, targeted_onnx,
---> 18                    custom_conversion_functions, custom_shape_calculators)
     19 
     20 

/usr/local/lib/python3.6/dist-packages/onnxmltools/convert/coreml/convert.py in convert(model, name, initial_types, doc_string, target_opset, targeted_onnx, custom_conversion_functions, custom_shape_calculators)
     58     target_opset = target_opset if target_opset else get_opset_number_from_onnx()
     59     # Parse CoreML model as our internal data structure (i.e., Topology)
---> 60     topology = parse_coreml(spec, initial_types, target_opset, custom_conversion_functions, custom_shape_calculators)
     61 
     62     # Parse CoreML description, author, and license. Those information will be attached to the final ONNX model.

/usr/local/lib/python3.6/dist-packages/onnxmltools/convert/coreml/_parse.py in parse_coreml(model, initial_types, target_opset, custom_conversion_functions, custom_shape_calculators)
    465     # Instead of using CoremlModelContainer, we directly pass the model in because _parse_model is CoreML-specific.
    466     _parse_model(topology, scope, model)
--> 467     topology.compile()
    468 
    469     for variable in topology.find_root_and_sink_variables():

/usr/local/lib/python3.6/dist-packages/onnxconverter_common/topology.py in compile(self)
    630         self._resolve_duplicates()
    631         self._fix_shapes()
--> 632         self._infer_all_types()
    633         self._check_structure()
    634 

/usr/local/lib/python3.6/dist-packages/onnxconverter_common/topology.py in _infer_all_types(self)
    506                 pass  # in Keras converter, the shape calculator can be optional.
    507             else:
--> 508                 operator.infer_types()
    509 
    510     def _resolve_duplicates(self):

/usr/local/lib/python3.6/dist-packages/onnxconverter_common/topology.py in infer_types(self)
    108     def infer_types(self):
    109         # Invoke a core inference function
--> 110         registration.get_shape_calculator(self.type)(self)
    111 
    112 

/usr/local/lib/python3.6/dist-packages/onnxmltools/convert/coreml/shape_calculators/neural_network/Concat.py in calculate_concat_output_shapes(operator)
     22         if variable.type.shape[0] != 'None' and variable.type.shape[0] != output_shape[0]:
     23             raise RuntimeError('Only dimensions along C-axis can be different')
---> 24         if variable.type.shape[2] != 'None' and variable.type.shape[2] != output_shape[2]:
     25             raise RuntimeError('Only dimensions along C-axis can be different')
     26         if variable.type.shape[3] != 'None' and variable.type.shape[3] != output_shape[3]:

IndexError: list index out of range
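
The last frame above shows where the conversion breaks: the Concat shape calculator reads variable.type.shape[2] and shape[3], so it assumes every tensor feeding the Concat layer has at least four dimensions, and the IndexError means at least one tensor in this model is described with a shorter shape. The small inspection sketch below (reusing the same coremltools call as the script above, and assuming the same model path) shows which input and output types the spec actually declares:

import coremltools

# Load the saved spec and print the declared input/output feature types.
# The Concat shape calculator in the traceback indexes shape[2] and shape[3],
# so any tensor described with fewer than four dimensions will trigger the
# “list index out of range” error shown above.
spec = coremltools.utils.load_spec('../model.mlmodel')
print(spec.description.input)
print(spec.description.output)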

Issue Analytics

  • State: open
  • Created: 4 years ago
  • Comments: 12

Top GitHub Comments

6 reactions
bwery commented, Sep 11, 2020

Investigating a little more, I have found that the problem clearly lies inside the “optimize_onnx” routine in the file “optimizer.py” of the imported project “onnxconverter_common”.

This “optimize_onnx” routine takes a topology as input and produces an “optimized” topology of the same class as output, and the operation appears to be optional (it is controlled by the flag “container.enable_optimizer”, although I do not see what would ever make this flag false). A workaround is therefore simply to skip this optimization step.

To implement the workaround, I replaced line 796 in topology.py with the content of line 798.

My network is now properly converted and operational. A look in Netron shows its structure is what I was expecting.
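
If editing the installed topology.py is not an option, the same effect can likely be achieved by stubbing out optimize_onnx before calling the converter. The sketch below is only an illustration of that idea: it assumes the routine is resolved through the onnxconverter_common.optimizer and onnxconverter_common.topology module attributes (the exact import path and argument names vary between releases), and it simply hands the input back unchanged, which is what skipping the optional optimization pass amounts to. A cleaner route would be a public switch for container.enable_optimizer, but as noted above there does not appear to be one.

import onnxconverter_common.optimizer as _optimizer
import onnxconverter_common.topology as _topology

def _skip_optimize_onnx(first_arg, *args, **kwargs):
    # Return the input untouched, i.e. behave as if the optional
    # optimization pass had been skipped entirely.
    return first_arg

# Patch every place the symbol might be resolved from, since different
# releases import it differently.
_optimizer.optimize_onnx = _skip_optimize_onnx
if hasattr(_topology, 'optimize_onnx'):
    _topology.optimize_onnx = _skip_optimize_onnx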

0 reactions
bwery commented, Mar 17, 2021

I apologize for the long delay before answering.

I have now upgraded to release 1.8.0 of both onnxconverter-common and onnxmltools. The problem appears to be solved on my side.

Thank you!
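
For anyone landing here with the same traceback, the resolution above boils down to upgrading both converter packages, for example with pip install onnxmltools==1.8.0 onnxconverter-common==1.8.0 (the version numbers are taken from the comment; later releases may also work). A quick sanity check that the upgraded releases are the ones actually being imported, before re-running the conversion script:

import onnxmltools
import onnxconverter_common

# Both packages expose a __version__ attribute; confirm the interpreter picks
# up the upgraded installs before re-running the conversion script above.
print(onnxmltools.__version__)           # expecting 1.8.0
print(onnxconverter_common.__version__)  # expecting 1.8.0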
