I exported VGG16 from Keras as an ONNX file:

import onnxmltools
# vgg: a Keras VGG16 model loaded beforehand (e.g. keras.applications.VGG16)
onnx_model = onnxmltools.convert_keras(vgg)
onnxmltools.utils.save_model(onnx_model, 'vgg.onnx')

Then, on a Jetson TX2, I executed: onnx2trt vgg.onnx -o onnx.trt

This is what I got:

[libprotobuf ERROR google/protobuf/text_format.cc:307] Error parsing text-format onnx2trt_onnx.ModelProto: 1:1: Invalid control characters encountered in text.
[libprotobuf ERROR google/protobuf/text_format.cc:307] Error parsing text-format onnx2trt_onnx.ModelProto: 1:16: Invalid control characters encountered in text.
[libprotobuf ERROR google/protobuf/text_format.cc:307] Error parsing text-format onnx2trt_onnx.ModelProto: 1:21: Already saw decimal point or exponent; can't have another one.
[libprotobuf ERROR google/protobuf/text_format.cc:307] Error parsing text-format onnx2trt_onnx.ModelProto: 1:18: Message type "onnx2trt_onnx.ModelProto" has no field named "OnnxMLTools".
Failed to parse ONNX model
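
One way to narrow this down is to confirm that the exported file is a valid binary ONNX protobuf before handing it to onnx2trt, since the errors above come from protobuf failing to parse the file. A minimal check, assuming the onnx Python package is installed on the machine that did the export (an illustrative sketch, not part of the original report):

import onnx

# Load the exported file as a binary protobuf and run the schema checker;
# if this also fails, the problem is in the export step rather than in onnx2trt.
model = onnx.load('vgg.onnx')
onnx.checker.check_model(model)
print(model.ir_version, model.producer_name, model.producer_version)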

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Comments: 6 (2 by maintainers)

Top GitHub Comments

1 reaction
jiafatom commented on Feb 21, 2019

@MohamedAfifii, can you try converting the Keras model with https://github.com/onnx/keras-onnx? There is a README there, and we successfully converted VGG16 with it.
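
For reference, a minimal conversion sketch along those lines, assuming the keras2onnx package from that repository and a stock ImageNet VGG16 (an illustrative, untested sketch; names follow that repository's README):

import keras2onnx
from keras.applications.vgg16 import VGG16

# Load the stock VGG16 model (ImageNet weights, for illustration only)
vgg = VGG16(weights='imagenet')

# Convert the Keras model to ONNX and save it as a binary protobuf file
onnx_model = keras2onnx.convert_keras(vgg, vgg.name)
keras2onnx.save_model(onnx_model, 'vgg.onnx')

The resulting vgg.onnx can then be passed to onnx2trt the same way as before.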

0 reactions
DJMeng commented on Jul 17, 2019

@MohamedAfifii Hi, I ran into the same problem: [8] No importer registered for op: Cast. Did you solve it? Please help. Thanks in advance!


Top Results From Across the Web

onnx2trt error · Issue #842 · onnx/onnx-tensorrt - GitHub
cpp:370: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32....
ERROR with ONNX2TRT : Unknown embedded device detected
Hi. We test your model with XavierNX+JetPack4.6.1. It can run normally with the trtexec tool. Do you meet the error with this model?...
Tutorial 9: ONNX to TensorRT (Experimental)
If you meet any problem with the listed models above, please create an issue and it would be taken care of soon. For...
Why TensorRT ONNX parser fails, while parsing the ... - LinkedIn
Below are the two most common errors (but not limited to) we get while parsing the ONNX model in TensorRT. Please ensure the...
Caffe2's bug, with TensorRT? - PyTorch Forums
pytorch/caffe2/contrib/tensorrt/trt_utils.h:16:8: error: looser exception specification ... third_party/onnx-tensorrt/onnx2trt.hpp:9, from .