
[BUG] ValueError: Message onnx.ModelProto exceeds maximum protobuf size of 2GB: 2215158447

See original GitHub issue

@jcwchen Optimizing large models fails in the latest release of onnx (1.12.0) even with use_external_data_format=True.

In the latest version of onnxruntime, calling OnnxModel.save(self.model, output_path, use_external_data_format, all_tensors_to_one_file) fails with the following stack trace:

True
Traceback (most recent call last):
  File "examples/onnxruntime/optimization/question-answering/run_qa.py", line 525, in <module>
    main()
  File "examples/onnxruntime/optimization/question-answering/run_qa.py", line 311, in main
    optimizer.export(
  File "/home/mroyzen/train_files/optimum/optimum/onnxruntime/optimization.py", line 150, in export
    optimizer.save_model_to_file(onnx_optimized_model_output_path, use_external_data_format=True)
  File "/opt/conda/lib/python3.8/site-packages/onnxruntime/transformers/models/gpt2/…/…/onnx_model.py", line 938, in save_model_to_file
    OnnxModel.save(self.model, output_path, use_external_data_format, all_tensors_to_one_file)
  File "/opt/conda/lib/python3.8/site-packages/onnxruntime/transformers/models/gpt2/…/…/onnx_model.py", line 914, in save
    save_model(
  File "/opt/conda/lib/python3.8/site-packages/onnx/__init__.py", line 202, in save_model
    s = _serialize(proto)
  File "/opt/conda/lib/python3.8/site-packages/onnx/__init__.py", line 71, in _serialize
    result = proto.SerializeToString()
ValueError: Message onnx.ModelProto exceeds maximum protobuf size of 2GB: 2215158447
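For scale, the reported message size is just past protobuf's serialization ceiling, which comes from its use of signed 32-bit sizes. A quick back-of-the-envelope check (plain Python, illustrative only, not part of the original report):

    # Protobuf serializes a single message using signed 32-bit sizes,
    # so anything over 2**31 - 1 bytes cannot be written as one message.
    limit = 2**31 - 1          # 2,147,483,647 bytes (~2 GiB)
    reported = 2_215_158_447   # size from the ValueError above
    print(reported > limit)                       # True
    print(f"over by {reported - limit:,} bytes")  # 67,674,800 bytes (~67.7 MB)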

Linking my onnxruntime issue as well, and I believe this feature request is related. Your help would be appreciated. Thanks!
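For readers hitting the same wall, the sketch below shows onnx's own external-data save path, which sidesteps the 2 GB protobuf ceiling by writing tensor data to a separate file. It is illustrative only: model_proto and the file names are placeholders, and this is not the optimum/onnxruntime wrapper from the traceback above.

    import onnx

    # `model_proto` stands in for an in-memory onnx.ModelProto larger than 2 GB,
    # e.g. the output of an optimization pass (placeholder, not from the issue).
    onnx.save_model(
        model_proto,
        "model-optimized.onnx",
        save_as_external_data=True,       # keep raw tensor data out of the protobuf
        all_tensors_to_one_file=True,     # gather all tensors into a single side file
        location="model-optimized.onnx.data",
        size_threshold=1024,              # only externalize tensors above 1 KB
    )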

Issue Analytics

  • State: open
  • Created: a year ago
  • Comments: 20 (10 by maintainers)

Top GitHub Comments

1 reaction
JingyaHuang commented, Aug 10, 2022

Hi @jcwchen,

Thanks for the suggestion!

I just found that the difference actually came from the fact that I was using the optimization example instead of the quantization example. Now I have successfully exported the proto.

Thank you so much for helping. I will integrate these large-proto export features into optimum. Thanks again for your help!

0 reactions
jcwchen commented, Aug 9, 2022

Thanks for the update and the experiment. One more possible variable I can think of: did you run run_qa.py on optimum 1.12.3's commit (5a0106d781a8358aa7eab88a8b115a538b4840d1)?

Read more comments on GitHub >

Top Results From Across the Web

Not able to export large model with onnx - PyTorch Forums
But this doesn't work and I get the error: RuntimeError: Exporting model exceed maximum protobuf size of 2GB. Please call torch.onnx.export ...
Read more >
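The error quoted in that result comes from PyTorch's exporter. On torch versions that still accept the flag (it was later deprecated), the suggested call looks roughly like the sketch below, where model, dummy_input, and the output path are placeholders:

    import torch

    # `model` and `dummy_input` stand in for an nn.Module and a sample input tensor.
    torch.onnx.export(
        model,
        dummy_input,
        "large-model.onnx",
        opset_version=13,
        use_external_data_format=True,  # store weights outside the .onnx protobuf
    )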
tensorflow.GraphDef exceeded maximum protobuf size of 2GB ...
Description: When I attempt to convert a TensorFlow saved model, the TrtGraphConverter.convert() log shows the following error: [libprotobuf ERROR ...
Read more >
google protobuf maximum size - protocol buffers
10MB is pushing it but you'll probably be OK. Protobuf has a hard limit of 2GB, because many implementations use 32-bit signed arithmetic....
Read more >
Message onnx.ModelProto exceeds maximum protobuf size of ...
ModelProto exceeds maximum protobuf size of 2GB: 2215158447. @jcwchen Optimizing large models fails in the latest release of onnx (1.12.0) even with ...
Read more >
ValueError: Message onnx.ModelProto exceeds maximum ...
ONNXOptimizer: ValueError: Message onnx.ModelProto exceeds maximum protobuf size of 2GB: 2215158499.
Read more >
