unet export to onnx issue
Describe the bug
Trying to export the UNet model to ONNX. Used this script (a hedged sketch of such an export is included after the error output below). The .onnx file was created successfully, but the exported model fails the ONNX checker:
import onnx

# Load the exported model and run the ONNX validity checker on it
onnx_model = onnx.load("model.onnx")
onnx.checker.check_model(onnx_model)
Error:
protobuf_string = model if isinstance(model, bytes) else model.SerializeToString()
ValueError: Message onnx.ModelProto exceeds maximum protobuf size of 2GB: 3439066593
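For context, a minimal sketch of the kind of export that can produce this model, assuming the diffusers-0.3.0-era UNet2DConditionModel API and Stable Diffusion v1 input shapes. The model ID, input shapes, and opset below are illustrative assumptions, not the script from the original issue:

# Hedged sketch only: model ID, shapes, and opset are assumptions,
# not taken from the script linked in the original report.
import torch
from diffusers import UNet2DConditionModel


class UNetWrapper(torch.nn.Module):
    """Returns a plain tuple (return_dict=False) so ONNX tracing works."""

    def __init__(self, unet):
        super().__init__()
        self.unet = unet

    def forward(self, sample, timestep, encoder_hidden_states):
        return self.unet(sample, timestep, encoder_hidden_states, return_dict=False)


unet = UNet2DConditionModel.from_pretrained(
    "CompVis/stable-diffusion-v1-4", subfolder="unet"
)
unet.eval()

# Dummy inputs shaped for SD v1: latent sample, timestep, CLIP text embeddings
sample = torch.randn(1, 4, 64, 64)
timestep = torch.tensor(1)
encoder_hidden_states = torch.randn(1, 77, 768)

torch.onnx.export(
    UNetWrapper(unet),
    (sample, timestep, encoder_hidden_states),
    "model.onnx",
    input_names=["sample", "timestep", "encoder_hidden_states"],
    output_names=["out_sample"],
    opset_version=14,
)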
Reproduction
No response
Logs
No response
System Info
- Python 3.8.13
- PyTorch 1.13
- ONNX 1.12.0
- transformers 4.22.1
- diffusers 0.3.0
Issue Analytics
- Created a year ago
- Comments: 5 (3 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Hi @tanayvarshney! The exported UNet is indeed >2GB due to the protobuf size limits, but the directory of the exported model contains external weight tensors. So while this fails the onnx check, onnxruntime is able to load the model just fine 😃

P.S. I'm preparing an update for the onnx exporting script that will save the UNet in fp16, which should fit into 2GB.
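Both points can be sketched in a few lines, assuming the external weight files sit in the same directory as model.onnx. Note that onnx.checker.check_model also accepts a file path, and in that form it can validate models over 2GB instead of serializing the in-memory proto (which is what raised the ValueError above):

# Assumes model.onnx and its external weight files share one directory.
import onnx
import onnxruntime as ort

# Passing a *path* lets the checker handle >2GB models; passing the
# loaded ModelProto triggers the 2GB serialization error shown above.
onnx.checker.check_model("model.onnx")

# onnxruntime resolves the external tensors and loads the model fine.
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
print([inp.name for inp in session.get_inputs()])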
https://github.com/huggingface/diffusers/pull/932 enabled an FP16 conversion, so it should be possible to manually convert and use the checkpoints with size-sensitive onnx tools now. Feel free to ping me here if you have any issues with that workflow! Hardware-independent FP16 checkpoints are still WIP, as they require still-unreleased onnxruntime updates.
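Until that lands, one possible manual FP16 route is a sketch with onnxconverter-common, not the script from the PR above; halving the weights should bring this ~3.4GB model back under the 2GB protobuf limit:

# Hedged sketch using onnxconverter-common (pip install onnxconverter-common);
# one possible manual workflow, not the script from the linked PR.
import onnx
from onnxconverter_common import float16

# onnx.load resolves the external weight tensors into one ModelProto.
model = onnx.load("model.onnx")

# keep_io_types=True keeps float32 inputs/outputs so callers need no changes.
model_fp16 = float16.convert_float_to_float16(model, keep_io_types=True)

onnx.save(model_fp16, "model_fp16.onnx")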