Convert PLATO-2 model to ONNX format
Hello everyone,
I'm currently trying to convert the PLATO-2 model into ONNX format using Paddle2ONNX. However, when I try to convert the NSP model, I get this error:
Traceback (most recent call last):
  File "/usr/local/bin/paddle2onnx", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.7/dist-packages/paddle2onnx/command.py", line 184, in main
    input_shape_dict=input_shape_dict)
  File "/usr/local/lib/python3.7/dist-packages/paddle2onnx/command.py", line 148, in program2onnx
    operator_export_type=operator_export_type)
  File "/usr/local/lib/python3.7/dist-packages/paddle2onnx/convert.py", line 84, in program2onnx
    enable_onnx_checker, operator_export_type)
  File "/usr/local/lib/python3.7/dist-packages/paddle2onnx/convert.py", line 34, in export_onnx
    operator_export_type, verbose)
  File "/usr/local/lib/python3.7/dist-packages/paddle2onnx/graph/onnx_graph.py", line 240, in build
    onnx_graph = ONNXGraph(paddle_graph, opset_version=opset_version, operator_export_type=operator_export_type)
  File "/usr/local/lib/python3.7/dist-packages/paddle2onnx/graph/onnx_graph.py", line 79, in __init__
    self.update_opset_version()
  File "/usr/local/lib/python3.7/dist-packages/paddle2onnx/graph/onnx_graph.py", line 194, in update_opset_version
    self.opset_version = OpMapper.get_recommend_opset_version(node_map, self.opset_version)
  File "/usr/local/lib/python3.7/dist-packages/paddle2onnx/op_mapper/op_mapper.py", line 129, in get_recommend_opset_version
    node_map, opset_version, True)
  File "/usr/local/lib/python3.7/dist-packages/paddle2onnx/op_mapper/op_mapper.py", line 174, in check_support_status
    raise NotImplementedError(error_info)
NotImplementedError:
There's 1 ops are not supported yet
=========== gather_nd ===========
Is this a Paddle2ONNX issue?
Also, has anyone successfully converted the PLATO-2 model to ONNX format, with Paddle2ONNX or by another method, and would you mind sharing how?
Thank you very much in advance!
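For context, Paddle's `gather_nd` op maps to the ONNX `GatherND` operator, which only exists from ONNX opset 11 onward, so the converter's op mapper must both know the mapping and target a high enough opset. A hedged sketch of a retry, assuming a recent Paddle2ONNX release that supports `gather_nd`; the model directory and file names below are placeholders, not the actual PLATO-2 file names:

```shell
# Upgrade Paddle2ONNX first; older releases have no gather_nd mapping at all.
pip install -U paddle2onnx

# Re-run the export, requesting opset >= 11 so GatherND is available.
# ./nsp_model, model.pdmodel and model.pdiparams are illustrative names --
# substitute the paths of your exported NSP inference model.
paddle2onnx --model_dir ./nsp_model \
            --model_filename model.pdmodel \
            --params_filename model.pdiparams \
            --save_file nsp_model.onnx \
            --opset_version 11
```

If the op is still reported as unsupported after upgrading, the mapping genuinely does not exist in your Paddle2ONNX version and the issue belongs in the Paddle2ONNX repository.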
Issue Analytics
- State:
- Created: 2 years ago
- Comments: 6
Please file the same issue with Paddle2ONNX; this will take some effort to solve.
Hello, we have received your question and will resolve it as soon as possible.