mT5 model loading fails
See original GitHub issue
Hello, I have an mT5 pretrained model and am using the fastt5 approach to convert it to ONNX. The conversion works fine, but creating the decoder session fails at
decoder_sess = InferenceSession(str(path_to_decoder))
More specifically, it fails at
# initialize the C++ InferenceSession
sess.initialize_session(providers, provider_options, disabled_optimizers)
It fails without any error message, only:
Process finished with exit code 135 (interrupted by signal 7: SIGEMT)
Loading the encoder model works, but loading the decoder model does not.
I am using the latest version of fastt5 (0.1.4). Any ideas on how to create the session?
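For reference, exit code 135 encodes a fatal signal rather than a Python exception. A minimal sketch of decoding it (the shell convention of 128 + signal number is standard; the mapping of signal 7 to SIGEMT depends on the platform):

```python
# Decode the shell exit status reported above. By convention, an exit code
# above 128 means the process was killed by signal (exit_code - 128).
exit_code = 135
signal_number = exit_code - 128
print(signal_number)  # 7 -> matches "interrupted by signal 7: SIGEMT"
```

Since the crash happens inside the C++ `initialize_session` call, a fatal signal at that point usually indicates a native-memory problem (for example a corrupt or truncated ONNX file) rather than a Python-level bug.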
Issue Analytics
- State:
- Created a year ago
- Comments: 11 (4 by maintainers)
Top GitHub Comments
It looks like the issue is in onnxruntime itself; I suggest you create an issue there.
I tried a different approach; now it gives:
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Deserialize tensor onnx::MatMul_4622 failed.tensorprotoutils.cc:637 TensorProtoToTensor External initializer: onnx::MatMul_4622 offset: 0 size to read: 11534336 given file_length: 4194304 are out of bounds or can not be read in full.
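The numbers in that message already show the mismatch: onnxruntime tries to read an external initializer of 11,534,336 bytes from a file that is only 4,194,304 bytes long. A sketch of the bounds check it performs, using the values from the error (variable names are illustrative, not onnxruntime's internals):

```python
# Values taken from the error message above.
offset = 0
size_to_read = 11534336   # bytes the initializer onnx::MatMul_4622 needs
file_length = 4194304     # actual length of the external-data file

# onnxruntime fails when the requested span does not fit inside the file.
fits = offset + size_to_read <= file_length
print(fits)  # False -> "out of bounds or can not be read in full"
```

This typically means the external-data file saved alongside the ONNX model is truncated, or the model was moved without its companion data file. Re-exporting the model, or re-saving it with `onnx.save_model(..., save_as_external_data=True)` so the weights are written out in full, is a reasonable next step (a hedged suggestion; the thread does not confirm the root cause).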