Error during decoding
After training the model, I tried decoding with an ensemble of two checkpoints and I am getting the following error.
python sample_ensemble.py --models trained_models/EuTrans_esen_AttentionRNNEncoderDecoder_src_emb_250_bidir_True_enc_LSTM_32_dec_ConditionalLSTM_32_deepout_linear_trg_emb_250_Adam_0.001/epoch_1 trained_models/EuTrans_esen_AttentionRNNEncoderDecoder_src_emb_250_bidir_True_enc_LSTM_32_dec_ConditionalLSTM_32_deepout_linear_trg_emb_250_Adam_0.001/epoch_2 --dataset datasets/Dataset_EuTrans_esen.pkl --text examples/EuTrans/test.en
[01/11/2018 12:49:07] Reading parameters from config.py
[01/11/2018 12:49:07] <<< Cupy not available. Using numpy. >>>
Using Theano backend.
[01/11/2018 12:49:08] <<< Cupy not available. Using numpy. >>>
[01/11/2018 12:49:08] Using an ensemble of 2 models
[01/11/2018 12:49:08] <<< Loading model from trained_models/EuTrans_esen_AttentionRNNEncoderDecoder_src_emb_250_bidir_True_enc_LSTM_32_dec_ConditionalLSTM_32_deepout_linear_trg_emb_250_Adam_0.001/epoch_1_Model_Wrapper.pkl ... >>>
[01/11/2018 12:49:08] <<< Loading model from trained_models/EuTrans_esen_AttentionRNNEncoderDecoder_src_emb_250_bidir_True_enc_LSTM_32_dec_ConditionalLSTM_32_deepout_linear_trg_emb_250_Adam_0.001/epoch_1.h5 ... >>>
[01/11/2018 12:49:12] <<< Loading optimized model... >>>
[01/11/2018 12:49:15] <<< Optimized model loaded. >>>
[01/11/2018 12:49:15] <<< Model loaded in 7.5578 seconds. >>>
[01/11/2018 12:49:15] <<< Loading model from trained_models/EuTrans_esen_AttentionRNNEncoderDecoder_src_emb_250_bidir_True_enc_LSTM_32_dec_ConditionalLSTM_32_deepout_linear_trg_emb_250_Adam_0.001/epoch_2_Model_Wrapper.pkl ... >>>
[01/11/2018 12:49:15] <<< Loading model from trained_models/EuTrans_esen_AttentionRNNEncoderDecoder_src_emb_250_bidir_True_enc_LSTM_32_dec_ConditionalLSTM_32_deepout_linear_trg_emb_250_Adam_0.001/epoch_2.h5 ... >>>
[01/11/2018 12:49:17] <<< Loading optimized model... >>>
[01/11/2018 12:49:21] <<< Optimized model loaded. >>>
[01/11/2018 12:49:21] <<< Model loaded in 5.2314 seconds. >>>
[01/11/2018 12:49:21] <<< Loading Dataset instance from datasets/Dataset_EuTrans_esen.pkl ... >>>
[01/11/2018 12:49:21] <<< Dataset instance loaded >>>
[01/11/2018 12:49:21] Removed "val" set outputs of type "text" with id "target_text.
[01/11/2018 12:49:21] Applying tokenization function: "tokenize_none".
[01/11/2018 12:49:21] Loaded "val" set inputs of data_type "text" with data_id "source_text" and length 2996.
[01/11/2018 12:49:21] Loaded "val" set inputs of data_type "ghost" with data_id "state_below" and length 2996.
[01/11/2018 12:49:21] Loaded "val" set inputs of type "file-name" with id "raw_source_text".
[01/11/2018 12:49:21]
<<< Predicting outputs of val set >>>
[01/11/2018 12:49:28] We did not find a dynamic library in the library_dir of the library we use for blas. If you use ATLAS, make sure to compile it with dynamics library.
[01/11/2018 12:49:29] We did not find a dynamic library in the library_dir of the library we use for blas. If you use ATLAS, make sure to compile it with dynamics library.
Traceback (most recent call last):
File "sample_ensemble.py", line 61, in <module>
sample_ensemble(args, params)
File "/home/Vinay737/nmt-keras/nmt_keras/apply_model.py", line 94, in sample_ensemble
predictions = beam_searcher.predictBeamSearchNet()[s]
File "/home/Vinay737/nmt-keras/src/keras-wrapper/keras_wrapper/model_ensemble.py", line 249, in predictBeamSearchNet
return_alphas=self.return_alphas, model_ensemble=True, n_models=len(self.models))
File "/home/Vinay737/nmt-keras/src/keras-wrapper/keras_wrapper/search.py", line 135, in beam_search
new_hyp_samples.append(hyp_samples[ti] + [wi])
IndexError: list index out of range
Any fixes?
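For context on the failing line, beam search keeps a list of partial hypotheses and, at each step, extends each surviving hypothesis with its chosen next word; the IndexError means the parent index `ti` points past the end of the current hypothesis list. Below is a minimal, self-contained sketch of that expansion step with hypothetical names (this is illustrative, not the actual keras_wrapper implementation):

```python
# Minimal sketch of one beam-search expansion step (illustrative only,
# not the keras_wrapper/search.py implementation).
def extend_hypotheses(hyp_samples, parent_indices, next_words):
    """Extend each surviving hypothesis with its chosen next word.

    hyp_samples    : list of token lists, the current beam.
    parent_indices : for each new hypothesis, the index of its parent
                     in hyp_samples (ti in the traceback).
    next_words     : the word id appended to each parent (wi).
    """
    new_hyp_samples = []
    for ti, wi in zip(parent_indices, next_words):
        # Guard against the out-of-range parent index seen in the traceback.
        if ti >= len(hyp_samples):
            raise IndexError(
                "parent index %d out of range for beam of size %d"
                % (ti, len(hyp_samples))
            )
        new_hyp_samples.append(hyp_samples[ti] + [wi])
    return new_hyp_samples

beam = [[1, 4], [1, 7]]  # two partial hypotheses (token ids)
print(extend_hypotheses(beam, [0, 0, 1], [9, 5, 9]))
# -> [[1, 4, 9], [1, 4, 5], [1, 7, 9]]
```

If the crash happens only when ensembling, a mismatch between the models' vocabularies or beam states is a plausible place to look, since each model contributes scores over what must be the same hypothesis list.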
Issue Analytics
- State:
- Created 5 years ago
- Comments: 5 (5 by maintainers)
Top GitHub Comments
Working…!!
How about integrating evaluation metrics after this step, with BLEU or toolkits like this?
I'd rather keep the sampling process independent of the evaluation. After sampling, you can run your desired evaluator. There is a script (utils/evaluate_from_file.py) for computing several metrics according to the coco evaluation package. You can use this (or any other) for computing BLEU.
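As one hedged illustration of running an external evaluator on the sampled translations, the snippet below computes corpus-level BLEU with NLTK (note: this uses NLTK rather than the coco evaluation package mentioned above, and the in-memory sentences stand in for the hypothesis and reference files you would actually load):

```python
# Hedged sketch: corpus-level BLEU with NLTK, as a stand-in for running
# utils/evaluate_from_file.py. Sentences here are toy examples; in practice
# you would read one hypothesis file and one reference file line by line.
from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

hyps = ["the house is small", "he reads a book"]
refs = ["the house is small", "he is reading a book"]

# corpus_bleu expects tokenized text, with one LIST of references
# per hypothesis (multiple references per sentence are allowed).
references = [[r.split()] for r in refs]
hypotheses = [h.split() for h in hyps]

# Smoothing avoids zero scores when a higher-order n-gram never matches,
# which is common on very small corpora like this one.
smooth = SmoothingFunction().method1
score = corpus_bleu(references, hypotheses, smoothing_function=smooth)
print("BLEU: %.2f" % (100.0 * score))
```

The same file-based workflow applies to any metric: decode once with sample_ensemble.py, then point whichever evaluator you prefer at the output file.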