example code lstm_seq2seq.py warns about non-serializable keywords when attempting to save a model
Please make sure that the boxes below are checked before you submit your issue. If your issue is an implementation question, please ask your question on StackOverflow or join the Keras Slack channel and ask there instead of filing a GitHub issue.
Thank you!
- [x] Check that you are up-to-date with the master branch of Keras. You can update with: pip install git+git://github.com/keras-team/keras.git --upgrade --no-deps
- [x] If running on TensorFlow, check that you are up-to-date with the latest version. The installation instructions can be found here.
- [ ] If running on Theano, check that you are up-to-date with the master branch of Theano. You can update with: pip install git+git://github.com/Theano/Theano.git --upgrade --no-deps
- [x] Provide a link to a GitHub Gist of a Python script that can reproduce your issue (or just copy the script here if it is short). https://github.com/keras-team/keras/blob/master/examples/lstm_seq2seq.py
Basic Issue: I’m running on TensorFlow
In the Keras example linked above, the file lstm_seq2seq.py generates an error.
Line 153:

```python
model.save('s2s.h5')
```

returns

```
2379: UserWarning: Layer lstm_2 was passed non-serializable keyword arguments: {'initial_state': [<tf.Tensor 'lstm_1/while/Exit_2:0' shape=(?, 256) dtype=float32>, <tf.Tensor 'lstm_1/while/Exit_3:0' shape=(?, 256) dtype=float32>]}. They will not be included in the serialized model (and thus will be missing at deserialization time).
  str(node.arguments) + '. They will not be included '
```
Although this is phrased as a warning rather than an error, the result seems to be that the saved model is missing required information.
I've successfully saved other models in the past, so it's something specific to this model.
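For context, the warning comes from the decoder LSTM being called with the encoder's states passed as Keras tensors via initial_state. The relevant part of the example looks roughly like this (a paraphrase of lstm_seq2seq.py, not the exact script; the token counts below are placeholders, and latent_dim = 256 matches the shapes in the warning):

```python
from keras.models import Model
from keras.layers import Input, LSTM, Dense

latent_dim = 256          # matches the (?, 256) tensors in the warning
num_encoder_tokens = 71   # placeholder; computed from the data in the example
num_decoder_tokens = 93   # placeholder; computed from the data in the example

# Encoder: keep only the final hidden and cell states.
encoder_inputs = Input(shape=(None, num_encoder_tokens))
encoder_outputs, state_h, state_c = LSTM(latent_dim, return_state=True)(encoder_inputs)
encoder_states = [state_h, state_c]

# Decoder: initial_state is a list of Keras tensors, which is the
# keyword argument the warning says cannot be serialized.
decoder_inputs = Input(shape=(None, num_decoder_tokens))
decoder_lstm = LSTM(latent_dim, return_sequences=True, return_state=True)
decoder_outputs, _, _ = decoder_lstm(decoder_inputs, initial_state=encoder_states)
decoder_outputs = Dense(num_decoder_tokens, activation='softmax')(decoder_outputs)

model = Model([encoder_inputs, decoder_inputs], decoder_outputs)
```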
Other information
I've tried breaking up the model and saving the weights and config separately (see below), but model.get_weights() returns the same error.
```python
# alternative method to save model by breaking it up into weights and config
import os
import pickle

def save_model(model, MODEL_DIR):
    if not os.path.isdir(MODEL_DIR):
        os.makedirs(MODEL_DIR)
    weights = model.get_weights()
    with open(os.path.join(MODEL_DIR, 'model'), 'wb') as file_:
        # note: weights[1:] skips the first weight array
        pickle.dump(weights[1:], file_)
    with open(os.path.join(MODEL_DIR, 'config.json'), 'w') as file_:
        file_.write(model.to_json())

save_model(model, 'model_dir')
```
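As a possible workaround (only a sketch, assuming the warning affects serialization of the call arguments rather than the weight values themselves), the built-in weight helpers could be used instead of pickling, with the architecture rebuilt by re-running the model-definition code from lstm_seq2seq.py:

```python
# Sketch of a weights-only round trip; assumes `model` is the trained
# seq2seq model from lstm_seq2seq.py.
model.save_weights('s2s_weights.h5')

# ... later, in a fresh session: re-run the model-definition code from
# lstm_seq2seq.py to rebuild the architecture, then reload the weights.
model.load_weights('s2s_weights.h5')
```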
I tried to look into how model.get_weights() is implemented. It's just a loop that calls layer.get_weights() for each layer of the model.
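Conceptually the loop is something like the following simplified sketch (not the actual Keras source, which batches the backend reads):

```python
def get_all_weights(model):
    # Simplified stand-in for Model.get_weights():
    # gather every layer's weight arrays in order.
    weights = []
    for layer in model.layers:
        weights.extend(layer.get_weights())
    return weights
```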
@microdave There are two versions of the encoder/decoder constructor. The one at https://github.com/keras-team/keras/blob/master/examples/lstm_seq2seq.py (as linked to by the OP) only works if you have just trained the model, because it relies on encoder_inputs and encoder_states already being defined when it assigns:

```python
encoder_model = Model(encoder_inputs, encoder_states)
```

It has these because encoder_inputs and encoder_states are defined during model setup. The other version, at https://github.com/keras-team/keras/blob/master/examples/lstm_seq2seq_restore.py, is what you need if you are reloading the model: it dissects the layers of the loaded model and picks out the bits it needs to reconstruct everything, e.g. it precedes the line above with code that recovers encoder_inputs and encoder_states from the loaded model's layers. I had the same experience as you until I realised this second version was needed, so hopefully this will fix your issue.
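For reference, the restore script rebuilds the inference models roughly along these lines (a paraphrase of lstm_seq2seq_restore.py, not a verbatim copy; the layer indices assume the exact architecture of the example, and latent_dim must match the 256 used there):

```python
from keras.models import Model, load_model
from keras.layers import Input

latent_dim = 256  # must match the trained model

model = load_model('s2s.h5')

# Recover the encoder pieces from the loaded model's layers.
encoder_inputs = model.input[0]
encoder_outputs, state_h_enc, state_c_enc = model.layers[2].output  # encoder LSTM
encoder_states = [state_h_enc, state_c_enc]
encoder_model = Model(encoder_inputs, encoder_states)

# Rebuild the decoder as a standalone inference model that takes states as inputs.
decoder_inputs = model.input[1]
decoder_state_input_h = Input(shape=(latent_dim,))
decoder_state_input_c = Input(shape=(latent_dim,))
decoder_states_inputs = [decoder_state_input_h, decoder_state_input_c]
decoder_lstm = model.layers[3]
decoder_outputs, state_h_dec, state_c_dec = decoder_lstm(
    decoder_inputs, initial_state=decoder_states_inputs)
decoder_states = [state_h_dec, state_c_dec]
decoder_dense = model.layers[4]
decoder_outputs = decoder_dense(decoder_outputs)
decoder_model = Model(
    [decoder_inputs] + decoder_states_inputs,
    [decoder_outputs] + decoder_states)
```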
I am also getting the same “warning”. Any solution to this problem yet?