suggestion: save the training history in model.train_history_
For example, if we are using verbose and validation:
'''
Demonstration of validation_split
'''
model.fit(X_train, y_train, nb_epoch=3, batch_size=16, validation_split=0.1, show_accuracy=True, verbose=1)
# outputs
'''
Train on 37800 samples, validate on 4200 samples
Epoch 0
37800/37800 [==============================] - 7s - loss: 0.0385 - acc.: 0.7258 - val. loss: 0.0160 - val. acc.: 0.9136
Epoch 1
37800/37800 [==============================] - 8s - loss: 0.0140 - acc.: 0.9265 - val. loss: 0.0109 - val. acc.: 0.9383
Epoch 2
10960/37800 [=======>......................] - ETA: 4s - loss: 0.0109 - acc.: 0.9420
'''
But after fitting is finished, we can only get the final model's performance. So my suggestion is to save the per-epoch performance in a new attribute inside the class, for example:
model.fit(X_train, y_train, nb_epoch=3, batch_size=16, validation_split=0.1, show_accuracy=True, verbose=1)
model.train_history_
# outputs
'''
{
'epoch': [0, 1, 2],
'loss': [0.0385, 0.0140, 0.0109],
'acc': [0.7258, 0.9265, 0.9420],
'val_loss': [0.0160, 0.0169, 0.0170],
'val_acc': [0.9136, 0.9383, 0.9400]
}
'''
That way we can analyze the history, for example by plotting how loss and val_loss change over the epochs, to select the best number of epochs and prevent overfitting.
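As an illustration, here is a minimal sketch of the kind of analysis this would enable, assuming the proposed model.train_history_ attribute exists (it is only the suggestion above, not part of the released API):

import matplotlib.pyplot as plt

history = model.train_history_                      # proposed attribute from the suggestion above
plt.plot(history['epoch'], history['loss'], label='loss')
plt.plot(history['epoch'], history['val_loss'], label='val_loss')
plt.xlabel('epoch')
plt.ylabel('loss')
plt.legend()
plt.show()                                          # pick the epoch where val_loss stops improving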
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Hi, I implemented this wrapper class for tracking the training history.
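The commenter's code is not reproduced here; below is a minimal sketch of what such a wrapper might look like, fitting one epoch at a time and evaluating after each epoch. The nb_epoch/show_accuracy arguments and the evaluate() call follow the old-style keras API used in the example above and are assumptions, not the commenter's actual code:

class HistorySequential:
    '''Wraps an existing Sequential() model and records per-epoch metrics.'''
    def __init__(self, model):
        self.model = model
        self.train_history_ = {'epoch': [], 'loss': [], 'acc': [],
                               'val_loss': [], 'val_acc': []}

    def fit(self, X, y, X_val, y_val, nb_epoch=3, batch_size=16):
        for epoch in range(nb_epoch):
            # train one epoch at a time so we can measure metrics in between
            self.model.fit(X, y, nb_epoch=1, batch_size=batch_size,
                           show_accuracy=True, verbose=1)
            loss, acc = self.model.evaluate(X, y, show_accuracy=True, verbose=0)
            val_loss, val_acc = self.model.evaluate(X_val, y_val,
                                                    show_accuracy=True, verbose=0)
            self.train_history_['epoch'].append(epoch)
            self.train_history_['loss'].append(loss)
            self.train_history_['acc'].append(acc)
            self.train_history_['val_loss'].append(val_loss)
            self.train_history_['val_acc'].append(val_acc)
        return self.train_history_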
Well, I am just a newbie to keras and deep learning. I just followed the example and have never used `train` yet, so I do not know what would happen. But I have looked at your source code. It seems that the design of keras provides at least some compatibility with `scikit-learn`, right? At least the `Sequential()` class has `.fit()`, `.describe()`, `.predict()`, and `.predict_proba()`. And in `scikit-learn` it is common to save some kind of `.train_history_` attribute inside the class after a `fit()`, instead of returning a value.

And I do not think it is a problem to `fit` several times. After all, each `fit` naturally has a different training history, and it is the developers' responsibility to save the training history before they fit again.

Last but not least, I am using a custom sub-class to provide the numerical information. `val_loss` and `val_acc` work as expected, but `loss` and `acc` don't, since `loss` and `acc` are actually averages of something inside `Progbar`, and I could not figure out how to get the correct values. Could you provide some help? I think solving this would save some time for future development.

It is almost the same as the original `Sequential()`, but I add a line `train_history = []` in the middle, and I add lines for `train_history` at the end of each epoch. I always use `show_accuracy` and `do_validation`, so I did not put the code inside the `if else` branches.
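For illustration, here is a rough sketch of the per-epoch bookkeeping described above. It is a simplified stand-in, not the actual keras fit() source; iterate_minibatches, train_one_batch and evaluate_validation are hypothetical helpers standing in for the real training internals:

train_history = []                       # the line added "in the middle" of fit()

for epoch in range(nb_epoch):
    seen, loss_sum, acc_sum = 0, 0.0, 0.0
    for X_batch, y_batch in iterate_minibatches(X_train, y_train, batch_size):  # hypothetical helper
        batch_loss, batch_acc = train_one_batch(X_batch, y_batch)               # hypothetical helper
        # accumulate per-batch results weighted by batch size; this running
        # average is the same kind of quantity the progress bar (Progbar) shows
        seen += len(X_batch)
        loss_sum += batch_loss * len(X_batch)
        acc_sum += batch_acc * len(X_batch)
    val_loss, val_acc = evaluate_validation(X_val, y_val)                        # hypothetical helper
    # appended at the end of each epoch (show_accuracy and do_validation assumed on)
    train_history.append({'epoch': epoch,
                          'loss': loss_sum / seen, 'acc': acc_sum / seen,
                          'val_loss': val_loss, 'val_acc': val_acc})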