
Optional logging of validation loss (and other metrics) in KerasModel

See original GitHub issue

KerasModel (and TensorGraph) currently don’t support periodic logging of validation loss. Would it be a good idea to have this in the fit_generator and fit API?

This would be an optional argument with the modified API looking something like:

def fit(self, dataset, nb_epoch=10, max_checkpoints_to_keep=5, 
      checkpoint_interval=1000, deterministic=False, restore=False, 
      submodel=None, val_dataset=None, eval_interval=1000, **kwargs):

It would make it easier to do things like early stopping, which was used in the ChemNet transfer learning paper, and would be generally useful as well.
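As a rough illustration of how the proposed `val_dataset`/`eval_interval` arguments could support early stopping, here is a minimal, framework-free sketch. `ToyModel`, its loss, and the `patience` parameter are hypothetical stand-ins for illustration only, not DeepChem's actual implementation:

```python
# Hypothetical sketch of the proposed val_dataset/eval_interval arguments
# driving early stopping. ToyModel and its loop are stand-ins, not DeepChem code.

class ToyModel:
    def __init__(self):
        self.weight = 0.0
        self.val_losses = []  # validation losses logged every eval_interval steps

    def _train_step(self, x, y):
        # one gradient step for a one-parameter least-squares model
        grad = 2 * (self.weight * x - y) * x
        self.weight -= 0.1 * grad

    def _val_loss(self, val_dataset):
        return sum((self.weight * x - y) ** 2 for x, y in val_dataset) / len(val_dataset)

    def fit(self, dataset, nb_epoch=10, val_dataset=None, eval_interval=2, patience=3):
        step, best, stale = 0, float("inf"), 0
        for _ in range(nb_epoch):
            for x, y in dataset:
                self._train_step(x, y)
                step += 1
                if val_dataset is None or step % eval_interval != 0:
                    continue
                loss = self._val_loss(val_dataset)
                self.val_losses.append(loss)
                if loss < best:
                    best, stale = loss, 0
                else:
                    stale += 1
                if stale >= patience:  # validation loss stopped improving
                    return

model = ToyModel()
data = [(1.0, 2.0), (2.0, 4.0)]
model.fit(data, nb_epoch=50, val_dataset=data, eval_interval=2)
```

The key design point is that evaluation is gated on `step % eval_interval`, so validation cost is amortized over many training steps rather than paid on every step.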

Issue Analytics

  • State: closed
  • Created 4 years ago
  • Reactions: 1
  • Comments: 9 (9 by maintainers)

Top GitHub Comments

1 reaction
peastman commented, Jul 1, 2019

Agreed, this would be very useful. We should give some thought to how this should work. For fit(), it could work to just provide another dataset to use for validation. For fit_generator() that isn’t necessarily possible. After all, one of the purposes of that method is to support models that take multiple inputs and therefore require more than just the X array from a dataset.

You also mentioned the possibility of tracking other metrics, which would also be useful. So perhaps we can have a unified mechanism that supports all of those things.

0 reactions
vsomnath commented, Jul 13, 2019

I borrowed the BaseLogger term from Keras callbacks, but it was motivated by your statement that callbacks are invoked after every step:

A callback can be any callable object. It gets invoked after each step, with the KerasModel being passed as the only argument.

Using the same idea as Keras, this callback would compute the loss at every step and display it every so many iterations. It would be used by default in every model, and it could also compute metrics on the training set if needed.

In the case of Keras (https://github.com/keras-team/keras/blob/master/keras/callbacks.py), the EarlyStopping callback keeps track of the best weights if a toggle is turned on. So having one class should be sufficient.
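To make the callback idea concrete, here is a minimal sketch of a callable object that is invoked after each step, logs validation loss every `interval` steps, and remembers the best weights. `ValidationCallback`, `ToyModel`, and the `model.evaluate` method are illustrative assumptions for this sketch, not DeepChem's or Keras's actual API:

```python
import copy

# Illustrative sketch of the "callback as a callable object" idea: invoked
# after each training step, it logs validation loss every `interval` steps
# and keeps the best weights. ValidationCallback and ToyModel are
# hypothetical stand-ins, not real DeepChem or Keras classes.

class ValidationCallback:
    def __init__(self, val_dataset, interval=10, save_best=True):
        self.val_dataset = val_dataset
        self.interval = interval
        self.save_best = save_best
        self.step = 0
        self.losses = []
        self.best_loss = float("inf")
        self.best_weights = None

    def __call__(self, model):
        # the model is passed as the only argument after every step
        self.step += 1
        if self.step % self.interval != 0:
            return
        loss = model.evaluate(self.val_dataset)  # assumed method on the model
        self.losses.append(loss)
        if self.save_best and loss < self.best_loss:
            self.best_loss = loss
            self.best_weights = copy.deepcopy(model.weights)

class ToyModel:
    def __init__(self):
        self.weights = [1.0]

    def evaluate(self, dataset):
        # mean squared error of a constant predictor
        return sum((self.weights[0] - y) ** 2 for y in dataset) / len(dataset)

callback = ValidationCallback(val_dataset=[2.0], interval=2)
model = ToyModel()
for _ in range(4):
    model.weights[0] += 0.5  # pretend training step
    callback(model)          # callback invoked after each step
```

Because the callback holds all of its own state (step counter, loss history, best weights), one such class can cover periodic logging, metric tracking, and early stopping without changing the `fit` signature.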


Top Results From Across the Web

  • Training & evaluation with the built-in methods - Keras
    This guide covers training, evaluation, and prediction (inference) models when using built-in APIs for training & validation ...
  • How to Use Metrics for Deep Learning with Keras in Python
    The Keras library provides a way to calculate and report on a suite of standard metrics when training deep learning models.
  • Keras Loss Functions: Everything You Need to Know
    It is usually a good idea to monitor the loss function on the training and validation set as the model is training.
  • [Feature Request] Logging of validation metrics when using ...
    fit. Allowing the user to pass another (optional) parameter to the CSVLogger constructor called, say, 'missing_value_string' which is then used ...
  • How to return history of validation loss in Keras - Stack Overflow
    The fit method returns a History callback, which has a history attribute containing the lists of successive losses and other metrics.
