state.best_metric does not update in EarlyStoppingCallback
See original GitHub issue

I print `state.best_metric`, but it is always `None`. I wonder whether `state.best_metric` should be updated when the improvement condition is satisfied, like the following:
```python
if state.best_metric is None or (
    operator(metric_value, state.best_metric)
    and abs(metric_value - state.best_metric) > self.early_stopping_threshold
):
    self.early_stopping_patience_counter = 0
    state.best_metric = metric_value
```
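To illustrate the behavior the snippet proposes, here is a minimal, self-contained sketch of the patience logic with the suggested `best_metric` update included. The `EarlyStopper` class and its parameter names are hypothetical simplifications for illustration; the real Hugging Face callback reads the metric from `TrainerState` and `TrainingArguments` instead.

```python
import operator as op

class EarlyStopper:
    """Hypothetical stand-in for the patience logic discussed in the issue.
    Not the actual Hugging Face EarlyStoppingCallback API."""

    def __init__(self, patience=2, threshold=0.0, greater_is_better=True):
        self.patience = patience
        self.threshold = threshold
        # Pick the comparison the same way the snippet's `operator` is chosen.
        self.compare = op.gt if greater_is_better else op.lt
        self.best_metric = None
        self.counter = 0

    def check(self, metric_value):
        """Record one evaluation; return True when training should stop."""
        if self.best_metric is None or (
            self.compare(metric_value, self.best_metric)
            and abs(metric_value - self.best_metric) > self.threshold
        ):
            self.counter = 0
            self.best_metric = metric_value  # the update the issue proposes
        else:
            self.counter += 1
        return self.counter >= self.patience
```

With `patience=2` and accuracy-style metrics (higher is better), two consecutive non-improving evaluations trigger the stop, and `best_metric` tracks the best value seen so far instead of staying `None`.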
Issue Analytics
- Created: a year ago
- Comments: 9 (2 by maintainers)
Top Results From Across the Web

- Callbacks - Hugging Face: Callbacks are objects that can customize the behavior of the training loop in the PyTorch Trainer (this feature is not yet implemented in...
- Callbacks — Catalyst 20.12 documentation: minimize_metric – indicator for selecting best metric, if true then best metric will be the metric with the lowest value, otherwise with the...
- Keras early stopping callback error, val_loss metric not available: The error message appears to relate to the early stopping callback, but the callback looks OK. Also the error states that the val_loss...
- Callbacks — pytorch-accelerated 0.1.3 documentation: This is done to support the pattern of updating the trainer's state in a method ... A callback which stops training early if...
- Tracking callbacks - fastai: EarlyStoppingCallback (monitor='valid_loss', comp=None, min_delta=0.0, ... patience, int, 1, number of epochs to wait when training has not improved model.
Cool, I’ll be working on one!
It does not work for me.