
Optuna prunes too aggressively when the objective is jittery (early-stopping patience needed)

See original GitHub issue

Optuna pruners should have a parameter early_stopping_patience (or checks_patience), defaulting to 1. If the objective hasn't improved over the last early_stopping_patience checks, then pruning (early stopping) occurs.

Motivation

My objective function is jittery, so Optuna prunes very aggressively: trials are killed whenever the objective rises slightly due to jitter.

This makes Optuna worse than hand tuning, because it favors only the optimization paths that decrease monotonically without jitter, and those aren't always the ones that converge to the lowest optimum. In some problem settings this makes it unusable.

Description

What would be better is an optional parameter specifying how many checks must pass without improvement before early-stopping pruning happens.


More specifically, the objective at a particular check for the purposes of pruning, reporting, etc. should be the min (or max) objective observed over the last early_stopping_patience checks.
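The proposed smoothing can be sketched in a few lines of Python. This is an illustration of the idea only, not Optuna's API; the function name and signature are made up for the example:

```python
from collections import deque

def smoothed_values(values, patience):
    """For each check, report the best (min) objective seen over the last
    `patience` checks, as the issue proposes for a minimization objective.
    A pruner comparing these smoothed values would not fire on a single
    jittery upward spike."""
    window = deque(maxlen=patience)
    out = []
    for v in values:
        window.append(v)
        out.append(min(window))
    return out

# A jittery but converging curve: the raw value spikes up at the 4th check,
# but the smoothed sequence keeps decreasing.
raw = [1.0, 0.8, 0.6, 0.9, 0.5]
print(smoothed_values(raw, patience=2))
```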

Alternatives (optional)

I have tried interval_steps=10, but it doesn't help: if the tenth step happens to be the one with jitter, the trial still gets pruned.

Additional context (optional)

PyTorch early stopping (for example, the EarlyStopping callback in PyTorch Lightning) has a patience parameter similar to the one I propose.

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Reactions: 7
  • Comments: 41 (27 by maintainers)

Top GitHub Comments

1 reaction
nzw0301 commented, May 26, 2021

@turian PatiencePruner is available on the master branch 😃

1 reaction
turian commented, Apr 26, 2021

@nzw0301 This is looking much more like I would expect! Looks very nice

Read more comments on GitHub >
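As the maintainers note above, this request landed in Optuna as PatientPruner. The gist of the patience behavior can be sketched as a standalone stand-in; the class and method names below are illustrative, not Optuna's real API:

```python
class PatienceGate:
    """Illustrative stand-in (not Optuna's API): suppress any prune signal
    until `patience` consecutive checks pass without the best objective
    improving, mirroring the patience behavior discussed in the thread
    (minimization assumed)."""

    def __init__(self, patience):
        self.patience = patience
        self.best = float("inf")
        self.stale = 0  # consecutive checks without improvement

    def allow_prune(self, value):
        if value < self.best:
            self.best = value
            self.stale = 0
        else:
            self.stale += 1
        # Only once `patience` checks in a row fail to improve may the
        # wrapped pruner's decision take effect.
        return self.stale >= self.patience

# With patience=3, isolated jittery spikes reset or fall short of the
# threshold; only a sustained plateau opens the gate to pruning.
gate = PatienceGate(patience=3)
decisions = [gate.allow_prune(v) for v in [1.0, 0.9, 1.1, 0.8, 1.2, 1.3, 1.4]]
print(decisions)
```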

Top Results From Across the Web

optuna.pruners.PatientPruner - Read the Docs
If it is None , this pruner is equivalent to early-stopping taken the ... patience (int) – Pruning is disabled until the objective...
Read more >
Optuna pruning for validation loss - Stack Overflow
Short answer: Yes. Hi, I'm one of the authors of PatientPruner in Optuna. If we perform vanilla early-stopping, wrapped_pruner=None works as ...
Read more >
projects.json - DagsHub
This implementation opts online mode of semi - hard triplet mining. ... Loses Patience: Fast and Robust Inference with Early Exit" ...
Read more >
The 2020 Joint Conference on AI Music Creativity - DiVA Portal
But it is too soon to conclude at this stage whether it leads to premature convergence. More complex musical tasks may require the...
Read more >
arXiv:1907.10902v1 [cs.LG] 25 Jul 2019
Optuna :ANext-generation Hyperparameter Optimization Framework ... asynchronously execute aggressive early stopping based on pro-.
Read more >
