
[Feature Request] Resume study and avoid OOM when optimizing with Optuna plugin

See original GitHub issue

🚀 Feature Request

I would like to be able to:

  • persist my study in order to resume my hyperparameter search from where I left off
  • set the gc_after_trial parameter of Optuna’s study.optimize()

Motivation

Is your feature request related to a problem? Please describe.
I’m always frustrated when my code crashes after 60 trials (out of 100); I suspect an OOM error. Being able to prevent the script from crashing in the first place with gc.collect() would be great, but even just being able to resume my search from where it stopped would be a game changer.

Pitch

Describe the solution you’d like
I would like to be able to set gc_after_trial to True and to provide a path where the study is persisted after each trial, directly in my Optuna sweeper Hydra config.

Describe alternatives you’ve considered
I read Optuna’s documentation, but I’m not sure how to make their examples work with Hydra.
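For reference, below is a minimal sketch of the pattern the Optuna FAQ describes, using plain Optuna without Hydra; the study name, SQLite path, and toy objective are placeholders, not anything prescribed by the plugin.

```python
import optuna


def objective(trial: optuna.Trial) -> float:
    # Placeholder for the real training/evaluation run.
    x = trial.suggest_float("x", -10.0, 10.0)
    return (x - 2.0) ** 2


# Persisting the study in an RDB (here SQLite) lets an interrupted search be
# resumed: rerunning the script with the same study_name and storage URL
# continues from the trials already recorded.
study = optuna.create_study(
    study_name="my_search",            # placeholder name
    storage="sqlite:///my_search.db",  # placeholder path
    load_if_exists=True,
)

# gc_after_trial=True makes Optuna call gc.collect() after every trial,
# which mitigates OOM caused by memory accumulating across trials.
study.optimize(objective, n_trials=100, gc_after_trial=True)
```

The open question in this issue is how to get the Optuna sweeper plugin to pass these two options through from the Hydra config.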

Are you willing to open a pull request? (See CONTRIBUTING)
I’m not comfortable enough with the Optuna and Hydra libraries to prepare a pull request.

Issue Analytics

  • State: open
  • Created: 2 years ago
  • Reactions: 4
  • Comments: 9 (4 by maintainers)

Top GitHub Comments

2 reactions
omry commented, Sep 14, 2022

Hydra has callbacks, which can probably be used for this.
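A minimal sketch of what such a callback could look like, assuming Hydra ≥ 1.1’s experimental callback API; the class and module names are hypothetical, not part of the plugin.

```python
import gc
from typing import Any

from hydra.experimental.callback import Callback
from omegaconf import DictConfig


class GarbageCollectCallback(Callback):
    """Hypothetical callback that forces a full garbage collection after each job.

    In a multirun sweep every Optuna trial is one Hydra job, so this roughly
    emulates Optuna's gc_after_trial=True from the Hydra side.
    """

    def on_job_end(self, config: DictConfig, job_return: Any, **kwargs: Any) -> None:
        # Free cyclic garbage so memory does not accumulate across trials.
        gc.collect()
```

It would then be enabled from the primary config by adding an entry under hydra.callbacks whose _target_ points at this class.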

2 reactions
omry commented, Jun 17, 2021

Hi @dianemarquette, I am open to supporting it, although resuming a study might be harder than it seems (in general, resume is not something supported by any Hydra sweeper right now).

In any case, we do not have the cycles for it, which means this will only happen if someone from the community wants to work toward it.

Supporting gc_after_trial seems like it should be straightforward, though.

Read more comments on GitHub

Top Results From Across the Web

  • FAQ — Optuna 3.0.4 documentation: How do I avoid running out of memory (OOM) when optimizing studies? How can I output a log only when the best value...
  • Optuna - A hyperparameter optimization framework: Optuna is an automatic hyperparameter optimization software framework, particularly designed for machine learning.
  • A generic battery-cycling optimization framework with learned ...: To shorten the computational time in simulation or to reduce cost and enable robots in experiments, we require powerful optimization ...
  • Hippo: Sharing Computations in Hyper-Parameter Optimization: An optimization job, referred to as a study, involves numerous trials of training a model using different training knobs, and therefore is ...
  • Screening for Early-Stage Alzheimer's Disease Using ...: Request PDF | Screening for Early-Stage Alzheimer's Disease Using Optimized Feature Sets and Machine Learning | Background: Detecting ...
