
Epochs argument in Hyperband search method

See original GitHub issue

Is the epochs argument in the search() method redundant for Hyperband?

From what I understand, the algorithm should “automatically” allocate the number of epochs during the tuning process according to max_epochs.

From the documentation:

from kerastuner.applications import HyperResNet
from kerastuner.tuners import Hyperband

hypermodel = HyperResNet(input_shape=(128, 128, 3), num_classes=10)

tuner = Hyperband(
    hypermodel,
    objective='val_accuracy',
    max_trials=40,  # I think max_trials is not a valid argument for Hyperband
    directory='my_dir',
    project_name='helloworld')

tuner.search(x, y,
             epochs=20,  # what is the expected behavior here when max_epochs != epochs?
             validation_data=(val_x, val_y))
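
For context, here is a minimal sketch of how the same setup looks against the renamed keras_tuner package. The toy build_model hypermodel, the random placeholder data, and the project name are my assumptions, not part of the original issue. As I understand it, Hyperband derives each trial’s epoch budget from max_epochs itself, so an explicit epochs= in search() is overridden by that schedule:

import numpy as np
import keras_tuner as kt
from tensorflow import keras

def build_model(hp):
    # Toy hypermodel: only the hidden-layer width is tuned.
    model = keras.Sequential([
        keras.layers.Flatten(input_shape=(128, 128, 3)),
        keras.layers.Dense(hp.Int('units', min_value=32, max_value=128, step=32),
                           activation='relu'),
        keras.layers.Dense(10, activation='softmax'),
    ])
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    return model

tuner = kt.Hyperband(
    build_model,
    objective='val_accuracy',
    max_epochs=20,            # per-trial budget; Hyperband has no max_trials argument
    factor=3,                 # downsampling rate between successive-halving rounds
    hyperband_iterations=1,   # how many times to repeat the full Hyperband schedule
    directory='my_dir',
    project_name='helloworld_sketch')

# Tiny random placeholder data so the sketch runs end to end.
x = np.random.rand(16, 128, 128, 3).astype('float32')
y = np.random.randint(0, 10, size=16)
val_x = np.random.rand(8, 128, 128, 3).astype('float32')
val_y = np.random.randint(0, 10, size=8)

# No epochs= here: Hyperband schedules per-trial epochs up to max_epochs, and an
# early-stopping callback frees budget from configurations that converge early.
tuner.search(x, y,
             validation_data=(val_x, val_y),
             callbacks=[keras.callbacks.EarlyStopping(monitor='val_loss', patience=3)])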

Issue Analytics

  • State: open
  • Created 4 years ago
  • Reactions: 3
  • Comments: 6 (1 by maintainers)

Top GitHub Comments

1 reaction
vb690 commented, Oct 4, 2020

Hi @tkmamidi,

taking your points in order:

  1. I briefly looked at the current version of the Keras Tuner documentation, and max_trials is not a valid argument for Hyperband anymore (rightfully so, in my opinion).

  2. If you think about it and look at the details of the algorithm, this makes perfect sense: Hyperband computes the maximum number of trials from the resources you allocate for training, which in this case are given by max_epochs and hyperband_iterations.

  3. This means you don’t have direct control over the number of trials the algorithm will run; rather, you can compute that value from max_epochs and hyperband_iterations (again, see the resource linked above, and the sketch after this comment).

  4. As they suggest in the tutorial, you want to set max_epochs to the number of epochs you expect your model to need to converge, while also passing a callback for early termination of training (EarlyStopping, for instance). Since Hyperband is based on sampling random configurations of hyper-parameters and iteratively “breeding” the most promising ones, you don’t want to waste the computational budget allocated at any point of the optimization process (i.e. the trials) on configurations that are not optimal or that reached convergence early on.

  5. I have the feeling that executions_per_trial is not a valid argument either (I can’t even find it in the Tuner class).

  6. I believe you want to increase your value of max_epochs; at the moment you are sampling from an extremely small pool of random configurations.

Pinging @omalleyt12 to double-check what I’ve written.
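
To make points 2 and 3 concrete: the bracket structure Hyperband runs is determined by max_epochs and the downsampling factor alone, with hyperband_iterations simply repeating the whole schedule. The sketch below follows the Li et al. (2018) formulation cited in the web results further down, not keras-tuner’s exact internals (which may round slightly differently); the function name and defaults are illustrative only.

import math

def hyperband_schedule(max_epochs, factor=3):
    # For each bracket, list (num_configs, epochs) for every successive-halving round.
    s_max = int(math.log(max_epochs, factor))
    budget = (s_max + 1) * max_epochs
    brackets = []
    for s in range(s_max, -1, -1):
        n = math.ceil(budget / max_epochs * factor**s / (s + 1))
        rounds = [(n // factor**i, math.ceil(max_epochs / factor**(s - i)))
                  for i in range(s + 1)]
        brackets.append(rounds)
    return brackets

# With max_epochs=20 and factor=3 this prints three brackets; the first is
# [(9, 3), (3, 7), (1, 20)]: 9 configurations trained for up to 3 epochs,
# the best 3 continued up to 7 epochs, and the single best up to the full 20.
for bracket in hyperband_schedule(max_epochs=20, factor=3):
    print(bracket)

Summing the first entry of each bracket (9 + 5 + 3 = 17 here) gives the number of distinct configurations sampled per hyperband_iteration, which is why point 6 above suggests raising max_epochs if you want a larger pool of sampled configurations.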

0 reactions
brethvoice commented, Jul 26, 2022

I have the feeling that executions_per_trial is not a valid argument either (I can’t even find it in the Tuner class).

I am currently using HyperBand with two executions per trial, so it is a valid argument, @vb690. Its purpose is to allow averaging out some of the uncontrollable pseudo-randomness from, say, NVIDIA’s GPU drivers, which have non-deterministic behavior.
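
For reference, a minimal sketch of passing it through, reusing the hypothetical build_model from the sketch under the question above; executions_per_trial is forwarded to the base Tuner, and the trial’s objective is averaged over its executions:

import keras_tuner as kt

# build_model is the hypothetical hypermodel function from the earlier sketch.
tuner = kt.Hyperband(
    build_model,
    objective='val_accuracy',
    max_epochs=20,
    executions_per_trial=2,   # train each sampled configuration twice and average the objective
    directory='my_dir',
    project_name='helloworld_two_executions')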

Read more comments on GitHub >

Top Results From Across the Web

How to tune the number of epochs and batch_size in Keras ...
This can be done by subclassing the Tuner class you are using and overriding run_trial . (Note that Hyperband sets the epochs to...
Read more >
Hyperband Tuner - Keras
"Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization." Journal of Machine Learning Research 18 (2018): 1-52. Arguments.
Read more >
Keras Tuner: Lessons Learned From Tuning Hyperparameters ...
Keras tuner provides an elegant way to define a model and a search space for the parameters that the tuner will use –...
Read more >
Keras Tuner Hyperband - how to set max trials and max ...
another argument for the num of epochs can be passed. What is the relation between 'epochs' for 'search()' and 'max_epochs' for 'Hyperband()'?
Read more >
The Hyperband Algorithm. - Louis Henri Franc
They've become my favorite way to go for hyper parameter search ... it ... neural training network for some epoch ruled by early...
Read more >
