
Inactive hyperparameters having an effect on BayesianOptimization

See original GitHub issue

Correct me if I’m wrong, but (and I see this is a recurring issue) since the tuner seems to tune hyperparameters regardless of whether they’re active or not, wouldn’t this affect the way Bayesian optimization works? The probabilistic model is built over the inactive hyperparameters as well, so they end up influencing how the tuner relates hyperparameter values to the final score (sorry for my poor wording).
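
To make the concern concrete, here is a minimal sketch (using the keras_tuner package; the hyperparameter names mirror the code below) showing that every hp.Int call adds a dimension to the shared search space, whether or not a given trial's build ever reads it:

import keras_tuner as kt

hp = kt.HyperParameters()
hp.Int('num_layers', min_value=1, max_value=4, step=1)
# Suppose one earlier trial built a stacked model and another a single-layer one:
hp.Int('hp_units_0', min_value=8, max_value=128, step=8)
hp.Int('hp_units_single', min_value=8, max_value=128, step=8)

# All of these remain dimensions of the space the oracle samples from and models,
# even on trials where only one branch of the build function runs.
print([p.name for p in hp.space])
# ['num_layers', 'hp_units_0', 'hp_units_single']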

In any case, my current code looks like this:

def model_builder(hp):
    tf.keras.backend.clear_session()

    # Search space: window length, optimizer choice, and network depth.
    hp_timesteps = hp.Int('timesteps', min_value=4, max_value=30, step=1)
    hp_optimizer = hp.Choice('optimizer', values=['adam', 'rmsprop', 'adamax', 'nadam'])
    hp_layers = hp.Int('num_layers', min_value=1, max_value=4, step=1)

    # create_dataset, trainS, testS and loss are defined elsewhere in the script;
    # the windows are rebuilt here because their shape depends on hp_timesteps.
    x_train, y_train = create_dataset(trainS, trainS.Aggregate, hp_timesteps)
    x_test, y_test = create_dataset(testS, testS.Aggregate, hp_timesteps)

    if hp_layers == 1:
        # Single LSTM layer feeding a one-unit output head.
        model = tf.keras.Sequential()
        model.add(tf.keras.layers.LSTM(hp.Int('hp_units_single', min_value=8, max_value=128, step=8)))
        model.add(tf.keras.layers.Dense(units=1))
        model.compile(loss=loss, optimizer=hp_optimizer)
        return model

    if hp_layers > 1:
        # Stacked LSTMs: every layer except the last returns sequences.
        model = tf.keras.Sequential()
        for i in range(hp_layers - 1):
            model.add(tf.keras.layers.LSTM(hp.Int(f'hp_units_{i}', min_value=8, max_value=128, step=8), return_sequences=True))
        model.add(tf.keras.layers.LSTM(hp.Int('hp_units_final', min_value=8, max_value=128, step=8)))
        model.add(tf.keras.layers.Dense(units=1))
        model.compile(loss=loss, optimizer=hp_optimizer)
        return model

Is there any way to restrict the tuner from attempting to pass values to inactive hyperparameters?
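
For what it's worth, keras-tuner does ship a conditional_scope context manager meant for this situation: hyperparameters registered inside the scope are treated as active only when the parent hyperparameter takes one of the listed values. A hedged sketch of how the build function above could use it (dataset handling omitted; 'mse' is a placeholder for the loss defined elsewhere in the script):

import tensorflow as tf

def model_builder(hp):
    hp_layers = hp.Int('num_layers', min_value=1, max_value=4, step=1)
    hp_optimizer = hp.Choice('optimizer', values=['adam', 'rmsprop', 'adamax', 'nadam'])

    model = tf.keras.Sequential()
    # Active only when num_layers == 1:
    with hp.conditional_scope('num_layers', [1]):
        if hp_layers == 1:
            model.add(tf.keras.layers.LSTM(
                hp.Int('hp_units_single', min_value=8, max_value=128, step=8)))
    # Active only when num_layers is 2, 3 or 4:
    with hp.conditional_scope('num_layers', [2, 3, 4]):
        if hp_layers > 1:
            for i in range(hp_layers - 1):
                model.add(tf.keras.layers.LSTM(
                    hp.Int(f'hp_units_{i}', min_value=8, max_value=128, step=8),
                    return_sequences=True))
            model.add(tf.keras.layers.LSTM(
                hp.Int('hp_units_final', min_value=8, max_value=128, step=8)))
    model.add(tf.keras.layers.Dense(units=1))
    model.compile(loss='mse', optimizer=hp_optimizer)  # placeholder loss
    return model

Note that the if checks inside each scope are still needed: when a scope is inactive, the hp.Int calls inside it return their default values rather than being skipped.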

Issue Analytics

  • State: open
  • Created 3 years ago
  • Comments: 7 (3 by maintainers)

Top GitHub Comments

1 reaction
ben-arnao commented, Jun 22, 2020

@KareemAlSaudi-RUG There’s a pretty recent paper that suggests nothing really significantly beats a good random-search variant yet.

0 reactions
Astlaan commented, Jun 26, 2020

> @KareemAlSaudi-RUG There’s a pretty recent paper that suggests nothing really significantly beats a good random-search variant yet.

Interesting. Any recommendation, then, between RandomSearch and Hyperband? I suppose that in the paper you linked, “Random search with Early Stopping” refers to Hyperband?

Thanks
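
For readers weighing the two options discussed above: both tuners take the same build function, and Hyperband is essentially random sampling plus successive halving (many short trainings, with only the promising configurations trained for longer). A minimal sketch, with the objective and budgets purely illustrative:

import keras_tuner as kt

# Plain random search: a fixed number of independent, fully trained trials.
random_tuner = kt.RandomSearch(
    model_builder, objective='val_loss', max_trials=50)

# Hyperband: random sampling plus early stopping of unpromising configurations.
hyperband_tuner = kt.Hyperband(
    model_builder, objective='val_loss', max_epochs=30, factor=3)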

Read more comments on GitHub.

Top Results From Across the Web

  • Bayesian Optimization for Conditional ... - ResearchGate: “The comparison of an active condition with an inactive condition is defined as being false, returning a zero kernel value (hence no shared...”
  • How Hyperparameter Tuning Works - Amazon SageMaker: “Amazon SageMaker hyperparameter tuning uses either a Bayesian or a random search strategy to find the best values for hyperparameters.”
  • Overview of hyperparameter tuning | Vertex AI - Google Cloud: “If the hyperparameter is shared, the tuning job uses what it has learned from LINEAR_REGRESSION and DNN trials to tune the learning rate...”
  • Bayesian Multi-objective Hyperparameter Optimization for ...: “In Parsa et al. (2019b), we used a single objective hyperparameter Bayesian optimization to optimize performance of spiking neuromorphic systems...”
  • A Conceptual Explanation of Bayesian Hyperparameter ...: “If you said below 200 estimators, then you already have the idea of Bayesian optimization! We want to focus on the most promising...”
