
Set model optimizer as a parameter during tuning

See original GitHub issue

Hey,

I’m trying to use Ax with PyTorch and I’m following your official tutorial. I need to run my algorithm with different optimizers (Adam, SGD, RMSprop) while also tuning each optimizer’s learning rate and momentum. Something like this:

best_parameters, values, experiment, model = optimize(
    parameters=[
        {"name": "lr", "type": "range", "bounds": [1e-6, 0.4], "log_scale": True},
        {"name": "momentum", "type": "range", "bounds": [0.0, 1.0]},
        {"name": "epoch", "type": "range", "bounds": [15, 40]},
        {"name": "optimizer", "type": "choice", "values": [optim.SGD(lr, momentum), optim.Adam(lr, momentum)]},
    ],
    evaluation_function=train_evaluate,
    objective_name="accuracy",
)

What would be the best way to go about doing this?

Thanks!

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 6 (3 by maintainers)

Top GitHub Comments

2 reactions
Balandat commented on Feb 28, 2022

The issue is

"values": [optim.SGD(lr,momentum), optim.Adam(lr,momentum)]}]

The parameter types must be basic Ax types and cannot be arbitrary objects such as torch optimizer instances. What you can do instead is declare a string-valued choice parameter with the choices ["SGD", "Adam"] in the search space definition, and then, in your evaluation code, construct the actual optimizer object from the string value you receive for the “optimizer” parameter.
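The suggestion above can be sketched roughly as follows. This is a hedged sketch, not code from the issue: the `make_optimizer` helper, the `value_type` hint, and the reuse of the tuned momentum as Adam’s `beta1` are all illustrative assumptions layered on top of the tutorial’s `train_evaluate` setup.

```python
# Sketch of the string-choice approach (assumes PyTorch and Ax are installed).
SUPPORTED_OPTIMIZERS = ("SGD", "Adam", "RMSprop")

# Ax search space: "optimizer" is now a plain string choice, not an object.
parameters = [
    {"name": "lr", "type": "range", "bounds": [1e-6, 0.4], "log_scale": True},
    {"name": "momentum", "type": "range", "bounds": [0.0, 1.0]},
    {"name": "epoch", "type": "range", "bounds": [15, 40]},
    {"name": "optimizer", "type": "choice",
     "values": list(SUPPORTED_OPTIMIZERS), "value_type": "str"},
]

def make_optimizer(name, model_params, lr, momentum):
    """Build a torch optimizer from the string Ax hands back."""
    if name not in SUPPORTED_OPTIMIZERS:
        raise ValueError(f"unknown optimizer {name!r}")
    import torch.optim as optim  # imported lazily, only when actually building
    if name == "Adam":
        # Adam has no 'momentum' argument; reusing the tuned value as beta1
        # is one possible (assumed) mapping.
        return optim.Adam(model_params, lr=lr, betas=(momentum, 0.999))
    cls = getattr(optim, name)  # SGD or RMSprop, both accept momentum=
    return cls(model_params, lr=lr, momentum=momentum)
```

Inside `train_evaluate`, you would read `parameterization["optimizer"]` and call `make_optimizer(...)` before the training loop; the rest of the tutorial code stays unchanged.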

1 reaction
adwaykanhere commented on Feb 28, 2022

Thank you very much! I’m closing this issue now.

