Set model optimizer as a parameter during tuning
Hey,
I’m trying to use Ax with PyTorch and I’m following your official tutorial. I will need to run my algorithm with different optimizers (Adam, SGD, RMSprop) while also tuning each optimizer’s learning rate and momentum. Something like this:
    best_parameters, values, experiment, model = optimize(
        parameters=[
            {"name": "lr", "type": "range", "bounds": [1e-6, 0.4], "log_scale": True},
            {"name": "momentum", "type": "range", "bounds": [0.0, 1.0]},
            {"name": "epoch", "type": "range", "bounds": [15, 40]},
            {"name": "optimizer", "type": "choice", "values": [optim.SGD(lr, momentum), optim.Adam(lr, momentum)]},
        ],
        evaluation_function=train_evaluate,
        objective_name="accuracy",
    )
What would be the best way to go about doing this?
Thanks!
Issue Analytics
- State: Closed
- Created 2 years ago
- Comments: 6 (3 by maintainers)
Top GitHub Comments
The issue is that the parameter types must be basic Ax types and cannot be arbitrary objects such as torch optimizer instances. So what you can do instead is choose a STRING parameter type and allow the choices ["SGD", "Adam"] in the search space definition, and in your code construct an optimizer object based on the string value you receive for the “optimizer” parameter.

Thank you very much! I’m closing this issue now.
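A minimal sketch of the string-parameter approach suggested above, assuming the managed-loop optimize API from the official Ax/PyTorch tutorial; build_model, train, and evaluate are hypothetical placeholders for your own training pipeline:

    import torch.optim as optim
    from ax.service.managed_loop import optimize

    def make_optimizer(name, params, lr, momentum):
        # Map the string choice back onto a concrete torch optimizer.
        if name == "SGD":
            return optim.SGD(params, lr=lr, momentum=momentum)
        if name == "RMSprop":
            return optim.RMSprop(params, lr=lr, momentum=momentum)
        return optim.Adam(params, lr=lr)  # Adam takes betas, not a momentum argument

    def train_evaluate(parameterization):
        net = build_model()  # hypothetical: construct your network
        optimizer = make_optimizer(
            parameterization["optimizer"],
            net.parameters(),
            parameterization["lr"],
            parameterization["momentum"],
        )
        train(net, optimizer, epochs=parameterization["epoch"])  # hypothetical training loop
        return {"accuracy": evaluate(net)}  # hypothetical evaluation returning a float

    best_parameters, values, experiment, model = optimize(
        parameters=[
            {"name": "lr", "type": "range", "bounds": [1e-6, 0.4], "log_scale": True},
            {"name": "momentum", "type": "range", "bounds": [0.0, 1.0]},
            {"name": "epoch", "type": "range", "bounds": [15, 40]},
            {"name": "optimizer", "type": "choice", "values": ["SGD", "Adam", "RMSprop"]},
        ],
        evaluation_function=train_evaluate,
        objective_name="accuracy",
    )

One caveat with this layout: Adam ignores the sampled momentum value (it uses beta coefficients instead), so the momentum range only influences the SGD and RMSprop trials.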