Hyperparameter optimization does not return learning rate
Presently, the hyperparameter optimization returns the best parameters as, e.g.:

{'gradient_clip_val': 1.028043566346387,
 'hidden_size': 176,
 'dropout': 0.22095396475678628,
 'hidden_continuous_size': 36,
 'attention_head_size': 1}

but the learning rate is missing, even when use_learning_rate_finder=True. I expect this could be fixed easily just by adding it to the Optuna parameters to track.
Issue Analytics
- Created: 3 years ago
- Comments: 10 (3 by maintainers)
Annoying mishap. Pushed a fix.

Yes. It’s unreleased.