Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

[Minor Bug] Unnecessary and mildly confusing print statement during hyperparameter optimization

See original GitHub issue

Starting a hyperparameter run with train.py prints out something like this:

========== LunarLanderContinuous-v2 ==========
Seed: 3139539977
OrderedDict([('batch_size', 64),
             ('ent_coef', 0.01),
             ('gae_lambda', 0.98),
             ('gamma', 0.999),
             ('n_envs', 16),
             ('n_epochs', 4),
             ('n_steps', 1024),
             ('n_timesteps', 1000000.0),
             ('policy', 'MlpPolicy')])
Using 4 environments
Overwriting n_timesteps with n=4000000

The problem is that the printed OrderedDict contains the default parameters for the environment from the relevant .yml file. It has no impact on hyperparameter tuning and isn’t even used during it, yet it can make it look like you’re doing something wrong (for example, by default you’re tuning hyperparameters that aren’t included in that list), and tracking down its origin to confirm you aren’t actually misconfiguring anything takes a surprisingly long time.
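To illustrate how a header like the one above can arise, here is a minimal sketch. The function name `print_run_header` is illustrative, not the actual code in rl-baselines3-zoo’s train.py, and the defaults are simply the values shown in the output above rather than anything read from a real .yml file:

```python
import pprint
from collections import OrderedDict

# Hypothetical defaults, as they might look after being loaded from the
# per-environment .yml file (values copied from the output shown above).
DEFAULT_HYPERPARAMS = OrderedDict(
    [
        ("batch_size", 64),
        ("ent_coef", 0.01),
        ("gae_lambda", 0.98),
        ("gamma", 0.999),
        ("n_envs", 16),
        ("n_epochs", 4),
        ("n_steps", 1024),
        ("n_timesteps", 1000000.0),
        ("policy", "MlpPolicy"),
    ]
)


def print_run_header(env_id: str, seed: int, hyperparams: OrderedDict) -> None:
    """Print a run header like the one in the issue: the defaults are
    dumped unconditionally, even when a tuning run will override them."""
    print(f"========== {env_id} ==========")
    print(f"Seed: {seed}")
    pprint.pprint(hyperparams)  # this is the confusing OrderedDict dump


print_run_header("LunarLanderContinuous-v2", 3139539977, DEFAULT_HYPERPARAMS)
```

Because the dump is unconditional, a tuning run shows the file defaults even though the sampler is about to replace them, which is exactly the confusion the issue describes.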

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 5 (5 by maintainers)

Top GitHub Comments

1 reaction
jkterry1 commented, May 20, 2021

(sorry for the delayed reply)

I meant a message like “Default hyperparameters for environment (ones being tuned will be overridden)”
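The suggested message could be wired in with a sketch like the following. This is only an assumption about the shape of the fix; `print_default_hyperparams` and its `tuning` flag are illustrative names, not the actual patch that landed in rl-baselines3-zoo:

```python
import pprint
from collections import OrderedDict


def print_default_hyperparams(hyperparams: OrderedDict, tuning: bool) -> None:
    """Dump the defaults loaded from the .yml file, prefixed with a note
    when tuning so readers know these values may be overridden."""
    if tuning:
        # The clarifying message proposed in the comment above.
        print("Default hyperparameters for environment (ones being tuned will be overridden):")
    pprint.pprint(hyperparams)
```

With the prefix in place, the OrderedDict dump still appears, but it no longer looks like the list of hyperparameters being optimized.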

0 reactions
araffin commented, Jun 2, 2021

I meant a message like “Default hyperparameters for environment (ones being tuned will be overridden)”

sounds good 😉 (my upvote did not create a notification …)

Read more comments on GitHub

Top Results From Across the Web

Common Problems in Hyperparameter Optimization - SigOpt
SigOpt shows solutions to some of the most common problems we've seen people run into when implementing hyperparameter optimization.

A Novice's Guide to Hyperparameter Optimization at Scale
The search algorithm governs how hyperparameter space is sampled and optimized (e.g. random search). From a practical standpoint, the search ...

Machine Learning Glossary - Google Developers
A training-time optimization in which a probability is calculated for all the positive labels, using, for example, softmax, but only for a ...

Stan Modeling Language
Statements in Stan are interpreted imperatively, so their order matters. ... Stan supports print statements with one or more string or expression arguments....

Deep Learning - USTC Vision and Multimedia (VIM)
of biologically inspired machine learning often worked with small, ... deep learning have brought the latest top-5 error rate in this contest down...
