PBT update config in custom explore function
I am new to Ray and Tune, and I want to use PBT with a dynamic config. For example:
from ray import tune
from ray.tune import Trainable
from ray.tune.schedulers import PopulationBasedTraining

def explore(config):
    # here config['varA'] is None, not the value set in TrainMNIST._setup
    return config

class TrainMNIST(Trainable):
    def _setup(self, config):
        config['varA'] = something
        self.config.update(config)

pbt = PopulationBasedTraining(
    ...,
    custom_explore_fn=explore)

tune.run(
    TrainMNIST,
    name="exp",
    scheduler=pbt,
    stop={
        "test_acc": 0.99,
        "training_iteration": 100,
    },
    resources_per_trial={
        "cpu": 2,
        "gpu": 0.25,
    },
    config={
        "args": vars(args),
        "varA": None,
    })
How can I make the explore function see config['varA'] = something?
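For context: custom_explore_fn is applied after PBT's built-in config perturbations, and it operates on the trial's config as the driver knows it, i.e. the dict passed to tune.run. The Trainable runs in its own worker process with a copy of that dict, so the mutation made inside _setup never reaches the trial config that explore receives, which is why config['varA'] is still None there. Below is a minimal sketch of an explore function that copes with this; the numeric varA and its fallback value are made up for illustration:

    import random

    def explore(config):
        # config here is the dict registered with tune.run (after PBT's
        # built-in perturbations), not the copy mutated in TrainMNIST._setup.
        if config.get("varA") is None:
            config["varA"] = 1.0  # hypothetical fallback; varA was never synced back
        else:
            config["varA"] *= random.choice([0.8, 1.25])  # example perturbation
        return config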
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
I found a way to work around this by adding trial.config.update(result['config']) in on_trial_result. I use this to make the explore function adapt to each model's performance; say, for better-performing models I use smaller steps for the hyperparameters.
Still, I think synchronizing the config is essential for robustness.
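For illustration, here is a minimal sketch of that workaround, assuming the class-based scheduler API where on_trial_result(self, trial_runner, trial, result) can be overridden in a subclass; the name SyncedPBT is hypothetical, and the sketch assumes the result dict carries the Trainable's live config under the "config" key:

    from ray.tune.schedulers import PopulationBasedTraining

    class SyncedPBT(PopulationBasedTraining):
        """Hypothetical PBT subclass that syncs the reported config back
        onto the trial before the base scheduler decides what to do."""

        def on_trial_result(self, trial_runner, trial, result):
            # The Trainable reports its live config (including values set
            # in _setup) with each result; copy it back onto the trial so
            # that custom_explore_fn later sees up-to-date values.
            if "config" in result:
                trial.config.update(result["config"])
            return super().on_trial_result(trial_runner, trial, result)

The subclass is then used in place of PopulationBasedTraining, e.g. pbt = SyncedPBT(..., custom_explore_fn=explore), and passed to tune.run as before.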
Update the config when getting results.