[tune] separate fixed configuration from searchable parameters
Describe your feature request
I’ve wrapped an existing PyTorch-based project with Ray Tune PBT. It works great 👍 thank you 😃
But one thing I wish I had is the ability to supply fixed configuration somewhere other than the configuration dict that is used to search hyperparameters.
The thing is, I have many options (e.g., dataset name and paths) that are completely irrelevant to the algorithm but still necessary. As far as I know, I have to include them in my hyperparameter space as lists of length 1 in order to make them available in `_setup()`. But this looks messy, and these options reappear every time a hyperparameter perturbation occurs.
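For concreteness, the length-1 workaround looks something like this (a sketch; the dataset keys and values here are illustrative):

```python
from ray import tune

config = {
    # Fixed options smuggled into the search space as length-1 lists.
    "dataset_name": tune.grid_search(["cifar10"]),
    "dataset_path": tune.grid_search(["/data/cifar10"]),
    # Actual hyperparameters being searched.
    "lr": tune.grid_search([1e-4, 1e-3, 1e-2]),
}
```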
So it would be great if I could list my basic (fixed) configuration somewhere else, and use the hyperparameter space (dict) purely for tunable hyperparameters.
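For what it's worth, recent Ray Tune versions cover part of this with `tune.with_parameters`, which binds fixed objects to the trainable outside of the search space (a sketch, assuming a reasonably recent Ray; the names are illustrative):

```python
from ray import tune

def train(config, dataset_path=None):
    # `config` contains only tunable values; fixed inputs arrive as
    # keyword arguments bound by tune.with_parameters.
    ...

tune.run(
    tune.with_parameters(train, dataset_path="/data/cifar10"),
    config={"lr": tune.loguniform(1e-4, 1e-1)},
)
```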
Top GitHub Comments
@richardliaw would it be difficult to add support for `config` to (optionally) be an `argparse.Namespace`? Otherwise, packing args into the config dict is awkward, e.g. `self.config["args"].max_epochs`. It seems like it would be as simple as adding `if isinstance(config, Namespace): config = vars(config)`.
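If that were added, the call site could look like this (hypothetical: this shows the proposed behavior, not a current Ray API):

```python
import argparse
from ray import tune

parser = argparse.ArgumentParser()
parser.add_argument("--max_epochs", type=int, default=100)
args = parser.parse_args()

def train(config):
    ...

# Hypothetical: Tune would accept the Namespace directly and coerce it
# internally via `config = vars(config)`.
tune.run(train, config=args)
```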
I wrote my codebase around args, so right now I do `args = config_to_args(config)` in each of my functions.

Are you using argparse? One option is to simply keep everything in the Namespace object and update it on the worker:
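Something along these lines (a sketch; the flags and training body are illustrative):

```python
import argparse
from ray import tune

parser = argparse.ArgumentParser()
parser.add_argument("--dataset_path", default="/data/cifar10")  # fixed, never tuned
parser.add_argument("--lr", type=float, default=0.1)            # tunable
args = parser.parse_args()

def train(config):
    # Pull the fixed Namespace out of the config, then overwrite only
    # the attributes Tune actually sampled; the rest of the codebase
    # keeps reading everything from `args` as before.
    args = config.pop("args")
    for key, value in config.items():
        setattr(args, key, value)
    # ... existing training loop using args.dataset_path, args.lr, ...

tune.run(
    train,
    config={
        "args": args,                       # fixed configuration, passed through untouched
        "lr": tune.loguniform(1e-4, 1e-1),  # searchable hyperparameter
    },
)
```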
BTW, this should also work:
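For instance, flattening the Namespace into the config up front (a sketch, assuming the same argparse setup as above):

```python
import argparse
from ray import tune

parser = argparse.ArgumentParser()
parser.add_argument("--dataset_path", default="/data/cifar10")
args = parser.parse_args()

# Layer the search space on top of the fixed args; vars(args) is a
# plain-dict view of the Namespace.
config = dict(vars(args))
config["lr"] = tune.loguniform(1e-4, 1e-1)

def train(config):
    args = argparse.Namespace(**config)  # back to attribute access on the worker
    # ... existing training loop using args ...

tune.run(train, config=config)
```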