
[tune] Support nesting grid_search in lambdas

See original GitHub issue

System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): 16.04.4 LTS
  • Ray installed from (source or binary): binary
  • Ray version: 6.0
  • Python version: 3.6.6

Describe the problem

The documentation for generate_variants(...) states that unresolved parameters (lambda functions and grid_search values) can be nested. From ray/tune/suggest/variant_generator.py, line 33:

"""It is also possible to nest the two, e.g. have a lambda function return a grid search or vice versa, as long as there are no cyclic dependencies between unresolved values. """

However, it seems like even the following simple case fails:

import ray
from ray import tune

def dummy_fn(config, reporter):
  print(config)

def resolve_b(spec):
  # Resolve "b" to a grid search whose values depend on the
  # already-resolved value of "a".
  values = [i**spec.config.a for i in range(2, 4)]
  return tune.grid_search(values)

exp_config = {
  "dummy_exp": {
    "run": dummy_fn,
    "config": {"a": tune.grid_search([1, 2]),
               "b": resolve_b},
  },
}

ray.init()
tune.run_experiments(exp_config)

which fails with the following error message (raised from ray/tune/suggest/variant_generator.py, line 129):

ValueError: The variable `('config', 'b')` could not be unambiguously resolved to a single value. Consider simplifying your variable dependencies.

Additionally, the error handling seems off: the message incorrectly blames variable dependencies. The following example fails in the same way as the one above even though `b` has no dependencies at all:

import ray
from ray import tune

def dummy_fn(config, reporter):
  print(config)

exp_config = {
  "dummy_exp": {
    "run": dummy_fn,
    "config": {"a": tune.grid_search([1, 2]),
               # "b" ignores spec entirely, so it has no dependencies,
               # yet resolution still fails with the same error.
               "b": lambda spec: tune.grid_search([3, 4])},
  },
}

ray.init()
tune.run_experiments(exp_config)

Issue Analytics

  • State: open
  • Created: 5 years ago
  • Reactions: 1
  • Comments: 12 (10 by maintainers)

Top GitHub Comments

3 reactions
krfricke commented, Sep 16, 2020

The following code works in the latest master:

tune.run(
    training_function,
    config={
        'test1': tune.grid_search(['a', 'b']),
        'test2': tune.grid_search(['c', 'd']),
        'test3': tune.sample_from(lambda spec: {
                'a': tune.sample_from(lambda spec: spec.config['test2'] * 2),
                'b': tune.sample_from(lambda spec: spec.config['test2'] * 3),
            }[spec.config['test1']])
    })

If there are still cases we don’t consider, please open a new issue. Thanks!
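
As quoted, that snippet isn’t self-contained: it never defines training_function or imports anything. A filled-out version might look like the sketch below; the training_function body is a hypothetical stand-in that just prints the resolved config, and the search space mirrors the comment above:

import ray
from ray import tune

def training_function(config):
    # Hypothetical stand-in for the real trainable: just show what each
    # trial receives once the search space has been resolved.
    print(config)

ray.init()
tune.run(
    training_function,
    config={
        "test1": tune.grid_search(["a", "b"]),
        "test2": tune.grid_search(["c", "d"]),
        # "test3" is not a grid search itself: it is a sample_from whose
        # value is picked per trial from the already-resolved test1/test2.
        "test3": tune.sample_from(lambda spec: {
            "a": tune.sample_from(lambda spec: spec.config["test2"] * 2),
            "b": tune.sample_from(lambda spec: spec.config["test2"] * 3),
        }[spec.config["test1"]]),
    })

The key point is that only test1 and test2 are grid searches (four trials), while the dependent value stays a sample_from, so the trial count is still known up front.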

1 reaction
krfricke commented, Jan 4, 2021

The main reason we don’t currently support this is that we require the number of samples to be known in advance. With dependent grid search spaces, however, the number of grid search elements can vary. Even if we introduced a way to specify the number of elements and fix it, resolving the grid search variables would require a significant overhaul of our current search space resolution algorithm.

We can keep this issue open to track this request, but it’s unlikely we’ll get to this soon.
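
In the meantime, one way to stay within the fixed-sample-count constraint is to enumerate the dependent combinations yourself before handing the config to Tune. The helper below is only an illustrative sketch (not part of Tune’s API): it flattens the a/b dependency from the original report into a single grid search over precomputed pairs, so the number of trials is known before resolution starts.

from ray import tune

def expand_pairs():
    # Precompute every (a, b) combination from the original example:
    # a in {1, 2}, b = i**a for i in range(2, 4).
    pairs = []
    for a in [1, 2]:
        for b in [i ** a for i in range(2, 4)]:
            pairs.append({"a": a, "b": b})
    return pairs

config = {
    # One flat grid search over the precomputed pairs; each trial reads
    # config["ab"]["a"] and config["ab"]["b"].
    "ab": tune.grid_search(expand_pairs()),
}

This yields the same four combinations the nested version would (b in {2, 3} for a=1 and {4, 9} for a=2), at the cost of a slightly less direct config layout.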

