
Issues with `optimize`: needs more documentation, maybe does not return the correct best parameters?


I’m running the Get Started example with optimize and seeing a couple issues:

  1. Setting random_seed does not lead to reproducible output. Values only change by a small amount on subsequent runs, so it’s not 100% “random” - perhaps there’s another RNG that feeds into one part of this? (A guess at a workaround is sketched right after this list.)
  2. best_parameters do not actually come from the trial with the best result. In the example below, they come from trial 14, while the actual best result was in trial 11. (The mismatch persisted across reruns, even though the exact values varied.)
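
For point 1, a workaround I would try first - purely a guess that the leftover randomness comes from the global Python/NumPy/torch RNGs that Ax’s model fitting runs on (not confirmed):

import random
import numpy as np
import torch

seed = 2
random.seed(seed)        # Python's global RNG
np.random.seed(seed)     # NumPy's global RNG
torch.manual_seed(seed)  # torch RNG, used during GP model fitting

The original run, for reference: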
In [13]: %paste                                                                                                                               
import ax
print(ax.__version__)
best_parameters, best_values, experiment, model = ax.optimize(
        parameters=[
          {
            "name": "x1",
            "type": "range",
            "bounds": [-10.0, 10.0],
          },
          {
            "name": "x2",
            "type": "range",
            "bounds": [-10.0, 10.0],
          },
        ],
        # Booth function
        evaluation_function=lambda p: (p["x1"] + 2*p["x2"] - 7)**2 + (2*p["x1"] + p["x2"] - 5)**2,
        minimize=True,
        random_seed=2,
    )

print(f"best params: {best_parameters}")

for t in experiment.trials:  # experiment.trials maps trial index -> Trial
  print('---')
  print(t)
  print(experiment.trials[t].arm.parameters)
  print(experiment.trials[t].objective_mean)

## -- End pasted text --
0.1.20
[INFO 06-20 19:13:57] ax.service.utils.instantiation: Inferred value type of ParameterType.FLOAT for parameter x1. If that is not the expected value type, you can explicity specify 'value_type' ('int', 'float', 'bool' or 'str') in parameter dict.
[INFO 06-20 19:13:57] ax.service.utils.instantiation: Inferred value type of ParameterType.FLOAT for parameter x2. If that is not the expected value type, you can explicity specify 'value_type' ('int', 'float', 'bool' or 'str') in parameter dict.
[INFO 06-20 19:13:57] ax.modelbridge.dispatch_utils: Using Bayesian Optimization generation strategy: GenerationStrategy(name='Sobol+GPEI', steps=[Sobol for 5 trials, GPEI for subsequent trials]). Iterations after 5 will take longer to generate due to  model-fitting.
[INFO 06-20 19:13:57] ax.service.managed_loop: Started full optimization with 20 steps.
[INFO 06-20 19:13:57] ax.service.managed_loop: Running optimization trial 1...
[INFO 06-20 19:13:57] ax.service.managed_loop: Running optimization trial 2...
[INFO 06-20 19:13:57] ax.service.managed_loop: Running optimization trial 3...
[INFO 06-20 19:13:57] ax.service.managed_loop: Running optimization trial 4...
[INFO 06-20 19:13:57] ax.service.managed_loop: Running optimization trial 5...
[INFO 06-20 19:13:58] ax.service.managed_loop: Running optimization trial 6...
[INFO 06-20 19:13:58] ax.service.managed_loop: Running optimization trial 7...
[INFO 06-20 19:13:58] ax.service.managed_loop: Running optimization trial 8...
[INFO 06-20 19:13:59] ax.service.managed_loop: Running optimization trial 9...
[INFO 06-20 19:13:59] ax.service.managed_loop: Running optimization trial 10...
[INFO 06-20 19:14:00] ax.service.managed_loop: Running optimization trial 11...
[INFO 06-20 19:14:00] ax.service.managed_loop: Running optimization trial 12...
[INFO 06-20 19:14:01] ax.service.managed_loop: Running optimization trial 13...
[INFO 06-20 19:14:01] ax.service.managed_loop: Running optimization trial 14...
[INFO 06-20 19:14:02] ax.service.managed_loop: Running optimization trial 15...
[INFO 06-20 19:14:03] ax.service.managed_loop: Running optimization trial 16...
[INFO 06-20 19:14:03] ax.service.managed_loop: Running optimization trial 17...
[INFO 06-20 19:14:04] ax.service.managed_loop: Running optimization trial 18...
[INFO 06-20 19:14:05] ax.service.managed_loop: Running optimization trial 19...
[INFO 06-20 19:14:05] ax.service.managed_loop: Running optimization trial 20...

best params: {'x1': 1.4327030843905124, 'x2': 2.8136152874988056}
---
0
{'x1': 5.551732778549194, 'x2': 9.377297163009644}
539.1629722466083
---
1
{'x1': -3.933116849511862, 'x2': -7.940220162272453}
1151.8753706086577
---
2
{'x1': -5.739457290619612, 'x2': 0.686403214931488}
378.6041670452986
---
3
{'x1': 4.6975501254200935, 'x2': -1.6736418567597866}
39.3258254072101
---
4
{'x1': 2.1678781881928444, 'x2': 4.608701150864363}
34.78943015956635
---
5
{'x1': 7.36139263591334, 'x2': 1.504591000062078}
137.41475132767002
---
6
{'x1': 2.310447780652936, 'x2': 1.3904117780626084}
4.665987644465215
---
7
{'x1': -4.810674963815469, 'x2': 8.364400314275986}
63.337378518202556
---
8
{'x1': -1.5243934300008029, 'x2': 5.022754561104923}
11.47066442305054
---
9
{'x1': 9.999999999999986, 'x2': -6.252351964268424}
166.86074292618207
---
10
{'x1': -10.0, 'x2': 10.0}
234.0
---
11
{'x1': 0.6636458893856982, 'x2': 3.2849454446895585}
0.20489939790511563
---
12
{'x1': 2.629822040934295, 'x2': 2.2841183380110905}
6.509954504154573
---
13
{'x1': 0.12090649037823198, 'x2': 4.516155679782699}
4.694926278597075
---
14
{'x1': 1.4327030843905124, 'x2': 2.8136152874988056}
0.46466218161603334
---
15
{'x1': -1.5311382671850993, 'x2': 7.107052332123033}
33.20856034668283
---
16
{'x1': 3.765216423598442, 'x2': 0.6366904023760931}
13.877770510619534
---
17
{'x1': 0.9165355489969684, 'x2': 3.4964105314979896}
0.9354875919479475
---
18
{'x1': 9.999999999999993, 'x2': -2.6701034251441644}
157.50291764877784
---
19
{'x1': 1.4885066277036607, 'x2': 2.4159862924326205}
0.6161971451533785
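
For reference, a minimal sketch of recovering the lowest-objective trial by hand (plain bookkeeping over experiment.trials, nothing Ax-specific):

# Pick the trial whose observed objective value is smallest.
best_trial = min(experiment.trials.values(), key=lambda t: t.objective_mean)
print(best_trial.index, best_trial.arm.parameters, best_trial.objective_mean)

With the output above, this reports trial 11 rather than trial 14, which is where best_parameters came from.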

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 8 (5 by maintainers)

Top GitHub Comments

kurtosis commented, Jun 21, 2021 (2 reactions)

Thanks @lena-kashtelyan! My two cents: optimize is a great entry point to get more folks to try out Ax. Basically anyone who’s training a model with hyperparameters can drop it into their code in under an hour. The main thing I’d suggest for the tutorial is a more explicit description of the options for setting up the parameter space (integer, categorical, log scale), since those come up a lot. It would also help to link to the docs for the objects optimize returns (e.g. where can one learn what’s in experiment, or what exactly model is?).
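
For concreteness, a sketch of what those options look like as parameter dicts (the parameter names here are invented; value_type, log_scale, and choice/values are the knobs I mean):

parameters = [
    # Float range searched on a log scale (e.g. a learning rate).
    {"name": "lr", "type": "range", "bounds": [1e-5, 1e-1], "log_scale": True},
    # Integer-valued range.
    {"name": "batch_size", "type": "range", "bounds": [16, 256], "value_type": "int"},
    # Categorical choice.
    {"name": "optimizer", "type": "choice", "values": ["adam", "sgd"]},
]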

Balandat commented, Jun 21, 2021 (2 reactions)

> although I think the page you linked has an incorrect description?

Ah good catch, we must have forgotten to update this part of the docs. Fixed in #606.

> Fwiw it might be nice to have more documentation of optimize

Point taken - we should write up a more detailed description of what’s going on under the hood.
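
Until that write-up exists, here is a hedged sketch of roughly what the managed loop behind optimize is doing, going by the Sobol+GPEI generation strategy in the log above (simplified to the Service API, not the exact internals):

from ax.service.ax_client import AxClient

def booth(p):
    return (p["x1"] + 2 * p["x2"] - 7) ** 2 + (2 * p["x1"] + p["x2"] - 5) ** 2

ax_client = AxClient(random_seed=2)
ax_client.create_experiment(
    name="booth",
    parameters=[
        {"name": "x1", "type": "range", "bounds": [-10.0, 10.0]},
        {"name": "x2", "type": "range", "bounds": [-10.0, 10.0]},
    ],
    minimize=True,
)
for _ in range(20):
    # Sobol (quasi-random) points for the first 5 trials, GPEI afterwards.
    params, trial_index = ax_client.get_next_trial()
    ax_client.complete_trial(trial_index=trial_index, raw_data=booth(params))
best_parameters, values = ax_client.get_best_parameters()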

> Btw, I think I was misled by the best_objectives code in section 5 of this tutorial: https://ax.dev/tutorials/tune_cnn.html

Good point also, we should do one of the following:

  1. plot the model-predicted means at the observed arms (rather than the raw evaluations)
  2. change the evaluation function to return 0.0 SEM (sketched just below)

I don’t think a third option (leaving things as-is but explaining why we do it) makes a lot of sense.
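
For option 2, a minimal sketch - the Service API accepts an evaluation function that returns a (mean, SEM) tuple, and the explicit 0.0 SEM is the whole change:

def booth(p):
    val = (p["x1"] + 2 * p["x2"] - 7) ** 2 + (2 * p["x1"] + p["x2"] - 5) ** 2
    return (val, 0.0)  # (mean, SEM); SEM of 0.0 marks the evaluation as noiseless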

