
Slow candidate generation for a simple multi-objective problem

See original GitHub issue

Hello,

Thank you for a great framework. I am observing really slow generation times for a very simple problem outlined here:

import numpy as np
import math

from ax.service.ax_client import AxClient
from ax.service.utils.instantiation import ObjectiveProperties

def f(parameters):
    x1, x2 = parameters["x1"], parameters["x2"]
    n = 2
    f1 = 1.0 - np.exp(-sum([(x - 1 / math.sqrt(n)) ** 2 for x in [x1, x2]]))
    f2 = 1.0 - np.exp(-sum([(x + 1 / math.sqrt(n)) ** 2 for x in [x1, x2]]))
    return {"f1": (f1, 0.0), "f2": (f2, 0.0)}

ax_client = AxClient()
ax_client.create_experiment(
    name="toy_example",
    parameters=[
        {
            "name": "x1",
            "type": "range",
            "bounds": [-2.0, 2.0],
            "value_type": "float"
        },
        {
            "name": "x2",
            "type": "range",
            "bounds": [-2.0, 2.0],
            "value_type": "float"
        },
    ],
    objectives={
        "f1": ObjectiveProperties(minimize=True), 
        "f2": ObjectiveProperties(minimize=True)
    })

for i in range(20):
    parameters, trial_index = ax_client.get_next_trial()
    print("Iteration: {}, Parameters: {}".format(i, parameters))
    ax_client.complete_trial(trial_index=trial_index, raw_data=f(parameters))

The output looks approximately like this:

[INFO 02-16 17:53:32] ax.service.ax_client: Starting optimization with verbose logging. To disable logging, set the `verbose_logging` argument to `False`. Note that float values in the logs are rounded to 6 decimal points.
[INFO 02-16 17:53:32] ax.service.utils.instantiation: Due to non-specification, we will use the heuristic for selecting objective thresholds.
[INFO 02-16 17:53:32] ax.service.utils.instantiation: Created search space: SearchSpace(parameters=[RangeParameter(name='x1', parameter_type=FLOAT, range=[-2.0, 2.0]), RangeParameter(name='x2', parameter_type=FLOAT, range=[-2.0, 2.0])], parameter_constraints=[]).
[INFO 02-16 17:53:32] ax.modelbridge.dispatch_utils: Using Bayesian optimization since there are more ordered parameters than there are categories for the unordered categorical parameters.
[INFO 02-16 17:53:32] ax.modelbridge.dispatch_utils: Using Bayesian Optimization generation strategy: GenerationStrategy(name='Sobol+MOO', steps=[Sobol for 5 trials, MOO for subsequent trials]). Iterations after 5 will take longer to generate due to  model-fitting.
[INFO 02-16 17:53:32] ax.service.ax_client: Generated new trial 0 with parameters {'x1': -1.15997, 'x2': 1.953482}.
[INFO 02-16 17:53:32] ax.service.ax_client: Completed trial 0 with data: {'f1': (0.993523, 0.0), 'f2': (0.999313, 0.0)}.
[INFO 02-16 17:53:32] ax.service.ax_client: Generated new trial 1 with parameters {'x1': -0.670189, 'x2': 1.804642}.
[INFO 02-16 17:53:32] ax.service.ax_client: Completed trial 1 with data: {'f1': (0.95502, 0.0), 'f2': (0.998182, 0.0)}.
[INFO 02-16 17:53:32] ax.service.ax_client: Generated new trial 2 with parameters {'x1': -0.99277, 'x2': 0.938971}.
[INFO 02-16 17:53:32] ax.service.ax_client: Completed trial 2 with data: {'f1': (0.947311, 0.0), 'f2': (0.938651, 0.0)}.
[INFO 02-16 17:53:32] ax.service.ax_client: Generated new trial 3 with parameters {'x1': -0.046451, 'x2': -0.747148}.
[INFO 02-16 17:53:32] ax.service.ax_client: Completed trial 3 with data: {'f1': (0.931622, 0.0), 'f2': (0.354719, 0.0)}.
[INFO 02-16 17:53:32] ax.service.ax_client: Generated new trial 4 with parameters {'x1': -1.47453, 'x2': 0.428721}.
[INFO 02-16 17:53:32] ax.service.ax_client: Completed trial 4 with data: {'f1': (0.99207, 0.0), 'f2': (0.847264, 0.0)}.
[WARNING 02-16 17:53:32] ax.utils.common.kwargs: `<class 'ax.modelbridge.multi_objective_torch.MultiObjectiveTorchModelBridge'>` expected argument `transform_configs` to be of type typing.Union[typing.Dict[str, typing.Dict[str, typing.Union[int, float, str, botorch.acquisition.acquisition.AcquisitionFunction, typing.Dict[str, typing.Any], NoneType]]], NoneType]. Got {'Winsorize': {'optimization_config': MultiObjectiveOptimizationConfig(objective=MultiObjective(objectives=[Objective(metric_name="f1", minimize=True), Objective(metric_name="f2", minimize=True)]), outcome_constraints=[], objective_thresholds=[])}} (type: <class 'dict'>).
Iteration: 0, Parameters: {'x1': -1.1599698066711426, 'x2': 1.953481912612915}
Iteration: 1, Parameters: {'x1': -0.6701887361705303, 'x2': 1.8046424239873886}
Iteration: 2, Parameters: {'x1': -0.992770355194807, 'x2': 0.938971497118473}
Iteration: 3, Parameters: {'x1': -0.0464509092271328, 'x2': -0.7471476458013058}
Iteration: 4, Parameters: {'x1': -1.4745304770767689, 'x2': 0.4287210963666439}
[INFO 02-16 17:53:38] ax.service.ax_client: Generated new trial 5 with parameters {'x1': 0.788512, 'x2': -1.507235}.
[INFO 02-16 17:53:38] ax.service.ax_client: Completed trial 5 with data: {'f1': (0.992627, 0.0), 'f2': (0.943701, 0.0)}.
[WARNING 02-16 17:53:38] ax.utils.common.kwargs: `<class 'ax.modelbridge.multi_objective_torch.MultiObjectiveTorchModelBridge'>` expected argument `transform_configs` to be of type typing.Union[typing.Dict[str, typing.Dict[str, typing.Union[int, float, str, botorch.acquisition.acquisition.AcquisitionFunction, typing.Dict[str, typing.Any], NoneType]]], NoneType]. Got {'Winsorize': {'optimization_config': MultiObjectiveOptimizationConfig(objective=MultiObjective(objectives=[Objective(metric_name="f1", minimize=True), Objective(metric_name="f2", minimize=True)]), outcome_constraints=[], objective_thresholds=[])}} (type: <class 'dict'>).
Iteration: 5, Parameters: {'x1': 0.7885118081940683, 'x2': -1.5072345989809635}
[INFO 02-16 17:54:07] ax.service.ax_client: Generated new trial 6 with parameters {'x1': -0.401061, 'x2': -0.466999}.
[INFO 02-16 17:54:07] ax.service.ax_client: Completed trial 6 with data: {'f1': (0.926212, 0.0), 'f2': (0.140424, 0.0)}.
[WARNING 02-16 17:54:07] ax.utils.common.kwargs: `<class 'ax.modelbridge.multi_objective_torch.MultiObjectiveTorchModelBridge'>` expected argument `transform_configs` to be of type typing.Union[typing.Dict[str, typing.Dict[str, typing.Union[int, float, str, botorch.acquisition.acquisition.AcquisitionFunction, typing.Dict[str, typing.Any], NoneType]]], NoneType]. Got {'Winsorize': {'optimization_config': MultiObjectiveOptimizationConfig(objective=MultiObjective(objectives=[Objective(metric_name="f1", minimize=True), Objective(metric_name="f2", minimize=True)]), outcome_constraints=[], objective_thresholds=[])}} (type: <class 'dict'>).
Iteration: 6, Parameters: {'x1': -0.40106072850883034, 'x2': -0.4669987616451259}
[INFO 02-16 17:55:18] ax.service.ax_client: Generated new trial 7 with parameters {'x1': -0.178156, 'x2': -0.081864}.
[INFO 02-16 17:55:18] ax.service.ax_client: Completed trial 7 with data: {'f1': (0.754919, 0.0), 'f2': (0.488659, 0.0)}.
[WARNING 02-16 17:55:18] ax.utils.common.kwargs: `<class 'ax.modelbridge.multi_objective_torch.MultiObjectiveTorchModelBridge'>` expected argument `transform_configs` to be of type typing.Union[typing.Dict[str, typing.Dict[str, typing.Union[int, float, str, botorch.acquisition.acquisition.AcquisitionFunction, typing.Dict[str, typing.Any], NoneType]]], NoneType]. Got {'Winsorize': {'optimization_config': MultiObjectiveOptimizationConfig(objective=MultiObjective(objectives=[Objective(metric_name="f1", minimize=True), Objective(metric_name="f2", minimize=True)]), outcome_constraints=[], objective_thresholds=[])}} (type: <class 'dict'>).
Iteration: 7, Parameters: {'x1': -0.17815635208056846, 'x2': -0.08186353915271938}
[INFO 02-16 17:56:59] ax.service.ax_client: Generated new trial 8 with parameters {'x1': -0.373454, 'x2': -0.203828}.
[INFO 02-16 17:56:59] ax.service.ax_client: Completed trial 8 with data: {'f1': (0.864314, 0.0), 'f2': (0.305535, 0.0)}.
[WARNING 02-16 17:56:59] ax.utils.common.kwargs: `<class 'ax.modelbridge.multi_objective_torch.MultiObjectiveTorchModelBridge'>` expected argument `transform_configs` to be of type typing.Union[typing.Dict[str, typing.Dict[str, typing.Union[int, float, str, botorch.acquisition.acquisition.AcquisitionFunction, typing.Dict[str, typing.Any], NoneType]]], NoneType]. Got {'Winsorize': {'optimization_config': MultiObjectiveOptimizationConfig(objective=MultiObjective(objectives=[Objective(metric_name="f1", minimize=True), Objective(metric_name="f2", minimize=True)]), outcome_constraints=[], objective_thresholds=[])}} (type: <class 'dict'>).
Iteration: 8, Parameters: {'x1': -0.373453809152835, 'x2': -0.20382825580915798}
[INFO 02-16 17:57:24] ax.service.ax_client: Generated new trial 9 with parameters {'x1': -1.190448, 'x2': -1.544238}.
[INFO 02-16 17:57:24] ax.service.ax_client: Completed trial 9 with data: {'f1': (0.999828, 0.0), 'f2': (0.607182, 0.0)}.
[WARNING 02-16 17:57:24] ax.utils.common.kwargs: `<class 'ax.modelbridge.multi_objective_torch.MultiObjectiveTorchModelBridge'>` expected argument `transform_configs` to be of type typing.Union[typing.Dict[str, typing.Dict[str, typing.Union[int, float, str, botorch.acquisition.acquisition.AcquisitionFunction, typing.Dict[str, typing.Any], NoneType]]], NoneType]. Got {'Winsorize': {'optimization_config': MultiObjectiveOptimizationConfig(objective=MultiObjective(objectives=[Objective(metric_name="f1", minimize=True), Objective(metric_name="f2", minimize=True)]), outcome_constraints=[], objective_thresholds=[])}} (type: <class 'dict'>).
Iteration: 9, Parameters: {'x1': -1.1904480359033829, 'x2': -1.544238327092655}
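Judging by the log timestamps, generation for trials 6-8 took roughly 30-100 seconds each. To pin down where the time goes, the per-call generation latency can be measured directly; here is a minimal stdlib sketch (the `fake_generation` stand-in below is hypothetical, taking the place of `ax_client.get_next_trial`):

```python
import time

def time_call(fn, *args, **kwargs):
    """Return (result, elapsed_seconds) for a single call to fn."""
    start = time.monotonic()
    result = fn(*args, **kwargs)
    return result, time.monotonic() - start

# In the loop above one would wrap the generation step, e.g.:
#   (parameters, trial_index), elapsed = time_call(ax_client.get_next_trial)
#   print(f"Iteration {i}: generation took {elapsed:.1f}s")

# Illustration with a stand-in for the (slow) generation step:
def fake_generation():
    time.sleep(0.01)  # placeholder for model fitting + acquisition optimization
    return {"x1": 0.0, "x2": 0.0}, 0

result, elapsed = time_call(fake_generation)
print(f"generation took {elapsed:.3f}s")
```

This makes it easy to report exactly which iterations are slow, rather than eyeballing log timestamps.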

My config is:

PyTorch built with:
  - GCC 7.3
  - C++ Version: 201402
  - Intel(R) Math Kernel Library Version 2020.0.0 Product Build 20191122 for Intel(R) 64 architecture applications
  - Intel(R) MKL-DNN v2.2.3 (Git Hash 7336ca9f055cf1bfa13efb658fe15dc9b41f0740)
  - OpenMP 201511 (a.k.a. OpenMP 4.5)
  - LAPACK is enabled (usually provided by MKL)
  - NNPACK is enabled
  - CPU capability usage: AVX2
  - CUDA Runtime 11.3
  - NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80;-gencode;arch=compute_86,code=sm_86
  - CuDNN 8.2
  - Magma 2.5.2
  - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, CUDA_VERSION=11.3, CUDNN_VERSION=8.2.0, CXX_COMPILER=/opt/rh/devtoolset-7/root/usr/bin/c++, CXX_FLAGS= -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_KINETO -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -DEDGE_PROFILER_USE_KINETO -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-sign-compare -Wno-unused-parameter -Wno-unused-variable -Wno-unused-function -Wno-unused-result -Wno-unused-local-typedefs -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-psabi -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, TORCH_VERSION=1.10.2, USE_CUDA=ON, USE_CUDNN=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=ON, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON,
print(ax.__version__)
0.2.3
Python 3.7.6 (default, Feb 14 2022, 16:14:04) 
[GCC 8.5.0 20210514 (Red Hat 8.5.0-8)] on linux

Note that I also mentioned this with respect to #478, which was closed without a solution being provided.

Thank you!
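For reference, the toy objective in the script above is the classic Fonseca-Fleming test function, whose two single-objective minima lie at x1 = x2 = ±1/√2. That gives a quick sanity check of the implementation; a pure-stdlib sketch (using math.exp in place of np.exp, since the inputs are scalars):

```python
import math

def f(parameters):
    # Same objective as in the script above, with math.exp instead of np.exp
    x1, x2 = parameters["x1"], parameters["x2"]
    n = 2
    f1 = 1.0 - math.exp(-sum((x - 1 / math.sqrt(n)) ** 2 for x in (x1, x2)))
    f2 = 1.0 - math.exp(-sum((x + 1 / math.sqrt(n)) ** 2 for x in (x1, x2)))
    return {"f1": (f1, 0.0), "f2": (f2, 0.0)}

# At x1 = x2 = 1/sqrt(2), f1 attains its minimum of exactly 0, while
# f2 = 1 - exp(-4): one endpoint of the Pareto front.
opt = 1 / math.sqrt(2)
out = f({"x1": opt, "x2": opt})
print(out["f1"][0], out["f2"][0])
```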

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 14 (8 by maintainers)

Top GitHub Comments

3 reactions
saitcakmak commented, Feb 25, 2022

Hi @martinferianc. In case you still want to try it, we implemented an easy way of using a GPU with the Service API. You can simply initialize AxClient as ax_client = AxClient(torch_device=torch.device("cuda")) (this will pick GPU 0; if you have multiple GPUs, you can select a specific one with cuda:x, where x is the GPU index). To use this, you need to build from the main branch, which you can do with pip3 install git+ssh://git@github.com/facebook/Ax.git#egg=ax-platform.
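A defensive sketch of the device selection this comment describes (the AxClient call itself is commented out here, since it requires an Ax build from the main branch at the time of the comment, and torch may not be installed):

```python
# Pick a CUDA device if torch and a GPU are available; otherwise stay on CPU.
try:
    import torch
    device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
except ImportError:
    device = None  # no torch: Ax would fall back to its CPU defaults

# from ax.service.ax_client import AxClient
# ax_client = AxClient(torch_device=device)  # as described in the comment above
print(f"model fitting device: {device}")
```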

2 reactions
martinferianc commented, Feb 25, 2022

> Hi @martinferianc. In case you still want to try it, we implemented an easy way of using a GPU with the Service API. You can simply initialize AxClient as ax_client = AxClient(torch_device=torch.device("cuda")) (this will pick GPU 0; if you have multiple GPUs, you can select a specific one with cuda:x, where x is the GPU index). To use this, you need to build from the main branch, which you can do with pip3 install git+ssh://git@github.com/facebook/Ax.git#egg=ax-platform.

Thank you for this!
