Remove get_GPEI factory function (#3019)
Summary:
Pull Request resolved: #3019

Factory functions mostly duplicate an existing registry entry and are not recommended. This diff removes the `get_GPEI` factory function, which re-packages the legacy GPEI model.
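For reference, a minimal migration sketch (adapted from the removed test and the updated docs in this diff; assumes Ax and its Branin test stubs are available) showing the registry-based replacement for the removed factory function:

```Python
from ax.modelbridge.factory import get_sobol
from ax.modelbridge.registry import Models
from ax.utils.testing.core_stubs import (
    get_branin_experiment,
    get_branin_optimization_config,
)

# Set up a small Branin experiment and seed it with Sobol trials,
# mirroring the removed test_sobol_GPEI test.
exp = get_branin_experiment()
sobol = get_sobol(search_space=exp.search_space)
for _ in range(5):
    exp.new_batch_trial().add_generator_run(sobol.gen(n=1)).run().mark_completed()
exp.optimization_config = get_branin_optimization_config()

# Before (removed): m = get_GPEI(experiment=exp, data=exp.fetch_data())
# After: construct the model through the registry entry instead.
m = Models.BOTORCH_MODULAR(experiment=exp, data=exp.fetch_data())
gr = m.gen(n=1)  # generates one candidate arm
```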

Reviewed By: dme65

Differential Revision: D65486831

fbshipit-source-id: 11d6a912997ce1f737065e75f5877dfb3b7fb0b3
saitcakmak authored and facebook-github-bot committed Nov 5, 2024
1 parent f2c2695 commit f09a318
Showing 3 changed files with 4 additions and 56 deletions.
22 changes: 0 additions & 22 deletions ax/modelbridge/factory.py
@@ -140,28 +140,6 @@ def get_botorch(
)


def get_GPEI(
experiment: Experiment,
data: Data,
search_space: SearchSpace | None = None,
dtype: torch.dtype = torch.double,
device: torch.device = DEFAULT_TORCH_DEVICE,
) -> TorchModelBridge:
"""Instantiates a GP model that generates points with EI."""
if data.df.empty:
raise ValueError("GP+EI BotorchModel requires non-empty data.")
return checked_cast(
TorchModelBridge,
Models.LEGACY_BOTORCH(
experiment=experiment,
data=data,
search_space=search_space or experiment.search_space,
torch_dtype=dtype,
torch_device=device,
),
)


def get_factorial(search_space: SearchSpace) -> DiscreteModelBridge:
"""Instantiates a factorial generator."""
return checked_cast(
30 changes: 0 additions & 30 deletions ax/modelbridge/tests/test_factory.py
@@ -9,26 +9,21 @@
from ax.core.outcome_constraint import ComparisonOp, ObjectiveThreshold
from ax.modelbridge.discrete import DiscreteModelBridge
from ax.modelbridge.factory import (
get_botorch,
get_empirical_bayes_thompson,
get_factorial,
get_GPEI,
get_sobol,
get_thompson,
get_uniform,
)
from ax.modelbridge.random import RandomModelBridge
from ax.modelbridge.torch import TorchModelBridge
from ax.models.discrete.eb_thompson import EmpiricalBayesThompsonSampler
from ax.models.discrete.thompson import ThompsonSampler
from ax.utils.common.testutils import TestCase
from ax.utils.testing.core_stubs import (
get_branin_experiment,
get_branin_experiment_with_multi_objective,
get_branin_optimization_config,
get_factorial_experiment,
)
from ax.utils.testing.mock import mock_botorch_optimize


# pyre-fixme[3]: Return type must be annotated.
@@ -52,31 +47,6 @@ def get_multi_obj_exp_and_opt_config():


class ModelBridgeFactoryTestSingleObjective(TestCase):
@mock_botorch_optimize
def test_sobol_GPEI(self) -> None:
"""Tests sobol + GPEI instantiation."""
exp = get_branin_experiment()
# Check that factory generates a valid sobol modelbridge.
sobol = get_sobol(search_space=exp.search_space)
self.assertIsInstance(sobol, RandomModelBridge)
for _ in range(5):
sobol_run = sobol.gen(n=1)
exp.new_batch_trial().add_generator_run(sobol_run).run().mark_completed()
# Check that factory generates a valid GP+EI modelbridge.
exp.optimization_config = get_branin_optimization_config()
gpei = get_GPEI(experiment=exp, data=exp.fetch_data())
self.assertIsInstance(gpei, TorchModelBridge)
gpei = get_GPEI(
experiment=exp, data=exp.fetch_data(), search_space=exp.search_space
)
self.assertIsInstance(gpei, TorchModelBridge)
botorch = get_botorch(experiment=exp, data=exp.fetch_data())
self.assertIsInstance(botorch, TorchModelBridge)

# Check that .gen returns without failure
gpei_run = gpei.gen(n=1)
self.assertEqual(len(gpei_run.arms), 1)

def test_model_kwargs(self) -> None:
"""Tests that model kwargs are passed correctly."""
exp = get_branin_experiment()
8 changes: 4 additions & 4 deletions docs/models.md
@@ -22,11 +22,11 @@ Additional arguments can be passed to [`get_sobol`](../api/modelbridge.html#ax.m
Sobol sequences are typically used to select initialization points, and this model does not implement [`predict`](../api/modelbridge.html#ax.modelbridge.base.ModelBridge.predict). It can be used on search spaces with any combination of discrete and continuous parameters.

#### Gaussian Process with EI
- Gaussian Processes (GPs) are used for [Bayesian Optimization](bayesopt.md) in Ax, the [`get_GPEI`](../api/modelbridge.html#ax.modelbridge.factory.get_gpei) function constructs a model that fits a GP to the data, and uses the EI acquisition function to generate new points on calls to [`gen`](../api/modelbridge.html#ax.modelbridge.base.ModelBridge.gen). This code fits a GP and generates a batch of 5 points which maximizes EI:
+ Gaussian Processes (GPs) are used for [Bayesian Optimization](bayesopt.md) in Ax; the [`Models.BOTORCH_MODULAR`](../api/modelbridge.html#ax.modelbridge.registry.Models) registry entry constructs a modular BoTorch model that fits a GP to the data and uses the qLogNEI (or qLogNEHVI for MOO) acquisition function to generate new points on calls to [`gen`](../api/modelbridge.html#ax.modelbridge.base.ModelBridge.gen). This code fits a GP and generates a batch of 5 points that maximize EI:
```Python
- from ax.modelbridge.factory import get_GPEI
+ from ax.modelbridge.registry import Models

- m = get_GPEI(experiment, data)
+ m = Models.BOTORCH_MODULAR(experiment=experiment, data=data)
gr = m.gen(n=5, optimization_config=optimization_config)
```

@@ -158,7 +158,7 @@ The primary role of the [`ModelBridge`](../api/modelbridge.html#ax.modelbridge.b

## Transforms

- The transformations in the [`ModelBridge`](../api/modelbridge.html#ax.modelbridge.base.ModelBridge) are done by chaining together a set of individual Transform objects. For continuous space models obtained via factory functions ([`get_sobol`](/api/data.html#.data.users.adamobeng.fbsource.fbcode.ax.ax.modelbridge.factory.get_sobol) and [`get_GPEI`](/api/data.html#.data.users.adamobeng.fbsource.fbcode.ax.ax.modelbridge.factory.get_GPEI)), the following transforms will be applied by default, in this sequence:
+ The transformations in the [`ModelBridge`](../api/modelbridge.html#ax.modelbridge.base.ModelBridge) are done by chaining together a set of individual Transform objects. For continuous space models obtained via factory functions ([`get_sobol`](/api/data.html#.data.users.adamobeng.fbsource.fbcode.ax.ax.modelbridge.factory.get_sobol) and [`Models.BOTORCH_MODULAR`](/api/data.html#.data.users.adamobeng.fbsource.fbcode.ax.ax.modelbridge.registry.Models)), the following transforms will be applied by default, in this sequence:
* [`RemoveFixed`](../api/modelbridge.html#ax.modelbridge.transforms.remove_fixed.RemoveFixed): Remove [`FixedParameters`](../api/core.html#ax.core.parameter.FixedParameter) from the search space.
* [`OrderedChoiceEncode`](../api/modelbridge.html#ax.modelbridge.transforms.choice_encode.OrderedChoiceEncode): [`ChoiceParameters`](../api/core.html#ax.core.parameter.ChoiceParameter) with `is_ordered` set to `True` are encoded as a sequence of integers.
* [`OneHot`](../api/modelbridge.html#ax.modelbridge.transforms.one_hot.OneHot): [`ChoiceParameters`](../api/core.html#ax.core.parameter.ChoiceParameter) with `is_ordered` set to `False` are one-hot encoded.
