
Commit 30b217c

esantorella authored and facebook-github-bot committed
Update MOBO docs (#2438)
Summary:

## Motivation

* Refer to log variants of acquisition functions, with explanation in footnote
* Fix broken links
* Mention `qLogNParEGO` exists (previous documentation implied the user would have to implement ParEGO variants themself)

Pull Request resolved: #2438

Test Plan: Built the website locally and checked every link manually

Reviewed By: saitcakmak

Differential Revision: D60043743

Pulled By: esantorella

fbshipit-source-id: 7793d4baed869b0ef671cb2b2507d9e8bb9a4b92
1 parent 25506ab commit 30b217c

File tree: 1 file changed (+26 −18 lines)


docs/multi_objective.md (+26 −18)
@@ -3,12 +3,11 @@ id: multi_objective
 title: Multi-Objective Bayesian Optimization
 ---
 
-BoTorch provides first-class support for Multi-Objective (MO) Bayesian
-Optimization (BO) including implementations of
-[`qNoisyExpectedHypervolumeImprovement`](../api/acquisition.html#botorch.acquisition.multi_objective.monte_carlo.qNoisyExpectedHypervolumeImprovement)
-(qNEHVI)[^qNEHVI],
-[`qExpectedHypervolumeImprovement`](../api/acquisition.html#botorch.acquisition.multi_objective.monte_carlo.qExpectedHypervolumeImprovement)
-(qEHVI), qParEGO[^qEHVI], qNParEGO[^qNEHVI], and analytic
+BoTorch provides first-class support for Multi-Objective (MO) Bayesian Optimization (BO) including implementations of
+[`qLogNoisyExpectedHypervolumeImprovement`](../api/acquisition.html#botorch.acquisition.multi_objective.logei.qLogNoisyExpectedHypervolumeImprovement) (qLogNEHVI)[^qNEHVI][^LogEI],
+[`qLogExpectedHypervolumeImprovement`](../api/acquisition.html#botorch.acquisition.multi_objective.logei.qLogExpectedHypervolumeImprovement) (qLogEHVI),
+[`qLogNParEGO`](../api/acquisition.html#botorch.acquisition.multi_objective.parego.qLogNParEGO)[^qNEHVI],
+and analytic
 [`ExpectedHypervolumeImprovement`](../api/acquisition.html#botorch.acquisition.multi_objective.analytic.ExpectedHypervolumeImprovement)
 (EHVI) with gradients via auto-differentiation acquisition functions[^qEHVI].
 
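The switch to the log variants above is about numerics: far from the incumbent, plain Expected Improvement underflows to exactly zero in floating point, while its logarithm stays finite and informative. A minimal pure-Python sketch of the phenomenon (not BoTorch's implementation; the asymptotic log formula below is a standard approximation used here only for illustration):

```python
import math

def naive_ei(z: float) -> float:
    """Analytic EI helper h(z) = phi(z) + z * Phi(z) (unit posterior std),
    evaluated directly in float64."""
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return phi + z * Phi

def log_ei_asymptotic(z: float) -> float:
    """Stable log h(z) for z << 0, using the asymptotic expansion
    h(z) ~ phi(z) / z**2, i.e. log h(z) ~ -z**2/2 - log(2*pi)/2 - 2*log(-z)."""
    assert z < 0.0
    return -0.5 * z * z - 0.5 * math.log(2.0 * math.pi) - 2.0 * math.log(-z)

z = -40.0  # candidate far below the incumbent, in posterior-std units
print(naive_ei(z))           # underflows to exactly 0.0: flat, no gradient signal
print(log_ei_asymptotic(z))  # finite log-value, still usable for optimization
```

Because the naive value is identically zero over large regions, its gradient is zero too, which is exactly the failure mode the log acquisition functions avoid.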
@@ -25,8 +24,8 @@ example, analytic EHVI has no known analytical gradient for when there are more
 than two objectives, but BoTorch computes analytic gradients for free via
 auto-differentiation, regardless of the number of objectives [^qEHVI].
 
-For analytic and MC-based MOBO acquisition functions like qNEHVI, qEHVI, and
-qParEGO, BoTorch leverages GPU acceleration and quasi-second order methods for
+For analytic and MC-based MOBO acquisition functions such as qLogNEHVI, qLogEHVI, and
+`qLogNParEGO`, BoTorch leverages GPU acceleration and quasi-second order methods for
 acquisition optimization for efficient computation and optimization in many
 practical scenarios [^qNEHVI][^qEHVI]. The MC-based acquisition functions
 support using the sample average approximation for rapid convergence [^BoTorch].
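The sample average approximation mentioned in the hunk above fixes the Monte Carlo base samples once, making the MC estimate a deterministic function of the candidate, which is what allows quasi-second-order optimizers to be applied. A toy pure-Python illustration of that idea (the quadratic objective is invented for this sketch; it is not BoTorch's API):

```python
import random

def mc_objective(theta: float, base_samples: list[float]) -> float:
    """Monte Carlo estimate of E[-(theta + eps)^2] with eps ~ N(0, 1),
    using *fixed* base samples (sample average approximation): the value
    is deterministic in theta, so deterministic optimizers like L-BFGS
    see a smooth, repeatable objective instead of a noisy one."""
    return sum(-(theta + e) ** 2 for e in base_samples) / len(base_samples)

rng = random.Random(0)
base = [rng.gauss(0.0, 1.0) for _ in range(64)]  # drawn once, then frozen

a = mc_objective(0.5, base)
b = mc_objective(0.5, base)
print(a == b)  # same frozen samples, so identical value on every call
```

Resampling fresh noise on each evaluation would instead give a stochastic objective, for which quasi-Newton methods are unsuitable.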
@@ -38,15 +37,8 @@ and all MC-based acquisition functions derive from
 These abstract classes easily integrate with BoTorch's standard optimization
 machinery.
 
-Additionally, qParEGO and qNParEGO are trivially implemented using an augmented
-Chebyshev scalarization as the objective with the
-[`qExpectedImprovement`](../api/acquisition.html#qexpectedimprovement)
-acquisition function or the
-[`qNoisyExpectedImprovement`](../api/acquisition.html#qnoisyexpectedimprovement)
-acquisition function, respectively. Botorch provides a
-[`get_chebyshev_scalarization`](../api/utils.html#botorch.utils.multi_objective.scalarization.get_chebyshev_scalarizationconvenience)
-convenience function for generating these scalarizations. In the batch setting,
-qParEGO and qNParEGO both use a new random scalarization for each candidate
+`qLogNParEGO` supports optimization via random scalarizations.
+In the batch setting, it uses a new random scalarization for each candidate
 [^qEHVI]. Candidates are selected in a sequential greedy fashion, each with a
 different scalarization, via the
 [`optimize_acqf_list`](../api/optim.html#botorch.optim.optimize.optimize_acqf_list)
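The random-scalarization scheme in the hunk above can be sketched in a few lines of pure Python: an augmented Chebyshev scalarization collapses the objective vector to a scalar, and each candidate in the batch gets freshly sampled weights. This is a hypothetical illustration of the idea, not `get_chebyshev_scalarization` itself, and the value `rho=0.05` is an assumption for the sketch:

```python
import random

def augmented_chebyshev(weights: list[float], rho: float = 0.05):
    """Scalarization s(y) = min_i w_i * y_i + rho * sum_i w_i * y_i
    (maximization form): the min term drives toward the weighted ideal,
    the small augmentation term breaks ties toward Pareto-optimal points."""
    def s(y: list[float]) -> float:
        weighted = [w * yi for w, yi in zip(weights, y)]
        return min(weighted) + rho * sum(weighted)
    return s

def random_weights(m: int, rng: random.Random) -> list[float]:
    # Normalize uniform draws onto the simplex: a fresh trade-off direction.
    raw = [rng.random() for _ in range(m)]
    total = sum(raw)
    return [r / total for r in raw]

rng = random.Random(0)
# One new scalarization per candidate, mirroring the sequential-greedy
# batch selection described above (each candidate optimizes its own scalar).
batch = [augmented_chebyshev(random_weights(2, rng)) for _ in range(3)]
print([round(s([1.0, 2.0]), 4) for s in batch])
```

With equal weights `[0.5, 0.5]` the scalarization of `[1.0, 2.0]` is `min(0.5, 1.0) + 0.05 * 1.5 = 0.575`, which shows how the min term dominates the value.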
@@ -65,7 +57,7 @@ and efficient box decomposition algorithms for efficiently partitioning the
 space dominated
 [`DominatedPartitioning`](../api/utils.html#botorch.utils.multi_objective.box_decompositions.dominated.DominatedPartitioning)
 or non-dominated
-[`NonDominatedPartitioning`](../api/utils.html#botorch.utils.multi_objective.box_decompositions.non_dominated.NonDominatedPartitioning)
+[`NonDominatedPartitioning`](../api/utils.html#botorch.utils.multi_objective.box_decompositions.non_dominated.NondominatedPartitioning)
 by the Pareto frontier into axis-aligned hyperrectangular boxes. For exact box
 decompositions, BoTorch uses a two-step approach similar to that in [^Yang2019],
 where (1) Algorithm 1 from [Lacour17]_ is used to find the local lower bounds
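In two objectives the box decomposition is easy to see: sorting the Pareto points by the first objective slices the dominated region into disjoint axis-aligned boxes, whose areas sum to the dominated hypervolume. A self-contained pure-Python sketch of that special case (BoTorch's partitioning classes handle the general, higher-dimensional problem):

```python
def pareto_filter(points: list[tuple]) -> list[tuple]:
    """Keep only non-dominated points (maximization): q dominates p
    if q is at least as good in every objective and differs from p."""
    return [p for p in points
            if not any(all(qi >= pi for qi, pi in zip(q, p)) and q != p
                       for q in points)]

def hypervolume_2d(points: list[tuple], ref: tuple) -> float:
    """Exact dominated hypervolume for two maximization objectives via
    a box decomposition: one axis-aligned box per Pareto point."""
    front = sorted(pareto_filter(points), key=lambda p: -p[0])
    hv, prev_y = 0.0, ref[1]
    for x, y in front:  # x descends, y ascends along the front
        hv += (x - ref[0]) * (y - prev_y)  # new strip above prev_y
        prev_y = y
    return hv

pts = [(3.0, 1.0), (2.0, 2.0), (1.0, 3.0), (1.5, 1.5)]  # last is dominated
print(hypervolume_2d(pts, ref=(0.0, 0.0)))
```

Here the three Pareto points contribute boxes of area 3, 2, and 1 relative to the reference point, for a dominated hypervolume of 6.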
@@ -77,11 +69,27 @@ Appendix F.4 in [^qEHVI] for an analysis of approximate vs exact box
 decompositions with EHVI. These box decompositions (approximate or exact) can
 also be used to efficiently compute hypervolumes.
 
+Additionally, variations on ParEGO can be trivially implemented using an
+augmented Chebyshev scalarization as the objective with an EI-type
+single-objective acquisition function such as
+[`qLogNoisyExpectedImprovement`](../api/acquisition.html#botorch.acquisition.logei.qLogNoisyExpectedImprovement).
+The
+[`get_chebyshev_scalarization`](../api/utils.html#botorch.utils.multi_objective.scalarization.get_chebyshev_scalarization)
+convenience function generates these scalarizations.
+
 [^qNEHVI]: S. Daulton, M. Balandat, and E. Bakshy. Parallel Bayesian
 Optimization of Multiple Noisy Objectives with Expected Hypervolume Improvement.
 Advances in Neural Information Processing Systems 34, 2021.
 [paper](https://arxiv.org/abs/2105.08195)
 
+[^LogEI]: S. Ament, S. Daulton, D. Eriksson, M. Balandat, and E. Bakshy.
+Unexpected Improvements to Expected Improvement for Bayesian Optimization. Advances
+in Neural Information Processing Systems 36, 2023.
+[paper](https://arxiv.org/abs/2310.20708) "Log" variants of acquisition
+functions, such as [`qLogNoisyExpectedHypervolumeImprovement`](../api/acquisition.html#botorch.acquisition.multi_objective.logei.qLogNoisyExpectedHypervolumeImprovement),
+offer improved numerics compared to older counterparts such as
+[`qNoisyExpectedHypervolumeImprovement`](../api/acquisition.html#botorch.acquisition.multi_objective.monte_carlo.qNoisyExpectedHypervolumeImprovement).
+
 [^qEHVI]: S. Daulton, M. Balandat, and E. Bakshy. Differentiable Expected Hypervolume
 Improvement for Parallel Multi-Objective Bayesian Optimization. Advances in Neural
 Information Processing Systems 33, 2020.
