Commit 4748a53

add readme and example
1 parent 3d323f8 commit 4748a53

File tree

4 files changed: +40 −124 lines


package/samplers/turbo/README.md

Lines changed: 33 additions & 92 deletions
````diff
@@ -1,127 +1,68 @@
 ---
-author: Please fill in the author name here. (e.g., John Smith)
-title: Please fill in the title of the feature here. (e.g., Gaussian-Process Expected Improvement Sampler)
-description: Please fill in the description of the feature here. (e.g., This sampler searches for each trial based on expected improvement using Gaussian process.)
-tags: [Please fill in the list of tags here. (e.g., sampler, visualization, pruner)]
-optuna_versions: ['Please fill in the list of versions of Optuna in which you have confirmed the feature works, e.g., 3.6.1.']
+author: Optuna Team
+title: TuRBOSampler
+description: This sampler performs Bayesian optimization in adaptive trust regions using Gaussian Processes.
+tags: [sampler, Bayesian optimization]
+optuna_versions: [4.5.0]
 license: MIT License
 ---

-<!--
-This is an example of the frontmatters.
-All columns must be string.
-You can omit quotes when value types are not ambiguous.
-For tags, a package placed in
-- package/samplers/ must include the tag "sampler"
-- package/visualilzation/ must include the tag "visualization"
-- package/pruners/ must include the tag "pruner"
-respectively.
-
----
-author: Optuna team
-title: My Sampler
-description: A description for My Sampler.
-tags: [sampler, 2nd tag for My Sampler, 3rd tag for My Sampler]
-optuna_versions: [3.6.1]
-license: "MIT License"
----
--->
-
-Instruction (Please remove this instruction after you carefully read)
-
-- Please read the [tutorial guide](https://optuna.github.io/optunahub/generated/recipes/001_first.html) to register your feature in OptunaHub. You can find more detailed explanation of the following contents in the tutorial.
-- Looking at [other packages' implementations](https://github.com/optuna/optunahub-registry/tree/main/package) would also be helpful.
-- **Please do not use HTML tags in the `README.md` file. Only markdown is allowed. For security reasons, the HTML tags will be removed when the package is registered on the web page.**
-
 ## Abstract

-You can provide an abstract for your package here.
-This section will help attract potential users to your package.
+TuRBOSampler implements Bayesian optimization with trust regions. It places local trust regions around the current best solutions and fits a Gaussian Process (GP) model within each region. Because the GP only has to model the objective locally, the sampler remains sample-efficient even in high-dimensional spaces, yielding accurate fits with fewer trials.

-**Example**
-
-This package provides a sampler based on Gaussian process-based Bayesian optimization. The sampler is highly sample-efficient, so it is suitable for computationally expensive optimization problems with a limited evaluation budget, such as hyperparameter optimization of machine learning algorithms.
+Please refer to the paper [Scalable Global Optimization via Local Bayesian Optimization](https://proceedings.neurips.cc/paper_files/paper/2019/file/6c990b7aca7bc7058f5e98ea909e924b-Paper.pdf) for more information.

 ## APIs

-Please provide API documentation describing how to use your package's functionalities.
-The documentation format is arbitrary, but at least the important class/function names that you implemented should be listed here.
-More users will take advantage of your package by providing detailed and helpful documentation.
-
-**Example**
-
-- `MoCmaSampler(*, search_space: dict[str, BaseDistribution] | None = None, popsize: int | None = None, seed: int | None = None)`
-  - `search_space`: A dictionary containing the search space that defines the parameter space. The keys are the parameter names and the values are [the parameter's distribution](https://optuna.readthedocs.io/en/stable/reference/distributions.html). If the search space is not provided, the sampler will infer the search space dynamically.
-    Example:
-    ```python
-    search_space = {
-        "x": optuna.distributions.FloatDistribution(-5, 5),
-        "y": optuna.distributions.FloatDistribution(-5, 5),
-    }
-    MoCmaSampler(search_space=search_space)
-    ```
-  - `popsize`: Population size of the CMA-ES algorithm. If not provided, the population size will be set based on the search space dimensionality. If you have a sufficient evaluation budget, it is recommended to increase the popsize.
-  - `seed`: Seed for random number generator.
+- `TuRBOSampler(*, n_startup_trials: int = 4, n_trust_region: int = 5, success_tolerance: int = 3, failure_tolerance: int = 5, seed: int | None = None, independent_sampler: BaseSampler | None = None, deterministic_objective: bool = False, warn_independent_sampling: bool = True)`
+  - `n_startup_trials`: Number of initial trials per trust region. Default is 2. As suggested in the original paper, consider setting this to 2\*(number of parameters).
+  - `n_trust_region`: Number of trust regions. Default is 5.
+  - `success_tolerance`: Number of consecutive successful iterations required to expand the trust region. Default is 3.
+  - `failure_tolerance`: Number of consecutive failed iterations required to shrink the trust region. Default is 5. As suggested in the original paper, consider setting this to max(5, number of parameters).
+  - `seed`: Random seed to initialize the internal random number generator. Defaults to `None` (a seed is picked randomly).
+  - `independent_sampler`: Sampler used for initial sampling (the first `n_startup_trials` trials) and for conditional parameters. Defaults to `None` (a random sampler with the same `seed` is used).
+  - `deterministic_objective`: Whether the objective function is deterministic or not. If `True`, the sampler fixes the noise variance of the surrogate model to the minimum value (slightly above 0 to ensure numerical stability). Defaults to `False`. Currently, all objectives are assumed to be deterministic if this is `True`.
+  - `warn_independent_sampling`: If `True`, a warning message is emitted when the value of a parameter is sampled using an independent sampler, meaning that no GP model is used in the sampling. Note that the parameters of the first trial in a study are always sampled via an independent sampler, so no warning message is emitted in this case.

-Note that because of the limitation of the algorithm, only non-conditional numerical parameters can be sampled by the MO-CMA-ES algorithm, and categorical and conditional parameters are handled by random search.
+Note that categorical parameters are currently unsupported, and multi-objective optimization is not available.

 ## Installation

-If you have additional dependencies, please fill in the installation guide here.
-If no additional dependencies is required, **this section can be removed**.
-
-**Example**
-
 ```shell
-$ pip install scipy torch
-```
-
-If your package has `requirements.txt`, it will be automatically uploaded to the OptunaHub, and the package dependencies will be available to install as follows.
-
-```shell
-pip install -r https://hub.optuna.org/{category}/{your_package_name}/requirements.txt
+$ pip install torch scipy
 ```

 ## Example

-Please fill in the code snippet to use the implemented feature here.
-
-**Example**
-
 ```python
 import optuna
 import optunahub


-def objective(trial):
-    x = trial.suggest_float("x", -5, 5)
-    return x**2
+def objective(trial: optuna.Trial) -> float:
+    x = trial.suggest_float("x", -5, 5)
+    y = trial.suggest_float("y", -5, 5)
+    return x**2 + y**2


-sampler = optunahub.load_module(package="samplers/gp").GPSampler()
+sampler = optunahub.load_module(package="samplers/turbo").TuRBOSampler()
 study = optuna.create_study(sampler=sampler)
-study.optimize(objective, n_trials=100)
+study.optimize(objective, n_trials=200)
+
 ```

 ## Others

-Please fill in any other information if you have here by adding child sections (###).
-If there is no additional information, **this section can be removed**.
-
-<!--
-For example, you can add sections to introduce a corresponding paper.
-
-### Reference
-Takuya Akiba, Shotaro Sano, Toshihiko Yanase, Takeru Ohta, and Masanori Koyama. 2019.
-Optuna: A Next-generation Hyperparameter Optimization Framework. In KDD.
-
 ### Bibtex
+
 ```
-@inproceedings{optuna_2019,
-  title={Optuna: A Next-generation Hyperparameter Optimization Framework},
-  author={Akiba, Takuya and Sano, Shotaro and Yanase, Toshihiko and Ohta, Takeru and Koyama, Masanori},
-  booktitle={Proceedings of the 25th {ACM} {SIGKDD} International Conference on Knowledge Discovery and Data Mining},
-  year={2019}
+@inproceedings{eriksson2019scalable,
+  title = {Scalable Global Optimization via Local {Bayesian} Optimization},
+  author = {Eriksson, David and Pearce, Michael and Gardner, Jacob and Turner, Ryan D and Poloczek, Matthias},
+  booktitle = {Advances in Neural Information Processing Systems},
+  pages = {5496--5507},
+  year = {2019},
+  url = {http://papers.nips.cc/paper/8788-scalable-global-optimization-via-local-bayesian-optimization.pdf},
 }
 ```
--->
````
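For context, here is a minimal usage sketch (not part of the committed files) showing how the constructor arguments documented in the new README could be combined with the paper-suggested heuristics; the concrete values below are illustrative assumptions, not recommendations from the package:

```python
import optuna
import optunahub


def objective(trial: optuna.Trial) -> float:
    x = trial.suggest_float("x", -5, 5)
    y = trial.suggest_float("y", -5, 5)
    return x**2 + y**2


# Two parameters, so the heuristics quoted in the README become:
# n_startup_trials = 2 * n_params, failure_tolerance = max(5, n_params).
n_params = 2
sampler = optunahub.load_module(package="samplers/turbo").TuRBOSampler(
    n_startup_trials=2 * n_params,
    failure_tolerance=max(5, n_params),
    seed=0,
)
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=200)
```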

package/samplers/turbo/__init__.py

Lines changed: 2 additions & 2 deletions
```diff
@@ -1,4 +1,4 @@
-from .sampler import TurBOSampler
+from .sampler import TuRBOSampler


-__all__ = ["TurBOSampler"]
+__all__ = ["TuRBOSampler"]
```

package/samplers/turbo/example.py

Lines changed: 2 additions & 27 deletions
```diff
@@ -1,11 +1,3 @@
-"""
-This example is only for sampler.
-You can verify your sampler code using this file as well.
-Please feel free to remove this file if necessary.
-"""
-
-from __future__ import annotations
-
 import optuna
 import optunahub

@@ -16,23 +8,6 @@ def objective(trial: optuna.Trial) -> float:
     return x**2 + y**2


-# TODO: Change package_name to test your package.
-package_name = "samplers/your_sampler"
-test_local = True
-
-if test_local:
-    # This is an example of how to load a sampler from your local optunahub-registry.
-    sampler = optunahub.load_local_module(
-        package=package_name,
-        registry_root="./package",  # Path to the root of the optunahub-registry.
-    ).YourSampler()
-else:
-    # This is an example of how to load a sampler from your fork of the optunahub-registry.
-    # Please remove repo_owner and ref arguments before submitting a pull request.
-    sampler = optunahub.load_module(
-        package=package_name, repo_owner="Your GitHub Account ID", ref="Your Git Branch Name"
-    ).YourSampler()
-
+sampler = optunahub.load_module(package="samplers/turbo").TuRBOSampler()
 study = optuna.create_study(sampler=sampler)
-study.optimize(objective, n_trials=30)
-print(study.best_trials)
+study.optimize(objective, n_trials=200)
```
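The removed template also demonstrated loading a sampler from a local checkout of the registry; as a rough sketch of the same workflow for this package (assuming the script runs from the repository root), it would look like:

```python
import optunahub

# Load TuRBOSampler from a local clone of optunahub-registry instead of the hub;
# registry_root points at the directory that contains "samplers/turbo".
sampler = optunahub.load_local_module(
    package="samplers/turbo",
    registry_root="./package",
).TuRBOSampler()
```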

package/samplers/turbo/sampler.py

Lines changed: 3 additions & 3 deletions
```diff
@@ -53,7 +53,7 @@ def _standardize_values(values: np.ndarray) -> tuple[np.ndarray, np.ndarray, np.
     return standardized_values, means, stds


-class TurBOSampler(BaseSampler):
+class TuRBOSampler(BaseSampler):
     """Sampler using Trust Region Bayesian optimization.

     Args:
@@ -119,7 +119,7 @@ def __init__(
         self._n_local_search = 10
         self._tol = 1e-4

-        # hyperparameters of TurBOSampler
+        # hyperparameters of TuRBOSampler
         self._init_length = 0.8
         self._max_length = 1.6
         self._min_length = 0.5**7
@@ -257,7 +257,7 @@ def sample_relative(
         )
         self._gprs_cache_list = gprs_list

-        # note(sawa3030): TurboSampler currently supports single-objective optimization only.
+        # note(sawa3030): TuRBOSampler currently supports single-objective optimization only.
         assert n_objectives == 1
         assert len(gprs_list) == 1
         acqf = acqf_module.LogEI(
```
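The constants visible in this hunk (`_init_length = 0.8`, `_max_length = 1.6`, `_min_length = 0.5**7`) correspond to the trust-region side length used in the TuRBO paper. As background on how they interact with the `success_tolerance`/`failure_tolerance` arguments, here is a minimal standalone sketch of the paper's expand/shrink rule; it illustrates the published algorithm and is not code from this package (the helper name is hypothetical):

```python
from __future__ import annotations

# Illustrative TuRBO trust-region length update following the paper.
INIT_LENGTH = 0.8
MAX_LENGTH = 1.6
MIN_LENGTH = 0.5**7


def update_length(
    length: float,
    n_success: int,
    n_failure: int,
    success_tolerance: int = 3,
    failure_tolerance: int = 5,
) -> tuple[float, int, int]:
    """Expand after enough consecutive successes, shrink after enough failures."""
    if n_success >= success_tolerance:
        length = min(2.0 * length, MAX_LENGTH)  # double the side length, capped
        n_success = 0
    elif n_failure >= failure_tolerance:
        length = length / 2.0  # halve the side length
        n_failure = 0
    if length < MIN_LENGTH:
        # The paper restarts the trust region from scratch once it collapses.
        length, n_success, n_failure = INIT_LENGTH, 0, 0
    return length, n_success, n_failure
```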
