---
author: Optuna Team
title: TuRBOSampler
description: This sampler performs Bayesian optimization in adaptive trust regions using Gaussian Processes
tags: [sampler, Bayesian optimization]
optuna_versions: [4.5.0]
license: MIT License
---

## Abstract

TuRBOSampler implements Bayesian optimization with trust regions. It places local trust regions around the current best solutions and fits a Gaussian Process (GP) model within each region. Operating within these adaptive local regions reduces sample complexity in high dimensions, yielding accurate surrogate fits with fewer trials.

|
41 | | -**Example** |
42 | | - |
43 | | -This package provides a sampler based on Gaussian process-based Bayesian optimization. The sampler is highly sample-efficient, so it is suitable for computationally expensive optimization problems with a limited evaluation budget, such as hyperparameter optimization of machine learning algorithms. |
| 14 | +Please refer to the paper, [Scalable Global Optimization via Local Bayesian Optimization](https://proceedings.neurips.cc/paper_files/paper/2019/file/6c990b7aca7bc7058f5e98ea909e924b-Paper.pdf) for more information. |
44 | 15 |
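The trust-region mechanism described above can be sketched as follows. This is an illustrative reimplementation of the resizing rule from the TuRBO paper, not this package's internal code; the initial side length of 0.8 and the cap of 1.6 follow the paper's defaults and are assumptions here.

```python
# Illustrative sketch of TuRBO's trust-region resizing rule: the region's
# side length doubles after a streak of "successes" (improvements over the
# incumbent) and halves after a streak of failures.

def update_length(
    length: float,
    improved: bool,
    n_success: int,
    n_failure: int,
    success_tolerance: int = 3,
    failure_tolerance: int = 5,
    length_max: float = 1.6,
) -> tuple[float, int, int]:
    """Return the new (length, success streak, failure streak)."""
    if improved:
        n_success, n_failure = n_success + 1, 0
    else:
        n_success, n_failure = 0, n_failure + 1
    if n_success >= success_tolerance:  # expand the trust region
        length, n_success = min(2.0 * length, length_max), 0
    elif n_failure >= failure_tolerance:  # shrink the trust region
        length, n_failure = length / 2.0, 0
    return length, n_success, n_failure


# Three consecutive improvements double the region (capped at length_max).
length, ns, nf = 0.8, 0, 0
for _ in range(3):
    length, ns, nf = update_length(length, True, ns, nf)
print(length)  # 1.6
```

On `success_tolerance` consecutive improvements the region doubles; on `failure_tolerance` consecutive failures it halves. In the full algorithm, a region that shrinks below a minimum side length is restarted from scratch.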
## APIs

- `TuRBOSampler(*, n_startup_trials: int = 4, n_trust_region: int = 5, success_tolerance: int = 3, failure_tolerance: int = 5, seed: int | None = None, independent_sampler: BaseSampler | None = None, deterministic_objective: bool = False, warn_independent_sampling: bool = True)`
  - `n_startup_trials`: Number of initial trials per trust region. Default is 4. As suggested in the original paper, consider setting this to 2\*(number of parameters).
  - `n_trust_region`: Number of trust regions. Default is 5.
  - `success_tolerance`: Number of consecutive successful iterations required to expand a trust region. Default is 3.
  - `failure_tolerance`: Number of consecutive failed iterations required to shrink a trust region. Default is 5. As suggested in the original paper, consider setting this to max(5, number of parameters).
  - `seed`: Seed for the internal random number generator. Defaults to `None` (a seed is picked randomly).
  - `independent_sampler`: Sampler used for initial sampling (for the first `n_startup_trials` trials) and for conditional parameters. Defaults to `None` (a random sampler with the same `seed` is used).
  - `deterministic_objective`: Whether the objective function is deterministic. If `True`, the sampler fixes the noise variance of the surrogate model to the minimum value (slightly above 0 to ensure numerical stability). Defaults to `False`. Currently, all objectives are assumed to be deterministic if `True`.
  - `warn_independent_sampling`: If `True`, a warning message is emitted when a parameter value is sampled by an independent sampler, meaning that no GP model is used for that sampling. Note that the parameters of the first trial in a study are always sampled via an independent sampler, so no warning is emitted in that case.

Note that categorical parameters are currently unsupported, and multi-objective optimization is not available.

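The paper's suggested settings from the parameter list above can be collected in a small helper. Note that `suggested_turbo_settings` is a hypothetical function for illustration, not part of this package.

```python
# Hypothetical helper (not part of the package) applying the paper's
# suggestions for a problem with n_params parameters: 2 * n_params startup
# trials per trust region and a failure tolerance of max(5, n_params).

def suggested_turbo_settings(n_params: int) -> dict[str, int]:
    return {
        "n_startup_trials": 2 * n_params,
        "failure_tolerance": max(5, n_params),
    }


print(suggested_turbo_settings(10))
# {'n_startup_trials': 20, 'failure_tolerance': 10}
```

The result can be passed directly as keyword arguments, e.g. `TuRBOSampler(**suggested_turbo_settings(10))`.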
|
68 | 30 | ## Installation |
69 | 31 |
|
```shell
$ pip install torch scipy
```

## Example

```python
import optuna
import optunahub


def objective(trial: optuna.Trial) -> float:
    x = trial.suggest_float("x", -5, 5)
    y = trial.suggest_float("y", -5, 5)
    return x**2 + y**2


sampler = optunahub.load_module(package="samplers/turbo").TuRBOSampler()
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=200)
```

## Others

### Bibtex

```
@inproceedings{eriksson2019scalable,
  title = {Scalable Global Optimization via Local {Bayesian} Optimization},
  author = {Eriksson, David and Pearce, Michael and Gardner, Jacob and Turner, Ryan D and Poloczek, Matthias},
  booktitle = {Advances in Neural Information Processing Systems},
  pages = {5496--5507},
  year = {2019},
  url = {http://papers.nips.cc/paper/8788-scalable-global-optimization-via-local-bayesian-optimization.pdf},
}
```