MO-SMAC merge #1222


Status: Open. Wants to merge 61 commits into base: development.

Changes from 60 commits.
f6ee35b
Add MO facade with todos
Jan 9, 2023
2b97fca
Add NoAggregatuonStrategy
Jan 9, 2023
556ad37
Update aggregation strategy
Jan 9, 2023
672389f
Limit value to bounds region
Jan 9, 2023
09160b7
Factor out creating a unique list
Jan 9, 2023
1359f19
More debug logging
Jan 9, 2023
733f94d
Factor out sorting of costs
Jan 9, 2023
3e015c0
Better docstring
Jan 9, 2023
171958b
Add MO acq maximizer
Jan 9, 2023
0fe8e7d
Update acq optimizer
Jan 9, 2023
0059155
Stop local search after max steps is reached
Jan 9, 2023
5b0a1bf
Abstract away population trimming and pareto front calculation
Jan 9, 2023
a0bed50
Add MO intensifier draft
Jan 9, 2023
325cb5c
Add comment
Jan 10, 2023
227ceb7
Add todos
Jan 10, 2023
c320f04
Pass rh's incumbents to acquisition function
Jan 10, 2023
67eefec
Add incumbents data structure in runhistory
Jan 10, 2023
b297a98
Add property for incumbents
Jan 10, 2023
6042bed
Add EHVI acq fun
Jan 10, 2023
a96172d
Update PHVI
Jan 10, 2023
75a2077
Add ACLib runner draft
Jan 10, 2023
4b2d101
Merge branch 'development' into mosmac
jeroenrook Feb 27, 2023
a5902d5
Native objective support
jeroenrook Mar 1, 2023
5e7d880
Fix typo
jeroenrook Mar 1, 2023
3cdf96a
Initial modifications for mo facade
jeroenrook Mar 1, 2023
087d7c8
Make the HV based acquisition functions work
jeroenrook Mar 1, 2023
1b20106
Logic fix
jeroenrook Mar 1, 2023
a057733
AClib runner
jeroenrook Mar 3, 2023
6c0bcd1
AClib runner fixes
jeroenrook Mar 3, 2023
71409ce
MO utils initial expansion
jeroenrook Mar 3, 2023
0587938
MO intensifier
jeroenrook Mar 3, 2023
d05fc42
Merge branch 'development' into mosmac
jeroenrook Mar 3, 2023
bd31d32
Expanded debugging message
jeroenrook Mar 20, 2023
4322cfb
Allow saving the intensifier when no incumbent is chosen yet.
jeroenrook Mar 20, 2023
6113c18
Bugfix for passing checks when MO model with features
jeroenrook Mar 20, 2023
8cd499f
Added support to retrain the surrogate model and acquisition loop in …
jeroenrook Mar 22, 2023
a26b7c9
Added a minimal number of configuration that need to be yielded befor…
jeroenrook Mar 28, 2023
37ae763
Remove sleep call used for testing
jeroenrook Mar 28, 2023
9b85222
Only compute Pareto fronts on the same subset of isb_keys.
jeroenrook Mar 28, 2023
8c114c0
Compute actual isb differences
jeroenrook Apr 3, 2023
2bc7383
Aclib runner
jeroenrook Apr 3, 2023
6ddc94c
Reset counter when retrain is triggered
jeroenrook Apr 3, 2023
24a749f
Comparison on one config from the incumbent
jeroenrook Apr 12, 2023
944425b
Make dask runner work
jeroenrook Apr 13, 2023
8496461
Added different intermediate update methods that can be mixed with th…
jeroenrook Apr 20, 2023
da0bb6b
Make normalization of costs in the mo setting a choice
jeroenrook Apr 26, 2023
2ca601c
In the native MO setting the EPM are trained by using the costs retri…
jeroenrook Apr 26, 2023
603182a
Generic HVI class
jeroenrook Apr 27, 2023
a109f48
Decomposed the intensifier decision logic and created mixins to easil…
jeroenrook May 2, 2023
17ce0a3
Changed the intensifier
jeroenrook May 3, 2023
fd317b0
Commit everythin
jeroenrook May 3, 2023
b50db2b
csvs
jeroenrook May 4, 2023
38b22d4
Merge remote-tracking branch 'origin/main' into mosmac
jeroenrook May 22, 2023
69d466b
README change
jeroenrook Nov 15, 2023
fdd33f6
README change
jeroenrook Nov 15, 2023
bf2a2f0
Even bigger push
jeroenrook Mar 3, 2025
1d71cf4
Merge remote-tracking branch 'origin/development' into mosmac-merge
jeroenrook Mar 27, 2025
7d7290d
Remove EHVI acquisition function
jeroenrook Mar 27, 2025
aec7609
README
jeroenrook Mar 27, 2025
cb9eab6
Fix failing tests. Disentangle normalisation and aggregation
jeroenrook Mar 27, 2025
373dc08
Fix failing pytests
jeroenrook Mar 27, 2025
12 changes: 6 additions & 6 deletions README.md
@@ -12,10 +12,10 @@ SMAC offers a robust and flexible framework for Bayesian Optimization to support
hyperparameter configurations for their (Machine Learning) algorithms, datasets and applications at hand. The main core
consists of Bayesian Optimization in combination with an aggressive racing mechanism to efficiently decide which of two configurations performs better.

-SMAC3 is written in Python3 and continuously tested with Python 3.8, 3.9, and 3.10. Its Random
+SMAC3 is written in Python3 and continuously tested with Python 3.8, 3.9, and 3.10 (and works with newer python versions). Its Random
Forest is written in C++. In further texts, SMAC is representatively mentioned for SMAC3.

-> [Documentation](https://automl.github.io/SMAC3)
+> [Documentation](https://automl.github.io/SMAC3/latest/)

> [Roadmap](https://github.com/orgs/automl/projects/5/views/2)

@@ -36,7 +36,7 @@ We hope you enjoy this new user experience as much as we do. 🚀

## Installation

-This instruction is for the installation on a Linux system, for Windows and Mac and further information see the [documentation](https://automl.github.io/SMAC3/main/1_installation.html).
+This instruction is for the installation on a Linux system, for Windows and Mac and further information see the [documentation](https://automl.github.io/SMAC3/latest/1_installation/).

Create a new environment with python 3.10 and make sure swig is installed either on your system or
inside the environment. We demonstrate the installation via anaconda in the following:
@@ -94,7 +94,7 @@ smac = HyperparameterOptimizationFacade(scenario, train)
incumbent = smac.optimize()
```

-More examples can be found in the [documentation](https://automl.github.io/SMAC3/main/examples/).
+More examples can be found in the [documentation](https://automl.github.io/SMAC3/latest/examples/1%20Basics/1_quadratic_function/).

## Visualization via DeepCAVE

@@ -123,7 +123,7 @@ For all other inquiries, please write an email to smac[at]ai[dot]uni[dash]hannov
## Miscellaneous

SMAC3 is developed by the [AutoML Groups of the Universities of Hannover and
-Freiburg](http://www.automl.org/).
+Freiburg](http://www.automl.org/). It is a featured optimizer on [AutoML Space](https://automl.space/automl-tools/).

If you have found a bug, please report to [issues](https://github.com/automl/SMAC3/issues). Moreover, we are
appreciating any kind of help. Find our guidelines for contributing to this package
@@ -144,4 +144,4 @@ If you use SMAC in one of your research projects, please cite our
}
```

-Copyright (C) 2016-2022 [AutoML Group](http://www.automl.org).
+Copyright (c) 2025, [Leibniz University Hannover - Institute of AI](https://www.ai.uni-hannover.de/)
12 changes: 7 additions & 5 deletions examples/2_multi_fidelity/1_mlp_epochs.py
@@ -80,7 +80,7 @@ def configspace(self) -> ConfigurationSpace:

return cs

-def train(self, config: Configuration, seed: int = 0, budget: int = 25) -> float:
+def train(self, config: Configuration, seed: int = 0, instance: str = "0", budget: int = 25) -> dict[str, float]:
# For deactivated parameters (by virtue of the conditions),
# the configuration stores None-values.
# This is not accepted by the MLP, so we replace them with placeholder values.
@@ -106,7 +106,7 @@ def train(self, config: Configuration, seed: int = 0, budget: int = 25) -> float
cv = StratifiedKFold(n_splits=5, random_state=seed, shuffle=True) # to make CV splits consistent
score = cross_val_score(classifier, dataset.data, dataset.target, cv=cv, error_score="raise")

-return 1 - np.mean(score)
+return {"accuracy": 1 - np.mean(score)}


def plot_trajectory(facades: list[AbstractFacade]) -> None:
@@ -147,9 +147,11 @@ def plot_trajectory(facades: list[AbstractFacade]) -> None:
mlp.configspace,
walltime_limit=60, # After 60 seconds, we stop the hyperparameter optimization
n_trials=500, # Evaluate max 500 different trials
-min_budget=1,  # Train the MLP using a hyperparameter configuration for at least 5 epochs
-max_budget=25,  # Train the MLP using a hyperparameter configuration for at most 25 epochs
-n_workers=8,
+instances=[str(i) for i in range(10)],
+objectives="accuracy",
+# min_budget=1,  # Train the MLP using a hyperparameter configuration for at least 5 epochs
+# max_budget=25,  # Train the MLP using a hyperparameter configuration for at most 25 epochs
+n_workers=4,
Review comment (Collaborator): test example

Reply (Contributor Author): Works
)

# We want to run five random configurations before starting the optimization.
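Several commits in this PR ("Abstract away population trimming and pareto front calculation", "Only compute Pareto fronts on the same subset of isb_keys") revolve around computing the Pareto front of multi-objective costs. As a rough standalone illustration of that operation (not SMAC's actual implementation, which works on runhistory entries), a non-dominated filter over a cost matrix can be sketched as:

```python
import numpy as np


def pareto_front(costs: np.ndarray) -> np.ndarray:
    """Return a boolean mask of the non-dominated rows of `costs`.

    Assumes minimization: row i is dominated if some other row is no
    worse in every objective and strictly better in at least one.
    """
    n = costs.shape[0]
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        if not mask[i]:
            continue
        # Rows that dominate row i: <= everywhere and < somewhere.
        dominated_by = np.all(costs <= costs[i], axis=1) & np.any(costs < costs[i], axis=1)
        if dominated_by.any():
            mask[i] = False
    return mask


# Example with two minimized objectives: (3, 3) is dominated by (2, 2).
costs = np.array([[1.0, 4.0], [2.0, 2.0], [3.0, 3.0], [4.0, 1.0]])
print(costs[pareto_front(costs)])
```

An MO intensifier can use such a mask to trim the incumbent population to its non-dominated members after each batch of trial results.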
4 changes: 2 additions & 2 deletions smac/acquisition/function/abstract_acquisition_function.py
@@ -50,7 +50,7 @@ def update(self, model: AbstractModel, **kwargs: Any) -> None:

This method will be called after fitting the model, but before maximizing the acquisition
function. As an examples, EI uses it to update the current fmin. The default implementation only updates the
-attributes of the acqusition function which are already present.
+attributes of the acquisition function which are already present.

Calls `_update` to update the acquisition function attributes.

@@ -65,7 +65,7 @@ def update(self, model: AbstractModel, **kwargs: Any) -> None:
self._update(**kwargs)

def _update(self, **kwargs: Any) -> None:
"""Update acsquisition function attributes
"""Update acquisition function attributes

Might be different for each child class.
"""
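The docstring corrected in this diff describes a two-step pattern: `update` only overwrites attributes that already exist on the acquisition function, then delegates to a per-subclass `_update` hook. A minimal standalone sketch of that pattern (illustrative only, not SMAC's real class hierarchy or API):

```python
from typing import Any


class AcquisitionSketch:
    """Illustrative stand-in for an acquisition function (not SMAC's API)."""

    def __init__(self) -> None:
        self._model: Any = None
        self._eta = None  # e.g. the current fmin, as EI would track

    def update(self, model: Any, **kwargs: Any) -> None:
        self._model = model
        # Only update attributes that are already present on the instance;
        # unknown keyword arguments are silently ignored.
        for key, value in kwargs.items():
            attr = f"_{key}"
            if hasattr(self, attr):
                setattr(self, attr, value)
        self._update(**kwargs)

    def _update(self, **kwargs: Any) -> None:
        """Hook for subclasses; may differ for each child class."""


acq = AcquisitionSketch()
acq.update(model="surrogate", eta=0.25, unknown=1)  # `unknown` is ignored
print(acq._eta)  # 0.25
```

Subclasses then override `_update` to refresh whatever derived state they need (for instance, an HV-based function could recompute the reference point) without re-implementing the attribute bookkeeping.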