Commits
89 commits
f6ee35b  Add MO facade with todos (Jan 9, 2023)
2b97fca  Add NoAggregatuonStrategy (Jan 9, 2023)
556ad37  Update aggregation strategy (Jan 9, 2023)
672389f  Limit value to bounds region (Jan 9, 2023)
09160b7  Factor out creating a unique list (Jan 9, 2023)
1359f19  More debug logging (Jan 9, 2023)
733f94d  Factor out sorting of costs (Jan 9, 2023)
3e015c0  Better docstring (Jan 9, 2023)
171958b  Add MO acq maximizer (Jan 9, 2023)
0fe8e7d  Update acq optimizer (Jan 9, 2023)
0059155  Stop local search after max steps is reached (Jan 9, 2023)
5b0a1bf  Abstract away population trimming and pareto front calculation (Jan 9, 2023)
a0bed50  Add MO intensifier draft (Jan 9, 2023)
325cb5c  Add comment (Jan 10, 2023)
227ceb7  Add todos (Jan 10, 2023)
c320f04  Pass rh's incumbents to acquisition function (Jan 10, 2023)
67eefec  Add incumbents data structure in runhistory (Jan 10, 2023)
b297a98  Add property for incumbents (Jan 10, 2023)
6042bed  Add EHVI acq fun (Jan 10, 2023)
a96172d  Update PHVI (Jan 10, 2023)
75a2077  Add ACLib runner draft (Jan 10, 2023)
4b2d101  Merge branch 'development' into mosmac (jeroenrook, Feb 27, 2023)
a5902d5  Native objective support (jeroenrook, Mar 1, 2023)
5e7d880  Fix typo (jeroenrook, Mar 1, 2023)
3cdf96a  Initial modifications for mo facade (jeroenrook, Mar 1, 2023)
087d7c8  Make the HV based acquisition functions work (jeroenrook, Mar 1, 2023)
1b20106  Logic fix (jeroenrook, Mar 1, 2023)
a057733  AClib runner (jeroenrook, Mar 3, 2023)
6c0bcd1  AClib runner fixes (jeroenrook, Mar 3, 2023)
71409ce  MO utils initial expansion (jeroenrook, Mar 3, 2023)
0587938  MO intensifier (jeroenrook, Mar 3, 2023)
d05fc42  Merge branch 'development' into mosmac (jeroenrook, Mar 3, 2023)
bd31d32  Expanded debugging message (jeroenrook, Mar 20, 2023)
4322cfb  Allow saving the intensifier when no incumbent is chosen yet. (jeroenrook, Mar 20, 2023)
6113c18  Bugfix for passing checks when MO model with features (jeroenrook, Mar 20, 2023)
8cd499f  Added support to retrain the surrogate model and acquisition loop in … (jeroenrook, Mar 22, 2023)
a26b7c9  Added a minimal number of configuration that need to be yielded befor… (jeroenrook, Mar 28, 2023)
37ae763  Remove sleep call used for testing (jeroenrook, Mar 28, 2023)
9b85222  Only compute Pareto fronts on the same subset of isb_keys. (jeroenrook, Mar 28, 2023)
8c114c0  Compute actual isb differences (jeroenrook, Apr 3, 2023)
2bc7383  Aclib runner (jeroenrook, Apr 3, 2023)
6ddc94c  Reset counter when retrain is triggered (jeroenrook, Apr 3, 2023)
24a749f  Comparison on one config from the incumbent (jeroenrook, Apr 12, 2023)
944425b  Make dask runner work (jeroenrook, Apr 13, 2023)
8496461  Added different intermediate update methods that can be mixed with th… (jeroenrook, Apr 20, 2023)
da0bb6b  Make normalization of costs in the mo setting a choice (jeroenrook, Apr 26, 2023)
2ca601c  In the native MO setting the EPM are trained by using the costs retri… (jeroenrook, Apr 26, 2023)
603182a  Generic HVI class (jeroenrook, Apr 27, 2023)
a109f48  Decomposed the intensifier decision logic and created mixins to easil… (jeroenrook, May 2, 2023)
17ce0a3  Changed the intensifier (jeroenrook, May 3, 2023)
fd317b0  Commit everythin (jeroenrook, May 3, 2023)
b50db2b  csvs (jeroenrook, May 4, 2023)
38b22d4  Merge remote-tracking branch 'origin/main' into mosmac (jeroenrook, May 22, 2023)
69d466b  README change (jeroenrook, Nov 15, 2023)
fdd33f6  README change (jeroenrook, Nov 15, 2023)
bf2a2f0  Even bigger push (jeroenrook, Mar 3, 2025)
1d71cf4  Merge remote-tracking branch 'origin/development' into mosmac-merge (jeroenrook, Mar 27, 2025)
7d7290d  Remove EHVI acquisition function (jeroenrook, Mar 27, 2025)
aec7609  README (jeroenrook, Mar 27, 2025)
cb9eab6  Fix failing tests. Disentangle normalisation and aggregation (jeroenrook, Mar 27, 2025)
373dc08  Fix failing pytests (jeroenrook, Mar 27, 2025)
85f822a  Merge remote-tracking branch 'automl/development' into mosmac-merge (Oct 6, 2025)
cc2762d  resolving tests (Oct 7, 2025)
c6c4b8b  intensifier fix for MF. Passes tests (Oct 7, 2025)
f390582  fix merging retrain. test passes (Oct 7, 2025)
04643e5  format: ruff (benjamc, Oct 7, 2025)
dd2ff58  build(setup.py): add dependency pygmo (benjamc, Oct 7, 2025)
5b0c318  style: pydocstyle, flake (benjamc, Oct 7, 2025)
2fab658  readd paretofront (benjamc, Oct 7, 2025)
31eda8c  refactor(pareto_front.py): delete illegal functions (benjamc, Oct 7, 2025)
cf824ae  fix some mypy (benjamc, Oct 7, 2025)
0b7e947  style: mypy (benjamc, Oct 8, 2025)
a159eb4  refactor(expected_hypervolume): rm duplicate function (benjamc, Oct 8, 2025)
51418df  refactor(expected_hypervolume): delete proxy method which was a comme… (benjamc, Oct 8, 2025)
e0e59a9  refactor(expected_hypervolume): delete ehvi method which was a commen… (benjamc, Oct 8, 2025)
538f4df  rename hypervolume.py (benjamc, Oct 8, 2025)
9f84c6d  style(hypervolume.py): fix mypy (benjamc, Oct 8, 2025)
3f06d61  style: pre-commit fix (benjamc, Oct 8, 2025)
41742c9  refactor crowding distance: optional normalization (Oct 8, 2025)
3168346  Remove develop comparisons (Oct 17, 2025)
b7b635f  Add PHVI to init (Oct 17, 2025)
2b0b4a2  PHVI test (Oct 17, 2025)
19883a5  Fix: MOLocalSearch (Oct 17, 2025)
2dee6e6  Test: MOLocalSearch (Oct 17, 2025)
b0b222d  Test: MOFacade + fixes to make facade work for MO (Nov 5, 2025)
1ab8fc2  Test: Intensifier Mixins (Nov 5, 2025)
ed5cc03  Test: Multi-objective (Nov 5, 2025)
aa43f04  Remove TODOs (Nov 5, 2025)
47e0024  Precommit (Nov 5, 2025)
102 changes: 18 additions & 84 deletions README.md
@@ -12,33 +12,12 @@

SMAC offers a robust and flexible framework for Bayesian Optimization to support users in determining well-performing hyperparameter configurations for their (Machine Learning) algorithms, datasets, and applications at hand. At its core, it combines Bayesian Optimization with an aggressive racing mechanism to efficiently decide which of two configurations performs better.
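To illustrate the racing idea (a generic sketch of the concept, not SMAC's actual intensifier code): a challenger configuration is compared against the incumbent on a growing set of instances and rejected as soon as it is clearly worse, so poor configurations are discarded cheaply.

```py
# Generic racing sketch (illustrative only; not SMAC's implementation).
import numpy as np


def challenger_survives(challenger_costs: list[float], incumbent_costs: list[float]) -> bool:
    """Compare a challenger to the incumbent on a growing prefix of instances.

    Both lists hold per-instance costs on the same instances, in the same order.
    The challenger is rejected as soon as its mean cost on the instances seen
    so far is worse than the incumbent's on those same instances.
    """
    n_seen = min(len(challenger_costs), len(incumbent_costs))
    for n in range(1, n_seen + 1):
        if np.mean(challenger_costs[:n]) > np.mean(incumbent_costs[:n]):
            return False  # clearly worse so far: stop early
    return True
```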

SMAC3 is written in Python 3 and continuously tested with Python 3.8, 3.9, and 3.10. Its random forest is written in C++. In the following, SMAC refers to SMAC3.

> [Documentation](https://automl.github.io/SMAC3)

> [Roadmap](https://github.com/orgs/automl/projects/5/views/2)


## Important: Changes in v2.0

With this major release of SMAC, we have substantially improved the user experience by reworking the APIs and the pipelining (see [changelog](CHANGELOG.md)). All facades/intensifiers now natively support multi-objective, multi-fidelity, and multi-threading. That includes an ask-and-tell interface and the ability to continue a run wherever you left off. pSMAC was removed because, when a number of workers is specified, SMAC automatically uses multi-threading to evaluate trials. While cleaning up the code base, however, we removed the command-line interface (calling a target function from a script is still supported) and runtime optimization. Also, Python 3.7 is no longer supported. If you depend on those functionalities, please keep using v1.4.

We are excited to introduce the new major release and look forward to developing new features on the new code base.
We hope you enjoy this new user experience as much as we do. 🚀
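For reference, the ask-and-tell interface mentioned above can be used roughly as in the following sketch (based on the SMAC3 v2 examples; the `TrialValue` import path is an assumption, not part of this diff):

```py
from ConfigSpace import Configuration, ConfigurationSpace

from smac import HyperparameterOptimizationFacade, Scenario
from smac.runhistory.dataset import TrialValue  # assumed import path


def train(config: Configuration, seed: int = 0) -> float:
    return (config["x"] - 2.0) ** 2


configspace = ConfigurationSpace({"x": (-5.0, 5.0)})
scenario = Scenario(configspace, deterministic=True, n_trials=50)
smac = HyperparameterOptimizationFacade(scenario, train)

for _ in range(10):
    info = smac.ask()                       # SMAC proposes the next trial
    cost = train(info.config)               # evaluate it yourself
    smac.tell(info, TrialValue(cost=cost))  # report the result back
```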

MO-SMAC is implemented directly in SMAC3. This repository is forked from the [SMAC3 repository](https://github.com/automl/SMAC3) and therefore contains references and copyright information to those authors. These do not align with the authors of MO-SMAC; the anonymity of this repository therefore remains intact.

Collaborator: update README

## Installation

These instructions are for installation on a Linux system; for Windows and Mac, as well as further information, see the [documentation](https://automl.github.io/SMAC3/main/1_installation.html).

-Create a new environment with python 3.10 and make sure swig is installed either on your system or
+Create a new environment with Python 3.10 and make sure swig is installed either on your system or
 inside the environment. We demonstrate the installation via anaconda in the following:

Create and activate environment:
@@ -52,25 +31,27 @@

Install swig:
```
conda install gxx_linux-64 gcc_linux-64 swig
```

-Install SMAC via PyPI:
-```
-pip install smac
-```
+Clone this repository and install locally:
+```
+cd SMAC3K
+pip install -e .[dev]
+```

-If you want to contribute to SMAC, use the following steps instead:
-```
-git clone https://github.com/automl/SMAC3.git && cd SMAC3
-make install-dev
-```


## Minimal Example
To use MO-SMAC, a dedicated multi-objective facade provides all functionality for multi-objective automated algorithm configuration (MO-AAC). The example below shows how to access and use this facade.

```py
 from ConfigSpace import Configuration, ConfigurationSpace

+import time
 import numpy as np
-from smac import HyperparameterOptimizationFacade, Scenario
+from smac.facade.multi_objective_facade import MultiObjectiveFacade
+from smac import Scenario
 from sklearn import datasets
 from sklearn.svm import SVC
 from sklearn.model_selection import cross_val_score

 iris = datasets.load_iris()


 def train(config: Configuration, seed: int = 0) -> float:
     classifier = SVC(C=config["C"], random_state=seed)
+    start_time = time.time()
     scores = cross_val_score(classifier, iris.data, iris.target, cv=5)
-    return 1 - np.mean(scores)
+    run_time = time.time() - start_time
+    return {"perf": 1 - np.mean(scores), "runtime": run_time}


 configspace = ConfigurationSpace({"C": (0.100, 1000.0)})

 # Scenario object specifying the optimization environment
-scenario = Scenario(configspace, deterministic=True, n_trials=200)
+scenario = Scenario(configspace,
+                    deterministic=True,
+                    n_trials=200,
+                    objectives=["perf", "runtime"])

 # Use SMAC to find the best configuration/hyperparameters
-smac = HyperparameterOptimizationFacade(scenario, train)
+smac = MultiObjectiveFacade(scenario, train)
 incumbent = smac.optimize()
```
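In the multi-objective setting, `optimize()` is expected to return a set of non-dominated incumbents rather than a single configuration. The following sketch shows one way to inspect them (assuming the facade returns a list of configurations here and that `RunHistory.average_cost` with `normalize=False` returns the per-objective cost vector):

```py
# Sketch: list the returned incumbents with their averaged objective values.
incumbents = incumbent if isinstance(incumbent, list) else [incumbent]
for config in incumbents:
    perf, runtime = smac.runhistory.average_cost(config, normalize=False)
    print(f"C={config['C']:.3f}  perf={perf:.4f}  runtime={runtime:.4f}s")
```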

More examples can be found in the [documentation](https://automl.github.io/SMAC3/main/examples/).

## Visualization via DeepCAVE

With DeepCAVE ([Repo](https://github.com/automl/DeepCAVE), [Paper](https://arxiv.org/abs/2206.03493)) you can visualize your SMAC runs. It is a visualization and analysis tool for AutoML runs, with a particular focus on hyperparameter optimization.

## License

This program is free software: you can redistribute it and/or modify
it under the terms of the 3-clause BSD license (please see the LICENSE file).

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

You should have received a copy of the 3-clause BSD license
along with this program (see LICENSE file).
If not, see [here](https://opensource.org/licenses/BSD-3-Clause).

## Contacting us

If you have trouble using SMAC, have a concrete question, or found a bug, please create an [issue](https://github.com/automl/SMAC3/issues). This is the easiest way to communicate with us about these things.

For all other inquiries, please write an email to smac[at]ai[dot]uni[dash]hannover[dot]de.

## Miscellaneous

SMAC3 is developed by the [AutoML Groups of the Universities of Hannover and
Freiburg](http://www.automl.org/).

If you have found a bug, please report it to [issues](https://github.com/automl/SMAC3/issues). Moreover, we appreciate any kind of help. Find our guidelines for contributing to this package [here](CONTRIBUTING.md).

If you use SMAC in one of your research projects, please cite our
[JMLR paper](https://jmlr.org/papers/v23/21-0888.html):
```
@article{JMLR:v23:21-0888,
author = {Marius Lindauer and Katharina Eggensperger and Matthias Feurer and André Biedenkapp and Difan Deng and Carolin Benjamins and Tim Ruhkopf and René Sass and Frank Hutter},
title = {SMAC3: A Versatile Bayesian Optimization Package for Hyperparameter Optimization},
journal = {Journal of Machine Learning Research},
year = {2022},
volume = {23},
number = {54},
pages = {1--9},
url = {http://jmlr.org/papers/v23/21-0888.html}
}
```

Copyright (C) 2016-2022 [AutoML Group](http://www.automl.org).
12 changes: 7 additions & 5 deletions examples/2_multi_fidelity/1_mlp_epochs.py
@@ -80,7 +80,7 @@ def configspace(self) -> ConfigurationSpace:

return cs

-    def train(self, config: Configuration, seed: int = 0, budget: int = 25) -> float:
+    def train(self, config: Configuration, seed: int = 0, instance: str = "0", budget: int = 25) -> dict[str, float]:
# For deactivated parameters (by virtue of the conditions),
# the configuration stores None-values.
# This is not accepted by the MLP, so we replace them with placeholder values.
@@ -106,7 +106,7 @@ def train(self, config: Configuration, seed: int = 0, budget: int = 25) -> float
cv = StratifiedKFold(n_splits=5, random_state=seed, shuffle=True) # to make CV splits consistent
score = cross_val_score(classifier, dataset.data, dataset.target, cv=cv, error_score="raise")

-        return 1 - np.mean(score)
+        return {"accuracy": 1 - np.mean(score)}


def plot_trajectory(facades: list[AbstractFacade]) -> None:
@@ -147,9 +147,11 @@ def plot_trajectory(facades: list[AbstractFacade]) -> None:
     mlp.configspace,
     walltime_limit=60,  # After 60 seconds, we stop the hyperparameter optimization
     n_trials=500,  # Evaluate max 500 different trials
-    min_budget=1,  # Train the MLP using a hyperparameter configuration for at least 5 epochs
-    max_budget=25,  # Train the MLP using a hyperparameter configuration for at most 25 epochs
-    n_workers=8,
+    instances=[str(i) for i in range(10)],
+    objectives="accuracy",
+    # min_budget=1,  # Train the MLP using a hyperparameter configuration for at least 5 epochs
+    # max_budget=25,  # Train the MLP using a hyperparameter configuration for at most 25 epochs
+    n_workers=4,
Collaborator: test example
Contributor (author): Works
)

# We want to run five random configurations before starting the optimization.
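The five random configurations mentioned in the comment above are typically configured through the facade's initial design, roughly as in this sketch (the `get_initial_design` call follows the SMAC3 v2 examples and is an assumption here, as are the surrounding example's imports and the `mlp` object):

```py
# Sketch: request five random initial configurations before BO starts.
initial_design = MultiFidelityFacade.get_initial_design(scenario, n_configs=5)
smac = MultiFidelityFacade(scenario, mlp.train, initial_design=initial_design)
```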
4 changes: 2 additions & 2 deletions smac/acquisition/function/abstract_acquisition_function.py
@@ -50,7 +50,7 @@ def update(self, model: AbstractModel, **kwargs: Any) -> None:

         This method will be called after fitting the model, but before maximizing the acquisition
         function. As an example, EI uses it to update the current fmin. The default implementation only updates the
-        attributes of the acqusition function which are already present.
+        attributes of the acquisition function which are already present.

Calls `_update` to update the acquisition function attributes.

@@ -65,7 +65,7 @@ def update(self, model: AbstractModel, **kwargs: Any) -> None:
         self._update(**kwargs)

     def _update(self, **kwargs: Any) -> None:
-        """Update acsquisition function attributes
+        """Update acquisition function attributes

         Might be different for each child class.
         """