Commit

docs: add link to new book chapter about internal tuning
be-marc committed Nov 7, 2024
1 parent 27e5fd6 commit 4dc1160
Showing 2 changed files with 47 additions and 53 deletions.
README.Rmd (4 changes: 1 addition & 3 deletions)
@@ -50,13 +50,11 @@ There are several sections about hyperparameter optimization in the [mlr3book](h
* Learn about [tuning spaces](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-defining-search-spaces).
* Estimate the model performance with [nested resampling](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-nested-resampling).
* Learn about [multi-objective optimization](https://mlr3book.mlr-org.com/chapters/chapter5/advanced_tuning_methods_and_black_box_optimization.html#sec-multi-metrics-tuning).
+* Simultaneously optimize hyperparameters and use [early stopping](https://mlr3book.mlr-org.com/chapters/chapter15/predsets_valid_inttune.html) with XGBoost.

The [gallery](https://mlr-org.com/gallery-all-optimization.html) features a collection of case studies and demos about optimization.

* Learn more advanced methods with the [Practical Tuning Series](https://mlr-org.com/gallery/series/2021-03-09-practical-tuning-series-tune-a-support-vector-machine/).
-* Optimize an rpart classification tree with only a [few lines of code](https://mlr-org.com/gallery/optimization/2022-11-10-hyperparameter-optimization-on-the-palmer-penguins/).
-* Simultaneously optimize hyperparameters and use [early stopping](https://mlr-org.com/gallery/optimization/2022-11-04-early-stopping-with-xgboost/) with XGBoost.
-* Make use of proven [search spaces](https://mlr-org.com/gallery/optimization/2021-07-06-introduction-to-mlr3tuningspaces/).
* Learn about [hotstarting](https://mlr-org.com/gallery/optimization/2023-01-16-hotstart/) models.
* Run the [default hyperparameter configuration](https://mlr-org.com/gallery/optimization/2023-01-31-default-configuration/) of learners as a baseline.
* Use the [Hyperband](https://mlr-org.com/gallery/series/2023-01-15-hyperband-xgboost/) optimizer with different budget parameters.
README.md (96 changes: 46 additions & 50 deletions)
@@ -34,53 +34,49 @@ The package is built on the optimization framework

mlr3tuning is extended by the following packages.

-- [mlr3tuningspaces](https://github.com/mlr-org/mlr3tuningspaces) is a
-  collection of search spaces from scientific articles for commonly
-  used learners.
-- [mlr3hyperband](https://github.com/mlr-org/mlr3hyperband) adds the
-  Hyperband and Successive Halving algorithms.
-- [mlr3mbo](https://github.com/mlr-org/mlr3mbo) adds Bayesian
-  Optimization methods.
+- [mlr3tuningspaces](https://github.com/mlr-org/mlr3tuningspaces) is a
+  collection of search spaces from scientific articles for commonly used
+  learners.
+- [mlr3hyperband](https://github.com/mlr-org/mlr3hyperband) adds the
+  Hyperband and Successive Halving algorithms.
+- [mlr3mbo](https://github.com/mlr-org/mlr3mbo) adds Bayesian
+  Optimization methods.
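
To make the extension list above concrete, here is a minimal sketch of pulling a published search space with mlr3tuningspaces; the `"classif.svm.default"` id and the `lts()`/`get_learner()` calls are assumptions based on that package's documented interface, not part of this commit.

``` r
library(mlr3tuningspaces)

# Retrieve a search space published for SVMs (id assumed from the
# mlr3tuningspaces collection).
tuning_space = lts("classif.svm.default")

# Attach the search space to the corresponding learner: the returned
# learner carries tune tokens for its hyperparameters.
learner = tuning_space$get_learner()
learner$param_set$values
```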

## Resources

There are several sections about hyperparameter optimization in the
[mlr3book](https://mlr3book.mlr-org.com).

-- Getting started with [hyperparameter
-  optimization](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html).
-- An overview of all tuners can be found on our
-  [website](https://mlr-org.com/tuners.html).
-- [Tune](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-model-tuning)
-  a support vector machine on the Sonar data set.
-- Learn about [tuning
-  spaces](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-defining-search-spaces).
-- Estimate the model performance with [nested
-  resampling](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-nested-resampling).
-- Learn about [multi-objective
-  optimization](https://mlr3book.mlr-org.com/chapters/chapter5/advanced_tuning_methods_and_black_box_optimization.html#sec-multi-metrics-tuning).
+- Getting started with [hyperparameter
+  optimization](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html).
+- An overview of all tuners can be found on our
+  [website](https://mlr-org.com/tuners.html).
+- [Tune](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-model-tuning)
+  a support vector machine on the Sonar data set.
+- Learn about [tuning
+  spaces](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-defining-search-spaces).
+- Estimate the model performance with [nested
+  resampling](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-nested-resampling).
+- Learn about [multi-objective
+  optimization](https://mlr3book.mlr-org.com/chapters/chapter5/advanced_tuning_methods_and_black_box_optimization.html#sec-multi-metrics-tuning).
+- Simultaneously optimize hyperparameters and use [early
+  stopping](https://mlr3book.mlr-org.com/chapters/chapter15/predsets_valid_inttune.html)
+  with XGBoost.
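
The early-stopping entry above is the chapter this commit links to. As a rough sketch of the pattern that chapter covers (the `to_tune(upper = 500, internal = TRUE)` budget and the learner's `validate` field are assumptions based on mlr3's internal-tuning interface, not code from this repository):

``` r
library(mlr3verse)

# Tune eta while XGBoost's own early stopping picks nrounds internally.
learner = lrn("classif.xgboost",
  eta = to_tune(1e-4, 1, logscale = TRUE),
  nrounds = to_tune(upper = 500, internal = TRUE),
  early_stopping_rounds = 10,
  validate = "test" # validate on the resampling test split
)

instance = tune(
  tuner = tnr("random_search"),
  task = tsk("sonar"),
  learner = learner,
  resampling = rsmp("cv", folds = 3),
  measure = msr("classif.ce"),
  term_evals = 10
)
```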

The [gallery](https://mlr-org.com/gallery-all-optimization.html)
features a collection of case studies and demos about optimization.

-- Learn more advanced methods with the [Practical Tuning
-  Series](https://mlr-org.com/gallery/series/2021-03-09-practical-tuning-series-tune-a-support-vector-machine/).
-- Optimize an rpart classification tree with only a [few lines of
-  code](https://mlr-org.com/gallery/optimization/2022-11-10-hyperparameter-optimization-on-the-palmer-penguins/).
-- Simultaneously optimize hyperparameters and use [early
-  stopping](https://mlr-org.com/gallery/optimization/2022-11-04-early-stopping-with-xgboost/)
-  with XGBoost.
-- Make use of proven [search
-  spaces](https://mlr-org.com/gallery/optimization/2021-07-06-introduction-to-mlr3tuningspaces/).
-- Learn about
-  [hotstarting](https://mlr-org.com/gallery/optimization/2023-01-16-hotstart/)
-  models.
-- Run the [default hyperparameter
-  configuration](https://mlr-org.com/gallery/optimization/2023-01-31-default-configuration/)
-  of learners as a baseline.
-- Use the
-  [Hyperband](https://mlr-org.com/gallery/series/2023-01-15-hyperband-xgboost/)
-  optimizer with different budget parameters.
+- Learn more advanced methods with the [Practical Tuning
+  Series](https://mlr-org.com/gallery/series/2021-03-09-practical-tuning-series-tune-a-support-vector-machine/).
+- Learn about
+  [hotstarting](https://mlr-org.com/gallery/optimization/2023-01-16-hotstart/)
+  models.
+- Run the [default hyperparameter
+  configuration](https://mlr-org.com/gallery/optimization/2023-01-31-default-configuration/)
+  of learners as a baseline.
+- Use the
+  [Hyperband](https://mlr-org.com/gallery/series/2023-01-15-hyperband-xgboost/)
+  optimizer with different budget parameters.
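
As a sketch of the Hyperband bullet above: mlr3hyperband expects one hyperparameter to be tagged as the budget, which the optimizer varies across its brackets. The `p_int(..., tags = "budget")` pattern below is assumed from mlr3hyperband's documentation rather than taken from this commit.

``` r
library(mlr3verse) # attaches mlr3tuning, mlr3hyperband and paradox

# nrounds acts as the budget: Hyperband races configurations at small
# budgets and promotes only the strongest to larger ones.
learner = lrn("classif.xgboost",
  nrounds   = to_tune(p_int(27, 243, tags = "budget")),
  eta       = to_tune(1e-4, 1, logscale = TRUE),
  max_depth = to_tune(1, 20)
)

instance = tune(
  tuner = tnr("hyperband", eta = 3),
  task = tsk("sonar"),
  learner = learner,
  resampling = rsmp("holdout"),
  measure = msr("classif.ce")
)
```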

The [cheatsheet](https://cheatsheets.mlr-org.com/mlr3tuning.pdf)
summarizes the most important functions of mlr3tuning.
@@ -161,7 +157,7 @@ tuner$optimize(instance)
```

## cost gamma learner_param_vals x_domain classif.ce
-## 1: 5.756463 -5.756463 <list[4]> <list[2]> 0.2063492
+## 1: 5.756463 -5.756463 <list[4]> <list[2]> 0.1828847

The tuner returns the best hyperparameter configuration and the
corresponding measured performance.
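
One way to act on that result, assuming the `result_learner_param_vals` field of the tuning-instance API; the final fit mirrors the usual mlr3 workflow:

``` r
# Transfer the tuned values to a fresh learner and fit the final model.
learner = lrn("classif.svm")
learner$param_set$values = instance$result_learner_param_vals
learner$train(tsk("sonar"))
```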
@@ -172,18 +168,18 @@ The archive contains all evaluated hyperparameter configurations.
as.data.table(instance$archive)[, .(cost, gamma, classif.ce, batch_nr, resample_result)]
```

-##           cost      gamma classif.ce batch_nr  resample_result
-##  1:   5.756463   0.000000  0.4661146        1 <ResampleResult>
-##  2:  11.512925   0.000000  0.4661146        2 <ResampleResult>
-##  3: -11.512925   5.756463  0.4661146        3 <ResampleResult>
-##  4: -11.512925   0.000000  0.4661146        4 <ResampleResult>
-##  5:  -5.756463  11.512925  0.4661146        5 <ResampleResult>
-## ---
-## 21:  11.512925   5.756463  0.4661146       21 <ResampleResult>
-## 22: -11.512925  11.512925  0.4661146       22 <ResampleResult>
-## 23:   0.000000   5.756463  0.4661146       23 <ResampleResult>
-## 24:   5.756463  11.512925  0.4661146       24 <ResampleResult>
-## 25: -11.512925  -5.756463  0.4661146       25 <ResampleResult>
+##           cost      gamma classif.ce batch_nr  resample_result
+##  1:  -5.756463   5.756463  0.4663216        1 <ResampleResult>
+##  2:   5.756463  -5.756463  0.1828847        2 <ResampleResult>
+##  3:  11.512925   5.756463  0.4663216        3 <ResampleResult>
+##  4:   5.756463  11.512925  0.4663216        4 <ResampleResult>
+##  5: -11.512925 -11.512925  0.4663216        5 <ResampleResult>
+## ---
+## 21:  -5.756463  -5.756463  0.4663216       21 <ResampleResult>
+## 22:  11.512925  11.512925  0.4663216       22 <ResampleResult>
+## 23: -11.512925  11.512925  0.4663216       23 <ResampleResult>
+## 24:  11.512925  -5.756463  0.1828847       24 <ResampleResult>
+## 25:   0.000000  -5.756463  0.2402346       25 <ResampleResult>
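
Because each archive row keeps its `ResampleResult`, the best evaluation can be pulled out directly; a small sketch using the columns printed above:

``` r
archive = as.data.table(instance$archive)

# Row with the lowest classification error.
best = archive[which.min(classif.ce)]

# The full resampling of that evaluation is stored alongside the scores.
best$resample_result[[1]]
```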

The [mlr3viz](https://mlr3viz.mlr-org.com/) package visualizes tuning
results.
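
For example, a hedged sketch of such a call; the `type` values are assumed from mlr3viz's autoplot methods for tuning instances:

``` r
library(mlr3viz)

# Marginal effect of each hyperparameter on the classification error.
autoplot(instance, type = "parameter")

# Performance surface over the two-dimensional search space.
autoplot(instance, type = "surface")
```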
