From 4dc11607b6df79429bb957fce4c11b224e283099 Mon Sep 17 00:00:00 2001
From: be-marc
Date: Thu, 7 Nov 2024 20:38:26 +0100
Subject: [PATCH] docs: add link to new book chapter about internal tuning

---
 README.Rmd |  4 +--
 README.md  | 96 ++++++++++++++++++++++++++----------------------------
 2 files changed, 47 insertions(+), 53 deletions(-)

diff --git a/README.Rmd b/README.Rmd
index b92f658b..7836bd1a 100644
--- a/README.Rmd
+++ b/README.Rmd
@@ -50,13 +50,11 @@ There are several sections about hyperparameter optimization in the [mlr3book](h
 * Learn about [tuning spaces](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-defining-search-spaces).
 * Estimate the model performance with [nested resampling](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-nested-resampling).
 * Learn about [multi-objective optimization](https://mlr3book.mlr-org.com/chapters/chapter5/advanced_tuning_methods_and_black_box_optimization.html#sec-multi-metrics-tuning).
+* Simultaneously optimize hyperparameters and use [early stopping](https://mlr3book.mlr-org.com/chapters/chapter15/predsets_valid_inttune.html) with XGBoost.
 
 The [gallery](https://mlr-org.com/gallery-all-optimization.html) features a collection of case studies and demos about optimization.
 
 * Learn more advanced methods with the [Practical Tuning Series](https://mlr-org.com/gallery/series/2021-03-09-practical-tuning-series-tune-a-support-vector-machine/).
-* Optimize an rpart classification tree with only a [few lines of code](https://mlr-org.com/gallery/optimization/2022-11-10-hyperparameter-optimization-on-the-palmer-penguins/).
-* Simultaneously optimize hyperparameters and use [early stopping](https://mlr-org.com/gallery/optimization/2022-11-04-early-stopping-with-xgboost/) with XGBoost.
-* Make us of proven [search space](https://mlr-org.com/gallery/optimization/2021-07-06-introduction-to-mlr3tuningspaces/).
 * Learn about [hotstarting](https://mlr-org.com/gallery/optimization/2023-01-16-hotstart/) models.
 * Run the [default hyperparameter configuration](https://mlr-org.com/gallery/optimization/2023-01-31-default-configuration/) of learners as a baseline.
 * Use the [Hyperband](https://mlr-org.com/gallery/series/2023-01-15-hyperband-xgboost/) optimizer with different budget parameters.
diff --git a/README.md b/README.md
index 4798bd92..51bdd02f 100644
--- a/README.md
+++ b/README.md
@@ -34,53 +34,49 @@ The package is built on the optimization framework
 
 mlr3tuning is extended by the following packages.
 
-- [mlr3tuningspaces](https://github.com/mlr-org/mlr3tuningspaces) is a
-  collection of search spaces from scientific articles for commonly
-  used learners.
-- [mlr3hyperband](https://github.com/mlr-org/mlr3hyperband) adds the
-  Hyperband and Successive Halving algorithm.
-- [mlr3mbo](https://github.com/mlr-org/mlr3mbo) adds Bayesian
-  Optimization methods.
+- [mlr3tuningspaces](https://github.com/mlr-org/mlr3tuningspaces) is a
+  collection of search spaces from scientific articles for commonly used
+  learners.
+- [mlr3hyperband](https://github.com/mlr-org/mlr3hyperband) adds the
+  Hyperband and Successive Halving algorithm.
+- [mlr3mbo](https://github.com/mlr-org/mlr3mbo) adds Bayesian
+  Optimization methods.
 
 ## Resources
 
 There are several sections about hyperparameter optimization in the
 [mlr3book](https://mlr3book.mlr-org.com).
 
-- Getting started with [hyperparameter
-  optimization](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html).
-- An overview of all tuners can be found on our
-  [website](https://mlr-org.com/tuners.html).
-- [Tune](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-model-tuning)
-  a support vector machine on the Sonar data set.
-- Learn about [tuning
-  spaces](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-defining-search-spaces).
-- Estimate the model performance with [nested
-  resampling](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-nested-resampling).
-- Learn about [multi-objective
-  optimization](https://mlr3book.mlr-org.com/chapters/chapter5/advanced_tuning_methods_and_black_box_optimization.html#sec-multi-metrics-tuning).
+- Getting started with [hyperparameter
+  optimization](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html).
+- An overview of all tuners can be found on our
+  [website](https://mlr-org.com/tuners.html).
+- [Tune](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-model-tuning)
+  a support vector machine on the Sonar data set.
+- Learn about [tuning
+  spaces](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-defining-search-spaces).
+- Estimate the model performance with [nested
+  resampling](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-nested-resampling).
+- Learn about [multi-objective
+  optimization](https://mlr3book.mlr-org.com/chapters/chapter5/advanced_tuning_methods_and_black_box_optimization.html#sec-multi-metrics-tuning).
+- Simultaneously optimize hyperparameters and use [early
+  stopping](https://mlr3book.mlr-org.com/chapters/chapter15/predsets_valid_inttune.html)
+  with XGBoost.
 
 The [gallery](https://mlr-org.com/gallery-all-optimization.html)
 features a collection of case studies and demos about optimization.
 
-- Learn more advanced methods with the [Practical Tuning
-  Series](https://mlr-org.com/gallery/series/2021-03-09-practical-tuning-series-tune-a-support-vector-machine/).
-- Optimize an rpart classification tree with only a [few lines of
-  code](https://mlr-org.com/gallery/optimization/2022-11-10-hyperparameter-optimization-on-the-palmer-penguins/).
-- Simultaneously optimize hyperparameters and use [early
-  stopping](https://mlr-org.com/gallery/optimization/2022-11-04-early-stopping-with-xgboost/)
-  with XGBoost.
-- Make us of proven [search
-  space](https://mlr-org.com/gallery/optimization/2021-07-06-introduction-to-mlr3tuningspaces/).
-- Learn about
-  [hotstarting](https://mlr-org.com/gallery/optimization/2023-01-16-hotstart/)
-  models.
-- Run the [default hyperparameter
-  configuration](https://mlr-org.com/gallery/optimization/2023-01-31-default-configuration/)
-  of learners as a baseline.
-- Use the
-  [Hyperband](https://mlr-org.com/gallery/series/2023-01-15-hyperband-xgboost/)
-  optimizer with different budget parameters.
+- Learn more advanced methods with the [Practical Tuning
+  Series](https://mlr-org.com/gallery/series/2021-03-09-practical-tuning-series-tune-a-support-vector-machine/).
+- Learn about
+  [hotstarting](https://mlr-org.com/gallery/optimization/2023-01-16-hotstart/)
+  models.
+- Run the [default hyperparameter
+  configuration](https://mlr-org.com/gallery/optimization/2023-01-31-default-configuration/)
+  of learners as a baseline.
+- Use the
+  [Hyperband](https://mlr-org.com/gallery/series/2023-01-15-hyperband-xgboost/)
+  optimizer with different budget parameters.
 The [cheatsheet](https://cheatsheets.mlr-org.com/mlr3tuning.pdf)
 summarizes the most important functions of mlr3tuning.
@@ -161,7 +157,7 @@ tuner$optimize(instance)
 ```
 
     ##        cost     gamma learner_param_vals  x_domain classif.ce
-    ## 1: 5.756463 -5.756463          <list[4]> <list[2]>  0.2063492
+    ## 1: 5.756463 -5.756463          <list[4]> <list[2]>  0.1828847
 
 The tuner returns the best hyperparameter configuration and the
 corresponding measured performance.
@@ -172,18 +168,18 @@ The archive contains all evaluated hyperparameter configurations.
 as.data.table(instance$archive)[, .(cost, gamma, classif.ce, batch_nr, resample_result)]
 ```
 
-    ##           cost     gamma classif.ce batch_nr  resample_result
-    ##  1:   5.756463  0.000000  0.4661146        1 <ResampleResult>
-    ##  2:  11.512925  0.000000  0.4661146        2 <ResampleResult>
-    ##  3: -11.512925  5.756463  0.4661146        3 <ResampleResult>
-    ##  4: -11.512925  0.000000  0.4661146        4 <ResampleResult>
-    ##  5:  -5.756463 11.512925  0.4661146        5 <ResampleResult>
-    ## ---
-    ## 21:  11.512925  5.756463  0.4661146       21 <ResampleResult>
-    ## 22: -11.512925 11.512925  0.4661146       22 <ResampleResult>
-    ## 23:   0.000000  5.756463  0.4661146       23 <ResampleResult>
-    ## 24:   5.756463 11.512925  0.4661146       24 <ResampleResult>
-    ## 25: -11.512925 -5.756463  0.4661146       25 <ResampleResult>
+    ##           cost      gamma classif.ce batch_nr  resample_result
+    ##  1:  -5.756463   5.756463  0.4663216        1 <ResampleResult>
+    ##  2:   5.756463  -5.756463  0.1828847        2 <ResampleResult>
+    ##  3:  11.512925   5.756463  0.4663216        3 <ResampleResult>
+    ##  4:   5.756463  11.512925  0.4663216        4 <ResampleResult>
+    ##  5: -11.512925 -11.512925  0.4663216        5 <ResampleResult>
+    ##  ---
+    ## 21:  -5.756463  -5.756463  0.4663216       21 <ResampleResult>
+    ## 22:  11.512925  11.512925  0.4663216       22 <ResampleResult>
+    ## 23: -11.512925  11.512925  0.4663216       23 <ResampleResult>
+    ## 24:  11.512925  -5.756463  0.1828847       24 <ResampleResult>
+    ## 25:   0.000000  -5.756463  0.2402346       25 <ResampleResult>
 
 The [mlr3viz](https://mlr3viz.mlr-org.com/) package visualizes tuning
 results.
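For reviewers: the workflow behind the newly linked chapter (internal tuning, i.e. tuning regular hyperparameters with a tuner while XGBoost's early stopping picks `nrounds` on validation data) looks roughly like the sketch below. This is a minimal, untested illustration assuming a recent mlr3/mlr3tuning with internal-tuning support and mlr3learners for the XGBoost learner; it is not part of the patch and the linked chapter remains the authoritative reference.

``` r
library(mlr3tuning)
library(mlr3learners)

# eta is optimized by the tuner; nrounds is tuned internally by
# XGBoost's early stopping, so it is excluded from the search space.
learner = lrn("classif.xgboost",
  eta = to_tune(1e-4, 1, logscale = TRUE),
  nrounds = to_tune(upper = 500, internal = TRUE),
  early_stopping_rounds = 10
)

# use the test folds of the resampling as validation data
set_validate(learner, validate = "test")

instance = tune(
  tuner = tnr("random_search"),
  task = tsk("sonar"),
  learner = learner,
  resampling = rsmp("cv", folds = 3),
  measures = msr("classif.ce"),
  term_evals = 10
)

# best configuration, including the internally tuned nrounds
instance$result_learner_param_vals
```

The point of the design, as the chapter explains, is that `nrounds` drops out of the tuner's search space: each candidate configuration is trained once, and a near-optimal boosting iteration count falls out of early stopping for free.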