Commit 69d2faf
Merge branch 'main' into callback
be-marc committed Dec 30, 2024
2 parents 7165473 + ddefaee
Showing 9 changed files with 21 additions and 14 deletions.
6 changes: 3 additions & 3 deletions DESCRIPTION
@@ -1,6 +1,6 @@
 Package: mlr3tuning
 Title: Hyperparameter Optimization for 'mlr3'
-Version: 1.2.1.9000
+Version: 1.3.0.9000
 Authors@R: c(
 person("Marc", "Becker", , "[email protected]", role = c("cre", "aut"),
 comment = c(ORCID = "0000-0002-8115-0400")),
@@ -29,7 +29,7 @@ Depends:
 paradox (>= 1.0.1),
 R (>= 3.1.0)
 Imports:
-bbotk (>= 1.4.1),
+bbotk (>= 1.5.0),
 checkmate (>= 2.0.0),
 data.table,
 lgr,
@@ -39,7 +39,7 @@ Suggests:
 adagio,
 future,
 GenSA,
-irace (>= 4.0.0),
+irace (>= 4.1.0),
 knitr,
 mlflow,
 mlr3learners (>= 0.7.0),
5 changes: 4 additions & 1 deletion NEWS.md
@@ -1,6 +1,9 @@
 # mlr3tuning (development version)
 
-perf: save models on worker only when requested in `ObjectiveTuningAsync`.
+# mlr3tuning 1.3.0
+
+* feat: Save `ArchiveAsyncTuning` to a `data.table` with `ArchiveAsyncTuningFrozen`.
+* perf: Save models on worker only when requested in `ObjectiveTuningAsync`.
 
 # mlr3tuning 1.2.1
 
2 changes: 1 addition & 1 deletion R/ArchiveAsyncTuningFrozen.R
@@ -8,7 +8,7 @@
 #'
 #' @section S3 Methods:
 #' * `as.data.table(archive)`\cr
-#' [ArchiveAsync] -> [data.table::data.table()]\cr
+#' [ArchiveAsyncTuningFrozen] -> [data.table::data.table()]\cr
 #' Returns a tabular view of all performed function calls of the Objective.
 #' The `x_domain` column is unnested to separate columns.
 #'
4 changes: 3 additions & 1 deletion R/TunerBatchIrace.R
@@ -53,6 +53,8 @@
 #'
 #' # load learner and set search space
 #' learner = lrn("classif.rpart", cp = to_tune(1e-04, 1e-1, logscale = TRUE))
 #'
+#' # runtime of the example is too long
+#' \donttest{
 #' # hyperparameter tuning on the pima indians diabetes data set
 #' instance = tune(
@@ -61,7 +63,7 @@
 #' learner = learner,
 #' resampling = rsmp("holdout"),
 #' measure = msr("classif.ce"),
-#' term_evals = 42
+#' term_evals = 200
 #' )
 #'
 #' # best performing hyperparameter configuration
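The example change above raises the evaluation budget from 42 to 200 and wraps the call in \donttest{}: irace typically needs a larger budget than other tuners before it can race configurations, so the example runs too long for routine checks. A minimal sketch of the full tuning call, with the pima task assumed for illustration (the diff truncates the roxygen example, so task id and call structure here are assumptions, not part of this commit):

library(mlr3)
library(mlr3tuning)

# load learner and set search space
learner = lrn("classif.rpart", cp = to_tune(1e-04, 1e-1, logscale = TRUE))

# hyperparameter tuning on the pima indians diabetes data set
# (budget mirrors the roxygen example; runtime of this sketch is long)
instance = tune(
  tuner = tnr("irace"),
  task = tsk("pima"),
  learner = learner,
  resampling = rsmp("holdout"),
  measure = msr("classif.ce"),
  term_evals = 200
)

# best performing hyperparameter configuration
instance$result
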
2 changes: 1 addition & 1 deletion R/mlr_callbacks.R
@@ -433,7 +433,7 @@ load_callback_one_se_rule = function() {
 #' @name mlr3tuning.async_freeze_archive
 #'
 #' @description
-#' This [CallbackAsync] freezes the [ArchiveAsync] to [ArchiveAsyncFrozen] after the optimization has finished.
+#' This [CallbackAsyncTuning] freezes the [ArchiveAsyncTuning] to [ArchiveAsyncTuningFrozen] after the optimization has finished.
 #'
 #' @examples
 #' clbk("mlr3tuning.async_freeze_archive")
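With the corrected cross-references, the intended workflow is: run asynchronous tuning with the mlr3tuning.async_freeze_archive callback attached; after optimization the rush-backed ArchiveAsyncTuning is frozen into an ArchiveAsyncTuningFrozen, which the as.data.table() method documented above turns into a plain table. A minimal sketch of that workflow, assuming a configured rush/Redis backend and an illustrative task, learner, and budget (none of which are part of this commit):

library(mlr3)
library(mlr3tuning)

# assumption: a rush plan with a running Redis instance is already set up,
# e.g. rush::rush_plan(n_workers = 2)
instance = ti_async(
  task = tsk("pima"),
  learner = lrn("classif.rpart", cp = to_tune(1e-04, 1e-1, logscale = TRUE)),
  resampling = rsmp("holdout"),
  measures = msr("classif.ce"),
  terminator = trm("evals", n_evals = 20),
  callbacks = clbk("mlr3tuning.async_freeze_archive")
)

tuner = tnr("async_random_search")
tuner$optimize(instance)

# after optimization the callback has frozen the archive; the frozen archive
# converts to a data.table with the x_domain column unnested into columns
as.data.table(instance$archive)
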
2 changes: 1 addition & 1 deletion man/ArchiveAsyncTuningFrozen.Rd

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion man/mlr3tuning.async_freeze_archive.Rd

Some generated files are not rendered by default.

4 changes: 3 additions & 1 deletion man/mlr_tuners_irace.Rd

Some generated files are not rendered by default.

8 changes: 4 additions & 4 deletions tests/testthat/test_TunerBatchIrace.R
@@ -31,7 +31,7 @@ test_that("TunerIrace works with dependencies", {
 learner = lrn("regr.rpart"),
 resampling = rsmp("holdout"),
 measures = msr("regr.mse"),
-terminator = trm("evals", n_evals = 96),
+terminator = trm("evals", n_evals = 200),
 search_space = search_space)
 tuner = tnr("irace")
 x = capture.output({tuner$optimize(instance)})
@@ -50,7 +50,7 @@ test_that("TunerIrace works with logical parameters", {
 learner = lrn("regr.rpart"),
 resampling = rsmp("holdout"),
 measures = msr("regr.mse"),
-terminator = trm("evals", n_evals = 96),
+terminator = trm("evals", n_evals = 200),
 search_space = search_space)
 tuner = tnr("irace")
 x = capture.output({tuner$optimize(instance)})
@@ -59,7 +59,7 @@ test_that("TunerIrace works with logical parameters", {
 
 test_that("TunerIrace uses digits", {
 search_space = ps(cp = p_dbl(lower = pi * 1e-20, upper = 5.242e12 / 1e13))
-instance = ti(
+instance = ti(
 task = tsk("mtcars"),
 learner = lrn("regr.rpart"),
 resampling = rsmp("holdout"),
@@ -78,7 +78,7 @@ test_that("TunerIrace works with unnamed discrete values", {
 learner = lrn("regr.rpart"),
 resampling = rsmp("holdout"),
 measures = msr("regr.mse"),
-terminator = trm("evals", n_evals = 96),
+terminator = trm("evals", n_evals = 200),
 search_space = search_space)
 tuner = tnr("irace")
 x = capture.output({expect_data_table(tuner$optimize(instance))})
