refactor: pass extra information of the result in the extra parameter (#458)

* refactor: pass extra information of the result in the extra parameter

* ...

* ...
be-marc authored Oct 27, 2024
1 parent 89bc580 commit 0d5e8f5
Showing 12 changed files with 94 additions and 38 deletions.
2 changes: 1 addition & 1 deletion DESCRIPTION
@@ -29,7 +29,7 @@ Depends:
     paradox (>= 1.0.1),
     R (>= 3.1.0)
 Imports:
-    bbotk (>= 1.1.1),
+    bbotk (>= 1.2.0),
     checkmate (>= 2.0.0),
     data.table,
     lgr,
7 changes: 3 additions & 4 deletions NEWS.md
@@ -1,11 +1,10 @@
 # mlr3tuning (development version)
 
 * fix: The `as_data_table()` functions do not unnest the `x_domain` colum anymore by default.
-* fix: `to_tune(internal = TRUE)` now also works if non-internal tuning parameters require have
-  an `.extra_trafo`
+* fix: `to_tune(internal = TRUE)` now also works if non-internal tuning parameters require have an `.extra_trafo`.
 * feat: It is now possible to pass an `internal_search_space` manually.
-  This allows to use parameter transformations on the primary search space in combination with
-  internal hyperparameter tuning.
+  This allows to use parameter transformations on the primary search space in combination with internal hyperparameter tuning.
+* refactor: The `Tuner` pass extra information of the result in the `extra` parameter now.
 
 # mlr3tuning 1.0.2

16 changes: 11 additions & 5 deletions R/TuningInstanceAsyncMulticrit.R
@@ -26,6 +26,7 @@
 #' @template param_xdt
 #' @template param_learner_param_vals
 #' @template param_internal_tuned_values
+#' @template param_extra
 #'
 #' @template field_internal_search_space
 #'
@@ -147,13 +148,18 @@ TuningInstanceAsyncMultiCrit = R6Class("TuningInstanceAsyncMultiCrit",
 #' For internal use.
 #'
 #' @param ydt (`numeric(1)`)\cr
-#'   Optimal outcomes, e.g. the Pareto front.
+#'   Optimal outcomes, e.g. the Pareto front.
 #' @param xydt (`data.table::data.table()`)\cr
-#'   Point, outcome, and additional information.
-assign_result = function(xdt, ydt, learner_param_vals = NULL, xydt = NULL) {
+#'   Point, outcome, and additional information.
+#' @param ... (`any`)\cr
+#'   ignored.
+assign_result = function(xdt, ydt, learner_param_vals = NULL, extra = NULL, xydt = NULL, ...) {
+  # workaround
+  extra = extra %??% xydt
 
   # extract internal tuned values
-  if ("internal_tuned_values" %in% names(xydt)) {
-    set(xdt, j = "internal_tuned_values", value = list(xydt[["internal_tuned_values"]]))
+  if ("internal_tuned_values" %in% names(extra)) {
+    set(xdt, j = "internal_tuned_values", value = list(extra[["internal_tuned_values"]]))
   }
 
   # set the column with the learner param_vals that were not optimized over but set implicitly
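The hunk above introduces a small backward-compatibility shim: `extra` is the new argument, and the deprecated `xydt` is coalesced into it with mlr3misc's `%??%` (null-coalescing) operator. A minimal standalone sketch of the pattern, with a stand-in definition of `%??%` and a plain list in place of a `data.table` row:

```r
# Stand-in for mlr3misc's `%??%`: return lhs unless it is NULL, else rhs.
`%??%` = function(lhs, rhs) if (is.null(lhs)) rhs else lhs

# Simplified sketch of the new assign_result() shape: `extra` carries the
# additional result information, the deprecated `xydt` still works because
# it is coalesced into `extra`, and stray arguments are swallowed by `...`.
assign_result_sketch = function(xdt, y, learner_param_vals = NULL,
                                extra = NULL, xydt = NULL, ...) {
  # workaround: fall back to the deprecated argument
  extra = extra %??% xydt

  # extract internal tuned values, if the optimizer reported any
  if ("internal_tuned_values" %in% names(extra)) {
    xdt[["internal_tuned_values"]] = list(extra[["internal_tuned_values"]])
  }
  xdt
}

# New-style call: information travels in `extra`.
res_new = assign_result_sketch(list(x = 1), y = 0.5,
  extra = list(internal_tuned_values = list(nrounds = 10)))

# Old-style call: `xydt` is still honored.
res_old = assign_result_sketch(list(x = 1), y = 0.5,
  xydt = list(internal_tuned_values = list(nrounds = 10)))
```

The real method operates on a `data.table` row via `data.table::set()`; the plain-list assignment here is a simplification for illustration.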
14 changes: 10 additions & 4 deletions R/TuningInstanceAsyncSingleCrit.R
@@ -36,6 +36,7 @@
 #' @template param_xdt
 #' @template param_learner_param_vals
 #' @template param_internal_tuned_values
+#' @template param_extra
 #'
 #' @template field_internal_search_space
 #'
@@ -159,14 +160,19 @@ TuningInstanceAsyncSingleCrit = R6Class("TuningInstanceAsyncSingleCrit",
 #' @param y (`numeric(1)`)\cr
 #'   Optimal outcome.
 #' @param xydt (`data.table::data.table()`)\cr
-#'   Point, outcome, and additional information.
-assign_result = function(xdt, y, learner_param_vals = NULL, xydt = NULL) {
+#'   Point, outcome, and additional information (Deprecated).
+#' @param ... (`any`)\cr
+#'   ignored.
+assign_result = function(xdt, y, learner_param_vals = NULL, extra = NULL, xydt = NULL, ...) {
+  # workaround
+  extra = extra %??% xydt
 
   # set the column with the learner param_vals that were not optimized over but set implicitly
   assert_list(learner_param_vals, null.ok = TRUE, names = "named")
 
   # extract internal tuned values
-  if ("internal_tuned_values" %in% names(xydt)) {
-    set(xdt, j = "internal_tuned_values", value = list(xydt[["internal_tuned_values"]]))
+  if ("internal_tuned_values" %in% names(extra)) {
+    set(xdt, j = "internal_tuned_values", value = list(extra[["internal_tuned_values"]]))
   }
 
   if (is.null(learner_param_vals)) {
16 changes: 11 additions & 5 deletions R/TuningInstanceBatchMulticrit.R
@@ -33,6 +33,7 @@
 #' @template param_xdt
 #' @template param_learner_param_vals
 #' @template param_internal_tuned_values
+#' @template param_extra
 #'
 #' @template field_internal_search_space
 #'
@@ -181,13 +182,18 @@ TuningInstanceBatchMultiCrit = R6Class("TuningInstanceBatchMultiCrit",
 #' For internal use.
 #'
 #' @param ydt (`data.table::data.table()`)\cr
-#'   Optimal outcomes, e.g. the Pareto front.
+#'   Optimal outcomes, e.g. the Pareto front.
 #' @param xydt (`data.table::data.table()`)\cr
-#'   Point, outcome, and additional information.
-assign_result = function(xdt, ydt, learner_param_vals = NULL, xydt = NULL) {
+#'   Point, outcome, and additional information (Deprecated).
+#' @param ... (`any`)\cr
+#'   ignored.
+assign_result = function(xdt, ydt, learner_param_vals = NULL, extra = NULL, xydt = NULL, ...) {
+  # workaround
+  extra = extra %??% xydt
 
   # extract internal tuned values
-  if ("internal_tuned_values" %in% names(xydt)) {
-    set(xdt, j = "internal_tuned_values", value = list(xydt[["internal_tuned_values"]]))
+  if ("internal_tuned_values" %in% names(extra)) {
+    set(xdt, j = "internal_tuned_values", value = list(extra[["internal_tuned_values"]]))
   }
 
   # set the column with the learner param_vals that were not optimized over but set implicitly
15 changes: 10 additions & 5 deletions R/TuningInstanceBatchSingleCrit.R
@@ -68,6 +68,7 @@
 #' @template param_xdt
 #' @template param_learner_param_vals
 #' @template param_internal_tuned_values
+#' @template param_extra
 #'
 #' @template field_internal_search_space
 #'
@@ -219,17 +220,21 @@ TuningInstanceBatchSingleCrit = R6Class("TuningInstanceBatchSingleCrit",
 #' For internal use.
 #'
 #' @param y (`numeric(1)`)\cr
-#'   Optimal outcome.
+#'   Optimal outcome.
 #' @param xydt (`data.table::data.table()`)\cr
-#'   Point, outcome, and additional information.
-assign_result = function(xdt, y, learner_param_vals = NULL, xydt = NULL) {
+#'   Point, outcome, and additional information (Deprecated).
+#' @param ... (`any`)\cr
+#'   ignored.
+assign_result = function(xdt, y, learner_param_vals = NULL, extra = NULL, xydt = NULL, ...) {
+  # workaround
+  extra = extra %??% xydt
 
   # set the column with the learner param_vals that were not optimized over but set implicitly
   assert_list(learner_param_vals, null.ok = TRUE, names = "named")
 
   # extract internal tuned values
-  if ("internal_tuned_values" %in% names(xydt)) {
-    set(xdt, j = "internal_tuned_values", value = list(xydt[["internal_tuned_values"]]))
+  if ("internal_tuned_values" %in% names(extra)) {
+    set(xdt, j = "internal_tuned_values", value = list(extra[["internal_tuned_values"]]))
   }
 
   # learner param values
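The same shim is repeated across all four instance classes, so callers migrate uniformly from the deprecated `xydt` argument to `extra`. A hedged before/after sketch (`instance`, `xdt`, and `info` are placeholders, not names from the diff):

```r
# Before (deprecated): additional result information went through `xydt`.
# instance$assign_result(xdt, y = 0.1, xydt = info)

# After: the same information travels in the `extra` argument; `xydt` is
# still accepted and coalesced into `extra` for backward compatibility.
# instance$assign_result(xdt, y = 0.1, extra = info)
```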
2 changes: 2 additions & 0 deletions man-roxygen/param_extra.R
@@ -0,0 +1,2 @@
+#' @param extra (`data.table::data.table()`)\cr
+#'   Additional information.
10 changes: 9 additions & 1 deletion man/TuningInstanceAsyncMultiCrit.Rd

Some generated files are not rendered by default.

12 changes: 10 additions & 2 deletions man/TuningInstanceAsyncSingleCrit.Rd


12 changes: 10 additions & 2 deletions man/TuningInstanceBatchMultiCrit.Rd


22 changes: 15 additions & 7 deletions man/TuningInstanceBatchSingleCrit.Rd


4 changes: 2 additions & 2 deletions man/mlr_tuners_cmaes.Rd

