6. Roadmap
Right now the package 'works' in that it will generate models and go through the process to eventually produce predictions. It does not, however, work for all of the algorithms supported by the parsnip package and therefore the tidymodels ecosystem. The hope is that this roadmap will help to address that. I think this can be done on an engine/function basis, for example gee/linear_reg(). The aim was to do all of the fitting, etc. dynamically, but this was thwarted by the way in which some algorithms need to work; because of this, the tidymodels team has created workarounds so that they can be supported by parsnip, but you cannot use them in the traditional workflows manner. This has been documented here with the gee algorithm.
This leaves us in the position of most likely having to refresh the way in which the package actually works, moving from the heavy use of purrr to iterate over dynamic lists to a more robust method-dispatch approach (see the sketch after the reference links below).
https://rstudio.github.io/r-manuals/r-exts/Generic-functions-and-methods.html
https://adv-r.hadley.nz/s3.html#s3-methods
https://github.com/tidymodels/broom/tree/main
https://www.tidymodels.org/learn/develop/broom/
https://adv-r.hadley.nz/index.html
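As a hedged illustration only (none of the generics, classes, or fields below exist in the package today), the fitting step under method dispatch might look roughly like this, with one S3 method per engine instead of a single purrr loop:
# Hedged sketch: dispatch the fitting step on the engine name rather than
# iterating over a dynamic list with purrr.
make_fitted_wflw <- function(x, ...) UseMethod("make_fitted_wflw")

# Default method: engines that accept the standard recipe-driven workflow.
make_fitted_wflw.default <- function(x, ...) {
  generics::fit(x$wflw, data = x$training_data)  # x$wflw / x$training_data are hypothetical fields
}

# Engine-specific method, e.g. gee, which needs its own formula handling.
make_fitted_wflw.gee <- function(x, ...) {
  # build the engine-specific formula/workflow here, then fit as usual
  generics::fit(x$wflw, data = x$training_data)
}

# Dispatch would be driven by tagging each model-table row with its engine:
# class(x) <- c(x$.parsnip_engine, class(x))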
The workflow is considered fast when a recipe is passed in or is created dynamically with zero input; this does not mean the result will be correct or what is needed, just that it is produced on the fly quickly.
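For reference, a hedged sketch of what 'created dynamically with zero input' amounts to (the outcome choice below is an assumption for illustration, not the package's internal rule); recipes is assumed to be attached, along with whichever package provides fast_regression():
# Hedged sketch: build a recipe programmatically with no user input,
# treating the first column as the outcome and everything else as predictors.
library(recipes)

outcome <- names(mtcars)[1]  # "mpg"; assumed choice, for illustration only
rec_dyn <- recipe(as.formula(paste(outcome, "~ .")), data = mtcars)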
Current state of the algorithms as of December 4th, 2023:
.parsnip_eng = "lm"
.parsnip_fns = "linear_reg"
rec_obj <- recipe(mpg ~ ., data = mtcars)
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "linear_reg",
.parsnip_eng = "lm"
)
fr$model_spec[[1]]
fr$wflw[[1]]
fr$fitted_wflw[[1]]
fr$fitted_wflw[[1]] |> broom::tidy()
fr$fitted_wflw[[1]] |> broom::glance()
fr$fitted_wflw[[1]] |> broom::augment(new_data = mtcars)
fr$pred_wflw[[1]]
> fr$model_spec[[1]]
Linear Regression Model Specification (regression)
Computational engine: lm
> fr$wflw[[1]]
══ Workflow ═══════════════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: linear_reg()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Linear Regression Model Specification (regression)
Computational engine: lm
> fr$fitted_wflw[[1]]
══ Workflow [trained] ═════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: linear_reg()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Call:
stats::lm(formula = ..y ~ ., data = data)
Coefficients:
(Intercept) cyl disp hp drat wt qsec
21.78132 -0.80763 0.02319 -0.02797 -0.10038 -3.64608 0.62728
vs am gear carb
0.58327 2.77856 0.37366 0.13047
> fr$fitted_wflw[[1]] |> broom::tidy()
# A tibble: 11 × 5
term estimate std.error statistic p.value
<chr> <dbl> <dbl> <dbl> <dbl>
1 (Intercept) 21.8 21.1 1.03 0.320
2 cyl -0.808 1.25 -0.647 0.529
3 disp 0.0232 0.0278 0.834 0.419
4 hp -0.0280 0.0280 -0.998 0.337
5 drat -0.100 1.90 -0.0529 0.959
6 wt -3.65 2.41 -1.51 0.155
7 qsec 0.627 0.782 0.802 0.437
8 vs 0.583 2.33 0.251 0.806
9 am 2.78 2.51 1.11 0.289
10 gear 0.374 1.67 0.224 0.826
11 carb 0.130 1.06 0.123 0.904
> fr$fitted_wflw[[1]] |> broom::glance()
# A tibble: 1 × 12
r.squared adj.r.squared sigma statistic p.value df logLik AIC BIC deviance df.residual nobs
<dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <int> <int>
1 0.863 0.758 2.74 8.20 0.000390 10 -50.9 126. 140. 97.7 13 24
> fr$fitted_wflw[[1]] |> broom::augment(new_data = mtcars)
# A tibble: 32 × 12
mpg cyl disp hp drat wt qsec vs am gear carb .pred
* <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
1 21 6 160 110 3.9 2.62 16.5 0 1 4 4 22.7
2 21 6 160 110 3.9 2.88 17.0 0 1 4 4 22.2
3 22.8 4 108 93 3.85 2.32 18.6 1 1 4 1 26.3
4 21.4 6 258 110 3.08 3.22 19.4 1 0 3 1 21.8
5 18.7 8 360 175 3.15 3.44 17.0 0 0 3 2 18.0
6 18.1 6 225 105 2.76 3.46 20.2 1 0 3 1 20.8
7 14.3 8 360 245 3.21 3.57 15.8 0 0 3 4 15.1
8 24.4 4 147. 62 3.69 3.19 20 1 0 4 2 23.1
9 22.8 4 141. 95 3.92 3.15 22.9 1 0 4 2 24.0
10 19.2 6 168. 123 3.92 3.44 18.3 1 0 4 4 18.5
# ℹ 22 more rows
# ℹ Use `print(n = ...)` to see more rows
> fr$pred_wflw[[1]]
# A tibble: 8 × 1
.pred
<dbl>
1 26.3
2 21.8
3 15.1
4 14.0
5 12.7
6 27.3
7 17.4
8 17.5
.parsnip_eng = "brulee"
.parsnip_fns = "linear_reg"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "linear_reg",
.parsnip_eng = "brulee"
)
Error in !self$..refer_to_state_dict..: invalid argument type
> fr
# A tibble: 1 × 8
.model_id .parsnip_engine .parsnip_mode .parsnip_fns model_spec wflw fitted_wflw pred_wflw
<int> <chr> <chr> <chr> <list> <list> <list> <list>
1 1 brulee regression linear_reg <spec[+]> <workflow> <workflow> <NULL>
> fr$model_spec[[1]]
Linear Regression Model Specification (regression)
Computational engine: brulee
> fr$wflw[[1]]
══ Workflow ═══════════════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: linear_reg()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Linear Regression Model Specification (regression)
Computational engine: brulee
> fr$fitted_wflw[[1]]
══ Workflow [trained] ═════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: linear_reg()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Linear regression
24 samples, 10 features, numeric outcome
weight decay: 0.001
batch size: 22
scaled validation loss after 1 epoch: 16.6
> fr$fitted_wflw[[1]] |> broom::tidy()
Error: No tidy method for objects of class brulee_linear_reg
> fr$fitted_wflw[[1]] |> broom::glance()
Error: No glance method for objects of class brulee_linear_reg
> fr$fitted_wflw[[1]] |> broom::augment(new_data = mtcars)
Error in !self$..refer_to_state_dict.. : invalid argument type
> fr$pred_wflw[[1]]
NULL
.parsnip_eng = "gee"
.parsnip_fns = "linear_reg"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "linear_reg",
.parsnip_eng = "gee"
)
> fr$model_spec[[1]]
Linear Regression Model Specification (regression)
Computational engine: gee
> fr$wflw[[1]]
══ Workflow ═══════════════════════════════════════════════════════════════════════════════════════════
Preprocessor: Variables
Model: linear_reg()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
Outcomes: outcome_var
Predictors: predictor_vars
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Linear Regression Model Specification (regression)
Computational engine: gee
> fr$fitted_wflw[[1]]
══ Workflow [trained] ═════════════════════════════════════════════════════════════════════════════════
Preprocessor: Variables
Model: linear_reg()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
Outcomes: outcome_var
Predictors: predictor_vars
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
GEE: GENERALIZED LINEAR MODELS FOR DEPENDENT DATA
gee S-function, version 4.13 modified 98/01/27 (1998)
Model:
Link: Identity
Variance to Mean Relation: Gaussian
Correlation Structure: Independent
Call:
gee::gee(formula = mpg ~ disp + hp + drat + wt + qsec + vs +
am + gear + carb, id = data$cyl, data = data, family = gaussian)
Number of observations : 24
Maximum cluster size : 4
Coefficients:
(Intercept) disp hp drat wt qsec vs
2.772481358 0.012291451 -0.001785185 2.555081075 -2.868239428 0.764672142 0.308871461
am gear carb
3.809941720 0.670775437 -1.001749335
Estimated Scale Parameter: 6.299191
Number of Iterations: 1
Working Correlation[1:4,1:4]
[,1] [,2] [,3] [,4]
[1,] 1 0 0 0
[2,] 0 1 0 0
[3,] 0 0 0 0
[4,] 0 0 0 0
Returned Error Value:
[1] 0
> fr$fitted_wflw[[1]] |> broom::tidy()
# A tibble: 10 × 6
term estimate std.error statistic p.value ``
<chr> <dbl> <dbl> <dbl> <dbl> <dbl>
1 (Intercept) 2.77 15.1 0.183 6.91 0.401
2 disp 0.0123 0.0182 0.676 0.0130 0.948
3 hp -0.00179 0.0234 -0.0764 0.0142 -0.126
4 drat 2.56 1.78 1.43 0.626 4.08
5 wt -2.87 2.09 -1.37 1.81 -1.58
6 qsec 0.765 0.724 1.06 0.346 2.21
7 vs 0.309 2.16 0.143 1.06 0.290
8 am 3.81 2.21 1.72 1.85 2.06
9 gear 0.671 1.82 0.368 0.930 0.722
10 carb -1.00 0.894 -1.12 0.752 -1.33
> fr$fitted_wflw[[1]] |> broom::glance()
# A tibble: 1 × 8
null.deviance df.null logLik AIC BIC deviance df.residual nobs
<dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <int>
1 NA NA NA NA NA NA NA 24
> fr$fitted_wflw[[1]] |> broom::augment(new_data = mtcars)
# A tibble: 32 × 12
mpg cyl disp hp drat wt qsec vs am gear carb .pred
* <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
1 21 6 160 110 3.9 2.62 16.5 0 1 4 4 22.1
2 21 6 160 110 3.9 2.88 17.0 0 1 4 4 21.8
3 22.8 4 108 93 3.85 2.32 18.6 1 1 4 1 27.1
4 21.4 6 258 110 3.08 3.22 19.4 1 0 3 1 20.6
5 18.7 8 360 175 3.15 3.44 17.0 0 0 3 2 18.1
6 18.1 6 225 105 2.76 3.46 20.2 1 0 3 1 19.3
7 14.3 8 360 245 3.21 3.57 15.8 0 0 3 4 14.8
8 24.4 4 147. 62 3.69 3.19 20 1 0 4 2 21.0
9 22.8 4 141. 95 3.92 3.15 22.9 1 0 4 2 23.8
10 19.2 6 168. 123 3.92 3.44 18.3 1 0 4 4 17.7
# ℹ 22 more rows
# ℹ Use `print(n = ...)` to see more rows
> fr$pred_wflw[[1]]
# A tibble: 8 × 1
.pred
<dbl>
1 14.8
2 21.0
3 15.5
4 16.3
5 17.7
6 27.4
7 22.6
8 25.6
.parsnip_eng = "glm"
.parsnip_fns = "linear_reg"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "linear_reg",
.parsnip_eng = "glm"
)
> fr$model_spec[[1]]
Linear Regression Model Specification (regression)
Computational engine: glm
> fr$wflw[[1]]
══ Workflow ═══════════════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: linear_reg()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Linear Regression Model Specification (regression)
Computational engine: glm
> fr$fitted_wflw[[1]]
══ Workflow [trained] ═════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: linear_reg()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Call: stats::glm(formula = ..y ~ ., family = stats::gaussian, data = data)
Coefficients:
(Intercept) cyl disp hp drat wt qsec
18.91748 -0.25686 0.02509 -0.02400 0.08744 -7.23026 1.15143
vs am gear carb
-0.07889 0.82917 0.32748 0.42705
Degrees of Freedom: 23 Total (i.e. Null); 13 Residual
Null Deviance: 872.7
Residual Deviance: 89.83 AIC: 123.8
> fr$fitted_wflw[[1]] |> broom::tidy()
# A tibble: 11 × 5
term estimate std.error statistic p.value
<chr> <dbl> <dbl> <dbl> <dbl>
1 (Intercept) 18.9 21.3 0.887 0.391
2 cyl -0.257 1.41 -0.183 0.858
3 disp 0.0251 0.0212 1.18 0.257
4 hp -0.0240 0.0248 -0.969 0.350
5 drat 0.0874 1.79 0.0487 0.962
6 wt -7.23 2.45 -2.95 0.0113
7 qsec 1.15 0.785 1.47 0.166
8 vs -0.0789 2.95 -0.0267 0.979
9 am 0.829 2.60 0.319 0.755
10 gear 0.327 1.81 0.181 0.859
11 carb 0.427 1.02 0.419 0.682
> fr$fitted_wflw[[1]] |> broom::glance()
# A tibble: 1 × 8
null.deviance df.null logLik AIC BIC deviance df.residual nobs
<dbl> <int> <dbl> <dbl> <dbl> <dbl> <int> <int>
1 873. 23 -49.9 124. 138. 89.8 13 24
> fr$fitted_wflw[[1]] |> broom::augment(new_data = mtcars)
# A tibble: 32 × 12
mpg cyl disp hp drat wt qsec vs am gear carb .pred
* <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
1 21 6 160 110 3.9 2.62 16.5 0 1 4 4 22.9
2 21 6 160 110 3.9 2.88 17.0 0 1 4 4 21.7
3 22.8 4 108 93 3.85 2.32 18.6 1 1 4 1 25.8
4 21.4 6 258 110 3.08 3.22 19.4 1 0 3 1 21.9
5 18.7 8 360 175 3.15 3.44 17.0 0 0 3 2 18.5
6 18.1 6 225 105 2.76 3.46 20.2 1 0 3 1 20.3
7 14.3 8 360 245 3.21 3.57 15.8 0 0 3 4 15.4
8 24.4 4 147. 62 3.69 3.19 20 1 0 4 2 22.5
9 22.8 4 141. 95 3.92 3.15 22.9 1 0 4 2 25.2
10 19.2 6 168. 123 3.92 3.44 18.3 1 0 4 4 18.1
# ℹ 22 more rows
# ℹ Use `print(n = ...)` to see more rows
> fr$pred_wflw[[1]]
# A tibble: 8 × 1
.pred
<dbl>
1 12.6
2 7.50
3 6.77
4 25.4
5 18.1
6 28.9
7 26.2
8 20.0
.parsnip_eng = "glmer"
.parsnip_fns = "linear_reg"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "linear_reg",
.parsnip_eng = "glmer"
)
Error: No random effects terms specified in formula
Error in UseMethod("predict"): no applicable method for 'predict' applied to an object of class "NULL"
Warning message:
There was 1 warning in `dplyr::mutate()`.
ℹ In argument: `fitted_wflw = internal_make_fitted_wflw(mod_tbl, splits_obj)`.
Caused by warning in `lme4::glmer()`:
! calling glmer() with family=gaussian (identity link) as a shortcut to lmer() is deprecated; please call lmer() directly
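The glmer failure above (and the lme, lmer, and stan_glmer failures below) all share the same root cause: mixed-model engines need a random-effects term in the formula, which a zero-input recipe cannot supply. A hedged sketch of the documented workflows pattern, with cyl used as the grouping variable purely for illustration:
# Hedged sketch, not part of this package: mixed-model engines need an
# explicit formula with a random-effects term passed through add_model().
library(parsnip)
library(workflows)
library(multilevelmod)  # registers the lmer/glmer/stan_glmer engines

lmer_spec <- linear_reg() |> set_engine("lmer")

lmer_wflw <- workflow() |>
  add_variables(outcomes = mpg, predictors = c(wt, hp, cyl)) |>
  add_model(lmer_spec, formula = mpg ~ wt + hp + (1 | cyl))

lmer_fit <- fit(lmer_wflw, data = mtcars)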
.parsnip_eng = "glmnet"
.parsnip_fns = "linear_reg"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "linear_reg",
.parsnip_eng = "glmnet"
)
Error in `.check_glmnet_penalty_fit()`:
! For the glmnet engine, `penalty` must be a single number (or a value of `tune()`).
• There are 0 values for `penalty`.
• To try multiple values for total regularization, use the tune package.
• To predict multiple penalties, use `multi_predict()`
Error in UseMethod("predict"): no applicable method for 'predict' applied to an object of class "NULL"
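The glmnet failure is different: parsnip requires a single fixed penalty value in the spec at fit time, and the dynamically generated spec never sets one. A hedged sketch of a spec that would fit (the 0.1 is arbitrary, for illustration only):
# Hedged sketch, not part of this package: glmnet needs penalty set to a
# single number (or tune()) before fitting.
library(parsnip)

glmnet_spec <- linear_reg(penalty = 0.1, mixture = 1) |>
  set_engine("glmnet")

glmnet_fit <- fit(glmnet_spec, mpg ~ ., data = mtcars)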
.parsnip_eng = "gls"
.parsnip_fns = "linear_reg"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "linear_reg",
.parsnip_eng = "gls"
)
> fr$model_spec[[1]]
Linear Regression Model Specification (regression)
Computational engine: gls
> fr$wflw[[1]]
══ Workflow ═══════════════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: linear_reg()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Linear Regression Model Specification (regression)
Computational engine: gls
> fr$fitted_wflw[[1]]
══ Workflow [trained] ═════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: linear_reg()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Generalized least squares fit by REML
Model: ..y ~ .
Data: data
Log-restricted-likelihood: -49.68099
Coefficients:
(Intercept) cyl disp hp drat wt qsec
-19.59286037 0.32629657 0.01919518 -0.01332160 -0.09600257 -5.78680096 2.57090463
vs am gear carb
-1.78348102 2.69856618 2.18246456 0.02194602
Degrees of freedom: 24 total; 13 residual
Residual standard error: 2.662756
> fr$fitted_wflw[[1]] |> broom::tidy()
Error: No tidy method for objects of class gls
> fr$fitted_wflw[[1]] |> broom::glance()
Error: No glance method for objects of class gls
> fr$fitted_wflw[[1]] |> broom::augment(new_data = mtcars)
# A tibble: 32 × 12
mpg cyl disp hp drat wt qsec vs am gear carb .pred
* <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
1 21 6 160 110 3.9 2.62 16.5 0 1 4 4 22.3
2 21 6 160 110 3.9 2.88 17.0 0 1 4 4 22.2
3 22.8 4 108 93 3.85 2.32 18.6 1 1 4 1 26.3
4 21.4 6 258 110 3.08 3.22 19.4 1 0 3 1 21.7
5 18.7 8 360 175 3.15 3.44 17.0 0 0 3 2 17.7
6 18.1 6 225 105 2.76 3.46 20.2 1 0 3 1 21.8
7 14.3 8 360 245 3.21 3.57 15.8 0 0 3 4 13.1
8 24.4 4 147. 62 3.69 3.19 20 1 0 4 2 23.3
9 22.8 4 141. 95 3.92 3.15 22.9 1 0 4 2 30.4
10 19.2 6 168. 123 3.92 3.44 18.3 1 0 4 4 17.7
# ℹ 22 more rows
# ℹ Use `print(n = ...)` to see more rows
> fr$pred_wflw[[1]]
# A tibble: 8 × 1
.pred
<dbl>
1 26.3
2 17.7
3 30.4
4 16.6
5 11.5
6 8.58
7 10.1
8 29.0
.parsnip_eng = "lme"
.parsnip_fns = "linear_reg"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "linear_reg",
.parsnip_eng = "lme"
)
Error in getGroups.data.frame(dataMix, groups): invalid formula for groups
Error in UseMethod("predict"): no applicable method for 'predict' applied to an object of class "NULL"
.parsnip_eng = "lmer"
.parsnip_fns = "linear_reg"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "linear_reg",
.parsnip_eng = "lmer"
)
Error: No random effects terms specified in formula
Error in UseMethod("predict"): no applicable method for 'predict' applied to an object of class "NULL"
.parsnip_eng = "stan"
.parsnip_fns = "linear_reg"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "linear_reg",
.parsnip_eng = "stan"
)
> fr$model_spec[[1]]
Linear Regression Model Specification (regression)
Computational engine: stan
> fr$wflw[[1]]
══ Workflow ═══════════════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: linear_reg()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Linear Regression Model Specification (regression)
Computational engine: stan
> fr$fitted_wflw[[1]]
══ Workflow [trained] ═════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: linear_reg()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
stan_glm
family: gaussian [identity]
formula: ..y ~ .
observations: 24
predictors: 11
------
Median MAD_SD
(Intercept) 3.0 22.6
cyl 0.0 1.2
disp 0.0 0.0
hp 0.0 0.0
drat 2.7 2.1
wt -4.0 2.2
qsec 1.3 0.9
vs -1.5 2.4
am 3.4 2.6
gear -1.4 1.9
carb 0.5 1.0
Auxiliary parameter(s):
Median MAD_SD
sigma 2.6 0.5
------
* For help interpreting the printed output see ?print.stanreg
* For info on the priors used see ?prior_summary.stanreg
> fr$fitted_wflw[[1]] |> broom::tidy()
Error in warn_on_stanreg(x) :
The supplied model object seems to be outputted from the rstanarm package. Tidiers for mixed model output now live in the broom.mixed package.
> fr$fitted_wflw[[1]] |> broom::glance()
Error in warn_on_stanreg(x) :
The supplied model object seems to be outputted from the rstanarm package. Tidiers for mixed model output now live in the broom.mixed package.
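The tidy()/glance() errors above are broom pointing out that the stanreg tidiers now live in broom.mixed; a hedged workaround sketch, pulling the engine fit out of the trained workflow first:
# Hedged sketch: extract the underlying rstanarm fit and tidy it with
# broom.mixed, where the stanreg methods now live.
library(workflows)
library(broom.mixed)

stan_fit <- extract_fit_engine(fr$fitted_wflw[[1]])
broom.mixed::tidy(stan_fit)
broom.mixed::glance(stan_fit)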
> fr$fitted_wflw[[1]] |> broom::augment(new_data = mtcars)
# A tibble: 32 × 12
mpg cyl disp hp drat wt qsec vs am gear carb .pred
* <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
1 21 6 160 110 3.9 2.62 16.5 0 1 4 4 23.8
2 21 6 160 110 3.9 2.88 17.0 0 1 4 4 23.4
3 22.8 4 108 93 3.85 2.32 18.6 1 1 4 1 24.1
4 21.4 6 258 110 3.08 3.22 19.4 1 0 3 1 19.5
5 18.7 8 360 175 3.15 3.44 17.0 0 0 3 2 17.9
6 18.1 6 225 105 2.76 3.46 20.2 1 0 3 1 18.2
7 14.3 8 360 245 3.21 3.57 15.8 0 0 3 4 15.4
8 24.4 4 147. 62 3.69 3.19 20 1 0 4 2 20.3
9 22.8 4 141. 95 3.92 3.15 22.9 1 0 4 2 24.0
10 19.2 6 168. 123 3.92 3.44 18.3 1 0 4 4 17.8
# ℹ 22 more rows
# ℹ Use `print(n = ...)` to see more rows
> fr$pred_wflw[[1]]
# A tibble: 8 × 1
.pred
<dbl>
1 23.8
2 19.5
3 15.4
4 20.3
5 13.2
6 26.5
7 15.1
8 23.4
.parsnip_eng = "stan_glmer"
.parsnip_fns = "linear_reg"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "linear_reg",
.parsnip_eng = "stan_glmer"
)
Error: No random effects terms specified in formula
Error in UseMethod("predict"): no applicable method for 'predict' applied to an object of class "NULL"
Not implemented
.parsnip_eng = "Cubist"
.parsnip_fns = "cubsit_rules"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "cubist_rules",
.parsnip_eng = "Cubist"
)
> fr$model_spec[[1]]
Cubist Model Specification (regression)
Computational engine: Cubist
> fr$wflw[[1]]
══ Workflow ═══════════════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: cubist_rules()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Cubist Model Specification (regression)
Computational engine: Cubist
> fr$fitted_wflw[[1]]
══ Workflow [trained] ═════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: cubist_rules()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Call:
cubist.default(x = x, y = y, committees = 1)
Number of samples: 24
Number of predictors: 10
Number of committees: 1
Number of rules: 1
> fr$fitted_wflw[[1]] |> broom::tidy() |> unnest(cols = c(estimate, statistic))
# A tibble: 2 × 11
committee rule_num rule term estimate num_conditions coverage mean min max error
<int> <int> <chr> <chr> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
1 1 1 <no conditions> (Interce… 28.7 0 24 20.1 10.4 33.9 3.08
2 1 1 <no conditions> disp -0.04 0 24 20.1 10.4 33.9 3.08
> fr$fitted_wflw[[1]] |> broom::glance()
Error: No glance method for objects of class cubist
> fr$fitted_wflw[[1]] |> broom::augment(new_data = mtcars)
# A tibble: 32 × 12
mpg cyl disp hp drat wt qsec vs am gear carb .pred
* <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
1 21 6 160 110 3.9 2.62 16.5 0 1 4 4 22.3
2 21 6 160 110 3.9 2.88 17.0 0 1 4 4 22.3
3 22.8 4 108 93 3.85 2.32 18.6 1 1 4 1 24.4
4 21.4 6 258 110 3.08 3.22 19.4 1 0 3 1 18.4
5 18.7 8 360 175 3.15 3.44 17.0 0 0 3 2 14.3
6 18.1 6 225 105 2.76 3.46 20.2 1 0 3 1 19.7
7 14.3 8 360 245 3.21 3.57 15.8 0 0 3 4 14.3
8 24.4 4 147. 62 3.69 3.19 20 1 0 4 2 22.8
9 22.8 4 141. 95 3.92 3.15 22.9 1 0 4 2 23.1
10 19.2 6 168. 123 3.92 3.44 18.3 1 0 4 4 22.0
# ℹ 22 more rows
# ℹ Use `print(n = ...)` to see more rows
> fr$pred_wflw[[1]]
# A tibble: 8 × 1
.pred
<dbl>
1 24.4
2 22.0
3 17.7
4 17.7
5 25.7
6 16.5
7 23.9
8 14.7
.parsnip_eng = "glm"
.parsnip_fns = "poisson_reg"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "poisson_reg",
.parsnip_eng = "glm"
)
Warning message:
There were 21 warnings in `dplyr::mutate()`.
The first warning was:
ℹ In argument: `fitted_wflw = internal_make_fitted_wflw(mod_tbl, splits_obj)`.
Caused by warning in `dpois()`:
! non-integer x = 16.400000
ℹ Run dplyr::last_dplyr_warnings() to see the 20 remaining warnings.
> fr$model_spec[[1]]
Poisson Regression Model Specification (regression)
Computational engine: glm
> fr$wflw[[1]]
══ Workflow ═══════════════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: poisson_reg()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Poisson Regression Model Specification (regression)
Computational engine: glm
> fr$fitted_wflw[[1]]
══ Workflow [trained] ═════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: poisson_reg()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Call: stats::glm(formula = ..y ~ ., family = stats::poisson, data = data)
Coefficients:
(Intercept) cyl disp hp drat wt qsec
2.2690172 0.0633415 -0.0006075 -0.0006781 -0.0089000 -0.2240039 0.0577436
vs am gear carb
-0.0054883 0.1056335 0.0599864 -0.0120491
Degrees of Freedom: 23 Total (i.e. Null); 13 Residual
Null Deviance: 41.13
Residual Deviance: 2.54 AIC: Inf
> fr$fitted_wflw[[1]] |> broom::tidy()
# A tibble: 11 × 5
term estimate std.error statistic p.value
<chr> <dbl> <dbl> <dbl> <dbl>
1 (Intercept) 2.27 1.65 1.38 0.168
2 cyl 0.0633 0.109 0.583 0.560
3 disp -0.000608 0.00218 -0.278 0.781
4 hp -0.000678 0.00243 -0.279 0.780
5 drat -0.00890 0.150 -0.0593 0.953
6 wt -0.224 0.186 -1.20 0.229
7 qsec 0.0577 0.0651 0.887 0.375
8 vs -0.00549 0.177 -0.0310 0.975
9 am 0.106 0.197 0.536 0.592
10 gear 0.0600 0.140 0.428 0.669
11 carb -0.0120 0.0825 -0.146 0.884
> fr$fitted_wflw[[1]] |> broom::glance()
# A tibble: 1 × 8
null.deviance df.null logLik AIC BIC deviance df.residual nobs
<dbl> <int> <dbl> <dbl> <dbl> <dbl> <int> <int>
1 41.1 23 -Inf Inf Inf 2.54 13 24
> fr$fitted_wflw[[1]] |> broom::augment(new_data = mtcars)
# A tibble: 32 × 12
mpg cyl disp hp drat wt qsec vs am gear carb .pred
* <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
1 21 6 160 110 3.9 2.62 16.5 0 1 4 4 22.3
2 21 6 160 110 3.9 2.88 17.0 0 1 4 4 21.7
3 22.8 4 108 93 3.85 2.32 18.6 1 1 4 1 25.6
4 21.4 6 258 110 3.08 3.22 19.4 1 0 3 1 19.2
5 18.7 8 360 175 3.15 3.44 17.0 0 0 3 2 16.1
6 18.1 6 225 105 2.76 3.46 20.2 1 0 3 1 19.5
7 14.3 8 360 245 3.21 3.57 15.8 0 0 3 4 13.6
8 24.4 4 147. 62 3.69 3.19 20 1 0 4 2 20.3
9 22.8 4 141. 95 3.92 3.15 22.9 1 0 4 2 23.7
10 19.2 6 168. 123 3.92 3.44 18.3 1 0 4 4 18.2
# ℹ 22 more rows
# ℹ Use `print(n = ...)` to see more rows
> fr$pred_wflw[[1]]
# A tibble: 8 × 1
.pred
<dbl>
1 22.3
2 16.1
3 20.3
4 16.4
5 10.2
6 9.62
7 12.5
8 14.4
.parsnip_eng = "gee"
.parsnip_fns = "poisson_reg"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "poisson_reg",
.parsnip_eng = "gee"
)
Error in terms.formula(f, specials = "id_var"): '.' in formula and no 'data' argument
Error in UseMethod("predict"): no applicable method for 'predict' applied to an object of class "NULL"
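This gee failure ties back to the roadmap note above: the gee engine cannot be driven by a '.' recipe formula and instead needs the documented workflows pattern with an id_var() cluster term passed through add_model(). A hedged sketch, shown with linear_reg() and cyl as the cluster purely for illustration:
# Hedged sketch, not part of this package: the gee engine needs add_variables()
# plus an explicit formula containing an id_var() cluster term.
library(parsnip)
library(workflows)
library(multilevelmod)  # registers the gee engine and id_var()

gee_spec <- linear_reg() |> set_engine("gee")

gee_wflw <- workflow() |>
  add_variables(outcomes = mpg, predictors = c(wt, hp, cyl)) |>
  add_model(gee_spec, formula = mpg ~ wt + hp + id_var(cyl))

gee_fit <- fit(gee_wflw, data = mtcars)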
.parsnip_eng = "glmer"
.parsnip_fns = "poisson_reg"
> fr <- fast_regression(
+ .data = mtcars,
+ .rec_obj = rec_obj,
+ .parsnip_fns = "poisson_reg",
+ .parsnip_eng = "glmer"
+ )
Error: No random effects terms specified in formula
Error in UseMethod("predict"): no applicable method for 'predict' applied to an object of class "NULL"
.parsnip_eng = "glmnet"
.parsnip_fns = "poisson_reg"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "poisson_reg",
.parsnip_eng = "glmnet"
)
Error in `.check_glmnet_penalty_fit()`:
! For the glmnet engine, `penalty` must be a single number (or a value of `tune()`).
• There are 0 values for `penalty`.
• To try multiple values for total regularization, use the tune package.
• To predict multiple penalties, use `multi_predict()`
Error in UseMethod("predict"): no applicable method for 'predict' applied to an object of class "NULL"
.parsnip_eng = "hurdle"
.parsnip_fns = "poisson_reg"
> fr <- fast_regression(
+ .data = mtcars,
+ .rec_obj = rec_obj,
+ .parsnip_fns = "poisson_reg",
+ .parsnip_eng = "hurdle"
+ )
Error in pscl::hurdle(formula = ..y ~ ., data = data): invalid dependent variable, minimum count is not zero
Error in UseMethod("predict"): no applicable method for 'predict' applied to an object of class "NULL"
.parsnip_eng = "stan"
.parsnip_fns = "poisson_reg"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "poisson_reg",
.parsnip_eng = "stan"
)
Error: All outcome values must be counts for Poisson models
Error in UseMethod("predict"): no applicable method for 'predict' applied to an object of class "NULL"
.parsnip_eng = "stan_glmer"
.parsnip_fns = "poisson_reg"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "poisson_reg",
.parsnip_eng = "stan_glmer"
)
Error: No random effects terms specified in formula
Error in UseMethod("predict"): no applicable method for 'predict' applied to an object of class "NULL"
.parsnip_eng = "zeroinfl"
.parsnip_fns = "poisson_reg"
> fr <- fast_regression(
+ .data = mtcars,
+ .rec_obj = rec_obj,
+ .parsnip_fns = "poisson_reg",
+ .parsnip_eng = "zeroinfl"
+ )
Error in pscl::zeroinfl(formula = ..y ~ ., data = data): invalid dependent variable, minimum count is not zero
Error in UseMethod("predict"): no applicable method for 'predict' applied to an object of class "NULL"
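The hurdle, zeroinfl, and stan failures above (and the non-integer dpois() warnings from the glm engine earlier) all trace back to the same issue: mpg is a continuous outcome, while these engines expect non-negative integer counts, with hurdle/zeroinfl additionally expecting zeros to be present. A hedged sketch with a count-valued outcome, chosen purely for illustration:
# Hedged sketch, not part of this package: Poisson-family engines need a count
# outcome. carb (number of carburetors) is integer-valued so it satisfies the
# glm/stan engines; it contains no zeros, so hurdle/zeroinfl would still refuse it.
library(parsnip)
library(poissonreg)  # registers the poisson_reg() engines

pois_spec <- poisson_reg() |> set_engine("glm")
pois_fit  <- fit(pois_spec, carb ~ mpg + wt + hp, data = mtcars)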
.parsnip_eng = "earth"
.parsnip_fns = "bag_mars"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "bag_mars",
.parsnip_eng = "earth"
)> fr$model_spec[[1]]
Bagged MARS Model Specification (regression)
Computational engine: earth
> fr$wflw[[1]]
══ Workflow ═══════════════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: bag_mars()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Bagged MARS Model Specification (regression)
Computational engine: earth
> fr$fitted_wflw[[1]]
══ Workflow [trained] ═════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: bag_mars()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Bagged MARS (regression with 11 members)
Variable importance scores include:
# A tibble: 3 × 4
term value std.error used
<chr> <dbl> <dbl> <int>
1 disp 19.2 15.3 3
2 wt 9.09 0 1
3 hp 6.66 5.89 2
> fr$fitted_wflw[[1]] |> broom::tidy()
Error: No tidy method for objects of class bagger
> fr$fitted_wflw[[1]] |> broom::glance()
Error: No glance method for objects of class bagger
> fr$fitted_wflw[[1]] |> broom::augment(new_data = mtcars)
# A tibble: 32 × 12
mpg cyl disp hp drat wt qsec vs am gear carb .pred
* <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
1 21 6 160 110 3.9 2.62 16.5 0 1 4 4 21.8
2 21 6 160 110 3.9 2.88 17.0 0 1 4 4 20.2
3 22.8 4 108 93 3.85 2.32 18.6 1 1 4 1 24.1
4 21.4 6 258 110 3.08 3.22 19.4 1 0 3 1 20.1
5 18.7 8 360 175 3.15 3.44 17.0 0 0 3 2 14.7
6 18.1 6 225 105 2.76 3.46 20.2 1 0 3 1 19.7
7 14.3 8 360 245 3.21 3.57 15.8 0 0 3 4 13.3
8 24.4 4 147. 62 3.69 3.19 20 1 0 4 2 19.5
9 22.8 4 141. 95 3.92 3.15 22.9 1 0 4 2 20.1
10 19.2 6 168. 123 3.92 3.44 18.3 1 0 4 4 18.7
# ℹ 22 more rows
# ℹ Use `print(n = ...)` to see more rows
> fr$pred_wflw[[1]]
# A tibble: 8 × 1
.pred
<dbl>
1 14.7
2 11.6
3 26.1
4 16.1
5 14.3
6 27.1
7 16.7
8 19.0
.parsnip_eng = "rpart"
.parsnip_fns = "bag_tree"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "bag_tree",
.parsnip_eng = "rpart"
)
> fr$model_spec[[1]]
Bagged Decision Tree Model Specification (regression)
Main Arguments:
cost_complexity = 0
min_n = 2
Computational engine: rpart
> fr$wflw[[1]]
══ Workflow ═══════════════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: bag_tree()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Bagged Decision Tree Model Specification (regression)
Main Arguments:
cost_complexity = 0
min_n = 2
Computational engine: rpart
> fr$fitted_wflw[[1]]
══ Workflow [trained] ═════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: bag_tree()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Bagged CART (regression with 11 members)
Variable importance scores include:
# A tibble: 10 × 4
term value std.error used
<chr> <dbl> <dbl> <int>
1 disp 571. 54.9 11
2 wt 569. 48.2 11
3 hp 470. 51.9 11
4 drat 399. 51.3 11
5 cyl 368. 37.3 11
6 am 215. 64.8 11
7 gear 193. 57.0 9
8 carb 65.4 33.9 9
9 qsec 64.7 15.5 11
10 vs 7.33 2.85 10
> fr$fitted_wflw[[1]] |> broom::tidy()
Error: No tidy method for objects of class bagger
> fr$fitted_wflw[[1]] |> broom::glance()
Error: No glance method for objects of class bagger
> fr$fitted_wflw[[1]] |> broom::augment(new_data = mtcars)
# A tibble: 32 × 12
mpg cyl disp hp drat wt qsec vs am gear carb .pred
* <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
1 21 6 160 110 3.9 2.62 16.5 0 1 4 4 20.9
2 21 6 160 110 3.9 2.88 17.0 0 1 4 4 20.9
3 22.8 4 108 93 3.85 2.32 18.6 1 1 4 1 24.9
4 21.4 6 258 110 3.08 3.22 19.4 1 0 3 1 18.8
5 18.7 8 360 175 3.15 3.44 17.0 0 0 3 2 18.6
6 18.1 6 225 105 2.76 3.46 20.2 1 0 3 1 18
7 14.3 8 360 245 3.21 3.57 15.8 0 0 3 4 14.3
8 24.4 4 147. 62 3.69 3.19 20 1 0 4 2 22.8
9 22.8 4 141. 95 3.92 3.15 22.9 1 0 4 2 21.8
10 19.2 6 168. 123 3.92 3.44 18.3 1 0 4 4 18.5
# ℹ 22 more rows
# ℹ Use `print(n = ...)` to see more rows
> fr$pred_wflw[[1]]
# A tibble: 8 × 1
.pred
<dbl>
1 18.8
2 18.5
3 17.5
4 28.2
5 28.2
6 18.2
7 14.4
8 14.7
.parsnip_eng = "dbarts"
.parsnip_fns = "bart"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "bart",
.parsnip_eng = "dbart"
)
> fr$model_spec[[1]]
Call:
NULL
> fr$wflw[[1]]
══ Workflow ═══════════════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: bart()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Call:
NULL
> fr$fitted_wflw[[1]]
══ Workflow [trained] ═════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: bart()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Call:
`NULL`()
> fr$fitted_wflw[[1]] |> broom::tidy()
Error: No tidy method for objects of class bart
> fr$fitted_wflw[[1]] |> broom::glance()
Error: No glance method for objects of class bart
> fr$fitted_wflw[[1]] |> broom::augment(new_data = mtcars)
# A tibble: 32 × 12
mpg cyl disp hp drat wt qsec vs am gear carb .pred
* <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
1 21 6 160 110 3.9 2.62 16.5 0 1 4 4 21.0
2 21 6 160 110 3.9 2.88 17.0 0 1 4 4 20.9
3 22.8 4 108 93 3.85 2.32 18.6 1 1 4 1 25.2
4 21.4 6 258 110 3.08 3.22 19.4 1 0 3 1 20.5
5 18.7 8 360 175 3.15 3.44 17.0 0 0 3 2 17.7
6 18.1 6 225 105 2.76 3.46 20.2 1 0 3 1 18.9
7 14.3 8 360 245 3.21 3.57 15.8 0 0 3 4 14.3
8 24.4 4 147. 62 3.69 3.19 20 1 0 4 2 23.4
9 22.8 4 141. 95 3.92 3.15 22.9 1 0 4 2 22.8
10 19.2 6 168. 123 3.92 3.44 18.3 1 0 4 4 18.0
# ℹ 22 more rows
# ℹ Use `print(n = ...)` to see more rows
> fr$pred_wflw[[1]]
# A tibble: 8 × 1
.pred
<dbl>
1 23.4
2 17.9
3 13.0
4 30.8
5 27.3
6 19.2
7 20.6
8 26.2
.parsnip_eng = "xgboost"
.parsnip_fns = "boost_tree"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "boost_tree",
.parsnip_eng = "xgboost"
)
> fr$model_spec[[1]]
Boosted Tree Model Specification (regression)
Computational engine: xgboost
> fr$wflw[[1]]
══ Workflow ═══════════════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: boost_tree()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Boosted Tree Model Specification (regression)
Computational engine: xgboost
> fr$fitted_wflw[[1]]
══ Workflow [trained] ═════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: boost_tree()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
##### xgb.Booster
raw: 20.2 Kb
call:
xgboost::xgb.train(params = list(eta = 0.3, max_depth = 6, gamma = 0,
colsample_bytree = 1, colsample_bynode = 1, min_child_weight = 1,
subsample = 1), data = x$data, nrounds = 15, watchlist = x$watchlist,
verbose = 0, nthread = 1, objective = "reg:squarederror")
params (as set within xgb.train):
eta = "0.3", max_depth = "6", gamma = "0", colsample_bytree = "1", colsample_bynode = "1", min_child_weight = "1", subsample = "1", nthread = "1", objective = "reg:squarederror", validate_parameters = "TRUE"
xgb.attributes:
niter
callbacks:
cb.evaluation.log()
# of features: 10
niter: 15
nfeatures : 10
evaluation_log:
iter training_rmse
1 15.2936282
2 11.3548484
---
14 0.6191974
15 0.5200460
> fr$fitted_wflw[[1]] |> broom::tidy()
Error: No tidy method for objects of class xgb.Booster
> fr$fitted_wflw[[1]] |> broom::glance()
Error: No glance method for objects of class xgb.Booster
> fr$fitted_wflw[[1]] |> broom::augment(new_data = mtcars)
# A tibble: 32 × 12
mpg cyl disp hp drat wt qsec vs am gear carb .pred
* <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
1 21 6 160 110 3.9 2.62 16.5 0 1 4 4 20.9
2 21 6 160 110 3.9 2.88 17.0 0 1 4 4 20.9
3 22.8 4 108 93 3.85 2.32 18.6 1 1 4 1 22.4
4 21.4 6 258 110 3.08 3.22 19.4 1 0 3 1 21.3
5 18.7 8 360 175 3.15 3.44 17.0 0 0 3 2 18.3
6 18.1 6 225 105 2.76 3.46 20.2 1 0 3 1 20.1
7 14.3 8 360 245 3.21 3.57 15.8 0 0 3 4 14.3
8 24.4 4 147. 62 3.69 3.19 20 1 0 4 2 24.1
9 22.8 4 141. 95 3.92 3.15 22.9 1 0 4 2 20.9
10 19.2 6 168. 123 3.92 3.44 18.3 1 0 4 4 18.7
# ℹ 22 more rows
# ℹ Use `print(n = ...)` to see more rows
> fr$pred_wflw[[1]]
# A tibble: 8 × 1
.pred
<dbl>
1 20.1
2 20.9
3 15.2
4 15.2
5 10.3
6 30.4
7 22.3
8 13.8
.parsnip_eng = "lightgbm"
.parsnip_fns = "boost_tree"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "boost_tree",
.parsnip_eng = "lightgbm"
)
> fr$model_spec[[1]]
Boosted Tree Model Specification (regression)
Computational engine: lightgbm
> fr$wflw[[1]]
══ Workflow ═══════════════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: boost_tree()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Boosted Tree Model Specification (regression)
Computational engine: lightgbm
> fr$fitted_wflw[[1]]
══ Workflow [trained] ═════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: boost_tree()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
<lgb.Booster>
Public:
add_valid: function (data, name)
best_iter: -1
best_score: NA
current_iter: function ()
dump_model: function (num_iteration = NULL, feature_importance_type = 0L)
eval: function (data, name, feval = NULL)
eval_train: function (feval = NULL)
eval_valid: function (feval = NULL)
finalize: function ()
initialize: function (params = list(), train_set = NULL, modelfile = NULL,
lower_bound: function ()
params: list
predict: function (data, start_iteration = NULL, num_iteration = NULL,
raw: NA
record_evals: list
reset_parameter: function (params, ...)
rollback_one_iter: function ()
save: function ()
save_model: function (filename, num_iteration = NULL, feature_importance_type = 0L)
save_model_to_string: function (num_iteration = NULL, feature_importance_type = 0L)
set_train_data_name: function (name)
to_predictor: function ()
update: function (train_set = NULL, fobj = NULL)
upper_bound: function ()
Private:
eval_names: NULL
get_eval_info: function ()
handle: lgb.Booster.handle
higher_better_inner_eval: NULL
init_predictor: NULL
inner_eval: function (data_name, data_idx, feval = NULL)
inner_predict: function (idx)
is_predicted_cur_iter: list
name_train_set: training
name_valid_sets: list
num_class: 1
num_dataset: 1
predict_buffer: list
set_objective_to_none: FALSE
train_set: lgb.Dataset, R6
train_set_version: 1
valid_sets: list
> fr$fitted_wflw[[1]] |> broom::tidy()
Error: No tidy method for objects of class lgb.Booster
> fr$fitted_wflw[[1]] |> broom::glance()
Error: No glance method for objects of class lgb.Booster
> fr$fitted_wflw[[1]] |> broom::augment(new_data = mtcars)
# A tibble: 32 × 12
mpg cyl disp hp drat wt qsec vs am gear carb .pred
* <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
1 21 6 160 110 3.9 2.62 16.5 0 1 4 4 20.8
2 21 6 160 110 3.9 2.88 17.0 0 1 4 4 20.8
3 22.8 4 108 93 3.85 2.32 18.6 1 1 4 1 20.8
4 21.4 6 258 110 3.08 3.22 19.4 1 0 3 1 20.8
5 18.7 8 360 175 3.15 3.44 17.0 0 0 3 2 20.8
6 18.1 6 225 105 2.76 3.46 20.2 1 0 3 1 20.8
7 14.3 8 360 245 3.21 3.57 15.8 0 0 3 4 20.8
8 24.4 4 147. 62 3.69 3.19 20 1 0 4 2 20.8
9 22.8 4 141. 95 3.92 3.15 22.9 1 0 4 2 20.8
10 19.2 6 168. 123 3.92 3.44 18.3 1 0 4 4 20.8
# ℹ 22 more rows
# ℹ Use `print(n = ...)` to see more rows
> fr$pred_wflw[[1]]
# A tibble: 8 × 1
.pred
<dbl>
1 20.8
2 20.8
3 20.8
4 20.8
5 20.8
6 20.8
7 20.8
8 20.8
.parsnip_eng = "rpart"
.parsnip_fns = "decision_tree"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "decsion_tree",
.parsnip_eng = "rpart"
)
> fr$model_spec[[1]]
Decision Tree Model Specification (regression)
Computational engine: rpart
> fr$wflw[[1]]
══ Workflow ═══════════════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: decision_tree()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Decision Tree Model Specification (regression)
Computational engine: rpart
> fr$fitted_wflw[[1]]
══ Workflow [trained] ═════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: decision_tree()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
n= 24
node), split, n, deviance, yval
* denotes terminal node
1) root 24 887.2696 20.37083
2) cyl>=5 15 125.4000 16.50000 *
3) cyl< 5 9 162.5356 26.82222 *
> fr$fitted_wflw[[1]] |> broom::tidy()
Error: No tidy method for objects of class rpart
> fr$fitted_wflw[[1]] |> broom::glance()
Error: No glance method for objects of class rpart
> fr$fitted_wflw[[1]] |> broom::augment(new_data = mtcars)
# A tibble: 32 × 12
mpg cyl disp hp drat wt qsec vs am gear carb .pred
* <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
1 21 6 160 110 3.9 2.62 16.5 0 1 4 4 16.5
2 21 6 160 110 3.9 2.88 17.0 0 1 4 4 16.5
3 22.8 4 108 93 3.85 2.32 18.6 1 1 4 1 26.8
4 21.4 6 258 110 3.08 3.22 19.4 1 0 3 1 16.5
5 18.7 8 360 175 3.15 3.44 17.0 0 0 3 2 16.5
6 18.1 6 225 105 2.76 3.46 20.2 1 0 3 1 16.5
7 14.3 8 360 245 3.21 3.57 15.8 0 0 3 4 16.5
8 24.4 4 147. 62 3.69 3.19 20 1 0 4 2 26.8
9 22.8 4 141. 95 3.92 3.15 22.9 1 0 4 2 26.8
10 19.2 6 168. 123 3.92 3.44 18.3 1 0 4 4 16.5
# ℹ 22 more rows
# ℹ Use `print(n = ...)` to see more rows
> fr$pred_wflw[[1]]
# A tibble: 8 × 1
.pred
<dbl>
1 16.5
2 16.5
3 16.5
4 16.5
5 26.8
6 26.8
7 16.5
8 16.5
.parsnip_eng = "partykit"
.parsnip_fns = "decision_tree"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "decision_tree",
.parsnip_eng = "partykit"
)
> fr$model_spec[[1]]
Decision Tree Model Specification (regression)
Computational engine: partykit
> fr$wflw[[1]]
══ Workflow ═══════════════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: decision_tree()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Decision Tree Model Specification (regression)
Computational engine: partykit
> fr$fitted_wflw[[1]]
══ Workflow [trained] ═════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: decision_tree()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Model formula:
..y ~ cyl + disp + hp + drat + wt + qsec + vs + am + gear + carb
Fitted party:
[1] root
| [2] wt <= 2.62: 27.000 (n = 7, err = 119.6)
| [3] wt > 2.62: 17.706 (n = 17, err = 256.5)
Number of inner nodes: 1
Number of terminal nodes: 2
> fr$fitted_wflw[[1]] |> broom::tidy()
Error: No tidy method for objects of class constparty
> fr$fitted_wflw[[1]] |> broom::glance()
Error: No glance method for objects of class constparty
> fr$fitted_wflw[[1]] |> broom::augment(new_data = mtcars)
# A tibble: 32 × 12
mpg cyl disp hp drat wt qsec vs am gear carb .pred
* <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
1 21 6 160 110 3.9 2.62 16.5 0 1 4 4 27
2 21 6 160 110 3.9 2.88 17.0 0 1 4 4 17.7
3 22.8 4 108 93 3.85 2.32 18.6 1 1 4 1 27
4 21.4 6 258 110 3.08 3.22 19.4 1 0 3 1 17.7
5 18.7 8 360 175 3.15 3.44 17.0 0 0 3 2 17.7
6 18.1 6 225 105 2.76 3.46 20.2 1 0 3 1 17.7
7 14.3 8 360 245 3.21 3.57 15.8 0 0 3 4 17.7
8 24.4 4 147. 62 3.69 3.19 20 1 0 4 2 17.7
9 22.8 4 141. 95 3.92 3.15 22.9 1 0 4 2 17.7
10 19.2 6 168. 123 3.92 3.44 18.3 1 0 4 4 17.7
# ℹ 22 more rows
# ℹ Use `print(n = ...)` to see more rows
> fr$pred_wflw[[1]]
# A tibble: 8 × 1
.pred
<dbl>
1 27
2 17.7
3 17.7
4 17.7
5 27
6 17.7
7 17.7
8 17.7
.parsnip_eng = "nnet"
.parsnip_fns = "mlp"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "mlp",
.parsnip_eng = "nnet"
)
> fr$model_spec[[1]]
Single Layer Neural Network Model Specification (regression)
Computational engine: nnet
> fr$wflw[[1]]
══ Workflow ═══════════════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: mlp()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Single Layer Neural Network Model Specification (regression)
Computational engine: nnet
> fr$fitted_wflw[[1]]
══ Workflow [trained] ═════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: mlp()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
a 10-5-1 network with 61 weights
inputs: cyl disp hp drat wt qsec vs am gear carb
output(s): ..y
options were - linear output units
> fr$fitted_wflw[[1]] |> broom::tidy()
Error: No tidy method for objects of class nnet.formula
> fr$fitted_wflw[[1]] |> broom::glance()
Error: No glance method for objects of class nnet.formula
> fr$fitted_wflw[[1]] |> broom::augment(new_data = mtcars)
# A tibble: 32 × 12
mpg cyl disp hp drat wt qsec vs am gear carb .pred
* <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
1 21 6 160 110 3.9 2.62 16.5 0 1 4 4 20.4
2 21 6 160 110 3.9 2.88 17.0 0 1 4 4 20.4
3 22.8 4 108 93 3.85 2.32 18.6 1 1 4 1 20.4
4 21.4 6 258 110 3.08 3.22 19.4 1 0 3 1 20.4
5 18.7 8 360 175 3.15 3.44 17.0 0 0 3 2 20.4
6 18.1 6 225 105 2.76 3.46 20.2 1 0 3 1 20.4
7 14.3 8 360 245 3.21 3.57 15.8 0 0 3 4 20.4
8 24.4 4 147. 62 3.69 3.19 20 1 0 4 2 20.4
9 22.8 4 141. 95 3.92 3.15 22.9 1 0 4 2 20.4
10 19.2 6 168. 123 3.92 3.44 18.3 1 0 4 4 20.4
# ℹ 22 more rows
# ℹ Use `print(n = ...)` to see more rows
> fr$pred_wflw[[1]]
# A tibble: 8 × 1
.pred
<dbl>
1 20.4
2 20.4
3 20.4
4 20.4
5 20.4
6 20.4
7 20.4
8 20.4
.parsnip_eng = "brulee"
.parsnip_fns = "mlp"
fr <- fast_regression(
.data = mtcars,
.rec_obj = rec_obj,
.parsnip_fns = "mlp",
.parsnip_eng = "brulee"
)
Error in !self$..refer_to_state_dict..: invalid argument type
> fr
# A tibble: 1 × 8
.model_id .parsnip_engine .parsnip_mode .parsnip_fns model_spec wflw fitted_wflw pred_wflw
<int> <chr> <chr> <chr> <list> <list> <list> <list>
1 1 brulee regression mlp <spec[+]> <workflow> <workflow> <NULL>
> fr$model_spec[[1]]
Single Layer Neural Network Model Specification (regression)
Computational engine: brulee
> fr$wflw[[1]]
══ Workflow ═══════════════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: mlp()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Single Layer Neural Network Model Specification (regression)
Computational engine: brulee
> fr$fitted_wflw[[1]]
══ Workflow [trained] ═════════════════════════════════════════════════════════════════════════════════
Preprocessor: Recipe
Model: mlp()
── Preprocessor ───────────────────────────────────────────────────────────────────────────────────────
0 Recipe Steps
── Model ──────────────────────────────────────────────────────────────────────────────────────────────
Multilayer perceptron
relu activation
3 hidden units, 37 model parameters
24 samples, 10 features, numeric outcome
weight decay: 0.001
dropout proportion: 0
batch size: 22
learn rate: 0.01
scaled validation loss after 1 epoch: 0.0719
> fr$fitted_wflw[[1]] |> broom::tidy()
Error: No tidy method for objects of class brulee_mlp
> fr$fitted_wflw[[1]] |> broom::glance()
Error: No glance method for objects of class brulee_mlp
> fr$fitted_wflw[[1]] |> broom::augment(new_data = mtcars)
Error in !self$..refer_to_state_dict.. : invalid argument type
> fr$pred_wflw[[1]]
NULL