xgb.Booster objects are created by the xgboost package, which provides efficient and scalable implementations of gradient boosted decision trees. Because post-processing functions such as xgb.Booster.complete() rely on the first class listed on the model object, the butcher_xgb.Booster class is not appended.
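In practice this means the class vector of a butchered booster is expected to stay unchanged. A minimal sketch, assuming out is the butchered model created in the examples below:

class(out)
#> [1] "xgb.Booster"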

# S3 method for xgb.Booster
axe_call(x, verbose = FALSE, ...)

# S3 method for xgb.Booster
axe_ctrl(x, verbose = FALSE, ...)

# S3 method for xgb.Booster
axe_env(x, verbose = FALSE, ...)

# S3 method for xgb.Booster
axe_fitted(x, verbose = FALSE, ...)
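
Each axe method can also be called on its own rather than through butcher(). A brief sketch, assuming bst is the xgboost model fit in the examples below:

bst_axed <- axe_env(bst, verbose = TRUE)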

Arguments

x

A model object.

verbose

Print information each time an axe method is executed. Notes how much memory is released and what functions are disabled. Default is FALSE.

...

Any additional arguments related to axing.

Value

Axed xgb.Booster object.

Examples

suppressWarnings(suppressMessages(library(xgboost)))
suppressWarnings(suppressMessages(library(parsnip)))
data(agaricus.train)
bst <- xgboost(data = agaricus.train$data,
               label = agaricus.train$label,
               eta = 1,
               nthread = 2,
               nrounds = 2,
               eval_metric = "logloss",
               objective = "binary:logistic",
               verbose = 0)
out <- butcher(bst, verbose = TRUE)
#> Memory released: '37,816 B'
#> Disabled: `print()`, `summary()`, `xgb.Booster.complete()`
#> Could not add 'butchered' class
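# The axed booster should still support prediction; a hedged sketch
# (not part of the original output):
# preds <- predict(out, agaricus.train$data)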
# Another xgboost model
fit <- boost_tree(mode = "classification", trees = 20) %>%
  set_engine("xgboost", eval_metric = "mlogloss") %>%
  fit(Species ~ ., data = iris)
out <- butcher(fit, verbose = TRUE)
#> No memory released. Do not butcher.