Package website: release | dev
This package provides hyperparameter tuning for mlr3. It offers various tuning methods, e.g. grid search, random search, and generalized simulated annealing, and allows different termination criteria to be set and combined. ‘AutoTuner’ provides a convenient way to perform nested resampling in combination with ‘mlr3’. The package is built on bbotk, which provides a common framework for optimization.
CRAN version
install.packages("mlr3tuning")
Development version
remotes::install_github("mlr-org/mlr3tuning")
library("mlr3")
library("mlr3tuning")
library("paradox")
task = tsk("pima")
learner = lrn("classif.rpart")
resampling = rsmp("holdout")
measure = msr("classif.ce")
# Create the search space with lower and upper bounds
search_space = ParamSet$new(list(
  ParamDbl$new("cp", lower = 0.001, upper = 0.1),
  ParamInt$new("minsplit", lower = 1, upper = 10)
))
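Newer releases of paradox also offer a more compact sugar for the same definition; a minimal sketch, assuming a paradox version that provides ps(), p_dbl() and p_int():
# Equivalent, shorter search space definition (assumes newer paradox)
search_space = ps(
  cp = p_dbl(lower = 0.001, upper = 0.1),
  minsplit = p_int(lower = 1, upper = 10)
)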
# Define termination criterion
terminator = trm("evals", n_evals = 20)
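As noted in the introduction, termination criteria can also be combined; a minimal sketch, assuming bbotk's "combo" terminator, which by default stops as soon as any wrapped criterion fires:
# Stop after 20 evaluations or 60 seconds, whichever comes first
# (assumes the "combo" and "run_time" terminators from bbotk)
terminator = trm("combo", terminators = list(
  trm("evals", n_evals = 20),
  trm("run_time", secs = 60)
))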
# Create tuning instance
instance = TuningInstanceSingleCrit$new(
  task = task,
  learner = learner,
  resampling = resampling,
  measure = measure,
  search_space = search_space,
  terminator = terminator
)
# Load tuner
tuner = tnr("grid_search", resolution = 5)
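The other tuning methods mentioned in the introduction can be swapped in the same way via their dictionary keys:
# Alternatives: random search or generalized simulated annealing
# tuner = tnr("random_search")
# tuner = tnr("gensa")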
# Trigger optimization
tuner$optimize(instance)
# View results
instance$result
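The best configuration can then be applied to the learner to fit a final model; a minimal sketch, assuming the result_learner_param_vals field of the tuning instance:
# Set the tuned hyperparameters on the learner and fit a final model
learner$param_set$values = instance$result_learner_param_vals
learner$train(task)
As mentioned in the introduction, AutoTuner wraps a learner together with the tuning setup, so nested resampling becomes a single resample() call; a minimal sketch, assuming the named constructor arguments below match this release:
# Inner loop: the AutoTuner tunes on a holdout split of each training set
at = AutoTuner$new(
  learner = lrn("classif.rpart"),
  resampling = rsmp("holdout"),
  measure = msr("classif.ce"),
  search_space = search_space,
  terminator = trm("evals", n_evals = 20),
  tuner = tnr("grid_search", resolution = 5)
)
# Outer loop: 3-fold cross-validation estimates generalization performance
rr = resample(tsk("pima"), at, rsmp("cv", folds = 3))
rr$aggregate(msr("classif.ce"))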
Further documentation can be found in the mlr3book and the mlr3tuning cheatsheet. Tutorials are available in the mlr3gallery.