tune {e1071}		R Documentation

Description

This generic function tunes hyperparameters of statistical methods using a grid search over supplied parameter ranges.
Usage

tune(method, train.x, train.y = NULL, data = list(),
     validation.x = NULL, validation.y = NULL, ranges = NULL,
     predict.func = predict, tunecontrol = tune.control(), ...)

best.tune(...)
Arguments

method: function to be tuned.

train.x: either a formula or a matrix of predictors.

train.y: the response variable if train.x is a predictor matrix. Ignored if train.x is a formula.

data: data, if a formula interface is used. Ignored if predictor matrix and response are supplied directly.

validation.x: an optional validation set. Depending on whether a formula interface is used or not, the response can be included in validation.x or separately specified using validation.y.

validation.y: if no formula interface is used, the response of the (optional) validation set.

ranges: a named list of parameter vectors spanning the sampling space. The vectors will usually be created by seq. (A small sketch of the resulting search grid follows this list.)

predict.func: optional predict function, if the standard predict behaviour is inadequate.

tunecontrol: object of class "tune.control", as created by the function tune.control(). If omitted, tune.control() gives the defaults.

...: further parameters passed to the training functions.
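The grid that is searched is the cross product of the vectors supplied in ranges. The following minimal sketch is not part of the package; it only uses base R's expand.grid() to illustrate how many parameter combinations such a list implies.

## Sketch only: the candidate grid is the cross product of the `ranges`
## vectors; expand.grid() is used here purely to show its size and layout.
ranges <- list(gamma = 2^seq(-4, 0), cost = 2^seq(0, 4))
grid <- expand.grid(ranges)
nrow(grid)   ## 5 x 5 = 25 parameter combinations to be evaluated
head(grid)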
Details

As performance measure, the classification error is used for classification, and the mean squared error for regression. It is possible to specify only one parameter combination (i.e., vectors of length 1) to obtain an error estimation of the specified type (bootstrap, cross-validation, etc.) on the given data set. For convenience, there are several tune.foo() wrappers defined, e.g., for nnet(), randomForest(), rpart(), svm(), and knn().
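As a small illustration of the single-combination use described above (a sketch, not one of the package's shipped examples), fixing every ranges vector to length 1 reduces tune() to a plain resampled error estimator; the resampling scheme is chosen through tune.control().

## Sketch: length-1 ranges give a bootstrap error estimate for one svm setting.
data(iris)
est <- tune(svm, Species ~ ., data = iris,
            ranges = list(gamma = 0.5, cost = 4),
            tunecontrol = tune.control(sampling = "bootstrap"))
est$best.performance   ## bootstrap estimate of the classification error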
Value

For tune, an object of class tune, including the components:

best.parameters: a 1 x k data frame, k number of parameters.

best.performance: best achieved performance.

performances: if requested, a data frame of all parameter combinations along with the corresponding performance results.

best.model: if requested, the model trained on the complete training data using the best parameter combination.

best.tune returns the best model detected by tune.
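A short sketch (assuming the performance table and the refitted best model have been requested via tune.control()) of how these components are typically accessed:

## Sketch: accessing the components of a "tune" object.
data(iris)
obj <- tune(svm, Species ~ ., data = iris,
            ranges = list(gamma = 2^(-1:1), cost = 2^(2:4)))
obj$best.parameters       ## 1 x 2 data frame with the winning gamma and cost
obj$best.performance      ## error achieved by that combination
head(obj$performances)    ## full grid with performance results, if requested
predict(obj$best.model, iris[1:5, ])  ## model refit on the complete data, if requested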
Author(s)

David Meyer
david.meyer@ci.tuwien.ac.at
See Also

tune.control, plot.tune, tune.svm, tune.wrapper
Examples

data(iris)
## tune `svm' for classification with RBF-kernel (default in svm),
## using one split for training/validation set
obj <- tune(svm, Species ~ ., data = iris,
            ranges = list(gamma = 2^(-1:1), cost = 2^(2:4)),
            tunecontrol = tune.control(sampling = "fix"))
## alternatively:
## obj <- tune.svm(Species ~ ., data = iris, gamma = 2^(-1:1), cost = 2^(2:4))
summary(obj)
plot(obj)

## tune `knn' using a convenience function; this time with the
## conventional interface and bootstrap sampling:
x <- iris[, -5]
y <- iris[, 5]
obj2 <- tune.knn(x, y, k = 1:5, tunecontrol = tune.control(sampling = "boot"))
summary(obj2)
plot(obj2)

## tune `rpart' for regression, using 10-fold cross validation (default)
data(mtcars)
obj3 <- tune.rpart(mpg ~ ., data = mtcars, minsplit = c(5, 10, 15))
summary(obj3)
plot(obj3)

## simple error estimation for lm using 10-fold cross validation
tune(lm, mpg ~ ., data = mtcars)