The caret package has several functions that attempt to streamline the model building and evaluation process. The train function can be used to:

- evaluate, using resampling, the effect of model tuning parameters on performance
- choose the "optimal" model across these parameters
- estimate model performance from a training set

First, a specific model must be chosen. Currently, 238 are available using caret; see train Model List or train Models By Tag for details. On those pages, there are lists of tuning parameters that can potentially be optimized. The first step in tuning the model (line 1 in the algorithm below) is to choose a set of parameters to evaluate. For example, if fitting a Partial Least Squares (PLS) model, the number of PLS components to evaluate must be specified. Once the model and tuning parameter values have been defined, the type of resampling should also be specified. Currently, k-fold cross-validation (once or repeated), leave-one-out cross-validation, and the bootstrap (simple estimation or the 632 rule) resampling methods can be used by train. After resampling, the process produces a profile of performance measures to guide the user as to which tuning parameter values should be chosen. By default, the function automatically chooses the tuning parameters associated with the best value, although different algorithms can be used (see details below). The Sonar data are available in the mlbench package.
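As a minimal sketch of this workflow, the Sonar data can be split into training and test sets and a PLS model tuned with repeated cross-validation. The function names here are caret's own; the seed, the 75% split, and tuning over 1-15 components are illustrative assumptions, not prescriptions from the text:

```r
library(caret)
library(mlbench)
data(Sonar)

set.seed(107)
# Stratified 75/25 split on the class label
inTrain  <- createDataPartition(y = Sonar$Class, p = 0.75, list = FALSE)
training <- Sonar[inTrain, ]
testing  <- Sonar[-inTrain, ]

# Resampling specification: 10-fold CV, repeated 3 times
ctrl <- trainControl(method = "repeatedcv", repeats = 3)

# Tune the number of PLS components (1 to 15) over the resamples
plsFit <- train(Class ~ ., data = training,
                method     = "pls",
                tuneLength = 15,
                trControl  = ctrl,
                preProc    = c("center", "scale"))

plsFit          # profile of performance across tuning values
predict(plsFit, newdata = testing)   # predictions from the chosen model
```

Printing the fitted object shows the resampled performance profile across candidate component counts, and train automatically refits the final model using the best-performing value.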