Further examples are automatically matched to the headwords - we do not guarantee their correctness.
However, there are many ways in which cross-validation can be misused.
Cross-validation and related techniques must be used for validating the model instead.
In machine learning, this is typically done by cross-validation.
Backtesting can be considered a type of cross-validation applied to time series data.
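As a minimal sketch of that idea, the splitter below uses an expanding training window over a time series, so each test fold lies strictly after the data the model was fit on; the function name, window sizes, and step are illustrative, not taken from any particular library.

```python
# Sketch of expanding-window splits for backtesting a time-series model.
# Unlike ordinary k-fold cross-validation, test indices always come
# strictly after the training indices, so no future data leaks into the fit.

def expanding_window_splits(n, initial, step=1):
    """Yield (train_indices, test_indices) pairs for a series of length n.

    The first training window covers indices 0..initial-1; each round
    tests on the next `step` points, then absorbs them into training.
    """
    t = initial
    while t + step <= n:
        yield list(range(t)), list(range(t, t + step))
        t += step

for train, test in expanding_window_splits(6, initial=3):
    print(train, "->", test)
```

Averaging the per-fold forecast errors over these splits gives the backtest score.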
Cross-validation was used to test the robustness of classification accuracy.
One form of cross-validation leaves out a single observation at a time; this is similar to the jackknife.
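That leave-one-out scheme can be sketched in a few lines; the toy 1-D data and the mean predictor below are illustrative assumptions, chosen only to keep the example self-contained.

```python
# Minimal sketch of leave-one-out cross-validation (LOOCV): each
# observation is held out once, the model (here, simply the mean of the
# remaining points) is fit on the rest, and the held-out point is scored.

def loocv_error(y):
    """Return the leave-one-out mean squared error of a mean predictor."""
    n = len(y)
    errors = []
    for i in range(n):
        train = y[:i] + y[i + 1:]          # all points except the i-th
        pred = sum(train) / len(train)     # "fit": mean of the training fold
        errors.append((y[i] - pred) ** 2)  # test on the held-out point
    return sum(errors) / n

print(loocv_error([1.0, 2.0, 3.0, 4.0]))  # ≈ 2.222 for this toy series
```

As the sentence above notes, this resembles the jackknife, which also refits on all n leave-one-out subsamples, though the jackknife is used to estimate bias and variance rather than prediction error.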
In all cases it is useful to check the accuracy of the emulator, for example using cross-validation.
Cross-validation is a statistical method for validating a predictive model.
Cross-validation can be used to compare the performances of different predictive modeling procedures.
Cross-validation can also be used in variable selection.
"Estimating the error rate of a prediction rule: improvement on cross-validation".
Cross-validation is an approach by which the sets of scientific data generated using two or more methods are critically assessed.
For cross-validation, the accuracy levels were compared using a z-test for proportions.
One approach is to estimate the generalizability of a model to independent datasets using methods such as cross-validation.
The class of techniques is called cross-validation.
Cross-validation only yields meaningful results if the validation set and test set are drawn from the same population.
Cross-validation is the process of assessing how the results of a statistical analysis will generalize to an independent data set.
The value of r is found through cross-validation or a forward stagewise strategy, which stops when the model fit cannot be significantly improved.
He also pioneered the theory of cross-validation.
Alternative methods of controlling overfitting not involving regularization include cross-validation.
However, under cross-validation, the model with the best fit will generally include only a subset of the features that are deemed truly informative.
If the prediction method is expensive to train, cross-validation can be very slow since the training must be carried out repeatedly.
Regularization can be used to fine-tune model complexity, using an augmented error function together with cross-validation.
This is the simplest variation of k-fold cross-validation.
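A k-fold split can be sketched at the index level as follows; the contiguous (unshuffled) folds are a simplifying assumption, and the function name is illustrative.

```python
# Sketch of k-fold cross-validation splits: indices 0..n-1 are
# partitioned into k folds, and each fold serves as the test set once
# while the remaining k-1 folds form the training set.

def k_fold_splits(n, k):
    """Yield (train_indices, test_indices) pairs, one per fold."""
    # Distribute n points over k folds as evenly as possible.
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        test = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n))
        yield train, test
        start += size

for train, test in k_fold_splits(10, 5):
    print(test)
```

Averaging the model's score over the k test folds gives the cross-validation estimate; leave-one-out is the special case k = n.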
This can be done by cross-validation, or by using an analytic estimate of the shrinkage intensity.