SVM Cross Validation Training

Nedz on 7 May 2020
Answered: Gayathri on 3 January 2025
I am using K-fold cross-validation with K = 10.
I am supposed to run the 10-fold cross-validation and take the average of the SVM performance.
How should I do this? Does running the cross-validation once generate a prediction for only one fold, or for the complete set of 10 folds?
1 Comment
Mohammad Sami on 8 May 2020
According to the documentation, the reported performance is the average over all folds:
https://www.mathworks.com/help/releases/R2020a/stats/select-data-and-validation-for-classification-problem.html


Answers (1)

Gayathri on 3 January 2025
Hi @Nedz,
I understand that you need to perform K-fold cross-validation for an SVM model. For this, you can use the "crossval" function, and then the "kfoldLoss" function to get the classification loss of the cross-validated model. Please refer to the code below, which implements this.
load ionosphere
% Train an SVM classifier using the radial basis function kernel
SVMModel = fitcsvm(X,Y,'Standardize',true,'KernelFunction','RBF','KernelScale','auto');
% Cross-validate the SVM classifier (10 folds by default)
CVSVMModel = crossval(SVMModel);
% Estimate the out-of-sample misclassification rate, averaged over the folds
classLoss = kfoldLoss(CVSVMModel)
"crossval" by default uses 10-fold cross-validation.
Please refer to the "Train and Cross-Validate SVM Classifier" example in the documentation for more details.
Hope you find this information helpful!
