fitcsvm cross-validation
Hi, I am training an SVM classifier with the following code:
SVM_1 = fitcsvm(X_train, y_train, ...
    'OptimizeHyperparameters', 'all', ...
    'HyperparameterOptimizationOptions', struct( ...
        'Optimizer', 'bayesopt', ...
        'AcquisitionFunctionName', 'expected-improvement-per-second-plus', ...
        'Kfold', 10, ...
        'ShowPlots', 0));
I was wondering whether there is any way to retrieve a performance metric of the classifier from the cross-validation (AUC, for example), since I specify 10-fold cross-validation.
Thank you,
J
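For reference, a minimal sketch of how an AUC estimate could be obtained from a separate 10-fold cross-validation of the model that fitcsvm returns (variable names follow the question; the numeric positive-class label 1 is an assumption, and kfoldPredict scores for an SVM are signed distances, which perfcurve accepts):
CVSVM = crossval(SVM_1, 'KFold', 10);           % 10-fold partitioned model of the optimized SVM
[~, scores] = kfoldPredict(CVSVM);              % out-of-fold classification scores
posClass = 1;                                   % assumed positive-class label (numeric labels assumed)
posCol = find(CVSVM.ClassNames == posClass);    % score column for that class
[~, ~, ~, AUC] = perfcurve(y_train, scores(:, posCol), posClass)
Note that this cross-validates the final model (refit on all of X_train with the best hyperparameters), so it is a fresh 10-fold estimate rather than the loss computed during the optimization itself.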
Accepted Answer
Alan Weiss
16 Apr 2021
As shown in this doc example, the cross-validation loss is reported at the command line and plotted by default (I see that you turned off the plot). Is there something else that you need, or did I misunderstand you?
Alan Weiss
MATLAB mathematical toolbox documentation
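For reference, the cross-validation loss that the optimization reports can also be read back from the returned model after the fact. A minimal sketch, assuming the default bayesopt results object stored in the HyperparameterOptimizationResults property:
results = SVM_1.HyperparameterOptimizationResults;  % BayesianOptimization object
minCVLoss = results.MinObjective                    % smallest observed 10-fold misclassification loss
bestParams = results.XAtMinObjective                % hyperparameter values at that minimum
% bestPoint(results) returns the point the solver's model estimates to be
% best, which can differ from the observed minimum (see the comment below).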
3 Comments
Alan Weiss
16 Apr 2021
The "Objective" in the iterative display (the generated table of iterations) is the cross-validation loss. The "Best so far" is simply the minimum objective up to that iteration. There is a difference between the "best so far" estimated and observed; that is a function of the model that the solver is estimating, and that changes every iteration. The model is that the observations themselves are noisy, so simply observing a value doesn't mean that observing it again will give the same response.
In a nutshell, I think that the iterative display gives you the information you seek.
Alan Weiss
MATLAB mathematical toolbox documentation
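A small sketch of how the observed and estimated "best so far" values described above could be pulled out afterwards (property names are those of the BayesianOptimization object; verify them against your release):
results = SVM_1.HyperparameterOptimizationResults;
cvLossPerIter = results.ObjectiveTrace;                 % "Objective" column: 10-fold loss at each iteration
bestObserved  = results.ObjectiveMinimumTrace;          % running minimum of the observed loss
bestEstimated = results.EstimatedObjectiveMinimumTrace; % model-based estimate of the minimum
plot([bestObserved bestEstimated]); legend('observed', 'estimated');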