How to compute kfoldLoss error from ClassificationLinear?

Views: 2 (last 30 days)
Yean Lim on 7 Dec 2020
Edited: Walter Roberson on 10 Dec 2020
% 5-fold cross-validation partition used by the Bayesian optimization
cv = cvpartition(numel(y_trainUndersampled),'KFold',5);

% Options for the hyperparameter search
hyperOpt = struct('AcquisitionFunctionName','expected-improvement-plus',...
    'Optimizer','bayesopt','MaxObjectiveEvaluations',100,...
    'CVPartition',cv);

% Optimize Lambda and Regularization; returns a ClassificationLinear model
bestLogsMdl = fitclinear(X_trainUndersampled, y_trainUndersampled,...
    'Learner','logistic',...
    'OptimizeHyperparameters',{'Lambda','Regularization'},...
    'HyperparameterOptimizationOptions',hyperOpt,...
    'ScoreTransform','logit');
Hi, I have used hyperparameter optimization with the fitclinear function. The code above produces bestLogsMdl as a ClassificationLinear object.
I want to compute the kfoldLoss from this ClassificationLinear model.
However, according to the documentation at https://uk.mathworks.com/help/stats/fitclinear.html#bu5mw4p , kfoldLoss is used on a ClassificationPartitionedLinear object.
How can I use hyperparameter optimization with fitclinear together with kfoldLoss? What modifications are needed to the fitclinear call so that it produces a ClassificationPartitionedLinear?
My ultimate goal is to plot the misclassification rate against the number of learning cycles.
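For reference, the documented pattern (as I understand it from the page linked above) looks roughly like this, reusing the variables from the code above; this is only a sketch, not code from the original post:
% Sketch of the documented pattern: a cross-validation option such as
% 'CVPartition' (with no 'OptimizeHyperparameters') makes fitclinear return
% a ClassificationPartitionedLinear, which is what kfoldLoss expects.
cvMdl = fitclinear(X_trainUndersampled, y_trainUndersampled, ...
    'Learner','logistic', ...
    'CVPartition',cv);
err = kfoldLoss(cvMdl)   % 5-fold misclassification rate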

Accepted Answer

Walter Roberson on 8 Dec 2020
Edited: Walter Roberson on 10 Dec 2020
You cannot use any cross-validation name-value pair argument along with the 'OptimizeHyperparameters' name-value pair argument. You can modify the cross-validation for 'OptimizeHyperparameters' only by using the 'HyperparameterOptimizationOptions' name-value pair argument.
So you need to get rid of 'OptimizeHyperparameters' and set the appropriate 'HyperparameterOptimizationOptions'.
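A minimal sketch of one possible way to combine the two (this code is not from the answer above): keep the optimization call from the question, then refit once more with the tuned hyperparameters and the 'CVPartition' option but without 'OptimizeHyperparameters', so that fitclinear returns a ClassificationPartitionedLinear that kfoldLoss accepts. The property reads bestLogsMdl.Lambda and bestLogsMdl.Regularization, and the strtok clean-up, are assumptions about the returned model:
% Assumed: bestLogsMdl.Lambda holds the tuned Lambda, and
% bestLogsMdl.Regularization is e.g. 'ridge (L2)' or 'lasso (L1)'.
bestLambda = bestLogsMdl.Lambda;
bestReg    = strtok(bestLogsMdl.Regularization);   % 'ridge (L2)' -> 'ridge'

% Refit with the tuned hyperparameters and the same CV partition: with
% 'CVPartition' (and no 'OptimizeHyperparameters'), fitclinear returns a
% ClassificationPartitionedLinear.
cvMdl = fitclinear(X_trainUndersampled, y_trainUndersampled, ...
    'Learner','logistic', ...
    'Lambda',bestLambda, ...
    'Regularization',bestReg, ...
    'ScoreTransform','logit', ...
    'CVPartition',cv);

misclassRate = kfoldLoss(cvMdl)   % 5-fold misclassification rate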

More Answers (0)
