kfoldLoss and regression machine learning like fitrsvm

1 view (last 30 days)
Dimitri on 3 Nov 2018
Answered: Don Mathis on 7 Nov 2018
Hello,
I want to calculate the cross-validation loss of my different regression machine learning models to compare them with each other. Therefore I want to use kfoldLoss, but I'm getting an error.
My code looks as follows:
% Split data into training and test sets
[m,n] = size(Daten);
P = 0.8;
Training = Daten(1:round(P*m),:);
Testing = Daten(round(P*m)+1:end,:);
XTrain = Training(:,1:n-1);
YTrain = Training(:,n);
XTest = Testing(:,1:n-1);
YTest = Testing(:,n);
% Hyperparameter optimization
rng default
c = cvpartition(YTrain,'KFold',10);
Mdl = fitrsvm(XTrain,YTrain,'KernelFunction','gaussian','OptimizeHyperparameters','Epsilon',...
    'HyperparameterOptimizationOptions',struct('AcquisitionFunctionName',...
    'expected-improvement-plus','cvpartition',c));
L = kfoldLoss(Mdl)
I want to use this code structure with the other regression functions, such as fitrtree, in the Bayesian optimization workflow. Why does kfoldLoss not work for this code?
Best regards, Dimitri

Answers (1)

Don Mathis on 7 Nov 2018
When you call fitrsvm with 'OptimizeHyperparameters', the result is a single SVM model (a RegressionSVM object), not a partitioned model with a kfoldLoss method. To get an estimate of the out-of-sample loss of your final model, you'll need to run the crossval function on it and then call kfoldLoss on the result of that:
pm = crossval(Mdl, 'cvpartition', c)
kfoldLoss(pm)
Or, since there's no need to reuse the cvpartition that you used for the optimization,
pm = crossval(Mdl, 'KFold', 10)
kfoldLoss(pm)
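To illustrate how this pattern could extend to the other regression fitters mentioned in the question, here is a minimal sketch comparing an optimized SVM and an optimized regression tree. It reuses XTrain and YTrain from the question's data split; the specific hyperparameter options chosen here are only illustrative.
MdlSVM = fitrsvm(XTrain,YTrain,'KernelFunction','gaussian', ...
    'OptimizeHyperparameters','Epsilon');         % single RegressionSVM model
MdlTree = fitrtree(XTrain,YTrain, ...
    'OptimizeHyperparameters','auto');            % single RegressionTree model
% Cross-validate each final model, then compare the k-fold losses (MSE by default)
cvSVM  = crossval(MdlSVM,'KFold',10);
cvTree = crossval(MdlTree,'KFold',10);
lossSVM  = kfoldLoss(cvSVM)
lossTree = kfoldLoss(cvTree)
The key point is the same as above: the fit functions return ordinary (non-partitioned) models when 'OptimizeHyperparameters' is used, so crossval must be called explicitly before kfoldLoss.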

Release

R2018b
