cvloss or kfoldLoss for regression tree?
Hi,
I'm a bit confused about 'cvloss' and 'kfoldLoss'.
1. kfoldLoss
Syntax: L = kfoldLoss(cvmodel) returns the cross-validation loss of cvmodel.
load carsmall
>> XX = [Displacement Horsepower Weight];
>> YY = MPG;
>> cvmodel = fitrtree(XX,YY,'crossval','on');
>> L = kfoldLoss(cvmodel,'mode','average')
L =
30.3578
Default: 'mse', mean squared error.
2. cvloss
Syntax: E = cvloss(tree) returns the cross-validated regression error (loss) for a regression tree.
>> load carsmall
>> X = [Displacement Horsepower Weight];
>> Mdl = fitrtree(X,MPG);
>> rng(1);
>> E = cvloss(Mdl)
E =
25.7383
First, both cases use the same predictors and the same response, so why is there a difference between the L and E outcomes?
Second, in 'fitrtree' the 'crossval' option is 'off' by default. In the 'cvloss' example, 'Mdl = fitrtree(X,MPG);' does not turn 'crossval' on, so how can cvloss produce a cross-validated regression error when cross-validation was never enabled?
Third, how are kfoldLoss and cvloss calculated? It looks like they both use MSE, yet they give completely different results.
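For reference, both functions compute 10-fold cross-validated mean squared error by default. kfoldLoss evaluates the partition created when fitrtree is called with 'CrossVal','on', while cvloss runs its own internal cross-validation on the fitted tree, which is why the 'CrossVal' flag does not need to be on for the cvloss example. Because each call draws a random data partition, the two examples above use different folds, which is the most likely reason for the different losses. A minimal sketch (my own, not from the original post), seeding the generator immediately before each call so the partitions are reproducible:

```matlab
load carsmall
X = [Displacement Horsepower Weight];

% 10-fold CV via a cross-validated model + kfoldLoss
rng(1);                                   % fix the random partition
cvmodel = fitrtree(X,MPG,'CrossVal','on');
L = kfoldLoss(cvmodel)                    % cross-validated MSE

% 10-fold CV via cvloss on a plain fitted tree
rng(1);                                   % same seed before partitioning
Mdl = fitrtree(X,MPG);
E = cvloss(Mdl)                           % also cross-validated MSE
```

With identical seeds the two losses should be much closer; whether they match exactly depends on how each function consumes the random number stream internally, so treat this as a way to check the partition effect rather than a guarantee of equal output.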
Answers (1)
Jeremy Brecevic
27 November 2020
Unlike cvloss, kfoldLoss does not return SE, Nleaf, or BestLevel. Note also that for a regression tree the relevant loss is the mean squared error, not the classification error.
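To illustrate those extra outputs, here is a sketch using the documented four-output form of cvloss together with the 'Subtrees' option, which reports the loss at every pruning level:

```matlab
load carsmall
X = [Displacement Horsepower Weight];
Mdl = fitrtree(X,MPG);

rng(1);  % make the internal cross-validation partition reproducible
[E,SE,Nleaf,BestLevel] = cvloss(Mdl,'Subtrees','all');
% E         : cross-validated MSE at each pruning level
% SE        : standard error of each element of E
% Nleaf     : number of leaves at each pruning level
% BestLevel : suggested pruning level (per the one-standard-error rule)
```

BestLevel can then be passed to prune(Mdl,'Level',BestLevel) to obtain the smaller tree, something kfoldLoss on a cross-validated model does not provide directly.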