How can I make a decision stump using a decision tree?

Views: 9 (last 30 days)
MHN on 4 Feb 2016
Commented: MHN on 4 Feb 2016
Right now, I am using fitctree and pruning the tree to level one, but because the 'SplitCriterion' is 'gdi' and the 'PruneCriterion' is 'error', I cannot find the decision stump that minimizes the total error. I know there is a 'MaxNumSplits' option for growing a decision tree, but that option was added in 2015 and I am using the 2014 version of MATLAB. So I need to build a decision stump that minimizes the error with respect to the given weights, and I would prefer to use fitctree rather than writing it from scratch. Any suggestions?
Here is an example that explains my problem more clearly. The correct one should look like this:
I have a full tree like the following tree, and I am pruning it to the (max-1) level.
Since it is not a decision stump yet, I prune it one more level, but the result is not the same as the correct one.
And this is the code I am using:
MinLeafSize = 1;
MinParentSize = 2;
NumVariablesToSample = 'all';
ScoreTransform = 'none';
PruneCriterion = 'error';
SplitCriterion = 'gdi';
Weights = Pt; % train the weak learner with the weights Pt
tree = fitctree(X, Y, 'MinLeafSize', MinLeafSize, 'MinParentSize', MinParentSize, ...
    'NumVariablesToSample', NumVariablesToSample, 'PruneCriterion', PruneCriterion, ...
    'SplitCriterion', SplitCriterion, 'Weights', Weights, 'ScoreTransform', ScoreTransform);
prune_tree = prune(tree, 'Level', max(tree.PruneList)-1); % prune the tree down to a decision stump
% if the pruned tree still has more than one decision node (i.e. more than three
% nodes in total), prune at max(tree.PruneList) to reduce it to a single decision node
if length(prune_tree.NodeSize) > 3
    prune_tree = prune(tree, 'Level', max(tree.PruneList));
end
P.S. I would like to implement AdaBoostM1 using decision stumps as the weak learner. I know there is a built-in function 'fitensemble' for making an ensemble classifier with the AdaBoostM1 algorithm, but I would like to implement it myself.
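For reference, a rough sketch of such an AdaBoostM1 loop around a weighted stump (assuming numeric labels in {-1, +1}; the number of rounds T and the single-split stump settings are illustrative, not a definitive implementation):
T = 50;              % number of boosting rounds (illustrative)
n = size(X, 1);
Pt = ones(n, 1) / n; % start from uniform example weights
alpha = zeros(T, 1);
stumps = cell(T, 1);
for t = 1:T
    % grow a one-split stump on the current weights Pt (forcing a single split
    % by requiring the whole data set as the minimum parent size)
    stumps{t} = fitctree(X, Y, 'MinParentSize', size(X,1), 'Prune', 'off', ...
        'MergeLeaves', 'off', 'Weights', Pt);
    pred = predict(stumps{t}, X);
    err = max(sum(Pt .* (pred ~= Y)), eps); % weighted training error
    % AdaBoostM1 would normally stop if err >= 0.5
    alpha(t) = 0.5 * log((1 - err) / err);  % weight of this weak learner
    Pt = Pt .* exp(-alpha(t) * Y .* pred);  % up-weight misclassified examples
    Pt = Pt / sum(Pt);                      % renormalize
end
% final prediction on new data: sign of sum over t of alpha(t) * predict(stumps{t}, Xnew)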
1 Comment
MHN on 4 Feb 2016
Edited: MHN on 4 Feb 2016
The default option for 'fitensemble' with the 'AdaBoostM1' method grows such a decision stump. Does anyone know how to manually set the options for fitctree to obtain the same decision stump? I have checked classTreeEns.Trained, which shows the tree properties, but since it is a compact classification tree, the pruning information and ModelParameters are removed.
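One way to see what the default boosted stump looks like (a sketch, assuming X and Y are already in the workspace) is to view the first trained learner of a small ensemble:
ens = fitensemble(X, Y, 'AdaBoostM1', 10, 'Tree'); % small ensemble just for inspection
view(ens.Trained{1}, 'Mode', 'graph');             % compare against the manually built stump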


Accepted Answer

Ilya on 4 Feb 2016
Use
fitctree(X,Y,'minparent',size(X,1),'prune','off','mergeleaves','off')
1 Comment
MHN on 4 Feb 2016
Thank you very much indeed. I have just added weights to it as well.
fitctree(X,Y,'minparent',size(X,1),'prune','off','mergeleaves','off', 'Weights', Weights)
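As a quick sanity check (a sketch, assuming X, Y, and Weights are in the workspace), the result can be verified to contain exactly one split:
stump = fitctree(X, Y, 'minparent', size(X,1), 'prune', 'off', ...
    'mergeleaves', 'off', 'Weights', Weights);
numSplits = sum(stump.IsBranchNode) % should be 1 for a decision stump
view(stump, 'Mode', 'graph')        % root split plus two leaves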


More Answers (0)


