Proper use of ClassificationTree.fit for categorical variables?
The documentation for fitting classification trees states that X needs to be a floating point array, but also indicates that X can represent categorical variables (using the 'CategoricalPredictors' Name-Value argument).
Is the proper way to handle this to
(1) take the categorical variable, e.g.
category1 = {'duck','duck','goose','squash','quartz'}';
category2 = {'animal','animal','animal','vegetable','mineral'}';
(2) run those through grp2idx()
numcat1 = grp2idx(category1);
numcat2 = grp2idx(category2);
(3) Embed those in my X:
X = [numcat1 numcat2 otherTrulyNumericalVariables]
(4) Identify those as categorical
tree = ClassificationTree.fit(X,Y,'CategoricalPredictors',[1 2])
Seems like that's probably right, but I'd love an expert to vet that idea. The documentation doesn't have a categorical example.
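For reference, here is the whole recipe as a standalone snippet; the response Y and the extra numeric column are placeholders I made up so that it runs:
category1 = {'duck','duck','goose','squash','quartz'}';
category2 = {'animal','animal','animal','vegetable','mineral'}';
numcat1 = grp2idx(category1);
numcat2 = grp2idx(category2);
otherTrulyNumericalVariables = [1.2; 3.4; 5.6; 7.8; 9.0];  % placeholder numeric predictor
Y = {'yes';'yes';'no';'no';'yes'};                         % placeholder response
X = [numcat1 numcat2 otherTrulyNumericalVariables];
tree = ClassificationTree.fit(X, Y, 'CategoricalPredictors', [1 2])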
Accepted Answer
Ilya
8 Nov 2013
Yes, this would be one way to accomplish it. You'd have to be careful when you convert new data to numeric for prediction: if the new data are missing a level (for example, 'goose' does not appear among the new values), grp2idx can return different indices for the same categorical values. One way to avoid this pitfall is to use the nominal type and specify the level order explicitly, for example:
category1 = nominal({'duck','duck','goose','squash','quartz'}, ...
    [], {'goose','squash','quartz','duck'})
numcat1 = double(category1)
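To make the prediction-time pitfall concrete, here is a small sketch (the new observations are made up): on new data that is missing a level, grp2idx re-numbers the remaining levels by order of appearance, while nominal with the same explicit level list keeps the codes consistent with training.
% New observations to classify; note that 'goose' never appears here.
newCategory1 = {'duck','squash'}';
% grp2idx numbers levels by order of first appearance, so here 'duck' -> 1 and
% 'squash' -> 2, whereas grp2idx on the training data gave 'squash' -> 3.
badCodes = grp2idx(newCategory1)
% Reusing the same explicit level order keeps the codes consistent with training:
% 'goose' -> 1, 'squash' -> 2, 'quartz' -> 3, 'duck' -> 4.
newNominal1 = nominal(newCategory1, [], {'goose','squash','quartz','duck'});
goodCodes = double(newNominal1)   % [4; 2]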
Depending on how you get your data, you might find it easier to put your entire data (numeric and categorical variables) into a table or, if you are not in R2013b yet, into a dataset object and then extract numeric and categorical variables from that object.
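In case it helps, a minimal sketch of that table-based approach (the table variable names and the numeric column are made up for illustration): keep the categorical variables as nominal with explicit levels inside the table, then pull out a numeric matrix when you call ClassificationTree.fit.
category1 = {'duck','duck','goose','squash','quartz'}';
category2 = {'animal','animal','animal','vegetable','mineral'}';
T = table(nominal(category1, [], {'goose','squash','quartz','duck'}), ...
          nominal(category2, [], {'animal','vegetable','mineral'}), ...
          [1.2; 3.4; 5.6; 7.8; 9.0], ...                   % placeholder numeric predictor
          'VariableNames', {'cat1','cat2','num1'});
X = [double(T.cat1) double(T.cat2) T.num1];                % numeric matrix for fitting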