classifier

6 views (last 30 days)
FIR
FIR on 7 Apr 2012
Answered: Agustin on 17 Nov 2014
I have a 100x6 dataset that I want to classify and then measure the accuracy for, using a random forest and an MLP. I have already classified it with SVM and kNN, but I don't know how to do it with an MLP or a random forest. Please help.

Accepted Answer

Greg Heath
Greg Heath on 11 Apr 2012
For Neural Net classification, see the documentation for patternnet and the classification demo example.
Hope this helps.
Greg
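For reference, a minimal patternnet sketch along those lines (not taken from the answer above; fisheriris stands in for the poster's 100x6 data, and the 10 hidden neurons are an arbitrary choice):
load fisheriris                           % stand-in data set
x = meas';                                % patternnet expects one sample per column
t = full(ind2vec(grp2idx(species)'));     % one-hot targets, one column per sample
net = patternnet(10);                     % MLP with one hidden layer of 10 neurons
net = train(net,x,t);
pred = vec2ind(net(x));                   % predicted class indices
accuracy = mean(pred' == grp2idx(species))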

Additional Answers (4)

Agustin
Agustin on 17 Nov 2014
I have written the following code to do cross-validation using TreeBagger (I use the fisheriris dataset):
load fisheriris
X = meas;
y = species;
% data partition into 10 folds
cp = cvpartition(y,'KFold',10);
% prediction function: train a TreeBagger on the training fold, predict the test fold
classF = @(XTRAIN,ytrain,XTEST)(predict(TreeBagger(50,XTRAIN,ytrain),XTEST));
% misclassification error
misclassError = crossval('mcr',X,y,'predfun',classF,'partition',cp);
I hope it is useful.

Ilya
Ilya on 7 Apr 2012
If you have the Statistics Toolbox and MATLAB R2009a or later, you can use TreeBagger. Please read the documentation and take a look at the examples. Follow up with a specific question if something remains unclear.
For MLP, take a look at the Neural Network Toolbox.
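For example, a minimal TreeBagger sketch (fisheriris as stand-in data; the 50 trees are an arbitrary choice):
load fisheriris                                   % stand-in data set
B = TreeBagger(50,meas,species,'oobpred','on');   % bag 50 classification trees
labels = predict(B,meas);                         % predicted class labels (cell array)
resubAccuracy = mean(strcmp(labels,species))      % resubstitution accuracy (optimistic)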
2 Comments
FIR
FIR on 10 Apr 2012
How do I add cross-validation to this TreeBagger call? Please help:
B = TreeBagger(ntrees,X,Y)
Greg Heath
Greg Heath on 11 Apr 2012
See the pattern recognition and classification demos in the Neural Network Toolbox.
Hope this helps.
Greg



Ilya
Ilya on 10 Apr 2012
You can use the out-of-bag error as an unbiased estimate of the generalization error. Train TreeBagger with 'oobpred' set to 'on' and call the oobError method.
If you insist on using cross-validation, do 'doc crossval' and follow examples there.
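A minimal sketch of the out-of-bag approach (fisheriris as stand-in data; the 50 trees are an arbitrary choice):
load fisheriris
B = TreeBagger(50,meas,species,'oobpred','on');   % keep out-of-bag predictions
oobErr = oobError(B);         % cumulative out-of-bag error as trees 1..50 are added
estGenError = oobErr(end)     % generalization-error estimate using all 50 trees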
2 Comments
FIR
FIR on 11 Apr 2012
I have this code:
load fisheriris
groups=species;
cvFolds = crossvalind('kfold', groups, 10); %# get indices of 10-fold CV
cp = classperf(groups);
for k=1:10
b = TreeBagger(10,meas,species,'oobpred','on');
cp = classperf(groups,b)
end
but I get this error:
Error using TreeBagger/subsref (line 884)
Subscripting into TreeBagger using () is not allowed.
Error in classperf (line 219)
gps = varargin{1}(:);
Error in yass (line 10)
cp = classperf(groups,b)
Please help.
Ilya
Ilya on 11 Apr 2012
First, this thread has become convoluted. If you want to post another question, I suggest that you post it as a new question, not as an answer to your own old question.
Second, I suggested that you look at function crossval, not crossvalind. You can cross-validate using crossvalind too, but crossval offers an API that reduces the amount of code you need to write.
Third, whether you choose to use crossval or crossvalind, please take a look at the examples and follow them closely. In particular, the 2nd example for crossval here http://www.mathworks.com/help/toolbox/stats/crossval.html shows what you need to do. You would need to replace the function handle classf in that example with a function that has two lines of code in it: 1) train a TreeBagger on Xtrain and Ytrain, and 2) predict labels for Xtest using the trained TreeBagger.
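A sketch of that two-line prediction function (the name classf, the 50 trees, and the call below are assumptions; Agustin's answer above shows the equivalent anonymous-function form):
function yfit = classf(XTRAIN,ytrain,XTEST)
% 1) train a TreeBagger on the training fold; 2) predict labels for the test fold
B = TreeBagger(50,XTRAIN,ytrain);
yfit = predict(B,XTEST);
end
Then, for example:
err = crossval('mcr',meas,species,'predfun',@classf,'kfold',10)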



Richard Willey
Richard Willey on 11 Apr 2012
I did a webinar a couple of years ago titled "Computational Statistics: An Introduction to Classification with MATLAB". You can watch the recorded webinar online; the demo code and data sets are available on MATLAB Central.
3 Comments
Richard Willey
Richard Willey on 11 Apr 2012
Not sure what you mean by rbf? Radial Basis Functions?
Greg Heath
Greg Heath on 11 Apr 2012
Search the Newsgroup using
heath newrb design
Hope this helps.
Greg

