Why is accuracy zero?
function [result] = multisvm(TrainingSet,GroupTrain,TestSet)
%MULTISVM Models a given training set with a corresponding group vector and
%classifies a given test set using an SVM classifier according to a
%one-vs-all relation.
%
%This code was written by Cody Neuburger cneuburg@fau.edu
%Florida Atlantic University, Florida USA
%This code was adapted and cleaned from Anand Mishra's multisvm function
%found at http://www.mathworks.com/matlabcentral/fileexchange/33170-multi-class-support-vector-machine/
GroupTrain = GroupTrain';
u = unique(GroupTrain);
numClasses = length(u);
result = zeros(size(TestSet,1),1);
models = cell(numClasses,1);

% build one binary model per class
for k = 1:numClasses
    % binarize the group: 1 for the current class, 0 for all other classes
    G1vAll = (GroupTrain == u(k));
    models{k} = fitcsvm(TrainingSet,G1vAll);
end

% classify test cases: record the first class whose model predicts true
for j = 1:size(TestSet,1)
    for d = 1:numClasses
        if predict(models{d},TestSet(j,:))
            break;
        end
    end
    result(j) = d;
end

load Group_Test
%Accuracy = mean(Group_Test1==result)*100;
%fprintf('Accuracy = %f\n', Accuracy);
%fprintf('error rate = %f\n ', length(find(result ~= Group_Test1))/length(Group_Test1'));
c = 0;
for j = 1:size(TestSet,1)
    if Group_Test1(j) == result(j)
        c = c + 1;
    end
end
acc = c/100   % assumes exactly 100 test cases
end
2 Comments
DGM
9 Aug 2021
I'm assuming that the equality test in the if statement in the screenshot is never true. If these are floating-point numbers, that's entirely possible.
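As an aside, a minimal sketch (synthetic values, not the question's data) of how an exact equality test on floating-point numbers can fail even when the values print identically:

```matlab
% Hypothetical example: exact equality on floating-point values.
a = 0.1 + 0.2;                   % stored as 0.30000000000000004...
b = 0.3;
exactMatch = (a == b)            % logical 0: the doubles differ in the last bits
tolMatch = abs(a - b) < 1e-9     % logical 1: compare with a tolerance instead
```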
Accepted Answer
Walter Roberson
9 Aug 2021
result(j) is going to be a class number, an integer represented in double precision.
GroupTrain is not necessarily an integer class number at all, and is not necessarily consecutive from 1 even if it is integer. All we know is that it is a datatype that unique() can be applied to and that == comparisons work for.
For example, if GroupTrain is 10, 20, 30, then u = unique() of that would be 10, 20, 30, and the code would loop through training based upon whether the class was 10, then whether it was 20, and so on. Then it would loop over classes and use predict(); if the prediction was non-zero, it would record the class index rather than u() indexed at the class index. So the predictions might be perfect, but 1, 2, 3 would be recorded, and those would not match the 10, 20, 30 class labels.
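A minimal sketch of that mismatch (synthetic labels, not the asker's data), and the corresponding one-line fix of recording u(d) instead of d in multisvm:

```matlab
% Hypothetical demo of the index-vs-label mismatch described above.
trueLabels = [10; 20; 30; 10];    % class labels are NOT consecutive from 1
u = unique(trueLabels);           % [10; 20; 30]
predIdx = [1; 2; 3; 1];           % "perfect" predictions, but as loop indices

accWrong = mean(trueLabels == predIdx) * 100      % 0: indices never match labels
accRight = mean(trueLabels == u(predIdx)) * 100   % 100: map index back to label
```

In the posted function, that means replacing `result(j) = d;` with `result(j) = u(d);` before comparing against Group_Test1.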
21 Comments
Walter Roberson
26 Aug 2021
I let the code run for about 40 hours. It was up to 132 gigabytes of memory by the time I got tired of it and canceled it; it really needs a rewrite.
Walter Roberson
27 Aug 2021
I changed the imresize() to [400,100] in both places, and reran HOG_NEW, which ran without problem.
I then re-ran HOG_NEW. After about 2 hours I asked it to pause; about half an hour later it did pause, having just completed building the first classification tree out of 937. Estimated time to build the trees is therefore roughly
days(hours(2.5) * 937)
which is more than 3 months.
I then asked it to apply that classification tree to all of the training data, which took about 20 minutes. That adds another
days(hours(1/3) * 937)
So you should expect your code to take more than 4 months to run. Longer, in fact, since your system is slower than mine.
There is not much you can do to speed up building the classification trees... though possibly dropping the members of each class from the training pool after that class is trained might help. Results would probably be less robust.
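A hedged sketch of that idea, using synthetic data and the same one-vs-all loop shape as multisvm above (fitcsvm requires the Statistics and Machine Learning Toolbox; the speed/robustness trade-off is untested here):

```matlab
% Sketch: shrink the training pool after each class is trained, so later
% binary models see fewer rows. Fewer negative examples per model means
% faster training but probably less robust decision boundaries.
rng(0);
X = [randn(20,2); randn(20,2)+3; randn(20,2)+6];    % three synthetic clusters
G = [10*ones(20,1); 20*ones(20,1); 30*ones(20,1)];  % their class labels
u = unique(G);
models = cell(numel(u),1);
remX = X; remG = G;
for k = 1:numel(u)
    pos = (remG == u(k));
    models{k} = fitcsvm(remX, pos);   % class k vs. the classes not yet trained
    remX = remX(~pos,:);              % drop this class's rows for later models
    remG = remG(~pos);
end
% Caveat: the last model sees only one class (fitcsvm falls back to
% one-class learning), which is part of why results would be less robust.
```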
More Answers (1)