Why SVM is not giving expected result

Views: 1 (last 30 days)
Diver on 21 October 2015
Commented: Diver on 23 October 2015
I have training data composed of only one feature.
  • The feature has around 113K observations.
  • Only 8K of those observations belong to the positive class.
  • 105K of those observations belong to the negative class.
  • Of the 8K positive observations, 90% have values below 1 and 10% have values above 1.
  • Of the 105K negative observations, 80% have values above 1 and 20% have values below 1.
Hence, almost any X value below 1 should be predicted as the positive class, and almost any X value above 1 should be predicted as the negative class.
I used the following fitcsvm call:
svmStruct = fitcsvm(X,Y,'Standardize',true, 'Prior','uniform','KernelFunction','linear','KernelScale','auto','Verbose',1,'IterationLimit',1000000);
fitcsvm gives a message at the end saying "SVM optimization did not converge to the required tolerance." But why? Most of the positive-class X values are below 1 and vice versa, so it should be easy to find a classification boundary. And when I run:
[label,score,cost]= predict(svmStruct, X) ;
it gives wrong predictions.
A portion of my X values is listed below:
0.9911
0.9836
0.9341
0.9751
0.9880
0.9977
0.9853
0.9861
1.0143
1.0086
0.9594
0.9787
0.9927
0.9839
1.0024
0.9931
0.9930
1.0275
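For reference, here is a minimal, self-contained sketch of the same workflow on synthetic data shaped like the description above. All the specific values and ranges in the sketch are assumptions, not my real data, and fitting on ~113K observations can take a while.
rng(0);                                   % reproducibility
% Positive class (8K): 90% below 1, 10% above 1 (the ranges are assumed)
Xpos = [0.95 + 0.04*rand(7200, 1); 1.00 + 0.03*rand(800, 1)];
% Negative class (105K): 80% above 1, 20% below 1 (the ranges are assumed)
Xneg = [1.00 + 0.03*rand(84000, 1); 0.95 + 0.04*rand(21000, 1)];
X = [Xpos; Xneg];
Y = [ones(8000, 1); -ones(105000, 1)];    % +1 = positive class, -1 = negative class

svmStruct = fitcsvm(X, Y, 'Standardize', true, 'Prior', 'uniform', ...
    'KernelFunction', 'linear', 'KernelScale', 'auto', ...
    'Verbose', 1, 'IterationLimit', 1000000);

% Convergence diagnostics are stored on the fitted model.
disp(svmStruct.ConvergenceInfo.Converged)
disp(svmStruct.ConvergenceInfo.ReasonForConvergence)

% Compare the SVM's labels against the plain "below 1 is positive" rule.
label     = predict(svmStruct, X);
ruleLabel = 2*(X < 1) - 1;                % +1 if below 1, -1 otherwise
fprintf('SVM accuracy:            %.3f\n', mean(label == Y));
fprintf('Threshold-rule accuracy: %.3f\n', mean(ruleLabel == Y));
With a single feature and a linear kernel, the fitted model boils down to a single threshold, so its labels can be compared directly against the plain below-1 rule.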
4 Comments
Image Analyst on 21 October 2015
Can you help us by sending your classes to gscatter() and showing us a screenshot?
Diver on 23 October 2015
My understanding is that gscatter() requires X and Y (two features) in addition to the group. However, in my case I have only one feature plus the group, so I don't think I can use gscatter to plot a single feature.
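One possible workaround for a single feature (a minimal sketch; here X is assumed to be the feature vector and Y a numeric vector of +1/-1 class labels, which is an assumption about the label encoding):
% Option 1: gscatter with a dummy second coordinate, so all points lie on
% one horizontal line but are colored by class.
figure;
gscatter(X, zeros(size(X)), Y);
xlabel('feature value');
set(gca, 'YTick', []);

% Option 2: overlaid per-class histograms, which show the class overlap around 1.
figure; hold on;
histogram(X(Y ==  1), 'Normalization', 'probability');
histogram(X(Y == -1), 'Normalization', 'probability');
legend('positive class', 'negative class');
xlabel('feature value');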


Answers (0)
