Low performance when training an SVM model with the "polynomial" kernel function
Hello,
I am trying to compare the performance of SVM regression (SVR) across the "rbf", "polynomial", and "linear" kernels.
Training works well with "rbf" and "linear" (R^2 of about 0.7–0.8).
However, when the "polynomial" kernel is used, performance degrades to an R^2 of about 0.001, or even negative.
I used the code:
Mdl = fitrsvm(X, Y, 'Standardize', true, 'KernelFunction', 'polynomial', ...
    'OptimizeHyperparameters', {'BoxConstraint','Epsilon','KernelScale','PolynomialOrder'}, ...
    'HyperparameterOptimizationOptions', struct('MaxObjectiveEvaluations', 100))
Please help.
Thank you.
Accepted Answer
Additional Answers (1)
Ganesh
14 June 2024
2 votes
The accuracy you achieve with a kernel function depends on the data distribution. Sharing your data would help us give you a better idea of the reason.
You could try out the following example in MATLAB:
Initially, run the example and note the number of iterations; then change the kernel function to "polynomial" and rerun the model. You will find that the number of iterations it takes to converge is now roughly 20 times higher!
When your data has two or three columns it is easy to visualize, but as the dimensionality grows, it becomes harder to plot and interpret your findings.
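As a rough illustration, here is a minimal sketch that fits the same (synthetic) data with all three kernels and compares resubstitution R^2 and iteration counts side by side. The synthetic X and Y and the 'KernelScale','auto' setting are my assumptions, not from the question; replace them with your own data and settings:

```matlab
% Minimal sketch (assumed synthetic data): compare SVR kernels on identical data.
rng(0);
X = rand(200, 3);
Y = sin(2*pi*X(:,1)) + 0.1*randn(200, 1);

kernels = {'linear', 'rbf', 'polynomial'};
for k = 1:numel(kernels)
    Mdl = fitrsvm(X, Y, 'Standardize', true, ...
        'KernelFunction', kernels{k}, 'KernelScale', 'auto');
    % Resubstitution R^2 = 1 - SSE/SST (training fit only, not generalization)
    yhat = resubPredict(Mdl);
    r2 = 1 - sum((Y - yhat).^2) / sum((Y - mean(Y)).^2);
    fprintf('%-10s  R^2 = %.3f  iterations = %d\n', ...
        kernels{k}, r2, Mdl.NumIterations);
end
```

If the polynomial kernel converges poorly or hits the iteration limit here as well, that points to scaling/conditioning of the kernel rather than the data itself; standardizing and letting KernelScale be optimized (as in your original call) is the usual remedy.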
