SVM Classification with Cross Validation
Hi,
I am trying to follow the example in Bioinformatics Toolbox -> SVM Classification with Cross Validation (<http://www.mathworks.com/help/bioinfo/ug/support-vector-machines-svm.html#bs3tbev-16>).
However, there is one part that I do not understand, namely:
9. Set up a function that takes an input z=[rbf_sigma,boxconstraint], and returns the cross-validation value of exp(z). The reason to take exp(z) is twofold:
rbf_sigma and boxconstraint must be positive.
You should look at points spaced approximately exponentially apart.
This function handle computes the cross validation at parameters exp([rbf_sigma,boxconstraint]):
minfn = @(z)crossval('mcr',cdata,grp,'Predfun', ...
@(xtrain,ytrain,xtest)crossfun(xtrain,ytrain,...
xtest,exp(z(1)),exp(z(2))),'partition',c);
Will someone please explain how I should implement this in code? Thanks.
Cheers,
Wee Chong
Answers (1)
Hari
11 June 2025
Hi,
I understand that you are trying to implement SVM classification with cross-validation from the Bioinformatics Toolbox example, and that you are specifically confused about how the function minfn works — in particular, why it uses exp(z) and how it fits into the cross-validation procedure.
I assume you already have your data (cdata, grp) and a partition (c) ready for cross-validation, and you want to tune the hyperparameters rbf_sigma and boxconstraint for an RBF SVM using a custom function for optimization.
In order to compute cross-validation loss with varying values of RBF sigma and box constraint, you can follow the below steps:
Step 1: Understand the parameter transformation
The parameters rbf_sigma and boxconstraint must be strictly positive. Using exp(z) ensures this. It also makes the parameter search more effective by spacing values exponentially (e.g., 0.1, 1, 10...).
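A quick illustration of the transformation: the optimizer moves in z, but the SVM always sees exp(z), which is guaranteed positive and spaced exponentially.

```matlab
% Evenly spaced steps in z correspond to exponentially spaced
% parameter values seen by the SVM:
z = -2:1:2;
exp(z)   % ≈ 0.1353  0.3679  1.0000  2.7183  7.3891
```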
Step 2: Define your SVM training function
Write a custom function (e.g., crossfun) that takes training and test data, fits the SVM with the given sigma and box constraint, and returns the predicted labels.
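A minimal sketch of such a helper is below. Note this is an assumed implementation: the original example predates fitcsvm and used the now-deprecated svmtrain/svmclassify, so here fitcsvm/predict (available in newer releases) are used instead, with KernelScale standing in for the RBF sigma.

```matlab
% Sketch of the crossfun helper assumed by the example (hypothetical
% implementation using fitcsvm; the original doc example used the
% deprecated svmtrain/svmclassify pair).
function yfit = crossfun(xtrain, ytrain, xtest, rbf_sigma, boxconstraint)
    % Train an RBF-kernel SVM with the supplied hyperparameters
    mdl = fitcsvm(xtrain, ytrain, ...
        'KernelFunction', 'rbf', ...
        'KernelScale',    rbf_sigma, ...
        'BoxConstraint',  boxconstraint);
    % Return predicted labels for the held-out fold; crossval compares
    % these against the true labels to compute the misclassification rate
    yfit = predict(mdl, xtest);
end
```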
Step 3: Set up the anonymous function for optimization
Use crossval with a function handle like:
minfn = @(z) crossval('mcr', cdata, grp, 'Predfun', ...
@(xtrain, ytrain, xtest) crossfun(xtrain, ytrain, xtest, exp(z(1)), exp(z(2))), ...
'partition', c);
This function will be passed to an optimizer (like fminsearch) that minimizes the misclassification rate.
Step 4: Optimize the parameters
You can now search for the optimal parameters:
bestZ = fminsearch(minfn, [0, 0]); % Initial guess in log-scale
The resulting exp(bestZ) gives the actual values for rbf_sigma and boxconstraint.
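Putting it together, one possible end-to-end sketch (assuming cdata is your feature matrix, grp your label vector, and crossfun the helper from Step 2; the final fitcsvm call is a modern stand-in for the example's original svmtrain):

```matlab
% End-to-end sketch: build a partition, tune in log-space, then train
% the final classifier on all data with the recovered parameters.
rng(1);                                 % reproducible partition
c = cvpartition(numel(grp), 'KFold', 10);
minfn = @(z) crossval('mcr', cdata, grp, 'Predfun', ...
    @(xtrain, ytrain, xtest) crossfun(xtrain, ytrain, xtest, ...
    exp(z(1)), exp(z(2))), 'partition', c);
bestZ = fminsearch(minfn, [0, 0]);      % search in log-space
rbf_sigma     = exp(bestZ(1));          % back-transform to actual values
boxconstraint = exp(bestZ(2));
finalMdl = fitcsvm(cdata, grp, 'KernelFunction', 'rbf', ...
    'KernelScale', rbf_sigma, 'BoxConstraint', boxconstraint);
```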
Refer to the documentation of the "crossval" and "fminsearch" functions for more details.
Hope this helps!