Regarding Lagrange Multiplier in SVM
Hello all, in my work I am using an SVM for classification. I have trained the SVM classifier and also obtained the Lagrange multipliers (α), which form a column vector. My question is: how can these Lagrange multipliers (α) be used to predict the labels of the data points?
Any help in this regard will be highly appreciated.
4 Comments
charu shree
23 Mar 2023
Torsten
23 Mar 2023
I think there are only very few people here who know what you are talking about:
SVM ?
trained the SVM classifier ?
Lagrangian multiplier (α) which is a column vector ?
how we can use this Lagrangian multiplier (α) to predict the labels of the data points ?
Maybe you could formulate your problem in a more general, perhaps mathematical, form.
charu shree
24 Mar 2023
Answers (1)
Parag
10 Apr 2025
To predict the labels of your test data (e.g., a 404 × 4 matrix) with an SVM trained in a one-vs-all setting with 16 classifiers, you can follow this approach:
- Each SVM classifier corresponds to one class and provides a decision score using the learned parameters (α, support vectors, labels, and bias).
- For each test vector, compute the decision score from all 16 classifiers using the kernel function (e.g., linear).
- Stack all scores in a matrix of size 404 × 16.
- Assign each test vector the class label of the classifier with the highest score (i.e., the maximum decision-function output).
This is the standard prediction step for a one-vs-all multi-class SVM, and it matches the usual formulation of the decision function f(x) = Σᵢ αᵢ yᵢ K(xᵢ, x) + b.
% Inputs (assumed precomputed and available):
% X_test         : 404 x 4 test data matrix
% supportVectors : N x 4 matrix of training (support) vectors
% alpha          : 1 x 16 cell array; alpha{k} is the N x 1 alpha vector for class k
% labels         : N x 16 label matrix in one-vs-all format (+1/-1 per class)
% b              : 16 x 1 bias vector, one bias per classifier

numTest = size(X_test, 1);
numClasses = 16;
F = zeros(numTest, numClasses);   % decision scores

K = X_test * supportVectors';     % linear kernel (dot products), 404 x N
for k = 1:numClasses
    F(:, k) = K * (alpha{k} .* labels(:, k)) + b(k);
end

% Predict the class with the highest score
[~, predicted_labels] = max(F, [], 2);   % 404 x 1 vector of predicted class indices
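For reference, the same prediction step can be sketched in NumPy. Everything below (the shapes, the random data, the two-class setup) is a made-up toy example to illustrate the computation, not values from the question:

```python
import numpy as np

# Hypothetical toy sizes: 5 test points, 3 features, 4 support vectors, 2 classes.
rng = np.random.default_rng(0)
X_test = rng.standard_normal((5, 3))             # test data
support_vectors = rng.standard_normal((4, 3))    # training (support) vectors
alpha = [rng.random(4) for _ in range(2)]        # one alpha vector per one-vs-all classifier
labels = rng.choice([-1.0, 1.0], size=(4, 2))    # +1/-1 labels per class
b = rng.standard_normal(2)                       # one bias per classifier

K = X_test @ support_vectors.T                   # linear kernel (dot products), 5 x 4
# Decision scores F(:, k) = K * (alpha_k .* y_k) + b_k, stacked into 5 x 2
F = np.column_stack([K @ (alpha[k] * labels[:, k]) + b[k] for k in range(2)])
predicted = np.argmax(F, axis=1) + 1             # 1-based class indices, like MATLAB's max
```

The `+ 1` only converts NumPy's 0-based `argmax` index to the 1-based class numbering that MATLAB's `max` would return.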
Hope it helps!