
## Linear Discriminant Analysis (LDA) aka. Fisher Discriminant Analysis (FDA)

version 1.0.0.0 (5.7 KB) by Yarpiz
Implementation of LDA in MATLAB for dimensionality reduction and linear feature extraction

Updated 22 Sep 2015

http://yarpiz.com/430/ypml114-linear-discriminant-analysis

### Cite As

Yarpiz (2020). Linear Discriminant Analysis (LDA) aka. Fisher Discriminant Analysis (FDA) (https://www.mathworks.com/matlabcentral/fileexchange/53151-linear-discriminant-analysis-lda-aka-fisher-discriminant-analysis-fda), MATLAB Central File Exchange. Retrieved .

Dengyu Wang

Matlaber

Hi (replying to AC below),

```matlab
% X     | double | nxk | data matrix: rows -> observations, cols -> variables
% label | double | nx1 | labels
% m     | double | 1x1 | dimension of reduced data
```

So X is the data matrix and label is the vector of class labels?

Am I correct?

AC

Why is the result of each iteration saved in a cell array? That is not necessary. I would add another input argument that specifies the dimension of the output data.

```matlab
% INPUT:
%  X     | double | nxk | data matrix: rows -> observations, cols -> variables
%  label | double | nx1 | labels
%  m     | double | 1x1 | dimension of reduced data
% OUTPUT:
%  Y     | double | nxm | reduced data
%  W     | double | kxm | transformation matrix, Y = X*W
function [Y, W] = fisherLDA(X, label, m)

    classes = unique(label);
    k   = numel(classes);   % number of classes
    dim = size(X, 2);       % dimension of the input data

    SB = zeros(dim);        % between-class scatter matrix
    SW = zeros(dim);        % within-class scatter matrix

    X_ = mean(X);           % mean of the complete data

    % loop over all classes and accumulate SW and SB
    for i = 1:k
        v  = label == classes(i);
        Xl = X(v, :);       % all data points belonging to class i

        Xl_ = mean(Xl);     % mean of the data points belonging to class i

        r  = Xl_ - X_;
        SB = SB + size(Xl, 1) * (r' * r);

        for j = 1:size(Xl, 1)
            r  = Xl(j, :) - Xl_;
            SW = SW + r' * r;
        end
    end

    % solve the generalized eigenproblem SB*W = SW*W*LAMBDA
    [W, LAMBDA] = eig(SB, SW, 'qz');
    lambda = diag(LAMBDA);

    % keep the m directions with the largest eigenvalues
    [~, SortOrder] = sort(lambda, 'descend');
    W = W(:, SortOrder);
    W = W(:, 1:m);
    Y = X * W;
end
```
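For reference, a minimal usage sketch of the function above on Fisher's iris data. This assumes the Statistics and Machine Learning Toolbox is installed (for the `fisheriris` data set); the variable names are illustrative only:

```matlab
% Load Fisher's iris data: meas is 150x4, species is a 150x1 cell of class names
load fisheriris
[~, label] = ismember(species, unique(species));  % map class names to 1..3

% Project the 4-D measurements onto the top 2 discriminant directions
[Y, W] = fisherLDA(meas, label, 2);

% Visualize the projection; the classes separate mainly along the first axis
scatter(Y(:,1), Y(:,2), 20, label, 'filled');
xlabel('LD 1'); ylabel('LD 2');
```

Note that for k classes, SB has rank at most k-1, so at most k-1 discriminant directions carry information; here m = 2 = k - 1.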

Jan Motl

The header of LDA.m should really say what the returned parameters are. Something like:
Y: the data projected into the new feature space.
W: eigenvectors, sorted in descending order of the eigenvalues.
lambda: eigenvalues, sorted in descending order.

Chao-Chung Peng

Angelo Carfora

In the function: [Y, W, lambda] = LDA(X, L);
What is the meaning of W and lambda?

I think it is a little more efficient to write the code as follows, because most of the time the number of features is much smaller than the number of samples, so looping over feature pairs is cheaper than looping over individual samples.

```matlab
function [Y, W, lambda] = LDA(X, L)

    Classes = unique(L)';
    k = numel(Classes);
    NumFeatures = size(X, 2);

    n = zeros(k, 1);    % class sizes
    C = cell(k, 1);     % class means
    S = cell(k, 1);     % per-class scatter matrices
    M = mean(X);        % overall mean

    Sw = 0;             % within-class scatter
    Sb = 0;             % between-class scatter

    for j = 1:k
        Xj   = X(L == Classes(j), :);
        n(j) = size(Xj, 1);
        C{j} = mean(Xj);
        XC   = Xj - repmat(C{j}, n(j), 1);   % centered class data

        % build the symmetric class scatter matrix one feature pair at a time
        S{j} = zeros(NumFeatures);
        for k1 = 1:NumFeatures
            for k2 = k1:NumFeatures
                S{j}(k1, k2) = XC(:, k1)' * XC(:, k2);
                S{j}(k2, k1) = S{j}(k1, k2);
            end
        end

        Sw = Sw + S{j};
        Sb = Sb + n(j) * (C{j} - M)' * (C{j} - M);
    end

    [W, LAMBDA] = eig(Sb, Sw);
    lambda = diag(LAMBDA);

    [lambda, SortOrder] = sort(lambda, 'descend');
    W = W(:, SortOrder);
    Y = X * W;
end
```
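A hedged usage sketch of this variant, answering Angelo's question about the call signature; the toy data below is illustrative only:

```matlab
% Two-class toy data: 100 samples, 5 features, class 2 shifted by 2 units
rng(0);
X = [randn(50, 5); randn(50, 5) + 2];
L = [ones(50, 1); 2 * ones(50, 1)];

[Y, W, lambda] = LDA(X, L);

% W:      generalized eigenvectors of (Sb, Sw), sorted by descending eigenvalue
% lambda: the corresponding eigenvalues; for k classes at most k-1 are nonzero
disp(lambda(1));    % the dominant between/within variance ratio
Y1 = Y(:, 1);       % the first column of Y separates the two classes best
```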

Arijit Dash

I need MATLAB code.

##### MATLAB Release Compatibility
Created with R2009a
Compatible with any release
##### Platform Compatibility
Windows macOS Linux