Using SVD for Dimensionality Reduction
Hello everyone.
I have a matrix with 300 rows (samples) and 5000 columns (features).
I need to reduce the number of columns for classification.
As far as I know, to use the pca() function the number of samples should be greater than the number of features.
So I tried to use singular value decomposition (svd) with the code below.
% Singular value decomposition of X
[U, Sig, V] = svd(X);
sv = diag(Sig);
% Plot the cumulative distribution of the singular values
figure;
sv = sv/sum(sv);
stairs(cumsum(sv));
xlabel('singular values');
ylabel('cumulative sum');
I have two questions.
1) As I understand from the figure above, I need to keep roughly the first 250 singular values to account for 95% of my data.
So should I take the first 250 singular values to create a new data set for classification?
Also, how can I see the variance of each principal component, like the explained output of pca(), to decide how many of them I should use?
2) After choosing the number of principal components, I need to create a new matrix for classification.
Can I do this with the code below (for example, with the first two principal components)?
new_matrix_for_classification = X*V(:,1:2);
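For reference, both questions can be worked out directly from the SVD output, provided X is mean-centered column-wise first (pca() centers the data internally; without centering, the singular values do not correspond to component variances). A minimal sketch, where Xc, explained, and k are only illustrative names and the 95% cutoff is just an example:
% Center the columns so the SVD corresponds to PCA
Xc = X - mean(X, 1);
[U, Sig, V] = svd(Xc, 'econ');   % 'econ' avoids building the full 5000x5000 V
sv = diag(Sig);
% Percentage of variance per component (the analogue of pca()'s explained output)
explained = 100 * sv.^2 / sum(sv.^2);
% Smallest number of components capturing ~95% of the variance
k = find(cumsum(explained) >= 95, 1);
% Project onto the first k right singular vectors (these are the PCA scores)
new_matrix_for_classification = Xc * V(:, 1:k);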
Thanks in advance.
Accepted Answer
Mahesh Taparia
2 Apr 2021
Hi
For the second part, you can use the pca function to directly compute the representation of the input in the principal-component space. For example, if you want the first 2 components:
[coeff,score,latent] = pca(X);
new_matrix_for_classification = score(:,1:2); % score is the representation of X in the new space
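For the first part, pca also returns an explained output with the percentage of variance per component, so the number of components can be chosen from it directly. A sketch (the 95% cutoff and the name k are only examples):
[coeff, score, latent, tsquared, explained, mu] = pca(X);
k = find(cumsum(explained) >= 95, 1);   % smallest k explaining ~95% of the variance
new_matrix_for_classification = score(:, 1:k);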
Hope it helps!