How to whiten data in FastICA

15 views (last 30 days)
myetceteramail myetceteramail on 4 May 2018
Commented: Ion Stefanache on 18 Sep 2021
I am trying to implement FastICA for my project, but I am finding it hard to get through even the first step. The code I am using in MATLAB is given below, for example:
Z = [1 2; 5 6]
mu = mean(Z,2);
Zc = bsxfun(@minus,Z,mu);                    % centering: subtract the mean so the data is zero-mean
C = cov(Zc')                                 % covariance matrix
% here C = [0.5 0.5; 0.5 0.5], which is singular (rank 1)
[E,D] = eig(C)                               % eigenvalues and eigenvectors of C
Transf = E * diag(1 ./ sqrt(diag(D))) * E';  % the whitening transformation
Zw = Transf * Zc;
But I have a question: the covariance matrix is always singular, so it will have 0 as an eigenvalue. How do we then apply D^{-1/2}, i.e. the inverse of the square root?
Where am I going wrong? Can somebody help?

Answers (1)

Ion Stefanache on 18 Sep 2021
First of all, you need to preprocess the observed signal x:
Preprocessing for ICA
First, let us consider the basic statement of ICA: $$ x = As, \qquad s = Wx $$ where $s$ refers to the source signals, $A$ to the mixing matrix, and $x$ to the signal we receive at the microphones (say), with $W = A^{-1}$. Given below are the pre-processing stages performed:
  • Centering: The most basic and necessary preprocessing is to center ${\bf x}$, i.e. subtract its mean vector ${\bf m = E\{x\}}$ so as to make ${\bf x}$ a zero-mean variable. This also implies that ${\bf s}$ is a zero-mean variable, as can be seen by taking expectation on both sides, above. This preprocessing is made solely to simplify the ICA algorithms: It does not mean that the mean could not be estimated. After estimating the mixing matrix ${\bf A}$ with centered data, we can complete the estimation by adding the mean vector of ${\bf s}$ back to the centered estimates of ${\bf s}$. The mean vector of ${\bf s}$ is given by ${\bf A}^{-1} {\bf m}$, where ${\bf m}$ is the mean that was subtracted in the preprocessing.
  • Whitening: Another useful preprocessing strategy in ICA is to first whiten the observed variables. This means that before the application of the ICA algorithm (and after centering), we transform the observed vector ${\bf x}$ linearly so that we obtain a new vector $\tilde{{\bf x}}$ which is white, i.e. its components are uncorrelated and their variances equal unity. In other words, the covariance matrix of $\tilde{{\bf x}}$ equals the identity matrix: $$ C_{\tilde{x}} = E\{\tilde{x}\tilde{x}^T\} = I $$ A whitening transformation is a linear transformation that turns a vector of random variables with a known covariance matrix into a new set of variables whose covariance is the identity matrix, meaning that they are uncorrelated and all have unit variance. The math behind whitening involves eigenvectors and eigendecompositions, which we largely skip here; let us fast-forward to how whitening can be done on a given dataset. The whitening transformation is always possible. One popular method is to use the eigenvalue decomposition (EVD) of the covariance matrix of the centered data, $C_x = E\{{\bf x}{\bf x}^T\} = {\bf E}{\bf D}{\bf E}^T$, where ${\bf E}$ is the orthogonal matrix of eigenvectors of $C_x$ and ${\bf D}$ is the diagonal matrix of its eigenvalues, ${\bf D} = \mbox{diag}(d_1,...,d_n)$. Note that $E\{{\bf x}{\bf x}^T\}$ can be estimated in the standard way from the available sample $x(1),...,x(T)$. Whitening can now be done by $$ \tilde{x} = ED^{-1/2}E^T x $$ where the matrix ${\bf D}^{-1/2}$ is computed by a simple component-wise operation as ${\bf D}^{-1/2} = \mbox{diag}(d_1^{-1/2},...,d_n^{-1/2})$. It is easy to check that now $C_{\tilde{x}} = {\bf I}$. A minimal MATLAB sketch of the centering and whitening steps is given right after this list.
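Here is a minimal MATLAB sketch of those two steps (the toy sources, the mixing matrix A, and the eps-based tolerance for skipping near-zero eigenvalues are assumptions made purely for illustration; with many more samples than channels, the covariance is normally not singular, unlike the 2-sample example in the question):
rng(0);                                            % reproducible toy example
S = [sqrt(3)*(2*rand(1,1000)-1); randn(1,1000)];   % two independent, roughly unit-variance sources
A = [1 2; 3 1];                                    % assumed mixing matrix
X = A*S;                                           % observed mixed signals, one row per channel
% Centering
mu = mean(X,2);
Xc = bsxfun(@minus,X,mu);                          % zero-mean data (X - mu also works on recent MATLAB)
% Covariance and its eigendecomposition
C = cov(Xc');                                      % rows of Xc' are observations
[E,D] = eig(C);
% Guard against (near-)zero eigenvalues before forming D^(-1/2)
d    = diag(D);
keep = d > max(d)*length(d)*eps;                   % discard directions with essentially no variance
V    = E(:,keep) * diag(1./sqrt(d(keep))) * E(:,keep)';   % whitening matrix E*D^(-1/2)*E'
Xw   = V*Xc;                                       % whitened data
cov(Xw')                                           % approximately the 2x2 identity matrix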
It is important to note that the whitening transformation changes the mixing matrix $A$ that maps $s$ to $x$, and hence $$ \tilde{x} = ED^{-1/2}E^T A s = \tilde{A}s $$ The utility of whitening resides in the fact that the new mixing matrix $\tilde{{\bf A}}$ is orthogonal. How this affects the data is a little complicated to explain, but I will attempt to illustrate it below.
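Continuing the sketch above (using the assumed mixing matrix A and whitening matrix V from it), this can be checked numerically:
A_tilde = V*A;                                     % whitened mixing matrix, E*D^(-1/2)*E'*A
A_tilde*A_tilde'                                   % close to eye(2); exactly orthogonal only for perfectly white sources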
