# Phase difference removal using filtering, butterworth, and filtfilt commands

30 views (last 30 days)
MAT-Magic 5 Jun 2020
Comment: Nimish Iwale 8 Jul 2020
Hi Guys!
I am using a 6th-order Butterworth bandpass filter to extract the 10-20 Hz band from the signal (x). After that, I applied the filtfilt command to the Butterworth bandpass filtered signal (10-20 Hz) to remove the phase delay. Am I doing this right?
Thanks in advance. The code is given below. Feel free to correct the code.
Fs = 500;
fcutlow = 6; % low cut frequency in Hz
fcuthigh = 14; % high cut frequency in Hz
[b,a] = butter(6,[fcutlow,fcuthigh]/(Fs/2));
Butterworth_bandpass_filter = filter(b,a,x);
Filt_filt_signal = filtfilt(b,a, Butterworth_bandpass_filter);

### Accepted Answer

Alberto Mora 5 Jun 2020
Edited: Alberto Mora 5 Jun 2020
I think that you just need to do:
Fs = 500;
fcutlow = 6; % low cut frequency in Hz
fcuthigh = 14; % high cut frequency in Hz
[b,a] = butter(6,[fcutlow,fcuthigh]/(Fs/2));
Filt_filt_signal = filtfilt(b,a, rawSignal );
without "Butterworth_bandpass_filter = filter(b,a,x);". Otherwise you first filter the signal with filter (introducing a phase lag) and then pass the already-delayed signal into the filtfilt routine, which filters it again in both directions.
If you want to be sure about the results, have a look at the frequency domain, comparing the initial and final spectra.
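To make that check concrete, here is a small sketch in Python with SciPy, whose butter, lfilter, and filtfilt functions mirror MATLAB's butter, filter, and filtfilt. The 10 Hz test tone and the 4 s duration are my own choices for illustration (10 Hz sits inside the 6-14 Hz passband from the code above):

```python
import numpy as np
from scipy import signal

fs = 500.0
# Same design as [b,a] = butter(6,[6,14]/(Fs/2)) in MATLAB
b, a = signal.butter(6, [6.0, 14.0], btype="bandpass", fs=fs)

t = np.arange(0.0, 4.0, 1.0 / fs)
x = np.sin(2 * np.pi * 10.0 * t)          # 10 Hz tone inside the passband

y_one_pass = signal.lfilter(b, a, x)      # like MATLAB's filter(): adds phase delay
y_zero_phase = signal.filtfilt(b, a, x)   # forward-backward pass: zero phase

# Measure the phase of the 10 Hz component over the steady-state middle section
mid = slice(len(t) // 4, 3 * len(t) // 4)

def phase_at_10hz(y):
    return np.angle(np.sum(y[mid] * np.exp(-2j * np.pi * 10.0 * t[mid])))

print("one-pass phase shift: ", phase_at_10hz(y_one_pass) - phase_at_10hz(x))
print("zero-phase shift:     ", phase_at_10hz(y_zero_phase) - phase_at_10hz(x))
```

The one-pass output shows the nonzero phase shift of the filter at 10 Hz, while the filtfilt output lines up with the input; the same comparison can be done in MATLAB with the original code.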

#### 4 Comments

Alberto Mora 5 Jun 2020
Let me try to answer more clearly:
1. In the code you wrote, you first compute the coefficients of the Butterworth filter (which has a phase delay).
2. You filter your raw data x with the filter designed in step 1, so the output signal carries the filter's phase delay. You get Butterworth_bandpass_filter.
3. You pass the signal filtered in step 2 (Butterworth_bandpass_filter) through another filtering stage with filtfilt (filtfilt does not add phase delay, because it filters the signal in both directions). This is wrong: the input to filtfilt should be the raw signal, i.e. Filt_filt_signal = filtfilt(b,a, rawSignal ), not Butterworth_bandpass_filter.
What you obtain is a raw signal filtered more times than intended: once in step 2 (which also adds the phase delay), and twice more when the output of step 2 passes through the filtfilt routine (which does not increase the phase delay any further).
You just have to compute the filter coefficients and pass them, together with the raw signal, to the filtfilt function (without passing the Butterworth_bandpass_filter signal to filtfilt).
MAT-Magic 6 Jun 2020
Thank you!
Nimish Iwale 8 Jul 2020
Is there any way this can be implemented in Simulink? I want to remove the phase shift in a Simulink model.