Simulink Transport Delay's influence on Variance Percentage?

1 view (last 30 days)
Yufeng on 18 May 2012
Dear all,
I have a signal that I put through a transport delay of 0.2 seconds in Simulink. Let's call the time-delayed signal u. Noise is added to the resulting time-delayed signal u to get signal u_noise.
I want to keep the noise in the signal at a variance percentage of 10%, so I calculate
var(u-u_noise)/var(u)*100
When I do not use the Transport Delay, I can easily tune the noise level to 10% with a gain. With the Transport Delay, however, the variance percentage drops, and as I increase the gain the variance percentage approaches a maximum that is lower than the 10% I need. Plotting the variance percentage against the gain shows a steep initial rise up to this maximum (about 7%), after which it stabilizes. Does anybody know why the Transport Delay causes this? In theory it should only delay the signal, so I expected the same variance percentage.
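For reference, the same measurement can be checked offline, outside Simulink. The NumPy sketch below (the signal, sample rate, delay implementation, and gain are all made up for illustration, and it only approximates the Transport Delay by zero-padding the startup interval) shows that a pure delay preserves the variance percentage as long as u and u_noise are the aligned, delayed signals, but inflates the metric badly if the undelayed signal is accidentally compared against the delayed noisy one:

```python
import numpy as np

rng = np.random.default_rng(0)

fs = 100.0                          # sample rate in Hz (assumed)
t = np.arange(0.0, 10.0, 1.0 / fs)
u0 = np.sin(2.0 * np.pi * 1.0 * t)  # source signal before the delay

# 0.2 s delay, crudely mimicked: output is 0 until the delay has elapsed.
d = int(0.2 * fs)
u = np.concatenate([np.zeros(d), u0[:-d]])

gain = 0.1
u_noise = u + gain * rng.standard_normal(u.shape)

# Variance percentage on ALIGNED signals (delayed vs. delayed + noise):
pct_aligned = np.var(u - u_noise) / np.var(u) * 100.0

# Same metric against the UNDELAYED signal (misaligned comparison):
pct_misaligned = np.var(u0 - u_noise) / np.var(u0) * 100.0

print(pct_aligned)     # small: only the injected noise contributes
print(pct_misaligned)  # large: the delay itself dominates the difference
```

This does not pinpoint what the Transport Delay block is doing in the model, but it suggests checking that both signals entering the variance calculation pass through the same delay path.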
Any help is very much appreciated.
1 Comment
Kaustubha Govind on 18 May 2012
I don't know enough about the Transport Delay block to comment on this, but I wonder if you have experimented with increasing the buffer size parameter on the dialog to see if that improves things?

Answers (0)

