Mean Square Error with ML Estimate
Hello,
I've been struggling with a problem in my Detection and Estimation Theory course.
I have a signal of known frequency w0 given by:
y(t) = A*sin(w0*t) + w(t), where w(t) is a sequence of i.i.d. random variables with zero mean and variance sigma^2.
We need to estimate the parameter A when the noise is Gaussian.
The question is to compute the mean square error (MSE), defined as MSE(AEst) = E[(AEst - A)^2], by averaging simulation results over 5,000 independent runs (assuming A = 1 and w0 = 2*pi*0.2).
Can anyone help me approach this problem?
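For reference, here is a minimal Monte Carlo sketch of one possible approach, assuming N = 100 samples per run and sigma = 1 (neither is specified in the problem statement). For white Gaussian noise, maximizing the likelihood is equivalent to minimizing the squared error, so the ML estimate of A is the least-squares projection of y onto the known signal s:

% Monte Carlo estimate of MSE(AEst) for the ML estimator of A
A     = 1;              % true amplitude
w0    = 2*pi*0.2;       % known frequency
N     = 100;            % samples per run (assumed, not given in the problem)
sigma = 1;              % noise standard deviation (assumed, not given)
runs  = 5000;           % independent Monte Carlo runs

n = (0:N-1).';          % discrete time index
s = sin(w0*n);          % known signal component

AEst = zeros(runs,1);
for k = 1:runs
    y = A*s + sigma*randn(N,1);    % noisy observation for this run
    AEst(k) = (s.'*y)/(s.'*s);     % ML (least-squares) estimate of A
end

MSE  = mean((AEst - A).^2);        % Monte Carlo estimate of E[(AEst - A)^2]
CRLB = sigma^2/(s.'*s);            % Cramer-Rao lower bound for comparison
fprintf('MSE = %.4g, CRLB = %.4g\n', MSE, CRLB);

The last lines compare the simulated MSE against the Cramer-Rao lower bound sigma^2/(s'*s); in this linear Gaussian model the ML estimator is unbiased and attains the bound, so the two values should agree closely.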