Minimize error between data distribution and expected distribution
Views: 1 (last 30 days)
Hi all,
I have 3 sets of data which are expected to behave as follows:
1) the 1st data-block should approach a Gaussian distribution with mu = 0 and sigma = 1;
2) the 2nd data-block should approach a Gaussian distribution with mu = 0 and sigma = 0.8;
3) the 3rd data-block should approach a Gaussian distribution with mu = 0 and sigma = 0.5;
Each data-block has only a limited number of samples (generally between 2048 and 8192), and because of some filter effects introduced by the specific code I use, they will not exactly match the corresponding expected distribution.
The point is that, regardless of what it implies in terms of data manipulation, I want to minimize the discrepancy between each data-block's actual and expected distribution. Note that I cannot increase the number of samples, for reasons I will not explain in detail.
Generally, the first data-block, compared to the standard normal distribution, looks like the following figure:
I was thinking of using lsqcurvefit for this purpose.
What would you suggest?
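One possible approach (a sketch, not taken from this thread): since the number of samples must stay fixed, each data-block can be mapped onto its target Gaussian with a rank-based quantile transformation. The variable name `your_filtered_block` is a hypothetical placeholder, and `tiedrank`/`norminv` from the Statistics Toolbox are assumed available.

```matlab
% Sketch: force a data-block to follow N(mu, sigma) in its quantiles,
% keeping the number of samples unchanged (rank-based quantile mapping).
x  = your_filtered_block;      % hypothetical variable: one data-block
mu = 0; sigma = 1;             % target parameters for the 1st block

n = numel(x);
p = (tiedrank(x) - 0.5) / n;   % empirical quantile of each sample
y = norminv(p, mu, sigma);     % map ranks onto the target Gaussian

% y has the same length and the same ordering as x, but its empirical
% distribution now matches N(mu, sigma) as closely as n samples allow.
```

Note that, like any approach that edits the samples themselves, this changes the data rather than the fit.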
Comments: 0
Answer (1)
Wouter
20 Mar 2013
Do you know this function:
histfit
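For reference, histfit plots a histogram of the data with a fitted normal density overlaid. A minimal usage sketch (the random data is a stand-in for the actual data-block):

```matlab
% Plot a histogram of the data with a fitted normal density on top.
data = randn(4096, 1);          % stand-in for the first data-block
histfit(data, 50);              % 50 bins; fits a normal distribution by default

% The parameters of the fit can be obtained separately:
pd = fitdist(data, 'Normal');   % pd.mu and pd.sigma hold the estimates
```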
Comments: 6
Wouter
21 Mar 2013
Edited: Wouter, 21 Mar 2013
You could try to change individual datapoints after your filtering step in order to update your datapoints; this will change the blue bars. For example: find a blue bar that is too high and change one of its datapoints into a value that lies in a blue bar that is too low (compared to the red line). This does, however, change your data and will render step 2) treat_with_piece_of_code useless.
However, it makes more sense to find a better fit to the histogram, i.e. to change the red line. lsqcurvefit would only be useful if you would like to update the red line (the fit).
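A minimal sketch of what "updating the red line" with lsqcurvefit could look like; the bin count, starting values, and the variable `your_filtered_block` are illustrative assumptions, not part of the thread:

```matlab
% Fit a Gaussian curve to histogram counts with lsqcurvefit
% (i.e. adjust the red line, not the data).
x = your_filtered_block;                      % hypothetical data-block
[counts, centers] = hist(x, 50);              % histogram of the data

% Model: scaled Gaussian, with parameters p = [amplitude, mu, sigma]
gauss = @(p, t) p(1) * exp(-(t - p(2)).^2 / (2 * p(3)^2));
p0 = [max(counts), mean(x), std(x)];          % starting guess
p  = lsqcurvefit(gauss, p0, centers, counts); % least-squares fit

plot(centers, counts, 'b.', centers, gauss(p, centers), 'r-');
```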