How to model a correcting function for shifted data?
Hi all,
I have a sensor reading that incorporates some time delay in its measurements, so the measured values are shifted from their ideal values (illustrated in the figure below). I have a set of paired data from both the ideal and the measured signals. Later, I want to use a model that can 'correct' a measurement to 'estimate' its ideal value using only a single data point (from one particular time step).
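For illustration, here is a minimal sketch of the kind of correction I have in mind (purely illustrative, not something I have settled on). It assumes the ideal and measured series are paired column vectors on the same time base, that the shift can be approximated by a static value-to-value mapping, and that the names idealData and measuredData are placeholders for my calibration data:

% Fit a polynomial correction function ideal ≈ f(measured) from the
% paired calibration data (order 3 is only an example choice).
p = polyfit(measuredData, idealData, 3);

% Later, correct a single new measurement taken at one time step.
newMeasurement = 0.42;                      % hypothetical single-point reading
estimatedIdeal = polyval(p, newMeasurement);

I am not sure whether a simple regression like this is appropriate, or whether I should instead be estimating and compensating the delay itself.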
What technique should I use? Any help will be very much appreciated!
Thanks,
Ghazi

Answers (0)