Radar simulation: How to apply a precise delay?
Hi all,
I am trying to create a simple radar baseband channel model. To do so, I want to generate a delayed version y(t) = x(t − τ) of my transmit baseband signal x(t).
The signals are sampled with sampling frequency fs, so applying the delay by shifting the signal by an integer number of samples gives a "delay resolution" of only 1/fs. Since the location of a target can lead to delays that are not multiples of that value, this approach can introduce an error. Are there ways of applying a delay that are not limited by the time discretization (other than upsampling, which is already applied), or is this a limitation that cannot be overcome? In particular:
- Is it possible to perform a frequency-dependent phase shift according to the shift theorem of the DFT instead of a shift in the time domain, and does this give a finer resolution? (See the sketch after this list.)
- I saw that there is a FreeSpace object in the Phased Array System Toolbox that also adds a delay to a signal. Does anyone know how this is implemented?
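To illustrate what I mean by the first point, here is a rough sketch of the frequency-domain approach (fs, tau and the test signal are only placeholder values, and a delay applied this way is circular, so the signal has to be zero-padded generously enough that nothing wraps around the block edges):

fs  = 1e6;                                % sampling frequency in Hz (placeholder)
tau = 2.3/fs;                             % desired delay: 2.3 samples (placeholder)
N   = 1024;
x   = exp(1j*2*pi*50e3*(0:N-1).'/fs);     % placeholder baseband test signal
X = fft(x);
f = (0:N-1).'/N*fs;                       % DFT bin frequencies
f(f >= fs/2) = f(f >= fs/2) - fs;         % map bins to [-fs/2, fs/2)
y = ifft(X .* exp(-1j*2*pi*f*tau));       % signal delayed (circularly) by tau

Since tau can be any real number here, the delay would no longer be tied to multiples of 1/fs.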
Accepted Answer
Honglei Chen
15 Mar 2019
The FreeSpace object in the Phased Array System Toolbox uses a fractional delay filter to approximate delays that fall between samples.
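A common way to build such a fractional delay filter is a shifted, windowed-sinc FIR. The following is only a sketch of that general idea, not necessarily how phased.FreeSpace implements it internally; the delay value, filter length and window are illustrative choices:

D    = 7.38;                              % total desired delay in samples (example)
Nfir = 41;                                % odd FIR length (illustrative choice)
nInt = floor(D);                          % integer part of the delay
frac = D - nInt;                          % fractional part, 0 <= frac < 1
n    = (0:Nfir-1).' - (Nfir-1)/2;         % tap indices centered around zero
h    = sinc(n - frac) .* hamming(Nfir);   % shifted, windowed sinc = fractional delay
x = randn(1e3,1) + 1j*randn(1e3,1);       % example baseband signal
y = filter(h, 1, [zeros(nInt,1); x]);     % integer shift plus fractional-delay FIR
% Note: the FIR itself adds a bulk delay of (Nfir-1)/2 samples on top of D,
% which has to be accounted for or compensated.

If you have DSP System Toolbox, dsp.VariableFractionalDelay is a ready-made object for the same purpose.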
HTH