The fusionRadarSensor object simulates the detection of targets by a radar. You
can use the object to model many properties of real radar sensors. For example, you can:
- simulate real detections with added random noise
- generate false alarms
- simulate mechanically scanned antennas and electronically scanned phased arrays
- specify angular, range, and range-rate resolution and limits
The radar sensor is assumed to be mounted on a platform and carried by the
platform as it maneuvers. A platform can carry multiple sensors. When you create a sensor,
you specify sensor positions and orientations with respect to the body coordinate system of
a platform. Each call to fusionRadarSensor creates a sensor. The sensor generates detections that you can use as input to multi-object trackers or tracking filters.
The radar platform does not maintain any information about the radar sensors that are mounted on it. (The sensor itself contains its position and orientation with respect to the platform on which it is mounted, but not a reference to the platform itself.) You must create the association between radar sensors and platforms yourself. One way to do this is to put the platform and its associated sensors into a cell array. When you call a particular sensor, pass in the platform-centric target pose and target profile information. The sensor converts this information to sensor-centric poses. Target poses are outputs of the targetPoses object function.
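As a sketch of this association (the scenario setup and variable names here are illustrative, not a required workflow):

```matlab
% Associate a platform with its sensors in a cell array (illustrative sketch).
scene = trackingScenario('UpdateRate',10);
plat  = platform(scene);
radar = fusionRadarSensor(1,'UpdateRate',10);
platformAndSensors = {plat, {radar}};   % one platform paired with its sensor list

% When stepping a sensor, pass in the platform-centric target poses, e.g.:
% dets = radar(targetPoses(plat), scene.SimulationTime);
```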
You can create a radar sensor using the fusionRadarSensor object. Set the radar properties using name-value pairs and then execute the simulator. For example:
radar1 = fusionRadarSensor( ...
    'SensorIndex',1, ...
    'UpdateRate',10, ...            % Hz
    'ReferenceRange',111.0e3, ...   % m
    'ReferenceRCS',0.0, ...         % dBsm
    'FieldOfView',[70,10], ...      % [az;el] deg
    'HasElevation',false, ...
    'HasRangeRate',false, ...
    'AzimuthResolution',1.4, ...    % deg
    'RangeResolution',135.0)        % m
There are several syntaxes of
fusionRadarSensor that make it easier
to specify the properties of commonly implemented radar scan modes.
sensor = fusionRadarSensor('Rotator') creates a fusionRadarSensor object that mechanically scans 360° in azimuth. Setting the HasElevation property to true points the radar antenna towards the center of the elevation field of view.
sensor = fusionRadarSensor('Sector') creates a fusionRadarSensor object that mechanically scans a 90° azimuth sector. Setting the HasElevation property to true points the radar antenna towards the center of the elevation field of view. You can change the ScanMode property to 'Electronic' to electronically scan the same azimuth sector. In an electronic sector scan, the antenna is not mechanically tilted. Instead, beams are stacked electronically to process the entire elevation spanned by the scan limits in a single dwell.
sensor = fusionRadarSensor('Raster') returns a fusionRadarSensor object that mechanically scans a raster pattern spanning 90° in azimuth and 10° in elevation upwards from the horizon. You can change the ScanMode property to 'Electronic' to perform an electronic raster scan in the same volume.
sensor = fusionRadarSensor('No scanning') returns a fusionRadarSensor object that stares along the radar antenna boresight direction. No mechanical or electronic scanning is performed.
You can set other radar properties when you use these syntaxes. For example,
sensor = fusionRadarSensor(1,'Raster','ScanMode','Electronic')
The properties specific to the fusionRadarSensor object are listed here. For more detailed information, see the fusionRadarSensor reference page.
Sensor location parameters.
A unique identifier for each sensor.
Rate at which sensor updates are generated, specified as a positive scalar. The reciprocal of this property must be an integer multiple of the simulation time interval. Updates requested between sensor update intervals do not return detections.
Sensor (x,y,z) location defining the offset of the sensor origin from the origin of its platform. The default value positions the sensor origin at the platform origin.
Yaw, pitch, and roll angles of the sensor mounting frame with respect to the platform frame.
Specifies the coordinate system for detections reported in the Detections output.
Probability of detecting a target with radar cross section ReferenceRCS at the range specified by ReferenceRange.
The probability of a false detection within each resolution cell of the radar. Resolution cells are determined from the resolution properties of the sensor, such as AzimuthResolution and RangeResolution.
Range at which a target with radar cross section ReferenceRCS is detected with the probability specified by DetectionProbability.
The target radar cross section (RCS) in dBsm at which the target is detected at the range specified by ReferenceRange.
Sensor resolution and bias parameters.
The radar azimuthal resolution defines the minimum separation in azimuth angle at which the radar can distinguish two targets.
The radar elevation resolution defines the minimum separation in elevation angle at which the radar can distinguish two targets. This property only applies when the HasElevation property is true.
The radar range resolution defines the minimum separation in range at which the radar can distinguish two targets.
The radar range rate resolution defines the minimum separation in range rate at which the radar can distinguish two targets. This property only applies when the HasRangeRate property is true.
This property defines the azimuthal bias component of the radar as a fraction of the radar azimuthal resolution specified by the AzimuthResolution property.
This property defines the elevation bias component of the radar as a fraction of the radar elevation resolution specified by the ElevationResolution property.
This property defines the range bias component of the radar as a fraction of the radar range resolution specified by the RangeResolution property.
This property defines the range rate bias component of the radar as a fraction of the radar range rate resolution specified by the RangeRateResolution property.
This property allows the radar sensor to scan in elevation and estimate elevation from target detections.
This property allows the radar sensor to estimate range rate.
This property allows the radar sensor to generate false alarm detection reports.
When true, the radar does not resolve range ambiguities. When a radar sensor cannot resolve range ambiguities, targets at ranges beyond the maximum unambiguous range are wrapped into the interval [0, MaxUnambiguousRange].
When true, the radar does not resolve range rate ambiguities. When a radar sensor cannot resolve range rate ambiguities, targets at range rates above the maximum unambiguous radial speed are wrapped into the interval [-MaxUnambiguousRadialSpeed, MaxUnambiguousRadialSpeed].
Specifies if noise is added to the sensor measurements. Set this property to false to generate noise-free measurements.
Enable occlusion from extended objects, specified as true or false.
Set this property to true to enable an optional input argument to pass the current estimate of the sensor platform pose to the sensor. This pose information is added to the MeasurementParameters field of the reported detections.
Range and range rate parameters.
This property specifies the range at which the radar can unambiguously resolve the range of a target. Targets detected at ranges beyond the unambiguous range are wrapped into the range interval [0, MaxUnambiguousRange]. This property also defines the maximum range at which false alarms are generated. This property only applies to false target detections when you set the HasFalseAlarms property to true.
This property specifies the maximum magnitude of the radial speed at which the radar can unambiguously resolve the range rate of a target. Targets detected at range rates whose magnitude is greater than the maximum unambiguous radial speed are wrapped into the range rate interval [-MaxUnambiguousRadialSpeed, MaxUnambiguousRadialSpeed]. This property also defines the range rate interval over which false target detections are generated. This property only applies to false target detections when you set both the HasFalseAlarms and HasRangeRate properties to true.
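For example, the range wrapping can be sketched as follows (the specific ranges here are illustrative):

```matlab
% Sketch: a radar that cannot resolve range ambiguities (values illustrative).
sensor = fusionRadarSensor(1, ...
    'HasRangeAmbiguities',true, ...
    'MaxUnambiguousRange',100e3);       % m
% A target at a true range of 150 km lies beyond the unambiguous range, so its
% reported range is wrapped into [0, 100e3], i.e. approximately 50 km.
```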
Each sensor created by
fusionRadarSensor accepts as input an array of
target structures. This structure serves as the interface between the trackingScenario object and the sensors. You create the target
struct from target poses and profile information produced by
trackingScenario or equivalent software.
The structure contains these fields.
Unique identifier for the platform, specified as a positive integer. This is a required field with no default value.
User-defined integer used to classify
the type of target, specified as a nonnegative integer.
Position of target in platform coordinates, specified as a real-valued, 1-by-3 vector. This is a required field with no default value. Units are in meters.
Velocity of target in platform coordinates, specified as a real-valued, 1-by-3 vector.
Units are in meters per second. The default is [0 0 0].
Acceleration of target in platform coordinates, specified as a 1-by-3
row vector. Units are in meters per second squared. The default is [0 0 0].
Orientation of the target with respect to platform coordinates, specified as a scalar
quaternion or a 3-by-3 rotation matrix. Orientation defines the frame
rotation from the platform coordinate system to the current target body
coordinate system. Units are dimensionless. The default is quaternion(1,0,0,0).
Angular velocity of the target in platform coordinates, specified as a
real-valued, 1-by-3 vector. The magnitude of the vector defines the angular
speed. The direction defines the axis of clockwise rotation. Units are in
degrees per second. The default is [0 0 0].
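A minimal target structure might be built as follows (all values here are illustrative):

```matlab
% Minimal target structure; PlatformID and Position are the required fields.
tgt = struct( ...
    'PlatformID',2, ...
    'ClassID',0, ...
    'Position',[10e3 5e3 -1e3], ...     % m, platform coordinates
    'Velocity',[-100 0 0], ...          % m/s
    'Acceleration',[0 0 0], ...         % m/s^2
    'Orientation',quaternion(1,0,0,0), ...
    'AngularVelocity',[0 0 0]);         % deg/s
```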
You can create a target pose structure by merging the platform profile information output from the platformProfiles method of trackingScenario and the target pose information output from the targetPoses method on the platform carrying the sensors. You can merge them by extracting, for each PlatformID in the target poses array, the profile information in the platform profiles array for the same PlatformID.
The targetPoses method returns this structure for each target other than the platform.
The platformProfiles method returns this structure for all platforms
in the scenario.
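One way to merge the two structure arrays by PlatformID (a sketch; scene and plat are assumed to be an existing scenario and platform):

```matlab
% Merge target poses with the matching profile information (illustrative sketch).
poses    = targetPoses(plat);           % platform-centric target poses
profiles = platformProfiles(scene);     % profiles for all platforms in the scenario
targets  = poses;
for k = 1:numel(poses)
    p  = profiles([profiles.PlatformID] == poses(k).PlatformID);
    fn = fieldnames(p);
    for f = 1:numel(fn)
        targets(k).(fn{f}) = p.(fn{f}); % copy profile fields onto the pose struct
    end
end
```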
Detections consist of measurements of positions and velocities of targets and their covariance matrices. Detections are constructed with respect to sensor coordinates but can be output in one of several coordinates. Multiple coordinate frames are used to represent the positions and orientations of the various platforms and sensors in a scenario.
In a radar simulation, there is always a top-level global coordinate system, which is usually the North-East-Down (NED) Cartesian coordinate system defined by a tangent plane at any point on the surface of the Earth. The trackingScenario object models the motion of platforms in the global coordinate system. When you create a
platform, you specify its location and orientation relative to the global frame. These
quantities define the body axes of the platform. Each radar sensor is mounted on the
body of a platform. When you create a sensor, you specify its location and orientation
with respect to the platform body coordinates. These quantities define the sensor axes.
The body and radar axes can change over time; the global axes do not.
Additional coordinate frames can be required. For example, often tracks are not maintained in NED (or ENU) coordinates, as this coordinate frame changes based on the latitude and longitude where it is defined. For scenarios that cover large areas (over 100 kilometers in each dimension), earth-centered earth-fixed (ECEF) can be a more appropriate global frame to use.
A radar sensor generates measurements in spherical coordinates relative to its sensor frame. However, the locations of the objects in the radar scenario are maintained in a top-level frame. A radar sensor is mounted on a platform and will, by default, only be aware of its position and orientation relative to the platform on which it is mounted. In other words, the radar expects all target objects to be reported relative to the platform body axes. The radar reports the required transformations (position and orientation) to relate the reported detections to the platform body axes. These transformations are used by consumers of the radar detections (e.g. trackers) to maintain tracks in the platform body axes. Maintaining tracks in the platform body axes enables the fusion of measurement or track information across multiple sensors mounted on the same platform.
If the platform is equipped with an inertial navigation system (INS) sensor, then the location and orientation of the platform relative to the top-level frame can be determined. This INS information can be used by the radar to reference all detections to scenario coordinates.
When you specify HasINS as true, you must pass an INS struct into the step method. This structure consists of the position, velocity, and orientation of the platform in scenario coordinates. These parameters let you express target poses in scenario coordinates by setting the DetectionCoordinates property to 'Scenario'.
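A sketch of this workflow (the pose values are illustrative):

```matlab
% Sketch: report detections in scenario coordinates using INS information.
sensor = fusionRadarSensor(1, ...
    'HasINS',true, ...
    'DetectionCoordinates','Scenario');
ins = struct( ...
    'Position',[100 200 -300], ...      % m, scenario coordinates
    'Velocity',[10 0 0], ...            % m/s
    'Orientation',quaternion(1,0,0,0));
% dets = sensor(targets, ins, simTime);
```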
Radar sensor detections are returned as a cell array of objectDetection objects. A detection contains these properties.
|MeasurementNoise|Measurement noise covariance matrix|
|SensorIndex|Unique ID of the sensor|
|MeasurementParameters|Parameters used by initialization functions of any nonlinear Kalman tracking filters|
|ObjectAttributes|Additional information passed to tracker|
Measurements and measurement noise are reported in the coordinate system specified by the DetectionCoordinates property of the fusionRadarSensor object. When DetectionCoordinates is 'Sensor rectangular', detections are reported in sensor Cartesian coordinates.
|DetectionCoordinates|Measurement and Measurement Noise Coordinates|

Coordinate Dependence on HasRangeRate and HasElevation
The MeasurementParameters field consists of an array of
structs describing a sequence of coordinate transformations from
a child frame to a parent frame or the inverse transformations (see Frame Rotation). The longest
possible sequence of transformations is: Sensor → Platform → Scenario. For example, if
the detections are reported in sensor spherical coordinates and
HasINS is set to false, then the sequence consists of one
transformation from sensor to platform. If
HasINS is true, the
sequence of transformations consists of two transformations – first to platform
coordinates then to scenario coordinates. Trivially, if the detections are reported in
platform rectangular coordinates and
HasINS is set to false, the
transformation consists only of the identity.
Each struct takes the form:
Enumerated type indicating the frame used to report measurements. When detections are reported using a rectangular coordinate system, Frame is set to 'rectangular'. When detections are reported in spherical coordinates, Frame is set to 'spherical'.
Position offset of the origin of frame(k) from the origin of frame(k+1) represented as a 3-by-1 vector.
Velocity offset of the origin of frame(k) from the origin of frame(k+1) represented as a 3-by-1 vector.
A 3-by-3 real-valued orthonormal frame rotation matrix which rotates the axes of frame(k+1) into alignment with the axes of frame(k).
A logical scalar indicating if Orientation performs a frame rotation from the parent coordinate frame to the child coordinate frame.
A logical scalar indicating if the frame has three-dimensional position. Only set to false for the first struct in the array.
A logical scalar indicating if the reported detections
include velocity measurements.
Identifier of the platform that generated the detection. For false alarms, this value is negative.
Detection signal-to-noise ratio in dB.
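As a sketch of how a consumer of detections might apply a single sensor-to-platform transformation to a rectangular-frame measurement (det is assumed to be an objectDetection whose first three measurement elements are a Cartesian position):

```matlab
% Express a rectangular-frame position measurement in the parent frame (sketch).
mp = det.MeasurementParameters(1);
R  = mp.Orientation;
if mp.IsParentToChild
    R = R.';                            % convert to a child-to-parent rotation
end
posInParent = mp.OriginPosition(:) + R*det.Measurement(1:3);
```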