
Angle-Only Tracking with Passive Scanning Sensors

Since R2023b

This example shows how to track multiple targets and create a maritime situational picture with passive sensors. Passive sensors do not emit any energy toward their targets and rely solely on emissions from the targets for detection. As a result, passive sensors cannot provide a full position measurement of a detected target. In this example, the passive sensors you use are infrared sensors that scan the region of interest and provide angle-only detections of the targets.
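To make the measurement format concrete, here is a minimal sketch of a single angle-only detection represented as an objectDetection object. The azimuth, elevation, and noise values are hypothetical, not taken from this example; the measurement parameters mark the detection as spherical with no range component, which is how measurement functions such as cvmeas interpret it.

% Minimal sketch of an angle-only detection (hypothetical values)
% The measurement is [azimuth; elevation] in degrees, with no range.
angleOnlyDetection = objectDetection(0, [30; 5], ...
    MeasurementNoise = diag([0.1 0.1]), ...   % angular variances (deg^2)
    MeasurementParameters = struct( ...
        "Frame", "spherical", ...             % spherical sensor frame
        "HasRange", false, ...                % range is not measured
        "HasElevation", true));               % elevation is measured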

Create Scenario and Visualization

In this section, you create a marine scenario using a trackingScenario object, based on the Angle and Position Measurement Fusion for Marine Surveillance example. The scenario consists of small civilian boats in the Boston Harbor as well as a large cargo vessel exiting the harbor. You simulate these objects as point targets, which generate at most one detection per object at each time step. You set the IsEarthCentered property of trackingScenario to true to simulate a geo-referenced scenario. Further, you use geoTrajectory System objects™ to specify the trajectory of each object.
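The snippet below is a rough sketch of this setup: a geo-referenced scenario with one platform that follows a geoTrajectory. The waypoints, times, and update rate are placeholder values, not the ones used by the helper function in this example.

% Hedged sketch of a geo-referenced scenario (hypothetical values)
scene = trackingScenario(IsEarthCentered = true, UpdateRate = 10);
boat = platform(scene);
boat.Trajectory = geoTrajectory( ...
    [42.35 -70.99 0; 42.36 -70.97 0], ... % waypoints as [lat lon alt]
    [0; 120]);                            % time of arrival at each waypoint (s)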

To detect the vessels, several scanning infrared sensors are mounted on towers on a few islands in the harbor. The sensors are positioned so that their coverage areas overlap. Note that as the sensors scan the harbor region, they detect each target at a different time. Therefore, it is not feasible to perform static fusion as discussed in the Tracking Using Distributed Synchronous Passive Sensors example.
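Continuing the sketch above, each tower sensor resembles the following configuration: a mechanically scanning irSensor mounted on a static platform. The tower location, field of view, and update rate shown here are placeholders rather than the values used by the helper function.

% Hedged sketch of a scanning infrared sensor on a tower (hypothetical values)
tower = platform(scene, Position = [42.34 -70.96 20]); % [lat lon alt]
tower.Sensors = {irSensor(1, ...
    ScanMode = "Mechanical", ...  % mechanically sweep the field of view
    FieldOfView = [20; 10], ...   % instantaneous [azimuth; elevation] FOV (deg)
    UpdateRate = 10)};            % scan steps per second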

For more details on the scenario, see the helper function, helperCreateMarineIRScenario, attached with this example.

% Create scenario
scenario = helperCreateMarineIRScenario;

You use the trackingGlobeViewer object to visualize the ground truth, instantaneous sensor coverages, detections, and the tracks generated by the tracker. To generate a bird's-eye view of the scene, you set the camera position 7 kilometers above the ground using the campos object function.

% Create a uifigure
sz = get(groot,"ScreenSize");
fig = uifigure(Position = [0.1*sz(3) 0.1*sz(4) 0.8*sz(3) 0.8*sz(4)]);

% Create tracking globe viewer
viewer = trackingGlobeViewer(fig,NumCovarianceSigma=1,TrackLabelScale=2);

% Set position of camera
meanLat = mean(cellfun(@(x)x.Position(1),scenario.Platforms));
meanLong = mean(cellfun(@(x)x.Position(2),scenario.Platforms));
campos(viewer,[meanLat+0.005 meanLong 7e3]);

% Plot all platforms
plotPlatform(viewer,[scenario.Platforms{:}],TrajectoryMode="Full",...
    LineWidth=2,Marker="d",Color=[0.0745 0.6235 1.0000]);

% Plot all sensor coverages
covcon = coverageConfig(scenario);
[covcon.Range] = deal(10e3);
plotCoverage(viewer,covcon,"ECEF",Color=[1.0000 0.0745 0.6510],Alpha=0.2);

The image below shows the trajectories of all targets in the scene in blue. Each blue diamond represents the starting position of the corresponding trajectory.

% Take snapshot
drawnow;
snapshot(viewer);


Tracking Challenges in the Scenario

The picture below shows an annotated snapshot near the end of the scenario. True targets are depicted as blue triangles and tracks as green squares, each with its history shown as a line of the same color. Instantaneous sensor coverage areas are shown in transparent purple, and the angle-only detections are shown as purple lines. This scenario presents the well-known challenges of tracking with angle-only detections, as well as challenges unique to scanning sensors.

Common angle-only challenges:

  • Partial observability: At the top of the picture, the target is detected by a single angle-only detection. Because the angle-only detection does not provide the range from the sensor to the target, the tracker can only update the target state along the observable direction (see the first sketch after this list).

  • Missed detections: Two targets are undetected even though they are within the sensor coverage. All multi-target trackers must handle cases in which some targets are not detected during an update.

  • Ghost intersections: A common way to overcome partial observability is to look for intersections of two or more angle-only detections to form a full position measurement. However, as you can see in the middle section of the picture, the total number of intersections far exceeds the number of intersections that correspond to real targets. These false intersections are called "ghosts" (see the second sketch after this list). The tracker must be careful not to confirm a track too quickly based on these ghosts.

Scanning sensor challenge:

  • Incomplete coverage area: The right-hand side of the picture shows a large area with no coverage at that time step. As the sensors scan the environment, some areas remain uncovered by the sensor fields of view, and targets in those areas cannot be detected at that time step. The tracker must maintain tracks on these targets, without deleting them, until the targets are detected again.
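Two of the angle-only effects above are easy to reproduce in isolation. The first sketch evaluates cvmeas, the measurement function used later in this example, on two hypothetical constant-velocity states that lie at different ranges along the same bearing. The resulting angle-only measurements are identical, which is exactly the partial observability problem.

% Partial observability: different ranges, identical angle-only measurements
mp = struct("Frame","spherical","HasRange",false,"HasElevation",true);
state1 = [1000; 0; 1000; 0; 0; 0];   % [x;vx;y;vy;z;vz], about 1.4 km range
state2 = [5000; 0; 5000; 0; 0; 0];   % same bearing, about 7 km range
disp(cvmeas(state1,mp))              % [azimuth; elevation] in degrees
disp(cvmeas(state2,mp))              % identical measurement

The second sketch illustrates the ghost problem numerically, assuming two bearing-only sensors at hypothetical 2-D positions observing two targets. Pairing the bearings from the two sensors yields four line intersections, but only two of them correspond to real targets.

% Ghost intersections from two bearing-only sensors (hypothetical geometry)
s1 = [0; 0];  s2 = [1000; 0];        % sensor positions (m)
targets = [400 700; 300 500];        % true target positions (m), one per column
for i = 1:2
    for j = 1:2
        d1 = targets(:,i) - s1;  b1 = atan2(d1(2),d1(1)); % bearing from sensor 1
        d2 = targets(:,j) - s2;  b2 = atan2(d2(2),d2(1)); % bearing from sensor 2
        % Solve s1 + r(1)*[cos(b1);sin(b1)] = s2 + r(2)*[cos(b2);sin(b2)]
        A = [cos(b1) -cos(b2); sin(b1) -sin(b2)];
        r = A \ (s2 - s1);
        p = s1 + r(1)*[cos(b1); sin(b1)];
        fprintf("Pairing (%d,%d): intersection at [%4.0f %4.0f] m\n",i,j,p(1),p(2));
    end
end
% Pairings (1,1) and (2,2) recover the true targets; (1,2) and (2,1) are ghosts.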

Configure Tracker and Performance Metric

The example uses a Gaussian Mixture (GM) Probability Hypothesis Density (PHD) tracker, similar to the tracker used in the Asynchronous Angle-only Tracking with GM-PHD Tracker example. The GM-PHD tracker requires knowledge of the sensor configurations, provided as trackingSensorConfiguration objects. You obtain these configurations from the tracking scenario and set the sensor transform function to cvmeas, which maps a constant-velocity state into the measurement space of the sensor.

GM-PHD Tracker

sensorConfigs = trackingSensorConfiguration(scenario);
for i = 1:numel(sensorConfigs)
    sensorConfigs{i}.SensorTransformFcn = @cvmeas;
end

You configure the tracker to initialize the Gaussian mixture from unassigned detections. Because the detections lack information about the range to the target, you use range parameterization to initialize 10 components at ranges between 100 meters and 5000 meters from the sensor. The function initcvRPGMPHD, provided at the bottom of this example, implements the range parameterization.

% Range parameterization values
numComponents = 10;
rmin = 100;
rmax = 5000;

% Define a function to initialize the PHD from each sensor
for i = 1:numel(sensorConfigs)
    sensorConfigs{i}.FilterInitializationFcn = @(varargin)initcvRPGMPHD(varargin,numComponents,[rmin rmax]);
end
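As a quick sanity check, you can call the helper directly with a synthetic angle-only detection and confirm that it creates one Gaussian component per parameterized range. The detection values below are hypothetical, and the double cell wrapping mirrors how the tracker passes detections through the anonymous function above.

% Sanity check of the range-parameterized initialization (hypothetical detection)
det = objectDetection(0, [30; 5], ...
    MeasurementNoise = diag([0.1 0.1]), ...
    MeasurementParameters = struct("Frame", "spherical", ...
        "HasRange", false, "HasElevation", true));
phd = initcvRPGMPHD({{det}}, numComponents, [rmin rmax]);
disp(phd.NumComponents) % expected to display 10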

Finally, you construct the GM-PHD tracker using the sensor configurations. To account for the scanning sensors, you configure the tracker to accept updated sensor configurations at every step. To account for the large number of potential ghost targets, you increase the maximum number of components and the birth rate. To avoid confirming tracks too quickly, you increase the confirmation threshold. You also increase the deletion threshold to make it easier to delete low-weight components.

% Construct the PHD tracker
tracker = trackerPHD(SensorConfigurations=sensorConfigs,... % Sensor configurations
    HasSensorConfigurationsInput=true,... % Sensor configurations change with time
    MaxNumComponents=5000,... % Maximum number of PHD components
    MergingThreshold=5,... % Merging threshold
    ConfirmationThreshold=0.9,... % Track confirmation threshold
    BirthRate=10,... % Birth rate
    AssignmentThreshold=15,... % Threshold for adding birth components
    DeletionThreshold=0.01); % Threshold for deleting low-weight components

OSPA(2) Metric

In this example, you use the OSPA(2) metric to measure the performance of the tracker. The OSPA(2) metric evaluates tracking performance over the history of tracks, as opposed to the instantaneous estimates used in the traditional OSPA metric. As a result, the OSPA(2) metric penalizes phenomena such as track switching and track fragmentation more consistently. You configure the OSPA(2) metric by creating a trackOSPAMetric object and setting the Metric property to "OSPA(2)". You choose the absolute position error as the base distance between a track and a truth at each time instant by setting the Distance property to "posabserr".

% Define OSPA metric object
ospaMetric = trackOSPAMetric(Metric="OSPA(2)",...
    Distance="posabserr",...
    CutoffDistance=10);

Run Scenario and Track Objects

In this section, you run the scenario in a loop, generate detections and configurations from the sensor models, and feed these data to the multi-object GM-PHD tracker. To qualitatively assess the performance of the tracker, you visualize the results on the globe. To quantitatively assess the performance, you evaluate the OSPA(2) metric.

% Initialize OSPA metric values
numSamples = scenario.StopTime*scenario.UpdateRate + 1;
ospa = nan(numSamples,1);
counter = 1;

% Reset scenario, tracker, and view
restart(scenario);
reset(tracker);
clear(viewer);
trackLog = cell(1,numSamples);

% Set random seed for reproducible results
rng(2022);

while advance(scenario)
    % Current time
    time = scenario.SimulationTime;

    % Generate detections and update sensor configurations
    [detections, configs] = detect(scenario);

    % Update tracker
    tracks = tracker(detections, configs, time);
    trackLog{counter} = tracks; % Save log for visualization later
    
    % Update globe viewer
    updateDisplay(viewer, scenario, detections, tracks);
    
    % Calculate OSPA metric
    poses = platformPoses(scenario);
    ospa(counter) = ospaMetric(tracks, poses);
    counter = counter + 1;
    
    % Take snapshot of Platform 10 at 5.9 seconds 
    if abs(time - 5.9) < 0.05
        snap = zoomOnPlatform(viewer, scenario.Platforms{10}, 25);
    end
end

Results

The image below shows the ground truth as well as the track history for all platforms. The blue lines represent the ground-truth trajectories. The colored lines represent the track log, with a label showing the latest track update. Notice that the tracker is able to maintain tracks on all platforms.

% Clear the viewer and show just the platforms and tracks
clear(viewer);
posSelector = [1 0 0 0 0 0; 0 0 1 0 0 0; 0 0 0 0 1 0];
plotTrackLog(viewer,vertcat(trackLog{:}),"ECEF", LabelSampling=59.9, ...
    PositionSelector=posSelector,LineWidth=2,FontSize=20);
plotPlatform(viewer, [scenario.Platforms{:}],TrajectoryMode="Full", ...
    LineWidth=2,Marker="d",Color=[0.0745 0.6235 1.0000]);
snapshot(viewer);


The next image shows the estimated track of the 8th platform in the scenario. Notice that the track stays close to the actual position of the object and closely follows the true trajectory of the platform.

zoomOnPlatform(viewer,scenario.Platforms{8},250);


You can also quantitatively measure the tracking performance by plotting the OSPA(2) metric. Lower OSPA(2) values indicate better tracking, because the metric increases with tracking errors and assignment issues, including missed targets, false tracks, track switches, and track fragmentation.

The OSPA(2) metric begins to decrease after about ten seconds, or 100 tracker updates, which corresponds to the WindowLength property of the metric. After 10 additional seconds, the metric drops below 1, indicating that there are no missed targets or false tracks. The value stays low for the rest of the scenario.

time = linspace(0,scenario.StopTime,numSamples);
plot(time,ospa,LineWidth=2);
xlabel("Time (s)");
ylabel("Metric");

Figure: OSPA(2) metric value plotted against time, with x-axis Time (s) and y-axis Metric.

The accurate angle information from the infrared sensors allows the tracker to estimate the position of objects with high accuracy. The image below is a snapshot of a platform at about 5.9 seconds. From the image, you can observe the position uncertainty of the track estimate. The track uncertainty, denoted by the green ellipse, is small and fully contains the target at that time, which indicates unbiased and consistent tracking.

imshow(snap);


Summary

In this example, you learned how to simulate a geo-referenced marine scenario and generate synthetic measurements using scanning infrared sensors. You also learned how to design a range-parameterized GM-PHD multi-object tracker to fuse the infrared angle-only measurements. You assessed the tracking results by visualizing the tracks and ground truth on the globe and calculating the OSPA(2) metric.

Supporting Functions

initcvRPGMPHD Define range-parameterized GM-PHD filter

function filter = initcvRPGMPHD(detection,numComponents,rranges)
% Start with an empty constant-velocity GM-PHD filter. When the tracker
% calls this function without a detection (predictive birth), return the
% filter without adding any components.
filter = initcvgmphd;
if isempty(detection)
    return;
end

% detection is the varargin cell from the anonymous function, so
% detection{1} is the detection cell array. Use initrpekf to get the
% states and covariances of the range-parameterized filter bank.
rpekf = initrpekf(detection{1}{1},numComponents,rranges);

% Add components to the PHD filter
for i = 1:numComponents
    % Gaussian component corresponding to one assumed range
    thisFilter = gmphd(rpekf.TrackingFilters{i}.State,...
        rpekf.TrackingFilters{i}.StateCovariance);

    % Append the component to the PHD filter to build the Gaussian mixture
    append(filter, thisFilter);
end
end

updateDisplay Update the globe viewer

function updateDisplay(viewer, scenario, detections, tracks)

% Define position and velocity selectors
posSelector = [1 0 0 0 0 0;0 0 1 0 0 0;0 0 0 0 1 0];
velSelector = [0 1 0 0 0 0;0 0 0 1 0 0;0 0 0 0 0 1];

% Color order
colorOrder = [1.0000    1.0000    0.0667
    0.0745    0.6235    1.0000
    1.0000    0.4118    0.1608
    0.3922    0.8314    0.0745
    0.7176    0.2745    1.0000
    0.0588    1.0000    1.0000
    1.0000    0.0745    0.6510];

% Plot Platforms
plotPlatform(viewer, [scenario.Platforms{:}],"ECEF",Color=colorOrder(2,:),TrajectoryMode="Full");

% Plot coverage, limiting the displayed range of all sensors to 10 km
covConfig = coverageConfig(scenario);
[covConfig.Range] = deal(10e3);
plotCoverage(viewer,covConfig,"ECEF",Color=colorOrder(7,:),Alpha=0.2);

% Plot Detections
plotDetection(viewer,detections,"ECEF",Color=colorOrder(7,:),RayLength=10000);

% Plot Tracks
plotTrack(viewer,tracks,"ECEF",PositionSelector=posSelector,...
    VelocitySelector=velSelector,LineWidth=2,Color=colorOrder(4,:));

end

zoomOnPlatform Take a snapshot of the platform from a certain height

function varargout = zoomOnPlatform(viewer, platform, height)
currentPos = campos(viewer);
currentOrient = camorient(viewer);
pos = platform.Position;
campos(viewer,[pos(1) pos(2) height]);
drawnow;
[varargout{1:nargout}] = snapshot(viewer);
campos(viewer,currentPos);
camorient(viewer,currentOrient);
end