Track multiple objects in autonomous and surveillance applications
Multi-object tracking and sensor fusion are at the heart of perception, a critical component of both autonomous and surveillance systems. Sensors such as cameras, lidar, radar, and sonar generate detections that serve as inputs to trackers. Multi-object tracking algorithms estimate the number of objects in a scene along with each object's state, including position, velocity, and in some cases size and orientation. This information enables autonomous and surveillance systems to maintain situational awareness.
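To illustrate the state estimation at the core of a tracker, here is a minimal constant-velocity Kalman filter for a single track in Python. This is a generic sketch, not the toolbox's API; the motion model, measurement model, and noise covariances are illustrative assumptions.

```python
import numpy as np

dt = 1.0  # time step between sensor scans (assumed)

# Constant-velocity motion model: state = [x, y, vx, vy]
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)  # sensor reports position only
Q = 0.01 * np.eye(4)  # process noise (illustrative)
R = 0.25 * np.eye(2)  # measurement noise (illustrative)

def predict(x, P):
    """Propagate the track state and covariance one step forward."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    """Correct the prediction with a position measurement z."""
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Track an object moving at 1 m/s along x, measured once per second
x, P = np.zeros(4), np.eye(4)
for k in range(5):
    x, P = predict(x, P)
    x, P = update(x, P, np.array([float(k), 0.0]))

print(x)  # position estimate approaches (4, 0), velocity approaches (1, 0)
```

After a few scans the velocity estimate converges toward the true 1 m/s even though the sensor reports position only; this is the per-track building block that a multi-object tracker runs for every confirmed track.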
Multi-object tracking performance is driven by factors such as:
- Sensor parameters including probability of detection (Pd), resolution, and accuracy
- The number of targets and detections present
- The presence of false alarms (clutter): measurements that do not originate from any real object
- Ambiguity in associating measurements with the objects being tracked
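The last two factors above are what data association must resolve: deciding which detection belongs to which track, and which detections are clutter. A minimal sketch of greedy nearest-neighbor association with a distance gate is shown below (a simplified illustration, not the toolbox's assignment algorithm; the positions, gate size, and track IDs are made up).

```python
import math

# Predicted track positions (track ID -> (x, y)), all values illustrative
tracks = {1: (0.0, 0.0), 2: (10.0, 10.0)}

# Current scan: two real detections plus one false alarm
detections = [(0.4, -0.3), (25.0, 25.0), (9.8, 10.1)]

GATE = 2.0  # maximum distance for a detection to be assigned to a track

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

assignments = {}   # detection index -> track ID
unassigned = []    # candidate false alarms or new objects
for j, z in enumerate(detections):
    # nearest track not yet taken by an earlier detection
    free = [tid for tid in tracks if tid not in assignments.values()]
    best = min(free, key=lambda tid: dist(tracks[tid], z), default=None)
    if best is not None and dist(tracks[best], z) <= GATE:
        assignments[j] = best
    else:
        unassigned.append(j)

print(assignments)  # {0: 1, 2: 2}
print(unassigned)   # [1] -- the clutter detection falls outside every gate
```

Detections that fail every gate are treated as clutter or as seeds for tentative new tracks; production trackers replace the greedy loop with a globally optimal assignment and probabilistic association, but the gating idea is the same.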
With MATLAB® and Sensor Fusion and Tracking Toolbox™, you can track objects with data from real-world sensors, including active and passive radar, sonar, lidar, EO/IR, IMU, and GPS. You can also generate synthetic data from virtual sensors to test your algorithms under different scenarios. The toolbox includes a library of multi-object trackers and estimation filters that you can further customize for your application. You can also generate C code with MATLAB Coder™ to accelerate simulations or to get a head start on your prototype system.
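The idea behind synthetic detection generation can be sketched in a few lines: model the sensor's probability of detection, measurement noise, and clutter rate, then sample a scan. This is a generic illustration, not the toolbox's sensor models; the function name, parameters, and default values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_scan(true_positions, pd=0.9, clutter_mean=2.0,
                  noise_std=0.5, fov=((0.0, 100.0), (0.0, 100.0))):
    """One synthetic sensor scan: each true object is detected with
    probability pd and Gaussian position noise; a Poisson-distributed
    number of false alarms is scattered uniformly over the field of view."""
    detections = []
    for x, y in true_positions:
        if rng.random() < pd:  # missed detection with probability 1 - pd
            detections.append((x + rng.normal(0.0, noise_std),
                               y + rng.normal(0.0, noise_std)))
    for _ in range(rng.poisson(clutter_mean)):  # clutter (false alarms)
        detections.append((rng.uniform(*fov[0]), rng.uniform(*fov[1])))
    return detections

# Two true objects; the returned scan may contain noise-perturbed
# detections of each, missed detections, and false alarms
scan = simulate_scan([(20.0, 30.0), (70.0, 55.0)])
print(scan)
```

Feeding many such scans with varying `pd` and `clutter_mean` into a tracker is a simple way to stress-test association and track-management logic before using real sensor data.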
To learn more about multi-object tracking, see Sensor Fusion and Tracking Toolbox™ for use with MATLAB.
Examples and How To
Tracking for Autonomous Systems
Tracking for Surveillance Systems
Testing Multi-Object Trackers
Generating C Code for Multi-Object Trackers
See also: sensor fusion, tracking with passive sensors, radar tracking, Phased Array System Toolbox, Automated Driving Toolbox, Lidar Toolbox, Computer Vision Toolbox, UAV Toolbox