This video series provides an overview of sensor fusion and multi-object tracking in autonomous systems. Starting with sensor fusion to determine positioning and localization, the series builds up to tracking single objects with an IMM filter, and concludes with multi-object tracking and track-level fusion.
Part 1: What is Sensor Fusion? This video provides an overview of what sensor fusion is and how it helps in the design of autonomous systems. It also covers a few scenarios that illustrate the various ways in which sensor fusion can be implemented.
Part 2: Fusing a Mag, Accel, and Gyro to Estimate Orientation This video describes how we can use a magnetometer, an accelerometer, and a gyroscope to estimate an object’s orientation. The goal is to show how these sensors contribute to the solution, and to explain a few things to watch out for along the way.
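The video covers the full orientation problem; as a rough flavor of how a gyro and an accelerometer complement each other, here is a minimal complementary-filter sketch for a single pitch angle. It blends the gyro's integrated rate (smooth but drifting) with the accelerometer's gravity-based tilt (noisy but drift-free). The function name and the blend factor `alpha` are illustrative, not from the series, which discusses a more complete fusion algorithm.

```python
import math

def complementary_filter(pitch_prev, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    # Gyro path: integrate angular rate; accurate short-term, drifts long-term
    pitch_gyro = pitch_prev + gyro_rate * dt
    # Accelerometer path: tilt from the gravity vector; noisy but drift-free
    pitch_accel = math.atan2(accel_x, accel_z)
    # Blend: trust the gyro at high frequency, the accelerometer at low frequency
    return alpha * pitch_gyro + (1 - alpha) * pitch_accel
```

Even with no gyro input, the accelerometer term slowly pulls the estimate toward the true tilt, which is exactly the drift correction the video motivates.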
Part 3: Fusing a GPS and IMU to Estimate Pose This video describes how we can use a GPS and an IMU to estimate an object’s orientation and position. We’ll go over the structure of the algorithm and show you how the GPS and IMU both contribute to the final solution.
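One common loosely coupled structure for this problem predicts state forward with IMU accelerations and corrects it with GPS position fixes. Below is a minimal one-dimensional Kalman-filter sketch of that predict/correct split; the state is `[position, velocity]`, and the noise values `q` and `r` are illustrative assumptions, not parameters from the video.

```python
import numpy as np

def predict(x, P, accel, dt, q=0.1):
    # IMU step: propagate [position, velocity] using the measured acceleration
    F = np.array([[1.0, dt], [0.0, 1.0]])
    u = np.array([0.5 * dt**2, dt]) * accel
    x = F @ x + u
    P = F @ P @ F.T + q * np.eye(2)   # process noise inflates uncertainty
    return x, P

def correct(x, P, gps_pos, r=4.0):
    # GPS step: Kalman update using a position-only measurement
    H = np.array([[1.0, 0.0]])
    S = H @ P @ H.T + r               # innovation covariance
    K = P @ H.T / S                   # Kalman gain
    x = x + (K * (gps_pos - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

The IMU runs the fast `predict` loop; whenever a (slower) GPS fix arrives, `correct` pulls the drifting dead-reckoned estimate back toward the measured position.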
Part 4: Tracking a Single Object With an IMM Filter This video describes how we can track a single object by estimating its state with an interacting multiple model filter. We build up some intuition about the IMM filter and show how it is a better tracking algorithm than a single-model Kalman filter.
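The heart of the IMM filter is maintaining a probability for each motion model and letting the models hand information to each other through a Markov switching matrix. Here is a minimal sketch of just that mode-probability step (the per-model Kalman filters are omitted); the function name and the specific transition matrix are illustrative assumptions.

```python
import numpy as np

def imm_mode_update(mu, likelihoods, trans):
    """One IMM mode-probability cycle.
    mu:          current mode probabilities, shape (n_models,)
    likelihoods: each model's likelihood of the latest measurement
    trans:       Markov transition matrix, trans[i, j] = P(model j | model i)
    """
    c = trans.T @ mu            # predicted mode probabilities after switching
    mu_new = likelihoods * c    # Bayes update with the measurement evidence
    return mu_new / mu_new.sum()
```

When one model (say, constant velocity) keeps explaining the measurements better, its probability dominates, and the IMM's combined estimate behaves like that model while still leaving a small probability for the others to take over during a maneuver.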
Part 5: How to Track Multiple Objects at Once This video describes two common problems that arise when tracking multiple objects: data association and track maintenance. We cover a few ways to solve these issues and provide a general way to approach all multi-object tracking problems.
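To make the data association problem concrete, here is a greedy nearest-neighbor sketch: each track claims the closest unclaimed detection inside a distance gate, and leftover detections become candidates for new tracks (the track-maintenance side of the problem). This is a deliberately simple stand-in; practical trackers typically solve the assignment globally (e.g. GNN or JPDA), and the gate value here is an illustrative assumption.

```python
import numpy as np

def gnn_associate(tracks, detections, gate=3.0):
    """Greedy nearest-neighbor association of 2-D track and detection positions.
    Returns (assignments, unassigned): a track->detection map and the indices
    of detections no track claimed."""
    assignments = {}
    used = set()
    for ti, t in enumerate(tracks):
        dists = [np.linalg.norm(np.asarray(t) - np.asarray(d)) for d in detections]
        for di in np.argsort(dists):
            # Skip detections already claimed or outside this track's gate
            if di not in used and dists[di] <= gate:
                assignments[ti] = int(di)
                used.add(int(di))
                break
    unassigned = [i for i in range(len(detections)) if i not in used]
    return assignments, unassigned
```

An unassigned detection might start a tentative track; a track that repeatedly finds nothing in its gate is a candidate for deletion, which is the maintenance logic the video discusses.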
Part 6: What is Track-Level Fusion? This video introduces track-level fusion by providing some intuition into the types of tracking situations that require it and some of the challenges associated with it.
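One challenge the video touches on is that tracks from different trackers may share information (for instance, they observed the same object through correlated sources), so naively fusing them double-counts evidence. Covariance intersection is one standard technique that stays consistent without knowing the cross-correlation; the sketch below is illustrative, with an arbitrary fixed weight `w` rather than an optimized one.

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, w=0.5):
    """Fuse two track estimates (x, P) whose cross-correlation is unknown.
    Blends the information (inverse-covariance) forms with weight w, which
    guarantees the fused covariance is not overconfident."""
    P1i, P2i = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(w * P1i + (1 - w) * P2i)
    x = P @ (w * P1i @ x1 + (1 - w) * P2i @ x2)
    return x, P
```

Note that with equal weights and equal covariances the fused covariance does not shrink, unlike a naive Kalman-style combination; that conservatism is the price of not knowing how much information the two tracks share.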