How to use filters other than the simple Kalman filter in the Motion-Based Multiple Object Tracking Example
Hi,
I have found the Motion-Based Multiple Object Tracking Example very useful in various problems. The example states at the end: "The likelihood of tracking errors can be reduced by using a more complex motion model, such as constant acceleration, or by using multiple Kalman filters for every object. Also, you can incorporate other cues for associating detections over time, such as size, shape, and color. "
I would like to try different filters, such as those listed in the MATLAB documentation as usable with the predict and correct functions:
Filter for object tracking, specified as one of these objects:
- trackingEKF — Extended Kalman filter
- trackingUKF — Unscented Kalman filter
- trackingABF — Alpha-beta filter
- trackingCKF — Cubature Kalman filter
- trackingIMM — Interacting multiple model (IMM) filter
- trackingGSF — Gaussian-sum filter
- trackingPF — Particle filter
- trackingMSCEKF — Extended Kalman filter using modified spherical coordinates (MSC)
How would this be incorporated here? Would it involve the vision.KalmanFilter? How?
Are there any examples available of these, or of the other cues for "associating detections over time, such as size, shape, and color"? I searched the community and could not find any.
Thank you
Accepted Answer
Elad Kivelevitch
28 April 2022
Hi Peter,
Thanks for the question.
The example that you refer to uses the vision.KalmanFilter object, which is a linear Kalman filter that assumes both the motion and the measurement models are linear. Furthermore, the example uses some helper functions to associate new measurements with existing tracked objects, initialize new tracked objects, update existing ones, and delete ones that are no longer present.
There are two ways to move forward from this example to other filters and models. The first way is to still use the same helper functions, and replace the vision.KalmanFilter with one of the filters you listed. As you correctly noted, for a filter to be compatible, it must provide a few object functions (methods):
- Predict - to predict the object state from one time step to the next.
- Correct - to correct the object state with a new measurement.
- Distance - to aid in computing the association cost that is used in the association stage.
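For reference, here is a minimal sketch of how the example exercises these three calls with its existing vision.KalmanFilter (the configureKalmanFilter arguments are the ones the example uses; the detection values are made up). Any replacement filter has to support the same calls:

% Create the filter for a newly detected object, as the example does.
centroid = [100, 50];                               % [x y] of the first detection (made up)
kf = configureKalmanFilter('ConstantVelocity', ...
    centroid, [200, 50], [100, 25], 100);

predictedCentroid = predict(kf);                    % Predict: state at the next frame
newCentroids = [102, 51; 300, 300];                 % two candidate detections, one per row
cost = distance(kf, newCentroids);                  % Distance: association cost per detection
trackedCentroid = correct(kf, newCentroids(1, :));  % Correct: update with the assigned detection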
All of the filters you listed support these methods. The easiest one to convert to, and the one I recommend starting with, is trackingKF, which is very similar to vision.KalmanFilter. You will need to define a bounding box model; for an example of how to do that, please see: https://www.mathworks.com/help/driving/ug/multiple-object-tracking-tutorial.html
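For instance, here is a rough sketch of what a drop-in trackingKF could look like for the example's centroid-based constant-velocity model (the initial state, covariance, and noise values below are illustrative placeholders, not tuned values); extending the state to cover the full bounding box follows the same pattern:

% Hypothetical replacement for the example's configureKalmanFilter call, using
% trackingKF with the built-in 2-D constant-velocity model (state = [x; vx; y; vy]).
centroid = [100, 50];                          % [x y] of the first detection (made up)
kf = trackingKF('MotionModel', '2D Constant Velocity', ...
    'State', [centroid(1); 0; centroid(2); 0], ...
    'StateCovariance', diag([200, 50, 200, 50]), ...   % illustrative initial uncertainty
    'MeasurementNoise', 100 * eye(2));                  % illustrative measurement noise

% The same three calls used by the example's helpers still work:
predict(kf);
cost = distance(kf, [102, 51]);                % one measurement per row
correct(kf, [102; 51]);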
After doing that, if you want to try using an EKF or UKF, you will need to define the appropriate motion and measurement model functions. You can look at the constvel and cvmeas functions for inspiration. Then simply use the filter with these models by setting the StateTransitionFcn and MeasurementFcn accordingly.
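As an illustration, here is a sketch of a trackingEKF built on the shipped constvel and cvmeas models (6-element constant-velocity state, 3-D position measurements); the numbers are placeholders, and for a bounding-box model you would substitute your own StateTransitionFcn and MeasurementFcn:

% Sketch of a trackingEKF using the built-in constant-velocity models.
% State is [x; vx; y; vy; z; vz]; cvmeas reports the [x; y; z] position.
firstMeas = [100; 50; 0];                      % hypothetical initial position
ekf = trackingEKF(@constvel, @cvmeas, ...
    [firstMeas(1); 0; firstMeas(2); 0; firstMeas(3); 0], ...
    'MeasurementNoise', 100 * eye(3));         % illustrative noise value

predict(ekf, 0.1);                             % predict ahead by dt = 0.1 s
cost = distance(ekf, [101, 51, 0]);            % association cost, one measurement per row
correct(ekf, [101; 51; 0]);                    % update with the assigned measurement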
You can stop here, or you can decide to move to the next step.
The next step could be replacing all the tracking helper functions with a tracker. Once again, I recommend looking at the https://www.mathworks.com/help/driving/ug/multiple-object-tracking-tutorial.html example to see how to set up a tracker and how to run it. You can use any of the following trackers: trackerGNN, trackerJPDA, or trackerTOMHT, with any of the filters listed in the question. To choose a filter, simply define the FilterInitializationFcn. You may want to look at the FilterInitializationFcn used in the example linked above.
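For a rough idea of what that looks like (the thresholds and detection values below are arbitrary placeholders; initcvekf is the default constant-velocity EKF initializer, and you would swap in your own initialization function to choose a different filter):

% Sketch of replacing the example's helper functions with a GNN tracker from
% the Sensor Fusion and Tracking Toolbox.
tracker = trackerGNN('FilterInitializationFcn', @initcvekf, ... % or your own init function
    'AssignmentThreshold', 30, ...             % placeholder gating threshold
    'ConfirmationThreshold', [4 5], ...        % confirm after 4 hits in 5 updates
    'DeletionThreshold', [5 5]);               % delete after 5 misses in 5 updates

% Each frame, wrap the detected positions as objectDetection objects and step the
% tracker; it handles association, track initialization, and deletion internally.
time = 0.1;                                    % frame timestamp in seconds
detections = {objectDetection(time, [100; 50; 0]); ...
              objectDetection(time, [300; 120; 0])};   % hypothetical [x; y; z] positions
confirmedTracks = tracker(detections, time);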
Finally, to learn more about tracking and trackers, please look at the documentation for the Sensor Fusion and Tracking Toolbox.
Good luck
Elad