Most modern autonomous or semi-autonomous vehicles are equipped with sensor suites that combine multiple sensors, such as lidar sensors and cameras. Calibrating and fusing data from these sensors requires the rotational and translational transformations between them. Fusing lidar data with corresponding camera data is particularly useful in the perception pipeline. The lidar and camera calibration (LCC) workflow serves this purpose, using the checkerboard pattern calibration method.
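For reference, the quantity the LCC workflow estimates is this rigid transformation, a rotation R and a translation t that map a 3-D point from the lidar frame into the camera frame; projecting through the camera intrinsic matrix K then gives the corresponding pixel. The notation below is generic pinhole-camera notation, not taken from the toolbox documentation:

\mathbf{p}_{\mathrm{cam}} = R\,\mathbf{p}_{\mathrm{lidar}} + \mathbf{t}

s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K \left( R\,\mathbf{p}_{\mathrm{lidar}} + \mathbf{t} \right)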
Lidar Toolbox™ algorithms extract checkerboard features from images and point clouds, and use them to estimate the transformation between the camera and the lidar sensor. The toolbox also provides downstream LCC functionality, such as projecting lidar points onto images, fusing color information into lidar point clouds, and transferring bounding boxes from camera data to lidar data.
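The following is a minimal sketch of that workflow using Lidar Toolbox function names (estimateCheckerboardCorners3d, detectRectangularPlanePoints, estimateLidarCameraTransform, projectLidarPointsOnImage, fuseCameraToLidar, bboxCameraToLidar). The file paths, square size, and intrinsics file are placeholders, exact argument lists vary by release, and in practice you would also keep only the frames in which the checkerboard is detected in both modalities:

% Placeholder calibration data: images and point clouds of a checkerboard
% captured simultaneously by both sensors (replace paths and values with your own)
imagePath        = fullfile("calib","images");
ptCloudPath      = fullfile("calib","pointClouds");
imageFiles       = dir(fullfile(imagePath,"*.png"));
imageFileNames   = fullfile(imagePath,{imageFiles.name});
ptCloudFiles     = dir(fullfile(ptCloudPath,"*.pcd"));
ptCloudFileNames = fullfile(ptCloudPath,{ptCloudFiles.name});
squareSize = 81;                              % checkerboard square size in millimeters (example value)
load("cameraIntrinsics.mat","intrinsics");    % cameraIntrinsics object from a prior camera calibration

% Extract 3-D checkerboard corners from the images
[imageCorners3d,boardDimension,imagesUsed] = estimateCheckerboardCorners3d( ...
    imageFileNames,intrinsics,squareSize);

% Detect the checkerboard plane in the corresponding point clouds
[lidarCheckerboardPlanes,framesUsed] = detectRectangularPlanePoints( ...
    ptCloudFileNames,boardDimension);

% Estimate the rigid transformation from the lidar frame to the camera frame
% (some releases also expect the camera intrinsics here; check the reference page)
[tform,errors] = estimateLidarCameraTransform(lidarCheckerboardPlanes,imageCorners3d);

% Downstream uses of the estimated transform on one frame
ptCloud      = pcread(ptCloudFileNames{1});
img          = imread(imageFileNames{1});
imagePoints  = projectLidarPointsOnImage(ptCloud,intrinsics,tform);        % project lidar points onto the image
coloredCloud = fuseCameraToLidar(img,ptCloud,intrinsics,tform);            % fuse camera color into the point cloud
% bboxesLidar = bboxCameraToLidar(bboxesCamera,ptCloud,intrinsics,tform);  % transfer 2-D boxes to 3-D cuboids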
What Is Lidar Camera Calibration?
Integrate lidar and camera data.