Fusing camera to lidar not working correctly outside lidarCameraCalibrator app

4 views (last 30 days)
Finn Strzelczyk on 24 August 2022
Latest comment: Xiao Lin on 13 December 2023
I'm trying to fuse a camera and a lidar sensor using MATLAB's Lidar Toolbox. For the calibration I used the lidarCameraCalibrator app with the camera and lidar files you can find in the Google Drive folder below (square size 250 mm; no padding; extended ROI). The calibration works just fine, and the resulting colored point cloud looks as I expected:
However, when I use the exported tform with the fuseCameraToLidar or bboxCameraToLidar function, I get incorrect results:
It seems the rotation applied to the point cloud is simply wrong. The projectLidarPointsOnImage function works just fine. The following Google Drive folder contains my calibration data, the resulting calibration matrices, and a script that demonstrates the incorrect behavior:
Thank you very much for your help!
2 Comments
Moritz Rumpf on 26 April 2023
Did you try the projectLidarPointsOnImage function?
I have similar problems, but it works with that function.
Xiao Lin on 13 December 2023
Please check the details of each function. projectLidarPointsOnImage uses a lidar-to-camera tform, while the fusion functions use a camera-to-lidar tform. Also, the fusion functions only accept a rigidtform3d object as the tform input; passing a rigid3d object causes wrong fusion results. rigid3d is no longer recommended in recent MATLAB versions.
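As a minimal sketch of this point (the variable names `tform`, `ptCloud`, `im`, and `intrinsics`, and the assumption that the exported transform maps lidar to camera coordinates, are hypothetical):

```matlab
% Convert a legacy rigid3d (postmultiply convention) to rigidtform3d
% (premultiply convention) by transposing its rotation matrix.
if isa(tform, "rigid3d")
    tformLidarToCam = rigidtform3d(tform.Rotation', tform.Translation);
else
    tformLidarToCam = tform;   % already a rigidtform3d
end

% projectLidarPointsOnImage takes the lidar-to-camera transform ...
imPts = projectLidarPointsOnImage(ptCloud, intrinsics, tformLidarToCam);

% ... while fuseCameraToLidar expects the camera-to-lidar transform,
% so invert it before fusing.
tformCamToLidar = invert(tformLidarToCam);
ptCloudColored = fuseCameraToLidar(im, ptCloud, intrinsics, tformCamToLidar);
```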


Answers (1)

Maneet Kaur Bagga on 31 August 2023
As per my understanding, the incorrect results can arise for the following reasons:
1) The transformation matrix used with the fuseCameraToLidar function. When reproducing the code provided in the Google Drive folder, printing the tform (3-D rigid transformation object) shows only Rotation and Translation properties, but a correctly constructed rigidtform3d should expose a (d+1)-by-(d+1) homogeneous matrix, where d = 3 is the dimension of the transformation. Hence the first possible error is in creating the rigidtform3d object: the dimensionality information is missing, and the object is defined incorrectly.
2) Also, after defining the rotation and translation parameters separately and recreating the object with rigidtform3d(R, translation) from the values in the cameraCalibration_Session_3.mat file, running the program reports that the rotation matrix built from those values is invalid. Here R is the rotation matrix and translation is the second argument of rigidtform3d, as defined in the provided file.
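As a hedged sketch, you can check the values loaded from the .mat file before constructing the object (the variable names R and translation follow the description above):

```matlab
% rigidtform3d validates its rotation input: R must be orthonormal
% with determinant +1, otherwise construction fails as described above.
assert(norm(R*R' - eye(3)) < 1e-6, "R is not orthonormal")
assert(abs(det(R) - 1) < 1e-6, "det(R) must be +1 (a reflection has -1)")

tformFixed = rigidtform3d(R, translation);
disp(tformFixed.A)   % 4-by-4 homogeneous matrix [R t; 0 0 0 1]
```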
3) The third possibility for encountering the error is radial distortion present in the provided image.
To remove the radial distortion from the image, you can use the following documentation:
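A minimal sketch of that step (the image file name is hypothetical; `intrinsics` is assumed to be the cameraIntrinsics object from the calibration session):

```matlab
% Remove lens distortion before projecting or fusing lidar points.
im = imread("frame_001.png");            % hypothetical file name
imUndistorted = undistortImage(im, intrinsics);
imshowpair(im, imUndistorted, "montage") % visual check of the correction
```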
Additionally, you may also refer to the following documentation for the functions used above:
fuseCameraToLidar function:
Project Lidar Points on an Image:
Creating a rigidtform3d object:
Thank You!
Maneet Bagga

Products

Release

R2022a
