Match the coordinate systems of "triangulate" and "reconstructScene" with "disparitySGM"
Hello,
I have an image pair of a grid-ruled sheet of paper and the corresponding stereoParams, obtained from checkerboard calibration with the Stereo Camera Calibrator using default settings. I apply the following two processing pipelines to the image pair:
Processing A: rectifyStereoImages (with stereoParams), disparitySGM, reconstructScene (with stereoParams)
Processing B: undistortImage applied to each image (with stereoParams.CameraParameters1 [or 2]), detection of line intersections in both images, then the 3D locations of the intersections via triangulate (with stereoParams)
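For reference, the two pipelines could be sketched roughly like this (a minimal sketch with my own variable names, assuming an image pair I1, I2 and matched intersection points pts1, pts2; reconstructScene is called with stereoParams as in pre-R2022a syntax):

```matlab
% Processing A: dense point cloud from rectified images
[J1, J2] = rectifyStereoImages(I1, I2, stereoParams);
disparityMap = disparitySGM(im2gray(J1), im2gray(J2));
xyzA = reconstructScene(disparityMap, stereoParams);   % dense 3D points

% Processing B: sparse 3D points from detected features
U1 = undistortImage(I1, stereoParams.CameraParameters1);
U2 = undistortImage(I2, stereoParams.CameraParameters2);
% ... detect corresponding line intersections pts1, pts2 in U1, U2 ...
xyzB = triangulate(pts1, pts2, stereoParams);          % sparse 3D points
```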
I wish to display the point cloud from processing A and the 3D feature locations from processing B in the same 3D plot, such that they coincide. But they don't: there is an apparent rotation about the origin of coordinates (the optical center of camera 1) between the two. I show here the output of showExtrinsics(stereoParams) together with the point cloud (jet colormap in y) and the triangulated points (green). The triangulated points are where I expected them to be, while the point cloud lies to the right of the area I calibrated, off the plane of symmetry between the cameras. Both cameras' optical axes pointed at the center of the sheet of paper in the experiment.
When I apply the rotation matrix inv(stereoParams.RotationOfCamera2)^0.5 to the point cloud, the point cloud almost coincides with the feature locations, but not exactly. Here I show a closeup (colormap in depth, triangulated points in orange):
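The "half rotation" I applied looks like this (a sketch of my workaround, not a proper fix; MATLAB's mpower computes the fractional matrix power):

```matlab
R = stereoParams.RotationOfCamera2;   % relative rotation of camera 2
Rhalf = inv(R)^0.5;                   % "half" of the inverse rotation
pts = reshape(xyzA, [], 3);           % M-by-N-by-3 point cloud as rows
ptsRotated = pts * Rhalf;             % rotate every point about the origin
```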
Note that I have a similar result when I compute the Euler angles of the original rotation matrix, multiply them by -0.5, and turn them into a rotation matrix again.
Now I would like to understand exactly what the output coordinate systems of reconstructScene and triangulate are, including their orientation. And perhaps somebody can disprove my assumption that both methods should yield matching results.
Accepted Answer
Qu Cao
17 August 2022
The point cloud generated by reconstructScene is in the rectified camera 1 coordinate system.
Starting in R2022a, you can use the additional output R1 of the rectifyStereoImages function to convert the reconstructed point cloud from the rectified camera 1 coordinate system to the original, unrectified camera 1 coordinate system used by the triangulate function.
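A sketch of this conversion, assuming the R2022a rectifyStereoImages signature that also returns the reprojection matrix and the rectification rotations R1 and R2 (the direction of the rotation depends on the point-layout convention, so verify against your data):

```matlab
[J1, J2, reprojectionMatrix, camMatrix1, camMatrix2, R1, R2] = ...
    rectifyStereoImages(I1, I2, stereoParams);
disparityMap = disparitySGM(im2gray(J1), im2gray(J2));
xyzRect = reconstructScene(disparityMap, reprojectionMatrix);  % rectified frame

% Undo the rectification rotation to return to the unrectified camera-1
% frame used by triangulate. With points as N-by-3 rows:
pts = reshape(xyzRect, [], 3);
xyzUnrect = pts * R1;   % or pts * R1.' depending on the rotation convention
```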