Ambiguous Disparity Map and Inadequate 3D Scene Reconstruction

Hi,
I'm trying to measure the distance from a camera to an object using stereo images. I used 40 image pairs for calibration with the MATLAB Stereo Camera Calibrator app; the calibration result had an overall mean reprojection error of 1.49 pixels. I applied all of the techniques shown in MATLAB's Depth Estimation from Stereo Video example.
The actual distance from the camera to the object is 2.97 meters, but my program calculates 4.32 meters. I suspect something is wrong with my disparity map and 3-D scene reconstruction, because the disparity map is very ambiguous and the point cloud is inadequate. I would appreciate any suggestions on this topic. I also tried applying techniques such as median and Wiener filtering to remove noise from the images, following the tips in this thread: http://uk.mathworks.com/matlabcentral/answers/153348-tips-and-tricks-about-3d-scene-reconstruction
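For reference, the median/Wiener pre-filtering mentioned above would slot in between rectification and the disparity computation, roughly as in the sketch below. This is only an illustrative sketch of that step, not the code posted further down: the 5-by-5 kernel sizes are arbitrary examples, and it assumes the rectified images imageLeftRect / imageRightRect produced by the code that follows.
% Sketch only: median + Wiener pre-filtering before computing disparity.
% Assumes imageLeftRect / imageRightRect from rectifyStereoImages below;
% the 5-by-5 kernel sizes are illustrative, not tuned values.
leftGraySmooth  = wiener2(medfilt2(rgb2gray(imageLeftRect),  [5 5]), [5 5]);
rightGraySmooth = wiener2(medfilt2(rgb2gray(imageRightRect), [5 5]), [5 5]);
disparityMapFiltered = disparity(leftGraySmooth, rightGraySmooth, ...
    'DisparityRange', [0 80], 'BlockSize', 15);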
These are my outputs:
This is my code:
Read and Rectify Images
imageLeft = imread('D:\stereo\imgPairLeft1.png');
imageRight = imread('D:\stereo\imgPairRight1.png');
imageLeft = undistortImage(imageLeft, stereoParams.CameraParameters1, ...
    'OutputView', 'same');
imageRight = undistortImage(imageRight, stereoParams.CameraParameters2, ...
    'OutputView', 'same');
[imageLeftRect, imageRightRect] = ...
    rectifyStereoImages(imageLeft, imageRight, stereoParams);
subplot(2,2,1);
imshow(stereoAnaglyph(imageLeftRect, imageRightRect));
title('Rectified Images');
imageLeftGray = rgb2gray(imageLeftRect);
imageRightGray = rgb2gray(imageRightRect);
imageLeftGrayHisteq = histeq(imageLeftGray);
imageRightGrayHisteq = histeq(imageRightGray);
Compute Disparity
disparityRange = [0 80];
disparityMap = disparity(imageLeftGrayHisteq, imageRightGrayHisteq, ...
    'DisparityRange', disparityRange, 'BlockSize', 15);
subplot(2,2,2);
imshow(disparityMap, disparityRange);
title('Disparity Map');
colormap jet;
colorbar;
Reconstruct the 3-D Scene
points3D = reconstructScene(disparityMap, stereoParams);
% Convert to meters and create a pointCloud object
points3D = points3D ./ 1000;
ptCloud = pointCloud(points3D, 'Color', imageLeftRect);
subplot(2,2,3);
pcshow(ptCloud);
title('Point Cloud');
Thresholding
binaryImage = imageLeftGrayHisteq > 0 & imageLeftGrayHisteq < 60;
binaryImage = imfill(binaryImage, 'holes');
% Assign each blob different color
labeledImage = bwlabel(binaryImage, 8);
coloredLabels = label2rgb(labeledImage, 'hsv', 'k', 'shuffle');
Blob Analysis
blobMeasurements = regionprops(labeledImage, imageLeftGrayHisteq, 'all');
numberOfBlobs = size(blobMeasurements, 1);
% Sort the rows by Area.
[~,index] = sortrows([blobMeasurements.Area].');
blobMeasurements = blobMeasurements(index);
newIndex = sortrows(index);
% After sorting, last index is the largest blob
tvUnitBoundingBox = blobMeasurements(newIndex(end)).BoundingBox;
Determine the distance of the TV unit to the camera. Find the centroid of the TV unit.
centroid = [round(tvUnitBoundingBox(:,1) + tvUnitBoundingBox(:,3) / 2) ...
    round(tvUnitBoundingBox(:,2) + tvUnitBoundingBox(:,4) / 2)];
% Find 3-D world coordinates of the centroids.
centroidsIdx = sub2ind(size(disparityMap), centroid(:,2), centroid(:,1));
X = points3D(:, :, 1);
Y = points3D(:, :, 2);
Z = points3D(:, :, 3);
centroids3D = [X(centroidsIdx)'; Y(centroidsIdx)'; Z(centroidsIdx)'];
% Find the distances from the camera in meters
distanceFromTvUnitToCam = sqrt(sum(centroids3D .^ 2));
% Display the tv unit and its distance.
label = sprintf('%0.2f meters', distanceFromTvUnitToCam);
subplot(2,2,4);
imshow(insertObjectAnnotation(imageLeftRect, 'rectangle', tvUnitBoundingBox, label));
title('Detected Object');
1 Comment
Pamela Erin Rosario on 22 February 2020
Hi! May I ask if the BoundingBox is only for TV units, or can I use it for other objects to be detected as well?


Accepted Answer

Dima Lisin on 3 May 2016
Edited: Dima Lisin on 3 May 2016
Hi Yildiray,
There are several problems with your calibration:
  1. The checkerboard is not really flat. You have to glue it to a hard flat surface, like a flat sheet of plastic or glass, which does not bend easily.
  2. There seems to be a long delay between taking the two images of each stereo pair. If you look at the calibration images, you can see that your face had time to change expression between the two shots. That means the checkerboard also moves noticeably between the two images, because a person cannot hold anything perfectly still for any length of time. To fix this, you either have to get a "real" stereo camera with hardware triggering, which takes the two images nearly simultaneously, or you need to lean your checkerboard on something so that it does not move between the shots.
  3. You need to have a greater variation of 3D orientations of the checkerboard. In most of your calibration images you are rotating the checkerboard in-plane, but it is always more or less parallel to the image plane. In other words, most of your checkerboards are almost parallel to each other, which does not give you much information for the calibration. You need to slant the checkerboard forward and backward, and left and right.
  4. Finally, when you calibrate you have to look at the reprojection errors graph and exclude the images with really high reprojection errors. Ideally, you want your average reprojection error over all the images to be less than 0.5 pixels. (A rough programmatic way to check this is sketched just after this list.)
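As a rough sketch only, the inspect-and-recalibrate step could look something like this outside the Stereo Camera Calibrator app. It assumes imagePoints and worldPoints from the usual detectCheckerboardPoints / generateCheckerboardPoints workflow and the stereoParams object returned by estimateCameraParameters; the 1-pixel cutoff is just an example.
% Sketch: inspect per-pair reprojection errors and re-estimate the
% calibration without the worst image pairs. Assumes imagePoints,
% worldPoints and stereoParams from the standard calibration workflow.
showExtrinsics(stereoParams);          % check the variety of board poses
figure;
showReprojectionErrors(stereoParams);  % per-pair error bar graph

% Mean reprojection error per image pair, averaged over both cameras
err1 = stereoParams.CameraParameters1.ReprojectionErrors; % M-by-2-by-numPairs
err2 = stereoParams.CameraParameters2.ReprojectionErrors;
perPairError = (squeeze(mean(sqrt(sum(err1.^2, 2)), 1)) + ...
                squeeze(mean(sqrt(sum(err2.^2, 2)), 1))) / 2;

% Keep only the pairs below the cutoff (1 pixel here) and re-estimate
keepIdx = perPairError < 1;
stereoParamsRefined = estimateCameraParameters( ...
    imagePoints(:, :, keepIdx, :), worldPoints);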
1 Comment
YILDIRAY YILMAZ on 6 May 2016
Hi Dima,
It works. Thanks to your suggestions, I calculated the distance with almost 100% accuracy. Thank you very much again.
Best wishes,
Yildiray


More Answers (1)

Vaddeti Nikitha on 3 February 2020
Can I rectify car wheel images from a stereo camera using only the cameraParams estimated from the checkerboard images?
