- f in that equation refers to fx, in pixels.
- That equation is incomplete: it is missing the removal of the skew factor. The effect of skew must be removed when normalizing the x coordinate. The complete equation is x = (u - u0 - skew*y) / fx, where y = (v - v0) / fy.
- Yes, the final equation looks correct.
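The normalization described above (skew removed from x using the already-normalized y) can be sketched in NumPy; all intrinsic values here are hypothetical, chosen only to illustrate the formula:

```python
# Illustrative intrinsics (hypothetical values, not from this thread).
fx, fy = 800.0, 820.0   # focal lengths in pixels
u0, v0 = 320.0, 240.0   # principal point in pixels
skew = 0.5              # skew factor

def normalize(u, v):
    """Pixel coordinates -> normalized image coordinates.

    y is normalized first, then its contribution through the skew
    term is removed from x, matching x = (u - u0 - skew*y) / fx.
    """
    y = (v - v0) / fy
    x = (u - u0 - skew * y) / fx
    return x, y

x, y = normalize(1120.0, 1060.0)
```

With skew = 0 this reduces to the usual x = (u - u0) / fx from the documentation.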
How to get normalized pixel values
Francesco Mazzoni
21 May 2023
Answered: Giridharan Kumaravelu
7 Jun 2023
I'm trying to manually compute the reprojected points from worldPoints to imagePoints coordinates. Since I used certain settings in estimateCameraParameters(), I want to know how to correctly use the distortion coefficients found after calibration. In the MATLAB documentation, distorted pixels are computed from normalized pixels obtained with:
![](https://www.mathworks.com/matlabcentral/answers/uploaded_files/1389724/image.png)
This, for example, is the x value. My questions:
1) Should f be the focal length in x for the x value and the focal length in y for the y value, or is it the usual focal length in mm obtained from the two relations below?
![](https://www.mathworks.com/matlabcentral/answers/uploaded_files/1389729/image.png)
![](https://www.mathworks.com/matlabcentral/answers/uploaded_files/1389734/image.png)
2) Is the equation correct?
3) For the final x value I should get in the image, is this the equation to use?
![](https://www.mathworks.com/matlabcentral/answers/uploaded_files/1389739/image.png)
where the quantity being distorted is the first component of the projected vector divided by the third one (the scale factor):
![](https://www.mathworks.com/matlabcentral/answers/uploaded_files/1389744/image.png)
Here is the portion of code I'm using. The results are not the same as the ReprojectedPoints I get from calibration:
Q = [worldPoints(7,1); worldPoints(7,2); 0; 1];   % homogeneous world point
Kaugm = [params.K, zeros(3,1)];                   % 3x4 intrinsic matrix
q = Kaugm * params.PatternExtrinsics(5).A * Q
u = q(1)/q(3);   % divide by the scale factor
v = q(2)/q(3);
x = (u - params.PrincipalPoint(1)) / params.Intrinsics.FocalLength(1);   % normalized coordinates
y = (v - params.PrincipalPoint(2)) / params.Intrinsics.FocalLength(2);
r2 = x^2 + y^2;
u_dist_rad = x * (1 + params.RadialDistortion(1)*r2 + ...
    params.RadialDistortion(2)*r2^2 + ...
    params.RadialDistortion(3)*r2^3);
v_dist_rad = y * (1 + params.RadialDistortion(1)*r2 + ...
    params.RadialDistortion(2)*r2^2 + ...
    params.RadialDistortion(3)*r2^3);
u_dist_tan = x + (2 * params.TangentialDistortion(1) * x * y + ...
    params.TangentialDistortion(2) * (r2 + 2*x^2));
v_dist_tan = y + (params.TangentialDistortion(1) * (r2 + 2*y^2) + ...
    2 * params.TangentialDistortion(2) * x * y);
u_final = u + u_dist_rad + u_dist_tan
v_final = v + v_dist_rad + v_dist_tan
Accepted Answer
Giridharan Kumaravelu
2023년 6월 7일
Here are the answers to your questions:
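In the standard distortion model (Brown–Conrady, as used in the MATLAB documentation), radial and tangential distortion are applied to the normalized coordinates, and the distorted normalized values are then mapped back to pixels through the intrinsics — they replace x and y rather than being added to u and v in pixel space. A NumPy sketch of that pipeline; all intrinsic and distortion values are hypothetical, for illustration only:

```python
# Hypothetical calibration values, for illustration only.
fx, fy, u0, v0, skew = 800.0, 820.0, 320.0, 240.0, 0.0
k1, k2, k3 = -0.2, 0.05, 0.0    # radial distortion coefficients
p1, p2 = 1e-3, -5e-4            # tangential distortion coefficients

def distort_and_project(x, y):
    """Apply distortion to normalized (x, y), then map to pixels."""
    r2 = x*x + y*y
    radial = 1 + k1*r2 + k2*r2**2 + k3*r2**3
    # Distorted normalized coordinates (radial + tangential terms).
    x_d = x*radial + 2*p1*x*y + p2*(r2 + 2*x*x)
    y_d = y*radial + p1*(r2 + 2*y*y) + 2*p2*x*y
    # Map back through the intrinsics -- not u + radial + tangential.
    u = fx*x_d + skew*y_d + u0
    v = fy*y_d + v0
    return u, v

u, v = distort_and_project(0.1, -0.05)
```

Note that the tangential terms contribute additively to the distorted normalized coordinates; they do not contain an extra copy of x or y, which would otherwise be double-counted when summing the radial and tangential parts.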