Problem with bwboundaries in image processing

Here is what I am trying to do:
Objective: Find the centroid and major axis of a set of x,y coordinates using bwboundaries and image processing, and finally plot the resulting boundary + centroid + major axis so that it fits the original.
Initial data: x,y coordinates of a plane cut (image1.png)
Process:
  1. Get the outer boundary using the MATLAB function k = boundary(x,y)
  2. Use poly2mask to convert the boundary into a binary image (image2.png)
  3. Get the bwboundaries of the binary image
  4. Find centroid and major axis using regionprops (image2.png)
  5. Plot bwboundaries, major axis and centroid (image3.png)
  6. Move the bwboundaries (x,y) to the origin (0,0), move the centroid, move the major axis, and apply a scale factor to the bwboundaries to match the original x,y coordinate scale (image4.png)
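The steps above can be sketched as follows. This is a minimal, hypothetical version assuming x and y already hold the plane-cut coordinates scaled to pixel units (poly2mask only accepts pixel coordinates); the 512x512 image size and variable names are illustrative, not from the original post.

```matlab
k = boundary(x, y);                 % 1. indices of the outer boundary points
bx = x(k);
by = y(k);
BW = poly2mask(bx, by, 512, 512);   % 2. rasterize into a 512x512 binary image
B  = bwboundaries(BW);              % 3. cell array of traced boundaries, [row col]
stats = regionprops(BW, ...
    'Centroid', 'Orientation', 'MajorAxisLength');  % 4. centroid and major axis
```

Note that bwboundaries returns [row, col] pairs, which is why the code later in the question treats column 2 as x and column 1 as y.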
In image 4 the blue plot is the original and the red plot is the bwboundaries shifted and scaled to fit the original. I found the centroid of the blue original plot using the polygeom function from File Exchange. The circular markers just indicate the origin of each plot.
The way I have moved the bwboundaries points so that their origin is (0,0) is by subtracting the mean of the x and y coordinates:
% x axis origin in 0
x_mean=mean(bwboundary(:,2));
bwboundary(:,2) = bwboundary(:,2)-x_mean;
% y axis origin in 0
y_mean=mean(bwboundary(:,1));
bwboundary(:,1) = bwboundary(:,1)-y_mean;
% Move the centroid
centroid(1) = centroid(1) - x_mean;
centroid(2) = centroid(2) - y_mean;
% major axis passes through centroid
xx_mean=mean(majoraxis(:,1));
majoraxis(:,1) = majoraxis(:,1)-xx_mean + centroid(1);
yy_mean=mean(majoraxis(:,2));
majoraxis(:,2) = majoraxis(:,2)-yy_mean + centroid(2);
To get the scale factor: as you can see in images 1 (original) and 3 (bwboundaries), the plots use axis tight, so I got the x and y axis limits of the two plots with xl = xlim and yl = ylim and calculated the distance from xmin to xmax and from ymin to ymax for each plot:
x_factor = distance_x_original/distance_x_bwboundaries;
y_factor = distance_y_original/distance_y_bwboundaries;
Finally, I just multiply the factors into the bwboundaries coordinates, the centroid coordinates and the major axis coordinates.
As you can see, the result (red plot, image4.png) is slightly shifted from the original blue one, and the centroids are different where they should be the same, but I don't really know why this is happening. I have checked the x and y limits, the distances and the scale factor results and they seem correct, as well as the code used to shift and scale the bwboundaries coordinates. Maybe I am misusing some function.
Thanks for any help.

Accepted Answer

Matt J on 29 Apr 2018


Well, there's no reason to think the centroid according to regionprops will agree with the centroid according to polygeom; they derive their results from different data. Also, the s parameter in boundary(x,y,s) will affect agreement between the output of boundary and the original data. You might try s=1 for a tighter wrapping.
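A short sketch of the shrink-factor suggestion, assuming x and y are the plane-cut coordinates from the question: s = 0 gives the convex hull, s = 1 the tightest single-region boundary, and the default is s = 0.5.

```matlab
k0 = boundary(x, y);        % default shrink factor (0.5)
k1 = boundary(x, y, 1);     % tightest wrapping
plot(x, y, '.b'); hold on
plot(x(k0), y(k0), 'r-')    % default boundary
plot(x(k1), y(k1), 'g-')    % s = 1 boundary
hold off; axis equal
```

Overlaying both boundaries on the point cloud makes it easy to judge which shrink factor tracks the data better.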

27 Comments

Alfonso on 29 Apr 2018
Edited: Alfonso on 29 Apr 2018
Thanks for answering, Matt. The first image shows the original points (blue) and the output of the boundary function without specifying s.
The next image shows the same result but with s=1
As you said, s=1 gives a tighter wrapping, but in this case, in my opinion, the points are not connected between adjacent points, as the next image shows, so in the end the boundary gets messed up.
As you can see in the first image, the match between the original points and the obtained boundary is pretty good, but after getting the bwboundaries from the binary image something happens, which I am missing, that makes the bwboundaries shift slightly (last image of the initial question).
Thank you in advance.
Matt J on 30 Apr 2018
Edited: Matt J on 30 Apr 2018
"As the objective of all this is to find the major axis of the original (blue) cloud of points, is there any effective way to obtain it, compared to image processing, which I could give a try?"
Instead of shifting the boundary points around, couldn't you just take the major axis, as you are computing it now, and force it (i.e. shift it) to run through the centroid of the original blue cloud?
Why does the centroid of the red cloud, or the centroid calculation of the BW image by regionprops, have to play a role at all?
Alfonso on 30 Apr 2018
I thought about the idea you suggest, but as the output boundary of the binary image is not 100% equal to the original image, how can I ensure that the blue plot will have exactly the same major axis (orientation) as the red plot, but passing through its centroid? From a strict point of view, as the image comparing the two plots shows, they look like different plots to me.
Maybe it gives the correct result, but I am not quite sure the major axis will always coincide after shifting to the original centroid if I change the cloud of points.
Thanks in advance.
Matt J on 30 Apr 2018
Edited: Matt J on 30 Apr 2018
"how can I ensure that the blue plot will have exactly the same major axis (orientation) as..."
You can be certain that it will not be exactly the same. The question is how much error you can tolerate. You have to tolerate some, because there was never any possibility of extracting the same information that was in the original cross-section image from finite samples of the boundary alone. There is always loss in any finite sampling process.
A further question: why don't you use the original plane cut, as opposed to the boundary samples? Do you no longer have access to it? The original plane cut is what makes sense if you can't tolerate any error, since all the required information was there.
Alfonso on 30 Apr 2018
Edited: Alfonso on 30 Apr 2018
I have access to the original plane cut, but the most straightforward way I have seen of acquiring its major axis is through image processing; that's why I asked whether anyone knew an effective way of obtaining the major axis using only the x and y coordinates of the original plane cut. I thought about calculating Euclidean distances from the centroid to each point of the plane cut and keeping the maximum distance between two opposite points passing through the centroid (not quite sure whether that would actually work), but I wanted to know whether there were any other alternatives.
Thanks in advance
Matt J on 1 May 2018
I don't understand. If you have the original plane cut, why are you trying to recover it using poly2mask?
Alfonso
Through the binary image given by poly2mask I can easily obtain the centroid and the major axis, but through the original data it is not so clear to me. I thought about calculating Euclidean distances between points passing through the centroid and taking the maximum one. But again, obtaining the major axis seems simpler through image processing.
Matt J on 1 May 2018
Edited: Matt J on 1 May 2018
But even so, why not derive a binary mask more directly, e.g., through thresholding? Why try to reconstruct the original shape from boundary samples?
Alfonso on 1 May 2018
Edited: Alfonso on 1 May 2018
I have now used this code to get the binary image:
plot(original_blue(:,1), original_blue(:,2), '.b')  % plot the point cloud
img = getframe(gca);     % capture the axes as an image
Iv = img.cdata;          % RGB frame
I = rgb2gray(Iv);        % grayscale
I_bin = imbinarize(I);   % binary image
imshow(I_bin)
At this point, if I have understood you correctly, I should create a mask that encloses the black points defining the plane cut, but I'm not quite sure how I could determine this without using boundaries.
Matt J on 1 May 2018
You haven't shown I_bin. Why not show that, as well as the original image, Iv? In fact, why not attach Iv as a .mat file so we can play with it as well?
Alfonso on 1 May 2018
Edited: Alfonso on 1 May 2018
I have attached Iv.mat as a .zip (I had to compress it).
imshow(Iv)  % original RGB frame
imshow(I)   % grayscale image (I_bin is the binarized version)
Matt J on 1 May 2018
Edited: Matt J on 1 May 2018
Hmmm. Sorry, I meant: where is the complete image of the original object, including the detail of its interior (not just the image of the boundary point cloud)?
Incidentally, you seem to be saving your entire workspace to your .mat files, which is making them unnecessarily big. To save an individual variable to a .mat file, do the following:
save myFile.mat Iv
Alfonso
Yes, totally true. I have attached the image.
Matt J on 1 May 2018
It is still just an image of a point cloud. Where/what is the complete original object that the point cloud came from?
Alfonso
The original object is a 3D object composed of vertices and triangular faces. I have attached it. The result of cutting a certain plane through it is the blue cloud of points.
Matt J on 1 May 2018
Edited: Matt J on 1 May 2018
In that case, the plane cut would have to be a 2D polygon. Are the points in the blue cloud the vertices of that polygon? If so, you could use inpolygon() or isinterior (in conjunction with polyshape) to get the binary map of the region. It should give you a much more exact result than boundary()/poly2mask(). Even if not, it would probably still offer a pretty good approximation.
Alfonso
The intersection points between the plane and the 3D object are obtained, so the blue points are these intersection points. I will try your suggestions and report back tomorrow. Thank you.
Alfonso on 2 May 2018
Edited: Alfonso on 2 May 2018
I have tried using polyshape with the original blue coordinates, but I get the following result (I have attached polyshape.mat):
When I plotted the original coordinates as:
plot(original_blue(:,1), original_blue(:,2), 'b')
I got the following...
This is the reason I initially used the boundary function, as it outputs just the outer boundary, which is what I was looking for.
Matt J on 2 May 2018
Did you give the vertices to polyshape in consecutive order?
Alfonso on 2 May 2018
Edited: Alfonso on 2 May 2018
No, I didn't; now I get a better result, as you can see.
I then used the isinterior function, where the TFon values are all 1s, which means that all the points are inside the polygon (logical, as the polygon was created from them). Now I guess I should proceed to create the binary image and find the centroid and major axis, but at the point where I need to get the information from the image I will have to use bwboundaries, I guess.
Matt J on 2 May 2018
Edited: Matt J on 2 May 2018
"Now I guess I should proceed to create the binary image, find centroid, major axis but at the point where I need to get all the information from the image I will have to use bwboundaries I guess"
I don't see why you would use bwboundaries at all. I also don't see why you would use regionprops to get the centroid, since the polyshape gives you a more direct and exact calculation of it:
x=blue_cloud(:,1).';
y=blue_cloud(:,2).';
xc=x-mean(x);
yc=y-mean(y);
[~,idx]=sort(atan2(yc,xc));   % order the vertices by polar angle
x=x(idx)-min(x)+1;            % shift into pixel coordinates
y=y(idx)-min(y)+1;
pgon=polyshape(x,y);
xmax=max(x);
ymax=max(y);
[I,J]=ndgrid(1:xmax,1:ymax);
BW=rot90( reshape(pgon.isinterior(I(:),J(:)), size(I)) );  % rasterized region
theta=getfield( regionprops(BW,'Orientation'), 'Orientation');
[xCentroid, yCentroid]=pgon.centroid
MajorAxis=[cosd(theta), sind(theta)]
subplot(1,2,1)
plot(pgon); axis equal
hold on
plot(xCentroid+[-60,0,60]*MajorAxis(1) , ...
yCentroid+[-60,0,60]*MajorAxis(2),'x--');
hold off
xlim([0,xmax+1])
ylim([0,ymax+1])
title 'polyshape'
subplot(1,2,2)
imagesc(BW); axis image
colormap 'gray'
title('Binary Map')
Alfonso on 2 May 2018
Edited: Alfonso on 2 May 2018
This seems to work almost perfectly, Matt. I changed your MajorAxis line of code to be able to draw it as a line.
I also compared the polyshape to the original blue points by centering it at (0,0):
mx=mean(pgon.Vertices(:,1))
pgon.Vertices(:,1) = pgon.Vertices(:,1)-mx
my=mean(pgon.Vertices(:,2))
pgon.Vertices(:,2) = pgon.Vertices(:,2)-my
xCentroid = xCentroid - mx;
yCentroid = yCentroid - my;
but again there is a small shift between them that should not be there; do you know why this might be happening?
[Figure 8] I have compared the original blue points, the bwboundaries result (red) and the polyshape result. In my opinion the polyshape offers better agreement with the original blue points, although the centroid obtained via bwboundaries (+red) is closer to the original blue centroid (+b) than the polyshape one (+pink). What do you think?
Matt J on 2 May 2018
Edited: Matt J on 2 May 2018
I think I know the reason. Notice that the original blue_cloud data contains 200 points, but pgon keeps only 122 of them: polyshape() is throwing away some of your points. It should be giving you the following warning about this:
Warning: Polyshape has duplicate vertices, intersections, or other inconsistencies that may produce inaccurate or unexpected results. Input data has been modified to create a well-defined polyshape.
It is easy to verify that some of your "vertices" are indeed non-unique:
>> size(unique(blue_cloud,'rows')), size(blue_cloud)
ans =
   169     2
ans =
   200     2
So, you really ought to do some sort of pre-pruning of the blue_cloud so that it makes more sense.
In any case, the point is that mx and my are affected when polyshape discards points, leading to this slight inconsistency.
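A possible pre-pruning sketch, not from the thread itself: drop exact duplicate points before building the polyshape, and compute the centering means from pgon.Vertices rather than the raw cloud, so the means are taken over the same vertex set polyshape actually keeps (polyshape may still prune self-intersections).

```matlab
blue_unique = unique(blue_cloud, 'rows', 'stable');   % drop exact duplicates, keep order
pgon = polyshape(blue_unique(:,1), blue_unique(:,2));
mx = mean(pgon.Vertices(:,1));    % center using the vertices polyshape kept
my = mean(pgon.Vertices(:,2));
pgon = translate(pgon, [-mx, -my]);   % shift the polygon so its vertex mean is (0,0)
```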
Alfonso on 2 May 2018
Edited: Alfonso on 2 May 2018
Yes, that seems very reasonable; I will try to find a way of pre-pruning the blue cloud.
Overall, looking at the last image I posted, I suppose we can say it is better to use the major axis and centroid of the polygon obtained with polyshape (it is much closer to the original) than the information obtained with bwboundaries. My question is: is the information obtained from the polygon more consistent with the original plane cut (blue) than the one obtained with bwboundaries?
Matt J on 2 May 2018
"is the information obtained from the polygon more consistent with the original plane cut (blue) than the one obtained with bwboundaries?"
Yes, because polyshape doesn't involve any pixelation of the object. Once you pixelate the object, you have introduced approximations, and any calculation deriving from the pixelation (e.g., bwboundaries) inherits some error from that.
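A small illustration of this point, with made-up parameters (a circle of radius 10.3 centered at (25.5, 25.5)): the polyshape centroid is exact by symmetry, while the centroid of the rasterized mask carries discretization error.

```matlab
t = (0:199).' * 2*pi/200;                     % 200 distinct angles around the circle
pgon = polyshape(25.5 + 10.3*cos(t), 25.5 + 10.3*sin(t));
[xc, yc] = pgon.centroid;                     % exact: (25.5, 25.5) by symmetry
BW = poly2mask(pgon.Vertices(:,1), pgon.Vertices(:,2), 50, 50);
c = getfield(regionprops(BW, 'Centroid'), 'Centroid');  % close to, but generally
                                                        % not exactly, (25.5, 25.5)
```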
Alfonso on 2 May 2018
Edited: Alfonso on 2 May 2018
Okay, this final result seems good enough in terms of error. Thank you for all your help, information and time, Matt.
Matt J on 2 May 2018
You're quite welcome.


More Answers (2)

Image Analyst on 30 Apr 2018


Also, regionprops computes the centroid based on the whole blob; taking the mean of boundary points does not. For example, suppose you had a square defined by only two points on its left side but a thousand on its right side. Taking the mean of those points would not put the x centroid in the middle of the square; it would not even be close.
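A numeric sketch of this point: a unit square sampled with 2 boundary points on its left edge and 1000 on its right edge (values chosen to match the example above).

```matlab
left  = [zeros(2,1),   linspace(0, 1, 2).'];    % x = 0 side, 2 samples
right = [ones(1000,1), linspace(0, 1, 1000).']; % x = 1 side, 1000 samples
pts = [left; right];
mean(pts(:,1))   % 1000/1002, about 0.998: far from the true x centroid of 0.5
```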

1 Comment

Alfonso on 30 Apr 2018
Edited: Alfonso on 30 Apr 2018
I have been trying Steve's blog series on Feret diameters, where he uses square pixels and circular pixels at the corners.
The next image is the binary image with circular pixels at the corners. The gray background is the grid (which is only visible if you zoom in).
The next image compares the original with the boundary from the binary image using circular pixels.
I have used the output x,y coordinates of the boundary, but as it does not cover the whole image, the output boundary has blank spaces, and I am not sure how to include the remaining points as well.
Finally, the comparison between the original and the output is, logically, wrong, which makes me think that if I don't get exactly the original boundary the match will not be 100% (which coincides with what I understood from your comment: the points in both clouds should be in practically the same places). Since the MATLAB function k = boundary(x,y) does not give a faithful boundary of the original, it will not be possible to obtain a match at the end of the process, given that fixing s=1, as I answered Matt J previously, gives me a wrong output boundary.
As the objective of all this is to find the major axis of the original (blue) cloud of points, is there any effective way to obtain it without using image processing (my initial data, the blue plot, is not an image) which I could give a try?
Thank you in advance.


Matt J on 30 Apr 2018
Edited: Matt J on 30 Apr 2018


This is not exactly what you asked for, but it might make sense to fit an ellipse directly to the blue point cloud (e.g., using this) and use the major axis of the ellipse fit to define the major axis of the given shape.
This would mean changing your problem requirements, but it might be a better axis definition for the data that you have. And at least you wouldn't have to worry about further losses and inaccuracies creeping in as you try to map the blue cloud back to the full image it came from.

5 Comments

Alfonso on 30 Apr 2018
Thank you, Matt J; I will try it and report my results tomorrow.
I get the following error when using the function: Error using ellipsefit (line 55): Fit produced a non-elliptic conic section
Matt J on 1 May 2018
Please attach the point cloud data in a .mat file so that I can examine it.
Alfonso
Here is the original plane cut data.
Matt J on 2 May 2018
The data doesn't look like the shape you posted... In any case, make sure the coordinate samples are arranged column-wise, not row-wise.
I had no problem getting output when the samples were put in the column-wise order described in the help text.


Asked: 27 Apr 2018
Last comment: 2 May 2018
