# Why can't I get the matchScans function to work properly?

3 views (last 30 days)
Amrik Sandhu on 21 Dec 2018
Answered: soorajanilkumar on 13 Sep 2019
I've been trying to use the matchScans function along with transformScans to estimate the pose between two lidar scans, but the estimated pose it calculates is far off from what it should be and I'm not sure why.
For context, I am extracting data from a free space sensor (basically lidar) in CarMaker in the form of x and y Cartesian coordinates, but I've also tried this with range and angle values with no luck. When I try to find the pose between two scans, it will estimate a transform of, for example, [0.0001 -0.0527 -0.0003], which I then apply to the current scan (using transformScans) to obtain a transformed scan that should in theory align almost exactly with the reference scan. But when I compare the Cartesian coordinates of the current and reference scans against those of the transformed scan, I find that the translation/rotation being applied is far too small. In this example, it is applying a translation of only about 0.05 m, when I know the position between scans is actually about 2.5 m apart, and this isn't being reflected in the estimated pose.
Does anyone know why this might be happening and why it isn't able to estimate an accurate pose between scans? For reference, my data is made up of about 1000 Cartesian points, so I don't think the problem is too few points for it to estimate effectively. Any help is appreciated. Thanks.
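For reference, the workflow described above can be sketched as follows. This is a minimal illustration, assuming `refXY` and `currXY` are N-by-2 Cartesian point arrays taken from the sensor (these variable names are placeholders, not from the original post):

```matlab
% Build lidarScan objects from N-by-2 Cartesian point arrays.
refScan  = lidarScan(refXY);
currScan = lidarScan(currXY);

% Estimate the relative pose [x y theta] of currScan w.r.t. refScan.
relPose = matchScans(currScan, refScan);

% Apply the estimated pose and compare against the reference scan visually.
tformedScan = transformScans(currScan, relPose);
plot(refScan); hold on
plot(tformedScan)
legend('Reference scan', 'Transformed current scan')
```

If the overlaid scans are clearly misaligned, the estimate has likely converged to a poor local solution rather than the true transform.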


### Answers (2)

Hello Amrik,
We cannot be sure about the reason for the inaccuracy in the scan matching. However, using a different algorithm for the scan matching may help to alleviate the issue. Try using matchScansGrid instead of matchScans, and let us know if that works.


soorajanilkumar on 13 Sep 2019
Hi Amrik,
matchScans() uses NDT scan matching, which is prone to getting stuck in local maxima. This means that unless you give it a good initial pose estimate, you won't get a good result from matchScans(); it is quite possible to get a completely wrong relative pose output.
As MathWorks have suggested, try using matchScansGrid(), which uses a multi-resolution correlative scan matching algorithm that is robust and has a wider search space. matchScansGrid() is much slower than matchScans(), but you can speed it up to a certain extent by providing an initial pose estimate from odometry, if you have it.
You can then use the result of matchScansGrid() as the initial pose estimate for matchScans() to get a better result.
This is the method followed in the lidarSLAM function.
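The two-stage approach described above might look like the following sketch, assuming `refScan` and `currScan` are lidarScan objects (the variable names are illustrative):

```matlab
% Stage 1: coarse but robust correlative scan matching over a wide search space.
coarsePose = matchScansGrid(currScan, refScan);

% Stage 2: refine the coarse estimate with NDT, seeded by the grid result.
refinedPose = matchScans(currScan, refScan, 'InitialPose', coarsePose);

% Apply the refined relative pose [x y theta] to the current scan.
tformedScan = transformScans(currScan, refinedPose);
```

Seeding matchScans with the grid-based estimate keeps the NDT optimization near the correct solution, avoiding the local maxima that cause the tiny, wrong transforms described in the question.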



R2018b
