Classification of a matrix and binary mask

1 view (last 30 days)
nomad nomad on 29 Apr 2011
Hello everyone,
With the aim of optimizing the ability to discriminate between two correlation peaks (the peaks obtained by correlating a scene containing the two letters E and F with a target containing the letter F), and starting from a theoretical calculation, I arrived at a matrix of the form:
M(u,v) = |D(u,v)| * cos(phi_d(u,v) - phi_t(u,v)) - epsilon * |T(u,v)|
with D(u,v) the FT of the centered letter E;
T(u,v) the FT of the centered letter F;
phi_d the phase of D (the centered E) and phi_t the phase of T (the centered F).
The matrix is summed over u, v = 0 to (image size)/2.
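
For reference, below is a minimal MATLAB sketch of how M(u,v) could be assembled from these definitions; the input image files, the use of plain fft2 without shifting, and the value of epsilon are all my assumptions, not part of the original formulation:

```matlab
% Assumed inputs: E and F are same-size grayscale images containing
% the centered letters E and F; epsilon is a small weighting constant.
E = double(imread('letterE.png'));    % hypothetical file name
F = double(imread('letterF.png'));    % hypothetical file name
epsilon = 0.1;                        % assumed value

D = fft2(E);                          % FT of the centered letter E
T = fft2(F);                          % FT of the centered letter F
phi_d = angle(D);                     % phase of D
phi_t = angle(T);                     % phase of T

M = abs(D) .* cos(phi_d - phi_t) - epsilon * abs(T);

% Sum over the quadrant u, v = 0 .. (image size)/2
[rows, cols] = size(M);
S = sum(sum(M(1:floor(rows/2), 1:floor(cols/2))));
```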
1. I want to classify this matrix as follows: while the sum is greater than 0, I eliminate entries from the positive region, and while the sum is less than 0, I eliminate entries from the negative region, running the program until the sum equals 0 (see the sketch below).
2. How do I code a binary mask that blocks certain frequencies in order to optimize the correlation, and on what basis should I choose which frequencies to block?
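
Here is one hedged MATLAB sketch of how both steps might look, continuing from the M and T computed above. Interpreting "eliminate from the positive/negative region" as zeroing the strongest entry of that sign, the stopping tolerance tol, and deriving the mask from the entries that survive the classification are all assumptions on my part:

```matlab
% --- Question 1: iteratively drive the quadrant sum of M to 0 ---
tol = 1e-6;                                % assumed stopping tolerance
Mq  = M(1:floor(end/2), 1:floor(end/2));   % quadrant u, v = 0 .. N/2
while abs(sum(Mq(:))) > tol
    if sum(Mq(:)) > 0
        [~, idx] = max(Mq(:));             % strongest positive entry
    else
        [~, idx] = min(Mq(:));             % strongest negative entry
    end
    Mq(idx) = 0;                           % eliminate it from its region
end

% --- Question 2: binary mask that blocks certain frequencies ---
% One possible basis: pass only the frequencies whose entries survived
% the classification above, and block everything that was zeroed.
mask = (Mq ~= 0);                          % logical (binary) mask
Tq = T(1:floor(end/2), 1:floor(end/2));
Tq_masked = Tq .* mask;                    % masked target spectrum
```

More generally, a common basis for choosing the blocked frequencies is to suppress the bands where |D| and |T| are most similar, since those frequencies contribute little to discriminating E from F; the right choice depends on your optimization criterion.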
Thank you in advance

Answers (0)
