How can I display "not recognized" for someone not in the database?

Ali Subhan on 26 May 2018
Commented: Image Analyst on 26 May 2018
Hi everyone! I am working on real-time face recognition and I have built a classifier. The problem I am having is that if someone from outside the database appears, the code still classifies him/her as someone in the database. So how can I determine that a person is not in the database and display "not recognized"? I am tracking the face with the KLT algorithm; the code is in the comment below.
1 Comment
Ali Subhan on 26 May 2018
Edited: Ali Subhan on 26 May 2018
%%
clear all;
clc;

%%
convnet = alexnet;
featureLayer = 'fc7';
load('FACE_classifier');

% Create the face detector object.
faceDetector = vision.CascadeObjectDetector();
faceDetector.MergeThreshold = 9;

% Create the point tracker object.
pointTracker = vision.PointTracker('MaxBidirectionalError', 2);

% Create the webcam object.
cam = webcam();

% Capture one frame to get its size.
videoFrame = snapshot(cam);
frameSize = size(videoFrame);

% Create the video player object.
videoPlayer = vision.VideoPlayer('Position', [100 100 [frameSize(2), frameSize(1)]+30]);
%%
runLoop = true;
numPts = 0;
frameCount = 0;
while runLoop
    % Get the next frame.
    videoFrame = snapshot(cam);
    videoFrameGray = rgb2gray(videoFrame);
    frameCount = frameCount + 1;

    if numPts < 10
        % Detection mode.
        bbox = faceDetector.step(videoFrameGray);

        if ~isempty(bbox)
            % Find corner points inside the detected region.
            points = detectMinEigenFeatures(videoFrameGray, 'ROI', bbox(1, :));

            % Re-initialize the point tracker.
            xyPoints = points.Location;
            numPts = size(xyPoints, 1);

            % Crop the first detected face, resize it to the AlexNet input
            % size, and classify it from its fc7 features.
            pause(0.5); title('test Image');
            test = imcrop(videoFrame, bbox(1, :));   % use only the first detection
            imshow(test);
            testSet = imresize(test, [227 227]);
            testFeatures = activations(convnet, testSet, featureLayer);
            predictedLabels = predict(FACE_classifier, testFeatures)   % no semicolon: prints the label

            % Re-initialize the point tracker with the new points.
            release(pointTracker);
            initialize(pointTracker, xyPoints, videoFrameGray);

            % Save a copy of the points.
            oldPoints = xyPoints;

            % Convert the rectangle represented as [x, y, w, h] into an
            % M-by-2 matrix of [x,y] coordinates of the four corners. This
            % is needed to be able to transform the bounding box to display
            % the orientation of the face.
            bboxPoints = bbox2points(bbox(1, :));

            % Convert the box corners into the [x1 y1 x2 y2 x3 y3 x4 y4]
            % format required by insertShape.
            bboxPolygon = reshape(bboxPoints', 1, []);

            % Display a bounding box around the detected face.
            videoFrame = insertShape(videoFrame, 'Polygon', bboxPolygon, 'LineWidth', 3);

            % Display detected corners.
            videoFrame = insertMarker(videoFrame, xyPoints, '+', 'Color', 'white');
        end
    else
        % Tracking mode.
        [xyPoints, isFound] = step(pointTracker, videoFrameGray);
        visiblePoints = xyPoints(isFound, :);
        oldInliers = oldPoints(isFound, :);
        numPts = size(visiblePoints, 1);

        if numPts >= 10
            % Estimate the geometric transformation between the old points
            % and the new points.
            [xform, oldInliers, visiblePoints] = estimateGeometricTransform(...
                oldInliers, visiblePoints, 'similarity', 'MaxDistance', 4);

            % Apply the transformation to the bounding box.
            bboxPoints = transformPointsForward(xform, bboxPoints);

            % Convert the box corners into the [x1 y1 x2 y2 x3 y3 x4 y4]
            % format required by insertShape.
            bboxPolygon = reshape(bboxPoints', 1, []);

            % Display a bounding box around the face being tracked.
            videoFrame = insertShape(videoFrame, 'Polygon', bboxPolygon, 'LineWidth', 3);

            % Display tracked points.
            videoFrame = insertMarker(videoFrame, visiblePoints, '+', 'Color', 'white');

            % Reset the points.
            oldPoints = visiblePoints;
            setPoints(pointTracker, oldPoints);
        end
    end

    % Display the annotated video frame using the video player object.
    step(videoPlayer, videoFrame);

    % Check whether the video player window has been closed.
    runLoop = isOpen(videoPlayer);
end
% Clean up.
clear cam;
release(videoPlayer);
release(pointTracker);
release(faceDetector);


Answers (1)

Image Analyst on 26 May 2018
Edited: Image Analyst on 26 May 2018
How about this:
uiwait(warndlg('Not recognized!'));
Oh, and please see this link.
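For context, here is a minimal sketch of where such a dialog could sit in the detection branch of your loop. The faceIsKnown flag is a hypothetical placeholder for whatever rejection test you end up using (a score threshold, a feature-distance check, etc.), and insertText is just one way to show the accepted label on the frame.

% Hypothetical sketch: after the classification step in the detection branch,
% show the dialog when the face fails whatever rejection test you choose.
% faceIsKnown is a placeholder flag; how it is computed is up to you.
testFeatures = activations(convnet, testSet, featureLayer);
predictedLabel = predict(FACE_classifier, testFeatures);
faceIsKnown = false;   % placeholder: replace with a real rejection test
if faceIsKnown
    videoFrame = insertText(videoFrame, bbox(1, 1:2), char(predictedLabel));
else
    uiwait(warndlg('Not recognized!'));
end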
8 Comments
Ali Subhan on 26 May 2018
I don't know; AlexNet does the feature extraction. How can I compute a distance metric?
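(For what it's worth, one common way to turn those fc7 features into a distance test is nearest-neighbour matching against the training features. The sketch below is only an illustration: trainFeatures, trainLabels, and distThreshold are hypothetical names, assuming you saved the training features and labels when you built the classifier, and the threshold has to be tuned on your own data.)

% Hypothetical sketch of a distance-metric rejection test on fc7 features.
% trainFeatures (one row per training image) and trainLabels are assumed to
% have been saved at training time; distThreshold is a placeholder value.
distThreshold = 50;                                            % tune on validation images
testFeatures  = activations(convnet, testSet, featureLayer);
dists = pdist2(double(testFeatures), double(trainFeatures));   % Euclidean distance to every training image
[minDist, idx] = min(dists);
if minDist < distThreshold
    predictedLabel = trainLabels(idx);                         % closest known face
else
    uiwait(warndlg('Not recognized!'));
end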
Image Analyst on 26 May 2018
Alexnet will tell you. You'll have an output for every person in the database. You should have one of those outputs be true if the person is in there. If the person is not in the training set, then all of the outputs will be false.
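(To make that concrete: if FACE_classifier is a multiclass SVM trained with fitcecoc, as in the usual transfer-learning example, the second output of predict gives a score for every class, and you can reject the prediction when even the best score is too low. This is only a sketch; the threshold value is a placeholder you would tune on validation data.)

% Hypothetical sketch: reject the prediction when no class score is high enough.
% For a ClassificationECOC model, the second output of predict is the negated
% average binary loss per class; larger means more confident.
scoreThreshold = 0;                                   % placeholder, tune on validation data
[predictedLabel, negLoss] = predict(FACE_classifier, testFeatures);
bestScore = max(negLoss, [], 2);
if bestScore < scoreThreshold
    uiwait(warndlg('Not recognized!'));               % none of the known people fit well
else
    disp(predictedLabel)                              % person from the database
end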
