Adding new features to trained classifier (fitcecoc)

2 views (last 30 days)
Emi Boto on 7 Oct 2015
Answered: Shubham on 5 Sep 2024
Hi everyone.
I'm designing a face recognition system for a hotel. I have an imageSet with photos of a few guests, so if I train a fitcecoc classifier I can recognize them when I give the system one input photo. That part works perfectly.
The problem comes when new guests check in. The process would be simple: the receptionist would take a photo of the new guest and add a new entry to the database.
My question is: how can I add a new entry to the 'database' without retraining the whole system (which will get bigger each time)? Is there any way to add new elements to an existing classifier?
I need accuracy, so HOG + fitcecoc is a great solution, but I also need the system to stay fast.
Excuse my English, and thank you.
My best regards,
Emi.

Answers (1)

Shubham on 5 Sep 2024
Hi Emi,
Your use case for a face recognition system in a hotel setting is quite common, and retraining the entire model every time a new guest is added is indeed inefficient. Unfortunately, a model trained with fitcecoc has a fixed set of classes, so you cannot add a new guest (a new class) to it without retraining; it does not support that kind of incremental update. However, there are alternative approaches you can consider:
Alternative Approaches:
  1. Incremental Learning Algorithms: Use algorithms that support incremental learning, such as some variants of Support Vector Machines (SVM) or neural networks that are designed for online learning. However, MATLAB's built-in fitcecoc does not directly support this.
  2. Feature Extraction with HOG and KNN: You can use the Histogram of Oriented Gradients (HOG) for feature extraction, which you're already using, and then apply a K-Nearest Neighbors (KNN) classifier. KNN can be updated easily by simply adding new feature vectors (guest images) to the dataset without retraining the entire model.
  3. Hybrid Approach with Pre-trained Models: Use a pre-trained deep learning model to extract features from images, and then use a simpler classifier like KNN or a small neural network. The feature extraction part remains constant, and you only update the classifier (see the sketch after the KNN example below).
  4. Database and Similarity Search: Store the HOG features of each guest in a database. When a new guest is added, compute their HOG features and store them. For recognition, compute the HOG features of the input image and perform a similarity search in the database using a distance metric (e.g., Euclidean distance) to find the closest match (a minimal sketch follows this list).
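For the database-and-similarity-search idea, a minimal sketch could look like the code below. The variable names (guestFeatures, guestNames, distThreshold) and the image size are placeholders I've introduced for illustration, not anything from the original post, and the threshold has to be tuned on your own data:
% Assume guestFeatures holds one HOG row vector per enrolled guest and
% guestNames is the matching cell array of names (both are placeholders)
imageSize = [128 128]; % must match the size used when enrolling guests
queryImage = imresize(imread('query.jpg'), imageSize);
queryFeature = extractHOGFeatures(queryImage, 'CellSize', [8 8]);
% Euclidean distance from the query to every enrolled guest
distances = pdist2(guestFeatures, queryFeature);
[minDist, idx] = min(distances);
distThreshold = 5; % example value only - tune it on your own data
if minDist < distThreshold
    disp(['Recognized guest: ', guestNames{idx}]);
else
    disp('Unknown face - enroll as a new guest.');
end
% Enrolling a new guest is just appending a row and a name, no retraining:
% guestFeatures = [guestFeatures; newGuestFeature];
% guestNames    = [guestNames; {'new_guest'}];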
Example Using HOG and KNN:
Here's a simple example of how you might implement a system using HOG features and a KNN classifier:
% Load or capture a new guest image
newGuestImage = imread('new_guest.jpg');
% Resize every image to the same size so each HOG vector has the same
% length (required to stack them into one feature matrix)
imageSize = [128 128]; % example size, adjust based on your setup
newGuestImage = imresize(newGuestImage, imageSize);
% Extract HOG features
newGuestFeatures = extractHOGFeatures(newGuestImage, 'CellSize', [8 8]);
% Assume you have an existing feature matrix and labels
% existingFeatures = [existing HOG feature matrix];
% existingLabels = {'guest1'; 'guest2'; ...}; % column cell array of names
% Add the new guest's features and label
existingFeatures = [existingFeatures; newGuestFeatures];
existingLabels = [existingLabels; {'new_guest'}];
% Train a KNN classifier (quick to refit, since KNN essentially just stores the samples)
knnModel = fitcknn(existingFeatures, existingLabels);
% For recognition, extract features from the input image and classify
inputImage = imresize(imread('input.jpg'), imageSize);
inputFeatures = extractHOGFeatures(inputImage, 'CellSize', [8 8]);
predictedLabel = predict(knnModel, inputFeatures);
disp(['Predicted Guest: ', char(predictedLabel)]);
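If HOG turns out not to be robust enough, the hybrid approach from point 3 swaps the feature extractor while keeping the same KNN (or similarity-search) logic. This is only a sketch, assuming the Deep Learning Toolbox and the ResNet-18 support package are installed; check the layer name against net.Layers on your installation:
% Load a pretrained network once and reuse it for every image
net = resnet18;
inputSize = net.Layers(1).InputSize(1:2); % [224 224] for ResNet-18
img = imresize(imread('new_guest.jpg'), inputSize);
% 'pool5' is the global average pooling layer in MATLAB's ResNet-18 model;
% inspect net.Layers if your version names it differently
deepFeatures = activations(net, img, 'pool5', 'OutputAs', 'rows');
% deepFeatures (a 1-by-512 row vector) can now replace the HOG vector in the example above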
Considerations:
  • Scalability: KNN is simple and easy to update, but it may not scale well with very large datasets because it needs to compare against all stored samples.
  • Accuracy: Ensure that the feature extraction process (HOG in this case) is robust enough to handle variations in lighting, pose, and expression.
  • Performance: Depending on the size of your dataset, consider optimizing the feature extraction and classification process to maintain real-time performance.
