confusion matrix for image retrieval
im = fullfile(pn, fn); % full path to the selected query image
images_query = imageDatastore(rootFolder, 'IncludeSubfolders',true, 'LabelSource','foldernames'); %%'ReadFcn', @readCBIR
R = imread(im); % read the query image
Input_Layer_Size_q = net.Layers(1).InputSize(1:2); % first two elements of the input size (height, width)
Resized_Test_image_q = augmentedImageDatastore(Input_Layer_Size_q, R, 'ColorPreprocessing','gray2rgb'); % resize the query image to the network input size (to process a whole test folder, pass the test datastore instead)
%Extract feature
train_feature = activations(net, Resized_Training_image, 'Animal Feature Learner', 'OutputAs', 'Rows');
query_feature = activations(net, Resized_Test_image_q, 'Animal Feature Learner', 'OutputAs', 'Rows');
% Equation 2
a = query_feature;
b = transpose(1 - a); % (1 - query feature), transposed
% Equation 3
c = zeros(Number_of_Classes, Number_of_Training_images);
d = sqrt(sum((query_feature' - train_feature') .^ 2)); % alternative: Euclidean distance to each training image (note: currently returns images all from the same category, so something may be wrong here)
for e = 1 : Number_of_Training_images
    c(:, e) = b .* d(:, e);
end
c = sqrt(sum(c))';
% Fetch the top 25 most similar images
[~, n] = sort(c); % sort by ascending distance
n = n(1:25);
files = cell(1, 25);
for h = 1:25
    files{h} = Training_image.Files{n(h)};
end
% Display the query image
figure;
imshow(R);
title('Query')
% Display the retrieved images
figure;
montage(files);
title('Retrieved')
figure;
confusionchart(Testing_image.Labels, Predicted_Label, 'Normalization','row-normalized', 'Title','Normalised ConfMat'); % row-normalised confusion chart
confMat = confusionmat(Testing_image.Labels, Predicted_Label); % confusionmat returns the numeric confusion matrix
How do I find the confusion matrix, precision, and RMS error for image retrieval?
Answer (1)
Omega
18 October 2023
Hi,
I understand that you would like to calculate precision and RMSE (root mean square error) and plot the confusion matrix for your image retrieval task.
To find the confusion matrix, precision, and RMSE for image retrieval, you need the ground-truth labels of the test images and the predicted labels obtained from the retrieval process.
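If you have not yet derived “Predicted_Label” from the retrieval step, one common convention is to assign each test image the label of its nearest training image in feature space. Below is a minimal sketch under that assumption; “Resized_Test_image” is a hypothetical augmentedImageDatastore of the whole test set (built like “Resized_Test_image_q” in your code), and the nearest-neighbour rule is just one possible choice, not necessarily what your Equations 2 and 3 prescribe.
% Sketch (assumption): predict each test image as the class of its nearest
% training image in feature space, then use these labels for the confusion matrix.
test_feature = activations(net, Resized_Test_image, 'Animal Feature Learner', 'OutputAs', 'Rows');
numTest = numel(Testing_image.Files);
Predicted_Label = repmat(Training_image.Labels(1), numTest, 1); % preallocate as categorical
for k = 1:numTest
    dists = sqrt(sum((train_feature - test_feature(k, :)).^2, 2)); % Euclidean distance to every training image
    [~, nearest] = min(dists);
    Predicted_Label(k) = Training_image.Labels(nearest); % label of the closest match
end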
1. Confusion Matrix:
Assuming you have the predicted labels “Predicted_Label” and the true labels “Testing_image.Labels”, you can use the “confusionmat” function to generate the confusion matrix and plot it with the “confusionchart” function in MATLAB. Here's an example:
confMat = confusionmat(Testing_image.Labels, Predicted_Label);
confusionchart(Testing_image.Labels, Predicted_Label,'Normalization','row-normalized', 'Title', 'Normalised ConfMat');
2. Precision:
Precision can be calculated as the number of true positives divided by the sum of true positives and false positives for each class. You can use the confusion matrix to calculate precision. Here's an example for a specific class:
classIndex = 1; % Index of the class for which you want to calculate precision
truePositives = confMat(classIndex, classIndex);
falsePositives = sum(confMat(:, classIndex)) - truePositives;
precision = truePositives / (truePositives + falsePositives);
Note that precision is typically calculated per class, so you may need to loop through the classes to obtain an overall precision value, as sketched below.
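A minimal sketch of that loop, reusing the “confMat” computed above and reporting the macro-averaged precision (the unweighted mean of the per-class values); the variable names are illustrative only:
% Per-class precision from the confusion matrix (rows = true class, columns = predicted class)
numClasses = size(confMat, 1);
precisionPerClass = zeros(numClasses, 1);
for classIndex = 1:numClasses
    truePositives = confMat(classIndex, classIndex);
    falsePositives = sum(confMat(:, classIndex)) - truePositives;
    precisionPerClass(classIndex) = truePositives / (truePositives + falsePositives);
end
macroPrecision = mean(precisionPerClass, 'omitnan'); % overall (macro-averaged) precision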
3. RMSE:
RMSE (root mean square error) is not typically used for evaluating image retrieval tasks; it is a metric commonly used in regression to measure the average squared difference between predicted and actual values. Still, if you would like to calculate RMSE for numeric predictions, you can do so with MATLAB's “rmse” function (R2022b or later).
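A minimal sketch, assuming you have numeric predicted and target values (for example, similarity scores); “predictedScores” and “trueScores” are placeholder vectors, not variables from your code:
predictedScores = [0.9 0.4 0.7]; % hypothetical predicted values
trueScores = [1.0 0.0 1.0]; % hypothetical ground-truth values
err = rmse(predictedScores, trueScores); % built-in rmse, R2022b or later
err_manual = sqrt(mean((predictedScores - trueScores).^2)); % equivalent for earlier releases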
You can refer to the MATLAB documentation for “confusionmat”, “confusionchart”, and “rmse” to understand these functions better.