
Calculate Sensitivity and Specificity from Code generated from Classification Learner

54 views (last 30 days)
I have trained my dataset in the Classification Learner app and tried to evaluate classification performance using leave-one-out cross-validation. Since Classification Learner does not support this K-fold configuration, I generated the code for training the currently selected model.
I have tried to compute the sensitivity and specificity, but every approach I found relies on predicted class labels, and I cannot obtain predicted labels because there is no new dataset; I just want to evaluate the trained model.
Is there any way to evaluate the sensitivity and specificity, or to obtain the confusion matrix, from the code generated by the Classification Learner app?
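For context, a minimal sketch of switching the cross-validation in the generated script to leave-one-out; the fisheriris data and fitcknn model below are only stand-ins for your own dataset and chosen classifier:
% Sketch only (assumed workflow, not the exact generated script):
% replace the 'KFold' argument used by the generated code with a
% leave-one-out option.
load fisheriris                                      % predictors 'meas', labels 'species'
mdl = fitcknn(meas, species);                        % any classifier trained on all the data
partitionedModel = crossval(mdl, 'Leaveout', 'on');  % leave-one-out cross-validation
validationAccuracy = 1 - kfoldLoss(partitionedModel, 'LossFun', 'ClassifError')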

Accepted Answer

Sarah Ayyad on 28 Sep 2021
Edited: Sarah Ayyad on 28 Sep 2021
I computed all the performance metrics in the following way:
[validationPredictions, validationScores] = kfoldPredict(partitionedModel);
% response is the vector of true class labels (the last column of the dataset)
confmat = confusionmat(response, validationPredictions)   % 2x2 confusion matrix for a binary problem
TP = confmat(2, 2);   % true positives
TN = confmat(1, 1);   % true negatives
FP = confmat(1, 2);   % false positives
FN = confmat(2, 1);   % false negatives
Accuracy    = (TP + TN) / (TP + TN + FP + FN);
Sensitivity = TP / (TP + FN);      % true positive rate (recall)
Specificity = TN / (TN + FP);      % true negative rate
FPR = FP / (FP + TN);              % false positive rate = 1 - Specificity
% Three-point trapezoidal approximation of the ROC AUC (binary classification only)
X = [0; Sensitivity; 1];           % true positive rates
Y = [0; FPR; 1];                   % false positive rates
AUC = trapz(Y, X);
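As a possible refinement beyond the answer above, the scores returned by kfoldPredict can also be passed to perfcurve to obtain a full ROC curve and its AUC; the positive-class choice and the score column used below are assumptions you should check against partitionedModel.ClassNames:
% Sketch only: ROC/AUC from the validation scores returned by kfoldPredict.
% The second class in ClassNames is assumed to be the positive class, and the
% second column of validationScores is assumed to correspond to it.
positiveClass = partitionedModel.ClassNames(2);
[rocX, rocY, ~, aucScore] = perfcurve(response, validationScores(:, 2), positiveClass);
plot(rocX, rocY), xlabel('False positive rate'), ylabel('True positive rate')
title(sprintf('ROC curve (AUC = %.3f)', aucScore))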

Additional Answers (0)
