Precision/Recall (perfcurve)
56 views (last 30 days)
Hello,
My naive question is about the precision and recall rates that can be output from the perfcurve function.
As you may already know, this would look like:
[X, Y] = perfcurve(labels, score(:, 2), 'LowFlow', 'XCrit', 'prec', 'YCrit', 'reca');
X and Y, however, are vectors, whereas what is normally reported in the literature is a single value. My question is: to get the precision/recall estimates, should I take the mean of the non-NaN values of X (= precision) and the mean of the non-NaN values of Y (= recall), or is there another computation involved in getting a single value that represents these rates?
Thank you for your help.
Accepted Answer
Athul Prakash
23 October 2019
Edited: Athul Prakash on 23 October 2019
perfcurve() is meant for computing performance curves for your model - such as PR curves or ROC curves. In the literature, you may find these curves used for a more rigorous examination of a model's performance than can be given by a single score.
I recommend reading the doc linked below. Under the hood, perfcurve sweeps a decision threshold across the range of your scores; at each threshold it computes one (precision, recall) pair, and those pairs are the elements of X and Y that trace out the curve. The Algorithms > Pointwise Confidence Bounds section of the doc describes the optional resampling (bootstrap or cross-validation) used to put confidence bounds on those points.
perfcurve() doc: https://www.mathworks.com/help/stats/perfcurve.html
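Here is a minimal sketch of the question's own call (assuming 'labels', 'score' and the positive class 'LowFlow' as in the question); the third output T holds the threshold that generates each point on the curve, so you can also read off a single operating point from it:
[X, Y, T] = perfcurve(labels, score(:,2), 'LowFlow', 'XCrit', 'prec', 'YCrit', 'reca');
plot(X, Y)                       % precision on the x-axis, recall on the y-axis
xlabel('Precision'), ylabel('Recall')
% To report one precision/recall pair, pick the threshold you actually classify
% at (e.g. 0.5 for posterior probabilities) and take that point on the curve:
[~, idx] = min(abs(T - 0.5));    % point whose threshold is closest to 0.5
precisionAt05 = X(idx);
recallAt05 = Y(idx);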
To calculate a single value, you can extract the counts of TP, FP, TN and FN from your labels and predictions, then compute the rates manually:
Precision = TP / (TP + FP)
Recall = TP / (TP + FN)
This takes only a couple of lines of code; a sketch follows below.
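A minimal sketch, assuming 'labels' holds the true classes and 'predicted' holds the model's predicted classes as cell arrays of character vectors (e.g. from predict()), with 'LowFlow' treated as the positive class ('predicted' is an illustrative name, not from the question):
isPos   = strcmp(labels, 'LowFlow');      % true-positive-class mask
predPos = strcmp(predicted, 'LowFlow');   % predicted-positive mask

TP = sum(predPos &  isPos);               % predicted positive, actually positive
FP = sum(predPos & ~isPos);               % predicted positive, actually negative
FN = sum(~predPos &  isPos);              % predicted negative, actually positive

precision = TP / (TP + FP);
recall    = TP / (TP + FN);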
Alternatively, you can use confusionmat() to produce a confusion matrix, which readily gives you the TP/FP/TN/FN counts, and then do the same computation.
% Using this method, here are anonymous functions for precision and recall,
% assuming 'confusionMat' comes from confusionmat() (rows = true class,
% columns = predicted class).
precision = @(confusionMat) diag(confusionMat)./sum(confusionMat,1)';  % TP ./ (TP + FP), per class
recall    = @(confusionMat) diag(confusionMat)./sum(confusionMat,2);   % TP ./ (TP + FN), per class
% Each returns a vector of values - one entry per label treated as the positive class.
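Example usage (assuming 'labels' and 'predicted' as in the earlier sketch):
C = confusionmat(labels, predicted);   % rows = true class, columns = predicted class
p = precision(C);                      % per-class precision, one entry per label
r = recall(C);                         % per-class recall, one entry per label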
Hope it Helps!
More Answers (0)