# instanceSegmentationMetrics

## Description

An `instanceSegmentationMetrics` object stores instance segmentation quality metrics, such as the confusion matrix and average precision, for a set of images.

## Creation

Create an `instanceSegmentationMetrics` object using the `evaluateInstanceSegmentation` function.
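A minimal usage sketch follows. The datastore names `dsResults` and `dsTruth` are hypothetical placeholders for your own predicted-results and ground truth datastores:

```matlab
% Evaluate predicted instance masks against ground truth at an overlap
% threshold of 0.5.
% dsResults — datastore of predicted masks, labels, and scores (hypothetical)
% dsTruth   — datastore of ground truth masks and labels (hypothetical)
metrics = evaluateInstanceSegmentation(dsResults, dsTruth, 0.5);

metrics.DataSetMetrics   % aggregate metrics such as mAP
metrics.ClassMetrics     % per-class AP, precision, and recall
```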

## Properties

`ConfusionMatrix` — Confusion matrix

numeric matrix | numeric array

This property is read-only.

Confusion matrix, specified as a numeric matrix or numeric array.

When `OverlapThreshold` is a scalar, `ConfusionMatrix` is a square matrix of size *C*-by-*C*, where *C* is the number of classes. Each element (*i*, *j*) is the count of objects known to belong to class *i* but predicted to belong to class *j*.

When `OverlapThreshold` is a vector, `ConfusionMatrix` is an array of size *C*-by-*C*-by-*numThresh*. There is one confusion matrix for each of the *numThresh* overlap thresholds.
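For example, with a vector of thresholds you can index the third dimension to pick out the confusion matrix for one threshold. This sketch assumes `metrics` was returned by `evaluateInstanceSegmentation` with an overlap threshold vector of `[0.5 0.75 0.9]`:

```matlab
% ConfusionMatrix is C-by-C-by-3 here, one page per overlap threshold.
cm50 = metrics.ConfusionMatrix(:, :, 1);   % confusion matrix at IoU >= 0.5
cm90 = metrics.ConfusionMatrix(:, :, 3);   % confusion matrix at IoU >= 0.9
```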

`NormalizedConfusionMatrix` — Normalized confusion matrix

numeric matrix | numeric array

This property is read-only.

Normalized confusion matrix, specified as a numeric matrix or numeric array with
elements in the range [0, 1]. `NormalizedConfusionMatrix` represents
a confusion matrix normalized by the number of objects known to belong to each class.
For each overlap threshold, each element (*i*, *j*) in
the normalized confusion matrix is the count of objects known to belong to class
*i* but predicted to belong to class *j*, divided by
the total number of objects known to belong to class *i*.
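The normalization can be sketched directly. This example uses a hypothetical 3-class confusion matrix and divides each row by that row's ground truth object count:

```matlab
confMat = [8 1 1;
           2 6 2;
           0 1 9];                          % hypothetical object counts
normConfMat = confMat ./ sum(confMat, 2);   % each row now sums to 1
```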

`DataSetMetrics` — Metrics aggregated over the data set

table

This property is read-only.

Metrics aggregated over the data set, specified as a table with one row.
`DataSetMetrics` has two columns, corresponding to these instance
segmentation metrics:

- `mAP` — Mean average precision, or the average precision values calculated at the default threshold range and averaged over all the classes.
- `AP` — Average precision (AP) calculated for each class at each specified overlap threshold in `OverlapThreshold`, returned as a *numThresh*-by-*numObjects* matrix. *numThresh* is the number of overlap thresholds and *numObjects* is the number of objects.

`ClassMetrics` — Metrics for each class

table

This property is read-only.

Metrics for each class, specified as a table with *C* rows, where
*C* is the number of classes in the instance segmentation.
`ClassMetrics` has four columns, corresponding to these instance
segmentation metrics:

- `mAP` — Mean average precision calculated for a class at the default threshold range.
- `AP` — Average precision calculated for a class at each overlap threshold in `OverlapThreshold`, returned as a *numThresh*-by-*numObjects* matrix. *numThresh* is the number of overlap thresholds and *numObjects* is the number of objects.
- `Precision` — Precision values, returned as a *numThresh*-by-(*numObjects*+1) matrix. Precision is the ratio of the number of true positives (*TP*) to the total number of predicted positives:

  Precision = *TP* / (*TP* + *FP*)

  *FP* is the number of false positives. Larger precision scores imply that most detected objects match ground truth objects.
- `Recall` — Recall values, returned as a *numThresh*-by-(*numObjects*+1) matrix. Recall is the ratio of the number of true positives (*TP*) to the total number of ground truth positives:

  Recall = *TP* / (*TP* + *FN*)

  *FN* is the number of false negatives. Larger recall scores imply that most ground truth objects are detected.
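The precision and recall formulas can be checked with hypothetical counts:

```matlab
TP = 8;  FP = 2;  FN = 4;        % hypothetical detection counts
precision = TP / (TP + FP)       % 8/10 = 0.8
recall    = TP / (TP + FN)       % 8/12, about 0.667
```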

`ImageMetrics` — Metrics for each image

table

This property is read-only.

Metrics for each image in the data set, specified as a table with
*numImages* rows, where *numImages* is the number of
images in the data set. `ImageMetrics` has two columns, corresponding
to these instance segmentation metrics:

- `mAP` — Mean average precision, or the average precision values calculated at the default threshold range and averaged over all the classes.
- `AP` — Average precision calculated for each class at each overlap threshold in `OverlapThreshold`, returned as a *numThresh*-by-*numObjects* matrix. *numThresh* is the number of overlap thresholds and *numObjects* is the number of objects.

`ClassNames` — Class names

array of strings

Class names of segmented objects, specified as an array of strings.

**Example:** `["sky" "grass" "building" "sidewalk"]`

`OverlapThreshold` — Overlap threshold

numeric scalar | numeric vector

Overlap threshold, specified as a numeric scalar or numeric vector. When the intersection over union (IoU) of the pixels in the predicted object mask and ground truth object mask is equal to or greater than the overlap threshold, the prediction is considered a true positive.

IoU, or the Jaccard Index, is the number of pixels in the intersection of the binary masks divided by the number of pixels in the union of the masks. In other words, IoU is the ratio of correctly classified pixels to the total number of pixels that are assigned that class by the ground truth and the predictor. IoU can be expressed as:

IoU = *TP* / (*TP* +
*FP* + *FN*)
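The IoU formula is equivalent to counting mask pixels directly. A sketch with two hypothetical 4-by-4 binary masks:

```matlab
gtMask   = false(4);  gtMask(1:3, 1:3)   = true;   % ground truth mask, 9 pixels
predMask = false(4);  predMask(2:4, 2:4) = true;   % predicted mask, 9 pixels

% Intersection is 4 pixels, union is 9 + 9 - 4 = 14 pixels.
iou = nnz(gtMask & predMask) / nnz(gtMask | predMask)   % 4/14, about 0.286
```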

## Version History

**Introduced in R2022b**
