Confusion Matrix (Error Matrix) for Accuracy Assessment
[Figure: confusion matrix comparing classifier results against ground truth data, annotated with overall accuracy (OA) and kappa.]

What is an error matrix?

A confusion matrix (or error matrix) is the standard quantitative method for characterising image classification accuracy. It is a table that shows the correspondence between the classification result and a reference image. To create the confusion matrix we therefore need ground truth data, such as cartographic information, the results of manually digitising an image, or field work/ground survey results recorded with a GPS receiver.
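As a sketch (not from the original text), a confusion matrix can be built directly from flattened ground-truth and classified label maps. The labels below are made up for illustration; rows index the reference class and columns the classified class:

```python
import numpy as np

# Hypothetical flattened label maps (class indices, made up for illustration)
reference  = np.array([0, 0, 1, 1, 1, 2, 2, 0, 1, 2])  # ground truth
classified = np.array([0, 1, 1, 1, 2, 2, 2, 0, 1, 0])  # classifier output

n_classes = 3
# cm[i, j] = number of pixels of reference class i assigned to class j
cm = np.zeros((n_classes, n_classes), dtype=int)
np.add.at(cm, (reference, classified), 1)
print(cm)
```

The diagonal of `cm` then holds the correctly classified pixels, which the following sections build on.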

What is Overall Accuracy?

Diagonal cells contain the numbers of correctly identified pixels. Dividing the sum of these cells by the total number of pixels gives the classification's overall accuracy (OvAc).
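In code this is just the trace of the matrix divided by the total pixel count; the matrix below is hypothetical (rows = reference classes, columns = classified classes):

```python
import numpy as np

# Hypothetical confusion matrix: rows = reference, columns = classified
cm = np.array([[2, 1, 0],
               [0, 3, 1],
               [1, 0, 2]])

overall_accuracy = np.trace(cm) / cm.sum()  # correctly classified / all pixels
print(overall_accuracy)  # prints 0.7
```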

What is the kappa coefficient?

It is a measure of how the classification results compare to values assigned by chance. It typically takes values from 0 to 1 (negative values are possible when agreement is worse than chance). If the kappa coefficient equals 0, the agreement between the classified image and the reference image is no better than chance. If it equals 1, the classified image and the ground truth image are identical. So, the higher the kappa coefficient, the more accurate the classification.
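One way to compute kappa (a sketch using a hypothetical matrix; rows = reference, columns = classified) is to compare the observed agreement with the agreement expected by chance from the row and column totals:

```python
import numpy as np

def kappa(cm):
    """Cohen's kappa for a confusion matrix (rows = reference, cols = classified)."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    p_observed = np.trace(cm) / n                              # overall accuracy
    p_chance = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2  # agreement expected by chance
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical confusion matrix, made up for illustration
print(kappa([[2, 1, 0],
             [0, 3, 1],
             [1, 0, 2]]))
```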

What are errors in the error matrix?

Apart from the overall accuracy, the accuracy of each class's identification needs to be assessed. To do that, we look at the non-diagonal cells of the matrix. These cells contain the classification errors, i.e. the cases where the reference image and the classified image don't match. There are two types of errors: underestimation (omission errors) and overestimation (commission errors).

Commission error: For any class, errors of commission occur when the classification procedure assigns to that class pixels that in fact don't belong to it. The number of pixels mistakenly assigned to a class is found in the class's column cells above and below the main diagonal.
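Under this convention (rows = reference, columns = classified), the relative commission error of each class can be sketched as the off-diagonal part of its column divided by the column total; the matrix below is hypothetical:

```python
import numpy as np

# Hypothetical confusion matrix: rows = reference, columns = classified
cm = np.array([[2, 1, 0],
               [0, 3, 1],
               [1, 0, 2]])

col_totals = cm.sum(axis=0)                           # pixels assigned to each class
commission = (col_totals - np.diag(cm)) / col_totals  # mistakenly assigned / assigned
print(commission)
```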

Omission error: For any class, errors of omission occur when pixels that in fact belong to that class are included in other classes. In the confusion matrix, the number of omitted pixels is found in the row cells to the left and to the right of the main diagonal. For class A, omission errors are marked in orange. The sum of these cells is the absolute value of the class's omission. Dividing this sum by the total number of the class's pixels in the reference data gives the relative omission error (Om).
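The relative omission error is the mirror image of the commission calculation: the off-diagonal part of each row divided by the row total (again with a hypothetical matrix; rows = reference, columns = classified):

```python
import numpy as np

# Hypothetical confusion matrix: rows = reference, columns = classified
cm = np.array([[2, 1, 0],
               [0, 3, 1],
               [1, 0, 2]])

row_totals = cm.sum(axis=1)                         # reference pixels of each class
omission = (row_totals - np.diag(cm)) / row_totals  # omitted pixels / reference total (Om)
print(omission)
```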

What is User’s accuracy (UsAc)?

User’s accuracy (UsAc) is another index characterising the amount of commission error. It is the number of correctly identified pixels of a class divided by the total number of pixels of that class in the classified image, i.e. the diagonal cell of the class divided by its column total (see class A in table 1).
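A quick sketch with the same hypothetical matrix and row/column convention as above; note that per class UsAc = 1 − relative commission error:

```python
import numpy as np

# Hypothetical confusion matrix: rows = reference, columns = classified
cm = np.array([[2, 1, 0],
               [0, 3, 1],
               [1, 0, 2]])

users_accuracy = np.diag(cm) / cm.sum(axis=0)  # correct / total assigned to the class
print(users_accuracy)
```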
