Page 235 - AI Ver 3.0 class 10_Flipbook
The confusion matrix with all classification outcomes based on the different values of actual and predicted labels
can be presented as follows:
Confusion Matrix      Predicted: Yes                     Predicted: No
Actual: Yes           True Positive (TP)                 False Negative (FN)
                      (Predicted – Yes, Actual – Yes)    (Predicted – No, Actual – Yes)
Actual: No            False Positive (FP)                True Negative (TN)
                      (Predicted – Yes, Actual – No)     (Predicted – No, Actual – No)
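The four cells of the matrix can be tallied directly from paired lists of actual and predicted labels. A minimal Python sketch (the function name `confusion_counts` and the "Yes"/"No" labels are illustrative, not from the text):

```python
def confusion_counts(actual, predicted):
    """Count TP, TN, FP, FN from parallel lists of 'Yes'/'No' labels."""
    pairs = list(zip(actual, predicted))
    tp = sum(a == "Yes" and p == "Yes" for a, p in pairs)  # correct Yes
    tn = sum(a == "No" and p == "No" for a, p in pairs)    # correct No
    fp = sum(a == "No" and p == "Yes" for a, p in pairs)   # predicted Yes, actually No
    fn = sum(a == "Yes" and p == "No" for a, p in pairs)   # predicted No, actually Yes
    return tp, tn, fp, fn
```

Each prediction falls into exactly one cell, so the four counts always sum to the total number of observations.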
Accuracy from Confusion Matrix
Classification Accuracy is the percentage of correct predictions out of the total observations made by an AI
model. It gives a clear picture of how accurate the model's predictions are. A high accuracy score
generally indicates good performance, since it counts all correctly predicted values (both True Positives and
True Negatives). The mathematical formula for classification accuracy is:
Classification Accuracy = (No. of correct predictions / Total no. of predictions) × 100
Where,
Total Correct Predictions = True Positive (TP) + True Negative (TN)
Total Predictions = True Positive (TP) + True Negative (TN) + False Positive (FP) + False Negative (FN)
A prediction is said to be correct if it matches reality. There are two outcomes in which the prediction matches
reality: True Positive and True Negative.
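The formula above can be sketched in Python; the function name `classification_accuracy` is illustrative, not from the text:

```python
def classification_accuracy(tp, tn, fp, fn):
    """Percentage of correct predictions (TP + TN) out of all predictions."""
    correct = tp + tn              # predictions that match reality
    total = tp + tn + fp + fn      # every prediction made
    return correct / total * 100
```

If every prediction is correct (FP = FN = 0), the function returns 100.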
For example, for a model that predicts whether a credit card transaction is fraudulent, the confusion matrix
is as follows:
Confusion Matrix      Predicted: Yes      Predicted: No
Actual: Yes                15                  14
Actual: No                 12                  10
Total Correct Predictions = TP+TN
= 15+10
= 25
Total Predictions = TP+TN+FP+FN
                  = 15+10+12+14
                  = 51
Classification Accuracy = (25 / 51) × 100 ≈ 49.02%
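The arithmetic for this example can be checked with a few lines of Python (the counts are taken from the fraud-detection matrix above):

```python
tp, tn, fp, fn = 15, 10, 12, 14        # counts from the fraud-detection example
correct = tp + tn                      # 15 + 10 = 25 correct predictions
total = tp + tn + fp + fn              # 15 + 10 + 12 + 14 = 51 predictions
accuracy = correct / total * 100
print(f"{accuracy:.2f}%")              # → 49.02%
```

An accuracy near 49% means the model is right about as often as it is wrong on these observations.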