• The actual value was negative but the model predicted a positive value.
• This is also called Type 1 Error.
False Negative (FN)
• The predicted value doesn’t tally with the actual value.
• The actual value was positive but the model predicted a negative value.
• This is called Type 2 Error.
Let us understand through an example:
An AI model studies data samples of 1000 cases and predicts diabetes/no diabetes in patients. The prediction categories
and their counts are given as follows:
TP – 651  TN – 108
FP – 120  FN – 121
Let’s understand all terms one by one:
True Positive (TP): Model predicted Diabetes when patient has DIABETES.
True Negative (TN): Model predicted No Diabetes when patient has NO DIABETES.
False Positive (FP): Model predicted Diabetes when patient has NO DIABETES.
False Negative (FN): Model predicted No Diabetes when patient has DIABETES.
Now draw the Confusion Matrix for the above data:

CONFUSION MATRIX            REALITY
                         TRUE     FALSE
PREDICTION    TRUE        651      120
              FALSE       121      108
Evaluation Methods
After going through all the possible combinations of Prediction and Reality, let us understand how we can use these
four states to evaluate the model.
Accuracy
Accuracy is described as the percentage of correct predictions out of all the samples. A prediction can be said to be
correct if it matches reality. Here, we have two conditions in which the Prediction matches with the Reality: True Positive
and True Negative. Hence, the formula for Accuracy becomes:
Accuracy = (Correct Predictions / Total Cases) × 100%

Accuracy = (TP + TN) / (TP + TN + FP + FN) × 100%
In the above example,

Total No. of cases = 1000

Accuracy = (TP + TN) / (TP + TN + FP + FN) × 100%
         = (651 + 108) / 1000 × 100%
         = 0.759 × 100% = 75.9%
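The same computation can be checked with a few lines of Python, using the counts from the worked example above (TP = 651, TN = 108, FP = 120, FN = 121):

```python
# Accuracy from the confusion-matrix counts in the diabetes example.
tp, tn, fp, fn = 651, 108, 120, 121

# Accuracy = correct predictions (TP + TN) out of all cases, as a percentage.
accuracy = (tp + tn) / (tp + tn + fp + fn) * 100
print(round(accuracy, 2))  # 75.9
```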
Accuracy indicates what proportion of the model's predictions match reality.
304 Touchpad Artificial Intelligence (Ver. 2.0)-XI

