Classification Accuracy = (Total no. of correct predictions / Total no. of predictions) × 100
                        = (TP + TN) / (TP + TN + FP + FN) × 100
                        = (25 / 51) × 100
                        ≈ 49%
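A short Python sketch of this calculation may help. The split of the 25 correct predictions into TP and TN (and of the 26 errors into FP and FN) is assumed only for illustration, since the text gives just the totals.

```python
def classification_accuracy(tp, tn, fp, fn):
    """Percentage of correct predictions out of all predictions."""
    correct = tp + tn
    total = tp + tn + fp + fn
    return correct / total * 100

# Assumed split: 25 correct (TP + TN) and 26 wrong (FP + FN) predictions out of 51,
# matching the worked example above; the individual counts are illustrative only.
print(classification_accuracy(tp=15, tn=10, fp=14, fn=12))  # ~49.02, i.e. about 49%
```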
Can we use Accuracy all the time?
Accuracy is suitable only when the dataset is balanced, that is, when the positive and negative classes occur in roughly equal numbers, and when all predictions and prediction errors are equally important. In practice, this is often not the case.
For example, consider a classifier model that predicts whether a student will pass a test (Yes) or not pass a test (No); it classifies each input into one of two classes, Yes or No. Let us calculate the accuracy of this
classifier model and construct the confusion matrix for the model.
Here,
• Total test data is 1000.
• Actual values are 900 Yes and 100 No (Unbalanced dataset).
• It is a faulty model which, irrespective of the input, always predicts Yes.
• Calculate the classification accuracy of this model.
To calculate the classification accuracy of this model, follow the given steps:
Step 1 Construct the Actual value vs Predicted value table. Consider Yes as the positive class and No as the
negative class.
Predicted Value     Actual Value
Yes = 1000          Yes = 900
No  = 0             No  = 100
Step 2 Construct the confusion matrix.
So, the faulty model will predict Yes for all the 1000 input data.
Consider Yes as the positive class and No as the negative class. Construct the confusion matrix from the
Actual vs Predicted table.
                          Predicted Values
                          Yes            No
Actual Values   Yes       TP = 900       FN = 0
                No        FP = 100       TN = 0
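The same steps can be checked with a small Python sketch. It rebuilds the confusion matrix for the faulty always-Yes model described above (900 actual Yes, 100 actual No) and then applies the accuracy formula; the lists of labels are assumptions made only to mirror the counts in the example.

```python
# Assumed labels mirroring the example: 900 students actually pass, 100 do not.
actual = ["Yes"] * 900 + ["No"] * 100
# The faulty model predicts Yes for every input, whatever the actual value is.
predicted = ["Yes"] * 1000

# Count each cell of the confusion matrix (Yes = positive class, No = negative class).
tp = sum(a == "Yes" and p == "Yes" for a, p in zip(actual, predicted))  # 900
fn = sum(a == "Yes" and p == "No" for a, p in zip(actual, predicted))   # 0
fp = sum(a == "No" and p == "Yes" for a, p in zip(actual, predicted))   # 100
tn = sum(a == "No" and p == "No" for a, p in zip(actual, predicted))    # 0

accuracy = (tp + tn) / (tp + tn + fp + fn) * 100
print(tp, fn, fp, tn, accuracy)  # 900 0 100 0 90.0
```

Even though this model never identifies a single student who will not pass, its accuracy works out to 90%, which is exactly why accuracy alone can be misleading on an unbalanced dataset.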

