
Where,
                 Total Correct Predictions = True Positive (TP) + True Negative (TN)
                 Total Predictions = True Positive (TP) + True Negative (TN) + False Positive (FP) + False Negative (FN)
                 A prediction is said to be correct if it matches reality. There are two conditions in which the prediction matches
                 reality: True Positive and True Negative.
                 For example, for a model that predicts whether a credit card transaction is fraudulent or not, the confusion matrix
                 is as follows:

                                                  Confusion Matrix        Predicted
                                                                         Yes       No
                                                   Actual     Yes        15        14
                                                              No         12        10

                 Total Correct Predictions  = TP + TN
                                            = 15 + 10
                                            = 25

                 Total Predictions          = TP + TN + FP + FN
                                            = 15 + 10 + 12 + 14
                                            = 51

                 Classification Accuracy    = (Total no. of correct predictions / Total no. of predictions) × 100
                                            = ((TP + TN) / (TP + TN + FP + FN)) × 100
                                            = (25 / 51) × 100
                                            ≈ 49%
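                 The same calculation can be reproduced programmatically. Below is a minimal Python sketch; the variable
                 names tp, tn, fp and fn are illustrative, not from the text:

                     # Confusion matrix values from the fraud-detection example above
                     tp = 15   # True Positives: fraudulent transactions predicted as fraudulent
                     tn = 10   # True Negatives: genuine transactions predicted as genuine
                     fp = 12   # False Positives: genuine transactions predicted as fraudulent
                     fn = 14   # False Negatives: fraudulent transactions predicted as genuine

                     correct_predictions = tp + tn            # 25
                     total_predictions = tp + tn + fp + fn    # 51
                     accuracy = correct_predictions / total_predictions * 100

                     print(f"Classification Accuracy = {accuracy:.0f}%")   # prints 49%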
                 Can we use Accuracy all the time?

                 Accuracy is suitable only when the dataset is balanced, that is, when the positive and negative classes occur in
                 roughly equal numbers (a rare occurrence), and when all predictions and prediction errors are equally important
                 (which is often not the case).
                 For example, consider a classifier model that predicts whether a student will pass a test (Yes) or not pass a test
                 (No). It classifies the input into two classes, Yes and No. Let us calculate the accuracy of this classifier model
                 and construct the confusion matrix for it. Here,
                    • The total test data is 1000.
                    • The actual values are 900 Yes and 100 No (an unbalanced dataset).
                    • It is a faulty model which, irrespective of the input, always predicts Yes.
                    • Calculate the classification accuracy of this model.
                 To calculate the classification accuracy of this model, follow the given steps:

                 Step 1    Construct the Actual value vs Predicted value table. Consider Yes as the positive class and No as the
                           negative class. Since the faulty model always predicts Yes, the table can be summarised as:

                                                  Predicted Value     Actual Value     No. of Samples
                                                  Yes                 Yes              900
                                                  Yes                 No               100
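                 Anticipating the remaining steps, a short Python sketch (the variable names are assumptions for illustration)
                 shows why accuracy is misleading on this unbalanced dataset: the faulty always-Yes model still scores 90%,
                 even though it never detects a single No.

                     # Unbalanced dataset: 900 actual Yes and 100 actual No
                     actual = ["Yes"] * 900 + ["No"] * 100
                     # Faulty model: predicts Yes irrespective of the input
                     predicted = ["Yes"] * len(actual)

                     correct = sum(1 for a, p in zip(actual, predicted) if a == p)
                     accuracy = correct / len(actual) * 100

                     print(f"Classification Accuracy = {accuracy:.0f}%")   # prints 90%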