
The following confusion matrix table illustrates how the four classification outcomes (TP, FP, FN, TN) are defined, and
                 how the predicted value is compared with the actual value in the confusion matrix.


                                                                      Prediction
                      Confusion Matrix                    Yes                        No
                      Actual           Yes       True Positive (TP)       False Negative (FN)
                                       No        False Positive (FP)      True Negative (TN)
                 In the confusion matrix:

                    • The target variable has two values: Positive and Negative.
                    • The rows represent the actual values of the target variable.
                    • The columns represent the predicted values of the target variable.
                    • The number in each cell is the count of predictions made by the machine learning algorithm in that
                      category.

                 To understand the confusion matrix, let's define the following terms:
                    • Positive: The prediction is positive for the scenario. For example, it will rain today.
                    • Negative: The prediction is negative for the scenario. For example, it will not rain today.
                    • True Positive: The predicted value matches the actual value, i.e., the actual value was positive and the model
                      predicted a positive value.
                    • True Negative: The predicted value matches the actual value, i.e., the actual value was negative and the model
                      predicted a negative value.
                    • False Positive (Type 1 error): The predicted value does not match the actual value, i.e., the actual value was
                      negative but the model predicted a positive value.
                    • False Negative (Type 2 error): The predicted value does not match the actual value, i.e., the actual value was
                      positive but the model predicted a negative value.
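The four terms above can be counted directly from a list of actual and predicted labels. The following is a minimal sketch using made-up rain predictions (not data from this chapter), where 1 means "it will rain" and 0 means "it will not rain":

```python
# Illustrative (made-up) labels: 1 = positive ("rain"), 0 = negative ("no rain")
actual    = [1, 0, 1, 1, 0, 0, 1, 0]
predicted = [1, 0, 0, 1, 1, 0, 1, 0]

# Count each of the four outcomes by comparing prediction with actual value
tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)  # Type 1 error
fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)  # Type 2 error

print(tp, tn, fp, fn)  # → 3 3 1 1
```

Note that TP + TN + FP + FN always equals the total number of predictions, since every prediction falls into exactly one of the four cells.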



                                 Brainy Fact


                      AI algorithms can analyse weather patterns and other data to predict natural disasters such as hurricanes and
                      earthquakes.

                 For example, Loan Approval

                 Suppose that, based on some input parameters, you have designed a classifier that predicts whether a loan will be
                 approved.
                 The output is 1 if the loan is approved and 0 if the loan is rejected; that is, 1 and 0 signify whether the
                 loan is approved or not.
                 The following is the confusion matrix of a model predicting whether a loan is approved.
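A confusion matrix like this can be tallied from the model's outputs. The sketch below uses hypothetical loan-approval labels (not the figures from this page), with 1 = approved and 0 = rejected:

```python
# Hypothetical loan decisions: 1 = approved, 0 = rejected
actual    = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
predicted = [1, 0, 0, 1, 1, 0, 1, 1, 0, 1]

# 2x2 confusion matrix keyed by (actual, predicted)
matrix = {("Yes", "Yes"): 0, ("Yes", "No"): 0,
          ("No", "Yes"): 0, ("No", "No"): 0}
for a, p in zip(actual, predicted):
    row = "Yes" if a == 1 else "No"   # actual value (row of the matrix)
    col = "Yes" if p == 1 else "No"   # predicted value (column of the matrix)
    matrix[(row, col)] += 1

print(matrix)
# → {('Yes', 'Yes'): 5, ('Yes', 'No'): 1, ('No', 'Yes'): 1, ('No', 'No'): 3}
```

Here ("Yes", "Yes") counts the true positives, ("No", "No") the true negatives, ("No", "Yes") the false positives, and ("Yes", "No") the false negatives.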

                 [Confusion matrix of the loan-approval model]

                                                                                           Evaluating Models    147