 7.  Which of the following statements is true about F1 Score?
     a.  F1 score is the average of precision and recall.
     b.  F1 score only considers false negatives.
     c.  F1 score is the sum of precision and recall.
     d.  F1 score is always equal to the accuracy of the model.

 8.  What does the recall metric measure in a classification problem?
     a.  The proportion of true positive instances out of all predicted positive instances
     b.  The proportion of actual positive instances that were correctly identified
     c.  The overall accuracy of the model
     d.  The proportion of false negatives out of all predicted negative instances

 9.  Which of the following is true about a confusion matrix?
     a.  The confusion matrix shows only the correct predictions of a model.
     b.  The diagonal elements of a confusion matrix represent the false positives and false negatives.
     c.  The confusion matrix can be used to calculate accuracy, precision, recall, and F1 score.
     d.  The confusion matrix is used only for regression problems.

10.  Which of these is a classification use case example?                                [CBSE Handbook]
     a.  House price prediction
     b.  Credit card fraud
     c.  Salary prediction
     d.  None of these

11.  A teacher's marks prediction system predicts the marks of a student as 75, but the actual marks obtained by the
     student are 80. What is the absolute error in the prediction?                       [CBSE Handbook]
     a.  5
     b.  10
     c.  15
     d.  20

12.  How is the relationship between model performance and accuracy described?           [CBSE Handbook]
     a.  Inversely proportional
     b.  Not related
     c.  Directly proportional
     d.  Randomly fluctuating

B.  Fill in the blanks.

    1.  ………………………. occurs when a model performs well on the training data but poorly on test data because it
        memorizes the training data instead of generalizing.
    2.  ………………………. is prioritized over precision when false negatives are more costly than false positives.
    3.  A Confusion Matrix is a ………………………. structure that helps in measuring the performance of an AI model using
        the test data.
    4.  The target variable in a confusion matrix has two values: Positive and ………………………..
    5.  The rows (x-axis) in the confusion matrix represent the ………………………. values of the target variable.
    6.  The F1 score is a number between 0 and 1 and is the harmonic mean of ………………………. and recall.
    7.  The ideal scenario is called ………………………., where the model strikes the right balance between complexity and
        simplicity, performing well on both training and test data.

