Page 319 - Artificial Intellegence_v2.0_Class_11
Case 4: Did an earthquake occur? Prediction – No, Reality – Yes
False Negative

CONFUSION MATRIX

                            REALITY
                       Yes               No
PREDICTION    Yes      True Positive     False Positive
              No       False Negative    True Negative
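The four cells of the matrix above follow directly from comparing a prediction against reality. A minimal sketch (the helper function and example values are illustrative, not from the text):

```python
# Classify a (prediction, reality) pair into its confusion-matrix cell.
# "Yes" is the positive class, "No" the negative class.
def cell(prediction, reality):
    if prediction == "Yes" and reality == "Yes":
        return "True Positive"
    if prediction == "Yes" and reality == "No":
        return "False Positive"
    if prediction == "No" and reality == "Yes":
        return "False Negative"
    return "True Negative"

# Case 4 from above: the model said No, but an earthquake did occur.
print(cell("No", "Yes"))   # False Negative
```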
3. Why should false positives and false negatives be given due importance in medical testing?
Ans. A false positive in medical testing is an error: the test wrongly reports the presence of a disease (the test result is positive) when in reality the patient does not suffer from that disease. A false negative is also an error: the test result wrongly shows the absence of a disease when in reality the patient is suffering from it. These are the two kinds of errors made by a binary classification model. While most medical tests conducted nowadays are accurate and reliable, there are still a few cases of false positives or false negatives, and their implications for the patient and his family can be quite severe. A false negative is especially dangerous for the patient: the test says you do not have the disease when you actually do, which can be fatal in diseases like cancer, HIV/AIDS, or COVID-19.
4. Explain the Logistic Regression algorithm. Why does it give only 2 values – 0 and 1? Also draw its graphical
representation.
Ans. Logistic regression is an algorithm used to predict a binary outcome: either something is going to happen or it is not going to happen. This can be expressed as yes/no, pass/fail, survival/death, etc.
The independent variable can be categorical or numeric, but the dependent variable is always categorical. So, given some data x, logistic regression tries to find out whether some event y occurs or not. Thus y can either be 0 or 1. If the event takes place, y is given the value 1. If the event does not occur, then y is given the value 0.
[Graph: S-shaped sigmoid curve; x-axis from –6 to 6, y-axis (probability) from 0 to 1]
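The S-shaped curve in the graph is the sigmoid function, which squashes any input into the range (0, 1); the output is then thresholded to give the two values 0 and 1. A minimal sketch (the 0.5 threshold is the usual convention, not stated in the text):

```python
import math

# Sigmoid: maps any real number x to a value between 0 and 1.
def sigmoid(x):
    return 1 / (1 + math.exp(-x))

# y = 1 when the probability reaches the threshold, else y = 0.
def predict(x, threshold=0.5):
    return 1 if sigmoid(x) >= threshold else 0

print(sigmoid(0))    # 0.5
print(predict(4))    # 1 (event occurs)
print(predict(-4))   # 0 (event does not occur)
```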
5. Consider the following Confusion Matrix and calculate the following metrics:
• Accuracy • Precision • Recall • F1 Score
Also comment on the F1 score. What does it tell you about the AI model?
CONFUSION MATRIX

                            REALITY
                       TRUE      FALSE
PREDICTION    TRUE     105       40
              FALSE    60        325
Ans. Total No. of cases = TP + TN + FP + FN = 105 + 325 + 40 + 60 = 530

Accuracy = (TP + TN) / (TP + TN + FP + FN) × 100%
         = (105 + 325)/530 × 100% = 81.13%

Precision = TP / (TP + FP) × 100%
          = 105/(105 + 40) × 100% = 72.41%

Recall = TP / (TP + FN) × 100%
       = 105/(105 + 60) × 100% = 63.64%

F1 Score = 2 × (Precision × Recall) / (Precision + Recall) × 100%
         = 2 × (0.7241 × 0.6364)/(0.7241 + 0.6364) × 100% = 67.74%

Comment: An F1 score of about 67.74% shows only a moderate balance between precision and recall. The model misses a fair number of actual positive cases (recall is just 63.64%), so it needs further improvement before it can be relied upon.
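The calculations above can be verified with a short script, using the counts from the confusion matrix in the question:

```python
# Confusion-matrix counts from the question.
TP, FP, FN, TN = 105, 40, 60, 325

accuracy  = (TP + TN) / (TP + TN + FP + FN) * 100
precision = TP / (TP + FP) * 100
recall    = TP / (TP + FN) * 100
f1 = 2 * precision * recall / (precision + recall)

print(f"Accuracy:  {accuracy:.2f}%")   # 81.13%
print(f"Precision: {precision:.2f}%")  # 72.41%
print(f"Recall:    {recall:.2f}%")     # 63.64%
print(f"F1 Score:  {f1:.2f}%")         # 67.74%
```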
Classification & Clustering 317

