Questions and Answers
What does TP stand for in a binary classification confusion matrix?
- True Positives (correct)
- Total Positives
- True Predictions
- Total Predictions
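
To make the four cells concrete, here is a minimal Python sketch that tallies TP, TN, FP, and FN for a binary classifier; the labels are made up for illustration:

```python
# Count the four confusion-matrix cells for a binary classifier.
# The example labels below are made up for illustration.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]  # actual classes
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]  # model predictions

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # True Positives
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # True Negatives
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # False Positives
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # False Negatives

print(tp, tn, fp, fn)  # 3 3 1 1
```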
What is the formula for calculating Accuracy in a confusion matrix?
- (TP + FP) / (TP + FP + TN + FN)
- (TP + TN) / (TP + FP + TN + FN) (correct)
- (TP + FN) / (TP + FP + TN + FN)
- (TP + TN) / (FP + TN)
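
A one-line check of the formula, reusing the made-up counts from the sketch above:

```python
# Accuracy = (TP + TN) / (TP + FP + TN + FN)
tp, tn, fp, fn = 3, 3, 1, 1  # made-up counts from the sketch above
accuracy = (tp + tn) / (tp + fp + tn + fn)
print(accuracy)  # 0.75
```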
In Precision/Recall evaluation, what is Recall referring to?
- Percentage of true positives, i.e. the share of actual positives the model finds (correct)
- Percentage of true negatives
- Percentage of false alarms
- Percentage of false negatives
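
Precision and Recall follow directly from the same four counts; a small sketch with the same made-up numbers:

```python
# Precision = TP / (TP + FP): share of positive predictions that were right.
# Recall    = TP / (TP + FN): share of actual positives that were found.
tp, fp, fn = 3, 1, 1  # made-up counts, continuing the sketch above
precision = tp / (tp + fp)  # 3 / 4 = 0.75
recall = tp / (tp + fn)     # 3 / 4 = 0.75
print(precision, recall)
```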
What is the F-measure also known as in its most commonly used form?
- The F1 score, the harmonic mean of Precision and Recall (correct)
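
A quick sketch of the F1 computation, using the precision and recall values from the example above:

```python
# F1 = 2 * (precision * recall) / (precision + recall),
# i.e. the harmonic mean of precision and recall.
precision, recall = 0.75, 0.75  # values from the sketch above
f1 = 2 * (precision * recall) / (precision + recall)
print(f1)  # 0.75
```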
For a highly-performing model, where should the False Positives (FP) value be in a confusion matrix?
- As close to zero as possible, along with the False Negatives (correct)
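
A sketch of what a near-ideal confusion matrix looks like; the counts below are made up, and the point is that the diagonal (TP, TN) dominates while the off-diagonal cells (FP, FN) stay near zero:

```python
# Near-ideal confusion matrix (made-up counts): the diagonal dominates
# and the off-diagonal cells, FP and FN, are close to zero.
# Rows = actual class, columns = predicted class.
matrix = [
    [98, 2],  # actual positive: 98 TP, 2 FN
    [1, 99],  # actual negative: 1 FP, 99 TN
]
fp, fn = matrix[1][0], matrix[0][1]
print(fp, fn)  # 1 2 -- both small relative to 200 samples
```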
What does TN represent in a confusion matrix?
- True Negatives: cases correctly predicted as negative (correct)
'Precision' in Precision/Recall evaluation refers to what?
- The percentage of positive predictions that are actually positive, TP / (TP + FP) (correct)
'F-measure' combines which two metrics?
- Precision and Recall (correct)
'Accuracy' in confusion matrices is calculated using which four metrics?
- TP, TN, FP, and FN (correct)
'Recall' percentage measures what aspect in an evaluation?
- The share of actual positives the model correctly identifies, TP / (TP + FN) (correct)