Evaluation of Classification Model Performance

Document Details

Uploaded by PrivilegedEveningPrimrose9790

Tags

confusion matrix, classification model, machine learning

Summary

This document contains notes on how to evaluate the performance of a classification model. It discusses use of the confusion matrix during the evaluation phase to analyze the model's performance. Key evaluation metrics like accuracy, precision, recall, and F1 score are outlined.

Full Transcript

Evaluation of classification

- To evaluate the performance of a classification model, count four outcomes:
  - TP (True Positive): examples correctly classified as positive
  - TN (True Negative): examples correctly rejected
  - FP (False Positive): examples incorrectly classified as positive
  - FN (False Negative): examples incorrectly rejected
- Sensitivity (recall / true positive rate): high sensitivity = few false negatives
- Specificity (true negative rate): high specificity = few false positives
- Accuracy: overall performance (how often the model is correct)

When? Use the confusion matrix during the evaluation phase, after training and testing, to analyze how well your model performs and to calculate key metrics like accuracy, precision, recall, and F1 score.
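The counts and metrics above can be sketched in a few lines of Python. This is an illustrative example, not part of the original notes; the function names and the sample labels are made up.

```python
# Sketch: confusion-matrix counts and derived metrics for a binary
# classifier. Labels use the convention 1 = positive, 0 = negative.

def confusion_counts(y_true, y_pred):
    """Return (TP, TN, FP, FN) for paired true/predicted binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

def metrics(tp, tn, fp, fn):
    """Compute the evaluation metrics discussed in the notes."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0        # sensitivity / TPR
    specificity = tn / (tn + fp) if (tn + fp) else 0.0   # TNR
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return {"accuracy": accuracy, "precision": precision, "recall": recall,
            "specificity": specificity, "f1": f1}

# Example with hypothetical labels:
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
tp, tn, fp, fn = confusion_counts(y_true, y_pred)
print(metrics(tp, tn, fp, fn))
```

Note how high recall corresponds to few false negatives (FN appears only in the recall denominator) and high specificity to few false positives, matching the rules of thumb above.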