A Summary of the Confusion Matrix
The confusion matrix is one of the evaluation metrics used for classification tasks.
| | Predicted Negative | Predicted Positive |
|---|---|---|
| Actual Negative | True Negative (TN) | False Positive (FP) |
| Actual Positive | False Negative (FN) | True Positive (TP) |
Precision = TP / (TP + FP)

Recall = TP / (TP + FN)
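As a quick illustration of these formulas, the sketch below computes precision and recall from hypothetical cell counts (the `tp`, `fp`, `fn`, `tn` values are made up for this example):

```python
# Hypothetical confusion-matrix cell counts (illustrative values only)
tp, fp, fn, tn = 40, 10, 5, 45

precision = tp / (tp + fp)  # 40 / 50 = 0.800
recall = tp / (tp + fn)     # 40 / 45 ≈ 0.889

print(f"Precision: {precision:.3f}")
print(f"Recall:    {recall:.3f}")
```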
```python
from sklearn.metrics import confusion_matrix

# Rows correspond to actual classes, columns to predicted classes
cm = confusion_matrix(y_test, y_pred)
```
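Below is a minimal runnable sketch of the snippet above, using made-up `y_test` and `y_pred` lists. For binary labels, flattening the matrix with `ravel()` returns the counts in the order `tn, fp, fn, tp`, which can be plugged into the formulas; scikit-learn's `precision_score` and `recall_score` compute the same values directly.

```python
from sklearn.metrics import confusion_matrix, precision_score, recall_score

# Made-up labels for illustration; replace with your real y_test / y_pred
y_test = [0, 0, 1, 1, 1, 0, 1, 0]
y_pred = [0, 1, 1, 1, 0, 0, 1, 0]

cm = confusion_matrix(y_test, y_pred)
tn, fp, fn, tp = cm.ravel()  # binary case: TN, FP, FN, TP

print(cm)
print("Precision:", tp / (tp + fp), precision_score(y_test, y_pred))
print("Recall:   ", tp / (tp + fn), recall_score(y_test, y_pred))
```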