...

Binary Classification Metrics refer to the following two formulas used to calculate the reliability of a binary classification model.

Name                              Formula
True Positive Rate (Sensitivity)  TPR = TP / P = TP / (TP + FN)
True Negative Rate (Specificity)  SPC = TN / N = TN / (TN + FP)
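
As a minimal sketch in plain Python (the cell counts TP, FN, TN, FP are hypothetical values chosen for illustration), both rates follow directly from the four confusion-matrix cells:

  # Hypothetical confusion-matrix cell counts.
  TP, FN = 40, 10   # actual positives: P = TP + FN = 50
  TN, FP = 35, 15   # actual negatives: N = TN + FP = 50

  # True Positive Rate (Sensitivity): share of actual positives correctly identified.
  tpr = TP / (TP + FN)   # 40 / 50 = 0.8

  # True Negative Rate (Specificity): share of actual negatives correctly identified.
  tnr = TN / (TN + FP)   # 35 / 50 = 0.7

  print(f"TPR = {tpr:.2f}, TNR = {tnr:.2f}")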

...

Measure                 Available for
Confusion Matrix        • Binary classification models
                        • Multi-class classification models
Accuracy                • Binary classification models
                        • Multi-class classification models
ROC Curve               • Binary classification models
AUC                     • Binary classification models
Feature Importance      • Binary classification models
                        • Numerical prediction models
Predicted vs Actual     • Binary classification models
                        • Multi-class classification models
MSE                     • Numerical prediction models
Residual Plot           • Numerical prediction models
Precision and Recall    • Binary classification models
F1 Score                • Binary classification models

Confusion Matrix

...

If the above conditions are not satisfied, it is possible that some missing or hidden factors (predictor variables) have not been taken into account. The residual plot is available for numerical prediction models. You can select a dataset feature to be plotted against its residuals.
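
As an illustrative sketch (assuming Python with NumPy and Matplotlib; the feature, y_true, and y_pred arrays are synthetic stand-ins for a real dataset and model output), a residual plot for a selected feature can be drawn like this:

  import numpy as np
  import matplotlib.pyplot as plt

  # Synthetic stand-ins: a selected feature and the model's numerical predictions.
  rng = np.random.default_rng(0)
  feature = rng.uniform(0.0, 10.0, 200)            # the dataset feature chosen for the plot
  y_true = 3.0 * feature + rng.normal(0.0, 1.0, 200)  # observed target values
  y_pred = 3.0 * feature                           # model predictions for the same rows

  residuals = y_true - y_pred                      # residual = observed - predicted

  plt.scatter(feature, residuals, s=10)
  plt.axhline(0, color="red", linewidth=1)         # residuals should scatter evenly around zero
  plt.xlabel("Selected feature")
  plt.ylabel("Residual (observed - predicted)")
  plt.title("Residual plot")
  plt.show()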

Precision and Recall


Precision and Recall are performance measures used to evaluate search strategies. They are typically used in document retrieval scenarios.

When a search is carried out on a set of records in a database, some of the records are relevant to the search and the rest are irrelevant. However, the actual set of records retrieved may not perfectly match the set of records that are relevant to the search. Based on this, Precision and Recall can be described as follows.

Measure    Definition                                                         Formula
Precision  The number of records relevant to the search that are retrieved,   TP / (TP + FP)
           as a percentage of the total number of records retrieved.
Recall     The number of records relevant to the search that are retrieved,   TP / (TP + FN)
           as a percentage of the total number of records that are relevant
           to the search. (Note: this is the same as the TPR.)
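
A minimal sketch in plain Python (the counts are hypothetical) showing how the two measures are computed from a retrieval outcome:

  # Hypothetical retrieval outcome.
  TP = 30   # relevant records that were retrieved
  FP = 10   # irrelevant records that were retrieved
  FN = 20   # relevant records that were missed

  precision = TP / (TP + FP)   # 30 / 40 = 0.75: how much of what was retrieved is relevant
  recall    = TP / (TP + FN)   # 30 / 50 = 0.60: how much of what is relevant was retrieved

  print(f"precision = {precision:.2f}, recall = {recall:.2f}")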


F1 Score


The F1 Score combines Precision and Recall into a single value: their harmonic mean. It is expressed as a value between 0 and 1, where 0 indicates the worst performance and 1 indicates the best performance.

F1 = 2TP / (2TP + FP + FN)
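
Continuing the hypothetical counts from the Precision and Recall sketch above, a quick check that the count-based formula matches the harmonic mean of the two measures:

  TP, FP, FN = 30, 10, 20

  precision = TP / (TP + FP)                          # 0.75
  recall    = TP / (TP + FN)                          # 0.60
  f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of the two

  # Equivalent closed form computed directly from the counts.
  assert abs(f1 - 2 * TP / (2 * TP + FP + FN)) < 1e-12

  print(f"F1 = {f1:.3f}")   # 0.667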