...

Measure | Available for
Confusion Matrix | Binary classification models; Multi-class classification models
Accuracy | Binary classification models; Multi-class classification models
ROC Curve | Binary classification models
AUC | Binary classification models
Feature Importance | Binary classification models; Numerical prediction models
Predicted vs Actual | Binary classification models; Multi-class classification models
MSE | Numerical prediction models
Residual Plot | Numerical prediction models
Precision and Recall | Binary classification models; Anomaly detection models
F1 Score | Binary classification models; Anomaly detection models

Confusion Matrix

...

If the above conditions are not satisfied, it is possible that there are missing or hidden factors (predictor variables) that have not been taken into account. The residual plot is available for numerical prediction models. You can select a dataset feature to be plotted against its residuals.
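For illustration, the sketch below reproduces a basic residual plot in Python; the feature values, actual values, and predictions are hypothetical placeholders, not output from the product.

```python
# Minimal sketch of a residual plot: residual = actual - predicted,
# scattered against one selected dataset feature (hypothetical data).
import matplotlib.pyplot as plt
import numpy as np

feature = np.array([23, 31, 45, 52, 60])         # selected dataset feature (placeholder values)
actual = np.array([1.2, 1.8, 2.9, 3.4, 4.1])     # observed target values
predicted = np.array([1.1, 2.0, 2.7, 3.6, 3.9])  # model predictions

residuals = actual - predicted

plt.scatter(feature, residuals)
plt.axhline(0, linestyle="--")  # residuals should scatter randomly around zero
plt.xlabel("Selected feature")
plt.ylabel("Residual (actual - predicted)")
plt.title("Residual plot")
plt.show()
```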

Precision and Recall

Precision and Recall are performance measures used to evaluate search strategies. They are typically used in document retrieval scenarios.

...

Measure | Definition | Formula
Precision | The number of records relevant to the search that are retrieved, as a percentage of the total number of records retrieved; in other words, the fraction of selected items that are relevant. | TP / (TP + FP)
Recall | The number of records relevant to the search that are retrieved, as a percentage of the total number of records that are relevant to the search; in other words, the fraction of relevant items that are selected. This is the same as the true positive rate (TPR). | TP / (TP + FN)

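As a minimal sketch of the two formulas above, the following Python snippet computes precision and recall from hypothetical confusion-matrix counts (the values are illustrative only):

```python
# Precision and recall from confusion-matrix counts (hypothetical values).
tp = 80  # true positives: relevant records that were retrieved
fp = 20  # false positives: retrieved records that are not relevant
fn = 10  # false negatives: relevant records that were not retrieved

precision = tp / (tp + fp)  # fraction of selected items that are relevant
recall = tp / (tp + fn)     # fraction of relevant items that are selected (TPR)

print(f"Precision = {precision:.2f}")  # 0.80
print(f"Recall    = {recall:.2f}")     # 0.89
```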

F1 Score

The F1 Score is the harmonic mean of Precision and Recall: F1 = 2 * (Precision * Recall) / (Precision + Recall). It is expressed as a value between 0 and 1, where 0 indicates the worst performance and 1 indicates the best performance.
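A minimal sketch of this formula in Python, using hypothetical precision and recall values:

```python
# F1 Score as the harmonic mean of precision and recall (hypothetical values).
precision = 0.80
recall = 0.89

f1 = 2 * precision * recall / (precision + recall)
print(f"F1 Score = {f1:.2f}")  # 0.84
```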

...