autointent.metrics.decision.decision_roc_auc
- autointent.metrics.decision.decision_roc_auc(y_true, y_pred)
Calculate ROC AUC for multiclass and multilabel classification.
The ROC AUC measures the ability of a model to distinguish between classes. It is calculated as the area under the curve of the true positive rate (TPR) against the false positive rate (FPR) at various threshold settings.
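In symbols, this is the standard definition of ROC AUC as the integral of the TPR viewed as a function of the FPR over the unit interval (a general formula, not code taken from this library):

$$
\mathrm{ROC\ AUC} = \int_0^1 \mathrm{TPR}(\mathrm{FPR}) \,\mathrm{d}\,\mathrm{FPR}
$$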
- Parameters:
y_true (autointent.metrics.custom_types.LABELS_VALUE_TYPE) – True labels
y_pred (autointent.metrics.custom_types.LABELS_VALUE_TYPE) – Predicted labels
- Returns:
ROC AUC score of the decisions
- Return type:
float
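A minimal usage sketch, assuming the function is importable from the module named above and that multilabel inputs are passed as rows of binary indicator vectors; the data below is hypothetical, not taken from the library's tests:

```python
# Hypothetical multilabel example: each row is a binary indicator vector
# over 3 classes (assumed input format, not verified against the library).
from autointent.metrics.decision import decision_roc_auc

y_true = [
    [1, 0, 1],
    [0, 1, 0],
    [1, 1, 0],
]
y_pred = [
    [1, 0, 1],
    [0, 1, 1],
    [1, 0, 0],
]

score = decision_roc_auc(y_true, y_pred)
print(score)  # a single float score (assumed to lie in [0, 1])
```

Note that with hard binary decisions rather than continuous scores, the ROC curve reduces to a single operating point per class, so the resulting score summarizes that single TPR/FPR trade-off.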