Add Precision@k and Recall@k metrics #554

Open
Andron00e opened this issue Mar 2, 2024 · 0 comments
Comments

The Precision and Recall metrics currently supported by evaluate are only sklearn clones. It would be great to add top-k versions of those metrics.

For example, a simple implementation of the P@k metric is:

import pandas as pd
from sklearn import metrics

def precision_at_k(y_true, y_score, k):
    # Sort descending so the top-scoring fraction k of items comes first.
    df = pd.DataFrame({'true': y_true, 'score': y_score}).sort_values('score', ascending=False)
    # Predict positive for the top fraction k of items, negative for the rest.
    y_pred = [1 if i < int(k * len(df)) else 0 for i in range(len(df))]
    return metrics.precision_score(df['true'], y_pred)
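
A Recall@k counterpart could be sketched the same way (just a sketch along the same lines, reusing the imports and the fraction-based k from above, not an existing evaluate API):

def recall_at_k(y_true, y_score, k):
    # Same top-k selection as above, but score the predictions with recall instead.
    df = pd.DataFrame({'true': y_true, 'score': y_score}).sort_values('score', ascending=False)
    y_pred = [1 if i < int(k * len(df)) else 0 for i in range(len(df))]
    return metrics.recall_score(df['true'], y_pred)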