
Performance Metrics #625

Open
AlexInJar opened this issue Sep 19, 2022 · 1 comment
Labels
enhancement New feature or request

Comments

@AlexInJar commented Sep 19, 2022

May I ask how a true positive is determined for a cell detection instance in the metric? Is there an IoU threshold that decides whether a detection counts as a true positive?
Also, does the deepcell-toolbox package support the AUC score and other metrics related to instance segmentation?

Thanks

@AlexInJar added the enhancement (New feature or request) label on Sep 19, 2022
@AlexInJar changed the title from "State of Art Performance Table" to "Performance Metrics" on Sep 19, 2022
@ngreenwald (Collaborator) commented

True positives are determined using a matching algorithm. You can take a look at the code to see how it's implemented; the Methods section of the Mesmer paper also discusses it.

We don't report AUC, but F1, precision, and recall are reported by default. However, the predictions use the same format as most other segmentation algorithms (integer cell id masks), so you can feed them into whatever evaluation code you like to generate additional metrics.
