
[FEATURE] log inference times #491

Open
neuronflow opened this issue Sep 28, 2023 · 1 comment

neuronflow commented Sep 28, 2023

Is your feature request related to a problem? Please describe.
We need to know the inference times of algorithms per exam.

Describe the solution you'd like
MedPerf should log and return inference times in milliseconds.

Additional context
Email conversation with Hasan and Verena; MICCAI challenge guidelines (attached: BIAS_ReportingGuideline.pdf).

neuronflow added the type: enhancement (New feature or request) label on Sep 28, 2023
neuronflow (Author) commented:

PS: ideally, the timings could be separated into:

  • medperf housekeeping
  • loading model weights
  • actual inference
  • computing metrics

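The per-stage breakdown above could be collected with a simple stage timer. The sketch below is a minimal, hypothetical illustration (the `StageTimer` class and stage names are not part of MedPerf's API); it records wall-clock durations in milliseconds per named stage:

```python
import time
from contextlib import contextmanager


class StageTimer:
    """Collects wall-clock durations (in milliseconds) per named stage."""

    def __init__(self):
        self.timings_ms = {}

    @contextmanager
    def stage(self, name):
        # Measure elapsed time around the body of the `with` block.
        start = time.perf_counter()
        try:
            yield
        finally:
            self.timings_ms[name] = (time.perf_counter() - start) * 1000.0


timer = StageTimer()

with timer.stage("loading_model_weights"):
    time.sleep(0.01)  # stand-in for loading model weights

with timer.stage("inference"):
    time.sleep(0.02)  # stand-in for actual inference

# Each stage's duration is now available in milliseconds, e.g. for logging.
print(timer.timings_ms)
```

The same pattern would extend to the other stages listed above ("medperf housekeeping", "computing metrics"), and the resulting dictionary could be emitted alongside the benchmark results.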
I have probably missed some relevant points.
