
Benchmarking functionalities that are not part of the main workflow #21

Status: Open
sfmig opened this issue on Oct 2, 2023 · 1 comment
Labels: question (Further information is requested)

sfmig (Contributor) commented on Oct 2, 2023

We are defining workflows that are most representative of the user's experience, with the idea of benchmarking them.

However, some functions are not part of the basic workflows (e.g. loading the output XML file that contains the results of the analysis).

Should we include these as simpler workflows here (like an XML-loading one), or should we benchmark these modules / functions individually (maybe following the structure of the Python modules, as in our initial asv work)?
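As a rough sketch, the individual-function option could be a small standalone asv benchmark class (the `load_output_xml` import, file names, and sample-data path below are placeholders, not the project's actual API):

```python
# benchmarks/benchmark_xml_loading.py  (file name is illustrative)
from pathlib import Path

# Placeholder import: swap in the package's actual XML-reading function.
from my_package.io import load_output_xml  # hypothetical


class TimeLoadOutputXML:
    """Time loading the output XML file produced by the analysis."""

    # Small sample output file committed alongside the benchmarks (assumed path).
    sample_xml = Path(__file__).parent / "data" / "sample_output.xml"

    def setup(self):
        # asv skips a benchmark whose setup raises NotImplementedError,
        # so missing sample data is skipped rather than timed as an error.
        if not self.sample_xml.exists():
            raise NotImplementedError("sample output XML not available")

    def time_load_output_xml(self):
        # asv times the body of any method prefixed with `time_`.
        load_output_xml(self.sample_xml)
```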

sfmig added the question label on Oct 2, 2023
adamltyson (Member) commented:
It would be great if these could be benchmarked, but it's maybe not a top priority. If they're straightforward to do, though, then great. Ideally, as much as possible would be benchmarked in the repo (all the quick stuff), as with your previous asv work.
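For the quick stuff, asv also collects plain module-level `time_*` functions, so lightweight per-function benchmarks could live in files mirroring the package's module layout, for example (file and function names are illustrative):

```python
# benchmarks/benchmark_io.py  (one benchmark file per package module; names illustrative)
import xml.etree.ElementTree as ET


def time_parse_small_xml():
    # asv collects any module-level function prefixed with `time_`.
    # Stand-in for a quick, deterministic unit of work; replace with a call
    # into the relevant module of the package under benchmark.
    ET.fromstring("<analysis><cell x='1' y='2' z='3'/></analysis>")
```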

Project status: Backlog