Benchmark Creation using Python Package #161

Open
nquetschlich opened this issue Jan 23, 2023 · 1 comment
Labels
enhancement (New feature or request) · good first issue (Good for newcomers)

Comments

@nquetschlich (Collaborator)

Currently, each benchmark is newly created by the Python package whenever a get_benchmark method call is executed.
This could be improved, since many benchmarks are already created and available in the database of the MQT Bench webserver.
Therefore, the following steps would be useful when get_benchmark is called (a rough sketch of this fallback order follows the list):

  1. Check whether the MQT Bench database has been downloaded locally. If so, check whether the benchmark is already part of it.
  2. If the database has not been downloaded locally but the benchmark is part of it, retrieve the benchmark via the MQT Bench webpage.
  3. Only if 1) and 2) are not successful, create the benchmark locally as it is currently implemented.
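A minimal sketch of that fallback order: the cache directory ~/.mqt_bench, the webserver endpoint URL, and the use of OpenQASM files are placeholders for illustration only; only the final mqt.bench.get_benchmark call is the existing API.

```python
from pathlib import Path

import requests
from qiskit import QuantumCircuit

# Placeholder locations -- the real database path and webserver endpoint may differ.
LOCAL_DB_DIR = Path.home() / ".mqt_bench"
MQT_BENCH_URL = "https://www.cda.cit.tum.de/mqtbench"


def get_benchmark_with_caching(benchmark_name: str, circuit_size: int) -> QuantumCircuit:
    """Return a benchmark, preferring cached or online copies over local generation."""
    qasm_file = LOCAL_DB_DIR / f"{benchmark_name}_{circuit_size}.qasm"

    # 1) Use the locally downloaded MQT Bench database if the file is present.
    if qasm_file.exists():
        return QuantumCircuit.from_qasm_file(str(qasm_file))

    # 2) Otherwise, try to fetch the single benchmark from the webserver.
    try:
        response = requests.get(
            f"{MQT_BENCH_URL}/benchmark/{benchmark_name}/{circuit_size}", timeout=10
        )
        if response.ok:
            qasm_file.parent.mkdir(parents=True, exist_ok=True)
            qasm_file.write_text(response.text)  # cache the file for the next call
            return QuantumCircuit.from_qasm_str(response.text)
    except requests.RequestException:
        pass  # offline or server unreachable -- fall through to local generation

    # 3) Fall back to generating the benchmark locally, as currently implemented.
    from mqt.bench import get_benchmark

    return get_benchmark(benchmark_name, "alg", circuit_size)
```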
@nquetschlich nquetschlich added the enhancement New feature or request label Jan 23, 2023
@burgholzer (Member)

Just dumping some ideas on this:

  • this should, most likely, be an option instead of a default behavior, so that a user can really choose the behavior they want, e.g., caching=off|local|online|automatic, where local means 1), online means 2), and automatic is the combination of both (see the sketch after this list)
  • I am not yet sure whether this feature should be opt-in or opt-out and what the appropriate default should be.
  • People might want to use the package offline or might not want it to access the internet or download lots of files. This has to be kept in mind and should somehow be reflected in the implementation.
  • (Optional, wild and crazy idea) Could there be a way to submit generated benchmarks to the online repository/website? This would need some kind of approval from us as maintainers. Could you somehow programmatically generate a pull request that "proposes" a new zip file with additional files? Or trigger a GitHub workflow that takes the latest zip file and augments it with some newly added files and attaches them to a release that states the added files in the release notes? Probably, this should be its own issue, but since it's a rather crazy idea, I'll just keep it here for now.
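To make the first bullet concrete, here is a minimal sketch of what such a caching option could look like; the enum, the parameter name, and the OFF default are purely illustrative, not a decided API:

```python
from enum import Enum


class CachingMode(str, Enum):
    OFF = "off"              # always generate locally (current behavior)
    LOCAL = "local"          # only look in a locally downloaded database, i.e., 1)
    ONLINE = "online"        # only query the MQT Bench webserver, i.e., 2)
    AUTOMATIC = "automatic"  # combine 1) and 2) before falling back to generation


def get_benchmark(benchmark_name: str, circuit_size: int,
                  caching: CachingMode = CachingMode.OFF):
    """Hypothetical extension of the existing signature with a 'caching' option."""
    if caching in (CachingMode.LOCAL, CachingMode.AUTOMATIC):
        ...  # try the locally downloaded MQT Bench database first
    if caching in (CachingMode.ONLINE, CachingMode.AUTOMATIC):
        ...  # then try to fetch the benchmark from the webserver
    ...  # otherwise, create the benchmark locally as it is currently implemented
```

Keeping OFF as the default would make the feature opt-in and preserve the current fully offline behavior, which also addresses the third bullet.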

@burgholzer burgholzer added the good first issue Good for newcomers label Jan 23, 2023