
Feature request: Caching precomputations #560

Open
cascremers opened this issue Jun 29, 2023 · 0 comments
Comments

cascremers (Member) commented Jun 29, 2023

At the moment, a large model can take a long time to load due to complex precomputations. If a small change is made to the model, the precomputations are redone from scratch, even when the change (e.g. editing a lemma) does not affect them.

It would be desirable to cache precomputation results and avoid this needless work.

One possible solution involves:

  • Computing a fingerprint of the precomputation-relevant data (e.g. a hash over the partial input, excluding lemmas etc.).
  • Storing precomputation data on disk (e.g. in a temp folder), keyed by fingerprint.
  • When precomputation is needed, checking whether the data is already present on disk; if so, reloading it, and otherwise precomputing and saving to disk.

Minor note: the fingerprint should probably also include the Maude/Tamarin versions and the command-line options, since a change in either can invalidate cached results.
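The scheme above could be sketched roughly as follows. This is a hypothetical illustration, not Tamarin code (Tamarin itself is written in Haskell): the cache location, the `precompute` callback, and the serialization format are all placeholder assumptions.

```python
import hashlib
import pickle
from pathlib import Path

# Hypothetical cache location; a real implementation would respect
# platform conventions (e.g. XDG cache dirs) and handle eviction.
CACHE_DIR = Path("/tmp/tamarin-precomp-cache")

def fingerprint(relevant_input: str, tool_versions: str, options: list) -> str:
    """Hash the precomputation-relevant input together with the tool
    versions and command-line options, per the note above."""
    h = hashlib.sha256()
    h.update(relevant_input.encode("utf-8"))
    h.update(tool_versions.encode("utf-8"))
    for opt in sorted(options):  # sort so option order does not change the key
        h.update(opt.encode("utf-8"))
    return h.hexdigest()

def cached_precompute(relevant_input, tool_versions, options, precompute):
    """Reload precomputation results from disk if present; otherwise run
    `precompute` (a placeholder for the expensive step) and store the result
    under the fingerprint."""
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    path = CACHE_DIR / (fingerprint(relevant_input, tool_versions, options) + ".pkl")
    if path.exists():
        with path.open("rb") as f:
            return pickle.load(f)
    result = precompute(relevant_input)
    with path.open("wb") as f:
        pickle.dump(result, f)
    return result
```

With this shape, a second run on an unchanged theory (same input, versions, and options) hits the cache and skips `precompute` entirely, while editing only a lemma would leave `relevant_input` unchanged and likewise reuse the cached data.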

@cascremers cascremers changed the title Caching of precomputations? Feature request: Caching precomputations Jun 29, 2023