At the moment, a large model can take a long time to load due to complex precomputations. Even a small change to the model that does not affect the precomputations (e.g. editing a lemma) causes them to be redone from scratch.
It would be desirable to cache these results and avoid needless precomputation.
One possible solution involves:
- Computing a fingerprint of the precomputation-relevant input (e.g. a hash over the model with lemmas and other irrelevant parts excluded).
- Storing the precomputed data on disk (e.g. in a temp folder), keyed by fingerprint.
- When precomputation is needed, checking whether a matching entry exists on disk and reloading it if so; otherwise precomputing and saving the result to disk.
Minor note: the fingerprint should probably also include the Maude/Tamarin versions and the command-line options, since these can affect the precomputed data.
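The scheme above could be sketched roughly as follows. This is just an illustration in Python, not Tamarin's actual code; the names (`fingerprint`, `load_or_precompute`) and the JSON cache format are made up for the example, and stripping lemmas from the input is assumed to happen before hashing.

```python
import hashlib
import json
import os
import tempfile

def fingerprint(model_text, tool_versions, options):
    # Hash only the precomputation-relevant input. The caller is assumed
    # to have already stripped lemmas and other irrelevant parts from
    # model_text. Versions and options are included per the note above.
    h = hashlib.sha256()
    h.update(model_text.encode("utf-8"))
    h.update(json.dumps(tool_versions, sort_keys=True).encode("utf-8"))
    h.update(json.dumps(sorted(options)).encode("utf-8"))
    return h.hexdigest()

def load_or_precompute(key, precompute, cache_dir=None):
    # Look up the cache entry by fingerprint; on a miss, run the
    # (expensive) precomputation and persist its result to disk.
    cache_dir = cache_dir or os.path.join(tempfile.gettempdir(), "tamarin-cache")
    os.makedirs(cache_dir, exist_ok=True)
    path = os.path.join(cache_dir, key + ".json")
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)
    result = precompute()
    with open(path, "w") as f:
        json.dump(result, f)
    return result
```

The point of including versions and options in the hash is that two runs only share a cache entry when every input that could influence the precomputed data is identical; anything else risks silently reloading stale results.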