
Inconsistent behavior of regression tests due to derivation check #609

Open

kevinmorio opened this issue Nov 24, 2023 · 2 comments

@kevinmorio (Contributor)

The default timeout of the derivation checks introduced in PR ?? causes inconsistent behavior when running the regression tests: since the timeout defaults to 5 seconds, the result of a derivation check (success or timeout) depends on the performance of the system on which it is executed.

The regression tests contain the following case studies whose derivation checks time out:

  • case-studies-regression/fast-tests/related_work/YubiSecure_KS_STM12/Yubikey_and_YubiHSM_multiset_analyzed.spthy
  • case-studies-regression/sapic/fast/feature-secret-channel/U2F_analyzed.spthy
  • case-studies-regression/sapic/slow/PKCS11/pkcs11-templates_analyzed.spthy
  • case-studies-regression/ake/dh/DHKEA_NAXOS_C_eCK_PFS_partially_matching_analyzed.spthy
  • case-studies-regression/ake/dh/DHKEA_NAXOS_C_eCK_PFS_keyreg_partially_matching_analyzed.spthy

The first of these is the reason the current regression tests fail: the GitHub runner is fast enough that the derivation check succeeds, so the output no longer matches the recorded result, which contains the timeout. The other four case studies are not part of the default regression tests.
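
To illustrate the failure mode, here is a minimal sketch (not code from the repository) of how a fixed wall-clock timeout makes machine speed part of the test oracle; the command and the harness function are illustrative, only the 5-second default mirrors the behavior described above:

```python
import subprocess

def run_derivation_check(cmd: list[str], timeout: float = 5.0) -> str:
    """Run one derivation check and classify its outcome.

    `cmd` and this helper are hypothetical, not the actual test-harness
    interface; the 5-second default matches the timeout discussed above.
    """
    try:
        subprocess.run(cmd, timeout=timeout)
        return "success"   # fast machine: the check finishes in time
    except subprocess.TimeoutExpired:
        return "timeout"   # slow machine: same input, different result

# The recorded expected output fixes one of the two outcomes, so the
# regression test passes or fails depending on the machine it runs on.
```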

@jdreier (Member) commented Dec 5, 2023

Hmm, I guess that to avoid this issue we need to either deactivate the derivation checks for most regression tests (except those that test the derivation checks themselves), so that results are stable in most cases, or increase the timeout (which might cause longer run times and still not guarantee stable results). Any opinions?
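
As a sketch of the first option, the harness could disable the checks by default and enable them only for the dedicated test cases; the flag, allow-list, and helper below are hypothetical, not Tamarin's actual interface:

```python
# Hypothetical allow-list of tests that exercise the derivation checks.
DERIVCHECK_TESTS = {"feature-derivation-checks"}

def prover_args(test_name: str) -> list[str]:
    """Build the prover invocation for one regression test (illustrative)."""
    args = ["tamarin-prover", "--prove"]
    if test_name not in DERIVCHECK_TESTS:
        # Hypothetical option: a timeout of 0 deactivates the check,
        # so the result no longer depends on machine speed.
        args.append("--derivation-check-timeout=0")
    return args
```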

@rsasse (Member) commented Dec 5, 2023

I would be fine with turning them off for the regression tests, except where they are tested themselves. This is the cleaner (and faster-running) solution.
