Added A Test Policy #4908
Conversation
_docs/policies/tests.md (Outdated)

> - Prove that the functionality performs as described
> - Can be falsified
> - Are resistant to fuzzing
> - All new tests MUST BE validated via 1000 local tests. Ensuring that the test runs and passes consistently every time gives confidence that the test is not flaky.
I sort of disagree with this sentence. Sort of, because perhaps we should differentiate the types of test we want to rerun hundreds of times: a good portion of unit tests are so deterministic in nature that this recommendation seems out of place, or wasted effort to even worry about. Could we change the wording to something more flexible and realistic?

The tricky thing is that we haven't ever (or have we?) agreed on how to classify status-go's tests. If they all go in the same bucket as "tests", it becomes harder for us to express which ones we really want devs to increase `-count` for.
I guess we should say that integration tests should be rerun
That is reasonable. Let's change the language to:
- All new ~~tests MUST BE validated via 1000~~ **integration tests MUST BE validated via at least 10000** local tests. Ensuring that the test runs and passes consistently every time gives confidence that the test is not flaky.
I also agree with you @ilmotta we should have categories of test. Maybe we can open an issue and place a link to it in the Go Council page to mark this work as needing consensus.
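For reference, a rerun validation like the one suggested above can be done natively with `go test`'s `-count` flag. The sketch below shows the native command (test name and package path are hypothetical placeholders) plus the same fail-fast rerun idea as a portable shell helper — illustrative only, not part of the policy:

```shell
# Native approach: rerun a single test 10000 times.
# (TestFoo and ./protocol/... are hypothetical placeholders.)
#   go test -run '^TestFoo$' -count 10000 -failfast ./protocol/...

# The same fail-fast rerun idea as a plain shell helper:
run_n() {
  n="$1"; shift
  i=1
  while [ "$i" -le "$n" ]; do
    # Stop at the first failing run and report which run it was.
    "$@" || { echo "failed on run $i"; return 1; }
    i=$((i + 1))
  done
  echo "all $n runs passed"
}

# Demo with a command that always succeeds:
run_n 5 true
```

The native `-count` form is preferable in practice since it compiles the test binary once; the loop is only there to make the fail-fast semantics explicit.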
I'm also happy to add a definition of "integration tests."
@Samyoul make an issue and define
This is done #4935
Nice policy. I agree with it!
_docs/policies/tests.md (Outdated)

> - Prove that the functionality performs as described
> - Can be falsified
> - Are resistant to fuzzing
> - All new tests MUST BE validated via 1000 local tests. Ensuring that the test runs and passes consistently every time gives confidence that the test is not flaky.
I guess we should say that integration tests should be rerun
Thanks for this. I believe this is an ideal candidate for a flowchart (Mermaid). It will be much more readable. (Come on, we are developers 🤓 )
```mermaid
flowchart TB
    A[Start: Before merging] --> B{Check rerun reports}
    B -->|Yes| C[Check if test is in flaky tests GitHub issues or in the last three nightly test runs]
    C -->|No| D[Flakiness likely introduced by your changes]
    D --> E[Address flakiness]
    E --> B
    C -->|Yes| F[Add a comment to the existing issue or open a new issue if none exists]
    B -->|No| G[Proceed with merge]
    F --> G
```
^ It's merely a draft, needs more love. If you don't have time for it, I can come up with something more mature at the end of the week.
_docs/policies/tests.md (Outdated)

> - Prove that the functionality performs as described
> - Can be falsified
> - Are resistant to fuzzing
> - All new tests MUST BE validated via 1000 local tests. Ensuring that the test runs and passes consistently every time gives confidence that the test is not flaky.
Would be great to automate it and run on CI. What is not a CI check, will never be respected. Some will forget, some will not know (new contributors), some will not run to save time.
I agree with your points. For now we can include it as part of the policy as this is the behaviour we want regardless of how we enforce the requirement.
I'll raise an issue to explore / implement the automation of "new integration test mega runs". I'll track it via the Go Council project board ... Which doesn't exist yet but we probably should have it as we have multiple improvements to track, and this improvement is dependent on being able to categorise certain tests as "integration test".
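As a rough illustration of what such a CI check could look like (entirely hypothetical — nothing like this exists in the repo yet): extract newly added `Test*` functions from the PR diff and build the high-`-count` rerun command for them. In a real pipeline the diff would come from something like `git diff origin/develop...HEAD`; here it is inlined so the sketch is self-contained:

```shell
# Hypothetical CI sketch: find test functions added in a diff and build the
# mega-run command for them. The diff content and test names are invented.
diff='
+func TestNewThing(t *testing.T) {
 func TestOldThing(t *testing.T) {
+func TestAnotherNewThing(t *testing.T) {
'

# Lines starting with "+func Test..." are newly added test functions.
new_tests=$(printf '%s\n' "$diff" | sed -n 's/^+func \(Test[A-Za-z0-9_]*\)(.*/\1/p')

if [ -n "$new_tests" ]; then
  # Join the names into a -run regex alternation.
  pattern=$(printf '%s\n' "$new_tests" | paste -sd'|' -)
  echo "go test -run \"^($pattern)\$\" -count 10000 ./..."
fi
```

A real check would also need the test-categorisation work mentioned above, so that only `integration tests` trigger the expensive rerun.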
@Samyoul make an issue and somewhere to track Go Council related things
OK, this issue is now done: #4952
Thank you for all the feedback, and so quickly. I will get back to you on each point. Also, I should probably not have put so many jokes in the document; I'll fix that and turn this into a serious policy without the memes and edgelording. Also 🤦 this is just perfection #4908 (comment)
Force-pushed from b611df7 to 6602d49 (Compare)
Thank you for your effort ❤️ @Samyoul
Thank you! ❤️
I really do appreciate it!
The changes of the PR are basically the description. This is based on a `status-go` crew meeting: https://www.notion.so/Sync-Call-67878e2bdbe447ddab0ebc462be8e64f

The human-readable policy is below:

# status-go Test Policy

## Creating Tests

- New `integration tests` MUST BE validated via a minimum of 1000 tests, by passing the `-count` or `-test.count` flag to the test command, e.g. `-count 1000` / `-test.count 1000`.
- TODO: Add link to issue for CI automation of validation test runs of new `integration tests`.

## Flaky Tests

Flaky tests are defined as tests that fail intermittently.

Steps to resolving or reporting flaky tests:

### Is it me?

Determine who caused the flaky test.

- TODO: add link to rerun reports

### Reporting Flaky Tests

If an old test fails and/or seems flaky, either locally or in CI, you MUST report the event. Check the `status-go` GitHub repo issues for the test name(s) failing.
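The "Is it me?" step above could look something like this in practice. The report format and test names below are invented placeholders, since the actual rerun reports are still a TODO:

```shell
# Hypothetical sketch: check whether a failing test already appears in recent
# rerun reports before assuming your changes introduced the flakiness.
# The report contents and test names are invented for illustration.
report='
TestWaku_Sync: 3 reruns in last three nightly runs
TestMessenger_Send: 0 reruns
'

failing="TestWaku_Sync"

if printf '%s\n' "$report" | grep -q "^$failing:"; then
  echo "$failing is already known flaky: comment on the existing issue"
else
  echo "$failing is newly flaky: likely introduced by your changes"
fi
```

With real rerun reports published somewhere fetchable, the same grep could run locally or as a pre-merge hint.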