
Adding a new strategy gotchas #1370

Open
marcharper opened this issue Sep 9, 2020 · 2 comments

@marcharper (Member)

In PR #1364, adding a new strategy tripped some of the tests for the meta strategies because it induced a change in the behavior of those strategies. This is not ideal, since it adds complexity to the process of adding new strategies.

There are various ways we could mitigate this:

  • Simply remove the tests -- they don't really test "expected behaviors" and basically just detect that something changed. We could still run a test of some sort to make sure nothing breaks, just not care about the output
  • Following the suggestion in Idea: New way to organize many tests #1353, as part of the testing / presubmit process (see Presubmit script and pin all dependency versions #1360), we could detect these test changes and suggest fixes
  • Pin the strategies that go into the meta strategies, so their behavior doesn't actually change
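To illustrate the third option, here is a minimal standalone sketch of what "pinning" a meta strategy's team could look like. The class names, the `strategy_registry` list, and the `build_team` helper are all hypothetical stand-ins for illustration, not the axelrod API: the point is that a frozen list of names keeps the team stable even as the registry grows.

```python
# Hypothetical sketch of "pinning" a meta strategy's team so that adding
# new strategies to the library does not change the meta player's behavior.
# All names here (Strategy, strategy_registry, build_team) are stand-ins.

class Strategy:
    name = "Base"
    def strategy(self, opponent_history):
        return "C"

class TitForTat(Strategy):
    name = "Tit For Tat"
    def strategy(self, opponent_history):
        return opponent_history[-1] if opponent_history else "C"

class Defector(Strategy):
    name = "Defector"
    def strategy(self, opponent_history):
        return "D"

# Unpinned: the team is "whatever strategies exist right now", so any new
# strategy added to the registry silently changes meta behavior (and tests).
strategy_registry = [TitForTat, Defector]

# Pinned: an explicit, frozen list of team-member names. Adding a new
# strategy to the registry leaves this team, and its behavior, unchanged.
PINNED_TEAM = ("Tit For Tat", "Defector")

def build_team(pinned=True):
    if pinned:
        by_name = {cls.name: cls for cls in strategy_registry}
        return [by_name[name]() for name in PINNED_TEAM]
    return [cls() for cls in strategy_registry]
```

With this setup, a new strategy appended to `strategy_registry` changes the unpinned team but not the pinned one, which is exactly the property the meta-strategy tests were accidentally sensitive to.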

At the least we should update the documentation. There are also documentation tests that tend to fail when adding a new strategy (counts of how many strategies there are, for example), but those seem to serve a somewhat useful purpose.

@drvinceknight (Member)

Gosh sorry I've just seen your comment on #1364, my bad for missing it.

> Simply remove the tests -- they don't really test "expected behaviors" and basically just detect that something changed. We could still run a test of some sort to make sure nothing breaks, just not care about the output

I think I'm in favour of that approach. It's not ideal but is probably the most sensible on balance.

marcharper added a commit that referenced this issue Sep 10, 2020
Removes some tests that do not target specific strategy behaviors and can change when new strategies are added.
@marcharper (Member, Author)

There are already some tests in TestPlayer that do exactly as described above, e.g. tests that a clone and the original produce the same play against a small set of opponents. So I guess we just remove the extra tests, which I did in #1373.

Nikoleta-v3 pushed a commit that referenced this issue Sep 25, 2020
* [#1370] Remove some tests from test_meta.py

Removes some tests that do not target specific strategy behaviors and can change when new strategies are added.

* Add a property based test for valid strategies

Not sure if this is overkill @marcharper, just thought it could be worth
having.
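A rough sense of what such a "valid strategies" property test might assert: every registered strategy, whatever its internals, only ever returns a valid action for any history. The sketch below is a standalone stand-in (no hypothesis, no axelrod; `registry`, `check_strategies_are_valid`, and the player classes are hypothetical), using random sampling as a lightweight substitute for a property-based testing library.

```python
# Hypothetical sketch of a property test: for randomly sampled opponent
# histories, every strategy in the registry returns a valid action.
# All names here are stand-ins, not the axelrod API.
import random

VALID_ACTIONS = {"C", "D"}

class Cooperator:
    def strategy(self, opponent_history):
        return "C"

class RandomPlayer:
    def strategy(self, opponent_history):
        return random.choice(["C", "D"])

registry = [Cooperator, RandomPlayer]

def check_strategies_are_valid(trials=50, max_history=10):
    for cls in registry:
        player = cls()
        for _ in range(trials):
            history = random.choices(["C", "D"], k=random.randint(0, max_history))
            assert player.strategy(history) in VALID_ACTIONS, cls.__name__

check_strategies_are_valid()
```

A real property-based framework such as hypothesis would generate and shrink these histories automatically, but even this crude version catches strategies that return something outside the valid action set.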

* Run isort.

Co-authored-by: Vince Knight <vincent.knight@gmail.com>