
Redo / Improve our Doctests #184

Open
henrifroese opened this issue Sep 12, 2020 · 0 comments
Labels
discussion To discuss new improvements help wanted Extra attention is needed testing

Comments

@henrifroese
Collaborator

Doctests are used in Texthero to make sure that the output we document is also the output the code actually produces. Example:

def double(x):
    """
    Double the given input.

    Examples
    --------
    >>> double(5)
    10
    """
    return 2*x

The "Examples" section is then executed by the doctest suite when running our testing script.

We have noticed that with more "complicated" outputs, we often need to skip the doctests: they work locally but fail on some of our Travis builds (we test on macOS/Xenial/Windows). The main reasons are:

  • different floating point representations across the OSes -> float results differ after some decimals -> tests fail
  • different pandas printing output across the OSes -> e.g. macOS prints "..." in DataFrames at a different position than the others -> tests fail

We're looking for any solution to these issues.

Preliminary Ideas

  1. Look at the doctest module and find ways to make it work better (e.g. somehow allow some floating point epsilon); maybe the only thing still not working will be DataFrames, and we might be able to live with that

  2. (Partly) write our own doctest module (see here)
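As a rough sketch of idea 2: a custom `doctest.OutputChecker` could fall back to numeric comparison with a tolerance when the literal string match fails. The class name, the regex, and the `1e-6` epsilon below are all illustrative assumptions, not an existing API:

```python
import doctest
import re


class FloatTolerantChecker(doctest.OutputChecker):
    """Accept doctest output whose floats differ only past a small epsilon."""

    float_re = re.compile(r"-?\d+\.\d+")

    def check_output(self, want, got, optionflags):
        # First try the normal literal comparison.
        if super().check_output(want, got, optionflags):
            return True
        want_floats = self.float_re.findall(want)
        got_floats = self.float_re.findall(got)
        if not want_floats or len(want_floats) != len(got_floats):
            return False
        # The non-numeric parts of the output must still match exactly.
        if self.float_re.sub("#", want) != self.float_re.sub("#", got):
            return False
        # Compare the numbers themselves up to a tolerance.
        return all(
            abs(float(w) - float(g)) < 1e-6
            for w, g in zip(want_floats, got_floats)
        )


# Run a doctest whose expected float is only approximately right.
example = """
>>> 0.1 + 0.2
0.300000001
"""
parser = doctest.DocTestParser()
test = parser.get_doctest(example, {}, "example", None, 0)
runner = doctest.DocTestRunner(checker=FloatTolerantChecker())
results = runner.run(test)
```

Plugging such a checker into the test runner would let most float-producing doctests pass unchanged; the DataFrame "..." placement issue would still need separate handling.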

Interested in opinions!

@henrifroese henrifroese changed the title Doctests are often skipped Redo / Improve our Doctests Sep 12, 2020
@henrifroese henrifroese added discussion To discuss new improvements help wanted Extra attention is needed testing labels Sep 12, 2020