
Add testing.check_figures_equal to avoid storing baseline images #555

Merged: 10 commits merged into master on Sep 4, 2020

Conversation

@seisman (Member) commented on Aug 6, 2020

Description of proposed changes

This PR adds a new function, check_figures_equal, to check whether two pygmt.Figure() objects produce the same image, mostly inspired by the discussion in #522 (review) and matplotlib's check_figures_equal decorator.

What the function check_figures_equal does is very simple:

  1. It accepts two pygmt.Figure() objects (fig_ref and fig_test) as arguments,
  2. saves them as two PNG images,
  3. compares the two images using matplotlib's compare_images function and computes the RMS difference, and
  4. deletes both images if they match; if the RMS is greater than tol, it generates a diff image and raises an exception (see the sketch after this list).
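
For readers of this thread, here is a minimal sketch of that flow, assuming matplotlib's compare_images and pygmt.Figure.savefig; the helper name and file names below are illustrative, not the merged implementation:

import os

from matplotlib.testing.compare import compare_images


def _compare_figures_sketch(fig_ref, fig_test, tol=0.0, result_dir="result_images"):
    """Illustrative only: save two pygmt.Figure objects and compare the PNGs."""
    os.makedirs(result_dir, exist_ok=True)
    ref_png = os.path.join(result_dir, "expected.png")
    test_png = os.path.join(result_dir, "actual.png")
    fig_ref.savefig(ref_png)
    fig_test.savefig(test_png)
    # compare_images returns None when the RMS difference is within tol;
    # with in_decorator=True it returns a dict (rms, diff image path, ...) otherwise.
    err = compare_images(expected=ref_png, actual=test_png, tol=tol, in_decorator=True)
    if err is None:
        # The images match, so the temporary PNGs can be removed.
        os.remove(ref_png)
        os.remove(test_png)
    else:
        # The images differ: keep them, plus the generated diff image, for inspection.
        raise AssertionError(
            f"images not equal (RMS {err['rms']:.4f}); diff image at {err['diff']}"
        )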

I added three new tests.

Some known issues/limitations:

  1. Some paths are hard-coded, but they could be made more flexible.
  2. The test test_check_figures_unequal() checks that check_figures_equal correctly raises the GMTImageComparisonFailure exception when two images are unequal, and I use pytest.raises to catch the exception so that the test passes. However, the two images were not deleted after the test. (Edit: they are now.)

Fixes #

Reminders

  • Run make format and make check to make sure the code follows the style guide.
  • Add tests for new features or tests that would have caught the bug that you're fixing.
  • Add new public functions/methods/classes to doc/api/index.rst.
  • Write detailed docstrings for all functions/methods.
  • If adding new functionality, add an example to docstrings or tutorials.

@weiji14 (Member) commented on Sep 3, 2020

@seisman, hope you don't mind if I continue some work on this.

  1. Some paths are hard-coded, but they could be made more flexible.

Will try to see if it's possible to mimic the pytest-mpl paths, or look to matplotlib to see how they do it.

  2. The test test_check_figures_unequal() checks that check_figures_equal correctly raises the GMTImageComparisonFailure exception when two images are unequal, and I use pytest.raises to catch the exception so that the test passes. However, the two images are not deleted after the test.

It should be easy enough to use GMTTempFile so that the images are cleaned up after the test, but we'll need to find a way to keep the images if the test fails so that they can be examined (a rough sketch of the GMTTempFile approach follows).
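
A quick sketch of that GMTTempFile approach (illustrative only, and it shows the caveat above: the temporary PNGs would vanish even when the comparison fails):

import pygmt
from pygmt.helpers import GMTTempFile

fig_ref, fig_test = pygmt.Figure(), pygmt.Figure()
for fig in (fig_ref, fig_test):
    fig.basemap(region=[0, 10, 0, 10], projection="X10c", frame=True)

# Both PNGs are deleted automatically when the with-block exits,
# whether or not the comparison passed.
with GMTTempFile(suffix=".png") as png_ref, GMTTempFile(suffix=".png") as png_test:
    fig_ref.savefig(png_ref.name)
    fig_test.savefig(png_test.name)
    # ... run matplotlib's compare_images on png_ref.name and png_test.name here ...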

Also moved test_check_figures_* to a doctest
under check_figures_equal.
@weiji14 (Member) left a comment

Ok, this is pretty much ready for review. After stress testing this with pytest, I think the check_figures_equal function will have to end up pretty much exactly like matplotlib's https://matplotlib.org/3.3.1/_modules/matplotlib/testing/decorators.html#check_figures_equal, except that we're using fig=pygmt.Figure() instead of matplotlib's plt.subplot. If only we could monkeypatch it!

It might be worth raising an issue/PR on matplotlib to see if there's a way to reduce code duplication (for the sake of long-term maintainability), as they've clearly put months, if not years, of thought and effort into this one check_figures_equal function. This would involve some sort of subclassing (or, I don't know, pluggability?), so that we can swap pygmt.Figure into fig_ref and fig_test while getting all of the image_compare goodness.

Comment on lines 488 to 494
parameters = [
    param
    for param in old_sig.parameters.values()
    if param.name not in {"fig_test", "fig_ref"}
]
new_sig = old_sig.replace(parameters=parameters)
wrapper.__signature__ = new_sig
@weiji14 (Member) commented:

Figured out how to make our PyGMT check_figures_equal decorator work with pytest fixtures (e.g. grid=xr.DataArray(...)) in 3e0d3fb. This is basically just copying what was done in matplotlib at matplotlib/matplotlib#16800.
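
Roughly, why that signature rewrite matters (a simplified sketch, not the actual PyGMT code): the wrapper injects fig_ref and fig_test itself, so those two names have to be hidden from pytest, which would otherwise try to resolve them as fixtures alongside real ones like grid.

import functools
import inspect


def check_figures_equal_sketch(tol=0.0):
    """Simplified decorator shape, for illustration only."""

    def decorator(func):
        old_sig = inspect.signature(func)

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            fig_ref, fig_test = object(), object()  # stand-ins for pygmt.Figure()
            func(*args, **kwargs, fig_ref=fig_ref, fig_test=fig_test)
            # ... save both figures and run the image comparison here ...

        # Expose a signature without fig_ref/fig_test so pytest only sees the
        # remaining parameters (e.g. the grid fixture) when collecting the test.
        parameters = [
            param
            for param in old_sig.parameters.values()
            if param.name not in {"fig_test", "fig_ref"}
        ]
        wrapper.__signature__ = old_sig.replace(parameters=parameters)
        return wrapper

    return decorator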

Comment on lines +99 to +105
@check_figures_equal()
def test_grdimage_central_longitude(grid, fig_ref, fig_test):
    """
    Test that plotting a grid centred at different longitudes/meridians work.
    """
    fig_ref.grdimage("@earth_relief_01d_g", projection="W120/15c", cmap="geo")
    fig_test.grdimage(grid, projection="W120/15c", cmap="geo")
@weiji14 (Member) commented on Sep 3, 2020:

Suggested change
@check_figures_equal()
def test_grdimage_central_longitude(grid, fig_ref, fig_test):
    """
    Test that plotting a grid centred at different longitudes/meridians work.
    """
    fig_ref.grdimage("@earth_relief_01d_g", projection="W120/15c", cmap="geo")
    fig_test.grdimage(grid, projection="W120/15c", cmap="geo")

@pytest.mark.parametrize("meridian", [0, 33, 120, 180])
@check_figures_equal()
@pytest.mark.parametrize("proj_type", ["H", "Q", "W"])
def test_grdimage_different_central_meridians_and_projections(
    grid, proj_type, meridian, fig_ref, fig_test
):
    """
    Test that plotting a grid centred on different meridians using different
    projection systems work.
    """
    fig_ref.grdimage(
        "@earth_relief_01d_g", projection=f"{proj_type}{meridian}/15c", cmap="geo"
    )
    fig_test.grdimage(grid, projection=f"{proj_type}{meridian}/15c", cmap="geo")

I'll update this test in #560 later 😄. The problem with using this fancy pytest.mark.parametrize is that it would complicate the check_figures_equal code (see matplotlib/matplotlib#15199 and matplotlib/matplotlib#16693) and make this PR even harder to review.


import numpy as np
from matplotlib.testing.compare import compare_images
@seisman (Member, Author) commented:

from matplotlib.testing.compare import compare_images

As I understand it, this code means that matplotlib now becomes a required dependency, even for users who never run the tests, right?

Although PyGMT already requires matplotlib for testing and most users have matplotlib installed anyway, I still don't want to add another dependency to PyGMT.

When I wrote the first commit (8b78614), I put the code in pygmt/helpers/testing.py. That way, I think matplotlib stays optional, although I haven't tested it.
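
One common way to keep such a dependency optional (a general pattern, not necessarily what this PR ends up doing) is to defer the import until the helper is actually called:

def check_figures_equal(*, tol=0.0, result_dir="result_images"):
    # Importing inside the function keeps matplotlib optional: users who never
    # run the test helpers can import pygmt without matplotlib installed.
    from matplotlib.testing.compare import compare_images

    ...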

@weiji14 (Member) replied:

Good point, I think you're right here. I also encountered issues with circular imports when moving the code to decorators.py, hence this line:

from ..figure import Figure # pylint: disable=import-outside-toplevel

Probably should move it back under pygmt/helpers/testing.py then. As an aside, I've opened up a feature request at matplotlib/pytest-mpl#94, and we might be able to do all this from pytest-mpl in the future.

from ..figure import Figure


def check_figures_equal(*, tol=0.0, result_dir="result_images"):
@seisman (Member, Author) commented:

One more thing about result_dir: the check_figures_equal decorator generates images in the result_images directory, while pytest.mark.mpl_image_compare generates images in directories like results/tmpjtnnwqt4.

@weiji14 (Member) replied:

Yes, which was partly why I opened the issue at matplotlib/pytest-mpl#94: to get all of that pytest-mpl goodness (e.g. not having a hardcoded result_dir). I'll try to make a pull request to pytest-mpl for that, so we can just use a proper @pytest.mark.mpl_check_equal decorator in the future (I'll open a new issue after this one is merged). For now though, since we don't have many tests using check_figures_equal yet, we can probably just leave it as is.

@seisman (Member, Author) replied:

Yes, looks good to me.

Comment on lines +313 to +315
Writing an image-based test is only slightly more difficult than a simple test.
The main consideration is that you must specify the "baseline" or reference
image, and compare it with a "generated" or test image. This is handled using
@weiji14 (Member) commented:

Added some notes here to CONTRIBUTING.md, adapted from https://matplotlib.org/3.3.1/devel/testing.html#writing-an-image-comparison-test.
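
For illustration, a test written in this style looks roughly like the following; the import path follows pygmt/helpers/testing.py from this PR, and the test itself is hypothetical. The reference figure uses long-form arguments while the test figure uses the short GMT aliases, and both should render identical images.

from pygmt.helpers.testing import check_figures_equal


@check_figures_equal()
def test_basemap_aliases(fig_ref, fig_test):
    """Hypothetical example: long-form arguments versus short GMT aliases."""
    fig_ref.basemap(region=[0, 10, 0, 10], projection="X10c", frame=True)
    fig_test.basemap(R="0/10/0/10", J="X10c", B=True)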

@seisman (Member, Author) left a comment

Looks good to me, but I can't approve it because I opened this PR.

@weiji14 (Member) left a comment

Ah yes, I'll approve it then 😆

@weiji14 merged commit 97a585b into master on Sep 4, 2020
@weiji14 deleted the testing branch on Sep 4, 2020
@weiji14 added the maintenance label ("Boring but important stuff for the core devs") on Sep 5, 2020
@weiji14 added a commit that referenced this pull request on Sep 8, 2020:
Tell git to ignore PNG files in the 'result_images' folder,
and add it to the list of folders removed by make clean.
Patches #555.
@weiji14 added a commit that referenced this pull request on Oct 27, 2020:
Tests with @pytest.mark.mpl_image_compare and @check_figures_equal (#555)
will produce an image diff on test failure in the 'tmp-test-dir-with-unique-name'
directory, and we can upload those files as a GitHub artifact for inspection purposes.

* Use GitHub Action to upload diff images on test failure
* Set a unique artifact name for each OS/Python Version
* Add upload-artifact to GMT Latest tests

Co-Authored-By: Dongdong Tian <seisman.info@gmail.com>
@lhoupert mentioned this pull request on Nov 6, 2020