
Add automated tests to determine the actual behaviour of the calendar mapping functions #132

Closed · 3 tasks done
andrewphilipsmith opened this issue Feb 6, 2024 · 3 comments · Fixed by #145
Labels: bug (Something isn't working)

andrewphilipsmith commented Feb 6, 2024

The scope of this issue is just to add indicative automated tests. It does not require any changes to the main pipeline code until we have determined the desired behaviour.

To contribute to resolving #100 and #106, it would be helpful to have tests that determine the current behaviour of the calendar mapping functions. The tests should include (at least) the following scenarios:

  • Test the behaviour of the bits of code referenced in this comment.

e.g. what is the actual difference between data and data_360 in this line of code:

data_360 = data.convert_calendar(dim="time", calendar="360_day", align_on="year")
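
To see the difference empirically, one option is to build a small synthetic daily series and compare the time axes before and after the conversion. A minimal sketch (illustrative only, not pipeline code; the year and values are arbitrary):

    import numpy as np
    import pandas as pd
    import xarray as xr

    # 1980 is a leap year, so the standard (proleptic Gregorian) calendar has 366 days here.
    time = pd.date_range("1980-01-01", "1980-12-31", freq="D")
    data = xr.DataArray(np.random.rand(len(time)), coords={"time": time}, dims="time")

    data_360 = data.convert_calendar(dim="time", calendar="360_day", align_on="year")

    print(len(data.time))            # 366
    print(len(data_360.time))        # expected 360: surplus standard-calendar days are dropped
    print(data_360.time.values[:3])  # cftime Datetime360Day objects, not numpy datetimes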

What change in behaviour does the enforce_date_dropping function make? What is the difference between data_dates_dropped and data_360 in this snippet?

if data.time.dt.is_leap_year.any():
    data_dates_dropped = enforce_date_dropping(data, data_360)
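
A characterisation test along these lines could pin the difference down by asserting only on the time dimension. This is a sketch: the import path is assumed, and the asserted values are placeholders to be replaced with whatever the test actually observes:

    import numpy as np
    import pandas as pd
    import xarray as xr

    # Assumed import path -- adjust to wherever enforce_date_dropping lives in the repo.
    from resampling.resampling_hads import enforce_date_dropping


    def test_enforce_date_dropping_time_length():
        # One leap year of synthetic daily data (2020 has 366 days).
        time = pd.date_range("2020-01-01", "2020-12-31", freq="D")
        data = xr.DataArray(np.ones(len(time)), coords={"time": time}, dims="time")

        data_360 = data.convert_calendar(dim="time", calendar="360_day", align_on="year")
        data_dates_dropped = enforce_date_dropping(data, data_360)

        # Characterisation assertions: record the sizes actually observed, so the
        # current behaviour is documented before the desired behaviour is decided.
        assert len(data_360.time) == 360
        assert len(data_dates_dropped.time) <= len(data_360.time)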

Sub-tasks

  • Create test data files (e.g. one pixel for a leap year and one for a non-leap year); see the sketch below this list.
  • Write a unit test for the resample_hadukgrid function within python/resampling/resampling_hads.py, asserting on the dimensions of the input and output data.
  • Write a unit test for the enforce_date_dropping function within python/resampling/resampling_hads.py, asserting on the dimensions of the input and output data.

Write a unit test for preprocess_data in python/debiasing/preprocess_data.py, asserting on the dimensions of the input and output data. The python/debiasing/pre_process_data.py process may now be superseded; a new ticket will be opened separately if needed.
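
For the first sub-task, something like the following could generate the one-pixel fixtures (a sketch only: file names, coordinates, and the variable name are placeholders rather than the real HADs grid):

    import numpy as np
    import pandas as pd
    import xarray as xr


    def make_one_pixel_year(year: int, path: str) -> None:
        """Write one year of daily data for a single grid cell to NetCDF."""
        time = pd.date_range(f"{year}-01-01", f"{year}-12-31", freq="D")
        data = xr.DataArray(
            np.random.rand(len(time), 1, 1),
            coords={"time": time, "y": [0.0], "x": [0.0]},
            dims=("time", "y", "x"),
            name="tasmax",  # placeholder variable name
        )
        data.to_dataset().to_netcdf(path)


    # 2020 is a leap year (366 days); 2021 is not (365 days).
    make_one_pixel_year(2020, "one_pixel_leap_year.nc")
    make_one_pixel_year(2021, "one_pixel_non_leap_year.nc")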

@griff-rees

Thanks @andrewphilipsmith. @aranas FYI: refactoring your work on #32 to check and automate testing is my focus, specifically:

  • python/resampling/check_calendar.py
  • python/resampling/resampling_hads.py
  • python/resampling/check_calendar_log.txt


griff-rees commented Feb 15, 2024

The doctests of enforce_date_dropping in python/resample_hads.py on the day-sampling branch demonstrate that, by default, that function produces 1437 days over 4 years, which is 3 days fewer than the expected 360 days per year ($360 \times 4 = 1440$).

    >>> ts_4_years: xr.DataArray = enforce_date_dropping(
    ...     test_4_years_xarray, test_4_years_xarray)
    >>> ts_4_years
    <xarray.DataArray (time: 1437, space: 3)>
    array([[0.5488135 , 0.71518937, 0.60276338],
           [0.43758721, 0.891773  , 0.96366276],
           [0.38344152, 0.79172504, 0.52889492],
           ...,
           [0.0916689 , 0.62816966, 0.52649637],
           [0.50034874, 0.93687921, 0.88042738],
           [0.71393397, 0.57754071, 0.25236931]])
    Coordinates:
      * time     (time) object 1980-11-30 1980-12-02 ... 1984-11-28 1984-11-29
      * space    (space) <U10 'Glasgow' 'Manchester' 'London'
    >>> len(ts_4_years) == 365*4 + 1  # Would keep all days
    False
    >>> len(ts_4_years) == 360*4      # Would enforce all years at 360 days
    False
    >>> len(ts_4_years)               # 3 days fewer than 360 per year
    1437
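
To pin down which three days go missing, one could diff the returned time axis against a full 360-day-calendar range over the same period (a sketch continuing from the doctest fixture above; it assumes the output time values are cftime 360_day dates, and the start/end dates are copied from the coordinate printout):

    import xarray as xr

    # Full 360-day calendar covering 1980-11-30 to 1984-11-29 inclusive: 1440 dates.
    expected = set(xr.cftime_range("1980-11-30", "1984-11-29", freq="D", calendar="360_day"))
    actual = set(ts_4_years.time.values)

    print(sorted(expected - actual))   # the dates missing from the output (3, per the doctest)
    print(len(expected), len(actual))  # 1440 vs 1437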

andrewphilipsmith added a commit that referenced this issue Mar 12, 2024
@griff-rees

This issue is addressed in this branch and should be closable when merged: #145

griff-rees linked a pull request May 22, 2024 that will close this issue.