
Add unit test for 1d residual fringing #8349

Open · wants to merge 3 commits into base: master

Conversation

drlaw1558 (Collaborator)

This PR adds an additional unit test for the 1d residual fringe correction routine.

Checklist for maintainers

  • added entry in CHANGES.rst within the relevant release section
  • updated or added relevant tests
  • updated relevant documentation
  • added relevant milestone
  • added relevant label(s)
  • ran regression tests; posted a link to the Jenkins job below
    How to run regression tests on a PR
  • Make sure the JIRA ticket is resolved properly

@drlaw1558 (Collaborator · Author)

@jemorrison I thought it would be instructive for me to foray into unit tests by adding one for 1d residual fringing; let me know what you think.


codecov bot commented Mar 12, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 75.67%. Comparing base (4cc0ac1) to head (869f1a2).
Report is 20 commits behind head on master.

Additional details and impacted files
@@            Coverage Diff             @@
##           master    #8349      +/-   ##
==========================================
+ Coverage   75.15%   75.67%   +0.52%     
==========================================
  Files         470      474       +4     
  Lines       38604    38924     +320     
==========================================
+ Hits        29014    29457     +443     
+ Misses       9590     9467     -123     
Flag Coverage Δ *Carryforward flag
nightly 77.33% <ø> (-0.07%) ⬇️ Carriedforward from 4cc0ac1

*This pull request uses carry forward flags.

@jemorrison (Contributor) left a comment

David, why do you expect outflux and outflux2 to be the same?
outflux2 is 1D residual fringe corrected.
Is it that outflux does not have any residual fringes to correct?

@drlaw1558 (Collaborator · Author)

> David, why do you expect outflux and outflux2 to be the same? outflux2 is 1D residual fringe corrected. Is it that outflux does not have any residual fringes to correct?

outflux is the version of the input that I've already fringe-corrected, so this test would fail if something started to go wrong in the rf1d routine such that it no longer gave the expected output.
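The comparison pattern described here can be sketched as a minimal pytest-style check. Note this is a hypothetical illustration: `rf1d` below is a stand-in placeholder for jwst's 1d residual fringe corrector, and the arrays are invented; the real test would call the actual routine and compare against the stored, previously verified output.

```python
import numpy as np

def rf1d(flux, wavelength):
    """Placeholder for the jwst 1d residual fringe corrector (hypothetical)."""
    # The real routine would fit and remove residual fringes; the identity
    # function here just illustrates the test's structure.
    return flux

# Invented example data: input spectrum and the previously verified "truth".
wave = np.array([5.00, 5.01, 5.02, 5.03])
flux_in = np.array([1.00, 1.02, 0.98, 1.01])
expected = np.array([1.00, 1.02, 0.98, 1.01])  # pre-verified corrected output

# The test fails if the routine's behavior drifts from the stored truth.
corrected = rf1d(flux_in, wave)
assert np.allclose(corrected, expected, rtol=1e-6)
```

This is effectively a frozen-output (regression-style) check, which is why the reviewers below suggest it may belong with the regression tests rather than the unit tests.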

@tapastro (Contributor)

I think this test falls more in line with regression testing than unit testing. We try to avoid adding data files to the repository, and the architecture in place for regression testing allows for comparison to a truth file. I see that we have a residual fringe regression test already - does this test offer different coverage to that one?

@drlaw1558 (Collaborator · Author)

> I think this test falls more in line with regression testing than unit testing. We try to avoid adding data files to the repository, and the architecture in place for regression testing allows for comparison to a truth file. I see that we have a residual fringe regression test already - does this test offer different coverage to that one?

The previous test exercises the 1d core of the 2d residual fringe correction code (fit_1d_background_complex), while the new test evaluates the 1d residual fringe correction code (fit_1d_background_complex_1d) as wrapped by the main calling function (fit_residual_fringes_1d). Since I've never added a unit test before I can't speak to how those are usually structured; in this case I copied what already existed for testing the step. If there's a better way I'd be curious to know how to implement it.

@hbushouse (Collaborator)

I agree with @tapastro that we try to avoid putting data files for testing into the code repo unless it's absolutely unavoidable. So most unit tests that need something like one of our actual data products just build a data model on the fly within the test itself. An example that builds a SpecModel, which I think is the kind of model you need here (the output of extract_1d), is at https://github.com/spacetelescope/jwst/blob/master/jwst/white_light/tests/test_white_light.py
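The build-it-on-the-fly approach suggested here might look like the sketch below. This is an assumption-laden illustration using plain numpy rather than the actual jwst SpecModel API: it mimics only the spectral-table shape (wavelength and flux columns) a test would need, and the column names and values are invented for the example.

```python
import numpy as np

# Synthesize a tiny MIRI/MRS-like 1d spectrum in memory instead of
# checking a data file into the repository.
wavelength = np.linspace(4.9, 5.7, 200)       # microns (invented range)
flux = np.full_like(wavelength, 100.0)        # flat continuum (arbitrary units)

# Mimic a spec-table-like structured array with WAVELENGTH/FLUX columns
# (column names are assumptions for illustration, not the jwst schema).
spec_dtype = np.dtype([("WAVELENGTH", "f8"), ("FLUX", "f8")])
spec_table = np.zeros(wavelength.size, dtype=spec_dtype)
spec_table["WAVELENGTH"] = wavelength
spec_table["FLUX"] = flux
```

A real jwst test would instead instantiate the datamodel itself (as in the linked test_white_light.py) and populate it with arrays like these.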

@drlaw1558 (Collaborator · Author)

In this case it isn't actually a SpecModel; it's really just arrays of wavelength and flux. The test bypasses dealing with datamodels and just tests the core algorithm itself.

@emolter (Contributor) commented Mar 20, 2024

> In this case it isn't actually a SpecModel; it's really just arrays of wavelength and flux. The test bypasses dealing with datamodels and just tests the core algorithm itself.

I think this should be a regtest because the data arrays themselves can't be programmatically generated, i.e., they are "real data". IMO unit tests should be idealized cases for which you can determine the input and output analytically or trivially. They are good for catching edge cases where you know the output should be zero, or NaN, or an error should be raised. If there is a nontrivial example (something like, input a noise-free sinusoid and output a flat line) then that's great for a unit test too, because you can easily make arrays representing the input and output from scratch.
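The noise-free-sinusoid idea above can be made concrete with a toy, fully synthetic test. This is not the jwst algorithm: `remove_sinusoid` below is a simple least-squares stand-in defringer at a known fringe frequency, written only to show how input and expected output can both be generated from scratch.

```python
import numpy as np

def remove_sinusoid(x, y, freq):
    """Toy defringer: least-squares fit of a + b*sin + c*cos at a known
    fringe frequency, then subtract the fitted sinusoid (illustrative only)."""
    design = np.column_stack([
        np.ones_like(x),
        np.sin(2.0 * np.pi * freq * x),
        np.cos(2.0 * np.pi * freq * x),
    ])
    coeffs, *_ = np.linalg.lstsq(design, y, rcond=None)
    fringe = design[:, 1:] @ coeffs[1:]   # sinusoidal part only
    return y - fringe

# Analytic test case: flat continuum plus a noise-free sinusoidal "fringe".
x = np.linspace(0.0, 1.0, 500)
continuum = np.full_like(x, 10.0)
fringed = continuum + 0.3 * np.sin(2.0 * np.pi * 25.0 * x)

# Removing the fitted sinusoid should recover the flat line essentially exactly.
flattened = remove_sinusoid(x, fringed, 25.0)
assert np.allclose(flattened, continuum, atol=1e-8)
```

Because both the input and the expected output are constructed analytically, a test like this needs no data files and fits the unit-test model emolter describes.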
