
JP-3566 Move pixel replace before resample/cube build #8409

Open
wants to merge 23 commits into master

Conversation

@jemorrison (Contributor) commented Apr 3, 2024:

Resolves JP-3566

Closes #8340

This PR moves the pixel_replace step to immediately before resample/cube_build in both the Spec2 and Spec3 pipelines. The *cal.fits files no longer include the replaced pixel values; the replacements now occur right before resample/cube_build.
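For orientation, a minimal sketch of the new ordering in the spec2 processing flow (illustrative only; the step attribute names match the existing pipeline, but the surrounding code is schematic, not this PR's exact diff):

    # pixel_replace now runs immediately before resampling/cube building,
    # i.e. after the cal products have been produced:
    result = self.pixel_replace(result)
    # then, depending on mode:
    result = self.cube_build(result)      # IFU modes
    # or
    result = self.resample_spec(result)   # slit-like modes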


@jemorrison (Contributor Author) commented:

@tapastro
This PR shows how I was planning to move the pixel replacement. I have not tested it yet; I just wanted to check whether this method seems like the right approach before I move on to testing. The cal files would no longer include the replaced pixels.

@drlaw1558 (Collaborator) commented:

@jemorrison That looks about right to me. Presumably the standard save_results=True approach will make the pix replace step write output files for test/debugging purposes?
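For reference, that standard mechanism is the usual stpipe per-step override, shown here via the Python API (the input filename below is a placeholder):

    from jwst.pipeline import Spec2Pipeline

    # Equivalent to: strun calwebb_spec2 ... --steps.pixel_replace.save_results=True
    Spec2Pipeline.call('input_rate.fits',
                       steps={'pixel_replace': {'save_results': True}})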


codecov bot commented Apr 4, 2024

Codecov Report

Attention: Patch coverage is 10.66667%, with 67 lines in your changes missing coverage. Please review.

Project coverage is 57.91%. Comparing base (781e0e0) to head (c465c30).
Report is 12 commits behind head on master.

Files | Patch % | Missing lines
jwst/pixel_replace/pixel_replace_step.py | 2.63% | 37
jwst/pipeline/calwebb_spec3.py | 13.63% | 19
jwst/pipeline/calwebb_spec2.py | 30.00% | 7
jwst/pipeline/calwebb_tso3.py | 25.00% | 3
jwst/pixel_replace/pixel_replace.py | 0.00% | 1
Additional details and impacted files
@@            Coverage Diff             @@
##           master    #8409      +/-   ##
==========================================
- Coverage   57.93%   57.91%   -0.03%     
==========================================
  Files         387      387              
  Lines       38839    38882      +43     
==========================================
+ Hits        22502    22518      +16     
- Misses      16337    16364      +27     


@jemorrison (Contributor Author) commented Apr 4, 2024:

@tapastro The pixel_replace step assumed the input was a single model, not a model container, since it was written for calwebb_spec2. Now the pixel_replace step also needs to work on ModelContainers.
Take a look at the changes I made to pixel_replace_step.py and see if they look correct to you.
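A minimal sketch of the shape that change could take (ModelContainer and datamodels.open are real jwst interfaces; replace_in_model is a hypothetical helper standing in for the existing per-model logic):

    from jwst import datamodels

    input_model = datamodels.open(input)
    if isinstance(input_model, datamodels.ModelContainer):
        # calwebb_spec3 case: apply the replacement to each exposure
        output_model = input_model.copy()
        for i, model in enumerate(output_model):
            output_model[i] = replace_in_model(model)  # hypothetical helper
    else:
        # calwebb_spec2 case: a single model, as before
        output_model = replace_in_model(input_model)   # hypothetical helper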

In calwebb_spec2 the code works as expected. In the cal file, bad pixels have a science value of NaN. If I save the pixel_replace output, then the science frame has the NaNs replaced (so these files now look like the old cal files). The file names are as expected.

In calwebb_spec3 the code works fine, but the output filenames when pixel_replace.save_results=True are not what I was expecting. I need to make the name look more like the output from outlier_detection, for example:
jw01204018001_02101_00004_mirifushort_ratesub_o018_crf.fits - where the o018 is the asn_id
I tried to replicate the outlier_detection code using the asn_id and make_output_path, but I do not have it right (see the sketch at the end of this comment).
The filenames are of the form:
jw01204-o018_t003_miri_7_pixel_replace.fits

Not sure what I am doing wrong.
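For reference, the pattern outlier_detection uses is roughly the following (reconstructed from memory, so treat the exact attribute lookups as assumptions to verify against outlier_detection_step.py); rather than renaming each model, it binds the asn_id into the step's output-path factory:

    from functools import partial

    asn_id = self.search_attr('asn_id')   # inherited from the parent pipeline
    if asn_id is not None:
        _make_output_path = self.search_attr('_make_output_path', parent_first=True)
        self._make_output_path = partial(_make_output_path, asn_id=asn_id)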

@jemorrison (Contributor Author) commented:

@emolter @hbushouse
I have the code working as intended; I just cannot get the pixel_replace output filenames to be correct.
This PR moves where pixel_replace is called. When it is called in calwebb_spec3, the output names should be similar to the output names from outlier_detection.
For example:
jw01204018001_02101_00001_mirifulong_o018_crf.fits
but for pixel replace I have jw01204-o018_t003_miri_0_pixel_replace.fits

I looked at what outlier_detection was doing for file names, and I can construct the correct-looking file name and print it in the step (I left the print statement in the code for now, to show where it works). But the final output names are unchanged. Hmm, what am I missing?

Snippet of code:
self.log.debug('Input is an IFUImageModel.')
replacement = PixelReplacement(model, **pars)
replacement.replace()
self.record_step_status(replacement.output, 'pixel_replace', success=True)
output_model[i] = replacement.output
o_path_name = self.make_output_path(basepath=output_model[i].meta.filename)
print(o_path_name)  # <-- this is the filename I want
output_model[i].meta.filename = o_path_name
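One possible explanation, offered as an assumption to verify: stpipe regenerates the saved filename from its own naming machinery at save time, so assigning meta.filename inside the loop gets overridden unless the step opts in to model-based naming. If that is the case here, the fix may be stpipe's output_use_model option:

    # Assumption: stpipe's output_use_model option bases saved filenames
    # on model.meta.filename; set it in process() or in the step's spec.
    self.output_use_model = True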

@emolter (Contributor) commented Apr 11, 2024:

It looks like the style checks are failing:
jwst/pixel_replace/pixel_replace_step.py:6:29: F401 [*] jwst.datamodels.ModelContainer imported but unused
jwst/pixel_replace/pixel_replace_step.py:101:46: F821 Undefined name partial

The second one of these makes me worried that there is no test coverage for the renaming, because this seems like it should cause failures. Is there a way to write a unit test that ensures the output filenames are as expected given association and non-association inputs?
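The likely fixes for those two findings (an assumption, but the F821 strongly suggests partial was copied from outlier_detection_step.py without its import):

    # F401: either actually use ModelContainer (e.g. in an isinstance check)
    # or remove the unused import:
    from jwst.datamodels import ModelContainer

    # F821: partial needs to be imported before use:
    from functools import partial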

@jemorrison (Contributor Author) commented May 3, 2024:

@hbushouse
Regression tests:
Several updates to the regression tests will cause failures. I added pixel_replace.skip=False for MIRI LRS SLIT and SLITLESS,
since that is the default now. All of those downstream results will need to be okified.
I added truth files where none existed (*pixel_replace.fits and *photom.fits; I added the photom results because they are missing from the tests and I don't think they should be).

For MIRI IFU, NIRSpec IFU, and FS I added new tests that run pixel_replace (see the sketch below), since at this time pixel_replace.skip=True in the pipeline for those modes. I think that might change in the future, so I just added another test rather than changing the main test. For these new tests I uploaded all the truth files.
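For context, such a test can enable the step in the usual regtest style, roughly like this (jwst regtests typically build an strun-style argument list; rtdata is the standard regtest fixture):

    from jwst.stpipe import Step

    # Run spec2 with pixel_replace enabled, overriding the mode default of skip=True
    args = ["calwebb_spec2", rtdata.input,
            "--steps.pixel_replace.skip=False",
            "--steps.pixel_replace.save_results=True"]
    Step.from_cmdline(args)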

I do not have a test for NIRSpec MOS data. I hope having all the other tests is good enough.

https://plwishmaster.stsci.edu:8081/job/RT/job/JWST-Developers-Pull-Requests/1417/

There are many failures; several of them are related to Extra keyword 'S_PXREPL' in b: 'SKIPPED'. This happens when pixel_replace is not run on the data.
I do not have a test for NIRSpec MOS data; I don't know if that is a mode that will use this step.

If you want me to, I can go through the results and okify them. I just need a refresher on how that is done.

@emolter (Contributor) left a comment:

Code changes look fine to me. There's probably some way to refactor pixel_replace_step to avoid the long if...else chain (one option is sketched below), but I don't think that's very important.
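One lightweight option, sketched purely as an illustration (the handler names are hypothetical):

    # Dispatch on model type instead of chaining if/elif isinstance tests:
    _HANDLERS = {
        datamodels.IFUImageModel: _replace_in_image,     # hypothetical helpers
        datamodels.ImageModel: _replace_in_image,
        datamodels.MultiSlitModel: _replace_in_multislit,
    }

    def _dispatch(model):
        handler = _HANDLERS.get(type(model))
        if handler is None:
            raise TypeError(f"pixel_replace does not support {type(model).__name__}")
        return handler(model)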

Should I be worried about the bigdata error that the regtest run can't find certain files in Artifactory? e.g. this one?

@jemorrison (Contributor Author) commented:

Running new regression tests: 1469

@jemorrison (Contributor Author) commented:

@hbushouse I added pixel_replace to the calwebb_spec3 documentation table. There is a check mark for the modes that spec3 runs by default, or that is my understanding of the check mark. I just put MIRI FS; is that correct?
The table for calwebb_spec2 already has the step in the correct order.
Not sure what I should put for calwebb_tso3, since the step is skipped there. Maybe my reading of the table is incorrect, because I could add it and then not put a check mark for any modes?
Suggestions?

@jemorrison (Contributor Author) commented:

New regression tests at:
https://plwishmaster.stsci.edu:8081/job/RT/job/JWST-Developers-Pull-Requests/1469/

I have made so many changes recently that I am going to upload new truth files for the NIRSpec IFU spec3 test. Then I will run another regression test.

@jemorrison (Contributor Author) commented:

@hbushouse @melanieclarke
The newest regression test are at https://plwishmaster.stsci.edu:8081/job/RT/job/JWST-Developers-Pull-Requests/1470/#showFailuresLink

I think all the failures are expected, because pixel_replace is no longer run before the cal file is created.
So I think we might be done, but let me know if I missed something.

@jemorrison (Contributor Author) commented:

@hbushouse I have not added pixel_replace to the documentation for tso3; I was not sure what to do there. I can add it to the table of steps and then just not put a check mark next to it.
Let me know what you want.
