JWST issues with numpy 2.0 #8580
Comments
Comment by Kenneth MacDonald on JIRA: There don't appear to be any problems in STCAL with numpy 2.0. [image: ruff_checks_numpy_20.png]
Comment by Kenneth MacDonald on JIRA: There doesn't appear to be a problem in JWST ramp fitting with numpy 2.0. [image: Screenshot 2024-08-29 at 1.08.36 PM.png]
Comment by Kenneth MacDonald on JIRA: Tyler Pauly has an STCAL PR open to handle numpy 2.0, which necessitates a change in how a byte-order method is called (see the sketch below). There are still regression test differences that need to be investigated; the stcal PR partially addressed this issue.
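The comment doesn't include the diff itself, but the numpy 2.0 migration guide describes the byte-order change in question: `ndarray.newbyteorder()` was removed, and only the dtype method remains. A minimal sketch of the before/after:

```python
import numpy as np

arr = np.arange(4, dtype=">f4")  # big-endian float32

# NumPy 1.x allowed swapping via the array method:
#   swapped = arr.newbyteorder()  # removed in NumPy 2.0
# NumPy 2.0 keeps newbyteorder() only on the dtype, so the call
# becomes a view with a byte-swapped dtype:
swapped = arr.view(arr.dtype.newbyteorder())
print(swapped.dtype)  # <f4 (byte-swapped view of the same buffer)
```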
Comment by Robert Jedrzejewski on JIRA: I checked all the steps in the miri detector1 pipeline. The only ones that gave non-zero differences between numpy 1.26 and numpy 2.1.1 were emicorr and ramp_fit. The differences were all below 1 part in 10^6, except for 8 pixels in the ramp_fit step that were closer to 1 part in 10^5.
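A minimal sketch of this kind of threshold comparison, assuming hypothetical .npy dumps of the two runs (the file names are placeholders):

```python
import numpy as np

def count_discrepant(a, b, rtol):
    """Count elements that differ by more than a relative tolerance,
    mirroring a '1 part in 10^6' style comparison."""
    mismatch = ~np.isclose(a, b, rtol=rtol, atol=0.0, equal_nan=True)
    return int(mismatch.sum())

# Hypothetical outputs of the same step under numpy 1.26 and 2.x:
rate_np1 = np.load("rate_numpy1.npy")
rate_np2 = np.load("rate_numpy2.npy")
for rtol in (1e-4, 1e-5, 1e-6):
    print(rtol, count_discrepant(rate_np1, rate_np2, rtol))
```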
Comment by Robert Jedrzejewski on JIRA: I also checked the refpix irs2 step. I ran the step starting from the same _superbias.fits input file, once with numpy 1.26 and once with numpy 2.1.1. The distribution of discrepant pixels vs. threshold looks like this: [table: rtol vs. #pixels]. These differences propagated down to the rate file: [table: rtol vs. #pixels]. There were a few statements that behave differently under numpy 2.1.1 compared to 1.26; I tried forcing the same behaviour by casting variables in arithmetic expressions, but was unable to reconcile the differences. More time would be needed to determine whether the numpy 2 behaviour is preferable to the numpy 1 behaviour.
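The behavioural differences described here are consistent with numpy 2's new promotion rules (NEP 50), which change what casting in arithmetic expressions produces. Two examples taken from the migration guide:

```python
import numpy as np

# Under NumPy 1.x "value-based" promotion, a Python float widened a
# float32 scalar to float64; under NumPy 2 (NEP 50) the Python float
# is "weak" and the float32 dtype wins:
x = np.float32(3.0) + 3.0
print(x.dtype)  # float64 on numpy 1.26, float32 on numpy 2.x

# Conversely, a float64 *scalar* now promotes a float32 array,
# where NumPy 1.x ignored the scalar's precision:
y = np.array([3.0], dtype=np.float32) + np.float64(3.0)
print(y.dtype)  # float32 on numpy 1.26, float64 on numpy 2.x

# Reproducing the NumPy 1 result requires an explicit cast, e.g.:
z = np.float64(np.float32(3.0)) + 3.0  # float64 under both versions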
Comment by Robert Jedrzejewski on JIRA: I checked all the files that were reported as discrepant in the regression test run and made a note of the largest percentage of discrepant pixels in the comparison (there will be more than one value for multi-extension products). In some cases the failures were because the data in an extension had a different size between result and truth; some files had a different number of extensions, or of table rows in an extension. The last dozen or so tests with very large numbers of discrepant pixels also had the RSCD and FIRSTFRAME steps performed in the test, while those steps were skipped in the truth files. There are about 4000 file comparisons done in the regression tests; a few are skipped, a few are XFAILed, and 112 failed, e.g. jwst.regtest.test_nirspec_fs_spec2.[stable-deps] test_nirspec_fs_spec2[jw02072-o002_20221206t143745_spec2_00001_asn.json-nsclean] 0.71% … The failures come from 20/69 test modules (e.g. test_nirspec_fs_spec2.py); that means 49/69 are passing. Some of these are very small differences and can safely be disregarded, but some are more significant. A sketch of the kind of per-file comparison involved follows below.
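For reference, a sketch of a result-vs-truth file comparison of this kind, using astropy's FITSDiff; the file names and tolerance here are placeholders, not the actual regtest configuration:

```python
from astropy.io import fits

# Compare a pipeline result against its truth file with a relative
# tolerance, roughly what the jwst regression tests do per file:
diff = fits.FITSDiff("result_rate.fits", "truth_rate.fits",
                     rtol=1e-5, atol=0.0)
if not diff.identical:
    # The report lists differing extensions, shapes, keywords,
    # and the fraction of differing pixels per extension.
    print(diff.report())
```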
Comment by Melanie Clarke on JIRA: Looking at the NIRSpec spec2 differences for MOS, FS, and IFU, they appear to be at least partly because the FFT fit in NSClean is a little different at the edges for full frame data. I attached an example from the jw02072-o002_20221206t143745_spec2_00001_asn.json-nsclean test. For this one, the edge effects look a little worse with numpy 2; for the MOS test, they look a little better with numpy 2 than with numpy 1. I don't think there's any reason to hold up transitioning to numpy 2 for this, but we should consider fixing edge effects more robustly while refactoring NSClean for full frame arrays in JP-3740. |
Comment by Maria Pena-Guerrero on JIRA: I did a run with numpy 2.0 and numpy 1.26.4 on MIRI data that I had locally to test the emicorr, dark, and saturation steps. I used file jw01386007001_04101_00006_mirimage_uncal.fits. I find that the differences are negligible; plots attached. [images: dark_current_np2vs126.png, emicorr_np2_vs126.png, saturation_np2_vs126.png] I obtained similar results for jw05594035001_02101_00005_nrcb1_uncal.fits.
test_engdb_mast.py also fails with numpy 2: jwst/jwst/lib/tests/test_engdb_mast.py (line 85 at commit 57f6148) compares string-converted numbers, whose formatting changed in numpy 2: https://github.com/spacetelescope/stdatamodels/actions/runs/11921284593/job/33225085285#step:10:308
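This is the scalar representation change from NEP 51: repr() of a numpy scalar now spells out the type, while str() is unchanged. A small illustration:

```python
import numpy as np

value = np.float64(0.5)

# NumPy 2 changed the repr of scalars (NEP 51) to include the type:
print(repr(value))  # numpy 1.26: '0.5'; numpy 2.x: 'np.float64(0.5)'

# str() is unchanged, so tests that compare string-converted numbers
# should use str() or an explicit format instead of repr():
print(str(value))      # '0.5' under both
print(f"{value:.1f}")  # '0.5' under both
```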
Comment by Melanie Clarke on JIRA: An issue with integer overflow in NIRSpec MOS shutter IDs under numpy 2 was addressed in:
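The fix itself isn't shown here, but numpy 2 changed two integer behaviours that can surface as negative IDs or as new errors; a hedged illustration (exact error messages may vary by version):

```python
import numpy as np

# Out-of-bound Python ints used to wrap (with a deprecation warning
# in late 1.x releases); NumPy 2 raises instead:
try:
    np.int16(70000)
except OverflowError as e:
    print(e)  # e.g. "Python integer 70000 out of bounds for int16"

# Under NEP 50, a Python int in arithmetic is cast to the array's
# dtype rather than widening it, so values that no longer fit raise:
ids = np.array([100], dtype=np.uint8)
try:
    _ = ids + 1000  # numpy 1.26: promotes to uint16; numpy 2.x: raises
except OverflowError as e:
    print(e)

# One possible remedy is to widen the dtype (e.g. np.int64)
# before doing the arithmetic.
```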
Comment by Robert Jedrzejewski on JIRA: For the Jenkins run using numpy 2 on September 19, the regression tests that failed under numpy 2 are: test_nirspec_fs_spec2.py … (24/69 tests, 34.7%). To try to separate the "significant" failures from the less significant ones, the tests were re-run individually.
test_nirspec_masterbackground: All of these had issues with some 2-d extracted images having the wrong shape, which was linked to negative values of the SHUTTRID keyword.
test_miri_lrs_slitless: The miri LRS tests also had significant differences, which appear to come from the pixel_replace step. [image: Figure_1.png]
test_miri_lrs_slit_spec3: Exposure jw01530-o005_t004_miri_p750l_x1d.fits in this test also showed some significant differences, again probably from differences in the pixel_replace step. [image: Figure2.png]
test_nirspec_fs_spec2: This test showed some significant artifacts at the edge of the 2-d spectral image in the nsclean product, as mentioned above.
test_nircam_coron3, test_miri_coron3: These tests showed significant differences in the PSF-subtracted images because of a slight difference in the center of the PSF …
Comment by Robert Jedrzejewski on JIRA: I looked at the latest JWST regtest that used numpy 2, from December 4 2024. The following 20 tests failed in both the December and September runs of the jwst regression tests under numpy 2: test_nirspec_fs_spec2.py …
Three tests failed in the December run but not in the September run:
test_miri_lrs_nod_masterbg.py: This test seems to fail with both 1.26 and 2.1.1. It looks like the pixel_replace signature. It didn't fail … [image: test_miri_lrs_nod_masterbg.png]
test_nirspec_wcs.py: When I ran this test offline with the latest fetch from main, it passed under both numpy 2.1.1 and 1.26. The test was changed …
test_fgs_image3.py: This test is new since September. It fails under both numpy 1.26 and 2.1.1; however, the 1.26 and 2.1.1 data are identical …
These 4 tests failed in the old (September) run but not in the December run:
test_nirspec_verify.py: Now passes when run offline with both 1.26 and 2.1.1. This test was changed in October to use real data; it previously used pre-launch data.
test_nircam_mtimage.py: The test was updated in October and now passes when run offline with numpy 1.26 and 2.1.1.
test_miri_mrs_badpix_selfcal.py: The test was changed in November. It now passes when run offline with both 1.26 and 2.1.1.
test_nirspec_subarray.py: This test was removed and subsumed into test_nirspec_irs2_detector1.
Issue JP-3664 was created on JIRA by Brett Graham:
Using numpy 2.0 results in failures in:
test_coron failures with numpy 2.0 #8579
Regression tests here: https://plwishmaster.stsci.edu:8081/job/RT/job/JWST-Developers-Pull-Requests/1664/#showFailuresLink
More recent regression test results here: https://plwishmaster.stsci.edu:8081/job/RT/job/JWST-Developers-Pull-Requests/1737/#showFailuresLink
These were built with PRs to jwst, stdatamodels and stcal:
jwst: #8718
stcal: tapastro/stcal@1bb106f (EDIT: updated commit hash)
stdatamodels: tapastro/stdatamodels@e3b53d6
Hashes are provided for stcal/stdatamodels to specify a custom installation procedure for future regression test runs, as needed.
The regression tests show many failures, most of which on cursory inspection show small numerical differences. We'll need to check them off one by one to ensure there aren't any unreasonable changes as part of this migration.
Helpful link to numpy 2.0 migration guide: https://numpy.org/devdocs/numpy_2_0_migration_guide.html
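For context, a few of the most common renames the guide covers (and that ruff's NPY201 rule, presumably the check in the screenshot above, flags via `ruff check --select NPY201`):

```python
import numpy as np

# Typical NumPy 2.0 renames from the migration guide:
x = np.float64(1.0)              # was: np.float_(1.0)
print(np.nan, np.inf)            # were: np.NaN, np.Inf
mask = np.isin([1, 2], [2])      # was: np.in1d([1, 2], [2])
stacked = np.vstack([[1], [2]])  # was: np.row_stack([[1], [2]])
```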