
[develop] Update SRW with spack-stack version 1.5.0 (from 1.4.1) #969

Merged
merged 56 commits into ufs-community:develop on Feb 15, 2024

Conversation

@RatkoVasic-NOAA commented Nov 14, 2023

DESCRIPTION OF CHANGES:

Update SRW with spack-stack 1.5.0
Machines affected:

  • gaea C4
  • gaea C5
  • Hera - intel
  • Hera - GNU
  • Hercules
  • Orion
  • Jet

Type of change

  • New feature (non-breaking change which adds functionality)

TESTS CONDUCTED:

Fundamental tests performed.

  • hera.intel
  • orion.intel
  • hercules.intel
  • cheyenne.intel
  • cheyenne.gnu
  • derecho.intel
  • gaea.intel
  • gaeac5.intel
  • jet.intel
  • wcoss2.intel
  • NOAA Cloud (indicate which platform)
  • Jenkins
  • fundamental test suite
  • comprehensive tests (specify which if a subset was used)

DEPENDENCIES:

PR #973 and its follow-up PR

ISSUE:

This solves issue #946
Issue to be solved: #991

CHECKLIST

  • My code follows the style guidelines in the Contributor's Guide
  • I have performed a self-review of my own code using the Code Reviewer's Guide
  • I have commented my code, particularly in hard-to-understand areas
  • My changes need updates to the documentation. I have made corresponding changes to the documentation
  • My changes do not require updates to the documentation (explain).
  • My changes generate no new warnings
  • New and existing tests pass with my changes

LABELS (optional):

A Code Manager needs to add the following labels to this PR:

  • Work In Progress
  • bug
  • enhancement
  • documentation
  • release
  • high priority
  • run_ci
  • run_we2e_fundamental_tests
  • run_we2e_comprehensive_tests
  • Needs Cheyenne test
  • Needs Jet test
  • Needs Hera test
  • Needs Orion test
  • help wanted

@RatkoVasic-NOAA

@MichaelLueken, @ulmononian caught one thing: we forgot to change the file system in the build and workflow scripts for Gaea-C5 from F2 to F5. I just committed these two changes. Do you think there's any other place (like in Jenkins)?

@MichaelLueken

> @MichaelLueken, @ulmononian caught one thing: we forgot to change the file system in the build and workflow scripts for Gaea-C5 from F2 to F5. I just committed these two changes. Do you think there's any other place (like in Jenkins)?

@RatkoVasic-NOAA - While queuing up the Jenkins tests this morning, I noticed that Gaea C5 is no longer using the gaea-c5 label, but gaeac5. I'm unsure whether this will also require renaming the gaea-c5 modulefiles to gaeac5, or whether the SRW_PLATFORM setting is still being set as gaea-c5.

@RatkoVasic-NOAA

@MichaelLueken I found two more files pointing to the old file system.
As for the machine name, I don't think we changed anything, so I believe it will work with gaea-c5. But if the consensus is to go without the hyphen, I'm OK with changing it everywhere.

@MichaelLueken

@RatkoVasic-NOAA - There are no longer any nodes associated with gaea-c5 in Jenkins. In order to run Jenkins on Gaea C5 moving forward, we will need to set gaeac5 in .cicd/Jenkinsfile.
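For reference, that rename amounts to substituting the new label wherever the old one appears in the Jenkinsfile. A minimal sketch (assuming the label string appears literally in .cicd/Jenkinsfile; review the matches before committing):

    # List the places the old label is referenced, then rename it in place
    grep -n 'gaea-c5' .cicd/Jenkinsfile
    sed -i 's/gaea-c5/gaeac5/g' .cicd/Jenkinsfile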

@RatkoVasic-NOAA

@MichaelLueken I'm looking into UFS WM PRs, and they are changing Gaea's name from gaea-c5 to just gaea. Can you check with whoever changed it by just dropping the hyphen whether they also agree on having just gaea? It would be great to have the same name across all applications.

@MichaelLueken added the run_we2e_jenkins_coverage_tests label (SRW App automated CI testing with modified Jenkinsfile) on Jan 25, 2024
@ulmononian

> @MichaelLueken I'm looking into UFS WM PRs, and they are changing Gaea's name from gaea-c5 to just gaea. Can you check with whoever changed it by just dropping the hyphen whether they also agree on having just gaea? It would be great to have the same name across all applications.

@jkbk2004 @zach1221 do you know how this call was made for the ufs-wm?

@MichaelLueken

@RatkoVasic-NOAA - I'm using the SRW_App_Jenkinsfile_test sandbox on Jenkins to test the changes on Gaea C5. If any changes are required, I will open one final PR to your ss150 branch to address the issues.

@MichaelLueken

@RatkoVasic-NOAA and @ulmononian - After talking with Zach, it turns out the change in PR #2115 is only for manual runs of the UFS WM regression tests. The Jenkinsfile is still pointing to gaeac5 for the UFS WM Jenkins tests.

@MichaelLueken

Additional information from Kris Booker:

As of right now, Jenkins refers to Gaea with the node name GaeaC5 and the label 'gaeac5'. This was due to an issue with the UFS WM pipeline.

So, for the purposes of the Jenkinsfile, renaming the gaea-c5 label to gaeac5 is the correct method.

@MichaelLueken

@RatkoVasic-NOAA and @natalie-perlin - I have made the necessary modifications to allow the SRW App to successfully build on Gaea C5, but while attempting to run the WE2E coverage tests, the tests are all failing in make_grid with the following error message:

/gpfs/f5/epic/scratch/Michael.Lueken/ufs-srweather-app/gaeac5/install_intel/exec/regional_esg_grid: symbol lookup error: /usr/lib64/libssh.so.4: undefined symbol: EVP_KDF_CTX_new_id, version OPENSSL_1_1_1d

I'll continue to dig around and see what might be happening, but I would appreciate any assistance you can provide, especially if this error message was encountered on other machines transitioning to spack-stack v1.5.0.

My forked branch of @RatkoVasic-NOAA's ss150 branch can be found at https://github.com/MichaelLueken/ufs-srweather-app/tree/ss150
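One way to narrow down a symbol lookup error like this is to check which shared libraries the executable actually resolves at run time and whether the libcrypto it finds exports the symbol libssh expects. A hedged diagnostic sketch (the executable path is the one quoted above; the libcrypto path is an assumption and may differ on Gaea C5):

    # Which ssh/ssl/crypto libraries does the executable pick up at run time?
    ldd /gpfs/f5/epic/scratch/Michael.Lueken/ufs-srweather-app/gaeac5/install_intel/exec/regional_esg_grid \
      | grep -Ei 'ssh|ssl|crypto'

    # Does the resolved libcrypto export the symbol libssh.so.4 needs?
    nm -D /usr/lib64/libcrypto.so.1.1 2>/dev/null | grep EVP_KDF_CTX_new_id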

@MichaelLueken

@natalie-perlin - Making the changes to modulefiles/wflow_gaea-c5.lua likely caused the issue. I forgot that Gaea C5 requires the old workflow_tools conda environment to work. I'm working on correcting this in my branch, as well as updating the devbuild.sh script to replace gaea-c5 with gaeac5, and will then rebuild and rerun the tests.
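Once the platform name is updated in devbuild.sh, the rebuild would look something like this (an illustrative invocation; the flag names follow the usual SRW devbuild.sh pattern and should be checked against the script's help output):

    # Rebuild the SRW App for the renamed platform with the Intel compiler
    ./devbuild.sh --platform=gaeac5 --compiler=intel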

@MichaelLueken

@RatkoVasic-NOAA - There were four WE2E comprehensive tests that failed on Jet:

----------------------------------------------------------------------------------------------------
Experiment name                                                  | Status    | Core hours used
----------------------------------------------------------------------------------------------------
2020_CAD_20240125173400                                            COMPLETE              37.68
community_20240125173401                                           COMPLETE              17.23
custom_ESGgrid_20240125173403                                      COMPLETE              20.23
custom_ESGgrid_Central_Asia_3km_20240125173404                     COMPLETE              35.80
custom_ESGgrid_Great_Lakes_snow_8km_20240125173405                 COMPLETE              13.14
custom_ESGgrid_IndianOcean_6km_20240125173407                      COMPLETE              17.01
custom_ESGgrid_NewZealand_3km_20240125173408                       COMPLETE              71.01
custom_ESGgrid_Peru_12km_20240125173409                            COMPLETE              21.22
custom_ESGgrid_SF_1p1km_20240125173411                             COMPLETE             219.84
custom_GFDLgrid__GFDLgrid_USE_NUM_CELLS_IN_FILENAMES_eq_FALSE_202  COMPLETE               8.84
custom_GFDLgrid_20240125173413                                     COMPLETE               9.03
deactivate_tasks_20240125173414                                    COMPLETE               0.76
get_from_AWS_ics_GEFS_lbcs_GEFS_fmt_grib2_2022040400_ensemble_2me  COMPLETE            1015.01
get_from_HPSS_ics_FV3GFS_lbcs_FV3GFS_fmt_grib2_2019061200_2024012  COMPLETE               6.60
get_from_HPSS_ics_FV3GFS_lbcs_FV3GFS_fmt_nemsio_2019061200_202401  COMPLETE               9.01
get_from_HPSS_ics_FV3GFS_lbcs_FV3GFS_fmt_nemsio_2021032018_202401  COMPLETE               9.08
get_from_HPSS_ics_FV3GFS_lbcs_FV3GFS_fmt_netcdf_2022060112_48h_20  COMPLETE              50.75
get_from_HPSS_ics_GDAS_lbcs_GDAS_fmt_netcdf_2022040400_ensemble_2  COMPLETE            1073.64
get_from_HPSS_ics_GSMGFS_lbcs_GSMGFS_20240125173423                COMPLETE               6.76
get_from_HPSS_ics_HRRR_lbcs_RAP_20240125173424                     COMPLETE              13.18
get_from_HPSS_ics_RAP_lbcs_RAP_20240125173426                      COMPLETE              15.41
get_from_NOMADS_ics_FV3GFS_lbcs_FV3GFS_20240125173427              DEAD                  12.19
grid_CONUS_25km_GFDLgrid_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_202  COMPLETE              19.23
grid_CONUS_3km_GFDLgrid_ics_FV3GFS_lbcs_FV3GFS_suite_RRFS_v1beta_  COMPLETE             361.94
grid_RRFS_AK_13km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_plot_20240  COMPLETE             166.24
grid_RRFS_AK_3km_ics_FV3GFS_lbcs_FV3GFS_suite_HRRR_20240125173432  COMPLETE             231.01
grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_RAP_20240125173  COMPLETE              36.20
grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_plot_20  COMPLETE              42.18
grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_HRRR_2024012517  COMPLETE              39.96
grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_RRFS_v1beta_202  COMPLETE              39.95
grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v15p2_20240  COMPLETE              10.31
grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_plot_20  COMPLETE              16.53
grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v17_p8_plot  COMPLETE              14.47
grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_HRRR_2024012517  COMPLETE              15.93
grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_RAP_20240125173  COMPLETE              43.60
grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_RRFS_v1beta_202  COMPLETE              18.64
grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_RAP_suite_RAP_20240125173447  COMPLETE              10.07
grid_RRFS_CONUS_25km_ics_GSMGFS_lbcs_GSMGFS_suite_GFS_v15p2_20240  COMPLETE               7.01
grid_RRFS_CONUS_25km_ics_NAM_lbcs_NAM_suite_GFS_v16_2024012517345  COMPLETE              18.43
grid_RRFS_CONUS_25km_ics_NAM_lbcs_NAM_suite_RRFS_v1beta_202401251  COMPLETE              14.50
grid_RRFS_CONUS_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v15p2_202401  COMPLETE             328.71
grid_RRFS_CONUS_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v15_thompson  COMPLETE            3282.92
grid_RRFS_CONUS_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_20240125  COMPLETE             419.22
grid_RRFS_CONUS_3km_ics_FV3GFS_lbcs_FV3GFS_suite_HRRR_20240125173  COMPLETE             514.55
grid_RRFS_CONUS_3km_ics_FV3GFS_lbcs_FV3GFS_suite_RRFS_v1beta_2024  COMPLETE             520.15
grid_RRFS_CONUScompact_13km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_  COMPLETE              33.34
grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_HRRR_20240125  COMPLETE              31.89
grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta_2  COMPLETE              30.89
grid_RRFS_CONUScompact_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_  COMPLETE              10.80
grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_HRRR_suite_HRRR_2024012  COMPLETE              24.02
grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta_2  COMPLETE               8.62
grid_RRFS_CONUScompact_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_2  COMPLETE             365.49
grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_HRRR_202401251  COMPLETE             434.65
grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta_20  COMPLETE             444.76
grid_RRFS_NA_13km_ics_FV3GFS_lbcs_FV3GFS_suite_RAP_20240125173513  COMPLETE              95.37
grid_SUBCONUS_Ind_3km_ics_FV3GFS_lbcs_FV3GFS_suite_WoFS_v0_202401  COMPLETE              20.92
grid_SUBCONUS_Ind_3km_ics_HRRR_lbcs_HRRR_suite_HRRR_2024012517351  COMPLETE              23.69
grid_SUBCONUS_Ind_3km_ics_HRRR_lbcs_RAP_suite_WoFS_v0_20240125173  COMPLETE              20.38
grid_SUBCONUS_Ind_3km_ics_NAM_lbcs_NAM_suite_GFS_v16_202401251735  COMPLETE              29.89
grid_SUBCONUS_Ind_3km_ics_RAP_lbcs_RAP_suite_RRFS_v1beta_plot_202  COMPLETE              12.31
long_fcst_20240125173522                                           COMPLETE              63.52
MET_ensemble_verification_only_vx_20240125173523                   COMPLETE               1.33
MET_ensemble_verification_only_vx_time_lag_20240125173526          DEAD                   4.41
MET_ensemble_verification_winter_wx_20240125173528                 COMPLETE             118.67
MET_verification_only_vx_20240125173531                            COMPLETE               0.27
nco_20240125173533                                                 COMPLETE               7.73
nco_ensemble_20240125173535                                        COMPLETE              73.55
nco_grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_202  COMPLETE              33.75
nco_grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_timeoffset_suite_  DEAD                   3.46
nco_grid_RRFS_CONUS_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v15_thom  COMPLETE             434.71
nco_grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_HRRR_2024  COMPLETE              10.79
pregen_grid_orog_sfc_climo_20240125173546                          COMPLETE               8.81
specify_EXTRN_MDL_SYSBASEDIR_ICS_LBCS_20240125173548               COMPLETE               7.08
specify_template_filenames_20240125173549                          DEAD                   6.47
----------------------------------------------------------------------------------------------------
Total                                                              DEAD               11216.74

The Jenkins working directory on Jet is /mnt/lfs1/NAGAPE/epic/role.epic/jenkins/workspace/fs-srweather-app_pipeline_PR-969/jet/expt_dirs.

The get_from_NOMADS_ics_FV3GFS_lbcs_FV3GFS test failed in the make_ics task due to a bad allocation. I suspect a rerun will allow this test to pass.

The MET_ensemble_verification_only_vx_time_lag test failed in the run_MET_PcpCombine_fcst_APCP0*h_mem00* tasks because OBS_DIR does not exist or is not a directory. The tasks that pulled the necessary data completed successfully, so hopefully a rerun will work here as well.

The nco_grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_timeoffset_suite_GFS_v16 test failed in the make_lbcs task due to OOM kill events. A rerun should work.

The specify_template_filenames test failed in the run_fcst_mem000 task due to a CFL violation. Hopefully a rerun will correct this.

I'm resubmitting the failed jobs now and will let you know how they look.
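For anyone reproducing this, resubmitting a single failed task with rocoto typically looks like the following (a minimal sketch; the workflow file names follow SRW conventions and the cycle value is illustrative):

    # Rewind and reboot one failed task for one cycle, from the experiment directory
    rocotorewind -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -c 202004040000 -t run_fcst_mem000
    rocotoboot   -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -c 202004040000 -t run_fcst_mem000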

@MichaelLueken

@RatkoVasic-NOAA - Three of the tests that had failed are now successfully passing on Jet:

  expt_name = "MET_ensemble_verification_only_vx_time_lag"
  wflow_status = "SUCCESS"
  expt_name = "get_from_NOMADS_ics_FV3GFS_lbcs_FV3GFS"
  wflow_status = "SUCCESS"
  expt_name = "nco_grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_timeoffset_suite_GFS_v16"
  wflow_status = "SUCCESS"

The specify_template_filenames test is still failing in run_fcst with:

FATAL from PE 1: compute_qs: saturation vapor pressure table overflow, nbad= 1

It doesn't make sense to me that a single test is encountering a CFL violation while the rest of the tests that also use the FV3_GFS_v15p2 SDF are not.

For Gaea C5, the WE2E fundamental tests fail in make_grid with:

/gpfs/f5/epic/scratch/Michael.Lueken/ufs-srweather-app/gaeac5/install_intel/exec/regional_esg_grid: symbol lookup error: /usr/lib64/libssh.so.4: undefined symbol: EVP_KDF_CTX_new_id, version OPENSSL_1_1_1d

This is using the BUILD_CONDA option to create the srw_app conda environment on the machine. While applying the changes that @natalie-perlin made to allow the WE2E tests to run on Gaea C5 with the F2 filesystem (not using the BUILD_CONDA option and using the old workflow_tools conda environment instead), the WE2E tests fail to generate because uwtools isn't in the workflow_tools conda environment. Following the merging of PR #994, uwtools needs to be in the conda environment, otherwise the templater tool will fail.
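A quick way to confirm whether a given conda environment actually provides uwtools (a hedged check; environment names as discussed above):

    # Check the legacy workflow_tools environment for the uwtools package
    conda activate workflow_tools
    python -c "import uwtools" && echo "uwtools present" || echo "uwtools missing"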

@RatkoVasic-NOAA

@MichaelLueken I'm now running the 'specify_template_filenames' test on Jet. Last time I ran it, it worked. I'll start from the beginning.

@MichaelLueken

@RatkoVasic-NOAA - After many rocotorewind/rocotoboot cycles on run_fcst in specify_template_filenames, all of the WE2E comprehensive tests successfully passed on Jet:

----------------------------------------------------------------------------------------------------
Experiment name                                                  | Status    | Core hours used 
----------------------------------------------------------------------------------------------------
2020_CAD_20240125173400                                            COMPLETE              37.68
community_20240125173401                                           COMPLETE              17.23
custom_ESGgrid_20240125173403                                      COMPLETE              20.23
custom_ESGgrid_Central_Asia_3km_20240125173404                     COMPLETE              35.80
custom_ESGgrid_Great_Lakes_snow_8km_20240125173405                 COMPLETE              13.14
custom_ESGgrid_IndianOcean_6km_20240125173407                      COMPLETE              17.01
custom_ESGgrid_NewZealand_3km_20240125173408                       COMPLETE              71.01
custom_ESGgrid_Peru_12km_20240125173409                            COMPLETE              21.22
custom_ESGgrid_SF_1p1km_20240125173411                             COMPLETE             219.84
custom_GFDLgrid__GFDLgrid_USE_NUM_CELLS_IN_FILENAMES_eq_FALSE_202  COMPLETE               8.84
custom_GFDLgrid_20240125173413                                     COMPLETE               9.03
deactivate_tasks_20240125173414                                    COMPLETE               0.76
get_from_AWS_ics_GEFS_lbcs_GEFS_fmt_grib2_2022040400_ensemble_2me  COMPLETE            1015.01
get_from_HPSS_ics_FV3GFS_lbcs_FV3GFS_fmt_grib2_2019061200_2024012  COMPLETE               6.60
get_from_HPSS_ics_FV3GFS_lbcs_FV3GFS_fmt_nemsio_2019061200_202401  COMPLETE               9.01
get_from_HPSS_ics_FV3GFS_lbcs_FV3GFS_fmt_nemsio_2021032018_202401  COMPLETE               9.08
get_from_HPSS_ics_FV3GFS_lbcs_FV3GFS_fmt_netcdf_2022060112_48h_20  COMPLETE              50.75
get_from_HPSS_ics_GDAS_lbcs_GDAS_fmt_netcdf_2022040400_ensemble_2  COMPLETE            1073.64
get_from_HPSS_ics_GSMGFS_lbcs_GSMGFS_20240125173423                COMPLETE               6.76
get_from_HPSS_ics_HRRR_lbcs_RAP_20240125173424                     COMPLETE              13.18
get_from_HPSS_ics_RAP_lbcs_RAP_20240125173426                      COMPLETE              15.41
get_from_NOMADS_ics_FV3GFS_lbcs_FV3GFS_20240125173427              COMPLETE              13.65
grid_CONUS_25km_GFDLgrid_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_202  COMPLETE              19.23
grid_CONUS_3km_GFDLgrid_ics_FV3GFS_lbcs_FV3GFS_suite_RRFS_v1beta_  COMPLETE             361.94
grid_RRFS_AK_13km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_plot_20240  COMPLETE             166.24
grid_RRFS_AK_3km_ics_FV3GFS_lbcs_FV3GFS_suite_HRRR_20240125173432  COMPLETE             231.01
grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_RAP_20240125173  COMPLETE              36.20
grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_plot_20  COMPLETE              42.18
grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_HRRR_2024012517  COMPLETE              39.96
grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_RRFS_v1beta_202  COMPLETE              39.95
grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v15p2_20240  COMPLETE              10.31
grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_plot_20  COMPLETE              16.53
grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v17_p8_plot  COMPLETE              14.47
grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_HRRR_2024012517  COMPLETE              15.93
grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_RAP_20240125173  COMPLETE              43.60
grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_RRFS_v1beta_202  COMPLETE              18.64
grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_RAP_suite_RAP_20240125173447  COMPLETE              10.07
grid_RRFS_CONUS_25km_ics_GSMGFS_lbcs_GSMGFS_suite_GFS_v15p2_20240  COMPLETE               7.01
grid_RRFS_CONUS_25km_ics_NAM_lbcs_NAM_suite_GFS_v16_2024012517345  COMPLETE              18.43
grid_RRFS_CONUS_25km_ics_NAM_lbcs_NAM_suite_RRFS_v1beta_202401251  COMPLETE              14.50
grid_RRFS_CONUS_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v15p2_202401  COMPLETE             328.71
grid_RRFS_CONUS_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v15_thompson  COMPLETE            3282.92
grid_RRFS_CONUS_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_20240125  COMPLETE             419.22
grid_RRFS_CONUS_3km_ics_FV3GFS_lbcs_FV3GFS_suite_HRRR_20240125173  COMPLETE             514.55
grid_RRFS_CONUS_3km_ics_FV3GFS_lbcs_FV3GFS_suite_RRFS_v1beta_2024  COMPLETE             520.15
grid_RRFS_CONUScompact_13km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_  COMPLETE              33.34
grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_HRRR_20240125  COMPLETE              31.89
grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta_2  COMPLETE              30.89
grid_RRFS_CONUScompact_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_  COMPLETE              10.80
grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_HRRR_suite_HRRR_2024012  COMPLETE              24.02
grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta_2  COMPLETE               8.62
grid_RRFS_CONUScompact_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_2  COMPLETE             365.49
grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_HRRR_202401251  COMPLETE             434.65
grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta_20  COMPLETE             444.76
grid_RRFS_NA_13km_ics_FV3GFS_lbcs_FV3GFS_suite_RAP_20240125173513  COMPLETE              95.37
grid_SUBCONUS_Ind_3km_ics_FV3GFS_lbcs_FV3GFS_suite_WoFS_v0_202401  COMPLETE              20.92
grid_SUBCONUS_Ind_3km_ics_HRRR_lbcs_HRRR_suite_HRRR_2024012517351  COMPLETE              23.69
grid_SUBCONUS_Ind_3km_ics_HRRR_lbcs_RAP_suite_WoFS_v0_20240125173  COMPLETE              20.38
grid_SUBCONUS_Ind_3km_ics_NAM_lbcs_NAM_suite_GFS_v16_202401251735  COMPLETE              29.89
grid_SUBCONUS_Ind_3km_ics_RAP_lbcs_RAP_suite_RRFS_v1beta_plot_202  COMPLETE              12.31
long_fcst_20240125173522                                           COMPLETE              63.52
MET_ensemble_verification_only_vx_20240125173523                   COMPLETE               1.33
MET_ensemble_verification_only_vx_time_lag_20240125173526          COMPLETE               6.26
MET_ensemble_verification_winter_wx_20240125173528                 COMPLETE             118.67
MET_verification_only_vx_20240125173531                            COMPLETE               0.27
nco_20240125173533                                                 COMPLETE               7.73
nco_ensemble_20240125173535                                        COMPLETE              73.55
nco_grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_202  COMPLETE              33.75
nco_grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_timeoffset_suite_  COMPLETE              14.34
nco_grid_RRFS_CONUS_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v15_thom  COMPLETE             434.71
nco_grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_HRRR_2024  COMPLETE              10.79
pregen_grid_orog_sfc_climo_20240125173546                          COMPLETE               8.81
specify_EXTRN_MDL_SYSBASEDIR_ICS_LBCS_20240125173548               COMPLETE               7.08
specify_template_filenames_20240125173549                          COMPLETE              11.27
----------------------------------------------------------------------------------------------------
Total                                                              COMPLETE           11235.73

@MichaelLueken

On Gaea C5, while compiling, I am seeing the following messages, which didn't appear before the transition to the F5 filesystem:

[  0%] Building Fortran object sorc/emcsfc_ice_blend.fd/CMakeFiles/emcsfc_ice_blend.dir/emcsfc_ice_blend.f90.o
No supported cpu target is set, CRAY_CPU_TARGET=x86-64 will be used.
Load a valid targeting module or set CRAY_CPU_TARGET

Additionally, I'm trying to see if the modifications to etc/lmod-setup.sh and etc/lmod-setup.csh (replacing the calls to source /lustre/f2/dev/role.epic/contrib/Lmod_init_C5.sh and source /lustre/f2/dev/role.epic/contrib/Lmod_init_C5.csh, respectively, with module reset) might be causing issues.

It's still unclear why the regional_esg_grid executable would encounter the symbol lookup error: /usr/lib64/libssh.so.4: undefined symbol: EVP_KDF_CTX_new_id, version OPENSSL_1_1_1d.
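For context, the etc/lmod-setup.sh change described above swaps a site-specific Lmod init script for a plain module reset. A hedged sketch of the relevant branch (the variable and value names are illustrative, not the repository's exact code):

    # Before (F2 filesystem): source /lustre/f2/dev/role.epic/contrib/Lmod_init_C5.sh
    # After (F5 filesystem): fall back to the system Lmod defaults
    if [ "${L_MACHINE}" = "gaeac5" ]; then
      module reset
    fi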

@MichaelLueken merged commit 869761c into ufs-community:develop on Feb 15, 2024
2 checks passed
Labels

  • enhancement (New feature or request)
  • run_we2e_coverage_tests (Run the coverage set of SRW end-to-end tests)
  • run_we2e_jenkins_coverage_tests (SRW App automated CI testing with modified Jenkinsfile)
Development

Successfully merging this pull request may close these issues.

  • Upgrade SRW to spack-stack 1.5.0
  • Upgrade ip to 4.1.0