From 71ea0207fec678f3200573c6ae4ce5d28947f457 Mon Sep 17 00:00:00 2001
From: daniel-hildebrandt <98896618+daniel-hildebrandt@users.noreply.github.com>
Date: Wed, 11 May 2022 08:47:42 -0600
Subject: [PATCH] Release/public v2 (#765)

* Add Gaea as a supported platform for the regional_workflow (#734)

* Updates to port regional workflow to gaea
* Temp change with -v as batch option
* new fixes for gaea/slurm
* Updated time for make lbcs
* added TEST data directory path
* Update gaea.sh
* fixes for PR

* Add more parameters to CSV file containing WE2E test info (#740)

## DESCRIPTION OF CHANGES:
The script/function `get_WE2Etest_names_subdirs_descs.sh` (which is called from
`run_WE2E_tests.sh` if needed) creates a CSV (Comma-Separated Value) file named
`WE2E_test_info.csv` that contains information about the WE2E tests. Currently,
this CSV file contains only 3 columns: the test name, any alternate names for
the test, and the test description.

In order to have a more complete summary of the WE2E tests, this PR modifies
`get_WE2Etest_names_subdirs_descs.sh` so that additional information is included
in the CSV file. This additional information consists of the values of the
following experiment variables for each test:

```
PREDEF_GRID_NAME
CCPP_PHYS_SUITE
EXTRN_MDL_NAME_ICS
EXTRN_MDL_NAME_LBCS
DATE_FIRST_CYCL
DATE_LAST_CYCL
CYCL_HRS
INCR_CYCL_FREQ
FCST_LEN_HRS
LBC_SPEC_INTVL_HRS
NUM_ENS_MEMBERS
```

In addition, the script uses this information to calculate the number of times
each test calls the forecast model (e.g. if the test uses 3 different cycle
dates, then the forecast model will be called 3 times; if it is an ensemble
test for a single cycle, the test will call the forecast model as many times as
the number of ensemble members). (An illustrative sketch of this calculation
appears further below.)

## TESTS CONDUCTED:
The script `run_WE2E_tests.sh` was called, which in turn calls
`get_WE2Etest_names_subdirs_descs.sh`. This created a new CSV file that
contained the new fields (columns). The CSV file was imported into Google
Sheets (using "|" as the field/column separator) and looked correct.

## DOCUMENTATION:
The documentation is for the most part already within
`get_WE2Etest_names_subdirs_descs.sh` itself. This PR slightly updates that
documentation.

* Update directory structure of NCO mode (#743)

* update vertical structure of NCO mode
* update sample script for nco
* Fix typo on write component of new RRFS CONUS

* Default CCPP physics option is FV3_GFS_v16 (#746)

* Updated the default CCPP physics option to FV3_GFS_v16
* Updated the default CCPP physics option to FV3_GFS_v16 in config_defaults.sh

Co-authored-by: Natalie Perlin

* Adds an alternative python workflow generation path (#698)

* Workflow in python starting to work.
* Use new python_utils package structure.
* Some bug fixes.
* Use uppercase TRUE/FALSE in var_dfns
* Use config.sh by default.
* Minor bug fixes.
* Remove config.yaml
* Update to the latest develop
* Remove quotes from numbers in predef grid.
* Minor bug fix.
* Move validity checker to the bottom of setup
* Add more unit tests.
* Update with python_utils changes.
* Update to latest develop additions (Need to re-run regression test)
* Use set_namelist and fill_jinja_template as python functions.
* Replace sed regex searches with python re.
* Use python realpath.
* Construct settings as dictionary before passing to fill_jinja and set_namelist
* Use yaml for setting predefined grid parameters.
* Use xml parser for ccpp phys suite definition file.
* Remove more run_command calls.
* Simplify some func argument processing.
* Move different config format parsers to same file.
* Use os.path.join for the sake of macosx
* Remove remaining func argument processing via os.environ.
* Minor bug fix in set_extrn_mdl_params.sh
* Add suite defn in test_data.
* Minor fixes on unittest on jet.
* Simplify boolean condition checks.
* Include old in renaming of old directories
* Fix conflicting yaml !join tag for paths and strings.
* Bug fix with setting sfcperst dict.
* Imitate "readlink -m" with os.path.realpath instead of os.readlink
* Don't use /tmp as that is shared by multiple users.
* Bug fix with cron line, maintain quotes around TRUE/FALSE.
* Update to latest develop (untested)
* Bug fix with existing cron line and quotes.
* Bug fix with case-sensitive MACHINE name, and empty EXPT_DIR.
* Update to latest develop
* More updates.
* Bug fix thanks to @willmayfield! Check both starting/ending characters are brackets for shell variable to be considered an array.
* Make empty EXPT_BASEDIR workable.
* Update to latest develop
* Update in predef grid.
* Check f90nml as well.

Co-authored-by: Daniel Abdi

* Fix typo and crontab issue on wcoss dell in workflow python scripts (#750)

* Fix typo and failure on wcoss
* fix new line issue on wcoss dell
* remove capture_output
* Get USER from environment

Co-authored-by: Daniel Abdi

* Add new WE2E configs (#748)

## DESCRIPTION OF CHANGES:
Added two new WE2E config files for the Sub-CONUS Indianapolis domain to support
the upcoming SRW release. In addition, modified the external data used in the
`config.specify_EXTRN_MDL_SYSBASEDIR_ICS_LBCS.sh` test to match more common
datasets used in the WE2E testing process.

## TESTS CONDUCTED:
Successfully ran the new WE2E tests
(`config.SUBCONUS_Ind_3km_ics_HRRR_lbcs_RAP_suite_HRRR.sh`,
`config.SUBCONUS_Ind_3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh`) and
`config.specify_EXTRN_MDL_SYSBASEDIR_ICS_LBCS.sh` on a NOAA Parallel Works AWS
instance.

## DEPENDENCIES:
None.

## DOCUMENTATION:
No documentation changes are required.

* Added a fixed WoF grid and the python tool to determine the write component parameters (#733)

* Added a fixed WoF grid and the python tool to determine the write component parameters
* Update set_predef_grid_params.sh
* Renamed file as recommended and removed unused lines
* Modified comment

Co-authored-by: JeffBeck-NOAA <55201531+JeffBeck-NOAA@users.noreply.github.com>
Co-authored-by: WYH@MBP

* Replace env with modulefiles in scripts (#752)

* change env to mod
* update we2e script

* WE2E script improvements for usability (#745)

## DESCRIPTION OF CHANGES:
* Modifications to `run_WE2E_tests.sh`:
  * Add examples to help/usage statement
* Modifications to `check_expts_status.sh`:
  * Add arguments list that can be processed by `process_args`
  * Add new optional arguments: `num_log_lines`, `verbose`
  * Include a help/usage message

## TESTS CONDUCTED:
* Ran `run_WE2E_tests.sh --help` from the command line and got the expected help message.
* Ran `check_expts_status.sh --help` from the command line and got the expected help message.
* Used `run_WE2E_tests.sh` to run a set of 2 WE2E tests -- works as expected.
* Used `check_expts_status.sh` to check on the status of the 2 tests run above and got the expected status message.

## DEPENDENCIES:
PR #[241](https://github.com/ufs-community/ufs-srweather-app/pull/241)

## DOCUMENTATION:
A lot of this PR is documentation in the scripts. There is an accompanying
documentation PR #[241](https://github.com/ufs-community/ufs-srweather-app/pull/241)
into ufs-srweather-app.
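The following is an illustrative sketch of the forecast-count calculation
referenced in the PR #740 item above: the number of times a WE2E test runs the
forecast model is the number of cycle dates multiplied by the number of
ensemble members. This is only a simplified Python sketch, not the script's own
implementation (`get_WE2Etest_names_subdirs_descs.sh` does the equivalent
bookkeeping in bash and also accounts for `INCR_CYCL_FREQ`); the helper name
`num_forecast_runs` and the one-cycle-per-listed-hour-per-day assumption are
illustrative only.

```python
from datetime import datetime

def num_forecast_runs(date_first_cycl, date_last_cycl, cycl_hrs, num_ens_members=1):
    """Hypothetical helper: count forecast-model runs for a WE2E test.

    date_first_cycl / date_last_cycl are 'YYYYMMDD' strings, cycl_hrs is a
    list of cycle hours such as ['00', '12'], and num_ens_members defaults
    to 1 for deterministic tests (mirroring the CSV-file convention above).
    """
    d0 = datetime.strptime(date_first_cycl, "%Y%m%d")
    d1 = datetime.strptime(date_last_cycl, "%Y%m%d")
    num_days = (d1 - d0).days + 1            # both end dates are inclusive
    num_cdates = num_days * len(cycl_hrs)    # one cycle per listed hour per day
    return num_cdates * num_ens_members

# A test with 3 cycle dates runs the model 3 times; a single-cycle
# ensemble test with 2 members runs it twice.
print(num_forecast_runs("20190701", "20190703", ["00"]))     # 3
print(num_forecast_runs("20190615", "20190615", ["00"], 2))  # 2
```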
* Standardize static data across Tier-1 platforms; fix and improve IC and LBC data retrieval (#744)

* Bug fixes (grid size + suppress screen output from module load) (#756)

## DESCRIPTION OF CHANGES:
1) Adjust the y-direction size of the write-component grid of the
   `SUBCONUS_Ind_3km` predefined grid from 195 to 197 (this was just an
   oversight in PR #725).
2) Redirect the output of the module load in the launch script
   (`launch_FV3LAM_wflow.sh`) to `/dev/null` to avoid unwanted screen output
   (this output was introduced in PR #[238](https://github.com/ufs-community/ufs-srweather-app/pull/238)
   in ufs-srweather-app; it describes how to load the `regional_workflow`
   environment and is not relevant in this context).

## TESTS CONDUCTED:
1) Plotted the `SUBCONUS_Ind_3km` grid to ensure it has the correct size (it does).
2) Manually ran `launch_FV3LAM_wflow.sh` from the command line to verify that screen output is suppressed (it is).

* Update default SPP ISEED array in config_defaults.sh to use unique values (#759)

* Modify RRFS North America 3- and 13-km domain configuration and WE2E test.
* Modify default ISEED values for SPP
* Fix grid in WE2E test

* Update workflow python scripts (#760)

* update python scripts

* Change output file name of run_post to meet NCO standards (#758)

* change output file name
* change variable name
* update python script
* remove duplicates
* add a check for empty variables
* move variable to common area
* clean up unnecessary comments
* update scripts
* remove duplicate
* update python scripts
* fix user-staged dir path issue in python script

* Add POST_OUTPUT_DOMAIN_NAME to WE2E tests for new grids (#763)

* Add new var to we2e tests for new grids
* rename we2e tests for custom grid
* remove unnecessary $

Co-authored-by: Mark Potts <33099090+mark-a-potts@users.noreply.github.com>
Co-authored-by: gsketefian <31046882+gsketefian@users.noreply.github.com>
Co-authored-by: Chan-Hoo.Jeon-NOAA <60152248+chan-hoo@users.noreply.github.com>
Co-authored-by: Natalie Perlin <68030316+natalie-perlin@users.noreply.github.com>
Co-authored-by: Natalie Perlin
Co-authored-by: danielabdi-noaa <52012304+danielabdi-noaa@users.noreply.github.com>
Co-authored-by: Daniel Abdi
Co-authored-by: Daniel Abdi
Co-authored-by: EdwardSnyder-NOAA <96196752+EdwardSnyder-NOAA@users.noreply.github.com>
Co-authored-by: Yunheng Wang <47898913+ywangwof@users.noreply.github.com>
Co-authored-by: JeffBeck-NOAA <55201531+JeffBeck-NOAA@users.noreply.github.com>
Co-authored-by: WYH@MBP
Co-authored-by: Michael Kavulich
---
 modulefiles/tasks/gaea/make_grid.local | 6 +
 modulefiles/tasks/gaea/make_ics.local | 6 +
 modulefiles/tasks/gaea/make_lbcs.local | 6 +
 modulefiles/tasks/gaea/run_fcst.local | 6 +
 modulefiles/tasks/gaea/run_vx.local | 7 +
 scripts/exregional_get_extrn_mdl_files.sh | 4 +-
 scripts/exregional_make_grid.sh | 2 +-
 scripts/exregional_run_ensgridvx.sh | 1 +
 scripts/exregional_run_ensgridvx_mean.sh | 1 +
 scripts/exregional_run_ensgridvx_prob.sh | 1 +
 scripts/exregional_run_enspointvx.sh | 1 +
 scripts/exregional_run_enspointvx_mean.sh | 1 +
 scripts/exregional_run_enspointvx_prob.sh | 1 +
 scripts/exregional_run_fcst.sh | 5 +-
 scripts/exregional_run_gridstatvx.sh | 1 +
 scripts/exregional_run_pointstatvx.sh | 1 +
 scripts/exregional_run_post.sh | 15 +-
 .../WE2E/get_WE2Etest_names_subdirs_descs.sh | 306 ++-
 tests/WE2E/get_expts_status.sh | 157 +-
 tests/WE2E/run_WE2E_tests.sh | 149 +-
 ...NUS_25km_ics_FV3GFS_lbcs_RAP_suite_HRRR.sh | 1 +
 ...mpact_13km_ics_HRRR_lbcs_RAP_suite_HRRR.sh | 2 +
 ...km_ics_HRRR_lbcs_RAP_suite_RRFS_v1alpha.sh | 2 +
...3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh | 2 + ...pact_25km_ics_HRRR_lbcs_HRRR_suite_HRRR.sh | 3 + ...km_ics_HRRR_lbcs_HRRR_suite_RRFS_v1beta.sh | 2 + ...mpact_25km_ics_HRRR_lbcs_RAP_suite_HRRR.sh | 2 + ...km_ics_HRRR_lbcs_RAP_suite_RRFS_v1alpha.sh | 2 + ...5km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh | 2 + ...t_3km_ics_HRRR_lbcs_RAP_suite_GFS_v15p2.sh | 2 + ...ompact_3km_ics_HRRR_lbcs_RAP_suite_HRRR.sh | 3 + ...km_ics_HRRR_lbcs_RAP_suite_RRFS_v1alpha.sh | 2 + ...3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh | 2 + ...S_3km_ics_HRRR_lbcs_RAP_suite_GFS_v15p2.sh | 2 + ...m_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh} | 0 ...US_Ind_3km_ics_HRRR_lbcs_RAP_suite_HRRR.sh | 27 + ...3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh | 27 + ...km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh | 1 - ...3GFS_suite_GFS_v15_thompson_mynn_lam3km.sh | 2 - ...mpact_25km_ics_HRRR_lbcs_RAP_suite_HRRR.sh | 2 + ...ew_ESGgrid.sh => config.custom_ESGgrid.sh} | 2 + ..._GFDLgrid.sh => config.custom_GFDLgrid.sh} | 2 + ...USE_GFDLgrid_RES_IN_FILENAMES_eq_FALSE.sh} | 2 + ..._USE_GFDLgrid_RES_IN_FILENAMES_eq_TRUE.sh} | 2 + .../config.get_from_HPSS_ics_HRRR_lbcs_RAP.sh | 2 + ...ig.specify_DT_ATMOS_LAYOUT_XY_BLOCKSIZE.sh | 2 + ...g.specify_EXTRN_MDL_SYSBASEDIR_ICS_LBCS.sh | 6 +- .../wflow_features/config.subhourly_post.sh | 2 + .../config.subhourly_post_ensemble_2mems.sh | 2 + ush/NOMADS_get_extrn_mdl_files.sh | 12 +- ush/check_expt_config_vars.sh | 2 +- ush/check_ruc_lsm.py | 30 + ush/config.community.sh | 4 +- ush/config.nco.sh | 29 +- ush/config_defaults.sh | 133 +- ush/config_defaults.yaml | 2022 +++++++++++++++ ush/constants.py | 27 + ush/create_diag_table_file.py | 79 + ush/create_model_configure_file.py | 243 ++ ush/fill_jinja_template.py | 18 +- ush/generate_FV3LAM_wflow.py | 1126 +++++++++ ush/generate_FV3LAM_wflow.sh | 1 + ush/get_crontab_contents.py | 83 + ush/launch_FV3LAM_wflow.sh | 8 +- ush/link_fix.py | 391 +++ ush/load_modules_run_task.sh | 15 +- ush/machine/cheyenne.sh | 25 +- ush/machine/gaea.sh | 70 + ush/machine/hera.sh | 27 +- ush/machine/jet.sh | 21 +- ush/machine/noaacloud.sh | 33 +- ush/machine/odin.sh | 32 +- ush/machine/orion.sh | 27 +- ush/machine/singularity.sh | 4 +- ush/machine/stampede.sh | 34 +- ush/machine/wcoss_dell_p3.sh | 27 +- ush/mrms_pull_topofhour.py | 160 +- ush/predef_grid_params.yaml | 909 +++++++ ush/python_utils/__init__.py | 12 +- ush/python_utils/change_case.py | 25 - .../check_for_preexist_dir_file.py | 2 +- ush/python_utils/config_parser.py | 159 +- ush/python_utils/environment.py | 2 +- ush/python_utils/fv3write_parms_lambert.py | 91 + ush/python_utils/get_elem_inds.py | 2 +- .../get_manage_externals_config_property.py | 54 - ush/python_utils/misc.py | 57 + ush/python_utils/print_input_args.py | 2 +- ush/python_utils/test_python_utils.py | 45 +- ush/python_utils/xml_parser.py | 30 + ush/retrieve_data.py | 3 +- ush/set_FV3nml_ens_stoch_seeds.py | 137 + ush/set_FV3nml_sfc_climo_filenames.py | 141 ++ ush/set_cycle_dates.py | 54 + ush/set_extrn_mdl_params.py | 56 + ush/set_extrn_mdl_params.sh | 19 +- ush/set_gridparams_ESGgrid.py | 110 + ush/set_gridparams_GFDLgrid.py | 467 ++++ ush/set_namelist.py | 17 +- ush/set_ozone_param.py | 231 ++ ush/set_predef_grid_params.py | 64 + ush/set_predef_grid_params.sh | 70 +- ush/set_thompson_mp_fix_files.py | 176 ++ ush/setup.py | 2220 +++++++++++++++++ ush/setup.sh | 55 +- ush/templates/FV3LAM_wflow.xml | 124 + ush/templates/data_locations.yml | 2 - .../parm/metplus/EnsembleStat_APCP01h.conf | 2 +- .../parm/metplus/EnsembleStat_APCP03h.conf | 6 +- 
.../parm/metplus/EnsembleStat_APCP06h.conf | 6 +- .../parm/metplus/EnsembleStat_APCP24h.conf | 6 +- .../parm/metplus/EnsembleStat_REFC.conf | 2 +- .../parm/metplus/EnsembleStat_RETOP.conf | 2 +- .../parm/metplus/EnsembleStat_conus_sfc.conf | 2 +- .../parm/metplus/EnsembleStat_upper_air.conf | 2 +- .../parm/metplus/GridStat_APCP01h.conf | 2 +- .../parm/metplus/GridStat_APCP03h.conf | 4 +- .../parm/metplus/GridStat_APCP06h.conf | 4 +- .../parm/metplus/GridStat_APCP24h.conf | 4 +- ush/templates/parm/metplus/GridStat_REFC.conf | 2 +- .../parm/metplus/GridStat_RETOP.conf | 2 +- .../parm/metplus/PointStat_conus_sfc.conf | 2 +- .../parm/metplus/PointStat_upper_air.conf | 2 +- .../RRFS_CONUS_3km/C3357.facsf.tile7.halo0.nc | 0 .../RRFS_CONUS_3km/C3357.facsf.tile7.halo4.nc | 0 .../C3357.maximum_snow_albedo.tile7.halo0.nc | 0 .../C3357.maximum_snow_albedo.tile7.halo4.nc | 0 .../C3357.slope_type.tile7.halo0.nc | 0 .../C3357.slope_type.tile7.halo4.nc | 0 .../C3357.snowfree_albedo.tile7.halo0.nc | 0 .../C3357.snowfree_albedo.tile7.halo4.nc | 0 .../C3357.soil_type.tile7.halo0.nc | 0 .../C3357.soil_type.tile7.halo4.nc | 0 ...C3357.substrate_temperature.tile7.halo0.nc | 0 ...C3357.substrate_temperature.tile7.halo4.nc | 0 .../C3357.vegetation_greenness.tile7.halo0.nc | 0 .../C3357.vegetation_greenness.tile7.halo4.nc | 0 .../C3357.vegetation_type.tile7.halo0.nc | 0 .../C3357.vegetation_type.tile7.halo4.nc | 0 .../RRFS_CONUS_3km/C3357_grid.tile7.halo3.nc | 0 .../RRFS_CONUS_3km/C3357_grid.tile7.halo4.nc | 0 .../RRFS_CONUS_3km/C3357_grid.tile7.halo6.nc | 0 .../RRFS_CONUS_3km/C3357_mosaic.halo3.nc | 0 .../RRFS_CONUS_3km/C3357_mosaic.halo4.nc | 0 .../RRFS_CONUS_3km/C3357_mosaic.halo6.nc | 0 .../C3357_oro_data.tile7.halo0.nc | 0 .../C3357_oro_data.tile7.halo4.nc | 0 .../C3357_oro_data_ls.tile7.halo0.nc | 0 .../C3357_oro_data_ss.tile7.halo0.nc | 0 ush/test_data/suite_FV3_GSD_SAR.xml | 85 + ush/valid_param_vals.sh | 5 +- ush/valid_param_vals.yaml | 85 + 152 files changed, 10431 insertions(+), 600 deletions(-) create mode 100644 modulefiles/tasks/gaea/make_grid.local create mode 100644 modulefiles/tasks/gaea/make_ics.local create mode 100644 modulefiles/tasks/gaea/make_lbcs.local create mode 100644 modulefiles/tasks/gaea/run_fcst.local create mode 100644 modulefiles/tasks/gaea/run_vx.local rename tests/WE2E/test_configs/grids_extrn_mdls_suites_community/{config.SUBCONUS_Ind_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh => config.grid_SUBCONUS_Ind_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh} (100%) create mode 100644 tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_SUBCONUS_Ind_3km_ics_HRRR_lbcs_RAP_suite_HRRR.sh create mode 100644 tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_SUBCONUS_Ind_3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh rename tests/WE2E/test_configs/wflow_features/{config.new_ESGgrid.sh => config.custom_ESGgrid.sh} (96%) rename tests/WE2E/test_configs/wflow_features/{config.new_GFDLgrid.sh => config.custom_GFDLgrid.sh} (98%) rename tests/WE2E/test_configs/wflow_features/{config.new_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_FALSE.sh => config.custom_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_FALSE.sh} (97%) rename tests/WE2E/test_configs/wflow_features/{config.new_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_TRUE.sh => config.custom_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_TRUE.sh} (97%) create mode 100644 ush/check_ruc_lsm.py create mode 100644 ush/config_defaults.yaml create mode 100644 ush/constants.py create mode 100644 
ush/create_diag_table_file.py create mode 100644 ush/create_model_configure_file.py create mode 100755 ush/generate_FV3LAM_wflow.py create mode 100644 ush/get_crontab_contents.py create mode 100644 ush/link_fix.py create mode 100755 ush/machine/gaea.sh create mode 100644 ush/predef_grid_params.yaml delete mode 100644 ush/python_utils/change_case.py create mode 100755 ush/python_utils/fv3write_parms_lambert.py delete mode 100644 ush/python_utils/get_manage_externals_config_property.py create mode 100644 ush/python_utils/misc.py create mode 100644 ush/python_utils/xml_parser.py create mode 100644 ush/set_FV3nml_ens_stoch_seeds.py create mode 100644 ush/set_FV3nml_sfc_climo_filenames.py create mode 100644 ush/set_cycle_dates.py create mode 100644 ush/set_extrn_mdl_params.py create mode 100644 ush/set_gridparams_ESGgrid.py create mode 100644 ush/set_gridparams_GFDLgrid.py create mode 100644 ush/set_ozone_param.py create mode 100644 ush/set_predef_grid_params.py create mode 100644 ush/set_thompson_mp_fix_files.py create mode 100644 ush/setup.py create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357.facsf.tile7.halo0.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357.facsf.tile7.halo4.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357.maximum_snow_albedo.tile7.halo0.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357.maximum_snow_albedo.tile7.halo4.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357.slope_type.tile7.halo0.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357.slope_type.tile7.halo4.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357.snowfree_albedo.tile7.halo0.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357.snowfree_albedo.tile7.halo4.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357.soil_type.tile7.halo0.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357.soil_type.tile7.halo4.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357.substrate_temperature.tile7.halo0.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357.substrate_temperature.tile7.halo4.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357.vegetation_greenness.tile7.halo0.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357.vegetation_greenness.tile7.halo4.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357.vegetation_type.tile7.halo0.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357.vegetation_type.tile7.halo4.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357_grid.tile7.halo3.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357_grid.tile7.halo4.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357_grid.tile7.halo6.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357_mosaic.halo3.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357_mosaic.halo4.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357_mosaic.halo6.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357_oro_data.tile7.halo0.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357_oro_data.tile7.halo4.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357_oro_data_ls.tile7.halo0.nc create mode 100644 ush/test_data/RRFS_CONUS_3km/C3357_oro_data_ss.tile7.halo0.nc create mode 100644 ush/test_data/suite_FV3_GSD_SAR.xml create mode 100644 ush/valid_param_vals.yaml diff --git a/modulefiles/tasks/gaea/make_grid.local b/modulefiles/tasks/gaea/make_grid.local new file mode 100644 index 000000000..fbd52ffa0 --- /dev/null +++ b/modulefiles/tasks/gaea/make_grid.local @@ -0,0 +1,6 @@ +#%Module +module use /lustre/f2/pdata/esrl/gsd/contrib/modulefiles +module load 
rocoto +module load miniconda3 + +setenv SRW_ENV regional_workflow diff --git a/modulefiles/tasks/gaea/make_ics.local b/modulefiles/tasks/gaea/make_ics.local new file mode 100644 index 000000000..fbd52ffa0 --- /dev/null +++ b/modulefiles/tasks/gaea/make_ics.local @@ -0,0 +1,6 @@ +#%Module +module use /lustre/f2/pdata/esrl/gsd/contrib/modulefiles +module load rocoto +module load miniconda3 + +setenv SRW_ENV regional_workflow diff --git a/modulefiles/tasks/gaea/make_lbcs.local b/modulefiles/tasks/gaea/make_lbcs.local new file mode 100644 index 000000000..fbd52ffa0 --- /dev/null +++ b/modulefiles/tasks/gaea/make_lbcs.local @@ -0,0 +1,6 @@ +#%Module +module use /lustre/f2/pdata/esrl/gsd/contrib/modulefiles +module load rocoto +module load miniconda3 + +setenv SRW_ENV regional_workflow diff --git a/modulefiles/tasks/gaea/run_fcst.local b/modulefiles/tasks/gaea/run_fcst.local new file mode 100644 index 000000000..fbd52ffa0 --- /dev/null +++ b/modulefiles/tasks/gaea/run_fcst.local @@ -0,0 +1,6 @@ +#%Module +module use /lustre/f2/pdata/esrl/gsd/contrib/modulefiles +module load rocoto +module load miniconda3 + +setenv SRW_ENV regional_workflow diff --git a/modulefiles/tasks/gaea/run_vx.local b/modulefiles/tasks/gaea/run_vx.local new file mode 100644 index 000000000..929c2011f --- /dev/null +++ b/modulefiles/tasks/gaea/run_vx.local @@ -0,0 +1,7 @@ +#%Module + +module use -a /contrib/anaconda/modulefiles +module load intel/18.0.5.274 +module load anaconda/latest +module use -a /contrib/met/modulefiles/ +module load met/10.0.0 diff --git a/scripts/exregional_get_extrn_mdl_files.sh b/scripts/exregional_get_extrn_mdl_files.sh index fc02e3b99..5a3fe04cf 100755 --- a/scripts/exregional_get_extrn_mdl_files.sh +++ b/scripts/exregional_get_extrn_mdl_files.sh @@ -108,7 +108,7 @@ elif [ "${ICS_OR_LBCS}" = "LBCS" ]; then input_file_path=${EXTRN_MDL_SOURCE_BASEDIR_LBCS:-$EXTRN_MDL_SYSBASEDIR_LBCS} fi -data_stores="hpss aws" +data_stores="${EXTRN_MDL_DATA_STORES}" yyyymmddhh=${extrn_mdl_cdate:0:10} yyyy=${yyyymmddhh:0:4} @@ -141,7 +141,7 @@ if [ -n "${file_names:-}" ] ; then fi if [ -n "${input_file_path:-}" ] ; then - data_stores="disk hpss aws" + data_stores="disk $data_stores" additional_flags="$additional_flags \ --input_file_path ${input_file_path}" fi diff --git a/scripts/exregional_make_grid.sh b/scripts/exregional_make_grid.sh index 742770630..3744638c8 100755 --- a/scripts/exregional_make_grid.sh +++ b/scripts/exregional_make_grid.sh @@ -410,7 +410,7 @@ if [ "${GRID_GEN_METHOD}" = "GFDLgrid" ]; then elif [ "${GRID_GEN_METHOD}" = "ESGgrid" ]; then CRES="C${res_equiv}" fi -set_file_param "${GLOBAL_VAR_DEFNS_FP}" "CRES" "\"$CRES\"" +set_file_param "${GLOBAL_VAR_DEFNS_FP}" "CRES" "'$CRES'" # #----------------------------------------------------------------------- # diff --git a/scripts/exregional_run_ensgridvx.sh b/scripts/exregional_run_ensgridvx.sh index df1b91cd4..6d2cc1e40 100755 --- a/scripts/exregional_run_ensgridvx.sh +++ b/scripts/exregional_run_ensgridvx.sh @@ -133,6 +133,7 @@ export METPLUS_CONF export MET_CONFIG export MODEL export NET +export POST_OUTPUT_DOMAIN_NAME export NUM_ENS_MEMBERS export NUM_PAD export LOG_SUFFIX diff --git a/scripts/exregional_run_ensgridvx_mean.sh b/scripts/exregional_run_ensgridvx_mean.sh index bd50e3cff..9a18823ef 100755 --- a/scripts/exregional_run_ensgridvx_mean.sh +++ b/scripts/exregional_run_ensgridvx_mean.sh @@ -143,6 +143,7 @@ export METPLUS_CONF export MET_CONFIG export MODEL export NET +export POST_OUTPUT_DOMAIN_NAME export INPUT_BASE export LOG_SUFFIX diff 
--git a/scripts/exregional_run_ensgridvx_prob.sh b/scripts/exregional_run_ensgridvx_prob.sh index 7345c1728..8f45a53cd 100755 --- a/scripts/exregional_run_ensgridvx_prob.sh +++ b/scripts/exregional_run_ensgridvx_prob.sh @@ -144,6 +144,7 @@ export METPLUS_CONF export MET_CONFIG export MODEL export NET +export POST_OUTPUT_DOMAIN_NAME export LOG_SUFFIX # diff --git a/scripts/exregional_run_enspointvx.sh b/scripts/exregional_run_enspointvx.sh index 544f0f006..ed1bd3581 100755 --- a/scripts/exregional_run_enspointvx.sh +++ b/scripts/exregional_run_enspointvx.sh @@ -135,6 +135,7 @@ export METPLUS_CONF export MET_CONFIG export MODEL export NET +export POST_OUTPUT_DOMAIN_NAME export NUM_ENS_MEMBERS ${METPLUS_PATH}/ush/run_metplus.py \ diff --git a/scripts/exregional_run_enspointvx_mean.sh b/scripts/exregional_run_enspointvx_mean.sh index b65c9bd34..b9a0bc0c8 100755 --- a/scripts/exregional_run_enspointvx_mean.sh +++ b/scripts/exregional_run_enspointvx_mean.sh @@ -137,6 +137,7 @@ export METPLUS_CONF export MET_CONFIG export MODEL export NET +export POST_OUTPUT_DOMAIN_NAME ${METPLUS_PATH}/ush/run_metplus.py \ -c ${METPLUS_CONF}/common.conf \ diff --git a/scripts/exregional_run_enspointvx_prob.sh b/scripts/exregional_run_enspointvx_prob.sh index b315faac3..4d2bf8464 100755 --- a/scripts/exregional_run_enspointvx_prob.sh +++ b/scripts/exregional_run_enspointvx_prob.sh @@ -137,6 +137,7 @@ export METPLUS_CONF export MET_CONFIG export MODEL export NET +export POST_OUTPUT_DOMAIN_NAME ${METPLUS_PATH}/ush/run_metplus.py \ -c ${METPLUS_CONF}/common.conf \ diff --git a/scripts/exregional_run_fcst.sh b/scripts/exregional_run_fcst.sh index 331bb6ab4..2f768a657 100755 --- a/scripts/exregional_run_fcst.sh +++ b/scripts/exregional_run_fcst.sh @@ -514,7 +514,6 @@ if [ ${WRITE_DOPOST} = "TRUE" ]; then yyyymmdd=${cdate:0:8} hh=${cdate:8:2} cyc=$hh - tmmark="tm00" fmn="00" if [ "${RUN_ENVIR}" = "nco" ]; then @@ -539,7 +538,7 @@ if [ ${WRITE_DOPOST} = "TRUE" ]; then post_mn=${post_time:10:2} post_mn_or_null="" post_fn_suffix="GrbF${fhr_d}" - post_renamed_fn_suffix="f${fhr}${post_mn_or_null}.${tmmark}.grib2" + post_renamed_fn_suffix="f${fhr}${post_mn_or_null}.${POST_OUTPUT_DOMAIN_NAME}.grib2" basetime=$( $DATE_UTIL --date "$yyyymmdd $hh" +%y%j%H%M ) symlink_suffix="_${basetime}f${fhr}${post_mn}" @@ -547,7 +546,7 @@ if [ ${WRITE_DOPOST} = "TRUE" ]; then for fid in "${fids[@]}"; do FID=$(echo_uppercase $fid) post_orig_fn="${FID}.${post_fn_suffix}" - post_renamed_fn="${NET}.t${cyc}z.${fid}${post_renamed_fn_suffix}" + post_renamed_fn="${NET}.t${cyc}z.${fid}.${post_renamed_fn_suffix}" mv_vrfy ${run_dir}/${post_orig_fn} ${post_renamed_fn} ln_vrfy -fs ${post_renamed_fn} ${FID}${symlink_suffix} done diff --git a/scripts/exregional_run_gridstatvx.sh b/scripts/exregional_run_gridstatvx.sh index 63288c44e..ae7f960fc 100755 --- a/scripts/exregional_run_gridstatvx.sh +++ b/scripts/exregional_run_gridstatvx.sh @@ -150,6 +150,7 @@ export METPLUS_CONF export MET_CONFIG export MODEL export NET +export POST_OUTPUT_DOMAIN_NAME # #----------------------------------------------------------------------- diff --git a/scripts/exregional_run_pointstatvx.sh b/scripts/exregional_run_pointstatvx.sh index 458b87749..02a65ed2b 100755 --- a/scripts/exregional_run_pointstatvx.sh +++ b/scripts/exregional_run_pointstatvx.sh @@ -140,6 +140,7 @@ export METPLUS_CONF export MET_CONFIG export MODEL export NET +export POST_OUTPUT_DOMAIN_NAME ${METPLUS_PATH}/ush/run_metplus.py \ -c ${METPLUS_CONF}/common.conf \ diff --git a/scripts/exregional_run_post.sh 
b/scripts/exregional_run_post.sh index c7aafd399..cb8c3d5d5 100755 --- a/scripts/exregional_run_post.sh +++ b/scripts/exregional_run_post.sh @@ -172,17 +172,6 @@ cyc=$hh # #----------------------------------------------------------------------- # -# The tmmark is a reference value used in real-time, DA-enabled NCEP models. -# It represents the delay between the onset of the DA cycle and the free -# forecast. With no DA in the SRW App at the moment, it is hard-wired to -# tm00 for now. -# -#----------------------------------------------------------------------- -# -tmmark="tm00" -# -#----------------------------------------------------------------------- -# # Create the namelist file (itag) containing arguments to pass to the post- # processor's executable. # @@ -300,7 +289,7 @@ if [ "${post_mn}" != "00" ]; then fi post_fn_suffix="GrbF${post_fhr}${dot_post_mn_or_null}" -post_renamed_fn_suffix="f${fhr}${post_mn_or_null}.${tmmark}.grib2" +post_renamed_fn_suffix="f${fhr}${post_mn_or_null}.${POST_OUTPUT_DOMAIN_NAME}.grib2" # # For convenience, change location to postprd_dir (where the final output # from UPP will be located). Then loop through the two files that UPP @@ -314,7 +303,7 @@ fids=( "prslev" "natlev" ) for fid in "${fids[@]}"; do FID=$(echo_uppercase $fid) post_orig_fn="${FID}.${post_fn_suffix}" - post_renamed_fn="${NET}.t${cyc}z.${fid}${post_renamed_fn_suffix}" + post_renamed_fn="${NET}.t${cyc}z.${fid}.${post_renamed_fn_suffix}" mv_vrfy ${tmp_dir}/${post_orig_fn} ${post_renamed_fn} create_symlink_to_file target="${post_renamed_fn}" \ symlink="${FID}${symlink_suffix}" \ diff --git a/tests/WE2E/get_WE2Etest_names_subdirs_descs.sh b/tests/WE2E/get_WE2Etest_names_subdirs_descs.sh index 2183ead89..d0db858b3 100755 --- a/tests/WE2E/get_WE2Etest_names_subdirs_descs.sh +++ b/tests/WE2E/get_WE2Etest_names_subdirs_descs.sh @@ -7,8 +7,10 @@ # the WE2E tests available in the WE2E testing system. This information # consists of the test names, the category subdirectories in which the # test configuration files are located (relative to a base directory), -# the test IDs, and the test descriptions. These are described in more -# detail below. +# the test IDs, and the test descriptions. This function optionally +# also creates a CSV (Comma-Separated Value) file containing various +# pieces of information about each of the workflow end-to-end (WE2E) +# tests. These are described in more detail below. # # The function takes as inputs the following arguments: # @@ -21,6 +23,10 @@ # Flag that specifies whether or not a CSV (Comma-Separated Value) file # containing information about the WE2E tests should be generated. # +# verbose: +# Optional verbosity flag. Should be set to "TRUE" or "FALSE". Default +# is "FALSE". +# # output_varname_test_configs_basedir: # Name of output variable in which to return the base directory of the # WE2E test configuration files. @@ -65,7 +71,7 @@ # in the local array category_subdirs. We refer to these as "category" # subdirectories because they are used for clarity to group the tests # into categories (instead of putting them all directly under the base -# directory). For example, one category of tests might be those that +# directory). For example, one category of tests might be those that # test workflow capabilities such as running multiple cycles and ensemble # forecasts, another might be those that run various combinations of # grids, physics suites, and external models for ICs/LBCs, etc. 
Note @@ -208,7 +214,11 @@ # the file will be placed in the main WE2E testing system directory # specified by the input argument WE2Edir. The CSV file can be read # into a spreadsheet in Google Sheets (or another similar tool) to get -# an overview of all the available WE2E tests. +# an overview of all the available WE2E tests. The rows of the CSV file +# correspond to the primary WE2E tests, and the columns correspond to +# the (primary) test name, alternate test names (if any), test description, +# number of times the test calls the forecast model, and values of various +# SRW App experiment variables for that test. # # A CSV file will be generated in the directory specified by WE2Edir if # one or more of the following conditions hold: @@ -228,7 +238,8 @@ # "FALSE" in the call to this function (regardless of whether or not a # CSV file already exists). If a CSV file is generated, it is placed in # the directory specified by the input argment WE2Edir, and it overwrites -# any existing copies of the file in that directory. +# any existing copies of the file in that directory. The contents of +# each column of the CSV file are described below. # #----------------------------------------------------------------------- # @@ -255,6 +266,7 @@ function get_WE2Etest_names_subdirs_descs() { local valid_args=( \ "WE2Edir" \ "generate_csv_file" \ + "verbose" \ "output_varname_test_configs_basedir" \ "output_varname_test_names" \ "output_varname_test_subdirs" \ @@ -275,6 +287,17 @@ function get_WE2Etest_names_subdirs_descs() { # #----------------------------------------------------------------------- # +# Make the default value of "verbose" "FALSE". Then make sure "verbose" +# is set to a valid value. +# +#----------------------------------------------------------------------- +# + verbose=${verbose:-"FALSE"} + check_var_valid_value "verbose" "valid_vals_BOOLEAN" + verbose=$(boolify $verbose) +# +#----------------------------------------------------------------------- +# # Declare local variables. 
# #----------------------------------------------------------------------- @@ -286,7 +309,10 @@ function get_WE2Etest_names_subdirs_descs() { alt_test_prim_test_names \ alt_test_subdir \ alt_test_subdirs \ + array_names_vars_to_extract \ + array_names_vars_to_extract_orig \ category_subdirs \ + cmd \ column_titles \ config_fn \ crnt_item \ @@ -294,24 +320,35 @@ function get_WE2Etest_names_subdirs_descs() { csv_fn \ csv_fp \ cwd \ + default_val \ hash_or_null \ i \ ii \ j \ jp1 \ + k \ line \ mod_time_csv \ mod_time_subdir \ + msg \ num_alt_tests \ num_category_subdirs \ + num_cdates \ + num_cycles_per_day \ + num_days \ + num_fcsts \ + num_fcsts_orig \ num_items \ num_occurrences \ num_prim_tests \ num_tests \ + num_vars_to_extract \ + prim_array_names_vars_to_extract \ prim_test_descs \ prim_test_ids \ prim_test_name_subdir \ prim_test_names \ + prim_test_num_fcsts \ prim_test_subdirs \ get_test_descs \ regex_search \ @@ -349,7 +386,11 @@ function get_WE2Etest_names_subdirs_descs() { test_subdirs_orig \ test_subdirs_str \ test_type \ - valid_vals_generate_csv_file + val \ + valid_vals_generate_csv_file \ + var_name \ + var_name_at \ + vars_to_extract # #----------------------------------------------------------------------- # @@ -408,6 +449,13 @@ function get_WE2Etest_names_subdirs_descs() { fi fi + + if [ "${generate_csv_file}" = "TRUE" ]; then + print_info_msg " +Will generate a CSV (Comma Separated Value) file (csv_fp) containing +information on all WE2E tests: + csv_fp = \"${csv_fp}\"" + fi # #----------------------------------------------------------------------- # @@ -472,6 +520,7 @@ function get_WE2Etest_names_subdirs_descs() { prim_test_names=() prim_test_ids=() prim_test_subdirs=() + prim_test_num_fcsts=() alt_test_names=() alt_test_subdirs=() @@ -839,17 +888,61 @@ they correspond to unique test names and rerun." #----------------------------------------------------------------------- # if [ "${get_test_descs}" = "TRUE" ]; then +# +# Specify in "vars_to_extract" the list of experiment variables to extract +# from each test configuration file (and later to place in the CSV file). +# Recall that the rows of the CSV file correspond to the various WE2E +# tests, and the columns correspond to the test name, description, and +# experiment variable values. The elements of "vars_to_extract" should +# be the names of SRW App experiment variables that are (or can be) +# specified in the App's configuration file. Note that if a variable is +# not specified in the test configuration file, in most cases its value +# is set to an empty string (and recorded as such in the CSV file). In +# some cases, it is set to some other value (e.g. for the number of +# ensemble members NUM_ENS_MEMBERS, it is set to 1). +# + vars_to_extract=( "PREDEF_GRID_NAME" \ + "CCPP_PHYS_SUITE" \ + "EXTRN_MDL_NAME_ICS" \ + "EXTRN_MDL_NAME_LBCS" \ + "DATE_FIRST_CYCL" \ + "DATE_LAST_CYCL" \ + "CYCL_HRS" \ + "INCR_CYCL_FREQ" \ + "FCST_LEN_HRS" \ + "LBC_SPEC_INTVL_HRS" \ + "NUM_ENS_MEMBERS" \ + ) + num_vars_to_extract="${#vars_to_extract[@]}" +# +# Create names of local arrays that will hold the value of the corresponding +# variable for each test. Then use these names to define them as empty +# arrays. [The arrays named "prim_..." are to hold values for only the +# primary tests, while other arrays are to hold values for all (primary +# plus alternate) tests.] 
+# + prim_array_names_vars_to_extract=( $( printf "prim_test_%s_vals " "${vars_to_extract[@]}" ) ) + array_names_vars_to_extract=( $( printf "%s_vals " "${vars_to_extract[@]}" ) ) + for (( k=0; k<=$((num_vars_to_extract-1)); k++ )); do + cmd="${prim_array_names_vars_to_extract[$k]}=()" + eval $cmd + cmd="${array_names_vars_to_extract[$k]}=()" + eval $cmd + done print_info_msg " -Gathering test descriptions from the configuration files of the primary -WE2E tests..." +Gathering test descriptions and experiment variable values from the +configuration files of the primary WE2E tests... +" prim_test_descs=() for (( i=0; i<=$((num_prim_tests-1)); i++ )); do test_name="${prim_test_names[$i]}" print_info_msg "\ - Reading in the test description for primary WE2E test: \"${test_name}\"" + Reading in the test description for primary WE2E test: \"${test_name}\" + In category (subdirectory): \"${subdir}\" +" subdir=("${prim_test_subdirs[$i]}") cd_vrfy "${test_configs_basedir}/$subdir" # @@ -931,16 +1024,121 @@ ${test_desc}${stripped_line} " # # First remove leading whitespace. # - test_desc="${test_desc#"${test_desc%%[![:space:]]*}"}" + test_desc="${test_desc#"${test_desc%%[![:space:]]*}"}" # # Now remove trailing whitespace. # - test_desc="${test_desc%"${test_desc##*[![:space:]]}"}" + test_desc="${test_desc%"${test_desc##*[![:space:]]}"}" # # Finally, save the description of the current test as the next element # of the array prim_test_descs. # prim_test_descs+=("${test_desc}") +# +# Get from the current test's configuration file the values of the +# variables specified in "vars_to_extract". Then save the value in the +# arrays specified by "prim_array_names_vars_to_extract". +# + for (( k=0; k<=$((num_vars_to_extract-1)); k++ )); do + + var_name="${vars_to_extract[$k]}" + cmd=$( grep "^[ ]*${var_name}=" "${config_fn}" ) + eval $cmd + + if [ -z "${!var_name+x}" ]; then + + msg=" + The variable \"${var_name}\" is not defined in the current test's + configuration file (config_fn): + config_fn = \"${config_fn}\" + Setting the element in the array \"${prim_array_names_vars_to_extract[$k]}\" + corresponding to this test to" + + case "${var_name}" in + + "NUM_ENS_MEMBERS") + default_val="1" + msg=$msg": + ${var_name} = \"${default_val}\"" + ;; + + "INCR_CYCL_FREQ") + default_val="24" + msg=$msg": + ${var_name} = \"${default_val}\"" + ;; + + *) + default_val="" + msg=$msg" an empty string." + ;; + + esac + cmd="${var_name}=\"${default_val}\"" + eval $cmd + + print_info_msg "$verbose" "$msg" + cmd="${prim_array_names_vars_to_extract[$k]}+=(\"'${default_val}\")" + + else +# +# The following are important notes regarding how the variable "cmd" +# containing the command that will append an element to the array +# specified by ${prim_array_names_vars_to_extract[$k]} is formulated: +# +# 1) If all the experiment variables were scalars, then the more complex +# command below could be replaced with the following: +# +# cmd="${prim_array_names_vars_to_extract[$k]}+=(\"${!var_name}\")" +# +# But some variables are arrays, so we need the more complex approach +# to cover those cases. +# +# 2) The double quotes (which need to be escaped here, i.e. \") are needed +# so that for any experiment variables that are arrays, all the elements +# of the array are combined together and treated as a single element. +# If the experiment variable is CYCL_HRS (cycle hours) and is set to +# the array ("00" "12"), we want the value saved in the local array +# here to be a single element consisting of "00 12". 
Otherwise, "00" +# and "12" will be treated as separate elements, and more than one +# element would be added to the array (which would be incorrect here). +# +# 3) The single quote (which needs to be escaped here, i.e. \') is needed +# so that any numbers (e.g. a set of cycle hours such as "00 12") are +# treated as strings when the CSV file is opened in Google Sheets. +# If this is not done, Google Sheets will remove leading zeros. +# + var_name_at="${var_name}[@]" + cmd="${prim_array_names_vars_to_extract[$k]}+=(\'\"${!var_name_at}\")" + fi + eval $cmd + + done +# +# Calculate the number of forecasts that will be launched by the current +# test. The "10#" forces bash to treat the following number as a decimal +# (not hexadecimal, etc). +# + num_cycles_per_day=${#CYCL_HRS[@]} + num_days=$(( (${DATE_LAST_CYCL} - ${DATE_FIRST_CYCL} + 1)*24/10#${INCR_CYCL_FREQ} )) + num_cdates=$(( ${num_cycles_per_day}*${num_days} )) + nf=$(( ${num_cdates}*10#${NUM_ENS_MEMBERS} )) +# +# In the following, the single quote at the beginning forces Google Sheets +# to interpret this quantity as a string. This prevents any automatic +# number fomatting from being applied when the CSV file is imported into +# Google Sheets. +# + prim_test_num_fcsts+=( "'$nf" ) +# +# Unset the experiment variables defined for the current test so that +# they are not accidentally used for the next one. +# + for (( k=0; k<=$((num_vars_to_extract-1)); k++ )); do + var_name="${vars_to_extract[$k]}" + cmd="unset ${var_name}" + eval $cmd + done done @@ -950,13 +1148,20 @@ ${test_desc}${stripped_line} " # # Create the arrays test_ids and test_descs that initially contain the # test IDs and descriptions corresponding to the primary test names -# (those of the alternate test names will be appended below). +# (those of the alternate test names will be appended below). Then, in +# the for-loop, do same for the arrays containing the experiment variable +# values for each test. # #----------------------------------------------------------------------- # test_ids=("${prim_test_ids[@]}") if [ "${get_test_descs}" = "TRUE" ]; then test_descs=("${prim_test_descs[@]}") + num_fcsts=("${prim_test_num_fcsts[@]}") + for (( k=0; k<=$((num_vars_to_extract-1)); k++ )); do + cmd="${array_names_vars_to_extract[$k]}=(\"\${${prim_array_names_vars_to_extract[$k]}[@]}\")" + eval $cmd + done fi # #----------------------------------------------------------------------- @@ -964,7 +1169,8 @@ ${test_desc}${stripped_line} " # Append to the arrays test_ids and test_descs the test IDs and descriptions # of the alternate test names. We set the test ID and description of # each alternate test name to those of the corresponding primary test -# name. +# name. Then, in the inner for-loop, do the same for the arrays containing +# the experiment variable values. # #----------------------------------------------------------------------- # @@ -980,6 +1186,11 @@ ${test_desc}${stripped_line} " test_ids+=("${prim_test_ids[$j]}") if [ "${get_test_descs}" = "TRUE" ]; then test_descs+=("${prim_test_descs[$j]}") + num_fcsts+=("${prim_test_num_fcsts[$j]}") + for (( k=0; k<=$((num_vars_to_extract-1)); k++ )); do + cmd="${array_names_vars_to_extract[$k]}+=(\"\${${prim_array_names_vars_to_extract[$k]}[$j]}\")" + eval $cmd + done fi num_occurrences=$((num_occurrences+1)) fi @@ -1003,7 +1214,9 @@ Please correct and rerun." 
#----------------------------------------------------------------------- # # Sort in order of increasing test ID the arrays containing the names, -# IDs, category subdirectories, and descriptions of the WE2E tests. +# IDs, category subdirectories, and descriptions of the WE2E tests as +# well as the arrays containing the experiment variable values for each +# test. # # For this purpose, we first create an array (test_ids_and_inds) each # of whose elements consist of the test ID, the test type, and the index @@ -1029,8 +1242,8 @@ Please correct and rerun." # and the test type, which we no longer need), which is the original # array index before sorting, and save the results in the array sort_inds. # This array will contain the original indices in sorted order that we -# then use to sort the arrays containing the names, IDs, subdirectories, -# and descriptions of the WE2E tests. +# then use to sort the arrays containing the WE2E test names, IDs, +# subdirectories, descriptions, and experiment variable values. # #----------------------------------------------------------------------- # @@ -1064,11 +1277,24 @@ Please correct and rerun." done if [ "${get_test_descs}" = "TRUE" ]; then + test_descs_orig=( "${test_descs[@]}" ) + num_fcsts_orig=( "${num_fcsts[@]}" ) + for (( k=0; k<=$((num_vars_to_extract-1)); k++ )); do + cmd="${array_names_vars_to_extract[$k]}_orig=(\"\${${array_names_vars_to_extract[$k]}[@]}\")" + eval $cmd + done + for (( i=0; i<=$((num_tests-1)); i++ )); do ii="${sort_inds[$i]}" test_descs[$i]="${test_descs_orig[$ii]}" + num_fcsts[$i]="${num_fcsts_orig[$ii]}" + for (( k=0; k<=$((num_vars_to_extract-1)); k++ )); do + cmd="${array_names_vars_to_extract[$k]}[$i]=\"\${${array_names_vars_to_extract[$k]}_orig[$ii]}\"" + eval $cmd + done done + fi # #----------------------------------------------------------------------- @@ -1094,18 +1320,27 @@ Please correct and rerun." # csv_delimiter="|" # -# Set the titles of the three columns that will be in the file. Then -# write them to the file. The contents of the columns are described in -# more detail further below. +# Set the titles of the columns that will be in the file. Then write +# them to the file. The contents of the columns are described in more +# detail further below. # column_titles="\ \"Test Name (Subdirectory)\" ${csv_delimiter} \ \"Alternate Test Names (Subdirectories)\" ${csv_delimiter} \ -\"Test Purpose/Description\"" +\"Test Purpose/Description\" ${csv_delimiter} \ +\"Number of Forecast Model Runs\"" + for (( k=0; k<=$((num_vars_to_extract-1)); k++ )); do + column_titles="\ +${column_titles} ${csv_delimiter} \ +\"${vars_to_extract[$k]}\"" + done printf "%s\n" "${column_titles}" >> "${csv_fp}" # # Loop through the arrays containing the WE2E test information. Extract # the necessary information and record it to the CSV file row-by-row. +# Note that each row corresponds to a primary test. When an alternate +# test is encountered, its information is stored in the row of the +# corresponding primary test (i.e. a new row is not created). # j=0 jp1=$((j+1)) @@ -1130,6 +1365,11 @@ Please correct and rerun." # test_desc=$( printf "%s" "${test_desc}" | sed -r -e "s/\"/\"\"/g" ) # +# Get the number of forecasts (number of times the forcast model is run, +# due to a unique starting date, an ensemble member, etc). 
+# + nf="${num_fcsts[$j]}" +# # In the following inner while-loop, we step through all alternate test # names (if any) that follow the current primary name and construct a # string (alt_test_names_subdirs) consisting of all the alternate test @@ -1167,11 +1407,30 @@ ${test_names[$jp1]} (${test_subdirs[$jp1]})" # # Column 3: # The test description. +# +# Column 4: +# The number of times the forecast model will be run by the test. This +# has been calculated above using the quantities that go in Columns 5, +# 6, .... +# +# Columns 5...: +# The values of the experiment variables specified in vars_to_extract. # row_content="\ \"${prim_test_name_subdir}\" ${csv_delimiter} \ \"${alt_test_names_subdirs}\" ${csv_delimiter} \ -\"${test_desc}\"" +\"${test_desc}\" ${csv_delimiter} \ +\"${nf}\"" + + for (( k=0; k<=$((num_vars_to_extract-1)); k++ )); do + unset "val" + cmd="val=\"\${${array_names_vars_to_extract[$k]}[$j]}\"" + eval $cmd + row_content="\ +${row_content} ${csv_delimiter} \ +\"${val}\"" + done + printf "%s\n" "${row_content}" >> "${csv_fp}" # # Update loop indices. @@ -1181,6 +1440,11 @@ ${test_names[$jp1]} (${test_subdirs[$jp1]})" done + print_info_msg "\ +Successfully generated a CSV (Comma Separated Value) file (csv_fp) +containing information on all WE2E tests: + csv_fp = \"${csv_fp}\"" + fi # #----------------------------------------------------------------------- diff --git a/tests/WE2E/get_expts_status.sh b/tests/WE2E/get_expts_status.sh index 01b127d9f..91de215d2 100755 --- a/tests/WE2E/get_expts_status.sh +++ b/tests/WE2E/get_expts_status.sh @@ -12,8 +12,8 @@ # directory represent active experiments (see below for how this is done). # For all such experiments, it calls the workflow (re)launch script to # update the status of the workflow and prints the status out to screen. -# It also generates a summary status file in the base directory that -# contains the last num_tail_lines lines (defined below) of each experiment's +# It also generates a status report file in the base directory that +# contains the last num_log_lines lines (defined below) of each experiment's # workflow log file [which is generated by the (re)launch script] and thus # has information on which tasks may have succeeded/failed]. # @@ -70,42 +70,112 @@ ushdir="$homerrfs/ush" # #----------------------------------------------------------------------- # -# Exactly one argument must be specified that consists of the full path -# to the experiments base directory (i.e. the directory containing the -# experiment subdirectories). Ensure that the number of arguments is -# one. +# Set the usage message. # #----------------------------------------------------------------------- # -num_args="$#" -if [ "${num_args}" -eq 1 ]; then - expts_basedir="$1" -else - print_err_msg_exit " -The number of arguments to this script must be exacty one, and that -argument must specify the experiments base directory, i.e. the directory -containing the experiment subdirectories. The acutal number of arguments -is: - num_args = ${num_args}" +usage_str="\ +Usage: + + ${scrfunc_fn} \\ + expts_basedir=\"...\" \\ + [verbose=\"...\"] + +The arguments in brackets are optional. The arguments are defined as +follows: + +expts_basedir: +Full path to the experiments base directory, i.e. the directory containing +the experiment subdirectories. 
+ +num_log_lines: +Optional integer specifying the number of lines from the end of the +workflow launch log file (log.launch_FV3LAM_wflow) of each test to +include in the status report file that this script generates. + +verbose: +Optional verbosity flag. Should be set to \"TRUE\" or \"FALSE\". Default +is \"FALSE\". +" +# +#----------------------------------------------------------------------- +# +# Check to see if usage help for this script is being requested. If so, +# print it out and exit with a 0 exit code (success). +# +#----------------------------------------------------------------------- +# +help_flag="--help" +if [ "$#" -eq 1 ] && [ "$1" = "${help_flag}" ]; then + print_info_msg "${usage_str}" + exit 0 +fi +# +#----------------------------------------------------------------------- +# +# Specify the set of valid argument names for this script or function. +# Then process the arguments provided to it on the command line (which +# should consist of a set of name-value pairs of the form arg1="value1", +# arg2="value2", etc). +# +#----------------------------------------------------------------------- +# +valid_args=( \ + "expts_basedir" \ + "num_log_lines" \ + "verbose" \ + ) +process_args valid_args "$@" +# +#----------------------------------------------------------------------- +# +# Set the default value of "num_log_lines". +# +#----------------------------------------------------------------------- +# +num_log_lines=${num_log_lines:-"40"} +# +#----------------------------------------------------------------------- +# +# Make the default value of "verbose" "FALSE". Then make sure "verbose" +# is set to a valid value. +# +#----------------------------------------------------------------------- +# +verbose=${verbose:-"FALSE"} +check_var_valid_value "verbose" "valid_vals_BOOLEAN" +verbose=$(boolify $verbose) +# +#----------------------------------------------------------------------- +# +# Verify that the required arguments to this script have been specified. +# If not, print out an error message and exit. +# +#----------------------------------------------------------------------- +# +help_msg="\ +Use + ${scrfunc_fn} ${help_flag} +to get help on how to use this script." + +if [ -z "${expts_basedir}" ]; then + print_err_msg_exit "\ +The argument \"expts_basedir\" specifying the base directory containing +the experiment directories was not specified in the call to this script. \ +${help_msg}" fi # #----------------------------------------------------------------------- # # Check that the specified experiments base directory exists and is # actually a directory. If not, print out an error message and exit. -# If so, print out an informational message. # #----------------------------------------------------------------------- # if [ ! -d "${expts_basedir}" ]; then print_err_msg_exit " -The experiments base directory (expts_basedir) does not exit or is not -actually a directory: - expts_basedir = \"${expts_basedir}\"" -else - print_info_msg " -Checking the workflow status of all forecast experiments in the following -specified experiments base directory: +The specified experiments base directory (expts_basedir) does not exit +or is not actually a directory: expts_basedir = \"${expts_basedir}\"" fi # @@ -116,7 +186,7 @@ fi # #----------------------------------------------------------------------- # -cd "${expts_basedir}" +cd_vrfy "${expts_basedir}" # # Get a list of all subdirectories (but not files) in the experiment base # directory. 
Note that the ls command below will return a string containing @@ -175,6 +245,12 @@ var_defns_fn="var_defns.sh" j="0" expt_subdirs=() +print_info_msg "\ +Checking for active experiment directories in the specified experiments +base directory (expts_basedir): + expts_basedir = \"${expts_basedir}\" +..." + num_subdirs="${#subdirs_list[@]}" for (( i=0; i<=$((num_subdirs-1)); i++ )); do @@ -184,7 +260,7 @@ $separator Checking whether the subdirectory \"${subdir}\" contains an active experiment..." - print_info_msg "$msg" + print_info_msg "$verbose" "$msg" cd_vrfy "${subdir}" # @@ -193,7 +269,7 @@ contains an active experiment..." # if [ ! -f "${var_defns_fn}" ]; then - print_info_msg " + print_info_msg "$verbose" " The current subdirectory (subdir) under the experiments base directory (expts_basedir) does not contain an experiment variable defintions file (var_defns_fn): @@ -219,7 +295,7 @@ must be checked." # if [ "${EXPT_SUBDIR}" = "$subdir" ]; then - print_info_msg " + print_info_msg "$verbose" " The current subdirectory (subdir) under the experiments base directory (expts_basedir) contains an active experiment: expts_basedir = \"${expts_basedir}\" @@ -238,7 +314,7 @@ subdirectories whose workflow status must be checked." # else - print_info_msg " + print_info_msg "$verbose" " The current subdirectory (subdir) under the experiments base directory (expts_basedir) contains an experiment whose original name (EXPT_SUBDIR) does not match the name of the current subdirectory: @@ -254,7 +330,7 @@ status must be checked." fi - print_info_msg "\ + print_info_msg "$verbose" "\ $separator " # @@ -302,7 +378,7 @@ check_for_preexist_dir_file "${expts_status_fp}" "rename" # Loop through the elements of the array expt_subdirs. For each element # (i.e. for each active experiment), change location to the experiment # directory and call the script launch_FV3LAM_wflow.sh to update the log -# file log.launch_FV3LAM_wflow. Then take the last num_tail_lines of +# file log.launch_FV3LAM_wflow. Then take the last num_log_lines of # this log file (along with an appropriate message) and add it to the # status report file. # @@ -310,7 +386,6 @@ check_for_preexist_dir_file "${expts_status_fp}" "rename" # launch_wflow_fn="launch_FV3LAM_wflow.sh" launch_wflow_log_fn="log.launch_FV3LAM_wflow" -num_tail_lines="40" for (( i=0; i<=$((num_expts-1)); i++ )); do @@ -326,25 +401,28 @@ Checking workflow status of experiment \"${expt_subdir}\" ..." # cd_vrfy "${expt_subdir}" launch_msg=$( "${launch_wflow_fn}" 2>&1 ) - log_tail=$( tail -n ${num_tail_lines} "${launch_wflow_log_fn}" ) + log_tail=$( tail -n ${num_log_lines} "${launch_wflow_log_fn}" ) # # Print the workflow status to the screen. # - wflow_status=$( printf "${log_tail}" | grep "Workflow status:" ) -# wflow_status="${wflow_status## }" # Not sure why this doesn't work to strip leading spaces. - wflow_status=$( printf "${wflow_status}" "%s" | sed -r 's|^[ ]*||g' ) # Remove leading spaces. + # The "tail -1" is to get only the last occurrence of "Workflow status" + wflow_status=$( printf "${log_tail}" | grep "Workflow status:" | tail -1 ) + # Not sure why this doesn't work to strip leading spaces. +# wflow_status="${wflow_status## }" + # Remove leading spaces. 
+ wflow_status=$( printf "${wflow_status}" "%s" | sed -r 's|^[ ]*||g' ) print_info_msg "${wflow_status}" print_info_msg "\ $separator " # -# Combine message above with the last num_tail_lines lines from the workflow +# Combine message above with the last num_log_lines lines from the workflow # launch log file and place the result in the status report file. # msg=$msg" ${wflow_status} -The last ${num_tail_lines} lines of this experiment's workflow launch log file +The last ${num_log_lines} lines of this experiment's workflow launch log file (\"${launch_wflow_log_fn}\") are: ${log_tail} @@ -360,4 +438,7 @@ ${log_tail} done print_info_msg "\ +A status report has been created in: + expts_status_fp = \"${expts_status_fp}\" + DONE." diff --git a/tests/WE2E/run_WE2E_tests.sh b/tests/WE2E/run_WE2E_tests.sh index f3040208e..a11dfaba3 100755 --- a/tests/WE2E/run_WE2E_tests.sh +++ b/tests/WE2E/run_WE2E_tests.sh @@ -94,7 +94,7 @@ Usage: [stmp=\"...\"] \\ [ptmp=\"...\"] \\ [compiler=\"...\"] \\ - [build_env_fn=\"...\"] + [build_mod_fn=\"...\"] The arguments in brackets are optional. The arguments are defined as follows: @@ -133,9 +133,9 @@ tests under subdirectory testset1, another set of tests under testset2, etc. exec_subdir: -Optional. Argument is used to set the EXEC_SUBDIR configuration -variable. Please see the ush/default_configs.sh file for a full -description. +Optional. Argument used to set the EXEC_SUBDIR experiment variable. +Please see the default experiment configuration file \"config_defaults.sh\" +for a full description of EXEC_SUBDIR. use_cron_to_relaunch: Argument used to explicitly set the experiment variable USE_CRON_TO_RELAUNCH @@ -208,14 +208,85 @@ Same as the argument \"stmp\" described above but for setting the experiment variable PTMP for all tests that will run in NCO mode. compiler: -Type of compiler to use for the workflow. Options are \"intel\" -and \"gnu\". Default is \"intel\", +Type of compiler to use for the workflow. Options are \"intel\" and \"gnu\". +Default is \"intel\". -build_env_fn: -Specify the build environment (see ufs-srweather-app/envs) to -use for the workflow. (e.g. build_cheyenne_gnu.env). If a +build_mod_fn: +Specify the build module files (see ufs-srweather-app/modulefiles) to +use for the workflow. (e.g. build_cheyenne_gnu). If a \"gnu\" compiler is specified, it must also be specified with the \"compiler\" option. + + +Usage Examples: +-------------- +Here, we give several common usage examples. In the following, assume +my_tests.txt is a text file in the same directory as this script containing +a list of test names that we want to run, e.g. + +> more my_tests.txt +new_ESGgrid +specify_DT_ATMOS_LAYOUT_XY_BLOCKSIZE + +Then: + +1) To run the tests listed in my_tests.txt on Hera and charge the core- + hours used to the \"rtrr\" account, use: + + > run_WE2E_tests.sh tests_file=\"my_tests.txt\" machine=\"hera\" account=\"rtrr\" + + This will create the experiment subdirectories for the two tests in + the directory + + \${SR_WX_APP_TOP_DIR}/../expt_dirs + + where SR_WX_APP_TOP_DIR is the directory in which the ufs-srweather-app + repository is cloned. Thus, the following two experiment directories + will be created: + + \${SR_WX_APP_TOP_DIR}/../expt_dirs/new_ESGgrid + \${SR_WX_APP_TOP_DIR}/../expt_dirs/specify_DT_ATMOS_LAYOUT_XY_BLOCKSIZE + + In addition, by default, cron jobs will be created in the user's cron + table to relaunch the workflows of these experiments every 2 minutes. 
+ +2) To change the frequency with which the cron relaunch jobs are submitted + from the default of 2 minutes to 1 minute, use: + + > run_WE2E_tests.sh tests_file=\"my_tests.txt\" machine=\"hera\" account=\"rtrr\" cron_relaunch_intvl_mnts=\"01\" + +3) To disable use of cron (which means the worfkow for each test will + have to be relaunched manually from within each experiment directory), + use: + + > run_WE2E_tests.sh tests_file=\"my_tests.txt\" machine=\"hera\" account=\"rtrr\" use_cron_to_relaunch=\"FALSE\" + +4) To place the experiment subdirectories in a subdirectory named \"test_set_01\" + under + + \${SR_WX_APP_TOP_DIR}/../expt_dirs + + (instead of immediately under the latter), use: + + > run_WE2E_tests.sh tests_file=\"my_tests.txt\" machine=\"hera\" account=\"rtrr\" expt_basedir=\"test_set_01\" + + In this case, the full paths to the experiment directories will be: + + \${SR_WX_APP_TOP_DIR}/../expt_dirs/test_set_01/new_ESGgrid + \${SR_WX_APP_TOP_DIR}/../expt_dirs/test_set_01/specify_DT_ATMOS_LAYOUT_XY_BLOCKSIZE + +5) To use a list of tests that is located in + + /path/to/custom/my_tests.txt + + instead of in the same directory as this script, and to have the + experiment directories be placed in an arbitrary location, say + + /path/to/custom/expt_dirs + + use: + + > run_WE2E_tests.sh tests_file=\"/path/to/custom/my_tests.txt\" machine=\"hera\" account=\"rtrr\" expt_basedir=\"/path/to/custom/expt_dirs\" " # #----------------------------------------------------------------------- @@ -253,7 +324,7 @@ valid_args=( \ "stmp" \ "ptmp" \ "compiler" \ - "build_env_fn" \ + "build_mod_fn" \ ) process_args valid_args "$@" # @@ -685,7 +756,7 @@ Please correct and rerun." MACHINE="${machine^^}" ACCOUNT="${account}" COMPILER=${compiler:-"intel"} - BUILD_ENV_FN=${build_env_fn:-"build_${machine}_${COMPILER}.env"} + BUILD_MOD_FN=${build_mod_fn:-"build_${machine}_${COMPILER}"} EXPT_BASEDIR="${expt_basedir}" EXPT_SUBDIR="${test_name}" USE_CRON_TO_RELAUNCH=${use_cron_to_relaunch:-"TRUE"} @@ -710,7 +781,7 @@ MACHINE=\"${MACHINE}\" ACCOUNT=\"${ACCOUNT}\" COMPILER=\"${COMPILER}\" -BUILD_ENV_FN=\"${BUILD_ENV_FN}\"" +BUILD_MOD_FN=\"${BUILD_MOD_FN}\"" if [ -n "${exec_subdir}" ]; then expt_config_str=${expt_config_str}" @@ -854,7 +925,7 @@ SFC_CLIMO_DIR=\"${SFC_CLIMO_DIR}\"" # 2) The directory in which the output files from the post-processor (UPP) # for a given cycle are stored. The path to this directory is # -# \$PTMP/com/\$NET/\$envir/\$RUN.\$yyyymmdd/\$hh +# \$PTMP/com/\$NET/\$model_ver/\$RUN.\$yyyymmdd/\$hh # # Here, we make the first directory listed above unique to a WE2E test # by setting RUN to the name of the current test. This will also make @@ -872,40 +943,32 @@ SFC_CLIMO_DIR=\"${SFC_CLIMO_DIR}\"" # envir to the same value as RUN (which is just EXPT_SUBDIR). Then, for # this test, the UPP output will be located in the directory # -# \$PTMP/com/\$NET/\$RUN/\$RUN.\$yyyymmdd/\$hh +# \$PTMP/com/\$NET/\we2e/\$RUN.\$yyyymmdd/\$hh # RUN=\"\${EXPT_SUBDIR}\" -envir=\"\${EXPT_SUBDIR}\"" +model_ver="we2e"" # -# Set COMINgfs if using the FV3GFS or the GSMGFS as the external model -# for ICs or LBCs. -# - if [ "${EXTRN_MDL_NAME_ICS}" = "FV3GFS" ] || \ - [ "${EXTRN_MDL_NAME_ICS}" = "GSMGFS" ] || \ - [ "${EXTRN_MDL_NAME_LBCS}" = "FV3GFS" ] || \ - [ "${EXTRN_MDL_NAME_LBCS}" = "GSMGFS" ]; then +# Set COMIN. - COMINgfs=${TEST_COMINgfs:-} + COMIN=${TEST_COMIN:-} - if [ ! -d "${COMINgfs:-}" ] ; then - print_err_msg_exit "\ -The directory (COMINgfs) that needs to be specified when running the + if [ ! 
-d "${COMIN:-}" ] ; then + print_err_msg_exit "\ +The directory (COMIN) that needs to be specified when running the workflow in NCO mode (RUN_ENVIR set to \"nco\") AND using the FV3GFS or the GSMGFS as the external model for ICs and/or LBCs has not been specified for this machine (MACHINE): MACHINE= \"${MACHINE}\"" - fi + fi - expt_config_str=${expt_config_str}" + expt_config_str=${expt_config_str}" # # Directory that needs to be specified when running the workflow in NCO -# mode (RUN_ENVIR set to \"nco\") AND using the FV3GFS or the GSMGFS as -# the external model for ICs and/or LBCs. +# mode (RUN_ENVIR set to \"nco\"). # -COMINgfs=\"${COMINgfs}\"" +COMIN=\"${COMIN}\"" - fi # # Set STMP and PTMP. # @@ -932,6 +995,9 @@ PTMP=\"${PTMP}\"" # if [ "${USE_USER_STAGED_EXTRN_FILES}" = "TRUE" ]; then + # Ensure we only check on disk for these files + data_stores="disk" + extrn_mdl_source_basedir=${TEST_EXTRN_MDL_SOURCE_BASEDIR:-} if [ ! -d "${extrn_mdl_source_basedir:-}" ] ; then print_err_msg_exit "\ @@ -942,20 +1008,16 @@ machine (MACHINE): fi EXTRN_MDL_SOURCE_BASEDIR_ICS="${extrn_mdl_source_basedir}/${EXTRN_MDL_NAME_ICS}" if [ "${EXTRN_MDL_NAME_ICS}" = "FV3GFS" ] ; then - EXTRN_MDL_SOURCE_BASEDIR_ICS="${EXTRN_MDL_SOURCE_BASEDIR_ICS}/${FV3GFS_FILE_FMT_ICS}/\${DATE_FIRST_CYCL}\${CYCL_HRS[0]}" - elif [ "${EXTRN_MDL_NAME_ICS}" = "NAM" ] ; then - EXTRN_MDL_SOURCE_BASEDIR_ICS="${EXTRN_MDL_SOURCE_BASEDIR_ICS}/\${DATE_FIRST_CYCL}\${CYCL_HRS[0]}/for_ICS" + EXTRN_MDL_SOURCE_BASEDIR_ICS="${EXTRN_MDL_SOURCE_BASEDIR_ICS}/${FV3GFS_FILE_FMT_ICS}/\${yyyymmddhh}" else - EXTRN_MDL_SOURCE_BASEDIR_ICS="${EXTRN_MDL_SOURCE_BASEDIR_ICS}/\${DATE_FIRST_CYCL}\${CYCL_HRS[0]}" + EXTRN_MDL_SOURCE_BASEDIR_ICS="${EXTRN_MDL_SOURCE_BASEDIR_ICS}/\${yyyymmddhh}" fi EXTRN_MDL_SOURCE_BASEDIR_LBCS="${extrn_mdl_source_basedir}/${EXTRN_MDL_NAME_LBCS}" if [ "${EXTRN_MDL_NAME_LBCS}" = "FV3GFS" ] ; then - EXTRN_MDL_SOURCE_BASEDIR_LBCS="${EXTRN_MDL_SOURCE_BASEDIR_LBCS}/${FV3GFS_FILE_FMT_LBCS}/\${DATE_FIRST_CYCL}\${CYCL_HRS[0]}" - elif [ "${EXTRN_MDL_NAME_LBCS}" = "GSMGFS" ] ; then - EXTRN_MDL_SOURCE_BASEDIR_LBCS="${EXTRN_MDL_SOURCE_BASEDIR_LBCS}/\${DATE_FIRST_CYCL}\${CYCL_HRS[0]}" + EXTRN_MDL_SOURCE_BASEDIR_LBCS="${EXTRN_MDL_SOURCE_BASEDIR_LBCS}/${FV3GFS_FILE_FMT_LBCS}/\${yyyymmddhh}" else - EXTRN_MDL_SOURCE_BASEDIR_LBCS="${EXTRN_MDL_SOURCE_BASEDIR_LBCS}/\${DATE_FIRST_CYCL}\${CYCL_HRS[0]}/for_LBCS" + EXTRN_MDL_SOURCE_BASEDIR_LBCS="${EXTRN_MDL_SOURCE_BASEDIR_LBCS}/\${yyyymmddhh}" fi # # Make sure that the forecast length is evenly divisible by the interval @@ -976,10 +1038,11 @@ boundary conditions specification interval (LBC_SPEC_INTVL_HRS): # Locations and names of user-staged external model files for generating # ICs and LBCs. 
# -EXTRN_MDL_SOURCE_BASEDIR_ICS=${EXTRN_MDL_SOURCE_BASEDIR_ICS} +EXTRN_MDL_SOURCE_BASEDIR_ICS='${EXTRN_MDL_SOURCE_BASEDIR_ICS}' EXTRN_MDL_FILES_ICS=( ${EXTRN_MDL_FILES_ICS[@]} ) -EXTRN_MDL_SOURCE_BASEDIR_LBCS=${EXTRN_MDL_SOURCE_BASEDIR_LBCS} -EXTRN_MDL_FILES_LBCS=( ${EXTRN_MDL_FILES_LBCS[@]} )" +EXTRN_MDL_SOURCE_BASEDIR_LBCS='${EXTRN_MDL_SOURCE_BASEDIR_LBCS}' +EXTRN_MDL_FILES_LBCS=( ${EXTRN_MDL_FILES_LBCS[@]} ) +EXTRN_MDL_DATA_STORES=\"$data_stores\"" fi # diff --git a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_RAP_suite_HRRR.sh b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_RAP_suite_HRRR.sh index 492a16336..1465641a5 100644 --- a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_RAP_suite_HRRR.sh +++ b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_RAP_suite_HRRR.sh @@ -22,6 +22,7 @@ EXTRN_MDL_NAME_ICS="FV3GFS" FV3GFS_FILE_FMT_ICS="grib2" EXTRN_MDL_NAME_LBCS="RAP" USE_USER_STAGED_EXTRN_FILES="TRUE" +EXTRN_MDL_FILES_LBCS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) DATE_FIRST_CYCL="20200810" DATE_LAST_CYCL="20200810" diff --git a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_HRRR.sh b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_HRRR.sh index 4239752de..40f5e4997 100644 --- a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_HRRR.sh +++ b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_HRRR.sh @@ -17,6 +17,8 @@ CCPP_PHYS_SUITE="FV3_HRRR" EXTRN_MDL_NAME_ICS="HRRR" EXTRN_MDL_NAME_LBCS="RAP" USE_USER_STAGED_EXTRN_FILES="TRUE" +EXTRN_MDL_FILES_ICS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) +EXTRN_MDL_FILES_LBCS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) DATE_FIRST_CYCL="20200810" DATE_LAST_CYCL="20200810" diff --git a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_RRFS_v1alpha.sh b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_RRFS_v1alpha.sh index 8aaeddf43..f44afffcd 100644 --- a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_RRFS_v1alpha.sh +++ b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_RRFS_v1alpha.sh @@ -16,6 +16,8 @@ CCPP_PHYS_SUITE="FV3_RRFS_v1alpha" EXTRN_MDL_NAME_ICS="HRRR" EXTRN_MDL_NAME_LBCS="RAP" USE_USER_STAGED_EXTRN_FILES="TRUE" +EXTRN_MDL_FILES_ICS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) +EXTRN_MDL_FILES_LBCS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) DATE_FIRST_CYCL="20200801" DATE_LAST_CYCL="20200801" diff --git a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh index 9e86198e6..afcfa32e1 100644 --- a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh +++ 
b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh @@ -16,6 +16,8 @@ CCPP_PHYS_SUITE="FV3_RRFS_v1beta" EXTRN_MDL_NAME_ICS="HRRR" EXTRN_MDL_NAME_LBCS="RAP" USE_USER_STAGED_EXTRN_FILES="TRUE" +EXTRN_MDL_FILES_ICS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) +EXTRN_MDL_FILES_LBCS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) DATE_FIRST_CYCL="20200801" DATE_LAST_CYCL="20200801" diff --git a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_HRRR_suite_HRRR.sh b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_HRRR_suite_HRRR.sh index 67b381044..f1302d163 100644 --- a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_HRRR_suite_HRRR.sh +++ b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_HRRR_suite_HRRR.sh @@ -16,6 +16,8 @@ CCPP_PHYS_SUITE="FV3_HRRR" EXTRN_MDL_NAME_ICS="HRRR" EXTRN_MDL_NAME_LBCS="HRRR" USE_USER_STAGED_EXTRN_FILES="TRUE" +EXTRN_MDL_FILES_ICS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) +EXTRN_MDL_FILES_LBCS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) DATE_FIRST_CYCL="20200810" DATE_LAST_CYCL="20200810" @@ -23,3 +25,4 @@ CYCL_HRS=( "00" ) FCST_LEN_HRS="24" LBC_SPEC_INTVL_HRS="3" + diff --git a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_HRRR_suite_RRFS_v1beta.sh b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_HRRR_suite_RRFS_v1beta.sh index 09b9548d1..0060e4466 100644 --- a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_HRRR_suite_RRFS_v1beta.sh +++ b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_HRRR_suite_RRFS_v1beta.sh @@ -16,6 +16,8 @@ CCPP_PHYS_SUITE="FV3_RRFS_v1beta" EXTRN_MDL_NAME_ICS="HRRR" EXTRN_MDL_NAME_LBCS="HRRR" USE_USER_STAGED_EXTRN_FILES="TRUE" +EXTRN_MDL_FILES_ICS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) +EXTRN_MDL_FILES_LBCS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) DATE_FIRST_CYCL="20200810" DATE_LAST_CYCL="20200810" diff --git a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_HRRR.sh b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_HRRR.sh index f4b781e07..3dfedb568 100644 --- a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_HRRR.sh +++ b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_HRRR.sh @@ -16,6 +16,8 @@ CCPP_PHYS_SUITE="FV3_HRRR" EXTRN_MDL_NAME_ICS="HRRR" EXTRN_MDL_NAME_LBCS="RAP" USE_USER_STAGED_EXTRN_FILES="TRUE" +EXTRN_MDL_FILES_ICS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) +EXTRN_MDL_FILES_LBCS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) DATE_FIRST_CYCL="20200810" DATE_LAST_CYCL="20200810" diff --git a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_RRFS_v1alpha.sh b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_RRFS_v1alpha.sh index b0b8a2e42..bf2e2f15e 100644 --- 
a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_RRFS_v1alpha.sh +++ b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_RRFS_v1alpha.sh @@ -16,6 +16,8 @@ CCPP_PHYS_SUITE="FV3_RRFS_v1alpha" EXTRN_MDL_NAME_ICS="HRRR" EXTRN_MDL_NAME_LBCS="RAP" USE_USER_STAGED_EXTRN_FILES="TRUE" +EXTRN_MDL_FILES_ICS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) +EXTRN_MDL_FILES_LBCS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) DATE_FIRST_CYCL="20200810" DATE_LAST_CYCL="20200810" diff --git a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh index 2a98569a3..8fc60571d 100644 --- a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh +++ b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh @@ -16,6 +16,8 @@ CCPP_PHYS_SUITE="FV3_RRFS_v1beta" EXTRN_MDL_NAME_ICS="HRRR" EXTRN_MDL_NAME_LBCS="RAP" USE_USER_STAGED_EXTRN_FILES="TRUE" +EXTRN_MDL_FILES_ICS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) +EXTRN_MDL_FILES_LBCS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) DATE_FIRST_CYCL="20200810" DATE_LAST_CYCL="20200810" diff --git a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_GFS_v15p2.sh b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_GFS_v15p2.sh index 240715751..11227ea00 100644 --- a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_GFS_v15p2.sh +++ b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_GFS_v15p2.sh @@ -16,6 +16,8 @@ CCPP_PHYS_SUITE="FV3_GFS_v15p2" EXTRN_MDL_NAME_ICS="HRRR" EXTRN_MDL_NAME_LBCS="RAP" USE_USER_STAGED_EXTRN_FILES="TRUE" +EXTRN_MDL_FILES_ICS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) +EXTRN_MDL_FILES_LBCS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) DATE_FIRST_CYCL="20200801" DATE_LAST_CYCL="20200801" diff --git a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_HRRR.sh b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_HRRR.sh index eda52d1e7..396ce3e15 100644 --- a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_HRRR.sh +++ b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_HRRR.sh @@ -16,6 +16,9 @@ CCPP_PHYS_SUITE="FV3_HRRR" EXTRN_MDL_NAME_ICS="HRRR" EXTRN_MDL_NAME_LBCS="RAP" USE_USER_STAGED_EXTRN_FILES="TRUE" +EXTRN_MDL_FILES_ICS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) +EXTRN_MDL_FILES_LBCS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) + DATE_FIRST_CYCL="20200810" DATE_LAST_CYCL="20200810" diff --git a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1alpha.sh b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1alpha.sh index 1b3131663..cc92aecaa 100644 --- 
a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1alpha.sh +++ b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1alpha.sh @@ -17,6 +17,8 @@ CCPP_PHYS_SUITE="FV3_RRFS_v1alpha" EXTRN_MDL_NAME_ICS="HRRR" EXTRN_MDL_NAME_LBCS="RAP" USE_USER_STAGED_EXTRN_FILES="TRUE" +EXTRN_MDL_FILES_ICS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) +EXTRN_MDL_FILES_LBCS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) DATE_FIRST_CYCL="20200801" DATE_LAST_CYCL="20200801" diff --git a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh index cf6965435..a75f8d79e 100644 --- a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh +++ b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh @@ -17,6 +17,8 @@ CCPP_PHYS_SUITE="FV3_RRFS_v1beta" EXTRN_MDL_NAME_ICS="HRRR" EXTRN_MDL_NAME_LBCS="RAP" USE_USER_STAGED_EXTRN_FILES="TRUE" +EXTRN_MDL_FILES_ICS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) +EXTRN_MDL_FILES_LBCS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) DATE_FIRST_CYCL="20200801" DATE_LAST_CYCL="20200801" diff --git a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_SUBCONUS_3km_ics_HRRR_lbcs_RAP_suite_GFS_v15p2.sh b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_SUBCONUS_3km_ics_HRRR_lbcs_RAP_suite_GFS_v15p2.sh index 3534d5df9..c5512d0ea 100644 --- a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_SUBCONUS_3km_ics_HRRR_lbcs_RAP_suite_GFS_v15p2.sh +++ b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_RRFS_SUBCONUS_3km_ics_HRRR_lbcs_RAP_suite_GFS_v15p2.sh @@ -17,6 +17,8 @@ CCPP_PHYS_SUITE="FV3_GFS_v15p2" EXTRN_MDL_NAME_ICS="HRRR" EXTRN_MDL_NAME_LBCS="RAP" USE_USER_STAGED_EXTRN_FILES="TRUE" +EXTRN_MDL_FILES_ICS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) +EXTRN_MDL_FILES_LBCS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) DATE_FIRST_CYCL="20200801" DATE_LAST_CYCL="20200801" diff --git a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.SUBCONUS_Ind_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_SUBCONUS_Ind_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh similarity index 100% rename from tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.SUBCONUS_Ind_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh rename to tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_SUBCONUS_Ind_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh diff --git a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_SUBCONUS_Ind_3km_ics_HRRR_lbcs_RAP_suite_HRRR.sh b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_SUBCONUS_Ind_3km_ics_HRRR_lbcs_RAP_suite_HRRR.sh new file mode 100644 index 000000000..aa2cf6edc --- /dev/null +++ b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_SUBCONUS_Ind_3km_ics_HRRR_lbcs_RAP_suite_HRRR.sh @@ -0,0 +1,27 @@ +# +# TEST PURPOSE/DESCRIPTION: +# ------------------------ +# +# This test is to ensure that the workflow running in community mode +# completes successfully on the SUBCONUS_Ind_3km grid using 
the HRRR +# physics suite with ICs derived from HRRR and LBCs derived from the RAP. +# + +RUN_ENVIR="community" +PREEXISTING_DIR_METHOD="rename" + +PREDEF_GRID_NAME="SUBCONUS_Ind_3km" +CCPP_PHYS_SUITE="FV3_HRRR" + +EXTRN_MDL_NAME_ICS="HRRR" +EXTRN_MDL_NAME_LBCS="RAP" +USE_USER_STAGED_EXTRN_FILES="TRUE" +EXTRN_MDL_FILES_ICS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) +EXTRN_MDL_FILES_LBCS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) + +DATE_FIRST_CYCL="20200810" +DATE_LAST_CYCL="20200810" +CYCL_HRS=( "00" ) + +FCST_LEN_HRS="6" +LBC_SPEC_INTVL_HRS="6" diff --git a/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_SUBCONUS_Ind_3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_SUBCONUS_Ind_3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh new file mode 100644 index 000000000..82c5ef43a --- /dev/null +++ b/tests/WE2E/test_configs/grids_extrn_mdls_suites_community/config.grid_SUBCONUS_Ind_3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh @@ -0,0 +1,27 @@ +# +# TEST PURPOSE/DESCRIPTION: +# ------------------------ +# +# This test is to ensure that the workflow running in community mode +# completes successfully on the SUBCONUS_Ind_3km grid using the RRFS_v1beta +# physics suite with ICs derived from HRRR and LBCs derived from the RAP. +# + +RUN_ENVIR="community" +PREEXISTING_DIR_METHOD="rename" + +PREDEF_GRID_NAME="SUBCONUS_Ind_3km" +CCPP_PHYS_SUITE="FV3_RRFS_v1beta" + +EXTRN_MDL_NAME_ICS="HRRR" +EXTRN_MDL_NAME_LBCS="RAP" +USE_USER_STAGED_EXTRN_FILES="TRUE" +EXTRN_MDL_FILES_ICS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) +EXTRN_MDL_FILES_LBCS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) + +DATE_FIRST_CYCL="20200801" +DATE_LAST_CYCL="20200801" +CYCL_HRS=( "00" ) + +FCST_LEN_HRS="6" +LBC_SPEC_INTVL_HRS="6" diff --git a/tests/WE2E/test_configs/grids_extrn_mdls_suites_nco/config.nco_grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh b/tests/WE2E/test_configs/grids_extrn_mdls_suites_nco/config.nco_grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh index f9eea0729..f6d6454bd 100644 --- a/tests/WE2E/test_configs/grids_extrn_mdls_suites_nco/config.nco_grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh +++ b/tests/WE2E/test_configs/grids_extrn_mdls_suites_nco/config.nco_grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh @@ -17,7 +17,6 @@ EXTRN_MDL_NAME_ICS="FV3GFS" FV3GFS_FILE_FMT_ICS="grib2" EXTRN_MDL_NAME_LBCS="FV3GFS" FV3GFS_FILE_FMT_LBCS="grib2" -USE_USER_STAGED_EXTRN_FILES="TRUE" DATE_FIRST_CYCL="20190615" DATE_LAST_CYCL="20190615" diff --git a/tests/WE2E/test_configs/grids_extrn_mdls_suites_nco/config.nco_grid_RRFS_CONUS_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v15_thompson_mynn_lam3km.sh b/tests/WE2E/test_configs/grids_extrn_mdls_suites_nco/config.nco_grid_RRFS_CONUS_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v15_thompson_mynn_lam3km.sh index 6a02d4f11..19b36f87f 100644 --- a/tests/WE2E/test_configs/grids_extrn_mdls_suites_nco/config.nco_grid_RRFS_CONUS_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v15_thompson_mynn_lam3km.sh +++ b/tests/WE2E/test_configs/grids_extrn_mdls_suites_nco/config.nco_grid_RRFS_CONUS_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v15_thompson_mynn_lam3km.sh @@ -20,8 +20,6 @@ FV3GFS_FILE_FMT_ICS="grib2" EXTRN_MDL_NAME_LBCS="FV3GFS" FV3GFS_FILE_FMT_LBCS="grib2" -USE_USER_STAGED_EXTRN_FILES="TRUE" - DATE_FIRST_CYCL="20190615" DATE_LAST_CYCL="20190615" CYCL_HRS=( "00" ) diff --git a/tests/WE2E/test_configs/grids_extrn_mdls_suites_nco/config.nco_grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_HRRR.sh 
b/tests/WE2E/test_configs/grids_extrn_mdls_suites_nco/config.nco_grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_HRRR.sh index c2a054b30..e5c4e7732 100644 --- a/tests/WE2E/test_configs/grids_extrn_mdls_suites_nco/config.nco_grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_HRRR.sh +++ b/tests/WE2E/test_configs/grids_extrn_mdls_suites_nco/config.nco_grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_HRRR.sh @@ -16,6 +16,8 @@ CCPP_PHYS_SUITE="FV3_HRRR" EXTRN_MDL_NAME_ICS="HRRR" EXTRN_MDL_NAME_LBCS="RAP" USE_USER_STAGED_EXTRN_FILES="TRUE" +EXTRN_MDL_FILES_ICS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) +EXTRN_MDL_FILES_LBCS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) DATE_FIRST_CYCL="20200810" DATE_LAST_CYCL="20200810" diff --git a/tests/WE2E/test_configs/wflow_features/config.new_ESGgrid.sh b/tests/WE2E/test_configs/wflow_features/config.custom_ESGgrid.sh similarity index 96% rename from tests/WE2E/test_configs/wflow_features/config.new_ESGgrid.sh rename to tests/WE2E/test_configs/wflow_features/config.custom_ESGgrid.sh index fc34f4a0c..bb8d2bc0c 100644 --- a/tests/WE2E/test_configs/wflow_features/config.new_ESGgrid.sh +++ b/tests/WE2E/test_configs/wflow_features/config.custom_ESGgrid.sh @@ -44,6 +44,8 @@ LAYOUT_X="8" LAYOUT_Y="12" BLOCKSIZE="13" +POST_OUTPUT_DOMAIN_NAME="custom_ESGgrid" + QUILTING="TRUE" if [ "$QUILTING" = "TRUE" ]; then WRTCMP_write_groups="1" diff --git a/tests/WE2E/test_configs/wflow_features/config.new_GFDLgrid.sh b/tests/WE2E/test_configs/wflow_features/config.custom_GFDLgrid.sh similarity index 98% rename from tests/WE2E/test_configs/wflow_features/config.new_GFDLgrid.sh rename to tests/WE2E/test_configs/wflow_features/config.custom_GFDLgrid.sh index d27148de2..66604a549 100644 --- a/tests/WE2E/test_configs/wflow_features/config.new_GFDLgrid.sh +++ b/tests/WE2E/test_configs/wflow_features/config.custom_GFDLgrid.sh @@ -76,6 +76,8 @@ LAYOUT_X="6" LAYOUT_Y="6" BLOCKSIZE="26" +POST_OUTPUT_DOMAIN_NAME="custom_GFDLgrid" + QUILTING="TRUE" if [ "$QUILTING" = "TRUE" ]; then WRTCMP_write_groups="1" diff --git a/tests/WE2E/test_configs/wflow_features/config.new_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_FALSE.sh b/tests/WE2E/test_configs/wflow_features/config.custom_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_FALSE.sh similarity index 97% rename from tests/WE2E/test_configs/wflow_features/config.new_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_FALSE.sh rename to tests/WE2E/test_configs/wflow_features/config.custom_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_FALSE.sh index aef2636f2..4c4f0d5c7 100644 --- a/tests/WE2E/test_configs/wflow_features/config.new_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_FALSE.sh +++ b/tests/WE2E/test_configs/wflow_features/config.custom_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_FALSE.sh @@ -63,6 +63,8 @@ LAYOUT_X="6" LAYOUT_Y="6" BLOCKSIZE="26" +POST_OUTPUT_DOMAIN_NAME="custom_GFDLgrid" + QUILTING="TRUE" if [ "$QUILTING" = "TRUE" ]; then WRTCMP_write_groups="1" diff --git a/tests/WE2E/test_configs/wflow_features/config.new_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_TRUE.sh b/tests/WE2E/test_configs/wflow_features/config.custom_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_TRUE.sh similarity index 97% rename from tests/WE2E/test_configs/wflow_features/config.new_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_TRUE.sh rename to tests/WE2E/test_configs/wflow_features/config.custom_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_TRUE.sh index 59ca0925e..3e067ee3d 100644 --- 
a/tests/WE2E/test_configs/wflow_features/config.new_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_TRUE.sh +++ b/tests/WE2E/test_configs/wflow_features/config.custom_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_TRUE.sh @@ -63,6 +63,8 @@ LAYOUT_X="6" LAYOUT_Y="6" BLOCKSIZE="26" +POST_OUTPUT_DOMAIN_NAME="custom_GFDLgrid" + QUILTING="TRUE" if [ "$QUILTING" = "TRUE" ]; then WRTCMP_write_groups="1" diff --git a/tests/WE2E/test_configs/wflow_features/config.get_from_HPSS_ics_HRRR_lbcs_RAP.sh b/tests/WE2E/test_configs/wflow_features/config.get_from_HPSS_ics_HRRR_lbcs_RAP.sh index 5d6ed126e..61d4e5c9e 100644 --- a/tests/WE2E/test_configs/wflow_features/config.get_from_HPSS_ics_HRRR_lbcs_RAP.sh +++ b/tests/WE2E/test_configs/wflow_features/config.get_from_HPSS_ics_HRRR_lbcs_RAP.sh @@ -17,6 +17,8 @@ CCPP_PHYS_SUITE="FV3_HRRR" EXTRN_MDL_NAME_ICS="HRRR" EXTRN_MDL_NAME_LBCS="RAP" +EXTRN_MDL_FILES_ICS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) +EXTRN_MDL_FILES_LBCS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) DATE_FIRST_CYCL="20200801" DATE_LAST_CYCL="20200801" diff --git a/tests/WE2E/test_configs/wflow_features/config.specify_DT_ATMOS_LAYOUT_XY_BLOCKSIZE.sh b/tests/WE2E/test_configs/wflow_features/config.specify_DT_ATMOS_LAYOUT_XY_BLOCKSIZE.sh index d985559f2..7a10a0ec3 100644 --- a/tests/WE2E/test_configs/wflow_features/config.specify_DT_ATMOS_LAYOUT_XY_BLOCKSIZE.sh +++ b/tests/WE2E/test_configs/wflow_features/config.specify_DT_ATMOS_LAYOUT_XY_BLOCKSIZE.sh @@ -16,6 +16,8 @@ CCPP_PHYS_SUITE="FV3_HRRR" EXTRN_MDL_NAME_ICS="HRRR" EXTRN_MDL_NAME_LBCS="RAP" USE_USER_STAGED_EXTRN_FILES="TRUE" +EXTRN_MDL_FILES_ICS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) +EXTRN_MDL_FILES_LBCS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) DATE_FIRST_CYCL="20200801" DATE_LAST_CYCL="20200801" diff --git a/tests/WE2E/test_configs/wflow_features/config.specify_EXTRN_MDL_SYSBASEDIR_ICS_LBCS.sh b/tests/WE2E/test_configs/wflow_features/config.specify_EXTRN_MDL_SYSBASEDIR_ICS_LBCS.sh index 2ade79ff5..eca762a94 100644 --- a/tests/WE2E/test_configs/wflow_features/config.specify_EXTRN_MDL_SYSBASEDIR_ICS_LBCS.sh +++ b/tests/WE2E/test_configs/wflow_features/config.specify_EXTRN_MDL_SYSBASEDIR_ICS_LBCS.sh @@ -18,9 +18,9 @@ FV3GFS_FILE_FMT_ICS="grib2" EXTRN_MDL_NAME_LBCS="FV3GFS" FV3GFS_FILE_FMT_LBCS="grib2" -DATE_FIRST_CYCL="20210603" -DATE_LAST_CYCL="20210603" -CYCL_HRS=( "06" ) +DATE_FIRST_CYCL="20210615" +DATE_LAST_CYCL="20210615" +CYCL_HRS=( "00" ) FCST_LEN_HRS="6" LBC_SPEC_INTVL_HRS="3" diff --git a/tests/WE2E/test_configs/wflow_features/config.subhourly_post.sh b/tests/WE2E/test_configs/wflow_features/config.subhourly_post.sh index 09df72e29..a1676e4d5 100644 --- a/tests/WE2E/test_configs/wflow_features/config.subhourly_post.sh +++ b/tests/WE2E/test_configs/wflow_features/config.subhourly_post.sh @@ -15,6 +15,8 @@ CCPP_PHYS_SUITE="FV3_RRFS_v1beta" EXTRN_MDL_NAME_ICS="HRRR" EXTRN_MDL_NAME_LBCS="RAP" USE_USER_STAGED_EXTRN_FILES="TRUE" +EXTRN_MDL_FILES_ICS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) +EXTRN_MDL_FILES_LBCS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) DATE_FIRST_CYCL="20200810" DATE_LAST_CYCL="20200810" diff --git a/tests/WE2E/test_configs/wflow_features/config.subhourly_post_ensemble_2mems.sh b/tests/WE2E/test_configs/wflow_features/config.subhourly_post_ensemble_2mems.sh index 4c2080711..adcc37a33 100644 --- a/tests/WE2E/test_configs/wflow_features/config.subhourly_post_ensemble_2mems.sh +++ b/tests/WE2E/test_configs/wflow_features/config.subhourly_post_ensemble_2mems.sh @@ -21,6 +21,8 @@ CCPP_PHYS_SUITE="FV3_RRFS_v1beta" EXTRN_MDL_NAME_ICS="HRRR" 
EXTRN_MDL_NAME_LBCS="RAP" USE_USER_STAGED_EXTRN_FILES="TRUE" +EXTRN_MDL_FILES_ICS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) +EXTRN_MDL_FILES_LBCS=( '{yy}{jjj}{hh}00{fcst_hr:02d}00' ) DATE_FIRST_CYCL="20200810" DATE_LAST_CYCL="20200810" diff --git a/ush/NOMADS_get_extrn_mdl_files.sh b/ush/NOMADS_get_extrn_mdl_files.sh index 308aa8082..789eeff3d 100755 --- a/ush/NOMADS_get_extrn_mdl_files.sh +++ b/ush/NOMADS_get_extrn_mdl_files.sh @@ -37,10 +37,10 @@ cd gfs.$yyyymmdd/$hh #getting online analysis data if [ $file_fmt == "grib2" ] || [ $file_fmt == "GRIB2" ]; then - wget -c https://nomads.ncep.noaa.gov/pub/data/nccf/com/gfs/prod/gfs.$yyyymmdd/$hh/gfs.t${hh}z.pgrb2.0p25.f000 + wget --tries=2 -c https://nomads.ncep.noaa.gov/pub/data/nccf/com/gfs/prod/gfs.$yyyymmdd/$hh/gfs.t${hh}z.pgrb2.0p25.f000 else - wget -c https://nomads.ncep.noaa.gov/pub/data/nccf/com/gfs/prod/gfs.$yyyymmdd/$hh/gfs.t${hh}z.atmanl.nemsio - wget -c https://nomads.ncep.noaa.gov/pub/data/nccf/com/gfs/prod/gfs.$yyyymmdd/$hh/gfs.t${hh}z.sfcanl.nemsio + wget --tries=2 -c https://nomads.ncep.noaa.gov/pub/data/nccf/com/gfs/prod/gfs.$yyyymmdd/$hh/gfs.t${hh}z.atmanl.nemsio + wget --tries=2 -c https://nomads.ncep.noaa.gov/pub/data/nccf/com/gfs/prod/gfs.$yyyymmdd/$hh/gfs.t${hh}z.sfcanl.nemsio fi #getting online forecast data @@ -60,10 +60,10 @@ echo $ifcst echo $ifcst_str # if [ $file_fmt == "grib2" ] || [ $file_fmt == "GRIB2" ]; then - wget -c https://nomads.ncep.noaa.gov/pub/data/nccf/com/gfs/prod/gfs.$yyyymmdd/$hh/gfs.t${hh}z.pgrb2.0p25.f${ifcst_str} + wget --tries=2 -c https://nomads.ncep.noaa.gov/pub/data/nccf/com/gfs/prod/gfs.$yyyymmdd/$hh/gfs.t${hh}z.pgrb2.0p25.f${ifcst_str} else - wget -c https://nomads.ncep.noaa.gov/pub/data/nccf/com/gfs/prod/gfs.$yyyymmdd/$hh/gfs.t${hh}z.atmf${ifcst_str}.nemsio - wget -c https://nomads.ncep.noaa.gov/pub/data/nccf/com/gfs/prod/gfs.$yyyymmdd/$hh/gfs.t${hh}z.sfcf${ifcst_str}.nemsio + wget --tries=2 -c https://nomads.ncep.noaa.gov/pub/data/nccf/com/gfs/prod/gfs.$yyyymmdd/$hh/gfs.t${hh}z.atmf${ifcst_str}.nemsio + wget --tries=2 -c https://nomads.ncep.noaa.gov/pub/data/nccf/com/gfs/prod/gfs.$yyyymmdd/$hh/gfs.t${hh}z.sfcf${ifcst_str}.nemsio fi # ifcst=$[$ifcst+$nfcst_int] diff --git a/ush/check_expt_config_vars.sh b/ush/check_expt_config_vars.sh index 53ce13a09..7476de779 100644 --- a/ush/check_expt_config_vars.sh +++ b/ush/check_expt_config_vars.sh @@ -63,7 +63,7 @@ function check_expt_config_vars() { # Note that a variable name will be found only if the equal sign immediately # follows the variable name. # - var_name=$( printf "%s" "${crnt_line}" | $SED -n -r -e "s/^([^ =\"]*)=.*/\1/p") + var_name=$( printf "%s" "${crnt_line}" | $SED -n -r -e "s/^([^ =\"]*)=.*/\1/p" ) if [ -z "${var_name}" ]; then diff --git a/ush/check_ruc_lsm.py b/ush/check_ruc_lsm.py new file mode 100644 index 000000000..fdf288f30 --- /dev/null +++ b/ush/check_ruc_lsm.py @@ -0,0 +1,30 @@ +#!/usr/bin/env python3 + +import os +import unittest + +from python_utils import set_env_var, print_input_args, \ + load_xml_file, has_tag_with_value + +def check_ruc_lsm(ccpp_phys_suite_fp): + """ This file defines a function that checks whether the RUC land surface + model (LSM) parameterization is being called by the selected physics suite. 
+ + Args: + ccpp_phys_suite_fp: full path to CCPP physics suite xml file + Returns: + Boolean + """ + + print_input_args(locals()) + + tree = load_xml_file(ccpp_phys_suite_fp) + has_ruc = has_tag_with_value(tree, "scheme", "lsm_ruc") + return has_ruc + +class Testing(unittest.TestCase): + def test_check_ruc_lsm(self): + self.assertTrue( check_ruc_lsm(ccpp_phys_suite_fp=f"test_data{os.sep}suite_FV3_GSD_SAR.xml") ) + def setUp(self): + set_env_var('DEBUG',True) + diff --git a/ush/config.community.sh b/ush/config.community.sh index 70433d597..ba8aa44da 100644 --- a/ush/config.community.sh +++ b/ush/config.community.sh @@ -14,7 +14,7 @@ QUILTING="TRUE" DO_ENSEMBLE="FALSE" NUM_ENS_MEMBERS="2" -CCPP_PHYS_SUITE="FV3_GFS_v15p2" +CCPP_PHYS_SUITE="FV3_GFS_v16" FCST_LEN_HRS="48" LBC_SPEC_INTVL_HRS="6" @@ -30,7 +30,7 @@ FV3GFS_FILE_FMT_LBCS="grib2" WTIME_RUN_FCST="01:00:00" -MODEL="FV3_GFS_v15p2_CONUS_25km" +MODEL="FV3_GFS_v16_CONUS_25km" METPLUS_PATH="path/to/METPlus" MET_INSTALL_DIR="path/to/MET" CCPA_OBS_DIR="/path/to/processed/CCPA/data" diff --git a/ush/config.nco.sh b/ush/config.nco.sh index 3b2c84be6..2e136c773 100644 --- a/ush/config.nco.sh +++ b/ush/config.nco.sh @@ -8,27 +8,34 @@ VERBOSE="TRUE" RUN_ENVIR="nco" PREEXISTING_DIR_METHOD="rename" -PREDEF_GRID_NAME="CONUS_25km_GFDLgrid" +PREDEF_GRID_NAME="RRFS_CONUS_25km" QUILTING="TRUE" -CCPP_PHYS_SUITE="FV3_GFS_v15p2" +CCPP_PHYS_SUITE="FV3_GFS_v16" -FCST_LEN_HRS="06" -LBC_SPEC_INTVL_HRS="6" +FCST_LEN_HRS="6" +LBC_SPEC_INTVL_HRS="3" -DATE_FIRST_CYCL="20190901" -DATE_LAST_CYCL="20190901" -CYCL_HRS=( "18" ) +DATE_FIRST_CYCL="20220407" +DATE_LAST_CYCL="20220407" +CYCL_HRS=( "00" ) EXTRN_MDL_NAME_ICS="FV3GFS" EXTRN_MDL_NAME_LBCS="FV3GFS" +FV3GFS_FILE_FMT_ICS="grib2" +FV3GFS_FILE_FMT_LBCS="grib2" + +WRITE_DOPOST="TRUE" + # # The following must be modified for different platforms and users. # -RUN="an_experiment" -COMINgfs="/scratch1/NCEPDEV/hwrf/noscrub/hafs-input/COMGFS" # Path to directory containing files from the external model (FV3GFS). +NET="rrfs" +model_ver="v1.0" +RUN="rrfs_test" +COMIN="/scratch1/NCEPDEV/rstprod/com/gfs/prod" # Path to directory containing files from the external model. FIXLAM_NCO_BASEDIR="/scratch2/BMC/det/FV3LAM_pregen" # Path to directory containing the pregenerated grid, orography, and surface climatology "fixed" files to use for the experiment. -STMP="/scratch2/BMC/det/Gerard.Ketefian/UFS_CAM/NCO_dirs/stmp" # Path to directory STMP that mostly contains input files. -PTMP="/scratch2/BMC/det/Gerard.Ketefian/UFS_CAM/NCO_dirs/ptmp" # Path to directory PTMP in which the experiment's output files will be placed. +STMP="/scratch2/NCEPDEV/fv3-cam/Chan-hoo.Jeon/01_OUT_DATA/stmp" # Path to directory STMP that mostly contains input files. +PTMP="/scratch2/NCEPDEV/fv3-cam/Chan-hoo.Jeon/01_OUT_DATA/ptmp" # Path to directory PTMP in which the experiment's output files will be placed. diff --git a/ush/config_defaults.sh b/ush/config_defaults.sh index 145d397ff..d304d203a 100644 --- a/ush/config_defaults.sh +++ b/ush/config_defaults.sh @@ -17,8 +17,8 @@ # # NCEP Central Operations # WCOSS Implementation Standards -# April 17, 2019 -# Version 10.2.0 +# January 19, 2022 +# Version 11.0.0 # # RUN_ENVIR is described in this document as follows: # @@ -73,12 +73,12 @@ RUN_ENVIR="nco" # Path to the LMOD sh file on your Linux system. Is set automatically # for supported machines. 
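For orientation, the NCO-mode sample settings above (NET, model_ver, RUN in config.nco.sh) feed the com directory layout $PTMP/com/$NET/$model_ver/$RUN.$yyyymmdd/$hh noted earlier for UPP output. A short sketch composing that path; PTMP is a placeholder, while the other values follow the sample config.nco.sh above:

```
# Compose the UPP output path under the revised NCO directory structure.
# PTMP is a placeholder; the other values come from the sample config.nco.sh.
NET="rrfs"
model_ver="v1.0"
RUN="rrfs_test"
PTMP="/path/to/ptmp"
yyyymmdd="20220407"
hh="00"
echo "${PTMP}/com/${NET}/${model_ver}/${RUN}.${yyyymmdd}/${hh}"
# -> /path/to/ptmp/com/rrfs/v1.0/rrfs_test.20220407/00
```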
# -# BUILD_ENV_FN: -# Name of alternative build environment file to use if using an +# BUILD_MOD_FN: +# Name of alternative build module file to use if using an # unsupported platform. Is set automatically for supported machines. # -# WFLOW_ENV_FN: -# Name of alternative workflow environment file to use if using an +# WFLOW_MOD_FN: +# Name of alternative workflow module file to use if using an # unsupported platform. Is set automatically for supported machines. # # SCHED: @@ -142,8 +142,8 @@ ACCOUNT="project_name" WORKFLOW_MANAGER="none" NCORES_PER_NODE="" LMOD_PATH="" -BUILD_ENV_FN="" -WFLOW_ENV_FN="" +BUILD_MOD_FN="" +WFLOW_MOD_FN="" SCHED="" PARTITION_DEFAULT="" QUEUE_DEFAULT="" @@ -232,31 +232,29 @@ EXEC_SUBDIR="bin" # Set variables that are only used in NCO mode (i.e. when RUN_ENVIR is # set to "nco"). Definitions: # -# COMINgfs: -# The beginning portion of the directory containing files generated by -# the external model (FV3GFS) that the initial and lateral boundary -# condition generation tasks need in order to create initial and boundary -# condition files for a given cycle on the native FV3-LAM grid. For a -# cycle that starts on the date specified by the variable yyyymmdd -# (consisting of the 4-digit year followed by the 2-digit month followed -# by the 2-digit day of the month) and hour specified by the variable hh -# (consisting of the 2-digit hour-of-day), the directory in which the -# workflow will look for the external model files is: +# COMIN: +# Directory containing files generated by the external model (FV3GFS, NAM, +# HRRR, etc) that the initial and lateral boundary condition generation tasks +# need in order to create initial and boundary condition files for a given +# cycle on the native FV3-LAM grid. # -# $COMINgfs/gfs.$yyyymmdd/$hh/atmos +# envir, NET, model_ver, RUN: +# Standard environment variables defined in the NCEP Central Operations WCOSS +# Implementation Standards document as follows: # -# FIXLAM_NCO_BASEDIR: -# The base directory containing pregenerated grid, orography, and surface -# climatology files. For the pregenerated grid specified by PREDEF_GRID_NAME, -# these "fixed" files are located in: +# envir: +# Set to "test" during the initial testing phase, "para" when running +# in parallel (on a schedule), and "prod" in production. # -# ${FIXLAM_NCO_BASEDIR}/${PREDEF_GRID_NAME} +# NET: +# Model name (first level of com directory structure) # -# The workflow scripts will create a symlink in the experiment directory -# that will point to a subdirectory (having the name of the grid being -# used) under this directory. This variable should be set to a null -# string in this file, but it can be specified in the user-specified -# workflow configuration file (EXPT_CONFIG_FN). +# model_ver: +# Version number of package in three digits (second level of com directory) +# +# RUN: +# Name of model run (third level of com directory structure). +# In general, same as $NET # # STMP: # The beginning portion of the directory that will contain cycle-dependent @@ -269,22 +267,6 @@ EXEC_SUBDIR="bin" # # $STMP/tmpnwprd/$RUN/$cdate # -# NET, envir, RUN: -# Variables used in forming the path to the directory that will contain -# the output files from the post-processor (UPP) for a given cycle (see -# definition of PTMP below). 
These are defined in the WCOSS Implementation -# Standards document as follows: -# -# NET: -# Model name (first level of com directory structure) -# -# envir: -# Set to "test" during the initial testing phase, "para" when running -# in parallel (on a schedule), and "prod" in production. -# -# RUN: -# Name of model run (third level of com directory structure). -# # PTMP: # The beginning portion of the directory that will contain the output # files from the post-processor (UPP) for a given cycle. For a cycle @@ -292,16 +274,17 @@ EXEC_SUBDIR="bin" # (where yyyymmdd and hh are as described above), the directory in which # the UPP output files will be placed will be: # -# $PTMP/com/$NET/$envir/$RUN.$yyyymmdd/$hh +# $PTMP/com/$NET/$model_ver/$RUN.$yyyymmdd/$hh # #----------------------------------------------------------------------- # -COMINgfs="/base/path/of/directory/containing/gfs/input/files" -FIXLAM_NCO_BASEDIR="" +COMIN="/path/of/directory/containing/data/files/for/IC/LBCS" STMP="/base/path/of/directory/containing/model/input/and/raw/output/files" -NET="rrfs" envir="para" -RUN="experiment_name" +NET="rrfs" +model_ver="v1.0.0" +RUN="rrfs" +STMP="/base/path/of/directory/containing/model/input/and/raw/output/files" PTMP="/base/path/of/directory/containing/postprocessed/output/files" # #----------------------------------------------------------------------- @@ -428,6 +411,18 @@ WFLOW_LAUNCH_LOG_FN="log.launch_FV3LAM_wflow" # #----------------------------------------------------------------------- # +# Set output file name. Definitions: +# +# POST_OUTPUT_DOMAIN_NAME: +# Domain name used in naming the output files of run_post by UPP or inline post. +# Output file name: $NET.tHHz.[var_name].f###.$POST_OUTPUT_DOMAIN_NAME.grib2 +# +#----------------------------------------------------------------------- +# +POST_OUTPUT_DOMAIN_NAME="" +# +#----------------------------------------------------------------------- +# # Set forecast parameters. Definitions: # # DATE_FIRST_CYCL: @@ -715,6 +710,13 @@ EXTRN_MDL_SYSBASEDIR_LBCS='' # EXTRN_MDL_FILES_LBCS: # Analogous to EXTRN_MDL_FILES_ICS but for LBCs instead of ICs. # +# EXTRN_MDL_DATA_STORES: +# A list of data stores where the scripts should look for external model +# data. The list is in priority order. If disk information is provided +# via USE_USER_STAGED_EXTRN_FILES or a known location on the platform, +# the disk location will be highest priority. Options are disk, hpss, +# aws, and nomads. +# #----------------------------------------------------------------------- # USE_USER_STAGED_EXTRN_FILES="FALSE" @@ -722,6 +724,7 @@ EXTRN_MDL_SOURCE_BASEDIR_ICS="" EXTRN_MDL_FILES_ICS="" EXTRN_MDL_SOURCE_BASEDIR_LBCS="" EXTRN_MDL_FILES_LBCS="" +EXTRN_MDL_DATA_STORES="" # #----------------------------------------------------------------------- # @@ -751,7 +754,7 @@ NOMADS_file_type="nemsio" # #----------------------------------------------------------------------- # -CCPP_PHYS_SUITE="FV3_GFS_v15p2" +CCPP_PHYS_SUITE="FV3_GFS_v16" # #----------------------------------------------------------------------- # @@ -1309,6 +1312,22 @@ VX_ENSPOINT_PROB_TN="run_enspointvx_prob" # SFC_CLIMO_DIR: # Same as GRID_DIR but for the MAKE_SFC_CLIMO_TN task. # +# DOMAIN_PREGEN_BASEDIR: +# The base directory containing pregenerated grid, orography, and surface +# climatology files. 
This is an alternative for setting GRID_DIR, +# OROG_DIR, and SFC_CLIMO_DIR individually +# +# For the pregenerated grid specified by PREDEF_GRID_NAME, +# these "fixed" files are located in: +# +# ${DOMAIN_PREGEN_BASEDIR}/${PREDEF_GRID_NAME} +# +# The workflow scripts will create a symlink in the experiment directory +# that will point to a subdirectory (having the name of the grid being +# used) under this directory. This variable should be set to a null +# string in this file, but it can be specified in the user-specified +# workflow configuration file (EXPT_CONFIG_FN). +# # RUN_TASK_GET_EXTRN_ICS: # Flag that determines whether the GET_EXTRN_ICS_TN task is to be run. # @@ -1355,6 +1374,8 @@ OROG_DIR="/path/to/pregenerated/orog/files" RUN_TASK_MAKE_SFC_CLIMO="TRUE" SFC_CLIMO_DIR="/path/to/pregenerated/surface/climo/files" +DOMAIN_PREGEN_BASEDIR="" + RUN_TASK_GET_EXTRN_ICS="TRUE" RUN_TASK_GET_EXTRN_LBCS="TRUE" RUN_TASK_MAKE_ICS="TRUE" @@ -1691,6 +1712,14 @@ MAXTRIES_VX_ENSGRID_PROB_RETOP="1" MAXTRIES_VX_ENSPOINT="1" MAXTRIES_VX_ENSPOINT_MEAN="1" MAXTRIES_VX_ENSPOINT_PROB="1" + +# +#----------------------------------------------------------------------- +# +# Allows an extra parameter to be passed to slurm via XML Native +# command +# +SLURM_NATIVE_CMD="" # #----------------------------------------------------------------------- # @@ -1849,7 +1878,7 @@ SPP_TSCALE=( "21600.0" "21600.0" "21600.0" "21600.0" "21600.0" ) #Variable "spp_ SPP_SIGTOP1=( "0.1" "0.1" "0.1" "0.1" "0.1") SPP_SIGTOP2=( "0.025" "0.025" "0.025" "0.025" "0.025" ) SPP_STDDEV_CUTOFF=( "1.5" "1.5" "2.5" "1.5" "1.5" ) -ISEED_SPP=( "4" "4" "4" "4" "4" ) +ISEED_SPP=( "4" "5" "6" "7" "8" ) # #----------------------------------------------------------------------- # diff --git a/ush/config_defaults.yaml b/ush/config_defaults.yaml new file mode 100644 index 000000000..56b89bb11 --- /dev/null +++ b/ush/config_defaults.yaml @@ -0,0 +1,2022 @@ +# +#----------------------------------------------------------------------- +# +# This file sets the experiment's configuration variables (which are +# global shell variables) to their default values. For many of these +# variables, the valid values that they may take on are defined in the +# file $USHDIR/valid_param_vals.sh. +# +#----------------------------------------------------------------------- +# + +# +#----------------------------------------------------------------------- +# +# Set the RUN_ENVIR variable that is listed and described in the WCOSS +# Implementation Standards document: +# +# NCEP Central Operations +# WCOSS Implementation Standards +# April 19, 2022 +# Version 11.0.0 +# +# RUN_ENVIR is described in this document as follows: +# +# Set to "nco" if running in NCO's production environment. Used to +# distinguish between organizations. +# +# Valid values are "nco" and "community". Here, we use it to generate +# and run the experiment either in NCO mode (if RUN_ENVIR is set to "nco") +# or in community mode (if RUN_ENVIR is set to "community"). This has +# implications on the experiment variables that need to be set and the +# the directory structure used. +# +#----------------------------------------------------------------------- +# +RUN_ENVIR: "nco" +# +#----------------------------------------------------------------------- +# +# mach_doc_start +# Set machine and queue parameters. Definitions: +# +# MACHINE: +# Machine on which the workflow will run. 
If you are NOT on a named, +# supported platform, and you want to use the Rocoto workflow manager, +# you will need set MACHINE: "linux" and WORKFLOW_MANAGER: "rocoto". This +# combination will assume a Slurm batch manager when generating the XML. +# Please see ush/valid_param_vals.sh for a full list of supported +# platforms. +# +# MACHINE_FILE: +# Path to a configuration file with machine-specific settings. If none +# is provided, setup.sh will attempt to set the path to for a supported +# platform. +# +# ACCOUNT: +# The account under which to submit jobs to the queue. +# +# WORKFLOW_MANAGER: +# The workflow manager to use (e.g. rocoto). This is set to "none" by +# default, but if the machine name is set to a platform that supports +# rocoto, this will be overwritten and set to "rocoto". If set +# explicitly to rocoto along with the use of the MACHINE=linux target, +# the configuration layer assumes a Slurm batch manager when generating +# the XML. Valid options: "rocoto" or "none" +# +# NCORES_PER_NODE: +# The number of cores available per node on the compute platform. Set +# for supported platforms in setup.sh, but is now also configurable for +# all platforms. +# +# LMOD_PATH: +# Path to the LMOD sh file on your Linux system. Is set automatically +# for supported machines. +# +# BUILD_MOD_FN: +# Name of alternative build module file to use if using an +# unsupported platform. Is set automatically for supported machines. +# +# WFLOW_MOD_FN: +# Name of alternative workflow module file to use if using an +# unsupported platform. Is set automatically for supported machines. +# +# SCHED: +# The job scheduler to use (e.g. slurm). Set this to an empty string in +# order for the experiment generation script to set it depending on the +# machine. +# +# PARTITION_DEFAULT: +# If using the slurm job scheduler (i.e. if SCHED is set to "slurm"), +# the default partition to which to submit workflow tasks. If a task +# does not have a specific variable that specifies the partition to which +# it will be submitted (e.g. PARTITION_HPSS, PARTITION_FCST; see below), +# it will be submitted to the partition specified by this variable. If +# this is not set or is set to an empty string, it will be (re)set to a +# machine-dependent value. This is not used if SCHED is not set to +# "slurm". +# +# QUEUE_DEFAULT: +# The default queue or QOS (if using the slurm job scheduler, where QOS +# is Quality of Service) to which workflow tasks are submitted. If a +# task does not have a specific variable that specifies the queue to which +# it will be submitted (e.g. QUEUE_HPSS, QUEUE_FCST; see below), it will +# be submitted to the queue specified by this variable. If this is not +# set or is set to an empty string, it will be (re)set to a machine- +# dependent value. +# +# PARTITION_HPSS: +# If using the slurm job scheduler (i.e. if SCHED is set to "slurm"), +# the partition to which the tasks that get or create links to external +# model files [which are needed to generate initial conditions (ICs) and +# lateral boundary conditions (LBCs)] are submitted. If this is not set +# or is set to an empty string, it will be (re)set to a machine-dependent +# value. This is not used if SCHED is not set to "slurm". +# +# QUEUE_HPSS: +# The queue or QOS to which the tasks that get or create links to external +# model files [which are needed to generate initial conditions (ICs) and +# lateral boundary conditions (LBCs)] are submitted. 
If this is not set +# or is set to an empty string, it will be (re)set to a machine-dependent +# value. +# +# PARTITION_FCST: +# If using the slurm job scheduler (i.e. if SCHED is set to "slurm"), +# the partition to which the task that runs forecasts is submitted. If +# this is not set or set to an empty string, it will be (re)set to a +# machine-dependent value. This is not used if SCHED is not set to +# "slurm". +# +# QUEUE_FCST: +# The queue or QOS to which the task that runs a forecast is submitted. +# If this is not set or set to an empty string, it will be (re)set to a +# machine-dependent value. +# +# mach_doc_end +# +#----------------------------------------------------------------------- +# +MACHINE: "BIG_COMPUTER" +MACHINE_FILE: "" +ACCOUNT: "project_name" +WORKFLOW_MANAGER: "none" +NCORES_PER_NODE: "" +LMOD_PATH: "" +BUILD_MOD_FN: "" +WFLOW_MOD_FN: "" +SCHED: "" +PARTITION_DEFAULT: "" +QUEUE_DEFAULT: "" +PARTITION_HPSS: "" +QUEUE_HPSS: "" +PARTITION_FCST: "" +QUEUE_FCST: "" +# +#----------------------------------------------------------------------- +# +# Set run commands for platforms without a workflow manager. These values +# will be ignored unless WORKFLOW_MANAGER: "none". Definitions: +# +# RUN_CMD_UTILS: +# The run command for pre-processing utilities (shave, orog, sfc_climo_gen, +# etc.) Can be left blank for smaller domains, in which case the executables +# will run without MPI. +# +# RUN_CMD_FCST: +# The run command for the model forecast step. This will be appended to +# the end of the variable definitions file, so it can reference other +# variables. +# +# RUN_CMD_POST: +# The run command for post-processing (UPP). Can be left blank for smaller +# domains, in which case UPP will run without MPI. +# +#----------------------------------------------------------------------- +# +RUN_CMD_UTILS: "mpirun -np 1" +RUN_CMD_FCST: "mpirun -np ${PE_MEMBER01}" +RUN_CMD_POST: "mpirun -np 1" +# +#----------------------------------------------------------------------- +# +# Set cron-associated parameters. Definitions: +# +# USE_CRON_TO_RELAUNCH: +# Flag that determines whether or not to add a line to the user's cron +# table to call the experiment launch script every CRON_RELAUNCH_INTVL_MNTS +# minutes. +# +# CRON_RELAUNCH_INTVL_MNTS: +# The interval (in minutes) between successive calls of the experiment +# launch script by a cron job to (re)launch the experiment (so that the +# workflow for the experiment kicks off where it left off). +# +#----------------------------------------------------------------------- +# +USE_CRON_TO_RELAUNCH: "FALSE" +CRON_RELAUNCH_INTVL_MNTS: "03" +# +#----------------------------------------------------------------------- +# +# dir_doc_start +# Set directories. Definitions: +# +# EXPT_BASEDIR: +# The base directory in which the experiment directory will be created. +# If this is not specified or if it is set to an empty string, it will +# default to ${HOMErrfs}/../expt_dirs. +# +# EXPT_SUBDIR: +# The name that the experiment directory (without the full path) will +# have. The full path to the experiment directory, which will be contained +# in the variable EXPTDIR, will be: +# +# EXPTDIR: "${EXPT_BASEDIR}/${EXPT_SUBDIR}" +# +# This cannot be empty. If set to a null string here, it must be set to +# a (non-empty) value in the user-defined experiment configuration file. +# +# dir_doc_end +# +# EXEC_SUBDIR: +# The name of the subdirectory of ufs-srweather-app where executables are +# installed. 
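As a small illustration of the directory settings documented above, the experiment directory is EXPT_BASEDIR joined with EXPT_SUBDIR, and EXPT_BASEDIR falls back to ${HOMErrfs}/../expt_dirs when left empty. A sketch with placeholder values:

```
# Sketch of how EXPTDIR resolves; HOMErrfs and EXPT_SUBDIR are placeholders.
HOMErrfs="/path/to/regional_workflow"
EXPT_BASEDIR="${EXPT_BASEDIR:-${HOMErrfs}/../expt_dirs}"
EXPT_SUBDIR="my_expt"
EXPTDIR="${EXPT_BASEDIR}/${EXPT_SUBDIR}"
echo "${EXPTDIR}"  # -> /path/to/regional_workflow/../expt_dirs/my_expt
```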
+#----------------------------------------------------------------------- +# +EXPT_BASEDIR: "" +EXPT_SUBDIR: "" +EXEC_SUBDIR: "bin" +# +#----------------------------------------------------------------------- +# +# Set variables that are only used in NCO mode (i.e. when RUN_ENVIR is +# set to "nco"). Definitions: +# +# COMIN: +# Directory containing files generated by the external model (FV3GFS, NAM, +# HRRR, etc) that the initial and lateral boundary condition generation tasks +# need in order to create initial and boundary condition files for a given +# cycle on the native FV3-LAM grid. +# +# envir, NET, model_ver, RUN: +# Standard environment variables defined in the NCEP Central Operations WCOSS +# Implementation Standards document as follows: +# +# envir: +# Set to "test" during the initial testing phase, "para" when running +# in parallel (on a schedule), and "prod" in production. +# +# NET: +# Model name (first level of com directory structure) +# +# model_ver: +# Version number of package in three digits (second level of com directory) +# +# RUN: +# Name of model run (third level of com directory structure). +# In general, same as $NET +# +# STMP: +# The beginning portion of the directory that will contain cycle-dependent +# model input files, symlinks to cycle-independent input files, and raw +# (i.e. before post-processing) forecast output files for a given cycle. +# For a cycle that starts on the date specified by yyyymmdd and hour +# specified by hh (where yyyymmdd and hh are as described above) [so that +# the cycle date (cdate) is given by cdate: "${yyyymmdd}${hh}"], the +# directory in which the aforementioned files will be located is: +# +# $STMP/tmpnwprd/$RUN/$cdate +# +# PTMP: +# The beginning portion of the directory that will contain the output +# files from the post-processor (UPP) for a given cycle. For a cycle +# that starts on the date specified by yyyymmdd and hour specified by hh +# (where yyyymmdd and hh are as described above), the directory in which +# the UPP output files will be placed will be: +# +# $PTMP/com/$NET/$model_ver/$RUN.$yyyymmdd/$hh +# +#----------------------------------------------------------------------- +# +COMIN: "/path/of/directory/containing/data/files/for/IC/LBCS" +envir: "para" +NET: "rrfs" +model_ver: "v1.0.0" +RUN: "rrfs" +STMP: "/base/path/of/directory/containing/model/input/and/raw/output/files" +PTMP: "/base/path/of/directory/containing/postprocessed/output/files" +# +#----------------------------------------------------------------------- +# +# Set the separator character(s) to use in the names of the grid, mosaic, +# and orography fixed files. +# +# Ideally, the same separator should be used in the names of these fixed +# files as the surface climatology fixed files (which always use a "." +# as the separator), i.e. ideally, DOT_OR_USCORE should be set to "." +# +#----------------------------------------------------------------------- +# +DOT_OR_USCORE: "_" +# +#----------------------------------------------------------------------- +# +# Set file names. Definitions: +# +# EXPT_CONFIG_FN: +# Name of the user-specified configuration file for the forecast experiment. +# +# RGNL_GRID_NML_FN: +# Name of file containing the namelist settings for the code that generates +# a "ESGgrid" type of regional grid. +# +# FV3_NML_BASE_SUITE_FN: +# Name of Fortran namelist file containing the forecast model's base suite +# namelist, i.e. the portion of the namelist that is common to all physics +# suites. 
+#
+# FV3_NML_YAML_CONFIG_FN:
+# Name of YAML configuration file containing the forecast model's namelist
+# settings for various physics suites.
+#
+# FV3_NML_BASE_ENS_FN:
+# Name of Fortran namelist file containing the forecast model's base
+# ensemble namelist, i.e. the namelist file that is the starting point
+# from which the namelist files for each of the ensemble members are
+# generated.
+#
+# FV3_EXEC_FN:
+# Name to use for the forecast model executable when it is copied from
+# the directory in which it is created in the build step to the executables
+# directory (EXECDIR; this is set during experiment generation).
+#
+# DIAG_TABLE_TMPL_FN:
+# Name of a template file that specifies the output fields of the forecast
+# model (ufs-weather-model: diag_table) followed by [dot_ccpp_phys_suite]
+# (see the example below). Its default value is the name of the file that
+# the ufs weather model expects to read in.
+#
+# FIELD_TABLE_TMPL_FN:
+# Name of a template file that specifies the tracers in IC/LBC files of the
+# forecast model (ufs-weather-model: field_table) followed by [dot_ccpp_phys_suite].
+# Its default value is the name of the file that the ufs weather model expects
+# to read in.
+#
+# MODEL_CONFIG_TMPL_FN:
+# Name of a template file that contains settings and configurations for the
+# NUOPC/ESMF main component (ufs-weather-model: model_config). Its default
+# value is the name of the file that the ufs weather model expects to read in.
+#
+# NEMS_CONFIG_TMPL_FN:
+# Name of a template file that contains information about the various NEMS
+# components and their run sequence (ufs-weather-model: nems.configure).
+# Its default value is the name of the file that the ufs weather model expects
+# to read in.
+#
+# FCST_MODEL:
+# Name of forecast model (default=ufs-weather-model)
+#
+# WFLOW_XML_FN:
+# Name of the rocoto workflow XML file that the experiment generation
+# script creates and that defines the workflow for the experiment.
+#
+# GLOBAL_VAR_DEFNS_FN:
+# Name of file (a shell script) containing the definitions of the primary
+# experiment variables (parameters) defined in this default configuration
+# script and in the user-specified configuration as well as secondary
+# experiment variables generated by the experiment generation script.
+# This file is sourced by many scripts (e.g. the J-job scripts corresponding
+# to each workflow task) in order to make all the experiment variables
+# available in those scripts.
+#
+# EXTRN_MDL_VAR_DEFNS_FN:
+# Name of file (a shell script) containing the definitions of variables
+# associated with the external model from which ICs or LBCs are generated. This
+# file is created by the GET_EXTRN_*_TN task because the values of the variables
+# it contains are not known before this task runs. The file is then sourced by
+# the MAKE_ICS_TN and MAKE_LBCS_TN tasks.
+#
+# WFLOW_LAUNCH_SCRIPT_FN:
+# Name of the script that can be used to (re)launch the experiment's rocoto
+# workflow.
+#
+# WFLOW_LAUNCH_LOG_FN:
+# Name of the log file that contains the output from successive calls to
+# the workflow launch script (WFLOW_LAUNCH_SCRIPT_FN).
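+#
+# As an illustrative example (a sketch of the naming convention described
+# above, not an exhaustive description of the file naming logic), if
+# DIAG_TABLE_TMPL_FN and FIELD_TABLE_TMPL_FN are left at their default
+# (empty) values and CCPP_PHYS_SUITE is set to "FV3_GFS_v16", the template
+# files used for the experiment are expected to carry the physics suite as
+# a suffix, i.e. names of the form:
+#
+#   diag_table.FV3_GFS_v16
+#   field_table.FV3_GFS_v16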
+#
+#-----------------------------------------------------------------------
+#
+EXPT_CONFIG_FN: "config.sh"
+
+RGNL_GRID_NML_FN: "regional_grid.nml"
+
+FV3_NML_BASE_SUITE_FN: "input.nml.FV3"
+FV3_NML_YAML_CONFIG_FN: "FV3.input.yml"
+FV3_NML_BASE_ENS_FN: "input.nml.base_ens"
+FV3_EXEC_FN: "ufs_model"
+
+DATA_TABLE_TMPL_FN: ""
+DIAG_TABLE_TMPL_FN: ""
+FIELD_TABLE_TMPL_FN: ""
+MODEL_CONFIG_TMPL_FN: ""
+NEMS_CONFIG_TMPL_FN: ""
+
+FCST_MODEL: "ufs-weather-model"
+WFLOW_XML_FN: "FV3LAM_wflow.xml"
+GLOBAL_VAR_DEFNS_FN: "var_defns.sh"
+EXTRN_MDL_VAR_DEFNS_FN: "extrn_mdl_var_defns.sh"
+WFLOW_LAUNCH_SCRIPT_FN: "launch_FV3LAM_wflow.sh"
+WFLOW_LAUNCH_LOG_FN: "log.launch_FV3LAM_wflow"
+#
+#-----------------------------------------------------------------------
+#
+# Set output file name. Definitions:
+#
+# POST_OUTPUT_DOMAIN_NAME:
+# Domain name used in naming the output files of run_post by UPP or inline post.
+# Output file name: $NET.tHHz.[var_name].f###.$POST_OUTPUT_DOMAIN_NAME.grib2
+#
+#-----------------------------------------------------------------------
+#
+POST_OUTPUT_DOMAIN_NAME: ""
+#
+#-----------------------------------------------------------------------
+#
+# Set forecast parameters. Definitions:
+#
+# DATE_FIRST_CYCL:
+# Starting date of the first forecast in the set of forecasts to run.
+# Format is "YYYYMMDD". Note that this does not include the hour-of-day.
+#
+# DATE_LAST_CYCL:
+# Starting date of the last forecast in the set of forecasts to run.
+# Format is "YYYYMMDD". Note that this does not include the hour-of-day.
+#
+# CYCL_HRS:
+# An array containing the hours of the day at which to launch forecasts.
+# Forecasts are launched at these hours on each day from DATE_FIRST_CYCL
+# to DATE_LAST_CYCL, inclusive. Each element of this array must be a
+# two-digit string representing an integer that is less than or equal to
+# 23, e.g. "00", "03", "12", "23".
+#
+# INCR_CYCL_FREQ:
+# Increment in hours for the cycle frequency (cycl_freq).
+# Default is 24, which means cycle_freq=24:00:00
+#
+# FCST_LEN_HRS:
+# The length of each forecast, in integer hours.
+#
+#-----------------------------------------------------------------------
+#
+DATE_FIRST_CYCL: "YYYYMMDD"
+DATE_LAST_CYCL: "YYYYMMDD"
+CYCL_HRS: [ "HH1", "HH2" ]
+INCR_CYCL_FREQ: "24"
+FCST_LEN_HRS: "24"
+#
+#-----------------------------------------------------------------------
+#
+# Set model_configure parameters. Definitions:
+#
+# DT_ATMOS:
+# The main forecast model integration time step. As described in the
+# forecast model documentation, "It corresponds to the frequency with
+# which the top level routine in the dynamics is called as well as the
+# frequency with which the physics is called."
+#
+# CPL: parameter for coupling
+# (set automatically based on FCST_MODEL in ush/setup.sh)
+# (ufs-weather-model:FALSE, fv3gfs_aqm:TRUE)
+#
+# RESTART_INTERVAL:
+# Frequency of the output restart files (unit: hour).
+# Default=0: restart files are produced at the end of a forecast run.
+# For example, RESTART_INTERVAL: "1" means restart files are produced
+# every hour with the prefix "YYYYMMDD.HHmmSS." in the RESTART directory.
+#
+# WRITE_DOPOST:
+# Flag that determines whether or not to use the inline post option.
+# When TRUE, the separate post-processing task is turned off
+# (RUN_TASK_RUN_POST=FALSE) in setup.sh.
+#
+#-----------------------------------------------------------------------
+#
+DT_ATMOS: ""
+RESTART_INTERVAL: "0"
+WRITE_DOPOST: "FALSE"
+#
+#-----------------------------------------------------------------------
+#
+# Set METplus parameters.
Definitions:
+#
+# MODEL:
+# String that specifies a descriptive name for the model being verified.
+#
+# MET_INSTALL_DIR:
+# Location of the top-level directory of the MET installation.
+#
+# METPLUS_PATH:
+# Location of the top-level directory of the METplus installation.
+#
+# CCPA_OBS_DIR:
+# User-specified location of the top-level directory where CCPA hourly
+# precipitation files used by METplus are located. This parameter needs
+# to be set for both user-provided observations and for observations
+# that are retrieved from the NOAA HPSS (if the user has access) via
+# the get_obs_ccpa_tn task (activated in the workflow by setting
+# RUN_TASK_GET_OBS_CCPA="TRUE"). In the case of pulling observations
+# directly from NOAA HPSS, the data retrieved will be placed in this
+# directory. Please note, this path must be defined as
+# /full-path-to-obs/ccpa/proc. METplus is configured to verify 01-,
+# 03-, 06-, and 24-h accumulated precipitation using hourly CCPA files.
+# METplus configuration files require the use of a predetermined directory
+# structure and file names. Therefore, if the CCPA files are user
+# provided, they need to follow the anticipated naming structure:
+# {YYYYMMDD}/ccpa.t{HH}z.01h.hrap.conus.gb2, where YYYY is the 4-digit
+# valid year, MM the 2-digit valid month, DD the 2-digit valid day of
+# the month, and HH the 2-digit valid hour of the day. In addition, a
+# caveat is noted for using hourly CCPA data. There is a problem with
+# the valid time in the metadata for files valid from 19 - 00 UTC (or
+# files under the '00' directory). The script to pull the CCPA data
+# from the NOAA HPSS has an example of how to account for this as well
+# as organizing the data into a more intuitive format:
+# regional_workflow/scripts/exregional_get_ccpa_files.sh. When a fix
+# is provided, it will be accounted for in the
+# exregional_get_ccpa_files.sh script.
+#
+# MRMS_OBS_DIR:
+# User-specified location of the top-level directory where MRMS composite
+# reflectivity files used by METplus are located. This parameter needs
+# to be set for both user-provided observations and for observations
+# that are retrieved from the NOAA HPSS (if the user has access) via the
+# get_obs_mrms_tn task (activated in the workflow by setting
+# RUN_TASK_GET_OBS_MRMS="TRUE"). In the case of pulling observations
+# directly from NOAA HPSS, the data retrieved will be placed in this
+# directory. Please note, this path must be defined as
+# /full-path-to-obs/mrms/proc. METplus configuration files require the
+# use of a predetermined directory structure and file names. Therefore, if
+# the MRMS files are user provided, they need to follow the anticipated
+# naming structure:
+# {YYYYMMDD}/MergedReflectivityQCComposite_00.50_{YYYYMMDD}-{HH}{mm}{SS}.grib2,
+# where YYYY is the 4-digit valid year, MM the 2-digit valid month, DD
+# the 2-digit valid day of the month, HH the 2-digit valid hour of the
+# day, mm the 2-digit valid minutes of the hour, and SS is the two-digit
+# valid seconds of the hour. In addition, METplus is configured to look
+# for an MRMS composite reflectivity file for the valid time of the
+# forecast being verified; since MRMS composite reflectivity files do
+# not always exactly match the valid time, a script, within the main
+# script to retrieve MRMS data from the NOAA HPSS, is used to identify
+# and rename the MRMS composite reflectivity file to match the valid
+# time of the forecast.
The script to pull the MRMS data from the NOAA
+# HPSS has an example of the expected file naming structure:
+# regional_workflow/scripts/exregional_get_mrms_files.sh. This script
+# calls the script used to identify the MRMS file closest to the valid
+# time: regional_workflow/ush/mrms_pull_topofhour.py.
+#
+# NDAS_OBS_DIR:
+# User-specified location of the top-level directory where NDAS prepbufr
+# files used by METplus are located. This parameter needs to be set for
+# both user-provided observations and for observations that are
+# retrieved from the NOAA HPSS (if the user has access) via the
+# get_obs_ndas_tn task (activated in the workflow by setting
+# RUN_TASK_GET_OBS_NDAS="TRUE"). In the case of pulling observations
+# directly from NOAA HPSS, the data retrieved will be placed in this
+# directory. Please note, this path must be defined as
+# /full-path-to-obs/ndas/proc. METplus is configured to verify
+# near-surface variables hourly and upper-air variables at times valid
+# at 00 and 12 UTC with NDAS prepbufr files. METplus configuration files
+# require the use of predetermined file names. Therefore, if the NDAS
+# files are user provided, they need to follow the anticipated naming
+# structure: prepbufr.ndas.{YYYYMMDDHH}, where YYYY is the 4-digit valid
+# year, MM the 2-digit valid month, DD the 2-digit valid day of the
+# month, and HH the 2-digit valid hour of the day. The script to pull
+# the NDAS data from the NOAA HPSS has an example of how to rename the
+# NDAS data into a more intuitive format with the valid time listed in
+# the file name: regional_workflow/scripts/exregional_get_ndas_files.sh
+#
+#-----------------------------------------------------------------------
+#
+MODEL: ""
+MET_INSTALL_DIR: ""
+MET_BIN_EXEC: "bin"
+METPLUS_PATH: ""
+CCPA_OBS_DIR: ""
+MRMS_OBS_DIR: ""
+NDAS_OBS_DIR: ""
+#
+#-----------------------------------------------------------------------
+#
+# Set initial and lateral boundary condition generation parameters.
+# Definitions:
+#
+# EXTRN_MDL_NAME_ICS:
+# The name of the external model that will provide fields from which
+# initial condition (including surface) files will be generated for
+# input into the forecast model.
+#
+# EXTRN_MDL_NAME_LBCS:
+# The name of the external model that will provide fields from which
+# lateral boundary condition (LBC) files will be generated for input into
+# the forecast model.
+#
+# LBC_SPEC_INTVL_HRS:
+# The interval (in integer hours) with which LBC files will be generated.
+# We will refer to this as the boundary update interval. Note that the
+# model specified in EXTRN_MDL_NAME_LBCS must have data available at a
+# frequency greater than or equal to that implied by LBC_SPEC_INTVL_HRS.
+# For example, if LBC_SPEC_INTVL_HRS is set to 6, then the model must have
+# data available at least every 6 hours. It is up to the user to ensure
+# that this is the case.
+#
+# EXTRN_MDL_ICS_OFFSET_HRS:
+# Users may wish to start a forecast from a forecast of a previous cycle
+# of an external model. This variable sets the number of hours earlier
+# the external model started than when the FV3 forecast configured here
+# should start. For example, if the forecast should start from a 6-hour
+# forecast of the GFS, then EXTRN_MDL_ICS_OFFSET_HRS=6.
+#
+# EXTRN_MDL_LBCS_OFFSET_HRS:
+# Users may wish to use lateral boundary conditions from a forecast that
+# was started earlier than the initial time for the FV3 forecast
+# configured here.
This variable sets the number of hours earlier
+# the external model started than when the FV3 forecast configured here
+# should start. For example, if the forecast should use lateral boundary
+# conditions from a GFS forecast started 6 hours earlier, then
+# EXTRN_MDL_LBCS_OFFSET_HRS=6.
+# Note: the default value is model-dependent and set in
+# set_extrn_mdl_params.sh
+#
+# FV3GFS_FILE_FMT_ICS:
+# If using the FV3GFS model as the source of the ICs (i.e. if EXTRN_MDL_NAME_ICS
+# is set to "FV3GFS"), this variable specifies the format of the model
+# files to use when generating the ICs.
+#
+# FV3GFS_FILE_FMT_LBCS:
+# If using the FV3GFS model as the source of the LBCs (i.e. if
+# EXTRN_MDL_NAME_LBCS is set to "FV3GFS"), this variable specifies the
+# format of the model files to use when generating the LBCs.
+#
+#-----------------------------------------------------------------------
+#
+EXTRN_MDL_NAME_ICS: "FV3GFS"
+EXTRN_MDL_NAME_LBCS: "FV3GFS"
+LBC_SPEC_INTVL_HRS: "6"
+EXTRN_MDL_ICS_OFFSET_HRS: "0"
+EXTRN_MDL_LBCS_OFFSET_HRS: ""
+FV3GFS_FILE_FMT_ICS: "nemsio"
+FV3GFS_FILE_FMT_LBCS: "nemsio"
+#
+#-----------------------------------------------------------------------
+#
+# Base directories in which to search for external model files.
+#
+# EXTRN_MDL_SYSBASEDIR_ICS:
+# Base directory on the local machine containing external model files for
+# generating ICs on the native grid. The way the full path containing
+# these files is constructed depends on the user-specified external model
+# for ICs, i.e. EXTRN_MDL_NAME_ICS.
+#
+# EXTRN_MDL_SYSBASEDIR_LBCS:
+# Same as EXTRN_MDL_SYSBASEDIR_ICS but for LBCs.
+#
+# Note that these must be defined as null strings here so that if they
+# are specified by the user in the experiment configuration file, they
+# remain set to those values, and if not, they get set to machine-dependent
+# values.
+#
+#-----------------------------------------------------------------------
+#
+EXTRN_MDL_SYSBASEDIR_ICS: ''
+EXTRN_MDL_SYSBASEDIR_LBCS: ''
+#
+#-----------------------------------------------------------------------
+#
+# User-staged external model directories and files. Definitions:
+#
+# USE_USER_STAGED_EXTRN_FILES:
+# Flag that determines whether or not the workflow will look for the
+# external model files needed for generating ICs and LBCs in user-specified
+# directories.
+#
+# EXTRN_MDL_SOURCE_BASEDIR_ICS:
+# Directory in which to look for external model files for generating ICs.
+# If USE_USER_STAGED_EXTRN_FILES is set to "TRUE", the workflow looks in
+# this directory (specifically, in a subdirectory under this directory
+# named "YYYYMMDDHH" consisting of the starting date and cycle hour of
+# the forecast, where YYYY is the 4-digit year, MM the 2-digit month, DD
+# the 2-digit day of the month, and HH the 2-digit hour of the day) for
+# the external model files specified by the array EXTRN_MDL_FILES_ICS
+# (these files will be used to generate the ICs on the native FV3-LAM
+# grid). This variable is not used if USE_USER_STAGED_EXTRN_FILES is
+# set to "FALSE".
+#
+# EXTRN_MDL_FILES_ICS:
+# Array containing templates of the names of the files to search for in
+# the directory specified by EXTRN_MDL_SOURCE_BASEDIR_ICS. This
+# variable is not used if USE_USER_STAGED_EXTRN_FILES is set to "FALSE".
+# A single template should be used for each model file type that is
+# meant to be used. You may use any of the Python-style templates
+# allowed in the ush/retrieve_data.py script. To see the full list of
+# supported templates, run that script with a -h option.
Here is an example of
+# setting FV3GFS nemsio input files:
+# EXTRN_MDL_FILES_ICS=( gfs.t{hh}z.atmf{fcst_hr:03d}.nemsio \
+#   gfs.t{hh}z.sfcf{fcst_hr:03d}.nemsio )
+# Or for FV3GFS grib files:
+# EXTRN_MDL_FILES_ICS=( gfs.t{hh}z.pgrb2.0p25.f{fcst_hr:03d} )
+#
+# EXTRN_MDL_SOURCE_BASEDIR_LBCS:
+# Analogous to EXTRN_MDL_SOURCE_BASEDIR_ICS but for LBCs instead of ICs.
+#
+# EXTRN_MDL_FILES_LBCS:
+# Analogous to EXTRN_MDL_FILES_ICS but for LBCs instead of ICs.
+#
+# EXTRN_MDL_DATA_STORES:
+# A list of data stores where the scripts should look for external model
+# data. The list is in priority order. If disk information is provided
+# via USE_USER_STAGED_EXTRN_FILES or a known location on the platform,
+# the disk location will be highest priority. Options are disk, hpss,
+# aws, and nomads.
+#
+#-----------------------------------------------------------------------
+#
+USE_USER_STAGED_EXTRN_FILES: "FALSE"
+EXTRN_MDL_SOURCE_BASEDIR_ICS: ""
+EXTRN_MDL_FILES_ICS: ""
+EXTRN_MDL_SOURCE_BASEDIR_LBCS: ""
+EXTRN_MDL_FILES_LBCS: ""
+EXTRN_MDL_DATA_STORES: ""
+#
+#-----------------------------------------------------------------------
+#
+# Set NOMADS online data associated parameters. Definitions:
+#
+# NOMADS:
+# Flag that determines whether or not to use NOMADS online data.
+#
+# NOMADS_file_type:
+# Variable that specifies the format of the NOMADS data.
+#
+#-----------------------------------------------------------------------
+#
+NOMADS: "FALSE"
+NOMADS_file_type: "nemsio"
+#
+#-----------------------------------------------------------------------
+#
+# Set CCPP-associated parameters. Definitions:
+#
+# CCPP_PHYS_SUITE:
+# The physics suite that will run using CCPP (Common Community Physics
+# Package). The choice of physics suite determines the forecast model's
+# namelist file, the diagnostics table file, the field table file, and
+# the XML physics suite definition file that are staged in the experiment
+# directory or the cycle directories under it.
+#
+#-----------------------------------------------------------------------
+#
+CCPP_PHYS_SUITE: "FV3_GFS_v16"
+#
+#-----------------------------------------------------------------------
+#
+# Set GRID_GEN_METHOD. This variable specifies the method to use to
+# generate a regional grid in the horizontal. The values that it can
+# take on are:
+#
+# * "GFDLgrid":
+# This setting will generate a regional grid by first generating a
+# "parent" global cubed-sphere grid and then taking a portion of tile
+# 6 of that global grid -- referred to in the grid generation scripts
+# as "tile 7" even though it doesn't correspond to a complete tile --
+# and using it as the regional grid. Note that the forecast is run
+# only on the regional grid (i.e. tile 7, not tiles 1 through 6).
+#
+# * "ESGgrid":
+# This will generate a regional grid using the map projection developed
+# by Jim Purser of EMC.
+#
+# Note that:
+#
+# 1) If the experiment is using one of the predefined grids (i.e. if
+# PREDEF_GRID_NAME is set to the name of one of the valid predefined
+# grids), then GRID_GEN_METHOD will be reset to the value of
+# GRID_GEN_METHOD for that grid. This will happen regardless of
+# whether or not GRID_GEN_METHOD is assigned a value in the user-
+# specified experiment configuration file, i.e. any value it may be
+# assigned in the experiment configuration file will be overwritten.
+#
+# 2) If the experiment is not using one of the predefined grids (i.e. if
+# PREDEF_GRID_NAME is set to a null string), then GRID_GEN_METHOD must
+# be set in the experiment configuration file.
Otherwise, it will
+# remain set to a null string, and the experiment generation will
+# fail because the generation scripts check to ensure that it is set
+# to a non-empty string before creating the experiment directory.
+#
+#-----------------------------------------------------------------------
+#
+GRID_GEN_METHOD: ""
+#
+#-----------------------------------------------------------------------
+#
+# Set parameters specific to the "GFDLgrid" method of generating a regional
+# grid (i.e. for GRID_GEN_METHOD set to "GFDLgrid"). The following
+# parameters will be used only if GRID_GEN_METHOD is set to "GFDLgrid".
+# In this grid generation method:
+#
+# * The regional grid is defined with respect to a "parent" global cubed-
+# sphere grid. Thus, all the parameters for a global cubed-sphere grid
+# must be specified in order to define this parent global grid even
+# though the model equations are not integrated on it (they are integrated
+# only on the regional grid).
+#
+# * GFDLgrid_RES is the number of grid cells in either one of the two
+# horizontal directions x and y on any one of the 6 tiles of the parent
+# global cubed-sphere grid. The mapping from GFDLgrid_RES to a nominal
+# resolution (grid cell size) for a uniform global grid (i.e. Schmidt
+# stretch factor GFDLgrid_STRETCH_FAC set to 1) for several values of
+# GFDLgrid_RES is as follows:
+#
+#   GFDLgrid_RES    typical cell size
+#   ------------    -----------------
+#        192              50 km
+#        384              25 km
+#        768              13 km
+#       1152             8.5 km
+#       3072             3.2 km
+#
+# Note that these are only typical cell sizes. The actual cell size on
+# the global grid tiles varies somewhat as we move across a tile.
+#
+# * Tile 6 has arbitrarily been chosen as the tile to use to orient the
+# global parent grid on the sphere (Earth). This is done by specifying
+# GFDLgrid_LON_T6_CTR and GFDLgrid_LAT_T6_CTR, which are the longitude
+# and latitude (in degrees) of the center of tile 6.
+#
+# * Setting the Schmidt stretching factor GFDLgrid_STRETCH_FAC to a value
+# greater than 1 shrinks tile 6, while setting it to a value less than
+# 1 (but still greater than 0) expands it. The remaining 5 tiles change
+# shape as necessary to maintain global coverage of the grid.
+#
+# * The cell size on a given global tile depends on both GFDLgrid_RES and
+# GFDLgrid_STRETCH_FAC (since changing GFDLgrid_RES changes the number
+# of cells in the tile, and changing GFDLgrid_STRETCH_FAC modifies the
+# shape and size of the tile).
+#
+# * The regional grid is embedded within tile 6 (i.e. it doesn't extend
+# beyond the boundary of tile 6). Its exact location within tile 6 is
+# determined by specifying the starting and ending i and j indices
+# of the regional grid on tile 6, where i is the grid index in the x
+# direction and j is the grid index in the y direction. These indices
+# are stored in the variables
+#
+#   GFDLgrid_ISTART_OF_RGNL_DOM_ON_T6G
+#   GFDLgrid_JSTART_OF_RGNL_DOM_ON_T6G
+#   GFDLgrid_IEND_OF_RGNL_DOM_ON_T6G
+#   GFDLgrid_JEND_OF_RGNL_DOM_ON_T6G
+#
+# * In the forecast model code and in the experiment generation and workflow
+# scripts, for convenience the regional grid is denoted as "tile 7" even
+# though it doesn't map back to one of the 6 faces of the cube from
+# which the parent global grid is generated (it maps back to only a
+# subregion on face 6 since it is wholly confined within tile 6). Tile
+# 6 may be referred to as the "parent" tile of the regional grid.
+#
+# * GFDLgrid_REFINE_RATIO is the refinement ratio of the regional grid
+# (tile 7) with respect to the grid on its parent tile (tile 6), i.e.
+# it is the number of grid cells along the boundary of the regional grid
+# that abut one cell on tile 6. Thus, the cell size on the regional
+# grid depends not only on GFDLgrid_RES and GFDLgrid_STRETCH_FAC (because
+# the cell size on tile 6 depends on these two parameters) but also on
+# GFDLgrid_REFINE_RATIO. Note that as on the tiles of the global grid,
+# the cell size on the regional grid is not uniform but varies as we
+# move across the grid.
+#
+# Definitions of parameters that need to be specified when GRID_GEN_METHOD
+# is set to "GFDLgrid":
+#
+# GFDLgrid_LON_T6_CTR:
+# Longitude of the center of tile 6 (in degrees).
+#
+# GFDLgrid_LAT_T6_CTR:
+# Latitude of the center of tile 6 (in degrees).
+#
+# GFDLgrid_RES:
+# Number of points in each of the two horizontal directions (x and y) on
+# each tile of the parent global grid. Note that the name of this parameter
+# is really a misnomer because although it has the string "RES" (for
+# "resolution") in its name, it specifies the number of grid cells, not the
+# grid size (in, say, meters or kilometers). However, we keep this name in
+# order to remain consistent with the usage of the word "resolution" in the
+# global forecast model and other auxiliary codes.
+#
+# GFDLgrid_STRETCH_FAC:
+# Stretching factor used in the Schmidt transformation applied to the
+# parent cubed-sphere grid.
+#
+# GFDLgrid_REFINE_RATIO:
+# Cell refinement ratio for the regional grid, i.e. the number of cells
+# in either the x or y direction on the regional grid (tile 7) that abut
+# one cell on its parent tile (tile 6).
+#
+# GFDLgrid_ISTART_OF_RGNL_DOM_ON_T6G:
+# i-index on tile 6 at which the regional grid (tile 7) starts.
+#
+# GFDLgrid_IEND_OF_RGNL_DOM_ON_T6G:
+# i-index on tile 6 at which the regional grid (tile 7) ends.
+#
+# GFDLgrid_JSTART_OF_RGNL_DOM_ON_T6G:
+# j-index on tile 6 at which the regional grid (tile 7) starts.
+#
+# GFDLgrid_JEND_OF_RGNL_DOM_ON_T6G:
+# j-index on tile 6 at which the regional grid (tile 7) ends.
+#
+# GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES:
+# Flag that determines the file naming convention to use for grid, orography,
+# and surface climatology files (or, if using pregenerated files, the
+# naming convention that was used to name these files). These files
+# usually start with the string "C${RES}_", where RES is an integer.
+# In the global forecast model, RES is the number of points in each of
+# the two horizontal directions (x and y) on each tile of the global grid
+# (defined here as GFDLgrid_RES). If this flag is set to "TRUE", RES will
+# be set to GFDLgrid_RES just as in the global forecast model. If it is
+# set to "FALSE", we calculate (in the grid generation task) an "equivalent
+# global uniform cubed-sphere resolution" -- call it RES_EQUIV -- and
+# then set RES equal to it. RES_EQUIV is the number of grid points in
+# each of the x and y directions on each tile that a global UNIFORM (i.e.
+# stretch factor of 1) cubed-sphere grid would have to have in order to
+# have the same average grid size as the regional grid.
This is a more +# useful indicator of the grid size because it takes into account the +# effects of GFDLgrid_RES, GFDLgrid_STRETCH_FAC, and GFDLgrid_REFINE_RATIO +# in determining the regional grid's typical grid size, whereas simply +# setting RES to GFDLgrid_RES doesn't take into account the effects of +# GFDLgrid_STRETCH_FAC and GFDLgrid_REFINE_RATIO on the regional grid's +# resolution. Nevertheless, some users still prefer to use GFDLgrid_RES +# in the file names, so we allow for that here by setting this flag to +# "TRUE". +# +# Note that: +# +# 1) If the experiment is using one of the predefined grids (i.e. if +# PREDEF_GRID_NAME is set to the name of one of the valid predefined +# grids), then: +# +# a) If the value of GRID_GEN_METHOD for that grid is "GFDLgrid", then +# these parameters will get reset to the values for that grid. +# This will happen regardless of whether or not they are assigned +# values in the user-specified experiment configuration file, i.e. +# any values they may be assigned in the experiment configuration +# file will be overwritten. +# +# b) If the value of GRID_GEN_METHOD for that grid is "ESGgrid", then +# these parameters will not be used and thus do not need to be reset +# to non-empty strings. +# +# 2) If the experiment is not using one of the predefined grids (i.e. if +# PREDEF_GRID_NAME is set to a null string), then: +# +# a) If GRID_GEN_METHOD is set to "GFDLgrid" in the user-specified +# experiment configuration file, then these parameters must be set +# in that configuration file. +# +# b) If GRID_GEN_METHOD is set to "ESGgrid" in the user-specified +# experiment configuration file, then these parameters will not be +# used and thus do not need to be reset to non-empty strings. +# +#----------------------------------------------------------------------- +# +GFDLgrid_LON_T6_CTR: "" +GFDLgrid_LAT_T6_CTR: "" +GFDLgrid_RES: "" +GFDLgrid_STRETCH_FAC: "" +GFDLgrid_REFINE_RATIO: "" +GFDLgrid_ISTART_OF_RGNL_DOM_ON_T6G: "" +GFDLgrid_IEND_OF_RGNL_DOM_ON_T6G: "" +GFDLgrid_JSTART_OF_RGNL_DOM_ON_T6G: "" +GFDLgrid_JEND_OF_RGNL_DOM_ON_T6G: "" +GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES: "" +# +#----------------------------------------------------------------------- +# +# Set parameters specific to the "ESGgrid" method of generating a regional +# grid (i.e. for GRID_GEN_METHOD set to "ESGgrid"). Definitions: +# +# ESGgrid_LON_CTR: +# The longitude of the center of the grid (in degrees). +# +# ESGgrid_LAT_CTR: +# The latitude of the center of the grid (in degrees). +# +# ESGgrid_DELX: +# The cell size in the zonal direction of the regional grid (in meters). +# +# ESGgrid_DELY: +# The cell size in the meridional direction of the regional grid (in +# meters). +# +# ESGgrid_NX: +# The number of cells in the zonal direction on the regional grid. +# +# ESGgrid_NY: +# The number of cells in the meridional direction on the regional grid. +# +# ESGgrid_WIDE_HALO_WIDTH: +# The width (in units of number of grid cells) of the halo to add around +# the regional grid before shaving the halo down to the width(s) expected +# by the forecast model. +# +# ESGgrid_PAZI: +# The rotational parameter for the ESG grid (in degrees). +# +# In order to generate grid files containing halos that are 3-cell and +# 4-cell wide and orography files with halos that are 0-cell and 3-cell +# wide (all of which are required as inputs to the forecast model), the +# grid and orography tasks first create files with halos around the regional +# domain of width ESGgrid_WIDE_HALO_WIDTH cells. 
These are first stored +# in files. The files are then read in and "shaved" down to obtain grid +# files with 3-cell-wide and 4-cell-wide halos and orography files with +# 0-cell-wide (i.e. no halo) and 3-cell-wide halos. For this reason, we +# refer to the original halo that then gets shaved down as the "wide" +# halo, i.e. because it is wider than the 0-cell-wide, 3-cell-wide, and +# 4-cell-wide halos that we will eventually end up with. Note that the +# grid and orography files with the wide halo are only needed as intermediates +# in generating the files with 0-cell-, 3-cell-, and 4-cell-wide halos; +# they are not needed by the forecast model. +# NOTE: Probably don't need to make ESGgrid_WIDE_HALO_WIDTH a user-specified +# variable. Just set it in the function set_gridparams_ESGgrid.sh. +# +# Note that: +# +# 1) If the experiment is using one of the predefined grids (i.e. if +# PREDEF_GRID_NAME is set to the name of one of the valid predefined +# grids), then: +# +# a) If the value of GRID_GEN_METHOD for that grid is "GFDLgrid", then +# these parameters will not be used and thus do not need to be reset +# to non-empty strings. +# +# b) If the value of GRID_GEN_METHOD for that grid is "ESGgrid", then +# these parameters will get reset to the values for that grid. +# This will happen regardless of whether or not they are assigned +# values in the user-specified experiment configuration file, i.e. +# any values they may be assigned in the experiment configuration +# file will be overwritten. +# +# 2) If the experiment is not using one of the predefined grids (i.e. if +# PREDEF_GRID_NAME is set to a null string), then: +# +# a) If GRID_GEN_METHOD is set to "GFDLgrid" in the user-specified +# experiment configuration file, then these parameters will not be +# used and thus do not need to be reset to non-empty strings. +# +# b) If GRID_GEN_METHOD is set to "ESGgrid" in the user-specified +# experiment configuration file, then these parameters must be set +# in that configuration file. +# +#----------------------------------------------------------------------- +# +ESGgrid_LON_CTR: "" +ESGgrid_LAT_CTR: "" +ESGgrid_DELX: "" +ESGgrid_DELY: "" +ESGgrid_NX: "" +ESGgrid_NY: "" +ESGgrid_WIDE_HALO_WIDTH: "" +ESGgrid_PAZI: "" +# +#----------------------------------------------------------------------- +# +# Set computational parameters for the forecast. Definitions: +# +# LAYOUT_X, LAYOUT_Y: +# The number of MPI tasks (processes) to use in the two horizontal +# directions (x and y) of the regional grid when running the forecast +# model. +# +# BLOCKSIZE: +# The amount of data that is passed into the cache at a time. +# +# Here, we set these parameters to null strings. This is so that, for +# any one of these parameters: +# +# 1) If the experiment is using a predefined grid, then if the user +# sets the parameter in the user-specified experiment configuration +# file (EXPT_CONFIG_FN), that value will be used in the forecast(s). +# Otherwise, the default value of the parameter for that predefined +# grid will be used. +# +# 2) If the experiment is not using a predefined grid (i.e. it is using +# a custom grid whose parameters are specified in the experiment +# configuration file), then the user must specify a value for the +# parameter in that configuration file. 
Otherwise, the parameter +# will remain set to a null string, and the experiment generation +# will fail because the generation scripts check to ensure that all +# the parameters defined in this section are set to non-empty strings +# before creating the experiment directory. +# +#----------------------------------------------------------------------- +# +LAYOUT_X: "" +LAYOUT_Y: "" +BLOCKSIZE: "" +# +#----------------------------------------------------------------------- +# +# Set write-component (quilting) parameters. Definitions: +# +# QUILTING: +# Flag that determines whether or not to use the write component for +# writing output files to disk. +# +# WRTCMP_write_groups: +# The number of write groups (i.e. groups of MPI tasks) to use in the +# write component. +# +# WRTCMP_write_tasks_per_group: +# The number of MPI tasks to allocate for each write group. +# +# PRINT_ESMF: +# Flag for whether or not to output extra (debugging) information from +# ESMF routines. Must be "TRUE" or "FALSE". Note that the write +# component uses ESMF library routines to interpolate from the native +# forecast model grid to the user-specified output grid (which is defined +# in the model configuration file MODEL_CONFIG_FN in the forecast's run +# directory). +# +#----------------------------------------------------------------------- +# +QUILTING: "TRUE" +PRINT_ESMF: "FALSE" + +WRTCMP_write_groups: "1" +WRTCMP_write_tasks_per_group: "20" + +WRTCMP_output_grid: "''" +WRTCMP_cen_lon: "" +WRTCMP_cen_lat: "" +WRTCMP_lon_lwr_left: "" +WRTCMP_lat_lwr_left: "" +# +# The following are used only for the case of WRTCMP_output_grid set to +# "'rotated_latlon'". +# +WRTCMP_lon_upr_rght: "" +WRTCMP_lat_upr_rght: "" +WRTCMP_dlon: "" +WRTCMP_dlat: "" +# +# The following are used only for the case of WRTCMP_output_grid set to +# "'lambert_conformal'". +# +WRTCMP_stdlat1: "" +WRTCMP_stdlat2: "" +WRTCMP_nx: "" +WRTCMP_ny: "" +WRTCMP_dx: "" +WRTCMP_dy: "" +# +#----------------------------------------------------------------------- +# +# Set PREDEF_GRID_NAME. This parameter specifies a predefined regional +# grid, as follows: +# +# * If PREDEF_GRID_NAME is set to a valid predefined grid name, the grid +# generation method GRID_GEN_METHOD, the (native) grid parameters, and +# the write-component grid parameters are set to predefined values for +# the specified grid, overwriting any settings of these parameters in +# the user-specified experiment configuration file. In addition, if +# the time step DT_ATMOS and the computational parameters LAYOUT_X, +# LAYOUT_Y, and BLOCKSIZE are not specified in that configuration file, +# they are also set to predefined values for the specified grid. +# +# * If PREDEF_GRID_NAME is set to an empty string, it implies the user +# is providing the native grid parameters in the user-specified +# experiment configuration file (EXPT_CONFIG_FN). In this case, the +# grid generation method GRID_GEN_METHOD, the native grid parameters, +# and the write-component grid parameters as well as the time step +# forecast model's main time step DT_ATMOS and the computational +# parameters LAYOUT_X, LAYOUT_Y, and BLOCKSIZE must be set in that +# configuration file; otherwise, the values of all of these parameters +# in this default experiment configuration file will be used. +# +# Setting PREDEF_GRID_NAME provides a convenient method of specifying a +# commonly used set of grid-dependent parameters. 
The predefined grid
+# parameters are specified in the script
+#
+#   $HOMErrfs/ush/set_predef_grid_params.sh
+#
+#-----------------------------------------------------------------------
+#
+PREDEF_GRID_NAME: ""
+#
+#-----------------------------------------------------------------------
+#
+# Set PREEXISTING_DIR_METHOD. This variable determines the method to
+# use to deal with preexisting directories [e.g. ones generated by previous
+# calls to the experiment generation script using the same experiment name
+# (EXPT_SUBDIR) as the current experiment]. This variable must be set to
+# one of "delete", "rename", or "quit". The resulting behavior for each
+# of these values is as follows:
+#
+# * "delete":
+# The preexisting directory is deleted and a new directory (having the
+# same name as the original preexisting directory) is created.
+#
+# * "rename":
+# The preexisting directory is renamed and a new directory (having the
+# same name as the original preexisting directory) is created. The new
+# name of the preexisting directory consists of its original name and
+# the suffix "_oldNNN", where NNN is a 3-digit integer chosen to make
+# the new name unique.
+#
+# * "quit":
+# The preexisting directory is left unchanged, but execution of the
+# currently running script is terminated. In this case, the preexisting
+# directory must be dealt with manually before rerunning the script.
+#
+#-----------------------------------------------------------------------
+#
+PREEXISTING_DIR_METHOD: "delete"
+#
+#-----------------------------------------------------------------------
+#
+# Set flags for more detailed messages. Definitions:
+#
+# VERBOSE:
+# This is a flag that determines whether or not the experiment generation
+# and workflow task scripts print out more informational messages.
+#
+# DEBUG:
+# This is a flag that determines whether or not very detailed debugging
+# messages are printed out. Note that if DEBUG is set to TRUE, then
+# VERBOSE will also get reset to TRUE if it isn't already.
+#
+#-----------------------------------------------------------------------
+#
+VERBOSE: "TRUE"
+DEBUG: "FALSE"
+#
+#-----------------------------------------------------------------------
+#
+# Set the names of the various rocoto workflow tasks.
+# +#----------------------------------------------------------------------- +# +MAKE_GRID_TN: "make_grid" +MAKE_OROG_TN: "make_orog" +MAKE_SFC_CLIMO_TN: "make_sfc_climo" +GET_EXTRN_ICS_TN: "get_extrn_ics" +GET_EXTRN_LBCS_TN: "get_extrn_lbcs" +MAKE_ICS_TN: "make_ics" +MAKE_LBCS_TN: "make_lbcs" +RUN_FCST_TN: "run_fcst" +RUN_POST_TN: "run_post" +GET_OBS: "get_obs" +GET_OBS_CCPA_TN: "get_obs_ccpa" +GET_OBS_MRMS_TN: "get_obs_mrms" +GET_OBS_NDAS_TN: "get_obs_ndas" +VX_TN: "run_vx" +VX_GRIDSTAT_TN: "run_gridstatvx" +VX_GRIDSTAT_REFC_TN: "run_gridstatvx_refc" +VX_GRIDSTAT_RETOP_TN: "run_gridstatvx_retop" +VX_GRIDSTAT_03h_TN: "run_gridstatvx_03h" +VX_GRIDSTAT_06h_TN: "run_gridstatvx_06h" +VX_GRIDSTAT_24h_TN: "run_gridstatvx_24h" +VX_POINTSTAT_TN: "run_pointstatvx" +VX_ENSGRID_TN: "run_ensgridvx" +VX_ENSGRID_03h_TN: "run_ensgridvx_03h" +VX_ENSGRID_06h_TN: "run_ensgridvx_06h" +VX_ENSGRID_24h_TN: "run_ensgridvx_24h" +VX_ENSGRID_REFC_TN: "run_ensgridvx_refc" +VX_ENSGRID_RETOP_TN: "run_ensgridvx_retop" +VX_ENSGRID_MEAN_TN: "run_ensgridvx_mean" +VX_ENSGRID_PROB_TN: "run_ensgridvx_prob" +VX_ENSGRID_MEAN_03h_TN: "run_ensgridvx_mean_03h" +VX_ENSGRID_PROB_03h_TN: "run_ensgridvx_prob_03h" +VX_ENSGRID_MEAN_06h_TN: "run_ensgridvx_mean_06h" +VX_ENSGRID_PROB_06h_TN: "run_ensgridvx_prob_06h" +VX_ENSGRID_MEAN_24h_TN: "run_ensgridvx_mean_24h" +VX_ENSGRID_PROB_24h_TN: "run_ensgridvx_prob_24h" +VX_ENSGRID_PROB_REFC_TN: "run_ensgridvx_prob_refc" +VX_ENSGRID_PROB_RETOP_TN: "run_ensgridvx_prob_retop" +VX_ENSPOINT_TN: "run_enspointvx" +VX_ENSPOINT_MEAN_TN: "run_enspointvx_mean" +VX_ENSPOINT_PROB_TN: "run_enspointvx_prob" +# +#----------------------------------------------------------------------- +# +# Set flags (and related directories) that determine whether various +# workflow tasks should be run. Note that the MAKE_GRID_TN, MAKE_OROG_TN, +# and MAKE_SFC_CLIMO_TN are all cycle-independent tasks, i.e. if they +# are to be run, they do so only once at the beginning of the workflow +# before any cycles are run. Definitions: +# +# RUN_TASK_MAKE_GRID: +# Flag that determines whether the MAKE_GRID_TN task is to be run. If +# this is set to "TRUE", the grid generation task is run and new grid +# files are generated. If it is set to "FALSE", then the scripts look +# for pregenerated grid files in the directory specified by GRID_DIR +# (see below). +# +# GRID_DIR: +# The directory in which to look for pregenerated grid files if +# RUN_TASK_MAKE_GRID is set to "FALSE". +# +# RUN_TASK_MAKE_OROG: +# Same as RUN_TASK_MAKE_GRID but for the MAKE_OROG_TN task. +# +# OROG_DIR: +# Same as GRID_DIR but for the MAKE_OROG_TN task. +# +# RUN_TASK_MAKE_SFC_CLIMO: +# Same as RUN_TASK_MAKE_GRID but for the MAKE_SFC_CLIMO_TN task. +# +# SFC_CLIMO_DIR: +# Same as GRID_DIR but for the MAKE_SFC_CLIMO_TN task. +# +# DOMAIN_PREGEN_BASEDIR: +# The base directory containing pregenerated grid, orography, and surface +# climatology files. This is an alternative for setting GRID_DIR, +# OROG_DIR, and SFC_CLIMO_DIR individually +# +# For the pregenerated grid specified by PREDEF_GRID_NAME, +# these "fixed" files are located in: +# +# ${DOMAIN_PREGEN_BASEDIR}/${PREDEF_GRID_NAME} +# +# The workflow scripts will create a symlink in the experiment directory +# that will point to a subdirectory (having the name of the grid being +# used) under this directory. This variable should be set to a null +# string in this file, but it can be specified in the user-specified +# workflow configuration file (EXPT_CONFIG_FN). 
+# +# RUN_TASK_GET_EXTRN_ICS: +# Flag that determines whether the GET_EXTRN_ICS_TN task is to be run. +# +# RUN_TASK_GET_EXTRN_LBCS: +# Flag that determines whether the GET_EXTRN_LBCS_TN task is to be run. +# +# RUN_TASK_MAKE_ICS: +# Flag that determines whether the MAKE_ICS_TN task is to be run. +# +# RUN_TASK_MAKE_LBCS: +# Flag that determines whether the MAKE_LBCS_TN task is to be run. +# +# RUN_TASK_RUN_FCST: +# Flag that determines whether the RUN_FCST_TN task is to be run. +# +# RUN_TASK_RUN_POST: +# Flag that determines whether the RUN_POST_TN task is to be run. +# +# RUN_TASK_VX_GRIDSTAT: +# Flag that determines whether the grid-stat verification task is to be +# run. +# +# RUN_TASK_VX_POINTSTAT: +# Flag that determines whether the point-stat verification task is to be +# run. +# +# RUN_TASK_VX_ENSGRID: +# Flag that determines whether the ensemble-stat verification for gridded +# data task is to be run. +# +# RUN_TASK_VX_ENSPOINT: +# Flag that determines whether the ensemble point verification task is +# to be run. If this flag is set, both ensemble-stat point verification +# and point verification of ensemble-stat output is computed. +# +#----------------------------------------------------------------------- +# +RUN_TASK_MAKE_GRID: "TRUE" +GRID_DIR: "/path/to/pregenerated/grid/files" + +RUN_TASK_MAKE_OROG: "TRUE" +OROG_DIR: "/path/to/pregenerated/orog/files" + +RUN_TASK_MAKE_SFC_CLIMO: "TRUE" +SFC_CLIMO_DIR: "/path/to/pregenerated/surface/climo/files" + +DOMAIN_PREGEN_BASEDIR: "" + +RUN_TASK_GET_EXTRN_ICS: "TRUE" +RUN_TASK_GET_EXTRN_LBCS: "TRUE" +RUN_TASK_MAKE_ICS: "TRUE" +RUN_TASK_MAKE_LBCS: "TRUE" +RUN_TASK_RUN_FCST: "TRUE" +RUN_TASK_RUN_POST: "TRUE" + +RUN_TASK_GET_OBS_CCPA: "FALSE" +RUN_TASK_GET_OBS_MRMS: "FALSE" +RUN_TASK_GET_OBS_NDAS: "FALSE" +RUN_TASK_VX_GRIDSTAT: "FALSE" +RUN_TASK_VX_POINTSTAT: "FALSE" +RUN_TASK_VX_ENSGRID: "FALSE" +RUN_TASK_VX_ENSPOINT: "FALSE" +# +#----------------------------------------------------------------------- +# +# Flag that determines whether MERRA2 aerosol climatology data and +# lookup tables for optics properties are obtained +# +#----------------------------------------------------------------------- +# +USE_MERRA_CLIMO: "FALSE" +# +#----------------------------------------------------------------------- +# +# Set the array parameter containing the names of all the fields that the +# MAKE_SFC_CLIMO_TN task generates on the native FV3-LAM grid. +# +#----------------------------------------------------------------------- +# +SFC_CLIMO_FIELDS: [ +"facsf", +"maximum_snow_albedo", +"slope_type", +"snowfree_albedo", +"soil_type", +"substrate_temperature", +"vegetation_greenness", +"vegetation_type" +] +# +#----------------------------------------------------------------------- +# +# Set parameters associated with the fixed (i.e. static) files. Definitions: +# +# FIXgsm: +# System directory in which the majority of fixed (i.e. time-independent) +# files that are needed to run the FV3-LAM model are located +# +# FIXaer: +# System directory where MERRA2 aerosol climatology files are located +# +# FIXlut: +# System directory where the lookup tables for optics properties are located +# +# TOPO_DIR: +# The location on disk of the static input files used by the make_orog +# task (orog.x and shave.x). Can be the same as FIXgsm. +# +# SFC_CLIMO_INPUT_DIR: +# The location on disk of the static surface climatology input fields, used by +# sfc_climo_gen. 
These files are only used if RUN_TASK_MAKE_SFC_CLIMO=TRUE
+#
+# FNGLAC, ..., FNMSKH:
+# Names of (some of the) global data files that are assumed to exist in
+# a system directory (this directory is machine-dependent; the experiment
+# generation scripts will set it and store it in the variable FIXgsm).
+# These file names also appear directly in the forecast model's input
+# namelist file.
+#
+# FIXgsm_FILES_TO_COPY_TO_FIXam:
+# If not running in NCO mode, this array contains the names of the files
+# to copy from the FIXgsm system directory to the FIXam directory under
+# the experiment directory. Note that the last element has a dummy value.
+# This last element will get reset by the workflow generation scripts to
+# the name of the ozone production/loss file to copy from FIXgsm. The
+# name of this file depends on the ozone parameterization being used,
+# and that in turn depends on the CCPP physics suite specified for the
+# experiment. Thus, the CCPP physics suite XML must first be read in to
+# determine the ozone parameterization and then the name of the ozone
+# production/loss file. These steps are carried out elsewhere (in one
+# of the workflow generation scripts/functions).
+#
+# FV3_NML_VARNAME_TO_FIXam_FILES_MAPPING:
+# This array is used to set some of the namelist variables in the forecast
+# model's namelist file that represent the relative or absolute paths of
+# various fixed files (the first column of the array, where columns are
+# delineated by the pipe symbol "|") to the full paths to these files in
+# the FIXam directory derived from the corresponding workflow variables
+# containing file names (the second column of the array).
+#
+# FV3_NML_VARNAME_TO_SFC_CLIMO_FIELD_MAPPING:
+# This array is used to set some of the namelist variables in the forecast
+# model's namelist file that represent the relative or absolute paths of
+# various fixed files (the first column of the array, where columns are
+# delineated by the pipe symbol "|") to the full paths to surface climatology
+# files (on the native FV3-LAM grid) in the FIXLAM directory derived from
+# the corresponding surface climatology fields (the second column of the
+# array).
+#
+# CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING:
+# This array specifies the mapping to use between the symlinks that need
+# to be created in each cycle directory (these are the "files" that FV3
+# looks for) and their targets in the FIXam directory. The first column
+# of the array specifies the symlink to be created, and the second column
+# specifies its target file in FIXam (where columns are delineated by the
+# pipe symbol "|").
+# +#----------------------------------------------------------------------- +# +# Because the default values are dependent on the platform, we set these +# to a null string which will then be overwritten in setup.sh unless the +# user has specified a different value in config.sh +FIXgsm: "" +FIXaer: "" +FIXlut: "" +TOPO_DIR: "" +SFC_CLIMO_INPUT_DIR: "" + +FNGLAC: &FNGLAC "global_glacier.2x2.grb" +FNMXIC: &FNMXIC "global_maxice.2x2.grb" +FNTSFC: &FNTSFC "RTGSST.1982.2012.monthly.clim.grb" +FNSNOC: &FNSNOC "global_snoclim.1.875.grb" +FNZORC: &FNZORC "igbp" +FNAISC: &FNAISC "CFSR.SEAICE.1982.2012.monthly.clim.grb" +FNSMCC: &FNSMCC "global_soilmgldas.t126.384.190.grb" +FNMSKH: &FNMSKH "seaice_newland.grb" + +FIXgsm_FILES_TO_COPY_TO_FIXam: [ +*FNGLAC, +*FNMXIC, +*FNTSFC, +*FNSNOC, +*FNAISC, +*FNSMCC, +*FNMSKH, +"global_climaeropac_global.txt", +"fix_co2_proj/global_co2historicaldata_2010.txt", +"fix_co2_proj/global_co2historicaldata_2011.txt", +"fix_co2_proj/global_co2historicaldata_2012.txt", +"fix_co2_proj/global_co2historicaldata_2013.txt", +"fix_co2_proj/global_co2historicaldata_2014.txt", +"fix_co2_proj/global_co2historicaldata_2015.txt", +"fix_co2_proj/global_co2historicaldata_2016.txt", +"fix_co2_proj/global_co2historicaldata_2017.txt", +"fix_co2_proj/global_co2historicaldata_2018.txt", +"fix_co2_proj/global_co2historicaldata_2019.txt", +"fix_co2_proj/global_co2historicaldata_2020.txt", +"fix_co2_proj/global_co2historicaldata_2021.txt", +"global_co2historicaldata_glob.txt", +"co2monthlycyc.txt", +"global_h2o_pltc.f77", +"global_hyblev.l65.txt", +"global_zorclim.1x1.grb", +"global_sfc_emissivity_idx.txt", +"global_tg3clim.2.6x1.5.grb", +"global_solarconstant_noaa_an.txt", +"global_albedo4.1x1.grb", +"geo_em.d01.lat-lon.2.5m.HGT_M.nc", +"HGT.Beljaars_filtered.lat-lon.30s_res.nc", +"replace_with_FIXgsm_ozone_prodloss_filename" +] + +# +# It is possible to remove this as a workflow variable and make it only +# a local one since it is used in only one script. 
+# +FV3_NML_VARNAME_TO_FIXam_FILES_MAPPING: [ +!join_str ["FNGLAC | ",*FNGLAC], +!join_str ["FNMXIC | ",*FNMXIC], +!join_str ["FNTSFC | ",*FNTSFC], +!join_str ["FNSNOC | ",*FNSNOC], +!join_str ["FNAISC | ",*FNAISC], +!join_str ["FNSMCC | ",*FNSMCC], +!join_str ["FNMSKH | ",*FNMSKH] +] +#"FNZORC | $FNZORC", + +FV3_NML_VARNAME_TO_SFC_CLIMO_FIELD_MAPPING: [ +"FNALBC | snowfree_albedo", +"FNALBC2 | facsf", +"FNTG3C | substrate_temperature", +"FNVEGC | vegetation_greenness", +"FNVETC | vegetation_type", +"FNSOTC | soil_type", +"FNVMNC | vegetation_greenness", +"FNVMXC | vegetation_greenness", +"FNSLPC | slope_type", +"FNABSC | maximum_snow_albedo" +] + +CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING: [ +"aerosol.dat | global_climaeropac_global.txt", +"co2historicaldata_2010.txt | fix_co2_proj/global_co2historicaldata_2010.txt", +"co2historicaldata_2011.txt | fix_co2_proj/global_co2historicaldata_2011.txt", +"co2historicaldata_2012.txt | fix_co2_proj/global_co2historicaldata_2012.txt", +"co2historicaldata_2013.txt | fix_co2_proj/global_co2historicaldata_2013.txt", +"co2historicaldata_2014.txt | fix_co2_proj/global_co2historicaldata_2014.txt", +"co2historicaldata_2015.txt | fix_co2_proj/global_co2historicaldata_2015.txt", +"co2historicaldata_2016.txt | fix_co2_proj/global_co2historicaldata_2016.txt", +"co2historicaldata_2017.txt | fix_co2_proj/global_co2historicaldata_2017.txt", +"co2historicaldata_2018.txt | fix_co2_proj/global_co2historicaldata_2018.txt", +"co2historicaldata_2019.txt | fix_co2_proj/global_co2historicaldata_2019.txt", +"co2historicaldata_2020.txt | fix_co2_proj/global_co2historicaldata_2020.txt", +"co2historicaldata_2021.txt | fix_co2_proj/global_co2historicaldata_2021.txt", +"co2historicaldata_glob.txt | global_co2historicaldata_glob.txt", +"co2monthlycyc.txt | co2monthlycyc.txt", +"global_h2oprdlos.f77 | global_h2o_pltc.f77", +"global_albedo4.1x1.grb | global_albedo4.1x1.grb", +"global_zorclim.1x1.grb | global_zorclim.1x1.grb", +"global_tg3clim.2.6x1.5.grb | global_tg3clim.2.6x1.5.grb", +"sfc_emissivity_idx.txt | global_sfc_emissivity_idx.txt", +"solarconstant_noaa_an.txt | global_solarconstant_noaa_an.txt", +"global_o3prdlos.f77 | " +] +# +#----------------------------------------------------------------------- +# +# For each workflow task, set the parameters to pass to the job scheduler +# (e.g. slurm) that will submit a job for each task to be run. These +# parameters include the number of nodes to use to run the job, the MPI +# processes per node, the maximum walltime to allow for the job to complete, +# and the maximum number of times to attempt to run each task. +# +#----------------------------------------------------------------------- +# +# Number of nodes. +# +NNODES_MAKE_GRID: "1" +NNODES_MAKE_OROG: "1" +NNODES_MAKE_SFC_CLIMO: "2" +NNODES_GET_EXTRN_ICS: "1" +NNODES_GET_EXTRN_LBCS: "1" +NNODES_MAKE_ICS: "4" +NNODES_MAKE_LBCS: "4" +NNODES_RUN_FCST: "" # This is calculated in the workflow generation scripts, so no need to set here. +NNODES_RUN_POST: "2" +NNODES_GET_OBS_CCPA: "1" +NNODES_GET_OBS_MRMS: "1" +NNODES_GET_OBS_NDAS: "1" +NNODES_VX_GRIDSTAT: "1" +NNODES_VX_POINTSTAT: "1" +NNODES_VX_ENSGRID: "1" +NNODES_VX_ENSGRID_MEAN: "1" +NNODES_VX_ENSGRID_PROB: "1" +NNODES_VX_ENSPOINT: "1" +NNODES_VX_ENSPOINT_MEAN: "1" +NNODES_VX_ENSPOINT_PROB: "1" +# +# Number of MPI processes per node. 
+# +PPN_MAKE_GRID: "24" +PPN_MAKE_OROG: "24" +PPN_MAKE_SFC_CLIMO: "24" +PPN_GET_EXTRN_ICS: "1" +PPN_GET_EXTRN_LBCS: "1" +PPN_MAKE_ICS: "12" +PPN_MAKE_LBCS: "12" +PPN_RUN_FCST: "" # will be calculated from NCORES_PER_NODE and OMP_NUM_THREADS in setup.sh +PPN_RUN_POST: "24" +PPN_GET_OBS_CCPA: "1" +PPN_GET_OBS_MRMS: "1" +PPN_GET_OBS_NDAS: "1" +PPN_VX_GRIDSTAT: "1" +PPN_VX_POINTSTAT: "1" +PPN_VX_ENSGRID: "1" +PPN_VX_ENSGRID_MEAN: "1" +PPN_VX_ENSGRID_PROB: "1" +PPN_VX_ENSPOINT: "1" +PPN_VX_ENSPOINT_MEAN: "1" +PPN_VX_ENSPOINT_PROB: "1" +# +# Walltimes. +# +WTIME_MAKE_GRID: "00:20:00" +WTIME_MAKE_OROG: "01:00:00" +WTIME_MAKE_SFC_CLIMO: "00:20:00" +WTIME_GET_EXTRN_ICS: "00:45:00" +WTIME_GET_EXTRN_LBCS: "00:45:00" +WTIME_MAKE_ICS: "00:30:00" +WTIME_MAKE_LBCS: "00:30:00" +WTIME_RUN_FCST: "04:30:00" +WTIME_RUN_POST: "00:15:00" +WTIME_GET_OBS_CCPA: "00:45:00" +WTIME_GET_OBS_MRMS: "00:45:00" +WTIME_GET_OBS_NDAS: "02:00:00" +WTIME_VX_GRIDSTAT: "02:00:00" +WTIME_VX_POINTSTAT: "01:00:00" +WTIME_VX_ENSGRID: "01:00:00" +WTIME_VX_ENSGRID_MEAN: "01:00:00" +WTIME_VX_ENSGRID_PROB: "01:00:00" +WTIME_VX_ENSPOINT: "01:00:00" +WTIME_VX_ENSPOINT_MEAN: "01:00:00" +WTIME_VX_ENSPOINT_PROB: "01:00:00" +# +# Maximum number of attempts. +# +MAXTRIES_MAKE_GRID: "2" +MAXTRIES_MAKE_OROG: "2" +MAXTRIES_MAKE_SFC_CLIMO: "2" +MAXTRIES_GET_EXTRN_ICS: "1" +MAXTRIES_GET_EXTRN_LBCS: "1" +MAXTRIES_MAKE_ICS: "1" +MAXTRIES_MAKE_LBCS: "1" +MAXTRIES_RUN_FCST: "1" +MAXTRIES_RUN_POST: "2" +MAXTRIES_GET_OBS_CCPA: "1" +MAXTRIES_GET_OBS_MRMS: "1" +MAXTRIES_GET_OBS_NDAS: "1" +MAXTRIES_VX_GRIDSTAT: "1" +MAXTRIES_VX_GRIDSTAT_REFC: "1" +MAXTRIES_VX_GRIDSTAT_RETOP: "1" +MAXTRIES_VX_GRIDSTAT_03h: "1" +MAXTRIES_VX_GRIDSTAT_06h: "1" +MAXTRIES_VX_GRIDSTAT_24h: "1" +MAXTRIES_VX_POINTSTAT: "1" +MAXTRIES_VX_ENSGRID: "1" +MAXTRIES_VX_ENSGRID_REFC: "1" +MAXTRIES_VX_ENSGRID_RETOP: "1" +MAXTRIES_VX_ENSGRID_03h: "1" +MAXTRIES_VX_ENSGRID_06h: "1" +MAXTRIES_VX_ENSGRID_24h: "1" +MAXTRIES_VX_ENSGRID_MEAN: "1" +MAXTRIES_VX_ENSGRID_PROB: "1" +MAXTRIES_VX_ENSGRID_MEAN_03h: "1" +MAXTRIES_VX_ENSGRID_PROB_03h: "1" +MAXTRIES_VX_ENSGRID_MEAN_06h: "1" +MAXTRIES_VX_ENSGRID_PROB_06h: "1" +MAXTRIES_VX_ENSGRID_MEAN_24h: "1" +MAXTRIES_VX_ENSGRID_PROB_24h: "1" +MAXTRIES_VX_ENSGRID_PROB_REFC: "1" +MAXTRIES_VX_ENSGRID_PROB_RETOP: "1" +MAXTRIES_VX_ENSPOINT: "1" +MAXTRIES_VX_ENSPOINT_MEAN: "1" +MAXTRIES_VX_ENSPOINT_PROB: "1" + +# +#----------------------------------------------------------------------- +# +# Allows an extra parameter to be passed to slurm via XML Native +# command +# +SLURM_NATIVE_CMD: "" +# +#----------------------------------------------------------------------- +# +# Set parameters associated with subhourly forecast model output and +# post-processing. +# +# SUB_HOURLY_POST: +# Flag that indicates whether the forecast model will generate output +# files on a sub-hourly time interval (e.g. 10 minutes, 15 minutes, etc). +# This will also cause the post-processor to process these sub-hourly +# files. If ths is set to "TRUE", then DT_SUBHOURLY_POST_MNTS should be +# set to a value between "00" and "59". +# +# DT_SUB_HOURLY_POST_MNTS: +# Time interval in minutes between the forecast model output files. If +# SUB_HOURLY_POST is set to "TRUE", this needs to be set to a two-digit +# integer between "01" and "59". This is not used if SUB_HOURLY_POST is +# not set to "TRUE". 
Note that if SUB_HOURLY_POST is set to "TRUE" but +# DT_SUB_HOURLY_POST_MNTS is set to "00", SUB_HOURLY_POST will get reset +# to "FALSE" in the experiment generation scripts (there will be an +# informational message in the log file to emphasize this). +# +#----------------------------------------------------------------------- +# +SUB_HOURLY_POST: "FALSE" +DT_SUBHOURLY_POST_MNTS: "00" +# +#----------------------------------------------------------------------- +# +# Set parameters associated with defining a customized post configuration +# file. +# +# USE_CUSTOM_POST_CONFIG_FILE: +# Flag that determines whether a user-provided custom configuration file +# should be used for post-processing the model data. If this is set to +# "TRUE", then the workflow will use the custom post-processing (UPP) +# configuration file specified in CUSTOM_POST_CONFIG_FP. Otherwise, a +# default configuration file provided in the UPP repository will be +# used. +# +# CUSTOM_POST_CONFIG_FP: +# The full path to the custom post flat file, including filename, to be +# used for post-processing. This is only used if CUSTOM_POST_CONFIG_FILE +# is set to "TRUE". +# +#----------------------------------------------------------------------- +# +USE_CUSTOM_POST_CONFIG_FILE: "FALSE" +CUSTOM_POST_CONFIG_FP: "" +# +#----------------------------------------------------------------------- +# +# Set parameters associated with outputting satellite fields in the UPP +# grib2 files using the Community Radiative Transfer Model (CRTM). +# +# USE_CRTM: +# Flag that defines whether external CRTM coefficient files have been +# staged by the user in order to output synthetic statellite products +# available within the UPP. If this is set to "TRUE", then the workflow +# will check for these files in the directory CRTM_DIR. Otherwise, it is +# assumed that no satellite fields are being requested in the UPP +# configuration. +# +# CRTM_DIR: +# This is the path to the top CRTM fix file directory. This is only used +# if USE_CRTM is set to "TRUE". +# +#----------------------------------------------------------------------- +# +USE_CRTM: "FALSE" +CRTM_DIR: "" +# +#----------------------------------------------------------------------- +# +# Set parameters associated with running ensembles. Definitions: +# +# DO_ENSEMBLE: +# Flag that determines whether to run a set of ensemble forecasts (for +# each set of specified cycles). If this is set to "TRUE", NUM_ENS_MEMBERS +# forecasts are run for each cycle, each with a different set of stochastic +# seed values. Otherwise, a single forecast is run for each cycle. +# +# NUM_ENS_MEMBERS: +# The number of ensemble members to run if DO_ENSEMBLE is set to "TRUE". +# This variable also controls the naming of the ensemble member directories. +# For example, if this is set to "8", the member directories will be named +# mem1, mem2, ..., mem8. If it is set to "08" (note the leading zero), +# the member directories will be named mem01, mem02, ..., mem08. Note, +# however, that after reading in the number of characters in this string +# (in order to determine how many leading zeros, if any, should be placed +# in the names of the member directories), the workflow generation scripts +# strip away those leading zeros. Thus, in the variable definitions file +# (GLOBAL_VAR_DEFNS_FN), this variable appear with its leading zeros +# stripped. This variable is not used if DO_ENSEMBLE is not set to "TRUE". 
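To make the member-directory naming described above concrete, here is a short illustrative sketch (not part of the workflow scripts; the variable names are hypothetical):

```python
# How a NUM_ENS_MEMBERS string with a leading zero maps to member directories.
num_ens_members = "08"            # hypothetical user setting in config.sh
ndigits = len(num_ens_members)    # the string length sets the zero padding
nmembers = int(num_ens_members)   # the leading zero is stripped for the count
member_dirs = [f"mem{i:0{ndigits}d}" for i in range(1, nmembers + 1)]
print(member_dirs)                # ['mem01', 'mem02', ..., 'mem08']
```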
+# +#----------------------------------------------------------------------- +# +DO_ENSEMBLE: "FALSE" +NUM_ENS_MEMBERS: "1" +# +#----------------------------------------------------------------------- +# +# Set default ad-hoc stochastic physics options. +# For detailed documentation of these parameters, see: +# https://stochastic-physics.readthedocs.io/en/ufs_public_release/namelist_options.html +# +#----------------------------------------------------------------------- +# +DO_SHUM: "FALSE" +DO_SPPT: "FALSE" +DO_SKEB: "FALSE" +ISEED_SPPT: "1" +ISEED_SHUM: "2" +ISEED_SKEB: "3" +NEW_LSCALE: "TRUE" +SHUM_MAG: "0.006" #Variable "shum" in input.nml +SHUM_LSCALE: "150000" +SHUM_TSCALE: "21600" #Variable "shum_tau" in input.nml +SHUM_INT: "3600" #Variable "shumint" in input.nml +SPPT_MAG: "0.7" #Variable "sppt" in input.nml +SPPT_LOGIT: "TRUE" +SPPT_LSCALE: "150000" +SPPT_TSCALE: "21600" #Variable "sppt_tau" in input.nml +SPPT_INT: "3600" #Variable "spptint" in input.nml +SPPT_SFCLIMIT: "TRUE" +SKEB_MAG: "0.5" #Variable "skeb" in input.nml +SKEB_LSCALE: "150000" +SKEB_TSCALE: "21600" #Variable "skeb_tau" in input.nml +SKEB_INT: "3600" #Variable "skebint" in input.nml +SKEBNORM: "1" +SKEB_VDOF: "10" +USE_ZMTNBLCK: "FALSE" +# +#----------------------------------------------------------------------- +# +# Set default SPP stochastic physics options. Each SPP option is an array, +# applicable (in order) to the scheme/parameter listed in SPP_VAR_LIST. +# Enter each value of the array in config.sh as shown below without commas +# or single quotes (e.g., SPP_VAR_LIST=( "pbl" "sfc" "mp" "rad" "gwd" ). +# Both commas and single quotes will be added by Jinja when creating the +# namelist. +# +# Note that SPP is currently only available for specific physics schemes +# used in the RAP/HRRR physics suite. Users need to be aware of which SDF +# is chosen when turning this option on. +# +# Patterns evolve and are applied at each time step. +# +#----------------------------------------------------------------------- +# +DO_SPP: "FALSE" +SPP_VAR_LIST: [ "pbl", "sfc", "mp", "rad", "gwd" ] +SPP_MAG_LIST: [ "0.2", "0.2", "0.75", "0.2", "0.2" ] #Variable "spp_prt_list" in input.nml +SPP_LSCALE: [ "150000.0", "150000.0", "150000.0", "150000.0", "150000.0" ] +SPP_TSCALE: [ "21600.0", "21600.0", "21600.0", "21600.0", "21600.0" ] #Variable "spp_tau" in input.nml +SPP_SIGTOP1: [ "0.1", "0.1", "0.1", "0.1", "0.1" ] +SPP_SIGTOP2: [ "0.025", "0.025", "0.025", "0.025", "0.025" ] +SPP_STDDEV_CUTOFF: [ "1.5", "1.5", "2.5", "1.5", "1.5" ] +ISEED_SPP: [ "4", "4", "4", "4", "4" ] +# +#----------------------------------------------------------------------- +# +# Turn on SPP in Noah or RUC LSM (support for Noah MP is in progress). +# Please be aware of the SDF that you choose if you wish to turn on LSM +# SPP. +# +# SPP in LSM schemes is handled in the &nam_sfcperts namelist block +# instead of in &nam_sppperts, where all other SPP is implemented. +# +# The default perturbation frequency is determined by the fhcyc namelist +# entry. Since that parameter is set to zero in the SRW App, use +# LSM_SPP_EACH_STEP to perturb every time step. +# +# Perturbations to soil moisture content (SMC) are only applied at the +# first time step. +# +# LSM perturbations include SMC - soil moisture content (volume fraction), +# VGF - vegetation fraction, ALB - albedo, SAL - salinity, +# EMI - emissivity, ZOL - surface roughness (cm), and STC - soil temperature. 
+# +# Only five perturbations at a time can be applied currently, but all seven +# are shown below. In addition, only one unique iseed value is allowed +# at the moment, and is used for each pattern. +# +DO_LSM_SPP: "FALSE" #If true, sets lndp_type=2 +LSM_SPP_TSCALE: [ "21600", "21600", "21600", "21600", "21600", "21600", "21600" ] +LSM_SPP_LSCALE: [ "150000", "150000", "150000", "150000", "150000", "150000", "150000" ] +ISEED_LSM_SPP: [ "9" ] +LSM_SPP_VAR_LIST: [ "smc", "vgf", "alb", "sal", "emi", "zol", "stc" ] +LSM_SPP_MAG_LIST: [ "0.2", "0.001", "0.001", "0.001", "0.001", "0.001", "0.2" ] +LSM_SPP_EACH_STEP: "TRUE" #Sets lndp_each_step=.true. +# +#----------------------------------------------------------------------- +# +# HALO_BLEND: +# Number of rows into the computational domain that should be blended +# with the LBCs. To shut halo blending off, this can be set to zero. +# +#----------------------------------------------------------------------- +# +HALO_BLEND: "10" +# +#----------------------------------------------------------------------- +# +# USE_FVCOM: +# Flag set to update surface conditions in FV3-LAM with fields generated +# from the Finite Volume Community Ocean Model (FVCOM). This will +# replace lake/sea surface temperature, ice surface temperature, and ice +# placement. FVCOM data must already be interpolated to the desired +# FV3-LAM grid. This flag will be used in make_ics to modify sfc_data.nc +# after chgres_cube is run by running the routine process_FVCOM.exe +# +# FVCOM_WCSTART: +# Define if this is a "warm" start or a "cold" start. Setting this to +# "warm" will read in sfc_data.nc generated in a RESTART directory. +# Setting this to "cold" will read in the sfc_data.nc generated from +# chgres_cube in the make_ics portion of the workflow. +# +# FVCOM_DIR: +# User defined directory where FVCOM data already interpolated to FV3-LAM +# grid is located. File name in this path should be "fvcom.nc" to allow +# +# FVCOM_FILE: +# Name of file located in FVCOM_DIR that has FVCOM data interpolated to +# FV3-LAM grid. This file will be copied later to a new location and name +# changed to fvcom.nc +# +#------------------------------------------------------------------------ +# +USE_FVCOM: "FALSE" +FVCOM_WCSTART: "cold" +FVCOM_DIR: "/user/defined/dir/to/fvcom/data" +FVCOM_FILE: "fvcom.nc" +# +#----------------------------------------------------------------------- +# +# COMPILER: +# Type of compiler invoked during the build step. +# +#------------------------------------------------------------------------ +# +COMPILER: "intel" +# +#----------------------------------------------------------------------- +# +# KMP_AFFINITY_*: +# From Intel: "The Intel® runtime library has the ability to bind OpenMP +# threads to physical processing units. The interface is controlled using +# the KMP_AFFINITY environment variable. Depending on the system (machine) +# topology, application, and operating system, thread affinity can have a +# dramatic effect on the application speed. +# +# Thread affinity restricts execution of certain threads (virtual execution +# units) to a subset of the physical processing units in a multiprocessor +# computer. Depending upon the topology of the machine, thread affinity can +# have a dramatic effect on the execution speed of a program." 
+#
+# For more information, see the following link:
+# https://software.intel.com/content/www/us/en/develop/documentation/cpp-
+# compiler-developer-guide-and-reference/top/optimization-and-programming-
+# guide/openmp-support/openmp-library-support/thread-affinity-interface-
+# linux-and-windows.html
+#
+# OMP_NUM_THREADS_*:
+# The number of OpenMP threads to use for parallel regions.
+#
+# OMP_STACKSIZE_*:
+# Controls the size of the stack for threads created by the OpenMP
+# implementation.
+#
+# Note that settings for the make_grid task are not included below as it
+# does not use parallelized code.
+#
+#-----------------------------------------------------------------------
+#
+KMP_AFFINITY_MAKE_OROG: "disabled"
+OMP_NUM_THREADS_MAKE_OROG: "6"
+OMP_STACKSIZE_MAKE_OROG: "2048m"
+
+KMP_AFFINITY_MAKE_SFC_CLIMO: "scatter"
+OMP_NUM_THREADS_MAKE_SFC_CLIMO: "1"
+OMP_STACKSIZE_MAKE_SFC_CLIMO: "1024m"
+
+KMP_AFFINITY_MAKE_ICS: "scatter"
+OMP_NUM_THREADS_MAKE_ICS: "1"
+OMP_STACKSIZE_MAKE_ICS: "1024m"
+
+KMP_AFFINITY_MAKE_LBCS: "scatter"
+OMP_NUM_THREADS_MAKE_LBCS: "1"
+OMP_STACKSIZE_MAKE_LBCS: "1024m"
+
+KMP_AFFINITY_RUN_FCST: "scatter"
+OMP_NUM_THREADS_RUN_FCST: "2" # atmos_nthreads in model_configure
+OMP_STACKSIZE_RUN_FCST: "1024m"
+
+KMP_AFFINITY_RUN_POST: "scatter"
+OMP_NUM_THREADS_RUN_POST: "1"
+OMP_STACKSIZE_RUN_POST: "1024m"
+#
+#-----------------------------------------------------------------------
+#
diff --git a/ush/constants.py b/ush/constants.py
new file mode 100644
index 000000000..e0ff60a0b
--- /dev/null
+++ b/ush/constants.py
@@ -0,0 +1,27 @@
+#!/usr/bin/env python3
+
+#
+#-----------------------------------------------------------------------
+#
+# Mathematical and physical constants.
+#
+#-----------------------------------------------------------------------
+#
+
+# Pi.
+pi_geom = 3.14159265358979323846264338327
+
+# Degrees per radian.
+degs_per_radian = 360.0 / (2.0 * pi_geom)
+
+# Radius of the Earth in meters.
+radius_Earth = 6371200.0
+
+#
+#-----------------------------------------------------------------------
+#
+# Other.
+#
+#-----------------------------------------------------------------------
+#
+valid_vals_BOOLEAN = [True, False]
diff --git a/ush/create_diag_table_file.py b/ush/create_diag_table_file.py
new file mode 100644
index 000000000..cd98fcec7
--- /dev/null
+++ b/ush/create_diag_table_file.py
@@ -0,0 +1,79 @@
+#!/usr/bin/env python3
+
+import os
+import unittest
+from textwrap import dedent
+
+from python_utils import import_vars, set_env_var, print_input_args, \
+                         print_info_msg, print_err_msg_exit, cfg_to_yaml_str
+
+from fill_jinja_template import fill_jinja_template
+
+def create_diag_table_file(run_dir):
+    """ Creates a diagnostic table file for each cycle to be run
+
+    Args:
+        run_dir: run directory
+    Returns:
+        Boolean
+    """
+
+    print_input_args(locals())
+
+    #import all environment variables
+    import_vars()
+
+    #create a diagnostic table file within the specified run directory
+    print_info_msg(f'''
+        Creating a diagnostics table file (\"{DIAG_TABLE_FN}\") in the specified
+        run directory...
+ + run_dir = \"{run_dir}\"''', verbose=VERBOSE) + + diag_table_fp = os.path.join(run_dir, DIAG_TABLE_FN) + + print_info_msg(f''' + + Using the template diagnostics table file: + + diag_table_tmpl_fp = {DIAG_TABLE_TMPL_FP} + + to create: + + diag_table_fp = \"{diag_table_fp}\"''', verbose=VERBOSE) + + settings = { + 'starttime': CDATE, + 'cres': CRES + } + settings_str = cfg_to_yaml_str(settings) + + #call fill jinja + try: + fill_jinja_template(["-q", "-u", settings_str, "-t", DIAG_TABLE_TMPL_FP, "-o", diag_table_fp]) + except: + print_err_msg_exit(f''' + !!!!!!!!!!!!!!!!! + + fill_jinja_template.py failed! + + !!!!!!!!!!!!!!!!!''') + return False + return True + +class Testing(unittest.TestCase): + def test_create_diag_table_file(self): + path = os.path.join(os.getenv('USHDIR'), "test_data") + self.assertTrue(create_diag_table_file(run_dir=path)) + def setUp(self): + USHDIR = os.path.dirname(os.path.abspath(__file__)) + DIAG_TABLE_FN="diag_table" + DIAG_TABLE_TMPL_FP = os.path.join(USHDIR,"templates",f"{DIAG_TABLE_FN}.FV3_GFS_v15p2") + set_env_var('DEBUG',True) + set_env_var('VERBOSE',True) + set_env_var("USHDIR",USHDIR) + set_env_var("DIAG_TABLE_FN",DIAG_TABLE_FN) + set_env_var("DIAG_TABLE_TMPL_FP",DIAG_TABLE_TMPL_FP) + set_env_var("CRES","C48") + set_env_var("CDATE","2021010106") + diff --git a/ush/create_model_configure_file.py b/ush/create_model_configure_file.py new file mode 100644 index 000000000..9ae124139 --- /dev/null +++ b/ush/create_model_configure_file.py @@ -0,0 +1,243 @@ +#!/usr/bin/env python3 + +import os +import unittest +from datetime import datetime +from textwrap import dedent + +from python_utils import import_vars, set_env_var, print_input_args, \ + print_info_msg, print_err_msg_exit, lowercase, cfg_to_yaml_str + +from fill_jinja_template import fill_jinja_template + +def create_model_configure_file(cdate,run_dir,sub_hourly_post,dt_subhourly_post_mnts,dt_atmos): + """ Creates a model configuration file in the specified + run directory + + Args: + cdate: cycle date + run_dir: run directory + sub_hourly_post + dt_subhourly_post_mnts + dt_atmos + Returns: + Boolean + """ + + print_input_args(locals()) + + #import all environment variables + import_vars() + + # + #----------------------------------------------------------------------- + # + # Create a model configuration file in the specified run directory. + # + #----------------------------------------------------------------------- + # + print_info_msg(f''' + Creating a model configuration file (\"{MODEL_CONFIG_FN}\") in the specified + run directory (run_dir): + run_dir = \"{run_dir}\"''', verbose=VERBOSE) + # + # Extract from cdate the starting year, month, day, and hour of the forecast. + # + yyyy=cdate.year + mm=cdate.month + dd=cdate.day + hh=cdate.hour + # + # Set parameters in the model configure file. + # + dot_quilting_dot=f".{lowercase(str(QUILTING))}." + dot_print_esmf_dot=f".{lowercase(str(PRINT_ESMF))}." + dot_cpl_dot=f".{lowercase(str(CPL))}." + dot_write_dopost=f".{lowercase(str(WRITE_DOPOST))}." + # + #----------------------------------------------------------------------- + # + # Create a multiline variable that consists of a yaml-compliant string + # specifying the values that the jinja variables in the template + # model_configure file should be set to. 
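+    # For example, a Python boolean QUILTING = True enters "settings" as the
+    # Fortran-style string ".true." (built above as dot_quilting_dot), which
+    # is the form the model_configure template expects.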
+ # + #----------------------------------------------------------------------- + # + settings = { + 'PE_MEMBER01': PE_MEMBER01, + 'print_esmf': dot_print_esmf_dot, + 'start_year': yyyy, + 'start_month': mm, + 'start_day': dd, + 'start_hour': hh, + 'nhours_fcst': FCST_LEN_HRS, + 'dt_atmos': DT_ATMOS, + 'cpl': dot_cpl_dot, + 'atmos_nthreads': OMP_NUM_THREADS_RUN_FCST, + 'restart_interval': RESTART_INTERVAL, + 'write_dopost': dot_write_dopost, + 'quilting': dot_quilting_dot, + 'output_grid': WRTCMP_output_grid + } + # + # If the write-component is to be used, then specify a set of computational + # parameters and a set of grid parameters. The latter depends on the type + # (coordinate system) of the grid that the write-component will be using. + # + if QUILTING: + settings.update({ + 'write_groups': WRTCMP_write_groups, + 'write_tasks_per_group': WRTCMP_write_tasks_per_group, + 'cen_lon': WRTCMP_cen_lon, + 'cen_lat': WRTCMP_cen_lat, + 'lon1': WRTCMP_lon_lwr_left, + 'lat1': WRTCMP_lat_lwr_left + }) + + if WRTCMP_output_grid == "lambert_conformal": + settings.update({ + 'stdlat1': WRTCMP_stdlat1, + 'stdlat2': WRTCMP_stdlat2, + 'nx': WRTCMP_nx, + 'ny': WRTCMP_ny, + 'dx': WRTCMP_dx, + 'dy': WRTCMP_dy, + 'lon2': "", + 'lat2': "", + 'dlon': "", + 'dlat': "", + }) + elif WRTCMP_output_grid == "regional_latlon" or \ + WRTCMP_output_grid == "rotated_latlon": + settings.update({ + 'lon2': WRTCMP_lon_upr_rght, + 'lat2': WRTCMP_lat_upr_rght, + 'dlon': WRTCMP_dlon, + 'dlat': WRTCMP_dlat, + 'stdlat1': "", + 'stdlat2': "", + 'nx': "", + 'ny': "", + 'dx': "", + 'dy': "" + }) + # + # If sub_hourly_post is set to "TRUE", then the forecast model must be + # directed to generate output files on a sub-hourly interval. Do this + # by specifying the output interval in the model configuration file + # (MODEL_CONFIG_FN) in units of number of forecat model time steps (nsout). + # nsout is calculated using the user-specified output time interval + # dt_subhourly_post_mnts (in units of minutes) and the forecast model's + # main time step dt_atmos (in units of seconds). Note that nsout is + # guaranteed to be an integer because the experiment generation scripts + # require that dt_subhourly_post_mnts (after conversion to seconds) be + # evenly divisible by dt_atmos. Also, in this case, the variable output_fh + # [which specifies the output interval in hours; + # see the jinja model_config template file] is set to 0, although this + # doesn't matter because any positive of nsout will override output_fh. + # + # If sub_hourly_post is set to "FALSE", then the workflow is hard-coded + # (in the jinja model_config template file) to direct the forecast model + # to output files every hour. This is done by setting (1) output_fh to 1 + # here, and (2) nsout to -1 here which turns off output by time step interval. + # + # Note that the approach used here of separating how hourly and subhourly + # output is handled should be changed/generalized/simplified such that + # the user should only need to specify the output time interval (there + # should be no need to specify a flag like sub_hourly_post); the workflow + # should then be able to direct the model to output files with that time + # interval and to direct the post-processor to process those files + # regardless of whether that output time interval is larger than, equal + # to, or smaller than one hour. 
+ # + if sub_hourly_post: + nsout=dt_subhourly_post_mnts*60 / dt_atmos + output_fh=0 + else: + output_fh=1 + nsout=-1 + + settings.update({ + 'output_fh': output_fh, + 'nsout': nsout + }) + + settings_str = cfg_to_yaml_str(settings) + + print_info_msg(dedent(f''' + The variable \"settings\" specifying values to be used in the \"{MODEL_CONFIG_FN}\" + file has been set as follows: + #----------------------------------------------------------------------- + settings =\n''') + settings_str,verbose=VERBOSE) + # + #----------------------------------------------------------------------- + # + # Call a python script to generate the experiment's actual MODEL_CONFIG_FN + # file from the template file. + # + #----------------------------------------------------------------------- + # + model_config_fp=os.path.join(run_dir, MODEL_CONFIG_FN) + + try: + fill_jinja_template(["-q", "-u", settings_str, "-t", MODEL_CONFIG_TMPL_FP, "-o", model_config_fp]) + except: + print_err_msg_exit(f''' + Call to python script fill_jinja_template.py to create a \"{MODEL_CONFIG_FN}\" + file from a jinja2 template failed. Parameters passed to this script are: + Full path to template rocoto XML file: + MODEL_CONFIG_TMPL_FP = \"{MODEL_CONFIG_TMPL_FP}\" + Full path to output rocoto XML file: + model_config_fp = \"{model_config_fp}\" + Namelist settings specified on command line: + settings = + {settings_str}''') + return False + + return True + +class Testing(unittest.TestCase): + def test_create_model_configure_file(self): + path = os.path.join(os.getenv('USHDIR'), "test_data") + self.assertTrue(\ + create_model_configure_file( \ + run_dir=path, + cdate=datetime(2021,1,1), + sub_hourly_post=True, + dt_subhourly_post_mnts=4, + dt_atmos=1) ) + def setUp(self): + USHDIR = os.path.dirname(os.path.abspath(__file__)) + MODEL_CONFIG_FN='model_configure' + MODEL_CONFIG_TMPL_FP = os.path.join(USHDIR, "templates", MODEL_CONFIG_FN) + + set_env_var('DEBUG',True) + set_env_var('VERBOSE',True) + set_env_var('QUILTING',True) + set_env_var('PRINT_ESMF',True) + set_env_var('CPL',True) + set_env_var('WRITE_DOPOST',True) + set_env_var("USHDIR",USHDIR) + set_env_var('MODEL_CONFIG_FN',MODEL_CONFIG_FN) + set_env_var("MODEL_CONFIG_TMPL_FP",MODEL_CONFIG_TMPL_FP) + set_env_var('PE_MEMBER01',24) + set_env_var('FCST_LEN_HRS',72) + set_env_var('DT_ATMOS',1) + set_env_var('OMP_NUM_THREADS_RUN_FCST',1) + set_env_var('RESTART_INTERVAL',4) + + set_env_var('WRTCMP_write_groups',1) + set_env_var('WRTCMP_write_tasks_per_group',2) + set_env_var('WRTCMP_output_grid',"lambert_conformal") + set_env_var('WRTCMP_cen_lon',-97.5) + set_env_var('WRTCMP_cen_lat',35.0) + set_env_var('WRTCMP_stdlat1',35.0) + set_env_var('WRTCMP_stdlat2',35.0) + set_env_var('WRTCMP_nx',199) + set_env_var('WRTCMP_ny',111) + set_env_var('WRTCMP_lon_lwr_left',-121.23349066) + set_env_var('WRTCMP_lat_lwr_left',23.41731593) + set_env_var('WRTCMP_dx',3000.0) + set_env_var('WRTCMP_dy',3000.0) + + diff --git a/ush/fill_jinja_template.py b/ush/fill_jinja_template.py index 47885051b..74c348908 100755 --- a/ush/fill_jinja_template.py +++ b/ush/fill_jinja_template.py @@ -43,6 +43,7 @@ import datetime as dt import os +import sys import argparse import jinja2 as j2 @@ -155,7 +156,7 @@ def path_ok(arg): raise argparse.ArgumentTypeError(msg) -def parse_args(): +def parse_args(argv): ''' Function maintains the arguments accepted by this script. 
Please see @@ -195,7 +196,7 @@ def parse_args(): required=True, type=path_ok, ) - return parser.parse_args() + return parser.parse_args(argv) def update_dict(dest, newdict, quiet=False): @@ -231,13 +232,18 @@ def update_dict(dest, newdict, quiet=False): print('*' * 50) -def main(cla): +def fill_jinja_template(argv): ''' Loads a Jinja template, determines its necessary undefined variables, retrives them from user supplied settings, and renders the final result. ''' + # parse args + cla = parse_args(argv) + if cla.config: + cla.config = config_exists(cla.config) + # Create a Jinja Environment to load the template. env = j2.Environment(loader=j2.FileSystemLoader(cla.template)) template_source = env.loader.get_source(env, '') @@ -274,7 +280,5 @@ def main(cla): if __name__ == '__main__': - cla = parse_args() - if cla.config: - cla.config = config_exists(cla.config) - main(cla) + + fill_jinja_template(sys.argv[1:]) diff --git a/ush/generate_FV3LAM_wflow.py b/ush/generate_FV3LAM_wflow.py new file mode 100755 index 000000000..2391340c4 --- /dev/null +++ b/ush/generate_FV3LAM_wflow.py @@ -0,0 +1,1126 @@ +#!/usr/bin/env python3 + +import os +import sys +import platform +import subprocess +from multiprocessing import Process +from textwrap import dedent +from datetime import datetime,timedelta + +from python_utils import print_info_msg, print_err_msg_exit, import_vars, cp_vrfy, cd_vrfy,\ + rm_vrfy, ln_vrfy, mkdir_vrfy, mv_vrfy, run_command, date_to_str, \ + define_macos_utilities, create_symlink_to_file, check_for_preexist_dir_file, \ + cfg_to_yaml_str, find_pattern_in_str + +from setup import setup +from set_FV3nml_sfc_climo_filenames import set_FV3nml_sfc_climo_filenames +from get_crontab_contents import get_crontab_contents +from fill_jinja_template import fill_jinja_template +from set_namelist import set_namelist + +def python_error_handler(): + """ Error handler for missing packages """ + + print_err_msg_exit(''' + Errors found: check your python environment + + Instructions for setting up python environments can be found on the web: + https://github.com/ufs-community/ufs-srweather-app/wiki/Getting-Started + ''', stack_trace=False) + +# Check for non-standard python packages +try: + import jinja2 + import yaml + import f90nml +except ImportError as error: + print_info_msg(error.__class__.__name__ + ": " + str(error)) + python_error_handler() + +def generate_FV3LAM_wflow(): + """ Function to setup a forecast experiment and create a workflow + (according to the parameters specified in the config file + + Args: + None + Returns: + None + """ + + print(dedent(''' + ======================================================================== + ======================================================================== + + Starting experiment generation... + + ======================================================================== + ========================================================================''')) + + #set ushdir + ushdir = os.path.dirname(os.path.abspath(__file__)) + + #check python version + major,minor,patch = platform.python_version_tuple() + if int(major) < 3 or int(minor) < 6: + print_info_msg(f''' + + Error: python version must be 3.6 or higher + python version: {major}.{minor}''') + + #define macros + define_macos_utilities() + + # + #----------------------------------------------------------------------- + # + # Source the file that defines and then calls the setup function. 
The + # setup function in turn first sources the default configuration file + # (which contains default values for the experiment/workflow parameters) + # and then sources the user-specified configuration file (which contains + # user-specified values for a subset of the experiment/workflow parame- + # ters that override their default values). + # + #----------------------------------------------------------------------- + # + setup() + + #import all environment variables + import_vars() + + # + #----------------------------------------------------------------------- + # + # Set the full path to the experiment's rocoto workflow xml file. This + # file will be placed at the top level of the experiment directory and + # then used by rocoto to run the workflow. + # + #----------------------------------------------------------------------- + # + WFLOW_XML_FP = os.path.join(EXPTDIR, WFLOW_XML_FN) + + # + #----------------------------------------------------------------------- + # + # Create a multiline variable that consists of a yaml-compliant string + # specifying the values that the jinja variables in the template rocoto + # XML should be set to. These values are set either in the user-specified + # workflow configuration file (EXPT_CONFIG_FN) or in the setup.sh script + # sourced above. Then call the python script that generates the XML. + # + #----------------------------------------------------------------------- + # + if WORKFLOW_MANAGER == "rocoto": + + template_xml_fp = os.path.join(TEMPLATE_DIR, WFLOW_XML_FN) + + print_info_msg(f''' + Creating rocoto workflow XML file (WFLOW_XML_FP) from jinja template XML + file (template_xml_fp): + template_xml_fp = \"{template_xml_fp}\" + WFLOW_XML_FP = \"{WFLOW_XML_FP}\"''') + + ensmem_indx_name = "" + uscore_ensmem_name = "" + slash_ensmem_subdir = "" + if DO_ENSEMBLE: + ensmem_indx_name = "mem" + uscore_ensmem_name = f"_mem#{ensmem_indx_name}#" + slash_ensmem_subdir = f"/mem#{ensmem_indx_name}#" + + #get time string + d = DATE_FIRST_CYCL + timedelta(seconds=DT_ATMOS) + time_str = d.strftime("%M:%S") + + cycl_hrs_str = [ f"{c:02d}" for c in CYCL_HRS ] + cdate_first_cycl = DATE_FIRST_CYCL + timedelta(hours=CYCL_HRS[0]) + + # Dictionary of settings + settings = { + # + # Parameters needed by the job scheduler. + # + 'account': ACCOUNT, + 'sched': SCHED, + 'partition_default': PARTITION_DEFAULT, + 'queue_default': QUEUE_DEFAULT, + 'partition_hpss': PARTITION_HPSS, + 'queue_hpss': QUEUE_HPSS, + 'partition_fcst': PARTITION_FCST, + 'queue_fcst': QUEUE_FCST, + 'machine': MACHINE, + 'slurm_native_cmd': SLURM_NATIVE_CMD, + # + # Workflow task names. 
+ # + 'make_grid_tn': MAKE_GRID_TN, + 'make_orog_tn': MAKE_OROG_TN, + 'make_sfc_climo_tn': MAKE_SFC_CLIMO_TN, + 'get_extrn_ics_tn': GET_EXTRN_ICS_TN, + 'get_extrn_lbcs_tn': GET_EXTRN_LBCS_TN, + 'make_ics_tn': MAKE_ICS_TN, + 'make_lbcs_tn': MAKE_LBCS_TN, + 'run_fcst_tn': RUN_FCST_TN, + 'run_post_tn': RUN_POST_TN, + 'get_obs_ccpa_tn': GET_OBS_CCPA_TN, + 'get_obs_ndas_tn': GET_OBS_NDAS_TN, + 'get_obs_mrms_tn': GET_OBS_MRMS_TN, + 'vx_tn': VX_TN, + 'vx_gridstat_tn': VX_GRIDSTAT_TN, + 'vx_gridstat_refc_tn': VX_GRIDSTAT_REFC_TN, + 'vx_gridstat_retop_tn': VX_GRIDSTAT_RETOP_TN, + 'vx_gridstat_03h_tn': VX_GRIDSTAT_03h_TN, + 'vx_gridstat_06h_tn': VX_GRIDSTAT_06h_TN, + 'vx_gridstat_24h_tn': VX_GRIDSTAT_24h_TN, + 'vx_pointstat_tn': VX_POINTSTAT_TN, + 'vx_ensgrid_tn': VX_ENSGRID_TN, + 'vx_ensgrid_refc_tn': VX_ENSGRID_REFC_TN, + 'vx_ensgrid_retop_tn': VX_ENSGRID_RETOP_TN, + 'vx_ensgrid_03h_tn': VX_ENSGRID_03h_TN, + 'vx_ensgrid_06h_tn': VX_ENSGRID_06h_TN, + 'vx_ensgrid_24h_tn': VX_ENSGRID_24h_TN, + 'vx_ensgrid_mean_tn': VX_ENSGRID_MEAN_TN, + 'vx_ensgrid_prob_tn': VX_ENSGRID_PROB_TN, + 'vx_ensgrid_mean_03h_tn': VX_ENSGRID_MEAN_03h_TN, + 'vx_ensgrid_prob_03h_tn': VX_ENSGRID_PROB_03h_TN, + 'vx_ensgrid_mean_06h_tn': VX_ENSGRID_MEAN_06h_TN, + 'vx_ensgrid_prob_06h_tn': VX_ENSGRID_PROB_06h_TN, + 'vx_ensgrid_mean_24h_tn': VX_ENSGRID_MEAN_24h_TN, + 'vx_ensgrid_prob_24h_tn': VX_ENSGRID_PROB_24h_TN, + 'vx_ensgrid_prob_refc_tn': VX_ENSGRID_PROB_REFC_TN, + 'vx_ensgrid_prob_retop_tn': VX_ENSGRID_PROB_RETOP_TN, + 'vx_enspoint_tn': VX_ENSPOINT_TN, + 'vx_enspoint_mean_tn': VX_ENSPOINT_MEAN_TN, + 'vx_enspoint_prob_tn': VX_ENSPOINT_PROB_TN, + # + # Entity used to load the module file for each GET_OBS_* task. + # + 'get_obs': GET_OBS, + # + # Number of nodes to use for each task. + # + 'nnodes_make_grid': NNODES_MAKE_GRID, + 'nnodes_make_orog': NNODES_MAKE_OROG, + 'nnodes_make_sfc_climo': NNODES_MAKE_SFC_CLIMO, + 'nnodes_get_extrn_ics': NNODES_GET_EXTRN_ICS, + 'nnodes_get_extrn_lbcs': NNODES_GET_EXTRN_LBCS, + 'nnodes_make_ics': NNODES_MAKE_ICS, + 'nnodes_make_lbcs': NNODES_MAKE_LBCS, + 'nnodes_run_fcst': NNODES_RUN_FCST, + 'nnodes_run_post': NNODES_RUN_POST, + 'nnodes_get_obs_ccpa': NNODES_GET_OBS_CCPA, + 'nnodes_get_obs_mrms': NNODES_GET_OBS_MRMS, + 'nnodes_get_obs_ndas': NNODES_GET_OBS_NDAS, + 'nnodes_vx_gridstat': NNODES_VX_GRIDSTAT, + 'nnodes_vx_pointstat': NNODES_VX_POINTSTAT, + 'nnodes_vx_ensgrid': NNODES_VX_ENSGRID, + 'nnodes_vx_ensgrid_mean': NNODES_VX_ENSGRID_MEAN, + 'nnodes_vx_ensgrid_prob': NNODES_VX_ENSGRID_PROB, + 'nnodes_vx_enspoint': NNODES_VX_ENSPOINT, + 'nnodes_vx_enspoint_mean': NNODES_VX_ENSPOINT_MEAN, + 'nnodes_vx_enspoint_prob': NNODES_VX_ENSPOINT_PROB, + # + # Number of cores used for a task + # + 'ncores_run_fcst': PE_MEMBER01, + 'native_run_fcst': f"--cpus-per-task {OMP_NUM_THREADS_RUN_FCST} --exclusive", + # + # Number of logical processes per node for each task. If running without + # threading, this is equal to the number of MPI processes per node. 
+ # + 'ppn_make_grid': PPN_MAKE_GRID, + 'ppn_make_orog': PPN_MAKE_OROG, + 'ppn_make_sfc_climo': PPN_MAKE_SFC_CLIMO, + 'ppn_get_extrn_ics': PPN_GET_EXTRN_ICS, + 'ppn_get_extrn_lbcs': PPN_GET_EXTRN_LBCS, + 'ppn_make_ics': PPN_MAKE_ICS, + 'ppn_make_lbcs': PPN_MAKE_LBCS, + 'ppn_run_fcst': PPN_RUN_FCST, + 'ppn_run_post': PPN_RUN_POST, + 'ppn_get_obs_ccpa': PPN_GET_OBS_CCPA, + 'ppn_get_obs_mrms': PPN_GET_OBS_MRMS, + 'ppn_get_obs_ndas': PPN_GET_OBS_NDAS, + 'ppn_vx_gridstat': PPN_VX_GRIDSTAT, + 'ppn_vx_pointstat': PPN_VX_POINTSTAT, + 'ppn_vx_ensgrid': PPN_VX_ENSGRID, + 'ppn_vx_ensgrid_mean': PPN_VX_ENSGRID_MEAN, + 'ppn_vx_ensgrid_prob': PPN_VX_ENSGRID_PROB, + 'ppn_vx_enspoint': PPN_VX_ENSPOINT, + 'ppn_vx_enspoint_mean': PPN_VX_ENSPOINT_MEAN, + 'ppn_vx_enspoint_prob': PPN_VX_ENSPOINT_PROB, + # + # Maximum wallclock time for each task. + # + 'wtime_make_grid': WTIME_MAKE_GRID, + 'wtime_make_orog': WTIME_MAKE_OROG, + 'wtime_make_sfc_climo': WTIME_MAKE_SFC_CLIMO, + 'wtime_get_extrn_ics': WTIME_GET_EXTRN_ICS, + 'wtime_get_extrn_lbcs': WTIME_GET_EXTRN_LBCS, + 'wtime_make_ics': WTIME_MAKE_ICS, + 'wtime_make_lbcs': WTIME_MAKE_LBCS, + 'wtime_run_fcst': WTIME_RUN_FCST, + 'wtime_run_post': WTIME_RUN_POST, + 'wtime_get_obs_ccpa': WTIME_GET_OBS_CCPA, + 'wtime_get_obs_mrms': WTIME_GET_OBS_MRMS, + 'wtime_get_obs_ndas': WTIME_GET_OBS_NDAS, + 'wtime_vx_gridstat': WTIME_VX_GRIDSTAT, + 'wtime_vx_pointstat': WTIME_VX_POINTSTAT, + 'wtime_vx_ensgrid': WTIME_VX_ENSGRID, + 'wtime_vx_ensgrid_mean': WTIME_VX_ENSGRID_MEAN, + 'wtime_vx_ensgrid_prob': WTIME_VX_ENSGRID_PROB, + 'wtime_vx_enspoint': WTIME_VX_ENSPOINT, + 'wtime_vx_enspoint_mean': WTIME_VX_ENSPOINT_MEAN, + 'wtime_vx_enspoint_prob': WTIME_VX_ENSPOINT_PROB, + # + # Maximum number of tries for each task. + # + 'maxtries_make_grid': MAXTRIES_MAKE_GRID, + 'maxtries_make_orog': MAXTRIES_MAKE_OROG, + 'maxtries_make_sfc_climo': MAXTRIES_MAKE_SFC_CLIMO, + 'maxtries_get_extrn_ics': MAXTRIES_GET_EXTRN_ICS, + 'maxtries_get_extrn_lbcs': MAXTRIES_GET_EXTRN_LBCS, + 'maxtries_make_ics': MAXTRIES_MAKE_ICS, + 'maxtries_make_lbcs': MAXTRIES_MAKE_LBCS, + 'maxtries_run_fcst': MAXTRIES_RUN_FCST, + 'maxtries_run_post': MAXTRIES_RUN_POST, + 'maxtries_get_obs_ccpa': MAXTRIES_GET_OBS_CCPA, + 'maxtries_get_obs_mrms': MAXTRIES_GET_OBS_MRMS, + 'maxtries_get_obs_ndas': MAXTRIES_GET_OBS_NDAS, + 'maxtries_vx_gridstat': MAXTRIES_VX_GRIDSTAT, + 'maxtries_vx_gridstat_refc': MAXTRIES_VX_GRIDSTAT_REFC, + 'maxtries_vx_gridstat_retop': MAXTRIES_VX_GRIDSTAT_RETOP, + 'maxtries_vx_gridstat_03h': MAXTRIES_VX_GRIDSTAT_03h, + 'maxtries_vx_gridstat_06h': MAXTRIES_VX_GRIDSTAT_06h, + 'maxtries_vx_gridstat_24h': MAXTRIES_VX_GRIDSTAT_24h, + 'maxtries_vx_pointstat': MAXTRIES_VX_POINTSTAT, + 'maxtries_vx_ensgrid': MAXTRIES_VX_ENSGRID, + 'maxtries_vx_ensgrid_refc': MAXTRIES_VX_ENSGRID_REFC, + 'maxtries_vx_ensgrid_retop': MAXTRIES_VX_ENSGRID_RETOP, + 'maxtries_vx_ensgrid_03h': MAXTRIES_VX_ENSGRID_03h, + 'maxtries_vx_ensgrid_06h': MAXTRIES_VX_ENSGRID_06h, + 'maxtries_vx_ensgrid_24h': MAXTRIES_VX_ENSGRID_24h, + 'maxtries_vx_ensgrid_mean': MAXTRIES_VX_ENSGRID_MEAN, + 'maxtries_vx_ensgrid_prob': MAXTRIES_VX_ENSGRID_PROB, + 'maxtries_vx_ensgrid_mean_03h': MAXTRIES_VX_ENSGRID_MEAN_03h, + 'maxtries_vx_ensgrid_prob_03h': MAXTRIES_VX_ENSGRID_PROB_03h, + 'maxtries_vx_ensgrid_mean_06h': MAXTRIES_VX_ENSGRID_MEAN_06h, + 'maxtries_vx_ensgrid_prob_06h': MAXTRIES_VX_ENSGRID_PROB_06h, + 'maxtries_vx_ensgrid_mean_24h': MAXTRIES_VX_ENSGRID_MEAN_24h, + 'maxtries_vx_ensgrid_prob_24h': MAXTRIES_VX_ENSGRID_PROB_24h, + 
'maxtries_vx_ensgrid_prob_refc': MAXTRIES_VX_ENSGRID_PROB_REFC, + 'maxtries_vx_ensgrid_prob_retop': MAXTRIES_VX_ENSGRID_PROB_RETOP, + 'maxtries_vx_enspoint': MAXTRIES_VX_ENSPOINT, + 'maxtries_vx_enspoint_mean': MAXTRIES_VX_ENSPOINT_MEAN, + 'maxtries_vx_enspoint_prob': MAXTRIES_VX_ENSPOINT_PROB, + # + # Flags that specify whether to run the preprocessing or + # verification-related tasks. + # + 'run_task_make_grid': RUN_TASK_MAKE_GRID, + 'run_task_make_orog': RUN_TASK_MAKE_OROG, + 'run_task_make_sfc_climo': RUN_TASK_MAKE_SFC_CLIMO, + 'run_task_get_extrn_ics': RUN_TASK_GET_EXTRN_ICS, + 'run_task_get_extrn_lbcs': RUN_TASK_GET_EXTRN_LBCS, + 'run_task_make_ics': RUN_TASK_MAKE_ICS, + 'run_task_make_lbcs': RUN_TASK_MAKE_LBCS, + 'run_task_run_fcst': RUN_TASK_RUN_FCST, + 'run_task_run_post': RUN_TASK_RUN_POST, + 'run_task_get_obs_ccpa': RUN_TASK_GET_OBS_CCPA, + 'run_task_get_obs_mrms': RUN_TASK_GET_OBS_MRMS, + 'run_task_get_obs_ndas': RUN_TASK_GET_OBS_NDAS, + 'run_task_vx_gridstat': RUN_TASK_VX_GRIDSTAT, + 'run_task_vx_pointstat': RUN_TASK_VX_POINTSTAT, + 'run_task_vx_ensgrid': RUN_TASK_VX_ENSGRID, + 'run_task_vx_enspoint': RUN_TASK_VX_ENSPOINT, + # + # Number of physical cores per node for the current machine. + # + 'ncores_per_node': NCORES_PER_NODE, + # + # Directories and files. + # + 'jobsdir': JOBSDIR, + 'logdir': LOGDIR, + 'scriptsdir': SCRIPTSDIR, + 'cycle_basedir': CYCLE_BASEDIR, + 'global_var_defns_fp': GLOBAL_VAR_DEFNS_FP, + 'load_modules_run_task_fp': LOAD_MODULES_RUN_TASK_FP, + # + # External model information for generating ICs and LBCs. + # + 'extrn_mdl_name_ics': EXTRN_MDL_NAME_ICS, + 'extrn_mdl_name_lbcs': EXTRN_MDL_NAME_LBCS, + # + # Parameters that determine the set of cycles to run. + # + 'date_first_cycl': date_to_str(DATE_FIRST_CYCL,True), + 'date_last_cycl': date_to_str(DATE_LAST_CYCL,True), + 'cdate_first_cycl': cdate_first_cycl, + 'cycl_hrs': cycl_hrs_str, + 'cycl_freq': f"{INCR_CYCL_FREQ:02d}:00:00", + # + # Forecast length (same for all cycles). + # + 'fcst_len_hrs': FCST_LEN_HRS, + # + # Inline post + # + 'write_dopost': WRITE_DOPOST, + # + # METPlus-specific information + # + 'model': MODEL, + 'met_install_dir': MET_INSTALL_DIR, + 'met_bin_exec': MET_BIN_EXEC, + 'metplus_path': METPLUS_PATH, + 'vx_config_dir': VX_CONFIG_DIR, + 'metplus_conf': METPLUS_CONF, + 'met_config': MET_CONFIG, + 'ccpa_obs_dir': CCPA_OBS_DIR, + 'mrms_obs_dir': MRMS_OBS_DIR, + 'ndas_obs_dir': NDAS_OBS_DIR, + # + # Ensemble-related parameters. + # + 'do_ensemble': DO_ENSEMBLE, + 'num_ens_members': NUM_ENS_MEMBERS, + 'ndigits_ensmem_names': f"{NDIGITS_ENSMEM_NAMES}", + 'ensmem_indx_name': ensmem_indx_name, + 'uscore_ensmem_name': uscore_ensmem_name, + 'slash_ensmem_subdir': slash_ensmem_subdir, + # + # Parameters associated with subhourly post-processed output + # + 'sub_hourly_post': SUB_HOURLY_POST, + 'delta_min': DT_SUBHOURLY_POST_MNTS, + 'first_fv3_file_tstr': f"000:{time_str}" + } + # End of "settings" variable. + settings_str = cfg_to_yaml_str(settings) + + print_info_msg(dedent(f''' + The variable \"settings\" specifying values of the rococo XML variables + has been set as follows: + #----------------------------------------------------------------------- + settings =\n''') + settings_str, verbose=VERBOSE) + + # + # Call the python script to generate the experiment's actual XML file + # from the jinja template file. 
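+    # The flags passed to fill_jinja_template below mirror its command-line
+    # interface: "-q" runs quietly, "-u" supplies the settings as a YAML
+    # string, "-t" is the template path, and "-o" is the rendered output path
+    # (summarized here for convenience; see fill_jinja_template.py for the
+    # authoritative argument definitions).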
+ # + try: + fill_jinja_template(["-q", "-u", settings_str, "-t", template_xml_fp, "-o", WFLOW_XML_FP]) + except: + print_err_msg_exit(f''' + Call to python script fill_jinja_template.py to create a rocoto workflow + XML file from a template file failed. Parameters passed to this script + are: + Full path to template rocoto XML file: + template_xml_fp = \"{template_xml_fp}\" + Full path to output rocoto XML file: + WFLOW_XML_FP = \"{WFLOW_XML_FP}\" + Namelist settings specified on command line: + settings = + {settings_str}''') + # + #----------------------------------------------------------------------- + # + # Create a symlink in the experiment directory that points to the workflow + # (re)launch script. + # + #----------------------------------------------------------------------- + # + print_info_msg(f''' + Creating symlink in the experiment directory (EXPTDIR) that points to the + workflow launch script (WFLOW_LAUNCH_SCRIPT_FP): + EXPTDIR = \"{EXPTDIR}\" + WFLOW_LAUNCH_SCRIPT_FP = \"{WFLOW_LAUNCH_SCRIPT_FP}\"''',verbose=VERBOSE) + + create_symlink_to_file(WFLOW_LAUNCH_SCRIPT_FP, + os.path.join(EXPTDIR,WFLOW_LAUNCH_SCRIPT_FN), + False) + # + #----------------------------------------------------------------------- + # + # If USE_CRON_TO_RELAUNCH is set to TRUE, add a line to the user's cron + # table to call the (re)launch script every CRON_RELAUNCH_INTVL_MNTS mi- + # nutes. + # + #----------------------------------------------------------------------- + # + if USE_CRON_TO_RELAUNCH: + # + # Make a backup copy of the user's crontab file and save it in a file. + # + time_stamp = datetime.now().strftime("%F_%T") + crontab_backup_fp=os.path.join(EXPTDIR,f"crontab.bak.{time_stamp}") + print_info_msg(f''' + Copying contents of user cron table to backup file: + crontab_backup_fp = \"{crontab_backup_fp}\"''',verbose=VERBOSE) + + global called_from_cron + try: called_from_cron + except: called_from_cron = False + + crontab_cmd,crontab_contents = get_crontab_contents(called_from_cron=called_from_cron) + # To create the backup crontab file and add a new job to the user's + # existing cron table, use the "printf" command, not "echo", to print + # out variables. This is because "echo" will add a newline at the end + # of its output even if its input argument is a null string, resulting + # in extranous blank lines in the backup crontab file and/or the cron + # table itself. Using "printf" prevents the appearance of these blank + # lines. + run_command(f'''printf "%s" '{crontab_contents}' > "{crontab_backup_fp}"''') + # + # Below, we use "grep" to determine whether the crontab line that the + # variable CRONTAB_LINE contains is already present in the cron table. + # For that purpose, we need to escape the asterisks in the string in + # CRONTAB_LINE with backslashes. Do this next. + # + (_,crontab_line_esc_astr,_) = run_command(f'''printf "%s" '{CRONTAB_LINE}' | \ + {SED} -r -e "s%[*]%\\\\*%g"''') + # In the grep command below, the "^" at the beginning of the string + # passed to grep is a start-of-line anchor, and the "$" at the end is + # an end-of-line anchor. Thus, in order for grep to find a match on + # any given line of the cron table's contents, that line must contain + # exactly the string in the variable crontab_line_esc_astr without any + # leading or trailing characters. 
This is to eliminate situations in + # which a line in the cron table contains the string in crontab_line_esc_astr + # but is precedeeded, for example, by the comment character "#" (in which + # case cron ignores that line) and/or is followed by further commands + # that are not part of the string in crontab_line_esc_astr (in which + # case it does something more than the command portion of the string in + # crontab_line_esc_astr does). + # + if MACHINE == "WCOSS_DELL_P3": + (exit_status,grep_output,_)=run_command(f'''grep '^{crontab_line_esc_astr}$' "/u/{USER}/cron/mycrontab"''') + else: + (exit_status,grep_output,_)=run_command(f'''printf "%s" '{crontab_contents}' | grep "^{crontab_line_esc_astr}$"''') + + if exit_status == 0: + + print_info_msg(f''' + The following line already exists in the cron table and thus will not be + added: + CRONTAB_LINE = \"{CRONTAB_LINE}\"''') + + else: + + print_info_msg(f''' + Adding the following line to the user's cron table in order to automatically + resubmit SRW workflow: + CRONTAB_LINE = \"{CRONTAB_LINE}\"''',verbose=VERBOSE) + + if MACHINE == "WCOSS_DELL_P3": + run_command(f'''printf "%s\n" '{CRONTAB_LINE}' >> "/u/{USER}/cron/mycrontab"''') + else: + # Add a newline to the end of crontab_contents only if it is not empty. + # This is needed so that when CRONTAB_LINE is printed out, it appears on + # a separate line. + if crontab_contents: + crontab_contents += "\n" + run_command(f'''( printf "%s" '{crontab_contents}'; printf "%s\n" '{CRONTAB_LINE}' ) | {crontab_cmd}''') + # + #----------------------------------------------------------------------- + # + # Create the FIXam directory under the experiment directory. In NCO mode, + # this will be a symlink to the directory specified in FIXgsm, while in + # community mode, it will be an actual directory with files copied into + # it from FIXgsm. + # + #----------------------------------------------------------------------- + # + # First, consider NCO mode. + # + if RUN_ENVIR == "nco": + + ln_vrfy(f'''-fsn "{FIXgsm}" "{FIXam}"''') + # + # Resolve the target directory that the FIXam symlink points to and check + # that it exists. + # + try: + path_resolved = os.path.realpath(FIXam) + except: + path_resolved = FIXam + if not os.path.exists(path_resolved): + print_err_msg_exit(f''' + In order to be able to generate a forecast experiment in NCO mode (i.e. + when RUN_ENVIR set to \"nco\"), the path specified by FIXam after resolving + all symlinks (path_resolved) must be an existing directory (but in this + case isn't): + RUN_ENVIR = \"{RUN_ENVIR}\" + FIXam = \"{FIXam}\" + path_resolved = \"{path_resolved}\" + Please ensure that path_resolved is an existing directory and then rerun + the experiment generation script.''') + # + # Now consider community mode. + # + else: + + print_info_msg(f''' + Copying fixed files from system directory (FIXgsm) to a subdirectory + (FIXam) in the experiment directory: + FIXgsm = \"{FIXgsm}\" + FIXam = \"{FIXam}\"''',verbose=VERBOSE) + + check_for_preexist_dir_file(FIXam,"delete") + mkdir_vrfy("-p",FIXam) + mkdir_vrfy("-p",os.path.join(FIXam,"fix_co2_proj")) + + num_files=len(FIXgsm_FILES_TO_COPY_TO_FIXam) + for i in range(num_files): + fn=f"{FIXgsm_FILES_TO_COPY_TO_FIXam[i]}" + cp_vrfy(os.path.join(FIXgsm,fn), os.path.join(FIXam,fn)) + # + #----------------------------------------------------------------------- + # + # Copy MERRA2 aerosol climatology data. 
+ # + #----------------------------------------------------------------------- + # + if USE_MERRA_CLIMO: + print_info_msg(f''' + Copying MERRA2 aerosol climatology data files from system directory + (FIXaer/FIXlut) to a subdirectory (FIXclim) in the experiment directory: + FIXaer = \"{FIXaer}\" + FIXlut = \"{FIXlut}\" + FIXclim = \"{FIXclim}\"''',verbose=VERBOSE) + + check_for_preexist_dir_file(FIXclim,"delete") + mkdir_vrfy("-p", FIXclim) + + cp_vrfy(os.path.join(FIXaer,"merra2.aerclim*.nc"), FIXclim) + cp_vrfy(os.path.join(FIXlut,"optics*.dat"), FIXclim) + # + #----------------------------------------------------------------------- + # + # Copy templates of various input files to the experiment directory. + # + #----------------------------------------------------------------------- + # + print_info_msg(f''' + Copying templates of various input files to the experiment directory...''', + verbose=VERBOSE) + + print_info_msg(f''' + Copying the template data table file to the experiment directory...''', + verbose=VERBOSE) + cp_vrfy(DATA_TABLE_TMPL_FP,DATA_TABLE_FP) + + print_info_msg(f''' + Copying the template field table file to the experiment directory...''', + verbose=VERBOSE) + cp_vrfy(FIELD_TABLE_TMPL_FP,FIELD_TABLE_FP) + + print_info_msg(f''' + Copying the template NEMS configuration file to the experiment directory...''', + verbose=VERBOSE) + cp_vrfy(NEMS_CONFIG_TMPL_FP,NEMS_CONFIG_FP) + # + # Copy the CCPP physics suite definition file from its location in the + # clone of the FV3 code repository to the experiment directory (EXPT- + # DIR). + # + print_info_msg(f''' + Copying the CCPP physics suite definition XML file from its location in + the forecast model directory sturcture to the experiment directory...''', + verbose=VERBOSE) + cp_vrfy(CCPP_PHYS_SUITE_IN_CCPP_FP,CCPP_PHYS_SUITE_FP) + # + # Copy the field dictionary file from its location in the + # clone of the FV3 code repository to the experiment directory (EXPT- + # DIR). + # + print_info_msg(f''' + Copying the field dictionary file from its location in the forecast + model directory sturcture to the experiment directory...''', + verbose=VERBOSE) + cp_vrfy(FIELD_DICT_IN_UWM_FP,FIELD_DICT_FP) + # + #----------------------------------------------------------------------- + # + # Set parameters in the FV3-LAM namelist file. + # + #----------------------------------------------------------------------- + # + print_info_msg(f''' + Setting parameters in weather model's namelist file (FV3_NML_FP): + FV3_NML_FP = \"{FV3_NML_FP}\"''') + # + # Set npx and npy, which are just NX plus 1 and NY plus 1, respectively. + # These need to be set in the FV3-LAM Fortran namelist file. They represent + # the number of cell vertices in the x and y directions on the regional + # grid. + # + npx=NX+1 + npy=NY+1 + # + # For the physics suites that use RUC LSM, set the parameter kice to 9, + # Otherwise, leave it unspecified (which means it gets set to the default + # value in the forecast model). + # + # NOTE: + # May want to remove kice from FV3.input.yml (and maybe input.nml.FV3). + # + kice=None + if SDF_USES_RUC_LSM: + kice=9 + # + # Set lsoil, which is the number of input soil levels provided in the + # chgres_cube output NetCDF file. This is the same as the parameter + # nsoill_out in the namelist file for chgres_cube. [On the other hand, + # the parameter lsoil_lsm (not set here but set in input.nml.FV3 and/or + # FV3.input.yml) is the number of soil levels that the LSM scheme in the + # forecast model will run with.] 
Here, we use the same approach to set + # lsoil as the one used to set nsoill_out in exregional_make_ics.sh. + # See that script for details. + # + # NOTE: + # May want to remove lsoil from FV3.input.yml (and maybe input.nml.FV3). + # Also, may want to set lsm here as well depending on SDF_USES_RUC_LSM. + # + lsoil=4 + if ( EXTRN_MDL_NAME_ICS == "HRRR" or \ + EXTRN_MDL_NAME_ICS == "RAP" ) and \ + ( SDF_USES_RUC_LSM ): + lsoil=9 + # + # Create a multiline variable that consists of a yaml-compliant string + # specifying the values that the namelist variables that are physics- + # suite-independent need to be set to. Below, this variable will be + # passed to a python script that will in turn set the values of these + # variables in the namelist file. + # + # IMPORTANT: + # If we want a namelist variable to be removed from the namelist file, + # in the "settings" variable below, we need to set its value to the + # string "null". This is equivalent to setting its value to + # !!python/none + # in the base namelist file specified by FV3_NML_BASE_SUITE_FP or the + # suite-specific yaml settings file specified by FV3_NML_YAML_CONFIG_FP. + # + # It turns out that setting the variable to an empty string also works + # to remove it from the namelist! Which is better to use?? + # + settings = {} + settings['atmos_model_nml'] = { + 'blocksize': BLOCKSIZE, + 'ccpp_suite': CCPP_PHYS_SUITE + } + settings['fv_core_nml'] = { + 'target_lon': LON_CTR, + 'target_lat': LAT_CTR, + 'nrows_blend': HALO_BLEND, + # + # Question: + # For a ESGgrid type grid, what should stretch_fac be set to? This depends + # on how the FV3 code uses the stretch_fac parameter in the namelist file. + # Recall that for a ESGgrid, it gets set in the function set_gridparams_ESGgrid(.sh) + # to something like 0.9999, but is it ok to set it to that here in the + # FV3 namelist file? + # + 'stretch_fac': STRETCH_FAC, + 'npx': npx, + 'npy': npy, + 'layout': [LAYOUT_X, LAYOUT_Y], + 'bc_update_interval': LBC_SPEC_INTVL_HRS + } + settings['gfs_physics_nml'] = { + 'kice': kice or None, + 'lsoil': lsoil or None, + 'do_shum': DO_SHUM, + 'do_sppt': DO_SPPT, + 'do_skeb': DO_SKEB, + 'do_spp': DO_SPP, + 'n_var_spp': N_VAR_SPP, + 'n_var_lndp': N_VAR_LNDP, + 'lndp_type': LNDP_TYPE, + 'lndp_each_step': LSM_SPP_EACH_STEP, + 'fhcyc': FHCYC_LSM_SPP_OR_NOT + } + # + # Add to "settings" the values of those namelist variables that specify + # the paths to fixed files in the FIXam directory. As above, these namelist + # variables are physcs-suite-independent. + # + # Note that the array FV3_NML_VARNAME_TO_FIXam_FILES_MAPPING contains + # the mapping between the namelist variables and the names of the files + # in the FIXam directory. Here, we loop through this array and process + # each element to construct each line of "settings". + # + dummy_run_dir=os.path.join(EXPTDIR,"any_cyc") + if DO_ENSEMBLE: + dummy_run_dir=os.path.join(dummy_run_dir,"any_ensmem") + + regex_search="^[ ]*([^| ]+)[ ]*[|][ ]*([^| ]+)[ ]*$" + num_nml_vars=len(FV3_NML_VARNAME_TO_FIXam_FILES_MAPPING) + namsfc_dict = {} + for i in range(num_nml_vars): + + mapping=f"{FV3_NML_VARNAME_TO_FIXam_FILES_MAPPING[i]}" + tup = find_pattern_in_str(regex_search, mapping) + nml_var_name = tup[0] + FIXam_fn = tup[1] + + fp="\"\"" + if FIXam_fn: + fp=os.path.join(FIXam,FIXam_fn) + # + # If not in NCO mode, for portability and brevity, change fp so that it + # is a relative path (relative to any cycle directory immediately under + # the experiment directory). 
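+    # For example (hypothetical paths): a fixed file at
+    # /path/to/expt/fix_am/global_glacier.2x2.grb referenced from the dummy
+    # cycle directory /path/to/expt/any_cyc becomes
+    # "../fix_am/global_glacier.2x2.grb" after the conversion below.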
+ # + if RUN_ENVIR != "nco": + fp = os.path.relpath(os.path.realpath(fp), start=dummy_run_dir) + # + # Add a line to the variable "settings" that specifies (in a yaml-compliant + # format) the name of the current namelist variable and the value it should + # be set to. + # + namsfc_dict[nml_var_name] = fp + # + # Add namsfc_dict to settings + # + settings['namsfc'] = namsfc_dict + # + # Use netCDF4 when running the North American 3-km domain due to file size. + # + if PREDEF_GRID_NAME == "RRFS_NA_3km": + settings['fms2_io_nml'] = { + 'netcdf_default_format': 'netcdf4' + } + # + # Add the relevant tendency-based stochastic physics namelist variables to + # "settings" when running with SPPT, SHUM, or SKEB turned on. If running + # with SPP or LSM SPP, set the "new_lscale" variable. Otherwise only + # include an empty "nam_stochy" stanza. + # + nam_stochy_dict = {} + if DO_SPPT: + nam_stochy_dict.update({ + 'iseed_sppt': ISEED_SPPT, + 'new_lscale': NEW_LSCALE, + 'sppt': SPPT_MAG, + 'sppt_logit': SPPT_LOGIT, + 'sppt_lscale': SPPT_LSCALE, + 'sppt_sfclimit': SPPT_SFCLIMIT, + 'sppt_tau': SPPT_TSCALE, + 'spptint': SPPT_INT, + 'use_zmtnblck': USE_ZMTNBLCK + }) + + if DO_SHUM: + nam_stochy_dict.update({ + 'iseed_shum': ISEED_SHUM, + 'new_lscale': NEW_LSCALE, + 'shum': SHUM_MAG, + 'shum_lscale': SHUM_LSCALE, + 'shum_tau': SHUM_TSCALE, + 'shumint': SHUM_INT + }) + + if DO_SKEB: + nam_stochy_dict.update({ + 'iseed_skeb': ISEED_SKEB, + 'new_lscale': NEW_LSCALE, + 'skeb': SKEB_MAG, + 'skeb_lscale': SKEB_LSCALE, + 'skebnorm': SKEBNORM, + 'skeb_tau': SKEB_TSCALE, + 'skebint': SKEB_INT, + 'skeb_vdof': SKEB_VDOF + }) + + if DO_SPP or DO_LSM_SPP: + nam_stochy_dict.update({ + 'new_lscale': NEW_LSCALE + }) + + settings['nam_stochy'] = nam_stochy_dict + # + # Add the relevant SPP namelist variables to "settings" when running with + # SPP turned on. Otherwise only include an empty "nam_sppperts" stanza. + # + nam_sppperts_dict = {} + if DO_SPP: + nam_sppperts_dict = { + 'iseed_spp': ISEED_SPP, + 'spp_lscale': SPP_LSCALE, + 'spp_prt_list': SPP_MAG_LIST, + 'spp_sigtop1': SPP_SIGTOP1, + 'spp_sigtop2': SPP_SIGTOP2, + 'spp_stddev_cutoff': SPP_STDDEV_CUTOFF, + 'spp_tau': SPP_TSCALE, + 'spp_var_list': SPP_VAR_LIST + } + + settings['nam_sppperts'] = nam_sppperts_dict + # + # Add the relevant LSM SPP namelist variables to "settings" when running with + # LSM SPP turned on. + # + nam_sfcperts_dict = {} + if DO_LSM_SPP: + nam_sfcperts_dict = { + 'lndp_type': LNDP_TYPE, + 'lndp_tau': LSM_SPP_TSCALE, + 'lndp_lscale': LSM_SPP_LSCALE, + 'iseed_lndp': ISEED_LSM_SPP, + 'lndp_var_list': LSM_SPP_VAR_LIST, + 'lndp_prt_list': LSM_SPP_MAG_LIST + } + + settings['nam_sfcperts'] = nam_sfcperts_dict + + settings_str = cfg_to_yaml_str(settings) + + print_info_msg(dedent(f''' + The variable \"settings\" specifying values of the weather model's + namelist variables has been set as follows: + + settings =\n''') + settings_str, verbose=VERBOSE) + # + #----------------------------------------------------------------------- + # + # Call the set_namelist.py script to create a new FV3 namelist file (full + # path specified by FV3_NML_FP) using the file FV3_NML_BASE_SUITE_FP as + # the base (i.e. starting) namelist file, with physics-suite-dependent + # modifications to the base file specified in the yaml configuration file + # FV3_NML_YAML_CONFIG_FP (for the physics suite specified by CCPP_PHYS_SUITE), + # and with additional physics-suite-independent modificaitons specified + # in the variable "settings" set above. 
+ # + #----------------------------------------------------------------------- + # + try: + set_namelist(["-q", "-n", FV3_NML_BASE_SUITE_FP, "-c", FV3_NML_YAML_CONFIG_FP, + CCPP_PHYS_SUITE, "-u", settings_str, "-o", FV3_NML_FP]) + except: + print_err_msg_exit(f''' + Call to python script set_namelist.py to generate an FV3 namelist file + failed. Parameters passed to this script are: + Full path to base namelist file: + FV3_NML_BASE_SUITE_FP = \"{FV3_NML_BASE_SUITE_FP}\" + Full path to yaml configuration file for various physics suites: + FV3_NML_YAML_CONFIG_FP = \"{FV3_NML_YAML_CONFIG_FP}\" + Physics suite to extract from yaml configuration file: + CCPP_PHYS_SUITE = \"{CCPP_PHYS_SUITE}\" + Full path to output namelist file: + FV3_NML_FP = \"{FV3_NML_FP}\" + Namelist settings specified on command line: + settings = + {settings_str}''') + # + # If not running the MAKE_GRID_TN task (which implies the workflow will + # use pregenerated grid files), set the namelist variables specifying + # the paths to surface climatology files. These files are located in + # (or have symlinks that point to them) in the FIXLAM directory. + # + # Note that if running the MAKE_GRID_TN task, this action usually cannot + # be performed here but must be performed in that task because the names + # of the surface climatology files depend on the CRES parameter (which is + # the C-resolution of the grid), and this parameter is in most workflow + # configurations is not known until the grid is created. + # + if not RUN_TASK_MAKE_GRID: + + set_FV3nml_sfc_climo_filenames() + # + #----------------------------------------------------------------------- + # + # To have a record of how this experiment/workflow was generated, copy + # the experiment/workflow configuration file to the experiment directo- + # ry. + # + #----------------------------------------------------------------------- + # + cp_vrfy(os.path.join(USHDIR,EXPT_CONFIG_FN), EXPTDIR) + # + #----------------------------------------------------------------------- + # + # For convenience, print out the commands that need to be issued on the + # command line in order to launch the workflow and to check its status. + # Also, print out the line that should be placed in the user's cron table + # in order for the workflow to be continually resubmitted. + # + #----------------------------------------------------------------------- + # + if WORKFLOW_MANAGER == "rocoto": + wflow_db_fn=f"{os.path.splitext(WFLOW_XML_FN)[0]}.db" + rocotorun_cmd=f"rocotorun -w {WFLOW_XML_FN} -d {wflow_db_fn} -v 10" + rocotostat_cmd=f"rocotostat -w {WFLOW_XML_FN} -d {wflow_db_fn} -v 10" + + print_info_msg(f''' + ======================================================================== + ======================================================================== + + Experiment generation completed. The experiment directory is: + + EXPTDIR=\"{EXPTDIR}\" + + ======================================================================== + ======================================================================== + ''') + # + #----------------------------------------------------------------------- + # + # If rocoto is required, print instructions on how to load and use it + # + #----------------------------------------------------------------------- + # + if WORKFLOW_MANAGER == "rocoto": + + print_info_msg(f''' + To launch the workflow, first ensure that you have a compatible version + of rocoto available. 
For most pre-configured platforms, rocoto can be + loaded via a module: + + > module load rocoto + + For more details on rocoto, see the User's Guide. + + To launch the workflow, first ensure that you have a compatible version + of rocoto loaded. For example, to load version 1.3.1 of rocoto, use + + > module load rocoto/1.3.1 + + (This version has been tested on hera; later versions may also work but + have not been tested.) + + To launch the workflow, change location to the experiment directory + (EXPTDIR) and issue the rocotrun command, as follows: + + > cd {EXPTDIR} + > {rocotorun_cmd} + + To check on the status of the workflow, issue the rocotostat command + (also from the experiment directory): + + > {rocotostat_cmd} + + Note that: + + 1) The rocotorun command must be issued after the completion of each + task in the workflow in order for the workflow to submit the next + task(s) to the queue. + + 2) In order for the output of the rocotostat command to be up-to-date, + the rocotorun command must be issued immediately before issuing the + rocotostat command. + + For automatic resubmission of the workflow (say every 3 minutes), the + following line can be added to the user's crontab (use \"crontab -e\" to + edit the cron table): + + */3 * * * * cd {EXPTDIR} && ./launch_FV3LAM_wflow.sh called_from_cron=\"TRUE\" + ''') + # + # If necessary, run the NOMADS script to source external model data. + # + if NOMADS: + print("Getting NOMADS online data") + print(f"NOMADS_file_type= {NOMADS_file_type}") + cd_vrfy(EXPTDIR) + NOMADS_script = os.path.join(USHDIR, "NOMADS_get_extrn_mdl_files.h") + run_command(f'''{NOMADS_script} {date_to_str(DATE_FIRST_CYCL,True)} \ + {CYCL_HRS} {NOMADS_file_type} {FCST_LEN_HRS} {LBC_SPEC_INTVL_HRS}''') + + +# +#----------------------------------------------------------------------- +# +# Start of the script that will call the experiment/workflow generation +# function defined above. +# +#----------------------------------------------------------------------- +# +if __name__ == "__main__": + # + #----------------------------------------------------------------------- + # + # Set directories. + # + #----------------------------------------------------------------------- + # + ushdir=os.path.dirname(os.path.abspath(__file__)) + # + # Set the name of and full path to the temporary file in which we will + # save some experiment/workflow variables. The need for this temporary + # file is explained below. + # + tmp_fn="tmp" + tmp_fp=os.path.join(ushdir, tmp_fn) + rm_vrfy("-f",tmp_fp) + # + # Set the name of and full path to the log file in which the output from + # the experiment/workflow generation function will be saved. + # + log_fn="log.generate_FV3LAM_wflow" + log_fp=os.path.join(ushdir, log_fn) + rm_vrfy("-f",log_fp) + # + # Call the generate_FV3LAM_wflow function defined above to generate the + # experiment/workflow. Note that we pipe the output of the function + # (and possibly other commands) to the "tee" command in order to be able + # to both save it to a file and print it out to the screen (stdout). + # The piping causes the call to the function (and the other commands + # grouped with it using the curly braces, { ... }) to be executed in a + # subshell. As a result, the experiment/workflow variables that the + # function sets are not available outside of the grouping, i.e. they are + # not available at and after the call to "tee". 
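The point being made here can be shown with a small standalone sketch: because the workflow generation is executed in a separate process (so that its output can be captured by "tee"), ordinary Python variables assigned there are not visible to the parent, and any results the parent needs must be written to a file and read back. The file name, directory, and values below are placeholders.

```
import os
from multiprocessing import Process

TMP_FP = "tmp_expt_vars"  # placeholder, analogous to tmp_fp below

def child():
    # Plain variable assignments made here are lost when the process exits,
    # so results the parent needs are written to a temporary file instead.
    exptdir = "/path/to/expt_dirs/my_expt"  # placeholder
    retval = 0
    with open(TMP_FP, "w") as f:
        f.write(f"{exptdir}\n{retval}\n")

if __name__ == "__main__":
    p = Process(target=child)
    p.start()
    p.join()
    with open(TMP_FP) as f:
        exptdir, retval = f.read().split()
    os.remove(TMP_FP)
    print(exptdir, int(retval))
```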
Since some of these va- + # riables are needed after the call to "tee" below, we save them in a + # temporary file and read them in outside the subshell later below. + # + def workflow_func(): + retval=1 + generate_FV3LAM_wflow() + retval=0 + run_command(f'''echo "{EXPTDIR}" >> "{tmp_fp}"''') + run_command(f'''echo "{retval}" >> "{tmp_fp}"''') + + # create tee functionality + tee = subprocess.Popen(["tee", log_fp], stdin=subprocess.PIPE) + os.dup2(tee.stdin.fileno(), sys.stdout.fileno()) + os.dup2(tee.stdin.fileno(), sys.stderr.fileno()) + + #create workflow process + p = Process(target=workflow_func) + p.start() + p.join() + + # + # Read in experiment/workflow variables needed later below from the tem- + # porary file created in the subshell above containing the call to the + # generate_FV3LAM_wflow function. These variables are not directly + # available here because the call to generate_FV3LAM_wflow above takes + # place in a subshell (due to the fact that we are then piping its out- + # put to the "tee" command). Then remove the temporary file. + # + (_,exptdir,_)=run_command(f'''sed "1q;d" "{tmp_fp}"''') + (_,retval,_)=run_command(f''' sed "2q;d" "{tmp_fp}"''') + if retval: + retval = int(retval) + else: + retval = 1 + rm_vrfy(tmp_fp) + # + # If the call to the generate_FV3LAM_wflow function above was success- + # ful, move the log file in which the "tee" command saved the output of + # the function to the experiment directory. + # + if retval == 0: + mv_vrfy(log_fp,exptdir) + # + # If the call to the generate_FV3LAM_wflow function above was not suc- + # cessful, print out an error message and exit with a nonzero return + # code. + # + else: + print_err_msg_exit(f''' + Experiment generation failed. Check the log file from the ex- + periment/workflow generation script in the file specified by log_fp: + log_fp = \"{log_fp}\" + Stopping.''') + diff --git a/ush/generate_FV3LAM_wflow.sh b/ush/generate_FV3LAM_wflow.sh index 83282b717..8af7497b0 100755 --- a/ush/generate_FV3LAM_wflow.sh +++ b/ush/generate_FV3LAM_wflow.sh @@ -193,6 +193,7 @@ file (template_xml_fp): 'partition_fcst': ${PARTITION_FCST} 'queue_fcst': ${QUEUE_FCST} 'machine': ${MACHINE} + 'slurm_native_cmd': ${SLURM_NATIVE_CMD} # # Workflow task names. # diff --git a/ush/get_crontab_contents.py b/ush/get_crontab_contents.py new file mode 100644 index 000000000..cbb434c69 --- /dev/null +++ b/ush/get_crontab_contents.py @@ -0,0 +1,83 @@ +#!/usr/bin/env python3 + +import os +import unittest +from datetime import datetime + +from python_utils import import_vars, set_env_var, print_input_args, \ + run_command, define_macos_utilities, check_var_valid_value +from constants import valid_vals_BOOLEAN + +def get_crontab_contents(called_from_cron): + """ + #----------------------------------------------------------------------- + # + # This function returns the contents of the user's + # cron table as well as the command to use to manipulate the cron table + # (i.e. the "crontab" command, but on some platforms the version or + # location of this may change depending on other circumstances, e.g. on + # Cheyenne, this depends on whether a script that wants to call "crontab" + # is itself being called from a cron job). Arguments are as follows: + # + # called_from_cron: + # Boolean flag that specifies whether this function (and the scripts or + # functions that are calling it) are called as part of a cron job. Must + # be set to "TRUE" or "FALSE". 
+ # + # outvarname_crontab_cmd: + # Name of the output variable that will contain the command to issue for + # the system "crontab" command. + # + # outvarname_crontab_contents: + # Name of the output variable that will contain the contents of the + # user's cron table. + # + #----------------------------------------------------------------------- + """ + + print_input_args(locals()) + + #import all env vars + IMPORTS = ["MACHINE", "USER"] + import_vars(env_vars=IMPORTS) + + # + # Make sure called_from_cron is set to a valid value. + # + check_var_valid_value(called_from_cron, valid_vals_BOOLEAN) + + if MACHINE == "WCOSS_DELL_P3": + __crontab_cmd__="" + (_,__crontab_contents__,_)=run_command(f'''cat "/u/{USER}/cron/mycrontab"''') + else: + __crontab_cmd__="crontab" + # + # On Cheyenne, simply typing "crontab" will launch the crontab command + # at "/glade/u/apps/ch/opt/usr/bin/crontab". This is a containerized + # version of crontab that will work if called from scripts that are + # themselves being called as cron jobs. In that case, we must instead + # call the system version of crontab at /usr/bin/crontab. + # + if MACHINE == "CHEYENNE": + if called_from_cron: + __crontab_cmd__="/usr/bin/crontab" + (_,__crontab_contents__,_)=run_command(f'''{__crontab_cmd__} -l''') + # + # On Cheyenne, the output of the "crontab -l" command contains a 3-line + # header (comments) at the top that is not actually part of the user's + # cron table. This needs to be removed to avoid adding an unnecessary + # copy of this header to the user's cron table. + # + if MACHINE == "CHEYENNE": + (_,__crontab_contents__,_)=run_command(f'''printf "%s" "{__crontab_contents__}" | tail -n +4 ''') + + return __crontab_cmd__, __crontab_contents__ + +class Testing(unittest.TestCase): + def test_get_crontab_contents(self): + crontab_cmd,crontab_contents = get_crontab_contents(called_from_cron=True) + self.assertEqual(crontab_cmd, "crontab") + def setUp(self): + define_macos_utilities(); + set_env_var('DEBUG',False) + set_env_var('MACHINE', 'HERA') diff --git a/ush/launch_FV3LAM_wflow.sh b/ush/launch_FV3LAM_wflow.sh index ddee812fa..9184cb0e0 100755 --- a/ush/launch_FV3LAM_wflow.sh +++ b/ush/launch_FV3LAM_wflow.sh @@ -149,11 +149,11 @@ expt_name="${EXPT_SUBDIR}" # #----------------------------------------------------------------------- # -env_fp="${SR_WX_APP_TOP_DIR}/env/${WFLOW_ENV_FN}" -source "${env_fp}" || print_err_msg_exit "\ -Sourcing platform-specific environment file (env_fp) for the workflow +module use "${SR_WX_APP_TOP_DIR}/modulefiles" +module load "${WFLOW_MOD_FN}" > /dev/null 2>&1 || print_err_msg_exit "\ +Loading platform-specific module file (WFLOW_MOD_FN) for the workflow task failed: - env_fp = \"${env_fp}\"" + WFLOW_MOD_FN = \"${WFLOW_MOD_FN}\"" # #----------------------------------------------------------------------- # diff --git a/ush/link_fix.py b/ush/link_fix.py new file mode 100644 index 000000000..9788a4ad4 --- /dev/null +++ b/ush/link_fix.py @@ -0,0 +1,391 @@ +#!/usr/bin/env python3 + +import unittest +import os +import glob + +from python_utils import import_vars, set_env_var, print_input_args, \ + print_info_msg, print_err_msg_exit, create_symlink_to_file, \ + define_macos_utilities, check_var_valid_value, \ + cd_vrfy, mkdir_vrfy, find_pattern_in_str + +def link_fix(verbose, file_group): + """ This file defines a function that ... 
+ Args: + verbose: True or False + file_group: could be on of ["grid", "orog", "sfc_climo"] + Returns: + a string: resolution + """ + + print_input_args(locals()) + + valid_vals_file_group=["grid", "orog", "sfc_climo"] + check_var_valid_value(file_group, valid_vals_file_group) + + #import all environement variables + import_vars() + + # + #----------------------------------------------------------------------- + # + # Create symlinks in the FIXLAM directory pointing to the grid files. + # These symlinks are needed by the make_orog, make_sfc_climo, make_ic, + # make_lbc, and/or run_fcst tasks. + # + # Note that we check that each target file exists before attempting to + # create symlinks. This is because the "ln" command will create sym- + # links to non-existent targets without returning with a nonzero exit + # code. + # + #----------------------------------------------------------------------- + # + print_info_msg(f'Creating links in the FIXLAM directory to the grid files...', + verbose=verbose) + # + #----------------------------------------------------------------------- + # + # Create globbing patterns for grid, orography, and surface climatology + # files. + # + # + # For grid files (i.e. file_group set to "grid"), symlinks are created + # in the FIXLAM directory to files (of the same names) in the GRID_DIR. + # These symlinks/files and the reason each is needed is listed below: + # + # 1) "C*.mosaic.halo${NHW}.nc" + # This mosaic file for the wide-halo grid (i.e. the grid with a ${NHW}- + # cell-wide halo) is needed as an input to the orography filtering + # executable in the orography generation task. The filtering code + # extracts from this mosaic file the name of the file containing the + # grid on which it will generate filtered topography. Note that the + # orography generation and filtering are both performed on the wide- + # halo grid. The filtered orography file on the wide-halo grid is then + # shaved down to obtain the filtered orography files with ${NH3}- and + # ${NH4}-cell-wide halos. + # + # The raw orography generation step in the make_orog task requires the + # following symlinks/files: + # + # a) C*.mosaic.halo${NHW}.nc + # The script for the make_orog task extracts the name of the grid + # file from this mosaic file; this name should be + # "C*.grid.tile${TILE_RGNL}.halo${NHW}.nc". + # + # b) C*.grid.tile${TILE_RGNL}.halo${NHW}.nc + # This is the + # The script for the make_orog task passes the name of the grid + # file (extracted above from the mosaic file) to the orography + # generation executable. The executable then + # reads in this grid file and generates a raw orography + # file on the grid. The raw orography file is initially renamed "out.oro.nc", + # but for clarity, it is then renamed "C*.raw_orog.tile${TILE_RGNL}.halo${NHW}.nc". + # + # c) The fixed files thirty.second.antarctic.new.bin, landcover30.fixed, + # and gmted2010.30sec.int. + # + # The orography filtering step in the make_orog task requires the + # following symlinks/files: + # + # a) C*.mosaic.halo${NHW}.nc + # This is the mosaic file for the wide-halo grid. The orography + # filtering executable extracts from this file the name of the grid + # file containing the wide-halo grid (which should be + # "${CRES}.grid.tile${TILE_RGNL}.halo${NHW}.nc"). The executable then + # looks for this grid file IN THE DIRECTORY IN WHICH IT IS RUNNING. + # Thus, before running the executable, the script creates a symlink in this run directory that + # points to the location of the actual wide-halo grid file. 
+ # + # b) C*.raw_orog.tile${TILE_RGNL}.halo${NHW}.nc + # This is the raw orography file on the wide-halo grid. The script + # for the make_orog task copies this file to a new file named + # "C*.filtered_orog.tile${TILE_RGNL}.halo${NHW}.nc" that will be + # used as input to the orography filtering executable. The executable + # will then overwrite the contents of this file with the filtered orography. + # Thus, the output of the orography filtering executable will be + # the file C*.filtered_orog.tile${TILE_RGNL}.halo${NHW}.nc. + # + # The shaving step in the make_orog task requires the following: + # + # a) C*.filtered_orog.tile${TILE_RGNL}.halo${NHW}.nc + # This is the filtered orography file on the wide-halo grid. + # This gets shaved down to two different files: + # + # i) ${CRES}.oro_data.tile${TILE_RGNL}.halo${NH0}.nc + # This is the filtered orography file on the halo-0 grid. + # + # ii) ${CRES}.oro_data.tile${TILE_RGNL}.halo${NH4}.nc + # This is the filtered orography file on the halo-4 grid. + # + # Note that the file names of the shaved files differ from that of + # the initial unshaved file on the wide-halo grid in that the field + # after ${CRES} is now "oro_data" (not "filtered_orog") to comply + # with the naming convention used more generally. + # + # 2) "C*.mosaic.halo${NH4}.nc" + # This mosaic file for the grid with a 4-cell-wide halo is needed as + # an input to the surface climatology generation executable. The + # surface climatology generation code reads from this file the number + # of tiles (which should be 1 for a regional grid) and the tile names. + # More importantly, using the ESMF function ESMF_GridCreateMosaic(), + # it creates a data object of type esmf_grid; the grid information + # in this object is obtained from the grid file specified in the mosaic + # file, which should be "C*.grid.tile${TILE_RGNL}.halo${NH4}.nc". The + # dimensions specified in this grid file must match the ones specified + # in the (filtered) orography file "C*.oro_data.tile${TILE_RGNL}.halo${NH4}.nc" + # that is also an input to the surface climatology generation executable. + # If they do not, then the executable will crash with an ESMF library + # error (something like "Arguments are incompatible"). 
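As a quick, optional sanity check of this consistency requirement, the dimensions of the grid and orography files can be inspected with the netCDF4 package before the surface climatology step is run. The file names below are placeholders, and exactly which dimension pairs must agree depends on the file conventions, so the sketch only prints the dimensions of each file for manual comparison.

```
from netCDF4 import Dataset  # assumes the netCDF4 python package is installed

grid_fp = "C403_grid.tile7.halo4.nc"      # placeholder grid file
oro_fp = "C403_oro_data.tile7.halo4.nc"   # placeholder orography file

for fp in (grid_fp, oro_fp):
    with Dataset(fp) as nc:
        dims = {name: dim.size for name, dim in nc.dimensions.items()}
        print(fp, dims)
```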
+ # + # Thus, for the make_sfc_climo task, the following symlinks/files must + # exist: + # a) "C*.mosaic.halo${NH4}.nc" + # b) "C*.grid.tile${TILE_RGNL}.halo${NH4}.nc" + # c) "C*.oro_data.tile${TILE_RGNL}.halo${NH4}.nc" + # + # 3) + # + # + #----------------------------------------------------------------------- + # + # + if file_group == "grid": + fns=[ + f"C*{DOT_OR_USCORE}mosaic.halo{NHW}.nc", + f"C*{DOT_OR_USCORE}mosaic.halo{NH4}.nc", + f"C*{DOT_OR_USCORE}mosaic.halo{NH3}.nc", + f"C*{DOT_OR_USCORE}grid.tile{TILE_RGNL}.halo{NHW}.nc", + f"C*{DOT_OR_USCORE}grid.tile{TILE_RGNL}.halo{NH3}.nc", + f"C*{DOT_OR_USCORE}grid.tile{TILE_RGNL}.halo{NH4}.nc" + ] + fps=[ os.path.join(GRID_DIR,itm) for itm in fns] + run_task=f"{RUN_TASK_MAKE_GRID}" + # + elif file_group == "orog": + fns=[ + f"C*{DOT_OR_USCORE}oro_data.tile{TILE_RGNL}.halo{NH0}.nc", + f"C*{DOT_OR_USCORE}oro_data.tile{TILE_RGNL}.halo{NH4}.nc" + ] + if CCPP_PHYS_SUITE == "FV3_HRRR": + fns+=[ + f"C*{DOT_OR_USCORE}oro_data_ss.tile{TILE_RGNL}.halo{NH0}.nc", + f"C*{DOT_OR_USCORE}oro_data_ls.tile{TILE_RGNL}.halo{NH0}.nc", + ] + fps=[ os.path.join(OROG_DIR,itm) for itm in fns] + run_task=f"{RUN_TASK_MAKE_OROG}" + # + # The following list of symlinks (which have the same names as their + # target files) need to be created made in order for the make_ics and + # make_lbcs tasks (i.e. tasks involving chgres_cube) to work. + # + elif file_group == "sfc_climo": + num_fields=len(SFC_CLIMO_FIELDS) + fns=[None] * (2 * num_fields) + for i in range(num_fields): + ii=2*i + fns[ii]=f"C*.{SFC_CLIMO_FIELDS[i]}.tile{TILE_RGNL}.halo{NH0}.nc" + fns[ii+1]=f"C*.{SFC_CLIMO_FIELDS[i]}.tile{TILE_RGNL}.halo{NH4}.nc" + fps=[ os.path.join(SFC_CLIMO_DIR,itm) for itm in fns] + run_task=f"{RUN_TASK_MAKE_SFC_CLIMO}" + # + + # + #----------------------------------------------------------------------- + # + # Find all files matching the globbing patterns and make sure that they + # all have the same resolution (an integer) in their names. + # + #----------------------------------------------------------------------- + # + i=0 + res_prev="" + res="" + fp_prev="" + + for pattern in fps: + files = glob.glob(pattern) + for fp in files: + + fn = os.path.basename(fp) + + regex_search = "^C([0-9]*).*" + res = find_pattern_in_str(regex_search, fn) + if res is None: + print_err_msg_exit(f''' + The resolution could not be extracted from the current file's name. The + full path to the file (fp) is: + fp = \"{fp}\" + This may be because fp contains the * globbing character, which would + imply that no files were found that match the globbing pattern specified + in fp.''') + else: + res = res[0] + + if ( i > 0 ) and ( res != res_prev ): + print_err_msg_exit(f''' + The resolutions (as obtained from the file names) of the previous and + current file (fp_prev and fp, respectively) are different: + fp_prev = \"{fp_prev}\" + fp = \"{fp}\" + Please ensure that all files have the same resolution.''') + + i=i+1 + fp_prev=f"{fp}" + res_prev=res + # + #----------------------------------------------------------------------- + # + # Replace the * globbing character in the set of globbing patterns with + # the resolution. This will result in a set of (full paths to) specific + # files. + # + #----------------------------------------------------------------------- + # + fps=[ itm.replace('*',res) for itm in fps] + # + #----------------------------------------------------------------------- + # + # In creating the various symlinks below, it is convenient to work in + # the FIXLAM directory. 
We will change directory back to the original + # later below. + # + #----------------------------------------------------------------------- + # + SAVE_DIR=os.getcwd() + cd_vrfy(FIXLAM) + # + #----------------------------------------------------------------------- + # + # Use the set of full file paths generated above as the link targets to + # create symlinks to these files in the FIXLAM directory. + # + #----------------------------------------------------------------------- + # + # If the task in consideration (which will be one of the pre-processing + # tasks MAKE_GRID_TN, MAKE_OROG_TN, and MAKE_SFC_CLIMO_TN) was run, then + # the target files will be located under the experiment directory. In + # this case, we use relative symlinks in order the experiment directory + # more portable and the symlinks more readable. However, if the task + # was not run, then pregenerated grid, orography, or surface climatology + # files will be used, and those will be located in an arbitrary directory + # (specified by the user) that is somwehere outside the experiment + # directory. Thus, in this case, there isn't really an advantage to using + # relative symlinks, so we use symlinks with absolute paths. + # + if run_task: + relative_link_flag=True + else: + relative_link_flag=False + + for fp in fps: + fn=os.path.basename(fp) + create_symlink_to_file(fp,fn,relative_link_flag) + # + #----------------------------------------------------------------------- + # + # Set the C-resolution based on the resolution appearing in the file + # names. + # + #----------------------------------------------------------------------- + # + cres=f"C{res}" + # + #----------------------------------------------------------------------- + # + # If considering grid files, create a symlink to the halo4 grid file + # that does not contain the halo size in its name. This is needed by + # the tasks that generate the initial and lateral boundary condition + # files. + # + #----------------------------------------------------------------------- + # + if file_group == "grid": + + target=f"{cres}{DOT_OR_USCORE}grid.tile{TILE_RGNL}.halo{NH4}.nc" + symlink=f"{cres}{DOT_OR_USCORE}grid.tile{TILE_RGNL}.nc" + create_symlink_to_file(target,symlink,True) + # + # The surface climatology file generation code looks for a grid file + # having a name of the form "C${GFDLgrid_RES}_grid.tile7.halo4.nc" (i.e. + # the C-resolution used in the name of this file is the number of grid + # points per horizontal direction per tile, just like in the global model). + # Thus, if we are running the MAKE_SFC_CLIMO_TN task, if the grid is of + # GFDLgrid type, and if we are not using GFDLgrid_RES in filenames (i.e. + # we are using the equivalent global uniform grid resolution instead), + # then create a link whose name uses the GFDLgrid_RES that points to the + # link whose name uses the equivalent global uniform resolution. + # + if RUN_TASK_MAKE_SFC_CLIMO and \ + GRID_GEN_METHOD == "GFDLgrid" and \ + not GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES: + target=f"{cres}{DOT_OR_USCORE}grid.tile{TILE_RGNL}.halo{NH4}.nc" + symlink=f"C{GFDLgrid_RES}{DOT_OR_USCORE}grid.tile{TILE_RGNL}.nc" + create_symlink_to_file(target,symlink,relative) + # + #----------------------------------------------------------------------- + # + # If considering surface climatology files, create symlinks to the surface + # climatology files that do not contain the halo size in their names. + # These are needed by the task that generates the initial condition files. 
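The relative-versus-absolute choice described above can be sketched with a simplified stand-in for create_symlink_to_file; all paths here are placeholders. When the target was produced by one of the workflow's own pre-processing tasks (and therefore lives under the experiment directory), a relative link keeps the experiment portable; for pregenerated files in an arbitrary external directory, an absolute link is used instead.

```
import os

def make_link(target, link_name, relative):
    """Create link_name pointing at target, optionally as a relative symlink."""
    link_dir = os.path.dirname(os.path.abspath(link_name))
    src = os.path.relpath(target, start=link_dir) if relative else os.path.abspath(target)
    if os.path.lexists(link_name):
        os.remove(link_name)
    # Note: like "ln", this succeeds even if the target does not exist,
    # which is why the workflow checks for the target files beforehand.
    os.symlink(src, link_name)

# Pregenerated file outside the experiment directory: absolute link.
make_link("/some/pregen/dir/C403_oro_data.tile7.halo4.nc",
          "C403_oro_data.tile7.halo4.nc", relative=False)
```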
+ # + #----------------------------------------------------------------------- + # + if file_group == "sfc_climo": + + tmp=[ f"{cres}.{itm}" for itm in SFC_CLIMO_FIELDS] + fns_sfc_climo_with_halo_in_fn=[ f"{itm}.tile{TILE_RGNL}.halo{NH4}.nc" for itm in tmp] + fns_sfc_climo_no_halo_in_fn=[ f"{itm}.tile{TILE_RGNL}.nc" for itm in tmp] + + for i in range(num_fields): + target=f"{fns_sfc_climo_with_halo_in_fn[i]}" + symlink=f"{fns_sfc_climo_no_halo_in_fn[i]}" + create_symlink_to_file(target, symlink, True) + # + # In order to be able to specify the surface climatology file names in + # the forecast model's namelist file, in the FIXLAM directory a symlink + # must be created for each surface climatology field that has "tile1" in + # its name (and no "halo") and which points to the corresponding "tile7.halo0" + # file. + # + tmp=[ f"{cres}.{itm}" for itm in SFC_CLIMO_FIELDS ] + fns_sfc_climo_tile7_halo0_in_fn=[ f"{itm}.tile{TILE_RGNL}.halo{NH0}.nc" for itm in tmp ] + fns_sfc_climo_tile1_no_halo_in_fn=[ f"{itm}.tile1.nc" for itm in tmp ] + + for i in range(num_fields): + target=f"{fns_sfc_climo_tile7_halo0_in_fn[i]}" + symlink=f"{fns_sfc_climo_tile1_no_halo_in_fn[i]}" + create_symlink_to_file(target,symlink,True) + # + #----------------------------------------------------------------------- + # + # Change directory back to original one. + # + #----------------------------------------------------------------------- + # + cd_vrfy(SAVE_DIR) + + return res + +class Testing(unittest.TestCase): + def test_link_fix(self): + res = link_fix(verbose=True, file_group="grid") + self.assertTrue( res == "3357") + def setUp(self): + define_macos_utilities() + TEST_DIR = os.path.join(os.path.dirname(os.path.abspath(__file__)), "test_data"); + FIXLAM = os.path.join(TEST_DIR, "expt", "fix_lam") + mkdir_vrfy("-p",FIXLAM) + set_env_var("FIXLAM",FIXLAM) + set_env_var("DOT_OR_USCORE","_") + set_env_var("TILE_RGNL",7) + set_env_var("NH0",0) + set_env_var("NHW",6) + set_env_var("NH4",4) + set_env_var("NH3",3) + set_env_var("GRID_DIR",TEST_DIR + os.sep + "RRFS_CONUS_3km") + set_env_var("RUN_TASK_MAKE_GRID","FALSE") + set_env_var("OROG_DIR",TEST_DIR + os.sep + "RRFS_CONUS_3km") + set_env_var("RUN_TASK_MAKE_OROG","FALSE") + set_env_var("SFC_CLIMO_DIR",TEST_DIR + os.sep + "RRFS_CONUS_3km") + set_env_var("RUN_TASK_MAKE_SFC_CLIMO","FALSE") + set_env_var("CCPP_PHYS_SUITE","FV3_GSD_SAR") diff --git a/ush/load_modules_run_task.sh b/ush/load_modules_run_task.sh index 56c005faa..7a9547fe0 100755 --- a/ush/load_modules_run_task.sh +++ b/ush/load_modules_run_task.sh @@ -86,18 +86,19 @@ jjob_fp="$2" # #----------------------------------------------------------------------- # -# Sourcing ufs-srweather-app build env file +# Loading ufs-srweather-app build module files # #----------------------------------------------------------------------- # machine=$(echo_lowercase $MACHINE) -env_fp="${SR_WX_APP_TOP_DIR}/env/${BUILD_ENV_FN}" -module use "${SR_WX_APP_TOP_DIR}/env" -source "${env_fp}" || print_err_msg_exit "\ -Sourcing platform- and compiler-specific environment file (env_fp) for the + +source "${SR_WX_APP_TOP_DIR}/etc/lmod-setup.sh" +module use "${SR_WX_APP_TOP_DIR}/modulefiles" +module load "${BUILD_MOD_FN}" || print_err_msg_exit "\ +Sourcing platform- and compiler-specific module file (BUILD_MOD_FN) for the workflow task specified by task_name failed: task_name = \"${task_name}\" - env_fp = \"${env_fp}\"" + BUILD_MOD_FN = \"${BUILD_MOD_FN}\"" # #----------------------------------------------------------------------- # @@ -112,7 
+113,7 @@ workflow task specified by task_name failed: # # The regional_workflow repository contains module files for the # workflow tasks in the template rocoto XML file for the FV3-LAM work- -# flow that need modules not loaded in the env_fn above. +# flow that need modules not loaded in the BUILD_MOD_FN above. # # The full path to a module file for a given task is # diff --git a/ush/machine/cheyenne.sh b/ush/machine/cheyenne.sh index 6cf61061a..cf80db74f 100644 --- a/ush/machine/cheyenne.sh +++ b/ush/machine/cheyenne.sh @@ -46,12 +46,13 @@ QUEUE_HPSS=${QUEUE_HPSS:-"regular"} QUEUE_FCST=${QUEUE_FCST:-"regular"} # UFS SRW App specific paths -FIXgsm=${FIXgsm:-"/glade/p/ral/jntp/UFS_CAM/fix/fix_am"} -FIXaer=${FIXaer:-"/glade/p/ral/jntp/UFS_CAM/fix/fix_aer"} -FIXlut=${FIXlut:-"/glade/p/ral/jntp/UFS_CAM/fix/fix_lut"} -TOPO_DIR=${TOPO_DIR:-"/glade/p/ral/jntp/UFS_CAM/fix/fix_orog"} -SFC_CLIMO_INPUT_DIR=${SFC_CLIMO_INPUT_DIR:-"/glade/p/ral/jntp/UFS_CAM/fix/climo_fields_netcdf"} -FIXLAM_NCO_BASEDIR=${FIXLAM_NCO_BASEDIR:-"/needs/to/be/specified"} +staged_data_dir="/glade/p/ral/jntp/UFS_SRW_App/develop" +FIXgsm=${FIXgsm:-"${staged_data_dir}/fix/fix_am"} +FIXaer=${FIXaer:-"${staged_data_dir}/fix/fix_aer"} +FIXlut=${FIXlut:-"${staged_data_dir}/fix/fix_lut"} +TOPO_DIR=${TOPO_DIR:-"${staged_data_dir}/fix/fix_orog"} +SFC_CLIMO_INPUT_DIR=${SFC_CLIMO_INPUT_DIR:-"${staged_data_dir}/fix/fix_sfc_climo"} +DOMAIN_PREGEN_BASEDIR=${DOMAIN_PREGEN_BASEDIR:-"${staged_data_dir}/FV3LAM_pregen"} # Run commands for executables RUN_CMD_SERIAL="time" @@ -62,12 +63,12 @@ RUN_CMD_POST='mpirun -np $nprocs' # MET/METplus-Related Paths MET_INSTALL_DIR=${MET_INSTALL_DIR:-"/glade/p/ral/jntp/MET/MET_releases/10.0.0"} METPLUS_PATH=${METPLUS_PATH:-"/glade/p/ral/jntp/MET/METplus/METplus-4.0.0"} -CCPA_OBS_DIR=${CCPA_OBS_DIR:-"/glade/p/ral/jntp/UFS_SRW_app/develop/obs_data/ccpa/proc"} -MRMS_OBS_DIR=${MRMS_OBS_DIR:-"/glade/p/ral/jntp/UFS_SRW_app/develop/obs_data/mrms/proc"} -NDAS_OBS_DIR=${NDAS_OBS_DIR:-"/glade/p/ral/jntp/UFS_SRW_app/develop/obs_data/ndas/proc"} +CCPA_OBS_DIR=${CCPA_OBS_DIR:-"${staged_data_dir}/obs_data/ccpa/proc"} +MRMS_OBS_DIR=${MRMS_OBS_DIR:-"${staged_data_dir}/obs_data/mrms/proc"} +NDAS_OBS_DIR=${NDAS_OBS_DIR:-"${staged_data_dir}/obs_data/ndas/proc"} MET_BIN_EXEC=${MET_BIN_EXEC:-"bin"} # Test Data Locations -TEST_PREGEN_BASEDIR="/glade/p/ral/jntp/UFS_SRW_app/FV3LAM_pregen" -TEST_COMINgfs="/glade/p/ral/jntp/UFS_SRW_app/COMGFS" -TEST_EXTRN_MDL_SOURCE_BASEDIR="/glade/p/ral/jntp/UFS_SRW_app/staged_extrn_mdl_files" +TEST_COMIN="${staged_data_dir}/COMGFS" +TEST_PREGEN_BASEDIR="${staged_data_dir}/FV3LAM_pregen" +TEST_EXTRN_MDL_SOURCE_BASEDIR="${staged_data_dir}/input_model_data" diff --git a/ush/machine/gaea.sh b/ush/machine/gaea.sh new file mode 100755 index 000000000..37f0838e6 --- /dev/null +++ b/ush/machine/gaea.sh @@ -0,0 +1,70 @@ +#!/bin/bash + +function file_location() { + + # Return the default location of external model files on disk + + local external_file_fmt external_model location + + external_model=${1} + external_file_fmt=${2} + + case ${external_model} in + + "FV3GFS") + location='/lustre/f2/dev/Mark.Potts/EPIC/SRW/model_data/FV3GFS/${yyyymmdd}${hh}' + ;; + + esac + echo ${location:-} +} + + +EXTRN_MDL_SYSBASEDIR_ICS=${EXTRN_MDL_SYSBASEDIR_ICS:-$(file_location \ + ${EXTRN_MDL_NAME_ICS} \ + ${FV3GFS_FILE_FMT_ICS})} +EXTRN_MDL_SYSBASEDIR_LBCS=${EXTRN_MDL_SYSBASEDIR_LBCS:-$(file_location \ + ${EXTRN_MDL_NAME_LBCS} \ + ${FV3GFS_FILE_FMT_ICS})} + +# System scripts to source to initialize various commands 
within workflow +# scripts (e.g. "module"). +if [ -z ${ENV_INIT_SCRIPTS_FPS:-""} ]; then + ENV_INIT_SCRIPTS_FPS=( "/etc/profile" ) +fi + + +# Commands to run at the start of each workflow task. +PRE_TASK_CMDS='{ ulimit -s unlimited; ulimit -a; }' + +# Architecture information +WORKFLOW_MANAGER="rocoto" +SLURM_NATIVE_CMD="-M c3" +NCORES_PER_NODE=${NCORES_PER_NODE:-32} +SCHED=${SCHED:-"slurm"} +QUEUE_DEFAULT=${QUEUE_DEFAULT:-"normal"} +QUEUE_HPSS=${QUEUE_DEFAULT:-"normal"} +QUEUE_FCST=${QUEUE_DEFAULT:-"normal"} +WTIME_MAKE_LBCS="00:60:00" + +# UFS SRW App specific paths +staged_data_dir="/lustre/f2/dev/Mark.Potts/EPIC/UFS_SRW_App/develop" +FIXgsm=${FIXgsm:-"${staged_data_dir}/fix/fix_am"} +FIXaer=${FIXaer:-"${staged_data_dir}/fix/fix_aer"} +FIXlut=${FIXlut:-"${staged_data_dir}/fix/fix_lut"} +TOPO_DIR=${TOPO_DIR:-"${staged_data_dir}/fix/fix_orog"} +SFC_CLIMO_INPUT_DIR=${SFC_CLIMO_INPUT_DIR:-"${staged_data_dir}/fix/fix_sfc_climo"} +TEST_EXTRN_MDL_SOURCE_BASEDIR="${staged_data_dir}/input_model_data" + +RUN_CMD_SERIAL="time" +#Run Commands currently differ for GNU/openmpi +#RUN_CMD_UTILS='mpirun --mca btl tcp,vader,self -np $nprocs' +#RUN_CMD_FCST='mpirun --mca btl tcp,vader,self -np ${PE_MEMBER01}' +#RUN_CMD_POST='mpirun --mca btl tcp,vader,self -np $nprocs' +RUN_CMD_UTILS='srun --mpi=pmi2 -n $nprocs' +RUN_CMD_FCST='srun --mpi=pmi2 -n ${PE_MEMBER01}' +RUN_CMD_POST='srun --mpi=pmi2 -n $nprocs' + +# MET Installation Locations +# MET Plus is not yet supported on gaea +# Test Data Locations diff --git a/ush/machine/hera.sh b/ush/machine/hera.sh index bb0f77313..454a1e938 100644 --- a/ush/machine/hera.sh +++ b/ush/machine/hera.sh @@ -28,6 +28,8 @@ EXTRN_MDL_SYSBASEDIR_LBCS=${EXTRN_MDL_SYSBASEDIR_LBCS:-$(file_location \ ${EXTRN_MDL_NAME_LBCS} \ ${FV3GFS_FILE_FMT_LBCS})} +EXTRN_MDL_DATA_STORES=${EXTRN_MDL_DATA_STORES:-"hpss aws nomads"} + # System scripts to source to initialize various commands within workflow # scripts (e.g. "module"). 
if [ -z ${ENV_INIT_SCRIPTS_FPS:-""} ]; then @@ -49,12 +51,13 @@ PARTITION_FCST=${PARTITION_FCST:-"hera"} QUEUE_FCST=${QUEUE_FCST:-"batch"} # UFS SRW App specific paths -FIXgsm=${FIXgsm:-"/scratch1/NCEPDEV/global/glopara/fix/fix_am"} -FIXaer=${FIXaer:-"/scratch1/NCEPDEV/global/glopara/fix/fix_aer"} -FIXlut=${FIXlut:-"/scratch1/NCEPDEV/global/glopara/fix/fix_lut"} -TOPO_DIR=${TOPO_DIR:-"/scratch1/NCEPDEV/global/glopara/fix/fix_orog"} -SFC_CLIMO_INPUT_DIR=${SFC_CLIMO_INPUT_DIR:-"/scratch1/NCEPDEV/global/glopara/fix/fix_sfc_climo"} -FIXLAM_NCO_BASEDIR=${FIXLAM_NCO_BASEDIR:-"/scratch2/BMC/det/FV3LAM_pregen"} +staged_data_dir="/scratch2/BMC/det/UFS_SRW_App/develop" +FIXgsm=${FIXgsm:-"${staged_data_dir}/fix/fix_am"} +FIXaer=${FIXaer:-"${staged_data_dir}/fix/fix_aer"} +FIXlut=${FIXlut:-"${staged_data_dir}/fix/fix_lut"} +TOPO_DIR=${TOPO_DIR:-"${staged_data_dir}/fix/fix_orog"} +SFC_CLIMO_INPUT_DIR=${SFC_CLIMO_INPUT_DIR:-"${staged_data_dir}/fix/fix_sfc_climo"} +DOMAIN_PREGEN_BASEDIR=${DOMAIN_PREGEN_BASEDIR:-"${staged_data_dir}/FV3LAM_pregen"} # Run commands for executables RUN_CMD_SERIAL="time" @@ -65,14 +68,14 @@ RUN_CMD_POST="srun" # MET/METplus-Related Paths MET_INSTALL_DIR=${MET_INSTALL_DIR:-"/contrib/met/10.0.0"} METPLUS_PATH=${METPLUS_PATH:-"/contrib/METplus/METplus-4.0.0"} -CCPA_OBS_DIR=${CCPA_OBS_DIR:-"/scratch2/BMC/det/UFS_SRW_app/develop/obs_data/ccpa/proc"} -MRMS_OBS_DIR=${MRMS_OBS_DIR:-"/scratch2/BMC/det/UFS_SRW_app/develop/obs_data/mrms/proc"} -NDAS_OBS_DIR=${NDAS_OBS_DIR:-"/scratch2/BMC/det/UFS_SRW_app/develop/obs_data/ndas/proc"} +CCPA_OBS_DIR=${CCPA_OBS_DIR:-"${staged_data_dir}/obs_data/ccpa/proc"} +MRMS_OBS_DIR=${MRMS_OBS_DIR:-"${staged_data_dir}/obs_data/mrms/proc"} +NDAS_OBS_DIR=${NDAS_OBS_DIR:-"${staged_data_dir}/obs_data/ndas/proc"} MET_BIN_EXEC=${MET_BIN_EXEC:-"bin"} # Test Data Locations -TEST_PREGEN_BASEDIR="/scratch2/BMC/det/UFS_SRW_app/FV3LAM_pregen" -TEST_COMINgfs="/scratch2/NCEPDEV/fv3-cam/noscrub/UFS_SRW_App/COMGFS" -TEST_EXTRN_MDL_SOURCE_BASEDIR="/scratch2/BMC/det/UFS_SRW_app/develop/model_data" +TEST_COMIN="${staged_data_dir}/COMGFS" +TEST_PREGEN_BASEDIR="${staged_data_dir}/FV3LAM_pregen" +TEST_EXTRN_MDL_SOURCE_BASEDIR="${staged_data_dir}/input_model_data" TEST_ALT_EXTRN_MDL_SYSBASEDIR_ICS="/scratch2/BMC/det/UFS_SRW_app/dummy_FV3GFS_sys_dir" TEST_ALT_EXTRN_MDL_SYSBASEDIR_LBCS="/scratch2/BMC/det/UFS_SRW_app/dummy_FV3GFS_sys_dir" diff --git a/ush/machine/jet.sh b/ush/machine/jet.sh index dac90d549..f383090ec 100644 --- a/ush/machine/jet.sh +++ b/ush/machine/jet.sh @@ -44,6 +44,8 @@ EXTRN_MDL_SYSBASEDIR_LBCS=${EXTRN_MDL_SYSBASEDIR_LBCS:-$(file_location \ ${EXTRN_MDL_NAME_LBCS} \ ${FV3GFS_FILE_FMT_LBCS})} +EXTRN_MDL_DATA_STORES=${EXTRN_MDL_DATA_STORES:-"hpss aws nomads"} + # System scripts to source to initialize various commands within workflow # scripts (e.g. "module"). 
if [ -z ${ENV_INIT_SCRIPTS_FPS:-""} ]; then @@ -65,12 +67,13 @@ PARTITION_FCST=${PARTITION_FCST:-"sjet,vjet,kjet,xjet"} QUEUE_FCST=${QUEUE_FCST:-"batch"} # UFS SRW App specific paths -FIXgsm=${FIXgsm:-"/lfs4/HFIP/hfv3gfs/glopara/git/fv3gfs/fix/fix_am"} -FIXaer=${FIXaer:-"/lfs4/HFIP/hfv3gfs/glopara/git/fv3gfs/fix/fix_aer"} -FIXlut=${FIXlut:-"/lfs4/HFIP/hfv3gfs/glopara/git/fv3gfs/fix/fix_lut"} -TOPO_DIR=${TOPO_DIR:-"/lfs4/HFIP/hfv3gfs/glopara/git/fv3gfs/fix/fix_orog"} -SFC_CLIMO_INPUT_DIR=${SFC_CLIMO_INPUT_DIR:-"/lfs4/HFIP/hfv3gfs/glopara/git/fv3gfs/fix/fix_sfc_climo"} -FIXLAM_NCO_BASEDIR=${FIXLAM_NCO_BASEDIR:-"/mnt/lfs4/BMC/wrfruc/FV3-LAM/pregen"} +staged_data_dir="/mnt/lfs4/BMC/wrfruc/UFS_SRW_App/develop" +FIXgsm=${FIXgsm:-"${staged_data_dir}/fix/fix_am"} +FIXaer=${FIXaer:-"${staged_data_dir}/fix/fix_aer"} +FIXlut=${FIXlut:-"${staged_data_dir}/fix/fix_lut"} +TOPO_DIR=${TOPO_DIR:-"${staged_data_dir}/fix/fix_orog"} +SFC_CLIMO_INPUT_DIR=${SFC_CLIMO_INPUT_DIR:-"${staged_data_dir}/fix/fix_sfc_climo"} +DOMAIN_PREGEN_BASEDIR=${DOMAIN_PREGEN_BASEDIR:-"${staged_data_dir}/FV3LAM_pregen"} # Run commands for executables RUN_CMD_SERIAL="time" @@ -79,6 +82,6 @@ RUN_CMD_FCST="srun" RUN_CMD_POST="srun" # Test Data Locations -TEST_PREGEN_BASEDIR="/mnt/lfs4/BMC/wrfruc/UFS_SRW_app/FV3LAM_pregen" -TEST_COMINgfs="/mnt/lfs4/BMC/wrfruc/UFS_SRW_app/COMGFS" -TEST_EXTRN_MDL_SOURCE_BASEDIR="/mnt/lfs4/BMC/wrfruc/UFS_SRW_app/staged_extrn_mdl_files" +TEST_COMIN="${staged_data_dir}/COMGFS" +TEST_PREGEN_BASEDIR="${staged_data_dir}/FV3LAM_pregen" +TEST_EXTRN_MDL_SOURCE_BASEDIR="${staged_data_dir}/input_model_data" diff --git a/ush/machine/noaacloud.sh b/ush/machine/noaacloud.sh index fe4276661..49299c325 100755 --- a/ush/machine/noaacloud.sh +++ b/ush/machine/noaacloud.sh @@ -1,6 +1,5 @@ -#!/bin/bash +#!/bin/bash -set -x function file_location() { @@ -16,16 +15,13 @@ function file_location() { "FV3GFS") location='/contrib/GST/model_data/FV3GFS/${yyyymmdd}${hh}' ;; - *) - print_info_msg"\ - External model \'${external_model}\' does not have a default - location on Hera. Will try to pull from HPSS" - ;; esac echo ${location:-} } +export OPT=/contrib/EPIC/hpc-modules +export PATH=${PATH}:/contrib/GST/miniconda/envs/regional_workflow/bin EXTRN_MDL_SYSBASEDIR_ICS=${EXTRN_MDL_SYSBASEDIR_ICS:-$(file_location \ ${EXTRN_MDL_NAME_ICS} \ @@ -34,6 +30,8 @@ EXTRN_MDL_SYSBASEDIR_LBCS=${EXTRN_MDL_SYSBASEDIR_LBCS:-$(file_location \ ${EXTRN_MDL_NAME_LBCS} \ ${FV3GFS_FILE_FMT_ICS})} +EXTRN_MDL_DATA_STORES=${EXTRN_MDL_DATA_STORES:-"aws nomads"} + # System scripts to source to initialize various commands within workflow # scripts (e.g. "module"). 
if [ -z ${ENV_INIT_SCRIPTS_FPS:-""} ]; then @@ -50,22 +48,23 @@ NCORES_PER_NODE=${NCORES_PER_NODE:-36} SCHED=${SCHED:-"slurm"} # UFS SRW App specific paths -FIXgsm=${FIXgsm:-"/contrib/EPIC/fix/fix_am"} -FIXaer=${FIXaer:-"/contrib/EPIC/fix/fix_aer"} -FIXlut=${FIXlut:-"/contrib/EPIC/fix/fix_lut"} -TOPO_DIR=${TOPO_DIR:-"/contrib/EPIC/fix/fix_orog"} -SFC_CLIMO_INPUT_DIR=${SFC_CLIMO_INPUT_DIR:-"/contrib/EPIC/fix/fix_sfc_climo"} -FIXLAM_NCO_BASEDIR=${FIXLAM_NCO_BASEDIR:-"/scratch2/BMC/det/FV3LAM_pregen"} +staged_data_dir="/contrib/EPIC/UFS_SRW_App/develop" +FIXgsm=${FIXgsm:-"${staged_data_dir}/fix/fix_am"} +FIXaer=${FIXaer:-"${staged_data_dir}/fix/fix_aer"} +FIXlut=${FIXlut:-"${staged_data_dir}/fix/fix_lut"} +TOPO_DIR=${TOPO_DIR:-"${staged_data_dir}/fix/fix_orog"} +SFC_CLIMO_INPUT_DIR=${SFC_CLIMO_INPUT_DIR:-"${staged_data_dir}/fix/fix_sfc_climo"} +TEST_EXTRN_MDL_SOURCE_BASEDIR="${staged_data_dir}/input_model_data" RUN_CMD_SERIAL="time" #Run Commands currently differ for GNU/openmpi #RUN_CMD_UTILS='mpirun --mca btl tcp,vader,self -np $nprocs' #RUN_CMD_FCST='mpirun --mca btl tcp,vader,self -np ${PE_MEMBER01}' #RUN_CMD_POST='mpirun --mca btl tcp,vader,self -np $nprocs' -RUN_CMD_UTILS='srun --mpi=pmi2 -n $nprocs' -RUN_CMD_FCST='srun --mpi=pmi2 -n ${PE_MEMBER01}' -RUN_CMD_POST='srun --mpi=pmi2 -n $nprocs' +RUN_CMD_UTILS='mpiexec -np $nprocs' +RUN_CMD_FCST='mpiexec -np ${PE_MEMBER01}' +RUN_CMD_POST='mpiexec -np $nprocs' # MET Installation Locations # MET Plus is not yet supported on noaacloud - +. /contrib/EPIC/.bash_conda diff --git a/ush/machine/odin.sh b/ush/machine/odin.sh index 1bceaa873..58d360a63 100644 --- a/ush/machine/odin.sh +++ b/ush/machine/odin.sh @@ -9,16 +9,26 @@ function file_location() { external_model=${1} external_file_fmt=${2} + staged_data_dir="/scratch/ywang/UFS_SRW_App/develop" + location="" case ${external_model} in "GSMGFS") - location='/scratch/ywang/EPIC/GDAS/2019053000_mem001' + location="${staged_data_dir}/input_model_data/GFS" ;; "FV3GFS") - location='/scratch/ywang/test_runs/FV3_regional/gfs/${yyyymmdd}' + location="${staged_data_dir}/input_model_data/FV3GFS" + ;; + "HRRR") + location="${staged_data_dir}/input_model_data/HRRR" + ;; + "RAP") + location="${staged_data_dir}/input_model_data/RAP" + ;; + "NAM") + location="${staged_data_dir}/input_model_data/NAM" ;; - esac echo ${location:-} @@ -52,15 +62,19 @@ PARTITION_FCST=${PARTITION_FCST:-"workq"} QUEUE_FCST=${QUEUE_FCST:-"workq"} # UFS SRW App specific paths -FIXgsm=${FIXgsm:-"/scratch/ywang/fix/theia_fix/fix_am"} -FIXaer=${FIXaer:-"/scratch/ywang/fix/theia_fix/fix_aer"} -FIXlut=${FIXlut:-"/scratch/ywang/fix/theia_fix/fix_lut"} -TOPO_DIR=${TOPO_DIR:-"/scratch/ywang/fix/theia_fix/fix_orog"} -SFC_CLIMO_INPUT_DIR=${SFC_CLIMO_INPUT_DIR:-"/scratch/ywang/fix/climo_fields_netcdf"} -FIXLAM_NCO_BASEDIR=${FIXLAM_NCO_BASEDIR:-"/needs/to/be/specified"} +FIXgsm=${FIXgsm:-"${staged_data_dir}/fix/fix_am"} +FIXaer=${FIXaer:-"${staged_data_dir}/fix/fix_aer"} +FIXlut=${FIXlut:-"${staged_data_dir}/fix/fix_lut"} +TOPO_DIR=${TOPO_DIR:-"${staged_data_dir}/fix/fix_orog"} +SFC_CLIMO_INPUT_DIR=${SFC_CLIMO_INPUT_DIR:-"${staged_data_dir}/fix/fix_sfc_climo"} +DOMAIN_PREGEN_BASEDIR=${DOMAIN_PREGEN_BASEDIR:-"${staged_data_dir}/FV3LAM_pregen"} # Run commands for executables RUN_CMD_SERIAL="srun -n 1" RUN_CMD_UTILS='srun -n $nprocs' RUN_CMD_FCST='srun -n ${PE_MEMBER01}' RUN_CMD_POST="srun -n 1" + +# Test Data Locations +TEST_PREGEN_BASEDIR="${staged_data_dir}/FV3LAM_pregen" +TEST_EXTRN_MDL_SOURCE_BASEDIR="${staged_data_dir}/input_model_data" diff 
--git a/ush/machine/orion.sh b/ush/machine/orion.sh index 5c872410d..b8032e241 100644 --- a/ush/machine/orion.sh +++ b/ush/machine/orion.sh @@ -22,6 +22,8 @@ EXTRN_MDL_SYSBASEDIR_LBCS=${EXTRN_MDL_SYSBASEDIR_LBCS:-$(file_location \ ${EXTRN_MDL_NAME_LBCS} \ ${FV3GFS_FILE_FMT_LBCS})} +EXTRN_MDL_DATA_STORES=${EXTRN_MDL_DATA_STORES:-"aws nomads"} + # System scripts to source to initialize various commands within workflow # scripts (e.g. "module"). if [ -z ${ENV_INIT_SCRIPTS_FPS:-""} ]; then @@ -43,12 +45,13 @@ PARTITION_FCST=${PARTITION_FCST:-"orion"} QUEUE_FCST=${QUEUE_FCST:-"batch"} # UFS SRW App specific paths -FIXgsm=${FIXgsm:-"/work/noaa/global/glopara/fix/fix_am"} -FIXaer=${FIXaer:-"/work/noaa/global/glopara/fix/fix_aer"} -FIXlut=${FIXlut:-"/work/noaa/global/glopara/fix/fix_lut"} -TOPO_DIR=${TOPO_DIR:-"/work/noaa/global/glopara/fix/fix_orog"} -SFC_CLIMO_INPUT_DIR=${SFC_CLIMO_INPUT_DIR:-"/work/noaa/global/glopara/fix/fix_sfc_climo"} -FIXLAM_NCO_BASEDIR=${FIXLAM_NCO_BASEDIR:-"/work/noaa/fv3-cam/UFS_SRW_App/FIXLAM_NCO_BASE"} +staged_data_dir="/work/noaa/fv3-cam/UFS_SRW_App/develop" +FIXgsm=${FIXgsm:-"${staged_data_dir}/fix/fix_am"} +FIXaer=${FIXaer:-"${staged_data_dir}/fix/fix_aer"} +FIXlut=${FIXlut:-"${staged_data_dir}/fix/fix_lut"} +TOPO_DIR=${TOPO_DIR:-"${staged_data_dir}/fix/fix_orog"} +SFC_CLIMO_INPUT_DIR=${SFC_CLIMO_INPUT_DIR:-"${staged_data_dir}/fix/fix_sfc_climo"} +DOMAIN_PREGEN_BASEDIR=${DOMAIN_PREGEN_BASEDIR:-"${staged_data_dir}/FV3LAM_pregen"} # Run commands for executables RUN_CMD_SERIAL="time" @@ -59,13 +62,13 @@ RUN_CMD_POST="srun" # MET/METplus-Related Paths MET_INSTALL_DIR=${MET_INSTALL_DIR:-"/apps/contrib/MET/10.1.0"} METPLUS_PATH=${METPLUS_PATH:-"/apps/contrib/MET/METplus/METplus-4.0.0"} -CCPA_OBS_DIR=${CCPA_OBS_DIR:-"/work/noaa/fv3-cam/UFS_SRW_App/develop/obs_data/ccpa/proc"} -MRMS_OBS_DIR=${MRMS_OBS_DIR:-"/work/noaa/fv3-cam/UFS_SRW_App/develop/obs_data/mrms/proc"} -NDAS_OBS_DIR=${NDAS_OBS_DIR:-"/work/noaa/fv3-cam/UFS_SRW_App/develop/obs_data/ndas/proc"} +CCPA_OBS_DIR=${CCPA_OBS_DIR:-"${staged_data_dir}/obs_data/ccpa/proc"} +MRMS_OBS_DIR=${MRMS_OBS_DIR:-"${staged_data_dir}/obs_data/mrms/proc"} +NDAS_OBS_DIR=${NDAS_OBS_DIR:-"${staged_data_dir}/obs_data/ndas/proc"} MET_BIN_EXEC=${MET_BIN_EXEC:-"bin"} # Test Data Locations -TEST_PREGEN_BASEDIR="/work/noaa/fv3-cam/UFS_SRW_App/FV3LAM_pregen" -TEST_COMINgfs="/work/noaa/fv3-cam/UFS_SRW_App/COMGFS" -TEST_EXTRN_MDL_SOURCE_BASEDIR="/work/noaa/fv3-cam/UFS_SRW_App/develop/model_data" +TEST_COMIN="${staged_data_dir}/COMGFS" +TEST_PREGEN_BASEDIR="${staged_data_dir}/FV3LAM_pregen" +TEST_EXTRN_MDL_SOURCE_BASEDIR="${staged_data_dir}/input_model_data" diff --git a/ush/machine/singularity.sh b/ush/machine/singularity.sh index 528e1dbd5..14f840800 100644 --- a/ush/machine/singularity.sh +++ b/ush/machine/singularity.sh @@ -21,6 +21,8 @@ EXTRN_MDL_SYSBASEDIR_LBCS=${EXTRN_MDL_SYSBASEDIR_LBCS:-$(file_location \ ${EXTRN_MDL_NAME_LBCS} \ ${FV3GFS_FILE_FMT_LBCS})} +EXTRN_MDL_DATA_STORES=${EXTRN_MDL_DATA_STORES:-"aws nomads"} + # System scripts to source to initialize various commands within workflow # scripts (e.g. "module"). 
if [ -z ${ENV_INIT_SCRIPTS_FPS:-""} ]; then @@ -47,7 +49,7 @@ FIXaer=${FIXaer:-"/contrib/global/glopara/fix/fix_aer"} FIXlut=${FIXlut:-"/contrib/global/glopara/fix/fix_lut"} TOPO_DIR=${TOPO_DIR:-"/contrib/global/glopara/fix/fix_orog"} SFC_CLIMO_INPUT_DIR=${SFC_CLIMO_INPUT_DIR:-"/contrib/global/glopara/fix/fix_sfc_climo"} -FIXLAM_NCO_BASEDIR=${FIXLAM_NCO_BASEDIR:-"/needs/to/be/specified"} +DOMAIN_PREGEN_BASEDIR=${DOMAIN_PREGEN_BASEDIR:-"/needs/to/be/specified"} # Run commands for executables RUN_CMD_SERIAL="time" diff --git a/ush/machine/stampede.sh b/ush/machine/stampede.sh index 41afa5fc1..3f879ea22 100644 --- a/ush/machine/stampede.sh +++ b/ush/machine/stampede.sh @@ -9,15 +9,26 @@ function file_location() { external_model=${1} external_file_fmt=${2} + staged_data_dir="/work2/00315/tg455890/stampede2/UFS_SRW_App/develop" + location="" case ${external_model} in "GSMGFS") - ;& # Fall through. All files in same place + location="${staged_data_dir}/input_model_data/GFS" + ;; "FV3GFS") - location='/scratch/00315/tg455890/GDAS/20190530/2019053000_mem001' + location="${staged_data_dir}/input_model_data/FV3GFS" + ;; + "HRRR") + location="${staged_data_dir}/input_model_data/HRRR" + ;; + "RAP") + location="${staged_data_dir}/input_model_data/RAP" + ;; + "NAM") + location="${staged_data_dir}/input_model_data/NAM" ;; - esac echo ${location:-} @@ -51,15 +62,20 @@ PARTITION_FCST=${PARTITION_FCST:-"normal"} QUEUE_FCST=${QUEUE_FCST:-"normal"} # UFS SRW App specific paths -FIXgsm=${FIXgsm:-"/work/00315/tg455890/stampede2/regional_fv3/fix_am"} -FIXaer=${FIXaer:-"/work/00315/tg455890/stampede2/regional_fv3/fix_aer"} -FIXlut=${FIXlut:-"/work/00315/tg455890/stampede2/regional_fv3/fix_lut"} -TOPO_DIR=${TOPO_DIR:-"/work/00315/tg455890/stampede2/regional_fv3/fix_orog"} -SFC_CLIMO_INPUT_DIR=${SFC_CLIMO_INPUT_DIR:-"/work/00315/tg455890/stampede2/regional_fv3/climo_fields_netcdf"} -FIXLAM_NCO_BASEDIR=${FIXLAM_NCO_BASEDIR:-"/needs/to/be/specified"} +staged_data_dir="/work2/00315/tg455890/stampede2/UFS_SRW_App/develop" +FIXgsm=${FIXgsm:-"${staged_data_dir}/fix/fix_am"} +FIXaer=${FIXaer:-"${staged_data_dir}/fix/fix_aer"} +FIXlut=${FIXlut:-"${staged_data_dir}/fix/fix_lut"} +TOPO_DIR=${TOPO_DIR:-"${staged_data_dir}/fix/fix_orog"} +SFC_CLIMO_INPUT_DIR=${SFC_CLIMO_INPUT_DIR:-"${staged_data_dir}/fix/fix_sfc_climo"} +DOMAIN_PREGEN_BASEDIR=${DOMAIN_PREGEN_BASEDIR:-"${staged_data_dir}/FV3LAM_pregen"} # Run commands for executables RUN_CMD_SERIAL="time" RUN_CMD_UTILS='ibrun -np $nprocs' RUN_CMD_FCST='ibrun -np $nprocs' RUN_CMD_POST='ibrun -np $nprocs' + +# Test Data Locations +TEST_PREGEN_BASEDIR="${staged_data_dir}/FV3LAM_pregen" +TEST_EXTRN_MDL_SOURCE_BASEDIR="${staged_data_dir}/input_model_data" diff --git a/ush/machine/wcoss_dell_p3.sh b/ush/machine/wcoss_dell_p3.sh index 811c3ae8b..9bf525a35 100644 --- a/ush/machine/wcoss_dell_p3.sh +++ b/ush/machine/wcoss_dell_p3.sh @@ -37,6 +37,8 @@ EXTRN_MDL_SYSBASEDIR_LBCS=${EXTRN_MDL_SYSBASEDIR_LBCS:-$(file_location \ ${EXTRN_MDL_NAME_LBCS} \ ${FV3GFS_FILE_FMT_LBCS})} +EXTRN_MDL_DATA_STORES=${EXTRN_MDL_DATA_STORES:-"hpss"} + # System scripts to source to initialize various commands within workflow # scripts (e.g. "module"). 
if [ -z ${ENV_INIT_SCRIPTS_FPS:-""} ]; then @@ -55,12 +57,13 @@ QUEUE_HPSS=${QUEUE_HPSS:-"dev_transfer"} QUEUE_FCST=${QUEUE_FCST:-"dev"} # UFS SRW App specific paths -FIXgsm=${FIXgsm:-"/gpfs/dell2/emc/modeling/noscrub/emc.glopara/git/fv3gfs/fix/fix_am"} -FIXaer=${FIXaer:-"/gpfs/dell2/emc/modeling/noscrub/emc.glopara/git/fv3gfs/fix/fix_aer"} -FIXlut=${FIXlut:-"/gpfs/dell2/emc/modeling/noscrub/emc.glopara/git/fv3gfs/fix/fix_lut"} -TOPO_DIR=${TOPO_DIR:-"/gpfs/dell2/emc/modeling/noscrub/emc.glopara/git/fv3gfs/fix/fix_orog"} -SFC_CLIMO_INPUT_DIR=${SFC_CLIMO_INPUT_DIR:-"/gpfs/dell2/emc/modeling/noscrub/emc.glopara/git/fv3gfs/fix/fix_sfc_climo"} -FIXLAM_NCO_BASEDIR=${FIXLAM_NCO_BASEDIR:-"/gpfs/dell2/emc/modeling/noscrub/UFS_SRW_App/FIXLAM_NCO_BASE"} +staged_data_dir="/gpfs/dell2/emc/modeling/noscrub/UFS_SRW_App/develop" +FIXgsm=${FIXgsm:-"${staged_data_dir}/fix/fix_am"} +FIXaer=${FIXaer:-"${staged_data_dir}/fix/fix_aer"} +FIXlut=${FIXlut:-"${staged_data_dir}/fix/fix_lut"} +TOPO_DIR=${TOPO_DIR:-"${staged_data_dir}/fix/fix_orog"} +SFC_CLIMO_INPUT_DIR=${SFC_CLIMO_INPUT_DIR:-"${staged_data_dir}/fix/fix_sfc_climo"} +DOMAIN_PREGEN_BASEDIR=${DOMAIN_PREGEN_BASEDIR:-"${staged_data_dir}/FV3LAM_pregen"} # Run commands for executables RUN_CMD_SERIAL="mpirun" @@ -71,12 +74,12 @@ RUN_CMD_POST="mpirun" # MET/METplus-Related Paths MET_INSTALL_DIR=${MET_INSTALL_DIR:-"/gpfs/dell2/emc/verification/noscrub/emc.metplus/met/10.0.0"} METPLUS_PATH=${METPLUS_PATH:-"/gpfs/dell2/emc/verification/noscrub/emc.metplus/METplus/METplus-4.0.0"} -CCPA_OBS_DIR=${CCPA_OBS_DIR:-"/gpfs/dell2/emc/modeling/noscrub/UFS_SRW_App/develop/obs_data/ccpa/proc"} -MRMS_OBS_DIR=${MRMS_OBS_DIR:-"/gpfs/dell2/emc/modeling/noscrub/UFS_SRW_App/develop/obs_data/mrms/proc"} -NDAS_OBS_DIR=${NDAS_OBS_DIR:-"/gpfs/dell2/emc/modeling/noscrub/UFS_SRW_App/develop/obs_data/ndas/proc"} +CCPA_OBS_DIR=${CCPA_OBS_DIR:-"${staged_data_dir}/obs_data/ccpa/proc"} +MRMS_OBS_DIR=${MRMS_OBS_DIR:-"${staged_data_dir}/obs_data/mrms/proc"} +NDAS_OBS_DIR=${NDAS_OBS_DIR:-"${staged_data_dir}/obs_data/ndas/proc"} MET_BIN_EXEC=${MET_BIN_EXEC:-"exec"} # Test Data Locations -TEST_PREGEN_BASEDIR="/gpfs/dell2/emc/modeling/noscrub/UFS_SRW_App/FV3LAM_pregen" -TEST_COMINgfs="/gpfs/dell2/emc/modeling/noscrub/UFS_SRW_App/COMGFS" -TEST_EXTRN_MDL_SOURCE_BASEDIR="/gpfs/dell2/emc/modeling/noscrub/UFS_SRW_App/develop/model_data" +TEST_PREGEN_BASEDIR="${staged_data_dir}/FV3LAM_pregen" +TEST_COMIN="${staged_data_dir}/COMGFS" +TEST_EXTRN_MDL_SOURCE_BASEDIR="${staged_data_dir}/input_model_data" diff --git a/ush/mrms_pull_topofhour.py b/ush/mrms_pull_topofhour.py index bd98b2805..374424201 100644 --- a/ush/mrms_pull_topofhour.py +++ b/ush/mrms_pull_topofhour.py @@ -3,80 +3,86 @@ import re, csv, glob import bisect import numpy as np - -# Copy and unzip MRMS files that are closest to top of hour -# Done every hour on a 20-minute lag - -# Include option to define valid time on command line -# Used to backfill verification -#try: -valid_time = str(sys.argv[1]) - -YYYY = int(valid_time[0:4]) -MM = int(valid_time[4:6]) -DD = int(valid_time[6:8]) -HH = int(valid_time[8:19]) - -valid = datetime.datetime(YYYY,MM,DD,HH,0,0) - -#except IndexError: -# valid_time = None - -# Default to current hour if not defined on command line -#if valid_time is None: -# now = datetime.datetime.utcnow() -# YYYY = int(now.strftime('%Y')) -# MM = int(now.strftime('%m')) -# DD = int(now.strftime('%d')) -# HH = int(now.strftime('%H')) - -# valid = datetime.datetime(YYYY,MM,DD,HH,0,0) -# valid_time = valid.strftime('%Y%m%d%H') - 
-print('Pulling '+valid_time+' MRMS data') - -# Set up working directory -DATA_HEAD = str(sys.argv[2]) -MRMS_PROD_DIR = str(sys.argv[3]) -MRMS_PRODUCT = str(sys.argv[4]) -level = str(sys.argv[5]) - -VALID_DIR = os.path.join(DATA_HEAD,valid.strftime('%Y%m%d')) -if not os.path.exists(VALID_DIR): - os.makedirs(VALID_DIR) -os.chdir(DATA_HEAD) - -# Sort list of files for each MRMS product -print(valid.strftime('%Y%m%d')) -if valid.strftime('%Y%m%d') < '20200303': - search_path = MRMS_PROD_DIR+'/'+valid.strftime('%Y%m%d')+'/dcom/us007003/ldmdata/obs/upperair/mrms/conus/'+MRMS_PRODUCT+'/'+MRMS_PRODUCT+'*.gz' -elif valid.strftime('%Y%m%d') >= '20200303': - search_path = MRMS_PROD_DIR+'/'+valid.strftime('%Y%m%d')+'/upperair/mrms/conus/'+MRMS_PRODUCT+'/'+MRMS_PRODUCT+'*.gz' -file_list = [f for f in glob.glob(search_path)] -time_list = [file_list[x][-24:-9] for x in range(len(file_list))] -int_list = [int(time_list[x][0:8]+time_list[x][9:15]) for x in range(len(time_list))] -int_list.sort() -datetime_list = [datetime.datetime.strptime(str(x),"%Y%m%d%H%M%S") for x in int_list] - -# Find the MRMS file closest to the valid time -i = bisect.bisect_left(datetime_list,valid) -closest_timestamp = min(datetime_list[max(0, i-1): i+2], key=lambda date: abs(valid - date)) - -# Check to make sure closest file is within +/- 15 mins of top of the hour -# Copy and rename the file for future ease -difference = abs(closest_timestamp - valid) -if difference.total_seconds() <= 900: - filename1 = MRMS_PRODUCT+level+closest_timestamp.strftime('%Y%m%d-%H%M%S')+'.grib2.gz' - filename2 = MRMS_PRODUCT+level+valid.strftime('%Y%m%d-%H')+'0000.grib2.gz' - - if valid.strftime('%Y%m%d') < '20200303': - print('cp '+MRMS_PROD_DIR+'/'+valid.strftime('%Y%m%d')+'/dcom/us007003/ldmdata/obs/upperair/mrms/conus/'+MRMS_PRODUCT+'/'+filename1+' '+VALID_DIR+'/'+filename2) - - os.system('cp '+MRMS_PROD_DIR+'/'+valid.strftime('%Y%m%d')+'/dcom/us007003/ldmdata/obs/upperair/mrms/conus/'+MRMS_PRODUCT+'/'+filename1+' '+VALID_DIR+'/'+filename2) - os.system('gunzip '+VALID_DIR+'/'+filename2) - elif valid.strftime('%Y%m%d') >= '20200303': - print('cp '+MRMS_PROD_DIR+'/'+valid.strftime('%Y%m%d')+'/upperair/mrms/conus/'+MRMS_PRODUCT+'/'+filename1+' '+VALID_DIR+'/'+filename2) - - os.system('cp '+MRMS_PROD_DIR+'/'+valid.strftime('%Y%m%d')+'/upperair/mrms/conus/'+MRMS_PRODUCT+'/'+filename1+' '+VALID_DIR+'/'+filename2) - os.system('gunzip '+VALID_DIR+'/'+filename2) - +import unittest + +if __name__ == '__main__': + # Copy and unzip MRMS files that are closest to top of hour + # Done every hour on a 20-minute lag + + # Include option to define valid time on command line + # Used to backfill verification + #try: + valid_time = str(sys.argv[1]) + + YYYY = int(valid_time[0:4]) + MM = int(valid_time[4:6]) + DD = int(valid_time[6:8]) + HH = int(valid_time[8:19]) + + valid = datetime.datetime(YYYY,MM,DD,HH,0,0) + + #except IndexError: + # valid_time = None + + # Default to current hour if not defined on command line + #if valid_time is None: + # now = datetime.datetime.utcnow() + # YYYY = int(now.strftime('%Y')) + # MM = int(now.strftime('%m')) + # DD = int(now.strftime('%d')) + # HH = int(now.strftime('%H')) + + # valid = datetime.datetime(YYYY,MM,DD,HH,0,0) + # valid_time = valid.strftime('%Y%m%d%H') + + print('Pulling '+valid_time+' MRMS data') + + # Set up working directory + DATA_HEAD = str(sys.argv[2]) + MRMS_PROD_DIR = str(sys.argv[3]) + MRMS_PRODUCT = str(sys.argv[4]) + level = str(sys.argv[5]) + + VALID_DIR = 
os.path.join(DATA_HEAD,valid.strftime('%Y%m%d')) + if not os.path.exists(VALID_DIR): + os.makedirs(VALID_DIR) + os.chdir(DATA_HEAD) + + # Sort list of files for each MRMS product + print(valid.strftime('%Y%m%d')) + if valid.strftime('%Y%m%d') < '20200303': + search_path = MRMS_PROD_DIR+'/'+valid.strftime('%Y%m%d')+'/dcom/us007003/ldmdata/obs/upperair/mrms/conus/'+MRMS_PRODUCT+'/'+MRMS_PRODUCT+'*.gz' + elif valid.strftime('%Y%m%d') >= '20200303': + search_path = MRMS_PROD_DIR+'/'+valid.strftime('%Y%m%d')+'/upperair/mrms/conus/'+MRMS_PRODUCT+'/'+MRMS_PRODUCT+'*.gz' + file_list = [f for f in glob.glob(search_path)] + time_list = [file_list[x][-24:-9] for x in range(len(file_list))] + int_list = [int(time_list[x][0:8]+time_list[x][9:15]) for x in range(len(time_list))] + int_list.sort() + datetime_list = [datetime.datetime.strptime(str(x),"%Y%m%d%H%M%S") for x in int_list] + + # Find the MRMS file closest to the valid time + i = bisect.bisect_left(datetime_list,valid) + closest_timestamp = min(datetime_list[max(0, i-1): i+2], key=lambda date: abs(valid - date)) + + # Check to make sure closest file is within +/- 15 mins of top of the hour + # Copy and rename the file for future ease + difference = abs(closest_timestamp - valid) + if difference.total_seconds() <= 900: + filename1 = MRMS_PRODUCT+level+closest_timestamp.strftime('%Y%m%d-%H%M%S')+'.grib2.gz' + filename2 = MRMS_PRODUCT+level+valid.strftime('%Y%m%d-%H')+'0000.grib2.gz' + + if valid.strftime('%Y%m%d') < '20200303': + print('cp '+MRMS_PROD_DIR+'/'+valid.strftime('%Y%m%d')+'/dcom/us007003/ldmdata/obs/upperair/mrms/conus/'+MRMS_PRODUCT+'/'+filename1+' '+VALID_DIR+'/'+filename2) + + os.system('cp '+MRMS_PROD_DIR+'/'+valid.strftime('%Y%m%d')+'/dcom/us007003/ldmdata/obs/upperair/mrms/conus/'+MRMS_PRODUCT+'/'+filename1+' '+VALID_DIR+'/'+filename2) + os.system('gunzip '+VALID_DIR+'/'+filename2) + elif valid.strftime('%Y%m%d') >= '20200303': + print('cp '+MRMS_PROD_DIR+'/'+valid.strftime('%Y%m%d')+'/upperair/mrms/conus/'+MRMS_PRODUCT+'/'+filename1+' '+VALID_DIR+'/'+filename2) + + os.system('cp '+MRMS_PROD_DIR+'/'+valid.strftime('%Y%m%d')+'/upperair/mrms/conus/'+MRMS_PRODUCT+'/'+filename1+' '+VALID_DIR+'/'+filename2) + os.system('gunzip '+VALID_DIR+'/'+filename2) + +#dummy unittest +class Testing(unittest.TestCase): + def test_mrms_pull_topfhour(self): + pass diff --git a/ush/predef_grid_params.yaml b/ush/predef_grid_params.yaml new file mode 100644 index 000000000..330a9a647 --- /dev/null +++ b/ush/predef_grid_params.yaml @@ -0,0 +1,909 @@ +# +#----------------------------------------------------------------------- +# +# Set grid and other parameters according to the value of the predefined +# domain (PREDEF_GRID_NAME). Note that the code will enter this script +# only if PREDEF_GRID_NAME has a valid (and non-empty) value. +# +#################### +# The following comments need to be updated: +#################### +# +# 1) Reset the experiment title (expt_title). +# 2) Reset the grid parameters. +# 3) If the write component is to be used (i.e. QUILTING is set to +# "TRUE") and the variable WRTCMP_PARAMS_TMPL_FN containing the name +# of the write-component template file is unset or empty, set that +# filename variable to the appropriate preexisting template file. 
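This file stores each predefined grid as a top-level YAML key mapping to its parameter set, so a look-up is a plain dictionary access after `yaml.safe_load`. A hypothetical sketch (the file path and the calling context are assumptions made for illustration only):

```python
# Hedged sketch: read the predefined-grid file and pull one grid's parameters.
import yaml

with open("ush/predef_grid_params.yaml", "r") as f:   # path assumed
    grids = yaml.safe_load(f)

params = grids["RRFS_CONUS_25km"]
print(params["GRID_GEN_METHOD"])                      # "ESGgrid"
print(params["ESGgrid_NX"], params["ESGgrid_NY"])     # 219 131
print(params["WRTCMP_output_grid"])                   # "lambert_conformal"
```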
+# +# For the predefined domains, we determine the starting and ending indi- +# ces of the regional grid within tile 6 by specifying margins (in units +# of number of cells on tile 6) between the boundary of tile 6 and that +# of the regional grid (tile 7) along the left, right, bottom, and top +# portions of these boundaries. Note that we do not use "west", "east", +# "south", and "north" here because the tiles aren't necessarily orient- +# ed such that the left boundary segment corresponds to the west edge, +# etc. The widths of these margins (in units of number of cells on tile +# 6) are specified via the parameters +# +# num_margin_cells_T6_left +# num_margin_cells_T6_right +# num_margin_cells_T6_bottom +# num_margin_cells_T6_top +# +# where the "_T6" in these names is used to indicate that the cell count +# is on tile 6, not tile 7. +# +# Note that we must make the margins wide enough (by making the above +# four parameters large enough) such that a region of halo cells around +# the boundary of the regional grid fits into the margins, i.e. such +# that the halo does not overrun the boundary of tile 6. (The halo is +# added later in another script; its function is to feed in boundary +# conditions to the regional grid.) Currently, a halo of 5 regional +# grid cells is used around the regional grid. Setting num_margin_- +# cells_T6_... to at least 10 leaves enough room for this halo. +# +#----------------------------------------------------------------------- +# +#----------------------------------------------------------------------- +# +# The RRFS CONUS domain with ~25km cells. +# +#----------------------------------------------------------------------- +# +"RRFS_CONUS_25km": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: -97.5 + ESGgrid_LAT_CTR: 38.5 + ESGgrid_DELX: 25000.0 + ESGgrid_DELY: 25000.0 + ESGgrid_NX: 219 + ESGgrid_NY: 131 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 40 + LAYOUT_X: 5 + LAYOUT_Y: 2 + BLOCKSIZE: 40 + #if QUILTING = True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 2 + WRTCMP_output_grid: "lambert_conformal" + WRTCMP_cen_lon: -97.5 + WRTCMP_cen_lat: 38.5 + WRTCMP_stdlat1: 38.5 + WRTCMP_stdlat2: 38.5 + WRTCMP_nx: 217 + WRTCMP_ny: 128 + WRTCMP_lon_lwr_left: -122.719528 + WRTCMP_lat_lwr_left: 21.138123 + WRTCMP_dx: 25000.0 + WRTCMP_dy: 25000.0 +# +#----------------------------------------------------------------------- +# +# The RRFS CONUS domain with ~25km cells that can be initialized from the HRRR. +# +#----------------------------------------------------------------------- +# +"RRFS_CONUScompact_25km": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: -97.5 + ESGgrid_LAT_CTR: 38.5 + ESGgrid_DELX: 25000.0 + ESGgrid_DELY: 25000.0 + ESGgrid_NX: 202 + ESGgrid_NY: 116 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 40 + LAYOUT_X: 5 + LAYOUT_Y: 2 + BLOCKSIZE: 40 + #if QUILTING = True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 2 + WRTCMP_output_grid: "lambert_conformal" + WRTCMP_cen_lon: -97.5 + WRTCMP_cen_lat: 38.5 + WRTCMP_stdlat1: 38.5 + WRTCMP_stdlat2: 38.5 + WRTCMP_nx: 199 + WRTCMP_ny: 111 + WRTCMP_lon_lwr_left: -121.23349066 + WRTCMP_lat_lwr_left: 23.41731593 + WRTCMP_dx: 25000.0 + WRTCMP_dy: 25000.0 +# +#----------------------------------------------------------------------- +# +# The RRFS CONUS domain with ~13km cells. 
+# +#----------------------------------------------------------------------- +# +"RRFS_CONUS_13km": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: -97.5 + ESGgrid_LAT_CTR: 38.5 + ESGgrid_DELX: 13000.0 + ESGgrid_DELY: 13000.0 + ESGgrid_NX: 420 + ESGgrid_NY: 252 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 45 + LAYOUT_X: 16 + LAYOUT_Y: 10 + BLOCKSIZE: 32 + #if QUILTING = True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 10 + WRTCMP_output_grid: "lambert_conformal" + WRTCMP_cen_lon: -97.5 + WRTCMP_cen_lat: 38.5 + WRTCMP_stdlat1: 38.5 + WRTCMP_stdlat2: 38.5 + WRTCMP_nx: 416 + WRTCMP_ny: 245 + WRTCMP_lon_lwr_left: -122.719528 + WRTCMP_lat_lwr_left: 21.138123 + WRTCMP_dx: 13000.0 + WRTCMP_dy: 13000.0 +# +#----------------------------------------------------------------------- +# +# The RRFS CONUS domain with ~13km cells that can be initialized from the HRRR. +# +#----------------------------------------------------------------------- +# +"RRFS_CONUScompact_13km": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: -97.5 + ESGgrid_LAT_CTR: 38.5 + ESGgrid_DELX: 13000.0 + ESGgrid_DELY: 13000.0 + ESGgrid_NX: 396 + ESGgrid_NY: 232 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 45 + LAYOUT_X: 16 + LAYOUT_Y: 10 + BLOCKSIZE: 32 + # if QUILTING = TRUE + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 16 + WRTCMP_output_grid: "lambert_conformal" + WRTCMP_cen_lon: -97.5 + WRTCMP_cen_lat: 38.5 + WRTCMP_stdlat1: 38.5 + WRTCMP_stdlat2: 38.5 + WRTCMP_nx: 393 + WRTCMP_ny: 225 + WRTCMP_lon_lwr_left: -121.70231097 + WRTCMP_lat_lwr_left: 22.57417972 + WRTCMP_dx: 13000.0 + WRTCMP_dy: 13000.0 +# +#----------------------------------------------------------------------- +# +# The RRFS CONUS domain with ~13km cells. +# +#----------------------------------------------------------------------- +# +"RRFS_CONUS_13km": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: -97.5 + ESGgrid_LAT_CTR: 38.5 + ESGgrid_DELX: 13000.0 + ESGgrid_DELY: 13000.0 + ESGgrid_NX: 396 + ESGgrid_NY: 232 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 45 + LAYOUT_X: 16 + LAYOUT_Y: 10 + BLOCKSIZE: 32 + #if QUILTING = True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 10 + WRTCMP_output_grid: "lambert_conformal" + WRTCMP_cen_lon: -97.5 + WRTCMP_cen_lat: 38.5 + WRTCMP_stdlat1: 38.5 + WRTCMP_stdlat2: 38.5 + WRTCMP_nx: 393 + WRTCMP_ny: 225 + WRTCMP_lon_lwr_left: -121.70231097 + WRTCMP_lat_lwr_left: 22.57417972 + WRTCMP_dx: 13000.0 + WRTCMP_dy: 13000.0 +# +#----------------------------------------------------------------------- +# +# The RRFS CONUS domain with ~3km cells. +# +#----------------------------------------------------------------------- +# +"RRFS_CONUS_3km": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: -97.5 + ESGgrid_LAT_CTR: 38.5 + ESGgrid_DELX: 3000.0 + ESGgrid_DELY: 3000.0 + ESGgrid_NX: 1820 + ESGgrid_NY: 1092 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 36 + LAYOUT_X: 28 + LAYOUT_Y: 28 + BLOCKSIZE: 29 + #if QUILTING = True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 28 + WRTCMP_output_grid: "lambert_conformal" + WRTCMP_cen_lon: -97.5 + WRTCMP_cen_lat: 38.5 + WRTCMP_stdlat1: 38.5 + WRTCMP_stdlat2: 38.5 + WRTCMP_nx: 1799 + WRTCMP_ny: 1059 + WRTCMP_lon_lwr_left: -122.719528 + WRTCMP_lat_lwr_left: 21.138123 + WRTCMP_dx: 3000.0 + WRTCMP_dy: 3000.0 +# +#----------------------------------------------------------------------- +# +# The RRFS CONUS domain with ~3km cells that can be initialized from the HRRR. 
+# +#----------------------------------------------------------------------- +# +"RRFS_CONUScompact_3km": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: -97.5 + ESGgrid_LAT_CTR: 38.5 + ESGgrid_DELX: 3000.0 + ESGgrid_DELY: 3000.0 + ESGgrid_NX: 1748 + ESGgrid_NY: 1038 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 40 + LAYOUT_X: 30 + LAYOUT_Y: 16 + BLOCKSIZE: 32 + #if QUILTING = True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 16 + WRTCMP_output_grid: "lambert_conformal" + WRTCMP_cen_lon: -97.5 + WRTCMP_cen_lat: 38.5 + WRTCMP_stdlat1: 38.5 + WRTCMP_stdlat2: 38.5 + WRTCMP_nx: 1746 + WRTCMP_ny: 1014 + WRTCMP_lon_lwr_left: -122.17364391 + WRTCMP_lat_lwr_left: 21.88588562 + WRTCMP_dx: 3000.0 + WRTCMP_dy: 3000.0 +# +#----------------------------------------------------------------------- +# +# The RRFS SUBCONUS domain with ~3km cells. +# +#----------------------------------------------------------------------- +# +"RRFS_SUBCONUS_3km": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: -97.5 + ESGgrid_LAT_CTR: 35.0 + ESGgrid_DELX: 3000.0 + ESGgrid_DELY: 3000.0 + ESGgrid_NX: 840 + ESGgrid_NY: 600 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 40 + LAYOUT_X: 30 + LAYOUT_Y: 24 + BLOCKSIZE: 35 + #if QUILTING = True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 24 + WRTCMP_output_grid: "lambert_conformal" + WRTCMP_cen_lon: -97.5 + WRTCMP_cen_lat: 35.0 + WRTCMP_stdlat1: 35.0 + WRTCMP_stdlat2: 35.0 + WRTCMP_nx: 837 + WRTCMP_ny: 595 + WRTCMP_lon_lwr_left: -109.97410429 + WRTCMP_lat_lwr_left: 26.31459843 + WRTCMP_dx: 3000.0 + WRTCMP_dy: 3000.0 +# +#----------------------------------------------------------------------- +# +# A subconus domain over Indianapolis, Indiana with ~3km cells. This is +# mostly for testing on a 3km grid with a much small number of cells than +# on the full CONUS. +# +#----------------------------------------------------------------------- +# +"SUBCONUS_Ind_3km": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: -86.16 + ESGgrid_LAT_CTR: 39.77 + ESGgrid_DELX: 3000.0 + ESGgrid_DELY: 3000.0 + ESGgrid_NX: 200 + ESGgrid_NY: 200 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 40 + LAYOUT_X: 5 + LAYOUT_Y: 5 + BLOCKSIZE: 40 + #if QUILTING : True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 5 + WRTCMP_output_grid: "lambert_conformal" + WRTCMP_cen_lon: -86.16 + WRTCMP_cen_lat: 39.77 + WRTCMP_stdlat1: 39.77 + WRTCMP_stdlat2: 39.77 + WRTCMP_nx: 197 + WRTCMP_ny: 197 + WRTCMP_lon_lwr_left: -89.47120417 + WRTCMP_lat_lwr_left: 37.07809642 + WRTCMP_dx: 3000.0 + WRTCMP_dy: 3000.0 +# +#----------------------------------------------------------------------- +# +# The RRFS Alaska domain with ~13km cells. +# +# Note: +# This grid has not been thoroughly tested (as of 20201027). 
+# +#----------------------------------------------------------------------- +# +"RRFS_AK_13km": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: -161.5 + ESGgrid_LAT_CTR: 63.0 + ESGgrid_DELX: 13000.0 + ESGgrid_DELY: 13000.0 + ESGgrid_NX: 320 + ESGgrid_NY: 240 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 10 + LAYOUT_X: 16 + LAYOUT_Y: 12 + BLOCKSIZE: 40 + #if QUILTING = True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 12 + WRTCMP_output_grid: "lambert_conformal" + WRTCMP_cen_lon: -161.5 + WRTCMP_cen_lat: 63.0 + WRTCMP_stdlat1: 63.0 + WRTCMP_stdlat2: 63.0 + WRTCMP_nx: 318 + WRTCMP_ny: 234 + WRTCMP_lon_lwr_left: 172.23339164 + WRTCMP_lat_lwr_left: 45.77691870 + WRTCMP_dx: 13000.0 + WRTCMP_dy: 13000.0 +# +#----------------------------------------------------------------------- +# +# The RRFS Alaska domain with ~3km cells. +# +# Note: +# This grid has not been thoroughly tested (as of 20201027). +# +#----------------------------------------------------------------------- +# +"RRFS_AK_3km": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: -161.5 + ESGgrid_LAT_CTR: 63.0 + ESGgrid_DELX: 3000.0 + ESGgrid_DELY: 3000.0 + ESGgrid_NX: 1380 + ESGgrid_NY: 1020 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 10 + LAYOUT_X: 30 + LAYOUT_Y: 17 + BLOCKSIZE: 40 + #if QUILTING = True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 17 + WRTCMP_output_grid: "lambert_conformal" + WRTCMP_cen_lon: -161.5 + WRTCMP_cen_lat: 63.0 + WRTCMP_stdlat1: 63.0 + WRTCMP_stdlat2: 63.0 + WRTCMP_nx: 1379 + WRTCMP_ny: 1003 + WRTCMP_lon_lwr_left: -187.89737923 + WRTCMP_lat_lwr_left: 45.84576053 + WRTCMP_dx: 3000.0 + WRTCMP_dy: 3000.0 +# +#----------------------------------------------------------------------- +# +# The WoFS domain with ~3km cells. +# +# Note: +# The WoFS domain will generate a 301 x 301 output grid (WRITE COMPONENT) and +# will eventually be movable (ESGgrid_LON_CTR/ESGgrid_LAT_CTR). A python script +# python_utils/fv3write_parms_lambert will be useful to determine +# WRTCMP_lon_lwr_left and WRTCMP_lat_lwr_left locations (only for Lambert map +# projection currently) of the quilting output when the domain location is +# moved. Later, it should be integrated into the workflow. +# +#----------------------------------------------------------------------- +# +"WoFS_3km": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: -97.5 + ESGgrid_LAT_CTR: 38.5 + ESGgrid_DELX: 3000.0 + ESGgrid_DELY: 3000.0 + ESGgrid_NX: 361 + ESGgrid_NY: 361 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 20 + LAYOUT_X: 18 + LAYOUT_Y: 12 + BLOCKSIZE: 30 + # if QUILTING = True + WRTCMP_write_groups: "1" + WRTCMP_write_tasks_per_group: 12 + WRTCMP_output_grid: "lambert_conformal" + WRTCMP_cen_lon: -97.5 + WRTCMP_cen_lat: 38.5 + WRTCMP_stdlat1: 38.5 + WRTCMP_stdlat2: 38.5 + WRTCMP_nx: 301 + WRTCMP_ny: 301 + WRTCMP_lon_lwr_left: -102.3802487 + WRTCMP_lat_lwr_left: 34.3407918 + WRTCMP_dx: 3000.0 + WRTCMP_dy: 3000.0 +# +#----------------------------------------------------------------------- +# +# A CONUS domain of GFDLgrid type with ~25km cells. +# +# Note: +# This grid is larger than the HRRRX domain and thus cannot be initialized +# using the HRRRX. 
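The WoFS_3km note above points to `python_utils/fv3write_parms_lambert.py` (added later in this patch) for recomputing `WRTCMP_lon_lwr_left`/`WRTCMP_lat_lwr_left` when the domain center moves. Below is a condensed sketch of what that utility does, using the WoFS write-component defaults; it assumes cartopy is installed (the script's header suggests an environment such as pygraf):

```python
# Hedged sketch of the lower-left corner computation for a Lambert conformal
# write-component grid; values match the fv3write_parms_lambert.py defaults.
import cartopy.crs as ccrs

nx, ny, dx, dy = 301, 301, 3000.0, 3000.0
ctrlat, ctrlon = 38.5, -97.5
stdlat1, stdlat2 = 38.5, 38.5

# Put the projection origin at the grid center so (0, 0) is the lower-left corner.
xctr = (nx - 1) / 2 * dx
yctr = (ny - 1) / 2 * dy

carr = ccrs.PlateCarree()
proj = ccrs.LambertConformal(central_longitude=ctrlon, central_latitude=ctrlat,
                             false_easting=xctr, false_northing=yctr,
                             standard_parallels=(stdlat1, stdlat2))

lon1, lat1 = carr.transform_point(0.0, 0.0, proj)
print(f"lat1: {lat1}")   # lower-left latitude of the write-component grid
print(f"lon1: {lon1}")   # lower-left longitude
```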
+# +#----------------------------------------------------------------------- +# +"CONUS_25km_GFDLgrid": + GRID_GEN_METHOD: "GFDLgrid" + GFDLgrid_LON_T6_CTR: -97.5 + GFDLgrid_LAT_T6_CTR: 38.5 + GFDLgrid_STRETCH_FAC: 1.4 + GFDLgrid_RES: 96 + GFDLgrid_REFINE_RATIO: 3 + num_margin_cells_T6_left: 12 + GFDLgrid_ISTART_OF_RGNL_DOM_ON_T6G: 13 + num_margin_cells_T6_right: 12 + GFDLgrid_IEND_OF_RGNL_DOM_ON_T6G: 84 + num_margin_cells_T6_bottom: 16 + GFDLgrid_JSTART_OF_RGNL_DOM_ON_T6G: 17 + num_margin_cells_T6_top: 16 + GFDLgrid_JEND_OF_RGNL_DOM_ON_T6G: 80 + GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES: True + DT_ATMOS: 225 + LAYOUT_X: 6 + LAYOUT_Y: 4 + BLOCKSIZE: 36 + #if QUILTING = True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 4 + WRTCMP_output_grid: "rotated_latlon" + WRTCMP_cen_lon: -97.5 + WRTCMP_cen_lat: 38.5 + WRTCMP_lon_lwr_left: -24.40085141 + WRTCMP_lat_lwr_left: -19.65624142 + WRTCMP_lon_upr_rght: 24.40085141 + WRTCMP_lat_upr_rght: 19.65624142 + WRTCMP_dlon: 0.22593381 + WRTCMP_dlat: 0.22593381 +# +#----------------------------------------------------------------------- +# +# A CONUS domain of GFDLgrid type with ~3km cells. +# +# Note: +# This grid is larger than the HRRRX domain and thus cannot be initialized +# using the HRRRX. +# +#----------------------------------------------------------------------- +# +"CONUS_3km_GFDLgrid": + GRID_GEN_METHOD: "GFDLgrid" + GFDLgrid_LON_T6_CTR: -97.5 + GFDLgrid_LAT_T6_CTR: 38.5 + GFDLgrid_STRETCH_FAC: 1.5 + GFDLgrid_RES: 768 + GFDLgrid_REFINE_RATIO: 3 + num_margin_cells_T6_left: 69 + GFDLgrid_ISTART_OF_RGNL_DOM_ON_T6G: 70 + num_margin_cells_T6_right: 69 + GFDLgrid_IEND_OF_RGNL_DOM_ON_T6G: 699 + num_margin_cells_T6_bottom: 164 + GFDLgrid_JSTART_OF_RGNL_DOM_ON_T6G: 165 + num_margin_cells_T6_top: 164 + GFDLgrid_JEND_OF_RGNL_DOM_ON_T6G: 604 + GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES: True + DT_ATMOS: 18 + LAYOUT_X: 30 + LAYOUT_Y: 22 + BLOCKSIZE: 35 + #if QUILTING = True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 22 + WRTCMP_output_grid: "rotated_latlon" + WRTCMP_cen_lon: -97.5 + WRTCMP_cen_lat: 38.5 + WRTCMP_lon_lwr_left: -25.23144805 + WRTCMP_lat_lwr_left: -15.82130419 + WRTCMP_lon_upr_rght: 25.23144805 + WRTCMP_lat_upr_rght: 15.82130419 + WRTCMP_dlon: 0.02665763 + WRTCMP_dlat: 0.02665763 +# +#----------------------------------------------------------------------- +# +# EMC's Alaska grid. +# +#----------------------------------------------------------------------- +# +"EMC_AK": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: -153.0 + ESGgrid_LAT_CTR: 61.0 + ESGgrid_DELX: 3000.0 + ESGgrid_DELY: 3000.0 + ESGgrid_NX: 1344 # Supergrid value 2704 + ESGgrid_NY: 1152 # Supergrid value 2320 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 18 + LAYOUT_X: 28 + LAYOUT_Y: 16 + BLOCKSIZE: 24 + #if QUILTING = True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 24 + WRTCMP_output_grid: "lambert_conformal" + WRTCMP_cen_lon: -153.0 + WRTCMP_cen_lat: 61.0 + WRTCMP_stdlat1: 61.0 + WRTCMP_stdlat2: 61.0 + WRTCMP_nx: 1344 + WRTCMP_ny: 1152 + WRTCMP_lon_lwr_left: -177.0 + WRTCMP_lat_lwr_left: 42.5 + WRTCMP_dx: 3000.0 + WRTCMP_dy: 3000.0 +# +#----------------------------------------------------------------------- +# +# EMC's Hawaii grid. 
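The margin-cell bookkeeping described in the header comment of this file can be checked against the GFDLgrid entries above: the start index is the margin plus one, and the end index is the tile-6 resolution minus the margin. A small sketch using the `CONUS_25km_GFDLgrid` numbers (the relation is read off the values listed here, not taken from the grid-generation code):

```python
# Recover the CONUS_25km_GFDLgrid start/end indices from its margin-cell counts.
RES = 96                                   # GFDLgrid_RES
margins = {"left": 12, "right": 12, "bottom": 16, "top": 16}

istart = margins["left"] + 1               # 13 -> GFDLgrid_ISTART_OF_RGNL_DOM_ON_T6G
iend = RES - margins["right"]              # 84 -> GFDLgrid_IEND_OF_RGNL_DOM_ON_T6G
jstart = margins["bottom"] + 1             # 17 -> GFDLgrid_JSTART_OF_RGNL_DOM_ON_T6G
jend = RES - margins["top"]                # 80 -> GFDLgrid_JEND_OF_RGNL_DOM_ON_T6G

print(istart, iend, jstart, jend)          # 13 84 17 80
```

The same relation reproduces the `CONUS_3km_GFDLgrid` indices (70, 699, 165, 604) from its resolution of 768 and margins of 69 and 164.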
+# +#----------------------------------------------------------------------- +# +"EMC_HI": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: -157.0 + ESGgrid_LAT_CTR: 20.0 + ESGgrid_DELX: 3000.0 + ESGgrid_DELY: 3000.0 + ESGgrid_NX: 432 # Supergrid value 880 + ESGgrid_NY: 360 # Supergrid value 736 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 18 + LAYOUT_X: 8 + LAYOUT_Y: 8 + BLOCKSIZE: 27 + #if QUILTING = True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 8 + WRTCMP_output_grid: "lambert_conformal" + WRTCMP_cen_lon: -157.0 + WRTCMP_cen_lat: 20.0 + WRTCMP_stdlat1: 20.0 + WRTCMP_stdlat2: 20.0 + WRTCMP_nx: 420 + WRTCMP_ny: 348 + WRTCMP_lon_lwr_left: -162.8 + WRTCMP_lat_lwr_left: 15.2 + WRTCMP_dx: 3000.0 + WRTCMP_dy: 3000.0 +# +#----------------------------------------------------------------------- +# +# EMC's Puerto Rico grid. +# +#----------------------------------------------------------------------- +# +"EMC_PR": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: -69.0 + ESGgrid_LAT_CTR: 18.0 + ESGgrid_DELX: 3000.0 + ESGgrid_DELY: 3000.0 + ESGgrid_NX: 576 # Supergrid value 1168 + ESGgrid_NY: 432 # Supergrid value 880 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 18 + LAYOUT_X: 16 + LAYOUT_Y: 8 + BLOCKSIZE: 24 + #if QUILTING = True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 24 + WRTCMP_output_grid: "lambert_conformal" + WRTCMP_cen_lon: -69.0 + WRTCMP_cen_lat: 18.0 + WRTCMP_stdlat1: 18.0 + WRTCMP_stdlat2: 18.0 + WRTCMP_nx: 576 + WRTCMP_ny: 432 + WRTCMP_lon_lwr_left: -77 + WRTCMP_lat_lwr_left: 12 + WRTCMP_dx: 3000.0 + WRTCMP_dy: 3000.0 +# +#----------------------------------------------------------------------- +# +# EMC's Guam grid. +# +#----------------------------------------------------------------------- +# +"EMC_GU": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: 146.0 + ESGgrid_LAT_CTR: 15.0 + ESGgrid_DELX: 3000.0 + ESGgrid_DELY: 3000.0 + ESGgrid_NX: 432 # Supergrid value 880 + ESGgrid_NY: 360 # Supergrid value 736 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 18 + LAYOUT_X: 16 + LAYOUT_Y: 12 + BLOCKSIZE: 27 + #if QUILTING = True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 24 + WRTCMP_output_grid: "lambert_conformal" + WRTCMP_cen_lon: 146.0 + WRTCMP_cen_lat: 15.0 + WRTCMP_stdlat1: 15.0 + WRTCMP_stdlat2: 15.0 + WRTCMP_nx: 420 + WRTCMP_ny: 348 + WRTCMP_lon_lwr_left: 140 + WRTCMP_lat_lwr_left: 10 + WRTCMP_dx: 3000.0 + WRTCMP_dy: 3000.0 +# +#----------------------------------------------------------------------- +# +# Emulation of the HAFS v0.A grid at 25 km. +# +#----------------------------------------------------------------------- +# +"GSL_HAFSV0.A_25km": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: -62.0 + ESGgrid_LAT_CTR: 22.0 + ESGgrid_DELX: 25000.0 + ESGgrid_DELY: 25000.0 + ESGgrid_NX: 345 + ESGgrid_NY: 230 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 300 + LAYOUT_X: 5 + LAYOUT_Y: 5 + BLOCKSIZE: 6 + #if QUILTING = True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 32 + WRTCMP_output_grid: "regional_latlon" + WRTCMP_cen_lon: -62.0 + WRTCMP_cen_lat: 25.0 + WRTCMP_lon_lwr_left: -114.5 + WRTCMP_lat_lwr_left: -5.0 + WRTCMP_lon_upr_rght: -9.5 + WRTCMP_lat_upr_rght: 55.0 + WRTCMP_dlon: 0.25 + WRTCMP_dlat: 0.25 +# +#----------------------------------------------------------------------- +# +# Emulation of the HAFS v0.A grid at 13 km. 
+# +#----------------------------------------------------------------------- +# +"GSL_HAFSV0.A_13km": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: -62.0 + ESGgrid_LAT_CTR: 22.0 + ESGgrid_DELX: 13000.0 + ESGgrid_DELY: 13000.0 + ESGgrid_NX: 665 + ESGgrid_NY: 444 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 180 + LAYOUT_X: 19 + LAYOUT_Y: 12 + BLOCKSIZE: 35 + #if QUILTING = True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 32 + WRTCMP_output_grid: "regional_latlon" + WRTCMP_cen_lon: -62.0 + WRTCMP_cen_lat: 25.0 + WRTCMP_lon_lwr_left: -114.5 + WRTCMP_lat_lwr_left: -5.0 + WRTCMP_lon_upr_rght: -9.5 + WRTCMP_lat_upr_rght: 55.0 + WRTCMP_dlon: 0.13 + WRTCMP_dlat: 0.13 +# +#----------------------------------------------------------------------- +# +# Emulation of the HAFS v0.A grid at 3 km. +# +#----------------------------------------------------------------------- +# +"GSL_HAFSV0.A_3km": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: -62.0 + ESGgrid_LAT_CTR: 22.0 + ESGgrid_DELX: 3000.0 + ESGgrid_DELY: 3000.0 + ESGgrid_NX: 2880 + ESGgrid_NY: 1920 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 40 + LAYOUT_X: 32 + LAYOUT_Y: 24 + BLOCKSIZE: 32 + #if QUILTING = True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 32 + WRTCMP_output_grid: "regional_latlon" + WRTCMP_cen_lon: -62.0 + WRTCMP_cen_lat: 25.0 + WRTCMP_lon_lwr_left: -114.5 + WRTCMP_lat_lwr_left: -5.0 + WRTCMP_lon_upr_rght: -9.5 + WRTCMP_lat_upr_rght: 55.0 + WRTCMP_dlon: 0.03 + WRTCMP_dlat: 0.03 +# +#----------------------------------------------------------------------- +# +# 50-km HRRR Alaska grid. +# +#----------------------------------------------------------------------- +# +"GSD_HRRR_AK_50km": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: -163.5 + ESGgrid_LAT_CTR: 62.8 + ESGgrid_DELX: 50000.0 + ESGgrid_DELY: 50000.0 + ESGgrid_NX: 74 + ESGgrid_NY: 51 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 600 + LAYOUT_X: 2 + LAYOUT_Y: 3 + BLOCKSIZE: 37 + #if QUILTING = True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 1 + WRTCMP_output_grid: "lambert_conformal" + WRTCMP_cen_lon: -163.5 + WRTCMP_cen_lat: 62.8 + WRTCMP_stdlat1: 62.8 + WRTCMP_stdlat2: 62.8 + WRTCMP_nx: 70 + WRTCMP_ny: 45 + WRTCMP_lon_lwr_left: 172.0 + WRTCMP_lat_lwr_left: 49.0 + WRTCMP_dx: 50000.0 + WRTCMP_dy: 50000.0 +# +#----------------------------------------------------------------------- +# +# Emulation of GSD's RAP domain with ~13km cell size. +# +#----------------------------------------------------------------------- +# +"RRFS_NA_13km": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: -112.5 + ESGgrid_LAT_CTR: 55.0 + ESGgrid_DELX: 13000.0 + ESGgrid_DELY: 13000.0 + ESGgrid_NX: 912 + ESGgrid_NY: 623 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 50 + LAYOUT_X: 16 + LAYOUT_Y: 16 + BLOCKSIZE: 30 + #if QUILTING = True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 16 + WRTCMP_output_grid: "rotated_latlon" + WRTCMP_cen_lon: -113.0 + WRTCMP_cen_lat: 55.0 + WRTCMP_lon_lwr_left: -61.0 + WRTCMP_lat_lwr_left: -37.0 + WRTCMP_lon_upr_rght: 61.0 + WRTCMP_lat_upr_rght: 37.0 + WRTCMP_dlon: 0.1169081 + WRTCMP_dlat: 0.1169081 +# +#----------------------------------------------------------------------- +# +# Future operational RRFS domain with ~3km cell size. 
+# +#----------------------------------------------------------------------- +# +"RRFS_NA_3km": + GRID_GEN_METHOD: "ESGgrid" + ESGgrid_LON_CTR: -112.5 + ESGgrid_LAT_CTR: 55.0 + ESGgrid_DELX: 3000.0 + ESGgrid_DELY: 3000.0 + ESGgrid_NX: 3950 + ESGgrid_NY: 2700 + ESGgrid_PAZI: 0.0 + ESGgrid_WIDE_HALO_WIDTH: 6 + DT_ATMOS: 36 + LAYOUT_X: 20 # 40 - EMC operational configuration + LAYOUT_Y: 35 # 45 - EMC operational configuration + BLOCKSIZE: 28 + #if QUILTING = True + WRTCMP_write_groups: 1 + WRTCMP_write_tasks_per_group: 144 + WRTCMP_output_grid: "rotated_latlon" + WRTCMP_cen_lon: -113.0 + WRTCMP_cen_lat: 55.0 + WRTCMP_lon_lwr_left: -61.0 + WRTCMP_lat_lwr_left: -37.0 + WRTCMP_lon_upr_rght: 61.0 + WRTCMP_lat_upr_rght: 37.0 + WRTCMP_dlon: 0.025 + WRTCMP_dlat: 0.025 + diff --git a/ush/python_utils/__init__.py b/ush/python_utils/__init__.py index 1e4f6feed..9371488d5 100644 --- a/ush/python_utils/__init__.py +++ b/ush/python_utils/__init__.py @@ -1,4 +1,4 @@ -from .change_case import uppercase, lowercase +from .misc import uppercase, lowercase, find_pattern_in_str, find_pattern_in_file from .check_for_preexist_dir_file import check_for_preexist_dir_file from .check_var_valid_value import check_var_valid_value from .count_files import count_files @@ -9,12 +9,14 @@ from .filesys_cmds_vrfy import cmd_vrfy, cp_vrfy, mv_vrfy, rm_vrfy, ln_vrfy, mkdir_vrfy, cd_vrfy from .get_charvar_from_netcdf import get_charvar_from_netcdf from .get_elem_inds import get_elem_inds -from .get_manage_externals_config_property import get_manage_externals_config_property from .interpol_to_arbit_CRES import interpol_to_arbit_CRES from .print_input_args import print_input_args from .print_msg import print_info_msg, print_err_msg_exit from .process_args import process_args from .run_command import run_command -from .config_parser import cfg_to_shell_str, cfg_to_yaml_str, yaml_safe_load, \ - load_shell_config, load_config_file - +from .config_parser import load_yaml_config, cfg_to_yaml_str, \ + load_json_config, cfg_to_json_str, \ + load_ini_config, cfg_to_ini_str, get_ini_value, \ + load_shell_config, cfg_to_shell_str, \ + load_config_file +from .xml_parser import load_xml_file, has_tag_with_value diff --git a/ush/python_utils/change_case.py b/ush/python_utils/change_case.py deleted file mode 100644 index 4fb46f94e..000000000 --- a/ush/python_utils/change_case.py +++ /dev/null @@ -1,25 +0,0 @@ -#!/usr/bin/env python3 - -def uppercase(str): - """ Function to convert a given string to uppercase - - Args: - str: the string - Return: - Uppercased str - """ - - return str.upper() - - -def lowercase(str): - """ Function to convert a given string to lowercase - - Args: - str: the string - Return: - Lowercase str - """ - - return str.lower() - diff --git a/ush/python_utils/check_for_preexist_dir_file.py b/ush/python_utils/check_for_preexist_dir_file.py index 743f80fcc..97c709cf0 100644 --- a/ush/python_utils/check_for_preexist_dir_file.py +++ b/ush/python_utils/check_for_preexist_dir_file.py @@ -24,7 +24,7 @@ def check_for_preexist_dir_file(path, method): rm_vrfy(' -rf ', path) elif method == 'rename': now = datetime.now() - d = now.strftime("_%Y%m%d_%H%M%S") + d = now.strftime("_old_%Y%m%d_%H%M%S") new_path = path + d print_info_msg(f''' Specified directory or file already exists: diff --git a/ush/python_utils/config_parser.py b/ush/python_utils/config_parser.py index f7d5a86ee..da7131df2 100644 --- a/ush/python_utils/config_parser.py +++ b/ush/python_utils/config_parser.py @@ -1,15 +1,78 @@ #!/usr/bin/env python3 + +""" +This file 
provides utilities for processing different configuration file formats. +Supported formats include: + a) YAML + b) JSON + c) SHELL + d) INI + +Typical usage involves first loading the config file, then using the dictionary +returnded by load_config to make queries. +""" + import argparse import yaml +import json import sys import os from textwrap import dedent +import configparser from .environment import list_to_str, str_to_list from .print_msg import print_err_msg_exit from .run_command import run_command +########## +# YAML +########## +def load_yaml_config(config_file): + """ Safe load a yaml file """ + + try: + with open(config_file,'r') as f: + cfg = yaml.safe_load(f) + except yaml.YAMLError as e: + print_err_msg_exit(e) + + return cfg + +def cfg_to_yaml_str(cfg): + """ Get contents of config file as a yaml string """ + + return yaml.dump(cfg, sort_keys=False, default_flow_style=False) + +def join_str(loader, node): + """ Custom tag hangler to join strings """ + seq = loader.construct_sequence(node) + return ''.join([str(i) for i in seq]) + +yaml.add_constructor('!join_str', join_str, Loader=yaml.SafeLoader) + +########## +# JSON +########## +def load_json_config(config_file): + """ Load json config file """ + + try: + with open(config_file,'r') as f: + cfg = json.load(f) + except: + print_err_msg_exit(e) + + return cfg + +def cfg_to_json_str(cfg): + """ Get contents of config file as a json string """ + + return json.dumps(cfg, sort_keys=False, indent=4) + +########## +# SHELL +########## def load_shell_config(config_file): """ Loads old style shell config files. We source the config script in a subshell and gets the variables it sets @@ -24,10 +87,11 @@ def load_shell_config(config_file): # do a diff to get variables specifically defined/updated in the script # Method sounds brittle but seems to work ok so far code = dedent(f''' #!/bin/bash - (set -o posix; set) > /tmp/t1 + (set -o posix; set) > ./_t1 {{ . 
{config_file}; set +x; }} &>/dev/null - (set -o posix; set) > /tmp/t2 - diff /tmp/t1 /tmp/t2 | grep "> " | cut -c 3- + (set -o posix; set) > ./_t2 + diff ./_t1 ./_t2 | grep "> " | cut -c 3- + rm -rf ./_t1 ./_t2 ''') (_,config_str,_) = run_command(code) lines = config_str.splitlines() @@ -41,16 +105,16 @@ def load_shell_config(config_file): cfg[k] = v return cfg -def cfg_to_yaml_str(cfg): - """ Get contents of config file as a yaml string """ - - return yaml.dump(cfg, sort_keys=False, default_flow_style=False) - def cfg_to_shell_str(cfg): - """ Get contents of yaml file as shell script string""" + """ Get contents of config file as shell script string""" shell_str = '' for k,v in cfg.items(): + if isinstance(v,dict): + shell_str += f"# [{k}]\n" + shell_str += cfg_to_shell_str(v) + shell_str += "\n" + continue v1 = list_to_str(v) if isinstance(v,list): shell_str += f'{k}={v1}\n' @@ -58,46 +122,85 @@ def cfg_to_shell_str(cfg): shell_str += f"{k}='{v1}'\n" return shell_str -def yaml_safe_load(file_name): - """ Safe load a yaml file """ +########## +# INI +########## +def load_ini_config(config_file): + """ Load a config file with a format similar to Microsoft's INI files""" - try: - with open(file_name,'r') as f: - cfg = yaml.safe_load(f) - except yaml.YAMLError as e: - print_err_msg_exit(e) + if not os.path.exists(config_file): + print_err_msg_exit(f''' + The specified configuration file does not exist: + \"{file_name}\"''') + + config = configparser.ConfigParser() + config.read(config_file) + config_dict = {s:dict(config.items(s)) for s in config.sections()} + return config_dict + +def get_ini_value(config, section, key): + """ Finds the value of a property in a given section""" + + if not section in config: + print_err_msg_exit(f''' + Section not found: + section = \"{section}\" + valid sections = \"{config.keys()}\"''') + else: + return config[section][key] - return cfg + return None -def join(loader, node): - """ Custom tag hangler to join strings """ - seq = loader.construct_sequence(node) - return ''.join([str(i) for i in seq]) +def cfg_to_ini_str(cfg): + """ Get contents of config file as ini string""" -yaml.add_constructor('!join', join, Loader=yaml.SafeLoader) + ini_str = '' + for k,v in cfg.items(): + if isinstance(v,dict): + ini_str += f"[{k}]\n" + ini_str += cfg_to_ini_str(v) + ini_str += "\n" + continue + v1 = list_to_str(v) + if isinstance(v,list): + ini_str += f'{k}={v1}\n' + else: + ini_str += f"{k}='{v1}'\n" + return ini_str +################## +# CONFIG loader +################## def load_config_file(file_name): - """ Choose yaml/shell file based on extension """ + """ Load config file based on file name extension """ + ext = os.path.splitext(file_name)[1][1:] if ext == "sh": return load_shell_config(file_name) + elif ext == "cfg": + return load_ini_config(file_name) + elif ext == "json": + return load_json_config(file_name) else: - return yaml_safe_load(file_name) - + return load_yaml_config(file_name) if __name__ == "__main__": parser = argparse.ArgumentParser(description=\ - 'Prints contents of yaml file as bash argument-value pairs.') + 'Prints contents of config file.') parser.add_argument('--cfg','-c',dest='cfg',required=True, - help='yaml or regular shell config file to parse') + help='config file to parse') parser.add_argument('--output-type','-o',dest='out_type',required=False, - help='output format: "shell": shell format, any other: yaml format ') + help='output format: can be any of ["shell", "yaml", "ini", "json"]') args = parser.parse_args() cfg = 
load_config_file(args.cfg) if args.out_type == 'shell': print( cfg_to_shell_str(cfg) ) + elif args.out_type == 'ini': + print( cfg_to_ini_str(cfg) ) + elif args.out_type == 'json': + print( cfg_to_json_str(cfg) ) else: print( cfg_to_yaml_str(cfg) ) diff --git a/ush/python_utils/environment.py b/ush/python_utils/environment.py index e98deaad2..25f03b8fd 100644 --- a/ush/python_utils/environment.py +++ b/ush/python_utils/environment.py @@ -126,7 +126,7 @@ def str_to_list(v): v = v.strip() if not v: return None - if v[0] == '(': + if v[0] == '(' and v[-1] == ')': v = v[1:-1] tokens = shlex.split(v) lst = [] diff --git a/ush/python_utils/fv3write_parms_lambert.py b/ush/python_utils/fv3write_parms_lambert.py new file mode 100755 index 000000000..18b4a8f35 --- /dev/null +++ b/ush/python_utils/fv3write_parms_lambert.py @@ -0,0 +1,91 @@ +#!/usr/bin/env python +# +# To use this tool, you should source the regional workflow environment +# $> source env/wflow_xxx.env +# and activate pygraf (or any one with cartopy installation) +# $> conda activate pygraf +# + +import argparse + +import cartopy.crs as ccrs + +#@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ +# +# Main function to return parameters for the FV3 write component. +# + +if __name__ == "__main__": + + parser = argparse.ArgumentParser(description='Determine FV3 write component lat1/lon1 for Lamert Conformal map projection', + epilog=''' ---- Yunheng Wang (2021-07-15). + ''') + #formatter_class=CustomFormatter) + parser.add_argument('-v','--verbose', help='Verbose output', action="store_true") + parser.add_argument('-ca','--ctrlat', help='Lambert Conformal central latitude', type=float, default=38.5 ) + parser.add_argument('-co','--ctrlon', help='Lambert Conformal central longitude', type=float, default=-97.5 ) + parser.add_argument('-s1','--stdlat1',help='Lambert Conformal standard latitude1', type=float, default=38.5 ) + parser.add_argument('-s2','--stdlat2',help='Lambert Conformal standard latitude2', type=float, default=38.5 ) + parser.add_argument('-nx', help='number of grid in X direction', type=int, default=301 ) + parser.add_argument('-ny' ,help='number of grid in Y direction', type=int, default=301 ) + parser.add_argument('-dx' ,help='grid resolution in X direction (meter)',type=float, default=3000.0) + parser.add_argument('-dy' ,help='grid resolution in Y direction (meter)',type=float, default=3000.0) + + args = parser.parse_args() + + if args.verbose: + print("Write component Lambert Conformal Parameters:") + print(f" cen_lat = {args.ctrlat}, cen_lon = {args.ctrlon}, stdlat1 = {args.stdlat1}, stdlat2 = {args.stdlat2}") + print(f" nx = {args.nx}, ny = {args.ny}, dx = {args.dx}, dy = {args.dy}") + + #----------------------------------------------------------------------- + # + # Lambert grid + # + #----------------------------------------------------------------------- + + nx1 = args.nx + ny1 = args.ny + dx1 = args.dx + dy1 = args.dy + + ctrlat = args.ctrlat + ctrlon = args.ctrlon + + xctr = (nx1-1)/2*dx1 + yctr = (ny1-1)/2*dy1 + + carr= ccrs.PlateCarree() + + proj1=ccrs.LambertConformal(central_longitude=ctrlon, central_latitude=ctrlat, + false_easting=xctr, false_northing= yctr, secant_latitudes=None, + standard_parallels=(args.stdlat1, args.stdlat2), globe=None) + + lonlat1 = carr.transform_point(0.0,0.0,proj1) + + if args.verbose: + print() + print(f' lat1 = {lonlat1[1]}, lon1 = {lonlat1[0]}') + print('\n') + + #----------------------------------------------------------------------- + # + # Output write 
component parameters + # + #----------------------------------------------------------------------- + + print() + print("output_grid: 'lambert_conformal'") + print(f"cen_lat: {args.ctrlat}") + print(f"cen_lon: {args.ctrlon}") + print(f"stdlat1: {args.stdlat1}") + print(f"stdlat2: {args.stdlat2}") + print(f"nx: {args.nx}") + print(f"ny: {args.ny}") + print(f"dx: {args.dx}") + print(f"dy: {args.dy}") + print(f"lat1: {lonlat1[1]}") + print(f"lon1: {lonlat1[0]}") + print() + + # End of program diff --git a/ush/python_utils/get_elem_inds.py b/ush/python_utils/get_elem_inds.py index fd4ea2cee..08cbfbff7 100644 --- a/ush/python_utils/get_elem_inds.py +++ b/ush/python_utils/get_elem_inds.py @@ -1,6 +1,6 @@ #!/usr/bin/env python3 -from .change_case import lowercase +from .misc import lowercase from .check_var_valid_value import check_var_valid_value def get_elem_inds(arr, match, ret_type): diff --git a/ush/python_utils/get_manage_externals_config_property.py b/ush/python_utils/get_manage_externals_config_property.py deleted file mode 100644 index c537a8824..000000000 --- a/ush/python_utils/get_manage_externals_config_property.py +++ /dev/null @@ -1,54 +0,0 @@ -#!/usr/bin/env python3 - -import os -import configparser - -from .print_msg import print_err_msg_exit - -def get_manage_externals_config_property(externals_cfg_fp, external_name, property_name): - """ - This function searches a specified manage_externals configuration file - and extracts from it the value of the specified property of the external - with the specified name (e.g. the relative path in which the external - has been/will be cloned by the manage_externals utility). - - Args: - - externals_cfg_fp: - The absolute or relative path to the manage_externals configuration - file that will be searched. - - external_name: - The name of the external to search for in the manage_externals confi- - guration file specified by externals_cfg_fp. - - property_name: - The name of the property whose value to obtain (for the external spe- - cified by external_name). 
- - Returns: - The property value - """ - - if not os.path.exists(externals_cfg_fp): - print_err_msg_exit(f''' - The specified manage_externals configuration file (externals_cfg_fp) - does not exist: - externals_cfg_fp = \"{externals_cfg_fp}\"''') - - config = configparser.ConfigParser() - config.read(externals_cfg_fp) - - if not external_name in config.sections(): - print_err_msg_exit(f''' - In the specified manage_externals configuration file (externals_cfg_fp), - the specified property (property_name) was not found for the the speci- - fied external (external_name): - externals_cfg_fp = \"{externals_cfg_fp}\" - external_name = \"{external_name}\" - property_name = \"{property_name}\"''') - else: - return config[external_name][property_name] - - return None - diff --git a/ush/python_utils/misc.py b/ush/python_utils/misc.py new file mode 100644 index 000000000..1934ac3d6 --- /dev/null +++ b/ush/python_utils/misc.py @@ -0,0 +1,57 @@ +#!/usr/bin/env python3 + +import re + +def uppercase(str): + """ Function to convert a given string to uppercase + + Args: + str: the string + Return: + Uppercased str + """ + + return str.upper() + + +def lowercase(str): + """ Function to convert a given string to lowercase + + Args: + str: the string + Return: + Lowercase str + """ + + return str.lower() + +def find_pattern_in_str(pattern, source): + """ Find regex pattern in a string + + Args: + pattern: regex expression + source: string + Return: + A tuple of matched groups or None + """ + pattern = re.compile(pattern) + for match in re.finditer(pattern,source): + return match.groups() + return None + +def find_pattern_in_file(pattern, file_name): + """ Find regex pattern in a file + + Args: + pattern: regex expression + file_name: name of text file + Return: + A tuple of matched groups or None + """ + pattern = re.compile(pattern) + with open(file_name) as f: + for line in f: + for match in re.finditer(pattern,line): + return match.groups() + return None + diff --git a/ush/python_utils/print_input_args.py b/ush/python_utils/print_input_args.py index 4f168a612..25d979eae 100644 --- a/ush/python_utils/print_input_args.py +++ b/ush/python_utils/print_input_args.py @@ -4,7 +4,7 @@ import inspect from textwrap import dedent -from .change_case import lowercase +from .misc import lowercase from .print_msg import print_info_msg from .environment import import_vars diff --git a/ush/python_utils/test_python_utils.py b/ush/python_utils/test_python_utils.py index 568de918e..52defb1c6 100644 --- a/ush/python_utils/test_python_utils.py +++ b/ush/python_utils/test_python_utils.py @@ -18,9 +18,23 @@ from python_utils import * class Testing(unittest.TestCase): - def test_change_case(self): + def test_misc(self): self.assertEqual( uppercase('upper'), 'UPPER' ) self.assertEqual( lowercase('LOWER'), 'lower' ) + # regex in file + pattern = f'^[ ]*(lsm_ruc)<\/scheme>[ ]*$' + FILE=f"{self.PATH}/../test_data/suite_FV3_GSD_SAR.xml" + match = find_pattern_in_file(pattern, FILE) + self.assertEqual( ("lsm_ruc",), match) + # regex in string + with open(FILE) as f: + content = f.read() + find_pattern_in_str(pattern, content) + self.assertEqual( ("lsm_ruc",), match) + def test_xml_parser(self): + FILE=f"{self.PATH}/../test_data/suite_FV3_GSD_SAR.xml" + tree = load_xml_file(FILE) + self.assertTrue(has_tag_with_value(tree,"scheme","lsm_ruc")) def test_check_for_preexist_dir_file(self): cmd_vrfy('mkdir -p test_data/dir') self.assertTrue( os.path.exists('test_data/dir') ) @@ -38,8 +52,8 @@ def test_filesys_cmds(self): 
dPATH=f'{self.PATH}/test_data/dir' mkdir_vrfy(dPATH) self.assertTrue( os.path.exists(dPATH) ) - cp_vrfy(f'{self.PATH}/change_case.py', f'{dPATH}/change_cases.py') - self.assertTrue( os.path.exists(f'{dPATH}/change_cases.py') ) + cp_vrfy(f'{self.PATH}/misc.py', f'{dPATH}/miscs.py') + self.assertTrue( os.path.exists(f'{dPATH}/miscs.py') ) cmd_vrfy(f'rm -rf {dPATH}') self.assertFalse( os.path.exists('tt.py') ) def test_get_charvar_from_netcdf(self): @@ -53,13 +67,6 @@ def test_get_elem_inds(self): self.assertEqual( get_elem_inds(arr, 'egg', 'first' ) , 0 ) self.assertEqual( get_elem_inds(arr, 'egg', 'last' ) , 4 ) self.assertEqual( get_elem_inds(arr, 'egg', 'all' ) , [0, 2, 4] ) - def test_get_manage_externals_config_property(self): - self.assertIn( \ - 'regional_workflow', - get_manage_externals_config_property( \ - f'{self.PATH}/test_data/Externals.cfg', - 'regional_workflow', - 'repo_url')) def test_interpol_to_arbit_CRES(self): RES = 800 RES_array = [ 5, 25, 40, 60, 80, 100, 400, 700, 1000, 1500, 2800, 3000 ] @@ -95,7 +102,7 @@ def test_import_vars(self): set_env_var("MYVAR","MYVAL") env_vars = ["PWD", "MYVAR"] import_vars(env_vars=env_vars) - self.assertEqual( PWD, os.getcwd() ) + self.assertEqual( os.path.realpath(PWD), os.path.realpath(os.getcwd()) ) self.assertEqual(MYVAR,"MYVAL") #test export MYVAR="MYNEWVAL" @@ -106,10 +113,26 @@ def test_import_vars(self): dictionary = { "Hello": "World!" } import_vars(dictionary=dictionary) self.assertEqual( Hello, "World!" ) + #test array + shell_str='("1" "2") \n' + v = str_to_list(shell_str) + self.assertTrue( isinstance(v,list) ) + self.assertEqual(v, [1, 2]) + shell_str = '( "1" "2" \n' + v = str_to_list(shell_str) + self.assertFalse( isinstance(v,list) ) def test_config_parser(self): cfg = { "HRS": [ "1", "2" ] } shell_str = cfg_to_shell_str(cfg) self.assertEqual( shell_str, 'HRS=( "1" "2" )\n') + # ini file + cfg = load_ini_config(f'{self.PATH}/test_data/Externals.cfg') + self.assertIn( \ + 'regional_workflow', + get_ini_value( \ + cfg, + 'regional_workflow', + 'repo_url')) def test_print_msg(self): self.assertEqual( print_info_msg("Hello World!", verbose=False), False) def setUp(self): diff --git a/ush/python_utils/xml_parser.py b/ush/python_utils/xml_parser.py new file mode 100644 index 000000000..0428259c6 --- /dev/null +++ b/ush/python_utils/xml_parser.py @@ -0,0 +1,30 @@ +#!/usr/bin/env python3 + +import xml.etree.ElementTree as ET + +def load_xml_file(xml_file): + """ Loads xml file + + Args: + xml_file: path to xml file + Returns: + root of the xml tree + """ + tree = ET.parse(xml_file) + return tree + +def has_tag_with_value(tree, tag, value): + """ Check if xml tree has a node with tag and value + + Args: + tree: the xml tree + tag: the tag + value: text of tag + Returns: + Boolean + """ + for node in tree.iter(): + if node.tag == tag and node.text == value: + return True + return False + diff --git a/ush/retrieve_data.py b/ush/retrieve_data.py index 86cb87e07..d68c08237 100755 --- a/ush/retrieve_data.py +++ b/ush/retrieve_data.py @@ -578,7 +578,7 @@ def main(cla): write_summary_file(cla, data_store, file_templates) break - logging.warning(f'Requested files are unavialable from {data_store}') + logging.warning(f'Requested files are unavailable from {data_store}') if unavailable: logging.error('Could not find any of the requested files.') @@ -716,7 +716,6 @@ def parse_args(): output = subprocess.run('which hsi', check=True, shell=True, - capture_output=True, ) except subprocess.CalledProcessError: logging.error('You requested the 
hpss data store, but ' \ diff --git a/ush/set_FV3nml_ens_stoch_seeds.py b/ush/set_FV3nml_ens_stoch_seeds.py new file mode 100644 index 000000000..9d9ae4b39 --- /dev/null +++ b/ush/set_FV3nml_ens_stoch_seeds.py @@ -0,0 +1,137 @@ +#!/usr/bin/env python3 + +import unittest +import os +from textwrap import dedent +from datetime import datetime + +from python_utils import print_input_args, print_info_msg, print_err_msg_exit,\ + date_to_str, mkdir_vrfy,cp_vrfy,\ + import_vars,set_env_var,\ + define_macos_utilities, cfg_to_yaml_str + +from set_namelist import set_namelist + +def set_FV3nml_ens_stoch_seeds(cdate): + """ + This function, for an ensemble-enabled experiment + (i.e. for an experiment for which the workflow configuration variable + DO_ENSEMBLE has been set to "TRUE"), creates new namelist files with + unique stochastic "seed" parameters, using a base namelist file in the + ${EXPTDIR} directory as a template. These new namelist files are stored + within each member directory housed within each cycle directory. Files + of any two ensemble members differ only in their stochastic "seed" + parameter values. These namelist files are generated when this file is + called as part of the RUN_FCST_TN task. + + Args: + cdate + Returns: + None + """ + + print_input_args(locals()) + + # import all environment variables + import_vars() + + # + #----------------------------------------------------------------------- + # + # For a given cycle and member, generate a namelist file with unique + # seed values. + # + #----------------------------------------------------------------------- + # + ensmem_name=f"mem{ENSMEM_INDX}" + + fv3_nml_ensmem_fp=os.path.join(CYCLE_BASEDIR, f"{date_to_str(cdate,True)}{os.sep}{ensmem_name}{os.sep}{FV3_NML_FN}") + + ensmem_num=ENSMEM_INDX + + cdate_i = int(cdate.strftime('%Y%m%d')) + + settings = {} + nam_stochy_dict = {} + + if DO_SPP: + iseed_sppt=cdate_i*1000 + ensmem_num*10 + 1 + nam_stochy_dict.update({ + 'iseed_sppt': iseed_sppt + }) + + if DO_SHUM: + iseed_shum=cdate_i*1000 + ensmem_num*10 + 2 + nam_stochy_dict.update({ + 'iseed_shum': iseed_shum + }) + + if DO_SKEB: + iseed_skeb=cdate_i*1000 + ensmem_num*10 + 3 + nam_stochy_dict.update({ + 'iseed_skeb': iseed_skeb + }) + + settings['nam_stochy'] = nam_stochy_dict + + if DO_SPP: + num_iseed_spp=len(ISEED_SPP) + iseed_spp = [None]*num_iseed_spp + for i in range(num_iseed_spp): + iseed_spp[i]=cdate_i*1000 + ensmem_num*10 + ISEED_SPP[i] + + settings['nam_spperts'] = { + 'iseed_spp': iseed_spp + } + else: + settings['nam_spperts'] = {} + + if DO_LSM_SPP: + iseed_lsm_spp=cdate_i*1000 + ensmem_num*10 + 9 + + settings['nam_sppperts'] = { + 'iseed_lndp': [iseed_lsm_spp] + } + + settings_str = cfg_to_yaml_str(settings) + try: + set_namelist(["-q", "-n", FV3_NML_FP, "-u", settings_str, "-o", fv3_nml_ensmem_fp]) + except: + print_err_msg_exit(dedent(f''' + Call to python script set_namelist.py to set the variables in the FV3 + namelist file that specify the paths to the surface climatology files + failed. 
Parameters passed to this script are: + Full path to base namelist file: + FV3_NML_FP = \"{FV3_NML_FP}\" + Full path to output namelist file: + fv3_nml_ensmem_fp = \"{fv3_nml_ensmem_fp}\" + Namelist settings specified on command line (these have highest precedence): + settings = + {settings_str}''')) + +class Testing(unittest.TestCase): + def test_set_FV3nml_ens_stoch_seeds(self): + set_FV3nml_ens_stoch_seeds(cdate=self.cdate) + def setUp(self): + define_macos_utilities(); + set_env_var('DEBUG',True) + set_env_var('VERBOSE',True) + USHDIR = os.path.dirname(os.path.abspath(__file__)) + EXPTDIR = os.path.join(USHDIR,"test_data","expt"); + cp_vrfy(os.path.join(USHDIR,f'templates{os.sep}input.nml.FV3'), \ + os.path.join(EXPTDIR,'input.nml')) + self.cdate=datetime(2021, 1, 1) + + mkdir_vrfy("-p", os.path.join(EXPTDIR,f'{date_to_str(self.cdate,True)}{os.sep}mem0')) + set_env_var("USHDIR",USHDIR) + set_env_var("CYCLE_BASEDIR",EXPTDIR) + set_env_var("ENSMEM_INDX",0) + set_env_var("FV3_NML_FN","input.nml") + set_env_var("FV3_NML_FP",os.path.join(EXPTDIR,"input.nml")) + set_env_var("DO_SPP",True) + set_env_var("DO_SHUM",True) + set_env_var("DO_SKEB",True) + set_env_var("DO_LSM_SPP",True) + ISEED_SPP = [ 4, 4, 4, 4, 4] + set_env_var("ISEED_SPP",ISEED_SPP) + diff --git a/ush/set_FV3nml_sfc_climo_filenames.py b/ush/set_FV3nml_sfc_climo_filenames.py new file mode 100644 index 000000000..518f81ae3 --- /dev/null +++ b/ush/set_FV3nml_sfc_climo_filenames.py @@ -0,0 +1,141 @@ +#!/usr/bin/env python3 + +import unittest +import os +from textwrap import dedent + +from python_utils import print_input_args, print_info_msg, print_err_msg_exit,\ + check_var_valid_value,mv_vrfy,mkdir_vrfy,cp_vrfy,\ + rm_vrfy,import_vars,set_env_var,\ + define_macos_utilities,find_pattern_in_str,cfg_to_yaml_str + +from set_namelist import set_namelist + +def set_FV3nml_sfc_climo_filenames(): + """ + This function sets the values of the variables in + the forecast model's namelist file that specify the paths to the surface + climatology files on the FV3LAM native grid (which are either pregenerated + or created by the MAKE_SFC_CLIMO_TN task). Note that the workflow + generation scripts create symlinks to these surface climatology files + in the FIXLAM directory, and the values in the namelist file that get + set by this function are relative or full paths to these links. + + Args: + None + Returns: + None + """ + + # import all environment variables + import_vars() + + # The regular expression regex_search set below will be used to extract + # from the elements of the array FV3_NML_VARNAME_TO_SFC_CLIMO_FIELD_MAPPING + # the name of the namelist variable to set and the corresponding surface + # climatology field from which to form the name of the surface climatology file + regex_search = "^[ ]*([^| ]+)[ ]*[|][ ]*([^| ]+)[ ]*$" + + # Set the suffix of the surface climatology files. 
+ suffix = "tileX.nc" + + # create yaml-complaint string + settings = {} + + dummy_run_dir = os.path.join(EXPTDIR, "any_cyc") + if DO_ENSEMBLE == "TRUE": + dummy_run_dir += os.sep + "any_ensmem" + + namsfc_dict = {} + for mapping in FV3_NML_VARNAME_TO_SFC_CLIMO_FIELD_MAPPING: + tup = find_pattern_in_str(regex_search, mapping) + nml_var_name = tup[0] + sfc_climo_field_name = tup[1] + + check_var_valid_value(sfc_climo_field_name, SFC_CLIMO_FIELDS) + + fp = os.path.join(FIXLAM, f'{CRES}.{sfc_climo_field_name}.{suffix}') + if RUN_ENVIR != "nco": + fp = os.path.relpath(os.path.realpath(fp), start=dummy_run_dir) + + namsfc_dict[nml_var_name] = fp + + settings['namsfc_dict'] = namsfc_dict + settings_str = cfg_to_yaml_str(settings) + + + print_info_msg(f''' + The variable \"settings\" specifying values of the namelist variables + has been set as follows: + + settings = + {settings_str}''', verbose=DEBUG) + + # Rename the FV3 namelist and call set_namelist + fv3_nml_base_fp = f'{FV3_NML_FP}.base' + mv_vrfy(f'{FV3_NML_FP} {fv3_nml_base_fp}') + + try: + set_namelist(["-q", "-n", fv3_nml_base_fp, "-u", settings_str, "-o", FV3_NML_FP]) + except: + print_err_msg_exit(f''' + Call to python script set_namelist.py to set the variables in the FV3 + namelist file that specify the paths to the surface climatology files + failed. Parameters passed to this script are: + Full path to base namelist file: + fv3_nml_base_fp = \"{fv3_nml_base_fp}\" + Full path to output namelist file: + FV3_NML_FP = \"{FV3_NML_FP}\" + Namelist settings specified on command line (these have highest precedence): + settings = + {settings_str}''') + + rm_vrfy(f'{fv3_nml_base_fp}') + +class Testing(unittest.TestCase): + def test_set_FV3nml_sfc_climo_filenames(self): + set_FV3nml_sfc_climo_filenames() + def setUp(self): + define_macos_utilities(); + set_env_var('DEBUG',True) + set_env_var('VERBOSE',True) + USHDIR = os.path.dirname(os.path.abspath(__file__)) + EXPTDIR = os.path.join(USHDIR, "test_data", "expt"); + FIXLAM = os.path.join(EXPTDIR, "fix_lam") + mkdir_vrfy("-p",FIXLAM) + cp_vrfy(os.path.join(USHDIR,f'templates{os.sep}input.nml.FV3'), \ + os.path.join(EXPTDIR,'input.nml')) + set_env_var("USHDIR",USHDIR) + set_env_var("EXPTDIR",EXPTDIR) + set_env_var("FIXLAM",FIXLAM) + set_env_var("DO_ENSEMBLE",False) + set_env_var("CRES","C3357") + set_env_var("RUN_ENVIR","nco") + set_env_var("FV3_NML_FP",os.path.join(EXPTDIR,"input.nml")) + + FV3_NML_VARNAME_TO_SFC_CLIMO_FIELD_MAPPING=[ + "FNALBC | snowfree_albedo", + "FNALBC2 | facsf", + "FNTG3C | substrate_temperature", + "FNVEGC | vegetation_greenness", + "FNVETC | vegetation_type", + "FNSOTC | soil_type", + "FNVMNC | vegetation_greenness", + "FNVMXC | vegetation_greenness", + "FNSLPC | slope_type", + "FNABSC | maximum_snow_albedo" + ] + SFC_CLIMO_FIELDS=[ + "facsf", + "maximum_snow_albedo", + "slope_type", + "snowfree_albedo", + "soil_type", + "substrate_temperature", + "vegetation_greenness", + "vegetation_type" + ] + set_env_var("FV3_NML_VARNAME_TO_SFC_CLIMO_FIELD_MAPPING", + FV3_NML_VARNAME_TO_SFC_CLIMO_FIELD_MAPPING) + set_env_var("SFC_CLIMO_FIELDS",SFC_CLIMO_FIELDS) + diff --git a/ush/set_cycle_dates.py b/ush/set_cycle_dates.py new file mode 100644 index 000000000..69a707b97 --- /dev/null +++ b/ush/set_cycle_dates.py @@ -0,0 +1,54 @@ +#!/usr/bin/env python3 + +import unittest +from datetime import datetime,timedelta,date + +from python_utils import print_input_args, print_err_msg_exit + +def set_cycle_dates(date_start, date_end, cycle_hrs, incr_cycl_freq): + """ This file defines a 
function that, given the starting date (date_start,
+    in the form YYYYMMDD), the ending date (date_end, in the form YYYYMMDD),
+    and an array containing the cycle hours for each day (whose elements
+    have the form HH), returns an array of cycle date-hours whose elements
+    have the form YYYYMMDDHH. Here, YYYY is a four-digit year, MM is a two-
+    digit month, DD is a two-digit day of the month, and HH is a two-digit
+    hour of the day.
+
+    Args:
+        date_start: start date
+        date_end: end date
+        cycle_hrs: [ HH0, HH1, ...]
+        incr_cycl_freq: cycle frequency increment in hours
+    Returns:
+        A list of dates in the format YYYYMMDDHH
+    """
+
+    print_input_args(locals())
+
+    #calculate date increment
+    if incr_cycl_freq <= 24:
+        incr_days = 1
+    else:
+        incr_days = incr_cycl_freq // 24
+        if incr_cycl_freq % 24 != 0:
+            print_err_msg_exit(f'''
+                INCR_CYCL_FREQ is not evenly divisible by 24:
+                  INCR_CYCL_FREQ = \"{incr_cycl_freq}\"''')
+
+    #iterate over days and cycles
+    all_cdates = []
+    d = date_start
+    while d <= date_end:
+        for c in cycle_hrs:
+            dc = d + timedelta(hours=c)
+            v = datetime.strftime(dc,'%Y%m%d%H')
+            all_cdates.append(v)
+        d += timedelta(days=incr_days)
+
+    return all_cdates
+
+class Testing(unittest.TestCase):
+    def test_set_cycle_dates(self):
+        cdates = set_cycle_dates(date_start=datetime(2022,1,1), date_end=datetime(2022,1,4),
+                                 cycle_hrs=[6,12], incr_cycl_freq=48)
+        self.assertEqual(cdates, ['2022010106', '2022010112','2022010306', '2022010312'])
diff --git a/ush/set_extrn_mdl_params.py b/ush/set_extrn_mdl_params.py
new file mode 100644
index 000000000..f4fffc590
--- /dev/null
+++ b/ush/set_extrn_mdl_params.py
@@ -0,0 +1,56 @@
+#!/usr/bin/env python3
+
+import unittest
+
+from python_utils import import_vars, export_vars, set_env_var, get_env_var
+
+def set_extrn_mdl_params():
+    """ Sets parameters associated with the external model used for initial
+    conditions (ICs) and lateral boundary conditions (LBCs).
+    Args:
+        None
+    Returns:
+        None
+    """
+
+    #import all env variables
+    import_vars()
+
+    global EXTRN_MDL_LBCS_OFFSET_HRS
+
+    #
+    #-----------------------------------------------------------------------
+    #
+    # Set EXTRN_MDL_LBCS_OFFSET_HRS, which is the number of hours to shift
+    # the starting time of the external model that provides lateral boundary
+    # conditions.
+    #
+    #-----------------------------------------------------------------------
+    #
+    if EXTRN_MDL_NAME_LBCS == "RAP":
+        EXTRN_MDL_LBCS_OFFSET_HRS=EXTRN_MDL_LBCS_OFFSET_HRS or "3"
+    else:
+        EXTRN_MDL_LBCS_OFFSET_HRS=EXTRN_MDL_LBCS_OFFSET_HRS or "0"
+
+    # export values we set above
+    env_vars = ["EXTRN_MDL_LBCS_OFFSET_HRS"]
+    export_vars(env_vars=env_vars)
+#
+#-----------------------------------------------------------------------
+#
+# Call the function defined above.
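The set_cycle_dates logic above can be exercised on its own; a minimal sketch that reuses the inputs from its unit test (start 2022-01-01, end 2022-01-04, cycle hours 6 and 12, 48-hour increment) without importing python_utils:

```
from datetime import datetime, timedelta

date_start, date_end = datetime(2022, 1, 1), datetime(2022, 1, 4)
cycle_hrs, incr_cycl_freq = [6, 12], 48

# A 48-hour increment advances the cycle date by 2 days at a time.
incr_days = 1 if incr_cycl_freq <= 24 else incr_cycl_freq // 24

all_cdates = []
d = date_start
while d <= date_end:
    for c in cycle_hrs:
        all_cdates.append((d + timedelta(hours=c)).strftime("%Y%m%d%H"))
    d += timedelta(days=incr_days)

print(all_cdates)
# ['2022010106', '2022010112', '2022010306', '2022010312']
```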
+# +#----------------------------------------------------------------------- +# +if __name__ == "__main__": + set_extrn_mdl_params() + +class Testing(unittest.TestCase): + def test_extrn_mdl_params(self): + set_extrn_mdl_params() + EXTRN_MDL_LBCS_OFFSET_HRS = get_env_var("EXTRN_MDL_LBCS_OFFSET_HRS") + self.assertEqual(EXTRN_MDL_LBCS_OFFSET_HRS,3) + + def setUp(self): + set_env_var("EXTRN_MDL_NAME_LBCS","RAP") + set_env_var("EXTRN_MDL_LBCS_OFFSET_HRS",None) diff --git a/ush/set_extrn_mdl_params.sh b/ush/set_extrn_mdl_params.sh index f110a4c26..12b11e88d 100644 --- a/ush/set_extrn_mdl_params.sh +++ b/ush/set_extrn_mdl_params.sh @@ -6,23 +6,6 @@ #----------------------------------------------------------------------- # function set_extrn_mdl_params() { - # - #----------------------------------------------------------------------- - # - # Use known locations or COMINgfs as default, depending on RUN_ENVIR - # - #----------------------------------------------------------------------- - # - if [ "${RUN_ENVIR}" = "nco" ]; then - EXTRN_MDL_SYSBASEDIR_ICS="${EXTRN_MDL_SYSBASEDIR_ICS:-$COMINgfs}" - EXTRN_MDL_SYSBASEDIR_LBCS="${EXTRN_MDL_SYSBASEDIR_LBCS:-$COMINgfs}" - else - EXTRN_MDL_SYSBASEDIR_ICS="${EXTRN_MDL_SYSBASEDIR_ICS:-$(set_known_sys_dir \ - ${EXTRN_MDL_NAME_ICS})}" - EXTRN_MDL_SYSBASEDIR_LBCS="${EXTRN_MDL_SYSBASEDIR_LBCS:-$(set_known_sys_dir \ - ${EXTRN_MDL_NAME_LBCS})}" - fi - # #----------------------------------------------------------------------- # @@ -36,7 +19,7 @@ function set_extrn_mdl_params() { "RAP") EXTRN_MDL_LBCS_OFFSET_HRS=${EXTRN_MDL_LBCS_OFFSET_HRS:-"3"} ;; - "*") + *) EXTRN_MDL_LBCS_OFFSET_HRS=${EXTRN_MDL_LBCS_OFFSET_HRS:-"0"} ;; esac diff --git a/ush/set_gridparams_ESGgrid.py b/ush/set_gridparams_ESGgrid.py new file mode 100644 index 000000000..c1ead9522 --- /dev/null +++ b/ush/set_gridparams_ESGgrid.py @@ -0,0 +1,110 @@ +#!/usr/bin/env python3 + +import unittest +from datetime import datetime,timedelta + +from constants import radius_Earth,degs_per_radian +from python_utils import import_vars, set_env_var, print_input_args + +def set_gridparams_ESGgrid(lon_ctr,lat_ctr,nx,ny,halo_width,delx,dely,pazi): + """ Sets the parameters for a grid that is to be generated using the "ESGgrid" + grid generation method (i.e. GRID_GEN_METHOD set to "ESGgrid"). + + Args: + lon_ctr + lat_ctr + nx + ny + halo_width + delx + dely + pazi + Returns: + Tuple of inputs, and 4 outputs (see return statement) + """ + + print_input_args(locals()) + # + #----------------------------------------------------------------------- + # + # For a ESGgrid-type grid, the orography filtering is performed by pass- + # ing to the orography filtering the parameters for an "equivalent" glo- + # bal uniform cubed-sphere grid. These are the parameters that a global + # uniform cubed-sphere grid needs to have in order to have a nominal + # grid cell size equal to that of the (average) cell size on the region- + # al grid. These globally-equivalent parameters include a resolution + # (in units of number of cells in each of the two horizontal directions) + # and a stretch factor. The equivalent resolution is calculated in the + # script that generates the grid, and the stretch factor needs to be set + # to 1 because we are considering an equivalent globally UNIFORM grid. 
+ # However, it turns out that with a non-symmetric regional grid (one in + # which nx is not equal to ny), setting stretch_factor to 1 fails be- + # cause the orography filtering program is designed for a global cubed- + # sphere grid and thus assumes that nx and ny for a given tile are equal + # when stretch_factor is exactly equal to 1. + # ^^-- Why is this? Seems like symmetry btwn x and y should still hold when the stretch factor is not equal to 1. + # It turns out that the program will work if we set stretch_factor to a + # value that is not exactly 1. This is what we do below. + # + #----------------------------------------------------------------------- + # + stretch_factor=0.999 # Check whether the orography program has been fixed so that we can set this to 1... + # + #----------------------------------------------------------------------- + # + # Set parameters needed as inputs to the regional_grid grid generation + # code. + # + #----------------------------------------------------------------------- + # + del_angle_x_sg = (delx / (2.0 * radius_Earth)) * degs_per_radian + del_angle_y_sg = (dely / (2.0 * radius_Earth)) * degs_per_radian + neg_nx_of_dom_with_wide_halo = -(nx + 2 * halo_width) + neg_ny_of_dom_with_wide_halo = -(ny + 2 * halo_width) + # + #----------------------------------------------------------------------- + # + # return output variables. + # + #----------------------------------------------------------------------- + # + return (lon_ctr,lat_ctr,nx,ny,pazi,halo_width,stretch_factor, + "{:0.10f}".format(del_angle_x_sg), + "{:0.10f}".format(del_angle_y_sg), + "{:.0f}".format(neg_nx_of_dom_with_wide_halo), + "{:.0f}".format(neg_ny_of_dom_with_wide_halo)) + +class Testing(unittest.TestCase): + def test_set_gridparams_ESGgrid(self): + + (LON_CTR,LAT_CTR,NX,NY,PAZI,NHW,STRETCH_FAC, + DEL_ANGLE_X_SG, + DEL_ANGLE_Y_SG, + NEG_NX_OF_DOM_WITH_WIDE_HALO, + NEG_NY_OF_DOM_WITH_WIDE_HALO) = set_gridparams_ESGgrid( \ + lon_ctr=-97.5, + lat_ctr=38.5, + nx=1748, + ny=1038, + pazi=0.0, + halo_width=6, + delx=3000.0, + dely=3000.0) + + self.assertEqual(\ + (LON_CTR,LAT_CTR,NX,NY,PAZI,NHW,STRETCH_FAC, + DEL_ANGLE_X_SG, + DEL_ANGLE_Y_SG, + NEG_NX_OF_DOM_WITH_WIDE_HALO, + NEG_NY_OF_DOM_WITH_WIDE_HALO), + (-97.5, 38.5, 1748, 1038, 0.0, 6,0.999, + "0.0134894006", + "0.0134894006", + "-1760", + "-1050") + ) + + def setUp(self): + set_env_var('DEBUG',True) + set_env_var('VERBOSE',True) + diff --git a/ush/set_gridparams_GFDLgrid.py b/ush/set_gridparams_GFDLgrid.py new file mode 100644 index 000000000..0dab11f75 --- /dev/null +++ b/ush/set_gridparams_GFDLgrid.py @@ -0,0 +1,467 @@ +#!/usr/bin/env python3 + +import unittest + +from constants import radius_Earth,degs_per_radian + +from python_utils import import_vars, set_env_var, print_input_args, \ + print_info_msg, print_err_msg_exit + +def prime_factors(n): + i = 2 + factors = [] + while i * i <= n: + if n % i: + i += 1 + else: + n //= i + factors.append(i) + if n > 1: + factors.append(n) + return factors + +def set_gridparams_GFDLgrid(lon_of_t6_ctr, lat_of_t6_ctr, res_of_t6g, stretch_factor, + refine_ratio_t6g_to_t7g, + istart_of_t7_on_t6g, iend_of_t7_on_t6g, + jstart_of_t7_on_t6g, jend_of_t7_on_t6g): + """ Sets the parameters for a grid that is to be generated using the "GFDLgrid" + grid generation method (i.e. GRID_GEN_METHOD set to "ESGgrid"). 
+ + Args: + lon_of_t6_ctr + lat_of_t6_ctr + res_of_t6g + stretch_factor + refine_ratio_t6g_to_t7g + istart_of_t7_on_t6g + iend_of_t7_on_t6g + jstart_of_t7_on_t6g + jend_of_t7_on_t6g): + Returns: + Tuple of inputs and outputs (see return statement) + """ + + print_input_args(locals()) + + # get needed environment variables + IMPORTS = ['VERBOSE', 'RUN_ENVIR', 'NH4'] + import_vars(env_vars=IMPORTS) + + # + #----------------------------------------------------------------------- + # + # To simplify the grid setup, we require that tile 7 be centered on tile + # 6. Note that this is not really a restriction because tile 6 can al- + # ways be moved so that it is centered on tile 7 [the location of tile 6 + # doesn't really matter because for a regional setup, the forecast model + # will only run on tile 7 (not on tiles 1-6)]. + # + # We now check that tile 7 is centered on tile 6 by checking (1) that + # the number of cells (on tile 6) between the left boundaries of these + # two tiles is equal to that between their right boundaries and (2) that + # the number of cells (on tile 6) between the bottom boundaries of these + # two tiles is equal to that between their top boundaries. If not, we + # print out an error message and exit. If so, we set the longitude and + # latitude of the center of tile 7 to those of tile 6 and continue. + # + #----------------------------------------------------------------------- + # + + nx_of_t6_on_t6g = res_of_t6g + ny_of_t6_on_t6g = res_of_t6g + + num_left_margin_cells_on_t6g = istart_of_t7_on_t6g - 1 + num_right_margin_cells_on_t6g = nx_of_t6_on_t6g - iend_of_t7_on_t6g + + # This if-statement can hopefully be removed once EMC agrees to make their + # GFDLgrid type grids (tile 7) symmetric about tile 6. + if RUN_ENVIR != "nco": + if num_left_margin_cells_on_t6g != num_right_margin_cells_on_t6g: + print_err_msg_exit(f''' + In order for tile 7 to be centered in the x direction on tile 6, the x- + direction tile 6 cell indices at which tile 7 starts and ends (given by + istart_of_t7_on_t6g and iend_of_t7_on_t6g, respectively) must be set + such that the number of tile 6 cells in the margin between the left + boundaries of tiles 6 and 7 (given by num_left_margin_cells_on_t6g) is + equal to that in the margin between their right boundaries (given by + num_right_margin_cells_on_t6g): + istart_of_t7_on_t6g = {istart_of_t7_on_t6g} + iend_of_t7_on_t6g = {iend_of_t7_on_t6g} + num_left_margin_cells_on_t6g = {num_left_margin_cells_on_t6g} + num_right_margin_cells_on_t6g = {num_right_margin_cells_on_t6g} + Note that the total number of cells in the x-direction on tile 6 is gi- + ven by: + nx_of_t6_on_t6g = {nx_of_t6_on_t6g} + Please reset istart_of_t7_on_t6g and iend_of_t7_on_t6g and rerun.''') + + num_bot_margin_cells_on_t6g = jstart_of_t7_on_t6g - 1 + num_top_margin_cells_on_t6g = ny_of_t6_on_t6g - jend_of_t7_on_t6g + + # This if-statement can hopefully be removed once EMC agrees to make their + # GFDLgrid type grids (tile 7) symmetric about tile 6. 
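The centering checks that follow can be verified by hand with the values used in this file's unit test (res_of_t6g = 96, istart/iend = 13/84, jstart/jend = 17/80); a small illustrative sketch, not part of the patch:

```
res_of_t6g = 96                      # tile 6 is res_of_t6g x res_of_t6g cells
istart_of_t7_on_t6g, iend_of_t7_on_t6g = 13, 84
jstart_of_t7_on_t6g, jend_of_t7_on_t6g = 17, 80

num_left_margin_cells_on_t6g = istart_of_t7_on_t6g - 1          # 12
num_right_margin_cells_on_t6g = res_of_t6g - iend_of_t7_on_t6g  # 12
num_bot_margin_cells_on_t6g = jstart_of_t7_on_t6g - 1           # 16
num_top_margin_cells_on_t6g = res_of_t6g - jend_of_t7_on_t6g    # 16

# Equal margins on each side mean tile 7 is centered on tile 6.
assert num_left_margin_cells_on_t6g == num_right_margin_cells_on_t6g
assert num_bot_margin_cells_on_t6g == num_top_margin_cells_on_t6g
```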
+ if RUN_ENVIR != "nco": + if num_bot_margin_cells_on_t6g != num_top_margin_cells_on_t6g: + print_err_msg_exit(f''' + In order for tile 7 to be centered in the y direction on tile 6, the y- + direction tile 6 cell indices at which tile 7 starts and ends (given by + jstart_of_t7_on_t6g and jend_of_t7_on_t6g, respectively) must be set + such that the number of tile 6 cells in the margin between the left + boundaries of tiles 6 and 7 (given by num_left_margin_cells_on_t6g) is + equal to that in the margin between their right boundaries (given by + num_right_margin_cells_on_t6g): + jstart_of_t7_on_t6g = {jstart_of_t7_on_t6g} + jend_of_t7_on_t6g = {jend_of_t7_on_t6g} + num_bot_margin_cells_on_t6g = {num_bot_margin_cells_on_t6g} + num_top_margin_cells_on_t6g = {num_top_margin_cells_on_t6g} + Note that the total number of cells in the y-direction on tile 6 is gi- + ven by: + ny_of_t6_on_t6g = {ny_of_t6_on_t6g} + Please reset jstart_of_t7_on_t6g and jend_of_t7_on_t6g and rerun.''') + + lon_of_t7_ctr = lon_of_t6_ctr + lat_of_t7_ctr = lat_of_t6_ctr + # + #----------------------------------------------------------------------- + # + # The grid generation script grid_gen_scr called below in turn calls the + # make_hgrid utility/executable to construct the regional grid. make_- + # hgrid accepts as arguments the index limits (i.e. starting and ending + # indices) of the regional grid on the supergrid of the regional grid's + # parent tile. The regional grid's parent tile is tile 6, and the su- + # pergrid of any given tile is defined as the grid obtained by doubling + # the number of cells in each direction on that tile's grid. We will + # denote these index limits by + # + # istart_of_t7_on_t6sg + # iend_of_t7_on_t6sg + # jstart_of_t7_on_t6sg + # jend_of_t7_on_t6sg + # + # The "_T6SG" suffix in these names is used to indicate that the indices + # are on the supergrid of tile 6. Recall, however, that we have as in- + # puts the index limits of the regional grid on the tile 6 grid, not its + # supergrid. These are given by + # + # istart_of_t7_on_t6g + # iend_of_t7_on_t6g + # jstart_of_t7_on_t6g + # jend_of_t7_on_t6g + # + # We can obtain the former from the latter by recalling that the super- + # grid has twice the resolution of the original grid. Thus, + # + # istart_of_t7_on_t6sg = 2*istart_of_t7_on_t6g - 1 + # iend_of_t7_on_t6sg = 2*iend_of_t7_on_t6g + # jstart_of_t7_on_t6sg = 2*jstart_of_t7_on_t6g - 1 + # jend_of_t7_on_t6sg = 2*jend_of_t7_on_t6g + # + # These are obtained assuming that grid cells on tile 6 must either be + # completely within the regional domain or completely outside of it, + # i.e. the boundary of the regional grid must coincide with gridlines + # on the tile 6 grid; it cannot cut through tile 6 cells. (Note that + # this implies that the starting indices on the tile 6 supergrid must be + # odd while the ending indices must be even; the above expressions sa- + # tisfy this requirement.) We perfrom these calculations next. + # + #----------------------------------------------------------------------- + # + istart_of_t7_on_t6sg = 2*istart_of_t7_on_t6g - 1 + iend_of_t7_on_t6sg = 2*iend_of_t7_on_t6g + jstart_of_t7_on_t6sg = 2*jstart_of_t7_on_t6g - 1 + jend_of_t7_on_t6sg = 2*jend_of_t7_on_t6g + # + #----------------------------------------------------------------------- + # + # If we simply pass to make_hgrid the index limits of the regional grid + # on the tile 6 supergrid calculated above, make_hgrid will generate a + # regional grid without a halo. 
To obtain a regional grid with a halo, + # we must pass to make_hgrid the index limits (on the tile 6 supergrid) + # of the regional grid including a halo. We will let the variables + # + # istart_of_t7_with_halo_on_t6sg + # iend_of_t7_with_halo_on_t6sg + # jstart_of_t7_with_halo_on_t6sg + # jend_of_t7_with_halo_on_t6sg + # + # denote these limits. The reason we include "_wide_halo" in these va- + # riable names is that the halo of the grid that we will first generate + # will be wider than the halos that are actually needed as inputs to the + # FV3LAM model (i.e. the 0-cell-wide, 3-cell-wide, and 4-cell-wide halos + # described above). We will generate the grids with narrower halos that + # the model needs later on by "shaving" layers of cells from this wide- + # halo grid. Next, we describe how to calculate the above indices. + # + # Let halo_width_on_t7g denote the width of the "wide" halo in units of number of + # grid cells on the regional grid (i.e. tile 7) that we'd like to have + # along all four edges of the regional domain (left, right, bottom, and + # top). To obtain the corresponding halo width in units of number of + # cells on the tile 6 grid -- which we denote by halo_width_on_t6g -- we simply di- + # vide halo_width_on_t7g by the refinement ratio, i.e. + # + # halo_width_on_t6g = halo_width_on_t7g/refine_ratio_t6g_to_t7g + # + # The corresponding halo width on the tile 6 supergrid is then given by + # + # halo_width_on_t6sg = 2*halo_width_on_t6g + # = 2*halo_width_on_t7g/refine_ratio_t6g_to_t7g + # + # Note that halo_width_on_t6sg must be an integer, but the expression for it de- + # rived above may not yield an integer. To ensure that the halo has a + # width of at least halo_width_on_t7g cells on the regional grid, we round up the + # result of the expression above for halo_width_on_t6sg, i.e. we redefine halo_width_on_t6sg + # to be + # + # halo_width_on_t6sg = ceil(2*halo_width_on_t7g/refine_ratio_t6g_to_t7g) + # + # where ceil(...) is the ceiling function, i.e. it rounds its floating + # point argument up to the next larger integer. Since in bash division + # of two integers returns a truncated integer and since bash has no + # built-in ceil(...) function, we perform the rounding-up operation by + # adding the denominator (of the argument of ceil(...) above) minus 1 to + # the original numerator, i.e. by redefining halo_width_on_t6sg to be + # + # halo_width_on_t6sg = (2*halo_width_on_t7g + refine_ratio_t6g_to_t7g - 1)/refine_ratio_t6g_to_t7g + # + # This trick works when dividing one positive integer by another. + # + # In order to calculate halo_width_on_t6g using the above expression, we must + # first specify halo_width_on_t7g. Next, we specify an initial value for it by + # setting it to one more than the largest-width halo that the model ac- + # tually needs, which is NH4. We then calculate halo_width_on_t6sg using the + # above expression. Note that these values of halo_width_on_t7g and halo_width_on_t6sg will + # likely not be their final values; their final values will be calcula- + # ted later below after calculating the starting and ending indices of + # the regional grid with wide halo on the tile 6 supergrid and then ad- + # justing the latter to satisfy certain conditions. 
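The rounding trick described above is easy to sanity-check numerically. A short sketch using the values from this file's unit test (NH4 = 4, refine_ratio_t6g_to_t7g = 3); Python's // stands in for the truncating integer division the comment attributes to bash:

```
import math

NH4 = 4                       # widest halo the model needs (from the test setUp)
refine_ratio_t6g_to_t7g = 3

halo_width_on_t7g = NH4 + 1   # start one cell wider than the widest needed halo

# ceil(2*5/3) = ceil(3.33...) = 4, computed two ways:
via_ceil = math.ceil(2 * halo_width_on_t7g / refine_ratio_t6g_to_t7g)
via_int_trick = (2 * halo_width_on_t7g + refine_ratio_t6g_to_t7g - 1) // refine_ratio_t6g_to_t7g

assert via_ceil == via_int_trick == 4
```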
+ # + #----------------------------------------------------------------------- + # + halo_width_on_t7g = NH4 + 1 + halo_width_on_t6sg = (2*halo_width_on_t7g + refine_ratio_t6g_to_t7g - 1)/refine_ratio_t6g_to_t7g + # + #----------------------------------------------------------------------- + # + # With an initial value of halo_width_on_t6sg now available, we can obtain the + # tile 6 supergrid index limits of the regional domain (including the + # wide halo) from the index limits for the regional domain without a ha- + # lo by simply subtracting halo_width_on_t6sg from the lower index limits and add- + # ing halo_width_on_t6sg to the upper index limits, i.e. + # + # istart_of_t7_with_halo_on_t6sg = istart_of_t7_on_t6sg - halo_width_on_t6sg + # iend_of_t7_with_halo_on_t6sg = iend_of_t7_on_t6sg + halo_width_on_t6sg + # jstart_of_t7_with_halo_on_t6sg = jstart_of_t7_on_t6sg - halo_width_on_t6sg + # jend_of_t7_with_halo_on_t6sg = jend_of_t7_on_t6sg + halo_width_on_t6sg + # + # We calculate these next. + # + #----------------------------------------------------------------------- + # + istart_of_t7_with_halo_on_t6sg = int(istart_of_t7_on_t6sg - halo_width_on_t6sg) + iend_of_t7_with_halo_on_t6sg = int(iend_of_t7_on_t6sg + halo_width_on_t6sg) + jstart_of_t7_with_halo_on_t6sg = int(jstart_of_t7_on_t6sg - halo_width_on_t6sg) + jend_of_t7_with_halo_on_t6sg = int(jend_of_t7_on_t6sg + halo_width_on_t6sg) + # + #----------------------------------------------------------------------- + # + # As for the regional grid without a halo, the regional grid with a wide + # halo that make_hgrid will generate must be such that grid cells on + # tile 6 either lie completely within this grid or outside of it, i.e. + # they cannot lie partially within/outside of it. This implies that the + # starting indices on the tile 6 supergrid of the grid with wide halo + # must be odd while the ending indices must be even. Thus, below, we + # subtract 1 from the starting indices if they are even (which ensures + # that there will be at least halo_width_on_t7g halo cells along the left and bot- + # tom boundaries), and we add 1 to the ending indices if they are odd + # (which ensures that there will be at least halo_width_on_t7g halo cells along the + # right and top boundaries). + # + #----------------------------------------------------------------------- + # + if istart_of_t7_with_halo_on_t6sg % 2 == 0: + istart_of_t7_with_halo_on_t6sg = istart_of_t7_with_halo_on_t6sg - 1 + + if iend_of_t7_with_halo_on_t6sg % 2 == 1: + iend_of_t7_with_halo_on_t6sg = iend_of_t7_with_halo_on_t6sg + 1 + + if jstart_of_t7_with_halo_on_t6sg % 2 == 0: + jstart_of_t7_with_halo_on_t6sg = jstart_of_t7_with_halo_on_t6sg - 1 + + if jend_of_t7_with_halo_on_t6sg % 2 == 1: + jend_of_t7_with_halo_on_t6sg = jend_of_t7_with_halo_on_t6sg + 1 + # + #----------------------------------------------------------------------- + # + # Now that the starting and ending tile 6 supergrid indices of the re- + # gional grid with the wide halo have been calculated (and adjusted), we + # recalculate the width of the wide halo on: + # + # 1) the tile 6 supergrid; + # 2) the tile 6 grid; and + # 3) the tile 7 grid. + # + # These are the final values of these quantities that are guaranteed to + # correspond to the starting and ending indices on the tile 6 supergrid. 
+ # + #----------------------------------------------------------------------- + # + print_info_msg(f''' + Original values of the halo width on the tile 6 supergrid and on the + tile 7 grid are: + halo_width_on_t6sg = {halo_width_on_t6sg} + halo_width_on_t7g = {halo_width_on_t7g}''', verbose=VERBOSE) + + halo_width_on_t6sg = istart_of_t7_on_t6sg - istart_of_t7_with_halo_on_t6sg + halo_width_on_t6g = halo_width_on_t6sg//2 + halo_width_on_t7g = int(halo_width_on_t6g*refine_ratio_t6g_to_t7g) + + print_info_msg(f''' + Values of the halo width on the tile 6 supergrid and on the tile 7 grid + AFTER adjustments are: + halo_width_on_t6sg = {halo_width_on_t6sg} + halo_width_on_t7g = {halo_width_on_t7g}''', verbose=VERBOSE) + # + #----------------------------------------------------------------------- + # + # Calculate the number of cells that the regional domain (without halo) + # has in each of the two horizontal directions (say x and y). We denote + # these by nx_of_t7_on_t7g and ny_of_t7_on_t7g, respectively. These + # will be needed in the "shave" steps in the grid generation task of the + # workflow. + # + #----------------------------------------------------------------------- + # + nx_of_t7_on_t6sg = iend_of_t7_on_t6sg - istart_of_t7_on_t6sg + 1 + nx_of_t7_on_t6g = nx_of_t7_on_t6sg/2 + nx_of_t7_on_t7g = int(nx_of_t7_on_t6g*refine_ratio_t6g_to_t7g) + + ny_of_t7_on_t6sg = jend_of_t7_on_t6sg - jstart_of_t7_on_t6sg + 1 + ny_of_t7_on_t6g = ny_of_t7_on_t6sg/2 + ny_of_t7_on_t7g = int(ny_of_t7_on_t6g*refine_ratio_t6g_to_t7g) + # + # The following are set only for informational purposes. + # + nx_of_t6_on_t6sg = 2*nx_of_t6_on_t6g + ny_of_t6_on_t6sg = 2*ny_of_t6_on_t6g + + prime_factors_nx_of_t7_on_t7g = prime_factors(nx_of_t7_on_t7g) + prime_factors_ny_of_t7_on_t7g = prime_factors(ny_of_t7_on_t7g) + + print_info_msg(f''' + The number of cells in the two horizontal directions (x and y) on the + parent tile's (tile 6) grid and supergrid are: + nx_of_t6_on_t6g = {nx_of_t6_on_t6g} + ny_of_t6_on_t6g = {ny_of_t6_on_t6g} + nx_of_t6_on_t6sg = {nx_of_t6_on_t6sg} + ny_of_t6_on_t6sg = {ny_of_t6_on_t6sg} + + The number of cells in the two horizontal directions on the tile 6 grid + and supergrid that the regional domain (tile 7) WITHOUT A HALO encompas- + ses are: + nx_of_t7_on_t6g = {nx_of_t7_on_t6g} + ny_of_t7_on_t6g = {ny_of_t7_on_t6g} + nx_of_t7_on_t6sg = {nx_of_t7_on_t6sg} + ny_of_t7_on_t6sg = {ny_of_t7_on_t6sg} + + The starting and ending i and j indices on the tile 6 grid used to gene- + rate this regional grid are: + istart_of_t7_on_t6g = {istart_of_t7_on_t6g} + iend_of_t7_on_t6g = {iend_of_t7_on_t6g} + jstart_of_t7_on_t6g = {jstart_of_t7_on_t6g} + jend_of_t7_on_t6g = {jend_of_t7_on_t6g} + + The corresponding starting and ending i and j indices on the tile 6 su- + pergrid are: + istart_of_t7_on_t6sg = {istart_of_t7_on_t6sg} + iend_of_t7_on_t6sg = {iend_of_t7_on_t6sg} + jstart_of_t7_on_t6sg = {jstart_of_t7_on_t6sg} + jend_of_t7_on_t6sg = {jend_of_t7_on_t6sg} + + The refinement ratio (ratio of the number of cells in tile 7 that abut + a single cell in tile 6) is: + refine_ratio_t6g_to_t7g = {refine_ratio_t6g_to_t7g} + + The number of cells in the two horizontal directions on the regional do- + main's (i.e. 
tile 7's) grid WITHOUT A HALO are: + nx_of_t7_on_t7g = {nx_of_t7_on_t7g} + ny_of_t7_on_t7g = {ny_of_t7_on_t7g} + + The prime factors of nx_of_t7_on_t7g and ny_of_t7_on_t7g are (useful for + determining an MPI task layout): + prime_factors_nx_of_t7_on_t7g: {prime_factors_nx_of_t7_on_t7g} + prime_factors_ny_of_t7_on_t7g: {prime_factors_ny_of_t7_on_t7g}''', verbose=VERBOSE) + # + #----------------------------------------------------------------------- + # + # For informational purposes, calculate the number of cells in each di- + # rection on the regional grid including the wide halo (of width halo_- + # width_on_t7g cells). We denote these by nx_of_t7_with_halo_on_t7g and + # ny_of_t7_with_halo_on_t7g, respectively. + # + #----------------------------------------------------------------------- + # + nx_of_t7_with_halo_on_t6sg = iend_of_t7_with_halo_on_t6sg - istart_of_t7_with_halo_on_t6sg + 1 + nx_of_t7_with_halo_on_t6g = nx_of_t7_with_halo_on_t6sg/2 + nx_of_t7_with_halo_on_t7g = nx_of_t7_with_halo_on_t6g*refine_ratio_t6g_to_t7g + + ny_of_t7_with_halo_on_t6sg = jend_of_t7_with_halo_on_t6sg - jstart_of_t7_with_halo_on_t6sg + 1 + ny_of_t7_with_halo_on_t6g = ny_of_t7_with_halo_on_t6sg/2 + ny_of_t7_with_halo_on_t7g = ny_of_t7_with_halo_on_t6g*refine_ratio_t6g_to_t7g + + print_info_msg(f''' + nx_of_t7_with_halo_on_t7g = {nx_of_t7_with_halo_on_t7g} + (istart_of_t7_with_halo_on_t6sg = {istart_of_t7_with_halo_on_t6sg}, + iend_of_t7_with_halo_on_t6sg = {iend_of_t7_with_halo_on_t6sg})''', verbose=VERBOSE) + + print_info_msg(f''' + ny_of_t7_with_halo_on_t7g = {ny_of_t7_with_halo_on_t7g} + (jstart_of_t7_with_halo_on_t6sg = {jstart_of_t7_with_halo_on_t6sg}, + jend_of_t7_with_halo_on_t6sg = {jend_of_t7_with_halo_on_t6sg})''', verbose=VERBOSE) + # + #----------------------------------------------------------------------- + # + # Return output variables. + # + #----------------------------------------------------------------------- + # + return (lon_of_t7_ctr, lat_of_t7_ctr, nx_of_t7_on_t7g, ny_of_t7_on_t7g, + halo_width_on_t7g, stretch_factor, + istart_of_t7_with_halo_on_t6sg, + iend_of_t7_with_halo_on_t6sg, + jstart_of_t7_with_halo_on_t6sg, + jend_of_t7_with_halo_on_t6sg) + +class Testing(unittest.TestCase): + def test_set_gridparams_GFDLgrid(self): + (LON_CTR,LAT_CTR,NX,NY,NHW,STRETCH_FAC, + ISTART_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG, + IEND_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG, + JSTART_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG, + JEND_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG) = set_gridparams_GFDLgrid( \ + lon_of_t6_ctr=-97.5, \ + lat_of_t6_ctr=38.5, \ + res_of_t6g=96, \ + stretch_factor=1.4, \ + refine_ratio_t6g_to_t7g=3, \ + istart_of_t7_on_t6g=13, \ + iend_of_t7_on_t6g=84, \ + jstart_of_t7_on_t6g=17, \ + jend_of_t7_on_t6g=80) + + self.assertEqual(\ + (LON_CTR,LAT_CTR,NX,NY,NHW,STRETCH_FAC, + ISTART_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG, + IEND_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG, + JSTART_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG, + JEND_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG), + (-97.5,38.5,216,192,6,1.4, + 21, + 172, + 29, + 164) + ) + + def setUp(self): + set_env_var('DEBUG',True) + set_env_var('VERBOSE',True) + set_env_var('NH4', 4) + diff --git a/ush/set_namelist.py b/ush/set_namelist.py index 43bbeb8a6..84c61975e 100755 --- a/ush/set_namelist.py +++ b/ush/set_namelist.py @@ -69,6 +69,7 @@ import argparse import collections import os +import sys import f90nml import yaml @@ -137,7 +138,7 @@ def path_ok(arg): msg = f'{arg} is not a writable path!' 
raise argparse.ArgumentTypeError(msg) -def parse_args(): +def parse_args(argv): ''' Function maintains the arguments accepted by this script. Please see @@ -192,7 +193,7 @@ def parse_args(): action='store_true', help='If provided, suppress all output.', ) - return parser.parse_args() + return parser.parse_args(argv) def dict_diff(dict1, dict2): @@ -284,10 +285,15 @@ def update_dict(dest, newdict, quiet=False): dest[sect] = {} dest[sect][key] = value -def main(cla): +def set_namelist(argv): ''' Using input command line arguments (cla), update a Fortran namelist file. ''' + # parse argumetns + cla = parse_args(argv) + if cla.config: + cla.config, _ = config_exists(cla.config) + # Load base namelist into dict nml = f90nml.Namelist() if cla.nml is not None: @@ -324,7 +330,4 @@ def main(cla): if __name__ == '__main__': - cla = parse_args() - if cla.config: - cla.config, _ = config_exists(cla.config) - main(cla) + set_namelist(sys.argv[1:]) diff --git a/ush/set_ozone_param.py b/ush/set_ozone_param.py new file mode 100644 index 000000000..4c9998c3c --- /dev/null +++ b/ush/set_ozone_param.py @@ -0,0 +1,231 @@ +#!/usr/bin/env python3 + +import os +import unittest +from textwrap import dedent + +from python_utils import import_vars,export_vars,set_env_var,list_to_str,\ + print_input_args, print_info_msg, print_err_msg_exit,\ + define_macos_utilities,load_xml_file,has_tag_with_value,find_pattern_in_str + +def set_ozone_param(ccpp_phys_suite_fp): + """ Function that does the following: + (1) Determines the ozone parameterization being used by checking in the + CCPP physics suite XML. + + (2) Sets the name of the global ozone production/loss file in the FIXgsm + FIXgsm system directory to copy to the experiment's FIXam directory. + + (3) Resets the last element of the workflow array variable + FIXgsm_FILES_TO_COPY_TO_FIXam that contains the files to copy from + FIXgsm to FIXam (this last element is initially set to a dummy + value) to the name of the ozone production/loss file set in the + previous step. + + (4) Resets the element of the workflow array variable + CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING (this array contains the + mapping between the symlinks to create in any cycle directory and + the files in the FIXam directory that are their targets) that + specifies the mapping for the ozone symlink/file such that the + target FIXam file name is set to the name of the ozone production/ + loss file set above. + + Args: + ccpp_phys_suite_fp: full path to CCPP physics suite + Returns: + ozone_param: a string + """ + + print_input_args(locals()) + + # import all environment variables + import_vars() + + # + #----------------------------------------------------------------------- + # + # Get the name of the ozone parameterization being used. There are two + # possible ozone parameterizations: + # + # (1) A parameterization developed/published in 2015. Here, we refer to + # this as the 2015 parameterization. If this is being used, then we + # set the variable ozone_param to the string "ozphys_2015". + # + # (2) A parameterization developed/published sometime after 2015. Here, + # we refer to this as the after-2015 parameterization. If this is + # being used, then we set the variable ozone_param to the string + # "ozphys". + # + # We check the CCPP physics suite definition file (SDF) to determine the + # parameterization being used. If this file contains the line + # + # ozphys_2015 + # + # then the 2015 parameterization is being used. 
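Both cases of this check reduce to asking whether the suite definition file contains a <scheme> element with a given name; the workflow does this through the python_utils helpers load_xml_file and has_tag_with_value. A rough standard-library equivalent, shown for illustration only (the SDF path is the one used by this file's unit test):

```
import xml.etree.ElementTree as ET

def sdf_has_scheme(sdf_path, scheme_name):
    """Return True if any <scheme> element in the suite definition file
    names the given scheme (e.g. "ozphys_2015" or "ozphys")."""
    tree = ET.parse(sdf_path)
    return any((elem.text or "").strip() == scheme_name
               for elem in tree.iter("scheme"))

# Example usage with the SDF from this file's unit test:
# sdf_has_scheme("test_data/suite_FV3_GSD_SAR.xml", "ozphys_2015")  # -> True
```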
If it instead contains + # the line + # + # ozphys + # + # then the after-2015 parameterization is being used. (The SDF should + # contain exactly one of these lines; not both nor neither; we check for + # this.) + # + #----------------------------------------------------------------------- + # + tree = load_xml_file(ccpp_phys_suite_fp) + ozone_param = "" + if has_tag_with_value(tree, "scheme", "ozphys_2015"): + fixgsm_ozone_fn="ozprdlos_2015_new_sbuvO3_tclm15_nuchem.f77" + ozone_param = "ozphys_2015" + elif has_tag_with_value(tree, "scheme", "ozphys"): + fixgsm_ozone_fn="global_o3prdlos.f77" + ozone_param = "ozphys" + else: + print_err_msg_exit(f''' + Unknown or no ozone parameterization + specified in the CCPP physics suite file (ccpp_phys_suite_fp): + ccpp_phys_suite_fp = \"{ccpp_phys_suite_fp}\" + ozone_param = \"{ozone_param}\"''') + # + #----------------------------------------------------------------------- + # + # Set the last element of the array FIXgsm_FILES_TO_COPY_TO_FIXam to the + # name of the ozone production/loss file to copy from the FIXgsm to the + # FIXam directory. + # + #----------------------------------------------------------------------- + # + i=len(FIXgsm_FILES_TO_COPY_TO_FIXam) - 1 + FIXgsm_FILES_TO_COPY_TO_FIXam[i]=f"{fixgsm_ozone_fn}" + # + #----------------------------------------------------------------------- + # + # Set the element in the array CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING that + # specifies the mapping between the symlink for the ozone production/loss + # file that must be created in each cycle directory and its target in the + # FIXam directory. The name of the symlink is alrady in the array, but + # the target is not because it depends on the ozone parameterization that + # the physics suite uses. Since we determined the ozone parameterization + # above, we now set the target of the symlink accordingly. + # + #----------------------------------------------------------------------- + # + ozone_symlink="global_o3prdlos.f77" + fixgsm_ozone_fn_is_set=False + regex_search="^[ ]*([^| ]*)[ ]*[|][ ]*([^| ]*)[ ]*$" + num_symlinks=len(CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING) + + for i in range(num_symlinks): + mapping=CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING[i] + symlink = find_pattern_in_str(regex_search, mapping) + if symlink is not None: + symlink = symlink[0] + if symlink == ozone_symlink: + regex_search="^[ ]*([^| ]+[ ]*)[|][ ]*([^| ]*)[ ]*$" + mapping_ozone = find_pattern_in_str(regex_search, mapping)[0] + mapping_ozone=f"{mapping_ozone}| {fixgsm_ozone_fn}" + CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING[i]=f"{mapping_ozone}" + fixgsm_ozone_fn_is_set=True + break + # + #----------------------------------------------------------------------- + # + # If fixgsm_ozone_fn_is_set is set to True, then the appropriate element + # of the array CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING was set successfully. + # In this case, print out the new version of this array. Otherwise, print + # out an error message and exit. 
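A compact sketch of the mapping update described above: locate the "global_o3prdlos.f77 | ..." entry and point its target at the ozone file chosen from the SDF. Plain string splitting is used here in place of the regex in the real code, and the mapping list is abridged:

```
fixgsm_ozone_fn = "ozprdlos_2015_new_sbuvO3_tclm15_nuchem.f77"   # chosen from the SDF check
ozone_symlink = "global_o3prdlos.f77"

# Abridged mapping list; entries have the form "symlink | FIXam target".
mappings = [
    "aerosol.dat | global_climaeropac_global.txt",
    "global_o3prdlos.f77 | ozprdlos_2015_new_sbuvO3_tclm15_nuchem.f77",
]

for i, mapping in enumerate(mappings):
    symlink = mapping.split("|")[0].strip()
    if symlink == ozone_symlink:
        # Overwrite whatever target is currently recorded for the ozone symlink.
        mappings[i] = f"{symlink} | {fixgsm_ozone_fn}"
        break
```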
+ # + #----------------------------------------------------------------------- + # + if fixgsm_ozone_fn_is_set: + + msg=dedent(f''' + After setting the file name of the ozone production/loss file in the + FIXgsm directory (based on the ozone parameterization specified in the + CCPP suite definition file), the array specifying the mapping between + the symlinks that need to be created in the cycle directories and the + files in the FIXam directory is: + + ''') + msg+=dedent(f''' + CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING = {list_to_str(CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING)} + ''') + print_info_msg(msg,verbose=VERBOSE) + + else: + + print_err_msg_exit(f''' + Unable to set name of the ozone production/loss file in the FIXgsm directory + in the array that specifies the mapping between the symlinks that need to + be created in the cycle directories and the files in the FIXgsm directory: + fixgsm_ozone_fn_is_set = \"{fixgsm_ozone_fn_is_set}\"''') + + EXPORTS = ["CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING", "FIXgsm_FILES_TO_COPY_TO_FIXam"] + export_vars(env_vars=EXPORTS) + + return ozone_param + +class Testing(unittest.TestCase): + def test_set_ozone_param(self): + self.assertEqual( "ozphys_2015", + set_ozone_param(ccpp_phys_suite_fp=f"test_data{os.sep}suite_FV3_GSD_SAR.xml") ) + def setUp(self): + define_macos_utilities(); + set_env_var('DEBUG',True) + set_env_var('VERBOSE',True) + + CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING = [ + "aerosol.dat | global_climaeropac_global.txt", + "co2historicaldata_2010.txt | fix_co2_proj/global_co2historicaldata_2010.txt", + "co2historicaldata_2011.txt | fix_co2_proj/global_co2historicaldata_2011.txt", + "co2historicaldata_2012.txt | fix_co2_proj/global_co2historicaldata_2012.txt", + "co2historicaldata_2013.txt | fix_co2_proj/global_co2historicaldata_2013.txt", + "co2historicaldata_2014.txt | fix_co2_proj/global_co2historicaldata_2014.txt", + "co2historicaldata_2015.txt | fix_co2_proj/global_co2historicaldata_2015.txt", + "co2historicaldata_2016.txt | fix_co2_proj/global_co2historicaldata_2016.txt", + "co2historicaldata_2017.txt | fix_co2_proj/global_co2historicaldata_2017.txt", + "co2historicaldata_2018.txt | fix_co2_proj/global_co2historicaldata_2018.txt", + "co2historicaldata_2019.txt | fix_co2_proj/global_co2historicaldata_2019.txt", + "co2historicaldata_2020.txt | fix_co2_proj/global_co2historicaldata_2020.txt", + "co2historicaldata_2021.txt | fix_co2_proj/global_co2historicaldata_2021.txt", + "co2historicaldata_glob.txt | global_co2historicaldata_glob.txt", + "co2monthlycyc.txt | co2monthlycyc.txt", + "global_h2oprdlos.f77 | global_h2o_pltc.f77", + "global_zorclim.1x1.grb | global_zorclim.1x1.grb", + "sfc_emissivity_idx.txt | global_sfc_emissivity_idx.txt", + "solarconstant_noaa_an.txt | global_solarconstant_noaa_an.txt", + "global_o3prdlos.f77 | ozprdlos_2015_new_sbuvO3_tclm15_nuchem.f77"] + FIXgsm_FILES_TO_COPY_TO_FIXam = [ + "global_glacier.2x2.grb", + "global_maxice.2x2.grb", + "RTGSST.1982.2012.monthly.clim.grb", + "global_snoclim.1.875.grb", + "CFSR.SEAICE.1982.2012.monthly.clim.grb", + "global_soilmgldas.t126.384.190.grb", + "seaice_newland.grb", + "global_climaeropac_global.txt", + "fix_co2_proj/global_co2historicaldata_2010.txt", + "fix_co2_proj/global_co2historicaldata_2011.txt", + "fix_co2_proj/global_co2historicaldata_2012.txt", + "fix_co2_proj/global_co2historicaldata_2013.txt", + "fix_co2_proj/global_co2historicaldata_2014.txt", + "fix_co2_proj/global_co2historicaldata_2015.txt", + "fix_co2_proj/global_co2historicaldata_2016.txt", + 
"fix_co2_proj/global_co2historicaldata_2017.txt", + "fix_co2_proj/global_co2historicaldata_2018.txt", + "fix_co2_proj/global_co2historicaldata_2019.txt", + "fix_co2_proj/global_co2historicaldata_2020.txt", + "fix_co2_proj/global_co2historicaldata_2021.txt", + "global_co2historicaldata_glob.txt", + "co2monthlycyc.txt", + "global_h2o_pltc.f77", + "global_hyblev.l65.txt", + "global_zorclim.1x1.grb", + "global_sfc_emissivity_idx.txt", + "global_solarconstant_noaa_an.txt", + "geo_em.d01.lat-lon.2.5m.HGT_M.nc", + "HGT.Beljaars_filtered.lat-lon.30s_res.nc", + "ozprdlos_2015_new_sbuvO3_tclm15_nuchem.f77"] + + set_env_var('CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING', CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING) + set_env_var('FIXgsm_FILES_TO_COPY_TO_FIXam', FIXgsm_FILES_TO_COPY_TO_FIXam) diff --git a/ush/set_predef_grid_params.py b/ush/set_predef_grid_params.py new file mode 100644 index 000000000..283851715 --- /dev/null +++ b/ush/set_predef_grid_params.py @@ -0,0 +1,64 @@ +#!/usr/bin/env python3 + +import unittest +import os + +from constants import radius_Earth,degs_per_radian + +from python_utils import process_args,import_vars,export_vars,set_env_var,get_env_var,\ + print_input_args,define_macos_utilities, load_config_file, \ + cfg_to_yaml_str + +def set_predef_grid_params(): + """ Sets grid parameters for the specified predfined grid + + Args: + None + Returns: + None + """ + # import all environement variables + import_vars() + + params_dict = load_config_file("predef_grid_params.yaml") + params_dict = params_dict[PREDEF_GRID_NAME] + + # if QUILTING = False, skip variables that start with "WRTCMP_" + if not QUILTING: + params_dict = {k: v for k,v in params_dict.items() \ + if not k.startswith("WRTCMP_") } + + # take care of special vars + special_vars = ['DT_ATMOS', 'LAYOUT_X', 'LAYOUT_Y', 'BLOCKSIZE'] + for var in special_vars: + if globals()[var] is not None: + params_dict[var] = globals()[var] + + #export variables to environment + export_vars(source_dict=params_dict) +# +#----------------------------------------------------------------------- +# +# Call the function defined above. 
+# +#----------------------------------------------------------------------- +# +if __name__ == "__main__": + set_predef_grid_params() + +class Testing(unittest.TestCase): + def test_set_predef_grid_params(self): + set_predef_grid_params() + self.assertEqual(get_env_var('GRID_GEN_METHOD'),"ESGgrid") + self.assertEqual(get_env_var('ESGgrid_LON_CTR'),-97.5) + + def setUp(self): + define_macos_utilities(); + set_env_var('DEBUG',False) + set_env_var('PREDEF_GRID_NAME',"RRFS_CONUS_3km") + set_env_var('DT_ATMOS',36) + set_env_var('LAYOUT_X',18) + set_env_var('LAYOUT_Y',36) + set_env_var('BLOCKSIZE',28) + set_env_var('QUILTING',False) + diff --git a/ush/set_predef_grid_params.sh b/ush/set_predef_grid_params.sh index 88b2c0f63..fd4ef530b 100644 --- a/ush/set_predef_grid_params.sh +++ b/ush/set_predef_grid_params.sh @@ -132,7 +132,7 @@ case ${PREDEF_GRID_NAME} in WRTCMP_stdlat2="${ESGgrid_LAT_CTR}" WRTCMP_nx="217" WRTCMP_ny="128" - WRTCMP_lon_lwr_left="-122.719258" + WRTCMP_lon_lwr_left="-122.719528" WRTCMP_lat_lwr_left="21.138123" WRTCMP_dx="${ESGgrid_DELX}" WRTCMP_dy="${ESGgrid_DELY}" @@ -224,7 +224,7 @@ case ${PREDEF_GRID_NAME} in WRTCMP_stdlat2="${ESGgrid_LAT_CTR}" WRTCMP_nx="416" WRTCMP_ny="245" - WRTCMP_lon_lwr_left="-122.719258" + WRTCMP_lon_lwr_left="-122.719528" WRTCMP_lat_lwr_left="21.138123" WRTCMP_dx="${ESGgrid_DELX}" WRTCMP_dy="${ESGgrid_DELY}" @@ -251,7 +251,7 @@ case ${PREDEF_GRID_NAME} in ESGgrid_NY="232" ESGgrid_PAZI="0.0" - + ESGgrid_WIDE_HALO_WIDTH="6" DT_ATMOS="${DT_ATMOS:-45}" @@ -316,7 +316,7 @@ case ${PREDEF_GRID_NAME} in WRTCMP_stdlat2="${ESGgrid_LAT_CTR}" WRTCMP_nx="1799" WRTCMP_ny="1059" - WRTCMP_lon_lwr_left="-122.719258" + WRTCMP_lon_lwr_left="-122.719528" WRTCMP_lat_lwr_left="21.138123" WRTCMP_dx="${ESGgrid_DELX}" WRTCMP_dy="${ESGgrid_DELY}" @@ -389,7 +389,7 @@ case ${PREDEF_GRID_NAME} in ESGgrid_NY="600" ESGgrid_PAZI="0.0" - + ESGgrid_WIDE_HALO_WIDTH="6" DT_ATMOS="${DT_ATMOS:-40}" @@ -455,7 +455,7 @@ case ${PREDEF_GRID_NAME} in WRTCMP_stdlat1="${ESGgrid_LAT_CTR}" WRTCMP_stdlat2="${ESGgrid_LAT_CTR}" WRTCMP_nx="197" - WRTCMP_ny="195" + WRTCMP_ny="197" WRTCMP_lon_lwr_left="-89.47120417" WRTCMP_lat_lwr_left="37.07809642" WRTCMP_dx="${ESGgrid_DELX}" @@ -486,7 +486,7 @@ case ${PREDEF_GRID_NAME} in ESGgrid_NY="240" ESGgrid_PAZI="0.0" - + ESGgrid_WIDE_HALO_WIDTH="6" # DT_ATMOS="${DT_ATMOS:-50}" @@ -606,7 +606,7 @@ case ${PREDEF_GRID_NAME} in ESGgrid_NY="1020" ESGgrid_PAZI="0.0" - + ESGgrid_WIDE_HALO_WIDTH="6" # DT_ATMOS="${DT_ATMOS:-50}" @@ -635,6 +635,60 @@ case ${PREDEF_GRID_NAME} in # #----------------------------------------------------------------------- # +# The WoFS domain with ~3km cells. +# +# Note: +# The WoFS domain will generate a 301 x 301 output grid (WRITE COMPONENT) and +# will eventually be movable (ESGgrid_LON_CTR/ESGgrid_LAT_CTR). A python script +# python_utils/fv3write_parms_lambert will be useful to determine +# WRTCMP_lon_lwr_left and WRTCMP_lat_lwr_left locations (only for Lambert map +# projection currently) of the quilting output when the domain location is +# moved. Later, it should be integrated into the workflow. 
+# +#----------------------------------------------------------------------- +# +"WoFS_3km") + + GRID_GEN_METHOD="ESGgrid" + + ESGgrid_LON_CTR="-97.5" + ESGgrid_LAT_CTR="38.5" + + ESGgrid_DELX="3000.0" + ESGgrid_DELY="3000.0" + + ESGgrid_NX="361" + ESGgrid_NY="361" + + ESGgrid_PAZI="0.0" + + ESGgrid_WIDE_HALO_WIDTH="6" + + DT_ATMOS="${DT_ATMOS:-20}" + + LAYOUT_X="${LAYOUT_X:-18}" + LAYOUT_Y="${LAYOUT_Y:-12}" + BLOCKSIZE="${BLOCKSIZE:-30}" + + if [ "$QUILTING" = "TRUE" ]; then + WRTCMP_write_groups="1" + WRTCMP_write_tasks_per_group=$(( 1*LAYOUT_Y )) + WRTCMP_output_grid="lambert_conformal" + WRTCMP_cen_lon="${ESGgrid_LON_CTR}" + WRTCMP_cen_lat="${ESGgrid_LAT_CTR}" + WRTCMP_stdlat1="${ESGgrid_LAT_CTR}" + WRTCMP_stdlat2="${ESGgrid_LAT_CTR}" + WRTCMP_nx="301" + WRTCMP_ny="301" + WRTCMP_lon_lwr_left="-102.3802487" + WRTCMP_lat_lwr_left="34.3407918" + WRTCMP_dx="${ESGgrid_DELX}" + WRTCMP_dy="${ESGgrid_DELY}" + fi + ;; +# +#----------------------------------------------------------------------- +# # A CONUS domain of GFDLgrid type with ~25km cells. # # Note: diff --git a/ush/set_thompson_mp_fix_files.py b/ush/set_thompson_mp_fix_files.py new file mode 100644 index 000000000..9a1703e09 --- /dev/null +++ b/ush/set_thompson_mp_fix_files.py @@ -0,0 +1,176 @@ +#!/usr/bin/env python3 + +import os +import unittest +from textwrap import dedent + +from python_utils import import_vars,export_vars,set_env_var,list_to_str,\ + print_input_args,print_info_msg, print_err_msg_exit,\ + define_macos_utilities,load_xml_file,has_tag_with_value + +def set_thompson_mp_fix_files(ccpp_phys_suite_fp, thompson_mp_climo_fn): + """ Function that first checks whether the Thompson + microphysics parameterization is being called by the selected physics + suite. If not, it sets the output variable whose name is specified by + output_varname_sdf_uses_thompson_mp to "FALSE" and exits. If so, it + sets this variable to "TRUE" and modifies the workflow arrays + FIXgsm_FILES_TO_COPY_TO_FIXam and CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING + to ensure that fixed files needed by the Thompson microphysics + parameterization are copied to the FIXam directory and that appropriate + symlinks to these files are created in the run directories. + + Args: + ccpp_phys_suite_fp: full path to CCPP physics suite + thompson_mp_climo_fn: netcdf file for thompson microphysics + Returns: + boolean: sdf_uses_thompson_mp + """ + + print_input_args(locals()) + + # import all environment variables + import_vars() + + # + #----------------------------------------------------------------------- + # + # Check the suite definition file to see whether the Thompson microphysics + # parameterization is being used. + # + #----------------------------------------------------------------------- + # + tree = load_xml_file(ccpp_phys_suite_fp) + sdf_uses_thompson_mp = has_tag_with_value(tree, "scheme", "mp_thompson") + # + #----------------------------------------------------------------------- + # + # If the Thompson microphysics parameterization is being used, then... 
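The branch sketched below mirrors the behavior described in the docstring above: check the suite definition file for the mp_thompson scheme and, if it is present, extend the two workflow arrays. It assumes python_utils is importable (as it is for the scripts in ush) and uses the SDF path from this file's unit test; the array updates are only indicated, not reproduced in full:

```
from python_utils import load_xml_file, has_tag_with_value

ccpp_phys_suite_fp = "test_data/suite_FV3_GSD_SAR.xml"   # path from the unit test below
tree = load_xml_file(ccpp_phys_suite_fp)

# True when the suite definition file calls the Thompson microphysics scheme.
sdf_uses_thompson_mp = has_tag_with_value(tree, "scheme", "mp_thompson")

if sdf_uses_thompson_mp:
    # Here the real function appends the Thompson fix files to
    # FIXgsm_FILES_TO_COPY_TO_FIXam and adds "file | file" entries to
    # CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING (see the code that follows).
    pass
```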
+ # + #----------------------------------------------------------------------- + # + if sdf_uses_thompson_mp: + # + #----------------------------------------------------------------------- + # + # Append the names of the fixed files needed by the Thompson microphysics + # parameterization to the workflow array FIXgsm_FILES_TO_COPY_TO_FIXam, + # and append to the workflow array CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING + # the mappings between these files and the names of the corresponding + # symlinks that need to be created in the run directories. + # + #----------------------------------------------------------------------- + # + thompson_mp_fix_files=[ + "CCN_ACTIVATE.BIN", + "freezeH2O.dat", + "qr_acr_qg.dat", + "qr_acr_qs.dat", + "qr_acr_qgV2.dat", + "qr_acr_qsV2.dat" + ] + + if (EXTRN_MDL_NAME_ICS != "HRRR" and EXTRN_MDL_NAME_ICS != "RAP") or \ + (EXTRN_MDL_NAME_LBCS != "HRRR" and EXTRN_MDL_NAME_LBCS != "RAP"): + thompson_mp_fix_files.append(thompson_mp_climo_fn) + + FIXgsm_FILES_TO_COPY_TO_FIXam.extend(thompson_mp_fix_files) + + for fix_file in thompson_mp_fix_files: + mapping=f"{fix_file} | {fix_file}" + CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING.append(mapping) + + msg=dedent(f''' + Since the Thompson microphysics parameterization is being used by this + physics suite (CCPP_PHYS_SUITE), the names of the fixed files needed by + this scheme have been appended to the array FIXgsm_FILES_TO_COPY_TO_FIXam, + and the mappings between these files and the symlinks that need to be + created in the cycle directories have been appended to the array + CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING. After these modifications, the + values of these parameters are as follows: + + ''') + msg+=dedent(f''' + CCPP_PHYS_SUITE = \"{CCPP_PHYS_SUITE}\" + + FIXgsm_FILES_TO_COPY_TO_FIXam = {list_to_str(FIXgsm_FILES_TO_COPY_TO_FIXam)} + ''') + msg+=dedent(f''' + CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING = {list_to_str(CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING)} + ''') + print_info_msg(msg) + + EXPORTS = [ "CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING", "FIXgsm_FILES_TO_COPY_TO_FIXam" ] + export_vars(env_vars=EXPORTS) + + return sdf_uses_thompson_mp + +class Testing(unittest.TestCase): + def test_set_thompson_mp_fix_files(self): + self.assertEqual( True, + set_thompson_mp_fix_files(ccpp_phys_suite_fp=f"test_data{os.sep}suite_FV3_GSD_SAR.xml", + thompson_mp_climo_fn="Thompson_MP_MONTHLY_CLIMO.nc") ) + def setUp(self): + define_macos_utilities(); + set_env_var('DEBUG',True) + set_env_var('VERBOSE',True) + set_env_var('EXTRN_MDL_NAME_ICS',"FV3GFS") + set_env_var('EXTRN_MDL_NAME_LBCS',"FV3GFS") + set_env_var('CCPP_PHYS_SUITE',"FV3_GSD_SAR") + + CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING = [ + "aerosol.dat | global_climaeropac_global.txt", + "co2historicaldata_2010.txt | fix_co2_proj/global_co2historicaldata_2010.txt", + "co2historicaldata_2011.txt | fix_co2_proj/global_co2historicaldata_2011.txt", + "co2historicaldata_2012.txt | fix_co2_proj/global_co2historicaldata_2012.txt", + "co2historicaldata_2013.txt | fix_co2_proj/global_co2historicaldata_2013.txt", + "co2historicaldata_2014.txt | fix_co2_proj/global_co2historicaldata_2014.txt", + "co2historicaldata_2015.txt | fix_co2_proj/global_co2historicaldata_2015.txt", + "co2historicaldata_2016.txt | fix_co2_proj/global_co2historicaldata_2016.txt", + "co2historicaldata_2017.txt | fix_co2_proj/global_co2historicaldata_2017.txt", + "co2historicaldata_2018.txt | fix_co2_proj/global_co2historicaldata_2018.txt", + "co2historicaldata_2019.txt | fix_co2_proj/global_co2historicaldata_2019.txt", + 
"co2historicaldata_2020.txt | fix_co2_proj/global_co2historicaldata_2020.txt", + "co2historicaldata_2021.txt | fix_co2_proj/global_co2historicaldata_2021.txt", + "co2historicaldata_glob.txt | global_co2historicaldata_glob.txt", + "co2monthlycyc.txt | co2monthlycyc.txt", + "global_h2oprdlos.f77 | global_h2o_pltc.f77", + "global_zorclim.1x1.grb | global_zorclim.1x1.grb", + "sfc_emissivity_idx.txt | global_sfc_emissivity_idx.txt", + "solarconstant_noaa_an.txt | global_solarconstant_noaa_an.txt", + "global_o3prdlos.f77 | ozprdlos_2015_new_sbuvO3_tclm15_nuchem.f77"] + + FIXgsm_FILES_TO_COPY_TO_FIXam = [ + "global_glacier.2x2.grb", + "global_maxice.2x2.grb", + "RTGSST.1982.2012.monthly.clim.grb", + "global_snoclim.1.875.grb", + "CFSR.SEAICE.1982.2012.monthly.clim.grb", + "global_soilmgldas.t126.384.190.grb", + "seaice_newland.grb", + "global_climaeropac_global.txt", + "fix_co2_proj/global_co2historicaldata_2010.txt", + "fix_co2_proj/global_co2historicaldata_2011.txt", + "fix_co2_proj/global_co2historicaldata_2012.txt", + "fix_co2_proj/global_co2historicaldata_2013.txt", + "fix_co2_proj/global_co2historicaldata_2014.txt", + "fix_co2_proj/global_co2historicaldata_2015.txt", + "fix_co2_proj/global_co2historicaldata_2016.txt", + "fix_co2_proj/global_co2historicaldata_2017.txt", + "fix_co2_proj/global_co2historicaldata_2018.txt", + "fix_co2_proj/global_co2historicaldata_2019.txt", + "fix_co2_proj/global_co2historicaldata_2020.txt", + "fix_co2_proj/global_co2historicaldata_2021.txt", + "global_co2historicaldata_glob.txt", + "co2monthlycyc.txt", + "global_h2o_pltc.f77", + "global_hyblev.l65.txt", + "global_zorclim.1x1.grb", + "global_sfc_emissivity_idx.txt", + "global_solarconstant_noaa_an.txt", + "geo_em.d01.lat-lon.2.5m.HGT_M.nc", + "HGT.Beljaars_filtered.lat-lon.30s_res.nc", + "ozprdlos_2015_new_sbuvO3_tclm15_nuchem.f77"] + + set_env_var('CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING', CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING) + set_env_var('FIXgsm_FILES_TO_COPY_TO_FIXam', FIXgsm_FILES_TO_COPY_TO_FIXam) + diff --git a/ush/setup.py b/ush/setup.py new file mode 100644 index 000000000..2e6cb912c --- /dev/null +++ b/ush/setup.py @@ -0,0 +1,2220 @@ +#!/usr/bin/env python3 + +import os +import sys +import datetime +from textwrap import dedent + +from python_utils import cd_vrfy, mkdir_vrfy, rm_vrfy, check_var_valid_value,\ + lowercase,uppercase,check_for_preexist_dir_file,\ + list_to_str, type_to_str, \ + import_vars, export_vars, get_env_var, print_info_msg,\ + print_err_msg_exit, load_config_file, cfg_to_shell_str,\ + load_shell_config, load_ini_config, get_ini_value + +from set_cycle_dates import set_cycle_dates +from set_predef_grid_params import set_predef_grid_params +from set_ozone_param import set_ozone_param +from set_extrn_mdl_params import set_extrn_mdl_params +from set_gridparams_ESGgrid import set_gridparams_ESGgrid +from set_gridparams_GFDLgrid import set_gridparams_GFDLgrid +from link_fix import link_fix +from check_ruc_lsm import check_ruc_lsm +from set_thompson_mp_fix_files import set_thompson_mp_fix_files + +def setup(): + """ Function that sets a secondary set + of parameters needed by the various scripts that are called by the + FV3-LAM rocoto community workflow. This secondary set of parameters is + calculated using the primary set of user-defined parameters in the de- + fault and custom experiment/workflow configuration scripts (whose file + names are defined below). 
This script then saves both sets of parame-
+    ters in a global variable definitions file (really a bash script) in
+    the experiment directory. This file then gets sourced by the various
+    scripts called by the tasks in the workflow.
+
+    Args:
+        None
+    Returns:
+        None
+    """
+
+    ushdir=os.path.dirname(os.path.abspath(__file__))
+    cd_vrfy(ushdir)
+
+    # import all environment variables
+    import_vars()
+
+    # print message
+    print_info_msg(f'''
+        ========================================================================
+        Starting function setup() in \"{os.path.basename(__file__)}\"...
+        ========================================================================''')
+    #
+    #-----------------------------------------------------------------------
+    #
+    # Set the name of the configuration file containing default values for
+    # the experiment/workflow variables. Then source the file.
+    #
+    #-----------------------------------------------------------------------
+    #
+    EXPT_DEFAULT_CONFIG_FN="config_defaults.yaml"
+    cfg_d = load_config_file(EXPT_DEFAULT_CONFIG_FN)
+    import_vars(dictionary=cfg_d)
+    #
+    #-----------------------------------------------------------------------
+    #
+    # If a user-specified configuration file exists, source it. This file
+    # contains user-specified values for a subset of the experiment/workflow
+    # variables that override their default values. Note that the user-
+    # specified configuration file is not tracked by the repository, whereas
+    # the default configuration file is tracked.
+    #
+    #-----------------------------------------------------------------------
+    #
+    if os.path.exists(EXPT_CONFIG_FN):
+        #
+        # We require that the variables being set in the user-specified configu-
+        # ration file have counterparts in the default configuration file. This
+        # is so that we do not introduce new variables in the user-specified
+        # configuration file without also officially introducing them in the de-
+        # fault configuration file. Thus, before sourcing the user-specified
+        # configuration file, we check that all variables in the user-specified
+        # configuration file are also assigned default values in the default
+        # configuration file.
+        #
+        cfg_u = load_config_file(os.path.join(ushdir,EXPT_CONFIG_FN))
+        if not (cfg_u.keys() <= cfg_d.keys()):
+            print_err_msg_exit(f'''
+                User specified config file {EXPT_CONFIG_FN} has variables that are
+                not defined in the default configuration file {EXPT_DEFAULT_CONFIG_FN}''')
+        cfg_d.update(cfg_u)
+        import_vars(dictionary=cfg_u)
+
+    #
+    #-----------------------------------------------------------------------
+    #
+    # If PREDEF_GRID_NAME is set to a non-empty string, set or reset parameters
+    # according to the predefined domain specified.
+    #
+    #-----------------------------------------------------------------------
+    #
+
+    # export env vars before calling another module
+    export_vars()
+
+    if PREDEF_GRID_NAME:
+        set_predef_grid_params()
+
+    import_vars()
+
+    #
+    #-----------------------------------------------------------------------
+    #
+    # Make sure different variables are set to their corresponding valid values.
+    #
+    #-----------------------------------------------------------------------
+    #
+    global VERBOSE
+    if DEBUG and not VERBOSE:
+        print_info_msg('''
+            Resetting VERBOSE to \"TRUE\" because DEBUG has been set to \"TRUE\"...''')
+        VERBOSE=True
+
+    #
+    #-----------------------------------------------------------------------
+    #
+    # Set magnitude of stochastic ad-hoc schemes to -999.0 if they are not
+    # being used.
This is required at the moment, since "do_shum/sppt/skeb" + # does not override the use of the scheme unless the magnitude is also + # specifically set to -999.0. If all "do_shum/sppt/skeb" are set to + # "false," then none will run, regardless of the magnitude values. + # + #----------------------------------------------------------------------- + # + global SHUM_MAG, SKEB_MAG, SPPT_MAG + if not DO_SHUM: + SHUM_MAG=-999.0 + if not DO_SKEB: + SKEB_MAG=-999.0 + if not DO_SPPT: + SPPT_MAG=-999.0 + # + #----------------------------------------------------------------------- + # + # If running with SPP in MYNN PBL, MYNN SFC, GSL GWD, Thompson MP, or + # RRTMG, count the number of entries in SPP_VAR_LIST to correctly set + # N_VAR_SPP, otherwise set it to zero. + # + #----------------------------------------------------------------------- + # + global N_VAR_SPP + N_VAR_SPP=0 + if DO_SPP: + N_VAR_SPP = len(SPP_VAR_LIST) + # + #----------------------------------------------------------------------- + # + # If running with Noah or RUC-LSM SPP, count the number of entries in + # LSM_SPP_VAR_LIST to correctly set N_VAR_LNDP, otherwise set it to zero. + # Also set LNDP_TYPE to 2 for LSM SPP, otherwise set it to zero. Finally, + # initialize an "FHCYC_LSM_SPP" variable to 0 and set it to 999 if LSM SPP + # is turned on. This requirement is necessary since LSM SPP cannot run with + # FHCYC=0 at the moment, but FHCYC cannot be set to anything less than the + # length of the forecast either. A bug fix will be submitted to + # ufs-weather-model soon, at which point, this requirement can be removed + # from regional_workflow. + # + #----------------------------------------------------------------------- + # + global N_VAR_LNDP, LNDP_TYPE, FHCYC_LSM_SPP_OR_NOT + N_VAR_LNDP=0 + LNDP_TYPE=0 + FHCYC_LSM_SPP_OR_NOT=0 + if DO_LSM_SPP: + N_VAR_LNDP=len(LSM_SPP_VAR_LIST) + LNDP_TYPE=2 + FHCYC_LSM_SPP_OR_NOT=999 + # + #----------------------------------------------------------------------- + # + # If running with SPP, confirm that each SPP-related namelist value + # contains the same number of entries as N_VAR_SPP (set above to be equal + # to the number of entries in SPP_VAR_LIST). + # + #----------------------------------------------------------------------- + # + if DO_SPP: + if ( len(SPP_MAG_LIST) != N_VAR_SPP ) or \ + ( len(SPP_LSCALE) != N_VAR_SPP) or \ + ( len(SPP_TSCALE) != N_VAR_SPP) or \ + ( len(SPP_SIGTOP1) != N_VAR_SPP) or \ + ( len(SPP_SIGTOP2) != N_VAR_SPP) or \ + ( len(SPP_STDDEV_CUTOFF) != N_VAR_SPP) or \ + ( len(ISEED_SPP) != N_VAR_SPP): + print_err_msg_exit(f''' + All MYNN PBL, MYNN SFC, GSL GWD, Thompson MP, or RRTMG SPP-related namelist + variables set in {CONFIG_FN} must be equal in number of entries to what is + found in SPP_VAR_LIST: + Number of entries in SPP_VAR_LIST = \"{len(SPP_VAR_LIST)}\"''') + # + #----------------------------------------------------------------------- + # + # If running with LSM SPP, confirm that each LSM SPP-related namelist + # value contains the same number of entries as N_VAR_LNDP (set above to + # be equal to the number of entries in LSM_SPP_VAR_LIST). 
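+ # (Illustration with hypothetical values: if LSM_SPP_VAR_LIST were set to + # [ "smc", "vgf", "alb" ], then N_VAR_LNDP above would be 3, and each of + # LSM_SPP_MAG_LIST, LSM_SPP_LSCALE, and LSM_SPP_TSCALE would also have to + # contain exactly 3 entries to pass the check below.)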
+ # + #----------------------------------------------------------------------- + # + if DO_LSM_SPP: + if ( len(LSM_SPP_MAG_LIST) != N_VAR_LNDP) or \ + ( len(LSM_SPP_LSCALE) != N_VAR_LNDP) or \ + ( len(LSM_SPP_TSCALE) != N_VAR_LNDP): + print_err_msg_exit(f''' + All Noah or RUC-LSM SPP-related namelist variables (except ISEED_LSM_SPP) + set in {CONFIG_FN} must be equal in number of entries to what is found in + LSM_SPP_VAR_LIST: + Number of entries in LSM_SPP_VAR_LIST = \"{len(LSM_SPP_VAR_LIST)}\"''') + # + # The current script should be located in the ush subdirectory of the + # workflow directory, which in turn is located under the app's top-level + # directory. Thus, the app's top-level directory (SR_WX_APP_TOP_DIR) is + # two levels above the directory of the current script. + # + SR_WX_APP_TOP_DIR=os.path.abspath( os.path.dirname(__file__) + \ + os.sep + os.pardir + os.sep + os.pardir) + + # + #----------------------------------------------------------------------- + # + # Set the base directories in which codes obtained from external reposi- + # tories (using the manage_externals tool) are placed. Obtain the rela- + # tive paths to these directories by reading them in from the manage_ex- + # ternals configuration file. (Note that these are relative to the lo- + # cation of the configuration file.) Then form the full paths to these + # directories. Finally, make sure that each of these directories actu- + # ally exists. + # + #----------------------------------------------------------------------- + # + mng_extrns_cfg_fn = os.path.join(SR_WX_APP_TOP_DIR, "Externals.cfg") + try: + mng_extrns_cfg_fn = os.readlink(mng_extrns_cfg_fn) + except: + pass + property_name="local_path" + cfg = load_ini_config(mng_extrns_cfg_fn) + # + # Get the path to the workflow scripts + # + external_name="regional_workflow" + HOMErrfs = get_ini_value(cfg, external_name, property_name) + + if not HOMErrfs: + print_err_msg_exit(f''' + Externals.cfg does not contain "{external_name}".''') + + HOMErrfs = os.path.join(SR_WX_APP_TOP_DIR, HOMErrfs) + # + # Get the base directory of the FV3 forecast model code. + # + external_name=FCST_MODEL + UFS_WTHR_MDL_DIR = get_ini_value(cfg, external_name,property_name) + + if not UFS_WTHR_MDL_DIR: + print_err_msg_exit(f''' + Externals.cfg does not contain "{external_name}".''') + + UFS_WTHR_MDL_DIR=os.path.join(SR_WX_APP_TOP_DIR, UFS_WTHR_MDL_DIR) + if not os.path.exists(UFS_WTHR_MDL_DIR): + print_err_msg_exit(f''' + The base directory in which the FV3 source code should be located + (UFS_WTHR_MDL_DIR) does not exist: + UFS_WTHR_MDL_DIR = \"{UFS_WTHR_MDL_DIR}\" + Please clone the external repository containing the code in this directory, + build the executable, and then rerun the workflow.''') + # + # Get the base directory of the UFS_UTILS codes. + # + external_name="ufs_utils" + UFS_UTILS_DIR=get_ini_value(cfg, external_name, property_name) + + if not UFS_UTILS_DIR: + print_err_msg_exit(f''' + Externals.cfg does not contain "{external_name}".''') + + UFS_UTILS_DIR=os.path.join(SR_WX_APP_TOP_DIR, UFS_UTILS_DIR) + if not os.path.exists(UFS_UTILS_DIR): + print_err_msg_exit(f''' + The base directory in which the UFS utilities source codes should be lo- + cated (UFS_UTILS_DIR) does not exist: + UFS_UTILS_DIR = \"{UFS_UTILS_DIR}\" + Please clone the external repository containing the code in this direct- + ory, build the executables, and then rerun the workflow.''') + # + # Get the base directory of the UPP code.
+ # + external_name="UPP" + UPP_DIR=get_ini_value(cfg, external_name, property_name) + if not UPP_DIR: + print_err_msg_exit(f''' + Externals.cfg does not contain "{external_name}".''') + + UPP_DIR=os.path.join(SR_WX_APP_TOP_DIR, UPP_DIR) + if not os.path.exists(UPP_DIR): + print_err_msg_exit(f''' + The base directory in which the UPP source code should be located + (UPP_DIR) does not exist: + UPP_DIR = \"{UPP_DIR}\" + Please clone the external repository containing the code in this directory, + build the executable, and then rerun the workflow.''') + + # + # Define some other useful paths + # + global USHDIR, SCRIPTSDIR, JOBSDIR,SORCDIR, SRC_DIR, PARMDIR, MODULES_DIR, EXECDIR, TEMPLATE_DIR, \ + VX_CONFIG_DIR, METPLUS_CONF, MET_CONFIG + + USHDIR = os.path.join(HOMErrfs,"ush") + SCRIPTSDIR = os.path.join(HOMErrfs,"scripts") + JOBSDIR = os.path.join(HOMErrfs,"jobs") + SORCDIR = os.path.join(HOMErrfs,"sorc") + SRC_DIR = os.path.join(SR_WX_APP_TOP_DIR,"src") + PARMDIR = os.path.join(HOMErrfs,"parm") + MODULES_DIR = os.path.join(HOMErrfs,"modulefiles") + EXECDIR = os.path.join(SR_WX_APP_TOP_DIR,"bin") + TEMPLATE_DIR = os.path.join(USHDIR,"templates") + VX_CONFIG_DIR = os.path.join(TEMPLATE_DIR,"parm") + METPLUS_CONF = os.path.join(TEMPLATE_DIR,"parm","metplus") + MET_CONFIG = os.path.join(TEMPLATE_DIR,"parm","met") + + # + #----------------------------------------------------------------------- + # + # Source the machine config file containing architecture information, + # queue names, and supported input file paths. + # + #----------------------------------------------------------------------- + # + global MACHINE + global MACHINE_FILE + global FIXgsm, FIXaer, FIXlut, TOPO_DIR, SFC_CLIMO_INPUT_DIR, DOMAIN_PREGEN_BASEDIR, \ + RELATIVE_LINK_FLAG, WORKFLOW_MANAGER, NCORES_PER_NODE, SCHED, \ + QUEUE_DEFAULT, QUEUE_HPSS, QUEUE_FCST, \ + PARTITION_DEFAULT, PARTITION_HPSS, PARTITION_FCST + + MACHINE = uppercase(MACHINE) + RELATIVE_LINK_FLAG="--relative" + MACHINE_FILE=MACHINE_FILE or os.path.join(USHDIR,"machine",f"{lowercase(MACHINE)}.sh") + machine_cfg = load_shell_config(MACHINE_FILE) + import_vars(dictionary=machine_cfg) + + if not NCORES_PER_NODE: + print_err_msg_exit(f""" + NCORES_PER_NODE has not been specified in the file {MACHINE_FILE} + Please ensure this value has been set for your desired platform. """) + + if not (FIXgsm and FIXaer and FIXlut and TOPO_DIR and SFC_CLIMO_INPUT_DIR): + print_err_msg_exit(f''' + One or more fix file directories have not been specified for this machine: + MACHINE = \"{MACHINE}\" + FIXgsm = \"{FIXgsm or ""}\" + FIXaer = \"{FIXaer or ""}\" + FIXlut = \"{FIXlut or ""}\" + TOPO_DIR = \"{TOPO_DIR or ""}\" + SFC_CLIMO_INPUT_DIR = \"{SFC_CLIMO_INPUT_DIR or ""}\" + DOMAIN_PREGEN_BASEDIR = \"{DOMAIN_PREGEN_BASEDIR or ""}\" + You can specify the missing location(s) in config.sh''') + + # + #----------------------------------------------------------------------- + # + # Set the names of the build and workflow module files (if not + # already specified by the user). These are the files that need to be + # sourced before building the component SRW App codes and running various + # workflow scripts, respectively.
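+ # (For example, under hypothetical settings MACHINE="GAEA" and + # COMPILER="intel", the defaults below would be WFLOW_MOD_FN="wflow_gaea" + # and BUILD_MOD_FN="build_gaea_intel".)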
+ # + #----------------------------------------------------------------------- + # + global WFLOW_MOD_FN, BUILD_MOD_FN + machine=lowercase(MACHINE) + WFLOW_MOD_FN=WFLOW_MOD_FN or f"wflow_{machine}" + BUILD_MOD_FN=BUILD_MOD_FN or f"build_{machine}_{COMPILER}" + # + #----------------------------------------------------------------------- + # + # Calculate a default value for the number of processes per node for the + # RUN_FCST_TN task. Then set PPN_RUN_FCST to this default value if + # PPN_RUN_FCST is not already specified by the user. + # + #----------------------------------------------------------------------- + # + global PPN_RUN_FCST + ppn_run_fcst_default = NCORES_PER_NODE // OMP_NUM_THREADS_RUN_FCST + PPN_RUN_FCST=PPN_RUN_FCST or ppn_run_fcst_default + # + #----------------------------------------------------------------------- + # + # If we are using a workflow manager, check that the ACCOUNT variable is + # not empty. + # + #----------------------------------------------------------------------- + # + if WORKFLOW_MANAGER != "none": + if not ACCOUNT: + print_err_msg_exit(f''' + The variable ACCOUNT cannot be empty if you are using a workflow manager: + ACCOUNT = \"{ACCOUNT}\" + WORKFLOW_MANAGER = \"{WORKFLOW_MANAGER}\"''') + # + #----------------------------------------------------------------------- + # + # Set the grid type (GTYPE). In general, in the FV3 code, this can take + # on one of the following values: "global", "stretch", "nest", and "re- + # gional". The first three values are for various configurations of a + # global grid, while the last one is for a regional grid. Since here we + # are only interested in a regional grid, GTYPE must be set to "region- + # al". + # + #----------------------------------------------------------------------- + # + global TILE_RGNL, GTYPE + GTYPE="regional" + TILE_RGNL="7" + + #----------------------------------------------------------------------- + # + # Set USE_MERRA_CLIMO to either "TRUE" or "FALSE" so we don't + # have to consider other valid values later on. + # + #----------------------------------------------------------------------- + global USE_MERRA_CLIMO + if CCPP_PHYS_SUITE == "FV3_GFS_v15_thompson_mynn_lam3km": + USE_MERRA_CLIMO=True + # + #----------------------------------------------------------------------- + # + # Set CPL to TRUE/FALSE based on FCST_MODEL. + # + #----------------------------------------------------------------------- + # + global CPL + if FCST_MODEL == "ufs-weather-model": + CPL=False + elif FCST_MODEL == "fv3gfs_aqm": + CPL=True + else: + print_err_msg_exit(f''' + The coupling flag CPL has not been specified for this value of FCST_MODEL: + FCST_MODEL = \"{FCST_MODEL}\"''') + # + #----------------------------------------------------------------------- + # + # Make sure RESTART_INTERVAL is set to an integer value if present + # + #----------------------------------------------------------------------- + # + if not isinstance(RESTART_INTERVAL,int): + print_err_msg_exit(f''' + RESTART_INTERVAL must be set to an integer number of hours. + RESTART_INTERVAL = \"{RESTART_INTERVAL}\"''') + # + #----------------------------------------------------------------------- + # + # Check that DATE_FIRST_CYCL and DATE_LAST_CYCL are dates of the form + # YYYYMMDD.
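+ # (Note: if the configuration is read from a YAML file, an entry such as + # a hypothetical "DATE_FIRST_CYCL: 2019-06-15" is presumably loaded as a + # datetime.date object, which is what the isinstance() checks below test + # for.)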
+ # + #----------------------------------------------------------------------- + # + if not isinstance(DATE_FIRST_CYCL,datetime.date): + print_err_msg_exit(f''' + DATE_FIRST_CYCL must be a date of the form \"YYYYMMDD\", where YYYY is + the 4-digit year, MM is the 2-digit month, and DD is the 2-digit + day-of-month. + DATE_FIRST_CYCL = \"{DATE_FIRST_CYCL}\"''') + + if not isinstance(DATE_LAST_CYCL,datetime.date): + print_err_msg_exit(f''' + DATE_LAST_CYCL must be a date of the form \"YYYYMMDD\", where YYYY is + the 4-digit year, MM is the 2-digit month, and DD is the 2-digit + day-of-month. + DATE_LAST_CYCL = \"{DATE_LAST_CYCL}\"''') + # + #----------------------------------------------------------------------- + # + # Check that all elements of CYCL_HRS are integers between 0 and 23, + # inclusive, i.e. valid hours of the day. + # + #----------------------------------------------------------------------- + # + i=0 + for CYCL in CYCL_HRS: + if CYCL < 0 or CYCL > 23: + print_err_msg_exit(f''' + Each element of CYCL_HRS must be an integer between 0 and 23, inclusive, + specifying an hour-of-day. Element #{i} of CYCL_HRS (where the index of + the first element is 0) does not satisfy this requirement: + CYCL_HRS = {CYCL_HRS} + CYCL_HRS[{i}] = \"{CYCL_HRS[i]}\"''') + + i=i+1 + # + #----------------------------------------------------------------------- + # Check the cycle increment (INCR_CYCL_FREQ) against the number of cycle + # hours, but only if INCR_CYCL_FREQ < 24. + #----------------------------------------------------------------------- + # + if INCR_CYCL_FREQ < 24 and i > 1: + cycl_intv=(24//i) + if cycl_intv != INCR_CYCL_FREQ: + print_err_msg_exit(f''' + The number of CYCL_HRS does not match that expected by INCR_CYCL_FREQ: + INCR_CYCL_FREQ = {INCR_CYCL_FREQ} + cycle interval implied by the number of CYCL_HRS = {cycl_intv} + CYCL_HRS = {CYCL_HRS} ''') + + for itmp in range(1,i): + itm1=itmp-1 + cycl_next_itmp=CYCL_HRS[itm1] + INCR_CYCL_FREQ + if cycl_next_itmp != CYCL_HRS[itmp]: + print_err_msg_exit(f''' + Element {itmp} of CYCL_HRS does not match the increment of cycle + frequency INCR_CYCL_FREQ: + CYCL_HRS = {CYCL_HRS} + INCR_CYCL_FREQ = {INCR_CYCL_FREQ} + CYCL_HRS[{itmp}] = \"{CYCL_HRS[itmp]}\"''') + # + #----------------------------------------------------------------------- + # + # Call a function to generate the array ALL_CDATES containing the cycle + # dates/hours for which to run forecasts. The elements of this array + # will have the form YYYYMMDDHH. They are the starting dates/times of + # the forecasts that will be run in the experiment. Then set NUM_CYCLES + # to the number of elements in this array. + # + #----------------------------------------------------------------------- + # + + ALL_CDATES = set_cycle_dates( \ + date_start=DATE_FIRST_CYCL, + date_end=DATE_LAST_CYCL, + cycle_hrs=CYCL_HRS, + incr_cycl_freq=INCR_CYCL_FREQ) + + NUM_CYCLES=len(ALL_CDATES) + + if NUM_CYCLES > 90: + ALL_CDATES=None + print_info_msg(f''' + Too many cycles in ALL_CDATES to list; redefining in abbreviated form. + ALL_CDATES = \"{DATE_FIRST_CYCL}{CYCL_HRS[0]}...{DATE_LAST_CYCL}{CYCL_HRS[-1]}\"''') + # + #----------------------------------------------------------------------- + # + # If using a custom post configuration file, make sure that it exists.
+ # + #----------------------------------------------------------------------- + # + if USE_CUSTOM_POST_CONFIG_FILE: + if not os.path.exists(CUSTOM_POST_CONFIG_FP): + print_err_msg_exit(f''' + The custom post configuration specified by CUSTOM_POST_CONFIG_FP does not + exist: + CUSTOM_POST_CONFIG_FP = \"{CUSTOM_POST_CONFIG_FP}\"''') + # + #----------------------------------------------------------------------- + # + # If using external CRTM fix files to allow post-processing of synthetic + # satellite products from the UPP, then make sure the fix file directory + # exists. + # + #----------------------------------------------------------------------- + # + if USE_CRTM: + if not os.path.exists(CRTM_DIR): + print_err_msg_exit(f''' + The external CRTM fix file directory specified by CRTM_DIR does not exist: + CRTM_DIR = \"{CRTM_DIR}\"''') + # + #----------------------------------------------------------------------- + # + # The forecast length (in integer hours) cannot contain more than 3 cha- + # racters. Thus, its maximum value is 999. Check whether the specified + # forecast length exceeds this maximum value. If so, print out a warn- + # ing and exit this script. + # + #----------------------------------------------------------------------- + # + fcst_len_hrs_max=999 + if FCST_LEN_HRS > fcst_len_hrs_max: + print_err_msg_exit(f''' + Forecast length is greater than maximum allowed length: + FCST_LEN_HRS = {FCST_LEN_HRS} + fcst_len_hrs_max = {fcst_len_hrs_max}''') + # + #----------------------------------------------------------------------- + # + # Check whether the forecast length (FCST_LEN_HRS) is evenly divisible + # by the BC update interval (LBC_SPEC_INTVL_HRS). If not, print out a + # warning and exit this script. If so, generate an array of forecast + # hours at which the boundary values will be updated. + # + #----------------------------------------------------------------------- + # + rem=FCST_LEN_HRS%LBC_SPEC_INTVL_HRS + + if rem != 0: + print_err_msg_exit(f''' + The forecast length (FCST_LEN_HRS) is not evenly divisible by the lateral + boundary conditions update interval (LBC_SPEC_INTVL_HRS): + FCST_LEN_HRS = {FCST_LEN_HRS} + LBC_SPEC_INTVL_HRS = {LBC_SPEC_INTVL_HRS} + rem = FCST_LEN_HRS%%LBC_SPEC_INTVL_HRS = {rem}''') + # + #----------------------------------------------------------------------- + # + # Set the array containing the forecast hours at which the lateral + # boundary conditions (LBCs) need to be updated. Note that this array + # does not include the 0-th hour (initial time). + # + #----------------------------------------------------------------------- + # + LBC_SPEC_FCST_HRS=[ i for i in range(LBC_SPEC_INTVL_HRS, \ + LBC_SPEC_INTVL_HRS + FCST_LEN_HRS, \ + LBC_SPEC_INTVL_HRS ) ] + # + #----------------------------------------------------------------------- + # + # Check to make sure that various computational parameters needed by the + # forecast model are set to non-empty values. At this point in the + # experiment generation, all of these should be set to valid (non-empty) + # values. 
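+ # (These are typically supplied by the predefined grid parameters set via + # set_predef_grid_params() above; purely as a hypothetical illustration, a + # predefined grid might provide DT_ATMOS=36, LAYOUT_X=5, LAYOUT_Y=2, and + # BLOCKSIZE=40.)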
+ # + #----------------------------------------------------------------------- + # + if not DT_ATMOS: + print_err_msg_exit(f''' + The forecast model main time step (DT_ATMOS) is set to a null string: + DT_ATMOS = {DT_ATMOS} + Please set this to a valid numerical value in the user-specified experiment + configuration file (EXPT_CONFIG_FP) and rerun: + EXPT_CONFIG_FP = \"{EXPT_CONFIG_FP}\"''') + + if not LAYOUT_X: + print_err_msg_exit(f''' + The number of MPI processes to be used in the x direction (LAYOUT_X) by + the forecast job is set to a null string: + LAYOUT_X = {LAYOUT_X} + Please set this to a valid numerical value in the user-specified experiment + configuration file (EXPT_CONFIG_FP) and rerun: + EXPT_CONFIG_FP = \"{EXPT_CONFIG_FP}\"''') + + if not LAYOUT_Y: + print_err_msg_exit(f''' + The number of MPI processes to be used in the y direction (LAYOUT_Y) by + the forecast job is set to a null string: + LAYOUT_Y = {LAYOUT_Y} + Please set this to a valid numerical value in the user-specified experiment + configuration file (EXPT_CONFIG_FP) and rerun: + EXPT_CONFIG_FP = \"{EXPT_CONFIG_FP}\"''') + + if not BLOCKSIZE: + print_err_msg_exit(f''' + The cache size to use for each MPI task of the forecast (BLOCKSIZE) is + set to a null string: + BLOCKSIZE = {BLOCKSIZE} + Please set this to a valid numerical value in the user-specified experiment + configuration file (EXPT_CONFIG_FP) and rerun: + EXPT_CONFIG_FP = \"{EXPT_CONFIG_FP}\"''') + # + #----------------------------------------------------------------------- + # + # If performing sub-hourly model output and post-processing, check that + # the output interval DT_SUBHOURLY_POST_MNTS (in minutes) is specified + # correctly. + # + #----------------------------------------------------------------------- + # + global SUB_HOURLY_POST + + if SUB_HOURLY_POST: + # + # Check that DT_SUBHOURLY_POST_MNTS is between 0 and 59, inclusive. + # + if DT_SUBHOURLY_POST_MNTS < 0 or DT_SUBHOURLY_POST_MNTS > 59: + print_err_msg_exit(f''' + When performing sub-hourly post (i.e. SUB_HOURLY_POST set to \"TRUE\"), + DT_SUBHOURLY_POST_MNTS must be set to an integer between 0 and 59, + inclusive but in this case is not: + SUB_HOURLY_POST = \"{SUB_HOURLY_POST}\" + DT_SUBHOURLY_POST_MNTS = \"{DT_SUBHOURLY_POST_MNTS}\"''') + # + # Check that DT_SUBHOURLY_POST_MNTS (after converting to seconds) is + # evenly divisible by the forecast model's main time step DT_ATMOS. + # + rem=( DT_SUBHOURLY_POST_MNTS*60 % DT_ATMOS ) + if rem != 0: + print_err_msg_exit(f''' + When performing sub-hourly post (i.e. SUB_HOURLY_POST set to \"TRUE\"), + the time interval specified by DT_SUBHOURLY_POST_MNTS (after converting + to seconds) must be evenly divisible by the time step DT_ATMOS used in + the forecast model, i.e. the remainder (rem) must be zero. In this case, + it is not: + SUB_HOURLY_POST = \"{SUB_HOURLY_POST}\" + DT_SUBHOURLY_POST_MNTS = \"{DT_SUBHOURLY_POST_MNTS}\" + DT_ATMOS = \"{DT_ATMOS}\" + rem = (DT_SUBHOURLY_POST_MNTS*60) %% DT_ATMOS = {rem} + Please reset DT_SUBHOURLY_POST_MNTS and/or DT_ATMOS so that this remainder + is zero.''') + # + # If DT_SUBHOURLY_POST_MNTS is set to 0 (with SUB_HOURLY_POST set to + # True), then we're not really performing subhourly post-processing. + # In this case, reset SUB_HOURLY_POST to False and print out an + # informational message that such a change was made. + # + if DT_SUBHOURLY_POST_MNTS == 0: + print_info_msg(f''' + When performing sub-hourly post (i.e. 
SUB_HOURLY_POST set to \"TRUE\"), + DT_SUBHOURLY_POST_MNTS must be set to a value greater than 0; otherwise, + sub-hourly output is not really being performed: + SUB_HOURLY_POST = \"{SUB_HOURLY_POST}\" + DT_SUBHOURLY_POST_MNTS = \"{DT_SUBHOURLY_POST_MNTS}\" + Resetting SUB_HOURLY_POST to \"FALSE\". If you do not want this, you + must set DT_SUBHOURLY_POST_MNTS to something other than zero.''') + SUB_HOURLY_POST=False + # + #----------------------------------------------------------------------- + # + # If the base directory (EXPT_BASEDIR) in which the experiment subdirectory + # (EXPT_SUBDIR) will be located does not start with a "/", then it is + # either set to a null string or contains a relative directory. In both + # cases, prepend to it the absolute path of the default directory under + # which the experiment directories are placed. If EXPT_BASEDIR was set + # to a null string, it will get reset to this default experiment directory, + # and if it was set to a relative directory, it will get reset to an + # absolute directory that points to the relative directory under the + # default experiment directory. Then create EXPT_BASEDIR if it doesn't + # already exist. + # + #----------------------------------------------------------------------- + # + global EXPT_BASEDIR + if (not EXPT_BASEDIR) or (EXPT_BASEDIR[0] != "/"): + if not EXPT_BASEDIR: + EXPT_BASEDIR = "" + EXPT_BASEDIR = os.path.join(SR_WX_APP_TOP_DIR,"..","expt_dirs",EXPT_BASEDIR) + try: + EXPT_BASEDIR = os.path.realpath(EXPT_BASEDIR) + except: + pass + EXPT_BASEDIR = os.path.abspath(EXPT_BASEDIR) + + mkdir_vrfy(f' -p "{EXPT_BASEDIR}"') + # + #----------------------------------------------------------------------- + # + # If the experiment subdirectory name (EXPT_SUBDIR) is set to an empty + # string, print out an error message and exit. + # + #----------------------------------------------------------------------- + # + if not EXPT_SUBDIR: + print_err_msg_exit(f''' + The name of the experiment subdirectory (EXPT_SUBDIR) cannot be empty: + EXPT_SUBDIR = \"{EXPT_SUBDIR}\"''') + # + #----------------------------------------------------------------------- + # + # Set the full path to the experiment directory. Then check if it already + # exists and if so, deal with it as specified by PREEXISTING_DIR_METHOD. + # + #----------------------------------------------------------------------- + # + global EXPTDIR + EXPTDIR = os.path.join(EXPT_BASEDIR, EXPT_SUBDIR) + check_for_preexist_dir_file(EXPTDIR,PREEXISTING_DIR_METHOD) + # + #----------------------------------------------------------------------- + # + # Set other directories, some of which may depend on EXPTDIR (depending + # on whether we're running in NCO or community mode, i.e. whether RUN_ENVIR + # is set to "nco" or "community"). Definitions: + # + # LOGDIR: + # Directory in which the log files from the workflow tasks will be placed. + # + # FIXam: + # This is the directory that will contain the fixed files or symlinks to + # the fixed files containing various fields on global grids (which are + # usually much coarser than the native FV3-LAM grid). + # + # FIXclim: + # This is the directory that will contain the MERRA2 aerosol climatology + # data file and lookup tables for optics properties + # + # FIXLAM: + # This is the directory that will contain the fixed files or symlinks to + # the fixed files containing the grid, orography, and surface climatology + # on the native FV3-LAM grid. 
+ # + # CYCLE_BASEDIR: + # The base directory in which the directories for the various cycles will + # be placed. + # + # COMROOT: + # In NCO mode, this is the full path to the "com" directory under which + # output from the RUN_POST_TN task will be placed. Note that this output + # is not placed directly under COMROOT but several directories further + # down. More specifically, for a cycle starting at yyyymmddhh, it is at + # + # $COMROOT/$NET/$envir/$RUN.$yyyymmdd/$hh + # + # Below, we set COMROOT in terms of PTMP as COMROOT="$PTMP/com". COMROOT + # is not used by the workflow in community mode. + # + # COMOUT_BASEDIR: + # In NCO mode, this is the base directory directly under which the output + # from the RUN_POST_TN task will be placed, i.e. it is the cycle-independent + # portion of the RUN_POST_TN task's output directory. It is given by + # + # $COMROOT/$NET/$model_ver + # + # COMOUT_BASEDIR is not used by the workflow in community mode. + # + # POST_OUTPUT_DOMAIN_NAME: + # The domain name used in naming the run_post output files. By default, + # this is set to PREDEF_GRID_NAME. + # + #----------------------------------------------------------------------- + # + global LOGDIR, FIXam, FIXclim, FIXLAM, CYCLE_BASEDIR, \ + COMROOT, COMOUT_BASEDIR, POST_OUTPUT_DOMAIN_NAME + + LOGDIR = os.path.join(EXPTDIR, "log") + + FIXam = os.path.join(EXPTDIR, "fix_am") + FIXclim = os.path.join(EXPTDIR, "fix_clim") + FIXLAM = os.path.join(EXPTDIR, "fix_lam") + + if RUN_ENVIR == "nco": + + CYCLE_BASEDIR = os.path.join(STMP, "tmpnwprd", RUN) + check_for_preexist_dir_file(CYCLE_BASEDIR,PREEXISTING_DIR_METHOD) + COMROOT = os.path.join(PTMP, "com") + COMOUT_BASEDIR = os.path.join(COMROOT, NET, model_ver) + check_for_preexist_dir_file(COMOUT_BASEDIR,PREEXISTING_DIR_METHOD) + + else: + + CYCLE_BASEDIR=EXPTDIR + COMROOT="" + COMOUT_BASEDIR="" + + if POST_OUTPUT_DOMAIN_NAME is None: + if PREDEF_GRID_NAME is None: + print_err_msg_exit(f''' + The domain name used in naming the run_post output files + (POST_OUTPUT_DOMAIN_NAME) has not been set: + POST_OUTPUT_DOMAIN_NAME = \"{POST_OUTPUT_DOMAIN_NAME}\" + If this experiment is not using a predefined grid (i.e. if + PREDEF_GRID_NAME is set to a null string), POST_OUTPUT_DOMAIN_NAME + must be set in the configuration file (\"{EXPT_CONFIG_FN}\"). ''') + + POST_OUTPUT_DOMAIN_NAME = PREDEF_GRID_NAME + + POST_OUTPUT_DOMAIN_NAME = lowercase(POST_OUTPUT_DOMAIN_NAME) + # + #----------------------------------------------------------------------- + # + # The FV3 forecast model needs the following input files in the run di- + # rectory to start a forecast: + # + # (1) The data table file + # (2) The diagnostics table file + # (3) The field table file + # (4) The FV3 namelist file + # (5) The model configuration file + # (6) The NEMS configuration file + # + # If using CCPP, it also needs: + # + # (7) The CCPP physics suite definition file + # + # The workflow contains templates for the first six of these files. + # Template files are versions of these files that contain placeholder + # (i.e. dummy) values for various parameters. The experiment/workflow + # generation scripts copy these templates to appropriate locations in + # the experiment directory (either the top of the experiment directory + # or one of the cycle subdirectories) and replace the placeholders in + # these copies by actual values specified in the experiment/workflow + # configuration file (or derived from such values). The scripts then + # use the resulting "actual" files as inputs to the forecast model.
+ # + # Note that the CCPP physics suite definition file does not have a cor- + # responding template file because it does not contain any values that + # need to be replaced according to the experiment/workflow configura- + # tion. If using CCPP, this file simply needs to be copied over from + # its location in the forecast model's directory structure to the ex- + # periment directory. + # + # Below, we first set the names of the templates for the first six files + # listed above. We then set the full paths to these template files. + # Note that some of these file names depend on the physics suite while + # others do not. + # + #----------------------------------------------------------------------- + # + global DATA_TABLE_TMPL_FN, DIAG_TABLE_TMPL_FN, FIELD_TABLE_TMPL_FN, \ + MODEL_CONFIG_TMPL_FN, NEMS_CONFIG_TMPL_FN + global DATA_TABLE_TMPL_FP, DIAG_TABLE_TMPL_FP, FIELD_TABLE_TMPL_FP, \ + MODEL_CONFIG_TMPL_FP, NEMS_CONFIG_TMPL_FP + global FV3_NML_BASE_SUITE_FP, FV3_NML_YAML_CONFIG_FP,FV3_NML_BASE_ENS_FP + + dot_ccpp_phys_suite_or_null=f".{CCPP_PHYS_SUITE}" + + # Names of input files that the forecast model (ufs-weather-model) expects + # to read in. These should only be changed if the input file names in the + # forecast model code are changed. + #---------------------------------- + DATA_TABLE_FN = "data_table" + DIAG_TABLE_FN = "diag_table" + FIELD_TABLE_FN = "field_table" + MODEL_CONFIG_FN = "model_configure" + NEMS_CONFIG_FN = "nems.configure" + #---------------------------------- + + if DATA_TABLE_TMPL_FN is None: + DATA_TABLE_TMPL_FN = DATA_TABLE_FN + if DIAG_TABLE_TMPL_FN is None: + DIAG_TABLE_TMPL_FN = f"{DIAG_TABLE_FN}{dot_ccpp_phys_suite_or_null}" + if FIELD_TABLE_TMPL_FN is None: + FIELD_TABLE_TMPL_FN = f"{FIELD_TABLE_FN}{dot_ccpp_phys_suite_or_null}" + if MODEL_CONFIG_TMPL_FN is None: + MODEL_CONFIG_TMPL_FN = MODEL_CONFIG_FN + if NEMS_CONFIG_TMPL_FN is None: + NEMS_CONFIG_TMPL_FN = NEMS_CONFIG_FN + + DATA_TABLE_TMPL_FP = os.path.join(TEMPLATE_DIR,DATA_TABLE_TMPL_FN) + DIAG_TABLE_TMPL_FP = os.path.join(TEMPLATE_DIR,DIAG_TABLE_TMPL_FN) + FIELD_TABLE_TMPL_FP = os.path.join(TEMPLATE_DIR,FIELD_TABLE_TMPL_FN) + FV3_NML_BASE_SUITE_FP = os.path.join(TEMPLATE_DIR,FV3_NML_BASE_SUITE_FN) + FV3_NML_YAML_CONFIG_FP = os.path.join(TEMPLATE_DIR,FV3_NML_YAML_CONFIG_FN) + FV3_NML_BASE_ENS_FP = os.path.join(EXPTDIR,FV3_NML_BASE_ENS_FN) + MODEL_CONFIG_TMPL_FP = os.path.join(TEMPLATE_DIR,MODEL_CONFIG_TMPL_FN) + NEMS_CONFIG_TMPL_FP = os.path.join(TEMPLATE_DIR,NEMS_CONFIG_TMPL_FN) + # + #----------------------------------------------------------------------- + # + # Set: + # + # 1) the variable CCPP_PHYS_SUITE_FN to the name of the CCPP physics + # suite definition file. + # 2) the variable CCPP_PHYS_SUITE_IN_CCPP_FP to the full path of this + # file in the forecast model's directory structure. + # 3) the variable CCPP_PHYS_SUITE_FP to the full path of this file in + # the experiment directory. + # + # Note that the experiment/workflow generation scripts will copy this + # file from CCPP_PHYS_SUITE_IN_CCPP_FP to CCPP_PHYS_SUITE_FP. Then, for + # each cycle, the forecast launch script will create a link in the cycle + # run directory to the copy of this file at CCPP_PHYS_SUITE_FP.
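+ # (Illustration: with, e.g., CCPP_PHYS_SUITE="FV3_GFS_v16", the settings + # below give CCPP_PHYS_SUITE_FN="suite_FV3_GFS_v16.xml", which is expected + # under FV3/ccpp/suites in the ufs-weather-model clone and is copied into + # EXPTDIR.)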
+ # + #----------------------------------------------------------------------- + # + global CCPP_PHYS_SUITE_FN, CCPP_PHYS_SUITE_IN_CCPP_FP, CCPP_PHYS_SUITE_FP + CCPP_PHYS_SUITE_FN=f"suite_{CCPP_PHYS_SUITE}.xml" + CCPP_PHYS_SUITE_IN_CCPP_FP=os.path.join(UFS_WTHR_MDL_DIR, "FV3","ccpp","suites",CCPP_PHYS_SUITE_FN) + CCPP_PHYS_SUITE_FP=os.path.join(EXPTDIR, CCPP_PHYS_SUITE_FN) + if not os.path.exists(CCPP_PHYS_SUITE_IN_CCPP_FP): + print_err_msg_exit(f''' + The CCPP suite definition file (CCPP_PHYS_SUITE_IN_CCPP_FP) does not exist + in the local clone of the ufs-weather-model: + CCPP_PHYS_SUITE_IN_CCPP_FP = \"{CCPP_PHYS_SUITE_IN_CCPP_FP}\"''') + # + #----------------------------------------------------------------------- + # + # Set: + # + # 1) the variable FIELD_DICT_FN to the name of the field dictionary + # file. + # 2) the variable FIELD_DICT_IN_UWM_FP to the full path of this + # file in the forecast model's directory structure. + # 3) the variable FIELD_DICT_FP to the full path of this file in + # the experiment directory. + # + #----------------------------------------------------------------------- + # + global FIELD_DICT_FN, FIELD_DICT_IN_UWM_FP, FIELD_DICT_FP + FIELD_DICT_FN = "fd_nems.yaml" + FIELD_DICT_IN_UWM_FP = os.path.join(UFS_WTHR_MDL_DIR, "tests", "parm", FIELD_DICT_FN) + FIELD_DICT_FP = os.path.join(EXPTDIR, FIELD_DICT_FN) + if not os.path.exists(FIELD_DICT_IN_UWM_FP): + print_err_msg_exit(f''' + The field dictionary file (FIELD_DICT_IN_UWM_FP) does not exist + in the local clone of the ufs-weather-model: + FIELD_DICT_IN_UWM_FP = \"{FIELD_DICT_IN_UWM_FP}\"''') + # + #----------------------------------------------------------------------- + # + # Call the function that sets the ozone parameterization being used and + # modifies associated parameters accordingly. + # + #----------------------------------------------------------------------- + # + + # export env vars before calling another module + export_vars() + + OZONE_PARAM = set_ozone_param( \ + ccpp_phys_suite_fp=CCPP_PHYS_SUITE_IN_CCPP_FP) + + IMPORTS = ["CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING", "FIXgsm_FILES_TO_COPY_TO_FIXam"] + import_vars(env_vars=IMPORTS) + # + #----------------------------------------------------------------------- + # + # Set the full paths to those forecast model input files that are cycle- + # independent, i.e. they don't include information about the cycle's + # starting day/time. These are: + # + # * The data table file [(1) in the list above)] + # * The field table file [(3) in the list above)] + # * The FV3 namelist file [(4) in the list above)] + # * The NEMS configuration file [(6) in the list above)] + # + # Since they are cycle-independent, the experiment/workflow generation + # scripts will place them in the main experiment directory (EXPTDIR). + # The script that runs each cycle will then create links to these files + # in the run directories of the individual cycles (which are subdirecto- + # ries under EXPTDIR). + # + # The remaining two input files to the forecast model, i.e. + # + # * The diagnostics table file [(2) in the list above)] + # * The model configuration file [(5) in the list above)] + # + # contain parameters that depend on the cycle start date. Thus, custom + # versions of these two files must be generated for each cycle and then + # placed directly in the run directories of the cycles (not EXPTDIR). + # For this reason, the full paths to their locations vary by cycle and + # cannot be set here (i.e. 
they can only be set in the loop over the + # cycles in the rocoto workflow XML file). + # + #----------------------------------------------------------------------- + # + global DATA_TABLE_FP, FIELD_TABLE_FP, FV3_NML_FN, FV3_NML_FP, NEMS_CONFIG_FP + DATA_TABLE_FP = os.path.join(EXPTDIR, DATA_TABLE_FN) + FIELD_TABLE_FP = os.path.join(EXPTDIR, FIELD_TABLE_FN) + FV3_NML_FN = os.path.splitext(FV3_NML_BASE_SUITE_FN)[0] + FV3_NML_FP = os.path.join(EXPTDIR, FV3_NML_FN) + NEMS_CONFIG_FP = os.path.join(EXPTDIR, NEMS_CONFIG_FN) + # + #----------------------------------------------------------------------- + # + # If USE_USER_STAGED_EXTRN_FILES is set to TRUE, make sure that the user- + # specified directories under which the external model files should be + # located actually exist. + # + #----------------------------------------------------------------------- + # + if USE_USER_STAGED_EXTRN_FILES: + # Check for the base directory up to the first templated field. + idx = EXTRN_MDL_SOURCE_BASEDIR_ICS.find("$") + if idx == -1: + idx=len(EXTRN_MDL_SOURCE_BASEDIR_ICS) + + if not os.path.exists(EXTRN_MDL_SOURCE_BASEDIR_ICS[:idx]): + print_err_msg_exit(f''' + The directory (EXTRN_MDL_SOURCE_BASEDIR_ICS) in which the user-staged + external model files for generating ICs should be located does not exist: + EXTRN_MDL_SOURCE_BASEDIR_ICS = \"{EXTRN_MDL_SOURCE_BASEDIR_ICS}\"''') + + idx = EXTRN_MDL_SOURCE_BASEDIR_LBCS.find("$") + if idx == -1: + idx=len(EXTRN_MDL_SOURCE_BASEDIR_LBCS) + + if not os.path.exists(EXTRN_MDL_SOURCE_BASEDIR_LBCS[:idx]): + print_err_msg_exit(f''' + The directory (EXTRN_MDL_SOURCE_BASEDIR_LBCS) in which the user-staged + external model files for generating LBCs should be located does not exist: + EXTRN_MDL_SOURCE_BASEDIR_LBCS = \"{EXTRN_MDL_SOURCE_BASEDIR_LBCS}\"''') + # + #----------------------------------------------------------------------- + # + # Make sure that DO_ENSEMBLE is set to a valid value. Then set the names + # of the ensemble members. These will be used to set the ensemble member + # directories. Also, set the full path to the FV3 namelist file corresponding + # to each ensemble member. + # + #----------------------------------------------------------------------- + # + global NDIGITS_ENSMEM_NAMES,ENSMEM_NAMES,FV3_NML_ENSMEM_FPS,NUM_ENS_MEMBERS + NDIGITS_ENSMEM_NAMES=0 + ENSMEM_NAMES=[] + FV3_NML_ENSMEM_FPS=[] + if DO_ENSEMBLE: + NDIGITS_ENSMEM_NAMES=len(str(NUM_ENS_MEMBERS)) + fmt=f"0{NDIGITS_ENSMEM_NAMES}d" + for i in range(NUM_ENS_MEMBERS): + ENSMEM_NAMES.append(f"mem{i+1:{fmt}}") + FV3_NML_ENSMEM_FPS.append(os.path.join(EXPTDIR, f"{FV3_NML_FN}_{ENSMEM_NAMES[i]}")) + # + #----------------------------------------------------------------------- + # + # Set the full path to the forecast model executable. + # + #----------------------------------------------------------------------- + # + global FV3_EXEC_FP + FV3_EXEC_FP = os.path.join(EXECDIR, FV3_EXEC_FN) + # + #----------------------------------------------------------------------- + # + # Set the full path to the script that can be used to (re)launch the + # workflow. Also, if USE_CRON_TO_RELAUNCH is set to TRUE, set the line + # to add to the cron table to automatically relaunch the workflow every + # CRON_RELAUNCH_INTVL_MNTS minutes. Otherwise, set the variable con- + # taining this line to a null string.
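+ # (Illustration, assuming a hypothetical CRON_RELAUNCH_INTVL_MNTS of 3: the + # resulting crontab entry would have the form + # */3 * * * * cd <EXPTDIR> && ./<WFLOW_LAUNCH_SCRIPT_FN> called_from_cron="TRUE" >> ./<WFLOW_LAUNCH_LOG_FN> 2>&1 )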
+ # + #----------------------------------------------------------------------- + # + global WFLOW_LAUNCH_SCRIPT_FP, WFLOW_LAUNCH_LOG_FP, CRONTAB_LINE + WFLOW_LAUNCH_SCRIPT_FP = os.path.join(USHDIR, WFLOW_LAUNCH_SCRIPT_FN) + WFLOW_LAUNCH_LOG_FP = os.path.join(EXPTDIR, WFLOW_LAUNCH_LOG_FN) + if USE_CRON_TO_RELAUNCH: + CRONTAB_LINE=f'''*/{CRON_RELAUNCH_INTVL_MNTS} * * * * cd {EXPTDIR} && ./{WFLOW_LAUNCH_SCRIPT_FN} called_from_cron="TRUE" >> ./{WFLOW_LAUNCH_LOG_FN} 2>&1''' + else: + CRONTAB_LINE="" + # + #----------------------------------------------------------------------- + # + # Set the full path to the script that, for a given task, loads the + # necessary module files and runs the tasks. + # + #----------------------------------------------------------------------- + # + global LOAD_MODULES_RUN_TASK_FP + LOAD_MODULES_RUN_TASK_FP = os.path.join(USHDIR, "load_modules_run_task.sh") + # + #----------------------------------------------------------------------- + # + # Define the various work subdirectories under the main work directory. + # Each of these corresponds to a different step/substep/task in the pre- + # processing, as follows: + # + # GRID_DIR: + # Directory in which the grid files will be placed (if RUN_TASK_MAKE_GRID + # is set to True) or searched for (if RUN_TASK_MAKE_GRID is set to + # False). + # + # OROG_DIR: + # Directory in which the orography files will be placed (if RUN_TASK_MAKE_OROG + # is set to True) or searched for (if RUN_TASK_MAKE_OROG is set to + # False). + # + # SFC_CLIMO_DIR: + # Directory in which the surface climatology files will be placed (if + # RUN_TASK_MAKE_SFC_CLIMO is set to True) or searched for (if + # RUN_TASK_MAKE_SFC_CLIMO is set to False). + # + #---------------------------------------------------------------------- + # + global RUN_TASK_MAKE_GRID, RUN_TASK_MAKE_OROG, RUN_TASK_MAKE_SFC_CLIMO + global GRID_DIR, OROG_DIR, SFC_CLIMO_DIR + global RUN_TASK_VX_GRIDSTAT, RUN_TASK_VX_POINTSTAT, RUN_TASK_VX_ENSGRID + + # + #----------------------------------------------------------------------- + # + # Make sure that DO_ENSEMBLE is set to TRUE when running ensemble vx. + # + #----------------------------------------------------------------------- + # + if (not DO_ENSEMBLE) and (RUN_TASK_VX_ENSGRID or RUN_TASK_VX_ENSPOINT): + print_err_msg_exit(f''' + Ensemble verification can not be run unless running in ensemble mode: + DO_ENSEMBLE = \"{DO_ENSEMBLE}\" + RUN_TASK_VX_ENSGRID = \"{RUN_TASK_VX_ENSGRID}\" + RUN_TASK_VX_ENSPOINT = \"{RUN_TASK_VX_ENSPOINT}\"''') + + if RUN_ENVIR == "nco": + + nco_fix_dir = os.path.join(DOMAIN_PREGEN_BASEDIR, PREDEF_GRID_NAME) + if not os.path.exists(nco_fix_dir): + print_err_msg_exit(f''' + The directory (nco_fix_dir) that should contain the pregenerated grid, + orography, and surface climatology files does not exist: + nco_fix_dir = \"{nco_fix_dir}\"''') + + if RUN_TASK_MAKE_GRID or \ + ( not RUN_TASK_MAKE_GRID and \ + GRID_DIR != nco_fix_dir ): + + msg=f''' + When RUN_ENVIR is set to \"nco\", the workflow assumes that pregenerated + grid files already exist in the directory + + {DOMAIN_PREGEN_BASEDIR}/{PREDEF_GRID_NAME} + + where + + DOMAIN_PREGEN_BASEDIR = \"{DOMAIN_PREGEN_BASEDIR}\" + PREDEF_GRID_NAME = \"{PREDEF_GRID_NAME}\" + + Thus, the MAKE_GRID_TN task must not be run (i.e. RUN_TASK_MAKE_GRID must + be set to \"FALSE\"), and the directory in which to look for the grid + files (i.e. GRID_DIR) must be set to the one above. 
Current values for + these quantities are: + + RUN_TASK_MAKE_GRID = \"{RUN_TASK_MAKE_GRID}\" + GRID_DIR = \"{GRID_DIR}\" + + Resetting RUN_TASK_MAKE_GRID to \"FALSE\" and GRID_DIR to the one above. + Reset values are: + ''' + + RUN_TASK_MAKE_GRID=False + GRID_DIR=nco_fix_dir + + msg+=f''' + RUN_TASK_MAKE_GRID = \"{RUN_TASK_MAKE_GRID}\" + GRID_DIR = \"{GRID_DIR}\" + ''' + + print_info_msg(msg) + + + if RUN_TASK_MAKE_OROG or \ + ( not RUN_TASK_MAKE_OROG and \ + OROG_DIR != nco_fix_dir ): + + msg=f''' + When RUN_ENVIR is set to \"nco\", the workflow assumes that pregenerated + orography files already exist in the directory + {DOMAIN_PREGEN_BASEDIR}/{PREDEF_GRID_NAME} + + where + + DOMAIN_PREGEN_BASEDIR = \"{DOMAIN_PREGEN_BASEDIR}\" + PREDEF_GRID_NAME = \"{PREDEF_GRID_NAME}\" + + Thus, the MAKE_OROG_TN task must not be run (i.e. RUN_TASK_MAKE_OROG must + be set to \"FALSE\"), and the directory in which to look for the orography + files (i.e. OROG_DIR) must be set to the one above. Current values for + these quantities are: + + RUN_TASK_MAKE_OROG = \"{RUN_TASK_MAKE_OROG}\" + OROG_DIR = \"{OROG_DIR}\" + + Resetting RUN_TASK_MAKE_OROG to \"FALSE\" and OROG_DIR to the one above. + Reset values are: + ''' + + RUN_TASK_MAKE_OROG=False + OROG_DIR=nco_fix_dir + + msg+=f''' + RUN_TASK_MAKE_OROG = \"{RUN_TASK_MAKE_OROG}\" + OROG_DIR = \"{OROG_DIR}\" + ''' + + print_info_msg(msg) + + + if RUN_TASK_MAKE_SFC_CLIMO or \ + ( not RUN_TASK_MAKE_SFC_CLIMO and \ + SFC_CLIMO_DIR != nco_fix_dir ): + + msg=f''' + When RUN_ENVIR is set to \"nco\", the workflow assumes that pregenerated + surface climatology files already exist in the directory + + {DOMAIN_PREGEN_BASEDIR}/{PREDEF_GRID_NAME} + + where + + DOMAIN_PREGEN_BASEDIR = \"{DOMAIN_PREGEN_BASEDIR}\" + PREDEF_GRID_NAME = \"{PREDEF_GRID_NAME}\" + + Thus, the MAKE_SFC_CLIMO_TN task must not be run (i.e. RUN_TASK_MAKE_SFC_CLIMO + must be set to \"FALSE\"), and the directory in which to look for the + surface climatology files (i.e. SFC_CLIMO_DIR) must be set to the one + above. Current values for these quantities are: + + RUN_TASK_MAKE_SFC_CLIMO = \"{RUN_TASK_MAKE_SFC_CLIMO}\" + SFC_CLIMO_DIR = \"{SFC_CLIMO_DIR}\" + + Resetting RUN_TASK_MAKE_SFC_CLIMO to \"FALSE\" and SFC_CLIMO_DIR to the + one above. Reset values are: + ''' + + RUN_TASK_MAKE_SFC_CLIMO=False + SFC_CLIMO_DIR=nco_fix_dir + + msg+=f''' + RUN_TASK_MAKE_SFC_CLIMO = \"{RUN_TASK_MAKE_SFC_CLIMO}\" + SFC_CLIMO_DIR = \"{SFC_CLIMO_DIR}\" + ''' + + print_info_msg(msg) + + if RUN_TASK_VX_GRIDSTAT: + + msg=f''' + When RUN_ENVIR is set to \"nco\", it is assumed that the verification + will not be run. + RUN_TASK_VX_GRIDSTAT = \"{RUN_TASK_VX_GRIDSTAT}\" + Resetting RUN_TASK_VX_GRIDSTAT to \"FALSE\" + Reset value is:''' + + RUN_TASK_VX_GRIDSTAT=False + + msg+=f''' + RUN_TASK_VX_GRIDSTAT = \"{RUN_TASK_VX_GRIDSTAT}\" + ''' + + print_info_msg(msg) + + if RUN_TASK_VX_POINTSTAT: + + msg=f''' + When RUN_ENVIR is set to \"nco\", it is assumed that the verification + will not be run. + RUN_TASK_VX_POINTSTAT = \"{RUN_TASK_VX_POINTSTAT}\" + Resetting RUN_TASK_VX_POINTSTAT to \"FALSE\" + Reset value is:''' + + RUN_TASK_VX_POINTSTAT=False + + msg+=f''' + RUN_TASK_VX_POINTSTAT = \"{RUN_TASK_VX_POINTSTAT}\" + ''' + + print_info_msg(msg) + + if RUN_TASK_VX_ENSGRID: + + msg=f''' + When RUN_ENVIR is set to \"nco\", it is assumed that the verification + will not be run.
+ RUN_TASK_VX_ENSGRID = \"{RUN_TASK_VX_ENSGRID}\" + Resetting RUN_TASK_VX_ENSGRID to \"FALSE\" + Reset value is:''' + + RUN_TASK_VX_ENSGRID=False + + msg+=f''' + RUN_TASK_VX_ENSGRID = \"{RUN_TASK_VX_ENSGRID}\" + ''' + + print_info_msg(msg) + + # + #----------------------------------------------------------------------- + # + # Now consider community mode. + # + #----------------------------------------------------------------------- + # + else: + # + # If RUN_TASK_MAKE_GRID is set to False, the workflow will look for + # the pregenerated grid files in GRID_DIR. In this case, make sure that + # GRID_DIR exists. Otherwise, set it to a predefined location under the + # experiment directory (EXPTDIR). + # + if not RUN_TASK_MAKE_GRID: + if not os.path.exists(GRID_DIR): + print_err_msg_exit(f''' + The directory (GRID_DIR) that should contain the pregenerated grid files + does not exist: + GRID_DIR = \"{GRID_DIR}\"''') + else: + GRID_DIR=os.path.join(EXPTDIR,"grid") + # + # If RUN_TASK_MAKE_OROG is set to False, the workflow will look for + # the pregenerated orography files in OROG_DIR. In this case, make sure + # that OROG_DIR exists. Otherwise, set it to a predefined location under + # the experiment directory (EXPTDIR). + # + if not RUN_TASK_MAKE_OROG: + if not os.path.exists(OROG_DIR): + print_err_msg_exit(f''' + The directory (OROG_DIR) that should contain the pregenerated orography + files does not exist: + OROG_DIR = \"{OROG_DIR}\"''') + else: + OROG_DIR=os.path.join(EXPTDIR,"orog") + # + # If RUN_TASK_MAKE_SFC_CLIMO is set to False, the workflow will look + # for the pregenerated surface climatology files in SFC_CLIMO_DIR. In + # this case, make sure that SFC_CLIMO_DIR exists. Otherwise, set it to + # a predefined location under the experiment directory (EXPTDIR). + # + if not RUN_TASK_MAKE_SFC_CLIMO: + if not os.path.exists(SFC_CLIMO_DIR): + print_err_msg_exit(f''' + The directory (SFC_CLIMO_DIR) that should contain the pregenerated surface + climatology files does not exist: + SFC_CLIMO_DIR = \"{SFC_CLIMO_DIR}\"''') + else: + SFC_CLIMO_DIR=os.path.join(EXPTDIR,"sfc_climo") + + #----------------------------------------------------------------------- + # + # Set cycle-independent parameters associated with the external models + # from which we will obtain the ICs and LBCs. + # + #----------------------------------------------------------------------- + # + + # export env vars before calling another module + export_vars() + + set_extrn_mdl_params() + + IMPORTS = ["EXTRN_MDL_SYSBASEDIR_ICS", "EXTRN_MDL_SYSBASEDIR_LBCS", "EXTRN_MDL_LBCS_OFFSET_HRS"] + import_vars(env_vars=IMPORTS) + # + #----------------------------------------------------------------------- + # + # Any regional model must be supplied lateral boundary conditions (in + # addition to initial conditions) to be able to perform a forecast. In + # the FV3-LAM model, these boundary conditions (BCs) are supplied using a + # "halo" of grid cells around the regional domain that extend beyond the + # boundary of the domain. The model is formulated such that along with + # files containing these BCs, it needs as input the following files (in + # NetCDF format): + # + # 1) A grid file that includes a halo of 3 cells beyond the boundary of + # the domain. + # 2) A grid file that includes a halo of 4 cells beyond the boundary of + # the domain. + # 3) A (filtered) orography file without a halo, i.e. a halo of width + # 0 cells. + # 4) A (filtered) orography file that includes a halo of 4 cells beyond + # the boundary of the domain. 
+ # + # Note that the regional grid is referred to as "tile 7" in the code. + # We will let: + # + # * NH0 denote the width (in units of number of cells on tile 7) of + # the 0-cell-wide halo, i.e. NH0 = 0; + # + # * NH3 denote the width (in units of number of cells on tile 7) of + # the 3-cell-wide halo, i.e. NH3 = 3; and + # + # * NH4 denote the width (in units of number of cells on tile 7) of + # the 4-cell-wide halo, i.e. NH4 = 4. + # + # We define these variables next. + # + #----------------------------------------------------------------------- + # + global NH0,NH3,NH4 + NH0=0 + NH3=3 + NH4=4 + + # export env vars + EXPORTS = ["NH0","NH3","NH4"] + export_vars(env_vars = EXPORTS) + # + #----------------------------------------------------------------------- + # + # Set parameters according to the type of horizontal grid generation + # method specified. First consider GFDL's global-parent-grid based + # method. + # + #----------------------------------------------------------------------- + # + global LON_CTR,LAT_CTR,NX,NY,NHW,STRETCH_FAC,\ + ISTART_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG,\ + IEND_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG,\ + JSTART_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG,\ + JEND_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG + global PAZI,DEL_ANGLE_X_SG,DEL_ANGLE_Y_SG,\ + NEG_NX_OF_DOM_WITH_WIDE_HALO,\ + NEG_NY_OF_DOM_WITH_WIDE_HALO + + if GRID_GEN_METHOD == "GFDLgrid": + + (\ + LON_CTR,LAT_CTR,NX,NY,NHW,STRETCH_FAC, + ISTART_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG, + IEND_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG, + JSTART_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG, + JEND_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG \ + ) = \ + set_gridparams_GFDLgrid( \ + lon_of_t6_ctr=GFDLgrid_LON_T6_CTR, \ + lat_of_t6_ctr=GFDLgrid_LAT_T6_CTR, \ + res_of_t6g=GFDLgrid_RES, \ + stretch_factor=GFDLgrid_STRETCH_FAC, \ + refine_ratio_t6g_to_t7g=GFDLgrid_REFINE_RATIO, \ + istart_of_t7_on_t6g=GFDLgrid_ISTART_OF_RGNL_DOM_ON_T6G, \ + iend_of_t7_on_t6g=GFDLgrid_IEND_OF_RGNL_DOM_ON_T6G, \ + jstart_of_t7_on_t6g=GFDLgrid_JSTART_OF_RGNL_DOM_ON_T6G, \ + jend_of_t7_on_t6g=GFDLgrid_JEND_OF_RGNL_DOM_ON_T6G) + # + #----------------------------------------------------------------------- + # + # Now consider Jim Purser's map projection/grid generation method. + # + #----------------------------------------------------------------------- + # + elif GRID_GEN_METHOD == "ESGgrid": + + (\ + LON_CTR,LAT_CTR,NX,NY,PAZI, + NHW,STRETCH_FAC,DEL_ANGLE_X_SG,DEL_ANGLE_Y_SG, + NEG_NX_OF_DOM_WITH_WIDE_HALO, + NEG_NY_OF_DOM_WITH_WIDE_HALO \ + ) = \ + set_gridparams_ESGgrid( \ + lon_ctr=ESGgrid_LON_CTR, \ + lat_ctr=ESGgrid_LAT_CTR, \ + nx=ESGgrid_NX, \ + ny=ESGgrid_NY, \ + pazi=ESGgrid_PAZI, \ + halo_width=ESGgrid_WIDE_HALO_WIDTH, \ + delx=ESGgrid_DELX, \ + dely=ESGgrid_DELY) + + # + #----------------------------------------------------------------------- + # + # Create a new experiment directory. Note that at this point we are + # guaranteed that there is no preexisting experiment directory. For + # platforms with no workflow manager, we need to create LOGDIR as well, + # since it won't be created later at runtime. + # + #----------------------------------------------------------------------- + # + mkdir_vrfy(f' -p "{EXPTDIR}"') + mkdir_vrfy(f' -p "{LOGDIR}"') + # + #----------------------------------------------------------------------- + # + # If not running the MAKE_GRID_TN, MAKE_OROG_TN, and/or MAKE_SFC_CLIMO + # tasks, create symlinks under the FIXLAM directory to pregenerated grid, + # orography, and surface climatology files. 
In the process, also set + # RES_IN_FIXLAM_FILENAMES, which is the resolution of the grid (in units + # of number of grid points on an equivalent global uniform cubed-sphere + # grid) used in the names of the fixed files in the FIXLAM directory. + # + #----------------------------------------------------------------------- + # + mkdir_vrfy(f' -p "{FIXLAM}"') + RES_IN_FIXLAM_FILENAMES="" + # + #----------------------------------------------------------------------- + # + # If the grid file generation task in the workflow is going to be skipped + # (because pregenerated files are available), create links in the FIXLAM + # directory to the pregenerated grid files. + # + #----------------------------------------------------------------------- + # + + # export env vars + export_vars() + + # link fix files + res_in_grid_fns="" + if not RUN_TASK_MAKE_GRID: + + res_in_grid_fns = link_fix( \ + verbose=VERBOSE, \ + file_group="grid") + + RES_IN_FIXLAM_FILENAMES=res_in_grid_fns + # + #----------------------------------------------------------------------- + # + # If the orography file generation task in the workflow is going to be + # skipped (because pregenerated files are available), create links in + # the FIXLAM directory to the pregenerated orography files. + # + #----------------------------------------------------------------------- + # + res_in_orog_fns="" + if not RUN_TASK_MAKE_OROG: + + res_in_orog_fns = link_fix( \ + verbose=VERBOSE, \ + file_group="orog") + + if RES_IN_FIXLAM_FILENAMES and \ + ( res_in_orog_fns != RES_IN_FIXLAM_FILENAMES): + print_err_msg_exit(f''' + The resolution extracted from the orography file names (res_in_orog_fns) + does not match the resolution in other groups of files already consi- + dered (RES_IN_FIXLAM_FILENAMES): + res_in_orog_fns = {res_in_orog_fns} + RES_IN_FIXLAM_FILENAMES = {RES_IN_FIXLAM_FILENAMES}''') + else: + RES_IN_FIXLAM_FILENAMES=res_in_orog_fns + # + #----------------------------------------------------------------------- + # + # If the surface climatology file generation task in the workflow is + # going to be skipped (because pregenerated files are available), create + # links in the FIXLAM directory to the pregenerated surface climatology + # files. + # + #----------------------------------------------------------------------- + # + res_in_sfc_climo_fns="" + if not RUN_TASK_MAKE_SFC_CLIMO: + + res_in_sfc_climo_fns = link_fix( \ + verbose=VERBOSE, \ + file_group="sfc_climo") + + if RES_IN_FIXLAM_FILENAMES and \ + res_in_sfc_climo_fns != RES_IN_FIXLAM_FILENAMES: + print_err_msg_exit(f''' + The resolution extracted from the surface climatology file names (res_- + in_sfc_climo_fns) does not match the resolution in other groups of files + already considered (RES_IN_FIXLAM_FILENAMES): + res_in_sfc_climo_fns = {res_in_sfc_climo_fns} + RES_IN_FIXLAM_FILENAMES = {RES_IN_FIXLAM_FILENAMES}''') + else: + RES_IN_FIXLAM_FILENAMES=res_in_sfc_climo_fns + # + #----------------------------------------------------------------------- + # + # The variable CRES is needed in constructing various file names. If + # not running the make_grid task, we can set it here. Otherwise, it + # will get set to a valid value by that task. + # + #----------------------------------------------------------------------- + # + global CRES + CRES="" + if not RUN_TASK_MAKE_GRID: + CRES=f"C{RES_IN_FIXLAM_FILENAMES}" + # + #----------------------------------------------------------------------- + # + # Make sure that WRITE_DOPOST is set to a valid value.
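+ # (When inline post is enabled via WRITE_DOPOST, the separate RUN_POST_TN + # task would be redundant, so RUN_TASK_RUN_POST is turned off below; note + # that sub-hourly output is not supported together with inline post.)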
+ # + #----------------------------------------------------------------------- + # + global RUN_TASK_RUN_POST + if WRITE_DOPOST: + # Turn off run_post + RUN_TASK_RUN_POST=False + + # Check if SUB_HOURLY_POST is on + if SUB_HOURLY_POST: + print_err_msg_exit(f''' + SUB_HOURLY_POST is NOT available with Inline Post yet.''') + # + #----------------------------------------------------------------------- + # + # Calculate PE_MEMBER01. This is the number of MPI tasks used for the + # forecast, including those for the write component if QUILTING is set + # to True. + # + #----------------------------------------------------------------------- + # + global PE_MEMBER01 + PE_MEMBER01=LAYOUT_X*LAYOUT_Y + if QUILTING: + PE_MEMBER01 = PE_MEMBER01 + WRTCMP_write_groups*WRTCMP_write_tasks_per_group + + print_info_msg(f''' + The number of MPI tasks for the forecast (including those for the write + component if it is being used) is: + PE_MEMBER01 = {PE_MEMBER01}''', verbose=VERBOSE) + # + #----------------------------------------------------------------------- + # + # Calculate the number of nodes (NNODES_RUN_FCST) to request from the job + # scheduler for the forecast task (RUN_FCST_TN). This is just PE_MEMBER01 + # divided by the number of processes per node we want to request for this + # task (PPN_RUN_FCST), then rounded up to the nearest integer, i.e. + # + # NNODES_RUN_FCST = ceil(PE_MEMBER01/PPN_RUN_FCST) + # + # where ceil(...) is the ceiling function, i.e. it rounds its floating + # point argument up to the next larger integer. Since Python's integer + # (floor) division truncates the quotient, we perform the rounding-up + # operation by adding the denominator (of the argument of ceil(...) above) + # minus 1 to the original numerator, i.e. by redefining NNODES_RUN_FCST to be + # + # NNODES_RUN_FCST = (PE_MEMBER01 + PPN_RUN_FCST - 1)//PPN_RUN_FCST + # + #----------------------------------------------------------------------- + # + global NNODES_RUN_FCST + NNODES_RUN_FCST= (PE_MEMBER01 + PPN_RUN_FCST - 1)//PPN_RUN_FCST + + # + #----------------------------------------------------------------------- + # + # Call the function that checks whether the RUC land surface model (LSM) + # is being called by the physics suite and sets the workflow variable + # SDF_USES_RUC_LSM to True or False accordingly. + # + #----------------------------------------------------------------------- + # + global SDF_USES_RUC_LSM + SDF_USES_RUC_LSM = check_ruc_lsm( \ + ccpp_phys_suite_fp=CCPP_PHYS_SUITE_IN_CCPP_FP) + # + #----------------------------------------------------------------------- + # + # Set the name of the file containing aerosol climatology data that, if + # necessary, can be used to generate approximate versions of the aerosol + # fields needed by Thompson microphysics. This file will be used to + # generate such approximate aerosol fields in the ICs and LBCs if Thompson + # MP is included in the physics suite and if the external model for ICs + # or LBCs does not already provide these fields. Also, set the full path + # to this file.
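To make the task-count arithmetic above concrete, here is a small worked example with made-up layout and write-component values (the variable names mirror the ones used above):

```
# Worked example of the PE_MEMBER01 / NNODES_RUN_FCST arithmetic.
# All values below are made up for illustration.
LAYOUT_X, LAYOUT_Y = 16, 10
WRTCMP_write_groups, WRTCMP_write_tasks_per_group = 1, 20
PPN_RUN_FCST = 24
QUILTING = True

PE_MEMBER01 = LAYOUT_X * LAYOUT_Y
if QUILTING:
    PE_MEMBER01 += WRTCMP_write_groups * WRTCMP_write_tasks_per_group  # 160 + 20 = 180

# Ceiling division without floats: (a + b - 1) // b == ceil(a / b) for positive ints.
NNODES_RUN_FCST = (PE_MEMBER01 + PPN_RUN_FCST - 1) // PPN_RUN_FCST     # ceil(180/24) = 8

print(PE_MEMBER01, NNODES_RUN_FCST)  # 180 8
```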
+ # + #----------------------------------------------------------------------- + # + THOMPSON_MP_CLIMO_FN="Thompson_MP_MONTHLY_CLIMO.nc" + THOMPSON_MP_CLIMO_FP=os.path.join(FIXam,THOMPSON_MP_CLIMO_FN) + # + #----------------------------------------------------------------------- + # + # Call the function that, if the Thompson microphysics parameterization + # is being called by the physics suite, modifies certain workflow arrays + # to ensure that fixed files needed by this parameterization are copied + # to the FIXam directory and appropriate symlinks to them are created in + # the run directories. This function also sets the workflow variable + # SDF_USES_THOMPSON_MP that indicates whether Thompson MP is called by + # the physics suite. + # + #----------------------------------------------------------------------- + # + SDF_USES_THOMPSON_MP = set_thompson_mp_fix_files( \ + ccpp_phys_suite_fp=CCPP_PHYS_SUITE_IN_CCPP_FP, \ + thompson_mp_climo_fn=THOMPSON_MP_CLIMO_FN) + + IMPORTS = [ "CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING", "FIXgsm_FILES_TO_COPY_TO_FIXam" ] + import_vars(env_vars=IMPORTS) + # + #----------------------------------------------------------------------- + # + # Generate the shell script that will appear in the experiment directory + # (EXPTDIR) and will contain definitions of variables needed by the va- + # rious scripts in the workflow. We refer to this as the experiment/ + # workflow global variable definitions file. We will create this file + # by: + # + # 1) Copying the default workflow/experiment configuration file (speci- + # fied by EXPT_DEFAULT_CONFIG_FN and located in the shell script di- + # rectory specified by USHDIR) to the experiment directory and rena- + # ming it to the name specified by GLOBAL_VAR_DEFNS_FN. + # + # 2) Resetting the default variable values in this file to their current + # values. This is necessary because these variables may have been + # reset by the user-specified configuration file (if one exists in + # USHDIR) and/or by this setup script, e.g. because predef_domain is + # set to a valid non-empty value. + # + # 3) Appending to the variable definitions file any new variables intro- + # duced in this setup script that may be needed by the scripts that + # perform the various tasks in the workflow (and which source the va- + # riable defintions file). + # + # First, set the full path to the variable definitions file and copy the + # default configuration script into it. + # + #----------------------------------------------------------------------- + # + + # update dictionary with globals() values + update_dict = {k: globals()[k] for k in cfg_d.keys() if k in globals() } + cfg_d.update(update_dict) + + # write the updated default dictionary + global GLOBAL_VAR_DEFNS_FP + GLOBAL_VAR_DEFNS_FP=os.path.join(EXPTDIR,GLOBAL_VAR_DEFNS_FN) + all_lines=cfg_to_shell_str(cfg_d) + with open(GLOBAL_VAR_DEFNS_FP,'w') as f: + msg = f""" # + #----------------------------------------------------------------------- + #----------------------------------------------------------------------- + # Section 1: + # This section contains (most of) the primary experiment variables, i.e. + # those variables that are defined in the default configuration file + # (config_defaults.sh) and that can be reset via the user-specified + # experiment configuration file (config.sh). 
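The "sync the config dictionary from globals, then serialize it" step above can be illustrated in isolation. The sketch below uses a toy serializer in place of cfg_to_shell_str, and the variable names and values are made up:

```
# Self-contained sketch of the dictionary-update-and-serialize step.
cfg_d = {"PREDEF_GRID_NAME": None, "CCPP_PHYS_SUITE": None, "DEBUG": None}

PREDEF_GRID_NAME = "RRFS_CONUS_3km"   # pretend these were set earlier in setup()
CCPP_PHYS_SUITE = "FV3_GFS_v16"
DEBUG = False

# Pull the current values of any matching globals back into the config dict.
cfg_d.update({k: globals()[k] for k in cfg_d.keys() if k in globals()})

def toy_cfg_to_shell_str(cfg):
    """Render key='value' lines the way the generated variable-definitions file does."""
    return "\n".join(f"{k}='{v}'" for k, v in cfg.items()) + "\n"

print(toy_cfg_to_shell_str(cfg_d))
```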
+ #----------------------------------------------------------------------- + #----------------------------------------------------------------------- + # + """ + f.write(dedent(msg)) + f.write(all_lines) + + # print info message + msg=dedent(f''' + Before updating default values of experiment variables to user-specified + values, the variable \"line_list\" contains: + + ''') + + msg +=dedent(f''' + {all_lines}''') + + print_info_msg(msg,verbose=DEBUG) + # + # print info message + # + print_info_msg(f''' + Generating the global experiment variable definitions file specified by + GLOBAL_VAR_DEFNS_FN: + GLOBAL_VAR_DEFNS_FN = \"{GLOBAL_VAR_DEFNS_FN}\" + Full path to this file is: + GLOBAL_VAR_DEFNS_FP = \"{GLOBAL_VAR_DEFNS_FP}\" + For more detailed information, set DEBUG to \"TRUE\" in the experiment + configuration file (\"{EXPT_CONFIG_FN}\").''') + + # + #----------------------------------------------------------------------- + # + # Append additional variable definitions (and comments) to the variable + # definitions file. These variables have been set above using the vari- + # ables in the default and local configuration scripts. These variables + # are needed by various tasks/scripts in the workflow. + # + #----------------------------------------------------------------------- + # + msg = f""" + # + #----------------------------------------------------------------------- + #----------------------------------------------------------------------- + # Section 2: + # This section defines variables that have been derived from the primary + # set of experiment variables above (we refer to these as \"derived\" or + # \"secondary\" variables). + #----------------------------------------------------------------------- + #----------------------------------------------------------------------- + # + + # + #----------------------------------------------------------------------- + # + # Full path to workflow (re)launch script, its log file, and the line + # that gets added to the cron table to launch this script if the flag + # USE_CRON_TO_RELAUNCH is set to \"TRUE\". + # + #----------------------------------------------------------------------- + # + WFLOW_LAUNCH_SCRIPT_FP='{WFLOW_LAUNCH_SCRIPT_FP}' + WFLOW_LAUNCH_LOG_FP='{WFLOW_LAUNCH_LOG_FP}' + CRONTAB_LINE='{CRONTAB_LINE}' + # + #----------------------------------------------------------------------- + # + # Directories. 
+ # + #----------------------------------------------------------------------- + # + SR_WX_APP_TOP_DIR='{SR_WX_APP_TOP_DIR}' + HOMErrfs='{HOMErrfs}' + USHDIR='{USHDIR}' + SCRIPTSDIR='{SCRIPTSDIR}' + JOBSDIR='{JOBSDIR}' + SORCDIR='{SORCDIR}' + SRC_DIR='{SRC_DIR}' + PARMDIR='{PARMDIR}' + MODULES_DIR='{MODULES_DIR}' + EXECDIR='{EXECDIR}' + FIXam='{FIXam}' + FIXclim='{FIXclim}' + FIXLAM='{FIXLAM}' + FIXgsm='{FIXgsm}' + FIXaer='{FIXaer}' + FIXlut='{FIXlut}' + COMROOT='{COMROOT}' + COMOUT_BASEDIR='{COMOUT_BASEDIR}' + TEMPLATE_DIR='{TEMPLATE_DIR}' + VX_CONFIG_DIR='{VX_CONFIG_DIR}' + METPLUS_CONF='{METPLUS_CONF}' + MET_CONFIG='{MET_CONFIG}' + UFS_WTHR_MDL_DIR='{UFS_WTHR_MDL_DIR}' + UFS_UTILS_DIR='{UFS_UTILS_DIR}' + SFC_CLIMO_INPUT_DIR='{SFC_CLIMO_INPUT_DIR}' + TOPO_DIR='{TOPO_DIR}' + UPP_DIR='{UPP_DIR}' + + EXPTDIR='{EXPTDIR}' + LOGDIR='{LOGDIR}' + CYCLE_BASEDIR='{CYCLE_BASEDIR}' + GRID_DIR='{GRID_DIR}' + OROG_DIR='{OROG_DIR}' + SFC_CLIMO_DIR='{SFC_CLIMO_DIR}' + + NDIGITS_ENSMEM_NAMES='{NDIGITS_ENSMEM_NAMES}' + ENSMEM_NAMES={list_to_str(ENSMEM_NAMES)} + FV3_NML_ENSMEM_FPS={list_to_str(FV3_NML_ENSMEM_FPS)} + # + #----------------------------------------------------------------------- + # + # Files. + # + #----------------------------------------------------------------------- + # + GLOBAL_VAR_DEFNS_FP='{GLOBAL_VAR_DEFNS_FP}' + + DATA_TABLE_FN='{DATA_TABLE_FN}' + DIAG_TABLE_FN='{DIAG_TABLE_FN}' + FIELD_TABLE_FN='{FIELD_TABLE_FN}' + MODEL_CONFIG_FN='{MODEL_CONFIG_FN}' + NEMS_CONFIG_FN='{NEMS_CONFIG_FN}' + + DATA_TABLE_TMPL_FN='{DATA_TABLE_TMPL_FN}' + DIAG_TABLE_TMPL_FN='{DIAG_TABLE_TMPL_FN}' + FIELD_TABLE_TMPL_FN='{FIELD_TABLE_TMPL_FN}' + MODEL_CONFIG_TMPL_FN='{MODEL_CONFIG_TMPL_FN}' + NEMS_CONFIG_TMPL_FN='{NEMS_CONFIG_TMPL_FN}' + + DATA_TABLE_TMPL_FP='{DATA_TABLE_TMPL_FP}' + DIAG_TABLE_TMPL_FP='{DIAG_TABLE_TMPL_FP}' + FIELD_TABLE_TMPL_FP='{FIELD_TABLE_TMPL_FP}' + FV3_NML_BASE_SUITE_FP='{FV3_NML_BASE_SUITE_FP}' + FV3_NML_YAML_CONFIG_FP='{FV3_NML_YAML_CONFIG_FP}' + FV3_NML_BASE_ENS_FP='{FV3_NML_BASE_ENS_FP}' + MODEL_CONFIG_TMPL_FP='{MODEL_CONFIG_TMPL_FP}' + NEMS_CONFIG_TMPL_FP='{NEMS_CONFIG_TMPL_FP}' + + CCPP_PHYS_SUITE_FN='{CCPP_PHYS_SUITE_FN}' + CCPP_PHYS_SUITE_IN_CCPP_FP='{CCPP_PHYS_SUITE_IN_CCPP_FP}' + CCPP_PHYS_SUITE_FP='{CCPP_PHYS_SUITE_FP}' + + FIELD_DICT_FN='{FIELD_DICT_FN}' + FIELD_DICT_IN_UWM_FP='{FIELD_DICT_IN_UWM_FP}' + FIELD_DICT_FP='{FIELD_DICT_FP}' + + DATA_TABLE_FP='{DATA_TABLE_FP}' + FIELD_TABLE_FP='{FIELD_TABLE_FP}' + FV3_NML_FN='{FV3_NML_FN}' # This may not be necessary... + FV3_NML_FP='{FV3_NML_FP}' + NEMS_CONFIG_FP='{NEMS_CONFIG_FP}' + + FV3_EXEC_FP='{FV3_EXEC_FP}' + + LOAD_MODULES_RUN_TASK_FP='{LOAD_MODULES_RUN_TASK_FP}' + + THOMPSON_MP_CLIMO_FN='{THOMPSON_MP_CLIMO_FN}' + THOMPSON_MP_CLIMO_FP='{THOMPSON_MP_CLIMO_FP}' + # + #----------------------------------------------------------------------- + # + # Flag for creating relative symlinks (as opposed to absolute ones). + # + #----------------------------------------------------------------------- + # + RELATIVE_LINK_FLAG='{RELATIVE_LINK_FLAG}' + # + #----------------------------------------------------------------------- + # + # Parameters that indicate whether or not various parameterizations are + # included in and called by the physics suite. 
+ # + #----------------------------------------------------------------------- + # + SDF_USES_RUC_LSM='{type_to_str(SDF_USES_RUC_LSM)}' + SDF_USES_THOMPSON_MP='{type_to_str(SDF_USES_THOMPSON_MP)}' + # + #----------------------------------------------------------------------- + # + # Grid configuration parameters needed regardless of grid generation + # method used. + # + #----------------------------------------------------------------------- + # + GTYPE='{GTYPE}' + TILE_RGNL='{TILE_RGNL}' + NH0='{NH0}' + NH3='{NH3}' + NH4='{NH4}' + + LON_CTR='{LON_CTR}' + LAT_CTR='{LAT_CTR}' + NX='{NX}' + NY='{NY}' + NHW='{NHW}' + STRETCH_FAC='{STRETCH_FAC}' + + RES_IN_FIXLAM_FILENAMES='{RES_IN_FIXLAM_FILENAMES}' + # + # If running the make_grid task, CRES will be set to a null string during + # the grid generation step. It will later be set to an actual value after + # the make_grid task is complete. + # + CRES='{CRES}'""" + with open(GLOBAL_VAR_DEFNS_FP,'a') as f: + f.write(dedent(msg)) + # + #----------------------------------------------------------------------- + # + # Append to the variable definitions file the defintions of grid parame- + # ters that are specific to the grid generation method used. + # + #----------------------------------------------------------------------- + # + if GRID_GEN_METHOD == "GFDLgrid": + + msg=f""" + # + #----------------------------------------------------------------------- + # + # Grid configuration parameters for a regional grid generated from a + # global parent cubed-sphere grid. This is the method originally + # suggested by GFDL since it allows GFDL's nested grid generator to be + # used to generate a regional grid. However, for large regional domains, + # it results in grids that have an unacceptably large range of cell sizes + # (i.e. ratio of maximum to minimum cell size is not sufficiently close + # to 1). + # + #----------------------------------------------------------------------- + # + ISTART_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG='{ISTART_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG}' + IEND_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG='{IEND_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG}' + JSTART_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG='{JSTART_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG}' + JEND_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG='{JEND_OF_RGNL_DOM_WITH_WIDE_HALO_ON_T6SG}'""" + with open(GLOBAL_VAR_DEFNS_FP,'a') as f: + f.write(dedent(msg)) + + elif GRID_GEN_METHOD == "ESGgrid": + + msg=f""" + # + #----------------------------------------------------------------------- + # + # Grid configuration parameters for a regional grid generated independently + # of a global parent grid. This method was developed by Jim Purser of + # EMC and results in very uniform grids (i.e. ratio of maximum to minimum + # cell size is very close to 1). + # + #----------------------------------------------------------------------- + # + DEL_ANGLE_X_SG='{DEL_ANGLE_X_SG}' + DEL_ANGLE_Y_SG='{DEL_ANGLE_Y_SG}' + NEG_NX_OF_DOM_WITH_WIDE_HALO='{NEG_NX_OF_DOM_WITH_WIDE_HALO}' + NEG_NY_OF_DOM_WITH_WIDE_HALO='{NEG_NY_OF_DOM_WITH_WIDE_HALO}' + PAZI='{PAZI or ''}'""" + with open(GLOBAL_VAR_DEFNS_FP,'a') as f: + f.write(dedent(msg)) + # + #----------------------------------------------------------------------- + # + # Continue appending variable definitions to the variable definitions + # file. 
+ # + #----------------------------------------------------------------------- + # + msg = f""" + # + #----------------------------------------------------------------------- + # + # Flag in the \"{MODEL_CONFIG_FN}\" file for coupling the ocean model to + # the weather model. + # + #----------------------------------------------------------------------- + # + CPL='{type_to_str(CPL)}' + # + #----------------------------------------------------------------------- + # + # Name of the ozone parameterization. The value this gets set to depends + # on the CCPP physics suite being used. + # + #----------------------------------------------------------------------- + # + OZONE_PARAM='{OZONE_PARAM}' + # + #----------------------------------------------------------------------- + # + # If USE_USER_STAGED_EXTRN_FILES is set to \"FALSE\", this is the system + # directory in which the workflow scripts will look for the files generated + # by the external model specified in EXTRN_MDL_NAME_ICS. These files will + # be used to generate the input initial condition and surface files for + # the FV3-LAM. + # + #----------------------------------------------------------------------- + # + EXTRN_MDL_SYSBASEDIR_ICS='{EXTRN_MDL_SYSBASEDIR_ICS}' + # + #----------------------------------------------------------------------- + # + # If USE_USER_STAGED_EXTRN_FILES is set to \"FALSE\", this is the system + # directory in which the workflow scripts will look for the files generated + # by the external model specified in EXTRN_MDL_NAME_LBCS. These files + # will be used to generate the input lateral boundary condition files for + # the FV3-LAM. + # + #----------------------------------------------------------------------- + # + EXTRN_MDL_SYSBASEDIR_LBCS='{EXTRN_MDL_SYSBASEDIR_LBCS}' + # + #----------------------------------------------------------------------- + # + # Shift back in time (in units of hours) of the starting time of the ex- + # ternal model specified in EXTRN_MDL_NAME_LBCS. + # + #----------------------------------------------------------------------- + # + EXTRN_MDL_LBCS_OFFSET_HRS='{EXTRN_MDL_LBCS_OFFSET_HRS}' + # + #----------------------------------------------------------------------- + # + # Boundary condition update times (in units of forecast hours). Note that + # LBC_SPEC_FCST_HRS is an array, even if it has only one element. + # + #----------------------------------------------------------------------- + # + LBC_SPEC_FCST_HRS={list_to_str(LBC_SPEC_FCST_HRS)} + # + #----------------------------------------------------------------------- + # + # The number of cycles for which to make forecasts and the list of + # starting dates/hours of these cycles. + # + #----------------------------------------------------------------------- + # + NUM_CYCLES='{NUM_CYCLES}' + ALL_CDATES={list_to_str(ALL_CDATES)} + # + #----------------------------------------------------------------------- + # + # Parameters that determine whether FVCOM data will be used, and if so, + # their location. + # + # If USE_FVCOM is set to \"TRUE\", then FVCOM data (in the file FVCOM_FILE + # located in the directory FVCOM_DIR) will be used to update the surface + # boundary conditions during the initial conditions generation task + # (MAKE_ICS_TN). + # + #----------------------------------------------------------------------- + # + USE_FVCOM='{type_to_str(USE_FVCOM)}' + FVCOM_DIR='{FVCOM_DIR}' + FVCOM_FILE='{FVCOM_FILE}' + # + #----------------------------------------------------------------------- + # + # Computational parameters. 
+ # + #----------------------------------------------------------------------- + # + NCORES_PER_NODE='{NCORES_PER_NODE}' + PE_MEMBER01='{PE_MEMBER01}' + # + #----------------------------------------------------------------------- + # + # IF DO_SPP is set to "TRUE", N_VAR_SPP specifies the number of physics + # parameterizations that are perturbed with SPP. If DO_LSM_SPP is set to + # "TRUE", N_VAR_LNDP specifies the number of LSM parameters that are + # perturbed. LNDP_TYPE determines the way LSM perturbations are employed + # and FHCYC_LSM_SPP_OR_NOT sets FHCYC based on whether LSM perturbations + # are turned on or not. + # + #----------------------------------------------------------------------- + # + N_VAR_SPP='{N_VAR_SPP}' + N_VAR_LNDP='{N_VAR_LNDP}' + LNDP_TYPE='{LNDP_TYPE}' + FHCYC_LSM_SPP_OR_NOT='{FHCYC_LSM_SPP_OR_NOT}' + """ + + with open(GLOBAL_VAR_DEFNS_FP,'a') as f: + f.write(dedent(msg)) + + # export all vars + export_vars() + + # + #----------------------------------------------------------------------- + # + # Check validity of parameters in one place, here in the end. + # + #----------------------------------------------------------------------- + # + + # update dictionary with globals() values + update_dict = {k: globals()[k] for k in cfg_d.keys() if k in globals() } + cfg_d.update(update_dict) + + # loop through cfg_d and check validity of params + cfg_v = load_config_file("valid_param_vals.yaml") + for k,v in cfg_d.items(): + if v == None: + continue + vkey = 'valid_vals_' + k + if (vkey in cfg_v) and not (v in cfg_v[vkey]): + print_err_msg_exit(f''' + The variable {k}={v} in {EXPT_DEFAULT_CONFIG_FN} or {EXPT_CONFIG_FN} does not have + a valid value. Possible values are: + {k} = {cfg_v[vkey]}''') + + # + #----------------------------------------------------------------------- + # + # Print message indicating successful completion of script. + # + #----------------------------------------------------------------------- + # + print_info_msg(f''' + ======================================================================== + Function setup() in \"{os.path.basename(__file__)}\" completed successfully!!! + ========================================================================''') + +# +#----------------------------------------------------------------------- +# +# Call the function defined above. +# +#----------------------------------------------------------------------- +# +if __name__ == "__main__": + setup() + diff --git a/ush/setup.sh b/ush/setup.sh index 26dd58b17..a5e0f15fd 100755 --- a/ush/setup.sh +++ b/ush/setup.sh @@ -515,7 +515,7 @@ One or more fix file directories have not been specified for this machine: FIXlut = \"${FIXlut:-\"\"} TOPO_DIR = \"${TOPO_DIR:-\"\"} SFC_CLIMO_INPUT_DIR = \"${SFC_CLIMO_INPUT_DIR:-\"\"} - FIXLAM_NCO_BASEDIR = \"${FIXLAM_NCO_BASEDIR:-\"\"} + DOMAIN_PREGEN_BASEDIR = \"${DOMAIN_PREGEN_BASEDIR:-\"\"} You can specify the missing location(s) in ${machine_file}" fi @@ -523,7 +523,7 @@ fi # #----------------------------------------------------------------------- # -# Set the names of the build and workflow environment files (if not +# Set the names of the build and workflow module files (if not # already specified by the user). These are the files that need to be # sourced before building the component SRW App codes and running various # workflow scripts, respectively. 
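The validity check above can be exercised on its own. The sketch below inlines a tiny stand-in for valid_param_vals.yaml (the real file's keys are not reproduced here) and raises an exception instead of calling print_err_msg_exit:

```
# Standalone sketch of the parameter-validity loop; the dictionaries are stand-ins.
cfg_v = {"valid_vals_CCPP_PHYS_SUITE": ["FV3_GFS_v16", "FV3_HRRR", "FV3_RRFS_v1beta"]}
cfg_d = {"CCPP_PHYS_SUITE": "FV3_GFS_v16", "EXPT_SUBDIR": "test_expt"}

for k, v in cfg_d.items():
    if v is None:
        continue                      # unset variables are not checked
    vkey = "valid_vals_" + k
    if vkey in cfg_v and v not in cfg_v[vkey]:
        raise ValueError(f"{k}={v} is not one of the valid values: {cfg_v[vkey]}")
```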
@@ -531,8 +531,8 @@ fi #----------------------------------------------------------------------- # machine=$(echo_lowercase ${MACHINE}) -WFLOW_ENV_FN=${WFLOW_ENV_FN:-"wflow_${machine}.env"} -BUILD_ENV_FN=${BUILD_ENV_FN:-"build_${machine}_${COMPILER}.env"} +WFLOW_MOD_FN=${WFLOW_MOD_FN:-"wflow_${machine}"} +BUILD_MOD_FN=${BUILD_MOD_FN:-"build_${machine}_${COMPILER}"} # #----------------------------------------------------------------------- # @@ -1141,7 +1141,7 @@ check_for_preexist_dir_file "$EXPTDIR" "${PREEXISTING_DIR_METHOD}" # is not placed directly under COMROOT but several directories further # down. More specifically, for a cycle starting at yyyymmddhh, it is at # -# $COMROOT/$NET/$envir/$RUN.$yyyymmdd/$hh +# $COMROOT/$NET/$model_ver/$RUN.$yyyymmdd/$hh # # Below, we set COMROOT in terms of PTMP as COMROOT="$PTMP/com". COMOROOT # is not used by the workflow in community mode. @@ -1151,10 +1151,13 @@ check_for_preexist_dir_file "$EXPTDIR" "${PREEXISTING_DIR_METHOD}" # from the RUN_POST_TN task will be placed, i.e. it is the cycle-independent # portion of the RUN_POST_TN task's output directory. It is given by # -# $COMROOT/$NET/$envir +# $COMROOT/$NET/$model_ver # # COMOUT_BASEDIR is not used by the workflow in community mode. # +# POST_OUTPUT_DOMAIN_NAME: +# The PREDEF_GRID_NAME is set by default. +# #----------------------------------------------------------------------- # LOGDIR="${EXPTDIR}/log" @@ -1164,20 +1167,28 @@ FIXclim="${EXPTDIR}/fix_clim" FIXLAM="${EXPTDIR}/fix_lam" if [ "${RUN_ENVIR}" = "nco" ]; then - - CYCLE_BASEDIR="$STMP/tmpnwprd/$RUN" + CYCLE_BASEDIR="${STMP}/tmpnwprd/${RUN}" check_for_preexist_dir_file "${CYCLE_BASEDIR}" "${PREEXISTING_DIR_METHOD}" - COMROOT="$PTMP/com" - COMOUT_BASEDIR="$COMROOT/$NET/$envir" + COMROOT="${PTMP}/com" + COMOUT_BASEDIR="${COMROOT}/${NET}/${model_ver}" check_for_preexist_dir_file "${COMOUT_BASEDIR}" "${PREEXISTING_DIR_METHOD}" - else - CYCLE_BASEDIR="$EXPTDIR" COMROOT="" COMOUT_BASEDIR="" +fi +POST_OUTPUT_DOMAIN_NAME="${POST_OUTPUT_DOMAIN_NAME:-${PREDEF_GRID_NAME}}" +if [ -z "${POST_OUTPUT_DOMAIN_NAME}" ]; then + print_err_msg_exit "\ +The domain name used in naming the run_post output files (POST_OUTPUT_DOMAIN_NAME) +has not been set: + POST_OUTPUT_DOMAIN_NAME = \"${POST_OUTPUT_DOMAIN_NAME}\" +If this experiment is not using a predefined grid (i.e. if PREDEF_GRID_NAME +is set to a null string), POST_OUTPUT_DOMAIN_NAME must be set in the SRW +App's configuration file (\"${EXPT_CONFIG_FN}\")." fi +POST_OUTPUT_DOMAIN_NAME=$(echo_lowercase ${POST_OUTPUT_DOMAIN_NAME}) # #----------------------------------------------------------------------- # @@ -1356,14 +1367,15 @@ USE_USER_STAGED_EXTRN_FILES=$(boolify $USE_USER_STAGED_EXTRN_FILES) # if [ "${USE_USER_STAGED_EXTRN_FILES}" = "TRUE" ]; then - if [ ! -d "${EXTRN_MDL_SOURCE_BASEDIR_ICS}" ]; then + # Check for the base directory up to the first templated field. + if [ ! -d "$(dirname ${EXTRN_MDL_SOURCE_BASEDIR_ICS%%\$*})" ]; then print_err_msg_exit "\ The directory (EXTRN_MDL_SOURCE_BASEDIR_ICS) in which the user-staged external model files for generating ICs should be located does not exist: EXTRN_MDL_SOURCE_BASEDIR_ICS = \"${EXTRN_MDL_SOURCE_BASEDIR_ICS}\"" fi - if [ ! -d "${EXTRN_MDL_SOURCE_BASEDIR_LBCS}" ]; then + if [ ! 
-d "$(dirname ${EXTRN_MDL_SOURCE_BASEDIR_LBCS%%\$*})" ]; then print_err_msg_exit "\ The directory (EXTRN_MDL_SOURCE_BASEDIR_LBCS) in which the user-staged external model files for generating LBCs should be located does not exist: @@ -1478,7 +1490,7 @@ LOAD_MODULES_RUN_TASK_FP="$USHDIR/load_modules_run_task.sh" # if [ "${RUN_ENVIR}" = "nco" ]; then - nco_fix_dir="${FIXLAM_NCO_BASEDIR}/${PREDEF_GRID_NAME}" + nco_fix_dir="${DOMAIN_PREGEN_BASEDIR}/${PREDEF_GRID_NAME}" if [ ! -d "${nco_fix_dir}" ]; then print_err_msg_exit "\ The directory (nco_fix_dir) that should contain the pregenerated grid, @@ -1494,11 +1506,11 @@ orography, and surface climatology files does not exist: When RUN_ENVIR is set to \"nco\", the workflow assumes that pregenerated grid files already exist in the directory - \${FIXLAM_NCO_BASEDIR}/\${PREDEF_GRID_NAME} + \${DOMAIN_PREGEN_BASEDIR}/\${PREDEF_GRID_NAME} where - FIXLAM_NCO_BASEDIR = \"${FIXLAM_NCO_BASEDIR}\" + DOMAIN_PREGEN_BASEDIR = \"${DOMAIN_PREGEN_BASEDIR}\" PREDEF_GRID_NAME = \"${PREDEF_GRID_NAME}\" Thus, the MAKE_GRID_TN task must not be run (i.e. RUN_TASK_MAKE_GRID must @@ -1532,11 +1544,11 @@ Reset values are: msg=" When RUN_ENVIR is set to \"nco\", the workflow assumes that pregenerated orography files already exist in the directory - \${FIXLAM_NCO_BASEDIR}/\${PREDEF_GRID_NAME} + \${DOMAIN_PREGEN_BASEDIR}/\${PREDEF_GRID_NAME} where - FIXLAM_NCO_BASEDIR = \"${FIXLAM_NCO_BASEDIR}\" + DOMAIN_PREGEN_BASEDIR = \"${DOMAIN_PREGEN_BASEDIR}\" PREDEF_GRID_NAME = \"${PREDEF_GRID_NAME}\" Thus, the MAKE_OROG_TN task must not be run (i.e. RUN_TASK_MAKE_OROG must @@ -1571,11 +1583,11 @@ Reset values are: When RUN_ENVIR is set to \"nco\", the workflow assumes that pregenerated surface climatology files already exist in the directory - \${FIXLAM_NCO_BASEDIR}/\${PREDEF_GRID_NAME} + \${DOMAIN_PREGEN_BASEDIR}/\${PREDEF_GRID_NAME} where - FIXLAM_NCO_BASEDIR = \"${FIXLAM_NCO_BASEDIR}\" + DOMAIN_PREGEN_BASEDIR = \"${DOMAIN_PREGEN_BASEDIR}\" PREDEF_GRID_NAME = \"${PREDEF_GRID_NAME}\" Thus, the MAKE_SFC_CLIMO_TN task must not be run (i.e. RUN_TASK_MAKE_SFC_CLIMO @@ -2163,7 +2175,6 @@ GLOBAL_VAR_DEFNS_FP="$EXPTDIR/${GLOBAL_VAR_DEFNS_FN}" # variable definitions file. # #----------------------------------------------------------------------- - # print_info_msg " Creating list of default experiment variable definitions..." diff --git a/ush/templates/FV3LAM_wflow.xml b/ush/templates/FV3LAM_wflow.xml index 6cdaff347..8dff08780 100644 --- a/ush/templates/FV3LAM_wflow.xml +++ b/ush/templates/FV3LAM_wflow.xml @@ -22,6 +22,7 @@ Parameters needed by the job scheduler. + @@ -184,6 +185,9 @@ MODULES_RUN_TASK_FP script. {%- endif %} {{ wtime_make_grid }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &MAKE_GRID_TN; &LOGDIR;/&MAKE_GRID_TN;.log @@ -207,6 +211,9 @@ MODULES_RUN_TASK_FP script. {%- endif %} {{ wtime_make_orog }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &MAKE_OROG_TN; &LOGDIR;/&MAKE_OROG_TN;.log @@ -234,6 +241,9 @@ MODULES_RUN_TASK_FP script. {{ nnodes_make_sfc_climo }}:ppn={{ ppn_make_sfc_climo }} {{ wtime_make_sfc_climo }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &MAKE_SFC_CLIMO_TN; &LOGDIR;/&MAKE_SFC_CLIMO_TN;.log @@ -271,6 +281,9 @@ MODULES_RUN_TASK_FP script. 
{{ nnodes_get_extrn_ics }}:ppn={{ ppn_get_extrn_ics }} {{ wtime_get_extrn_ics }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &GET_EXTRN_ICS_TN; &LOGDIR;/&GET_EXTRN_ICS_TN;_@Y@m@d@H.log @@ -297,6 +310,9 @@ MODULES_RUN_TASK_FP script. {{ nnodes_get_extrn_lbcs }}:ppn={{ ppn_get_extrn_lbcs }} {{ wtime_get_extrn_lbcs }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &GET_EXTRN_LBCS_TN; &LOGDIR;/&GET_EXTRN_LBCS_TN;_@Y@m@d@H.log @@ -330,6 +346,9 @@ MODULES_RUN_TASK_FP script. {{ nnodes_make_ics }}:ppn={{ ppn_make_ics }} {{ wtime_make_ics }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &MAKE_ICS_TN;{{ uscore_ensmem_name }} &LOGDIR;/&MAKE_ICS_TN;{{ uscore_ensmem_name }}_@Y@m@d@H.log @@ -374,6 +393,9 @@ MODULES_RUN_TASK_FP script. {{ nnodes_make_lbcs }}:ppn={{ ppn_make_lbcs }} {{ wtime_make_lbcs }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &MAKE_LBCS_TN;{{ uscore_ensmem_name }} &LOGDIR;/&MAKE_LBCS_TN;{{ uscore_ensmem_name }}_@Y@m@d@H.log @@ -421,6 +443,9 @@ MODULES_RUN_TASK_FP script. {%- else %} {{ nnodes_run_fcst }}:ppn={{ ppn_run_fcst }} &NCORES_PER_NODE; + {%- endif %} + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; {%- endif %} {{ wtime_run_fcst }} &RUN_FCST_TN;{{ uscore_ensmem_name }} @@ -473,6 +498,9 @@ later below for other output times. {{ nnodes_run_post }}:ppn={{ ppn_run_post }} {{ wtime_run_post }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &RUN_POST_TN;{{ uscore_ensmem_name }}_f#fhr##fmn# &LOGDIR;/&RUN_POST_TN;{{ uscore_ensmem_name }}_f#fhr##fmn#_@Y@m@d@H.log @@ -527,6 +555,9 @@ for other output times. {{ nnodes_run_post }}:ppn={{ ppn_run_post }} {{ wtime_run_post }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &RUN_POST_TN;{{ uscore_ensmem_name }}_f#fhr##fmn# &LOGDIR;/&RUN_POST_TN;{{ uscore_ensmem_name }}_f#fhr##fmn#_@Y@m@d@H.log @@ -586,6 +617,9 @@ always zero). {{ nnodes_run_post }}:ppn={{ ppn_run_post }} {{ wtime_run_post }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} {%- if sub_hourly_post %} &RUN_POST_TN;{{ uscore_ensmem_name }}_f#fhr##fmn# &LOGDIR;/&RUN_POST_TN;{{ uscore_ensmem_name }}_f#fhr##fmn#_@Y@m@d@H.log @@ -654,6 +688,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_run_post }}:ppn={{ ppn_run_post }} {{ wtime_run_post }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &RUN_POST_TN;{{ uscore_ensmem_name }}_f#fhr##fmn# &LOGDIR;/&RUN_POST_TN;{{ uscore_ensmem_name }}_f#fhr##fmn#_@Y@m@d@H.log @@ -697,6 +734,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_get_obs_ccpa }}:ppn={{ ppn_get_obs_ccpa }} {{ wtime_get_obs_ccpa }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &GET_OBS_CCPA_TN; &LOGDIR;/&GET_OBS_CCPA_TN;_@Y@m@d@H.log @@ -727,6 +767,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_get_obs_mrms }}:ppn={{ ppn_get_obs_mrms }} {{ wtime_get_obs_mrms }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &GET_OBS_MRMS_TN; &LOGDIR;/&GET_OBS_MRMS_TN;_@Y@m@d@H.log @@ -758,6 +801,9 @@ the tag to be identical to the ones above for other output times. 
{{ nnodes_get_obs_ndas }}:ppn={{ ppn_get_obs_ndas }} {{ wtime_get_obs_ndas }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &GET_OBS_NDAS_TN; &LOGDIR;/&GET_OBS_NDAS_TN;_@Y@m@d@H.log @@ -784,6 +830,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_gridstat }}:ppn={{ ppn_vx_gridstat }} {{ wtime_vx_gridstat }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_GRIDSTAT_TN; &LOGDIR;/&VX_GRIDSTAT_TN;{{ uscore_ensmem_name }}_@Y@m@d@H.log @@ -834,6 +883,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_gridstat }}:ppn={{ ppn_vx_gridstat }} {{ wtime_vx_gridstat }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_GRIDSTAT_REFC_TN; &LOGDIR;/&VX_GRIDSTAT_REFC_TN;{{ uscore_ensmem_name }}_@Y@m@d@H.log @@ -883,6 +935,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_gridstat }}:ppn={{ ppn_vx_gridstat }} {{ wtime_vx_gridstat }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_GRIDSTAT_RETOP_TN; &LOGDIR;/&VX_GRIDSTAT_RETOP_TN;{{ uscore_ensmem_name }}_@Y@m@d@H.log @@ -932,6 +987,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_gridstat }}:ppn={{ ppn_vx_gridstat }} {{ wtime_vx_gridstat }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_GRIDSTAT_03h_TN; &LOGDIR;/&VX_GRIDSTAT_03h_TN;{{ uscore_ensmem_name }}_@Y@m@d@H.log @@ -967,6 +1025,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_gridstat }}:ppn={{ ppn_vx_gridstat }} {{ wtime_vx_gridstat }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_GRIDSTAT_06h_TN; &LOGDIR;/&VX_GRIDSTAT_06h_TN;{{ uscore_ensmem_name }}_@Y@m@d@H.log @@ -1002,6 +1063,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_gridstat }}:ppn={{ ppn_vx_gridstat }} {{ wtime_vx_gridstat }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_GRIDSTAT_24h_TN; &LOGDIR;/&VX_GRIDSTAT_24h_TN;{{ uscore_ensmem_name }}_@Y@m@d@H.log @@ -1037,6 +1101,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_pointstat }}:ppn={{ ppn_vx_pointstat }} {{ wtime_vx_pointstat }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_POINTSTAT_TN; &LOGDIR;/&VX_POINTSTAT_TN;{{ uscore_ensmem_name }}_@Y@m@d@H.log @@ -1089,6 +1156,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_ensgrid }}:ppn={{ ppn_vx_ensgrid }} {{ wtime_vx_ensgrid }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_ENSGRID_TN; &LOGDIR;/&VX_ENSGRID_TN;_@Y@m@d@H.log @@ -1121,6 +1191,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_ensgrid }}:ppn={{ ppn_vx_ensgrid }} {{ wtime_vx_ensgrid }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_ENSGRID_03h_TN; &LOGDIR;/&VX_ENSGRID_03h_TN;_@Y@m@d@H.log @@ -1153,6 +1226,9 @@ the tag to be identical to the ones above for other output times. 
{{ nnodes_vx_ensgrid }}:ppn={{ ppn_vx_ensgrid }} {{ wtime_vx_ensgrid }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_ENSGRID_06h_TN; &LOGDIR;/&VX_ENSGRID_06h_TN;_@Y@m@d@H.log @@ -1185,6 +1261,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_ensgrid }}:ppn={{ ppn_vx_ensgrid }} {{ wtime_vx_ensgrid }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_ENSGRID_24h_TN; &LOGDIR;/&VX_ENSGRID_24h_TN;_@Y@m@d@H.log @@ -1216,6 +1295,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_ensgrid }}:ppn={{ ppn_vx_ensgrid }} {{ wtime_vx_ensgrid }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_ENSGRID_REFC_TN; &LOGDIR;/&VX_ENSGRID_REFC_TN;_@Y@m@d@H.log @@ -1246,6 +1328,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_ensgrid }}:ppn={{ ppn_vx_ensgrid }} {{ wtime_vx_ensgrid }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_ENSGRID_RETOP_TN; &LOGDIR;/&VX_ENSGRID_RETOP_TN;_@Y@m@d@H.log @@ -1275,6 +1360,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_ensgrid_mean }}:ppn={{ ppn_vx_ensgrid_mean }} {{ wtime_vx_ensgrid_mean }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_ENSGRID_MEAN_TN; &LOGDIR;/&VX_ENSGRID_MEAN_TN;_@Y@m@d@H.log @@ -1306,6 +1394,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_ensgrid_prob }}:ppn={{ ppn_vx_ensgrid_prob }} {{ wtime_vx_ensgrid_prob }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_ENSGRID_PROB_TN; &LOGDIR;/&VX_ENSGRID_PROB_TN;_@Y@m@d@H.log @@ -1337,6 +1428,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_ensgrid_mean }}:ppn={{ ppn_vx_ensgrid_mean }} {{ wtime_vx_ensgrid_mean }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_ENSGRID_MEAN_03h_TN; &LOGDIR;/&VX_ENSGRID_MEAN_03h_TN;_@Y@m@d@H.log @@ -1368,6 +1462,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_ensgrid_prob }}:ppn={{ ppn_vx_ensgrid_prob }} {{ wtime_vx_ensgrid_prob }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_ENSGRID_PROB_03h_TN; &LOGDIR;/&VX_ENSGRID_PROB_03h_TN;_@Y@m@d@H.log @@ -1400,6 +1497,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_ensgrid_mean }}:ppn={{ ppn_vx_ensgrid_mean }} {{ wtime_vx_ensgrid_mean }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_ENSGRID_MEAN_06h_TN; &LOGDIR;/&VX_ENSGRID_MEAN_06h_TN;_@Y@m@d@H.log @@ -1431,6 +1531,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_ensgrid_prob }}:ppn={{ ppn_vx_ensgrid_prob }} {{ wtime_vx_ensgrid_prob }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_ENSGRID_PROB_06h_TN; &LOGDIR;/&VX_ENSGRID_PROB_06h_TN;_@Y@m@d@H.log @@ -1463,6 +1566,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_ensgrid_mean }}:ppn={{ ppn_vx_ensgrid_mean }} {{ wtime_vx_ensgrid_mean }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_ENSGRID_MEAN_24h_TN; &LOGDIR;/&VX_ENSGRID_MEAN_24h_TN;_@Y@m@d@H.log @@ -1494,6 +1600,9 @@ the tag to be identical to the ones above for other output times. 
{{ nnodes_vx_ensgrid_prob }}:ppn={{ ppn_vx_ensgrid_prob }} {{ wtime_vx_ensgrid_prob }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_ENSGRID_PROB_24h_TN; &LOGDIR;/&VX_ENSGRID_PROB_24h_TN;_@Y@m@d@H.log @@ -1525,6 +1634,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_ensgrid_prob }}:ppn={{ ppn_vx_ensgrid_prob }} {{ wtime_vx_ensgrid_prob }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_ENSGRID_PROB_REFC_TN; &LOGDIR;/&VX_ENSGRID_PROB_REFC_TN;_@Y@m@d@H.log @@ -1555,6 +1667,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_ensgrid_prob }}:ppn={{ ppn_vx_ensgrid_prob }} {{ wtime_vx_ensgrid_prob }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_ENSGRID_PROB_RETOP_TN; &LOGDIR;/&VX_ENSGRID_PROB_RETOP_TN;_@Y@m@d@H.log @@ -1586,6 +1701,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_enspoint }}:ppn={{ ppn_vx_enspoint }} {{ wtime_vx_enspoint }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_ENSPOINT_TN; &LOGDIR;/&VX_ENSPOINT_TN;_@Y@m@d@H.log @@ -1614,6 +1732,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_enspoint_mean }}:ppn={{ ppn_vx_enspoint_mean }} {{ wtime_vx_enspoint_mean }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_ENSPOINT_MEAN_TN; &LOGDIR;/&VX_ENSPOINT_MEAN_TN;_@Y@m@d@H.log @@ -1642,6 +1763,9 @@ the tag to be identical to the ones above for other output times. {{ nnodes_vx_enspoint_prob }}:ppn={{ ppn_vx_enspoint_prob }} {{ wtime_vx_enspoint_prob }} &NCORES_PER_NODE; + {%- if machine in ["GAEA"] %} + &SLURM_NATIVE_CMD; + {%- endif %} &VX_ENSPOINT_PROB_TN; &LOGDIR;/&VX_ENSPOINT_PROB_TN;_@Y@m@d@H.log diff --git a/ush/templates/data_locations.yml b/ush/templates/data_locations.yml index d82eeec67..ef5d49c8e 100644 --- a/ush/templates/data_locations.yml +++ b/ush/templates/data_locations.yml @@ -64,14 +64,12 @@ FV3GFS: - gfs.t{hh}z.sfcanl.nemsio fcst: - gfs.t{hh}z.atmf{fcst_hr:03d}.nemsio - - gfs.t{hh}z.sfcf{fcst_hr:03d}.nemsio netcdf: anl: - gfs.t{hh}z.atmanl.nc - gfs.t{hh}z.sfcanl.nc fcst: - gfs.t{hh}z.atmf{fcst_hr:03d}.nc - - gfs.t{hh}z.sfcf{fcst_hr:03d}.nc hpss: protocol: htar archive_path: diff --git a/ush/templates/parm/metplus/EnsembleStat_APCP01h.conf b/ush/templates/parm/metplus/EnsembleStat_APCP01h.conf index 23f6adae9..33599d9f7 100644 --- a/ush/templates/parm/metplus/EnsembleStat_APCP01h.conf +++ b/ush/templates/parm/metplus/EnsembleStat_APCP01h.conf @@ -221,7 +221,7 @@ STAGING_DIR = {OUTPUT_BASE}/stage/APCP_01 # FCST_ENSEMBLE_STAT_INPUT_TEMPLATE - comma separated list of ensemble members # or a single line, - filename wildcard characters may be used, ? or *. -FCST_ENSEMBLE_STAT_INPUT_TEMPLATE = mem*/postprd/{ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}.grib2 +FCST_ENSEMBLE_STAT_INPUT_TEMPLATE = mem*/postprd/{ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}.grib2 # Template to look for point observations. 
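The METplus template edits in this patch all apply the same renaming of the post-processed files. A quick illustration of the old versus new resolved names (the NET and POST_OUTPUT_DOMAIN_NAME values here are made up):

```
# Illustration of the renamed post-processed GRIB2 file pattern.
net = "rrfs"
init_hh = "00"
lead_hhh = "006"
post_output_domain_name = "rrfs_conus_3km"

old_name = f"{net}.t{init_hh}z.prslevf{lead_hhh}.tm{init_hh}.grib2"
new_name = f"{net}.t{init_hh}z.prslev.f{lead_hhh}.{post_output_domain_name}.grib2"
print(old_name)  # rrfs.t00z.prslevf006.tm00.grib2
print(new_name)  # rrfs.t00z.prslev.f006.rrfs_conus_3km.grib2
```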
# Example precip24_2010010112.nc diff --git a/ush/templates/parm/metplus/EnsembleStat_APCP03h.conf b/ush/templates/parm/metplus/EnsembleStat_APCP03h.conf index d67380ad8..2691dc2dc 100644 --- a/ush/templates/parm/metplus/EnsembleStat_APCP03h.conf +++ b/ush/templates/parm/metplus/EnsembleStat_APCP03h.conf @@ -256,8 +256,8 @@ STAGING_DIR = {OUTPUT_BASE}/stage/APCP_03 [filename_templates] # Need to have PCPCombine output data to individual member directories. -FCST_PCP_COMBINE_INPUT_TEMPLATE = {custom?fmt=%s}/postprd/{ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}.grib2 -FCST_PCP_COMBINE_OUTPUT_TEMPLATE = {custom?fmt=%s}/metprd/pcp_combine/{ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}_a03h +FCST_PCP_COMBINE_INPUT_TEMPLATE = {custom?fmt=%s}/postprd/{ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}.grib2 +FCST_PCP_COMBINE_OUTPUT_TEMPLATE = {custom?fmt=%s}/metprd/pcp_combine/{ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}_a03h # Input and output template for obs pcp-combine files OBS_PCP_COMBINE_INPUT_TEMPLATE = {OBS_PCP_COMBINE_INPUT_DIR}/{valid?fmt=%Y%m%d}/ccpa.t{valid?fmt=%H}z.01h.hrap.conus.gb2 @@ -266,7 +266,7 @@ OBS_PCP_COMBINE_OUTPUT_TEMPLATE = {OBS_PCP_COMBINE_OUTPUT_DIR}/{valid?fmt=%Y%m%d # FCST_ENSEMBLE_STAT_INPUT_TEMPLATE - comma separated list of ensemble members # or a single line, - filename wildcard characters may be used, ? or *. -FCST_ENSEMBLE_STAT_INPUT_TEMPLATE = membegin_end_incr(1,{ENV[NUM_ENS_MEMBERS]},1,{ENV[NUM_PAD]})/metprd/pcp_combine/{ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}_a03h +FCST_ENSEMBLE_STAT_INPUT_TEMPLATE = membegin_end_incr(1,{ENV[NUM_ENS_MEMBERS]},1,{ENV[NUM_PAD]})/metprd/pcp_combine/{ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}_a03h # Template to look for point observations. # Example precip24_2010010112.nc diff --git a/ush/templates/parm/metplus/EnsembleStat_APCP06h.conf b/ush/templates/parm/metplus/EnsembleStat_APCP06h.conf index ae2f61f44..8527e2363 100644 --- a/ush/templates/parm/metplus/EnsembleStat_APCP06h.conf +++ b/ush/templates/parm/metplus/EnsembleStat_APCP06h.conf @@ -257,8 +257,8 @@ STAGING_DIR = {OUTPUT_BASE}/stage/APCP_01 [filename_templates] # Need to have PCPCombine output data to individual member directories. -FCST_PCP_COMBINE_INPUT_TEMPLATE = {custom?fmt=%s}/postprd/{ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}.grib2 -FCST_PCP_COMBINE_OUTPUT_TEMPLATE = {custom?fmt=%s}/metprd/pcp_combine/{ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}_a06h +FCST_PCP_COMBINE_INPUT_TEMPLATE = {custom?fmt=%s}/postprd/{ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}.grib2 +FCST_PCP_COMBINE_OUTPUT_TEMPLATE = {custom?fmt=%s}/metprd/pcp_combine/{ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}_a06h # Input and output template for obs pcp-combine files OBS_PCP_COMBINE_INPUT_TEMPLATE = {OBS_PCP_COMBINE_INPUT_DIR}/{valid?fmt=%Y%m%d}/ccpa.t{valid?fmt=%H}z.01h.hrap.conus.gb2 @@ -267,7 +267,7 @@ OBS_PCP_COMBINE_OUTPUT_TEMPLATE = {OBS_PCP_COMBINE_OUTPUT_DIR}/{valid?fmt=%Y%m%d # FCST_ENSEMBLE_STAT_INPUT_TEMPLATE - comma separated list of ensemble members # or a single line, - filename wildcard characters may be used, ? or *. 
-FCST_ENSEMBLE_STAT_INPUT_TEMPLATE = membegin_end_incr(1,{ENV[NUM_ENS_MEMBERS]},1,{ENV[NUM_PAD]})/metprd/pcp_combine/{ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}_a06h +FCST_ENSEMBLE_STAT_INPUT_TEMPLATE = membegin_end_incr(1,{ENV[NUM_ENS_MEMBERS]},1,{ENV[NUM_PAD]})/metprd/pcp_combine/{ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}_a06h # Template to look for point observations. # Example precip24_2010010112.nc diff --git a/ush/templates/parm/metplus/EnsembleStat_APCP24h.conf b/ush/templates/parm/metplus/EnsembleStat_APCP24h.conf index 0b1d175af..c3ed6a3a9 100644 --- a/ush/templates/parm/metplus/EnsembleStat_APCP24h.conf +++ b/ush/templates/parm/metplus/EnsembleStat_APCP24h.conf @@ -257,8 +257,8 @@ STAGING_DIR = {OUTPUT_BASE}/stage/APCP_01 [filename_templates] # Need to have PCPCombine output data to individual member directories. -FCST_PCP_COMBINE_INPUT_TEMPLATE = {custom?fmt=%s}/postprd/{ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}.grib2 -FCST_PCP_COMBINE_OUTPUT_TEMPLATE = {custom?fmt=%s}/metprd/pcp_combine/{ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}_a24h +FCST_PCP_COMBINE_INPUT_TEMPLATE = {custom?fmt=%s}/postprd/{ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}.grib2 +FCST_PCP_COMBINE_OUTPUT_TEMPLATE = {custom?fmt=%s}/metprd/pcp_combine/{ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}_a24h # Input and output template for obs pcp-combine files OBS_PCP_COMBINE_INPUT_TEMPLATE = {OBS_PCP_COMBINE_INPUT_DIR}/{valid?fmt=%Y%m%d}/ccpa.t{valid?fmt=%H}z.01h.hrap.conus.gb2 @@ -267,7 +267,7 @@ OBS_PCP_COMBINE_OUTPUT_TEMPLATE = {OBS_PCP_COMBINE_OUTPUT_DIR}/{valid?fmt=%Y%m%d # FCST_ENSEMBLE_STAT_INPUT_TEMPLATE - comma separated list of ensemble members # or a single line, - filename wildcard characters may be used, ? or *. -FCST_ENSEMBLE_STAT_INPUT_TEMPLATE = membegin_end_incr(1,{ENV[NUM_ENS_MEMBERS]},1,{ENV[NUM_PAD]})/metprd/pcp_combine/{ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}_a24h +FCST_ENSEMBLE_STAT_INPUT_TEMPLATE = membegin_end_incr(1,{ENV[NUM_ENS_MEMBERS]},1,{ENV[NUM_PAD]})/metprd/pcp_combine/{ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}_a24h # Template to look for point observations. # Example precip24_2010010112.nc diff --git a/ush/templates/parm/metplus/EnsembleStat_REFC.conf b/ush/templates/parm/metplus/EnsembleStat_REFC.conf index 989341a55..123af0f0a 100644 --- a/ush/templates/parm/metplus/EnsembleStat_REFC.conf +++ b/ush/templates/parm/metplus/EnsembleStat_REFC.conf @@ -220,7 +220,7 @@ STAGING_DIR = {OUTPUT_BASE}/stage/REFC # FCST_ENSEMBLE_STAT_INPUT_TEMPLATE - comma separated list of ensemble members # or a single line, - filename wildcard characters may be used, ? or *. -FCST_ENSEMBLE_STAT_INPUT_TEMPLATE = mem*/postprd/{ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}.grib2 +FCST_ENSEMBLE_STAT_INPUT_TEMPLATE = mem*/postprd/{ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}.grib2 # Template to look for point observations. 
# Example precip24_2010010112.nc diff --git a/ush/templates/parm/metplus/EnsembleStat_RETOP.conf b/ush/templates/parm/metplus/EnsembleStat_RETOP.conf index b0c235a2e..4fcbc9bcf 100644 --- a/ush/templates/parm/metplus/EnsembleStat_RETOP.conf +++ b/ush/templates/parm/metplus/EnsembleStat_RETOP.conf @@ -223,7 +223,7 @@ STAGING_DIR = {OUTPUT_BASE}/stage/RETOP # FCST_ENSEMBLE_STAT_INPUT_TEMPLATE - comma separated list of ensemble members # or a single line, - filename wildcard characters may be used, ? or *. -FCST_ENSEMBLE_STAT_INPUT_TEMPLATE = mem*/postprd/{ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}.grib2 +FCST_ENSEMBLE_STAT_INPUT_TEMPLATE = mem*/postprd/{ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}.grib2 # Template to look for point observations. # Example precip24_2010010112.nc diff --git a/ush/templates/parm/metplus/EnsembleStat_conus_sfc.conf b/ush/templates/parm/metplus/EnsembleStat_conus_sfc.conf index e1d95040c..d66066422 100644 --- a/ush/templates/parm/metplus/EnsembleStat_conus_sfc.conf +++ b/ush/templates/parm/metplus/EnsembleStat_conus_sfc.conf @@ -325,7 +325,7 @@ PB2NC_OUTPUT_TEMPLATE = prepbufr.ndas.{valid?fmt=%Y%m%d%H}.nc # or a single line, - filename wildcard characters may be used, ? or *. FCST_ENSEMBLE_STAT_INPUT_TEMPLATE = - mem*/postprd/{ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}.grib2 + mem*/postprd/{ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}.grib2 # Template to look for point observations. # Example precip24_2010010112.nc diff --git a/ush/templates/parm/metplus/EnsembleStat_upper_air.conf b/ush/templates/parm/metplus/EnsembleStat_upper_air.conf index 6251fbac1..072926a3b 100644 --- a/ush/templates/parm/metplus/EnsembleStat_upper_air.conf +++ b/ush/templates/parm/metplus/EnsembleStat_upper_air.conf @@ -394,7 +394,7 @@ PB2NC_OUTPUT_TEMPLATE = prepbufr.ndas.{valid?fmt=%Y%m%d%H}.nc # or a single line, - filename wildcard characters may be used, ? or *. FCST_ENSEMBLE_STAT_INPUT_TEMPLATE = - mem*/postprd/{ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}.grib2 + mem*/postprd/{ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}.grib2 # Template to look for point observations. 
 # Example precip24_2010010112.nc
diff --git a/ush/templates/parm/metplus/GridStat_APCP01h.conf b/ush/templates/parm/metplus/GridStat_APCP01h.conf
index 240462dac..2c6aac65a 100644
--- a/ush/templates/parm/metplus/GridStat_APCP01h.conf
+++ b/ush/templates/parm/metplus/GridStat_APCP01h.conf
@@ -253,7 +253,7 @@ STAGING_DIR = {OUTPUT_BASE}/stage/APCP_01h
 [filename_templates]
 # Template to look for forecast input to GridStat relative to FCST_GRID_STAT_INPUT_DIR
-FCST_GRID_STAT_INPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}.grib2
+FCST_GRID_STAT_INPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}.grib2
 # Template to look for observation input to GridStat relative to OBS_GRID_STAT_INPUT_DIR
 OBS_GRID_STAT_INPUT_TEMPLATE = {valid?fmt=%Y%m%d}/ccpa.t{valid?fmt=%H}z.01h.hrap.conus.gb2
diff --git a/ush/templates/parm/metplus/GridStat_APCP03h.conf b/ush/templates/parm/metplus/GridStat_APCP03h.conf
index f6d629021..c01780617 100644
--- a/ush/templates/parm/metplus/GridStat_APCP03h.conf
+++ b/ush/templates/parm/metplus/GridStat_APCP03h.conf
@@ -283,7 +283,7 @@ STAGING_DIR = {OUTPUT_BASE}/stage/APCP_03h
 [filename_templates]
 # Template to look for forecast input to PCPCombine and GridStat relative to FCST_GRID_STAT_INPUT_DIR
-FCST_PCP_COMBINE_INPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}.grib2
+FCST_PCP_COMBINE_INPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}.grib2
 FCST_GRID_STAT_INPUT_TEMPLATE = {FCST_PCP_COMBINE_OUTPUT_TEMPLATE}
 # Template to look for observation input to PCPCombine and GridStat relative to OBS_GRID_STAT_INPUT_DIR
@@ -291,7 +291,7 @@ OBS_PCP_COMBINE_INPUT_TEMPLATE = {valid?fmt=%Y%m%d}/ccpa.t{valid?fmt=%H}z.01h.hr
 OBS_GRID_STAT_INPUT_TEMPLATE = {OBS_PCP_COMBINE_OUTPUT_TEMPLATE}
 # Optional subdirectories relative to GRID_STAT_OUTPUT_DIR to write output from PCPCombine and GridStat
-FCST_PCP_COMBINE_OUTPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}_a{level?fmt=%HH}h
+FCST_PCP_COMBINE_OUTPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}_a{level?fmt=%HH}h
 OBS_PCP_COMBINE_OUTPUT_TEMPLATE = {valid?fmt=%Y%m%d}/ccpa.t{valid?fmt=%H}z.hrap.conus.gb2_a{level?fmt=%HH}h
 GRID_STAT_OUTPUT_TEMPLATE = metprd/grid_stat
diff --git a/ush/templates/parm/metplus/GridStat_APCP06h.conf b/ush/templates/parm/metplus/GridStat_APCP06h.conf
index 7c8d69d1a..0e4d66531 100644
--- a/ush/templates/parm/metplus/GridStat_APCP06h.conf
+++ b/ush/templates/parm/metplus/GridStat_APCP06h.conf
@@ -283,7 +283,7 @@ STAGING_DIR = {OUTPUT_BASE}/stage/APCP_06h
 [filename_templates]
 # Template to look for forecast input to PCPCombine and GridStat relative to FCST_GRID_STAT_INPUT_DIR
-FCST_PCP_COMBINE_INPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}.grib2
+FCST_PCP_COMBINE_INPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}.grib2
 FCST_GRID_STAT_INPUT_TEMPLATE = {FCST_PCP_COMBINE_OUTPUT_TEMPLATE}
 # Template to look for observation input to PCPCombine and GridStat relative to OBS_GRID_STAT_INPUT_DIR
@@ -291,7 +291,7 @@ OBS_PCP_COMBINE_INPUT_TEMPLATE = {valid?fmt=%Y%m%d}/ccpa.t{valid?fmt=%H}z.01h.hr
 OBS_GRID_STAT_INPUT_TEMPLATE = {OBS_PCP_COMBINE_OUTPUT_TEMPLATE}
 # Optional subdirectories relative to GRID_STAT_OUTPUT_DIR to write output from PCPCombine and GridStat
-FCST_PCP_COMBINE_OUTPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}_a{level?fmt=%HH}h
+FCST_PCP_COMBINE_OUTPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}_a{level?fmt=%HH}h
 OBS_PCP_COMBINE_OUTPUT_TEMPLATE = {valid?fmt=%Y%m%d}/ccpa.t{valid?fmt=%H}z.hrap.conus.gb2_a{level?fmt=%HH}h
 GRID_STAT_OUTPUT_TEMPLATE = metprd/grid_stat
diff --git a/ush/templates/parm/metplus/GridStat_APCP24h.conf b/ush/templates/parm/metplus/GridStat_APCP24h.conf
index bcccfe8e1..7ae700cf3 100644
--- a/ush/templates/parm/metplus/GridStat_APCP24h.conf
+++ b/ush/templates/parm/metplus/GridStat_APCP24h.conf
@@ -283,7 +283,7 @@ STAGING_DIR = {OUTPUT_BASE}/stage/APCP_24h
 [filename_templates]
 # Template to look for forecast input to PCPCombine and GridStat relative to FCST_GRID_STAT_INPUT_DIR
-FCST_PCP_COMBINE_INPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}.grib2
+FCST_PCP_COMBINE_INPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}.grib2
 FCST_GRID_STAT_INPUT_TEMPLATE = {FCST_PCP_COMBINE_OUTPUT_TEMPLATE}
 # Template to look for observation input to PCPCombine and GridStat relative to OBS_GRID_STAT_INPUT_DIR
@@ -291,7 +291,7 @@ OBS_PCP_COMBINE_INPUT_TEMPLATE = {valid?fmt=%Y%m%d}/ccpa.t{valid?fmt=%H}z.01h.hr
 OBS_GRID_STAT_INPUT_TEMPLATE = {OBS_PCP_COMBINE_OUTPUT_TEMPLATE}
 # Optional subdirectories relative to GRID_STAT_OUTPUT_DIR to write output from PCPCombine and GridStat
-FCST_PCP_COMBINE_OUTPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}_a{level?fmt=%HH}h
+FCST_PCP_COMBINE_OUTPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}_a{level?fmt=%HH}h
 OBS_PCP_COMBINE_OUTPUT_TEMPLATE = {valid?fmt=%Y%m%d}/ccpa.t{valid?fmt=%H}z.hrap.conus.gb2_a{level?fmt=%HH}h
 GRID_STAT_OUTPUT_TEMPLATE = metprd/grid_stat
diff --git a/ush/templates/parm/metplus/GridStat_REFC.conf b/ush/templates/parm/metplus/GridStat_REFC.conf
index 7eade0cb1..027723470 100644
--- a/ush/templates/parm/metplus/GridStat_REFC.conf
+++ b/ush/templates/parm/metplus/GridStat_REFC.conf
@@ -263,7 +263,7 @@ STAGING_DIR = {OUTPUT_BASE}/stage/REFC
 [filename_templates]
 # Template to look for forecast input to GridStat relative to FCST_GRID_STAT_INPUT_DIR
-FCST_GRID_STAT_INPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}.grib2
+FCST_GRID_STAT_INPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}.grib2
 # Template to look for observation input to GridStat relative to OBS_GRID_STAT_INPUT_DIR
 OBS_GRID_STAT_INPUT_TEMPLATE = {valid?fmt=%Y%m%d}/MergedReflectivityQCComposite_00.50_{valid?fmt=%Y%m%d}-{valid?fmt=%H}0000.grib2
diff --git a/ush/templates/parm/metplus/GridStat_RETOP.conf b/ush/templates/parm/metplus/GridStat_RETOP.conf
index 4ef239d36..0b0029211 100644
--- a/ush/templates/parm/metplus/GridStat_RETOP.conf
+++ b/ush/templates/parm/metplus/GridStat_RETOP.conf
@@ -263,7 +263,7 @@ STAGING_DIR = {OUTPUT_BASE}/stage/RETOP
 [filename_templates]
 # Template to look for forecast input to GridStat relative to FCST_GRID_STAT_INPUT_DIR
-FCST_GRID_STAT_INPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}.grib2
+FCST_GRID_STAT_INPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}.grib2
 # Template to look for observation input to GridStat relative to OBS_GRID_STAT_INPUT_DIR
 OBS_GRID_STAT_INPUT_TEMPLATE = {valid?fmt=%Y%m%d}/EchoTop_18_00.50_{valid?fmt=%Y%m%d}-{valid?fmt=%H%M%S}.grib2
diff --git a/ush/templates/parm/metplus/PointStat_conus_sfc.conf b/ush/templates/parm/metplus/PointStat_conus_sfc.conf
index f6e231699..b3a2755a5 100644
--- a/ush/templates/parm/metplus/PointStat_conus_sfc.conf
+++ b/ush/templates/parm/metplus/PointStat_conus_sfc.conf
@@ -273,7 +273,7 @@ PB2NC_INPUT_TEMPLATE = prepbufr.ndas.{valid?fmt=%Y%m%d%H}
 PB2NC_OUTPUT_TEMPLATE = prepbufr.ndas.{valid?fmt=%Y%m%d%H}.nc
 # Template to look for forecast input to PointStat relative to FCST_POINT_STAT_INPUT_DIR
-FCST_POINT_STAT_INPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}.grib2
+FCST_POINT_STAT_INPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}.grib2
 # Template to look for observation input to PointStat relative to OBS_POINT_STAT_INPUT_DIR
 OBS_POINT_STAT_INPUT_TEMPLATE = prepbufr.ndas.{valid?fmt=%Y%m%d%H}.nc
diff --git a/ush/templates/parm/metplus/PointStat_upper_air.conf b/ush/templates/parm/metplus/PointStat_upper_air.conf
index 1f24cbab4..6110122c5 100644
--- a/ush/templates/parm/metplus/PointStat_upper_air.conf
+++ b/ush/templates/parm/metplus/PointStat_upper_air.conf
@@ -273,7 +273,7 @@ PB2NC_INPUT_TEMPLATE = prepbufr.ndas.{valid?fmt=%Y%m%d%H}
 PB2NC_OUTPUT_TEMPLATE = prepbufr.ndas.{valid?fmt=%Y%m%d%H}.nc
 # Template to look for forecast input to PointStat relative to FCST_POINT_STAT_INPUT_DIR
-FCST_POINT_STAT_INPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslevf{lead?fmt=%HHH}.tm{init?fmt=%H}.grib2
+FCST_POINT_STAT_INPUT_TEMPLATE = {ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}.grib2
 # Template to look for observation input to PointStat relative to OBS_POINT_STAT_INPUT_DIR
 OBS_POINT_STAT_INPUT_TEMPLATE = prepbufr.ndas.{valid?fmt=%Y%m%d%H}.nc
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357.facsf.tile7.halo0.nc b/ush/test_data/RRFS_CONUS_3km/C3357.facsf.tile7.halo0.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357.facsf.tile7.halo4.nc b/ush/test_data/RRFS_CONUS_3km/C3357.facsf.tile7.halo4.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357.maximum_snow_albedo.tile7.halo0.nc b/ush/test_data/RRFS_CONUS_3km/C3357.maximum_snow_albedo.tile7.halo0.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357.maximum_snow_albedo.tile7.halo4.nc b/ush/test_data/RRFS_CONUS_3km/C3357.maximum_snow_albedo.tile7.halo4.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357.slope_type.tile7.halo0.nc b/ush/test_data/RRFS_CONUS_3km/C3357.slope_type.tile7.halo0.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357.slope_type.tile7.halo4.nc b/ush/test_data/RRFS_CONUS_3km/C3357.slope_type.tile7.halo4.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357.snowfree_albedo.tile7.halo0.nc b/ush/test_data/RRFS_CONUS_3km/C3357.snowfree_albedo.tile7.halo0.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357.snowfree_albedo.tile7.halo4.nc b/ush/test_data/RRFS_CONUS_3km/C3357.snowfree_albedo.tile7.halo4.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357.soil_type.tile7.halo0.nc b/ush/test_data/RRFS_CONUS_3km/C3357.soil_type.tile7.halo0.nc
new file mode 100644
index 000000000..e69de29bb
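The forecast-side templates above now follow the post-processed file naming `{ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}.{ENV[POST_OUTPUT_DOMAIN_NAME]}.grib2`, with `NET` and `POST_OUTPUT_DOMAIN_NAME` supplied through the environment. As a minimal sketch (not part of this patch), the snippet below expands such a template for one sample cycle; the `expand_template` helper and the sample values `rrfs` and `conus_3km` are hypothetical, not values set by the workflow.

```
# Minimal sketch: expand a METplus-style filename template of the form used
# in the updated FCST_*_INPUT_TEMPLATE entries. Only the {init?fmt=...},
# {lead?fmt=%HHH}, and {ENV[...]} items that appear above are handled.
import os
import re
from datetime import datetime

def expand_template(template: str, init: datetime, lead_hrs: int) -> str:
    """Expand a small subset of METplus filename-template syntax."""
    def repl(match: re.Match) -> str:
        key, _, fmt = match.group(1).partition("?fmt=")
        if key.startswith("ENV[") and key.endswith("]"):
            return os.environ[key[4:-1]]          # e.g. ENV[NET]
        if key == "init":
            return init.strftime(fmt)             # e.g. %H -> "00"
        if key == "lead" and fmt == "%HHH":
            return f"{lead_hrs:03d}"              # zero-padded lead hour
        raise ValueError(f"unsupported template item: {match.group(0)}")
    return re.sub(r"\{([^{}]+)\}", repl, template)

if __name__ == "__main__":
    os.environ["NET"] = "rrfs"                        # sample value only
    os.environ["POST_OUTPUT_DOMAIN_NAME"] = "conus_3km"  # sample value only
    tmpl = ("{ENV[NET]}.t{init?fmt=%H}z.prslev.f{lead?fmt=%HHH}"
            ".{ENV[POST_OUTPUT_DOMAIN_NAME]}.grib2")
    print(expand_template(tmpl, datetime(2019, 7, 1, 0), 6))
    # -> rrfs.t00z.prslev.f006.conus_3km.grib2
```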
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357.soil_type.tile7.halo4.nc b/ush/test_data/RRFS_CONUS_3km/C3357.soil_type.tile7.halo4.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357.substrate_temperature.tile7.halo0.nc b/ush/test_data/RRFS_CONUS_3km/C3357.substrate_temperature.tile7.halo0.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357.substrate_temperature.tile7.halo4.nc b/ush/test_data/RRFS_CONUS_3km/C3357.substrate_temperature.tile7.halo4.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357.vegetation_greenness.tile7.halo0.nc b/ush/test_data/RRFS_CONUS_3km/C3357.vegetation_greenness.tile7.halo0.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357.vegetation_greenness.tile7.halo4.nc b/ush/test_data/RRFS_CONUS_3km/C3357.vegetation_greenness.tile7.halo4.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357.vegetation_type.tile7.halo0.nc b/ush/test_data/RRFS_CONUS_3km/C3357.vegetation_type.tile7.halo0.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357.vegetation_type.tile7.halo4.nc b/ush/test_data/RRFS_CONUS_3km/C3357.vegetation_type.tile7.halo4.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357_grid.tile7.halo3.nc b/ush/test_data/RRFS_CONUS_3km/C3357_grid.tile7.halo3.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357_grid.tile7.halo4.nc b/ush/test_data/RRFS_CONUS_3km/C3357_grid.tile7.halo4.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357_grid.tile7.halo6.nc b/ush/test_data/RRFS_CONUS_3km/C3357_grid.tile7.halo6.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357_mosaic.halo3.nc b/ush/test_data/RRFS_CONUS_3km/C3357_mosaic.halo3.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357_mosaic.halo4.nc b/ush/test_data/RRFS_CONUS_3km/C3357_mosaic.halo4.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357_mosaic.halo6.nc b/ush/test_data/RRFS_CONUS_3km/C3357_mosaic.halo6.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357_oro_data.tile7.halo0.nc b/ush/test_data/RRFS_CONUS_3km/C3357_oro_data.tile7.halo0.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357_oro_data.tile7.halo4.nc b/ush/test_data/RRFS_CONUS_3km/C3357_oro_data.tile7.halo4.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357_oro_data_ls.tile7.halo0.nc b/ush/test_data/RRFS_CONUS_3km/C3357_oro_data_ls.tile7.halo0.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/RRFS_CONUS_3km/C3357_oro_data_ss.tile7.halo0.nc b/ush/test_data/RRFS_CONUS_3km/C3357_oro_data_ss.tile7.halo0.nc
new file mode 100644
index 000000000..e69de29bb
diff --git a/ush/test_data/suite_FV3_GSD_SAR.xml b/ush/test_data/suite_FV3_GSD_SAR.xml
new file mode 100644
index 000000000..f2b38d577
--- /dev/null
+++ b/ush/test_data/suite_FV3_GSD_SAR.xml
@@ -0,0 +1,85 @@
+
+
+
+
+
+
+ GFS_time_vary_pre
+ GFS_rrtmg_setup
+ GFS_rad_time_vary
+ GFS_phys_time_vary
+
+
+
+
+ GFS_suite_interstitial_rad_reset
+ sgscloud_radpre
+ GFS_rrtmg_pre
+ GFS_radiation_surface
+ rrtmg_sw_pre
+ rrtmg_sw
+ rrtmg_sw_post
+ rrtmg_lw_pre
+ rrtmg_lw
+ sgscloud_radpost
+ rrtmg_lw_post
+ GFS_rrtmg_post
+
+
+
+
+ GFS_suite_interstitial_phys_reset
+ GFS_suite_stateout_reset
+ get_prs_fv3
+ GFS_suite_interstitial_1
+ GFS_surface_generic_pre
+ GFS_surface_composites_pre
+ dcyc2t3
+ GFS_surface_composites_inter
+ GFS_suite_interstitial_2
+
+
+
+ sfc_diff
+ GFS_surface_loop_control_part1
+ sfc_nst_pre
+ sfc_nst
+ sfc_nst_post
+ lsm_ruc
+ GFS_surface_loop_control_part2
+
+
+
+ GFS_surface_composites_post
+ sfc_diag
+ sfc_diag_post
+ GFS_surface_generic_post
+ mynnedmf_wrapper
+ GFS_GWD_generic_pre
+ cires_ugwp
+ cires_ugwp_post
+ GFS_GWD_generic_post
+ GFS_suite_stateout_update
+ ozphys_2015
+ h2ophys
+ get_phi_fv3
+
+ GFS_suite_interstitial_3
+ GFS_suite_interstitial_4
+
+ GFS_MP_generic_pre
+ mp_thompson_pre
+ mp_thompson
+ mp_thompson_post
+ GFS_MP_generic_post
+ maximum_hourly_diagnostics
+ phys_tend
+
+
+
+
+ GFS_stochastics
+
+
+
+
diff --git a/ush/valid_param_vals.sh b/ush/valid_param_vals.sh
index d7fdd9f07..69c08984e 100644
--- a/ush/valid_param_vals.sh
+++ b/ush/valid_param_vals.sh
@@ -4,7 +4,7 @@ valid_vals_RUN_ENVIR=("nco" "community")
 valid_vals_VERBOSE=("TRUE" "true" "YES" "yes" "FALSE" "false" "NO" "no")
 valid_vals_DEBUG=("TRUE" "true" "YES" "yes" "FALSE" "false" "NO" "no")
-valid_vals_MACHINE=("WCOSS_DELL_P3" "HERA" "ORION" "JET" "ODIN" "CHEYENNE" "STAMPEDE" "LINUX" "MACOS" "NOAACLOUD" "SINGULARITY")
+valid_vals_MACHINE=("WCOSS_DELL_P3" "HERA" "ORION" "JET" "ODIN" "CHEYENNE" "STAMPEDE" "LINUX" "MACOS" "NOAACLOUD" "SINGULARITY" "GAEA")
 valid_vals_SCHED=("slurm" "pbspro" "lsf" "lsfcray" "none")
 valid_vals_FCST_MODEL=("ufs-weather-model" "fv3gfs_aqm")
 valid_vals_WORKFLOW_MANAGER=("rocoto" "none")
@@ -31,6 +31,7 @@ valid_vals_PREDEF_GRID_NAME=( \
 "RRFS_NA_13km" \
 "RRFS_NA_3km" \
 "SUBCONUS_Ind_3km" \
+"WoFS_3km" \
 )
 valid_vals_CCPP_PHYS_SUITE=( \
 "FV3_GFS_2017_gfdlmp" \
@@ -41,7 +42,7 @@ valid_vals_CCPP_PHYS_SUITE=( \
 "FV3_RRFS_v1beta" \
 "FV3_RRFS_v1alpha" \
 "FV3_HRRR" \
-)
+)
 valid_vals_GFDLgrid_RES=("48" "96" "192" "384" "768" "1152" "3072")
 valid_vals_EXTRN_MDL_NAME_ICS=("GSMGFS" "FV3GFS" "RAP" "HRRR" "NAM")
 valid_vals_EXTRN_MDL_NAME_LBCS=("GSMGFS" "FV3GFS" "RAP" "HRRR" "NAM")
diff --git a/ush/valid_param_vals.yaml b/ush/valid_param_vals.yaml
new file mode 100644
index 000000000..efadce608
--- /dev/null
+++ b/ush/valid_param_vals.yaml
@@ -0,0 +1,85 @@
+#
+# Define valid values for various global experiment/workflow variables.
+#
+valid_vals_RUN_ENVIR: ["nco", "community"]
+valid_vals_VERBOSE: [True, False]
+valid_vals_DEBUG: [True, False]
+valid_vals_MACHINE: ["WCOSS_DELL_P3", "HERA", "ORION", "JET", "ODIN", "CHEYENNE", "STAMPEDE", "LINUX", "MACOS", "NOAACLOUD", "SINGULARITY", "GAEA"]
+valid_vals_SCHED: ["slurm", "pbspro", "lsf", "lsfcray", "none"]
+valid_vals_FCST_MODEL: ["ufs-weather-model", "fv3gfs_aqm"]
+valid_vals_WORKFLOW_MANAGER: ["rocoto", "none"]
+valid_vals_PREDEF_GRID_NAME: [
+"RRFS_CONUS_25km",
+"RRFS_CONUS_13km",
+"RRFS_CONUS_3km",
+"RRFS_CONUScompact_25km",
+"RRFS_CONUScompact_13km",
+"RRFS_CONUScompact_3km",
+"RRFS_SUBCONUS_3km",
+"RRFS_AK_13km",
+"RRFS_AK_3km",
+"CONUS_25km_GFDLgrid",
+"CONUS_3km_GFDLgrid",
+"EMC_AK",
+"EMC_HI",
+"EMC_PR",
+"EMC_GU",
+"GSL_HAFSV0.A_25km",
+"GSL_HAFSV0.A_13km",
+"GSL_HAFSV0.A_3km",
+"GSD_HRRR_AK_50km",
+"RRFS_NA_13km",
+"RRFS_NA_3km",
+"SUBCONUS_Ind_3km",
+"WoFS_3km"
+]
+valid_vals_CCPP_PHYS_SUITE: [
+"FV3_GFS_2017_gfdlmp",
+"FV3_GFS_2017_gfdlmp_regional",
+"FV3_GFS_v15p2",
+"FV3_GFS_v15_thompson_mynn_lam3km",
+"FV3_GFS_v16",
+"FV3_RRFS_v1beta",
+"FV3_RRFS_v1alpha",
+"FV3_HRRR"
+]
+valid_vals_GFDLgrid_RES: [48, 96, 192, 384, 768, 1152, 3072]
+valid_vals_EXTRN_MDL_NAME_ICS: ["GSMGFS", "FV3GFS", "RAP", "HRRR", "NAM"]
+valid_vals_EXTRN_MDL_NAME_LBCS: ["GSMGFS", "FV3GFS", "RAP", "HRRR", "NAM"]
+valid_vals_USE_USER_STAGED_EXTRN_FILES: [True, False]
+valid_vals_FV3GFS_FILE_FMT_ICS: ["nemsio", "grib2", "netcdf"]
+valid_vals_FV3GFS_FILE_FMT_LBCS: ["nemsio", "grib2", "netcdf"]
+valid_vals_GRID_GEN_METHOD: ["GFDLgrid", "ESGgrid"]
+valid_vals_PREEXISTING_DIR_METHOD: ["delete", "rename", "quit"]
+valid_vals_GTYPE: ["regional"]
+valid_vals_WRTCMP_output_grid: ["rotated_latlon", "lambert_conformal", "regional_latlon"]
+valid_vals_RUN_TASK_MAKE_GRID: [True, False]
+valid_vals_RUN_TASK_MAKE_OROG: [True, False]
+valid_vals_RUN_TASK_MAKE_SFC_CLIMO: [True, False]
+valid_vals_RUN_TASK_RUN_POST: [True, False]
+valid_vals_WRITE_DOPOST: [True, False]
+valid_vals_RUN_TASK_VX_GRIDSTAT: [True, False]
+valid_vals_RUN_TASK_VX_POINTSTAT: [True, False]
+valid_vals_RUN_TASK_VX_ENSGRID: [True, False]
+valid_vals_RUN_TASK_VX_ENSPOINT: [True, False]
+valid_vals_QUILTING: [True, False]
+valid_vals_PRINT_ESMF: [True, False]
+valid_vals_USE_CRON_TO_RELAUNCH: [True, False]
+valid_vals_DOT_OR_USCORE: [".", "_"]
+valid_vals_NOMADS: [True, False]
+valid_vals_NOMADS_file_type: ["GRIB2", "grib2", "NEMSIO", "nemsio"]
+valid_vals_DO_ENSEMBLE: [True, False]
+valid_vals_USE_CUSTOM_POST_CONFIG_FILE: [True, False]
+valid_vals_USE_CRTM: [True, False]
+valid_vals_DO_SHUM: [True, False]
+valid_vals_DO_SPPT: [True, False]
+valid_vals_DO_SPP: [True, False]
+valid_vals_DO_LSM_SPP: [True, False]
+valid_vals_DO_SKEB: [True, False]
+valid_vals_USE_ZMTNBLCK: [True, False]
+valid_vals_USE_FVCOM: [True, False]
+valid_vals_FVCOM_WCSTART: ["warm", "WARM", "cold", "COLD"]
+valid_vals_COMPILER: ["intel", "gnu"]
+valid_vals_SUB_HOURLY_POST: [True, False]
+valid_vals_DT_SUBHOURLY_POST_MNTS: [0, 1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30]
+valid_vals_USE_MERRA_CLIMO: [True, False]
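The new `valid_param_vals.yaml` mirrors the bash arrays in `valid_param_vals.sh`, with the TRUE/FALSE string variants replaced by YAML booleans, so the Python workflow path can read it directly. The sketch below is one possible way to consume it; the helper names `load_valid_param_vals` and `check_var` and the relative path are illustrative assumptions, not the python_utils API added by this PR.

```
# Illustrative sketch only: load valid_param_vals.yaml and check an
# experiment variable against it. Helper names and the path are assumptions.
import yaml  # PyYAML

def load_valid_param_vals(path="ush/valid_param_vals.yaml"):
    """Read the YAML file into a dict of {variable_name: [valid values]}."""
    with open(path) as f:
        return yaml.safe_load(f)

def check_var(valid_vals, name, value):
    """Raise if 'value' is not among the values listed for 'name'."""
    allowed = valid_vals.get(f"valid_vals_{name}")
    if allowed is None:
        return  # no constraint recorded for this variable
    if value not in allowed:
        raise ValueError(
            f"{name}={value!r} is not valid; allowed values: {allowed}"
        )

if __name__ == "__main__":
    vals = load_valid_param_vals()
    check_var(vals, "MACHINE", "GAEA")               # platform added in this PR
    check_var(vals, "PREDEF_GRID_NAME", "WoFS_3km")  # grid added in this PR
```

Because entries such as `valid_vals_DEBUG: [True, False]` load as Python booleans, a caller comparing against a boolean flag needs no extra string-to-boolean conversion, unlike the "TRUE"/"true"/"YES" variants kept in the bash version.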