From 391585962def7b4e836dfb0661eff8ebbacb85fe Mon Sep 17 00:00:00 2001 From: Christina Holt <56881914+christinaholtNOAA@users.noreply.github.com> Date: Tue, 7 Jun 2022 14:38:17 -0600 Subject: [PATCH] rrfs_ci: Update to top of authoritative develop (#156) * Add support on NSSL/Odin (#227) * Add support on NSSL/Odin * Add wlfow_odin.env and modify detect_machine.sh * update comment * Detect explicitly odin1/odin2 * Add python module to cheyenne build environments (#232) * Update SRW Documentation (#212) * updated docs * added git submodule * fix formatting * added new submodule commits * fixed ref links * finished Intro * finish Components & Intro edits * edited Rocoto workflow section of Quickstart * added minor hpc submodule commits * Updates to Rocoto Workflow in Quick Start * add to HPC-stack intro * submodule updates * added submodule docs edits * hpc-stack updates & formatting fixes * hpc-stack intro edits * bibtex attempted fix * add hpc-stack module edits * update sphinxcontrib version * add .readthedocs.yaml file * update .readthedocs.yaml file * update .readthedocs.yaml file * update conf.py * updates .readthedocs.yaml with submodules * updates .readthedocs.yaml with submodules * submodule updates * submodule updates * minor Intro edits * minor Intro edits * minor Intro edits * submodule updates * fixed typos in QS * QS updates * QS updates * QS updates * updates to InputOutput and QS * fix I/O doc typos * pull updates to hpc-stack docs * pull updates to hpc-stack docs * fix table wrapping * updates to QS for cloud * fix QS export statements * fix QS export statements * QS edits on bind, config * add bullet points to notes * running without rocoto * add HPC-Stack submodule w/docs * split QS into container/non-container approaches * added filepath changes for running in container on Orion, et al. * edits to overview and container QS * moved CodeReposAndDirs.rst info to the Introduction & deleted file * continued edits to SRWAppOverview * combine overview w/non-container docs * finish merging non-container guide & SRWOverview, rename/remove files, update FAQ * minor edits for Intro & QS * updates to BuildRun doc through 3.8.1 * edits to Build/Run and Components * remove .gitignore * fix Ch 3 title, 4 supported platform levels note * fix typos, add term links * other minor fixes/suggestions implemented * updated Intro based on feedback; changed SRW to SRW App throughout * update comment to Intro citation * add user-defined vertical levels to future work * Add instructions for srw_common module load * fix typo * update Intro & BuildRunSRW based on Mark's feedback * minor intro updates * 1st round of jwolff's edits * 2nd round of jwolff updates * update QS intro * fix minor physics details * update citation and physics suite name * add compute node allocation info to QS * add authoritative hpc-stack docs to Intro Co-authored-by: gspetro * Update hashes of all components to latest versions (#233) * Add a Contributor's Guide feature to the docs (#228) * create contributor's guide * add guidelines for making good PR * good pull request edits * 1st Draft of COntributor's Guide * minor formatting edits * Add instructions for srw_common module load * fix module files guidance * fix typo * update Intro & BuildRunSRW based on Mark's feedback * minor intro updates * add info on documentation requirements * 1st round of J. 
Beck's edits * add textonly issue template + minor typo-type fixes * minor edits/formatting * remove pylintrc requirement * fixed .yaml & other minor Co-authored-by: gspetro * Updates to parameters in config_defaults .rst files (#237) * edit config intro & platform environment sections * edit sections on cron & directory parameters, platform & parameters for running without a workflow manager * edit NCO, file-separator, filename params, add some METplus and model config params * ConfigWorkflow.rst revisions, added METplus to Components, grid info * add grid config details * changes to readme.md * RTD readme.md edits * create MacOS install/build instructions * update task run and grid parameters * fixed file params & workflow task params * 1st draft of ConfigParameters.inc * minor edits * add stochastic physics var details * update FVCOM, thread affinity params * halo_blend, ens, crtm, custom post, subhourly updates * update HPC-Stack submodule/docs * Rocoto WF tasks & params * workflow tasks/params, debug, verbose, pre-existing dir, predefined grid * move Stochastic physics to CCP section; write component edits * comp'l forecast, grid gen, NOMADS, user-staged files * METplus, model config & forecast params, separator * 2nd draft complete * physics updates * remove MacInstall empty file * undo hpc-stack submodule update (save for separate PR) * undo hpc-stack install doc update (save for separate PR) * revisions to SPP & LSM physics * minor edits * update comments in LAM Grid chapter * update LSM_SPP_EACH_STEP * revert LSM_SPP_EACH_STEP to original definition * combine config info into one doc instead of two Co-authored-by: gspetro * Added functionality for MacOS X (#242) * Added functionality for MacOS X Functionality for MacOS, updated module list in srw_common * Update build_macos_gnu.env * Update srw_common * Update build_macos_gnu.env The env/build_mac_gnu.env does not load srw_common module, but instead loads individual HPC-stack modules built locally on the Mac that contain higher-versions of some packages. This avoids conflicts with SRW builds for other platforms. * Update srw_common corrected the version of the gftl-shared * Update build_macos_gnu.env No need to load libpng module found in srw_common, as this is not being built as a part of the hpc-stack on MacOS X, rather installed system-wide. Co-authored-by: Natalie Perlin * Add gaea to supported platforms (#236) * fixes for gaea * updates for gaea * tweak for build env * 2nd tweak for build env * Fixes for slurm * another fix for env * added version for cmake * Update Externals.cfg * Update wflow_gaea.env * tweak * pulling Externals.cfg explicitly from develop * temporarily removing externals * Checked out directly from develop * Bug fix with singularity env files (#245) * Replace bash env files with modules (#238) * Pass machine name to build scripts. * Use modules environment instead of shell scripts. * Leave conda activation to the user. * Remove set_machine script. * Rename env to modulefiles * Minor fix. * Minor fix * Take out *module purge* from modufiles and put it in devbuild.sh * Activate conda directly in signularity modulefile. * Minor fixes. * Add Gaea modulefiles. * Restore odin env files. * Bug fixes in singularity modulefiles. * Move activation of Lmod to devbuild.sh * Don't do 'module purge' on cray systems * Put Lmod initialization code in separate script. * Go back to using modulefile for odin. * Optionally pass machine name to lmod-setup.sh * Modify odin wflow modulefile. 
* Allow unknown platforms in devbuild.sh * Update documentation. * Move cmake init out of lmod-setup.sh on odin * Also update markup language build documentation. * Lmod setup script for both bash and tcsh login shells. * Some fixes for tcsh login shell. * Add singularity platform to lmod-setup * update hash of regional workflow (#247) * Update WE2E documentation (#241) ## DESCRIPTION OF CHANGES: This updates the documentation for how to use the WE2E testing system. ## TESTS CONDUCTED: Compiled the documentation using `sphinx-build` and viewed the resulting html. ## DEPENDENCIES: PR #[745](https://github.com/ufs-community/regional_workflow/pull/745) in regional_workflow. ## CONTRIBUTORS: @gspetro * fixes for gaea modules (#248) * Update regional_workflow hash, add shortcuts for common devbuild.sh options (#251) * Modifications to `CODEOWNERS` file (#252) ## DESCRIPTION OF CHANGES: Make the following modifications to the github `CODEOWNERS` file: 1) Add several EPIC staff (@gspetro-NOAA, @natalie-perlin, and @EdwardSnyder-NOAA) so they are notified of all PRs and can review them. 2) Remove duplicate entries. 3) Remove users who will no longer be working with the repo (thus far only @jwolff-ncar) . ## TESTS CONDUCTED: None. * Add verification tasks to documentation (#243) * METplus, model config & forecast params, separator * remove MacInstall empty file * undo hpc-stack submodule update (save for separate PR) * undo hpc-stack install doc update (save for separate PR) * combine config info into one doc instead of two * remove ConfigParameters.inc (contents now appear in ConfigWorkflow.rst) * add VX tables, config info, & Rocoto output tables * add module use/load statements, fix typos * varied minor details * add workflow svg diagram * condense VX task table using ## * update README * add png and revert hpc-stack commits until PR#240 (mac docs) is approved * jwolff edits * add info on run_vx.local Co-authored-by: gspetro * Add NOAA cloud platforms to SRW (#221) * updates for noaacloud * working version * pointing to noaa-epic for testing * changes for noaacloud * switched to load-any * fix for regional_workflow pointer (#260) * update hash of regional workflow (#261) * Feature/cheyenne fix (#258) * Adding a github action to build on cheyenne with intel * fixing yml * fixes for missing load-any on cheyenne * added pio as well * Update .github/workflows/build.yml Co-authored-by: Will Mayfield <59745143+willmayfield@users.noreply.github.com> Co-authored-by: Will Mayfield <59745143+willmayfield@users.noreply.github.com> * tweaks for build/run on gaea (#254) * tweaks for build/run on gaea * fixed path * Updates for PR * Check-in Jenkinsfile and unified scripts (#253) * Add Jenkinsfile that includes a build and test pipeline, which leverages the unified build and test scripts * Supported platforms are Cheyenne, Gaea, and Orion * Supported compilers are GNU and Intel * Fix for miniconda3 load on Hera (#257) Pin down the version of miniconda3 on Hera, and do not append to the module path * Updates to Remaining Chapters (6 & 8-12) of SRW Docs/User's Guide (#255) * updated docs * added git submodule * fix formatting * added new submodule commits * fixed ref links * finished Intro * finish Components & Intro edits * edited Rocoto workflow section of Quickstart * added minor hpc submodule commits * Updates to Rocoto Workflow in Quick Start * add to HPC-stack intro * submodule updates * added submodule docs edits * hpc-stack updates & formatting fixes * hpc-stack intro edits * bibtex attempted fix * add 
hpc-stack module edits * update sphinxcontrib version * add .readthedocs.yaml file * update .readthedocs.yaml file * update .readthedocs.yaml file * update conf.py * updates .readthedocs.yaml with submodules * updates .readthedocs.yaml with submodules * submodule updates * submodule updates * minor Intro edits * minor Intro edits * minor Intro edits * submodule updates * fixed typos in QS * QS updates * QS updates * QS updates * updates to InputOutput and QS * fix I/O doc typos * pull updates to hpc-stack docs * pull updates to hpc-stack docs * fix table wrapping * updates to QS for cloud * fix QS export statements * fix QS export statements * QS edits on bind, config * add bullet points to notes * running without rocoto * add HPC-Stack submodule w/docs * split QS into container/non-container approaches * added filepath changes for running in container on Orion, et al. * edits to overview and container QS * moved CodeReposAndDirs.rst info to the Introduction & deleted file * continued edits to SRWAppOverview * combine overview w/non-container docs * finish merging non-container guide & SRWOverview, rename/remove files, update FAQ * minor edits for Intro & QS * updates to BuildRun doc through 3.8.1 * edits to Build/Run and Components * remove .gitignore * fix Ch 3 title, 4 supported platform levels note * fix typos, add term links * other minor fixes/suggestions implemented * updated Intro based on feedback; changed SRW to SRW App throughout * update comment to Intro citation * add user-defined vertical levels to future work * Add instructions for srw_common module load * fix typo * update Intro & BuildRunSRW based on Mark's feedback * minor intro updates * 1st round of jwolff's edits * 2nd round of jwolff updates * update QS intro * fix minor physics details * update citation and physics suite name * add compute node allocation info to QS * add authoritative hpc-stack docs to Intro * edit config intro & platform environment sections * edit sections on cron & directory parameters, platform & parameters for running without a workflow manager * edit NCO, file-separator, filename params, add some METplus and model config params * ConfigWorkflow.rst revisions, added METplus to Components, grid info * add grid config details * changes to readme.md * RTD readme.md edits * create MacOS install/build instructions * update task run and grid parameters * fixed file params & workflow task params * 1st draft of ConfigParameters.inc * minor edits * add stochastic physics var details * update FVCOM, thread affinity params * halo_blend, ens, crtm, custom post, subhourly updates * update HPC-Stack submodule/docs * remove extra macinstall document * Rocoto WF tasks & params * workflow tasks/params, debug, verbose, pre-existing dir, predefined grid * move Stochastic physics to CCP section; write component edits * comp'l forecast, grid gen, NOMADS, user-staged files * METplus, model config & forecast params, separator * 2nd draft complete * physics updates * remove MacInstall empty file * undo hpc-stack submodule update (save for separate PR) * undo hpc-stack install doc update (save for separate PR) * revert hpc-stack submodule update * revisions to SPP & LSM physics * minor edits * update comments in LAM Grid chapter * update LSM_SPP_EACH_STEP * revert LSM_SPP_EACH_STEP to original definition * combine config info into one doc instead of two * remove ConfigParameters.inc (contents now appear in ConfigWorkflow.rst) * update hpc-stack docs submodule * odds & ends * add VX tables, config info, & Rocoto output 
tables * add module use/load statements, fix typos * varied minor details * add workflow svg diagram * edits to rocoto ch * updates to Rocoto chapter * fix minor formatting/wording issues * updates to LAMgrid chapter * LAM Grid edits * LAM ch: user-defined grid section * add UPP Product tables ch 6 * I/O edits & glossary terms * I/O Pt2 * I/O changes * include updated images * update docs to reflect changes in PR #238 * Graphics Ch-1st pass * minor updates to Graphics * minor updates to Graphics * edit ConfigNewPlatform sections 1-4 * ConfigNewPlatform edits * resolve merge conflicts * I/O ch edits * I/O edits * more I/O edits * hpc-stack submodule updates * add HPC-Stack MacOs info * WE2E edits & tables * fix typo * minor grammar/typos * merge conflict resolution * merge conflict resolution * fix grid name * remove resolved comments * add compact grids * file path updates & info for HPC-Stack * add SRW prereqs to Intro * change ConfigNewPlatform to a non-container quickstart * clean up non-container quickstart * update build options for non-container QS * update file paths & WE2E * minor fixes * update I/O & Gaea file paths * update error in non-container QS * add warning for users w/o Rocoto * add UPP Satellite Product instructions * Xlink for UPP satellite output info Co-authored-by: gspetro * Update compiler prerequisite in docs (#267) * updated docs * added git submodule * fix formatting * added new submodule commits * fixed ref links * finished Intro * finish Components & Intro edits * edited Rocoto workflow section of Quickstart * added minor hpc submodule commits * Updates to Rocoto Workflow in Quick Start * add to HPC-stack intro * submodule updates * added submodule docs edits * hpc-stack updates & formatting fixes * hpc-stack intro edits * bibtex attempted fix * add hpc-stack module edits * update sphinxcontrib version * add .readthedocs.yaml file * update .readthedocs.yaml file * update .readthedocs.yaml file * update conf.py * updates .readthedocs.yaml with submodules * updates .readthedocs.yaml with submodules * submodule updates * submodule updates * minor Intro edits * minor Intro edits * minor Intro edits * submodule updates * fixed typos in QS * QS updates * QS updates * QS updates * updates to InputOutput and QS * fix I/O doc typos * pull updates to hpc-stack docs * pull updates to hpc-stack docs * fix table wrapping * updates to QS for cloud * fix QS export statements * fix QS export statements * QS edits on bind, config * add bullet points to notes * running without rocoto * add HPC-Stack submodule w/docs * split QS into container/non-container approaches * added filepath changes for running in container on Orion, et al. 
* edits to overview and container QS * moved CodeReposAndDirs.rst info to the Introduction & deleted file * continued edits to SRWAppOverview * combine overview w/non-container docs * finish merging non-container guide & SRWOverview, rename/remove files, update FAQ * minor edits for Intro & QS * updates to BuildRun doc through 3.8.1 * edits to Build/Run and Components * remove .gitignore * fix Ch 3 title, 4 supported platform levels note * fix typos, add term links * other minor fixes/suggestions implemented * updated Intro based on feedback; changed SRW to SRW App throughout * update comment to Intro citation * add user-defined vertical levels to future work * Add instructions for srw_common module load * fix typo * update Intro & BuildRunSRW based on Mark's feedback * minor intro updates * 1st round of jwolff's edits * 2nd round of jwolff updates * update QS intro * fix minor physics details * update citation and physics suite name * add compute node allocation info to QS * add authoritative hpc-stack docs to Intro * edit config intro & platform environment sections * edit sections on cron & directory parameters, platform & parameters for running without a workflow manager * edit NCO, file-separator, filename params, add some METplus and model config params * ConfigWorkflow.rst revisions, added METplus to Components, grid info * add grid config details * changes to readme.md * RTD readme.md edits * create MacOS install/build instructions * update task run and grid parameters * fixed file params & workflow task params * 1st draft of ConfigParameters.inc * minor edits * add stochastic physics var details * update FVCOM, thread affinity params * halo_blend, ens, crtm, custom post, subhourly updates * update HPC-Stack submodule/docs * remove extra macinstall document * Rocoto WF tasks & params * workflow tasks/params, debug, verbose, pre-existing dir, predefined grid * move Stochastic physics to CCP section; write component edits * comp'l forecast, grid gen, NOMADS, user-staged files * METplus, model config & forecast params, separator * 2nd draft complete * physics updates * remove MacInstall empty file * undo hpc-stack submodule update (save for separate PR) * undo hpc-stack install doc update (save for separate PR) * revert hpc-stack submodule update * revisions to SPP & LSM physics * minor edits * update comments in LAM Grid chapter * update LSM_SPP_EACH_STEP * revert LSM_SPP_EACH_STEP to original definition * combine config info into one doc instead of two * remove ConfigParameters.inc (contents now appear in ConfigWorkflow.rst) * update hpc-stack docs submodule * odds & ends * add VX tables, config info, & Rocoto output tables * add module use/load statements, fix typos * varied minor details * add workflow svg diagram * edits to rocoto ch * updates to Rocoto chapter * fix minor formatting/wording issues * updates to LAMgrid chapter * LAM Grid edits * LAM ch: user-defined grid section * add UPP Product tables ch 6 * I/O edits & glossary terms * I/O Pt2 * I/O changes * include updated images * update docs to reflect changes in PR #238 * Graphics Ch-1st pass * minor updates to Graphics * minor updates to Graphics * edit ConfigNewPlatform sections 1-4 * ConfigNewPlatform edits * resolve merge conflicts * I/O ch edits * I/O edits * more I/O edits * hpc-stack submodule updates * add HPC-Stack MacOs info * WE2E edits & tables * fix typo * minor grammar/typos * merge conflict resolution * merge conflict resolution * fix grid name * remove resolved comments * add compact grids * file path 
updates & info for HPC-Stack * add SRW prereqs to Intro * change ConfigNewPlatform to a non-container quickstart * clean up non-container quickstart * update build options for non-container QS * update file paths & WE2E * minor fixes * update I/O & Gaea file paths * update error in non-container QS * add warning for users w/o Rocoto * add UPP Satellite Product instructions * Xlink for UPP satellite output info * clarify compiler prereqs Co-authored-by: gspetro Co-authored-by: Yunheng Wang <47898913+ywangwof@users.noreply.github.com> Co-authored-by: Will Mayfield <59745143+willmayfield@users.noreply.github.com> Co-authored-by: Gillian Petro <96886803+gspetro-NOAA@users.noreply.github.com> Co-authored-by: gspetro Co-authored-by: Michael Kavulich Co-authored-by: Natalie Perlin <68030316+natalie-perlin@users.noreply.github.com> Co-authored-by: Natalie Perlin Co-authored-by: Mark Potts <33099090+mark-a-potts@users.noreply.github.com> Co-authored-by: danielabdi-noaa <52012304+danielabdi-noaa@users.noreply.github.com> Co-authored-by: Chan-Hoo.Jeon-NOAA <60152248+chan-hoo@users.noreply.github.com> Co-authored-by: gsketefian <31046882+gsketefian@users.noreply.github.com> Co-authored-by: Jesse McFarland --- .cicd/Jenkinsfile | 169 ++ .cicd/scripts/srw_build.sh | 31 + .cicd/scripts/srw_test.sh | 21 + .github/CODEOWNERS | 1 - .github/ISSUE_TEMPLATE/textonly_request.md | 20 + .github/workflows/build.yml | 21 + .gitmodules | 3 + .readthedocs.yaml | 35 + Externals.cfg | 6 +- README.md | 7 +- devbuild.sh | 61 +- docs/INSTALL | 36 +- docs/RUNTIME | 19 +- docs/UsersGuide/README | 29 +- docs/UsersGuide/build/.gitignore | 4 - docs/UsersGuide/source/BuildRunSRW.rst | 1113 ++++++++++++++ docs/UsersGuide/source/CodeReposAndDirs.rst | 261 ---- docs/UsersGuide/source/CompleteTests.csv | 28 + docs/UsersGuide/source/CompleteTests.rst | 8 + docs/UsersGuide/source/Components.rst | 94 ++ docs/UsersGuide/source/ConfigNewPlatform.rst | 388 ----- docs/UsersGuide/source/ConfigParameters.inc | 338 ---- docs/UsersGuide/source/ConfigWorkflow.rst | 1363 +++++++++++++++-- docs/UsersGuide/source/ContributorsGuide.rst | 353 +++++ docs/UsersGuide/source/FAQ.rst | 48 +- docs/UsersGuide/source/Glossary.rst | 166 +- docs/UsersGuide/source/Graphics.rst | 130 +- docs/UsersGuide/source/Include-HPCInstall.rst | 9 + docs/UsersGuide/source/InputOutputFiles.rst | 416 ++--- docs/UsersGuide/source/Introduction.rst | 626 +++++--- docs/UsersGuide/source/LAMGrids.rst | 210 ++- docs/UsersGuide/source/Non-ContainerQS.rst | 123 ++ docs/UsersGuide/source/Quickstart.rst | 445 +++--- docs/UsersGuide/source/RocotoInfo.rst | 231 ++- docs/UsersGuide/source/SRWAppOverview.rst | 710 --------- docs/UsersGuide/source/SRW_NATLEV_table.csv | 210 +++ docs/UsersGuide/source/SRW_NATLEV_table.rst | 11 + docs/UsersGuide/source/SRW_PRSLEV_table.csv | 257 ++++ docs/UsersGuide/source/SRW_PRSLEV_table.rst | 11 + docs/UsersGuide/source/WE2Etests.rst | 459 ++++-- .../_static/FV3LAM_wflow_flowchart_v2.png | Bin 0 -> 561715 bytes .../source/_static/SUBCONUS_Ind_3km.png | Bin 0 -> 81808 bytes docs/UsersGuide/source/_static/custom.css | 19 + .../source/_static/theme_overrides.css | 10 +- docs/UsersGuide/source/conf.py | 15 +- docs/UsersGuide/source/index.rst | 15 +- docs/UsersGuide/source/references.bib | 16 +- env/build_cheyenne_gnu.env | 23 - env/build_cheyenne_intel.env | 23 - env/build_jet_intel.env | 22 - env/build_macos_gccgfortran.env | 23 - env/build_orion_intel.env | 22 - env/build_singularity_gnu.env | 40 - env/build_wcoss_dell_p3_intel.env | 24 - 
env/detect_machine.sh | 104 -- env/wflow_cheyenne.env | 6 - env/wflow_hera.env | 7 - env/wflow_jet.env | 7 - env/wflow_orion.env | 7 - env/wflow_singularity.env | 3 - etc/lmod-setup.csh | 40 + etc/lmod-setup.sh | 41 + hpc-stack-mod | 1 + modulefiles/build_cheyenne_gnu | 34 + modulefiles/build_cheyenne_intel | 34 + modulefiles/build_gaea_intel | 25 + .../build_hera_intel | 19 +- modulefiles/build_jet_intel | 27 + modulefiles/build_macos_gnu | 83 + modulefiles/build_noaacloud_intel | 21 + modulefiles/build_odin_intel | 54 + modulefiles/build_orion_intel | 27 + modulefiles/build_singularity_gnu | 43 + modulefiles/build_wcoss_dell_p3_intel | 29 + {env => modulefiles}/srw_common | 17 +- modulefiles/wflow_cheyenne | 18 + modulefiles/wflow_gaea | 18 + modulefiles/wflow_hera | 19 + modulefiles/wflow_jet | 19 + modulefiles/wflow_macos | 26 + modulefiles/wflow_noaacloud | 21 + modulefiles/wflow_odin | 42 + modulefiles/wflow_orion | 19 + modulefiles/wflow_singularity | 16 + .../wflow_wcoss_dell_p3 | 9 +- test/README.md | 2 +- test/build.sh | 18 +- 87 files changed, 6376 insertions(+), 3203 deletions(-) create mode 100644 .cicd/Jenkinsfile create mode 100755 .cicd/scripts/srw_build.sh create mode 100755 .cicd/scripts/srw_test.sh create mode 100644 .github/ISSUE_TEMPLATE/textonly_request.md create mode 100644 .github/workflows/build.yml create mode 100644 .gitmodules create mode 100644 .readthedocs.yaml delete mode 100644 docs/UsersGuide/build/.gitignore create mode 100644 docs/UsersGuide/source/BuildRunSRW.rst delete mode 100644 docs/UsersGuide/source/CodeReposAndDirs.rst create mode 100644 docs/UsersGuide/source/CompleteTests.csv create mode 100644 docs/UsersGuide/source/CompleteTests.rst create mode 100644 docs/UsersGuide/source/Components.rst delete mode 100644 docs/UsersGuide/source/ConfigNewPlatform.rst delete mode 100644 docs/UsersGuide/source/ConfigParameters.inc create mode 100644 docs/UsersGuide/source/ContributorsGuide.rst create mode 100644 docs/UsersGuide/source/Include-HPCInstall.rst create mode 100644 docs/UsersGuide/source/Non-ContainerQS.rst delete mode 100644 docs/UsersGuide/source/SRWAppOverview.rst create mode 100644 docs/UsersGuide/source/SRW_NATLEV_table.csv create mode 100644 docs/UsersGuide/source/SRW_NATLEV_table.rst create mode 100644 docs/UsersGuide/source/SRW_PRSLEV_table.csv create mode 100644 docs/UsersGuide/source/SRW_PRSLEV_table.rst create mode 100644 docs/UsersGuide/source/_static/FV3LAM_wflow_flowchart_v2.png create mode 100644 docs/UsersGuide/source/_static/SUBCONUS_Ind_3km.png delete mode 100644 env/build_cheyenne_gnu.env delete mode 100644 env/build_cheyenne_intel.env delete mode 100644 env/build_jet_intel.env delete mode 100644 env/build_macos_gccgfortran.env delete mode 100644 env/build_orion_intel.env delete mode 100644 env/build_singularity_gnu.env delete mode 100644 env/build_wcoss_dell_p3_intel.env delete mode 100755 env/detect_machine.sh delete mode 100644 env/wflow_cheyenne.env delete mode 100644 env/wflow_hera.env delete mode 100644 env/wflow_jet.env delete mode 100644 env/wflow_orion.env delete mode 100644 env/wflow_singularity.env create mode 100644 etc/lmod-setup.csh create mode 100644 etc/lmod-setup.sh create mode 160000 hpc-stack-mod create mode 100644 modulefiles/build_cheyenne_gnu create mode 100644 modulefiles/build_cheyenne_intel create mode 100644 modulefiles/build_gaea_intel rename env/build_hera_intel.env => modulefiles/build_hera_intel (57%) create mode 100644 modulefiles/build_jet_intel create mode 100644 modulefiles/build_macos_gnu create 
mode 100644 modulefiles/build_noaacloud_intel create mode 100644 modulefiles/build_odin_intel create mode 100644 modulefiles/build_orion_intel create mode 100644 modulefiles/build_singularity_gnu create mode 100644 modulefiles/build_wcoss_dell_p3_intel rename {env => modulefiles}/srw_common (54%) create mode 100644 modulefiles/wflow_cheyenne create mode 100644 modulefiles/wflow_gaea create mode 100644 modulefiles/wflow_hera create mode 100644 modulefiles/wflow_jet create mode 100644 modulefiles/wflow_macos create mode 100644 modulefiles/wflow_noaacloud create mode 100644 modulefiles/wflow_odin create mode 100644 modulefiles/wflow_orion create mode 100644 modulefiles/wflow_singularity rename env/wflow_wcoss_dell_p3.env => modulefiles/wflow_wcoss_dell_p3 (51%) diff --git a/.cicd/Jenkinsfile b/.cicd/Jenkinsfile new file mode 100644 index 0000000000..265d7d445e --- /dev/null +++ b/.cicd/Jenkinsfile @@ -0,0 +1,169 @@ +pipeline { + agent none + + options { + skipDefaultCheckout(true) + } + + parameters { + // Allow job runner to filter based on platform + // choice(name: 'SRW_PLATFORM_FILTER', choices: ['all', 'cheyenne', 'gaea', 'orion', 'pcluster_noaa_v2_use1', 'azcluster_noaa', 'gcluster_noaa_v2_usc1'], description: 'Specify the platform(s) to use') + choice(name: 'SRW_PLATFORM_FILTER', choices: ['all', 'cheyenne', 'gaea', 'orion'], description: 'Specify the platform(s) to use') + // Allow job runner to filter based on compiler + choice(name: 'SRW_COMPILER_FILTER', choices: ['all', 'gnu', 'intel'], description: 'Specify the compiler(s) to use to build') + } + + stages { + /* + // Start the NOAA Parallel Works clusters, if necessary + stage('Start Parallel Works Clusters') { + matrix { + // Start all clusters by default or only the specified cluster given by SRW_PLATFORM_FILTER + when { + anyOf { + expression { params.SRW_PLATFORM_FILTER == 'all' } + expression { params.SRW_PLATFORM_FILTER == env.SRW_PLATFORM } + } + } + + axes { + axis { + name 'SRW_PLATFORM' + values 'pcluster_noaa_v2_use1', 'azcluster_noaa', 'gcluster_noaa_v2_usc1' + } + } + + stages { + // Call the parallel-works-jenkins-client/start-cluster job using SRW_PLATFORM for the + // PW_CLUSTER_NAME parameter + stage('Start Cluster') { + steps { + build job: 'parallel-works-jenkins-client/start-cluster', parameters: [string(name: 'PW_CLUSTER_NAME', value: env.SRW_PLATFORM), string(name: 'PW_CLUSTER_SSH_KEY', value: '~/.ssh/id_rsa'), string(name: 'JAVA_VERSION', value: '11')] + } + } + } + } + } + */ + + // Build and test the SRW application on all supported platforms using the supported compilers for each platform + stage('Build and Test') { + matrix { + // Run on all platform/compiler combinations by default or build and test only on the platform(s) and + // compiler(s) specified by SRW_PLATFORM_FILTER and SRW_COMPILER_FILTER + when { + allOf { + anyOf { + expression { params.SRW_PLATFORM_FILTER == 'all' } + expression { params.SRW_PLATFORM_FILTER == env.SRW_PLATFORM } + } + + anyOf { + expression { params.SRW_COMPILER_FILTER == 'all' } + expression { params.SRW_COMPILER_FILTER == env.SRW_COMPILER } + } + } + } + + axes { + axis { + name 'SRW_PLATFORM' + // values 'cheyenne', 'gaea', 'orion', 'pcluster_noaa_v2_use1', 'azcluster_noaa', 'gcluster_noaa_v2_usc1' + values 'cheyenne', 'gaea', 'orion' + } + + axis { + name 'SRW_COMPILER' + values 'gnu', 'intel' + } + } + + excludes { + // Exclude GNU from platforms that don't support it + exclude { + axis { + name 'SRW_PLATFORM' + values 'gaea', 'orion' + } + + axis { + name 
'SRW_COMPILER' + values 'gnu' + } + } + } + + agent { + label env.SRW_PLATFORM + } + + environment { + BUILD_VERSION = "${env.SRW_PLATFORM}-${env.SRW_COMPILER}-${env.BRANCH_NAME}-${env.BUILD_NUMBER}" + BUILD_NAME = "ufs-srweather-app_${env.BUILD_VERSION}" + } + + stages { + // Clean the workspace, checkout the repository, and run checkout_externals + stage('Initialize') { + steps { + echo "Initializing SRW (${env.SRW_COMPILER}) build environment on ${env.SRW_PLATFORM}" + cleanWs() + checkout scm + sh '"${WORKSPACE}/manage_externals/checkout_externals"' + } + } + + // Run the unified build script; if successful create a tarball of the build and upload to S3 + stage('Build') { + steps { + echo "Building SRW (${env.SRW_COMPILER}) on ${env.SRW_PLATFORM}" + sh 'bash --login "${WORKSPACE}/.cicd/scripts/srw_build.sh"' + } + + post { + success { + sh 'tar --create --gzip --verbose --file "${WORKSPACE}/${BUILD_NAME}.tgz" bin include lib share' + s3Upload consoleLogLevel: 'INFO', dontSetBuildResultOnFailure: false, dontWaitForConcurrentBuildCompletion: false, entries: [[bucket: 'woc-epic-jenkins-artifacts', excludedFile: '', flatten: false, gzipFiles: false, keepForever: false, managedArtifacts: true, noUploadOnFailure: true, selectedRegion: 'us-east-1', showDirectlyInBrowser: false, sourceFile: "${env.BUILD_NAME}.tgz", storageClass: 'STANDARD', uploadFromSlave: false, useServerSideEncryption: false]], pluginFailureResultConstraint: 'FAILURE', profileName: 'main', userMetadata: [] + } + } + } + + // Run the unified test script + stage('Test') { + steps { + echo "Testing SRW (${env.SRW_COMPILER}) on ${env.SRW_PLATFORM}" + sh 'bash --login "${WORKSPACE}/.cicd/scripts/srw_test.sh"' + } + } + } + } + } + } + + /* + post { + always { + // Stop any Parallel Works clusters that were started during the pipeline execution + script { + def pw_clusters = ['pcluster_noaa_v2_use1', 'azcluster_noaa', 'gcluster_noaa_v2_usc1'] + def clusters = [] + + // Determine which clusters need to be stopped, if any + if (params.SRW_PLATFORM_FILTER == 'all') { + clusters = pw_clusters + } else if (params.SRW_PLATFORM_FILTER in pw_clusters) { + clusters = [params.SRW_PLATFORM_FILTER] + } else { + echo 'No Parallel Works clusters were used in build' + } + + for (int i = 0; i < clusters.size(); ++i) { + // Call the parallel-works-jenkins-client/stop-cluster job using clusters[i] for the + // PW_CLUSTER_NAME parameter + build job: 'parallel-works-jenkins-client/stop-cluster', parameters: [string(name: 'PW_CLUSTER_NAME', value: clusters[i])] + } + } + } + } + */ +} diff --git a/.cicd/scripts/srw_build.sh b/.cicd/scripts/srw_build.sh new file mode 100755 index 0000000000..79d9211af5 --- /dev/null +++ b/.cicd/scripts/srw_build.sh @@ -0,0 +1,31 @@ +#!/usr/bin/env bash +# +# A unified build script for the SRW application. This script is expected to +# build the SRW application for all supported platforms. +# +set -e -u -x + +script_dir="$(cd -- "$(dirname -- "${BASH_SOURCE[0]}")" > /dev/null 2>&1 && pwd)" + +# Get repository root from Jenkins WORKSPACE variable if set, otherwise, set +# relative to script directory. +declare workspace +if [[ -n "${WORKSPACE}" ]]; then + workspace="${WORKSPACE}" +else + workspace="$(cd -- "${script_dir}/../.." && pwd)" +fi + +build_dir="${workspace}/build" + +# Set build related environment variables and load required modules. 
+source "${workspace}/etc/lmod-setup.sh" "${SRW_PLATFORM}" +module use "${workspace}/modulefiles" +module load "build_${SRW_PLATFORM}_${SRW_COMPILER}" + +# Compile SRW application and install to repository root. +mkdir "${build_dir}" +pushd "${build_dir}" + cmake -DCMAKE_INSTALL_PREFIX="${workspace}" "${workspace}" + make -j "${MAKE_JOBS}" +popd diff --git a/.cicd/scripts/srw_test.sh b/.cicd/scripts/srw_test.sh new file mode 100755 index 0000000000..43b9935888 --- /dev/null +++ b/.cicd/scripts/srw_test.sh @@ -0,0 +1,21 @@ +#!/usr/bin/env bash +# +# A unified test script for the SRW application. This script is expected to +# test the SRW application for all supported platforms. NOTE: At this time, +# this script is a placeholder for a more robust test framework. +# +set -e -u -x + +script_dir="$(cd -- "$(dirname -- "${BASH_SOURCE[0]}")" > /dev/null 2>&1 && pwd)" + +# Get repository root from Jenkins WORKSPACE variable if set, otherwise, set +# relative to script directory. +declare workspace +if [[ -n "${WORKSPACE}" ]]; then + workspace="${WORKSPACE}" +else + workspace="$(cd -- "${script_dir}/../.." && pwd)" +fi + +# Verify that there is a non-zero sized weather model executable. +[[ -s "${workspace}/bin/ufs_model" ]] || [[ -s "${workspace}/bin/NEMS.exe" ]] diff --git a/.github/CODEOWNERS b/.github/CODEOWNERS index 08fa3f311a..51c46142df 100644 --- a/.github/CODEOWNERS +++ b/.github/CODEOWNERS @@ -4,7 +4,6 @@ # These owners will be the default owners for everything in the repo. #* @defunkt * @robgonzalezpita @venitahagerty @christopherwharrop-NOAA @christinaholtNOAA - # Order is important. The last matching pattern has the most precedence. # So if a pull request only touches javascript files, only these owners # will be requested to review. diff --git a/.github/ISSUE_TEMPLATE/textonly_request.md b/.github/ISSUE_TEMPLATE/textonly_request.md new file mode 100644 index 0000000000..e2be6bb5fe --- /dev/null +++ b/.github/ISSUE_TEMPLATE/textonly_request.md @@ -0,0 +1,20 @@ +--- +name: Text-only request +about: Suggest an idea for this project +title: '' +labels: textonly +assignees: '' + +--- + +## Description +Provide a clear and concise description of the problem to be solved. + +## Solution +Add a clear and concise description of the proposed solution. + +## Alternatives (optional) +If applicable, add a description of any alternative solutions or features you've considered. + +## Related to (optional) +Directly reference any issues or PRs in this or other repositories that this is related to, and describe how they are related. diff --git a/.github/workflows/build.yml b/.github/workflows/build.yml new file mode 100644 index 0000000000..4703c58749 --- /dev/null +++ b/.github/workflows/build.yml @@ -0,0 +1,21 @@ +name: Build SRW +on: [push] +jobs: + Explore-GitHub-Actions: + runs-on: [self-hosted, ncar] + steps: + - run: echo "๐ŸŽ‰ The job was automatically triggered by a ${{ github.event_name }} event." + - run: echo "๐Ÿง This job is now running on a ${{ runner.os }} server hosted by GitHub!" + - run: echo "๐Ÿ”Ž The name of your branch is ${{ github.ref }} and your repository is ${{ github.repository }}." + - name: Check out repository code + uses: actions/checkout@v2 + - run: | + cd ufs-srweather-app + ./manage_externals/checkout_externals + module use modulefiles + module load build_cheyenne_intel + mdkir build + cd build + cmake -DCMAKE_INSTALL_PREFIX=.. .. + make -j 4 + - run: echo "๐Ÿ This job's status is ${{ job.status }}." 
diff --git a/.gitmodules b/.gitmodules new file mode 100644 index 0000000000..ca914133d5 --- /dev/null +++ b/.gitmodules @@ -0,0 +1,3 @@ +[submodule "hpc-stack-mod"] + path = hpc-stack-mod + url = https://github.com/NOAA-EMC/hpc-stack.git diff --git a/.readthedocs.yaml b/.readthedocs.yaml new file mode 100644 index 0000000000..e0987f8926 --- /dev/null +++ b/.readthedocs.yaml @@ -0,0 +1,35 @@ +# .readthedocs.yaml +# Read the Docs configuration file +# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details + +# Required +version: 2 + +# Set the version of Python and other tools you might need +build: + os: ubuntu-20.04 + tools: + python: "3.9" + # You can also specify other tool versions: + # nodejs: "16" + # rust: "1.55" + # golang: "1.17" + +# Build documentation in the docs/ directory with Sphinx +sphinx: + configuration: docs/UsersGuide/source/conf.py + +# If using Sphinx, optionally build your docs in additional formats such as PDF +# formats: +# - pdf + +# Optionally declare the Python requirements required to build your docs +python: + install: + - requirements: docs/UsersGuide/requirements.txt + +submodules: + include: + - hpc-stack-mod + recursive: true + diff --git a/Externals.cfg b/Externals.cfg index 36af2bf7b1..d7d1fd6f8c 100644 --- a/Externals.cfg +++ b/Externals.cfg @@ -11,7 +11,7 @@ protocol = git repo_url = https://github.com/ufs-community/UFS_UTILS # Specify either a branch name or a hash but not both. #branch = develop -hash = f30740e +hash = 31271f7 local_path = src/UFS_UTILS required = True @@ -20,7 +20,7 @@ protocol = git repo_url = https://github.com/ufs-community/ufs-weather-model # Specify either a branch name or a hash but not both. #branch = develop -hash = e593349 +hash = 96dffa1 local_path = src/ufs-weather-model required = True @@ -29,7 +29,7 @@ protocol = git repo_url = https://github.com/NOAA-EMC/UPP # Specify either a branch name or a hash but not both. #branch = develop -hash = 4a16052 +hash = 394917e local_path = src/UPP required = True diff --git a/README.md b/README.md index 7eb717cd9f..744909ed42 100644 --- a/README.md +++ b/README.md @@ -2,12 +2,11 @@ The Unified Forecast System (UFS) is a community-based, coupled, comprehensive Earth modeling system. It is designed to be the source system for NOAA’s operational numerical weather prediction applications while enabling research, development, and contribution opportunities for the broader weather enterprise. For more information about the UFS, visit the UFS Portal at https://ufscommunity.org/. -The UFS can be configured for multiple applications (see a complete list at https://ufscommunity.org/#/science/aboutapps). The configuration described here is the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from less than an hour out to several days. The development branch of the application is continually evolving as the system undergoes open development. The SRW App v1.0.0 represents a snapshot of this continuously evolving system. The SRW App includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. +The UFS includes multiple applications (see a complete list at https://ufscommunity.org/science/aboutapps/) that support different forecast durations and spatial domains.
This documentation describes the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The development branch of the application is continually evolving as the system undergoes open development. The SRW App release branches represent a snapshot of this continuously evolving system. The SRW Application includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within the User's Guide and supported through a community forum (https://forums.ufscommunity.org/). -The UFS SRW App User's Guide associated with the development branch is at: https://ufs-srweather-app.readthedocs.io/en/latest/, while that specific to the SRW App v1.0.0 release can be found at: https://ufs-srweather-app.readthedocs.io/en/ufs-v1.0.0/. The repository is at: https://github.com/ufs-community/ufs-srweather-app. +The UFS SRW App User's Guide associated with the development branch can be found at: https://ufs-srweather-app.readthedocs.io/en/develop/, while the guide specific to the SRW App v1.0.1 release can be found at: https://ufs-srweather-app.readthedocs.io/en/ufs-v1.0.1/. The GitHub repository link is: https://github.com/ufs-community/ufs-srweather-app. For instructions on how to clone the repository, build the code, and run the workflow, see: https://github.com/ufs-community/ufs-srweather-app/wiki/Getting-Started -UFS Development Team. (2021, March 4). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v1.0.0). Zenodo. https://doi.org/10.5281/zenodo.4534994 - +UFS Development Team. (2021, March 4). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v1.0.0). Zenodo. https://doi.org/10.5281/zenodo.4534994 \ No newline at end of file diff --git a/devbuild.sh b/devbuild.sh index b46063d30f..35a78e3115 100755 --- a/devbuild.sh +++ b/devbuild.sh @@ -3,15 +3,15 @@ # usage instructions usage () { cat << EOF_USAGE -Usage: $0 [OPTIONS]... +Usage: $0 --platform=PLATFORM [OPTIONS]... OPTIONS -h, --help show this help guide - --platform=PLATFORM + -p, --platform=PLATFORM name of machine you are building on (e.g. cheyenne | hera | jet | orion | wcoss_dell_p3) - --compiler=COMPILER + -c, --compiler=COMPILER compiler to use; default depends on platform (e.g. intel | gnu | cray | gccgfortran) --app=APPLICATION @@ -93,8 +93,6 @@ BUILD_JOBS=4 CLEAN=false CONTINUE=false VERBOSE=false -# detect PLATFORM (MACHINE) -source ${SRC_DIR}/env/detect_machine.sh # process required arguments if [[ ("$1" == "--help") || ("$1" == "-h") ]]; then @@ -106,10 +104,10 @@ fi while :; do case $1 in --help|-h) usage; exit 0 ;; - --platform=?*) PLATFORM=${1#*=} ;; - --platform|--platform=) usage_error "$1 requires argument." ;; - --compiler=?*) COMPILER=${1#*=} ;; - --compiler|--compiler=) usage_error "$1 requires argument." ;; + --platform=?*|-p=?*) PLATFORM=${1#*=} ;; + --platform|--platform=|-p|-p=) usage_error "$1 requires argument." ;; + --compiler=?*|-c=?*) COMPILER=${1#*=} ;; + --compiler|--compiler=|-c|-c=) usage_error "$1 requires argument." ;; --app=?*) APPLICATION=${1#*=} ;; --app|--app=) usage_error "$1 requires argument." 
;; --ccpp=?*) CCPP=${1#*=} ;; @@ -138,17 +136,32 @@ while :; do shift done +# check if PLATFORM is set +if [ -z $PLATFORM ] ; then + printf "\nERROR: Please set PLATFORM.\n\n" + usage + exit 0 +fi + +# set PLATFORM (MACHINE) +MACHINE="${PLATFORM}" +printf "PLATFORM(MACHINE)=${PLATFORM}\n" >&2 + set -eu # automatically determine compiler if [ -z "${COMPILER}" ] ; then case ${PLATFORM} in - jet|hera) COMPILER=intel ;; + jet|hera|gaea) COMPILER=intel ;; orion) COMPILER=intel ;; wcoss_dell_p3) COMPILER=intel ;; cheyenne) COMPILER=intel ;; - macos) COMPILER=gccgfortran ;; - *) printf "ERROR: Unknown platform ${PLATFORM}\n" >&2; usage >&2; exit 1 ;; + macos|singularity) COMPILER=gnu ;; + odin) COMPILER=intel ;; + *) + COMPILER=intel + printf "WARNING: Setting default COMPILER=intel for new platform ${PLATFORM}\n" >&2; + ;; esac fi @@ -159,18 +172,19 @@ if [ "${VERBOSE}" = true ] ; then settings fi -# set ENV_FILE for this platform/compiler combination -ENV_FILE="${SRC_DIR}/env/build_${PLATFORM}_${COMPILER}.env" -if [ ! -f "${ENV_FILE}" ]; then - printf "ERROR: environment file does not exist for platform/compiler\n" >&2 - printf " ENV_FILE=${ENV_FILE}\n" >&2 +# set MODULE_FILE for this platform/compiler combination +MODULE_FILE="build_${PLATFORM}_${COMPILER}" +if [ ! -f "${SRC_DIR}/modulefiles/${MODULE_FILE}" ]; then + printf "ERROR: module file does not exist for platform/compiler\n" >&2 + printf " MODULE_FILE=${MODULE_FILE}\n" >&2 printf " PLATFORM=${PLATFORM}\n" >&2 printf " COMPILER=${COMPILER}\n\n" >&2 + printf "Please make sure PLATFORM and COMPILER are set correctly\n" >&2 usage >&2 exit 64 fi -printf "ENV_FILE=${ENV_FILE}\n" >&2 +printf "MODULE_FILE=${MODULE_FILE}\n" >&2 # if build directory already exists then exit if [ "${CLEAN}" = true ]; then @@ -228,10 +242,13 @@ if [ "${VERBOSE}" = true ]; then MAKE_SETTINGS="${MAKE_SETTINGS} VERBOSE=1" fi -# source the environment file for this platform/compiler combination, then build the code -printf "... Source ENV_FILE and create BUILD directory ...\n" -module use ${SRC_DIR}/env -. ${ENV_FILE} +# Before we go on to load modules, we first need to activate Lmod for some systems +source ${SRC_DIR}/etc/lmod-setup.sh + +# source the module file for this platform/compiler combination, then build the code +printf "... Load MODULE_FILE and create BUILD directory ...\n" +module use ${SRC_DIR}/modulefiles +module load ${MODULE_FILE} module list mkdir -p ${BUILD_DIR} cd ${BUILD_DIR} diff --git a/docs/INSTALL b/docs/INSTALL index 5923285be0..4b659bb1d1 100644 --- a/docs/INSTALL +++ b/docs/INSTALL @@ -12,10 +12,38 @@ git clone https://github.com/ufs-community/ufs-srweather-app.git cd ufs-srweather-app/ ./manage_externals/checkout_externals -# Prior to building, you must set up the environment so cmake can find the appropriate compilers -# and libraries. For instructions specific to supported platforms, see the "build_[machine]_[compiler].env -# files in the "env" directory. These files give instructions assuming a bash or ksh login shell, for -# csh and tcsh users you will have to modify the commands for setting envronment variables. +# We can build ufs-srweather-app binaries in two ways. + +# Method 1 +# ======== + +# This is the simplest way to build the binaries + +./devbuild.sh --platform=PLATFORM + +# If compiler auto-detection fails, specify it using + +./devbuild.sh --platform=PLATFORM --compiler=COMPILER + +# Method 2 +# ======== + +# The above instructions will work at least on Tier-1 systems, if not on all supported machines.
+# However, if it fails for some reason, we can build directly with cmake. + +# First, we need to make sure that there is a modulefile "build_[PLATFORM]_[COMPILER]" in the +# "modulefiles" directory. Also, on some systems (e.g. Gaea/Odin) that come with the Cray module system, +# we may need to swap that for Lmod instead. Assuming your login shell is bash, run + +source etc/lmod-setup.sh PLATFORM + +# and if your login shell is csh/tcsh, source etc/lmod-setup.csh instead. + +# From here on, we can assume Lmod is loaded and ready to go. Then we load the specific +# module for a given PLATFORM and COMPILER as follows + +module use modulefiles +module load build_[PLATFORM]_[COMPILER] # Supported CMake flags: # -DCMAKE_INSTALL_PREFIX Location where the bin/ include/ lib/ and share/ directories containing diff --git a/docs/RUNTIME b/docs/RUNTIME index a80a65228e..e2ca78894d 100644 --- a/docs/RUNTIME +++ b/docs/RUNTIME @@ -1,13 +1,22 @@ # Users should load the appropriate python environment for the workflow. # The workflow requires Python 3, with the packages 'PyYAML', 'Jinja2', and 'f90nml' available. -# For users' convenience, the python environment for the workflow is put in 'ufs-srweather-app/env/wflow_[machine].env'. -# When generating a workflow experiment or running a workflow, users can use this file for a specific machine. +# For users' convenience, the python environment for the workflow can be activated by loading the wflow_[PLATFORM] modulefile # For example, on Hera: -cd ufs-srweather-app/env -source wflow_hera.env +module load wflow_hera + +# Due to an older version of Lmod, inconsistencies with TCL modulefiles, etc., you may have to activate +# conda manually using the instructions that the previous module command prints. +# Hera is one of those systems, so execute: + +conda activate regional_workflow + +# After that we can set up an experiment in the directory + +cd regional_workflow/ush + +# Once we prepare the experiment file config.sh, we can generate the workflow using -cd ../regional_workflow/ush ./generate_FV3LAM_wflow.sh diff --git a/docs/UsersGuide/README b/docs/UsersGuide/README index 30617076c6..0ad8948eda 100644 --- a/docs/UsersGuide/README +++ b/docs/UsersGuide/README @@ -1,26 +1,29 @@ Steps to build and use the Sphinx documentation tool: -1) Get Sphinx and sphinxcontrib-bibtex installed on your desktop from +1) Get Sphinx, sphinxcontrib-bibtex, and the RTD theme installed on your desktop from http://www.sphinx-doc.org/en/master/usage/installation.html https://sphinxcontrib-bibtex.readthedocs.io/en/latest/quickstart.html#installation + https://pypi.org/project/sphinx-rtd-theme/ + + For example: + pip install sphinx + pip install sphinxcontrib-bibtex + pip install sphinx-rtd-theme -2) Create a Sphinx documentation root directory: - % mkdir docs - % cd docs + One approach that has worked to resolve "Module Not Found" errors for users with MacPorts package manager: + $ sudo port install py-six # may not be necessary + $ sudo port install py310-sphinxcontrib-bibtex + $ sudo port select --set sphinx py310-sphinx + $ sudo port install py310-sphinx_rtd_theme -3) Initialize your Sphinx project (set up an initial directory structure) using - % sphinx-quickstart + py310 can be replaced with the user's version of Python (e.g., py39) - See http://www.sphinx-doc.org/en/master/usage/quickstart.html or - https://sphinx-rtd-tutorial.readthedocs.io/en/latest/sphinx-quickstart.html - - for help. You can answer (ENTER) to most of the questions.
- To build html: -From the directory above source and build, the sphinx project directory: +$ cd ufs-srweather-app/docs/UsersGuide +$ make clean && sphinx-build -b html source build -make html +The "make html" command can often be used in place of the previous command. Sphinx uses LaTeX to export the documentation as a PDF file. To build pdf: diff --git a/docs/UsersGuide/build/.gitignore b/docs/UsersGuide/build/.gitignore deleted file mode 100644 index 5e7d2734cf..0000000000 --- a/docs/UsersGuide/build/.gitignore +++ /dev/null @@ -1,4 +0,0 @@ -# Ignore everything in this directory -* -# Except this file -!.gitignore diff --git a/docs/UsersGuide/source/BuildRunSRW.rst b/docs/UsersGuide/source/BuildRunSRW.rst new file mode 100644 index 0000000000..629210ec27 --- /dev/null +++ b/docs/UsersGuide/source/BuildRunSRW.rst @@ -0,0 +1,1113 @@ +.. _BuildRunSRW: + +===================================== +Building and Running the SRW App +===================================== + +The Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application is an :term:`umbrella repository` consisting of a number of different :ref:`components ` housed in external repositories. Once the SRW App is configured and built, users can generate predictions of atmospheric behavior over a limited spatial area and on time scales ranging from minutes out to several days. + +This chapter walks users through how to build and run the "out-of-the-box" case for the SRW App. However, the steps are relevant to any SRW Application experiment and can be modified to suit user goals. The "out-of-the-box" SRW App case builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) domain (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 16 physics suite (FV3_GFS_v16 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. + +.. attention:: + + All UFS applications support `four platform levels `_. The steps described in this chapter will work most smoothly on preconfigured (Level 1) systems. On Level 1 systems, all of the required libraries for building community releases of UFS models and applications are available in a central location. This guide can serve as a starting point for running the SRW App on other systems, too, but the user may need to perform additional troubleshooting. + +.. note:: + The :ref:`container approach ` is recommended for a smoother build and run experience. Building without a container allows for the use of the Rocoto workflow manager and may allow for more customization. However, the non-container approach requires more in-depth system-based knowledge, especially on Level 3 and 4 systems; it is less appropriate for beginners. + +The overall procedure for generating an experiment is shown in :numref:`Figure %s `, with the scripts to generate and run the workflow shown in red.
The steps are as follows: + + * :ref:`Install prerequisites ` + * :ref:`Clone the SRW App from GitHub ` + * :ref:`Check out the external repositories ` + * :ref:`Set up the build environment and build the executables ` + * :ref:`Download and stage data ` + * :ref:`Optional: Configure a new grid ` + * :ref:`Generate a regional workflow experiment ` + * :ref:`Configure the experiment parameters ` + * :ref:`Load the python environment for the regional workflow ` + * :ref:`Run the regional workflow ` + * :ref:`Optional: Plot the output ` + +.. _AppOverallProc: + +.. figure:: _static/FV3LAM_wflow_overall.png + + *Overall layout of the SRW App Workflow* + + +.. _HPCstackInfo: + +Install the HPC-Stack +======================== + +.. Attention:: + Skip the HPC-Stack installation if working on a `Level 1 system `_ (e.g., Cheyenne, Hera, Orion, NOAA Cloud). + +**Definition:** :term:`HPC-Stack` is a repository that provides a unified, shell script-based build system and builds the software stack required for `UFS `_ applications such as the SRW App. + +Background +---------------- + +The UFS Weather Model draws on over 50 code libraries to run its applications. These libraries range from libraries developed in-house at NOAA (e.g., NCEPLIBS, FMS) to libraries developed by NOAA's partners (e.g., PIO, ESMF) to truly third party libraries (e.g., NETCDF). Individual installation of these libraries is not practical, so the `HPC-Stack `__ was developed as a central installation system to ensure that the infrastructure environment across multiple platforms is as similar as possible. Installation of the HPC-Stack is required to run the SRW App. + +Instructions +------------------------- +Users working on systems that fall under `Support Levels 2-4 `_ will need to install the HPC-Stack the first time they try to build applications (such as the SRW App) or models that depend on it. Users can either build the HPC-Stack on their local system or use the centrally maintained stacks on each HPC platform if they are working on a Level 1 system. Before installing the HPC-Stack, users on both Linux and MacOS systems should set the stack size to "unlimited" (if allowed) or to the largest possible value: + +.. code-block:: console + + # Linux, if allowed + ulimit -s unlimited + + # MacOS, this corresponds to 65MB + ulimit -S -s unlimited + +For a detailed description of installation options, see :ref:`Installing the HPC-Stack `. + +After completing installation, continue to the next section. + +.. _DownloadSRWApp: + +Download the UFS SRW Application Code +====================================== +The SRW Application source code is publicly available on GitHub. To download the SRW App, clone the ``develop`` branch of the repository: + +.. code-block:: console + + git clone -b develop https://github.com/ufs-community/ufs-srweather-app.git + +.. + COMMENT: This will need to be changed to the updated release branch of the SRW repo once it exists. + +The cloned repository contains the configuration files and sub-directories shown in +:numref:`Table %s `. + +.. _FilesAndSubDirs: + +.. 
table:: Files and sub-directories of the ufs-srweather-app repository + + +--------------------------------+--------------------------------------------------------+ + | **File/Directory Name** | **Description** | + +================================+========================================================+ + | CMakeLists.txt | Main cmake file for SRW App | + +--------------------------------+--------------------------------------------------------+ + | Externals.cfg | Includes tags pointing to the correct version of the | + | | external GitHub repositories/branches used in the SRW | + | | App. | + +--------------------------------+--------------------------------------------------------+ + | LICENSE.md | CC0 license information | + +--------------------------------+--------------------------------------------------------+ + | README.md | Getting Started Guide | + +--------------------------------+--------------------------------------------------------+ + | ufs_srweather_app_meta.h.in | Meta information for SRW App which can be used by | + | | other packages | + +--------------------------------+--------------------------------------------------------+ + | ufs_srweather_app.settings.in | SRW App configuration summary | + +--------------------------------+--------------------------------------------------------+ + | modulefiles | Contains build and workflow module files | + +--------------------------------+--------------------------------------------------------+ + | etc | Contains Lmod startup scripts | + +--------------------------------+--------------------------------------------------------+ + | docs | Contains release notes, documentation, and User's Guide| + +--------------------------------+--------------------------------------------------------+ + | manage_externals | Utility for checking out external repositories | + +--------------------------------+--------------------------------------------------------+ + | src | Contains CMakeLists.txt; external repositories | + | | will be cloned in this directory. | + +--------------------------------+--------------------------------------------------------+ + + +.. _CheckoutExternals: + +Check Out External Components +================================ + +The SRW App relies on a variety of components (e.g., regional_workflow, UFS_UTILS, ufs-weather-model, and UPP) detailed in :numref:`Chapter %s ` of this User's Guide. Each component has its own :term:`repository`. Users must run the ``checkout_externals`` script to collect the individual components of the SRW App from their respective git repositories. The ``checkout_externals`` script uses the configuration file ``Externals.cfg`` in the top level directory of the SRW App to clone the correct tags (code versions) of the external repositories listed in :numref:`Section %s ` into the appropriate directories under the ``regional_workflow`` and ``src`` directories. + +Run the executable that pulls in SRW App components from external repositories: + +.. code-block:: console + + cd ufs-srweather-app + ./manage_externals/checkout_externals + +The script should output dialogue indicating that it is retrieving different code repositories. It may take several minutes to download these repositories. + +.. _BuildExecutables: + +Set Up the Environment and Build the Executables +=================================================== + +.. 
_DevBuild: + +``devbuild.sh`` Approach +----------------------------- + +On Level 1 systems for which a modulefile is provided under the ``modulefiles`` directory, users can build the SRW App binaries with: + +.. code-block:: console + + ./devbuild.sh --platform=<machine> + +where ``<machine>`` is replaced with the name of the platform the user is working on. Valid values are: ``cheyenne`` | ``gaea`` | ``hera`` | ``jet`` | ``macos`` | ``odin`` | ``orion`` | ``singularity`` | ``wcoss_dell_p3`` + +If compiler auto-detection fails for some reason, specify it using the ``--compiler`` argument. For example: + +.. code-block:: console + + ./devbuild.sh --platform=hera --compiler=intel + +where valid values are ``intel`` or ``gnu``. + +The last line of the console output should be ``[100%] Built target ufs-weather-model``, indicating that the UFS Weather Model executable has been built successfully. + +The executables listed in :numref:`Table %s ` should appear in the ``ufs-srweather-app/bin`` directory. If this build method doesn't work, or if users are not on a supported machine, they will need to manually set up the environment and build the SRW App binaries with CMake as described in :numref:`Section %s `. + + +.. _ExecDescription: + +.. table:: Names and descriptions of the executables produced by the build step and used by the SRW App + + +------------------------+---------------------------------------------------------------------------------+ + | **Executable Name** | **Description** | + +========================+=================================================================================+ + | chgres_cube | Reads in raw external model (global or regional) and surface climatology data | | | to create initial and lateral boundary conditions | +------------------------+---------------------------------------------------------------------------------+ + | filter_topo | Filters topography based on resolution | +------------------------+---------------------------------------------------------------------------------+ + | global_equiv_resol | Calculates a global, uniform, cubed-sphere equivalent resolution for the | | | regional Extended Schmidt Gnomonic (ESG) grid | +------------------------+---------------------------------------------------------------------------------+ + | make_solo_mosaic | Creates mosaic files with halos | +------------------------+---------------------------------------------------------------------------------+ + | upp.x | Post-processor for the model output | +------------------------+---------------------------------------------------------------------------------+ + | ufs_model | UFS Weather Model executable | +------------------------+---------------------------------------------------------------------------------+ + | orog | Generates orography, land mask, and gravity wave drag files from fixed files | +------------------------+---------------------------------------------------------------------------------+ + | regional_esg_grid | Generates an ESG regional grid based on a user-defined namelist | +------------------------+---------------------------------------------------------------------------------+ + | sfc_climo_gen | Creates surface climatology fields from fixed files for use in ``chgres_cube`` | +------------------------+---------------------------------------------------------------------------------+ + | shave | Shaves the excess halo rows down to what is required for the lateral boundary | | | conditions (LBC's) in the orography and grid files
| + +------------------------+---------------------------------------------------------------------------------+ + | vcoord_gen | Generates hybrid coordinate interface profiles | + +------------------------+---------------------------------------------------------------------------------+ + | fvcom_to_FV3 | Determines lake surface conditions for the Great Lakes | + +------------------------+---------------------------------------------------------------------------------+ + | make_hgrid | Computes geo-referencing parameters (e.g., latitude, longitude, grid cell area) | + | | for global uniform grids | + +------------------------+---------------------------------------------------------------------------------+ + | emcsfc_ice_blend | Blends National Ice Center sea ice cover and EMC sea ice concentration data to | + | | create a global sea ice analysis used to update the GFS once per day | + +------------------------+---------------------------------------------------------------------------------+ + | emcsfc_snow2mdl | Blends National Ice Center snow cover and Air Force snow depth data to create a | + | | global depth analysis used to update the GFS snow field once per day | + +------------------------+---------------------------------------------------------------------------------+ + | global_cycle | Updates the GFS surface conditions using external snow and sea ice analyses | + +------------------------+---------------------------------------------------------------------------------+ + | inland | Creates an inland land mask by determining in-land (i.e. non-coastal) points | + | | and assigning a value of 1. Default value is 0. | + +------------------------+---------------------------------------------------------------------------------+ + | orog_gsl | Ceates orographic statistics fields required for the orographic drag suite | + | | developed by NOAA's Global Systems Laboratory (GSL) | + +------------------------+---------------------------------------------------------------------------------+ + | fregrid | Remaps data from the input mosaic grid to the output mosaic grid | + +------------------------+---------------------------------------------------------------------------------+ + | lakefrac | Calculates the ratio of the lake area to the grid cell area at each atmospheric | + | | grid point. | + +------------------------+---------------------------------------------------------------------------------+ + +.. _CMakeApproach: + +CMake Approach +----------------- + +Set Up the Workflow Environment +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +.. attention:: + If users successfully built the executables in :numref:`Step %s `, they should skip to step :numref:`Step %s `. + +If the ``devbuild.sh`` approach failed, users need to set up their environment to run a workflow on their specific platform. First, users should make sure ``Lmod`` is the app used for loading modulefiles. This is the case on most Level 1 systems; however, on systems such as Gaea/Odin, the default modulefile loader is from Cray and must be switched to Lmod. For example, on Gaea, assuming a ``bash`` login shell, run: + +.. code-block:: console + + source etc/lmod-setup.sh gaea + +or if the login shell is ``csh`` or ``tcsh``, run ``source etc/lmod-setup.csh`` instead. If users execute the above command on systems that don't need it, it will not cause any problems (it will simply do a ``module purge``). From here on, ``Lmod`` is ready to load the modulefiles needed by the SRW App. 
These modulefiles are located in the ``modulefiles`` directory. To load the necessary modulefile for a specific ``<platform>`` using a given ``<compiler>``, run: + +.. code-block:: console + + module use <path/to/modulefiles/directory> + module load build_<platform>_<compiler> + +where ``<path/to/modulefiles/directory>`` is the full path to the ``modulefiles`` directory. This will work on Level 1 systems, where a modulefile is available in the ``modulefiles`` directory. + +On Level 2-4 systems, users will need to modify certain environment variables, such as the path to HPC-Stack, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build_<platform>_<compiler>`` modulefiles can be copied and used as a template. To check whether Lmod is installed, run ``echo $LMOD_PKG``, and see if it outputs a path to the Lmod package. On systems without Lmod, users can modify or set the required environment variables with the ``export`` or ``setenv`` commands depending on whether they are using a bash or csh/tcsh shell, respectively: + +.. code-block:: + + export <VARIABLE_NAME>=<PATH_TO_MODULE> + setenv <VARIABLE_NAME> <PATH_TO_MODULE> + +.. + COMMENT: Might be good to list an example here... + +.. _BuildCMake: + +Build the Executables Using CMake +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +.. attention:: + If users successfully built the executables in :numref:`Step %s `, they should skip to step :numref:`Step %s `. + +In the ``ufs-srweather-app`` directory, create a subdirectory to hold the build's executables: + +.. code-block:: console + + mkdir build + cd build + +From the build directory, run the following commands to build the pre-processing utilities, forecast model, and post-processor: + +.. code-block:: console + + cmake .. -DCMAKE_INSTALL_PREFIX=.. + make -j 4 >& build.out & + +``-DCMAKE_INSTALL_PREFIX`` specifies the location in which the ``bin``, ``include``, ``lib``, and ``share`` directories will be created. These directories will contain various components of the SRW App. Its recommended value ``..`` denotes one directory up from the build directory. In the next line, the ``make`` call argument ``-j 4`` indicates that the build will run in parallel with 4 threads. Although users can specify a larger or smaller number of threads (e.g., ``-j8``, ``-j2``), it is highly recommended to use at least 4 parallel threads to prevent overly long installation times. + +The build will take a few minutes to complete. When it starts, the shell prints the background job number and process ID to the console, and when it is done, a ``[1]+ Done`` message is printed to the console. ``[1]+ Exit`` indicates an error. Output from the build will be in the ``ufs-srweather-app/build/build.out`` file. When the build completes, users should see the forecast model executable ``ufs_model`` and several pre- and post-processing executables in the ``ufs-srweather-app/bin`` directory. These executables are described in :numref:`Table %s `. + +.. hint:: + + If you see the build.out file, but there is no ``ufs-srweather-app/bin`` directory, wait a few more minutes for the build to complete. + + +.. _Data: + +Download and Stage the Data +============================ + +The SRW App requires input files to run. These include static datasets, initial and boundary conditions files, and model configuration files. On Level 1 and 2 systems, the data required to run SRW App tests are already available. For Level 3 and 4 systems, the data must be added. Detailed instructions on how to add the data can be found in :numref:`Section %s Downloading and Staging Input Data `. :numref:`Sections %s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW App.
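+ +For users on Level 3-4 systems who stage the data manually, it may help to mirror the directory layout used on the Level 1 platforms so that the sample ``config.sh`` settings later in this chapter apply with only a base-path change. The sketch below is illustrative only; the base path is hypothetical, and the file names correspond to the FV3GFS ``grib2`` data used by the out-of-the-box case: + +.. code-block:: console + + # Hypothetical staging location; adjust the base path for your system. + mkdir -p /path/to/UFS_SRW_App/develop/input_model_data/FV3GFS/grib2/2019061500 + # Copy the raw FV3GFS grib2 files for the 2019-06-15 00z cycle into this directory, e.g.: + # gfs.pgrb2.0p25.f000 (used for initial conditions) + # gfs.pgrb2.0p25.f006 (used for lateral boundary conditions)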
+ +.. _GridSpecificConfig: + +Grid Configuration +======================= + +The SRW App officially supports four different predefined grids as shown in :numref:`Table %s `. The "out-of-the-box" SRW App case uses the ``RRFS_CONUS_25km`` predefined grid option. More information on the predefined and user-generated grid options can be found in :numref:`Chapter %s ` for those who are curious. Users who plan to utilize one of the four predefined domain (grid) options may continue to :numref:`Step %s `. Users who plan to create a new domain should refer to :numref:`Chapter %s ` for details on how to do so. At a minimum, these users will need to add the new grid name to the ``valid_param_vals`` script and add the corresponding grid-specific parameters in the ``set_predef_grid_params`` script. + +.. _PredefinedGrids: + +.. table:: Predefined grids in the SRW App + + +----------------------+-------------------+--------------------------------+ + | **Grid Name** | **Grid Type** | **Quilting (write component)** | + +======================+===================+================================+ + | RRFS_CONUS_25km | ESG grid | lambert_conformal | + +----------------------+-------------------+--------------------------------+ + | RRFS_CONUS_13km | ESG grid | lambert_conformal | + +----------------------+-------------------+--------------------------------+ + | RRFS_CONUS_3km | ESG grid | lambert_conformal | + +----------------------+-------------------+--------------------------------+ + | SUBCONUS_Ind_3km | ESG grid | lambert_conformal | + +----------------------+-------------------+--------------------------------+ + + +.. _GenerateForecast: + +Generate the Forecast Experiment +================================= +Generating the forecast experiment requires three steps: + +* :ref:`Set experiment parameters ` +* :ref:`Set Python and other environment parameters ` +* :ref:`Run a script to generate the experiment workflow ` + +The first two steps depend on the platform being used and are described here for each Level 1 platform. Users will need to adjust the instructions to their machine if they are working on a Level 2-4 platform. Information in :numref:`Chapter %s: Configuring the Workflow ` can help with this. + +.. _ExptConfig: + +Set Experiment Parameters +---------------------------- + +Each experiment requires certain basic information to run (e.g., date, grid, physics suite). This information is specified in ``config_defaults.sh`` and in the user-specific ``config.sh`` file. When generating a new experiment, the SRW App first reads and assigns default values from the ``config_defaults.sh`` file. Then, it reads and (re)assigns variables from the user's custom ``config.sh`` file. For background info on ``config_defaults.sh``, read :numref:`Section %s `, or jump to :numref:`Section %s ` to continue configuring the experiment. + +.. _DefaultConfigSection: + +Default configuration: ``config_defaults.sh`` +------------------------------------------------ + +.. note:: + This section provides background information on how the SRW App uses the ``config_defaults.sh`` file. This information is informative, but users do not need to modify ``config_defaults.sh`` to run the out-of-the-box case for the SRW App. Users may skip to :numref:`Step %s ` to continue configuring their experiment. + +Important configuration variables in the ``config_defaults.sh`` file appear in +:numref:`Table %s `. 
Some of these default values are intentionally invalid in order to ensure that the user assigns valid values in the user-specified ``config.sh`` file. Any settings provided in ``config.sh`` will override the default ``config_defaults.sh`` +settings. There is usually no need for a user to modify the default configuration file. Additional information on the default settings can be found in the file itself and in :numref:`Chapter %s `. + +.. _ConfigVarsDefault: + +.. table:: Configuration variables specified in the config_defaults.sh script. + + +----------------------+------------------------------------------------------------+ + | **Group Name** | **Configuration variables** | + +======================+============================================================+ + | Experiment mode | RUN_ENVIR | + +----------------------+------------------------------------------------------------+ + | Machine and queue | MACHINE, ACCOUNT, SCHED, PARTITION_DEFAULT, QUEUE_DEFAULT, | | | PARTITION_HPSS, QUEUE_HPSS, PARTITION_FCST, QUEUE_FCST | + +----------------------+------------------------------------------------------------+ + | Cron | USE_CRON_TO_RELAUNCH, CRON_RELAUNCH_INTVL_MNTS | + +----------------------+------------------------------------------------------------+ + | Experiment Dir. | EXPT_BASEDIR, EXPT_SUBDIR | + +----------------------+------------------------------------------------------------+ + | NCO mode | COMINgfs, STMP, NET, envir, RUN, PTMP | + +----------------------+------------------------------------------------------------+ + | Separator | DOT_OR_USCORE | + +----------------------+------------------------------------------------------------+ + | File name | EXPT_CONFIG_FN, RGNL_GRID_NML_FN, DATA_TABLE_FN, | | | DIAG_TABLE_FN, FIELD_TABLE_FN, FV3_NML_BASE_SUITE_FN, | | | FV3_NML_YAML_CONFIG_FN, FV3_NML_BASE_ENS_FN, | | | MODEL_CONFIG_FN, NEMS_CONFIG_FN, FV3_EXEC_FN, | | | WFLOW_XML_FN, GLOBAL_VAR_DEFNS_FN, | | | EXTRN_MDL_ICS_VAR_DEFNS_FN, EXTRN_MDL_LBCS_VAR_DEFNS_FN, | | | WFLOW_LAUNCH_SCRIPT_FN, WFLOW_LAUNCH_LOG_FN | + +----------------------+------------------------------------------------------------+ + | Forecast | DATE_FIRST_CYCL, DATE_LAST_CYCL, CYCL_HRS, FCST_LEN_HRS | + +----------------------+------------------------------------------------------------+ + | IC/LBC | EXTRN_MDL_NAME_ICS, EXTRN_MDL_NAME_LBCS, | | | LBC_SPEC_INTVL_HRS, FV3GFS_FILE_FMT_ICS, | | | FV3GFS_FILE_FMT_LBCS | + +----------------------+------------------------------------------------------------+ + | NOMADS | NOMADS, NOMADS_file_type | + +----------------------+------------------------------------------------------------+ + | External model | USE_USER_STAGED_EXTRN_FILES, EXTRN_MDL_SOURCE_BASEDIR_ICS, | | | EXTRN_MDL_FILES_ICS, EXTRN_MDL_SOURCE_BASEDIR_LBCS, | | | EXTRN_MDL_FILES_LBCS | + +----------------------+------------------------------------------------------------+ + | CCPP | CCPP_PHYS_SUITE | + +----------------------+------------------------------------------------------------+ + | GRID | GRID_GEN_METHOD | + +----------------------+------------------------------------------------------------+ + | ESG grid | ESGgrid_LON_CTR, ESGgrid_LAT_CTR, ESGgrid_DELX, | | | ESGgrid_DELY, ESGgrid_NX, ESGgrid_NY, | | | ESGgrid_WIDE_HALO_WIDTH | + +----------------------+------------------------------------------------------------+ + | Input configuration | DT_ATMOS, LAYOUT_X, LAYOUT_Y, BLOCKSIZE, QUILTING, | | | PRINT_ESMF, WRTCMP_write_groups, | | |
WRTCMP_write_tasks_per_group, WRTCMP_output_grid, | + | | WRTCMP_cen_lon, WRTCMP_cen_lat, WRTCMP_lon_lwr_left, | + | | WRTCMP_lat_lwr_left, WRTCMP_lon_upr_rght, | + | | WRTCMP_lat_upr_rght, WRTCMP_dlon, WRTCMP_dlat, | + | | WRTCMP_stdlat1, WRTCMP_stdlat2, WRTCMP_nx, WRTCMP_ny, | + | | WRTCMP_dx, WRTCMP_dy | + +----------------------+------------------------------------------------------------+ + | Pre-existing grid | PREDEF_GRID_NAME, PREEXISTING_DIR_METHOD, VERBOSE | + +----------------------+------------------------------------------------------------+ + | Cycle-independent | RUN_TASK_MAKE_GRID, GRID_DIR, RUN_TASK_MAKE_OROG, | + | | OROG_DIR, RUN_TASK_MAKE_SFC_CLIMO, SFC_CLIMO_DIR | + +----------------------+------------------------------------------------------------+ + | Surface climatology | SFC_CLIMO_FIELDS, FIXgsm, TOPO_DIR, SFC_CLIMO_INPUT_DIR, | + | | FNGLAC, FNMXIC, FNTSFC, FNSNOC, FNZORC, FNAISC, FNSMCC, | + | | FNMSKH, FIXgsm_FILES_TO_COPY_TO_FIXam, | + | | FV3_NML_VARNAME_TO_FIXam_FILES_MAPPING, | + | | FV3_NML_VARNAME_TO_SFC_CLIMO_FIELD_MAPPING, | + | | CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING | + +----------------------+------------------------------------------------------------+ + | Workflow task | MAKE_GRID_TN, MAKE_OROG_TN, MAKE_SFC_CLIMO_TN, | + | | GET_EXTRN_ICS_TN, GET_EXTRN_LBCS_TN, MAKE_ICS_TN, | + | | MAKE_LBCS_TN, RUN_FCST_TN, RUN_POST_TN | + +----------------------+------------------------------------------------------------+ + | NODE | NNODES_MAKE_GRID, NNODES_MAKE_OROG, NNODES_MAKE_SFC_CLIMO, | + | | NNODES_GET_EXTRN_ICS, NNODES_GET_EXTRN_LBCS, | + | | NNODES_MAKE_ICS, NNODES_MAKE_LBCS, NNODES_RUN_FCST, | + | | NNODES_RUN_POST | + +----------------------+------------------------------------------------------------+ + | MPI processes | PPN_MAKE_GRID, PPN_MAKE_OROG, PPN_MAKE_SFC_CLIMO, | + | | PPN_GET_EXTRN_ICS, PPN_GET_EXTRN_LBCS, PPN_MAKE_ICS, | + | | PPN_MAKE_LBCS, PPN_RUN_FCST, PPN_RUN_POST | + +----------------------+------------------------------------------------------------+ + | Walltime | WTIME_MAKE_GRID, WTIME_MAKE_OROG, WTIME_MAKE_SFC_CLIMO, | + | | WTIME_GET_EXTRN_ICS, WTIME_GET_EXTRN_LBCS, WTIME_MAKE_ICS, | + | | WTIME_MAKE_LBCS, WTIME_RUN_FCST, WTIME_RUN_POST | + +----------------------+------------------------------------------------------------+ + | Maximum attempt | MAXTRIES_MAKE_GRID, MAXTRIES_MAKE_OROG, | + | | MAXTRIES_MAKE_SFC_CLIMO, MAXTRIES_GET_EXTRN_ICS, | + | | MAXTRIES_GET_EXTRN_LBCS, MAXTRIES_MAKE_ICS, | + | | MAXTRIES_MAKE_LBCS, MAXTRIES_RUN_FCST, MAXTRIES_RUN_POST | + +----------------------+------------------------------------------------------------+ + | Post configuration | USE_CUSTOM_POST_CONFIG_FILE, CUSTOM_POST_CONFIG_FP | + +----------------------+------------------------------------------------------------+ + | Running ensembles | DO_ENSEMBLE, NUM_ENS_MEMBERS | + +----------------------+------------------------------------------------------------+ + | Stochastic physics | DO_SHUM, DO_SPPT, DO_SKEB, SHUM_MAG, SHUM_LSCALE, | + | | SHUM_TSCALE, SHUM_INT, SPPT_MAG, SPPT_LSCALE, SPPT_TSCALE, | + | | SPPT_INT, SKEB_MAG, SKEB_LSCALE, SKEP_TSCALE, SKEB_INT, | + | | SKEB_VDOF, USE_ZMTNBLCK | + +----------------------+------------------------------------------------------------+ + | Boundary blending | HALO_BLEND | + +----------------------+------------------------------------------------------------+ + | FVCOM | USE_FVCOM, FVCOM_DIR, FVCOM_FILE | + +----------------------+------------------------------------------------------------+ + | 
Compiler | COMPILER | + +----------------------+------------------------------------------------------------+ + | METplus | MODEL, MET_INSTALL_DIR, MET_BIN_EXEC, METPLUS_PATH, | + | | CCPA_OBS_DIR, MRMS_OBS_DIR, NDAS_OBS_DIR | + +----------------------+------------------------------------------------------------+ + + + + +.. _UserSpecificConfig: + +User-specific configuration: ``config.sh`` +-------------------------------------------- + +The user must specify certain basic information about the experiment in a ``config.sh`` file located in the ``ufs-srweather-app/regional_workflow/ush`` directory. Two example templates are provided in that directory: ``config.community.sh`` and ``config.nco.sh``. The first file is a minimal example for creating and running an experiment in the *community* mode (with ``RUN_ENVIR`` set to ``community``). The second is an example for creating and running an experiment in the *NCO* (operational) mode (with ``RUN_ENVIR`` set to ``nco``). The *community* mode is recommended in most cases and will be fully supported for this release. The operational/NCO mode will typically be used by those at the NOAA/NCEP/Environmental Modeling Center (EMC) and the NOAA/Global Systems Laboratory (GSL) working on pre-implementation testing for the Rapid Refresh Forecast System (RRFS). :numref:`Table %s ` shows the configuration variables, along with their default values in ``config_default.sh`` and the values defined in ``config.community.sh``. + +.. _ConfigCommunity: + +.. table:: Configuration variables specified in the config.community.sh script + + +--------------------------------+-------------------+--------------------------------------------------------+ + | **Parameter** | **Default Value** | **config.community.sh Value** | + +================================+===================+========================================================+ + | MACHINE | "BIG_COMPUTER" | "hera" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | ACCOUNT | "project_name" | "an_account" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | EXPT_SUBDIR | "" | "test_CONUS_25km_GFSv16" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | VERBOSE | "TRUE" | "TRUE" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | RUN_ENVIR | "nco" | "community" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | PREEXISTING_DIR_METHOD | "delete" | "rename" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | PREDEF_GRID_NAME | "" | "RRFS_CONUS_25km" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | GRID_GEN_METHOD | "ESGgrid" | "ESGgrid" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | QUILTING | "TRUE" | "TRUE" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | CCPP_PHYS_SUITE | "FV3_GSD_V0" | "FV3_GFS_v16" | + +--------------------------------+-------------------+--------------------------------------------------------+ + | FCST_LEN_HRS | "24" | "48" | + 
+--------------------------------+-------------------+--------------------------------------------------------+ + | LBC_SPEC_INTVL_HRS | "6" | "6" | +--------------------------------+-------------------+--------------------------------------------------------+ + | DATE_FIRST_CYCL | "YYYYMMDD" | "20190615" | +--------------------------------+-------------------+--------------------------------------------------------+ + | DATE_LAST_CYCL | "YYYYMMDD" | "20190615" | +--------------------------------+-------------------+--------------------------------------------------------+ + | CYCL_HRS | ("HH1" "HH2") | "00" | +--------------------------------+-------------------+--------------------------------------------------------+ + | EXTRN_MDL_NAME_ICS | "FV3GFS" | "FV3GFS" | +--------------------------------+-------------------+--------------------------------------------------------+ + | EXTRN_MDL_NAME_LBCS | "FV3GFS" | "FV3GFS" | +--------------------------------+-------------------+--------------------------------------------------------+ + | FV3GFS_FILE_FMT_ICS | "nemsio" | "grib2" | +--------------------------------+-------------------+--------------------------------------------------------+ + | FV3GFS_FILE_FMT_LBCS | "nemsio" | "grib2" | +--------------------------------+-------------------+--------------------------------------------------------+ + | WTIME_RUN_FCST | "04:30:00" | "01:00:00" | +--------------------------------+-------------------+--------------------------------------------------------+ + | USE_USER_STAGED_EXTRN_FILES | "FALSE" | "TRUE" | +--------------------------------+-------------------+--------------------------------------------------------+ + | EXTRN_MDL_SOURCE_BASEDIR_ICS | "" | "/scratch2/BMC/det/UFS_SRW_app/v1p0/model_data/FV3GFS" | +--------------------------------+-------------------+--------------------------------------------------------+ + | EXTRN_MDL_FILES_ICS | "" | "gfs.pgrb2.0p25.f000" | +--------------------------------+-------------------+--------------------------------------------------------+ + | EXTRN_MDL_SOURCE_BASEDIR_LBCS | "" | "/scratch2/BMC/det/UFS_SRW_app/v1p0/model_data/FV3GFS" | +--------------------------------+-------------------+--------------------------------------------------------+ + | EXTRN_MDL_FILES_LBCS | "" | "gfs.pgrb2.0p25.f006" | +--------------------------------+-------------------+--------------------------------------------------------+ + | MODEL | "" | "FV3_GFS_v16_CONUS_25km" | +--------------------------------+-------------------+--------------------------------------------------------+ + | METPLUS_PATH | "" | "/path/to/METplus" | +--------------------------------+-------------------+--------------------------------------------------------+ + | MET_INSTALL_DIR | "" | "/path/to/MET" | +--------------------------------+-------------------+--------------------------------------------------------+ + | CCPA_OBS_DIR | "" | "/path/to/processed/CCPA/data" | +--------------------------------+-------------------+--------------------------------------------------------+ + | MRMS_OBS_DIR | "" | "/path/to/processed/MRMS/data" | +--------------------------------+-------------------+--------------------------------------------------------+ + | NDAS_OBS_DIR | "" | "/path/to/processed/NDAS/data" | +--------------------------------+-------------------+--------------------------------------------------------+ + | RUN_TASK_GET_OBS_CCPA | "FALSE" | "FALSE" |
+--------------------------------+-------------------+--------------------------------------------------------+ + | RUN_TASK_GET_OBS_MRMS | "FALSE" | "FALSE" | +--------------------------------+-------------------+--------------------------------------------------------+ + | RUN_TASK_GET_OBS_NDAS | "FALSE" | "FALSE" | +--------------------------------+-------------------+--------------------------------------------------------+ + | RUN_TASK_VX_GRIDSTAT | "FALSE" | "FALSE" | +--------------------------------+-------------------+--------------------------------------------------------+ + | RUN_TASK_VX_POINTSTAT | "FALSE" | "FALSE" | +--------------------------------+-------------------+--------------------------------------------------------+ + | RUN_TASK_VX_ENSGRID | "FALSE" | "FALSE" | +--------------------------------+-------------------+--------------------------------------------------------+ + | RUN_TASK_VX_ENSPOINT | "FALSE" | "FALSE" | +--------------------------------+-------------------+--------------------------------------------------------+ + + + +To get started, make a copy of ``config.community.sh``. From the ``ufs-srweather-app`` directory, run: + +.. code-block:: console + + cd regional_workflow/ush + cp config.community.sh config.sh + +The default settings in this file include a predefined 25-km :term:`CONUS` grid (RRFS_CONUS_25km), the :term:`GFS` v16 physics suite (FV3_GFS_v16 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. + +Next, edit the new ``config.sh`` file to customize it for your machine. At a minimum, change the ``MACHINE`` and ``ACCOUNT`` variables; then choose a name for the experiment directory by setting ``EXPT_SUBDIR``. If you have pre-staged the initialization data for the experiment, set ``USE_USER_STAGED_EXTRN_FILES="TRUE"``, and set the paths to the data for ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS``. + +Sample settings are indicated below for Level 1 platforms. Detailed guidance applicable to all systems can be found in :numref:`Chapter %s: Configuring the Workflow `, which discusses each variable and the options available. Additionally, information about the four predefined Limited Area Model (LAM) Grid options can be found in :numref:`Chapter %s: Limited Area Model (LAM) Grids `. + +.. important:: + + If your modulefile uses a GNU compiler to set up the build environment in :numref:`Section %s `, make sure that the line ``COMPILER="gnu"`` appears in the ``config.sh`` file. + +.. hint:: + + To determine an appropriate ACCOUNT field for Level 1 systems, run ``groups``, which returns a list of the projects you have permissions for. Not all of the listed projects/groups have an HPC allocation, but those that do are potentially valid account names. + +Minimum parameter settings for running the out-of-the-box SRW App case on Level 1 machines are shown below, followed by a filled-in example: + +**Cheyenne:** + +.. code-block:: console + + MACHINE="cheyenne" + ACCOUNT="" + EXPT_SUBDIR="" + USE_USER_STAGED_EXTRN_FILES="TRUE" + EXTRN_MDL_SOURCE_BASEDIR_ICS="/glade/p/ral/jntp/UFS_SRW_App/develop/input_model_data/<model_type>/<data_type>/YYYYMMDDHH/" + EXTRN_MDL_SOURCE_BASEDIR_LBCS="/glade/p/ral/jntp/UFS_SRW_App/develop/input_model_data/<model_type>/<data_type>/YYYYMMDDHH/" + +where: + +* ``<model_type>`` refers to a subdirectory such as "FV3GFS" or "HRRR" containing the experiment data. +* ``<data_type>`` refers to one of 3 possible data formats: ``grib2``, ``nemsio``, or ``netcdf``. +* YYYYMMDDHH refers to a subdirectory containing data for the :term:`cycle` date.
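+ +As an illustration, the Cheyenne snippet above might be filled in as follows for the out-of-the-box case (FV3GFS ``grib2`` data for the June 15, 2019 00z cycle). The account and experiment directory names are placeholders, and the exact staged-data layout should be verified on the system before use: + +.. code-block:: console + + MACHINE="cheyenne" + ACCOUNT="my_account" + EXPT_SUBDIR="test_CONUS_25km_GFSv16" + USE_USER_STAGED_EXTRN_FILES="TRUE" + EXTRN_MDL_SOURCE_BASEDIR_ICS="/glade/p/ral/jntp/UFS_SRW_App/develop/input_model_data/FV3GFS/grib2/2019061500/" + EXTRN_MDL_SOURCE_BASEDIR_LBCS="/glade/p/ral/jntp/UFS_SRW_App/develop/input_model_data/FV3GFS/grib2/2019061500/"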
+ + +**Hera, Jet, Orion, Gaea:** + +The ``MACHINE``, ``ACCOUNT``, and ``EXPT_SUBDIR`` settings are the same as for Cheyenne, except that ``"cheyenne"`` should be switched to ``"hera"``, ``"jet"``, ``"orion"``, or ``"gaea"``, respectively. Set ``USE_USER_STAGED_EXTRN_FILES="TRUE"``, but replace the file paths to Cheyenne's data with the file paths for the correct machine. ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` use the same base file path. + +On Hera: + +.. code-block:: console + + "/scratch2/BMC/det/UFS_SRW_App/develop/input_model_data///YYYYMMDDHH/" + +On Jet: + +.. code-block:: console + + "/mnt/lfs4/BMC/wrfruc/UFS_SRW_App/develop/input_model_data///YYYYMMDDHH/" + +On Orion: + +.. code-block:: console + + "/work/noaa/fv3-cam/UFS_SRW_App/develop/input_model_data///YYYYMMDDHH/" + +On Gaea: + +.. code-block:: console + + "/lustre/f2/pdata/ncep/UFS_SRW_App/develop/input_model_data///YYYYMMDDHH/" + +For **WCOSS** systems, edit ``config.sh`` with these WCOSS-specific parameters, and use a valid WCOSS project code for the account parameter: + +.. code-block:: console + + MACHINE="wcoss_cray" or MACHINE="wcoss_dell_p3" + ACCOUNT="valid_wcoss_project_code" + EXPT_SUBDIR="my_expt_name" + USE_USER_STAGED_EXTRN_FILES="TRUE" + +For WCOSS_DELL_P3: + +.. code-block:: console + + EXTRN_MDL_SOURCE_BASEDIR_ICS="/gpfs/dell2/emc/modeling/noscrub/UFS_SRW_App/develop/model_data///YYYYMMDDHH/ICS" + EXTRN_MDL_SOURCE_BASEDIR_LBCS="/gpfs/dell2/emc/modeling/noscrub/UFS_SRW_App/develop/input_model_data///YYYYMMDDHH/LBCS" + +**NOAA Cloud Systems:** + +.. code-block:: console + + MACHINE="NOAACLOUD" + ACCOUNT="none" + EXPT_SUBDIR="" + EXPT_BASEDIR="lustre/$USER/expt_dirs" + COMPILER="gnu" + USE_USER_STAGED_EXTRN_FILES="TRUE" + EXTRN_MDL_SOURCE_BASEDIR_ICS="/contrib/EPIC/UFS_SRW_App/develop/input_model_data/FV3GFS" + EXTRN_MDL_FILES_ICS=( "gfs.t18z.pgrb2.0p25.f000" ) + EXTRN_MDL_SOURCE_BASEDIR_LBCS="/contrib/EPIC/UFS_SRW_App/develop/input_model_data/FV3GFS" + EXTRN_MDL_FILES_LBCS=( "gfs.t18z.pgrb2.0p25.f006" "gfs.t18z.pgrb2.0p25.f012" ) + +.. note:: + + The values of the configuration variables should be consistent with those in the + ``valid_param_vals script``. In addition, various example configuration files can be + found in the ``regional_workflow/tests/baseline_configs`` directory. + +.. _VXConfig: + +Configure METplus Verification Suite (Optional) +-------------------------------------------------- + +Users who want to use the METplus verification suite to evaluate their forecasts need to add additional information to their ``config.sh`` file. Other users may skip to the :ref:`next section `. + +.. attention:: + METplus *installation* is not included as part of the build process for this release of the SRW App. However, METplus is preinstalled on `Level 1 `__ systems. For the v2 release, METplus *use* is supported on systems with a functioning METplus installation, although installation itself is not supported. For more information about METplus, see :numref:`Section %s `. + +.. note:: + If METplus users update their METplus installation, they must update the module load statements in ``ufs-srweather-app/regional_workflow/modulefiles/tasks//run_vx.local`` file to correspond to their system's updated installation: + + .. code-block:: console + + module use -a + module load met/ + +To use METplus verification, the path to the MET and METplus directories must be added to ``config.sh``: + +.. 
code-block:: console + + METPLUS_PATH="" + MET_INSTALL_DIR="" + +Users who have already staged the observation data needed for METplus (i.e., the :term:`CCPA`, :term:`MRMS`, and :term:`NDAS` data) on their system should set the path to this data and set the corresponding ``RUN_TASK_GET_OBS_*`` parameters to "FALSE" in ``config.sh``. + +.. code-block:: console + + CCPA_OBS_DIR="/path/to/UFS_SRW_app/develop/obs_data/ccpa/proc" + MRMS_OBS_DIR="/path/to/UFS_SRW_app/develop/obs_data/mrms/proc" + NDAS_OBS_DIR="/path/to/UFS_SRW_app/develop/obs_data/ndas/proc" + RUN_TASK_GET_OBS_CCPA="FALSE" + RUN_TASK_GET_OBS_MRMS="FALSE" + RUN_TASK_GET_OBS_NDAS="FALSE" + +If users have access to NOAA HPSS but have not pre-staged the data, they can simply set the ``RUN_TASK_GET_OBS_*`` tasks to "TRUE", and the machine will attempt to download the appropriate data from NOAA HPSS. The ``*_OBS_DIR`` paths must be set to the location where users want the downloaded data to reside. + +Users who do not have access to NOAA HPSS and do not have the data on their system will need to download :term:`CCPA`, :term:`MRMS`, and :term:`NDAS` data manually from collections of publicly available data, such as the ones listed `here `__. + +Next, the verification tasks must be turned on according to the user's needs. Users should add some or all of the following tasks to ``config.sh``, depending on the verification procedure(s) they have in mind: + +.. code-block:: console + + RUN_TASK_VX_GRIDSTAT="TRUE" + RUN_TASK_VX_POINTSTAT="TRUE" + RUN_TASK_VX_ENSGRID="TRUE" + RUN_TASK_VX_ENSPOINT="TRUE" + +These tasks are independent, so users may set some values to "TRUE" and others to "FALSE" depending on the needs of their experiment. Note that the ENSGRID and ENSPOINT tasks apply only to ensemble model verification. Additional verification tasks appear in :numref:`Table %s ` More details on all of the parameters in this section are available in :numref:`Chapter %s `. + +.. _SetUpPythonEnv: + +Set up the Python and other Environment Parameters +-------------------------------------------------- +The workflow requires Python 3 with the packages 'PyYAML', 'Jinja2', and 'f90nml' available. This Python environment has already been set up on Level 1 platforms, and it can be activated in the following way (from ``/ufs-srweather-app/regional_workflow/ush``): + +.. code-block:: console + + module use + module load wflow_ + conda activate regional_workflow + +This command will activate the ``regional_workflow`` conda environment. The user should see ``(regional_workflow)`` in front of the Terminal prompt at this point. If this is not the case, activate the regional workflow from the ``ush`` directory by running: + +.. code-block:: console + + conda init + source ~/.bashrc + conda activate regional_workflow + + +.. _GenerateWorkflow: + +Generate the Regional Workflow +------------------------------------------- + +Run the following command from the ``ufs-srweather-app/regional_workflow/ush`` directory to generate the workflow: + +.. code-block:: console + + ./generate_FV3LAM_wflow.sh + +The last line of output from this script, starting with ``*/1 * * * *`` or ``*/3 * * * *``, can be saved and :ref:`used later ` to automatically run portions of the workflow. + +This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. The flowchart in :numref:`Figure %s ` describes the experiment generation process. 
First, ``generate_FV3LAM_wflow.sh`` runs the ``setup.sh`` script to set the configuration parameters. Second, it copies the time-independent (fix) files and other necessary data input files from their location in the ufs-weather-model directory to the experiment directory (``EXPTDIR``). Third, it copies the weather model executable (``ufs_model``) from the ``bin`` directory to ``EXPTDIR`` and creates the input namelist file ``input.nml`` based on the ``input.nml.FV3`` file in the regional_workflow/ush/templates directory. Lastly, it creates the workflow XML file ``FV3LAM_wflow.xml`` that is executed when running the experiment with the Rocoto workflow manager. + +The ``setup.sh`` script reads three other configuration scripts in order: (1) ``config_default.sh`` (:numref:`Section %s `), (2) ``config.sh`` (:numref:`Section %s `), and (3) ``set_predef_grid_params.sh`` (:numref:`Section %s `). If a parameter is specified differently in these scripts, the file containing the last defined value will be used. + +The generated workflow will appear in ``EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}``. These variables were specified in the ``config.sh`` file in :numref:`Step %s `. The settings for these paths can also be viewed in the console output from the ``./generate_FV3LAM_wflow.sh`` script or in the ``log.generate_FV3LAM_wflow`` file, which can be found in ``$EXPTDIR``. + +.. _WorkflowGeneration: + +.. figure:: _static/FV3regional_workflow_gen.png + + *Experiment generation description* + +.. _WorkflowTaskDescription: + +Description of Workflow Tasks +-------------------------------- + +.. note:: + This section gives a general overview of workflow tasks. To begin running the workflow, skip to :numref:`Step %s ` + +:numref:`Figure %s ` illustrates the overall workflow. Individual tasks that make up the workflow are specified in the ``FV3LAM_wflow.xml`` file. :numref:`Table %s ` describes the function of each baseline task. The first three pre-processing tasks; ``MAKE_GRID``, ``MAKE_OROG``, and ``MAKE_SFC_CLIMO`` are optional. If the user stages pre-generated grid, orography, and surface climatology fix files, these three tasks can be skipped by adding the following lines to the ``config.sh`` file before running the ``generate_FV3LAM_wflow.sh`` script: + +.. code-block:: console + + RUN_TASK_MAKE_GRID="FALSE" + RUN_TASK_MAKE_OROG="FALSE" + RUN_TASK_MAKE_SFC_CLIMO="FALSE" + + +.. _WorkflowTasksFig: + +.. figure:: _static/FV3LAM_wflow_flowchart_v2.png + + *Flowchart of the workflow tasks* + + +The ``FV3LAM_wflow.xml`` file runs the specific j-job scripts (``regional_workflow/jobs/JREGIONAL_[task name]``) in the prescribed order when the experiment is launched via the ``launch_FV3LAM_wflow.sh`` script or the ``rocotorun`` command. Each j-job task has its own source script (or "ex-script") named ``exregional_[task name].sh`` in the ``regional_workflow/scripts`` directory. Two database files named ``FV3LAM_wflow.db`` and ``FV3LAM_wflow_lock.db`` are generated and updated by the Rocoto calls. There is usually no need for users to modify these files. To relaunch the workflow from scratch, delete these two ``*.db`` files and then call the launch script repeatedly for each task. + + +.. _WorkflowTasksTable: + +.. 
table:: Baseline workflow tasks in the SRW App + + +----------------------+------------------------------------------------------------+ + | **Workflow Task** | **Task Description** | + +======================+============================================================+ + | make_grid | Pre-processing task to generate regional grid files. Only | + | | needs to be run once per experiment. | + +----------------------+------------------------------------------------------------+ + | make_orog | Pre-processing task to generate orography files. Only | + | | needs to be run once per experiment. | + +----------------------+------------------------------------------------------------+ + | make_sfc_climo | Pre-processing task to generate surface climatology files. | + | | Only needs to be run, at most, once per experiment. | + +----------------------+------------------------------------------------------------+ + | get_extrn_ics | Cycle-specific task to obtain external data for the | + | | initial conditions | + +----------------------+------------------------------------------------------------+ + | get_extrn_lbcs | Cycle-specific task to obtain external data for the | + | | lateral boundary conditions (LBC's) | + +----------------------+------------------------------------------------------------+ + | make_ics | Generate initial conditions from the external data | + +----------------------+------------------------------------------------------------+ + | make_lbcs | Generate LBC's from the external data | + +----------------------+------------------------------------------------------------+ + | run_fcst | Run the forecast model (UFS weather model) | + +----------------------+------------------------------------------------------------+ + | run_post | Run the post-processing tool (UPP) | + +----------------------+------------------------------------------------------------+ + +In addition to the baseline tasks described in :numref:`Table %s ` above, users may choose to run some or all of the METplus verification tasks. These tasks are described in :numref:`Table %s ` below. + +.. _VXWorkflowTasksTable: + +.. table:: Verification (VX) workflow tasks in the SRW App + + +-----------------------+------------------------------------------------------------+ + | **Workflow Task** | **Task Description** | + +=======================+============================================================+ + | GET_OBS_CCPA | Retrieves and organizes hourly :term:`CCPA` data from NOAA | + | | HPSS. Can only be run if ``RUN_TASK_GET_OBS_CCPA="TRUE"`` | + | | *and* user has access to NOAA HPSS data. | + +-----------------------+------------------------------------------------------------+ + | GET_OBS_NDAS | Retrieves and organizes hourly :term:`NDAS` data from NOAA | + | | HPSS. Can only be run if ``RUN_TASK_GET_OBS_NDAS="TRUE"`` | + | | *and* user has access to NOAA HPSS data. | + +-----------------------+------------------------------------------------------------+ + | GET_OBS_MRMS | Retrieves and organizes hourly :term:`MRMS` composite | + | | reflectivity and :term:`echo top` data from NOAA HPSS. Can | + | | only be run if ``RUN_TASK_GET_OBS_MRMS="TRUE"`` *and* user | + | | has access to NOAA HPSS data. 
| + +-----------------------+------------------------------------------------------------+ + | VX_GRIDSTAT | Runs METplus grid-to-grid verification for 1-h accumulated | + | | precipitation | + +-----------------------+------------------------------------------------------------+ + | VX_GRIDSTAT_REFC | Runs METplus grid-to-grid verification for composite | + | | reflectivity | + +-----------------------+------------------------------------------------------------+ + | VX_GRIDSTAT_RETOP | Runs METplus grid-to-grid verification for :term:`echo top`| + +-----------------------+------------------------------------------------------------+ + | VX_GRIDSTAT_##h | Runs METplus grid-to-grid verification for 3-h, 6-h, and | + | | 24-h (i.e., daily) accumulated precipitation. Valid values | + | | of ``##`` are ``03``, ``06``, and ``24``. | + +-----------------------+------------------------------------------------------------+ + | VX_POINTSTAT | Runs METplus grid-to-point verification for surface and | + | | upper-air variables | + +-----------------------+------------------------------------------------------------+ + | VX_ENSGRID | Runs METplus grid-to-grid ensemble verification for 1-h | + | | accumulated precipitation. Can only be run if | + | | ``DO_ENSEMBLE="TRUE"`` and ``RUN_TASK_VX_ENSGRID="TRUE"``. | + +-----------------------+------------------------------------------------------------+ + | VX_ENSGRID_REFC | Runs METplus grid-to-grid ensemble verification for | + | | composite reflectivity. Can only be run if | + | | ``DO_ENSEMBLE="TRUE"`` and | + | | ``RUN_TASK_VX_ENSGRID = "TRUE"``. | + +-----------------------+------------------------------------------------------------+ + | VX_ENSGRID_RETOP | Runs METplus grid-to-grid ensemble verification for | + | | :term:`echo top`. Can only be run if ``DO_ENSEMBLE="TRUE"``| + | | and ``RUN_TASK_VX_ENSGRID="TRUE"``. | + +-----------------------+------------------------------------------------------------+ + | VX_ENSGRID_##h | Runs METplus grid-to-grid ensemble verification for 3-h, | + | | 6-h, and 24-h (i.e., daily) accumulated precipitation. | + | | Valid values of ``##`` are ``03``, ``06``, and ``24``. Can | + | | only be run if ``DO_ENSEMBLE="TRUE"`` and | + | | ``RUN_TASK_VX_ENSGRID="TRUE"``. | + +-----------------------+------------------------------------------------------------+ + | VX_ENSGRID_MEAN | Runs METplus grid-to-grid verification for ensemble mean | + | | 1-h accumulated precipitation. Can only be run if | + | | ``DO_ENSEMBLE="TRUE"`` and ``RUN_TASK_VX_ENSGRID="TRUE"``. | + +-----------------------+------------------------------------------------------------+ + | VX_ENSGRID_PROB | Runs METplus grid-to-grid verification for 1-h accumulated | + | | precipitation probabilistic output. Can only be run if | + | | ``DO_ENSEMBLE="TRUE"`` and ``RUN_TASK_VX_ENSGRID="TRUE"``. | + +-----------------------+------------------------------------------------------------+ + | VX_ENSGRID_MEAN_##h | Runs METplus grid-to-grid verification for ensemble mean | + | | 3-h, 6-h, and 24h (i.e., daily) accumulated precipitation. | + | | Valid values of ``##`` are ``03``, ``06``, and ``24``. Can | + | | only be run if ``DO_ENSEMBLE="TRUE"`` and | + | | ``RUN_TASK_VX_ENSGRID="TRUE"``. | + +-----------------------+------------------------------------------------------------+ + | VX_ENSGRID_PROB_##h | Runs METplus grid-to-grid verification for 3-h, 6-h, and | + | | 24h (i.e., daily) accumulated precipitation probabilistic | + | | output. 
Valid values of ``##`` are ``03``, ``06``, and | + | | ``24``. Can only be run if ``DO_ENSEMBLE="TRUE"`` and | + | | ``RUN_TASK_VX_ENSGRID="TRUE"``. | + +-----------------------+------------------------------------------------------------+ + | VX_ENSGRID_PROB_REFC | Runs METplus grid-to-grid verification for ensemble | + | | probabilities for composite reflectivity. Can only be run | + | | if ``DO_ENSEMBLE="TRUE"`` and | + | | ``RUN_TASK_VX_ENSGRID="TRUE"``. | + +-----------------------+------------------------------------------------------------+ + | VX_ENSGRID_PROB_RETOP | Runs METplus grid-to-grid verification for ensemble | + | | probabilities for :term:`echo top`. Can only be run if | + | | ``DO_ENSEMBLE="TRUE"`` and ``RUN_TASK_VX_ENSGRID="TRUE"``. | + +-----------------------+------------------------------------------------------------+ + | VX_ENSPOINT | Runs METplus grid-to-point ensemble verification for | + | | surface and upper-air variables. Can only be run if | + | | ``DO_ENSEMBLE="TRUE"`` and ``RUN_TASK_VX_ENSPOINT="TRUE"``.| + +-----------------------+------------------------------------------------------------+ + | VX_ENSPOINT_MEAN | Runs METplus grid-to-point verification for ensemble mean | + | | surface and upper-air variables. Can only be run if | + | | ``DO_ENSEMBLE="TRUE"`` and ``RUN_TASK_VX_ENSPOINT="TRUE"``.| + +-----------------------+------------------------------------------------------------+ + | VX_ENSPOINT_PROB | Runs METplus grid-to-point verification for ensemble | + | | probabilities for surface and upper-air variables. Can | + | | only be run if ``DO_ENSEMBLE="TRUE"`` and | + | | ``RUN_TASK_VX_ENSPOINT="TRUE"``. | + +-----------------------+------------------------------------------------------------+ + + +.. _RocotoRun: + +Run the Workflow Using Rocoto +============================= + +.. attention:: + + If users are running the SRW App in a container or on a system that does not have Rocoto installed (e.g., `Level 3 & 4 `__ systems, such as MacOS), they should follow the process outlined in :numref:`Section %s ` instead of the instructions in this section. + +The information in this section assumes that Rocoto is available on the desired platform. All official HPC platforms for the UFS SRW App release make use of the Rocoto workflow management software for running experiments. However, Rocoto cannot be used when running the workflow within a container. If Rocoto is not available, it is still possible to run the workflow using stand-alone scripts according to the process outlined in :numref:`Section %s `. There are two main ways to run the workflow with Rocoto: (1) with the ``launch_FV3LAM_wflow.sh`` script, and (2) by manually calling the ``rocotorun`` command. Users can also automate the workflow using a crontab. + +.. note:: + Users may find it helpful to review :numref:`Chapter %s ` to gain a better understanding of Rocoto commands and workflow management before continuing, but this is not required to run the experiment. + +Optionally, an environment variable can be set to navigate to the ``$EXPTDIR`` more easily. If the login shell is bash, it can be set as follows: + +.. code-block:: console + + export EXPTDIR=// + +If the login shell is csh/tcsh, it can be set using: + +.. code-block:: console + + setenv EXPTDIR /path-to-experiment/directory + + +Launch the Rocoto Workflow Using a Script +----------------------------------------------- + +To run Rocoto using the ``launch_FV3LAM_wflow.sh`` script provided, simply call it without any arguments: + +.. 
code-block:: console + + cd $EXPTDIR + ./launch_FV3LAM_wflow.sh + +This script creates a log file named ``log.launch_FV3LAM_wflow`` in ``$EXPTDIR`` or appends information to it if the file already exists. The launch script also creates the ``log/FV3LAM_wflow.log`` file, which shows Rocoto task information. Check the end of the log files periodically to see how the experiment is progressing: + +.. code-block:: console + + tail -n 40 log.launch_FV3LAM_wflow + +In order to launch additional tasks in the workflow, call the launch script again; this action will need to be repeated until all tasks in the workflow have been launched. To (re)launch the workflow and check its progress on a single line, run: + +.. code-block:: console + + ./launch_FV3LAM_wflow.sh; tail -n 40 log.launch_FV3LAM_wflow + +This will output the last 40 lines of the log file, which list the status of the workflow tasks (e.g., SUCCEEDED, DEAD, RUNNING, SUBMITTING, QUEUED). The number 40 can be changed according to the user's preferences. The output will look like this: + +.. code-block:: console + + CYCLE TASK JOBID STATE EXIT STATUS TRIES DURATION + ====================================================================================================== + 202006170000 make_grid druby://hfe01:33728 SUBMITTING - 0 0.0 + 202006170000 make_orog - - - - - + 202006170000 make_sfc_climo - - - - - + 202006170000 get_extrn_ics druby://hfe01:33728 SUBMITTING - 0 0.0 + 202006170000 get_extrn_lbcs druby://hfe01:33728 SUBMITTING - 0 0.0 + 202006170000 make_ics - - - - - + 202006170000 make_lbcs - - - - - + 202006170000 run_fcst - - - - - + 202006170000 run_post_00 - - - - - + 202006170000 run_post_01 - - - - - + 202006170000 run_post_02 - - - - - + 202006170000 run_post_03 - - - - - + 202006170000 run_post_04 - - - - - + 202006170000 run_post_05 - - - - - + 202006170000 run_post_06 - - - - - + + Summary of workflow status: + ~~~~~~~~~~~~~~~~~~~~~~~~~~ + + 0 out of 1 cycles completed. + Workflow status: IN PROGRESS + +If all the tasks complete successfully, the "Workflow status" at the bottom of the log file will change from "IN PROGRESS" to "SUCCESS". If certain tasks could not complete, the "Workflow status" will instead change to "FAILURE". Error messages for each specific task can be found in the task log files located in ``$EXPTDIR/log``. + +.. _Success: + +The workflow run is complete when all tasks have "SUCCEEDED". If everything goes smoothly, users will eventually see a workflow status table similar to the following: + +.. code-block:: console + + CYCLE TASK JOBID STATE EXIT STATUS TRIES DURATION + ========================================================================================================== + 201906150000 make_grid 4953154 SUCCEEDED 0 1 5.0 + 201906150000 make_orog 4953176 SUCCEEDED 0 1 26.0 + 201906150000 make_sfc_climo 4953179 SUCCEEDED 0 1 33.0 + 201906150000 get_extrn_ics 4953155 SUCCEEDED 0 1 2.0 + 201906150000 get_extrn_lbcs 4953156 SUCCEEDED 0 1 2.0 + 201906150000 make_ics 4953184 SUCCEEDED 0 1 16.0 + 201906150000 make_lbcs 4953185 SUCCEEDED 0 1 71.0 + 201906150000 run_fcst 4953196 SUCCEEDED 0 1 1035.0 + 201906150000 run_post_f000 4953244 SUCCEEDED 0 1 5.0 + 201906150000 run_post_f001 4953245 SUCCEEDED 0 1 4.0 + ... + 201906150000 run_post_f048 4953381 SUCCEEDED 0 1 7.0 + +If users choose to run METplus verification tasks as part of their experiment, the output above will include additional lines after ``run_post_f048``. 
The output will resemble the following but may be significantly longer when using ensemble verification: + +.. code-block:: console + + CYCLE TASK JOBID STATE EXIT STATUS TRIES DURATION + ========================================================================================================== + 201906150000 make_grid 30466134 SUCCEEDED 0 1 5.0 + ... + 201906150000 run_post_f048 30468271 SUCCEEDED 0 1 7.0 + 201906150000 run_gridstatvx 30468420 SUCCEEDED 0 1 53.0 + 201906150000 run_gridstatvx_refc 30468421 SUCCEEDED 0 1 934.0 + 201906150000 run_gridstatvx_retop 30468422 SUCCEEDED 0 1 1002.0 + 201906150000 run_gridstatvx_03h 30468491 SUCCEEDED 0 1 43.0 + 201906150000 run_gridstatvx_06h 30468492 SUCCEEDED 0 1 29.0 + 201906150000 run_gridstatvx_24h 30468493 SUCCEEDED 0 1 20.0 + 201906150000 run_pointstatvx 30468423 SUCCEEDED 0 1 670.0 + + +Launch the Rocoto Workflow Manually +--------------------------------------- + +Load Rocoto +^^^^^^^^^^^^^^^^ + +Instead of running the ``./launch_FV3LAM_wflow.sh`` script, users can load Rocoto and any other required modules. This gives the user more control over the process and allows them to view experiment progress more easily. On Level 1 systems, the Rocoto modules are loaded automatically in :numref:`Step %s `. For most other systems, a variant on the following commands will be necessary to load the Rocoto module: + +.. code-block:: console + + module use + module load rocoto + +Some systems may require a version number (e.g., ``module load rocoto/1.3.3``) + +Run the Rocoto Workflow +^^^^^^^^^^^^^^^^^^^^^^^^^^ + +After loading Rocoto, call ``rocotorun`` from the experiment directory to launch the workflow tasks. This will start any tasks that do not have a dependency. As the workflow progresses through its stages, ``rocotostat`` will show the state of each task and allow users to monitor progress: + +.. code-block:: console + + cd $EXPTDIR + rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 + rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 + +The ``rocotorun`` and ``rocotostat`` commands above will need to be resubmitted regularly and repeatedly until the experiment is finished. In part, this is to avoid having the system time out. This also ensures that when one task ends, tasks dependent on it will run as soon as possible, and ``rocotostat`` will capture the new progress. + +If the experiment fails, the ``rocotostat`` command will indicate which task failed. Users can look at the log file in the ``log`` subdirectory for the failed task to determine what caused the failure. For example, if the ``make_grid`` task failed, users can open the ``make_grid.log`` file to see what caused the problem: + +.. code-block:: console + + cd $EXPTDIR/log + vi make_grid.log + +.. note:: + + If users have the `Slurm workload manager `_ on their system, they can run the ``squeue`` command in lieu of ``rocotostat`` to check what jobs are currently running. + +.. _Automate: + +Automated Option +---------------------- +For automatic resubmission of the workflow at regular intervals (e.g., every minute), the user can add a crontab entry using the ``crontab -e`` command. As mentioned in :numref:`Section %s `, the last line of output from ``./generate_FV3LAM_wflow.sh`` (starting with ``*/1 * * * *`` or ``*/3 * * * *``), can be pasted into the crontab file. It can also be found in the ``$EXPTDIR/log.generate_FV3LAM_wflow`` file. The crontab entry should resemble the following: + +.. 
code-block:: console + + */3 * * * * cd && /apps/rocoto/1.3.3/bin/rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 + +where ```` is changed to correspond to the user's ``$EXPTDIR``, and ``/apps/rocoto/1.3.3/bin/rocotorun`` corresponds to the location of the ``rocotorun`` command on the user's system. The number ``3`` can be changed to a different positive integer and simply means that the workflow will be resubmitted every three minutes. + +.. hint:: + + * On NOAA Cloud instances, ``*/1 * * * *`` is the preferred option for cron jobs because compute nodes will shut down if they remain idle too long. If the compute node shuts down, it can take 15-20 minutes to start up a new one. + * On other NOAA HPC systems, admins discourage the ``*/1 * * * *`` due to load problems. ``*/3 * * * *`` is the preferred option for cron jobs on non-Cloud systems. + +To check the experiment progress: + +.. code-block:: console + + cd $EXPTDIR + rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 + +After finishing the experiment, open the crontab using ``crontab -e`` and delete the crontab entry. + +.. note:: + + On Orion, *cron* is only available on the orion-login-1 node, so users will need to work on that node when running *cron* jobs on Orion. + +The workflow run is complete when all tasks have "SUCCEEDED", and the rocotostat command outputs a table similar to the one :ref:`above `. + +.. _PlotOutput: + +Plot the Output +=============== +Two python scripts are provided to generate plots from the :term:`FV3`-LAM post-processed :term:`GRIB2` output. Information on how to generate the graphics can be found in :numref:`Chapter %s `. diff --git a/docs/UsersGuide/source/CodeReposAndDirs.rst b/docs/UsersGuide/source/CodeReposAndDirs.rst deleted file mode 100644 index e52f5512c0..0000000000 --- a/docs/UsersGuide/source/CodeReposAndDirs.rst +++ /dev/null @@ -1,261 +0,0 @@ -.. _CodeReposAndDirs: - -========================================= -Code Repositories and Directory Structure -========================================= -This chapter describes the code repositories that comprise the UFS SRW Application, -without describing any of the components in detail. - -.. _HierarchicalRepoStr: - -Hierarchical Repository Structure -================================= -The umbrella repository for the UFS SRW Application is named ufs-srweather-app and is -available on GitHub at https://github.com/ufs-community/ufs-srweather-app. An umbrella -repository is defined as a repository that houses external code, called "externals," from -additional repositories. The UFS SRW Application includes the ``manage_externals`` tools -along with a configuration file called ``Externals.cfg``, which describes the external -repositories associated with this umbrella repo (see :numref:`Table %s `). - -.. _top_level_repos: - -.. table:: List of top-level repositories that comprise the UFS SRW Application. 
- - +---------------------------------+---------------------------------------------------------+ - | **Repository Description** | **Authoritative repository URL** | - +=================================+=========================================================+ - | Umbrella repository for the UFS | https://github.com/ufs-community/ufs-srweather-app | - | Short-Range Weather Application | | - +---------------------------------+---------------------------------------------------------+ - | Repository for | https://github.com/ufs-community/ufs-weather-model | - | the UFS Weather Model | | - +---------------------------------+---------------------------------------------------------+ - | Repository for the regional | https://github.com/ufs-community/regional_workflow | - | workflow | | - +---------------------------------+---------------------------------------------------------+ - | Repository for UFS utilities, | https://github.com/ufs-community/UFS_UTILS | - | including pre-processing, | | - | chgres_cube, and more | | - +---------------------------------+---------------------------------------------------------+ - | Repository for the Unified Post | https://github.com/NOAA-EMC/UPP | - | Processor (UPP) | | - +---------------------------------+---------------------------------------------------------+ - -The UFS Weather Model contains a number of sub-repositories used by the model as -documented `here `_. - -Note that the prerequisite libraries (including NCEP Libraries and external libraries) are not -included in the UFS SRW Application repository. The source code for these components resides in -the repositories `NCEPLIBS `_ and `NCEPLIBS-external -`_. - -These external components are already built on the preconfigured platforms listed `here -`_. -However, they must be cloned and built on other platforms according to the instructions provided -in the wiki pages of those repositories: https://github.com/NOAA-EMC/NCEPLIBS/wiki and -https://github.com/NOAA-EMC/NCEPLIBS-external/wiki. - -.. _TopLevelDirStructure: - -Directory Structure -=================== -The directory structure for the SRW Application is determined by the ``local_path`` settings in -the ``Externals.cfg`` file, which is in the directory where the umbrella repository has -been cloned. After ``manage_externals/checkout_externals`` is run, the specific GitHub repositories -that are described in :numref:`Table %s ` are cloned into the target -subdirectories shown below. The directories that will be created later by running the -scripts are presented in parentheses. Some directories have been removed for brevity. - -.. 
code-block:: console - - ufs-srweather-app - โ”œโ”€โ”€ (bin) - โ”œโ”€โ”€ (build) - โ”œโ”€โ”€ docs - โ”‚ โ””โ”€โ”€ UsersGuide - โ”œโ”€โ”€ (include) - โ”œโ”€โ”€ (lib) - โ”œโ”€โ”€ manage_externals - โ”œโ”€โ”€ regional_workflow - โ”‚ โ”œโ”€โ”€ docs - โ”‚ โ”‚ โ””โ”€โ”€ UsersGuide - โ”‚ โ”œโ”€โ”€ (fix) - โ”‚ โ”œโ”€โ”€ jobs - โ”‚ โ”œโ”€โ”€ modulefiles - โ”‚ โ”œโ”€โ”€ scripts - โ”‚ โ”œโ”€โ”€ tests - โ”‚ โ”‚ โ””โ”€โ”€ baseline_configs - โ”‚ โ””โ”€โ”€ ush - โ”‚ โ”œโ”€โ”€ Python - โ”‚ โ”œโ”€โ”€ rocoto - โ”‚ โ”œโ”€โ”€ templates - โ”‚ โ””โ”€โ”€ wrappers - โ”œโ”€โ”€ (share) - โ””โ”€โ”€ src - โ”œโ”€โ”€ UPP - โ”‚ โ”œโ”€โ”€ parm - โ”‚ โ””โ”€โ”€ sorc - โ”‚ โ””โ”€โ”€ ncep_post.fd - โ”œโ”€โ”€ UFS_UTILS - โ”‚ โ”œโ”€โ”€ sorc - โ”‚ โ”‚ โ”œโ”€โ”€ chgres_cube.fd - โ”‚ โ”‚ โ”œโ”€โ”€ fre-nctools.fd - | โ”‚ โ”œโ”€โ”€ grid_tools.fd - โ”‚ โ”‚ โ”œโ”€โ”€ orog_mask_tools.fd - โ”‚ โ”‚ โ””โ”€โ”€ sfc_climo_gen.fd - โ”‚ โ””โ”€โ”€ ush - โ””โ”€โ”€ ufs_weather_model - โ””โ”€โ”€ FV3 - โ”œโ”€โ”€ atmos_cubed_sphere - โ””โ”€โ”€ ccpp - -Regional Workflow Sub-Directories ---------------------------------- -Under the ``regional_workflow`` directory shown in :numref:`TopLevelDirStructure` there are -a number of sub-directories that are created when the regional workflow is cloned. The -contents of these sub-directories are described in :numref:`Table %s `. - -.. _Subdirectories: - -.. table:: Sub-directories of the regional workflow. - - +-------------------------+---------------------------------------------------------+ - | **Directory Name** | **Description** | - +=========================+=========================================================+ - | docs | Users' Guide Documentation | - +-------------------------+---------------------------------------------------------+ - | jobs | J-job scripts launched by Rocoto | - +-------------------------+---------------------------------------------------------+ - | modulefiles | Files used to load modules needed for building and | - | | running the workflow | - +-------------------------+---------------------------------------------------------+ - | scripts | Run scripts launched by the J-jobs | - +-------------------------+---------------------------------------------------------+ - | tests | Baseline experiment configuration | - +-------------------------+---------------------------------------------------------+ - | ush | Utility scripts used by the workflow | - +-------------------------+---------------------------------------------------------+ - -.. _ExperimentDirSection: - -Experiment Directory Structure -============================== -When the ``generate_FV3LAM_wflow.sh`` script is run, the user-defined experimental directory -``EXPTDIR=/path-to/ufs-srweather-app/../expt_dirs/${EXPT_SUBDIR}`` is created, where ``EXPT_SUBDIR`` -is specified in the ``config.sh`` file. The contents of the ``EXPTDIR`` directory, before the -workflow is run, is shown in :numref:`Table %s `. - -.. _ExptDirStructure: - -.. table:: Files and sub-directory initially created in the experimental directory. 
- :widths: 33 67 - - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | **File Name** | **Description** | - +===========================+=======================================================================================================+ - | config.sh | User-specified configuration file, see :numref:`Section %s ` | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | data_table | Cycle-independent input file (empty) | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | field_table | Tracers in the `forecast model | - | | `_ | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | FV3LAM_wflow.xml | Rocoto XML file to run the workflow | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | input.nml | Namelist for the `UFS Weather model | - | | `_ | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | launch_FV3LAM_wflow.sh | Symlink to the shell script of | - | | ``ufs-srweather-app/regional_workflow/ush/launch_FV3LAM_wflow.sh`` | - | | that can be used to (re)launch the Rocoto workflow. | - | | Each time this script is called, it appends to a log | - | | file named ``log.launch_FV3LAM_wflow``. | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | log.generate_FV3LAM_wflow | Log of the output from the experiment generation script | - | | ``generate_FV3LAM_wflow.sh`` | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | nems.configure | See `NEMS configuration file | - | | `_ | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | suite_{CCPP}.xml | CCPP suite definition file used by the forecast model | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | var_defns.sh | Shell script defining the experiment parameters. It contains all | - | | of the primary parameters specified in the default and | - | | user-specified configuration files plus many secondary parameters | - | | that are derived from the primary ones by the experiment | - | | generation script. This file is sourced by various other scripts | - | | in order to make all the experiment variables available to these | - | | scripts. | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - | YYYYMMDDHH | Cycle directory (empty) | - +---------------------------+-------------------------------------------------------------------------------------------------------+ - -In addition, the *community* mode creates the ``fix_am`` and ``fix_lam`` directories in ``EXPTDIR``. -The ``fix_lam`` directory is initially empty but will contain some *fix* (time-independent) files -after the grid, orography, and/or surface climatology generation tasks are run. - -.. _FixDirectories: - -.. 
table:: Description of the fix directories - - +-------------------------+----------------------------------------------------------+ - | **Directory Name** | **Description** | - +=========================+==========================================================+ - | fix_am | Directory containing the global `fix` (time-independent) | - | | data files. The experiment generation script copies | - | | these files from a machine-dependent system directory. | - +-------------------------+----------------------------------------------------------+ - | fix_lam | Directory containing the regional fix (time-independent) | - | | data files that describe the regional grid, orography, | - | | and various surface climatology fields as well as | - | | symlinks to pre-generated files. | - +-------------------------+----------------------------------------------------------+ - -Once the workflow is launched with the ``launch_FV3LAM_wflow.sh`` script, a log file named -``log.launch_FV3LAM_wflow`` will be created (or appended to it if it already exists) in ``EXPTDIR``. -Once the ``make_grid``, ``make_orog``, and ``make_sfc_climo`` tasks and the ``get_extrn_ics`` -and ``get_extrn_lbc`` tasks for the YYYYMMDDHH cycle have completed successfully, new files and -sub-directories are created, as described in :numref:`Table %s `. - -.. _CreatedByWorkflow: - -.. table:: New directories and files created when the workflow is launched. - :widths: 30 70 - - +---------------------------+--------------------------------------------------------------------+ - | **Directory/file Name** | **Description** | - +===========================+====================================================================+ - | YYYYMMDDHH | This is updated when the first cycle-specific workflow tasks are | - | | run, which are ``get_extrn_ics`` and ``get_extrn_lbcs`` (they are | - | | launched simultaneously for each cycle in the experiment). We | - | | refer to this as a โ€œcycle directoryโ€. Cycle directories are | - | | created to contain cycle-specific files for each cycle that the | - | | experiment runs. If ``DATE_FIRST_CYCL`` and ``DATE_LAST_CYCL`` | - | | were different, and/or ``CYCL_HRS`` contained more than one | - | | element in the ``config.sh`` file, then more than one cycle | - | | directory would be created under the experiment directory. | - +---------------------------+--------------------------------------------------------------------+ - | grid | Directory generated by the ``make_grid`` task containing grid | - | | files for the experiment | - +---------------------------+--------------------------------------------------------------------+ - | log | Contains log files generated by the overall workflow and its | - | | various tasks. Look in these files to trace why a task may have | - | | failed. | - +---------------------------+--------------------------------------------------------------------+ - | orog | Directory generated by the ``make_orog`` task containing the | - | | orography files for the experiment | - +---------------------------+--------------------------------------------------------------------+ - | sfc_climo | Directory generated by the ``make_sfc_climo`` task containing the | - | | surface climatology files for the experiment | - +---------------------------+--------------------------------------------------------------------+ - | FV3LAM_wflow.db | Database files that are generated when Rocoto is called (by the | - | FV3LAM_wflow_lock.db | launch script) to launch the workflow. 
| - +---------------------------+--------------------------------------------------------------------+ - | log.launch_FV3LAM_wflow | This is the log file to which the launch script | - | | ``launch_FV3LAM_wflow.sh`` appends its output each time it is | - | | called. Take a look at the last 30โ€“50 lines of this file to check | - | | the status of the workflow. | - +---------------------------+--------------------------------------------------------------------+ - -The output files for an experiment are described in :numref:`Section %s `. -The workflow tasks are described in :numref:`Section %s `). diff --git a/docs/UsersGuide/source/CompleteTests.csv b/docs/UsersGuide/source/CompleteTests.csv new file mode 100644 index 0000000000..28cc46162f --- /dev/null +++ b/docs/UsersGuide/source/CompleteTests.csv @@ -0,0 +1,28 @@ +๏ปฟGrid,ICS,LBCS,Suite,Date,Time (UTC),Script Name,Test Type +RRFS_CONUS_3km,FV3GFS,FV3GFS,GFS_v16,2019-07-01,00,config.grid_RRFS_CONUS_3km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16.sh,Complete +RRFS_CONUS_25km,HRRR,RAP,RRFS_v1beta,2020-08-10,00,config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_HRRR.sh,Complete +RRFS_CONUS_13km,HRRR,RAP,RRFS_v1beta,2020-08-01,00,config.grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh,Complete +RRFS_CONUS_3km,HRRR,RAP,RRFS_v1beta,2020-08-01,00,config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta.sh,Complete +RRFS_CONUS_25km,HRRR,RAP,HRRR,2020-08-10,00,config.grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_RAP_suite_HRRR.sh,Complete +RRFS_CONUS_13km,HRRR,RAP,HRRR,2020-08-10,00,config.grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_HRRR.sh,Complete +RRFS_CONUS_3km,HRRR,RAP,HRRR,2020-08-10,00,config.grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_HRRR.sh,Complete +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp,2019-07-01,"00,12",config.community_ensemble_008mems.sh,Complete/wflow +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp,2019-07-01,"00,12",config.community_ensemble_2mems.sh,Complete/wflow +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp,2019-07-02,"00,12",config.community_ensemble_008mems.sh,Complete/wflow +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp,2019-07-02,"00,12",config.community_ensemble_2mems.sh,Complete/wflow +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2019-07-01,00,config.deactivate_tasks.sh,Complete/wflow +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2019-07-01,00,config.inline_post.sh,Complete/wflow +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2019-06-15,00,config.MET_ensemble_verification.sh,Complete/wflow +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2019-06-15,00,config.MET_verification.sh,Complete/wflow +ESGgrid,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp_regional,2019-07-01,00,config.new_ESGgrid.sh,Complete/wflow +GFDLgrid,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp,2019-07-01,00,config.new_GFDLgrid.sh,Complete/wflow +GFDLgrid,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp,2019-07-01,00,config.new_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_FALSE.sh,Complete/wflow +GFDLgrid,FV3GFS,FV3GFS,FV3_GFS_2017_gfdlmp,2019-07-01,00,config.new_GFDLgrid__GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES_eq_TRUE.sh,Complete/wflow +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2019-07-01,00,config.pregen_grid_orog_sfc_climo.sh,Complete/wflow +RRFS_CONUS_25km,GSMGFS,GSMGFS,FV3_GFS_2017_gfdlmp,2019-05-20,00,config.specify_DOT_OR_USCORE.sh,Complete/wflow +RRFS_CONUScompact_25km,HRRR,RAP,FV3_HRRR,2020-08-01,00,config.specify_DT_ATMOS_LAYOUT_XY_BLOCKSIZE.sh,Complete/wflow 
+RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2021-06-03,06,config.specify_EXTRN_MDL_SYSBASEDIR_ICS_LBCS.sh,Complete/wflow +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2019-07-01,00,config.specify_RESTART_INTERVAL.sh,Complete/wflow +RRFS_CONUScompact_25km,HRRR,RAP,FV3_RRFS_v1beta,2020-08-10,00,config.subhourly_post_ensemble_2mems.sh,Complete/wflow +RRFS_CONUScompact_25km,HRRR,RAP,FV3_RRFS_v1beta,2020-08-10,00,config.subhourly_post.sh,Complete/wflow +RRFS_CONUS_25km,FV3GFS,FV3GFS,FV3_GFS_v15p2,2019-07-01,00,config.specify_template_filenames.sh,Complete/wflow \ No newline at end of file diff --git a/docs/UsersGuide/source/CompleteTests.rst b/docs/UsersGuide/source/CompleteTests.rst new file mode 100644 index 0000000000..50cf6f69fb --- /dev/null +++ b/docs/UsersGuide/source/CompleteTests.rst @@ -0,0 +1,8 @@ +************************************************************ +Complete WE2E Tests +************************************************************ + +.. csv-table:: + :file: CompleteTests.csv + :widths: 20,20,20,20,20,20,20,20 + :header-rows: 1 diff --git a/docs/UsersGuide/source/Components.rst b/docs/UsersGuide/source/Components.rst new file mode 100644 index 0000000000..028c87c851 --- /dev/null +++ b/docs/UsersGuide/source/Components.rst @@ -0,0 +1,94 @@ +.. _Components: + +============================ +SRW Application Components +============================ + +The SRW Application assembles a variety of components, including: + +* Pre-processor Utilities & Initial Conditions +* UFS Weather Forecast Model +* Unified Post-Processor +* Visualization Examples +* Build System and Workflow + +These components are documented within this User's Guide and supported through a `community forum `_. + +.. _Utils: + +Pre-processor Utilities and Initial Conditions +============================================== + +The SRW Application includes a number of pre-processing utilities that initialize and prepare the model. Since the SRW App provides forecast predictions over a limited area (rather than globally), it is necessary to first generate a regional grid (``regional_esg_grid/make_hgrid``) along with :term:`orography` (``orog``) and surface climatology (``sfc_climo_gen``) files on that grid. Grids include a strip, or "halo," of six cells that surround the regional grid and feed in lateral boundary condition data. Since different grid and orography files require different numbers of halo cells, additional utilities handle topography filtering and shave the number of halo points (based on downstream workflow component requirements). The pre-processing software ``chgres_cube`` is used to convert the raw external model data into initial and lateral boundary condition files in netCDF format. These are needed as input to the FV3-LAM. Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User's Guide `_. + +The SRW Application can be initialized from a range of operational initial condition files. It is possible to initialize the model from the Global Forecast System (:term:`GFS`), North American Mesoscale (:term:`NAM`) Forecast System, Rapid Refresh (:term:`RAP`), and High-Resolution Rapid Refresh (:term:`HRRR`) files in Gridded Binary v2 (:term:`GRIB2`) format. GFS files also come in :term:`NEMSIO` format for past dates. + +.. WARNING:: + For GFS data, dates prior to 1 January 2018 may work but are not guaranteed. 
Public archives of model data can be accessed through the `National Centers for Environmental Information `_ (NCEI) or through the `NOAA Operational Model Archive and Distribution System `_ (NOMADS). Raw external model data may be pre-staged on disk by the user. + + +Forecast Model +============== + +The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere +(:term:`FV3`) dynamical core configured with a Limited Area Model (:term:`LAM`) capability :cite:`BlackEtAl2021`. The dynamical core is the computational part of a model that solves the equations of fluid motion. A User's Guide for the UFS :term:`Weather Model` is `here `__. + +Supported model resolutions in this release include 3-, 13-, and 25-km predefined Contiguous U.S. (:term:`CONUS`) domains, each with 64 vertical levels. Preliminary tools for users to define their own domain are also available in the release with full, formal support of these tools to be provided in future releases. The Extended Schmidt Gnomonic (ESG) grid is used with the FV3-LAM, which features relatively uniform grid cells across the entirety of the domain. Additional information about the FV3 dynamical core can be found `here `__ and on the `NOAA Geophysical Fluid Dynamics Laboratory website `_. + +Interoperable atmospheric physics, along with various land surface model options, are supported through the Common Community Physics Package (:term:`CCPP`), described `here `__. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. Four physics options are supported for the v2.0 release. The first is the FV3_RRFS_v1beta physics suite, which is being tested for use in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned for 2023-2024. The second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v16. Additionally, FV3_WoFS and FV3_HRRR will be supported. A scientific description of the CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `_, and CCPP technical aspects are described in the `CCPP Technical Documentation `_. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each of the supported suites. + +The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. The UFS Weather Model ingests initial and lateral boundary condition files produced by :term:`chgres_cube` and outputs files in netCDF format on a specific projection (e.g., Lambert Conformal) in the horizontal direction and model levels in the vertical direction. + +Post-processor +============== + +The SRW Application is distributed with the Unified Post Processor (:term:`UPP`) included in the workflow as a way to convert the netCDF output on the native model grid to :term:`GRIB2` format on standard isobaric vertical coordinates. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User's Guide `__. + +Output from UPP can be used with visualization, plotting, and verification packages or in +further downstream post-processing (e.g., statistical post-processing techniques). + +.. _MetplusComponent: + +METplus Verification Suite +============================= + +The enhanced Model Evaluation Tools (`METplus `__) verification system has been integrated into the SRW App to facilitate forecast evaluation.
METplus is a verification framework that spans a wide range of temporal scales (warn-on-forecast to climate) and spatial scales (storm to global). It is supported by the `Developmental Testbed Center (DTC) `__. + +METplus *installation* is not included as part of the build process for this release of the SRW App. However, METplus is preinstalled on all `Level 1 `__ systems; existing builds can be viewed `here `__. METplus can be installed on other systems individually or as part of :term:`HPC-Stack`. Users on non-Level 1 systems can follow the `MET Installation `__ and `METplus Installation `__ Guides for individual installation. Currently, METplus *installation* is not a supported feature for this release of the SRW App. However, METplus *use* is supported on systems with a functioning METplus installation. + +The core components of the METplus framework include the statistical driver, MET, the associated database and display systems known as METviewer and METexpress, and a suite of Python wrappers to provide low-level automation and examples, also called use-cases. MET is a set of verification tools developed for use by the :term:`NWP` community. It matches up grids with either gridded analyses or point observations and applies configurable methods to compute statistics and diagnostics. Extensive documentation is available in the `METplus User's Guide `__ and `MET User's Guide `__. Documentation for all other components of the framework can be found at the Documentation link for each component on the METplus `downloads `__ page. + +Among other techniques, MET provides the capability to compute standard verification scores for comparing deterministic gridded model data to point-based and gridded observations. It also provides ensemble and probabilistic verification methods for comparing gridded model data to point-based or gridded observations. Verification tasks to accomplish these comparisons are defined in the SRW App in :numref:`Table %s `. Currently, the SRW App supports the use of :term:`NDAS` observation files in `prepBUFR format `__ (which include conventional point-based surface and upper-air data) for point-based verification. It also supports gridded Climatology-Calibrated Precipitation Analysis (:term:`CCPA`) data for accumulated precipitation evaluation and Multi-Radar/Multi-Sensor (:term:`MRMS`) gridded analysis data for composite reflectivity and :term:`echo top` verification. + +METplus is being actively developed by :term:`NCAR`/Research Applications Laboratory (RAL), NOAA/Earth Systems Research Laboratories (ESRL), and NOAA/Environmental Modeling Center (EMC), and it is open to community contributions. + + +Visualization Example +===================== +A Python script is provided to create basic visualization of the model output. The script +is designed to output graphics in PNG format for 14 standard meteorological variables +when using the pre-defined :term:`CONUS` domain. A difference plotting script is also included to visually compare two runs for the same domain and resolution. These scripts are provided only as an example for users familiar with Python. They may be used to perform a visual check to verify that the application is producing reasonable results. + +After running ``manage_externals/checkout_externals``, the visualization scripts will be available in the ``ufs-srweather-app/regional_workflow/ush/Python`` directory. Usage information and instructions are described in :numref:`Chapter %s ` and are also included at the top of the script.
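+
+For reference, a minimal sketch of staging these scripts and the Python packages they depend on is shown below. It assumes the umbrella repository has already been cloned but the externals have not yet been checked out, and that conda is used for package management; the ``conda-forge`` channel shown here is an assumption (any installation method that provides these packages will do), and the package list follows the graphics prerequisites named elsewhere in this guide.
+
+.. code-block:: console
+
+   # Retrieve the external components, including regional_workflow,
+   # which contains the plotting scripts
+   cd ufs-srweather-app
+   ./manage_externals/checkout_externals
+
+   # The visualization scripts are staged here
+   ls regional_workflow/ush/Python
+
+   # Install the additional Python packages required by the graphics scripts
+   conda install -c conda-forge pygrib cartopy matplotlib scipy pillow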
+ +Build System and Workflow +========================= + +The SRW Application has a portable build system and a user-friendly, modular, and expandable workflow framework. + +An umbrella CMake-based build system is used for building the components necessary for running the end-to-end SRW Application, including the UFS Weather Model and the pre- and post-processing software. Additional libraries necessary for the application (e.g., :term:`NCEPLIBS-external` and :term:`NCEPLIBS`) are not included in the SRW Application build system but are available pre-built on pre-configured platforms. On other systems, they can be installed via the HPC-Stack (see :numref:`Chapter %s: Installing the HPC-Stack `). There is a small set of system libraries and utilities that are assumed to be present on the target computer: the CMake build software, a Fortran, C, and C++ compiler, and an :term:`MPI` library. + +Once built, the provided experiment generator script can be used to create a Rocoto-based +workflow file that will run each task in the system in the proper sequence (see :numref:`Chapter %s ` or the `Rocoto documentation `_ for more information). If Rocoto and/or a batch system is not present on the available platform, the individual components can be run in a stand-alone, command line fashion with provided run scripts. The generated namelist for the atmospheric model can be modified in order to vary settings such as forecast starting and ending dates, forecast length (in hours), the :term:`CCPP` physics suite, integration time step, history file output frequency, and more. It also allows for configuration of other elements of the workflow; for example, whether to run some or all of the pre-processing, forecast model, and post-processing steps. + +This SRW Application release has been tested on a variety of platforms widely used by +researchers, such as the NOAA Research and Development High-Performance Computing Systems +(RDHPCS), including Hera, Orion, and Jet; NOAA's Weather and Climate Operational +Supercomputing System (WCOSS); the National Center for Atmospheric Research (:term:`NCAR`) Cheyenne +system; the National Severe Storms Laboratory (NSSL) HPC machine, Odin; the National Science Foundation Stampede2 system; and generic Linux and macOS systems using Intel and GNU compilers. Four `levels of support `_ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited-test (Level 3), and build-only (Level 4) platforms. Each level is further described below. + +On pre-configured (Level 1) computational platforms, all the required libraries for building the SRW Application are available in a central place. That means bundled libraries (NCEPLIBS) and third-party libraries (NCEPLIBS-external) have both been built. The SRW Application is expected to build and run out-of-the-box on these pre-configured platforms. + +A few additional computational platforms are considered configurable for the SRW Application release. Configurable platforms (Level 2) are platforms where all of the required libraries for building the SRW Application are expected to install successfully but are not available in a central location. Applications and models are expected to build and run once the required bundled libraries (e.g., NCEPLIBS) and third-party libraries (e.g., NCEPLIBS-external) are built.
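+
+On Level 1 and Level 2 platforms, the end-to-end sequence of obtaining the code, building it, and generating and launching an experiment typically reduces to a handful of commands, sketched below. The ``devbuild.sh`` invocation and the ``<machine_name>`` placeholder are assumptions (the build script's exact name and arguments may differ by platform and release), and ``config.sh`` must be edited before the experiment is generated; the remaining steps are described in the workflow chapters.
+
+.. code-block:: console
+
+   # Obtain the SRW App umbrella repository and its external components
+   git clone https://github.com/ufs-community/ufs-srweather-app.git
+   cd ufs-srweather-app
+   ./manage_externals/checkout_externals
+
+   # Build the executables (assumed invocation; pass the name of your platform)
+   ./devbuild.sh <machine_name>
+
+   # Generate the experiment (after editing config.sh) and launch it;
+   # EXPTDIR is the experiment directory printed by the generation script
+   cd regional_workflow/ush
+   ./generate_FV3LAM_wflow.sh
+   cd $EXPTDIR
+   ./launch_FV3LAM_wflow.sh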
+ +Limited-Test (Level 3) and Build-Only (Level 4) computational platforms are those in which the developers have built the code but little or no pre-release testing has been conducted, respectively. A complete description of the levels of support, along with a list of preconfigured and configurable platforms can be found in the `SRW Application wiki page `_. diff --git a/docs/UsersGuide/source/ConfigNewPlatform.rst b/docs/UsersGuide/source/ConfigNewPlatform.rst deleted file mode 100644 index 381ffb98cb..0000000000 --- a/docs/UsersGuide/source/ConfigNewPlatform.rst +++ /dev/null @@ -1,388 +0,0 @@ -.. _ConfigNewPlatform: - -========================== -Configuring a New Platform -========================== - -The UFS SRW Application has been designed to work primarily on a number of Level 1 and 2 support platforms, as specified `here `_. However, it is also designed with flexibility in mind, so that any sufficiently up-to-date machine with a UNIX-based operating system should be capable of running the application. A full list of prerequisites for installing the UFS SRW App and running the Graduate Student Test can be found in :numref:`Section %s `. - -The first step to installing on a new machine is to install :term:`NCEPLIBS` (https://github.com/NOAA-EMC/NCEPLIBS), the NCEP libraries package, which is a set of libraries created and maintained by NCEP and EMC that are used in many parts of the UFS. NCEPLIBS comes with a large number of prerequisites (see :numref:`Section %s ` for more info), but the only required software prior to starting the installation process are as follows: - -* Fortran compiler with support for Fortran 2003 - - * gfortran v9+ or ifort v18+ are the only ones tested, but others may work. - -* C and C++ compilers compatible with the Fortran compiler - - * gcc v9+, ifort v18+, and clang v9+ (macOS, native Apple clang or LLVM clang) have been tested - -* Python v3.6+ - - * Prerequisite packages must be downloaded: jinja2, yaml and f90nml, as well as a number of additional Python modules (see :numref:`Section %s `) if the user would like to use the provided graphics scripts - -* Perl 5 - -* git v1.8+ - -* CMake v3.12+ - - * CMake v3.15+ is needed for building NCEPLIBS, but versions as old as 3.12 can be used to build NCEPLIBS-external, which contains a newer CMake that can be used for the rest of the build. - -For both Linux and macOS, you will need to set the stack size to "unlimited" (if allowed) or the largest possible value. - -.. code-block:: console - - # Linux, if allowed - ulimit -s unlimited - - # macOS, this corresponds to 65MB - ulimit -S -s unlimited - -For Linux systems, as long as the above software is available, you can move on to the next step: installing the :term:`NCEPLIBS-external` package. - -For macOS systems, some extra software is needed: ``wget``, ``coreutils``, ``pkg-config``, and ``gnu-sed``. -It is recommended that you install this software using the Homebrew package manager for macOS (https://brew.sh/): - -* brew install wget - -* brew install cmake - -* brew install coreutils - -* brew install pkg-config - -* brew install gnu-sed - -However, it is also possible to install these utilities via Macports (https://www.macports.org/), or installing each utility individually (not recommended). 
- -Installing NCEPLIBS-external -============================ -In order to facilitate the installation of NCEPLIBS (and therefore, the SRW and other UFS applications) on new platforms, EMC maintains a one-stop package containing most of the prerequisite libraries and software necessary for installing NCEPLIBS. This package is known as NCEPLIBS-external, and is maintained in a git repository at https://github.com/NOAA-EMC/NCEPLIBS-external. Instructions for installing these will depend on your platform, but generally so long as all the above-mentioned prerequisites have been installed you can follow the proceeding instructions verbatim (in bash; a csh-based shell will require different commands). Some examples for installing on specific platforms can be found in the `NCEPLIBS-external/doc directory `. - - -These instructions will install the NCEPLIBS-external in the current directory tree, so be sure you are in the desired location before starting. - -.. code-block:: console - - export WORKDIR=`pwd` - export INSTALL_PREFIX=${WORKDIR}/NCEPLIBS-ufs-v2.0.0/ - export CC=gcc - export FC=gfortran - export CXX=g++ - -The CC, CXX, and FC variables should specify the C, C++, and Fortran compilers you will be using, respectively. They can be the full path to the compiler if necessary (for example, on a machine with multiple versions of the same compiler). It will be important that all libraries and utilities are built with the same set of compilers, so it is best to set these variables once at the beginning of the process and not modify them again. - -.. code-block:: console - - mkdir -p ${INSTALL_PREFIX}/src && cd ${INSTALL_PREFIX}/src - git clone -b release/public-v2 --recursive https://github.com/NOAA-EMC/NCEPLIBS-external - cd NCEPLIBS-external - mkdir build && cd build - cmake -DCMAKE_INSTALL_PREFIX=${INSTALL_PREFIX} .. 2>&1 | tee log.cmake - make -j4 2>&1 | tee log.make - -The previous commands go through the process of cloning the git repository for NCEPLIBS-external, creating and entering a build directory, and invoking cmake and make to build the code/libraries. The ``make`` step will take a while; as many as a few hours depending on your machine and various settings. It is highly recommended you use at least 4 parallel make processes to prevent overly long installation times. The ``-j4`` option in the make command specifies 4 parallel make processes, ``-j8`` would specify 8 parallel processes, while omitting the flag all together will run make serially (not recommended). - -If you would rather use a different version of one or more of the software packages included in NCEPLIBS-external, you can skip building individual parts of the package by including the proper flags in your call to cmake. For example: - -.. code-block:: console - - cmake -DBUILD_MPI=OFF -DCMAKE_INSTALL_PREFIX=${INSTALL_PREFIX} .. 2>&1 | tee log.cmake - -will skip the building of MPICH that comes with NCEPLIBS-external. See the readme file ``NCEPLIBS-external/README.md`` for more information on these flags, or for general troubleshooting. - -Once NCEPLIBS-external is installed, you can move on to installing NCEPLIBS. - -Installing NCEPLIBS -=================== -Prior to building the UFS SRW Application on a new machine, you will need to install NCEPLIBS. Installation instructions will again depend on your platform, but so long as NCEPLIBS-external has been installed successfully you should be able to build NCEPLIBS. 
The following instructions will install the NCEPLIBS in the same directory tree as was used for NCEPLIBS-external above, so if you did not install NCEPLIBS-external in the same way, you will need to modify these commands. - -.. code-block:: console - - cd ${INSTALL_PREFIX}/src - git clone -b release/public-v2 --recursive https://github.com/NOAA-EMC/NCEPLIBS - cd NCEPLIBS - mkdir build && cd build - export ESMFMKFILE=${INSTALL_PREFIX}/lib/esmf.mk - cmake -DCMAKE_INSTALL_PREFIX=${INSTALL_PREFIX} -DCMAKE_PREFIX_PATH=${INSTALL_PREFIX} -DOPENMP=ON .. 2>&1 | tee log.cmake - make -j4 2>&1 | tee log.make - make deploy 2>&1 | tee log.deploy - -As with NCEPLIBS-external, the above commands go through the process of cloning the git repository for NCEPLIBS, creating and entering a build directory, and invoking cmake and make to build the code. The ``make deploy`` step created a number of modulefiles and scripts that will be used for setting up the build environment for the UFS SRW App. The ``ESMFMKFILE`` variable allows NCEPLIBS to find the location where ESMF has been built; if you receive a ``ESMF not found, abort`` error, you may need to specify a slightly different location: - -.. code-block:: console - - export ESMFMKFILE=${INSTALL_PREFIX}/lib64/esmf.mk - -Then delete and re-create the build directory and continue the build process as described above. - -If you skipped the building of any of the software provided by NCEPLIBS-external, you may need to add the appropriate locations to your ``CMAKE_PREFIX_PATH`` variable. Multiple directories may be added, separated by semicolons (;) like in the following example: - -.. code-block:: console - - cmake -DCMAKE_INSTALL_PREFIX=${INSTALL_PREFIX} -DCMAKE_PREFIX_PATH=โ€${INSTALL_PREFIX};/location/of/other/softwareโ€ -DOPENMP=ON .. 2>&1 | tee log.cmake - -Further information on including prerequisite libraries, as well as other helpful tips, can be found in the ``NCEPLIBS/README.md`` file. - -Once the NCEPLIBS package has been successfully installed, you can move on to building the UFS SRW Application. - -Building the UFS Short-Range Weather Application (UFS SRW App) -============================================================== -Building the UFS SRW App is similar to building NCEPLIBS, in that the code is stored in a git repository and is built using CMake software. The first step is to retrieve the code from GitHub, using the variables defined earlier: - -.. code-block:: console - - cd ${WORKDIR} - git clone -b release/public-v1 https://github.com/ufs-community/ufs-srweather-app.git - cd ufs-srweather-app/ - ./manage_externals/checkout_externals - -Here the procedure differs a bit from NCEPLIBS and NCEPLIBS-external. The UFS SRW App is maintained using an umbrella git repository that collects the individual components of the application from their individual, independent git repositories. This is handled using "Manage Externals" software, which is included in the application; this is the final step listed above, which should output a bunch of dialogue indicating that it is retrieving different code repositories as described in :numref:`Table %s `. It may take several minutes to download these repositories. - -Once the Manage Externals step has completed, you will need to make sure your environment is set up so that the UFS SRW App can find all of the prerequisite software and libraries. There are a few ways to do this, the simplest of which is to load a modulefile if your machine supports Lua Modules: - -.. 
code-block:: console - - module use ${INSTALL_PREFIX}/modules - module load NCEPLIBS/2.0.0 - -If your machine does not support Lua but rather TCL modules, see instructions in the ``NCEPLIBS/README.md`` file for converting to TCL modulefiles. - -If your machine does not support modulefiles, you can instead source the provided bash script for setting up the environment: - -.. code-block:: console - - source ${INSTALL_PREFIX}/bin/setenv_nceplibs.sh - -This script, just like the modulefiles, will set a number of environment variables that will allow CMake to easily find all the libraries that were just built. There is also a csh version of the script in the same directory if your shell is csh-based. If you are using your machineโ€™s pre-built version of any of the NCEP libraries (not recommended), reference that file to see which variables should be set to point CMake in the right direction. - -At this point there are just a few more variables that need to be set prior to building: - -.. code-block:: console - - export CMAKE_C_COMPILER=mpicc - export CMAKE_CXX_COMPILER=mpicxx - export CMAKE_Fortran_COMPILER=mpifort - -If you are using your machineโ€™s built-in MPI compilers, it is recommended you set the ``CMAKE_*_COMPILER`` flags to full paths to ensure that the correct MPI aliases are used. Finally, one last environment variable, ``CMAKE_Platform``, must be set. This will depend on your machine; for example, on a macOS operating system with GNU compilers: - -.. code-block:: console - - export CMAKE_Platform=macosx.gnu - -This is the variable used by the weather model to set a few additional flags based on your machine. The available options can be found `here `_. - -Now all the prerequisites have been installed and variables set, so you should be ready to build the model! - -.. code-block:: console - - mkdir build && cd build - cmake .. -DCMAKE_INSTALL_PREFIX=.. | tee log.cmake - make -j4 | tee log.make - -On many platforms this build step will take less than 30 minutes, but for some machines it may take up to a few hours, depending on the system architecture, compiler and compiler flags, and number of parallel make processes used. - -Setting Up Your Python Environment -================================== -The regional_workflow repository contains scripts for generating and running experiments, and these require some specific python packages to function correctly. First, as mentioned before, your platform will need Python 3.6 or newer installed. Once this is done, you will need to install several python packages that are used by the workflow: ``jinja2`` (https://jinja2docs.readthedocs.io/), ``pyyaml`` (https://pyyaml.org/wiki/PyYAML), and ``f90nml`` (https://pypi.org/project/f90nml/). These packages can be installed individually, but it is recommended you use a package manager (https://www.datacamp.com/community/tutorials/pip-python-package-manager). - -If you have conda on your machine: - -.. code-block:: console - - conda install jinja2 pyyaml f90nml - -Otherwise you may be able to use pip3 (the Python3 package manager; may need to be installed separately depending on your platform): - -.. code-block:: console - - pip3 install jinja2 pyyaml f90nml - -Running the graphics scripts in ``${WORKDIR}/ufs-srweather-app/regional_workflow/ush/Python`` will require the additional packages ``pygrib``, ``cartopy``, ``matplotlib``, ``scipy``, and ``pillow``. These can be installed in the same way as described above. 
- -For the final step of creating and running an experiment, the exact methods will depend on if you are running with or without a workflow manager (Rocoto). - -Running Without a Workflow Manager: Generic Linux and macOS Platforms -===================================================================== -Now that the code has been built, you can stage your data as described in :numref:`Section %s `. - -Once the data has been staged, setting up your experiment on a platform without a workflow manager is similar to the procedure for other platforms described in earlier chapters. Enter the ``${WORKDIR}/ufs-srweather-app/regional_workflow/ush`` directory and configure the workflow by creating a ``config.sh`` file as described in :numref:`Chapter %s `. There will be a few specific settings that you may need change prior to generating the experiment compared to the instructions for pre-configured platforms: - -``MACHINE="MACOS" or MACHINE="LINUX"`` - These are the two ``MACHINE`` settings for generic, non-Rocoto-based platforms; you should choose the one most appropriate for your machine. ``MACOS`` has its own setting due to some differences in how command-line utilities function on Darwin-based operating systems. - -``LAYOUT_X=2`` -``LAYOUT_Y=2`` - These are the settings that control the MPI decomposition when running the weather model. There are default values, but for your machine it is recommended that you specify your own layout to achieve the correct number of MPI processes for your application. In total, your machine should be able to handle ``LAYOUT_Xร—LAYOUT_Y+WRTCMP_write_tasks_per_group`` tasks. ``WRTCMP_write_tasks_per_group`` is the number of MPI tasks that will be set aside for writing model output, and it is a setting dependent on the domain you have selected. You can find and edit the value of this variable in the file ``regional_workflow/ush/set_predef_grid_params.sh``. - -``RUN_CMD_UTILS="mpirun -np 4"`` - This is the run command for MPI-enabled pre-processing utilities. Depending on your machine and your MPI installation, you may need to use a different command for launching an MPI-enabled executable. - -``RUN_CMD_POST="mpirun -np 1"`` - This is the same as RUN_CMD_UTILS but for UPP. - -``RUN_CMD_FCST='mpirun -np ${PE_MEMBER01}'`` - This is the run command for the weather model. It is **strongly** recommended that you use the variable ``${PE_MEMBER01}`` here, which is calculated within the workflow generation script (based on the layout and write tasks described above) and is the number of MPI tasks that the weather model will expect to run with. Running the weather model with a different number of MPI tasks than the workflow has been set up for can lead to segmentation faults and other errors. It is also important to use single quotes here (or escape the โ€œ$โ€ character) so that ``PE_MEMBER01`` is not referenced until runtime, since it is not defined at the beginning of the workflow generation script. - -``FIXgsm=${WORKDIR}/data/fix_am`` - The location of the ``fix_am`` static files. This and the following two static data sets will need to be downloaded to your machine, as described in :numref:`Section %s `. 
- -``TOPO_DIR=${WORKDIR}/data/fix_orog`` - Location of ``fix_orog`` static files - -``SFC_CLIMO_INPUT_DIR=${WORKDIR}/data/fix_sfc_climo`` - Location of ``climo_fields_netcdf`` static files - -Once you are happy with your settings in ``config.sh``, it is time to run the workflow and move to the experiment directory (that is printed at the end of the scriptโ€™s execution): - -.. code-block:: console - - ./generate_FV3LAM_wflow.sh - export EXPTDIR="your experiment directory" - cd $EXPTDIR - -From here, you can run each individual task of the UFS SRW App using the provided run scripts: - -.. code-block:: console - - cp ${WORKDIR}/ufs-srweather-app/regional_workflow/ush/wrappers/*sh . - cp ${WORKDIR}/ufs-srweather-app/regional_workflow/ush/wrappers/README.md . - -The ``README.md`` file will contain instructions on the order that each script should be run in. An example of wallclock times for each task for an example run (2017 Macbook Pro, macOS Catalina, 25km CONUS domain, 48hr forecast) is listed in :numref:`Table %s `. - -.. _WallClockTimes: - -.. table:: Example wallclock times for each workflow task. - - - +--------------------+----------------------------+------------+-----------+ - | **UFS Component** | **Script Name** | **Num.** | **Wall** | - | | | **Cores** | **time** | - +====================+============================+============+===========+ - | UFS_UTILS | ./run_get_ics.sh | n/a | 3 s | - +--------------------+----------------------------+------------+-----------+ - | UFS_UTILS | ./run_get_lbcs.sh | n/a | 3 s | - +--------------------+----------------------------+------------+-----------+ - | UFS_UTILS | ./run_make_grid.sh | n/a | 9 s | - +--------------------+----------------------------+------------+-----------+ - | UFS_UTILS | ./run_make_orog.sh | 4 | 1 m | - +--------------------+----------------------------+------------+-----------+ - | UFS_UTILS | ./run_make_sfc_climo.sh | 4 | 27 m | - +--------------------+----------------------------+------------+-----------+ - | UFS_UTILS | ./run_make_ics.sh | 4 | 5 m | - +--------------------+----------------------------+------------+-----------+ - | UFS_UTILS | ./run_make_lbcs.sh | 4 | 5 m | - +--------------------+----------------------------+------------+-----------+ - | ufs-weather-model | ./run_fcst.sh | 6 | 1h 40 m | - +--------------------+----------------------------+------------+-----------+ - | UPP | ./run_post.sh | 1 | 7 m | - +--------------------+----------------------------+------------+-----------+ - -Running on a New Platform with Rocoto Workflow Manager -====================================================== -All official HPC platforms for the UFS SRW App release make use of the Rocoto workflow management software for running experiments. If you would like to use the Rocoto workflow manager on a new machine, you will have to make modifications to the scripts in the ``regional_workflow`` repository. The easiest way to do this is to search the files in the ``regional_workflow/scripts`` and ``regional_workflow/ush`` directories for an existing platform name (e.g. ``CHEYENNE``) and add a stanza for your own unique machine (e.g. ``MYMACHINE``). As an example, here is a segment of code from ``regional_workflow/ush/setup.sh``, where the highlighted text is an example of the kind of change you will need to make: - -.. code-block:: console - :emphasize-lines: 11-18 - - ... 
- "CHEYENNE") - WORKFLOW_MANAGER="rocoto" - NCORES_PER_NODE=36 - SCHED="${SCHED:-pbspro}" - QUEUE_DEFAULT=${QUEUE_DEFAULT:-"regular"} - QUEUE_HPSS=${QUEUE_HPSS:-"regular"} - QUEUE_FCST=${QUEUE_FCST:-"regular"} - ;; - - "MYMACHINE") - WORKFLOW_MANAGER="rocoto" - NCORES_PER_NODE=your_machine_cores_per_node - SCHED="${SCHED:-your_machine_scheduler}" - QUEUE_DEFAULT=${QUEUE_DEFAULT:-"your_machine_queue_name"} - QUEUE_HPSS=${QUEUE_HPSS:-"your_machine_queue_name"} - QUEUE_FCST=${QUEUE_FCST:-"your_machine_queue_name"} - ;; - - "STAMPEDE") - WORKFLOW_MANAGER="rocoto" - ... - -You will also need to add ``MYMACHINE`` to the list of valid machine names in ``regional_workflow/ush/valid_param_vals.sh``. The minimum list of files that will need to be modified in this way are as follows (all in the ``regional_workflow`` repository): - -* ``scripts/exregional_run_post.sh``, line 131 -* ``scripts/exregional_make_sfc_climo.sh``, line 162 -* ``scripts/exregional_make_lbcs.sh``, line 114 -* ``scripts/exregional_make_orog.sh``, line 147 -* ``scripts/exregional_make_grid.sh``, line 145 -* ``scripts/exregional_run_fcst.sh``, line 140 -* ``scripts/exregional_make_ics.sh``, line 114 -* ``ush/setup.sh``, lines 431 and 742 -* ``ush/launch_FV3LAM_wflow.sh``, line 104 -* ``ush/get_extrn_mdl_file_dir_info.sh``, many lines, starting around line 589 -* ``ush/valid_param_vals.sh``, line 3 -* ``ush/load_modules_run_task.sh``, line 126 -* ``ush/set_extrn_mdl_params.sh``, many lines, starting around line 61 - -The line numbers may differ slightly given future bug fixes. Additionally, you may need to make further changes depending on the exact setup of your machine and Rocoto installation. Information about installing and configuring Rocoto on your machine can be found in the Rocoto GitHub repository: https://github.com/christopherwharrop/rocoto - -.. _SW-OS-Requirements: - -Software/Operating System Requirements -====================================== -Those requirements highlighted in **bold** are included in the NCEPLIBS-external (https://github.com/NOAA-EMC/NCEPLIBS-external) package. 
- -**Minimum platform requirements for the UFS SRW Application and NCEPLIBS:** - -* POSIX-compliant UNIX-style operating system - -* >40 GB disk space - - * 18 GB input data from GFS, RAP, and HRRR for Graduate Student Test - * 6 GB for NCEPLIBS-external and NCEPLIBS full installation - * 1 GB for ufs-srweather-app installation - * 11 GB for 48hr forecast on CONUS 25km domain - -* 4GB memory (CONUS 25km domain) - -* Fortran compiler with full Fortran 2008 standard support - -* C and C++ compiler - -* Python v3.6+, including prerequisite packages ``jinja2``, ``pyyaml`` and ``f90nml`` - -* Perl 5 - -* git v1.8+ - -* MPI (**MPICH**, OpenMPI, or other implementation) - -* wgrib2 - -* CMake v3.12+ - -* Software libraries - - * **netCDF (C and Fortran libraries)** - * **HDF5** - * **ESMF** 8.0.0 - * **Jasper** - * **libJPG** - * **libPNG** - * **zlib** - -macOS-specific prerequisites: - -* brew install wget -* brew install cmake -* brew install coreutils -* brew install pkg-config -* brew install gnu-sed - -Optional but recommended prerequisites: - -* Conda for installing/managing Python packages -* Bash v4+ -* Rocoto Workflow Management System (1.3.1) -* **CMake v3.15+** -* Python packages scipy, matplotlib, pygrib, cartopy, and pillow for graphics diff --git a/docs/UsersGuide/source/ConfigParameters.inc b/docs/UsersGuide/source/ConfigParameters.inc deleted file mode 100644 index 3dd9dab99b..0000000000 --- a/docs/UsersGuide/source/ConfigParameters.inc +++ /dev/null @@ -1,338 +0,0 @@ -.. This is a continuation of the ConfigWorkflow.rst chapter - -.. _ConfigParameters: - -Grid Generation Parameters -========================== -``GRID_GEN_METHOD``: (Default: โ€œโ€) - This variable specifies the method to use to generate a regional grid in the horizontal. The only supported value of this parameter is โ€œESGgridโ€, in which case the Extended Schmidt Gnomonic grid generation method developed by Jim Purser(1) of EMC will be used. - -(1)Purser, R. J., D. Jovic, G. Ketefian, T. Black, J. Beck, J. Dong, and J. Carley, 2020: The Extended Schmidt Gnomonic Grid for Regional Applications. Unified Forecast System (UFS) Usersโ€™ Workshop. July 27-29, 2020. - -.. note:: - - #. If the experiment is using one of the predefined grids (i.e. if ``PREDEF_GRID_NAME`` is set to the name of one of the valid predefined grids), then ``GRID_GEN_METHOD`` will be reset to the value of ``GRID_GEN_METHOD`` for that grid. This will happen regardless of whether or not ``GRID_GEN_METHOD`` is assigned a value in the user-specified experiment configuration file, i.e. any value it may be assigned in the experiment configuration file will be overwritten. - - #. If the experiment is not using one of the predefined grids (i.e. if ``PREDEF_GRID_NAME`` is set to a null string), then ``GRID_GEN_METHOD`` must be set in the experiment configuration file. Otherwise, it will remain set to a null string, and the experiment generation will fail because the generation scripts check to ensure that it is set to a non-empty string before creating the experiment directory. - -The following parameters must be set if using the "ESGgrid" method of generating a regional grid (i.e. for ``GRID_GEN_METHOD`` set to "ESGgrid"). - -``ESGgrid_LON_CTR``: (Default: โ€œโ€) - The longitude of the center of the grid (in degrees). - -``ESGgrid_LAT_CTR``: (Default: โ€œโ€) - The latitude of the center of the grid (in degrees). - -``ESGgrid_DELX``: (Default: โ€œโ€) - The cell size in the zonal direction of the regional grid (in meters). 
- -``ESGgrid_DELY``: (Default: โ€œโ€) - The cell size in the meridional direction of the regional grid (in meters). - -``ESGgrid_NX``: (Default: โ€œโ€) - The number of cells in the zonal direction on the regional grid. - -``ESGgrid_NY``: (Default: โ€œโ€) - The number of cells in the meridional direction on the regional grid. - -``ESGgrid_WIDE_HALO_WIDTH``: (Default: โ€œโ€) - The width (in units of number of grid cells) of the halo to add around the regional grid before shaving the halo down to the width(s) expected by the forecast model. - -In order to generate grid files containing halos that are 3-cell and 4-cell wide and orography files with halos that are 0-cell and 3-cell wide (all of which are required as inputs to the forecast model), the grid and orography tasks first create files with halos around the regional domain of width ``ESGgrid_WIDE_HALO_WIDTH`` cells. These are first stored in files. The files are then read in and "shaved" down to obtain grid files with 3-cell-wide and 4-cell-wide halos and orography files with 0-cell-wide (i.e. no halo) and 3-cell-wide halos. For this reason, we refer to the original halo that then gets shaved down as the "wide" halo, i.e. because it is wider than the 0-cell-wide, 3-cell-wide, and 4-cell-wide halos that we will eventually end up with. Note that the grid and orography files with the wide halo are only needed as intermediates in generating the files with 0-cell-, 3-cell-, and 4-cell-wide halos; they are not needed by the forecast model. - -Computational Forecast Parameters -================================= -``DT_ATMOS``: (Default: โ€œโ€) - The main forecast model integration time step. As described in the forecast model documentation, "It corresponds to the frequency with which the top level routine in the dynamics is called as well as the frequency with which the physics is called." - -``LAYOUT_X, LAYOUT_Y``: (Default: โ€œโ€) - The number of MPI tasks (processes) to use in the two horizontal directions (x and y) of the regional grid when running the forecast model. - -``BLOCKSIZE``: (Default: โ€œโ€) - The amount of data that is passed into the cache at a time. - -Here, we set these parameters to null strings. This is so that, for any one of these parameters: - -#. If the experiment is using a predefined grid and the user sets the parameter in the user-specified experiment configuration file (``EXPT_CONFIG_FN``), that value will be used in the forecast(s). Otherwise, the default value of the parameter for that predefined grid will be used. - -#. If the experiment is not using a predefined grid (i.e. it is using a custom grid whose parameters are specified in the experiment configuration file), then the user must specify a value for the parameter in that configuration file. Otherwise, the parameter will remain set to a null string, and the experiment generation will fail, because the generation scripts check to ensure that all the parameters defined in this section are set to non-empty strings before creating the experiment directory. - -Write-Component (Quilting) Parameters -===================================== -``QUILTING``: (Default: โ€œTRUEโ€) - Flag that determines whether or not to use the write-component for writing forecast output files to disk. 
If set to โ€œTRUEโ€, the forecast model will output files named ``dynf$HHH.nc`` and ``phyf$HHH.nc`` (where HHH is the 3-hour output forecast hour) containing dynamics and physics fields, respectively, on the write-component grid (the regridding from the native FV3-LAM grid to the write-component grid is done by the forecast model). If ``QUILTING`` is set to "FALSE", then the output file names are ``fv3_history.nc`` and ``fv3_history2d.nc`` and contain fields on the native grid. Note that if ``QUILTING`` is set to โ€œFALSEโ€, then the ``RUN_POST_TN`` (meta)task cannot be run because the Unified Post Processor (UPP) code that this task calls cannot process fields on the native grid. In that case, the ``RUN_POST_TN`` (meta)task will be automatically removed from the Rocoto workflow XML. - -``PRINT_ESMF``: (Default: โ€œFALSEโ€) - Flag for whether or not to output extra (debugging) information from ESMF routines. Must be "TRUE" or "FALSE". Note that the write-component uses ESMF library routines to interpolate from the native forecast model grid to the user-specified output grid (which is defined in the model configuration file (model_configure) in the forecast run directory). - -``WRTCMP_write_groups``: (Default: โ€œ1โ€) - The number of write groups (i.e. groups of MPI tasks) to use in the write-component. - -``WRTCMP_write_tasks_per_group``: (Default: โ€œ20โ€) - The number of MPI tasks to allocate for each write group. - -Predefined Grid Parameters -========================== -``PREDEF_GRID_NAME``: (Default: โ€œโ€) - This parameter specifies the name of a predefined regional grid. - -.. note:: - - * If ``PREDEF_GRID_NAME`` is set to a valid predefined grid name, the grid generation method ``GRID_GEN_METHOD``, the (native) grid parameters, and the write-component grid parameters are set to predefined values for the specified grid, overwriting any settings of these parameters in the user-specified experiment configuration file (``config.sh``). In addition, if the time step ``DT_ATMOS`` and the computational parameters ``LAYOUT_X``, ``LAYOUT_Y``, and ``BLOCKSIZE`` are not specified in that configuration file, they are also set to predefined values for the specified grid. - - * If ``PREDEF_GRID_NAME`` is set to an empty string, it implies the user is providing the native grid parameters in the user-specified experiment configuration file (``EXPT_CONFIG_FN``). In this case, the grid generation method ``GRID_GEN_METHOD``, the native grid parameters, and the write-component grid parameters as well as the main time step (``DT_ATMOS``) and the computational parameters ``LAYOUT_X``, ``LAYOUT_Y``, and ``BLOCKSIZE`` must be set in that configuration file. - -Setting ``PREDEF_GRID_NAME`` provides a convenient method of specifying a commonly used set of grid-dependent parameters. The predefined grid parameters are specified in the script - -.. code-block:: console - - ush/set_predef_grid_params.sh - -Currently supported ``PREDEF_GRID_NAME`` options are "RRFS_CONUS_25km," "RRFS_CONUS_13km," and "RRFS_CONUS_3km." - -Pre-existing Directory Parameter -================================ -``PREEXISTING_DIR_METHOD``: (Default: โ€œdeleteโ€) - This variable determines the method to deal with pre-existing directories [e.g ones generated by previous calls to the experiment generation script using the same experiment name (``EXPT_SUBDIR``) as the current experiment]. This variable must be set to one of "delete", "rename", and "quit". 
The resulting behavior for each of these values is as follows: - - * "delete": The preexisting directory is deleted and a new directory (having the same name as the original preexisting directory) is created. - - * "rename": The preexisting directory is renamed and a new directory (having the same name as the original pre-existing directory) is created. The new name of the preexisting directory consists of its original name and the suffix "_oldNNN", where NNN is a 3-digit integer chosen to make the new name unique. - - * "quit": The preexisting directory is left unchanged, but execution of the currently running script is terminated. In this case, the preexisting directory must be dealt with manually before rerunning the script. - -Verbose Parameter -================= -``VERBOSE``: (Default: โ€œTRUEโ€) - This is a flag that determines whether or not the experiment generation and workflow task scripts print out extra informational messages. - -Pre-Processing Parameters -========================= -These parameters set flags (and related directories) that determine whether the grid, orography, and/or surface climatology file generation tasks should be run. Note that these are all cycle-independent tasks, i.e. if they are to be run, they do so only once at the beginning of the workflow before any cycles are run. - -``RUN_TASK_MAKE_GRID``: (Default: โ€œTRUEโ€) - Flag that determines whether the grid file generation task (``MAKE_GRID_TN``) is to be run. If this is set to "TRUE", the grid generation task is run and new grid files are generated. If it is set to "FALSE", then the scripts look for pre-generated grid files in the directory specified by ``GRID_DIR`` (see below). - -``GRID_DIR``: (Default: "/path/to/pregenerated/grid/files") - The directory in which to look for pre-generated grid files if ``RUN_TASK_MAKE_GRID`` is set to "FALSE". - -``RUN_TASK_MAKE_OROG``: (Default: โ€œTRUEโ€) - Same as ``RUN_TASK_MAKE_GRID`` but for the orography generation task (``MAKE_OROG_TN``). - -``OROG_DIR``: (Default: "/path/to/pregenerated/orog/files") - Same as ``GRID_DIR`` but for the orography generation task. - -``RUN_TASK_MAKE_SFC_CLIMO``: (Default: โ€œTRUEโ€) - Same as ``RUN_TASK_MAKE_GRID`` but for the surface climatology generation task (``MAKE_SFC_CLIMO_TN``). - -``SFC_CLIMO_DIR``: (Default: "/path/to/pregenerated/surface/climo/files") - Same as ``GRID_DIR`` but for the surface climatology generation task. - -Surface Climatology Parameter -============================= -``SFC_CLIMO_FIELDS``: (Default: โ€œ("facsf" "maximum_snow_albedo" "slope_type" "snowfree_albedo" "soil_type" "substrate_temperature" "vegetation_greenness" "vegetation_type")โ€) - Array containing the names of all the fields for which the ``MAKE_SFC_CLIMO_TN`` task generates files on the native FV3-LAM grid. - -Fixed File Parameters -===================== -Set parameters associated with the fixed (i.e. static) files. For the main NOAA HPC platforms, as well as Cheyenne, Odin, and Stampede, fixed files are prestaged with paths defined in the ``setup.sh`` script. - -``FIXgsm``: (Default: โ€œโ€) - System directory in which the majority of fixed (i.e. time-independent) files that are needed to run the FV3-LAM model are located. - -``TOPO_DIR``: (Default: โ€œโ€) - The location on disk of the static input files used by the ``make_orog task`` (``orog.x`` and ``shave.x``). Can be the same as ``FIXgsm``. 
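As an illustration of how the pre-processing flags and directories described in the Pre-Processing Parameters section above work together, the following hypothetical ``config.sh`` excerpt turns off the three pre-processing tasks and points the workflow at pre-generated files instead (the paths are placeholders to be replaced with real locations):

.. code-block:: console

   RUN_TASK_MAKE_GRID="FALSE"
   GRID_DIR="/path/to/pregenerated/grid/files"

   RUN_TASK_MAKE_OROG="FALSE"
   OROG_DIR="/path/to/pregenerated/orog/files"

   RUN_TASK_MAKE_SFC_CLIMO="FALSE"
   SFC_CLIMO_DIR="/path/to/pregenerated/surface/climo/files"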
- -``SFC_CLIMO_INPUT_DIR``: (Default: โ€œโ€) - The location on disk of the static surface climatology input fields, used by ``sfc_climo_gen``. These files are only used if ``RUN_TASK_MAKE_SFC_CLIMO=TRUE``. - -``FNGLAC, ..., FNMSKH``: (Default: see below) - .. code-block:: console - - (FNGLAC="global_glacier.2x2.grb" - FNMXIC="global_maxice.2x2.grb" - FNTSFC="RTGSST.1982.2012.monthly.clim.grb" - FNSNOC="global_snoclim.1.875.grb" - FNZORC="igbp" - FNAISC="CFSR.SEAICE.1982.2012.monthly.clim.grb" - FNSMCC="global_soilmgldas.t126.384.190.grb" - FNMSKH="seaice_newland.grb") - - Names of (some of the) global data files that are assumed to exist in a system directory specified (this directory is machine-dependent; the experiment generation scripts will set it and store it in the variable ``FIXgsm``). These file names also appear directly in the forecast model's input namelist file. - -``FIXgsm_FILES_TO_COPY_TO_FIXam``: (Default: see below) - .. code-block:: console - - ("$FNGLAC" \ - "$FNMXIC" \ - "$FNTSFC" \ - "$FNSNOC" \ - "$FNAISC" \ - "$FNSMCC" \ - "$FNMSKH" \ - "global_climaeropac_global.txt" \ - "fix_co2_proj/global_co2historicaldata_2010.txt" \ - "fix_co2_proj/global_co2historicaldata_2011.txt" \ - "fix_co2_proj/global_co2historicaldata_2012.txt" \ - "fix_co2_proj/global_co2historicaldata_2013.txt" \ - "fix_co2_proj/global_co2historicaldata_2014.txt" \ - "fix_co2_proj/global_co2historicaldata_2015.txt" \ - "fix_co2_proj/global_co2historicaldata_2016.txt" \ - "fix_co2_proj/global_co2historicaldata_2017.txt" \ - "fix_co2_proj/global_co2historicaldata_2018.txt" \ - "global_co2historicaldata_glob.txt" \ - "co2monthlycyc.txt" \ - "global_h2o_pltc.f77" \ - "global_hyblev.l65.txt" \ - "global_zorclim.1x1.grb" \ - "global_sfc_emissivity_idx.txt" \ - "global_solarconstant_noaa_an.txt" \ - "replace_with_FIXgsm_ozone_prodloss_filename") - - If not running in NCO mode, this array contains the names of the files to copy from the ``FIXgsm`` system directory to the ``FIXam`` directory under the experiment directory. Note that the last element has a dummy value. This last element will get reset by the workflow generation scripts to the name of the ozone production/loss file to copy from ``FIXgsm``. The name of this file depends on the ozone parameterization being used, and that in turn depends on the CCPP physics suite specified for the experiment. Thus, the CCPP physics suite XML must first be read in to determine the ozone parameterization and then the name of the ozone production/loss file. These steps are carried out elsewhere (in one of the workflow generation scripts/functions). - -``FV3_NML_VARNAME_TO_FIXam_FILES_MAPPING``: (Default: see below) - .. code-block:: console - - ("FNGLAC | $FNGLAC" \ - "FNMXIC | $FNMXIC" \ - "FNTSFC | $FNTSFC" \ - "FNSNOC | $FNSNOC" \ - "FNAISC | $FNAISC" \ - "FNSMCC | $FNSMCC" \ - "FNMSKH | $FNMSKH" ) - - This array is used to set some of the namelist variables in the forecast model's namelist file that represent the relative or absolute paths of various fixed files (the first column of the array, where columns are delineated by the pipe symbol "|") to the full paths to these files in the FIXam directory derived from the corresponding workflow variables containing file names (the second column of the array). - -``FV3_NML_VARNAME_TO_SFC_CLIMO_FIELD_MAPPING``: (Default: see below) - .. 
code-block:: console - - ("FNALBC | snowfree_albedo" \ - "FNALBC2 | facsf" \ - "FNTG3C | substrate_temperature" \ - "FNVEGC | vegetation_greenness" \ - "FNVETC | vegetation_type" \ - "FNSOTC | soil_type" \ - "FNVMNC | vegetation_greenness" \ - "FNVMXC | vegetation_greenness" \ - "FNSLPC | slope_type" \ - "FNABSC | maximum_snow_albedo" ) - - This array is used to set some of the namelist variables in the forecast model's namelist file that represent the relative or absolute paths of various fixed files (the first column of the array, where columns are delineated by the pipe symbol "|") to the full paths to surface climatology files (on the native FV3-LAM grid) in the ``FIXLAM`` directory derived from the corresponding surface climatology fields (the second column of the array). - -``CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING``: (Default: see below) - .. code-block:: console - - ("aerosol.dat | global_climaeropac_global.txt" \ - "co2historicaldata_2010.txt | fix_co2_proj/global_co2historicaldata_2010.txt" \ - "co2historicaldata_2011.txt | fix_co2_proj/global_co2historicaldata_2011.txt" \ - "co2historicaldata_2012.txt | fix_co2_proj/global_co2historicaldata_2012.txt" \ - "co2historicaldata_2013.txt | fix_co2_proj/global_co2historicaldata_2013.txt" \ - "co2historicaldata_2014.txt | fix_co2_proj/global_co2historicaldata_2014.txt" \ - "co2historicaldata_2015.txt | fix_co2_proj/global_co2historicaldata_2015.txt" \ - "co2historicaldata_2016.txt | fix_co2_proj/global_co2historicaldata_2016.txt" \ - "co2historicaldata_2017.txt | fix_co2_proj/global_co2historicaldata_2017.txt" \ - "co2historicaldata_2018.txt | fix_co2_proj/global_co2historicaldata_2018.txt" \ - "co2historicaldata_glob.txt | global_co2historicaldata_glob.txt" \ - "co2monthlycyc.txt | co2monthlycyc.txt" \ - "global_h2oprdlos.f77 | global_h2o_pltc.f77" \ - "global_zorclim.1x1.grb | global_zorclim.1x1.grb" \ - "sfc_emissivity_idx.txt | global_sfc_emissivity_idx.txt" \ - "solarconstant_noaa_an.txt | global_solarconstant_noaa_an.txt" \ - "global_o3prdlos.f77 | " ) - - This array specifies the mapping to use between the symlinks that need to be created in each cycle directory (these are the "files" that FV3 looks for) and their targets in the ``FIXam`` directory. The first column of the array specifies the symlink to be created, and the second column specifies its target file in ``FIXam`` (where columns are delineated by the pipe symbol "|"). - -Workflow Task Parameters -======================== -These parameters set the names of the various workflow tasks and usually do not need to be changed. For each task, additional values set the parameters to pass to the job scheduler (e.g. slurm) that will submit a job for each task to be run. Parameters include the number of nodes to use to run the job, the number of MPI processes per node, the maximum walltime to allow for the job to complete, and the maximum number of times to attempt to run each task. 
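For example, a user who wants the forecast task to have a longer wallclock limit and one automatic retry could override just those two values in ``config.sh``. This is a hypothetical excerpt; the variable names and their defaults appear in the lists that follow:

.. code-block:: console

   # Allow the forecast job more wallclock time than the 04:30:00 default
   TIME_RUN_FCST="06:00:00"
   # Resubmit the forecast task once if it fails
   MAXTRIES_RUN_FCST="2"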
- -Task names: - -| ``MAKE_GRID_TN``: (Default: "make_grid") -| ``MAKE_OROG_TN``: (Default: โ€œmake_orog") -| ``MAKE_SFC_CLIMO_TN``: (Default: โ€œmake_sfc_climo") -| ``GET_EXTRN_ICS_TN``: (Default: "get_extrn_ics") -| ``GET_EXTRN_LBCS_TN``: (Default: "get_extrn_lbcs") -| ``MAKE_ICS_TN``: (Default: "make_ics") -| ``MAKE_LBCS_TN``: (Default: "make_lbcs") -| ``RUN_FCST_TN``: (Default: "run_fcst") -| ``RUN_POST_TN``: (Default: "run_post") - -Number of nodes: - -| ``NODES_MAKE_GRID``: (Default: "1") -| ``NODES_MAKE_OROG``: (Default: "1") -| ``NODES_MAKE_SFC_CLIMO``: (Default: "2") -| ``NODES_GET_EXTRN_ICS``: (Default: "1") -| ``NODES_GET_EXTRN_LBCS``: (Default: "1") -| ``NODES_MAKE_ICS``: (Default: "4") -| ``NODES_MAKE_LBCS``: (Default: "4โ€) -| ``NODES_RUN_FCST``: (Default: "") # Calculated in the workflow generation scripts. -| ``NODES_RUN_POST``: (Default: "2") - -Number of MPI processes per node: - -| ``PPN_MAKE_GRID``: (Default: "24") -| ``PPN_MAKE_OROG``: (Default: "24") -| ``PPN_MAKE_SFC_CLIMO``: (Default: "24") -| ``PPN_GET_EXTRN_ICS``: (Default: "1") -| ``PPN_GET_EXTRN_LBCS``: (Default: "1") -| ``PPN_MAKE_ICS``: (Default: "12") -| ``PPN_MAKE_LBCS``: (Default: "12") -| ``PPN_RUN_FCST``: (Default: "24") # Can be changed depending on the number of threads used. -| ``PPN_RUN_POST``: (Default: "24") - -Wall times: - -| ``TIME_MAKE_GRID``: (Default: "00:20:00") -| ``TIME_MAKE_OROG``: (Default: "00:20:00โ€) -| ``TIME_MAKE_SFC_CLIMO``: (Default: "00:20:00") -| ``TIME_GET_EXTRN_ICS``: (Default: "00:45:00") -| ``TIME_GET_EXTRN_LBCS``: (Default: "00:45:00") -| ``TIME_MAKE_ICS``: (Default: "00:30:00") -| ``TIME_MAKE_LBCS``: (Default: "00:30:00") -| ``TIME_RUN_FCST``: (Default: "04:30:00") -| ``TIME_RUN_POST``: (Default: "00:15:00") - -Maximum number of attempts. - -| ``MAXTRIES_MAKE_GRID``: (Default: "1") -| ``MAXTRIES_MAKE_OROG``: (Default: "1") -| ``MAXTRIES_MAKE_SFC_CLIMO``: (Default: "1") -| ``MAXTRIES_GET_EXTRN_ICS``: (Default: "1") -| ``MAXTRIES_GET_EXTRN_LBCS``: (Default: "1") -| ``MAXTRIES_MAKE_ICS``: (Default: "1") -| ``MAXTRIES_MAKE_LBCS``: (Default: "1") -| ``MAXTRIES_RUN_FCST``: (Default: "1") -| ``MAXTRIES_RUN_POST``: (Default: "1") - -Customized Post Configuration Parameters -======================================== -``USE_CUSTOM_POST_CONFIG_FILE``: (Default: โ€œFALSEโ€) - Flag that determines whether a user-provided custom configuration file should be used for post-processing the model data. If this is set to "TRUE", then the workflow will use the custom post-processing (UPP) configuration file specified in ``CUSTOM_POST_CONFIG_FP``. Otherwise, a default configuration file provided in the UPP repository will be used. - -``CUSTOM_POST_CONFIG_FP``: (Default: โ€œโ€) - The full path to the custom post flat file, including filename, to be used for post-processing. This is only used if ``CUSTOM_POST_CONFIG_FILE`` is set to "TRUE". - -Halo Blend Parameter -==================== -``HALO_BLEND``: (Default: โ€œ10โ€) - Number of rows into the computational domain that should be blended with the LBCs. To shut halo blending off, set this to zero. - -FVCOM Parameter -=============== -``USE_FVCOM``: (Default: โ€œFALSEโ€) - Flag that specifies whether or not to update surface conditions in FV3-LAM with fields generated from the Finite Volume Community Ocean Model (FVCOM). If set to โ€œTRUEโ€, lake/sea surface temperatures, ice surface temperatures, and ice placement will be overwritten by data provided by FVCOM. 
This is done by running the executable ``process_FVCOM.exe`` in the ``MAKE_ICS_TN`` task to modify the file ``sfc_data.nc`` generated by ``chgres_cube``. Note that the FVCOM data must already be interpolated to the desired FV3-LAM grid. - -``FVCOM_DIR``: (Default: โ€œ/user/defined/dir/to/fvcom/data") - User defined directory in which the file ``fvcom.nc`` containing FVCOM data on the FV3-LAM native grid is located. The file name in this directory must be ``fvcom.nc``. - -``FVCOM_FILE``: (Default: โ€œfvcom.ncโ€) - Name of file located in ``FVCOM_DIR`` that has FVCOM data interpolated to FV3-LAM grid. This file will be copied later to a new location and the name changed to ``fvcom.nc``. - -Compiler Parameter -================== -``COMPILER``: (Default: โ€œintelโ€) - Type of compiler invoked during the build step. Currently, this must be set manually (i.e. it is not inherited from the build system in the ``ufs-srweather-app`` directory). - diff --git a/docs/UsersGuide/source/ConfigWorkflow.rst b/docs/UsersGuide/source/ConfigWorkflow.rst index 04d3775c4a..9fcfbd8b11 100644 --- a/docs/UsersGuide/source/ConfigWorkflow.rst +++ b/docs/UsersGuide/source/ConfigWorkflow.rst @@ -1,93 +1,118 @@ .. _ConfigWorkflow: -================================================================== -Configuring the Workflow: ``config.sh`` and ``config_defaults.sh`` -================================================================== -To create the experiment directory and workflow when running the SRW App, the user must create an experiment configuration file named ``config.sh``. This file contains experiment-specific information, such as dates, external model data, directories, and other relevant settings. To help the user, two sample configuration files have been included in the ``regional_workflow`` repositoryโ€™s ``ush`` directory: ``config.community.sh`` and ``config.nco.sh``. The first is for running experiments in community mode (``RUN_ENVIR`` set to โ€œcommunityโ€; see below), and the second is for running experiments in โ€œncoโ€ mode (``RUN_ENVIR`` set to โ€œncoโ€). Note that for this release, only โ€œcommunityโ€ mode is supported. These files can be used as the starting point from which to generate a variety of experiment configurations in which to run the SRW App. +============================================================================================ +Workflow Parameters: Configuring the Workflow in ``config.sh`` and ``config_defaults.sh`` +============================================================================================ +To create the experiment directory and workflow when running the SRW App, the user must create an experiment configuration file named ``config.sh``. This file contains experiment-specific information, such as dates, external model data, observation data, directories, and other relevant settings. To help the user, two sample configuration files have been included in the ``regional_workflow`` repositoryโ€™s ``ush`` directory: ``config.community.sh`` and ``config.nco.sh``. The first is for running experiments in community mode (``RUN_ENVIR`` set to "community"; see below), and the second is for running experiments in "nco" mode (``RUN_ENVIR`` set to "nco"). Note that for this release, only "community" mode is supported. These files can be used as the starting point from which to generate a variety of experiment configurations in which to run the SRW App. -There is an extensive list of experiment parameters that a user can set when configuring the experiment. 
Not all of these need to be explicitly set by the user in ``config.sh``. In the case that a user does not define an entry in the ``config.sh`` script, either its value in ``config_defaults.sh`` will be used, or it will be reset depending on other parameters, e.g. the platform on which the experiment will be run (specified by ``MACHINE``). Note that ``config_defaults.sh`` contains the full list of experiment parameters that a user may set in ``config.sh`` (i.e. the user cannot set parameters in config.sh that are not initialized in ``config_defaults.sh``). +There is an extensive list of experiment parameters that a user can set when configuring the experiment. Not all of these need to be explicitly set by the user in ``config.sh``. If a user does not define an entry in the ``config.sh`` script, either its value in ``config_defaults.sh`` will be used, or it will be reset depending on other parameters, such as the platform on which the experiment will be run (specified by ``MACHINE``). Note that ``config_defaults.sh`` contains the full list of experiment parameters that a user may set in ``config.sh`` (i.e., the user cannot set parameters in config.sh that are not initialized in ``config_defaults.sh``). -The following is a list of the parameters in the ``config_defaults.sh`` file. For each parameter, the default value and a brief description is given. In addition, any relevant information on features and settings supported or unsupported in this release is specified. +The following is a list of the parameters in the ``config_defaults.sh`` file. For each parameter, the default value and a brief description is given. In addition, any relevant information on features and settings supported or unsupported in this release is specified. Platform Environment ==================== -``RUN_ENVIR``: (Default: โ€œncoโ€) - This variable determines the mode that the workflow will run in. The user can choose between two modes: โ€œncoโ€ and โ€œcommunity.โ€ The โ€œncoโ€ mode uses a directory structure that mimics what is used in operations at NOAA/NCEP Central Operations (NCO) and by those in the NOAA/NCEP/Environmental Modeling Center (EMC) working with NCO on pre-implementation testing. Specifics of the conventions used in โ€œncoโ€ mode can be found in the following WCOSS Implementation Standards document: +``RUN_ENVIR``: (Default: "nco") + This variable determines the mode that the workflow will run in. The user can choose between two modes: "nco" and "community". The "nco" mode uses a directory structure that mimics what is used in operations at NOAA/NCEP Central Operations (NCO) and at the NOAA/NCEP/Environmental Modeling Center (EMC), which is working with NCO on pre-implementation testing. Specifics of the conventions used in "nco" mode can be found in the following `WCOSS Implementation Standards `__ document: | NCEP Central Operations | WCOSS Implementation Standards - | April 17, 2019 - | Version 10.2.0 + | January 19, 2022 + | Version 11.0.0 + + Setting ``RUN_ENVIR`` to "community" will use the standard directory structure and variable naming convention and is recommended in most cases for users who are not planning to implement their code into operations at NCO. - Setting ``RUN_ENVIR`` to โ€œcommunityโ€ will use the standard directory structure and variable naming convention and is recommended in most cases for users who are not planning to implement their code into operations at NCO. +``MACHINE``: (Default: "BIG_COMPUTER") + The machine (a.k.a. platform) on which the workflow will run. 
Currently supported platforms include "WCOSS_DELL_P3", "HERA", "ORION", "JET", "ODIN", "CHEYENNE", "STAMPEDE", "GAEA", "SINGULARITY", "NOAACLOUD", "MACOS", and "LINUX". When running the SRW App in a container, set ``MACHINE`` to "SINGULARITY" regardless of the underlying platform. -``MACHINE``: (Default: โ€œBIG_COMPUTERโ€) - The machine (a.k.a. platform) on which the workflow will run. Currently supported platforms include "WCOSS_DELL_P3," "HERA," "ORION," "JET," "ODIN," "CHEYENNE," "STAMPEDE,โ€ โ€œGAEA,โ€ โ€œMACOS,โ€ and โ€œLINUX." +``MACHINE_FILE``: (Default: "") + Path to a configuration file with machine-specific settings. If none is provided, ``setup.sh`` will attempt to set the path to a configuration file for a supported platform. -``ACCOUNT``: (Default: โ€œproject_nameโ€) - The account under which to submit jobs to the queue on the specified ``MACHINE``. +``ACCOUNT``: (Default: "project_name") + The account under which to submit jobs to the queue on the specified ``MACHINE``. To determine an appropriate ``ACCOUNT`` field for `Level 1 `__ systems, users may run the ``groups`` command, which will return a list of projects that the user has permissions for. Not all of the listed projects/groups have an HPC allocation, but those that do are potentially valid account names. On some systems, the ``saccount_params`` command will display additional account details. -``WORKFLOW_MANAGER``: (Default: โ€œnoneโ€) - The workflow manager to use (e.g. โ€œROCOTOโ€). This is set to "none" by default, but if the machine name is set to a platform that supports Rocoto, this will be overwritten and set to "ROCOTO." +``WORKFLOW_MANAGER``: (Default: "none") + The workflow manager to use (e.g. "ROCOTO"). This is set to "none" by default, but if the machine name is set to a platform that supports Rocoto, this will be overwritten and set to "ROCOTO." Valid values: "rocoto" "none" -``SCHED``: (Default: โ€œโ€) - The job scheduler to use (e.g. slurm) on the specified ``MACHINE``. Set this to an empty string in order for the experiment generation script to set it automatically depending on the machine the workflow is running on. Currently, supported schedulers include "slurm," "pbspro," "lsf," "lsfcray," and "none". +``NCORES_PER_NODE``: (Default: "") + The number of cores available per node on the compute platform. Set for supported platforms in ``setup.sh``, but it is now also configurable for all platforms. -``PARTITION_DEFAULT``: (Default: โ€œโ€) - If using the slurm job scheduler (i.e. if ``SCHED`` is set to "slurm"), the default partition to which to submit workflow tasks. If a task does not have a specific variable that specifies the partition to which it will be submitted (e.g. ``PARTITION_HPSS``, ``PARTITION_FCST``; see below), it will be submitted to the partition specified by this variable. If this is not set or is set to an empty string, it will be (re)set to a machine-dependent value. This is not used if ``SCHED`` is not set to "slurm." +``LMOD_PATH``: (Default: "") + Path to the LMOD shell file on the user's Linux system. It is set automatically for supported machines. -``CLUSTERS_DEFAULT``: (Default: โ€œโ€) - If using the slurm job scheduler (i.e. if ``SCHED`` is set to "slurm"), the default clusters to which to submit workflow tasks. If a task does not have a specific variable that specifies the partition to which it will be submitted (e.g. ``CLUSTERS_HPSS``, ``CLUSTERS_FCST``; see below), it will be submitted to the clusters specified by this variable. 
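Putting the platform variables above together, a minimal, hypothetical ``config.sh`` fragment for a supported HPC system might look like the following (the machine and project names are placeholders; on Level 1 systems, the ``groups`` command mentioned above can help identify a project you can charge to):

.. code-block:: console

   MACHINE="HERA"
   ACCOUNT="my_hpc_project"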
If this is not set or is set to an empty string, it will be (re)set to a machine-dependent value. This is not used if ``SCHED`` is not set to "slurm." +``BUILD_MOD_FN``: (Default: "") + Name of alternative build module file to use if running on an unsupported platform. Is set automatically for supported machines. -``QUEUE_DEFAULT``: (Default: โ€œโ€) - The default queue or QOS (if using the slurm job scheduler, where QOS is Quality of Service) to which workflow tasks are submitted. If a task does not have a specific variable that specifies the queue to which it will be submitted (e.g. ``QUEUE_HPSS``, ``QUEUE_FCST``; see below), it will be submitted to the queue specified by this variable. If this is not set or is set to an empty string, it will be (re)set to a machine-dependent value. +``WFLOW_MOD_FN``: (Default: "") + Name of alternative workflow module file to use if running on an unsupported platform. Is set automatically for supported machines. -``PARTITION_HPSS``: (Default: โ€œโ€) - If using the slurm job scheduler (i.e. if ``SCHED`` is set to "slurm"), the partition to which the tasks that get or create links to external model files [which are needed to generate initial conditions (ICs) and lateral boundary conditions (LBCs)] are submitted. If this is not set or is set to an empty string, it will be (re)set to a machine-dependent value. This is not used if ``SCHED`` is not set to "slurm." +``SCHED``: (Default: "") + The job scheduler to use (e.g., Slurm) on the specified ``MACHINE``. Set this to an empty string in order for the experiment generation script to set it automatically depending on the machine the workflow is running on. Valid values: "slurm" "pbspro" "lsf" "lsfcray" "none" -``CLUSTERS_HPSS``: (Default: โ€œโ€) - If using the slurm job scheduler (i.e. if ``SCHED`` is set to "slurm"), the clusters to which the tasks that get or create links to external model files [which are needed to generate initial conditions (ICs) and lateral boundary conditions (LBCs)] are submitted. If this is not set or is set to an empty string, it will be (re)set to a machine-dependent value. This is not used if ``SCHED`` is not set to "slurm." +Machine-Dependent Parameters: +------------------------------- +These parameters vary depending on machine. On `Level 1 and 2 `__ systems, the appropriate values for each machine can be viewed in the ``regional_workflow/ush/machine/.sh`` scripts. To specify a value other than the default, add these variables and the desired value in the ``config.sh`` file so that they override the ``config_defaults.sh`` and machine default values. -``QUEUE_HPSS``: (Default: โ€œโ€) - The queue or QOS to which the tasks that get or create links to external model files are submitted. If this is not set or is set to an empty string, it will be (re)set to a machine-dependent value. +``PARTITION_DEFAULT``: (Default: "") + This variable is only used with the Slurm job scheduler (i.e., if ``SCHED`` is set to "slurm"). This is the default partition to which Slurm submits workflow tasks. When a variable that designates the partition (e.g., ``PARTITION_HPSS``, ``PARTITION_FCST``; see below) is **not** specified, the task will be submitted to the default partition indicated in the ``PARTITION_DEFAULT`` variable. If this value is not set or is set to an empty string, it will be (re)set to a machine-dependent value. Valid values: "" "hera" "normal" "orion" "sjet,vjet,kjet,xjet" "workq" -``PARTITION_FCST``: (Default: โ€œโ€) - If using the slurm job scheduler (i.e. 
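To illustrate the override mechanism described above, a hypothetical ``config.sh`` entry for a Slurm-based system could pin the scheduler and default partition and queue explicitly (the names below are placeholders; valid values are site-dependent, and ``QUEUE_DEFAULT`` is described just below):

.. code-block:: console

   SCHED="slurm"
   PARTITION_DEFAULT="normal"
   QUEUE_DEFAULT="batch"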
if ``SCHED`` is set to "slurm"), the partition to which the task that runs forecasts is submitted. If this is not set or set to an empty string, it will be (re)set to a machine-dependent value. This is not used if ``SCHED`` is not set to "slurm." +``CLUSTERS_DEFAULT``: (Default: "") + This variable is only used with the Slurm job scheduler (i.e., if ``SCHED`` is set to "slurm"). These are the default clusters to which Slurm submits workflow tasks. If ``CLUSTERS_HPSS`` or ``CLUSTERS_FCST`` (see below) are not specified, the task will be submitted to the default clusters indicated in this variable. If this value is not set or is set to an empty string, it will be (re)set to a machine-dependent value. -``CLUSTERS_FCST``: (Default: โ€œโ€) - If using the slurm job scheduler (i.e. if ``SCHED`` is set to "slurm"), the clusters to which the task that runs forecasts is submitted. If this is not set or set to an empty string, it will be (re)set to a machine-dependent value. This is not used if ``SCHED`` is not set to "slurm." +``QUEUE_DEFAULT``: (Default: "") + The default queue or QOS to which workflow tasks are submitted (QOS is Slurm's term for queue; it stands for "Quality of Service"). If the task's ``QUEUE_HPSS`` or ``QUEUE_FCST`` parameters (see below) are not specified, the task will be submitted to the queue indicated by this variable. If this value is not set or is set to an empty string, it will be (re)set to a machine-dependent value. Valid values: "" "batch" "dev" "normal" "regular" "workq" -``QUEUE_FCST``: (Default: โ€œโ€) - The queue or QOS to which the task that runs a forecast is submitted. If this is not set or set to an empty string, it will be (re)set to a machine-dependent value. +``PARTITION_HPSS``: (Default: "") + This variable is only used with the Slurm job scheduler (i.e., if ``SCHED`` is set to "slurm"). Tasks that get or create links to external model files are submitted to the partition specified in this variable. These links are needed to generate initial conditions (ICs) and lateral boundary conditions (LBCs) for the experiment. If this variable is not set or is set to an empty string, it will be (re)set to a machine-dependent value. Valid values: "" "normal" "service" "workq" + +.. + COMMENT: Wouldn't it be reset to the PARTITION_DEFAULT value? + +``CLUSTERS_HPSS``: (Default: "") + This variable is only used with the Slurm job scheduler (i.e., if ``SCHED`` is set to "slurm"). Tasks that get or create links to external model files are submitted to the clusters specified in this variable. These links are needed to generate initial conditions (ICs) and lateral boundary conditions (LBCs) for the experiment. If this variable is not set or is set to an empty string, it will be (re)set to a machine-dependent value. + +``QUEUE_HPSS``: (Default: "") + Tasks that get or create links to external model files are submitted to this queue, or QOS (QOS is Slurm's term for queue; it stands for "Quality of Service"). If this value is not set or is set to an empty string, it will be (re)set to a machine-dependent value. Valid values: "" "batch" "dev_transfer" "normal" "regular" "workq" + +``PARTITION_FCST``: (Default: "") + This variable is only used with the Slurm job scheduler (i.e., if ``SCHED`` is set to "slurm"). The task that runs forecasts is submitted to this partition. If this variable is not set or is set to an empty string, it will be (re)set to a machine-dependent value. 
Valid values: "" "hera" "normal" "orion" "sjet,vjet,kjet,xjet" "workq" + +``CLUSTERS_FCST``: (Default: "") + This variable is only used with the Slurm job scheduler (i.e., if ``SCHED`` is set to "slurm"). The task that runs forecasts is submitted to this cluster. If this variable is not set or is set to an empty string, it will be (re)set to a machine-dependent value. + +``QUEUE_FCST``: (Default: "") + The task that runs a forecast is submitted to this queue, or QOS (QOS is Slurm's term for queue; it stands for "Quality of Service"). If this variable is not set or set to an empty string, it will be (re)set to a machine-dependent value. Valid values: "" "batch" "dev" "normal" "regular" "workq" Parameters for Running Without a Workflow Manager ================================================= -These settings control run commands for platforms without a workflow manager. Values will be ignored unless ``WORKFLOW_MANAGER="none"``. +These settings control run commands for platforms without a workflow manager. Values will be ignored unless ``WORKFLOW_MANAGER="none"``. ``RUN_CMD_UTILS``: (Default: "mpirun -np 1") - The run command for pre-processing utilities (shave, orog, sfc_climo_gen, etc.). This can be left blank for smaller domains, in which case the executables will run without MPI. + The run command for MPI-enabled pre-processing utilities (e.g., shave, orog, sfc_climo_gen). This can be left blank for smaller domains, in which case the executables will run without :term:`MPI`. Users may need to use a different command for launching an MPI-enabled executable depending on their machine and MPI installation. ``RUN_CMD_FCST``: (Default: "mpirun -np \${PE_MEMBER01}") - The run command for the model forecast step. This will be appended to the end of the variable definitions file ("var_defns.sh"). + The run command for the model forecast step. This will be appended to the end of the variable definitions file (``var_defns.sh``). Changing the ``${PE_MEMBER01}`` variable is **not** recommended; it refers to the number of MPI tasks that the Weather Model will expect to run with. Running the Weather Model with a different number of MPI tasks than the workflow has been set up for can lead to segmentation faults and other errors. It is also important to escape the ``$`` character or use single quotes here so that ``PE_MEMBER01`` is not referenced until runtime, since it is not defined at the beginning of the workflow generation script. ``RUN_CMD_POST``: (Default: "mpirun -np 1") - The run command for post-processing (UPP). Can be left blank for smaller domains, in which case UPP will run without MPI. + The run command for post-processing (:term:`UPP`). Can be left blank for smaller domains, in which case UPP will run without :term:`MPI`. Cron-Associated Parameters ========================== -``USE_CRON_TO_RELAUNCH``: (Default: โ€œFALSEโ€) - Flag that determines whether or not a line is added to the user's cron table that calls the experiment launch script every ``CRON_RELAUNCH_INTVL_MNTS`` minutes. +``USE_CRON_TO_RELAUNCH``: (Default: "FALSE") + Flag that determines whether or not a line is added to the user's cron table, which calls the experiment launch script every ``CRON_RELAUNCH_INTVL_MNTS`` minutes. + +``CRON_RELAUNCH_INTVL_MNTS``: (Default: "03") + The interval (in minutes) between successive calls of the experiment launch script by a cron job to (re)launch the experiment (so that the workflow for the experiment kicks off where it left off). 
This is used only if ``USE_CRON_TO_RELAUNCH`` is set to "TRUE". -``CRON_RELAUNCH_INTVL_MNTS``: (Default: โ€œ03โ€) - The interval (in minutes) between successive calls of the experiment launch script by a cron job to (re)launch the experiment (so that the workflow for the experiment kicks off where it left off). This is used only if ``USE_CRON_TO_RELAUNCH`` is set to โ€œTRUEโ€. +.. + COMMENT: Are these variables set in a machine script somewhere for Level 1 systems? I've used cron but never had to set these. It seems like the default for NOAA Cloud is "01". Directory Parameters ==================== -``EXPT_BASEDIR``: (Default: โ€œโ€) - The base directory in which the experiment directory will be created. If this is not specified or if it is set to an empty string, it will default to ``${HOMErrfs}/../../expt_dirs``, where ``${HOMErrfs}`` contains the full path to the ``regional_workflow`` directory. +``EXPT_BASEDIR``: (Default: "") + The base directory in which the experiment directory will be created. If this is not specified or if it is set to an empty string, it will default to ``${HOMErrfs}/../../expt_dirs``, where ``${HOMErrfs}`` contains the full path to the ``regional_workflow`` directory. -``EXPT_SUBDIR``: (Default: โ€œโ€) - The name that the experiment directory (without the full path) will have. The full path to the experiment directory, which will be contained in the variable ``EXPTDIR``, will be: +``EXPT_SUBDIR``: (Default: "") + The name that the experiment directory (without the full path) will have. The full path to the experiment directory, which will be contained in the variable ``EXPTDIR``, will be: .. code-block:: console @@ -95,39 +120,55 @@ Directory Parameters This parameter cannot be left as a null string. +``EXEC_SUBDIR``: (Default: "bin") + The name of the subdirectory of ``ufs-srweather-app`` where executables are installed. + +.. _NCOModeParms: NCO Mode Parameters =================== These variables apply only when using NCO mode (i.e. when ``RUN_ENVIR`` is set to "nco"). ``COMINgfs``: (Default: "/base/path/of/directory/containing/gfs/input/files") - The beginning portion of the directory which contains files generated by the external model that the initial and lateral boundary condition generation tasks need in order to create initial and boundary condition files for a given cycle on the native FV3-LAM grid. For a cycle that starts on the date specified by the variable YYYYMMDD (consisting of the 4-digit year followed by the 2-digit month followed by the 2-digit day of the month) and hour specified by the variable HH (consisting of the 2-digit hour-of-day), the directory in which the workflow will look for the external model files is: + The beginning portion of the path to the directory that contains files generated by the external model (FV3GFS). The initial and lateral boundary condition generation tasks need this path in order to create initial and boundary condition files for a given cycle on the native FV3-LAM grid. For a cycle that starts on the date specified by the variable YYYYMMDD (consisting of the 4-digit year, 2-digit month, and 2-digit day of the month) and the hour specified by the variable HH (consisting of the 2-digit hour of the day), the directory in which the workflow will look for the external model files is: .. code-block:: console - $COMINgfs/gfs.$yyyymmdd/$hh + $COMINgfs/gfs.$yyyymmdd/$hh/atmos + +.. + COMMENT: Should "atmos" be at the end of this file path? 
If so, is it standing in for something (like FV3GFS), or is "atmos" actually part of the file path? Are the files created directly in the "atmos" folder? Or is there an "ICS" and "LBCS" directory generated? + +``FIXLAM_NCO_BASEDIR``: (Default: "") + The base directory containing pregenerated grid, orography, and surface climatology files. For the pregenerated grid specified by ``PREDEF_GRID_NAME``, these "fixed" files are located in: + + .. code-block:: console + + ${FIXLAM_NCO_BASEDIR}/${PREDEF_GRID_NAME} + + The workflow scripts will create a symlink in the experiment directory that will point to a subdirectory (having the name of the grid being used) under this directory. This variable should be set to a null string in this file, but it can be specified in the user-specified workflow configuration file (e.g., ``config.sh``). ``STMP``: (Default: "/base/path/of/directory/containing/model/input/and/raw/output/files") - The beginning portion of the directory that will contain cycle-dependent model input files, symlinks to cycle-independent input files, and raw (i.e. before post-processing) forecast output files for a given cycle. For a cycle that starts on the date specified by YYYYMMDD and hour specified by HH (where YYYYMMDD and HH are as described above) [so that the cycle date (cdate) is given by ``cdate="${YYYYMMDD}${HH}"``], the directory in which the aforementioned files will be located is: + The beginning portion of the path to the directory that will contain :term:`cycle-dependent` model input files, symlinks to :term:`cycle-independent` input files, and raw (i.e., before post-processing) forecast output files for a given :term:`cycle`. The format for cycle dates (cdate) is ``cdate="${YYYYMMDD}${HH}"``, where the date is specified using YYYYMMDD format, and the hour is specified using HH format. The files for a cycle date will be located in the following directory: .. code-block:: console $STMP/tmpnwprd/$RUN/$cdate ``NET, envir, RUN``: - Variables used in forming the path to the directory that will contain the output files from the post-processor (UPP) for a given cycle (see definition of ``PTMP`` below). These are defined in the WCOSS Implementation Standards document as follows: + Variables used in forming the path to the directory that will contain the post-processor (:term:`UPP`) output files for a given cycle (see ``PTMP`` below). These are defined in the `WCOSS Implementation Standards `__ document (pp. 4-5, 19-20) as follows: - ``NET``: (Default: โ€œrrfsโ€) - Model name (first level of com directory structure) + ``NET``: (Default: "rrfs") + Model name (first level of ``com`` directory structure) - ``envir``: (Default: โ€œparaโ€) - Set to "test" during the initial testing phase, "para" when running in parallel (on a schedule), and "prod" in production. + ``envir``: (Default: "para") + Set to "test" during the initial testing phase, "para" when running in parallel (on a schedule), and "prod" in production. (Second level of ``com`` directory structure.) - ``RUN``: (Default: โ€œexperiment_nameโ€) - Name of model run (third level of com directory structure). + ``RUN``: (Default: "experiment_name") + Name of model run (third level of ``com`` directory structure). ``PTMP``: (Default: "/base/path/of/directory/containing/postprocessed/output/files") - The beginning portion of the directory that will contain the output files from the post-processor (UPP) for a given cycle. 
For a cycle that starts on the date specified by YYYYMMDD and hour specified by HH (where YYYYMMDD and HH are as described above), the directory in which the UPP output files will be placed will be: + The beginning portion of the path to the directory that will contain the output files from the post-processor (:term:`UPP`) for a given cycle. For a cycle that starts on the date specified by YYYYMMDD and hour specified by HH (where YYYYMMDD and HH are as described above), the UPP output files will be placed in the following directory: .. code-block:: console @@ -136,7 +177,7 @@ These variables apply only when using NCO mode (i.e. when ``RUN_ENVIR`` is set t Pre-Processing File Separator Parameters ======================================== ``DOT_OR_USCORE``: (Default: "_") - This variable sets the separator character(s) to use in the names of the grid, mosaic, and orography fixed files. Ideally, the same separator should be used in the names of these fixed files as the surface climatology fixed files. + This variable sets the separator character(s) to use in the names of the grid, mosaic, and orography fixed files. Ideally, the same separator should be used in the names of these fixed files as in the surface climatology fixed files. Valid values: "_" "." File Name Parameters ==================== @@ -144,101 +185,1197 @@ File Name Parameters Name of the user-specified configuration file for the forecast experiment. ``RGNL_GRID_NML_FN``: (Default: "regional_grid.nml") - Name of the file containing Fortran namelist settings for the code that generates an "ESGgrid" type of regional grid. + Name of the file containing namelist settings for the code that generates an "ESGgrid" regional grid. ``FV3_NML_BASE_SUITE_FN``: (Default: "input.nml.FV3") - Name of the Fortran namelist file containing the forecast model's base suite namelist, i.e. the portion of the namelist that is common to all physics suites. + Name of the Fortran namelist file containing the forecast model's base suite namelist (i.e., the portion of the namelist that is common to all physics suites). ``FV3_NML_YAML_CONFIG_FN``: (Default: "FV3.input.yml") Name of YAML configuration file containing the forecast model's namelist settings for various physics suites. -``DIAG_TABLE_FN``: (Default: “diag_table”) +``FV3_NML_BASE_ENS_FN``: (Default: "input.nml.base_ens") + Name of the Fortran namelist file containing the forecast model's base ensemble namelist, i.e., the namelist file that is the starting point from which the namelist files for each of the ensemble members are generated. + +``DIAG_TABLE_FN``: (Default: "diag_table") Name of the file that specifies the fields that the forecast model will output. -``FIELD_TABLE_FN``: (Default: “field_table”) - Name of the file that specifies the tracers that the forecast model will read in from the IC/LBC files. +``FIELD_TABLE_FN``: (Default: "field_table") + Name of the file that specifies the tracers that the forecast model will read in from the :term:`IC/LBC` files. -``DATA_TABLE_FN``: (Default: “data_table”) - The name of the file containing the data table read in by the forecast model. +``DATA_TABLE_FN``: (Default: "data_table") + Name of the file containing the data table read in by the forecast model. -``MODEL_CONFIG_FN``: (Default: “model_configure”) - The name of the file containing settings and configurations for the NUOPC/ESMF component.
+``MODEL_CONFIG_FN``: (Default: "model_configure") + Name of the file containing settings and configurations for the :term:`NUOPC`/:term:`ESMF` component. -``NEMS_CONFIG_FN``: (Default: โ€œnems.configureโ€) - The name of the file containing information about the various NEMS components and their run sequence. +``NEMS_CONFIG_FN``: (Default: "nems.configure") + Name of the file containing information about the various :term:`NEMS` components and their run sequence. -``FV3_EXEC_FN``: (Default: โ€œNEMS.exeโ€) - Name of the forecast model executable in the executables directory (``EXECDIR``; set during experiment generation). +``FV3_EXEC_FN``: (Default: "ufs_model") + Name of the forecast model executable stored in the executables directory (``EXECDIR``; set during experiment generation). -``WFLOW_XML_FN``: (Default: โ€œFV3LAM_wflow.xmlโ€) - Name of the Rocoto workflow XML file that the experiment generation script creates and that defines the workflow for the experiment. +``FCST_MODEL``: (Default: "ufs-weather-model") + Name of forecast model. Valid values: "ufs-weather-model" "fv3gfs_aqm" -``GLOBAL_VAR_DEFNS_FN``: (Default: โ€œvar_defns.shโ€) - Name of the file (a shell script) containing the definitions of the primary experiment variables (parameters) defined in this default configuration script and in config.sh as well as secondary experiment variables generated by the experiment generation script. This file is sourced by many scripts (e.g. the J-job scripts corresponding to each workflow task) in order to make all the experiment variables available in those scripts. +``WFLOW_XML_FN``: (Default: "FV3LAM_wflow.xml") + Name of the Rocoto workflow XML file that the experiment generation script creates. This file defines the workflow for the experiment. -``EXTRN_MDL_ICS_VAR_DEFNS_FN``: (Default: โ€œextrn_mdl_ics_var_defns.sh") - Name of the file (a shell script) containing the definitions of variables associated with the external model from which ICs are generated. This file is created by the ``GET_EXTRN_ICS_TN`` task because the values of the variables it contains are not known before this task runs. The file is then sourced by the ``MAKE_ICS_TN`` task. +``GLOBAL_VAR_DEFNS_FN``: (Default: "var_defns.sh") + Name of the file (a shell script) containing definitions of the primary and secondary experiment variables (parameters). This file is sourced by many scripts (e.g., the J-job scripts corresponding to each workflow task) in order to make all the experiment variables available in those scripts. The primary variables are defined in the default configuration script (``config_defaults.sh``) and in ``config.sh``. The secondary experiment variables are generated by the experiment generation script. -``EXTRN_MDL_LBCS_VAR_DEFNS_FN``: (Default: โ€œextrn_mdl_lbcs_var_defns.sh") - Name of the file (a shell script) containing the definitions of variables associated with the external model from which LBCs are generated. This file is created by the ``GET_EXTRN_LBCS_TN`` task because the values of the variables it contains are not known before this task runs. The file is then sourced by the ``MAKE_ICS_TN`` task. +``EXTRN_MDL_ICS_VAR_DEFNS_FN``: (Default: "extrn_mdl_ics_var_defns.sh") + Name of the file (a shell script) containing the definitions of variables associated with the external model from which :term:`ICs` are generated. This file is created by the ``GET_EXTRN_ICS_TN`` task because the values of the variables it contains are not known before this task runs. 
The file is then sourced by the ``MAKE_ICS_TN`` task. -``WFLOW_LAUNCH_SCRIPT_FN``: (Default: โ€œlaunch_FV3LAM_wflow.sh") +``EXTRN_MDL_LBCS_VAR_DEFNS_FN``: (Default: "extrn_mdl_lbcs_var_defns.sh") + Name of the file (a shell script) containing the definitions of variables associated with the external model from which :term:`LBCs` are generated. This file is created by the ``GET_EXTRN_LBCS_TN`` task because the values of the variables it contains are not known before this task runs. The file is then sourced by the ``MAKE_ICS_TN`` task. + +``WFLOW_LAUNCH_SCRIPT_FN``: (Default: "launch_FV3LAM_wflow.sh") Name of the script that can be used to (re)launch the experiment's Rocoto workflow. -``WFLOW_LAUNCH_LOG_FN``: (Default: โ€œlog.launch_FV3LAM_wflowโ€) +``WFLOW_LAUNCH_LOG_FN``: (Default: "log.launch_FV3LAM_wflow") Name of the log file that contains the output from successive calls to the workflow launch script (``WFLOW_LAUNCH_SCRIPT_FN``). Forecast Parameters =================== -``DATE_FIRST_CYCL``: (Default: โ€œYYYYMMDDโ€) - Starting date of the first forecast in the set of forecasts to run. Format is "YYYYMMDD". Note that this does not include the hour-of-day. +``DATE_FIRST_CYCL``: (Default: "YYYYMMDD") + Starting date of the first forecast in the set of forecasts to run. Format is "YYYYMMDD". Note that this does not include the hour of the day. + +``DATE_LAST_CYCL``: (Default: "YYYYMMDD") + Starting date of the last forecast in the set of forecasts to run. Format is "YYYYMMDD". Note that this does not include the hour of the day. -``DATE_LAST_CYCL``: (Default: โ€œYYYYMMDDโ€) - Starting date of the last forecast in the set of forecasts to run. Format is "YYYYMMDD". Note that this does not include the hour-of-day. +``CYCL_HRS``: (Default: ( "HH1" "HH2" )) + An array containing the hours of the day at which to launch forecasts. Forecasts are launched at these hours on each day from ``DATE_FIRST_CYCL`` to ``DATE_LAST_CYCL``, inclusive. Each element of this array must be a two-digit string representing an integer that is less than or equal to 23 (e.g., "00", "03", "12", "23"). -``CYCL_HRS``: (Default: ( โ€œHH1โ€ โ€œHH2โ€ )) - An array containing the hours of the day at which to launch forecasts. Forecasts are launched at these hours on each day from ``DATE_FIRST_CYCL`` to ``DATE_LAST_CYCL``, inclusive. Each element of this array must be a two-digit string representing an integer that is less than or equal to 23, e.g. "00", "03", "12", "23". +``INCR_CYCL_FREQ``: (Default: "24") + Increment in hours for cycle frequency (cycl_freq). The default is "24", which means cycl_freq=24:00:00. -``FCST_LEN_HRS``: (Default: โ€œ24โ€) +.. + COMMENT: What is cycl_freq from? It's not mentioned anywhere else here... In general, this definition could be better... need more info. + +``FCST_LEN_HRS``: (Default: "24") The length of each forecast, in integer hours. +Model Configuration Parameters +================================= + +``DT_ATMOS``: (Default: "") + Time step for the outermost atmospheric model loop in seconds. This corresponds to the frequency at which the physics routines and the top level dynamics routine are called. (Note that one call to the top-level dynamics routine results in multiple calls to the horizontal dynamics, tracer transport, and vertical dynamics routines; see the `FV3 dycore documentation `__ for details.) Must be set. Takes an integer value. + +.. + COMMENT: FV3 documentation says DT_ATMOS must be set, but in our code, the default value is "". What is the actual default value? 
And is the default set by the FV3 dycore (or somewhere else) rather than in the SRW App itself? + +``RESTART_INTERVAL``: (Default: "0") + Frequency of the output restart files in hours. Using the default interval ("0"), restart files are produced at the end of a forecast run. When ``RESTART_INTERVAL="1"``, restart files are produced every hour with the prefix "YYYYMMDD.HHmmSS." in the ``RESTART`` directory. + +.. _InlinePost: + +``WRITE_DOPOST``: (Default: "FALSE") + Flag that determines whether to use the INLINE POST option. If TRUE, the ``WRITE_DOPOST`` flag in the ``model_configure`` file will be set to "TRUE", and the post-processing tasks get called from within the weather model so that the post files (:term:`grib2`) are output by the weather model at the same time that it outputs the ``dynf###.nc`` and ``phyf###.nc`` files. Setting ``WRITE_DOPOST="TRUE"`` + turns off the separate ``run_post`` task (i.e., ``RUN_TASK_RUN_POST`` is set to "FALSE") in ``setup.sh``. + + .. + Should there be an underscore in inline post? + +METplus Parameters ===================== + +:ref:`METplus ` is a scientific verification framework that spans a wide range of temporal and spatial scales. Many of the METplus parameters are described below, but additional documentation for the METplus components is available on the `METplus website `__. + +``MODEL``: (Default: "") + A descriptive name of the user's choice for the model being verified. + +``MET_INSTALL_DIR``: (Default: "") + Path to top-level directory of MET installation. + +``METPLUS_PATH``: (Default: "") + Path to top-level directory of METplus installation. + +``MET_BIN_EXEC``: (Default: "bin") + Location where METplus executables are installed. + +.. _METParamNote: + +.. note:: + Where a date field is required: + * YYYY refers to the 4-digit valid year + * MM refers to the 2-digit valid month + * DD refers to the 2-digit valid day of the month + * HH refers to the 2-digit valid hour of the day + * mm refers to the 2-digit valid minutes of the hour + * SS refers to the 2-digit valid seconds of the hour + +``CCPA_OBS_DIR``: (Default: "") + User-specified location of top-level directory where CCPA hourly precipitation files used by METplus are located. This parameter needs to be set for both user-provided observations and for observations that are retrieved from the NOAA HPSS (if the user has access) via the ``get_obs_ccpa_tn`` task. (This task is activated in the workflow by setting ``RUN_TASK_GET_OBS_CCPA="TRUE"``). + METplus configuration files require the use of a predetermined directory structure and file names. If the CCPA files are user-provided, they need to follow the anticipated naming structure: ``{YYYYMMDD}/ccpa.t{HH}z.01h.hrap.conus.gb2``, where YYYYMMDD and HH are as described in the note :ref:`above `. When pulling observations from NOAA HPSS, the data retrieved will be placed in the ``CCPA_OBS_DIR`` directory. This path must be defined as ``//ccpa/proc``. METplus is configured to verify 01-, 03-, 06-, and 24-h accumulated precipitation using hourly CCPA files. + + .. note:: + There is a problem with the valid time in the metadata for files valid from 19 - 00 UTC (i.e., files under the "00" directory). The script to pull the CCPA data from the NOAA HPSS (``regional_workflow/scripts/exregional_get_ccpa_files.sh``) has an example of how to account for this and organize the data into a more intuitive format. When a fix is provided, it will be accounted for in the ``exregional_get_ccpa_files.sh`` script.
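+
+   As a minimal illustration (the ``/path/to/obs`` prefix and the date shown are hypothetical placeholders, not defaults), user-staged CCPA observations might be arranged as follows, with ``CCPA_OBS_DIR`` in ``config.sh`` pointing at the ``proc`` directory:
+
+   .. code-block:: console
+
+      # Hypothetical excerpt from config.sh for user-provided CCPA observations
+      CCPA_OBS_DIR="/path/to/obs/ccpa/proc"
+      # Expected layout under that directory, following the naming convention above:
+      #   /path/to/obs/ccpa/proc/20190615/ccpa.t00z.01h.hrap.conus.gb2
+      #   /path/to/obs/ccpa/proc/20190615/ccpa.t01z.01h.hrap.conus.gb2
+      #   ...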
+ +``MRMS_OBS_DIR``: (Default: "") + User-specified location of top-level directory where MRMS composite reflectivity files used by METplus are located. This parameter needs to be set for both user-provided observations and for observations that are retrieved from the NOAA HPSS (if the user has access) via the ``get_obs_mrms_tn`` task (activated in the workflow by setting ``RUN_TASK_GET_OBS_MRMS="TRUE"``). When pulling observations directly from NOAA HPSS, the data retrieved will be placed in this directory. Please note that this path must be defined as ``//mrms/proc``. METplus configuration files require the use of a predetermined directory structure and file names. Therefore, if the MRMS files are user-provided, they need to follow the anticipated naming structure: ``{YYYYMMDD}/MergedReflectivityQCComposite_00.50_{YYYYMMDD}-{HH}{mm}{SS}.grib2``, where YYYYMMDD and {HH}{mm}{SS} are as described in the note :ref:`above `. + +.. note:: + METplus is configured to look for an MRMS composite reflectivity file for the valid time of the forecast being verified; since MRMS composite reflectivity files do not always exactly match the valid time, a script within the main script that retrieves MRMS data from the NOAA HPSS identifies and renames the MRMS composite reflectivity file to match the valid time of the forecast. The script to pull the MRMS data from the NOAA HPSS has an example of the expected file naming structure: ``regional_workflow/scripts/exregional_get_mrms_files.sh``. This script calls the script used to identify the MRMS file closest to the valid time: ``regional_workflow/ush/mrms_pull_topofhour.py``. + + +``NDAS_OBS_DIR``: (Default: "") + User-specified location of top-level directory where NDAS prepbufr files used by METplus are located. This parameter needs to be set for both user-provided observations and for observations that are retrieved from the NOAA HPSS (if the user has access) via the ``get_obs_ndas_tn`` task (activated in the workflow by setting ``RUN_TASK_GET_OBS_NDAS="TRUE"``). When pulling observations directly from NOAA HPSS, the data retrieved will be placed in this directory. Please note that this path must be defined as ``//ndas/proc``. METplus is configured to verify near-surface variables hourly and upper-air variables at 00 and 12 UTC with NDAS prepbufr files. METplus configuration files require the use of predetermined file names. Therefore, if the NDAS files are user-provided, they need to follow the anticipated naming structure: ``prepbufr.ndas.{YYYYMMDDHH}``, where YYYYMMDD and HH are as described in the note :ref:`above `. The script to pull the NDAS data from the NOAA HPSS (``regional_workflow/scripts/exregional_get_ndas_files.sh``) has an example of how to rename the NDAS data into a more intuitive format with the valid time listed in the file name.
Valid values: "GSMGFS" "FV3GFS" "RAP" "HRRR" "NAM" + +``EXTRN_MDL_NAME_LBCS``: (Default: "FV3GFS") + The name of the external model that will provide fields from which lateral boundary condition (LBC) files (except for the 0-th hour LBC file) will be generated for input into the forecast model. Valid values: "GSMGFS" "FV3GFS" "RAP" "HRRR" "NAM" + +``LBC_SPEC_INTVL_HRS``: (Default: "6") + The interval (in integer hours) at which LBC files will be generated. This is also referred to as the *boundary specification interval*. Note that the model specified in ``EXTRN_MDL_NAME_LBCS`` must have data available at a frequency greater than or equal to that implied by ``LBC_SPEC_INTVL_HRS``. For example, if ``LBC_SPEC_INTVL_HRS`` is set to "6", then the model must have data available at least every 6 hours. It is up to the user to ensure that this is the case. + +``EXTRN_MDL_ICS_OFFSET_HRS``: (Default: "0") + Users may wish to start a forecast using forecast data from a previous cycle of an external model. This variable sets the number of hours earlier the external model started than when the FV3 forecast configured here should start. For example, if the forecast should start from a 6 hour forecast of the GFS, then ``EXTRN_MDL_ICS_OFFSET_HRS="6"``. + +``EXTRN_MDL_LBCS_OFFSET_HRS``: (Default: "") + Users may wish to use lateral boundary conditions from a forecast that was started earlier than the initial time for the FV3 forecast configured here. This variable sets the number of hours earlier the external model started than when the FV3 forecast configured here should start. For example, if the forecast should use lateral boundary conditions from the GFS started 6 hours earlier, then ``EXTRN_MDL_LBCS_OFFSET_HRS="6"``. Note: the default value is model-dependent and set in ``set_extrn_mdl_params.sh``. + +``FV3GFS_FILE_FMT_ICS``: (Default: "nemsio") + If using the FV3GFS model as the source of the :term:`ICs` (i.e., if ``EXTRN_MDL_NAME_ICS="FV3GFS"``), this variable specifies the format of the model files to use when generating the ICs. Valid values: "nemsio" "grib2" "netcdf" -``EXTRN_MDL_NAME_LBCS``: (Default: โ€œFV3GFSโ€) - The name of the external model that will provide fields from which lateral boundary condition (LBC) files (except for the 0-th hour LBC file) will be generated for input into the forecast model. +``FV3GFS_FILE_FMT_LBCS``: (Default: "nemsio") + If using the FV3GFS model as the source of the :term:`LBCs` (i.e., if ``EXTRN_MDL_NAME_ICS="FV3GFS"``), this variable specifies the format of the model files to use when generating the LBCs. Valid values: "nemsio" "grib2" "netcdf" -``LBC_SPEC_INTVL_HRS``: (Default: โ€œ6โ€) - The interval (in integer hours) at which LBC files will be generated, referred to as the boundary specification interval. Note that the model specified in ``EXTRN_MDL_NAME_LBCS`` must have data available at a frequency greater than or equal to that implied by ``LBC_SPEC_INTVL_HRS``. For example, if ``LBC_SPEC_INTVL_HRS`` is set to 6, then the model must have data available at least every 6 hours. It is up to the user to ensure that this is the case. -``FV3GFS_FILE_FMT_ICS``: (Default: โ€œnemsioโ€) - If using the FV3GFS model as the source of the ICs (i.e. if ``EXTRN_MDL_NAME_ICS`` is set to "FV3GFS"), this variable specifies the format of the model files to use when generating the ICs. -``FV3GFS_FILE_FMT_LBCS``: (Default: โ€œnemsioโ€) - If using the FV3GFS model as the source of the LBCs (i.e. 
if ``EXTRN_MDL_NAME_LBCS`` is set to "FV3GFS"), this variable specifies the format of the model files to use when generating the LBCs. +Base Directories for External Model Files +=========================================== + +.. note:: + Note that these must be defined as null strings in ``config_defaults.sh`` so that if they are specified by the user in the experiment configuration file (i.e., ``config.sh``), they remain set to those values, and if not, they get set to machine-dependent values. + +``EXTRN_MDL_SYSBASEDIR_ICS``: (Default: "") + Base directory on the local machine containing external model files for generating :term:`ICs` on the native grid. The way the full path containing these files is constructed depends on the user-specified external model for ICs (defined in ``EXTRN_MDL_NAME_ICS`` above). + +``EXTRN_MDL_SYSBASEDIR_LBCS``: (Default: "") + Base directory on the local machine containing external model files for generating :term:`LBCs` on the native grid. The way the full path containing these files is constructed depends on the user-specified external model for LBCs (defined in ``EXTRN_MDL_NAME_LBCS`` above). + User-Staged External Model Directory and File Parameters ======================================================== -``USE_USER_STAGED_EXTRN_FILES``: (Default: โ€œFalseโ€) - Flag that determines whether or not the workflow will look for the external model files needed for generating ICs and LBCs in user-specified directories (as opposed to fetching them from mass storage like NOAA HPSS). +``USE_USER_STAGED_EXTRN_FILES``: (Default: "FALSE") + Flag that determines whether or not the workflow will look for the external model files needed for generating :term:`ICs` and :term:`LBCs` in user-specified directories (as opposed to fetching them from mass storage like NOAA HPSS). -``EXTRN_MDL_SOURCE_BASEDIR_ICS``: (Default: โ€œ/base/dir/containing/user/staged/extrn/mdl/files/for/ICs") - Directory in which to look for external model files for generating ICs. If ``USE_USER_STAGED_EXTRN_FILES`` is set to "TRUE", the workflow looks in this directory (specifically, in a subdirectory under this directory named "YYYYMMDDHH" consisting of the starting date and cycle hour of the forecast, where YYYY is the 4-digit year, MM the 2-digit month, DD the 2-digit day of the month, and HH the 2-digit hour of the day) for the external model files specified by the array ``EXTRN_MDL_FILES_ICS`` (these files will be used to generate the ICs on the native FV3-LAM grid). This variable is not used if ``USE_USER_STAGED_EXTRN_FILES`` is set to "FALSE". +``EXTRN_MDL_SOURCE_BASEDIR_ICS``: (Default: "/base/dir/containing/user/staged/extrn/mdl/files/for/ICs") + Directory containing external model files for generating ICs. If ``USE_USER_STAGED_EXTRN_FILES`` is set to "TRUE", the workflow looks within this directory for a subdirectory named "YYYYMMDDHH", which contains the external model files specified by the array ``EXTRN_MDL_FILES_ICS``. This "YYYYMMDDHH" subdirectory corresponds to the start date and cycle hour of the forecast (see :ref:`above `). These files will be used to generate the :term:`ICs` on the native FV3-LAM grid. This variable is not used if ``USE_USER_STAGED_EXTRN_FILES`` is set to "FALSE". -``EXTRN_MDL_FILES_ICS``: (Default: "ICS_file1โ€ โ€œICS_file2โ€ โ€œ...โ€) - Array containing the names of the files to search for in the directory specified by ``EXTRN_MDL_SOURCE_BASEDIR_ICS``. This variable is not used if ``USE_USER_STAGED_EXTRN_FILES`` is set to "FALSE". 
+``EXTRN_MDL_FILES_ICS``: (Default: "ICS_file1" "ICS_file2" "...") + Array containing the file names to search for in the ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` directory. This variable is not used if ``USE_USER_STAGED_EXTRN_FILES`` is set to "FALSE". ``EXTRN_MDL_SOURCE_BASEDIR_LBCS``: (Default: "/base/dir/containing/user/staged/extrn/mdl/files/for/ICs") - Analogous to ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` but for LBCs instead of ICs. + Analogous to ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` but for :term:`LBCs` instead of :term:`ICs`. + +``EXTRN_MDL_FILES_LBCS``: (Default: "LBCS_file1" "LBCS_file2" "...") + Analogous to ``EXTRN_MDL_FILES_ICS`` but for :term:`LBCs` instead of :term:`ICs`. + + +NOMADS Parameters ====================== + +Set parameters associated with NOMADS online data. + +``NOMADS``: (Default: "FALSE") + Flag controlling whether to use NOMADS online data. + +``NOMADS_file_type``: (Default: "nemsio") + Flag controlling the format of the data. Valid values: "GRIB2" "grib2" "NEMSIO" "nemsio" -``EXTRN_MDL_FILES_LBCS``: (Default: " “LBCS_file1” “LBCS_file2” “...”) - Analogous to ``EXTRN_MDL_FILES_ICS`` but for LBCs instead of ICs. CCPP Parameter ============== -``CCPP_PHYS_SUITE``: (Default: "FV3_GFS_v15p2") - The CCPP (Common Community Physics Package) physics suite to use for the forecast(s). The choice of physics suite determines the forecast model's namelist file, the diagnostics table file, the field table file, and the XML physics suite definition file that are staged in the experiment directory or the cycle directories under it. Current supported settings for this parameter are “FV3_GFS_v15p2” and “FV3_RRFS_v1alpha”. +``CCPP_PHYS_SUITE``: (Default: "FV3_GFS_v16") + This parameter indicates which :term:`CCPP` (Common Community Physics Package) physics suite to use for the forecast(s). The choice of physics suite determines the forecast model's namelist file, the diagnostics table file, the field table file, and the XML physics suite definition file, which are staged in the experiment directory or the :term:`cycle` directories under it. + + **Current supported settings for this parameter are:** + + | "FV3_GFS_v16" + | "FV3_RRFS_v1beta" + | "FV3_HRRR" + | "FV3_WoFS" + + **Other valid values include:** + + | "FV3_GFS_2017_gfdlmp" + | "FV3_GFS_2017_gfdlmp_regional" + | "FV3_GFS_v15p2" + | "FV3_GFS_v15_thompson_mynn_lam3km" + | "FV3_RRFS_v1alpha" + + +Stochastic Physics Parameters ================================ + +For the most updated and detailed documentation of these parameters, see the `UFS Stochastic Physics Documentation `__. + + +``NEW_LSCALE``: (Default: "TRUE") + Use correct formula for converting a spatial length scale into spectral space. + +Specific Humidity (SHUM) Perturbation Parameters --------------------------------------------------- + +``DO_SHUM``: (Default: "FALSE") + Flag to turn Specific Humidity (SHUM) perturbations on or off. SHUM perturbations multiply the low-level specific humidity by a small random number at each time-step. The SHUM scheme attempts to address missing physics phenomena (e.g., cold pools, gust fronts) most active in convective regions. + +``ISEED_SHUM``: (Default: "2") + Seed for setting the SHUM random number sequence. + +``SHUM_MAG``: (Default: "0.006") + Amplitudes of random patterns. Corresponds to the variable ``shum`` in ``input.nml``. + +``SHUM_LSCALE``: (Default: "150000") + Decorrelation spatial scale in meters. + +``SHUM_TSCALE``: (Default: "21600") + Decorrelation timescale in seconds.
Corresponds to the variable ``shum_tau`` in ``input.nml``. + +``SHUM_INT``: (Default: "3600") + Interval in seconds to update random pattern (optional). Perturbations still get applied at every time-step. Corresponds to the variable ``shumint`` in ``input.nml``. + +.. _SPPT: + +Stochastically Perturbed Physics Tendencies (SPPT) Parameters ----------------------------------------------------------------- + +SPPT perturbs full physics tendencies *after* the call to the physics suite, unlike :ref:`SPP ` (below), which perturbs specific tuning parameters within a physics scheme. + +``DO_SPPT``: (Default: "FALSE") + Flag to turn Stochastically Perturbed Physics Tendencies (SPPT) on or off. SPPT multiplies the physics tendencies by a random number between 0 and 2 before updating the model state. This addresses errors in the physics parameterizations (either missing physics or unresolved subgrid processes). It is most active in the boundary layer and convective regions. + +``ISEED_SPPT``: (Default: "1") + Seed for setting the SPPT random number sequence. + +``SPPT_MAG``: (Default: "0.7") + Amplitude of random patterns. Corresponds to the variable ``sppt`` in ``input.nml``. + +``SPPT_LOGIT``: (Default: "TRUE") + Limits the SPPT perturbations to between 0 and 2. Should be "TRUE"; otherwise the model will crash. + +``SPPT_LSCALE``: (Default: "150000") + Decorrelation spatial scale in meters. + +``SPPT_TSCALE``: (Default: "21600") + Decorrelation timescale in seconds. Corresponds to the variable ``sppt_tau`` in ``input.nml``. + +``SPPT_INT``: (Default: "3600") + Interval in seconds to update random pattern (optional parameter). Perturbations still get applied at every time-step. Corresponds to the variable ``spptint`` in ``input.nml``. + +``SPPT_SFCLIMIT``: (Default: "TRUE") + When "TRUE", tapers the SPPT perturbations to zero at the model's lowest level, which reduces model crashes. + +``USE_ZMTNBLCK``: (Default: "FALSE") + When "TRUE", do not apply perturbations below the dividing streamline that is diagnosed by the gravity wave drag, mountain blocking scheme. + +Stochastic Kinetic Energy Backscatter (SKEB) Parameters ---------------------------------------------------------- + +``DO_SKEB``: (Default: "FALSE") + Flag to turn Stochastic Kinetic Energy Backscatter (SKEB) on or off. SKEB adds wind perturbations to the model state. Perturbations are random in space/time, but amplitude is determined by a smoothed dissipation estimate provided by the :term:`dynamical core`. SKEB addresses errors in the dynamics that are more active in the mid-latitudes. + +``ISEED_SKEB``: (Default: "3") + Seed for setting the SKEB random number sequence. + +``SKEB_MAG``: (Default: "0.5") + Amplitude of random patterns. Corresponds to the variable ``skeb`` in ``input.nml``. + +``SKEB_LSCALE``: (Default: "150000") + Decorrelation spatial scale in meters. + +``SKEB_TSCALE``: (Default: "21600") + Decorrelation timescale in seconds. Corresponds to the variable ``skeb_tau`` in ``input.nml``. + +``SKEB_INT``: (Default: "3600") + Interval in seconds to update random pattern (optional). Perturbations still get applied at every time-step. Corresponds to the variable ``skebint`` in ``input.nml``. + +``SKEBNORM``: (Default: "1") + Patterns: + * 0-random pattern is stream function + * 1-pattern is K.E. norm + * 2-pattern is vorticity + +``SKEB_VDOF``: (Default: "10") + The number of degrees of freedom in the vertical direction for the SKEB random pattern. + +..
_SPP: + +Parameters for Stochastically Perturbed Parameterizations (SPP) +------------------------------------------------------------------ + +Set default Stochastically Perturbed Parameterizations (SPP) stochastic physics options. Unlike :ref:`SPPT physics `, SPP is applied within the physics, not afterward. SPP perturbs specific tuning parameters within a physics :term:`parameterization` (unlike `SPPT `, which multiplies overall physics tendencies by a random perturbation field *after* the call to the physics suite). Each SPP option is an array, applicable (in order) to the :term:`RAP`/:term:`HRRR`-based parameterization listed in ``SPP_VAR_LIST``. Enter each value of the array in ``config.sh`` as shown below without commas or single quotes (e.g., ``SPP_VAR_LIST=( "pbl" "sfc" "mp" "rad" "gwd"`` ). Both commas and single quotes will be added by Jinja when creating the namelist. + +.. note:: + SPP is currently only available for specific physics schemes used in the RAP/HRRR physics suite. Users need to be aware of which :term:`SDF` is chosen when turning this option on. + +``DO_SPP``: (Default: "false") + Flag to turn SPP on or off. SPP perturbs parameters or variables with unknown or uncertain magnitudes within the physics code based on ranges provided by physics experts. + +``ISEED_SPP``: (Default: ( "4" "4" "4" "4" "4" ) ) + The initial seed value for the perturbation pattern. + +``SPP_MAG_LIST``: (Default: ( "0.2" "0.2" "0.75" "0.2" "0.2" ) ) + Corresponds to the variable ``spp_prt_list`` in ``input.nml`` + +``SPP_LSCALE``: (Default: ( "150000.0" "150000.0" "150000.0" "150000.0" "150000.0" ) ) + Length scale in meters. + +``SPP_TSCALE``: (Default: ( "21600.0" "21600.0" "21600.0" "21600.0" "21600.0" ) ) + Time decorrelation length in seconds. Corresponds to the variable ``spp_tau`` in ``input.nml``. + +``SPP_SIGTOP1``: (Default: ( "0.1" "0.1" "0.1" "0.1" "0.1") ) + Controls vertical tapering of perturbations at the tropopause and corresponds to the lower sigma level at which to taper perturbations to zero. + +.. + COMMENT: Needs review. + +``SPP_SIGTOP2``: (Default: ( "0.025" "0.025" "0.025" "0.025" "0.025" ) ) + Controls vertical tapering of perturbations at the tropopause and corresponds to the upper sigma level at which to taper perturbations to zero. + +.. + COMMENT: Needs review. + +``SPP_STDDEV_CUTOFF``: (Default: ( "1.5" "1.5" "2.5" "1.5" "1.5" ) ) + Perturbation magnitude cutoff in number of standard deviations from the mean. + +.. + COMMENT: Needs review. + +``SPP_VAR_LIST``: (Default: ( "pbl" "sfc" "mp" "rad" "gwd" ) ) + The list of parameterizations to perturb: planetary boundary layer (PBL), surface physics (SFC), microphysics (MP), radiation (RAD), gravity wave drag (GWD). Valid values: "pbl", "sfc", "rad", "gwd", and "mp". + + +Land Surface Model (LSM) SPP +------------------------------- + +Land surface perturbations can be applied to land model parameters and land model prognostic variables. The LSM scheme is intended to address errors in the land model and land-atmosphere interactions. LSM perturbations include soil moisture content [SMC] (volume fraction), vegetation fraction (VGF), albedo (ALB), salinity (SAL), emissivity (EMI), surface roughness (ZOL) (in cm), and soil temperature (STC). Perturbations to soil moisture content (SMC) are only applied at the first time step. Only five perturbations at a time can be applied currently, but all seven are shown below. In addition, only one unique *iseed* value is allowed at the moment, and it is used for each pattern. 
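+
+As a minimal sketch only (not a recommendation; the ``DO_LSM_SPP`` and ``LSM_SPP_*`` parameters referenced here are defined below), LSM SPP might be activated in ``config.sh`` like this:
+
+.. code-block:: console
+
+   # Hypothetical config.sh excerpt: turn on LSM SPP and perturb at every time step
+   DO_LSM_SPP="TRUE"
+   LSM_SPP_EACH_STEP="TRUE"
+   # LSM_SPP_VAR_LIST and LSM_SPP_MAG_LIST keep their defaults (listed below) unless
+   # overridden here; arrays are entered without commas, e.g.
+   # LSM_SPP_VAR_LIST=( "smc" "vgf" "alb" )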
+ +The parameters below turn on SPP in Noah or RUC LSM (support for Noah MP is in progress). Please be aware of the :term:`SDF` that you choose if you wish to turn on Land Surface Model (LSM) SPP. SPP in LSM schemes is handled in the ``&nam_sfcperts`` namelist block instead of in ``&nam_sppperts``, where all other SPP is implemented. The default perturbation frequency is determined by the ``fhcyc`` namelist entry. Since that parameter is set to zero in the SRW App, use ``LSM_SPP_EACH_STEP`` to perturb every time step. + +``DO_LSM_SPP``: (Default: "false") + Turns on Land Surface Model (LSM) Stochastic Physics Parameterizations (SPP). When "TRUE", sets ``lndp_type=2``, which applies land perturbations to the selected parameters using a newer scheme designed for data assimilation (DA) ensemble spread. LSM SPP perturbs uncertain land surface fields ("smc" "vgf" "alb" "sal" "emi" "zol" "stc") based on recommendations from physics experts. + +``LSM_SPP_TSCALE``: (Default: ( "21600" "21600" "21600" "21600" "21600" "21600" "21600" ) ) + Decorrelation timescale in seconds. + +``LSM_SPP_LSCALE``: (Default: ( "150000" "150000" "150000" "150000" "150000" "150000" "150000" ) ) + Decorrelation spatial scale in meters. + +``ISEED_LSM_SPP``: (Default: ("9") ) + Seed to initialize the random perturbation pattern. + +``LSM_SPP_VAR_LIST``: (Default: ( "smc" "vgf" "alb" "sal" "emi" "zol" "stc" ) ) + Indicates which LSM variables to perturb. + +``LSM_SPP_MAG_LIST``: (Default: ( "0.2" "0.001" "0.001" "0.001" "0.001" "0.001" "0.2" ) ) + Sets the maximum random pattern amplitude for each of the LSM perturbations. + +``LSM_SPP_EACH_STEP``: (Default: "true") + When set to "TRUE", it sets ``lndp_each_step=.true.`` and perturbs each time step. + +.. This is a continuation of the ConfigWorkflow.rst chapter + +.. _ConfigParameters: + +Grid Generation Parameters ========================== +``GRID_GEN_METHOD``: (Default: "") + This variable specifies which method to use to generate a regional grid in the horizontal plane. The values that it can take on are: + + * **"ESGgrid":** The "ESGgrid" method will generate a regional version of the Extended Schmidt Gnomonic (ESG) grid using the map projection developed by Jim Purser of EMC (:cite:t:`Purser_2020`). "ESGgrid" is the preferred grid option. + + * **"GFDLgrid":** The "GFDLgrid" method first generates a "parent" global cubed-sphere grid. Then a portion from tile 6 of the global grid is used as the regional grid. This regional grid is referred to in the grid generation scripts as "tile 7," even though it does not correspond to a complete tile. The forecast is run only on the regional grid (i.e., on tile 7, not on tiles 1 through 6). Note that the "GFDLgrid" grid generation method is the legacy grid generation method. It is not supported in *all* predefined domains. + +.. attention:: + + If the experiment uses a **predefined grid** (i.e., if ``PREDEF_GRID_NAME`` is set to the name of a valid predefined grid), then ``GRID_GEN_METHOD`` will be reset to the value of ``GRID_GEN_METHOD`` for that grid. This will happen regardless of whether ``GRID_GEN_METHOD`` is assigned a value in the experiment configuration file; any value assigned will be overwritten. + +.. note:: + + If the experiment uses a **user-defined grid** (i.e., if ``PREDEF_GRID_NAME`` is set to a null string), then ``GRID_GEN_METHOD`` must be set in the experiment configuration file.
Otherwise, the experiment generation will fail because the generation scripts check to ensure that the grid name is set to a non-empty string before creating the experiment directory. + +.. _ESGgrid: + +ESGgrid Settings +------------------- + +The following parameters must be set if using the "ESGgrid" method of generating a regional grid (i.e., when ``GRID_GEN_METHOD="ESGgrid"``). + +``ESGgrid_LON_CTR``: (Default: "") + The longitude of the center of the grid (in degrees). + +``ESGgrid_LAT_CTR``: (Default: "") + The latitude of the center of the grid (in degrees). + +``ESGgrid_DELX``: (Default: "") + The cell size in the zonal direction of the regional grid (in meters). + +``ESGgrid_DELY``: (Default: "") + The cell size in the meridional direction of the regional grid (in meters). + +``ESGgrid_NX``: (Default: "") + The number of cells in the zonal direction on the regional grid. + +``ESGgrid_NY``: (Default: "") + The number of cells in the meridional direction on the regional grid. + +``ESGgrid_WIDE_HALO_WIDTH``: (Default: "") + The width (in number of grid cells) of the :term:`halo` to add around the regional grid before shaving the halo down to the width(s) expected by the forecast model. + +``ESGgrid_PAZI``: (Default: "") + The rotational parameter for the "ESGgrid" (in degrees). + +.. _WideHalo: + +.. note:: + A :term:`halo` is the strip of cells surrounding the regional grid; the halo is used to feed in the lateral boundary conditions to the grid. The forecast model requires **grid** files containing 3-cell- and 4-cell-wide halos and **orography** files with 0-cell- and 3-cell- wide halos. In order to generate grid and orography files with appropriately-sized halos, the grid and orography tasks create preliminary files with halos around the regional domain of width ``ESGgrid_WIDE_HALO_WIDTH`` cells. The files are then read in and "shaved" down to obtain grid files with 3-cell-wide and 4-cell-wide halos and orography files with 0-cell-wide and 3-cell-wide halos. The original halo that gets shaved down is referred to as the "wide" halo because it is wider than the 0-cell-wide, 3-cell-wide, and 4-cell-wide halos that we eventually end up with. Note that the grid and orography files with the wide halo are only needed as intermediates in generating the files with 0-cell-, 3-cell-, and 4-cell-wide halos; they are not needed by the forecast model. + +.. + COMMENT: There's a note that we "probably don't need to make ESGgrid_WIDE_HALO_WIDTH a user-specified variable. Just set it in the function set_gridparams_ESGgrid.sh". Has this been done? I thought there was a default value of 6. Does this come from set_gridparams_ESGgrid.sh? Will it overwirte what's added here? + + +GFDLgrid Settings +--------------------- + +The following parameters must be set if using the "GFDLgrid" method of generating a regional grid (i.e., when ``GRID_GEN_METHOD="GFDLgrid"``). Note that the regional grid is defined with respect to a "parent" global cubed-sphere grid. Thus, all the parameters for a global cubed-sphere grid must be specified even though the model equations are integrated only on the regional grid. Tile 6 has arbitrarily been chosen as the tile to use to orient the global parent grid on the sphere (Earth). For convenience, the regional grid is denoted as "tile 7" even though it is embedded within tile 6 (i.e., it doesn't extend beyond the boundary of tile 6). 
Its exact location within tile 6 is determined by specifying the starting and ending i- and j-indices of the regional grid on tile 6, where i is the grid index in the x direction and j is the grid index in the y direction. All of this information is set in the variables below. + +``GFDLgrid_LON_T6_CTR``: (Default: "") + Longitude of the center of tile 6 (in degrees). + +``GFDLgrid_LAT_T6_CTR``: (Default: "") + Latitude of the center of tile 6 (in degrees). + +``GFDLgrid_RES``: (Default: "") + Number of points in either of the two horizontal directions (x and y) on each tile of the parent global cubed-sphere grid. Valid values: "48" "96" "192" "384" "768" "1152" "3072" + + .. + COMMENT: Are these still the valid values? Are there others? + + .. note:: + ``GFDLgrid_RES`` is a misnomer because it specifies *number* of grid cells, not grid size (in meters or kilometers). However, we keep this name in order to remain consistent with the usage of the word "resolution" in the global forecast model and auxiliary codes. The mapping from ``GFDLgrid_RES`` to a nominal resolution (grid cell size) for several values of ``GFDLgrid_RES`` is as follows (assuming a uniform global grid, i.e., with Schmidt stretch factor ``GFDLgrid_STRETCH_FAC="1"``): + + +----------------+--------------------+ + | GFDLgrid_RES | typical cell size | + +================+====================+ + | 192 | 50 km | + +----------------+--------------------+ + | 384 | 25 km | + +----------------+--------------------+ + | 768 | 13 km | + +----------------+--------------------+ + | 1152 | 8.5 km | + +----------------+--------------------+ + | 3072 | 3.2 km | + +----------------+--------------------+ + + Note that these are only typical cell sizes. The actual cell size on the global grid tiles varies somewhat as we move across a tile. + + +``GFDLgrid_STRETCH_FAC``: (Default: "") + Stretching factor used in the Schmidt transformation applied to the parent cubed-sphere grid. Setting the Schmidt stretching factor (``GFDLgrid_STRETCH_FAC``) to a value greater than 1 shrinks tile 6, while setting it to a value less than 1 (but still greater than 0) expands it. The remaining 5 tiles change shape as necessary to maintain global coverage of the grid. + +``GFDLgrid_REFINE_RATIO``: (Default: "") + Cell refinement ratio for the regional grid. It refers to the number of cells in either the x or y direction on the regional grid (tile 7) that abut one cell on its parent tile (tile 6). + +``GFDLgrid_ISTART_OF_RGNL_DOM_ON_T6G``: (Default: "") + i-index on tile 6 at which the regional grid (tile 7) starts. + +``GFDLgrid_IEND_OF_RGNL_DOM_ON_T6G``: (Default: "") + i-index on tile 6 at which the regional grid (tile 7) ends. + +``GFDLgrid_JSTART_OF_RGNL_DOM_ON_T6G``: (Default: "") + j-index on tile 6 at which the regional grid (tile 7) starts. + +``GFDLgrid_JEND_OF_RGNL_DOM_ON_T6G``: (Default: "") + j-index on tile 6 at which the regional grid (tile 7) ends. + +``GFDLgrid_USE_GFDLgrid_RES_IN_FILENAMES``: (Default: "") + Flag that determines the file naming convention to use for grid, orography, and surface climatology files (or, if using pregenerated files, the naming convention that was used to name these files). These files usually start with the string ``"C${RES}_"``, where ``RES`` is an integer. In the global forecast model, ``RES`` is the number of points in each of the two horizontal directions (x and y) on each tile of the global grid (defined here as ``GFDLgrid_RES``). 
If this flag is set to "TRUE", ``RES`` will be set to ``GFDLgrid_RES`` just as in the global forecast model. If it is set to "FALSE", we calculate (in the grid generation task) an "equivalent global uniform cubed-sphere resolution" -- call it ``RES_EQUIV`` -- and then set ``RES`` equal to it. ``RES_EQUIV`` is the number of grid points in each of the x and y directions on each tile that a global UNIFORM (i.e., stretch factor of 1) cubed-sphere grid would need to have in order to have the same average grid size as the regional grid. This is a more useful indicator of the grid size because it takes into account the effects of ``GFDLgrid_RES``, ``GFDLgrid_STRETCH_FAC``, and ``GFDLgrid_REFINE_RATIO`` in determining the regional grid's typical grid size, whereas simply setting RES to ``GFDLgrid_RES`` doesn't take into account the effects of ``GFDLgrid_STRETCH_FAC`` and ``GFDLgrid_REFINE_RATIO`` on the regional grid's resolution. Nevertheless, some users still prefer to use ``GFDLgrid_RES`` in the file names, so we allow for that here by setting this flag to "TRUE". + +Computational Forecast Parameters +================================= + +``LAYOUT_X, LAYOUT_Y``: (Default: "") + The number of :term:`MPI` tasks (processes) to use in the two horizontal directions (x and y) of the regional grid when running the forecast model. + +``BLOCKSIZE``: (Default: "") + The amount of data that is passed into the cache at a time. + +.. note:: + + In ``config_defaults.sh`` these parameters are set to null strings so that: + + #. If the experiment is using a predefined grid and the user sets the ``BLOCKSIZE`` parameter in the user-specified experiment configuration file (i.e., ``config.sh``), that value will be used in the forecast(s). Otherwise, the default ``BLOCKSIZE`` for that predefined grid will be used. + #. If the experiment is *not* using a predefined grid (i.e., it is using a custom grid whose parameters are specified in the experiment configuration file), then the user must specify a value for the ``BLOCKSIZE`` parameter in that configuration file. Otherwise, it will remain set to a null string, and the experiment generation will fail, because the generation scripts check to ensure that all the parameters defined in this section are set to non-empty strings before creating the experiment directory. + +.. _WriteComp: + +Write-Component (Quilting) Parameters +====================================== + +.. note:: + The :term:`UPP` (called by the ``RUN_POST_TN`` task) cannot process output on the native grid types ("GFDLgrid" and "ESGgrid"), so output fields are interpolated to a **write-component grid** before writing them to an output file. The output files written by the UFS Weather Model model use an Earth System Modeling Framework (ESMF) component, referred to as the **write component**. This model component is configured with settings in the ``model_configure`` file, as described in `Section 4.2.3 `__ of the UFS Weather Model documentation. + +``QUILTING``: (Default: "TRUE") + +.. attention:: + The regional grid requires the use of the write component, so users generally should not need to change the default value for ``QUILTING``. + + Flag that determines whether to use the write component for writing forecast output files to disk. If set to "TRUE", the forecast model will output files named ``dynf$HHH.nc`` and ``phyf$HHH.nc`` (where HHH is the 3-digit forecast hour) containing dynamics and physics fields, respectively, on the write-component grid. 
For example, the output files for the 3rd hour of the forecast would be ``dynf003.nc`` and ``phyf003.nc``. (The regridding from the native FV3-LAM grid to the write-component grid is done by the forecast model.) If ``QUILTING`` is set to "FALSE", then the output file names are ``fv3_history.nc`` and ``fv3_history2d.nc``, and they contain fields on the native grid. Although the UFS Weather Model can run without quilting, the regional grid requires the use of the write component. Therefore, ``QUILTING`` should be set to "TRUE" when running the SRW App. If ``QUILTING`` is set to "FALSE", the ``RUN_POST_TN`` (meta)task cannot run because the :term:`UPP` code that this task calls cannot process fields on the native grid. In that case, the ``RUN_POST_TN`` (meta)task will be automatically removed from the Rocoto workflow XML. The :ref:`INLINE POST ` option also requires ``QUILTING`` to be set to "TRUE" in the SRW App. + +``PRINT_ESMF``: (Default: "FALSE") + Flag that determines whether to output extra (debugging) information from ESMF routines. Must be "TRUE" or "FALSE". Note that the write component uses ESMF library routines to interpolate from the native forecast model grid to the user-specified output grid (which is defined in the model configuration file ``model_configure`` in the forecast run directory). + +``WRTCMP_write_groups``: (Default: "1") + The number of write groups (i.e., groups of :term:`MPI` tasks) to use in the write component. + +``WRTCMP_write_tasks_per_group``: (Default: "20") + The number of MPI tasks to allocate for each write group. + +``WRTCMP_output_grid``: (Default: "''") + Sets the type (coordinate system) of the write component grid. The default empty string forces the user to set a valid value for ``WRTCMP_output_grid`` in ``config.sh`` if specifying a *custom* grid. When creating an experiment with a user-defined grid, this parameter must be specified or the experiment will fail. Valid values: "lambert_conformal" "regional_latlon" "rotated_latlon" + +``WRTCMP_cen_lon``: (Default: "") + Longitude (in degrees) of the center of the write component grid. Can usually be set to the corresponding value from the native grid. + +``WRTCMP_cen_lat``: (Default: "") + Latitude (in degrees) of the center of the write component grid. Can usually be set to the corresponding value from the native grid. + +``WRTCMP_lon_lwr_left``: (Default: "") + Longitude (in degrees) of the center of the lower-left (southwest) cell on the write component grid. If using the "rotated_latlon" coordinate system, this is expressed in terms of the rotated longitude. Must be set manually when running an experiment with a user-defined grid. + +``WRTCMP_lat_lwr_left``: (Default: "") + Latitude (in degrees) of the center of the lower-left (southwest) cell on the write component grid. If using the "rotated_latlon" coordinate system, this is expressed in terms of the rotated latitude. Must be set manually when running an experiment with a user-defined grid. + +**The following parameters must be set when** ``WRTCMP_output_grid`` **is set to "rotated_latlon":** + +``WRTCMP_lon_upr_rght``: (Default: "") + Longitude (in degrees) of the center of the upper-right (northeast) cell on the write component grid (expressed in terms of the rotated longitude). + +``WRTCMP_lat_upr_rght``: (Default: "") + Latitude (in degrees) of the center of the upper-right (northeast) cell on the write component grid (expressed in terms of the rotated latitude).
+ +``WRTCMP_dlon``: (Default: "") + Size (in degrees) of a grid cell on the write component grid (expressed in terms of the rotated longitude). + +``WRTCMP_dlat``: (Default: "") + Size (in degrees) of a grid cell on the write component grid (expressed in terms of the rotated latitude). + +**The following parameters must be set when** ``WRTCMP_output_grid`` **is set to "lambert_conformal":** + +``WRTCMP_stdlat1``: (Default: "") + First standard latitude (in degrees) in definition of Lambert conformal projection. + +``WRTCMP_stdlat2``: (Default: "") + Second standard latitude (in degrees) in definition of Lambert conformal projection. + +``WRTCMP_nx``: (Default: "") + Number of grid points in the x-coordinate of the Lambert conformal projection. + +``WRTCMP_ny``: (Default: "") + Number of grid points in the y-coordinate of the Lambert conformal projection. + +``WRTCMP_dx``: (Default: "") + Grid cell size (in meters) along the x-axis of the Lambert conformal projection. + +``WRTCMP_dy``: (Default: "") + Grid cell size (in meters) along the y-axis of the Lambert conformal projection. + + +Predefined Grid Parameters +========================== +``PREDEF_GRID_NAME``: (Default: "") + This parameter specifies the name of a predefined regional grid. Setting ``PREDEF_GRID_NAME`` provides a convenient method of specifying a commonly used set of grid-dependent parameters. The predefined grid parameters are specified in the script ``ush/set_predef_grid_params.sh``. + + **Currently supported options:** + + | "RRFS_CONUS_25km" + | "RRFS_CONUS_13km" + | "RRFS_CONUS_3km" + | "SUBCONUS_Ind_3km" + + **Other valid values include:** + + | "CONUS_25km_GFDLgrid" + | "CONUS_3km_GFDLgrid" + | "EMC_AK" + | "EMC_HI" + | "EMC_PR" + | "EMC_GU" + | "GSL_HAFSV0.A_25km" + | "GSL_HAFSV0.A_13km" + | "GSL_HAFSV0.A_3km" + | "GSD_HRRR_AK_50km" + | "RRFS_AK_13km" + | "RRFS_AK_3km" + | "RRFS_CONUScompact_25km" + | "RRFS_CONUScompact_13km" + | "RRFS_CONUScompact_3km" + | "RRFS_NA_13km" + | "RRFS_NA_3km" + | "RRFS_SUBCONUS_3km" + | "WoFS_3km" + +.. note:: + + * If ``PREDEF_GRID_NAME`` is set to a valid predefined grid name, the grid generation method ``GRID_GEN_METHOD``, the (native) grid parameters, and the write-component grid parameters are set to predefined values for the specified grid, overwriting any settings of these parameters in the user-specified experiment configuration file (``config.sh``). In addition, if the time step ``DT_ATMOS`` and the computational parameters ``LAYOUT_X``, ``LAYOUT_Y``, and ``BLOCKSIZE`` are not specified in that configuration file, they are also set to predefined values for the specified grid. + + * If ``PREDEF_GRID_NAME`` is set to an empty string, it implies the user is providing the native grid parameters in the user-specified experiment configuration file (``EXPT_CONFIG_FN``). In this case, the grid generation method ``GRID_GEN_METHOD``, the native grid parameters, and the write-component grid parameters as well as the main time step (``DT_ATMOS``) and the computational parameters ``LAYOUT_X``, ``LAYOUT_Y``, and ``BLOCKSIZE`` must be set in that configuration file. Otherwise, the values of all of these parameters in this default experiment configuration file will be used. 
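+
+As an illustration only (the grid name shown is one of the supported options listed above; no other grid settings need to be supplied in this case), selecting a predefined grid in ``config.sh`` might look like this:
+
+.. code-block:: console
+
+   # Hypothetical config.sh excerpt: use a predefined grid
+   PREDEF_GRID_NAME="RRFS_CONUS_25km"
+   # GRID_GEN_METHOD, the native grid parameters, and the write-component grid
+   # parameters are then taken from ush/set_predef_grid_params.sh; DT_ATMOS,
+   # LAYOUT_X, LAYOUT_Y, and BLOCKSIZE are also set there unless specified here.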
+ + +Pre-existing Directory Parameter +================================ +``PREEXISTING_DIR_METHOD``: (Default: "delete") + This variable determines the method to use to deal with pre-existing directories (generated by previous calls to the experiment generation script using the same experiment name (``EXPT_SUBDIR``) as the current experiment). This variable must be set to one of three valid values: "delete", "rename", and "quit". The resulting behavior for each of these values is as follows: + + * **"delete":** The preexisting directory is deleted and a new directory (having the same name as the original preexisting directory) is created. + + * **"rename":** The preexisting directory is renamed and a new directory (having the same name as the original pre-existing directory) is created. The new name of the preexisting directory consists of its original name and the suffix "_oldNNN", where NNN is a 3-digit integer chosen to make the new name unique. + + * **"quit":** The preexisting directory is left unchanged, but execution of the currently running script is terminated. In this case, the preexisting directory must be dealt with manually before rerunning the script. + + +Verbose Parameter +================= +``VERBOSE``: (Default: "TRUE") + Flag that determines whether the experiment generation and workflow task scripts print out extra informational messages. Valid values: "TRUE" "true" "YES" "yes" "FALSE" "false" "NO" "no" + +Debug Parameter +================= +``DEBUG``: (Default: "FALSE") + Flag that determines whether to print out very detailed debugging messages. Note that if DEBUG is set to TRUE, then VERBOSE will also get reset to TRUE if it isn't already. Valid values: "TRUE" "true" "YES" "yes" "FALSE" "false" "NO" "no" + +.. _WFTasks: + +Rocoto Workflow Tasks +======================== + +Set the names of the various Rocoto workflow tasks. These names usually do not need to be changed. 
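+
+These task names are ordinary experiment variables, so if one ever did need to change, it could be overridden in ``config.sh`` like any other setting (purely illustrative; the default names are assumed elsewhere in this chapter):
+
+.. code-block:: console
+
+   # Hypothetical override of a single Rocoto task name in config.sh
+   RUN_POST_TN="run_post_custom"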
+ +**Baseline Tasks:** + +| ``MAKE_GRID_TN``: (Default: "make_grid") +| ``MAKE_OROG_TN``: (Default: "make_orog") +| ``MAKE_SFC_CLIMO_TN``: (Default: "make_sfc_climo") +| ``GET_EXTRN_ICS_TN``: (Default: "get_extrn_ics") +| ``GET_EXTRN_LBCS_TN``: (Default: "get_extrn_lbcs") +| ``MAKE_ICS_TN``: (Default: "make_ics") +| ``MAKE_LBCS_TN``: (Default: "make_lbcs") +| ``RUN_FCST_TN``: (Default: "run_fcst") +| ``RUN_POST_TN``: (Default: "run_post") + +**METplus Verification Tasks:** When running METplus verification tasks, the following task names are also added to the Rocoto workflow: + +| ``GET_OBS``: (Default: "get_obs") +| ``GET_OBS_CCPA_TN``: (Default: "get_obs_ccpa") +| ``GET_OBS_MRMS_TN``: (Default: "get_obs_mrms") +| ``GET_OBS_NDAS_TN``: (Default: "get_obs_ndas") +| ``VX_TN``: (Default: "run_vx") +| ``VX_GRIDSTAT_TN``: (Default: "run_gridstatvx") +| ``VX_GRIDSTAT_REFC_TN``: (Default: "run_gridstatvx_refc") +| ``VX_GRIDSTAT_RETOP_TN``: (Default: "run_gridstatvx_retop") +| ``VX_GRIDSTAT_03h_TN``: (Default: "run_gridstatvx_03h") +| ``VX_GRIDSTAT_06h_TN``: (Default: "run_gridstatvx_06h") +| ``VX_GRIDSTAT_24h_TN``: (Default: "run_gridstatvx_24h") +| ``VX_POINTSTAT_TN``: (Default: "run_pointstatvx") +| ``VX_ENSGRID_TN``: (Default: "run_ensgridvx") +| ``VX_ENSGRID_03h_TN``: (Default: "run_ensgridvx_03h") +| ``VX_ENSGRID_06h_TN``: (Default: "run_ensgridvx_06h") +| ``VX_ENSGRID_24h_TN``: (Default: "run_ensgridvx_24h") +| ``VX_ENSGRID_REFC_TN``: (Default: "run_ensgridvx_refc") +| ``VX_ENSGRID_RETOP_TN``: (Default: "run_ensgridvx_retop") +| ``VX_ENSGRID_MEAN_TN``: (Default: "run_ensgridvx_mean") +| ``VX_ENSGRID_PROB_TN``: (Default: "run_ensgridvx_prob") +| ``VX_ENSGRID_MEAN_03h_TN``: (Default: "run_ensgridvx_mean_03h") +| ``VX_ENSGRID_PROB_03h_TN``: (Default: "run_ensgridvx_prob_03h") +| ``VX_ENSGRID_MEAN_06h_TN``: (Default: "run_ensgridvx_mean_06h") +| ``VX_ENSGRID_PROB_06h_TN``: (Default: "run_ensgridvx_prob_06h") +| ``VX_ENSGRID_MEAN_24h_TN``: (Default: "run_ensgridvx_mean_24h") +| ``VX_ENSGRID_PROB_24h_TN``: (Default: "run_ensgridvx_prob_24h") +| ``VX_ENSGRID_PROB_REFC_TN``: (Default: "run_ensgridvx_prob_refc") +| ``VX_ENSGRID_PROB_RETOP_TN``: (Default: "run_ensgridvx_prob_retop") +| ``VX_ENSPOINT_TN``: (Default: "run_enspointvx") +| ``VX_ENSPOINT_MEAN_TN``: (Default: "run_enspointvx_mean") +| ``VX_ENSPOINT_PROB_TN``: (Default: "run_enspointvx_prob") + + +Workflow Task Parameters +======================== +For each workflow task, additional parameters set the values to pass to the job scheduler (e.g., Slurm) that will submit a job for each task to be run. Parameters include the number of nodes to use to run the job, the number of MPI processes per node, the maximum walltime to allow for the job to complete, and the maximum number of times to attempt to run each task. + +**Number of nodes:** + +| ``NNODES_MAKE_GRID``: (Default: "1") +| ``NNODES_MAKE_OROG``: (Default: "1") +| ``NNODES_MAKE_SFC_CLIMO``: (Default: "2") +| ``NNODES_GET_EXTRN_ICS``: (Default: "1") +| ``NNODES_GET_EXTRN_LBCS``: (Default: "1") +| ``NNODES_MAKE_ICS``: (Default: "4") +| ``NNODES_MAKE_LBCS``: (Default: "4") +| ``NNODES_RUN_FCST``: (Default: "") + +.. note:: + The correct value for ``NNODES_RUN_FCST`` will be calculated in the workflow generation scripts. 
+ +| ``NNODES_RUN_POST``: (Default: "2") +| ``NNODES_GET_OBS_CCPA``: (Default: "1") +| ``NNODES_GET_OBS_MRMS``: (Default: "1") +| ``NNODES_GET_OBS_NDAS``: (Default: "1") +| ``NNODES_VX_GRIDSTAT``: (Default: "1") +| ``NNODES_VX_POINTSTAT``: (Default: "1") +| ``NNODES_VX_ENSGRID``: (Default: "1") +| ``NNODES_VX_ENSGRID_MEAN``: (Default: "1") +| ``NNODES_VX_ENSGRID_PROB``: (Default: "1") +| ``NNODES_VX_ENSPOINT``: (Default: "1") +| ``NNODES_VX_ENSPOINT_MEAN``: (Default: "1") +| ``NNODES_VX_ENSPOINT_PROB``: (Default: "1") + +**Number of MPI processes per node:** + +| ``PPN_MAKE_GRID``: (Default: "24") +| ``PPN_MAKE_OROG``: (Default: "24") +| ``PPN_MAKE_SFC_CLIMO``: (Default: "24") +| ``PPN_GET_EXTRN_ICS``: (Default: "1") +| ``PPN_GET_EXTRN_LBCS``: (Default: "1") +| ``PPN_MAKE_ICS``: (Default: "12") +| ``PPN_MAKE_LBCS``: (Default: "12") +| ``PPN_RUN_FCST``: (Default: "") + +.. note:: + The correct value for ``PPN_RUN_FCST`` will be calculated from ``NCORES_PER_NODE`` and ``OMP_NUM_THREADS`` in ``setup.sh``. + +| ``PPN_RUN_POST``: (Default: "24") +| ``PPN_GET_OBS_CCPA``: (Default: "1") +| ``PPN_GET_OBS_MRMS``: (Default: "1") +| ``PPN_GET_OBS_NDAS``: (Default: "1") +| ``PPN_VX_GRIDSTAT``: (Default: "1") +| ``PPN_VX_POINTSTAT``: (Default: "1") +| ``PPN_VX_ENSGRID``: (Default: "1") +| ``PPN_VX_ENSGRID_MEAN``: (Default: "1") +| ``PPN_VX_ENSGRID_PROB``: (Default: "1") +| ``PPN_VX_ENSPOINT``: (Default: "1") +| ``PPN_VX_ENSPOINT_MEAN``: (Default: "1") +| ``PPN_VX_ENSPOINT_PROB``: (Default: "1") + + +**Wall Times:** Maximum amount of time for the task to run + +| ``WTIME_MAKE_GRID``: (Default: "00:20:00") +| ``WTIME_MAKE_OROG``: (Default: "01:00:00") +| ``WTIME_MAKE_SFC_CLIMO``: (Default: "00:20:00") +| ``WTIME_GET_EXTRN_ICS``: (Default: "00:45:00") +| ``WTIME_GET_EXTRN_LBCS``: (Default: "00:45:00") +| ``WTIME_MAKE_ICS``: (Default: "00:30:00") +| ``WTIME_MAKE_LBCS``: (Default: "00:30:00") +| ``WTIME_RUN_FCST``: (Default: "04:30:00") +| ``WTIME_RUN_POST``: (Default: "00:15:00") +| ``WTIME_GET_OBS_CCPA``: (Default: "00:45:00") +| ``WTIME_GET_OBS_MRMS``: (Default: "00:45:00") +| ``WTIME_GET_OBS_NDAS``: (Default: "02:00:00") +| ``WTIME_VX_GRIDSTAT``: (Default: "02:00:00") +| ``WTIME_VX_POINTSTAT``: (Default: "01:00:00") +| ``WTIME_VX_ENSGRID``: (Default: "01:00:00") +| ``WTIME_VX_ENSGRID_MEAN``: (Default: "01:00:00") +| ``WTIME_VX_ENSGRID_PROB``: (Default: "01:00:00") +| ``WTIME_VX_ENSPOINT``: (Default: "01:00:00") +| ``WTIME_VX_ENSPOINT_MEAN``: (Default: "01:00:00") +| ``WTIME_VX_ENSPOINT_PROB``: (Default: "01:00:00") + +**Maximum number of attempts to run a task:** + +| ``MAXTRIES_MAKE_GRID``: (Default: "2") +| ``MAXTRIES_MAKE_OROG``: (Default: "2") +| ``MAXTRIES_MAKE_SFC_CLIMO``: (Default: "2") +| ``MAXTRIES_GET_EXTRN_ICS``: (Default: "1") +| ``MAXTRIES_GET_EXTRN_LBCS``: (Default: "1") +| ``MAXTRIES_MAKE_ICS``: (Default: "1") +| ``MAXTRIES_MAKE_LBCS``: (Default: "1") +| ``MAXTRIES_RUN_FCST``: (Default: "1") +| ``MAXTRIES_RUN_POST``: (Default: "2") +| ``MAXTRIES_GET_OBS_CCPA``: (Default: "1") +| ``MAXTRIES_GET_OBS_MRMS``: (Default: "1") +| ``MAXTRIES_GET_OBS_NDAS``: (Default: "1") +| ``MAXTRIES_VX_GRIDSTAT``: (Default: "1") +| ``MAXTRIES_VX_GRIDSTAT_REFC``: (Default: "1") +| ``MAXTRIES_VX_GRIDSTAT_RETOP``: (Default: "1") +| ``MAXTRIES_VX_GRIDSTAT_03h``: (Default: "1") +| ``MAXTRIES_VX_GRIDSTAT_06h``: (Default: "1") +| ``MAXTRIES_VX_GRIDSTAT_24h``: (Default: "1") +| ``MAXTRIES_VX_POINTSTAT``: (Default: "1") +| ``MAXTRIES_VX_ENSGRID``: (Default: "1") +| ``MAXTRIES_VX_ENSGRID_REFC``: (Default: "1") +| 
``MAXTRIES_VX_ENSGRID_RETOP``: (Default: "1") +| ``MAXTRIES_VX_ENSGRID_03h``: (Default: "1") +| ``MAXTRIES_VX_ENSGRID_06h``: (Default: "1") +| ``MAXTRIES_VX_ENSGRID_24h``: (Default: "1") +| ``MAXTRIES_VX_ENSGRID_MEAN``: (Default: "1") +| ``MAXTRIES_VX_ENSGRID_PROB``: (Default: "1") +| ``MAXTRIES_VX_ENSGRID_MEAN_03h``: (Default: "1") +| ``MAXTRIES_VX_ENSGRID_PROB_03h``: (Default: "1") +| ``MAXTRIES_VX_ENSGRID_MEAN_06h``: (Default: "1") +| ``MAXTRIES_VX_ENSGRID_PROB_06h``: (Default: "1") +| ``MAXTRIES_VX_ENSGRID_MEAN_24h``: (Default: "1") +| ``MAXTRIES_VX_ENSGRID_PROB_24h``: (Default: "1") +| ``MAXTRIES_VX_ENSGRID_PROB_REFC``: (Default: "1") +| ``MAXTRIES_VX_ENSGRID_PROB_RETOP``: (Default: "1") +| ``MAXTRIES_VX_ENSPOINT``: (Default: "1") +| ``MAXTRIES_VX_ENSPOINT_MEAN``: (Default: "1") +| ``MAXTRIES_VX_ENSPOINT_PROB``: (Default: "1") + + +Pre-Processing Parameters +========================= +These parameters set flags (and related directories) that determine whether various workflow tasks should be run. Note that the ``MAKE_GRID_TN``, ``MAKE_OROG_TN``, and ``MAKE_SFC_CLIMO_TN`` are all :term:`cycle-independent` tasks, i.e., if they are run at all, they run only once at the beginning of the workflow, before any cycles are run. + +Baseline Workflow Tasks +-------------------------- + +``RUN_TASK_MAKE_GRID``: (Default: "TRUE") + Flag that determines whether to run the grid file generation task (``MAKE_GRID_TN``). If this is set to "TRUE", the grid generation task is run and new grid files are generated. If it is set to "FALSE", then the scripts look for pre-generated grid files in the directory specified by ``GRID_DIR`` (see below). + +``GRID_DIR``: (Default: "/path/to/pregenerated/grid/files") + The directory containing pre-generated grid files when ``RUN_TASK_MAKE_GRID`` is set to "FALSE". + +``RUN_TASK_MAKE_OROG``: (Default: "TRUE") + Same as ``RUN_TASK_MAKE_GRID`` but for the orography generation task (``MAKE_OROG_TN``). + +``OROG_DIR``: (Default: "/path/to/pregenerated/orog/files") + Same as ``GRID_DIR`` but for the orography generation task (``MAKE_OROG_TN``). + +``RUN_TASK_MAKE_SFC_CLIMO``: (Default: "TRUE") + Same as ``RUN_TASK_MAKE_GRID`` but for the surface climatology generation task (``MAKE_SFC_CLIMO_TN``). + +``SFC_CLIMO_DIR``: (Default: "/path/to/pregenerated/surface/climo/files") + Same as ``GRID_DIR`` but for the surface climatology generation task (``MAKE_SFC_CLIMO_TN``). + +``RUN_TASK_GET_EXTRN_ICS``: (Default: "TRUE") + Flag that determines whether to run the ``GET_EXTRN_ICS_TN`` task. + +``RUN_TASK_GET_EXTRN_LBCS``: (Default: "TRUE") + Flag that determines whether to run the ``GET_EXTRN_LBCS_TN`` task. + +``RUN_TASK_MAKE_ICS``: (Default: "TRUE") + Flag that determines whether to run the ``MAKE_ICS_TN`` task. + +``RUN_TASK_MAKE_LBCS``: (Default: "TRUE") + Flag that determines whether to run the ``MAKE_LBCS_TN`` task. + +``RUN_TASK_RUN_FCST``: (Default: "TRUE") + Flag that determines whether to run the ``RUN_FCST_TN`` task. + +``RUN_TASK_RUN_POST``: (Default: "TRUE") + Flag that determines whether to run the ``RUN_POST_TN`` task. + +.. _VXTasks: + +Verification Tasks +-------------------- + +``RUN_TASK_GET_OBS_CCPA``: (Default: "FALSE") + Flag that determines whether to run the ``GET_OBS_CCPA_TN`` task, which retrieves the :term:`CCPA` hourly precipitation files used by METplus from NOAA HPSS.
+ +``RUN_TASK_GET_OBS_MRMS``: (Default: "FALSE") + Flag that determines whether to run the ``GET_OBS_MRMS_TN`` task, which retrieves the :term:`MRMS` composite reflectivity files used by METplus from NOAA HPSS. + +``RUN_TASK_GET_OBS_NDAS``: (Default: "FALSE") + Flag that determines whether to run the ``GET_OBS_NDAS_TN`` task, which retrieves the :term:`NDAS` PrepBufr files used by METplus from NOAA HPSS. + +``RUN_TASK_VX_GRIDSTAT``: (Default: "FALSE") + Flag that determines whether to run the grid-stat verification task. + +``RUN_TASK_VX_POINTSTAT``: (Default: "FALSE") + Flag that determines whether to run the point-stat verification task. + +``RUN_TASK_VX_ENSGRID``: (Default: "FALSE") + Flag that determines whether to run the ensemble-stat verification task for gridded data. + +``RUN_TASK_VX_ENSPOINT``: (Default: "FALSE") + Flag that determines whether to run the ensemble point verification task. If this flag is set, both ensemble-stat point verification and point verification of ensemble-stat output are computed. + +.. + COMMENT: Might be worth defining "ensemble-stat verification for gridded data," "ensemble point verification," "ensemble-stat point verification," and "point verification of ensemble-stat output" + +Aerosol Climatology Parameter +================================ + +``USE_MERRA_CLIMO``: (Default: "FALSE") + Flag that determines whether MERRA2 aerosol climatology data and lookup tables for optics properties are obtained. + +.. + COMMENT: When would it be appropriate to obtain these files? + +Surface Climatology Parameter +============================= +``SFC_CLIMO_FIELDS``: (Default: "("facsf" "maximum_snow_albedo" "slope_type" "snowfree_albedo" "soil_type" "substrate_temperature" "vegetation_greenness" "vegetation_type")" ) + Array containing the names of all the fields for which ``MAKE_SFC_CLIMO_TN`` generates files on the native FV3-LAM grid. + +Fixed File Parameters +===================== +These parameters are associated with the fixed (i.e., static) files. On `Level 1 & 2 `__ systems, fixed files are prestaged with paths defined in the ``setup.sh`` script. Because the default values are platform-dependent, they are set to a null string in ``config_defaults.sh``. Then these null values are overwritten in ``setup.sh`` with machine-specific values or with a user-specified value from ``config.sh``. + +``FIXgsm``: (Default: "") + System directory in which the majority of fixed (i.e., time-independent) files that are needed to run the FV3-LAM model are located. + +``FIXaer``: (Default: "") + System directory where MERRA2 aerosol climatology files are located. + +``FIXlut``: (Default: "") + System directory where the lookup tables for optics properties are located. + +``TOPO_DIR``: (Default: "") + The location on disk of the static input files used by the ``make_orog`` task (i.e., ``orog.x`` and ``shave.x``). Can be the same as ``FIXgsm``. + +``SFC_CLIMO_INPUT_DIR``: (Default: "") + The location on disk of the static surface climatology input fields, used by ``sfc_climo_gen``. These files are only used if ``RUN_TASK_MAKE_SFC_CLIMO=TRUE``. + +``FNGLAC, ..., FNMSKH``: (Default: see below) + ..
code-block:: console + + (FNGLAC="global_glacier.2x2.grb" + FNMXIC="global_maxice.2x2.grb" + FNTSFC="RTGSST.1982.2012.monthly.clim.grb" + FNSNOC="global_snoclim.1.875.grb" + FNZORC="igbp" + FNAISC="CFSR.SEAICE.1982.2012.monthly.clim.grb" + FNSMCC="global_soilmgldas.t126.384.190.grb" + FNMSKH="seaice_newland.grb") + + Names and default locations of (some of the) global data files that are assumed to exist in a system directory. (This directory is machine-dependent; the experiment generation scripts will set it and store it in the variable ``FIXgsm``.) These file names also appear directly in the forecast model's input :term:`namelist` file. + +``FIXgsm_FILES_TO_COPY_TO_FIXam``: (Default: see below) + .. code-block:: console + + ("$FNGLAC" \ + "$FNMXIC" \ + "$FNTSFC" \ + "$FNSNOC" \ + "$FNAISC" \ + "$FNSMCC" \ + "$FNMSKH" \ + "global_climaeropac_global.txt" \ + "fix_co2_proj/global_co2historicaldata_2010.txt" \ + "fix_co2_proj/global_co2historicaldata_2011.txt" \ + "fix_co2_proj/global_co2historicaldata_2012.txt" \ + "fix_co2_proj/global_co2historicaldata_2013.txt" \ + "fix_co2_proj/global_co2historicaldata_2014.txt" \ + "fix_co2_proj/global_co2historicaldata_2015.txt" \ + "fix_co2_proj/global_co2historicaldata_2016.txt" \ + "fix_co2_proj/global_co2historicaldata_2017.txt" \ + "fix_co2_proj/global_co2historicaldata_2018.txt" \ + "fix_co2_proj/global_co2historicaldata_2019.txt" \ + "fix_co2_proj/global_co2historicaldata_2020.txt" \ + "fix_co2_proj/global_co2historicaldata_2021.txt" \ + "global_co2historicaldata_glob.txt" \ + "co2monthlycyc.txt" \ + "global_h2o_pltc.f77" \ + "global_hyblev.l65.txt" \ + "global_zorclim.1x1.grb" \ + "global_sfc_emissivity_idx.txt" \ + "global_tg3clim.2.6x1.5.grb" \ + "global_solarconstant_noaa_an.txt" \ + "global_albedo4.1x1.grb" \ + "geo_em.d01.lat-lon.2.5m.HGT_M.nc" \ + "HGT.Beljaars_filtered.lat-lon.30s_res.nc" \ + "replace_with_FIXgsm_ozone_prodloss_filename") + + If not running in NCO mode, this array contains the names of the files to copy from the ``FIXgsm`` system directory to the ``FIXam`` directory under the experiment directory. + + .. note:: + The last element in the list above contains a dummy value. This value will be reset by the workflow generation scripts to the name of the ozone production/loss file that needs to be copied from ``FIXgsm``. This file depends on the :term:`CCPP` physics suite specified for the experiment (and the corresponding ozone parameterization scheme used in that physics suite). + +``FV3_NML_VARNAME_TO_FIXam_FILES_MAPPING``: (Default: see below) + .. code-block:: console + + ("FNGLAC | $FNGLAC" \ + "FNMXIC | $FNMXIC" \ + "FNTSFC | $FNTSFC" \ + "FNSNOC | $FNSNOC" \ + "FNAISC | $FNAISC" \ + "FNSMCC | $FNSMCC" \ + "FNMSKH | $FNMSKH" ) + + This array is used to set some of the :term:`namelist` variables in the forecast model's namelist file. It maps file symlinks to the actual fixed file locations in the ``FIXam`` directory. The symlink names appear in the first column (to the left of the "|" symbol), and the paths to these files (in the ``FIXam`` directory) are held in workflow variables, which appear to the right of the "|" symbol. It is possible to remove ``FV3_NML_VARNAME_TO_FIXam_FILES_MAPPING`` as a workflow variable and make it only a local one since it is used in only one script. + +.. + COMMENT: Why is #"FNZORC | $FNZORC" \ commented out in config_defaults.sh? + COMMENT: Is this an accurate rewording of the original? + + +``FV3_NML_VARNAME_TO_SFC_CLIMO_FIELD_MAPPING``: (Default: see below) + .. 
code-block:: console + + ("FNALBC | snowfree_albedo" \ + "FNALBC2 | facsf" \ + "FNTG3C | substrate_temperature" \ + "FNVEGC | vegetation_greenness" \ + "FNVETC | vegetation_type" \ + "FNSOTC | soil_type" \ + "FNVMNC | vegetation_greenness" \ + "FNVMXC | vegetation_greenness" \ + "FNSLPC | slope_type" \ + "FNABSC | maximum_snow_albedo" ) + + This array is used to set some of the :term:`namelist` variables in the forecast model's namelist file. The variable names appear in the first column (to the left of the "|" symbol), and the paths to these surface climatology files on the native FV3-LAM grid (in the ``FIXLAM`` directory) are derived from the corresponding surface climatology fields (the second column of the array). + +.. + COMMENT: Is this an accurate rewording of the original? + +``CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING``: (Default: see below) + .. code-block:: console + + ("aerosol.dat | global_climaeropac_global.txt" \ + "co2historicaldata_2010.txt | fix_co2_proj/global_co2historicaldata_2010.txt" \ + "co2historicaldata_2011.txt | fix_co2_proj/global_co2historicaldata_2011.txt" \ + "co2historicaldata_2012.txt | fix_co2_proj/global_co2historicaldata_2012.txt" \ + "co2historicaldata_2013.txt | fix_co2_proj/global_co2historicaldata_2013.txt" \ + "co2historicaldata_2014.txt | fix_co2_proj/global_co2historicaldata_2014.txt" \ + "co2historicaldata_2015.txt | fix_co2_proj/global_co2historicaldata_2015.txt" \ + "co2historicaldata_2016.txt | fix_co2_proj/global_co2historicaldata_2016.txt" \ + "co2historicaldata_2017.txt | fix_co2_proj/global_co2historicaldata_2017.txt" \ + "co2historicaldata_2018.txt | fix_co2_proj/global_co2historicaldata_2018.txt" \ + "co2historicaldata_2019.txt | fix_co2_proj/global_co2historicaldata_2019.txt" \ + "co2historicaldata_2020.txt | fix_co2_proj/global_co2historicaldata_2020.txt" \ + "co2historicaldata_2021.txt | fix_co2_proj/global_co2historicaldata_2021.txt" \ + "co2historicaldata_glob.txt | global_co2historicaldata_glob.txt" \ + "co2monthlycyc.txt | co2monthlycyc.txt" \ + "global_h2oprdlos.f77 | global_h2o_pltc.f77" \ + "global_albedo4.1x1.grb | global_albedo4.1x1.grb" \ + "global_zorclim.1x1.grb | global_zorclim.1x1.grb" \ + "global_tg3clim.2.6x1.5.grb | global_tg3clim.2.6x1.5.grb" \ + "sfc_emissivity_idx.txt | global_sfc_emissivity_idx.txt" \ + "solarconstant_noaa_an.txt | global_solarconstant_noaa_an.txt" \ + "global_o3prdlos.f77 | " ) + + This array specifies the mapping to use between the symlinks that need to be created in each cycle directory (these are the "files" that FV3 looks for) and their targets in the ``FIXam`` directory. The first column of the array specifies the symlink to be created, and the second column specifies its target file in ``FIXam`` (where columns are delineated by the pipe symbol "|"). + +Subhourly Forecast Parameters +================================= + +``SUB_HOURLY_POST``: (Default: "FALSE") + Flag that indicates whether the forecast model will generate output files on a sub-hourly time interval (e.g., 10 minutes, 15 minutes). This will also cause the post-processor to process these sub-hourly files. If this variable is set to "TRUE", then ``DT_SUB_HOURLY_POST_MNTS`` should be set to a value between "01" and "59". + +``DT_SUB_HOURLY_POST_MNTS``: (Default: "00") + Time interval in minutes between the forecast model output files. If ``SUB_HOURLY_POST`` is set to "TRUE", this needs to be set to a two-digit integer between "01" and "59". Note that if ``SUB_HOURLY_POST`` is set to "TRUE" but ``DT_SUB_HOURLY_POST_MNTS`` is set to "00", ``SUB_HOURLY_POST`` will get reset to "FALSE" in the experiment generation scripts (there will be an informational message in the log file to emphasize this). Valid values: "1" "01" "2" "02" "3" "03" "4" "04" "5" "05" "6" "06" "10" "12" "15" "20" "30".
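+ +For example, to have the forecast model write output every 15 minutes (and have the post-processor process those files), a user's ``config.sh`` might contain a hypothetical excerpt such as the following; any of the valid values listed above could be used in place of "15": + +.. code-block:: console + + SUB_HOURLY_POST="TRUE" + DT_SUB_HOURLY_POST_MNTS="15" # two-digit value between "01" and "59"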
+ +Customized Post Configuration Parameters +======================================== + +``USE_CUSTOM_POST_CONFIG_FILE``: (Default: "FALSE") + Flag that determines whether a user-provided custom configuration file should be used for post-processing the model data. If this is set to "TRUE", then the workflow will use the custom post-processing (:term:`UPP`) configuration file specified in ``CUSTOM_POST_CONFIG_FP``. Otherwise, a default configuration file provided in the UPP repository will be used. + +``CUSTOM_POST_CONFIG_FP``: (Default: "") + The full path to the custom post flat file, including filename, to be used for post-processing. This is only used if ``USE_CUSTOM_POST_CONFIG_FILE`` is set to "TRUE". + + +Community Radiative Transfer Model (CRTM) Parameters +======================================================= + +These variables set parameters associated with outputting satellite fields in the :term:`UPP` :term:`grib2` files using the Community Radiative Transfer Model (:term:`CRTM`). :numref:`Section %s ` includes further instructions on how to do this. + +.. + COMMENT: What actually happens here? Where are the satellite fields outputted to? When/why would this be used? What kind of satellites? + +``USE_CRTM``: (Default: "FALSE") + Flag that defines whether external :term:`CRTM` coefficient files have been staged by the user in order to output synthetic satellite products available within the :term:`UPP`. If this is set to "TRUE", then the workflow will check for these files in the directory ``CRTM_DIR``. Otherwise, it is assumed that no satellite fields are being requested in the UPP configuration. + +``CRTM_DIR``: (Default: "") + This is the path to the top CRTM fix file directory. This is only used if ``USE_CRTM`` is set to "TRUE". + +Ensemble Model Parameters +============================ + +``DO_ENSEMBLE``: (Default: "FALSE") + Flag that determines whether to run a set of ensemble forecasts (for each set of specified cycles). If this is set to "TRUE", ``NUM_ENS_MEMBERS`` forecasts are run for each cycle, each with a different set of stochastic seed values. When "FALSE", a single forecast is run for each cycle. + +``NUM_ENS_MEMBERS``: (Default: "1") + The number of ensemble members to run if ``DO_ENSEMBLE`` is set to "TRUE". This variable also controls the naming of the ensemble member directories. For example, if ``NUM_ENS_MEMBERS`` is set to "8", the member directories will be named *mem1, mem2, ..., mem8*. If it is set to "08" (with a leading zero), the member directories will be named *mem01, mem02, ..., mem08*. However, after reading in the number of characters in this string (in order to determine how many leading zeros, if any, should be placed in the names of the member directories), the workflow generation scripts strip away those leading zeros. Thus, in the variable definitions file (``GLOBAL_VAR_DEFNS_FN``), this variable appears with its leading zeros stripped. This variable is not used unless ``DO_ENSEMBLE`` is set to "TRUE".
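+ +As an illustration, the following hypothetical ``config.sh`` excerpt would request an eight-member ensemble with zero-padded member directory names (*mem01, mem02, ..., mem08*): + +.. code-block:: console + + DO_ENSEMBLE="TRUE" + NUM_ENS_MEMBERS="08" # the leading zero produces member directories mem01, ..., mem08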
+ +.. _HaloBlend: + +Halo Blend Parameter +==================== +``HALO_BLEND``: (Default: "10") + Number of cells to use for "blending" the external solution (obtained from the :term:`LBCs`) with the internal solution from the FV3-LAM dycore. Specifically, it refers to the number of rows into the computational domain that should be blended with the LBCs. Cells at which blending occurs are all within the boundary of the native grid; they don't involve the 4 cells outside the boundary where the LBCs are specified (which is a different :term:`halo`). Blending is necessary to smooth out waves generated due to mismatch between the external and internal solutions. To shut :term:`halo` blending off, set this to zero. + + +FVCOM Parameter +=============== +``USE_FVCOM``: (Default: "FALSE") + Flag that specifies whether or not to update surface conditions in FV3-LAM with fields generated from the Finite Volume Community Ocean Model (:term:`FVCOM`). If set to "TRUE", lake/sea surface temperatures, ice surface temperatures, and ice placement will be overwritten using data provided by FVCOM. Setting ``USE_FVCOM`` to "TRUE" causes the executable ``process_FVCOM.exe`` in the ``MAKE_ICS_TN`` task to run. This, in turn, modifies the file ``sfc_data.nc`` generated by ``chgres_cube``. Note that the FVCOM data must already be interpolated to the desired FV3-LAM grid. + +``FVCOM_WCSTART``: (Default: "cold") + Defines whether this is a "warm" start or a "cold" start. Setting this to "warm" will read in ``sfc_data.nc`` generated in a RESTART directory. Setting this to "cold" will read in the ``sfc_data.nc`` generated from ``chgres_cube`` in the ``make_ics`` portion of the workflow. Valid values: "cold" "warm" + +``FVCOM_DIR``: (Default: "/user/defined/dir/to/fvcom/data") + User-defined directory where the ``fvcom.nc`` file containing :term:`FVCOM` data on the FV3-LAM native grid is located. The file name in this directory must be ``fvcom.nc``. + +``FVCOM_FILE``: (Default: "fvcom.nc") + Name of file located in ``FVCOM_DIR`` that has :term:`FVCOM` data interpolated to the FV3-LAM grid. This file will be copied later to a new location and the name changed to ``fvcom.nc`` if a name other than ``fvcom.nc`` is selected. + +Compiler Parameter +================== +``COMPILER``: (Default: "intel") + Type of compiler invoked during the build step. Currently, this must be set manually (i.e., it is not inherited from the build system in the ``ufs-srweather-app`` directory). Valid values: "intel" "gnu" + + +Thread Affinity Interface +=========================== + +.. note:: + Note that settings for the ``make_grid`` and ``make_orog`` tasks are not included below because they do not use parallelized code. + +.. + COMMENT: The note above is in config_defaults.sh comments, but make_orog does seem to be included below... should I remove it? + +``KMP_AFFINITY_*``: (Default: see below) + + .. code-block:: console + + KMP_AFFINITY_MAKE_OROG="disabled" + KMP_AFFINITY_MAKE_SFC_CLIMO="scatter" + KMP_AFFINITY_MAKE_ICS="scatter" + KMP_AFFINITY_MAKE_LBCS="scatter" + KMP_AFFINITY_RUN_FCST="scatter" + KMP_AFFINITY_RUN_POST="scatter" + + Intel's runtime library can bind OpenMP threads to physical processing units. The interface is controlled using the KMP_AFFINITY environment variable. Thread affinity restricts execution of certain threads to a subset of the physical processing units in a multiprocessor computer. Depending on the system (machine) topology, application, and operating system, thread affinity can have a dramatic effect on the execution speed of a program. Valid values: "scatter" "disabled" "balanced" "compact" "explicit" "none" + + For more information, see the `Intel Development Reference Guide `__. + +``OMP_NUM_THREADS_*``: (Default: see below) + + .. code-block:: console + + OMP_NUM_THREADS_MAKE_OROG="6" + OMP_NUM_THREADS_MAKE_SFC_CLIMO="1" + OMP_NUM_THREADS_MAKE_ICS="1" + OMP_NUM_THREADS_MAKE_LBCS="1" + OMP_NUM_THREADS_RUN_FCST="2" # atmos_nthreads in model_configure + OMP_NUM_THREADS_RUN_POST="1" + + The number of OpenMP threads to use for parallel regions. + +.. + COMMENT: What does the #atmos_nthreads comment mean? Can it be removed? + + +``OMP_STACKSIZE_*``: (Default: see below) + + .. code-block:: console + + OMP_STACKSIZE_MAKE_OROG="2048m" + OMP_STACKSIZE_MAKE_SFC_CLIMO="1024m" + OMP_STACKSIZE_MAKE_ICS="1024m" + OMP_STACKSIZE_MAKE_LBCS="1024m" + OMP_STACKSIZE_RUN_FCST="1024m" + OMP_STACKSIZE_RUN_POST="1024m" + + Controls the size of the stack for threads created by the OpenMP implementation.
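+ +As a hypothetical illustration (the values below are chosen for demonstration only, not as tuned recommendations), a user experimenting with the forecast task's OpenMP settings might override them in ``config.sh`` as follows: + +.. code-block:: console + + OMP_NUM_THREADS_RUN_FCST="4" # hypothetical value; affects PPN_RUN_FCST, which setup.sh computes from NCORES_PER_NODE and OMP_NUM_THREADS + OMP_STACKSIZE_RUN_FCST="2048m" + KMP_AFFINITY_RUN_FCST="scatter" + +As noted in the Workflow Task Parameters section above, ``PPN_RUN_FCST`` is calculated from ``NCORES_PER_NODE`` and ``OMP_NUM_THREADS`` in ``setup.sh``, so the thread count should be chosen with the node size in mind.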
+ -.. include:: ConfigParameters.inc diff --git a/docs/UsersGuide/source/ContributorsGuide.rst b/docs/UsersGuide/source/ContributorsGuide.rst new file mode 100644 index 0000000000..7d5813582c --- /dev/null +++ b/docs/UsersGuide/source/ContributorsGuide.rst @@ -0,0 +1,353 @@ + +.. _ContributorsGuide: + +============================== +SRW App Contributor's Guide +============================== + +.. _Background: + +Background +=========== + +Authoritative branch +----------------------- + +The main development branch for the ``ufs-srweather-app`` repository is ``develop``. The HEAD of ``develop`` reflects the latest development changes. It points to regularly updated hashes for individual sub-components, including the ``regional_workflow``. Pull requests (PRs) will be merged to ``develop``. + +The ``develop`` branch is protected by the code management team: + #. Pull requests for this branch require approval by at least two code reviewers. + #. A code manager should perform the review and the merge, but other contributors are welcome to provide comments/suggestions. + + +Code Management Team +-------------------------- + +Scientists from across multiple labs and organizations have volunteered to review pull requests for the ``develop`` branch: + +..
table:: + + +------------------+------------------------------------------------+ + | **Organization** | **Reviewers** | + +==================+================================================+ + | EMC | Chan-Hoo Jeon (@chan-hoo) | + | | | + | | Ben Blake (@BenjaminBlake-NOAA) | + | | | + | | Ratko Vasic (@RatkoVasic-NOAA) | + +------------------+------------------------------------------------+ + | EPIC | Mark Potts (@mark-a-potts) | + | | | + | | Jong Kim (@jkbk2004) | + +------------------+------------------------------------------------+ + | GLERL/UM | David Wright (@dmwright526) | + +------------------+------------------------------------------------+ + | GSL | Jeff Beck (@JeffBeck-NOAA) | + | | | + | | Gerard Ketefian (@gsketefian) | + | | | + | | Linlin Pan (@panll) | + | | | + | | Christina Holt (@christinaholtNOAA) | + | | | + | | Christopher Harrop (@christopherwharrop-noaa) | + | | | + | | Daniel Abdi (@danielabdi-noaa) | + +------------------+------------------------------------------------+ + | NCAR | Mike Kavulich (@mkavulich) | + | | | + | | Will Mayfield (@willmayfield) | + | | | + | | Jamie Wolff (@jwolff-ncar) | + +------------------+------------------------------------------------+ + | NSSL | Yunheng Wang (@ywangwof) | + +------------------+------------------------------------------------+ + + +.. _ContribProcess: + +Contribution Process +======================== + +The steps below should be followed in order to make changes to the ``develop`` branch of the ``ufs-srweather-app`` repository. Communication with code managers and the code management team throughout the process is encouraged. + + #. **Issue** - Open an issue to document changes. Click `here `__ to open a new ``ufs-srweather-app`` issue or see :numref:`Step %s ` for detailed instructions. + #. **GitFlow** - Follow `GitFlow `__ procedures for development. + #. **Fork the repository** - Read more `here `__ about forking in GitHub. + #. **Create a branch** - Create a branch in your fork of the authoritative repository. Follow `GitFlow `__ conventions when creating the branch. Branches should be named as follows, where [name] is a one-word description of the branch: + + * **bugfix/[name]:** Fixes a demonstrably incorrect portion of code + * **feature/[name]:** Adds a new feature to the code + * **enhancement/[name]:** Improves an existing portion of the code + * **textonly/[name]:** Changes elements of the repository that do not impact program output or log files (e.g., changes to README, documentation, comments, changing quoted Registry elements, white space alignment). Any change which does not impact the compiled code in any way should fall under this category. + + #. **Development** - Perform and test changes in the branch. Document work in the issue and mention the issue number in commit messages to link your work to the issue (e.g., ``commit -m "Issue #23 - "``). Test code modifications on as many platforms as possible, and request help with further testing from the code management team when unable to test on all platforms. Document changes to the workflow and capabilities (either in the ``.rst`` files or separately) so that the SRW App documentation stays up-to-date. + #. **Pull request** - When ready to merge changes back to the ``develop`` branch, the code developer should initiate a pull request (PR) of the feature branch into the ``develop`` branch. Read `here `__ about pull requests in GitHub. When a PR is initiated, the :ref:`PR Template