From 4f5573b317df8652b7497e1b17ab14fabc642a91 Mon Sep 17 00:00:00 2001 From: Gillian Petro <96886803+gspetro-NOAA@users.noreply.github.com> Date: Mon, 30 Oct 2023 16:38:32 -0400 Subject: [PATCH] [release/public-v2.2.0]: Documentation Updates (#950) Documentation updates to the User's Guide for the SRW App v2.2 release. --- README.md | 13 +- .../source/BackgroundInfo/CCPPUpdates.rst | 95 ++ .../source/BackgroundInfo/Components.rst | 36 +- .../source/BackgroundInfo/Introduction.rst | 79 +- .../BackgroundInfo/TechnicalOverview.rst | 44 +- .../source/BuildingRunningTesting/AQM.rst | 27 +- .../BuildingRunningTesting/BuildSRW.rst | 194 +++-- .../ContainerQuickstart.rst | 85 +- .../BuildingRunningTesting/Quickstart.rst | 58 +- .../source/BuildingRunningTesting/RunSRW.rst | 449 ++++++---- .../BuildingRunningTesting/Tutorial.rst | 65 +- .../source/BuildingRunningTesting/VXCases.rst | 85 +- .../BuildingRunningTesting/WE2Etests.rst | 215 ++--- .../CustomizingTheWorkflow/ConfigWorkflow.rst | 30 +- .../InputOutputFiles.rst | 39 +- .../CustomizingTheWorkflow/LAMGrids.rst | 115 ++- docs/UsersGuide/source/Reference/FAQ.rst | 70 +- docs/UsersGuide/source/Reference/Glossary.rst | 8 +- docs/UsersGuide/source/conf.py | 35 +- docs/UsersGuide/source/index.rst | 4 +- docs/UsersGuide/source/tables/CCPPUpdates.rst | 58 -- docs/UsersGuide/source/tables/Tests.csv | 4 +- .../source/tables/fix_file_list.rst | 821 ------------------ 23 files changed, 1006 insertions(+), 1623 deletions(-) create mode 100644 docs/UsersGuide/source/BackgroundInfo/CCPPUpdates.rst delete mode 100644 docs/UsersGuide/source/tables/CCPPUpdates.rst delete mode 100644 docs/UsersGuide/source/tables/fix_file_list.rst diff --git a/README.md b/README.md index d4268e5e80..067b2a16b2 100644 --- a/README.md +++ b/README.md @@ -1,18 +1,21 @@ # UFS Short-Range Weather Application -The Unified Forecast System (UFS) is a community-based, coupled, comprehensive Earth modeling system. It is designed to be the source system for NOAA’s operational numerical weather prediction applications while enabling research, development, and contribution opportunities for the broader Weather Enterprise. For more information about the UFS, visit the UFS Portal at https://ufscommunity.org/. +The Unified Forecast System (UFS) is a community-based, coupled, comprehensive Earth modeling system. NOAA's operational model suite for numerical weather prediction (NWP) is quickly transitioning to the UFS from a number of different modeling systems. The UFS enables research, development, and contribution opportunities within the broader Weather Enterprise (including government, industry, and academia). For more information about the UFS, visit the UFS Portal at https://ufscommunity.org/. -The UFS includes multiple applications (see a complete list at https://ufscommunity.org/science/aboutapps/) that support different forecast durations and spatial domains. This documentation describes the development branch of the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The development branch of the application is continually evolving as the system undergoes open development. The latest SRW App release (v2.1.0) represents a snapshot of this continuously evolving system. +The UFS includes multiple applications (see a complete list at https://ufscommunity.org/science/aboutapps/) that support different forecast durations and spatial domains. 
The UFS Short-Range Weather (SRW) Application targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The development branch of the application is continually evolving as the system undergoes open development. This SRW App v2.2.0 release represents a snapshot of this continuously evolving system. -The UFS SRW App User's Guide associated with the development branch is at: https://ufs-srweather-app.readthedocs.io/en/develop/, while the guide specific to the SRW App v2.1.0 release can be found at: https://ufs-srweather-app.readthedocs.io/en/release-public-v2.1.0/. The repository is at: https://github.com/ufs-community/ufs-srweather-app. +The UFS SRW App User's Guide associated with the development branch is at: https://ufs-srweather-app.readthedocs.io/en/develop/, while the guide specific to the SRW App v2.2.0 release can be found at: https://ufs-srweather-app.readthedocs.io/en/release-public-v2.2.0/. The repository is at: https://github.com/ufs-community/ufs-srweather-app. For instructions on how to clone the repository, build the code, and run the workflow, see: -https://github.com/ufs-community/ufs-srweather-app/wiki/Getting-Started +- https://ufs-srweather-app.readthedocs.io/en/release-public-v2.2.0/ +- https://github.com/ufs-community/ufs-srweather-app/wiki/Getting-Started For a debugging guide for users and developers in the field of Earth System Modeling, please see: https://epic.noaa.gov/wp-content/uploads/2022/12/Debugging-Guide.pdf -UFS Development Team. (2022, Nov. 17). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v2.1.0). Zenodo. https://doi.org/10.5281/zenodo.7277602 +The SRW App v2.2.0 citation is as follows and should be used when presenting results based on research conducted with the App: + +UFS Development Team. (2023, Oct. 30). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v2.2.0). Zenodo. https://doi.org/10.5281/zenodo.10015544 [![Python unittests](https://github.com/ufs-community/ufs-srweather-app/actions/workflows/python_unittests.yaml/badge.svg)](https://github.com/ufs-community/ufs-srweather-app/actions/workflows/python_unittests.yaml) [![Python functional tests](https://github.com/ufs-community/ufs-srweather-app/actions/workflows/python_func_tests.yaml/badge.svg)](https://github.com/ufs-community/ufs-srweather-app/actions/workflows/python_func_tests.yaml) diff --git a/docs/UsersGuide/source/BackgroundInfo/CCPPUpdates.rst b/docs/UsersGuide/source/BackgroundInfo/CCPPUpdates.rst new file mode 100644 index 0000000000..c9737f7737 --- /dev/null +++ b/docs/UsersGuide/source/BackgroundInfo/CCPPUpdates.rst @@ -0,0 +1,95 @@ +:orphan: + +.. _CCPPUpdates: + +================================================ +CCPP Updates for the SRW App v2.2.0 Release +================================================ + +Here is what's new in CCPP Physics for the UFS SRW v2.2.0 public release. + +General Updates +================= + +* Added RAP suite (``FV3_RAP``) as a new supported suite (documentation `here `__) +* Added the Community Land Model (CLM) Lake model in the HRRR suite (``FV3_HRRR``) + +Thompson Microphysics Scheme +============================== + +* Reduced ice generation supersaturation requirement from 0.25 to 0.15 to generate more ice at the upper levels and reduce the outgoing longwave radiation bias +* Divided cloud number concentration into two parts (over land and others). 
Reduced number concentration over ocean to a smaller number (50/L) from its previous default (100/L). Both changes were made to reduce excessive surface downward shortwave radiative flux off coastal regions including the Southeast Pacific +* Implemented small fixes to the minimum size of snow and collision constants + +.. note:: + + The above improvements were tested with the non-aerosol option, so results with the aerosol-aware Thompson (used in the SRW App) may vary. + + +NoahMP Land Surface Model +=========================== + +* Option for using the unified frozen precipitation fraction in NoahMP. +* Diagnostic 2-meter temperature and humidity now based on vegetation and bare-ground tiles (new namelist option ``iopt_diag``) +* Bug fixes for GFS-based thermal roughness length scheme +* New soil color dataset introduced to improve soil albedo to reduce the large warm bias found in the Sahel desert +* Wet leaf contribution factor is included +* Leaf-area index now depends on momentum roughness length + + +RUC Land Surface Model +======================== + +* Initialization of land and ice emissivity and albedo with consideration of partial snow cover +* Initialization of water vapor mixing ratio over land ice +* Initialization of fractions of soil and vegetation types in a grid cell +* Changes in the computation of a flag for sea ice: set to true only if ``flag_cice=.false`` (atmosphere uncoupled from the sea ice model). +* Separate variables for sea ice, for example: ``snowfallac`` is replaced with ``snowfallac_ice`` +* Solar angle dependence of albedo for snow-free land +* Stochastic physics perturbations (SPP) introduced for emissivity, albedo and vegetation fraction +* Coefficient in soil resistance formulation (Sakaguchi and Zeng, 2009) raised from 0.7 to 1.0 to increase soil resistance to evaporation +* Computation of snow cover fraction and snow thermal conductivity updated + +GFS Scale-Aware TKE-EDMF PBL and Cumulus Schemes +================================================== + +* Parameterization to represent environmental wind shear effect added to reduce excessively high hurricane intensity +* Entrainment rates enhanced proportionally to the sub-cloud or PBL-mean TKE when TKE is larger than a threshold value +* Entrainment rate is increased as a function of vegetation fraction and surface roughness length to enhance underestimated CAPE + +MYNN-EDMF PBL Scheme +====================== + +* Small increase of buoyancy length scale in convective environments +* Patch for ensuring non-zero cloud fractions for all grid cells where cloud mixing ratio is greater than 1e-6 or ice mixing ratio is greater than 1e-9 + +Subgrid-Scale (SGS) Clouds Scheme +=================================== + +* Bug fix for cloud condensate input into RRTMG radiation +* New code section for use with SAS convection scheme +* Cloud fraction now computed as a mix between the area-dependent form and the modified Chaboureau and Bechtold (2005) form +* Adjusted limit for the boundary flux functions + +MYNN Surface-layer Scheme +=========================== + +* Reintroduced friction velocity averaging over water to reduce noise in 10-m winds in the hurricane regime + +Grell-Freitas Scale and Aerosol Aware Convection Scheme +========================================================= + +* Update for aerosol-awareness (experimental) +* Scale-awareness turned off when explicit microphysics is not active anywhere in the column +* Convection is completely suppressed at grid points where the MYNN PBL scheme produces shallow 
convection +* Radar reflectivity considers the mass flux PDF as well as whether scale-awareness is turned on at the grid point in question + +Unified Gravity Wave Physics Scheme +===================================== + +* Optional diagnostics for tendencies can now be computed. They can be switched on by setting the following namelist variables to ``.true.``: ``ldiag3d`` and ``ldiag_ugwp`` + + +.. attention:: + + The improvements in Thompson cloud microphysics, NoahMP land surface model, GFS TKE-EDMF and cumulus schemes were tested in the UFS global configuration, so results in the UFS limited-area configuration (SRW) may vary. diff --git a/docs/UsersGuide/source/BackgroundInfo/Components.rst b/docs/UsersGuide/source/BackgroundInfo/Components.rst index fd362ec691..25b67d20b9 100644 --- a/docs/UsersGuide/source/BackgroundInfo/Components.rst +++ b/docs/UsersGuide/source/BackgroundInfo/Components.rst @@ -6,21 +6,21 @@ SRW Application Components The SRW Application assembles a variety of components, including: -* Pre-processor Utilities & Initial Conditions +* UFS Utilities * UFS Weather Forecast Model * Unified Post Processor * METplus Verification Suite * Unified Workflow Tools -* Build System and Workflow +* Build system and workflow -These components are documented within this User's Guide and supported through the `GitHub Discussions `__ forum. +These components are documented within this User's Guide and supported through the `GitHub Discussions `__ forum. .. _Utils: UFS Preprocessing Utilities (UFS_UTILS) ========================================== -The SRW Application includes a number of pre-processing utilities (UFS_UTILS) that initialize and prepare the model. Since the SRW App provides forecast predictions over a limited area (rather than globally), these utilities generate a regional grid (``regional_esg_grid/make_hgrid``) along with :term:`orography` (``orog``) and surface climatology (``sfc_climo_gen``) files on that grid. Grids include a strip, or "halo," of six cells that surround the regional grid and feed in lateral boundary condition data. Since different grid and orography files require different numbers of :term:`halo` cells, additional utilities handle topography filtering and shave the number of halo points (based on downstream workflow component requirements). The pre-processing software :term:`chgres_cube` is used to convert the raw external model data into initial and lateral boundary condition files in :term:`netCDF` format. These are needed as input to the :term:`FV3`-:term:`LAM`. Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS Technical Documentation `__ and in the `UFS_UTILS Scientific Documentation `__. +The SRW Application includes a number of pre-processing utilities (UFS_UTILS) that initialize and prepare the model. Since the SRW App provides forecast predictions over a limited area (rather than globally), these utilities generate a regional grid (``regional_esg_grid/make_hgrid``) along with :term:`orography` (``orog``) and surface climatology (``sfc_climo_gen``) files on that grid. Grids include a strip, or "halo," of six cells that surround the regional grid and feed in lateral boundary condition data. Since different grid and orography files require different numbers of :term:`halo` cells, additional utilities handle topography filtering and shave the number of halo points (based on downstream workflow component requirements).
The pre-processing software :term:`chgres_cube` is used to convert the raw external model data into initial and lateral boundary condition files in :term:`netCDF` format. These are needed as input to the :term:`FV3` limited area model (:term:`LAM`). Additional information about the UFS pre-processing utilities can be found in the :doc:`UFS_UTILS Technical Documentation ` and in the `UFS_UTILS Scientific Documentation `__. The SRW Application can be initialized from a range of operational initial condition files. It is possible to initialize the model from the Global Forecast System (:term:`GFS`), North American Mesoscale (:term:`NAM`) Forecast System, Rapid Refresh (:term:`RAP`), and High-Resolution Rapid Refresh (:term:`HRRR`) files in Gridded Binary v2 (:term:`GRIB2`) format. GFS files also come in :term:`NEMSIO` format for past dates. @@ -31,14 +31,14 @@ The SRW Application can be initialized from a range of operational initial condi Forecast Model ============== -The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere (:term:`FV3`) dynamical core configured with a Limited Area Model (:term:`LAM`) capability (:cite:t:`BlackEtAl2021`). The :term:`dynamical core` is the computational part of a model that solves the equations of fluid motion. A User's Guide for the UFS :term:`Weather Model` can be accessed `here `__. +The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere (:term:`FV3`) dynamical core configured with a Limited Area Model (:term:`LAM`) capability (:cite:t:`BlackEtAl2021`). The :term:`dynamical core` is the computational part of a model that solves the equations of fluid motion. A User's Guide for the UFS :term:`Weather Model` can be accessed :doc:`here `. Supported model resolutions in this release include 3-, 13-, and 25-km predefined contiguous U.S. (:term:`CONUS`) domains, each with 127 vertical levels. Preliminary tools for users to define their own domain are also available in the release with full, formal support of these tools to be provided in future releases. The Extended Schmidt Gnomonic (ESG) grid is used with the FV3-LAM, which features relatively uniform grid cells across the entirety of the domain. Additional information about the FV3 dynamical core can be found in the `scientific documentation `__, the `technical documentation `__, and on the `NOAA Geophysical Fluid Dynamics Laboratory website `__. Model Physics --------------- -The Common Community Physics Package (CCPP), described `here `__, supports interoperable atmospheric physics and land surface model options. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. The most recent SRW App release (v2.1.0) included four supported physics suites, and a fifth has since been added: FV3_RRFS_v1beta, FV3_GFS_v16, FV3_WoFS_v0, FV3_HRRR, and FV3_RAP (new!). The FV3_RRFS_v1beta physics suite is being tested for use in the future operational implementation of the Rapid Refresh Forecast System (:term:`RRFS`) planned for 2023-2024, and the FV3_GFS_v16 is an updated version of the physics suite used in the operational Global Forecast System (GFS) v16. A detailed list of CCPP updates since the SRW App v2.0.0 release is available :ref:`here `. A full scientific description of CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `__, and CCPP technical aspects are described in the `CCPP Technical Documentation `__. 
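For readers who want to see how one of these physics suites is selected in practice, a minimal sketch of the relevant ``config.yaml`` fragment follows. This is illustrative only; the ``CCPP_PHYS_SUITE`` variable is the one documented later in this guide, but the surrounding section layout should be checked against the sample configuration files shipped with the release:

.. code-block:: yaml

   # Illustrative sketch: selects one of the supported CCPP suites for an experiment.
   # Verify the section layout against ush/config.community.yaml for your release.
   workflow:
     CCPP_PHYS_SUITE: FV3_HRRR   # e.g., FV3_GFS_v16, FV3_RRFS_v1beta, FV3_WoFS_v0, FV3_RAP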
The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each of the supported suites. Additional information on Stochastic Physics options is available `here `__. Additionally, a CCPP single-column model (`CCPP-SCM `__) option has also been developed as a child repository. Users can refer to the `CCPP Single Column Model User and Technical Guide `__ for more details. This CCPP-SCM user guide contains a Quick Start Guide with instructions for obtaining the code, compiling, and running test cases, which include five standard test cases and two additional FV3 replay cases (refer to section 5.2 in the CCPP-SCM user guide for more details). Moreover, the CCPP-SCM supports a precompiled version in a docker container, allowing it to be easily executed on NOAA's cloud computing platforms without any issues (see section 2.5 in the CCPP-SCM user guide for more details). +The Common Community Physics Package (CCPP), described `here `__, supports interoperable atmospheric physics and land surface model options. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. The SRW App v2.2.0 release includes five supported physics suites: FV3_RRFS_v1beta, FV3_GFS_v16, FV3_WoFS_v0, FV3_HRRR, and FV3_RAP (new!). The FV3_RRFS_v1beta physics suite is being tested for use in the future operational implementation of the Rapid Refresh Forecast System (:term:`RRFS`) planned for 2023-2024, and the FV3_GFS_v16 is an updated version of the physics suite used in the operational Global Forecast System (GFS) v16. A detailed list of CCPP updates since the SRW App v2.1.0 release is available :ref:`here `. A full scientific description of CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `__, and CCPP technical aspects are described in the :doc:`CCPP Technical Documentation `. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each of the supported suites. Additional information on Stochastic Physics options is available :doc:`here `. .. note:: SPP is currently only available for specific physics schemes used in the RAP/HRRR physics suite. Users need to be aware of which physics suite definition file (:term:`SDF`) is chosen when turning this option on. Among the supported physics suites, the full set of parameterizations can only be used with the ``FV3_HRRR`` option for ``CCPP_PHYS_SUITE``. @@ -48,7 +48,7 @@ The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data Unified Post Processor (UPP) ============================== -The Unified Post Processor (:term:`UPP`) processes raw output from a variety of numerical weather prediction (:term:`NWP`) models. In the SRW App, the UPP converts model output data from the model's native :term:`netCDF` format to :term:`GRIB2` format on standard isobaric vertical coordinates. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User's Guide `__. Output from UPP can be used with visualization, plotting, and verification packages or in further downstream post-processing (e.g., statistical post-processing techniques). +The Unified Post Processor (:term:`UPP`) processes raw output from a variety of numerical weather prediction (:term:`NWP`) models. 
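As a practical aside for readers new to UPP output: once the post-processor has written :term:`GRIB2` files (described below), a quick way to confirm their contents is the ``wgrib2`` utility, if it is installed on the system. The file name here is purely hypothetical; actual output names depend on the experiment configuration:

.. code-block:: console

   # List the records in a UPP GRIB2 output file (file name is hypothetical).
   wgrib2 -s rrfs.t00z.prslev.f003.grib2 | head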
In the SRW App, the UPP converts model output data from the model's native :term:`netCDF` format to :term:`GRIB2` format on standard isobaric vertical coordinates. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the :doc:`UPP User's Guide `. Output from UPP can be used with visualization, plotting, and verification packages or in further downstream post-processing (e.g., statistical post-processing techniques). .. _MetplusComponent: @@ -59,20 +59,18 @@ The Model Evaluation Tools (MET) package is a set of statistical verification to The METplus verification framework has been integrated into the SRW App to facilitate forecast evaluation. METplus is a verification framework that spans a wide range of temporal scales (warn-on-forecast to climate) and spatial scales (storm to global). It is supported by the `Developmental Testbed Center (DTC) `__. -METplus *installation* is not included as part of the build process for the most recent release of the SRW App. However, METplus is preinstalled on many `Level 1 & 2 `__ systems; existing builds can be viewed `here `__. +METplus comes preinstalled with :term:`spack-stack` but can also be installed on other systems individually or as part of :term:`HPC-Stack` installation. Users on systems without a previous installation of METplus can follow the :ref:`MET Installation Guide ` and :ref:`METplus Installation Guide ` for individual installation. Currently, METplus *installation* is only supported as part of spack-stack installation; users attempting to install METplus individually or as part of HPC-Stack will need to direct assistance requests to the METplus team. However, METplus *use* is supported on any system with a functioning METplus installation. -METplus can be installed on other systems individually or as part of :term:`HPC-Stack` installation. Users on systems without a previous installation of METplus can follow the `MET Installation Guide `__ and `METplus Installation Guide `__ for individual installation. Currently, METplus *installation* is not a supported feature for this release of the SRW App. However, METplus *use* is supported on systems with a functioning METplus installation. +The core components of the METplus framework include the statistical driver, MET, the associated database and display systems known as METviewer and METexpress, and a suite of Python wrappers to provide low-level automation and examples, also called use cases. MET is a set of verification tools developed for use by the :term:`NWP` community. It matches up grids with either gridded analyses or point observations and applies configurable methods to compute statistics and diagnostics. Extensive documentation is available in the :doc:`METplus User's Guide ` and :doc:`MET User's Guide `. Documentation for all other components of the framework can be found at the *Documentation* link for each component on the METplus `downloads `__ page. -The core components of the METplus framework include the statistical driver, MET, the associated database and display systems known as METviewer and METexpress, and a suite of Python wrappers to provide low-level automation and examples, also called use cases. MET is a set of verification tools developed for use by the :term:`NWP` community. It matches up grids with either gridded analyses or point observations and applies configurable methods to compute statistics and diagnostics. Extensive documentation is available in the `METplus User's Guide `__ and `MET User's Guide `__. 
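To make the connection to the SRW App workflow concrete, verification tasks are switched on by adding their task-group files to the ``rocoto: tasks: taskgroups:`` list in ``config.yaml``, just as other task groups are added. A sketch is shown below; the exact file names under ``parm/wflow`` (and whether ensemble verification groups are also needed) should be confirmed for the release in use:

.. code-block:: yaml

   # Illustrative sketch: confirm the task-group file names in parm/wflow for your release.
   rocoto:
     tasks:
       taskgroups: '{{ ["parm/wflow/prep.yaml", "parm/wflow/coldstart.yaml", "parm/wflow/post.yaml", "parm/wflow/verify_pre.yaml", "parm/wflow/verify_det.yaml"]|include }}'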
Documentation for all other components of the framework can be found at the Documentation link for each component on the METplus `downloads `__ page. +Among other techniques, MET provides the capability to compute standard verification scores for comparing deterministic gridded model data to point-based and gridded observations. It also provides ensemble and probabilistic verification methods for comparing gridded model data to point-based or gridded observations. Verification tasks to accomplish these comparisons are defined in the SRW App in :numref:`Table %s `. Currently, the SRW App supports the use of :term:`NDAS` observation files (which include conventional point-based surface and upper-air data) `in prepBUFR format `__ for point-based verification. It also supports gridded Climatology-Calibrated Precipitation Analysis (:term:`CCPA`) data for accumulated precipitation evaluation and Multi-Radar/Multi-Sensor (:term:`MRMS`) gridded analysis data for composite reflectivity and :term:`echo top` verification. -Among other techniques, MET provides the capability to compute standard verification scores for comparing deterministic gridded model data to point-based and gridded observations. It also provides ensemble and probabilistic verification methods for comparing gridded model data to point-based or gridded observations. Verification tasks to accomplish these comparisons are defined in the SRW App in :numref:`Table %s `. Currently, the SRW App supports the use of :term:`NDAS` observation files (which include conventional point-based surface and upper-air data) in `prepBUFR format `__ for point-based verification. It also supports gridded Climatology-Calibrated Precipitation Analysis (:term:`CCPA`) data for accumulated precipitation evaluation and Multi-Radar/Multi-Sensor (:term:`MRMS`) gridded analysis data for composite reflectivity and :term:`echo top` verification. - -METplus is being actively developed by :term:`NCAR`/Research Applications Laboratory (RAL), NOAA/Earth Systems Research Laboratories (ESRL), and NOAA/Environmental Modeling Center (:term:`EMC`), and it is open to community contributions. More details about METplus can be found in :numref:`Chapter %s ` and on the `METplus website `__. +METplus is being actively developed by :term:`NCAR`/Research Applications Laboratory (RAL), NOAA/Earth Systems Research Laboratories (`ESRL `__), and NOAA/Environmental Modeling Center (:term:`EMC`), and it is open to community contributions. More details about METplus can be found on the `METplus website `__. Air Quality Modeling (AQM) Utilities ======================================= -AQM Utilities (AQM-utils) includes the utility executables and python scripts to run SRW-AQM (Online-CMAQ). +AQM Utilities (AQM-utils) include the utility executables and python scripts to run SRW-AQM (Online-:term:`CMAQ`). For more information on AQM-utils, visit the GitHub repository at https://github.com/NOAA-EMC/AQM-utils. .. _nexus: @@ -89,22 +87,22 @@ For more information on NEXUS, visit the GitHub repository at https://github.com Unified Workflow Tools ======================== -The Unified Workflow (UW) is a set of tools intended to unify the workflow for various UFS applications under one framework. The UW toolkit currently includes templater and config tools, which have been incorporated into the SRW App workflow and will soon be incorporated into other UFS repositories. Additional tools are under development. 
More details about the UW can be found in the `workflow-tools `__ GitHub repository and in the `UW Documentation `__. +The Unified Workflow (UW) is a set of tools intended to unify the workflow for various UFS applications under one framework. The UW toolkit currently includes templater and configuration (config) tools, which have been incorporated into the SRW App workflow and will soon be incorporated into other UFS repositories. Additional tools are under development. More details about the UW can be found in the `workflow-tools `__ GitHub repository and in the `UW Documentation `__. Build System and Workflow ========================= -The SRW Application has a portable, CMake-based build system that packages together all the components required to build the SRW Application. This build system collects the components necessary for running the end-to-end SRW Application, including the UFS Weather Model and the pre- and post-processing software. Additional libraries necessary for the application (e.g., :term:`NCEPLIBS-external` and :term:`NCEPLIBS`) are not included in the SRW Application build system but are available pre-built on pre-configured platforms. On other systems, they can be installed via the HPC-Stack (see :doc:`HPC-Stack Documentation `). There is a small set of system libraries and utilities that are assumed to be present on the target computer: the CMake build software; a Fortran, C, and C++ compiler; and an :term:`MPI` library. +The SRW Application has a portable, CMake-based build system that packages together all the components required to build the SRW Application. This build system collects the components necessary for running the end-to-end SRW Application, including the UFS Weather Model and the pre- and post-processing software. Additional libraries necessary for the application (e.g., :term:`NCEPLIBS-external` and :term:`NCEPLIBS`) are not included in the SRW Application build system but are available pre-built on pre-configured platforms. On other systems, they can be installed via spack-stack (see :doc:`spack-stack Documentation `). There is a small set of :ref:`prerequisite system libraries ` and utilities that are assumed to be present on the target computer: the CMake build software; a Fortran, C, and C++ compiler; and an :term:`MPI` library. Once built, users can generate a Rocoto-based workflow that will run each task in the proper sequence (see :numref:`Chapter %s ` or the `Rocoto documentation `__ for more information on Rocoto and workflow management). If Rocoto and/or a batch system is not present on the available platform, the individual components can be run in a stand-alone, command line fashion with provided run scripts. The SRW Application allows users to configure various elements of the workflow. For example, users can modify the parameters of the atmospheric model, such as start and end dates, duration, integration time step, and the physics suite used for the simulation. It also allows for configuration of other elements of the workflow; for example, users can choose whether to run some or all of the pre-processing, forecast model, and post-processing steps. More information on how to configure the workflow is available in :numref:`Section %s `. -An optional Python plotting task is also included to create basic visualizations of the model output. The task outputs graphics in PNG format for several standard meteorological variables on the pre-defined :term:`CONUS` domain. 
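To give a sense of how the workflow configuration described above looks in practice, a minimal ``config.yaml`` sketch with illustrative values (patterned on the out-of-the-box community configuration) is shown here; key names should be checked against ``ush/config.community.yaml`` for the release in use:

.. code-block:: yaml

   # Illustrative values only; see ush/config.community.yaml for the authoritative template.
   workflow:
     EXPT_SUBDIR: test_community
     PREDEF_GRID_NAME: RRFS_CONUS_25km
     DATE_FIRST_CYCL: '2019061518'
     DATE_LAST_CYCL: '2019061518'
     FCST_LEN_HRS: 12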
A difference plotting option is also included to visually compare two runs for the same domain and resolution. These plots may be used to perform a visual check to verify that the application is producing reasonable results. Configuration instructions are provided in :numref:`Section %s `. +An optional Python plotting task can also be included in the workflow to create basic visualizations of the model output. The task outputs graphics in PNG format for several standard meteorological variables on the pre-defined :term:`CONUS` domain. A difference plotting option is also included to visually compare two runs for the same domain and resolution. These plots may be used to perform a visual check to verify that the application is producing reasonable results. Configuration instructions are provided in :numref:`Section %s `. -The SRW Application has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Computing (HPC) systems (e.g., Hera, Orion, Jet); the National Center for Atmospheric Research (:term:`NCAR`) Cheyenne system; cloud environment, and generic Linux and MacOS systems using Intel and GNU compilers. Four `levels of support `__ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited-test (Level 3), and build-only (Level 4) platforms. +The SRW Application has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Computing (HPC) systems (e.g., Hera, Jet); the National Center for Atmospheric Research (:term:`NCAR`) Derecho system; cloud environments; and generic Linux and MacOS systems using Intel and GNU compilers. Four `levels of support `__ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited-test (Level 3), and build-only (Level 4) platforms. -Preconfigured (Level 1) systems already have the required external libraries available in a central location (via :term:`HPC-Stack` or :term:`spack-stack`). The SRW Application is expected to build and run out-of-the-box on these systems, and users can :ref:`download the SRW App code ` without first installing prerequisites. +Preconfigured (Level 1) systems already have the required external libraries available in a central location (via :term:`spack-stack` or :term:`HPC-Stack`). The SRW Application is expected to build and run out-of-the-box on these systems, and users can :ref:`download the SRW App code ` without first installing prerequisites. Configurable platforms (Level 2) are platforms where all of the required libraries for building the SRW Application are expected to install successfully but are not available in a central location. Users will need to install the required libraries as part of the :ref:`SRW Application build ` process. Applications and models are expected to build and run once the required libraries are built. Release testing is conducted on these systems to ensure that the SRW App runs smoothly there. diff --git a/docs/UsersGuide/source/BackgroundInfo/Introduction.rst b/docs/UsersGuide/source/BackgroundInfo/Introduction.rst index f4aa344fa7..926dfd61b8 100644 --- a/docs/UsersGuide/source/BackgroundInfo/Introduction.rst +++ b/docs/UsersGuide/source/BackgroundInfo/Introduction.rst @@ -8,30 +8,42 @@ The Unified Forecast System (:term:`UFS`) is a community-based, coupled, compreh The UFS includes `multiple applications `__ that support different forecast durations and spatial domains. 
This documentation describes the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The most recent SRW Application includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through the `GitHub Discussions `__ forum. The SRW App also includes support for a verification package (METplus) for both deterministic and ensemble simulations and support for four stochastically perturbed physics schemes. -Since the v2.1.0 release, developers have added a variety of features: +Since the last release, developers have added a variety of features: * Bug fixes since the v2.1.0 release - * Pre-implementation Rapid Refresh Forecast System (RRFS) forecast configurations - * Air Quality Modeling (AQM) capabilities - * Updates to :term:`CCPP` that target the top of the ``main`` branch (which is ahead of CCPP v6.0.0). See :ref:`this page ` for a detailed summary of updates that came in ahead of the v2.1.0 release. - * Support for the :term:`UPP` inline post option (see :ref:`here `) - * Documentation updates to reflect the changes above - -The SRW App v2.1.0 citation is as follows and should be used when presenting results based on research conducted with the App: - -UFS Development Team. (2022, Nov. 17). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v2.1.0). Zenodo. https://doi.org/10.5281/zenodo.7277602 + * Addition of the supported ``FV3_RAP`` physics suite (`PR #811 `__) and support for the ``RRFS_NA_13km`` predefined grid + * Addition of ``FV3_GFS_v17_p8`` physics suite (`PR #574 `__) + * Updates to :term:`CCPP` that target the top of the ``main`` branch (which is ahead of CCPP v6.0.0). See :ref:`this page ` for a detailed summary of updates that came in ahead of the v2.2.0 release. 
+ * Expansion of `Level 1 platforms `__ to include Derecho, Hercules, and Gaea C5 (PRs `#894 `__, `#898 `__, `#911 `__) + * Transition to spack-stack modulefiles for most supported platforms to align with the UFS WM shift to spack-stack (PRs `#913 `__ and `#941 `__) + * Overhaul of WE2E testing suite (see, e.g., PRs `#686 `__, `#732 `__, `#864 `__, `#871 `__) + * Improvements to the CI/CD automated testing pipeline (see, e.g., PRs `#707 `__ and `#847 `__) + * Incorporation of additional METplus verification capabilities (PRs `#552 `__, `#614 `__, `#757 `__, `#853 `__) + * Integration of the Unified Workflow's templater tool (`PR #793 `__) + * Ability to create a user-defined custom workflow (`PR #676 `__) + * Option to use a custom vertical coordinate file with different distribution of vertical layers (`PR #813 `__) and :ref:`documentation on how to use this feature ` (`PR #888 `__) + * Incorporation of plotting tasks into the workflow (PR `#482 `__); addition of ability to plot on both CONUS and smaller regional grid (`PR #560 `__) + * Addition of a sample verification case (`PR #500 `__) with :ref:`documentation ` + * A new :ref:`tutorial chapter ` in the documentation (`PR #584 `__) + * Incorporation of `UFS Case Studies `__ within the WE2E framework (PRs `#736 `__ and `#822 `__) + * Air Quality Modeling (AQM) capabilities (unsupported but available; see `PR #613 `__) + * Miscellaneous documentation updates to reflect the changes above + +The SRW App v2.2.0 citation is as follows and should be used when presenting results based on research conducted with the App: + +UFS Development Team. (2023, Oct. 30). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v2.2.0). Zenodo. https://doi.org/10.5281/zenodo.10015544 User's Guide Organization ============================ -The SRW Application documentation is organized into four sections: *Background Information*; *Building, Running, and Testing the SRW App*; *Customizing the Workflow*; and *Reference*. +The SRW Application documentation is organized into four sections: (1) *Background Information*; (2) *Building, Running, and Testing the SRW App*; (3) *Customizing the Workflow*; and (4) *Reference*. Background Information ------------------------- * This **Introduction** section explains how the SRW App documentation is organized, how to use this guide, and where to find user support and component documentation. - * :numref:`Section %s: Technical Overview ` provides a technical overview, including SRW App prerequisites and an overview of the code directory structure. - * :numref:`Section %s: SRW Application Components ` provides a detailed description of the application components, including optional application components. + * :numref:`Section %s: Technical Overview ` provides technical information about the SRW App, including prerequisites and an overview of the code directory structure. + * :numref:`Section %s: SRW Application Components ` provides a description of the application components, including optional components. Building, Running, and Testing the SRW App -------------------------------------------- @@ -57,7 +69,7 @@ Customizing the Workflow * :numref:`Section %s: Workflow Parameters ` documents all of the user-configurable experiment parameters that can be set in the user configuration file (``config.yaml``). * :numref:`Section %s: Input & Output Files ` describes application input and output files, as well as information on where to get publicly available data. 
- * :numref:`Section %s: Limited Area Model (LAM) Grids ` describes the SRW App predefined grids in detail and explains how to create a custom user-generated grid. + * :numref:`Section %s: Limited Area Model (LAM) Grids ` describes the SRW App predefined grids, explains how to create a custom user-generated grid, and provides information on using a custom distribution of vertical levels. * :numref:`Section %s: Defining an SRW App Workflow ` explains how to build a customized SRW App workflow XML file. * :numref:`Section %s: Template Variables ` explains how to use template variables. @@ -91,44 +103,42 @@ A list of available component documentation is shown in :numref:`Table %s `__ repository should follow the guidelines contained in the `SRW App Contributor's Guide `__. For code to be accepted into a component repository, users must follow the code management rules of that component's authoritative repository. These rules are usually outlined in the User's Guide (see :numref:`Table %s `) or GitHub wiki for each respective repository (see :numref:`Table %s `). +utilities, model code, and infrastructure. As described above, users can post issues in the SRW App to report bugs or to announce upcoming contributions to the code base. Contributions to the `ufs-srweather-app `__ repository should follow the guidelines contained in the `SRW App Contributor's Guide `__. Additionally, users can file issues in component repositories for contributions that directly concern those repositories. For code to be accepted into a component repository, users must follow the code management rules of that component's authoritative repository. These rules are usually outlined in the component's User's Guide (see :numref:`Table %s `) or GitHub wiki for each respective repository (see :numref:`Table %s `). Future Direction ================= Users can expect to see incremental improvements and additional capabilities in upcoming releases of the SRW Application to enhance research opportunities and support operational forecast implementations. Planned enhancements include: +* Inclusion of data assimilation and forecast restart/cycling capabilities via :term:`JEDI`. * A more extensive set of supported developmental physics suites. * A larger number of pre-defined domains/resolutions and a *fully supported* capability to create a user-defined domain. -* Add user-defined vertical levels (number and distribution). -* Inclusion of data assimilation and forecast restart/cycling capabilities. - +* Incorporation of additional `Unified Workflow `__ tools. .. bibliography:: ../references.bib diff --git a/docs/UsersGuide/source/BackgroundInfo/TechnicalOverview.rst b/docs/UsersGuide/source/BackgroundInfo/TechnicalOverview.rst index a179d535e1..0248745200 100644 --- a/docs/UsersGuide/source/BackgroundInfo/TechnicalOverview.rst +++ b/docs/UsersGuide/source/BackgroundInfo/TechnicalOverview.rst @@ -4,7 +4,7 @@ Technical Overview ==================== -This chapter provides information on SRW App prerequistes, code repositories, and directory structure. +This chapter provides information on SRW App prerequistes, component code repositories, and directory structure. .. 
_SRWPrerequisites: @@ -40,10 +40,10 @@ The UFS SRW Application has been designed so that any sufficiently up-to-date ma * POSIX-compliant UNIX-style operating system -* >82 GB disk space +* >97 GB disk space * 53 GB input data for a standard collection of global data, or "fix" file data (topography, climatology, observational data) for a short 12-hour test forecast on the :term:`CONUS` 25km domain. See data download instructions in :numref:`Section %s `. - * 8 GB for full :term:`HPC-Stack` installation + * ~23 GB for full :term:`spack-stack` installation (or ~8 GB :term:`HPC-Stack`) * 3 GB for ``ufs-srweather-app`` installation * 1 GB for boundary conditions for a short 12-hour test forecast on the CONUS 25km domain. See data download instructions in :numref:`Section %s `. * 17 GB for a 12-hour test forecast on the CONUS 25km domain, with model output saved hourly. @@ -56,7 +56,7 @@ The UFS SRW Application has been designed so that any sufficiently up-to-date ma * gcc v9+, ifort v18+, and clang v9+ (macOS, native Apple clang, LLVM clang, GNU) have been tested -* Python v3.6+, including prerequisite packages ``jinja2``, ``pyyaml``, and ``f90nml`` +* Python v3.7+ (preferably 3.9+), including prerequisite packages ``jinja2``, ``pyyaml``, and ``f90nml`` * Python packages ``scipy``, ``matplotlib``, ``pygrib``, ``cartopy``, and ``pillow`` are required for users who would like to use the provided graphics scripts. @@ -70,13 +70,13 @@ The UFS SRW Application has been designed so that any sufficiently up-to-date ma * Only required for retrieving data using ``retrieve_data.py``. If data is prestaged, *wget* is not required. If data is retrieved using other means, *curl* may be used as an alternative. -The following software is also required to run the SRW Application, but the :term:`HPC-Stack` (which contains the software libraries necessary for building and running the SRW App) can be configured to build these requirements: +The following software is also required to run the SRW Application, but the :term:`spack-stack` (which contains the software libraries necessary for building and running the SRW App) can be configured to build these requirements: * CMake v3.20+ * :term:`MPI` (MPICH, OpenMPI, or other implementation) - * Only **MPICH** or **OpenMPI** can be built with HPC-Stack. Other implementations must be installed separately by the user (if desired). + * Only **MPICH** or **OpenMPI** can be built with spack-stack. Other implementations must be installed separately by the user (if desired). For MacOS systems, some additional software packages are needed. When possible, it is recommended that users install and/or upgrade this software (along with software listed above) using the `Homebrew `__ package manager for MacOS. See :doc:`HPC-Stack Documentation: Chapter 3 ` and :numref:`Chapter %s ` for further guidance on installing these prerequisites on MacOS. @@ -103,7 +103,7 @@ Code Repositories and Directory Structure Hierarchical Repository Structure ----------------------------------- -The :term:`umbrella repository` for the SRW Application is named ``ufs-srweather-app`` and is available on GitHub at https://github.com/ufs-community/ufs-srweather-app. The SRW Application uses the ``manage_externals`` tool and a configuration file called ``Externals.cfg``, to pull in the appropriate versions of the external repositories associated with the SRW App (see :numref:`Table %s `). 
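As a brief illustration of the checkout mechanism described here, the typical sequence is to clone the umbrella repository and then run the ``checkout_externals`` script, which reads ``Externals.cfg``. The branch name below matches the v2.2.0 release; adjust it for other versions:

.. code-block:: console

   git clone -b release/public-v2.2.0 https://github.com/ufs-community/ufs-srweather-app
   cd ufs-srweather-app
   ./manage_externals/checkout_externals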
+The :term:`umbrella repository` for the SRW Application is named ``ufs-srweather-app`` and is available on GitHub at https://github.com/ufs-community/ufs-srweather-app. The SRW Application uses the ``manage_externals`` tool and a configuration file called ``Externals.cfg`` to pull in the appropriate versions of the external repositories associated with the SRW App (see :numref:`Table %s `). .. _top_level_repos: @@ -123,16 +123,15 @@ The :term:`umbrella repository` for the SRW Application is named ``ufs-srweather - https://github.com/NOAA-EMC/UPP * - Repository for Air Quality Modeling (AQM) Utilities - https://github.com/NOAA-EMC/AQM-utils - * - Repository for NEXUS + * - Repository for the NOAA Emission and eXchange Unified System (NEXUS) - https://github.com/noaa-oar-arl/NEXUS * - Repository for the Unified Workflow (UW) Toolkit - https://github.com/ufs-community/workflow-tools -The UFS Weather Model contains a number of sub-repositories, which are documented `here `__. +The UFS Weather Model contains a number of sub-repositories, which are documented :doc:`here `. .. note:: - The prerequisite libraries (including NCEP Libraries and external libraries) are not included in the UFS SRW Application repository. The `HPC-Stack `__ repository assembles these prerequisite libraries. The HPC-Stack has already been built on `preconfigured (Level 1) platforms `__. However, it must be built on other systems. See the :doc:`HPC-Stack Documentation ` for details on installing the HPC-Stack. - + The prerequisite libraries (including NCEP Libraries and external libraries) are not included in the UFS SRW Application repository. The `spack-stack `__ repository assembles these prerequisite libraries. Spack-stack has already been built on `preconfigured (Level 1) platforms `__. However, it must be built on other systems. See the :doc:`spack-stack Documentation ` for details on installing spack-stack. .. _TopLevelDirStructure: @@ -157,6 +156,7 @@ The ``ufs-srweather-app`` :term:`umbrella repository` is an NCO-compliant reposi │ └── wflow_.lua ├── parm │ ├── wflow + │ │ └── default_workflow.yaml │ └── FV3LAM_wflow.xml ├── (share) ├── scripts @@ -179,13 +179,15 @@ The ``ufs-srweather-app`` :term:`umbrella repository` is an NCO-compliant reposi │ ├── atmos_cubed_sphere │ └── ccpp ├── tests/WE2E + │ └── run_WE2E_tests.py ├── ush - │ ├── bash_utils │ ├── machine - │ ├── Python - │ ├── python_utils - │ ├── test_data - │ └── wrappers + │ ├── wrappers + │ ├── config.community.yaml + │ ├── generate_FV3LAM_wflow.py + │ ├── launch_FV3LAM_wflow.sh + │ ├── setup.py + │ └── valid_param_vals.yaml └── versions SRW App SubDirectories @@ -226,7 +228,7 @@ When the user generates an experiment using the ``generate_FV3LAM_wflow.py`` scr .. _ExptDirStructure: -.. table:: Files and subdirectory initially created in the experiment directory +.. 
table:: Files and subdirectory initially created in the experiment directory :widths: 33 67 +---------------------------+--------------------------------------------------------------------------------------------------------------+ @@ -237,12 +239,11 @@ When the user generates an experiment using the ``generate_FV3LAM_wflow.py`` scr | data_table | :term:`Cycle-independent` input file (empty) | +---------------------------+--------------------------------------------------------------------------------------------------------------+ | field_table | :term:`Tracers ` in the `forecast model | - | | `__ | + | | `__ | +---------------------------+--------------------------------------------------------------------------------------------------------------+ | FV3LAM_wflow.xml | Rocoto XML file to run the workflow | +---------------------------+--------------------------------------------------------------------------------------------------------------+ - | input.nml | :term:`Namelist` for the `UFS Weather Model | - | | `__ | + | input.nml | :term:`Namelist` for the :ref:`UFS Weather Model ` | +---------------------------+--------------------------------------------------------------------------------------------------------------+ | launch_FV3LAM_wflow.sh | Symlink to the ``ufs-srweather-app/ush/launch_FV3LAM_wflow.sh`` shell script, | | | which can be used to (re)launch the Rocoto workflow. | @@ -252,8 +253,7 @@ When the user generates an experiment using the ``generate_FV3LAM_wflow.py`` scr | log.generate_FV3LAM_wflow | Log of the output from the experiment generation script | | | (``generate_FV3LAM_wflow.py``) | +---------------------------+--------------------------------------------------------------------------------------------------------------+ - | nems.configure | See `NEMS configuration file | - | | `__ | + | nems.configure | See :ref:`NEMS configuration file ` | +---------------------------+--------------------------------------------------------------------------------------------------------------+ | suite_{CCPP}.xml | :term:`CCPP` suite definition file (:term:`SDF`) used by the forecast model | +---------------------------+--------------------------------------------------------------------------------------------------------------+ diff --git a/docs/UsersGuide/source/BuildingRunningTesting/AQM.rst b/docs/UsersGuide/source/BuildingRunningTesting/AQM.rst index 4c83bcee2b..c4f373f4ba 100644 --- a/docs/UsersGuide/source/BuildingRunningTesting/AQM.rst +++ b/docs/UsersGuide/source/BuildingRunningTesting/AQM.rst @@ -4,6 +4,10 @@ Air Quality Modeling (SRW-AQM) ===================================== +.. attention:: + + AQM capabilities are an unsupported feature of the SRW App. This means that it is available for users to experiment with, but assistance for AQM-related issues is limited. + The standard SRW App distribution uses the uncoupled version of the UFS Weather Model (atmosphere-only). However, users have the option to use a coupled version of the SRW App that includes the standard distribution (atmospheric model) plus the Air Quality Model (AQM). The AQM is a UFS Application that dynamically couples the Community Multiscale Air Quality (:term:`CMAQ`) model with the UFS Weather Model (WM) through the :term:`NUOPC` Layer to simulate temporal and spatial variations of atmospheric compositions (e.g., ozone and aerosol compositions). 
The CMAQ model, treated as a column chemistry model, updates concentrations of chemical species (e.g., ozone and aerosol compositions) at each integration time step. The transport terms (e.g., :term:`advection` and diffusion) of all chemical species are handled by the UFS WM as tracers. @@ -12,21 +16,21 @@ The AQM is a UFS Application that dynamically couples the Community Multiscale A Although this chapter is the primary documentation resource for running the AQM configuration, users may need to refer to :numref:`Chapter %s ` and :numref:`Chapter %s ` for additional information on building and running the SRW App, respectively. +Quick Start Guide (SRW-AQM) +===================================== + .. attention:: These instructions should work smoothly on Hera and WCOSS2, but users on other systems may need to make additional adjustments. -Quick Start Guide (SRW-AQM) -===================================== - Download the Code ------------------- -Clone the ``develop`` branch of the authoritative SRW App repository: +Clone the |branch| branch of the authoritative SRW App repository: .. code-block:: console - git clone -b develop https://github.com/ufs-community/ufs-srweather-app + git clone -b release/public-v2.2.0 https://github.com/ufs-community/ufs-srweather-app cd ufs-srweather-app Checkout Externals @@ -68,7 +72,7 @@ If the SRW-AQM builds correctly, users should see the standard executables liste * - nexus - Runs the NOAA Emission and eXchange Unified System (:ref:`NEXUS `) emissions processing system -Load the ``workflow_tools`` Environment +Load the |wflow_env| Environment -------------------------------------------- Load the python environment for the workflow: @@ -90,7 +94,7 @@ If the console outputs a message, the user should run the commands specified in Please do the following to activate conda: > conda activate workflow_tools -then the user should run ``conda activate workflow_tools``. Otherwise, the user can continue with configuring the workflow. +then the user should run |activate|. Otherwise, the user can continue with configuring the workflow. .. _AQMConfig: @@ -131,7 +135,7 @@ Users may also wish to change :term:`cron`-related parameters in ``config.yaml`` This means that cron will submit the launch script every 3 minutes. Users may choose not to submit using cron or to submit at a different frequency. Note that users should create a crontab by running ``crontab -e`` the first time they use cron. -When using the basic ``config.aqm.community.yaml`` experiment, the AQM pre-processing tasks are automatically turned because ``"parm/wflow/aqm_prep.yaml"`` appears in the list of workflow files in the ``rocoto: tasks: taskgroups:`` section of ``config.yaml`` (see :numref:`Section %s ` for task descriptions). To turn on AQM post-processing tasks in the workflow, include ``"parm/wflow/aqm_post.yaml"`` in the ``rocoto: tasks: taskgroups:`` section, too (see :numref:`Section %s ` for task descriptions). +When using the basic ``config.aqm.community.yaml`` experiment, the AQM pre-processing tasks are automatically turned on because ``"parm/wflow/aqm_prep.yaml"`` appears in the list of workflow files in the ``rocoto: tasks: taskgroups:`` section of ``config.yaml`` (see :numref:`Section %s ` for task descriptions). To turn on AQM *post*-processing tasks in the workflow, include ``"parm/wflow/aqm_post.yaml"`` in the ``rocoto: tasks: taskgroups:`` section, too (see :numref:`Section %s ` for task descriptions). .. 
attention:: @@ -153,10 +157,10 @@ If ``USE_CRON_TO_RELAUNCH`` is set to true in ``config.yaml`` (see :numref:`Sect .. code-block:: console - cd / + cd ${EXPT_BASEDIR}/${EXPT_SUBDIR} ./launch_FV3LAM_wflow.sh -Repeat the launch command regularly until a SUCCESS or FAILURE message appears on the terminal window. See :numref:`Section %s ` for more on the ```` and ```` variables. +Repeat the launch command regularly until a SUCCESS or FAILURE message appears on the terminal window. See :numref:`Section %s ` for more on the ``${EXPT_BASEDIR}`` and ``${EXPT_SUBDIR}`` variables. Users may check experiment status from the experiment directory with either of the following commands: @@ -235,7 +239,7 @@ Structure of SRW-AQM Workflow .. _FlowProcAQM: -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/SRW-AQM_workflow.png +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/WorkflowImages/SRW-AQM_workflow.png :alt: Flowchart of the SRW-AQM tasks. *Workflow Structure of SRW-AQM (non-DA)* @@ -307,6 +311,7 @@ Add the WE2E test for AQM to the list file: .. code-block:: console + cd /path/to/ufs-srweather-app/tests/WE2E echo "custom_ESGgrid" > my_tests.txt echo "aqm_grid_AQM_NA13km_suite_GFS_v16" >> my_tests.txt diff --git a/docs/UsersGuide/source/BuildingRunningTesting/BuildSRW.rst b/docs/UsersGuide/source/BuildingRunningTesting/BuildSRW.rst index 5b871fe87f..a5d58985c1 100644 --- a/docs/UsersGuide/source/BuildingRunningTesting/BuildSRW.rst +++ b/docs/UsersGuide/source/BuildingRunningTesting/BuildSRW.rst @@ -22,32 +22,39 @@ To build the SRW App, users will complete the following steps: .. _AppBuildProc: -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/SRW_build_process.png +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/WorkflowImages/SRW_build_process.png :alt: Flowchart describing the SRW App build process. *Overview of the SRW App Build Process* -.. _HPCstackInfo: +.. _StackInfo: Install the Prerequisite Software Stack ========================================== Users on any sufficiently up-to-date machine with a UNIX-based operating system should be able to install the prerequisite software stack and run the SRW Application. However, a list of prerequisites is available in :numref:`Section %s ` for reference. Users should install or update their system as required before attempting to install the software stack. -Currently, installation of the prerequisite software stack is supported via HPC-Stack. :term:`HPC-Stack` is a :term:`repository` that provides a unified, shell script-based system to build the software stack required for `UFS `__ applications such as the SRW App. +Currently, installation of the prerequisite software stack is supported via spack-stack on most systems. :term:`Spack-stack` is a :term:`repository` that provides a Spack-based system to build the software stack required for `UFS `__ applications such as the SRW App. Spack-stack is the software stack validated by the UFS Weather Model (:term:`WM`), and the SRW App has likewise shifted to spack-stack for most Level 1 systems. -.. Attention:: - Skip the HPC-Stack installation if working on a `Level 1 system `__ (e.g., Cheyenne, Hera, Orion, NOAA Cloud), and :ref:`continue to the next section `. +.. hint:: + Skip the spack-stack installation if working on a `Level 1 system `__ (e.g., Hera, Jet, Derecho, NOAA Cloud), and :ref:`continue to the next section `. 
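On Level 1 platforms, the centrally installed stack is typically made visible through environment modules. A sketch of what this looks like appears below; the path and module names are purely illustrative and vary by platform and stack version:

.. code-block:: console

   # Illustrative only: actual paths and module names depend on the platform's spack-stack installation.
   module use /path/to/spack-stack/envs/unified-env/install/modulefiles/Core
   module load stack-intel stack-intel-oneapi-mpi
   module avail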
Background ---------------- -SRW App components, including the UFS Weather Model (:term:`WM `), draw on over 50 code libraries to run. These libraries range from libraries developed in-house at NOAA (e.g., NCEPLIBS, FMS) to libraries developed by NOAA's partners (e.g., PIO, ESMF) to truly third party libraries (e.g., netCDF). Individual installation of these libraries is not practical, so the `HPC-Stack `__ was developed as a central installation system to ensure that the infrastructure environment across multiple platforms is as similar as possible. Installation of the HPC-Stack (or :term:`spack-stack`) is required to run the SRW App. +SRW App components, including the UFS :term:`WM`, draw on over 50 code libraries to run. These libraries range from libraries developed in-house at NOAA (e.g., NCEPLIBS, FMS) to libraries developed by NOAA's partners (e.g., PIO, ESMF) to truly third-party libraries (e.g., netCDF). Individual installation of these libraries is not practical, so `spack-stack `__ was developed as a central installation system to ensure that the infrastructure environment across multiple platforms is as similar as possible. Installation of spack-stack (or its predecessor, :term:`HPC-Stack`) is required to run the SRW App. Instructions ------------------------- -Users working on systems that fall under `Support Levels 2-4 `__ will need to install the HPC-Stack the first time they try to build applications (such as the SRW App) that depend on it. Users can either build the HPC-Stack on their local system or use the centrally maintained stacks on each HPC platform if they are working on a Level 1 system. Before installing the HPC-Stack, users on both Linux and MacOS systems should set the stack size to "unlimited" (if allowed) or to the largest possible value: + +.. attention:: + + Spack-stack is the fully-supported software stack validated by the UFS WM as of `PR #1707 `__ on August 24, 2023. UFS applications are therefore shifting to :term:`spack-stack`, too. When all systems have shifted to spack-stack, support for HPC-Stack will be deprecated. Users are encouraged to check out `spack-stack `__ to prepare for this shift in support from HPC-Stack to spack-stack even if their system currently has support for HPC-Stack. + + As of the v2.2.0 release, spack-stack is supported in the SRW App on most Level 1 systems with the exception of Derecho, which uses HPC-Stack. Transition to spack-stack is underway for Derecho. Users on generic MacOS and Linux systems will find HPC-Stack-based modulefiles in the v2.2.0 release but can expect that these will also shift to spack-stack in the ``develop`` branch in the coming months. + +Users working on systems that fall under `Support Levels 2-4 `__ will need to install spack-stack or HPC-Stack the first time they try to build applications (such as the SRW App) that depend on it. Users can build the stack on their local system or use the centrally maintained stacks on each HPC platform if they are working on a Level 1 system. Before installing spack-stack/HPC-Stack, users on both Linux and MacOS systems should set the stack size to "unlimited" (if allowed) or to the largest possible value: .. code-block:: console @@ -57,10 +64,7 @@ Users working on systems that fall under `Support Levels 2-4 `. - -.. attention:: - Although HPC-Stack is currently the fully-supported software stack option, UFS applications are gradually shifting to :term:`spack-stack`, which is a :term:`Spack`-based method for installing UFS prerequisite software libraries. 
The spack-stack is currently used on NOAA Cloud platforms and in containers, while HPC-Stack is still used on other Level 1 systems and is the software stack validated by the UFS Weather Model. Users are encouraged to check out `spack-stack `__ to prepare for the upcoming shift in support from HPC-Stack to spack-stack. +For a detailed description of installation options, see :doc:`spack-stack instructions for configuring the stack on a new platform ` or :ref:`HPC-Stack installation instructions `. After completing installation, continue to the :ref:`next section ` to download the UFS SRW Application Code. @@ -68,11 +72,11 @@ After completing installation, continue to the :ref:`next section `. The user may set an ``$SRW`` environment variable to point to the location of the new ``ufs-srweather-app`` repository. For example, if ``ufs-srweather-app`` was cloned into the ``$HOME`` directory, the following commands will set an ``$SRW`` environment variable in a bash or csh shell, respectively: @@ -86,84 +90,61 @@ The cloned repository contains the configuration files and sub-directories shown .. _FilesAndSubDirs: -.. table:: Files and sub-directories of the ufs-srweather-app repository - - +--------------------------------+-----------------------------------------------------------+ - | **File/Directory Name** | **Description** | - +================================+===========================================================+ - | CMakeLists.txt | Main CMake file for SRW App | - +--------------------------------+-----------------------------------------------------------+ - | devbuild.sh | SRW App build script | - +--------------------------------+-----------------------------------------------------------+ - | devclean.sh | Convenience script that can be used to clean up code if | - | | something goes wrong when checking out externals or | - | | building the application. | - +--------------------------------+-----------------------------------------------------------+ - | docs | Contains release notes, documentation, and User's Guide | - +--------------------------------+-----------------------------------------------------------+ - | environment.yml | Contains information on the package versions required for | - | | the regional workflow environment. | - +--------------------------------+-----------------------------------------------------------+ - | etc | Contains Lmod startup scripts | - +--------------------------------+-----------------------------------------------------------+ - | Externals.cfg | Includes tags pointing to the correct version of the | - | | external GitHub repositories/branches used in the SRW | - | | App. | - +--------------------------------+-----------------------------------------------------------+ - | jobs | Contains the *j-job* script for each workflow task. These | - | | scripts set up the environment variables and call an | - | | *ex-script* script located in the ``scripts`` | - | | subdirectory. 
| - +--------------------------------+-----------------------------------------------------------+ - | LICENSE.md | CC0 license information | - +--------------------------------+-----------------------------------------------------------+ - | manage_externals | Utility for checking out external repositories | - +--------------------------------+-----------------------------------------------------------+ - | modulefiles | Contains build and workflow modulefiles | - +--------------------------------+-----------------------------------------------------------+ - | parm | Contains parameter files. Includes UFS Weather Model | - | | configuration files such as ``model_configure``, | - | | ``diag_table``, and ``field_table``. | - +--------------------------------+-----------------------------------------------------------+ - | README.md | Contains SRW App introductory information | - +--------------------------------+-----------------------------------------------------------+ - | rename_model.sh | Used to rename the model before it is transitioned into | - | | operations. The SRW App is a generic app that is the base | - | | for models such as :term:`AQM` and :term:`RRFS`. When | - | | these models become operational, variables like | - | | ``HOMEdir`` and ``PARMdir`` will be renamed to | - | | ``HOMEaqm``/``HOMErrfs``, ``PARMaqm``/``PARMrrfs``, etc. | - | | using this script. | - +--------------------------------+-----------------------------------------------------------+ - | scripts | Contains the *ex-script* for each workflow task. | - | | These scripts are where the task logic and executables | - | | are contained. | - +--------------------------------+-----------------------------------------------------------+ - | sorc | Contains CMakeLists.txt; source code from external | - | | repositories is cloned into this directory. | - +--------------------------------+-----------------------------------------------------------+ - | tests | Contains SRW App tests, including workflow end-to-end | - | | (WE2E) tests. | - +--------------------------------+-----------------------------------------------------------+ - | ufs_srweather_app_meta.h.in | Meta information for SRW App which can be used by | - | | other packages | - +--------------------------------+-----------------------------------------------------------+ - | ufs_srweather_app.settings.in | SRW App configuration summary | - +--------------------------------+-----------------------------------------------------------+ - | ush | Contains utility scripts. Includes the experiment | - | | configuration file and the experiment generation file. | - +--------------------------------+-----------------------------------------------------------+ - | versions | Contains ``run.ver`` and ``build.ver`` files, which track | - | | package versions at run time and compile time, | - | | respectively. | - +--------------------------------+-----------------------------------------------------------+ - +.. list-table:: Files and Subdirectories of the *ufs-srweather-app* Repository + :widths: 20 50 + :header-rows: 1 + + * - File/Directory Name + - Description + * - CMakeLists.txt + - Main CMake file for SRW App + * - devbuild.sh + - SRW App build script + * - devclean.sh + - Convenience script that can be used to clean up code if something goes wrong when checking out externals or building the application. 
+ * - docs + - Contains release notes, documentation, and User's Guide + * - environment.yml + - Contains information on the package versions required for the regional workflow environment. + * - etc + - Contains Lmod startup scripts + * - Externals.cfg + - Includes tags pointing to the correct version of the external GitHub repositories/branches used in the SRW App. + * - jobs + - Contains the *j-job* script for each workflow task. These scripts set up the environment variables and call an *ex-script* script located in the ``scripts`` subdirectory. + * - LICENSE.md + - CC0 license information + * - manage_externals + - Utility for checking out external repositories + * - modulefiles + - Contains build and workflow modulefiles + * - parm + - Contains parameter files. Includes UFS Weather Model configuration files such as ``model_configure``, ``diag_table``, and ``field_table``. + * - README.md + - Contains SRW App introductory information + * - rename_model.sh + - Used to rename the model before it is transitioned into operations. The SRW App is a generic app that is the base for models such as :term:`AQM` and :term:`RRFS`. When these models become operational, variables like ``HOMEdir`` and ``PARMdir`` will be renamed to ``HOMEaqm``/``HOMErrfs``, ``PARMaqm``/``PARMrrfs``, etc. using this script. + * - scripts + - Contains the *ex-script* for each workflow task. These scripts are where the task logic and executables are contained. + * - sorc + - Contains CMakeLists.txt; source code from external repositories is cloned into this directory. + * - tests + - Contains SRW App tests, including workflow end-to-end (WE2E) tests and unit tests. + * - ufs_srweather_app_meta.h.in + - Meta information for SRW App which can be used by other packages + * - ufs_srweather_app.settings.in + - SRW App configuration summary + * - ush + - Contains utility scripts. Includes the experiment configuration file and the experiment generation file. + * - versions + - Contains ``run.ver`` and ``build.ver`` files, which track package versions at run time and compile time, respectively. + .. _CheckoutExternals: Check Out External Components ================================ -The SRW App relies on a variety of components (e.g., UFS_UTILS, ufs-weather-model, and UPP) detailed in :numref:`Chapter %s ` of this User's Guide. Each component has its own repository. Users must run the ``checkout_externals`` script to collect the individual components of the SRW App from their respective GitHub repositories. The ``checkout_externals`` script uses the configuration file ``Externals.cfg`` in the top level directory of the SRW App to clone the correct tags (code versions) of the external repositories listed in :numref:`Section %s ` into the appropriate directories (e.g., ``ush``, ``sorc``). +The SRW App relies on a variety of components (e.g., UFS_UTILS, ufs-weather-model, and UPP) detailed in :numref:`Chapter %s ` of this User's Guide. Each component has its own repository. Users must run the ``checkout_externals`` script to collect the individual components of the SRW App from their respective GitHub repositories. The ``checkout_externals`` script uses the configuration file ``Externals.cfg`` in the top-level directory of the SRW App to clone the correct tags (code versions) of the external repositories listed in :numref:`Section %s ` into the appropriate directories (e.g., ``ush``, ``sorc``). 
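For illustration, each component managed by ``checkout_externals`` appears in ``Externals.cfg`` as an INI-style entry similar to the sketch below. The repository URL, revision, and local path shown here are placeholders rather than the values shipped with this release; the ``Externals.cfg`` file in the cloned repository is the authoritative source.

.. code-block:: cfg

   [ufs-weather-model]
   protocol   = git
   repo_url   = https://github.com/ufs-community/ufs-weather-model
   hash       = <revision-pinned-by-this-release>
   local_path = sorc/ufs-weather-model
   required   = True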
Run the executable that pulls in SRW App components from external repositories: @@ -174,6 +155,17 @@ Run the executable that pulls in SRW App components from external repositories: The script should output dialogue indicating that it is retrieving different code repositories. It may take several minutes to download these repositories. +.. hint:: + + Some systems (e.g., Hercules, Gaea) may have difficulty finding prerequisite software, such as python. If users run into this issue but know that the software exists on their system, they can run ``module load `` followed by ``module save``. For example: + + .. code-block:: console + + /usr/bin/env: ‘python’: No such file or directory + hercules-login-1[10] username$ module load python + hercules-login-1[11] username$ module save + Saved current collection of modules to: "default", for system: "hercules" + To see more options for the ``checkout_externals`` script, users can run ``./manage_externals/checkout_externals -h``. For example: * ``-S``: Outputs the status of the repositories managed by ``checkout_externals``. By default, only summary information is provided. Use with the ``-v`` (verbose) option to see details. @@ -198,7 +190,7 @@ On Level 1 systems for which a modulefile is provided under the ``modulefiles`` ./devbuild.sh --platform= -where ```` is replaced with the name of the platform the user is working on. Valid values include: ``cheyenne`` | ``gaea`` | ``hera`` | ``jet`` | ``linux`` | ``macos`` | ``noaacloud`` | ``orion`` +where ```` is replaced with the name of the platform the user is working on. Valid values include: ``derecho`` | ``gaea`` | ``hera`` | ``hercules`` | ``jet`` | ``linux`` | ``macos`` | ``noaacloud`` | ``orion`` .. note:: Although build modulefiles exist for generic Linux and MacOS machines, users will need to alter these according to the instructions in Sections :numref:`%s ` & :numref:`%s `. Users on these systems may have more success building the SRW App with the :ref:`CMake Approach ` instead. @@ -211,9 +203,15 @@ If compiler auto-detection fails for some reason, specify it using the ``--compi where valid values are ``intel`` or ``gnu``. -The last line of the console output should be ``[100%] Built target ufs-weather-model``, indicating that the UFS Weather Model executable has been built successfully. +The last few lines of the console output should include ``[100%] Built target ufs-weather-model``, indicating that the UFS Weather Model executable has been built successfully. -After running ``devbuild.sh``, the executables listed in :numref:`Table %s ` should appear in the ``ufs-srweather-app/exec`` directory. If the ``devbuild.sh`` build method does not work, or if users are not on a supported machine, they will have to manually set up the environment and build the SRW App binaries with CMake as described in :numref:`Section %s `. +After running ``devbuild.sh``, the executables listed in :numref:`Table %s ` should appear in the ``ufs-srweather-app/exec`` directory. If the application built properly, users may configure and run an experiment. Users have a few options: + +#. Proceed to :numref:`Section %s: Quick Start Guide ` for a quick overview of the workflow steps. +#. Try the :ref:`SRW App Tutorials ` (good for new users!). +#. For detailed information on running the SRW App, including optional tasks like plotting and verification, users can refer to :numref:`Section %s: Running the SRW App `. 
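Before choosing one of these paths, it may help to confirm that the build actually produced the expected binaries. A quick, illustrative check (the path is a placeholder for wherever the repository was cloned):

.. code-block:: console

   # List the executables produced by the build; ufs_model and the pre- and post-processing
   # executables listed in the table referenced above should be present
   ls /path/to/ufs-srweather-app/exec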
+ +If the ``devbuild.sh`` build method did *not* work, or if users are not on a supported machine, they will have to manually set up the environment and build the SRW App binaries with CMake as described in :numref:`Section %s `. .. _ExecDescription: @@ -287,7 +285,7 @@ Set Up the Build Environment ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ .. attention:: - * If users successfully built the executables in :numref:`Table %s `, they should skip to step :numref:`Chapter %s: Running the SRW App `. + * If users successfully built the executables listed in :numref:`Table %s `, they should skip to step :numref:`Section %s: Running the SRW App `. * Users who want to build the SRW App on MacOS or generic Linux systems should skip to :numref:`Section %s ` and follow the approach there. If the ``devbuild.sh`` approach failed, users need to set up their environment to run a workflow on their specific platform. First, users should make sure ``Lmod`` is the app used for loading modulefiles. This is the case on most Level 1 systems; however, on systems such as Gaea/Odin, the default modulefile loader is from Cray and must be switched to Lmod. For example, on Gaea, users can run one of the following two commands depending on whether they have a bash or csh shell, respectively: @@ -305,10 +303,10 @@ From here, ``Lmod`` is ready to load the modulefiles needed by the SRW App. Thes .. code-block:: console - module use /path/to/modulefiles + module use /path/to/ufs-srweather-app/modulefiles module load build__ -where ``/path/to/modulefiles/`` is the full path to the ``modulefiles`` directory. +where ``/path/to/ufs-srweather-app/modulefiles/`` is the full path to the ``modulefiles`` directory. This will work on Level 1 systems, where a modulefile is available in the ``modulefiles`` directory. Users on Level 2-4 systems (including generic Linux/MacOS systems) will need to modify an appropriate ``build__`` modulefile. One of the current ``build__`` modulefiles can be copied and used as a template. However, users will need to adjust certain environment variables in their modulefile, such as the path to HPC-Stack, so that the SRW App can find and load the appropriate modules. @@ -342,7 +340,7 @@ From the build directory, run the following commands to build the pre-processing cmake .. -DCMAKE_INSTALL_PREFIX=.. -DCMAKE_INSTALL_BINDIR=exec .. make -j 4 >& build.out & -``-DCMAKE_INSTALL_PREFIX`` specifies the location where the ``exec``, ``include``, ``lib``, and ``share`` directories will be created. These directories will contain various components of the SRW App. Its recommended value ``..`` denotes one directory up from the build directory. In the next line, the ``make`` argument ``-j 4`` indicates that the build will run in parallel with four threads. Although users can specify a larger or smaller number of threads (e.g., ``-j 8``, ``-j 2``), it is highly recommended to use at least four parallel threads to prevent overly long installation times. +``-DCMAKE_INSTALL_PREFIX`` specifies the location where the ``exec``, ``include``, ``lib``, and ``share`` directories will be created. These directories will contain various components of the SRW App. Its recommended value ``..`` denotes one directory up from the ``build`` directory. In the next line, the ``make`` argument ``-j 4`` indicates that the build will run in parallel with four threads. 
Although users can specify a larger or smaller number of threads (e.g., ``-j 8``, ``-j 2``), it is highly recommended to use at least four parallel threads to prevent overly long installation times. The build will take a few minutes to complete. When it starts, a random number is printed to the console, and when it is done, a ``[1]+ Done`` message is printed to the console. ``[1]+ Exit`` indicates an error. Output from the build will be in the ``ufs-srweather-app/build/build.out`` file. When the build completes, users should see the forecast model executable ``ufs_model`` and several pre- and post-processing executables in the ``ufs-srweather-app/exec`` directory. These executables are described in :numref:`Table %s `. @@ -356,7 +354,7 @@ Additional Details for Building on MacOS or Generic Linux ------------------------------------------------------------ .. note:: - Users who are **not** building the SRW App on MacOS or generic Linux platforms may skip to :numref:`Section %s ` to finish building the SRW App or continue to :numref:`Section %s ` to configure and run an experiment. + Users who are **not** building the SRW App on MacOS or generic Linux platforms may skip to :numref:`Section %s ` to finish building the SRW App or continue to :numref:`Section %s ` to configure and run an experiment if they have already built the App. The SRW App can be built on MacOS and generic Linux machines after the prerequisite software has been installed on these systems (via :term:`HPC-Stack` or :term:`spack-stack`). The installation for MacOS is architecture-independent and has been tested using both x86_64 and M1 chips (running natively). The following configurations for MacOS have been tested: @@ -366,7 +364,7 @@ The SRW App can be built on MacOS and generic Linux machines after the prerequis Several Linux builds have been tested on systems with x86_64 architectures. -The ``$SRW/modulefiles/build__gnu.lua`` modulefile (where ```` is ``macos`` or ``linux``) is written as a Lmod module in the Lua language. It can be loaded once the Lmod module environment has been initialized (which should have happened even prior to :ref:`installing HPC-Stack `). The ``build__gnu`` modulefile lists the location of the HPC-Stack modules, loads the meta-modules and modules, sets serial and parallel compilers, additional flags, and any environment variables needed for building the SRW App. The modulefile must be modified to include the absolute path to the user's HPC-Stack installation: +The ``$SRW/modulefiles/build__gnu.lua`` modulefile (where ```` is ``macos`` or ``linux``) is written as a Lmod module in the Lua language. It can be loaded once the Lmod module environment has been initialized (which should have happened even prior to :ref:`installing HPC-Stack `). The ``build__gnu`` modulefile lists the location of the HPC-Stack modules, loads the meta-modules and modules, sets serial and parallel compilers, additional flags, and any environment variables needed for building the SRW App. The modulefile must be modified to include the absolute path to the user's HPC-Stack installation: .. code-block:: console @@ -391,4 +389,8 @@ Proceed to building the executables using the process outlined in :numref:`Step Run an Experiment ===================== -To configure and run an experiment, users should proceed to :numref:`Chapter %s `. +To configure and run an experiment, users have a few options: + +#. Proceed to :numref:`Section %s: Quick Start Guide ` for a quick overview of the workflow steps. +#. 
Try the :ref:`SRW App Tutorials ` (good for new users!). +#. For detailed information on running the SRW App, including optional tasks like plotting and verification, users can refer to :numref:`Section %s: Running the SRW App `. diff --git a/docs/UsersGuide/source/BuildingRunningTesting/ContainerQuickstart.rst b/docs/UsersGuide/source/BuildingRunningTesting/ContainerQuickstart.rst index bcc17e73ce..5e0f2db31b 100644 --- a/docs/UsersGuide/source/BuildingRunningTesting/ContainerQuickstart.rst +++ b/docs/UsersGuide/source/BuildingRunningTesting/ContainerQuickstart.rst @@ -28,15 +28,13 @@ Install Singularity/Apptainer .. note:: - As of November 2021, the Linux-supported version of Singularity has been `renamed `__ to *Apptainer*. Apptainer has maintained compatibility with Singularity, so ``singularity`` commands should work with either Singularity or Apptainer (see compatibility details `here `__.) + As of November 2021, the Linux-supported version of Singularity has been `renamed `__ to *Apptainer*. Apptainer has maintained compatibility with Singularity, so ``singularity`` commands should work with either Singularity or Apptainer (see compatibility details `here `__.) To build and run the SRW App using a Singularity/Apptainer container, first install the software according to the `Apptainer Installation Guide `__. This will include the installation of all dependencies. .. warning:: Docker containers can only be run with root privileges, and users cannot have root privileges on :term:`HPCs `. Therefore, it is not possible to build the SRW App, which uses the spack-stack, inside a Docker container on an HPC system. However, a Singularity/Apptainer image may be built directly from a Docker image for use on the system. -.. COMMENT: Update reference to HPC-Stack --> spack-stack? - Working in the Cloud or on HPC Systems ----------------------------------------- @@ -49,7 +47,7 @@ For users working on systems with limited disk space in their ``/home`` director where ``/absolute/path/to/writable/directory/`` refers to a writable directory (usually a project or user directory within ``/lustre``, ``/work``, ``/scratch``, or ``/glade`` on NOAA Level 1 systems). If the ``cache`` and ``tmp`` directories do not exist already, they must be created with a ``mkdir`` command. -On NOAA Cloud systems, the ``sudo su`` command may also be required: +On NOAA Cloud systems, the ``sudo su`` command may also be required. For example: .. code-block:: @@ -74,26 +72,27 @@ Build the Container Level 1 Systems ^^^^^^^^^^^^^^^^^^ -On most Level 1 systems, a container named ``ubuntu20.04-intel-ue-1.4.1-srw-dev.img`` has already been built at the following locations: - -.. 
table:: Locations of pre-built containers - - +--------------+--------------------------------------------------------+ - | Machine | File location | - +==============+========================================================+ - | Cheyenne | /glade/scratch/epicufsrt/containers | - +--------------+--------------------------------------------------------+ - | Gaea | /lustre/f2/dev/role.epic/containers | - +--------------+--------------------------------------------------------+ - | Hera | /scratch1/NCEPDEV/nems/role.epic/containers | - +--------------+--------------------------------------------------------+ - | Jet | /mnt/lfs4/HFIP/hfv3gfs/role.epic/containers | - +--------------+--------------------------------------------------------+ - | NOAA Cloud | /contrib/EPIC/containers | - +--------------+--------------------------------------------------------+ - | Orion | /work/noaa/epic-ps/role-epic-ps/containers | - +--------------+--------------------------------------------------------+ - +On most Level 1 systems, a container named ``ubuntu20.04-intel-srwapp-release-public-v2.2.0.img`` has already been built at the following locations: + +.. list-table:: Locations of pre-built containers + :widths: 20 50 + :header-rows: 1 + + * - Machine + - File Location + * - Cheyenne/Derecho + - /glade/scratch/epicufsrt/containers + * - Gaea + - /lustre/f2/dev/role.epic/containers + * - Hera + - /scratch1/NCEPDEV/nems/role.epic/containers + * - Jet + - /mnt/lfs4/HFIP/hfv3gfs/role.epic/containers + * - NOAA Cloud + - /contrib/EPIC/containers + * - Orion/Hercules + - /work/noaa/epic/role-epic/contrib/containers + .. note:: * On Gaea, Singularity/Apptainer is only available on the C5 partition, and therefore container use is only supported on Gaea C5. * The NOAA Cloud containers are accessible only to those with EPIC resources. @@ -102,7 +101,7 @@ Users can simply set an environment variable to point to the container: .. code-block:: console - export img=/path/to/ubuntu20.04-intel-ue-1.4.1-srw-dev.img + export img=/path/to/ubuntu20.04-intel-srwapp-release-public-v2.2.0.img Users may convert the container ``.img`` file to a writable sandbox. This step is required when running on Cheyenne but is optional on other systems: @@ -115,7 +114,7 @@ When making a writable sandbox on Level 1 systems, the following warnings common .. code-block:: console INFO: Starting build... - INFO: Verifying bootstrap image ubuntu20.04-intel-ue-1.4.1-srw-dev.img + INFO: Verifying bootstrap image ubuntu20.04-intel-srwapp-release-public-v2.2.0.img WARNING: integrity: signature not found for object group 1 WARNING: Bootstrap image could not be verified, but build will continue. @@ -126,17 +125,10 @@ On non-Level 1 systems, users should build the container in a writable sandbox: .. code-block:: console - sudo singularity build --sandbox ubuntu20.04-intel-srwapp docker://noaaepic/ubuntu20.04-intel-srwapp:develop + sudo singularity build --sandbox ubuntu20.04-intel-srwapp docker://noaaepic/ubuntu20.04-intel-srwapp:release-public-v2.2.0 Some users may prefer to issue the command without the ``sudo`` prefix. Whether ``sudo`` is required is system-dependent. -.. note:: - Users can choose to build a release version of the container (SRW App v2.1.0) using a similar command: - - .. code-block:: console - - sudo singularity build --sandbox ubuntu20.04-intel-srwapp docker://noaaepic/ubuntu20.04-intel-srwapp:release-public-v2.1.0 - For easier reference, users can set an environment variable to point to the container: .. 
code-block:: console @@ -236,7 +228,7 @@ Generate the Forecast Experiment To generate the forecast experiment, users must: #. :ref:`Activate the workflow ` -#. :ref:`Set experiment parameters ` +#. :ref:`Set experiment parameters to configure the workflow ` #. :ref:`Run a script to generate the experiment workflow ` The first two steps depend on the platform being used and are described here for Level 1 platforms. Users will need to adjust the instructions to match their machine configuration if their local machine is a Level 2-4 platform. @@ -273,7 +265,7 @@ The ``wflow_`` modulefile will then output instructions to activate th Please do the following to activate conda: > conda activate workflow_tools -then the user should run ``conda activate workflow_tools``. This will activate the ``workflow_tools`` conda environment. The command(s) will vary from system to system, but the user should see ``(workflow_tools)`` in front of the Terminal prompt at this point. +then the user should run |activate|. This will activate the |wflow_env| conda environment. The command(s) will vary from system to system, but the user should see |prompt| in front of the Terminal prompt at this point. .. _SetUpConfigFileC: @@ -290,20 +282,20 @@ where: * ``-c`` indicates the compiler on the user's local machine (e.g., ``intel/2022.1.2``) * ``-m`` indicates the :term:`MPI` on the user's local machine (e.g., ``impi/2022.1.2``) - * ```` refers to the local machine (e.g., ``hera``, ``jet``, ``noaacloud``, ``mac``). See ``MACHINE`` in :numref:`Section %s ` for a full list of options. - * ``-i`` indicates the container image that was built in :numref:`Step %s ` (``ubuntu20.04-intel-srwapp`` or ``ubuntu20.04-intel-ue-1.4.1-srw-dev.img`` by default). + * ```` refers to the local machine (e.g., ``hera``, ``jet``, ``noaacloud``, ``macos``, ``linux``). See ``MACHINE`` in :numref:`Section %s ` for a full list of options. + * ``-i`` indicates the container image that was built in :numref:`Step %s ` (``ubuntu20.04-intel-srwapp`` or ``ubuntu20.04-intel-srwapp-release-public-v2.2.0.img`` by default). For example, on Hera, the command would be: .. code-block:: console - ./stage-srw.sh -c=intel/2022.1.2 -m=impi/2022.1.2 -p=hera -i=ubuntu20.04-intel-ue-1.4.1-srw-dev.img + ./stage-srw.sh -c=intel/2022.1.2 -m=impi/2022.1.2 -p=hera -i=ubuntu20.04-intel-srwapp-release-public-v2.2.0.img .. attention:: The user must have an Intel compiler and MPI on their system because the container uses an Intel compiler and MPI. Intel compilers are now available for free as part of `Intel's oneAPI Toolkit `__. -After this command runs, the working directory should contain ``srw.sh`` and a ``ufs-srweather-app`` directory. +After this command runs, the working directory should contain ``srw.sh``, a ``ufs-srweather-app`` directory, and an ``ush`` directory. From here, users can follow the steps below to configure the out-of-the-box SRW App case with an automated Rocoto workflow. For more detailed instructions on experiment configuration, users can refer to :numref:`Section %s `. @@ -339,16 +331,16 @@ From here, users can follow the steps below to configure the out-of-the-box SRW .. 
code-block:: console USE_USER_STAGED_EXTRN_FILES: true - EXTRN_MDL_SOURCE_BASEDIR_ICS: /scratch1/NCEPDEV/nems/role.epic/UFS_SRW_data/develop/input_model_data/FV3GFS/grib2/${yyyymmddhh} + EXTRN_MDL_SOURCE_BASEDIR_ICS: /scratch1/NCEPDEV/nems/role.epic/UFS_SRW_data/v2p2/input_model_data/FV3GFS/grib2/${yyyymmddhh} - On other systems, users will need to change the path for ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_FILES_LBCS`` (below) to reflect the location of the system's data. The location of the machine's global data can be viewed :ref:`here ` for Level 1 systems. Alternatively, the user can add the path to their local data if they downloaded it as described in :numref:`Section %s `. + On other systems, users will need to change the path for ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` (below) to reflect the location of the system's data. The location of the machine's global data can be viewed :ref:`here ` for Level 1 systems. Alternatively, the user can add the path to their local data if they downloaded it as described in :numref:`Section %s `. #. Edit the ``task_get_extrn_lbcs:`` section of the ``config.yaml`` to include the correct data paths to the lateral boundary conditions files. For example, on Hera, add: .. code-block:: console USE_USER_STAGED_EXTRN_FILES: true - EXTRN_MDL_SOURCE_BASEDIR_LBCS: /scratch1/NCEPDEV/nems/role.epic/UFS_SRW_data/develop/input_model_data/FV3GFS/grib2/${yyyymmddhh} + EXTRN_MDL_SOURCE_BASEDIR_LBCS: /scratch1/NCEPDEV/nems/role.epic/UFS_SRW_data/v2p2/input_model_data/FV3GFS/grib2/${yyyymmddhh} .. _GenerateWorkflowC: @@ -413,12 +405,9 @@ If a task goes DEAD, it will be necessary to restart it according to the instruc crontab -e */3 * * * * cd /path/to/expt_dirs/test_community && ./launch_FV3LAM_wflow.sh called_from_cron="TRUE" -where: - - * ``/path/to`` is replaced by the actual path to the user's experiment directory, and - * ``esc`` and ``enter`` refer to the escape and enter **keys** (not a typed command). +where ``/path/to`` is replaced by the actual path to the user's experiment directory. New Experiment =============== -To run a new experiment in the container at a later time, users will need to rerun the commands in :numref:`Section %s ` to reactivate the workflow. Then, users can configure a new experiment by updating the environment variables in ``config.yaml`` to reflect the desired experiment configuration. Basic instructions appear in :numref:`Section %s ` above, and detailed instructions can be viewed in :numref:`Section %s `. After adjusting the configuration file, regenerate the experiment by running ``./generate_FV3LAM_wflow.py``. +To run a new experiment in the container at a later time, users will need to rerun the commands in :numref:`Section %s ` to reactivate the workflow. Then, users can configure a new experiment by updating the variables in ``config.yaml`` to reflect the desired experiment configuration. Basic instructions appear in :numref:`Section %s ` above, and detailed instructions can be viewed in :numref:`Section %s `. After adjusting the configuration file, regenerate the experiment by running ``./generate_FV3LAM_wflow.py``. 
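Put together, a follow-on experiment in the container might look like the sketch below. This is illustrative only: the paths are placeholders, the platform name must match the user's system, and any text editor may be used in place of ``vi``.

.. code-block:: console

   # Reactivate the workflow environment (adjust paths and the platform name for your system)
   source /path/to/ufs-srweather-app/etc/lmod-setup.sh
   module use /path/to/ufs-srweather-app/modulefiles
   module load wflow_<platform>
   conda activate workflow_tools

   # Update the experiment configuration staged in the working directory, then regenerate the workflow
   cd /path/to/working-directory/ush
   vi config.yaml
   ./generate_FV3LAM_wflow.py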
diff --git a/docs/UsersGuide/source/BuildingRunningTesting/Quickstart.rst b/docs/UsersGuide/source/BuildingRunningTesting/Quickstart.rst index df5a61b1ef..0f597241d1 100644 --- a/docs/UsersGuide/source/BuildingRunningTesting/Quickstart.rst +++ b/docs/UsersGuide/source/BuildingRunningTesting/Quickstart.rst @@ -7,14 +7,14 @@ Quick Start Guide This chapter provides a brief summary of how to build and run the SRW Application. The steps will run most smoothly on `Level 1 `__ systems. Users should expect to reference other chapters of this User's Guide, particularly :numref:`Section %s: Building the SRW App ` and :numref:`Section %s: Running the SRW App `, for additional explanations regarding each step. -Install the HPC-Stack -=========================== -SRW App users who are not working on a `Level 1 `__ platform will need to install the prerequisite software stack via :term:`HPC-Stack` prior to building the SRW App on a new machine. Users can find installation instructions in the :doc:`HPC-Stack documentation `. The steps will vary slightly depending on the user's platform. However, in all cases, the process involves (1) cloning the `HPC-Stack repository `__, (2) reviewing/modifying the ``config/config_.sh`` and ``stack/stack_.yaml`` files, and (3) running the commands to build the stack. This process will create a number of modulefiles required for building the SRW App. +Install the Prerequisite Software Stack +========================================= +SRW App users who are **not** working on a `Level 1 `__ platform will need to install the prerequisite software stack via :term:`spack-stack` or :term:`HPC-Stack` prior to building the SRW App on a new machine. Users can find installation instructions in the :doc:`spack-stack documentation ` or the :doc:`HPC-Stack documentation `. The steps will vary slightly depending on the user's platform, but detailed instructions for a variety of platforms are available in the documentation. Users may also post questions in the `ufs-community Discussions tab `__. -Once the HPC-Stack has been successfully installed, users can move on to building the SRW Application. +Once spack-stack or HPC-Stack has been successfully installed, users can move on to building the SRW Application. .. attention:: - Although HPC-Stack is currently the fully-supported software stack option, UFS applications are gradually shifting to :term:`spack-stack`, which is a :term:`Spack`-based method for installing UFS prerequisite software libraries. Users are encouraged to check out `spack-stack `__ to prepare for the upcoming shift in support from HPC-Stack to spack-stack. + Most SRW App `Level 1 `__ systems have shifted to spack-stack from HPC-Stack (with the exception of Derecho). Spack-stack is a Spack-based method for installing UFS prerequisite software libraries. Currently, spack-stack is the software stack validated by the UFS Weather Model (:term:`WM `) for running regression tests. UFS applications and components are also shifting to spack-stack from HPC-Stack but are at various stages of this transition. Although users can still build and use HPC-Stack, the UFS WM no longer uses HPC-Stack for validation, and support for this option is being deprecated. .. _QuickBuildRun: @@ -27,7 +27,7 @@ For a detailed explanation of how to build and run the SRW App on any supported .. code-block:: console - git clone -b develop https://github.com/ufs-community/ufs-srweather-app.git + git clone -b release/public-v2.2.0 https://github.com/ufs-community/ufs-srweather-app.git #. 
Check out the external repositories: @@ -42,16 +42,17 @@ For a detailed explanation of how to build and run the SRW App on any supported ./devbuild.sh --platform= - where ```` is replaced with the name of the user's platform/system. Valid values include: ``cheyenne`` | ``gaea`` | ``hera`` | ``jet`` | ``linux`` | ``macos`` | ``noaacloud`` | ``orion`` | ``wcoss2`` + where ```` is replaced with the name of the user's platform/system. Valid values include: ``derecho`` | ``gaea`` | ``hera`` | ``hercules`` | ``jet`` | ``linux`` | ``macos`` | ``noaacloud`` | ``orion`` | ``wcoss2`` For additional details, see :numref:`Section %s `, or view :numref:`Section %s ` to try the CMake build approach instead. - #. Users on a `Level 2-4 `__ system must download and stage data (both the fix files and the :term:`IC/LBC ` files) according to the instructions in :numref:`Section %s `. Standard data locations for Level 1 systems appear in :numref:`Table %s `. + #. Users on a `Level 2-4 `__ system must download and stage data (both the fix files and the :term:`IC/LBC ` files) according to the instructions in :numref:`Section %s `. Standard data locations for Level 1 systems appear in :numref:`Table %s `. #. Load the python environment for the workflow. Users on Level 2-4 systems will need to use one of the existing ``wflow_`` modulefiles (e.g., ``wflow_macos``) and adapt it to their system. Then, run: .. code-block:: console + source /path/to/ufs-srweather-app/etc/lmod-setup.sh module use /path/to/ufs-srweather-app/modulefiles module load wflow_ @@ -62,7 +63,7 @@ For a detailed explanation of how to build and run the SRW App on any supported Please do the following to activate conda: > conda activate workflow_tools - then the user should run ``conda activate workflow_tools`` to activate the workflow environment. + then the user should run |activate| to activate the workflow environment. #. Configure the experiment: @@ -73,7 +74,27 @@ For a detailed explanation of how to build and run the SRW App on any supported cd ush cp config.community.yaml config.yaml - Users will need to open the ``config.yaml`` file and adjust the experiment parameters in it to suit the needs of their experiment (e.g., date, grid, physics suite). At a minimum, users need to modify the ``MACHINE`` parameter. In most cases, users will need to specify the ``ACCOUNT`` parameter and the location of the experiment data (see :numref:`Section %s ` for Level 1 system default locations). Additional changes may be required based on the system and experiment. More detailed guidance is available in :numref:`Section %s `. Parameters and valid values are listed in :numref:`Chapter %s `. + Users will need to open the ``config.yaml`` file and adjust the experiment parameters in it to suit the needs of their experiment (e.g., date, grid, physics suite). At a minimum, users need to modify the ``MACHINE`` parameter. In most cases, users will need to specify the ``ACCOUNT`` parameter and the location of the experiment data (see :numref:`Section %s ` for Level 1 system default locations). + + For example, a user on Gaea might adjust or add the following fields to run the 12-hr "out-of-the-box" case on Gaea using prestaged system data and :term:`cron` to automate the workflow: + + .. 
code-block:: console + + user: + MACHINE: gaea + ACCOUNT: hfv3gfs + workflow: + EXPT_SUBDIR: run_basic_srw + USE_CRON_TO_RELAUNCH: true + CRON_RELAUNCH_INTVL_MNTS: 3 + task_get_extrn_ics: + USE_USER_STAGED_EXTRN_FILES: true + EXTRN_MDL_SOURCE_BASEDIR_ICS: /lustre/f2/dev/role.epic/contrib/UFS_SRW_data/v2p2/input_model_data/FV3GFS/grib2/${yyyymmddhh} + task_get_extrn_lbcs: + USE_USER_STAGED_EXTRN_FILES: true + EXTRN_MDL_SOURCE_BASEDIR_LBCS: /lustre/f2/dev/role.epic/contrib/UFS_SRW_data/v2p2/input_model_data/FV3GFS/grib2/${yyyymmddhh} + + Users on a different system would update the machine, account, and data paths accordingly. Additional changes may be required based on the system and experiment. More detailed guidance is available in :numref:`Section %s `. Parameters and valid values are listed in :numref:`Chapter %s `. #. Generate the experiment workflow. @@ -81,19 +102,28 @@ For a detailed explanation of how to build and run the SRW App on any supported ./generate_FV3LAM_wflow.py - #. Run the workflow from the experiment directory (``$EXPTDIR``). By default, the path to this directory is ``${EXPT_BASEDIR}/${EXPT_SUBDIR}`` (see :numref:`Section %s ` for more detail). There are several methods for running the workflow, which are discussed in :numref:`Section %s `. One possible method is summarized below. It requires the :ref:`Rocoto Workflow Manager `. + #. Run the workflow from the experiment directory (``$EXPTDIR``). By default, the path to this directory is ``${EXPT_BASEDIR}/${EXPT_SUBDIR}`` (see :numref:`Section %s ` for more detail). There are several methods for running the workflow, which are discussed in :numref:`Section %s `. Most require the :ref:`Rocoto Workflow Manager `. For example, if the user automated the workflow using cron, run: + + .. code-block:: console + + cd $EXPTDIR + rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 + + The user can resubmit the ``rocotostat`` command as needed to check the workflow progress. + + If the user has Rocoto but did *not* automate the workflow using cron, run: .. code-block:: console cd $EXPTDIR ./launch_FV3LAM_wflow.sh - To (re)launch the workflow and check the experiment's progress: + To (re)launch the workflow and check the experiment's progress, run: .. code-block:: console ./launch_FV3LAM_wflow.sh; tail -n 40 log.launch_FV3LAM_wflow - The workflow must be relaunched regularly and repeatedly until the log output includes a ``Workflow status: SUCCESS`` message indicating that the experiment has finished. The :term:`cron` utility may be used to automate repeated runs. The last section of the log messages from running ``./generate_FV3LAM_wflow.py`` instruct users how to use that functionality. Users may also refer to :numref:`Section %s ` for instructions. + The workflow must be relaunched regularly and repeatedly until the log output includes a ``Workflow status: SUCCESS`` message indicating that the experiment has finished. -Optionally, users may :ref:`configure their own grid `, instead of using a predefined grid, and/or :ref:`plot the output ` of their experiment(s). +Optionally, users may :ref:`configure their own grid ` or :ref:`vertical levels ` instead of using a predefined grid. Users can also :ref:`plot the output ` of their experiment(s) or :ref:`run verification tasks using METplus `. 
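For the relaunch step above, a simple (illustrative) alternative to cron is to loop manually until the launch log reports a final status. The status strings below follow the ``Workflow status: SUCCESS`` message noted earlier; adjust the paths and interval as needed.

.. code-block:: console

   cd $EXPTDIR
   # Relaunch every three minutes until the launch log reports SUCCESS or FAILURE
   until grep -Eq "Workflow status: *(SUCCESS|FAILURE)" log.launch_FV3LAM_wflow 2>/dev/null; do
       ./launch_FV3LAM_wflow.sh
       sleep 180
   done
   grep "Workflow status" log.launch_FV3LAM_wflow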
diff --git a/docs/UsersGuide/source/BuildingRunningTesting/RunSRW.rst b/docs/UsersGuide/source/BuildingRunningTesting/RunSRW.rst index c7ea16d6b0..9d2b288109 100644 --- a/docs/UsersGuide/source/BuildingRunningTesting/RunSRW.rst +++ b/docs/UsersGuide/source/BuildingRunningTesting/RunSRW.rst @@ -34,7 +34,7 @@ The overall procedure for generating an experiment is shown in :numref:`Figure % .. _AppOverallProc: -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/SRW_run_process.png +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/WorkflowImages/SRW_run_process.png :alt: Flowchart describing the SRW App workflow steps. *Overall Layout of the SRW App Workflow* @@ -47,53 +47,62 @@ Download and Stage the Data The SRW App requires input files to run. These include static datasets, initial and boundary conditions files, and model configuration files. On Level 1 systems, the data required to run SRW App tests are already available in the following locations: .. _DataLocations: -.. table:: Data Locations for Level 1 Systems - - +--------------+------------------------------------------------------------------------------+ - | Machine | File location | - +==============+==============================================================================+ - | Cheyenne | /glade/work/epicufsrt/contrib/UFS_SRW_data/develop/input_model_data/ | - +--------------+------------------------------------------------------------------------------+ - | Gaea | /lustre/f2/dev/role.epic/contrib/UFS_SRW_data/develop/input_model_data/ | - +--------------+------------------------------------------------------------------------------+ - | Hera | /scratch1/NCEPDEV/nems/role.epic/UFS_SRW_data/develop/input_model_data/ | - +--------------+------------------------------------------------------------------------------+ - | Jet | /mnt/lfs4/HFIP/hfv3gfs/role.epic/UFS_SRW_data/develop/input_model_data/ | - +--------------+------------------------------------------------------------------------------+ - | NOAA Cloud | /contrib/EPIC/UFS_SRW_data/develop/input_model_data/ | - +--------------+------------------------------------------------------------------------------+ - | Orion | /work/noaa/epic-ps/role-epic-ps/UFS_SRW_data/develop/input_model_data/ | - +--------------+------------------------------------------------------------------------------+ - | WCOSS2 | /lfs/h2/emc/lam/noscrub/UFS_SRW_App/develop/input_model_data/ | - +--------------+------------------------------------------------------------------------------+ - -For Level 2-4 systems, the data must be added to the user's system. Detailed instructions on how to add the data can be found in :numref:`Section %s: Downloading and Staging Input Data `. Sections :numref:`%s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW App. +.. 
list-table:: Data Locations for Level 1 Systems + :widths: 20 50 + :header-rows: 1 + + * - Machine + - File location + * - Derecho + - /glade/work/epicufsrt/contrib/UFS_SRW_data/|data|/input_model_data + * - Gaea (C3/C4/C5) + - /lustre/f2/dev/role.epic/contrib/UFS_SRW_data/|data|/input_model_data/ + * - Hera + - /scratch1/NCEPDEV/nems/role.epic/UFS_SRW_data/|data|/input_model_data/ + * - Hercules + - /work/noaa/epic/role-epic/contrib/UFS_SRW_data/|data|/input_model_data/ + * - Jet + - /mnt/lfs4/HFIP/hfv3gfs/role.epic/UFS_SRW_data/|data|/input_model_data/ + * - NOAA Cloud + - /contrib/EPIC/UFS_SRW_data/|data|/input_model_data/ + * - Orion + - /work/noaa/epic/role-epic/contrib/UFS_SRW_data/|data|/input_model_data/ + * - WCOSS2 + - /lfs/h2/emc/lam/noscrub/UFS_SRW_App/develop/input_model_data/ + +For Level 2-4 systems, the data must be added to the user's system. Detailed instructions on how to add the data can be found in :numref:`Section %s: Downloading and Staging Input Data `. Sections :numref:`%s: Input Files ` and :numref:`%s: Output Files ` contain useful background information on the input and output files used in the SRW App. .. _GridSpecificConfig: Grid Configuration ======================= -The SRW App officially supports the four predefined grids shown in :numref:`Table %s `. The out-of-the-box SRW App case uses the ``RRFS_CONUS_25km`` predefined grid option. More information on the predefined and user-generated grid options can be found in :numref:`Section %s: Limited Area Model (LAM) Grids `. Users who plan to utilize one of the four predefined domain (grid) options may continue to the next step (:numref:`Step %s: Generate the Forecast Experiment `). Users who plan to create a new custom predefined grid should refer to the instructions in :numref:`Section %s: Creating User-Generated Grids `. At a minimum, these users will need to add the new grid name to the ``valid_param_vals.yaml`` file and add the corresponding grid-specific parameters in the ``predef_grid_params.yaml`` file. +The SRW App officially supports the five predefined grids shown in :numref:`Table %s `. The out-of-the-box SRW App case uses the ``RRFS_CONUS_25km`` predefined grid option. More information on the predefined and user-generated grid options can be found in :numref:`Section %s: Limited Area Model (LAM) Grids `. Users who plan to utilize one of the five predefined domain (grid) options may continue to the next step (:numref:`Step %s: Generate the Forecast Experiment `). Users who plan to create a new custom predefined grid should refer to the instructions in :numref:`Section %s: Creating User-Generated Grids `. At a minimum, these users will need to add the new grid name to the ``valid_param_vals.yaml`` file and add the corresponding grid-specific parameters in the ``predef_grid_params.yaml`` file. .. _PredefinedGrids: +.. list-table:: Predefined Grids Supported in the SRW App + :widths: 30 30 30 + :header-rows: 1 -.. 
table:: Predefined Grids Supported in the SRW App - - +----------------------+-------------------+--------------------------------+ - | **Grid Name** | **Grid Type** | **Quilting (write component)** | - +======================+===================+================================+ - | RRFS_CONUS_25km | ESG grid | lambert_conformal | - +----------------------+-------------------+--------------------------------+ - | RRFS_CONUS_13km | ESG grid | lambert_conformal | - +----------------------+-------------------+--------------------------------+ - | RRFS_CONUS_3km | ESG grid | lambert_conformal | - +----------------------+-------------------+--------------------------------+ - | SUBCONUS_Ind_3km | ESG grid | lambert_conformal | - +----------------------+-------------------+--------------------------------+ - -.. COMMENT: Revisit before SRW w/RRFS release - + * - Grid Name + - Grid Type + - Quilting (write component) + * - RRFS_CONUS_25km + - ESG grid + - lambert_conformal + * - RRFS_CONUS_13km + - ESG grid + - lambert_conformal + * - RRFS_CONUS_3km + - ESG grid + - lambert_conformal + * - SUBCONUS_Ind_3km + - ESG grid + - lambert_conformal + * - RRFS_NA_13km + - ESG grid + - lambert_conformal + .. _GenerateForecast: Generate the Forecast Experiment @@ -111,7 +120,7 @@ The first two steps depend on the platform being used and are described here for Load the Conda/Python Environment ------------------------------------ -The SRW App workflow is often referred to as the *regional workflow* because it runs experiments on a regional scale (unlike the *global workflow* used in other applications). The SRW App workflow requires installation of Python3 using conda; it also requires additional packages built in a separate conda evironment named ``workflow_tools``. On Level 1 systems, a ``workflow_tools`` environment already exists, and users merely need to load the environment. On Level 2-4 systems, users must create and then load the environment. The process for each is described in detail below. +The SRW App workflow is often referred to as the *regional workflow* because it runs experiments on a regional scale (unlike the *global workflow* used in other applications). The SRW App workflow requires installation of Python3 using conda; it also requires additional packages built in a separate conda evironment named |wflow_env|. On Level 1 systems, a |wflow_env| environment already exists, and users merely need to load the environment. On Level 2-4 systems, users must create and then load the environment. The process for each is described in detail below. .. _Load-WF-L1: @@ -122,12 +131,12 @@ Loading the Workflow Environment on Level 1 Systems Users on a Level 2-4 system should skip to the :ref:`next section ` for instructions. -The ``workflow_tools`` conda/Python environment has already been set up on Level 1 platforms and can be activated in the following way: +The |wflow_env| conda/Python environment has already been set up on Level 1 platforms and can be activated in the following way: .. code-block:: console - source /path/to/etc/lmod-setup.sh - module use /path/to/modulefiles + source /path/to/ufs-srweather-app/etc/lmod-setup.sh + module use /path/to/ufs-srweather-app/modulefiles module load wflow_ where ```` refers to a valid machine name (see :numref:`Section %s ` for ``MACHINE`` options). In a csh shell environment, users should replace ``lmod-setup.sh`` with ``lmod-setup.csh``. 
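For example, on Hera (chosen here purely for illustration; substitute the machine name and the path to the cloned repository for other Level 1 systems), the sequence might look like:

.. code-block:: console

   source /path/to/ufs-srweather-app/etc/lmod-setup.sh
   module use /path/to/ufs-srweather-app/modulefiles
   module load wflow_hera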
@@ -142,7 +151,7 @@ The ``wflow_`` modulefile will then output instructions to activate th Please do the following to activate conda: > conda activate workflow_tools -then the user should run ``conda activate workflow_tools``. This activates the ``workflow_tools`` conda environment, and the user typically sees ``(workflow_tools)`` in front of the Terminal prompt at this point. +then the user should run |activate|. This activates the |wflow_env| conda environment, and the user typically sees |prompt| in front of the Terminal prompt at this point. After loading the workflow environment, users may continue to :numref:`Section %s ` for instructions on setting the experiment configuration parameters. @@ -176,15 +185,14 @@ MacOS requires the installation of a few additional packages and, possibly, an u bash --version brew install bash # or: brew upgrade bash brew install coreutils - brew gsed # follow directions to update the PATH env variable - + brew install gsed # follow directions to update the PATH env variable .. _LinuxMacVEnv: -Creating the ``workflow_tools`` Environment on Linux and Mac OS +Creating the |wflow_env| Environment on Linux and Mac OS """"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""" -On generic Mac and Linux systems, users need to create a conda ``workflow_tools`` environment. The environment can be stored in a local path, which could be a default location or a user-specified location (e.g., ``$HOME/condaenv/venvs/`` directory). (To determine the default location, use the ``conda info`` command, and look for the ``envs directories`` list.) The following is a brief recipe for creating a virtual conda environment on non-Level 1 platforms. It uses the aarch64 (64-bit ARM) Miniforge for Linux and installs into $HOME/conda. Adjust as necessary for your target system. +On generic Mac and Linux systems, users need to create a conda |wflow_env| environment. The environment can be stored in a local path, which could be a default location or a user-specified location (e.g., ``$HOME/condaenv/venvs/`` directory). (To determine the default location, use the ``conda info`` command, and look for the ``envs directories`` list.) The following is a brief recipe for creating a virtual conda environment on non-Level 1 platforms. It uses the aarch64 (64-bit ARM) Miniforge for Linux and installs into $HOME/conda. Adjust as necessary for your target system. .. code-block:: console @@ -204,24 +212,24 @@ In future shells, you can activate and use this environment with: .. code-block:: console source ~/conda/etc/profile.d/conda.sh - conda activate uwtools + conda activate workflow_tools See the `workflow-tools repository `__ for additional documentation. Modify a ``wflow_`` File `````````````````````````````````````` -Users can copy one of the provided ``wflow_`` files from the ``modulefiles`` directory and use it as a template to create a ``wflow_`` file that functions on their system. The ``wflow_macos`` and ``wflow_linux`` template modulefiles are provided as a starting point, but any ``wflow_`` file could be used. Users must modify the files to provide paths for python, miniconda modules, module loads, conda initialization, and the user's ``workflow_tools`` conda environment. +Users can copy one of the provided ``wflow_`` files from the ``modulefiles`` directory and use it as a template to create a ``wflow_`` file that functions on their system. The ``wflow_macos`` and ``wflow_linux`` template modulefiles are provided as a starting point, but any ``wflow_`` file could be used. 
Users must modify the files to provide paths for python, miniconda modules, module loads, conda initialization, and the user's |wflow_env| conda environment. Load the Workflow Environment ``````````````````````````````` -After creating a ``workflow_tools`` environment and making modifications to a ``wflow_`` file, users can run the commands below to activate the workflow environment: +After creating a |wflow_env| environment and making modifications to a ``wflow_`` file, users can run the commands below to activate the workflow environment: .. code-block:: console - source /path/to/etc/lmod-setup.sh - module use /path/to/modulefiles + source /path/to/ufs-srweather-app/etc/lmod-setup.sh + module use /path/to/ufs-srweather-app/modulefiles module load wflow_ where ```` refers to a valid machine name (i.e., ``linux`` or ``macos``). @@ -236,10 +244,10 @@ The ``wflow_`` modulefile will then output the following instructions: Please do the following to activate conda: > conda activate workflow_tools -After running ``conda activate workflow_tools``, the user will typically see ``(workflow_tools)`` in front of the Terminal prompt. This indicates that the workflow environment has been loaded successfully. +After running |activate|, the user will typically see |prompt| in front of the Terminal prompt. This indicates that the workflow environment has been loaded successfully. .. note:: - ``conda`` needs to be initialized before running ``conda activate workflow_tools`` command. Depending on the user's system and login setup, this may be accomplished in a variety of ways. Conda initialization usually involves the following command: ``source /etc/profile.d/conda.sh``, where ```` is the base conda installation directory. + ``conda`` needs to be initialized before running the |activate| command. Depending on the user's system and login setup, this may be accomplished in a variety of ways. Conda initialization usually involves the following command: ``source /etc/profile.d/conda.sh``, where ```` is the base conda installation directory. .. _ExptConfig: @@ -253,75 +261,102 @@ Each experiment requires certain basic information to run (e.g., date, grid, phy Default configuration: ``config_defaults.yaml`` ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -In general, ``config_defaults.yaml`` is split into sections by category (e.g., ``user:``, ``platform:``, ``workflow:``, ``task_make_grid:``). Users can view a full list of categories and configuration parameters in the :doc:`Table of Variables in config_defaults.yaml `. Definitions and default values of each of the variables can be found in the ``config_defaults.yaml`` file comments and in :numref:`Section %s: Workflow Parameters `. Some of these default values are intentionally invalid in order to ensure that the user assigns valid values in their ``config.yaml`` file. There is usually no need for a user to modify ``config_defaults.yaml`` because any settings provided in ``config.yaml`` will override the settings in ``config_defaults.yaml``. +In general, ``config_defaults.yaml`` is split into sections by category (e.g., ``user:``, ``platform:``, ``workflow:``, ``task_make_grid:``). Users can view a full list of categories and configuration parameters in the :doc:`Table of Variables in config_defaults.yaml `. Definitions and default values of each of the variables can be found in :numref:`Section %s: Workflow Parameters ` and in the ``config_defaults.yaml`` file comments. 
Some of these default values are intentionally invalid in order to ensure that the user assigns valid values in their ``config.yaml`` file. There is usually no need for a user to modify ``config_defaults.yaml`` because any settings provided in ``config.yaml`` will override the settings in ``config_defaults.yaml``. .. _UserSpecificConfig: User-specific configuration: ``config.yaml`` ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -The user must set the specifics of their experiment configuration in a ``config.yaml`` file located in the ``ufs-srweather-app/ush`` directory. Two example templates are provided in that directory: ``config.community.yaml`` and ``config.nco.yaml``. The first file is a basic example for creating and running an experiment in *community* mode (with ``RUN_ENVIR`` set to ``community``). The second is an example for creating and running an experiment in the *NCO* (operational) mode (with ``RUN_ENVIR`` set to ``nco``). The *community* mode is recommended in most cases, and user support is available for running in community mode. The operational/NCO mode is typically used by developers at the Environmental Modeling Center (:term:`EMC`) and the Global Systems Laboratory (:term:`GSL`) who are working on pre-implementation testing for the Rapid Refresh Forecast System (:term:`RRFS`). :numref:`Table %s ` compares the configuration variables that appear in the ``config.community.yaml`` with their default values in ``config_default.yaml``. +The user must set the specifics of their experiment configuration in a ``config.yaml`` file located in the ``ufs-srweather-app/ush`` directory. Two example templates are provided in that directory: ``config.community.yaml`` and ``config.nco.yaml``. The first file is a basic example for creating and running an experiment in *community* mode (with ``RUN_ENVIR`` set to ``community``). The second is an example for creating and running an experiment in the *NCO* (operational) mode (with ``RUN_ENVIR`` set to ``nco``). The *community* mode is recommended in most cases, and user support is available for running in community mode. The operational/NCO mode is typically used by developers at the Environmental Modeling Center (:term:`EMC`) and the Global Systems Laboratory (:term:`GSL`) who are working on pre-implementation testing for the Rapid Refresh Forecast System (:term:`RRFS`). :numref:`Table %s ` compares the configuration variables that appear in the ``config.community.yaml`` with their default values in ``config_defaults.yaml``. .. _ConfigCommunity: +.. list-table:: Configuration variables specified in the *config.community.yaml* script + :widths: 30 30 30 + :header-rows: 1 -.. 
table:: Configuration variables specified in the config.community.yaml script - - +--------------------------------+-------------------+------------------------------------+ - | **Parameter** | **Default Value** | **config.community.yaml Value** | - +================================+===================+====================================+ - | RUN_ENVIR | "nco" | "community" | - +--------------------------------+-------------------+------------------------------------+ - | MACHINE | "BIG_COMPUTER" | "hera" | - +--------------------------------+-------------------+------------------------------------+ - | ACCOUNT | "" | "an_account" | - +--------------------------------+-------------------+------------------------------------+ - | CCPA_OBS_DIR | "" | "" | - +--------------------------------+-------------------+------------------------------------+ - | NOHRSC_OBS_DIR | "" | "" | - +--------------------------------+-------------------+------------------------------------+ - | MRMS_OBS_DIR | "" | "" | - +--------------------------------+-------------------+------------------------------------+ - | NDAS_OBS_DIR | "" | "" | - +--------------------------------+-------------------+------------------------------------+ - | USE_CRON_TO_RELAUNCH | false | false | - +--------------------------------+-------------------+------------------------------------+ - | EXPT_SUBDIR | "" | "test_community" | - +--------------------------------+-------------------+------------------------------------+ - | CCPP_PHYS_SUITE | "FV3_GFS_v16" | "FV3_GFS_v16" | - +--------------------------------+-------------------+------------------------------------+ - | PREDEF_GRID_NAME | "" | "RRFS_CONUS_25km" | - +--------------------------------+-------------------+------------------------------------+ - | DATE_FIRST_CYCL | "YYYYMMDDHH" | '2019061518' | - +--------------------------------+-------------------+------------------------------------+ - | DATE_LAST_CYCL | "YYYYMMDDHH" | '2019061518' | - +--------------------------------+-------------------+------------------------------------+ - | FCST_LEN_HRS | 24 | 12 | - +--------------------------------+-------------------+------------------------------------+ - | PREEXISTING_DIR_METHOD | "delete" | "rename" | - +--------------------------------+-------------------+------------------------------------+ - | VERBOSE | true | true | - +--------------------------------+-------------------+------------------------------------+ - | COMPILER | "intel" | "intel" | - +--------------------------------+-------------------+------------------------------------+ - | EXTRN_MDL_NAME_ICS | "FV3GFS" | "FV3GFS" | - +--------------------------------+-------------------+------------------------------------+ - | FV3GFS_FILE_FMT_ICS | "nemsio" | "grib2" | - +--------------------------------+-------------------+------------------------------------+ - | EXTRN_MDL_NAME_LBCS | "FV3GFS" | "FV3GFS" | - +--------------------------------+-------------------+------------------------------------+ - | LBC_SPEC_INTVL_HRS | 6 | 6 | - +--------------------------------+-------------------+------------------------------------+ - | FV3GFS_FILE_FMT_LBCS | "nemsio" | "grib2" | - +--------------------------------+-------------------+------------------------------------+ - | QUILTING | true | true | - +--------------------------------+-------------------+------------------------------------+ - | COMOUT_REF | "" | "" | - +--------------------------------+-------------------+------------------------------------+ - | 
DO_ENSEMBLE | false | false | - +--------------------------------+-------------------+------------------------------------+ - | NUM_ENS_MEMBERS | 1 | 2 | - +--------------------------------+-------------------+------------------------------------+ - + * - Parameter + - Default Value + - *config.community.yaml* Value + * - RUN_ENVIR + - "nco" + - "community" + * - MACHINE + - "BIG_COMPUTER" + - "hera" + * - ACCOUNT + - "" + - "an_account" + * - CCPA_OBS_DIR + - "{{ workflow.EXPTDIR }}/obs_data/ccpa/proc" + - "" + * - MRMS_OBS_DIR + - "{{ workflow.EXPTDIR }}/obs_data/mrms/proc" + - "" + * - NDAS_OBS_DIR + - "{{ workflow.EXPTDIR }}/obs_data/ndas/proc" + - "" + * - USE_CRON_TO_RELAUNCH + - false + - false + * - EXPT_SUBDIR + - "" + - "test_community" + * - CCPP_PHYS_SUITE + - "FV3_GFS_v16" + - "FV3_GFS_v16" + * - PREDEF_GRID_NAME + - "" + - "RRFS_CONUS_25km" + * - DATE_FIRST_CYCL + - "YYYYMMDDHH" + - '2019061518' + * - DATE_LAST_CYCL + - "YYYYMMDDHH" + - '2019061518' + * - FCST_LEN_HRS + - 24 + - 12 + * - PREEXISTING_DIR_METHOD + - "delete" + - "rename" + * - VERBOSE + - true + - true + * - COMPILER + - "intel" + - "intel" + * - EXTRN_MDL_NAME_ICS + - "FV3GFS" + - "FV3GFS" + * - FV3GFS_FILE_FMT_ICS + - "nemsio" + - "grib2" + * - EXTRN_MDL_NAME_LBCS + - "FV3GFS" + - "FV3GFS" + * - LBC_SPEC_INTVL_HRS + - 6 + - 6 + * - FV3GFS_FILE_FMT_LBCS + - "nemsio" + - "grib2" + * - QUILTING + - true + - true + * - COMOUT_REF + - "" + - "" + * - DO_ENSEMBLE + - false + - false + * - NUM_ENS_MEMBERS + - 1 + - 2 + * - VX_FCST_MODEL_NAME + - '{{ nco.NET_default }}.{{ task_run_post.POST_OUTPUT_DOMAIN_NAME }}' + - FV3_GFS_v16_CONUS_25km + .. _GeneralConfig: General Instructions for All Systems @@ -347,10 +382,10 @@ Next, users should edit the new ``config.yaml`` file to customize it for their m EXPT_SUBDIR: test_community task_get_extrn_ics: USE_USER_STAGED_EXTRN_FILES: true - EXTRN_MDL_SOURCE_BASEDIR_ICS: "/path/to/UFS_SRW_App/develop/input_model_data///" + EXTRN_MDL_SOURCE_BASEDIR_ICS: /path/to/UFS_SRW_data/v2p2/input_model_data///${yyyymmddhh} task_get_extrn_lbcs: USE_USER_STAGED_EXTRN_FILES: true - EXTRN_MDL_SOURCE_BASEDIR_LBCS: "/path/to/UFS_SRW_App/develop/input_model_data///" + EXTRN_MDL_SOURCE_BASEDIR_LBCS: /path/to/UFS_SRW_data/v2p2/input_model_data///${yyyymmddhh} where: * ``MACHINE`` refers to a valid machine name (see :numref:`Section %s ` for options). @@ -358,13 +393,16 @@ where: .. hint:: - To determine an appropriate ACCOUNT field for Level 1 systems, run ``groups``, and it will return a list of projects you have permissions for. Not all of the listed projects/groups have an HPC allocation, but those that do are potentially valid account names. + * To determine an appropriate ACCOUNT field for Level 1 systems, run ``groups``, and it will return a list of projects you have permissions for. Not all of the listed projects/groups have an HPC allocation, but those that do are potentially valid account names. + * Users can also run ``saccount_params``, which provides more information but is not available on all systems. + + .. COMMENT: Check this (above)! * ``EXPT_SUBDIR`` is changed to an experiment name of the user's choice. * ``/path/to/`` is the path to the SRW App data on the user's machine (see :numref:`Section %s ` for data locations on Level 1 systems). * ```` refers to a subdirectory containing the experiment data from a particular model. 
Valid values on Level 1 systems correspond to the valid values for ``EXTRN_MDL_NAME_ICS`` and ``EXTRN_MDL_NAME_LBCS`` (see :numref:`Section %s ` or :numref:`%s ` for options). * ```` refers to one of 3 possible data formats: ``grib2``, ``nemsio``, or ``netcdf``. - * ```` refers to a subdirectory containing data for the :term:`cycle` date (in YYYYMMDDHH format). + * ``${yyyymmddhh}`` refers to a subdirectory containing data for the :term:`cycle` date (in YYYYMMDDHH format). Users may hardcode this value or leave it as-is, and the experiment will derive the correct value from ``DATE_FIRST_CYCL`` and related information. On platforms where Rocoto and :term:`cron` are available, users can automate resubmission of their experiment workflow by adding the following lines to the ``workflow:`` section of the ``config.yaml`` file: @@ -396,37 +434,82 @@ For example, to run the out-of-the-box experiment on Gaea using cron to automate CRON_RELAUNCH_INTVL_MNTS: 3 task_get_extrn_ics: USE_USER_STAGED_EXTRN_FILES: true - EXTRN_MDL_SOURCE_BASEDIR_ICS: /lustre/f2/dev/role.epic/contrib/UFS_SRW_data/develop/input_model_data/FV3GFS/grib2/2019061518 + EXTRN_MDL_SOURCE_BASEDIR_ICS: /lustre/f2/dev/role.epic/contrib/UFS_SRW_data/v2p2/input_model_data/FV3GFS/grib2/2019061518 task_get_extrn_lbcs: USE_USER_STAGED_EXTRN_FILES: true - EXTRN_MDL_SOURCE_BASEDIR_LBCS: /lustre/f2/dev/role.epic/contrib/UFS_SRW_data/develop/input_model_data/FV3GFS/grib2/2019061518 + EXTRN_MDL_SOURCE_BASEDIR_LBCS: /lustre/f2/dev/role.epic/contrib/UFS_SRW_data/v2p2/input_model_data/FV3GFS/grib2/2019061518 -To determine whether the ``config.yaml`` file adjustments are valid, users can run the following script from the ``ush`` directory: +.. hint:: -.. code-block:: console + * Valid values for configuration variables should be consistent with those in the ``ush/valid_param_vals.yaml`` script. + + * Various sample configuration files can be found within the subdirectories of ``tests/WE2E/test_configs``. + + * Users can find detailed information on configuration parameter options in :numref:`Section %s: Configuring the Workflow `. + +.. _ConfigTasks: - ./config_utils.py -c config.yaml -v config_defaults.yaml -k "(?\!rocoto\b)" +Turning On/Off Workflow Tasks +```````````````````````````````` + +The ``ufs-srweather-app/parm/wflow`` directory contains several ``YAML`` files that configure different workflow task groups. Each task group file contains a number of tasks that are typically run together. :numref:`Table %s ` describes each of the task groups. + +.. _task-group-files: + +.. list-table:: Task Group Files + :widths: 20 50 + :header-rows: 1 + + * - File + - Function + * - aqm_post.yaml + - SRW-AQM post-processing tasks + * - aqm_prep.yaml + - SRW-AQM pre-processing tasks + * - coldstart.yaml + - Tasks required to run a cold-start forecast + * - default_workflow.yaml + - Sets the default workflow (prep.yaml, coldstart.yaml, post.yaml) + * - plot.yaml + - Plotting tasks + * - post.yaml + - Post-processing tasks + * - prdgen.yaml + - Horizontal map projection processor that creates smaller domain products from the larger domain created by the UPP. + * - prep.yaml + - Pre-processing tasks + * - verify_det.yaml + - Deterministic verification tasks + * - verify_ens.yaml + - Ensemble verification tasks + * - verify_pre.yaml + - Verification pre-processing tasks -A correct ``config.yaml`` file will output a ``SUCCESS`` message. A ``config.yaml`` file with problems will output a ``FAILURE`` message describing the problem. 
For example: +The default workflow task groups are set in ``parm/wflow/default_workflow.yaml`` and include ``prep.yaml``, ``coldstart.yaml``, and ``post.yaml``. To turn on/off tasks in the workflow, users must alter the list of task groups in the ``rocoto: tasks: taskgroups:`` section of ``config.yaml``. The list in ``config.yaml`` will override the default and run only the task groups listed. For example, to omit :term:`cycle-independent` tasks and run plotting tasks, users would delete ``prep.yaml`` from the list of tasks and add ``plot.yaml``: .. code-block:: console - INVALID ENTRY: EXTRN_MDL_FILES_ICS=[] - FAILURE + rocoto: + tasks: + taskgroups: '{{ ["parm/wflow/coldstart.yaml", "parm/wflow/post.yaml", "parm/wflow/plot.yaml"]|include }}' -.. hint:: +Users may need to make additional adjustments to ``config.yaml`` depending on which task groups they add or remove. For example, when plotting, the user should add the plotting increment (``PLOT_FCST_INC``) for the plotting tasks in ``task_plot_allvars`` (see :numref:`Section %s ` on plotting). - * The ``workflow_tools`` environment must be loaded for the ``config_utils.py`` script to validate the ``config.yaml`` file. +Users can omit specific tasks from a task group by including them under the list of tasks as an empty entry. For example, if a user wanted to run only ``task_pre_post_stat`` from ``aqm_post.yaml``, the taskgroups list would include ``aqm_post.yaml``, and the tasks that the user wanted to omit would be listed with no value: - * Valid values for configuration variables should be consistent with those in the ``ush/valid_param_vals.yaml`` script. - - * Various sample configuration files can be found within the subdirectories of ``tests/WE2E/test_configs``. +.. code-block:: console - * Users can find detailed information on configuration parameter options in :numref:`Section %s: Configuring the Workflow `. + rocoto: + tasks: + taskgroups: '{{ ["parm/wflow/prep.yaml", "parm/wflow/coldstart.yaml", "parm/wflow/post.yaml", "parm/wflow/aqm_post.yaml"]|include }}' + task_post_stat_o3: + task_post_stat_pm25: + task_bias_correction_o3: + task_bias_correction_pm25: **Next Steps:** - * To configure an experiment for a general Linux or Mac system, see the :ref:`next section ` for additional required steps. + * To configure an experiment for a general Linux or Mac system, see the :ref:`next section ` for additional required steps. * To add the graphics plotting tasks to the experiment workflow, go to section :numref:`Section %s: Plotting Configuration `. * To configure an experiment to run METplus verification tasks, see :numref:`Section %s `. * Otherwise, skip to :numref:`Section %s ` to generate the workflow. @@ -489,7 +572,7 @@ For a machine with 4 CPUs, the following domain decomposition could be used: **Configure the Machine File** -Configure a ``macos.yaml`` or ``linux.yaml`` machine file in ``ufs-srweather-app/ush/machine`` based on the number of CPUs (``NCORES_PER_NODE``) in the system (usually 8 or 4 in MacOS; varies on Linux systems). Job scheduler (``SCHED``) options can be viewed :ref:`here `. Users must also set the path to the fix file directories. +Configure the ``macos.yaml`` or ``linux.yaml`` machine file in ``ufs-srweather-app/ush/machine`` based on the number of CPUs (``NCORES_PER_NODE``) in the system (usually 8 or 4 in MacOS; varies on Linux systems). Job scheduler (``SCHED``) options can be viewed :ref:`here `. Users must also set the path to the fix file directories. .. 
code-block:: console @@ -505,24 +588,25 @@ Configure a ``macos.yaml`` or ``linux.yaml`` machine file in ``ufs-srweather-app RUN_CMD_UTILS: 'mpirun -np 4' # Commands to run at the start of each workflow task. PRE_TASK_CMDS: '{ ulimit -a; }' + FIXaer: /path/to/FIXaer/files + FIXgsm: /path/to/FIXgsm/files + FIXlut: /path/to/FIXlut/files - task_make_orog: # Path to location of static input files used by the make_orog task - FIXorg: path/to/FIXorg/files + FIXorg: path/to/FIXorg/files - task_make_sfc_climo: # Path to location of static surface climatology input fields used by sfc_climo_gen - FIXsfc: path/to/FIXsfc/files + FIXsfc: path/to/FIXsfc/files - task_run_fcst: - FIXaer: /path/to/FIXaer/files - FIXgsm: /path/to/FIXgsm/files - FIXlut: /path/to/FIXlut/files + #Path to location of NaturalEarth shapefiles used for plotting + FIXshp: /Users/username/DATA/UFS/NaturalEarth data: # Used by setup.py to set the values of EXTRN_MDL_SOURCE_BASEDIR_ICS and EXTRN_MDL_SOURCE_BASEDIR_LBCS FV3GFS: /Users/username/DATA/UFS/FV3GFS +.. COMMENT: Check this section (above) with Natalie! + The ``data:`` section of the machine file can point to various data sources that the user has pre-staged on disk. For example: .. code-block:: console @@ -574,7 +658,7 @@ The Python plotting tasks require a path to the directory where the Cartopy Natu Task Configuration ````````````````````` -Users will need to add or modify certain variables in ``config.yaml`` to run the plotting task(s). At a minimum, to activate the ``plot_allvars`` tasks, users must add the task yaml file to the default list of ``taskgroups`` under the ``rocoto: tasks:`` section. +Users will need to add or modify certain variables in ``config.yaml`` to run the plotting task(s). At a minimum, to activate the ``plot_allvars`` tasks, users must add the task's ``.yaml`` file to the default list of ``taskgroups`` under the ``rocoto: tasks:`` section. .. code-block:: console @@ -643,7 +727,7 @@ To use METplus verification, MET and METplus modules need to be installed. To t tasks: taskgroups: '{{ ["parm/wflow/prep.yaml", "parm/wflow/coldstart.yaml", "parm/wflow/post.yaml", "parm/wflow/verify_pre.yaml", "parm/wflow/verify_det.yaml"]|include }}' -:numref:`Table %s ` indicates which functions each ``verify_*.yaml`` file configures. Users must add ``verify_pre.yaml`` anytime they want to do VX; it runs preprocessing tasks that are necessary for both deterministic and ensemble VX. Then users can add ``verify_det.yaml`` for deterministic VX or ``verify_ens.yaml`` for ensemble VX (or both). Note that ensemble VX requires the user to be running an ensemble forecast or to stage ensemble forecast files in an appropriate location. +:numref:`Table %s ` indicates which functions each ``verify_*.yaml`` file configures. Users must add ``verify_pre.yaml`` anytime they want to do verification (VX); it runs preprocessing tasks that are necessary for both deterministic and ensemble VX. Then users can add ``verify_det.yaml`` for deterministic VX or ``verify_ens.yaml`` for ensemble VX (or both). Note that ensemble VX requires the user to be running an ensemble forecast or to stage ensemble forecast files in an appropriate location. .. _VX-yamls: @@ -682,10 +766,12 @@ Users who have already staged the observation data needed for METplus (i.e., the .. 
code-block:: console platform: - CCPA_OBS_DIR: /path/to/UFS_SRW_App/develop/obs_data/ccpa/proc - NOHRSC_OBS_DIR: /path/to/UFS_SRW_App/develop/obs_data/nohrsc/proc - MRMS_OBS_DIR: /path/to/UFS_SRW_App/develop/obs_data/mrms/proc - NDAS_OBS_DIR: /path/to/UFS_SRW_App/develop/obs_data/ndas/proc + CCPA_OBS_DIR: /path/to/UFS_SRW_data/v2p2/obs_data/ccpa/proc + NOHRSC_OBS_DIR: /path/to/UFS_SRW_data/v2p2/obs_data/nohrsc/proc + MRMS_OBS_DIR: /path/to/UFS_SRW_data/v2p2/obs_data/mrms/proc + NDAS_OBS_DIR: /path/to/UFS_SRW_data/v2p2/obs_data/ndas/proc + +After adding the VX tasks to the ``rocoto:`` section and the data paths to the ``platform:`` section, users can proceed to generate the experiment, which will perform VX tasks in addition to the default workflow tasks. .. _GenerateWorkflow: @@ -702,22 +788,23 @@ The last line of output from this script, starting with ``*/1 * * * *`` or ``*/3 This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. The flowchart in :numref:`Figure %s ` describes the experiment generation process. The ``generate_FV3LAM_wflow.py`` script: - #. Runs the ``setup.py`` script to set the configuration parameters. This script reads three other configuration scripts in order: + #. Runs the ``setup.py`` script to set the configuration parameters. This script reads four other configuration scripts in order: a. ``config_defaults.yaml`` (:numref:`Section %s `) - b. ``config.yaml`` (:numref:`Section %s `), and - c. ``set_predef_grid_params.py``. + b. ``${machine}.yaml`` (the machine configuration file) + c. ``config.yaml`` (:numref:`Section %s `) + d. ``valid_param_vals.yaml`` #. Symlinks the time-independent (fix) files and other necessary data input files from their location to the experiment directory (``$EXPTDIR``). #. Creates the input namelist file ``input.nml`` based on the ``input.nml.FV3`` file in the ``parm`` directory. #. Creates the workflow XML file ``FV3LAM_wflow.xml`` that is executed when running the experiment with the Rocoto workflow manager. -The generated workflow will appear in ``$EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}``. These variables were specified in ``config_defaults.yaml`` and ``config.yaml`` in :numref:`Step %s `. The settings for these paths can also be viewed in the console output from the ``./generate_FV3LAM_wflow.py`` script or in the ``log.generate_FV3LAM_wflow`` file, which can be found in ``$EXPTDIR``. +The generated workflow will appear in ``$EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}``; these variables were specified in ``config_defaults.yaml`` and ``config.yaml`` in :numref:`Step %s `. The settings for these directory paths can also be viewed in the console output from the ``./generate_FV3LAM_wflow.py`` script or in the ``log.generate_FV3LAM_wflow`` file, which can be found in ``$EXPTDIR``. .. _WorkflowGeneration: -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/SRW_regional_workflow_gen.png - :alt: Flowchart of the workflow generation process. Scripts are called in the following order: source_util_funcs.sh (which calls bash_utils), then set_FV3nml_sfc_climo_filenames.py, set_FV3nml_ens_stoch_seeds.py, create_diag_table_file.py, and setup.py. 
setup.py reads several yaml configuration files (config_defaults.yaml, config.yaml, {machine_config}.yaml, valid_param_vals.yaml, and others) and calls several scripts: set_cycle_dates.py, set_grid_params_GFDLgrid.py, set_grid_params_ESGgrid.py, link_fix.py, and set_ozone_param.py. Then, it sets a number of variables, including FIXgsm, TOPO_DIR, and SFC_CLIMO_INPUT_DIR variables. Next, set_predef_grid_params.py is called, and the FIXam and FIXLAM directories are set, along with the forecast input files. The setup script also calls set_extrn_mdl_params.py, sets the GRID_GEN_METHOD with HALO, checks various parameters, and generates shell scripts. Then, the workflow generation script produces a YAML configuration file and generates the actual Rocoto workflow XML file from the template file (by calling uwtools set_template). The workflow generation script checks the crontab file and, if applicable, copies certain fix files to the experiment directory. Then, it copies templates of various input files to the experiment directory and sets parameters for the input.nml file. Finally, it generates the workflow. Additional information on each step appears in comments within each script. +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/WorkflowImages/SRW_regional_workflow_gen.png + :alt: Flowchart of the workflow generation process. Scripts are called in the following order: source_util_funcs.sh (which calls bash_utils), then set_FV3nml_sfc_climo_filenames.py, set_FV3nml_ens_stoch_seeds.py, create_diag_table_file.py, and setup.py. setup.py reads several yaml configuration files (config_defaults.yaml, config.yaml, {machine_config}.yaml, valid_param_vals.yaml, and others) and calls several scripts: set_cycle_dates.py, set_grid_params_GFDLgrid.py, set_grid_params_ESGgrid.py, link_fix.py, and set_ozone_param.py. Then, it sets a number of variables, including FIXgsm, fixorg, and FIXsfc variables. Next, set_predef_grid_params.py is called, and the FIXam and FIXLAM directories are set, along with the forecast input files. The setup script also calls set_extrn_mdl_params.py, sets the GRID_GEN_METHOD with HALO, checks various parameters, and generates shell scripts. Then, the workflow generation script produces a YAML configuration file and generates the actual Rocoto workflow XML file from the template file (by calling workflow-tools set_template). The workflow generation script checks the crontab file and, if applicable, copies certain fix files to the experiment directory. Then, it copies templates of various input files to the experiment directory and sets parameters for the input.nml file. Finally, it generates the workflow. Additional information on each step appears in comments within each script. *Experiment Generation Description* @@ -729,7 +816,7 @@ Description of Workflow Tasks .. note:: This section gives a general overview of workflow tasks. To begin running the workflow, skip to :numref:`Step %s ` -:numref:`Figure %s ` illustrates the overall workflow. Individual tasks that make up the workflow are detailed in the ``FV3LAM_wflow.xml`` file. :numref:`Table %s ` describes the function of each baseline task. The first three pre-processing tasks; ``MAKE_GRID``, ``MAKE_OROG``, and ``MAKE_SFC_CLIMO``; are optional. 
If the user stages pre-generated grid, orography, and surface climatology fix files, these three tasks can be skipped by removing the ``prep.yaml`` file from the default ``taskgroups`` entry in the ``config.yaml`` file before running the ``generate_FV3LAM_wflow.py`` script: +:numref:`Figure %s ` illustrates the overall workflow. Individual tasks that make up the workflow are detailed in the ``FV3LAM_wflow.xml`` file. :numref:`Table %s ` describes the function of each baseline task. The first three pre-processing tasks (``make_grid``, ``make_orog``, and ``make_sfc_climo``) are optional. If the user stages pre-generated grid, orography, and surface climatology fix files, these three tasks can be skipped by removing the ``prep.yaml`` file from the default ``taskgroups`` entry in the ``config.yaml`` file before running the ``generate_FV3LAM_wflow.py`` script: .. code-block:: console @@ -739,13 +826,13 @@ Description of Workflow Tasks .. _WorkflowTasksFig: -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/SRW_wflow_flowchart.png +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/WorkflowImages/SRW_wflow_flowchart.png :alt: Flowchart of the default workflow tasks. If the make_grid, make_orog, and make_sfc_climo tasks are toggled off, they will not be run. If toggled on, make_grid, make_orog, and make_sfc_climo will run consecutively by calling the corresponding exregional script in the scripts directory. The get_ics, get_lbcs, make_ics, make_lbcs, and run_fcst tasks call their respective exregional scripts. The run_post task will run, and if METplus verification tasks have been configured, those will run during post-processing by calling their exregional scripts. *Flowchart of the Default Workflow Tasks* -The ``FV3LAM_wflow.xml`` file runs the specific j-job scripts (``jobs/JREGIONAL_[task name]``) in the prescribed order when the experiment is launched via the ``launch_FV3LAM_wflow.sh`` script or the ``rocotorun`` command. Each j-job task has its own source script (or "ex-script") named ``exregional_[task name].sh`` in the ``scripts`` directory. Two database files named ``FV3LAM_wflow.db`` and ``FV3LAM_wflow_lock.db`` are generated and updated by the Rocoto calls. There is usually no need for users to modify these files. To relaunch the workflow from scratch, delete these two ``*.db`` files and then call the launch script repeatedly for each task. +The ``FV3LAM_wflow.xml`` file runs the specific j-job scripts (``jobs/JREGIONAL_[task name]``) in the prescribed order when the experiment is launched via the ``launch_FV3LAM_wflow.sh`` script or the ``rocotorun`` command. Each j-job task has its own source script (or "ex-script") named ``exregional_[task name].sh`` in the ``ufs-srweather-app/scripts`` directory. Two database files named ``FV3LAM_wflow.db`` and ``FV3LAM_wflow_lock.db`` are generated and updated by the Rocoto calls. There is usually no need for users to modify these files. To relaunch the workflow from scratch, delete these two ``*.db`` files and then call the launch script repeatedly for each task. .. _WorkflowTasksTable: @@ -792,7 +879,7 @@ In addition to the baseline tasks described in :numref:`Table %s ` below. The column "taskgroup" indicates the taskgroup file that must be included in the user's ``config.yaml`` file under ``rocoto: tasks: taskgroups:`` (see :numref:`Section %s ` for more details).
For each task, ``mem###`` refers to either ``mem000`` (if running a deterministic forecast) or a specific forecast member number (if running an ensemble forecast). "Metatasks" indicate task definitions that will become more than one workflow task based on different variables, number of hours, etc., as described in the Task Description column. See :numref:`Section %s ` for more details about Metatasks. +METplus verification tasks are described in :numref:`Table %s ` below. The column "taskgroup" indicates the taskgroup file that must be included in the user's ``config.yaml`` file under ``rocoto: tasks: taskgroups:`` (see :numref:`Section %s ` for more details). For each task, ``mem###`` refers to either ``mem000`` (if running a deterministic forecast) or a specific forecast member number (if running an ensemble forecast). "Metatasks" indicate task definitions that will become more than one workflow task based on different variables, number of hours, etc., as described in the Task Description column. See :numref:`Section %s ` for more details about metatasks. .. _VXWorkflowTasksTable: @@ -817,19 +904,19 @@ METplus verification tasks are described in :numref:`Table %s `, the last line of output from ``./generate_FV3LAM_wflow.py`` (starting with ``*/3 * * * *``), can be pasted into the crontab file. It can also be found in the ``$EXPTDIR/log.generate_FV3LAM_wflow`` file. The crontab entry should resemble the following: +This will automatically add an appropriate entry to the user's :term:`cron table` and launch the workflow. Alternatively, the user can add a crontab entry manually using the ``crontab -e`` command. As mentioned in :numref:`Section %s `, the last line of output from ``./generate_FV3LAM_wflow.py`` (usually starting with ``*/3 * * * *``), can be pasted into the crontab file. It can also be found in the ``$EXPTDIR/log.generate_FV3LAM_wflow`` file. The crontab entry should resemble the following: .. code-block:: console @@ -933,7 +1020,7 @@ where ``/path/to/experiment/directory`` is changed to correspond to the user's ` .. hint:: * On NOAA Cloud instances, ``*/1 * * * *`` (or ``CRON_RELAUNCH_INTVL_MNTS: 1``) is the preferred option for cron jobs because compute nodes will shut down if they remain idle too long. If the compute node shuts down, it can take 15-20 minutes to start up a new one. - * On other NOAA HPC systems, administrators discourage using ``*/1 * * * *`` due to load problems. ``*/3 * * * *`` (or ``CRON_RELAUNCH_INTVL_MNTS: 3``) is the preferred option for cron jobs on non-NOAA Cloud systems. + * On other NOAA HPC systems, administrators discourage using ``*/1 * * * *`` due to load problems. ``*/3 * * * *`` (or ``CRON_RELAUNCH_INTVL_MNTS: 3``) is the preferred option for cron jobs on other Level 1 systems. To check the experiment progress: @@ -942,7 +1029,13 @@ To check the experiment progress: cd $EXPTDIR rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 -After finishing the experiment, open the crontab using ``crontab -e`` and delete the crontab entry. +Users can track the experiment's progress by reissuing the ``rocotostat`` command above every so often until the experiment runs to completion. The following message usually means that the experiment is still getting set up: + +.. code-block:: console + + 08/04/23 17:34:32 UTC :: FV3LAM_wflow.xml :: ERROR: Can not open FV3LAM_wflow.db read-only because it does not exist + +After a few (3-5) minutes, ``rocotostat`` should show a status-monitoring table. .. 
_Success: @@ -986,13 +1079,15 @@ If users choose to run METplus verification tasks as part of their experiment, t 201906151800 run_gridstatvx_24h 30468493 SUCCEEDED 0 1 20.0 201906151800 run_pointstatvx 30468423 SUCCEEDED 0 1 670.0 ... - 201906151800 run_MET_GridStat_vx_APCP01h_mem000 - - - - - - 201906151800 run_MET_GridStat_vx_APCP03h_mem000 - - - - - - 201906151800 run_MET_GridStat_vx_APCP06h_mem000 - - - - - - 201906151800 run_MET_GridStat_vx_REFC_mem000 - - - - - - 201906151800 run_MET_GridStat_vx_RETOP_mem000 - - - - - - 201906151800 run_MET_PointStat_vx_SFC_mem000 - - - - - - 201906151800 run_MET_PointStat_vx_UPA_mem000 - - - - - + 201906151800 run_MET_GridStat_vx_APCP01h_mem000 - - - - - + 201906151800 run_MET_GridStat_vx_APCP03h_mem000 - - - - - + 201906151800 run_MET_GridStat_vx_APCP06h_mem000 - - - - - + 201906151800 run_MET_GridStat_vx_REFC_mem000 - - - - - + 201906151800 run_MET_GridStat_vx_RETOP_mem000 - - - - - + 201906151800 run_MET_PointStat_vx_SFC_mem000 - - - - - + 201906151800 run_MET_PointStat_vx_UPA_mem000 - - - - - + +After finishing the experiment, open the crontab using ``crontab -e`` and delete the crontab entry. Launch the Rocoto Workflow Using a Script ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ @@ -1074,7 +1169,7 @@ Some systems may require a version number (e.g., ``module load rocoto/1.3.3``) **Run the Rocoto Workflow** -After loading Rocoto, ``cd`` to the experiment directory and call ``rocotorun`` to launch the workflow tasks. This will start any tasks that do not have a dependency. As the workflow progresses through its stages, ``rocotostat`` will show the state of each task and allow users to monitor progress: +After loading Rocoto, ``cd`` to the experiment directory and call ``rocotorun`` to launch the workflow tasks. This will start any tasks that are not awaiting completion of a dependency. As the workflow progresses through its stages, ``rocotostat`` will show the state of each task and allow users to monitor progress: .. code-block:: console @@ -1105,7 +1200,7 @@ The SRW App workflow can be run using standalone shell scripts in cases where th .. attention:: - When working on an HPC system, users should allocate a compute node prior to running their experiment. The proper command will depend on the system's resource manager, but some guidance is offered in :numref:`Section %s `. It may be necessary to reload the ``build__`` scripts (see :numref:`Section %s `) and the workflow environment (see :numref:`Section %s `). + When working on an HPC system, users should allocate a compute node prior to running their experiment. The proper command will depend on the system's resource manager, but some guidance is offered in :numref:`Section %s `. It may be necessary to reload the ``build__`` scripts (see :numref:`Section %s `) and the workflow environment (see :numref:`Section %s `) after allocating the compute node. .. note:: Examples in this subsection presume that the user is running in the Terminal with a bash shell environment. If this is not the case, users will need to adjust the commands to fit their command line application and shell environment. @@ -1160,7 +1255,7 @@ Check the batch script output file in your experiment directory for a “SUCCESS .. table:: List of tasks in the SRW App workflow in the order that they are executed. Scripts with the same stage number may be run simultaneously. 
The number of - processors and wall clock time is a good starting point for Cheyenne or Hera + processors and wall clock time is a good starting point for NOAA HPC systems when running a 48-h forecast on the 25-km CONUS domain. For a brief description of tasks, see :numref:`Table %s `. +------------+------------------------+----------------+----------------------------+ diff --git a/docs/UsersGuide/source/BuildingRunningTesting/Tutorial.rst b/docs/UsersGuide/source/BuildingRunningTesting/Tutorial.rst index c27544f54e..78ed48091a 100644 --- a/docs/UsersGuide/source/BuildingRunningTesting/Tutorial.rst +++ b/docs/UsersGuide/source/BuildingRunningTesting/Tutorial.rst @@ -4,9 +4,9 @@ Tutorials ============= -This chapter walks users through experiment configuration options for various severe weather events. It assumes that users have already (1) :ref:`built the SRW App ` successfully and (2) run the out-of-the-box case contained in ``config.community.yaml`` (and copied to ``config.yaml`` in :numref:`Step %s ` or :numref:`Step %s `) to completion. +This chapter walks users through experiment configuration options for various severe weather events. It assumes that users have already :ref:`built the SRW App ` successfully. -Users can run through the entire set of tutorials or jump to the one that interests them most. The five tutorials address different skills: +Users can run through the entire set of tutorials or jump to the one that interests them most. The first tutorial is recommended for users who have never run the SRW App before. The five tutorials address different skills: #. :ref:`Severe Weather Over Indianapolis `: Change physics suites and compare graphics plots. #. :ref:`Cold Air Damming `: Coming soon! @@ -33,7 +33,7 @@ A surface boundary associated with a vorticity maximum over the northern Great P * `Storm Prediction Center (SPC) Storm Report for 20190615 `__ * `Storm Prediction Center (SPC) Storm Report for 20190616 `__ -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/IndySevereWeather18z.gif +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/Tutorial/IndySevereWeather18z.gif :alt: Radar animation of severe weather over Indianapolis on June 15, 2019 starting at 18z. The animation shows areas of heavy rain and tornado reports moving from west to east over Indianapolis. *Severe Weather Over Indianapolis Starting at 18z* @@ -65,7 +65,7 @@ To load the workflow environment, source the lmod-setup file. Then load the work where ```` is a valid, lowercased machine name (see ``MACHINE`` in :numref:`Section %s ` for valid values). -After loading the workflow, users should follow the instructions printed to the console. Usually, the instructions will tell the user to run ``conda activate workflow_tools``. For example, a user on Hera with permissions on the ``nems`` project may issue the following commands to load the workflow (replacing ``User.Name`` with their actual username): +After loading the workflow, users should follow the instructions printed to the console. Usually, the instructions will tell the user to run |activate|. For example, a user on Hera with permissions on the ``nems`` project may issue the following commands to load the workflow (replacing ``User.Name`` with their actual username): .. 
code-block:: console @@ -160,9 +160,8 @@ should be included in the ``rocoto:tasks:taskgroups:`` section, like this: walltime: 02:00:00 taskgroups: '{{ ["parm/wflow/prep.yaml", "parm/wflow/coldstart.yaml", "parm/wflow/post.yaml", "parm/wflow/plot.yaml"]|include }}' - -For more information on how to turn on/off tasks in the worklfow, please -see :numref:`Section %s `. +For more information on how to turn on/off tasks in the workflow, please +see :numref:`Section %s `. In the ``task_get_extrn_ics:`` section, add ``USE_USER_STAGED_EXTRN_FILES`` and ``EXTRN_MDL_SOURCE_BASEDIR_ICS``. Users will need to adjust the file path to reflect the location of data on their system (see :numref:`Section %s ` for locations on `Level 1 `__ systems). @@ -172,7 +171,7 @@ In the ``task_get_extrn_ics:`` section, add ``USE_USER_STAGED_EXTRN_FILES`` and EXTRN_MDL_NAME_ICS: FV3GFS FV3GFS_FILE_FMT_ICS: grib2 USE_USER_STAGED_EXTRN_FILES: true - EXTRN_MDL_SOURCE_BASEDIR_ICS: /path/to/UFS_SRW_App/develop/input_model_data/FV3GFS/grib2/${yyyymmddhh} + EXTRN_MDL_SOURCE_BASEDIR_ICS: /path/to/UFS_SRW_App/v2p2/input_model_data/FV3GFS/grib2/${yyyymmddhh} For a detailed description of the ``task_get_extrn_ics:`` variables, see :numref:`Section %s `. @@ -185,7 +184,7 @@ Similarly, in the ``task_get_extrn_lbcs:`` section, add ``USE_USER_STAGED_EXTRN_ LBC_SPEC_INTVL_HRS: 6 FV3GFS_FILE_FMT_LBCS: grib2 USE_USER_STAGED_EXTRN_FILES: true - EXTRN_MDL_SOURCE_BASEDIR_LBCS: /path/to/UFS_SRW_App/develop/input_model_data/FV3GFS/grib2/${yyyymmddhh} + EXTRN_MDL_SOURCE_BASEDIR_LBCS: /path/to/UFS_SRW_App/v2p2/input_model_data/FV3GFS/grib2/${yyyymmddhh} For a detailed description of the ``task_get_extrn_lbcs:`` variables, see :numref:`Section %s `. @@ -243,7 +242,7 @@ Once the control case is running, users can return to the ``config.yaml`` file ( EXPT_SUBDIR: test_expt CCPP_PHYS_SUITE: FV3_RRFS_v1beta -``EXPT_SUBDIR:`` This name must be different than the ``EXPT_SUBDIR`` name used in the previous forecast experiment. Otherwise, the first forecast experiment will be overwritten. ``test_expt`` is used here, but the user may select a different name if desired. +``EXPT_SUBDIR:`` This name must be different than the ``EXPT_SUBDIR`` name used in the previous forecast experiment. Otherwise, the first forecast experiment will be renamed, and the new experiment will take its place (see :numref:`Section %s ` for details). To avoid this issue, this tutorial uses ``test_expt`` as the second experiment's name, but the user may select a different name if desired. ``CCPP_PHYS_SUITE:`` The FV3_RRFS_v1beta physics suite was specifically created for convection-allowing scales and is the precursor to the operational physics suite that will be used in the Rapid Refresh Forecast System (:term:`RRFS`). 
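Before generating the new experiment, it can be worth confirming that both settings were changed; otherwise the control experiment directory could be renamed and replaced by the new run. The check below is a minimal, illustrative sketch using only ``grep`` and the ``config.yaml`` file described earlier in this chapter:

.. code-block:: console

   cd /path/to/ufs-srweather-app/ush
   grep -E "EXPT_SUBDIR|CCPP_PHYS_SUITE" config.yaml
   # Expected values for this tutorial: test_expt and FV3_RRFS_v1beta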
@@ -259,10 +258,10 @@ Next, users will need to modify the data parameters in ``task_get_extrn_ics:`` a task_get_extrn_ics: EXTRN_MDL_NAME_ICS: HRRR - EXTRN_MDL_SOURCE_BASEDIR_ICS: /path/to/UFS_SRW_App/develop/input_model_data/HRRR/${yyyymmddhh} + EXTRN_MDL_SOURCE_BASEDIR_ICS: /path/to/UFS_SRW_App/v2p2/input_model_data/HRRR/${yyyymmddhh} task_get_extrn_lbcs: EXTRN_MDL_NAME_LBCS: RAP - EXTRN_MDL_SOURCE_BASEDIR_LBCS: /path/to/UFS_SRW_App/develop/input_model_data/RAP/${yyyymmddhh} + EXTRN_MDL_SOURCE_BASEDIR_LBCS: /path/to/UFS_SRW_App/v2p2/input_model_data/RAP/${yyyymmddhh} EXTRN_MDL_LBCS_OFFSET_HRS: '-0' HRRR and RAP data are better than FV3GFS data for use with the FV3_RRFS_v1beta physics scheme because these datasets use the same physics :term:`parameterizations` that are in the FV3_RRFS_v1beta suite. They focus on small-scale weather phenomena involved in storm development, so forecasts tend to be more accurate when HRRR/RAP data are paired with FV3_RRFS_v1beta and a high-resolution (e.g., 3-km) grid. Using HRRR/RAP data with FV3_RRFS_v1beta also limits the "spin-up adjustment" that takes place when initializing with model data coming from different physics. @@ -325,7 +324,7 @@ Users should substitute ``/path/to/expt_dirs/test_expt`` with the actual path on Compare and Analyze Results ----------------------------- -Navigate to ``test_expt/2019061518/postprd``. This directory contains the post-processed data generated by the :term:`UPP` from the forecast. After the ``plot_allvars`` task completes, this directory will contain ``.png`` images for several forecast variables including 2-m temperature, 2-m dew point temperature, 10-m winds, accumulated precipitation, composite reflectivity, and surface-based CAPE/CIN. Plots with a ``_diff`` label in the file name are plots that compare the ``control`` forecast and the ``test_expt`` forecast. +Navigate to ``test_expt/2019061518/postprd``. This directory contains the post-processed data generated by the :term:`UPP` from the ``test_expt`` forecast. After the ``plot_allvars`` task completes, this directory will contain ``.png`` images for several forecast variables including 2-m temperature, 2-m dew point temperature, 10-m winds, accumulated precipitation, composite reflectivity, and surface-based CAPE/CIN. Plots with a ``_diff`` label in the file name are plots that compare the ``control`` forecast and the ``test_expt`` forecast. Copy ``.png`` Files onto Local System ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ @@ -388,22 +387,25 @@ Sea Level Pressure ````````````````````` In the Sea Level Pressure (SLP) plots, the ``control`` and ``test_expt`` plots are nearly identical at forecast hour f000, so the difference plot is entirely white. -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/fcst1_plots/slp_diff_regional_f000.png +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/Tutorial/fcst1_plots/slp_diff_regional_f000.png :align: center + :width: 75% *Difference Plot for Sea Level Pressure at f000* As the forecast continues, the results begin to diverge, as evidenced by the spattering of light blue dispersed across the f006 SLP difference plot. -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/fcst1_plots/slp_diff_regional_f006.png +.. 
figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/Tutorial/fcst1_plots/slp_diff_regional_f006.png :align: center + :width: 75% *Difference Plot for Sea Level Pressure at f006* The predictions diverge further by f012, where a solid section of light blue in the top left corner of the difference plot indicates that to the northwest of Indianapolis, the SLP predictions for the ``control`` forecast were slightly lower than the predictions for the ``test_expt`` forecast. -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/fcst1_plots/slp_diff_regional_f012.png +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/Tutorial/fcst1_plots/slp_diff_regional_f012.png :align: center + :width: 75% *Difference Plot for Sea Level Pressure at f012* @@ -416,8 +418,9 @@ Reflectivity images visually represent the weather based on the energy (measured At f000, the ``test_expt`` plot (top left) is showing more severe weather than the ``control`` plot (top right). The ``test_expt`` plot shows a vast swathe of the Indianapolis region covered in yellow with spots of orange, corresponding to composite reflectivity values of 35+ dBZ. The ``control`` plot radar image covers a smaller area of the grid, and with the exception of a few yellow spots, composite reflectivity values are <35 dBZ. The difference plot (bottom) shows areas where the ``test_expt`` plot (red) and the ``control`` plot (blue) have reflectivity values greater than 20 dBZ. The ``test_expt`` plot has significantly more areas with high composite reflectivity values. -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/fcst1_plots/refc_diff_regional_f000.png +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/Tutorial/fcst1_plots/refc_diff_regional_f000.png :align: center + :width: 75% *Composite Reflectivity at f000* @@ -425,15 +428,17 @@ As the forecast progresses, the radar images resemble each other more (see :numr .. _refc006: -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/fcst1_plots/refc_diff_regional_f006.png +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/Tutorial/fcst1_plots/refc_diff_regional_f006.png :align: center + :width: 75% *Composite reflectivity at f006 shows storm gathering strength* At forecast hour 12, the plots for each forecast show a similar evolution of the storm with both resolving a squall line. The ``test_expt`` plot shows a more intense squall line with discrete cells (areas of high composite reflectivity in dark red), which could lead to severe weather. The ``control`` plot shows an overall decrease in composite reflectivity values compared to f006. It also orients the squall line more northward with less intensity, possibly due to convection from the previous forecast runs cooling the atmosphere. In short, ``test_expt`` suggests that the storm will still be going strong at 06z on June 15, 2019, whereas the ``control`` suggests that the storm will begin to let up. -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/fcst1_plots/refc_diff_regional_f012.png +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/Tutorial/fcst1_plots/refc_diff_regional_f012.png :align: center + :width: 75% *Composite Reflectivity at f012* @@ -445,7 +450,7 @@ Surface-Based CAPE/CIN Background """""""""""" -The National Weather Service (:term:`NWS`) defines Surface-Based Convective Available Potential Energy (CAPE) as "the amount of fuel available to a developing thunderstorm." 
According to NWS, CAPE "describes the instabilily of the atmosphere and provides an approximation of updraft strength within a thunderstorm. A higher value of CAPE means the atmosphere is more unstable and would therefore produce a stronger updraft" (see `NWS, What is CAPE? `__ for further explanation). +The National Weather Service (:term:`NWS`) defines Surface-Based Convective Available Potential Energy (CAPE) as "the amount of fuel available to a developing thunderstorm." According to NWS, CAPE "describes the instability of the atmosphere and provides an approximation of updraft strength within a thunderstorm. A higher value of CAPE means the atmosphere is more unstable and would therefore produce a stronger updraft" (see `NWS: What is CAPE? `__ for further explanation). According to the NWS `Storm Prediction Center `__, Convective Inhibition (CIN) "represents the 'negative' area on a sounding that must be overcome for storm initiation." In effect, it measures negative buoyancy (-B) --- the opposite of CAPE, which measures positive buoyancy (B or B+) of an air parcel. @@ -468,24 +473,24 @@ In general, the higher the CIN values are (i.e., the closer they are to zero), t At the 0th forecast hour, the ``test_expt`` plot (below, left) shows lower values of CAPE and higher values of CIN than in the ``control`` plot (below, right). This means that ``test_expt`` is projecting lower potential energy available for a storm but also lower inhibition, which means that less energy would be required for a storm to develop. The difference between the two plots is particularly evident in the southwest corner of the difference plot, which shows a 1000+ J/kg difference between the two plots. -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/fcst1_plots/sfcape_diff_regional_f000.png - :width: 1200 +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/Tutorial/fcst1_plots/sfcape_diff_regional_f000.png + :width: 75% :align: center *CAPE/CIN Difference Plot at f000* At the 6th forecast hour, both ``test_expt`` and ``control`` plots are forecasting higher CAPE values overall. Both plots also predict higher CAPE values to the southwest of Indianapolis than to the northeast. This makes sense because the storm was passing from west to east. However, the difference plot shows that the ``control`` forecast is predicting higher CAPE values primarily to the southwest of Indianapolis, whereas ``test_expt`` is projecting a rise in CAPE values throughout the region. The blue region of the difference plot indicates where ``test_expt`` predictions are higher than the ``control`` predictions; the red/orange region shows places where ``control`` predicts significantly higher CAPE values than ``test_expt`` does. -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/fcst1_plots/sfcape_diff_regional_f006.png - :width: 1200 +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/Tutorial/fcst1_plots/sfcape_diff_regional_f006.png + :width: 75% :align: center *CAPE/CIN Difference Plot at f006* At the 12th forecast hour, the ``control`` plot indicates that CAPE may be decreasing overall. ``test_expt``, however, shows that areas of high CAPE remain and continue to grow, particularly to the east. The blue areas of the difference plot indicate that ``test_expt`` is predicting higher CAPE than ``control`` everywhere but in the center of the plot. -..
figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/fcst1_plots/sfcape_diff_regional_f012.png - :width: 1200 +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/Tutorial/fcst1_plots/sfcape_diff_regional_f012.png + :width: 75% :align: center *CAPE/CIN Difference Plot at f012* @@ -521,7 +526,7 @@ Cold air damming occurs when cold dense air is topographically trapped along the * `Storm Prediction Center (SPC) Storm Report for 20200206 `__ * `Storm Prediction Center (SPC) Storm Report for 20200207 `__ -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/ColdAirDamming.jpg +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/Tutorial/ColdAirDamming.jpg :alt: Radar animation of precipitation resulting from cold air damming in the southern Appalachian mountains. *Precipitation Resulting from Cold Air Damming East of the Appalachian Mountains* @@ -543,7 +548,7 @@ A polar vortex brought arctic air to much of the U.S. and Mexico. A series of co **Weather Phenomena:** Snow and record-breaking cold temperatures -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/SouthernPlainsWinterWeather.jpg +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/Tutorial/SouthernPlainsWinterWeather.jpg :alt: Radar animation of the Southern Plains Winter Weather Event centered over Oklahoma City. Animation starts on February 14, 2021 at 6h00 UTC and ends on February 17, 2021 at 6h00 UTC. *Southern Plains Winter Weather Event Over Oklahoma City* @@ -569,7 +574,7 @@ A line of severe storms brought strong winds, flash flooding, and tornadoes to t * `Storm Prediction Center (SPC) Storm Report for 20191031 `__ -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/HalloweenStorm.jpg +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/Tutorial/HalloweenStorm.jpg :alt: Radar animation of the Halloween Storm that swept across the Eastern United States in 2019. *Halloween Storm 2019* @@ -594,7 +599,7 @@ Hurricane Barry made landfall in Louisiana on July 11, 2019 as a Category 1 hurr * `Storm Prediction Center (SPC) Storm Report for 20190713 `__ * `Storm Prediction Center (SPC) Storm Report for 20190714 `__ -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/HurricaneBarry_Making_Landfall.jpg +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/Tutorial/HurricaneBarry_Making_Landfall.jpg :alt: Radar animation of Hurricane Barry making landfall. *Hurricane Barry Making Landfall* diff --git a/docs/UsersGuide/source/BuildingRunningTesting/VXCases.rst b/docs/UsersGuide/source/BuildingRunningTesting/VXCases.rst index 533f6e7fb2..5593528937 100644 --- a/docs/UsersGuide/source/BuildingRunningTesting/VXCases.rst +++ b/docs/UsersGuide/source/BuildingRunningTesting/VXCases.rst @@ -7,16 +7,14 @@ METplus Verification Sample Cases Introduction =============== -The goal of these sample cases is to provide the UFS community with datasets that they can modify and run to see if their changes can improve the forecast and/or reduce the model biases. Each case covers an interesting weather event. The case that was added with the v2.1.0 release was a severe weather event over Indianapolis on June 15-16, 2019. In the future, additional sample cases will be provided. +The goal of these sample cases is to provide the UFS community with datasets that they can modify and run to see if their changes can improve the forecast and/or reduce the model biases. 
Each case covers an interesting weather event. The case that was added ahead of the v2.1.0 release was a severe weather event over Indianapolis on June 15-16, 2019. Content has been updated for the |latestr| release. In the future, additional sample cases will be provided. Each sample case contains model output from a control run; this output includes ``postprd`` (post-processed) and ``metprd`` (MET verification-processed) directories. Under the ``postprd`` directory, users will find the :term:`UPP` output of the model run along with plots for several forecast variables (when plotting tasks are run). These can be used for a visual/qualitative comparison of forecasts. The ``metprd`` directory contains METplus verification statistics files, which can be used for a quantitative comparison of forecast outputs. Prerequisites ================ -This chapter assumes that users have already (1) built the SRW App v2.1.0 successfully and (2) installed MET and METplus on their system. - -For instructions on how to build the v2.1.0 release of the SRW App, see the v2.1.0 release documentation on :ref:`Building the SRW App `. The release code is used to provide a consistent point of comparison; the ``develop`` branch code is constantly receiving updates, which makes it unsuited to this purpose. Users will have an easier time if they run through the out-of-the-box case described in the v2.1.0 release documentation on :ref:`Running the SRW App ` before attempting to run any verification sample cases, but doing so is optional. +This chapter assumes that users have already (1) built the SRW App |latestr| release successfully and (2) installed MET and METplus on their system (e.g., as part of :term:`spack-stack` installation). For instructions on how to build the |latestr| release, see :numref:`Section %s `. Users will have an easier time if they run through the out-of-the-box case described in :numref:`Section %s ` before attempting to run any verification sample cases, but doing so is optional. For information on MET and METplus, see :numref:`Section %s `, which contains information on METplus, links to a list of existing MET/METplus builds on `Level 1 & 2 `__ systems, and links to installation instructions and documentation for users on other systems. @@ -36,7 +34,7 @@ There were many storm reports for this event with the majority of tornadoes and Set Up Verification ----------------------- -Follow the instructions below to reproduce a forecast for this event using your own model setup! Make sure to install and build the latest version of the SRW Application (v2.1.0). ``develop`` branch code is constantly changing, so it does not provide a consistent baseline for comparison. +Follow the instructions below to reproduce a forecast for this event using your own model setup! Make sure to install and build the latest version of the SRW Application (|latestr|). ``develop`` branch code is constantly changing, so it does not provide a consistent baseline for comparison. .. _GetSampleData: @@ -47,21 +45,21 @@ On `Level 1 ` files, observation data, model/forecast output, and MET verification output for the sample forecast. Users who have never run the SRW App on their system before will also need to download (1) the fix files required for SRW App forecasts and (2) the NaturalEarth shapefiles required for plotting. 
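On machines with direct internet access, this staging step can be scripted. The following is only a minimal sketch, not an official script; it uses the fix file and NaturalEarth archive locations given below, and the staging directory ``/path/to/staged_data`` is a placeholder:

.. code-block:: console

   # Illustrative only: fetch and unpack the fix files and NaturalEarth shapefiles.
   # The staging directory below is a placeholder; adjust paths for your system.
   mkdir -p /path/to/staged_data && cd /path/to/staged_data
   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/current_srw_release_data/fix_data.tgz
   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/NaturalEarth/NaturalEarth.tgz
   tar xvfz fix_data.tgz
   tar xvfz NaturalEarth.tgz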
Users can download the fix file data from a browser at https://noaa-ufs-srw-pds.s3.amazonaws.com/current_srw_release_data/fix_data.tgz or visit :numref:`Section %s ` for instructions on how to download the data with ``wget``. NaturalEarth files are available at https://noaa-ufs-srw-pds.s3.amazonaws.com/NaturalEarth/NaturalEarth.tgz. See the :ref:`Graphics ` chapter of the release documentation for more information.
+This tar file contains :term:`IC/LBC ` files, observation data, model/forecast output, and MET verification output for the sample forecast. Users who have never run the SRW App on their system before will also need to download (1) the fix files required for SRW App forecasts and (2) the NaturalEarth shapefiles required for plotting. Users can download the fix file data from a browser at https://noaa-ufs-srw-pds.s3.amazonaws.com/current_srw_release_data/fix_data.tgz or visit :numref:`Section %s ` for instructions on how to download the data with ``wget``. NaturalEarth files are available at https://noaa-ufs-srw-pds.s3.amazonaws.com/NaturalEarth/NaturalEarth.tgz. See :numref:`Section %s ` for more information on plotting.

 After downloading ``Indy-Severe-Weather.tgz`` using one of the three methods above, untar the downloaded compressed archive file:

@@ -69,12 +67,12 @@ After downloading ``Indy-Severe-Weather.tgz`` using one of the three methods abo

    tar xvfz Indy-Severe-Weather.tgz

-Record the path to this file output using the ``pwd`` command:
+Save the path to this directory in an ``INDYDATA`` environment variable:

 .. code-block:: console

    cd Indy-Severe-Weather
-   pwd
+   export INDYDATA=$PWD

 .. note::

@@ -93,7 +91,7 @@ First, navigate to the ``ufs-srweather-app/ush`` directory. Then, load the workf

 Users running a csh/tcsh shell would run ``source /path/to/etc/lmod-setup.csh `` in place of the first command above.

-After loading the workflow, users should follow the instructions printed to the console. Usually, the instructions will tell the user to run ``conda activate regional_workflow``.
+After loading the workflow, users should follow the instructions printed to the console. Usually, the instructions will tell the user to run |activate|.

 Configure the Verification Sample Case
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

@@ -110,29 +108,26 @@ where ``/path/to/ufs-srweather-app/ush`` is replaced by the actual path to the `

 Then, edit the configuration file (``config.yaml``) to include the variables and values in the sample configuration excerpt below (variables not listed below do not need to be changed or removed). Users must be sure to substitute values in ``<>`` with values appropriate to their system.

 .. note::
-   Users working on a `Level 1 platform `__ do not need to add or update the following variables: ``MET_INSTALL_DIR``, ``METPLUS_PATH``, ``MET_BIN_EXEC``, ``CCPA_OBS_DIR``, ``MRMS_OBS_DIR``, and ``NDAS_OBS_DIR``.
+   Users working on a `Level 1 platform `__ do not need to add or update the following variables: ``CCPA_OBS_DIR``, ``MRMS_OBS_DIR``, and ``NDAS_OBS_DIR``.

 ..
code-block:: console user: + MACHINE: ACCOUNT: platform: - MODEL: FV3_GFS_v16_SUBCONUS_3km - MET_INSTALL_DIR: /path/to/met/x.x.x # Example: MET_INSTALL_DIR: /contrib/met/10.1.1 - METPLUS_PATH: /path/to/METplus/METplus-x.x.x # Example: METPLUS_PATH: /contrib/METplus/METplus-4.1.1 - # Add MET_BIN_EXEC variable to config.yaml - MET_BIN_EXEC: bin CCPA_OBS_DIR: /path/to/Indy-Severe-Weather/obs_data/ccpa/proc MRMS_OBS_DIR: /path/to/Indy-Severe-Weather/obs_data/mrms/proc NDAS_OBS_DIR: /path/to/Indy-Severe-Weather/obs_data/ndas/proc workflow: EXPT_SUBDIR: + CCPP_PHYS_SUITE: FV3_RRFS_v1beta + PREDEF_GRID_NAME: SUBCONUS_Ind_3km DATE_FIRST_CYCL: '2019061500' DATE_LAST_CYCL: '2019061500' FCST_LEN_HRS: 60 - workflow_switches: - RUN_TASK_VX_GRIDSTAT: true - RUN_TASK_VX_POINTSTAT: true + # Change to gnu if using a gnu compiler; otherwise, no change + COMPILER: intel task_get_extrn_ics: # Add EXTRN_MDL_SOURCE_BASEDIR_ICS variable to config.yaml EXTRN_MDL_SOURCE_BASEDIR_ICS: /path/to/Indy-Severe-Weather/input_model_data/FV3GFS/grib2/2019061500 @@ -141,9 +136,17 @@ Then, edit the configuration file (``config.yaml``) to include the variables and # Add EXTRN_MDL_SOURCE_BASEDIR_LBCS variable to config.yaml EXTRN_MDL_SOURCE_BASEDIR_LBCS: /path/to/Indy-Severe-Weather/input_model_data/FV3GFS/grib2/2019061500 USE_USER_STAGED_EXTRN_FILES: true - task_run_fcst: - WTIME_RUN_FCST: 05:00:00 - PREDEF_GRID_NAME: SUBCONUS_Ind_3km + task_plot_allvars: + PLOT_FCST_INC: 6 + PLOT_DOMAINS: ["regional"] + verification: + VX_FCST_MODEL_NAME: FV3_RRFS_v1beta_SUBCONUS_Ind_3km + rocoto: + tasks: + metatask_run_ensemble: + task_run_fcst_mem#mem#: + walltime: 02:00:00 + taskgroups: '{{ ["parm/wflow/prep.yaml", "parm/wflow/coldstart.yaml", "parm/wflow/post.yaml", "parm/wflow/plot.yaml", "parm/wflow/verify_pre.yaml", "parm/wflow/verify_det.yaml"]|include }}' .. hint:: To open the configuration file in the command line, users may run the command: @@ -154,7 +157,7 @@ Then, edit the configuration file (``config.yaml``) to include the variables and To modify the file, hit the ``i`` key and then make any changes required. To close and save, hit the ``esc`` key and type ``:wq``. Users may opt to use their preferred code editor instead. -For additional configuration guidance, refer to the v2.1.0 release documentation on :ref:`configuring the SRW App `. +For additional configuration guidance, refer to the |latestr| release documentation on :ref:`configuring the SRW App `. Generate the Experiment ^^^^^^^^^^^^^^^^^^^^^^^^^^ @@ -186,16 +189,6 @@ Users who prefer to automate the workflow via :term:`crontab` or who need guidan If a problem occurs and a task goes DEAD, view the task log files in ``$EXPTDIR/log`` to determine the problem. Then refer to :numref:`Section %s ` to restart a DEAD task once the problem has been resolved. For troubleshooting assistance, users are encouraged to post questions on the new SRW App `GitHub Discussions `__ Q&A page. -Generate Plots -^^^^^^^^^^^^^^^^^ - -The plots are created using the graphics generation script that comes with the SRW App v2.1.0 release. Information on the plots and instructions on how to run the script can be found in :doc:`Chapter 12 ` of the v2.1.0 release documentation. If the python environment is already loaded (i.e., ``(regional_workflow)`` is visible in the command prompt), users can navigate to the directory with the plotting scripts and run ``plot_allvars.py``: - -.. 
code-block:: console - - cd /path/to/ufs-srweather-app/ush/Python - python plot_allvars.py 2019061500 0 60 6 /path/to/experiment/directory /path/to/NaturalEarth SUBCONUS_Ind_3km - Compare ---------- @@ -243,7 +236,7 @@ METplus verification ``.stat`` files provide users the opportunity to compare th .. code-block:: console - point_stat_FV3_GFS_v16_SUBCONUS_3km_NDAS_ADPSFC_300000L_20190616_060000V.stat + point_stat_FV3_RRFS_v1beta_SUBCONUS_Ind_3km_NDAS_ADPSFC_300000L_20190616_060000V.stat The 30th hour of the forecast occurs at 6am (06Z) on June 16, 2019. The lead time is 30 hours (300000L in HHMMSSL format) because this is the 30th hour of the forecast. The valid time is 06Z (060000V in HHMMSSV format). @@ -252,16 +245,16 @@ The following is the list of METplus output files users can reference during the .. code-block:: console # Point-Stat Files - point_stat_FV3_GFS_v16_SUBCONUS_3km_NDAS_ADPSFC_HHMMSSL_YYYYMMDD_HHMMSSV.stat - point_stat_FV3_GFS_v16_SUBCONUS_3km_NDAS_ADPUPA_HHMMSSL_YYYYMMDD_HHMMSSV.stat + point_stat_FV3_RRFS_v1beta_SUBCONUS_Ind_3km_NDAS_ADPSFC_HHMMSSL_YYYYMMDD_HHMMSSV.stat + point_stat_FV3_RRFS_v1beta_SUBCONUS_Ind_3km_NDAS_ADPUPA_HHMMSSL_YYYYMMDD_HHMMSSV.stat # Grid-Stat Files - grid_stat_FV3_GFS_v16_SUBCONUS_3km_REFC_MRMS_HHMMSSL_YYYYMMDD_HHMMSSV.stat - grid_stat_FV3_GFS_v16_SUBCONUS_3km_RETOP_MRMS_HHMMSSL_YYYYMMDD_HHMMSSV.stat - grid_stat_FV3_GFS_v16_SUBCONUS_3km_APCP_01h_CCPA_HHMMSSL_YYYYMMDD_HHMMSSV.stat - grid_stat_FV3_GFS_v16_SUBCONUS_3km_APCP_03h_CCPA_HHMMSSL_YYYYMMDD_HHMMSSV.stat - grid_stat_FV3_GFS_v16_SUBCONUS_3km_APCP_06h_CCPA_HHMMSSL_YYYYMMDD_HHMMSSV.stat - grid_stat_FV3_GFS_v16_SUBCONUS_3km_APCP_24h_CCPA_HHMMSSL_YYYYMMDD_HHMMSSV.stat + grid_stat_FV3_RRFS_v1beta_SUBCONUS_Ind_3km_REFC_MRMS_HHMMSSL_YYYYMMDD_HHMMSSV.stat + grid_stat_FV3_RRFS_v1beta_SUBCONUS_Ind_3km_RETOP_MRMS_HHMMSSL_YYYYMMDD_HHMMSSV.stat + grid_stat_FV3_RRFS_v1beta_SUBCONUS_Ind_3km_APCP_01h_CCPA_HHMMSSL_YYYYMMDD_HHMMSSV.stat + grid_stat_FV3_RRFS_v1beta_SUBCONUS_Ind_3km_APCP_03h_CCPA_HHMMSSL_YYYYMMDD_HHMMSSV.stat + grid_stat_FV3_RRFS_v1beta_SUBCONUS_Ind_3km_APCP_06h_CCPA_HHMMSSL_YYYYMMDD_HHMMSSV.stat + grid_stat_FV3_RRFS_v1beta_SUBCONUS_Ind_3km_APCP_24h_CCPA_HHMMSSL_YYYYMMDD_HHMMSSV.stat Point STAT Files @@ -269,7 +262,7 @@ Point STAT Files The Point-Stat files contain continuous variables like temperature, pressure, and wind speed. A description of the Point-Stat file can be found :ref:`here ` in the MET documentation. -The Point-Stat files contain a potentially overwhelming amount of information. Therefore, it is recommended that users focus on the CNT MET test, which contains the `RMSE `__ and `MBIAS `__ statistics. The MET tests are defined in column 24 'LINE_TYPE' of the ``.stat`` file. Look for 'CNT' in this column. Then find column 66-68 for MBIAS and 78-80 for RMSE statistics. A full description of this file can be found `here `__. +The Point-Stat files contain a potentially overwhelming amount of information. Therefore, it is recommended that users focus on the CNT MET test, which contains the `RMSE `__ and `MBIAS `__ statistics. The MET tests are defined in column 24 'LINE_TYPE' of the ``.stat`` file. Look for 'CNT' in this column. Then find column 66-68 for MBIAS and 78-80 for RMSE statistics. A full description of this file can be found :ref:`here `. To narrow down the variable field even further, users can focus on these weather variables: @@ -289,7 +282,7 @@ Grid-Stat Files The Grid-Stat files contain gridded variables like reflectivity and precipitation. 
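Before turning to the Grid-Stat files, note that the Point-Stat filtering described above is easy to script. Below is a minimal sketch, assuming the column positions given above and using the sample file name from this section:

.. code-block:: console

   # Sketch only: print MBIAS (column 66) and RMSE (column 78) from CNT lines
   # (LINE_TYPE is column 24), per the column positions described above.
   awk '$24 == "CNT" {print FILENAME, $66, $78}' \
       point_stat_FV3_RRFS_v1beta_SUBCONUS_Ind_3km_NDAS_ADPSFC_300000L_20190616_060000V.stat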
A description of the Grid-Stat file can be found :ref:`here `. -As with the Point-Stat file, there are several MET tests and statistics available in the Grid-Stat file. To simplify this dataset users can focus on the MET tests and statistics found in :numref:`Table %s ` below. The MET tests are found in column 24 ‘LINE_TYPE’ of the Grid-Stat file. The table also shows the user the columns for the statistics of interest. For a more detailed description of the Grid-Stat files, view the :ref:`MET Grid-Stat Documentation `. +As with the Point-Stat file, there are several MET tests and statistics available in the Grid-Stat file. To simplify this dataset, users can focus on the MET tests and statistics found in :numref:`Table %s ` below. The MET tests are found in column 24 ‘LINE_TYPE’ of the Grid-Stat file. The table also shows the user the columns for the statistics of interest. For a more detailed description of the Grid-Stat files, view the :ref:`MET Grid-Stat Documentation `. .. _GridStatStatistics: diff --git a/docs/UsersGuide/source/BuildingRunningTesting/WE2Etests.rst b/docs/UsersGuide/source/BuildingRunningTesting/WE2Etests.rst index 3c20fc8120..0dfd8d3f19 100644 --- a/docs/UsersGuide/source/BuildingRunningTesting/WE2Etests.rst +++ b/docs/UsersGuide/source/BuildingRunningTesting/WE2Etests.rst @@ -3,22 +3,38 @@ ======================= Testing the SRW App ======================= + +Introduction to Workflow End-to-End (WE2E) Tests +================================================== + The SRW App contains a set of end-to-end tests that exercise various workflow configurations of the SRW App. These are referred to as workflow end-to-end (WE2E) tests because they all use the Rocoto workflow manager to run their individual workflows from start to finish. The purpose of these tests is to ensure that new changes to the App do not break existing functionality and capabilities. However, these WE2E tests also provide users with additional sample cases and data beyond the basic ``config.community.yaml`` case. -Note that the WE2E tests are not regression tests---they do not check whether -current results are identical to previously established baselines. They also do -not test the scientific integrity of the results (e.g., they do not check that values -of output fields are reasonable). These tests only check that the tasks within each test's workflow complete successfully. They are, in essence, tests of the workflow generation, task execution (:term:`J-jobs`, +.. attention:: + + * This introductory section provides high-level information on what is and is not tested with WE2E tests. It also provides information on :ref:`WE2E test categories ` and the :ref:`WE2E test information file `, which summarizes each test. + * To skip directly to running WE2E tests, go to :numref:`Section %s: Running the WE2E Tests `. + +What is a WE2E test? +---------------------- + +WE2E tests are, in essence, tests of the workflow generation, task execution (:term:`J-jobs`, :term:`ex-scripts`), and other auxiliary scripts to ensure that these scripts function correctly. Tested functions include creating and correctly arranging and naming directories and files, ensuring -that all input files are available and readable, calling executables with correct namelists and/or options, etc. Currently, it is up to the external repositories that the App clones (:numref:`Section %s `) to check that changes to those repositories do not change results, or, if they do, to ensure that the new results are acceptable. 
(At least two of these external repositories---``UFS_UTILS`` and ``ufs-weather-model``---do have such regression tests.) +that all input files are available and readable, calling executables with correct namelists and/or options, etc. + +Note that the WE2E tests are **not** regression tests---they do not check whether +current results are identical to previously established baselines. They also do +not test the scientific integrity of the results (e.g., they do not check that values +of output fields are reasonable). These tests only check that the tasks within each test's workflow complete successfully. Currently, it is up to the external repositories that the App clones (see :numref:`Section %s `) to check that changes to those repositories do not change results, or, if they do, to ensure that the new results are acceptable. (At least two of these external repositories---``UFS_UTILS`` and ``ufs-weather-model``---do have such regression tests.) + +.. _we2e-categories: WE2E Test Categories -====================== +---------------------- WE2E tests are grouped into two categories that are of interest to code developers: ``fundamental`` and ``comprehensive`` tests. "Fundamental" tests are a lightweight but wide-reaching set of tests designed to function as a cheap "`smoke test `__" for changes to the UFS SRW App. The fundamental suite of tests runs common combinations of workflow tasks, physical domains, input data, physics suites, etc. The comprehensive suite of tests covers a broader range of combinations of capabilities, configurations, and components, ideally including all capabilities that *can* be run on a given platform. Because some capabilities are not available on all platforms (e.g., retrieving data directly from NOAA HPSS), the suite of comprehensive tests varies from machine to machine. -The list of fundamental and comprehensive tests can be viewed in the ``ufs-srweather-app/tests/WE2E/machine_suites/`` directory and are described in more detail in :doc:`this table <../tables/Tests>`. +The list of fundamental and comprehensive tests can be viewed in the ``ufs-srweather-app/tests/WE2E/machine_suites/`` directory, and the tests are described in more detail in :doc:`this table <../tables/Tests>`. .. note:: @@ -33,7 +49,7 @@ For convenience, the WE2E tests are currently grouped into the following categor This category tests custom grids aside from those specified in ``ufs-srweather-app/ush/predef_grid_params.yaml``. These tests help ensure a wide range of domain sizes, resolutions, and locations will work as expected. These test files can also serve as examples for how to set your own custom domain. * ``default_configs`` - This category tests example config files provided for user reference. They are symbolically linked from the ``ufs-srweather-app/ush/`` directory. + This category tests example configuration files provided for user reference. They are symbolically linked from the ``ufs-srweather-app/ush/`` directory. * ``grids_extrn_mdls_suites_community`` This category of tests ensures that the SRW App workflow running in **community mode** (i.e., with ``RUN_ENVIR`` set to ``"community"``) completes successfully for various combinations of predefined grids, physics suites, and input data from different external models. Note that in community mode, all output from the Application is placed under a single experiment directory. 
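To see exactly which tests belong to the fundamental or comprehensive suite on a given platform, users can list the suite files in the ``machine_suites`` directory mentioned above. A quick sketch follows; the suite file name shown is illustrative and varies by machine:

.. code-block:: console

   # Sketch only: inspect the WE2E suite definition files.
   cd /path/to/ufs-srweather-app/tests/WE2E
   ls machine_suites/
   # View the tests in one suite file (file name is illustrative; actual names vary by platform):
   cat machine_suites/fundamental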
@@ -67,28 +83,110 @@ For convenience, the WE2E tests are currently grouped into the following categor Some tests are duplicated among the above categories via symbolic links, both for legacy reasons (when tests for different capabilities were consolidated) and for convenience when a user would like to run all tests for a specific category (e.g., verification tests). +.. _WE2ETestInfoFile: + +WE2E Test Information File +----------------------------- + +If users want to see consolidated test information, they can generate a file that can be imported into a spreadsheet program (Google Sheets, Microsoft Excel, etc.) that summarizes each test. This file, named ``WE2E_test_info.txt`` by default, is delimited by the ``|`` character and can be created either by running the ``./print_test_info.py`` script, or by generating an experiment using ``./run_WE2E_tests.py`` with the ``--print_test_info`` flag. + +The rows of the file/sheet represent the full set of available tests (not just the ones to be run). The columns contain the following information (column titles are included in the CSV file): + +| **Column 1** +| The primary test name followed (in parentheses) by the category subdirectory where it is + located. + +| **Column 2** +| Any alternate names for the test followed by their category subdirectories + (in parentheses). + +| **Column 3** +| The test description. + +| **Column 4** +| The relative cost of running the dynamics in the test. This gives an + idea of how expensive the test is relative to a reference test that runs + a single 6-hour forecast on the ``RRFS_CONUS_25km`` predefined grid using + its default time step (``DT_ATMOS: 40``). To calculate the relative cost, the absolute cost (``abs_cost``) is first calculated as follows: + +.. code-block:: + + abs_cost = nx*ny*num_time_steps*num_fcsts + +| Here, ``nx`` and ``ny`` are the number of grid points in the horizontal + (``x`` and ``y``) directions, ``num_time_steps`` is the number of time + steps in one forecast, and ``num_fcsts`` is the number of forecasts the + test runs (see Column 5 below). (Note that this cost calculation does + not (yet) differentiate between different physics suites.) The relative + cost ``rel_cost`` is then calculated using: + +.. code-block:: + + rel_cost = abs_cost/abs_cost_ref + +| where ``abs_cost_ref`` is the absolute cost of running the reference forecast + described above, i.e., a single (``num_fcsts = 1``) 6-hour forecast + (``FCST_LEN_HRS = 6``) on the ``RRFS_CONUS_25km grid`` (which currently has + ``nx = 219``, ``ny = 131``, and ``DT_ATMOS = 40 sec`` (so that ``num_time_steps + = FCST_LEN_HRS*3600/DT_ATMOS = 6*3600/40 = 540``). Therefore, the absolute cost reference is calculated as: + +.. code-block:: + + abs_cost_ref = 219*131*540*1 = 15,492,060 + +| **Column 5** +| The number of times the forecast model will be run by the test. This + is calculated using quantities such as the number of :term:`cycle` dates (i.e., + forecast model start dates) and the number of ensemble members (which + is greater than 1 if running ensemble forecasts and 1 otherwise). The + number of cycle dates and/or ensemble members is derived from the quantities listed + in Columns 6, 7, .... + +| **Columns 6, 7, ...** +| The values of various experiment variables (if defined) in each test's + configuration file. 
Currently, the following experiment variables are + included: + + | ``PREDEF_GRID_NAME`` + | ``CCPP_PHYS_SUITE`` + | ``EXTRN_MDL_NAME_ICS`` + | ``EXTRN_MDL_NAME_LBCS`` + | ``DATE_FIRST_CYCL`` + | ``DATE_LAST_CYCL`` + | ``INCR_CYCL_FREQ`` + | ``FCST_LEN_HRS`` + | ``DT_ATMOS`` + | ``LBC_SPEC_INTVL_HRS`` + | ``NUM_ENS_MEMBERS`` + +.. _RunWE2E: + Running the WE2E Tests ================================ -The Test Script (``run_WE2E_tests.py``) ------------------------------------------ +About the Test Script (``run_WE2E_tests.py``) +----------------------------------------------- -The script to run the WE2E tests is named ``run_WE2E_tests.py`` and is located in the directory ``ufs-srweather-app/tests/WE2E``. Each WE2E test has an associated configuration file named ``config.${test_name}.yaml``, where ``${test_name}`` is the name of the corresponding test. These configuration files are subsets of the full range of ``config.yaml`` experiment configuration options. (See :numref:`Section %s ` for all configurable options and :numref:`Section %s ` for information on configuring ``config.yaml``.) For each test, the ``run_WE2E_tests.py`` script reads in the test configuration file and generates from it a complete ``config.yaml`` file. It then calls the ``generate_FV3LAM_wflow()`` function, which in turn reads in ``config.yaml`` and generates a new experiment for the test. The name of each experiment directory is set to that of the corresponding test, and a copy of ``config.yaml`` for each test is placed in its experiment directory. +The script to run the WE2E tests is named ``run_WE2E_tests.py`` and is located in the directory ``ufs-srweather-app/tests/WE2E``. Each WE2E test has an associated configuration file named ``config.${test_name}.yaml``, where ``${test_name}`` is the name of the corresponding test. These configuration files are subsets of the full range of ``config.yaml`` experiment configuration options. (See :numref:`Section %s ` for all configurable options and :numref:`Section %s ` for information on configuring ``config.yaml`` or any test configuration ``.yaml`` file.) For each test, the ``run_WE2E_tests.py`` script reads in the test configuration file and generates from it a complete ``config.yaml`` file. It then calls the ``generate_FV3LAM_wflow()`` function, which in turn reads in ``config.yaml`` and generates a new experiment for the test. The name of each experiment directory is set to that of the corresponding test, and a copy of ``config.yaml`` for each test is placed in its experiment directory. .. note:: The full list of WE2E tests is extensive, and some larger, high-resolution tests are computationally expensive. Estimates of walltime and core-hour cost for each test are provided in :doc:`this table <../tables/Tests>`. +.. COMMENT: Is the list of supported tests up-to-date? + Using the Test Script ---------------------- .. attention:: - These instructions assume that the user has already built the SRW App (as described in :numref:`Section %s `) and loaded the appropriate python environment (as described in :numref:`Section %s `). + These instructions assume that the user has already built the SRW App (as described in :numref:`Section %s `). + +First, load the appropriate python environment (as described in :numref:`Section %s `). The test script has three required arguments: machine, account, and tests. - * Users must indicate which machine they are on using the ``--machine`` or ``-m`` option. See ``ush/machine`` or :numref:`Section %s ` for valid values. 
+   * Users must indicate which machine they are on using the ``--machine`` or ``-m`` option. See :numref:`Section %s ` for valid values or check the ``valid_param_vals.yaml`` file.
    * Users must submit a valid account name using the ``--account`` or ``-a`` option to run submitted jobs. On systems where an account name is not required, users may simply use ``-a none``.
    * Users must specify the set of tests to run using the ``--tests`` or ``-t`` option. Users may pass (in order of priority):

@@ -119,12 +217,12 @@ Alternatively, to run the entire suite of fundamental tests on Hera, users might

    ./run_WE2E_tests.py -t fundamental -m hera -a nems

-To run the tests ``custom_ESGgrid`` and ``grid_RRFS_CONUScompact_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16`` on NOAA Cloud, users would enter the following commands:
+To add ``custom_ESGgrid`` and ``grid_RRFS_CONUScompact_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16`` to a text file and run the tests in that file on NOAA Cloud, users would enter the following commands:

 .. code-block:: console

    echo "custom_ESGgrid" > my_tests.txt
-   echo "grid_RRFS_CONUScompact_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16" >> my_tests.txt
+   echo "grid_RRFS_CONUScompact_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16" >> my_tests.txt
    ./run_WE2E_tests.py -t my_tests.txt -m noaacloud -a none

By default, the experiment directory for a WE2E test has the same name as the test itself, and it is created in ``${HOMEdir}/../expt_dirs``, where ``HOMEdir`` is the top-level directory for the ``ufs-srweather-app`` repository (usually set to something like ``/path/to/ufs-srweather-app``). Thus, the ``custom_ESGgrid`` experiment directory would be located in ``${HOMEdir}/../expt_dirs/custom_ESGgrid``.

@@ -135,7 +233,7 @@ By default, the experiment directory for a WE2E test has the same name as the te

    ./run_WE2E_tests.py -t fundamental -m orion -a gsd-fv3 --expt_basedir "test_set_01" -q -p 2

-   * ``--expt_basedir``: Useful for grouping sets of tests. If set to a relative path, the provided path will be appended to the default path. In this case, all of the fundamental tests will reside in ``${HOMEdir}/../expt_dirs/test_set_01/``. It can also take a full path as an argument, which will place experiments in the given location.
+   * ``--expt_basedir``: Useful for grouping sets of tests. If set to a relative path, the provided path will be appended to the default path. In this case, all of the fundamental tests will reside in ``${HOMEdir}/../expt_dirs/test_set_01/``. It can also take a full (absolute) path as an argument, which will place experiments in the given location.
    * ``-q``: Suppresses the output from ``generate_FV3LAM_wflow()`` and prints only important messages (warnings and errors) to the screen. The suppressed output will still be available in the ``log.run_WE2E_tests`` file.
    * ``-p 2``: Indicates the number of parallel processes to run. By default, job monitoring and submission is serial, using a single task. Therefore, the script may take a long time to return to a given experiment and submit the next job when running large test suites. Depending on the machine settings, running in parallel can substantially reduce the time it takes to run all experiments. However, it should be used with caution on shared resources (such as HPC login nodes) due to the potential to overwhelm machine resources.

@@ -191,7 +289,7 @@ For each specified test, ``run_WE2E_tests.py`` will generate a new experiment di

 As the script runs, detailed debug output is written to the file ``log.run_WE2E_tests``.
This can be useful for debugging if something goes wrong. Adding the ``-d`` flag will print all this output to the screen during the run, but this can get quite cluttered. -The progress of ``monitor_jobs()`` is tracked in a file ``WE2E_tests_{datetime}.yaml``, where {datetime} is the date and time (in ``yyyymmddhhmmss`` format) that the file was created. The final job summary is written by the ``print_WE2E_summary()``; this prints a short summary of experiments to the screen and prints a more detailed summary of all jobs for all experiments in the indicated ``.txt`` file. +The progress of ``monitor_jobs()`` is tracked in a file ``WE2E_tests_{datetime}.yaml``, where {datetime} is the date and time (in ``YYYYMMDDHHmmSS`` format) that the file was created. The final job summary is written by the ``print_WE2E_summary()``; this prints a short summary of experiments to the screen and prints a more detailed summary of all jobs for all experiments in the indicated ``.txt`` file. .. code-block:: console @@ -266,7 +364,8 @@ One might have noticed the line during the experiment run that reads "Use ctrl-c ./monitor_jobs.py -y=WE2E_tests_20230418174042.yaml -p=1 Checking Test Status and Summary -================================= +---------------------------------- + By default, ``./run_WE2E_tests.py`` will actively monitor jobs, printing to console when jobs are complete (either successfully or with a failure), and printing a summary file ``WE2E_summary_{datetime.now().strftime("%Y%m%d%H%M%S")}.txt``. However, if the user is using the legacy crontab option (by submitting ``./run_WE2E_tests.py`` with the ``--launch cron`` option), or if the user would like to summarize one or more experiments that either are not complete or were not handled by the WE2E test scripts, this status/summary file can be generated manually using ``WE2E_summary.py``. In this example, an experiment was generated using the crontab option and has not yet finished running. @@ -338,85 +437,7 @@ The "Status" as specified by the above summary is explained below: All jobs are status SUCCEEDED; we will monitor for one more cycle in case there are unsubmitted jobs remaining. * ``COMPLETE`` - All jobs are status SUCCEEDED, and we have monitored this job for an additional cycle to ensure there are no un-submitted jobs. We will no longer monitor this experiment. - - -.. _WE2ETestInfoFile: - -WE2E Test Information File -================================== - -If the user wants to see consolidated test information, they can generate a file that can be imported into a spreadsheet program (Google Sheets, Microsoft Excel, etc.) that summarizes each test. This file, named ``WE2E_test_info.txt`` by default, is delimited by the ``|`` character and can be created either by running the ``./print_test_info.py`` script, or by generating an experiment using ``./run_WE2E_tests.py`` with the ``--print_test_info`` flag. - -The rows of the file/sheet represent the full set of available tests (not just the ones to be run). The columns contain the following information (column titles are included in the CSV file): - -| **Column 1** -| The primary test name followed (in parentheses) by the category subdirectory where it is - located. - -| **Column 2** -| Any alternate names for the test followed by their category subdirectories - (in parentheses). - -| **Column 3** -| The test description. - -| **Column 4** -| The relative cost of running the dynamics in the test. 
This gives an - idea of how expensive the test is relative to a reference test that runs - a single 6-hour forecast on the ``RRFS_CONUS_25km`` predefined grid using - its default time step (``DT_ATMOS: 40``). To calculate the relative cost, the absolute cost (``abs_cost``) is first calculated as follows: - -.. code-block:: - - abs_cost = nx*ny*num_time_steps*num_fcsts - -| Here, ``nx`` and ``ny`` are the number of grid points in the horizontal - (``x`` and ``y``) directions, ``num_time_steps`` is the number of time - steps in one forecast, and ``num_fcsts`` is the number of forecasts the - test runs (see Column 5 below). (Note that this cost calculation does - not (yet) differentiate between different physics suites.) The relative - cost ``rel_cost`` is then calculated using: - -.. code-block:: - - rel_cost = abs_cost/abs_cost_ref - -| where ``abs_cost_ref`` is the absolute cost of running the reference forecast - described above, i.e., a single (``num_fcsts = 1``) 6-hour forecast - (``FCST_LEN_HRS = 6``) on the ``RRFS_CONUS_25km grid`` (which currently has - ``nx = 219``, ``ny = 131``, and ``DT_ATMOS = 40 sec`` (so that ``num_time_steps - = FCST_LEN_HRS*3600/DT_ATMOS = 6*3600/40 = 540``). Therefore, the absolute cost reference is calculated as: - -.. code-block:: - - abs_cost_ref = 219*131*540*1 = 15,492,060 - -| **Column 5** -| The number of times the forecast model will be run by the test. This - is calculated using quantities such as the number of :term:`cycle` dates (i.e., - forecast model start dates) and the number of ensemble members (which - is greater than 1 if running ensemble forecasts and 1 otherwise). The - number of cycle dates and/or ensemble members is derived from the quantities listed - in Columns 6, 7, .... - -| **Columns 6, 7, ...** -| The values of various experiment variables (if defined) in each test's - configuration file. Currently, the following experiment variables are - included: - - | ``PREDEF_GRID_NAME`` - | ``CCPP_PHYS_SUITE`` - | ``EXTRN_MDL_NAME_ICS`` - | ``EXTRN_MDL_NAME_LBCS`` - | ``DATE_FIRST_CYCL`` - | ``DATE_LAST_CYCL`` - | ``INCR_CYCL_FREQ`` - | ``FCST_LEN_HRS`` - | ``DT_ATMOS`` - | ``LBC_SPEC_INTVL_HRS`` - | ``NUM_ENS_MEMBERS`` - + All jobs are status SUCCEEDED, and we have monitored this job for an additional cycle to ensure there are no unsubmitted jobs. We will no longer monitor this experiment. Modifying the WE2E System ============================ diff --git a/docs/UsersGuide/source/CustomizingTheWorkflow/ConfigWorkflow.rst b/docs/UsersGuide/source/CustomizingTheWorkflow/ConfigWorkflow.rst index 11d882c4f9..1f3b73f4dc 100644 --- a/docs/UsersGuide/source/CustomizingTheWorkflow/ConfigWorkflow.rst +++ b/docs/UsersGuide/source/CustomizingTheWorkflow/ConfigWorkflow.rst @@ -30,7 +30,7 @@ If non-default parameters are selected for the variables in this section, they s Setting ``RUN_ENVIR`` to "community" is recommended in most cases for users who are not running in NCO's production environment. Valid values: ``"nco"`` | ``"community"`` ``MACHINE``: (Default: "BIG_COMPUTER") - The machine (a.k.a. platform or system) on which the workflow will run. Currently supported platforms are listed on the `SRW App Wiki page `__. When running the SRW App on any ParallelWorks/NOAA Cloud system, use "NOAACLOUD" regardless of the underlying system (AWS, GCP, or Azure). 
Valid values: ``"HERA"`` | ``"ORION"`` | ``"JET"`` | ``"CHEYENNE"`` | ``"DERECHO"`` | ``"GAEA"`` | ``"NOAACLOUD"`` | ``"STAMPEDE"`` | ``"ODIN"`` | ``"MACOS"`` | ``"LINUX"`` | ``"SINGULARITY"`` | ``"WCOSS2"`` (Check ``ufs-srweather-app/ush/valid_param_vals.yaml`` for the most up-to-date list of supported platforms.) + The machine (a.k.a. platform or system) on which the workflow will run. Currently supported platforms are listed on the `SRW App Wiki page `__. When running the SRW App on any ParallelWorks/NOAA Cloud system, use "NOAACLOUD" regardless of the underlying system (AWS, GCP, or Azure). Valid values: ``"HERA"`` | ``"ORION"`` | ``"HERCULES"`` | ``"JET"`` | ``"CHEYENNE"`` | ``"DERECHO"`` | ``"GAEA"`` | ``"GAEA-C5"`` | ``"NOAACLOUD"`` | ``"STAMPEDE"`` | ``"ODIN"`` | ``"MACOS"`` | ``"LINUX"`` | ``"SINGULARITY"`` | ``"WCOSS2"`` (Check ``ufs-srweather-app/ush/valid_param_vals.yaml`` for the most up-to-date list of supported platforms.) .. hint:: Users who are NOT on a named, supported Level 1 or 2 platform will need to set the ``MACHINE`` variable to ``LINUX`` or ``MACOS``. To combine use of a Linux or MacOS platform with the Rocoto workflow manager, users will also need to set ``WORKFLOW_MANAGER: "rocoto"`` in the ``platform:`` section of ``config.yaml``. This combination will assume a Slurm batch manager when generating the XML. @@ -82,7 +82,7 @@ PLATFORM Configuration Parameters If non-default parameters are selected for the variables in this section, they should be added to the ``platform:`` section of the ``config.yaml`` file. ``WORKFLOW_MANAGER``: (Default: "none") - The workflow manager to use (e.g., "rocoto"). This is set to "none" by default, but if the machine name is set to a platform that supports Rocoto, this will be overwritten and set to "rocoto." If set explicitly to "rocoto" along with the use of the ``MACHINE: "LINUX"`` target, the configuration layer assumes a Slurm batch manager when generating the XML. Valid values: ``"rocoto"`` | ``"none"`` + The workflow manager to use (e.g., "rocoto"). This is set to "none" by default, but if the machine name is set to a platform that supports Rocoto, this will be overwritten and set to "rocoto." If set explicitly to "rocoto" along with the use of the ``MACHINE: "LINUX"`` target, the configuration layer assumes a Slurm batch manager when generating the XML. Valid values: ``"rocoto"`` | ``"ecflow"`` | ``"none"`` ``NCORES_PER_NODE``: (Default: "") The number of cores available per node on the compute platform. Set for supported platforms in ``setup.py``, but it is now also configurable for all platforms. @@ -370,7 +370,7 @@ Set File Name Parameters Prefix for the name of the file that specifies the output fields of the forecast model. ``FIELD_TABLE_FN``: ( Default: "field_table") - Prefix for the name of the file that specifies the :term:`tracers ` that the forecast model will read in from the :term:`IC/LBC ` files. + Prefix for the name of the file that specifies the :term:`tracers ` that the forecast model will read in from the :term:`IC/LBC ` files. .. _tmpl-fn-warning: @@ -382,7 +382,7 @@ Set File Name Parameters Name of a template file that specifies the output fields of the forecast model. The selected physics suite is appended to this file name in ``setup.py``, taking the form ``{DIAG_TABLE_TMPL_FN}.{CCPP_PHYS_SUITE}``. In general, users should not set this variable in their configuration file (see :ref:`note `). 
``FIELD_TABLE_TMPL_FN``: (Default: ``'field_table.{{ CCPP_PHYS_SUITE }}'``) - Name of a template file that specifies the :term:`tracers ` that the forecast model will read in from the :term:`IC/LBC ` files. The selected physics suite is appended to this file name in ``setup.py``, taking the form ``{FIELD_TABLE_TMPL_FN}.{CCPP_PHYS_SUITE}``. In general, users should not set this variable in their configuration file (see :ref:`note `). + Name of a template file that specifies the :term:`tracers ` that the forecast model will read in from the :term:`IC/LBC ` files. The selected physics suite is appended to this file name in ``setup.py``, taking the form ``{FIELD_TABLE_TMPL_FN}.{CCPP_PHYS_SUITE}``. In general, users should not set this variable in their configuration file (see :ref:`note `). ``MODEL_CONFIG_FN``: (Default: "model_configure") Name of a file that contains settings and configurations for the :term:`NUOPC`/:term:`ESMF` main component. In general, users should not set this variable in their configuration file (see :ref:`note `). @@ -522,7 +522,7 @@ CCPP Parameter | ``"FV3_WoFS_v0"`` | ``"FV3_RAP"`` - Other valid values can be found in the ``ush/valid_param_vals.yaml`` file, but users cannot expect full support for these schemes. + Other valid values can be found in the ``ush/valid_param_vals.yaml`` `file `__, but users cannot expect full support for these schemes. ``CCPP_PHYS_SUITE_FN``: (Default: ``'suite_{{ CCPP_PHYS_SUITE }}.xml'``) The name of the suite definition file (SDF) used for the experiment. @@ -582,6 +582,7 @@ Predefined Grid Parameters | ``"RRFS_CONUS_13km"`` | ``"RRFS_CONUS_3km"`` | ``"SUBCONUS_Ind_3km"`` + | ``"RRFS_NA_13km"`` **Other valid values include:** @@ -594,7 +595,6 @@ Predefined Grid Parameters | ``"RRFS_CONUScompact_25km"`` | ``"RRFS_CONUScompact_13km"`` | ``"RRFS_CONUScompact_3km"`` - | ``"RRFS_NA_13km"`` | ``"RRFS_NA_3km"`` | ``"WoFS_3km"`` @@ -646,6 +646,8 @@ Forecast Parameters By setting ``FCST_LEN_HRS: -1``, the experiment will derive the values of ``FCST_LEN_HRS`` (18) and ``LONG_FCST_LEN_HRS`` (48) for each cycle date. +.. _preexisting-dirs: + Pre-Existing Directory Parameter ------------------------------------ ``PREEXISTING_DIR_METHOD``: (Default: "delete") @@ -923,7 +925,7 @@ Basic Task Parameters For each workflow task, certain parameter values must be passed to the job scheduler (e.g., Slurm), which submits a job for the task. ``EXTRN_MDL_NAME_ICS``: (Default: "FV3GFS") - The name of the external model that will provide fields from which initial condition (IC) files, surface files, and 0-th hour boundary condition files will be generated for input into the forecast model. Valid values: ``"GSMGFS"`` | ``"FV3GFS"`` | ``"GEFS"`` | ``"GDAS"`` | ``"RAP"`` | ``"HRRR"`` | ``"NAM"`` + The name of the external model that will provide fields from which initial condition (IC) files, surface files, and 0-th hour boundary condition files will be generated for input into the forecast model. Valid values: ``"GSMGFS"`` | ``"FV3GFS"`` | ``"GEFS"`` | ``"GDAS"`` | ``"RAP"`` | ``"HRRR"`` | ``"NAM"`` | ``"UFS-CASE-STUDY"`` ``EXTRN_MDL_ICS_OFFSET_HRS``: (Default: 0) Users may wish to start a forecast using forecast data from a previous cycle of an external model. This variable indicates how many hours earlier the external model started than the FV3 forecast configured here. For example, if the forecast should start from a 6-hour forecast of the GFS, then ``EXTRN_MDL_ICS_OFFSET_HRS: "6"``. 
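To make the offset concrete, a minimal ``config.yaml`` sketch along these lines (the dates are illustrative; the variable names are the ones documented in this chapter) would generate ICs from the 6-hour forecast of the GFS cycle that started 6 hours before the SRW App cycle:

.. code-block:: console

   # Sketch only: for a cycle starting at 2019061506, take ICs from the GFS
   # cycle that started at 2019061500, i.e., from its 6-hour forecast output.
   workflow:
     DATE_FIRST_CYCL: '2019061506'
     DATE_LAST_CYCL: '2019061506'
   task_get_extrn_ics:
     EXTRN_MDL_NAME_ICS: FV3GFS
     EXTRN_MDL_ICS_OFFSET_HRS: "6"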
@@ -977,7 +979,7 @@ Basic Task Parameters For each workflow task, certain parameter values must be passed to the job scheduler (e.g., Slurm), which submits a job for the task. ``EXTRN_MDL_NAME_LBCS``: (Default: "FV3GFS") - The name of the external model that will provide fields from which lateral boundary condition (LBC) files (except for the 0-th hour LBC file) will be generated for input into the forecast model. Valid values: ``"GSMGFS"`` | ``"FV3GFS"`` | ``"GEFS"`` | ``"GDAS"`` | ``"RAP"`` | ``"HRRR"`` | ``"NAM"`` + The name of the external model that will provide fields from which lateral boundary condition (LBC) files (except for the 0-th hour LBC file) will be generated for input into the forecast model. Valid values: ``"GSMGFS"`` | ``"FV3GFS"`` | ``"GEFS"`` | ``"GDAS"`` | ``"RAP"`` | ``"HRRR"`` | ``"NAM"`` | ``"UFS-CASE-STUDY"`` ``LBC_SPEC_INTVL_HRS``: (Default: 6) The interval (in integer hours) at which LBC files will be generated. This is also referred to as the *boundary update interval*. Note that the model selected in ``EXTRN_MDL_NAME_LBCS`` must have data available at a frequency greater than or equal to that implied by ``LBC_SPEC_INTVL_HRS``. For example, if ``LBC_SPEC_INTVL_HRS`` is set to "6", then the model must have data available at least every 6 hours. It is up to the user to ensure that this is the case. @@ -1138,6 +1140,8 @@ These parameters set values in the Weather Model's ``model_configure`` file. ``WRITE_DOPOST``: (Default: false) Flag that determines whether to use the inline post option, which calls the Unified Post Processor (:term:`UPP`) from within the UFS Weather Model. The default ``WRITE_DOPOST: false`` does not use the inline post functionality, and the ``run_post`` tasks are called from outside of the UFS Weather Model. If ``WRITE_DOPOST: true``, the ``WRITE_DOPOST`` flag in the ``model_configure`` file will be set to true, and the post-processing (:term:`UPP`) tasks will be called from within the Weather Model. This means that the post-processed files (in :term:`grib2` format) are output by the Weather Model at the same time that it outputs the ``dynf###.nc`` and ``phyf###.nc`` files. Setting ``WRITE_DOPOST: true`` turns off the separate ``run_post`` task in ``setup.py`` to avoid unnecessary computations. Valid values: ``True`` | ``False`` +.. _CompParams: + Computational Parameters ---------------------------- @@ -1156,7 +1160,7 @@ Write-Component (Quilting) Parameters ----------------------------------------- .. note:: - The :term:`UPP` (called by the ``run_post`` task) cannot process output on the native grid types ("GFDLgrid" and "ESGgrid"), so output fields are interpolated to a **write component grid** before writing them to an output file. The output files written by the UFS Weather Model use an Earth System Modeling Framework (:term:`ESMF`) component, referred to as the **write component**. This model component is configured with settings in the ``model_configure`` file, as described in `Section 4.2.3 `__ of the UFS Weather Model documentation. + The :term:`UPP` (called by the ``run_post`` task) cannot process output on the native grid types ("GFDLgrid" and "ESGgrid"), so output fields are interpolated to a **write component grid** before writing them to an output file. The output files written by the UFS Weather Model use an Earth System Modeling Framework (:term:`ESMF`) component, referred to as the **write component**. 
This model component is configured with settings in the ``model_configure`` file, as described in :ref:`Section 4.2.3 ` of the UFS Weather Model documentation. ``QUILTING``: (Default: true) @@ -1173,7 +1177,7 @@ Write-Component (Quilting) Parameters .. math:: - LAYOUT_X * LAYOUT_Y + WRTCMP\_write\_groups * WRTCMP_write_tasks_per_group + LAYOUT\_X * LAYOUT\_Y + WRTCMP\_write\_groups * WRTCMP\_write\_tasks\_per\_group ``WRTCMP_write_groups``: (Default: "") The number of write groups (i.e., groups of :term:`MPI` tasks) to use in the write component. Each write group will write to one set of output files (a ``dynf${fhr}.nc`` and a ``phyf${fhr}.nc`` file, where ``${fhr}`` is the forecast hour). Each write group contains ``WRTCMP_write_tasks_per_group`` tasks. Usually, one write group is sufficient. This may need to be increased if the forecast is proceeding so quickly that a single write group cannot complete writing to its set of files before there is a need/request to start writing the next set of files at the next output time. @@ -1445,7 +1449,7 @@ Set parameters associated with running ensembles. Stochastic Physics Parameters ---------------------------------- -Set default ad-hoc stochastic physics options. For the most updated and detailed documentation of these parameters, see the `UFS Stochastic Physics Documentation `__. +Set default ad-hoc stochastic physics options. For the most updated and detailed documentation of these parameters, see the :doc:`UFS Stochastic Physics Documentation `. ``NEW_LSCALE``: (Default: true) Use correct formula for converting a spatial length scale into spectral space. @@ -1655,10 +1659,10 @@ VX Forecast Model Name The fields or groups of fields for which verification tasks will run. Because ``ASNOW`` is often not of interest in cases outside of winter, and because observation files are not located for retrospective cases on NOAA HPSS before March 2020, ``ASNOW`` is not included by default. ``"ASNOW"`` may be added to this list in order to include the related verification tasks in the workflow. Valid values: ``"APCP"`` | ``"REFC"`` | ``"RETOP"`` | ``"SFC"`` | ``"UPA"`` | ``"ASNOW"`` ``VX_APCP_ACCUMS_HRS``: (Default: [ 1, 3, 6, 24 ]) - The accumulation periods (in hours) to consider for accumulated precipitation (APCP). If ``VX_FIELDS`` contains ``"APCP"``, then ``VX_APCP_ACCUMS_HRS`` must contain at least one element. If ``VX_FIELDS`` does not contain ``"APCP"``, ``VX_APCP_ACCUMS_HRS`` will be ignored. + The accumulation periods (in hours) to consider for accumulated precipitation (APCP). If ``VX_FIELDS`` contains ``"APCP"``, then ``VX_APCP_ACCUMS_HRS`` must contain at least one element. If ``VX_FIELDS`` does not contain ``"APCP"``, ``VX_APCP_ACCUMS_HRS`` will be ignored. Valid values: ``1`` | ``3`` | ``6`` | ``24`` ``VX_ASNOW_ACCUMS_HRS``: (Default: [ 6, 24 ]) - The accumulation periods (in hours) to consider for ``ASNOW`` (accumulated snowfall). If ``VX_FIELDS`` contains ``"ASNOW"``, then ``VX_ASNOW_ACCUMS_HRS`` must contain at least one element. If ``VX_FIELDS`` does not contain ``"ASNOW"``, ``VX_ASNOW_ACCUMS_HRS`` will be ignored. + The accumulation periods (in hours) to consider for ``ASNOW`` (accumulated snowfall). If ``VX_FIELDS`` contains ``"ASNOW"``, then ``VX_ASNOW_ACCUMS_HRS`` must contain at least one element. If ``VX_FIELDS`` does not contain ``"ASNOW"``, ``VX_ASNOW_ACCUMS_HRS`` will be ignored. 
Valid values: ``6`` | ``24`` Verification (VX) Directories ------------------------------ diff --git a/docs/UsersGuide/source/CustomizingTheWorkflow/InputOutputFiles.rst b/docs/UsersGuide/source/CustomizingTheWorkflow/InputOutputFiles.rst index 675771432e..416daa91b0 100644 --- a/docs/UsersGuide/source/CustomizingTheWorkflow/InputOutputFiles.rst +++ b/docs/UsersGuide/source/CustomizingTheWorkflow/InputOutputFiles.rst @@ -27,17 +27,16 @@ The data format for these files can be :term:`GRIB2` or :term:`NEMSIO`. More inf Pre-processing (UFS_UTILS) --------------------------- -When a user generates the SRW App workflow as described in :numref:`Section %s `, the workflow generation script links the input data for the pre-processing utilities to the experiment directory. The pre-processing utilities use many different datasets to create grids and to generate model input datasets from the external model files. A detailed description of the input files for the pre-processing utilities can be found in the UFS_UTILS `Technical Documentation `__ and `Scientific Documentation `__. +When a user generates the SRW App workflow as described in :numref:`Section %s `, the workflow generation script links the input data for the pre-processing utilities to the experiment directory. The pre-processing utilities use many different datasets to create grids and to generate model input datasets from the external model files. A detailed description of the input files for the pre-processing utilities can be found in the UFS_UTILS :doc:`Technical Documentation ` and `Scientific Documentation `__. UFS Weather Model ----------------- The input files for the UFS Weather Model include both static (fixed) files and grid- and date-specific files (terrain, initial conditions, boundary conditions, etc). The static fix(ed) files -must be staged by the user unless the user is running on a `Level 1/pre-configured `__ platform, in which case users can link to the existing copy of the data on their machine. (See :numref:`Section %s ` for instructions on staging the data on a new machine and :numref:`Section %s ` for data locations on Level 1 machines.) The workflow scripts link the static, grid, and date-specific files to the experiment directory. An extensive description of the input files for the Weather Model can be found in the `UFS Weather Model User's Guide `__. The namelists and configuration files for the SRW Application are created from templates by the workflow generation script, as described in :numref:`Section %s `. +must be staged by the user unless the user is running on a `Level 1/pre-configured `__ platform, in which case users can link to the existing copy of the data on their machine. (See :numref:`Section %s ` for instructions on staging the data on a new machine and :numref:`Section %s ` for data locations on Level 1 machines.) The workflow scripts link the static, grid, and date-specific files to the experiment directory. An extensive description of the input files for the Weather Model can be found in the :doc:`UFS Weather Model User's Guide `. The namelists and configuration files for the SRW Application are created from templates by the workflow generation script, as described in :numref:`Section %s `. Unified Post Processor (UPP) ---------------------------- -Documentation for the UPP input files can be found in the `UPP User's Guide -`__. +Documentation for the UPP input files can be found in the :doc:`UPP User's Guide `. .. _WorkflowTemplates: @@ -80,7 +79,7 @@ and are shown in :numref:`Table %s `. 
* - README.xml_templating.md - Instructions for Rocoto XML templating with Jinja. -Additional information related to ``diag_table.[CCPP]``, ``field_table.[CCPP]``, ``input.nml.FV3``, ``model_configure``, and ``nems.configure`` can be found in the `UFS Weather Model User's Guide `__, while information on ``regional_grid.nml`` options can be found in the `UFS_UTILS Technical Documentation `__. +Additional information related to ``diag_table.[CCPP]``, ``field_table.[CCPP]``, ``input.nml.FV3``, ``model_configure``, and ``nems.configure`` can be found in the :ref:`UFS Weather Model User's Guide `, while information on ``regional_grid.nml`` options can be found in the `UFS_UTILS Technical Documentation `__. Migratory Route of the Input Files in the Workflow ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ @@ -88,7 +87,7 @@ Migratory Route of the Input Files in the Workflow .. _MigratoryRoute: -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/SRW_wflow_input_path.png +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/WorkflowImages/SRW_wflow_input_path.png :alt: Flowchart showing how information from the physics suite travels from the configuration file to the setup file to the workflow generation script to the run forecast ex-script. As this information is fed from one file to the next, file paths and variables required for workflow execution are set. *Migratory Route of Input Files* @@ -127,7 +126,7 @@ experiment directory (``$EXPTDIR/YYYYMMDDHH/INPUT``) and consist of the followin * ``tmp_LBCS`` These output files are used as inputs for the UFS Weather Model and are described in the `UFS Weather Model User's Guide -`__. ``gfs_bndy.tile7.HHH.nc`` refers to a series of IC/LBC files where ``HHH`` is the 3-digit hour of the forecast. +`__. ``gfs_bndy.tile7.HHH.nc`` refers to a series of IC/LBC files where ``HHH`` is the 3-digit hour of the forecast. UFS Weather Model ------------------ @@ -141,12 +140,11 @@ In this case, the netCDF output files are written to the ``$EXPTDIR/YYYYMMDDHH`` * ``dynfHHH.nc`` * ``phyfHHH.nc`` -where ``HHH`` corresponds to the 3-digit forecast hour (e.g., ``dynf006.nc`` for the 6th hour of the forecast). Additional details may be found in the `UFS Weather Model User's Guide -`__. +where ``HHH`` corresponds to the 3-digit forecast hour (e.g., ``dynf006.nc`` for the 6th hour of the forecast). Additional details may be found in the :ref:`UFS Weather Model User's Guide `. Unified Post Processor (UPP) ---------------------------- -Documentation for the UPP output files can be found in the `UPP User's Guide `__. +Documentation for the UPP output files can be found in the `UPP User's Guide `__. For the SRW Application, the Weather Model netCDF output files are written to ``$EXPTDIR/YYYYMMDDHH/postprd`` and have the naming convention (file->linked to): @@ -169,7 +167,7 @@ UPP Product Output Tables for the UFS SRW LAM Grid: * :doc:`3D Native Hybrid Level Fields ` * :doc:`3D Pressure Level Fields ` -Use the instructions in the `UPP User's Guide `__ to make modifications to the ``fv3lam.xml`` file and to remake the flat text file, called ``postxconfig-NT-fv3lam.txt`` (default), that the UPP reads. +Use the instructions in the `UPP User's Guide `__ to make modifications to the ``fv3lam.xml`` file and to remake the flat text file, called ``postxconfig-NT-fv3lam.txt`` (default), that the UPP reads. 
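Once a custom flat text file exists, pointing the workflow at it is a small configuration change. The sketch below shows the general idea; the variable names used here are an assumption on the part of this example and should be checked against the exact settings listed below:

.. code-block:: console

   # Sketch only -- confirm variable names against the settings listed below.
   # Points the post-processing task at a custom UPP flat text file.
   task_run_post:
     USE_CUSTOM_POST_CONFIG_FILE: true
     CUSTOM_POST_CONFIG_FP: /path/to/custom/postxconfig-NT-fv3lam.txt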
After creating the new flat text file to reflect the changes, users will need to modify their ``config.yaml`` to point the workflow to the new text file. In ``config.yaml``, set the following: @@ -214,7 +212,7 @@ By setting ``USE_CRTM`` to true, the workflow will use the path defined in ``CRT Downloading and Staging Input Data ================================== -A set of input files, including static (fix) data and raw initial and lateral boundary conditions (:term:`IC/LBCs`), is required to run the SRW Application. The data required for the "out-of-the-box" SRW App case described in Chapters :numref:`%s ` and :numref:`%s ` is already preinstalled on `Level 1 & 2 `__ systems, along with data required to run the :ref:`WE2E ` test cases. Therefore, users on these systems do not need to stage the fixed files manually because they have been prestaged, and the paths are set in ``ush/setup.sh``. Users on Level 3 & 4 systems can find the most recent SRW App release data in the `UFS SRW Application Data Bucket `__ by clicking on `Browse Bucket `__. +A set of input files, including static (fix) data and raw initial and lateral boundary conditions (:term:`ICs/LBCs`), is required to run the SRW Application. The data required for the "out-of-the-box" SRW App case described in Chapters :numref:`%s ` and :numref:`%s ` is already preinstalled on `Level 1 & 2 `__ systems, along with data required to run the :ref:`WE2E ` test cases. Therefore, users on these systems do not need to stage the fixed files manually because they have been prestaged, and the paths are set in ``ush/setup.sh``. Users on Level 3 & 4 systems can find the most recent SRW App release data in the `UFS SRW Application Data Bucket `__ by clicking on `Browse Bucket `__. .. _StaticFixFiles: @@ -228,9 +226,7 @@ Static files are available in the `"fix" directory `__ of the SRW Data Bucket using the ``wget`` command for each required file. A list of ``wget`` commands with links is provided :ref:`here ` for the release v2.1.0 fix file data. Users will need to create an appropriate directory structure for the files when downloading them individually. The best solution is to download the files into directories that mirror the structure of the `Data Bucket `__. - -.. COMMENT: Update release file list above for next SRW release. +Alternatively, users can download the static files individually from the `"fix" directory `__ of the SRW Data Bucket using the ``wget`` command for each required file. Users will need to create an appropriate directory structure for the files when downloading them individually. The best solution is to download the files into directories that mirror the structure of the `Data Bucket `__. The environment variables ``FIXgsm``, ``FIXorg``, and ``FIXsfc`` indicate the path to the directories where the static files are located. After downloading the experiment data, users must set the paths to the files in ``config.yaml``. Add the following code to the ``task_run_fcst:`` section of the ``config.yaml`` file, and alter the variable paths accordingly: @@ -271,9 +267,9 @@ The paths to ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBC USE_USER_STAGED_EXTRN_FILES: true EXTRN_MDL_SOURCE_BASEDIR_LBCS: /path/to/ufs-srweather-app/input_model_data/FV3GFS/grib2/YYYYMMDDHH -The two ``EXTRN_MDL_SOURCE_BASEDIR_*CS`` variables describe where the :term:`IC ` and :term:`LBC ` file directories are located, respectively. 
For ease of reusing ``config.yaml`` across experiments, it is recommended that users set up the raw :term:`IC/LBC ` file paths to include the model name (e.g., FV3GFS, GEFS, GDAS, NAM, RAP, HRRR), data format (e.g., grib2, nemsio), and date (in ``YYYYMMDDHH`` format). For example: ``/path/to/input_model_data/FV3GFS/grib2/2019061518/``. While there is flexibility to modify these settings, this structure will provide the most reusability for multiple dates when using the SRW Application workflow. +The two ``EXTRN_MDL_SOURCE_BASEDIR_*CS`` variables describe where the :term:`IC ` and :term:`LBC ` file directories are located, respectively. For ease of reusing ``config.yaml`` across experiments, it is recommended that users set up the raw :term:`IC/LBC ` file paths to include the model name (e.g., FV3GFS, GEFS, GDAS, NAM, RAP, HRRR), data format (e.g., grib2, nemsio), and date (in ``YYYYMMDDHH`` format). For example: ``/path/to/input_model_data/FV3GFS/grib2/2019061518/``. While there is flexibility to modify these settings, this structure will provide the most reusability for multiple dates when using the SRW Application workflow. -When files are pulled from NOAA :term:`HPSS` (rather than downloaded from the data bucket), the naming convention looks something like: +When files are pulled from NOAA :term:`HPSS` (rather than downloaded from the data bucket), the naming convention looks something like this: * FV3GFS (GRIB2): ``gfs.t{cycle}z.pgrb2.0p25.f{fhr}`` * FV3GFS (NEMSIO): @@ -294,7 +290,7 @@ where: * ``{cycle}`` corresponds to the 2-digit hour of the day when the forecast cycle starts, and * ``{fhr}`` corresponds to the 2- or 3-digit nth hour of the forecast (3-digits for FV3GFS/GDAS data and 2 digits for RAP/HRRR data). -For example, a forecast using FV3GFS GRIB2 data that starts at 18h00 UTC would have a {cycle} value of 18, which is the 000th forecast hour. The LBCS file for 21h00 UTC would be named ``gfs.t18z.pgrb2.0p25.f003``. +For example, a forecast using FV3GFS GRIB2 data that starts at 18h00 UTC would have a ``{cycle}`` value of 18, which is the 000th forecast hour. The LBCS file for 21h00 UTC would be named ``gfs.t18z.pgrb2.0p25.f003``. In some cases, it may be necessary to specify values for ``EXTRN_MDL_FILES_*CS`` variables. This is often the case with HRRR and RAP data. An example ``config.yaml`` excerpt using HRRR and RAP data appears below: @@ -350,6 +346,13 @@ AWS S3 Data Buckets: * GEFS: https://registry.opendata.aws/noaa-gefs/ * GDAS: https://registry.opendata.aws/noaa-gfs-bdp-pds/ * HRRR: https://registry.opendata.aws/noaa-hrrr-pds/ (necessary fields for initializing available for dates 2015 and newer) +* A list of the NOAA Open Data Dissemination (NODD) datasets can be found here: https://www.noaa.gov/nodd/datasets + +NCEI Archive: + +* GFS: https://www.ncei.noaa.gov/products/weather-climate-models/global-forecast +* NAM: https://www.ncei.noaa.gov/products/weather-climate-models/north-american-mesoscale +* RAP: https://www.ncei.noaa.gov/products/weather-climate-models/rapid-refresh-update Google Cloud: diff --git a/docs/UsersGuide/source/CustomizingTheWorkflow/LAMGrids.rst b/docs/UsersGuide/source/CustomizingTheWorkflow/LAMGrids.rst index db04ad5043..e86b755c8f 100644 --- a/docs/UsersGuide/source/CustomizingTheWorkflow/LAMGrids.rst +++ b/docs/UsersGuide/source/CustomizingTheWorkflow/LAMGrids.rst @@ -1,3 +1,6 @@ +.. role:: raw-html(raw) + :format: html + .. 
_LAMGrids: ================================================================================= @@ -5,19 +8,20 @@ Limited Area Model (:term:`LAM`) Grids: Predefined and User-Generated Options ================================================================================= In order to set up the workflow and generate an experiment with the SRW Application, the user must choose between various predefined :term:`FV3`-:term:`LAM` grids or generate a user-defined grid. -At this time, full support is only provided to those using one of the four predefined -grids supported in the v2.1.0 release, but other predefined grids are available (see :numref:`Section %s ` for more detail). Preliminary information is also provided at the end of this chapter describing how users can leverage the SRW App workflow scripts to generate their own user-defined grid and/or adjust the number of vertical levels in the grid. Currently, this feature is not fully supported and is "use at your own risk." +At this time, full support is only provided to those using one of the five predefined +grids supported in the v2.2.0 release, but other predefined grids are available (see :numref:`Section %s ` for more detail). Preliminary information is also provided at the end of this chapter describing how users can leverage the SRW App workflow scripts to generate their own user-defined grid and/or to adjust the number of vertical levels in the grid. Currently, these features are not fully supported and are "use at your own risk." Predefined Grids ================= -The SRW App v2.1.0 release includes four predefined limited area model (:term:`LAM`) grids. To select a supported predefined grid, the ``PREDEF_GRID_NAME`` variable within the ``workflow:`` section of the ``config.yaml`` script must be set to one of the following four options: +The SRW App v2.2.0 release includes five predefined limited area model (:term:`LAM`) grids. To select a supported predefined grid, the ``PREDEF_GRID_NAME`` variable within the ``workflow:`` section of the ``config.yaml`` script must be set to one of the following five options: * ``RRFS_CONUS_3km`` * ``RRFS_CONUS_13km`` * ``RRFS_CONUS_25km`` * ``SUBCONUS_Ind_3km`` +* ``RRFS_NA_13km`` -These four options are provided for flexibility related to compute resources and supported physics options. Other predefined grids are listed :ref:`here `. The high-resolution 3-km :term:`CONUS` grid generally requires more compute power and works well with three of the five supported physics suites (see :numref:`Table %s `). Low-resolution grids (i.e., 13-km and 25-km domains) require less compute power and should generally be used with the other supported physics suites: ``FV3_GFS_v16`` and ``FV3_RAP``. +These five options are provided for flexibility related to compute resources and supported physics options. Other predefined grids are listed :ref:`here `. The high-resolution 3-km :term:`CONUS` grid generally requires more compute power and works well with three of the five supported physics suites (see :numref:`Table %s `). Low-resolution grids (i.e., 13-km and 25-km domains) require less compute power and should generally be used with the other supported physics suites: ``FV3_GFS_v16`` and ``FV3_RAP``. .. 
_GridPhysicsCombos: @@ -42,6 +46,10 @@ These four options are provided for flexibility related to compute resources and | | | | | FV3_RAP | +-------------------+------------------+ + | RRFS_NA_13km | FV3_RAP | + | | | + | | FV3_GFS_v16 | + +-------------------+------------------+ | RRFS_CONUS_25km | FV3_GFS_v16 | | | | | | FV3_RAP | @@ -49,7 +57,7 @@ These four options are provided for flexibility related to compute resources and In theory, it is possible to run any of the supported physics suites with any of the predefined grids, but the results will be more accurate and meaningful with appropriate grid/physics pairings. -The predefined :term:`CONUS` grids follow the naming convention (e.g., ``RRFS_CONUS_*km``) of the 3-km version of the continental United States (CONUS) grid being tested for the Rapid Refresh Forecast System (:term:`RRFS`). RRFS will be a convection-allowing, hourly-cycled, :term:`FV3`-:term:`LAM`-based ensemble planned for operational implementation in 2025. All four supported grids were created to fit completely within the High Resolution Rapid Refresh (`HRRR `_) domain to allow for use of HRRR data to initialize the SRW App. +The predefined :term:`CONUS` grids follow the naming convention (e.g., ``RRFS_CONUS_*km``) of the 3-km version of the continental United States (CONUS) grid being tested for the Rapid Refresh Forecast System (:term:`RRFS`). RRFS will be a convection-allowing, hourly-cycled, :term:`FV3`-:term:`LAM`-based ensemble planned for operational implementation in 2025. Aside from ``RRFS_NA_13km``, the supported predefined grids were created to fit completely within the High Resolution Rapid Refresh (`HRRR `__) domain to allow for use of HRRR data to initialize the SRW App. The ``RRFS_NA_13km`` grid covers all of North America and therefore requires different data to initialize. Predefined 3-km CONUS Grid ----------------------------- @@ -58,13 +66,13 @@ The 3-km CONUS domain is ideal for running the ``FV3_RRFS_v1beta`` physics suite .. _RRFS_CONUS_3km: -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/RRFS_CONUS_3km.sphr.native_wrtcmp.png - :alt: Map of the continental United States 3 kilometer domain. The computational grid boundaries appear in red and the write-component grid appears just inside the computational grid boundaries in blue. +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/LAMGrids/RRFS_CONUS_3km.sphr.native_wrtcmp.png + :alt: Map of the continental United States 3 kilometer domain. The computational grid boundaries appear in red and the write component grid appears just inside the computational grid boundaries in blue. - *The boundary of the RRFS_CONUS_3km computational grid (red) and corresponding write-component grid (blue).* + *The boundary of the RRFS_CONUS_3km computational grid (red) and corresponding write component grid (blue).* -The boundary of the ``RRFS_CONUS_3km`` domain is shown in :numref:`Figure %s ` (in red), and the boundary of the :ref:`write-component grid ` sits just inside the computational domain (in blue). This extra grid is required because the post-processing utility (:term:`UPP`) is unable to process data on the native FV3 gnomonic grid (in red). Therefore, model data are interpolated to a Lambert conformal grid (the write component grid) in order for the :term:`UPP` to read in and correctly process the data. 
+The boundary of the ``RRFS_CONUS_3km`` domain is shown in :numref:`Figure %s ` (in red), and the boundary of the :ref:`write component grid ` sits just inside the computational domain (in blue). This extra grid is required because the post-processing utility (:term:`UPP`) is unable to process data on the native FV3 gnomonic grid (in red). Therefore, model data are interpolated to a Lambert conformal grid (the write component grid) in order for the :term:`UPP` to read in and correctly process the data. .. note:: While it is possible to initialize the FV3-LAM with coarser external model data when using the ``RRFS_CONUS_3km`` domain, it is generally advised to use external model data (such as HRRR or RAP data) that has a resolution similar to that of the native FV3-LAM (predefined) grid. @@ -75,47 +83,81 @@ Predefined SUBCONUS Grid Over Indianapolis .. _SUBCONUS_Ind_3km: -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/SUBCONUS_Ind_3km.png - :alt: Map of Indiana and portions of the surrounding states. The map shows the boundaries of the continental United States sub-grid centered over Indianapolis. The computational grid boundaries appear in red and the write-component grid appears just inside the computational grid boundaries in blue. +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/LAMGrids/SUBCONUS_Ind_3km.png + :alt: Map of Indiana and portions of the surrounding states. The map shows the boundaries of the continental United States sub-grid centered over Indianapolis. The computational grid boundaries appear in red and the write component grid appears just inside the computational grid boundaries in blue. - *The boundary of the SUBCONUS_Ind_3km computational grid (red) and corresponding write-component grid (blue).* + *The boundary of the SUBCONUS_Ind_3km computational grid (red) and corresponding write component grid (blue).* The ``SUBCONUS_Ind_3km`` grid covers only a small section of the :term:`CONUS` centered over Indianapolis. Like the ``RRFS_CONUS_3km`` grid, it is ideally paired with the ``FV3_RRFS_v1beta``, ``FV3_HRRR``, or ``FV3_WoFS`` physics suites, since these are all convection-allowing physics suites designed to work well on high-resolution grids. -Predefined 13-km Grid ------------------------- +Predefined 13-km CONUS Grid +----------------------------- .. _RRFS_CONUS_13km: -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/RRFS_CONUS_13km.sphr.native_wrtcmp.png - :alt: Map of the continental United States 13 kilometer domain. The computational grid boundaries appear in red and the write-component grid appears just inside the computational grid boundaries in blue. +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/LAMGrids/RRFS_CONUS_13km.sphr.native_wrtcmp.png + :alt: Map of the continental United States 13 kilometer domain. The computational grid boundaries appear in red and the write component grid appears just inside the computational grid boundaries in blue. - *The boundary of the RRFS_CONUS_13km computational grid (red) and corresponding write-component grid (blue).* + *The boundary of the RRFS_CONUS_13km computational grid (red) and corresponding write component grid (blue).* The ``RRFS_CONUS_13km`` grid (:numref:`Fig. %s `) covers the full :term:`CONUS`. This grid is meant to be run with the ``FV3_GFS_v16`` or ``FV3_RAP`` physics suites. These suites use convective :term:`parameterizations`, whereas the other supported suites do not. 
Convective parameterizations are necessary for low-resolution grids because convection occurs on scales smaller than 25-km and 13-km. +Predefined 13-km North American Grid +-------------------------------------- + +.. _RRFS_NA_13km: + +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/LAMGrids/RRFS_NA_13km.sphr.whole.png + :alt: Map of the North American 13 kilometer domain. The computational grid boundaries appear in red and the write component grid appears just outside the computational grid boundaries in blue. + + *The boundary of the RRFS_NA_13km computational grid (red) and corresponding write component grid (blue).* + +The ``RRFS_NA_13km`` grid (:numref:`Fig. %s `) covers all of North America. This grid was designed to run with the ``FV3_RAP`` physics suite but can also be run with the ``FV3_GFS_v16`` suite. These suites use convective :term:`parameterizations`, whereas the other supported suites do not. Convective parameterizations are necessary for low-resolution grids because convection occurs on scales smaller than 25-km and 13-km. + +Corner plots for the ``RRFS_NA_13km`` grid in :numref:`Table %s ` show the 4-cell-wide :term:`halo` on the computational grid in orange, which gives an idea of the size of the grid cells. + +.. |logo1| image:: https://github.com/ufs-community/ufs-srweather-app/wiki/LAMGrids/RRFS_NA_13km.upper_left_w_halo.png + :alt: Upper left corner of the RRFS_NA_13km with computational grid and four-cell-wide halo in orange and write component grid outside of the computational grid in blue. + +.. |logo2| image:: https://github.com/ufs-community/ufs-srweather-app/wiki/LAMGrids/RRFS_NA_13km.upper_right_w_halo.png + :alt: Upper right corner of the RRFS_NA_13km with computational grid and four-cell-wide halo in orange and write component grid outside of the computational grid in blue. + +.. |logo3| image:: https://github.com/ufs-community/ufs-srweather-app/wiki/LAMGrids/RRFS_NA_13km.lower_right_w_halo.png + :alt: Lower right corner of the RRFS_NA_13km with computational grid and four-cell-wide halo in orange and write component grid outside of the computational grid in blue. + +.. |logo4| image:: https://github.com/ufs-community/ufs-srweather-app/wiki/LAMGrids/RRFS_NA_13km.lower_left_w_halo.png + :alt: Lower left corner of the RRFS_NA_13km with computational grid and four-cell-wide halo in orange and write component grid outside of the computational grid in blue. + +.. _CornerPlots: +.. list-table:: Corner Plots for the RRFS_NA_13km Grid + + * - |logo1| :raw-html:`

Upper left w/halo`
+     - |logo2| :raw-html:`Upper right w/halo`
+     - |logo3| :raw-html:`Lower right w/halo`
+     - |logo4| :raw-html:`Lower left w/halo
` + Predefined 25-km Grid ------------------------ .. _RRFS_CONUS_25km: -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/RRFS_CONUS_25km.sphr.native_wrtcmp.png - :alt: Map of the continental United States 25 kilometer domain. The computational grid boundaries appear in red and the write-component grid appears just inside the computational grid boundaries in blue. +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/LAMGrids/RRFS_CONUS_25km.sphr.native_wrtcmp.png + :alt: Map of the continental United States 25 kilometer domain. The computational grid boundaries appear in red and the write component grid appears just inside the computational grid boundaries in blue. - *The boundary of the RRFS_CONUS_25km computational grid (red) and corresponding write-component grid (blue).* + *The boundary of the RRFS_CONUS_25km computational grid (red) and corresponding write component grid (blue).* The final predefined :term:`CONUS` grid (:numref:`Fig. %s `) uses a 25-km resolution and is meant mostly for quick testing to ensure functionality prior to using a higher-resolution domain. -However, for users who would like to use the 25-km domain for research, the ``FV3_GFS_v16`` :term:`SDF` is recommended for the reasons mentioned :ref:`above `. +However, if users plan to use the 25-km domain for research, the ``FV3_GFS_v16`` :term:`SDF` is recommended for the reasons mentioned :ref:`above `. -Ultimately, the choice of grid is experiment-dependent and resource-dependent. For example, a user may wish to use the ``FV3_GFS_v16`` physics suite, which uses cumulus physics that are not configured to run at the 3-km resolution. In this case, the 13-km or 25-km domain options are better suited to the experiment. Users will also have fewer computational constraints when running with the 13-km and 25-km domains, so depending on the resources available, certain grids may be better options than others. +Ultimately, the choice of grid is experiment-dependent and resource-dependent. For example, a user may wish to use the ``FV3_GFS_v16`` physics suite, which uses cumulus physics that are not configured to run at the 3-km resolution. In this case, the 13-km or 25-km domain options are better suited to the experiment. Users will also have fewer computational constraints when running with the 13-km and 25-km CONUS domains, so depending on the resources available, certain grids may be better options than others. .. _UserDefinedGrid: Creating User-Generated Grids =============================== -While the four supported predefined grids are ideal for users just starting +While the five supported predefined grids are ideal for users just starting out with the SRW App, more advanced users may wish to create their own predefined grid for testing over a different region and/or with a different resolution. Creating a user-defined grid requires knowledge of how the SRW App workflow functions. In particular, it is important to understand the set of @@ -131,7 +173,7 @@ The steps to add such a grid to the workflow are as follows: #. Add NEW_GRID to the array ``valid_vals_PREDEF_GRID_NAME`` in the ``ufs-srweather-app/ush/valid_param_vals.yaml`` file. -#. In ``ufs-srweather-app/ush/predef_grid_params.yaml``, add a stanza describing the parameters for NEW_GRID. An example of such a stanza is given :ref:`below `. For descriptions of the variables that need to be set, see Sections :numref:`%s ` and :numref:`%s `. +#. 
In ``ufs-srweather-app/ush/predef_grid_params.yaml``, add a stanza describing the parameters for NEW_GRID. An example of such a stanza is given :ref:`below `. For descriptions of the variables that need to be set, see Sections :numref:`%s: ESGgrid Settings ` and :numref:`%s: Forecast Configuration Parameters `. To run a forecast experiment on NEW_GRID, start with a workflow configuration file for a successful experiment (e.g., ``config.community.yaml``, located in the ``ufs-srweather-app/ush`` subdirectory), and change the line for ``PREDEF_GRID_NAME`` in the ``workflow:`` section to ``NEW_GRID``: @@ -160,12 +202,11 @@ The following is an example of a code stanza for "NEW_GRID" to be added to ``pre "NEW_GRID": - # The method used to generate the grid. This example is specifically for the "ESGgrid" method. - + # The method used to generate the grid. This example is specifically for the "ESGgrid" method. GRID_GEN_METHOD: "ESGgrid" - # ESGgrid parameters: + # ESGgrid parameters ESGgrid_LON_CTR: -97.5 ESGgrid_LAT_CTR: 38.5 @@ -176,14 +217,14 @@ The following is an example of a code stanza for "NEW_GRID" to be added to ``pre ESGgrid_PAZI: 0.0 ESGgrid_WIDE_HALO_WIDTH: 6 - # Forecast configuration parameters: + # Forecast configuration parameters DT_ATMOS: 40 LAYOUT_X: 5 LAYOUT_Y: 2 BLOCKSIZE: 40 - # Parameters for the write-component (aka "quilting") grid. + # Parameters for the write component (aka "quilting") grid. QUILTING: WRTCMP_write_groups: 1 @@ -194,7 +235,7 @@ The following is an example of a code stanza for "NEW_GRID" to be added to ``pre WRTCMP_lon_lwr_left: -121.12455072 WRTCMP_lat_lwr_left: 23.89394570 - # Parameters required for the Lambert conformal grid mapping. + # Parameters required for the Lambert conformal grid mapping. WRTCMP_stdlat1: 38.5 WRTCMP_stdlat2: 38.5 @@ -204,35 +245,35 @@ The following is an example of a code stanza for "NEW_GRID" to be added to ``pre WRTCMP_dy: 25000.0 .. note:: - The process above explains how to create a new *predefined* grid, which can be used more than once. If a user prefers to create a custom grid for one-time use, the variables above can instead be specified in ``config.yaml``, and ``PREDEF_GRID_NAME`` can be set to a null string. In this case, it is not necessary to modify ``valid_param_vals.yaml`` or ``predef_grid_params.yaml``. Users can view an example configuration file for a custom grid `here `__. + The process above explains how to create a new *predefined* grid, which can be used more than once. If a user prefers to create a custom grid for one-time use, the variables above can instead be specified in ``config.yaml``, and ``PREDEF_GRID_NAME`` can be set to a null string. In this case, it is not necessary to modify ``valid_param_vals.yaml`` or ``predef_grid_params.yaml``. Users can view an example configuration file for a custom grid `here `__. .. _VerticalLevels: Changing the Number of Vertical Levels ======================================== -The four supported predefined grids included with the SRW App are configured to run with 65 levels by default. However, advanced users may wish to vary the number of vertical levels in the grids they are using, whether these be the predefined grids or a user-generated grid. Varying the number of vertical levels requires +The five supported predefined grids included with the SRW App are configured to run with 65 levels by default. However, advanced users may wish to vary the number of vertical levels in the grids they are using, whether these be the predefined grids or a user-generated grid. 
Varying the number of vertical levels requires knowledge of how the SRW App interfaces with the UFS Weather Model (:term:`WM `) and preprocessing utilities. It is also important to note that user-defined vertical levels are not a supported feature at present; information is being provided for the benefit of the FV3-LAM community, but user support for this feature is limited. With those caveats in mind, this section provides instructions for creating a user-defined vertical coordinate distribution on a regional grid. Find ``ak``/``bk`` -------------------- -Users will need to determine ``ak`` and ``bk`` values, which are used to define the vertical levels. The UFS WM uses a hybrid vertical coordinate system, which moves from purely sigma levels near the surface to purely isobaric levels near the top of the atmosphere (TOA). The equation :math:`pk=ak+bk*ps` (where ``ps`` is surface pressure) is used to derive the pressure value at a given level. The ``ak`` values define the contribution from the purely isobaric component of the hybrid vertical coordinate, and the ``bk`` values are the contribution from the sigma component. When ``ak`` and ``bk`` are both zero, it is the TOA (pressure is zero). When ``bk`` is 1 and ak is 0, it is a purely sigma vertical coordinate surface, which is the case near the surface (the first model level). +Users will need to determine ``ak`` and ``bk`` values, which are used to define the vertical levels. The UFS WM uses a hybrid vertical coordinate system, which moves from purely sigma levels near the surface to purely isobaric levels near the top of the atmosphere (TOA). The equation :math:`pk=ak+bk*ps` (where ``ps`` is surface pressure) is used to derive the pressure value at a given level. The ``ak`` values define the contribution from the purely isobaric component of the hybrid vertical coordinate, and the ``bk`` values are the contribution from the sigma component. When ``ak`` and ``bk`` are both zero, it is the TOA (pressure is zero). When ``bk`` is 1 and ``ak`` is 0, it is a purely sigma vertical coordinate surface, which is the case near the surface (the first model level). The ``vcoord_gen`` tool from UFS_UTILS can be used to generate ``ak`` and ``bk`` values, although users may choose a different tool if they prefer. The program can output a text file containing ``ak`` and ``bk`` values for each model level, which will be used by ``chgres_cube`` in the ``make_ics_*`` and ``make_lbcs_*`` tasks to generate the initial and lateral boundary conditions from the external data. -Users can find ``vcoord_gen`` `technical documentation here `__ and `scientific documentation here `__. Since UFS_UTILS is part of the SRW App, users can find and run the UFS_UTILS ``vcoord_gen`` tool in their ``ufs-srweather-app/exec`` directory. To run ``vcoord_gen`` within the SRW App: +Users can find ``vcoord_gen`` `technical documentation here `__ and `scientific documentation here `__. Since UFS_UTILS is part of the SRW App, users can find and run the UFS_UTILS ``vcoord_gen`` tool in their ``ufs-srweather-app/exec`` directory. To run ``vcoord_gen`` within the SRW App: .. code-block:: console cd /path/to/ufs-srweather-app/exec ./vcoord_gen > /path/to/vcoord_gen_outfile.txt -Users should modify the output file path to save the output file in the desired location. In the SRW App, the default file defining vertical levels is named ``global_hyblev.txt`` and contains the default 65 levels. 
By convention, users who create a new vertical coodinate distribution file often append this file name with ``LXX`` or ``LXXX`` for their number of levels (e.g., ``global_hyblev.L128.txt``). Configuration files are typically placed in the ``parm`` directory. For example, users might run: +Users should modify the output file path (``/path/to/vcoord_gen_outfile.txt``) to save the output file in the desired location. In the SRW App, the default file defining vertical levels is named ``global_hyblev.txt`` and contains the default 65 levels. By convention, users who create a new vertical coodinate distribution file often append this file name with ``LXX`` or ``LXXX`` for their number of levels (e.g., ``global_hyblev.L128.txt``). Configuration files are typically placed in the ``parm`` directory. For example, a user (Jane Smith) might run: .. code-block:: console - cd /path/to/ufs-srweather-app/exec + cd /Users/Jane.Smith/ufs-srweather-app/exec ./vcoord_gen > /Users/Jane.Smith/ufs-srweather-app/parm/global_hyblev.L128.txt When ``vcoord_gen`` starts, it will print a message telling users to specify certain variables for ``ak``/``bk`` generation: @@ -409,7 +450,7 @@ Additionally, check that ``external_eta = .true.``. Modify ``config.yaml`` ^^^^^^^^^^^^^^^^^^^^^^^^ -To use the text file produced by ``vcoord_gen`` in the SRW App, users need to set the ``VCOORD_FILE`` variable in their ``config.yaml`` file. Normally, this file is named ``global_hyblev.l65.txt`` and is located in the ``fix_am`` directory on Level 1 systems, but users should adjust the path and name of the file to suit their system. For example, in ``config.yaml``, set: +To use the text file produced by ``vcoord_gen`` in the SRW App, users need to set the ``VCOORD_FILE`` variable in their ``config.yaml`` file. Normally, this file is named ``global_hyblev.l65.txt`` and is located in the ``fix_am`` directory on Level 1 systems, but users should adjust the path and name of the file to suit their system. For example, in ``config.yaml``, a user (Jane Smith) might set: .. code-block:: console diff --git a/docs/UsersGuide/source/Reference/FAQ.rst b/docs/UsersGuide/source/Reference/FAQ.rst index 237980789c..728223f263 100644 --- a/docs/UsersGuide/source/Reference/FAQ.rst +++ b/docs/UsersGuide/source/Reference/FAQ.rst @@ -49,7 +49,7 @@ model directory to the experiment directory ``$EXPTDIR``. For more information o How do I change the grid? =========================== -To change the predefined grid, modify the ``PREDEF_GRID_NAME`` variable in the ``task_run_fcst:`` section of the ``config.yaml`` script (see :numref:`Section %s ` for details on creating and modifying the ``config.yaml`` file). The four supported predefined grids as of the SRW Application v2.1.0 release are: +To change the predefined grid, modify the ``PREDEF_GRID_NAME`` variable in the ``task_run_fcst:`` section of the ``config.yaml`` script (see :numref:`Section %s ` for details on creating and modifying the ``config.yaml`` file). The five supported predefined grids as of the SRW Application |latestr| release are: .. code-block:: console @@ -57,8 +57,9 @@ To change the predefined grid, modify the ``PREDEF_GRID_NAME`` variable in the ` RRFS_CONUS_13km RRFS_CONUS_25km SUBCONUS_Ind_3km + RRFS_NA_13km -However, users can choose from a variety of predefined grids listed in :numref:`Section %s `. An option also exists to create a user-defined grid, with information available in :numref:`Section %s `. 
However, the user-defined grid option is not fully supported as of the v2.1.0 release and is provided for informational purposes only. +However, users can choose from a variety of predefined grids listed in :numref:`Section %s `. An option also exists to create a user-defined grid, with information available in :numref:`Section %s `. However, the user-defined grid option is not fully supported as of the |latestr| release and is provided for informational purposes only. .. _SetTasks: @@ -66,60 +67,9 @@ However, users can choose from a variety of predefined grids listed in :numref:` How can I select which workflow tasks to run? =============================================== -The ``/parm/wflow`` directory contains several ``YAML`` files that configure different workflow task groups. Each task group file contains a number of tasks that are typically run together. :numref:`Table %s ` describes each of the task groups. - -.. _task-group-files: - -.. list-table:: Task group files - :widths: 20 50 - :header-rows: 1 - - * - File - - Function - * - aqm_post.yaml - - SRW-AQM post-processing tasks - * - aqm_prep.yaml - - SRW-AQM pre-processing tasks - * - coldstart.yaml - - Tasks required to run a cold-start forecast - * - da_data_preproc.yaml - - Preprocessing tasks for RRFS `DA `. - * - plot.yaml - - Plotting tasks - * - post.yaml - - Post-processing tasks - * - prdgen.yaml - - Horizontal map projection processor that creates smaller domain products from the larger domain created by the UPP. - * - prep.yaml - - Pre-processing tasks - * - verify_det.yaml - - Deterministic verification tasks - * - verify_ens.yaml - - Ensemble verification tasks - * - verify_pre.yaml - - Verification pre-processing tasks - -The default workflow task groups are set in ``parm/wflow/default_workflow.yaml`` and include ``prep.yaml``, ``coldstart.yaml``, and ``post.yaml``. Changing this list of task groups in the user configuration file (``config.yaml``) will override the default and run only the task groups listed. For example, to omit :term:`cycle-independent` tasks and run plotting tasks, users would delete ``prep.yaml`` from the list of tasks and add ``plot.yaml``: +:numref:`Section %s ` provides a full description of how to turn on/off workflow tasks. -.. code-block:: console - - rocoto: - tasks: - taskgroups: '{{ ["parm/wflow/coldstart.yaml", "parm/wflow/post.yaml", "parm/wflow/plot.yaml"]|include }}' - -Users may need to make additional adjustments to ``config.yaml`` depending on which task groups they add or remove. For example, when plotting, the user should add the plotting increment (``PLOT_FCST_INC``) for the plotting tasks in ``task_plot_allvars``. - -Users can omit specific tasks from a task group by including them under the list of tasks as an empty entry. For example, if a user wanted to run only ``task_pre_post_stat`` from ``aqm_post.yaml``, the taskgroups list would include ``aqm_post.yaml``, and the tasks that the user wanted to omit would be listed with no value: - -.. code-block:: console - - rocoto: - tasks: - taskgroups: '{{ ["parm/wflow/prep.yaml", "parm/wflow/coldstart.yaml", "parm/wflow/post.yaml", "parm/wflow/aqm_post.yaml"]|include }}' - task_post_stat_o3: - task_post_stat_pm25: - task_bias_correction_o3: - task_bias_correction_pm25: +The default workflow tasks are defined in ``ufs-srweather-app/parm/wflow/default_workflow.yaml``. However, the ``/parm/wflow`` directory contains several ``YAML`` files that configure different workflow task groups. 
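For illustration, a minimal ``config.yaml`` override of the default task groups might look like the sketch below, which drops the :term:`cycle-independent` ``prep.yaml`` tasks and adds the plotting tasks (the task group list shown is an example only; some task groups need additional settings, such as ``PLOT_FCST_INC`` for plotting):

.. code-block:: console

   rocoto:
     tasks:
       taskgroups: '{{ ["parm/wflow/coldstart.yaml", "parm/wflow/post.yaml", "parm/wflow/plot.yaml"]|include }}'
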
Each file contains a number of tasks that are typically run together (see :numref:`Table %s ` for a description of each task group). To add or remove workflow tasks, users will need to alter the user configuration file (``config.yaml``) as described in :numref:`Section %s ` to override the default workflow and run the selected tasks and task groups. .. _CycleInd: @@ -142,7 +92,7 @@ To skip these tasks, remove ``parm/wflow/prep.yaml`` from the list of task group tasks: taskgroups: '{{ ["parm/wflow/coldstart.yaml", "parm/wflow/post.yaml"]|include }}' -Then, add the paths to the previously generated grid, orography, and surface climatology files under the appropariate tasks in ``config.yaml``: +Then, add the appropriate tasks and paths to the previously generated grid, orography, and surface climatology files to ``config.yaml``: .. code-block:: console @@ -161,7 +111,7 @@ All three sets of files *may* be placed in the same directory location (and woul How do I restart a DEAD task? ============================= -On platforms that utilize Rocoto workflow software (such as NCAR's Cheyenne machine), if something goes wrong with the workflow, a task may end up in the DEAD state: +On platforms that utilize Rocoto workflow software (such as NCAR's Derecho machine), if something goes wrong with the workflow, a task may end up in the DEAD state: .. code-block:: console @@ -226,7 +176,7 @@ In addition to the options above, many standard terminal commands can be run to How can I run a new experiment? ================================== -To run a new experiment at a later time, users need to rerun the commands in :numref:`Section %s ` that reactivate the *workflow_tools* environment: +To run a new experiment at a later time, users need to rerun the commands in :numref:`Section %s ` that reactivate the |wflow_env| environment: .. code-block:: console @@ -234,9 +184,9 @@ To run a new experiment at a later time, users need to rerun the commands in :nu module use /path/to/modulefiles module load wflow_ -Follow any instructions output by the console (e.g., ``conda activate workflow_tools``). +Follow any instructions output by the console (e.g., |activate|). -Then, users can configure a new experiment by updating the environment variables in ``config.yaml`` to reflect the desired experiment configuration. Detailed instructions can be viewed in :numref:`Section %s `. Parameters and valid values are listed in :numref:`Section %s `. After adjusting the configuration file, generate the new experiment by running ``./generate_FV3LAM_wflow.py``. Check progress by navigating to the ``$EXPTDIR`` and running ``rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10``. +Then, users can configure a new experiment by updating the experiment parameters in ``config.yaml`` to reflect the desired experiment configuration. Detailed instructions can be viewed in :numref:`Section %s `. Parameters and valid values are listed in :numref:`Section %s `. After adjusting the configuration file, generate the new experiment by running ``./generate_FV3LAM_wflow.py``. Check progress by navigating to the ``$EXPTDIR`` and running ``rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10``. .. note:: diff --git a/docs/UsersGuide/source/Reference/Glossary.rst b/docs/UsersGuide/source/Reference/Glossary.rst index ebeffa33e8..1be57408ab 100644 --- a/docs/UsersGuide/source/Reference/Glossary.rst +++ b/docs/UsersGuide/source/Reference/Glossary.rst @@ -22,7 +22,7 @@ Glossary Climatology-Calibrated Precipitation Analysis (CCPA) data. 
This data is required for METplus precipitation verification tasks within the SRW App. The most recent 8 days worth of data are publicly available and can be accessed `here `__. CCPP - The `Common Community Physics Package `_ is a forecast-model agnostic, vetted collection of code containing atmospheric physical parameterizations and suites of parameterizations for use in Numerical Weather Prediction (NWP) along with a framework that connects the physics to the host forecast model. + The `Common Community Physics Package `__ is a forecast-model agnostic, vetted collection of code containing atmospheric physical parameterizations and suites of parameterizations for use in Numerical Weather Prediction (NWP) along with a framework that connects the physics to the host forecast model. chgres_cube The preprocessing software used to create initial and boundary condition files to @@ -97,7 +97,7 @@ Glossary Fluid Dynamics Laboratory `__ (GFDL), it is a scalable and flexible dycore capable of both hydrostatic and non-hydrostatic atmospheric simulations. It is the dycore used in the UFS Weather Model. FVCOM - `Finite Volume Community Ocean Model `__. FVCOM is used in modeling work for the `Great Lakes Coastal Forecasting System (next-gen FVCOM) `__ conducted by the `Great Lakes Environmental Research Laboratory `__. + `Finite Volume Community Ocean Model `__. FVCOM is used in modeling work for the `Great Lakes Coastal Forecasting System (next-gen FVCOM) `__ conducted by the `Great Lakes Environmental Research Laboratory `__. GFS `Global Forecast System `_. The GFS is a National Centers for Environmental Prediction (:term:`NCEP`) weather forecast model that generates data for dozens of atmospheric and land-soil variables, including temperatures, winds, precipitation, soil moisture, and atmospheric ozone concentration. The system couples four separate models (atmosphere, ocean, land/soil, and sea ice) that work together to accurately depict weather conditions. @@ -236,7 +236,7 @@ Glossary `Spack `__ is a package management tool designed to support multiple versions and configurations of software on a wide variety of platforms and environments. It was designed for large supercomputing centers, where many users and application teams share common installations of software on clusters with exotic architectures. spack-stack - The `spack-stack `__ is a collaborative effort between the NOAA Environmental Modeling Center (EMC), the UCAR Joint Center for Satellite Data Assimilation (JCSDA), and the Earth Prediction Innovation Center (EPIC). *spack-stack* is a repository that provides a :term:`Spack`-based method for building the software stack required for numerical weather prediction (NWP) tools such as the `Unified Forecast System (UFS) `__ and the `Joint Effort for Data assimilation Integration (JEDI) `__ framework. *spack-stack* uses the Spack package manager along with custom Spack configuration files and Python scripts to simplify installation of the libraries required to run various applications. The *spack-stack* can be installed on a range of platforms and comes pre-configured for many systems. Users can install the necessary packages for a particular application and later add the missing packages for another application without having to rebuild the entire stack. + The `spack-stack `__ is a collaborative effort between the NOAA Environmental Modeling Center (EMC), the UCAR Joint Center for Satellite Data Assimilation (JCSDA), and the Earth Prediction Innovation Center (EPIC). 
*spack-stack* is a repository that provides a :term:`Spack`-based method for building the software stack required for numerical weather prediction (NWP) tools such as the `Unified Forecast System (UFS) `__ and the `Joint Effort for Data assimilation Integration (JEDI) `__ framework. *spack-stack* uses the Spack package manager along with custom Spack configuration files and Python scripts to simplify installation of the libraries required to run various applications. The *spack-stack* can be installed on a range of platforms and comes pre-configured for many systems. Users can install the necessary packages for a particular application and later add the missing packages for another application without having to rebuild the entire stack. To get started, check out the documentation :doc:`here `. tracer According to the American Meteorological Society (AMS) `definition `__, a tracer is "Any substance in the atmosphere that can be used to track the history [i.e., movement] of an air mass." Tracers are carried around by the motion of the atmosphere (i.e., by :term:`advection`). These substances are usually gases (e.g., water vapor, CO2), but they can also be non-gaseous (e.g., rain drops in microphysics parameterizations). In weather models, temperature (or potential temperature), absolute humidity, and radioactivity are also usually treated as tracers. According to AMS, "The main requirement for a tracer is that its lifetime be substantially longer than the transport process under study." @@ -268,7 +268,7 @@ Glossary Weather Model A prognostic model that can be used for short- and medium-range research and operational forecasts. It can be an atmosphere-only model or an atmospheric - model coupled with one or more additional components, such as a wave or ocean model. The SRW App uses the `UFS Weather Model `__. + model coupled with one or more additional components, such as a wave or ocean model. The SRW App uses the `UFS Weather Model `__. Workflow The sequence of steps required to run an experiment from start to finish. diff --git a/docs/UsersGuide/source/conf.py b/docs/UsersGuide/source/conf.py index a6023d339d..a581d0e410 100644 --- a/docs/UsersGuide/source/conf.py +++ b/docs/UsersGuide/source/conf.py @@ -20,14 +20,14 @@ # -- Project information ----------------------------------------------------- -project = 'UFS Short-Range Weather App Users Guide' +project = 'UFS Short-Range Weather App User\'s Guide' copyright = '2020, ' author = ' ' # The short X.Y version -version = 'develop' +version = 'v2.2.0' # The full version, including alpha/beta/rc tags -release = 'Develop Branch Documentation' +release = 'v2.2.0' numfig = True @@ -85,6 +85,16 @@ # The name of the Pygments (syntax highlighting) style to use. pygments_style = 'sphinx' +# Documentation-wide substitutions + +rst_prolog = """ +.. |wflow_env| replace:: ``workflow_tools`` +.. |activate| replace:: ``conda activate workflow_tools`` +.. |prompt| replace:: ``(workflow_tools)`` +.. |latestr| replace:: v2.2.0 +.. |branch| replace:: ``release/public-v2.2.0`` +.. |data| replace:: v2p2 +""" # -- Options for HTML output ------------------------------------------------- @@ -99,7 +109,10 @@ # documentation. # # html_theme_options = {} -html_theme_options = {"body_max_width": "none"} +html_theme_options = { + "body_max_width": "none", + 'navigation_depth': 6, + } # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. 
They are copied after the builtin static files, @@ -208,10 +221,16 @@ def setup(app): # Example configuration for intersphinx: refer to the Python standard library. intersphinx_mapping = { - 'hpc-stack': ('https://hpc-stack-epic.readthedocs.io/en/latest/', None), - 'met': ('https://met.readthedocs.io/en/latest/', None), - 'srw_v2.1.0': ('https://ufs-srweather-app.readthedocs.io/en/release-public-v2.1.0/', None), - 'ufs-wm': ('https://ufs-weather-model.readthedocs.io/en/latest/', None), + 'hpc-stack': ('https://hpc-stack-epic.readthedocs.io/en/release-srw-public-v2.2.0/', None), + 'spack-stack': ('https://spack-stack.readthedocs.io/en/1.4.1/', None), + 'met': ('https://met.readthedocs.io/en/main_v10.1/', None), + 'metplus': ('https://metplus.readthedocs.io/en/main_v4.1/', None), + 'srw_v2.2.0': ('https://ufs-srweather-app.readthedocs.io/en/release-public-v2.2.0/', None), + 'ufs-wm': ('https://ufs-weather-model.readthedocs.io/en/ufs-srw-v2.2.0-doc/', None), + 'upp': ('https://upp.readthedocs.io/en/upp-srw-v2.2.0-docs/', None), + 'ufs-utils': ('https://noaa-emcufs-utils.readthedocs.io/en/ufs_utils_1_11_0/', None), + 'ccpp-techdoc': ('https://ccpp-techdoc.readthedocs.io/en/ufs_srw_app_v2.2.0/', None), + 'stochphys': ('https://stochastic-physics.readthedocs.io/en/release-public-v3/', None), } # -- Options for todo extension ---------------------------------------------- diff --git a/docs/UsersGuide/source/index.rst b/docs/UsersGuide/source/index.rst index 52937c81ae..6c092d9a28 100644 --- a/docs/UsersGuide/source/index.rst +++ b/docs/UsersGuide/source/index.rst @@ -3,8 +3,8 @@ You can adapt this file completely to your liking, but it should at least contain the root `toctree` directive. -UFS Short-Range Weather App Users Guide -======================================= +UFS Short-Range Weather App User's Guide (|version|) +===================================================== .. toctree:: :numbered: diff --git a/docs/UsersGuide/source/tables/CCPPUpdates.rst b/docs/UsersGuide/source/tables/CCPPUpdates.rst deleted file mode 100644 index 8f5bf776a5..0000000000 --- a/docs/UsersGuide/source/tables/CCPPUpdates.rst +++ /dev/null @@ -1,58 +0,0 @@ -:orphan: - -.. _CCPPUpdates: - -================================================ -CCPP Updates for the SRW App v2.1.0 Release -================================================ - -Here is what's new in CCPP Physics for the UFS SRW v2.1.0 public release. These changes are expected to improve the performance of the RRFS_v1beta, HRRR, and WoFS_v0 suites. - -RRFS_v1beta, HRRR, and WoFS Suites: -================================================ - -MYNN-EDMF PBL scheme: - * Added the ability to configure the MYNN-EDMF PBL scheme to function at closure level 2.5, 2.6 (current default), or 3.0 closure and included a partial-condensation scheme. - * Reverted to Tian-Kuang lateral entrainment, which reduces a high relative humidity bias found in some HRRR cases. - * Reduced the entrainment rate for momentum. - * Removed the first-order form of the Chaboureau and Bechtold (CB2002) stratiform cloud fraction calculation---it now only uses a higher form of CB. - * Changed CB to use absolute temperature instead of "liquid" temperature (CB2002). - * Added variable ``sm3d``---a stability function for momentum. 
- -MYNN Surface Layer Scheme: - * Moved four internal parameters to namelist options: - - * ``isftcflux``: flag for thermal roughness lengths over water in MYNN-SFCLAY - * ``iz0tlnd``: flag for thermal roughness lengths over land in MYNN-SFCLAY - * ``sfclay_compute_flux``: flag for computing surface scalar fluxes in MYNN-SFCLAY - * ``sfclay_compute_diag``: flag for computing surface diagnostics in MYNN-SFCLAY - -Subgrid Scale Clouds Interstitial Scheme: - * Separated frozen subgrid clouds into snow and ice categories. - * Added CB2005 as a new cloud fraction option. -RRTMG: - * Removed cloud fraction calculations for the MYNN-EDMF scheme, since cloud fraction is already defined in the subgrid scale cloud scheme used by MYNN-EDMF. - -HRRR Suite: -================================================ - -RUC Land Surface Model: - * In the computation of soil resistance to evaporation, the soil moisture field capacity factor changed from 0.7 to 1. This change will reduce direct evaporation from bare soil. - -GSL Drag Suite: - * Removed limits on the standard deviation of small-scale topography used by the small-scale GWD and turbulent orographic form drag (TOFD) schemes; removed the height limitation of the TOFD scheme. - -Removed the “sfc_nst” scheme from the suite to avert a cooling SST trend that had a negative impact on surface variables in the coastal regions. - -RRFS_v1beta Suite: -================================================ - -Noah-MP Land Surface Model: - * Added a connection with the MYNN surface layer scheme via namelist option ``opt_sfc=4``. - -GFS_v16 Suite: -================================================ - -GFS saSAS Deep Convection and saMF Shallow Cumulus Schemes: - * Added a new prognostic updraft area fraction closure in saSAS and saMF (Bengtsson et al., 2022). It is controlled via namelist option ``progsima`` (set to ``false`` by default) and an updated field table including ``sigmab``. 
- diff --git a/docs/UsersGuide/source/tables/Tests.csv b/docs/UsersGuide/source/tables/Tests.csv index 33b0016e41..2a9c91c0ef 100644 --- a/docs/UsersGuide/source/tables/Tests.csv +++ b/docs/UsersGuide/source/tables/Tests.csv @@ -24,8 +24,8 @@ yes,yes,grid_RRFS_CONUS_25km_ics_NAM_lbcs_NAM_suite_GFS_v16,RRFS_CONUS_25km,FV3_ ,yes,grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_plot,RRFS_CONUS_13km,FV3_GFS_v16,FV3GFS,FV3GFS,2019070100,6,27,18, ,yes,grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_HRRR,RRFS_CONUS_13km,FV3_HRRR,FV3GFS,FV3GFS,2019070100,6,27,20, ,yes,grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_RRFS_v1beta,RRFS_CONUS_13km,FV3_RRFS_v1beta,FV3GFS,FV3GFS,2019070100,6,27,20, -,yes,grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_2017_gfdlmp,RRFS_CONUS_25km,FV3_GFS_2017_gfdlmp,FV3GFS,FV3GFS,2019070112 2019070200,6,39,24, -,yes,grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16,RRFS_CONUS_25km,FV3_GFS_v16,FV3GFS,FV3GFS,2019070100,6,5,23, +,yes,grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_RAP,RRFS_CONUS_25km,FV3_RAP,FV3GFS,FV3GFS,2019070112 2019070200,6,39,24, +,yes,grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v16_plot,RRFS_CONUS_25km,FV3_GFS_v16,FV3GFS,FV3GFS,2019070100,6,5,23, ,yes,grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_HRRR,RRFS_CONUS_25km,FV3_HRRR,FV3GFS,FV3GFS,2019070100,6,15,30, ,yes,grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_RRFS_v1beta,RRFS_CONUS_25km,FV3_RRFS_v1beta,FV3GFS,FV3GFS,2019070100,6,12,34, ,yes,grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_RAP_suite_RAP,RRFS_CONUS_25km,FV3_RAP,FV3GFS,RAP,2019061518,6,18,45, diff --git a/docs/UsersGuide/source/tables/fix_file_list.rst b/docs/UsersGuide/source/tables/fix_file_list.rst deleted file mode 100644 index a20bd39245..0000000000 --- a/docs/UsersGuide/source/tables/fix_file_list.rst +++ /dev/null @@ -1,821 +0,0 @@ -:orphan: - -.. _StaticFilesList: - - -Static Files for SRW App Release v2.1.0 -========================================== - -``fix_aer`` Files ---------------------- - -.. 
code-block:: console - - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2.aerclim.2003-2014.m01.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2.aerclim.2003-2014.m02.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2.aerclim.2003-2014.m03.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2.aerclim.2003-2014.m04.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2.aerclim.2003-2014.m05.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2.aerclim.2003-2014.m06.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2.aerclim.2003-2014.m07.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2.aerclim.2003-2014.m08.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2.aerclim.2003-2014.m09.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2.aerclim.2003-2014.m10.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2.aerclim.2003-2014.m11.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2.aerclim.2003-2014.m12.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2C.aerclim.2003-2014.m01.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2C.aerclim.2003-2014.m02.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2C.aerclim.2003-2014.m03.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2C.aerclim.2003-2014.m04.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2C.aerclim.2003-2014.m05.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2C.aerclim.2003-2014.m06.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2C.aerclim.2003-2014.m07.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2C.aerclim.2003-2014.m08.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2C.aerclim.2003-2014.m09.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2C.aerclim.2003-2014.m10.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2C.aerclim.2003-2014.m11.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_aer/merra2C.aerclim.2003-2014.m12.nc - -``fix_am`` Files ---------------------- - -.. 
code-block:: console - - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/CCN_ACTIVATE.BIN - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/cfs_ice1x1monclim19822001.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/cfs_oi2sst1x1monclim19822001.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/cfs_v2_soilmcpc.1x1.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/CFSR.OISST.1982.2010.monthly.clim - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/CFSR.OISST.1999.2012.monthly.clim.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/CFSR.SEAICE.1982.2010.monthly.clim - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/CFSR.SEAICE.1982.2012.monthly.clim.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2monthlycyc.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/emcsfc_gland5min.grib2 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/emcsfc_snow_cover.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/emcsfc_snow_cover_climo.grib2 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/freezeH2O.dat - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/geo_em.d01.lat-lon.2.5m.HGT_M.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/geo_em.d01.nc_HRRR_AK - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/geo_em.d01.nc_HRRRX - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/geo_em.d01.nc_RAPX - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_1x1_paramlist - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_1x1_paramlist.anl - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_1x1_paramlist.f00 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_aeroinfo.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_aeropac3a.m01.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_aeropac3a.m02.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_aeropac3a.m03.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_aeropac3a.m04.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_aeropac3a.m05.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_aeropac3a.m06.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_aeropac3a.m07.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_aeropac3a.m08.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_aeropac3a.m09.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_aeropac3a.m10.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_aeropac3a.m11.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_aeropac3a.m12.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_albedo4.1x1.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_cldtune.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_climaeropac_global.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2con.l28.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2con.l42.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2con.l64.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1956.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1957.txt - wget 
https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1958.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1959.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1960.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1961.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1962.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1963.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1964.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1965.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1966.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1967.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1968.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1969.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1970.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1971.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1972.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1973.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1974.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1975.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1976.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1977.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1978.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1979.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1980.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1981.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1982.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1983.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1984.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1985.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1986.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1987.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1988.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1989.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1990.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1991.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1992.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1993.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1994.txt - wget 
https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1995.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1996.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1997.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1998.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_1999.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_2000.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_2001.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_2002.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_2003.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_2004.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_2005.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_2006.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_2007.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_2008.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_2009.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_2010.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_2011.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_2012.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_2013.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2historicaldata_glob.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2monthlycyc1976_2006.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2monthlycyc1976_2007.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_co2monthlycyc1976_2009.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_divten.l28.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_divten.l42.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_divten.l64.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_emissivity_coefs.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_gaussian_latitudes.t1148.2304.1152.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_gaussian_latitudes.t1534.3072.1536.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_gaussian_latitudes.t574.1152.576.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_gaussian_latitudes.t670.1344.672.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_gaussian_latitudes.t766.1536.768.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_gaussian_latitudes.t94.192.96.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_glacier.2x2.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_h2o_pltc.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_hd_paramlist - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_hd_paramlist.f00 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_hyblev.l128.txt - wget 
https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_hyblev.l128C.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_hyblev.l150.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_hyblev.l28.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_hyblev.l42.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_hyblev.l60.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_hyblev.l64.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_hyblev.l64sl.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_hyblev.l65.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_hyblev.l65.txt_0.1hPa - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_hyblev.l91.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_hyblev.l98.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_hyblev3.l28.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_hyblev3.l42.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_hyblev3.l60.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_hyblev3.l64.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_iceclim.2x2.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_idea_coeff_hflux.dat - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_idea_coeff_lflux.dat - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_idea_coeff_lte.150 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_idea_coeff_lte.360 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_idea_coeff_lte.540 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_idea_coeff_lte.720 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_idea_ggww_in1.par - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_idea_ggww_in4.par - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_idea_h2ort_kg7t.par - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_idea_h2ovb_kg7t.par - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_idea_wei96.cofcnts - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_kplist.1d.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_kplist.hd.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_kplist.master.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_latitudes.t1148.2304.1152.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_latitudes.t126.384.190.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_latitudes.t1534.3072.1536.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_latitudes.t170.512.256.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_latitudes.t190.384.192.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_latitudes.t190.576.288.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_latitudes.t254.512.256.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_latitudes.t254.768.384.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_latitudes.t382.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_latitudes.t382.768.384.grb - wget 
https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_latitudes.t574.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_latitudes.t574.1760.880.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_latitudes.t62.192.94.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_latitudes.t670.1344.672.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_latitudes.t878.1760.880.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_latitudes.t878.2640.1320.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_latitudes.t92.192.94.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_longitudes.t1148.2304.1152.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_longitudes.t126.384.190.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_longitudes.t1534.3072.1536.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_longitudes.t170.512.256.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_longitudes.t190.384.192.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_longitudes.t190.576.288.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_longitudes.t254.512.256.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_longitudes.t254.768.384.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_longitudes.t382.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_longitudes.t382.768.384.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_longitudes.t574.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_longitudes.t574.1760.880.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_longitudes.t62.192.94.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_longitudes.t670.1344.672.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_longitudes.t878.1760.880.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_longitudes.t878.2640.1320.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_longitudes.t92.192.94.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_lonsperlat.t1148.2304.1152.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_lonsperlat.t126.384.190.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_lonsperlat.t1534.3072.1536.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_lonsperlat.t170.512.256.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_lonsperlat.t190.384.192.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_lonsperlat.t190.576.288.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_lonsperlat.t254.512.256.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_lonsperlat.t254.768.384.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_lonsperlat.t3070.6144.3072.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_lonsperlat.t382.1152.576.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_lonsperlat.t382.768.384.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_lonsperlat.t574.1152.576.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_lonsperlat.t574.1760.880.txt - wget 
https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_lonsperlat.t62.192.94.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_lonsperlat.t670.1344.672.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_lonsperlat.t766.1536.768.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_lonsperlat.t878.1760.880.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_lonsperlat.t878.2640.1320.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_lonsperlat.t92.192.94.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_lonsperlat.t94.192.96.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_maskh.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_master-catchup_parmlist - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_maxice.2x2.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t1148.2304.1152.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t126.384.190.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t126.384.190.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t1534.3072.1536.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t1534.3072.1536.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t170.512.256.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t190.384.192.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t190.384.192.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t190.576.288.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t254.512.256.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t254.512.256.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t254.768.384.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t382.1152.576.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t382.768.384.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t382.768.384.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t574.1152.576.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t574.1152.576.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t574.1760.880.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t62.192.94.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t670.1344.672.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t670.1344.672.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t766.1536.768.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t878.1760.880.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t878.2640.1320.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t92.192.94.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mtnvar.t92.192.94.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t1148.2304.1152.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t126.384.190.grb - wget 
https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t126.384.190.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t1534.3072.1536.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t1534.3072.1536.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t170.512.256.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t190.384.192.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t190.384.192.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t190.576.288.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t190.576.288.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t254.512.256.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t254.512.256.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t254.768.384.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t382.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t382.768.384.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t382.768.384.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t574.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t574.1152.576.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t574.1760.880.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t62.192.94.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t62.192.94.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t670.1344.672.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t670.1344.672.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t766.1536.768.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t766.1536.768.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t878.1760.880.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t878.2640.1320.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t92.192.94.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t92.192.94.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_mxsnoalb.uariz.t94.192.96.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_npoess_paramlist - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_o3clim.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_o3prdlos.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t1148.2304.1152.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t126.384.190.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t126.384.190.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t126.384.190.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t1534.3072.1536.grb - wget 
https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t1534.3072.1536.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t1534.3072.1536.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t170.512.256.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t190.384.192.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t190.384.192.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t190.384.192.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t190.576.288.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t254.512.256.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t254.512.256.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t254.512.256.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t254.768.384.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t382.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t382.768.384.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t382.768.384.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t382.768.384.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t574.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t574.1152.576.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t574.1152.576.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t574.1760.880.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t62.192.94.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t62.192.94.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t670.1344.672.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t670.1344.672.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t670.1344.672.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t766.1536.768.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t766.1536.768.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t878.1760.880.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t878.2640.1320.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t92.192.94.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t92.192.94.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography.t92.192.94.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t1148.2304.1152.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t126.384.190.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t126.384.190.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t126.384.190.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t1534.3072.1536.grb - wget 
https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t1534.3072.1536.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t1534.3072.1536.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t170.512.256.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t190.384.192.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t190.384.192.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t190.384.192.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t190.576.288.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t254.512.256.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t254.512.256.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t254.512.256.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t254.768.384.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t382.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t382.768.384.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t382.768.384.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t382.768.384.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t574.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t574.1152.576.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t574.1152.576.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t574.1760.880.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t62.192.94.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t62.192.94.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t670.1344.672.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t670.1344.672.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t670.1344.672.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t766.1536.768.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t766.1536.768.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t878.1760.880.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t878.2640.1320.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t92.192.94.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t92.192.94.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_uf.t92.192.94.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_orography_0.5x0.5.dat - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_salclm.t1534.3072.1536.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_sfc_emissivity_idx.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_shdmax.0.144x0.144.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_shdmax.grb - wget 
https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_shdmin.0.144x0.144.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_shdmin.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_siglevel.l28.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_siglevel.l42.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_siglevel.l64.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t1148.2304.1152.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t126.384.190.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t126.384.190.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t126.384.190.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t1534.3072.1536.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t1534.3072.1536.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t1534.3072.1536.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t170.512.256.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t190.384.192.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t190.384.192.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t190.384.192.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t190.576.288.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t254.512.256.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t254.512.256.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t254.512.256.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t254.768.384.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t382.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t382.768.384.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t382.768.384.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t382.768.384.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t574.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t574.1152.576.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t574.1152.576.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t574.1760.880.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t62.192.94.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t62.192.94.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t670.1344.672.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t670.1344.672.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t670.1344.672.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t766.1536.768.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t766.1536.768.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t766.1536.768.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t878.1760.880.grb - wget 
https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t878.2640.1320.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t92.192.94.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t92.192.94.rg.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slmask.t92.192.94.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slope.1x1.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_slptyp.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snoalb.1x1.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snoalb.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snoclim.1.875.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t1148.2304.1152.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t126.384.190.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t126.384.190.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t1534.3072.1536.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t1534.3072.1536.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t170.512.256.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t190.384.192.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t190.384.192.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t190.576.288.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t190.576.288.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t254.512.256.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t254.512.256.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t254.768.384.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t382.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t382.768.384.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t382.768.384.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t574.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t574.1152.576.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t574.1760.880.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t62.192.94.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t62.192.94.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t670.1344.672.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t670.1344.672.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t766.1536.768.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t766.1536.768.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t878.1760.880.grb - wget 
https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t878.2640.1320.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t92.192.94.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t92.192.94.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_snowfree_albedo.bosu.t94.192.96.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmcpc.1x1.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.statsgo.t1534.3072.1536.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.statsgo.t254.512.256.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.statsgo.t382.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.statsgo.t382.768.384.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.statsgo.t574.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.statsgo.t766.1536.768.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.statsgo.t92.192.94.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.statsgo.t94.192.96.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.t1148.2304.1152.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.t126.384.190.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.t1534.3072.1536.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.t170.512.256.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.t190.384.192.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.t190.576.288.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.t190.576.288.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.t254.512.256.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.t254.768.384.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.t382.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.t382.768.384.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.t574.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.t574.1760.880.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.t62.192.94.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.t670.1344.672.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.t766.1536.768.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.t878.1760.880.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.t878.2640.1320.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soilmgldas.t92.192.94.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.1x1.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t1148.2304.1152.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t126.384.190.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t126.384.190.rg.grb - wget 
https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t1534.3072.1536.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t1534.3072.1536.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t170.512.256.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t190.384.192.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t190.384.192.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t190.576.288.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t190.576.288.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t254.512.256.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t254.512.256.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t254.768.384.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t382.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t382.768.384.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t382.768.384.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t574.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t574.1152.576.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t574.1760.880.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t62.192.94.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t62.192.94.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t670.1344.672.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t670.1344.672.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t766.1536.768.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t766.1536.768.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t878.1760.880.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t878.2640.1320.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t92.192.94.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t92.192.94.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_soiltype.statsgo.t94.192.96.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_solarconstant_cmip_an.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_solarconstant_cmip_mn.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_solarconstant_noaa_a0.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_solarconstant_noaa_an.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_solarconstant_noaa_an.txt_v2011 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_solarconstant_noaa_an.txt_v2019 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_solarconstantdata.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_spectral_coefs.f77 - wget 
https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_sstclim.2x2.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_tbthe.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_tg3clim.2.6x1.5.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_transmittance_coefs.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vars.l28.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vars.l42.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vars.l64.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegfrac.0.144.decpercent.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegfrac.1x1.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.1x1.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t1148.2304.1152.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t126.384.190.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t126.384.190.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t1534.3072.1536.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t1534.3072.1536.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t170.512.256.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t190.384.192.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t190.384.192.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t190.576.288.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t190.576.288.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t254.512.256.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t254.512.256.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t254.768.384.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t382.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t382.768.384.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t382.768.384.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t574.1152.576.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t574.1152.576.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t574.1760.880.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t62.192.94.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t62.192.94.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t670.1344.672.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t670.1344.672.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t766.1536.768.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t766.1536.768.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t878.1760.880.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t878.2640.1320.grb - wget 
https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t92.192.94.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t92.192.94.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_vegtype.igbp.t94.192.96.rg.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_volcanic_aerosols_1850-1859.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_volcanic_aerosols_1860-1869.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_volcanic_aerosols_1870-1879.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_volcanic_aerosols_1880-1889.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_volcanic_aerosols_1890-1899.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_volcanic_aerosols_1900-1909.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_volcanic_aerosols_1910-1919.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_volcanic_aerosols_1920-1929.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_volcanic_aerosols_1930-1939.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_volcanic_aerosols_1940-1949.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_volcanic_aerosols_1950-1959.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_volcanic_aerosols_1960-1969.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_volcanic_aerosols_1970-1979.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_volcanic_aerosols_1980-1989.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_volcanic_aerosols_1990-1999.txt - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/global_zorclim.1x1.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/HGT.Beljaars_filtered.lat-lon.30s_res.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/latlon_grid3.32769.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/ozone.clim - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/ozprdlos_2015_new_sbuvO3_tclm15_nuchem.f77 - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/qr_acr_qg.dat - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/qr_acr_qgV2.dat - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/qr_acr_qs.dat - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/qr_acr_qsV2.dat - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/rrtmgp-cloud-optics-coeffs-lw.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/rrtmgp-cloud-optics-coeffs-sw.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/rrtmgp-data-lw-g256-2018-12-04.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/rrtmgp-data-sw-g224-2018-12-04.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/rrtmgp-lw-prototype-g128-210413.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/rrtmgp-sw-prototype-g131-210413.nc - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/RTGSST.1982.2012.monthly.clim.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/seaice_newland.grb - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/syndat_fildef.vit - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/syndat_slmask.t126.gaussian - wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/syndat_stmnames - wget 
https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/syndat_stmnames_old
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/syndat_stmnames_old1
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/syndat_stmnames_old2
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/Thompson_MP_MONTHLY_CLIMO.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/ugwp_limb_tau.nc
-
-
-``fix_am/co2dat_4a/`` Files:
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-.. code-block:: console
-
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1956.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1957.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1958.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1959.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1960.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1961.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1962.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1963.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1964.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1965.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1966.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1967.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1968.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1969.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1970.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1971.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1972.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1973.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1974.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1975.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1976.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1977.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1978.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1979.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1980.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1981.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1982.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1983.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1984.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1985.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1986.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1987.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1988.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1989.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1990.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1991.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1992.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1993.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1994.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1995.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1996.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1997.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1998.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_1999.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2000.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2001.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2002.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2003.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2004.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2005.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2006.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2007.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2008.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2009.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2009.txt_proj
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2009.txt_proj_u
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2010.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2010.txt_proj
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2010.txt_proj_u
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2011.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2011.txt_proj
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2011.txt_proj_u
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2012.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2012.txt_proj
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2012.txt_proj_u
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2013.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2013.txt_proj
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2013.txt_proj_u
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2014.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2014.txt_proj
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2014.txt_proj_u
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2015.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2015.txt_proj
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2015.txt_proj_u
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2016.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2016.txt_proj
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2016.txt_proj_u
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2017.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2017.txt_proj
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2017.txt_proj_u
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2018.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2018.txt_proj
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2018.txt_proj_u
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2019.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2019.txt_proj
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2019.txt_proj_u
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2020.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2020.txt_proj
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2020.txt_proj_u
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2021.txt_proj
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2021.txt_proj_u
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_2022.txt_proj
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2historicaldata_glob.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2monthlycyc1976_2006.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/global_co2monthlycyc1976_2009.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/co2dat_4a/MEMO
-
-
-``fix_am/fix_co2_proj`` Files:
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-.. code-block:: console
-
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_proj/global_co2historicaldata_2009.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_proj/global_co2historicaldata_2010.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_proj/global_co2historicaldata_2011.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_proj/global_co2historicaldata_2012.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_proj/global_co2historicaldata_2013.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_proj/global_co2historicaldata_2014.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_proj/global_co2historicaldata_2015.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_proj/global_co2historicaldata_2016.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_proj/global_co2historicaldata_2017.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_proj/global_co2historicaldata_2018.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_proj/global_co2historicaldata_2019.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_proj/global_co2historicaldata_2020.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_proj/global_co2historicaldata_2021.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_proj/global_co2historicaldata_2022.txt
-
-
-``fix_am/fix_co2_update`` Files:
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-.. code-block:: console
-
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_update/global_co2historicaldata_2009.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_update/global_co2historicaldata_2010.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_update/global_co2historicaldata_2011.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_update/global_co2historicaldata_2012.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_update/global_co2historicaldata_2013.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_update/global_co2historicaldata_2014.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_update/global_co2historicaldata_2015.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_update/global_co2historicaldata_2016.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_update/global_co2historicaldata_2017.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_update/global_co2historicaldata_2018.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_update/global_co2historicaldata_2019.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_update/global_co2historicaldata_2020.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_am/fix_co2_update/global_co2historicaldata_2021.txt
-
-
-``fix_lut`` Files
----------------------
-
-.. code-block:: console
-
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_lut/optics_BC.v1_3.dat
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_lut/optics_DU.v15_3.dat
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_lut/optics_DU.v15_3.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_lut/optics_OC.v1_3.dat
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_lut/optics_SS.v3_3.dat
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_lut/optics_SU.v1_3.dat
-
-
-``fix_orog`` Files
----------------------
-
-.. code-block:: console
-
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_orog/clmgrb
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_orog/clmgrb.index
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_orog/convert.f90
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_orog/gmted2010.30sec.flt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_orog/gmted2010.30sec.int
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_orog/gmted2010.30sec.flt.ctl
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_orog/gmted2010.30sec.int.ctl
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_orog/thirty.second.antarctic.new.bin
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_orog/GlobalLakeDepth.dat
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_orog/GlobalLakeDepth.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_orog/GlobalLakeStatus.dat
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_orog/GlobalLakeStatus.txt
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_orog/gtopo30_gg.fine
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_orog/gtopo30_gg.fine.nh
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_orog/landcover30.fixed
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_orog/makefile
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_orog/run.lsf
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_orog/TOP8M_avg.20I4.asc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_orog/TOP8M_max.20I4.asc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_orog/TOP8M_slm.80I1.asc
-
-
-``fix_sfc_climo`` Files
--------------------------
-
-.. code-block:: console
-
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_sfc_climo/facsf.1.0.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_sfc_climo/vegetation_type.igbp.0.03.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_sfc_climo/leaf_area_index.0.05.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_sfc_climo/vegetation_type.igbp.0.05.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_sfc_climo/maximum_snow_albedo.0.05.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_sfc_climo/vegetation_type.igbp.conus.0.01.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_sfc_climo/slope_type.1.0.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_sfc_climo/vegetation_type.modis.igbp.0.03.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_sfc_climo/snowfree_albedo.4comp.0.05.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_sfc_climo/vegetation_type.modis.igbp.0.05.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_sfc_climo/soil_type.statsgo.0.03.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_sfc_climo/vegetation_type.modis.igbp.conus.0.01.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_sfc_climo/soil_type.statsgo.0.05.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_sfc_climo/vegetation_type.viirs.igbp.0.03.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_sfc_climo/soil_type.statsgo.conus.0.01.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_sfc_climo/vegetation_type.viirs.igbp.0.05.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_sfc_climo/substrate_temperature.1.0.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_sfc_climo/vegetation_type.viirs.igbp.0.1.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_sfc_climo/substrate_temperature.2.6x1.5.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_sfc_climo/vegetation_type.viirs.igbp.conus.0.01.nc
-   wget https://noaa-ufs-srw-pds.s3.amazonaws.com/fix/fix_sfc_climo/vegetation_greenness.0.144.nc
-
-