diff --git a/doc/ContribGuide/contributing.rst b/doc/ContribGuide/contributing.rst index eb995efb4..0f7231e26 100644 --- a/doc/ContribGuide/contributing.rst +++ b/doc/ContribGuide/contributing.rst @@ -11,7 +11,7 @@ Fork and PR Overview Contributions to the ``ufs-srweather-app`` project are made via a :github-docs:`Fork` and :github-docs:`Pull Request (PR)` model. GitHub provides a thorough description of this contribution model in their `Contributing to a project` :github-docs:`Quickstart`, but the steps, with respect to ``ufs-srweather-app`` contributions, can be summarized as: -#. :github-docs:`Create an issue ` to document proposed changes. +#. :github-docs:`Create an issue ` to document proposed changes. #. :github-docs:`Fork` the :srw-repo:`ufs-srweather-app repository<>` into your personal GitHub account. #. :github-docs:`Clone` your fork onto your development system. #. :github-docs:`Create a branch` in your clone for your changes. All development should take place on a branch, *not* on ``develop``. diff --git a/doc/UsersGuide/BackgroundInfo/Components.rst b/doc/UsersGuide/BackgroundInfo/Components.rst index 559576725..6df12dbd2 100644 --- a/doc/UsersGuide/BackgroundInfo/Components.rst +++ b/doc/UsersGuide/BackgroundInfo/Components.rst @@ -38,12 +38,12 @@ Supported model resolutions in this release include 3-, 13-, and 25-km predefine Model Physics --------------- -The Common Community Physics Package (CCPP), described `here `__, supports interoperable atmospheric physics and land surface model options. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. The most recent SRW App release (|latestr|) included five supported physics suites: FV3_RRFS_v1beta, FV3_GFS_v16, FV3_WoFS_v0, FV3_HRRR, and FV3_RAP. The FV3_RRFS_v1beta physics suite is being tested for use in the future operational implementation of the Rapid Refresh Forecast System (:term:`RRFS`) planned for 2023-2024, and the FV3_GFS_v16 is an updated version of the physics suite used in the operational Global Forecast System (GFS) v16. A detailed list of CCPP updates since the SRW App v2.1.0 release is available :ref:`here `. A full scientific description of CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `__, and CCPP technical aspects are described in the :doc:`CCPP Technical Documentation `. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each of the supported suites. Additional information on Stochastic Physics options is available :doc:`here `. +The Common Community Physics Package (CCPP), described `here `_, supports interoperable atmospheric physics and land surface model options. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. The most recent SRW App release (|latestr|) included five supported physics suites: FV3_RRFS_v1beta, FV3_GFS_v16, FV3_WoFS_v0, FV3_HRRR, and FV3_RAP. The FV3_RRFS_v1beta physics suite is being tested for use in the future operational implementation of the Rapid Refresh Forecast System (:term:`RRFS`) planned for 2023-2024, and the FV3_GFS_v16 is an updated version of the physics suite used in the operational Global Forecast System (GFS) v16. A detailed list of CCPP updates since the SRW App v2.1.0 release is available :ref:`here `. 
A full scientific description of CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `_, and CCPP technical aspects are described in the :doc:`CCPP Technical Documentation `. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each of the supported suites. Additional information on Stochastic Physics options is available :doc:`here `. .. note:: SPP is currently only available for specific physics schemes used in the RAP/HRRR physics suite. Users need to be aware of which physics suite definition file (:term:`SDF`) is chosen when turning this option on. Among the supported physics suites, the full set of parameterizations can only be used with the ``FV3_HRRR`` option for ``CCPP_PHYS_SUITE``. -Additionally, a CCPP single-column model (`CCPP-SCM `__) option has also been developed as a child repository. Users can refer to the `CCPP Single Column Model User and Technical Guide `__ for more details. This CCPP-SCM user guide contains a Quick Start Guide with instructions for obtaining the code, compiling, and running test cases, which include five standard test cases and two additional FV3 replay cases (refer to section 5.2 in the CCPP-SCM user guide for more details). Moreover, the CCPP-SCM supports a precompiled version in a docker container, allowing it to be easily executed on NOAA's cloud computing platforms without any issues (see section 2.5 in the CCPP-SCM user guide for more details). +Additionally, a CCPP single-column model (`CCPP-SCM `_) option has also been developed as a child repository. Users can refer to the `CCPP Single Column Model User and Technical Guide `_ for more details. This CCPP-SCM user guide contains a Quick Start Guide with instructions for obtaining the code, compiling, and running test cases, which include five standard test cases and two additional FV3 replay cases (refer to section 5.2 in the CCPP-SCM user guide for more details). Moreover, the CCPP-SCM supports a precompiled version in a docker container, allowing it to be easily executed on NOAA's cloud computing platforms without any issues (see section 2.5 in the CCPP-SCM user guide for more details). The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. The UFS Weather Model ingests initial and lateral boundary condition files produced by :term:`chgres_cube` and outputs files in netCDF format on a specific projection (e.g., Lambert Conformal) in the horizontal direction and model levels in the vertical direction. @@ -57,17 +57,17 @@ The Unified Post Processor (:term:`UPP`) processes raw output from a variety of METplus Verification Suite ============================= -The Model Evaluation Tools (MET) package is a set of statistical verification tools developed by the `Developmental Testbed Center `__ (DTC) for use by the :term:`NWP` community to help them assess and evaluate the performance of numerical weather predictions. MET is the core component of the enhanced `METplus `__ verification framework; the suite also includes the associated database and display systems called METviewer and METexpress. +The Model Evaluation Tools (MET) package is a set of statistical verification tools developed by the `Developmental Testbed Center `_ (DTC) for use by the :term:`NWP` community to help them assess and evaluate the performance of numerical weather predictions. 
MET is the core component of the enhanced `METplus `_ verification framework; the suite also includes the associated database and display systems called METviewer and METexpress. -The METplus verification framework has been integrated into the SRW App to facilitate forecast evaluation. METplus is a verification framework that spans a wide range of temporal scales (warn-on-forecast to climate) and spatial scales (storm to global). It is supported by the `Developmental Testbed Center (DTC) `__. +The METplus verification framework has been integrated into the SRW App to facilitate forecast evaluation. METplus is a verification framework that spans a wide range of temporal scales (warn-on-forecast to climate) and spatial scales (storm to global). It is supported by the `Developmental Testbed Center (DTC) `_. METplus comes preinstalled with :term:`spack-stack` but can also be installed on other systems individually or as part of :term:`HPC-Stack` installation. Users on systems without a previous installation of METplus can follow the :ref:`MET Installation Guide ` and :ref:`METplus Installation Guide ` for individual installation. Currently, METplus *installation* is only supported as part of spack-stack installation; users attempting to install METplus individually or as part of HPC-Stack will need to direct assistance requests to the METplus team. However, METplus *use* is supported on any system with a functioning METplus installation. -The core components of the METplus framework include the statistical driver (MET), the associated database and display systems known as METviewer and METexpress, and a suite of Python wrappers to provide low-level automation and examples, also called use cases. MET is a set of verification tools developed for use by the :term:`NWP` community. It matches up gridded forecast fields with either gridded analyses or point observations and applies configurable methods to compute statistics and diagnostics. Extensive documentation is available in the :doc:`METplus User's Guide ` and :doc:`MET User's Guide `. Documentation for all other components of the framework can be found at the *Documentation* link for each component on the METplus `downloads `__ page. +The core components of the METplus framework include the statistical driver (MET), the associated database and display systems known as METviewer and METexpress, and a suite of Python wrappers to provide low-level automation and examples, also called use cases. MET is a set of verification tools developed for use by the :term:`NWP` community. It matches up gridded forecast fields with either gridded analyses or point observations and applies configurable methods to compute statistics and diagnostics. Extensive documentation is available in the :doc:`METplus User's Guide ` and :doc:`MET User's Guide `. Documentation for all other components of the framework can be found at the *Documentation* link for each component on the METplus `downloads `_ page. Among other techniques, MET provides the capability to compute standard verification scores for comparing deterministic gridded model data to point-based and gridded observations. It also provides ensemble and probabilistic verification methods for comparing gridded model data to point-based or gridded observations. Verification tasks to accomplish these comparisons are defined in the SRW App in :numref:`Table %s `. 
Currently, the SRW App supports the use of :term:`NDAS` observation files (which include conventional point-based surface and upper-air data) `in prepBUFR format `__ for point-based verification. It also supports gridded Climatology-Calibrated Precipitation Analysis (:term:`CCPA`) data for accumulated precipitation evaluation and Multi-Radar/Multi-Sensor (:term:`MRMS`) gridded analysis data for composite reflectivity and :term:`echo top` verification. -METplus is being actively developed by :term:`NCAR`/Research Applications Laboratory (RAL), NOAA/Earth Systems Research Laboratories (`ESRL `__), and NOAA/Environmental Modeling Center (:term:`EMC`), and it is open to community contributions. More details about METplus can be found on the `METplus website `__. +METplus is being actively developed by :term:`NCAR`/Research Applications Laboratory (RAL), NOAA/Earth Systems Research Laboratories (`ESRL `__), and NOAA/Environmental Modeling Center (:term:`EMC`), and it is open to community contributions. More details about METplus can be found on the `METplus website `_. Air Quality Modeling (AQM) Utilities ======================================= diff --git a/doc/UsersGuide/BackgroundInfo/Introduction.rst b/doc/UsersGuide/BackgroundInfo/Introduction.rst index f1a384e02..4b0978ef2 100644 --- a/doc/UsersGuide/BackgroundInfo/Introduction.rst +++ b/doc/UsersGuide/BackgroundInfo/Introduction.rst @@ -4,9 +4,9 @@ Introduction ============== -The Unified Forecast System (:term:`UFS`) is a community-based, coupled, comprehensive Earth modeling system. NOAA's operational model suite for numerical weather prediction (:term:`NWP`) is quickly transitioning to the UFS from a number of different modeling systems. The UFS enables research, development, and contribution opportunities within the broader :term:`Weather Enterprise` (including government, industry, and academia). For more information about the UFS, visit the `UFS Portal `__. +The Unified Forecast System (:term:`UFS`) is a community-based, coupled, comprehensive Earth modeling system. NOAA's operational model suite for numerical weather prediction (:term:`NWP`) is quickly transitioning to the UFS from a number of different modeling systems. The UFS enables research, development, and contribution opportunities within the broader :term:`Weather Enterprise` (including government, industry, and academia). For more information about the UFS, visit the `UFS Portal `_. -The UFS includes `multiple applications `__ that support different forecast durations and spatial domains. This documentation describes the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The most recent SRW Application includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through the `GitHub Discussions `__ forum. The SRW App also includes support for a verification package (METplus) for both deterministic and ensemble simulations and support for four stochastically perturbed physics schemes. +The UFS includes `multiple applications `_ that support different forecast durations and spatial domains. This documentation describes the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. 
The most recent SRW Application includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through the `GitHub Discussions `_ forum. The SRW App also includes support for a verification package (METplus) for both deterministic and ensemble simulations and support for four stochastically perturbed physics schemes. Since the last release, developers have added a variety of features: diff --git a/doc/UsersGuide/BuildingRunningTesting/BuildSRW.rst b/doc/UsersGuide/BuildingRunningTesting/BuildSRW.rst index 3076a5f6e..73334828d 100644 --- a/doc/UsersGuide/BuildingRunningTesting/BuildSRW.rst +++ b/doc/UsersGuide/BuildingRunningTesting/BuildSRW.rst @@ -35,7 +35,7 @@ Install the Prerequisite Software Stack Users on any sufficiently up-to-date machine with a UNIX-based operating system should be able to install the prerequisite software stack and run the SRW Application. However, a list of prerequisites is available in :numref:`Section %s ` for reference. Users should install or update their system as required before attempting to install the software stack. -Currently, installation of the prerequisite software stack is supported via spack-stack on most systems. :term:`Spack-stack` is a :term:`repository` that provides a Spack-based system to build the software stack required for `UFS `__ applications such as the SRW App. Spack-stack is the software stack validated by the UFS Weather Model (:term:`WM`), and the SRW App has likewise shifted to spack-stack for most Level 1 systems. +Currently, installation of the prerequisite software stack is supported via spack-stack on most systems. :term:`Spack-stack` is a :term:`repository` that provides a Spack-based system to build the software stack required for `UFS `_ applications such as the SRW App. Spack-stack is the software stack validated by the UFS Weather Model (:term:`WM`), and the SRW App has likewise shifted to spack-stack for most Level 1 systems. .. hint:: Skip the spack-stack installation if working on a :srw-wiki:`Level 1 system ` (e.g., Hera, Jet, Derecho, NOAA Cloud), and :ref:`continue to the next section `. diff --git a/doc/UsersGuide/BuildingRunningTesting/Quickstart.rst b/doc/UsersGuide/BuildingRunningTesting/Quickstart.rst index 3e58d6117..8f31dd9b1 100644 --- a/doc/UsersGuide/BuildingRunningTesting/Quickstart.rst +++ b/doc/UsersGuide/BuildingRunningTesting/Quickstart.rst @@ -50,13 +50,9 @@ For a detailed explanation of how to build and run the SRW App on any supported #. Load the python environment for the workflow. Users on Level 2-4 systems will need to use one of the existing ``wflow_`` modulefiles (e.g., ``wflow_macos``) and adapt it to their system. Then, run: - .. code-block:: console - - source /path/to/ufs-srweather-app/etc/lmod-setup.sh - module use /path/to/ufs-srweather-app/modulefiles - module load wflow_ - - where ```` refers to a valid machine name (see :numref:`Section %s `). After loading the workflow, users should follow the instructions printed to the console. For example, if the output says: + .. include:: ../../doc-snippets/load-env.rst + + After loading the workflow, users should follow the instructions printed to the console. For example, if the output says: .. 
code-block:: console diff --git a/doc/UsersGuide/BuildingRunningTesting/RunSRW.rst b/doc/UsersGuide/BuildingRunningTesting/RunSRW.rst index 0eb10e151..bea4ab59a 100644 --- a/doc/UsersGuide/BuildingRunningTesting/RunSRW.rst +++ b/doc/UsersGuide/BuildingRunningTesting/RunSRW.rst @@ -134,13 +134,9 @@ Loading the Workflow Environment The |wflow_env| conda/Python environment can be activated in the following way: -.. code-block:: console - - source /path/to/ufs-srweather-app/etc/lmod-setup.sh - module use /path/to/ufs-srweather-app/modulefiles - module load wflow_ - -where ```` refers to a valid machine name (see :numref:`Section %s ` for ``MACHINE`` options). In a csh shell environment, users should replace ``lmod-setup.sh`` with ``lmod-setup.csh``. +.. include:: ../../doc-snippets/load-env.rst + +In a csh shell environment, users should replace ``lmod-setup.sh`` with ``lmod-setup.csh``. .. note:: If users source the lmod-setup file on a system that doesn't need it, it will not cause any problems (it will simply do a ``module purge``). @@ -155,7 +150,7 @@ The ``wflow_`` modulefile will then output instructions to activate th then the user should run |activate|. This activates the |wflow_env| conda environment, and the user typically sees |prompt| in front of the Terminal prompt at this point. .. note:: - If users do not use the wflow module to load conda, ``conda`` will need to be initialized before running ``conda activate srw_app`` command. Depending on the user's system and login setup, this may be accomplished in a variety of ways. Conda initialization usually involves the following command: ``source /etc/profile.d/conda.sh``, where ```` is the base conda installation directory and by default will be the full path to ``ufs-srweather-app/conda``. + If users do not use the ``wflow_`` module to load conda, ``conda`` will need to be initialized before running the ``conda activate srw_app`` command. Depending on the user's system and login setup, this may be accomplished in a variety of ways. Conda initialization usually involves the following command: ``source /etc/profile.d/conda.sh``, where ```` is the base conda installation directory and by default will be the full path to ``ufs-srweather-app/conda``. After loading the workflow environment, users may continue to :numref:`Section %s ` for instructions on setting the experiment configuration parameters. @@ -690,7 +685,7 @@ More information about configuring the ``rocoto:`` section can be found in :numr If users have access to NOAA :term:`HPSS` but have not pre-staged the data, the default ``verify_pre.yaml`` taskgroup will activate the tasks, and the workflow will attempt to download the appropriate data from NOAA HPSS. In this case, the ``*_OBS_DIR`` paths must be set to the location where users want the downloaded data to reside. -Users who do not have access to NOAA HPSS and do not have the data on their system will need to download :term:`CCPA`, :term:`MRMS`, and :term:`NDAS` data manually from collections of publicly available data, such as the ones listed `here `__. +Users who do not have access to NOAA HPSS and do not have the data on their system will need to download :term:`CCPA`, :term:`MRMS`, and :term:`NDAS` data manually from collections of publicly available data. Users who have already staged the observation data needed for METplus (i.e., the :term:`CCPA`, :term:`MRMS`, and :term:`NDAS` data) on their system should set the path to this data in ``config.yaml``.
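+For users with pre-staged data, a minimal sketch of the relevant ``config.yaml`` entries might look like the excerpt below (the specific ``*_OBS_DIR`` variable names and their placement under ``platform:`` are illustrative assumptions; check the configuration parameters described in this chapter and adjust the paths for your system):
+
+.. code-block:: console
+
+   platform:
+     CCPA_OBS_DIR: /path/to/staged/obs/ccpa/proc
+     MRMS_OBS_DIR: /path/to/staged/obs/mrms/proc
+     NDAS_OBS_DIR: /path/to/staged/obs/ndas/proc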
diff --git a/doc/UsersGuide/BuildingRunningTesting/Tutorial.rst b/doc/UsersGuide/BuildingRunningTesting/Tutorial.rst index a21b7aa9b..2b7f16971 100644 --- a/doc/UsersGuide/BuildingRunningTesting/Tutorial.rst +++ b/doc/UsersGuide/BuildingRunningTesting/Tutorial.rst @@ -11,7 +11,7 @@ Users can run through the entire set of tutorials or jump to the one that intere #. :ref:`Severe Weather Over Indianapolis `: Change physics suites and compare graphics plots. #. :ref:`Cold Air Damming `: Coming soon! #. :ref:`Southern Plains Winter Weather Event `: Coming soon! - #. :ref:`Halloween Storm `: Coming soon! + #. :ref:`Halloween Storm `: Change :term:`IC/LBC ` sources and compare results. #. :ref:`Hurricane Barry `: Coming soon! Each section provides a summary of the weather event and instructions for configuring an experiment. @@ -41,7 +41,7 @@ A surface boundary associated with a vorticity maximum over the northern Great P Data ------- -On :srw-wiki:`Level 1 ` systems, users can find data for the Indianapolis Severe Weather Forecast in the usual input model data locations (see :numref:`Section %s ` for a list). The data can also be downloaded from the `UFS SRW Application Data Bucket `__. +On :srw-wiki:`Level 1 ` systems, users can find data for the Indianapolis Severe Weather Forecast in the usual input model data locations (see :numref:`Section %s ` for a list). The data can also be downloaded from the `UFS SRW Application Data Bucket `_. * FV3GFS data for the first forecast (``control``) is located at: @@ -55,15 +55,9 @@ On :srw-wiki:`Level 1 ` systems, users can fi Load the Workflow -------------------- -To load the workflow environment, source the lmod-setup file. Then load the workflow conda environment. From the ``ufs-srweather-app`` directory, run: +To load the workflow environment, source the lmod-setup file and load the workflow conda environment by running: -.. code-block:: console - - source etc/lmod-setup.sh # OR: source etc/lmod-setup.csh when running in a csh/tcsh shell - module use modulefiles - module load wflow_ - -where ```` is a valid, lowercased machine name (see ``MACHINE`` in :numref:`Section %s ` for valid values). +.. include:: ../../doc-snippets/load-env.rst After loading the workflow, users should follow the instructions printed to the console. Usually, the instructions will tell the user to run |activate|. For example, a user on Hera with permissions on the ``nems`` project may issue the following commands to load the workflow (replacing ``User.Name`` with their actual username): @@ -77,35 +71,14 @@ After loading the workflow, users should follow the instructions printed to the Configuration ------------------------- -Navigate to the ``ufs-srweather-app/ush`` directory. The default (or "control") configuration for this experiment is based on the ``config.community.yaml`` file in that directory. Users can copy this file into ``config.yaml`` if they have not already done so: - -.. code-block:: console - - cd /path/to/ufs-srweather-app/ush - cp config.community.yaml config.yaml - -Users can save the location of the ``ush`` directory in an environment variable (``$USH``). This makes it easier to navigate between directories later. For example: - -.. code-block:: console - - export USH=/path/to/ufs-srweather-app/ush - -Users should substitute ``/path/to/ufs-srweather-app/ush`` with the actual path on their system. As long as a user remains logged into their system, they can run ``cd $USH``, and it will take them to the ``ush`` directory. 
The variable will need to be reset for each login session. +.. include:: ../../doc-snippets/expt-conf-intro.rst Experiment 1: Control ^^^^^^^^^^^^^^^^^^^^^^^^ Edit the configuration file (``config.yaml``) to include the variables and values in the sample configuration excerpts below. -.. Hint:: - - To open the configuration file in the command line, users may run the command: - - .. code-block:: console - - vi config.yaml - - To modify the file, hit the ``i`` key and then make any changes required. To close and save, hit the ``esc`` key and type ``:wq`` to write the changes to the file and exit/quit the file. Users may opt to use their preferred code editor instead. +.. include:: ../../doc-snippets/file-edit-hint.rst Start in the ``user:`` section and change the ``MACHINE`` and ``ACCOUNT`` variables. For example, when running on a personal MacOS device, users might set: @@ -138,11 +111,9 @@ In the ``workflow:`` section of ``config.yaml``, update ``EXPT_SUBDIR`` and ``PR .. _CronNote: -.. note:: - - Users may also want to set ``USE_CRON_TO_RELAUNCH: true`` and add ``CRON_RELAUNCH_INTVL_MNTS: 3``. This will automate submission of workflow tasks when running the experiment. However, not all systems have :term:`cron`. +.. include:: ../../doc-snippets/cron-note.rst -``EXPT_SUBDIR:`` This variable can be changed to any name the user wants from "gfsv16_physics_fcst" to "forecast1" to "a;skdfj". However, the best names will indicate useful information about the experiment. This tutorial uses ``control`` to establish a baseline, or "control", forecast. Since this tutorial helps users to compare the output from two different forecasts --- one that uses the FV3_GFS_v16 physics suite and one that uses the FV3_RRFS_v1beta physics suite --- "gfsv16_physics_fcst" could be a good alternative directory name. +``EXPT_SUBDIR:`` This variable can be changed to any name the user wants from "gfsv16_physics_fcst" to "forecast1" to "askdfj" (but note that whitespace and some punctuation characters are not allowed). However, the best names will indicate useful information about the experiment. This tutorial uses ``control`` to establish a baseline, or "control", forecast. Since this tutorial helps users to compare the output from two different forecasts --- one that uses the FV3_GFS_v16 physics suite and one that uses the FV3_RRFS_v1beta physics suite --- "gfsv16_physics_fcst" could be a good alternative directory name. ``PREDEF_GRID_NAME:`` This experiment uses the SUBCONUS_Ind_3km grid, rather than the default RRFS_CONUS_25km grid. The SUBCONUS_Ind_3km grid is a high-resolution grid (with grid cell size of approximately 3km) that covers a small area of the U.S. centered over Indianapolis, IN. For more information on this grid, see :numref:`Section %s `. @@ -251,8 +222,6 @@ Once the control case is running, users can return to the ``config.yaml`` file ( Later, users may want to conduct additional experiments using the FV3_HRRR and FV3_WoFS_v0 physics suites. Like FV3_RRFS_v1beta, these physics suites were designed for use with high-resolution grids for storm-scale predictions. -.. COMMENT: Maybe also FV3_RAP? - Next, users will need to modify the data parameters in ``task_get_extrn_ics:`` and ``task_get_extrn_lbcs:`` to use HRRR and RAP data rather than FV3GFS data. Users will need to change the following lines in each section: .. code-block:: console @@ -331,17 +300,7 @@ Navigate to ``test_expt/2019061518/postprd``. 
This directory contains the post-p Copy ``.png`` Files onto Local System ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -Users who are working on the cloud or on an HPC cluster may want to copy the ``.png`` files onto their local system to view in their preferred image viewer. Detailed instructions are available in the :ref:`Introduction to SSH & Data Transfer `. - -In summary, users can run the ``scp`` command in a new terminal/command prompt window to securely copy files from a remote system to their local system if an SSH tunnel is already established between the local system and the remote system. Users can adjust one of the following commands for their system: - -.. code-block:: console - - scp username@your-IP-address:/path/to/source_file_or_directory /path/to/destination_file_or_directory - # OR - scp -P 12345 username@localhost:/path/to/source_file_or_directory /path/to/destination_file_or_directory - -Users would need to modify ``username``, ``your-IP-address``, ``-P 12345``, and the file paths to reflect their systems' information. See the :ref:`Introduction to SSH & Data Transfer ` for example commands. +.. include:: ../../doc-snippets/scp-files.rst .. _ComparePlots: @@ -555,18 +514,20 @@ A polar vortex brought arctic air to much of the U.S. and Mexico. A series of co *Southern Plains Winter Weather Event Over Oklahoma City* -.. COMMENT: Upload a png to the SRW wiki and change the hyperlink to point to that. - Tutorial Content ------------------- -Coming Soon! +Coming Soon! .. _fcst4: Sample Forecast #4: Halloween Storm ======================================= +**Objective:** + * Compare forecast outputs for similar experiments that use different :term:`IC/LBC ` sources. + * Coming soon: Option to use verification tools to assess forecast quality. + Weather Summary -------------------- @@ -574,17 +535,329 @@ A line of severe storms brought strong winds, flash flooding, and tornadoes to t **Weather Phenomena:** Flooding and high winds - * `Storm Prediction Center (SPC) Storm Report for 20191031 `__ + * `Storm Prediction Center (SPC) Storm Report for 20191031 `_ -.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/Tutorial/HalloweenStorm.jpg +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/Tutorial/HalloweenStorm.gif :alt: Radar animation of the Halloween Storm that swept across the Eastern United States in 2019. *Halloween Storm 2019* -Tutorial Content -------------------- +Data +------- -Coming Soon! +Data for the Halloween Storm is publicly available in S3 data buckets. The Rapid Refresh (`RAP `_) data can be downloaded from the `SRW App data bucket `_ using ``wget``. Make sure to issue the command from the folder where you want to place the data. + +.. code-block:: console + + wget https://noaa-ufs-srw-pds.s3.amazonaws.com/develop-20240618/halloween_rap.tgz + tar -xzf halloween_rap.tgz + +This will untar the ``halloween_rap.tgz`` data into a directory named ``RAP``. + +The SRW App can pull HRRR data directly from the `HRRR data bucket `_. Users do not need to download the data separately. + +Load the workflow +--------------------- + +To load the workflow environment, source the lmod-setup file and load the workflow conda environment by running: + +.. include:: ../../doc-snippets/load-env.rst + +After loading the workflow, users should follow the instructions printed to the console. Usually, the instructions will tell the user to run |activate|. 
For example, a user on Hera with permissions on the ``nems`` project may issue the following commands to load the workflow (replacing ``User.Name`` with their actual username): + +.. code-block:: console + + source /scratch1/NCEPDEV/nems/User.Name/ufs-srweather-app/etc/lmod-setup.sh hera + module use /scratch1/NCEPDEV/nems/User.Name/ufs-srweather-app/modulefiles + module load wflow_hera + conda activate srw_app + +Configuration +------------------------- + +.. include:: ../../doc-snippets/expt-conf-intro.rst + +Experiment 1: RAP Data +^^^^^^^^^^^^^^^^^^^^^^^^ + +Edit the configuration file (``config.yaml``) to include the variables and values in the sample configuration excerpts below. + +.. include:: ../../doc-snippets/file-edit-hint.rst + +Start in the ``user:`` section and change the ``MACHINE`` and ``ACCOUNT`` variables. For example, when running on a personal MacOS device, users might set: + +.. code-block:: console + + user: + RUN_ENVIR: community + MACHINE: macos + ACCOUNT: none + +For a detailed description of these variables, see :numref:`Section %s `. + +Users do not need to change the ``platform:`` section of the configuration file for this tutorial. + +In the ``workflow:`` section of ``config.yaml``, update ``EXPT_SUBDIR``, ``CCPP_PHYS_SUITE``, ``PREDEF_GRID_NAME``, ``DATE_FIRST_CYCL``, ``DATE_LAST_CYCL``, and ``FCST_LEN_HRS``. + +.. code-block:: console + + workflow: + USE_CRON_TO_RELAUNCH: false + EXPT_SUBDIR: halloweenRAP + CCPP_PHYS_SUITE: FV3_RAP + PREDEF_GRID_NAME: RRFS_CONUS_13km + DATE_FIRST_CYCL: '2019103012' + DATE_LAST_CYCL: '2019103012' + FCST_LEN_HRS: 36 + PREEXISTING_DIR_METHOD: rename + VERBOSE: true + COMPILER: intel + +.. include:: ../../doc-snippets/cron-note.rst + +``EXPT_SUBDIR:`` This variable can be changed to any name the user wants from "halloweenRAP" to "HalloweenStorm1" to "askdfj" (but note that whitespace and some punctuation characters are not allowed). However, the best names will indicate useful information about the experiment. Since this tutorial helps users to compare the output from RAP and HRRR forecast input data, this tutorial will use ``halloweenRAP`` for the Halloween Storm experiment that uses RAP forecast data. + +``PREDEF_GRID_NAME:`` This experiment uses the RRFS_CONUS_13km, rather than the default RRFS_CONUS_25km grid. This 13-km resolution is used in the NOAA operational Rapid Refresh (`RAP `_) model and is the resolution envisioned for the initial operational implementation of the Rapid Refresh Forecast System (:term:`RRFS`). For more information on this grid, see :numref:`Section %s `. + +``CCPP_PHYS_SUITE:`` The FV3_RAP physics suite contains the evolving :term:`parameterizations` used operationally in the NOAA Rapid Refresh (`RAP `_) model; the suite is also a prime candidate under consideration for initial RRFS implementation and has been well-tested at the 13-km resolution. It is therefore an appropriate physics choice when using the RRFS_CONUS_13km grid. + +``DATE_FIRST_CYCL``, ``DATE_LAST_CYCL``, and ``FCST_LEN_HRS`` set parameters related to the date and duration of the forecast. Because this is a one-cycle experiment that does not use cycling or :term:`data assimilation`, the date of the first :term:`cycle` and last cycle are the same. + +For a detailed description of other ``workflow:`` variables, see :numref:`Section %s `. + +In the ``task_get_extrn_ics:`` section, add ``USE_USER_STAGED_EXTRN_FILES`` and ``EXTRN_MDL_SOURCE_BASEDIR_ICS``. 
Users will need to adjust the file path to point to the location of the data on their system. + +.. code-block:: console + + task_get_extrn_ics: + EXTRN_MDL_NAME_ICS: RAP + USE_USER_STAGED_EXTRN_FILES: true + EXTRN_MDL_SOURCE_BASEDIR_ICS: /path/to/RAP/for_ICS + +For a detailed description of the ``task_get_extrn_ics:`` variables, see :numref:`Section %s `. + +Similarly, in the ``task_get_extrn_lbcs:`` section, add ``USE_USER_STAGED_EXTRN_FILES`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS``. Users will need to adjust the file path to point to the location of the data on their system. + +.. code-block:: console + + task_get_extrn_lbcs: + EXTRN_MDL_NAME_LBCS: RAP + LBC_SPEC_INTVL_HRS: 3 + USE_USER_STAGED_EXTRN_FILES: true + EXTRN_MDL_SOURCE_BASEDIR_LBCS: /path/to/RAP/for_LBCS + +For a detailed description of the ``task_get_extrn_lbcs:`` variables, see :numref:`Section %s `. + +Users do not need to modify the ``task_run_fcst:`` section for this tutorial. + +.. COMMENT: Do we need to set QUILTING to true? + +In the ``rocoto:tasks:`` section, increase the walltime for the data-related tasks and metatasks. Then include the YAML configuration file containing the plotting task in the ``rocoto:tasks:taskgroups:`` section, like this: + +.. code-block:: console + + rocoto: + tasks: + task_get_extrn_ics: + walltime: 06:00:00 + task_get_extrn_lbcs: + walltime: 06:00:00 + metatask_run_ensemble: + task_make_lbcs_mem#mem#: + walltime: 06:00:00 + task_run_fcst_mem#mem#: + walltime: 06:00:00 + taskgroups: '{{ ["parm/wflow/prep.yaml", "parm/wflow/coldstart.yaml", "parm/wflow/post.yaml", "parm/wflow/plot.yaml"]|include }}' + +.. note:: + + Rocoto tasks are run once each. A :ref:`Rocoto ` metatask expands into one or more similar tasks by replacing the values between ``#`` symbols with the values under the ``var:`` key. See the `Rocoto documentation `_ for more information. + +For more information on how to turn on/off tasks in the workflow, please see :numref:`Section %s `. + +In the ``task_plot_allvars:`` section, add ``PLOT_FCST_INC: 6``. Users may also want to add ``PLOT_FCST_START: 0`` and ``PLOT_FCST_END: 36`` explicitly, but these can be omitted since the default values are the same as the forecast start and end time respectively. + +.. code-block:: console + + task_plot_allvars: + COMOUT_REF: "" + PLOT_FCST_INC: 6 + +``PLOT_FCST_INC:`` This variable indicates the forecast hour increment for the plotting task. By setting the value to ``6``, the task will generate a ``.png`` file for every 6th forecast hour starting from 12z on October 30, 2019 (the 0th forecast hour) through the 36th forecast hour (November 1, 2019 at 0z). + +After configuring the forecast, users can generate the forecast by running: + +.. code-block:: console + + ./generate_FV3LAM_wflow.py + +To see experiment progress, users should navigate to their experiment directory. Then, use the ``rocotorun`` command to launch new workflow tasks and ``rocotostat`` to check on experiment progress. + +.. code-block:: console + + cd /path/to/expt_dirs/halloweenRAP + rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 + rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 + +Users will need to rerun the ``rocotorun`` and ``rocotostat`` commands above regularly and repeatedly to continue submitting workflow tasks and receiving progress updates. + +.. note:: + + When using cron to automate the workflow submission (as described :ref:`above `), users can omit the ``rocotorun`` command and simply use ``rocotostat`` to check on progress periodically. 
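+For reference, automating the workflow with cron simply runs the same relaunch command on a fixed schedule. An illustrative crontab entry (a sketch only; the entry that ``USE_CRON_TO_RELAUNCH`` generates on a given system may differ) would be:
+
+.. code-block:: console
+
+   */3 * * * * cd /path/to/expt_dirs/halloweenRAP && rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
+
+Here ``*/3`` corresponds to a three-minute relaunch interval, and the cron environment must be able to find the ``rocotorun`` executable.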
+ +Users can save the location of the ``halloweenRAP`` directory in an environment variable (e.g., ``$HRAP``). This makes it easier to navigate between directories later. For example: + +.. code-block:: console + + export HRAP=/path/to/expt_dirs/halloweenRAP + +Users should substitute ``/path/to/expt_dirs/halloweenRAP`` with the actual path to the experiment directory on their system. As long as a user remains logged into their system, they can run ``cd $HRAP``, and it will take them to the ``halloweenRAP`` experiment directory. The variable will need to be reset for each login session. + +Experiment 2: Changing the Forecast Input +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +Once the ``halloweenRAP`` case is running, users can return to the ``config.yaml`` file (in ``$USH``) and adjust the parameters for a new forecast. In this forecast, users will change the forecast input to use ``HRRR`` data and alter a few associated parameters. + +In the ``workflow:`` section of ``config.yaml``, update ``EXPT_SUBDIR`` and ``PREDEF_GRID_NAME``. Other parameters should remain the same. + +.. code-block:: console + + workflow: + EXPT_SUBDIR: halloweenHRRR + PREDEF_GRID_NAME: RRFS_CONUScompact_13km + +.. note:: + + Relative to the original CONUS domain, the "compact" CONUS domains are slightly smaller. The original CONUS domains were a bit too large to run with :term:`LBCs` from HRRR, so the "compact" domains were created to be just small enough to work with HRRR data. + +In the ``task_get_extrn_ics:`` section, update the values for ``EXTRN_MDL_NAME_ICS`` and ``USE_USER_STAGED_EXTRN_FILES`` and add ``EXTRN_MDL_FILES_ICS``. Users may choose to comment out or remove ``EXTRN_MDL_SOURCE_BASEDIR_ICS``, but this is not necessary. + +.. code-block:: console + + task_get_extrn_ics: + EXTRN_MDL_NAME_ICS: HRRR + USE_USER_STAGED_EXTRN_FILES: false + EXTRN_MDL_FILES_ICS: + - '{yy}{jjj}{hh}00{fcst_hr:02d}00' + +For a detailed description of the ``task_get_extrn_ics:`` variables, see :numref:`Section %s `. + +Update the same values in the ``task_get_extrn_lbcs:`` section: + +.. code-block:: console + + task_get_extrn_lbcs: + EXTRN_MDL_NAME_LBCS: HRRR + LBC_SPEC_INTVL_HRS: 3 + USE_USER_STAGED_EXTRN_FILES: false + EXTRN_MDL_FILES_LBCS: + - '{yy}{jjj}{hh}00{fcst_hr:02d}00' + + +For a detailed description of the ``task_get_extrn_lbcs:`` variables, see :numref:`Section %s `. + +After configuring the forecast, users can generate the second forecast by running: + +.. code-block:: console + + ./generate_FV3LAM_wflow.py + +To see experiment progress, users should navigate to their experiment directory. As in the first forecast, the following commands allow users to launch new workflow tasks and check on experiment progress. + +.. code-block:: console + + cd /path/to/expt_dirs/halloweenHRRR + rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 + rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 + +.. note:: + + When using cron to automate the workflow submission (as described :ref:`above `), users can omit the ``rocotorun`` command and simply use ``rocotostat`` to check on progress periodically. + +.. note:: + + If users have not automated their workflow using cron, they will need to ensure that they continue issuing ``rocotorun`` commands to launch all of the tasks in each experiment. While switching between experiment directories to run ``rocotorun`` and ``rocotostat`` commands in both directories is possible, it may be easier to finish the ``halloweenRAP`` experiment's tasks before starting on ``halloweenHRRR``. 
+ +As with the ``halloweenRAP`` experiment, users can save the location of the ``halloweenHRRR`` directory in an environment variable (e.g., ``$HHRRR``). This makes it easier to navigate between directories later. For example: + +.. code-block:: console + + export HHRRR=/path/to/expt_dirs/halloweenHRRR + +Users should substitute ``/path/to/expt_dirs/halloweenHRRR`` with the actual path on their system. + + +How to Analyze Results +----------------------- +Navigate to ``halloweenHRRR/2019103012/postprd`` and/or ``halloweenRAP/2019103012/postprd``. These directories contain the post-processed data generated by the :term:`UPP` from the Halloween Storm forecasts. After the ``plot_allvars`` task completes, this directory will contain ``.png`` images for several forecast variables. + +Copy ``.png`` Files onto Local System +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +.. include:: ../../doc-snippets/scp-files.rst + +Examining Forecast Plots at Peak Intensity +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +This section examines plots from the RAP- and HRRR-driven experiments while the Halloween Storm is at or approaching peak intensity. + +.. _fcst4_250wind: + +250mb Wind +`````````` +An effective weather forecast begins with analyzing a 250mb wind chart. By using this wind plot, forecasters can identify key features such as jet stream placement, jet maxima, troughs, ridges, and more. This analysis also helps pinpoint areas with the potential for the strongest severe weather. + +In the 250mb wind plots below, the ``halloweenHRRR`` and ``halloweenRAP`` plots are nearly identical at forecast hour f036. This shows strong model agreement. Analyzing this chart, we can see multiple ingredients signaling a significant severe weather event over the eastern CONUS. The first thing to notice is the placement of the jet streak along with troughing approaching the eastern US. Also notice an extreme 150KT jet max over southern Ohio, further fueling severe weather. The last thing to notice is the divergence aloft over the eastern CONUS; divergence present all the way up to 250mb indicates a strong system. + +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/fcst4_plots/250wind_rap_conus_f036.png + :align: center + :width: 75% + + *RAP Plot for 250mb Wind* + +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/fcst4_plots/250wind_hrrr_conus_f036.png + :align: center + :width: 75% + + *HRRR Plot for 250mb Wind* + +.. _fcst4_10mwind: + +10m Wind +`````````` +The 10m wind plots allow forecasters to pick up on patterns closer to the surface. They show features such as convergence and pressure areas. + +In the 10m wind plots below, the ``halloweenHRRR`` and ``halloweenRAP`` plots are once again very similar, which makes sense given that the 250mb wind plots are also so similar. We can see a few key features on this chart. The most important is the area of convergence taking place over the east coast, which is driving the line of severe storms. + +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/fcst4_plots/10mwind_rap_conus_f036.png + :align: center + :width: 75% + + *RAP Plot for 10m Winds* + +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/fcst4_plots/10mwind_hrrr_conus_f036.png + :align: center + :width: 75% + + *HRRR Plot for 10m Winds* + +.. 
_fcst4_refc: + +Composite Reflectivity +```````````````````````` +Reflectivity images visually represent the weather based on the energy (measured in decibels [dBZ]) reflected back from radar. Composite reflectivity generates an image based on reflectivity scans at multiple elevation angles, or "tilts", of the antenna. See https://www.noaa.gov/jetstream/reflectivity for a more detailed explanation of composite reflectivity. + +In the composite reflectivity plots below, the ``halloweenHRRR`` and ``halloweenRAP`` forecasts remain quite similar, as expected. Utilizing the reflectivity plots provides the final piece of the puzzle. From the previous analyses, we already had a good understanding of where the storms were likely to occur. Composite reflectivity serves as an additional tool, allowing us to visualize where the models predict storm placement. In this case, the strongest storms are indicated by higher dBZ values and appear to be concentrated in the NC/VA region. + +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/fcst4_plots/refc_rap_conus_f036.png + :align: center + :width: 75% + + *RAP Plot for Composite Reflectivity* + +.. figure:: https://github.com/ufs-community/ufs-srweather-app/wiki/fcst4_plots/refc_hrrr_conus_f036.png + :align: center + :width: 75% + + *HRRR Plot for Composite Reflectivity* .. _fcst5: diff --git a/doc/UsersGuide/BuildingRunningTesting/VXCases.rst b/doc/UsersGuide/BuildingRunningTesting/VXCases.rst index 2bf6f775d..b36afcefd 100644 --- a/doc/UsersGuide/BuildingRunningTesting/VXCases.rst +++ b/doc/UsersGuide/BuildingRunningTesting/VXCases.rst @@ -81,14 +81,10 @@ Save the path to this file in and ``INDYDATA`` environment variable: Load the Workflow ^^^^^^^^^^^^^^^^^^^^ -First, navigate to the ``ufs-srweather-app/ush`` directory. Then, load the workflow environment: - -.. code-block:: console - - source /path/to/etc/lmod-setup.sh - module use /path/to/ufs-srweather-app/modulefiles - module load wflow_ +To load the workflow environment, run: + +.. include:: ../../doc-snippets/load-env.rst Users running a csh/tcsh shell would run ``source /path/to/etc/lmod-setup.csh `` in place of the first command above. After loading the workflow, users should follow the instructions printed to the console. Usually, the instructions will tell the user to run |activate|. @@ -267,7 +262,7 @@ Point STAT Files The Point-Stat files contain continuous variables like temperature, pressure, and wind speed. A description of the Point-Stat file can be found :ref:`here ` in the MET documentation. -The Point-Stat files contain a potentially overwhelming amount of information. Therefore, it is recommended that users focus on the CNT MET test, which contains the `RMSE `__ and `MBIAS `__ statistics. The MET tests are defined in column 24 'LINE_TYPE' of the ``.stat`` file. Look for 'CNT' in this column. Then find column 66-68 for MBIAS and 78-80 for RMSE statistics. A full description of this file can be found :ref:`here `. +The Point-Stat files contain a potentially overwhelming amount of information. Therefore, it is recommended that users focus on the CNT MET test, which contains the `RMSE `_ and `MBIAS `_ statistics. The MET tests are defined in column 24 'LINE_TYPE' of the ``.stat`` file. Look for 'CNT' in this column. Then find column 66-68 for MBIAS and 78-80 for RMSE statistics. A full description of this file can be found :ref:`here `. 
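+Since ``.stat`` files are plain whitespace-delimited text, one quick way to isolate the CNT lines is an ``awk`` filter such as the sketch below (this assumes the LINE_TYPE column position described above; the file name pattern is only an example):
+
+.. code-block:: console
+
+   awk 'NR==1 || $24 == "CNT"' point_stat_*.stat
+
+This prints the header row plus every line whose 24th field is ``CNT``, which can then be scanned for the MBIAS and RMSE columns.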
To narrow down the variable field even further, users can focus on these weather variables: diff --git a/doc/UsersGuide/Reference/FAQ.rst b/doc/UsersGuide/Reference/FAQ.rst index e8c3df0de..b1c7bcce1 100644 --- a/doc/UsersGuide/Reference/FAQ.rst +++ b/doc/UsersGuide/Reference/FAQ.rst @@ -281,12 +281,8 @@ How can I run a new experiment? To run a new experiment at a later time, users need to rerun the commands in :numref:`Section %s ` that reactivate the |wflow_env| environment: -.. code-block:: console - - source /path/to/etc/lmod-setup.sh/or/lmod-setup.csh - module use /path/to/modulefiles - module load wflow_ - +.. include:: ../../doc-snippets/load-env.rst + Follow any instructions output by the console (e.g., |activate|). Then, users can configure a new experiment by updating the experiment parameters in ``config.yaml`` to reflect the desired experiment configuration. Detailed instructions can be viewed in :numref:`Section %s `. Parameters and valid values are listed in :numref:`Section %s `. After adjusting the configuration file, generate the new experiment by running ``./generate_FV3LAM_wflow.py``. Check progress by navigating to the ``$EXPTDIR`` and running ``rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10``. diff --git a/doc/UsersGuide/Reference/Glossary.rst b/doc/UsersGuide/Reference/Glossary.rst index 748d0f33b..7ffc569b2 100644 --- a/doc/UsersGuide/Reference/Glossary.rst +++ b/doc/UsersGuide/Reference/Glossary.rst @@ -10,7 +10,7 @@ Glossary To transport substances in the atmostphere by :term:`advection`. advection - According to the American Meteorological Society (AMS) `definition `__, advection is "The process of transport of an atmospheric property solely by the mass motion (velocity field) of the atmosphere." In common parlance, advection is movement of atmospheric substances that are carried around by the wind. + According to the American Meteorological Society (AMS) definition, `advection `_ is "The process of transport of an atmospheric property solely by the mass motion (velocity field) of the atmosphere." In common parlance, advection is movement of atmospheric substances that are carried around by the wind. AQM The `Air Quality Model `__ (AQM) is a UFS Application that dynamically couples the Community Multiscale Air Quality (:term:`CMAQ`) model with the UFS Weather Model through the :term:`NUOPC` Layer to simulate temporal and spatial variations of atmospheric compositions (e.g., ozone and aerosol compositions). The CMAQ, treated as a column chemistry model, updates concentrations of chemical species (e.g., ozone and aerosol compositions) at each integration time step. The transport terms (e.g., :term:`advection` and diffusion) of all chemical species are handled by the UFS Weather Model as :term:`tracers`. @@ -22,7 +22,7 @@ Glossary Climatology-Calibrated Precipitation Analysis (CCPA) data. This data is required for METplus precipitation verification tasks within the SRW App. The most recent 8 days worth of data are publicly available and can be accessed `here `__. CCPP - The `Common Community Physics Package `__ is a forecast-model agnostic, vetted collection of code containing atmospheric physical parameterizations and suites of parameterizations for use in Numerical Weather Prediction (NWP) along with a framework that connects the physics to the host forecast model. 
+ The `Common Community Physics Package `_ is a forecast-model agnostic, vetted collection of code containing atmospheric physical parameterizations and suites of parameterizations for use in Numerical Weather Prediction (NWP) along with a framework that connects the physics to the host forecast model. chgres_cube The preprocessing software used to create initial and boundary condition files to @@ -78,10 +78,10 @@ Glossary The radar-indicated top of an area of precipitation. Specifically, it contains the height of the 18 dBZ reflectivity value. EMC - The `Environmental Modeling Center `__. - + The `Environmental Modeling Center `_. + EPIC - The `Earth Prediction Innovation Center `__ seeks to accelerate scientific research and modeling contributions through continuous and sustained community engagement in order to produce the most accurate and reliable operational modeling system in the world. + The `Earth Prediction Innovation Center `_ seeks to accelerate scientific research and modeling contributions through continuous and sustained community engagement in order to produce the most accurate and reliable operational modeling system in the world. ESG Extended Schmidt Gnomonic (ESG) grid. The ESG grid uses the map projection developed by Jim Purser of NOAA :term:`EMC` (:cite:t:`Purser_2020`). @@ -118,7 +118,7 @@ Glossary High-Performance Computing. HPC-Stack - The `HPC-Stack `__ is a repository that provides a unified, shell script-based build system for building the software stack required for numerical weather prediction (NWP) tools such as the `Unified Forecast System (UFS) `__ and the `Joint Effort for Data assimilation Integration (JEDI) `__ framework. View the HPC-Stack documentation :doc:`here `. + The `HPC-Stack `__ is a repository that provides a unified, shell script-based build system for building the software stack required for numerical weather prediction (NWP) tools such as the `Unified Forecast System (UFS) `_ and the `Joint Effort for Data assimilation Integration (JEDI) `_ framework. View the HPC-Stack documentation :doc:`here `. HPSS High Performance Storage System (HPSS). @@ -227,25 +227,25 @@ Glossary A central location in which files (e.g., data, code, documentation) are stored and managed. RRFS - The `Rapid Refresh Forecast System `__ (RRFS) is NOAA's next-generation convection-allowing, rapidly-updated, ensemble-based data assimilation and forecasting system currently scheduled for operational implementation in 2024. It is designed to run forecasts on a 3-km :term:`CONUS` domain, see also `NOAA Rapid Refresh Forecast System (RRFS) `__. Experimental data is currently available from the `AWS S3 NOAA-RRFS `__ bucket for deterministic forecasts out to 60 hours at 00, 06, 12, and 18 UTC. Additionally, hourly forecasts out to 18 hours may be available for more recent RRFS model runs; the user needs to verify that data exists for needed dates. + The `Rapid Refresh Forecast System `_ (RRFS) is NOAA's next-generation convection-allowing, rapidly-updated, ensemble-based data assimilation and forecasting system currently scheduled for operational implementation in 2024. It is designed to run forecasts on a 3-km :term:`CONUS` domain, see also `NOAA Rapid Refresh Forecast System (RRFS) `__. Experimental data is currently available from the `AWS S3 NOAA-RRFS `__ bucket for deterministic forecasts out to 60 hours at 00, 06, 12, and 18 UTC. 
Additionally, hourly forecasts out to 18 hours may be available for more recent RRFS model runs; the user needs to verify that data exists for needed dates. SDF Suite Definition File. An external file containing information about the construction of a physics suite. It describes the schemes that are called, in which order they are called, whether they are subcycled, and whether they are assembled into groups to be called together. Spack - `Spack `__ is a package management tool designed to support multiple versions and configurations of software on a wide variety of platforms and environments. It was designed for large supercomputing centers, where many users and application teams share common installations of software on clusters with exotic architectures. + `Spack `_ is a package management tool designed to support multiple versions and configurations of software on a wide variety of platforms and environments. It was designed for large supercomputing centers, where many users and application teams share common installations of software on clusters with exotic architectures. spack-stack - The `spack-stack `__ is a collaborative effort between the NOAA Environmental Modeling Center (EMC), the UCAR Joint Center for Satellite Data Assimilation (JCSDA), and the Earth Prediction Innovation Center (EPIC). *spack-stack* is a repository that provides a :term:`Spack`-based method for building the software stack required for numerical weather prediction (NWP) tools such as the `Unified Forecast System (UFS) `__ and the `Joint Effort for Data assimilation Integration (JEDI) `__ framework. *spack-stack* uses the Spack package manager along with custom Spack configuration files and Python scripts to simplify installation of the libraries required to run various applications. The *spack-stack* can be installed on a range of platforms and comes pre-configured for many systems. Users can install the necessary packages for a particular application and later add the missing packages for another application without having to rebuild the entire stack. To get started, check out the documentation :doc:`here `. + The `spack-stack `_ is a collaborative effort between the NOAA Environmental Modeling Center (EMC), the UCAR Joint Center for Satellite Data Assimilation (JCSDA), and the Earth Prediction Innovation Center (EPIC). *spack-stack* is a repository that provides a :term:`Spack`-based method for building the software stack required for numerical weather prediction (NWP) tools such as the `Unified Forecast System (UFS) `_ and the `Joint Effort for Data assimilation Integration (JEDI) `_ framework. *spack-stack* uses the Spack package manager along with custom Spack configuration files and Python scripts to simplify installation of the libraries required to run various applications. The *spack-stack* can be installed on a range of platforms and comes pre-configured for many systems. Users can install the necessary packages for a particular application and later add the missing packages for another application without having to rebuild the entire stack. To get started, check out the documentation :doc:`here `. tracer - According to the American Meteorological Society (AMS) `definition `__, a tracer is "Any substance in the atmosphere that can be used to track the history [i.e., movement] of an air mass." Tracers are carried around by the motion of the atmosphere (i.e., by :term:`advection`). 
+      According to the American Meteorological Society (AMS) definition, a `tracer `_ is "Any substance in the atmosphere that can be used to track the history [i.e., movement] of an air mass." Tracers are carried around by the motion of the atmosphere (i.e., by :term:`advection`). These substances are usually gases (e.g., water vapor, CO2), but they can also be non-gaseous (e.g., rain drops in microphysics parameterizations). In weather models, temperature (or potential temperature), absolute humidity, and radioactivity are also usually treated as tracers. According to AMS, "The main requirement for a tracer is that its lifetime be substantially longer than the transport process under study."

   UFS
      The Unified Forecast System is a community-based, coupled, comprehensive Earth modeling system consisting of several applications (apps). These apps span regional to global
-      domains and sub-hourly to seasonal time scales. The UFS is designed to support the :term:`Weather Enterprise` and to be the source system for NOAA's operational numerical weather prediction applications. For more information, visit https://ufscommunity.org/.
+      domains and sub-hourly to seasonal time scales. The UFS is designed to support the :term:`Weather Enterprise` and to be the source system for NOAA's operational numerical weather prediction applications. For more information, visit https://ufs.epic.noaa.gov/.

   UFS_UTILS
      A collection of code used by multiple :term:`UFS` apps (e.g., the UFS Short-Range Weather App,
diff --git a/doc/UsersGuide/SSHIntro.rst b/doc/UsersGuide/SSHIntro.rst
index 7292a37e9..25837b9cc 100644
--- a/doc/UsersGuide/SSHIntro.rst
+++ b/doc/UsersGuide/SSHIntro.rst
@@ -139,7 +139,7 @@ Download the Data from a Remote System to a Local System

 .. note::

-   Users should transfer data to or from non-:srw-wiki:`Level 1 ` platforms using the recommended approach for that platform. This section outlines some basic guidance, but users may need to supplement with research of their own. On Level 1 systems, users may find it helpful to refer to the `RDHPCS CommonDocs Wiki `__.
+   Users should transfer data to or from non-:srw-wiki:`Level 1 ` platforms using the recommended approach for that platform. This section outlines some basic guidance, but users may need to supplement with research of their own. On Level 1 systems, users may find it helpful to refer to the `RDHPCS Data Transfer Documentation `_.

 To download data using ``scp``, users can typically adjust one of the following commands for use on their system:

diff --git a/doc/conf.py b/doc/conf.py
index 0d440a733..f1f094d54 100644
--- a/doc/conf.py
+++ b/doc/conf.py
@@ -103,14 +103,16 @@
 # Ignore working links that cause a linkcheck 403 error.
 linkcheck_ignore = [r'https://www\.intel\.com/content/www/us/en/docs/cpp\-compiler/developer\-guide\-reference/2021\-10/thread\-affinity\-interface\.html',
                     r'https://www\.intel\.com/content/www/us/en/developer/tools/oneapi/hpc\-toolkit\-download\.html',
-                    #r'https://glossary.ametsoc.org/.*',
+                    r'https://glossary.ametsoc.org/.*',
                     ]

 # Ignore anchor tags for SRW App data bucket. Shows Not Found even when they exist.
 linkcheck_anchors_ignore = [r"current_srw_release_data/",
                             r"input_model_data/.*",
                             r"fix.*",
-                            r"sample_cases/.*",
+                            r"experiment-user-cases/.*",
+                            r"rrfs_a/*",
+                            r"develop-20240618/*",
                             ]

 linkcheck_allowed_redirects = {r"https://github\.com/ufs-community/ufs-srweather-app/wiki/.*":
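To check that the updated ``linkcheck_ignore`` and ``linkcheck_anchors_ignore`` patterns behave as intended, reviewers can run Sphinx's ``linkcheck`` builder against this ``conf.py``. The commands below are a minimal sketch that assumes a standard Sphinx layout under ``doc/``; the repository may provide its own ``make`` targets instead.

.. code-block:: console

   # Run the linkcheck builder from the Sphinx source directory (doc/).
   cd /path/to/ufs-srweather-app/doc
   sphinx-build -b linkcheck . _build/linkcheck
   # Inspect the summary of broken, redirected, and ignored links.
   cat _build/linkcheck/output.txt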
diff --git a/doc/doc-snippets/cron-note.rst b/doc/doc-snippets/cron-note.rst
new file mode 100644
index 000000000..99192d0a1
--- /dev/null
+++ b/doc/doc-snippets/cron-note.rst
@@ -0,0 +1,3 @@
+.. note::
+
+   Users may also want to set ``USE_CRON_TO_RELAUNCH: true`` and add ``CRON_RELAUNCH_INTVL_MNTS: 3``. This will automate submission of workflow tasks when running the experiment. However, not all systems have :term:`cron`.
\ No newline at end of file
diff --git a/doc/doc-snippets/expt-conf-intro.rst b/doc/doc-snippets/expt-conf-intro.rst
new file mode 100644
index 000000000..d23fc546c
--- /dev/null
+++ b/doc/doc-snippets/expt-conf-intro.rst
@@ -0,0 +1,14 @@
+Navigate to the ``ufs-srweather-app/ush`` directory. The default (or "control") configuration for this experiment is based on the ``config.community.yaml`` file in that directory. Users can copy this file into ``config.yaml`` if they have not already done so:
+
+.. code-block:: console
+
+   cd /path/to/ufs-srweather-app/ush
+   cp config.community.yaml config.yaml
+
+Users can save the location of the ``ush`` directory in an environment variable (``$USH``). This makes it easier to navigate between directories later. For example:
+
+.. code-block:: console
+
+   export USH=/path/to/ufs-srweather-app/ush
+
+Users should substitute ``/path/to/ufs-srweather-app/ush`` with the actual path on their system. As long as a user remains logged into their system, they can run ``cd $USH``, and it will take them to the ``ush`` directory. The variable will need to be reset for each login session.
\ No newline at end of file
diff --git a/doc/doc-snippets/file-edit-hint.rst b/doc/doc-snippets/file-edit-hint.rst
new file mode 100644
index 000000000..b7a9b99b6
--- /dev/null
+++ b/doc/doc-snippets/file-edit-hint.rst
@@ -0,0 +1,9 @@
+.. Hint::
+
+   To open the configuration file in the command line, users may run the command:
+
+   .. code-block:: console
+
+      vi config.yaml
+
+   To modify the file, hit the ``i`` key and then make any changes required. To close and save, hit the ``esc`` key and type ``:wq`` to write the changes to the file and exit/quit the file. Users may opt to use their preferred code editor instead.
\ No newline at end of file
diff --git a/doc/doc-snippets/load-env.rst b/doc/doc-snippets/load-env.rst
new file mode 100644
index 000000000..85afec619
--- /dev/null
+++ b/doc/doc-snippets/load-env.rst
@@ -0,0 +1,7 @@
+.. code-block:: console
+
+   source /path/to/ufs-srweather-app/etc/lmod-setup.sh
+   module use /path/to/ufs-srweather-app/modulefiles
+   module load wflow_
+
+where ```` is a valid, lowercased machine name (see ``MACHINE`` in :numref:`Section %s ` for valid values), and ``/path/to/`` is replaced by the actual path to the ``ufs-srweather-app``.
\ No newline at end of file
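As a worked instance of the ``load-env.rst`` snippet above, the commands might look like the following. The clone location and the machine name are assumptions for illustration only (``hera`` is one valid, lowercased machine name), and, depending on the platform, ``lmod-setup.sh`` may also expect the machine name as an argument; follow the snippet and the SRW App documentation for the authoritative form.

.. code-block:: console

   # Illustrative only: assumes a clone at $HOME/ufs-srweather-app and the machine "hera".
   source $HOME/ufs-srweather-app/etc/lmod-setup.sh
   module use $HOME/ufs-srweather-app/modulefiles
   module load wflow_hera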
diff --git a/doc/doc-snippets/scp-files.rst b/doc/doc-snippets/scp-files.rst
new file mode 100644
index 000000000..bc29780e9
--- /dev/null
+++ b/doc/doc-snippets/scp-files.rst
@@ -0,0 +1,11 @@
+Users who are working on the cloud or on an HPC cluster may want to copy the ``.png`` files onto their local system to view in their preferred image viewer. Detailed instructions are available in the :ref:`Introduction to SSH & Data Transfer `.
+
+In summary, users can run the ``scp`` command in a new terminal/command prompt window to securely copy files from a remote system to their local system if an SSH tunnel is already established between the local system and the remote system. Users can adjust one of the following commands for their system:
+
+.. code-block:: console
+
+   scp username@your-IP-address:/path/to/source_file_or_directory /path/to/destination_file_or_directory
+   # OR
+   scp -P 12345 username@localhost:/path/to/source_file_or_directory /path/to/destination_file_or_directory
+
+Users would need to modify ``username``, ``your-IP-address``, ``-P 12345``, and the file paths to reflect their systems' information. See the :ref:`Introduction to SSH & Data Transfer ` for example commands.
\ No newline at end of file
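The ``scp -P 12345`` form in ``scp-files.rst`` assumes that a local port forward to the remote system is already open. One possible way to establish such a tunnel is sketched below; the port number, username, and address are placeholders, and site-specific options (e.g., a jump host) are omitted. The forwarded local port must match the value passed to ``scp -P``.

.. code-block:: console

   # Forward local port 12345 to the SSH port (22) on the remote system.
   # Keep this session open, then run the scp -P 12345 command from a separate terminal.
   ssh -L 12345:localhost:22 username@your-IP-address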