From d69c3799773e19adf3fef7bf16821440ddb2ddac Mon Sep 17 00:00:00 2001
From: "kayee.wong"
Date: Thu, 26 Sep 2024 17:10:16 +0000
Subject: [PATCH] Update develop branch, gsl_ufs_dev
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Squashed commit of the following:

commit 4ad0d52a6fbb581a9804bd0bb627b7c52f338bad
Author: kayee.wong
Date: Thu Sep 19 19:20:06 2024 +0000

For pygraf plotting.

commit 1bfba70ed6aaa3eb47974006ed0c5cae653b8ce8
Author: kayee.wong
Date: Thu Sep 19 04:27:55 2024 +0000

Fixed gfsatmprod file name and gfsarch data type.

commit 4e0a81f7acf5ccba5de8d3bdf7ab19cae80812cb
Author: kayee.wong
Date: Wed Sep 18 18:37:50 2024 +0000

Fix typos/links/dependency.

commit d5fdcbf8d6e987dbdfcd6b4da34d0934c8d82c3b
Author: kayee.wong
Date: Wed Sep 18 07:37:52 2024 +0000

Fixed missing config.base files.

commit c6239f925369b20fd8488e8537969e2f4f9e7725
Author: kayee.wong
Date: Wed Sep 18 06:53:47 2024 +0000

Update submodule hashes.

commit 32bf790f67b818ff50a7f6b74388fb81ed175b3a
Author: kayee.wong
Date: Wed Sep 18 06:09:41 2024 +0000

Update develop branch, gsl_ufs_dev
- based on gsl_ufs_dev from KaYee's fork

global-workflow: 07Aug24, 37c53ac [develop_07Aug2024_37c53ac]
UFS: 19Jul24, c127601
FV3: 19Jul24, 2527c11
UPP: 23Apr24, be0410e
CCPP-PHYSICS: 19Jul24, 46df080
UFS_UTILS: 26Jun24, 3ef2e6b

Squashed commit of the following:

commit 70b557836379bb7e545fcc6642e28d66cfc17735
Merge: 5edbd123e 37c53ac69
Author: Janet Derrico <143837053+jderrico-noaa@users.noreply.github.com>
Date: Wed Aug 7 11:02:24 2024 -0600

Merge branch 'NOAA-EMC:develop' into develop

commit 37c53ac692274eb5e9f9a3220033406e8c4b4a04
Author: Kate Friedman
Date: Wed Aug 7 08:11:21 2024 -0400

Revert MSU FIX_DIRs back to glopara (#2811)

commit 876dfee26ad67e1f729bbf52b3167d48ea5a7517
Author: TerrenceMcGuinness-NOAA
Date: Tue Aug 6 14:47:36 2024 -0400

Bugfix for updating label states in Jenkins (#2808)

Quick bug fix for updating state labels in CI during finalize (did not reference the GitHub CLI executable correctly in the pipeline script).

commit 8fee36f0307b0c08080e3a8fa8fa6703e7da5fce
Author: Rahul Mahajan
Date: Tue Aug 6 11:02:45 2024 -0400

Clean-up temporary rundirs - take 2. (#2753)

This PR:
- is a follow-up to a previous PR that aggressively pruned run directories.
- removes run directories for the current cycle in the clean-up if the cycle is successful. If the cycle is not successful, cleanup is not called and all run directories for the cycle are safe from being purged.
- also updates the PR template to list/query for any updates to submodules.

---------

Co-authored-by: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>
Co-authored-by: David Huber
Co-authored-by: Walter Kolczynski - NOAA

commit d599fff4aedd41ae587dbe02226acb12ff48efc1
Author: HelinWei-NOAA <48133472+HelinWei-NOAA@users.noreply.github.com>
Date: Mon Aug 5 05:31:31 2024 -0400

Change land surface for HR4 (#2787)

Resets the default value of opt_diag to 2, corresponding to the land surface upgrades in ufs-weather-model for HR4.

Resolves #2786

commit 6d7f7e860a0c7062f90bf09fdf9a5d19dc77cfdb
Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>
Date: Fri Aug 2 15:41:29 2024 -0400

Run METplus serially and correct the name of prod tasks (#2804)

Adds 2 hot fixes:
- METplus v9.1.3 has a bug that sometimes attempts to create multiple copies of the same directory when running in parallel, causing a Python error and downstream problems. This PR makes METplus run in serial mode, preventing such issues.
- Corrects the names of the atmos_prod, ocean_prod, and ice_prod tasks in workflow/rocoto/tasks.py (they were accidentally changed to e.g. atmosprod).

commit 0706c59ac53cc3bbfbaa56cbd7fa75ab51117830
Author: TerrenceMcGuinness-NOAA
Date: Fri Aug 2 15:03:45 2024 -0400

Update Java Agent launching script for Jenkins connections (#2762)

Made updates to the Jenkins launching script for robustness and less ambiguous documentation:
- Clearer distinction between the required user token for the remote API and the system's token for launching
- Added pre-checks: `gh` is authenticating, and a compliantly named token and secret file exist
- More robust JSON-based parser of the remote API response for checking the state of the Node connection
- For `cron` use, a 5-minute pause and recheck was added before re-launching the Java agent
- Added concise header documentation of requirements and purpose

---------

Co-authored-by: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>
Co-authored-by: Walter Kolczynski - NOAA

commit b73b1fd203496db97f8067652659573a632bcc67
Author: Walter Kolczynski - NOAA
Date: Fri Aug 2 07:59:06 2024 -0400

Fix erroneous cdump addition (#2803)

commit 49877046ac3306f6b78ca0ab5d5089ba1aa3e3e3
Author: Rahul Mahajan
Date: Thu Aug 1 20:26:13 2024 -0400

Update ocean post-processing triggers (#2784)

This PR:
- replaces the `check_netcdf.sh` checker for ocean post-processing with a dependency on ocean output at the next forecast hour, or on the completion of the forecast job for ocean prod
- removes the no-longer-needed `ush/check_netcdf.sh`

commit aa2af1ca8d59424a60a1730722bf528775d9e606
Author: GeorgeGayno-NOAA <52789452+GeorgeGayno-NOAA@users.noreply.github.com>
Date: Thu Aug 1 16:46:10 2024 -0400

Update the gfs_utils repository hash (#2801)

# Description
Point to the latest hash of the gfs-utils repository, which contains the bug fix to gaussian_sfcanl.

Resolves #2669.

Refs: https://github.com/NOAA-EMC/gfs-utils/pull/73

commit d3d85f0e0d573f16a71ca44778021dfc0ccf50c8
Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>
Date: Thu Aug 1 08:12:14 2024 -0400

Add fixes for metplus jobs when gfs_cyc=2 or 4 (#2791)

Changes how METplus jobs run so that they run on the last GFS cycle for a given `PDY`. This is a departure from operations, where the METplus jobs run on the 00Z cycle for the previous 3 cycles and 00Z (i.e. `${PDYm1}06` through `${PDY}00`). With this PR, for gfs_cyc=4, METplus jobs will run on `${PDY}18` for cycles 00-18. See https://github.com/NOAA-EMC/EMC_verif-global/pull/131 for more details.

commit 1cf8b448af562dbb7af198399c78c585977e81da
Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>
Date: Tue Jul 30 10:38:49 2024 -0400

Simplify resource-related variables, remove CDUMP where unneeded (#2727)

This overhauls resource-related variables to use a common set of variables for each job. In the process, this also removed the use of CDUMP in most cases.

Resolves #1299 #2693

commit 61875f25c9e971f82ae499b5b612d7f095deebd4
Author: Eric Sinsky - NOAA <48259628+EricSinsky-NOAA@users.noreply.github.com>
Date: Mon Jul 29 14:40:03 2024 -0400

Remove f000 from atmos rocoto tasks for replay cases (#2778)

The main purpose of this PR is to remove f000 from atmosphere-related rocoto tasks when `REPLAY_ICS` is set to `YES`. In cases where `REPLAY_ICS` is `YES` and `OFFSET_START_HOUR` is greater than `0`, it becomes necessary to have the first lead hour set to `OFFSET_START_HOUR` for the atmosphere-related rocoto tasks. For example, when `OFFSET_START_HOUR` is set to `3`, the minimum lead time for the atmos_prod and atmos_ensstat rocoto tasks needs to be 3, and the minimum lead time for the ocean_prod rocoto task needs to be 6 (assuming `FHOUT_OCN` is 6). This PR makes this rocoto workflow setup possible by removing 0 from fhrs for atmosphere-related tasks in `gefs_tasks.py` when replaying. This PR also moves where f000 is being removed for the ocean_prod and ice_prod tasks: the if-block that performs this f000 removal has been moved from `tasks.py` to `gefs_tasks.py` and `gfs_tasks.py`.

---------

Co-authored-by: Rahul Mahajan
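As an illustration of the lead-hour filtering this commit describes, a minimal Python sketch (function and argument names are hypothetical, not the actual `gefs_tasks.py` code):

```python
# Hypothetical sketch of the fhr filtering described above: when
# replaying, drop f000 and any lead hour before OFFSET_START_HOUR.
def filter_fhrs(fhrs, replay_ics=False, offset_start_hour=0):
    if not replay_ics:
        return fhrs
    return [fhr for fhr in fhrs if fhr != 0 and fhr >= offset_start_hour]

fhrs = list(range(0, 13, 3))                                    # [0, 3, 6, 9, 12]
print(filter_fhrs(fhrs, replay_ics=True, offset_start_hour=3))  # [3, 6, 9, 12]
```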
commit f156a7894d639f177e3e2588f98eec1f6f59aa68
Author: Jessica Meixner
Date: Fri Jul 26 14:18:32 2024 -0500

HR4 GWD update (#2732)

This update is a combination of the gravity wave drag (GWD) versions from NOAA/GSL and NOAA/PSL.

commit a7f6b32ed63efa0de21bfb0ce63364a5b22b9891
Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>
Date: Thu Jul 25 14:26:52 2024 -0400

Temporarily disable METplus jobs (#2796)

commit 848659691fdbf47e7ccdbbb2ebf22a6e470633a2
Author: Guillaume Vernieres
Date: Wed Jul 24 15:00:35 2024 -0400

Refactoring of the marine B-matrix job (#2749)

Refactors the functionality of B-matrix generation from the GDASApp.

Resolves #2743

commit 65a7ab75dc0e4baba06a02e11ed0455787056a68
Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>
Date: Tue Jul 23 08:35:48 2024 -0400

Replace Jinja namespaces with replace_tmpl filter and disable ACCOUNT_SERVICE option (#2775)

Removes the namespace-based construction of EnKF member COM directories in the enkf archive template.

commit c45b9611f3e701b819bd33dc5af29033f060bb91
Author: Eric Sinsky - NOAA <48259628+EricSinsky-NOAA@users.noreply.github.com>
Date: Tue Jul 23 00:33:16 2024 -0400

Add task to process reforecast variables to save on WCOSS2 (#2680)

# Description
This PR adds an optional task to the global-workflow to process a subset of ocean, ice, wave, and atmosphere products to be saved on WCOSS2 for the GEFSv13 reforecast. This task is designed to process GEFS variables so that specific reforecast product requirements are met. A new variable in `config.base` called `DO_EXTRACTVARS` enables this task, which is currently called `extractvars`. `DO_EXTRACTVARS` is set to `NO` by default; the task is specifically designed to be executed for the GEFSv13 reforecast.

Refs #1878

# Type of change
- New feature (adds functionality)

# Change characteristics
- Is this a breaking change (a change in existing functionality)? NO
- Does this change require a documentation update? NO

# How has this been tested?
This has been cloned and tested on WCOSS2. This will need to be tested on Hera and other platforms on which the reforecast may be running.
# Checklist
- [ ] Any dependent changes have been merged and published
- [x] My code follows the style guidelines of this project
- [x] I have performed a self-review of my own code
- [ ] I have commented my code, particularly in hard-to-understand areas
- [x] My changes generate no new warnings
- [x] New and existing tests pass with my changes
- [ ] I have made corresponding changes to the documentation if necessary

---------

Co-authored-by: Rahul Mahajan

commit 71dc33c6ca991c16ce743760d99feaaf60f2218a
Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>
Date: Mon Jul 22 14:51:53 2024 -0400

Set METplus process count in config.metp; add verif-global support for Rocky 9 (#2774)

Fix metp* resources and check that they completed properly; add support for Orion Rocky 9.

commit 56df67a90fe090c425199f1285e5aac722c398b1
Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>
Date: Mon Jul 22 09:28:18 2024 -0400

Hotfix: Update jcb to avoid git-lfs files (#2782)

Removes git-lfs files from the `GDASApp` `jcb` submodule, allowing it to be cloned on Hera. This hotfix points to a non-authoritative branch of the GDASApp (https://github.com/DavidHuber-NOAA/GDASApp/tree/hotfix/update_jcb) and should be updated ASAP back to the authoritative repository.

commit fc668aa422ebbad76ceda1b3bbf8dc0ea432defd
Author: Rahul Mahajan
Date: Tue Jul 16 09:44:00 2024 -0400

Address issues in creating XML for GFS forecast-only with app S2SWA (#2757)

This bugfix PR:
- fixes an issue where a user is unable to generate the XML for a GFS forecast-only experiment with APP=S2SWA

Specifically, the changes are related to defining `aero_fcst_cdumps`. Following `setup_expt.py`, the user will have to set `AERO_FCST_CDUMPS="gdas|gfs|both"` depending on their use case in `config.base`.

commit e0878dba0e53706a7f53429b61aee2936e2c21bf
Author: Kate Friedman
Date: Mon Jul 15 10:25:11 2024 -0400

Updated prepobs and fit2obs versions for Orion Rocky9 (#2758)

Update prepobs to v1.0.2 and fit2obs to v1.1.2. These versions now support Orion Rocky9. Updates are included for new install locations on WCOSS2, Hera, Orion/Hercules, and Jet.

commit 4968f3a8de9a5f90651cacd74e38f97bc80b7bbb
Author: TerrenceMcGuinness-NOAA
Date: Thu Jul 11 17:48:47 2024 +0000

CI maintenance updates and adding CI Unit Tests (#2740)

This PR has a few maintenance updates to the CI pipeline and adds a test directory with Unit Tests.

**Major maintenance updates:**
- Added try blocks with appropriate messaging to the GitHub PR on failure for:
  - **scm** checkout
  - build failures (with error logs sent as gists)
  - create-experiment failures, with `stderr` sent to GitHub PR messaging
- Pre-stage FAILs from the above are now captured; these fails allow FINALIZE to update the label to FAIL
no more "hanging" CI state labels in GitHub - see image below) **Minor Maintenance updates:** - Fix for STALLED cases reviled from PR 2700 (just needed a lambda specifier) - Fixed path to experiment directory in PR message (had dropped EXPDIR in path) - Needed `latin-1` decoder in reading log files for publishing **Added python Unit Tests for CI functionality:** - Installed **Rocoto** and **wxfow** in GitHub Runner for testing key CI utility codes - Cashed the install of Rocoto in the GitHub Runners to greatly reduce stetup time for running the unit tests - Unit Tests Python scripts added - `test_rocostat.py`: rocoto_statcount() rocoto_summary() rocoto_stalled() - `test_setup.py`: setup_expt() test_setup_xml() - `test_create_experment`: test_create_experiment() - - Runs all PR cases that do not have ICs in the GItHub Runner - Reporting mechanism in the Actions tab for Python Unit Testing results - Test case data for STALLED and RUNNING stored on S3 and pulled using wget during runtime of tests commit 5ef4db74649b8be03402c17aa29c024e71699a7b Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Thu Jul 11 08:59:24 2024 -0400 Adds contents of constructor and initialize methods to marine LETKF class (#2635) Adds contents of constructor and initialize methods to marine LETKF class Partially addresses https://github.com/NOAA-EMC/GDASApp/issues/1091 --------- Co-authored-by: Rahul Mahajan Co-authored-by: Cory Martin commit 8998ec7b74123e953b97a93fa14cc78d471a1aee Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Tue Jul 9 08:31:57 2024 -0400 Fix GDAS group B restart archiving (#2735) Archives the GDAS restartb dataset at a 6-hour offset from restarta This allows cycled experiments to restart from the archives. The tabbing for the master archive templates was also added to improve readability. Resolves #2722 commit 3ca74771255727033b9dc043c652ac585178629c Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Tue Jul 9 08:28:54 2024 -0400 Add fcst dependency to ocnanalprep (#2728) Add previous cycle's `fcst` as a dependency to `ocnanalprep` This ensures that the availability of restart files to the latter. This addresses a seldomly-encountered race condition where `ocnanalprep` fails due to the lack of the files. commit 58fca1668aecd6fb1afd12a441256ad35900e075 Author: Rahul Mahajan Date: Fri Jul 5 15:02:23 2024 -0400 Update (partially) global-workflow for orion+rocky9 (#2741) This PR: - updates a few submodules (GSI, GSI-utils, GSI-monitor, UFS_utils, GFS-utils) to include recent update to their modulefiles for Orion+Rocky9 upgrade - updates the modulefiles in global-workflow to load modules from Orion+Rocky9 paths - updates modulefiles for `gwsetup` and `gwci` as well. - removes NCL and GEMPAK from Orion. NCL is not used and GEMPAK is not installed. - adds `parm/config.gfs/config.resources.ORION` to address GSI performance degradation after Rocky 9 upgrade. This PR: - does not update the build for UPP. Standalone UPP is not available via ufs-weather-model as of #2729 - will need a follow-up update for `prepobs` and `fit2obs` updated locations when they are installed in `glopara` space on Orion. # Type of change - Maintenance (code refactor, clean-up, new CI test, etc.) # Change characteristics - Is this a breaking change (a change in existing functionality)? NO - Does this change require a documentation update? NO # How has this been tested? This PR is not sufficient for Orion. 
This PR must be tested on other platforms (Hera, WCOSS2) as it updates submodules.

# Checklist
- [ ] Any dependent changes have been merged and published
- [ ] My code follows the style guidelines of this project
- [ ] I have performed a self-review of my own code
- [ ] I have commented my code, particularly in hard-to-understand areas
- [ ] My changes generate no new warnings
- [ ] New and existing tests pass with my changes
- [ ] I have made corresponding changes to the documentation if necessary

---------

Co-authored-by: Kate Friedman

commit d65d3d257b38225fac74e86b770f43e1f8ae2d5a
Author: Jessica Meixner
Date: Wed Jul 3 21:07:49 2024 -0400

Update ufs model hash to 20240625 (#2729)

Updates the UFS weather model hash to the hash from 2024-06-24, which has Orion porting updates plus a few namelist updates.

commit 2bd106a013805ba4e16dbdc456d6731f8f36ec85
Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>
Date: Wed Jul 3 11:32:40 2024 -0400

Hotfix for undefined CLUSTERS (#2748)

Defines `CLUSTERS` as an empty string for all hosts except Gaea and uses the native `dict` `get` method to prevent grabbing an unset entry.

commit 7dc6651a3b92194d963675bdc0a9ec3c28499abf
Author: GwenChen-NOAA <95313292+GwenChen-NOAA@users.noreply.github.com>
Date: Wed Jul 3 09:56:08 2024 -0400

Update gempak job to run one fcst hour per task (#2671)

This PR updates the gempak jobs (gfs, gdas, and goes) from processing all forecast hours at once to one forecast hour at a time. This will reduce the job runtime to less than 5 min, so restart capability is not needed.

Resolves #1250
Ref #2666 #2667

---------

Co-authored-by: Walter.Kolczynski

commit 8215ae654202186a4f753c3abe937b7b9b91a9c7
Author: Rahul Mahajan
Date: Tue Jul 2 16:22:11 2024 -0400

Hotfix for clusters from #2701 (#2747)

Fixes an issue created from #2701 that added `CLUSTERS` to the `gaea.yaml`.

commit 11943e36ba12b3df49c51942da780698fab02d38
Author: DavidBurrows-NCO <82525974+DavidBurrows-NCO@users.noreply.github.com>
Date: Tue Jul 2 12:58:10 2024 -0400

Fix xml file setup and complete C48 ATM and S2SW runs for CI on Gaea (#2701)

This PR sets up the ability on Gaea to auto-generate a clean xml file, i.e., an xml file that does not need any alterations before running rocoto.

Refs #2572
Refs #2664

commit de8706702ead0630beb54d868f83aa2cb23f8f79
Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com>
Date: Mon Jul 1 09:29:14 2024 -0400

Update for JCB policies and stage DA job files with Jinja2-templates (#2700)

This PR updates the `gdas.cd` hash to bring in new JCB conventions.

Resolves #2699

From #2654: This PR moves much of the staging code that takes place in the Python initialization subroutines of the variational and ensemble DA jobs into Jinja2-templated YAML files to be passed into the wxflow file handler. Much of the staging has already been done this way; this PR simply expands that strategy. The old Python routines that were doing this staging are now removed. This is part of a broader refactoring of the pygfs tasking. wxflow PR #30 (https://github.com/NOAA-EMC/wxflow/pull/30) is a companion to this PR.

Co-authored-by: danholdaway
Co-authored-by: DavidNew-NOAA
Co-authored-by: DavidNew-NOAA <134300700+DavidNew-NOAA@users.noreply.github.com>
Co-authored-by: Dan Holdaway <27729500+danholdaway@users.noreply.github.com>
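A rough sketch of the staging pattern this commit expands, assuming the usual wxflow utilities (`parse_j2yaml` and `FileHandler` are real wxflow helpers; the template name and its keys below are illustrative only):

```python
# Sketch of the Jinja2-template + file-handler staging pattern described
# above; not the actual DA job code.
from wxflow import parse_j2yaml, FileHandler

# Render a Jinja2-templated YAML that lists directories to make and
# files to copy for the DA job (hypothetical template name)...
staging_plan = parse_j2yaml("atmanl_stage.yaml.j2",
                            data={"current_cycle": "2024070100"})

# ...then hand the rendered plan, e.g. {'mkdir': [...], 'copy': [[src, dst], ...]},
# to the wxflow file handler to perform the staging.
FileHandler(staging_plan).sync()
```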
commit c49e4eee1a2ca818b3ecdcb9ea41c3f3e91d585b
Author: Rahul Mahajan
Date: Fri Jun 28 14:56:19 2024 -0400

Revert PR 2681 (#2739)

This PR:
- reverts #2681 in part
- keeps some changes for `RUN`
- is a hotfix
- should be merged ASAP after consensus w/ @guillaumevernieres @CatherineThomas-NOAA @WalterKolczynski-NOAA

commit 9476c1237af4adbc95f90bd1bdd34b6b99f2f8a3
Author: TerrenceMcGuinness-NOAA
Date: Wed Jun 26 15:46:08 2024 -0400

updated Finalize in Jenkinsfile and added try block around scm checkout (#2692)

Updates the Jenkins Pipeline with a try block around checkout to capture errors for the user. Also cleaned up Finalize and added a section to clean out the workspace on success.

commit 968568f682bac7564095440bdb7813abefd76821
Author: Walter Kolczynski - NOAA
Date: Wed Jun 26 13:27:19 2024 -0400

Activate snow DA test on WCOSS (#2720)

Activates the snow DA test on WCOSS.

commit 7706760bb8adbdf78cb640b02739023c886e7699
Author: Rahul Mahajan
Date: Wed Jun 26 10:02:22 2024 -0400

Cleanup of stale RUNDIRS from an experiment (#2719)

This PR:
- removes stale temporary scratch run directories from `$DATAROOT/` every 3 days
- should help to scrub failed attempts
- removes an unused variable `RUNDIR` defined in `config.base`

commit 8962991691b5f0857b813bddfd28aa1034d4bd2b
Author: Jessica Meixner
Date: Wed Jun 26 09:43:48 2024 -0400

Update logic for MOM6 number of layers/exception values (#2681)

Updates the logic to determine the number of layers by RUN instead of by DO_JEDIOCNVAR, and sets the exception value for MOM6 to 1e-34 for all scenarios. Note: we will no longer have zeros in the ocean grib output, and the DA will also run without issues.

Fixes https://github.com/NOAA-EMC/global-workflow/issues/2615

commit 12431f76bdce807067929415007592cffc8a2457
Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com>
Date: Wed Jun 26 07:42:35 2024 -0600

Update wave jobs to use COMIN/COMOUT (#2643)

NCO has requested that each COM variable specify whether it is an input or an output. This completes that process for the global-workflow wave model and products tasks.

Refs #2451

commit b902c0bac126c323a07186ad8881384b032b6fda
Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>
Date: Tue Jun 25 07:48:46 2024 -0400

Assign machine- and RUN-specific resources (#2672)

Redefines resource variables based explicitly on RUN or CDUMP. Additionally, machine-specific resources are moved out of config.resources and placed in respective config.resources.{machine} files.

Resolves #177 #2672

commit 5edbd123e2878a07f5cce8e3c7ea6147f286633a
Merge: 09333c01d 4e1b937b6
Author: Janet Derrico <143837053+jderrico-noaa@users.noreply.github.com>
Date: Mon Jun 24 12:44:05 2024 -0600

Merge branch 'NOAA-EMC:develop' into develop

commit 4e1b937b67ed220120e81925c4507f03b9b8965f
Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>
Date: Mon Jun 24 10:50:52 2024 -0400

Add minimum software requirements (#2712)

Adds a table to the HPC documentation stating the minimum supported versions.

commit f43a86276aaef91efa28faadc71a3cf50e749efe
Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>
Date: Fri Jun 21 13:44:29 2024 -0400

Fix and simplify online archiving and reenable METplus jobs (#2687)

This fixes the online archiving portion of the `*arch` and `*earc00` jobs, a prerequisite for running METplus. This also reenables METplus by default. The approach previously taken created `FileHandler` dictionaries at varying levels within the resulting yaml, which was not properly parsed by `exglobal_archive.py`. This approach creates a single `FileHandler` dictionary and is much less complicated overall.
Resolves #2673 #2647

commit 8993b42cb91144c0ab0501dc7841ea8d675c4701
Author: Walter Kolczynski - NOAA
Date: Wed Jun 19 21:51:22 2024 -0400

Eliminate post groups (#2667)

Eliminates the post groups used for upp and products jobs so that each task only processes one forecast hour. This is more efficient and greatly simplifies downstream dependencies that depend on a specific forecast hour.

Resolves #2666
Refs #2642

commit 0b810c888239853fedd0e4584fe62536c6aaacdf
Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com>
Date: Tue Jun 18 20:32:48 2024 -0600

Removes misleading "No such file or directory" syntax errors from output files (#2688)

This PR addresses issue #1252. The following is accomplished:
- The existence of a file is now checked before attempting to remove it; this is performed as noted [here](https://github.com/NOAA-EMC/global-workflow/issues/1252#issue-1538627369). This PR only addresses the `chgrp` issue.

Refs #1252

---------

Co-authored-by: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>

commit 3270ac3bf00c3ebc8166c70d84647ec44431fbae
Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com>
Date: Tue Jun 18 12:17:59 2024 -0600

Hotfix for bug in template names. (#2697)

This PR is a hotfix for an incorrectly named (i.e., non-existent) `COM/` template.

Resolves #2696
Refs #2451

commit 35d4d99eaac669721add9ddcc793153e5ab3b30a
Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com>
Date: Tue Jun 18 08:06:53 2024 -0600

Update archive job to use COMIN/COMOUT (#2668)

NCO has requested that each COM variable specify whether it is an input or an output. This completes that process for the global-workflow archive task.

Refs #2451

---------

Co-authored-by: Walter Kolczynski - NOAA
Co-authored-by: Rahul Mahajan
Co-authored-by: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>

commit 47b3a581c8257fa24411fb400df8bb0e1e04972a
Author: Walter Kolczynski - NOAA
Date: Mon Jun 17 22:55:38 2024 -0400

Turn on high-frequency output in extended test (#2679)

Turns on high-frequency (hourly) output in the extended products test to exercise that aspect of the code. This test only runs on WCOSS. Also adds the hooks to optionally turn on the metplus jobs, but that is deferred as they are not currently working correctly.

commit 38f2df9fb0c074b1f80d3c637080be79be693161
Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>
Date: Mon Jun 17 17:12:55 2024 +0000

Optimize wavepostpnt (#2657)

Optimizes the gfswavepostpnt, gfswavepostbndpntbll, and gfswavepostbndpnt jobs. This is done by
1) reducing the number of calls to `sed`, `awk`, `grep`, and `cat` by
- performing operations on all files at once instead of looping over each file
- removing piped `cat` calls (e.g. `cat | sed 'something'`)
- combining `sed` and `grep` calls when possible
- adding logic to `awk` calls instead of handling that logic in bash
2) minimizing as much as possible the amount of data on disk that has to be read in (e.g. limiting sed to read only the line numbers it needs)

---------

Co-authored-by: Walter Kolczynski - NOAA

commit 5af325a6a4e0a14d180514a418603ca79fada487
Author: Dan Holdaway <27729500+danholdaway@users.noreply.github.com>
Date: Fri Jun 14 18:05:23 2024 -0400

Update GDASapp hash to move JCB into GDASapp (#2665)

This PR moves JCB into GDASapp.
The PR also bumps up the hash of GDASapp to what is in `feature/move_jcb`, which at the time of writing is develop plus the absorption of JCB into GDASapp. Note that I also took the changes from https://github.com/NOAA-EMC/global-workflow/pull/2641 to follow the testing @RussTreadon-NOAA has done.

commit 6c93b4554e235fcb4d0004e99a4c4498d55d461b
Author: Yaping Wang <49168260+ypwang19@users.noreply.github.com>
Date: Fri Jun 14 10:18:17 2024 -0500

Add observation preparation job for aerosols DA to workflow (#2624)

Adds a prepaeroobs job to prepare aerosol obs files for DA. This job does quality control of the raw VIIRS aerosol observations and converts them to IODA format.

Resolves #2623

---------

Co-authored-by: ypwang19
Co-authored-by: TerrenceMcGuinness-NOAA
Co-authored-by: Cory Martin
Co-authored-by: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>

commit 5a5fc2be7555f094a0f90fd3a3df22d071ccdfd4
Author: Jessica Meixner
Date: Fri Jun 14 11:04:41 2024 -0400

Remove ocean daily files (#2689)

This PR removes the ocn_daily files that are produced by the ocean component. If needed, these files can be recreated by averaging data that exists in the 6-hour averaged files.

Fixes https://github.com/NOAA-EMC/global-workflow/issues/2675
Fixes https://github.com/NOAA-EMC/global-workflow/issues/2659 (by removing them and making this obsolete)

commit 603a4a8052a5c43ce5986f028c3fcfd5fd248ad4
Author: TerrenceMcGuinness-NOAA
Date: Thu Jun 13 12:22:03 2024 -0400

Update Jenkinsfile: needed a comma

commit dc21eac6c3941d7f30803891d91d82f4cc1f8183
Author: TerrenceMcGuinness-NOAA
Date: Thu Jun 13 11:41:14 2024 -0400

Add Hercules-EMC to the Jenkins configurable parameter list (#2685)

This quick-fix PR updates the Jenkins Pipeline's configurable parameter list to include the **Hercules-EMC** node. This allows Jenkins users to restart jobs in the controller when no updates have been made.

commit ebacebfbe458634b8c80af6a735d6b6d01e4e406
Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com>
Date: Thu Jun 13 11:20:24 2024 -0400

Update gdas.cd and gsi_utils hashes (#2641)

This PR updates the `sorc/gdas.cd` and `sorc/gsi_utils` hashes. The updated hashes bring in bug fixes, new UFS DA functionality, and a Gaea build for gsi_utils.

Resolves #2640

commit 34155fb4767769600a1ff95f0a65e37081addc2a
Author: Neil Barton <103681022+NeilBarton-NOAA@users.noreply.github.com>
Date: Thu Jun 13 11:18:22 2024 -0400

Add ability to use GEFS replay ICs (#2559)

This PR allows the use of ICs from PSL's replay analysis. These replay ICs will be used for GEFS reforecasting and SFS. Two main changes are associated with these updates: (1) replay ICs being valid at 3Z, and (2) the use of warm starts.

Resolves #1838

---------

Co-authored-by: Jessica Meixner
Co-authored-by: Walter Kolczynski - NOAA
Co-authored-by: Rahul Mahajan

commit 6c19a0e3fc4400e1d39288be4ee4fc244b74f699
Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com>
Date: Wed Jun 12 19:25:42 2024 -0600

Replace `sleep` with `wait_for_file` (#2586)

This PR addresses issue #2444. The following is accomplished:
- All `sleep` statements are replaced with `wait_for_file` for the relevant scripts beneath `scripts` and `ush`;
- Indentation and shell norms are updated where applicable.

Note: The WAFS scripts are not updated, as per @aerorahul's direction.

Resolves #2444

---------

Co-authored-by: henrywinterbottom-wxdev
Co-authored-by: Walter Kolczynski - NOAA
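The workflow's `wait_for_file` is a shell utility; purely to illustrate the bounded-polling pattern it provides over a bare `sleep` (names, defaults, and the file path below are invented), in Python:

```python
import os
import time

def wait_for_file(path, sleep_interval=10, max_tries=60):
    """Poll for a file instead of sleeping a fixed amount of time."""
    for _ in range(max_tries):
        if os.path.exists(path):
            return True
        time.sleep(sleep_interval)
    return False  # the caller decides whether a missing file is fatal

# Illustrative input file name, not an actual workflow product.
if not wait_for_file("gfswave.points.input", sleep_interval=30):
    raise FileNotFoundError("required input never appeared in the wait window")
```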
commit 5b2a3d449a0835cec2663aabb06f1c47a3faf84e
Author: Walter Kolczynski - NOAA
Date: Wed Jun 12 13:31:55 2024 -0400

Add COM template for JEDI obs (#2678)

Adds a COM template to define a path to store obs processed for JEDI. This will allow UFSDA to stop writing to COM_OBS, which should be read-only as it belongs to obsproc in operations. No functional change yet.

commit 2e6f1fcde9935619352b1b26cba42ec0f4d845ed
Author: Guoqing Ge
Date: Wed Jun 12 09:06:23 2024 -0600

Link both global-nest fix files and non-nest ones at the same time (#2632)

This PR enables linking both global-nest fix files and non-nest ones at the same time, so users can run both nesting and non-nesting experiments concurrently without worrying about which fix files are linked.

Resolves #2631

commit 61de004d4f9e9edf8a31bb173f2719b46451a36a
Author: Jessica Meixner
Date: Wed Jun 12 11:03:13 2024 -0400

Update ufs-weather-model (#2663)

Updates ufs-weather-model; this updates RDHPCS to the newer spack-stack, allowing some temporary fixes to be reverted. It:
* removes the upp submodule
* uses upp from the ufs-weather-model
* restores the build and link that were hacked during the Hera Rocky 8 transition to allow for the UPP submodule
* removes forecast directories in clean-up

Resolves #2617
Resolves #2437

---------

Co-authored-by: Rahul Mahajan

commit 15eaf35fb13f361be400be38a5f7ca7b5461ab1d
Author: Eric Sinsky - NOAA <48259628+EricSinsky-NOAA@users.noreply.github.com>
Date: Wed Jun 12 01:15:37 2024 -0400

Add ability to process ocean/ice products specific to GEFS (#2561)

This PR begins to add the capability to produce GEFSv13 ocean and ice products in the global-workflow according to stakeholder requirements. The following features are added:
- An oceanice prod yaml file has been added to address the ocean and ice products specific to GEFSv13.
- The rocoto dependencies and config.base for GEFS have also been modified to allow for 24-hour averaged ocean and ice output.
- Various scripts have been modified to allow for ocean and ice output frequencies of 24 hours.
- `FHOUT_OCNICE` has been split into two variables called `FHOUT_OCN` and `FHOUT_ICE`. The same has been done for `FHOUT_OCNICE_GFS`.

Refs #1878

commit 6691e7489650e0b738c176fbd096109288dc09b6
Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com>
Date: Tue Jun 11 21:15:07 2024 -0600

Update cleanup job to use COMIN/COMOUT (#2649)

NCO has requested that each COM variable specify whether it is an input or an output. This completes that process for the global-workflow clean-up task.

Refs #2451

---------

Co-authored-by: Walter Kolczynski - NOAA

commit 23a8d8835dd4c5d69ca20f5ff23705f30f17b4b0
Author: TerrenceMcGuinness-NOAA
Date: Tue Jun 11 16:17:25 2024 -0400

Add overwrite to create experiment in BASH CI (#2676)

This is a quick hotfix to the CI BASH driver script adding `--overwrite` to the create-experiment script to avoid errors from restarting an experiment.

commit e7909af8d9e1f34140388a3f8556d8e582c58fe5
Author: emilyhcliu <36091766+emilyhcliu@users.noreply.github.com>
Date: Mon Jun 10 15:11:27 2024 -0400

Add handling to select CRTM cloud optical table based on cloud scheme and update calcanal_gfs.py (#2645)

This PR proposes updates for the following two scripts:
1. In **scripts/exglobal_atmos_analysis.sh**: Add handling to select the CRTM cloud optical table based on the cloud microphysical scheme indicated by `imp_physics`. The default scheme in the GFS forecast model is the Thompson scheme (imp_physics = 8).
2. In **ush/calcanl_gfs.py**: Increase the MPI number declared in the script, due to the increased number of variables used to interpolate increments and calculate the analysis in the netcdf_io routines in GSI-utils. Here is the related PR #46 for GSI-utils: https://github.com/NOAA-EMC/GSI-utils/pull/46

---------

Co-authored-by: Rahul Mahajan
Co-authored-by: Walter Kolczynski - NOAA

commit 9caa51de8fb7be07d2e61775da01937d576964f6
Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com>
Date: Thu Jun 6 22:15:23 2024 -0600

Update RDHPCS Hera resource for `eupd` task (#2636)

As per @wx20jjung, the resources for the `eupd` task have been updated for RDHPCS Hera to address memory issues that cause the C384 `gdaseupd` job to fail.

Resolves #2454

---------

Co-authored-by: Walter Kolczynski - NOAA

commit acf3aaa2b1d3e3024b0b5d2fe23eee8c317a980b
Author: DavidNew-NOAA <134300700+DavidNew-NOAA@users.noreply.github.com>
Date: Thu Jun 6 11:49:03 2024 -0400

Parameterize some things in config.atmanl and config.atmensanl (#2661)

This PR adds some parameters in config.atmanl and config.atmensanl that can be altered with defaults.yaml. The motivation is to make these files match those in the GDASApp JJOB tests (example: https://github.com/NOAA-EMC/GDASApp/blob/develop/test/atm/global-workflow/config.atmanl), so the tests can just use the Global Workflow config.atmanl and config.atmensanl rather than custom ones in GDASApp that have to be separately updated every time the Global Workflow versions change.

commit 54ea0b73a07921be5fbb07fe41e976888bd3e549
Author: Guillaume Vernieres
Date: Thu Jun 6 01:36:02 2024 -0400

Add links to the ocean insitu obs processing tools (#2644)

Adds links to the marine bufr-to-ioda converters for the marine insitu observations.
- fixes https://github.com/NOAA-EMC/GDASApp/issues/1106
- waiting for https://github.com/NOAA-EMC/GDASApp/pull/1135

commit 205d0c2b13e2d7755cec75bf8c978ab20d453862
Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>
Date: Wed Jun 5 17:31:30 2024 +0000

Update S4 point of contact in docs (#2660)

Updates the point of contact for global-workflow issues on S4.

commit aa23ccf1d0d229f9ff1398d84af1fa7ee5bed262
Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com>
Date: Wed Jun 5 12:50:14 2024 -0400

Enable wcoss2 ufsda build and module load (#2620)

This PR enables ufsda (`sorc/gdas.cd`) to be built and run on WCOSS2.

Resolves #2602
Resolves #2579

commit 67b833e0c7bc390865d453588b4609a1a7ede981
Author: Jessica Meixner
Date: Tue Jun 4 13:33:43 2024 -0400

Update ufs-weather-model (#2646)

Updates the UFS model to commit https://github.com/ufs-community/ufs-weather-model/commit/5bec704243286421fc613838fc67a2129e96acd6. This should resolve the issue and allow C768 runs on Hera, and allow CICE to run on WCOSS2 (due to library updates to allow linking).
From what I can tell, all updates needed were done by @HenryWinterbottom-NOAA, which were updates for CICE.

Fixes #2490

commit c44d0ac86cfdf78eb87492431bf6d825e8bae637
Author: GwenChen-NOAA <95313292+GwenChen-NOAA@users.noreply.github.com>
Date: Tue Jun 4 10:29:49 2024 -0400

Update wmo parm files to fix WMO header (#2652)

This PR updates wmo parm files to switch the WMO header of precipitation type products (CRAIN, CFRZR, CICEP, and CSNOW) from time-averaged to instantaneous.

Resolves #2566

commit 237d6dd213e8b1455d2f45dc5978fb2d3de93e60
Author: Cory Martin
Date: Tue Jun 4 13:55:33 2024 +0000

Add IAU to snow DA (and its test) (#2610)

This PR enables IAU for the snow DA, which is necessary for GFSv17. A snow analysis is created for the center of the window regardless, and an additional analysis at the beginning of the window is added if IAU is on. The former is needed for UPP and the latter to initialize the model. The increment is valid throughout the window for 3DVar, so the same increment is added to both forecasts. Additionally, the input file that goes into global_cycle has been updated to be the output of the JEDI snow analysis instead of the forecast (@jiaruidong2017 I recall discussing this, can you confirm this is right or am I mistaken). This PR also makes the CI test for snow DA (and aerosol DA) include IAU rather than run without it.

---------

Co-authored-by: Rahul Mahajan
Co-authored-by: Walter Kolczynski - NOAA

commit c92bf415060750127c9c05a62a1d2851c489551a
Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>
Date: Sat Jun 1 05:11:07 2024 +0000

Archiving cleanup (#2621)

1) Adds a lot of comments to the jinja templates for archiving
2) Rearranges the gdas and enkf templates into a more logical order
3) Fixes a couple of bugs in the enkf archiving of increments and analyses
4) Disables archiving for the half cycle
5) Removes the `FITSARC` key from `config.base` and `arcdir.yaml.j2`, instead relying on `DO_FIT2OBS`
6) Updates wxflow to add the option to disallow undefined variables when parsing jinja templates and invokes this feature when running archives

Resolves #2612

commit 12aa1e9cd2d159e2b303d2b33d6c79c365688eec
Author: Walter Kolczynski - NOAA
Date: Fri May 31 04:57:08 2024 -0400

Switch to Rocky 9 built external packages on Hercules (#2608)

The workflow was updated to use modules built on Rocky 9, but the external packages (like prepobs) were still pointing to the versions built on CentOS (Orion). This transitions to packages built on Rocky 9. Updating of the tracker package has been deferred until later. As such, the tracker jobs have been disabled by returning immediately if they are on Hercules. Since these jobs are small, resource-wise, it should not meaningfully impact turnover time.

commit 4422550c01c9214a2b3b8890bdcc898123ee216a
Author: Guoqing Ge
Date: Thu May 30 08:05:23 2024 -0600

Add the capability to use slurm reservation nodes (#2627)

Adds the capability to use slurm reservation nodes, and adds "ACCOUNT_SERVICE" for jobs to run in PARTITION_SERVICE.

Resolves #2626

commit a54153fd9d26126206bc07a1da4e80f50c0c5910
Author: Walter Kolczynski - NOAA
Date: Wed May 29 23:24:07 2024 -0400

Update forecast job to use COMIN/COMOUT (#2622)

NCO has requested that each COM variable specify whether it is an input or an output. This completes that process for the forecast job.
Refs #2451

---------

Co-authored-by: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>

commit d69a8af95d492982b918670322ed5c41ab074335
Author: Jessica Meixner
Date: Wed May 29 21:29:03 2024 -0400

Update to add 1-deg global wave grid (#2619)

This PR adds options to use a global 1-deg wave grid, intended for testing with the SFS application. Requires the new fix file changes in NOAA-EMC/global-workflow#2618.

commit 0b4670ecf83b99b72835c8380573b2bca7cf5324
Author: Jessica Meixner
Date: Wed May 29 17:17:21 2024 -0400

Add C384mx025_3DVarAOWCDA yamls (#2625)

Adds the C384mx025_3DVarAOWCDA yaml files for one experiment into a new GFSv17 folder.

commit 2e885d05c64b947f00a3cf055a1277fbfac195c9
Author: TerrenceMcGuinness-NOAA
Date: Wed May 29 13:00:51 2024 -0400

Script to keep Jenkins Agent persistent from cron (#2634)

This "persistent" Java Agent launch script can be run from a cron job:
- Uses the Jenkins Remote API to check the status of the Node connection using curl for a given machine.
- If it is not connected, a new agent is launched for that node.

Resolves #2633

commit bb58e064d8e82ce51802bd6064cfa84cae2cc4d5
Author: GwenChen-NOAA <95313292+GwenChen-NOAA@users.noreply.github.com>
Date: Tue May 28 17:17:11 2024 -0400

Change GRIB2 parameter names and vertical levels for ocean/ice post (#2611)

Based on users' feedback, this PR does the following:
1. Changes GRIB2 parameter names DLWRF -> NLWRF and DSWRF -> NSWRF
2. Changes the vertical level of ocean 3D variables (WTMP, SALIN, UOGRD, and VOGRD) from "%g m below water surface" to "%g m below sea level"
3. Rounds depth numbers to integers (e.g. 4481.0625 -> 4481 m)

Co-authored-by: Rahul Mahajan

commit e53c5e8e0abbc0edf95970a71df0e6e8a2be9f31
Author: DavidNew-NOAA <134300700+DavidNew-NOAA@users.noreply.github.com>
Date: Tue May 28 17:16:23 2024 -0400

Add atmensanlfv3inc job (#2592)

This PR creates the atmensanlfv3inc job, the ensemble version of atmanlfv3inc, created in GW PR #2420. Its GDASApp companion PR is #1104 (https://github.com/NOAA-EMC/GDASApp/pull/1104), and its JCB-GDAS companion PR is #3 (https://github.com/NOAA-EMC/jcb-gdas/pull/3).

commit 50c2b8951b29a3c883a778becbf8582f9519eb48
Author: Anil Kumar <108816337+AnilKumar-NOAA@users.noreply.github.com>
Date: Tue May 28 13:23:53 2024 -0400

Global-workflow (AR) Generic updates for Gaea C5 (#2515)

- Port global-workflow's build and run capability to Gaea-C5
- Building global-workflow on Gaea-C5
- Setting up experiments with global-workflow on Gaea-C5

---------

Co-authored-by: AnilKumar-NOAA
Co-authored-by: DavidBurrows-NCO <82525974+DavidBurrows-NCO@users.noreply.github.com>

commit b6ca771a0c584cbfcbbf9be739765d5f3815df97
Author: TerrenceMcGuinness-NOAA
Date: Fri May 24 10:52:45 2024 -0400

Update STMP and PTMP settings in host file for Orion and Hercules (#2614)

- Updates STMP and PTMP settings in the host files for Orion and Hercules because they are cross-mounted.
- Also took the opportunity to finally update **SLURM_ACCOUNT** to **HPC_ACCOUNT** in CI overrides.
- Added a refactor of the `rocotostat.py` tool that is more pythonic and has an execute-retry feature, because the `rocotostat` utility on Orion has been failing sometimes.
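The execute-retry idea mentioned for `rocotostat.py` could look roughly like this (a hedged sketch; the wrapper name, retry counts, and file names are illustrative, not the tool's actual implementation):

```python
import subprocess
import time

def run_with_retries(cmd, retries=3, delay=30):
    """Re-run a flaky external command a few times before giving up."""
    for _ in range(retries):
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode == 0:
            return result.stdout
        time.sleep(delay)
    raise RuntimeError(f"{cmd[0]} failed after {retries} attempts")

# rocotostat's -w (workflow XML) and -d (database) flags are standard.
stats = run_with_retries(["rocotostat", "-w", "gfs.xml", "-d", "gfs.db"])
```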
commit 7d2c539f45194cd4e5b21bfd4b83a9480189cd0f
Author: Guillaume Vernieres
Date: Tue May 21 23:50:50 2024 -0400

Sea-ice analysis insertion (#2584)

Allows cycling and restarting CICE with the sea-ice analysis if the marine DA is switched on.

Resolves #2568
Resolves NOAA-EMC/GDASApp#1103

commit 5369a1ff3a3969149fcf32810fad0e50216752b7
Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>
Date: Tue May 21 22:12:29 2024 +0000

Refactored archiving (#2491)

This provides a new pygfs task, archive.py, that provides all of the tools necessary to archive data to the local (`ARCDIR`) and backup (`ATARDIR`) archive directories. YAML-Jinja2 templates are provided to define the files to be archived or tarred, replacing the `hpssarch_gen.sh`, `exglobal_earc`, and `exglobal_archive.sh` scripts and making it easier to add new data and explicitly handle optional and required files.

For `ATARDIR` archiving, a master jinja template is provided for each `RUN` (i.e. master_gdas.yaml.j2, master_gfs.yaml.j2, master_enkf.yaml.j2). The master_enkf.yaml.j2 template is used for both `enkfgdas` and `enkfgfs` `RUN`s. These templates then include the appropriate `RUN`-specific jinja templates (e.g. gdas.yaml.j2) based on experiment, cycle, and coupled parameters. Each of these templates corresponds to a single tarball to populate, and they are tabbed 4 spaces so they are defined within the master `datasets` dictionary.

Future developers should not have to make modifications to archive.py unless archiving is being enabled for a new `RUN` (e.g. `gefs`), and then only a single `elif` needs to be added to the configure method to specify the master `ATARDIR` template to archive (e.g. `master_gefs.yaml.j2`). If a new component is coming online that needs to be archived to `ATARDIR` (e.g. SNOW), then create a new template for each `RUN` it needs to be archived for (e.g. `gdassnow.yaml.j2`) and reference the template in the appropriate master templates, e.g. in `master_gdas.yaml`:

```jinja
{% if DO_SNOW %}
{% include "gdassnow.yaml.j2" %}
{% endif %}
```

A few other issues were addressed along the way:
1. Aerosols have been reenabled. Aerosol forecasts should only be performed during gdas cycles, but analyses can be performed for both gfs and gdas cycles. This was accomplished by setting separate `AERO__CDUMP` variables to parse on for both `ANL` and `FCST` jobs.
2. Fixed the name of the `cice6_rst_ok` variable in `forecast_det.sh`. This prevented restarts from being used for cice-enabled experiments. This feature was not tested.
3. Created a temporary fix for the `wgrib` utility. For spack-stack 1.6.0, the `grib-util` module file does not declare `WGRIB`. An issue is open (https://github.com/JCSDA/spack-stack/issues/1097) to fix this in existing installations. Once complete, this temporary fix should be removed.
4. The number of `earc` jobs has been reduced for lower-resolution experiments. Both C48 and C96 experiments will now only have two earc jobs (one for the non-member files to archive and another for the member files). C192 will have up to 3 earc jobs (one non-member, one for members 1-40, and another for members 41-80, if needed).

Resolves #2345
Resolves #2318

---------

Co-authored-by: Walter Kolczynski - NOAA

commit 9aad86f27d37d19165b9a0b64cf70c7a4dd6362c
Author: TerrenceMcGuinness-NOAA
Date: Fri May 17 12:57:59 2024 -0400

Add remove-RUNDIRS step in CI before creating experiments (#2607)

As had been done in Bash CI, we need to remove the RUNDIR in Jenkins before creating an experiment, in the event that the case had previously been run.
commit 09333c01dbafddb2d2fe7e181b479af3cc6d3621
Merge: f7e9f4489 bb930050b
Author: Janet Derrico <143837053+jderrico-noaa@users.noreply.github.com>
Date: Thu May 16 14:33:19 2024 -0600

Merge branch 'NOAA-EMC:develop' into develop

commit bb930050b3cd51d28ecba6b231c8675f6d11856c
Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com>
Date: Thu May 16 12:28:30 2024 -0400

Adds jjob and other necessities for marine LETKF task (#2564)

Adds the jjob, rocoto script, config file, and other necessities for the new marine LETKF task.

Partially addresses NOAA-EMC/GDASApp#1091

commit 2c50fbde4d6cc3e53c55dca56925353a02fd1730
Author: TerrenceMcGuinness-NOAA
Date: Thu May 16 12:12:06 2024 -0400

Updating CI Machine configs with redundant PTMP (#2605)

Quick fix adding PTMP as STMP in the Machine configs for CI, for completeness.

commit ef340ff33a6f89adf70838206ba3fd56a953fa7a
Author: TerrenceMcGuinness-NOAA
Date: Thu May 16 11:37:30 2024 -0400

Fix race condition in CI between Orion and Hercules (#2604)

Hotfix to solve race conditions in the CI system due to cross-mounted file systems between Orion and Hercules.

commit e8b17e27f719df280170dc3f5bd9f19917cefaf2
Author: TerrenceMcGuinness-NOAA
Date: Wed May 15 17:16:30 2024 -0400

Remove existing EXPDIRs and COMROTs when CI is re-run (#2601)

Quick hotfix making the default for re-running jobs to start clean with new EXPDIRs and COMROTs.

commit b5d113efb1970ede5cd1d3d4dff8d96320519c41
Author: TerrenceMcGuinness-NOAA
Date: Wed May 15 16:52:49 2024 -0400

Moving logic for skipping hosts in pr cases (#2573)

This PR removes the logic of skipping hosts for pr cases from `create_experiment.py` and moves it to a test in the cron bash driver using a `parse_yaml.py` python tool. The Jenkins pipeline was not affected, as it uses the `get_host_case_list.py` utility to form the cases on a per-host basis.

Co-authored-by: Rahul Mahajan

commit 3cd0c68c0de9900bc7b73e1ed7621573dff5e916
Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com>
Date: Wed May 15 13:13:30 2024 -0400

Update gsi_utils.fd hash (#2598)

This PR updates the `gsi_utils.fd` hash to bring in updates which add safeguards to
- `src/EnKF/gfs/src/getsigensmeanp_smooth.fd/getsigensmeanp_smooth_ncep.f90`
- `src/EnKF/gfs/src/recentersigp.fd/recentersigp.f90`

The safeguards are described in GSI-utils PR #41 (https://github.com/NOAA-EMC/GSI-utils/pull/41) and the associated issue.

Resolves #2597

commit d5366c66bd67f89d118b18956fe230207cbf0aea
Author: Kate Friedman
Date: Wed May 15 13:12:56 2024 -0400

Update CICE and MOM6 fix versions (#2600)

This PR updates the CICE (`cice_ver`) and MOM6 (`mom6_ver`) fix versions to the newer `20240416` timestamps, which include updates and fixes to the `100` (1-deg) resolution files.

Resolves #2480
Resolves #2483
Resolves #2595

commit 6ca106e6c0466d7165fc37b147e0e2735a1d6a0b
Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com>
Date: Mon May 13 22:57:38 2024 +0000

Limit gfswavepostpnt to 40 PEs/node (#2588)

This fixes the slow runtime of the gfswavepostpnt job on Hercules. The job is very I/O intensive and does not scale well to large nodes, so limit the number of jobs/node to 40.

Resolves #2587

commit 4fb7c12c325702a47f27c802a5067efd33d0327c
Author: Fanglin Yang
Date: Mon May 13 16:37:51 2024 -0400

Update damping and time-step (#2575)

Updates the model to use explicit Rayleigh damping for u/v and implicit damping of w. This improves model stability and allows for longer timesteps. Also unifies the GDAS and GFS to use the same damping.
Results from a test at the C1152 resolution (coupled model) can be found at https://www.emc.ncep.noaa.gov/gmb/wx24fy/C1152/newdamp/

Resolves #2574

Co-authored-by: Walter Kolczynski - NOAA
Co-authored-by: Lisa Bengtsson
Co-authored-by: Rahul Mahajan

commit 6a9c1372ecce9e50e4f6e10e56f6e504cde1afe6
Author: TerrenceMcGuinness-NOAA
Date: Fri May 10 14:17:13 2024 -0400

Do not use BUILT_semaphore to force rebuilds when re-run (#2593)

Removes the placement of the `BUILT_semaphore` file after the build in the Jenkins Pipeline and forces it to rebuild any changes after a PR is re-run.

commit 2346c6161f75ae02369cbf30f30c6150d3e12b66
Author: Innocent Souopgui <162634017+InnocentSouopgui-NOAA@users.noreply.github.com>
Date: Thu May 9 21:17:06 2024 -0500

Migration to Rocky8 spack-stack installations on Jet (#2458)

# Description
Migrates Global Workflow to the Rocky8 spack-stack installations on Jet. Jet has moved from CentOS7 to Rocky8.

Resolves #2377
Refs NOAA-EMC/UPP#919
Refs NOAA-EMC/gfs-utils#60
Refs NOAA-EMC/GSI#732
Refs NOAA-EMC/GSI-Monitor#130
Refs NOAA-EMC/GSI-utils#33

commit c7b3973014480a20dd8e24edaeb83a9e9e68159f
Author: Jessica Meixner
Date: Thu May 9 11:36:58 2024 -0400

Updates for cold start half cycle, then continuing with IAU for WCDA (#2560)

This PR allows us to run C384 S2S with IAU, starting with the first half-cycle as a cold start. This will be necessary for cycled testing as we build towards the full system for GFSv17. It updates the copying of the restarts for RUN=gdas for both ocean and ice, copying what the atm model is doing. It also reduces the number of restart files from 4 to 3. Other updates:
* Adds DOJEDI ocean triggers for archiving certain files (update from @CatherineThomas-NOAA)
* Adds a COPY_FINAL_RESTARTS option to turn on/off copying the last restart file to COM. Defaults to off.
* Defines model_start_date_current_cycle and model_start_date_next_cycle to help with knowing which IC to grab.

Refs #2546

Co-authored-by: Rahul Mahajan

commit b405b7d3d11d384ce9fe3b9cd2180f315f7b38f2
Author: Dan Holdaway <27729500+danholdaway@users.noreply.github.com>
Date: Wed May 8 20:52:48 2024 -0400

Use JCB for assembling JEDI YAML files for atmospheric GDAS (#2477)

Changes the JEDI YAML assembly for the atmospheric GDAS to use the JEDI Configuration Builder (JCB) tool so that YAMLs can be made more portable and invoke the observation chronicle mechanism.

Resolves #2476

Co-authored-by: danholdaway
Co-authored-by: Walter Kolczynski - NOAA

commit 0cf0349c1f88048806e68ab58e93a3261b7a0e95
Author: Walter Kolczynski - NOAA
Date: Wed May 8 02:04:16 2024 -0400

Add CI test for products (#2567)

Adds a new version of the atm3DVar test that runs the full forecast length and produces most of the secondary products. For now, this test will only run on WCOSS due to gempak failures on other machines as well as computational needs. On other machines, the original version will run (the original version will not run on WCOSS). AWIPS remains off for now in this extended test due to a bug involving tocgrib2 and the convective precip fields (see #2566).

The new test runs for 4½ cycles and the full 384-hr forecast length to ensure all gempak scripts are exercised. Since the cycle throttle is 3 and the bulk of the time is in the free forecast, the cycles run mostly concurrently, so it doesn't extend the total test time much beyond that of a single 384-hr forecast.
Fixes a bug in NPOESS that was introduced when the post filenames were reverted to the previous format for the GOES products until the final filenames are determined (#2499). Also removes the AWIPS g2 job from the rocoto mesh to complete the retirement of grib1 products.

Resolves #2132
Resolves #2445

commit 9b6f8404ac4507d14adc404b77cfdf002b55e832
Author: Rahul Mahajan
Date: Tue May 7 00:14:36 2024 -0400

Add task to prepare emissions for GEFS (#2562)

This PR:
- introduces a task to prepare emissions for a forecast into the GEFS application
- adds configuration, j-job, rocoto job, ex-script and the python class for this job
- updates the GEFS workflow to be able to generate the XML to call this job
- updates the `fcst` and `efcs` job dependencies in the GEFS application to depend on `prep_emissions` if aerosols are turned ON
- provides a placeholder for @bbakernoaa to work on the details for preparing emissions

Co-authored-by: Walter Kolczynski - NOAA
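For context, new pygfs jobs like this one follow a Task-class pattern; a bare-bones sketch, assuming the wxflow `Task` base class and `logit` decorator (the class name and method bodies here are placeholders, not the actual prep_emissions implementation):

```python
from logging import getLogger
from wxflow import Task, logit  # wxflow Task base class used by pygfs tasks

logger = getLogger(__name__)

class PrepEmissions(Task):
    """Placeholder task that prepares emissions inputs for a GEFS forecast."""

    def __init__(self, config):
        super().__init__(config)

    @logit(logger)
    def initialize(self):
        pass  # stage input emissions data into the run directory

    @logit(logger)
    def run(self):
        pass  # process emissions (placeholder, per the PR)

    @logit(logger)
    def finalize(self):
        pass  # copy prepared emissions to COM
```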
commit 53b6764392cb4c0f3b4506ccd3f8cba5c5d2c56e Author: Walter Kolczynski - NOAA Date: Tue Apr 30 16:37:54 2024 -0400 Remove implicit symlink names (#2527) Lustre has a defect under Rocky 9 that results in symlink creation sometimes failing when the link name is not explicit. This updates all link creation to use explicit names. `config.base` is updated to turn off two monitor jobs on Hercules because the executables are not yet built there. This, combined with the previous change, should make the workflow available for use on Hercules. Also removes the redundant utility names for NCP, NLN, etc. in the gdas scripts that are already defined in `config.base`. Resolves #2131 Resolves #2522
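A minimal sketch of the distinction (the paths are hypothetical):

```bash
# Implicit link name, inferred from the source -- the form that could fail
# on Lustre under Rocky 9:
ln -sf "${FIXgfs}/orog/C768/C768_oro_data.tile1.nc" .
# Explicit link name, as the workflow now uses everywhere:
ln -sf "${FIXgfs}/orog/C768/C768_oro_data.tile1.nc" "./C768_oro_data.tile1.nc"
```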
commit f7e9f4489fc0b10830f621fdf149e8e3149d6a51 Merge: 67a4810c4 762f040a2 Author: Janet Derrico <143837053+jderrico-noaa@users.noreply.github.com> Date: Tue Apr 30 12:24:48 2024 -0600 Merge branch 'NOAA-EMC:develop' into develop commit 762f040a2045db17cdacd9026df9c4b8fd520156 Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Mon Apr 29 22:26:28 2024 -0400 Fixes sea ice archiving (#2541) Removes/changes sea ice output files that were failing to be added to the list for archiving, causing `gdasarch` to fail in WCDA cycling. Resolves https://github.com/NOAA-EMC/GDASApp/issues/1044 commit 2ecf4f86e0cbe59407ba2c4e105ee7292f1eda01 Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Mon Apr 29 22:25:57 2024 -0400 Link ensemble analysis increment files to COMROOT for warm_start (#2553) Scripting is added to `setup_expt.py` to link ensemble analysis increment files to COMROOT for warm_start experiments. Resolves #2552 commit 3a7abe1d63c573006e0e656237f1220002c3f579 Author: TerrenceMcGuinness-NOAA Date: Tue Apr 30 02:25:28 2024 +0000 Launch Multiple Platforms to Jenkins with polling (#2548) When launching Jenkins CI Tests, all requested RDHPCS machines can now be selected via the Ready label at once. This is all that is required since polling also works now on the controller end. Killing the jobs still needs to be done directly on the Jenkins Controller. Do not try to update the CI process using Labels. It is for launching only. Jenkins will update the labels as the states change. NOTE: When a case fails, the label will be updated to **Fail** for that system, but Jenkins will continue to report failures of subsequent cases until otherwise stopped in the Jenkins Controller directly. Do not update any labels to **Ready** until a push has been made to the repo of the PR, and the job has been completed or stopped directly on the controller first. You can do this at any time without having to wait until the other machines are completed. Co-authored-by: tmcguinness Co-authored-by: Rahul Mahajan commit 4b96c1237c562b67650c2d2fa015984c95b228eb Author: Jessica Meixner Date: Sat Apr 27 19:03:26 2024 -0400 Turn C48mx500_3DVarAOWCDA back on (#2543) This PR turns the C48mx500_3DVarAOWCDA test back on. This required a few bugfixes in the GDAS app which have now been merged. Resolves #2438 Resolves #2528 co-authors: @guillaumevernieres and @aerorahul who provided the bug fixes. commit 48489b4e7758e80660941279652431e49fdd3cfa Author: Guoqing Ge Date: Sat Apr 27 00:23:00 2024 -0600 Add option to link different orog/ugwd fix files for global nest (#2532) The global nest runs use a different set of tiles and need a different set of orog and ugwd fix files. This PR adds an option to link the correct fix files for the global nest. Resolves #2530 Resolves #2529 commit 7911f12ef0213f487a0681b260e4c982c946c17c Author: GwenChen-NOAA <95313292+GwenChen-NOAA@users.noreply.github.com> Date: Fri Apr 26 16:20:44 2024 -0400 Retire AWIPS GRIB1 products (#2547) This PR retires AWIPS GRIB1 products by deleting the files responsible for them. This PR does not remove the awips_g2 tasks from the rocoto task list; that will be addressed in a follow-up PR by workflow developers. Refs #2445 commit 93c853d464908a88222d5a6eeca686cd1a413c0e Author: James Jung Date: Fri Apr 26 13:16:47 2024 -0400 Add CADS use flexibility (#2540) The current design requires a script change to turn CADS on/off for specific instruments. The new design moves the on/off (true/false) flags to the config.anal file. Resolves #2538 commit 11bf141319ef2a29398742007a29d79bbf4439d5 Author: TerrenceMcGuinness-NOAA Date: Thu Apr 25 23:35:57 2024 +0000 Hot fix for bash CI on WCOSS2 (#2536) A couple of minor hotfixes needed for bash CI to work on WCOSS2. PR tested in bash on WCOSS with these changes Co-authored-by: tmcguinness commit f11bf3dc2bc062c070b901ce3aa808bd6b69a007 Author: Walter Kolczynski - NOAA Date: Tue Apr 23 16:00:51 2024 -0400 Fix comment indentation (#2526) Corrects comment indentation to satisfy PEP-8 complaints from pynorms. commit d0e1cc8456546904ce5895bf67aa626d2c41cce8 Author: Guoqing Ge Date: Tue Apr 23 11:17:15 2024 -0600 Add CCPP suite and FASTER option to UFS build (#2521) This PR updates related build scripts to (1) include the global-nest physics in the compilation step and (2) be able to use the -DFASTER=ON compile option through the "-f" command line switch. Resolves #2520 commit ee8cce593e645705c5aad926e9bd9e488b59ed3b Author: DavidNew-NOAA <134300700+DavidNew-NOAA@users.noreply.github.com> Date: Tue Apr 23 13:13:43 2024 -0400 New "atmanlfv3inc" Rocoto job (#2420) This PR, a companion to GDASApp PR [#983](https://github.com/NOAA-EMC/GDASApp/pull/983), creates a new Rocoto job called "atmanlfv3inc" that computes the FV3 atmosphere increment from the JEDI variational increment using a JEDI OOPS app in GDASApp, called fv3jedi_fv3inc.x, that replaces the GDASApp Python script, jediinc2fv3.py, for the variational analysis. The "atmanlrun" job is renamed "atmanlvar" to better reflect the role it now plays of running one of two JEDI executables for the atmospheric analysis jobs. Previously, the JEDI variational executable would interpolate and write its increment, during the atmanlrun job, to the Gaussian grid, and then the python script, jediinc2fv3.py, would read it and then write the FV3 increment on the Gaussian grid during the atmanlfinal job. Following the new changes, the JEDI increment will be written directly to the cubed sphere. Then, during the atmanlfv3inc job, the OOPS app will read it and compute the FV3 increment directly on the cubed sphere and write it out onto the Gaussian grid. The reason for writing first to the cubed sphere grid is that otherwise the OOPS app would have to interpolate twice, once from Gaussian to cubed sphere before computing the increment and then back to the Gaussian, since all the underlying computations in JEDI are done on the native grid. The motivation for this new app and job is that eventually we wish to transition all intermediate data to the native cubed sphere grid, and the OOPS framework allows us the flexibility to read and write to/from any grid format we wish by just changing the YAML configuration file rather than hardcoding. When we do switch to the cubed sphere, it will be an easy transition.
Moreover, the computations in the OOPS app will be done with a compiled executable rather than an interpreted Python script, providing some performance increase. It has been tested with a cycling experiment with JEDI on both Hera and Orion to show that it runs without issues, and I have compared the FV3 increments computed by the original and new codes. The delp and hydrostatic delz increments, the key increments produced during this step, differ by relative errors of 10^-7 and 10^-2 respectively. This difference is most likely due to the original python script doing its internal computation on the interpolated Gaussian grid, while the new OOPS app does its computations on the native cubed sphere before interpolating to the Gaussian grid. commit 3d0f643f84102d4aa2e254c902ea29eeebf1ea3f Author: TerrenceMcGuinness-NOAA Date: Tue Apr 23 15:37:25 2024 +0000 Hotfix to disable STALLED in CI as an error (#2523) This **_hotfix_** to the last CI updates disables the STALLED feature by not flagging it as an error until a more succinct algorithm is determined (i.e. the STALLED state will be on par with RUNNING). A second minor bug: in the event that no error log is produced when a Case fails, the error of the fail is still reported to the general user in GitHub. (Jenkins enabled users can always see the full comprehensive state of the system by following the links at the bottom of the PR). commit 3b208124f5aee8356021d6c7c5c8f5310cb47315 Author: Rahul Mahajan Date: Mon Apr 22 22:01:00 2024 -0400 Add restart on failure capability for the forecast executable (#2510) This PR: - enables restart capability of the forecast executable from a previous failure. - saves restarts during the run in a new `DATA` structure. The current `DATA` structure: ![current `DATA`](https://github.com/NOAA-EMC/global-workflow/assets/11394126/03383e2f-b7f8-43e0-8b78-c8f37a79ab84) is being replaced by: ![Screenshot 2024-04-19 at 12 55 44 PM](https://github.com/NOAA-EMC/global-workflow/assets/11394126/8ab6e6df-bbdb-43cf-b0dc-8e066f537ee7) where the colored boxes are described as: ![Screenshot 2024-04-19 at 12 56 14 PM](https://github.com/NOAA-EMC/global-workflow/assets/11394126/30b20e50-6cc8-4433-988a-02d5b484e7b5) - saves model output from `MOM6` and `CICE` within `MOM6_OUTPUT/` and `CICE_OUTPUT/` sub-directories. This is done to keep the run directory clean and easily identify component output. This PR also: - replaces link with copy. This enables the creation of a `DATA` directory that is self-contained and can be used to diagnose issues during failures. This is an NCO EE2 requirement and addresses part of an outstanding bugzilla. In the process of enabling the restart capability, functionality that does not depend on the outcome of `forecast_det.sh` is moved from `forecast_postdet.sh` to `forecast_predet.sh`. `forecast_det.sh` determines where the initial conditions will come from: `COM` in the case of a clean run, or `DATArestart` in the case of a `RERUN`. This should make it easier to separate **static** configuration and data (fix files, etc.) from **runtime** configuration (namelists, etc.) and data (initial conditions). Additionally, this PR: - adds 3 utility shell scripts in `test/`: - `nccmp.sh` - compare netCDF files using `nccmp` - `g2cmp.sh` - compare grib2 files using `wgrib2` - `f90nmlcmp.sh` - compare Fortran90 nml files using `f90nml` (Requires modulefiles to load the `py-f90nml` module on RDHPCS platforms) They are not used in the workflow, but are useful for users to compare files. Resolves #2273 Co-authored-by: Walter Kolczynski - NOAA
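A hedged sketch of the `forecast_det.sh` determination described above (directory names come from the PR text; the actual test and variable names are assumed):

```bash
# Sketch only: choose the IC source for the forecast. If saved restarts from
# a failed attempt exist under DATArestart, treat this as a RERUN; otherwise
# start cleanly from COM. The FV3_RESTART subdirectory is an assumed detail.
if compgen -G "${DATArestart}/FV3_RESTART/*" > /dev/null; then
  RERUN="YES"
  IC_SOURCE="${DATArestart}"   # hypothetical variable for this sketch
else
  RERUN="NO"
  IC_SOURCE="${COM}"           # hypothetical variable for this sketch
fi
```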
commit 1b6cef52e707b6417f5596415cb7d773e294565b Author: Kate Friedman Date: Mon Apr 22 15:42:19 2024 -0400 Update parm/transfer list files to match vetted GFSv16 set (#2517) This PR updates the `parm/transfer/*.list` files within `develop` to match the vetted set within the GFSv16 `dev/gfs.v16` branch. This completes [bugzilla #1383](http://www2.spa.ncep.noaa.gov/bugzilla/show_bug.cgi?id=1383) and duplicates cleanup already done in the `dev/gfs.v16` branch @ https://github.com/NOAA-EMC/global-workflow/commit/d466e830589682f4e5f5fa1e83f6f6526211b7d5. The sets now match - showing a comparison between the `dev/gfs.v16` `parm/transfer` folder contents and the PR branch set: ``` [Kate.Friedman@fe5 feature-bugzilla_1383]$ pwd /lfs4/HFIP/hfv3gfs/Kate.Friedman/git/feature-bugzilla_1383 [Kate.Friedman@fe5 feature-bugzilla_1383]$ diff -r ../dev-gfs.v16/parm/transfer/ parm/transfer/ [Kate.Friedman@fe5 feature-bugzilla_1383]$ ``` Resolves #2516 commit b5a7338dab6f4c61723f31a87f25a74bb1c407ff Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Mon Apr 22 15:40:58 2024 -0400 Update gdas_gsibec_ver to 20240416 (#2497) This PR updates `gdas_gsibec_ver` to use the default `qoption=1` when running `fv3jedi_var.x`. This is the recommended configuration when using the static GSI-B in `fv3jedi_var.x`. Resolves #2496 Resolves #2493 commit 36e2febf391da43d5f92ea9f7ab01fa4e69c93c9 Author: GwenChen-NOAA <95313292+GwenChen-NOAA@users.noreply.github.com> Date: Mon Apr 22 15:39:20 2024 -0400 Adding more cycles to gempak script gfs_meta_sa2.sh (#2518) This PR changes gfs_meta_sa2.sh from running on the 06 cycle only to running on all cycles. This change satisfies the NCO Bugzilla #1211 request to add a 00 cycle run in order to consolidate the GFS and UKMET gempak jobs. Resolves #291 commit d839a68db6bd86b169396bc1aae9fa195e410ca5 Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Mon Apr 22 13:37:20 2024 -0400 Update gsi_enkf.sh hash to 457510c (#2514) The `gsi_enkf.fd` hash is updated to bring in revisions to GSI source code file `src/gsi/correlated_obsmod.F90`. The order in which setup checks are performed has been modified to allow `gsi.x` to successfully run retrospective GFS v17 cases with correlated error active. Resolves #2507 commit 7f59229a159a3d1655b8f508ae4b374738410f4d Author: Guoqing Ge Date: Mon Apr 22 08:20:21 2024 -0600 Enable using the FV3_global_nest_v1 CCPP suite (#2512) The ufs-weather-model has included a new "FV3_global_nest_v1" CCPP suite which has been used by the UFS Atmospheric River community. This PR provides the functionality to enable using the "FV3_global_nest_v1" CCPP suite in the global-workflow. Resolves #2511 commit 1cfc8e5458cda3b909dd1c9839215845fd934308 Author: TerrenceMcGuinness-NOAA Date: Sat Apr 20 00:13:43 2024 +0000 CI Refactoring and STALLED case detection (#2488) These updates to the CI Framework do some bash refactoring and add python tools in order to effectively create the feature for detecting when a CI Case has an experiment that is in a state where it cannot advance, such as missing a requisite dependency: - Added a separate python script for checking the status of Rocoto-driven cases and integrated its use into the bash CI drivers and Jenkins, keeping state logic in one place.
- Added log publishing python utilities into the bash CI drivers as part of refactoring and consolidation of functionality - Updated Jenkins behavior while incorporating the above python codes for Rocoto state checking: - polling on PRs works, one update away from including multiple labels as well - Labels update to FAIL as soon as the first case fails; other cases continue until they complete or are killed by the user Resolves #2008 Co-authored-by: TerrenceMcGuinness-NOAA Co-authored-by: terrance.mcguinness Co-authored-by: terry mcguinness Co-authored-by: Walter Kolczynski - NOAA commit e4a552eae7803fde07847e9b0a2dc6e38e393713 Author: Jessica Meixner Date: Fri Apr 19 13:20:19 2024 -0400 Add C768 and C1152 S2SW test cases (#2509) Adds two yaml files for testing free-forecast S2SW at either C768 or C1152. ICs are on WCOSS2, Orion and Hera. Resolves #2502 commit 1f04a80511a43b97c93db56312c8d9bc9682ebf7 Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Thu Apr 18 16:38:47 2024 -0400 Fix paths for refactored prepocnobs task (#2504) # Description Cycled testing of the refactoring of prepocnobs revealed more issues with paths, which this PR should fix. Refs https://github.com/NOAA-EMC/GDASApp/issues/1047 commit c679d94ae25462603c2a418c5d9f5e216e47e013 Author: Rahul Mahajan Date: Wed Apr 17 12:06:20 2024 -0400 Add rocoto `sh` tag, script to check netcdf file and apply this to check ocean output (#2484) This PR: - adds a Rocoto dependency tag that executes a shell command. The return code of the shell expression serves as a dependency check - adds a script that executes `ncdump` on a netCDF file. If the file is a valid netCDF file, the return code is 0, else it is non-zero - combines the above 2 to use as a dependency check for MOM6 output. If the model is still in the process of writing out the ocean output, rocoto will execute the shell script and gather the return code. This PR also: - changes permissions on some `ush/` scripts that did not have executable permissions. Resolves #2328
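A minimal sketch of such a validity-check script (the name and argument handling are assumed), suitable for use behind the new `sh` dependency tag:

```bash
#!/usr/bin/env bash
# Sketch: exit 0 only when the file is a complete, readable netCDF file.
# While the model is still writing, ncdump fails, the script returns
# non-zero, and the rocoto dependency remains unsatisfied.
ncfile=${1:?"usage: $0 /path/to/file.nc"}
if ncdump -h "${ncfile}" > /dev/null 2>&1; then
  exit 0
fi
exit 1
```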
commit b0c2a30cd4c3a7c427c4d5bfda945272ad7af166 Author: WenMeng-NOAA <48260754+WenMeng-NOAA@users.noreply.github.com> Date: Tue Apr 16 19:30:44 2024 -0400 Revert file name changes in off-line post (#2499) This PR addresses inconsistent file naming conventions in both inline post and offline post. Resolves #2191 commit 9052ec4a1a0c6a6af4207ed6c96d7d651e67620a Author: Walter Kolczynski - NOAA Date: Tue Apr 16 16:53:43 2024 -0400 Add mean/spread for atmos grib2 (#2482) Adds the basic mean and spread grib2 products for the atmosphere. The new mean/spread files are named according to the new convention, though `pres_${grid_type}` may be changed to `pgrb_${grid_type}`. Either way, the input filenames will need to be updated when that change is made. For now, the mismatch means the unlettered grib files result in `pres_`. Resolves #2296 Refs #1522 commit b7e5b6ef76d13270c56ae5b919c4334084d3d569 Author: TerrenceMcGuinness-NOAA Date: Tue Apr 16 17:59:23 2024 +0000 Updated and tested CI Bash for WCOSS2 (#2481) Updates to the driver scripts of the CI framework to work on WCOSS with Cron Bash. Resolves #2268 commit 28b840cfd69e1af490f0c93b25694a5bfbba427f Author: Walter Kolczynski - NOAA Date: Mon Apr 15 14:07:42 2024 -0400 Update fbwind for COM refactor (#2479) Updates the fbwind job for the COM refactor and some other cleanup. This works on WCOSS but not on Orion. There seems to be a problem with either `grbindex` or `GETGB()` on Orion that causes the executable to be unable to read the grib1 index file. The grib1 data file produced there seems fine. Haven't checked Hera yet; maybe there is a problem with the spack-stack build of `grbindex`. Resolves: #2160 Refs: #289 commit 7e35a6649cdf2d9aca656a820da23dbee2ea8663 Author: WenMeng-NOAA <48260754+WenMeng-NOAA@users.noreply.github.com> Date: Mon Apr 15 12:55:17 2024 -0400 Update parm files for atm product (#2486) This PR updates parm files for gfs pgrb generation to support GFS HR4. Resolves #2485 commit 80e46842b0dfc2ad45b0b6b2e6b13338004de7e2 Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Fri Apr 12 16:08:47 2024 -0400 Fix paths in prepocnobs task (#2459) This PR fixes the path for exglobal_prep_ocean_obs.py in jobs/JGLOBAL_PREP_OCEAN_OBS and the associated config file Fixes https://github.com/NOAA-EMC/global-workflow/issues/2353, replaces PR https://github.com/NOAA-EMC/global-workflow/pull/2356 commit 36f3841d21b1550343e728842cd9a84af68e58a2 Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Fri Apr 12 10:39:37 2024 -0400 Add oceanalecn to workflow generator (#2409) Adds task oceanalecn to the workflow generator, plus some necessary env var work to the jjob. The workflow generator adds oceanalecn only if ~~`nens > 0`~~ `self.do_hybvar`. Resolves https://github.com/NOAA-EMC/GDASApp/issues/912 commit 8edf94a0c7e12566b5a339f7df1d160a8d4081cc Author: James Jung Date: Thu Apr 11 12:34:18 2024 -0400 Add support for CADS IR cloud detection scheme in the GSI (#2478) A new infrared cloud detection scheme, "CADS", was added to the GSI. These changes allow the use of this new scheme within the global-workflow. Setting the namelist variables to .true. (e.g. cris_cads=.true.) will invoke the new scheme for that instrument. Resolves #2473 commit 6d40dbf2c44de05aab325ad842feca2976d84c60 Author: Kate Friedman Date: Wed Apr 10 10:55:57 2024 -0400 Improve error messaging to resolve bugzilla (#2468) The error messaging for fatal errors in ush/forecast_postdet.sh is improved to add "FATAL" where missing and state the file that is missing (when not already mentioned). Correction of the error message mentioned in the bugzilla is included in this. Bugzilla 1374 is resolved with this commit. Resolves #1253 commit ddd91b146270862beb20d2a83c4da0394efc31a2 Author: Jiarui Dong Date: Tue Apr 9 14:10:18 2024 -0400 Check the DO_JEDISNOWDA condition before adding snow DA analysis to the list (#2471) This PR adds a check of the DO_JEDISNOWDA condition before adding the snow DA analysis to the archive list. Resolves #2469 commit 0237dba612a18c64eb513b6e186701cf37a1a5ef Author: Rahul Mahajan Date: Mon Apr 8 09:41:34 2024 -0400 Flip the build for GFS and GEFS with waves (#2462) This PR: - builds the unstructured waves as default when invoked with `build_all.sh`. - updates the CI testing infrastructure to handle this change. - updates the documentation to reflect this change in behaviour. To build with the structured grid for waves, the flag `-w` needs to be passed to `./build_all.sh`. Previously, this flag would trigger the unstructured grid. Since GFSv17 will move to the unstructured grid as the default for waves, this makes it easier for a majority of developers. Resolves #2461
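In practice, per the description above:

```bash
# Unstructured wave grid is now the default build:
./build_all.sh
# The structured wave grid now requires -w (which previously selected the
# unstructured grid instead):
./build_all.sh -w
```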
commit 4f0f7730dc2af0351a34263a6bbb80bd32aa2cc6 Author: Andrew Collard <40322596+ADCollard@users.noreply.github.com> Date: Fri Apr 5 19:39:17 2024 -0400 Add new data sources used in GFS v16.3.12 (#2283) Changes to `scripts/exglobal_atmos_analysis.sh` to include new data that are being operationally assimilated in GFS v16.3.12. **Note that this requires CRTM v2.4.0.1** commit 59cdc0ee81926ee8dc7b8e544337bfc85130ad18 Author: Guillaume Vernieres Date: Fri Apr 5 09:55:49 2024 -0400 The soca fix path is needed in config.prepoceanobs (#2460) # Description Gives the prepoceanobs task access to SOCA_INPUT_FIX_DIR, which will be needed for some of the obs processing. # How has this been tested? Ran the ocean obs prep job commit 90502b33585971617b2de5df18c91c740a2689a0 Author: CatherineThomas-NOAA <59020064+CatherineThomas-NOAA@users.noreply.github.com> Date: Thu Apr 4 19:14:51 2024 -0400 Turn off reducedgrid in the EnKF (#2456) The reducedgrid feature no longer results in reduced runtime for the current EnKF configuration. This feature also cannot be used in soil DA due to masked fields. It will be set to .false. by default. Resolves #2455 commit 108ce2a83d8dc7279838cba9c3e2ef81623d46d5 Author: Walter Kolczynski - NOAA Date: Thu Apr 4 13:31:22 2024 -0400 Rename generate_com to declare_from_tmpl (#2453) # Description The `generate_com` function is renamed to `declare_from_tmpl`. NCO requested the change, and the new name more accurately represents the functionality. Resolves #2344 commit c54fe98c4fe8d811907366d4ba6ff16347bf174c Author: Walter Kolczynski - NOAA Date: Tue Apr 2 15:58:49 2024 -0400 Move bash utility functions out of preamble (#2447) The preamble was accumulating a bunch of utility functions. These functions are now moved to a separate file that is sourced by the preamble. The only functions remaining in the preamble are those related to script control and logging (`set_trace()`, `set_strict()`, `postamble()`). Resolves #2346 commit 4a39c8afc0555a8f2d621efb55589b9b309a416c Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Tue Apr 2 18:00:21 2024 +0000 Reenable the minimization monitor on Hera (#2446) This allows the minimization monitor to run on Hera Rocky 8. A missing perl module was added (List/MoreUtils.pm), but it had to be installed under the perl/5.38.0 installation, so that module needs to be loaded. Resolves #2439 commit 0eaa53771b5e8d476d3b5feabd3181c8dc48629a Author: Travis Elless <113720457+TravisElless-NOAA@users.noreply.github.com> Date: Tue Apr 2 00:31:14 2024 -0400 Fix rotating member bugs (#2443) When PR #2427 introduced the rotating subset of member guess states for the early-cycle EnKF, the rotating member calculation function was omitted from the DOSFCANL_ENKF if block in the enkf surface script. This PR adds this feature to that section. This PR also removes hard-coded values from this function, replacing them with a variable equal to the number of late-cycle members. Resolves #2441 commit 39ba9d720c38ac85239a1eb1696c78df82396644 Author: CatherineThomas-NOAA <59020064+CatherineThomas-NOAA@users.noreply.github.com> Date: Mon Apr 1 17:48:42 2024 -0400 Remove the reset of upper layer humidity (#2449) # Description The parameter "nudge_qv" resets the upper layer humidity to HALOE climatology when cold starting. This parameter has been set to ".true." but is no longer needed when using v16+ ICs and will now be set to ".false." by default. Resolves: #2448
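The change amounts to flipping a single namelist default; a sketch of the setting (the namelist group and the way the workflow templates it in are assumptions):

```bash
# Sketch only: nudge_qv is the flag named in the PR; its namelist group is
# assumed here for illustration.
cat >> input.nml << EOF
 &gfs_physics_nml
   nudge_qv = .false.
 /
EOF
```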
commit 7f6bf216566e92bbe072ebe4b64d26cc60fb53f1 Author: Jiarui Dong Date: Mon Apr 1 13:08:33 2024 -0400 Archive the snow DA analysis into HPSS (#2414) This PR adds the capabilities to archive the snow analysis output into HPSS. Changes are made to archive the snow stats, the letkfoi yaml file, and the snow analysis into HPSS. commit c1b11a2559e618f61866498b5dad503ba74d8332 Author: Neil Barton <103681022+NeilBarton-NOAA@users.noreply.github.com> Date: Mon Apr 1 13:05:57 2024 -0400 Add GEFS ENS Atmos options (#2392) This PR adds the FV3 atmos perturbation options when running GEFS. This is needed for GEFS reforecasts and GEFS operational forecasts. This PR continues to address the below issues #1720 #1921 Co-authored-by: Rahul Mahajan Co-authored-by: Walter Kolczynski - NOAA Co-authored-by: Bing Fu <48262811+bingfu-NOAA@users.noreply.github.com> commit 67a4810c4869741912622ee2ef58678ed8848547 Merge: 10e817614 834ce3134 Author: Janet Derrico <143837053+jderrico-noaa@users.noreply.github.com> Date: Fri Mar 29 12:15:41 2024 -0600 Merge branch 'NOAA-EMC:develop' into develop commit 834ce31348a627e14d448cdbe33d4ec0dabe99e4 Author: Walter Kolczynski - NOAA Date: Fri Mar 29 13:17:08 2024 -0400 Refactor gempak jobs for new COM and style (#2374) Updates the gempak jobs to fit the new COM structure while also refactoring them somewhat to improve the style. Despite these technical changes, the overall structure is left unchanged for most scripts, though some have been rewritten to make the needed changes easier. Some of these scripts had already been updated somewhat in the original COM refactor and thus needed fewer updates. Style updates include converting all gempak scripts to bash, making them shellcheck compliant, and removing trailing whitespace. Further refactoring to improve maintainability will be needed in the future (see #2341, #2342, #2343, #2348). The GFS gif scripts were identical except for the forecast hour, so they are collapsed down into two: one for f000 and one for other forecast hours. The gempak executables have short path limits. To get around this without having the gempak module recompiled, target directories (mostly relevant for the gempak meta jobs) are symlinked into the working directory to drastically reduce the path lengths. Part of this update includes replacing existing MPMD calls with the new standard `ush/run_mpmd.sh` script. A new function, `wait_for_file()`, is introduced to standardize waiting for a file to be available (see the sketch after this commit). Gempak forecast hours are often hard-coded within scripts. In addition to issues with maintainability, this causes problems for shorter forecasts, such as we typically run for testing purposes. For now, we simply check the values against the forecast length and reduce if necessary. Future work (#2348) will be needed to replace these hard-coded values with variables set in the config file (or just update gempak products to match standard output time variables). One-degree gempak files have been updated to include `1p00` in the filename. Several gempak job dependencies are corrected. Fake gempak data for external models is being staged on tier-1 machines to allow testing. **Output has not been verified.** Future PRs will likely be needed to bring full functionality online. Resolves #2158 Resolves #2152 Resolves #2151 Resolves #2249 Resolves #2247 Refs #2157 Refs #2348
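A hedged sketch of a `wait_for_file()`-style helper (the real signature, defaults, and return conventions may differ):

```bash
# Sketch only: poll for a file until it exists (non-empty) or we time out.
wait_for_file() {
  local file=${1:?} sleep_interval=${2:-10} max_tries=${3:-30}
  local try
  for (( try = 1; try <= max_tries; try++ )); do
    [[ -s "${file}" ]] && return 0
    sleep "${sleep_interval}"
  done
  echo "FATAL ERROR: ${file} not available after $(( sleep_interval * max_tries ))s" >&2
  return 1
}
```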
commit 20635b0639656769842218d544ec7ce2436337c5 Author: TerrenceMcGuinness-NOAA Date: Fri Mar 29 08:15:29 2024 +0000 Turn GEFS CI test on for Hera (#2442) Re-enabling the gefs case to test gefs system builds in Jenkins on Hera. Co-authored-by: TerrenceMcGuinness-NOAA Co-authored-by: terrance.mcguinness Co-authored-by: Walter.Kolczynski commit ba6a9d5fa6a079b1e3fdd424a493252bbf499c5d Author: Walter Kolczynski - NOAA Date: Thu Mar 28 17:34:22 2024 -0400 Modify APP based on RUN (#2413) There is a need to change which coupled components are on depending on the current `RUN`. To facilitate this, the `APP` is modified prior to the setting of the `DO_` variables based on `RUN`, turning off components as desired. This new system also replaces the `DO__ENKF` switches that were formerly used to turn components off for the ensemble. Also expands the allowed apps for cycled mode to include S2SWA. Resolves #2318
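A sketch of the control flow described (the specific RUN values and trimming rules here are assumptions for illustration):

```bash
# Sketch only: trim APP per RUN before the DO_ switches are derived from it.
case "${RUN}" in
  enkfgdas | enkfgfs)
    APP="${APP%WA}"   # hypothetical: e.g. S2SWA -> S2S for the ensemble
    ;;
esac
# DO_WAVE, DO_AERO, etc. are then set from the (possibly reduced) APP.
```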
commit 3ff7a92c25564ddf984cb09cb5667ae8fafe01a0 Author: TerrenceMcGuinness-NOAA Date: Thu Mar 28 17:48:00 2024 +0000 Fix post log arg check and don't create build semaphore (#2440) Two hotfixes to the latest Jenkins updates: 1. Logic fixed in checking for mutually exclusive use of gists and repo for publishing error files 2. Removed creation of the build success file semaphore, forcing complete rebuild as the default behavior for reruns/restarts Co-authored-by: TerrenceMcGuinness-NOAA Co-authored-by: terrance.mcguinness commit d6be3b5c3a1b8fd025a303b40e0660e2914906a7 Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Wed Mar 27 20:56:18 2024 -0600 Update global-workflow and subcomponents to Hera/Rocky 8 partition (#2421) This PR addresses issue #2329. The following is accomplished: - All submodule RDHPCS Hera stacks are updated to be compatible with the Rocky-8 distro spack-stack; - The global-workflow version files `versions/build.hera.ver` and `versions/run.hera.ver` are updated for Rocky-8; - All submodule hashes have been updated to be compliant with the Rocky-8 distro spack-stack (see the reference PRs below); - Update to `parm/config/config.base` is made for not-yet-compliant packages; - Relevant updates are made to `modulefiles/module_base.hera.lua` and `modulefiles/module_gwsetup.lua`. Resolves #2329 Refs: [#958](https://github.com/NOAA-EMC/GDASApp/issues/958) [#49](https://github.com/NOAA-EMC/gfs-utils/issues/49) [#124](https://github.com/NOAA-EMC/GSI-Monitor/issues/124) [#31](https://github.com/NOAA-EMC/GSI-utils/issues/31) [#2167](https://github.com/ufs-community/ufs-weather-model/issues/2167) [#2143](https://github.com/ufs-community/ufs-weather-model/issues/2143) [#913](https://github.com/ufs-community/UFS_UTILS/issues/913) Co-authored-by: Rahul Mahajan Co-authored-by: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> commit 47302153f13f6b23539be841b78ed78664599c08 Author: Travis Elless <113720457+TravisElless-NOAA@users.noreply.github.com> Date: Wed Mar 27 21:14:57 2024 -0400 Add a rotating subset of members for early-cycle enkf (#2427) The early-cycle EnKF needs the ability to run with fewer members than the late-cycle due to operational resource constraints. Because of this requirement, the introduction of a rotating subset of member first-guess states used by the early-cycle ensemble is also needed in order to preserve the rotating member initial condition functionality currently used by the GEFS. Co-authored-by: Travis J Elless Co-authored-by: travis elless commit 94c282ef6fdcd47076e932bcadb5bdd55236aa05 Author: TerrenceMcGuinness-NOAA Date: Wed Mar 27 20:28:15 2024 +0000 Uploading error logs to GitHub from Jenkins CI Runs (#2429) This PR enhances the user experience within GitHub when errors occur during the building and running of CI cases, reporting from within the PR messages. This is done by uploading the error logs to GitHub Gists and then publishing the links to them along with the full paths of the logs on disk. This PR adds the python Class **GitHubPR** in `${HOMEgfs}/ci/scripts/utils/githubpr.py` by inheriting the GitHub Class from **pyGitHub**. We use this module to introduce a helper python utility that can publish a list of log files into a GitHub Gist and/or the designated branch **error_logs** in the **emcbot** repo **ci-global-workflows** for storing error log files for review from any git-configured terminal. This upload feature also creates persistence of errors over time. Also, the `build_all.sh -k` script has been updated to support a "quick kill" feature (thanks David) that stops the parallel builds whenever one fails and creates an error_log file that has the paths to the error files, which are also uploaded and published in the PR messages in GitHub. Co-authored-by: TerrenceMcGuinness-NOAA Co-authored-by: terrance.mcguinness Co-authored-by: DavidHuber Co-authored-by: Walter Kolczynski - NOAA commit 6c5065e2e83a45b14505e7575aa4500482ef7452 Author: Rahul Mahajan Date: Wed Mar 27 03:39:16 2024 -0400 Add option to use traditional threading in the UFS (#2384) # Description This PR: - adds the option of running the ufs-weather-model with traditional threading in addition to ESMF-managed threading. See the new toggle `USE_ESMF_THREADING=YES|NO` set in `config.fcst` - does not change the current default of using ESMF-managed threading. Traditional threading use might need a little more fine-tuning for the job-card specification. This will be achieved when the UFSWM RT completely switches over to traditional threading - updates the hash of the ufs-weather-model to the PR https://github.com/ufs-community/ufs-weather-model/pull/2172 Resolves #2277 In addition to the above stated objectives, this PR also addresses open issues. In particular, this PR: - adds a newline at the end of `diag_table_aod`. Fixes #2407 @zhanglikate - reserves more memory on WCOSS2 for offline UPP when running at C768. Fixes #2408 @WenMeng-NOAA Co-authored-by: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> commit bc1c46dfd7393c5164abcdc2dfa76a9c4bc834b8 Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Tue Mar 26 22:27:30 2024 -0400 Correct GDASApp paths (#2435) The changes in this PR - account for changes in the GDASApp directory structure - generalize how the path to the GDASApp python ioda library is specified Resolves #2434 commit f0b912be6f2cf2fac590272253f19cb082fbf5f2 Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Mon Mar 25 21:46:32 2024 +0000 Fix *earc jobs where the number of members isn't a multiple of 10 (#2424) This limits the earc search for ensemble members to the maximum number of members, which prevents attempting to send non-existent members to HPSS if the number of ensemble members is not a multiple of 10. Resolves #2390
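A sketch of the bounded member-group arithmetic that fix implies (variable names and grouping conventions assumed):

```bash
# Sketch only: archive members in groups of 10, clamping the last group to
# the actual ensemble size so non-existent members are never requested.
NMEM_ENS=${NMEM_ENS:-25}   # assumed variable for the ensemble size
group_size=10
for (( first = 1; first <= NMEM_ENS; first += group_size )); do
  last=$(( first + group_size - 1 ))
  (( last > NMEM_ENS )) && last=${NMEM_ENS}
  echo "archiving mem$(printf '%03d' "${first}") .. mem$(printf '%03d' "${last}")"
done
```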
commit daeb0c855017f8ffd6f06870744b825b276097f3 Author: TerrenceMcGuinness-NOAA Date: Fri Mar 22 17:40:04 2024 +0000 hotfix to update full path to error logs on CI case fail (#2425) This hotfix PR prepends the full path to the error logs on disk so they are communicated correctly in the GitHub message to the PR being processed when a case fails. commit 50f75526549245f2b5d984cdb44e402852e086ec Author: YaliMao-NOAA <53870326+YaliMao-NOAA@users.noreply.github.com> Date: Thu Mar 21 16:58:25 2024 +0000 Add WAFS jobs, scripts and ush to GFS v17 workflow repository (#2412) # Description This PR adds WAFS jobs, scripts and ush to the GFS v17 workflow repository --------- Co-authored-by: yali mao Co-authored-by: Rahul Mahajan Co-authored-by: yali mao commit 03ba78ae3df589211d2776254c6e8584ecdc226f Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Thu Mar 21 13:33:50 2024 +0000 Hotfix: send the correct number of build jobs for the UFS (#2423) This fixes a bug in build_ufs.sh that was causing the UFS to always build with 8 jobs (except on the cloud). commit 4d1bf5266f00b35778ea47896f438e3ef612628d Author: Kate Friedman Date: Wed Mar 20 14:00:31 2024 -0400 Updates to RTD documentation (#2418) Updates to the RTD documentation include: - Textual updates - Better definition of the GDA subfolder structure - Added note about GDAur having been discontinued - Adjust copyright and author information - Fix Git version table and update its contents - Add "Table of Contents" header before the table of contents on the front page - Add AWS ICs path to init page - Add link to UFS_UTILS gdas_init RTD documentation - Add note about bash to the `gw_setup.sh` section and added a warning block Refs #2395 commit afe874ee8b28942e459796cd1005ec598458a5b7 Author: Neil Barton <103681022+NeilBarton-NOAA@users.noreply.github.com> Date: Tue Mar 19 14:35:02 2024 -0400 Add GEFS Ocean Perturbation Options (#2385) This PR adds the MOM6 ocean perturbation options when running GEFS. This is needed for GEFS reforecasts and GEFS operational forecasts. This PR continues to address the below issues https://github.com/NOAA-EMC/global-workflow/issues/1720 https://github.com/NOAA-EMC/global-workflow/issues/1921 Fixes #2403 commit fa855baa851b0cb635edd1b9ae1bfed5112d41e5 Author: Clara Draper <33430543+ClaraDraper-NOAA@users.noreply.github.com> Date: Mon Mar 18 12:49:14 2024 -0600 Add initial GSI-based soil analysis capability (#2263) First set of changes for adding the new soil analysis, from the assimilation of screen-level T and q. The changes here enable the screen-level observations to be assimilated in the Hybrid (Var and EnKF) update, and the soil temperature and soil moisture updates to be made in the EnKF only. The functionality is turned on by setting GSI_SOILANAL to YES in config.base. Resolves #1479 commit e9700d84b521907ee23e1584712f80e25e60f08e Author: Jessica Meixner Date: Mon Mar 18 09:32:13 2024 -0400 re-enable ci/cases/pr/C48mx500_3DVarAOWCDA.yaml (#2405) Updates to re-enable the C48mx500_3DVarAOWCDA CI test after it was disabled in https://github.com/NOAA-EMC/global-workflow/pull/2371 Fixes github.com/NOAA-EMC/global-workflow/issues/2404 commit 3ccffeee120340ab580fc9d96b552970c9f42a8f Author: Rahul Mahajan Date: Mon Mar 18 09:30:35 2024 -0400 Parse jediyaml only once (#2387) `JEDIYAML` was being parsed 3 times: once in `get_obs_dict`, a second time in `get_bias_dict`, and a third time in `initialize` for the specific component analysis task.
This PR: - eliminates the duplications and constructs the `jedi_config` dictionary just once. The dictionary is written out before calling the executable. - updates the hash to gdasapp - updates configs for snow, aerosol, atmvar and atmens JEDI-DA to include `JEDI_FIX_YAML` and `CRTM_FIX_YAML`. This allows greater flexibility and control over the contents of these fix data sets to be copied into the run directory. - combines snowDA and aerosolDA into a single test Co-authored-by: Cory Martin Co-authored-by: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> commit a5f24951c5bef142747c6f6cc6abd474f0b53ac2 Author: CatherineThomas-NOAA <59020064+CatherineThomas-NOAA@users.noreply.github.com> Date: Fri Mar 15 14:47:54 2024 -0400 Fix ensemble archive groups to include all members (#2402) The number of groups used in the ensemble archive step (earc) needs to include a task for the ensemble stat files such as the mean and the spread, resulting in `n_groups+1` tasks for `earc`. Resolves: #2390 commit 056cfdca9e7fd7426a315fcbffc38d8ee2891212 Author: TerrenceMcGuinness-NOAA Date: Fri Mar 15 18:30:23 2024 +0000 GitHub message error paths (#2401) Adds a feature to post the paths to the error logs of failed experiments in GitHub messages to the PR from Jenkins. commit d897ee4936d62160811d936248c8555187f81b65 Author: Rahul Mahajan Date: Thu Mar 14 12:37:49 2024 -0400 Missed a comma from the hotfix this AM (#2399) This PR is a hotfix to the hotfix from earlier this AM. A comma was missing. commit 906540acacf1b2ce4c0489d0d9d4913f53a4e8ad Author: Rahul Mahajan Date: Thu Mar 14 10:40:03 2024 -0400 Fix KeyError issue in ocean/ice postprocessing job. (#2398) The j-job for ocean and ice post-processing only defines the component-specific history and grib directories. This causes an error in the ex-script, which tries to pull keys for both ocean and ice. This PR fixes that. It is surprising this has not caused failures before today Co-authored-by: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> commit c27f243ce24c60261718f15628f6ab30d7b09f7b Author: Rahul Mahajan Date: Wed Mar 13 13:31:57 2024 -0400 Remove documentation about generating ICs using global-workflow (#2397) Removes instructions on generating ICs using global-workflow and directs the user to use ufs-utils. commit 10e817614a9cba9858ee47c7bdfb7f7ae606558c Merge: 0ad4eb403 0edbdc197 Author: Janet Derrico <143837053+jderrico-noaa@users.noreply.github.com> Date: Tue Mar 12 15:12:20 2024 -0600 Merge branch 'NOAA-EMC:develop' into develop commit 0edbdc197441582f9c402da0f3a7f144b88960c7 Author: DavidNew-NOAA <134300700+DavidNew-NOAA@users.noreply.github.com> Date: Tue Mar 12 16:30:45 2024 -0400 Changed config.atmanl to allow non-hybrid background error yamls (#2394) # Description Makes a change so that if DOHYBVAR equals "NO", then the JEDI background error yaml is set to staticb_${STATICB_TYPE}.yaml.j2 rather than hybvar_${STATICB_TYPE}.yaml.j2. This allows GDAS to run without hybvar, which may be necessary for development purposes. This is all accomplished by a simple switch in config.atmanl.
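A sketch of what such a switch in config.atmanl could look like (the variable receiving the yaml name is assumed; the template names come from the PR text):

```bash
# Sketch only: select the background-error yaml template by DOHYBVAR.
if [[ "${DOHYBVAR}" == "YES" ]]; then
  export BERROR_YAML="hybvar_${STATICB_TYPE}.yaml.j2"   # BERROR_YAML is assumed
else
  export BERROR_YAML="staticb_${STATICB_TYPE}.yaml.j2"
fi
```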
commit ccb1f528489e740bb2adc4958146552323dd8709 Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Tue Mar 12 10:45:39 2024 -0400 Add JEDI atmosphere only CI (#2357) The PR contains a minimal set of changes to enable JEDI atmospheric DA CI testing. Prototype JEDI atmospheric cycling has begun. The JEDI atmosphere DA CI case provides an automated way to see if future PRs impact JEDI atmospheric cycling. Resolves #2294 Dependency: GDASApp PR [#937](https://github.com/NOAA-EMC/GDASApp/pull/937) commit b96f5ebbb1968bd539336652b87a2faa8ce68fd4 Author: Kate Friedman Date: Tue Mar 12 07:58:40 2024 -0400 Add switch to control `debug=true` on WCOSS2 for development testing (#2388) Adds a switch (`DEBUG_POSTSCRIPT`) to control whether `debug=true` is set when submitting development rocoto jobs to PBS schedulers (currently just WCOSS2). There isn't an equivalent flag to set for SLURM on the RDHPCS. This new switch has been added to the documentation. Refs #619 commit 02d650500353663d0b193ef14003897daa5dd86c Author: TerrenceMcGuinness-NOAA Date: Mon Mar 11 21:51:16 2024 +0000 Rewrote pr_list_database.py to use wxflow's SQLiteDB Class (#2376) This PR updates the `pr_list_database.py` code to use the **wxflow** SQLiteDB Class: - Improved the code's readability - Uses better code style, matching the project's software culture - Better docstring standards Co-authored-by: tmcguinness Co-authored-by: Rahul Mahajan Co-authored-by: Walter Kolczynski - NOAA commit a3374607d01fbdabbec0660afb82b5eb3677b4af Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Fri Mar 8 16:08:42 2024 -0500 Return ocnanalrun npes resource setting back to previous value (#2386) Variable `npes` in the `ocnanalrun` entry of `config.resources` was erroneously changed in https://github.com/NOAA-EMC/global-workflow/pull/2299 and this PR changes it back. Resolves https://github.com/NOAA-EMC/GDASApp/issues/962 commit d7e9bde84aebe922039589bd2bcd65832c1074eb Author: BoCui-NOAA <53531984+BoCui-NOAA@users.noreply.github.com> Date: Thu Mar 7 16:14:28 2024 -0500 Add new BUFR table file parm/product/bufr_ij9km.txt for GFSv17 C1152 (#2383) This PR adds a new table file, parm/product/bufr_ij9km.txt, and modifies ush/gfs_bufr.sh to choose between the bufr table files based on the GFSv17 run resolution, i.e. bufr_ij9km.txt for C1152 or bufr_ij13km.txt for C768. Resolves #2382
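The selection logic in `ush/gfs_bufr.sh` presumably reduces to something like this sketch (the resolution variable, parm path variable, and fallback branch are assumptions):

```bash
# Sketch only: pick the BUFR station table by model resolution.
case "${CASE}" in   # CASE (e.g. C768, C1152) is an assumed variable here
  C1152) bufrtable="${PARMgfs}/product/bufr_ij9km.txt" ;;
  *)     bufrtable="${PARMgfs}/product/bufr_ij13km.txt" ;;
esac
```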
commit 4a525bef3bbed2ea60a71f71b3740c82df125c36 Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Thu Mar 7 16:12:31 2024 -0500 Add global-workflow infrastructure for ocean analysis recentering task (#2299) Adds the jjob, rocoto script, config file, and a basic `config.resources` entry for the ocean analysis recentering task. This PR is a dependency for further work on the associated issue within global-workflow and GDASApp Refs https://github.com/NOAA-EMC/GDASApp/issues/912 commit f83d17a937006add55241ed453e42f4fcbae50aa Author: Kate Friedman Date: Thu Mar 7 09:29:14 2024 -0500 Clean out non-gfs top level variables (#2366) Cleans out non-gfs top-level variables that are duplicates or no longer needed. Also standardizes how we set these variables in scripts. Refs #2332 commit c7b306e052497aef0022cd53550a168d2c5b6e5b Author: Guillaume Vernieres Date: Wed Mar 6 16:16:08 2024 -0500 Forgotten templated DO_VRFY_OCEANDA (#2379) # Description This PR makes it possible to toggle the ocean and sea-ice DA verify task from the yaml configuration. - fixes [GDASApp/issues/954](https://github.com/NOAA-EMC/GDASApp/issues/954) commit 0ad4eb403a0d68ea8592799b8db4b3b46e5f26c3 Merge: 4038e8ac4 ba6a4fdf6 Author: Janet Derrico <143837053+jderrico-noaa@users.noreply.github.com> Date: Wed Mar 6 10:40:54 2024 -0700 Merge branch 'NOAA-EMC:develop' into develop commit ba6a4fdf6e245b57530f2b20e6f0ccf567115720 Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Tue Mar 5 19:09:48 2024 +0000 Add Hercules support for the GSI monitor (#2373) # Description This updates the GSI monitor hash and updates the modulefiles to add support for the monitor on Hercules. commit 732a874a2c6793296f136afb23545fab9869b181 Author: Rahul Mahajan Date: Mon Mar 4 16:04:26 2024 -0500 Reformat snowDA templates to jinja2 (#2371) # Description This PR: - replaces use of non-jinja2 templates in the yaml templates, specifically `$( )`, in favor of pure jinja2. - uses jinja2's built-in capability to include templates within templates, thereby allowing a completely rendered template to be assembled before passing it to e.g. the yaml loader. - requires updates to `wxflow` and `gdasapp` - Changes in `wxflow` in `parse_j2yaml` are **not** backwards compatible Additionally, this PR: - renames `config.base.emc.dyn` to `config.base`. Resolves #2347 commit d1fa41106e991556606b0f62a15bf45f469f4f79 Author: TerrenceMcGuinness-NOAA Date: Sat Mar 2 04:54:26 2024 +0000 Reduce Jenkins messaging to GitHub (#2370) This PR updates the Jenkins Pipeline code with safeguards against the errors caused when Jenkins fails to authenticate with GitHub to message or update a label. This was achieved simply by: - Reducing the number of messages sent to the GitHub PR - Putting try blocks around most of the update-label calls Co-authored-by: tmcguinness commit 52fa3cb32d8b50e47f391b82ea8901435fc88aff Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Fri Mar 1 12:36:27 2024 -0700 Adding debug option for all build scripts (#2326) This PR addresses issue #300, allowing builds in `debug` mode. Co-authored-by: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Co-authored-by: Rahul Mahajan commit 91738cbf871d8cdce46912e2c11e304d567a2aae Author: DWesl <22566757+DWesl@users.noreply.github.com> Date: Fri Mar 1 13:46:42 2024 -0500 Sort list of coupler restart files for restart time determination (#2360) The loop in the following conditional seems to assume the list is sorted, so make that explicit in the array construction. commit 23c25527ad2a62275cd9105bd103b8520a28e573 Author: Neil Barton <103681022+NeilBarton-NOAA@users.noreply.github.com> Date: Fri Mar 1 13:45:39 2024 -0500 Update stage IC to handle ocean perturbations (#2364) This PR adds the option to stage ocean perturbation files for ensemble forecasts. These perturbation files are used in GEFS forecasts. A new variable is introduced in config.base to use the ocean perturbation files. This PR does not include using these perturbation files. A future PR will address this. commit 8efe05f475b81e7cf6376745d5f1ce31987cb4eb Author: Jessica Meixner Date: Fri Mar 1 07:41:56 2024 -0500 Turn on C48mx500_3DVarAOWCDA test on hera (#2363) This PR activates the C48mx500_3DVarAOWCDA test on hera. This required an update of the gdas app.
commit 4038e8ac4bbcad4e8c36557fb56fe1acd29463fd Merge: e2d88ddce 516659394 Author: Janet Derrico <143837053+jderrico-noaa@users.noreply.github.com> Date: Thu Feb 29 13:42:22 2024 -0700 Merge branch 'NOAA-EMC:develop' into develop commit 5166593945e9ecc04dfa3409752576c08797d09f Author: TerrenceMcGuinness-NOAA Date: Thu Feb 29 20:14:41 2024 +0000 Move Jenkinsfile into ci subdirectory (#2355) Just moves the Jenkinsfile into the ci directory Co-authored-by: tmcguinness commit b7af315bb9dea77b37c6d030b71060b87bedf33e Author: Walter Kolczynski - NOAA Date: Wed Feb 28 22:20:58 2024 +0000 Fix rocoto forecast hour determination for GEFS (#2351) The function that generates the list of forecast hours for rocoto was trying to use variables that are not defined for GEFS, causing workflow generation to fail. The function is updated to not try to load these variables before loading the ones actually used for GFS/GEFS. Also turns the GEFS CI test back on and adds an entry to stage C192 ICs (note: these have not been placed in the centralized location.) commit d3a49271b6c3816a9feeb7f6fb474797bacf1d7e Author: Cory Martin Date: Wed Feb 28 09:38:49 2024 -0500 Rename the land DA jobs to snow DA to better reflect what they are doing (#2330) This PR renames all of the land DA jobs to snow DA to better reflect that this is a JEDI-based snow analysis capability and not a more generic land surface analysis. commit 2693810d6ea9d9b20090777ff3a98e3d072c76d7 Author: Jessica Meixner Date: Tue Feb 27 01:25:51 2024 -0500 Update ufs-weather-model hash (#2338) Routine update of the ufs-weather-model hash. Other small updates: * removes a comment referencing a closed issue. * Updates the CICE diag frequency to once per day as recommended here: https://github.com/NOAA-EMC/global-workflow/issues/1810#issuecomment-1686278925 * Updates the amount of time for the C384 gdas forecast as it was running out of time * Removes the unused variable wave_sys_ver commit 9608852784871ebf03d92b53bde891b6dcab8684 Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Mon Feb 26 14:10:01 2024 -0500 Update JEDI ATM to use .nc for obs and generalize x,y layout (#2336) # Description The changes in this PR are twofold: 1. replace the `.nc4` suffix for JEDI ATM observation related files with `.nc` 2. use templated variables to specify `{layout_x, layout_y}` for JEDI ATM variational and local ensemble apps The first change conforms with the Unidata recommendation that netCDF files end with the suffix `.nc`. The second change replaces hardwired JEDI ATM var and ens `{layout_x, layout_y}` in `config.resources` with a more flexible approach. Resolves #2335 commit c5c84660f10f0ef9ce939231b2f7fda498b39a29 Author: Kate Friedman Date: Mon Feb 26 10:18:50 2024 -0500 Remove FIX* variables for fix subfolders (#2337) Remove `FIX*` variables for fix subfolders and replace them with the remaining `FIXgfs` variable and the subfolder name (e.g. `${FIXam}` -> `${FIXgfs}/am`). The UFS_UTILS and GDASApp repos were similarly updated. This PR includes a new UFS_UTILS hash. The updated GDASApp hash was already committed within the spack-stack/1.6.0 PR #2239. Resolves #2184 commit 950c38a093c6a4e2b67e18c76390280d8bfbaef7 Author: TerrenceMcGuinness-NOAA Date: Fri Feb 23 21:15:46 2024 +0000 Fix several Jenkins issues (#2334) Jenkins updates resolving final kinks: - Removed all `git` shell commands and now exclusively use the Source Control Management (**scm**) plugin. - Add feature for skipping hosts per configuration specified in case yaml files.
- Solved and tested false positive builds and experiments. - Tested archiving of task error logs on case fail - First case fail quits the pipeline and cancels all pending scheduled jobs - Dual builds per yaml configuration arguments supported - All designated case files in the PR directory pass on the intended host (fully tested on Hera) Remaining updates: - First build-fail short circuit when building sub-modules and archiving the build error log. - Re-build/no-build built-in logic for Replay and Rerunning previously failed experiments. commit c67393a203285792b852da0d83fd10fa47155669 Merge: 6f9afff07 79d305e8c Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Fri Feb 23 14:18:15 2024 +0000 Merge pull request #2239 from DavidHuber-NOAA/ss160 Update to spack-stack 1.6.0 Includes all submodules except the UFS, which will be updated at a later time. commit 79d305e8cbc339208ea6fe0475ddc56af94a285b Author: David Huber Date: Thu Feb 22 14:53:57 2024 -0600 Disable snow DA tests commit 0459203e97211b041520b39e93a188951396ba33 Author: David Huber Date: Thu Feb 22 12:55:51 2024 -0600 Update GDASApp hash to current develop commit 5c96eb2272fe6117f9e4a4c0c790db58e4870d46 Author: David Huber Date: Thu Feb 22 12:37:47 2024 -0600 Update GDAS hash to allow modified snow DA analysis commit 79144f2403a33d08140577cc3e8a61d6c4924403 Merge: abbb0b8a7 6f9afff07 Author: DavidHuber Date: Thu Feb 22 15:14:39 2024 +0000 Merge remote-tracking branch 'origin/develop' into ss160 commit abbb0b8a76b39d34433802fd4e3d7973fb9f0a39 Merge: 4ad837ea1 4529e8cf3 Author: DavidHuber Date: Thu Feb 22 15:11:03 2024 +0000 Merge remote-tracking branch 'henry/feature/gwdev_issue_2129' into ss160 commit 4ad837ea157b98d2bd173cb696c4d30d142e5540 Merge: 516b2a270 7ca45db8f Author: DavidHuber Date: Thu Feb 22 15:10:24 2024 +0000 Merge remote-tracking branch 'henry/feature/gfsv17_issue_2125' into ss160 commit 516b2a270234bdab5714ef2705c83dd3835b134d Author: DavidHuber Date: Thu Feb 22 14:19:16 2024 +0000 Updated GDAS to include rocoto/1.3.6 on Hera. commit 6f9afff073dd589096f992a3448fb7f0e62c9804 Author: Jessica Meixner Date: Wed Feb 21 13:09:44 2024 -0500 Add some flexibility for ocean/ice output (#2327) This adds functionality to have the ocean/ice output frequency be separate from the atm model. A single time was used because there's an assumption in the post that these are the same. This could be further modified to remove this assumption. Refs #1629 commit e2d88ddcec863d1762be19e31a4ac5bddcda4d34 Merge: 3e6b1e2c5 a23b7f2fd Author: Janet Derrico <143837053+jderrico-noaa@users.noreply.github.com> Date: Tue Feb 20 16:08:46 2024 -0700 Merge branch 'NOAA-EMC:develop' into develop commit f6d3015ab9f9fd23ce0081baa2edbc5d9f5f3e16 Author: David Huber Date: Tue Feb 20 09:53:08 2024 -0600 Update GDASApp hash to include SS/1.6.0 support. commit 0bf340bce2f865c582e5305f0c5984cd3affe74e Author: David Huber Date: Tue Feb 20 09:07:32 2024 -0600 Construct SS paths from version variables. commit 3330cd7310bde8090b68720c225656074676d6b2 Author: David Huber Date: Tue Feb 20 09:06:19 2024 -0600 Removed MET/METplus 'not available' comments commit fdc638ca1616b347912caec77e5abc2d3e6f18af Author: David Huber Date: Tue Feb 20 08:35:15 2024 -0600 Move SS module path to version files. commit d7d28a6b65b84c8d821abd2c13d0c068bd5ad6d8 Author: David Huber Date: Tue Feb 20 07:53:15 2024 -0600 Update comment about METplus support.
commit a812f88af3fc043d57494840e81a7527723858e4 Author: David Huber Date: Tue Feb 20 07:45:45 2024 -0600 Update verif-global to latest WCOSS2 support. commit d81f07fbf53666a37ab01bd463152e10252869ae Author: David Huber Date: Tue Feb 20 07:41:38 2024 -0600 Clean up build_upp.sh. -Corrected whitespace (tabs instead of spaces) -Removed debug print statement -Alphabetized flags commit ae7eb194cbe2213e564807dd7bc03a28d493eff2 Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Fri Feb 16 16:15:40 2024 -0500 Fix whitespace in build_upp.sh. Co-authored-by: Rahul Mahajan commit 48b34d0f388a398bb91f7c3b1e5f8338f4beb7b9 Author: David Huber Date: Fri Feb 16 20:56:55 2024 +0000 Added verif-global support back to WCOSS2. commit 4529e8cf3736ffbacf615a27e99f4d1beec391aa Author: henrywinterbottom-wxdev Date: Fri Feb 16 11:12:01 2024 -0700 Bug fix. commit 8e4f94d13d32849b2862a9d59aa070f4103c61ae Author: henrywinterbottom-wxdev Date: Fri Feb 16 11:04:41 2024 -0700 Updates requested by reviewer Rahul Mahajan. commit 4624ce21c99ab303afa10c1dd8ddcce7b6f715ca Author: henrywinterbottom-wxdev Date: Fri Feb 16 10:52:11 2024 -0700 Updates requested by reviewer; testing -- DO NOT REVIEW. commit ed25bbd0b26a893d32ce4a10b368dec2bb722424 Author: henrywinterbottom-wxdev Date: Fri Feb 16 09:33:59 2024 -0700 Linter corrections. commit 6a0b7bf214815c13eec98a93bd3abe9978de04e4 Merge: f9fb64ef8 2415b7b4f Author: David Huber Date: Fri Feb 16 09:50:22 2024 -0600 Merge branch 'ss160' of github.com:DavidHuber-NOAA/global-workflow into ss160 commit f9fb64ef802b683d7b369f8e8268d423ab581481 Merge: 777d97d3a a23b7f2fd Author: David Huber Date: Fri Feb 16 09:49:42 2024 -0600 Merge remote-tracking branch 'emc/develop' into ss160 commit 7ca45db8fa32c84ef117ff2938a650822c81a786 Merge: 73bc76bfd a23b7f2fd Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Fri Feb 16 07:46:35 2024 -0700 Merge branch 'NOAA-EMC:develop' into feature/gfsv17_issue_2125 commit eb2ed53857eb3387735488a1ce5d701ff680e3db Merge: 9929277dc a23b7f2fd Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Fri Feb 16 07:46:24 2024 -0700 Merge branch 'NOAA-EMC:develop' into feature/gwdev_issue_2129 commit a23b7f2fdca5be700d257e28052a0104f2173a0f Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Fri Feb 16 09:37:58 2024 -0500 Add JEDI 3DEnVar atmosphere only CI test stub (#2309) commit cf83885548bb3a6740c033f42479ce2ad283a4a9 Author: Jessica Meixner Date: Fri Feb 16 01:55:02 2024 -0500 Add unstructured grid for HR3/GFS (#2230) This adds the capability to use unstructured grids in the global workflow, which will be used in HR3. There are new fix files for a low-resolution 100km grid and a grid closer to our targeted GFSv17 grid, which has the resolution combined from the older multi_1 and GFSv16 grids. The fix file update is here: NOAA-EMC/global-workflow#2229 Note: This now means that GFS tests need a new build option: `./build_all.sh -w` so that PDLIB=ON is turned on for compiling the relevant UFS and WW3 codes. Resolves NOAA-EMC/global-workflow#1547 commit 9929277dc7d0ad90f5366faf4b5f2278969a0aa2 Merge: eb8791ccb 094e3b86d Author: Henry R.
Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Thu Feb 15 15:05:50 2024 -0700 Merge branch 'NOAA-EMC:develop' into feature/gwdev_issue_2129 commit 094e3b86da44f1d3fc1d99f68f6fdfcd36deb09f Author: Cory Martin Date: Thu Feb 15 14:43:55 2024 -0500 Move IMS remapping files from COM_OBS to FIXgdas (#2322) * Add in IMS obs fix directory and update submodule for gdas commit d465ea06e8b2a8f3a5eb1120647c1e2ce5197d66 Author: TerrenceMcGuinness-NOAA Date: Thu Feb 15 19:25:02 2024 +0000 Set HOMEgfs for module_setup in CI driver (#2321) Hotfixes to the CI Bash system after the updates that source `detect_machine.sh` in `ush/module-setup.sh` using **HOMEgfs**. commit 2415b7b4f3e6c376aca27707510001141cc9dd92 Author: David Huber Date: Thu Feb 15 19:21:17 2024 +0000 Load default rocoto on Jet. commit eb8791ccbe684828dc529d0e553551969274fc22 Merge: 60d5ee64b 638684e0b Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Thu Feb 15 11:56:40 2024 -0700 Merge branch 'NOAA-EMC:develop' into feature/gwdev_issue_2129 commit 777d97d3a1c9f5ec5e8af3ca40a41224ec7099a1 Author: David Huber Date: Thu Feb 15 12:34:59 2024 -0600 Fixed Orion cdo version. commit ef0723503c72f295de93521c7102c43e75c47417 Author: DavidHuber Date: Thu Feb 15 16:29:32 2024 +0000 Revert UFS hash. commit 0ce8c0dbc13227884fef1c637e93616a28c68d34 Author: DavidHuber Date: Thu Feb 15 14:55:48 2024 +0000 Fix git version in Hera's gwsetup module. commit 3080a34253e8e24105bf2be72b6a872b1c072935 Author: DavidHuber Date: Wed Feb 14 20:52:34 2024 +0000 Fixed xarray version for SS/1.6.0. commit 49392dd47ff84b6586052aeae6879d7d8050b746 Author: DavidHuber Date: Wed Feb 14 20:29:41 2024 +0000 Updated GSI-Utils hash to head of develop. commit c3553f0d8e6a05c6234e7c14a179983aa44fd6f3 Merge: 4568653a6 638684e0b Author: DavidHuber Date: Wed Feb 14 20:28:14 2024 +0000 Merge remote-tracking branch 'origin/develop' into ss160 commit 4568653a67aa37c902379db628fb10f69fe7190f Author: David Huber Date: Wed Feb 14 14:22:13 2024 -0600 Reupgrade Hercules to SS/1.6.0 commit 638684e0bfcd06700cc8695f09824891a0a1eee1 Author: Kate Friedman Date: Wed Feb 14 14:55:21 2024 -0500 Remove `finddate.sh` from system (#2308) * Retire finddate.sh usage from system * Update gfs-utils hash to 7a84c88 - New hash includes removal of finddate.sh Refs #2279 commit 2b160f8470bed16c513ca4a5665e2b6d4448c50e Author: DavidHuber Date: Wed Feb 14 18:31:28 2024 +0000 Reenable METplus jobs on Hercules. commit 8f5900265a31e894060bbe9c89b262f4df0b1760 Author: DavidHuber Date: Wed Feb 14 18:30:23 2024 +0000 Update GSI hashes. commit 73bc76bfd47f2cff54df55e15d9ed8969683367d Author: henrywinterbottom-wxdev Date: Wed Feb 14 09:27:37 2024 -0700 Updates based on user request. commit e4bc674cf3b2df10e0b0dfd50a8ecb0f4f7825d8 Author: henrywinterbottom-wxdev Date: Wed Feb 14 08:03:45 2024 -0700 Corrected based on reviewer feedback. commit 60d5ee64bda7a16c1f2dc20c8badcd7c8fc4e592 Merge: 40f2cf6cd 1aaef05d3 Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Tue Feb 13 18:08:33 2024 -0700 Merge branch 'NOAA-EMC:develop' into feature/gwdev_issue_2129 commit 03304112347a673b7a0fcc404703bb10960bc47a Merge: 929b90330 1aaef05d3 Author: Henry R.
Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Tue Feb 13 17:58:12 2024 -0700 Merge branch 'NOAA-EMC:develop' into feature/gfsv17_issue_2125 commit 1aaef05d317cd1eec548ef2b9842679c531cef8b Author: TerrenceMcGuinness-NOAA Date: Tue Feb 13 18:15:59 2024 -0500 Jenkins Pipeline updates for Canceling Jobs (#2307) Tuning updates for the Jenkins Pipeline: - Added a short circuit for all parallel runs of cases on error of any - Fixed canceling of all scheduled jobs on first case error - Added feature to save error log files to Jenkins Archive facility on fail commit 64048926627f8c9edb087de286095e3b93a214c2 Author: Rahul Mahajan Date: Tue Feb 13 14:57:37 2024 -0500 Ocean/ice product generation for GFS and GEFS (#2286) This PR does several things: 1. The model output for ocean and ice in the `COM/` directory is now named per the EE2 convention for the coupled model, e.g. `gfs.ocean.t12z.6hr_avg.f120.nc` and `gfs.ocean.t12z.daily.f120.nc` (a naming sketch follows at the end of this commit block). 2. The products are generated using the `ocnicepost.fd` utility developed by @DeniseWorthen in https://github.com/NOAA-EMC/gfs-utils and converted to grib2 using example scripts provided by @GwenChen-NOAA using `wgrib2`. 3. NetCDF products on the native grid are also generated by subsetting variables from the raw model output. This is done with `xarray`. 4. updates the hash of https://github.com/NOAA-EMC/gfs-utils to include fixes in `ocnicepost.fd` 5. removes NCL related scripting that was previously used for ocean/ice interpolation and `reg2grb2` used for converting to grib2. 6. updates archive scripts to accommodate updated file names 7. removes intermediate ocean processed files such as 2D/3D/xsect datasets 8. separate jobs are added for ocean and ice product generation. 9. removes intermediate restarts for the mediator and only saves the mediator restart at the end of the forecast in `COM`. 10. Increases memory for offline UPP when run at C768. The program segfaults with an OOM when memory is self allocated based on PEs by the scheduler on Hera. 11. Enables ocean/ice ensemble product generation for GEFS 12. Some minor clean-ups Fixes #935 Fixes #1317 Fixes #1864 commit 40f2cf6cd70ac94e01babc96982f37ae1b0c7e79 Author: henrywinterbottom-wxdev Date: Tue Feb 13 10:18:53 2024 -0700 Implemented ush/detect_machine.sh for host determination and removed redundant checks for expected file paths. commit 929b90330c7eb345011c2c850915e0004ca26b11 Merge: 2d08d015a 3f99f700c Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Tue Feb 13 08:31:06 2024 -0700 Merge branch 'NOAA-EMC:develop' into feature/gfsv17_issue_2125 commit 3f99f700c987526c8eb754b3f4c7b698b3e9b1dc Author: Walter Kolczynski - NOAA Date: Tue Feb 13 00:57:18 2024 -0500 Add wave post jobs to GEFS (#2292) Adds the wave post jobs for gridded and points to GEFS. Boundary point jobs are added even though the current GEFS buoy file does not contain any (tested by manually subbing in the GFS buoy file). Resolves #827
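For item 1 of #2286 above, the EE2-style product names can be composed from the run, component, cycle, averaging period, and forecast hour. A minimal bash sketch (the variable names are illustrative, not taken from the workflow scripts):

```bash
#!/usr/bin/env bash
# Compose an EE2-style ocean product name, e.g. gfs.ocean.t12z.6hr_avg.f120.nc
RUN="gfs"         # model run
component="ocean" # ocean or ice
cyc="12"          # cycle hour (may be zero-padded, e.g. "06")
period="6hr_avg"  # or "daily"
fhr=120           # forecast hour

# 10# forces base-10 so zero-padded cycles like "06" or "08" do not
# trip bash's octal parsing.
printf -v filename '%s.%s.t%02dz.%s.f%03d.nc' \
  "${RUN}" "${component}" "$((10#${cyc}))" "${period}" "${fhr}"
echo "${filename}"   # -> gfs.ocean.t12z.6hr_avg.f120.nc
```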
commit 842adf38087aec9f1c0bca9567e4b11d494e14c7 Author: TerrenceMcGuinness-NOAA Date: Mon Feb 12 12:50:08 2024 -0500 Added additional test cases to the pr list in Jenkins (#2306) C48mx500_3DVarAOWCDA, C96C48_hybatmDA, and C96_atmsnowDA Co-authored-by: terrance.mcguinness commit bb4ca65fe5524f76e40b97346339f1dda6680ce1 Author: CatherineThomas-NOAA <59020064+CatherineThomas-NOAA@users.noreply.github.com> Date: Mon Feb 12 14:50:41 2024 +0000 Redo v16.3 config.base changes for DA increments (#2304) Includes the additional hydrometeors in the INCREMENTS_TO_ZERO and INCVARS_ZERO_STRAT variables in config.base that were modified in v16.3. Resolves: #2303 commit 061992bb6160554430cf688adf6184f01b732098 Author: TerrenceMcGuinness-NOAA Date: Sat Feb 10 01:33:36 2024 -0500 Fix Jenkins success reporting (#2302) Moving the post section back outside of the main Run Experiments stage. This allows the system to correctly report the **Success** status only after all tests pass. _They had originally been moved in an attempt to solve the "Not an SCM GitHub Job" issue, which caused the reporting to misbehave._ Also ran through the Jenkins linter and updated some messaging that was incorrectly reporting the system build type. commit 28ccf78073a20ba1e4d3b379d164109b54ff6708 Merge: b972f66fc 54daa31ce Author: David Huber Date: Fri Feb 9 14:00:43 2024 -0600 Merge remote-tracking branch 'emc/develop' into ss160 commit b972f66fc924790c48f38d395e7141fa78ef9d90 Author: David Huber Date: Fri Feb 9 10:47:00 2024 -0600 Fix SS versions for CI modules. commit 4b01d8eeca26c3b3d843a4b1e7b1618f281de68f Author: David Huber Date: Fri Feb 9 10:44:30 2024 -0600 Revert Hercules modules to SS/1.5.1. commit 54daa31ce0a3c23d4d74def5e54436a39a899ed4 Author: TerrenceMcGuinness-NOAA Date: Thu Feb 8 15:48:38 2024 -0500 Jenkins Declarative Pipeline for CI with gfs/gefs multibuilds (#2246) Adding top level Jenkins file for CI tests running on Jenkins Controller: - Declarative Multi-branch Pipeline (has enhanced restart capabilities on a per-section basis) - Starts Pipeline from Label PR same as BASH system (for now) - Progress and restarts can be managed with CAC Login at [EPIC OAR Jenkins](https://jenkins.epic.oarcloud.noaa.gov) - Has logic for multi **gfs/gefs** system builds (arguments based on a configuration file `ci/cases/yamls/build.yaml`) - Any number of **systems** may be added by manually adding an element to the matrix in the Jenkinsfile - _It may be possible to dynamically add matrix values with a specialty plug-in_ - Currently only runs on **Orion** and **Hera** using `mterry` account Resolves #2119 Resolves #2118 commit 43429e23c12c1f2050b3a3f356abdec98dc73ea0 Author: Rahul Mahajan Date: Thu Feb 8 15:30:28 2024 -0500 Enable AO WCDA test (#1963) This PR: - adds GSI + SOCA C48 5-deg ocean 3DVar test (courtesy @guillaumevernieres) - adds a toggle to optionally disable the ocnanalvrfy job. commit 2d08d015afb9c850577efd9754add16b10c45ae6 Merge: 4745d4a06 f56352874 Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Thu Feb 8 07:24:54 2024 -0700 Merge branch 'NOAA-EMC:develop' into feature/gfsv17_issue_2125 commit f56352874d6dc133a4f1181f77c8f91ca38a6416 Author: Kate Friedman Date: Wed Feb 7 15:09:12 2024 -0500 Update JGLOBAL_FORECAST for octal error (#2295) Adds "10#" to the ENSMEM value > 0 check to handle octal errors (illustrated in the sketch below). commit 4745d4a06148cc6c702c647e12ede4875e3a5862 Author: henrywinterbottom-wxdev Date: Wed Feb 7 08:45:49 2024 -0700 Removed jlogfile references.
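The "10#" fix in #2295 above guards against bash treating zero-padded ensemble member numbers as octal. A minimal sketch of the failure and the fix (the variable handling here is illustrative, not the exact JGLOBAL_FORECAST code):

```bash
#!/usr/bin/env bash
ENSMEM="008"   # zero-padded member numbers are common in file/dir names

# Without a base prefix, bash parses "008" as octal and errors out:
#   bash: ((: 008: value too great for base (error token is "008")
# if (( ENSMEM > 0 )); then ...   # fails for members 008 and 009

# Forcing base-10 with "10#" makes the comparison safe:
if (( 10#${ENSMEM} > 0 )); then
  echo "perturbed member ${ENSMEM}"
else
  echo "control member"
fi
```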
commit 5894ca2bf11e8ab7910c69781a9fbe51352a7e8c Author: henrywinterbottom-wxdev Date: Wed Feb 7 08:32:16 2024 -0700 Removed dummy variable passed to perl scripts. commit dae884a7c48b4a30c1844685c3d0ef50c9b78344 Author: henrywinterbottom-wxdev Date: Wed Feb 7 08:29:49 2024 -0700 Removed jlogfile and postmsg references within gempak scripts. commit 801058ffb0cbbfe101fd5b686aed79c5bf7538c1 Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Wed Feb 7 00:41:59 2024 -0700 Consolidate `npe_node_max` (#2289) - The environment variable `npe_node_max` is removed from all files beneath `global-workflow/env`; - The environment variable `npe_node_max` is removed from `parm/config/gefs/config.ufs` and `parm/config/gfs/config.ufs`; - The environment variable `npe_node_max` is maintained only within `parm/config/gefs/config.resources` and `parm/config/gfs/config.resources`. Resolves #2133 commit b0325e0157598702cbba6c3cc09af0120881e2b4 Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Wed Feb 7 00:40:20 2024 -0700 Removes module loading files no longer used by the GW (#2281) Removes `module-setup.csh.inc` and `module-setup.sh.inc`. The module `ush/module-setup.sh` is updated such that it now sources `ush/detect_machine.sh` to determine which supported platform the global-workflow is being executed on (see the sketch following this commit block). Resolves #2130 commit 3e6b1e2c530bd94864cb1da1b259b325a8551db6 Merge: f4c8e003e 1ccc9896b Author: Janet Derrico <143837053+jderrico-noaa@users.noreply.github.com> Date: Tue Feb 6 10:43:55 2024 -0700 Merge branch 'NOAA-EMC:develop' into develop commit 1ccc9896b361f2aaef8e6e7592a06ae4cfb7c491 Author: Walter Kolczynski - NOAA Date: Mon Feb 5 14:16:07 2024 -0500 Remove EnKF forecast groups (#2280) Removes the grouping of EnKF forecasts so each job only runs one forecast. Member and MEMDIR are now set at the workflow manager (rocoto) level. This change makes much of the system simpler (especially dependencies) and allows the elimination of the separate efcs scripts. Metatask names of updated jobs have been changed to make them a little less opaque by using the same name as their constituent tasks (e.g. the forecast metatask is named `enkgdasfcst`, not `enkfgdasefcs`). Metatasks that weren't updated retain the same names as before for now. Resolves #2254 commit 9f3383fd8d8322428a40b94764a172a16872995e Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Mon Feb 5 12:15:13 2024 -0700 Updated detect_machine.sh using that from UFS WM. (#2252) Updates `ush/detect_machine.sh` to match the UFS weather-model `tests/detect_machine.sh` prepared by @BrianCurtis-NOAA Resolves #2228 commit 7d68b0b164f0ffcd56867ae4fdab67905d9589eb Author: CatherineThomas-NOAA <59020064+CatherineThomas-NOAA@users.noreply.github.com> Date: Sun Feb 4 22:02:41 2024 +0000 Update global_cycle for fractional grid (#2262) The hash for ufs_utils is updated to include the changes for fractional grid support within global_cycle. This commit also removes the hack to skip global_cycle in cycling mode with v17 physics. Resolves: #1775 Refs: ufs-community/UFS_UTILS#815 Refs: ufs-community/UFS_UTILS#891
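A hedged sketch of the host-detection pattern described in #2281 above, assuming `ush/detect_machine.sh` sets `MACHINE_ID` as in the UFS weather-model convention; the per-machine handling is a placeholder, not the real module-setup.sh logic:

```bash
#!/usr/bin/env bash
# Simplified module-setup logic: detect the host once, then branch on it
# instead of carrying separate module-setup.*.inc include files.
source "${HOMEgfs}/ush/detect_machine.sh"   # assumed to set MACHINE_ID

case "${MACHINE_ID}" in
  hera|jet|hercules|orion|wcoss2|s4)
    # initialize the module system for the detected platform
    module purge
    ;;
  *)
    echo "FATAL: unsupported platform '${MACHINE_ID}'" >&2
    exit 1
    ;;
esac
```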
commit ed592a6ecfabc0d0b64a6e276531b7bc5ae3b8ea Author: Kate Friedman Date: Sat Feb 3 02:55:14 2024 -0500 Retire cycle-specific FHMAX_GFS variables (#2278) This PR retires the `FHMAX_GFS_${cyc}` variables that allowed users to specify different gfs forecast lengths for each cycle. This functionality is no longer supported in global-workflow. The `FHMAX_GFS_*` variables will be removed and will no longer be checked to set the final `FHMAX_GFS` variable. The same forecast length will be set for every cycle. This PR also includes a small fix to add new post parm files into the `.gitignore` file. This was intended to be included in a different PR, but that PR is on hold for further testing, so it is being included here to get it into `develop` sooner. Resolves #2218 commit 977e2d67b268477321aa26fc56073dd373e4f979 Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Fri Feb 2 07:49:09 2024 -0700 New GDASApp hash. (#2285) commit b5f2bd9ec5632c4be43004604eed0e130dfe1735 Merge: 4d667421d 0400e1f35 Author: David Huber Date: Tue Jan 30 13:55:00 2024 -0600 Merge branch 'ss160' of github.com:DavidHuber-NOAA/global-workflow into ss160 commit 0400e1f3558be8e34c1298d32e14999c9dd46f8c Author: David Huber Date: Tue Jan 30 09:57:11 2024 -0600 Fix gfs_utils Orion spack-stack env path. commit 6bbe823e729291db326d108765d3a92a99552a58 Author: DWesl <22566757+DWesl@users.noreply.github.com> Date: Mon Jan 29 21:15:30 2024 -0500 Use seq to generate the list of times, instead of a bash for-loop (#2264) I'm running a year-long forecast, which means I get a large portion of the log file dedicated to these loops. `seq ${START} ${STEP} ${STOP}` will generate a sequence going from START to STOP by STEP, including STOP if relevant. That seems to be the purpose of these loops. It will by default separate the list with newlines, but `seq -s ' ' ${START} ${STEP} ${STOP}` will separate them with spaces instead, more closely mimicking the previous behavior (see the sketch at the end of this commit block). I would like this to be two lines in the log, rather than a few hundred, and this may also be faster, though probably more for reasons of fewer writes to disk than because bash isn't designed for arithmetic. commit f4c8e003eb1bf264f0a66196c7cbde773c55a378 Merge: 61975cf58 d5bee3897 Author: Janet Derrico <143837053+jderrico-noaa@users.noreply.github.com> Date: Mon Jan 29 15:38:56 2024 -0700 Merge branch 'NOAA-EMC:develop' into develop commit d5bee38979cde547861261d1cd150f3a61601d4b Author: Kate Friedman Date: Mon Jan 29 14:35:02 2024 -0500 Correct typos in GFS config.resources (#2267) This PR corrects some typos in `parm/config/gfs/config.resources` that were introduced in PR #2216. The esfc job was failing in tests on WCOSS2 due to insufficient memory. This led to discovering the other typos. The esfc job completes without error after its memory is set back to `80GB` from the incorrect `8GB`. Resolves #2266 commit 81557beca9eecd878e7b25b3822e30a4276f4a16 Author: David Huber Date: Mon Jan 29 12:13:31 2024 -0600 Update monitor hash to noaa-emc with SS/1.6.0 support. commit 2238dd6ac0094ba2ff5e1027e964ef29ad33352c Author: DavidHuber Date: Mon Jan 29 13:21:30 2024 +0000 Update Orion, Hercules, S4 modulefiles. commit 6ffd94fd95f54ceb940f0c9201774ad73fbb055b Author: DavidHuber Date: Fri Jan 26 16:01:23 2024 +0000 Update GDAS hash to include SS/1.6.0 support. commit be11f85f28cf832e5fbb390fdd387f1bdecb5f82 Merge: 56b968080 04e0772d9 Author: DavidHuber Date: Fri Jan 26 15:34:24 2024 +0000 Merge branch 'ss160' of github.com:DavidHuber-NOAA/global-workflow into ss160
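A before/after sketch of the `seq` change described in #2264 above; the bounds and the list variable are illustrative:

```bash
#!/usr/bin/env bash
START=0 STEP=6 STOP=384

# Before: a bash for-loop builds the list one append at a time,
# which emits one trace line per iteration under 'set -x'.
fhr_list=""
for (( fhr = START; fhr <= STOP; fhr += STEP )); do
  fhr_list="${fhr_list} ${fhr}"
done

# After: a single seq call; '-s " "' gives a space-separated list
# (seq's default separator is a newline).
fhr_list=$(seq -s ' ' "${START}" "${STEP}" "${STOP}")
echo "${fhr_list}"   # 0 6 12 ... 384
```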
commit 8ff344844e28c3b2d03a0356f88b14635f318c12 Author: Rahul Mahajan Date: Fri Jan 26 10:12:18 2024 -0500 Add a yaml for snow DA testing. (#2199) - adds a new test yaml C96_atmsnowDA.yaml for 3DVar atmosphere with GSI and Land (Snow) DA with JEDI - moves a few yamls from platforms/ to yamls/ - adds the ability to overwrite a previously created experiment as an addition to user input. --------- Co-authored-by: Cory Martin commit 04e0772d9d3e77ac5a24ce4570d48cb41424a08b Author: David Huber Date: Fri Jan 26 07:16:25 2024 -0600 Update ufs_utils hash for spack-stack/1.6.0 support. commit 3d44ff38c5c3324c22fc104fe3259b4ac864c6d6 Author: Barry Baker Date: Thu Jan 25 14:33:27 2024 -0500 GOCART ExtData biogenic climatology fix (#2253) Updates the ExtData for biogenic emissions to be climatology rather than for the current time. Fixes an issue that would cause crashes by default for other years. commit 66f58b8ab1a9524d6be95271f27a06c2f32e5f78 Author: Guillaume Vernieres Date: Thu Jan 25 13:16:41 2024 -0500 Added missing container case in gfs/config.resources (#2258) fixes #2257 commit 553b4f2e74ef610115436b75f7f6df100babd8dd Author: WenMeng-NOAA <48260754+WenMeng-NOAA@users.noreply.github.com> Date: Thu Jan 25 13:00:45 2024 -0500 Fix post parm links (#2243) Changes symbolic links under parm/post to the latest version of the develop branch from the UPP repository and enables MERRA2 aerosol fields. Resolves #2259 commit 4d667421d5eefea2347ea1dd8097e39e167d7201 Merge: 6c058039e afa09e356 Author: David Huber Date: Thu Jan 25 07:49:08 2024 -0600 Merge branch 'ss160' of github.com:DavidHuber-NOAA/global-workflow into ss160 commit 6c058039e209c67ea477d37d2d6f76b7a2fa68ac Author: David Huber Date: Thu Jan 25 07:48:42 2024 -0600 Fix wgrib2/gfs_utils on Hercules. commit ee6f536ea0228c60f5a8bec4037cd6f7ea63b816 Author: Kate Friedman Date: Thu Jan 25 07:43:13 2024 -0500 Update GFS version to v16.3.13 in index.rst (#2256) The GFSv16.3.13 WAFS update was implemented. Refs #2013 commit 2445d44d0d66f35512080b0bd5867501660793bb Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Thu Jan 25 06:17:11 2024 -0500 Simplify and extend load_ufsda_modules to Hercules (#2245) GDASApp jobs do not run on Hercules because `ush/load_ufsda_modules.sh` does not include logic to load the appropriate GDASApp modules on Hercules. This PR extends `load_ufsda_modules.sh` functionality to Hercules, thereby enabling GDASApp jobs to run on Hercules. Resolves #2244 commit 56b9680803f109720f6439276a0e5783a9c49352 Author: DavidHuber Date: Wed Jan 24 19:47:34 2024 +0000 Update hashes (revert WCOSS2 modules). commit afa09e356503f1befc162df9a79dc9ce7414dc22 Author: DavidHuber Date: Wed Jan 24 18:59:40 2024 +0000 New (cleaner history) gdas hash. commit a9eaec23d5103ea766342ff81d77df8e0f6d9a58 Author: DavidHuber Date: Wed Jan 24 18:52:27 2024 +0000 Update gdasapp to include ss/1.6.0 support. commit 5ef8eb24eeaeaff9c9a03767d929f2114fd4840f Merge: 4c463548b e400068a2 Author: DavidHuber Date: Wed Jan 24 17:08:54 2024 +0000 Merge branch 'ss160' of github.com:DavidHuber-NOAA/global-workflow into ss160 commit 4c463548b0c38f36c13c01716fa0b1e91544e440 Author: DavidHuber Date: Wed Jan 24 17:05:12 2024 +0000 Reenable verif-global support commit f775755df5f1579edfafbce9a7b47034907d023f Author: DavidHuber Date: Wed Jan 24 16:59:10 2024 +0000 Assign fcsthrs for awips_g2 job. commit e400068a2951e5fa3b130369861b4a60ece01491 Author: David Huber Date: Mon Jan 22 15:01:07 2024 -0600 More gsi-addon path fixes. commit f663d4786f226f8164d348d403e8d105cc3d801b Author: David Huber Date: Mon Jan 22 14:41:59 2024 -0600 Fix gsi-addon paths for hercules, orion, and S4.
commit e304bbeb364067e411fb50801796d000b44d9147 Author: DavidHuber Date: Mon Jan 22 18:59:34 2024 +0000 Better optimize build jobs (more ufs jobs). commit ccc6e2d445d4c2e2f44fb907c7bc972b1e69f6ff Author: DavidHuber Date: Mon Jan 22 18:56:18 2024 +0000 Reenable verif-global. #2195 commit 3ef3411f856b0c5a9cf51686f3a51a2252596690 Merge: 4365f63e5 f4d187f4e Author: DavidHuber Date: Mon Jan 22 18:39:10 2024 +0000 Merge remote-tracking branch 'origin/develop' into ss160 commit 4365f63e54e2ddb9b88e768a6e88a92504b5d482 Author: DavidHuber Date: Mon Jan 22 18:37:25 2024 +0000 Corrected the SS env. name in the version files. #2195 commit f4d187f4e45fe89583d18987d68a883490827104 Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Mon Jan 22 12:28:28 2024 -0500 Converts obsproc to obsprep in prepoceanobs config file (#2236) Converts obsproc to obsprep in the prepoceanobs config file as a complement to the mutually dependent GDASApp PR NOAA-EMC/GDASApp#858 The motivations are explained in refs NOAA-EMC/GDASApp#857 commit 9a09d3082208ad206cf6e7103a53bfb7c4946f4a Author: DavidHuber Date: Fri Jan 19 21:23:50 2024 +0000 Corrected gsi-addon-dev spelling. commit 21ff6458aac9372b26cc223f6a114953a137067e Author: DavidHuber Date: Fri Jan 19 21:11:34 2024 +0000 Update modulefiles, submodules to spack-stack 1.6.0. #2195 commit d4c55d1011f8b0385d25b62ac04710837ed8413e Author: Kate Friedman Date: Fri Jan 19 15:43:37 2024 -0500 Update typing hint for WCOSS version of python (#2238) The typing hint `typing.List` was deprecated with python 3.9 in favor of using the primitive `list[str]`, but the functional version of python on WCOSS2 is <3.9, causing `setup_xml.py` to fail there. This replaces `list[str]` as a typing hint with the deprecated form until the supported version on WCOSS2 is >=3.9. commit 61975cf583169525febd8117999017d85c85b644 Merge: 1344313c4 491928712 Author: Janet Derrico <143837053+jderrico-noaa@users.noreply.github.com> Date: Fri Jan 19 08:38:17 2024 -0700 Merge branch 'NOAA-EMC:develop' into develop commit 491928712ae1dabac62750450c9cad8f538c2a3e Author: Barry Baker Date: Thu Jan 18 22:31:27 2024 -0500 GOCART Emission updates for GEFSv13 and GDAS (#2201) This PR addresses several things needed for more recent simulations using GOCART2G as well as preparing for the GEFSv13 30 year run. The main updates are: - Update CEDS to use monthly emission files instead of daily. This will drastically reduce the number of files needed. No science change - Update biogenic MEGAN inputs to use a climatology rather than the offline biogenic previously used. This is needed to deal with simulations where the current dataset is not available. - Update volcanic SO4 emissions to use degassing emissions only. This is to support the need for more recent simulations where data is not available. commit 13d25cfc3614de978bfd4b7f273d1f13cb820878 Author: Kate Friedman Date: Thu Jan 18 22:29:43 2024 -0500 Add `POSTAMBLE_CMD` into preamble postamble (#2235) This PR adds the ability to run a command at the end of jobs via the preamble's postamble function. A new command can be set via `POSTAMBLE_CMD` and will be invoked at the end of jobs. Users can add the command to the top of an env file to have every job run it, or it can be placed within a job if-block in the env file to run for just that job (see the sketch below). Resolves #1145
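A minimal sketch of how `POSTAMBLE_CMD` from #2235 above might be set in an env file; the `step` variable, the job name, and the commands themselves are illustrative assumptions, not prescribed values:

```bash
# In an env file (e.g. env/HERA.env); illustrative usage only.

# Run a command at the end of every job:
export POSTAMBLE_CMD='echo "job ended at $(date -u)"'

# Or run it for a single job by placing it in that job's if-block
# (assuming the env file branches on a ${step} variable):
if [[ "${step}" == "fcst" ]]; then
  export POSTAMBLE_CMD='ls -l "${DATA}"'
fi
```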
commit 7759163c668eb6ccc11dc1ecd39c0dc5d433cdc1 Author: Rahul Mahajan Date: Thu Jan 18 01:13:49 2024 -0500 Add option to create WGNE atmosphere products (#2233) Adds option to create WGNE products for the atmosphere. Resolves #1951 Resolves #2226 commit d95ed8588379b2c94d771f9aa680c445d6cf6c5a Author: Rahul Mahajan Date: Wed Jan 17 18:04:56 2024 -0500 Use ufs.configure, model_configure, MOM_input, data_table, and ice_in templates from UFS-weather-model (#2051) This PR: - links `ufs.configure.*.IN`, `model_configure.IN`, `MOM_input_${OCNRES}.in`, `MOM6_data_table.IN`, and `ice_in.IN` from ufs-weather-model into the global-workflow `parm/ufs` directory - prepares local variables for use in the templates in the respective functions for FV3, MOM6, and CICE6 - uses `atparse.bash` to render these templates into fully formed namelists - Work was performed by @DeniseWorthen in the ufs-weather-model to templatize several variables that are hard-wired in the ufs-weather-model for CICE. See ufs-community/ufs-weather-model/#2010 commit 44a66dd508aac1f15e56377bb04eb1af6832fe5b Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Wed Jan 17 13:52:09 2024 -0500 Update gdas.cd hash to use gsibec tag 1.1.3 (#2231) (#2234) commit 9046d97b4cc0e0132633054f880b60422fd3b1c7 Author: Walter Kolczynski - NOAA Date: Wed Jan 17 10:19:18 2024 -0500 Add basic atmos products for GEFS and refactor resources (#2216) * Add basic atmos products for GEFS Adds basic atmosphere products for GEFS members (no mean or spread). To facilitate the addition of GEFS parameter lists, the existing GFS lists are renamed. They are also relocated to the `parm/products/` directory instead of `parm/post/`. GEFS v12 uses different parameter lists for different resolutions, but for this implementation, the 0p25 lists are used for all resolutions. However, to hedge against the final configuration, all of the GEFS parameter lists have been added to parm. Implementation of different parameter lists for each resolution will take additional refactoring of the atmos_products job, as it currently assumes all resolutions use the same lists. The new GEFS rocoto task is implemented as two loops of metatasks (made possible by PR #2189). The member is set at the rocoto level, so the setting in `config.base` had to be updated to accept a pass-through. The generation of a forecast hour list is moved up to its own function in `rocoto.tasks`, separate from the post groups which are not used. This can be used as a model for future metatasks (or refactoring of old ones). `rocoto_viewer.py` does function with the nested metatasks, though only one level of folding is available and the metatask name is that of the lower metatask (though it encompasses the entire outer metatask). Additionally, the GEFS `config.resources` had diverged from the GFS version, which included not having a section for the atm products job. This led to the refactoring of the GFS `config.resources`, including switching to using a `case` statement instead of sequential `elif` (sketched below). This refactored script was then copied over to GEFS, replacing the old one, with all jobs not currently implemented for GEFS removed (except wave post, whose inclusion is imminent). New blocks can be copied over from GFS as they are added. Resolves #823
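A sketch of the `case`-based lookup structure that #2216 above adopts for `config.resources`; the job names and resource values are placeholders, not the real settings:

```bash
#!/usr/bin/env bash
# Resources are looked up per job step; a case statement replaces
# a long chain of sequential 'elif' tests on ${step}.
step=${1:?"must specify a job step"}

case ${step} in
  "fcst")
    export walltime="06:00:00"
    export ntasks=128
    ;;
  "atmos_products")
    export walltime="00:15:00"
    export ntasks=24
    ;;
  *)
    echo "FATAL: resources not defined for step '${step}'" >&2
    exit 1
    ;;
esac
```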
commit fbba174c8b0a2e2d1c669ae3c74bec45fa995d6d Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Tue Jan 16 14:05:48 2024 -0500 Update GDASApp (gdas.cd) hash (#2225) GDASApp now uses git submodules commit 1344313c4fac78ba8d60fb0274c83395891af4ee Merge: 4b5889161 dddd83796 Author: Janet Derrico <143837053+jderrico-noaa@users.noreply.github.com> Date: Tue Jan 16 11:13:59 2024 -0700 Merge branch 'NOAA-EMC:develop' into develop commit dddd83796c8cd52f6264f73282fc68eb7857a475 Author: Rahul Mahajan Date: Tue Jan 16 12:17:19 2024 -0500 Fix ocean resolution for the S2SWA test (#2223) commit 83edaf145382fe15e53f0af8937570488ef5440d Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Tue Jan 16 11:07:02 2024 -0500 Apply fixes to archiving jobs (#2221) Add checks for potentially missing log/data files and update log names for archiving. Missing sflux index files are also added to the products script. Resolves #2076 commit c59047614c29f6ec15a4d4a6fa1855929287debd Author: Rahul Mahajan Date: Fri Jan 12 19:21:02 2024 -0500 Update hash to ufs-weather-model. The noahmptable.tbl was reorganized, so update link_workflow.sh to use the same one used in UFSWM RT (#2219) commit c041968e165c07da785376f1441374687b55d6cc Author: Kate Friedman Date: Fri Jan 12 12:09:35 2024 -0500 Add ocean resolution to setup_expt invocation and retire/reduce COMROT/ROTDIR usage (#2214) Two sets of updates: 1) Update setup scripts to now allow users to provide the ocean resolution 2) Housekeeping to retire the `COMROT` variable, replacing it with other variables as needed, and reduce the `ROTDIR` variable usage. Both updates change options for the workflow setup API. Refs #2061 commit 997f97816493d9ed0242e169b6dcf84e5c9338d6 Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Fri Jan 12 10:46:36 2024 -0500 Allow use of ocean obs prep in WCDA cycling and remove R2D2 (#2215) Enables use of the ocean obs prep task in WCDA cycling and removes R2D2 from same. Runs task gdasprepoceanobs before gdasocnanalprep -- obtains ocean data nc4 files from DMPDIR, processes them into IODA format and copies them to COM_OBS. Replaces the current R2D2 processing. commit 12a5bb192ab6282d44c008c06dc14947abd37249 Merge: 4cb580201 6492c2da5 Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Fri Jan 12 09:26:27 2024 -0500 Merge pull request #2217 from DavidHuber-NOAA/update/versions Update and clean up version and module files commit 6492c2da5f49bd31575d0bef5278c9a7f8ba3412 Author: David Huber Date: Thu Jan 11 11:49:08 2024 -0600 Update orion module/version files for met/metplus #2123 commit 94c9937f0bff380b0ebd6acdcfd2509e1ae99389 Author: DavidHuber Date: Thu Jan 11 17:42:53 2024 +0000 Comment met/metplus out from Hera modulefile. #2123
commit 8c32f8b7d00bba6e3e954f149e6548fb27713a3c Merge: a65e4c675 4cb580201 Author: David Huber Date: Thu Jan 11 11:34:44 2024 -0600 Merge branch 'develop' of github.com:noaa-emc/global-workflow into develop commit a65e4c675f21dbe72198d263e85e894e62b41e13 Author: David Huber Date: Thu Jan 11 11:34:19 2024 -0600 Initial update of version/module files #2123 commit 4b5889161abb10f131a0d398336a6f070b2b7d3c Merge: a75e73e2c 4cb580201 Author: Janet Derrico <143837053+jderrico-noaa@users.noreply.github.com> Date: Wed Jan 10 10:15:48 2024 -0700 Merge branch 'NOAA-EMC:develop' into develop commit 4cb580201af68c1c0e2ae19faf0727dcbbe43b4d Author: souopgui Date: Wed Jan 10 08:30:22 2024 -0600 Fix OpenMP over-allocation of resources in exglobal_atmos_products.sh when running MPMD tasks (#2212) Fix OpenMP over-allocation of resources running MPMD tasks Co-authored-by: Innocent Souopgui commit b056b531faee6929687cb8a588ffafa1a66426fb Author: TerrenceMcGuinness-NOAA Date: Mon Jan 8 17:28:05 2024 -0500 Add Hercules as valid machine in CI scripts (#2207) A few updates to CI scripts to include names for Hercules that were missed the first time. commit 6574d29a8c26b0695614874c344bccc5182363f4 Author: Rahul Mahajan Date: Mon Jan 8 17:25:47 2024 -0500 Fix invalid GH action and restart file name (#2210) Resolves a typo that leads to an invalid workflow yaml and fixes the restart filename in restart detection. Resolves #2205 commit 69605eac299df381ea9e0e329654487e26380ff5 Author: Rahul Mahajan Date: Mon Jan 8 17:00:28 2024 -0500 Stop attempting to comment link to RTD for non-PRs (#2209) Adds a check so comments with a link to documentation are only generated for PRs. commit 4e160a895bfb31c281e21557dd86c13954a1a967 Author: Rahul Mahajan Date: Mon Jan 8 13:10:15 2024 -0500 Enable UPP for GOES processing (#2203) Enables the creation of special master grib2 files from UPP for GOES processing commit c15875b6dbf685327af9316ee43b3d01f0fc815e Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Mon Jan 8 09:56:06 2024 -0500 Port cycling to Hercules (#2196) Adds cycled support for Hercules (excluding gsi-monitor). Partially resolves #1588 GSI monitoring is disabled on Hercules due to missing Perl modules. That will be enabled in a later PR. commit ef6827dd6abdab2996d1e19f9e0ff5d3071e0fdd Author: Walter Kolczynski - NOAA Date: Mon Jan 8 09:43:12 2024 -0500 Refactor rocoto task XML creation (#2189) Refactors the rocoto task generation to be recursive. This will allow nested metatasks to loop over multiple variables, which is needed for GEFS product generation. As part of this refactor, there are no longer separate arguments to designate metatasks. Instead, task dicts can include a nested 'task_dict' as well as a 'var_dict' containing the variables to loop over. The nested task dict can then either have another layer, or be the innermost task. To accommodate the new recursive nature, some defaults that were previously defined in create_wf_task() had to be pushed down into the function that creates the innermost task. Also, former keywords have been absorbed by the task dict.
Refs #823 Refs #827 commit 2b81cfac16f103fb29b3aeb9748941787abf7287 Author: Walter Kolczynski - NOAA Date: Mon Jan 8 09:41:03 2024 -0500 Update fix versions (#2198) Updates fix versions for a few components: - Update cice and mom6 versions to support C96/1p00 marine - Update wave to change betamax setting for glo_025 waves Resolves #2004 Resolves #2107 commit a75e73e2c1942f9e0c5da324a9e62edd931fc3fa Merge: 5adc47b72 9d901dbbf Author: Janet Derrico <143837053+jderrico-noaa@users.noreply.github.com> Date: Tue Jan 2 12:44:40 2024 -0700 Merge branch 'NOAA-EMC:develop' into develop commit 5adc47b727306dc58a4ce99f8fc805debe2a3e9c Merge: a3ce9db20 a3c50998d Author: Janet Derrico <143837053+jderrico-noaa@users.noreply.github.com> Date: Tue Dec 26 12:40:08 2023 -0700 Merge branch 'NOAA-EMC:develop' into develop commit a3ce9db20c6ff3444bd52c781e884af61a45db4e Merge: 2b1a8e3eb a81da33e8 Author: Janet Derrico <143837053+jderrico-noaa@users.noreply.github.com> Date: Tue Dec 19 09:12:48 2023 -0700 Merge branch 'NOAA-EMC:develop' into develop commit 2b1a8e3eb32ecffcd3eaa5f87d10e68cef881b22 Merge: 4562dec4f 9505cb4ab Author: Janet Derrico <143837053+jderrico-noaa@users.noreply.github.com> Date: Fri Dec 15 09:50:35 2023 -0700 Merge branch 'NOAA-EMC:develop' into develop commit 4562dec4f964e03ce844452a1e5903fd8e4a79f8 Merge: 9f7eebaf6 29d34172a Author: Janet Derrico <143837053+jderrico-noaa@users.noreply.github.com> Date: Mon Dec 11 12:10:25 2023 -0700 Merge pull request #34 from NOAA-EMC/develop Sync develop branch with EMC develop --- .github/pull_request_template.md | 11 + .github/workflows/ci_unit_tests.yaml | 64 + .github/workflows/docs.yaml | 2 +- .gitignore | 72 +- .gitmodules | 39 +- .shellcheckrc | 3 + INFO | 11 + ci/Jenkinsfile | 310 +++ ci/cases/gfsv17/C384mx025_3DVarAOWCDA.yaml | 18 + ci/cases/gfsv17/ocnanal.yaml | 27 + ci/cases/hires/C1152_S2SW.yaml | 14 + ci/cases/hires/C768_S2SW.yaml | 14 + ci/cases/pr/C48_ATM.yaml | 2 +- ci/cases/pr/C48_S2SW.yaml | 2 +- ci/cases/pr/C48_S2SWA_gefs.yaml | 5 +- ci/cases/pr/C48mx500_3DVarAOWCDA.yaml | 23 + ci/cases/pr/C96C48_hybatmDA.yaml | 2 +- ci/cases/pr/C96C48_ufs_hybatmDA.yaml | 24 + ci/cases/pr/C96_atm3DVar.yaml | 5 +- ci/cases/pr/C96_atm3DVar_extended.yaml | 22 + ci/cases/pr/C96_atmaerosnowDA.yaml | 21 + ci/cases/weekly/C384C192_hybatmda.yaml | 2 +- ci/cases/weekly/C384_S2SWA.yaml | 4 +- ci/cases/weekly/C384_atm3DVar.yaml | 2 +- ci/cases/yamls/atmaerosnowDA_defaults_ci.yaml | 5 + ci/cases/yamls/build.yaml | 3 + .../yamls}/gefs_ci_defaults.yaml | 2 +- .../yamls}/gfs_defaults_ci.yaml | 2 +- ci/cases/yamls/gfs_extended_ci.yaml | 13 + ci/cases/yamls/soca_gfs_defaults_ci.yaml | 5 + ci/cases/yamls/ufs_hybatmDA_defaults.ci.yaml | 20 + ci/platforms/config.hera | 6 +- ci/platforms/config.hercules | 6 +- ci/platforms/config.orion | 6 +- ci/platforms/config.wcoss2 | 7 + ci/scripts/check_ci.sh | 97 +- ci/scripts/clone-build_ci.sh | 3 +- ci/scripts/driver.sh | 98 +- ci/scripts/driver_weekly.sh | 2 +- ci/scripts/pr_list_database.py | 215 -- ci/scripts/run-check_ci.sh | 65 +- ci/scripts/run_ci.sh | 16 +- ci/scripts/tests/test_create_experiment.py | 29 + ci/scripts/tests/test_rocotostat.py | 89 + ci/scripts/tests/test_setup.py | 89 + ci/scripts/utils/ci_utils.sh | 159 +- ci/scripts/utils/ci_utils_wrapper.sh | 9 + ci/scripts/utils/get_host_case_list.py | 32 + ci/scripts/utils/githubpr.py | 124 + ci/scripts/utils/launch_java_agent.sh | 184 ++ ci/scripts/utils/parse_yaml.py | 70 + ci/scripts/utils/pr_list_database.py | 216 ++ ci/scripts/utils/publish_logs.py | 108 + 
ci/scripts/utils/rocotostat.py | 243 ++ ci/scripts/utils/wxflow | 1 + docs/doxygen/mainpage.h | 4 +- docs/source/clone.rst | 7 + docs/source/components.rst | 18 +- docs/source/conf.py | 7 +- docs/source/configure.rst | 112 +- docs/source/hpc.rst | 99 +- docs/source/index.rst | 6 +- docs/source/init.rst | 92 +- docs/source/setup.rst | 10 +- .../analysis/create/jenkfgdas_diag.ecf | 1 - .../analysis/create/jenkfgdas_select_obs.ecf | 1 - .../analysis/create/jenkfgdas_update.ecf | 1 - .../analysis/recenter/ecen/jenkfgdas_ecen.ecf | 1 - .../analysis/recenter/jenkfgdas_sfc.ecf | 1 - .../enkfgdas/forecast/jenkfgdas_fcst.ecf | 1 - .../enkfgdas/post/jenkfgdas_post_master.ecf | 1 - .../atmos/analysis/jgdas_atmos_analysis.ecf | 1 - .../analysis/jgdas_atmos_analysis_calc.ecf | 1 - .../analysis/jgdas_atmos_analysis_diag.ecf | 1 - .../gdas/atmos/gempak/jgdas_atmos_gempak.ecf | 7 +- .../gempak/jgdas_atmos_gempak_meta_ncdc.ecf | 1 - .../dump/jgdas_atmos_tropcy_qc_reloc.ecf | 1 - .../prep/jgdas_atmos_emcsfc_sfc_prep.ecf | 1 - .../atmos/post/jgdas_atmos_post_manager.ecf | 1 - .../atmos/post/jgdas_atmos_post_master.ecf | 1 - .../jgdas_atmos_chgres_forenkf.ecf | 1 - .../gdas/atmos/verf/jgdas_atmos_verfozn.ecf | 1 - .../gdas/atmos/verf/jgdas_atmos_verfrad.ecf | 1 - .../gdas/atmos/verf/jgdas_atmos_vminmon.ecf | 1 - ecf/scripts/gdas/jgdas_forecast.ecf | 1 - .../gdas/wave/init/jgdas_wave_init.ecf | 1 - .../gdas/wave/post/jgdas_wave_postpnt.ecf | 1 - .../gdas/wave/post/jgdas_wave_postsbs.ecf | 1 - .../gdas/wave/prep/jgdas_wave_prep.ecf | 1 - .../atmos/analysis/jgfs_atmos_analysis.ecf | 1 - .../analysis/jgfs_atmos_analysis_calc.ecf | 1 - .../gfs/atmos/gempak/jgfs_atmos_gempak.ecf | 6 +- .../atmos/gempak/jgfs_atmos_gempak_meta.ecf | 1 - .../gempak/jgfs_atmos_gempak_ncdc_upapgif.ecf | 1 - .../gempak/jgfs_atmos_npoess_pgrb2_0p5deg.ecf | 1 - .../gempak/jgfs_atmos_pgrb2_spec_gempak.ecf | 6 +- .../dump/jgfs_atmos_tropcy_qc_reloc.ecf | 1 - .../prep/jgfs_atmos_emcsfc_sfc_prep.ecf | 1 - .../atmos/post/jgfs_atmos_post_manager.ecf | 1 - .../gfs/atmos/post/jgfs_atmos_post_master.ecf | 1 - .../jgfs_atmos_awips_master.ecf | 1 - .../atmos/post_processing/awips_g2/.gitignore | 2 - .../awips_g2/jgfs_atmos_awips_g2_master.ecf | 61 - .../bufr_sounding/jgfs_atmos_postsnd.ecf | 1 - .../bulletins/jgfs_atmos_fbwind.ecf | 1 - .../gfs/atmos/verf/jgfs_atmos_vminmon.ecf | 1 - ecf/scripts/gfs/jgfs_forecast.ecf | 1 - .../gfs/wave/gempak/jgfs_wave_gempak.ecf | 1 - ecf/scripts/gfs/wave/init/jgfs_wave_init.ecf | 1 - .../gfs/wave/post/jgfs_wave_post_bndpnt.ecf | 1 - .../wave/post/jgfs_wave_post_bndpntbll.ecf | 1 - .../gfs/wave/post/jgfs_wave_postpnt.ecf | 1 - .../gfs/wave/post/jgfs_wave_postsbs.ecf | 1 - .../gfs/wave/post/jgfs_wave_prdgen_bulls.ecf | 1 - .../wave/post/jgfs_wave_prdgen_gridded.ecf | 1 - ecf/scripts/gfs/wave/prep/jgfs_wave_prep.ecf | 1 - env/AWSPW.env | 130 +- env/CONTAINER.env | 6 - env/GAEA.env | 66 + env/HERA.env | 259 +- env/HERCULES.env | 253 +- env/JET.env | 221 +- env/ORION.env | 251 +- env/S4.env | 220 +- env/WCOSS2.env | 241 +- gempak/fix/datatype.tbl | 14 +- gempak/fix/gfs_meta | 46 +- gempak/ush/gdas_ecmwf_meta_ver.sh | 145 +- gempak/ush/gdas_meta_loop.sh | 191 +- gempak/ush/gdas_meta_na.sh | 88 +- gempak/ush/gdas_ukmet_meta_ver.sh | 169 +- gempak/ush/gempak_gdas_f000_gif.sh | 295 +-- gempak/ush/gempak_gfs_f000_gif.sh | 584 +++++ gempak/ush/gempak_gfs_f00_gif.sh | 602 ----- gempak/ush/gempak_gfs_f12_gif.sh | 213 -- gempak/ush/gempak_gfs_f24_gif.sh | 231 -- gempak/ush/gempak_gfs_f36_gif.sh | 231 -- 
gempak/ush/gempak_gfs_f48_gif.sh | 231 -- gempak/ush/gempak_gfs_fhhh_gif.sh | 189 ++ gempak/ush/gfs_meta_ak.sh | 136 +- gempak/ush/gfs_meta_bwx.sh | 190 +- gempak/ush/gfs_meta_comp.sh | 1131 ++------- gempak/ush/gfs_meta_crb.sh | 177 +- gempak/ush/gfs_meta_hi.sh | 114 +- gempak/ush/gfs_meta_hur.sh | 210 +- gempak/ush/gfs_meta_mar_atl.sh | 68 +- gempak/ush/gfs_meta_mar_comp.sh | 1060 ++------ gempak/ush/gfs_meta_mar_pac.sh | 69 +- gempak/ush/gfs_meta_mar_ql.sh | 60 +- gempak/ush/gfs_meta_mar_skewt.sh | 57 +- gempak/ush/gfs_meta_mar_ver.sh | 51 +- gempak/ush/gfs_meta_nhsh.sh | 112 +- gempak/ush/gfs_meta_opc_na_ver | 312 +-- gempak/ush/gfs_meta_opc_np_ver | 311 +-- gempak/ush/gfs_meta_precip.sh | 142 +- gempak/ush/gfs_meta_qpf.sh | 144 +- gempak/ush/gfs_meta_sa.sh | 136 +- gempak/ush/gfs_meta_sa2.sh | 282 +-- gempak/ush/gfs_meta_trop.sh | 127 +- gempak/ush/gfs_meta_us.sh | 137 +- gempak/ush/gfs_meta_usext.sh | 162 +- gempak/ush/gfs_meta_ver.sh | 436 +--- jobs/JGDAS_ATMOS_ANALYSIS_DIAG | 5 +- jobs/JGDAS_ATMOS_CHGRES_FORENKF | 7 +- jobs/JGDAS_ATMOS_GEMPAK | 71 +- jobs/JGDAS_ATMOS_GEMPAK_META_NCDC | 68 +- jobs/JGDAS_ATMOS_VERFOZN | 4 +- jobs/JGDAS_ATMOS_VERFRAD | 6 +- jobs/JGDAS_ENKF_ARCHIVE | 12 +- jobs/JGDAS_ENKF_DIAG | 11 +- jobs/JGDAS_ENKF_ECEN | 11 +- jobs/JGDAS_ENKF_FCST | 84 - jobs/JGDAS_ENKF_POST | 1 - jobs/JGDAS_ENKF_SELECT_OBS | 15 +- jobs/JGDAS_ENKF_SFC | 13 +- jobs/JGDAS_ENKF_UPDATE | 5 +- jobs/JGDAS_FIT2OBS | 6 +- jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_CHKPT | 6 +- ..._VRFY => JGDAS_GLOBAL_OCEAN_ANALYSIS_ECEN} | 24 +- jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_POST | 6 +- jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_PREP | 16 +- jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_VRFY | 9 +- jobs/JGFS_ATMOS_AWIPS_20KM_1P0DEG | 17 +- jobs/JGFS_ATMOS_AWIPS_G2 | 65 - jobs/JGFS_ATMOS_CYCLONE_GENESIS | 8 +- jobs/JGFS_ATMOS_CYCLONE_TRACKER | 11 +- jobs/JGFS_ATMOS_FBWIND | 34 +- jobs/JGFS_ATMOS_FSU_GENESIS | 10 +- jobs/JGFS_ATMOS_GEMPAK | 173 +- jobs/JGFS_ATMOS_GEMPAK_META | 46 +- jobs/JGFS_ATMOS_GEMPAK_NCDC_UPAPGIF | 58 +- jobs/JGFS_ATMOS_GEMPAK_PGRB2_SPEC | 68 +- jobs/JGFS_ATMOS_PGRB2_SPEC_NPOESS | 19 +- jobs/JGFS_ATMOS_POSTSND | 17 +- jobs/JGFS_ATMOS_VERIFICATION | 4 +- jobs/JGFS_ATMOS_WAFS | 96 + jobs/JGFS_ATMOS_WAFS_BLENDING_0P25 | 153 ++ jobs/JGFS_ATMOS_WAFS_GCIP | 140 ++ jobs/JGFS_ATMOS_WAFS_GRIB2 | 124 + jobs/JGFS_ATMOS_WAFS_GRIB2_0P25 | 133 + jobs/JGLOBAL_AERO_ANALYSIS_FINALIZE | 6 +- jobs/JGLOBAL_AERO_ANALYSIS_INITIALIZE | 6 +- jobs/JGLOBAL_AERO_ANALYSIS_RUN | 2 +- jobs/JGLOBAL_ARCHIVE | 56 +- jobs/JGLOBAL_ATMENS_ANALYSIS_FINALIZE | 4 +- jobs/JGLOBAL_ATMENS_ANALYSIS_FV3_INCREMENT | 35 + jobs/JGLOBAL_ATMENS_ANALYSIS_INITIALIZE | 6 +- ...YSIS_RUN => JGLOBAL_ATMENS_ANALYSIS_LETKF} | 4 +- jobs/JGLOBAL_ATMOS_ANALYSIS | 14 +- jobs/JGLOBAL_ATMOS_ANALYSIS_CALC | 10 +- jobs/JGLOBAL_ATMOS_ENSSTAT | 48 + jobs/JGLOBAL_ATMOS_POST_MANAGER | 13 +- jobs/JGLOBAL_ATMOS_PRODUCTS | 6 +- jobs/JGLOBAL_ATMOS_SFCANL | 41 +- jobs/JGLOBAL_ATMOS_TROPCY_QC_RELOC | 3 +- jobs/JGLOBAL_ATMOS_UPP | 4 +- jobs/JGLOBAL_ATMOS_VMINMON | 6 +- jobs/JGLOBAL_ATM_ANALYSIS_FINALIZE | 6 +- jobs/JGLOBAL_ATM_ANALYSIS_FV3_INCREMENT | 37 + jobs/JGLOBAL_ATM_ANALYSIS_INITIALIZE | 8 +- ...S_RUN => JGLOBAL_ATM_ANALYSIS_VARIATIONAL} | 4 +- jobs/JGLOBAL_ATM_PREP_IODA_OBS | 6 +- jobs/JGLOBAL_CLEANUP | 8 +- jobs/JGLOBAL_EXTRACTVARS | 47 + jobs/JGLOBAL_FORECAST | 132 +- jobs/JGLOBAL_MARINE_ANALYSIS_LETKF | 50 + jobs/JGLOBAL_MARINE_BMAT | 66 + jobs/JGLOBAL_OCEANICE_PRODUCTS | 40 + jobs/JGLOBAL_PREP_EMISSIONS | 35 + ...AN_ANALYSIS_BMAT => JGLOBAL_PREP_OBS_AERO} | 25 +- 
jobs/JGLOBAL_PREP_OCEAN_OBS | 9 +- ...AL_PREP_LAND_OBS => JGLOBAL_PREP_SNOW_OBS} | 9 +- ...AL_LAND_ANALYSIS => JGLOBAL_SNOW_ANALYSIS} | 11 +- jobs/JGLOBAL_STAGE_IC | 7 +- jobs/JGLOBAL_WAVE_GEMPAK | 4 +- jobs/JGLOBAL_WAVE_INIT | 13 +- jobs/JGLOBAL_WAVE_POST_BNDPNT | 15 +- jobs/JGLOBAL_WAVE_POST_BNDPNTBLL | 15 +- jobs/JGLOBAL_WAVE_POST_PNT | 15 +- jobs/JGLOBAL_WAVE_POST_SBS | 18 +- jobs/JGLOBAL_WAVE_PRDGEN_BULLS | 8 +- jobs/JGLOBAL_WAVE_PRDGEN_GRIDDED | 9 +- jobs/JGLOBAL_WAVE_PREP | 19 +- jobs/rocoto/aeroanlfinal.sh | 5 - jobs/rocoto/aeroanlinit.sh | 6 - jobs/rocoto/aeroanlrun.sh | 6 - jobs/rocoto/arch.sh | 5 + jobs/rocoto/atmanlfinal.sh | 5 - jobs/rocoto/atmanlfv3inc.sh | 18 + jobs/rocoto/atmanlinit.sh | 6 - jobs/rocoto/atmanlvar.sh | 18 + jobs/rocoto/atmensanlfinal.sh | 5 - jobs/rocoto/atmensanlfv3inc.sh | 18 + jobs/rocoto/atmensanlinit.sh | 6 - jobs/rocoto/atmensanlletkf.sh | 18 + jobs/rocoto/atmos_ensstat.sh | 25 + jobs/rocoto/atmos_products.sh | 22 +- jobs/rocoto/awips_20km_1p0deg.sh | 6 +- jobs/rocoto/awips_g2.sh | 57 - jobs/rocoto/earc.sh | 6 +- jobs/rocoto/efcs.sh | 25 - jobs/rocoto/extractvars.sh | 23 + jobs/rocoto/gempak.sh | 5 +- .../{gempakpgrb2spec.sh => gempakgrb2spec.sh} | 5 +- .../rocoto/{landanl.sh => marineanalletkf.sh} | 9 +- jobs/rocoto/{ocnanalbmat.sh => marinebmat.sh} | 5 +- jobs/rocoto/oceanice_products.sh | 25 + jobs/rocoto/{atmanlrun.sh => ocnanalecen.sh} | 9 +- jobs/rocoto/ocnpost.sh | 119 - jobs/rocoto/prep.sh | 27 +- .../{atmensanlrun.sh => prep_emissions.sh} | 11 +- jobs/rocoto/prepatmiodaobs.sh | 8 +- jobs/rocoto/prepobsaero.sh | 18 + jobs/rocoto/prepoceanobs.sh | 5 + .../rocoto/{preplandobs.sh => prepsnowobs.sh} | 13 +- jobs/rocoto/remapgrib.sh | 110 + jobs/rocoto/snowanl.sh | 18 + jobs/rocoto/upp.sh | 29 +- modulefiles/module-setup.csh.inc | 87 - modulefiles/module-setup.sh.inc | 110 - modulefiles/module_base.gaea.lua | 39 + modulefiles/module_base.hera.lua | 18 +- modulefiles/module_base.hercules.lua | 26 +- modulefiles/module_base.jet.lua | 21 +- modulefiles/module_base.orion.lua | 28 +- modulefiles/module_base.s4.lua | 15 +- modulefiles/module_base.wcoss2.lua | 8 +- modulefiles/module_gwci.hercules.lua | 2 +- modulefiles/module_gwci.orion.lua | 8 +- modulefiles/module_gwci.wcoss2.lua | 8 + modulefiles/module_gwsetup.gaea.lua | 19 + modulefiles/module_gwsetup.hercules.lua | 6 +- modulefiles/module_gwsetup.jet.lua | 6 +- modulefiles/module_gwsetup.orion.lua | 9 +- modulefiles/module_gwsetup.s4.lua | 4 +- modulefiles/module_gwsetup.wcoss2.lua | 2 - parm/archive/arcdir.yaml.j2 | 156 ++ parm/archive/chem.yaml.j2 | 7 + parm/archive/enkf.yaml.j2 | 82 + parm/archive/enkf_grp.yaml.j2 | 29 + parm/archive/enkf_restarta_grp.yaml.j2 | 53 + parm/archive/enkf_restartb_grp.yaml.j2 | 37 + parm/archive/gdas.yaml.j2 | 163 ++ parm/archive/gdas_restarta.yaml.j2 | 54 + parm/archive/gdas_restartb.yaml.j2 | 39 + parm/archive/gdasice.yaml.j2 | 10 + parm/archive/gdasice_restart.yaml.j2 | 7 + parm/archive/gdasocean.yaml.j2 | 9 + parm/archive/gdasocean_analysis.yaml.j2 | 32 + parm/archive/gdasocean_restart.yaml.j2 | 8 + parm/archive/gdaswave.yaml.j2 | 8 + parm/archive/gdaswave_restart.yaml.j2 | 6 + parm/archive/gfs_downstream.yaml.j2 | 12 + parm/archive/gfs_flux.yaml.j2 | 9 + parm/archive/gfs_flux_1p00.yaml.j2 | 9 + parm/archive/gfs_netcdfa.yaml.j2 | 16 + parm/archive/gfs_netcdfb.yaml.j2 | 9 + parm/archive/gfs_pgrb2b.yaml.j2 | 19 + parm/archive/gfs_restarta.yaml.j2 | 23 + parm/archive/gfsa.yaml.j2 | 68 + parm/archive/gfsb.yaml.j2 | 20 + parm/archive/gfswave.yaml.j2 | 30 + 
parm/archive/ice_6hravg.yaml.j2 | 9 + parm/archive/ice_grib2.yaml.j2 | 19 + parm/archive/master_enkf.yaml.j2 | 102 + parm/archive/master_enkfgdas.yaml.j2 | 6 + parm/archive/master_enkfgfs.yaml.j2 | 6 + parm/archive/master_gdas.yaml.j2 | 109 + parm/archive/master_gfs.yaml.j2 | 113 + parm/archive/ocean_6hravg.yaml.j2 | 8 + parm/archive/ocean_grib2.yaml.j2 | 18 + parm/config/gefs/config.atmos_ensstat | 11 + parm/config/gefs/config.atmos_products | 28 + .../gefs/{config.base.emc.dyn => config.base} | 110 +- parm/config/gefs/config.efcs | 70 +- parm/config/gefs/config.extractvars | 41 + parm/config/gefs/config.fcst | 75 +- parm/config/gefs/config.oceanice_products | 15 + parm/config/gefs/config.prep_emissions | 11 + parm/config/gefs/config.resources | 680 ++---- parm/config/gefs/config.stage_ic | 15 + parm/config/gefs/config.ufs | 191 +- parm/config/gefs/config.wave | 41 +- parm/config/gefs/config.wavepostbndpnt | 11 + parm/config/gefs/config.wavepostbndpntbll | 11 + parm/config/gefs/config.wavepostpnt | 11 + parm/config/gefs/config.wavepostsbs | 27 + parm/config/gefs/yaml/defaults.yaml | 11 +- parm/config/gfs/config.aero | 9 +- parm/config/gfs/config.aeroanl | 17 +- parm/config/gfs/config.aeroanlfinal | 2 +- parm/config/gfs/config.aeroanlinit | 2 +- parm/config/gfs/config.aeroanlrun | 2 +- parm/config/gfs/config.anal | 70 +- parm/config/gfs/config.analcalc | 6 +- parm/config/gfs/config.atmanl | 31 +- parm/config/gfs/config.atmanlfv3inc | 14 + parm/config/gfs/config.atmanlinit | 1 + parm/config/gfs/config.atmanlrun | 11 - parm/config/gfs/config.atmanlvar | 11 + parm/config/gfs/config.atmensanl | 15 +- parm/config/gfs/config.atmensanlfv3inc | 14 + parm/config/gfs/config.atmensanlinit | 1 + parm/config/gfs/config.atmensanlletkf | 11 + parm/config/gfs/config.atmensanlrun | 11 - parm/config/gfs/config.atmos_products | 14 +- parm/config/gfs/config.awips | 3 - parm/config/gfs/config.base | 1 + ...onfig.base.emc.dyn_emc => config.base.emc} | 210 +- parm/config/gfs/config.base.emc.dyn | 1 - parm/config/gfs/config.base.emc.dyn_jet | 405 ---- ...fig.base.emc.dyn_hera => config.base.hera} | 206 +- parm/config/gfs/config.cleanup | 7 +- parm/config/gfs/config.com | 26 +- parm/config/gfs/config.earc | 20 +- parm/config/gfs/config.efcs | 60 +- parm/config/gfs/config.eobs | 3 +- parm/config/gfs/config.epos | 3 - parm/config/gfs/config.esfc | 13 +- parm/config/gfs/config.eupd | 2 +- parm/config/gfs/config.fbwind | 11 + parm/config/gfs/config.fcst | 167 +- parm/config/gfs/config.fit2obs | 4 +- parm/config/gfs/config.ice | 5 + parm/config/gfs/config.landanl | 34 - parm/config/gfs/config.marineanalletkf | 18 + parm/config/gfs/config.marinebmat | 11 + parm/config/gfs/config.metp | 13 +- parm/config/gfs/config.nsst | 5 + parm/config/gfs/config.oceanice_products | 15 + parm/config/gfs/config.ocn | 13 +- parm/config/gfs/config.ocnanal | 18 +- parm/config/gfs/config.ocnanalbmat | 11 - parm/config/gfs/config.ocnanalecen | 11 + parm/config/gfs/config.ocnpost | 29 - parm/config/gfs/config.postsnd | 1 - parm/config/gfs/config.prep | 15 +- parm/config/gfs/config.prepatmiodaobs | 3 - parm/config/gfs/config.preplandobs | 18 - parm/config/gfs/config.prepobsaero | 17 + parm/config/gfs/config.prepoceanobs | 11 +- parm/config/gfs/config.prepsnowobs | 21 + parm/config/gfs/config.resources | 2156 +++++++++-------- parm/config/gfs/config.resources.GAEA | 27 + parm/config/gfs/config.resources.HERA | 35 + parm/config/gfs/config.resources.HERCULES | 16 + parm/config/gfs/config.resources.JET | 52 + parm/config/gfs/config.resources.ORION 
| 17 + parm/config/gfs/config.resources.S4 | 59 + parm/config/gfs/config.resources.WCOSS2 | 59 + parm/config/gfs/config.sfcanl | 5 + parm/config/gfs/config.snowanl | 30 + parm/config/gfs/config.stage_ic | 22 +- .../gfs/config.ufs_c768_12x12_2th_1wg40wt | 443 ++-- .../gfs/config.ufs_c768_16x16_2th_2wg40wt | 443 ++-- parm/config/gfs/config.upp | 2 +- parm/config/gfs/config.verfozn | 7 +- parm/config/gfs/config.verfrad | 5 +- parm/config/gfs/config.vminmon | 5 +- parm/config/gfs/config.wave | 64 +- parm/config/gfs/config.wavepostbndpnt | 2 +- parm/config/gfs/config.wavepostbndpntbll | 2 +- parm/config/gfs/config.wavepostpnt | 2 +- parm/config/gfs/config.wavepostsbs | 3 +- parm/config/gfs/config.waveprep | 2 +- parm/config/gfs/yaml/defaults.yaml | 42 +- parm/config/gfs/yaml/test_ci.yaml | 2 +- parm/gdas/aero_crtm_coeff.yaml | 13 - parm/gdas/aero_crtm_coeff.yaml.j2 | 13 + parm/gdas/aero_jedi_fix.yaml | 11 - parm/gdas/aero_jedi_fix.yaml.j2 | 7 + ...crtm_coeff.yaml => atm_crtm_coeff.yaml.j2} | 0 parm/gdas/atm_jedi_fix.yaml | 7 - parm/gdas/atm_jedi_fix.yaml.j2 | 9 + parm/gdas/land_jedi_fix.yaml | 7 - parm/gdas/snow_jedi_fix.yaml.j2 | 7 + parm/gdas/staging/atm_berror_gsibec.yaml.j2 | 8 + parm/gdas/staging/atm_lgetkf_bkg.yaml.j2 | 32 + parm/gdas/staging/atm_var_bkg.yaml.j2 | 14 + parm/gdas/staging/atm_var_fv3ens.yaml.j2 | 24 + parm/post/oceanice_products.yaml | 75 + parm/post/oceanice_products_gefs.yaml | 73 + parm/post/upp.yaml | 30 +- parm/product/bufr_ij9km.txt | 2115 ++++++++++++++++ parm/product/gefs.0p25.f000.paramlist.a.txt | 39 + parm/product/gefs.0p25.f000.paramlist.b.txt | 522 ++++ parm/product/gefs.0p25.fFFF.paramlist.a.txt | 38 + parm/product/gefs.0p25.fFFF.paramlist.b.txt | 554 +++++ parm/product/gefs.0p50.f000.paramlist.a.txt | 80 + parm/product/gefs.0p50.f000.paramlist.b.txt | 474 ++++ parm/product/gefs.0p50.fFFF.paramlist.a.txt | 87 + parm/product/gefs.0p50.fFFF.paramlist.b.txt | 506 ++++ parm/product/gefs.1p00.f000.paramlist.a.txt | 1 + parm/product/gefs.1p00.f000.paramlist.b.txt | 1 + parm/product/gefs.1p00.fFFF.paramlist.a.txt | 1 + parm/product/gefs.1p00.fFFF.paramlist.b.txt | 1 + parm/product/gefs.2p50.f000.paramlist.a.txt | 23 + parm/product/gefs.2p50.f000.paramlist.b.txt | 530 ++++ parm/product/gefs.2p50.fFFF.paramlist.a.txt | 22 + parm/product/gefs.2p50.fFFF.paramlist.b.txt | 571 +++++ parm/product/gefs_ice_shortparmlist.parm | 10 + parm/product/gefs_ocn_shortparmlist.parm | 9 + parm/product/gefs_shortparmlist_2d.parm | 38 + parm/product/gefs_shortparmlist_3d_d.parm | 34 + parm/product/gefs_shortparmlist_3d_h.parm | 45 + parm/product/gefs_wav_shortparmlist.parm | 3 + .../gfs.anl.paramlist.a.txt} | 1 - .../gfs.f000.paramlist.a.txt} | 2 - .../gfs.fFFF.paramlist.a.txt} | 3 - .../gfs.fFFF.paramlist.b.txt} | 0 ...ist => transfer_gfs_enkfgdas_enkf_05.list} | 0 ...ist => transfer_gfs_enkfgdas_enkf_10.list} | 0 ...ist => transfer_gfs_enkfgdas_enkf_15.list} | 0 ...ist => transfer_gfs_enkfgdas_enkf_20.list} | 0 ...ist => transfer_gfs_enkfgdas_enkf_25.list} | 0 ...ist => transfer_gfs_enkfgdas_enkf_30.list} | 0 ...ist => transfer_gfs_enkfgdas_enkf_35.list} | 0 ...ist => transfer_gfs_enkfgdas_enkf_40.list} | 0 ...ist => transfer_gfs_enkfgdas_enkf_45.list} | 0 ...ist => transfer_gfs_enkfgdas_enkf_50.list} | 0 ...ist => transfer_gfs_enkfgdas_enkf_55.list} | 0 ...ist => transfer_gfs_enkfgdas_enkf_60.list} | 0 ...ist => transfer_gfs_enkfgdas_enkf_65.list} | 0 ...ist => transfer_gfs_enkfgdas_enkf_70.list} | 0 ...ist => transfer_gfs_enkfgdas_enkf_75.list} | 0 ...ist => 
transfer_gfs_enkfgdas_enkf_80.list} | 0 ...t => transfer_gfs_enkfgdas_enkf_misc.list} | 0 ...1a.list => transfer_gfs_gdas_gdas_1a.list} | 0 ...1b.list => transfer_gfs_gdas_gdas_1b.list} | 0 ...1c.list => transfer_gfs_gdas_gdas_1c.list} | 0 ....list => transfer_gfs_gdas_gdas_misc.list} | 0 ...fer_gfs_1.list => transfer_gfs_gfs_1.list} | 0 ...gfs_10a.list => transfer_gfs_gfs_10a.list} | 0 ...gfs_10b.list => transfer_gfs_gfs_10b.list} | 0 ...fer_gfs_2.list => transfer_gfs_gfs_2.list} | 0 ...fer_gfs_3.list => transfer_gfs_gfs_3.list} | 0 ...fer_gfs_4.list => transfer_gfs_gfs_4.list} | 0 ...fer_gfs_5.list => transfer_gfs_gfs_5.list} | 0 ...fer_gfs_6.list => transfer_gfs_gfs_6.list} | 0 ...fer_gfs_7.list => transfer_gfs_gfs_7.list} | 0 ...fer_gfs_8.list => transfer_gfs_gfs_8.list} | 0 ...r_gfs_9a.list => transfer_gfs_gfs_9a.list} | 0 ...r_gfs_9b.list => transfer_gfs_gfs_9b.list} | 0 ...mpak.list => transfer_gfs_gfs_gempak.list} | 0 ...s_misc.list => transfer_gfs_gfs_misc.list} | 0 ...transfer_rdhpcs_gfs_gdas_enkf_enkf_1.list} | 4 +- ...transfer_rdhpcs_gfs_gdas_enkf_enkf_2.list} | 5 +- ...transfer_rdhpcs_gfs_gdas_enkf_enkf_3.list} | 6 +- ...transfer_rdhpcs_gfs_gdas_enkf_enkf_4.list} | 6 +- ...transfer_rdhpcs_gfs_gdas_enkf_enkf_5.list} | 4 +- ...transfer_rdhpcs_gfs_gdas_enkf_enkf_6.list} | 4 +- ...transfer_rdhpcs_gfs_gdas_enkf_enkf_7.list} | 4 +- ...transfer_rdhpcs_gfs_gdas_enkf_enkf_8.list} | 4 +- ...ist => transfer_rdhpcs_gfs_gdas_gdas.list} | 6 +- ...s.list => transfer_rdhpcs_gfs_gempak.list} | 1 + ..._gfs.list => transfer_rdhpcs_gfs_gfs.list} | 3 +- parm/ufs/fix/gfs/atmos.fixed_files.yaml | 94 +- parm/ufs/fv3/data_table | 1 - parm/ufs/fv3/diag_table | 108 +- parm/ufs/fv3/diag_table_aod | 2 +- parm/ufs/fv3/diag_table_da | 20 +- .../field_table_thompson_aero_tke_progsigma | 10 +- parm/ufs/gocart/ExtData.other | 36 +- parm/ufs/gocart/SU2G_instance_SU.rc | 4 +- parm/ufs/mom6/MOM_input_template_025 | 902 ------- parm/ufs/mom6/MOM_input_template_050 | 947 -------- parm/ufs/mom6/MOM_input_template_100 | 866 ------- parm/ufs/mom6/MOM_input_template_500 | 592 ----- parm/ufs/ufs.configure.atm.IN | 22 - parm/ufs/ufs.configure.atm_aero.IN | 40 - parm/ufs/ufs.configure.blocked_atm_wav.IN | 41 - parm/ufs/ufs.configure.cpld.IN | 122 - parm/ufs/ufs.configure.cpld_aero.IN | 134 - parm/ufs/ufs.configure.cpld_aero_outerwave.IN | 151 -- parm/ufs/ufs.configure.cpld_aero_wave.IN | 151 -- parm/ufs/ufs.configure.cpld_outerwave.IN | 139 -- parm/ufs/ufs.configure.cpld_wave.IN | 139 -- parm/ufs/ufs.configure.leapfrog_atm_wav.IN | 41 - parm/wave/at_10m_interp.inp.tmpl | 2 +- parm/wave/ep_10m_interp.inp.tmpl | 2 +- parm/wave/glo_15mxt_interp.inp.tmpl | 6 +- parm/wave/glo_200_interp.inp.tmpl | 12 + parm/wave/glo_30m_interp.inp.tmpl | 6 +- parm/wave/wc_10m_interp.inp.tmpl | 2 +- parm/wave/ww3_grib2.glo_100.inp.tmpl | 9 + parm/wmo/grib2_awpgfs_20km_akf003 | 8 +- parm/wmo/grib2_awpgfs_20km_akf006 | 8 +- parm/wmo/grib2_awpgfs_20km_akf009 | 8 +- parm/wmo/grib2_awpgfs_20km_akf012 | 8 +- parm/wmo/grib2_awpgfs_20km_akf015 | 8 +- parm/wmo/grib2_awpgfs_20km_akf018 | 8 +- parm/wmo/grib2_awpgfs_20km_akf021 | 8 +- parm/wmo/grib2_awpgfs_20km_akf024 | 8 +- parm/wmo/grib2_awpgfs_20km_akf027 | 8 +- parm/wmo/grib2_awpgfs_20km_akf030 | 8 +- parm/wmo/grib2_awpgfs_20km_akf033 | 8 +- parm/wmo/grib2_awpgfs_20km_akf036 | 8 +- parm/wmo/grib2_awpgfs_20km_akf039 | 8 +- parm/wmo/grib2_awpgfs_20km_akf042 | 8 +- parm/wmo/grib2_awpgfs_20km_akf045 | 8 +- parm/wmo/grib2_awpgfs_20km_akf048 | 8 +- parm/wmo/grib2_awpgfs_20km_akf051 | 8 +- 
parm/wmo/grib2_awpgfs_20km_akf054 | 8 +- parm/wmo/grib2_awpgfs_20km_akf057 | 8 +- parm/wmo/grib2_awpgfs_20km_akf060 | 8 +- parm/wmo/grib2_awpgfs_20km_akf063 | 8 +- parm/wmo/grib2_awpgfs_20km_akf066 | 8 +- parm/wmo/grib2_awpgfs_20km_akf069 | 8 +- parm/wmo/grib2_awpgfs_20km_akf072 | 8 +- parm/wmo/grib2_awpgfs_20km_akf075 | 8 +- parm/wmo/grib2_awpgfs_20km_akf078 | 8 +- parm/wmo/grib2_awpgfs_20km_akf081 | 8 +- parm/wmo/grib2_awpgfs_20km_akf084 | 8 +- parm/wmo/grib2_awpgfs_20km_akf090 | 8 +- parm/wmo/grib2_awpgfs_20km_akf096 | 8 +- parm/wmo/grib2_awpgfs_20km_akf102 | 8 +- parm/wmo/grib2_awpgfs_20km_akf108 | 8 +- parm/wmo/grib2_awpgfs_20km_akf114 | 8 +- parm/wmo/grib2_awpgfs_20km_akf120 | 8 +- parm/wmo/grib2_awpgfs_20km_akf126 | 8 +- parm/wmo/grib2_awpgfs_20km_akf132 | 8 +- parm/wmo/grib2_awpgfs_20km_akf138 | 8 +- parm/wmo/grib2_awpgfs_20km_akf144 | 8 +- parm/wmo/grib2_awpgfs_20km_akf150 | 8 +- parm/wmo/grib2_awpgfs_20km_akf156 | 8 +- parm/wmo/grib2_awpgfs_20km_akf162 | 8 +- parm/wmo/grib2_awpgfs_20km_akf168 | 8 +- parm/wmo/grib2_awpgfs_20km_akf174 | 8 +- parm/wmo/grib2_awpgfs_20km_akf180 | 8 +- parm/wmo/grib2_awpgfs_20km_akf186 | 8 +- parm/wmo/grib2_awpgfs_20km_akf192 | 8 +- parm/wmo/grib2_awpgfs_20km_akf198 | 8 +- parm/wmo/grib2_awpgfs_20km_akf204 | 8 +- parm/wmo/grib2_awpgfs_20km_akf210 | 8 +- parm/wmo/grib2_awpgfs_20km_akf216 | 8 +- parm/wmo/grib2_awpgfs_20km_akf222 | 8 +- parm/wmo/grib2_awpgfs_20km_akf228 | 8 +- parm/wmo/grib2_awpgfs_20km_akf234 | 8 +- parm/wmo/grib2_awpgfs_20km_akf240 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf003 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf006 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf009 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf012 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf015 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf018 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf021 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf024 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf027 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf030 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf033 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf036 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf039 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf042 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf045 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf048 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf051 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf054 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf057 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf060 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf063 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf066 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf069 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf072 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf075 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf078 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf081 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf084 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf090 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf096 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf102 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf108 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf114 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf120 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf126 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf132 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf138 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf144 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf150 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf156 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf162 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf168 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf174 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf180 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf186 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf192 | 8 +- 
parm/wmo/grib2_awpgfs_20km_conusf198 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf204 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf210 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf216 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf222 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf228 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf234 | 8 +- parm/wmo/grib2_awpgfs_20km_conusf240 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf003 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf006 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf009 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf012 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf015 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf018 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf021 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf024 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf027 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf030 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf033 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf036 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf039 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf042 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf045 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf048 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf051 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf054 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf057 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf060 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf063 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf066 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf069 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf072 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf075 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf078 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf081 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf084 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf090 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf096 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf102 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf108 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf114 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf120 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf126 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf132 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf138 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf144 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf150 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf156 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf162 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf168 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf174 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf180 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf186 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf192 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf198 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf204 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf210 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf216 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf222 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf228 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf234 | 8 +- parm/wmo/grib2_awpgfs_20km_pacf240 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof003 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof006 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof009 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof012 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof015 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof018 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof021 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof024 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof027 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof030 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof033 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof036 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof039 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof042 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof045 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof048 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof051 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof054 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof057 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof060 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof063 | 8 +- 
parm/wmo/grib2_awpgfs_20km_pricof066 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof069 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof072 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof075 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof078 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof081 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof084 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof090 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof096 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof102 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof108 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof114 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof120 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof126 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof132 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof138 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof144 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof150 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof156 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof162 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof168 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof174 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof180 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof186 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof192 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof198 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof204 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof210 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof216 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof222 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof228 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof234 | 8 +- parm/wmo/grib2_awpgfs_20km_pricof240 | 8 +- parm/wmo/grib_awpgfs000.211 | 387 --- parm/wmo/grib_awpgfs006.211 | 405 ---- parm/wmo/grib_awpgfs012.211 | 405 ---- parm/wmo/grib_awpgfs018.211 | 405 ---- parm/wmo/grib_awpgfs024.211 | 405 ---- parm/wmo/grib_awpgfs030.211 | 405 ---- parm/wmo/grib_awpgfs036.211 | 405 ---- parm/wmo/grib_awpgfs042.211 | 405 ---- parm/wmo/grib_awpgfs048.211 | 405 ---- parm/wmo/grib_awpgfs054.211 | 409 ---- parm/wmo/grib_awpgfs060.211 | 409 ---- parm/wmo/grib_awpgfs066.211 | 409 ---- parm/wmo/grib_awpgfs072.211 | 409 ---- parm/wmo/grib_awpgfs078.211 | 409 ---- parm/wmo/grib_awpgfs084.211 | 409 ---- parm/wmo/grib_awpgfs090.211 | 409 ---- parm/wmo/grib_awpgfs096.211 | 409 ---- parm/wmo/grib_awpgfs102.211 | 409 ---- parm/wmo/grib_awpgfs108.211 | 409 ---- parm/wmo/grib_awpgfs114.211 | 409 ---- parm/wmo/grib_awpgfs120.211 | 409 ---- parm/wmo/grib_awpgfs126.211 | 371 --- parm/wmo/grib_awpgfs132.211 | 371 --- parm/wmo/grib_awpgfs138.211 | 371 --- parm/wmo/grib_awpgfs144.211 | 371 --- parm/wmo/grib_awpgfs150.211 | 371 --- parm/wmo/grib_awpgfs156.211 | 371 --- parm/wmo/grib_awpgfs162.211 | 371 --- parm/wmo/grib_awpgfs168.211 | 371 --- parm/wmo/grib_awpgfs174.211 | 371 --- parm/wmo/grib_awpgfs180.211 | 371 --- parm/wmo/grib_awpgfs186.211 | 371 --- parm/wmo/grib_awpgfs192.211 | 371 --- parm/wmo/grib_awpgfs198.211 | 371 --- parm/wmo/grib_awpgfs204.211 | 371 --- parm/wmo/grib_awpgfs210.211 | 371 --- parm/wmo/grib_awpgfs216.211 | 371 --- parm/wmo/grib_awpgfs222.211 | 371 --- parm/wmo/grib_awpgfs228.211 | 371 --- parm/wmo/grib_awpgfs234.211 | 371 --- parm/wmo/grib_awpgfs240.211 | 371 --- scripts/exgdas_atmos_chgres_forenkf.sh | 17 +- scripts/exgdas_atmos_gempak_gif_ncdc.sh | 73 +- scripts/exgdas_atmos_nawips.sh | 197 +- scripts/exgdas_atmos_verfozn.sh | 2 +- scripts/exgdas_atmos_verfrad.sh | 17 +- scripts/exgdas_enkf_earc.py | 60 + scripts/exgdas_enkf_earc.sh | 163 -- scripts/exgdas_enkf_ecen.sh | 46 +- scripts/exgdas_enkf_fcst.sh | 225 -- scripts/exgdas_enkf_post.sh | 23 +- scripts/exgdas_enkf_select_obs.sh | 7 +- scripts/exgdas_enkf_sfc.sh | 55 +- scripts/exgdas_enkf_update.sh | 65 +- 
.../exgdas_global_marine_analysis_letkf.py | 24 + scripts/exgfs_aero_init_aerosol.py | 18 +- scripts/exgfs_atmos_awips_20km_1p0deg.sh | 32 +- scripts/exgfs_atmos_fbwind.sh | 53 +- scripts/exgfs_atmos_gempak_gif_ncdc_skew_t.sh | 126 +- scripts/exgfs_atmos_gempak_meta.sh | 191 +- scripts/exgfs_atmos_goes_nawips.sh | 167 +- scripts/exgfs_atmos_grib2_special_npoess.sh | 90 +- scripts/exgfs_atmos_grib_awips.sh | 135 -- scripts/exgfs_atmos_nawips.sh | 200 +- scripts/exgfs_atmos_postsnd.sh | 37 +- scripts/exgfs_atmos_wafs_blending_0p25.sh | 298 +++ scripts/exgfs_atmos_wafs_gcip.sh | 242 ++ scripts/exgfs_atmos_wafs_grib.sh | 146 ++ scripts/exgfs_atmos_wafs_grib2.sh | 227 ++ scripts/exgfs_atmos_wafs_grib2_0p25.sh | 200 ++ scripts/exgfs_pmgr.sh | 2 +- scripts/exgfs_prdgen_manager.sh | 2 +- scripts/exgfs_wave_init.sh | 42 +- scripts/exgfs_wave_nawips.sh | 31 +- scripts/exgfs_wave_post_gridded_sbs.sh | 55 +- scripts/exgfs_wave_post_pnt.sh | 136 +- scripts/exgfs_wave_prdgen_bulls.sh | 28 +- scripts/exgfs_wave_prdgen_gridded.sh | 55 +- scripts/exgfs_wave_prep.sh | 51 +- scripts/exglobal_archive.py | 63 + scripts/exglobal_archive.sh | 1 - .../exglobal_atm_analysis_fv3_increment.py | 23 + ...y => exglobal_atm_analysis_variational.py} | 4 +- .../exglobal_atmens_analysis_fv3_increment.py | 23 + ...n.py => exglobal_atmens_analysis_letkf.py} | 6 +- scripts/exglobal_atmos_analysis.sh | 110 +- scripts/exglobal_atmos_analysis_calc.sh | 26 +- scripts/exglobal_atmos_ensstat.sh | 19 + scripts/exglobal_atmos_pmgr.sh | 2 +- scripts/exglobal_atmos_products.sh | 40 +- scripts/exglobal_atmos_sfcanl.sh | 234 +- scripts/exglobal_atmos_tropcy_qc_reloc.sh | 6 +- scripts/exglobal_atmos_vminmon.sh | 8 +- scripts/exglobal_cleanup.sh | 41 +- scripts/exglobal_diag.sh | 18 +- scripts/exglobal_extractvars.sh | 53 + scripts/exglobal_forecast.sh | 70 +- scripts/exglobal_marinebmat.py | 24 + scripts/exglobal_oceanice_products.py | 52 + scripts/exglobal_prep_emissions.py | 25 + scripts/exglobal_prep_obs_aero.py | 23 + ..._land_obs.py => exglobal_prep_snow_obs.py} | 16 +- ..._analysis.py => exglobal_snow_analysis.py} | 12 +- scripts/exglobal_stage_ic.sh | 87 +- scripts/run_reg2grb2.sh | 72 - scripts/run_regrid.sh | 27 - sorc/build_all.sh | 132 +- sorc/build_gdas.sh | 15 +- sorc/build_gfs_utils.sh | 10 +- sorc/build_gsi_enkf.sh | 6 +- sorc/build_gsi_monitor.sh | 10 +- sorc/build_gsi_utils.sh | 10 +- sorc/build_ufs.sh | 18 +- sorc/build_ufs_utils.sh | 11 +- sorc/build_upp.sh | 27 +- sorc/build_ww3prepost.sh | 28 +- sorc/gdas.cd | 2 +- sorc/gfs_utils.fd | 2 +- sorc/gsi_enkf.fd | 2 +- sorc/gsi_monitor.fd | 2 +- sorc/gsi_utils.fd | 2 +- sorc/link_workflow.sh | 190 +- sorc/ncl.setup | 12 - sorc/ufs_model.fd | 2 +- sorc/ufs_utils.fd | 2 +- sorc/verif-global.fd | 2 +- sorc/wxflow | 2 +- test/f90nmlcmp.sh | 19 + test/g2cmp.sh | 20 + test/nccmp.sh | 15 + ush/atmos_ensstat.sh | 99 + ush/atmos_extractvars.sh | 98 + ush/bash_utils.sh | 126 + ush/calcanl_gfs.py | 35 +- ush/check_ice_netcdf.sh | 43 + ush/detect_machine.sh | 51 +- ush/extractvars_tools.sh | 60 + ush/file_utils.sh | 0 ush/forecast_det.sh | 211 +- ush/forecast_postdet.sh | 1374 +++++------ ush/forecast_predet.sh | 675 +++++- ush/fv3gfs_remap.sh | 118 - ush/gaussian_sfcanl.sh | 37 +- ush/getdump.sh | 12 +- ush/getges.sh | 2 +- ush/gfs_bfr2gpk.sh | 2 +- ush/gfs_bufr.sh | 53 +- ush/gfs_bufr_netcdf.sh | 38 +- ush/gfs_sndp.sh | 6 +- ush/gfs_truncate_enkf.sh | 17 +- ush/global_savefits.sh | 2 +- ush/hpssarch_gen.sh | 1 - ush/icepost.ncl | 382 --- ush/interp_atmos_master.sh | 4 +- 
ush/interp_atmos_sflux.sh | 4 +- ush/jjob_header.sh | 5 +- ush/link_crtm_fix.sh | 12 +- ush/load_fv3gfs_modules.sh | 52 +- ush/load_ufsda_modules.sh | 64 +- ush/minmon_xtrct_costs.pl | 5 +- ush/minmon_xtrct_gnorms.pl | 5 +- ush/minmon_xtrct_reduct.pl | 6 +- ush/module-setup.sh | 45 +- ush/oceanice_nc2grib2.sh | 319 +++ ush/ocnice_extractvars.sh | 66 + ush/ocnpost.ncl | 588 ----- ush/ozn_xtrct.sh | 6 +- ush/parsing_model_configure_DATM.sh | 38 - ush/parsing_model_configure_FV3.sh | 118 +- ush/parsing_namelists_CICE.sh | 462 +--- ush/parsing_namelists_FV3.sh | 130 +- ush/parsing_namelists_FV3_nest.sh | 834 +++++++ ush/parsing_namelists_MOM6.sh | 137 +- ush/parsing_namelists_WW3.sh | 18 +- ..._configure.sh => parsing_ufs_configure.sh} | 35 +- ush/preamble.sh | 85 +- ush/python/pygfs/__init__.py | 18 + ush/python/pygfs/task/aero_analysis.py | 61 +- ush/python/pygfs/task/aero_emissions.py | 84 + ush/python/pygfs/task/aero_prepobs.py | 236 ++ ush/python/pygfs/task/analysis.py | 323 ++- ush/python/pygfs/task/archive.py | 427 ++++ ush/python/pygfs/task/atm_analysis.py | 355 +-- ush/python/pygfs/task/atmens_analysis.py | 197 +- ush/python/pygfs/task/marine_bmat.py | 350 +++ ush/python/pygfs/task/marine_letkf.py | 147 ++ ush/python/pygfs/task/oceanice_products.py | 356 +++ .../{land_analysis.py => snow_analysis.py} | 233 +- ush/python/pygfs/task/upp.py | 17 +- ush/python/pygfs/utils/__init__.py | 0 ush/python/pygfs/utils/marine_da_utils.py | 99 + ush/radmon_err_rpt.sh | 5 +- ush/radmon_verf_angle.sh | 7 +- ush/radmon_verf_bcoef.sh | 3 +- ush/radmon_verf_bcor.sh | 3 +- ush/radmon_verf_time.sh | 8 +- ush/rstprod.sh | 2 +- ush/run_mpmd.sh | 2 +- ush/syndat_getjtbul.sh | 15 +- ush/syndat_qctropcy.sh | 56 +- ush/tropcy_relocate.sh | 98 +- ush/tropcy_relocate_extrkr.sh | 26 +- ush/wafs_mkgbl.sh | 152 ++ ush/wave_extractvars.sh | 34 + ush/wave_grib2_sbs.sh | 40 +- ush/wave_grid_interp_sbs.sh | 42 +- ush/wave_grid_moddef.sh | 26 +- ush/wave_outp_cat.sh | 2 +- ush/wave_outp_spec.sh | 56 +- ush/wave_prnc_cur.sh | 14 +- ush/wave_prnc_ice.sh | 18 +- ush/wave_tar.sh | 72 +- versions/build.gaea.ver | 6 + versions/build.hercules.ver | 3 + versions/build.jet.ver | 2 + versions/build.orion.ver | 6 +- versions/build.s4.ver | 2 + versions/build.spack.ver | 9 +- versions/build.wcoss2.ver | 2 +- versions/fix.ver | 12 +- versions/run.gaea.ver | 6 + versions/run.hera.ver | 6 +- versions/run.hercules.ver | 9 +- versions/run.jet.ver | 5 + versions/run.orion.ver | 9 +- versions/run.s4.ver | 2 + versions/run.spack.ver | 26 +- versions/run.wcoss2.ver | 8 +- workflow/applications/applications.py | 63 +- workflow/applications/gefs.py | 46 +- workflow/applications/gfs_cycled.py | 111 +- workflow/applications/gfs_forecast_only.py | 30 +- workflow/create_experiment.py | 24 +- workflow/gsl_template_hera.xml | 85 +- workflow/hosts.py | 10 +- workflow/hosts/awspw.yaml | 2 + workflow/hosts/container.yaml | 2 + workflow/hosts/gaea.yaml | 27 + workflow/hosts/hera_gsl.yaml | 8 +- workflow/hosts/hercules.yaml | 11 +- workflow/hosts/jet_gsl.yaml | 3 + workflow/hosts/orion.yaml | 9 +- workflow/hosts/s4.yaml | 2 + workflow/hosts/wcoss2.yaml | 5 + workflow/prod.yml | 11 - workflow/rocoto/gefs_tasks.py | 463 +++- workflow/rocoto/gfs_tasks.py | 1158 ++++----- workflow/rocoto/rocoto.py | 29 +- workflow/rocoto/tasks_emc.py | 128 +- workflow/rocoto/tasks_gsl.py | 130 +- workflow/rocoto/workflow_tasks.py | 8 +- workflow/rocoto_viewer.py | 2 +- workflow/setup_expt.py | 91 +- 1014 files changed, 30829 insertions(+), 40315 deletions(-) create mode 
100644 .github/workflows/ci_unit_tests.yaml create mode 100644 ci/Jenkinsfile create mode 100644 ci/cases/gfsv17/C384mx025_3DVarAOWCDA.yaml create mode 100644 ci/cases/gfsv17/ocnanal.yaml create mode 100644 ci/cases/hires/C1152_S2SW.yaml create mode 100644 ci/cases/hires/C768_S2SW.yaml create mode 100644 ci/cases/pr/C48mx500_3DVarAOWCDA.yaml create mode 100644 ci/cases/pr/C96C48_ufs_hybatmDA.yaml create mode 100644 ci/cases/pr/C96_atm3DVar_extended.yaml create mode 100644 ci/cases/pr/C96_atmaerosnowDA.yaml create mode 100644 ci/cases/yamls/atmaerosnowDA_defaults_ci.yaml create mode 100644 ci/cases/yamls/build.yaml rename ci/{platforms => cases/yamls}/gefs_ci_defaults.yaml (63%) rename ci/{platforms => cases/yamls}/gfs_defaults_ci.yaml (63%) create mode 100644 ci/cases/yamls/gfs_extended_ci.yaml create mode 100644 ci/cases/yamls/soca_gfs_defaults_ci.yaml create mode 100644 ci/cases/yamls/ufs_hybatmDA_defaults.ci.yaml create mode 100644 ci/platforms/config.wcoss2 delete mode 100755 ci/scripts/pr_list_database.py create mode 100644 ci/scripts/tests/test_create_experiment.py create mode 100755 ci/scripts/tests/test_rocotostat.py create mode 100755 ci/scripts/tests/test_setup.py create mode 100755 ci/scripts/utils/ci_utils_wrapper.sh create mode 100755 ci/scripts/utils/get_host_case_list.py create mode 100755 ci/scripts/utils/githubpr.py create mode 100755 ci/scripts/utils/launch_java_agent.sh create mode 100755 ci/scripts/utils/parse_yaml.py create mode 100755 ci/scripts/utils/pr_list_database.py create mode 100755 ci/scripts/utils/publish_logs.py create mode 100755 ci/scripts/utils/rocotostat.py create mode 120000 ci/scripts/utils/wxflow delete mode 100644 ecf/scripts/gfs/atmos/post_processing/awips_g2/.gitignore delete mode 100755 ecf/scripts/gfs/atmos/post_processing/awips_g2/jgfs_atmos_awips_g2_master.ecf create mode 100755 env/GAEA.env create mode 100755 gempak/ush/gempak_gfs_f000_gif.sh delete mode 100755 gempak/ush/gempak_gfs_f00_gif.sh delete mode 100755 gempak/ush/gempak_gfs_f12_gif.sh delete mode 100755 gempak/ush/gempak_gfs_f24_gif.sh delete mode 100755 gempak/ush/gempak_gfs_f36_gif.sh delete mode 100755 gempak/ush/gempak_gfs_f48_gif.sh create mode 100755 gempak/ush/gempak_gfs_fhhh_gif.sh delete mode 100755 jobs/JGDAS_ENKF_FCST rename jobs/{JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT_VRFY => JGDAS_GLOBAL_OCEAN_ANALYSIS_ECEN} (58%) delete mode 100755 jobs/JGFS_ATMOS_AWIPS_G2 create mode 100755 jobs/JGFS_ATMOS_WAFS create mode 100755 jobs/JGFS_ATMOS_WAFS_BLENDING_0P25 create mode 100755 jobs/JGFS_ATMOS_WAFS_GCIP create mode 100755 jobs/JGFS_ATMOS_WAFS_GRIB2 create mode 100755 jobs/JGFS_ATMOS_WAFS_GRIB2_0P25 create mode 100755 jobs/JGLOBAL_ATMENS_ANALYSIS_FV3_INCREMENT rename jobs/{JGLOBAL_ATMENS_ANALYSIS_RUN => JGLOBAL_ATMENS_ANALYSIS_LETKF} (83%) create mode 100755 jobs/JGLOBAL_ATMOS_ENSSTAT create mode 100755 jobs/JGLOBAL_ATM_ANALYSIS_FV3_INCREMENT rename jobs/{JGLOBAL_ATM_ANALYSIS_RUN => JGLOBAL_ATM_ANALYSIS_VARIATIONAL} (84%) create mode 100755 jobs/JGLOBAL_EXTRACTVARS create mode 100755 jobs/JGLOBAL_MARINE_ANALYSIS_LETKF create mode 100755 jobs/JGLOBAL_MARINE_BMAT create mode 100755 jobs/JGLOBAL_OCEANICE_PRODUCTS create mode 100755 jobs/JGLOBAL_PREP_EMISSIONS rename jobs/{JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT => JGLOBAL_PREP_OBS_AERO} (62%) rename jobs/{JGLOBAL_PREP_LAND_OBS => JGLOBAL_PREP_SNOW_OBS} (78%) rename jobs/{JGLOBAL_LAND_ANALYSIS => JGLOBAL_SNOW_ANALYSIS} (73%) create mode 100755 jobs/rocoto/atmanlfv3inc.sh create mode 100755 jobs/rocoto/atmanlvar.sh create mode 100755 
jobs/rocoto/atmensanlfv3inc.sh create mode 100755 jobs/rocoto/atmensanlletkf.sh create mode 100755 jobs/rocoto/atmos_ensstat.sh delete mode 100755 jobs/rocoto/awips_g2.sh delete mode 100755 jobs/rocoto/efcs.sh create mode 100755 jobs/rocoto/extractvars.sh rename jobs/rocoto/{gempakpgrb2spec.sh => gempakgrb2spec.sh} (71%) rename jobs/rocoto/{landanl.sh => marineanalletkf.sh} (66%) rename jobs/rocoto/{ocnanalbmat.sh => marinebmat.sh} (79%) create mode 100755 jobs/rocoto/oceanice_products.sh rename jobs/rocoto/{atmanlrun.sh => ocnanalecen.sh} (65%) delete mode 100755 jobs/rocoto/ocnpost.sh rename jobs/rocoto/{atmensanlrun.sh => prep_emissions.sh} (61%) create mode 100755 jobs/rocoto/prepobsaero.sh rename jobs/rocoto/{preplandobs.sh => prepsnowobs.sh} (57%) create mode 100755 jobs/rocoto/remapgrib.sh create mode 100755 jobs/rocoto/snowanl.sh delete mode 100644 modulefiles/module-setup.csh.inc delete mode 100644 modulefiles/module-setup.sh.inc create mode 100644 modulefiles/module_base.gaea.lua create mode 100644 modulefiles/module_gwci.wcoss2.lua create mode 100644 modulefiles/module_gwsetup.gaea.lua create mode 100644 parm/archive/arcdir.yaml.j2 create mode 100644 parm/archive/chem.yaml.j2 create mode 100644 parm/archive/enkf.yaml.j2 create mode 100644 parm/archive/enkf_grp.yaml.j2 create mode 100644 parm/archive/enkf_restarta_grp.yaml.j2 create mode 100644 parm/archive/enkf_restartb_grp.yaml.j2 create mode 100644 parm/archive/gdas.yaml.j2 create mode 100644 parm/archive/gdas_restarta.yaml.j2 create mode 100644 parm/archive/gdas_restartb.yaml.j2 create mode 100644 parm/archive/gdasice.yaml.j2 create mode 100644 parm/archive/gdasice_restart.yaml.j2 create mode 100644 parm/archive/gdasocean.yaml.j2 create mode 100644 parm/archive/gdasocean_analysis.yaml.j2 create mode 100644 parm/archive/gdasocean_restart.yaml.j2 create mode 100644 parm/archive/gdaswave.yaml.j2 create mode 100644 parm/archive/gdaswave_restart.yaml.j2 create mode 100644 parm/archive/gfs_downstream.yaml.j2 create mode 100644 parm/archive/gfs_flux.yaml.j2 create mode 100644 parm/archive/gfs_flux_1p00.yaml.j2 create mode 100644 parm/archive/gfs_netcdfa.yaml.j2 create mode 100644 parm/archive/gfs_netcdfb.yaml.j2 create mode 100644 parm/archive/gfs_pgrb2b.yaml.j2 create mode 100644 parm/archive/gfs_restarta.yaml.j2 create mode 100644 parm/archive/gfsa.yaml.j2 create mode 100644 parm/archive/gfsb.yaml.j2 create mode 100644 parm/archive/gfswave.yaml.j2 create mode 100644 parm/archive/ice_6hravg.yaml.j2 create mode 100644 parm/archive/ice_grib2.yaml.j2 create mode 100644 parm/archive/master_enkf.yaml.j2 create mode 100644 parm/archive/master_enkfgdas.yaml.j2 create mode 100644 parm/archive/master_enkfgfs.yaml.j2 create mode 100644 parm/archive/master_gdas.yaml.j2 create mode 100644 parm/archive/master_gfs.yaml.j2 create mode 100644 parm/archive/ocean_6hravg.yaml.j2 create mode 100644 parm/archive/ocean_grib2.yaml.j2 create mode 100644 parm/config/gefs/config.atmos_ensstat create mode 100644 parm/config/gefs/config.atmos_products rename parm/config/gefs/{config.base.emc.dyn => config.base} (76%) create mode 100644 parm/config/gefs/config.extractvars create mode 100644 parm/config/gefs/config.oceanice_products create mode 100644 parm/config/gefs/config.prep_emissions create mode 100644 parm/config/gefs/config.wavepostbndpnt create mode 100644 parm/config/gefs/config.wavepostbndpntbll create mode 100644 parm/config/gefs/config.wavepostpnt create mode 100644 parm/config/gefs/config.wavepostsbs create mode 100644 
parm/config/gfs/config.atmanlfv3inc delete mode 100644 parm/config/gfs/config.atmanlrun create mode 100644 parm/config/gfs/config.atmanlvar create mode 100644 parm/config/gfs/config.atmensanlfv3inc create mode 100644 parm/config/gfs/config.atmensanlletkf delete mode 100644 parm/config/gfs/config.atmensanlrun create mode 120000 parm/config/gfs/config.base rename parm/config/gfs/{config.base.emc.dyn_emc => config.base.emc} (67%) delete mode 120000 parm/config/gfs/config.base.emc.dyn delete mode 100644 parm/config/gfs/config.base.emc.dyn_jet rename parm/config/gfs/{config.base.emc.dyn_hera => config.base.hera} (68%) create mode 100644 parm/config/gfs/config.fbwind delete mode 100644 parm/config/gfs/config.landanl create mode 100644 parm/config/gfs/config.marineanalletkf create mode 100644 parm/config/gfs/config.marinebmat create mode 100644 parm/config/gfs/config.oceanice_products delete mode 100644 parm/config/gfs/config.ocnanalbmat create mode 100644 parm/config/gfs/config.ocnanalecen delete mode 100644 parm/config/gfs/config.ocnpost delete mode 100644 parm/config/gfs/config.preplandobs create mode 100644 parm/config/gfs/config.prepobsaero create mode 100644 parm/config/gfs/config.prepsnowobs create mode 100644 parm/config/gfs/config.resources.GAEA create mode 100644 parm/config/gfs/config.resources.HERA create mode 100644 parm/config/gfs/config.resources.HERCULES create mode 100644 parm/config/gfs/config.resources.JET create mode 100644 parm/config/gfs/config.resources.ORION create mode 100644 parm/config/gfs/config.resources.S4 create mode 100644 parm/config/gfs/config.resources.WCOSS2 create mode 100644 parm/config/gfs/config.snowanl delete mode 100644 parm/gdas/aero_crtm_coeff.yaml create mode 100644 parm/gdas/aero_crtm_coeff.yaml.j2 delete mode 100644 parm/gdas/aero_jedi_fix.yaml create mode 100644 parm/gdas/aero_jedi_fix.yaml.j2 rename parm/gdas/{atm_crtm_coeff.yaml => atm_crtm_coeff.yaml.j2} (100%) delete mode 100644 parm/gdas/atm_jedi_fix.yaml create mode 100644 parm/gdas/atm_jedi_fix.yaml.j2 delete mode 100644 parm/gdas/land_jedi_fix.yaml create mode 100644 parm/gdas/snow_jedi_fix.yaml.j2 create mode 100644 parm/gdas/staging/atm_berror_gsibec.yaml.j2 create mode 100644 parm/gdas/staging/atm_lgetkf_bkg.yaml.j2 create mode 100644 parm/gdas/staging/atm_var_bkg.yaml.j2 create mode 100644 parm/gdas/staging/atm_var_fv3ens.yaml.j2 create mode 100644 parm/post/oceanice_products.yaml create mode 100644 parm/post/oceanice_products_gefs.yaml create mode 100644 parm/product/bufr_ij9km.txt create mode 100644 parm/product/gefs.0p25.f000.paramlist.a.txt create mode 100644 parm/product/gefs.0p25.f000.paramlist.b.txt create mode 100644 parm/product/gefs.0p25.fFFF.paramlist.a.txt create mode 100644 parm/product/gefs.0p25.fFFF.paramlist.b.txt create mode 100644 parm/product/gefs.0p50.f000.paramlist.a.txt create mode 100644 parm/product/gefs.0p50.f000.paramlist.b.txt create mode 100644 parm/product/gefs.0p50.fFFF.paramlist.a.txt create mode 100644 parm/product/gefs.0p50.fFFF.paramlist.b.txt create mode 120000 parm/product/gefs.1p00.f000.paramlist.a.txt create mode 120000 parm/product/gefs.1p00.f000.paramlist.b.txt create mode 120000 parm/product/gefs.1p00.fFFF.paramlist.a.txt create mode 120000 parm/product/gefs.1p00.fFFF.paramlist.b.txt create mode 100644 parm/product/gefs.2p50.f000.paramlist.a.txt create mode 100644 parm/product/gefs.2p50.f000.paramlist.b.txt create mode 100644 parm/product/gefs.2p50.fFFF.paramlist.a.txt create mode 100644 parm/product/gefs.2p50.fFFF.paramlist.b.txt create mode 
100644 parm/product/gefs_ice_shortparmlist.parm create mode 100644 parm/product/gefs_ocn_shortparmlist.parm create mode 100644 parm/product/gefs_shortparmlist_2d.parm create mode 100644 parm/product/gefs_shortparmlist_3d_d.parm create mode 100644 parm/product/gefs_shortparmlist_3d_h.parm create mode 100644 parm/product/gefs_wav_shortparmlist.parm rename parm/{post/global_1x1_paramlist_g2.anl => product/gfs.anl.paramlist.a.txt} (99%) rename parm/{post/global_1x1_paramlist_g2.f000 => product/gfs.f000.paramlist.a.txt} (99%) rename parm/{post/global_1x1_paramlist_g2 => product/gfs.fFFF.paramlist.a.txt} (98%) rename parm/{post/global_master-catchup_parmlist_g2 => product/gfs.fFFF.paramlist.b.txt} (100%) rename parm/transfer/{transfer_gdas_enkf_enkf_05.list => transfer_gfs_enkfgdas_enkf_05.list} (100%) rename parm/transfer/{transfer_gdas_enkf_enkf_10.list => transfer_gfs_enkfgdas_enkf_10.list} (100%) rename parm/transfer/{transfer_gdas_enkf_enkf_15.list => transfer_gfs_enkfgdas_enkf_15.list} (100%) rename parm/transfer/{transfer_gdas_enkf_enkf_20.list => transfer_gfs_enkfgdas_enkf_20.list} (100%) rename parm/transfer/{transfer_gdas_enkf_enkf_25.list => transfer_gfs_enkfgdas_enkf_25.list} (100%) rename parm/transfer/{transfer_gdas_enkf_enkf_30.list => transfer_gfs_enkfgdas_enkf_30.list} (100%) rename parm/transfer/{transfer_gdas_enkf_enkf_35.list => transfer_gfs_enkfgdas_enkf_35.list} (100%) rename parm/transfer/{transfer_gdas_enkf_enkf_40.list => transfer_gfs_enkfgdas_enkf_40.list} (100%) rename parm/transfer/{transfer_gdas_enkf_enkf_45.list => transfer_gfs_enkfgdas_enkf_45.list} (100%) rename parm/transfer/{transfer_gdas_enkf_enkf_50.list => transfer_gfs_enkfgdas_enkf_50.list} (100%) rename parm/transfer/{transfer_gdas_enkf_enkf_55.list => transfer_gfs_enkfgdas_enkf_55.list} (100%) rename parm/transfer/{transfer_gdas_enkf_enkf_60.list => transfer_gfs_enkfgdas_enkf_60.list} (100%) rename parm/transfer/{transfer_gdas_enkf_enkf_65.list => transfer_gfs_enkfgdas_enkf_65.list} (100%) rename parm/transfer/{transfer_gdas_enkf_enkf_70.list => transfer_gfs_enkfgdas_enkf_70.list} (100%) rename parm/transfer/{transfer_gdas_enkf_enkf_75.list => transfer_gfs_enkfgdas_enkf_75.list} (100%) rename parm/transfer/{transfer_gdas_enkf_enkf_80.list => transfer_gfs_enkfgdas_enkf_80.list} (100%) rename parm/transfer/{transfer_gdas_enkf_enkf_misc.list => transfer_gfs_enkfgdas_enkf_misc.list} (100%) rename parm/transfer/{transfer_gdas_1a.list => transfer_gfs_gdas_gdas_1a.list} (100%) rename parm/transfer/{transfer_gdas_1b.list => transfer_gfs_gdas_gdas_1b.list} (100%) rename parm/transfer/{transfer_gdas_1c.list => transfer_gfs_gdas_gdas_1c.list} (100%) rename parm/transfer/{transfer_gdas_misc.list => transfer_gfs_gdas_gdas_misc.list} (100%) rename parm/transfer/{transfer_gfs_1.list => transfer_gfs_gfs_1.list} (100%) rename parm/transfer/{transfer_gfs_10a.list => transfer_gfs_gfs_10a.list} (100%) rename parm/transfer/{transfer_gfs_10b.list => transfer_gfs_gfs_10b.list} (100%) rename parm/transfer/{transfer_gfs_2.list => transfer_gfs_gfs_2.list} (100%) rename parm/transfer/{transfer_gfs_3.list => transfer_gfs_gfs_3.list} (100%) rename parm/transfer/{transfer_gfs_4.list => transfer_gfs_gfs_4.list} (100%) rename parm/transfer/{transfer_gfs_5.list => transfer_gfs_gfs_5.list} (100%) rename parm/transfer/{transfer_gfs_6.list => transfer_gfs_gfs_6.list} (100%) rename parm/transfer/{transfer_gfs_7.list => transfer_gfs_gfs_7.list} (100%) rename parm/transfer/{transfer_gfs_8.list => transfer_gfs_gfs_8.list} (100%) rename 
parm/transfer/{transfer_gfs_9a.list => transfer_gfs_gfs_9a.list} (100%) rename parm/transfer/{transfer_gfs_9b.list => transfer_gfs_gfs_9b.list} (100%) rename parm/transfer/{transfer_gfs_gempak.list => transfer_gfs_gfs_gempak.list} (100%) rename parm/transfer/{transfer_gfs_misc.list => transfer_gfs_gfs_misc.list} (100%) rename parm/transfer/{transfer_rdhpcs_gdas_enkf_enkf_1.list => transfer_rdhpcs_gfs_gdas_enkf_enkf_1.list} (91%) rename parm/transfer/{transfer_rdhpcs_gdas_enkf_enkf_2.list => transfer_rdhpcs_gfs_gdas_enkf_enkf_2.list} (91%) rename parm/transfer/{transfer_rdhpcs_gdas_enkf_enkf_3.list => transfer_rdhpcs_gfs_gdas_enkf_enkf_3.list} (84%) rename parm/transfer/{transfer_rdhpcs_gdas_enkf_enkf_4.list => transfer_rdhpcs_gfs_gdas_enkf_enkf_4.list} (84%) rename parm/transfer/{transfer_rdhpcs_gdas_enkf_enkf_5.list => transfer_rdhpcs_gfs_gdas_enkf_enkf_5.list} (92%) rename parm/transfer/{transfer_rdhpcs_gdas_enkf_enkf_6.list => transfer_rdhpcs_gfs_gdas_enkf_enkf_6.list} (92%) rename parm/transfer/{transfer_rdhpcs_gdas_enkf_enkf_7.list => transfer_rdhpcs_gfs_gdas_enkf_enkf_7.list} (92%) rename parm/transfer/{transfer_rdhpcs_gdas_enkf_enkf_8.list => transfer_rdhpcs_gfs_gdas_enkf_enkf_8.list} (92%) rename parm/transfer/{transfer_rdhpcs_gdas.list => transfer_rdhpcs_gfs_gdas_gdas.list} (89%) rename parm/transfer/{transfer_rdhpcs_gfs_nawips.list => transfer_rdhpcs_gfs_gempak.list} (96%) rename parm/transfer/{transfer_rdhpcs_gfs.list => transfer_rdhpcs_gfs_gfs.list} (94%) delete mode 100644 parm/ufs/fv3/data_table delete mode 100644 parm/ufs/mom6/MOM_input_template_025 delete mode 100644 parm/ufs/mom6/MOM_input_template_050 delete mode 100644 parm/ufs/mom6/MOM_input_template_100 delete mode 100644 parm/ufs/mom6/MOM_input_template_500 delete mode 100644 parm/ufs/ufs.configure.atm.IN delete mode 100644 parm/ufs/ufs.configure.atm_aero.IN delete mode 100644 parm/ufs/ufs.configure.blocked_atm_wav.IN delete mode 100644 parm/ufs/ufs.configure.cpld.IN delete mode 100644 parm/ufs/ufs.configure.cpld_aero.IN delete mode 100644 parm/ufs/ufs.configure.cpld_aero_outerwave.IN delete mode 100644 parm/ufs/ufs.configure.cpld_aero_wave.IN delete mode 100644 parm/ufs/ufs.configure.cpld_outerwave.IN delete mode 100644 parm/ufs/ufs.configure.cpld_wave.IN delete mode 100644 parm/ufs/ufs.configure.leapfrog_atm_wav.IN create mode 100755 parm/wave/glo_200_interp.inp.tmpl create mode 100755 parm/wave/ww3_grib2.glo_100.inp.tmpl delete mode 100755 parm/wmo/grib_awpgfs000.211 delete mode 100755 parm/wmo/grib_awpgfs006.211 delete mode 100755 parm/wmo/grib_awpgfs012.211 delete mode 100755 parm/wmo/grib_awpgfs018.211 delete mode 100755 parm/wmo/grib_awpgfs024.211 delete mode 100755 parm/wmo/grib_awpgfs030.211 delete mode 100755 parm/wmo/grib_awpgfs036.211 delete mode 100755 parm/wmo/grib_awpgfs042.211 delete mode 100755 parm/wmo/grib_awpgfs048.211 delete mode 100755 parm/wmo/grib_awpgfs054.211 delete mode 100755 parm/wmo/grib_awpgfs060.211 delete mode 100755 parm/wmo/grib_awpgfs066.211 delete mode 100755 parm/wmo/grib_awpgfs072.211 delete mode 100755 parm/wmo/grib_awpgfs078.211 delete mode 100755 parm/wmo/grib_awpgfs084.211 delete mode 100755 parm/wmo/grib_awpgfs090.211 delete mode 100755 parm/wmo/grib_awpgfs096.211 delete mode 100755 parm/wmo/grib_awpgfs102.211 delete mode 100755 parm/wmo/grib_awpgfs108.211 delete mode 100755 parm/wmo/grib_awpgfs114.211 delete mode 100755 parm/wmo/grib_awpgfs120.211 delete mode 100755 parm/wmo/grib_awpgfs126.211 delete mode 100755 parm/wmo/grib_awpgfs132.211 delete mode 100755 
parm/wmo/grib_awpgfs138.211 delete mode 100755 parm/wmo/grib_awpgfs144.211 delete mode 100755 parm/wmo/grib_awpgfs150.211 delete mode 100755 parm/wmo/grib_awpgfs156.211 delete mode 100755 parm/wmo/grib_awpgfs162.211 delete mode 100755 parm/wmo/grib_awpgfs168.211 delete mode 100755 parm/wmo/grib_awpgfs174.211 delete mode 100755 parm/wmo/grib_awpgfs180.211 delete mode 100755 parm/wmo/grib_awpgfs186.211 delete mode 100755 parm/wmo/grib_awpgfs192.211 delete mode 100755 parm/wmo/grib_awpgfs198.211 delete mode 100755 parm/wmo/grib_awpgfs204.211 delete mode 100755 parm/wmo/grib_awpgfs210.211 delete mode 100755 parm/wmo/grib_awpgfs216.211 delete mode 100755 parm/wmo/grib_awpgfs222.211 delete mode 100755 parm/wmo/grib_awpgfs228.211 delete mode 100755 parm/wmo/grib_awpgfs234.211 delete mode 100755 parm/wmo/grib_awpgfs240.211 create mode 100755 scripts/exgdas_enkf_earc.py delete mode 100755 scripts/exgdas_enkf_earc.sh delete mode 100755 scripts/exgdas_enkf_fcst.sh create mode 100755 scripts/exgdas_global_marine_analysis_letkf.py delete mode 100755 scripts/exgfs_atmos_grib_awips.sh create mode 100755 scripts/exgfs_atmos_wafs_blending_0p25.sh create mode 100755 scripts/exgfs_atmos_wafs_gcip.sh create mode 100755 scripts/exgfs_atmos_wafs_grib.sh create mode 100755 scripts/exgfs_atmos_wafs_grib2.sh create mode 100755 scripts/exgfs_atmos_wafs_grib2_0p25.sh create mode 100755 scripts/exglobal_archive.py delete mode 120000 scripts/exglobal_archive.sh create mode 100755 scripts/exglobal_atm_analysis_fv3_increment.py rename scripts/{exglobal_atm_analysis_run.py => exglobal_atm_analysis_variational.py} (89%) create mode 100755 scripts/exglobal_atmens_analysis_fv3_increment.py rename scripts/{exglobal_atmens_analysis_run.py => exglobal_atmens_analysis_letkf.py} (86%) create mode 100755 scripts/exglobal_atmos_ensstat.sh create mode 100755 scripts/exglobal_extractvars.sh create mode 100755 scripts/exglobal_marinebmat.py create mode 100755 scripts/exglobal_oceanice_products.py create mode 100755 scripts/exglobal_prep_emissions.py create mode 100755 scripts/exglobal_prep_obs_aero.py rename scripts/{exglobal_prep_land_obs.py => exglobal_prep_snow_obs.py} (59%) rename scripts/{exglobal_land_analysis.py => exglobal_snow_analysis.py} (66%) delete mode 100755 scripts/run_reg2grb2.sh delete mode 100755 scripts/run_regrid.sh delete mode 100644 sorc/ncl.setup create mode 100755 test/f90nmlcmp.sh create mode 100755 test/g2cmp.sh create mode 100755 test/nccmp.sh create mode 100755 ush/atmos_ensstat.sh create mode 100755 ush/atmos_extractvars.sh create mode 100755 ush/bash_utils.sh create mode 100755 ush/check_ice_netcdf.sh create mode 100644 ush/extractvars_tools.sh mode change 100644 => 100755 ush/file_utils.sh delete mode 100755 ush/fv3gfs_remap.sh delete mode 120000 ush/hpssarch_gen.sh delete mode 100755 ush/icepost.ncl mode change 100644 => 100755 ush/jjob_header.sh create mode 100755 ush/oceanice_nc2grib2.sh create mode 100755 ush/ocnice_extractvars.sh delete mode 100755 ush/ocnpost.ncl delete mode 100755 ush/parsing_model_configure_DATM.sh create mode 100755 ush/parsing_namelists_FV3_nest.sh rename ush/{ufs_configure.sh => parsing_ufs_configure.sh} (79%) mode change 100644 => 100755 ush/preamble.sh create mode 100644 ush/python/pygfs/task/aero_emissions.py create mode 100644 ush/python/pygfs/task/aero_prepobs.py create mode 100644 ush/python/pygfs/task/archive.py create mode 100644 ush/python/pygfs/task/marine_bmat.py create mode 100644 ush/python/pygfs/task/marine_letkf.py create mode 100644 
ush/python/pygfs/task/oceanice_products.py rename ush/python/pygfs/task/{land_analysis.py => snow_analysis.py} (72%) create mode 100644 ush/python/pygfs/utils/__init__.py create mode 100644 ush/python/pygfs/utils/marine_da_utils.py create mode 100755 ush/wafs_mkgbl.sh create mode 100755 ush/wave_extractvars.sh create mode 100644 versions/build.gaea.ver create mode 100644 versions/run.gaea.ver create mode 100644 workflow/hosts/gaea.yaml diff --git a/.github/pull_request_template.md b/.github/pull_request_template.md index dbebfe8f6e..3f8fe65065 100644 --- a/.github/pull_request_template.md +++ b/.github/pull_request_template.md @@ -33,6 +33,17 @@ # Change characteristics - Is this a breaking change (a change in existing functionality)? YES/NO - Does this change require a documentation update? YES/NO +- Does this change require an update to any of the following submodules? YES/NO (If YES, please add a link to any PRs that are pending.) + - [ ] EMC verif-global + - [ ] GDAS + - [ ] GFS-utils + - [ ] GSI + - [ ] GSI-monitor + - [ ] GSI-utils + - [ ] UFS-utils + - [ ] UFS-weather-model + - [ ] wxflow + # How has this been tested? _dir. For all other directories, the + names will follow <NAME> --> <name>_dir. + """ + + rel_path_dict = {} + for key, value in self.task_config.items(): + if isinstance(value, str): + if root_path in value: + rel_path = value.replace(root_path, "") + rel_key = (key[4:] if key.startswith("COMIN_") else key).lower() + "_dir" + rel_path_dict[rel_key] = rel_path + + return rel_path_dict + + @staticmethod + @logit(logger) + def _construct_arcdir_set(arcdir_j2yaml, arch_dict) -> Dict: + """Construct the list of files to send to the ARCDIR and Fit2Obs + directories from a template. + + TODO Copying Fit2Obs data doesn't belong in archiving and should be + moved elsewhere. + + Parameters + ---------- + arcdir_j2yaml: str + The filename of the ARCDIR jinja template to parse. + + arch_dict: Dict + The context dictionary to parse arcdir_j2yaml with. + + Return + ------ + arcdir_set : Dict + FileHandler dictionary (i.e. with top level "mkdir" and "copy" keys) + containing all directories that need to be created and what data + files need to be copied to the ARCDIR and the Fit2Obs directory. + """ + + # Get the FileHandler dictionary for creating directories and copying + # to the ARCDIR and VFYARC directories. + arcdir_set = parse_j2yaml(arcdir_j2yaml, + arch_dict, + allow_missing=True) + + return arcdir_set + + @staticmethod + @logit(logger) + def _rename_cyclone_expt(arch_dict) -> None: + + # Rename the experiment in the tracker files from "AVNO" to the + # first 4 letters of PSLOT. + pslot4 = arch_dict.PSLOT.upper() + if len(arch_dict.PSLOT) > 4: + pslot4 = arch_dict.PSLOT[0:4].upper() + + track_dir_in = arch_dict.COMIN_ATMOS_TRACK + track_dir_out = arch_dict.COMOUT_ATMOS_TRACK + run = arch_dict.RUN + cycle_HH = strftime(arch_dict.current_cycle, "%H") + + if run == "gfs": + in_track_file = (track_dir_in + "/avno.t" + + cycle_HH + "z.cycle.trackatcfunix") + in_track_p_file = (track_dir_in + "/avnop.t" + + cycle_HH + "z.cycle.trackatcfunixp") + elif run == "gdas": + in_track_file = (track_dir_in + "/gdas.t" + + cycle_HH + "z.cycle.trackatcfunix") + in_track_p_file = (track_dir_in + "/gdasp.t" + + cycle_HH + "z.cycle.trackatcfunixp") + + if not os.path.isfile(in_track_file): + # Do not attempt to archive the outputs + return + + out_track_file = track_dir_out + "/atcfunix." + run + "." + to_YMDH(arch_dict.current_cycle) + out_track_p_file = track_dir_out + "/atcfunixp." + run + "." 
+ to_YMDH(arch_dict.current_cycle) + + def replace_string_from_to_file(filename_in, filename_out, search_str, replace_str): + + """Write a new file from the contents of an input file while searching + and replacing ASCII strings. To prevent partial file creation, a + temporary file is created and moved to the final location only + after the search/replace is finished. + + Parameters + ---------- + filename_in : str + Input filename + + filename_out : str + Output filename + + search_str : str + ASCII string to search for + + replace_str : str + ASCII string to replace the search_str with + """ + with open(filename_in) as old_file: + lines = old_file.readlines() + + out_lines = [line.replace(search_str, replace_str) for line in lines] + + with open("/tmp/track_file", "w") as new_file: + new_file.writelines(out_lines) + + shutil.move("/tmp/track_file", filename_out) + + replace_string_from_to_file(in_track_file, out_track_file, "AVNO", pslot4) + replace_string_from_to_file(in_track_p_file, out_track_p_file, "AVNO", pslot4) + + return diff --git a/ush/python/pygfs/task/atm_analysis.py b/ush/python/pygfs/task/atm_analysis.py index da41574fc9..4e9d37335c 100644 --- a/ush/python/pygfs/task/atm_analysis.py +++ b/ush/python/pygfs/task/atm_analysis.py @@ -11,7 +11,7 @@ FileHandler, add_to_datetime, to_fv3time, to_timedelta, to_YMDH, chdir, - parse_yamltmpl, parse_j2yaml, save_as_yaml, + parse_j2yaml, save_as_yaml, logit, Executable, WorkflowException) @@ -28,32 +28,35 @@ class AtmAnalysis(Analysis): def __init__(self, config): super().__init__(config) - _res = int(self.config.CASE[1:]) - _res_anl = int(self.config.CASE_ANL[1:]) - _window_begin = add_to_datetime(self.runtime_config.current_cycle, -to_timedelta(f"{self.config.assim_freq}H") / 2) - _fv3jedi_yaml = os.path.join(self.runtime_config.DATA, f"{self.runtime_config.CDUMP}.t{self.runtime_config.cyc:02d}z.atmvar.yaml") + _res = int(self.task_config.CASE[1:]) + _res_anl = int(self.task_config.CASE_ANL[1:]) + _window_begin = add_to_datetime(self.task_config.current_cycle, -to_timedelta(f"{self.task_config.assim_freq}H") / 2) + _jedi_yaml = os.path.join(self.task_config.DATA, f"{self.task_config.RUN}.t{self.task_config.cyc:02d}z.atmvar.yaml") # Create a local dictionary that is repeatedly used across this class local_dict = AttrDict( { 'npx_ges': _res + 1, 'npy_ges': _res + 1, - 'npz_ges': self.config.LEVS - 1, - 'npz': self.config.LEVS - 1, + 'npz_ges': self.task_config.LEVS - 1, + 'npz': self.task_config.LEVS - 1, 'npx_anl': _res_anl + 1, 'npy_anl': _res_anl + 1, - 'npz_anl': self.config.LEVS - 1, + 'npz_anl': self.task_config.LEVS - 1, 'ATM_WINDOW_BEGIN': _window_begin, - 'ATM_WINDOW_LENGTH': f"PT{self.config.assim_freq}H", - 'OPREFIX': f"{self.runtime_config.CDUMP}.t{self.runtime_config.cyc:02d}z.", # TODO: CDUMP is being replaced by RUN - 'APREFIX': f"{self.runtime_config.CDUMP}.t{self.runtime_config.cyc:02d}z.", # TODO: CDUMP is being replaced by RUN - 'GPREFIX': f"gdas.t{self.runtime_config.previous_cycle.hour:02d}z.", - 'fv3jedi_yaml': _fv3jedi_yaml, + 'ATM_WINDOW_LENGTH': f"PT{self.task_config.assim_freq}H", + 'OPREFIX': f"{self.task_config.RUN}.t{self.task_config.cyc:02d}z.", + 'APREFIX': f"{self.task_config.RUN}.t{self.task_config.cyc:02d}z.", + 'GPREFIX': f"gdas.t{self.task_config.previous_cycle.hour:02d}z.", + 'jedi_yaml': _jedi_yaml, + 'atm_obsdatain_path': f"{self.task_config.DATA}/obs/", + 'atm_obsdataout_path': f"{self.task_config.DATA}/diags/", + 'BKG_TSTEP': "PT1H" # Placeholder for 4D applications } ) - # task_config is 
everything that this task should need - self.task_config = AttrDict(**self.config, **self.runtime_config, **local_dict) + # Extend task_config with local_dict + self.task_config = AttrDict(**self.task_config, **local_dict) @logit(logger) def initialize(self: Analysis) -> None: @@ -71,41 +74,38 @@ def initialize(self: Analysis) -> None: super().initialize() # stage CRTM fix files - crtm_fix_list_path = os.path.join(self.task_config.HOMEgfs, 'parm', 'gdas', 'atm_crtm_coeff.yaml') - logger.debug(f"Staging CRTM fix files from {crtm_fix_list_path}") - crtm_fix_list = parse_j2yaml(crtm_fix_list_path, self.task_config) + logger.info(f"Staging CRTM fix files from {self.task_config.CRTM_FIX_YAML}") + crtm_fix_list = parse_j2yaml(self.task_config.CRTM_FIX_YAML, self.task_config) FileHandler(crtm_fix_list).sync() # stage fix files - jedi_fix_list_path = os.path.join(self.task_config.HOMEgfs, 'parm', 'gdas', 'atm_jedi_fix.yaml') - logger.debug(f"Staging JEDI fix files from {jedi_fix_list_path}") - jedi_fix_list = parse_j2yaml(jedi_fix_list_path, self.task_config) + logger.info(f"Staging JEDI fix files from {self.task_config.JEDI_FIX_YAML}") + jedi_fix_list = parse_j2yaml(self.task_config.JEDI_FIX_YAML, self.task_config) FileHandler(jedi_fix_list).sync() # stage static background error files, otherwise it will assume ID matrix - logger.debug(f"Stage files for STATICB_TYPE {self.task_config.STATICB_TYPE}") - FileHandler(self.get_berror_dict(self.task_config)).sync() + logger.info(f"Stage files for STATICB_TYPE {self.task_config.STATICB_TYPE}") + if self.task_config.STATICB_TYPE != 'identity': + berror_staging_dict = parse_j2yaml(self.task_config.BERROR_STAGING_YAML, self.task_config) + else: + berror_staging_dict = {} + FileHandler(berror_staging_dict).sync() # stage ensemble files for use in hybrid background error if self.task_config.DOHYBVAR: logger.debug(f"Stage ensemble files for DOHYBVAR {self.task_config.DOHYBVAR}") - localconf = AttrDict() - keys = ['COM_ATMOS_RESTART_TMPL', 'previous_cycle', 'ROTDIR', 'RUN', - 'NMEM_ENS', 'DATA', 'current_cycle', 'ntiles'] - for key in keys: - localconf[key] = self.task_config[key] - localconf.RUN = 'enkf' + self.task_config.RUN - localconf.dirname = 'ens' - FileHandler(self.get_fv3ens_dict(localconf)).sync() + fv3ens_staging_dict = parse_j2yaml(self.task_config.FV3ENS_STAGING_YAML, self.task_config) + FileHandler(fv3ens_staging_dict).sync() # stage backgrounds - FileHandler(self.get_bkg_dict(AttrDict(self.task_config))).sync() + logger.info(f"Staging background files from {self.task_config.VAR_BKG_STAGING_YAML}") + bkg_staging_dict = parse_j2yaml(self.task_config.VAR_BKG_STAGING_YAML, self.task_config) + FileHandler(bkg_staging_dict).sync() # generate variational YAML file - logger.debug(f"Generate variational YAML file: {self.task_config.fv3jedi_yaml}") - varda_yaml = parse_j2yaml(self.task_config.ATMVARYAML, self.task_config) - save_as_yaml(varda_yaml, self.task_config.fv3jedi_yaml) - logger.info(f"Wrote variational YAML to: {self.task_config.fv3jedi_yaml}") + logger.debug(f"Generate variational YAML file: {self.task_config.jedi_yaml}") + save_as_yaml(self.task_config.jedi_config, self.task_config.jedi_yaml) + logger.info(f"Wrote variational YAML to: {self.task_config.jedi_yaml}") # need output dir for diags and anl logger.debug("Create empty output [anl, diags] directories to receive output from executable") @@ -116,14 +116,16 @@ def initialize(self: Analysis) -> None: FileHandler({'mkdir': newdirs}).sync() @logit(logger) - def execute(self: Analysis) -> 
None: + def variational(self: Analysis) -> None: chdir(self.task_config.DATA) - exec_cmd = Executable(self.task_config.APRUN_ATMANL) - exec_name = os.path.join(self.task_config.DATA, 'fv3jedi_var.x') + exec_cmd = Executable(self.task_config.APRUN_ATMANLVAR) + exec_name = os.path.join(self.task_config.DATA, 'gdas.x') exec_cmd.add_default_arg(exec_name) - exec_cmd.add_default_arg(self.task_config.fv3jedi_yaml) + exec_cmd.add_default_arg('fv3jedi') + exec_cmd.add_default_arg('variational') + exec_cmd.add_default_arg(self.task_config.jedi_yaml) try: logger.debug(f"Executing {exec_cmd}") @@ -135,6 +137,31 @@ def execute(self: Analysis) -> None: pass + @logit(logger) + def init_fv3_increment(self: Analysis) -> None: + # Setup JEDI YAML file + self.task_config.jedi_yaml = os.path.join(self.task_config.DATA, + f"{self.task_config.JCB_ALGO}.yaml") + save_as_yaml(self.get_jedi_config(self.task_config.JCB_ALGO), self.task_config.jedi_yaml) + + # Link JEDI executable to run directory + self.task_config.jedi_exe = self.link_jediexe() + + @logit(logger) + def fv3_increment(self: Analysis) -> None: + # Run executable + exec_cmd = Executable(self.task_config.APRUN_ATMANLFV3INC) + exec_cmd.add_default_arg(self.task_config.jedi_exe) + exec_cmd.add_default_arg(self.task_config.jedi_yaml) + + try: + logger.debug(f"Executing {exec_cmd}") + exec_cmd() + except OSError: + raise OSError(f"Failed to execute {exec_cmd}") + except Exception: + raise WorkflowException(f"An error occurred during execution of {exec_cmd}") + @logit(logger) def finalize(self: Analysis) -> None: """Finalize a global atm analysis @@ -152,7 +179,7 @@ def finalize(self: Analysis) -> None: atmstat = os.path.join(self.task_config.COM_ATMOS_ANALYSIS, f"{self.task_config.APREFIX}atmstat") # get list of diag files to put in tarball - diags = glob.glob(os.path.join(self.task_config.DATA, 'diags', 'diag*nc4')) + diags = glob.glob(os.path.join(self.task_config.DATA, 'diags', 'diag*nc')) logger.info(f"Compressing {len(diags)} diag files to {atmstat}.gz") @@ -170,9 +197,9 @@ def finalize(self: Analysis) -> None: archive.add(diaggzip, arcname=os.path.basename(diaggzip)) # copy full YAML from executable to ROTDIR - logger.info(f"Copying {self.task_config.fv3jedi_yaml} to {self.task_config.COM_ATMOS_ANALYSIS}") - src = os.path.join(self.task_config.DATA, f"{self.task_config.CDUMP}.t{self.task_config.cyc:02d}z.atmvar.yaml") - dest = os.path.join(self.task_config.COM_ATMOS_ANALYSIS, f"{self.task_config.CDUMP}.t{self.task_config.cyc:02d}z.atmvar.yaml") + logger.info(f"Copying {self.task_config.jedi_yaml} to {self.task_config.COM_ATMOS_ANALYSIS}") + src = os.path.join(self.task_config.DATA, f"{self.task_config.RUN}.t{self.task_config.cyc:02d}z.atmvar.yaml") + dest = os.path.join(self.task_config.COM_ATMOS_ANALYSIS, f"{self.task_config.RUN}.t{self.task_config.cyc:02d}z.atmvar.yaml") logger.debug(f"Copying {src} to {dest}") yaml_copy = { 'mkdir': [self.task_config.COM_ATMOS_ANALYSIS], @@ -212,235 +239,17 @@ def finalize(self: Analysis) -> None: } FileHandler(bias_copy).sync() - # Create UFS model readable atm increment file from UFS-DA atm increment - logger.info("Create UFS model readable atm increment file from UFS-DA atm increment") - self.jedi2fv3inc() + # Copy FV3 atm increment to comrot directory + logger.info("Copy UFS model readable atm increment file") + cdate = to_fv3time(self.task_config.current_cycle) + cdate_inc = cdate.replace('.', '_') + src = os.path.join(self.task_config.DATA, 'anl', f"atminc.{cdate_inc}z.nc4") + dest = 
os.path.join(self.task_config.COM_ATMOS_ANALYSIS, f'{self.task_config.RUN}.t{self.task_config.cyc:02d}z.atminc.nc') + logger.debug(f"Copying {src} to {dest}") + inc_copy = { + 'copy': [[src, dest]] + } + FileHandler(inc_copy).sync() def clean(self): super().clean() - - @logit(logger) - def get_bkg_dict(self, task_config: Dict[str, Any]) -> Dict[str, List[str]]: - """Compile a dictionary of model background files to copy - - This method constructs a dictionary of FV3 restart files (coupler, core, tracer) - that are needed for global atm DA and returns said dictionary for use by the FileHandler class. - - Parameters - ---------- - task_config: Dict - a dictionary containing all of the configuration needed for the task - - Returns - ---------- - bkg_dict: Dict - a dictionary containing the list of model background files to copy for FileHandler - """ - # NOTE for now this is FV3 restart files and just assumed to be fh006 - - # get FV3 restart files, this will be a lot simpler when using history files - rst_dir = os.path.join(task_config.COM_ATMOS_RESTART_PREV) # for now, option later? - run_dir = os.path.join(task_config.DATA, 'bkg') - - # Start accumulating list of background files to copy - bkglist = [] - - # atm DA needs coupler - basename = f'{to_fv3time(task_config.current_cycle)}.coupler.res' - bkglist.append([os.path.join(rst_dir, basename), os.path.join(run_dir, basename)]) - - # atm DA needs core, srf_wnd, tracer, phy_data, sfc_data - for ftype in ['core', 'srf_wnd', 'tracer']: - template = f'{to_fv3time(self.task_config.current_cycle)}.fv_{ftype}.res.tile{{tilenum}}.nc' - for itile in range(1, task_config.ntiles + 1): - basename = template.format(tilenum=itile) - bkglist.append([os.path.join(rst_dir, basename), os.path.join(run_dir, basename)]) - - for ftype in ['phy_data', 'sfc_data']: - template = f'{to_fv3time(self.task_config.current_cycle)}.{ftype}.tile{{tilenum}}.nc' - for itile in range(1, task_config.ntiles + 1): - basename = template.format(tilenum=itile) - bkglist.append([os.path.join(rst_dir, basename), os.path.join(run_dir, basename)]) - - bkg_dict = { - 'mkdir': [run_dir], - 'copy': bkglist, - } - return bkg_dict - - @logit(logger) - def get_berror_dict(self, config: Dict[str, Any]) -> Dict[str, List[str]]: - """Compile a dictionary of background error files to copy - - This method will construct a dictionary of either bump of gsibec background - error files for global atm DA and return said dictionary for use by the - FileHandler class. - - Parameters - ---------- - config: Dict - a dictionary containing all of the configuration needed - - Returns - ---------- - berror_dict: Dict - a dictionary containing the list of atm background error files to copy for FileHandler - """ - SUPPORTED_BERROR_STATIC_MAP = {'identity': self._get_berror_dict_identity, - 'bump': self._get_berror_dict_bump, - 'gsibec': self._get_berror_dict_gsibec} - - try: - berror_dict = SUPPORTED_BERROR_STATIC_MAP[config.STATICB_TYPE](config) - except KeyError: - raise KeyError(f"{config.STATICB_TYPE} is not a supported background error type.\n" + - f"Currently supported background error types are:\n" + - f'{" | ".join(SUPPORTED_BERROR_STATIC_MAP.keys())}') - - return berror_dict - - @staticmethod - @logit(logger) - def _get_berror_dict_identity(config: Dict[str, Any]) -> Dict[str, List[str]]: - """Identity BE does not need any files for staging. - - This is a private method and should not be accessed directly. 
- - Parameters - ---------- - config: Dict - a dictionary containing all of the configuration needed - Returns - ---------- - berror_dict: Dict - Empty dictionary [identity BE needs not files to stage] - """ - logger.info(f"Identity background error does not use staged files. Return empty dictionary") - return {} - - @staticmethod - @logit(logger) - def _get_berror_dict_bump(config: Dict[str, Any]) -> Dict[str, List[str]]: - """Compile a dictionary of atm bump background error files to copy - - This method will construct a dictionary of atm bump background error - files for global atm DA and return said dictionary to the parent - - This is a private method and should not be accessed directly. - - Parameters - ---------- - config: Dict - a dictionary containing all of the configuration needed - - Returns - ---------- - berror_dict: Dict - a dictionary of atm bump background error files to copy for FileHandler - """ - # BUMP atm static-B needs nicas, cor_rh, cor_rv and stddev files. - b_dir = config.BERROR_DATA_DIR - b_datestr = to_fv3time(config.BERROR_DATE) - berror_list = [] - for ftype in ['cor_rh', 'cor_rv', 'stddev']: - coupler = f'{b_datestr}.{ftype}.coupler.res' - berror_list.append([ - os.path.join(b_dir, coupler), os.path.join(config.DATA, 'berror', coupler) - ]) - - template = '{b_datestr}.{ftype}.fv_tracer.res.tile{{tilenum}}.nc' - for itile in range(1, config.ntiles + 1): - tracer = template.format(tilenum=itile) - berror_list.append([ - os.path.join(b_dir, tracer), os.path.join(config.DATA, 'berror', tracer) - ]) - - nproc = config.ntiles * config.layout_x * config.layout_y - for nn in range(1, nproc + 1): - berror_list.append([ - os.path.join(b_dir, f'nicas_aero_nicas_local_{nproc:06}-{nn:06}.nc'), - os.path.join(config.DATA, 'berror', f'nicas_aero_nicas_local_{nproc:06}-{nn:06}.nc') - ]) - - # create dictionary of background error files to stage - berror_dict = { - 'mkdir': [os.path.join(config.DATA, 'berror')], - 'copy': berror_list, - } - return berror_dict - - @staticmethod - @logit(logger) - def _get_berror_dict_gsibec(config: Dict[str, Any]) -> Dict[str, List[str]]: - """Compile a dictionary of atm gsibec background error files to copy - - This method will construct a dictionary of atm gsibec background error - files for global atm DA and return said dictionary to the parent - - This is a private method and should not be accessed directly. - - Parameters - ---------- - config: Dict - a dictionary containing all of the configuration needed - - Returns - ---------- - berror_dict: Dict - a dictionary of atm gsibec background error files to copy for FileHandler - """ - # GSI atm static-B needs namelist and coefficient files. - b_dir = os.path.join(config.HOMEgfs, 'fix', 'gdas', 'gsibec', config.CASE_ANL) - berror_list = [] - for ftype in ['gfs_gsi_global.nml', 'gsi-coeffs-gfs-global.nc4']: - berror_list.append([ - os.path.join(b_dir, ftype), - os.path.join(config.DATA, 'berror', ftype) - ]) - - # create dictionary of background error files to stage - berror_dict = { - 'mkdir': [os.path.join(config.DATA, 'berror')], - 'copy': berror_list, - } - return berror_dict - - @logit(logger) - def jedi2fv3inc(self: Analysis) -> None: - """Generate UFS model readable analysis increment - - This method writes a UFS DA atm increment in UFS model readable format. 
- This includes: - - write UFS-DA atm increments using variable names expected by UFS model - - compute and write delp increment - - compute and write hydrostatic delz increment - - Please note that some of these steps are temporary and will be modified - once the modle is able to directly read atm increments. - - """ - # Select the atm guess file based on the analysis and background resolutions - # Fields from the atm guess are used to compute the delp and delz increments - case_anl = int(self.task_config.CASE_ANL[1:]) - case = int(self.task_config.CASE[1:]) - - file = f"{self.task_config.GPREFIX}" + "atmf006" + f"{'' if case_anl == case else '.ensres'}" + ".nc" - atmges_fv3 = os.path.join(self.task_config.COM_ATMOS_HISTORY_PREV, file) - - # Set the path/name to the input UFS-DA atm increment file (atminc_jedi) - # and the output UFS model atm increment file (atminc_fv3) - cdate = to_fv3time(self.task_config.current_cycle) - cdate_inc = cdate.replace('.', '_') - atminc_jedi = os.path.join(self.task_config.DATA, 'anl', f'atminc.{cdate_inc}z.nc4') - atminc_fv3 = os.path.join(self.task_config.COM_ATMOS_ANALYSIS, f"{self.task_config.CDUMP}.t{self.task_config.cyc:02d}z.atminc.nc") - - # Reference the python script which does the actual work - incpy = os.path.join(self.task_config.HOMEgfs, 'ush/jediinc2fv3.py') - - # Execute incpy to create the UFS model atm increment file - cmd = Executable(incpy) - cmd.add_default_arg(atmges_fv3) - cmd.add_default_arg(atminc_jedi) - cmd.add_default_arg(atminc_fv3) - logger.debug(f"Executing {cmd}") - cmd(output='stdout', error='stderr') diff --git a/ush/python/pygfs/task/atmens_analysis.py b/ush/python/pygfs/task/atmens_analysis.py index 9cf84c07c7..bd5112050e 100644 --- a/ush/python/pygfs/task/atmens_analysis.py +++ b/ush/python/pygfs/task/atmens_analysis.py @@ -11,7 +11,7 @@ FileHandler, add_to_datetime, to_fv3time, to_timedelta, to_YMDH, to_YMD, chdir, - parse_yamltmpl, parse_j2yaml, save_as_yaml, + parse_j2yaml, save_as_yaml, logit, Executable, WorkflowException, @@ -29,28 +29,31 @@ class AtmEnsAnalysis(Analysis): def __init__(self, config): super().__init__(config) - _res = int(self.config.CASE_ENS[1:]) - _window_begin = add_to_datetime(self.runtime_config.current_cycle, -to_timedelta(f"{self.config.assim_freq}H") / 2) - _fv3jedi_yaml = os.path.join(self.runtime_config.DATA, f"{self.runtime_config.CDUMP}.t{self.runtime_config.cyc:02d}z.atmens.yaml") + _res = int(self.task_config.CASE_ENS[1:]) + _window_begin = add_to_datetime(self.task_config.current_cycle, -to_timedelta(f"{self.task_config.assim_freq}H") / 2) + _jedi_yaml = os.path.join(self.task_config.DATA, f"{self.task_config.RUN}.t{self.task_config.cyc:02d}z.atmens.yaml") # Create a local dictionary that is repeatedly used across this class local_dict = AttrDict( { 'npx_ges': _res + 1, 'npy_ges': _res + 1, - 'npz_ges': self.config.LEVS - 1, - 'npz': self.config.LEVS - 1, + 'npz_ges': self.task_config.LEVS - 1, + 'npz': self.task_config.LEVS - 1, 'ATM_WINDOW_BEGIN': _window_begin, - 'ATM_WINDOW_LENGTH': f"PT{self.config.assim_freq}H", - 'OPREFIX': f"{self.config.EUPD_CYC}.t{self.runtime_config.cyc:02d}z.", # TODO: CDUMP is being replaced by RUN - 'APREFIX': f"{self.runtime_config.CDUMP}.t{self.runtime_config.cyc:02d}z.", # TODO: CDUMP is being replaced by RUN - 'GPREFIX': f"gdas.t{self.runtime_config.previous_cycle.hour:02d}z.", - 'fv3jedi_yaml': _fv3jedi_yaml, + 'ATM_WINDOW_LENGTH': f"PT{self.task_config.assim_freq}H", + 'OPREFIX': f"{self.task_config.EUPD_CYC}.t{self.task_config.cyc:02d}z.", + 
'APREFIX': f"{self.task_config.RUN}.t{self.task_config.cyc:02d}z.", + 'GPREFIX': f"gdas.t{self.task_config.previous_cycle.hour:02d}z.", + 'jedi_yaml': _jedi_yaml, + 'atm_obsdatain_path': f"./obs/", + 'atm_obsdataout_path': f"./diags/", + 'BKG_TSTEP': "PT1H" # Placeholder for 4D applications } ) - # task_config is everything that this task should need - self.task_config = AttrDict(**self.config, **self.runtime_config, **local_dict) + # Extend task_config with local_dict + self.task_config = AttrDict(**self.task_config, **local_dict) @logit(logger) def initialize(self: Analysis) -> None: @@ -74,54 +77,25 @@ def initialize(self: Analysis) -> None: """ super().initialize() - # Make member directories in DATA for background and in DATA and ROTDIR for analysis files - # create template dictionary for output member analysis directories - template_inc = self.task_config.COM_ATMOS_ANALYSIS_TMPL - tmpl_inc_dict = { - 'ROTDIR': self.task_config.ROTDIR, - 'RUN': self.task_config.RUN, - 'YMD': to_YMD(self.task_config.current_cycle), - 'HH': self.task_config.current_cycle.strftime('%H') - } - dirlist = [] - for imem in range(1, self.task_config.NMEM_ENS + 1): - dirlist.append(os.path.join(self.task_config.DATA, 'bkg', f'mem{imem:03d}')) - dirlist.append(os.path.join(self.task_config.DATA, 'anl', f'mem{imem:03d}')) - - # create output directory path for member analysis - tmpl_inc_dict['MEMDIR'] = f"mem{imem:03d}" - incdir = Template.substitute_structure(template_inc, TemplateConstants.DOLLAR_CURLY_BRACE, tmpl_inc_dict.get) - dirlist.append(incdir) - - FileHandler({'mkdir': dirlist}).sync() - # stage CRTM fix files - crtm_fix_list_path = os.path.join(self.task_config.HOMEgfs, 'parm', 'gdas', 'atm_crtm_coeff.yaml') - logger.debug(f"Staging CRTM fix files from {crtm_fix_list_path}") - crtm_fix_list = parse_j2yaml(crtm_fix_list_path, self.task_config) + logger.info(f"Staging CRTM fix files from {self.task_config.CRTM_FIX_YAML}") + crtm_fix_list = parse_j2yaml(self.task_config.CRTM_FIX_YAML, self.task_config) FileHandler(crtm_fix_list).sync() # stage fix files - jedi_fix_list_path = os.path.join(self.task_config.HOMEgfs, 'parm', 'gdas', 'atm_jedi_fix.yaml') - logger.debug(f"Staging JEDI fix files from {jedi_fix_list_path}") - jedi_fix_list = parse_j2yaml(jedi_fix_list_path, self.task_config) + logger.info(f"Staging JEDI fix files from {self.task_config.JEDI_FIX_YAML}") + jedi_fix_list = parse_j2yaml(self.task_config.JEDI_FIX_YAML, self.task_config) FileHandler(jedi_fix_list).sync() # stage backgrounds - logger.debug(f"Stage ensemble member background files") - localconf = AttrDict() - keys = ['COM_ATMOS_RESTART_TMPL', 'previous_cycle', 'ROTDIR', 'RUN', - 'NMEM_ENS', 'DATA', 'current_cycle', 'ntiles'] - for key in keys: - localconf[key] = self.task_config[key] - localconf.dirname = 'bkg' - FileHandler(self.get_fv3ens_dict(localconf)).sync() + logger.info(f"Stage ensemble member background files") + bkg_staging_dict = parse_j2yaml(self.task_config.LGETKF_BKG_STAGING_YAML, self.task_config) + FileHandler(bkg_staging_dict).sync() # generate ensemble da YAML file - logger.debug(f"Generate ensemble da YAML file: {self.task_config.fv3jedi_yaml}") - ensda_yaml = parse_j2yaml(self.task_config.ATMENSYAML, self.task_config) - save_as_yaml(ensda_yaml, self.task_config.fv3jedi_yaml) - logger.info(f"Wrote ensemble da YAML to: {self.task_config.fv3jedi_yaml}") + logger.debug(f"Generate ensemble da YAML file: {self.task_config.jedi_yaml}") + save_as_yaml(self.task_config.jedi_config, self.task_config.jedi_yaml) + 
logger.info(f"Wrote ensemble da YAML to: {self.task_config.jedi_yaml}") # need output dir for diags and anl logger.debug("Create empty output [anl, diags] directories to receive output from executable") @@ -132,7 +106,7 @@ def initialize(self: Analysis) -> None: FileHandler({'mkdir': newdirs}).sync() @logit(logger) - def execute(self: Analysis) -> None: + def letkf(self: Analysis) -> None: """Execute a global atmens analysis This method will execute a global atmens analysis using JEDI. @@ -150,10 +124,13 @@ def execute(self: Analysis) -> None: """ chdir(self.task_config.DATA) - exec_cmd = Executable(self.task_config.APRUN_ATMENSANL) - exec_name = os.path.join(self.task_config.DATA, 'fv3jedi_letkf.x') + exec_cmd = Executable(self.task_config.APRUN_ATMENSANLLETKF) + exec_name = os.path.join(self.task_config.DATA, 'gdas.x') + exec_cmd.add_default_arg(exec_name) - exec_cmd.add_default_arg(self.task_config.fv3jedi_yaml) + exec_cmd.add_default_arg('fv3jedi') + exec_cmd.add_default_arg('localensembleda') + exec_cmd.add_default_arg(self.task_config.jedi_yaml) try: logger.debug(f"Executing {exec_cmd}") @@ -165,6 +142,31 @@ def execute(self: Analysis) -> None: pass + @logit(logger) + def init_fv3_increment(self: Analysis) -> None: + # Setup JEDI YAML file + self.task_config.jedi_yaml = os.path.join(self.task_config.DATA, + f"{self.task_config.JCB_ALGO}.yaml") + save_as_yaml(self.get_jedi_config(self.task_config.JCB_ALGO), self.task_config.jedi_yaml) + + # Link JEDI executable to run directory + self.task_config.jedi_exe = self.link_jediexe() + + @logit(logger) + def fv3_increment(self: Analysis) -> None: + # Run executable + exec_cmd = Executable(self.task_config.APRUN_ATMENSANLFV3INC) + exec_cmd.add_default_arg(self.task_config.jedi_exe) + exec_cmd.add_default_arg(self.task_config.jedi_yaml) + + try: + logger.debug(f"Executing {exec_cmd}") + exec_cmd() + except OSError: + raise OSError(f"Failed to execute {exec_cmd}") + except Exception: + raise WorkflowException(f"An error occured during execution of {exec_cmd}") + @logit(logger) def finalize(self: Analysis) -> None: """Finalize a global atmens analysis @@ -188,7 +190,7 @@ def finalize(self: Analysis) -> None: atmensstat = os.path.join(self.task_config.COM_ATMOS_ANALYSIS_ENS, f"{self.task_config.APREFIX}atmensstat") # get list of diag files to put in tarball - diags = glob.glob(os.path.join(self.task_config.DATA, 'diags', 'diag*nc4')) + diags = glob.glob(os.path.join(self.task_config.DATA, 'diags', 'diag*nc')) logger.info(f"Compressing {len(diags)} diag files to {atmensstat}.gz") @@ -206,9 +208,9 @@ def finalize(self: Analysis) -> None: archive.add(diaggzip, arcname=os.path.basename(diaggzip)) # copy full YAML from executable to ROTDIR - logger.info(f"Copying {self.task_config.fv3jedi_yaml} to {self.task_config.COM_ATMOS_ANALYSIS_ENS}") - src = os.path.join(self.task_config.DATA, f"{self.task_config.CDUMP}.t{self.task_config.cyc:02d}z.atmens.yaml") - dest = os.path.join(self.task_config.COM_ATMOS_ANALYSIS_ENS, f"{self.task_config.CDUMP}.t{self.task_config.cyc:02d}z.atmens.yaml") + logger.info(f"Copying {self.task_config.jedi_yaml} to {self.task_config.COM_ATMOS_ANALYSIS_ENS}") + src = os.path.join(self.task_config.DATA, f"{self.task_config.RUN}.t{self.task_config.cyc:02d}z.atmens.yaml") + dest = os.path.join(self.task_config.COM_ATMOS_ANALYSIS_ENS, f"{self.task_config.RUN}.t{self.task_config.cyc:02d}z.atmens.yaml") logger.debug(f"Copying {src} to {dest}") yaml_copy = { 'mkdir': [self.task_config.COM_ATMOS_ANALYSIS_ENS], @@ -216,42 +218,6 @@ def 
finalize(self: Analysis) -> None: } FileHandler(yaml_copy).sync() - # Create UFS model readable atm increment file from UFS-DA atm increment - logger.info("Create UFS model readable atm increment file from UFS-DA atm increment") - self.jedi2fv3inc() - - def clean(self): - super().clean() - - @logit(logger) - def jedi2fv3inc(self: Analysis) -> None: - """Generate UFS model readable analysis increment - - This method writes a UFS DA atm increment in UFS model readable format. - This includes: - - write UFS-DA atm increments using variable names expected by UFS model - - compute and write delp increment - - compute and write hydrostatic delz increment - - Please note that some of these steps are temporary and will be modified - once the modle is able to directly read atm increments. - - Parameters - ---------- - Analysis: parent class for GDAS task - - Returns - ---------- - None - """ - # Select the atm guess file based on the analysis and background resolutions - # Fields from the atm guess are used to compute the delp and delz increments - cdate = to_fv3time(self.task_config.current_cycle) - cdate_inc = cdate.replace('.', '_') - - # Reference the python script which does the actual work - incpy = os.path.join(self.task_config.HOMEgfs, 'ush/jediinc2fv3.py') - # create template dictionaries template_inc = self.task_config.COM_ATMOS_ANALYSIS_TMPL tmpl_inc_dict = { @@ -261,14 +227,10 @@ def jedi2fv3inc(self: Analysis) -> None: 'HH': self.task_config.current_cycle.strftime('%H') } - template_ges = self.task_config.COM_ATMOS_HISTORY_TMPL - tmpl_ges_dict = { - 'ROTDIR': self.task_config.ROTDIR, - 'RUN': self.task_config.RUN, - 'YMD': to_YMD(self.task_config.previous_cycle), - 'HH': self.task_config.previous_cycle.strftime('%H') - } - + # copy FV3 atm increment to comrot directory + logger.info("Copy UFS model readable atm increment file") + cdate = to_fv3time(self.task_config.current_cycle) + cdate_inc = cdate.replace('.', '_') # loop over ensemble members for imem in range(1, self.task_config.NMEM_ENS + 1): memchar = f"mem{imem:03d}" @@ -276,20 +238,15 @@ def jedi2fv3inc(self: Analysis) -> None: # create output path for member analysis increment tmpl_inc_dict['MEMDIR'] = memchar incdir = Template.substitute_structure(template_inc, TemplateConstants.DOLLAR_CURLY_BRACE, tmpl_inc_dict.get) + src = os.path.join(self.task_config.DATA, 'anl', memchar, f"atminc.{cdate_inc}z.nc4") + dest = os.path.join(incdir, f"{self.task_config.RUN}.t{self.task_config.cyc:02d}z.atminc.nc") - # rewrite UFS-DA atmens increments - tmpl_ges_dict['MEMDIR'] = memchar - gesdir = Template.substitute_structure(template_ges, TemplateConstants.DOLLAR_CURLY_BRACE, tmpl_ges_dict.get) - atmges_fv3 = os.path.join(gesdir, f"{self.task_config.CDUMP}.t{self.task_config.previous_cycle.hour:02d}z.atmf006.nc") - atminc_jedi = os.path.join(self.task_config.DATA, 'anl', memchar, f'atminc.{cdate_inc}z.nc4') - atminc_fv3 = os.path.join(incdir, f"{self.task_config.CDUMP}.t{self.task_config.cyc:02d}z.atminc.nc") - - # Execute incpy to create the UFS model atm increment file - # TODO: use MPMD or parallelize with mpi4py - # See https://github.com/NOAA-EMC/global-workflow/pull/1373#discussion_r1173060656 - cmd = Executable(incpy) - cmd.add_default_arg(atmges_fv3) - cmd.add_default_arg(atminc_jedi) - cmd.add_default_arg(atminc_fv3) - logger.debug(f"Executing {cmd}") - cmd(output='stdout', error='stderr') + # copy increment + logger.debug(f"Copying {src} to {dest}") + inc_copy = { + 'copy': [[src, dest]] + } + FileHandler(inc_copy).sync() + + def 
clean(self): + super().clean() diff --git a/ush/python/pygfs/task/marine_bmat.py b/ush/python/pygfs/task/marine_bmat.py new file mode 100644 index 0000000000..9d64e621c9 --- /dev/null +++ b/ush/python/pygfs/task/marine_bmat.py @@ -0,0 +1,350 @@ +#!/usr/bin/env python3 + +import os +import glob +from logging import getLogger +import pygfs.utils.marine_da_utils as mdau + +from wxflow import (AttrDict, + FileHandler, + add_to_datetime, to_timedelta, + chdir, + parse_j2yaml, + logit, + Executable, + Task) + +logger = getLogger(__name__.split('.')[-1]) + + +class MarineBMat(Task): + """ + Class for global marine B-matrix tasks + """ + @logit(logger, name="MarineBMat") + def __init__(self, config): + super().__init__(config) + _home_gdas = os.path.join(self.task_config.HOMEgfs, 'sorc', 'gdas.cd') + _calc_scale_exec = os.path.join(self.task_config.HOMEgfs, 'ush', 'soca', 'calc_scales.py') + _window_begin = add_to_datetime(self.task_config.current_cycle, -to_timedelta(f"{self.task_config.assim_freq}H") / 2) + _window_end = add_to_datetime(self.task_config.current_cycle, to_timedelta(f"{self.task_config.assim_freq}H") / 2) + + # compute the relative path from self.task_config.DATA to self.task_config.DATAenspert + if self.task_config.NMEM_ENS > 0: + _enspert_relpath = os.path.relpath(self.task_config.DATAenspert, self.task_config.DATA) + else: + _enspert_relpath = None + + # Create a local dictionary that is repeatedly used across this class + local_dict = AttrDict( + { + 'HOMEgdas': _home_gdas, + 'MARINE_WINDOW_BEGIN': _window_begin, + 'MARINE_WINDOW_END': _window_end, + 'MARINE_WINDOW_MIDDLE': self.task_config.current_cycle, + 'BERROR_YAML_DIR': os.path.join(_home_gdas, 'parm', 'soca', 'berror'), + 'GRID_GEN_YAML': os.path.join(_home_gdas, 'parm', 'soca', 'gridgen', 'gridgen.yaml'), + 'MARINE_ENSDA_STAGE_BKG_YAML_TMPL': os.path.join(_home_gdas, 'parm', 'soca', 'ensda', 'stage_ens_mem.yaml.j2'), + 'MARINE_DET_STAGE_BKG_YAML_TMPL': os.path.join(_home_gdas, 'parm', 'soca', 'soca_det_bkg_stage.yaml.j2'), + 'ENSPERT_RELPATH': _enspert_relpath, + 'CALC_SCALE_EXEC': _calc_scale_exec, + 'APREFIX': f"{self.task_config.RUN}.t{self.task_config.cyc:02d}z.", + } + ) + + # Extend task_config with local_dict + self.task_config = AttrDict(**self.task_config, **local_dict) + + @logit(logger) + def initialize(self: Task) -> None: + """Initialize a global B-matrix + + This method will initialize a global B-Matrix. 
+        This includes:
+        - staging the deterministic backgrounds (middle of window)
+        - staging SOCA fix files
+        - staging static ensemble members (optional)
+        - staging ensemble members (optional)
+        - generating the YAML files for the JEDI and GDASApp executables
+        - creating output directories
+        """
+        super().initialize()
+
+        # stage fix files
+        logger.info(f"Staging SOCA fix files from {self.task_config.SOCA_INPUT_FIX_DIR}")
+        soca_fix_list = parse_j2yaml(self.task_config.SOCA_FIX_YAML_TMPL, self.task_config)
+        FileHandler(soca_fix_list).sync()
+
+        # prepare the MOM6 input.nml
+        mdau.prep_input_nml(self.task_config)
+
+        # stage backgrounds
+        # TODO(G): Check ocean backgrounds dates for consistency
+        bkg_list = parse_j2yaml(self.task_config.MARINE_DET_STAGE_BKG_YAML_TMPL, self.task_config)
+        FileHandler(bkg_list).sync()
+        for cice_fname in ['./INPUT/cice.res.nc', './bkg/ice.bkg.f006.nc', './bkg/ice.bkg.f009.nc']:
+            mdau.cice_hist2fms(cice_fname, cice_fname)
+
+        # stage the grid generation yaml
+        FileHandler({'copy': [[self.task_config.GRID_GEN_YAML,
+                               os.path.join(self.task_config.DATA, 'gridgen.yaml')]]}).sync()
+
+        # generate the variance partitioning YAML file
+        logger.debug("Generate variance partitioning YAML file")
+        diagb_config = parse_j2yaml(path=os.path.join(self.task_config.BERROR_YAML_DIR, 'soca_diagb.yaml.j2'),
+                                    data=self.task_config)
+        diagb_config.save(os.path.join(self.task_config.DATA, 'soca_diagb.yaml'))
+
+        # generate the vertical decorrelation scale YAML file
+        logger.debug("Generate the vertical correlation scale YAML file")
+        vtscales_config = parse_j2yaml(path=os.path.join(self.task_config.BERROR_YAML_DIR, 'soca_vtscales.yaml.j2'),
+                                       data=self.task_config)
+        vtscales_config.save(os.path.join(self.task_config.DATA, 'soca_vtscales.yaml'))
+
+        # generate vertical diffusion scale YAML file
+        logger.debug("Generate vertical diffusion YAML file")
+        diffvz_config = parse_j2yaml(path=os.path.join(self.task_config.BERROR_YAML_DIR, 'soca_parameters_diffusion_vt.yaml.j2'),
+                                     data=self.task_config)
+        diffvz_config.save(os.path.join(self.task_config.DATA, 'soca_parameters_diffusion_vt.yaml'))
+
+        # generate the horizontal diffusion YAML files
+        if True:  # TODO(G): skip this section once we have optimized the scales
+            # stage the correlation scale configuration
+            logger.debug("Generate correlation scale YAML file")
+            FileHandler({'copy': [[os.path.join(self.task_config.BERROR_YAML_DIR, 'soca_setcorscales.yaml'),
+                                   os.path.join(self.task_config.DATA, 'soca_setcorscales.yaml')]]}).sync()
+
+            # generate horizontal diffusion scale YAML file
+            logger.debug("Generate horizontal diffusion scale YAML file")
+            diffhz_config = parse_j2yaml(path=os.path.join(self.task_config.BERROR_YAML_DIR, 'soca_parameters_diffusion_hz.yaml.j2'),
+                                         data=self.task_config)
+            diffhz_config.save(os.path.join(self.task_config.DATA, 'soca_parameters_diffusion_hz.yaml'))
+
+        # hybrid EnVAR case
+        if self.task_config.DOHYBVAR == "YES" or self.task_config.NMEM_ENS > 2:
+            # stage ensemble member files for use in the hybrid background error
+            logger.debug("Stage ensemble members for the hybrid background error")
+            mdau.stage_ens_mem(self.task_config)
+
+            # generate ensemble recentering/rebalancing YAML file
+            logger.debug("Generate ensemble recentering YAML file")
+            ensrecenter_config = parse_j2yaml(path=os.path.join(self.task_config.BERROR_YAML_DIR, 'soca_ensb.yaml.j2'),
+                                              data=self.task_config)
+            ensrecenter_config.save(os.path.join(self.task_config.DATA, 'soca_ensb.yaml'))
+
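# The YAML generation above repeats one wxflow pattern: render a Jinja2-templated
# YAML file with task_config as the substitution context, then save the rendered
# result where the SOCA executables expect it. A minimal sketch of that pattern,
# with hypothetical paths, using plain jinja2 and PyYAML as stand-ins for wxflow's
# parse_j2yaml and save helpers (an approximation, not the wxflow internals):

import jinja2
import yaml

def render_j2yaml(template_path: str, context: dict, out_path: str) -> dict:
    # render the Jinja2 template using the task configuration as context
    with open(template_path) as f:
        rendered = jinja2.Template(f.read()).render(**context)
    # parse the rendered text into a plain dictionary
    config = yaml.safe_load(rendered)
    # persist the concrete YAML for the executable to read
    with open(out_path, "w") as f:
        yaml.safe_dump(config, f)
    return config

# e.g. render_j2yaml('soca_diagb.yaml.j2', task_config, 'soca_diagb.yaml')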
logger.debug("Generate ensemble recentering YAML file: {self.task_config.abcd_yaml}") + hybridweights_config = parse_j2yaml(path=os.path.join(self.task_config.BERROR_YAML_DIR, 'soca_ensweights.yaml.j2'), + data=self.task_config) + hybridweights_config.save(os.path.join(self.task_config.DATA, 'soca_ensweights.yaml')) + + # need output dir for ensemble perturbations and static B-matrix + logger.debug("Create empty diagb directories to receive output from executables") + FileHandler({'mkdir': [os.path.join(self.task_config.DATA, 'diagb')]}).sync() + + @logit(logger) + def gridgen(self: Task) -> None: + # link gdas_soca_gridgen.x + mdau.link_executable(self.task_config, 'gdas_soca_gridgen.x') + exec_cmd = Executable(self.task_config.APRUN_MARINEBMAT) + exec_name = os.path.join(self.task_config.DATA, 'gdas_soca_gridgen.x') + exec_cmd.add_default_arg(exec_name) + exec_cmd.add_default_arg('gridgen.yaml') + + mdau.run(exec_cmd) + + @logit(logger) + def variance_partitioning(self: Task) -> None: + # link the variance partitioning executable, gdas_soca_diagb.x + mdau.link_executable(self.task_config, 'gdas_soca_diagb.x') + exec_cmd = Executable(self.task_config.APRUN_MARINEBMAT) + exec_name = os.path.join(self.task_config.DATA, 'gdas_soca_diagb.x') + exec_cmd.add_default_arg(exec_name) + exec_cmd.add_default_arg('soca_diagb.yaml') + + mdau.run(exec_cmd) + + @logit(logger) + def horizontal_diffusion(self: Task) -> None: + """Generate the horizontal diffusion coefficients + """ + # link the executable that computes the correlation scales, gdas_soca_setcorscales.x, + # and prepare the command to run it + mdau.link_executable(self.task_config, 'gdas_soca_setcorscales.x') + exec_cmd = Executable(self.task_config.APRUN_MARINEBMAT) + exec_name = os.path.join(self.task_config.DATA, 'gdas_soca_setcorscales.x') + exec_cmd.add_default_arg(exec_name) + exec_cmd.add_default_arg('soca_setcorscales.yaml') + + # create a files containing the correlation scales + mdau.run(exec_cmd) + + # link the executable that computes the correlation scales, gdas_soca_error_covariance_toolbox.x, + # and prepare the command to run it + mdau.link_executable(self.task_config, 'gdas_soca_error_covariance_toolbox.x') + exec_cmd = Executable(self.task_config.APRUN_MARINEBMAT) + exec_name = os.path.join(self.task_config.DATA, 'gdas_soca_error_covariance_toolbox.x') + exec_cmd.add_default_arg(exec_name) + exec_cmd.add_default_arg('soca_parameters_diffusion_hz.yaml') + + # compute the coefficients of the diffusion operator + mdau.run(exec_cmd) + + @logit(logger) + def vertical_diffusion(self: Task) -> None: + """Generate the vertical diffusion coefficients + """ + # compute the vertical correlation scales based on the MLD + FileHandler({'copy': [[os.path.join(self.task_config.CALC_SCALE_EXEC), + os.path.join(self.task_config.DATA, 'calc_scales.x')]]}).sync() + exec_cmd = Executable("python") + exec_name = os.path.join(self.task_config.DATA, 'calc_scales.x') + exec_cmd.add_default_arg(exec_name) + exec_cmd.add_default_arg('soca_vtscales.yaml') + mdau.run(exec_cmd) + + # link the executable that computes the correlation scales, gdas_soca_error_covariance_toolbox.x, + # and prepare the command to run it + mdau.link_executable(self.task_config, 'gdas_soca_error_covariance_toolbox.x') + exec_cmd = Executable(self.task_config.APRUN_MARINEBMAT) + exec_name = os.path.join(self.task_config.DATA, 'gdas_soca_error_covariance_toolbox.x') + exec_cmd.add_default_arg(exec_name) + exec_cmd.add_default_arg('soca_parameters_diffusion_vt.yaml') + + # 
compute the coefficients of the diffusion operator + mdau.run(exec_cmd) + + @logit(logger) + def ensemble_perturbations(self: Task) -> None: + """Generate the 3D ensemble of perturbation for the 3DEnVAR + + This method will generate ensemble perturbations re-balanced w.r.t the + deterministic background. + This includes: + - computing a storing the unbalanced ensemble perturbations' statistics + - recentering the ensemble members around the deterministic background and + accounting for the nonlinear steric recentering + - saving the recentered ensemble statistics + """ + mdau.link_executable(self.task_config, 'gdas_ens_handler.x') + exec_cmd = Executable(self.task_config.APRUN_MARINEBMAT) + exec_name = os.path.join(self.task_config.DATA, 'gdas_ens_handler.x') + exec_cmd.add_default_arg(exec_name) + exec_cmd.add_default_arg('soca_ensb.yaml') + + # generate the ensemble perturbations + mdau.run(exec_cmd) + + @logit(logger) + def hybrid_weight(self: Task) -> None: + """Generate the hybrid weights for the 3DEnVAR + + This method will generate the 3D fields hybrid weights for the 3DEnVAR for each + variables. + TODO(G): Currently implemented for the specific case of the static ensemble members only + """ + mdau.link_executable(self.task_config, 'gdas_socahybridweights.x') + exec_cmd = Executable(self.task_config.APRUN_MARINEBMAT) + exec_name = os.path.join(self.task_config.DATA, 'gdas_socahybridweights.x') + exec_cmd.add_default_arg(exec_name) + exec_cmd.add_default_arg('soca_ensweights.yaml') + + # compute the ensemble weights + mdau.run(exec_cmd) + + @logit(logger) + def execute(self: Task) -> None: + """Generate the full B-matrix + + This method will generate the full B-matrix according to the configuration. + """ + chdir(self.task_config.DATA) + self.gridgen() # TODO: This should be optional in case the geometry file was staged + self.variance_partitioning() + self.horizontal_diffusion() # TODO: Make this optional once we've converged on an acceptable set of scales + self.vertical_diffusion() + # hybrid EnVAR case + if self.task_config.DOHYBVAR == "YES" or self.task_config.NMEM_ENS > 2: + self.ensemble_perturbations() # TODO: refactor this from the old scripts + self.hybrid_weight() # TODO: refactor this from the old scripts + + @logit(logger) + def finalize(self: Task) -> None: + """Finalize the global B-matrix job + + This method will finalize the global B-matrix job. + This includes: + - copy the generated static, but cycle dependent background error files to the ROTDIR + - copy the generated YAML file from initialize to the ROTDIR + - keep the re-balanced ensemble perturbation files in DATAenspert + - ... 
+ + """ + # Copy the soca grid if it was created + grid_file = os.path.join(self.task_config.DATA, 'soca_gridspec.nc') + if os.path.exists(grid_file): + logger.info(f"Copying the soca grid file to the ROTDIR") + FileHandler({'copy': [[grid_file, + os.path.join(self.task_config.COMOUT_OCEAN_BMATRIX, 'soca_gridspec.nc')]]}).sync() + + # Copy the diffusion coefficient files to the ROTDIR + logger.info(f"Copying the diffusion coefficient files to the ROTDIR") + diffusion_coeff_list = [] + for diff_type in ['hz', 'vt']: + src = os.path.join(self.task_config.DATA, f"{diff_type}_ocean.nc") + dest = os.path.join(self.task_config.COMOUT_OCEAN_BMATRIX, + f"{self.task_config.APREFIX}{diff_type}_ocean.nc") + diffusion_coeff_list.append([src, dest]) + + src = os.path.join(self.task_config.DATA, f"hz_ice.nc") + dest = os.path.join(self.task_config.COMOUT_ICE_BMATRIX, + f"{self.task_config.APREFIX}hz_ice.nc") + diffusion_coeff_list.append([src, dest]) + + FileHandler({'copy': diffusion_coeff_list}).sync() + + # Copy diag B files to ROTDIR + logger.info(f"Copying diag B files to the ROTDIR") + diagb_list = [] + window_end_iso = self.task_config.MARINE_WINDOW_END.strftime('%Y-%m-%dT%H:%M:%SZ') + + # ocean diag B + src = os.path.join(self.task_config.DATA, 'diagb', f"ocn.bkgerr_stddev.incr.{window_end_iso}.nc") + dst = os.path.join(self.task_config.COMOUT_OCEAN_BMATRIX, + f"{self.task_config.APREFIX}ocean.bkgerr_stddev.nc") + diagb_list.append([src, dst]) + + # ice diag B + src = os.path.join(self.task_config.DATA, 'diagb', f"ice.bkgerr_stddev.incr.{window_end_iso}.nc") + dst = os.path.join(self.task_config.COMOUT_ICE_BMATRIX, + f"{self.task_config.APREFIX}ice.bkgerr_stddev.nc") + diagb_list.append([src, dst]) + + FileHandler({'copy': diagb_list}).sync() + + # Copy the ensemble perturbation diagnostics to the ROTDIR + if self.task_config.DOHYBVAR == "YES" or self.task_config.NMEM_ENS > 3: + window_middle_iso = self.task_config.MARINE_WINDOW_MIDDLE.strftime('%Y-%m-%dT%H:%M:%SZ') + weight_list = [] + src = os.path.join(self.task_config.DATA, f"ocn.ens_weights.incr.{window_middle_iso}.nc") + dst = os.path.join(self.task_config.COMOUT_OCEAN_BMATRIX, + f"{self.task_config.APREFIX}ocean.ens_weights.nc") + weight_list.append([src, dst]) + + src = os.path.join(self.task_config.DATA, f"ice.ens_weights.incr.{window_middle_iso}.nc") + dst = os.path.join(self.task_config.COMOUT_ICE_BMATRIX, + f"{self.task_config.APREFIX}ice.ens_weights.nc") + weight_list.append([src, dst]) + + # TODO(G): missing ssh_steric_stddev, ssh_unbal_stddev, ssh_total_stddev and steric_explained_variance + + FileHandler({'copy': weight_list}).sync() + + # Copy the YAML files to the OCEAN ROTDIR + yamls = glob.glob(os.path.join(self.task_config.DATA, '*.yaml')) + yaml_list = [] + for yaml_file in yamls: + dest = os.path.join(self.task_config.COMOUT_OCEAN_BMATRIX, + f"{self.task_config.APREFIX}{os.path.basename(yaml_file)}") + yaml_list.append([yaml_file, dest]) + FileHandler({'copy': yaml_list}).sync() diff --git a/ush/python/pygfs/task/marine_letkf.py b/ush/python/pygfs/task/marine_letkf.py new file mode 100644 index 0000000000..36c26d594b --- /dev/null +++ b/ush/python/pygfs/task/marine_letkf.py @@ -0,0 +1,147 @@ +#!/usr/bin/env python3 + +import f90nml +from logging import getLogger +import os +from pygfs.task.analysis import Analysis +from typing import Dict +from wxflow import (AttrDict, + FileHandler, + logit, + parse_j2yaml, + to_timedelta, + to_YMDH) + +logger = getLogger(__name__.split('.')[-1]) + + +class MarineLETKF(Analysis): + """ + 
Class for global ocean and sea ice analysis LETKF task
+    """
+
+    @logit(logger, name="MarineLETKF")
+    def __init__(self, config: Dict) -> None:
+        """Constructor for ocean and sea ice LETKF task
+        Parameters:
+        ------------
+        config: Dict
+            configuration, namely environment variables
+        Returns:
+        --------
+        None
+        """
+
+        logger.info("init")
+        super().__init__(config)
+
+        _half_assim_freq = to_timedelta(f"{self.task_config.assim_freq}H") / 2
+        _letkf_yaml_file = 'letkf.yaml'
+        _letkf_exec_args = [self.task_config.MARINE_LETKF_EXEC,
+                            'soca',
+                            'localensembleda',
+                            _letkf_yaml_file]
+
+        self.task_config.WINDOW_MIDDLE = self.task_config.current_cycle
+        self.task_config.WINDOW_BEGIN = self.task_config.current_cycle - _half_assim_freq
+        self.task_config.letkf_exec_args = _letkf_exec_args
+        self.task_config.letkf_yaml_file = _letkf_yaml_file
+        self.task_config.mom_input_nml_tmpl = os.path.join(self.task_config.DATA, 'mom_input.nml.tmpl')
+        self.task_config.mom_input_nml = os.path.join(self.task_config.DATA, 'mom_input.nml')
+        self.task_config.obs_dir = os.path.join(self.task_config.DATA, 'obs')
+
+    @logit(logger)
+    def initialize(self):
+        """Method initialize for ocean and sea ice LETKF task
+        Parameters:
+        ------------
+        None
+        Returns:
+        --------
+        None
+        """
+
+        logger.info("initialize")
+
+        # make directories and stage ensemble background files
+        ensbkgconf = AttrDict()
+        keys = ['previous_cycle', 'current_cycle', 'DATA', 'NMEM_ENS',
+                'PARMgfs', 'ROTDIR', 'COM_OCEAN_HISTORY_TMPL', 'COM_ICE_HISTORY_TMPL']
+        for key in keys:
+            ensbkgconf[key] = self.task_config[key]
+        ensbkgconf.RUN = 'enkfgdas'
+        soca_ens_bkg_stage_list = parse_j2yaml(self.task_config.SOCA_ENS_BKG_STAGE_YAML_TMPL, ensbkgconf)
+        FileHandler(soca_ens_bkg_stage_list).sync()
+        soca_fix_stage_list = parse_j2yaml(self.task_config.SOCA_FIX_YAML_TMPL, self.task_config)
+        FileHandler(soca_fix_stage_list).sync()
+        letkf_stage_list = parse_j2yaml(self.task_config.MARINE_LETKF_STAGE_YAML_TMPL, self.task_config)
+        FileHandler(letkf_stage_list).sync()
+
+        obs_list = parse_j2yaml(self.task_config.OBS_YAML, self.task_config)
+
+        # get the list of observations
+        obs_files = []
+        for ob in obs_list['observers']:
+            obs_name = ob['obs space']['name'].lower()
+            obs_filename = f"{self.task_config.RUN}.t{self.task_config.cyc}z.{obs_name}.{to_YMDH(self.task_config.current_cycle)}.nc"
+            obs_files.append((obs_filename, ob))
+
+        obs_files_to_copy = []
+        obs_to_use = []
+        # copy obs from COMIN_OBS to DATA/obs
+        for obs_file, ob in obs_files:
+            obs_src = os.path.join(self.task_config.COMIN_OBS, obs_file)
+            obs_dst = os.path.join(self.task_config.DATA, self.task_config.obs_dir, obs_file)
+            if os.path.exists(obs_src):
+                obs_files_to_copy.append([obs_src, obs_dst])
+                obs_to_use.append(ob)
+            else:
+                logger.warning(f"{obs_file} is not available in {self.task_config.COMIN_OBS}")
+
+        # stage the desired obs files
+        FileHandler({'copy': obs_files_to_copy}).sync()
+
+        # make the letkf.yaml
+        letkfconf = AttrDict()
+        keys = ['WINDOW_BEGIN', 'WINDOW_MIDDLE', 'RUN', 'gcyc', 'NMEM_ENS']
+        for key in keys:
+            letkfconf[key] = self.task_config[key]
+        letkfconf.RUN = 'enkfgdas'
+        letkf_yaml = parse_j2yaml(self.task_config.MARINE_LETKF_YAML_TMPL, letkfconf)
+        letkf_yaml.observations.observers = obs_to_use
+        letkf_yaml.save(self.task_config.letkf_yaml_file)
+
+        # swap date and stack size in mom_input.nml
+        domain_stack_size = self.task_config.DOMAIN_STACK_SIZE
+        ymdhms = [int(s) for s in self.task_config.WINDOW_BEGIN.strftime('%Y,%m,%d,%H,%M,%S').split(',')]
+        with
open(self.task_config.mom_input_nml_tmpl, 'r') as nml_file: + nml = f90nml.read(nml_file) + nml['ocean_solo_nml']['date_init'] = ymdhms + nml['fms_nml']['domains_stack_size'] = int(domain_stack_size) + nml.write(self.task_config.mom_input_nml, force=True) # force to overwrite if necessary + + @logit(logger) + def run(self): + """Method run for ocean and sea ice LETKF task + Parameters: + ------------ + None + Returns: + -------- + None + """ + + logger.info("run") + + @logit(logger) + def finalize(self): + """Method finalize for ocean and sea ice LETKF task + Parameters: + ------------ + None + Returns: + -------- + None + """ + + logger.info("finalize") diff --git a/ush/python/pygfs/task/oceanice_products.py b/ush/python/pygfs/task/oceanice_products.py new file mode 100644 index 0000000000..98b57ae801 --- /dev/null +++ b/ush/python/pygfs/task/oceanice_products.py @@ -0,0 +1,356 @@ +#!/usr/bin/env python3 + +import os +from logging import getLogger +from typing import List, Dict, Any +from pprint import pformat +import xarray as xr + +from wxflow import (AttrDict, + parse_j2yaml, + FileHandler, + Jinja, + logit, + Task, + add_to_datetime, to_timedelta, + WorkflowException, + Executable) + +logger = getLogger(__name__.split('.')[-1]) + + +class OceanIceProducts(Task): + """Ocean Ice Products Task + """ + + VALID_COMPONENTS = ['ocean', 'ice'] + COMPONENT_RES_MAP = {'ocean': 'OCNRES', 'ice': 'ICERES'} + VALID_PRODUCT_GRIDS = {'mx025': ['1p00', '0p25'], + 'mx050': ['1p00', '0p50'], + 'mx100': ['1p00'], + 'mx500': ['5p00']} + + # These could be read from the yaml file + TRIPOLE_DIMS_MAP = {'mx025': [1440, 1080], 'mx050': [720, 526], 'mx100': [360, 320], 'mx500': [72, 35]} + LATLON_DIMS_MAP = {'0p25': [1440, 721], '0p50': [720, 361], '1p00': [360, 181], '5p00': [72, 36]} + + @logit(logger, name="OceanIceProducts") + def __init__(self, config: Dict[str, Any]) -> None: + """Constructor for the Ocean/Ice Productstask + + Parameters + ---------- + config : Dict[str, Any] + Incoming configuration for the task from the environment + + Returns + ------- + None + """ + super().__init__(config) + + if self.task_config.COMPONENT not in self.VALID_COMPONENTS: + raise NotImplementedError(f'{self.task_config.COMPONENT} is not a valid model component.\n' + + 'Valid model components are:\n' + + f'{", ".join(self.VALID_COMPONENTS)}') + + model_grid = f"mx{self.task_config[self.COMPONENT_RES_MAP[self.task_config.COMPONENT]]:03d}" + + valid_datetime = add_to_datetime(self.task_config.current_cycle, to_timedelta(f"{self.task_config.FORECAST_HOUR}H")) + + if self.task_config.COMPONENT == 'ice': + offset = int(self.task_config.current_cycle.strftime("%H")) % self.task_config.FHOUT_ICE_GFS + # For CICE cases where offset is not 0, forecast_hour needs to be adjusted based on the offset. + # TODO: Consider FHMIN when calculating offset. 
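            # Worked example (hypothetical values): for a 03z cycle with FHOUT_ICE_GFS=6,
            # offset = 3 % 6 = 3. A request for FORECAST_HOUR=9 is then shifted to
            # forecast_hour = 9 - 3 = 6, and since 6 <= FHOUT_ICE_GFS the first
            # averaging interval is shortened to 6 - 3 = 3 hours; a later request such
            # as FORECAST_HOUR=15 gives forecast_hour = 12 and the regular 6-hour interval.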
+ if offset != 0: + forecast_hour = self.task_config.FORECAST_HOUR - int(self.task_config.current_cycle.strftime("%H")) + # For the first forecast hour, the interval may be different from the intervals of subsequent forecast hours + if forecast_hour <= self.task_config.FHOUT_ICE_GFS: + interval = self.task_config.FHOUT_ICE_GFS - int(self.task_config.current_cycle.strftime("%H")) + else: + interval = self.task_config.FHOUT_ICE_GFS + else: + forecast_hour = self.task_config.FORECAST_HOUR + interval = self.task_config.FHOUT_ICE_GFS + if self.task_config.COMPONENT == 'ocean': + forecast_hour = self.task_config.FORECAST_HOUR + interval = self.task_config.FHOUT_OCN_GFS + + # TODO: This is a bit of a hack, but it works for now + # FIXME: find a better way to provide the averaging period + avg_period = f"{forecast_hour-interval:03d}-{forecast_hour:03d}" + + # Extend task_config with localdict + localdict = AttrDict( + {'component': self.task_config.COMPONENT, + 'forecast_hour': forecast_hour, + 'valid_datetime': valid_datetime, + 'avg_period': avg_period, + 'model_grid': model_grid, + 'interval': interval, + 'product_grids': self.VALID_PRODUCT_GRIDS[model_grid]} + ) + self.task_config = AttrDict(**self.task_config, **localdict) + + # Read the oceanice_products.yaml file for common configuration + logger.info(f"Read the ocean ice products configuration yaml file {self.task_config.OCEANICEPRODUCTS_CONFIG}") + self.task_config.oceanice_yaml = parse_j2yaml(self.task_config.OCEANICEPRODUCTS_CONFIG, self.task_config) + logger.debug(f"oceanice_yaml:\n{pformat(self.task_config.oceanice_yaml)}") + + @staticmethod + @logit(logger) + def initialize(config: Dict) -> None: + """Initialize the work directory by copying all the common fix data + + Parameters + ---------- + config : Dict + Configuration dictionary for the task + + Returns + ------- + None + """ + + # Copy static data to run directory + logger.info("Copy static data to run directory") + FileHandler(config.oceanice_yaml.ocnicepost.fix_data).sync() + + # Copy "component" specific model data to run directory (e.g. ocean/ice forecast output) + logger.info(f"Copy {config.component} data to run directory") + FileHandler(config.oceanice_yaml[config.component].data_in).sync() + + @staticmethod + @logit(logger) + def configure(config: Dict, product_grid: str) -> None: + """Configure the namelist for the product_grid in the work directory. + Create namelist 'ocnicepost.nml' from template + + Parameters + ---------- + config : Dict + Configuration dictionary for the task + product_grid : str + Target product grid to process + + Returns + ------- + None + """ + + # Make a localconf with the "component" specific configuration for parsing the namelist + localconf = AttrDict() + localconf.DATA = config.DATA + localconf.component = config.component + + localconf.source_tripole_dims = ', '.join(map(str, OceanIceProducts.TRIPOLE_DIMS_MAP[config.model_grid])) + localconf.target_latlon_dims = ', '.join(map(str, OceanIceProducts.LATLON_DIMS_MAP[product_grid])) + + localconf.maskvar = config.oceanice_yaml[config.component].namelist.maskvar + localconf.sinvar = config.oceanice_yaml[config.component].namelist.sinvar + localconf.cosvar = config.oceanice_yaml[config.component].namelist.cosvar + localconf.angvar = config.oceanice_yaml[config.component].namelist.angvar + localconf.debug = ".true." if config.oceanice_yaml.ocnicepost.namelist.debug else ".false." 
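        # Worked example (hypothetical grids): for model_grid='mx025' and product_grid='0p25',
        # the class maps above give source_tripole_dims = '1440, 1080' and
        # target_latlon_dims = '1440, 721', which are rendered into the ocnicepost
        # namelist as comma-separated dimension pairs.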
+ + logger.debug(f"localconf:\n{pformat(localconf)}") + + # Configure the namelist and write to file + logger.info("Create namelist for ocnicepost.x") + nml_template = os.path.join(localconf.DATA, "ocnicepost.nml.jinja2") + nml_data = Jinja(nml_template, localconf).render + logger.debug(f"ocnicepost_nml:\n{nml_data}") + nml_file = os.path.join(localconf.DATA, "ocnicepost.nml") + with open(nml_file, "w") as fho: + fho.write(nml_data) + + @staticmethod + @logit(logger) + def execute(config: Dict, product_grid: str) -> None: + """Run the ocnicepost.x executable to interpolate and convert to grib2 + + Parameters + ---------- + config : Dict + Configuration dictionary for the task + product_grid : str + Target product grid to process + + Returns + ------- + None + """ + + # Run the ocnicepost.x executable + OceanIceProducts.interp(config.DATA, config.APRUN_OCNICEPOST, exec_name="ocnicepost.x") + + # Convert interpolated netCDF file to grib2 + OceanIceProducts.netCDF_to_grib2(config, product_grid) + + @staticmethod + @logit(logger) + def interp(workdir: str, aprun_cmd: str, exec_name: str = "ocnicepost.x") -> None: + """ + Run the interpolation executable to generate rectilinear netCDF file + + Parameters + ---------- + config : Dict + Configuration dictionary for the task + workdir : str + Working directory for the task + aprun_cmd : str + aprun command to use + exec_name : str + Name of the executable e.g. ocnicepost.x + + Returns + ------- + None + """ + os.chdir(workdir) + logger.debug(f"Current working directory: {os.getcwd()}") + + exec_cmd = Executable(aprun_cmd) + exec_cmd.add_default_arg(os.path.join(workdir, exec_name)) + + OceanIceProducts._call_executable(exec_cmd) + + @staticmethod + @logit(logger) + def netCDF_to_grib2(config: Dict, grid: str) -> None: + """Convert interpolated netCDF file to grib2 + + Parameters + ---------- + config : Dict + Configuration dictionary for the task + grid : str + Target product grid to process + + Returns + ------ + None + """ + + os.chdir(config.DATA) + + exec_cmd = Executable(config.oceanice_yaml.nc2grib2.script) + arguments = [config.component, grid, config.current_cycle.strftime("%Y%m%d%H"), config.avg_period] + if config.component == 'ocean': + levs = config.oceanice_yaml.ocean.namelist.ocean_levels + arguments.append(':'.join(map(str, levs))) + + logger.info(f"Executing {exec_cmd} with arguments {arguments}") + try: + exec_cmd(*arguments) + except OSError: + logger.exception(f"FATAL ERROR: Failed to execute {exec_cmd}") + raise OSError(f"{exec_cmd}") + except Exception: + logger.exception(f"FATAL ERROR: Error occurred during execution of {exec_cmd}") + raise WorkflowException(f"{exec_cmd}") + + @staticmethod + @logit(logger) + def subset(config: Dict) -> None: + """ + Subset a list of variables from a netcdf file and save to a new netcdf file. 
+ Also save global attributes and history from the old netcdf file into new netcdf file + + Parameters + ---------- + config : Dict + Configuration dictionary for the task + + Returns + ------- + None + """ + + os.chdir(config.DATA) + + input_file = f"{config.component}.nc" + output_file = f"{config.component}_subset.nc" + varlist = config.oceanice_yaml[config.component].subset + + logger.info(f"Subsetting {varlist} from {input_file} to {output_file}") + + try: + # open the netcdf file + ds = xr.open_dataset(input_file) + + # subset the variables + ds_subset = ds[varlist] + + # save global attributes from the old netcdf file into new netcdf file + ds_subset.attrs = ds.attrs + + # save subsetted variables to a new netcdf file + ds_subset.to_netcdf(output_file) + + except FileNotFoundError: + logger.exception(f"FATAL ERROR: Input file not found: {input_file}") + raise FileNotFoundError(f"File not found: {input_file}") + + except IOError as err: + logger.exception(f"FATAL ERROR: IOError occurred during netCDF subset: {input_file}") + raise IOError(f"An I/O error occurred: {err}") + + except Exception as err: + logger.exception(f"FATAL ERROR: Error occurred during netCDF subset: {input_file}") + raise WorkflowException(f"{err}") + + finally: + # close the netcdf files + ds.close() + ds_subset.close() + + @staticmethod + @logit(logger) + def _call_executable(exec_cmd: Executable) -> None: + """Internal method to call executable + + Parameters + ---------- + exec_cmd : Executable + Executable to run + + Raises + ------ + OSError + Failure due to OS issues + WorkflowException + All other exceptions + """ + + logger.info(f"Executing {exec_cmd}") + try: + exec_cmd() + except OSError: + logger.exception(f"FATAL ERROR: Failed to execute {exec_cmd}") + raise OSError(f"{exec_cmd}") + except Exception: + logger.exception(f"FATAL ERROR: Error occurred during execution of {exec_cmd}") + raise WorkflowException(f"{exec_cmd}") + + @staticmethod + @logit(logger) + def finalize(config: Dict) -> None: + """Perform closing actions of the task. + Copy data back from the DATA/ directory to COM/ + + Parameters + ---------- + config: Dict + Configuration dictionary for the task + + Returns + ------- + None + """ + + # Copy "component" specific generated data to COM/ directory + data_out = config.oceanice_yaml[config.component].data_out + + logger.info(f"Copy processed data to COM/ directory") + FileHandler(data_out).sync() diff --git a/ush/python/pygfs/task/land_analysis.py b/ush/python/pygfs/task/snow_analysis.py similarity index 72% rename from ush/python/pygfs/task/land_analysis.py rename to ush/python/pygfs/task/snow_analysis.py index 307e875183..9656b00a8e 100644 --- a/ush/python/pygfs/task/land_analysis.py +++ b/ush/python/pygfs/task/snow_analysis.py @@ -11,7 +11,7 @@ FileHandler, to_fv3time, to_YMD, to_YMDH, to_timedelta, add_to_datetime, rm_p, - parse_j2yaml, parse_yamltmpl, save_as_yaml, + parse_j2yaml, save_as_yaml, Jinja, logit, Executable, @@ -21,44 +21,44 @@ logger = getLogger(__name__.split('.')[-1]) -class LandAnalysis(Analysis): +class SnowAnalysis(Analysis): """ - Class for global land analysis tasks + Class for global snow analysis tasks """ - NMEM_LANDENS = 2 # The size of the land ensemble is fixed at 2. Does this need to be a variable? 
+ NMEM_SNOWENS = 2 - @logit(logger, name="LandAnalysis") + @logit(logger, name="SnowAnalysis") def __init__(self, config): super().__init__(config) - _res = int(self.config['CASE'][1:]) - _window_begin = add_to_datetime(self.runtime_config.current_cycle, -to_timedelta(f"{self.config['assim_freq']}H") / 2) - _letkfoi_yaml = os.path.join(self.runtime_config.DATA, f"{self.runtime_config.RUN}.t{self.runtime_config['cyc']:02d}z.letkfoi.yaml") + _res = int(self.task_config['CASE'][1:]) + _window_begin = add_to_datetime(self.task_config.current_cycle, -to_timedelta(f"{self.task_config['assim_freq']}H") / 2) + _letkfoi_yaml = os.path.join(self.task_config.DATA, f"{self.task_config.RUN}.t{self.task_config['cyc']:02d}z.letkfoi.yaml") # Create a local dictionary that is repeatedly used across this class local_dict = AttrDict( { 'npx_ges': _res + 1, 'npy_ges': _res + 1, - 'npz_ges': self.config.LEVS - 1, - 'npz': self.config.LEVS - 1, - 'LAND_WINDOW_BEGIN': _window_begin, - 'LAND_WINDOW_LENGTH': f"PT{self.config['assim_freq']}H", - 'OPREFIX': f"{self.runtime_config.RUN}.t{self.runtime_config.cyc:02d}z.", - 'APREFIX': f"{self.runtime_config.RUN}.t{self.runtime_config.cyc:02d}z.", + 'npz_ges': self.task_config.LEVS - 1, + 'npz': self.task_config.LEVS - 1, + 'SNOW_WINDOW_BEGIN': _window_begin, + 'SNOW_WINDOW_LENGTH': f"PT{self.task_config['assim_freq']}H", + 'OPREFIX': f"{self.task_config.RUN}.t{self.task_config.cyc:02d}z.", + 'APREFIX': f"{self.task_config.RUN}.t{self.task_config.cyc:02d}z.", 'jedi_yaml': _letkfoi_yaml } ) - # task_config is everything that this task should need - self.task_config = AttrDict(**self.config, **self.runtime_config, **local_dict) + # Extend task_config with local_dict + self.task_config = AttrDict(**self.task_config, **local_dict) @logit(logger) def prepare_GTS(self) -> None: - """Prepare the GTS data for a global land analysis + """Prepare the GTS data for a global snow analysis - This method will prepare GTS data for a global land analysis using JEDI. + This method will prepare GTS data for a global snow analysis using JEDI. This includes: - processing GTS bufr snow depth observation data to IODA format @@ -74,7 +74,7 @@ def prepare_GTS(self) -> None: # create a temporary dict of all keys needed in this method localconf = AttrDict() keys = ['HOMEgfs', 'DATA', 'current_cycle', 'COM_OBS', 'COM_ATMOS_RESTART_PREV', - 'OPREFIX', 'CASE', 'ntiles'] + 'OPREFIX', 'CASE', 'OCNRES', 'ntiles'] for key in keys: localconf[key] = self.task_config[key] @@ -99,7 +99,7 @@ def prepare_GTS(self) -> None: def _gtsbufr2iodax(exe, yaml_file): if not os.path.isfile(yaml_file): - logger.exception(f"{yaml_file} not found") + logger.exception(f"FATAL ERROR: {yaml_file} not found") raise FileNotFoundError(yaml_file) logger.info(f"Executing {exe}") @@ -114,7 +114,7 @@ def _gtsbufr2iodax(exe, yaml_file): # 1. generate bufr2ioda YAML files # 2. execute bufr2ioda.x for name in prep_gts_config.bufr2ioda.keys(): - gts_yaml = os.path.join(self.runtime_config.DATA, f"bufr_{name}_snow.yaml") + gts_yaml = os.path.join(self.task_config.DATA, f"bufr_{name}_snow.yaml") logger.info(f"Generate BUFR2IODA YAML file: {gts_yaml}") temp_yaml = parse_j2yaml(prep_gts_config.bufr2ioda[name], localconf) save_as_yaml(temp_yaml, gts_yaml) @@ -133,9 +133,9 @@ def _gtsbufr2iodax(exe, yaml_file): @logit(logger) def prepare_IMS(self) -> None: - """Prepare the IMS data for a global land analysis + """Prepare the IMS data for a global snow analysis - This method will prepare IMS data for a global land analysis using JEDI. 
+ This method will prepare IMS data for a global snow analysis using JEDI. This includes: - staging model backgrounds - processing raw IMS observation data and prepare for conversion to IODA @@ -153,7 +153,7 @@ def prepare_IMS(self) -> None: # create a temporary dict of all keys needed in this method localconf = AttrDict() keys = ['DATA', 'current_cycle', 'COM_OBS', 'COM_ATMOS_RESTART_PREV', - 'OPREFIX', 'CASE', 'OCNRES', 'ntiles'] + 'OPREFIX', 'CASE', 'OCNRES', 'ntiles', 'FIXgfs'] for key in keys: localconf[key] = self.task_config[key] @@ -198,7 +198,7 @@ def prepare_IMS(self) -> None: raise WorkflowException(f"An error occured during execution of {exe}") # Ensure the snow depth IMS file is produced by the above executable - input_file = f"IMSscf.{to_YMD(localconf.current_cycle)}.{localconf.CASE}.mx{localconf.OCNRES}_oro_data.nc" + input_file = f"IMSscf.{to_YMD(localconf.current_cycle)}.{localconf.CASE}_oro_data.nc" if not os.path.isfile(f"{os.path.join(localconf.DATA, input_file)}"): logger.exception(f"{self.task_config.CALCFIMSEXE} failed to produce {input_file}") raise FileNotFoundError(f"{os.path.join(localconf.DATA, input_file)}") @@ -232,7 +232,7 @@ def prepare_IMS(self) -> None: @logit(logger) def initialize(self) -> None: - """Initialize method for Land analysis + """Initialize method for snow analysis This method: - creates artifacts in the DATA directory by copying fix files - creates the JEDI LETKF yaml from the template @@ -241,7 +241,7 @@ def initialize(self) -> None: Parameters ---------- self : Analysis - Instance of the LandAnalysis object + Instance of the SnowAnalysis object """ super().initialize() @@ -249,30 +249,27 @@ def initialize(self) -> None: # create a temporary dict of all keys needed in this method localconf = AttrDict() keys = ['DATA', 'current_cycle', 'COM_OBS', 'COM_ATMOS_RESTART_PREV', - 'OPREFIX', 'CASE', 'ntiles'] + 'OPREFIX', 'CASE', 'OCNRES', 'ntiles'] for key in keys: localconf[key] = self.task_config[key] # Make member directories in DATA for background dirlist = [] - for imem in range(1, LandAnalysis.NMEM_LANDENS + 1): + for imem in range(1, SnowAnalysis.NMEM_SNOWENS + 1): dirlist.append(os.path.join(localconf.DATA, 'bkg', f'mem{imem:03d}')) FileHandler({'mkdir': dirlist}).sync() # stage fix files - jedi_fix_list_path = os.path.join(self.task_config.HOMEgfs, 'parm', 'gdas', 'land_jedi_fix.yaml') - logger.info(f"Staging JEDI fix files from {jedi_fix_list_path}") - jedi_fix_list = parse_yamltmpl(jedi_fix_list_path, self.task_config) + logger.info(f"Staging JEDI fix files from {self.task_config.JEDI_FIX_YAML}") + jedi_fix_list = parse_j2yaml(self.task_config.JEDI_FIX_YAML, self.task_config) FileHandler(jedi_fix_list).sync() # stage backgrounds logger.info("Staging ensemble backgrounds") FileHandler(self.get_ens_bkg_dict(localconf)).sync() - # generate letkfoi YAML file - logger.info(f"Generate JEDI LETKF YAML file: {self.task_config.jedi_yaml}") - letkfoi_yaml = parse_j2yaml(self.task_config.JEDIYAML, self.task_config) - save_as_yaml(letkfoi_yaml, self.task_config.jedi_yaml) + # Write out letkfoi YAML file + save_as_yaml(self.task_config.jedi_config, self.task_config.jedi_yaml) logger.info(f"Wrote letkfoi YAML to: {self.task_config.jedi_yaml}") # need output dir for diags and anl @@ -294,15 +291,15 @@ def execute(self) -> None: Parameters ---------- self : Analysis - Instance of the LandAnalysis object + Instance of the SnowAnalysis object """ # create a temporary dict of all keys needed in this method localconf = AttrDict() keys = ['HOMEgfs', 'DATA', 
'current_cycle', - 'COM_ATMOS_RESTART_PREV', 'COM_LAND_ANALYSIS', 'APREFIX', - 'SNOWDEPTHVAR', 'BESTDDEV', 'CASE', 'ntiles', - 'APRUN_LANDANL', 'JEDIEXE', 'jedi_yaml', + 'COM_ATMOS_RESTART_PREV', 'COM_SNOW_ANALYSIS', 'APREFIX', + 'SNOWDEPTHVAR', 'BESTDDEV', 'CASE', 'OCNRES', 'ntiles', + 'APRUN_SNOWANL', 'JEDIEXE', 'jedi_yaml', 'DOIAU', 'SNOW_WINDOW_BEGIN', 'APPLY_INCR_NML_TMPL', 'APPLY_INCR_EXE', 'APRUN_APPLY_INCR'] for key in keys: localconf[key] = self.task_config[key] @@ -313,17 +310,27 @@ def execute(self) -> None: AttrDict({key: localconf[key] for key in ['DATA', 'ntiles', 'current_cycle']})) logger.info("Running JEDI LETKF") - self.execute_jediexe(localconf.DATA, - localconf.APRUN_LANDANL, - os.path.basename(localconf.JEDIEXE), - localconf.jedi_yaml) + exec_cmd = Executable(localconf.APRUN_SNOWANL) + exec_name = os.path.join(localconf.DATA, 'gdas.x') + exec_cmd.add_default_arg(exec_name) + exec_cmd.add_default_arg('fv3jedi') + exec_cmd.add_default_arg('localensembleda') + exec_cmd.add_default_arg(localconf.jedi_yaml) + + try: + logger.debug(f"Executing {exec_cmd}") + exec_cmd() + except OSError: + raise OSError(f"Failed to execute {exec_cmd}") + except Exception: + raise WorkflowException(f"An error occured during execution of {exec_cmd}") logger.info("Creating analysis from backgrounds and increments") self.add_increments(localconf) @logit(logger) def finalize(self) -> None: - """Performs closing actions of the Land analysis task + """Performs closing actions of the Snow analysis task This method: - tar and gzip the output diag files and place in COM/ - copy the generated YAML file from initialize to the COM/ @@ -333,11 +340,11 @@ def finalize(self) -> None: Parameters ---------- self : Analysis - Instance of the LandAnalysis object + Instance of the SnowAnalysis object """ logger.info("Create diagnostic tarball of diag*.nc4 files") - statfile = os.path.join(self.task_config.COM_LAND_ANALYSIS, f"{self.task_config.APREFIX}landstat.tgz") + statfile = os.path.join(self.task_config.COM_SNOW_ANALYSIS, f"{self.task_config.APREFIX}snowstat.tgz") self.tgz_diags(statfile, self.task_config.DATA) logger.info("Copy full YAML to COM") @@ -350,22 +357,28 @@ def finalize(self) -> None: FileHandler(yaml_copy).sync() logger.info("Copy analysis to COM") - template = f'{to_fv3time(self.task_config.current_cycle)}.sfc_data.tile{{tilenum}}.nc' + bkgtimes = [] + if self.task_config.DOIAU: + # need both beginning and middle of window + bkgtimes.append(self.task_config.SNOW_WINDOW_BEGIN) + bkgtimes.append(self.task_config.current_cycle) anllist = [] - for itile in range(1, self.task_config.ntiles + 1): - filename = template.format(tilenum=itile) - src = os.path.join(self.task_config.DATA, 'anl', filename) - dest = os.path.join(self.task_config.COM_LAND_ANALYSIS, filename) - anllist.append([src, dest]) + for bkgtime in bkgtimes: + template = f'{to_fv3time(bkgtime)}.sfc_data.tile{{tilenum}}.nc' + for itile in range(1, self.task_config.ntiles + 1): + filename = template.format(tilenum=itile) + src = os.path.join(self.task_config.DATA, 'anl', filename) + dest = os.path.join(self.task_config.COM_SNOW_ANALYSIS, filename) + anllist.append([src, dest]) FileHandler({'copy': anllist}).sync() logger.info('Copy increments to COM') - template = f'landinc.{to_fv3time(self.task_config.current_cycle)}.sfc_data.tile{{tilenum}}.nc' + template = f'snowinc.{to_fv3time(self.task_config.current_cycle)}.sfc_data.tile{{tilenum}}.nc' inclist = [] for itile in range(1, self.task_config.ntiles + 1): filename = 
template.format(tilenum=itile) src = os.path.join(self.task_config.DATA, 'anl', filename) - dest = os.path.join(self.task_config.COM_LAND_ANALYSIS, filename) + dest = os.path.join(self.task_config.COM_SNOW_ANALYSIS, filename) inclist.append([src, dest]) FileHandler({'copy': inclist}).sync() @@ -375,7 +388,7 @@ def get_bkg_dict(config: Dict) -> Dict[str, List[str]]: """Compile a dictionary of model background files to copy This method constructs a dictionary of FV3 RESTART files (coupler, sfc_data) - that are needed for global land DA and returns said dictionary for use by the FileHandler class. + that are needed for global snow DA and returns said dictionary for use by the FileHandler class. Parameters ---------- @@ -401,11 +414,11 @@ def get_bkg_dict(config: Dict) -> Dict[str, List[str]]: # Start accumulating list of background files to copy bkglist = [] - # land DA needs coupler + # snow DA needs coupler basename = f'{to_fv3time(config.current_cycle)}.coupler.res' bkglist.append([os.path.join(rst_dir, basename), os.path.join(run_dir, basename)]) - # land DA only needs sfc_data + # snow DA only needs sfc_data for ftype in ['sfc_data']: template = f'{to_fv3time(config.current_cycle)}.{ftype}.tile{{tilenum}}.nc' for itile in range(1, config.ntiles + 1): @@ -447,17 +460,17 @@ def get_ens_bkg_dict(config: Dict) -> Dict: # get FV3 sfc_data RESTART files; Note an ensemble is being created rst_dir = os.path.join(config.COM_ATMOS_RESTART_PREV) - for imem in range(1, LandAnalysis.NMEM_LANDENS + 1): + for imem in range(1, SnowAnalysis.NMEM_SNOWENS + 1): memchar = f"mem{imem:03d}" run_dir = os.path.join(config.DATA, 'bkg', memchar, 'RESTART') dirlist.append(run_dir) - # Land DA needs coupler + # Snow DA needs coupler basename = f'{to_fv3time(config.current_cycle)}.coupler.res' bkglist.append([os.path.join(rst_dir, basename), os.path.join(run_dir, basename)]) - # Land DA only needs sfc_data + # Snow DA only needs sfc_data for ftype in ['sfc_data']: template = f'{to_fv3time(config.current_cycle)}.{ftype}.tile{{tilenum}}.nc' for itile in range(1, config.ntiles + 1): @@ -491,7 +504,7 @@ def create_ensemble(vname: str, bestddev: float, config: Dict) -> None: """ # 2 ens members - offset = bestddev / np.sqrt(LandAnalysis.NMEM_LANDENS) + offset = bestddev / np.sqrt(SnowAnalysis.NMEM_SNOWENS) logger.info(f"Creating ensemble for LETKFOI by offsetting with {offset}") @@ -530,10 +543,13 @@ def add_increments(config: Dict) -> None: DATA current_cycle CASE + OCNRES ntiles APPLY_INCR_NML_TMPL APPLY_INCR_EXE APRUN_APPLY_INCR + DOIAU + SNOW_WINDOW_BEGIN Raises ------ @@ -545,38 +561,67 @@ def add_increments(config: Dict) -> None: # need backgrounds to create analysis from increments after LETKF logger.info("Copy backgrounds into anl/ directory for creating analysis from increments") - template = f'{to_fv3time(config.current_cycle)}.sfc_data.tile{{tilenum}}.nc' + bkgtimes = [] + if config.DOIAU: + # want analysis at beginning and middle of window + bkgtimes.append(config.SNOW_WINDOW_BEGIN) + bkgtimes.append(config.current_cycle) anllist = [] - for itile in range(1, config.ntiles + 1): - filename = template.format(tilenum=itile) - src = os.path.join(config.COM_ATMOS_RESTART_PREV, filename) - dest = os.path.join(config.DATA, "anl", filename) - anllist.append([src, dest]) + for bkgtime in bkgtimes: + template = f'{to_fv3time(bkgtime)}.sfc_data.tile{{tilenum}}.nc' + for itile in range(1, config.ntiles + 1): + filename = template.format(tilenum=itile) + src = os.path.join(config.COM_ATMOS_RESTART_PREV, filename) + dest = 
os.path.join(config.DATA, "anl", filename) + anllist.append([src, dest]) FileHandler({'copy': anllist}).sync() - logger.info("Create namelist for APPLY_INCR_EXE") - nml_template = config.APPLY_INCR_NML_TMPL - nml_data = Jinja(nml_template, config).render - logger.debug(f"apply_incr_nml:\n{nml_data}") - - nml_file = os.path.join(config.DATA, "apply_incr_nml") - with open(nml_file, "w") as fho: - fho.write(nml_data) - - logger.info("Link APPLY_INCR_EXE into DATA/") - exe_src = config.APPLY_INCR_EXE - exe_dest = os.path.join(config.DATA, os.path.basename(exe_src)) - if os.path.exists(exe_dest): - rm_p(exe_dest) - os.symlink(exe_src, exe_dest) - - # execute APPLY_INCR_EXE to create analysis files - exe = Executable(config.APRUN_APPLY_INCR) - exe.add_default_arg(os.path.join(config.DATA, os.path.basename(exe_src))) - logger.info(f"Executing {exe}") - try: - exe() - except OSError: - raise OSError(f"Failed to execute {exe}") - except Exception: - raise WorkflowException(f"An error occured during execution of {exe}") + if config.DOIAU: + logger.info("Copying increments to beginning of window") + template_in = f'snowinc.{to_fv3time(config.current_cycle)}.sfc_data.tile{{tilenum}}.nc' + template_out = f'snowinc.{to_fv3time(config.SNOW_WINDOW_BEGIN)}.sfc_data.tile{{tilenum}}.nc' + inclist = [] + for itile in range(1, config.ntiles + 1): + filename_in = template_in.format(tilenum=itile) + filename_out = template_out.format(tilenum=itile) + src = os.path.join(config.DATA, 'anl', filename_in) + dest = os.path.join(config.DATA, 'anl', filename_out) + inclist.append([src, dest]) + FileHandler({'copy': inclist}).sync() + + # loop over times to apply increments + for bkgtime in bkgtimes: + logger.info("Processing analysis valid: {bkgtime}") + logger.info("Create namelist for APPLY_INCR_EXE") + nml_template = config.APPLY_INCR_NML_TMPL + nml_config = { + 'current_cycle': bkgtime, + 'CASE': config.CASE, + 'DATA': config.DATA, + 'HOMEgfs': config.HOMEgfs, + 'OCNRES': config.OCNRES, + } + nml_data = Jinja(nml_template, nml_config).render + logger.debug(f"apply_incr_nml:\n{nml_data}") + + nml_file = os.path.join(config.DATA, "apply_incr_nml") + with open(nml_file, "w") as fho: + fho.write(nml_data) + + logger.info("Link APPLY_INCR_EXE into DATA/") + exe_src = config.APPLY_INCR_EXE + exe_dest = os.path.join(config.DATA, os.path.basename(exe_src)) + if os.path.exists(exe_dest): + rm_p(exe_dest) + os.symlink(exe_src, exe_dest) + + # execute APPLY_INCR_EXE to create analysis files + exe = Executable(config.APRUN_APPLY_INCR) + exe.add_default_arg(os.path.join(config.DATA, os.path.basename(exe_src))) + logger.info(f"Executing {exe}") + try: + exe() + except OSError: + raise OSError(f"Failed to execute {exe}") + except Exception: + raise WorkflowException(f"An error occured during execution of {exe}") diff --git a/ush/python/pygfs/task/upp.py b/ush/python/pygfs/task/upp.py index 7db50e1582..7e42e07c64 100644 --- a/ush/python/pygfs/task/upp.py +++ b/ush/python/pygfs/task/upp.py @@ -46,26 +46,27 @@ def __init__(self, config: Dict[str, Any]) -> None: """ super().__init__(config) - if self.config.UPP_RUN not in self.VALID_UPP_RUN: - raise NotImplementedError(f'{self.config.UPP_RUN} is not a valid UPP run type.\n' + + if self.task_config.UPP_RUN not in self.VALID_UPP_RUN: + raise NotImplementedError(f'{self.task_config.UPP_RUN} is not a valid UPP run type.\n' + 'Valid UPP_RUN values are:\n' + f'{", ".join(self.VALID_UPP_RUN)}') - valid_datetime = add_to_datetime(self.runtime_config.current_cycle, 
to_timedelta(f"{self.config.FORECAST_HOUR}H")) + valid_datetime = add_to_datetime(self.task_config.current_cycle, to_timedelta(f"{self.task_config.FORECAST_HOUR}H")) + # Extend task_config with localdict localdict = AttrDict( - {'upp_run': self.config.UPP_RUN, - 'forecast_hour': self.config.FORECAST_HOUR, + {'upp_run': self.task_config.UPP_RUN, + 'forecast_hour': self.task_config.FORECAST_HOUR, 'valid_datetime': valid_datetime, 'atmos_filename': f"atm_{valid_datetime.strftime('%Y%m%d%H%M%S')}.nc", 'flux_filename': f"sfc_{valid_datetime.strftime('%Y%m%d%H%M%S')}.nc" } ) - self.task_config = AttrDict(**self.config, **self.runtime_config, **localdict) + self.task_config = AttrDict(**self.task_config, **localdict) # Read the upp.yaml file for common configuration - logger.info(f"Read the UPP configuration yaml file {self.config.UPP_CONFIG}") - self.task_config.upp_yaml = parse_j2yaml(self.config.UPP_CONFIG, self.task_config) + logger.info(f"Read the UPP configuration yaml file {self.task_config.UPP_CONFIG}") + self.task_config.upp_yaml = parse_j2yaml(self.task_config.UPP_CONFIG, self.task_config) logger.debug(f"upp_yaml:\n{pformat(self.task_config.upp_yaml)}") @staticmethod diff --git a/ush/python/pygfs/utils/__init__.py b/ush/python/pygfs/utils/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/ush/python/pygfs/utils/marine_da_utils.py b/ush/python/pygfs/utils/marine_da_utils.py new file mode 100644 index 0000000000..016551878b --- /dev/null +++ b/ush/python/pygfs/utils/marine_da_utils.py @@ -0,0 +1,99 @@ +import f90nml +import os +from logging import getLogger +import xarray as xr + +from wxflow import (FileHandler, + logit, + WorkflowException, + AttrDict, + parse_j2yaml, + Executable, + jinja) + +logger = getLogger(__name__.split('.')[-1]) + + +@logit(logger) +def run(exec_cmd: Executable) -> None: + """Run the executable command + """ + logger.info(f"Executing {exec_cmd}") + try: + logger.debug(f"Executing {exec_cmd}") + exec_cmd() + except OSError: + raise OSError(f"Failed to execute {exec_cmd}") + except Exception: + raise WorkflowException(f"An error occured during execution of {exec_cmd}") + + +@logit(logger) +def link_executable(task_config: AttrDict, exe_name: str) -> None: + """Link the executable to the DATA directory + """ + logger.info(f"Link executable {exe_name}") + logger.warn("WARNING: Linking is not permitted per EE2.") + exe_src = os.path.join(task_config.EXECgfs, exe_name) + exe_dest = os.path.join(task_config.DATA, exe_name) + if os.path.exists(exe_dest): + os.remove(exe_dest) + os.symlink(exe_src, exe_dest) + + +@logit(logger) +def prep_input_nml(task_config: AttrDict) -> None: + """Prepare the input.nml file + TODO: Use jinja2 instead of f90nml + """ + # stage input.nml + mom_input_nml_tmpl_src = os.path.join(task_config.HOMEgdas, 'parm', 'soca', 'fms', 'input.nml') + mom_input_nml_tmpl = os.path.join(task_config.DATA, 'mom_input.nml.tmpl') + FileHandler({'copy': [[mom_input_nml_tmpl_src, mom_input_nml_tmpl]]}).sync() + + # swap date and stacksize + domain_stack_size = task_config.DOMAIN_STACK_SIZE + ymdhms = [int(s) for s in task_config.MARINE_WINDOW_END.strftime('%Y,%m,%d,%H,%M,%S').split(',')] + with open(mom_input_nml_tmpl, 'r') as nml_file: + nml = f90nml.read(nml_file) + nml['ocean_solo_nml']['date_init'] = ymdhms + nml['fms_nml']['domains_stack_size'] = int(domain_stack_size) + nml.write('mom_input.nml') + + +@logit(logger) +def cice_hist2fms(input_filename: str, output_filename: str) -> None: + """ Reformat the CICE history file so it 
can be read by SOCA/FMS + Simple reformatting utility to allow soca/fms to read the CICE history files + """ + + # open the CICE history file + ds = xr.open_dataset(input_filename) + + if 'aicen' in ds.variables and 'hicen' in ds.variables and 'hsnon' in ds.variables: + logger.info(f"*** Already reformatted, skipping.") + return + + # rename the dimensions to xaxis_1 and yaxis_1 + ds = ds.rename({'ni': 'xaxis_1', 'nj': 'yaxis_1'}) + + # rename the variables + ds = ds.rename({'aice_h': 'aicen', 'hi_h': 'hicen', 'hs_h': 'hsnon'}) + + # Save the new netCDF file + ds.to_netcdf(output_filename, mode='w') + + +@logit(logger) +def stage_ens_mem(task_config: AttrDict) -> None: + """ Copy the ensemble members to the DATA directory + Copy the ensemble members to the DATA directory and reformat the CICE history files + """ + # Copy the ensemble members to the DATA directory + logger.info("---------------- Stage ensemble members") + ensbkgconf = AttrDict(task_config) + ensbkgconf.RUN = task_config.GDUMP_ENS + logger.debug(f"{jinja.Jinja(task_config.MARINE_ENSDA_STAGE_BKG_YAML_TMPL, ensbkgconf).render}") + letkf_stage_list = parse_j2yaml(task_config.MARINE_ENSDA_STAGE_BKG_YAML_TMPL, ensbkgconf) + logger.info(f"{letkf_stage_list}") + FileHandler(letkf_stage_list).sync() diff --git a/ush/radmon_err_rpt.sh b/ush/radmon_err_rpt.sh index 6ae6505624..c3d251d5cd 100755 --- a/ush/radmon_err_rpt.sh +++ b/ush/radmon_err_rpt.sh @@ -1,6 +1,6 @@ #! /usr/bin/env bash -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" ################################################################################ #### UNIX Script Documentation Block @@ -55,9 +55,6 @@ cycle2=${5:-${cycle2:?}} diag_rpt=${6:-${diag_rpt:?}} outfile=${7:-${outfile:?}} -# Directories -HOMEradmon=${HOMEradmon:-$(pwd)} - # Other variables err=0 RADMON_SUFFIX=${RADMON_SUFFIX} diff --git a/ush/radmon_verf_angle.sh b/ush/radmon_verf_angle.sh index f68d7c88cc..3dff2a6f98 100755 --- a/ush/radmon_verf_angle.sh +++ b/ush/radmon_verf_angle.sh @@ -1,6 +1,6 @@ #! /usr/bin/env bash -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" ################################################################################ #### UNIX Script Documentation Block @@ -29,8 +29,6 @@ source "${HOMEgfs}/ush/preamble.sh" # Imported Shell Variables: # RADMON_SUFFIX data source suffix # defauls to opr -# EXECgfs executable directory -# PARMmonitor parm directory # RAD_AREA global or regional flag # defaults to global # TANKverf_rad data repository @@ -83,7 +81,6 @@ which prep_step which startmsg # File names -export pgmout=${pgmout:-${jlogfile}} touch "${pgmout}" # Other variables @@ -101,7 +98,7 @@ fi err=0 angle_exec=radmon_angle.x -shared_scaninfo="${shared_scaninfo:-${PARMmonitor}/gdas_radmon_scaninfo.txt}" +shared_scaninfo="${shared_scaninfo:-${PARMgfs}/monitor/gdas_radmon_scaninfo.txt}" scaninfo=scaninfo.txt #-------------------------------------------------------------------- diff --git a/ush/radmon_verf_bcoef.sh b/ush/radmon_verf_bcoef.sh index ab1058711e..4274436154 100755 --- a/ush/radmon_verf_bcoef.sh +++ b/ush/radmon_verf_bcoef.sh @@ -1,6 +1,6 @@ #! 
/usr/bin/env bash -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" ################################################################################ #### UNIX Script Documentation Block @@ -69,7 +69,6 @@ fi echo " RADMON_NETCDF, netcdf_boolean = ${RADMON_NETCDF}, ${netcdf_boolean}" # File names -pgmout=${pgmout:-${jlogfile}} touch "${pgmout}" # Other variables diff --git a/ush/radmon_verf_bcor.sh b/ush/radmon_verf_bcor.sh index f1f97c247e..ea0a7842e6 100755 --- a/ush/radmon_verf_bcor.sh +++ b/ush/radmon_verf_bcor.sh @@ -1,6 +1,6 @@ #! /usr/bin/env bash -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" ################################################################################ #### UNIX Script Documentation Block @@ -65,7 +65,6 @@ source "${HOMEgfs}/ush/preamble.sh" #################################################################### # File names -pgmout=${pgmout:-${jlogfile}} touch "${pgmout}" # Other variables diff --git a/ush/radmon_verf_time.sh b/ush/radmon_verf_time.sh index 7f98407ec5..0e935826dd 100755 --- a/ush/radmon_verf_time.sh +++ b/ush/radmon_verf_time.sh @@ -1,6 +1,6 @@ #! /usr/bin/env bash -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" ################################################################################ #### UNIX Script Documentation Block @@ -33,8 +33,6 @@ source "${HOMEgfs}/ush/preamble.sh" # defaults to 1 (on) # RADMON_SUFFIX data source suffix # defauls to opr -# EXECgfs executable directory -# PARMmonitor parm data directory # RAD_AREA global or regional flag # defaults to global # TANKverf_rad data repository @@ -75,11 +73,9 @@ source "${HOMEgfs}/ush/preamble.sh" #################################################################### # File names -#pgmout=${pgmout:-${jlogfile}} -#touch $pgmout radmon_err_rpt=${radmon_err_rpt:-${USHgfs}/radmon_err_rpt.sh} -base_file=${base_file:-${PARMmonitor}/gdas_radmon_base.tar} +base_file=${base_file:-${PARMgfs}/monitor/gdas_radmon_base.tar} report=report.txt disclaimer=disclaimer.txt diff --git a/ush/rstprod.sh b/ush/rstprod.sh index acac0340bb..b48a6817e0 100755 --- a/ush/rstprod.sh +++ b/ush/rstprod.sh @@ -1,6 +1,6 @@ #! 
/usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" #--------------------------------------------------------- # rstprod.sh diff --git a/ush/run_mpmd.sh b/ush/run_mpmd.sh index 24cb3f2656..e3fc2b7512 100755 --- a/ush/run_mpmd.sh +++ b/ush/run_mpmd.sh @@ -1,6 +1,6 @@ #!/usr/bin/env bash -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" cmdfile=${1:?"run_mpmd requires an input file containing commands to execute in MPMD mode"} diff --git a/ush/syndat_getjtbul.sh b/ush/syndat_getjtbul.sh index c17067ff72..6596c6ef96 100755 --- a/ush/syndat_getjtbul.sh +++ b/ush/syndat_getjtbul.sh @@ -18,17 +18,10 @@ # Imported variables that must be passed in: # DATA - path to working directory # pgmout - string indicating path to for standard output file -# EXECSYND - path to syndat executable directory # TANK_TROPCY - path to home directory containing tropical cyclone record # data base -# Imported variables that can be passed in: -# jlogfile - path to job log file (skipped over by this script if not -# passed in) - -source "$HOMEgfs/ush/preamble.sh" - -EXECSYND=${EXECSYND:-${HOMESYND}/exec} +source "${USHgfs}/preamble.sh" cd $DATA @@ -52,8 +45,6 @@ hour=$(echo $CDATE10 | cut -c9-10) echo $PDYm1 pdym1=$PDYm1 -#pdym1=$(sh $utilscript/finddate.sh $pdy d-1) - echo " " >> $pgmout echo "Entering sub-shell syndat_getjtbul.sh to recover JTWC Bulletins" \ >> $pgmout @@ -123,7 +114,7 @@ fi [ -s jtwcbul ] && echo "Processing JTWC bulletin halfs into tcvitals records" >> $pgmout -pgm=$(basename $EXECSYND/syndat_getjtbul.x) +pgm=$(basename ${EXECgfs}/syndat_getjtbul.x) export pgm if [ -s prep_step ]; then set +u @@ -138,7 +129,7 @@ rm -f fnoc export FORT11=jtwcbul export FORT51=fnoc -time -p ${EXECSYND}/${pgm} >> $pgmout 2> errfile +time -p ${EXECgfs}/${pgm} >> $pgmout 2> errfile errget=$? 
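A note on the FORT11/FORT51 assignments above: these NCEP Fortran executables take their unit-number file attachments from FORTnn environment variables rather than command-line arguments. A rough Python equivalent of the invocation, for illustration only (paths and names are hypothetical; the real job drives this from the shell script):

    import os
    import subprocess

    env = dict(os.environ,
               FORT11='jtwcbul',   # unit 11: input JTWC bulletin file
               FORT51='fnoc')      # unit 51: output tcvitals-format records
    exe = os.path.join(os.environ.get('EXECgfs', '.'), 'syndat_getjtbul.x')
    result = subprocess.run([exe], env=env, capture_output=True, text=True)
    print(result.returncode)       # a nonzero return mirrors errget=$? above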
###cat errfile cat errfile >> $pgmout diff --git a/ush/syndat_qctropcy.sh b/ush/syndat_qctropcy.sh index cda9030577..8ec8f70b14 100755 --- a/ush/syndat_qctropcy.sh +++ b/ush/syndat_qctropcy.sh @@ -44,10 +44,6 @@ # COMSP - path to both output jtwc-fnoc file and output tcvitals file (this # tcvitals file is read by subsequent relocation processing and/or # subsequent program SYNDAT_SYNDATA) -# PARMSYND - path to syndat parm field directory -# EXECSYND - path to syndat executable directory -# FIXam - path to syndat fix field directory -# USHSYND - path to syndat ush directory # Imported variables that can be passed in: # ARCHSYND - path to syndat archive directory @@ -59,7 +55,7 @@ # data base # (Default: /dcom/us007003) # slmask - path to t126 32-bit gaussian land/sea mask file -# (Default: $FIXam/syndat_slmask.t126.gaussian) +# (Default: ${FIXgfs}/am/syndat_slmask.t126.gaussian) # copy_back - switch to copy updated files back to archive directory and # to tcvitals directory # (Default: YES) @@ -67,19 +63,13 @@ # (Default: not set) # TIMEIT - optional time and resource reporting (Default: not set) -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" ARCHSYND=${ARCHSYND:-$COMROOTp3/gfs/prod/syndat} -HOMENHCp1=${HOMENHCp1:-/gpfs/?p1/nhc/save/guidance/storm-data/ncep} HOMENHC=${HOMENHC:-/gpfs/dell2/nhc/save/guidance/storm-data/ncep} TANK_TROPCY=${TANK_TROPCY:-${DCOMROOT}/us007003} -FIXam=${FIXam:-$HOMEgfs/fix/am} -USHSYND=${USHSYND:-$HOMEgfs/ush} -EXECSYND=${EXECSYND:-$HOMEgfs/exec} -PARMSYND=${PARMSYND:-$HOMEgfs/parm/relo} - -slmask=${slmask:-$FIXam/syndat_slmask.t126.gaussian} +slmask=${slmask:-${FIXgfs}/am/syndat_slmask.t126.gaussian} copy_back=${copy_back:-YES} files_override=${files_override:-""} @@ -188,12 +178,12 @@ if [ -n "$files_override" ]; then # for testing, typically want FILES=F fi echo " &INPUT RUNID = '${net}_${tmmark}_${cyc}', FILES = $files " > vitchk.inp -cat $PARMSYND/syndat_qctropcy.${RUN}.parm >> vitchk.inp +cat ${PARMgfs}/relo/syndat_qctropcy.${RUN}.parm >> vitchk.inp -# Copy the fixed fields from FIXam +# Copy the fixed fields -cp $FIXam/syndat_fildef.vit fildef.vit -cp $FIXam/syndat_stmnames stmnames +cp ${FIXgfs}/am/syndat_fildef.vit fildef.vit +cp ${FIXgfs}/am/syndat_stmnames stmnames rm -f nhc fnoc lthistry @@ -205,12 +195,9 @@ rm -f nhc fnoc lthistry # All are input to program syndat_qctropcy # ------------------------------------------------------------------ -if [ -s $HOMENHC/tcvitals ]; then - echo "tcvitals found" >> $pgmout - cp $HOMENHC/tcvitals nhc -elif [ -s $HOMENHCp1/tcvitals ]; then +if [ -s ${HOMENHC}/tcvitals ]; then echo "tcvitals found" >> $pgmout - cp $HOMENHCp1/tcvitals nhc + cp ${HOMENHC}/tcvitals nhc else echo "WARNING: tcvitals not found, create empty tcvitals" >> $pgmout > nhc @@ -221,17 +208,17 @@ touch nhc [ "$copy_back" = 'YES' ] && cat nhc >> $ARCHSYND/syndat_tcvitals.$year mv -f nhc nhc1 -$USHSYND/parse-storm-type.pl nhc1 > nhc +${USHgfs}/parse-storm-type.pl nhc1 > nhc cp -p nhc nhc.ORIG # JTWC/FNOC ... 
execute syndat_getjtbul script to write into working directory # as fnoc; copy to archive -$USHSYND/syndat_getjtbul.sh $CDATE10 +${USHgfs}/syndat_getjtbul.sh $CDATE10 touch fnoc [ "$copy_back" = 'YES' ] && cat fnoc >> $ARCHSYND/syndat_tcvitals.$year mv -f fnoc fnoc1 -$USHSYND/parse-storm-type.pl fnoc1 > fnoc +${USHgfs}/parse-storm-type.pl fnoc1 > fnoc if [ $SENDDBN = YES ]; then $DBNROOT/bin/dbn_alert MODEL SYNDAT_TCVITALS $job $ARCHSYND/syndat_tcvitals.$year @@ -245,7 +232,7 @@ cp $slmask slmask.126 # Execute program syndat_qctropcy -pgm=$(basename $EXECSYND/syndat_qctropcy.x) +pgm=$(basename ${EXECgfs}/syndat_qctropcy.x) export pgm if [ -s prep_step ]; then set +u @@ -259,7 +246,7 @@ fi echo "$CDATE10" > cdate10.dat export FORT11=slmask.126 export FORT12=cdate10.dat -${EXECSYND}/${pgm} >> $pgmout 2> errfile +${EXECgfs}/${pgm} >> $pgmout 2> errfile errqct=$? ###cat errfile cat errfile >> $pgmout @@ -323,28 +310,25 @@ diff nhc nhc.ORIG > /dev/null errdiff=$? ################################### -# Update NHC file in $HOMENHC +# Update NHC file in ${HOMENHC} ################################### if test "$errdiff" -ne '0' then if [ "$copy_back" = 'YES' -a ${envir} = 'prod' ]; then - if [ -s $HOMENHC/tcvitals ]; then - cp nhc $HOMENHC/tcvitals - fi - if [ -s $HOMENHCp1/tcvitals ]; then - cp nhc $HOMENHCp1/tcvitals + if [ -s ${HOMENHC}/tcvitals ]; then + cp nhc ${HOMENHC}/tcvitals fi err=$? if [ "$err" -ne '0' ]; then msg="###ERROR: Previous NHC Synthetic Data Record File \ -$HOMENHC/tcvitals not updated by syndat_qctropcy" +${HOMENHC}/tcvitals not updated by syndat_qctropcy" else msg="Previous NHC Synthetic Data Record File \ -$HOMENHC/tcvitals successfully updated by syndat_qctropcy" +${HOMENHC}/tcvitals successfully updated by syndat_qctropcy" fi set +x @@ -357,7 +341,7 @@ $HOMENHC/tcvitals successfully updated by syndat_qctropcy" else - msg="Previous NHC Synthetic Data Record File $HOMENHC/tcvitals \ + msg="Previous NHC Synthetic Data Record File ${HOMENHC}/tcvitals \ not changed by syndat_qctropcy" set +x echo diff --git a/ush/tropcy_relocate.sh b/ush/tropcy_relocate.sh index 01a21bd12c..11c0afb990 100755 --- a/ush/tropcy_relocate.sh +++ b/ush/tropcy_relocate.sh @@ -84,20 +84,13 @@ # envir String indicating environment under which job runs ('prod' # or 'test') # Default is "prod" -# HOMEALL String indicating parent directory path for some or -# all files under which job runs. -# If the imported variable MACHINE!=sgi, then the default is -# "/nw${envir}"; otherwise the default is -# "/disk1/users/snake/prepobs" -# HOMERELO String indicating parent directory path for relocation -# specific files. 
(May be under HOMEALL) # envir_getges String indicating environment under which GETGES utility -# ush runs (see documentation in $USHGETGES/getges.sh for +# ush runs (see documentation in ${USHgfs}/getges.sh for # more information) # Default is "$envir" # network_getges # String indicating job network under which GETGES utility -# ush runs (see documentation in $USHGETGES/getges.sh for +# ush runs (see documentation in ${USHgfs}/getges.sh for # more information) # Default is "global" unless the center relocation processing # date/time is not a multiple of 3-hrs, then the default is @@ -122,34 +115,20 @@ # POE_OPTS String indicating options to use with poe command # Default is "-pgmmodel mpmd -ilevel 2 -labelio yes \ # -stdoutmode ordered" -# USHGETGES String indicating directory path for GETGES utility ush -# file -# USHRELO String indicating directory path for RELOCATE ush files -# Default is "${HOMERELO}/ush" -# EXECRELO String indicating directory path for RELOCATE executables -# Default is "${HOMERELO}/exec" -# FIXRELO String indicating directory path for RELOCATE data fix- -# field files -# Default is "${HOMERELO}/fix" -# EXECUTIL String indicating directory path for utility program -# executables -# If the imported variable MACHINE!=sgi, then the default is -# "/nwprod/util/exec"; otherwise the default is -# "${HOMEALL}/util/exec" # RELOX String indicating executable path for RELOCATE_MV_NVORTEX # program -# Default is "$EXECRELO/relocate_mv_nvortex" +# Default is "${EXECgfs}/relocate_mv_nvortex" # SUPVX String indicating executable path for SUPVIT utility # program -# Default is "$EXECUTIL/supvit.x" +# Default is "${EXECgfs}/supvit.x" # GETTX String indicating executable path for GETTRK utility # program -# Default is "$EXECUTIL/gettrk" +# Default is "${EXECgfs}/gettrk" # BKGFREQ Frequency of background files for relocation # Default is "3" # SENDDBN String when set to "YES" alerts output files to $COMSP # NDATE String indicating executable path for NDATE utility program -# Default is "$EXECUTIL/ndate" +# Default is "${EXECgfs}/ndate" # # These do not have to be exported to this script. If they are, they will # be used by the script. If they are not, they will be skipped @@ -166,18 +145,18 @@ # # Modules and files referenced: # Herefile: RELOCATE_GES -# $USHRELO/tropcy_relocate_extrkr.sh -# $USHGETGES/getges.sh +# ${USHgfs}/tropcy_relocate_extrkr.sh +# ${USHgfs}/getges.sh # $NDATE (here and in child script -# $USHRELO/tropcy_relocate_extrkr.sh) +# ${USHgfs}/tropcy_relocate_extrkr.sh) # /usr/bin/poe # postmsg # $DATA/prep_step (here and in child script -# $USHRELO/tropcy_relocate_extrkr.sh) +# ${USHgfs}/tropcy_relocate_extrkr.sh) # $DATA/err_exit (here and in child script -# $USHRELO/tropcy_relocate_extrkr.sh) +# ${USHgfs}/tropcy_relocate_extrkr.sh) # $DATA/err_chk (here and in child script -# $USHRELO/tropcy_relocate_extrkr.sh) +# ${USHgfs}/tropcy_relocate_extrkr.sh) # NOTE: The last three scripts above are NOT REQUIRED utilities. # If $DATA/prep_step not found, a scaled down version of it is # executed in-line. 
If $DATA/err_exit or $DATA/err_chk are not @@ -188,7 +167,7 @@ # programs : # RELOCATE_MV_NVORTEX - executable $RELOX # T126 GRIB global land/sea mask: -# $FIXRELO/global_slmask.t126.grb +# ${FIXgfs}/am/global_slmask.t126.grb # SUPVIT - executable $SUPVX # GETTRK - executable $GETTX # @@ -204,7 +183,7 @@ # #### -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" MACHINE=${MACHINE:-$(hostname -s | cut -c 1-3)} @@ -275,14 +254,6 @@ set_trace envir=${envir:-prod} -if [ $MACHINE != sgi ]; then - HOMEALL=${HOMEALL:-$OPSROOT} -else - HOMEALL=${HOMEALL:-/disk1/users/snake/prepobs} -fi - -HOMERELO=${HOMERELO:-${shared_global_home}} - envir_getges=${envir_getges:-$envir} if [ $modhr -eq 0 ]; then network_getges=${network_getges:-global} @@ -295,21 +266,12 @@ pgmout=${pgmout:-/dev/null} tstsp=${tstsp:-/tmp/null/} tmmark=${tmmark:-tm00} -USHRELO=${USHRELO:-${HOMERELO}/ush} -##USHGETGES=${USHGETGES:-/nwprod/util/ush} -##USHGETGES=${USHGETGES:-${HOMERELO}/ush} -USHGETGES=${USHGETGES:-${USHRELO}} - -EXECRELO=${EXECRELO:-${HOMERELO}/exec} - -FIXRELO=${FIXRELO:-${HOMERELO}/fix} - -RELOX=${RELOX:-$EXECRELO/relocate_mv_nvortex} +RELOX=${RELOX:-${EXECgfs}/relocate_mv_nvortex} export BKGFREQ=${BKGFREQ:-1} -SUPVX=${SUPVX:-$EXECRELO/supvit.x} -GETTX=${GETTX:-$EXECRELO/gettrk} +SUPVX=${SUPVX:-${EXECgfs}/supvit.x} +GETTX=${GETTX:-${EXECgfs}/gettrk} ################################################ # EXECUTE TROPICAL CYCLONE RELOCATION PROCESSING @@ -355,7 +317,7 @@ echo " relocation processing date/time" echo "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA" echo set_trace - $USHGETGES/getges.sh -e $envir_getges -n $network_getges \ + ${USHgfs}/getges.sh -e $envir_getges -n $network_getges \ -v $CDATE10 -f $fhr -t tcvges tcvitals.m${fhr} set +x echo @@ -405,7 +367,7 @@ echo " relocation processing date/time" echo "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA" echo set_trace - $USHGETGES/getges.sh -e $envir_getges -n $network_getges \ + ${USHgfs}/getges.sh -e $envir_getges -n $network_getges \ -v $CDATE10 -t $stype $sges errges=$? if test $errges -ne 0; then @@ -439,7 +401,7 @@ to center relocation date/time;" # ---------------------------------------------------------------------------- if [ $fhr = "0" ]; then - "${USHGETGES}/getges.sh" -e "${envir_getges}" -n "${network_getges}" -v "${CDATE10}" \ + "${USHgfs}/getges.sh" -e "${envir_getges}" -n "${network_getges}" -v "${CDATE10}" \ -t "${stype}" > "${COM_OBS}/${RUN}.${cycle}.sgesprep_pre-relocate_pathname.${tmmark}" cp "${COM_OBS}/${RUN}.${cycle}.sgesprep_pre-relocate_pathname.${tmmark}" \ "${COM_OBS}/${RUN}.${cycle}.sgesprep_pathname.${tmmark}" @@ -459,7 +421,7 @@ echo " relocation processing date/time" echo "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA" echo set_trace - $USHGETGES/getges.sh -e $envir_getges -n $network_getges \ + ${USHgfs}/getges.sh -e $envir_getges -n $network_getges \ -v $CDATE10 -t $ptype $pges errges=$? if test $errges -ne 0; then @@ -541,7 +503,7 @@ else # $DATA/$RUN.$cycle.relocate.model_track.tm00 # -------------------------------------------- - $USHRELO/tropcy_relocate_extrkr.sh + ${USHgfs}/tropcy_relocate_extrkr.sh err=$? if [ $err -ne 0 ]; then @@ -550,12 +512,12 @@ else set +x echo - echo "$USHRELO/tropcy_relocate_extrkr.sh failed" + echo "${USHgfs}/tropcy_relocate_extrkr.sh failed" echo "ABNORMAL EXIT!!!!!!!!!!!" 
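The failure branch here (continuing just below) uses the workflow's optional err_exit utility when present and a hard exit otherwise. A minimal sketch of that fallback, with hypothetical names, assuming err_exit is an executable script that takes the failure message as its argument:

    import os
    import subprocess
    import sys

    def err_exit_or_die(data_dir: str, message: str, fallback_code: int = 555) -> None:
        # Mirror the shell idiom: [ -s $DATA/err_exit ] && err_exit "msg" || exit 555
        tool = os.path.join(data_dir, 'err_exit')
        if os.path.isfile(tool) and os.path.getsize(tool) > 0:
            subprocess.run([tool, message], check=False)
        else:
            sys.exit(fallback_code)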
echo set_trace if [ -s $DATA/err_exit ]; then - $DATA/err_exit "Script $USHRELO/tropcy_relocate_extrkr.sh failed" + $DATA/err_exit "Script ${USHgfs}/tropcy_relocate_extrkr.sh failed" else exit 555 fi @@ -569,10 +531,10 @@ else rm fort.* fi - ln -sf $DATA/tcvitals.now1 fort.11 - ln -sf $DATA/model_track.all fort.30 - ln -sf $DATA/rel_inform1 fort.62 - ln -sf $DATA/tcvitals.relocate0 fort.65 + ${NLN} $DATA/tcvitals.now1 fort.11 + ${NLN} $DATA/model_track.all fort.30 + ${NLN} $DATA/rel_inform1 fort.62 + ${NLN} $DATA/tcvitals.relocate0 fort.65 i1=20 i2=53 @@ -586,8 +548,8 @@ else tpref=p$fhr fi - ln -sf $DATA/sg${tpref}prep fort.$i1 - ln -sf $DATA/sg${tpref}prep.relocate fort.$i2 + ${NLN} $DATA/sg${tpref}prep fort.$i1 + ${NLN} $DATA/sg${tpref}prep.relocate fort.$i2 i1=$((i1+1)) i2=$((i2+BKGFREQ)) diff --git a/ush/tropcy_relocate_extrkr.sh b/ush/tropcy_relocate_extrkr.sh index ede2318c4a..18e0851368 100755 --- a/ush/tropcy_relocate_extrkr.sh +++ b/ush/tropcy_relocate_extrkr.sh @@ -3,7 +3,7 @@ # This script is executed by the script tropcy_relocate.sh # -------------------------------------------------------- -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" export machine=${machine:-ZEUS} export machine=$(echo $machine|tr '[a-z]' '[A-Z]') @@ -592,8 +592,8 @@ if [ -s fort.* ]; then rm fort.* fi -ln -s -f ${vdir}/vitals.${symd}${dishh} fort.31 -ln -s -f ${vdir}/vitals.upd.${cmodel}.${symd}${dishh} fort.51 +${NLN} ${vdir}/vitals.${symd}${dishh} fort.31 +${NLN} ${vdir}/vitals.upd.${cmodel}.${symd}${dishh} fort.51 ##$XLF_LINKSSH #if [ -z $XLF_LINKSSH ] ; then @@ -1528,19 +1528,19 @@ if [ -s fort.* ]; then rm fort.* fi -ln -s -f ${gribfile} fort.11 -ln -s -f ${vdir}/tmp.gfs.atcfunix.${symdh} fort.14 -ln -s -f ${vdir}/vitals.upd.${cmodel}.${symd}${dishh} fort.12 -ln -s -f ${ixfile} fort.31 -ln -s -f ${vdir}/trak.${cmodel}.all.${symdh} fort.61 -ln -s -f ${vdir}/trak.${cmodel}.atcf.${symdh} fort.62 -ln -s -f ${vdir}/trak.${cmodel}.radii.${symdh} fort.63 -ln -s -f ${vdir}/trak.${cmodel}.atcfunix.${symdh} fort.64 +${NLN} ${gribfile} fort.11 +${NLN} ${vdir}/tmp.gfs.atcfunix.${symdh} fort.14 +${NLN} ${vdir}/vitals.upd.${cmodel}.${symd}${dishh} fort.12 +${NLN} ${ixfile} fort.31 +${NLN} ${vdir}/trak.${cmodel}.all.${symdh} fort.61 +${NLN} ${vdir}/trak.${cmodel}.atcf.${symdh} fort.62 +${NLN} ${vdir}/trak.${cmodel}.radii.${symdh} fort.63 +${NLN} ${vdir}/trak.${cmodel}.atcfunix.${symdh} fort.64 if [ $BKGFREQ -eq 1 ]; then - ln -s -f ${FIXRELO}/${cmodel}.tracker_leadtimes_hrly fort.15 + ${NLN} ${FIXgfs}/am/${cmodel}.tracker_leadtimes_hrly fort.15 elif [ $BKGFREQ -eq 3 ]; then - ln -s -f ${FIXRELO}/${cmodel}.tracker_leadtimes fort.15 + ${NLN} ${FIXgfs}/am/${cmodel}.tracker_leadtimes fort.15 fi ##$XLF_LINKSSH diff --git a/ush/wafs_mkgbl.sh b/ush/wafs_mkgbl.sh new file mode 100755 index 0000000000..e6139bc9d3 --- /dev/null +++ b/ush/wafs_mkgbl.sh @@ -0,0 +1,152 @@ +# UTILITY SCRIPT NAME : wafs_mkgbl.sh +# AUTHOR : Mary Jacobs +# DATE WRITTEN : 11/06/96 +# +# Abstract: This utility script produces the GFS WAFS +# bulletins. +# +# Input: 2 arguments are passed to this script. +# 1st argument - Forecast Hour - format of 2I +# 2nd argument - In hours 12-30, the designator of +# a or b. +# +# Logic: If we are processing hours 12-30, we have the +# added variable of the a or b, and process +# accordingly. The other hours, the a or b is dropped. 
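The hour handling described in the header just above (a/b set designators apply only to hours 12-30) reduces to a small predicate. A sketch in Python of the logic the script implements in shell; the function name is hypothetical:

    def wafs_set_suffix(hour: int, sets_key: str) -> str:
        # Hours 12-30 carry the 'a' or 'b' set designator; other hours drop it.
        return sets_key if 12 <= hour <= 30 else ''

    fhr3 = f"{6:03d}"                        # zero-padded hour, as printf "%03d" below
    print(wafs_set_suffix(18, 'a'), fhr3)    # -> a 006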
+# +echo "History: SEPT 1996 - First implementation of this utility script" +echo "History: AUG 1999 - Modified for implementation on IBM SP" +echo " - Allows users to run interactively" +# + +set -x +hour_list="$1" +sets_key=$2 +num=$# + +if test $num -ge 2 +then + echo " Appropriate number of arguments were passed" + set -x + if [ -z "$DATA" ] + then + export DATA=`pwd` + cd $DATA + setpdy.sh + . PDY + fi +else + echo "" + echo "Usage: wafs_mkgbl.sh \$hour [a|b]" + echo "" + exit 16 +fi + +echo " ------------------------------------------" +echo " BEGIN MAKING ${NET} WAFS PRODUCTS" +echo " ------------------------------------------" + +echo "Enter Make WAFS utility." + +for hour in $hour_list +do + ############################## + # Copy Input Field to $DATA + ############################## + + if test ! -f pgrbf${hour} + then +# cp $COMIN/${RUN}.${cycle}.pgrbf${hour} pgrbf${hour} + +# file name and forecast hour of GFS model data in Grib2 are 3 digits +# export fhr3=$hour +# if test $fhr3 -lt 100 +# then +# export fhr3="0$fhr3" +# fi + fhr3="$(printf "%03d" $(( 10#$hour )) )" + +# To solve Bugzilla #408: remove the dependency of grib1 files in gfs wafs job in next GFS upgrade +# Reason: It's not efficent if simply converting from grib2 to grib1 (costs 6 seconds with 415 records) +# Solution: Need to grep 'selected fields on selected levels' before CNVGRIB (costs 1 second with 92 records) + ${NLN} $COMIN/${RUN}.${cycle}.pgrb2.1p00.f$fhr3 pgrb2f${hour} + $WGRIB2 pgrb2f${hour} | grep -F -f $FIXgfs/grib_wafs.grb2to1.list | $WGRIB2 -i pgrb2f${hour} -grib pgrb2f${hour}.tmp +# on Cray, IOBUF_PARAMS has to used to speed up CNVGRIB +# export IOBUF_PARAMS='*:size=32M:count=4:verbose' + $CNVGRIB -g21 pgrb2f${hour}.tmp pgrbf${hour} +# unset IOBUF_PARAMS + fi + + # + # BAG - Put in fix on 20070925 to force the percision of U and V winds + # to default to 1 through the use of the grib_wafs.namelist file. + # + $COPYGB -g3 -i0 -N$FIXgfs/grib_wafs.namelist -x pgrbf${hour} tmp + mv tmp pgrbf${hour} + $GRBINDEX pgrbf${hour} pgrbif${hour} + + ############################## + # Process WAFS + ############################## + + if test $hour -ge '12' -a $hour -le '30' + then + sets=$sets_key + set +x + echo "We are processing the primary and secondary sets of hours." + echo "These sets are the a and b of hours 12-30." + set -x + else + # This is for hours 00/06 and 36-72. + unset sets + fi + + export pgm=wafs_makewafs + . 
prep_step + + export FORT11="pgrbf${hour}" + export FORT31="pgrbif${hour}" + export FORT51="xtrn.wfs${NET}${hour}${sets}" + export FORT53="com.wafs${hour}${sets}" + + startmsg + $EXECgfs/wafs_makewafs.x < $FIXgfs/grib_wfs${NET}${hour}${sets} >>$pgmout 2>errfile + export err=$?;err_chk + + + ############################## + # Post Files to PCOM + ############################## + + if test "$SENDCOM" = 'YES' + then + cp xtrn.wfs${NET}${hour}${sets} $PCOM/xtrn.wfs${NET}${cyc}${hour}${sets}.$jobsuffix +# cp com.wafs${hour}${sets} $PCOM/com.wafs${cyc}${hour}${sets}.$jobsuffix + +# if test "$SENDDBN_NTC" = 'YES' +# then +# if test "$NET" = 'gfs' +# then +# $DBNROOT/bin/dbn_alert MODEL GFS_WAFS $job \ +# $PCOM/com.wafs${cyc}${hour}${sets}.$jobsuffix +# $DBNROOT/bin/dbn_alert MODEL GFS_XWAFS $job \ +# $PCOM/xtrn.wfs${NET}${cyc}${hour}${sets}.$jobsuffix +# fi +# fi + fi + + ############################## + # Distribute Data + ############################## + + if [ "$SENDDBN_NTC" = 'YES' ] ; then + $DBNROOT/bin/dbn_alert GRIB_LOW $NET $job $PCOM/xtrn.wfs${NET}${cyc}${hour}${sets}.$jobsuffix + else + echo "xtrn.wfs${NET}${cyc}${hour}${sets}.$job file not posted to db_net." + fi + + echo "Wafs Processing $hour hour completed normally" + +done + +exit diff --git a/ush/wave_extractvars.sh b/ush/wave_extractvars.sh new file mode 100755 index 0000000000..32ee44986b --- /dev/null +++ b/ush/wave_extractvars.sh @@ -0,0 +1,34 @@ +#! /usr/bin/env bash + +################################################################################ +## UNIX Script Documentation Block +## Script name: wave_extractvars.sh +## Script description: Extracts variables from wave products +## and saves these variables in arcdir +####################### +# Main body starts here +####################### + +source "${USHgfs}/preamble.sh" + +subdata=${1} + +[[ -d "${subdata}" ]] || mkdir -p "${subdata}" + +for (( nh = FHOUT_WAV_EXTRACT; nh <= FHMAX_WAV; nh = nh + FHOUT_WAV_EXTRACT )); do + fnh=$(printf "%3.3d" "${nh}") + + infile=${COMIN_WAVE_GRID}/${RUN}wave.t${cyc}z.global.${wavres}.f${fnh}.grib2 + outfile=${subdata}/${RUN}wave.t${cyc}z.global.${wavres}.f${fnh}.grib2 + rm -f "${outfile}" # Remove outfile if it already exists before extraction + + if [[ -f "${infile}" ]]; then # Check if input file exists before extraction + # shellcheck disable=SC2312 + ${WGRIB2} "${infile}" | grep -F -f "${varlist_wav}" | ${WGRIB2} -i "${infile}" -append -grib "${outfile}" + else + echo "WARNING: ${infile} does not exist." + fi + copy_to_comout "${outfile}" "${ARC_RFCST_PROD_WAV}" +done # nh + +exit 0 diff --git a/ush/wave_grib2_sbs.sh b/ush/wave_grib2_sbs.sh index af28760269..99f89f3f37 100755 --- a/ush/wave_grib2_sbs.sh +++ b/ush/wave_grib2_sbs.sh @@ -25,7 +25,7 @@ # --------------------------------------------------------------------------- # # 0. Preparations -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # 0.a Basic modes of operation @@ -72,7 +72,7 @@ if [[ -n ${waveMEMB} ]]; then ENSTAG=".${membTAG}${waveMEMB}" ; fi outfile="${WAV_MOD_TAG}.${cycle}${ENSTAG}.${grdnam}.${grdres}.f${FH3}.grib2" # Only create file if not present in COM -if [[ ! -s "${COM_WAVE_GRID}/${outfile}.idx" ]]; then +if [[ ! -s "${COMOUT_WAVE_GRID}/${outfile}.idx" ]]; then set +x echo ' ' @@ -82,8 +82,8 @@ if [[ ! 
-s "${COM_WAVE_GRID}/${outfile}.idx" ]]; then echo " Model ID : $WAV_MOD_TAG" set_trace - if [[ -z "${PDY}" ]] || [[ -z ${cyc} ]] || [[ -z "${cycle}" ]] || [[ -z "${EXECwave}" ]] || \ - [[ -z "${COM_WAVE_GRID}" ]] || [[ -z "${WAV_MOD_TAG}" ]] || [[ -z "${gribflags}" ]] || \ + if [[ -z "${PDY}" ]] || [[ -z ${cyc} ]] || [[ -z "${cycle}" ]] || [[ -z "${EXECgfs}" ]] || \ + [[ -z "${COMOUT_WAVE_GRID}" ]] || [[ -z "${WAV_MOD_TAG}" ]] || [[ -z "${gribflags}" ]] || \ [[ -z "${GRIDNR}" ]] || [[ -z "${MODNR}" ]] || \ [[ -z "${SENDDBN}" ]]; then set +x @@ -110,8 +110,8 @@ if [[ ! -s "${COM_WAVE_GRID}/${outfile}.idx" ]]; then # 0.e Links to working directory - ln -s "${DATA}/mod_def.${grdID}" "mod_def.ww3" - ln -s "${DATA}/output_${ymdh}0000/out_grd.${grdID}" "out_grd.ww3" + ${NLN} "${DATA}/mod_def.${grdID}" "mod_def.ww3" + ${NLN} "${DATA}/output_${ymdh}0000/out_grd.${grdID}" "out_grd.ww3" # --------------------------------------------------------------------------- # # 1. Generate GRIB file with all data @@ -138,11 +138,11 @@ if [[ ! -s "${COM_WAVE_GRID}/${outfile}.idx" ]]; then set +x echo " Run ww3_grib2" - echo " Executing ${EXECwave}/ww3_grib" + echo " Executing ${EXECgfs}/ww3_grib" set_trace export pgm=ww3_grib;. prep_step - "${EXECwave}/ww3_grib" > "grib2_${grdnam}_${FH3}.out" 2>&1 + "${EXECgfs}/ww3_grib" > "grib2_${grdnam}_${FH3}.out" 2>&1 export err=$?;err_chk if [ ! -s gribfile ]; then @@ -157,11 +157,11 @@ if [[ ! -s "${COM_WAVE_GRID}/${outfile}.idx" ]]; then fi if (( fhr > 0 )); then - ${WGRIB2} gribfile -set_date "${PDY}${cyc}" -set_ftime "${fhr} hour fcst" -grib "${COM_WAVE_GRID}/${outfile}" + ${WGRIB2} gribfile -set_date "${PDY}${cyc}" -set_ftime "${fhr} hour fcst" -grib "${COMOUT_WAVE_GRID}/${outfile}" err=$? else ${WGRIB2} gribfile -set_date "${PDY}${cyc}" -set_ftime "${fhr} hour fcst" \ - -set table_1.4 1 -set table_1.2 1 -grib "${COM_WAVE_GRID}/${outfile}" + -set table_1.4 1 -set table_1.2 1 -grib "${COMOUT_WAVE_GRID}/${outfile}" err=$? fi @@ -177,7 +177,7 @@ if [[ ! -s "${COM_WAVE_GRID}/${outfile}.idx" ]]; then fi # Create index - ${WGRIB2} -s "${COM_WAVE_GRID}/${outfile}" > "${COM_WAVE_GRID}/${outfile}.idx" + ${WGRIB2} -s "${COMOUT_WAVE_GRID}/${outfile}" > "${COMOUT_WAVE_GRID}/${outfile}.idx" # Create grib2 subgrid is this is the source grid if [[ "${grdID}" = "${WAV_SUBGRBSRC}" ]]; then @@ -186,14 +186,14 @@ if [[ ! -s "${COM_WAVE_GRID}/${outfile}.idx" ]]; then subgrbnam=$(echo ${!subgrb} | cut -d " " -f 21) subgrbres=$(echo ${!subgrb} | cut -d " " -f 22) subfnam="${WAV_MOD_TAG}.${cycle}${ENSTAG}.${subgrbnam}.${subgrbres}.f${FH3}.grib2" - ${COPYGB2} -g "${subgrbref}" -i0 -x "${COM_WAVE_GRID}/${outfile}" "${COM_WAVE_GRID}/${subfnam}" - ${WGRIB2} -s "${COM_WAVE_GRID}/${subfnam}" > "${COM_WAVE_GRID}/${subfnam}.idx" + ${COPYGB2} -g "${subgrbref}" -i0 -x "${COMOUT_WAVE_GRID}/${outfile}" "${COMOUT_WAVE_GRID}/${subfnam}" + ${WGRIB2} -s "${COMOUT_WAVE_GRID}/${subfnam}" > "${COMOUT_WAVE_GRID}/${subfnam}.idx" done fi # 1.e Save in /com - if [[ ! -s "${COM_WAVE_GRID}/${outfile}" ]]; then + if [[ ! -s "${COMOUT_WAVE_GRID}/${outfile}" ]]; then set +x echo ' ' echo '********************************************* ' @@ -205,7 +205,7 @@ if [[ ! -s "${COM_WAVE_GRID}/${outfile}.idx" ]]; then set_trace exit 4 fi - if [[ ! -s "${COM_WAVE_GRID}/${outfile}.idx" ]]; then + if [[ ! -s "${COMOUT_WAVE_GRID}/${outfile}.idx" ]]; then set +x echo ' ' echo '*************************************************** ' @@ -220,11 +220,11 @@ if [[ ! 
-s "${COM_WAVE_GRID}/${outfile}.idx" ]]; then if [[ "${SENDDBN}" = 'YES' ]] && [[ ${outfile} != *global.0p50* ]]; then set +x - echo " Alerting GRIB file as ${COM_WAVE_GRID}/${outfile}" - echo " Alerting GRIB index file as ${COM_WAVE_GRID}/${outfile}.idx" + echo " Alerting GRIB file as ${COMOUT_WAVE_GRID}/${outfile}" + echo " Alerting GRIB index file as ${COMOUT_WAVE_GRID}/${outfile}.idx" set_trace - "${DBNROOT}/bin/dbn_alert" MODEL "${alertName}_WAVE_GB2" "${job}" "${COM_WAVE_GRID}/${outfile}" - "${DBNROOT}/bin/dbn_alert" MODEL "${alertName}_WAVE_GB2_WIDX" "${job}" "${COM_WAVE_GRID}/${outfile}.idx" + "${DBNROOT}/bin/dbn_alert" MODEL "${alertName}_WAVE_GB2" "${job}" "${COMOUT_WAVE_GRID}/${outfile}" + "${DBNROOT}/bin/dbn_alert" MODEL "${alertName}_WAVE_GB2_WIDX" "${job}" "${COMOUT_WAVE_GRID}/${outfile}.idx" else echo "${outfile} is global.0p50 or SENDDBN is NO, no alert sent" fi @@ -245,7 +245,7 @@ if [[ ! -s "${COM_WAVE_GRID}/${outfile}.idx" ]]; then else set +x echo ' ' - echo " File ${COM_WAVE_GRID}/${outfile} found, skipping generation process" + echo " File ${COMOUT_WAVE_GRID}/${outfile} found, skipping generation process" echo ' ' set_trace fi diff --git a/ush/wave_grid_interp_sbs.sh b/ush/wave_grid_interp_sbs.sh index c11a75f89d..31b7808c16 100755 --- a/ush/wave_grid_interp_sbs.sh +++ b/ush/wave_grid_interp_sbs.sh @@ -25,7 +25,7 @@ # --------------------------------------------------------------------------- # # 0. Preparations -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # 0.a Basic modes of operation @@ -65,8 +65,8 @@ source "$HOMEgfs/ush/preamble.sh" echo " Model ID : $WAV_MOD_TAG" set_trace - if [[ -z "${PDY}" ]] || [[ -z "${cyc}" ]] || [[ -z "${cycle}" ]] || [[ -z "${EXECwave}" ]] || \ - [[ -z "${COM_WAVE_PREP}" ]] || [[ -z "${WAV_MOD_TAG}" ]] || [[ -z "${SENDDBN}" ]] || \ + if [[ -z "${PDY}" ]] || [[ -z "${cyc}" ]] || [[ -z "${cycle}" ]] || [[ -z "${EXECgfs}" ]] || \ + [[ -z "${COMOUT_WAVE_PREP}" ]] || [[ -z "${WAV_MOD_TAG}" ]] || [[ -z "${SENDDBN}" ]] || \ [ -z "${waveGRD}" ] then set +x @@ -75,7 +75,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** EXPORTED VARIABLES IN postprocessor NOT SET ***' echo '***************************************************' echo ' ' - echo "${PDY}${cyc} ${cycle} ${EXECwave} ${COM_WAVE_PREP} ${WAV_MOD_TAG} ${SENDDBN} ${waveGRD}" + echo "${PDY}${cyc} ${cycle} ${EXECgfs} ${COMOUT_WAVE_PREP} ${WAV_MOD_TAG} ${SENDDBN} ${waveGRD}" set_trace exit 1 fi @@ -85,18 +85,16 @@ source "$HOMEgfs/ush/preamble.sh" rm -f ${DATA}/output_${ymdh}0000/out_grd.$grdID if [ ! -f ${DATA}/${grdID}_interp.inp.tmpl ]; then - cp $PARMwave/${grdID}_interp.inp.tmpl ${DATA} + cp "${PARMgfs}/wave/${grdID}_interp.inp.tmpl" "${DATA}/${grdID}_interp.inp.tmpl" fi - ln -sf ${DATA}/${grdID}_interp.inp.tmpl . + ${NLN} "${DATA}/${grdID}_interp.inp.tmpl" "${grdID}_interp.inp.tmpl" - for ID in $waveGRD - do - ln -sf ${DATA}/output_${ymdh}0000/out_grd.$ID . + for ID in ${waveGRD}; do + ${NLN} "${DATA}/output_${ymdh}0000/out_grd.${ID}" "out_grd.${ID}" done - for ID in $waveGRD $grdID - do - ln -sf ${DATA}/mod_def.$ID . + for ID in ${waveGRD} ${grdID}; do + ${NLN} "${DATA}/mod_def.${ID}" "mod_def.${ID}" done # --------------------------------------------------------------------------- # @@ -113,42 +111,42 @@ source "$HOMEgfs/ush/preamble.sh" wht_OK='no' if [ ! 
-f ${DATA}/ww3_gint.WHTGRIDINT.bin.${grdID} ]; then - if [ -f $FIXwave/ww3_gint.WHTGRIDINT.bin.${grdID} ] + if [ -f ${FIXgfs}/wave/ww3_gint.WHTGRIDINT.bin.${grdID} ] then set +x echo ' ' - echo " Copying $FIXwave/ww3_gint.WHTGRIDINT.bin.${grdID} " + echo " Copying ${FIXgfs}/wave/ww3_gint.WHTGRIDINT.bin.${grdID} " set_trace - cp $FIXwave/ww3_gint.WHTGRIDINT.bin.${grdID} ${DATA} + cp ${FIXgfs}/wave/ww3_gint.WHTGRIDINT.bin.${grdID} ${DATA} wht_OK='yes' else set +x echo ' ' - echo " Not found: $FIXwave/ww3_gint.WHTGRIDINT.bin.${grdID} " + echo " Not found: ${FIXgfs}/wave/ww3_gint.WHTGRIDINT.bin.${grdID} " fi fi # Check and link weights file if [ -f ${DATA}/ww3_gint.WHTGRIDINT.bin.${grdID} ] then - ln -s ${DATA}/ww3_gint.WHTGRIDINT.bin.${grdID} ./WHTGRIDINT.bin + ${NLN} ${DATA}/ww3_gint.WHTGRIDINT.bin.${grdID} ./WHTGRIDINT.bin fi # 1.b Run interpolation code set +x echo " Run ww3_gint - echo " Executing $EXECwave/ww3_gint + echo " Executing ${EXECgfs}/ww3_gint set_trace export pgm=ww3_gint;. prep_step - $EXECwave/ww3_gint 1> gint.${grdID}.out 2>&1 + ${EXECgfs}/ww3_gint 1> gint.${grdID}.out 2>&1 export err=$?;err_chk # Write interpolation file to main TEMP dir area if not there yet if [ "wht_OK" = 'no' ] then cp -f ./WHTGRIDINT.bin ${DATA}/ww3_gint.WHTGRIDINT.bin.${grdID} - cp -f ./WHTGRIDINT.bin ${FIXwave}/ww3_gint.WHTGRIDINT.bin.${grdID} + cp -f ./WHTGRIDINT.bin ${FIXgfs}/wave/ww3_gint.WHTGRIDINT.bin.${grdID} fi @@ -173,9 +171,9 @@ source "$HOMEgfs/ush/preamble.sh" # 1.c Save in /com set +x - echo " Saving GRID file as ${COM_WAVE_PREP}/${WAV_MOD_TAG}.out_grd.${grdID}.${PDY}${cyc}" + echo " Saving GRID file as ${COMOUT_WAVE_PREP}/${WAV_MOD_TAG}.out_grd.${grdID}.${PDY}${cyc}" set_trace - cp "${DATA}/output_${ymdh}0000/out_grd.${grdID}" "${COM_WAVE_PREP}/${WAV_MOD_TAG}.out_grd.${grdID}.${PDY}${cyc}" + cp "${DATA}/output_${ymdh}0000/out_grd.${grdID}" "${COMOUT_WAVE_PREP}/${WAV_MOD_TAG}.out_grd.${grdID}.${PDY}${cyc}" # if [ "$SENDDBN" = 'YES' ] # then diff --git a/ush/wave_grid_moddef.sh b/ush/wave_grid_moddef.sh index 5b1b212a16..1e8c44054a 100755 --- a/ush/wave_grid_moddef.sh +++ b/ush/wave_grid_moddef.sh @@ -20,7 +20,7 @@ # --------------------------------------------------------------------------- # # 0. Preparations -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # 0.a Basic modes of operation @@ -59,7 +59,7 @@ source "$HOMEgfs/ush/preamble.sh" # 0.c Define directories and the search path. # The tested variables should be exported by the postprocessor script. - if [ -z "$grdID" ] || [ -z "$EXECwave" ] || [ -z "$wave_sys_ver" ] + if [ -z "$grdID" ] || [ -z "${EXECgfs}" ] then set +x echo ' ' @@ -77,14 +77,22 @@ source "$HOMEgfs/ush/preamble.sh" set +x echo ' ' echo ' Creating mod_def file ...' - echo " Executing $EXECwave/ww3_grid" + echo " Executing ${EXECgfs}/ww3_grid" echo ' ' set_trace rm -f ww3_grid.inp - ln -sf ../ww3_grid.inp.$grdID ww3_grid.inp + ${NLN} ../ww3_grid.inp.$grdID ww3_grid.inp + + if [ -f ../${grdID}.msh ] + then + rm -f ${grdID}.msh + ${NLN} ../${grdID}.msh ${grdID}.msh + fi + + - $EXECwave/ww3_grid 1> grid_${grdID}.out 2>&1 + "${EXECgfs}/ww3_grid" 1> "grid_${grdID}.out" 2>&1 err=$? 
if [ "$err" != '0' ] @@ -99,10 +107,10 @@ source "$HOMEgfs/ush/preamble.sh" exit 3 fi - if [ -f mod_def.ww3 ] + if [[ -f mod_def.ww3 ]] then - cp mod_def.ww3 "${COM_WAVE_PREP}/${RUN}wave.mod_def.${grdID}" - mv mod_def.ww3 ../mod_def.$grdID + cp mod_def.ww3 "${COMOUT_WAVE_PREP}/${RUN}wave.mod_def.${grdID}" + mv mod_def.ww3 "../mod_def.${grdID}" else set +x echo ' ' @@ -118,6 +126,6 @@ source "$HOMEgfs/ush/preamble.sh" # 3. Clean up cd .. -rm -rf moddef_$grdID +rm -rf "moddef_${grdID}" # End of ww3_mod_def.sh ------------------------------------------------- # diff --git a/ush/wave_outp_cat.sh b/ush/wave_outp_cat.sh index f4bf6b2294..6ce3ce06cf 100755 --- a/ush/wave_outp_cat.sh +++ b/ush/wave_outp_cat.sh @@ -21,7 +21,7 @@ # --------------------------------------------------------------------------- # # 0. Preparations -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # 0.a Basic modes of operation bloc=$1 diff --git a/ush/wave_outp_spec.sh b/ush/wave_outp_spec.sh index 5acc0f95ab..37accbae49 100755 --- a/ush/wave_outp_spec.sh +++ b/ush/wave_outp_spec.sh @@ -22,7 +22,7 @@ # --------------------------------------------------------------------------- # # 0. Preparations -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # 0.a Basic modes of operation bloc=$1 @@ -31,6 +31,7 @@ source "$HOMEgfs/ush/preamble.sh" workdir=$4 YMDHE=$($NDATE $FHMAX_WAV_PNT $CDATE) + model_start_date=$(${NDATE} ${OFFSET_START_HOUR} "${PDY}${cyc}") cd $workdir @@ -73,21 +74,7 @@ source "$HOMEgfs/ush/preamble.sh" exit 1 else buoy=$bloc - grep $buoy ${DATA}/buoy_log.ww3 > tmp_list.loc - while read line - do - buoy_name=$(echo $line | awk '{print $2}') - if [ $buoy = $buoy_name ] - then - point=$(echo $line | awk '{ print $1 }') - set +x - echo " Location ID/# : $buoy (${point})" - echo " Spectral output start time : $ymdh " - echo ' ' - set_trace - break - fi - done < tmp_list.loc + point=$(awk "{if (\$2 == \"${buoy}\"){print \$1; exit} }" "${DATA}/buoy_log.ww3") if [ -z "$point" ] then set +x @@ -97,6 +84,11 @@ source "$HOMEgfs/ush/preamble.sh" echo ' ' set_trace exit 2 + else + set +x + echo " Location ID/# : $buoy (${point})" + echo " Spectral output start time : $ymdh " + echo ' ' fi fi @@ -104,7 +96,7 @@ source "$HOMEgfs/ush/preamble.sh" # 0.c Define directories and the search path. # The tested variables should be exported by the postprocessor script. - if [ -z "$CDATE" ] || [ -z "$dtspec" ] || [ -z "$EXECwave" ] || \ + if [ -z "$CDATE" ] || [ -z "$dtspec" ] || [ -z "${EXECgfs}" ] || \ [ -z "$WAV_MOD_TAG" ] || [ -z "${STA_DIR}" ] then set +x @@ -135,8 +127,8 @@ source "$HOMEgfs/ush/preamble.sh" # 0.f Links to mother directory - ln -s ${DATA}/output_${ymdh}0000/mod_def.${waveuoutpGRD} ./mod_def.ww3 - ln -s ${DATA}/output_${ymdh}0000/out_pnt.${waveuoutpGRD} ./out_pnt.ww3 + ${NLN} ${DATA}/output_${ymdh}0000/mod_def.${waveuoutpGRD} ./mod_def.ww3 + ${NLN} ${DATA}/output_${ymdh}0000/out_pnt.${waveuoutpGRD} ./out_pnt.ww3 # --------------------------------------------------------------------------- # # 2. Generate spectral data file @@ -170,11 +162,11 @@ source "$HOMEgfs/ush/preamble.sh" # 2.b Run the postprocessor set +x - echo " Executing $EXECwave/ww3_outp" + echo " Executing ${EXECgfs}/ww3_outp" set_trace export pgm=ww3_outp;. 
prep_step - $EXECwave/ww3_outp 1> outp_${specdir}_${buoy}.out 2>&1 + ${EXECgfs}/ww3_outp 1> outp_${specdir}_${buoy}.out 2>&1 export err=$?;err_chk @@ -196,31 +188,31 @@ source "$HOMEgfs/ush/preamble.sh" if [ -f $outfile ] then - if [ "${ymdh}" = "${CDATE}" ] + if [ "${ymdh}" = "${model_start_date}" ] then if [ "$specdir" = "bull" ] then - cat $outfile | sed -e '9,$d' >> ${STA_DIR}/${specdir}fhr/$WAV_MOD_TAG.${ymdh}.$buoy.bull - cat $coutfile | sed -e '8,$d' >> ${STA_DIR}/c${specdir}fhr/$WAV_MOD_TAG.${ymdh}.$buoy.cbull + sed '9,$d' "${outfile}" >> "${STA_DIR}/${specdir}fhr/${WAV_MOD_TAG}.${ymdh}.${buoy}.bull" + sed '8,$d' "${coutfile}" >> "${STA_DIR}/c${specdir}fhr/${WAV_MOD_TAG}.${ymdh}.${buoy}.cbull" else - cat $outfile >> ${STA_DIR}/${specdir}fhr/$WAV_MOD_TAG.${ymdh}.$buoy.spec + cat $outfile >> "${STA_DIR}/${specdir}fhr/${WAV_MOD_TAG}.${ymdh}.${buoy}.spec" fi elif [ "${ymdh}" = "${YMDHE}" ] then if [ "$specdir" = "bull" ] then - cat $outfile | sed -e '1,7d' >> ${STA_DIR}/${specdir}fhr/$WAV_MOD_TAG.${ymdh}.$buoy.bull - cat $coutfile | sed -e '1,6d' >> ${STA_DIR}/c${specdir}fhr/$WAV_MOD_TAG.${ymdh}.$buoy.cbull + sed '1,7d' "${outfile}" >> "${STA_DIR}/${specdir}fhr/${WAV_MOD_TAG}.${ymdh}.${buoy}.bull" + sed '1,6d' "${coutfile}" >> "${STA_DIR}/c${specdir}fhr/${WAV_MOD_TAG}.${ymdh}.${buoy}.cbull" else - cat $outfile | sed -n "/^${YMD} ${HMS}$/,\$p" >> ${STA_DIR}/${specdir}fhr/$WAV_MOD_TAG.${ymdh}.$buoy.spec + sed -n "/^${YMD} ${HMS}$/,\$p" "${outfile}" >> "${STA_DIR}/${specdir}fhr/${WAV_MOD_TAG}.${ymdh}.${buoy}.spec" fi else if [ "$specdir" = "bull" ] then - cat $outfile | sed -e '1,7d' | sed -e '2,$d' >> ${STA_DIR}/${specdir}fhr/$WAV_MOD_TAG.${ymdh}.$buoy.bull - cat $coutfile | sed -e '1,6d' | sed -e '2,$d' >> ${STA_DIR}/c${specdir}fhr/$WAV_MOD_TAG.${ymdh}.$buoy.cbull + sed '8q;d' "${outfile}" >> "${STA_DIR}/${specdir}fhr/${WAV_MOD_TAG}.${ymdh}.${buoy}.bull" + sed '7q;d' "${coutfile}" >> "${STA_DIR}/c${specdir}fhr/${WAV_MOD_TAG}.${ymdh}.${buoy}.cbull" else - cat $outfile | sed -n "/^${YMD} ${HMS}$/,\$p" >> ${STA_DIR}/${specdir}fhr/$WAV_MOD_TAG.${ymdh}.$buoy.spec + sed -n "/^${YMD} ${HMS}$/,\$p" "${outfile}" >> "${STA_DIR}/${specdir}fhr/${WAV_MOD_TAG}.${ymdh}.${buoy}.spec" fi fi else @@ -237,6 +229,6 @@ source "$HOMEgfs/ush/preamble.sh" # 3.b Clean up the rest cd .. 
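The sed line-slicing above assembles each station bulletin from per-hour fragments: the full header block at the model start time, the trailing block at the final hour, and a single data row for every hour in between. In Python terms, a sketch only, with line counts following the sed commands for the .bull files:

    def bull_fragment(lines: list, ymdh: str, start: str, end: str) -> list:
        if ymdh == start:
            return lines[:8]    # sed '9,$d': keep lines 1-8 (header plus first data row)
        if ymdh == end:
            return lines[7:]    # sed '1,7d': drop the seven header lines
        return [lines[7]]       # sed '8q;d': keep only line 8, the single data row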
-rm -rf ${specdir}_${bloc} +rm -rf "${specdir}_${bloc}" # End of ww3_outp_spec.sh ---------------------------------------------------- # diff --git a/ush/wave_prnc_cur.sh b/ush/wave_prnc_cur.sh index 6b1ab19db2..927710c581 100755 --- a/ush/wave_prnc_cur.sh +++ b/ush/wave_prnc_cur.sh @@ -22,7 +22,7 @@ ################################################################################ # -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" ymdh_rtofs=$1 curfile=$2 @@ -46,7 +46,7 @@ mv -f "cur_temp3.nc" "cur_uv_${PDY}_${fext}${fh3}_flat.nc" # Convert to regular lat lon file # If weights need to be regenerated due to CDO ver change, use: # $CDO genbil,r4320x2160 rtofs_glo_2ds_f000_3hrly_prog.nc weights.nc -cp ${FIXwave}/weights_rtofs_to_r4320x2160.nc ./weights.nc +cp ${FIXgfs}/wave/weights_rtofs_to_r4320x2160.nc ./weights.nc # Interpolate to regular 5 min grid ${CDO} remap,r4320x2160,weights.nc "cur_uv_${PDY}_${fext}${fh3}_flat.nc" "cur_5min_01.nc" @@ -65,17 +65,17 @@ rm -f cur_temp[123].nc cur_5min_??.nc "cur_glo_uv_${PDY}_${fext}${fh3}.nc weight if [ ${flagfirst} = "T" ] then - sed -e "s/HDRFL/T/g" ${PARMwave}/ww3_prnc.cur.${WAVECUR_FID}.inp.tmpl > ww3_prnc.inp + sed -e "s/HDRFL/T/g" ${PARMgfs}/wave/ww3_prnc.cur.${WAVECUR_FID}.inp.tmpl > ww3_prnc.inp else - sed -e "s/HDRFL/F/g" ${PARMwave}/ww3_prnc.cur.${WAVECUR_FID}.inp.tmpl > ww3_prnc.inp + sed -e "s/HDRFL/F/g" ${PARMgfs}/wave/ww3_prnc.cur.${WAVECUR_FID}.inp.tmpl > ww3_prnc.inp fi rm -f cur.nc -ln -s "cur_glo_uv_${PDY}_${fext}${fh3}_5min.nc" "cur.nc" -ln -s "${DATA}/mod_def.${WAVECUR_FID}" ./mod_def.ww3 +${NLN} "cur_glo_uv_${PDY}_${fext}${fh3}_5min.nc" "cur.nc" +${NLN} "${DATA}/mod_def.${WAVECUR_FID}" ./mod_def.ww3 export pgm=ww3_prnc;. prep_step -$EXECwave/ww3_prnc 1> prnc_${WAVECUR_FID}_${ymdh_rtofs}.out 2>&1 +${EXECgfs}/ww3_prnc 1> prnc_${WAVECUR_FID}_${ymdh_rtofs}.out 2>&1 export err=$?; err_chk diff --git a/ush/wave_prnc_ice.sh b/ush/wave_prnc_ice.sh index 5ec1d7fc2e..be089c30bd 100755 --- a/ush/wave_prnc_ice.sh +++ b/ush/wave_prnc_ice.sh @@ -27,7 +27,7 @@ # --------------------------------------------------------------------------- # # 0. Preparations -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # 0.a Basic modes of operation @@ -36,7 +36,7 @@ source "$HOMEgfs/ush/preamble.sh" rm -rf ice mkdir ice cd ice - ln -s ${DATA}/postmsg . + ${NLN} "${DATA}/postmsg" postmsg # 0.b Define directories and the search path. # The tested variables should be exported by the postprocessor script. @@ -55,8 +55,8 @@ source "$HOMEgfs/ush/preamble.sh" echo "Making ice fields." if [[ -z "${YMDH}" ]] || [[ -z "${cycle}" ]] || \ - [[ -z "${COM_WAVE_PREP}" ]] || [[ -z "${FIXwave}" ]] || [[ -z "${EXECwave}" ]] || \ - [[ -z "${WAV_MOD_TAG}" ]] || [[ -z "${WAVEICE_FID}" ]] || [[ -z "${COM_OBS}" ]]; then + [[ -z "${COMOUT_WAVE_PREP}" ]] || [[ -z "${FIXgfs}" ]] || [[ -z "${EXECgfs}" ]] || \ + [[ -z "${WAV_MOD_TAG}" ]] || [[ -z "${WAVEICE_FID}" ]] || [[ -z "${COMIN_OBS}" ]]; then set +x echo ' ' @@ -71,13 +71,13 @@ source "$HOMEgfs/ush/preamble.sh" # 0.c Links to working directory - ln -s ${DATA}/mod_def.$WAVEICE_FID mod_def.ww3 + ${NLN} ${DATA}/mod_def.$WAVEICE_FID mod_def.ww3 # --------------------------------------------------------------------------- # # 1. Get the necessary files # 1.a Copy the ice data file - file=${COM_OBS}/${WAVICEFILE} + file=${COMIN_OBS}/${WAVICEFILE} if [ -f $file ] then @@ -144,7 +144,7 @@ source "$HOMEgfs/ush/preamble.sh" export pgm=ww3_prnc;. 
prep_step - $EXECwave/ww3_prnc 1> prnc_${WAVEICE_FID}_${cycle}.out 2>&1 + ${EXECgfs}/ww3_prnc 1> prnc_${WAVEICE_FID}_${cycle}.out 2>&1 export err=$?; err_chk if [ "$err" != '0' ] @@ -178,9 +178,9 @@ source "$HOMEgfs/ush/preamble.sh" fi set +x - echo " Saving ice.ww3 as ${COM_WAVE_PREP}/${icefile}" + echo " Saving ice.ww3 as ${COMOUT_WAVE_PREP}/${icefile}" set_trace - cp ice.ww3 "${COM_WAVE_PREP}/${icefile}" + cp ice.ww3 "${COMOUT_WAVE_PREP}/${icefile}" rm -f ice.ww3 # --------------------------------------------------------------------------- # diff --git a/ush/wave_tar.sh b/ush/wave_tar.sh index 1a8d6d6cc5..f82849854f 100755 --- a/ush/wave_tar.sh +++ b/ush/wave_tar.sh @@ -25,11 +25,11 @@ # --------------------------------------------------------------------------- # # 0. Preparations -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # 0.a Basic modes of operation - cd $DATA + cd "${DATA}" echo "Making TAR FILE" alertName=$(echo $RUN|tr [a-z] [A-Z]) @@ -47,7 +47,7 @@ source "$HOMEgfs/ush/preamble.sh" # 0.b Check if type set - if [ "$#" -lt '3' ] + if [[ "$#" -lt '3' ]] then set +x echo ' ' @@ -64,9 +64,9 @@ source "$HOMEgfs/ush/preamble.sh" fi filext=$type - if [ "$type" = "ibp" ]; then filext='spec'; fi - if [ "$type" = "ibpbull" ]; then filext='bull'; fi - if [ "$type" = "ibpcbull" ]; then filext='cbull'; fi + if [[ "$type" = "ibp" ]]; then filext='spec'; fi + if [[ "$type" = "ibpbull" ]]; then filext='bull'; fi + if [[ "$type" = "ibpcbull" ]]; then filext='cbull'; fi rm -rf TAR_${filext}_$ID @@ -76,7 +76,7 @@ source "$HOMEgfs/ush/preamble.sh" # 0.c Define directories and the search path. # The tested variables should be exported by the postprocessor script. - if [[ -z "${cycle}" ]] || [[ -z "${COM_WAVE_STATION}" ]] || [[ -z "${WAV_MOD_TAG}" ]] || \ + if [[ -z "${cycle}" ]] || [[ -z "${COMOUT_WAVE_STATION}" ]] || [[ -z "${WAV_MOD_TAG}" ]] || \ [[ -z "${SENDDBN}" ]] || [[ -z "${STA_DIR}" ]]; then set +x echo ' ' @@ -88,7 +88,7 @@ source "$HOMEgfs/ush/preamble.sh" exit 2 fi - cd ${STA_DIR}/${filext} + cd "${STA_DIR}/${filext}" # --------------------------------------------------------------------------- # # 2. Generate tar file (spectral files are compressed) @@ -98,21 +98,27 @@ source "$HOMEgfs/ush/preamble.sh" echo ' Making tar file ...' set_trace - count=0 countMAX=5 tardone='no' - - while [ "$count" -lt "$countMAX" ] && [ "$tardone" = 'no' ] + sleep_interval=10 + + while [[ "${tardone}" = "no" ]] do nf=$(ls | awk '/'$ID.*.$filext'/ {a++} END {print a}') nbm2=$(( $nb - 2 )) - if [ $nf -ge $nbm2 ] - then - tar -cf $ID.$cycle.${type}_tar ./$ID.*.$filext + if [[ "${nf}" -ge "${nbm2}" ]] + then + + tar -cf "${ID}.${cycle}.${type}_tar" ./${ID}.*.${filext} exit=$? + filename="${ID}.${cycle}.${type}_tar" + if ! wait_for_file "${filename}" "${sleep_interval}" "${countMAX}" ; then + echo "FATAL ERROR: File ${filename} not found after waiting $(( sleep_interval * (countMAX + 1) )) secs" + exit 3 + fi - if [ "$exit" != '0' ] + if [[ "${exit}" != '0' ]] then set +x echo ' ' @@ -124,21 +130,15 @@ source "$HOMEgfs/ush/preamble.sh" exit 3 fi - if [ -f "$ID.$cycle.${type}_tar" ] + if [[ -f "${ID}.${cycle}.${type}_tar" ]] then tardone='yes' fi - else - set +x - echo ' All files not found for tar. Sleeping 10 seconds and trying again ..' 
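wait_for_file, used by the rewritten retry logic here, is assumed to poll for a file with a fixed sleep between attempts; a minimal Python sketch of the semantics the new shell code relies on (illustrative only; the actual helper is provided by the workflow's shell utilities):

    import os
    import time

    def wait_for_file(path: str, sleep_interval: int, max_tries: int) -> bool:
        # Poll until the file exists, sleeping sleep_interval seconds between checks,
        # for a total wait of up to sleep_interval * (max_tries + 1) seconds.
        for _ in range(max_tries + 1):
            if os.path.exists(path):
                return True
            time.sleep(sleep_interval)
        return False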
- set_trace - sleep 10 - count=$(expr $count + 1) fi done - if [ "$tardone" = 'no' ] + if [[ "${tardone}" = 'no' ]] then set +x echo ' ' @@ -150,15 +150,15 @@ source "$HOMEgfs/ush/preamble.sh" exit 3 fi - if [ "$type" = 'spec' ] + if [[ "${type}" = 'spec' ]] then - if [ -s $ID.$cycle.${type}_tar ] + if [[ -s "${ID}.${cycle}.${type}_tar" ]] then - file_name=$ID.$cycle.${type}_tar.gz - /usr/bin/gzip -c $ID.$cycle.${type}_tar > ${file_name} + file_name="${ID}.${cycle}.${type}_tar.gz" + /usr/bin/gzip -c "${ID}.${cycle}.${type}_tar" > "${file_name}" exit=$? - if [ "$exit" != '0' ] + if [[ "${exit}" != '0' ]] then set +x echo ' ' @@ -171,7 +171,7 @@ source "$HOMEgfs/ush/preamble.sh" fi fi else - file_name=$ID.$cycle.${type}_tar + file_name="${ID}.${cycle}.${type}_tar" fi # --------------------------------------------------------------------------- # @@ -179,14 +179,14 @@ source "$HOMEgfs/ush/preamble.sh" set +x echo ' ' - echo " Moving tar file ${file_name} to ${COM_WAVE_STATION} ..." + echo " Moving tar file ${file_name} to ${COMOUT_WAVE_STATION} ..." set_trace - cp "${file_name}" "${COM_WAVE_STATION}/." + cp "${file_name}" "${COMOUT_WAVE_STATION}/." exit=$? - if [ "$exit" != '0' ] + if [[ "${exit}" != '0' ]] then set +x echo ' ' @@ -198,21 +198,21 @@ source "$HOMEgfs/ush/preamble.sh" exit 4 fi - if [ "$SENDDBN" = 'YES' ] + if [[ "${SENDDBN}" = 'YES' ]] then set +x echo ' ' - echo " Alerting TAR file as ${COM_WAVE_STATION}/${file_name}" + echo " Alerting TAR file as ${COMOUT_WAVE_STATION}/${file_name}" echo ' ' set_trace "${DBNROOT}/bin/dbn_alert MODEL" "${alertName}_WAVE_TAR" "${job}" \ - "${COM_WAVE_STATION}/${file_name}" + "${COMOUT_WAVE_STATION}/${file_name}" fi # --------------------------------------------------------------------------- # # 4. Final clean up -cd $DATA +cd "${DATA}" if [[ ${KEEPDATA:-NO} == "NO" ]]; then set -v diff --git a/versions/build.gaea.ver b/versions/build.gaea.ver new file mode 100644 index 0000000000..b92fe8c1db --- /dev/null +++ b/versions/build.gaea.ver @@ -0,0 +1,6 @@ +export stack_intel_ver=2023.1.0 +export stack_cray_mpich_ver=8.1.25 +export spack_env=gsi-addon-dev + +source "${HOMEgfs:-}/versions/run.spack.ver" +export spack_mod_path="/ncrc/proj/epic/spack-stack/spack-stack-${spack_stack_ver}/envs/${spack_env}/install/modulefiles/Core" diff --git a/versions/build.hercules.ver b/versions/build.hercules.ver index 5513466631..cab0c92111 100644 --- a/versions/build.hercules.ver +++ b/versions/build.hercules.ver @@ -1,3 +1,6 @@ export stack_intel_ver=2021.9.0 export stack_impi_ver=2021.9.0 +export intel_mkl_ver=2023.1.0 +export spack_env=gsi-addon-env source "${HOMEgfs:-}/versions/build.spack.ver" +export spack_mod_path="/work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-${spack_stack_ver}/envs/${spack_env}/install/modulefiles/Core" diff --git a/versions/build.jet.ver b/versions/build.jet.ver index ff85b1a801..55c0ea0bd1 100644 --- a/versions/build.jet.ver +++ b/versions/build.jet.ver @@ -1,3 +1,5 @@ export stack_intel_ver=2021.5.0 export stack_impi_ver=2021.5.1 +export spack_env=gsi-addon-dev source "${HOMEgfs:-}/versions/build.spack.ver" +export spack_mod_path="/lfs4/HFIP/hfv3gfs/role.epic/spack-stack/spack-stack-${spack_stack_ver}/envs/${spack_env}/install/modulefiles/Core" diff --git a/versions/build.orion.ver b/versions/build.orion.ver index ff85b1a801..834ecfc166 100644 --- a/versions/build.orion.ver +++ b/versions/build.orion.ver @@ -1,3 +1,5 @@ -export stack_intel_ver=2021.5.0 -export stack_impi_ver=2021.5.1 +export stack_intel_ver=2021.9.0 
+export stack_impi_ver=2021.9.0 +export spack_env=gsi-addon-env-rocky9 source "${HOMEgfs:-}/versions/build.spack.ver" +export spack_mod_path="/work/noaa/epic/role-epic/spack-stack/orion/spack-stack-${spack_stack_ver}/envs/${spack_env}/install/modulefiles/Core" diff --git a/versions/build.s4.ver b/versions/build.s4.ver index a0aae51d87..e2731ccfb3 100644 --- a/versions/build.s4.ver +++ b/versions/build.s4.ver @@ -1,3 +1,5 @@ export stack_intel_ver=2021.5.0 export stack_impi_ver=2021.5.0 +export spack_env=gsi-addon-env source "${HOMEgfs:-}/versions/build.spack.ver" +export spack_mod_path="/data/prod/jedi/spack-stack/spack-stack-${spack_stack_ver}/envs/${spack_env}/install/modulefiles/Core" diff --git a/versions/build.spack.ver b/versions/build.spack.ver index fb5b244bf5..808f85dd16 100644 --- a/versions/build.spack.ver +++ b/versions/build.spack.ver @@ -1,5 +1,4 @@ -export spack_stack_ver=1.5.1 -export spack_env=gsi-addon +export spack_stack_ver=1.6.0 export cmake_ver=3.23.1 @@ -11,7 +10,7 @@ export fms_ver=2023.02.01 export hdf5_ver=1.14.0 export netcdf_c_ver=4.9.2 -export netcdf_fortran_ver=4.6.0 +export netcdf_fortran_ver=4.6.1 export bacio_ver=2.4.1 export nemsio_ver=2.5.4 @@ -19,10 +18,10 @@ export sigio_ver=2.3.2 export w3emc_ver=2.10.0 export bufr_ver=11.7.0 export g2_ver=3.4.5 -export sp_ver=2.3.3 +export sp_ver=2.5.0 export ip_ver=4.3.0 export gsi_ncdiag_ver=1.1.2 export g2tmpl_ver=1.10.2 -export crtm_ver=2.4.0 +export crtm_ver=2.4.0.1 export wgrib2_ver=2.0.8 export grib_util_ver=1.3.0 diff --git a/versions/build.wcoss2.ver b/versions/build.wcoss2.ver index 046ff5c64e..3ae0b3a1cc 100644 --- a/versions/build.wcoss2.ver +++ b/versions/build.wcoss2.ver @@ -28,6 +28,6 @@ export wrf_io_ver=1.2.0 export ncio_ver=1.1.2 export ncdiag_ver=1.0.0 export g2tmpl_ver=1.10.2 -export crtm_ver=2.4.0 +export crtm_ver=2.4.0.1 export upp_ver=10.0.8 diff --git a/versions/fix.ver b/versions/fix.ver index 13d9b56dd2..5ca044ae3d 100644 --- a/versions/fix.ver +++ b/versions/fix.ver @@ -4,19 +4,23 @@ export aer_ver=20220805 export am_ver=20220805 export chem_ver=20220805 -export cice_ver=20231219 +export cice_ver=20240416 export cpl_ver=20230526 export datm_ver=20220805 export gdas_crtm_ver=20220805 export gdas_fv3jedi_ver=20220805 -export gdas_gsibec_ver=20221031 +export gdas_soca_ver=20240624 +export gdas_gsibec_ver=20240416 +export gdas_obs_ver=20240213 export glwu_ver=20220805 -export gsi_ver=20230911 +export gsi_ver=20240208 export lut_ver=20220805 -export mom6_ver=20231219 +export mom6_ver=20240416 export orog_ver=20231027 export reg2grb2_ver=20220805 export sfc_climo_ver=20220805 export ugwd_ver=20220805 export verif_ver=20220805 export wave_ver=20240105 +export orog_nest_ver=global-nest.20240419 +export ugwd_nest_ver=global-nest.20240419 diff --git a/versions/run.gaea.ver b/versions/run.gaea.ver new file mode 100644 index 0000000000..b92fe8c1db --- /dev/null +++ b/versions/run.gaea.ver @@ -0,0 +1,6 @@ +export stack_intel_ver=2023.1.0 +export stack_cray_mpich_ver=8.1.25 +export spack_env=gsi-addon-dev + +source "${HOMEgfs:-}/versions/run.spack.ver" +export spack_mod_path="/ncrc/proj/epic/spack-stack/spack-stack-${spack_stack_ver}/envs/${spack_env}/install/modulefiles/Core" diff --git a/versions/run.hera.ver b/versions/run.hera.ver index b358f9d495..34f81bfe96 100644 --- a/versions/run.hera.ver +++ b/versions/run.hera.ver @@ -4,8 +4,10 @@ export spack_env=gsi-addon-dev-rocky8 export hpss_ver=hpss export ncl_ver=6.6.2 -export R_ver=3.5.0 -export gempak_ver=7.4.2 +export R_ver=3.6.1 + +export 
gempak_ver=7.17.0 +export perl_ver=5.38.0 source "${HOMEgfs:-}/versions/run.spack.ver" export spack_mod_path="/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-${spack_stack_ver}/envs/${spack_env}/install/modulefiles/Core" diff --git a/versions/run.hercules.ver b/versions/run.hercules.ver index 43f1b2181d..ee8e4f8aea 100644 --- a/versions/run.hercules.ver +++ b/versions/run.hercules.ver @@ -1,12 +1,7 @@ export stack_intel_ver=2021.9.0 export stack_impi_ver=2021.9.0 export intel_mkl_ver=2023.1.0 - -export ncl_ver=6.6.2 -export perl_ver=5.36.0 +export spack_env=gsi-addon-env source "${HOMEgfs:-}/versions/run.spack.ver" - -# wgrib2 and cdo are different on Hercules from all the other systems -export wgrib2_ver=3.1.1 -export cdo_ver=2.2.0 +export spack_mod_path="/work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-${spack_stack_ver}/envs/${spack_env}/install/modulefiles/Core" diff --git a/versions/run.jet.ver b/versions/run.jet.ver index 18a82cab4f..3aa586ee42 100644 --- a/versions/run.jet.ver +++ b/versions/run.jet.ver @@ -1,9 +1,14 @@ export stack_intel_ver=2021.5.0 export stack_impi_ver=2021.5.1 +export spack_env=gsi-addon-dev-rocky8 export hpss_ver= export ncl_ver=6.6.2 export R_ver=4.0.2 export gempak_ver=7.4.2 +# Adding perl as a module; With Rocky8, perl packages will not be from the OS +export perl_ver=5.38.0 + source "${HOMEgfs:-}/versions/run.spack.ver" +export spack_mod_path="/lfs4/HFIP/hfv3gfs/role.epic/spack-stack/spack-stack-${spack_stack_ver}/envs/${spack_env}/install/modulefiles/Core" diff --git a/versions/run.orion.ver b/versions/run.orion.ver index 7671bc028d..59adda6b50 100644 --- a/versions/run.orion.ver +++ b/versions/run.orion.ver @@ -1,11 +1,10 @@ -export stack_intel_ver=2022.0.2 -export stack_impi_ver=2021.5.1 - -export ncl_ver=6.6.2 -export gempak_ver=7.5.1 +export stack_intel_ver=2021.9.0 +export stack_impi_ver=2021.9.0 +export spack_env=gsi-addon-env-rocky9 #For metplus jobs, not currently working with spack-stack #export met_ver=9.1.3 #export metplus_ver=3.1.1 source "${HOMEgfs:-}/versions/run.spack.ver" +export spack_mod_path="/work/noaa/epic/role-epic/spack-stack/orion/spack-stack-${spack_stack_ver}/envs/${spack_env}/install/modulefiles/Core" diff --git a/versions/run.s4.ver b/versions/run.s4.ver index 56817ef439..6d0f4cbaca 100644 --- a/versions/run.s4.ver +++ b/versions/run.s4.ver @@ -1,6 +1,8 @@ export stack_intel_ver=2021.5.0 export stack_impi_ver=2021.5.0 +export spack_env=gsi-addon-env export ncl_ver=6.4.0-precompiled source "${HOMEgfs:-}/versions/run.spack.ver" +export spack_mod_path="/data/prod/jedi/spack-stack/spack-stack-${spack_stack_ver}/envs/${spack_env}/install/modulefiles/Core" diff --git a/versions/run.spack.ver b/versions/run.spack.ver index 80fa6acd1a..9aa5460c80 100644 --- a/versions/run.spack.ver +++ b/versions/run.spack.ver @@ -1,29 +1,35 @@ -export spack_stack_ver=1.5.1 -export spack_env=gsi-addon-dev-rocky8 -export python_ver=3.10.8 +export spack_stack_ver=1.6.0 +export python_ver=3.11.6 export jasper_ver=2.0.32 export libpng_ver=1.6.37 -export cdo_ver=2.0.5 +export cdo_ver=2.2.0 export nco_ver=5.0.6 export hdf5_ver=1.14.0 export netcdf_c_ver=4.9.2 -export netcdf_fortran_ver=4.6.0 +export netcdf_fortran_ver=4.6.1 export bufr_ver=11.7.0 export gsi_ncdiag_ver=1.1.2 export g2tmpl_ver=1.10.2 -export crtm_ver=2.4.0 +export crtm_ver=2.4.0.1 export wgrib2_ver=2.0.8 export grib_util_ver=1.3.0 -export prod_util_ver=1.2.2 +export prod_util_ver=2.1.1 export py_netcdf4_ver=1.5.8 -export py_pyyaml_ver=5.4.1 +export py_pyyaml_ver=6.0 export 
py_jinja2_ver=3.1.2 +export py_pandas_ver=1.5.3 +export py_python_dateutil_ver=2.8.2 +export py_f90nml_ver=1.4.3 + +export met_ver=9.1.3 +export metplus_ver=3.1.1 +export py_xarray_ver=2023.7.0 export obsproc_run_ver=1.1.2 -export prepobs_run_ver=1.0.1 +export prepobs_run_ver=1.0.2 export ens_tracker_ver=feature-GFSv17_com_reorg -export fit2obs_ver=1.0.0 +export fit2obs_ver=1.1.2 diff --git a/versions/run.wcoss2.ver b/versions/run.wcoss2.ver index a188cdea74..7f653dd50e 100644 --- a/versions/run.wcoss2.ver +++ b/versions/run.wcoss2.ver @@ -37,15 +37,17 @@ export bufr_dump_ver=1.0.0 export util_shared_ver=1.4.0 export g2tmpl_ver=1.10.2 export ncdiag_ver=1.0.0 -export crtm_ver=2.4.0 +export crtm_ver=2.4.0.1 export wgrib2_ver=2.0.8 +export met_ver=9.1.3 +export metplus_ver=3.1.1 # Development-only below export obsproc_run_ver=1.1.2 -export prepobs_run_ver=1.0.1 +export prepobs_run_ver=1.0.2 export ens_tracker_ver=feature-GFSv17_com_reorg -export fit2obs_ver=1.0.0 +export fit2obs_ver=1.1.2 export mos_ver=5.4.3 export mos_shared_ver=2.7.2 diff --git a/workflow/applications/applications.py b/workflow/applications/applications.py index d45b6a9abc..97a77c2c21 100644 --- a/workflow/applications/applications.py +++ b/workflow/applications/applications.py @@ -3,6 +3,7 @@ from typing import Dict, List, Any from datetime import timedelta from hosts import Host +from pathlib import Path from wxflow import Configuration, to_timedelta from abc import ABC, ABCMeta, abstractmethod @@ -31,7 +32,11 @@ def __init__(self, conf: Configuration) -> None: self.scheduler = Host().scheduler - _base = conf.parse_config('config.base') + # Save the configuration so we can source the config files when + # determining task resources + self.conf = conf + + _base = self.conf.parse_config('config.base') # Define here so the child __init__ functions can use it; will # be overwritten later during _init_finalize(). 
self._base = _base @@ -51,6 +56,7 @@ def __init__(self, conf: Configuration) -> None: self.do_ocean = _base.get('DO_OCN', False) self.do_ice = _base.get('DO_ICE', False) self.do_aero = _base.get('DO_AERO', False) + self.do_prep_obs_aero = _base.get('DO_PREP_OBS_AERO', False) self.do_bufrsnd = _base.get('DO_BUFRSND', False) self.do_gempak = _base.get('DO_GEMPAK', False) self.do_awips = _base.get('DO_AWIPS', False) @@ -64,30 +70,45 @@ def __init__(self, conf: Configuration) -> None: self.do_upp = not _base.get('WRITE_DOPOST', True) self.do_goes = _base.get('DO_GOES', False) self.do_mos = _base.get('DO_MOS', False) + self.do_extractvars = _base.get('DO_EXTRACTVARS', False) self.do_hpssarch = _base.get('HPSSARCH', False) self.nens = _base.get('NMEM_ENS', 0) - self.wave_cdumps = None + self.wave_runs = None if self.do_wave: - wave_cdump = _base.get('WAVE_CDUMP', 'BOTH').lower() - if wave_cdump in ['both']: - self.wave_cdumps = ['gfs', 'gdas'] - elif wave_cdump in ['gfs', 'gdas']: - self.wave_cdumps = [wave_cdump] - - def _init_finalize(self, conf: Configuration): + wave_run = _base.get('WAVE_RUN', 'BOTH').lower() + if wave_run in ['both']: + self.wave_runs = ['gfs', 'gdas'] + elif wave_run in ['gfs', 'gdas']: + self.wave_runs = [wave_run] + + self.aero_anl_runs = None + self.aero_fcst_runs = None + if self.do_aero: + aero_anl_run = _base.get('AERO_ANL_RUN', 'BOTH').lower() + if aero_anl_run in ['both']: + self.aero_anl_runs = ['gfs', 'gdas'] + elif aero_anl_run in ['gfs', 'gdas']: + self.aero_anl_runs = [aero_anl_run] + aero_fcst_run = _base.get('AERO_FCST_RUN', None).lower() + if aero_fcst_run in ['both']: + self.aero_fcst_runs = ['gfs', 'gdas'] + elif aero_fcst_run in ['gfs', 'gdas']: + self.aero_fcst_runs = [aero_fcst_run] + + def _init_finalize(self, *args): print("Finalizing initialize") # Get a list of all possible config_files that would be part of the application self.configs_names = self._get_app_configs() # Source the config_files for the jobs in the application - self.configs = self._source_configs(conf) + self.configs = self.source_configs() # Update the base config dictionary base on application - self.configs['base'] = self._update_base(self.configs['base']) + self.configs['base'] = self.update_base(self.configs['base']) # Save base in the internal state since it is often needed self._base = self.configs['base'] @@ -104,7 +125,7 @@ def _get_app_configs(self): @staticmethod @abstractmethod - def _update_base(base_in: Dict[str, Any]) -> Dict[str, Any]: + def update_base(base_in: Dict[str, Any]) -> Dict[str, Any]: ''' Make final updates to base and return an updated copy @@ -121,9 +142,9 @@ def _update_base(base_in: Dict[str, Any]) -> Dict[str, Any]: ''' pass - def _source_configs(self, conf: Configuration) -> Dict[str, Any]: + def source_configs(self, run: str = "gfs", log: bool = True) -> Dict[str, Any]: """ - Given the configuration object and jobs, + Given the configuration object used to initialize this application, source the configurations for each config and return a dictionary Every config depends on "config.base" """ @@ -131,7 +152,7 @@ def _source_configs(self, conf: Configuration) -> Dict[str, Any]: configs = dict() # Return config.base as well - configs['base'] = conf.parse_config('config.base') + configs['base'] = self.conf.parse_config('config.base') # Source the list of all config_files involved in the application for config in self.configs_names: @@ -145,20 +166,24 @@ def _source_configs(self, conf: Configuration) -> Dict[str, Any]: files += ['config.anal', 
'config.eupd'] elif config in ['efcs']: files += ['config.fcst', 'config.efcs'] + elif config in ['atmanlinit', 'atmanlvar', 'atmanlfv3inc']: + files += ['config.atmanl', f'config.{config}'] + elif config in ['atmensanlinit', 'atmensanlletkf', 'atmensanlfv3inc']: + files += ['config.atmensanl', f'config.{config}'] elif 'wave' in config: files += ['config.wave', f'config.{config}'] else: files += [f'config.{config}'] - print(f'sourcing config.{config}') - configs[config] = conf.parse_config(files) + print(f'sourcing config.{config}') if log else 0 + configs[config] = self.conf.parse_config(files, RUN=run) return configs @abstractmethod def get_task_names(self) -> Dict[str, List[str]]: ''' - Create a list of task names for each CDUMP valid for the configuation. + Create a list of task names for each RUN valid for the configuation. Parameters ---------- @@ -166,7 +191,7 @@ def get_task_names(self) -> Dict[str, List[str]]: Returns ------- - Dict[str, List[str]]: Lists of tasks for each CDUMP. + Dict[str, List[str]]: Lists of tasks for each RUN. ''' pass diff --git a/workflow/applications/gefs.py b/workflow/applications/gefs.py index b2369e8dfc..364ee2c48b 100644 --- a/workflow/applications/gefs.py +++ b/workflow/applications/gefs.py @@ -14,22 +14,33 @@ def _get_app_configs(self): """ Returns the config_files that are involved in gefs """ - configs = ['stage_ic', 'fcst'] + configs = ['stage_ic', 'fcst', 'atmos_products'] if self.nens > 0: - configs += ['efcs'] + configs += ['efcs', 'atmos_ensstat'] if self.do_wave: - configs += ['waveinit'] + configs += ['waveinit', 'wavepostsbs', 'wavepostpnt'] + if self.do_wave_bnd: + configs += ['wavepostbndpnt', 'wavepostbndpntbll'] + + if self.do_ocean or self.do_ice: + configs += ['oceanice_products'] + + if self.do_aero: + configs += ['prep_emissions'] + + if self.do_extractvars: + configs += ['extractvars'] return configs @staticmethod - def _update_base(base_in): + def update_base(base_in): base_out = base_in.copy() base_out['INTERVAL_GFS'] = AppConfig.get_gfs_interval(base_in['gfs_cyc']) - base_out['CDUMP'] = 'gefs' + base_out['RUN'] = 'gefs' return base_out @@ -40,9 +51,32 @@ def get_task_names(self): if self.do_wave: tasks += ['waveinit'] + if self.do_aero: + tasks += ['prep_emissions'] + tasks += ['fcst'] if self.nens > 0: tasks += ['efcs'] - return {f"{self._base['CDUMP']}": tasks} + tasks += ['atmos_prod'] + + if self.nens > 0: + tasks += ['atmos_ensstat'] + + if self.do_ocean: + tasks += ['ocean_prod'] + + if self.do_ice: + tasks += ['ice_prod'] + + if self.do_wave: + tasks += ['wavepostsbs'] + if self.do_wave_bnd: + tasks += ['wavepostbndpnt', 'wavepostbndpntbll'] + tasks += ['wavepostpnt'] + + if self.do_extractvars: + tasks += ['extractvars'] + + return {f"{self._base['RUN']}": tasks} diff --git a/workflow/applications/gfs_cycled.py b/workflow/applications/gfs_cycled.py index 1ff6cc3723..e049a7d422 100644 --- a/workflow/applications/gfs_cycled.py +++ b/workflow/applications/gfs_cycled.py @@ -16,18 +16,19 @@ def __init__(self, conf: Configuration): self.do_jediatmvar = self._base.get('DO_JEDIATMVAR', False) self.do_jediatmens = self._base.get('DO_JEDIATMENS', False) self.do_jediocnvar = self._base.get('DO_JEDIOCNVAR', False) - self.do_jedilandda = self._base.get('DO_JEDILANDDA', False) + self.do_jedisnowda = self._base.get('DO_JEDISNOWDA', False) self.do_mergensst = self._base.get('DO_MERGENSST', False) + self.do_vrfy_oceanda = self._base.get('DO_VRFY_OCEANDA', False) self.lobsdiag_forenkf = False - self.eupd_cdumps = None + self.eupd_runs = 
None if self.do_hybvar: self.lobsdiag_forenkf = self._base.get('lobsdiag_forenkf', False) - eupd_cdump = self._base.get('EUPD_CYC', 'gdas').lower() - if eupd_cdump in ['both']: - self.eupd_cdumps = ['gfs', 'gdas'] - elif eupd_cdump in ['gfs', 'gdas']: - self.eupd_cdumps = [eupd_cdump] + eupd_run = self._base.get('EUPD_CYC', 'gdas').lower() + if eupd_run in ['both']: + self.eupd_runs = ['gfs', 'gdas'] + elif eupd_run in ['gfs', 'gdas']: + self.eupd_runs = [eupd_run] def _get_app_configs(self): """ @@ -37,23 +38,26 @@ def _get_app_configs(self): configs = ['prep'] if self.do_jediatmvar: - configs += ['prepatmiodaobs', 'atmanlinit', 'atmanlrun', 'atmanlfinal'] + configs += ['prepatmiodaobs', 'atmanlinit', 'atmanlvar', 'atmanlfv3inc', 'atmanlfinal'] else: configs += ['anal', 'analdiag'] if self.do_jediocnvar: - configs += ['prepoceanobs', 'ocnanalprep', 'ocnanalbmat', - 'ocnanalrun', 'ocnanalchkpt', 'ocnanalpost', - 'ocnanalvrfy'] + configs += ['prepoceanobs', 'ocnanalprep', 'marinebmat', 'ocnanalrun'] + if self.do_hybvar: + configs += ['ocnanalecen'] + configs += ['ocnanalchkpt', 'ocnanalpost'] + if self.do_vrfy_oceanda: + configs += ['ocnanalvrfy'] - if self.do_ocean: - configs += ['ocnpost'] + if self.do_ocean or self.do_ice: + configs += ['oceanice_products'] configs += ['sfcanl', 'analcalc', 'fcst', 'upp', 'atmos_products', 'arch', 'cleanup'] if self.do_hybvar: if self.do_jediatmens: - configs += ['atmensanlinit', 'atmensanlrun', 'atmensanlfinal'] + configs += ['atmensanlinit', 'atmensanlletkf', 'atmensanlfv3inc', 'atmensanlfinal'] else: configs += ['eobs', 'eomg', 'ediag', 'eupd'] configs += ['ecen', 'esfc', 'efcs', 'echgres', 'epos', 'earc'] @@ -83,7 +87,9 @@ def _get_app_configs(self): configs += ['metp'] if self.do_gempak: - configs += ['gempak', 'npoess'] + configs += ['gempak'] + if self.do_goes: + configs += ['npoess'] if self.do_bufrsnd: configs += ['postsnd'] @@ -102,9 +108,11 @@ def _get_app_configs(self): if self.do_aero: configs += ['aeroanlinit', 'aeroanlrun', 'aeroanlfinal'] + if self.do_prep_obs_aero: + configs += ['prepobsaero'] - if self.do_jedilandda: - configs += ['preplandobs', 'landanl'] + if self.do_jedisnowda: + configs += ['prepsnowobs', 'snowanl'] if self.do_mos: configs += ['mos_stn_prep', 'mos_grd_prep', 'mos_ext_stn_prep', 'mos_ext_grd_prep', @@ -115,7 +123,7 @@ def _get_app_configs(self): return configs @staticmethod - def _update_base(base_in): + def update_base(base_in): return GFSCycledAppConfig.get_gfs_cyc_dates(base_in) @@ -130,23 +138,22 @@ def get_task_names(self): gdas_gfs_common_cleanup_tasks = ['arch', 'cleanup'] if self.do_jediatmvar: - gdas_gfs_common_tasks_before_fcst += ['prepatmiodaobs', 'atmanlinit', 'atmanlrun', 'atmanlfinal'] + gdas_gfs_common_tasks_before_fcst += ['prepatmiodaobs', 'atmanlinit', 'atmanlvar', 'atmanlfv3inc', 'atmanlfinal'] else: gdas_gfs_common_tasks_before_fcst += ['anal'] if self.do_jediocnvar: - gdas_gfs_common_tasks_before_fcst += ['prepoceanobs', 'ocnanalprep', - 'ocnanalbmat', 'ocnanalrun', - 'ocnanalchkpt', 'ocnanalpost', - 'ocnanalvrfy'] + gdas_gfs_common_tasks_before_fcst += ['prepoceanobs', 'ocnanalprep', 'marinebmat', 'ocnanalrun'] + if self.do_hybvar: + gdas_gfs_common_tasks_before_fcst += ['ocnanalecen'] + gdas_gfs_common_tasks_before_fcst += ['ocnanalchkpt', 'ocnanalpost'] + if self.do_vrfy_oceanda: + gdas_gfs_common_tasks_before_fcst += ['ocnanalvrfy'] gdas_gfs_common_tasks_before_fcst += ['sfcanl', 'analcalc'] - if self.do_aero: - gdas_gfs_common_tasks_before_fcst += ['aeroanlinit', 'aeroanlrun', 
'aeroanlfinal'] - - if self.do_jedilandda: - gdas_gfs_common_tasks_before_fcst += ['preplandobs', 'landanl'] + if self.do_jedisnowda: + gdas_gfs_common_tasks_before_fcst += ['prepsnowobs', 'snowanl'] wave_prep_tasks = ['waveinit', 'waveprep'] wave_bndpnt_tasks = ['wavepostbndpnt', 'wavepostbndpntbll'] @@ -156,7 +163,7 @@ def get_task_names(self): hybrid_after_eupd_tasks = [] if self.do_hybvar: if self.do_jediatmens: - hybrid_tasks += ['atmensanlinit', 'atmensanlrun', 'atmensanlfinal', 'echgres'] + hybrid_tasks += ['atmensanlinit', 'atmensanlletkf', 'atmensanlfv3inc', 'atmensanlfinal', 'echgres'] else: hybrid_tasks += ['eobs', 'eupd', 'echgres'] hybrid_tasks += ['ediag'] if self.lobsdiag_forenkf else ['eomg'] @@ -168,16 +175,21 @@ def get_task_names(self): if not self.do_jediatmvar: gdas_tasks += ['analdiag'] - if self.do_wave and 'gdas' in self.wave_cdumps: + if self.do_wave and 'gdas' in self.wave_runs: gdas_tasks += wave_prep_tasks + if self.do_aero and 'gdas' in self.aero_anl_runs: + gdas_tasks += ['aeroanlinit', 'aeroanlrun', 'aeroanlfinal'] + if self.do_prep_obs_aero: + gdas_tasks += ['prepobsaero'] + gdas_tasks += ['atmanlupp', 'atmanlprod', 'fcst'] if self.do_upp: gdas_tasks += ['atmupp'] - gdas_tasks += ['atmprod'] + gdas_tasks += ['atmos_prod'] - if self.do_wave and 'gdas' in self.wave_cdumps: + if self.do_wave and 'gdas' in self.wave_runs: if self.do_wave_bnd: gdas_tasks += wave_bndpnt_tasks gdas_tasks += wave_post_tasks @@ -202,14 +214,25 @@ def get_task_names(self): # Collect "gfs" cycle tasks gfs_tasks = gdas_gfs_common_tasks_before_fcst.copy() - if self.do_wave and 'gfs' in self.wave_cdumps: + if self.do_wave and 'gfs' in self.wave_runs: gfs_tasks += wave_prep_tasks + if self.do_aero and 'gfs' in self.aero_anl_runs: + gfs_tasks += ['aeroanlinit', 'aeroanlrun', 'aeroanlfinal'] + if self.do_prep_obs_aero: + gfs_tasks += ['prepobsaero'] + gfs_tasks += ['atmanlupp', 'atmanlprod', 'fcst'] + if self.do_ocean: + gfs_tasks += ['ocean_prod'] + + if self.do_ice: + gfs_tasks += ['ice_prod'] + if self.do_upp: gfs_tasks += ['atmupp'] - gfs_tasks += ['atmprod'] + gfs_tasks += ['atmos_prod'] if self.do_goes: gfs_tasks += ['goesupp'] @@ -229,7 +252,7 @@ def get_task_names(self): if self.do_metp: gfs_tasks += ['metp'] - if self.do_wave and 'gfs' in self.wave_cdumps: + if self.do_wave and 'gfs' in self.wave_runs: if self.do_wave_bnd: gfs_tasks += wave_bndpnt_tasks gfs_tasks += wave_post_tasks @@ -245,11 +268,12 @@ def get_task_names(self): gfs_tasks += ['gempak'] gfs_tasks += ['gempakmeta'] gfs_tasks += ['gempakncdcupapgif'] - gfs_tasks += ['npoess_pgrb2_0p5deg'] - gfs_tasks += ['gempakpgrb2spec'] + if self.do_goes: + gfs_tasks += ['npoess_pgrb2_0p5deg'] + gfs_tasks += ['gempakpgrb2spec'] if self.do_awips: - gfs_tasks += ['awips_20km_1p0deg', 'awips_g2', 'fbwind'] + gfs_tasks += ['awips_20km_1p0deg', 'fbwind'] if self.do_mos: gfs_tasks += ['mos_stn_prep', 'mos_grd_prep', 'mos_ext_stn_prep', 'mos_ext_grd_prep', @@ -262,15 +286,15 @@ def get_task_names(self): tasks = dict() tasks['gdas'] = gdas_tasks - if self.do_hybvar and 'gdas' in self.eupd_cdumps: + if self.do_hybvar and 'gdas' in self.eupd_runs: enkfgdas_tasks = hybrid_tasks + hybrid_after_eupd_tasks tasks['enkfgdas'] = enkfgdas_tasks - # Add CDUMP=gfs tasks if running early cycle + # Add RUN=gfs tasks if running early cycle if self.gfs_cyc > 0: tasks['gfs'] = gfs_tasks - if self.do_hybvar and 'gfs' in self.eupd_cdumps: + if self.do_hybvar and 'gfs' in self.eupd_runs: enkfgfs_tasks = hybrid_tasks + hybrid_after_eupd_tasks 
enkfgfs_tasks.remove("echgres") tasks['enkfgfs'] = enkfgfs_tasks @@ -321,9 +345,4 @@ def get_gfs_cyc_dates(base: Dict[str, Any]) -> Dict[str, Any]: base_out['EDATE_GFS'] = edate_gfs base_out['INTERVAL_GFS'] = interval_gfs - fhmax_gfs = {} - for hh in ['00', '06', '12', '18']: - fhmax_gfs[hh] = base.get(f'FHMAX_GFS_{hh}', base.get('FHMAX_GFS_00', 120)) - base_out['FHMAX_GFS'] = fhmax_gfs - return base_out diff --git a/workflow/applications/gfs_forecast_only.py b/workflow/applications/gfs_forecast_only.py index 1145863210..caa545d1e1 100644 --- a/workflow/applications/gfs_forecast_only.py +++ b/workflow/applications/gfs_forecast_only.py @@ -25,7 +25,8 @@ def _get_app_configs(self): configs += ['atmos_products'] if self.do_aero: - configs += ['aerosol_init'] + if not self._base['EXP_WARM_START']: + configs += ['aerosol_init'] if self.do_tracker: configs += ['tracker'] @@ -49,7 +50,7 @@ def _get_app_configs(self): configs += ['awips'] if self.do_ocean or self.do_ice: - configs += ['ocnpost'] + configs += ['oceanice_products'] if self.do_wave: configs += ['waveinit', 'waveprep', 'wavepostsbs', 'wavepostpnt'] @@ -69,11 +70,11 @@ def _get_app_configs(self): return configs @staticmethod - def _update_base(base_in): + def update_base(base_in): base_out = base_in.copy() base_out['INTERVAL_GFS'] = AppConfig.get_gfs_interval(base_in['gfs_cyc']) - base_out['CDUMP'] = 'gfs' + base_out['RUN'] = 'gfs' return base_out @@ -87,7 +88,10 @@ def get_task_names(self): tasks = ['stage_ic'] if self.do_aero: - tasks += ['aerosol_init'] + aero_fcst_run = self._base.get('AERO_FCST_RUN', 'BOTH').lower() + if self._base['RUN'] in aero_fcst_run or aero_fcst_run == "both": + if not self._base['EXP_WARM_START']: + tasks += ['aerosol_init'] if self.do_wave: tasks += ['waveinit'] @@ -100,7 +104,10 @@ def get_task_names(self): if self.do_upp: tasks += ['atmupp'] - tasks += ['atmprod'] + tasks += ['atmos_prod'] + + if self.do_goes: + tasks += ['goesupp'] if self.do_goes: tasks += ['goesupp'] @@ -124,10 +131,13 @@ def get_task_names(self): tasks += ['gempak', 'gempakmeta', 'gempakncdcupapgif', 'gempakpgrb2spec'] if self.do_awips: - tasks += ['awips_20km_1p0deg', 'awips_g2', 'fbwind'] + tasks += ['awips_20km_1p0deg', 'fbwind'] - if self.do_ocean or self.do_ice: - tasks += ['ocnpost'] + if self.do_ocean: + tasks += ['ocean_prod'] + + if self.do_ice: + tasks += ['ice_prod'] if self.do_wave: if self.do_wave_bnd: @@ -146,4 +156,4 @@ def get_task_names(self): tasks += ['arch', 'cleanup'] # arch and cleanup **must** be the last tasks - return {f"{self._base['CDUMP']}": tasks} + return {f"{self._base['RUN']}": tasks} diff --git a/workflow/create_experiment.py b/workflow/create_experiment.py index 7e0f350c0f..1317f7be28 100755 --- a/workflow/create_experiment.py +++ b/workflow/create_experiment.py @@ -11,6 +11,14 @@ The yaml file are simply the arguments for these two scripts. After this scripts runs the experiment is ready for launch. 
+Environmental variables +----------------------- + pslot + Name of the experiment + + RUNTESTS + Root directory where the test EXPDIR and COMROOT will be placed + Output ------ Functionally an experiment is setup as a result running the two scripts described above @@ -18,7 +26,6 @@ """ import os -import sys from argparse import ArgumentParser, ArgumentDefaultsHelpFormatter from pathlib import Path @@ -28,8 +35,6 @@ import setup_expt import setup_xml -from hosts import Host - _here = os.path.dirname(__file__) _top = os.path.abspath(os.path.join(os.path.abspath(_here), '..')) @@ -63,7 +68,9 @@ def input_args(): formatter_class=ArgumentDefaultsHelpFormatter) parser.add_argument( - '--yaml', help='full path to yaml file describing the experiment configuration', type=Path, required=True) + '-y', '--yaml', help='full path to yaml file describing the experiment configuration', type=Path, required=True) + parser.add_argument( + '-o', '--overwrite', help='overwrite previously created experiment', action="store_true", required=False) return parser.parse_args() @@ -77,18 +84,15 @@ def input_args(): data.update(os.environ) testconf = parse_j2yaml(path=user_inputs.yaml, data=data) - if 'skip_ci_on_hosts' in testconf: - host = Host() - if host.machine.lower() in [machine.lower() for machine in testconf.skip_ci_on_hosts]: - logger.info(f'Skipping creation of case: {testconf.arguments.pslot} on {host.machine.capitalize()}') - sys.exit(0) - # Create a list of arguments to setup_expt.py setup_expt_args = [testconf.experiment.system, testconf.experiment.mode] for kk, vv in testconf.arguments.items(): setup_expt_args.append(f"--{kk}") setup_expt_args.append(str(vv)) + if user_inputs.overwrite: + setup_expt_args.append("--overwrite") + logger.info(f"Call: setup_expt.main()") logger.debug(f"setup_expt.py {' '.join(setup_expt_args)}") setup_expt.main(setup_expt_args) diff --git a/workflow/gsl_template_hera.xml b/workflow/gsl_template_hera.xml index 6205d45ed4..f2ece44386 100644 --- a/workflow/gsl_template_hera.xml +++ b/workflow/gsl_template_hera.xml @@ -6,7 +6,7 @@ Main workflow manager for Global Forecast System NOTES: - This workflow was automatically generated at 2023-06-13 23:31:49.582810 + This workflow was automatically generated at 2024-09-05 15:37:41.961069 --> + @@ -25,7 +26,7 @@ &EXPDIR;/logs/@Y@m@d@H.log - 202401140000 202401140000 24:00:00 + 202409050000 202409050000 24:00:00 @@ -41,23 +42,25 @@ &ROTDIR;/logs/@Y@m@d@H/gfsinit.log - RUN_ENVIRemc - HOMEgfs&HOMEgfs; - EXPDIR&EXPDIR; - ROTDIR&ROTDIR; - ICSDIR&ICSDIR; - CASE&CASE; - COMPONENT&COMPONENT; - NETgfs - CDUMPgfs - RUNgfs - CDATE@Y@m@d@H - PDY@Y@m@d - cyc@H - COMROOT/scratch1/NCEPDEV/global/glopara/com - DATAROOT&ROTDIR;/../RUNDIRS/&PSLOT; - - + RUN_ENVIRemc + HOMEgfs&HOMEgfs; + EXPDIR&EXPDIR; + ROTDIR&ROTDIR; + ICSDIR&ICSDIR; + CASE&CASE; + COMPONENT&COMPONENT; + NETgfs + CDUMPgfs + RUNgfs + CDATE@Y@m@d@H + PDY@Y@m@d + cyc@H + COMROOT/scratch1/NCEPDEV/global/glopara/com + DATAROOT&ROTDIR;/../RUNDIRS/&PSLOT; + FHR3#fhr# + COMPONENTatmos + + &ROTDIR;/gfs.@Y@m@d/@H/model_data/atmos/input @@ -79,7 +82,7 @@ gsd-fv3 batch hera - 05:00:00 + 06:00:00 56:ppn=40:tpp=1 &NATIVE_STR; @@ -112,15 +115,16 @@ - _f000-f012 _f018-f030 _f036-f048 _f054-f066 _f072-f084 _f090-f102 _f108-f120 - f012 f030 f048 f066 f084 f102 f120 - f000_f006_f012 f018_f024_f030 f036_f042_f048 f054_f060_f066 f072_f078_f084 f090_f096_f102 f108_f114_f120 + + 000 006 012 018 024 030 036 042 048 054 060 066 072 078 084 090 096 102 108 114 120 - + &JOBS_DIR;/atmos_products.sh - 
&PSLOT;_gfsatmprod#grp#_@H + &PSLOT;_gfsatmprod_f#fhr#_@H gsd-fv3 batch hera @@ -128,31 +132,29 @@ 1:ppn=24:tpp=1 &NATIVE_STR; - &ROTDIR;/logs/@Y@m@d@H/gfsatmprod#grp#.log + &ROTDIR;/logs/@Y@m@d@H/gfsatmprod_f#fhr#.log RUN_ENVIRemc HOMEgfs&HOMEgfs; EXPDIR&EXPDIR; - ROTDIR&ROTDIR; NETgfs - CDUMPgfs RUNgfs CDATE@Y@m@d@H PDY@Y@m@d cyc@H COMROOT/scratch1/NCEPDEV/global/glopara/com DATAROOT&ROTDIR;/../RUNDIRS/&PSLOT; - FHRLST#lst# + FHR3#fhr# + COMPONENTatmos - &ROTDIR;/gfs.@Y@m@d/@H//model_data/atmos/master/gfs.t@Hz.master.grb2#dep# + &ROTDIR;/gfs.@Y@m@d/@H//model_data/atmos/master/gfs.t@Hz.master.grb2f#fhr# -
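
The ush/wave_*.sh hunks above consistently swap bare `ln -s` calls for the `${NLN}` wrapper. This follows the workflow's usual "force symlink" convention (the variable is defined centrally rather than in these scripts); a minimal stand-in, assuming that semantics:

    export NLN="/bin/ln -sf"   # assumed definition; the real one is set centrally by the workflow
    ${NLN} "${DATA}/mod_def.${WAVECUR_FID}" ./mod_def.ww3   # safe to repeat on a rerun

Because `-f` replaces an existing link, a restarted job no longer aborts on a leftover symlink from a previous attempt.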
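The wave_tar.sh hunk replaces the open-coded sleep/`expr` retry counter with a `wait_for_file` call plus a FATAL ERROR exit. A rough sketch of the helper's contract, assuming it polls for a non-empty file at a fixed interval and returns non-zero after the allowed tries (names and semantics inferred from the call site, not from the helper's source):

    wait_for_file() {
      local file=${1} sleep_interval=${2:-10} max_tries=${3:-100}
      local n=0
      while (( n < max_tries )); do
        [[ -s "${file}" ]] && return 0   # file present and non-empty
        sleep "${sleep_interval}"
        (( n += 1 ))
      done
      return 1                           # caller raises FATAL ERROR
    }

A call such as `wait_for_file "${filename}" "${sleep_interval}" "${countMAX}"` then bounds the wait, consistent with the `sleep_interval * (countMAX + 1)` figure quoted in the hunk's error message.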
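The versions/{build,run}.<machine>.ver changes all follow one pattern for the spack-stack 1.5.1 to 1.6.0 move: pin `spack_env` per machine, source the shared spack version file, then export the machine-specific `spack_mod_path`. A sketch of how a job would typically bring the stack up from these variables (the `module load` names are assumptions; only the exported variables come from the files above):

    source "${HOMEgfs}/versions/run.hera.ver"
    module use "${spack_mod_path}"
    module load "stack-intel/${stack_intel_ver}"
    module load "python/${python_ver}"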
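The workflow/applications/*.py hunks consistently retire CDUMP in favor of RUN: WAVE_CDUMP becomes WAVE_RUN, EUPD_CYC feeds `eupd_runs`, and the new AERO_ANL_RUN / AERO_FCST_RUN / DO_PREP_OBS_AERO switches gate the aerosol tasks per RUN. In config.base terms an experiment would now carry settings along these lines (values illustrative; note the hunk lowercases AERO_FCST_RUN unconditionally, so it should be defined whenever DO_AERO is on):

    export WAVE_RUN="BOTH"         # was WAVE_CDUMP; gfs, gdas, or BOTH
    export AERO_ANL_RUN="BOTH"     # which RUNs get aeroanlinit/aeroanlrun/aeroanlfinal
    export AERO_FCST_RUN="gfs"     # which RUNs run the aerosol forecast
    export DO_PREP_OBS_AERO="NO"   # adds the prepobsaero task when YES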
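With the argparse changes, workflow/create_experiment.py gains short options and an overwrite switch, and documents pslot / RUNTESTS as environment inputs. A typical invocation consistent with the hunk (the paths and case name are placeholders):

    export RUNTESTS="/path/to/RUNTESTS"   # root for the test EXPDIR and COMROOT
    export pslot="my_case"                # experiment name consumed by the yaml
    "${HOMEgfs}/workflow/create_experiment.py" -y my_case.yaml --overwrite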
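Finally, the gsl_template_hera.xml metatask moves from FHRLST batches (`_f000-f012` etc.) to one atmos_products task per forecast hour: the rocoto `#fhr#` token is exported as a zero-padded FHR3 and the master grib2 dependency is checked per hour. Conceptually, each generated task now reduces to (a sketch, not the literal rocoto output):

    # one task instance per hour; FHR3 replaces the old FHRLST grouping
    export FHR3="024" COMPONENT="atmos"
    "${JOBS_DIR}/atmos_products.sh"   # JOBS_DIR stands in for the &JOBS_DIR; entity in the template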