From 5f5727af6fa260cf9f816fd47b510e2ac94e730c Mon Sep 17 00:00:00 2001 From: "kayee.wong" Date: Thu, 11 Apr 2024 06:43:06 +0000 Subject: [PATCH] Merge to 2024 Apr 2 NOAA-EMC/develop (c54fe98) Squashed commit of the following: commit c54fe98c4fe8d811907366d4ba6ff16347bf174c Author: Walter Kolczynski - NOAA Date: Tue Apr 2 15:58:49 2024 -0400 Move bash utility functions out of preamble (#2447) The preamble was accumulating a bunch of utility functions. These functions are now moved to a separate file that is sourced by the preamble. The only functions remaining in the preamble are those related to script control and logging (`set_trace()`, `set_strict()`, `postamble()`). Resolves #2346 commit 4a39c8afc0555a8f2d621efb55589b9b309a416c Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Tue Apr 2 18:00:21 2024 +0000 Reenable the minimization monitor on Hera (#2446) This allows the minimization monitor to run on Hera Rocky 8. A missing perl module was added (List/MoreUtils.pm), but it had to be installed under the perl/5.38.0 installation, thus that module needs to be loaded. Resolves #2439 commit 0eaa53771b5e8d476d3b5feabd3181c8dc48629a Author: Travis Elless <113720457+TravisElless-NOAA@users.noreply.github.com> Date: Tue Apr 2 00:31:14 2024 -0400 Fix rotating member bugs (#2443) When PR #2427 introduced the rotating subset of member guess states for the early-cycle EnKF, the rotating member calculation function was omitted from the DOSFCANL_ENKF if block in the enkf surface script. This PR adds this feature to that section. This PR also removes hard-coded values from this function, replacing them with a variable equal to the number of late-cycle members. 
Resolves #2441 commit 39ba9d720c38ac85239a1eb1696c78df82396644 Author: CatherineThomas-NOAA <59020064+CatherineThomas-NOAA@users.noreply.github.com> Date: Mon Apr 1 17:48:42 2024 -0400 Remove the reset of upper layer humidity (#2449) # Description The parameter "nudge_qv" resets the upper layer humidity to HALOE climatology when cold starting. This parameter has been set to ".true." but is no longer needed when using v16+ ICs and will now be set to ".false." by default. Resolves: #2448 commit 7f6bf216566e92bbe072ebe4b64d26cc60fb53f1 Author: Jiarui Dong Date: Mon Apr 1 13:08:33 2024 -0400 Archive the snow DA analysis into HPSS (#2414) This PR adds the capability to archive the snow analysis output into HPSS. Changes are made to archive the snow stats, letkfoi yaml file, and snow analysis into HPSS. commit c1b11a2559e618f61866498b5dad503ba74d8332 Author: Neil Barton <103681022+NeilBarton-NOAA@users.noreply.github.com> Date: Mon Apr 1 13:05:57 2024 -0400 Add GEFS ENS Atmos options (#2392) This PR adds the FV3 atmos perturbation options when running GEFS. This is needed for GEFS reforecasts and GEFS operational forecasts. This PR continues to address the below issues: #1720 #1921 Co-authored-by: Rahul Mahajan Co-authored-by: Walter Kolczynski - NOAA Co-authored-by: Bing Fu <48262811+bingfu-NOAA@users.noreply.github.com> commit 834ce31348a627e14d448cdbe33d4ec0dabe99e4 Author: Walter Kolczynski - NOAA Date: Fri Mar 29 13:17:08 2024 -0400 Refactor gempak jobs for new COM and style (#2374) Updates the gempak jobs to fit the new COM structure while also refactoring them somewhat to improve the style. Despite these technical changes, the overall structure is left unchanged for most scripts, though some have been rewritten to make the needed changes easier. Some of these scripts had already been updated somewhat in the original COM refactor and thus needed fewer updates. 
Style updates include converting all gempak scripts to bash, making them shellcheck compliant, and removing trailing whitespace. Further refactoring to improve maintainability will be needed in the future (see #2341, #2342, #2343, #2348). The GFS gif scripts were identical except for the forecast hour, so they are collapsed down into two: one for f000 and one for other forecast hours. The gempak executables have short path limits. To get around this without having the gempak module recompiled, target directories (mostly relevant for the gempak meta jobs) are symlinked into the working directory to drastically reduce the path lengths. Part of this update includes replacing existing MPMD calls with the new standard `ush/run_mpmd.sh` script. A new function, `wait_for_file()`, is introduced to standardize waiting for a file to be available. Gempak forecast hours are often hard-coded within scripts. In addition to issues with maintainability, this causes problems for shorter forecasts, such as those we typically run for testing purposes. For now, we simply check the values against the forecast length and reduce them if necessary. Future work (#2348) will be needed to replace these hard-coded values with variables set in the config file (or just update gempak products to match standard output time variables). One-degree gempak files have been updated to include `1p00` in the filename. Several gempak job dependencies are corrected. Fake gempak data for external models is being staged on tier-1 machines to allow testing. **Output has not been verified.** Future PRs will likely be needed to bring full functionality online. Resolves #2158 Resolves #2152 Resolves #2151 Resolves #2249 Resolves #2247 Refs #2157 Refs #2348 commit 20635b0639656769842218d544ec7ce2436337c5 Author: TerrenceMcGuinness-NOAA Date: Fri Mar 29 08:15:29 2024 +0000 Turn GEFS CI test on for Hera (#2442) Re-enabling the gefs case to test gefs system builds in Jenkins on Hera. 
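The `wait_for_file()` helper introduced by the gempak refactor above could take the following shape. This is a hypothetical sketch: only the function name comes from the commit text, while the arguments, defaults, and polling logic are assumptions.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of a wait_for_file()-style polling helper.
# Arguments, defaults, and the non-empty check are assumptions.
wait_for_file() {
  local file=${1:?usage: wait_for_file <file> [sleep_int] [max_tries]}
  local sleep_int=${2:-10}   # seconds between checks
  local max_tries=${3:-30}   # give up after this many checks
  local try=0
  while (( try < max_tries )); do
    if [[ -s "${file}" ]]; then
      return 0               # file exists and is non-empty
    fi
    try=$(( try + 1 ))
    sleep "${sleep_int}"
  done
  echo "FATAL: ${file} still missing after $(( sleep_int * max_tries ))s" >&2
  return 1
}
```

A job script would then guard on the result, e.g. `wait_for_file "${some_gempak_input}" 20 60 || exit 1` (the variable name here is illustrative).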
Co-authored-by: TerrenceMcGuinness-NOAA Co-authored-by: terrance.mcguinness Co-authored-by: Walter.Kolczynski commit ba6a9d5fa6a079b1e3fdd424a493252bbf499c5d Author: Walter Kolczynski - NOAA Date: Thu Mar 28 17:34:22 2024 -0400 Modify APP based on RUN (#2413) There is a need to change which coupled components are on depending on the current `RUN`. To facilitate this, the `APP` is modified prior to the setting of the `DO_` variables based on `RUN`, turning off components as desired. This new system also replaces the `DO__ENKF` switches that were formerly used to turn components off for the ensemble. It also expands the allowed apps for cycled mode to include S2SWA. Resolves #2318 commit 3ff7a92c25564ddf984cb09cb5667ae8fafe01a0 Author: TerrenceMcGuinness-NOAA Date: Thu Mar 28 17:48:00 2024 +0000 Fix post log arg check and don't create build semaphore (#2440) Two hotfixes to the latest Jenkins updates: 1. Logic fixed in checking for mutually exclusive use of gists and repo for publishing error files 2. Removed creation of the build success file semaphore, forcing complete rebuild as the default behavior for rerun/restarts Co-authored-by: TerrenceMcGuinness-NOAA Co-authored-by: terrance.mcguinness commit d6be3b5c3a1b8fd025a303b40e0660e2914906a7 Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Wed Mar 27 20:56:18 2024 -0600 Update global-workflow and subcomponents to Hera/Rocky 8 partition (#2421) This PR addresses issue #2329. 
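The APP-modification scheme from #2413 above (trim components from `APP` per `RUN` before deriving the `DO_` switches) can be illustrated with a hedged sketch. The function, the component letters removed, and the `RUN` patterns below are illustrative assumptions, not the actual config code.

```shell
#!/usr/bin/env bash
# Illustrative sketch (not the actual config logic): trim coupled
# components from APP depending on RUN, before DO_ switches are set.
set_app_by_run() {
  local run=$1 app=$2
  case "${run}" in
    enkf*)               # e.g. drop waves (W) and aerosols (A) for the ensemble
      app="${app/W/}"
      app="${app/A/}"
      ;;
    *)                   # other RUNs keep the full APP
      ;;
  esac
  echo "${app}"
}

# DO_ switches would then be derived from the (possibly reduced) APP:
app_has() { [[ "$2" == *"$1"* ]]; }
```

For example, an `S2SWA` app reduced for `enkfgdas` would yield `S2S`, so `DO_WAVE`/`DO_AERO` style switches derived afterwards would come out `NO` for the ensemble.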
The following is accomplished: - All submodule RDHPCS Hera stacks are updated to be compatible with the Rocky-8 distro spack-stack; - The global-workflow version files `versions/build.hera.ver` and `versions/run.hera.ver` are updated for Rocky-8; - All submodule hashes have been updated to be compliant with the Rocky-8 distro spack-stack (see the reference PRs below); - Update to `parm/config/config.base` is made for not yet compliant packages; - Relevant updates are made to `modulefiles/module_base.hera.lua` and `modulefiles/module_gwsetup.lua`. Resolves #2329 Refs: [#958](https://github.com/NOAA-EMC/GDASApp/issues/958) [#49](https://github.com/NOAA-EMC/gfs-utils/issues/49) [#124](https://github.com/NOAA-EMC/GSI-Monitor/issues/124) [#31](https://github.com/NOAA-EMC/GSI-utils/issues/31) [#2167](https://github.com/ufs-community/ufs-weather-model/issues/2167) [#2143](https://github.com/ufs-community/ufs-weather-model/issues/2143) [#913](https://github.com/ufs-community/UFS_UTILS/issues/913) Co-authored-by: Rahul Mahajan Co-authored-by: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> commit 47302153f13f6b23539be841b78ed78664599c08 Author: Travis Elless <113720457+TravisElless-NOAA@users.noreply.github.com> Date: Wed Mar 27 21:14:57 2024 -0400 Add a rotating subset of members for early-cycle enkf (#2427) The early-cycle EnKF needs the ability to run with fewer members than the late-cycle due to operational resource constraints. Because of this requirement, the introduction of a rotating subset of member first-guess states used by the early-cycle ensemble is also needed in order to preserve the rotating member initial condition functionality currently used by the GEFS. 
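The rotating-subset mechanism described for the early-cycle EnKF in #2427 above can be illustrated with modular arithmetic. This is a hypothetical sketch only: the function name, arguments, and the exact rotation formula are all assumptions, not the actual implementation.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of a rotating member subset: each early cycle
# uses nmem_early of the nmem_late first-guess members, with the
# starting member advancing every cycle. Names and formula assumed.
rotating_members() {
  local cycle_index=$1 nmem_early=$2 nmem_late=$3
  local offset=$(( (cycle_index * nmem_early) % nmem_late ))
  local i
  for (( i = 0; i < nmem_early; i++ )); do
    printf 'mem%03d\n' $(( (offset + i) % nmem_late + 1 ))
  done
}
```

With 2 early members drawn from 5 late members, cycle 0 would use mem001-mem002, cycle 1 mem003-mem004, and so on, wrapping around.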
Co-authored-by: Travis J Elless Co-authored-by: travis elless commit 94c282ef6fdcd47076e932bcadb5bdd55236aa05 Author: TerrenceMcGuinness-NOAA Date: Wed Mar 27 20:28:15 2024 +0000 Uploading error logs to GitHub from Jenkins CI Runs (#2429) This PR enhances the user experience within GitHub when errors occur during the building and running of CI cases by reporting them from within the PR messages. This is done by uploading the error logs to GitHub Gists and then publishing the links to them along with the full paths of the logs on disk. This PR adds the python class **GitHubPR** in `${HOMEgfs}/ci/scripts/utils/githubpr.py` by inheriting the GitHub class from **pyGitHub**. We use this module to introduce a helper python utility that can publish a list of log files into a GitHub Gist and/or the designated branch **error_logs** in the **emcbot** repo **ci-global-workflows** for storing error log files for review from any git-configured terminal. This upload feature will also create persistence of errors over time. Also, the `build_all.sh -k` script has been updated to support a "quick kill" feature (thanks David) that stops the parallel builds whenever one fails and creates an error_log file containing the paths to the error files, which are also uploaded and published in the PR messages in GitHub. Co-authored-by: TerrenceMcGuinness-NOAA Co-authored-by: terrance.mcguinness Co-authored-by: DavidHuber Co-authored-by: Walter Kolczynski - NOAA commit 6c5065e2e83a45b14505e7575aa4500482ef7452 Author: Rahul Mahajan Date: Wed Mar 27 03:39:16 2024 -0400 Add option to use traditional threading in the UFS (#2384) # Description This PR: - adds the option of running the ufs-weather-model with traditional threading in addition to ESMF-managed threading. See the new toggle `USE_ESMF_THREADING=YES|NO` set in `config.fcst` - does not change the current default of using ESMF-managed threading. Traditional threading use might need a little more fine-tuning for the job-card specification. 
This will be achieved when the UFSWM RT completely switches over to traditional threading - updates the hash of the ufs-weather-model to the PR https://github.com/ufs-community/ufs-weather-model/pull/2172 Resolves #2277 In addition to the above-stated objectives, this PR also addresses open issues. In particular, this PR: - adds a newline at the end of `diag_table_aod`. Fixes #2407 @zhanglikate - reserves more memory on WCOSS2 for offline UPP when running at C768. Fixes #2408 @WenMeng-NOAA Co-authored-by: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> commit bc1c46dfd7393c5164abcdc2dfa76a9c4bc834b8 Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Tue Mar 26 22:27:30 2024 -0400 Correct GDASApp paths (#2435) The changes in this PR - account for changes in GDASApp directory structure - generalize how the path to the GDASApp python ioda library is specified Resolves #2434 commit f0b912be6f2cf2fac590272253f19cb082fbf5f2 Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Mon Mar 25 21:46:32 2024 +0000 Fix *earc jobs where the number of members isn't a multiple of 10 (#2424) This limits the earc search for ensemble members to the maximum number of members, which prevents attempting to send non-existent members to HPSS if the number of ensemble members is not a multiple of 10. Resolves #2390 commit daeb0c855017f8ffd6f06870744b825b276097f3 Author: TerrenceMcGuinness-NOAA Date: Fri Mar 22 17:40:04 2024 +0000 hotfix to update full path to error logs on CI case fail (#2425) This hotfix PR prepends the full path to the error logs on disk so it is communicated correctly in the GitHub message to the PR being processed when there is a failure from a case. 
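The `USE_ESMF_THREADING=YES|NO` toggle from #2384 above suggests two ways of shaping the launcher invocation. The toggle name comes from the commit text; the function, the scheduler-style options, and the task/thread arithmetic below are illustrative assumptions.

```shell
#!/usr/bin/env bash
# Illustrative sketch of how the threading toggle could shape a
# launcher line; option names and math here are assumptions.
ufs_launch_args() {
  local use_esmf=$1 ranks=$2 threads=$3
  if [[ "${use_esmf}" == "YES" ]]; then
    # ESMF-managed threading: launch ranks*threads MPI tasks with one
    # cpu each and let ESMF regroup them into threaded teams internally.
    echo "--ntasks=$(( ranks * threads )) --cpus-per-task=1"
  else
    # Traditional threading: launch ranks tasks, each with OMP threads.
    echo "--ntasks=${ranks} --cpus-per-task=${threads}"
  fi
}
```

This is one reason the commit notes the job-card specification may need more fine-tuning for traditional threading: the task and cpu counts on the job card differ between the two modes.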
commit 50f75526549245f2b5d984cdb44e402852e086ec Author: YaliMao-NOAA <53870326+YaliMao-NOAA@users.noreply.github.com> Date: Thu Mar 21 16:58:25 2024 +0000 Add WAFS jobs, scripts and ush to GFS v17 workflow repository (#2412) # Description This PR adds WAFS jobs, scripts and ush to GFS v17 workflow repository --------- Co-authored-by: yali mao Co-authored-by: Rahul Mahajan Co-authored-by: yali mao commit 03ba78ae3df589211d2776254c6e8584ecdc226f Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Thu Mar 21 13:33:50 2024 +0000 Hotfix: send the correct number of build jobs for the UFS (#2423) This fixes a bug in build_ufs.sh that was causing the UFS to always build with 8 jobs (except on the cloud). commit 4d1bf5266f00b35778ea47896f438e3ef612628d Author: Kate Friedman Date: Wed Mar 20 14:00:31 2024 -0400 Updates to RTD documentation (#2418) Updates to the RTD documentation include: - Textual updates - Better definition of GDA subfolder structure - Added note about GDAur having been discontinued - Adjust copyright and author information - Fix Git version table and update its contents - Add "Table of Contents" header before table of contents on front page - Add AWS ICs path to init page - Add link to UFS_UTILS gdas_init RTD documentation - Add note about bash to `gw_setup.sh` section and added a warning block Refs #2395 commit afe874ee8b28942e459796cd1005ec598458a5b7 Author: Neil Barton <103681022+NeilBarton-NOAA@users.noreply.github.com> Date: Tue Mar 19 14:35:02 2024 -0400 Add GEFS Ocean Perturbation Options (#2385) This PR adds the MOM6 ocean perturbation options when running GEFS. This is needed for GEFS reforecasts and GEFS operational forecasts. 
This PR continues to address the below issues https://github.com/NOAA-EMC/global-workflow/issues/1720 https://github.com/NOAA-EMC/global-workflow/issues/1921 Fixes #2403 commit fa855baa851b0cb635edd1b9ae1bfed5112d41e5 Author: Clara Draper <33430543+ClaraDraper-NOAA@users.noreply.github.com> Date: Mon Mar 18 12:49:14 2024 -0600 Add initial GSI-based soil analysis capability (#2263) First set of changes for adding the new soil analysis, from the assimilation of screen-level T and q. The changes here enable the screen-level observations to be assimilated in the Hybrid (Var and EnKF) update, and the soil temperature and soil moisture updates to be made in the EnKF only. The functionality is turned on by setting GSI_SOILANAL to YES in config.base. Resolves #1479 commit e9700d84b521907ee23e1584712f80e25e60f08e Author: Jessica Meixner Date: Mon Mar 18 09:32:13 2024 -0400 re-enable ci/cases/pr/C48mx500_3DVarAOWCDA.yaml (#2405) Updates to re-enable C48mx500_3DVarAOWCDA CI test after it was disabled in https://github.com/NOAA-EMC/global-workflow/pull/2371 Fixes github.com/NOAA-EMC/global-workflow/issues/2404 commit 3ccffeee120340ab580fc9d96b552970c9f42a8f Author: Rahul Mahajan Date: Mon Mar 18 09:30:35 2024 -0400 Parse jediyaml only once (#2387) `JEDIYAML` was being parsed 3 times; once in `get_obs_dict`, second in `get_bias_dict` and a third time in `initialize` for the specific component analysis task. This PR: - eliminates the duplications and constructs the `jedi_config` dictionary just once. The dictionary is written out before calling the executable. - updates hash to gdasapp - updates configs for snow, aerosol, atmvar and atmens JEDI-DA to include `JEDI_FIX_YAML` and `CRTM_FIX_YAML` . This allows greater flexibility and control over the contents of these fix data sets to be copied into the run directory. 
- Combines snowDA and aerosolDA into a single test Co-authored-by: Cory Martin Co-authored-by: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> commit a5f24951c5bef142747c6f6cc6abd474f0b53ac2 Author: CatherineThomas-NOAA <59020064+CatherineThomas-NOAA@users.noreply.github.com> Date: Fri Mar 15 14:47:54 2024 -0400 Fix ensemble archive groups to include all members (#2402) The number of groups used in the ensemble archive step (earc) needs to include a task for the ensemble stat files such as the mean and the spread, resulting in `n_groups+1` tasks for `earc`. Resolves: #2390 commit 056cfdca9e7fd7426a315fcbffc38d8ee2891212 Author: TerrenceMcGuinness-NOAA Date: Fri Mar 15 18:30:23 2024 +0000 GitHub message error paths (#2401) Adds a feature to message the paths to the error logs of failed experiments to the GitHub PR from Jenkins. commit d897ee4936d62160811d936248c8555187f81b65 Author: Rahul Mahajan Date: Thu Mar 14 12:37:49 2024 -0400 Missed a comma from the hotfix this AM (#2399) This PR is a hotfix to the hotfix from earlier this AM. A comma was missing. commit 906540acacf1b2ce4c0489d0d9d4913f53a4e8ad Author: Rahul Mahajan Date: Thu Mar 14 10:40:03 2024 -0400 Fix KeyError issue in ocean/ice postprocessing job. (#2398) The jjob for ocean and ice pp only defines the component-specific history and grib directory. This causes an error in the exscript when trying to pull keys for both ocean and ice. This PR fixes this. Surprisingly, this has not caused failures before today. Co-authored-by: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> commit c27f243ce24c60261718f15628f6ab30d7b09f7b Author: Rahul Mahajan Date: Wed Mar 13 13:31:57 2024 -0400 Remove documentation about generating ICs using global-workflow (#2397) Removes instructions on generating ICs using global-workflow and directs the user to use ufs-utils. 
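The earc task-count arithmetic described in #2402 above (members archived in groups, plus one task for the ensemble stat files) together with the member cap from #2424 can be sketched as follows; the function and variable names are assumptions.

```shell
#!/usr/bin/env bash
# Sketch of the earc grouping arithmetic (names assumed): members are
# archived in groups of group_size, the group count is derived from
# the actual member count (so no non-existent members are sent to
# HPSS), and one extra task covers the ensemble mean/spread files.
earc_groups() {
  local nmem=$1 group_size=${2:-10}
  local n_groups=$(( (nmem + group_size - 1) / group_size ))  # ceiling division
  echo $(( n_groups + 1 ))   # +1 task for ensemble stat files
}
```

With 80 members this yields 9 tasks; with 25 members (not a multiple of 10) it yields 4 tasks rather than scanning past member 25.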
commit 0edbdc197441582f9c402da0f3a7f144b88960c7 Author: DavidNew-NOAA <134300700+DavidNew-NOAA@users.noreply.github.com> Date: Tue Mar 12 16:30:45 2024 -0400 Changed config.atmanl to allow non-hybrid background error yamls (#2394) # Description Makes change so that if DOHYBVAR equals "NO", then the JEDI background error yaml is set to staticb_${STATICB_TYPE}.yaml.j2 rather than hybvar_${STATICB_TYPE}.yaml.j2. This allows GDAS to run without hybvar, which may be necessary for development purposes. This is all accomplished by a simple switch in config.atmanl. commit ccb1f528489e740bb2adc4958146552323dd8709 Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Tue Mar 12 10:45:39 2024 -0400 Add JEDI atmosphere only CI (#2357) The PR contains a minimal set of changes to enable JEDI atmospheric DA CI testing. Prototype JEDI atmospheric cycling has begun. The JEDI atmosphere DA CI case provides an automated way to see if future PRs impact JEDI atmospheric cycling. Resolves #2294 Dependency: GDASApp PR [#937](https://github.com/NOAA-EMC/GDASApp/pull/937) commit b96f5ebbb1968bd539336652b87a2faa8ce68fd4 Author: Kate Friedman Date: Tue Mar 12 07:58:40 2024 -0400 Add switch to control `debug=true` on WCOSS2 for development testing (#2388) Adds a switch (`DEBUG_POSTSCRIPT`) to control whether `debug=true` is set when submitting development rocoto jobs to PBS schedulers (currently just WCOSS2). There isn't an equivalent flag to set for SLURM on the RDHPCS. Have added this new switch to documentation. 
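The `config.atmanl` switch from #2394 above amounts to a one-line selection between the two yaml names. A sketch, where the `DOHYBVAR` and `STATICB_TYPE` names come from the commit text but the function wrapper (and the staticb type value used in testing) are illustrative:

```shell
#!/usr/bin/env bash
# Sketch of the background-error yaml selection; DOHYBVAR and
# STATICB_TYPE are from the commit text, the wrapper is assumed.
select_berror_yaml() {
  local dohybvar=$1 staticb_type=$2
  if [[ "${dohybvar}" == "YES" ]]; then
    echo "hybvar_${staticb_type}.yaml.j2"
  else
    echo "staticb_${staticb_type}.yaml.j2"   # non-hybrid development runs
  fi
}
```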
Refs #619 commit 02d650500353663d0b193ef14003897daa5dd86c Author: TerrenceMcGuinness-NOAA Date: Mon Mar 11 21:51:16 2024 +0000 Rewrote pr_list_database.py to use wxflow's SQLiteDB Class (#2376) This PR updates the `pr_list_database.py` code to use the **wxflow** SQLiteDB class: - Improved code readability - Uses better code style matching the project's software culture - Better docstring standards Co-authored-by: tmcguinness Co-authored-by: Rahul Mahajan Co-authored-by: Walter Kolczynski - NOAA commit a3374607d01fbdabbec0660afb82b5eb3677b4af Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Fri Mar 8 16:08:42 2024 -0500 Return ocnanalrun npes resource setting back to previous value (#2386) Variable `npes` in the `ocnanalrun` entry of `config.resources` was erroneously changed in https://github.com/NOAA-EMC/global-workflow/pull/2299 and this PR changes it back. Resolves https://github.com/NOAA-EMC/GDASApp/issues/962 commit d7e9bde84aebe922039589bd2bcd65832c1074eb Author: BoCui-NOAA <53531984+BoCui-NOAA@users.noreply.github.com> Date: Thu Mar 7 16:14:28 2024 -0500 Add new BUFR table file parm/product/bufr_ij9km.txt for GFSv17 C1152 (#2383) This PR adds a new table file parm/product/bufr_ij9km.txt, and modifies ush/gfs_bufr.sh to choose between the different bufr table files based on the GFSv17 run resolution, i.e. use file bufr_ij9km.txt for C1152 or bufr_ij13km.txt for C768. 
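The resolution-based table selection described in #2383 above is a simple branch. The two file names come from the commit text; the function wrapper and the resolution test are assumptions about how `ush/gfs_bufr.sh` might express it.

```shell
#!/usr/bin/env bash
# Sketch of choosing the bufr table by run resolution; file names are
# from the commit, the wrapper and test are assumptions.
select_bufr_table() {
  local case_res=$1
  if [[ "${case_res}" == "C1152" ]]; then
    echo "bufr_ij9km.txt"
  else
    echo "bufr_ij13km.txt"   # e.g. C768
  fi
}
```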
Resolves #2382 commit 4a525bef3bbed2ea60a71f71b3740c82df125c36 Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Thu Mar 7 16:12:31 2024 -0500 Add global-workflow infrastructure for ocean analysis recentering task (#2299) Adds the jjob, rocoto script, config file, and basic `config.resources` entry for the ocean analysis recentering task. This PR is a dependency for further work on the associated issue within global-workflow and GDASApp Refs https://github.com/NOAA-EMC/GDASApp/issues/912 commit f83d17a937006add55241ed453e42f4fcbae50aa Author: Kate Friedman Date: Thu Mar 7 09:29:14 2024 -0500 Clean out non-gfs top level variables (#2366) Clean out non-gfs top level variables that are duplicates or no longer needed. Also standardize how we set these variables in scripts. Refs #2332 commit c7b306e052497aef0022cd53550a168d2c5b6e5b Author: Guillaume Vernieres Date: Wed Mar 6 16:16:08 2024 -0500 Forgotten templated DO_VRFY_OCEANDA (#2379) # Description This PR adds the possibility to use the switch for the ocean and sea-ice DA verify task from the yaml configuration. - fixes [GDASApp/issues/954](https://github.com/NOAA-EMC/GDASApp/issues/954) commit ba6a4fdf6e245b57530f2b20e6f0ccf567115720 Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Tue Mar 5 19:09:48 2024 +0000 Add Hercules support for the GSI monitor (#2373) # Description This updates the GSI monitor hash and updates the modulefiles to add support for the monitor on Hercules. commit 732a874a2c6793296f136afb23545fab9869b181 Author: Rahul Mahajan Date: Mon Mar 4 16:04:26 2024 -0500 Reformat snowDA templates to jinja2 (#2371) # Description This PR: - replaces use of non-jinja2 templates in the yaml templates, specifically `$( )`, in favor of pure jinja2. - uses jinja2's built-in capability to include templates within templates, thereby allowing a completely rendered template to be assembled before passing it to, e.g., a yaml loader. 
- requires updates to `wxflow` and `gdasapp` - Changes in `wxflow` in `parse_j2yaml` are **not** backwards compatible Additionally, this PR: - renames `config.base.emc.dyn` to `config.base`. Resolves #2347 commit d1fa41106e991556606b0f62a15bf45f469f4f79 Author: TerrenceMcGuinness-NOAA Date: Sat Mar 2 04:54:26 2024 +0000 Reduce Jenkins messaging to GitHub (#2370) This PR updates the Jenkins Pipeline code with safeguards against the errors caused when Jenkins fails to authenticate with GitHub to message or update a label. This was achieved simply by: - Reducing the number of messages sent to the GitHub PR - Putting try blocks around most of the update label calls Co-authored-by: tmcguinness commit 52fa3cb32d8b50e47f391b82ea8901435fc88aff Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Fri Mar 1 12:36:27 2024 -0700 Adding debug option for all build scripts (#2326) This PR addresses issue #300 by allowing builds in `debug` mode. Co-authored-by: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Co-authored-by: Rahul Mahajan commit 91738cbf871d8cdce46912e2c11e304d567a2aae Author: DWesl <22566757+DWesl@users.noreply.github.com> Date: Fri Mar 1 13:46:42 2024 -0500 Sort list of coupler restart files for restart time determination (#2360) The loop in the following conditional seems to assume the list is sorted, so make that explicit in the array construction. commit 23c25527ad2a62275cd9105bd103b8520a28e573 Author: Neil Barton <103681022+NeilBarton-NOAA@users.noreply.github.com> Date: Fri Mar 1 13:45:39 2024 -0500 Update stage IC to handle ocean perturbations (#2364) This PR adds the option to stage ocean perturbation files for ensemble forecasts. These perturbation files are used in GEFS forecasts. A new variable is introduced in config.base to use the ocean perturbation files. This PR does not include using these perturbation files. A future PR will address this. 
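The explicit sorting described in #2360 above (building the coupler restart file list in a guaranteed order before looping over it) can be sketched as below; the directory layout and file pattern are illustrative assumptions.

```shell
#!/usr/bin/env bash
# Sketch of building an explicitly sorted list of coupler restart
# files; the pattern '*.cpl.r.*' and directory argument are assumed.
sorted_restarts() {
  local dir=$1
  # find's output order is unspecified, so pipe through sort to make
  # the ordering the downstream loop relies on explicit.
  find "${dir}" -maxdepth 1 -name '*.cpl.r.*' | sort
}
```

A loop that picks "the latest restart time" from such a list is then correct by construction rather than by accident of filesystem ordering.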
commit 8efe05f475b81e7cf6376745d5f1ce31987cb4eb Author: Jessica Meixner Date: Fri Mar 1 07:41:56 2024 -0500 Turn on C48mx500_3DVarAOWCDA test on hera (#2363) This PR activates the C48mx500_3DVarAOWCDA test on hera. This required an update of the gdas app. commit 5166593945e9ecc04dfa3409752576c08797d09f Author: TerrenceMcGuinness-NOAA Date: Thu Feb 29 20:14:41 2024 +0000 Move Jenkinsfile into ci subdirectory (#2355) Just moves the Jenkinsfile into the ci directory Co-authored-by: tmcguinness commit b7af315bb9dea77b37c6d030b71060b87bedf33e Author: Walter Kolczynski - NOAA Date: Wed Feb 28 22:20:58 2024 +0000 Fix rocoto forecast hour determination for GEFS (#2351) The function that generates the list of forecast hours for rocoto was trying to use variables that are not defined for GEFS, causing workflow generation to fail. The function is updated to not try to load these variables before loading the ones actually used for GFS/GEFS. Also turns the GEFS CI test back on and adds an entry to stage C192 ICs (note: these have not been placed in the centralized location). commit d3a49271b6c3816a9feeb7f6fb474797bacf1d7e Author: Cory Martin Date: Wed Feb 28 09:38:49 2024 -0500 Rename the land DA jobs to snow DA to better reflect what they are doing (#2330) This PR renames all of the land DA jobs to snow DA to better reflect that this is a JEDI-based snow analysis capability and not a more generic land surface analysis. commit 2693810d6ea9d9b20090777ff3a98e3d072c76d7 Author: Jessica Meixner Date: Tue Feb 27 01:25:51 2024 -0500 Update ufs-weather-model hash (#2338) Routine update of the ufs-weather-model hash. Other small updates: * removes comment referencing closed issue. 
* Updates the CICE diag frequency to once per day as recommended here: https://github.com/NOAA-EMC/global-workflow/issues/1810#issuecomment-1686278925 * Updated the amount of time for the C384 gdas forecast as it was running out of time * Removed unused variable wave_sys_ver commit 9608852784871ebf03d92b53bde891b6dcab8684 Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Mon Feb 26 14:10:01 2024 -0500 Update JEDI ATM to use .nc for obs and generalize x,y layout (#2336) # Description The changes in this PR are twofold: 1. replace the `.nc4` suffix for JEDI ATM observation related files with `.nc` 2. use templated variables to specify `{layout_x, layout_y}` for JEDI ATM variational and local ensemble apps The first change conforms with the Unidata recommendation that netCDF files end with the suffix `.nc`. The second change replaces hardwired JEDI ATM var and ens `{layout_x, layout_y}` in `config.resources` with a more flexible approach. Resolves #2335 commit c5c84660f10f0ef9ce939231b2f7fda498b39a29 Author: Kate Friedman Date: Mon Feb 26 10:18:50 2024 -0500 Remove FIX* variables for fix subfolders (#2337) Remove `FIX*` variables for fix subfolders and replace them with the remaining `FIXgfs` variable and the subfolder name (e.g. `${FIXam}` -> `${FIXgfs}/am`). The UFS_UTILS and GDASApp repos were similarly updated. This PR includes a new UFS_UTILS hash. The updated GDASApp hash was already committed within the spack-stack/1.6.0 PR #2239. Resolves #2184 commit 950c38a093c6a4e2b67e18c76390280d8bfbaef7 Author: TerrenceMcGuinness-NOAA Date: Fri Feb 23 21:15:46 2024 +0000 Fix several Jenkins issues (#2334) Jenkins updates resolving final kinks: - Removed all `git` shell commands and now exclusively use the Source Control Management (**scm**) plugin. - Add feature for skipping hosts per configuration specified in case yaml files. - Solved and tested false positive builds and experiments. 
- Tested archiving of task error log on case fail - First case fail quits pipeline and cancels all pending scheduled jobs - Dual builds per yaml configuration arguments supported - All designated case files in PR directory pass on intended host (fully tested on Hera) Remaining updates: - First build fail short-circuit when building sub-modules and archiving the build error log. - Re-build/no-build built-in logic for Replay and Rerunning previously failed experiments. commit c67393a203285792b852da0d83fd10fa47155669 Merge: 6f9afff07 79d305e8c Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Fri Feb 23 14:18:15 2024 +0000 Merge pull request #2239 from DavidHuber-NOAA/ss160 Update to spack-stack 1.6.0 Includes all submodules except the UFS, which will be updated at a later time. commit 79d305e8cbc339208ea6fe0475ddc56af94a285b Author: David Huber Date: Thu Feb 22 14:53:57 2024 -0600 Disable snow DA tests commit 0459203e97211b041520b39e93a188951396ba33 Author: David Huber Date: Thu Feb 22 12:55:51 2024 -0600 Update GDASApp hash to current develop commit 5c96eb2272fe6117f9e4a4c0c790db58e4870d46 Author: David Huber Date: Thu Feb 22 12:37:47 2024 -0600 Update GDAS hash to allow modified snow DA analysis commit 79144f2403a33d08140577cc3e8a61d6c4924403 Merge: abbb0b8a7 6f9afff07 Author: DavidHuber Date: Thu Feb 22 15:14:39 2024 +0000 Merge remote-tracking branch 'origin/develop' into ss160 commit abbb0b8a76b39d34433802fd4e3d7973fb9f0a39 Merge: 4ad837ea1 4529e8cf3 Author: DavidHuber Date: Thu Feb 22 15:11:03 2024 +0000 Merge remote-tracking branch 'henry/feature/gwdev_issue_2129' into ss160 commit 4ad837ea157b98d2bd173cb696c4d30d142e5540 Merge: 516b2a270 7ca45db8f Author: DavidHuber Date: Thu Feb 22 15:10:24 2024 +0000 Merge remote-tracking branch 'henry/feature/gfsv17_issue_2125' into ss160 commit 516b2a270234bdab5714ef2705c83dd3835b134d Author: DavidHuber Date: Thu Feb 22 14:19:16 2024 +0000 Updated GDAS to include rocoto/1.3.6 on Hera. 
commit 6f9afff073dd589096f992a3448fb7f0e62c9804 Author: Jessica Meixner Date: Wed Feb 21 13:09:44 2024 -0500 Add some flexibility for ocean/ice output (#2327) This adds functionality to have ocean/ice output frequency be separate from the atm model. One time was created because there's an assumption in the post that these are the same. This could be further modified to remove this assumption. Refs #1629 commit f6d3015ab9f9fd23ce0081baa2edbc5d9f5f3e16 Author: David Huber Date: Tue Feb 20 09:53:08 2024 -0600 Update GDASApp hash to include SS/1.6.0 support. commit 0bf340bce2f865c582e5305f0c5984cd3affe74e Author: David Huber Date: Tue Feb 20 09:07:32 2024 -0600 Construct SS paths from version variables. commit 3330cd7310bde8090b68720c225656074676d6b2 Author: David Huber Date: Tue Feb 20 09:06:19 2024 -0600 Removed MET/METplus 'not available' comments commit fdc638ca1616b347912caec77e5abc2d3e6f18af Author: David Huber Date: Tue Feb 20 08:35:15 2024 -0600 Move SS module path to version files. commit d7d28a6b65b84c8d821abd2c13d0c068bd5ad6d8 Author: David Huber Date: Tue Feb 20 07:53:15 2024 -0600 Update comment about METplus support. commit a812f88af3fc043d57494840e81a7527723858e4 Author: David Huber Date: Tue Feb 20 07:45:45 2024 -0600 Update verif-global to latest WCOSS2 support. commit d81f07fbf53666a37ab01bd463152e10252869ae Author: David Huber Date: Tue Feb 20 07:41:38 2024 -0600 Clean up build_upp.sh. -Corrected whitespace (tabs instead of spaces) -Removed debug print statement -Alphabetized flags commit ae7eb194cbe2213e564807dd7bc03a28d493eff2 Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Fri Feb 16 16:15:40 2024 -0500 Fix whitespace in build_upp.sh. Co-authored-by: Rahul Mahajan commit 48b34d0f388a398bb91f7c3b1e5f8338f4beb7b9 Author: David Huber Date: Fri Feb 16 20:56:55 2024 +0000 Added verif-global support back to WCOSS2. 
commit 4529e8cf3736ffbacf615a27e99f4d1beec391aa Author: henrywinterbottom-wxdev Date: Fri Feb 16 11:12:01 2024 -0700 Bug fix. commit 8e4f94d13d32849b2862a9d59aa070f4103c61ae Author: henrywinterbottom-wxdev Date: Fri Feb 16 11:04:41 2024 -0700 Updates requested by reviewer Rahul Mahajan. commit 4624ce21c99ab303afa10c1dd8ddcce7b6f715ca Author: henrywinterbottom-wxdev Date: Fri Feb 16 10:52:11 2024 -0700 Updates requested by reviewer; testing -- DO NOT REVIEW. commit ed25bbd0b26a893d32ce4a10b368dec2bb722424 Author: henrywinterbottom-wxdev Date: Fri Feb 16 09:33:59 2024 -0700 Linter corrections. commit 6a0b7bf214815c13eec98a93bd3abe9978de04e4 Merge: f9fb64ef8 2415b7b4f Author: David Huber Date: Fri Feb 16 09:50:22 2024 -0600 Merge branch 'ss160' of github.com:DavidHuber-NOAA/global-workflow into ss160 commit f9fb64ef802b683d7b369f8e8268d423ab581481 Merge: 777d97d3a a23b7f2fd Author: David Huber Date: Fri Feb 16 09:49:42 2024 -0600 Merge remote-tracking branch 'emc/develop' into ss160 commit 7ca45db8fa32c84ef117ff2938a650822c81a786 Merge: 73bc76bfd a23b7f2fd Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Fri Feb 16 07:46:35 2024 -0700 Merge branch 'NOAA-EMC:develop' into feature/gfsv17_issue_2125 commit eb2ed53857eb3387735488a1ce5d701ff680e3db Merge: 9929277dc a23b7f2fd Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Fri Feb 16 07:46:24 2024 -0700 Merge branch 'NOAA-EMC:develop' into feature/gwdev_issue_2129 commit a23b7f2fdca5be700d257e28052a0104f2173a0f Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Fri Feb 16 09:37:58 2024 -0500 Add JEDI 3DEnVar atmosphere only CI test stub (#2309) commit cf83885548bb3a6740c033f42479ce2ad283a4a9 Author: Jessica Meixner Date: Fri Feb 16 01:55:02 2024 -0500 Add unstructured grid for HR3/GFS (#2230) This adds the capability to use unstructured grids in the global workflow, which will be used in HR3. 
There are new fix files for a low-resolution 100km grid and a grid closer to our targeted GFSv17 grid, whose resolution combines the older multi_1 and GFSv16 grids. The fix file update is here: NOAA-EMC/global-workflow#2229 Note: This now means that GFS tests need a new build option: `./build_all.sh -w` so that PDLIB=ON is turned on for compiling relevant UFS and WW3 codes. Resolves NOAA-EMC/global-workflow#1547 commit 9929277dc7d0ad90f5366faf4b5f2278969a0aa2 Merge: eb8791ccb 094e3b86d Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Thu Feb 15 15:05:50 2024 -0700 Merge branch 'NOAA-EMC:develop' into feature/gwdev_issue_2129 commit 094e3b86da44f1d3fc1d99f68f6fdfcd36deb09f Author: Cory Martin Date: Thu Feb 15 14:43:55 2024 -0500 Move IMS remapping files from COM_OBS to FIXgdas (#2322) * Add in IMS obs fix directory and update submodule for gdas commit d465ea06e8b2a8f3a5eb1120647c1e2ce5197d66 Author: TerrenceMcGuinness-NOAA Date: Thu Feb 15 19:25:02 2024 +0000 Set HOMEgfs for module_setup in CI driver (#2321) Hotfixes to CI Bash system from updates with sourcing `detect_machine.sh` in `ush/module-setup.sh` using **HOMEgfs**. commit 2415b7b4f3e6c376aca27707510001141cc9dd92 Author: David Huber Date: Thu Feb 15 19:21:17 2024 +0000 Load default rocoto on Jet. commit eb8791ccbe684828dc529d0e553551969274fc22 Merge: 60d5ee64b 638684e0b Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Thu Feb 15 11:56:40 2024 -0700 Merge branch 'NOAA-EMC:develop' into feature/gwdev_issue_2129 commit 777d97d3a1c9f5ec5e8af3ca40a41224ec7099a1 Author: David Huber Date: Thu Feb 15 12:34:59 2024 -0600 Fixed Orion cdo version. commit ef0723503c72f295de93521c7102c43e75c47417 Author: DavidHuber Date: Thu Feb 15 16:29:32 2024 +0000 Revert UFS hash. commit 0ce8c0dbc13227884fef1c637e93616a28c68d34 Author: DavidHuber Date: Thu Feb 15 14:55:48 2024 +0000 Fix git version in Hera's gwsetup module.
commit 3080a34253e8e24105bf2be72b6a872b1c072935 Author: DavidHuber Date: Wed Feb 14 20:52:34 2024 +0000 Fixed xarray version for SS/1.6.0. commit 49392dd47ff84b6586052aeae6879d7d8050b746 Author: DavidHuber Date: Wed Feb 14 20:29:41 2024 +0000 Updated GSI-Utils hash to head of develop. commit c3553f0d8e6a05c6234e7c14a179983aa44fd6f3 Merge: 4568653a6 638684e0b Author: DavidHuber Date: Wed Feb 14 20:28:14 2024 +0000 Merge remote-tracking branch 'origin/develop' into ss160 commit 4568653a67aa37c902379db628fb10f69fe7190f Author: David Huber Date: Wed Feb 14 14:22:13 2024 -0600 Reupgrade Hercules to SS/1.6.0 commit 638684e0bfcd06700cc8695f09824891a0a1eee1 Author: Kate Friedman Date: Wed Feb 14 14:55:21 2024 -0500 Remove `finddate.sh` from system (#2308) * Retire finddate.sh usage from system * Update gfs-utils hash to 7a84c88 - New hash includes removal of finddate.sh Refs #2279 commit 2b160f8470bed16c513ca4a5665e2b6d4448c50e Author: DavidHuber Date: Wed Feb 14 18:31:28 2024 +0000 Reenable METplus jobs on Hercules. commit 8f5900265a31e894060bbe9c89b262f4df0b1760 Author: DavidHuber Date: Wed Feb 14 18:30:23 2024 +0000 Update GSI hashes. commit 73bc76bfd47f2cff54df55e15d9ed8969683367d Author: henrywinterbottom-wxdev Date: Wed Feb 14 09:27:37 2024 -0700 Updates based on user request. commit e4bc674cf3b2df10e0b0dfd50a8ecb0f4f7825d8 Author: henrywinterbottom-wxdev Date: Wed Feb 14 08:03:45 2024 -0700 Corrected based on reviewer review. commit 60d5ee64bda7a16c1f2dc20c8badcd7c8fc4e592 Merge: 40f2cf6cd 1aaef05d3 Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Tue Feb 13 18:08:33 2024 -0700 Merge branch 'NOAA-EMC:develop' into feature/gwdev_issue_2129 commit 03304112347a673b7a0fcc404703bb10960bc47a Merge: 929b90330 1aaef05d3 Author: Henry R. 
Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Tue Feb 13 17:58:12 2024 -0700 Merge branch 'NOAA-EMC:develop' into feature/gfsv17_issue_2125 commit 1aaef05d317cd1eec548ef2b9842679c531cef8b Author: TerrenceMcGuinness-NOAA Date: Tue Feb 13 18:15:59 2024 -0500 Jenkins Pipeline updates for Canceling Jobs (#2307) Tuning updates for Jenkins Pipeline: - Added a short circuit that stops all parallel case runs when any case errors - Fixed canceling of all scheduled jobs on first case error - Added feature to save error log files to Jenkins Archive facility on fail commit 64048926627f8c9edb087de286095e3b93a214c2 Author: Rahul Mahajan Date: Tue Feb 13 14:57:37 2024 -0500 Ocean/ice product generation for GFS and GEFS (#2286) This PR does several things: 1. the model output for ocean and ice in the `COM/` directory are now named per EE2 convention for coupled models. e.g. `gfs.ocean.t12z.6hr_avg.f120.nc` and `gfs.ocean.t12z.daily.f120.nc` 2. The products are generated using the `ocnicepost.fd` utility developed by @DeniseWorthen in https://github.com/NOAA-EMC/gfs-utils and converted to grib2 using example scripts provided by @GwenChen-NOAA using `wgrib2`. 3. NetCDF products on the native grid are also generated by subsetting variables from the raw model output. This is done with `xarray`. 4. updates the hash of https://github.com/NOAA-EMC/gfs-utils to include fixes in `ocnicepost.fd` 5. removes NCL related scripting that was previously used for ocean/ice interpolation and `reg2grb2` used for converting to grib2. 6. updates archive scripts to accommodate updated file names 7. removes intermediate ocean processed files such as 2D/3D/xsect datasets 8. separate jobs are added for ocean and ice product generation. 9. removes intermediate restarts for the mediator and only saves the mediator restart at the end of the forecast in `COM`. 10. Increases memory for offline UPP when run at C768.
The program segfaults with an OOM when memory is self-allocated based on PEs by the scheduler on Hera. 11. Enables ocean/ice ensemble product generation for GEFS 12. Some minor clean-ups Fixes #935 Fixes #1317 Fixes #1864 commit 40f2cf6cd70ac94e01babc96982f37ae1b0c7e79 Author: henrywinterbottom-wxdev Date: Tue Feb 13 10:18:53 2024 -0700 Implemented ush/detect_machine.sh for host determination and removed redundant checks for expected file paths. commit 929b90330c7eb345011c2c850915e0004ca26b11 Merge: 2d08d015a 3f99f700c Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Tue Feb 13 08:31:06 2024 -0700 Merge branch 'NOAA-EMC:develop' into feature/gfsv17_issue_2125 commit 3f99f700c987526c8eb754b3f4c7b698b3e9b1dc Author: Walter Kolczynski - NOAA Date: Tue Feb 13 00:57:18 2024 -0500 Add wave post jobs to GEFS (#2292) Adds the wave post jobs for gridded and points to GEFS. Boundary point jobs are added even though the current GEFS buoy file does not contain any (tested by manually subbing in the GFS buoy file). Resolves #827 commit 842adf38087aec9f1c0bca9567e4b11d494e14c7 Author: TerrenceMcGuinness-NOAA Date: Mon Feb 12 12:50:08 2024 -0500 Added additional test cases to the pr list in Jenkins (#2306) C48mx500_3DVarAOWCDA, C96C48_hybatmDA, and C96_atmsnowDA Co-authored-by: terrance.mcguinness commit bb4ca65fe5524f76e40b97346339f1dda6680ce1 Author: CatherineThomas-NOAA <59020064+CatherineThomas-NOAA@users.noreply.github.com> Date: Mon Feb 12 14:50:41 2024 +0000 Redo v16.3 config.base changes for DA increments (#2304) Include the additional hydrometeors in the INCREMENTS_TO_ZERO and INCVARS_ZERO_STRAT variables in config.base that were modified in v16.3. Resolves: #2303 commit 061992bb6160554430cf688adf6184f01b732098 Author: TerrenceMcGuinness-NOAA Date: Sat Feb 10 01:33:36 2024 -0500 Fix Jenkins success reporting (#2302) Moving the post section back outside of the main Run Experiments stage.
This allows the system to correctly report the **Success** status only after all tests pass. _Had originally moved them in an attempt to solve the "Not an SCM GitHub Job" issue, which caused the reporting to misbehave._ Also ran through Jenkins linter and updated some messaging that was incorrectly reporting system build type. commit 28ccf78073a20ba1e4d3b379d164109b54ff6708 Merge: b972f66fc 54daa31ce Author: David Huber Date: Fri Feb 9 14:00:43 2024 -0600 Merge remote-tracking branch 'emc/develop' into ss160 commit b972f66fc924790c48f38d395e7141fa78ef9d90 Author: David Huber Date: Fri Feb 9 10:47:00 2024 -0600 Fix SS versions for CI modules. commit 4b01d8eeca26c3b3d843a4b1e7b1618f281de68f Author: David Huber Date: Fri Feb 9 10:44:30 2024 -0600 Revert Hercules modules to SS/1.5.1. commit 54daa31ce0a3c23d4d74def5e54436a39a899ed4 Author: TerrenceMcGuinness-NOAA Date: Thu Feb 8 15:48:38 2024 -0500 Jenkins Declarative Pipeline for CI with gfs/gefs multibuilds (#2246) Adding top level Jenkins file for CI tests running on Jenkins Controller: - Declarative Multi-branch Pipeline (has enhanced restart capabilities on a per-section basis) - Starts Pipeline from Label PR same as BASH system (for now) - Progress and restarts can be managed with CAC Login at [EPIC OAR Jenkins](https://jenkins.epic.oarcloud.noaa.gov) - Has logic for multi **gfs/gefs** system builds (arguments based on a configuration file `ci/cases/yamls/build.yaml`) - Any number of **systems** may be added by manually adding an element to the matrix in the Jenkinsfile - _It may be possible to dynamically add matrix values with a specialty plug-in_ - Currently only runs on **Orion** and **Hera** using `mterry` account Resolves #2119 Resolves #2118 commit 43429e23c12c1f2050b3a3f356abdec98dc73ea0 Author: Rahul Mahajan Date: Thu Feb 8 15:30:28 2024 -0500 Enable AO WCDA test (#1963) This PR: - adds GSI + SOCA C48 5-deg ocean 3DVar test (courtesy @guillaumevernieres) - adds a toggle to optionally disable ocnanalvrfy job.
commit 2d08d015afb9c850577efd9754add16b10c45ae6 Merge: 4745d4a06 f56352874 Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Thu Feb 8 07:24:54 2024 -0700 Merge branch 'NOAA-EMC:develop' into feature/gfsv17_issue_2125 commit f56352874d6dc133a4f1181f77c8f91ca38a6416 Author: Kate Friedman Date: Wed Feb 7 15:09:12 2024 -0500 Update JGLOBAL_FORECAST for octal error (#2295) Add "10#" to ENSMEM value > 0 check to handle octal errors. commit 4745d4a06148cc6c702c647e12ede4875e3a5862 Author: henrywinterbottom-wxdev Date: Wed Feb 7 08:45:49 2024 -0700 Removed jlogfile references. commit 5894ca2bf11e8ab7910c69781a9fbe51352a7e8c Author: henrywinterbottom-wxdev Date: Wed Feb 7 08:32:16 2024 -0700 Removed dummy variable passed to perl scripts. commit dae884a7c48b4a30c1844685c3d0ef50c9b78344 Author: henrywinterbottom-wxdev Date: Wed Feb 7 08:29:49 2024 -0700 Removed jlogfile and postmsg references within gempak scripts. commit 801058ffb0cbbfe101fd5b686aed79c5bf7538c1 Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Wed Feb 7 00:41:59 2024 -0700 Consolidate `npe_node_max` (#2289) - The environment variable `npe_node_max` is removed from all files beneath `global-workflow/env`; - The environment variable `npe_node_max` is removed from `parm/config/gefs/config.ufs` and `parm/config/gfs/config.ufs`; - The environment variable `npe_node_max` is maintained only within `parm/config/gefs/config.resources` and `parm/config/gfs/config.resources`. Resolves #2133 commit b0325e0157598702cbba6c3cc09af0120881e2b4 Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Wed Feb 7 00:40:20 2024 -0700 Removes module loading files no longer used by the GW (#2281) Removes `module-setup.csh.inc` and `module-setup.sh.inc`.
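The octal pitfall behind the JGLOBAL_FORECAST fix (#2295) above comes from bash arithmetic treating zero-padded values as octal. A minimal sketch, with an illustrative member value:

```shell
#!/usr/bin/env bash
# Sketch of the octal pitfall fixed by adding "10#" (#2295 above).
# Zero-padded member numbers like "008" are parsed as octal in bash
# arithmetic, and 8 is not a valid octal digit, so a bare comparison
# such as (( ENSMEM > 0 )) errors out with "value too great for base".
ENSMEM="008"
if (( 10#${ENSMEM} > 0 )); then   # "10#" forces base-10 interpretation
  echo "perturbed member"
fi
```

The same `10#` prefix works anywhere bash evaluates the value arithmetically, e.g. `$((10#${ENSMEM} + 1))`.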
The module `ush/module-setup.sh` is updated such that it now sources `ush/detect_machine.sh` to determine which supported platform the global-workflow is being executed on. Resolves #2130 commit 1ccc9896b361f2aaef8e6e7592a06ae4cfb7c491 Author: Walter Kolczynski - NOAA Date: Mon Feb 5 14:16:07 2024 -0500 Remove EnKF forecast groups (#2280) Removes the grouping of EnKF forecasts so each job only runs one forecast. Member and MEMDIR are now set at the workflow manager (rocoto) level. This change makes much of the system simpler (especially dependencies) and allows the elimination of the separate efcs scripts. Metatask names of updated jobs have been updated to make them a little less opaque by using the same name as its constituent tasks (e.g. the forecast metatask is named `enkgdasfcst`, not `enkfgdasefcs`). Metatasks that weren't updated retain the same names as before for now. Resolves #2254 commit 9f3383fd8d8322428a40b94764a172a16872995e Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Mon Feb 5 12:15:13 2024 -0700 Updated detect_machine.sh using that from UFS WM. (#2252) Updates `ush/detect_machine.sh` to match the UFS weather-model `tests/detect_machine.sh` prepared by @BrianCurtis-NOAA Resolves #2228 commit 7d68b0b164f0ffcd56867ae4fdab67905d9589eb Author: CatherineThomas-NOAA <59020064+CatherineThomas-NOAA@users.noreply.github.com> Date: Sun Feb 4 22:02:41 2024 +0000 Update global_cycle for fractional grid (#2262) The hash for ufs_utils is updated to include the changes for fractional grid support within global_cycle. This commit also removes the hack to skip global_cycle in cycling mode with v17 physics.
Resolves: #1775 Refs: ufs-community/UFS_UTILS#815 Refs: ufs-community/UFS_UTILS#891 commit ed592a6ecfabc0d0b64a6e276531b7bc5ae3b8ea Author: Kate Friedman Date: Sat Feb 3 02:55:14 2024 -0500 Retire cycle-specific FHMAX_GFS variables (#2278) This PR retires the `FHMAX_GFS_${cyc}` variables that allowed users to specify different gfs forecast lengths for each cycle. This function is no longer supported in global-workflow. The `FHMAX_GFS_*` variables will be removed and will no longer be checked to set the final `FHMAX_GFS` variable. The same forecast length will be set for every cycle. This PR also includes a small fix to add new post parm files into the `.gitignore` file. This was intended to be included in a different PR but that PR is on hold for further testing so it is being included here to get it into `develop` sooner. Resolves #2218 commit 977e2d67b268477321aa26fc56073dd373e4f979 Author: Henry R. Winterbottom <49202169+HenryWinterbottom-NOAA@users.noreply.github.com> Date: Fri Feb 2 07:49:09 2024 -0700 New GDASApp hash. (#2285) commit b5f2bd9ec5632c4be43004604eed0e130dfe1735 Merge: 4d667421d 0400e1f35 Author: David Huber Date: Tue Jan 30 13:55:00 2024 -0600 Merge branch 'ss160' of github.com:DavidHuber-NOAA/global-workflow into ss160 commit 0400e1f3558be8e34c1298d32e14999c9dd46f8c Author: David Huber Date: Tue Jan 30 09:57:11 2024 -0600 Fix gfs_utils Orion spack-stack env path. commit 6bbe823e729291db326d108765d3a92a99552a58 Author: DWesl <22566757+DWesl@users.noreply.github.com> Date: Mon Jan 29 21:15:30 2024 -0500 Use seq to generate the list of times, instead of a bash for-loop (#2264) I'm running a year-long forecast, which means I get a large portion of the log file dedicated to these loops. `seq ${START} ${STEP} ${STOP}` will generate a sequence going from START to STOP by STEP, including STOP if relevant. That seems to be the purpose of these loops. 
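The loop-to-`seq` replacement described above can be sketched as follows; the START/STEP/STOP values are illustrative, not taken from the actual scripts:

```shell
#!/usr/bin/env bash
# Sketch of replacing a bash for-loop with a single seq call (#2264 above).
START=0 STEP=6 STOP=24

# Old style: one loop iteration (and, with tracing, one log line) per value
for (( t = START; t <= STOP; t += STEP )); do echo "${t}"; done

# New style: seq emits the whole sequence in one call; -s ' ' joins the
# values with spaces, matching how the loop output was typically consumed
fhrs=$(seq -s ' ' "${START}" "${STEP}" "${STOP}")
echo "${fhrs}"    # 0 6 12 18 24
```

Because the sequence is produced by one external command instead of many traced loop iterations, the log shrinks to a couple of lines, as the commit message goes on to describe.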
It will by default separate the list with newlines, but `seq -s ' ' ${START} ${STEP} ${STOP}` will separate them with spaces instead, more closely mimicking the previous behavior. I would like this to be two lines in the log, rather than a few hundred, and this may also be faster, though probably more for reasons of fewer writes to disk than because bash isn't designed for arithmetic. commit d5bee38979cde547861261d1cd150f3a61601d4b Author: Kate Friedman Date: Mon Jan 29 14:35:02 2024 -0500 Correct typos in GFS config.resources (#2267) This PR corrects some typos in `parm/config/gfs/config.resources` that were introduced in PR #2216. The esfc job was failing in tests on WCOSS2 due to insufficient memory. This led to discovering the other typos. The esfc job completes without error after its memory is set back to `80GB` from the incorrect `8GB`. Resolves #2266 commit 81557beca9eecd878e7b25b3822e30a4276f4a16 Author: David Huber Date: Mon Jan 29 12:13:31 2024 -0600 Update monitor hash to noaa-emc with SS/1.6.0 support. commit 2238dd6ac0094ba2ff5e1027e964ef29ad33352c Author: DavidHuber Date: Mon Jan 29 13:21:30 2024 +0000 Update Orion, Hercules, S4 modulefiles. commit 6ffd94fd95f54ceb940f0c9201774ad73fbb055b Author: DavidHuber Date: Fri Jan 26 16:01:23 2024 +0000 Update GDAS hash to include SS/1.6.0 support. commit be11f85f28cf832e5fbb390fdd387f1bdecb5f82 Merge: 56b968080 04e0772d9 Author: DavidHuber Date: Fri Jan 26 15:34:24 2024 +0000 Merge branch 'ss160' of github.com:DavidHuber-NOAA/global-workflow into ss160 commit 8ff344844e28c3b2d03a0356f88b14635f318c12 Author: Rahul Mahajan Date: Fri Jan 26 10:12:18 2024 -0500 Add a yaml for snow DA testing. (#2199) - adds a new test yaml C96_atmsnowDA.yaml for 3DVar atmosphere with GSI and Land (Snow) DA with JEDI - moves a few yamls from platforms/ to yamls/ - adds the ability to overwrite a previously created experiment as an addition to user input.
--------- Co-authored-by: Cory Martin commit 04e0772d9d3e77ac5a24ce4570d48cb41424a08b Author: David Huber Date: Fri Jan 26 07:16:25 2024 -0600 Update ufs_utils hash for spack-stack/1.6.0 support. commit 3d44ff38c5c3324c22fc104fe3259b4ac864c6d6 Author: Barry Baker Date: Thu Jan 25 14:33:27 2024 -0500 GOCART ExtData biogenic climatology fix (#2253) Updates the ExtData for biogenic emissions to use climatology rather than the current time. Fixes an issue that would crash by default for other years. commit 66f58b8ab1a9524d6be95271f27a06c2f32e5f78 Author: Guillaume Vernieres Date: Thu Jan 25 13:16:41 2024 -0500 Added missing container case in gfs/config.resources (#2258) fixes #2257 commit 553b4f2e74ef610115436b75f7f6df100babd8dd Author: WenMeng-NOAA <48260754+WenMeng-NOAA@users.noreply.github.com> Date: Thu Jan 25 13:00:45 2024 -0500 Fix post parm links (#2243) Update symbolic links under parm/post to the latest develop branch of the UPP repository and enable MERRA2 aerosol fields. Resolves #2259 commit 4d667421d5eefea2347ea1dd8097e39e167d7201 Merge: 6c058039e afa09e356 Author: David Huber Date: Thu Jan 25 07:49:08 2024 -0600 Merge branch 'ss160' of github.com:DavidHuber-NOAA/global-workflow into ss160 commit 6c058039e209c67ea477d37d2d6f76b7a2fa68ac Author: David Huber Date: Thu Jan 25 07:48:42 2024 -0600 Fix wgrib2/gfs_utils on Hercules. commit ee6f536ea0228c60f5a8bec4037cd6f7ea63b816 Author: Kate Friedman Date: Thu Jan 25 07:43:13 2024 -0500 Update GFS version to v16.3.13 in index.rst (#2256) GFSv16.3.13 WAFS update was implemented Refs #2013 commit 2445d44d0d66f35512080b0bd5867501660793bb Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Thu Jan 25 06:17:11 2024 -0500 Simplify and extend load_ufsda_modules to Hercules (#2245) GDASApp jobs do not run on Hercules because `ush/load_ufsda_modules.sh` does not include logic to load the appropriate GDASApp modules on Hercules.
This PR extends `load_ufsda_modules.sh` functionality to Hercules, thereby enabling GDASApp jobs to run on Hercules. Resolves #2244 commit 56b9680803f109720f6439276a0e5783a9c49352 Author: DavidHuber Date: Wed Jan 24 19:47:34 2024 +0000 Update hashes (revert WCOSS2 modules). commit afa09e356503f1befc162df9a79dc9ce7414dc22 Author: DavidHuber Date: Wed Jan 24 18:59:40 2024 +0000 New (cleaner history) gdas hash. commit a9eaec23d5103ea766342ff81d77df8e0f6d9a58 Author: DavidHuber Date: Wed Jan 24 18:52:27 2024 +0000 Update gdasapp to include ss/1.6.0 support. commit 5ef8eb24eeaeaff9c9a03767d929f2114fd4840f Merge: 4c463548b e400068a2 Author: DavidHuber Date: Wed Jan 24 17:08:54 2024 +0000 Merge branch 'ss160' of github.com:DavidHuber-NOAA/global-workflow into ss160 commit 4c463548b0c38f36c13c01716fa0b1e91544e440 Author: DavidHuber Date: Wed Jan 24 17:05:12 2024 +0000 Reenable verif-global support commit f775755df5f1579edfafbce9a7b47034907d023f Author: DavidHuber Date: Wed Jan 24 16:59:10 2024 +0000 Assign fcsthrs for awips_g2 job. commit e400068a2951e5fa3b130369861b4a60ece01491 Author: David Huber Date: Mon Jan 22 15:01:07 2024 -0600 More gsi-addon path fixes. commit f663d4786f226f8164d348d403e8d105cc3d801b Author: David Huber Date: Mon Jan 22 14:41:59 2024 -0600 Fix gsi-addon paths for hercules, orion, and S4. commit e304bbeb364067e411fb50801796d000b44d9147 Author: DavidHuber Date: Mon Jan 22 18:59:34 2024 +0000 Better optimize build jobs (more ufs jobs). commit ccc6e2d445d4c2e2f44fb907c7bc972b1e69f6ff Author: DavidHuber Date: Mon Jan 22 18:56:18 2024 +0000 Reenable verif-global. #2195 commit 3ef3411f856b0c5a9cf51686f3a51a2252596690 Merge: 4365f63e5 f4d187f4e Author: DavidHuber Date: Mon Jan 22 18:39:10 2024 +0000 Merge remote-tracking branch 'origin/develop' into ss160 commit 4365f63e54e2ddb9b88e768a6e88a92504b5d482 Author: DavidHuber Date: Mon Jan 22 18:37:25 2024 +0000 Corrected the SS env. name in the version files. 
#2195 commit f4d187f4e45fe89583d18987d68a883490827104 Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Mon Jan 22 12:28:28 2024 -0500 Converts obsproc to obsprep in prepoceanobs config file (#2236) Converts obsproc to obsprep in prepoceanobs config file as a complement to the mutually dependent GDASApp PR NOAA-EMC/GDASApp#858 The motivations are explained in refs NOAA-EMC/GDASApp#857 commit 9a09d3082208ad206cf6e7103a53bfb7c4946f4a Author: DavidHuber Date: Fri Jan 19 21:23:50 2024 +0000 Corrected gsi-addon-dev spelling. commit 21ff6458aac9372b26cc223f6a114953a137067e Author: DavidHuber Date: Fri Jan 19 21:11:34 2024 +0000 Update modulefiles, submodules to spack-stack 1.6.0. #2195 commit d4c55d1011f8b0385d25b62ac04710837ed8413e Author: Kate Friedman Date: Fri Jan 19 15:43:37 2024 -0500 Update typing hint for WCOSS version of python (#2238) The typing hint `typing.List` was deprecated with python 3.9 in favor of using the primitive `list[str]`, but the functional version of python on WCOSS2 is <3.9, causing `setup_xml.py` to fail there. This replaces `list[str]` as a typing hint with the deprecated form until the supported version on WCOSS2 is >=3.9. commit 491928712ae1dabac62750450c9cad8f538c2a3e Author: Barry Baker Date: Thu Jan 18 22:31:27 2024 -0500 GOCART Emission updates for GEFSv13 and GDAS (#2201) This PR addresses several things needed for more recent simulations using GOCART2G as well as preparing for the GEFSv13 30-year run. The main updates are: - Update CEDS to use monthly emission files instead of daily. This will drastically reduce the number of files needed. No science change. - Update biogenic MEGAN inputs to use a climatology rather than the offline biogenic previously used. This is needed to deal with simulations where the current dataset is not available. - Update volcanic SO4 emissions to use degassing emissions only. This is to support the need for more recent simulations where data is not available.
commit 13d25cfc3614de978bfd4b7f273d1f13cb820878 Author: Kate Friedman Date: Thu Jan 18 22:29:43 2024 -0500 Add `POSTAMBLE_CMD` into preamble postamble (#2235) This PR adds the ability to run a command at the end of jobs via the preamble's postamble function. A new command can be set via `POSTAMBLE_CMD` and will be invoked at the end of jobs. Users can add the command to the top of an env file to have every job run it or it can be placed within a job if-block in the env file to run for just that job. Resolves #1145 commit 7759163c668eb6ccc11dc1ecd39c0dc5d433cdc1 Author: Rahul Mahajan Date: Thu Jan 18 01:13:49 2024 -0500 Add option to create WGNE atmosphere products (#2233) Adds option to create WGNE products for the atmosphere. Resolves #1951 Resolves #2226 commit d95ed8588379b2c94d771f9aa680c445d6cf6c5a Author: Rahul Mahajan Date: Wed Jan 17 18:04:56 2024 -0500 Use ufs.configure, model_configure, MOM_input, data_table, and ice_in templates from UFS-weather-model (#2051) This PR: - links `ufs.configure.*.IN`, `model_configure.IN`, `MOM_input_${OCNRES}.in`, `MOM6_data_table.IN`, and `ice_in.IN` from ufs-weather-model into the global-workflow `parm/ufs` directory - prepares local variables for use in the templates in the respective functions for FV3, MOM6, and CICE6 - uses `atparse.bash` to render these templates into fully formed namelists - Work was performed by @DeniseWorthen in the ufs-weather-model to templatize several variables that are hard-wired in the ufs-weather-model for CICE. 
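The `POSTAMBLE_CMD` hook described above (#2235) can be sketched roughly as follows; the postamble body and the example command here are illustrative, not the actual preamble code:

```shell
#!/usr/bin/env bash
# Minimal sketch of a POSTAMBLE_CMD-style hook (#2235 above): a user-set
# command is invoked at the end of the job by the postamble function.
# The function body and command below are illustrative placeholders.
POSTAMBLE_CMD='echo cleanup hook ran'

postamble() {
  # ... normal end-of-job logging would happen here ...
  # Invoke the user-supplied command, if one was set, as the last step
  if [[ -n "${POSTAMBLE_CMD:-}" ]]; then
    ${POSTAMBLE_CMD}
  fi
}

postamble
```

As the commit message notes, setting the variable at the top of an env file runs the command for every job, while setting it inside a job's if-block limits it to that job.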
See ufs-community/ufs-weather-model/#2010 commit 44a66dd508aac1f15e56377bb04eb1af6832fe5b Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Wed Jan 17 13:52:09 2024 -0500 Update gdas.cd hash to use gsibec tag 1.1.3 (#2231) (#2234) commit 9046d97b4cc0e0132633054f880b60422fd3b1c7 Author: Walter Kolczynski - NOAA Date: Wed Jan 17 10:19:18 2024 -0500 Add basic atmos products for GEFS and refactor resources (#2216) * Add basic atmos products for GEFS Adds basic atmosphere products for GEFS members (no mean or spread). To facilitate the addition of GEFS parameter lists, the existing GFS lists are renamed. They are also relocated to the `parm/products/` directory instead of `parm/post/`. GEFS v12 uses different parameter lists for different resolutions, but for this implementation, the 0p25 lists are used for all resolutions. However, to hedge against the final configuration, all of the GEFS parameter lists have been added to parm. Implementation of different parameter lists for each resolution will take additional refactoring of the atmos_products job, as it currently assumes all resolutions use the same lists. The new GEFS rocoto task is implemented as two loops of metatasks (made possible by PR #2189). The member is set at the rocoto level, so the setting in `config.base` had to be updated to accept a pass-through. The generation of a forecast hour list is moved up to its own function in `rocoto.tasks`, separate from the post groups which are not used. This can be used as a model for future metatasks (or refactoring of old ones). `rocoto_viewer.py` does function with the nested metatasks, though only one level of folding is available and the metatask name is that of the lower metatask (though it encompasses the entire outer metatask). Additionally, the GEFS `config.resources` had diverged from the GFS version, which included not having a section for the atm products job.
This led to the refactoring of the GFS `config.resources`, including switching to using a `case` statement instead of sequential `elif`. This refactored script was then copied over to GEFS, replacing the old one, with all jobs not currently implemented for GEFS removed (except wave post, whose inclusion is imminent). New blocks can be copied over from GFS as they are added. Resolves #823 commit fbba174c8b0a2e2d1c669ae3c74bec45fa995d6d Author: RussTreadon-NOAA <26926959+RussTreadon-NOAA@users.noreply.github.com> Date: Tue Jan 16 14:05:48 2024 -0500 Update GDASApp (gdas.cd) hash (#2225) GDASApp now uses git submodules commit dddd83796c8cd52f6264f73282fc68eb7857a475 Author: Rahul Mahajan Date: Tue Jan 16 12:17:19 2024 -0500 Fix ocean resolution for the S2SWA test (#2223) commit 83edaf145382fe15e53f0af8937570488ef5440d Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Tue Jan 16 11:07:02 2024 -0500 Apply fixes to archiving jobs (#2221) Add checks for potentially missing log/data files and update log names for archiving. Missing sflux index files are also added to the products script. Resolves #2076 commit c59047614c29f6ec15a4d4a6fa1855929287debd Author: Rahul Mahajan Date: Fri Jan 12 19:21:02 2024 -0500 Update hash to ufs-weather-model. The noahmptable.tbl was reorganized, so update link_workflow.sh to use the same one used in UFSWM RT (#2219) commit c041968e165c07da785376f1441374687b55d6cc Author: Kate Friedman Date: Fri Jan 12 12:09:35 2024 -0500 Add ocean resolution to setup_expt invocation and retire/reduce COMROT/ROTDIR usage (#2214) Two series of updates: 1) Update setup scripts to now allow users to provide ocean resolution 2) Housekeeping to retire the `COMROT` variable, replacing it with other variables as needed, and reduce the `ROTDIR` variable usage. Both updates change options for the workflow setup API.
Refs #2061 commit 997f97816493d9ed0242e169b6dcf84e5c9338d6 Author: AndrewEichmann-NOAA <58948505+AndrewEichmann-NOAA@users.noreply.github.com> Date: Fri Jan 12 10:46:36 2024 -0500 Allow use of ocean obs prep in WCDA cycling and remove R2D2 (#2215) Enables use of ocean obs prep task in WCDA cycling and removes R2D2 from same. Runs task gdasprepoceanobs before gdasocnanalprep -- obtains ocean data nc4 files from DMPDIR, processes them into IODA format and copies them to COM_OBS. Replaces the current R2D2 processing. commit 12a5bb192ab6282d44c008c06dc14947abd37249 Merge: 4cb580201 6492c2da5 Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Fri Jan 12 09:26:27 2024 -0500 Merge pull request #2217 from DavidHuber-NOAA/update/versions Update and clean up version and module files commit 6492c2da5f49bd31575d0bef5278c9a7f8ba3412 Author: David Huber Date: Thu Jan 11 11:49:08 2024 -0600 Update orion module/version files for met/metplus #2123 commit 94c9937f0bff380b0ebd6acdcfd2509e1ae99389 Author: DavidHuber Date: Thu Jan 11 17:42:53 2024 +0000 Comment met/metplus out from Hera modulefile. 
#2123 commit 8c32f8b7d00bba6e3e954f149e6548fb27713a3c Merge: a65e4c675 4cb580201 Author: David Huber Date: Thu Jan 11 11:34:44 2024 -0600 Merge branch 'develop' of github.com:noaa-emc/global-workflow into develop commit a65e4c675f21dbe72198d263e85e894e62b41e13 Author: David Huber Date: Thu Jan 11 11:34:19 2024 -0600 Initial update of version/module files #2123 commit 4cb580201af68c1c0e2ae19faf0727dcbbe43b4d Author: souopgui Date: Wed Jan 10 08:30:22 2024 -0600 Fix OpenMP over-allocation of resources in exglobal_atmos_products.sh when running MPMD tasks (#2212) Fix OpenMP over-allocation of resources running MPMD tasks Co-authored-by: Innocent Souopgui commit b056b531faee6929687cb8a588ffafa1a66426fb Author: TerrenceMcGuinness-NOAA Date: Mon Jan 8 17:28:05 2024 -0500 Add Hercules as valid machine in CI scripts (#2207) A few updates to CI scripts to include names for Hercules that were missed the first time. commit 6574d29a8c26b0695614874c344bccc5182363f4 Author: Rahul Mahajan Date: Mon Jan 8 17:25:47 2024 -0500 Fix invalid GH action and restart file name (#2210) Resolves a typo that leads to an invalid workflow yaml and fixes the restart filename in restart detection. Resolves #2205 commit 69605eac299df381ea9e0e329654487e26380ff5 Author: Rahul Mahajan Date: Mon Jan 8 17:00:28 2024 -0500 Stop attempting to comment link to RTD for non-PRs (#2209) Adds a check so comments with a link to documentation are only generated for PRs. commit 4e160a895bfb31c281e21557dd86c13954a1a967 Author: Rahul Mahajan Date: Mon Jan 8 13:10:15 2024 -0500 Enable UPP for GOES processing (#2203) Enables the creation of special master grib2 files from UPP for GOES processing commit c15875b6dbf685327af9316ee43b3d01f0fc815e Author: David Huber <69919478+DavidHuber-NOAA@users.noreply.github.com> Date: Mon Jan 8 09:56:06 2024 -0500 Port cycling to Hercules (#2196) Adds cycled support for Hercules (excluding gsi-monitor).
Partially resolves #1588 GSI monitoring is disabled on Hercules due to missing Perl modules. That will be enabled in a later PR. commit ef6827dd6abdab2996d1e19f9e0ff5d3071e0fdd Author: Walter Kolczynski - NOAA Date: Mon Jan 8 09:43:12 2024 -0500 Refactor rocoto task XML creation (#2189) Refactors the rocoto task generation to be recursive. This will allow nested metatasks to loop over multiple variables, which is needed for GEFS product generation. As part of this refactor, there are no longer separate arguments to designate metatasks. Instead, task dicts can include a nested 'task_dict' as well as a 'var_dict' containing the variables to loop over. The nested task dict can then either have another layer, or be the innermost task. To accommodate the new recursive nature, some defaults that were previously defined in create_wf_task() had to be pushed down into the function that creates the innermost task. Also, former keywords have been absorbed by the task dict. Refs #823 Refs #827 commit 2b81cfac16f103fb29b3aeb9748941787abf7287 Author: Walter Kolczynski - NOAA Date: Mon Jan 8 09:41:03 2024 -0500 Update fix versions (#2198) Updates fix versions for a few components: - Update cice and mom6 versions to support C96/1p00 marine - Update wave to change betamax setting for glo_025 waves Resolves #2004 Resolves #2107 --- .github/workflows/docs.yaml | 2 +- .gitignore | 45 +- .gitmodules | 41 +- .shellcheckrc | 3 + ci/Jenkinsfile | 268 +++ ci/cases/pr/C48_ATM.yaml | 2 +- ci/cases/pr/C48_S2SW.yaml | 2 +- ci/cases/pr/C48_S2SWA_gefs.yaml | 3 +- ci/cases/pr/C48mx500_3DVarAOWCDA.yaml | 23 + ci/cases/pr/C96C48_hybatmDA.yaml | 2 +- ci/cases/pr/C96C48_ufs_hybatmDA.yaml | 23 + ci/cases/pr/C96_atm3DVar.yaml | 2 +- ci/cases/pr/C96_atmaerosnowDA.yaml | 21 + ci/cases/weekly/C384C192_hybatmda.yaml | 2 +- ci/cases/weekly/C384_S2SWA.yaml | 4 +- ci/cases/weekly/C384_atm3DVar.yaml | 2 +- ci/cases/yamls/atmaerosnowDA_defaults_ci.yaml | 6 + ci/cases/yamls/build.yaml | 3 + 
.../yamls}/gefs_ci_defaults.yaml | 0 .../yamls}/gfs_defaults_ci.yaml | 0 ci/cases/yamls/soca_gfs_defaults_ci.yaml | 5 + ci/cases/yamls/ufs_hybatmDA_defaults.ci.yaml | 20 + ci/scripts/check_ci.sh | 23 +- ci/scripts/clone-build_ci.sh | 2 +- ci/scripts/driver.sh | 3 + ci/scripts/pr_list_database.py | 196 +- ci/scripts/run-check_ci.sh | 20 +- ci/scripts/run_ci.sh | 13 +- ci/scripts/utils/ci_utils.sh | 130 +- ci/scripts/utils/ci_utils_wrapper.sh | 9 + ci/scripts/utils/get_host_case_list.py | 32 + ci/scripts/utils/githubpr.py | 124 + ci/scripts/utils/publish_logs.py | 106 + ci/scripts/utils/wxflow | 1 + docs/source/clone.rst | 7 + docs/source/components.rst | 18 +- docs/source/conf.py | 7 +- docs/source/configure.rst | 112 +- docs/source/hpc.rst | 22 +- docs/source/index.rst | 6 +- docs/source/init.rst | 88 +- docs/source/setup.rst | 6 +- env/AWSPW.env | 3 +- env/CONTAINER.env | 3 +- env/HERA.env | 49 +- env/HERCULES.env | 23 +- env/JET.env | 30 +- env/ORION.env | 30 +- env/S4.env | 27 +- env/WCOSS2.env | 26 +- gempak/fix/datatype.tbl | 14 +- gempak/fix/gfs_meta | 46 +- gempak/ush/gdas_ecmwf_meta_ver.sh | 145 +- gempak/ush/gdas_meta_loop.sh | 191 +- gempak/ush/gdas_meta_na.sh | 88 +- gempak/ush/gdas_ukmet_meta_ver.sh | 169 +- gempak/ush/gempak_gdas_f000_gif.sh | 295 +-- gempak/ush/gempak_gfs_f000_gif.sh | 584 +++++ gempak/ush/gempak_gfs_f00_gif.sh | 602 ----- gempak/ush/gempak_gfs_f12_gif.sh | 213 -- gempak/ush/gempak_gfs_f24_gif.sh | 231 -- gempak/ush/gempak_gfs_f36_gif.sh | 231 -- gempak/ush/gempak_gfs_f48_gif.sh | 231 -- gempak/ush/gempak_gfs_fhhh_gif.sh | 189 ++ gempak/ush/gfs_meta_ak.sh | 136 +- gempak/ush/gfs_meta_bwx.sh | 190 +- gempak/ush/gfs_meta_comp.sh | 1131 ++------- gempak/ush/gfs_meta_crb.sh | 177 +- gempak/ush/gfs_meta_hi.sh | 114 +- gempak/ush/gfs_meta_hur.sh | 210 +- gempak/ush/gfs_meta_mar_atl.sh | 68 +- gempak/ush/gfs_meta_mar_comp.sh | 1060 +++------ gempak/ush/gfs_meta_mar_pac.sh | 69 +- gempak/ush/gfs_meta_mar_ql.sh | 60 +- 
gempak/ush/gfs_meta_mar_skewt.sh | 57 +- gempak/ush/gfs_meta_mar_ver.sh | 51 +- gempak/ush/gfs_meta_nhsh.sh | 112 +- gempak/ush/gfs_meta_opc_na_ver | 312 +-- gempak/ush/gfs_meta_opc_np_ver | 311 +-- gempak/ush/gfs_meta_precip.sh | 142 +- gempak/ush/gfs_meta_qpf.sh | 144 +- gempak/ush/gfs_meta_sa.sh | 136 +- gempak/ush/gfs_meta_sa2.sh | 274 +-- gempak/ush/gfs_meta_trop.sh | 127 +- gempak/ush/gfs_meta_us.sh | 137 +- gempak/ush/gfs_meta_usext.sh | 162 +- gempak/ush/gfs_meta_ver.sh | 436 +--- jobs/JGDAS_ATMOS_GEMPAK | 57 +- jobs/JGDAS_ATMOS_GEMPAK_META_NCDC | 68 +- jobs/JGDAS_ENKF_FCST | 84 - jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT | 2 +- jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_ECEN | 38 + jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_PREP | 3 +- jobs/JGFS_ATMOS_AWIPS_20KM_1P0DEG | 13 +- jobs/JGFS_ATMOS_AWIPS_G2 | 13 +- jobs/JGFS_ATMOS_FBWIND | 13 +- jobs/JGFS_ATMOS_GEMPAK | 151 +- jobs/JGFS_ATMOS_GEMPAK_META | 44 +- jobs/JGFS_ATMOS_GEMPAK_NCDC_UPAPGIF | 56 +- jobs/JGFS_ATMOS_GEMPAK_PGRB2_SPEC | 40 +- jobs/JGFS_ATMOS_PGRB2_SPEC_NPOESS | 15 +- jobs/JGFS_ATMOS_POSTSND | 14 +- jobs/JGFS_ATMOS_WAFS | 96 + jobs/JGFS_ATMOS_WAFS_BLENDING_0P25 | 153 ++ jobs/JGFS_ATMOS_WAFS_GCIP | 140 ++ jobs/JGFS_ATMOS_WAFS_GRIB2 | 124 + jobs/JGFS_ATMOS_WAFS_GRIB2_0P25 | 133 ++ jobs/JGLOBAL_AERO_ANALYSIS_FINALIZE | 2 +- jobs/JGLOBAL_AERO_ANALYSIS_INITIALIZE | 2 +- jobs/JGLOBAL_AERO_ANALYSIS_RUN | 2 +- jobs/JGLOBAL_ARCHIVE | 8 +- jobs/JGLOBAL_ATMENS_ANALYSIS_FINALIZE | 2 +- jobs/JGLOBAL_ATMENS_ANALYSIS_INITIALIZE | 2 +- jobs/JGLOBAL_ATMENS_ANALYSIS_RUN | 2 +- jobs/JGLOBAL_ATMOS_ANALYSIS | 2 +- jobs/JGLOBAL_ATMOS_POST_MANAGER | 11 +- jobs/JGLOBAL_ATMOS_PRODUCTS | 2 +- jobs/JGLOBAL_ATMOS_SFCANL | 2 +- jobs/JGLOBAL_ATMOS_UPP | 2 +- jobs/JGLOBAL_ATM_ANALYSIS_FINALIZE | 2 +- jobs/JGLOBAL_ATM_ANALYSIS_INITIALIZE | 2 +- jobs/JGLOBAL_ATM_ANALYSIS_RUN | 2 +- jobs/JGLOBAL_ATM_PREP_IODA_OBS | 4 +- jobs/JGLOBAL_CLEANUP | 2 +- jobs/JGLOBAL_FORECAST | 60 +- jobs/JGLOBAL_OCEANICE_PRODUCTS | 40 + jobs/JGLOBAL_PREP_OCEAN_OBS | 6 +- 
...AL_PREP_LAND_OBS => JGLOBAL_PREP_SNOW_OBS} | 5 +- ...AL_LAND_ANALYSIS => JGLOBAL_SNOW_ANALYSIS} | 9 +- jobs/JGLOBAL_STAGE_IC | 2 +- jobs/JGLOBAL_WAVE_GEMPAK | 2 +- jobs/JGLOBAL_WAVE_INIT | 8 +- jobs/JGLOBAL_WAVE_POST_BNDPNT | 8 +- jobs/JGLOBAL_WAVE_POST_BNDPNTBLL | 8 +- jobs/JGLOBAL_WAVE_POST_PNT | 8 +- jobs/JGLOBAL_WAVE_POST_SBS | 8 +- jobs/JGLOBAL_WAVE_PRDGEN_BULLS | 2 +- jobs/JGLOBAL_WAVE_PRDGEN_GRIDDED | 2 +- jobs/JGLOBAL_WAVE_PREP | 8 +- jobs/rocoto/awips_20km_1p0deg.sh | 4 +- jobs/rocoto/awips_g2.sh | 3 +- jobs/rocoto/efcs.sh | 25 - .../{gempakpgrb2spec.sh => gempakgrb2spec.sh} | 0 jobs/rocoto/oceanice_products.sh | 37 + jobs/rocoto/ocnanalecen.sh | 23 + jobs/rocoto/ocnpost.sh | 119 - jobs/rocoto/prepatmiodaobs.sh | 5 +- .../rocoto/{preplandobs.sh => prepsnowobs.sh} | 8 +- jobs/rocoto/{landanl.sh => snowanl.sh} | 4 +- modulefiles/module-setup.csh.inc | 87 - modulefiles/module-setup.sh.inc | 110 - modulefiles/module_base.hera.lua | 17 +- modulefiles/module_base.hercules.lua | 15 +- modulefiles/module_base.jet.lua | 10 +- modulefiles/module_base.orion.lua | 14 +- modulefiles/module_base.s4.lua | 10 +- modulefiles/module_base.wcoss2.lua | 5 + modulefiles/module_gwci.hera.lua | 4 +- modulefiles/module_gwci.hercules.lua | 2 +- modulefiles/module_gwci.orion.lua | 4 +- modulefiles/module_gwsetup.hera.lua | 6 +- modulefiles/module_gwsetup.hercules.lua | 4 +- modulefiles/module_gwsetup.jet.lua | 6 +- modulefiles/module_gwsetup.orion.lua | 4 +- modulefiles/module_gwsetup.s4.lua | 4 +- parm/config/gefs/config.atmos_products | 28 + .../gefs/{config.base.emc.dyn => config.base} | 43 +- parm/config/gefs/config.efcs | 70 +- parm/config/gefs/config.fcst | 47 +- parm/config/gefs/config.oceanice_products | 1 + parm/config/gefs/config.resources | 639 ++--- parm/config/gefs/config.stage_ic | 6 + parm/config/gefs/config.ufs | 110 +- parm/config/gefs/config.wave | 9 - parm/config/gefs/config.wavepostbndpnt | 11 + parm/config/gefs/config.wavepostbndpntbll | 11 + 
parm/config/gefs/config.wavepostpnt | 11 + parm/config/gefs/config.wavepostsbs | 28 + parm/config/gefs/yaml/defaults.yaml | 6 +- parm/config/gfs/config.aero | 6 +- parm/config/gfs/config.aeroanl | 17 +- parm/config/gfs/config.aeroanlfinal | 2 +- parm/config/gfs/config.aeroanlinit | 2 +- parm/config/gfs/config.aeroanlrun | 2 +- parm/config/gfs/config.anal | 55 +- parm/config/gfs/config.atmanl | 24 +- parm/config/gfs/config.atmanlinit | 1 + parm/config/gfs/config.atmensanl | 13 +- parm/config/gfs/config.atmensanlinit | 1 + parm/config/gfs/config.atmos_products | 19 +- parm/config/gfs/config.awips | 3 - .../gfs/{config.base.emc.dyn => config.base} | 0 parm/config/gfs/config.base.emc.dyn_emc | 100 +- parm/config/gfs/config.base.emc.dyn_hera | 100 +- parm/config/gfs/config.cleanup | 7 +- parm/config/gfs/config.com | 9 +- parm/config/gfs/config.efcs | 53 +- parm/config/gfs/config.esfc | 13 +- parm/config/gfs/config.fcst | 53 +- parm/config/gfs/config.fit2obs | 4 +- parm/config/gfs/config.ice | 5 + parm/config/gfs/config.landanl | 34 - parm/config/gfs/config.metp | 1 + parm/config/gfs/config.nsst | 5 + parm/config/gfs/config.oceanice_products | 15 + parm/config/gfs/config.ocn | 13 +- parm/config/gfs/config.ocnanalecen | 11 + parm/config/gfs/config.ocnpost | 29 - parm/config/gfs/config.postsnd | 1 - parm/config/gfs/config.prep | 13 +- parm/config/gfs/config.prepatmiodaobs | 3 - parm/config/gfs/config.preplandobs | 18 - parm/config/gfs/config.prepoceanobs | 2 +- parm/config/gfs/config.prepsnowobs | 21 + parm/config/gfs/config.resources | 1171 ++++----- parm/config/gfs/config.sfcanl | 5 + parm/config/gfs/config.snowanl | 30 + parm/config/gfs/config.stage_ic | 18 +- .../gfs/config.ufs_c768_12x12_2th_1wg40wt | 102 +- .../gfs/config.ufs_c768_16x16_2th_2wg40wt | 98 +- parm/config/gfs/config.ufs_emc | 498 ++++ parm/config/gfs/config.upp | 2 +- parm/config/gfs/config.verfozn | 7 +- parm/config/gfs/config.verfrad | 5 +- parm/config/gfs/config.vminmon | 5 +- 
parm/config/gfs/config.wave | 23 +- parm/config/gfs/config.wavepostbndpnt | 2 +- parm/config/gfs/config.wavepostbndpntbll | 2 +- parm/config/gfs/config.wavepostpnt | 2 +- parm/config/gfs/config.wavepostsbs | 2 +- parm/config/gfs/yaml/defaults.yaml | 26 +- parm/config/gfs/yaml/test_ci.yaml | 2 +- parm/gdas/aero_crtm_coeff.yaml | 13 - parm/gdas/aero_crtm_coeff.yaml.j2 | 13 + parm/gdas/aero_jedi_fix.yaml | 11 - parm/gdas/aero_jedi_fix.yaml.j2 | 7 + ...crtm_coeff.yaml => atm_crtm_coeff.yaml.j2} | 0 parm/gdas/atm_jedi_fix.yaml | 7 - parm/gdas/atm_jedi_fix.yaml.j2 | 7 + parm/gdas/land_jedi_fix.yaml | 7 - parm/gdas/snow_jedi_fix.yaml.j2 | 7 + parm/post/oceanice_products.yaml | 75 + parm/post/upp.yaml | 18 +- parm/product/bufr_ij9km.txt | 2115 +++++++++++++++++ parm/product/gefs.0p25.f000.paramlist.a.txt | 39 + parm/product/gefs.0p25.f000.paramlist.b.txt | 522 ++++ parm/product/gefs.0p25.fFFF.paramlist.a.txt | 38 + parm/product/gefs.0p25.fFFF.paramlist.b.txt | 554 +++++ parm/product/gefs.0p50.f000.paramlist.a.txt | 80 + parm/product/gefs.0p50.f000.paramlist.b.txt | 474 ++++ parm/product/gefs.0p50.fFFF.paramlist.a.txt | 87 + parm/product/gefs.0p50.fFFF.paramlist.b.txt | 506 ++++ parm/product/gefs.1p00.f000.paramlist.a.txt | 1 + parm/product/gefs.1p00.f000.paramlist.b.txt | 1 + parm/product/gefs.1p00.fFFF.paramlist.a.txt | 1 + parm/product/gefs.1p00.fFFF.paramlist.b.txt | 1 + parm/product/gefs.2p50.f000.paramlist.a.txt | 23 + parm/product/gefs.2p50.f000.paramlist.b.txt | 530 +++++ parm/product/gefs.2p50.fFFF.paramlist.a.txt | 22 + parm/product/gefs.2p50.fFFF.paramlist.b.txt | 571 +++++ .../gfs.anl.paramlist.a.txt} | 0 .../gfs.f000.paramlist.a.txt} | 0 .../gfs.fFFF.paramlist.a.txt} | 0 .../gfs.fFFF.paramlist.b.txt} | 0 parm/ufs/fix/gfs/atmos.fixed_files.yaml | 136 +- parm/ufs/fix/gfs/land.fixed_files.yaml | 98 +- parm/ufs/fix/gfs/ocean.fixed_files.yaml | 12 +- parm/ufs/fv3/data_table | 1 - parm/ufs/fv3/diag_table | 4 +- parm/ufs/fv3/diag_table_aod | 2 +- 
parm/ufs/gocart/ExtData.other | 36 +- parm/ufs/gocart/SU2G_instance_SU.rc | 4 +- parm/ufs/mom6/MOM_input_template_025 | 902 ------- parm/ufs/mom6/MOM_input_template_050 | 947 -------- parm/ufs/mom6/MOM_input_template_100 | 866 ------- parm/ufs/mom6/MOM_input_template_500 | 592 ----- parm/ufs/ufs.configure.atm.IN | 22 - parm/ufs/ufs.configure.atm_aero.IN | 40 - parm/ufs/ufs.configure.blocked_atm_wav.IN | 41 - parm/ufs/ufs.configure.cpld.IN | 122 - parm/ufs/ufs.configure.cpld_aero.IN | 134 -- parm/ufs/ufs.configure.cpld_aero_outerwave.IN | 151 -- parm/ufs/ufs.configure.cpld_aero_wave.IN | 151 -- parm/ufs/ufs.configure.cpld_outerwave.IN | 139 -- parm/ufs/ufs.configure.cpld_wave.IN | 139 -- parm/ufs/ufs.configure.leapfrog_atm_wav.IN | 41 - parm/wave/at_10m_interp.inp.tmpl | 2 +- parm/wave/ep_10m_interp.inp.tmpl | 2 +- parm/wave/glo_15mxt_interp.inp.tmpl | 6 +- parm/wave/glo_200_interp.inp.tmpl | 12 + parm/wave/glo_30m_interp.inp.tmpl | 6 +- parm/wave/wc_10m_interp.inp.tmpl | 2 +- scripts/exgdas_atmos_chgres_forenkf.sh | 11 +- scripts/exgdas_atmos_gempak_gif_ncdc.sh | 73 +- scripts/exgdas_atmos_nawips.sh | 153 +- scripts/exgdas_atmos_verfozn.sh | 2 +- scripts/exgdas_atmos_verfrad.sh | 17 +- scripts/exgdas_enkf_earc.sh | 8 +- scripts/exgdas_enkf_ecen.sh | 42 +- scripts/exgdas_enkf_fcst.sh | 225 -- scripts/exgdas_enkf_post.sh | 13 +- scripts/exgdas_enkf_select_obs.sh | 4 +- scripts/exgdas_enkf_sfc.sh | 57 +- scripts/exgdas_enkf_update.sh | 57 +- scripts/exgfs_atmos_awips_20km_1p0deg.sh | 10 +- scripts/exgfs_atmos_fbwind.sh | 6 +- scripts/exgfs_atmos_gempak_gif_ncdc_skew_t.sh | 126 +- scripts/exgfs_atmos_gempak_meta.sh | 191 +- scripts/exgfs_atmos_goes_nawips.sh | 132 +- scripts/exgfs_atmos_grib2_special_npoess.sh | 86 +- scripts/exgfs_atmos_grib_awips.sh | 12 +- scripts/exgfs_atmos_nawips.sh | 126 +- scripts/exgfs_atmos_postsnd.sh | 11 +- scripts/exgfs_atmos_wafs_blending_0p25.sh | 298 +++ scripts/exgfs_atmos_wafs_gcip.sh | 242 ++ scripts/exgfs_atmos_wafs_grib.sh | 146 ++ 
scripts/exgfs_atmos_wafs_grib2.sh | 227 ++ scripts/exgfs_atmos_wafs_grib2_0p25.sh | 200 ++ scripts/exgfs_pmgr.sh | 2 +- scripts/exgfs_prdgen_manager.sh | 2 +- scripts/exgfs_wave_init.sh | 25 +- scripts/exgfs_wave_nawips.sh | 3 +- scripts/exgfs_wave_post_gridded_sbs.sh | 16 +- scripts/exgfs_wave_post_pnt.sh | 56 +- scripts/exgfs_wave_prdgen_bulls.sh | 12 +- scripts/exgfs_wave_prdgen_gridded.sh | 7 +- scripts/exgfs_wave_prep.sh | 16 +- scripts/exglobal_archive_emc.sh | 19 +- scripts/exglobal_archive_gsl.sh | 17 +- scripts/exglobal_atmos_analysis.sh | 57 +- scripts/exglobal_atmos_analysis_calc.sh | 19 +- scripts/exglobal_atmos_pmgr.sh | 2 +- scripts/exglobal_atmos_products.sh | 36 +- scripts/exglobal_atmos_sfcanl.sh | 34 +- scripts/exglobal_atmos_tropcy_qc_reloc.sh | 6 +- scripts/exglobal_atmos_vminmon.sh | 8 +- scripts/exglobal_cleanup.sh | 2 +- scripts/exglobal_diag.sh | 4 +- scripts/exglobal_forecast.sh | 48 +- scripts/exglobal_oceanice_products.py | 52 + ..._land_obs.py => exglobal_prep_snow_obs.py} | 16 +- ..._analysis.py => exglobal_snow_analysis.py} | 12 +- scripts/exglobal_stage_ic.sh | 13 +- scripts/run_reg2grb2.sh | 72 - scripts/run_regrid.sh | 27 - sorc/build_all.sh | 121 +- sorc/build_gdas.sh | 15 +- sorc/build_gfs_utils.sh | 10 +- sorc/build_gsi_enkf.sh | 6 +- sorc/build_gsi_monitor.sh | 10 +- sorc/build_gsi_utils.sh | 10 +- sorc/build_ufs.sh | 12 +- sorc/build_ufs_utils.sh | 11 +- sorc/build_upp.sh | 29 +- sorc/build_ww3prepost.sh | 28 +- sorc/gdas.cd | 2 +- sorc/gfs_utils.fd | 2 +- sorc/gsi_enkf.fd | 2 +- sorc/gsi_monitor.fd | 2 +- sorc/gsi_utils.fd | 2 +- sorc/link_workflow.sh | 109 +- sorc/ncl.setup | 12 - sorc/ufs_model.fd | 2 +- sorc/ufs_utils.fd | 2 +- sorc/upp.fd | 1 + sorc/verif-global.fd | 2 +- sorc/wxflow | 2 +- ush/bash_utils.sh | 127 + ush/calcanl_gfs.py | 2 +- ush/detect_machine.sh | 41 +- ush/forecast_det.sh | 7 +- ush/forecast_postdet.sh | 493 ++-- ush/forecast_predet.sh | 160 +- ush/fv3gfs_remap.sh | 118 - ush/gaussian_sfcanl.sh | 39 +- 
ush/getdump.sh | 2 +- ush/getges.sh | 2 +- ush/gfs_bfr2gpk.sh | 2 +- ush/gfs_bufr.sh | 23 +- ush/gfs_bufr_netcdf.sh | 10 +- ush/gfs_sndp.sh | 6 +- ush/gfs_truncate_enkf.sh | 17 +- ush/global_savefits.sh | 2 +- ush/hpssarch_gen_emc.sh | 124 +- ush/hpssarch_gen_gsl.sh | 8 +- ush/icepost.ncl | 382 --- ush/interp_atmos_master.sh | 4 +- ush/interp_atmos_sflux.sh | 4 +- ush/link_crtm_fix.sh | 2 +- ush/load_fv3gfs_modules.sh | 41 +- ush/load_ufsda_modules.sh | 59 +- ush/minmon_xtrct_costs.pl | 5 +- ush/minmon_xtrct_gnorms.pl | 5 +- ush/minmon_xtrct_reduct.pl | 6 +- ush/module-setup.sh | 12 +- ush/oceanice_nc2grib2.sh | 319 +++ ush/ocnpost.ncl | 588 ----- ush/ozn_xtrct.sh | 6 +- ush/parsing_model_configure_DATM.sh | 38 - ush/parsing_model_configure_FV3.sh | 107 +- ush/parsing_namelists_CICE.sh | 434 +--- ush/parsing_namelists_FV3.sh | 157 +- ush/parsing_namelists_MOM6.sh | 140 +- ush/parsing_namelists_WW3.sh | 16 +- ..._configure.sh => parsing_ufs_configure.sh} | 31 +- ush/preamble.sh | 85 +- ush/python/pygfs/task/aero_analysis.py | 29 +- ush/python/pygfs/task/analysis.py | 143 +- ush/python/pygfs/task/atm_analysis.py | 33 +- ush/python/pygfs/task/atmens_analysis.py | 31 +- ush/python/pygfs/task/oceanice_products.py | 337 +++ .../{land_analysis.py => snow_analysis.py} | 86 +- ush/radmon_err_rpt.sh | 5 +- ush/radmon_verf_angle.sh | 7 +- ush/radmon_verf_bcoef.sh | 3 +- ush/radmon_verf_bcor.sh | 3 +- ush/radmon_verf_time.sh | 8 +- ush/rstprod.sh | 2 +- ush/run_mpmd.sh | 2 +- ush/syndat_getjtbul.sh | 15 +- ush/syndat_qctropcy.sh | 56 +- ush/tropcy_relocate.sh | 86 +- ush/tropcy_relocate_extrkr.sh | 6 +- ush/wafs_mkgbl.sh | 152 ++ ush/wave_grib2_sbs.sh | 8 +- ush/wave_grid_interp_sbs.sh | 22 +- ush/wave_grid_moddef.sh | 22 +- ush/wave_outp_cat.sh | 2 +- ush/wave_outp_spec.sh | 8 +- ush/wave_prnc_cur.sh | 10 +- ush/wave_prnc_ice.sh | 6 +- ush/wave_tar.sh | 2 +- versions/build.hera.ver | 2 + versions/build.hercules.ver | 3 + versions/build.jet.ver | 2 + versions/build.orion.ver | 
4 +- versions/build.s4.ver | 2 + versions/build.spack.ver | 9 +- versions/fix.ver | 5 +- versions/run.hera.ver | 7 +- versions/run.hercules.ver | 9 +- versions/run.jet.ver | 2 + versions/run.orion.ver | 9 +- versions/run.s4.ver | 2 + versions/run.spack.ver | 21 +- versions/run.wcoss2.ver | 2 + workflow/applications/applications.py | 4 + workflow/applications/gefs.py | 23 +- workflow/applications/gfs_cycled.py | 49 +- workflow/applications/gfs_forecast_only.py | 11 +- workflow/create_experiment.py | 7 +- workflow/hosts/hera_emc.yaml | 3 + workflow/hosts/hera_gsl.yaml | 6 +- workflow/hosts/hercules.yaml | 3 + workflow/hosts/jet_emc.yaml | 3 + workflow/hosts/jet_gsl.yaml | 3 + workflow/hosts/orion.yaml | 3 + workflow/hosts/wcoss2.yaml | 3 + workflow/rocoto/gefs_tasks.py | 260 +- workflow/rocoto/gfs_tasks.py | 239 +- workflow/rocoto/tasks_emc.py | 58 +- workflow/rocoto/tasks_gsl.py | 62 +- workflow/setup_expt.py | 49 +- 463 files changed, 18026 insertions(+), 17466 deletions(-) create mode 100644 ci/Jenkinsfile create mode 100644 ci/cases/pr/C48mx500_3DVarAOWCDA.yaml create mode 100644 ci/cases/pr/C96C48_ufs_hybatmDA.yaml create mode 100644 ci/cases/pr/C96_atmaerosnowDA.yaml create mode 100644 ci/cases/yamls/atmaerosnowDA_defaults_ci.yaml create mode 100644 ci/cases/yamls/build.yaml rename ci/{platforms => cases/yamls}/gefs_ci_defaults.yaml (100%) rename ci/{platforms => cases/yamls}/gfs_defaults_ci.yaml (100%) create mode 100644 ci/cases/yamls/soca_gfs_defaults_ci.yaml create mode 100644 ci/cases/yamls/ufs_hybatmDA_defaults.ci.yaml create mode 100755 ci/scripts/utils/ci_utils_wrapper.sh create mode 100755 ci/scripts/utils/get_host_case_list.py create mode 100755 ci/scripts/utils/githubpr.py create mode 100755 ci/scripts/utils/publish_logs.py create mode 120000 ci/scripts/utils/wxflow create mode 100755 gempak/ush/gempak_gfs_f000_gif.sh delete mode 100755 gempak/ush/gempak_gfs_f00_gif.sh delete mode 100755 gempak/ush/gempak_gfs_f12_gif.sh delete mode 100755 
gempak/ush/gempak_gfs_f24_gif.sh delete mode 100755 gempak/ush/gempak_gfs_f36_gif.sh delete mode 100755 gempak/ush/gempak_gfs_f48_gif.sh create mode 100755 gempak/ush/gempak_gfs_fhhh_gif.sh delete mode 100755 jobs/JGDAS_ENKF_FCST create mode 100755 jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_ECEN create mode 100755 jobs/JGFS_ATMOS_WAFS create mode 100755 jobs/JGFS_ATMOS_WAFS_BLENDING_0P25 create mode 100755 jobs/JGFS_ATMOS_WAFS_GCIP create mode 100755 jobs/JGFS_ATMOS_WAFS_GRIB2 create mode 100755 jobs/JGFS_ATMOS_WAFS_GRIB2_0P25 create mode 100755 jobs/JGLOBAL_OCEANICE_PRODUCTS rename jobs/{JGLOBAL_PREP_LAND_OBS => JGLOBAL_PREP_SNOW_OBS} (86%) rename jobs/{JGLOBAL_LAND_ANALYSIS => JGLOBAL_SNOW_ANALYSIS} (78%) delete mode 100755 jobs/rocoto/efcs.sh rename jobs/rocoto/{gempakpgrb2spec.sh => gempakgrb2spec.sh} (100%) create mode 100755 jobs/rocoto/oceanice_products.sh create mode 100755 jobs/rocoto/ocnanalecen.sh delete mode 100755 jobs/rocoto/ocnpost.sh rename jobs/rocoto/{preplandobs.sh => prepsnowobs.sh} (73%) rename jobs/rocoto/{landanl.sh => snowanl.sh} (91%) delete mode 100644 modulefiles/module-setup.csh.inc delete mode 100644 modulefiles/module-setup.sh.inc create mode 100644 parm/config/gefs/config.atmos_products rename parm/config/gefs/{config.base.emc.dyn => config.base} (92%) create mode 120000 parm/config/gefs/config.oceanice_products create mode 100644 parm/config/gefs/config.wavepostbndpnt create mode 100644 parm/config/gefs/config.wavepostbndpntbll create mode 100644 parm/config/gefs/config.wavepostpnt create mode 100644 parm/config/gefs/config.wavepostsbs rename parm/config/gfs/{config.base.emc.dyn => config.base} (100%) delete mode 100644 parm/config/gfs/config.landanl create mode 100644 parm/config/gfs/config.oceanice_products create mode 100644 parm/config/gfs/config.ocnanalecen delete mode 100644 parm/config/gfs/config.ocnpost delete mode 100644 parm/config/gfs/config.preplandobs create mode 100644 parm/config/gfs/config.prepsnowobs create mode 100644 
parm/config/gfs/config.snowanl create mode 100644 parm/config/gfs/config.ufs_emc delete mode 100644 parm/gdas/aero_crtm_coeff.yaml create mode 100644 parm/gdas/aero_crtm_coeff.yaml.j2 delete mode 100644 parm/gdas/aero_jedi_fix.yaml create mode 100644 parm/gdas/aero_jedi_fix.yaml.j2 rename parm/gdas/{atm_crtm_coeff.yaml => atm_crtm_coeff.yaml.j2} (100%) delete mode 100644 parm/gdas/atm_jedi_fix.yaml create mode 100644 parm/gdas/atm_jedi_fix.yaml.j2 delete mode 100644 parm/gdas/land_jedi_fix.yaml create mode 100644 parm/gdas/snow_jedi_fix.yaml.j2 create mode 100644 parm/post/oceanice_products.yaml create mode 100644 parm/product/bufr_ij9km.txt create mode 100644 parm/product/gefs.0p25.f000.paramlist.a.txt create mode 100644 parm/product/gefs.0p25.f000.paramlist.b.txt create mode 100644 parm/product/gefs.0p25.fFFF.paramlist.a.txt create mode 100644 parm/product/gefs.0p25.fFFF.paramlist.b.txt create mode 100644 parm/product/gefs.0p50.f000.paramlist.a.txt create mode 100644 parm/product/gefs.0p50.f000.paramlist.b.txt create mode 100644 parm/product/gefs.0p50.fFFF.paramlist.a.txt create mode 100644 parm/product/gefs.0p50.fFFF.paramlist.b.txt create mode 120000 parm/product/gefs.1p00.f000.paramlist.a.txt create mode 120000 parm/product/gefs.1p00.f000.paramlist.b.txt create mode 120000 parm/product/gefs.1p00.fFFF.paramlist.a.txt create mode 120000 parm/product/gefs.1p00.fFFF.paramlist.b.txt create mode 100644 parm/product/gefs.2p50.f000.paramlist.a.txt create mode 100644 parm/product/gefs.2p50.f000.paramlist.b.txt create mode 100644 parm/product/gefs.2p50.fFFF.paramlist.a.txt create mode 100644 parm/product/gefs.2p50.fFFF.paramlist.b.txt rename parm/{post/global_1x1_paramlist_g2.anl => product/gfs.anl.paramlist.a.txt} (100%) rename parm/{post/global_1x1_paramlist_g2.f000 => product/gfs.f000.paramlist.a.txt} (100%) rename parm/{post/global_1x1_paramlist_g2 => product/gfs.fFFF.paramlist.a.txt} (100%) rename parm/{post/global_master-catchup_parmlist_g2 => 
product/gfs.fFFF.paramlist.b.txt} (100%) delete mode 100644 parm/ufs/fv3/data_table delete mode 100644 parm/ufs/mom6/MOM_input_template_025 delete mode 100644 parm/ufs/mom6/MOM_input_template_050 delete mode 100644 parm/ufs/mom6/MOM_input_template_100 delete mode 100644 parm/ufs/mom6/MOM_input_template_500 delete mode 100644 parm/ufs/ufs.configure.atm.IN delete mode 100644 parm/ufs/ufs.configure.atm_aero.IN delete mode 100644 parm/ufs/ufs.configure.blocked_atm_wav.IN delete mode 100644 parm/ufs/ufs.configure.cpld.IN delete mode 100644 parm/ufs/ufs.configure.cpld_aero.IN delete mode 100644 parm/ufs/ufs.configure.cpld_aero_outerwave.IN delete mode 100644 parm/ufs/ufs.configure.cpld_aero_wave.IN delete mode 100644 parm/ufs/ufs.configure.cpld_outerwave.IN delete mode 100644 parm/ufs/ufs.configure.cpld_wave.IN delete mode 100644 parm/ufs/ufs.configure.leapfrog_atm_wav.IN create mode 100755 parm/wave/glo_200_interp.inp.tmpl delete mode 100755 scripts/exgdas_enkf_fcst.sh create mode 100755 scripts/exgfs_atmos_wafs_blending_0p25.sh create mode 100755 scripts/exgfs_atmos_wafs_gcip.sh create mode 100755 scripts/exgfs_atmos_wafs_grib.sh create mode 100755 scripts/exgfs_atmos_wafs_grib2.sh create mode 100755 scripts/exgfs_atmos_wafs_grib2_0p25.sh create mode 100755 scripts/exglobal_oceanice_products.py rename scripts/{exglobal_prep_land_obs.py => exglobal_prep_snow_obs.py} (59%) rename scripts/{exglobal_land_analysis.py => exglobal_snow_analysis.py} (66%) delete mode 100755 scripts/run_reg2grb2.sh delete mode 100755 scripts/run_regrid.sh delete mode 100644 sorc/ncl.setup create mode 160000 sorc/upp.fd create mode 100644 ush/bash_utils.sh delete mode 100755 ush/fv3gfs_remap.sh delete mode 100755 ush/icepost.ncl create mode 100755 ush/oceanice_nc2grib2.sh delete mode 100755 ush/ocnpost.ncl delete mode 100755 ush/parsing_model_configure_DATM.sh rename ush/{ufs_configure.sh => parsing_ufs_configure.sh} (80%) create mode 100644 ush/python/pygfs/task/oceanice_products.py rename 
ush/python/pygfs/task/{land_analysis.py => snow_analysis.py} (89%) create mode 100755 ush/wafs_mkgbl.sh diff --git a/.github/workflows/docs.yaml b/.github/workflows/docs.yaml index 3113c31149..b04c85688b 100644 --- a/.github/workflows/docs.yaml +++ b/.github/workflows/docs.yaml @@ -57,7 +57,7 @@ jobs: if-no-files-found: ignore - name: Comment ReadDocs Link in PR - if: github.event_name == 'pull_request' + if: ${{ github.event_name == 'pull_request' }} uses: actions/github-script@v6 with: script: | diff --git a/.gitignore b/.gitignore index 047313a32f..2b6c835045 100644 --- a/.gitignore +++ b/.gitignore @@ -30,13 +30,11 @@ fix/chem fix/cice fix/cpl fix/datm -fix/gldas fix/gdas fix/gsi fix/lut fix/mom6 fix/orog -fix/reg2grb2 fix/sfc_climo fix/ugwd fix/verif @@ -44,18 +42,28 @@ fix/wave # Ignore parm file symlinks #-------------------------- -parm/config/config.base -parm/gldas +parm/gdas/aero +parm/gdas/atm +parm/gdas/io +parm/gdas/ioda +parm/gdas/snow +parm/gdas/soca parm/monitor parm/post/AEROSOL_LUTS.dat parm/post/nam_micro_lookup.dat parm/post/optics_luts_DUST.dat parm/post/gtg.config.gfs parm/post/gtg_imprintings.txt +parm/post/optics_luts_DUST_nasa.dat +parm/post/optics_luts_NITR_nasa.dat parm/post/optics_luts_SALT.dat +parm/post/optics_luts_SALT_nasa.dat parm/post/optics_luts_SOOT.dat +parm/post/optics_luts_SOOT_nasa.dat parm/post/optics_luts_SUSO.dat +parm/post/optics_luts_SUSO_nasa.dat parm/post/optics_luts_WASO.dat +parm/post/optics_luts_WASO_nasa.dat parm/post/params_grib2_tbl_new parm/post/post_tag_gfs128 parm/post/post_tag_gfs65 @@ -77,6 +85,9 @@ parm/post/postcntrl_gfs_wafs.xml parm/post/postcntrl_gfs_wafs_anl.xml parm/post/postxconfig-NT-GEFS-ANL.txt parm/post/postxconfig-NT-GEFS-F00.txt +parm/post/postxconfig-NT-GEFS-F00-aerosol.txt +parm/post/postxconfig-NT-GEFS-WAFS.txt +parm/post/postxconfig-NT-GEFS-aerosol.txt parm/post/postxconfig-NT-GEFS.txt parm/post/postxconfig-NT-GFS-ANL.txt parm/post/postxconfig-NT-GFS-F00-TWO.txt @@ -90,7 +101,15 @@ 
parm/post/postxconfig-NT-GFS-WAFS.txt parm/post/postxconfig-NT-GFS.txt parm/post/postxconfig-NT-gefs-aerosol.txt parm/post/postxconfig-NT-gefs-chem.txt +parm/post/ocean.csv +parm/post/ice.csv +parm/post/ocnicepost.nml.jinja2 parm/ufs/noahmptable.tbl +parm/ufs/model_configure.IN +parm/ufs/MOM_input_*.IN +parm/ufs/MOM6_data_table.IN +parm/ufs/ice_in.IN +parm/ufs/ufs.configure.*.IN parm/wafs # Ignore sorc and logs folders from externals @@ -123,7 +142,6 @@ sorc/radmon_bcor.fd sorc/radmon_time.fd sorc/rdbfmsua.fd sorc/recentersigp.fd -sorc/reg2grb2.fd sorc/supvit.fd sorc/syndat_getjtbul.fd sorc/syndat_maksynrc.fd @@ -133,24 +151,13 @@ sorc/tocsbufr.fd sorc/upp.fd sorc/vint.fd sorc/webtitle.fd +sorc/ocnicepost.fd # Ignore scripts from externals #------------------------------ # jobs symlinks -jobs/JGFS_ATMOS_WAFS -jobs/JGFS_ATMOS_WAFS_BLENDING -jobs/JGFS_ATMOS_WAFS_BLENDING_0P25 -jobs/JGFS_ATMOS_WAFS_GCIP -jobs/JGFS_ATMOS_WAFS_GRIB2 -jobs/JGFS_ATMOS_WAFS_GRIB2_0P25 # scripts symlinks scripts/exemcsfc_global_sfc_prep.sh -scripts/exgfs_atmos_wafs_blending.sh -scripts/exgfs_atmos_wafs_blending_0p25.sh -scripts/exgfs_atmos_wafs_gcip.sh -scripts/exgfs_atmos_wafs_grib.sh -scripts/exgfs_atmos_wafs_grib2.sh -scripts/exgfs_atmos_wafs_grib2_0p25.sh # ush symlinks ush/chgres_cube.sh ush/emcsfc_ice_blend.sh @@ -166,11 +173,7 @@ ush/global_chgres_driver.sh ush/global_cycle.sh ush/global_cycle_driver.sh ush/jediinc2fv3.py -ush/mkwfsgbl.sh ush/ufsda -ush/wafs_blending.sh -ush/wafs_grib2.regrid.sh -ush/wafs_intdsk.sh ush/finddate.sh ush/make_NTC_file.pl ush/make_ntc_bull.pl diff --git a/.gitmodules b/.gitmodules index ed2f92d45c..2c84837d80 100644 --- a/.gitmodules +++ b/.gitmodules @@ -1,28 +1,31 @@ [submodule "sorc/ufs_model.fd"] - path = sorc/ufs_model.fd - url = https://github.com/NOAA-GSL/ufs-weather-model - ignore = dirty + path = sorc/ufs_model.fd + url = https://github.com/NOAA-GSL/ufs-weather-model + ignore = dirty [submodule "sorc/wxflow"] - path = sorc/wxflow - url = 
https://github.com/NOAA-EMC/wxflow + path = sorc/wxflow + url = https://github.com/NOAA-EMC/wxflow [submodule "sorc/gfs_utils.fd"] - path = sorc/gfs_utils.fd - url = https://github.com/NOAA-EMC/gfs-utils + path = sorc/gfs_utils.fd + url = https://github.com/NOAA-EMC/gfs-utils [submodule "sorc/ufs_utils.fd"] - path = sorc/ufs_utils.fd - url = https://github.com/NOAA-GSL/UFS_UTILS.git + path = sorc/ufs_utils.fd + url = https://github.com/NOAA-GSL/UFS_UTILS.git [submodule "sorc/verif-global.fd"] - path = sorc/verif-global.fd - url = https://github.com/NOAA-EMC/EMC_verif-global.git + path = sorc/verif-global.fd + url = https://github.com/NOAA-EMC/EMC_verif-global.git [submodule "sorc/gsi_enkf.fd"] - path = sorc/gsi_enkf.fd - url = https://github.com/NOAA-EMC/GSI.git + path = sorc/gsi_enkf.fd + url = https://github.com/NOAA-EMC/GSI.git [submodule "sorc/gdas.cd"] - path = sorc/gdas.cd - url = https://github.com/NOAA-EMC/GDASApp.git + path = sorc/gdas.cd + url = https://github.com/NOAA-EMC/GDASApp.git [submodule "sorc/gsi_utils.fd"] - path = sorc/gsi_utils.fd - url = https://github.com/NOAA-EMC/GSI-Utils.git + path = sorc/gsi_utils.fd + url = https://github.com/NOAA-EMC/GSI-Utils.git [submodule "sorc/gsi_monitor.fd"] - path = sorc/gsi_monitor.fd - url = https://github.com/NOAA-EMC/GSI-Monitor.git + path = sorc/gsi_monitor.fd + url = https://github.com/NOAA-EMC/GSI-Monitor.git +[submodule "sorc/upp.fd"] + path = sorc/upp.fd + url = https://github.com/NOAA-EMC/UPP.git diff --git a/.shellcheckrc b/.shellcheckrc index 6d540ba17f..67fabfe157 100644 --- a/.shellcheckrc +++ b/.shellcheckrc @@ -14,3 +14,6 @@ disable=SC1091 # Disable -p -m only applies to deepest directory disable=SC2174 + +# Disable warning of functions in test statements +disable=SC2310 diff --git a/ci/Jenkinsfile b/ci/Jenkinsfile new file mode 100644 index 0000000000..c799d3c488 --- /dev/null +++ b/ci/Jenkinsfile @@ -0,0 +1,268 @@ +def Machine = 'none' +def machine = 'none' +def HOME = 'none' +def caseList = '' 
+// Location of the custom workspaces for each machine in the CI system. They are persistent for each iteration of the PR. +def custom_workspace = [hera: '/scratch1/NCEPDEV/global/CI', orion: '/work2/noaa/stmp/CI/ORION', hercules: '/work2/noaa/stmp/CI/HERCULES'] + +pipeline { + agent { label 'built-in' } + + options { + skipDefaultCheckout() + parallelsAlwaysFailFast() + } + + stages { // This initial stage is used to get the Machine name from the GitHub labels on the PR + // which is used to designate the Nodes in the Jenkins Controller by the agent label + // Each Jenkins Node is connected to said machine via a Java agent over an ssh tunnel + + stage('Get Machine') { + agent { label 'built-in' } + steps { + script { + machine = 'none' + for (label in pullRequest.labels) { + echo "Label: ${label}" + if ((label.matches('CI-Hera-Ready'))) { + machine = 'hera' + } else if ((label.matches('CI-Orion-Ready'))) { + machine = 'orion' + } else if ((label.matches('CI-Hercules-Ready'))) { + machine = 'hercules' + } + } // creating a second Machine variable with the first letter capitalized + // because the first letter of the machine name is capitalized in the GitHub labels + Machine = machine[0].toUpperCase() + machine.substring(1) + } + } + } + + stage('Get Common Workspace') { + agent { label "${machine}-emc" } + steps { + script { + ws("${custom_workspace[machine]}/${env.CHANGE_ID}") { + properties([parameters([[$class: 'NodeParameterDefinition', allowedSlaves: ['built-in', 'Hera-EMC', 'Orion-EMC'], defaultSlaves: ['built-in'], name: '', nodeEligibility: [$class: 'AllNodeEligibility'], triggerIfResult: 'allCases']])]) + HOME = "${WORKSPACE}" + sh(script: "mkdir -p ${HOME}/RUNTESTS;rm -Rf ${HOME}/RUNTESTS/error.logs") + pullRequest.addLabel("CI-${Machine}-Building") + if (pullRequest.labels.any { value -> value.matches("CI-${Machine}-Ready") }) { + pullRequest.removeLabel("CI-${Machine}-Ready") + } + } + echo "Building and running on ${Machine} in directory ${HOME}" + } + } + } + 
+ stage('Build System') { + matrix { + agent { label "${machine}-emc" } + //options { + // throttle(['global_matrix_build']) + //} + axes { + axis { + name 'system' + values 'gfs', 'gefs' + } + } + stages { + stage('build system') { + steps { + script { + def HOMEgfs = "${HOME}/${system}" // local HOMEgfs is used to build the system on a per-system basis under the common workspace HOME + sh(script: "mkdir -p ${HOMEgfs}") + ws(HOMEgfs) { + if (fileExists("${HOMEgfs}/sorc/BUILT_semaphor")) { // if the system is already built, skip the build in the case of re-runs + sh(script: "cat ${HOMEgfs}/sorc/BUILT_semaphor", returnStdout: true).trim() // TODO: add user-configurable control to manage the build semaphore + checkout scm + dir('sorc') { + sh(script: './link_workflow.sh') + } + } else { + checkout scm + def gist_url = "" + def error_logs = "" + def error_logs_message = "" + def builds_file = readYaml file: 'ci/cases/yamls/build.yaml' + def build_args_list = builds_file['builds'] + def build_args = build_args_list[system].join(' ').trim().replaceAll('null', '') + dir("${HOMEgfs}/sorc") { + try { + sh(script: "${build_args}") + } catch (Exception error_build) { + echo "Failed to build system: ${error_build.getMessage()}" + if ( fileExists("logs/error.logs") ) { + def fileContent = readFile 'logs/error.logs' + def lines = fileContent.readLines() + for (line in lines) { + echo "archiving: ${line}" + if (fileExists("${line}") && readFile("${line}").length() > 0 ){ + try { + archiveArtifacts artifacts: "${line}", fingerprint: true + error_logs = error_logs + "${HOMEgfs}/sorc/${line} " + error_logs_message = error_logs_message + "${HOMEgfs}/sorc/${line}\n" + } + catch (Exception error_arch) { echo "Failed to archive error log ${line}: ${error_arch.getMessage()}" } + } + } + repo_url=sh(script: "${HOMEgfs}/ci/scripts/utils/publish_logs.py --file ${error_logs} --repo PR_BUILD_${env.CHANGE_ID}", returnStdout: true).trim() + gist_url=sh(script: 
"${HOMEgfs}/ci/scripts/utils/publish_logs.py --file ${error_logs} --gist PR_BUILD_${env.CHANGE_ID}", returnStdout: true).trim() + try { + pullRequest.comment("Build failed on **${Machine}** with error logs:${error_logs_message}\n\nFollow link here to view the contents of the above file(s): [(link)](${gist_url})") + } catch (Exception error_comment) { + echo "Failed to comment on PR: ${error_comment.getMessage()}" + } + error("Failed to build system on ${Machine}") + } + } + sh(script: './link_workflow.sh') + // sh(script: "echo ${HOMEgfs} > BUILT_semaphor") + } + } + if (env.CHANGE_ID && system == 'gfs') { + try { + if (pullRequest.labels.any { value -> value.matches("CI-${Machine}-Building") }) { + pullRequest.removeLabel("CI-${Machine}-Building") + } + pullRequest.addLabel("CI-${Machine}-Running") + } catch (Exception e) { + echo "Failed to update label from Building to Running: ${e.getMessage()}" + } + } + if (system == 'gfs') { + caseList = sh(script: "${HOMEgfs}/ci/scripts/utils/get_host_case_list.py ${machine}", returnStdout: true).trim().split() + } + } + } + } + } + } + } + } + + stage('Run Tests') { + matrix { + agent { label "${machine}-emc" } + axes { + axis { + name 'Case' + // TODO add dynamic list of cases from env vars (needs additional plugins) + values 'C48C48_ufs_hybatmDA', 'C48_ATM', 'C48_S2SW', 'C48_S2SWA_gefs', 'C48mx500_3DVarAOWCDA', 'C96C48_hybatmDA', 'C96_atm3DVar', 'C96_atmaerosnowDA' + } + } + stages { + + stage('Create Experiments') { + when { + expression { return caseList.contains(Case) } + } + steps { + script { + sh(script: "sed -n '/{.*}/!p' ${HOME}/gfs/ci/cases/pr/${Case}.yaml > ${HOME}/gfs/ci/cases/pr/${Case}.yaml.tmp") + def yaml_case = readYaml file: "${HOME}/gfs/ci/cases/pr/${Case}.yaml.tmp" + system = yaml_case.experiment.system + def HOMEgfs = "${HOME}/${system}" // local HOMEgfs is used to populate the XML on a per-system basis + env.RUNTESTS = "${HOME}/RUNTESTS" + sh(script: "${HOMEgfs}/ci/scripts/utils/ci_utils_wrapper.sh 
create_experiment ${HOMEgfs}/ci/cases/pr/${Case}.yaml") + } + } + } + + stage('Run Experiments') { + when { + expression { return caseList.contains(Case) } + } + steps { + script { + HOMEgfs = "${HOME}/gfs" // common HOMEgfs is used to launch the scripts that run the experiments + pslot = sh(script: "${HOMEgfs}/ci/scripts/utils/ci_utils_wrapper.sh get_pslot ${HOME}/RUNTESTS ${Case}", returnStdout: true).trim() + try { + sh(script: "${HOMEgfs}/ci/scripts/run-check_ci.sh ${HOME} ${pslot}") + } catch (Exception error_experment) { + sh(script: "${HOMEgfs}/ci/scripts/utils/ci_utils_wrapper.sh cancel_all_batch_jobs ${HOME}/RUNTESTS") + ws(HOME) { + def error_logs = "" + def error_logs_message = "" + if (fileExists("RUNTESTS/error.logs")) { + def fileContent = readFile 'RUNTESTS/error.logs' + def lines = fileContent.readLines() + for (line in lines) { + echo "archiving: ${line}" + if (fileExists("${HOME}/${line}") && readFile("${HOME}/${line}").length() > 0) { + try { + archiveArtifacts artifacts: "${line}", fingerprint: true + error_logs = error_logs + "${HOME}/${line} " + error_logs_message = error_logs_message + "${HOME}/${line}\n" + } catch (Exception error_arch) { + echo "Failed to archive error log ${line}: ${error_arch.getMessage()}" + } + } + } + repo_url = sh(script: "${HOMEgfs}/ci/scripts/utils/publish_logs.py --file ${error_logs} --repo PR_${env.CHANGE_ID}", returnStdout: true).trim() + gist_url = sh(script: "${HOMEgfs}/ci/scripts/utils/publish_logs.py --file ${error_logs} --gist PR_${env.CHANGE_ID}", returnStdout: true).trim() + try { + pullRequest.comment("Experiment ${Case} failed on ${Machine} with error logs: ${error_logs_message}\n\nFollow link here to view the contents of the above file(s): [(link)](${gist_url})") + } catch (Exception error_comment) { + echo "Failed to comment on PR: ${error_comment.getMessage()}" + } + } else { + echo "No error logs found for failed cases in $HOME/RUNTESTS/error.logs" + } + error("Failed to run experiments ${Case} on 
${Machine}") + } + } + } + } + } + } + } + } + } + + post { + always { + script { + if(env.CHANGE_ID) { + try { + for (label in pullRequest.labels) { + if (label.contains("${Machine}")) { + pullRequest.removeLabel(label) + } + } + } catch (Exception e) { + echo "Failed to remove labels: ${e.getMessage()}" + } + } + } + } + success { + script { + if(env.CHANGE_ID) { + try { + pullRequest.addLabel("CI-${Machine}-Passed") + def timestamp = new Date().format('MM dd HH:mm:ss', TimeZone.getTimeZone('America/New_York')) + pullRequest.comment("**CI SUCCESS** ${Machine} at ${timestamp}\n\nBuilt and ran in directory `${HOME}`") + } catch (Exception e) { + echo "Failed to add success label or comment: ${e.getMessage()}" + } + } + } + } + failure { + script { + if(env.CHANGE_ID) { + try { + pullRequest.addLabel("CI-${Machine}-Failed") + def timestamp = new Date().format('MM dd HH:mm:ss', TimeZone.getTimeZone('America/New_York')) + pullRequest.comment("**CI FAILED** ${Machine} at ${timestamp}
Built and ran in directory `${HOME}`") + } catch (Exception e) { + echo "Failed to add failure label or comment: ${e.getMessage()}" + } + } + } + } + } +} diff --git a/ci/cases/pr/C48_ATM.yaml b/ci/cases/pr/C48_ATM.yaml index 39412e8aeb..79706556e6 100644 --- a/ci/cases/pr/C48_ATM.yaml +++ b/ci/cases/pr/C48_ATM.yaml @@ -10,4 +10,4 @@ arguments: expdir: {{ 'RUNTESTS' | getenv }}/EXPDIR idate: 2021032312 edate: 2021032312 - yaml: {{ HOMEgfs }}/ci/platforms/gfs_defaults_ci.yaml + yaml: {{ HOMEgfs }}/ci/cases/yamls/gfs_defaults_ci.yaml diff --git a/ci/cases/pr/C48_S2SW.yaml b/ci/cases/pr/C48_S2SW.yaml index 2aba42f562..6367564514 100644 --- a/ci/cases/pr/C48_S2SW.yaml +++ b/ci/cases/pr/C48_S2SW.yaml @@ -11,4 +11,4 @@ arguments: expdir: {{ 'RUNTESTS' | getenv }}/EXPDIR idate: 2021032312 edate: 2021032312 - yaml: {{ HOMEgfs }}/ci/platforms/gfs_defaults_ci.yaml + yaml: {{ HOMEgfs }}/ci/cases/yamls/gfs_defaults_ci.yaml diff --git a/ci/cases/pr/C48_S2SWA_gefs.yaml b/ci/cases/pr/C48_S2SWA_gefs.yaml index d68360bf44..310d0ea615 100644 --- a/ci/cases/pr/C48_S2SWA_gefs.yaml +++ b/ci/cases/pr/C48_S2SWA_gefs.yaml @@ -15,4 +15,5 @@ arguments: expdir: {{ 'RUNTESTS' | getenv }}/EXPDIR idate: 2021032312 edate: 2021032312 - yaml: {{ HOMEgfs }}/ci/platforms/gefs_ci_defaults.yaml + yaml: {{ HOMEgfs }}/ci/cases/yamls/gefs_ci_defaults.yaml + diff --git a/ci/cases/pr/C48mx500_3DVarAOWCDA.yaml b/ci/cases/pr/C48mx500_3DVarAOWCDA.yaml new file mode 100644 index 0000000000..d9156e38f3 --- /dev/null +++ b/ci/cases/pr/C48mx500_3DVarAOWCDA.yaml @@ -0,0 +1,23 @@ +experiment: + system: gfs + mode: cycled + +arguments: + pslot: {{ 'pslot' | getenv }} + app: S2S + resdetatmos: 48 + resdetocean: 5.0 + comroot: {{ 'RUNTESTS' | getenv }}/COMROOT + expdir: {{ 'RUNTESTS' | getenv }}/EXPDIR + icsdir: {{ 'ICSDIR_ROOT' | getenv }}/C48mx500 + idate: 2021032412 + edate: 2021032418 + nens: 0 + gfs_cyc: 0 + start: warm + yaml: {{ HOMEgfs }}/ci/cases/yamls/soca_gfs_defaults_ci.yaml + +skip_ci_on_hosts: + - orion 
+ - hera + - hercules diff --git a/ci/cases/pr/C96C48_hybatmDA.yaml b/ci/cases/pr/C96C48_hybatmDA.yaml index be35283cff..d08374d4e0 100644 --- a/ci/cases/pr/C96C48_hybatmDA.yaml +++ b/ci/cases/pr/C96C48_hybatmDA.yaml @@ -16,4 +16,4 @@ arguments: nens: 2 gfs_cyc: 1 start: cold - yaml: {{ HOMEgfs }}/ci/platforms/gfs_defaults_ci.yaml + yaml: {{ HOMEgfs }}/ci/cases/yamls/gfs_defaults_ci.yaml diff --git a/ci/cases/pr/C96C48_ufs_hybatmDA.yaml b/ci/cases/pr/C96C48_ufs_hybatmDA.yaml new file mode 100644 index 0000000000..da68b0f86c --- /dev/null +++ b/ci/cases/pr/C96C48_ufs_hybatmDA.yaml @@ -0,0 +1,23 @@ +experiment: + system: gfs + mode: cycled + +arguments: + pslot: {{ 'pslot' | getenv }} + app: ATM + resdetatmos: 96 + resensatmos: 48 + comroot: {{ 'RUNTESTS' | getenv }}/COMROOT + expdir: {{ 'RUNTESTS' | getenv }}/EXPDIR + icsdir: {{ 'ICSDIR_ROOT' | getenv }}/C96C48 + idate: 2024022318 + edate: 2024022400 + nens: 2 + gfs_cyc: 1 + start: warm + yaml: {{ HOMEgfs }}/ci/cases/yamls/ufs_hybatmDA_defaults.ci.yaml + +skip_ci_on_hosts: + - hera + - orion + - hercules diff --git a/ci/cases/pr/C96_atm3DVar.yaml b/ci/cases/pr/C96_atm3DVar.yaml index dee1525d80..d992938f7f 100644 --- a/ci/cases/pr/C96_atm3DVar.yaml +++ b/ci/cases/pr/C96_atm3DVar.yaml @@ -14,4 +14,4 @@ arguments: nens: 0 gfs_cyc: 1 start: cold - yaml: {{ HOMEgfs }}/ci/platforms/gfs_defaults_ci.yaml + yaml: {{ HOMEgfs }}/ci/cases/yamls/gfs_defaults_ci.yaml diff --git a/ci/cases/pr/C96_atmaerosnowDA.yaml b/ci/cases/pr/C96_atmaerosnowDA.yaml new file mode 100644 index 0000000000..7e22955a37 --- /dev/null +++ b/ci/cases/pr/C96_atmaerosnowDA.yaml @@ -0,0 +1,21 @@ +experiment: + system: gfs + mode: cycled + +arguments: + pslot: {{ 'pslot' | getenv }} + app: ATMA + resdetatmos: 96 + comroot: {{ 'RUNTESTS' | getenv }}/COMROOT + expdir: {{ 'RUNTESTS' | getenv }}/EXPDIR + icsdir: {{ 'ICSDIR_ROOT' | getenv }}/C96C48 + idate: 2021122012 + edate: 2021122100 + nens: 0 + gfs_cyc: 1 + start: cold + yaml: {{ HOMEgfs 
}}/ci/cases/yamls/atmaerosnowDA_defaults_ci.yaml + +skip_ci_on_hosts: + - orion + - hercules diff --git a/ci/cases/weekly/C384C192_hybatmda.yaml b/ci/cases/weekly/C384C192_hybatmda.yaml index a4eae7d9a1..131ada95d5 100644 --- a/ci/cases/weekly/C384C192_hybatmda.yaml +++ b/ci/cases/weekly/C384C192_hybatmda.yaml @@ -16,4 +16,4 @@ arguments: nens: 2 gfs_cyc: 1 start: cold - yaml: {{ HOMEgfs }}/ci/platforms/gfs_defaults_ci.yaml + yaml: {{ HOMEgfs }}/ci/cases/yamls/gfs_defaults_ci.yaml diff --git a/ci/cases/weekly/C384_S2SWA.yaml b/ci/cases/weekly/C384_S2SWA.yaml index 8e2c043a4c..7bbdc44671 100644 --- a/ci/cases/weekly/C384_S2SWA.yaml +++ b/ci/cases/weekly/C384_S2SWA.yaml @@ -6,9 +6,9 @@ arguments: pslot: {{ 'pslot' | getenv }} app: S2SWA resdetatmos: 384 - resdetocean: 0.5 + resdetocean: 0.25 comroot: {{ 'RUNTESTS' | getenv }}/COMROOT expdir: {{ 'RUNTESTS' | getenv }}/EXPDIR idate: 2016070100 edate: 2016070100 - yaml: {{ HOMEgfs }}/ci/platforms/gfs_defaults_ci.yaml + yaml: {{ HOMEgfs }}/ci/cases/yamls/gfs_defaults_ci.yaml diff --git a/ci/cases/weekly/C384_atm3DVar.yaml b/ci/cases/weekly/C384_atm3DVar.yaml index 479d731b25..40487f3b47 100644 --- a/ci/cases/weekly/C384_atm3DVar.yaml +++ b/ci/cases/weekly/C384_atm3DVar.yaml @@ -16,4 +16,4 @@ arguments: nens: 0 gfs_cyc: 1 start: cold - yaml: {{ HOMEgfs }}/ci/platforms/gfs_defaults_ci.yaml + yaml: {{ HOMEgfs }}/ci/cases/yamls/gfs_defaults_ci.yaml diff --git a/ci/cases/yamls/atmaerosnowDA_defaults_ci.yaml b/ci/cases/yamls/atmaerosnowDA_defaults_ci.yaml new file mode 100644 index 0000000000..417525742e --- /dev/null +++ b/ci/cases/yamls/atmaerosnowDA_defaults_ci.yaml @@ -0,0 +1,6 @@ +defaults: + !INC {{ HOMEgfs }}/parm/config/gfs/yaml/defaults.yaml +base: + DOIAU: "NO" + DO_JEDISNOWDA: "YES" + ACCOUNT: {{ 'SLURM_ACCOUNT' | getenv }} diff --git a/ci/cases/yamls/build.yaml b/ci/cases/yamls/build.yaml new file mode 100644 index 0000000000..87fae42584 --- /dev/null +++ b/ci/cases/yamls/build.yaml @@ -0,0 +1,3 @@ +builds: + - 
gefs: './build_all.sh -k' + - gfs: './build_all.sh -kwgu' diff --git a/ci/platforms/gefs_ci_defaults.yaml b/ci/cases/yamls/gefs_ci_defaults.yaml similarity index 100% rename from ci/platforms/gefs_ci_defaults.yaml rename to ci/cases/yamls/gefs_ci_defaults.yaml diff --git a/ci/platforms/gfs_defaults_ci.yaml b/ci/cases/yamls/gfs_defaults_ci.yaml similarity index 100% rename from ci/platforms/gfs_defaults_ci.yaml rename to ci/cases/yamls/gfs_defaults_ci.yaml diff --git a/ci/cases/yamls/soca_gfs_defaults_ci.yaml b/ci/cases/yamls/soca_gfs_defaults_ci.yaml new file mode 100644 index 0000000000..126637cd86 --- /dev/null +++ b/ci/cases/yamls/soca_gfs_defaults_ci.yaml @@ -0,0 +1,5 @@ +defaults: + !INC {{ HOMEgfs }}/parm/config/gfs/yaml/defaults.yaml +base: + ACCOUNT: {{ 'SLURM_ACCOUNT' | getenv }} + DO_JEDIOCNVAR: "YES" diff --git a/ci/cases/yamls/ufs_hybatmDA_defaults.ci.yaml b/ci/cases/yamls/ufs_hybatmDA_defaults.ci.yaml new file mode 100644 index 0000000000..1075f55b63 --- /dev/null +++ b/ci/cases/yamls/ufs_hybatmDA_defaults.ci.yaml @@ -0,0 +1,20 @@ +defaults: + !INC {{ HOMEgfs }}/parm/config/gfs/yaml/defaults.yaml +base: + DOIAU: "NO" + DO_JEDIATMVAR: "YES" + DO_JEDIATMENS: "YES" + ACCOUNT: {{ 'SLURM_ACCOUNT' | getenv }} +atmanl: + LAYOUT_X_ATMANL: 4 + LAYOUT_Y_ATMANL: 4 +atmensanl: + LAYOUT_X_ATMENSANL: 4 + LAYOUT_Y_ATMENSANL: 4 +esfc: + DONST: "NO" +nsst: + NST_MODEL: "1" +sfcanl: + DONST: "NO" + diff --git a/ci/scripts/check_ci.sh b/ci/scripts/check_ci.sh index cda2d4e9f2..4ff7eefd26 100755 --- a/ci/scripts/check_ci.sh +++ b/ci/scripts/check_ci.sh @@ -8,7 +8,7 @@ set -eux # to run from within a cron job in the CI Managers account ##################################################################################### -ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." >/dev/null 2>&1 && pwd )" +HOMEgfs="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." 
>/dev/null 2>&1 && pwd )" scriptname=$(basename "${BASH_SOURCE[0]}") echo "Begin ${scriptname} at $(date -u)" || true export PS4='+ $(basename ${BASH_SOURCE})[${LINENO}]' @@ -20,11 +20,11 @@ REPO_URL="https://github.com/NOAA-EMC/global-workflow.git" # Set up runtime environment varibles for accounts on supproted machines ######################################################################### -source "${ROOT_DIR}/ush/detect_machine.sh" +source "${HOMEgfs}/ush/detect_machine.sh" case ${MACHINE_ID} in hera | orion | hercules) echo "Running Automated Testing on ${MACHINE_ID}" - source "${ROOT_DIR}/ci/platforms/config.${MACHINE_ID}" + source "${HOMEgfs}/ci/platforms/config.${MACHINE_ID}" ;; *) echo "Unsupported platform. Exiting with error." @@ -32,9 +32,10 @@ case ${MACHINE_ID} in ;; esac set +x -source "${ROOT_DIR}/ush/module-setup.sh" -source "${ROOT_DIR}/ci/scripts/utils/ci_utils.sh" -module use "${ROOT_DIR}/modulefiles" +export HOMEgfs +source "${HOMEgfs}/ush/module-setup.sh" +source "${HOMEgfs}/ci/scripts/utils/ci_utils.sh" +module use "${HOMEgfs}/modulefiles" module load "module_gwsetup.${MACHINE_ID}" module list set -x @@ -57,7 +58,7 @@ pr_list_dbfile="${GFS_CI_ROOT}/open_pr_list.db" pr_list="" if [[ -f "${pr_list_dbfile}" ]]; then - pr_list=$("${ROOT_DIR}/ci/scripts/pr_list_database.py" --dbfile "${pr_list_dbfile}" --display | grep -v Failed | grep Running | awk '{print $1}') || true + pr_list=$("${HOMEgfs}/ci/scripts/pr_list_database.py" --dbfile "${pr_list_dbfile}" --display | grep -v Failed | grep Running | awk '{print $1}') || true fi if [[ -z "${pr_list+x}" ]]; then echo "no PRs open and ready to run cases on .. 
exiting" @@ -89,13 +90,13 @@ for pr in ${pr_list}; do sed -i "1 i\`\`\`" "${output_ci}" sed -i "1 i\All CI Test Cases Passed on ${MACHINE_ID^}:" "${output_ci}" "${GH}" pr comment "${pr}" --repo "${REPO_URL}" --body-file "${output_ci}" - "${ROOT_DIR}/ci/scripts/pr_list_database.py" --remove_pr "${pr}" --dbfile "${pr_list_dbfile}" + "${HOMEgfs}/ci/scripts/pr_list_database.py" --remove_pr "${pr}" --dbfile "${pr_list_dbfile}" # Check to see if this PR that was opened by the weekly tests and if so close it if it passed on all platforms weekly_labels=$(${GH} pr view "${pr}" --repo "${REPO_URL}" --json headRefName,labels,author --jq 'select(.author.login | contains("emcbot")) | select(.headRefName | contains("weekly_ci")) | .labels[].name ') || true if [[ -n "${weekly_labels}" ]]; then - num_platforms=$(find "${ROOT_DIR}/ci/platforms" -type f -name "config.*" | wc -l) + num_platforms=$(find "${HOMEgfs}/ci/platforms" -type f -name "config.*" | wc -l) passed=0 - for platforms in "${ROOT_DIR}"/ci/platforms/config.*; do + for platforms in "${HOMEgfs}"/ci/platforms/config.*; do machine=$(basename "${platforms}" | cut -d. 
-f2) if [[ "${weekly_labels}" == *"CI-${machine^}-Passed"* ]]; then ((passed=passed+1)) @@ -139,7 +140,7 @@ for pr in ${pr_list}; do } >> "${output_ci}" sed -i "1 i\`\`\`" "${output_ci}" "${GH}" pr comment "${pr}" --repo "${REPO_URL}" --body-file "${output_ci}" - "${ROOT_DIR}/ci/scripts/pr_list_database.py" --remove_pr "${pr}" --dbfile "${pr_list_dbfile}" + "${HOMEgfs}/ci/scripts/pr_list_database.py" --remove_pr "${pr}" --dbfile "${pr_list_dbfile}" for kill_cases in "${pr_dir}/RUNTESTS/"*; do pslot=$(basename "${kill_cases}") cancel_slurm_jobs "${pslot}" diff --git a/ci/scripts/clone-build_ci.sh b/ci/scripts/clone-build_ci.sh index 798c98bf50..989afabb80 100755 --- a/ci/scripts/clone-build_ci.sh +++ b/ci/scripts/clone-build_ci.sh @@ -74,7 +74,7 @@ set +e source "${HOMEgfs}/ush/module-setup.sh" export BUILD_JOBS=8 rm -rf log.build -./build_all.sh -gu >> log.build 2>&1 +./build_all.sh -guw >> log.build 2>&1 build_status=$? DATE=$(date +'%D %r') diff --git a/ci/scripts/driver.sh b/ci/scripts/driver.sh index 5fc13ea524..f37b5e3f2e 100755 --- a/ci/scripts/driver.sh +++ b/ci/scripts/driver.sh @@ -47,12 +47,15 @@ esac ###################################################### # setup runtime env for correct python install and git ###################################################### +HOMEgfs=${ROOT_DIR} +export HOMEgfs set +x source "${ROOT_DIR}/ci/scripts/utils/ci_utils.sh" source "${ROOT_DIR}/ush/module-setup.sh" module use "${ROOT_DIR}/modulefiles" module load "module_gwsetup.${MACHINE_ID}" set -x +unset HOMEgfs ############################################################ # query repo and get list of open PRs with tags {machine}-CI diff --git a/ci/scripts/pr_list_database.py b/ci/scripts/pr_list_database.py index 224aabd361..f525d64987 100755 --- a/ci/scripts/pr_list_database.py +++ b/ci/scripts/pr_list_database.py @@ -2,154 +2,126 @@ import sys import os -from pathlib import Path -from argparse import ArgumentParser, ArgumentDefaultsHelpFormatter, REMAINDER, ZERO_OR_MORE 
-import sqlite3 +from wxflow import SQLiteDB, SQLiteDBError +from argparse import ArgumentParser, ArgumentDefaultsHelpFormatter, REMAINDER def full_path(string): """ - Gets the absolute path of the given file and confirms the directory exists + full_path Get the absolute path of a file or directory. Parameters ---------- string : str - Path to a file + The relative path of the file or directory. Returns - -------- + ------- str - Absolute path of input path + The absolute path of the file or directory. Raises - ------- + ------ NotADirectoryError - If the target directory for the file does not exist. - + If the provided string does not represent a valid file or directory. """ - if os.path.isfile(string) or os.path.isdir(os.path.dirname(string)): return os.path.abspath(string) else: raise NotADirectoryError(string) -def sql_connection(filename: os.path) -> sqlite3.Connection: +def create_table(db: SQLiteDB): """ - Returns an Sqlite3 Cursor object from a given path to a sqlite3 database file + Create a new table in a database. Parameters ---------- - filename : Path - Full path to a sqlite3 database file - - Returns - ------- - sqlite3.Connection - Sqlite3 Connection object for updating table - + db : SQLiteDB + The database to create. """ - try: - return sqlite3.connect(filename) - except sqlite3.Error: - print(sqlite3.Error) - sys.exit(-1) + db.create_table('pr_list', ['pr INTEGER PRIMARY KEY UNIQUE', 'state TEXT', 'status TEXT', 'reset_id INTEGER', 'cases TEXT']) -def sql_table(obj: sqlite3.Cursor) -> None: +def add_pr(db: SQLiteDB, pr: str): """ - Creates the initial sqlite3 table for PR states and status + Add a pull request to the database. 
Parameters ---------- - obj : sqlite3.Cursor - Cursor object for Sqlite3 - - """ - - obj.execute("CREATE TABLE processing(pr integer PRIMARY KEY, state text, status text, reset_id integer, cases text)") - - -def sql_insert(obj: sqlite3.Cursor, entities: list) -> None: + ci_database : SQLiteDB + The database to add the pull request to. + pr : str + The pull request to add. """ - Inserts a new row in sqlite3 table with PR, state, and status + entities = (pr, 'Open', 'Ready', 0, 'ci_repo') + try: + db.insert_data('pr_list', entities) + except (SQLiteDBError.IntegrityError) as e: + if 'unique' in str(e).lower(): + print(f"pr {pr} already is in list: nothing added") - Parameters - ---------- - obj : sqlite3.Cursor - Cursor object for Sqlite3 - entities : list - A list of four string values that go into sqlite table (pr, state, status, reset_id, cases) - pr: pull request number - state: The new value for the state (Open, Closed) - status: The new value for the status (Ready, Running, Failed) - reset_id: The value for number of times reset_id to Ready - cases: String containing case selection information +def update_pr(db: SQLiteDB, args): """ - - obj.execute('INSERT INTO processing(pr, state, status, reset_id, cases) VALUES(?, ?, ?, ?, ?)', entities) - - -def sql_update(obj: sqlite3.Cursor, pr: str, updates: dict) -> None: - """Updates table for a given pr with new values for state and status + Update a pull request in the database. Parameters ---------- - obj : sqlite.sql_connection - sqlite3 Cursor Object - pr : str - The given pr number to update in the table - updates : dict - Dictionary of values to update for a given PR to include by postion - state, The new value for the state (Open, Closed) - status, The new value for the status (Ready, Running, Failed) - reset_id, The value for number of times reset_id to Ready - cases, Information regarding which cases are used (i.e. self PR) - + db : SQLiteDB + The database to update the pull request in. 
+ args : argparse.Namespace + The command line arguments. + """ + if len(args.update_pr) < 2: + print("update_pr must have at least one value to update") + sys.exit(0) update_list = ['state', 'status', 'reset_id', 'cases'] - rows = sql_fetch(obj) - for value in updates: + for value in args.update_pr[1:]: update = update_list.pop(0) - obj.execute(f'UPDATE processing SET "{update}" = "{value}" WHERE pr = {pr}') - + db.update_data('pr_list', update, value, 'pr', args.update_pr[0]) -def sql_fetch(obj: sqlite3.Cursor) -> list: - """ Gets list of all rows in table - - Parameters - ---------- - obj : sqlite.sql_connection - sqlite3 Cursor Object +def display_db(db, display): """ - - obj.execute('SELECT * FROM processing') - return obj.fetchall() - - -def sql_remove(obj: sqlite3.Cursor, pr: str) -> None: - """ Removes the row from table with given pr number + Display the database. Parameters ---------- - obj : sqlite.sql_connection - sqlite3 Connection Object - pr : str - pr number acting as key for removing the row with in it + db : SQLiteDB + The database to display. + display : list + The command line argument values. + Returns + ------- + list + The rows of the database. """ + values = [] + if len(display) == 1: + rows = db.fetch_data('pr_list', ['pr', 'state', 'status', 'reset_id', 'cases'], f'pr = {display[0]}') + else: + rows = db.fetch_data('pr_list', ['pr', 'state', 'status', 'reset_id', 'cases']) + for row in rows: + values.append(' '.join(map(str, row))) - obj.execute(f'DELETE FROM processing WHERE pr = {pr}').rowcount + return values def input_args(): + """ + Parse command line arguments. + Returns + ------- + argparse.Namespace + The parsed command line arguments. 
""" + description = """Arguments for creating and updating db file for pr states + """ parser = ArgumentParser(description=description, formatter_class=ArgumentDefaultsHelpFormatter) @@ -160,7 +132,6 @@ def input_args(): parser.add_argument('--update_pr', nargs=REMAINDER, metavar=('pr', 'state', 'status', 'reset_id', 'cases'), help='updates state and status of a given pr', required=False) parser.add_argument('--display', nargs='*', help='output pr table', required=False) - args = parser.parse_args() return args @@ -174,42 +145,19 @@ def input_args(): print(f'Error: {args.dbfile} does not exsist') sys.exit(-1) - con = sql_connection(args.dbfile) - obj = con.cursor() + ci_database = SQLiteDB(args.dbfile) + ci_database.connect() if args.create: - sql_table(obj) - + create_table(ci_database) if args.add_pr: - rows = sql_fetch(obj) - for row in rows: - if str(row[0]) == str(args.add_pr[0]): - print(f"pr {row[0]} already is in list: nothing added") - sys.exit(0) - - entities = (args.add_pr[0], 'Open', 'Ready', 0, 'ci_repo') - sql_insert(obj, entities) - + add_pr(ci_database, args.add_pr[0]) if args.update_pr: - if len(args.update_pr) < 2: - print(f"update_pr must have at least one vaule to update") - sys.exit(0) - pr = args.update_pr[0] - - sql_update(obj, pr, args.update_pr[1:]) - + update_pr(ci_database, args) if args.remove_pr: - sql_remove(obj, args.remove_pr[0]) - + ci_database.remove_data('pr_list', 'PR', args.remove_pr[0]) if args.display is not None: - rows = sql_fetch(obj) - if len(args.display) == 1: - for row in rows: - if int(args.display[0]) == int(row[0]): - print(' '.join(map(str, row))) - else: - for row in rows: - print(' '.join(map(str, row))) - - con.commit() - con.close() + for rows in display_db(ci_database, args.display): + print(rows) + + ci_database.disconnect() diff --git a/ci/scripts/run-check_ci.sh b/ci/scripts/run-check_ci.sh index 5a909c1c64..8e1e927050 100755 --- a/ci/scripts/run-check_ci.sh +++ b/ci/scripts/run-check_ci.sh @@ -21,8 +21,11 
@@ pslot=${2:-${pslot:-?}} # Name of the experiment being tested by this scr # │   └── ${pslot} # └── EXPDIR # └── ${pslot} -HOMEgfs="${TEST_DIR}/HOMEgfs" +# Two system build directories created at build time gfs, and gdas +# TODO: Make this configurable (for now all scripts run from gfs for CI at runtime) +HOMEgfs="${TEST_DIR}/gfs" RUNTESTS="${TEST_DIR}/RUNTESTS" +run_check_logfile="${RUNTESTS}/ci-run_check.log" # Source modules and setup logging echo "Source modules." @@ -75,15 +78,16 @@ while true; do { echo "Experiment ${pslot} Terminated with ${num_failed} tasks failed at $(date)" || true echo "Experiment ${pslot} Terminated: *FAILED*" - } >> "${RUNTESTS}/ci.log" - + } | tee -a "${run_check_logfile}" error_logs=$(rocotostat -d "${db}" -w "${xml}" | grep -E 'FAIL|DEAD' | awk '{print "-c", $1, "-t", $2}' | xargs rocotocheck -d "${db}" -w "${xml}" | grep join | awk '{print $2}') || true { echo "Error logs:" echo "${error_logs}" - } >> "${RUNTESTS}/ci.log" - sed -i "s/\`\`\`//2g" "${RUNTESTS}/ci.log" - sacct --format=jobid,jobname%35,WorkDir%100,stat | grep "${pslot}" | grep "${pr}\/RUNTESTS" | awk '{print $1}' | xargs scancel || true + } | tee -a "${run_check_logfile}" + # rm -f "${RUNTESTS}/error.logs" + for log in ${error_logs}; do + echo "RUNTESTS${log#*RUNTESTS}" >> "${RUNTESTS}/error.logs" + done rc=1 break fi @@ -93,8 +97,7 @@ while true; do echo "Experiment ${pslot} Completed at $(date)" || true echo "with ${num_succeeded} successfully completed jobs" || true echo "Experiment ${pslot} Completed: *SUCCESS*" - } >> "${RUNTESTS}/ci.log" - sed -i "s/\`\`\`//2g" "${RUNTESTS}/ci.log" + } | tee -a "${run_check_logfile}" rc=0 break fi @@ -105,3 +108,4 @@ while true; do done exit "${rc}" + diff --git a/ci/scripts/run_ci.sh b/ci/scripts/run_ci.sh index 4a390a23f2..f50a4465d0 100755 --- a/ci/scripts/run_ci.sh +++ b/ci/scripts/run_ci.sh @@ -9,7 +9,7 @@ set -eux # Abstract TODO ##################################################################################### 
-ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." >/dev/null 2>&1 && pwd )" +HOMEgfs="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." >/dev/null 2>&1 && pwd )" scriptname=$(basename "${BASH_SOURCE[0]}") echo "Begin ${scriptname} at $(date -u)" || true export PS4='+ $(basename ${BASH_SOURCE})[${LINENO}]' @@ -18,11 +18,11 @@ export PS4='+ $(basename ${BASH_SOURCE})[${LINENO}]' # Set up runtime environment varibles for accounts on supproted machines ######################################################################### -source "${ROOT_DIR}/ush/detect_machine.sh" +source "${HOMEgfs}/ush/detect_machine.sh" case ${MACHINE_ID} in hera | orion | hercules) echo "Running Automated Testing on ${MACHINE_ID}" - source "${ROOT_DIR}/ci/platforms/config.${MACHINE_ID}" + source "${HOMEgfs}/ci/platforms/config.${MACHINE_ID}" ;; *) echo "Unsupported platform. Exiting with error." @@ -30,8 +30,9 @@ case ${MACHINE_ID} in ;; esac set +x -source "${ROOT_DIR}/ush/module-setup.sh" -module use "${ROOT_DIR}/modulefiles" +export HOMEgfs +source "${HOMEgfs}/ush/module-setup.sh" +module use "${HOMEgfs}/modulefiles" module load "module_gwsetup.${MACHINE_ID}" module list set -eux @@ -47,7 +48,7 @@ pr_list_dbfile="${GFS_CI_ROOT}/open_pr_list.db" pr_list="" if [[ -f "${pr_list_dbfile}" ]]; then - pr_list=$("${ROOT_DIR}/ci/scripts/pr_list_database.py" --display --dbfile "${pr_list_dbfile}" | grep -v Failed | grep Open | grep Running | awk '{print $1}' | head -"${max_concurrent_pr}") || true + pr_list=$("${HOMEgfs}/ci/scripts/pr_list_database.py" --display --dbfile "${pr_list_dbfile}" | grep -v Failed | grep Open | grep Running | awk '{print $1}' | head -"${max_concurrent_pr}") || true fi if [[ -z "${pr_list}" ]]; then echo "no open and built PRs that are ready for the cases to advance with rocotorun .. 
exiting" diff --git a/ci/scripts/utils/ci_utils.sh b/ci/scripts/utils/ci_utils.sh index 737a3e5a86..ce2e039307 100755 --- a/ci/scripts/utils/ci_utils.sh +++ b/ci/scripts/utils/ci_utils.sh @@ -1,24 +1,128 @@ #!/bin/env bash -function cancel_slurm_jobs() { +function determine_scheduler() { + if command -v sbatch &> /dev/null; then + echo "slurm"; + elif command -v qsub &> /dev/null; then + echo "torque"; + else + echo "unknown" + fi +} - # Usage: cancel_slurm_jobs <substring> - # Example: cancel_slurm_jobs "C48_ATM_3c4e7f74" +function cancel_batch_jobs() { + # Usage: cancel_batch_jobs <substring> + # Example: cancel_batch_jobs "C48_ATM_3c4e7f74" + # - # Cancel all Slurm jobs that have the given substring in their name + # Cancel all batch jobs that have the given substring in their name # So like in the example all jobs with "C48_ATM_3c4e7f74" # in their name will be canceled local substring=$1 local job_ids - job_ids=$(squeue -u "${USER}" -h -o "%i") - - for job_id in ${job_ids}; do - job_name=$(sacct -j "${job_id}" --format=JobName%100 | head -3 | tail -1 | sed -r 's/\s+//g') || true - if [[ "${job_name}" =~ ${substring} ]]; then - echo "Canceling Slurm Job ${job_name} with: scancel ${job_id}" - scancel "${job_id}" - continue - fi + + scheduler=$(determine_scheduler) + + if [[ "${scheduler}" == "torque" ]]; then + job_ids=$(qstat -u "${USER}" | awk '{print $1}') || true + + for job_id in ${job_ids}; do + job_name=$(qstat -f "${job_id}" | grep Job_Name | awk '{print $3}') || true + if [[ "${job_name}" =~ ${substring} ]]; then + echo "Canceling PBS Job ${job_name} with: qdel ${job_id}" + qdel "${job_id}" + continue + fi + done + + elif [[ "${scheduler}" == "slurm" ]]; then + + job_ids=$(squeue -u "${USER}" -h -o "%i") + + for job_id in ${job_ids}; do + job_name=$(sacct -j "${job_id}" --format=JobName%100 | head -3 | tail -1 | sed -r 's/\s+//g') || true + if [[ "${job_name}" =~ ${substring} ]]; then + echo "Canceling Slurm Job ${job_name} with: scancel ${job_id}" + scancel "${job_id}" + 
continue + fi + done + + else + echo "FATAL: Unknown/unsupported job scheduler" + exit 1 + fi +} + + +function get_pr_case_list () { + + ############################################################# + # loop over every yaml file in the PR's ci/cases + # and create a run directory for each one for this PR + ############################################################# + for yaml_config in "${HOMEgfs}/ci/cases/pr/"*.yaml; do + case=$(basename "${yaml_config}" .yaml) || true + echo "${case}" + done +} + +function get_pslot_list () { + + local RUNTESTS="${1}" + + ############################################################# + # loop over expdir directories in RUNTESTS + # and create a list of the directory names (pslot) with the hash tag + ############################################################# + for pslot_dir in "${RUNTESTS}/EXPDIR/"*; do + pslot=$(basename "${pslot_dir}") || true + echo "${pslot}" + done + +} + +function get_pslot () { + + local RUNTESTS="${1}" + local case="${2}" + + ############################################################# + # loop over expdir directories in RUNTESTS + # and return the name of the pslot with its tag that matches the case + ############################################################# + for pslot_dir in "${RUNTESTS}/EXPDIR/"*; do + pslot=$(basename "${pslot_dir}") + check_case=$(echo "${pslot}" | rev | cut -d"_" -f2- | rev) || true + if [[ "${check_case}" == "${case}" ]]; then + echo "${pslot}" + break + fi + done + +} + +function cancel_all_batch_jobs () { + local RUNTESTS="${1}" + pslot_list=$(get_pslot_list "${RUNTESTS}") + for pslot in ${pslot_list}; do + cancel_batch_jobs "${pslot}" done } + +function create_experiment () { + + local yaml_config="${1}" + cd "${HOMEgfs}" || exit 1 + pr_sha=$(git rev-parse --short HEAD) + case=$(basename "${yaml_config}" .yaml) || true + export pslot=${case}_${pr_sha} + + source "${HOMEgfs}/ci/platforms/config.${MACHINE_ID}" + source "${HOMEgfs}/workflow/gw_setup.sh" + + # 
system=$(grep "system:" "${yaml_config}" | cut -d":" -f2 | tr -d " ") || true + + "${HOMEgfs}/${system}/workflow/create_experiment.py" --overwrite --yaml "${yaml_config}" + +} diff --git a/ci/scripts/utils/ci_utils_wrapper.sh b/ci/scripts/utils/ci_utils_wrapper.sh new file mode 100755 index 0000000000..51c392fb99 --- /dev/null +++ b/ci/scripts/utils/ci_utils_wrapper.sh @@ -0,0 +1,9 @@ +#!/usr/bin/env bash + +HOMEgfs="$(cd "$(dirname "${BASH_SOURCE[0]}")/../../.." >/dev/null 2>&1 && pwd )" +source "${HOMEgfs}/ush/detect_machine.sh" + +utility_function="${1}" + +source "${HOMEgfs}/ci/scripts/utils/ci_utils.sh" +${utility_function} "${@:2}" diff --git a/ci/scripts/utils/get_host_case_list.py b/ci/scripts/utils/get_host_case_list.py new file mode 100755 index 0000000000..eb10f29f05 --- /dev/null +++ b/ci/scripts/utils/get_host_case_list.py @@ -0,0 +1,32 @@ +#!/usr/bin/env python3 +import os +from os.path import basename, splitext +import sys +import glob +from wxflow import parse_j2yaml +from wxflow import AttrDict + +_here = os.path.dirname(__file__) +_top = os.path.abspath(os.path.join(os.path.abspath(_here), '../../..')) + +if __name__ == '__main__': + + if len(sys.argv) < 2: + print('Usage: get_host_case_list.py ') + sys.exit(1) + + host = sys.argv[1] + + case_list = [] + HOMEgfs = _top + data = AttrDict(HOMEgfs=_top) + data.update(os.environ) + + case_files = glob.glob(f'{HOMEgfs}/ci/cases/pr/*.yaml') + for case_yaml in case_files: + case_conf = parse_j2yaml(path=case_yaml, data=data) + if 'skip_ci_on_hosts' in case_conf: + if host.lower() in [machine.lower() for machine in case_conf.skip_ci_on_hosts]: + continue + case_list.append(splitext(basename(case_yaml))[0]) + print(' '.join(case_list)) diff --git a/ci/scripts/utils/githubpr.py b/ci/scripts/utils/githubpr.py new file index 0000000000..5fe0b643ea --- /dev/null +++ b/ci/scripts/utils/githubpr.py @@ -0,0 +1,124 @@ +#!/usr/bin/env python3 + +import os +import re + +from github import Github, 
GithubException, InputFileContent, UnknownObjectException +from wxflow import which + + +class GitHubDBError(Exception): + """ + Base class for GitHubDB exceptions. + """ + UnknownObjectException = UnknownObjectException + GithubException = GithubException + + +class GitHubPR(Github): + """ + GitHubPR is an inherited class from GitHub in pyGitHub for interacting with GitHub pull requests. + + Attributes + ---------- + repo : github.Repository.Repository + The GitHub repository to interact with. + pulls : github.PaginatedList.PaginatedList of github.PullRequest.PullRequest + The list of open pull requests in the repository, sorted by last updated. + user : github.AuthenticatedUser.AuthenticatedUser + The authenticated user. + InputFileContent : github.InputFileContent.InputFileContent + The class used to create file content for gists. + + Methods + ------- + __init__(self, repo_url=None, TOKEN=None) + Initialize a new GitHubPR instance. + get_repo_url(self, repo_url=None) + Set the repository for the GitHubPR instance + using a URL directly or from the 'REPO_URL' environment variable. + get_pr_list(self) + Get the numerical list of all pull requests. + get_ci_pr_list(self, state='Ready', host=None) + Get the numerical list of all pull requests with a specific state from labels. + For example, if a PR has a label 'CI-Ready-Hera' of the form CI-[state]-[host], + its corresponding PR number will be included in the list. + """ + + def __init__(self, repo_url=None, TOKEN=None): + """ + __init__ Initialize a new GitHubPR instance. + + This method authenticates with the GitHub API using the 'gh' CLI tool + when the TOKEN is not provided. The repository comes from the 'REPO_URL' + environment variable when repo_url is not provided. 
+ """ + if TOKEN is None: + gh_cli = which('gh') + gh_cli.add_default_arg(['auth', 'status', '--show-token']) + TOKEN = gh_cli(output=str, error=str).split('\n')[3].split(': ')[1] + super().__init__(TOKEN) + + self.repo = self.get_repo_url(repo_url) + self.pulls = self.repo.get_pulls(state='open', sort='updated', direction='desc') + self.user = self.get_user() + + self.InputFileContent = InputFileContent + + def get_repo_url(self, repo_url=None): + """ + get_repo_url Set the repository for the GitHubPR instance. + + Parameters + ---------- + repo_url : str, optional + The GitHub repository URL. + """ + if repo_url is None: + repo_url = os.environ.get("REPO_URL") + match = re.search(r"github\.com/(.+)", repo_url) + repo_identifier = match.group(1)[:-4] + return self.get_repo(repo_identifier) + + def get_pr_list(self): + """ + get_pr_list Get the numerical list of all pull requests. + + Returns + ------- + list + A list of all pull request numbers. + """ + return [pull.number for pull in self.pulls] + + def get_ci_pr_list(self, state='Ready', host=None): + """ + get_ci_pr_list Get a list of pull requests that match a specified state and host. + + Parameters + ---------- + state : str, optional + The state of the pull requests to get (default is 'Ready'). + host : str, optional + The host of the pull requests to get. If None, all hosts are included (default is None). + + Returns + ------- + list + A list of pull request numbers that match the specified state and host. 
+ """ + pr_list = [] + for pull in self.pulls: + labels = pull.get_labels() + ci_labels = [s for s in labels if 'CI' in s.name] + for label in ci_labels: + if state in label.name: + if host is not None: + if host.lower() in label.name.lower(): + pr_list.append(pull.number) + break + else: + pr_list.append(pull.number) + break + + return pr_list diff --git a/ci/scripts/utils/publish_logs.py b/ci/scripts/utils/publish_logs.py new file mode 100755 index 0000000000..7768c17c10 --- /dev/null +++ b/ci/scripts/utils/publish_logs.py @@ -0,0 +1,106 @@ +#!/usr/bin/env python3 + +import os +from githubpr import GitHubPR, GitHubDBError +from argparse import ArgumentParser, FileType + + +def parse_args(): + """ + Parse command line arguments. + + Returns + ------- + argparse.Namespace + The parsed command line arguments. + """ + + description = """Arguments for creating and updating error log files + """ + parser = ArgumentParser(description=description) + + parser.add_argument('--file', help='path to file for uploading to GitHub', required=False, type=FileType('r'), nargs='+') + parser.add_argument('--gist', help='create a gist of the file', nargs=1, metavar='identifier_string', required=False) + parser.add_argument('--repo', help='create a file in a repo', nargs=1, metavar='path_header', required=False) + args = parser.parse_args() + if bool(args.gist) == bool(args.repo): # Exactly one of the two is required + parser.error("Exactly one of --gist and --repo is required") + return args + + +def add_logs_to_gist(args, emcbot_gh): + """ + Adds log files to a GitHub gist. + + Parameters + ---------- + args : Namespace + The arguments parsed from the command line. + emcbot_gh : GitHubPR + The GitHubPR object to interact with GitHub. + + Prints + ------ + The URL of the created gist. 
+ """ + + gist_files = {} + for file in args.file: + file_content = file.read() + gist_files[os.path.basename(file.name)] = emcbot_gh.InputFileContent(file_content) + + gist = emcbot_gh.user.create_gist(public=True, files=gist_files, description=f"error log file from CI run {args.gist[0]}") + print(gist.html_url) + + +def upload_logs_to_repo(args, emcbot_gh, emcbot_ci_url): + """ + Upload log files to a repository. + + Parameters + ---------- + args : Namespace + The arguments parsed from the command line. + emcbot_gh : GitHubPR + The GitHubPR object to interact with GitHub. + emcbot_ci_url : str + The URL of the repository to upload the logs to. + + Prints + ------ + The URL of the uploaded file in the repository. + """ + + path_header = args.repo[0] + repo_branch = "error_logs" + repo_path = "ci/error_logs" + extra = 0 + while True: + try: + extra += 1 + file_path_in_repo = f"{repo_path}/{path_header}/" + str(os.path.basename(args.file[0].name)) + content = emcbot_gh.repo.get_contents(file_path_in_repo, ref='error_logs') + path_header = f'{args.repo[0]}_{str(extra)}' + except GitHubDBError.GithubException as e: + break + + for file in args.file: + file_content = file.read() + file_path_in_repo = f"{repo_path}/{path_header}/" + str(os.path.basename(file.name)) + emcbot_gh.repo.create_file(file_path_in_repo, "Adding error log file", file_content, branch="error_logs") + + file_url = f"{emcbot_ci_url.rsplit('.',1)[0]}/tree/{repo_branch}/{repo_path}/{path_header}" + print(file_url) + + +if __name__ == '__main__': + + args = parse_args() + emcbot_ci_url = "https://github.com/emcbot/ci-global-workflows.git" + emcbot_gh = GitHubPR(repo_url=emcbot_ci_url) + + if args.gist: # Add error logs to a gist in GitHub emcbot's account + add_logs_to_gist(args, emcbot_gh) + + if args.repo: # Upload error logs to emcbot's ci-global-workflows error_logs branch + upload_logs_to_repo(args, emcbot_gh, emcbot_ci_url) diff --git a/ci/scripts/utils/wxflow b/ci/scripts/utils/wxflow new file 
mode 120000 index 0000000000..54d0558aba --- /dev/null +++ b/ci/scripts/utils/wxflow @@ -0,0 +1 @@ +../../../sorc/wxflow/src/wxflow \ No newline at end of file diff --git a/docs/source/clone.rst b/docs/source/clone.rst index bad3f0e9f6..4f47eb230f 100644 --- a/docs/source/clone.rst +++ b/docs/source/clone.rst @@ -39,6 +39,13 @@ For coupled cycling (include new UFSDA) use the `-gu` options during build: ./build_all.sh -gu +For building with PDLIB for the wave model, use the `-w` option during build: + +:: + + ./build_all.sh -w + + Build workflow components and link workflow artifacts such as executables, etc. :: diff --git a/docs/source/components.rst b/docs/source/components.rst index 98e76b467b..869ef89bab 100644 --- a/docs/source/components.rst +++ b/docs/source/components.rst @@ -28,7 +28,7 @@ Components included as submodules: * **GSI Monitor** (https://github.com/NOAA-EMC/GSI-Monitor): These tools monitor the GSI package's data assimilation, detecting and reporting missing data sources, low observation counts, and high penalty values * **GDAS** (https://github.com/NOAA-EMC/GDASApp): Jedi based Data Assimilation system. This system is currently being developed for marine Data Assimilation and in time will replace GSI for atmospheric data assimilation as well * **UFS UTILS** (https://github.com/ufs-community/UFS_UTILS): Utility codes needed for UFS-weather-model -* **wxflow** Collection of python utilities for weather workflows (https://github.com/NOAA-EMC/wxflow) +* **wxflow** (https://github.com/NOAA-EMC/wxflow): Collection of python utilities for weather workflows * **Verif global** (https://github.com/NOAA-EMC/EMC_verif-global): Verification package to evaluate GFS parallels. It uses MET and METplus. At this moment the verification package is limited to providing atmospheric metrics only .. 
note:: @@ -57,19 +57,20 @@ Data Observation data, also known as dump data, is prepared in production and then archived in a global dump archive (GDA) for use by users when running cycled experiments. The GDA (identified as ``$DMPDIR`` in the workflow) is available on supported platforms and the workflow system knows where to find the data. -* Hera: /scratch1/NCEPDEV/global/glopara/dump -* Orion/Hercules: /work/noaa/rstprod/dump -* Jet: /mnt/lfs4/HFIP/hfv3gfs/glopara/dump -* WCOSS2: /lfs/h2/emc/global/noscrub/emc.global/dump -* S4: /data/prod/glopara/dump +* Hera: ``/scratch1/NCEPDEV/global/glopara/dump`` +* Orion/Hercules: ``/work/noaa/rstprod/dump`` +* Jet: ``/mnt/lfs4/HFIP/hfv3gfs/glopara/dump`` +* WCOSS2: ``/lfs/h2/emc/global/noscrub/emc.global/dump`` +* S4: ``/data/prod/glopara/dump`` ----------------------------- Global Dump Archive Structure ----------------------------- -The global dump archive (GDA) mimics the structure of its production source: ``DMPDIR/CDUMP.PDY/[CC/atmos/]FILES`` +The global dump archive (GDA) mimics the structure of its production source: -The ``CDUMP`` is either gdas, gfs, or rtofs. All three contain production output for each day (``PDY``). The gdas and gfs folders are further broken into cycle (``CC``) and component (``atmos``). +* GDAS/GFS: ``DMPDIR/gdas[gfs].PDY/CC/atmos/FILES`` +* RTOFS: ``DMPDIR/rtofs.PDY/FILES`` The GDA also contains special versions of some datasets and experimental data that is being evaluated ahead of implementation into production. The following subfolder suffixes exist: @@ -81,6 +82,7 @@ The GDA also contains special versions of some datasets and experimental data th +--------+------------------------------------------------------------------------------------------------------+ | ur | Un-restricted versions of restricted files in production. Produced and archived on a 48hrs delay. | | | Some restricted datasets are unrestricted. 
Data amounts: restricted > un-restricted > non-restricted | +| | Limited availability. Discontinued producing mid-2023. | +--------+------------------------------------------------------------------------------------------------------+ | x | Experimental global datasets being evaluated for production. Dates and types vary depending on | | | upcoming global upgrades. | diff --git a/docs/source/conf.py b/docs/source/conf.py index 89526d9f69..81f231f6b0 100644 --- a/docs/source/conf.py +++ b/docs/source/conf.py @@ -13,13 +13,14 @@ import os import sys sys.path.insert(0, os.path.abspath('.')) - +from datetime import datetime # -- Project information ----------------------------------------------------- project = 'Global-workflow' -copyright = '2023, Kate Friedman, Walter Kolczynski, Rahul Mahajan, Lin Gan, Arun Chawla' -author = 'Kate Friedman, Walter Kolczynski, Rahul Mahajan, Lin Gan, Arun Chawla' +year = datetime.now().year +copyright = f"2015-{year} NOAA/NWS/NCEP/EMC" +author = 'Kate Friedman, Walter Kolczynski, Rahul Mahajan, Lin Gan, and numerous collaborators and contributors' # The full version, including alpha/beta/rc tags release = '0.1' diff --git a/docs/source/configure.rst b/docs/source/configure.rst index 12c2f75a48..439c5df110 100644 --- a/docs/source/configure.rst +++ b/docs/source/configure.rst @@ -4,58 +4,60 @@ Configure Run The global-workflow configs contain switches that change how the system runs. Many defaults are set initially. Users wishing to run with different settings should adjust their $EXPDIR configs and then rerun the ``setup_xml.py`` script since some configuration settings/switches change the workflow/xml ("Adjusts XML" column value is "YES"). 
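The edit-then-regenerate step described above can be sketched as follows. This is an illustrative sketch only: the scratch directory stands in for a real ``$EXPDIR``, DO_BUFRSND is simply one of the XML-adjusting switches from the table, and the commented ``setup_xml.py`` invocation is an assumed example rather than a verified command line.

```shell
# Hypothetical sketch: flip an XML-adjusting switch in a scratch stand-in for
# $EXPDIR, then regenerate the workflow XML (shown only as a comment).
set -eu

EXPDIR="$(mktemp -d)"                          # stand-in for a real $EXPDIR
echo 'export DO_BUFRSND="NO"' > "${EXPDIR}/config.base"

# Flip the switch from NO to YES in place
sed -i 's/DO_BUFRSND="NO"/DO_BUFRSND="YES"/' "${EXPDIR}/config.base"
line="$(grep 'DO_BUFRSND' "${EXPDIR}/config.base")"
echo "${line}"    # prints: export DO_BUFRSND="YES"

# DO_BUFRSND has "Adjusts XML" = YES, so the workflow XML must then be
# regenerated, e.g. (assumed invocation):
#   ${HOMEgfs}/workflow/setup_xml.py "${EXPDIR}"
rm -rf "${EXPDIR}"
```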
-+----------------+----------------------------------+---------------+-------------+---------------------------------------------------+ -| Switch | What | Default | Adjusts XML | More Details | -+================+==================================+===============+=============+===================================================+ -| APP | Model application | ATM | YES | See case block in config.base for options | -+----------------+----------------------------------+---------------+-------------+---------------------------------------------------+ -| DOIAU | Enable 4DIAU for control | YES | NO | Turned off for cold-start first half cycle | -| | with 3 increments | | | | -+----------------+----------------------------------+---------------+-------------+---------------------------------------------------+ -| DOHYBVAR | Run EnKF | YES | YES | Don't recommend turning off | -+----------------+----------------------------------+---------------+-------------+---------------------------------------------------+ -| DONST | Run NSST | YES | NO | If YES, turns on NSST in anal/fcst steps, and | -| | | | | turn off rtgsst | -+----------------+----------------------------------+---------------+-------------+---------------------------------------------------+ -| DO_AWIPS | Run jobs to produce AWIPS | NO | YES | downstream processing, ops only | -| | products | | | | -+----------------+----------------------------------+---------------+-------------+---------------------------------------------------+ -| DO_BUFRSND | Run job to produce BUFR | NO | YES | downstream processing | -| | sounding products | | | | -+----------------+----------------------------------+---------------+-------------+---------------------------------------------------+ -| DO_GEMPAK | Run job to produce GEMPAK | NO | YES | downstream processing, ops only | -| | products | | | | 
-+----------------+----------------------------------+---------------+-------------+---------------------------------------------------+ -| DO_FIT2OBS | Run FIT2OBS job | YES | YES | Whether to run the FIT2OBS job | -+----------------+----------------------------------+---------------+-------------+---------------------------------------------------+ -| DO_TRACKER | Run tracker job | YES | YES | Whether to run the tracker job | -+----------------+----------------------------------+---------------+-------------+---------------------------------------------------+ -| DO_GENESIS | Run genesis job | YES | YES | Whether to run the genesis job | -+----------------+----------------------------------+---------------+-------------+---------------------------------------------------+ -| DO_GENESIS_FSU | Run FSU genesis job | YES | YES | Whether to run the FSU genesis job | -+----------------+----------------------------------+---------------+-------------+---------------------------------------------------+ -| DO_VERFOZN | Run GSI monitor ozone job | YES | YES | Whether to run the GSI monitor ozone job | -+----------------+----------------------------------+---------------+-------------+---------------------------------------------------+ -| DO_VERFRAD | Run GSI monitor radiance job | YES | YES | Whether to run the GSI monitor radiance job | -+----------------+----------------------------------+---------------+-------------+---------------------------------------------------+ -| DO_VMINMON | Run GSI monitor minimization job | YES | YES | Whether to run the GSI monitor minimization job | -+----------------+----------------------------------+---------------+-------------+---------------------------------------------------+ -| DO_METP | Run METplus jobs | YES | YES | One cycle spinup | -+----------------+----------------------------------+---------------+-------------+---------------------------------------------------+ -| EXP_WARM_START | Is experiment starting warm | .false. 
| NO | Impacts IAU settings for initial cycle. Can also | -| | (.true.) or cold (.false)? | | | be set when running ``setup_expt.py`` script with | -| | | | | the ``--start`` flag (e.g. ``--start warm``) | -+----------------+----------------------------------+---------------+-------------+---------------------------------------------------+ -| HPSSARCH | Archive to HPPS | NO | Possibly | Whether to save output to tarballs on HPPS | -+----------------+----------------------------------+---------------+-------------+---------------------------------------------------+ -| LOCALARCH | Archive to a local directory | NO | Possibly | Instead of archiving data to HPSS, archive to a | -| | | | | local directory, specified by ATARDIR. If | -| | | | | LOCALARCH=YES, then HPSSARCH must =NO. Changing | -| | | | | HPSSARCH from YES to NO will adjust the XML. | -+----------------+----------------------------------+---------------+-------------+---------------------------------------------------+ -| QUILTING | Use I/O quilting | .true. | NO | If .true. choose OUTPUT_GRID as cubed_sphere_grid | -| | | | | in netcdf or gaussian_grid | -+----------------+----------------------------------+---------------+-------------+---------------------------------------------------+ -| WRITE_DOPOST | Run inline post | .true. | NO | If .true. 
produces master post output in forecast | -| | | | | job | -+----------------+----------------------------------+---------------+-------------+---------------------------------------------------+ ++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ +| Switch | What | Default | Adjusts XML | More Details | ++==================+==================================+===============+=============+===================================================+ +| APP | Model application | ATM | YES | See case block in config.base for options | ++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ +| DEBUG_POSTSCRIPT | Debug option for PBS scheduler | NO | YES | Sets debug=true for additional logging | ++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ +| DOIAU | Enable 4DIAU for control | YES | NO | Turned off for cold-start first half cycle | +| | with 3 increments | | | | ++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ +| DOHYBVAR | Run EnKF | YES | YES | Don't recommend turning off | ++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ +| DONST | Run NSST | YES | NO | If YES, turns on NSST in anal/fcst steps, and | +| | | | | turn off rtgsst | ++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ +| DO_AWIPS | Run jobs to produce AWIPS | NO | YES | downstream processing, ops only | +| | products | | | | ++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ +| DO_BUFRSND | Run job to 
produce BUFR | NO | YES | downstream processing | +| | sounding products | | | | ++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ +| DO_GEMPAK | Run job to produce GEMPAK | NO | YES | downstream processing, ops only | +| | products | | | | ++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ +| DO_FIT2OBS | Run FIT2OBS job | YES | YES | Whether to run the FIT2OBS job | ++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ +| DO_TRACKER | Run tracker job | YES | YES | Whether to run the tracker job | ++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ +| DO_GENESIS | Run genesis job | YES | YES | Whether to run the genesis job | ++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ +| DO_GENESIS_FSU | Run FSU genesis job | YES | YES | Whether to run the FSU genesis job | ++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ +| DO_VERFOZN | Run GSI monitor ozone job | YES | YES | Whether to run the GSI monitor ozone job | ++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ +| DO_VERFRAD | Run GSI monitor radiance job | YES | YES | Whether to run the GSI monitor radiance job | ++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ +| DO_VMINMON | Run GSI monitor minimization job | YES | YES | Whether to run the GSI monitor minimization job | 
++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ +| DO_METP | Run METplus jobs | YES | YES | One cycle spinup | ++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ +| EXP_WARM_START | Is experiment starting warm | .false. | NO | Impacts IAU settings for initial cycle. Can also | +| | (.true.) or cold (.false)? | | | be set when running ``setup_expt.py`` script with | +| | | | | the ``--start`` flag (e.g. ``--start warm``) | ++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ +| HPSSARCH | Archive to HPSS | NO | Possibly | Whether to save output to tarballs on HPSS | ++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ +| LOCALARCH | Archive to a local directory | NO | Possibly | Instead of archiving data to HPSS, archive to a | +| | | | | local directory, specified by ATARDIR. If | +| | | | | LOCALARCH=YES, then HPSSARCH must =NO. Changing | +| | | | | HPSSARCH from YES to NO will adjust the XML. | ++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ +| QUILTING | Use I/O quilting | .true. | NO | If .true. choose OUTPUT_GRID as cubed_sphere_grid | +| | | | | in netcdf or gaussian_grid | ++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ +| WRITE_DOPOST | Run inline post | .true. | NO | If .true. 
produces master post output in forecast | +| | | | | job | ++------------------+----------------------------------+---------------+-------------+---------------------------------------------------+ diff --git a/docs/source/hpc.rst b/docs/source/hpc.rst index 3ce6a889d9..508597781d 100644 --- a/docs/source/hpc.rst +++ b/docs/source/hpc.rst @@ -72,23 +72,23 @@ Version It is advised to use Git v2+ when available. At the time of writing this documentation the default Git clients on the different machines were as noted in the table below. It is recommended that you check the default modules before loading recommended ones: -+---------+----------+---------------------------------------+ ++----------+----------+---------------------------------------+ | Machine | Default | Recommended | -+---------+----------+---------------------------------------+ ++----------+----------+---------------------------------------+ | Hera | v2.18.0 | default | -+---------+----------+---------------------------------------+ ++----------+----------+---------------------------------------+ | Hercules | v2.31.1 | default | -+---------+----------+---------------------------------------+ ++----------+----------+---------------------------------------+ | Orion | v1.8.3.1 | **module load git/2.28.0** | -+---------+----------+---------------------------------------+ ++----------+----------+---------------------------------------+ | Jet | v2.18.0 | default | -+---------+----------+---------------------------------------+ -| WCOSS2 | v2.26.2 | default or **module load git/2.29.0** | -+---------+----------+---------------------------------------+ ++----------+----------+---------------------------------------+ +| WCOSS2 | v2.35.3 | default | ++----------+----------+---------------------------------------+ | S4 | v1.8.3.1 | **module load git/2.30.0** | -+---------+----------+---------------------------------------+ -| AWS PW | v1.8.3.1 | default 
-+---------+----------+---------------------------------------+ ++----------+----------+---------------------------------------+ +| AWS PW | v1.8.3.1 | default | ++----------+----------+---------------------------------------+ ^^^^^^^^^^^^^ Output format diff --git a/docs/source/index.rst b/docs/source/index.rst index 2eb786199a..a5161789b3 100644 --- a/docs/source/index.rst +++ b/docs/source/index.rst @@ -10,7 +10,7 @@ Status ====== * State of develop (HEAD) branch: GFSv17+ development -* State of operations (dev/gfs.v16 branch): GFS v16.3.12 `tag: [gfs.v16.3.12] `_ +* State of operations (dev/gfs.v16 branch): GFS v16.3.13 `tag: [gfs.v16.3.13] `_ ============= Code managers @@ -27,6 +27,10 @@ General updates: NOAA employees and affiliates can join the gfs-announce distrib GitHub updates: Users should adjust their "Watch" settings for this repo so they receive notifications as they'd like to. Find the "Watch" or "Unwatch" button towards the top right of the `authoritative global-workflow repository page `_ and click it to adjust how you watch the repo. +================= +Table of Contents +================= + .. 
toctree:: :numbered: :maxdepth: 3 diff --git a/docs/source/init.rst b/docs/source/init.rst index 14a0ea0d56..f945494af8 100644 --- a/docs/source/init.rst +++ b/docs/source/init.rst @@ -51,6 +51,7 @@ Cold-start atmosphere-only cycled C96 deterministic C48 enkf (80 members) ICs ar Hera: /scratch1/NCEPDEV/global/glopara/data/ICSDIR/C96C48 Orion/Hercules: /work/noaa/global/glopara/data/ICSDIR/C96C48 WCOSS2: /lfs/h2/emc/global/noscrub/emc.global/data/ICSDIR/C96C48 + AWS: https://noaa-nws-global-pds.s3.amazonaws.com/index.html#data/ICSDIR/C96C48 Start date = 2021122018 @@ -111,6 +112,7 @@ Warm-start cycled w/ coupled (S2S) model C48 atmosphere C48 enkf (80 members) 5 Orion/Hercules: /work/noaa/global/glopara/data/ICSDIR/C48C48mx500 WCOSS2: /lfs/h2/emc/global/noscrub/emc.global/data/ICSDIR/C48C48mx500 Jet: /lfs4/HFIP/hfv3gfs/glopara/data/ICSDIR/C48C48mx500 + AWS: https://noaa-nws-global-pds.s3.amazonaws.com/index.html#data/ICSDIR/C48C48mx500 Start date = 2021032312 @@ -246,14 +248,15 @@ Automated Generation Cycled mode ----------- -Not yet supported. See :ref:`Manual Generation` section below for how to create your ICs yourself (outside of workflow). +Not yet supported. See the UFS_UTILS documentation on the gdas_init utility to generate your own ICs for cycled or forecast-only mode: https://noaa-emcufs-utils.readthedocs.io/en/latest/ufs_utils.html#gdas-init .. _forecastonly-coupled: --------------------- Forecast-only coupled --------------------- -Coupled initial conditions are currently only generated offline and copied prior to the forecast run. Prototype initial conditions will automatically be used when setting up an experiment as an S2SW app, there is no need to do anything additional. Copies of initial conditions from the prototype runs are currently maintained on Hera, Orion/Hercules, Jet, and WCOSS2. The locations used are determined by ``parm/config/config.coupled_ic``. 
If you need prototype ICs on another machine, please contact Walter (Walter.Kolczynski@noaa.gov). +Coupled initial conditions are currently only generated offline and copied prior to the forecast run. Prototype initial conditions will automatically be used when setting up an experiment as an S2SW app, there is no need to do anything additional. Sample copies of initial conditions from the prototype runs are currently maintained on Hera, Orion/Hercules, Jet, and WCOSS2. The locations used are determined by ``parm/config/config.stage_ic``. +Note however, that due to the rapid changes in the model configuration, some staged initial conditions may not work. .. _forecastonly-atmonly: @@ -261,7 +264,7 @@ Coupled initial conditions are currently only generated offline and copied prior Forecast-only mode (atm-only) ----------------------------- -The table below lists the needed initial condition files from past GFS versions to be used by the UFS_UTILS gdas_init utility. The utility will pull these files for you. See the next section (Manual Generation) for how to run the UFS_UTILS gdas_init utility and create initial conditions for your experiment. +The table below lists for reference the needed initial condition files from past GFS versions to be used by the UFS_UTILS gdas_init utility. The utility will pull these files for you. See the next section (Manual Generation) for how to run the UFS_UTILS gdas_init utility and create initial conditions for your experiment. Note for table: yyyy=year; mm=month; dd=day; hh=cycle @@ -284,11 +287,11 @@ Operations/production output location on HPSS: /NCEPPROD/hpssprod/runhistory/rh +----------------+---------------------------------+-----------------------------------------------------------------------------+--------------------------------+ | v15 ops | gfs.t. ``hh`` z.atmanl.nemsio | gpfs_dell1_nco_ops_com_gfs_prod_gfs. ``yyyymmdd`` _ ``hh`` .gfs_nemsioa.tar | gfs. ``yyyymmdd`` /``hh`` | | | | | | -| pre-2020022600 | gfs.t. 
``hh`` z.sfcanl.nemsio | | | +| pre-2020022600 | gfs.t. ``hh`` z.sfcanl.nemsio | | | +----------------+---------------------------------+-----------------------------------------------------------------------------+--------------------------------+ | v15 ops | gfs.t. ``hh`` z.atmanl.nemsio | com_gfs_prod_gfs. ``yyyymmdd`` _ ``hh`` .gfs_nemsioa.tar | gfs. ``yyyymmdd`` /``hh`` | | | | | | -| | gfs.t. ``hh`` z.sfcanl.nemsio | | | +| | gfs.t. ``hh`` z.sfcanl.nemsio | | | +----------------+---------------------------------+-----------------------------------------------------------------------------+--------------------------------+ | v16 retro | gfs.t. ``hh`` z.atmanl.nc | gfs_netcdfa.tar* | gfs. ``yyyymmdd`` /``hh``/atmos| | | | | | @@ -318,82 +321,14 @@ Manual Generation The following information is for users needing to generate cold-start initial conditions for a cycled experiment that will run at a different resolution or layer amount than the operational GFS (C768C384L127). -The ``chgres_cube`` code is available from the `UFS_UTILS repository `_ on GitHub and can be used to convert GFS ICs to a different resolution or number of layers. Users may clone the develop/HEAD branch or the same version used by global-workflow develop. The ``chgres_cube`` code/scripts currently support the following GFS inputs: +The ``chgres_cube`` code is available from the `UFS_UTILS repository `_ on GitHub and can be used to convert GFS ICs to a different resolution or number of layers. Users should see the `documentation to generation initial conditions in the UFS_UTILS repository `_. The ``chgres_cube`` code/scripts currently support the following GFS inputs: * pre-GFSv14 * GFSv14 * GFSv15 * GFSv16 -Users can use the copy of UFS_UTILS that is already cloned and built within their global-workflow clone or clone/build it separately: - -Within a built/linked global-workflow clone: - -:: - - cd sorc/ufs_utils.fd/util/gdas_init - -Clone and build separately: - -1. 
Clone UFS_UTILS: - -:: - - git clone --recursive https://github.com/NOAA-EMC/UFS_UTILS.git - -Then switch to a different tag or use the default branch (develop). - -2. Build UFS_UTILS: - -:: - - sh build_all.sh - cd fix - sh link_fixdirs.sh emc $MACHINE - -where ``$MACHINE`` is ``wcoss2``, ``hera``, or ``jet``. - -.. note:: - UFS-UTILS builds on Orion/Hercules but due to the lack of HPSS access on Orion/Hercules the ``gdas_init`` utility is not supported there. - -3. Configure your conversion: - -:: - - cd util/gdas_init - vi config - -Read the doc block at the top of the config and adjust the variables to meet you needs (e.g. ``yy, mm, dd, hh`` for ``SDATE``). - -Most users will want to adjust the following ``config`` settings for the current system design: - -#. EXTRACT_DATA=YES (to pull original ICs to convert off HPSS) -#. RUN_CHGRES=YES (to run chgres_cube on the original ICs pulled off HPSS) -#. LEVS=128 (for the L127 GFS) - -4. Submit conversion script: - -:: - - ./driver.$MACHINE.sh - -where ``$MACHINE`` is currently ``wcoss2``, ``hera`` or ``jet``. Additional options will be available as support for other machines expands. - -.. note:: - UFS-UTILS builds on Orion/Hercules but due to lack of HPSS access there is no ``gdas_init`` driver for Orion/Hercules nor support to pull initial conditions from HPSS for the ``gdas_init`` utility. - -Several small jobs will be submitted: - - - 1 jobs to pull inputs off HPSS - - 1 or 2 jobs to run ``chgres_cube`` (1 for deterministic/hires and 1 for each EnKF ensemble member) - -The chgres jobs will have a dependency on the data-pull jobs and will wait to run until all data-pull jobs have completed. - -5. Check output: - -In the config you will have defined an output folder called ``$OUTDIR``. The converted output will be found there, including the needed abias and radstat initial condition files (if CDUMP=gdas). 
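The warm-start tarball set described later in init.rst follows a predictable naming scheme (``com_gfs_vGFSVER_enkfgdas.YYYYMMDD_CC.enkfgdas_restart_grpN.tar``). A minimal sketch that composes the eight per-group EnKF restart tarball names; the ``GFSVER``, ``YYYYMMDD``, and ``CC`` values below are illustrative assumptions, not a real cycle:

```shell
#!/usr/bin/env bash
# Compose the eight per-group EnKF restart tarball names following the
# naming convention shown in init.rst. All values here are examples only.
GFSVER=16.3.13
YYYYMMDD=20240402
CC=06

tarballs=()
for grp in 1 2 3 4 5 6 7 8; do
  tarballs+=("com_gfs_v${GFSVER}_enkfgdas.${YYYYMMDD}_${CC}.enkfgdas_restart_grp${grp}.tar")
done
printf '%s\n' "${tarballs[@]}"
```

Extracting each of these at the top of ``ROTDIR`` reproduces the expected directory structure, per the doc text.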
The files will be in the needed directory structure for the global-workflow system, therefore a user can move the contents of their ``$OUTDIR`` directly into their ``$ROTDIR``. - -Please report bugs to George Gayno (george.gayno@noaa.gov) and Kate Friedman (kate.friedman@noaa.gov). +See instructions in UFS_UTILS to clone, build and generate initial conditions: https://noaa-emcufs-utils.readthedocs.io/en/latest/ufs_utils.html#gdas-init .. _warmstarts-prod: @@ -489,7 +424,7 @@ Tarballs per cycle: com_gfs_vGFSVER_enkfgdas.YYYYMMDD_CC.enkfgdas_restart_grp7.tar com_gfs_vGFSVER_enkfgdas.YYYYMMDD_CC.enkfgdas_restart_grp8.tar -Go to the top of your ``ROTDIR`` and pull the contents of all tarballs there. The tarballs already contain the needed directory structure. +Go to the top of your ``ROTDIR`` and pull the contents of all tarballs there. The tarballs already contain the needed directory structure. Note that the directory structure has changed, so this may not be correct. .. _warmstarts-preprod-parallels: @@ -517,6 +452,7 @@ Recent pre-implementation parallel series was for GFS v16 (implemented March 202 * **Where do I put the warm-start initial conditions?** Extraction should occur right inside your ROTDIR. You may need to rename the enkf folder (enkf.gdas.$PDY -> enkfgdas.$PDY). Due to a recent change in the dycore, you may also need an additional offline step to fix the checksum of the NetCDF files for warm start. See the :ref:`Fix netcdf checksum section `. +The current model has undergone several updates and the files generated may not be completely usable by the model. .. _retrospective: diff --git a/docs/source/setup.rst b/docs/source/setup.rst index 0e87ade9a5..de5cfa099a 100644 --- a/docs/source/setup.rst +++ b/docs/source/setup.rst @@ -6,9 +6,13 @@ Experiment Setup :: - # Note: this will wipe your existing lmod environment source workflow/gw_setup.sh +.. warning:: + Sourcing gw_setup.sh will wipe your existing lmod environment + +.. 
note:: + Bash shell is required to source gw_setup.sh ^^^^^^^^^^^^^^^^^^^^^^^^ Forecast-only experiment diff --git a/env/AWSPW.env b/env/AWSPW.env index 894cce2343..2dbba67eb3 100755 --- a/env/AWSPW.env +++ b/env/AWSPW.env @@ -4,7 +4,7 @@ if [[ $# -ne 1 ]]; then echo "Must specify an input argument to set runtime environment variables!" echo "argument can be any one of the following:" - echo "atmanlrun atmensanlrun aeroanlrun landanlrun" + echo "atmanlrun atmensanlrun aeroanlrun snowanl" echo "anal sfcanl fcst post metp" echo "eobs eupd ecen efcs epos" echo "postsnd awips gempak" @@ -14,7 +14,6 @@ fi step=$1 -export npe_node_max=36 export launcher="mpiexec.hydra" export mpmd_opt="" diff --git a/env/CONTAINER.env b/env/CONTAINER.env index bfeb6dd6da..bc2d64b4ce 100755 --- a/env/CONTAINER.env +++ b/env/CONTAINER.env @@ -4,7 +4,7 @@ if [[ $# -ne 1 ]]; then echo "Must specify an input argument to set runtime environment variables!" echo "argument can be any one of the following:" - echo "atmanlrun atmensanlrun aeroanlrun landanl" + echo "atmanlrun atmensanlrun aeroanlrun snowanl" echo "anal sfcanl fcst post metp" echo "eobs eupd ecen efcs epos" echo "postsnd awips gempak" @@ -14,7 +14,6 @@ fi step=$1 -export npe_node_max=40 export launcher="mpirun" export mpmd_opt="--multi-prog" diff --git a/env/HERA.env b/env/HERA.env index 4ad9e41d01..4d035a8d34 100755 --- a/env/HERA.env +++ b/env/HERA.env @@ -4,7 +4,7 @@ if [[ $# -ne 1 ]]; then echo "Must specify an input argument to set runtime environment variables!" 
echo "argument can be any one of the following:" - echo "atmanlrun atmensanlrun aeroanlrun landanl" + echo "atmanlrun atmensanlrun aeroanlrun snowanl" echo "anal sfcanl fcst post metp" echo "eobs eupd ecen efcs epos" echo "postsnd awips gempak" @@ -19,6 +19,8 @@ export npe_node_max=40 export launcher="srun -l --epilog=/apps/local/bin/report-mem --export=ALL" export mpmd_opt="--multi-prog --output=mpmd.%j.%t.out" +#export POSTAMBLE_CMD='report-mem' + # Configure MPI environment #export I_MPI_ADJUST_ALLREDUCE=5 #export MPI_BUFS_PER_PROC=2048 @@ -42,7 +44,7 @@ if [[ "${step}" = "prep" ]] || [[ "${step}" = "prepbufr" ]]; then export sys_tp="HERA" export launcher_PREP="srun" -elif [[ "${step}" = "preplandobs" ]]; then +elif [[ "${step}" = "prepsnowobs" ]]; then export APRUN_CALCFIMS="${launcher} -n 1" @@ -79,13 +81,13 @@ elif [[ "${step}" = "aeroanlrun" ]]; then [[ ${NTHREADS_AEROANL} -gt ${nth_max} ]] && export NTHREADS_AEROANL=${nth_max} export APRUN_AEROANL="${launcher} -n ${npe_aeroanlrun} --cpus-per-task=${NTHREADS_AEROANL}" -elif [[ "${step}" = "landanl" ]]; then +elif [[ "${step}" = "snowanl" ]]; then - nth_max=$((npe_node_max / npe_node_landanl)) + nth_max=$((npe_node_max / npe_node_snowanl)) - export NTHREADS_LANDANL=${nth_landanl:-${nth_max}} - [[ ${NTHREADS_LANDANL} -gt ${nth_max} ]] && export NTHREADS_LANDANL=${nth_max} - export APRUN_LANDANL="${launcher} -n ${npe_landanl} --cpus-per-task=${NTHREADS_LANDANL}" + export NTHREADS_SNOWANL=${nth_snowanl:-${nth_max}} + [[ ${NTHREADS_SNOWANL} -gt ${nth_max} ]] && export NTHREADS_SNOWANL=${nth_max} + export APRUN_SNOWANL="${launcher} -n ${npe_snowanl} --cpus-per-task=${NTHREADS_SNOWANL}" export APRUN_APPLY_INCR="${launcher} -n 6" @@ -93,31 +95,27 @@ elif [[ "${step}" = "ocnanalbmat" ]]; then export APRUNCFP="${launcher} -n \$ncmd --multi-prog" - nth_max=$((npe_node_max / npe_node_ocnanalbmat)) - - export NTHREADS_OCNANAL=${nth_ocnanalbmat:-${nth_max}} - [[ ${NTHREADS_OCNANAL} -gt ${nth_max} ]] && export 
NTHREADS_OCNANAL=${nth_max} - export APRUN_OCNANAL="${launcher} -n ${npe_ocnanalbmat} --cpus-per-task=${NTHREADS_OCNANAL}" + export APRUN_OCNANAL="${launcher} -n ${npe_ocnanalbmat}" elif [[ "${step}" = "ocnanalrun" ]]; then export APRUNCFP="${launcher} -n \$ncmd --multi-prog" - nth_max=$((npe_node_max / npe_node_ocnanalrun)) - - export NTHREADS_OCNANAL=${nth_ocnanalrun:-${nth_max}} - [[ ${NTHREADS_OCNANAL} -gt ${nth_max} ]] && export NTHREADS_OCNANAL=${nth_max} - export APRUN_OCNANAL="${launcher} -n ${npe_ocnanalrun} --cpus-per-task=${NTHREADS_OCNANAL}" + export APRUN_OCNANAL="${launcher} -n ${npe_ocnanalrun}" elif [[ "${step}" = "ocnanalchkpt" ]]; then export APRUNCFP="${launcher} -n \$ncmd --multi-prog" - nth_max=$((npe_node_max / npe_node_ocnanalchkpt)) + export APRUN_OCNANAL="${launcher} -n ${npe_ocnanalchkpt}" + +elif [[ "${step}" = "ocnanalecen" ]]; then - export NTHREADS_OCNANAL=${nth_ocnanalchkpt:-${nth_max}} - [[ ${NTHREADS_OCNANAL} -gt ${nth_max} ]] && export NTHREADS_OCNANAL=${nth_max} - export APRUN_OCNANAL="${launcher} -n ${npe_ocnanalchkpt} --cpus-per-task=${NTHREADS_OCNANAL}" + nth_max=$((npe_node_max / npe_node_ocnanalecen)) + + export NTHREADS_OCNANALECEN=${nth_ocnanalecen:-${nth_max}} + [[ ${NTHREADS_OCNANALECEN} -gt ${nth_max} ]] && export NTHREADS_OCNANALECEN=${nth_max} + export APRUN_OCNANALECEN="${launcher} -n ${npe_ocnanalecen} --cpus-per-task=${NTHREADS_OCNANALECEN}" elif [[ "${step}" = "anal" ]] || [[ "${step}" = "analcalc" ]]; then @@ -211,6 +209,13 @@ elif [[ "${step}" = "atmos_products" ]]; then export USE_CFP="YES" # Use MPMD for downstream product generation on Hera +elif [[ "${step}" = "oceanice_products" ]]; then + + nth_max=$((npe_node_max / npe_node_oceanice_products)) + + export NTHREADS_OCNICEPOST=${nth_oceanice_products:-1} + export APRUN_OCNICEPOST="${launcher} -n 1 --cpus-per-task=${NTHREADS_OCNICEPOST}" + elif [[ "${step}" = "ecen" ]]; then nth_max=$((npe_node_max / npe_node_ecen)) @@ -247,7 +252,7 @@ elif [[ "${step}" = 
"epos" ]]; then [[ ${NTHREADS_EPOS} -gt ${nth_max} ]] && export NTHREADS_EPOS=${nth_max} export APRUN_EPOS="${launcher} -n ${npe_epos} --cpus-per-task=${NTHREADS_EPOS}" -elif [[ "${step}" = "init" ]]; then +elif [[ "${step}" = "init" ]]; then ##JKH export APRUN="${launcher} -n ${npe_init}" diff --git a/env/HERCULES.env b/env/HERCULES.env index 6a4aad7a7d..7d2aa5f8d0 100755 --- a/env/HERCULES.env +++ b/env/HERCULES.env @@ -12,7 +12,6 @@ fi step=$1 -export npe_node_max=80 export launcher="srun -l --export=ALL" export mpmd_opt="--multi-prog --output=mpmd.%j.%t.out" @@ -42,7 +41,7 @@ case ${step} in export sys_tp="HERCULES" export launcher_PREP="srun" ;; - "preplandobs") + "prepsnowobs") export APRUN_CALCFIMS="${launcher} -n 1" ;; @@ -80,13 +79,13 @@ case ${step} in [[ ${NTHREADS_AEROANL} -gt ${nth_max} ]] && export NTHREADS_AEROANL=${nth_max} export APRUN_AEROANL="${launcher} -n ${npe_aeroanlrun} --cpus-per-task=${NTHREADS_AEROANL}" ;; - "landanl") + "snowanl") - nth_max=$((npe_node_max / npe_node_landanl)) + nth_max=$((npe_node_max / npe_node_snowanl)) - export NTHREADS_LANDANL=${nth_landanl:-${nth_max}} - [[ ${NTHREADS_LANDANL} -gt ${nth_max} ]] && export NTHREADS_LANDANL=${nth_max} - export APRUN_LANDANL="${launcher} -n ${npe_landanl} --cpus-per-task=${NTHREADS_LANDANL}" + export NTHREADS_SNOWANL=${nth_snowanl:-${nth_max}} + [[ ${NTHREADS_SNOWANL} -gt ${nth_max} ]] && export NTHREADS_SNOWANL=${nth_max} + export APRUN_SNOWANL="${launcher} -n ${npe_snowanl} --cpus-per-task=${NTHREADS_SNOWANL}" export APRUN_APPLY_INCR="${launcher} -n 6" ;; @@ -208,10 +207,20 @@ case ${step} in [[ ${NTHREADS_UPP} -gt ${nth_max} ]] && export NTHREADS_UPP=${nth_max} export APRUN_UPP="${launcher} -n ${npe_upp} --cpus-per-task=${NTHREADS_UPP}" ;; + "atmos_products") export USE_CFP="YES" # Use MPMD for downstream product generation ;; + +"oceanice_products") + + nth_max=$((npe_node_max / npe_node_oceanice_products)) + + export NTHREADS_OCNICEPOST=${nth_oceanice_products:-1} + export 
APRUN_OCNICEPOST="${launcher} -n 1 --cpus-per-task=${NTHREADS_OCNICEPOST}" +;; + "ecen") nth_max=$((npe_node_max / npe_node_ecen)) diff --git a/env/JET.env b/env/JET.env index f458bff72d..df6666d8dc 100755 --- a/env/JET.env +++ b/env/JET.env @@ -4,7 +4,7 @@ if [[ $# -ne 1 ]]; then echo "Must specify an input argument to set runtime environment variables!" echo "argument can be any one of the following:" - echo "atmanlrun atmensanlrun aeroanlrun landanl" + echo "atmanlrun atmensanlrun aeroanlrun snowanl" echo "anal sfcanl fcst post metp" echo "eobs eupd ecen efcs epos" echo "postsnd awips gempak" @@ -14,15 +14,6 @@ fi step=$1 -if [[ "${PARTITION_POST_BATCH}" = "sjet" ]]; then - export npe_node_max=16 -elif [[ "${PARTITION_BATCH}" = "xjet" ]]; then - export npe_node_max=24 -elif [[ "${PARTITION_BATCH}" = "vjet" ]]; then - export npe_node_max=16 -elif [[ "${PARTITION_BATCH}" = "kjet" ]]; then - export npe_node_max=40 -fi export launcher="srun -l --epilog=/apps/local/bin/report-mem --export=ALL" export mpmd_opt="--multi-prog --output=mpmd.%j.%t.out" @@ -42,7 +33,7 @@ if [[ "${step}" = "prep" ]] || [[ "${step}" = "prepbufr" ]]; then export sys_tp="JET" export launcher_PREP="srun" -elif [[ "${step}" = "preplandobs" ]]; then +elif [[ "${step}" = "prepsnowobs" ]]; then export APRUN_CALCFIMS="${launcher} -n 1" @@ -79,13 +70,13 @@ elif [[ "${step}" = "aeroanlrun" ]]; then [[ ${NTHREADS_AEROANL} -gt ${nth_max} ]] && export NTHREADS_AEROANL=${nth_max} export APRUN_AEROANL="${launcher} -n ${npe_aeroanlrun}" -elif [[ "${step}" = "landanl" ]]; then +elif [[ "${step}" = "snowanl" ]]; then - nth_max=$((npe_node_max / npe_node_landanl)) + nth_max=$((npe_node_max / npe_node_snowanl)) - export NTHREADS_LANDANL=${nth_landanl:-${nth_max}} - [[ ${NTHREADS_LANDANL} -gt ${nth_max} ]] && export NTHREADS_LANDANL=${nth_max} - export APRUN_LANDANL="${launcher} -n ${npe_landanl}" + export NTHREADS_SNOWANL=${nth_snowanl:-${nth_max}} + [[ ${NTHREADS_SNOWANL} -gt ${nth_max} ]] && export 
NTHREADS_SNOWANL=${nth_max} + export APRUN_SNOWANL="${launcher} -n ${npe_snowanl}" export APRUN_APPLY_INCR="${launcher} -n 6" @@ -199,6 +190,13 @@ elif [[ "${step}" = "atmos_products" ]]; then export USE_CFP="YES" # Use MPMD for downstream product generation +elif [[ "${step}" = "oceanice_products" ]]; then + + nth_max=$((npe_node_max / npe_node_oceanice_products)) + + export NTHREADS_OCNICEPOST=${nth_oceanice_products:-1} + export APRUN_OCNICEPOST="${launcher} -n 1 --cpus-per-task=${NTHREADS_OCNICEPOST}" + elif [[ "${step}" = "ecen" ]]; then nth_max=$((npe_node_max / npe_node_ecen)) diff --git a/env/ORION.env b/env/ORION.env index d91fd4db03..17d0d24d97 100755 --- a/env/ORION.env +++ b/env/ORION.env @@ -4,7 +4,7 @@ if [[ $# -ne 1 ]]; then echo "Must specify an input argument to set runtime environment variables!" echo "argument can be any one of the following:" - echo "atmanlrun atmensanlrun aeroanlrun landanl" + echo "atmanlrun atmensanlrun aeroanlrun snowanl" echo "anal sfcanl fcst post metp" echo "eobs eupd ecen efcs epos" echo "postsnd awips gempak" @@ -14,7 +14,6 @@ fi step=$1 -export npe_node_max=40 export launcher="srun -l --export=ALL" export mpmd_opt="--multi-prog --output=mpmd.%j.%t.out" @@ -41,7 +40,7 @@ if [[ "${step}" = "prep" ]] || [[ "${step}" = "prepbufr" ]]; then export sys_tp="ORION" export launcher_PREP="srun" -elif [[ "${step}" = "preplandobs" ]]; then +elif [[ "${step}" = "prepsnowobs" ]]; then export APRUN_CALCFIMS="${launcher} -n 1" @@ -79,13 +78,13 @@ elif [[ "${step}" = "aeroanlrun" ]]; then [[ ${NTHREADS_AEROANL} -gt ${nth_max} ]] && export NTHREADS_AEROANL=${nth_max} export APRUN_AEROANL="${launcher} -n ${npe_aeroanlrun} --cpus-per-task=${NTHREADS_AEROANL}" -elif [[ "${step}" = "landanl" ]]; then +elif [[ "${step}" = "snowanl" ]]; then - nth_max=$((npe_node_max / npe_node_landanl)) + nth_max=$((npe_node_max / npe_node_snowanl)) - export NTHREADS_LANDANL=${nth_landanl:-${nth_max}} - [[ ${NTHREADS_LANDANL} -gt ${nth_max} ]] && export 
NTHREADS_LANDANL=${nth_max} - export APRUN_LANDANL="${launcher} -n ${npe_landanl} --cpus-per-task=${NTHREADS_LANDANL}" + export NTHREADS_SNOWANL=${nth_snowanl:-${nth_max}} + [[ ${NTHREADS_SNOWANL} -gt ${nth_max} ]] && export NTHREADS_SNOWANL=${nth_max} + export APRUN_SNOWANL="${launcher} -n ${npe_snowanl} --cpus-per-task=${NTHREADS_SNOWANL}" export APRUN_APPLY_INCR="${launcher} -n 6" @@ -119,6 +118,14 @@ elif [[ "${step}" = "ocnanalchkpt" ]]; then [[ ${NTHREADS_OCNANAL} -gt ${nth_max} ]] && export NTHREADS_OCNANAL=${nth_max} export APRUN_OCNANAL="${launcher} -n ${npe_ocnanalchkpt} --cpus-per-task=${NTHREADS_OCNANAL}" +elif [[ "${step}" = "ocnanalecen" ]]; then + + nth_max=$((npe_node_max / npe_node_ocnanalecen)) + + export NTHREADS_OCNANALECEN=${nth_ocnanalecen:-${nth_max}} + [[ ${NTHREADS_OCNANALECEN} -gt ${nth_max} ]] && export NTHREADS_OCNANALECEN=${nth_max} + export APRUN_OCNANALECEN="${launcher} -n ${npe_ocnanalecen} --cpus-per-task=${NTHREADS_OCNANALECEN}" + elif [[ "${step}" = "anal" ]] || [[ "${step}" = "analcalc" ]]; then export MKL_NUM_THREADS=4 @@ -210,6 +217,13 @@ elif [[ "${step}" = "atmos_products" ]]; then export USE_CFP="YES" # Use MPMD for downstream product generation +elif [[ "${step}" = "oceanice_products" ]]; then + + nth_max=$((npe_node_max / npe_node_oceanice_products)) + + export NTHREADS_OCNICEPOST=${nth_oceanice_products:-1} + export APRUN_OCNICEPOST="${launcher} -n 1 --cpus-per-task=${NTHREADS_OCNICEPOST}" + elif [[ "${step}" = "ecen" ]]; then nth_max=$((npe_node_max / npe_node_ecen)) diff --git a/env/S4.env b/env/S4.env index 3dab3fc3e7..ab564eb974 100755 --- a/env/S4.env +++ b/env/S4.env @@ -4,7 +4,7 @@ if [[ $# -ne 1 ]]; then echo "Must specify an input argument to set runtime environment variables!" 
echo "argument can be any one of the following:" - echo "atmanlrun atmensanlrun aeroanlrun landanl" + echo "atmanlrun atmensanlrun aeroanlrun snowanl" echo "anal sfcanl fcst post metp" echo "eobs eupd ecen efcs epos" echo "postsnd awips gempak" @@ -13,13 +13,7 @@ if [[ $# -ne 1 ]]; then fi step=$1 -PARTITION_BATCH=${PARTITION_BATCH:-"s4"} -if [[ ${PARTITION_BATCH} = "s4" ]]; then - export npe_node_max=32 -elif [[ ${PARTITION_BATCH} = "ivy" ]]; then - export npe_node_max=20 -fi export launcher="srun -l --export=ALL" export mpmd_opt="--multi-prog --output=mpmd.%j.%t.out" @@ -39,7 +33,7 @@ if [[ "${step}" = "prep" ]] || [[ "${step}" = "prepbufr" ]]; then export sys_tp="S4" export launcher_PREP="srun" -elif [[ "${step}" = "preplandobs" ]]; then +elif [[ "${step}" = "prepsnowobs" ]]; then export APRUN_CALCFIMS="${launcher} -n 1" @@ -76,13 +70,13 @@ elif [[ "${step}" = "aeroanlrun" ]]; then [[ ${NTHREADS_AEROANL} -gt ${nth_max} ]] && export NTHREADS_AEROANL=${nth_max} export APRUN_AEROANL="${launcher} -n ${npe_aeroanlrun}" -elif [[ "${step}" = "landanl" ]]; then +elif [[ "${step}" = "snowanl" ]]; then - nth_max=$((npe_node_max / npe_node_landanl)) + nth_max=$((npe_node_max / npe_node_snowanl)) - export NTHREADS_LANDANL=${nth_landanl:-${nth_max}} - [[ ${NTHREADS_LANDANL} -gt ${nth_max} ]] && export NTHREADS_LANDANL=${nth_max} - export APRUN_LANDANL="${launcher} -n ${npe_landanl}" + export NTHREADS_SNOWANL=${nth_snowanl:-${nth_max}} + [[ ${NTHREADS_SNOWANL} -gt ${nth_max} ]] && export NTHREADS_SNOWANL=${nth_max} + export APRUN_SNOWANL="${launcher} -n ${npe_snowanl}" export APRUN_APPLY_INCR="${launcher} -n 6" @@ -183,6 +177,13 @@ elif [[ "${step}" = "atmos_products" ]]; then export USE_CFP="YES" # Use MPMD for downstream product generation +elif [[ "${step}" = "oceanice_products" ]]; then + + nth_max=$((npe_node_max / npe_node_oceanice_products)) + + export NTHREADS_OCNICEPOST=${nth_oceanice_products:-1} + export APRUN_OCNICEPOST="${launcher} -n 1 
--cpus-per-task=${NTHREADS_OCNICEPOST}" + elif [[ "${step}" = "ecen" ]]; then nth_max=$((npe_node_max / npe_node_ecen)) diff --git a/env/WCOSS2.env b/env/WCOSS2.env index a4fe81060d..4533629edc 100755 --- a/env/WCOSS2.env +++ b/env/WCOSS2.env @@ -4,7 +4,7 @@ if [[ $# -ne 1 ]]; then echo "Must specify an input argument to set runtime environment variables!" echo "argument can be any one of the following:" - echo "atmanlrun atmensanlrun aeroanlrun landanl" + echo "atmanlrun atmensanlrun aeroanlrun snowanl" echo "anal sfcanl fcst post metp" echo "eobs eupd ecen esfc efcs epos" echo "postsnd awips gempak" @@ -18,8 +18,6 @@ step=$1 export launcher="mpiexec -l" export mpmd_opt="--cpu-bind verbose,core cfp" -export npe_node_max=128 - if [[ "${step}" = "prep" ]] || [[ "${step}" = "prepbufr" ]]; then nth_max=$((npe_node_max / npe_node_prep)) @@ -29,7 +27,7 @@ if [[ "${step}" = "prep" ]] || [[ "${step}" = "prepbufr" ]]; then export sys_tp="wcoss2" export launcher_PREP="mpiexec" -elif [[ "${step}" = "preplandobs" ]]; then +elif [[ "${step}" = "prepsnowobs" ]]; then export APRUN_CALCFIMS="${launcher} -n 1" @@ -66,13 +64,13 @@ elif [[ "${step}" = "aeroanlrun" ]]; then [[ ${NTHREADS_AEROANL} -gt ${nth_max} ]] && export NTHREADS_AEROANL=${nth_max} export APRUN_AEROANL="${launcher} -n ${npe_aeroanlrun}" -elif [[ "${step}" = "landanl" ]]; then +elif [[ "${step}" = "snowanl" ]]; then - nth_max=$((npe_node_max / npe_node_landanl)) + nth_max=$((npe_node_max / npe_node_snowanl)) - export NTHREADS_LANDANL=${nth_landanl:-${nth_max}} - [[ ${NTHREADS_LANDANL} -gt ${nth_max} ]] && export NTHREADS_LANDANL=${nth_max} - export APRUN_LANDANL="${launcher} -n ${npe_landanl}" + export NTHREADS_SNOWANL=${nth_snowanl:-${nth_max}} + [[ ${NTHREADS_SNOWANL} -gt ${nth_max} ]] && export NTHREADS_SNOWANL=${nth_max} + export APRUN_SNOWANL="${launcher} -n ${npe_snowanl}" export APRUN_APPLY_INCR="${launcher} -n 6" @@ -175,10 +173,9 @@ elif [[ "${step}" = "fcst" ]] || [[ "${step}" = "efcs" ]]; then # 
https://github.com/ufs-community/ufs-weather-model/blob/develop/tests/fv3_conf/fv3_qsub.IN_wcoss2 export FI_OFI_RXM_RX_SIZE=40000 export FI_OFI_RXM_TX_SIZE=40000 - if [[ "${step}" = "fcst" ]]; then + if [[ "${step}" = "fcst" ]] || [[ "${step}" = "efcs" ]]; then export OMP_PLACES=cores export OMP_STACKSIZE=2048M - elif [[ "${step}" = "efcs" ]]; then export MPICH_MPIIO_HINTS="*:romio_cb_write=disable" export FI_OFI_RXM_SAR_LIMIT=3145728 fi @@ -195,6 +192,13 @@ elif [[ "${step}" = "atmos_products" ]]; then export USE_CFP="YES" # Use MPMD for downstream product generation +elif [[ "${step}" = "oceanice_products" ]]; then + + nth_max=$((npe_node_max / npe_node_oceanice_products)) + + export NTHREADS_OCNICEPOST=${nth_oceanice_products:-1} + export APRUN_OCNICEPOST="${launcher} -n 1 -ppn ${npe_node_oceanice_products} --cpu-bind depth --depth ${NTHREADS_OCNICEPOST}" + elif [[ "${step}" = "ecen" ]]; then nth_max=$((npe_node_max / npe_node_ecen)) diff --git a/gempak/fix/datatype.tbl b/gempak/fix/datatype.tbl index e52e156de4..63b06c0826 100755 --- a/gempak/fix/datatype.tbl +++ b/gempak/fix/datatype.tbl @@ -102,10 +102,10 @@ LTNG $OBS/ltng YYYYMMDDHH.ltng CAT_MSC SCAT_N ! CLIMO $GEMPAK/climo climate_MM.mos CAT_NIL SCAT_NIL 1 -1 -1 ! 
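The per-step thread computation repeated across the env-file hunks above (HERA, HERCULES, JET, ORION, S4, WCOSS2) follows one pattern: divide ``npe_node_max`` by the step's ranks-per-node, default the thread count to that maximum, then clamp any oversized request. A standalone sketch of that pattern; the numeric values are illustrative, not any real machine's limits:

```shell
#!/usr/bin/env bash
# Sketch of the thread-count clamp used for each step in the env files.
# All values below are illustrative assumptions.
npe_node_max=40        # cores available per node
npe_node_snowanl=10    # MPI ranks placed on each node for this step
npe_snowanl=20         # total MPI ranks for the step
nth_snowanl=""         # optional user override (empty -> use the maximum)

nth_max=$((npe_node_max / npe_node_snowanl))   # threads that fit per rank

# Default to the maximum, then clamp any request that exceeds it.
NTHREADS_SNOWANL=${nth_snowanl:-${nth_max}}
(( NTHREADS_SNOWANL > nth_max )) && NTHREADS_SNOWANL=${nth_max}

APRUN_SNOWANL="srun -l -n ${npe_snowanl} --cpus-per-task=${NTHREADS_SNOWANL}"
echo "${APRUN_SNOWANL}"
```

This is why removing the hard-coded ``npe_node_max`` exports from the env files (as this patch does) is safe only if ``npe_node_max`` is exported upstream before each env file is sourced.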
-GFS $MODEL/gfs gfs_YYYYMMDDHH CAT_GRD SCAT_FCT -1 -1 -1 -F-GFS $COMIN gfs_YYYYMMDDHHfFFF CAT_GRD SCAT_FCT -1 -1 -1 -F-GFSP $COMIN gfs_YYYYMMDDHHfFFF CAT_GRD SCAT_FCT -1 -1 -1 -F-GFSHPC $HPCGFS gfs_YYYYMMDDHHfFFF CAT_GRD SCAT_FCT -1 -1 -1 +GFS $MODEL/gfs gfs_1p00_YYYYMMDDHH CAT_GRD SCAT_FCT -1 -1 -1 +F-GFS $COMIN gfs_1p00_YYYYMMDDHHfFFF CAT_GRD SCAT_FCT -1 -1 -1 +F-GFSP $COMIN gfs_1p00_YYYYMMDDHHfFFF CAT_GRD SCAT_FCT -1 -1 -1 +F-GFSHPC $HPCGFS gfs_1p00_YYYYMMDDHHfFFF CAT_GRD SCAT_FCT -1 -1 -1 GFSEXT $MODEL/ens gfs.YYYYMMDDHH CAT_GRD SCAT_FCT -1 -1 -1 GFS1 $MODEL/ens gfs1.YYYYMMDDHH CAT_GRD SCAT_FCT -1 -1 -1 GFS2 $MODEL/ens gfs2.YYYYMMDDHH CAT_GRD SCAT_FCT -1 -1 -1 @@ -156,9 +156,9 @@ F-NAMP20 $COMIN nam20_YYYYMMDDHHfFFF CAT_GRD SCAT_F F-NAMP44 $COMIN nam44_YYYYMMDDHHfFFF CAT_GRD SCAT_FCT -1 -1 -1 F-THREATS $COMIN ${NEST}_YYYYMMDDHHfFFF CAT_GRD SCAT_FCT -1 -1 -1 F-NAMHPC $HPCNAM nam_YYYYMMDDHHfFFF CAT_GRD SCAT_FCT -1 -1 -1 -GDAS $MODEL/gdas gdas_YYMMDDHH CAT_GRD SCAT_FCT -1 -1 -1 -F-GDAS $COMIN gdas_YYYYMMDDHHfFFF CAT_GRD SCAT_FCT -1 -1 -1 -F-GFS $COMIN gfs_YYYYMMDDHHfFFF CAT_GRD SCAT_FCT -1 -1 -1 +GDAS $MODEL/gdas gdas_1p00_YYYYMMDDHH CAT_GRD SCAT_FCT -1 -1 -1 +F-GDAS $COMIN gdas_1p00_YYYYMMDDHHfFFF CAT_GRD SCAT_FCT -1 -1 -1 +F-GFS $COMIN gfs_1p00_YYYYMMDDHHfFFF CAT_GRD SCAT_FCT -1 -1 -1 F-HWRF $COMIN hwrfp_YYYYMMDDHHfFFF_* CAT_GRD SCAT_FCT -1 -1 -1 F-HWRFN $COMIN hwrfn_YYYYMMDDHHfFFF_* CAT_GRD SCAT_FCT -1 -1 -1 F-GHM $COMIN ghmg_YYYYMMDDHHfFFF_* CAT_GRD SCAT_FCT -1 -1 -1 diff --git a/gempak/fix/gfs_meta b/gempak/fix/gfs_meta index 5ca99b4dc6..c86233214b 100755 --- a/gempak/fix/gfs_meta +++ b/gempak/fix/gfs_meta @@ -1,23 +1,23 @@ -$USHgempak/gfs_meta_us.sh 36 84 126 216 -$USHgempak/gfs_meta_bwx.sh 36 84 126 180 -$USHgempak/gfs_meta_comp.sh 36 84 126 -$USHgempak/gfs_meta_ak.sh 36 84 132 216 -$USHgempak/gfs_meta_crb.sh 126 -$USHgempak/gfs_meta_hur.sh 36 84 126 -$USHgempak/gfs_meta_qpf.sh 36 84 132 216 -$USHgempak/gfs_meta_precip.sh 36 84 132 216 384 
-$USHgempak/gfs_meta_sa.sh 126 -$USHgempak/gfs_meta_ver.sh 126 -$USHgempak/gfs_meta_hi.sh 384 -$USHgempak/gfs_meta_nhsh.sh 384 -$USHgempak/gfs_meta_trop.sh 384 -$USHgempak/gfs_meta_usext.sh 384 -$USHgempak/gfs_meta_mar_ql.sh 24 48 96 180 -$USHgempak/gfs_meta_mar_comp.sh 126 -$USHgempak/gfs_meta_opc_na_ver 126 -$USHgempak/gfs_meta_opc_np_ver 126 -$USHgempak/gfs_meta_mar_atl.sh 180 -$USHgempak/gfs_meta_mar_pac.sh 180 -$USHgempak/gfs_meta_mar_ver.sh 48 -$USHgempak/gfs_meta_mar_skewt.sh 72 -$USHgempak/gfs_meta_sa2.sh 144 +${HOMEgfs}/gempak/ush/gfs_meta_us.sh 36 84 126 216 +${HOMEgfs}/gempak/ush/gfs_meta_bwx.sh 36 84 126 180 +${HOMEgfs}/gempak/ush/gfs_meta_comp.sh 36 84 126 +${HOMEgfs}/gempak/ush/gfs_meta_ak.sh 36 84 132 216 +${HOMEgfs}/gempak/ush/gfs_meta_crb.sh 126 +${HOMEgfs}/gempak/ush/gfs_meta_hur.sh 36 84 126 +${HOMEgfs}/gempak/ush/gfs_meta_qpf.sh 36 84 132 216 +${HOMEgfs}/gempak/ush/gfs_meta_precip.sh 36 84 132 216 384 +${HOMEgfs}/gempak/ush/gfs_meta_sa.sh 126 +${HOMEgfs}/gempak/ush/gfs_meta_ver.sh 126 +${HOMEgfs}/gempak/ush/gfs_meta_hi.sh 384 +${HOMEgfs}/gempak/ush/gfs_meta_nhsh.sh 384 +${HOMEgfs}/gempak/ush/gfs_meta_trop.sh 384 +${HOMEgfs}/gempak/ush/gfs_meta_usext.sh 384 +${HOMEgfs}/gempak/ush/gfs_meta_mar_ql.sh 24 48 96 180 +${HOMEgfs}/gempak/ush/gfs_meta_mar_comp.sh 126 +${HOMEgfs}/gempak/ush/gfs_meta_opc_na_ver 126 +${HOMEgfs}/gempak/ush/gfs_meta_opc_np_ver 126 +${HOMEgfs}/gempak/ush/gfs_meta_mar_atl.sh 180 +${HOMEgfs}/gempak/ush/gfs_meta_mar_pac.sh 180 +${HOMEgfs}/gempak/ush/gfs_meta_mar_ver.sh 48 +${HOMEgfs}/gempak/ush/gfs_meta_mar_skewt.sh 72 +${HOMEgfs}/gempak/ush/gfs_meta_sa2.sh 144 diff --git a/gempak/ush/gdas_ecmwf_meta_ver.sh b/gempak/ush/gdas_ecmwf_meta_ver.sh index e4fffd9c8a..4a4e5b5b64 100755 --- a/gempak/ush/gdas_ecmwf_meta_ver.sh +++ b/gempak/ush/gdas_ecmwf_meta_ver.sh @@ -1,75 +1,39 @@ -#!/bin/sh -# -# Metafile Script : gdas_ecmwf_meta_ver +#! 
/usr/bin/env bash # # Creates a loop comparing the 6 hr gdas fcst to the pervious 7 days # of ecmwf fcsts # -# Log : -# J. Carr/HPC 3/2001 New metafile for verification of ecmwf. -# J. Carr/HPC 5/2001 Added a mn variable for a/b side dbnet root variable. -# M. Klein/HPC 11/2004 Changed verification grid from fnl to gdas -# M. Klein/HPC 2/2005 Changed location of working directory to /ptmp -# M. Klein/HPC 11/2006 Modify to run in production. -# - -#cd $DATA -set -xa - -if [ $cyc -ne "06" ] ; then - exit -fi +source "${HOMEgfs}/ush/preamble.sh" -export pgm=gdplot2_nc;. prep_step; startmsg +export pgm=gdplot2_nc;. prep_step -cyc=12 +cyc2=12 device="nc | ecmwfver.meta" -PDY2=$(echo ${PDY} | cut -c3-) # # Copy in datatype table to define gdfile type # -cp $FIXgempak/datatype.tbl datatype.tbl + +cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl export err=$? -if [[ $err -ne 0 ]] ; then - echo " File datatype.tbl does not exist." - exit $err +if (( err != 0 )) ; then + echo "FATAL ERROR: File datatype.tbl does not exist." + exit "${err}" fi -# -# DEFINE YESTERDAY -date1=$($NDATE -24 ${PDY}${cyc} | cut -c -8) -sdate1=$(echo ${date1} | cut -c 3-) -# DEFINE 2 DAYS AGO -date2=$($NDATE -48 ${PDY}${cyc} | cut -c -8) -sdate2=$(echo ${date2} | cut -c 3-) -# DEFINE 3 DAYS AGO -date3=$($NDATE -72 ${PDY}${cyc} | cut -c -8) -sdate3=$(echo ${date3} | cut -c 3-) -# DEFINE 4 DAYS AGO -date4=$($NDATE -96 ${PDY}${cyc} | cut -c -8) -sdate4=$(echo ${date4} | cut -c 3-) -# DEFINE 5 DAYS AGO -date5=$($NDATE -120 ${PDY}${cyc} | cut -c -8) -sdate5=$(echo ${date5} | cut -c 3-) -# DEFINE 6 DAYS AGO -date6=$($NDATE -144 ${PDY}${cyc} | cut -c -8) -sdate6=$(echo ${date6} | cut -c 3-) -# DEFINE 7 DAYS AGO -date7=$($NDATE -168 ${PDY}${cyc} | cut -c -8) -sdate7=$(echo ${date7} | cut -c 3-) - -vergrid="F-GDAS | ${PDY2}/0600" +export COMIN="gdas.${PDY}${cyc}" +if [[ ! 
-L ${COMIN} ]]; then + ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}" +fi +vergrid="F-GDAS | ${PDY:2}/0600" fcsthr="0600f006" # GENERATING THE METAFILES. areas="SAM NAM" -verdays="${date1} ${date2} ${date3} ${date4} ${date5} ${date6} ${date7}" -for area in $areas - do - if [ $area == "NAM" ] ; then +for area in ${areas}; do + if [[ "${area}" == "NAM" ]] ; then garea="5.1;-124.6;49.6;-11.9" proj="STR/90.0;-95.0;0.0" latlon="0" @@ -80,37 +44,18 @@ for area in $areas latlon="1/10/1/2/10;10" run=" " fi - for verday in $verdays - do - verddate=$(echo ${verday} | cut -c 3-) - if [ ${verday} -eq ${date1} ] ; then - dgdattim=f024 - sdatenum=$sdate1 - elif [ ${verday} -eq ${date2} ] ; then - dgdattim=f048 - sdatenum=$sdate2 - elif [ ${verday} -eq ${date3} ] ; then - dgdattim=f072 - sdatenum=$sdate3 - elif [ ${verday} -eq ${date4} ] ; then - dgdattim=f096 - sdatenum=$sdate4 - elif [ ${verday} -eq ${date5} ] ; then - dgdattim=f120 - sdatenum=$sdate5 - elif [ ${verday} -eq ${date6} ] ; then - dgdattim=f144 - sdatenum=$sdate6 - elif [ ${verday} -eq ${date7} ] ; then - dgdattim=f168 - sdatenum=$sdate7 + for (( fhr=24; fhr<=168; fhr+=24 )); do + dgdattim=$(printf "f%03d" "${fhr}") + sdatenum=$(date --utc +%y%m%d -d "${PDY} ${cyc2} - ${fhr} hours") + + if [[ ! 
-L "ecmwf.20${sdatenum}" ]]; then + ln -sf "${COMINecmwf}/ecmwf.20${sdatenum}/gempak" "ecmwf.20${sdatenum}" fi - # JY grid="$COMROOT/nawips/${envir}/ecmwf.20${sdatenum}/ecmwf_glob_20${sdatenum}12" - grid="${COMINecmwf}.20${sdatenum}/gempak/ecmwf_glob_20${sdatenum}12" + gdfile="ecmwf.20${sdatenum}/ecmwf_glob_20${sdatenum}12" -# 500 MB HEIGHT METAFILE + # 500 MB HEIGHT METAFILE -$GEMEXE/gdplot2_nc << EOFplt + "${GEMEXE}/gdplot2_nc" << EOFplt \$MAPFIL = mepowo.gsf PROJ = ${proj} GAREA = ${garea} @@ -134,7 +79,7 @@ line = 6/1/3 title = 6/-2/~ GDAS 500 MB HGT (6-HR FCST)|~${area} 500 HGT DF r -gdfile = ${grid} +gdfile = ${gdfile} gdattim = ${dgdattim} line = 5/1/3 contur = 4 @@ -157,7 +102,7 @@ clear = yes latlon = ${latlon} r -gdfile = ${grid} +gdfile = ${gdfile} gdattim = ${dgdattim} line = 5/1/3 contur = 4 @@ -165,7 +110,7 @@ title = 5/-1/~ ECMWF PMSL clear = no r -PROJ = +PROJ = GAREA = bwus gdfile = ${vergrid} gdattim = ${fcsthr} @@ -181,7 +126,7 @@ clear = yes latlon = ${latlon} ${run} -gdfile = ${grid} +gdfile = ${gdfile} gdattim = ${dgdattim} line = 5/1/3 contur = 4 @@ -195,28 +140,28 @@ EOFplt done done -export err=$?;err_chk +export err=$? + ##################################################### # GEMPAK DOES NOT ALWAYS HAVE A NON ZERO RETURN CODE # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK # FOR THIS CASE HERE. ##################################################### -ls -l ecmwfver.meta -export err=$?;export pgm="GEMPAK CHECK FILE";err_chk - -if [ $SENDCOM = "YES" ] ; then - mkdir -p -m 775 ${COMOUTecmwf}.${PDY}/meta - mv ecmwfver.meta ${COMOUTecmwf}.${PDY}/meta/ecmwfver_${PDY}_${cyc} - export err=$? - if [[ $err -ne 0 ]] ; then - echo " File ecmwfver.meta does not exist." - exit $err - fi +if (( err != 0 )) || [[ ! 
-s ecmwfver.meta ]]; then + echo "FATAL ERROR: Failed to create ecmwf meta file" + exit "${err}" +fi - if [ $SENDDBN = "YES" ] ; then - ${DBNROOT}/bin/dbn_alert MODEL ECMWFVER_HPCMETAFILE $job \ - ${COMOUTecmwf}.${PDY}/meta/ecmwfver_${PDY}_${cyc} - fi +mv ecmwfver.meta "${COM_ATMOS_GEMPAK_META}/ecmwfver_${PDY}_${cyc2}" +export err=$? +if (( err != 0 )) ; then + echo "FATAL ERROR: Failed to move meta file to ${COM_ATMOS_GEMPAK_META}/ecmwfver_${PDY}_${cyc2}" + exit "${err}" +fi + +if [[ "${SENDDBN}" == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL ECMWFVER_HPCMETAFILE "${job}" \ + "${COM_ATMOS_GEMPAK_META}/ecmwfver_${PDY}_${cyc2}" fi exit diff --git a/gempak/ush/gdas_meta_loop.sh b/gempak/ush/gdas_meta_loop.sh index cd0d9b781b..e09fc9a7f9 100755 --- a/gempak/ush/gdas_meta_loop.sh +++ b/gempak/ush/gdas_meta_loop.sh @@ -1,91 +1,59 @@ -#! /bin/sh +#! /usr/bin/env bash # # Metafile Script : gdas_meta_loop # -# Log : -# D.W.Plummer/NCEP 2/97 Add log header -# J. Carr/HPC 3/98 Changed to gdplot2 -# J. Carr/HPC 8/98 Changed map to medium resolution -# J. Carr/HPC 2/99 Changed skip to 0 -# J. Carr/HPC 2/01 Implemented usage on IBM operationally. -# J. Carr/HPC 5/2001 Added a mn variable for a/b side dbnet root variable. -# M. Klein/HPC 11/2004 Change fnl to gdas -# M. Klein/HPC 2/2005 Changed location of working directory to /ptmp -# M. Klein/HPC 11/2006 Modify for production on CCS -#cd $DATA - -set -xa +source "${HOMEgfs}/ush/preamble.sh" device="nc | gdasloop.meta" -PDY2=$(echo $PDY | cut -c3-) +# +# Link data into DATA to sidestep gempak path limits +# TODO: Replace this +# +export COMIN="${RUN}.${PDY}${cyc}" +if [[ ! -L "${COMIN}" ]]; then + ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}" +fi -if [ "$envir" = "para" ] ; then +if [[ "${envir}" == "para" ]] ; then export m_title="GDASP" else export m_title="GDAS" fi -export COMPONENT=${COMPONENT:-atmos} -export pgm=gdplot2_nc;. prep_step; startmsg +export pgm=gdplot2_nc;. 
prep_step -# -# Copy in datatype table to define gdfile type -# -cp $FIXgempak/datatype.tbl datatype.tbl -export err=$? -if [[ $err -ne 0 ]] ; then - echo " File datatype.tbl does not exist." - exit $err -fi - -# -# Define previous days -# -PDYm1=$($NDATE -24 ${PDY}${cyc} | cut -c -8) -PDYm2=$($NDATE -48 ${PDY}${cyc} | cut -c -8) -PDYm3=$($NDATE -72 ${PDY}${cyc} | cut -c -8) -PDYm4=$($NDATE -96 ${PDY}${cyc} | cut -c -8) -PDYm5=$($NDATE -120 ${PDY}${cyc} | cut -c -8) -PDYm6=$($NDATE -144 ${PDY}${cyc} | cut -c -8) -# - -verdays="$PDYm6 $PDYm5 $PDYm4 $PDYm3 $PDYm2 $PDYm1 $PDY" - -for day in $verdays - do - PDY2=$(echo $day | cut -c 3-) - if [ $day -eq $PDY ] ; then - if [ $cyc -eq "00" ] ; then - cycles="00" - elif [ $cyc -eq "06" ] ; then - cycles="00 06" - elif [ $cyc -eq "12" ] ; then - cycles="00 06 12" - elif [ $cyc -eq "18" ] ; then - cycles="00 06 12 18" +for (( fhr=24; fhr<=144; fhr+=24 )); do + day=$(date --utc +%Y%m%d -d "${PDY} ${cyc} - ${fhr} hours") + if (( ${day}${cyc} < SDATE )); then + # Stop looking because these cycles weren't run + if (( fhr == 24 )); then + exit + else + break fi - else - cycles="00 06 12 18" fi - for cycle in $cycles - do -# Test with GDAS in PROD -# grid="${COMROOT}/nawips/${envir}/gdas.${day}/gdas_${day}${cycle}f000" - export COMIN=${COMINgdas}.${day}/${cycle}/${COMPONENT}/gempak - grid="${COMINgdas}.${day}/${cycle}/${COMPONENT}/gempak/gdas_${day}${cycle}f000" + cycles=$(seq -s ' ' -f "%02g" 0 6 "${cyc}") + for cycle in ${cycles}; do + # Test with GDAS in PROD + YMD=${day} HH=${cyc} GRID=1p00 generate_com "COM_ATMOS_GEMPAK_1p00_past:COM_ATMOS_GEMPAK_TMPL" + export COMIN="${RUN}.${day}${cycle}" + if [[ ! 
-L "${COMIN}" ]]; then + ln -sf "${COM_ATMOS_GEMPAK_1p00_past}" "${COMIN}" + fi + gdfile="${COMIN}/gdas_1p00_${day}${cycle}f000" -$GEMEXE/gdplot2_nc << EOF + "${GEMEXE}/gdplot2_nc" << EOF \$MAPFIL = mepowo.gsf -GDFILE = $grid -GDATTIM = F00 -DEVICE = $device +GDFILE = ${gdfile} +GDATTIM = F000 +DEVICE = ${device} PANEL = 0 TEXT = m/21//hw CONTUR = 2 -PROJ = +PROJ = GAREA = nam LATLON = 0 CLEAR = yes @@ -106,9 +74,9 @@ CLRBAR = 1/V/LL !0 WIND = am0 MAP = 1/1/1 REFVEC = -TITLE = 1/0/~ $m_title PW, EST MSLP, THICKNESS|~NAM PRCP WATER!0 +TITLE = 1/0/~ ${m_title} PW, EST MSLP, THICKNESS|~NAM PRCP WATER!0 r - + PROJ = STR/90;-105;0 GAREA = 2;-139;27;-22 LATLON = 1/1/1//15;15 @@ -124,11 +92,11 @@ LINE = 7/5/1/2 !20/1/2/1 FINT = 15;21;27;33;39;45;51;57 FLINE = 0;23-15 HILO = 2;6/X;N/10-99;10-99! -HLSYM = +HLSYM = CLRBAR = 1 WIND = 0 REFVEC = -TITLE = 5/-2/~ $m_title @ HGT AND VORTICITY|~NAM @ HGT AND VORT!0 +TITLE = 5/-2/~ ${m_title} @ HGT AND VORTICITY|~NAM @ HGT AND VORT!0 r GLEVEL = 250 @@ -146,50 +114,24 @@ HLSYM = CLRBAR = 1 WIND = 0 !Bk9/.7/2/b/! 
REFVEC = -TITLE = 5/-2/~ $m_title @ HGHT, ISOTACHS AND WIND (KTS)|~NAM @ HGT & WIND!0 +TITLE = 5/-2/~ ${m_title} @ HGHT, ISOTACHS AND WIND (KTS)|~NAM @ HGT & WIND!0 FILTER = n r exit EOF - done - -done - -for day in $verdays - do - PDY2=$(echo $day | cut -c 3-) - if [ $day -eq $PDY ] ; then - if [ $cyc -eq "00" ] ; then - cycles="00" - elif [ $cyc -eq "06" ] ; then - cycles="00 06" - elif [ $cyc -eq "12" ] ; then - cycles="00 06 12" - elif [ $cyc -eq "18" ] ; then - cycles="00 06 12 18" - fi - else - cycles="00 06 12 18" - fi + gdfile="${COMIN}/gdas_1p00_${day}${cycle}f000" - for cycle in $cycles - do -# Test with GDAS in PROD -# grid="${COMROOT}/nawips/${envir}/gdas.${day}/gdas_${day}${cycle}f000" - export COMIN=${COMINgdas}.${day}/${cycle}/${COMPONENT}/gempak - grid="${COMINgdas}.${day}/${cycle}/${COMPONENT}/gempak/gdas_${day}${cycle}f000" - -$GEMEXE/gdplot2_nc << EOF +"${GEMEXE}/gdplot2_nc" << EOF \$MAPFIL = mepowo.gsf -GDFILE = $grid -GDATTIM = F00 -DEVICE = $device +GDFILE = ${gdfile} +GDATTIM = F000 +DEVICE = ${device} PANEL = 0 TEXT = m/21//hw CONTUR = 1 -PROJ = +PROJ = GAREA = samps LATLON = 1/1/1//15;15 CLEAR = yes @@ -210,9 +152,9 @@ CLRBAR = 1/V/LL !0 WIND = am0 MAP = 1/1/1 REFVEC = -TITLE = 1/0/~ $m_title PW, MSLP, THICKNESS|~SAM PRCP WATER!0 +TITLE = 1/0/~ ${m_title} PW, MSLP, THICKNESS|~SAM PRCP WATER!0 r - + GLEVEL = 500 GVCORD = PRES SKIP = 0 !0 !0 !0 !0 @@ -225,11 +167,11 @@ LINE = 7/5/1/2 !29/5/1/2!7/5/1/2 !29/5/1/2 !20/1/2/1 FINT = 16;20;24;28;32;36;40;44 FLINE = 0;23-15 HILO = 2;6/X;N/10-99;10-99! !2;6/X;N/10-99;10-99! ! -HLSYM = +HLSYM = CLRBAR = 1 WIND = 0 REFVEC = -TITLE = 5/-2/~ $m_title @ HGT AND VORTICITY|~SAM @ HGT & VORT!0 +TITLE = 5/-2/~ ${m_title} @ HGT AND VORTICITY|~SAM @ HGT & VORT!0 r GLEVEL = 250 @@ -247,7 +189,7 @@ HLSYM = CLRBAR = 1 WIND = 0 !Bk9/.7/2/b/! 
REFVEC = -TITLE = 5/-2/~ $m_title @ HGHT, ISOTACHS AND WIND (KTS)|~SAM @ HGT & WIND!0 +TITLE = 5/-2/~ ${m_title} @ HGHT, ISOTACHS AND WIND (KTS)|~SAM @ HGT & WIND!0 FILTER = n r @@ -261,11 +203,11 @@ TYPE = c !c CINT = 1 !4 LINE = 22/5/2/1 !10/1/1 FINT = -FLINE = +FLINE = HILO = !26;2/H#;L#/1020-1070;900-1012/3/30;30/y HLSYM = !2;1.5//21//hw WIND = 0 -TITLE = 1/-1/~ $m_title PMSL, 1000-850mb THKN|~SAM PMSL, 1000-850 TK!0 +TITLE = 1/-1/~ ${m_title} PMSL, 1000-850mb THKN|~SAM PMSL, 1000-850 TK!0 r exit @@ -274,27 +216,28 @@ EOF done done -export err=$?;err_chk +export err=$? + ##################################################### # GEMPAK DOES NOT ALWAYS HAVE A NON ZERO RETURN CODE # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK # FOR THIS CASE HERE. ##################################################### -ls -l gdasloop.meta -export err=$?;export pgm="GEMPAK CHECK FILE";err_chk +if (( err != 0 )) || [[ ! -s gdasloop.meta ]]; then + echo "FATAL ERROR: Failed to create gdasloop meta file" + exit "${err}" +fi -if [ $SENDCOM = "YES" ] ; then - mv gdasloop.meta ${COMOUT}/gdas_${PDY}_${cyc}_loop - export err=$? - if [[ $err -ne 0 ]] ; then - echo " File gdasloop.meta does not exist." - exit $err - fi +mv gdasloop.meta "${COM_ATMOS_GEMPAK_META}/gdas_${PDY}_${cyc}_loop" +export err=$? +if (( err != 0 )) ; then + echo "FATAL ERROR: Failed to move meta file to ${COM_ATMOS_GEMPAK_META}/gdas_${PDY}_${cyc}_loop" + exit "${err}" +fi - if [ $SENDDBN = "YES" ] ; then - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - $COMOUT/gdas_${PDY}_${cyc}_loop - fi +if [[ ${SENDDBN} == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/gdas_${PDY}_${cyc}_loop" fi exit diff --git a/gempak/ush/gdas_meta_na.sh b/gempak/ush/gdas_meta_na.sh index 6c4768cfb7..9c51bc18a7 100755 --- a/gempak/ush/gdas_meta_na.sh +++ b/gempak/ush/gdas_meta_na.sh @@ -1,41 +1,33 @@ -#!/bin/sh - +#! 
/usr/bin/env bash # # Metafile Script : gdas_meta_na # -# Log : -# D.W.Plummer/NCEP 2/97 Add log header -# LJ REED 4/10/98 added line to define BIN_DIR -# J. Carr/HPC 2/99 Changed skip to 0 -# B. Gordon 4/00 Modified for production on IBM-SP -# and changed gdplot_nc -> gdplot2_nc -# D. Michaud 4/01 Added logic to display different title -# for parallel runs -# J. Carr 11/04 Added a ? in all title/TITLE lines. -# J. Carr 11/04 Changed GAREA and PROJ to match GFS and NAM. -# - -cd $DATA -set -xa +source "${HOMEgfs}/ush/preamble.sh" device="nc | gdas.meta" -PDY2=$(echo $PDY | cut -c3-) +# +# Link data into DATA to sidestep gempak path limits +# TODO: Replace this +# +export COMIN="${RUN}.${PDY}${cyc}" +if [[ ! -L "${COMIN}" ]]; then + ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}" +fi -if [ "$envir" = "para" ] ; then +if [[ "${envir}" == "para" ]] ; then export m_title="GDASP" else export m_title="GDAS" fi export pgm=gdplot2_nc; prep_step -startmsg -$GEMEXE/gdplot2_nc << EOF -GDFILE = F-GDAS | ${PDY2}/${cyc}00 +"${GEMEXE}/gdplot2_nc" << EOF +GDFILE = F-GDAS | ${PDY:2}/${cyc}00 GDATTIM = FALL -DEVICE = $device +DEVICE = ${device} PANEL = 0 TEXT = 1/21//hw CONTUR = 2 @@ -51,72 +43,70 @@ PROJ = str/90;-105;0 LATLON = 1 -restore $USHgempak/restore/pmsl_thkn.2.nts +restore ${HOMEgfs}/gempak/ush/restore/pmsl_thkn.2.nts CLRBAR = 1 HLSYM = 2;1.5//21//hw TEXT = 1/21//hw -TITLE = 5/-2/~ ? $m_title PMSL, 1000-500 MB THICKNESS|~MSLP, 1000-500 THKN!0 +TITLE = 5/-2/~ ? ${m_title} PMSL, 1000-500 MB THICKNESS|~MSLP, 1000-500 THKN!0 l ru -restore $USHgempak/restore/850mb_hght_tmpc.2.nts +restore ${HOMEgfs}/gempak/ush/restore/850mb_hght_tmpc.2.nts CLRBAR = 1 TEXT = 1/21//hw SKIP = 0 !0 !0 !0 !/3 FILTER = NO -TITLE = 5/-2/~ ? $m_title @ HGT, TEMP AND WIND (KTS)|~@ HGT, TMP, WIND!0 +TITLE = 5/-2/~ ? 
${m_title} @ HGT, TEMP AND WIND (KTS)|~@ HGT, TMP, WIND!0 l ru -restore $USHgempak/restore/700mb_hght_relh_omeg.2.nts +restore ${HOMEgfs}/gempak/ush/restore/700mb_hght_relh_omeg.2.nts CLRBAR = 1 TEXT = 1/21//hw -TITLE = 5/-2/~ ? $m_title @ HGT, REL HUMIDITY AND OMEGA|~@ HGT, RH AND OMEGA!0 +TITLE = 5/-2/~ ? ${m_title} @ HGT, REL HUMIDITY AND OMEGA|~@ HGT, RH AND OMEGA!0 l ru -restore $USHgempak/restore/500mb_hght_absv.2.nts +restore ${HOMEgfs}/gempak/ush/restore/500mb_hght_absv.2.nts CLRBAR = 1 TEXT = 1/21//hw -TITLE = 5/-2/~ ? $m_title @ HGT AND VORTICITY|~@ HGT AND VORTICITY!0 +TITLE = 5/-2/~ ? ${m_title} @ HGT AND VORTICITY|~@ HGT AND VORTICITY!0 l ru -restore $USHgempak/restore/250mb_hght_wnd.2.nts +restore ${HOMEgfs}/gempak/ush/restore/250mb_hght_wnd.2.nts CLRBAR = 1 TEXT = 1/21//hw -TITLE = 5/-2/~ ? $m_title @ HGT, ISOTACHS AND WIND (KTS)|~@ HGT AND WIND!0 +TITLE = 5/-2/~ ? ${m_title} @ HGT, ISOTACHS AND WIND (KTS)|~@ HGT AND WIND!0 l ru exit EOF -export err=$?;err_chk +export err=$? ##################################################### # GEMPAK DOES NOT ALWAYS HAVE A NON ZERO RETURN CODE # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK # FOR THIS CASE HERE. ##################################################### -ls -l gdas.meta -export err=$?;export pgm="GEMPAK CHECK FILE";err_chk - -if [ $SENDCOM = "YES" ] ; then - mv gdas.meta ${COMOUT}/gdas_${PDY}_${cyc}_na - export err=$? - if [[ $err -ne 0 ]] ; then - echo " File gdas.meta does not exist." - exit $err - fi - - if [ $SENDDBN = "YES" ] ; then - $DBNROOT/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - $COMOUT/gdas_${PDY}_${cyc}_na - fi +if (( err != 0 )) || [[ ! -s gdas.meta ]] &> /dev/null; then + echo "FATAL ERROR: Failed to create gempak meta file for North America" + exit "${err}" fi -# +mv gdas.meta "${COM_ATMOS_GEMPAK_META}/gdas_${PDY}_${cyc}_na" +export err=$? 
+if (( err != 0 )) ; then + echo "FATAL ERROR: Failed to move meta file to ${COM_ATMOS_GEMPAK_META}/gdas_${PDY}_${cyc}_na" + exit "${err}" +fi + +if [[ "${SENDDBN}" == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/gdas_${PDY}_${cyc}_na" +fi diff --git a/gempak/ush/gdas_ukmet_meta_ver.sh b/gempak/ush/gdas_ukmet_meta_ver.sh index 845fa1cc6b..90c0d214b7 100755 --- a/gempak/ush/gdas_ukmet_meta_ver.sh +++ b/gempak/ush/gdas_ukmet_meta_ver.sh @@ -1,4 +1,4 @@ -#!/bin/sh +#! /usr/bin/env bash # # Metafile Script : gdas_ukmet_meta_ver # @@ -13,83 +13,27 @@ # M. Klein/HPC 11/2006 Modify to run in production. # -#cd $DATA +source "${HOMEgfs}/ush/preamble.sh" -set -xa - -if [ $cyc -ne "06" ] ; then - exit -fi - -export pgm=gdplot2_nc;. prep_step; startmsg +export pgm=gdplot2_nc;. prep_step device="nc | ukmetver_12.meta" -PDY2=$(echo ${PDY} | cut -c3-) - -# -# Copy in datatype table to define gdfile type -# -cp $FIXgempak/datatype.tbl datatype.tbl - -# -# DEFINE 1 CYCLE AGO -dc1=$($NDATE -06 ${PDY}${cyc} | cut -c -10) -date1=$(echo ${dc1} | cut -c -8) -sdate1=$(echo ${dc1} | cut -c 3-8) -cycle1=$(echo ${dc1} | cut -c 9,10) -# DEFINE 2 CYCLES AGO -dc2=$($NDATE -18 ${PDY}${cyc} | cut -c -10) -date2=$(echo ${dc2} | cut -c -8) -sdate2=$(echo ${dc2} | cut -c 3-8) -cycle2=$(echo ${dc2} | cut -c 9,10) -# DEFINE 3 CYCLES AGO -dc3=$($NDATE -30 ${PDY}${cyc} | cut -c -10) -date3=$(echo ${dc3} | cut -c -8) -sdate3=$(echo ${dc3} | cut -c 3-8) -cycle3=$(echo ${dc3} | cut -c 9,10) -# DEFINE 4 CYCLES AGO -dc4=$($NDATE -42 ${PDY}${cyc} | cut -c -10) -date4=$(echo ${dc4} | cut -c -8) -sdate4=$(echo ${dc4} | cut -c 3-8) -cycle4=$(echo ${dc4} | cut -c 9,10) -# DEFINE 5 CYCLES AGO -dc5=$($NDATE -54 ${PDY}${cyc} | cut -c -10) -date5=$(echo ${dc5} | cut -c -8) -sdate5=$(echo ${dc5} | cut -c 3-8) -cycle5=$(echo ${dc5} | cut -c 9,10) -# DEFINE 6 CYCLES AGO -dc6=$($NDATE -66 ${PDY}${cyc} | cut -c -10) -date6=$(echo ${dc6} | cut -c -8) 
-sdate6=$(echo ${dc6} | cut -c 3-8) -cycle6=$(echo ${dc6} | cut -c 9,10) -# DEFINE 7 CYCLES AGO -dc7=$($NDATE -90 ${PDY}${cyc} | cut -c -10) -date7=$(echo ${dc7} | cut -c -8) -sdate7=$(echo ${dc7} | cut -c 3-8) -cycle7=$(echo ${dc7} | cut -c 9,10) -# DEFINE 8 CYCLES AGO -dc8=$($NDATE -114 ${PDY}${cyc} | cut -c -10) -date8=$(echo ${dc8} | cut -c -8) -sdate8=$(echo ${dc8} | cut -c 3-8) -cycle8=$(echo ${dc8} | cut -c 9,10) -# DEFINE 9 CYCLES AGO -dc9=$($NDATE -138 ${PDY}${cyc} | cut -c -10) -date9=$(echo ${dc9} | cut -c -8) -sdate9=$(echo ${dc9} | cut -c 3-8) -cycle9=$(echo ${dc9} | cut -c 9,10) +cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl # SET CURRENT CYCLE AS THE VERIFICATION GRIDDED FILE. -vergrid="F-GDAS | ${PDY2}/0600" +export COMIN="gdas.${PDY}${cyc}" +if [[ ! -L ${COMIN} ]]; then + ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}" +fi +vergrid="F-GDAS | ${PDY:2}/0600" fcsthr="0600f006" # SET WHAT RUNS TO COMPARE AGAINST BASED ON MODEL CYCLE TIME. areas="SAM NAM" -verdays="${dc1} ${dc2} ${dc3} ${dc4} ${dc5} ${dc6} ${dc7} ${dc8} ${dc9}" # GENERATING THE METAFILES. 
-for area in $areas - do - if [ ${area} = "NAM" ] ; then +for area in ${areas}; do + if [[ "${area}" == "NAM" ]] ; then garea="5.1;-124.6;49.6;-11.9" proj="STR/90.0;-95.0;0.0" latlon="0" @@ -100,50 +44,23 @@ for area in $areas latlon="1/10/1/2/10;10" run=" " fi - for verday in $verdays - do - if [ ${verday} -eq ${dc1} ] ; then - dgdattim=f012 - sdatenum=$sdate1 - cyclenum=$cycle1 - elif [ ${verday} -eq ${dc2} ] ; then - dgdattim=f024 - sdatenum=$sdate2 - cyclenum=$cycle2 - elif [ ${verday} -eq ${dc3} ] ; then - dgdattim=f036 - sdatenum=$sdate3 - cyclenum=$cycle3 - elif [ ${verday} -eq ${dc4} ] ; then - dgdattim=f048 - sdatenum=$sdate4 - cyclenum=$cycle4 - elif [ ${verday} -eq ${dc5} ] ; then - dgdattim=f060 - sdatenum=$sdate5 - cyclenum=$cycle5 - elif [ ${verday} -eq ${dc6} ] ; then - dgdattim=f072 - sdatenum=$sdate6 - cyclenum=$cycle6 - elif [ ${verday} -eq ${dc7} ] ; then - dgdattim=f096 - sdatenum=$sdate7 - cyclenum=$cycle7 - elif [ ${verday} -eq ${dc8} ] ; then - dgdattim=f120 - sdatenum=$sdate8 - cyclenum=$cycle8 - elif [ ${verday} -eq ${dc9} ] ; then - dgdattim=f144 - sdatenum=$sdate9 - cyclenum=$cycle9 + + fhrs=$(seq -s ' ' 12 12 72) + fhrs="${fhrs} $(seq -s ' ' 96 24 144)" + for fhr in ${fhrs}; do + stime=$(date --utc +%y%m%d -d "${PDY} ${cyc} - ${fhr} hours") + dgdattim=$(printf "f%03d" "${fhr}") + sdatenum=${stime:0:6} + cyclenum=${stime:6} + + if [[ ! 
-L "ukmet.20${sdatenum}" ]]; then + ln -sf "${COMINukmet}/ukmet.20${sdatenum}/gempak" "ukmet.20${sdatenum}" fi - grid="${COMINukmet}.20${sdatenum}/gempak/ukmet_20${sdatenum}${cyclenum}${dgdattim}" + gdfile="ukmet.20${sdatenum}/ukmet_20${sdatenum}${cyclenum}${dgdattim}" -# 500 MB HEIGHT METAFILE + # 500 MB HEIGHT METAFILE -$GEMEXE/gdplot2_nc << EOFplt + "${GEMEXE}/gdplot2_nc" << EOFplt \$MAPFIL = mepowo.gsf PROJ = ${proj} GAREA = ${garea} @@ -167,7 +84,7 @@ line = 6/1/3 title = 6/-2/~ GDAS 500 MB HGT (6-HR FCST)|~${area} 500 HGT DIFF r -gdfile = ${grid} +gdfile = ${gdfile} gdattim = ${dgdattim} line = 5/1/3 contur = 4 @@ -189,7 +106,7 @@ clear = yes latlon = ${latlon} r -gdfile = ${grid} +gdfile = ${gdfile} gdattim = ${dgdattim} line = 5/1/3 contur = 4 @@ -197,7 +114,7 @@ title = 5/-1/~ UKMET PMSL clear = no r -PROJ = +PROJ = GAREA = bwus gdfile = ${vergrid} gdattim = ${fcsthr} @@ -213,7 +130,7 @@ clear = yes latlon = ${latlon} ${run} -gdfile = ${grid} +gdfile = ${gdfile} gdattim = ${dgdattim} line = 5/1/3 contur = 4 @@ -226,22 +143,28 @@ EOFplt done done -export err=$?;err_chk +export err=$? + ##################################################### # GEMPAK DOES NOT ALWAYS HAVE A NON ZERO RETURN CODE # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK # FOR THIS CASE HERE. ##################################################### -ls -l ukmetver_12.meta -export err=$?;export pgm="GEMPAK CHECK FILE";err_chk - -if [ $SENDCOM = "YES" ] ; then - mkdir -p -m 775 ${COMOUTukmet}/ukmet.${PDY}/meta/ - mv ukmetver_12.meta ${COMOUTukmet}/ukmet.${PDY}/meta/ukmetver_${PDY}_12 - if [ $SENDDBN = "YES" ] ; then - ${DBNROOT}/bin/dbn_alert MODEL UKMETVER_HPCMETAFILE $job \ - ${COMOUTukmet}/ukmet.${PDY}/meta/ukmetver_${PDY}_12 - fi +if (( err != 0 )) || [[ ! -s ukmetver_12.meta ]]; then + echo "FATAL ERROR: Failed to create ukmet meta file" + exit "${err}" +fi + +mv ukmetver_12.meta "${COM_ATMOS_GEMPAK_META}/ukmetver_${PDY}_12" +export err=$? 
+if (( err != 0 )) ; then + echo "FATAL ERROR: Failed to move meta file to ${COM_ATMOS_GEMPAK_META}/ukmetver_${PDY}_12" + exit "${err}" +fi + +if [[ "${SENDDBN}" == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL UKMETVER_HPCMETAFILE "${job}" \ + "${COM_ATMOS_GEMPAK_META}/ukmetver_${PDY}_12" fi exit diff --git a/gempak/ush/gempak_gdas_f000_gif.sh b/gempak/ush/gempak_gdas_f000_gif.sh index cdf7659155..80e28f5345 100755 --- a/gempak/ush/gempak_gdas_f000_gif.sh +++ b/gempak/ush/gempak_gdas_f000_gif.sh @@ -1,100 +1,87 @@ -#!/bin/sh +#! /usr/bin/env bash ######################################################################### # -# Script: gempak_gdas_f00_gif.sh -# # This scripts creates GEMPAK .gif images of 00HR/Analysis fields from # GDAS model output for archiving at NCDC. # -# -# History: Ralph Jones 02/16/2005 JIF original version. -# -# ######################################################################### - msg=" Make GEMPAK GIFS utility" - postmsg "$jlogfile" "$msg" - - set -x - - MAPAREA="normal" +source "${HOMEgfs}/ush/preamble.sh" - LATVAL="1/1/1/1/5;5" - LATSOUTH="1/1/1/1;4/5;5" +LATVAL="1/1/1/1/5;5" +LATSOUTH="1/1/1/1;4/5;5" +pixels="1728;1472" - pixels="1728;1472" - - cp $FIXgempak/coltbl.spc coltbl.xwp +cp "${HOMEgfs}/gempak/fix/coltbl.spc" coltbl.xwp ################################################################# -# NORTHERN HEMISPHERE ANALYSIS CHARTS # +# NORTHERN HEMISPHERE ANALYSIS CHARTS # ################################################################# -# Create time stamp (bottom) label +# Create time stamp (bottom) label - echo 0000${PDY}${cyc} > dates - export FORT55="title.output" -# $WEBTITLE < dates - ${UTILgfs}/exec/webtitle < dates +echo "0000${PDY}${cyc}" > dates +export FORT55="title.output" +"${HOMEgfs}/exec/webtitle.x" < dates - export TITLE=$(cat title.output) - echo "\n\n TITLE = $TITLE \n" +TITLE="$(cat title.output)" +echo "TITLE = ${TITLE}" # Define labels and file names for Northern Hemisphere analysis charts - 
hgttmp850lab="850MB ANALYSIS HEIGHTS/TEMPERATURE" - hgttmp850dev="gdas_850_hgt_tmp_nh_anl_${cyc}.gif" +hgttmp850lab="850MB ANALYSIS HEIGHTS/TEMPERATURE" +hgttmp850dev="gdas_850_hgt_tmp_nh_anl_${cyc}.gif" + +hgttmp700lab="700MB ANALYSIS HEIGHTS/TEMPERATURE" +hgttmp700dev="gdas_700_hgt_tmp_nh_anl_${cyc}.gif" - hgttmp700lab="700MB ANALYSIS HEIGHTS/TEMPERATURE" - hgttmp700dev="gdas_700_hgt_tmp_nh_anl_${cyc}.gif" +hgttmp500lab="500MB ANALYSIS HEIGHTS/TEMPERATURE" +hgttmp500dev="gdas_500_hgt_tmp_nh_anl_${cyc}.gif" - hgttmp500lab="500MB ANALYSIS HEIGHTS/TEMPERATURE" - hgttmp500dev="gdas_500_hgt_tmp_nh_anl_${cyc}.gif" +hgtiso300lab="300MB ANALYSIS HEIGHTS/ISOTACHS" +hgtiso300dev="gdas_300_hgt_iso_nh_anl_${cyc}.gif" - hgtiso300lab="300MB ANALYSIS HEIGHTS/ISOTACHS" - hgtiso300dev="gdas_300_hgt_iso_nh_anl_${cyc}.gif" +hgtiso250lab="250MB ANALYSIS HEIGHTS/ISOTACHS" +hgtiso250dev="gdas_250_hgt_iso_nh_anl_${cyc}.gif" - hgtiso250lab="250MB ANALYSIS HEIGHTS/ISOTACHS" - hgtiso250dev="gdas_250_hgt_iso_nh_anl_${cyc}.gif" +hgtiso200lab="200MB ANALYSIS HEIGHTS/ISOTACHS" +hgtiso200dev="gdas_200_hgt_iso_nh_anl_${cyc}.gif" - hgtiso200lab="200MB ANALYSIS HEIGHTS/ISOTACHS" - hgtiso200dev="gdas_200_hgt_iso_nh_anl_${cyc}.gif" +mslpthksfclab="ANALYSIS MEAN SEA LEVEL PRESSURE/1000-500MB THICKNESS" +mslpthksfcdev="gdas_sfc_mslp_thk_nh_anl_${cyc}.gif" - mslpthksfclab="ANALYSIS MEAN SEA LEVEL PRESSURE/1000-500MB THICKNESS" - mslpthksfcdev="gdas_sfc_mslp_thk_nh_anl_${cyc}.gif" - # Set grid date and input file name - gdattim=$(echo ${PDY} | cut -c3-8)/${cyc}00F000 - gdfile=gem_grids${fhr}.gem +gdattim="${PDY:2:6}/${cyc}00F000" +gdfile=gem_grids${fhr3}.gem # Execute the GEMPAK program -$GEMEXE/gdplot2_gif << EOF +"${GEMEXE}/gdplot2_gif" << EOF ! 
850MB HEIGHTS/TEMPERATURES - restore $NTS/base_nh.nts - restore $NTS/850_hgt_tmp.nts + restore ${NTS}/base_nh.nts + restore ${NTS}/850_hgt_tmp.nts CLEAR = yes - GDFILE = $gdfile - GDATTIM = $gdattim + GDFILE = ${gdfile} + GDATTIM = ${gdattim} MAP = 1 - DEVICE = gif | ${hgttmp850dev} | $pixels + DEVICE = gif | ${hgttmp850dev} | ${pixels} TITLE = TEXT = 1/3/2/sw - LATLON = $LATVAL + LATLON = ${LATVAL} l r CLEAR = no GDPFUN = - TITLE = 1/-4/$TITLE + TITLE = 1/-4/${TITLE} TEXT = 2/3/2/sw LATLON = 0 l @@ -107,23 +94,23 @@ $GEMEXE/gdplot2_gif << EOF ! 700MB HEIGHTS/TEMPERATURES - restore $NTS/base_nh.nts - restore $NTS/700_hgt_tmp.nts + restore ${NTS}/base_nh.nts + restore ${NTS}/700_hgt_tmp.nts CLEAR = yes - GDFILE = $gdfile - GDATTIM = $gdattim + GDFILE = ${gdfile} + GDATTIM = ${gdattim} MAP = 1 - DEVICE = gif | ${hgttmp700dev} | $pixels + DEVICE = gif | ${hgttmp700dev} | ${pixels} TITLE = TEXT = 1/3/2/sw - LATLON = $LATVAL + LATLON = ${LATVAL} l r CLEAR = no GDPFUN = - TITLE = 1/-4/$TITLE + TITLE = 1/-4/${TITLE} TEXT = 2/3/2/sw LATLON = 0 l @@ -136,23 +123,23 @@ $GEMEXE/gdplot2_gif << EOF ! 500MB HEIGHTS/TEMPERATURES - restore $NTS/base_nh.nts - restore $NTS/500_hgt_tmp.nts + restore ${NTS}/base_nh.nts + restore ${NTS}/500_hgt_tmp.nts CLEAR = yes - GDFILE = $gdfile - GDATTIM = $gdattim + GDFILE = ${gdfile} + GDATTIM = ${gdattim} MAP = 1 - DEVICE = gif | ${hgttmp500dev} | $pixels + DEVICE = gif | ${hgttmp500dev} | ${pixels} TITLE = TEXT = 1/3/2/sw - LATLON = $LATVAL + LATLON = ${LATVAL} l r CLEAR = no GDPFUN = - TITLE = 1/-4/$TITLE + TITLE = 1/-4/${TITLE} TEXT = 2/3/2/sw LATLON = 0 l @@ -165,14 +152,14 @@ $GEMEXE/gdplot2_gif << EOF ! 
300MB HEIGHTS/ISOTACHS - restore $NTS/base_nh.nts - restore $NTS/300_hgt_iso.nts + restore ${NTS}/base_nh.nts + restore ${NTS}/300_hgt_iso.nts CLEAR = yes - GDFILE = $gdfile - GDATTIM = $gdattim + GDFILE = ${gdfile} + GDATTIM = ${gdattim} MAP = 1 - DEVICE = gif | ${hgtiso300dev} | $pixels + DEVICE = gif | ${hgtiso300dev} | ${pixels} TITLE = TEXT = 1/3/2/sw LATLON = 1/1/1/1/5;5 ! @@ -181,7 +168,7 @@ $GEMEXE/gdplot2_gif << EOF CLEAR = no GDPFUN = - TITLE = 1/-4/$TITLE + TITLE = 1/-4/${TITLE} TEXT = 2/3/2/sw LATLON = 0 l @@ -194,17 +181,17 @@ $GEMEXE/gdplot2_gif << EOF ! 250MB ANALYSIS HEIGHTS/ISOTACHS - restore $NTS/base_nh.nts - restore $NTS/250_hgt_iso.nts + restore ${NTS}/base_nh.nts + restore ${NTS}/250_hgt_iso.nts - CLEAR = yes + CLEAR = yes GDFILE = ${gdfile} GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${hgtiso250dev} | $pixels - TITLE = + MAP = 1 + DEVICE = gif | ${hgtiso250dev} | ${pixels} + TITLE = TEXT = 1/3/2/sw - LATLON = $LATVAL + LATLON = ${LATVAL} l r @@ -224,14 +211,14 @@ $GEMEXE/gdplot2_gif << EOF ! 200MB HEIGHTS/ISOTACHS - restore $NTS/base_nh.nts - restore $NTS/200_hgt_iso.nts + restore ${NTS}/base_nh.nts + restore ${NTS}/200_hgt_iso.nts CLEAR = yes - GDFILE = $gdfile - GDATTIM = $gdattim + GDFILE = ${gdfile} + GDATTIM = ${gdattim} MAP = 1 - DEVICE = gif | ${hgtiso200dev} | $pixels + DEVICE = gif | ${hgtiso200dev} | ${pixels} TITLE = TEXT = 1/3/2/sw LATLON = 1/1/1/1/5;5 ! @@ -240,7 +227,7 @@ $GEMEXE/gdplot2_gif << EOF CLEAR = no GDPFUN = - TITLE = 1/-4/$TITLE + TITLE = 1/-4/${TITLE} TEXT = 2/3/2/sw LATLON = 0 l @@ -253,17 +240,17 @@ $GEMEXE/gdplot2_gif << EOF ! 
ANALYSIS MSLP/1000-500 THICKNESS - restore $NTS/base_nh.nts - restore $NTS/sfc_mslp_thk.nts + restore ${NTS}/base_nh.nts + restore ${NTS}/sfc_mslp_thk.nts CLEAR = yes GDFILE = ${gdfile} GDATTIM = ${gdattim} MAP = 1 - DEVICE = gif | ${mslpthksfcdev} | $pixels + DEVICE = gif | ${mslpthksfcdev} | ${pixels} TITLE = TEXT = 1/3/2/sw - LATLON = $LATVAL + LATLON = ${LATVAL} l r @@ -283,75 +270,66 @@ $GEMEXE/gdplot2_gif << EOF exit EOF -$GEMEXE/gpend - -if [ $SENDCOM = YES ]; then +"${GEMEXE}/gpend" # Copy the GIF images into my area - cp ${hgttmp850dev} $COMOUTncdc/. - cp ${hgttmp700dev} $COMOUTncdc/. - cp ${hgttmp500dev} $COMOUTncdc/. - cp ${hgtiso300dev} $COMOUTncdc/. - cp ${hgtiso250dev} $COMOUTncdc/. - cp ${hgtiso200dev} $COMOUTncdc/. - cp ${mslpthksfcdev} $COMOUTncdc/. +cp "${hgttmp850dev}" "${COM_ATMOS_GEMPAK_GIF}/." +cp "${hgttmp700dev}" "${COM_ATMOS_GEMPAK_GIF}/." +cp "${hgttmp500dev}" "${COM_ATMOS_GEMPAK_GIF}/." +cp "${hgtiso300dev}" "${COM_ATMOS_GEMPAK_GIF}/." +cp "${hgtiso250dev}" "${COM_ATMOS_GEMPAK_GIF}/." +cp "${hgtiso200dev}" "${COM_ATMOS_GEMPAK_GIF}/." +cp "${mslpthksfcdev}" "${COM_ATMOS_GEMPAK_GIF}/." 
# Send the GIF images onto the NCDC area on the public ftp server - if [ $SENDDBN = YES ]; then - - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} $COMOUTncdc/${hgttmp850dev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} $COMOUTncdc/${hgttmp700dev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} $COMOUTncdc/${hgttmp500dev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} $COMOUTncdc/${hgtiso300dev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} $COMOUTncdc/${hgtiso250dev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} $COMOUTncdc/${hgtiso200dev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} $COMOUTncdc/${mslpthksfcdev} - - fi - +if [[ ${SENDDBN} == YES ]]; then + "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgttmp850dev}" + "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgttmp700dev}" + "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgttmp500dev}" + "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgtiso300dev}" + "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgtiso250dev}" + "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgtiso200dev}" + "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${mslpthksfcdev}" fi - - ########################################################## # SOUTHERN HEMISPHERE ANALYSIS CHARTS # ########################################################## +mslpthksfclab="ANALYSIS MEAN SEA LEVEL PRESSURE/1000-500MB THICKNESS" +mslpthksfcdev="gdas_sfc_mslp_thk_sh_anl_${cyc}.gif" - mslpthksfclab="ANALYSIS MEAN SEA LEVEL PRESSURE/1000-500MB THICKNESS" - mslpthksfcdev="gdas_sfc_mslp_thk_sh_anl_${cyc}.gif" - - hgttmp500lab="500MB ANALYSIS HEIGHTS/TEMPERATURE" - hgttmp500dev="gdas_500_hgt_tmp_sh_anl_${cyc}.gif" +hgttmp500lab="500MB ANALYSIS HEIGHTS/TEMPERATURE" +hgttmp500dev="gdas_500_hgt_tmp_sh_anl_${cyc}.gif" - hgtiso300lab="300MB ANALYSIS HEIGHTS/ISOTACHS" - 
hgtiso300dev="gdas_300_hgt_iso_sh_anl_${cyc}.gif" +hgtiso300lab="300MB ANALYSIS HEIGHTS/ISOTACHS" +hgtiso300dev="gdas_300_hgt_iso_sh_anl_${cyc}.gif" - hgtiso250lab="250MB ANALYSIS HEIGHTS/ISOTACHS" - hgtiso250dev="gdas_250_hgt_iso_sh_anl_${cyc}.gif" +hgtiso250lab="250MB ANALYSIS HEIGHTS/ISOTACHS" +hgtiso250dev="gdas_250_hgt_iso_sh_anl_${cyc}.gif" # Execute the GEMPAK program -$GEMEXE/gdplot2_gif << EOF +"${GEMEXE}/gdplot2_gif" << EOF ! ANALYSIS MSLP/1000-500 THICKNESS - restore $NTS/base_sh.nts - restore $NTS/sfc_mslp_thk.nts + restore ${NTS}/base_sh.nts + restore ${NTS}/sfc_mslp_thk.nts CLEAR = yes GDFILE = ${gdfile} GDATTIM = ${gdattim} MAP = 1 - DEVICE = gif | ${mslpthksfcdev} | $pixels + DEVICE = gif | ${mslpthksfcdev} | ${pixels} TITLE = TEXT = 1/3/2/sw - LATLON = $LATSOUTH + LATLON = ${LATSOUTH} l r @@ -371,18 +349,18 @@ $GEMEXE/gdplot2_gif << EOF ! 500MB ANALYSIS HEIGHTS/TEMPERATURES - restore $NTS/base_sh.nts - restore $NTS/500_hgt_tmp.nts + restore ${NTS}/base_sh.nts + restore ${NTS}/500_hgt_tmp.nts CLEAR = yes GDFILE = ${gdfile} GDATTIM = ${gdattim} MAP = 1 - DEVICE = gif | ${hgttmp500dev} | $pixels + DEVICE = gif | ${hgttmp500dev} | ${pixels} TITLE = TEXT = 1/3/2/sw - LATLON = $LATSOUTH + LATLON = ${LATSOUTH} l r @@ -401,23 +379,23 @@ $GEMEXE/gdplot2_gif << EOF ! 300MB HEIGHTS/ISOTACHS - restore $NTS/base_sh.nts - restore $NTS/300_hgt_iso.nts + restore ${NTS}/base_sh.nts + restore ${NTS}/300_hgt_iso.nts CLEAR = yes - GDFILE = $gdfile - GDATTIM = $gdattim + GDFILE = ${gdfile} + GDATTIM = ${gdattim} MAP = 1 - DEVICE = gif | ${hgtiso300dev} | $pixels + DEVICE = gif | ${hgtiso300dev} | ${pixels} TITLE = TEXT = 1/3/2/sw - LATLON = $LATSOUTH ! + LATLON = ${LATSOUTH} ! l r CLEAR = no GDPFUN = - TITLE = 1/-4/$TITLE + TITLE = 1/-4/${TITLE} TEXT = 2/3/2/sw LATLON = 0 l @@ -430,17 +408,17 @@ $GEMEXE/gdplot2_gif << EOF ! 
250MB ANALYSIS HEIGHTS/ISOTACHS - restore $NTS/base_sh.nts - restore $NTS/250_hgt_iso.nts + restore ${NTS}/base_sh.nts + restore ${NTS}/250_hgt_iso.nts CLEAR = yes GDFILE = ${gdfile} GDATTIM = ${gdattim} MAP = 1 - DEVICE = gif | ${hgtiso250dev} | $pixels + DEVICE = gif | ${hgtiso250dev} | ${pixels} TITLE = TEXT = 1/3/2/sw - LATLON = $LATSOUTH + LATLON = ${LATSOUTH} l r @@ -461,35 +439,22 @@ $GEMEXE/gdplot2_gif << EOF EOF -$GEMEXE/gpend - +"${GEMEXE}/gpend" -if [ $SENDCOM = YES ]; then # Copy the GIF images into my area - - cp ${mslpthksfcdev} $COMOUTncdc/. - cp ${hgttmp500dev} $COMOUTncdc/. - cp ${hgtiso300dev} $COMOUTncdc/. - cp ${hgtiso250dev} $COMOUTncdc/. - +cp "${mslpthksfcdev}" "${COM_ATMOS_GEMPAK_GIF}/." +cp "${hgttmp500dev}" "${COM_ATMOS_GEMPAK_GIF}/." +cp "${hgtiso300dev}" "${COM_ATMOS_GEMPAK_GIF}/." +cp "${hgtiso250dev}" "${COM_ATMOS_GEMPAK_GIF}/." # Copy the GIF images onto the NCDC area on the public ftp server - if [ $SENDDBN = YES ]; then - - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} $COMOUTncdc/${mslpthksfcdev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} $COMOUTncdc/${hgttmp500dev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} $COMOUTncdc/${hgtiso300dev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} $COMOUTncdc/${hgtiso250dev} - - fi - +if [[ ${SENDDBN} == YES ]]; then + "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${mslpthksfcdev}" + "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgttmp500dev}" + "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgtiso300dev}" + "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgtiso250dev}" fi - - - msg=" GEMPAK_GIF ${fhr} hour completed normally" - postmsg "$jlogfile" "$msg" - - exit +exit diff --git a/gempak/ush/gempak_gfs_f000_gif.sh b/gempak/ush/gempak_gfs_f000_gif.sh new file mode 100755 index 0000000000..6a709fcc16 --- /dev/null +++ b/gempak/ush/gempak_gfs_f000_gif.sh @@ -0,0 +1,584 @@ +#! 
/usr/bin/env bash
+
+#########################################################################
+#
+# Script: gempak_gfs_f000_gif.sh
+#
+# This script creates GEMPAK .gif images of 00HR/Analysis fields from
+# GFS model output for archiving at NCDC.
+#
+#
+# History: Ralph Jones     02/16/2005   JIF original version.
+# History: Steve Lilly    04/30/2008   Change font size of the Titles
+#                                      from .8 to a larger size (1 or 2)
+#
+#
+#########################################################################

+LATVAL="1/1/1/1/5;5"
+pixels="1728;1472"
+cp "${HOMEgfs}/gempak/fix/coltbl.spc" coltbl.xwp
+
+#################################################################
+#                      ANALYSIS CHARTS                          #
+#################################################################
+
+
+# Create time stamp (bottom) label
+
+echo "0000${PDY}${cyc}" > dates
+export FORT55="title.output"
+"${HOMEgfs}/exec/webtitle.x" < dates
+TITLE="$(cat title.output)"
+echo "TITLE = ${TITLE}"
+
+# Define labels and file names for analysis charts
+
+hgttmp700lab="700MB ANALYSIS HEIGHTS/TEMPERATURE"
+hgttmp700dev="gfs_700_hgt_tmp_nh_anl_${cyc}.gif"
+
+hgttmp500lab="500MB ANALYSIS HEIGHTS/TEMPERATURE"
+hgttmp500dev="gfs_500_hgt_tmp_nh_anl_${cyc}.gif"
+
+hgtiso300lab="300MB ANALYSIS HEIGHTS/ISOTACHS"
+hgtiso300dev="gfs_300_hgt_iso_nh_anl_${cyc}.gif"
+
+hgtiso250lab="250MB ANALYSIS HEIGHTS/ISOTACHS"
+hgtiso250dev="gfs_250_hgt_iso_nh_anl_${cyc}.gif"
+
+hgttmp250lab="250MB ANALYSIS HEIGHTS/TEMPERATURE"
+hgttmp250dev="gfs_250_hgt_tmp_nh_anl_${cyc}.gif"
+
+hgtiso200lab="200MB ANALYSIS HEIGHTS/ISOTACHS"
+hgtiso200dev="gfs_200_hgt_iso_nh_anl_${cyc}.gif"
+
+# Not being used?
+# hgttmp200lab="200MB ANALYSIS HEIGHTS/TEMPERATURE" +# hgttmp200dev="gfs_200_hgt_tmp_nh_anl_${cyc}.gif" + +hgtiso100lab="100MB ANALYSIS HEIGHTS/ISOTACHS" +hgtiso100dev="gfs_100_hgt_iso_nh_anl_${cyc}.gif" + +hgttmp100lab="100MB ANALYSIS HEIGHTS/TEMPERATURE" +hgttmp100dev="gfs_100_hgt_tmp_nh_anl_${cyc}.gif" + +hgtvor500lab="500MB ANALYSIS HEIGHTS/VORTICITY" +hgtvor500dev="gfs_500_hgt_vor_nh_anl_${cyc}.gif" + +hgtvor500usdev="gfs_500_hgt_vor_uscan_anl_${cyc}.gif" + +mslpthksfclab="ANALYSIS MEAN SEA LEVEL PRESSURE/1000-500MB THICKNESS" +mslpthksfcdev="gfs_sfc_mslp_thk_nh_anl_${cyc}.gif" + +mslpthksfcusdev="gfs_sfc_mslp_thk_uscan_anl_${cyc}.gif" + +rhvvel700lab="700MB ANALYSIS RH/VERT VEL" +rhvvel700dev="gfs_700_rh_vvel_nh_anl_${cyc}.gif" + +liftlab="ANALYSIS LIFTED INDEX" +liftdev="gfs_lift_nh_anl_${cyc}.gif" + +prswshtroplab="TROPOPAUSE PRESSURE/WIND SHEAR" +prswshtropdev="gfs_trop_prs_wsh_nh_anl_${cyc}.gif" + +# Set grid date and input file name + +gdattim=${PDY:2:6}/${cyc}00F000 +gdfile=gem_grids${fhr3}.gem + +# Execute the GEMPAK program + +"${GEMEXE}/gdplot2_gif" << EOF + +! 700MB HEIGHTS/TEMPERATURES + + restore ${NTS}/base_nh.nts + restore ${NTS}/700_hgt_tmp.nts + + CLEAR = yes + GDFILE = ${gdfile} + GDATTIM = ${gdattim} + MAP = 1 + DEVICE = gif | ${hgttmp700dev} | ${pixels} + TITLE = + TEXT = 1/2/2/c/sw + LATLON = ${LATVAL} + l + r + + CLEAR = no + GDPFUN = + TITLE = 1/-4/${TITLE} + TEXT = 2/2/2/c/sw + LATLON = 0 + l + r + + TITLE = 1/3/${hgttmp700lab} + l + r + + +! 500MB HEIGHTS/TEMPERATURES + + restore ${NTS}/base_nh.nts + restore ${NTS}/500_hgt_tmp.nts + + CLEAR = yes + GDFILE = ${gdfile} + GDATTIM = ${gdattim} + MAP = 1 + DEVICE = gif | ${hgttmp500dev} | ${pixels} + TITLE = + TEXT = 1/2/2/c/sw + LATLON = ${LATVAL} + l + r + + CLEAR = no + GDPFUN = + TITLE = 1/-4/${TITLE} + TEXT = 2/2/2/c/sw + LATLON = 0 + l + r + + TITLE = 1/3/${hgttmp500lab} + l + r + + +! 
300MB HEIGHTS/ISOTACHS + + restore ${NTS}/base_nh.nts + restore ${NTS}/300_hgt_iso.nts + + CLEAR = yes + GDFILE = ${gdfile} + GDATTIM = ${gdattim} + MAP = 1 + DEVICE = gif | ${hgtiso300dev} | ${pixels} + TITLE = + TEXT = 1/2/2/c/sw + LATLON = 1/1/1/1/5;5 ! + l + r + + CLEAR = no + GDPFUN = + TITLE = 1/-4/${TITLE} + TEXT = 2/2/2/c/sw + LATLON = 0 + l + r + + TITLE = 1/3/${hgtiso300lab} + l + r + + +! 250MB HEIGHTS/TEMPERATURES + + restore ${NTS}/base_nh.nts + restore ${NTS}/250_hgt_tmp.nts + + CLEAR = yes + GDFILE = ${gdfile} + GDATTIM = ${gdattim} + MAP = 1 + DEVICE = gif | ${hgttmp250dev} | ${pixels} + TITLE = + TEXT = 1/2/2/c/sw + LATLON = ${LATVAL} + l + r + + CLEAR = no + GDPFUN = + TITLE = 1/-4/${TITLE} + TEXT = 2/2/2/c/sw + LATLON = 0 + l + r + + TITLE = 1/3/${hgttmp250lab} + l + r + + +! 250MB ANALYSIS HEIGHTS/ISOTACHS + + restore ${NTS}/base_nh.nts + restore ${NTS}/250_hgt_iso.nts + + CLEAR = yes + GDFILE = ${gdfile} + GDATTIM = ${gdattim} + MAP = 1 + DEVICE = gif | ${hgtiso250dev} | ${pixels} + TITLE = + TEXT = 1/2/2/c/sw + LATLON = ${LATVAL} + l + r + + CLEAR = no + MAP = 0 + GDPFUN = + TITLE = 1/-4/${TITLE} + TEXT = 2/2/2/c/sw + LATLON = 0 + l + r + + TITLE = 1/3/${hgtiso250lab} + l + r + + +! 200MB HEIGHTS/ISOTACHS + + restore ${NTS}/base_nh.nts + restore ${NTS}/200_hgt_iso.nts + + CLEAR = yes + GDFILE = ${gdfile} + GDATTIM = ${gdattim} + MAP = 1 + DEVICE = gif | ${hgtiso200dev} | ${pixels} + TITLE = + TEXT = 1/2/2/c/sw + LATLON = 1/1/1/1/5;5 ! + l + r + + CLEAR = no + GDPFUN = + TITLE = 1/-4/${TITLE} + TEXT = 2/2/2/c/sw + LATLON = 0 + l + r + + TITLE = 1/3/${hgtiso200lab} + l + r + + +! 
100MB HEIGHTS/TEMPERATURES + + restore ${NTS}/base_nh.nts + restore ${NTS}/100_hgt_tmp.nts + + CLEAR = yes + GDFILE = ${gdfile} + GDATTIM = ${gdattim} + MAP = 1 + DEVICE = gif | ${hgttmp100dev} | ${pixels} + TITLE = + TEXT = 1/2/2/c/sw + LATLON = ${LATVAL} + l + r + + CLEAR = no + GDPFUN = + TITLE = 1/-4/${TITLE} + TEXT = 2/2/2/c/sw + LATLON = 0 + l + r + + TITLE = 1/3/${hgttmp100lab} + l + r + + +! 100MB HEIGHTS/ISOTACHS + + restore ${NTS}/base_nh.nts + restore ${NTS}/100_hgt_iso.nts + + CLEAR = yes + GDFILE = ${gdfile} + GDATTIM = ${gdattim} + MAP = 1 + DEVICE = gif | ${hgtiso100dev} | ${pixels} + TITLE = + TEXT = 1/2/2/c/sw + LATLON = 1/1/1/1/5;5 ! + l + r + + CLEAR = no + GDPFUN = + TITLE = 1/-4/${TITLE} + TEXT = 2/2/2/c/sw + LATLON = 0 + l + r + + TITLE = 1/3/${hgtiso100lab} + l + r + + +! ANALYSIS MSLP/1000-500 THICKNESS + + restore ${NTS}/base_nh.nts + restore ${NTS}/sfc_mslp_thk.nts + + CLEAR = yes + GDFILE = ${gdfile} + GDATTIM = ${gdattim} + MAP = 1 + DEVICE = gif | ${mslpthksfcdev} | ${pixels} + TITLE = + TEXT = 1/2/2/c/sw + LATLON = ${LATVAL} + l + r + + CLEAR = no + MAP = 0 + GDPFUN = + TITLE = 1/-4/${TITLE} + TEXT = 2/2/2/c/sw + LATLON = 0 + l + r + + TITLE = 1/3/${mslpthksfclab} + l + r + + +! ANALYSIS MSLP/1000-500 THICKNESS (US/CANADA) + + restore ${NTS}/base_uscan.nts + restore ${NTS}/sfc_mslp_thk.nts + + CLEAR = yes + GDFILE = ${gdfile} + GDATTIM = ${gdattim} + MAP = 1 + DEVICE = gif | ${mslpthksfcusdev} | ${pixels} + TITLE = + TEXT = 1/2/2/c/sw + LATLON = ${LATVAL} + l + r + + CLEAR = no + MAP = 0 + GDPFUN = + TITLE = 1/-4/${TITLE} + TEXT = 2/2/2/c/sw + LATLON = 0 + l + r + + TITLE = 1/3/${mslpthksfclab} + l + r + +! 
500MB ANALYSIS HEIGHTS/VORTICITY + + restore ${NTS}/base_nh.nts + restore ${NTS}/500_hgt_vor.nts + + CLEAR = yes + GDFILE = ${gdfile} + GDATTIM = ${gdattim} + MAP = 1 + DEVICE = gif | ${hgtvor500dev} | ${pixels} + TITLE = + TEXT = 1/2/2/c/sw + LATLON = ${LATVAL} + l + r + + CLEAR = no + MAP = 0 + GDPFUN = + TITLE = 1/-4/${TITLE} + TEXT = 2/2/2/c/sw + LATLON = 0 + l + r + + TITLE = 1/3/${hgtvor500lab} + l + r + +! 500MB ANALYSIS HEIGHTS/VORTICITY (US/CANADA) + + restore ${NTS}/base_uscan.nts + restore ${NTS}/500_hgt_vor.nts + + CLEAR = yes + GDFILE = ${gdfile} + GDATTIM = ${gdattim} + MAP = 1 + DEVICE = gif | ${hgtvor500usdev} | ${pixels} + TITLE = + TEXT = 1/2/2/c/sw + LATLON = ${LATVAL} + l + r + + CLEAR = no + GDPFUN = + TITLE = 1/-4/${TITLE} + TEXT = 2/2/2/c/sw + LATLON = 0 + l + + TITLE = 1/3/${hgtvor500lab} + l + r + + +! ANALYSIS LIFTED INDEX + + restore ${NTS}/base_nh.nts + restore ${NTS}/100_lift.nts + + CLEAR = yes + GDFILE = ${gdfile} + GDATTIM = ${gdattim} + MAP = 1 + DEVICE = gif | ${liftdev} | ${pixels} + TITLE = + TEXT = 1/2/2/c/sw + LATLON = ${LATVAL} + l + r + + CLEAR = no + GDPFUN = + TITLE = 1/-4/${TITLE} + TEXT = 2/2/2/c/sw + LATLON = 0 + l + r + + TITLE = 1/3/${liftlab} + l + r + + +! ANALYSIS TROPOPAUSE PRESSURE/WIND SHEAR + + restore ${NTS}/base_nh.nts + restore ${NTS}/trop_pres_wshr.nts + + CLEAR = yes + GDFILE = ${gdfile} + GDATTIM = ${gdattim} + MAP = 1 + DEVICE = gif | ${prswshtropdev} | ${pixels} + TITLE = + TEXT = 1/2/2/c/sw + LATLON = ${LATVAL} + l + r + + CLEAR = no + GDPFUN = + TITLE = 1/-4/${TITLE} + TEXT = 2/2/2/c/sw + LATLON = 0 + l + r + + TITLE = 1/3/${prswshtroplab} + l + r + + +! 
ANALYSIS 700MB RELATIVE HUMIDITY AND VERTICAL VELOCITY + + restore ${NTS}/base_nh.nts + restore ${NTS}/700_rel_vvel.nts + + CLEAR = yes + GDFILE = ${gdfile} + GDATTIM = ${gdattim} + MAP = 1 + DEVICE = gif | ${rhvvel700dev} | ${pixels} + TITLE = + TEXT = 1/2/2/c/sw + LATLON = ${LATVAL} + l + r + + CLEAR = no + GDPFUN = + TITLE = 1/-4/${TITLE} + TEXT = 2/2/2/c/sw + LATLON = 0 + l + r + + TITLE = 1/3/${rhvvel700lab} + l + r + + exit +EOF + + +"${GEMEXE}/gpend" + + +# Copy the GIF images into my area +cp "${hgttmp700dev}" "${COM_ATMOS_GEMPAK_GIF}" +cp "${hgttmp500dev}" "${COM_ATMOS_GEMPAK_GIF}" +cp "${hgtiso300dev}" "${COM_ATMOS_GEMPAK_GIF}" +cp "${hgtiso250dev}" "${COM_ATMOS_GEMPAK_GIF}" +cp "${hgttmp250dev}" "${COM_ATMOS_GEMPAK_GIF}" +cp "${hgtiso200dev}" "${COM_ATMOS_GEMPAK_GIF}" +cp "${hgtiso100dev}" "${COM_ATMOS_GEMPAK_GIF}" +cp "${hgttmp100dev}" "${COM_ATMOS_GEMPAK_GIF}" +cp "${mslpthksfcdev}" "${COM_ATMOS_GEMPAK_GIF}" +cp "${mslpthksfcusdev}" "${COM_ATMOS_GEMPAK_GIF}" +cp "${hgtvor500dev}" "${COM_ATMOS_GEMPAK_GIF}" +cp "${hgtvor500usdev}" "${COM_ATMOS_GEMPAK_GIF}" +cp "${liftdev}" "${COM_ATMOS_GEMPAK_GIF}" +cp "${prswshtropdev}" "${COM_ATMOS_GEMPAK_GIF}" +cp "${rhvvel700dev}" "${COM_ATMOS_GEMPAK_GIF}" + +# Copy the GIF images onto the NCDC area on the public ftp server + +if [[ "${SENDDBN}" == "YES" ]]; then + "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgttmp700dev}" + "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgttmp500dev}" + "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgtiso300dev}" + "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgtiso250dev}" + "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgttmp250dev}" + "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgtiso200dev}" +# "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgttmp200dev}" + 
"${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgtiso100dev}"
+    "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgttmp100dev}"
+    "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${mslpthksfcdev}"
+    "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${mslpthksfcusdev}"
+    "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgtvor500dev}"
+    "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgtvor500usdev}"
+    "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${liftdev}"
+    "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${prswshtropdev}"
+    "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${rhvvel700dev}"
+
+
+
+fi
+
+# Convert the 500mb NH Hgts/Temps chart to tif, attach a heading and
+# send to TOC via the NTC
+
+export input="${COM_ATMOS_GEMPAK_GIF}/${hgttmp500dev}"
+export HEADER=YES
+export OUTPATH="${DATA}/gfs_500_hgt_tmp_nh_anl_${cyc}.tif"
+"${USHgfs}/make_tif.sh"
+
+exit
diff --git a/gempak/ush/gempak_gfs_f00_gif.sh b/gempak/ush/gempak_gfs_f00_gif.sh
deleted file mode 100755
index 2a7cca5c9f..0000000000
--- a/gempak/ush/gempak_gfs_f00_gif.sh
+++ /dev/null
@@ -1,602 +0,0 @@
-#!/bin/sh
-
-#########################################################################
-#
-# Script: gempak_gfs_f00_gif.sh
-#
-# This scripts creates GEMPAK .gif images of 00HR/Analysis fields from
-# GFS model output for archiving at NCDC.
-#
-#
-# History: Ralph Jones     02/16/2005   JIF original version.
-# History: Steve Lilly 04/30/2008 Change font size of the Titles -# from .8 to a larger size (1 or 2) -# -# -######################################################################### - - msg=" Make GEMPAK GIFS utility" - postmsg "$jlogfile" "$msg" - - set -x - - MAPAREA="normal" - - LATVAL="1/1/1/1/5;5" - - pixels="1728;1472" - - cp $FIXgempak/coltbl.spc coltbl.xwp - -################################################################# -# ANALYSIS CHARTS # -################################################################# - - -# Create time stamp (bottom) label - - echo 0000${PDY}${cyc} > dates - export FORT55="title.output" -# $WEBTITLE < dates - ${UTILgfs}/exec/webtitle < dates - export TITLE=$(cat title.output) - echo "\n\n TITLE = $TITLE \n" - -# Define labels and file names for analysis charts - - hgttmp700lab="700MB ANALYSIS HEIGHTS/TEMPERATURE" - hgttmp700dev="gfs_700_hgt_tmp_nh_anl_${cyc}.gif" - - hgttmp500lab="500MB ANALYSIS HEIGHTS/TEMPERATURE" - hgttmp500dev="gfs_500_hgt_tmp_nh_anl_${cyc}.gif" - - hgtiso500lab="500MB ANALYSIS HEIGHTS/ISOTACHS" - hgtiso500dev="gfs_500_hgt_iso_nh_anl_${cyc}.gif" - - hgtiso300lab="300MB ANALYSIS HEIGHTS/ISOTACHS" - hgtiso300dev="gfs_300_hgt_iso_nh_anl_${cyc}.gif" - - hgtiso250lab="250MB ANALYSIS HEIGHTS/ISOTACHS" - hgtiso250dev="gfs_250_hgt_iso_nh_anl_${cyc}.gif" - - hgttmp250lab="250MB ANALYSIS HEIGHTS/TEMPERATURE" - hgttmp250dev="gfs_250_hgt_tmp_nh_anl_${cyc}.gif" - - hgtiso200lab="200MB ANALYSIS HEIGHTS/ISOTACHS" - hgtiso200dev="gfs_200_hgt_iso_nh_anl_${cyc}.gif" - - hgttmp200lab="200MB ANALYSIS HEIGHTS/TEMPERATURE" - hgttmp200dev="gfs_200_hgt_tmp_nh_anl_${cyc}.gif" - - hgtiso100lab="100MB ANALYSIS HEIGHTS/ISOTACHS" - hgtiso100dev="gfs_100_hgt_iso_nh_anl_${cyc}.gif" - - hgttmp100lab="100MB ANALYSIS HEIGHTS/TEMPERATURE" - hgttmp100dev="gfs_100_hgt_tmp_nh_anl_${cyc}.gif" - - hgtvor500lab="500MB ANALYSIS HEIGHTS/VORTICITY" - hgtvor500dev="gfs_500_hgt_vor_nh_anl_${cyc}.gif" - - 
hgtvor500usdev="gfs_500_hgt_vor_uscan_anl_${cyc}.gif" - - mslpthksfclab="ANALYSIS MEAN SEA LEVEL PRESSURE/1000-500MB THICKNESS" - mslpthksfcdev="gfs_sfc_mslp_thk_nh_anl_${cyc}.gif" - - mslpthksfcusdev="gfs_sfc_mslp_thk_uscan_anl_${cyc}.gif" - - rhvvel700lab="700MB ANALYSIS RH/VERT VEL" - rhvvel700dev="gfs_700_rh_vvel_nh_anl_${cyc}.gif" - - liftlab="ANALYSIS LIFTED INDEX" - liftdev="gfs_lift_nh_anl_${cyc}.gif" - - prswshtroplab="TROPOPAUSE PRESSURE/WIND SHEAR" - prswshtropdev="gfs_trop_prs_wsh_nh_anl_${cyc}.gif" - -# Set grid date and input file name - - gdattim=$(echo ${PDY} | cut -c3-8)/${cyc}00F000 - gdfile=gem_grids${fhr}.gem - -# Execute the GEMPAK program - - $GEMEXE/gdplot2_gif << EOF - -! 700MB HEIGHTS/TEMPERATURES - - restore $NTS/base_nh.nts - restore $NTS/700_hgt_tmp.nts - - CLEAR = yes - GDFILE = $gdfile - GDATTIM = $gdattim - MAP = 1 - DEVICE = gif | ${hgttmp700dev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/$TITLE - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${hgttmp700lab} - l - r - - -! 500MB HEIGHTS/TEMPERATURES - - restore $NTS/base_nh.nts - restore $NTS/500_hgt_tmp.nts - - CLEAR = yes - GDFILE = $gdfile - GDATTIM = $gdattim - MAP = 1 - DEVICE = gif | ${hgttmp500dev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/$TITLE - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${hgttmp500lab} - l - r - - -! 300MB HEIGHTS/ISOTACHS - - restore $NTS/base_nh.nts - restore $NTS/300_hgt_iso.nts - - CLEAR = yes - GDFILE = $gdfile - GDATTIM = $gdattim - MAP = 1 - DEVICE = gif | ${hgtiso300dev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = 1/1/1/1/5;5 ! - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/$TITLE - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${hgtiso300lab} - l - r - - -! 
250MB HEIGHTS/TEMPERATURES - - restore $NTS/base_nh.nts - restore $NTS/250_hgt_tmp.nts - - CLEAR = yes - GDFILE = $gdfile - GDATTIM = $gdattim - MAP = 1 - DEVICE = gif | ${hgttmp250dev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/$TITLE - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${hgttmp250lab} - l - r - - -! 250MB ANALYSIS HEIGHTS/ISOTACHS - - restore $NTS/base_nh.nts - restore $NTS/250_hgt_iso.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${hgtiso250dev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - MAP = 0 - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${hgtiso250lab} - l - r - - -! 200MB HEIGHTS/ISOTACHS - - restore $NTS/base_nh.nts - restore $NTS/200_hgt_iso.nts - - CLEAR = yes - GDFILE = $gdfile - GDATTIM = $gdattim - MAP = 1 - DEVICE = gif | ${hgtiso200dev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = 1/1/1/1/5;5 ! - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/$TITLE - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${hgtiso200lab} - l - r - - -! 100MB HEIGHTS/TEMPERATURES - - restore $NTS/base_nh.nts - restore $NTS/100_hgt_tmp.nts - - CLEAR = yes - GDFILE = $gdfile - GDATTIM = $gdattim - MAP = 1 - DEVICE = gif | ${hgttmp100dev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/$TITLE - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${hgttmp100lab} - l - r - - -! 100MB HEIGHTS/ISOTACHS - - restore $NTS/base_nh.nts - restore $NTS/100_hgt_iso.nts - - CLEAR = yes - GDFILE = $gdfile - GDATTIM = $gdattim - MAP = 1 - DEVICE = gif | ${hgtiso100dev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = 1/1/1/1/5;5 ! - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/$TITLE - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${hgtiso100lab} - l - r - - -! 
ANALYSIS MSLP/1000-500 THICKNESS - - restore $NTS/base_nh.nts - restore $NTS/sfc_mslp_thk.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${mslpthksfcdev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - MAP = 0 - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${mslpthksfclab} - l - r - - -! ANALYSIS MSLP/1000-500 THICKNESS (US/CANADA) - - restore $NTS/base_uscan.nts - restore $NTS/sfc_mslp_thk.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${mslpthksfcusdev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - MAP = 0 - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${mslpthksfclab} - l - r - -! 500MB ANALYSIS HEIGHTS/VORTICITY - - restore $NTS/base_nh.nts - restore $NTS/500_hgt_vor.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${hgtvor500dev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - MAP = 0 - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${hgtvor500lab} - l - r - -! 500MB ANALYSIS HEIGHTS/VORTICITY (US/CANADA) - - restore $NTS/base_uscan.nts - restore $NTS/500_hgt_vor.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${hgtvor500usdev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - - TITLE = 1/3/${hgtvor500lab} - l - r - - -! 
ANALYSIS LIFTED INDEX - - restore $NTS/base_nh.nts - restore $NTS/100_lift.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${liftdev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${liftlab} - l - r - - -! ANALYSIS TROPOPAUSE PRESSURE/WIND SHEAR - - restore $NTS/base_nh.nts - restore $NTS/trop_pres_wshr.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${prswshtropdev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${prswshtroplab} - l - r - - -! ANALYSIS 700MB RELATIVE HUMIDITY AND VERTICAL VELOCITY - - restore $NTS/base_nh.nts - restore $NTS/700_rel_vvel.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${rhvvel700dev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${rhvvel700lab} - l - r - - exit -EOF - - -$GEMEXE/gpend - - -if [ $SENDCOM = YES ]; then - -# Copy the GIF images into my area - - cp ${hgttmp700dev} ${COMOUT} - cp ${hgttmp500dev} ${COMOUT} - cp ${hgtiso300dev} ${COMOUT} - cp ${hgtiso250dev} ${COMOUT} - cp ${hgttmp250dev} ${COMOUT} - cp ${hgtiso200dev} ${COMOUT} - cp ${hgtiso100dev} ${COMOUT} - cp ${hgttmp100dev} ${COMOUT} - cp ${mslpthksfcdev} ${COMOUT} - cp ${mslpthksfcusdev} ${COMOUT} - cp ${hgtvor500dev} ${COMOUT} - cp ${hgtvor500usdev} ${COMOUT} - cp ${liftdev} ${COMOUT} - cp ${prswshtropdev} ${COMOUT} - cp ${rhvvel700dev} ${COMOUT} - -# Copy the GIF images onto the NCDC area on the public ftp server - - if [ $SENDDBN = YES ]; then - - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${hgttmp700dev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF 
${job} ${COMOUT}/${hgttmp500dev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${hgtiso300dev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${hgtiso250dev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${hgttmp250dev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${hgtiso200dev} -# $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${hgttmp200dev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${hgtiso100dev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${hgttmp100dev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${mslpthksfcdev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${mslpthksfcusdev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${hgtvor500dev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${hgtvor500usdev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${liftdev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${prswshtropdev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${rhvvel700dev} - - -# Convert the 500mb NH Hgts/Temps chart to tif, attach a heading and -# send to TOC via the NTC - - fi - export input=${COMOUT}/${hgttmp500dev} - export HEADER=YES - export OUTPATH=$DATA/gfs_500_hgt_tmp_nh_anl_${cyc}.tif - ${USHgfs}/make_tif.sh -fi - - msg=" GEMPAK_GIF ${fhr} hour completed normally" - postmsg "$jlogfile" "$msg" - - exit diff --git a/gempak/ush/gempak_gfs_f12_gif.sh b/gempak/ush/gempak_gfs_f12_gif.sh deleted file mode 100755 index 611252a2e2..0000000000 --- a/gempak/ush/gempak_gfs_f12_gif.sh +++ /dev/null @@ -1,213 +0,0 @@ -#!/bin/sh - -######################################################################### -# -# Script: gempak_gfs_f12_gif.sh -# -# This scripts creates GEMPAK .gif images of 12HR forecast fields from -# GFS model output for archiving at NCDC. -# -# -# History: Ralph Jones 02/16/2005 JIF original version. 
-# History: Steve Lilly 04/30/2008 Change font size of the Titles -# from .8 to a larger size (1 or 2) -# -# -######################################################################### - - msg=" Make GEMPAK GIFS utility" - postmsg "$jlogfile" "$msg" - - set -x - - MAPAREA="normal" - - LATVAL="1/1/1/1/5;5" - - pixels="1728;1472" - - cp $FIXgempak/coltbl.spc coltbl.xwp - -########################################################## -# 12HR FORECAST CHARTS # -########################################################## - - -# Create time stamp (bottom) label - - echo 00${fhr}${PDY}${cyc} > dates - export FORT55="title.output" -# $WEBTITLE < dates - ${UTILgfs}/exec/webtitle < dates - - export TITLE=$(cat title.output) - echo "\n\n TITLE = $TITLE \n" - - -# Define labels and file names for 12hr forecast charts - - hgtvor500lab="500MB ${fhr}HR FORECAST HEIGHTS/VORTICITY" - hgtvor500dev="gfs_500_hgt_vor_nh_f${fhr}_${cyc}.gif" - - hgtvor500usdev="gfs_500_hgt_vor_uscan_f${fhr}_${cyc}.gif" - - mslpthksfclab="${fhr}HR FORECAST MEAN SEA LEVEL PRESSURE/1000-500MB THICKNESS" - mslpthksfcdev="gfs_sfc_mslp_thk_nh_f${fhr}_${cyc}.gif" - - rhvvel700lab="700MB ${fhr}HR FORECAST RH/VERT VEL" - rhvvel700dev="gfs_700_rh_vvel_nh_f${fhr}_${cyc}.gif" - -# Set grid date and input file name - - gdattim=$(echo ${PDY} | cut -c3-8)/${cyc}00F0${fhr} - gdfile=gem_grids${fhr}.gem - -# Execute the GEMPAK program - - $GEMEXE/gdplot2_gif << EOF - -! ANALYSIS MSLP/1000-500 THICKNESS - - restore $NTS/base_nh.nts - restore $NTS/sfc_mslp_thk.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${mslpthksfcdev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - MAP = 0 - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${mslpthksfclab} - l - r - -! 
500MB ANALYSIS HEIGHTS/VORTICITY - - restore $NTS/base_nh.nts - restore $NTS/500_hgt_vor.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${hgtvor500dev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${hgtvor500lab} - l - r - -! 500MB ANALYSIS HEIGHTS/VORTICITY (US/CANADA) - - restore $NTS/base_uscan.nts - restore $NTS/500_hgt_vor.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${hgtvor500usdev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${hgtvor500lab} - l - r - - -! ANALYSIS 700MB RELATIVE HUMIDITY AND VERTICAL VELOCITY - - restore $NTS/base_nh.nts - restore $NTS/700_rel_vvel.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${rhvvel700dev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${rhvvel700lab} - l - r - - exit -EOF - -$GEMEXE/gpend - -if [ $SENDCOM = YES ]; then - -# Copy the GIF images into my area - - cp ${mslpthksfcdev} ${COMOUT} - cp ${hgtvor500dev} ${COMOUT} - cp ${hgtvor500usdev} ${COMOUT} - cp ${rhvvel700dev} ${COMOUT} - -# Copy the GIF images onto the NCDC area on the public ftp server - - if [ $SENDDBN = YES ]; then - - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${mslpthksfcdev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${hgtvor500dev} -# $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${hgtvor500usdev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${rhvvel700dev} - - fi - -fi - - msg=" GEMPAK_GIF ${fhr} hour completed normally" - postmsg "$jlogfile" "$msg" - - exit diff --git 
a/gempak/ush/gempak_gfs_f24_gif.sh b/gempak/ush/gempak_gfs_f24_gif.sh deleted file mode 100755 index 53670a29bd..0000000000 --- a/gempak/ush/gempak_gfs_f24_gif.sh +++ /dev/null @@ -1,231 +0,0 @@ -#!/bin/sh - - -######################################################################### -# -# Script: gempak_gfs_f24_gif.sh -# -# This scripts creates GEMPAK .gif images of 24HR forecast fields from -# GFS model output for archiving at NCDC. -# -# -# History: Ralph Jones 02/16/2005 JIF original version. -# History: Steve Lilly 04/30/2008 Change font size of the Titles -# from .8 to a larger size (1 or 2) -# -# -######################################################################### - - - - msg=" Make GEMPAK GIFS utility" - postmsg "$jlogfile" "$msg" - - - - set -x - - - MAPAREA="normal" - - LATVAL="1/1/1/1/5;5" - - pixels="1728;1472" - - cp $FIXgempak/coltbl.spc coltbl.xwp - - - -########################################################## -# 24HR FORECAST CHARTS # -########################################################## - - -# Create time stamp (bottom) label - - echo 00${fhr}${PDY}${cyc} > dates - export FORT55="title.output" -# $WEBTITLE < dates - ${UTILgfs}/exec/webtitle < dates - - export TITLE=$(cat title.output) - echo "\n\n TITLE = $TITLE \n" - - -# Define labels and file names for 24hr forecast charts - - hgtvor500lab="500MB ${fhr}HR FORECAST HEIGHTS/VORTICITY" - hgtvor500dev="gfs_500_hgt_vor_nh_f${fhr}_${cyc}.gif" - - hgtvor500usdev="gfs_500_hgt_vor_uscan_f${fhr}_${cyc}.gif" - - mslpthksfclab="${fhr}HR FORECAST MEAN SEA LEVEL PRESSURE/1000-500MB THICKNESS" - mslpthksfcdev="gfs_sfc_mslp_thk_nh_f${fhr}_${cyc}.gif" - - rhvvel700lab="700MB ${fhr}HR FORECAST RH/VERT VEL" - rhvvel700dev="gfs_700_rh_vvel_nh_f${fhr}_${cyc}.gif" - - -# Set grid date and input file name - - gdattim=$(echo ${PDY} | cut -c3-8)/${cyc}00F0${fhr} - gdfile=gem_grids${fhr}.gem - - - -# Execute the GEMPAK program - - $GEMEXE/gdplot2_gif << EOF - - -! 
ANALYSIS MSLP/1000-500 THICKNESS - - restore $NTS/base_nh.nts - restore $NTS/sfc_mslp_thk.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${mslpthksfcdev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - MAP = 0 - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${mslpthksfclab} - l - r - - -! 500MB ANALYSIS HEIGHTS/VORTICITY - - restore $NTS/base_nh.nts - restore $NTS/500_hgt_vor.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${hgtvor500dev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${hgtvor500lab} - l - r - - -! 500MB ANALYSIS HEIGHTS/VORTICITY (US/CANADA) - - restore $NTS/base_uscan.nts - restore $NTS/500_hgt_vor.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${hgtvor500usdev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${hgtvor500lab} - l - r - - -! 
ANALYSIS 700MB RELATIVE HUMIDITY AND VERTICAL VELOCITY - - restore $NTS/base_nh.nts - restore $NTS/700_rel_vvel.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${rhvvel700dev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${rhvvel700lab} - l - r - - exit -EOF - - -$GEMEXE/gpend - - -if [ $SENDCOM = YES ]; then - -# Copy the GIF images into my area - - cp ${mslpthksfcdev} ${COMOUT} - cp ${hgtvor500dev} ${COMOUT} - cp ${hgtvor500usdev} ${COMOUT} - cp ${rhvvel700dev} ${COMOUT} - - -# Copy the GIF images onto the NCDC area on the public ftp server - - if [ $SENDDBN = YES ]; then - - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${mslpthksfcdev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${hgtvor500dev} -# $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${hgtvor500usdev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${rhvvel700dev} - - fi - -fi - - - msg=" GEMPAK_GIF ${fhr} hour completed normally" - postmsg "$jlogfile" "$msg" - - exit diff --git a/gempak/ush/gempak_gfs_f36_gif.sh b/gempak/ush/gempak_gfs_f36_gif.sh deleted file mode 100755 index e1999090c0..0000000000 --- a/gempak/ush/gempak_gfs_f36_gif.sh +++ /dev/null @@ -1,231 +0,0 @@ -#!/bin/sh - - -######################################################################### -# -# Script: gempak_gfs_f36_gif.sh -# -# This scripts creates GEMPAK .gif images of 36HR forecast fields from -# GFS model output for archiving at NCDC. -# -# -# History: Ralph Jones 02/16/2005 JIF original version. 
-# History: Steve Lilly 04/30/2008 Change font size of the Titles -# from .8 to a larger size (1 or 2) -# -# -######################################################################### - - - - msg=" Make GEMPAK GIFS utility" - postmsg "$jlogfile" "$msg" - - - set -x - - - MAPAREA="normal" - - LATVAL="1/1/1/1/5;5" - - pixels="1728;1472" - - cp $FIXgempak/coltbl.spc coltbl.xwp - - - -########################################################## -# 36HR FORECAST CHARTS # -########################################################## - - -# Create time stamp (bottom) label - - echo 00${fhr}${PDY}${cyc} > dates - export FORT55="title.output" -# $WEBTITLE < dates - ${UTILgfs}/exec/webtitle < dates - - export TITLE=$(cat title.output) - echo "\n\n TITLE = $TITLE \n" - - -# Define labels and file names for 36hr forecast charts - - hgtvor500lab="500MB ${fhr}HR FORECAST HEIGHTS/VORTICITY" - hgtvor500dev="gfs_500_hgt_vor_nh_f${fhr}_${cyc}.gif" - - hgtvor500usdev="gfs_500_hgt_vor_uscan_f${fhr}_${cyc}.gif" - - mslpthksfclab="${fhr}HR FORECAST MEAN SEA LEVEL PRESSURE/1000-500MB THICKNESS" - mslpthksfcdev="gfs_sfc_mslp_thk_nh_f${fhr}_${cyc}.gif" - - rhvvel700lab="700MB ${fhr}HR FORECAST RH/VERT VEL" - rhvvel700dev="gfs_700_rh_vvel_nh_f${fhr}_${cyc}.gif" - - -# Set grid date and input file name - - gdattim=$(echo ${PDY} | cut -c3-8)/${cyc}00F0${fhr} - gdfile=gem_grids${fhr}.gem - - - -# Execute the GEMPAK program - - $GEMEXE/gdplot2_gif << EOF - - -! ANALYSIS MSLP/1000-500 THICKNESS - - restore $NTS/base_nh.nts - restore $NTS/sfc_mslp_thk.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${mslpthksfcdev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - MAP = 0 - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${mslpthksfclab} - l - r - - -! 
500MB ANALYSIS HEIGHTS/VORTICITY - - restore $NTS/base_nh.nts - restore $NTS/500_hgt_vor.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${hgtvor500dev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${hgtvor500lab} - l - r - - -! 500MB ANALYSIS HEIGHTS/VORTICITY (US/CANADA) - - restore $NTS/base_uscan.nts - restore $NTS/500_hgt_vor.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${hgtvor500usdev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${hgtvor500lab} - l - r - - -! ANALYSIS 700MB RELATIVE HUMIDITY AND VERTICAL VELOCITY - - restore $NTS/base_nh.nts - restore $NTS/700_rel_vvel.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${rhvvel700dev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${rhvvel700lab} - l - r - - exit -EOF - - -$GEMEXE/gpend - - -if [ $SENDCOM = YES ]; then - -# Copy the GIF images into my area - - cp ${mslpthksfcdev} ${COMOUT} - cp ${hgtvor500dev} ${COMOUT} - cp ${hgtvor500usdev} ${COMOUT} - cp ${rhvvel700dev} ${COMOUT} - - -# Copy the GIF images onto the NCDC area on the public ftp server - - if [ $SENDDBN = YES ]; then - - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${mslpthksfcdev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${hgtvor500dev} -# $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${hgtvor500usdev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${rhvvel700dev} - - fi - -fi - - - - msg=" GEMPAK_GIF ${fhr} hour completed normally" - postmsg "$jlogfile" "$msg" - - exit diff --git 
a/gempak/ush/gempak_gfs_f48_gif.sh b/gempak/ush/gempak_gfs_f48_gif.sh deleted file mode 100755 index 1e0ba532fd..0000000000 --- a/gempak/ush/gempak_gfs_f48_gif.sh +++ /dev/null @@ -1,231 +0,0 @@ -#!/bin/sh - - -######################################################################### -# -# Script: gempak_gfs_f48_gif.sh -# -# This scripts creates GEMPAK .gif images of 48HR forecast fields from -# GFS model output for archiving at NCDC. -# -# -# History: Ralph Jones 02/16/2005 JIF original version. -# History: Steve Lilly 04/30/2008 Change font size of the Titles -# from .8 to a larger size (1 or 2) -# -# -######################################################################### - - - - msg=" Make GEMPAK GIFS utility" - postmsg "$jlogfile" "$msg" - - - set -x - - - MAPAREA="normal" - - LATVAL="1/1/1/1/5;5" - - pixels="1728;1472" - - cp $FIXgempak/coltbl.spc coltbl.xwp - - - -########################################################## -# 48HR FORECAST CHARTS # -########################################################## - - -# Create time stamp (bottom) label - - echo 00${fhr}${PDY}${cyc} > dates - export FORT55="title.output" -# $WEBTITLE < dates - ${UTILgfs}/exec/webtitle < dates - - export TITLE=$(cat title.output) - echo "\n\n TITLE = $TITLE \n" - - -# Define labels and file names for 48hr forecast charts - - hgtvor500lab="500MB ${fhr}HR FORECAST HEIGHTS/VORTICITY" - hgtvor500dev="gfs_500_hgt_vor_nh_f${fhr}_${cyc}.gif" - - hgtvor500usdev="gfs_500_hgt_vor_uscan_f${fhr}_${cyc}.gif" - - mslpthksfclab="${fhr}HR FORECAST MEAN SEA LEVEL PRESSURE/1000-500MB THICKNESS" - mslpthksfcdev="gfs_sfc_mslp_thk_nh_f${fhr}_${cyc}.gif" - - rhvvel700lab="700MB ${fhr}HR FORECAST RH/VERT VEL" - rhvvel700dev="gfs_700_rh_vvel_nh_f${fhr}_${cyc}.gif" - - -# Set grid date and input file name - - gdattim=$(echo ${PDY} | cut -c3-8)/${cyc}00F0${fhr} - gdfile=gem_grids${fhr}.gem - - - -# Execute the GEMPAK program - - $GEMEXE/gdplot2_gif << EOF - - -! 
ANALYSIS MSLP/1000-500 THICKNESS - - restore $NTS/base_nh.nts - restore $NTS/sfc_mslp_thk.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${mslpthksfcdev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - MAP = 0 - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${mslpthksfclab} - l - r - - -! 500MB ANALYSIS HEIGHTS/VORTICITY - - restore $NTS/base_nh.nts - restore $NTS/500_hgt_vor.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${hgtvor500dev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${hgtvor500lab} - l - r - - -! 500MB ANALYSIS HEIGHTS/VORTICITY (US/CANADA) - - restore $NTS/base_uscan.nts - restore $NTS/500_hgt_vor.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${hgtvor500usdev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${hgtvor500lab} - l - r - - -! 
ANALYSIS 700MB RELATIVE HUMIDITY AND VERTICAL VELOCITY - - restore $NTS/base_nh.nts - restore $NTS/700_rel_vvel.nts - - CLEAR = yes - GDFILE = ${gdfile} - GDATTIM = ${gdattim} - MAP = 1 - DEVICE = gif | ${rhvvel700dev} | $pixels - TITLE = - TEXT = 1/2/2/c/sw - LATLON = $LATVAL - l - r - - CLEAR = no - GDPFUN = - TITLE = 1/-4/${TITLE} - TEXT = 2/2/2/c/sw - LATLON = 0 - l - r - - TITLE = 1/3/${rhvvel700lab} - l - r - - exit -EOF - - -$GEMEXE/gpend - - -if [ $SENDCOM = YES ]; then - -# Copy the GIF images into my area - - cp ${mslpthksfcdev} ${COMOUT} - cp ${hgtvor500dev} ${COMOUT} - cp ${hgtvor500usdev} ${COMOUT} - cp ${rhvvel700dev} ${COMOUT} - - -# Copy the GIF images onto the NCDC area on the public ftp server - - if [ $SENDDBN = YES ]; then - - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${mslpthksfcdev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${hgtvor500dev} -# $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${hgtvor500usdev} - $DBNROOT/bin/dbn_alert MODEL NCDCGIF ${job} ${COMOUT}/${rhvvel700dev} - - fi - -fi - - - - msg=" GEMPAK_GIF ${fhr} hour completed normally" - postmsg "$jlogfile" "$msg" - - exit diff --git a/gempak/ush/gempak_gfs_fhhh_gif.sh b/gempak/ush/gempak_gfs_fhhh_gif.sh new file mode 100755 index 0000000000..33f5764068 --- /dev/null +++ b/gempak/ush/gempak_gfs_fhhh_gif.sh @@ -0,0 +1,189 @@ +#! /usr/bin/env bash + +######################################################################### +# +# This script creates GEMPAK .gif images of forecast fields from +# GFS model output for archiving at NCDC. 
+# +######################################################################### + +LATVAL="1/1/1/1/5;5" +pixels="1728;1472" +cp "${HOMEgfs}/gempak/fix/coltbl.spc" coltbl.xwp + +########################################################## +# FORECAST CHARTS # +########################################################## + + +# Create time stamp (bottom) label + +echo "0${fhr3}${PDY}${cyc}" > dates +export FORT55="title.output" +"${HOMEgfs}/exec/webtitle.x" < dates + +TITLE="$(cat title.output)" +echo "TITLE = ${TITLE}" + +# Define labels and file names for forecast charts +hgtvor500lab="500MB ${fhr3}HR FORECAST HEIGHTS/VORTICITY" +hgtvor500dev="gfs_500_hgt_vor_nh_f${fhr3}_${cyc}.gif" + +hgtvor500usdev="gfs_500_hgt_vor_uscan_f${fhr3}_${cyc}.gif" + +mslpthksfclab="${fhr3}HR FORECAST MEAN SEA LEVEL PRESSURE/1000-500MB THICKNESS" +mslpthksfcdev="gfs_sfc_mslp_thk_nh_f${fhr3}_${cyc}.gif" + +rhvvel700lab="700MB ${fhr3}HR FORECAST RH/VERT VEL" +rhvvel700dev="gfs_700_rh_vvel_nh_f${fhr3}_${cyc}.gif" + + +# Set grid date and input file name +gdattim="${PDY:2:6}/${cyc}00F${fhr3}" +gdfile=gem_grids${fhr3}.gem + +# Execute the GEMPAK program + +"${GEMEXE}/gdplot2_gif" << EOF + + +! ANALYSIS MSLP/1000-500 THICKNESS + + restore ${NTS}/base_nh.nts + restore ${NTS}/sfc_mslp_thk.nts + + CLEAR = yes + GDFILE = ${gdfile} + GDATTIM = ${gdattim} + MAP = 1 + DEVICE = gif | ${mslpthksfcdev} | ${pixels} + TITLE = + TEXT = 1/2/2/c/sw + LATLON = ${LATVAL} + l + r + + CLEAR = no + MAP = 0 + GDPFUN = + TITLE = 1/-4/${TITLE} + TEXT = 2/2/2/c/sw + LATLON = 0 + l + r + + TITLE = 1/3/${mslpthksfclab} + l + r + + +! 
500MB ANALYSIS HEIGHTS/VORTICITY + + restore ${NTS}/base_nh.nts + restore ${NTS}/500_hgt_vor.nts + + CLEAR = yes + GDFILE = ${gdfile} + GDATTIM = ${gdattim} + MAP = 1 + DEVICE = gif | ${hgtvor500dev} | ${pixels} + TITLE = + TEXT = 1/2/2/c/sw + LATLON = ${LATVAL} + l + r + + CLEAR = no + GDPFUN = + TITLE = 1/-4/${TITLE} + TEXT = 2/2/2/c/sw + LATLON = 0 + l + r + + TITLE = 1/3/${hgtvor500lab} + l + r + + +! 500MB ANALYSIS HEIGHTS/VORTICITY (US/CANADA) + + restore ${NTS}/base_uscan.nts + restore ${NTS}/500_hgt_vor.nts + + CLEAR = yes + GDFILE = ${gdfile} + GDATTIM = ${gdattim} + MAP = 1 + DEVICE = gif | ${hgtvor500usdev} | ${pixels} + TITLE = + TEXT = 1/2/2/c/sw + LATLON = ${LATVAL} + l + r + + CLEAR = no + GDPFUN = + TITLE = 1/-4/${TITLE} + TEXT = 2/2/2/c/sw + LATLON = 0 + l + r + + TITLE = 1/3/${hgtvor500lab} + l + r + + +! ANALYSIS 700MB RELATIVE HUMIDITY AND VERTICAL VELOCITY + + restore ${NTS}/base_nh.nts + restore ${NTS}/700_rel_vvel.nts + + CLEAR = yes + GDFILE = ${gdfile} + GDATTIM = ${gdattim} + MAP = 1 + DEVICE = gif | ${rhvvel700dev} | ${pixels} + TITLE = + TEXT = 1/2/2/c/sw + LATLON = ${LATVAL} + l + r + + CLEAR = no + GDPFUN = + TITLE = 1/-4/${TITLE} + TEXT = 2/2/2/c/sw + LATLON = 0 + l + r + + TITLE = 1/3/${rhvvel700lab} + l + r + + exit +EOF + +"${GEMEXE}/gpend" + +# Copy the GIF images into my area + +cp "${mslpthksfcdev}" "${COM_ATMOS_GEMPAK_GIF}" +cp "${hgtvor500dev}" "${COM_ATMOS_GEMPAK_GIF}" +cp "${hgtvor500usdev}" "${COM_ATMOS_GEMPAK_GIF}" +cp "${rhvvel700dev}" "${COM_ATMOS_GEMPAK_GIF}" + +# Copy the GIF images onto the NCDC area on the public ftp server + +if [[ "${SENDDBN}" == YES ]]; then + "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${mslpthksfcdev}" + "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgtvor500dev}" + # "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" "${COM_ATMOS_GEMPAK_GIF}/${hgtvor500usdev}" + "${DBNROOT}/bin/dbn_alert" MODEL NCDCGIF "${job}" 
"${COM_ATMOS_GEMPAK_GIF}/${rhvvel700dev}" +fi + +echo "GEMPAK_GIF ${fhr3} hour completed normally" + +exit diff --git a/gempak/ush/gfs_meta_ak.sh b/gempak/ush/gfs_meta_ak.sh index c258b7e83a..88f136ae13 100755 --- a/gempak/ush/gfs_meta_ak.sh +++ b/gempak/ush/gfs_meta_ak.sh @@ -1,48 +1,41 @@ -#!/bin/sh +#! /usr/bin/env bash # # Metafile Script : gfs_meta_ak.sh # -# Log : -# D.W.Plummer/NCEP 2/97 Add log header -# D.W.Plummer/NCEP 3/97 Added ecmwf comparison. -# D.W.Plummer/NCEP 3/97 Added $MAPFIL specification for lower resolution -# D.W.Plummer/NCEP 4/97 Removed run from 3-HOURLY PRECIP -# J. Carr/HPC 2/99 Changed skip to 0 -# B. Gordon/NCO 5/00 Modified for production on IBM-SP -# Changed gdplot_nc -> gdplot2_nc -# D. Michaud/NCO 4/01 Modified to Reflect Different Title for -# Parallel runs -# J. Carr/PMB 11/04 Added a ? to all title lines -# Changed contur from a 1 to a 2. -# M. Klein/HPC 6/07 Modify for Alaska medium-range desk and rename script. -# -cd $DATA +source "${HOMEgfs}/ush/preamble.sh" -set -xa +cd "${DATA}" || exit 2 -rm -rf $DATA/ak -mkdir -p -m 775 $DATA/ak -cd $DATA/ak -cp $FIXgempak/datatype.tbl datatype.tbl +rm -rf "${DATA}/ak" +mkdir -p -m 775 "${DATA}/ak" +cd "${DATA}/ak" || exit 2 +cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl device="nc | gfs.meta.ak" -PDY2=$(echo $PDY | cut -c3-) + +# +# Link data into DATA to sidestep gempak path limits +# TODO: Replace this +# +export COMIN="${RUN}.${PDY}${cyc}" +if [[ ! -L ${COMIN} ]]; then + ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}" +fi fend=F216 -if [ "$envir" = "para" ] ; then +if [[ "${envir}" == "para" ]] ; then export m_title="GFSP" else export m_title="GFS" fi export pgm=gdplot2_nc;. 
prep_step -startmsg -$GEMEXE/gdplot2_nc << EOF -GDFILE = F-GFS | ${PDY2}/${cyc}00 -GDATTIM = F00-$fend-6 +"${GEMEXE}/gdplot2_nc" << EOF +GDFILE = F-GFS | ${PDY:2}/${cyc}00 +GDATTIM = F00-${fend}-6 DEVICE = ${device} PANEL = 0 TEXT = 1/21//hw @@ -71,13 +64,13 @@ HLSYM = 2;1.5//21//hw CLRBAR = 1 WIND = REFVEC = -TITLE = 5/-2/~ ? $m_title PMSL, 1000-500 MB THICKNESS|~MSLP, 1000-500 THKN!0 +TITLE = 5/-2/~ ? ${m_title} PMSL, 1000-500 MB THICKNESS|~MSLP, 1000-500 THKN!0 l run GLEVEL = 4400:10000 !0 GVCORD = sgma !none -SKIP = 0 +SKIP = 0 SCALE = 0 GDPFUN = sm5s(relh) !pmsl TYPE = c/f !c @@ -88,11 +81,11 @@ FLINE = 0;24;23;22 HILO = 26;2/H#;L#/1018-1070;900-1012//30;30/y HLSYM = 2;1.5//21//hw CLRBAR = 1 -TITLE = 5/-2/~ ? $m_title PMSL, 1000-500MB MEAN RH|~MSLP, 1000-500 MEAN RH!0 +TITLE = 5/-2/~ ? ${m_title} PMSL, 1000-500MB MEAN RH|~MSLP, 1000-500 MEAN RH!0 run GLEVEL = 850 -GVCORD = pres +GVCORD = pres SKIP = 0 !0 !0 !0 !0/1;-1 SCALE = 0 !0 !0 !-1 !0 GDPFUN = sm9s(tmpc)!sm9s(tmpc)!sm9s(tmpc)!sm5s(hght)!kntv(wnd) @@ -105,7 +98,7 @@ FLINE = 24;30;28;29;25;0;17 HILO = HLSYM = WIND = 18//1 -TITLE = 5/-2/~ ? $m_title @ HGT, TEMPERATURE AND WIND (KTS)|~@ HGT, TMP, WIND!0 +TITLE = 5/-2/~ ? ${m_title} @ HGT, TEMPERATURE AND WIND (KTS)|~@ HGT, TMP, WIND!0 l run @@ -119,7 +112,7 @@ LINE = 8//2/0 !23//2/0 !20/1/1/1 !6/1/1/1 ! 24/5/1/1 FINT = 70;90 FLINE = 0;23;22 WIND = -TITLE = 5/-2/~ ? $m_title @ HGT, REL HUMIDITY AND OMEGA|~@ HGT, RH AND OMEGA!0 +TITLE = 5/-2/~ ? ${m_title} @ HGT, REL HUMIDITY AND OMEGA|~@ HGT, RH AND OMEGA!0 l run @@ -133,8 +126,8 @@ LINE = 7/5/1/2 ! 29/5/1/2 !5/1/2/1 FINT = 16;20;24;28;32;36;40;44 FLINE = 0;23-15 HILO = 2;6/X;N/10-99;10-99! -HLSYM = -TITLE = 5/-2/~ ? $m_title @ HGT AND VORTICITY|~@ HGT AND VORTICITY!0 +HLSYM = +TITLE = 5/-2/~ ? ${m_title} @ HGT AND VORTICITY|~@ HGT AND VORTICITY!0 l run @@ -150,7 +143,7 @@ FLINE = 0;25;24;29;7;15 HILO = HLSYM = WIND = 18//1 -TITLE = 5/-2/~ ? 
$m_title @ HGT, ISOTACHS AND WIND (KTS)|~@ HGT AND WIND!0 +TITLE = 5/-2/~ ? ${m_title} @ HGT, ISOTACHS AND WIND (KTS)|~@ HGT AND WIND!0 l run @@ -158,87 +151,88 @@ GDATTIM = F06-F180-6 GLEVEL = 0 SKIP = 0 GVCORD = none -SCALE = 0 +SCALE = 0 GDPFUN = p06i !pmsl TYPE = f !c -CONTUR = 2 +CONTUR = 2 CINT = !4 LINE = !5/1/1/0 FINT = .01;.1;.25;.5;.75;1;1.25;1.5;1.75;2;2.5;3;4;5;6;7;8;9 FLINE = 0;21-30;14-20;5 HILO = !26;2////30;30/y HLSYM = 2;1.5//21//hw -WIND = -TITLE = 5/-2/~ ? $m_title 6-HR TOTAL PCPN, MSLP|~6-HR TOTAL PCPN, MSLP!0 +WIND = +TITLE = 5/-2/~ ? ${m_title} 6-HR TOTAL PCPN, MSLP|~6-HR TOTAL PCPN, MSLP!0 l run -GDPFUN = p06i -TYPE = f +GDPFUN = p06i +TYPE = f HILO = 31;0/x#2////y HLSYM = 1.5 -TITLE = 5/-2/~ ? $m_title 6-HR TOTAL PCPN |~6-HR TOTAL PCPN!0 +TITLE = 5/-2/~ ? ${m_title} 6-HR TOTAL PCPN |~6-HR TOTAL PCPN!0 run -GDATTIM = F12-$fend-06 +GDATTIM = F12-${fend}-06 GDPFUN = p12i !pmsl TYPE = f !c HILO = !26;2////30;30/y HLSYM = 2;1.5//21//hw -TITLE = 5/-2/~ ? $m_title 12-HR TOTAL PCPN, MSLP|~12-HR TOTAL PCPN, MSLP!0 +TITLE = 5/-2/~ ? ${m_title} 12-HR TOTAL PCPN, MSLP|~12-HR TOTAL PCPN, MSLP!0 run -GDPFUN = p12i +GDPFUN = p12i TYPE = f HILO = 31;0/x#2////y HLSYM = 1.5 -TITLE = 5/-2/~ ? $m_title 12-HR TOTAL PCPN (IN)|~12-HR TOTAL PCPN!0 +TITLE = 5/-2/~ ? ${m_title} 12-HR TOTAL PCPN (IN)|~12-HR TOTAL PCPN!0 l run -GDATTIM = F24-$fend-06 +GDATTIM = F24-${fend}-06 GDPFUN = p24i !pmsl TYPE = f !c HILO = !26;2////30;30/y HLSYM = 2;1.5//21//hw -TITLE = 5/-2/~ ? $m_title 24-HR TOTAL PCPN, MSLP|~24-HR TOTAL PCPN, MSLP!0 -run +TITLE = 5/-2/~ ? ${m_title} 24-HR TOTAL PCPN, MSLP|~24-HR TOTAL PCPN, MSLP!0 +run -GDPFUN = p24i +GDPFUN = p24i TYPE = f HILO = 31;0/x#2////y HLSYM = 1.5 -TITLE = 5/-2/~ ? $m_title 24-HR TOTAL PCPN (IN)|~24-HR TOTAL PCPN +TITLE = 5/-2/~ ? ${m_title} 24-HR TOTAL PCPN (IN)|~24-HR TOTAL PCPN run exit EOF -export err=$?;err_chk +export err=$? 
##################################################### # GEMPAK DOES NOT ALWAYS HAVE A NON ZERO RETURN CODE # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK # FOR THIS CASE HERE. ##################################################### -ls -l gfs.meta.ak -export err=$?;export pgm="GEMPAK CHECK FILE";err_chk - - -if [ $SENDCOM = "YES" ] ; then - mv gfs.meta.ak ${COMOUT}/gfs_${PDY}_${cyc}_ak - if [ $SENDDBN = "YES" ] ; then - $DBNROOT/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - $COMOUT/gfs_${PDY}_${cyc}_ak - if [ $DBN_ALERT_TYPE = "GFS_METAFILE_LAST" ] ; then - DBN_ALERT_TYPE=GFS_METAFILE - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/gfs_${PDY}_${cyc}_ak - fi - if [ $fhr -eq 216 ] ; then - ${DBNROOT}/bin/dbn_alert MODEL GFS_METAFILE_LAST $job \ - ${COMOUT}/gfs_${PDY}_${cyc}_ak +if (( err != 0 )) || [[ ! -s gfs.meta.ak ]]; then + echo "FATAL ERROR: Failed to create alaska meta file" + exit "${err}" +fi + +mv gfs.meta.ak "${COM_ATMOS_GEMPAK_META}/gfs_${PDY}_${cyc}_ak" +export err=$? +if (( err != 0 )) ; then + echo "FATAL ERROR: Failed to move meta file to ${COM_ATMOS_GEMPAK_META}/gfs_${PDY}_${cyc}_ak" + exit $(( err + 100 )) +fi + +if [[ "${SENDDBN}" == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/gfs_${PDY}_${cyc}_ak" + if [[ ${DBN_ALERT_TYPE} = "GFS_METAFILE_LAST" ]] ; then + DBN_ALERT_TYPE=GFS_METAFILE + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/gfs_${PDY}_${cyc}_ak" fi - fi fi diff --git a/gempak/ush/gfs_meta_bwx.sh b/gempak/ush/gfs_meta_bwx.sh index f5b4e1d944..eee5f496b7 100755 --- a/gempak/ush/gfs_meta_bwx.sh +++ b/gempak/ush/gfs_meta_bwx.sh @@ -1,55 +1,38 @@ -#! /bin/sh +#! /usr/bin/env bash # # Metafile Script : gfs_meta_bwx_new # -# Log : -# D.W.Plummer/NCEP 2/97 Add log header -# J. Carr/HPC 12/12/97 Converted from gdplot to gdplot2 -# J. Carr/HPC 08/05/98 Changed map to medium resolution -# J. 
Carr/HPC 02/02/99 Changed skip to 0 -# J. Carr/HPC 04/12/99 Added gfs out to 84 hrs. -# J. Carr/HPC 6/99 Added a filter to map -# J. Carr/HPC 1/2000 Eliminated 250 mb vort and pw field. Eliminated pv field. Added another ptype field. -# J. Carr/HPC 2/2001 Edited to run on the IBM. -# J. Carr/HPC 5/2001 Added a mn variable for a/b side dbnet root variable. -# J. Carr/HPC 6/2001 Converted to a korn shell prior to delivering script to Production. -# J. Carr/HPC 7/2001 Submitted. -# J. Carr/PMB 11/2004 Added a ? to all title/TITLE lines. -# M. Klein/HPC 01/2010 Extend to 180 hours -# # Set up Local Variables # -set -x -# -export PS4='BWX:$SECONDS + ' -mkdir -p -m 775 $DATA/BWX -cd $DATA/BWX -cp $FIXgempak/datatype.tbl datatype.tbl -mdl=gfs -MDL="GFS" +source "${HOMEgfs}/ush/preamble.sh" + +mkdir -p -m 775 "${DATA}/BWX" +cd "${DATA}/BWX" || exit 2 +cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl + metatype="bwx" -metaname="${mdl}_${metatype}_${cyc}.meta" +metaname="${RUN}_${PDY}_${cyc}_us_${metatype}" device="nc | ${metaname}" -PDY2=$(echo $PDY | cut -c3-) + +# +# Link data into DATA to sidestep gempak path limits +# TODO: Replace this # -#if [ ${cyc} -eq 00 ] ; then -# fend=F126 -#elif [ ${cyc} -eq 12 ] ; then -# fend=F126 -#else -# fend=F126 -#fi +export COMIN="${RUN}.${PDY}${cyc}" +if [[ ! -L ${COMIN} ]]; then + ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}" +fi fend=F180 -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc<< EOFplt -gdfile = F-${MDL} | ${PDY2}/${cyc}00 +export pgm=gdplot2_nc;. prep_step +"${GEMEXE}/gdplot2_nc" << EOFplt +gdfile = F-${RUN} | ${PDY:2}/${cyc}00 gdattim = F00-${fend}-6 CONTUR = 1 garea = bwus -proj = +proj = map = 1/1/1/yes latlon = 0 text = 1/22/2/hw @@ -68,12 +51,12 @@ cint = 4/200/308 !4/312/324 !4/328 line = 16/1/1 !2/1/3 !32/1/2/1 fint = 328;336;344;352;360;368 fline = 0;24;30;29;15;18;20 -hilo = -hlsym = +hilo = +hlsym = clrbar = 1/V/LL !0 -wind = bk0 !bk0 !bk0 !bk9/0.7/2/112 -refvec = -title = 1/0/~ ? 
${MDL} BL THTE & WIND (KTS)|~BL THTE & WIND!0 +wind = bk0 !bk0 !bk0 !bk9/0.7/2/112 +refvec = +title = 1/0/~ ? ${RUN} BL THTE & WIND (KTS)|~BL THTE & WIND!0 l r @@ -90,9 +73,9 @@ hilo = 0 !0 !0 !20/H#;L#/1020-1070;900-1012 hlsym = 0 !0 !0 !1.3;1.3//22;22/3;3/hw clrbar = 1/V/LL !0 wind = bk0 !bk0 !bk0 !bk0 !bk9/0.7/2/112 -refvec = -title = 1/0/~ ? ${MDL} PMSL, BL TEMP, WIND (KTS)|~PMSL, BL TEMP, WIND!0 -r +refvec = +title = 1/0/~ ? ${RUN} PMSL, BL TEMP, WIND (KTS)|~PMSL, BL TEMP, WIND!0 +r GLEVEL = 1000 !1000 !0 !1000 GVCORD = pres !pres !none !pres @@ -110,9 +93,9 @@ HLSYM = !!1.5;1.5//22;22/3;3/hw CLRBAR = 1 WIND = !!!bk9/0.6/2/121/.6 REFVEC = -TITLE = 1/0/~ ? ${MDL} PMSL, 1000 MB TMP (F), FRONTOGENESIS (F)|~@ FRONTOGENESIS!0 +TITLE = 1/0/~ ? ${RUN} PMSL, 1000 MB TMP (F), FRONTOGENESIS (F)|~@ FRONTOGENESIS!0 r - + glevel = 700 !700 !9950 !0 gdpfun = sm5s(kinx)!sm5s(tmpc)!sm5s(dwpf)!sm5s(pmsl) gvcord = pres !pres !sgma !none @@ -127,63 +110,63 @@ hlsym = !!!1.5;1.5//22;22/3;3/hw clrbar = 1/V/LL!0 wind = refvec = -title = 1/0/~ ? ${MDL} K INDEX, 700mb TEMP (>6 C), sfc DWPT & MSLP|~K INDEX!0 +title = 1/0/~ ? ${RUN} K INDEX, 700mb TEMP (>6 C), sfc DWPT & MSLP|~K INDEX!0 r -gdattim = F06-${fend}-06 +gdattim = F06-${fend}-06 glevel = 0!500:1000!500:1000!0 -gvcord = none!pres!pres!none -skip = 0 -scale = 0 !-1 !-1 !0 +gvcord = none!pres!pres!none +skip = 0 +scale = 0 !-1 !-1 !0 gdpfun = p06i!sm5s(ldf(hght) !sm5s(ldf(hght)!sm5s(pmsl) type = f !c !c cint = !3/0/540 !3/543/1000 !4 -line = !4/5/2 !2/5/2 !19//3 +line = !4/5/2 !2/5/2 !19//3 fint = .01;.1;.25;.5;.75;1;1.25;1.5;1.75;2;2.5;3;4;5;6;7;8;9 fline = 0;21-30;14-20;5 hilo = !0!0!19/H#;L#/1020-1070;900-1010 hlsym = !0!0!1.3;1.3//22;22/3;3/hw clrbar = 1 -wind = bk0 +wind = bk0 CONTUR = 2 -refvec = -title = 1/0/~ ? ${MDL} 6-HR TOTAL PCPN, 1000-500mb THK |~6-HR PCPN & 1000-500 THK!0 +refvec = +title = 1/0/~ ? 
${RUN} 6-HR TOTAL PCPN, 1000-500mb THK |~6-HR PCPN & 1000-500 THK!0 r gdattim = F00-${fend}-6 GLEVEL = 700!700!700!850!850!9950!9950 GVCORD = PRES!PRES!PRES!PRES!PRES!sgma!sgma -SKIP = 0 -SCALE = 0 +SKIP = 0 +SCALE = 0 GDPFUN = sm5s(relh)!sm5s(tmpc)!sm5s(tmpc)!sm5s(tmpc)!sm5s(tmpc)!sm5s(tmpc)!sm5s(tmpc) TYPE = c/f ! c -CINT = 50;70;90;95!2;-2 !200;0 !2;-2 !200;0 !2;-2 !-100;0;100 -LINE = 32//1/0 !6/3/2!6/1/2 !2/3/2!2/1/2 !20/3/2!20/1/2 +CINT = 50;70;90;95!2;-2 !200;0 !2;-2 !200;0 !2;-2 !-100;0;100 +LINE = 32//1/0 !6/3/2!6/1/2 !2/3/2!2/1/2 !20/3/2!20/1/2 FINT = 50;70;90 FLINE = 0;24;23;22 -HILO = -HLSYM = +HILO = +HLSYM = CLRBAR = 1 -WIND = +WIND = REFVEC = -TITLE = 1/0/~ ? ${MDL} @ RH, T (BL yel,850 red,700 cyan)|~@ RH, R/S TEMP!0 -r +TITLE = 1/0/~ ? ${RUN} @ RH, T (BL yel,850 red,700 cyan)|~@ RH, R/S TEMP!0 +r GLEVEL = 4400:10000!700:500!700:500!850 !850 !9950!9950 GVCORD = SGMA !PRES !PRES !PRES!PRES!SGMA!SGMA -SCALE = 0!3!3!0 +SCALE = 0!3!3!0 GDPFUN = sm5s(relh)!sm5s(lav(omeg))!sm5s(lav(omeg))!sm5s(tmpc)!sm5s(tmpc)!sm5s(tmpc)!sm5s(tmpc) TYPE = c/f ! c -CINT = 50;70;90;95!1/1!-1;-3;-5;-7;-9;-11;-13;-15;-17;-19;-21!2;-2!200;0!2;-2!200;0 -LINE = 32//2/0 !30/10/3!6/1/2 !2/3/2!2/1/2 !20/3/2!20/1/2 +CINT = 50;70;90;95!1/1!-1;-3;-5;-7;-9;-11;-13;-15;-17;-19;-21!2;-2!200;0!2;-2!200;0 +LINE = 32//2/0 !30/10/3!6/1/2 !2/3/2!2/1/2 !20/3/2!20/1/2 FINT = 50;70;90 FLINE = 0;24;23;22 -HILO = -HLSYM = +HILO = +HLSYM = CLRBAR = 1 -WIND = +WIND = REFVEC = -TITLE = 1/0/~ ? ${MDL} @ RH,T (BL yel,850 red),7-500 VV|~@ RH,R/S T,VV!0 +TITLE = 1/0/~ ? ${RUN} @ RH,T (BL yel,850 red),7-500 VV|~@ RH,R/S T,VV!0 r glevel = 0!0!0!0!700:500 !4400:10000 @@ -197,7 +180,7 @@ line = 22/1/2/0!4/1/2/0!7/1/2/0!2/1/2/0!6/1/3!21/1/3 fint = 50;200!50;200!50;200!50;200 fline = 0;23;23!0;25;25!0;30;30!0;15;15 clrbar = -title = 1/0/~ ? ${MDL} PCPN TYPE, 1000-500 RH & 7-500 VV|~PCPN TYPE & VV!0 +title = 1/0/~ ? 
${RUN} PCPN TYPE, 1000-500 RH & 7-500 VV|~PCPN TYPE & VV!0 r glevel = 0 !0 !0 !0 @@ -211,7 +194,7 @@ line = 22/1/2/0 !4/1/2/0 !7/1/2/0 !2/1/2/0 fint = 50;200 !50;200 !50;200 !50;200 fline = 0;23;23 !0;25;25 !0;30;30 !0;15;15 clrbar = -title = 1/0/~ ? ${MDL} PCPN TYPE|~PCPN TYPE!0 +title = 1/0/~ ? ${RUN} PCPN TYPE|~PCPN TYPE!0 r GLEVEL = 500 @@ -230,7 +213,7 @@ HLSYM = CLRBAR = 1 WIND = !!!am1/.2/1/121/.4 REFVEC = -TITLE = 1/0/~ ? ${MDL} @ HGHT, TEMP & WIND|~500 HGHT,TMP,WIND!0 +TITLE = 1/0/~ ? ${RUN} @ HGHT, TEMP & WIND|~500 HGHT,TMP,WIND!0 TEXT = 1/21//hw MAP = 11/1/2/yes STNPLT = @@ -266,9 +249,9 @@ FLINE = 0 !0;24;25;30;29;28;27 !11;12;2;10;15;14;0 HILO = 0 !0 !0 !5/H#;L# HLSYM = 0 ! !0 !1.5//21//hw CLRBAR = 0 !0 !1 !0 -WIND = -REFVEC = -TITLE = 1/-1/~ ? ${MDL} @ MB HGT|~500 HGT CHG!1/-2/~ ? ${MDL} @ MB 12-HR HGT FALLS!0 +WIND = +REFVEC = +TITLE = 1/-1/~ ? ${RUN} @ MB HGT|~500 HGT CHG!1/-2/~ ? ${RUN} @ MB 12-HR HGT FALLS!0 TEXT = 1/21////hw CLEAR = YES l @@ -276,59 +259,59 @@ run GDATTIM = f24 GDPFUN = sm5s(hght)!(sub(hght^f24,hght^f12))!(sub(hght^f24,hght^f12))!sm5s(hght) -TITLE = 1/-1/~ ? ${MDL} @ MB HGT|~500 HGT CHG!1/-2/~ ? ${MDL} @ MB 12-HR HGT FALLS!0 +TITLE = 1/-1/~ ? ${RUN} @ MB HGT|~500 HGT CHG!1/-2/~ ? ${RUN} @ MB 12-HR HGT FALLS!0 l -run +run GDATTIM = f36 GDPFUN = sm5s(hght)!(sub(hght^f36,hght^f24))!(sub(hght^f36,hght^f24))!sm5s(hght) -TITLE = 1/-1/~ ? ${MDL} @ MB HGT|~500 HGT CHG!1/-2/~ ? ${MDL} @ MB 12-HR HGT FALLS!0 +TITLE = 1/-1/~ ? ${RUN} @ MB HGT|~500 HGT CHG!1/-2/~ ? ${RUN} @ MB 12-HR HGT FALLS!0 l run GDATTIM = f48 GDPFUN = sm5s(hght)!(sub(hght^f48,hght^f36))!(sub(hght^f48,hght^f36))!sm5s(hght) -TITLE = 1/-1/~ ? ${MDL} @ MB HGT|~500 HGT CHG!1/-2/~ ? ${MDL} @ MB 12-HR HGT FALLS!0 +TITLE = 1/-1/~ ? ${RUN} @ MB HGT|~500 HGT CHG!1/-2/~ ? ${RUN} @ MB 12-HR HGT FALLS!0 l run GDATTIM = f60 GDPFUN = sm5s(hght)!(sub(hght^f60,hght^f48))!(sub(hght^f60,hght^f48))!sm5s(hght) -TITLE = 1/-1/~ ? ${MDL} @ MB HGT|~500 HGT CHG!1/-2/~ ? 
${MDL} @ MB 12-HR HGT FALLS!0 +TITLE = 1/-1/~ ? ${RUN} @ MB HGT|~500 HGT CHG!1/-2/~ ? ${RUN} @ MB 12-HR HGT FALLS!0 l run GDATTIM = f72 GDPFUN = sm5s(hght)!(sub(hght^f72,hght^f60))!(sub(hght^f72,hght^f60))!sm5s(hght) -TITLE = 1/-1/~ ? ${MDL} @ MB HGT|~500 HGT CHG!1/-2/~ ? ${MDL} @ MB 12-HR HGT FALLS!0 +TITLE = 1/-1/~ ? ${RUN} @ MB HGT|~500 HGT CHG!1/-2/~ ? ${RUN} @ MB 12-HR HGT FALLS!0 l run GDATTIM = f84 GDPFUN = sm5s(hght)!(sub(hght^f84,hght^f72))!(sub(hght^f84,hght^f72))!sm5s(hght) -TITLE = 1/-1/~ ? ${MDL} @ MB HGT|~500 HGT CHG!1/-2/~ ? ${MDL} @ MB 12-HR HGT FALLS!0 +TITLE = 1/-1/~ ? ${RUN} @ MB HGT|~500 HGT CHG!1/-2/~ ? ${RUN} @ MB 12-HR HGT FALLS!0 l run GDATTIM = f96 GDPFUN = sm5s(hght)!(sub(hght^f96,hght^f84))!(sub(hght^f96,hght^f84))!sm5s(hght) -TITLE = 1/-1/~ ? ${MDL} @ MB HGT|~500 HGT CHG!1/-2/~ ? ${MDL} @ MB 12-HR HGT FALLS!0 +TITLE = 1/-1/~ ? ${RUN} @ MB HGT|~500 HGT CHG!1/-2/~ ? ${RUN} @ MB 12-HR HGT FALLS!0 l run GDATTIM = f108 GDPFUN = sm5s(hght)!(sub(hght^f108,hght^f96))!(sub(hght^f108,hght^f96))!sm5s(hght) -TITLE = 1/-1/~ ? ${MDL} @ MB HGT|~500 HGT CHG!1/-2/~ ? ${MDL} @ MB 12-HR HGT FALLS!0 +TITLE = 1/-1/~ ? ${RUN} @ MB HGT|~500 HGT CHG!1/-2/~ ? ${RUN} @ MB 12-HR HGT FALLS!0 l run GDATTIM = f120 GDPFUN = sm5s(hght)!(sub(hght^f120,hght^f108))!(sub(hght^f120,hght^f108))!sm5s(hght) -TITLE = 1/-1/~ ? ${MDL} @ MB HGT|~500 HGT CHG!1/-2/~ ? ${MDL} @ MB 12-HR HGT FALLS!0 +TITLE = 1/-1/~ ? ${RUN} @ MB HGT|~500 HGT CHG!1/-2/~ ? ${RUN} @ MB 12-HR HGT FALLS!0 l run -MAP = 4/1/2/yes +MAP = 4/1/2/yes garea = 38.5;-91.3;51.4;-71.4 proj = nps//3;3;0;1 GDATTIM = F00-${fend}-6 @@ -346,31 +329,32 @@ hlsym = 0 clrbar = 1/V/LL !0 wind = bk0 !bk0 !bk0 !bk0 !bk9/0.9/2/112 refvec = -title = 1/0/~ ? ${MDL} 720-940 MB AVG RH,BL1 WND,850 MB OMG,850-2m dT,850 T|~GR LAKE!0 +title = 1/0/~ ? ${RUN} 720-940 MB AVG RH,BL1 WND,850 MB OMG,850-2m dT,850 T|~GR LAKE!0 FILTER = y r exit EOFplt -export err=$?;err_chk +export err=$? 
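[Editorial note on the DBN alert cascade in the rewritten tails of these metafile scripts: when a job carries DBN_ALERT_TYPE=GFS_METAFILE_LAST, it issues that alert and then a second, plain GFS_METAFILE alert for the same file. A reduced sketch under stated assumptions; the `dbn_alert` stub and `send_meta_alerts` wrapper below are illustrative only, while the real scripts invoke `${DBNROOT}/bin/dbn_alert` directly:]

```shell
#!/usr/bin/env bash
# Reduced sketch of the alert cascade; dbn_alert is stubbed for illustration
# (the production scripts call "${DBNROOT}/bin/dbn_alert" MODEL <type> <file>).
dbn_alert() { echo "ALERT: $*"; }

send_meta_alerts() {
  local alert_type=$1 metafile=$2
  dbn_alert MODEL "${alert_type}" "${metafile}"
  # The "last metafile" marker also triggers an ordinary metafile alert for
  # the same file, so downstream listeners see both events.
  if [[ "${alert_type}" == "GFS_METAFILE_LAST" ]]; then
    dbn_alert MODEL GFS_METAFILE "${metafile}"
  fi
}

send_meta_alerts GFS_METAFILE_LAST gfs_20240402_00_us_bwx
# prints:
# ALERT: MODEL GFS_METAFILE_LAST gfs_20240402_00_us_bwx
# ALERT: MODEL GFS_METAFILE gfs_20240402_00_us_bwx
```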
+ ##################################################### # GEMPAK DOES NOT ALWAYS HAVE A NON ZERO RETURN CODE # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK # FOR THIS CASE HERE. ##################################################### -ls -l $metaname -export err=$?;export pgm="GEMPAK CHECK FILE";err_chk +if (( err != 0 )) || [[ ! -s "${metaname}" ]]; then + echo "FATAL ERROR: Failed to create bwx meta file" + exit $(( err + 100 )) +fi -if [ $SENDCOM = "YES" ] ; then - mv ${metaname} ${COMOUT}/${mdl}_${PDY}_${cyc}_us_${metatype} - if [ $SENDDBN = "YES" ] ; then - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/${mdl}_${PDY}_${cyc}_us_${metatype} - if [ $DBN_ALERT_TYPE = "GFS_METAFILE_LAST" ] ; then +mv "${metaname}" "${COM_ATMOS_GEMPAK_META}/${metaname}" +if [[ "${SENDDBN}" == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${metaname}" + if [[ ${DBN_ALERT_TYPE} = "GFS_METAFILE_LAST" ]] ; then DBN_ALERT_TYPE=GFS_METAFILE - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/${mdl}_${PDY}_${cyc}_us_${metatype} - fi - fi + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${metaname}" + fi fi exit diff --git a/gempak/ush/gfs_meta_comp.sh b/gempak/ush/gfs_meta_comp.sh index 9bd27c5736..5f01de5d48 100755 --- a/gempak/ush/gfs_meta_comp.sh +++ b/gempak/ush/gfs_meta_comp.sh @@ -1,184 +1,161 @@ -#! /bin/sh +#! /usr/bin/env bash # Metafile Script : gfs_meta_comp.sh # -# This is a script which creates a metafile that runs a comparison of 500 MB -# heights and PMSL between the older GFS model run and the newer one. The +# This is a script which creates a metafile that runs a comparison of 500 MB +# heights and PMSL between the older GFS model run and the newer one. The # metafile also generates a comparison between the UKMET older run and the newer # GFS model run. # -# Log : -# J. Carr/HPC 5/12/97 Developed Script -# J. 
Carr/HPC 8/05/98 Changed map to medium resolution and redefined yesterday code -# J. Carr/HPC 2/01/99 Changed skip to 0 -# J. Carr/HPC 4/12/99 Added gfs model out to 84 hours. -# J. Carr/HPC 6/99 put a filter on map -# J. Carr/HPC 4/2000 Upped the eta comp to 60 hrs. -# J. Carr/HPC 2/2001 Edited to run on the IBM. -# J. Carr/HPC 5/2001 Added a mn variable for a/b side dbnet root variable. -# J. Carr/HPC 7/2001 Added more comparison times. -# J. Carr/HPC 7/2001 Converted to a korn shell prior to delivering script to Production. -# J. Carr/HPC 7/2001 Submitted. -# J. Carr/HPC 11/2004 Changed all eta/ETA entries to nam/NAM. -# Inserted a ? in all title/TITLE lines. -# # Set up Local Variables # -set -x -# -export PS4='COMP:$SECONDS + ' -rm -Rf $DATA/COMP $DATA/GEMPAK_META_COMP -mkdir -p -m 775 $DATA/COMP $DATA/GEMPAK_META_COMP -cd $DATA/COMP -cp $FIXgempak/datatype.tbl datatype.tbl -export COMPONENT=${COMPONENT:-atmos} +source "${HOMEgfs}/ush/preamble.sh" + +rm -Rf "${DATA}/COMP" "${DATA}/GEMPAK_META_COMP" +mkdir -p -m 775 "${DATA}/COMP" "${DATA}/GEMPAK_META_COMP" +cd "${DATA}/COMP" || exit 2 +cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl mdl=gfs MDL=GFS metatype="comp" metaname="${mdl}_${metatype}_${cyc}.meta" device="nc | ${metaname}" -PDY2=$(echo $PDY | cut -c3-) -# -#XXW export MODEL=$COMROOT/nawips/prod -# BV export MODEL=$COMROOT/nawips/${envir} -# BV export HPCGFS=${MODEL}/${mdl}.$PDY -export HPCGFS=${COMINgempak}/${mdl}.${PDY}/${cyc}/${COMPONENT}/gempak -export COMIN00=${COMINgempak}/${mdl}.${PDY}/00/${COMPONENT}/gempak -export COMIN06=${COMINgempak}/${mdl}.${PDY}/06/${COMPONENT}/gempak -export COMIN12=${COMINgempak}/${mdl}.${PDY}/12/${COMPONENT}/gempak -export COMIN18=${COMINgempak}/${mdl}.${PDY}/18/${COMPONENT}/gempak -if [ ${cyc} -eq 00 ] ; then - cp $COMIN00/gfs_${PDY}00f* $DATA/GEMPAK_META_COMP -elif [ ${cyc} -eq 06 ] ; then - cp $COMIN00/gfs_${PDY}00f* $DATA/GEMPAK_META_COMP - cp $COMIN06/gfs_${PDY}06f* $DATA/GEMPAK_META_COMP -elif [ ${cyc} -eq 12 
] ; then - cp $COMIN00/gfs_${PDY}00f* $DATA/GEMPAK_META_COMP - cp $COMIN06/gfs_${PDY}06f* $DATA/GEMPAK_META_COMP - cp $COMIN12/gfs_${PDY}12f* $DATA/GEMPAK_META_COMP -elif [ ${cyc} -eq 18 ] ; then - cp $COMIN00/gfs_${PDY}00f* $DATA/GEMPAK_META_COMP - cp $COMIN06/gfs_${PDY}06f* $DATA/GEMPAK_META_COMP - cp $COMIN12/gfs_${PDY}12f* $DATA/GEMPAK_META_COMP - cp $COMIN18/gfs_${PDY}18f* $DATA/GEMPAK_META_COMP -fi -export COMIN=$DATA/GEMPAK_META_COMP -#XXW export HPCNAM=${MODEL}/nam.$PDY -#XXW export HPCNGM=${MODEL}/ngm.$PDY -# BV export HPCNAM=$COMROOT/nawips/prod/nam.$PDY -export HPCNAM=${COMINnam}.$PDY/gempak +export COMIN="gfs.multi" +mkdir "${COMIN}" +for cycle in $(seq -f "%02g" -s ' ' 0 "${STEP_GFS}" "${cyc}"); do + YMD=${PDY} HH=${cycle} GRID="1p00" generate_com gempak_dir:COM_ATMOS_GEMPAK_TMPL + for file_in in "${gempak_dir}/gfs_1p00_${PDY}${cycle}f"*; do + file_out="${COMIN}/$(basename "${file_in}")" + if [[ ! -L "${file_out}" ]]; then + ln -sf "${file_in}" "${file_out}" + fi + done +done + +export HPCNAM="nam.${PDY}" +if [[ ! 
-L ${HPCNAM} ]]; then + ln -sf "${COMINnam}/nam.${PDY}/gempak" "${HPCNAM}" +fi -# export HPCNGM=$COMROOT/nawips/prod/ngm.$PDY # # DEFINE YESTERDAY -PDYm1=$($NDATE -24 ${PDY}${cyc} | cut -c -8) -PDY2m1=$(echo $PDYm1 | cut -c 3-) +PDYm1=$(date --utc +%Y%m%d -d "${PDY} - 24 hours") # # DEFINE 2 DAYS AGO -PDYm2=$($NDATE -48 ${PDY}${cyc} | cut -c -8) -PDY2m2=$(echo $PDYm2 | cut -c 3-) -# -# DEFINE 3 DAYS AGO -PDYm3=$($NDATE -72 ${PDY}${cyc} | cut -c -8) -PDY2m3=$(echo $PDYm3 | cut -c 3-) -# -# THE 1200 UTC CYCLE -# -if [ ${cyc} -eq 12 ] ; then - grid="F-${MDL} | ${PDY2}/${cyc}00" - for gareas in US NP - do - if [ ${gareas} = US ] ; then +PDYm2=$(date --utc +%Y%m%d -d "${PDY} - 48 hours") + +grid="F-${MDL} | ${PDY:2}/${cyc}00" +for gareas in US NP; do + case ${gareas} in + US) garea="bwus" proj=" " latlon="0" - elif [ ${gareas} = NP ] ; then + ;; + NP) garea="5;-177;45;-72" proj="STR/90.0;-155.0;0.0" latlon="1/1/1/1/10" + ;; + *) + echo "FATAL ERROR: Unknown domain" + exit 100 + esac + + case ${cyc} in + 00 | 12) + offsets=(6 12 24 48) + contours=1 + type_param="CTYPE" + ex="" + ;; + 06 | 18) + offsets=(6 12 18 24) + contours=2 + type_param="TYPE" + ex="ex" + ;; + *) + echo "FATAL ERROR: Invalid cycle ${cyc} passed to ${BASH_SOURCE[0]}" + exit 100 + ;; + esac + + for offset in "${offsets[@]}"; do + init_time=$(date --utc +%Y%m%d%H -d "${PDY} ${cyc} - ${offset} hours") + init_PDY=${init_time:0:8} + init_cyc=${init_time:8:2} + + if (( init_time <= SDATE )); then + echo "Skipping generation for ${init_time} because it is before the experiment began" + if (( offset == "${offsets[0]}" )); then + echo "First forecast time, no metafile produced" + exit 0 + fi + continue fi - for runtime in 06 00 12y 122d - do - if [ ${runtime} = "06" ] ; then - cyc2="06" - desc="T" - grid2="F-${MDL} | ${PDY2}/0600" - add="06" - testgfsfhr="120" - elif [ ${runtime} = "00" ] ; then - cyc2="00" - desc="T" - grid2="F-${MDL} | ${PDY2}/0000" - add="12" - testgfsfhr="114" - elif [ ${runtime} = "12y" ] ; then - 
cyc2="12" - desc="Y" - #XXW export HPCGFS=${MODEL}/gfs.${PDYm1} - # BV export HPCGFS=$COMROOT/nawips/${envir}/gfs.${PDYm1} - export HPCGFS=${COMINgempak}/${mdl}.${PDYm1}/${cyc2}/${COMPONENT}/gempak - grid2="F-GFSHPC | ${PDY2m1}/1200" - add="24" - testgfsfhr="102" - elif [ ${runtime} = "122d" ] ; then - cyc2="12" - desc="Y2" - #XXW export HPCGFS=${MODEL}/gfs.${PDYm2} - # BV export HPCGFS=$COMROOT/nawips/${esnvir}/gfs.${PDYm2} - export HPCGFS=${COMINgempak}/${mdl}.${PDYm2}/${cyc2}/${COMPONENT}/gempak + # Create symlink in DATA to sidestep gempak path limits + HPCGFS="${RUN}.${init_time}" + if [[ ! -L ${HPCGFS} ]]; then + YMD="${init_PDY}" HH="${init_cyc}" GRID="1p00" generate_com source_dir:COM_ATMOS_GEMPAK_TMPL + ln -sf "${source_dir}" "${HPCGFS}" + fi - grid2="F-GFSHPC | ${PDY2m2}/1200" - add="48" - testgfsfhr="96" - fi + if [[ ${init_PDY} == "${PDY}" ]]; then + desc="T" + elif [[ ${init_PDY} == "${PDYm1}" ]]; then + desc="Y" + elif [[ ${init_PDY} == "${PDYm2}" ]]; then + desc="Y2" + else + echo "FATAL ERROR: Unexpected offset" + exit 100 + fi + + testgfsfhr=$(( 126 - offset )) + + for fhr in $(seq -s ' ' 0 6 126); do + gfsfhr=F$(printf "%02g" "${fhr}") + gfsoldfhr=F$(printf "%02g" $((fhr + offset))) + grid2="F-GFSHPC | ${init_time:2}/${init_cyc}00" gdpfun1="sm5s(hght)!sm5s(hght)" gdpfun2="sm5s(pmsl)!sm5s(pmsl)" line="5/1/3/2/2!6/1/3/2/2" hilo1="5/H#;L#//5/5;5/y!6/H#;L#//5/5;5/y" hilo2="5/H#;L#/1018-1060;900-1012/5/10;10/y!6/H#;L#/1018-1060;900-1012/5/10;10/y" - hilo3="5/H#;L#//5/5;5/y" - hilo4="5/H#;L#/1018-1060;900-1012/5/10;10/y" - title1="5/-2/~ ? ^ ${MDL} @ HGT (${cyc}Z YELLOW)|^${gareas} ${cyc}Z VS ${desc} ${cyc2}Z 500 HGT!6/-3/~ ? ${MDL} @ HGT (${cyc2}Z ${desc} CYAN)" - title2="5/-2/~ ? ^ ${MDL} PMSL (${cyc}Z YELLOW)|^${gareas} ${cyc}Z VS ${desc} ${cyc2}Z PMSL!6/-3/~ ? ${MDL} PMSL (${cyc2}Z ${desc} CYAN)" - title3="5/-2/~ ? ^ ${MDL} @ HGT (${cyc}Z YELLOW)|^${gareas} ${cyc}Z VS ${desc} ${cyc2}Z 500 HGT" - title4="5/-2/~ ? 
^ ${MDL} PMSL (${cyc}Z YELLOW)|^${gareas} ${cyc}Z VS ${desc} ${cyc2}Z PMSL" - for gfsfhr in 00 06 12 18 24 30 36 42 48 54 60 66 72 78 84 90 96 102 108 114 120 126 - do - gfsoldfhr=F$(expr ${gfsfhr} + ${add}) - gfsfhr2=$(echo ${gfsfhr}) - gfsfhr=F${gfsfhr} - if [ ${gfsfhr2} -gt ${testgfsfhr} ] ; then - grid="F-${MDL} | ${PDY2}/${cyc}00" - grid2=" " - gfsoldfhr=" " - gdpfun1="sm5s(hght)" - gdpfun2="sm5s(pmsl)" - line="5/1/3/2/2" - hilo1=$(echo ${hilo3}) - hilo2=$(echo ${hilo4}) - title1=$(echo ${title3}) - title2=$(echo ${title4}) - fi + title1="5/-2/~ ? ^ ${MDL} @ HGT (${cyc}Z YELLOW)|^${gareas} ${cyc}Z VS ${desc} ${init_cyc}Z 500 HGT!6/-3/~ ? ${MDL} @ HGT (${init_cyc}Z ${desc} CYAN)" + title2="5/-2/~ ? ^ ${MDL} PMSL (${cyc}Z YELLOW)|^${gareas} ${cyc}Z VS ${desc} ${init_cyc}Z PMSL!6/-3/~ ? ${MDL} PMSL (${init_cyc}Z ${desc} CYAN)" + if (( fhr > testgfsfhr )); then + grid="F-${MDL} | ${PDY:2}/${cyc}00" + grid2=" " + gfsoldfhr=" " + gdpfun1="sm5s(hght)" + gdpfun2="sm5s(pmsl)" + line="5/1/3/2/2" + hilo1="5/H#;L#//5/5;5/y" + hilo2="5/H#;L#/1018-1060;900-1012/5/10;10/y" + title1="5/-2/~ ? ^ ${MDL} @ HGT (${cyc}Z YELLOW)|^${gareas} ${cyc}Z VS ${desc} ${init_cyc}Z 500 HGT" + title2="5/-2/~ ? ^ ${MDL} PMSL (${cyc}Z YELLOW)|^${gareas} ${cyc}Z VS ${desc} ${init_cyc}Z PMSL" + fi -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF + export pgm=gdplot2_nc;. 
prep_step + "${GEMEXE}/gdplot2_nc" << EOF \$MAPFIL= mepowo.gsf DEVICE = ${device} MAP = 1/1/1/yes CLEAR = yes GAREA = ${garea} -PROJ = ${proj} +PROJ = ${proj} LATLON = ${latlon} SKIP = 0 PANEL = 0 -CONTUR = 2 -CLRBAR = -FINT = -FLINE = -REFVEC = +CONTUR = ${contours} +CLRBAR = +FINT = +FLINE = +REFVEC = WIND = 0 GDFILE = ${grid} !${grid2} @@ -188,7 +165,7 @@ GVCORD = PRES GDPFUN = ${gdpfun1} LINE = ${line} SCALE = -1 -CTYPE = c +${type_param} = c CINT = 6 HLSYM = 1.2;1.2//21//hw TEXT = 1/21//hw @@ -211,22 +188,58 @@ HILO = ${hilo2} TITLE = ${title2} run +${ex} EOF -export err=$?;err_chk - done + export err=$?;err_chk done - # COMPARE THE 1200 UTC GFS MODEL TO THE 0000 UTC UKMET MODEL - grid="F-${MDL} | ${PDY2}/${cyc}00" - export HPCUKMET=${COMINukmet}.${PDY}/gempak - grid2="F-UKMETHPC | ${PDY2}/0000" - # for gfsfhr in 00 12 24 36 48 60 84 108 - for gfsfhr in 00 12 24 84 108 - do - ukmetfhr=F$(expr ${gfsfhr} + 12) - gfsfhr=F${gfsfhr} + done + + if (( 10#${cyc} % 12 ==0 )); then + + # + # There are some differences between 00z and 12z + # The YEST string makes sense (but is inconsistently used) + # The others I'm not sure why they differ. - WCK + # + case ${cyc} in + 00) + type_param="TYPE" + hlsym="1.2;1.2//21//hw" + wind="" + yest=" YEST" + run_cmd="run" + extra_cmd="\nHLSYM = 1.2;1.2//21//hw\nTEXT = s/21//hw" + ;; + 12) + type_param="CTYPE" + hlsym="1;1//21//hw" + wind="0" + yest="" + run_cmd="ru" + extra_cmd="" + ;; + *) + echo "FATAL ERROR: Invalid cycle ${cyc}" + exit 100 + ;; + esac + + # COMPARE THE GFS MODEL TO THE UKMET MODEL 12-HOURS PRIOR + ukmet_date=$(date --utc +%Y%m%d%H -d "${PDY} ${cyc} - 12 hours") + ukmet_PDY=${ukmet_date:0:8} + ukmet_cyc=${ukmet_date:8:2} + export HPCUKMET=ukmet.${ukmet_PDY} + if [[ ! 
-L "${HPCUKMET}" ]]; then + ln -sf "${COMINukmet}/ukmet.${ukmet_PDY}/gempak" "${HPCUKMET}" + fi + grid2="F-UKMETHPC | ${ukmet_PDY:2}/${ukmet_date}" + + for fhr in 0 12 24 84 108; do + gfsfhr=F$(printf "%02g" "${fhr}") + ukmetfhr=F$(printf "%02g" $((fhr + 12))) -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF + export pgm=gdplot2_nc;. prep_step + "${GEMEXE}/gdplot2_nc" << EOF \$MAPFIL= mepowo.gsf DEVICE = ${device} MAP = 1/1/1/yes @@ -240,22 +253,22 @@ GDATTIM = ${gfsfhr} SKIP = 0 PANEL = 0 CONTUR = 2 -CLRBAR = +CLRBAR = GLEVEL = 500 GVCORD = PRES GDPFUN = sm5s(hght) LINE = 5/1/3/2 SCALE = -1 -CTYPE = c +${type_param} = c CINT = 6 -FINT = -FLINE = -HLSYM = 1;1//21//hw +FINT = +FLINE = +HLSYM = ${hlsym} TEXT = s/21//hw -WIND = 0 -REFVEC = +WIND = ${wind} +REFVEC = HILO = 5/H#;L#//5/5;5/y -TITLE = 5/-1/~ ? ${MDL} @ HGT (12Z YELLOW)|~${gareas} 12Z VS UK 00Z 500 HGT!0 +TITLE = 5/-1/~ ? ${MDL} @ HGT (${cyc}Z YELLOW)|~${gareas} ${cyc}Z VS UK ${ukmet_cyc}Z 500 HGT!0 l run @@ -265,23 +278,23 @@ GDATTIM = ${ukmetfhr} GDPFUN = sm5s(hght) LINE = 6/1/3/2 HILO = 6/H#;L#//5/5;5/y -TITLE = 6/-2/~ ? UKMET @ HGT (00Z CYAN)!0 +TITLE = 6/-2/~ ? UKMET @ HGT (${ukmet_cyc}Z${yest} CYAN)!0 l -ru +${run_cmd} CLEAR = yes GLEVEL = 0 GVCORD = none SCALE = 0 GDPFUN = sm5s(pmsl) -CINT = 4 +CINT = 4${extra_cmd} GDFILE = ${grid} GDATTIM = ${gfsfhr} LINE = 5/1/3/2 HILO = 5/H#;L#/1018-1060;900-1012/5/10;10/y -TITLE = 5/-1/~ ? ${MDL} PMSL (12Z YELLOW)|~${gareas} 12Z VS UK 00Z PMSL!0 +TITLE = 5/-1/~ ? ${MDL} PMSL (${cyc}Z YELLOW)|~${gareas} ${cyc}Z VS UK ${ukmet_cyc}Z PMSL!0 l -ru +${run_cmd} CLEAR = no GDFILE = ${grid2} @@ -291,426 +304,55 @@ LINE = 6/1/3/2 HILO = 6/H#;L#/1018-1060;900-1012/5/10;10/y TITLE = 6/-2/~ ? 
UKMET PMSL (00Z CYAN)!0 l -ru +${run_cmd} EOF -export err=$?;err_chk - done - # COMPARE THE 1200 UTC GFS MODEL TO THE 1200 UTC ECMWF FROM YESTERDAY - grid="F-${MDL} | ${PDY2}/${cyc}00" - #XXW grid2=${MODEL}/ecmwf.${PDYm1}/ecmwf_glob_${PDYm1}12 - grid2=${COMINecmwf}.${PDYm1}/gempak/ecmwf_glob_${PDYm1}12 - for gfsfhr in 00 24 48 72 96 120 - do - ecmwffhr=F$(expr ${gfsfhr} + 24) - gfsfhr=F${gfsfhr} - -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF -\$MAPFIL= mepowo.gsf -DEVICE = ${device} -MAP = 1/1/1/yes -CLEAR = yes -GAREA = ${garea} -PROJ = ${proj} -LATLON = ${latlon} -GDFILE = ${grid} -GDATTIM = ${gfsfhr} -SKIP = 0 -PANEL = 0 -CONTUR = 2 -CLRBAR = -GLEVEL = 500 -GVCORD = PRES -GDPFUN = sm5s(hght) -LINE = 5/1/3/2 -SCALE = -1 -CTYPE = c -CINT = 6 -FINT = -FLINE = -HLSYM = 1;1//21//hw -TEXT = s/21//hw -WIND = 0 -REFVEC = -HILO = 5/H#;L#//5/5;5/y -TITLE = 5/-1/~ ? ${MDL} @ HGT (12Z YELLOW)|~${gareas} 12Z VS EC Y 12Z 500 HGT!0 -l -run - -CLEAR = no -GDFILE = ${grid2} -GDATTIM = ${ecmwffhr} -GDPFUN = sm5s(hght) -LINE = 6/1/3/2 -HILO = 6/H#;L#//5/5;5/y -TITLE = 6/-2/~ ? ECMWF @ HGT (12Z YEST CYAN)!0 -l -run + export err=$?;err_chk + done -CLEAR = yes -GLEVEL = 0 -GVCORD = none -SCALE = 0 -GDPFUN = sm5s(pmsl) -CINT = 4 -GDFILE = ${grid} -GDATTIM = ${gfsfhr} -LINE = 5/1/3/2 -HILO = 5/H#;L#/1018-1060;900-1012/5/10;10/y -TITLE = 5/-1/~ ? ${MDL} PMSL (12Z YELLOW)|~${gareas} 12Z VS EC Y 12Z PMSL!0 -l -run + # COMPARE THE GFS MODEL TO THE 12 UTC ECMWF FROM YESTERDAY + offset=$(( (10#${cyc}+12)%24 + 12 )) + ecmwf_date=$(date --utc +%Y%m%d%H -d "${PDY} ${cyc} - ${offset} hours") + ecmwf_PDY=${ecmwf_date:0:8} + # ecmwf_cyc=${ecmwf_date:8:2} + grid2=${COMINecmwf}/ecmwf.${ecmwf_PDY}/gempak/ecmwf_glob_${ecmwf_date} -CLEAR = no -GDFILE = ${grid2} -GDPFUN = sm5s(pmsl) -GDATTIM = ${ecmwffhr} -LINE = 6/1/3/2 -HILO = 6/H#;L#/1018-1060;900-1012/5/10;10/y -TITLE = 6/-2/~ ? 
ECMWF PMSL (12Z YEST CYAN)!0 -l -run + for fhr in $(seq -s ' ' $(( offset%24 )) 24 120 ); do + gfsfhr=F$(printf "%02g" "${fhr}") + ecmwffhr=F$(printf "%02g" $((fhr + 24))) -EOF -export err=$?;err_chk - done - # COMPARE THE 1200 UTC GFS MODEL TO THE 1200 UTC NAM AND NGM - grid="F-${MDL} | ${PDY2}/${cyc}00" - grid2="F-NAMHPC | ${PDY2}/${cyc}00" - # grid2ngm="F-NGMHPC | ${PDY2}/${cyc}00" - for gfsfhr in 00 06 12 18 24 30 36 42 48 54 60 66 72 78 84 - do - namfhr=F${gfsfhr} - # ngmfhr=F${gfsfhr} - gfsfhr=F${gfsfhr} - -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF + export pgm=gdplot2_nc;. prep_step + "${GEMEXE}/gdplot2_nc" << EOF \$MAPFIL= mepowo.gsf DEVICE = ${device} MAP = 1/1/1/yes CLEAR = yes GAREA = ${garea} PROJ = ${proj} -LATLON = ${latlon} +LATLON = ${latlon} GDFILE = ${grid} GDATTIM = ${gfsfhr} -SKIP = 0 +SKIP = 0 PANEL = 0 CONTUR = 2 -CLRBAR = -GLEVEL = 500 -GVCORD = PRES -GDPFUN = sm5s(hght) -LINE = 3/1/3/2 -SCALE = -1 -TYPE = c -CINT = 6 -FINT = -FLINE = -HLSYM = 1;1//21//hw -TEXT = s/21//hw -WIND = -REFVEC = -HILO = 3/H#;L#//5/5;5/y -TITLE = 3/-1/~ ? ${MDL} @ HGT (12Z YELLOW)|~${gareas} ${MDL}/NAM/NGM 500 HGT!0 -l -run - -CLEAR = no -GDFILE = ${grid2} -GDATTIM = ${namfhr} +CLRBAR = +GLEVEL = 500 +GVCORD = PRES GDPFUN = sm5s(hght) LINE = 5/1/3/2 -HILO = 5/H#;L#//5/5;5/y -TITLE = 5/-2/~ ? NAM @ HGT (12Z CYAN)!0 -l -run - -CLEAR = yes -GLEVEL = 0 -GVCORD = none -SCALE = 0 -GDPFUN = sm5s(pmsl) -CINT = 4 -GDFILE = ${grid} -GDATTIM = ${gfsfhr} -LINE = 3/1/3/2 -HILO = 3/H#;L#/1018-1060;900-1012/5/10;10/y -TITLE = 3/-1/~ ? ${MDL} PMSL (12Z YELLOW)|~${gareas} ${MDL}/NAM/NGM PMSL!0 -l -run - -CLEAR = no -GDFILE = ${grid2} -GDPFUN = sm5s(pmsl) -GDATTIM = ${namfhr} -LINE = 5/1/3/2 -HILO = 5/H#;L#/1018-1060;900-1012/5/10;10/y -TITLE = 5/-2/~ ? 
NAM PMSL (12Z CYAN)!0 -l -run - -EOF -export err=$?;err_chk - done - done -fi - -if [ ${cyc} -eq 00 ] ; then - grid="F-${MDL} | ${PDY2}/${cyc}00" - for gareas in US NP - do - if [ ${gareas} = US ] ; then - garea="bwus" - proj=" " - latlon="0" - elif [ ${gareas} = NP ] ; then - garea="5;-177;45;-72" - proj="STR/90.0;-155.0;0.0" - latlon="1/1/1/1/10" - fi - for runtime in 18 12 00y 002d - do - if [ ${runtime} = "18" ] ; then - cyc2="18" - desc="Y" -# BV export HPCGFS=${MODEL}/gfs.${PDYm1} - export HPCGFS=${COMINgempak}/${mdl}.${PDYm1}/${cyc2}/${COMPONENT}/gempak - - grid2="F-GFSHPC | ${PDY2m1}/1800" - add="06" - testgfsfhr="120" - elif [ ${runtime} = "12" ] ; then - cyc2="12" - desc="Y" - export HPCGFS=${COMINgempak}/${mdl}.${PDYm1}/${cyc2}/${COMPONENT}/gempak - - grid2="F-GFSHPC | ${PDY2m1}/1200" - add="12" - testgfsfhr="114" - elif [ ${runtime} = "00y" ] ; then - cyc2="00" - desc="Y" - export HPCGFS=${COMINgempak}/${mdl}.${PDYm1}/${cyc2}/${COMPONENT}/gempak - - grid2="F-GFSHPC | ${PDY2m1}/0000" - add="24" - testgfsfhr="102" - elif [ ${runtime} = "002d" ] ; then - cyc2="00" - desc="Y2" - export HPCGFS=${COMINgempak}/${mdl}.${PDYm2}/${cyc2}/${COMPONENT}/gempak - - grid2="F-GFSHPC | ${PDY2m2}/0000" - add="48" - testgfsfhr="96" - fi - gdpfun1="sm5s(hght)!sm5s(hght)" - gdpfun2="sm5s(pmsl)!sm5s(pmsl)" - line="5/1/3/2/2!6/1/3/2/2" - hilo1="5/H#;L#//5/5;5/y!6/H#;L#//5/5;5/y" - hilo2="5/H#;L#/1018-1060;900-1012/5/10;10/y!6/H#;L#/1018-1060;900-1012/5/10;10/y" - hilo3="5/H#;L#//5/5;5/y" - hilo4="5/H#;L#/1018-1060;900-1012/5/10;10/y" - title1="5/-2/~ ? ^ ${MDL} @ HGT (${cyc}Z YELLOW)|^${gareas} ${cyc}Z VS ${desc} ${cyc2}Z 500 HGT!6/-3/~ ? ${MDL} @ HGT (${cyc2}Z ${desc} CYAN)" - title2="5/-2/~ ? ^ ${MDL} PMSL (${cyc}Z YELLOW)|^${gareas} ${cyc}Z VS ${desc} ${cyc2}Z PMSL!6/-3/~ ? ${MDL} PMSL (${cyc2}Z ${desc} CYAN)" - title3="5/-2/~ ? ^ ${MDL} @ HGT (${cyc}Z YELLOW)|^${gareas} ${cyc}Z VS ${desc} ${cyc2}Z 500 HGT" - title4="5/-2/~ ? 
^ ${MDL} PMSL (${cyc}Z YELLOW)|^${gareas} ${cyc}Z VS ${desc} ${cyc2}Z PMSL" - for gfsfhr in 00 06 12 18 24 30 36 42 48 54 60 66 72 78 84 90 96 102 108 114 120 126 - do - gfsoldfhr=F$(expr ${gfsfhr} + ${add}) - gfsfhr2=$(echo ${gfsfhr}) - gfsfhr=F${gfsfhr} - if [ ${gfsfhr2} -gt ${testgfsfhr} ] ; then - grid="F-${MDL} | ${PDY2}/${cyc}00" - grid2=" " - gfsoldfhr=" " - gdpfun1="sm5s(hght)" - gdpfun2="sm5s(pmsl)" - line="5/1/3/2/2" - hilo1=$(echo ${hilo3}) - hilo2=$(echo ${hilo4}) - title1=$(echo ${title3}) - title2=$(echo ${title4}) - fi - -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF -\$MAPFIL= mepowo.gsf -DEVICE = ${device} -MAP = 1/1/1/yes -CLEAR = yes -GAREA = ${garea} -PROJ = ${proj} -LATLON = ${latlon} -SKIP = 0 -PANEL = 0 -CONTUR = 2 -CLRBAR = -FINT = -FLINE = -REFVEC = -WIND = 0 - -GDFILE = ${grid} !${grid2} -GDATTIM = ${gfsfhr}!${gfsoldfhr} -GLEVEL = 500 -GVCORD = PRES -GDPFUN = ${gdpfun1} -LINE = ${line} SCALE = -1 -CTYPE = c +${type_param} = c CINT = 6 -HLSYM = 1.2;1.2//21//hw -TEXT = 1/21//hw -HILO = ${hilo1} -TITLE = ${title1} -run - -CLEAR = yes -GLEVEL = 0 -GVCORD = none -SCALE = 0 -GDPFUN = ${gdpfun2} -CINT = 4 -HLSYM = 1.2;1.2//21//hw -TEXT = 1/21//hw -GDFILE = ${grid} !${grid2} -GDATTIM = ${gfsfhr}!${gfsoldfhr} -LINE = ${line} -HILO = ${hilo2} -TITLE = ${title2} -run - -EOF -export err=$?;err_chk - done - done - # COMPARE THE 0000 UTC GFS MODEL TO THE 1200 UTC UKMET FROM YESTERDAY - grid="F-${MDL} | ${PDY2}/${cyc}00" - export HPCUKMET=${COMINukmet}.${PDYm1}/gempak - grid2="F-UKMETHPC | ${PDY2m1}/1200" - # for gfsfhr in 00 12 24 36 48 60 84 108 - for gfsfhr in 00 12 24 84 108 - do - ukmetfhr=F$(expr ${gfsfhr} + 12) - gfsfhr=F${gfsfhr} - -export pgm=gdplot2_nc;. 
prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF -\$MAPFIL= mepowo.gsf -DEVICE = ${device} -MAP = 1/1/1/yes -CLEAR = yes -GAREA = ${garea} -PROJ = ${proj} -LATLON = ${latlon} -GDFILE = ${grid} -GDATTIM = ${gfsfhr} -SKIP = 0 -PANEL = 0 -CONTUR = 2 -CLRBAR = -GLEVEL = 500 -GVCORD = PRES -GDPFUN = sm5s(hght) -LINE = 5/1/3/2 -SCALE = -1 -TYPE = c -CINT = 6 -FINT = -FLINE = -HLSYM = 1.2;1.2//21//hw -TEXT = s/21//hw -WIND = -REFVEC = -HILO = 5/H#;L#//5/5;5/y -TITLE = 5/-1/~ ? ${MDL} @ HEIGHTS (00Z YELLOW)|~${gareas} 00Z VS UK Y 12Z 500 HGT!0 -l -run - -CLEAR = no -GDFILE = ${grid2} -GDATTIM = ${ukmetfhr} -GDPFUN = sm5s(hght) -LINE = 6/1/3/2 -HILO = 6/H#;L#//5/5;5/y -TITLE = 6/-2/~ ? UKMET @ HEIGHTS (12Z YEST CYAN)!0 -l -run - -CLEAR = yes -GLEVEL = 0 -GVCORD = none -SCALE = 0 -GDPFUN = sm5s(pmsl) -CINT = 4 -HLSYM = 1.2;1.2//21//hw -TEXT = s/21//hw -GDFILE = ${grid} -GDATTIM = ${gfsfhr} -LINE = 5/1/3/2 -HILO = 5/H#;L#/1018-1060;900-1012/5/10;10/y -TITLE = 5/-1/~ ? ${MDL} PMSL (00Z YELLOW) |~${gareas} 00Z VS UK Y 12Z PMSL!0 -l -run - -CLEAR = no -GDFILE = ${grid2} -GDPFUN = sm5s(pmsl) -GDATTIM = ${ukmetfhr} -LINE = 6/1/3/2 -HILO = 6/H#;L#/1018-1060;900-1012/5/10;10/y -TITLE = 6/-2/~ ? UKMET PMSL (12Z YEST CYAN)!0 -l -run - -EOF -export err=$?;err_chk - done - # COMPARE THE 0000 UTC GFS MODEL TO THE 1200 UTC ECMWF FROM YESTERDAY - grid="F-${MDL} | ${PDY2}/${cyc}00" - # JY grid2="$COMROOT/nawips/prod/ecmwf.${PDYm1}/ecmwf_glob_${PDYm1}12" - grid2="${COMINecmwf}.${PDYm1}/gempak/ecmwf_glob_${PDYm1}12" - for gfsfhr in 12 36 60 84 108 - do - ecmwffhr=F$(expr ${gfsfhr} + 12) - gfsfhr=F${gfsfhr} - -export pgm=gdplot2_nc;. 
prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF -\$MAPFIL= mepowo.gsf -DEVICE = ${device} -MAP = 1/1/1/yes -CLEAR = yes -GAREA = ${garea} -PROJ = ${proj} -LATLON = ${latlon} -GDFILE = ${grid} -GDATTIM = ${gfsfhr} -SKIP = 0 -PANEL = 0 -CONTUR = 2 -CLRBAR = -GLEVEL = 500 -GVCORD = PRES -GDPFUN = sm5s(hght) -LINE = 5/1/3/2 -SCALE = -1 -TYPE = c -CINT = 6 -FINT = -FLINE = -HLSYM = 1.2;1.2//21//hw -TEXT = s/21//hw -WIND = -REFVEC = +FINT = +FLINE = +HLSYM = ${hlsym} +TEXT = s/21//hw +WIND = ${wind} +REFVEC = HILO = 5/H#;L#//5/5;5/y -TITLE = 5/-1/~ ? ${MDL} @ HGT (00Z YELLOW)|~${gareas} 00Z VS EC Y 12Z 500 HGT!0 +TITLE = 5/-1/~ ? ${MDL} @ HGT (${cyc}Z YELLOW)|~${gareas} ${cyc}Z VS EC Y 12Z 500 HGT!0 l run @@ -729,14 +371,12 @@ GLEVEL = 0 GVCORD = none SCALE = 0 GDPFUN = sm5s(pmsl) -CINT = 4 -HLSYM = 1.2;1.2//21//hw -TEXT = s/21//hw +CINT = 4 ${extra_cmd} GDFILE = ${grid} GDATTIM = ${gfsfhr} LINE = 5/1/3/2 HILO = 5/H#;L#/1018-1060;900-1012/5/10;10/y -TITLE = 5/-1/~ ? ${MDL} PMSL (00Z YELLOW) |~${gareas} 00Z VS EC Y 12Z PMSL!0 +TITLE = 5/-1/~ ? ${MDL} PMSL (${cyc}Z YELLOW)|~${gareas} ${cyc}Z VS EC Y 12Z PMSL!0 l run @@ -751,18 +391,18 @@ l run EOF -export err=$?;err_chk + + export err=$?;err_chk done - # COMPARE THE 0000 UTC GFS MODEL TO THE 0000 UTC NAM AND NGM - grid="F-${MDL} | ${PDY2}/${cyc}00" - grid2="F-NAMHPC | ${PDY2}/${cyc}00" - for gfsfhr in 00 06 12 18 24 30 36 42 48 54 60 66 72 78 84 - do - namfhr=F${gfsfhr} - gfsfhr=F${gfsfhr} - -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF + + # COMPARE THE GFS MODEL TO THE NAM and NGM + grid2="F-NAMHPC | ${PDY:2}/${cyc}00" + for fhr in $(seq -s ' ' 0 6 84); do + gfsfhr=F$(printf "%02g" "${fhr}") + namfhr=F$(printf "%02g" "${fhr}") + + export pgm=gdplot2_nc;. 
prep_step + "${GEMEXE}/gdplot2_nc" << EOF \$MAPFIL= mepowo.gsf DEVICE = ${device} MAP = 1/1/1/yes @@ -773,25 +413,25 @@ LATLON = ${latlon} GDFILE = ${grid} GDATTIM = ${gfsfhr} -SKIP = 0 +SKIP = 0 PANEL = 0 CONTUR = 2 -CLRBAR = -GLEVEL = 500 -GVCORD = PRES +CLRBAR = +GLEVEL = 500 +GVCORD = PRES GDPFUN = sm5s(hght) -LINE = 3/1/3/2 -SCALE = -1 -TYPE = c -CINT = 6 -FINT = -FLINE = -HLSYM = 1.2;1.2//21//hw -TEXT = s/21//hw -WIND = -REFVEC = +LINE = 3/1/3/2 +SCALE = -1 +TYPE = c +CINT = 6 +FINT = +FLINE = +HLSYM = ${hlsym} +TEXT = s/21//hw +WIND = +REFVEC = HILO = 3/H#;L#//5/5;5/y -TITLE = 3/-1/~ ? ${MDL} @ HGT (00Z YELLOW)|~${gareas} ${MDL}/NAM/NGM 500 HGT!0 +TITLE = 3/-1/~ ? ${MDL} @ HGT (${cyc}Z YELLOW)|~${gareas} ${MDL}/NAM/NGM 500 HGT!0 l run @@ -801,7 +441,7 @@ GDATTIM = ${namfhr} GDPFUN = sm5s(hght) LINE = 5/1/3/2 HILO = 5/H#;L#//5/5;5/y -TITLE = 5/-2/~ ? NAM @ HGT (00Z CYAN)!0 +TITLE = 5/-2/~ ? NAM @ HGT (${cyc}Z CYAN)!0 l run @@ -810,14 +450,12 @@ GLEVEL = 0 GVCORD = none SCALE = 0 GDPFUN = sm5s(pmsl) -CINT = 4 -HLSYM = 1.2;1.2//21//hw -TEXT = s/21//hw +CINT = 4${extra_cmd} GDFILE = ${grid} GDATTIM = ${gfsfhr} LINE = 3/1/3/2 HILO = 3/H#;L#/1018-1060;900-1012/5/10;10/y -TITLE = 3/-1/~ ? ${MDL} PMSL (00Z YELLOW) |~${gareas} ${MDL}/NAM/NGM PMSL!0 +TITLE = 3/-1/~ ? ${MDL} PMSL (${cyc}Z YELLOW)|~${gareas} ${MDL}/NAM/NGM PMSL!0 l run @@ -827,295 +465,40 @@ GDPFUN = sm5s(pmsl) GDATTIM = ${namfhr} LINE = 5/1/3/2 HILO = 5/H#;L#/1018-1060;900-1012/5/10;10/y -TITLE = 5/-2/~ ? NAM PMSL (CYAN)!0 +TITLE = 5/-2/~ ? 
NAM PMSL (${cyc}Z CYAN)!0 l run EOF -export err=$?;err_chk - done - done -fi - -if [ ${cyc} -eq 18 ] ; then - grid="F-${MDL} | ${PDY2}/${cyc}00" - for gareas in US NP - do - if [ ${gareas} = US ] ; then - garea="bwus" - proj=" " - latlon="0" - elif [ ${gareas} = NP ] ; then - garea="5;-177;45;-72" - proj="STR/90.0;-155.0;0.0" - latlon="1/1/1/1/10" - fi - for runtime in 12 06 00 18y - do - if [ ${runtime} = "12" ] ; then - cyc2="12" - desc="T" - grid2="F-${MDL} | ${PDY2}/1200" - add="06" - testgfsfhr="120" - elif [ ${runtime} = "06" ] ; then - cyc2="06" - desc="T" - grid2="F-${MDL} | ${PDY2}/0600" - add="12" - testgfsfhr="114" - elif [ ${runtime} = "00" ] ; then - cyc2="00" - desc="T" - grid2="F-${MDL} | ${PDY2}/0000" - add="18" - testgfsfhr="108" - elif [ ${runtime} = "18y" ] ; then - cyc2="18" - desc="Y" - export HPCGFS=${COMINgempak}/${mdl}.${PDYm1}/${cyc2}/${COMPONENT}/gempak - - grid2="F-GFSHPC | ${PDY2m1}/1800" - add="24" - testgfsfhr="102" - fi - gdpfun1="sm5s(hght)!sm5s(hght)" - gdpfun2="sm5s(pmsl)!sm5s(pmsl)" - line="5/1/3/2/2!6/1/3/2/2" - hilo1="5/H#;L#//5/5;5/y!6/H#;L#//5/5;5/y" - hilo2="5/H#;L#/1018-1060;900-1012/5/10;10/y!6/H#;L#/1018-1060;900-1012/5/10;10/y" - hilo3="5/H#;L#//5/5;5/y" - hilo4="5/H#;L#/1018-1060;900-1012/5/10;10/y" - title1="5/-2/~ ? ^ ${MDL} @ HGT (${cyc}Z YELLOW)|^${gareas} ${cyc}Z VS ${desc} ${cyc2}Z 500 HGT!6/-3/~ ? ${MDL} @ HGT (${cyc2}Z ${desc} CYAN)" - title2="5/-2/~ ? ^ ${MDL} PMSL (${cyc}Z YELLOW)|^${gareas} ${cyc}Z VS ${desc} ${cyc2}Z PMSL!6/-3/~ ? ${MDL} PMSL (${cyc2}Z ${desc} CYAN)" - title3="5/-2/~ ? ^ ${MDL} @ HGT (${cyc}Z YELLOW)|^${gareas} ${cyc}Z VS ${desc} ${cyc2}Z 500 HGT" - title4="5/-2/~ ? 
^ ${MDL} PMSL (${cyc}Z YELLOW)|^${gareas} ${cyc}Z VS ${desc} ${cyc2}Z PMSL" - for gfsfhr in 00 06 12 18 24 30 36 42 48 54 60 66 72 78 84 90 96 102 108 114 120 126 - do - gfsoldfhr=F$(expr ${gfsfhr} + ${add}) - gfsfhr2=$(echo ${gfsfhr}) - gfsfhr="F${gfsfhr}" - if [ ${gfsfhr2} -gt ${testgfsfhr} ] ; then - grid="F-${MDL} | ${PDY2}/${cyc}00" - grid2=" " - gfsoldfhr=" " - gdpfun1="sm5s(hght)" - gdpfun2="sm5s(pmsl)" - line="5/1/3/2/2" - hilo1=$(echo ${hilo3}) - hilo2=$(echo ${hilo4}) - title1=$(echo ${title3}) - title2=$(echo ${title4}) - fi -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF -\$MAPFIL= mepowo.gsf -DEVICE = ${device} -MAP = 1/1/1/yes -CLEAR = yes -GAREA = ${garea} -PROJ = ${proj} -LATLON = ${latlon} -SKIP = 0 -PANEL = 0 -CONTUR = 1 -CLRBAR = -FINT = -FLINE = -REFVEC = -WIND = 0 - -GDFILE = ${grid}!${grid2} -GDATTIM = ${gfsfhr}!${gfsoldfhr} -GLEVEL = 500 -GVCORD = PRES -GDPFUN = ${gdpfun1} -LINE = ${line} -SCALE = -1 -TYPE = c -CINT = 6 -HLSYM = 1.2;1.2//21//hw -TEXT = 1/21//hw -HILO = ${hilo1} -TITLE = ${title1} -run - -CLEAR = yes -GLEVEL = 0 -GVCORD = none -SCALE = 0 -GDPFUN = ${gdpfun2} -CINT = 4 -HLSYM = 1.2;1.2//21//hw -TEXT = 1/21//hw -GDFILE = ${grid} !${grid2} -GDATTIM = ${gfsfhr}!${gfsoldfhr} -LINE = ${line} -HILO = ${hilo2} -TITLE = ${title2} -run - -ex -EOF -export err=$?;err_chk - done - done - done -fi - -if [ ${cyc} -eq 06 ] ; then - grid="F-${MDL} | ${PDY2}/${cyc}00" - for gareas in US NP - do - if [ ${gareas} = US ] ; then - garea="bwus" - proj=" " - latlon="0" - elif [ ${gareas} = NP ] ; then - garea="5;-177;45;-72" - proj="STR/90.0;-155.0;0.0" - latlon="1/1/1/1/10" - fi - for runtime in 00 18 12 06 - do - if [ ${runtime} -eq 00 ] ; then - cyc2="00" - desc="T" - grid2="F-${MDL} | ${PDY2}/0000" - add="06" - testgfsfhr="120" - elif [ ${runtime} -eq 18 ] ; then - cyc2="18" - desc="Y" - export HPCGFS=${COMINgempak}/${mdl}.${PDYm1}/${cyc2}/${COMPONENT}/gempak - grid2="F-GFSHPC | ${PDY2m1}/1800" - add="12" - 
testgfsfhr="114" - elif [ ${runtime} -eq 12 ] ; then - cyc2="12" - desc="Y" - export HPCGFS=${COMINgempak}/${mdl}.${PDYm1}/${cyc2}/${COMPONENT}/gempak - - grid2="F-GFSHPC | ${PDY2m1}/1200" - add="18" - testgfsfhr="108" - elif [ ${runtime} -eq 06 ] ; then - cyc2="06" - desc="Y" - export HPCGFS=${COMINgempak}/${NET}/${envir}/${mdl}.${PDYm1}/${cyc2}/${COMPONENT}/gempak - - grid2="F-GFSHPC | ${PDY2m1}/0600" - add="24" - testgfsfhr="102" - fi - gdpfun1="sm5s(hght)!sm5s(hght)" - gdpfun2="sm5s(pmsl)!sm5s(pmsl)" - line="5/1/3/2/2!6/1/3/2/2" - hilo1="5/H#;L#//5/5;5/y!6/H#;L#//5/5;5/y" - hilo2="5/H#;L#/1018-1060;900-1012/5/10;10/y!6/H#;L#/1018-1060;900-1012/5/10;10/y" - hilo3="5/H#;L#//5/5;5/y" - hilo4="5/H#;L#/1018-1060;900-1012/5/10;10/y" - title1="5/-2/~ ? ^ ${MDL} @ HGT (${cyc}Z YELLOW)|^${gareas} ${cyc}Z VS ${desc} ${cyc2}Z 500 HGT!6/-3/~ ? ${MDL} @ HGT (${cyc2}Z ${desc} CYAN)" - title2="5/-2/~ ? ^ ${MDL} PMSL (${cyc}Z YELLOW)|^${gareas} ${cyc}Z VS ${desc} ${cyc2}Z PMSL!6/-3/~ ? ${MDL} PMSL (${cyc2}Z ${desc} CYAN)" - title3="5/-2/~ ? ^ ${MDL} @ HGT (${cyc}Z YELLOW)|^${gareas} ${cyc}Z VS ${desc} ${cyc2}Z 500 HGT" - title4="5/-2/~ ? ^ ${MDL} PMSL (${cyc}Z YELLOW)|^${gareas} ${cyc}Z VS ${desc} ${cyc2}Z PMSL" - for gfsfhr in 00 06 12 18 24 30 36 42 48 54 60 66 72 78 84 90 96 102 108 114 120 126 - do - gfsoldfhr=F$(expr ${gfsfhr} + ${add}) - gfsfhr2=$(echo ${gfsfhr}) - gfsfhr="F${gfsfhr}" - if [ ${gfsfhr2} -gt ${testgfsfhr} ] ; then - grid="F-${MDL} | ${PDY2}/${cyc}00" - grid2=" " - gfsoldfhr=" " - gdpfun1="sm5s(hght)" - gdpfun2="sm5s(pmsl)" - line="5/1/3/2/2" - hilo1=$(echo ${hilo3}) - hilo2=$(echo ${hilo4}) - title1=$(echo ${title3}) - title2=$(echo ${title4}) - fi -export pgm=gdplot2_nc;. 
prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF -\$MAPFIL= mepowo.gsf -DEVICE = ${device} -MAP = 1/1/1/yes -CLEAR = yes -GAREA = ${garea} -PROJ = ${proj} -LATLON = ${latlon} -SKIP = 0 -PANEL = 0 -CONTUR = 1 -CLRBAR = -FINT = -FLINE = -REFVEC = -WIND = 0 - -GDFILE = ${grid}!${grid2} -GDATTIM = ${gfsfhr}!${gfsoldfhr} -GLEVEL = 500 -GVCORD = PRES -GDPFUN = ${gdpfun1} -LINE = ${line} -SCALE = -1 -TYPE = c -CINT = 6 -HLSYM = 1.2;1.2//21//hw -TEXT = 1/21//hw -HILO = ${hilo1} -TITLE = ${title1} -run - -CLEAR = yes -GLEVEL = 0 -GVCORD = none -SCALE = 0 -GDPFUN = ${gdpfun2} -CINT = 4 -HLSYM = 1.2;1.2//21//hw -TEXT = 1/21//hw -GDFILE = ${grid} !${grid2} -GDATTIM = ${gfsfhr}!${gfsoldfhr} -LINE = ${line} -HILO = ${hilo2} -TITLE = ${title2} -run - -ex -EOF -export err=$?;err_chk - done + export err=$?;err_chk done - done -fi + fi +done -#################################################### +##################################################### # GEMPAK DOES NOT ALWAYS HAVE A NON ZERO RETURN CODE # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK # FOR THIS CASE HERE. ##################################################### -ls -l $metaname -export err=$?;export pgm="GEMPAK CHECK FILE";err_chk -if [ $SENDCOM = "YES" ] ; then - mv ${metaname} ${COMOUT}/${mdl}_${PDY}_${cyc}_us_${metatype} - if [ $SENDDBN = "YES" ] ; then - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/${mdl}_${PDY}_${cyc}_us_${metatype} - if [ $DBN_ALERT_TYPE = "GFS_METAFILE_LAST" ] ; then - DBN_ALERT_TYPE=GFS_METAFILE - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/${mdl}_${PDY}_${cyc}_us_${metatype} +if (( err != 0 )) || [[ ! 
-s "${metaname}" ]] &> /dev/null; then + echo "FATAL ERROR: Failed to create gempak meta file ${metaname}" + exit $(( err + 100 )) +fi + +mv "${metaname}" "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_us_${metatype}" +if [[ "${SENDDBN}" == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_us_${metatype}" + if [[ ${DBN_ALERT_TYPE} = "GFS_METAFILE_LAST" ]] ; then + DBN_ALERT_TYPE=GFS_METAFILE + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_us_${metatype}" fi - if [ $fhr -eq 126 ] ; then - ${DBNROOT}/bin/dbn_alert MODEL GFS_METAFILE_LAST $job \ - ${COMOUT}/${mdl}_${PDY}_${cyc}_us_${metatype} + if (( fhr == 126 )) ; then + "${DBNROOT}/bin/dbn_alert" MODEL GFS_METAFILE_LAST "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_us_${metatype}" fi - fi fi - exit diff --git a/gempak/ush/gfs_meta_crb.sh b/gempak/ush/gfs_meta_crb.sh index 82fa7795e8..aeed1a788a 100755 --- a/gempak/ush/gfs_meta_crb.sh +++ b/gempak/ush/gfs_meta_crb.sh @@ -1,50 +1,40 @@ -#! /bin/sh +#! /usr/bin/env bash # # Metafile Script : gfs_meta_crb_new # -# Log : -# J.Carr/HPC 03/13/2001 New script for the Caribbean desk. -# J. Carr/HPC 5/2001 Added a mn variable for a/b side dbnet root variable. -# J. Carr/HPC 6/2001 Converted to a korn shell prior to delivering script to Production. -# J. Carr/HPC 7/2001 Submitted. -# J. Carr/PMB 11/15/2004 Added a ? to all title/TITLE lines. Changed contur parameter to 2. -# Changed 12-hr increments to 6-hr with regards to 12-hr and 24-hr pcpn. 
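The gfs_meta_comp.sh rewrite above replaces the `$NDATE ... | cut -c` chains with GNU `date --utc` arithmetic plus bash substring slicing. A minimal sketch of that idiom, using illustrative `PDY`/`cyc`/`offset` values rather than anything from a real run:

```shell
#! /usr/bin/env bash
# Compute an earlier initialization time the way the refactored script does:
# GNU date handles the hour arithmetic, and ${var:offset:length} slices the
# YYYYMMDDHH string back into a date and a cycle.
PDY="20240402"   # illustrative
cyc="00"         # illustrative
offset=6         # illustrative

init_time=$(date --utc +%Y%m%d%H -d "${PDY} ${cyc} - ${offset} hours")
init_PDY=${init_time:0:8}   # first 8 chars: YYYYMMDD
init_cyc=${init_time:8:2}   # last 2 chars: HH

# Yesterday's date, replacing the old "$NDATE -24 | cut -c -8" chain.
PDYm1=$(date --utc +%Y%m%d -d "${PDY} - 24 hours")

echo "${init_time} ${init_PDY} ${init_cyc} ${PDYm1}"
```

With `--utc` the input string is interpreted and the output rendered in UTC, so the result does not depend on the host timezone.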
-# # Set Up Local Variables # -set -x -# -export PS4='crb:$SECONDS + ' -mkdir -p -m 775 $DATA/crb -cd $DATA/crb -cp $FIXgempak/datatype.tbl datatype.tbl + +source "${HOMEgfs}/ush/preamble.sh" + +mkdir -p -m 775 "${DATA}/crb" +cd "${DATA}/crb" || exit 2 +cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl # mdl=gfs MDL=GFS metatype="crb" metaname="${mdl}_${metatype}_${cyc}.meta" device="nc | ${metaname}" -PDY2=$(echo ${PDY} | cut -c3-) + # -# DEFINE YESTERDAY -PDYm1=$($NDATE -24 ${PDY}${cyc} | cut -c -8) -PDY2m1=$(echo ${PDYm1} | cut -c 3-) +# Link data into DATA to sidestep gempak path limits +# TODO: Replace this # -#if [ ${cyc} -eq 00 ] ; then -# fend=F126 -#elif [ ${cyc} -eq 12 ] ; then -# fend=F126 -#else -# fend=F126 -# fend=F84 -#fi +export COMIN="${RUN}.${PDY}${cyc}" +if [[ ! -L ${COMIN} ]]; then + ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}" +fi + +# DEFINE YESTERDAY +PDYm1=$(date --utc +%Y%m%d -d "${PDY} 00 - 24 hours") fend=F126 -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF -GDFILE = F-${MDL} | ${PDY2}/${cyc}00 -GDATTIM = F00-${fend}-06 +export pgm=gdplot2_nc;. prep_step +"${GEMEXE}/gdplot2_nc" << EOF +GDFILE = F-${MDL} | ${PDY:2}/${cyc}00 +GDATTIM = F00-${fend}-06 DEVICE = ${device} PANEL = 0 TEXT = 1/21//hw @@ -70,13 +60,13 @@ HILO = !!26;2/H#;L#/1020-1070;900-1012//30;30/y HLSYM = !!2;1.5//21//hw CLRBAR = 1 WIND = ! !bk9/0.7/2/112 -REFVEC = +REFVEC = TITLE = 5/-2/~ ? ${MDL} MSLP, 1000-500mb THICK & 850mb WIND|~MSLP, 1000-500 THKN! ru glevel = 9950 gvcord = sgma -scale = 7 !0 +scale = 7 !0 gdpfun = sm5s(sdiv(mixr@0%none;wnd)!kntv(wnd) type = f !b cint = 0 ! @@ -90,22 +80,22 @@ wind = am0!bk9/0.8/2/112 refvec = title = 1/-2/~ ? 
${MDL} BL MOIST CONV & WIND|~BL MOISTURE CONV!0 r - + glevel = 0 !9950 gvcord = none !sgma scale = 0 skip = 0/1 gdpfun = sm5s(thte)!kntv(wnd) type = c/f !b -cint = 4/200/336 -line = 5/1/1 +cint = 4/200/336 +line = 5/1/1 fint = 336;340;344;348;352;356;360;364;368;372;376 fline = 0 ; 21; 22; 23; 24; 25; 26; 27; 28; 29; 30; 14 -hilo = -hlsym = +hilo = +hlsym = clrbar = 1/V/LL!0 wind = bk0 !bk9/0.9/2/112 -refvec = +refvec = title = 1/-2/~ ? ${MDL} BL THTE & WIND (KTS)|~BL THTE & WIND r @@ -115,9 +105,9 @@ SKIP = 0/1;2 GDPFUN = vor(wnd) !vor(wnd)!kntv(wnd) CINT = 2/-99/-2 !2/2/99 LINE = 29/5/1/2 !7/5/1/2 -HILO = 2;6/X;N/-99--4;4-99 ! -SCALE = 5 !5 -WIND = !!bk6/.8/2/112!0 +HILO = 2;6/X;N/-99--4;4-99 ! +SCALE = 5 !5 +WIND = !!bk6/.8/2/112!0 TITLE = 1//~ ? ${MDL} @ WIND AND REL VORT|~@ WIND AND REL VORT!0 FINT = 4;6;8;10;12;14;16;18 FLINE = 0;14-21 @@ -134,23 +124,23 @@ CINT = 5/20 LINE = 26//1 FINT = 5/20 FLINE = 0;24;30;29;23;22;14;15;16;17;20;5 -HILO = +HILO = HLSYM = CLRBAR = 1 WIND = bk0!ak7/.3/1/221/.4!ak6/.3/1/221/.4 REFVEC = 0 TITLE = 1/-2/~ ? ${MDL} @ WIND SHEAR (KNTS)|~850MB-300MB WIND SHEAR!0 filter = no - + GLEVEL = 700 GVCORD = pres GDPFUN = vor(wnd) !vor(wnd)!kntv(wnd) CINT = 2/-99/-2 !2/2/99 LINE = 29/5/1/2 !7/5/1/2 -HILO = 2;6/X;N/-99--4;4-99 ! -SCALE = 5 !5 -WIND = !!bk6/.8/2/112!0 +HILO = 2;6/X;N/-99--4;4-99 ! +SCALE = 5 !5 +WIND = !!bk6/.8/2/112!0 TITLE = 1/-2/~ ? ${MDL} @ WIND AND REL VORT|~@ WIND AND REL VORT!0 FINT = 6;8;10;12;14;16;18;20 FLINE = 0;14-21 @@ -167,8 +157,8 @@ LINE = 7/5/1/2 !29/5/1/2!7/5/1/2 !29/5/1/2 !20/1/2/1 FINT = 16;20;24;28;32;36;40;44 FLINE = 0;23-15 HILO = 2;6/X;N/10-99;10-99! !2;6/X;N/10-99;10-99! ! -HLSYM = -WIND = bk0 !bk0 !bk0 !bk0 !bk0 !bk9/0.7/2/112!0 +HLSYM = +WIND = bk0 !bk0 !bk0 !bk0 !bk0 !bk9/0.7/2/112!0 TITLE = 1/-2/~ ? ${MDL} @ HEIGHT AND VORTICITY|~@ HGT AND VORTICITY!0 ru @@ -209,7 +199,7 @@ FLINE = 0;21-30;14-20;5 HILO = 31;0/x#/10-400///y HLSYM = 1.5 CLRBAR = 1/V/LL -WIND = +WIND = REFVEC = TITLE = 1/-2/~ ? 
${MDL} 12-HR TOTAL PCPN|~12-HR TOTAL PCPN r @@ -228,7 +218,7 @@ type = c !c/f !b cint = 6/6/18!6/24 line = 22///2!32//2/2 fint = !13;25;38;50 -fline = !0;23;22;21;2 +fline = !0;23;22;21;2 hilo = 0!0 HLSYM = 0!0 clrbar = 0!1 @@ -246,11 +236,11 @@ CINT = 10;20;80;90 !30;40;50;60;70 LINE = 32//2 !23//2 FINT = 10;30;70;90 FLINE = 18;8;0;22;23 -HILO = + HLSYM = CLRBAR = 1 -WIND = -REFVEC = +WIND = +REFVEC = TITLE = 1/-2/~ ? ${MDL} @ LYR RH|~MEAN RH!0 ru @@ -259,19 +249,29 @@ EOF export err=$?;err_chk -if [ ${cyc} -eq 00 ] ; then - export HPCECMWF=${COMINecmwf}.${PDY}/gempak - export HPCUKMET=${COMINukmet}.${PDYm1}/gempak - grid1="F-${MDL} | ${PDY2}/${cyc}00" - grid2="${COMINecmwf}.${PDYm1}/gempak/ecmwf_glob_${PDYm1}12" - grid3="F-UKMETHPC | ${PDY2m1}/1200" - for gfsfhr in 12 36 60 84 108 - do - ecmwffhr="F$(expr ${gfsfhr} + 12)" - gfsfhr="F${gfsfhr}" +if [[ ${cyc} == 00 ]] ; then + export HPCECMWF=ecmwf.${PDY} + HPCECMWF_m1=ecmwf.${PDYm1} + export HPCUKMET=ukmet.${PDYm1} + if [[ ! -L "${HPCECMWF}" ]]; then + ln -sf "${COMINecmwf}ecmwf.${PDY}/gempak" "${HPCECMWF}" + fi + if [[ ! -L "${HPCECMWF_m1}" ]]; then + ln -sf "${COMINecmwf}ecmwf.${PDYm1}/gempak" "${HPCECMWF_m1}" + fi + if [[ ! -L "${HPCUKMET}" ]]; then + ln -sf "${COMINukmet}/ukmet.${PDYm1}/gempak" "${HPCUKMET}" + fi + + grid1="F-${MDL} | ${PDY:2}/${cyc}00" + grid2="${HPCECMWF_m1}/ecmwf_glob_${PDYm1}12" + grid3="F-UKMETHPC | ${PDYm1:2}/1200" + for fhr in $(seq -s ' ' 12 24 108); do + gfsfhr=F$(printf "%02g" "${fhr}") + ecmwffhr=F$(printf "%02g" $((fhr + 12))) -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF10 + export pgm=gdplot2_nc;. 
prep_step + "${GEMEXE}/gdplot2_nc" << EOF10 GDFILE = ${grid1} !${grid2} GDATTIM = ${gfsfhr}!${ecmwffhr} DEVICE = ${device} @@ -280,8 +280,8 @@ TEXT = 1/21//hw MAP = 6/1/1/yes CLEAR = yes CLRBAR = 1 -PROJ = mer//3;3;0;1 -GAREA = -25;-130;40;-15 +PROJ = mer//3;3;0;1 +GAREA = -25;-130;40;-15 LATLON = 18//1/1/10 GLEVEL = 500 @@ -327,16 +327,15 @@ r ex EOF10 -export err=$?;err_chk + export err=$?;err_chk done - for gfsfhr in 00 12 24 36 48 60 84 108 132 - do - ukmetfhr="F$(expr ${gfsfhr} + 12)" - gfsfhr=F${gfsfhr} + for fhr in 0 12 24 36 48 60 84 108 132; do + gfsfhr=F$(printf "%02g" "${fhr}") + ukmetfhr=F$(printf "%02g" $((fhr + 12))) -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF25 + export pgm=gdplot2_nc;. prep_step + "${GEMEXE}/gdplot2_nc" << EOF25 DEVICE = ${device} PANEL = 0 TEXT = 1/21//hw @@ -390,7 +389,7 @@ r ex EOF25 -export err=$?;err_chk + export err=$?;err_chk done fi @@ -400,20 +399,20 @@ fi # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK # FOR THIS CASE HERE. ##################################################### -ls -l $metaname -export err=$?;export pgm="GEMPAK CHECK FILE";err_chk - -if [ $SENDCOM = "YES" ] ; then - mv ${metaname} ${COMOUT}/${mdl}_${PDY}_${cyc}_${metatype} - if [ $SENDDBN = "YES" ] ; then - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/${mdl}_${PDY}_${cyc}_${metatype} - if [ $DBN_ALERT_TYPE = "GFS_METAFILE_LAST" ] ; then +if (( err != 0 )) || [[ ! 
-s "${metaname}" ]] &> /dev/null; then + echo "FATAL ERROR: Failed to create gempak meta file ${metaname}" + exit $(( err + 100 )) +fi + +mv "${metaname}" "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_${metatype}" +if [[ "${SENDDBN}" == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_${metatype}" + if [[ ${DBN_ALERT_TYPE} == "GFS_METAFILE_LAST" ]] ; then DBN_ALERT_TYPE=GFS_METAFILE - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/${mdl}_${PDY}_${cyc}_${metatype} - fi - fi + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_${metatype}" + fi fi exit diff --git a/gempak/ush/gfs_meta_hi.sh b/gempak/ush/gfs_meta_hi.sh index 2b47474e12..23423381f9 100755 --- a/gempak/ush/gfs_meta_hi.sh +++ b/gempak/ush/gfs_meta_hi.sh @@ -1,43 +1,37 @@ -#!/bin/sh +#! /usr/bin/env bash # # Metafile Script : gfs_meta_hi.sh # -# Log : -# D.W.Plummer/NCEP 2/97 Add log header -# D.W.Plummer/NCEP 4/97 Changed SKIP for grid2 -# B. Gordon 4/00 Converted for production on the IBM-SP -# and changed gdplot_nc -> gdplot2_nc -# D. Michaud 4/16 Added logic to display different title -# for parallel runs -# B. Gordon 7/02 Converted to run off the GFS due to demise -# of the MRF. -# J. Carr 11/04 Added a ? to all title/TITLE lines. Changed contur parameter to a 2. -# Changed the GDATTIM line to end at F240 every 6 hrs instead of out to -# F384 every 12 hrs. This is to account for 06 and 18 UTC runs. -# M. Klein 4/07 Fix bug in PW display. 
-# -set -xa -mkdir -p -m 775 $DATA/mrfhi -cd $DATA/mrfhi -cp $FIXgempak/datatype.tbl datatype.tbl + +source "${HOMEgfs}/ush/preamble.sh" + +mkdir -p -m 775 "${DATA}/mrfhi" +cd "${DATA}/mrfhi" || exit 2 +cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl device="nc | mrfhi.meta" -PDY2=$(echo $PDY | cut -c3-) +# +# Link data into DATA to sidestep gempak path limits +# TODO: Replace this +# +export COMIN="${RUN}.${PDY}${cyc}" +if [[ ! -L ${COMIN} ]]; then + ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}" +fi -if [ "$envir" = "prod" ] ; then +if [[ "${envir}" = "prod" ]] ; then export m_title="GFS" else export m_title="GFSP" fi export pgm=gdplot2_nc;. prep_step -startmsg -$GEMEXE/gdplot2_nc << EOF -GDFILE = F-GFS | ${PDY2}/${cyc}00 +"${GEMEXE}/gdplot2_nc" << EOF +GDFILE = F-GFS | ${PDY:2}/${cyc}00 GDATTIM = F000-F192-06; F214-F240-12 -DEVICE = $device +DEVICE = ${device} PANEL = 0 TEXT = 1/21//hw CONTUR = 2 @@ -45,61 +39,61 @@ MAP = 1 CLEAR = yes CLRBAR = 1 -restore ${USHgempak}/restore/garea_hi.nts +restore ${HOMEgfs}/gempak/ush/restore/garea_hi.nts -restore ${USHgempak}/restore/pmsl_thkn.2.nts +restore ${HOMEgfs}/gempak/ush/restore/pmsl_thkn.2.nts CLRBAR = 1 HLSYM = 2;1.5//21//hw TEXT = 1/21//hw -TITLE = 5/-2/~ ? $m_title MSL PRESSURE, 1000-500 MB THICKNESS|~MSLP, 1000-500 THKN!0 +TITLE = 5/-2/~ ? ${m_title} MSL PRESSURE, 1000-500 MB THICKNESS|~MSLP, 1000-500 THKN!0 l ru -restore ${USHgempak}/restore/850mb_hght_tmpc.2.nts +restore ${HOMEgfs}/gempak/ush/restore/850mb_hght_tmpc.2.nts CLRBAR = 1 TEXT = 1/21//hw -TITLE = 5/-2/~ ? $m_title @ HGHTS, TEMPERATURE AND WIND (KTS)|~@ HGHT, TMP, WIND!0!0!0 +TITLE = 5/-2/~ ? ${m_title} @ HGHTS, TEMPERATURE AND WIND (KTS)|~@ HGHT, TMP, WIND!0!0!0 l ru -restore ${USHgempak}/restore/700mb_hght_relh_omeg.2.nts +restore ${HOMEgfs}/gempak/ush/restore/700mb_hght_relh_omeg.2.nts CLRBAR = 1 TEXT = 1/21//hw -TITLE = 5/-2/~ ? $m_title @ HGHTS, REL HUMIDITY AND OMEGA|~@ HGHT, RH AND OMEGA!0 +TITLE = 5/-2/~ ? 
${m_title} @ HGHTS, REL HUMIDITY AND OMEGA|~@ HGHT, RH AND OMEGA!0 l ru -restore ${USHgempak}/restore/500mb_hght_absv.2.nts +restore ${HOMEgfs}/gempak/ush/restore/500mb_hght_absv.2.nts CLRBAR = 1 TEXT = 1/21//hw -TITLE = 5/-2/~ ? $m_title @ HEIGHTS AND VORTICITY|~@ HGHT AND VORTICITY!0 +TITLE = 5/-2/~ ? ${m_title} @ HEIGHTS AND VORTICITY|~@ HGHT AND VORTICITY!0 l ru -restore ${USHgempak}/restore/200mb_hght_wnd.2.nts +restore ${HOMEgfs}/gempak/ush/restore/200mb_hght_wnd.2.nts CLRBAR = 1 TEXT = 1/21//hw -TITLE = 5/-2/~ ? $m_title @ HEIGHTS, ISOTACHS AND WIND (KTS)|~@ HGHT AND WIND!0 +TITLE = 5/-2/~ ? ${m_title} @ HEIGHTS, ISOTACHS AND WIND (KTS)|~@ HGHT AND WIND!0 l ru -restore ${USHgempak}/restore/250mb_hght_wnd.2.nts +restore ${HOMEgfs}/gempak/ush/restore/250mb_hght_wnd.2.nts CLRBAR = 1 TEXT = 1/21//hw -TITLE = 5/-2/~ ? $m_title @ HEIGHTS, ISOTACHS AND WIND (KTS)|~@ HGHT AND WIND!0 +TITLE = 5/-2/~ ? ${m_title} @ HEIGHTS, ISOTACHS AND WIND (KTS)|~@ HGHT AND WIND!0 l ru -restore ${USHgempak}/restore/300mb_hght_wnd.2.nts +restore ${HOMEgfs}/gempak/ush/restore/300mb_hght_wnd.2.nts CLRBAR = 1 TEXT = 1/21//hw -TITLE = 5/-2/~ ? $m_title @ HEIGHTS, ISOTACHS AND WIND (KTS)|~@ HGHT AND WIND!0 +TITLE = 5/-2/~ ? ${m_title} @ HEIGHTS, ISOTACHS AND WIND (KTS)|~@ HGHT AND WIND!0 l ru @@ -109,7 +103,7 @@ GVCORD = sgma GDPFUN = sm5s(relh) CINT = 10 LINE = 21/1/2 -TITLE = 5/-2/~ ? $m_title MEAN RELATIVE HUMIDITY|~1000-440 MB MEAN RH!0 +TITLE = 5/-2/~ ? ${m_title} MEAN RELATIVE HUMIDITY|~1000-440 MB MEAN RH!0 SCALE = 0 FINT = 10;70;90 FLINE = 20;0;22;23 @@ -122,7 +116,7 @@ GVCORD = none GDPFUN = sm5s(quo(pwtr,25.4)) !sm5s(quo(pwtr,25.4)) CINT = 0.25/0.25/0.5 !0.25/0.75/6.0 LINE = 22///2 !32//2/2 -TITLE = 5/-2/~ ? $m_title PRECIPITABLE WATER (in)|~PRECIPITABLE WATER!0 +TITLE = 5/-2/~ ? 
${m_title} PRECIPITABLE WATER (in)|~PRECIPITABLE WATER!0 SCALE = 0 SKIP = 0 FINT = 0.5;1.0;1.5;2.0 @@ -137,7 +131,7 @@ GVCORD = pres GDPFUN = sm5s(tmpc) !kntv(wnd) CINT = 5 LINE = 2/1/3 -TITLE = 5/-2/~ ? $m_title @ TEMPERATURE|~1000 MB TEMP!0 +TITLE = 5/-2/~ ? ${m_title} @ TEMPERATURE|~1000 MB TEMP!0 SCALE = 0 SKIP = 0 FINT = 0;30 @@ -149,12 +143,12 @@ CLRBAR = 1 r -restore ${USHgempak}/restore/precip.2.nts +restore ${HOMEgfs}/gempak/ush/restore/precip.2.nts CLRBAR = 1 TEXT = 1/21//hw GDATTIM = F12-F192-06; F214-F384-12 GDPFUN = (quo(p12m,25.4)) -TITLE = 5/-2/~ ? $m_title 12-HOUR TOTAL PRECIPITATION (IN)|~12-HOURLY TOTAL PCPN +TITLE = 5/-2/~ ? ${m_title} 12-HOUR TOTAL PRECIPITATION (IN)|~12-HOURLY TOTAL PCPN hilo = 31;0/x#2/.01-20//50;50/y!17/H#;L#/1020-1070;900-1012 hlsym = 1.5!1;1//22;22/2;2/hw l @@ -162,13 +156,13 @@ ru GDATTIM = F24-F192-06; F214-F384-12 GDPFUN = (quo(p24m,25.4)) -TITLE = 5/-2/~ ? $m_title 24-HOUR TOTAL PRECIPITATION (IN)|~24-HOURLY TOTAL PCPN +TITLE = 5/-2/~ ? ${m_title} 24-HOUR TOTAL PRECIPITATION (IN)|~24-HOURLY TOTAL PCPN l ru GDATTIM = F180 GDPFUN = (quo(p180m,25.4)) -TITLE = 5/-2/~ ? $m_title 180-HOUR TOTAL PRECIPITATION (IN)|~180-HOURLY TOTAL PCPN +TITLE = 5/-2/~ ? ${m_title} 180-HOUR TOTAL PRECIPITATION (IN)|~180-HOURLY TOTAL PCPN l ru @@ -181,20 +175,20 @@ export err=$?; err_chk # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK # FOR THIS CASE HERE. ##################################################### -ls -l mrfhi.meta -export err=$?;export pgm="GEMPAK CHECK FILE"; err_chk - -if [ $SENDCOM = "YES" ] ; then - mv mrfhi.meta ${COMOUT}/gfs_${PDY}_${cyc}_hi - if [ $SENDDBN = "YES" ] ; then - $DBNROOT/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - $COMOUT/gfs_${PDY}_${cyc}_hi - if [ $DBN_ALERT_TYPE = "GFS_METAFILE_LAST" ] ; then +if (( err != 0 )) || [[ ! 
-s mrfhi.meta ]] &> /dev/null; then + echo "FATAL ERROR: Failed to create gempak meta file mrfhi.meta" + exit $(( err + 100 )) +fi + +mv mrfhi.meta "${COM_ATMOS_GEMPAK_META}/gfs_${PDY}_${cyc}_hi" +if [[ "${SENDDBN}" == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/gfs_${PDY}_${cyc}_hi" + if [[ ${DBN_ALERT_TYPE} == "GFS_METAFILE_LAST" ]] ; then DBN_ALERT_TYPE=GFS_METAFILE - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/gfs_${PDY}_${cyc}_hi - fi - fi + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/gfs_${PDY}_${cyc}_hi" + fi fi # diff --git a/gempak/ush/gfs_meta_hur.sh b/gempak/ush/gfs_meta_hur.sh index aed25d6d78..c85466ca23 100755 --- a/gempak/ush/gfs_meta_hur.sh +++ b/gempak/ush/gfs_meta_hur.sh @@ -1,87 +1,56 @@ -#! /bin/sh +#! /usr/bin/env bash # # Metafile Script : gfs_meta_hur_new # -# Log : -# D.W. Plummer/NCEP 2/97 Add log header -# J.W. Carr/HPC 4/15/97 changed the skip parameter -# J.W. Carr/HPC 4/06/98 Converted from gdplot to gdplot2 -# J.L. Partain/MPC 5/25/98 Chg VOR to AVOR @ 500mb,chg 200 to 250mb to match ETA,NGM -# J.W. Carr/HPC 8/05/98 Changed map to medium resolution -# J.W. Carr/HPC 2/02/99 Changed skip to 0 -# J.W. Carr/HPC 4/12/99 Added 84-hr time step. -# J. Carr/HPC 6/99 Added a filter to map. -# J. Carr/HPC 2/2001 Edited to run on the IBM. -# J. Carr/HPC 5/2001 Added a mn variable for a/b side dbnet root variable. -# J. Carr/HPC 6/2001 Incorporated the crb metafile into this one. -# J. Carr/HPC 6/2001 Converted to a korn shell prior to delivering script to Production. -# J. Carr/HPC 7/2001 Submitted. -# J. Carr/PMB 11/2004 Added a ? to all title lines. Changed contur to a 2 from a 1. 
-# # Set up Local Variables # -set -x -# -export PS4='hur:$SECONDS + ' -mkdir -p -m 775 $DATA/hur -cd $DATA/hur -cp $FIXgempak/datatype.tbl datatype.tbl + +source "${HOMEgfs}/ush/preamble.sh" + +mkdir -p -m 775 "${DATA}/hur" +cd "${DATA}/hur" || exit 2 +cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl mdl=gfs MDL=GFS metatype="hur" metaname="${mdl}_${metatype}_${cyc}.meta" device="nc | ${metaname}" -PDY2=$(echo ${PDY} | cut -c3-) + # -# DEFINE YESTERDAY -PDYm1=$($NDATE -24 ${PDY}${cyc} | cut -c -8) -PDY2m1=$(echo ${PDYm1} | cut -c 3-) +# Link data into DATA to sidestep gempak path limits +# TODO: Replace this # -if [ ${cyc} -eq 00 ] ; then - gdat="F000-F126-06" - gdatpcpn06="F006-F126-06" - gdatpcpn12="F012-F126-06" - gdatpcpn24="F024-F126-06" - gdatpcpn48="F048-F126-06" - gdatpcpn60="F060-F126-06" - gdatpcpn72="F072-F126-06" - gdatpcpn84="F084-F126-06" - gdatpcpn96="F096-F126-06" - gdatpcpn120="F120-F126-06" - gdatpcpn126="F126" - run="r" -elif [ ${cyc} -eq 12 ] ; then - gdat="F000-F126-06" - gdatpcpn06="F006-F126-06" - gdatpcpn12="F012-F126-06" - gdatpcpn24="F024-F126-06" - gdatpcpn48="F048-F126-06" - gdatpcpn60="F060-F126-06" - gdatpcpn72="F072-F126-06" - gdatpcpn84="F084-F126-06" - gdatpcpn96="F096-F126-06" - gdatpcpn120="F120-F126-06" - gdatpcpn126="F126" - run="r" -else - gdat="F000-F126-06" - gdatpcpn06="F006-F126-06" - gdatpcpn12="F012-F126-06" - gdatpcpn24="F024-F126-06" - gdatpcpn48="F048-F126-06" - gdatpcpn60="F060-F126-06" - gdatpcpn72="F072-F126-06" - gdatpcpn84="F084-F126-06" - gdatpcpn96="F096-F126-06" - gdatpcpn120="F120-F126-06" - gdatpcpn126="F126" - run="r" +export COMIN="${RUN}.${PDY}${cyc}" +if [[ ! -L ${COMIN} ]]; then + ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}" fi -export pgm=gdplot2_nc;. 
prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF -gdfile = F-${MDL} | ${PDY2}/${cyc}00 +# +# DEFINE YESTERDAY +PDYm1=$(date --utc +%Y%m%d -d "${PDY} 00 - 24 hours") +# +case ${cyc} in + 00) + gdat="F000-F126-06" + gdatpcpn12="F012-F126-06" + gdatpcpn24="F024-F126-06" + ;; + 12) + gdat="F000-F126-06" + gdatpcpn12="F012-F126-06" + gdatpcpn24="F024-F126-06" + ;; + *) + gdat="F000-F126-06" + gdatpcpn12="F012-F126-06" + gdatpcpn24="F024-F126-06" + ;; +esac + +export pgm=gdplot2_nc;. prep_step +"${GEMEXE}/gdplot2_nc" << EOF +gdfile = F-${MDL} | ${PDY:2}/${cyc}00 gdattim = ${gdat} GAREA = -6;-111;52;-14 PROJ = MER/0.0;-49.5;0.0 @@ -89,7 +58,7 @@ MAP = 1/1/1/yes LATLON = 1//1/1/10 CONTUR = 2 clear = y -device = ${device} +device = ${device} TEXT = 1/22/2/hw PANEL = 0 filter = yes @@ -205,10 +174,10 @@ GDPFUN = vor(wnd) !vor(wnd) !sm5s(pmsl)!kntv(wnd) TYPE = c/f !c !c !b CINT = 2/-99/-2 !2/2/99 !2//1008 LINE = 29/5/1/2 !7/5/1/2 !6/1/1 -HILO = 2;6/X;N/-99--4;4-99! !6/L#/880-1004///1 +HILO = 2;6/X;N/-99--4;4-99! !6/L#/880-1004///1 HLSYM = 1;1//22;22/3;3/hw SCALE = 5 !5 !0 -WIND = bk0 !bk0 !bk0 !bk9/.8/1.4/112 +WIND = bk0 !bk0 !bk0 !bk9/.8/1.4/112 TITLE = 1/-2/~ ? ${MDL} @ WIND AND REL VORT|~@ WIND AND REL VORT!0 FINT = 4;6;8;10;12;14;16;18 FLINE = 0;14-21 @@ -222,7 +191,7 @@ LINE = 29/5/1/2 !7/5/1/2 !6//1 HILO = 2;6/X;N/-99--4;4-99! !6/L#/880-1004///1 HLSYM = 1;1//22;22/3;3/hw SCALE = 5 !5 !0 -WIND = bk0 !bk0 !bk0 !bk9/.8/1.4/112 +WIND = bk0 !bk0 !bk0 !bk9/.8/1.4/112 TITLE = 1/-2/~ ? ${MDL} @ WIND AND REL VORT|~@ WIND AND REL VORT!0 FINT = 4;6;8;10;12;14;16;18 FLINE = 0;14-21 @@ -238,12 +207,12 @@ LINE = 29/5/1/2 !7/5/1/2 !20/1/2/1 HILO = 2;6/X;N/-99--4;4-99! ! HLSYM = 1;1//22;22/3;3/hw ! SCALE = 5 !5 !-1 -WIND = bk0 !bk0 !bk0 !bk9/.8/1.4/112 +WIND = bk0 !bk0 !bk0 !bk9/.8/1.4/112 TITLE = 1/-2/~ ? 
${MDL} @ WIND AND ABS VORT|~@ WIND AND ABS VORT!0 FINT = 16;20;24;28;32;36;40;44 FLINE = 0;23-15 TYPE = c/f !c !c !b -r +r GLEVEL = 250 GVCORD = PRES @@ -260,14 +229,14 @@ CINT = 5/20 LINE = 26//1 FINT = 5/25 FLINE = 0;24;30;29;23;22;14;15;16;17;20;5 -HILO = + HLSYM = CLRBAR = 1 WIND = ak0!ak7/.1/1/221/.2!ak6/.1/1/221/.2 -REFVEC = + TITLE = 1/-2/~ ? ${MDL} @ WIND SHEAR (850=Purple, 300=Cyan) |~850-300MB WIND SHEAR!0 filter = no -r +r glevel = 250 gvcord = pres @@ -295,13 +264,13 @@ type = b !c cint = 0!4 line = 0!20//3 SKIP = 0/2;2 -fint = -fline = +fint = +fline = hilo = 0!26;2/H#;L#/1020-1070;900-1012//30;30/y hlsym = 0!2;1.5//21//hw clrbar = 0 wind = bk10/0.9/1.4/112!bk0 -refvec = + title = 1/-2/~ ? ${MDL} 850-400mb MLW and MSLP|~850-400mb MLW & MSLP!0 r @@ -333,22 +302,28 @@ exit EOF export err=$?;err_chk -if [ ${cyc} -eq 00 ] ; then - # BV export MODEL=/com/nawips/prod - # JY export HPCECMWF=${MODEL}/ecmwf.${PDY} - # JY export HPCUKMET=${MODEL}/ukmet.${PDY} - export HPCECMWF=${COMINecmwf}.${PDY}/gempak - export HPCUKMET=${COMINukmet}.${PDY}/gempak - grid1="F-${MDL} | ${PDY2}/${cyc}00" - grid2="${COMINecmwf}.${PDYm1}/gempak/ecmwf_glob_${PDYm1}12" - grid3="F-UKMETHPC | ${PDY2}/${cyc}00" - for gfsfhr in 12 36 60 84 108 - do - ecmwffhr="F$(expr ${gfsfhr} + 12)" - gfsfhr="F${gfsfhr}" - -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF +if [[ ${cyc} -eq 00 ]] ; then + export HPCECMWF=ecmwf.${PDY} + HPCECMWF_m1=ecmwf.${PDYm1} + export HPCUKMET=ukmet.${PDYm1} + if [[ ! -L "${HPCECMWF}" ]]; then + ln -sf "${COMINecmwf}ecmwf.${PDY}/gempak" "${HPCECMWF}" + fi + if [[ ! -L "${HPCECMWF_m1}" ]]; then + ln -sf "${COMINecmwf}ecmwf.${PDYm1}/gempak" "${HPCECMWF_m1}" + fi + if [[ ! 
-L "${HPCUKMET}" ]]; then + ln -sf "${COMINukmet}/ukmet.${PDYm1}/gempak" "${HPCUKMET}" + fi + grid1="F-${MDL} | ${PDY:2}/${cyc}00" + grid2="${HPCECMWF_m1}/ecmwf_glob_${PDYm1}12" + grid3="F-UKMETHPC | ${PDY:2}/${cyc}00" + for fhr in $(seq -s ' ' 12 24 108); do + gfsfhr=F$(printf "%02g" "${fhr}") + ecmwffhr=F$(printf "%02g" $((fhr + 12))) + + export pgm=gdplot2_nc;. prep_step + "${GEMEXE}/gdplot2_nc" << EOF GDFILE = ${grid1} !${grid2} GDATTIM = ${gfsfhr}!${ecmwffhr} DEVICE = ${device} @@ -405,16 +380,15 @@ r ex EOF -export err=$?;err_chk + export err=$?;err_chk done - for gfsfhr in 12 24 36 48 60 72 96 120 - do - ukmetfhr=F${gfsfhr} - gfsfhr=F${gfsfhr} + for fhr in 12 24 36 48 60 72 96 120; do + gfsfhr=F$(printf "%02g" "${fhr}") + ukmetfhr=F$(printf "%02g" $((fhr))) -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF + export pgm=gdplot2_nc;. prep_step + "${GEMEXE}/gdplot2_nc" << EOF DEVICE = ${device} PANEL = 0 TEXT = 1/21//hw @@ -468,7 +442,7 @@ r ex EOF -export err=$?;err_chk + export err=$?;err_chk done fi @@ -477,19 +451,19 @@ fi # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK # FOR THIS CASE HERE. ##################################################### -ls -l $metaname -export err=$?;export pgm="GEMPAK CHECK FILE";err_chk - -if [ $SENDCOM = "YES" ] ; then - mv ${metaname} ${COMOUT}/${mdl}_${PDY}_${cyc}_${metatype} - if [ $SENDDBN = "YES" ] ; then - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/${mdl}_${PDY}_${cyc}_${metatype} - if [ $DBN_ALERT_TYPE = "GFS_METAFILE_LAST" ] ; then +if (( err != 0 )) || [[ ! 
-s "${metaname}" ]] &> /dev/null; then + echo "FATAL ERROR: Failed to create gempak meta file ${metaname}" + exit $(( err + 100 )) +fi + +mv "${metaname}" "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_${metatype}" +if [[ "${SENDDBN}" == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_${metatype}" + if [[ ${DBN_ALERT_TYPE} == "GFS_METAFILE_LAST" ]] ; then DBN_ALERT_TYPE=GFS_METAFILE - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/${mdl}_${PDY}_${cyc}_${metatype} - fi - fi + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_${metatype}" + fi fi exit diff --git a/gempak/ush/gfs_meta_mar_atl.sh b/gempak/ush/gfs_meta_mar_atl.sh index c8db3e59d4..50a4e0dda6 100755 --- a/gempak/ush/gfs_meta_mar_atl.sh +++ b/gempak/ush/gfs_meta_mar_atl.sh @@ -1,31 +1,36 @@ -#! /bin/sh +#! /usr/bin/env bash # # Metafile Script : gfs_meta_mar_atl.sh # -# Log : -# J. Carr/PMB 12/08/2004 Pushed into production. -# # Set up Local Variables # -set -x + +source "${HOMEgfs}/ush/preamble.sh" + +mkdir -p -m 775 "${DATA}/MAR_ATL" +cd "${DATA}/MAR_ATL" || exit 2 +cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl + +# +# Link data into DATA to sidestep gempak path limits +# TODO: Replace this # -export PS4='MAR_ATL:$SECONDS + ' -mkdir -p -m 775 $DATA/MAR_ATL -cd $DATA/MAR_ATL -cp $FIXgempak/datatype.tbl datatype.tbl +export COMIN="${RUN}.${PDY}${cyc}" +if [[ ! -L ${COMIN} ]]; then + ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}" +fi mdl=gfs MDL="GFS" metatype="mar_atl" metaname="${mdl}_${metatype}_${cyc}.meta" device="nc | ${metaname}" -PDY2=$(echo $PDY | cut -c3-) -export pgm=gdplot2_nc;. prep_step; startmsg +export pgm=gdplot2_nc;. 
prep_step -$GEMEXE/gdplot2_nc << EOFplt +"${GEMEXE}/gdplot2_nc" << EOFplt \$MAPFIL=mepowo.gsf+mehsuo.ncp+mereuo.ncp+mefbao.ncp -gdfile = F-${MDL} | ${PDY2}/${cyc}00 +gdfile = F-${MDL} | ${PDY:2}/${cyc}00 gdattim = f00-f180-6 GAREA = 16;-100;71;5 PROJ = mer//3;3;0;1 @@ -34,7 +39,7 @@ LATLON = 18/2///10 CONTUR = 0 clear = y -device = $device +device = ${device} GLEVEL = 850:1000 !0 GVCORD = pres !none @@ -69,8 +74,8 @@ fline = 29;30;24;0 ! hilo = 0!0!0!20/H#;L#/1020-1070;900-1012 hlsym = 0!0!0!1;1//22;22/3;3/hw clrbar = 1/V/LL!0 -wind = bk9/0.8/1/112! -refvec = +wind = bk9/0.8/1/112! +refvec = title = 1/-2/~ ? |~ PMSL, BL TEMP, WIND!1//${MDL} MSL PRES,BL TEMP,WIND (KTS)!0 text = 1.2/22/2/hw clear = y @@ -113,9 +118,9 @@ LINE = 7/5/1/2 !20/1/2/1 FINT = 15;21;27;33;39;45;51;57 FLINE = 0;23-15 HILO = 2;6/X;N/10-99;10-99! ! -HLSYM = +HLSYM = CLRBAR = 1 -WIND = +WIND = REFVEC = TITLE = 5//~ ? ${MDL} @ HEIGHTS AND VORTICITY|~ @ HGHT AND VORTICITY!0 TEXT = 1/21//hw @@ -146,22 +151,22 @@ CLEAR = yes li r -glevel = 300!300!300 -gvcord = pres!pres!pres +glevel = 300!300!300 +gvcord = pres!pres!pres panel = 0 skip = 1!1!1/3/3!1 scale = 0!0!5!5!-1 GDPFUN = mag(kntv(wnd))//jet!jet!div(wnd)//dvg!dvg!sm5s(hght) TYPE = c!c/f!c/f!c!c cint = 30;50!70;90;110;130;150;170;190!-11;-9;-7;-5;-3!2/3/18!12/720 -line = 26!32//2!19/-2//2!20!1//2 -fint = !70;90;110;130;150;170;190!3;5;7;9;11;13! +line = 26!32//2!19/-2//2!20!1//2 +fint = !70;90;110;130;150;170;190!3;5;7;9;11;13! fline = !0;24;25;29;7;15;14;2!0;23;22;21;17;16;2! hilo = 0!0!0!0!1/H;L/3 hlsym = 0!0!0!0!1.5;1.5//22;22/2;2/hw clrbar = 0!0!1/V/LL!0 -wind = !!am16/0.3//211/0.4! -refvec = 10 +wind = !!am16/0.3//211/0.4! +refvec = 10 title = 1/-2/~ ?|~ @ SPEED & DIVERG!1//${MDL} @ HGHTS, ISOTACHS, & DIVERGENCE!0 text = 1.2/22/2/hw clear = y @@ -262,14 +267,15 @@ export err=$?;err_chk # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK # FOR THIS CASE HERE. 
##################################################### -ls -l $metaname -export err=$?;export pgm="GEMPAK CHECK FILE";err_chk +if (( err != 0 )) || [[ ! -s "${metaname}" ]] &> /dev/null; then + echo "FATAL ERROR: Failed to create gempak meta file ${metaname}" + exit $(( err + 100 )) +fi -if [ $SENDCOM = "YES" ] ; then - mv ${metaname} ${COMOUT}/${mdl}_${PDY}_${cyc}_mar_atl - if [ $SENDDBN = "YES" ] ; then - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job ${COMOUT}/${mdl}_${PDY}_${cyc}_mar_atl - fi +mv "${metaname}" "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_mar_atl" +if [[ "${SENDDBN}" == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_mar_atl" fi diff --git a/gempak/ush/gfs_meta_mar_comp.sh b/gempak/ush/gfs_meta_mar_comp.sh index a55fa3c642..9573dd6bd6 100755 --- a/gempak/ush/gfs_meta_mar_comp.sh +++ b/gempak/ush/gfs_meta_mar_comp.sh @@ -1,158 +1,162 @@ -#! /bin/sh -# Metafile Script : gfs_meta_mar_comp.sh +#! /usr/bin/env bash # -# This is a script which creates a metafile that runs a comparison of 500 MB -# heights and PMSL between the older GFS model run and the newer one. The -# metafile also generates a comparison between the UKMET older run and the newer -# GFS model run. +# Metafile Script : gfs_meta_mar_comp.sh # -# Log : -# J. Carr/PMB 12/07/2004 Pushed into production. 
- # Set up Local Variables # -set -x -# -export PS4='MAR_COMP_F${fend}:$SECONDS + ' -rm -Rf $DATA/GEMPAK_META_MAR -mkdir -p -m 775 $DATA/GEMPAK_META_MAR $DATA/MAR_COMP -cd $DATA/MAR_COMP -cp $FIXgempak/datatype.tbl datatype.tbl +source "${HOMEgfs}/ush/preamble.sh" + +rm -Rf "${DATA}/GEMPAK_META_MAR" +mkdir -p -m 775 "${DATA}/GEMPAK_META_MAR" "${DATA}/MAR_COMP" + +cd "${DATA}/MAR_COMP" || exit 2 +cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl -export COMPONENT=${COMPONENT:-atmos} +export COMIN="gfs.multi" +mkdir -p "${COMIN}" +for cycle in $(seq -f "%02g" -s ' ' 0 "${STEP_GFS}" "${cyc}"); do + YMD=${PDY} HH=${cycle} GRID="1p00" generate_com gempak_dir:COM_ATMOS_GEMPAK_TMPL + for file_in in "${gempak_dir}/gfs_1p00_${PDY}${cycle}f"*; do + file_out="${COMIN}/$(basename "${file_in}")" + if [[ ! -L "${file_out}" ]]; then + ln -sf "${file_in}" "${file_out}" + fi + done +done + +# +# Link data into DATA to sidestep gempak path limits +# TODO: Replace this +# +export HPCNAM="nam.${PDY}" +if [[ ! 
-L ${HPCNAM} ]]; then + ln -sf "${COMINnam}/nam.${PDY}/gempak" "${HPCNAM}" +fi mdl=gfs MDL="GFS" metatype="mar_comp" metaname="${mdl}_${metatype}_${cyc}.meta" device="nc | ${metaname}" -PDY2=$(echo $PDY | cut -c3-) -# -# BV export MODEL=/com/nawips/prod -#XXW export HPCGFS=${MODEL}/${mdl}.$PDY -# BV export HPCGFS=${COMROOT}/nawips/${envir}/${mdl}.$PDY -export HPCGFS=${COMINgempak}/${mdl}.${PDY}/${cyc}/${COMPONENT}/gempak -export COMIN00=${COMINgempak}/${mdl}.${PDY}/00/${COMPONENT}/gempak -export COMIN06=${COMINgempak}/${mdl}.${PDY}/06/${COMPONENT}/gempak -export COMIN12=${COMINgempak}/${mdl}.${PDY}/12/${COMPONENT}/gempak -export COMIN18=${COMINgempak}/${mdl}.${PDY}/18/${COMPONENT}/gempak -if [ ${cyc} -eq 00 ] ; then - cp $COMIN00/gfs_${PDY}00f* $DATA/GEMPAK_META_MAR -elif [ ${cyc} -eq 06 ] ; then - cp $COMIN00/gfs_${PDY}00f* $DATA/GEMPAK_META_MAR - cp $COMIN06/gfs_${PDY}06f* $DATA/GEMPAK_META_MAR -elif [ ${cyc} -eq 12 ] ; then - cp $COMIN00/gfs_${PDY}00f* $DATA/GEMPAK_META_MAR - cp $COMIN06/gfs_${PDY}06f* $DATA/GEMPAK_META_MAR - cp $COMIN12/gfs_${PDY}12f* $DATA/GEMPAK_META_MAR -elif [ ${cyc} -eq 18 ] ; then - cp $COMIN00/gfs_${PDY}00f* $DATA/GEMPAK_META_MAR - cp $COMIN06/gfs_${PDY}06f* $DATA/GEMPAK_META_MAR - cp $COMIN12/gfs_${PDY}12f* $DATA/GEMPAK_META_MAR - cp $COMIN18/gfs_${PDY}18f* $DATA/GEMPAK_META_MAR -fi -export COMIN=$DATA/GEMPAK_META_MAR - -# export HPCNAM=${COMINnam}.$PDY -export HPCNAM=${COMINnam}.$PDY/gempak -# export HPCNGM=${MODEL}/ngm.$PDY -# -# DEFINE YESTERDAY -PDYm1=$($NDATE -24 ${PDY}${cyc} | cut -c -8) -PDY2m1=$(echo $PDYm1 | cut -c 3-) -# -# DEFINE 2 DAYS AGO -PDYm2=$($NDATE -48 ${PDY}${cyc} | cut -c -8) -PDY2m2=$(echo $PDYm2 | cut -c 3-) -# -# DEFINE 3 DAYS AGO -PDYm3=$($NDATE -72 ${PDY}${cyc} | cut -c -8) -PDY2m3=$(echo $PDYm3 | cut -c 3-) -# -# THE 1200 UTC CYCLE -# -if [ ${cyc} -eq 12 ] ; then - grid="F-${MDL} | ${PDY2}/${cyc}00" - for gareas in NAtl NPac - do - if [ ${gareas} = "NAtl" ] ; then +grid="F-${MDL} | ${PDY:2}/${cyc}00" +for 
garea in NAtl NPac; do + case ${garea} in + NAtl) garea="natl" proj=" " latlon="18/2/1/1/10" - elif [ ${gareas} = "NPac" ] ; then + ;; + NPac) garea="mpac" proj=" " latlon="18/2/1/1/10" - fi - for runtime in 06 00 - do - if [ ${runtime} = "06" ] ; then - cyc2="06" - grid2="F-${MDL} | ${PDY2}/0600" - add="06" - testgfsfhr="78" - elif [ ${runtime} = "00" ] ; then - cyc2="00" - grid2="F-${MDL} | ${PDY2}/0000" - add="12" - testgfsfhr="114" + ;; + *) + echo "FATAL ERROR: Unknown domain" + exit 100 + esac + + offsets=(6 12) + for offset in "${offsets[@]}"; do + init_time=$(date --utc +%Y%m%d%H -d "${PDY} ${cyc} - ${offset} hours") + init_PDY=${init_time:0:8} + init_cyc=${init_time:8:2} + + if (( init_time <= SDATE )); then + echo "Skipping generation for ${init_time} because it is before the experiment began" + if (( offset == "${offsets[0]}" )); then + echo "First forecast time, no metafile produced" + exit 0 fi + continue + fi + + # Create symlink in DATA to sidestep gempak path limits + HPCGFS="${RUN}.${init_time}" + if [[ ! 
-L ${HPCGFS} ]]; then + YMD="${init_PDY}" HH="${init_cyc}" GRID="1p00" generate_com source_dir:COM_ATMOS_GEMPAK_TMPL + ln -sf "${source_dir}" "${HPCGFS}" + fi + + case ${cyc} in + 00 | 12) + contours=1 + type_param="CTYPE" + ex="" + ;; + 06 | 18) + contours=2 + type_param="TYPE" + ex="ex" + ;; + *) + echo "FATAL ERROR: Invalid cycle ${cyc} passed to ${BASH_SOURCE[0]}" + exit 100 + ;; + esac + + case ${cyc}_${init_cyc} in + 00_*) testgfsfhr=114;; + 06_00) testgfsfhr=84;; + 06_18) testgfsfhr=72;; + 12_00) testgfsfhr=114;; + 12_06) testgfsfhr=78;; + 18_06) testgfsfhr=72;; + 18_12) testgfsfhr=84;; + *) + echo "FATAL ERROR: Undefined pairing of cycles" + exit 200 + ;; + esac + + for fhr in $(seq -s ' ' 0 6 126); do + gfsfhr=F$(printf "%02g" "${fhr}") + gfsoldfhr=F$(printf "%02g" $((fhr + offset))) + grid2="F-GFSHPC | ${init_time:2}/${init_cyc}00" gdpfun1="sm5s(hght)!sm5s(hght)" gdpfun2="sm5s(pmsl)!sm5s(pmsl)" line="5/1/3/2/2!6/1/3/2/2" hilo1="5/H#;L#//5/5;5/y!6/H#;L#//5/5;5/y" hilo2="5/H#;L#/1018-1060;900-1012/5/10;10/y!6/H#;L#/1018-1060;900-1012/5/10;10/y" - hilo3="5/H#;L#//5/5;5/y" - hilo4="5/H#;L#/1018-1060;900-1012/5/10;10/y" - title1="5/-2/~ ? ^ ${MDL} @ HGT (${cyc}Z YELLOW)|^${gareas} ${cyc}Z vs ${cyc2}Z 500 HGT!6/-3/~ ? ${MDL} @ HGT (${cyc2}Z CYAN)" - title2="5/-2/~ ? ^ ${MDL} PMSL (${cyc}Z YELLOW)|^${gareas} ${cyc}Z vs ${cyc2}Z PMSL!6/-3/~ ? ${MDL} PMSL (${cyc2}Z CYAN)" - title3="5/-2/~ ? ^ ${MDL} @ HGT (${cyc}Z YELLOW)|^${gareas} ${cyc}Z vs ${cyc2}Z 500 HGT" - title4="5/-2/~ ? 
^ ${MDL} PMSL (${cyc}Z YELLOW)|^${gareas} ${cyc}Z vs ${cyc2}Z PMSL" - for gfsfhr in 00 06 12 18 24 30 36 42 48 54 60 66 72 78 84 90 96 102 108 114 120 126 - do - gfsoldfhr=F$(expr ${gfsfhr} + ${add}) - gfsfhr2=$(echo ${gfsfhr}) - gfsfhr=F${gfsfhr} - if [ ${gfsfhr2} -gt ${testgfsfhr} ] ; then - grid="F-${MDL} | ${PDY2}/${cyc}00" - grid2=" " - gfsoldfhr=" " - gdpfun1="sm5s(hght)" - gdpfun2="sm5s(pmsl)" - line="5/1/3/2/2" - hilo1=$(echo ${hilo3}) - hilo2=$(echo ${hilo4}) - title1=$(echo ${title3}) - title2=$(echo ${title4}) - fi - -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF + title1="5/-2/~ ? ^ ${MDL} @ HGT (${cyc}Z YELLOW)|^${garea} ${cyc}Z vs ${init_cyc}Z 500 HGT!6/-3/~ ? ${MDL} @ HGT (${init_cyc}Z CYAN)" + title2="5/-2/~ ? ^ ${MDL} PMSL (${cyc}Z YELLOW)|^${garea} ${cyc}Z vs ${init_cyc}Z PMSL!6/-3/~ ? ${MDL} PMSL (${init_cyc}Z CYAN)" + if (( fhr > testgfsfhr )); then + grid2=" " + gfsoldfhr=" " + gdpfun1="sm5s(hght)" + gdpfun2="sm5s(pmsl)" + line="5/1/3/2/2" + hilo1="5/H#;L#//5/5;5/y" + hilo2="5/H#;L#/1018-1060;900-1012/5/10;10/y" + title1="5/-2/~ ? ^ ${MDL} @ HGT (${cyc}Z YELLOW)|^${garea} ${cyc}Z vs ${init_cyc}Z 500 HGT" + title2="5/-2/~ ? ^ ${MDL} PMSL (${cyc}Z YELLOW)|^${garea} ${cyc}Z vs ${init_cyc}Z PMSL" + fi + + export pgm=gdplot2_nc;. 
prep_step + "${GEMEXE}/gdplot2_nc" << EOF DEVICE = ${device} MAP = 1/1/1/yes CLEAR = yes GAREA = ${garea} -PROJ = ${proj} +PROJ = ${proj} LATLON = ${latlon} -SKIP = 0 +SKIP = 0 PANEL = 0 -CONTUR = 2 -CLRBAR = -FINT = -FLINE = -REFVEC = -WIND = 0 +CONTUR = ${contours} +CLRBAR = +FINT = +FLINE = +REFVEC = +WIND = 0 GDFILE = ${grid} !${grid2} GDATTIM = ${gfsfhr}!${gfsoldfhr} -GLEVEL = 500 +GLEVEL = 500 GVCORD = PRES GDPFUN = ${gdpfun1} LINE = ${line} SCALE = -1 -CTYPE = c +${type_param} = c CINT = 6 HLSYM = 1.2;1.2//21//hw TEXT = 1/21//hw @@ -175,100 +179,59 @@ HILO = ${hilo2} TITLE = ${title2} run +${ex} EOF -export err=$?;err_chk - done + export err=$?;err_chk done - # COMPARE THE 1200 UTC GFS MODEL TO THE 0000 UTC UKMET MODEL - grid="F-${MDL} | ${PDY2}/${cyc}00" - export HPCUKMET=${COMINukmet}.${PDY}/gempak - grid2="F-UKMETHPC | ${PDY2}/0000" - # for gfsfhr in 00 12 24 36 48 60 84 108 - for gfsfhr in 00 12 24 84 108 - do - ukmetfhr=F$(expr ${gfsfhr} + 12) - gfsfhr=F${gfsfhr} - -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF -DEVICE = ${device} -MAP = 1/1/1/yes -CLEAR = yes -GAREA = ${garea} -PROJ = ${proj} -LATLON = ${latlon} -GDFILE = ${grid} -GDATTIM = ${gfsfhr} - -SKIP = 0 -PANEL = 0 -CONTUR = 2 -CLRBAR = -GLEVEL = 500 -GVCORD = PRES -GDPFUN = sm5s(hght) -LINE = 5/1/3/2 -SCALE = -1 -CTYPE = c -CINT = 6 -FINT = -FLINE = -HLSYM = 1;1//21//hw -TEXT = s/21//hw -WIND = 0 -REFVEC = -HILO = 5/H#;L#//5/5;5/y -TITLE = 5/-1/~ ? ${MDL} @ HGT (12Z YELLOW)|~${gareas} 12Z vs UKM 00Z 500 HGT!0 -l -run + done -CLEAR = no -GDFILE = ${grid2} -GDATTIM = ${ukmetfhr} -GDPFUN = sm5s(hght) -LINE = 6/1/3/2 -HILO = 6/H#;L#//5/5;5/y -TITLE = 6/-2/~ ? UKMET @ HGT (00Z CYAN)!0 -l -ru + if (( 10#${cyc} % 12 ==0 )); then + + # + # There are some differences between 00z and 12z + # The YEST string makes sense (but is inconsistently used) + # The others I'm not sure why they differ. 
- WCK + # + case ${cyc} in + 00) + type_param="TYPE" + hlsym="1.2;1.2//21//hw" + wind="" + yest=" YEST" + run_cmd="run" + extra_cmd="\nHLSYM = 1.2;1.2//21//hw\nTEXT = s/21//hw" + ;; + 12) + type_param="CTYPE" + hlsym="1;1//21//hw" + wind="0" + yest="" + run_cmd="ru" + extra_cmd="" + ;; + *) + echo "FATAL ERROR: Invalid cycle ${cyc} in ${BASH_SOURCE[0]}" + exit 100 + ;; + esac + + # COMPARE THE GFS MODEL TO THE UKMET MODEL 12-HOURS PRIOR + ukmet_date=$(date --utc +%Y%m%d%H -d "${PDY} ${cyc} - 12 hours") + ukmet_PDY=${ukmet_date:0:8} + ukmet_cyc=${ukmet_date:8:2} + + export HPCUKMET="ukmet.${ukmet_PDY}" + if [[ ! -L "${HPCUKMET}" ]]; then + ln -sf "${COMINukmet}/ukmet.${ukmet_PDY}/gempak" "${HPCUKMET}" + fi + grid2="F-UKMETHPC | ${ukmet_PDY:2}/${ukmet_cyc}00" -CLEAR = yes -GLEVEL = 0 -GVCORD = none -SCALE = 0 -GDPFUN = sm5s(pmsl) -CINT = 4 -GDFILE = ${grid} -GDATTIM = ${gfsfhr} -LINE = 5/1/3/2 -HILO = 5/H#;L#/1018-1060;900-1012/5/10;10/y -TITLE = 5/-1/~ ? ${MDL} PMSL (12Z YELLOW)|~${gareas} 12Z vs UKM 00Z PMSL!0 -l -ru + for fhr in 00 12 24 84 108; do + gfsfhr=F$(printf "%02g" "${fhr}") + ukmetfhr=F$(printf "%02g" $((fhr + 12))) -CLEAR = no -GDFILE = ${grid2} -GDPFUN = sm5s(pmsl) -GDATTIM = ${ukmetfhr} -LINE = 6/1/3/2 -HILO = 6/H#;L#/1018-1060;900-1012/5/10;10/y -TITLE = 6/-2/~ ? UKMET PMSL (00Z CYAN)!0 -l -ru - -EOF -export err=$?;err_chk - done - # COMPARE THE 1200 UTC GFS MODEL TO THE 1200 UTC ECMWF FROM YESTERDAY - grid="F-${MDL} | ${PDY2}/${cyc}00" - grid2=${COMINecmwf}.${PDYm1}/gempak/ecmwf_glob_${PDYm1}12 - for gfsfhr in 00 24 48 72 96 120 - do - ecmwffhr=F$(expr ${gfsfhr} + 24) - gfsfhr=F${gfsfhr} - -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF + export pgm=gdplot2_nc;. 
prep_step + "${GEMEXE}/gdplot2_nc" << EOF DEVICE = ${device} MAP = 1/1/1/yes CLEAR = yes @@ -278,271 +241,85 @@ LATLON = ${latlon} GDFILE = ${grid} GDATTIM = ${gfsfhr} -SKIP = 0 +SKIP = 0 PANEL = 0 CONTUR = 2 -CLRBAR = -GLEVEL = 500 -GVCORD = PRES +CLRBAR = +GLEVEL = 500 +GVCORD = PRES GDPFUN = sm5s(hght) -LINE = 5/1/3/2 -SCALE = -1 -CTYPE = c -CINT = 6 -FINT = -FLINE = -HLSYM = 1;1//21//hw -TEXT = s/21//hw -WIND = 0 -REFVEC = +LINE = 5/1/3/2 +SCALE = -1 +${type_param} = c +CINT = 6 +FINT = +FLINE = +HLSYM = ${hlsym} +TEXT = s/21//hw +WIND = 0 +REFVEC = HILO = 5/H#;L#//5/5;5/y -TITLE = 5/-1/~ ? ${MDL} @ HGT (12Z YELLOW)|~${gareas} 12Z vs ECM yest 12Z 500 HGT!0 +TITLE = 5/-1/~ ? ${MDL} @ HGT (${cyc}Z YELLOW)|~${garea} ${cyc}Z vs UKM ${ukmet_cyc}Z 500 HGT!0 l run CLEAR = no GDFILE = ${grid2} -GDATTIM = ${ecmwffhr} +GDATTIM = ${ukmetfhr} GDPFUN = sm5s(hght) LINE = 6/1/3/2 HILO = 6/H#;L#//5/5;5/y -TITLE = 6/-2/~ ? ECMWF @ HGT (12Z YEST CYAN)!0 +TITLE = 6/-2/~ ? UKMET @ HGT (${ukmet_cyc}Z${yest} CYAN)!0 l -run +${run_cmd} CLEAR = yes GLEVEL = 0 GVCORD = none SCALE = 0 GDPFUN = sm5s(pmsl) -CINT = 4 +CINT = 4${extra_cmd} GDFILE = ${grid} GDATTIM = ${gfsfhr} LINE = 5/1/3/2 HILO = 5/H#;L#/1018-1060;900-1012/5/10;10/y -TITLE = 5/-1/~ ? ${MDL} PMSL (12Z YELLOW)|~${gareas} 12Z vs ECM yest 12Z PMSL!0 +TITLE = 5/-1/~ ? ${MDL} PMSL (${cyc}Z YELLOW)|~${garea} ${cyc}Z vs UKM ${ukmet_cyc}Z PMSL!0 l -run +${run_cmd} CLEAR = no GDFILE = ${grid2} GDPFUN = sm5s(pmsl) -GDATTIM = ${ecmwffhr} +GDATTIM = ${ukmetfhr} LINE = 6/1/3/2 HILO = 6/H#;L#/1018-1060;900-1012/5/10;10/y -TITLE = 6/-2/~ ? ECMWF PMSL (12Z YEST CYAN)!0 +TITLE = 6/-2/~ ? 
UKMET PMSL (${ukmet_cyc}Z CYAN)!0 l -run +${run_cmd} EOF -export err=$?;err_chk + export err=$?;err_chk done - # COMPARE THE 1200 UTC GFS MODEL TO THE 1200 UTC NAM AND NGM - grid="F-${MDL} | ${PDY2}/${cyc}00" - grid2="F-NAMHPC | ${PDY2}/${cyc}00" - # grid2ngm="F-NGMHPC | ${PDY2}/${cyc}00" - for gfsfhr in 00 06 12 18 24 30 36 42 48 54 60 66 72 78 84 - do - namfhr=F${gfsfhr} - # ngmfhr=F${gfsfhr} - gfsfhr=F${gfsfhr} - -$GEMEXE/gdplot2_nc << EOF -DEVICE = ${device} -MAP = 1/1/1/yes -CLEAR = yes -GAREA = ${garea} -PROJ = ${proj} -LATLON = ${latlon} -GDFILE = ${grid} -GDATTIM = ${gfsfhr} - -SKIP = 0 -PANEL = 0 -CONTUR = 2 -CLRBAR = -GLEVEL = 500 -GVCORD = PRES -GDPFUN = sm5s(hght) -LINE = 5/1/3/2 -SCALE = -1 -TYPE = c -CINT = 6 -FINT = -FLINE = -HLSYM = 1;1//21//hw -TEXT = s/21//hw -WIND = -REFVEC = -HILO = 3/H#;L#//5/5;5/y -TITLE = 5/-1/~ ? ${MDL} @ HGT (12Z YELLOW)|~${gareas} ${MDL}/NAM/NGM 500 HGT!0 -l -run -CLEAR = no -GDFILE = ${grid2} -GDATTIM = ${namfhr} -GDPFUN = sm5s(hght) -LINE = 6/1/3/2 -HILO = 5/H#;L#//5/5;5/y -TITLE = 6/-2/~ ? NAM @ HGT (12Z CYAN)!0 -l -run + # COMPARE THE GFS MODEL TO THE 12 UTC ECMWF FROM YESTERDAY + offset=$(( (10#${cyc}+12)%24 + 12 )) + ecmwf_date=$(date --utc +%Y%m%d%H -d "${PDY} ${cyc} - ${offset} hours") + ecmwf_PDY=${ecmwf_date:0:8} + # ecmwf_cyc=${ecmwf_date:8:2} -CLEAR = yes -GLEVEL = 0 -GVCORD = none -SCALE = 0 -GDPFUN = sm5s(pmsl) -CINT = 4 -GDFILE = ${grid} -GDATTIM = ${gfsfhr} -LINE = 5/1/3/2 -HILO = 3/H#;L#/1018-1060;900-1012/5/10;10/y -TITLE = 5/-1/~ ? ${MDL} PMSL (12Z YELLOW)|~${gareas} ${MDL}/NAM/NGM PMSL!0 -l -run - -CLEAR = no -GDFILE = ${grid2} -GDPFUN = sm5s(pmsl) -GDATTIM = ${namfhr} -LINE = 6/1/3/2 -HILO = 5/H#;L#/1018-1060;900-1012/5/10;10/y -TITLE = 6/-2/~ ? 
NAM PMSL (12Z CYAN)!0 -l -run - -EOF -export err=$?;err_chk - done - done -fi -if [ ${cyc} = "00" ] ; then - grid="F-${MDL} | ${PDY2}/${cyc}00" - for gareas in NAtl NPac - do - if [ ${gareas} = "NAtl" ] ; then - garea="natl" - proj=" " - latlon="0" - elif [ ${gareas} = "NPac" ] ; then - garea="mpac" - proj=" " - latlon="18/2/1/1/10" + HPCECMWF=ecmwf.${PDY} + if [[ ! -L "${HPCECMWF}" ]]; then + ln -sf "${COMINecmwf}/ecmwf.${ecmwf_PDY}/gempak" "${HPCECMWF}" fi - for runtime in 18 12 - do - if [ ${runtime} = "18" ] ; then - cyc2="18" - #XXW export HPCGFS=${MODEL}/${mdl}.${PDYm1} - # BV export HPCGFS=$COMROOT/nawips/${envir}/${mdl}.${PDYm1} - export HPCGFS=${COMINgempak}/${mdl}.${PDYm1}/${cyc2}/${COMPONENT}/gempak - - grid2="F-GFSHPC | ${PDY2m1}/1800" - add="06" - testgfsfhr="114" - elif [ ${runtime} = "12" ] ; then - cyc2="12" - #XXW export HPCGFS=${MODEL}/${mdl}.${PDYm1} - export HPCGFS=${COMINgempak}/${mdl}.${PDYm1}/${cyc2}/${COMPONENT}/gempak - - grid2="F-GFSHPC | ${PDY2m1}/1200" - add="12" - testgfsfhr="114" - fi - gdpfun1="sm5s(hght)!sm5s(hght)" - gdpfun2="sm5s(pmsl)!sm5s(pmsl)" - line="5/1/3/2/2!6/1/3/2/2" - hilo1="5/H#;L#//5/5;5/y!6/H#;L#//5/5;5/y" - hilo2="5/H#;L#/1018-1060;900-1012/5/10;10/y!6/H#;L#/1018-1060;900-1012/5/10;10/y" - hilo3="5/H#;L#//5/5;5/y" - hilo4="5/H#;L#/1018-1060;900-1012/5/10;10/y" - title1="5/-2/~ ? ^ ${MDL} @ HGT (${cyc}Z YELLOW)|^${gareas} ${cyc}Z vs ${cyc2}Z 500 HGT!6/-3/~ ? ${MDL} @ HGT (${cyc2}Z CYAN)" - title2="5/-2/~ ? ^ ${MDL} PMSL (${cyc}Z YELLOW)|^${gareas} ${cyc}Z vs ${cyc2}Z PMSL!6/-3/~ ? ${MDL} PMSL (${cyc2}Z CYAN)" - title3="5/-2/~ ? ^ ${MDL} @ HGT (${cyc}Z YELLOW)|^${gareas} ${cyc}Z vs ${cyc2}Z 500 HGT" - title4="5/-2/~ ? 
^ ${MDL} PMSL (${cyc}Z YELLOW)|^${gareas} ${cyc}Z vs ${cyc2}Z PMSL" - for gfsfhr in 00 06 12 18 24 30 36 42 48 54 60 66 72 78 84 90 96 102 108 114 120 126 - do - gfsoldfhr=F$(expr ${gfsfhr} + ${add}) - gfsfhr2=$(echo ${gfsfhr}) - gfsfhr=F${gfsfhr} - if [ ${gfsfhr2} -gt ${testgfsfhr} ] ; then - grid="F-${MDL} | ${PDY2}/${cyc}00" - grid2=" " - gfsoldfhr=" " - gdpfun1="sm5s(hght)" - gdpfun2="sm5s(pmsl)" - line="5/1/3/2/2" - hilo1=$(echo ${hilo3}) - hilo2=$(echo ${hilo4}) - title1=$(echo ${title3}) - title2=$(echo ${title4}) - fi -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF -DEVICE = ${device} -MAP = 1/1/1/yes -CLEAR = yes -GAREA = ${garea} -PROJ = ${proj} -LATLON = ${latlon} -SKIP = 0 -PANEL = 0 -CONTUR = 2 -CLRBAR = -FINT = -FLINE = -REFVEC = -WIND = 0 + grid2="${HPCECMWF}/ecmwf_glob_${ecmwf_date}" -GDFILE = ${grid} !${grid2} -GDATTIM = ${gfsfhr}!${gfsoldfhr} -GLEVEL = 500 -GVCORD = PRES -GDPFUN = ${gdpfun1} -LINE = ${line} -SCALE = -1 -CTYPE = c -CINT = 6 -HLSYM = 1.2;1.2//21//hw -TEXT = 1/21//hw -HILO = ${hilo1} -TITLE = ${title1} -run - -CLEAR = yes -GLEVEL = 0 -GVCORD = none -SCALE = 0 -GDPFUN = ${gdpfun2} -CINT = 4 -HLSYM = 1.2;1.2//21//hw -TEXT = 1/21//hw -GDFILE = ${grid} !${grid2} -GDATTIM = ${gfsfhr}!${gfsoldfhr} -LINE = ${line} -HILO = ${hilo2} -TITLE = ${title2} -run - -EOF -export err=$?;err_chk + for fhr in $(seq -s ' ' $(( offset%24 )) 24 120 ); do + gfsfhr=F$(printf "%02g" "${fhr}") + ecmwffhr=F$(printf "%02g" $((fhr + 24))) - done - done - # COMPARE THE 0000 UTC GFS MODEL TO THE 1200 UTC UKMET FROM YESTERDAY - grid="F-${MDL} | ${PDY2}/${cyc}00" - export HPCUKMET=${COMINukmet}.${PDYm1}/gempak - grid2="F-UKMETHPC | ${PDY2m1}/1200" - # for gfsfhr in 00 12 24 36 48 60 84 108 - for gfsfhr in 00 12 24 84 108 - do - ukmetfhr=F$(expr ${gfsfhr} + 12) - gfsfhr=F${gfsfhr} -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF + export pgm=gdplot2_nc;. 
prep_step + "${GEMEXE}/gdplot2_nc" << EOF DEVICE = ${device} MAP = 1/1/1/yes CLEAR = yes @@ -551,103 +328,26 @@ PROJ = ${proj} LATLON = ${latlon} GDFILE = ${grid} GDATTIM = ${gfsfhr} -SKIP = 0 -PANEL = 0 -CONTUR = 2 -CLRBAR = -GLEVEL = 500 -GVCORD = PRES -GDPFUN = sm5s(hght) -LINE = 5/1/3/2 -SCALE = -1 -TYPE = c -CINT = 6 -FINT = -FLINE = -HLSYM = 1.2;1.2//21//hw -TEXT = s/21//hw -WIND = -REFVEC = -HILO = 5/H#;L#//5/5;5/y -TITLE = 5/-1/~ ? ${MDL} @ HEIGHTS (00Z YELLOW)|~${gareas} 00Z vs UKM 12Z 500 HGT!0 -l -run - -CLEAR = no -GDFILE = ${grid2} -GDATTIM = ${ukmetfhr} -GDPFUN = sm5s(hght) -LINE = 6/1/3/2 -HILO = 6/H#;L#//5/5;5/y -TITLE = 6/-2/~ ? UKMET @ HEIGHTS (12Z YEST CYAN)!0 -l -run -CLEAR = yes -GLEVEL = 0 -GVCORD = none -SCALE = 0 -GDPFUN = sm5s(pmsl) -CINT = 4 -HLSYM = 1.2;1.2//21//hw -TEXT = s/21//hw -GDFILE = ${grid} -GDATTIM = ${gfsfhr} -LINE = 5/1/3/2 -HILO = 5/H#;L#/1018-1060;900-1012/5/10;10/y -TITLE = 5/-1/~ ? ${MDL} PMSL (00Z YELLOW) |~${gareas} 00Z vs UKM 12Z PMSL!0 -l -run - -CLEAR = no -GDFILE = ${grid2} -GDPFUN = sm5s(pmsl) -GDATTIM = ${ukmetfhr} -LINE = 6/1/3/2 -HILO = 6/H#;L#/1018-1060;900-1012/5/10;10/y -TITLE = 6/-2/~ ? 
UKMET PMSL (12Z YEST CYAN)!0 -l -run - -EOF -export err=$?;err_chk - done - # COMPARE THE 0000 UTC GFS MODEL TO THE 1200 UTC ECMWF FROM YESTERDAY - grid="F-${MDL} | ${PDY2}/${cyc}00" - grid2="${COMINecmwf}.${PDYm1}/gempak/ecmwf_glob_${PDYm1}12" - for gfsfhr in 12 36 60 84 108 - do - ecmwffhr=F$(expr ${gfsfhr} + 12) - gfsfhr=F${gfsfhr} - -$GEMEXE/gdplot2_nc << EOF -DEVICE = ${device} -MAP = 1/1/1/yes -CLEAR = yes -GAREA = ${garea} -PROJ = ${proj} -LATLON = ${latlon} -GDFILE = ${grid} -GDATTIM = ${gfsfhr} -SKIP = 0 +SKIP = 0 PANEL = 0 CONTUR = 2 -CLRBAR = -GLEVEL = 500 -GVCORD = PRES +CLRBAR = +GLEVEL = 500 +GVCORD = PRES GDPFUN = sm5s(hght) -LINE = 5/1/3/2 -SCALE = -1 -TYPE = c -CINT = 6 -FINT = -FLINE = -HLSYM = 1.2;1.2//21//hw -TEXT = s/21//hw -WIND = -REFVEC = +LINE = 5/1/3/2 +SCALE = -1 +${type_param} = c +CINT = 6 +FINT = +FLINE = +HLSYM = ${hlsym} +TEXT = s/21//hw +WIND = ${wind} +REFVEC = HILO = 5/H#;L#//5/5;5/y -TITLE = 5/-1/~ ? ${MDL} @ HGT (00Z YELLOW)|~${gareas} 00Z vs ECM 12Z 500 HGT!0 +TITLE = 5/-1/~ ? ${MDL} @ HGT (12Z YELLOW)|~${garea} 12Z vs ECM yest 12Z 500 HGT!0 l run @@ -666,14 +366,12 @@ GLEVEL = 0 GVCORD = none SCALE = 0 GDPFUN = sm5s(pmsl) -CINT = 4 -HLSYM = 1.2;1.2//21//hw -TEXT = s/21//hw +CINT = 4${extra_cmd} GDFILE = ${grid} GDATTIM = ${gfsfhr} LINE = 5/1/3/2 HILO = 5/H#;L#/1018-1060;900-1012/5/10;10/y -TITLE = 5/-1/~ ? ${MDL} PMSL (00Z YELLOW) |~${gareas} 00Z vs ECM 12Z PMSL!0 +TITLE = 5/-1/~ ? ${MDL} PMSL (12Z YELLOW)|~${garea} 12Z vs ECM yest 12Z PMSL!0 l run @@ -688,18 +386,16 @@ l run EOF -export err=$?;err_chk + export err=$?;err_chk done - # COMPARE THE 0000 UTC GFS MODEL TO THE 0000 UTC NAM AND NGM - grid="F-${MDL} | ${PDY2}/${cyc}00" - grid2="F-NAMHPC | ${PDY2}/${cyc}00" - for gfsfhr in 00 06 12 18 24 30 36 42 48 54 60 66 72 78 84 - do - namfhr=F${gfsfhr} - gfsfhr=F${gfsfhr} - -export pgm=gdplot2_nc;. 
prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF + + # COMPARE THE GFS MODEL TO THE NAM and NGM + grid2="F-NAMHPC | ${PDY:2}/${cyc}00" + for fhr in $(seq -s ' ' 0 6 84); do + gfsfhr=F$(printf "%02g" "${fhr}") + namfhr=F$(printf "%02g" "${fhr}") + + "${GEMEXE}/gdplot2_nc" << EOF DEVICE = ${device} MAP = 1/1/1/yes CLEAR = yes @@ -709,25 +405,25 @@ LATLON = ${latlon} GDFILE = ${grid} GDATTIM = ${gfsfhr} -SKIP = 0 +SKIP = 0 PANEL = 0 CONTUR = 2 -CLRBAR = -GLEVEL = 500 -GVCORD = PRES +CLRBAR = +GLEVEL = 500 +GVCORD = PRES GDPFUN = sm5s(hght) -LINE = 5/1/3/2 -SCALE = -1 -TYPE = c -CINT = 6 -FINT = -FLINE = -HLSYM = 1.2;1.2//21//hw -TEXT = s/21//hw -WIND = -REFVEC = +LINE = 5/1/3/2 +SCALE = -1 +TYPE = c +CINT = 6 +FINT = +FLINE = +HLSYM = ${hlsym} +TEXT = s/21//hw +WIND = +REFVEC = HILO = 3/H#;L#//5/5;5/y -TITLE = 5/-1/~ ? ${MDL} @ HGT (00Z YELLOW)|~${gareas} ${MDL}/NAM/NGM 500 HGT!0 +TITLE = 5/-1/~ ? ${MDL} @ HGT (${cyc}Z YELLOW)|~${garea} ${MDL}/NAM/NGM 500 HGT!0 l run @@ -737,7 +433,7 @@ GDATTIM = ${namfhr} GDPFUN = sm5s(hght) LINE = 6/1/3/2 HILO = 5/H#;L#//5/5;5/y -TITLE = 6/-2/~ ? NAM @ HGT (00Z CYAN)!0 +TITLE = 6/-2/~ ? NAM @ HGT (${cyc}Z CYAN)!0 l run @@ -746,14 +442,12 @@ GLEVEL = 0 GVCORD = none SCALE = 0 GDPFUN = sm5s(pmsl) -CINT = 4 -HLSYM = 1.2;1.2//21//hw -TEXT = s/21//hw +CINT = 4${extra_cmd} GDFILE = ${grid} GDATTIM = ${gfsfhr} LINE = 5/1/3/2 HILO = 3/H#;L#/1018-1060;900-1012/5/10;10/y -TITLE = 5/-1/~ ? ${MDL} PMSL (00Z YELLOW) |~${gareas} ${MDL}/NAM/NGM PMSL!0 +TITLE = 5/-1/~ ? ${MDL} PMSL (${cyc}Z YELLOW)|~${garea} ${MDL}/NAM/NGM PMSL!0 l run @@ -763,252 +457,30 @@ GDPFUN = sm5s(pmsl) GDATTIM = ${namfhr} LINE = 6/1/3/2 HILO = 5/H#;L#/1018-1060;900-1012/5/10;10/y -TITLE = 6/-2/~ ? NAM PMSL (CYAN)!0 +TITLE = 6/-2/~ ? 
NAM PMSL (${cyc}Z CYAN)!0 l run EOF -export err=$?;err_chk - + export err=$?;err_chk done - done -fi + fi +done -if [ ${cyc} -eq "18" ] ; then - grid="F-${MDL} | ${PDY2}/${cyc}00" - for gareas in NAtl NPac - do - if [ ${gareas} = "NAtl" ] ; then - garea="natl" - proj=" " - latlon="0" - elif [ ${gareas} = "NPac" ] ; then - garea="mpac" - proj=" " - latlon="18/2/1/1/10" - fi - for runtime in 12 06 - do - if [ ${runtime} = "12" ] ; then - cyc2="12" - grid2="F-${MDL} | ${PDY2}/1200" - add="06" - testgfsfhr="84" - elif [ ${runtime} = "06" ] ; then - cyc2="06" - grid2="F-${MDL} | ${PDY2}/0600" - add="12" - testgfsfhr="72" - fi - gdpfun1="sm5s(hght)!sm5s(hght)" - gdpfun2="sm5s(pmsl)!sm5s(pmsl)" - line="5/1/3/2/2!6/1/3/2/2" - hilo1="5/H#;L#//5/5;5/y!6/H#;L#//5/5;5/y" - hilo2="5/H#;L#/1018-1060;900-1012/5/10;10/y!6/H#;L#/1018-1060;900-1012/5/10;10/y" - hilo3="5/H#;L#//5/5;5/y" - hilo4="5/H#;L#/1018-1060;900-1012/5/10;10/y" - title1="5/-2/~ ? ^ ${MDL} @ HGT (${cyc}Z YELLOW)|^${gareas} ${cyc}Z vs ${cyc2}Z 500 HGT!6/-3/~ ? ${MDL} @ HGT (${cyc2}Z CYAN)" - title2="5/-2/~ ? ^ ${MDL} PMSL (${cyc}Z YELLOW)|^${gareas} ${cyc}Z vs ${cyc2}Z PMSL!6/-3/~ ? ${MDL} PMSL (${cyc2}Z CYAN)" - title3="5/-2/~ ? ^ ${MDL} @ HGT (${cyc}Z YELLOW)|^${gareas} ${cyc}Z vs ${cyc2}Z 500 HGT" - title4="5/-2/~ ? ^ ${MDL} PMSL (${cyc}Z YELLOW)|^${gareas} ${cyc}Z vs ${cyc2}Z PMSL" - for gfsfhr in 00 06 12 18 24 30 36 42 48 54 60 66 72 78 84 - do - gfsoldfhr=F$(expr ${gfsfhr} + ${add}) - gfsfhr2=$(echo ${gfsfhr}) - gfsfhr="F${gfsfhr}" - if [ ${gfsfhr2} -gt ${testgfsfhr} ] ; then - grid="F-${MDL} | ${PDY2}/${cyc}00" - grid2=" " - gfsoldfhr=" " - gdpfun1="sm5s(hght)" - gdpfun2="sm5s(pmsl)" - line="5/1/3/2/2" - hilo1=$(echo ${hilo3}) - hilo2=$(echo ${hilo4}) - title1=$(echo ${title3}) - title2=$(echo ${title4}) - fi -export pgm=gdplot2_nc;. 
prep_step; startmsg - -$GEMEXE/gdplot2_nc << EOF -DEVICE = ${device} -MAP = 1/1/1/yes -CLEAR = yes -GAREA = ${garea} -PROJ = ${proj} -LATLON = ${latlon} -SKIP = 0 -PANEL = 0 -CONTUR = 1 -CLRBAR = -FINT = -FLINE = -REFVEC = -WIND = 0 - -GDFILE = ${grid}!${grid2} -GDATTIM = ${gfsfhr}!${gfsoldfhr} -GLEVEL = 500 -GVCORD = PRES -GDPFUN = ${gdpfun1} -LINE = ${line} -SCALE = -1 -TYPE = c -CINT = 6 -HLSYM = 1.2;1.2//21//hw -TEXT = 1/21//hw -HILO = ${hilo1} -TITLE = ${title1} -run - -CLEAR = yes -GLEVEL = 0 -GVCORD = none -SCALE = 0 -GDPFUN = ${gdpfun2} -CINT = 4 -HLSYM = 1.2;1.2//21//hw -TEXT = 1/21//hw -GDFILE = ${grid} !${grid2} -GDATTIM = ${gfsfhr}!${gfsoldfhr} -LINE = ${line} -HILO = ${hilo2} -TITLE = ${title2} -run - -ex -EOF -export err=$?;err_chk - done - done - done -fi - -if [ ${cyc} -eq "06" ] ; then - grid="F-${MDL} | ${PDY2}/${cyc}00" - for gareas in NAtl NPac - do - if [ ${gareas} = "NAtl" ] ; then - garea="natl" - proj=" " - latlon="0" - elif [ ${gareas} = "NPac" ] ; then - garea="mpac" - proj=" " - latlon="18/2/1/1/10" - fi - for runtime in 00 18 - do - if [ ${runtime} = "00" ] ; then - cyc2="00" - grid2="F-${MDL} | ${PDY2}/0000" - add="06" - testgfsfhr="84" - elif [ ${runtime} = "18" ] ; then - cyc2="18" - #XXW export HPCGFS=${MODEL}/${mdl}.${PDYm1} - export HPCGFS=${COMINgempak}/${mdl}.${PDYm1}/${cyc2}/${COMPONENT}/gempak - grid2="F-GFSHPC | ${PDY2m1}/1800" - add="12" - testgfsfhr="72" - fi - gdpfun1="sm5s(hght)!sm5s(hght)" - gdpfun2="sm5s(pmsl)!sm5s(pmsl)" - line="5/1/3/2/2!6/1/3/2/2" - hilo1="5/H#;L#//5/5;5/y!6/H#;L#//5/5;5/y" - hilo2="5/H#;L#/1018-1060;900-1012/5/10;10/y!6/H#;L#/1018-1060;900-1012/5/10;10/y" - hilo3="5/H#;L#//5/5;5/y" - hilo4="5/H#;L#/1018-1060;900-1012/5/10;10/y" - title1="5/-2/~ ? ^ ${MDL} @ HGT (${cyc}Z YELLOW)|^${gareas} ${cyc}Z vs ${cyc2}Z 500 HGT!6/-3/~ ? ${MDL} @ HGT (${cyc2}Z CYAN)" - title2="5/-2/~ ? ^ ${MDL} PMSL (${cyc}Z YELLOW)|^${gareas} ${cyc}Z vs ${cyc2}Z PMSL!6/-3/~ ? ${MDL} PMSL (${cyc2}Z CYAN)" - title3="5/-2/~ ? 
^ ${MDL} @ HGT (${cyc}Z YELLOW)|^${gareas} ${cyc}Z vs ${cyc2}Z 500 HGT" - title4="5/-2/~ ? ^ ${MDL} PMSL (${cyc}Z YELLOW)|^${gareas} ${cyc}Z vs ${cyc2}Z PMSL" - for gfsfhr in 00 06 12 18 24 30 36 42 48 54 60 66 72 78 84 - do - gfsoldfhr=F$(expr ${gfsfhr} + ${add}) - gfsfhr2=$(echo ${gfsfhr}) - gfsfhr="F${gfsfhr}" - if [ ${gfsfhr2} -gt ${testgfsfhr} ] ; then - grid="F-${MDL} | ${PDY2}/${cyc}00" - grid2=" " - gfsoldfhr=" " - gdpfun1="sm5s(hght)" - gdpfun2="sm5s(pmsl)" - line="5/1/3/2/2" - hilo1=$(echo ${hilo3}) - hilo2=$(echo ${hilo4}) - title1=$(echo ${title3}) - title2=$(echo ${title4}) - fi -export pgm=gdplot2_nc;. prep_step; startmsg - -$GEMEXE/gdplot2_nc << EOF -DEVICE = ${device} -MAP = 1/1/1/yes -CLEAR = yes -GAREA = ${garea} -PROJ = ${proj} -LATLON = ${latlon} -SKIP = 0 -PANEL = 0 -CONTUR = 1 -CLRBAR = -FINT = -FLINE = -REFVEC = -WIND = 0 - -GDFILE = ${grid}!${grid2} -GDATTIM = ${gfsfhr}!${gfsoldfhr} -GLEVEL = 500 -GVCORD = PRES -GDPFUN = ${gdpfun1} -LINE = ${line} -SCALE = -1 -TYPE = c -CINT = 6 -HLSYM = 1.2;1.2//21//hw -TEXT = 1/21//hw -HILO = ${hilo1} -TITLE = ${title1} -run - -CLEAR = yes -GLEVEL = 0 -GVCORD = none -SCALE = 0 -GDPFUN = ${gdpfun2} -CINT = 4 -HLSYM = 1.2;1.2//21//hw -TEXT = 1/21//hw -GDFILE = ${grid} !${grid2} -GDATTIM = ${gfsfhr}!${gfsoldfhr} -LINE = ${line} -HILO = ${hilo2} -TITLE = ${title2} -run - -ex -EOF -export err=$?;err_chk - - done - done - done -fi - -#################################################### +##################################################### # GEMPAK DOES NOT ALWAYS HAVE A NON ZERO RETURN CODE # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK # FOR THIS CASE HERE. 
##################################################### -ls -l ${metaname} -export err=$?;export pgm="GEMPAK CHECK FILE";err_chk -if [ $SENDCOM = "YES" ] ; then - mv ${metaname} ${COMOUT}/${mdl}_${PDY}_${cyc}_mar_comp - if [ $SENDDBN = "YES" ] ; then - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job ${COMOUT}/${mdl}_${PDY}_${cyc}_mar_comp - fi +if (( err != 0 )) || [[ ! -s "${metaname}" ]] &> /dev/null; then + echo "FATAL ERROR: Failed to create gempak meta file ${metaname}" + exit $(( err + 100 )) +fi + +mv "${metaname}" "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_mar_comp" +if [[ "${SENDDBN}" == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_mar_comp" fi exit diff --git a/gempak/ush/gfs_meta_mar_pac.sh b/gempak/ush/gfs_meta_mar_pac.sh index b44f60a2f7..c53a208d55 100755 --- a/gempak/ush/gfs_meta_mar_pac.sh +++ b/gempak/ush/gfs_meta_mar_pac.sh @@ -1,31 +1,36 @@ -#! /bin/sh +#! /usr/bin/env bash # # Metafile Script : gfs_meta_mar_pac.sh # -# Log : -# J. Carr/PMB 12/08/2004 Pushed into production - # Set up Local Variables # -set -x + +source "${HOMEgfs}/ush/preamble.sh" + +mkdir -p -m 775 "${DATA}/MAR_PAC" +cd "${DATA}/MAR_PAC" || exit 2 +cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl + +# +# Link data into DATA to sidestep gempak path limits +# TODO: Replace this # -export PS4='MAR_PAC:$SECONDS + ' -mkdir -p -m 775 $DATA/MAR_PAC -cd $DATA/MAR_PAC -cp $FIXgempak/datatype.tbl datatype.tbl +export COMIN="${RUN}.${PDY}${cyc}" +if [[ ! -L ${COMIN} ]]; then + ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}" +fi mdl=gfs MDL="GFS" metatype="mar_pac" metaname="${mdl}_${metatype}_${cyc}.meta" device="nc | ${metaname}" -PDY2=$(echo $PDY | cut -c3-) -export pgm=gdplot2_nc;. prep_step; startmsg +export pgm=gdplot2_nc;. 
prep_step -$GEMEXE/gdplot2_nc << EOFplt +"${GEMEXE}/gdplot2_nc" << EOFplt \$MAPFIL=mepowo.gsf+mehsuo.ncp+mereuo.ncp+himouo.nws -gdfile = F-${MDL} | ${PDY2}/${cyc}00 +gdfile = F-${MDL} | ${PDY:2}/${cyc}00 gdattim = f00-f180-6 GAREA = 4;120;69;-105 PROJ = mer//3;3;0;1 @@ -34,7 +39,7 @@ LATLON = 18/2///10 CONTUR = 1 clear = y -device = $device +device = ${device} GLEVEL = 850:1000 !0 GVCORD = pres !none @@ -69,8 +74,8 @@ fline = 29;30;24;0 ! hilo = 0!0!0!20/H#;L#/1020-1070;900-1012 hlsym = 0!0!0!1;1//22;22/3;3/hw clrbar = 1/V/LL!0 -wind = bk9/0.8/1/112! -refvec = +wind = bk9/0.8/1/112! +refvec = title = 1/-2/~ ?|~ PMSL, BL TEMP, WIND!1//${MDL} MSL PRES,BL TEMP,WIND (KTS)!0 text = 1.2/22/2/hw clear = y @@ -113,9 +118,9 @@ LINE = 7/5/1/2 !20/1/2/1 FINT = 15;21;27;33;39;45;51;57 FLINE = 0;23-15 HILO = 2;6/X;N/10-99;10-99! ! -HLSYM = +HLSYM = CLRBAR = 1 -WIND = +WIND = REFVEC = TITLE = 5//~ ? ${MDL} @ HEIGHTS AND VORTICITY|~ @ HGHT AND VORTICITY!0 TEXT = 1/21//hw @@ -146,22 +151,22 @@ CLEAR = yes li r -glevel = 300!300!300 -gvcord = pres!pres!pres +glevel = 300!300!300 +gvcord = pres!pres!pres panel = 0 skip = 1!1!1/3/3!1 scale = 0!0!5!5!-1 GDPFUN = mag(kntv(wnd))//jet!jet!div(wnd)//dvg!dvg!sm5s(hght) TYPE = c!c/f!c/f!c!c cint = 30;50!70;90;110;130;150;170;190!-11;-9;-7;-5;-3!2/3/18!12/720 -line = 26!32//2!19/-2//2!20!1//2 -fint = !70;90;110;130;150;170;190!3;5;7;9;11;13! +line = 26!32//2!19/-2//2!20!1//2 +fint = !70;90;110;130;150;170;190!3;5;7;9;11;13! fline = !0;24;25;29;7;15;14;2!0;23;22;21;17;16;2! hilo = 0!0!0!0!1/H;L/3 hlsym = 0!0!0!0!1.5;1.5//22;22/2;2/hw clrbar = 0!0!1/V/LL!0 -wind = !!am16/0.3//211/0.4! -refvec = 10 +wind = !!am16/0.3//211/0.4! 
+refvec = 10 title = 1/-2/~ ?|~ @ SPEED & DIVERG!1//${MDL} @ HGHTS, ISOTACHS, & DIVERGENCE!0 text = 1.2/22/2/hw clear = y @@ -255,19 +260,21 @@ exit EOFplt export err=$?;err_chk + ##################################################### # GEMPAK DOES NOT ALWAYS HAVE A NON ZERO RETURN CODE # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK # FOR THIS CASE HERE. ##################################################### -ls -l $metaname -export err=$?;export pgm="GEMPAK CHECK FILE";err_chk +if (( err != 0 )) || [[ ! -s "${metaname}" ]] &> /dev/null; then + echo "FATAL ERROR: Failed to create gempak meta file ${metaname}" + exit $(( err + 100 )) +fi -if [ $SENDCOM = "YES" ] ; then - mv ${metaname} ${COMOUT}/${mdl}_${PDY}_${cyc}_mar_pac - if [ $SENDDBN = "YES" ] ; then - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job ${COMOUT}/${mdl}_${PDY}_${cyc}_mar_pac - fi +mv "${metaname}" "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_mar_pac" +if [[ "${SENDDBN}" == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_mar_pac" fi exit diff --git a/gempak/ush/gfs_meta_mar_ql.sh b/gempak/ush/gfs_meta_mar_ql.sh index f1abf3d395..6d653bb550 100755 --- a/gempak/ush/gfs_meta_mar_ql.sh +++ b/gempak/ush/gfs_meta_mar_ql.sh @@ -1,39 +1,43 @@ -#! /bin/sh +#! /usr/bin/env bash # # Metafile Script : gfs_meta_mar_ql.sh # -# Log : -# J. Carr/PMB 12/07/2004 Pushed into production -# # Set up Local Variables # -set -x + +source "${HOMEgfs}/ush/preamble.sh" + +mkdir -p -m 775 "${DATA}/MAR_QL" +cd "${DATA}/MAR_QL" || exit 2 +cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl + +# +# Link data into DATA to sidestep gempak path limits +# TODO: Replace this # -export PS4='MAR_QL_F${fend}:$SECONDS + ' -mkdir -p -m 775 $DATA/MAR_QL -cd $DATA/MAR_QL -cp $FIXgempak/datatype.tbl datatype.tbl +export COMIN="${RUN}.${PDY}${cyc}" +if [[ ! 
-L ${COMIN} ]]; then + ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}" +fi mdl=gfs MDL="GFS" metatype="mar_ql" metaname="${mdl}_${metatype}_${cyc}.meta" device="nc | ${metaname}" -PDY2=$(echo $PDY | cut -c3-) -# fend=180 -export pgm=gdplot2_nc;. prep_step; startmsg +export pgm=gdplot2_nc;. prep_step -$GEMEXE/gdplot2_nc << EOFplt +"${GEMEXE}/gdplot2_nc" << EOFplt \$MAPFIL=mepowo.gsf+mehsuo.ncp+mereuo.ncp+mefbao.ncp -gdfile = F-${MDL} | ${PDY2}/${cyc}00 +gdfile = F-${MDL} | ${PDY:2}/${cyc}00 gdattim = f00-f${fend}-6 GAREA = 15;-100;70;5 PROJ = mer//3;3;0;1 MAP = 31 + 6 + 3 + 5 LATLON = 18/2/1/1/10 CONTUR = 0 -device = $device +device = ${device} GLEVEL = 9950!0 GVCORD = sgma!none PANEL = 0 @@ -83,7 +87,7 @@ ru GLEVEL = 500 GVCORD = PRES -SKIP = 0 +SKIP = 0 SCALE = 5 !-1 GDPFUN = (avor(wnd)) !hght TYPE = c/f !c @@ -92,7 +96,7 @@ LINE = 7/5/1/2 !20/1/2/1 FINT = 15;21;27;33;39;45;51;57 FLINE = 0;23-15 HILO = 2;6/X;N/10-99;10-99! ! -HLSYM = +HLSYM = WIND = 0 TITLE = 5//~ ? GFS @ HEIGHTS AND VORTICITY|~WATL @ HGHT AND VORT!0 li @@ -148,7 +152,7 @@ ru GLEVEL = 500 GVCORD = PRES -SKIP = 0 +SKIP = 0 SCALE = 5 !-1 GDPFUN = (avor(wnd)) !hght TYPE = c/f !c @@ -157,27 +161,29 @@ LINE = 7/5/1/2 !20/1/2/1 FINT = 15;21;27;33;39;45;51;57 FLINE = 0;23-15 HILO = 2;6/X;N/10-99;10-99! ! -HLSYM = +HLSYM = WIND = 0 TITLE = 5//~ ? GFS @ HEIGHTS AND VORTICITY|~EPAC @ HGHT AND VORT!0 li ru exit EOFplt +export err=$?;err_chk ##################################################### # GEMPAK DOES NOT ALWAYS HAVE A NON ZERO RETURN CODE # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK # FOR THIS CASE HERE. ##################################################### -ls -l $metaname -export err=$?;export pgm="GEMPAK CHECK FILE";err_chk - -if [ $SENDCOM = "YES" ] ; then - mv ${metaname} ${COMOUT}/${mdl}_${PDY}_${cyc}_mar_ql - if [ $SENDDBN = "YES" ] ; then - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job ${COMOUT}/${mdl}_${PDY}_${cyc}_mar_ql - fi +if (( err != 0 )) || [[ ! 
-s "${metaname}" ]] &> /dev/null; then
+    echo "FATAL ERROR: Failed to create gempak meta file ${metaname}"
+    exit $(( err + 100 ))
+fi
+
+mv "${metaname}" "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_mar_ql"
+if [[ "${SENDDBN}" == "YES" ]] ; then
+    "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \
+        "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_mar_ql"
 fi

 exit
diff --git a/gempak/ush/gfs_meta_mar_skewt.sh b/gempak/ush/gfs_meta_mar_skewt.sh
index 040e09e932..e61c4cb1dc 100755
--- a/gempak/ush/gfs_meta_mar_skewt.sh
+++ b/gempak/ush/gfs_meta_mar_skewt.sh
@@ -1,31 +1,35 @@
-#! /bin/sh
+#! /usr/bin/env bash
 #
 # Metafile Script : gfs_meta_mar_skewt.sh
 #
-# Log :
-# J. Carr/PMB 12/08/2004 Pushed into production
-
 # Set up Local Variables
 #
-set -x
+
+source "${HOMEgfs}/ush/preamble.sh"
+
+mkdir -p -m 775 "${DATA}/MAR_SKEWT"
+cd "${DATA}/MAR_SKEWT" || exit 2
+cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl
+
+#
+# Link data into DATA to sidestep gempak path limits
+# TODO: Replace this
 #
-export PS4='MAR_SKEWT:$SECONDS + '
-mkdir -p -m 775 $DATA/MAR_SKEWT
-cd $DATA/MAR_SKEWT
-cp $FIXgempak/datatype.tbl datatype.tbl
+export COMIN="${RUN}.${PDY}${cyc}"
+if [[ ! -L ${COMIN} ]]; then
+    ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}"
+fi

 mdl=gfs
 MDL="GFS"
 metatype="mar_skewt"
 metaname="${mdl}_${metatype}_${cyc}.meta"
 device="nc | ${metaname}"
-PDY2=$(echo $PDY | cut -c3-)

-for fhr in 000 006 012 018 024 030 036 042 048 054 060 066 072
-do
-    export pgm=gdprof;. prep_step; startmsg
+for fhr in $(seq -f "%03g" -s ' ' 0 6 72); do
+    export pgm=gdprof;. prep_step

-$GEMEXE/gdprof << EOFplt
+    "${GEMEXE}/gdprof" << EOFplt
 GDATTIM = F${fhr}
 GVCORD = PRES
 GDFILE = F-${MDL}
@@ -38,12 +42,12 @@ SCALE = 0
 XAXIS = -40/50/10/1;1;1
 YAXIS = 1050/100//1;1;1
 WIND = bk1
-REFVEC = 
+REFVEC =
 WINPOS = 1
 FILTER = no
 PANEL = 0
 TEXT = 1.2/22/2/hw
-DEVICE = $device
+DEVICE = ${device}
 OUTPUT = T
 THTALN = 18/1/1
 THTELN = 23/2/1
@@ -272,25 +276,26 @@ ru
 exit
 EOFplt
-export err=$?;err_chk
+    export err=$?;err_chk
 done
-$GEMEXE/gpend
+"${GEMEXE}/gpend"

 #####################################################
 # GEMPAK DOES NOT ALWAYS HAVE A NON ZERO RETURN CODE
 # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK
 # FOR THIS CASE HERE.
 #####################################################
-ls -l $metaname
-export err=$?;export pgm="GEMPAK CHECK FILE";err_chk
-
-if [ $SENDCOM = "YES" ] ; then
-    mv ${metaname} ${COMOUT}/${mdl}_${PDY}_${cyc}_mar_skewt
-    if [ $SENDDBN = "YES" ] ; then
-        ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job ${COMOUT}/${mdl}_${PDY}_${cyc}_mar_skewt
-    fi
+if (( err != 0 )) || [[ ! -s "${metaname}" ]] &> /dev/null; then
+    echo "FATAL ERROR: Failed to create gempak meta file ${metaname}"
+    exit $(( err + 100 ))
+fi
+
+mv "${metaname}" "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_mar_skewt"
+if [[ "${SENDDBN}" == "YES" ]] ; then
+    "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \
+        "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_mar_skewt"
 fi

 exit
diff --git a/gempak/ush/gfs_meta_mar_ver.sh b/gempak/ush/gfs_meta_mar_ver.sh
index 63ccba00ed..32b5117ce0 100755
--- a/gempak/ush/gfs_meta_mar_ver.sh
+++ b/gempak/ush/gfs_meta_mar_ver.sh
@@ -1,31 +1,36 @@
-#! /bin/sh
+#! /usr/bin/env bash
 #
 # Metafile Script : gfs_meta_mar_ver.sh
 #
-# Log :
-# J. Carr/PMB 12/08/2004 Pushed into production
-#
 # Set up Local Variables
 #
-set -x
+
+source "${HOMEgfs}/ush/preamble.sh"
+
+mkdir -p -m 775 "${DATA}/MAR_VER"
+cd "${DATA}/MAR_VER" || exit 2
+cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl
+
+#
+# Link data into DATA to sidestep gempak path limits
+# TODO: Replace this
 #
-export PS4='MAR_VER:$SECONDS + '
-mkdir -p -m 775 $DATA/MAR_VER
-cd $DATA/MAR_VER
-cp $FIXgempak/datatype.tbl datatype.tbl
+export COMIN="${RUN}.${PDY}${cyc}"
+if [[ ! -L ${COMIN} ]]; then
+    ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}"
+fi

 mdl=gfs
 MDL="GFS"
 metatype="mar_ver"
 metaname="${mdl}_${metatype}_${cyc}.meta"
 device="nc | ${metaname}"
-PDY2=$(echo $PDY | cut -c3-)

-export pgm=gdplot2_nc;. prep_step; startmsg
+export pgm=gdplot2_nc;. prep_step

-$GEMEXE/gdplot2_nc << EOFplt
+"${GEMEXE}/gdplot2_nc" << EOFplt
 \$MAPFIL=hipowo.gsf+mefbao.ncp
-gdfile = F-${MDL} | ${PDY2}/${cyc}00
+gdfile = F-${MDL} | ${PDY:2}/${cyc}00
 gdattim = f00-f48-6
 GLEVEL = 9950
 GVCORD = sgma
@@ -42,7 +47,7 @@ FLINE =
 HILO =
 HLSYM =
 CLRBAR =
-WIND = 
+WIND =
 REFVEC =
 TITLE = 31/-2/~ ? ${MDL} Gridded BL Wind Direction (40m AGL)|~ WATL GRIDDED WIND DIR!0
 TEXT = 0.8/21/1/hw
@@ -51,7 +56,7 @@ GAREA = 27.2;-81.9;46.7;-61.4
 PROJ = STR/90.0;-67.0;1.0
 MAP = 31+6
 LATLON = 18/1/1/1;1/5;5
-DEVICE = $device
+DEVICE = ${device}
 STNPLT = 31/1.3/22/1.6/hw|25/19/1.3/1.6|buoys.tbl
 SATFIL =
 RADFIL =
@@ -86,19 +91,21 @@ exit
 EOFplt
 export err=$?;err_chk
+
 #####################################################
 # GEMPAK DOES NOT ALWAYS HAVE A NON ZERO RETURN CODE
 # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK
 # FOR THIS CASE HERE.
 #####################################################
-ls -l $metaname
-export err=$?;export pgm="GEMPAK CHECK FILE";err_chk
+if (( err != 0 )) || [[ ! -s "${metaname}" ]] &> /dev/null; then
+    echo "FATAL ERROR: Failed to create gempak meta file ${metaname}"
+    exit $(( err + 100 ))
+fi

-if [ $SENDCOM = "YES" ] ; then
-    mv ${metaname} ${COMOUT}/${mdl}_${PDY}_${cyc}_mar_ver
-    if [ $SENDDBN = "YES" ] ; then
-        ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job ${COMOUT}/${mdl}_${PDY}_${cyc}_mar_ver
-    fi
+mv "${metaname}" "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_mar_ver"
+if [[ "${SENDDBN}" == "YES" ]] ; then
+    "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \
+        "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_mar_ver"
 fi

 exit
diff --git a/gempak/ush/gfs_meta_nhsh.sh b/gempak/ush/gfs_meta_nhsh.sh
index 3e0146270e..d52ed9a64c 100755
--- a/gempak/ush/gfs_meta_nhsh.sh
+++ b/gempak/ush/gfs_meta_nhsh.sh
@@ -1,39 +1,34 @@
-#!/bin/sh
-
+#! /usr/bin/env bash
 #
 # Metafile Script : mrf_meta_nhsh
 #
-# Log :
-# D.W.Plummer/NCEP 2/97 Add log header
-# D.W.Plummer/NCEP 2/97 Added $MAPFIL=mepowo.gsf
-# D.W.Plummer/NCEP 4/97 Changed SKIP for grid2
-# B. Gordon 4/00 Converted for production on IBM-SP
-# and changed gdplot_nc -> gdplot2_nc
-# D. Michaud 4/16 Added logic to display different titles
-# for parallel runs
-# B. Gordon 7/02 Converted to run off the GFS due to demise
-# of the MRF.
-# J. Carr 11/04 Changed contur from 1 to a 2.
-# Added a ? to all title/TITLE lines.
-#
-set -xa
-mkdir -p -m 775 $DATA/mrfnhsh
-cd $DATA/mrfnhsh
-cp $FIXgempak/datatype.tbl datatype.tbl
-PDY2=$(echo $PDY | cut -c3-)
+source "${HOMEgfs}/ush/preamble.sh"
+
+mkdir -p -m 775 "${DATA}/mrfnhsh"
+cd "${DATA}/mrfnhsh" || exit 2
+cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl
+
+#
+# Link data into DATA to sidestep gempak path limits
+# TODO: Replace this
+#
+export COMIN="${RUN}.${PDY}${cyc}"
+if [[ ! -L ${COMIN} ]]; then
+    ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}"
+fi

-if [ "$envir" = "para" ] ; then
+if [[ "${envir}" == "para" ]] ; then
     export m_title="GFSP"
 else
     export m_title="GFS"
 fi

-export pgm=gdplot2_nc; prep_step; startmsg
+export pgm=gdplot2_nc; prep_step

-$GEMEXE/gdplot2_nc << EOF
+"${GEMEXE}/gdplot2_nc" << EOF
 \$MAPFIL=mepowo.gsf
-GDFILE = F-GFS | ${PDY2}/${cyc}00
+GDFILE = F-GFS | ${PDY:2}/${cyc}00
 GDATTIM = F000-F384-12
 DEVICE = nc | Nmeta_nh
 PANEL = 0
@@ -43,11 +38,11 @@ MAP = 1
 CLEAR = yes
 CLRBAR = 1
-restore ${USHgempak}/restore/garea_nh.nts
+restore ${HOMEgfs}/gempak/ush/restore/garea_nh.nts
-restore ${USHgempak}/restore/500mb_hght_absv.2.nts
+restore ${HOMEgfs}/gempak/ush/restore/500mb_hght_absv.2.nts
 CLRBAR = 1
-TEXT = 1/21//hw 
+TEXT = 1/21//hw
 SKIP = 0 !0 !1
 SCALE = 5 !5 !-1
 GFUNC = (avor(wnd))//v !mul(v,-1) !hght
@@ -59,47 +54,47 @@ HILO = 2;6/X;N/10-99;10-99!2;6/X;N/10-99;10-99!
 TITLE = 5//~ ? @ HEIGHTS AND VORTICITY|~ @ HGHT AND VORTICITY!
 TEXT = 1/21//hw
 CLEAR = yes
- 
-TITLE = 5//~ ? $m_title @ HEIGHTS AND VORTICITY|~ @ HGHT AND VORTICITY!0
+
+TITLE = 5//~ ? ${m_title} @ HEIGHTS AND VORTICITY|~ @ HGHT AND VORTICITY!0
 l
 ru
-restore ${USHgempak}/restore/garea_sh.nts
+restore ${HOMEgfs}/gempak/ush/restore/garea_sh.nts
 DEVICE = nc | Nmeta_sh
-TITLE = 5//~ ? $m_title @ HEIGHTS AND VORTICITY|~ @ HGHT AND VORTICITY!0
+TITLE = 5//~ ? ${m_title} @ HEIGHTS AND VORTICITY|~ @ HGHT AND VORTICITY!0
 l
 ru
-restore ${USHgempak}/restore/garea_nh.nts
+restore ${HOMEgfs}/gempak/ush/restore/garea_nh.nts
 DEVICE = nc | Nmeta_nh
-restore ${USHgempak}/restore/250mb_hght_wnd.2.nts
+restore ${HOMEgfs}/gempak/ush/restore/250mb_hght_wnd.2.nts
 CLRBAR = 1
 TEXT = 1/21//hw
 GDPFUN = knts((mag(wnd))) !sm9s(hght)
-TITLE = 5/-2/~ ? $m_title @ HEIGHTS, ISOTACHS AND WIND (KTS)|~ @ HGHT AND WIND!0
+TITLE = 5/-2/~ ? ${m_title} @ HEIGHTS, ISOTACHS AND WIND (KTS)|~ @ HGHT AND WIND!0
 l
 ru
-restore ${USHgempak}/restore/garea_sh.nts
+restore ${HOMEgfs}/gempak/ush/restore/garea_sh.nts
 DEVICE = nc | Nmeta_sh
 ru
-restore ${USHgempak}/restore/precip.2.nts
+restore ${HOMEgfs}/gempak/ush/restore/precip.2.nts
 CLRBAR = 1
 TEXT = 1/21//hw
 GDATTIM = F12-F240-12
 GDPFUN = (quo(mul(pr12,43200),25.4))
 GDPFUN = (quo(p12m,25.4))
-TITLE = 5//~ ? $m_title 12-HOUR TOTAL PRECIPITATION (IN)|~ 12-HOURLY TOTAL PCPN
+TITLE = 5//~ ? ${m_title} 12-HOUR TOTAL PRECIPITATION (IN)|~ 12-HOURLY TOTAL PCPN
 l
 r
-restore ${USHgempak}/restore/garea_sh.nts
+restore ${HOMEgfs}/gempak/ush/restore/garea_sh.nts
 DEVICE = nc | Nmeta_sh
 ru
@@ -112,27 +107,20 @@ export err=$?; err_chk
 # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK
 # FOR THIS CASE HERE.
 #####################################################
-ls -l Nmeta_nh
-export err=$?;export pgm="GEMPAK CHECK FILE"; err_chk
-ls -l Nmeta_sh
-export err=$?;export pgm="GEMPAK CHECK FILE"; err_chk
-
-if [ $SENDCOM = "YES" ] ; then
-    mv Nmeta_nh ${COMOUT}/gfs_${PDY}_${cyc}_nh
-    mv Nmeta_sh ${COMOUT}/gfs_${PDY}_${cyc}_sh
-    if [ $SENDDBN = "YES" ] ; then
-        $DBNROOT/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \
-        $COMOUT/gfs_${PDY}_${cyc}_nh
-        $DBNROOT/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \
-        $COMOUT/gfs_${PDY}_${cyc}_sh
-        if [ $DBN_ALERT_TYPE = "GFS_METAFILE_LAST" ] ; then
-            DBN_ALERT_TYPE=GFS_METAFILE
-            $DBNROOT/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \
-            $COMOUT/gfs_${PDY}_${cyc}_nh
-            $DBNROOT/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \
-            $COMOUT/gfs_${PDY}_${cyc}_sh
-        fi
-    fi
-fi
-
-#
+for metaname in Nmeta_nh Nmeta_sh; do
+    if (( err != 0 )) || [[ ! -s "${metaname}" ]] &> /dev/null; then
+        echo "FATAL ERROR: Failed to create gempak meta file ${metaname}"
+        exit $(( err + 100 ))
+    fi
+
+    mv "${metaname}" "${COM_ATMOS_GEMPAK_META}/gfs_${PDY}_${cyc}_${metaname/Nmeta_}"
+    if [[ "${SENDDBN}" == "YES" ]] ; then
+        "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \
+            "${COM_ATMOS_GEMPAK_META}/gfs_${PDY}_${cyc}_${metaname/Nmeta_}"
+        if [[ ${DBN_ALERT_TYPE} = "GFS_METAFILE_LAST" ]] ; then
+            DBN_ALERT_TYPE=GFS_METAFILE
+            "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \
+                "${COM_ATMOS_GEMPAK_META}/gfs_${PDY}_${cyc}_${metaname/Nmeta_}"
+        fi
+    fi
+done
diff --git a/gempak/ush/gfs_meta_opc_na_ver b/gempak/ush/gfs_meta_opc_na_ver
index 8d5f394b3d..1d543c52e9 100755
--- a/gempak/ush/gfs_meta_opc_na_ver
+++ b/gempak/ush/gfs_meta_opc_na_ver
@@ -1,247 +1,79 @@
-#!/bin/sh
+#! /usr/bin/env bash
 #
 # Metafile Script : gfs_meta_opc_na_ver
 #
-# Log :
-# J. Carr/HPC 12/08/2004 Submitted into production.
-#
 # Set up Local Variables
 #
-set -x
-#
-export PS4='OPC_NA_VER_F${fend}:$SECONDS + '
-mkdir -p -m 775 $DATA/OPC_NA_VER_F${fend}
-cd $DATA/OPC_NA_VER_F${fend}
-cp $FIXgempak/datatype.tbl datatype.tbl
-export COMPONENT=${COMPONENT:-atmos}
+source "${HOMEgfs}/ush/preamble.sh"
+
+mkdir -p -m 775 "${DATA}/OPC_NA_VER_F${fend}"
+cd "${DATA}/OPC_NA_VER_F${fend}" || exit 2
+cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl
+
+#
+# Link data into DATA to sidestep gempak path limits
+# TODO: Replace this
+#
+export COMIN="${RUN}.${PDY}${cyc}"
+if [[ ! -L ${COMIN} ]]; then
+    ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}"
+fi

 mdl=gfs
 MDL="GFS"
-metatype="ver"
 metaname="gfsver_mpc_na_${cyc}.meta"
 device="nc | ${metaname}"
-PDY2=$(echo $PDY | cut -c3-)
-#
-# DEFINE 1 CYCLE AGO
-dc1=$($NDATE -06 ${PDY}${cyc} | cut -c -10)
-date1=$(echo ${dc1} | cut -c -8)
-sdate1=$(echo ${dc1} | cut -c 3-8)
-cycle1=$(echo ${dc1} | cut -c 9,10)
-# DEFINE 2 CYCLES AGO
-dc2=$($NDATE -12 ${PDY}${cyc} | cut -c -10)
-date2=$(echo ${dc2} | cut -c -8)
-sdate2=$(echo ${dc2} | cut -c 3-8)
-cycle2=$(echo ${dc2} | cut -c 9,10)
-# DEFINE 3 CYCLES AGO
-dc3=$($NDATE -18 ${PDY}${cyc} | cut -c -10)
-date3=$(echo ${dc3} | cut -c -8)
-sdate3=$(echo ${dc3} | cut -c 3-8)
-cycle3=$(echo ${dc3} | cut -c 9,10)
-# DEFINE 4 CYCLES AGO
-dc4=$($NDATE -24 ${PDY}${cyc} | cut -c -10)
-date4=$(echo ${dc4} | cut -c -8)
-sdate4=$(echo ${dc4} | cut -c 3-8)
-cycle4=$(echo ${dc4} | cut -c 9,10)
-# DEFINE 5 CYCLES AGO
-dc5=$($NDATE -30 ${PDY}${cyc} | cut -c -10)
-date5=$(echo ${dc5} | cut -c -8)
-sdate5=$(echo ${dc5} | cut -c 3-8)
-cycle5=$(echo ${dc5} | cut -c 9,10)
-# DEFINE 6 CYCLES AGO
-dc6=$($NDATE -36 ${PDY}${cyc} | cut -c -10)
-date6=$(echo ${dc6} | cut -c -8)
-sdate6=$(echo ${dc6} | cut -c 3-8)
-cycle6=$(echo ${dc6} | cut -c 9,10)
-# DEFINE 7 CYCLES AGO
-dc7=$($NDATE -42 ${PDY}${cyc} | cut -c -10)
-date7=$(echo ${dc7} | cut -c -8)
-sdate7=$(echo ${dc7} | cut -c 3-8)
-cycle7=$(echo ${dc7} | cut -c 9,10)
-# DEFINE 8 CYCLES AGO
-dc8=$($NDATE -48 ${PDY}${cyc} | cut -c -10)
-date8=$(echo ${dc8} | cut -c -8)
-sdate8=$(echo ${dc8} | cut -c 3-8)
-cycle8=$(echo ${dc8} | cut -c 9,10)
-# DEFINE 9 CYCLES AGO
-dc9=$($NDATE -54 ${PDY}${cyc} | cut -c -10)
-date9=$(echo ${dc9} | cut -c -8)
-sdate9=$(echo ${dc9} | cut -c 3-8)
-cycle9=$(echo ${dc9} | cut -c 9,10)
-# DEFINE 10 CYCLES AGO
-dc10=$($NDATE -60 ${PDY}${cyc} | cut -c -10)
-date10=$(echo ${dc10} | cut -c -8)
-sdate10=$(echo ${dc10} | cut -c 3-8)
-cycle10=$(echo ${dc10} | cut -c 9,10)
-# DEFINE 11 CYCLES AGO
-dc11=$($NDATE -66 ${PDY}${cyc} | cut -c -10)
-date11=$(echo ${dc11} | cut -c -8)
-sdate11=$(echo ${dc11} | cut -c 3-8)
-cycle11=$(echo ${dc11} | cut -c 9,10)
-# DEFINE 12 CYCLES AGO
-dc12=$($NDATE -72 ${PDY}${cyc} | cut -c -10)
-date12=$(echo ${dc12} | cut -c -8)
-sdate12=$(echo ${dc12} | cut -c 3-8)
-cycle12=$(echo ${dc12} | cut -c 9,10)
-# DEFINE 13 CYCLES AGO
-dc13=$($NDATE -78 ${PDY}${cyc} | cut -c -10)
-date13=$(echo ${dc13} | cut -c -8)
-sdate13=$(echo ${dc13} | cut -c 3-8)
-cycle13=$(echo ${dc13} | cut -c 9,10)
-# DEFINE 14 CYCLES AGO
-dc14=$($NDATE -84 ${PDY}${cyc} | cut -c -10)
-date14=$(echo ${dc14} | cut -c -8)
-sdate14=$(echo ${dc14} | cut -c 3-8)
-cycle14=$(echo ${dc14} | cut -c 9,10)
-# DEFINE 15 CYCLES AGO
-dc15=$($NDATE -90 ${PDY}${cyc} | cut -c -10)
-date15=$(echo ${dc15} | cut -c -8)
-sdate15=$(echo ${dc15} | cut -c 3-8)
-cycle15=$(echo ${dc15} | cut -c 9,10)
-# DEFINE 16 CYCLES AGO
-dc16=$($NDATE -96 ${PDY}${cyc} | cut -c -10)
-date16=$(echo ${dc16} | cut -c -8)
-sdate16=$(echo ${dc16} | cut -c 3-8)
-cycle16=$(echo ${dc16} | cut -c 9,10)
-# DEFINE 17 CYCLES AGO
-dc17=$($NDATE -102 ${PDY}${cyc} | cut -c -10)
-date17=$(echo ${dc17} | cut -c -8)
-sdate17=$(echo ${dc17} | cut -c 3-8)
-cycle17=$(echo ${dc17} | cut -c 9,10)
-# DEFINE 18 CYCLES AGO
-dc18=$($NDATE -108 ${PDY}${cyc} | cut -c -10)
-date18=$(echo ${dc18} | cut -c -8)
-sdate18=$(echo ${dc18} | cut -c 3-8)
-cycle18=$(echo ${dc18} | cut -c 9,10)
-# DEFINE 19 CYCLES AGO
-dc19=$($NDATE -114 ${PDY}${cyc} | cut -c -10)
-date19=$(echo ${dc19} | cut -c -8)
-sdate19=$(echo ${dc19} | cut -c 3-8)
-cycle19=$(echo ${dc19} | cut -c 9,10)
-# DEFINE 20 CYCLES AGO
-dc20=$($NDATE -120 ${PDY}${cyc} | cut -c -10)
-date20=$(echo ${dc20} | cut -c -8)
-sdate20=$(echo ${dc20} | cut -c 3-8)
-cycle20=$(echo ${dc20} | cut -c 9,10)
-# DEFINE 21 CYCLES AGO
-dc21=$($NDATE -126 ${PDY}${cyc} | cut -c -10)
-date21=$(echo ${dc21} | cut -c -8)
-sdate21=$(echo ${dc21} | cut -c 3-8)
-cycle21=$(echo ${dc21} | cut -c 9,10)

 # SET CURRENT CYCLE AS THE VERIFICATION GRIDDED FILE.
-vergrid="F-${MDL} | ${PDY2}/${cyc}00"
+vergrid="F-${MDL} | ${PDY:2}/${cyc}00"
 fcsthr="f00"

 # SET WHAT RUNS TO COMPARE AGAINST BASED ON MODEL CYCLE TIME.
-if [ ${cyc} -eq 00 ] ; then
-    verdays="${dc1} ${dc2} ${dc3} ${dc4} ${dc5} ${dc6} ${dc7} ${dc8} ${dc9} ${dc10} ${dc11} ${dc12} ${dc13} ${dc14} ${dc16} ${dc18} ${dc20}"
-elif [ ${cyc} -eq 12 ] ; then
-    verdays="${dc1} ${dc2} ${dc3} ${dc4} ${dc5} ${dc6} ${dc7} ${dc8} ${dc9} ${dc10} ${dc11} ${dc12} ${dc13} ${dc14} ${dc16} ${dc18} ${dc20}"
-else
-    verdays="${dc1} ${dc2} ${dc3} ${dc4} ${dc5} ${dc6} ${dc7} ${dc8} ${dc9} ${dc10} ${dc11} ${dc12} ${dc13} ${dc14} ${dc15} ${dc17} ${dc19} ${dc21}"
-fi
+# seq won't give us any splitting problems, ignore warnings
+# shellcheck disable=SC2207,SC2312
+case ${cyc} in
+    00 | 12) IFS=$'\n' lookbacks=($(seq 6 6 84) $(seq 96 12 120)) ;;
+    06 | 18) IFS=$'\n' lookbacks=($(seq 6 6 84) $(seq 90 12 126)) ;;
+    *)
+        echo "FATAL ERROR: Invalid cycle ${cyc} passed to ${BASH_SOURCE[0]}"
+        exit 100
+        ;;
+esac

 #GENERATING THE METAFILES.
 MDL2="GFSHPC"
-for verday in ${verdays}
-    do
-    cominday=$(echo ${verday} | cut -c -8)
-    #XXW export HPCGFS=$COMROOT/nawips/prod/${mdl}.${cominday}
-    # BV export HPCGFS=$COMROOT/nawips/${envir}/${mdl}.${cominday}
-    export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cyc}/${COMPONENT}/gempak
+for lookback in "${lookbacks[@]}"; do
+    init_time="$(date --utc +%Y%m%d%H -d "${PDY} ${cyc} - ${lookback} hours")"
+    init_PDY=${init_time:0:8}
+    init_cyc=${init_time:8:2}
+
+    if (( init_time <= ${SDATE:-0} )); then
+        echo "Skipping ver for ${init_time} because it is before the experiment began"
+        if (( lookback == "${lookbacks[0]}" )); then
+            echo "First forecast time, no metafile produced"
+            exit 0
+        else
+            break
+        fi
+    fi

-    if [ ${verday} -eq ${dc1} ] ; then
-        dgdattim=f006
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle1}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate1}/${cycle1}00"
-    elif [ ${verday} -eq ${dc2} ] ; then
-        dgdattim=f012
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle2}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate2}/${cycle2}00"
-    elif [ ${verday} -eq ${dc3} ] ; then
-        dgdattim=f018
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle3}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate3}/${cycle3}00"
-    elif [ ${verday} -eq ${dc4} ] ; then
-        dgdattim=f024
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle4}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate4}/${cycle4}00"
-    elif [ ${verday} -eq ${dc5} ] ; then
-        dgdattim=f030
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle5}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate5}/${cycle5}00"
-    elif [ ${verday} -eq ${dc6} ] ; then
-        dgdattim=f036
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle6}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate6}/${cycle6}00"
-    elif [ ${verday} -eq ${dc7} ] ; then
-        dgdattim=f042
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle7}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate7}/${cycle7}00"
-    elif [ ${verday} -eq ${dc8} ] ; then
-        dgdattim=f048
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle8}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate8}/${cycle8}00"
-    elif [ ${verday} -eq ${dc9} ] ; then
-        dgdattim=f054
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle9}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate9}/${cycle9}00"
-    elif [ ${verday} -eq ${dc10} ] ; then
-        dgdattim=f060
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle10}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate10}/${cycle10}00"
-    elif [ ${verday} -eq ${dc11} ] ; then
-        dgdattim=f066
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle11}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate11}/${cycle11}00"
-    elif [ ${verday} -eq ${dc12} ] ; then
-        dgdattim=f072
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle12}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate12}/${cycle12}00"
-    elif [ ${verday} -eq ${dc13} ] ; then
-        dgdattim=f078
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle13}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate13}/${cycle13}00"
-    elif [ ${verday} -eq ${dc14} ] ; then
-        dgdattim=f084
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle14}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate14}/${cycle14}00"
-    elif [ ${verday} -eq ${dc15} ] ; then
-        dgdattim=f090
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle15}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate15}/${cycle15}00"
-    elif [ ${verday} -eq ${dc16} ] ; then
-        dgdattim=f096
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle16}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate16}/${cycle16}00"
-    elif [ ${verday} -eq ${dc17} ] ; then
-        dgdattim=f102
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle17}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate17}/${cycle17}00"
-    elif [ ${verday} -eq ${dc18} ] ; then
-        dgdattim=f108
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle18}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate18}/${cycle18}00"
-    elif [ ${verday} -eq ${dc19} ] ; then
-        dgdattim=f114
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle19}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate19}/${cycle19}00"
-    elif [ ${verday} -eq ${dc20} ] ; then
-        dgdattim=f120
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle20}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate20}/${cycle20}00"
-    elif [ ${verday} -eq ${dc21} ] ; then
-        dgdattim=f126
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle21}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate21}/${cycle21}00"
+    dgdattim="f$(printf "%03g" "${lookback}")"
+
+    # Create symlink in DATA to sidestep gempak path limits
+    HPCGFS="${RUN}.${init_time}"
+    if [[ ! -L ${HPCGFS} ]]; then
+        YMD=${init_PDY} HH=${init_cyc} GRID="1p00" generate_com source_dir:COM_ATMOS_GEMPAK_TMPL
+        ln -sf "${source_dir}" "${HPCGFS}"
     fi

-# 500 MB HEIGHT METAFILE
-export pgm=gdplot2_nc;. prep_step; startmsg
+    grid="F-${MDL2} | ${init_PDY}/${init_cyc}00"
+
+    # 500 MB HEIGHT METAFILE
+    export pgm=gdplot2_nc;. prep_step

-$GEMEXE/gdplot2_nc << EOFplt
-PROJ = MER
+    "${GEMEXE}/gdplot2_nc" << EOFplt
+PROJ = MER
 GAREA = 15.0;-100.0;70.0;20.0
 map = 1//2
 clear = yes
@@ -294,11 +126,11 @@ r
 gdfile = ${vergrid}
 gdattim = ${fcsthr}
 gdpfun = mag(kntv(wnd))
-glevel = 9950 
-gvcord = sgma 
-scale = 0 
-cint = 35;50;65 
-line = 6/1/3 
+glevel = 9950
+gvcord = sgma
+scale = 0
+cint = 35;50;65
+line = 6/1/3
 title = 6/-2/~ GFS WIND ISOTACHS 30m|~WIND DIFF
 clear = yes
 r
@@ -306,29 +138,35 @@ r
 gdfile = ${grid}
 gdattim = ${dgdattim}
 line = 5/1/3
-contur = 0 
-title = 5/-1/~ GFS WIND ISOTACHS 30m 
+contur = 0
+title = 5/-1/~ GFS WIND ISOTACHS 30m
 clear = no
 r
 ex
 EOFplt
-export err=$?;err_chk
+    export err=$?;err_chk
+    if (( err != 0 )); then
+        echo "FATAL ERROR: Failed to create gempak meta file ${metaname}"
+        exit $(( err + 100 ))
+    fi
 done

-####################################################
+#####################################################
 # GEMPAK DOES NOT ALWAYS HAVE A NON ZERO RETURN CODE
 # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK
 # FOR THIS CASE HERE.
 #####################################################
-ls -l ${metaname}
-export err=$?;export pgm="GEMPAK CHECK FILE";err_chk
-if [ $SENDCOM = "YES" ] ; then
-    mv ${metaname} ${COMOUT}/${mdl}ver_${PDY}_${cyc}_na_mar
-    if [ $SENDDBN = "YES" ] ; then
-        ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job ${COMOUT}/${mdl}ver_${PDY}_${cyc}_na_mar
-    fi
+if [[ ! -s "${metaname}" ]] &> /dev/null; then
+    echo "FATAL ERROR: Failed to create gempak meta file ${metaname}"
+    exit 100
+fi
+
+mv "${metaname}" "${COM_ATMOS_GEMPAK_META}/${mdl}ver_${PDY}_${cyc}_na_mar"
+if [[ "${SENDDBN}" == "YES" ]] ; then
+    "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \
+        "${COM_ATMOS_GEMPAK_META}/${mdl}ver_${PDY}_${cyc}_na_mar"
 fi

 exit
diff --git a/gempak/ush/gfs_meta_opc_np_ver b/gempak/ush/gfs_meta_opc_np_ver
index 5cb9fba3c9..45a6824fa8 100755
--- a/gempak/ush/gfs_meta_opc_np_ver
+++ b/gempak/ush/gfs_meta_opc_np_ver
@@ -1,249 +1,79 @@
-#!/bin/sh
+#! /usr/bin/env bash
 #
 # Metafile Script : gfs_meta_opc_np_ver
 #
-# Log :
-# J. Carr/HPC 12/08/2004 Submitted into production.
-#
 # Set up Local Variables
 #
-set -x
+
+source "${HOMEgfs}/ush/preamble.sh"
+
+mkdir -p -m 775 "${DATA}/OPC_NP_VER_F${fend}"
+cd "${DATA}/OPC_NP_VER_F${fend}" || exit 2
+cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl
+
 #
-export PS4='OPC_NP_VER_F${fend}:$SECONDS + '
-mkdir -p -m 775 $DATA/OPC_NP_VER_F${fend}
-cd $DATA/OPC_NP_VER_F${fend}
-cp $FIXgempak/datatype.tbl datatype.tbl
+# Link data into DATA to sidestep gempak path limits
+# TODO: Replace this
 #
-export COMPONENT=${COMPONENT:-atmos}
+export COMIN="${RUN}.${PDY}${cyc}"
+if [[ ! -L ${COMIN} ]]; then
+    ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}"
+fi

 mdl=gfs
 MDL="GFS"
-metatype="ver"
 metaname="gfsver_mpc_np_${cyc}.meta"
 device="nc | ${metaname}"
-PDY2=$(echo $PDY | cut -c3-)
-#
-#
-# DEFINE 1 CYCLE AGO
-dc1=$($NDATE -06 ${PDY}${cyc} | cut -c -10)
-date1=$(echo ${dc1} | cut -c -8)
-sdate1=$(echo ${dc1} | cut -c 3-8)
-cycle1=$(echo ${dc1} | cut -c 9,10)
-# DEFINE 2 CYCLES AGO
-dc2=$($NDATE -12 ${PDY}${cyc} | cut -c -10)
-date2=$(echo ${dc2} | cut -c -8)
-sdate2=$(echo ${dc2} | cut -c 3-8)
-cycle2=$(echo ${dc2} | cut -c 9,10)
-# DEFINE 3 CYCLES AGO
-dc3=$($NDATE -18 ${PDY}${cyc} | cut -c -10)
-date3=$(echo ${dc3} | cut -c -8)
-sdate3=$(echo ${dc3} | cut -c 3-8)
-cycle3=$(echo ${dc3} | cut -c 9,10)
-# DEFINE 4 CYCLES AGO
-dc4=$($NDATE -24 ${PDY}${cyc} | cut -c -10)
-date4=$(echo ${dc4} | cut -c -8)
-sdate4=$(echo ${dc4} | cut -c 3-8)
-cycle4=$(echo ${dc4} | cut -c 9,10)
-# DEFINE 5 CYCLES AGO
-dc5=$($NDATE -30 ${PDY}${cyc} | cut -c -10)
-date5=$(echo ${dc5} | cut -c -8)
-sdate5=$(echo ${dc5} | cut -c 3-8)
-cycle5=$(echo ${dc5} | cut -c 9,10)
-# DEFINE 6 CYCLES AGO
-dc6=$($NDATE -36 ${PDY}${cyc} | cut -c -10)
-date6=$(echo ${dc6} | cut -c -8)
-sdate6=$(echo ${dc6} | cut -c 3-8)
-cycle6=$(echo ${dc6} | cut -c 9,10)
-# DEFINE 7 CYCLES AGO
-dc7=$($NDATE -42 ${PDY}${cyc} | cut -c -10)
-date7=$(echo ${dc7} | cut -c -8)
-sdate7=$(echo ${dc7} | cut -c 3-8)
-cycle7=$(echo ${dc7} | cut -c 9,10)
-# DEFINE 8 CYCLES AGO
-dc8=$($NDATE -48 ${PDY}${cyc} | cut -c -10)
-date8=$(echo ${dc8} | cut -c -8)
-sdate8=$(echo ${dc8} | cut -c 3-8)
-cycle8=$(echo ${dc8} | cut -c 9,10)
-# DEFINE 9 CYCLES AGO
-dc9=$($NDATE -54 ${PDY}${cyc} | cut -c -10)
-date9=$(echo ${dc9} | cut -c -8)
-sdate9=$(echo ${dc9} | cut -c 3-8)
-cycle9=$(echo ${dc9} | cut -c 9,10)
-# DEFINE 10 CYCLES AGO
-dc10=$($NDATE -60 ${PDY}${cyc} | cut -c -10)
-date10=$(echo ${dc10} | cut -c -8)
-sdate10=$(echo ${dc10} | cut -c 3-8)
-cycle10=$(echo ${dc10} | cut -c 9,10)
-# DEFINE 11 CYCLES AGO
-dc11=$($NDATE -66 ${PDY}${cyc} | cut -c -10)
-date11=$(echo ${dc11} | cut -c -8)
-sdate11=$(echo ${dc11} | cut -c 3-8)
-cycle11=$(echo ${dc11} | cut -c 9,10)
-# DEFINE 12 CYCLES AGO
-dc12=$($NDATE -72 ${PDY}${cyc} | cut -c -10)
-date12=$(echo ${dc12} | cut -c -8)
-sdate12=$(echo ${dc12} | cut -c 3-8)
-cycle12=$(echo ${dc12} | cut -c 9,10)
-# DEFINE 13 CYCLES AGO
-dc13=$($NDATE -78 ${PDY}${cyc} | cut -c -10)
-date13=$(echo ${dc13} | cut -c -8)
-sdate13=$(echo ${dc13} | cut -c 3-8)
-cycle13=$(echo ${dc13} | cut -c 9,10)
-# DEFINE 14 CYCLES AGO
-dc14=$($NDATE -84 ${PDY}${cyc} | cut -c -10)
-date14=$(echo ${dc14} | cut -c -8)
-sdate14=$(echo ${dc14} | cut -c 3-8)
-cycle14=$(echo ${dc14} | cut -c 9,10)
-# DEFINE 15 CYCLES AGO
-dc15=$($NDATE -90 ${PDY}${cyc} | cut -c -10)
-date15=$(echo ${dc15} | cut -c -8)
-sdate15=$(echo ${dc15} | cut -c 3-8)
-cycle15=$(echo ${dc15} | cut -c 9,10)
-# DEFINE 16 CYCLES AGO
-dc16=$($NDATE -96 ${PDY}${cyc} | cut -c -10)
-date16=$(echo ${dc16} | cut -c -8)
-sdate16=$(echo ${dc16} | cut -c 3-8)
-cycle16=$(echo ${dc16} | cut -c 9,10)
-# DEFINE 17 CYCLES AGO
-dc17=$($NDATE -102 ${PDY}${cyc} | cut -c -10)
-date17=$(echo ${dc17} | cut -c -8)
-sdate17=$(echo ${dc17} | cut -c 3-8)
-cycle17=$(echo ${dc17} | cut -c 9,10)
-# DEFINE 18 CYCLES AGO
-dc18=$($NDATE -108 ${PDY}${cyc} | cut -c -10)
-date18=$(echo ${dc18} | cut -c -8)
-sdate18=$(echo ${dc18} | cut -c 3-8)
-cycle18=$(echo ${dc18} | cut -c 9,10)
-# DEFINE 19 CYCLES AGO
-dc19=$($NDATE -114 ${PDY}${cyc} | cut -c -10)
-date19=$(echo ${dc19} | cut -c -8)
-sdate19=$(echo ${dc19} | cut -c 3-8)
-cycle19=$(echo ${dc19} | cut -c 9,10)
-# DEFINE 20 CYCLES AGO
-dc20=$($NDATE -120 ${PDY}${cyc} | cut -c -10)
-date20=$(echo ${dc20} | cut -c -8)
-sdate20=$(echo ${dc20} | cut -c 3-8)
-cycle20=$(echo ${dc20} | cut -c 9,10)
-# DEFINE 21 CYCLES AGO
-dc21=$($NDATE -126 ${PDY}${cyc} | cut -c -10)
-date21=$(echo ${dc21} | cut -c -8)
-sdate21=$(echo ${dc21} | cut -c 3-8)
-cycle21=$(echo ${dc21} | cut -c 9,10)

 # SET CURRENT CYCLE AS THE VERIFICATION GRIDDED FILE.
-vergrid="F-${MDL} | ${PDY2}/${cyc}00"
+vergrid="F-${MDL} | ${PDY:2}/${cyc}00"
 fcsthr="f00"

 # SET WHAT RUNS TO COMPARE AGAINST BASED ON MODEL CYCLE TIME.
-if [ ${cyc} -eq 00 ] ; then
-    verdays="${dc1} ${dc2} ${dc3} ${dc4} ${dc5} ${dc6} ${dc7} ${dc8} ${dc9} ${dc10} ${dc11} ${dc12} ${dc13} ${dc14} ${dc16} ${dc18} ${dc20}"
-elif [ ${cyc} -eq 12 ] ; then
-    verdays="${dc1} ${dc2} ${dc3} ${dc4} ${dc5} ${dc6} ${dc7} ${dc8} ${dc9} ${dc10} ${dc11} ${dc12} ${dc13} ${dc14} ${dc16} ${dc18} ${dc20}"
-else
-    verdays="${dc1} ${dc2} ${dc3} ${dc4} ${dc5} ${dc6} ${dc7} ${dc8} ${dc9} ${dc10} ${dc11} ${dc12} ${dc13} ${dc14} ${dc15} ${dc17} ${dc19} ${dc21}"
-fi
+# seq won't give us any splitting problems, ignore warnings
+# shellcheck disable=SC2207,SC2312
+case ${cyc} in
+    00 | 12) IFS=$'\n' lookbacks=($(seq 6 6 84) $(seq 96 12 120)) ;;
+    06 | 18) IFS=$'\n' lookbacks=($(seq 6 6 84) $(seq 90 12 126)) ;;
+    *)
+        echo "FATAL ERROR: Invalid cycle ${cyc} passed to ${BASH_SOURCE[0]}"
+        exit 100
+        ;;
+esac

 #GENERATING THE METAFILES.
 MDL2="GFSHPC"
-for verday in ${verdays}
-    do
-    cominday=$(echo ${verday} | cut -c -8)
-    #XXW export HPCGFS=$COMROOT/nawips/prod/${mdl}.${cominday}
-    # BV export HPCGFS=$COMROOT/nawips/${envir}/${mdl}.${cominday}
-    export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cyc}/${COMPONENT}/gempak
+for lookback in "${lookbacks[@]}"; do
+    init_time="$(date --utc +%Y%m%d%H -d "${PDY} ${cyc} - ${lookback} hours")"
+    init_PDY=${init_time:0:8}
+    init_cyc=${init_time:8:2}
+
+    if (( init_time <= ${SDATE:-0} )); then
+        echo "Skipping ver for ${init_time} because it is before the experiment began"
+        if (( lookback == "${lookbacks[0]}" )); then
+            echo "First forecast time, no metafile produced"
+            exit 0
+        else
+            break
+        fi
+    fi

-    if [ ${verday} -eq ${dc1} ] ; then
-        dgdattim=f006
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle1}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate1}/${cycle1}00"
-    elif [ ${verday} -eq ${dc2} ] ; then
-        dgdattim=f012
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle2}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate2}/${cycle2}00"
-    elif [ ${verday} -eq ${dc3} ] ; then
-        dgdattim=f018
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle3}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate3}/${cycle3}00"
-    elif [ ${verday} -eq ${dc4} ] ; then
-        dgdattim=f024
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle4}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate4}/${cycle4}00"
-    elif [ ${verday} -eq ${dc5} ] ; then
-        dgdattim=f030
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle5}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate5}/${cycle5}00"
-    elif [ ${verday} -eq ${dc6} ] ; then
-        dgdattim=f036
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle6}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate6}/${cycle6}00"
-    elif [ ${verday} -eq ${dc7} ] ; then
-        dgdattim=f042
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle7}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate7}/${cycle7}00"
-    elif [ ${verday} -eq ${dc8} ] ; then
-        dgdattim=f048
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle8}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate8}/${cycle8}00"
-    elif [ ${verday} -eq ${dc9} ] ; then
-        dgdattim=f054
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle9}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate9}/${cycle9}00"
-    elif [ ${verday} -eq ${dc10} ] ; then
-        dgdattim=f060
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle10}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate10}/${cycle10}00"
-    elif [ ${verday} -eq ${dc11} ] ; then
-        dgdattim=f066
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle11}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate11}/${cycle11}00"
-    elif [ ${verday} -eq ${dc12} ] ; then
-        dgdattim=f072
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle12}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate12}/${cycle12}00"
-    elif [ ${verday} -eq ${dc13} ] ; then
-        dgdattim=f078
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle13}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate13}/${cycle13}00"
-    elif [ ${verday} -eq ${dc14} ] ; then
-        dgdattim=f084
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle14}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate14}/${cycle14}00"
-    elif [ ${verday} -eq ${dc15} ] ; then
-        dgdattim=f090
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle15}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate15}/${cycle15}00"
-    elif [ ${verday} -eq ${dc16} ] ; then
-        dgdattim=f096
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle16}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate16}/${cycle16}00"
-    elif [ ${verday} -eq ${dc17} ] ; then
-        dgdattim=f102
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle17}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate17}/${cycle17}00"
-    elif [ ${verday} -eq ${dc18} ] ; then
-        dgdattim=f108
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle18}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate18}/${cycle18}00"
-    elif [ ${verday} -eq ${dc19} ] ; then
-        dgdattim=f114
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle19}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate19}/${cycle19}00"
-    elif [ ${verday} -eq ${dc20} ] ; then
-        dgdattim=f120
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle20}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate20}/${cycle20}00"
-    elif [ ${verday} -eq ${dc21} ] ; then
-        dgdattim=f126
-        export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle21}/${COMPONENT}/gempak
-        grid="F-${MDL2} | ${sdate21}/${cycle21}00"
+    dgdattim="f$(printf "%03g" "${lookback}")"
+
+    # Create symlink in DATA to sidestep gempak path limits
+    HPCGFS="${RUN}.${init_time}"
+    if [[ ! -L "${HPCGFS}" ]]; then
+        YMD=${init_PDY} HH=${init_cyc} GRID="1p00" generate_com source_dir:COM_ATMOS_GEMPAK_TMPL
+        ln -sf "${source_dir}" "${HPCGFS}"
     fi

-# 500 MB HEIGHT METAFILE
+    grid="F-${MDL2} | ${init_PDY}/${init_cyc}00"

-export pgm=gdplot2_nc;. prep_step; startmsg
+    # 500 MB HEIGHT METAFILE
+    export pgm=gdplot2_nc;. prep_step

-$GEMEXE/gdplot2_nc << EOFplt
-PROJ = MER
+    "${GEMEXE}/gdplot2_nc" << EOFplt
+PROJ = MER
 GAREA = 5.0;120.0;70.0;-105.0
 map = 1//2
 clear = yes
@@ -296,11 +126,11 @@ r
 gdfile = ${vergrid}
 gdattim = ${fcsthr}
 gdpfun = mag(kntv(wnd))
-glevel = 9950 
-gvcord = sgma 
-scale = 0 
-cint = 35;50;65 
-line = 6/1/3 
+glevel = 9950
+gvcord = sgma
+scale = 0
+cint = 35;50;65
+line = 6/1/3
 title = 6/-2/~ GFS WIND ISOTACHS 30m|~WIND DIFF
 clear = yes
 r
@@ -308,29 +138,34 @@ r
 gdfile = ${grid}
 gdattim = ${dgdattim}
 line = 5/1/3
-contur = 0 
-title = 5/-1/~ GFS WIND ISOTACHS 30m 
+contur = 0
+title = 5/-1/~ GFS WIND ISOTACHS 30m
 clear = no
 r
 ex
 EOFplt
-export err=$?;err_chk
-
+    export err=$?;err_chk
+    if (( err != 0 )); then
+        echo "FATAL ERROR: Failed to create gempak meta file ${metaname}"
+        exit $(( err + 100 ))
+    fi
 done

-####################################################
+#####################################################
 # GEMPAK DOES NOT ALWAYS HAVE A NON ZERO RETURN CODE
 # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK
 # FOR THIS CASE HERE.
 #####################################################
-ls -l ${metaname}
-export err=$?;export pgm="GEMPAK CHECK FILE";err_chk
-if [ $SENDCOM = "YES" ] ; then
-    mv ${metaname} ${COMOUT}/${mdl}ver_${PDY}_${cyc}_np_mar
-    if [ $SENDDBN = "YES" ] ; then
-        ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job ${COMOUT}/${mdl}ver_${PDY}_${cyc}_np_mar
-    fi
+if [[ ! -s "${metaname}" ]] &> /dev/null; then
+    echo "FATAL ERROR: Failed to create gempak meta file ${metaname}"
+    exit 100
+fi
+
+mv "${metaname}" "${COM_ATMOS_GEMPAK_META}/${mdl}ver_${PDY}_${cyc}_np_mar"
+if [[ "${SENDDBN}" == "YES" ]] ; then
+    "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \
+        "${COM_ATMOS_GEMPAK_META}/${mdl}ver_${PDY}_${cyc}_np_mar"
 fi

 exit
diff --git a/gempak/ush/gfs_meta_precip.sh b/gempak/ush/gfs_meta_precip.sh
index cf3db9cbae..940a450106 100755
--- a/gempak/ush/gfs_meta_precip.sh
+++ b/gempak/ush/gfs_meta_precip.sh
@@ -1,17 +1,24 @@
-#! /bin/sh
+#! /usr/bin/env bash
 #
 # Metafile Script : gfs_meta_precip.sh
 #
-# Log :
-# M. Klein/WPC 01/29/2014 Created. Adapted from gfs_meta_qpf.sh
-#
 # Set up Local Variables
 #
-set -x
-export PS4='qpf:$SECONDS + '
-mkdir -p -m 775 $DATA/precip
-cd $DATA/precip
-cp $FIXgempak/datatype.tbl datatype.tbl
+
+source "${HOMEgfs}/ush/preamble.sh"
+
+mkdir -p -m 775 "${DATA}/precip"
+cd "${DATA}/precip" || exit 2
+cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl
+
+#
+# Link data into DATA to sidestep gempak path limits
+# TODO: Replace this
+#
+export COMIN="${RUN}.${PDY}${cyc}"
+if [[ ! -L ${COMIN} ]]; then
+    ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}"
+fi

 #
 # Set model and metafile naming conventions
@@ -21,13 +28,12 @@ MDL=GFS
 metatype="precip"
 metaname="${mdl}_${metatype}_${cyc}.meta"
 device="nc | ${metaname}"
-PDY2=$(echo $PDY | cut -c3-)

 #
-# Set range of forecast hours. GFS is available every 6 hours through F192, then 
+# Set range of forecast hours. GFS is available every 6 hours through F192, then
 # every 12 hours after. The request was to have the fields go to F216, so will run
 # the gdplot for the ranges set below, then for the 12-hour and greater QPF periods,
-# run the gdplot2 from F204-F216. 6-hour QPF will stop at F192. 
+# run the gdplot2 from F204-F216. 6-hour QPF will stop at F192.
 #

 gdatpcpn06="F006-F192-06"
@@ -41,7 +47,6 @@ gdatpcpn96="F096-F192-06"
 gdatpcpn120="F120-F192-06"
 gdatpcpn144="F144-F192-06"
 gdatpcpn168="F168-F192-06"
-run="r"

 #
 # For CPC - Day 6-10 and Day 8-14 QPFs using a North American regional display
@@ -49,49 +54,50 @@ run="r"
 garea_cpc="17.529;-129.296;53.771;-22.374"
 proj_cpc="str/90;-105;0"

-# Notes -- 
+# Notes --
 # 00Z cycle - 8-14 Day -- No F198 file, so started at F204. Makes a P156I, not P162I.
 # 06Z cycle - 6-10 Day -- No F258 file, so ended at F252. Makes a P108I, not P114I.
-# - 8-14 Day -- No F354 file, so ended at F348. Makes a P156I, not P162I. 
+# - 8-14 Day -- No F354 file, so ended at F348. Makes a P156I, not P162I.
 # 12Z cycle - 8-14 Day -- No F210 file, so started at F216. Makes a P156I, not P162I.
 # 18Z cycle - 6-10 Day -- No F270 file, so ended at F264. Makes a P108I, not P114I.
 # - 8-14 Day -- No F366 file, so ended at F360. Makes a P156I, not P162I.
-gdattim_6to10="" -gdattim_8to14="" -gdpfun_6to10="p114i" -gdpfun_8to14="p162i" -if [ ${cyc} = "00" ] ; then - gdattim_6to10="${PDY2}/${cyc}00F264" - gdattim_8to14="${PDY2}/${cyc}00F360" - gdpfun_6to10="p114i" - gdpfun_8to14="p156i" -elif [ ${cyc} = "06" ] ; then - #gdattim_6to10="${PDY2}/${cyc}00F258" - #gdattim_8to14="${PDY2}/${cyc}00F354" - gdattim_6to10="${PDY2}/${cyc}00F252" - gdattim_8to14="${PDY2}/${cyc}00F348" - gdpfun_6to10="p108i" - gdpfun_8to14="p156i" -elif [ ${cyc} = "12" ] ; then - gdattim_6to10="${PDY2}/${cyc}00F276" - gdattim_8to14="${PDY2}/${cyc}00F372" - gdpfun_6to10="p114i" - gdpfun_8to14="p156i" -elif [ ${cyc} = "18" ] ; then - #gdattim_6to10="${PDY2}/${cyc}00F270" - #gdattim_8to14="${PDY2}/${cyc}00F366" - gdattim_6to10="${PDY2}/${cyc}00F264" - gdattim_8to14="${PDY2}/${cyc}00F360" - gdpfun_6to10="p108i" - gdpfun_8to14="p156i" -fi - -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOFplt -gdfile = F-${MDL} | ${PDY2}/${cyc}00 +case ${cyc} in + 00) + gdattim_6to10="${PDY:2}/${cyc}00F264" + gdattim_8to14="${PDY:2}/${cyc}00F360" + gdpfun_6to10="p114i" + gdpfun_8to14="p156i" + ;; + 06) + gdattim_6to10="${PDY:2}/${cyc}00F252" + gdattim_8to14="${PDY:2}/${cyc}00F348" + gdpfun_6to10="p108i" + gdpfun_8to14="p156i" + ;; + 12) + gdattim_6to10="${PDY:2}/${cyc}00F276" + gdattim_8to14="${PDY:2}/${cyc}00F372" + gdpfun_6to10="p114i" + gdpfun_8to14="p156i" + ;; + 18) + gdattim_6to10="${PDY:2}/${cyc}00F264" + gdattim_8to14="${PDY:2}/${cyc}00F360" + gdpfun_6to10="p108i" + gdpfun_8to14="p156i" + ;; + *) + echo "FATAL ERROR: Invalid cycle ${cyc} passed to ${BASH_SOURCE[0]}" + exit 100 + ;; +esac + +export pgm=gdplot2_nc;. prep_step +"${GEMEXE}/gdplot2_nc" << EOFplt +gdfile = F-${MDL} | ${PDY:2}/${cyc}00 garea = us -proj = +proj = map = 1/1/2/yes device = ${device} clear = yes @@ -131,15 +137,15 @@ scale = 0 gdpfun = p06i type = f cint = -line = +line = hilo = 31;0/x#2/.01-20//50;50/y hlsym = 1.5 -wind = +wind = title = 1/-2/~ ?
${MDL} 6-HOUR TOTAL PCPN|~6-HR TOTAL PCPN!0 l r -gdattim = ${gdatpcpn12} +gdattim = ${gdatpcpn12} gdpfun = p12i title = 1/-2/~ ? ${MDL} 12-HOUR TOTAL PCPN|~12-HR TOTAL PCPN!0 l @@ -149,7 +155,7 @@ gdattim = F204-F216-12 l r -gdattim = ${gdatpcpn24} +gdattim = ${gdatpcpn24} gdpfun = p24i title = 1/-2/~ ? ${MDL} 24-HOUR TOTAL PCPN|~24-HR TOTAL PCPN!0 l @@ -177,7 +183,7 @@ r gdattim = F204-F216-12 r -gdattim = ${gdatpcpn72} +gdattim = ${gdatpcpn72} gdpfun = p72i title = 1/-2/~ ? ${MDL} 72 HOUR TOTAL PCPN|~72-HR TOTAL PCPN!0 r @@ -185,7 +191,7 @@ r gdattim = F204-F216-12 r -gdattim = ${gdatpcpn84} +gdattim = ${gdatpcpn84} gdpfun = p84i title = 1/-2/~ ? ${MDL} 84 HOUR TOTAL PCPN|~84-HR TOTAL PCPN!0 r @@ -249,20 +255,20 @@ export err=$?;err_chk # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK # FOR THIS CASE HERE. ##################################################### -ls -l $metaname -export err=$?;export pgm="GEMPAK CHECK FILE";err_chk - -if [ $SENDCOM = "YES" ] ; then - mv ${metaname} ${COMOUT}/${mdl}_${PDY}_${cyc}_us_${metatype} - if [ $SENDDBN = "YES" ] ; then - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/${mdl}_${PDY}_${cyc}_us_${metatype} - if [ $DBN_ALERT_TYPE = "GFS_METAFILE_LAST" ] ; then +if (( err != 0 )) || [[ ! 
-s "${metaname}" ]] &> /dev/null; then + echo "FATAL ERROR: Failed to create gempak meta file ${metaname}" + exit $(( err + 100 )) +fi + +mv "${metaname}" "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_us_${metatype}" +if [[ "${SENDDBN}" == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_us_${metatype}" + if [[ ${DBN_ALERT_TYPE} == "GFS_METAFILE_LAST" ]] ; then DBN_ALERT_TYPE=GFS_METAFILE - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/${mdl}_${PDY}_${cyc}_us_${metatype} - fi - fi + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_us_${metatype}" + fi fi exit diff --git a/gempak/ush/gfs_meta_qpf.sh b/gempak/ush/gfs_meta_qpf.sh index 49ca0d8bd4..d4f6c04310 100755 --- a/gempak/ush/gfs_meta_qpf.sh +++ b/gempak/ush/gfs_meta_qpf.sh @@ -1,58 +1,48 @@ -#! /bin/sh +#! /usr/bin/env bash # # Metafile Script : gfs_meta_qpf.sh # -# Log : -# D.W.Plummer/NCEP 2/97 Add log header -# J. Carr/HPC 7/7/97 Changed script so that it uses gdplot2 instead of gdplot -# J. Carr/HPC 8/5/98 Removed pcpn potential product and changed map to a medium resolution -# J. Carr/HPC 2/2/99 Changed skip to 0 -# J. Carr/HPC 2/10/99 Changed type c/f to just f for pcpn -# J. Carr/HPC 4/12/99 Added 84-hr time for the gfs. -# J. Carr/HPC 6/99 Added a filter on map -# J. Carr/HPC 2/2001 Edited to run on IBM. -# J. Carr/HPC 5/2001 Added a mn variable for a/b side dbnet root variable. -# J. Carr/HPC 6/2001 Converted to a korn shell prior to delivering script to Production. -# J. Carr/HPC 7/2001 Submitted. -# J. Carr/HPC 11/2004 Changed contur from 1 to a 2. -# Inserted a ? in all title lines. -# Commented out if statement for cycles since this is old code based on when various runs of GFS ran -# out to differing times. -# M. Klein/HPC 02/2010 Run 48-hour QPF out to F216 for medium-range. 
-# # Set up Local Variables # -set -x -export PS4='qpf:$SECONDS + ' -mkdir -p -m 775 $DATA/qpf -cd $DATA/qpf -cp $FIXgempak/datatype.tbl datatype.tbl + +source "${HOMEgfs}/ush/preamble.sh" + +mkdir -p -m 775 "${DATA}/qpf" +cd "${DATA}/qpf" || exit 2 +cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl + +# +# Link data into DATA to sidestep gempak path limits +# TODO: Replace this +# +export COMIN="${RUN}.${PDY}${cyc}" +if [[ ! -L ${COMIN} ]]; then + ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}" +fi mdl=gfs MDL=GFS metatype="qpf" metaname="${mdl}_${metatype}_${cyc}.meta" device="nc | ${metaname}" -PDY2=$(echo $PDY | cut -c3-) - gdat="F000-F126-06" - gdatpcpn06="F006-F126-06" - gdatpcpn12="F012-F126-06" - gdatpcpn24="F024-F126-06" - gdatpcpn48="F048-F216-06" - gdatpcpn60="F060-F126-06" - gdatpcpn72="F072-F126-06" - gdatpcpn84="F084-F126-06" - gdatpcpn96="F096-F126-06" - gdatpcpn120="F120-F126-06" - gdatpcpn126="F126" - run="r" - -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOFplt -gdfile = F-${MDL} | ${PDY2}/${cyc}00 +gdat="F000-F126-06" +gdatpcpn06="F006-F126-06" +gdatpcpn12="F012-F126-06" +gdatpcpn24="F024-F126-06" +gdatpcpn48="F048-F216-06" +gdatpcpn60="F060-F126-06" +gdatpcpn72="F072-F126-06" +gdatpcpn84="F084-F126-06" +gdatpcpn96="F096-F126-06" +gdatpcpn120="F120-F126-06" +run="r" + +export pgm=gdplot2_nc;. 
prep_step +"${GEMEXE}/gdplot2_nc" << EOFplt +gdfile = F-${MDL} | ${PDY:2}/${cyc}00 gdattim = ${gdat} garea = us -proj = +proj = map = 1/1/2/yes device = ${device} clear = yes @@ -63,20 +53,20 @@ latlon = 0 filter = yes glevel = 0 -gvcord = none +gvcord = none skip = 0 scale = 0 gdpfun = sm5s(lft4)!sm5s(lft4) !sm5s(lft4)!kntv(wnd@9950%sgma) type = c/f !c !c !b cint = 2/2 !-10000;0.05 !2/-100/-2 -line = 20/-32/2 !0;5//0;4/0;0!32//2 -fint = -8;-6;-4;-2;0.05;10 +line = 20/-32/2 !0;5//0;4/0;0!32//2 +fint = -8;-6;-4;-2;0.05;10 fline = 2;15;21;22;23;0;24 hilo = 0 !0 hlsym = 1;1//22;22/2;2/hw!0 clrbar = 1/V/LL !0 -wind = bk0 !bk0 !bk0 !bk10/0.8/2/112!bk0 -refvec = +wind = bk0 !bk0 !bk0 !bk10/0.8/2/112!bk0 +refvec = title = 1/-2/~ ? ${MDL} Best LI AND BL WINDS|~BEST LI!0 r @@ -89,7 +79,7 @@ type = c !c/f !c !c !c cint = 0.25/0.25/0.5 !0.25/0.75/6.0!4!3/0/540!3/543/1000 line = 22///2 !32//2/2!6//3!4/5/2!5/5/2 fint = 0 !0.5;1.0;1.5;2.0 -fline = 0 !0;23;22;30;14 +fline = 0 !0;23;22;30;14 hilo = 0 !0!6/H#;L#/1020-1070;900-1012!0 HLSYM = 0 !0!1.5;1.5//22;22/3;3/hw!0 clrbar = 0 !1/V/LL!0!0 @@ -143,7 +133,7 @@ type = c!c/f!b cint = 0.25/0.25/0.5!0.25/0.75/6.0 line = 22///2!32//2/2 fint = !0.5;1.0;1.5;2.0 -fline = !0;23;22;21;2 +fline = !0;23;22;21;2 hilo = 0!0 HLSYM = 0!0 clrbar = 0!1/V/LL @@ -169,7 +159,7 @@ WIND = ! REFVEC = TITLE = 1/-2/~ ? ${MDL} PCPN POTENTIAL (PW X (1000-440 MB RH)) INCHES OF PW|~PCPN POT!0 r - + glevel = 850!850!850 gvcord = pres!pres!pres skip = 0/1;1 @@ -180,11 +170,11 @@ cint = -4;-2;0;2;4!2/6/28!3 line = 3//1!32//1!6//3 fint = 4;8;12;16;20 fline = 0;23;22;30;14;2 -hilo = 0!0!6/H#;L# +hilo = 0!0!6/H#;L# hlsym = 0!0!1.5;1.5//22;22/2;2/hw clrbar = 1/V/LL!0 -wind = bk0!bk0!bk0!bk9/0.8/2/212 -refvec = +wind = bk0!bk0!bk0!bk9/0.8/2/212 +refvec = title = 1/-2/~ ? ${MDL} @ DEW POINT, WIND, AND HGHT|~@ DEW POINT!0 r @@ -205,25 +195,25 @@ refvec = title = 1/-2/~ ? 
${MDL} @ DEWPOINT, WIND, AND HGHT|~@ DEWPOINT!0 r -glevel = 850 !850 !0 !850 +glevel = 850 !850 !0 !850 gvcord = pres !pres !none !pres skip = 0/1;2 scale = 2 !-1/2 !0 !2 gdpfun = sm5s(mag(smul(mixr;wnd)!sm5s(hght)!sm5s(thte)!smul(mixr;wnd) type = c/f !c !c !a cint = 3 !3 !5 -line = 3 !6//2 !25/10/2 +line = 3 !6//2 !25/10/2 fint = 6;12;18;24;30 fline = 0;23;22;21;14;15;2 hilo = 0!6/H#;L#!0 hlsym = 0!1;1//22;22/2;2/hw clrbar = 1/V/LL!0 -wind = bk0!bk0!bk0!am16/0.6/2/211/0.3!bk0 -refvec = 10 +wind = bk0!bk0!bk0!am16/0.6/2/211/0.3!bk0 +refvec = 10 text = s/22/2/hw title = 1/-2/~ ? ${MDL} @ MOIST. TRNSPT, HGHT, BL THTE|~@ H2O TRANSPORT!0 r - + glevel = 850 gvcord = pres skip = 0/1;1 @@ -234,8 +224,8 @@ cint = 2 !4//304 !4/308/324 !4/328 line = 32/1/2 !23/10/3 !22/10/3 !21/1/2 fint = -14;-10;-6;-2;2;6;10;14! fline = 7;29;30;24;0;14;15;18;5! -hilo = -hlsym = +hilo = +hlsym = clrbar = 1/V/LL!0 wind = bk0 !bk0 !bk0 !bk0 !bk9/0.8/2/112!bk0 refvec = 10 @@ -343,13 +333,13 @@ refvec = title = 1/-2/~ ? ${MDL} 6-HOUR TOTAL PCPN, MSLP |~6-HR TOTAL PCPN!0 r -gdattim = ${gdatpcpn12} +gdattim = ${gdatpcpn12} gdpfun = p12i type = f title = 1/-2/~ ? ${MDL} 12-HOUR TOTAL PCPN|~12-HR TOTAL PCPN!0 r -gdattim = ${gdatpcpn24} +gdattim = ${gdatpcpn24} gdpfun = p24i title = 1/-2/~ ? ${MDL} 24-HOUR TOTAL PCPN|~24-HR TOTAL PCPN!0 r @@ -364,12 +354,12 @@ gdpfun = p60i title = 1/-2/~ ? ${MDL} 60 HOUR TOTAL PCPN|~60-HR TOTAL PCPN!0 r -gdattim = ${gdatpcpn72} +gdattim = ${gdatpcpn72} gdpfun = p72i title = 1/-2/~ ? ${MDL} 72 HOUR TOTAL PCPN|~72-HR TOTAL PCPN!0 r -gdattim = ${gdatpcpn84} +gdattim = ${gdatpcpn84} gdpfun = p84i title = 1/-2/~ ? ${MDL} 84 HOUR TOTAL PCPN|~84-HR TOTAL PCPN!0 r @@ -403,19 +393,19 @@ export err=$?;err_chk # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK # FOR THIS CASE HERE. 
##################################################### -ls -l $metaname -export err=$?;export pgm="GEMPAK CHECK FILE";err_chk - -if [ $SENDCOM = "YES" ] ; then - mv ${metaname} ${COMOUT}/${mdl}_${PDY}_${cyc}_us_${metatype} - if [ $SENDDBN = "YES" ] ; then - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/${mdl}_${PDY}_${cyc}_us_${metatype} - if [ $DBN_ALERT_TYPE = "GFS_METAFILE_LAST" ] ; then - DBN_ALERT_TYPE=GFS_METAFILE - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/${mdl}_${PDY}_${cyc}_us_${metatype} - fi +if (( err != 0 )) || [[ ! -s "${metaname}" ]] &> /dev/null; then + echo "FATAL ERROR: Failed to create gempak meta file ${metaname}" + exit $(( err + 100 )) +fi + +mv "${metaname}" "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_us_${metatype}" +if [[ "${SENDDBN}" == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_us_${metatype}" + if [[ ${DBN_ALERT_TYPE} == "GFS_METAFILE_LAST" ]] ; then + DBN_ALERT_TYPE=GFS_METAFILE + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_us_${metatype}" fi fi diff --git a/gempak/ush/gfs_meta_sa.sh b/gempak/ush/gfs_meta_sa.sh index 47984e641d..2923cfc24e 100755 --- a/gempak/ush/gfs_meta_sa.sh +++ b/gempak/ush/gfs_meta_sa.sh @@ -1,52 +1,37 @@ -#! /bin/sh +#! /usr/bin/env bash # # Metafile Script : gfs_meta_sa.sh # -# Log : -# D.W.Plummer/NCEP 2/97 Add log header -# J.W.Carr/HPC 4/97 Changed Comparison to 1200 UTC UKMET instead of 0000 UTC UKMET -# J.W.Carr/HPC 4/97 Added UKMET2 --- past 72 hours to the comparison -# J.W.Carr/HPC 2/98 changed garea of sfc conv, bl dwpt and wind product -# J.W.Carr/HPC 5/98 converted gdplot to gdplot2 -# J.W.Carr/HPC 8/98 Changed map to medium resolution -# J. Carr/HPC 7/99 Put a filter on map. -# J. Carr/HPC 02/2001 Updated to run on IBM and send to ncodas -# J. 
Carr/HPC 04/2001 Remove old metafiles from metaout before creating new ones. -# J. Carr/HPC 5/2001 Added a mn variable for a/b side dbnet root variable. -# J. Carr/HPC 6/2001 Converted to a korn shell prior to delivering script to Production. -# J. Carr/HPC 8/2001 Submitted. -# J. Carr/HPC 3/2002 Tweaked a few products. -# # Set Up Local Variables # -set -x + +source "${HOMEgfs}/ush/preamble.sh" + +mkdir -p -m 775 "${DATA}/SA" +cd "${DATA}/SA" || exit 2 +cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl + +# +# Link data into DATA to sidestep gempak path limits +# TODO: Replace this # -export PS4='SA:$SECONDS + ' -mkdir -p -m 775 $DATA/SA -cd $DATA/SA -cp $FIXgempak/datatype.tbl datatype.tbl +export COMIN="${RUN}.${PDY}${cyc}" +if [[ ! -L ${COMIN} ]]; then + ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}" +fi mdl=gfs MDL=GFS metatype="sa" metaname="${mdl}_${metatype}_${cyc}.meta" device="nc | ${metaname}" -PDY2=$(echo ${PDY} | cut -c3-) -# -#if [ ${cyc} -eq 00 ] ; then -# fend=F126 -#elif [ ${cyc} -eq 12 ] ; then -# fend=F126 -#else -# fend=F126 -#fi fend=F126 # -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF -GDFILE = F-${MDL} | ${PDY2}/${cyc}00 -GDATTIM = F00-${fend}-06 +export pgm=gdplot2_nc;. prep_step +"${GEMEXE}/gdplot2_nc" << EOF +GDFILE = F-${MDL} | ${PDY:2}/${cyc}00 +GDATTIM = F00-${fend}-06 DEVICE = ${device} PANEL = 0 TEXT = 1/21//hw @@ -72,8 +57,8 @@ FLINE = HILO = !!26;2/H#;L#/1020-1070;900-1012//30;30/y HLSYM = 1.3;1.3//21//hw CLRBAR = 1 -WIND = ! ! -REFVEC = +WIND = ! ! +REFVEC = TITLE = 1/-2/~ ${MDL} MSLP, 1000-500mb THICK|~MSLP, 1000-500 THKN! 
ru @@ -104,30 +89,30 @@ type = c/f!c!c!c cint = 13;25;38;50;62!4!4/0/540!4/544/600 line = 32/1/1!6/1/3!5/5/2!17/5/2 fint = 13;25;38;50 -fline = 0;23;22;21;2 +fline = 0;23;22;21;2 hilo = 26;2/H#;L#/1017-1050;930-1004/2//y HLSYM = 0!1.5;1.5//22;22/3;3/hw!0 clrbar = 1/V/LL!0 -wind = +wind = refvec = title = 1/-2/~ ${MDL} PW, EST MSLP, THICKNESS|~PRECIP WATER, MSLP!0 r -glevel = 0!0 -gvcord = none -skip = 0 +glevel = 0!0 +gvcord = none +skip = 0 scale = 0 gdpfun = sm5s(lft4)!sm5s(lft4)!sm5s(lft4)!kntv(wnd@9950%sgma) type = c/f !c !c !b cint = 3/3 !1/-0.5/0.5!3/-15/-3 -line = 25/1/1 !22/1/2 !21/1/1 -fint = -9;-6;-3;3;6 +line = 25/1/1 !22/1/2 !21/1/1 +fint = -9;-6;-3;3;6 fline = 2;15;22;0;0;24 hilo = 0!0 hlsym = 1;1//22;22/2;2/hw clrbar = 1/V/LL!0 wind = bk0!bk0!bk0!bk9/0.9/2/112 -refvec = +refvec = title = 1/-2/~ ${MDL} LI AND BL WINDS|~LIFTED INDEX!0 r @@ -142,7 +127,7 @@ scale = 7 !0 !0 gdpfun = sm5s(sdiv(mixr(dwpc;pres@0%none);wnd)!sm5s(dwpc)!sm5s(dwpc)!kntv(wnd@9950%sgma) type = f !c !c !b cint = 1//-1 !3/12 !3/21 -line = 32 !5//2 !6//2 +line = 32 !5//2 !6//2 clrbar = 1/V/LL!0 fint = -8;-6;-4;-2 fline = 2;23;22;3;0 @@ -159,13 +144,13 @@ GAREA = -66;-127;14.5;-19 GLEVEL = 0 !0 !0 !0 GVCORD = none !none !none !none SKIP = 0 -SCALE = 0 +SCALE = 0 GDPFUN = sm5s(tmpc)!sm5s(tmpc)!sm5s(tmpc)!sm5s(pmsl)!kntv(wnd@9950%sgma) TYPE = c/f !c !c !c !b CINT = 3/-99/0 !3/3/21 !3/24/99 !4 LINE = 27/1/2 !2/1/2 !16/1/2 !19//3 -FINT = -18;-15;-12;-9;-6;-3;0 -FLINE = 30;29;7;6;4;25;24;0 +FINT = -18;-15;-12;-9;-6;-3;0 +FLINE = 30;29;7;6;4;25;24;0 HILO = 0 !0 !0 !26;2/H#;L#/1016-1050;930-1006/2//y HLSYM = 0 !0 !0 !1.5;1.5//22;22/3;3/hw CLRBAR = 1/V/LL !0 @@ -188,13 +173,13 @@ GAREA = -66;-127;14.5;-19 GLEVEL = 9950 !9950 !9950 !0 GVCORD = sgma!sgma!sgma!none SKIP = 0 -SCALE = 0 +SCALE = 0 GDPFUN = sm5s(tmpc)!sm5s(tmpc)!sm5s(tmpc)!sm5s(pmsl)!kntv(wnd@9950%sgma) TYPE = c/f !c !c !c !b CINT = 3/-99/0 !3/3/21 !3/24/99 !4 LINE = 27/1/2 !2/1/2 !16/1/2 !19//3 -FINT = -18;-15;-12;-9;-6;-3;0 
-FLINE = 30;29;7;6;4;25;24;0 +FINT = -18;-15;-12;-9;-6;-3;0 +FLINE = 30;29;7;6;4;25;24;0 HILO = 0 !0 !0 !26;2/H#;L#/1016-1050;930-1006/2//y HLSYM = 0 !0 !0 !1.5;1.5//22;22/3;3/hw CLRBAR = 1/V/LL !0 @@ -220,7 +205,7 @@ type = c/f !b cint = 13;25;38;50;62! line = 32/1/2/1 fint = 13;25;38;50 -fline = 0;23;22;21;2 +fline = 0;23;22;21;2 hilo = 0 !0 HLSYM = 0 !0 clrbar = 1 @@ -239,11 +224,11 @@ CINT = 10;20;80;90 !30;40;50;60;70 LINE = 32//2 !23//2 FINT = 10;30;70;90 FLINE = 18;8;0;22;23 -HILO = +HILO = HLSYM = CLRBAR = 1 -WIND = -REFVEC = +WIND = +REFVEC = TITLE = 1/-2/~ ${MDL} @ MEAN LAYER RH|~MEAN LAYER RH!0 ru @@ -276,7 +261,7 @@ LINE = 7/5/1/2 !29/5/1/2!7/5/1/2 !29/5/1/2 !20/1/2/1 FINT = 16;20;24;28;32;36;40;44 FLINE = 0;23-15 HILO = 2;6/X;N/10-99;10-99! !2;6/X;N/10-99;10-99! ! -HLSYM = +HLSYM = CLRBAR = 1 WIND = bk0 REFVEC = @@ -308,7 +293,7 @@ gvcord = pres SKIP = 0/2;2 scale = 0 !5/0 !5/0 !-1 gdpfun = sm5s(mag(kntv(wnd))//jet!sm5s(div(wnd)//dvg!dvg !sm5s(hght) -type = c/f !c !c !c +type = c/f !c !c !c cint = 70;90;110;130;150;170 !-11;-9;-7;-5;-3;-1!2/2/100!12 line = 32/1 !20/-2/2 !3/1/2 !1//2 fint = 70;90;110;130;150;170;190! @@ -375,15 +360,15 @@ CINT = 20/70// !2/2 LINE = 32/1/2/1 !5/2/2/2 FINT = 80;90;110;130;150;170;190 !1;2;3;4;5;6;7 FLINE = 0;25;24;29;7;15;20;14 !0;23;22;21;17;16;2;1 -HILO = -HLSYM = +HILO = +HLSYM = CLRBAR = 1/v/ll !0 WIND = bk0 !am16/0.3//211/0.4!Bk9/0.75/2 -REFVEC = +REFVEC = TITLE = 1/-2/~ @ ISOTACHS AND WIND (KTS)|~200 MB WIND!0 FILTER = yes ru - + GAREA = -66;-127;14.5;-19 LATLON = 1//1/1/10 @@ -401,7 +386,7 @@ FLINE = 0;21-30;14-20;5 HILO = 31;0/x#/10-500///y HLSYM = 1.5 CLRBAR = 1/V/LL -WIND = +WIND = REFVEC = TITLE = 1/-2/~ ${MDL} 12-HR TOTAL PCPN|~12-HR TOTAL PCPN!0 r @@ -415,25 +400,26 @@ exit EOF export err=$?;err_chk + ##################################################### # GEMPAK DOES NOT ALWAYS HAVE A NON ZERO RETURN CODE # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK # FOR THIS CASE HERE. 
##################################################### -ls -l $metaname -export err=$?;export pgm="GEMPAK CHECK FILE";err_chk - -if [ $SENDCOM = "YES" ] ; then - mv ${metaname} ${COMOUT}/${mdl}_${PDY}_${cyc}_${metatype} - if [ $SENDDBN = "YES" ] ; then - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/${mdl}_${PDY}_${cyc}_${metatype} - if [ $DBN_ALERT_TYPE = "GFS_METAFILE_LAST" ] ; then +if (( err != 0 )) || [[ ! -s "${metaname}" ]] &> /dev/null; then + echo "FATAL ERROR: Failed to create gempak meta file ${metaname}" + exit $(( err + 100 )) +fi + +mv "${metaname}" "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_${metatype}" +if [[ "${SENDDBN}" == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_${metatype}" + if [[ ${DBN_ALERT_TYPE} == "GFS_METAFILE_LAST" ]] ; then DBN_ALERT_TYPE=GFS_METAFILE - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/${mdl}_${PDY}_${cyc}_${metatype} - fi - fi + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_${metatype}" + fi fi exit diff --git a/gempak/ush/gfs_meta_sa2.sh b/gempak/ush/gfs_meta_sa2.sh index a566031030..c144173ee3 100755 --- a/gempak/ush/gfs_meta_sa2.sh +++ b/gempak/ush/gfs_meta_sa2.sh @@ -1,27 +1,27 @@ -#! /bin/sh +#! /usr/bin/env bash # # Metafile Script : ukmet_gfs_meta_sa2.sh # -# Creates several South American gfs charts, including 500mb and psml +# Creates several South American gfs charts, including 500mb and pmsl # comparisons to the ecmwf and ukmet # -# Log : -# J. Carr/HPC 07/2002 Added this metafile -# J. Carr/HPC 07/2002 Gif script. -# M. Klein/HPC 11/2004 Change references to gfs from avn -# M. Klein/HPC 02/2005 Changed location of working directory to /ptmp -# A. Robson/HPC 11/01/2006 Converted to sh prior to gif'ing -# F.
Achorn/NCO 11/03/2006 Changed location of working directory to $DATA from ptmp -# -# -set -x -# -echo " start with ukmet_gfs_meta_sa2.sh" -export PS4='SA2:$SECONDS + ' -cp $FIXgempak/datatype.tbl datatype.tbl +source "${HOMEgfs}/ush/preamble.sh" + +mkdir SA2 +cd SA2 || exit 1 + -export COMPONENT=${COMPONENT:-atmos} +cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl + +# +# Link data into DATA to sidestep gempak path limits +# TODO: Replace this +# +export HPCGFS="${RUN}.${PDY}${cyc}" +if [[ ! -L ${HPCGFS} ]]; then + ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${HPCGFS}" +fi mdl=gfs MDL=GFS @@ -34,30 +34,28 @@ device="nc | ${metaname}" # IF CYCLE IS NOT 00Z OR 06Z EXIT SCRIPT. # Also exit if run from 00z gfs # -if [ ${cyc} -eq "12" ] || [ ${cyc} -eq "18" ] -then - exit -# elif [ ${cyc} -eq "00" ] && [ $(echo $COMIN | awk -F/ '{print $5}' | awk -F. '{print $1}') = "gfs" ] -elif [ ${cyc} -eq "00" ] && [ ${mdl} = "gfs" ] -then - # don't want to run from 00z gfs +if [[ ${cyc} != "06" ]]; then exit fi -PDY2=$(echo ${PDY} | cut -c3-) -# export HPCGFS=$COMROOT/nawips/${envir}/gfs.${PDY} -export HPCGFS=${COMINgempak}/${mdl}.${PDY}/${cyc}/${COMPONENT}/gempak - -grid1="F-GFSHPC | ${PDY2}/${cyc}00" +grid1="F-GFSHPC | ${PDY:2}/${cyc}00" # DEFINE YESTERDAY -PDYm1=$($NDATE -24 ${PDY}${cyc} | cut -c -8) -PDY2m1=$(echo ${PDYm1} | cut -c 3-) +PDYm1="$(date --utc +%Y%m%d -d "${PDY} ${cyc} - 24 hours")" + +HPCECMWF="ecmwf.${PDYm1}" +HPCUKMET="ukmet.${PDY}" +if [[ ! -L "${HPCECMWF}" ]]; then + ln -sf "${COMINecmwf}/ecmwf.${PDYm1}/gempak" "${HPCECMWF}" +fi +if [[ ! -L "${HPCUKMET}" ]]; then + ln -sf "${COMINukmet}/ukmet.${PDY}/gempak" "${HPCUKMET}" +fi -$GEMEXE/gdplot2_nc << EOF +"${GEMEXE}/gdplot2_nc" << EOF \$MAPFIL= mepowo.gsf GDFILE = ${grid1} -GDATTIM = F000-F144-12 +GDATTIM = F000-F144-12 DEVICE = ${device} PANEL = 0 TEXT = 1/21//hw @@ -83,8 +81,8 @@ FLINE = HILO = !!26;2/H#;L#/1020-1070;900-1012//30;30/y HLSYM = 1.3;1.3//21//hw CLRBAR = 1 -WIND = ! ! -REFVEC = +WIND = ! ! 
+REFVEC = TITLE = 1/-2/~ ? ${MDL} MSLP, 1000-500mb THICK|~MSLP, 1000-500 THKN! l ru @@ -100,7 +98,7 @@ LINE = 7/5/1/2 !29/5/1/2!7/5/1/2 !29/5/1/2 !20/1/2/1 FINT = 16;20;24;28;32;36;40;44 FLINE = 0;23-15 HILO = 2;6/X;N/10-99;10-99! !2;6/X;N/10-99;10-99! ! -HLSYM = +HLSYM = CLRBAR = 1 WIND = 0 REFVEC = @@ -123,8 +121,8 @@ FLINE = 0!0;23;24;25;30;29;28;27 !11;12;2;10;15;14;13;0 HILO = 0!0!0!5/H#;L# HLSYM = 0!!1.0//21//hw!1.5 CLRBAR = 0!0!1!0 -WIND = -REFVEC = +WIND = +REFVEC = TITLE = 1/-1/~ ? ${MDL} @ MB HGT|~500 HGT CHG!1/-2/~ ${MDL} @ MB 24-HR HGT FALLS!0 TEXT = 1/21////hw CLEAR = YES @@ -135,7 +133,7 @@ GDATTIM = f060 GDPFUN = sm5s(hght)!(sub(hght^f60,hght^f36))!(sub(hght^f60,hght^f36))!sm5s(hght) TITLE = 1/-1/~ ? ${MDL} @ MB HGT|~500 HGT CHG!1/-2/~ ${MDL} @ MB 24-HR HGT FALLS!0 l -run +run GDATTIM = f084 GDPFUN = sm5s(hght)!(sub(hght^f84,hght^f60))!(sub(hght^f84,hght^f60))!sm5s(hght) @@ -201,27 +199,17 @@ ru ex EOF -if [ ${cyc} -eq "00" ]; then - times="012 036 060 084 108 132" -else - times="006 030 054 078 102 126" -fi - -for gfsfhr in $(echo ${times}) -do - if [ ${cyc} == "06" ]; then - ecmwffhr="F$(expr ${gfsfhr} + 18)" +for fhr in $(seq -s ' ' 6 24 126); do + gfsfhr="F$(printf "%03g" "${fhr}")" + if (( fhr < 100 )); then + offset=6 else - ecmwffhr="F$(expr ${gfsfhr} + 12)" + offset=18 fi - while [ $(expr length $ecmwffhr) -lt 3 ] - do - ecmwffhr="F0$(expr ${gfsfhr} + 6)" - done - gfsfhr="F${gfsfhr}" - grid2="${COMINecmwf}.${PDYm1}/gempak/ecmwf_glob_${PDYm1}12" - -$GEMEXE/gdplot2_nc << EOF10 + ecmwffhr="F$(printf "%03g" $((fhr + offset)))" + grid2="${HPCECMWF}/ecmwf_glob_${PDYm1}12" + + "${GEMEXE}/gdplot2_nc" << EOF10 \$MAPFIL = mepowo.gsf GDFILE = ${grid1} !${grid2} GDATTIM = ${gfsfhr}!${ecmwffhr} @@ -236,20 +224,20 @@ PROJ = mer//3;3;0;1 GAREA = -71;-135;20;-20 LATLON = 18//1/1/10 -GLEVEL = 500 -GVCORD = PRES -PANEL = 0 -SKIP = 0 -SCALE = -1 -GDPFUN = sm5s(hght)!sm5s(hght) -TYPE = c -CONTUR = 1 -CINT = 6 -FINT = -FLINE = -HLSYM = -WIND = -REFVEC = +GLEVEL = 
500 +GVCORD = PRES +PANEL = 0 +SKIP = 0 +SCALE = -1 +GDPFUN = sm5s(hght)!sm5s(hght) +TYPE = c +CONTUR = 1 +CINT = 6 +FINT = +FLINE = +HLSYM = +WIND = +REFVEC = LINE = 31//2!2//2 HILO = 31/H#;L#//5/5;5/y!2/H#;L#//5/5;5/y TITLE = 31/-1/~ ? ${MDL} @ HGHT (WHITE)|~EC VS ${MDL} 500!2/-2/~ ? ECMWF 500 HGHT (RED) @@ -258,20 +246,20 @@ r GLEVEL = 0 GVCORD = none -PANEL = 0 -SKIP = 0 +PANEL = 0 +SKIP = 0 SCALE = 0 GDPFUN = (pmsl)!(pmsl) -TYPE = c -CONTUR = 7 -CINT = 4 -FINT = -FLINE = -HLSYM = 1.5;1.5//21//hw -CLRBAR = 1 -WIND = -REFVEC = -TEXT = 1/21//hw +TYPE = c +CONTUR = 7 +CINT = 4 +FINT = +FLINE = +HLSYM = 1.5;1.5//21//hw +CLRBAR = 1 +WIND = +REFVEC = +TEXT = 1/21//hw CLEAR = yes GDFILE = ${grid1}!${grid2} GDATTIM = ${gfsfhr}!${ecmwffhr} @@ -285,27 +273,12 @@ ex EOF10 done -if [ ${cyc} -eq "00" ]; then - times="000 012 024 036 048 060 072 096 120 144" -elif [ ${cyc} -eq "06" ]; then - times="006 018 030 042 054 066 090 114 138" -fi - -for gfsfhr in $(echo ${times}) -do - if [ ${cyc} -eq "06" ]; then - ukmetfhr="$(expr ${gfsfhr} + 6)" - while [ $(expr length $ukmetfhr) -lt 3 ] - do - ukmetfhr="0$(expr ${gfsfhr} + 6)" - done - else - ukmetfhr=${gfsfhr} - fi - gfsfhr="F${gfsfhr}" - grid3="${COMINukmet}.${PDY}/gempak/ukmet_${PDY}00f${ukmetfhr}" +for fhr in $(seq -s ' ' 6 12 138); do + gfsfhr="F$(printf "%03g" "${fhr}")" + ukmetfhr="F$(printf "%03g" $((fhr + 6)))" + grid3="${HPCUKMET}/ukmet_${PDY}00f${ukmetfhr}" -$GEMEXE/gdplot2_nc << EOF25 + "${GEMEXE}/gdplot2_nc" << EOF25 \$MAPFIL = mepowo.gsf DEVICE = ${device} PANEL = 0 @@ -313,22 +286,22 @@ TEXT = 1/21//hw CONTUR = 2 MAP = 6/1/1/yes CLEAR = yes -CLRBAR = -GLEVEL = 500 -GVCORD = PRES -PANEL = 0 -SKIP = 0 -SCALE = -1 -GDPFUN = sm5s(hght)!sm5s(hght) -TYPE = c -CONTUR = 1 -CINT = 6 -FINT = -FLINE = -HLSYM = -GVECT = -WIND = -REFVEC = +CLRBAR = +GLEVEL = 500 +GVCORD = PRES +PANEL = 0 +SKIP = 0 +SCALE = -1 +GDPFUN = sm5s(hght)!sm5s(hght) +TYPE = c +CONTUR = 1 +CINT = 6 +FINT = +FLINE = +HLSYM = +GVECT = +WIND = +REFVEC = 
clear = yes GDFILE = ${grid1}!${grid3} GDATTIM = ${gfsfhr}!F${ukmetfhr} @@ -340,20 +313,20 @@ r GLEVEL = 0 GVCORD = none -PANEL = 0 -SKIP = 0 +PANEL = 0 +SKIP = 0 SCALE = 0 GDPFUN = sm5s(pmsl)!sm5s(pmsl) -TYPE = c -CONTUR = 2 -CINT = 4 -FINT = -FLINE = -HLSYM = 1.5;1.5//21//hw -CLRBAR = -WIND = -REFVEC = -TEXT = 1/21//hw +TYPE = c +CONTUR = 2 +CINT = 4 +FINT = +FLINE = +HLSYM = 1.5;1.5//21//hw +CLRBAR = +WIND = +REFVEC = +TEXT = 1/21//hw CLEAR = yes GDFILE = ${grid1}!${grid3} GDATTIM = ${gfsfhr}!F${ukmetfhr} @@ -362,26 +335,31 @@ HILO = 31/H#;L#/1020-1060;900-1010/5/10;10!2/H#;L#/1020-1060;900-1010/5/10;10 TITLE = 31/-1/~ ? ${MDL} PMSL (WHITE)|~UK VS ${MDL} PMSL!2/-2/~ ? UKMET PMSL (RED) l r - + ex EOF25 done - - -if [ $SENDCOM = "YES" ] ; then - mv ${metaname} ${COMOUT}/${mdl}_${PDY}_${cyc}_${metatype} - if [ $SENDDBN = "YES" ] ; then - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/${mdl}_${PDY}_${cyc}_${metatype} - if [ $DBN_ALERT_TYPE = "GFS_METAFILE_LAST" ] ; then - DBN_ALERT_TYPE=GFS_METAFILE - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/${mdl}_${PDY}_${cyc}_${metatype} - fi - fi +export err=$?;err_chk + +##################################################### +# GEMPAK DOES NOT ALWAYS HAVE A NON ZERO RETURN CODE +# WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK +# FOR THIS CASE HERE. +##################################################### +if (( err != 0 )) || [[ ! 
-s "${metaname}" ]] &> /dev/null; then + echo "FATAL ERROR: Failed to create gempak meta file ${metaname}" + exit $(( err + 100 )) fi -#export COMIN=/com/nawips/${envir}/ukmet.${PDY} -echo " end with ukmet_gfs_meta_sa2.sh" +mv "${metaname}" "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_${metatype}" +if [[ "${SENDDBN}" == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_${metatype}" + if [[ ${DBN_ALERT_TYPE} == "GFS_METAFILE_LAST" ]] ; then + DBN_ALERT_TYPE=GFS_METAFILE + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_${metatype}" + fi +fi exit diff --git a/gempak/ush/gfs_meta_trop.sh b/gempak/ush/gfs_meta_trop.sh index d0cc0dbd14..eb918eaab7 100755 --- a/gempak/ush/gfs_meta_trop.sh +++ b/gempak/ush/gfs_meta_trop.sh @@ -1,59 +1,53 @@ -#! /bin/sh +#! /usr/bin/env bash # # Metafile Script : gfs_meta_trop.sh # -# Log : -# D.W.Plummer/NCEP 2/97 Add log header -# J. Carr/HPC/DTB 3/97 Added WPAC area -# J. Carr/HPC/DTB 4/97 Put more skipping in for winds -# J. Carr/HPC/DTB 4/97 Changed pcpn from mm to inches and added hilo -# J. Carr/HPC/DTB 5/98 Converted gdplot to gdplot2 -# J.L.Partain/MPC 5/98 Mods to make Atl prods same as GFS, MRF, ETA -# J. Carr/HPC/DTB 8/98 Changed map to medium resolution -# J. Carr/HPC 02/01 Updated script to run on IBM. -# J. Carr/HPC 5/2001 Added a mn variable for a/b side dbnet root variable. -# J. Carr/HPC 7/2001 Converted to a korn shell prior to delivering script -# to Production. -# J. Carr/HPC 7/2001 Submitted as a jif. -# -# B. Gordon 7/02 Converted to run off the GFS due to demise -# of the MRF. -# J. Carr/PMB 11/2004 Added a ? to all title lines. -# Changed contur from a 1 to a 2. -# Changed interval of 24-HR PCPN from 24 to 6. 
-# # Set Up Local Variables # -set -x + +source "${HOMEgfs}/ush/preamble.sh" + +mkdir -p -m 775 "${DATA}/TROP" +cd "${DATA}/TROP" || exit 2 +cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl + +# +# Link data into DATA to sidestep gempak path limits +# TODO: Replace this # -export PS4='TROP:$SECONDS + ' -mkdir -p -m 775 $DATA/TROP -cd $DATA/TROP -cp $FIXgempak/datatype.tbl datatype.tbl +export COMIN="${RUN}.${PDY}${cyc}" +if [[ ! -L ${COMIN} ]]; then + ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}" +fi mdl=gfs MDL=GFS metatype="trop" metaname="${mdl}_${metatype}_${cyc}.meta" device="nc | ${metaname}" -PDY2=$(echo ${PDY} | cut -c3-) # -for a in ATL PAC WPAC -do - if [ ${a} = "ATL" ] ; then +for domain in ATL PAC WPAC; do + case ${domain} in + ATL) garea="-6;-111;52;-14" proj="MER/0.0;-49.5;0.0" - elif [ ${a} = "PAC" ] ; then + ;; + PAC) garea="0;-140;45;-75" proj="mer//3;3;0;1" - elif [ ${a} = "WPAC" ] ; then + ;; + WPAC) garea="0;90;45;180" proj="mer//3;3;0;1" - fi - -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOF -GDFILE = F-${MDL} | ${PDY2}/${cyc}00 + ;; + *) + echo "FATAL ERROR: Unknown domain in ${BASH_SOURCE[0]}" + exit 100 + esac + + export pgm=gdplot2_nc;. prep_step + "${GEMEXE}/gdplot2_nc" << EOF +GDFILE = F-${MDL} | ${PDY:2}/${cyc}00 GDATTIM = F00-F180-12 DEVICE = ${device} PANEL = 0 @@ -83,7 +77,7 @@ HLSYM = 0!1;1//22;22/3;3/hw CLRBAR = 1/V/LL!0 WIND = bk0!bk0!bk9/.8/1.4/112 REFVEC = -TITLE = 1/-2/~ ? ${MDL} PMSL, BL WIND (40m AGL; KTS)|~${a} PMSL & BL WIND!0 +TITLE = 1/-2/~ ? ${MDL} PMSL, BL WIND (40m AGL; KTS)|~${domain} PMSL & BL WIND!0 r GLEVEL = 850 @@ -95,7 +89,7 @@ LINE = 29/5/1/2 !7/5/1/2 HILO = 2;6/X;N/-99--4;4-99 ! SCALE = 5 !5 WIND = bk0 !bk0 !bk6/.8/2/112!0 -TITLE = 1/-2/~ ? ${MDL} @ WIND AND REL VORT|~${a} @ WIND AND REL VORT!0 +TITLE = 1/-2/~ ? ${MDL} @ WIND AND REL VORT|~${domain} @ WIND AND REL VORT!0 FINT = 4;6;8;10;12;14;16;18 FLINE = 0;14-21 TYPE = c/f!c!b @@ -110,14 +104,14 @@ HILO = 2;6/X;N/-99--4;4-99! 
!6/L#/880-1004///1 HLSYM = 1;1//22;22/3;3/hw SCALE = 5 !5 !0 WIND = bk0 !bk0 !bk0 !bk9/.8/1.4/112 -TITLE = 1/-2/~ ? ${MDL} @ WIND AND REL VORT|~${a} @ WIND AND REL VORT!0 +TITLE = 1/-2/~ ? ${MDL} @ WIND AND REL VORT|~${domain} @ WIND AND REL VORT!0 FINT = 4;6;8;10;12;14;16;18 FLINE = 0;14-21 r GLEVEL = 700 GDPFUN = vor(wnd) !vor(wnd)!kntv(wnd) -TITLE = 1/-2/~ ? ${MDL} @ WIND AND REL VORT|~${a} @ WIND AND REL VORT!0 +TITLE = 1/-2/~ ? ${MDL} @ WIND AND REL VORT|~${domain} @ WIND AND REL VORT!0 GLEVEL = 700!700!0!700 GVCORD = pres!pres!none!pres @@ -128,7 +122,7 @@ HILO = 2;6/X;N/-99--4;4-99! !6/L#/880-1004///1 HLSYM = 1;1//22;22/3;3/hw SCALE = 5 !5 !0 WIND = bk0 !bk0 !bk0 !bk9/.8/1.4/112 -TITLE = 1/-2/~ ? ${MDL} @ WIND AND REL VORT|~${a} @ WIND AND REL VORT!0 +TITLE = 1/-2/~ ? ${MDL} @ WIND AND REL VORT|~${domain} @ WIND AND REL VORT!0 FINT = 4;6;8;10;12;14;16;18 FLINE = 0;14-21 TYPE = c/f !c !c !b @@ -145,11 +139,11 @@ LINE = 7/5/1/2 ! 29/5/1/2 ! 7/5/1/2 ! 29/5/1/2 ! 20/1/2/1 FINT = 16;20;24;28;32;36;40;44 FLINE = 0;23-15 HILO = 2;6/X;N/10-99;10-99! !2;6/X;N/10-99;10-99! ! -HLSYM = +HLSYM = CLRBAR = 1 -WIND = bk0!bk0!bk0!bk0!bk0!bk9/0.9/1.4/112!0 +WIND = bk0!bk0!bk0!bk0!bk0!bk9/0.9/1.4/112!0 REFVEC = -TITLE = 1/-2/~ ? ${MDL} @ WIND AND ABS VORT|~${a} @ WIND AND ABS VORT!0 +TITLE = 1/-2/~ ? ${MDL} @ WIND AND ABS VORT|~${domain} @ WIND AND ABS VORT!0 r GLEVEL = 300:850 !850 !300 @@ -167,9 +161,9 @@ HLSYM = CLRBAR = 1 WIND = ak0!ak7/.4/1/221/.2!ak6/.4/1/221/.2 REFVEC = -TITLE = 1/-2/~ ? ${MDL} @ WIND SHEAR (850=Purple, 300=Cyan) |~${a} 850-300MB WIND SHEAR!0 +TITLE = 1/-2/~ ? ${MDL} @ WIND SHEAR (850=Purple, 300=Cyan) |~${domain} 850-300MB WIND SHEAR!0 filter = no - + glevel = 250!250 gvcord = pres!pres @@ -187,7 +181,7 @@ hlsym = 0!0!0!0 clrbar = 0!0!1/V/LL!0 wind = bk0!bk0!bk0!bk0!bk9/.9/1.4/112 refvec = 10 -title = 1/-2/~ ? ${MDL} @ HGHTS, ISOTACHS, & DIVERG|~${a} @ SPEED & DIVERG!0 +title = 1/-2/~ ? 
${MDL} @ HGHTS, ISOTACHS, & DIVERG|~${domain} @ SPEED & DIVERG!0 r glevel = 400:850!0 @@ -205,8 +199,8 @@ hlsym = 0!2;1.5//21//hw clrbar = 0 wind = bk10/0.9/1.4/112!bk0 refvec = -title = 1/-2/~ ? ${MDL} 850-400mb MLW and MSLP|~${a} 850-400mb MLW & MSLP!0 - +title = 1/-2/~ ? ${MDL} 850-400mb MLW and MSLP|~${domain} 850-400mb MLW & MSLP!0 + GDATTIM = F24-F144-06 GLEVEL = 0 @@ -214,7 +208,7 @@ GVCORD = none SKIP = 0 SCALE = 0 GDPFUN = p24i -TYPE = f +TYPE = f CINT = 0 LINE = 0 FINT = .01;.1;.25;.5;.75;1;1.25;1.5;1.75;2;2.25;2.5;2.75;3;3.25;3.5;3.75;4 @@ -222,14 +216,14 @@ FLINE = 0;21-30;14-20;5 HILO = 31;0/x#2/.10-8.0///y HLSYM = 1.4//22/2/hw CLRBAR = 1/V/LL -WIND = +WIND = REFVEC = -TITLE = 1/-2/~ ${MDL} 24-HR TOTAL PCPN|~${a} 24-HR TOTAL PCPN!0 +TITLE = 1/-2/~ ${MDL} 24-HR TOTAL PCPN|~${domain} 24-HR TOTAL PCPN!0 r exit EOF -export err=$?;err_chk + export err=$?;err_chk done @@ -238,21 +232,20 @@ done # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK # FOR THIS CASE HERE. ##################################################### -ls -l $metaname -export err=$?;export pgm="GEMPAK CHECK FILE";err_chk +if (( err != 0 )) || [[ ! 
-s "${metaname}" ]] &> /dev/null; then + echo "FATAL ERROR: Failed to create gempak meta file ${metaname}" + exit $(( err + 100 )) +fi -if [ $SENDCOM = "YES" ] ; then - mv ${metaname} ${COMOUT}/${mdl}_${PDY}_${cyc}_${metatype} - if [ $SENDDBN = "YES" ] ; then - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/${mdl}_${PDY}_${cyc}_${metatype} - if [ $DBN_ALERT_TYPE = "GFS_METAFILE_LAST" ] ; then +mv "${metaname}" "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_${metatype}" +if [[ "${SENDDBN}" == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_${metatype}" + if [[ "${DBN_ALERT_TYPE}" == "GFS_METAFILE_LAST" ]] ; then DBN_ALERT_TYPE=GFS_METAFILE - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/${mdl}_${PDY}_${cyc}_${metatype} - fi - fi + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/${mdl}_${PDY}_${cyc}_${metatype}" + fi fi - exit diff --git a/gempak/ush/gfs_meta_us.sh b/gempak/ush/gfs_meta_us.sh index 7a818c338b..b8c9e17a22 100755 --- a/gempak/ush/gfs_meta_us.sh +++ b/gempak/ush/gfs_meta_us.sh @@ -1,45 +1,40 @@ -#!/bin/sh +#! /usr/bin/env bash # # Metafile Script : gfs_meta_us.sh # -# Log : -# D.W.Plummer/NCEP 2/97 Add log header -# D.W.Plummer/NCEP 3/97 Added ecmwf comparison. -# D.W.Plummer/NCEP 3/97 Added $MAPFIL specification for lower resolution -# D.W.Plummer/NCEP 4/97 Removed run from 3-HOURLY PRECIP -# J. Carr/HPC 2/99 Changed skip to 0 -# B. Gordon/NCO 5/00 Modified for production on IBM-SP -# Changed gdplot_nc -> gdplot2_nc -# D. Michaud/NCO 4/01 Modified to Reflect Different Title for -# Parallel runs -# J. Carr/PMB 11/04 Added a ? to all title lines -# Changed contur from a 1 to a 2. 
-# -cd $DATA -set -xa +source "${HOMEgfs}/ush/preamble.sh" + +cd "${DATA}" || exit 2 +rm -rf "${DATA}/us" +mkdir -p -m 775 "${DATA}/us" +cd "${DATA}/us" || exit 2 +cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl -rm -rf $DATA/us -mkdir -p -m 775 $DATA/us -cd $DATA/us -cp $FIXgempak/datatype.tbl datatype.tbl +# +# Link data into DATA to sidestep gempak path limits +# TODO: Replace this +# +export COMIN="${RUN}.${PDY}${cyc}" +if [[ ! -L ${COMIN} ]]; then + ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}" +fi device="nc | gfs.meta" -PDY2=$(echo $PDY | cut -c3-) export fend=F216 -if [ "$envir" = "para" ] ; then +if [[ "${envir}" == "para" ]] ; then export m_title="GFSP" else export m_title="GFS" fi export pgm=gdplot2_nc;. prep_step -startmsg -$GEMEXE/gdplot2_nc << EOF -GDFILE = F-GFS | ${PDY2}/${cyc}00 -GDATTIM = F00-$fend-6 + +"${GEMEXE}/gdplot2_nc" << EOF +GDFILE = F-GFS | ${PDY:2}/${cyc}00 +GDATTIM = F00-${fend}-6 DEVICE = ${device} PANEL = 0 TEXT = 1/21//hw @@ -53,81 +48,82 @@ GAREA = 17.529;-129.296;53.771;-22.374 PROJ = str/90;-105;0 LATLON = 0 -restore $USHgempak/restore/pmsl_thkn.2.nts +restore ${HOMEgfs}/gempak/ush/restore/pmsl_thkn.2.nts CLRBAR = 1 HLSYM = 2;1.5//21//hw TEXT = 1/21//hw -TITLE = 5/-2/~ ? $m_title PMSL, 1000-500 MB THICKNESS|~MSLP, 1000-500 THKN!0 +TITLE = 5/-2/~ ? ${m_title} PMSL, 1000-500 MB THICKNESS|~MSLP, 1000-500 THKN!0 l run -restore $USHgempak/restore/850mb_hght_tmpc.2.nts +restore ${HOMEgfs}/gempak/ush/restore/850mb_hght_tmpc.2.nts CLRBAR = 1 HLSYM = 2;1.5//21//hw TEXT = 1/21//hw -TITLE = 5/-2/~ ? $m_title @ HGT, TEMPERATURE AND WIND (KTS)|~@ HGT, TMP, WIND!0 +TITLE = 5/-2/~ ? ${m_title} @ HGT, TEMPERATURE AND WIND (KTS)|~@ HGT, TMP, WIND!0 l run -restore $USHgempak/restore/700mb_hght_relh_omeg.2.nts +restore ${HOMEgfs}/gempak/ush/restore/700mb_hght_relh_omeg.2.nts CLRBAR = 1 HLSYM = 2;1.5//21//hw TEXT = 1/21//hw -TITLE = 5/-2/~ ? $m_title @ HGT, REL HUMIDITY AND OMEGA|~@ HGT, RH AND OMEGA!0 +TITLE = 5/-2/~ ? 
${m_title} @ HGT, REL HUMIDITY AND OMEGA|~@ HGT, RH AND OMEGA!0 l run -restore $USHgempak/restore/500mb_hght_absv.2.nts +restore ${HOMEgfs}/gempak/ush/restore/500mb_hght_absv.2.nts CLRBAR = 1 HLSYM = 2;1.5//21//hw TEXT = 1/21//hw -TITLE = 5/-2/~ ? $m_title @ HGT AND VORTICITY|~@ HGT AND VORTICITY!0 +TITLE = 5/-2/~ ? ${m_title} @ HGT AND VORTICITY|~@ HGT AND VORTICITY!0 l run -restore $USHgempak/restore/250mb_hght_wnd.2.nts +restore ${HOMEgfs}/gempak/ush/restore/250mb_hght_wnd.2.nts CLRBAR = 1 HLSYM = 2;1.5//21//hw TEXT = 1/21//hw -TITLE = 5/-2/~ ? $m_title @ HGT, ISOTACHS AND WIND (KTS)|~@ HGT AND WIND!0 +TITLE = 5/-2/~ ? ${m_title} @ HGT, ISOTACHS AND WIND (KTS)|~@ HGT AND WIND!0 l run - -restore $USHgempak/restore/p06m_pmsl.2.nts + + +restore ${HOMEgfs}/gempak/ush/restore/p06m_pmsl.2.nts CLRBAR = 1 HLSYM = 2;1.5//21//hw HLSYM = 2;1.5//21//hw TEXT = 1/21//hw -GDATTIM = F06-$fend-6 -TITLE = 5/-2/~ ? $m_title 6-HR TOTAL PCPN, MSLP|~6-HR TOTAL PCPN, MSLP!0 +GDATTIM = F06-${fend}-6 +TITLE = 5/-2/~ ? ${m_title} 6-HR TOTAL PCPN, MSLP|~6-HR TOTAL PCPN, MSLP!0 l run HILO = 31;0/x#2////y HLSYM = 1.5 -GDATTIM = F12-$fend-06 -GDPFUN = p12i -TITLE = 5/-2/~ ? $m_title 12-HR TOTAL PCPN (IN)|~12-HR TOTAL PCPN +GDATTIM = F12-${fend}-06 +GDPFUN = p12i +TITLE = 5/-2/~ ? ${m_title} 12-HR TOTAL PCPN (IN)|~12-HR TOTAL PCPN l run -GDATTIM = F24-$fend-06 -GDPFUN = p24i -TITLE = 5/-2/~ ? $m_title 24-HR TOTAL PCPN (IN)|~24-HR TOTAL PCPN +GDATTIM = F24-${fend}-06 +GDPFUN = p24i +TITLE = 5/-2/~ ? ${m_title} 24-HR TOTAL PCPN (IN)|~24-HR TOTAL PCPN l run GDATTIM = F72;f78;f84 -GDPFUN = p72i -TITLE = 5/-2/~ ? $m_title 72-HR TOTAL PCPN(IN)|~72-HR TOTAL PCPN +GDPFUN = p72i +TITLE = 5/-2/~ ? ${m_title} 72-HR TOTAL PCPN(IN)|~72-HR TOTAL PCPN l run @@ -135,18 +131,18 @@ run GAREA = 26.52;-119.70;50.21;-90.42 PROJ = str/90;-105;0/3;3;0;1 MAP = 1//2 -GDATTIM = F24-$fend-06 -GDPFUN = p24i -TITLE = 5/-2/~ ? 
$m_title 24-HR TOTAL PCPN (IN)|~WEST: 24-HR PCPN +GDATTIM = F24-${fend}-06 +GDPFUN = p24i +TITLE = 5/-2/~ ? ${m_title} 24-HR TOTAL PCPN (IN)|~WEST: 24-HR PCPN l GAREA = 24.57;-100.55;47.20;-65.42 PROJ = str/90;-90;0/3;3;0;1 MAP = 1//2 -GDATTIM = F24-$fend-06 -GDPFUN = p24i -TITLE = 5/-2/~ ? $m_title 24-HR TOTAL PCPN (IN)|~EAST: 24-HR PCPN +GDATTIM = F24-${fend}-06 +GDPFUN = p24i +TITLE = 5/-2/~ ? ${m_title} 24-HR TOTAL PCPN (IN)|~EAST: 24-HR PCPN l exit @@ -158,24 +154,23 @@ export err=$?;err_chk # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK # FOR THIS CASE HERE. ##################################################### -ls -l gfs.meta -export err=$?;export pgm="GEMPAK CHECK FILE";err_chk - - -if [ $SENDCOM = "YES" ] ; then - mv gfs.meta ${COMOUT}/gfs_${PDY}_${cyc}_us - if [ $SENDDBN = "YES" ] ; then - $DBNROOT/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - $COMOUT/gfs_${PDY}_${cyc}_us - if [ $DBN_ALERT_TYPE = "GFS_METAFILE_LAST" ] ; then - DBN_ALERT_TYPE=GFS_METAFILE - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/gfs_${PDY}_${cyc}_us +if (( err != 0 )) || [[ ! 
-s gfs.meta ]] &> /dev/null; then + echo "FATAL ERROR: Failed to create gempak meta file gfs.meta" + exit $(( err + 100 )) +fi + +mv gfs.meta "${COM_ATMOS_GEMPAK_META}/gfs_${PDY}_${cyc}_us" +if [[ "${SENDDBN}" == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/gfs_${PDY}_${cyc}_us" + if [[ ${DBN_ALERT_TYPE} == "GFS_METAFILE_LAST" ]] ; then + DBN_ALERT_TYPE=GFS_METAFILE + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/gfs_${PDY}_${cyc}_us" fi - if [ $fhr -eq 216 ] ; then - ${DBNROOT}/bin/dbn_alert MODEL GFS_METAFILE_LAST $job \ - ${COMOUT}/gfs_${PDY}_${cyc}_us + if (( fhr == 216 )) ; then + "${DBNROOT}/bin/dbn_alert" MODEL GFS_METAFILE_LAST "${job}" \ + "${COM_ATMOS_GEMPAK_META}/gfs_${PDY}_${cyc}_us" fi - fi fi diff --git a/gempak/ush/gfs_meta_usext.sh b/gempak/ush/gfs_meta_usext.sh index dc522bb896..5f0e930cdb 100755 --- a/gempak/ush/gfs_meta_usext.sh +++ b/gempak/ush/gfs_meta_usext.sh @@ -1,63 +1,48 @@ -#!/bin/sh +#! /usr/bin/env bash # # Metafile Script : gfs_meta_usext.sh # -# Log : -# D.W.Plummer/NCEP 2/97 Add log header -# D.W.Plummer/NCEP 3/97 Add ecmwf comparison. -# D.W.Plummer/NCEP 3/97 Added $MAPFIL specification for lower resolution -# D.W.Plummer/NCEP 4/97 Changed SKIP for grid2 -# D.W.Plummer/NCEP 4/97 Changed gdplot to gdplot2 and related restore files -# D.W.Plummer/NCEP 4/97 Changed NAOP to NAOP w/ .us suffix -# D.W.Plummer/NCEP 1/98 Added 12hr 2m min and max temps out to day 6 and 7 -# J.L.Partain/MPC 8/98 Added latlon lines -# J. Carr/HPC 2/99 Changed skip to a 0 -# J. Carr/HPC 4/2000 Changed the Alaska 5-day pcpn to a 3-5 day pcpn -# Added other pcpn products for the medium range fcstrs. -# B. Gordon/NCO 4/00 Converted to run as production on the IBM-SP -# D. Michaud/NCO 4/01 Added logic to display different title for parallel -# runs. -# B. Gordon 7/02 Converted to run off the GFS due to demise -# of the MRF. -# J. Carr/PMB 11/04 Added a ? 
to title lines. -# Changed contur from a 1 to a 2. -# Changed increment in gdattim to every 6 hrs instead of 12. -# Added 3 new products for HPC medium range. (2 48-hr qpf and 1 5 day qpf) -# M. Klein/HPC 01/10 Add boundary layer winds/isotachs to the metafile for CPC. -# -set -xa -mkdir -p -m 775 $DATA/mrfus -cd $DATA/mrfus -cp $FIXgempak/datatype.tbl datatype.tbl -device="nc | mrf.meta" +source "${HOMEgfs}/ush/preamble.sh" -#XXW cp $FIXgempak/model/gfs/ak_sfstns.tbl alaska.tbl -cp $FIXgempak/ak_sfstns.tbl alaska.tbl +mkdir -p -m 775 "${DATA}/mrfus" +cd "${DATA}/mrfus" || exit 2 +cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl +cp "${HOMEgfs}/gempak/fix/ak_sfstns.tbl" alaska.tbl -month=$(echo $PDY | cut -c5,6) -if [ $month -ge 5 -a $month -le 9 ] ; then -# fint="40;45;50;55;60;65;70;75;80;85;90;95;100" -# fline="26;25;24;23;22;21;20;19;18;17;16;15;14;31" - fint="60;65;70;75;80;85;90;95;100;105;110;115;120" - fline="26;25;24;23;22;21;20;19;18;17;16;15;14;31" -else - fint="-5;0;5;10;15;20;25;30;35;40;45;50;55;60;65;70;75;80" - fline="4;30;29;28;27;26;25;24;23;22;21;20;19;18;17;16;15;14;31" +# +# Link data into DATA to sidestep gempak path limits +# TODO: Replace this +# +export COMIN="${RUN}.${PDY}${cyc}" +if [[ ! -L ${COMIN} ]]; then + ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}" fi -PDY2=$(echo $PDY | cut -c3-) +device="nc | mrf.meta" -if [ "$envir" = "para" ] ; then +# Not being used? 
+# case $(( 10#${PDY:4:2} )) in +# [5-9]) +# fint="60;65;70;75;80;85;90;95;100;105;110;115;120" +# fline="26;25;24;23;22;21;20;19;18;17;16;15;14;31" +# ;; +# *) +# fint="-5;0;5;10;15;20;25;30;35;40;45;50;55;60;65;70;75;80" +# fline="4;30;29;28;27;26;25;24;23;22;21;20;19;18;17;16;15;14;31" +# ;; +# esac + +if [[ "${envir}" = "para" ]] ; then export m_title="GFSP" else export m_title="GFS" fi -export pgm=gdplot2_nc; prep_step; startmsg +export pgm=gdplot2_nc; prep_step -$GEMEXE/gdplot2_nc << EOF -GDFILE = F-GFS | ${PDY2}/${cyc}00 +"${GEMEXE}/gdplot2_nc" << EOF +GDFILE = F-GFS | ${PDY:2}/${cyc}00 GDATTIM = F000-F384-06 DEVICE = ${device} PANEL = 0 @@ -71,46 +56,46 @@ GAREA = 17.529;-129.296;53.771;-22.374 PROJ = str/90;-105;0 LATLON = 18/2 -restore ${USHgempak}/restore/pmsl_thkn.2.nts +restore ${HOMEgfs}/gempak/ush/restore/pmsl_thkn.2.nts CLRBAR = 1 HLSYM = 2;1.5//21//hw TEXT = 1/21//hw -TITLE = 1/-2/~ ? $m_title PMSL, 1000-500 MB THICKNESS|~MSLP, 1000-500 THKN!0 +TITLE = 1/-2/~ ? ${m_title} PMSL, 1000-500 MB THICKNESS|~MSLP, 1000-500 THKN!0 l run -restore ${USHgempak}/restore/850mb_hght_tmpc.2.nts +restore ${HOMEgfs}/gempak/ush/restore/850mb_hght_tmpc.2.nts CLRBAR = 1 TEXT = 1/21//hw -TITLE = 1/-2/~ ? $m_title @ HGT, TEMPERATURE AND WIND (KTS)|~@ HGT, TMP, WIND!0 +TITLE = 1/-2/~ ? ${m_title} @ HGT, TEMPERATURE AND WIND (KTS)|~@ HGT, TMP, WIND!0 l run -restore ${USHgempak}/restore/700mb_hght_relh_omeg.2.nts +restore ${HOMEgfs}/gempak/ush/restore/700mb_hght_relh_omeg.2.nts CLRBAR = 1 TEXT = 1/21//hw -TITLE = 1/-2/~ ? $m_title @ HGT, REL HUMIDITY AND OMEGA|~@ HGT, RH AND OMEGA!0 +TITLE = 1/-2/~ ? ${m_title} @ HGT, REL HUMIDITY AND OMEGA|~@ HGT, RH AND OMEGA!0 l run -restore ${USHgempak}/restore/500mb_hght_absv.2.nts +restore ${HOMEgfs}/gempak/ush/restore/500mb_hght_absv.2.nts CLRBAR = 1 TEXT = 1/21//hw -TITLE = 1/-2/~ ? $m_title @ HGT AND VORTICITY|~@ HGT AND VORTICITY!0 +TITLE = 1/-2/~ ? 
${m_title} @ HGT AND VORTICITY|~@ HGT AND VORTICITY!0 l run -restore ${USHgempak}/restore/500mb_hght_gabsv.2.nts +restore ${HOMEgfs}/gempak/ush/restore/500mb_hght_gabsv.2.nts CLRBAR = 1 TEXT = 1/21//hw -TITLE = 1/-2/~ ? $m_title @ HGT AND GEO ABS VORT|~@ HGT, GEO ABS VORT!0 +TITLE = 1/-2/~ ? ${m_title} @ HGT AND GEO ABS VORT|~@ HGT, GEO ABS VORT!0 l run -restore ${USHgempak}/restore/250mb_hght_wnd.2.nts +restore ${HOMEgfs}/gempak/ush/restore/250mb_hght_wnd.2.nts CLRBAR = 1 TEXT = 1/21//hw -TITLE = 1/-2/~ ? $m_title @ HGT, ISOTACHS AND WIND (KTS)|~@ HGT AND WIND!0 +TITLE = 1/-2/~ ? ${m_title} @ HGT, ISOTACHS AND WIND (KTS)|~@ HGT AND WIND!0 l run @@ -131,26 +116,26 @@ HLSYM = CLRBAR = 1/V/LL!0 WIND = bk18/0.8/1/112 REFVEC = -TITLE = 1/-2/~ ? $m_title BOUNDARY LAYER WINDS (KTS) AND ISOTACHS|~BL WIND, ISOTACHS !0 +TITLE = 1/-2/~ ? ${m_title} BOUNDARY LAYER WINDS (KTS) AND ISOTACHS|~BL WIND, ISOTACHS !0 TEXT = 1/21//hw CLEAR = YES l run -restore ${USHgempak}/restore/precip.2.nts +restore ${HOMEgfs}/gempak/ush/restore/precip.2.nts CLRBAR = 1 TEXT = 1/21//hw HILO = 31;0/x#2/.25-10///y HLSYM = 1.5 GDATTIM = F12-F384-6 GDPFUN = p12i -TITLE = 1/-2/~ ? $m_title 12-HR TOTAL PCPN (IN)|~12-HR TOTAL PCPN +TITLE = 1/-2/~ ? ${m_title} 12-HR TOTAL PCPN (IN)|~12-HR TOTAL PCPN l run GDATTIM = F24-F384-6 GDPFUN = p24i -TITLE = 5/-2/~ ? $m_title 24-HR TOTAL PCPN (IN)|~24-HR TOTAL PCPN +TITLE = 5/-2/~ ? ${m_title} 24-HR TOTAL PCPN (IN)|~24-HR TOTAL PCPN l run @@ -158,34 +143,34 @@ GDATTIM = F84 wind = bk0 gvcord = none type = f -cint = -line = +cint = +line = clrbar = 1/V/LL fint = .01;.1;.25;.5;.75;1;1.5;2;2.5;3;4;5;6;7;8;9;10 fline = 0;21-30;14-20;5 glevel = 0 scale = 0 gdpfun = p72i -refvec = -title = 1/-2/~ ? $m_title 3-day (F12-F84) PCPN|~DAY 1-3 (F12-F84) PCPN +refvec = +title = 1/-2/~ ? ${m_title} 3-day (F12-F84) PCPN|~DAY 1-3 (F12-F84) PCPN l run GDATTIM = F108 gdpfun = p96i -title = 1/-2/~ ? $m_title 4-day (F12-F120) PCPN|~DAY 1-4 (F12-F108) PCPN +title = 1/-2/~ ? 
${m_title} 4-day (F12-F120) PCPN|~DAY 1-4 (F12-F108) PCPN l run GDATTIM = F132 -gdpfun = p120i -title = 1/-2/~ ? $m_title 5-day (F12-F132) PCPN|~DAY 1-5 (F12-F132) PCPN +gdpfun = p120i +title = 1/-2/~ ? ${m_title} 5-day (F12-F132) PCPN|~DAY 1-5 (F12-F132) PCPN l run GDATTIM = F132 gdpfun = p48i -title = 1/-2/~ ? $m_title 2-day (F84-F132) PCPN|~DAY 4-5 (F84-F132) PCPN +title = 1/-2/~ ? ${m_title} 2-day (F84-F132) PCPN|~DAY 4-5 (F84-F132) PCPN l run @@ -193,7 +178,7 @@ run GDATTIM = F126 gdpfun = p120i -title = 1/-2/~ ? $m_title 5-day (F06-F126) PCPN|~DAY 1-5 (F06-F126) PCPN +title = 1/-2/~ ? ${m_title} 5-day (F06-F126) PCPN|~DAY 1-5 (F06-F126) PCPN l run @@ -201,7 +186,7 @@ run GDATTIM = F126 gdpfun = p48i -title = 1/-2/~ ? $m_title 2-day (F78-F126) PCPN|~DAY 4-5 (F78-F126) PCPN +title = 1/-2/~ ? ${m_title} 2-day (F78-F126) PCPN|~DAY 4-5 (F78-F126) PCPN l run @@ -209,13 +194,13 @@ run GDATTIM = F138 gdpfun = p48i -title = 1/-2/~ ? $m_title 2-day (F90-F138) PCPN|~DAY 4-5 (F90-F138) PCPN +title = 1/-2/~ ? ${m_title} 2-day (F90-F138) PCPN|~DAY 4-5 (F90-F138) PCPN l run GDATTIM = F156 gdpfun = p72i -title = 1/-2/~ ? $m_title 3-day (F84-F156) PCPN|~DAY 4-6 (F84-F156) PCPN +title = 1/-2/~ ? ${m_title} 3-day (F84-F156) PCPN|~DAY 4-6 (F84-F156) PCPN l run @@ -225,7 +210,7 @@ PROJ = mer//3;3;0;1 STNPLT = 5|5/1//2|alaska.tbl gdattim = f144 gdpfun = p72i -title = 1/-2/~ ? $m_title 3-day (F72-F144) PCPN|~AK 3-DAY(F72-F144) PCPN +title = 1/-2/~ ? ${m_title} 3-day (F72-F144) PCPN|~AK 3-DAY(F72-F144) PCPN l run @@ -240,21 +225,18 @@ export err=$?; err_chk # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK # FOR THIS CASE HERE. 
##################################################### -ls -l mrf.meta -export err=$?;export pgm="GEMPAK CHECK FILE"; err_chk - -if [ $SENDCOM = "YES" ] ; then - mv mrf.meta ${COMOUT}/gfs_${PDY}_${cyc}_usext - if [ $SENDDBN = "YES" ] ; then - $DBNROOT/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - $COMOUT/gfs_${PDY}_${cyc}_usext - if [ $DBN_ALERT_TYPE = "GFS_METAFILE_LAST" ] ; then - DBN_ALERT_TYPE=GFS_METAFILE - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/gfs_${PDY}_${cyc}_usext - fi - fi +if (( err != 0 )) || [[ ! -s mrf.meta ]] &> /dev/null; then + echo "FATAL ERROR: Failed to create gempak meta file mrf.meta" + exit $(( err + 100 )) fi -# - +mv mrf.meta "${COM_ATMOS_GEMPAK_META}/gfs_${PDY}_${cyc}_usext" +if [[ "${SENDDBN}" == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/gfs_${PDY}_${cyc}_usext" + if [[ ${DBN_ALERT_TYPE} == "GFS_METAFILE_LAST" ]] ; then + DBN_ALERT_TYPE=GFS_METAFILE + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/gfs_${PDY}_${cyc}_usext" + fi +fi diff --git a/gempak/ush/gfs_meta_ver.sh b/gempak/ush/gfs_meta_ver.sh index d63f6bc6df..1c03bad235 100755 --- a/gempak/ush/gfs_meta_ver.sh +++ b/gempak/ush/gfs_meta_ver.sh @@ -1,372 +1,68 @@ -#!/bin/sh +#! /usr/bin/env bash # # Metafile Script : gfs_meta_ver_new # -# Log : -# J. Carr/HPC 1/98 Added new metafile -# J. Carr/HPC 5/98 Converted to gdplot2 -# J. Carr/HPC 8/98 Changed map to medium resolution -# J. Carr/HPC 2/99 Changed skip to 0 -# J. Carr/HPC 6/99 Added latlon and a filter to map -# J. Carr/HPC 7/99 Added South American area. -# J. Carr/HPC 2/2001 Edited to run on the IBM. -# J. Carr/HPC 5/2001 Added a mn variable for a/b side dbnet root variable. -# J. Carr/HPC 8/2001 Changed to a korn shell for turnover to production. -# J. Carr/HPC 8/2001 Submitted. -# J. Carr/PMB 11/2004 Inserted a ? into all title/TITLE lines. -# Changed contur from 1 to a 2. 
-# Added logic to take the script from f126 to f228. -# This will remove need for mrfver. -# Removed logic which differentiated cycles since all cycles run to F384. -# Added a South American area for International desk. -# # Set up Local Variables # -set -x -export PS4='VER:$SECONDS + ' -mkdir -p -m 775 $DATA/VER -cd $DATA/VER -cp $FIXgempak/datatype.tbl datatype.tbl -export COMPONENT=${COMPONENT:-atmos} +source "${HOMEgfs}/ush/preamble.sh" + +mkdir -p -m 775 "${DATA}/VER" +cd "${DATA}/VER" || exit 2 +cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl -mdl=gfs MDL=GFS -metatype="ver" metaname="gfsver_${cyc}.meta" device="nc | ${metaname}" -PDY2=$(echo ${PDY} | cut -c3-) + +# +# Link data into DATA to sidestep gempak path limits +# TODO: Replace this # -# DEFINE 1 CYCLE AGO -dc1=$($NDATE -06 ${PDY}${cyc} | cut -c -10) -date1=$(echo ${dc1} | cut -c -8) -sdate1=$(echo ${dc1} | cut -c 3-8) -cycle1=$(echo ${dc1} | cut -c 9,10) -# DEFINE 2 CYCLES AGO -dc2=$($NDATE -12 ${PDY}${cyc} | cut -c -10) -date2=$(echo ${dc2} | cut -c -8) -sdate2=$(echo ${dc2} | cut -c 3-8) -cycle2=$(echo ${dc2} | cut -c 9,10) -# DEFINE 3 CYCLES AGO -dc3=$($NDATE -18 ${PDY}${cyc} | cut -c -10) -date3=$(echo ${dc3} | cut -c -8) -sdate3=$(echo ${dc3} | cut -c 3-8) -cycle3=$(echo ${dc3} | cut -c 9,10) -# DEFINE 4 CYCLES AGO -dc4=$($NDATE -24 ${PDY}${cyc} | cut -c -10) -date4=$(echo ${dc4} | cut -c -8) -sdate4=$(echo ${dc4} | cut -c 3-8) -cycle4=$(echo ${dc4} | cut -c 9,10) -# DEFINE 5 CYCLES AGO -dc5=$($NDATE -30 ${PDY}${cyc} | cut -c -10) -date5=$(echo ${dc5} | cut -c -8) -sdate5=$(echo ${dc5} | cut -c 3-8) -cycle5=$(echo ${dc5} | cut -c 9,10) -# DEFINE 6 CYCLES AGO -dc6=$($NDATE -36 ${PDY}${cyc} | cut -c -10) -date6=$(echo ${dc6} | cut -c -8) -sdate6=$(echo ${dc6} | cut -c 3-8) -cycle6=$(echo ${dc6} | cut -c 9,10) -# DEFINE 7 CYCLES AGO -dc7=$($NDATE -42 ${PDY}${cyc} | cut -c -10) -date7=$(echo ${dc7} | cut -c -8) -sdate7=$(echo ${dc7} | cut -c 3-8) -cycle7=$(echo ${dc7} | cut -c 9,10) -# DEFINE 8 
CYCLES AGO -dc8=$($NDATE -48 ${PDY}${cyc} | cut -c -10) -date8=$(echo ${dc8} | cut -c -8) -sdate8=$(echo ${dc8} | cut -c 3-8) -cycle8=$(echo ${dc8} | cut -c 9,10) -# DEFINE 9 CYCLES AGO -dc9=$($NDATE -54 ${PDY}${cyc} | cut -c -10) -date9=$(echo ${dc9} | cut -c -8) -sdate9=$(echo ${dc9} | cut -c 3-8) -cycle9=$(echo ${dc9} | cut -c 9,10) -# DEFINE 10 CYCLES AGO -dc10=$($NDATE -60 ${PDY}${cyc} | cut -c -10) -date10=$(echo ${dc10} | cut -c -8) -sdate10=$(echo ${dc10} | cut -c 3-8) -cycle10=$(echo ${dc10} | cut -c 9,10) -# DEFINE 11 CYCLES AGO -dc11=$($NDATE -66 ${PDY}${cyc} | cut -c -10) -date11=$(echo ${dc11} | cut -c -8) -sdate11=$(echo ${dc11} | cut -c 3-8) -cycle11=$(echo ${dc11} | cut -c 9,10) -# DEFINE 12 CYCLES AGO -dc12=$($NDATE -72 ${PDY}${cyc} | cut -c -10) -date12=$(echo ${dc12} | cut -c -8) -sdate12=$(echo ${dc12} | cut -c 3-8) -cycle12=$(echo ${dc12} | cut -c 9,10) -# DEFINE 13 CYCLES AGO -dc13=$($NDATE -78 ${PDY}${cyc} | cut -c -10) -date13=$(echo ${dc13} | cut -c -8) -sdate13=$(echo ${dc13} | cut -c 3-8) -cycle13=$(echo ${dc13} | cut -c 9,10) -# DEFINE 14 CYCLES AGO -dc14=$($NDATE -84 ${PDY}${cyc} | cut -c -10) -date14=$(echo ${dc14} | cut -c -8) -sdate14=$(echo ${dc14} | cut -c 3-8) -cycle14=$(echo ${dc14} | cut -c 9,10) -# DEFINE 15 CYCLES AGO -dc15=$($NDATE -90 ${PDY}${cyc} | cut -c -10) -date15=$(echo ${dc15} | cut -c -8) -sdate15=$(echo ${dc15} | cut -c 3-8) -cycle15=$(echo ${dc15} | cut -c 9,10) -# DEFINE 16 CYCLES AGO -dc16=$($NDATE -96 ${PDY}${cyc} | cut -c -10) -date16=$(echo ${dc16} | cut -c -8) -sdate16=$(echo ${dc16} | cut -c 3-8) -cycle16=$(echo ${dc16} | cut -c 9,10) -# DEFINE 17 CYCLES AGO -dc17=$($NDATE -102 ${PDY}${cyc} | cut -c -10) -date17=$(echo ${dc17} | cut -c -8) -sdate17=$(echo ${dc17} | cut -c 3-8) -cycle17=$(echo ${dc17} | cut -c 9,10) -# DEFINE 18 CYCLES AGO -dc18=$($NDATE -108 ${PDY}${cyc} | cut -c -10) -date18=$(echo ${dc18} | cut -c -8) -sdate18=$(echo ${dc18} | cut -c 3-8) -cycle18=$(echo ${dc18} | cut -c 9,10) -# DEFINE 19 
CYCLES AGO -dc19=$($NDATE -114 ${PDY}${cyc} | cut -c -10) -date19=$(echo ${dc19} | cut -c -8) -sdate19=$(echo ${dc19} | cut -c 3-8) -cycle19=$(echo ${dc19} | cut -c 9,10) -# DEFINE 20 CYCLES AGO -dc20=$($NDATE -120 ${PDY}${cyc} | cut -c -10) -date20=$(echo ${dc20} | cut -c -8) -sdate20=$(echo ${dc20} | cut -c 3-8) -cycle20=$(echo ${dc20} | cut -c 9,10) -# DEFINE 21 CYCLES AGO -dc21=$($NDATE -126 ${PDY}${cyc} | cut -c -10) -date21=$(echo ${dc21} | cut -c -8) -sdate21=$(echo ${dc21} | cut -c 3-8) -cycle21=$(echo ${dc21} | cut -c 9,10) -# DEFINE 22 CYCLES AGO -dc22=$($NDATE -132 ${PDY}${cyc} | cut -c -10) -date22=$(echo ${dc22} | cut -c -8) -sdate22=$(echo ${dc22} | cut -c 3-8) -cycle22=$(echo ${dc22} | cut -c 9,10) -# DEFINE 23 CYCLES AGO -dc23=$($NDATE -138 ${PDY}${cyc} | cut -c -10) -date23=$(echo ${dc23} | cut -c -8) -sdate23=$(echo ${dc23} | cut -c 3-8) -cycle23=$(echo ${dc23} | cut -c 9,10) -# DEFINE 24 CYCLES AGO -dc24=$($NDATE -144 ${PDY}${cyc} | cut -c -10) -date24=$(echo ${dc24} | cut -c -8) -sdate24=$(echo ${dc24} | cut -c 3-8) -cycle24=$(echo ${dc24} | cut -c 9,10) -# DEFINE 25 CYCLES AGO -dc25=$($NDATE -150 ${PDY}${cyc} | cut -c -10) -date25=$(echo ${dc25} | cut -c -8) -sdate25=$(echo ${dc25} | cut -c 3-8) -cycle25=$(echo ${dc25} | cut -c 9,10) -# DEFINE 26 CYCLES AGO -dc26=$($NDATE -156 ${PDY}${cyc} | cut -c -10) -date26=$(echo ${dc26} | cut -c -8) -sdate26=$(echo ${dc26} | cut -c 3-8) -cycle26=$(echo ${dc26} | cut -c 9,10) -# DEFINE 27 CYCLES AGO -dc27=$($NDATE -162 ${PDY}${cyc} | cut -c -10) -date27=$(echo ${dc27} | cut -c -8) -sdate27=$(echo ${dc27} | cut -c 3-8) -cycle27=$(echo ${dc27} | cut -c 9,10) -# DEFINE 28 CYCLES AGO -dc28=$($NDATE -168 ${PDY}${cyc} | cut -c -10) -date28=$(echo ${dc28} | cut -c -8) -sdate28=$(echo ${dc28} | cut -c 3-8) -cycle28=$(echo ${dc28} | cut -c 9,10) -# DEFINE 29 CYCLES AGO -dc29=$($NDATE -174 ${PDY}${cyc} | cut -c -10) -date29=$(echo ${dc29} | cut -c -8) -sdate29=$(echo ${dc29} | cut -c 3-8) -cycle29=$(echo ${dc29} | 
cut -c 9,10) -# DEFINE 30 CYCLES AGO -dc30=$($NDATE -180 ${PDY}${cyc} | cut -c -10) -date30=$(echo ${dc30} | cut -c -8) -sdate30=$(echo ${dc30} | cut -c 3-8) -cycle30=$(echo ${dc30} | cut -c 9,10) -# DEFINE 31 CYCLES AGO -dc31=$($NDATE -192 ${PDY}${cyc} | cut -c -10) -date31=$(echo ${dc31} | cut -c -8) -sdate31=$(echo ${dc31} | cut -c 3-8) -cycle31=$(echo ${dc31} | cut -c 9,10) -# DEFINE 32 CYCLES AGO -dc32=$($NDATE -204 ${PDY}${cyc} | cut -c -10) -date32=$(echo ${dc32} | cut -c -8) -sdate32=$(echo ${dc32} | cut -c 3-8) -cycle32=$(echo ${dc32} | cut -c 9,10) -# DEFINE 33 CYCLES AGO -dc33=$($NDATE -216 ${PDY}${cyc} | cut -c -10) -date33=$(echo ${dc33} | cut -c -8) -sdate33=$(echo ${dc33} | cut -c 3-8) -cycle33=$(echo ${dc33} | cut -c 9,10) +export COMIN="${RUN}.${PDY}${cyc}" +if [[ ! -L ${COMIN} ]]; then + ln -sf "${COM_ATMOS_GEMPAK_1p00}" "${COMIN}" +fi # SET CURRENT CYCLE AS THE VERIFICATION GRIDDED FILE. -vergrid="F-${MDL} | ${PDY2}/${cyc}00" +vergrid="F-${MDL} | ${PDY:2}/${cyc}00" fcsthr="f00" -# SET WHAT RUNS TO COMPARE AGAINST BASED ON MODEL CYCLE TIME. -#if [ ${cyc} -eq 00 ] ; then -# verdays="${dc1} ${dc2} ${dc3} ${dc4} ${dc5} ${dc6} ${dc7} ${dc8} ${dc9} ${dc10} ${dc11} ${dc12} ${dc13} ${dc14} ${dc15} ${dc16} ${dc17} ${dc18} ${dc19} ${dc20} ${dc21}" -#elif [ ${cyc} -eq 12 ] ; then -# verdays="${dc1} ${dc2} ${dc3} ${dc4} ${dc5} ${dc6} ${dc7} ${dc8} ${dc9} ${dc10} ${dc11} ${dc12} ${dc13} ${dc14} ${dc15} ${dc16} ${dc17} ${dc18} ${dc19} ${dc20} ${dc21}" -#else -# verdays="${dc1} ${dc2} ${dc3} ${dc4} ${dc5} ${dc6} ${dc7} ${dc8} ${dc9} ${dc10} ${dc11} ${dc12} ${dc13} ${dc14} ${dc15} ${dc16} ${dc17} ${dc18} ${dc19} ${dc20} ${dc21}" -#fi - -verdays="${dc1} ${dc2} ${dc3} ${dc4} ${dc5} ${dc6} ${dc7} ${dc8} ${dc9} ${dc10} ${dc11} ${dc12} ${dc13} ${dc14} ${dc15} ${dc16} ${dc17} ${dc18} ${dc19} -${dc20} ${dc21} ${dc22} ${dc23} ${dc24} ${dc25} ${dc26} ${dc27} ${dc28} ${dc29} ${dc30} ${dc31} ${dc32} ${dc33}" - - -#GENERATING THE METAFILES. 
MDL2="GFSHPC" -for verday in ${verdays} - do - cominday=$(echo ${verday} | cut -c -8) - #XXW export HPCGFS=$COMROOT/nawips/prod/${mdl}.${cominday} - # BV export HPCGFS=$COMROOT/nawips/${envir}/${mdl}.${cominday} - export HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cyc}/${COMPONENT}/gempak +#GENERATING THE METAFILES. +# seq won't give us any splitting problems, ignore warnings +# shellcheck disable=SC2207,SC2312 +IFS=$'\n' lookbacks=($(seq 6 6 180) $(seq 192 12 216)) +for lookback in "${lookbacks[@]}"; do + init_time="$(date --utc +%Y%m%d%H -d "${PDY} ${cyc} - ${lookback} hours")" + init_PDY=${init_time:0:8} + init_cyc=${init_time:8:2} + + if (( init_time <= ${SDATE:-0} )); then + echo "Skipping ver for ${init_time} because it is before the experiment began" + if (( lookback == "${lookbacks[0]}" )); then + echo "First forecast time, no metafile produced" + exit 0 + else + break + fi + fi + + dgdattim="f$(printf "%03g" "${lookback}")" - if [ ${verday} -eq ${dc1} ] ; then - dgdattim=f006 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle1}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate1}/${cycle1}00" - elif [ ${verday} -eq ${dc2} ] ; then - dgdattim=f012 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle2}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate2}/${cycle2}00" - elif [ ${verday} -eq ${dc3} ] ; then - dgdattim=f018 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle3}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate3}/${cycle3}00" - elif [ ${verday} -eq ${dc4} ] ; then - dgdattim=f024 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle4}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate4}/${cycle4}00" - elif [ ${verday} -eq ${dc5} ] ; then - dgdattim=f030 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle5}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate5}/${cycle5}00" - elif [ ${verday} -eq ${dc6} ] ; then - dgdattim=f036 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle6}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate6}/${cycle6}00" - elif [ ${verday} -eq ${dc7} 
] ; then - dgdattim=f042 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle7}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate7}/${cycle7}00" - elif [ ${verday} -eq ${dc8} ] ; then - dgdattim=f048 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle8}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate8}/${cycle8}00" - elif [ ${verday} -eq ${dc9} ] ; then - dgdattim=f054 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle9}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate9}/${cycle9}00" - elif [ ${verday} -eq ${dc10} ] ; then - dgdattim=f060 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle10}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate10}/${cycle10}00" - elif [ ${verday} -eq ${dc11} ] ; then - dgdattim=f066 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle11}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate11}/${cycle11}00" - elif [ ${verday} -eq ${dc12} ] ; then - dgdattim=f072 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle12}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate12}/${cycle12}00" - elif [ ${verday} -eq ${dc13} ] ; then - dgdattim=f078 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle13}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate13}/${cycle13}00" - elif [ ${verday} -eq ${dc14} ] ; then - dgdattim=f084 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle14}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate14}/${cycle14}00" - elif [ ${verday} -eq ${dc15} ] ; then - dgdattim=f090 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle15}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate15}/${cycle15}00" - elif [ ${verday} -eq ${dc16} ] ; then - dgdattim=f096 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle16}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate16}/${cycle16}00" - elif [ ${verday} -eq ${dc17} ] ; then - dgdattim=f102 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle17}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate17}/${cycle17}00" - elif [ ${verday} -eq ${dc18} ] ; then - dgdattim=f108 - 
HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle18}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate18}/${cycle18}00" - elif [ ${verday} -eq ${dc19} ] ; then - dgdattim=f114 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle19}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate19}/${cycle19}00" - elif [ ${verday} -eq ${dc20} ] ; then - dgdattim=f120 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle20}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate20}/${cycle20}00" - elif [ ${verday} -eq ${dc21} ] ; then - dgdattim=f126 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle21}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate21}/${cycle21}00" - elif [ ${verday} -eq ${dc22} ] ; then - dgdattim=f132 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle22}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate22}/${cycle22}00" - elif [ ${verday} -eq ${dc23} ] ; then - dgdattim=f138 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle23}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate23}/${cycle23}00" - elif [ ${verday} -eq ${dc24} ] ; then - dgdattim=f144 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle24}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate24}/${cycle24}00" - elif [ ${verday} -eq ${dc25} ] ; then - dgdattim=f150 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle25}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate25}/${cycle25}00" - elif [ ${verday} -eq ${dc26} ] ; then - dgdattim=f156 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle26}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate26}/${cycle26}00" - elif [ ${verday} -eq ${dc27} ] ; then - dgdattim=f162 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle27}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate27}/${cycle27}00" - elif [ ${verday} -eq ${dc28} ] ; then - dgdattim=f168 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle28}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate28}/${cycle28}00" - elif [ ${verday} -eq ${dc29} ] ; then - dgdattim=f174 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle29}/${COMPONENT}/gempak - 
grid="F-${MDL2} | ${sdate29}/${cycle29}00" - elif [ ${verday} -eq ${dc30} ] ; then - dgdattim=f180 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle30}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate30}/${cycle30}00" - elif [ ${verday} -eq ${dc31} ] ; then - dgdattim=f192 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle31}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate31}/${cycle31}00" - elif [ ${verday} -eq ${dc32} ] ; then - dgdattim=f204 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle32}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate32}/${cycle32}00" - elif [ ${verday} -eq ${dc33} ] ; then - dgdattim=f216 - HPCGFS=${COMINgempak}/${mdl}.${cominday}/${cycle33}/${COMPONENT}/gempak - grid="F-${MDL2} | ${sdate33}/${cycle33}00" + # Create symlink in DATA to sidestep gempak path limits + HPCGFS="${RUN}.${init_time}" + if [[ ! -L "${HPCGFS}" ]]; then + YMD=${init_PDY} HH=${init_cyc} GRID="1p00" generate_com source_dir:COM_ATMOS_GEMPAK_TMPL + ln -sf "${source_dir}" "${HPCGFS}" fi -# 500 MB HEIGHT METAFILE + grid="F-${MDL2} | ${init_PDY}/${init_cyc}00" -export pgm=gdplot2_nc;. prep_step; startmsg -$GEMEXE/gdplot2_nc << EOFplt + # 500 MB HEIGHT METAFILE + + export pgm=gdplot2_nc;. prep_step + "${GEMEXE}/gdplot2_nc" << EOFplt PROJ = STR/90.0;-95.0;0.0 GAREA = 5.1;-124.6;49.6;-11.9 map = 1//2 @@ -417,7 +113,7 @@ title = 5/-1/~ ? GFS PMSL clear = no r -!PROJ = +!PROJ = !GAREA = bwus !gdfile = ${vergrid} !gdattim = ${fcsthr} @@ -443,7 +139,7 @@ r ! SOUTH AMERICAN AREA. ! 500 MB -PROJ = +PROJ = GAREA = samps map = 1//2 clear = yes @@ -498,29 +194,29 @@ r ex EOFplt -export err=$?;err_chk -##################################################### -# GEMPAK DOES NOT ALWAYS HAVE A NON ZERO RETURN CODE -# WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK -# FOR THIS CASE HERE. 
-##################################################### -ls -l $metaname -export err=$?;export pgm="GEMPAK CHECK FILE";err_chk + export err=$?;err_chk + + ##################################################### + # GEMPAK DOES NOT ALWAYS HAVE A NON ZERO RETURN CODE + # WHEN IT CAN NOT PRODUCE THE DESIRED GRID. CHECK + # FOR THIS CASE HERE. + ##################################################### + if (( err != 0 )) || [[ ! -s "${metaname}" ]] &> /dev/null; then + echo "FATAL ERROR: Failed to create gempak meta file ${metaname}" + exit $(( err + 100 )) + fi done -if [ $SENDCOM = "YES" ] ; then - mv ${metaname} ${COMOUT}/gfsver_${PDY}_${cyc} - if [ $SENDDBN = "YES" ] ; then - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/gfsver_${PDY}_${cyc} - if [ $DBN_ALERT_TYPE = "GFS_METAFILE_LAST" ] ; then +mv "${metaname}" "${COM_ATMOS_GEMPAK_META}/gfsver_${PDY}_${cyc}" +if [[ "${SENDDBN}" == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/gfsver_${PDY}_${cyc}" + if [[ "${DBN_ALERT_TYPE}" = "GFS_METAFILE_LAST" ]] ; then DBN_ALERT_TYPE=GFS_METAFILE - ${DBNROOT}/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - ${COMOUT}/gfsver_${PDY}_${cyc} - fi - fi + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_META}/gfsver_${PDY}_${cyc}" + fi fi - exit diff --git a/jobs/JGDAS_ATMOS_GEMPAK b/jobs/JGDAS_ATMOS_GEMPAK index 1535e07ae3..b38f4b60bb 100755 --- a/jobs/JGDAS_ATMOS_GEMPAK +++ b/jobs/JGDAS_ATMOS_GEMPAK @@ -5,17 +5,6 @@ source "${HOMEgfs}/ush/jjob_header.sh" -e "gempak" -c "base gempak" # TODO (#1219) This j-job is not part of the rocoto suite -################################ -# Set up the HOME directory -export EXECgfs=${EXECgfs:-${HOMEgfs}/exec} -export PARMgfs=${PARMgfs:-${HOMEgfs}/parm} -export PARMwmo=${PARMwmo:-${HOMEgfs}/parm/wmo} -export PARMproduct=${PARMproduct:-${HOMEgfs}/parm/product} -export FIXgempak=${FIXgempak:-${HOMEgfs}/gempak/fix} -export 
USHgempak=${USHgempak:-${HOMEgfs}/gempak/ush} -export SRCgfs=${SRCgfs:-${HOMEgfs}/scripts} -export UTILgfs=${UTILgfs:-${HOMEgfs}/util} - ############################################ # Set up model and cycle specific variables ############################################ @@ -38,11 +27,11 @@ export model=${model:-gdas} ############################################## # Define COM directories ############################################## -for grid in 0p25 0p50 1p00; do +for grid in 0p25 1p00; do GRID=${grid} YMD=${PDY} HH=${cyc} generate_com -rx "COM_ATMOS_GRIB_${grid}:COM_ATMOS_GRIB_GRID_TMPL" done -for grid in 1p00 0p25; do +for grid in 0p25 1p00; do prod_dir="COM_ATMOS_GEMPAK_${grid}" GRID=${grid} YMD=${PDY} HH=${cyc} generate_com -rx "COM_ATMOS_GEMPAK_${grid}:COM_ATMOS_GEMPAK_TMPL" @@ -51,50 +40,38 @@ for grid in 1p00 0p25; do fi done - # TODO: These actions belong in an ex-script not a j-job if [[ -f poescript ]]; then rm -f poescript fi -######################################################## -# Execute the script. -echo "${SRCgfs}/exgdas_atmos_nawips.sh gdas 009 GDAS_GEMPAK ${COM_ATMOS_GEMPAK_1p00}" >> poescript -######################################################## +{ + ######################################################## + # Execute the script. 
+ echo "${SCRgfs}/exgdas_atmos_nawips.sh 1p00 009 GDAS_GEMPAK ${COM_ATMOS_GEMPAK_1p00}" + ######################################################## -######################################################## -# Execute the script for quater-degree grib -echo "${SRCgfs}/exgdas_atmos_nawips.sh gdas_0p25 009 GDAS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}" >> poescript -######################################################## + ######################################################## + # Execute the script for quarter-degree grib + echo "${SCRgfs}/exgdas_atmos_nawips.sh 0p25 009 GDAS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}" + ######################################################## +} > poescript cat poescript -chmod 775 ${DATA}/poescript -export MP_PGMMODEL=mpmd -export MP_CMDFILE=${DATA}/poescript - -ntasks=${NTASKS_GEMPAK:-$(cat ${DATA}/poescript | wc -l)} -ptile=${PTILE_GEMPAK:-4} -threads=${NTHREADS_GEMPAK:-1} -export OMP_NUM_THREADS=${threads} -APRUN="mpiexec -l -np ${ntasks} --cpu-bind verbose,core cfp" - -APRUN_GEMPAKCFP=${APRUN_GEMPAKCFP:-${APRUN}} - -${APRUN_GEMPAKCFP} ${DATA}/poescript +"${HOMEgfs}/ush/run_mpmd.sh" poescript export err=$?; err_chk ############################################ # print exec I/O output ############################################ -if [ -e "${pgmout}" ] ; then - cat ${pgmout} +if [[ -e "${pgmout}" ]] ; then + cat "${pgmout}" fi ################################### # Remove temp directories ################################### -if [ "${KEEPDATA}" != "YES" ] ; then - rm -rf ${DATA} +if [[ "${KEEPDATA}" != "YES" ]] ; then + rm -rf "${DATA}" fi - diff --git a/jobs/JGDAS_ATMOS_GEMPAK_META_NCDC b/jobs/JGDAS_ATMOS_GEMPAK_META_NCDC index 6948d29df6..27050de942 100755 --- a/jobs/JGDAS_ATMOS_GEMPAK_META_NCDC +++ b/jobs/JGDAS_ATMOS_GEMPAK_META_NCDC @@ -4,29 +4,13 @@ # GDAS GEMPAK META NCDC PRODUCT GENERATION ############################################ -# TODO (#1222) This j-job is not part of the rocoto - source "${HOMEgfs}/ush/preamble.sh" source 
"${HOMEgfs}/ush/jjob_header.sh" -e "gempak_meta" -c "base gempak" -################################ -# Set up the HOME directory -################################ -export HOMEgfs=${HOMEgfs:-${PACKAGEROOT}/gfs.${gfs_ver}} -export EXECgfs=${EXECgfs:-${HOMEgfs}/exec} -export PARMgfs=${PARMgfs:-${HOMEgfs}/parm} -export PARMwmo=${PARMwmo:-${HOMEgfs}/parm/wmo} -export PARMproduct=${PARMproduct:-${HOMEgfs}/parm/product} -export FIXgempak=${FIXgempak:-${HOMEgfs}/gempak/fix} -export USHgempak=${USHgempak:-${HOMEgfs}/gempak/ush} -export SRCgfs=${SRCgfs:-${HOMEgfs}/scripts} -export UTILgfs=${UTILgfs:-${HOMEgfs}/util} - -# # Now set up GEMPAK/NTRANS environment -# -cp ${FIXgempak}/datatype.tbl datatype.tbl +# datatype.tbl specifies the paths and filenames of files +cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl ################################### # Specify NET and RUN Name and model @@ -50,29 +34,37 @@ export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn} ############################################## # Define COM directories ############################################## -export COMIN=${COMIN:-$(compath.py ${envir}/${NET}/${gfs_ver})/${RUN}.${PDY}/${cyc}/${COMPONENT}/gempak} -export COMINgdas=${COMINgdas:-$(compath.py ${envir}/${NET}/${gfs_ver})/${RUN}} -export COMOUT=${COMOUT:-$(compath.py -o ${NET}/${gfs_ver}/${RUN}.${PDY})/${cyc}/${COMPONENT}/gempak/meta} -export COMOUTncdc=${COMOUTncdc:-$(compath.py -o ${NET}/${gfs_ver}/${RUN}.${PDY})/${cyc}/${COMPONENT}} +GRID=1p00 YMD=${PDY} HH=${cyc} generate_com -rx "COM_ATMOS_GEMPAK_1p00:COM_ATMOS_GEMPAK_TMPL" -export COMINukmet=${COMINukmet:-$(compath.py ${envir}/ukmet/${ukmet_ver})/ukmet} -export COMINecmwf=${COMINecmwf:-$(compath.py ${envir}/ecmwf/${ecmwf_ver})/ecmwf} +GRID="meta" YMD=${PDY} HH=${cyc} generate_com -rx "COM_ATMOS_GEMPAK_META:COM_ATMOS_GEMPAK_TMPL" +if [[ ! 
-d "${COM_ATMOS_GEMPAK_META}" ]]; then + mkdir -m 775 -p "${COM_ATMOS_GEMPAK_META}" +fi -export COMOUTukmet=${COMOUT} -export COMOUTecmwf=${COMOUT} +if (( cyc%12 == 0 )); then + GRID="gif" YMD=${PDY} HH=${cyc} generate_com -rx "COM_ATMOS_GEMPAK_GIF:COM_ATMOS_GEMPAK_TMPL" + if [[ ! -d "${COM_ATMOS_GEMPAK_GIF}" ]]; then + mkdir -m 775 -p "${COM_ATMOS_GEMPAK_GIF}" + fi +fi -mkdir -m 775 -p ${COMOUT} ${COMOUTncdc} ${COMOUTukmet} ${COMOUTecmwf} +export COMINukmet="${COMINukmet:-$(compath.py "${envir}/ukmet/${ukmet_ver}")/ukmet}" +export COMINecmwf="${COMINecmwf:-$(compath.py "${envir}/ecmwf/${ecmwf_ver}")/ecmwf}" export pgmout=OUTPUT.$$ - ######################################################## # Execute the script. -${USHgempak}/gdas_meta_na.sh -${USHgempak}/gdas_ecmwf_meta_ver.sh -${USHgempak}/gdas_meta_loop.sh -${USHgempak}/gdas_ukmet_meta_ver.sh +"${HOMEgfs}/gempak/ush/gdas_meta_na.sh" export err=$?; err_chk +"${HOMEgfs}/gempak/ush/gdas_meta_loop.sh" +export err=$?; err_chk +if [[ "${cyc}" == '06' ]]; then + "${HOMEgfs}/gempak/ush/gdas_ecmwf_meta_ver.sh" + export err=$?; err_chk + "${HOMEgfs}/gempak/ush/gdas_ukmet_meta_ver.sh" + export err=$?; err_chk +fi ######################################################## ############################################ @@ -81,21 +73,23 @@ export err=$?; err_chk ######################################################## # Execute the script. 
-${SRCgfs}/exgdas_atmos_gempak_gif_ncdc.sh +if (( cyc%12 == 0 )); then + "${SCRgfs}/exgdas_atmos_gempak_gif_ncdc.sh" +fi export err=$?; err_chk ######################################################## ############################################ # print exec I/O output ############################################ -if [ -e "${pgmout}" ] ; then - cat ${pgmout} +if [[ -e "${pgmout}" ]] ; then + cat "${pgmout}" fi ################################### # Remove temp directories ################################### -if [ "${KEEPDATA}" != "YES" ] ; then - rm -rf ${DATA} +if [[ "${KEEPDATA}" != "YES" ]] ; then + rm -rf "${DATA}" fi diff --git a/jobs/JGDAS_ENKF_FCST b/jobs/JGDAS_ENKF_FCST deleted file mode 100755 index 53408df8cf..0000000000 --- a/jobs/JGDAS_ENKF_FCST +++ /dev/null @@ -1,84 +0,0 @@ -#! /usr/bin/env bash - -source "${HOMEgfs}/ush/preamble.sh" -source "${HOMEgfs}/ush/jjob_header.sh" -e "efcs" -c "base fcst efcs" - - -############################################## -# Set variables used in the script -############################################## -export CDUMP=${RUN/enkf} - -############################################## -# Begin JOB SPECIFIC work -############################################## - -export CASE=${CASE_ENS} - -YMD=${PDY} HH=${cyc} generate_com -rx COM_TOP - - -# Forecast length for EnKF forecast -export FHMIN=${FHMIN_ENKF} -export FHOUT=${FHOUT_ENKF} -export FHMAX=${FHMAX_ENKF} - -# Get ENSBEG/ENSEND from ENSGRP and NMEM_EFCSGRP -if [[ $CDUMP == "gfs" ]]; then - export NMEM_EFCSGRP=${NMEM_EFCSGRP_GFS:-${NMEM_EFCSGRP:-1}} -fi -export ENSEND=$((NMEM_EFCSGRP * 10#${ENSGRP})) -export ENSBEG=$((ENSEND - NMEM_EFCSGRP + 1)) - -if [[ ${DO_WAVE} == "YES" ]]; then - declare -rx RUNwave="${RUN}wave" -fi - -############################################################### -# Run relevant script - -${ENKFFCSTSH:-${SCRgfs}/exgdas_enkf_fcst.sh} -status=$? 
-[[ ${status} -ne 0 ]] && exit ${status} - - -# Double check the status of members in ENSGRP -EFCSGRP="${COM_TOP}/efcs.grp${ENSGRP}" -npass=0 -if [ -f ${EFCSGRP} ]; then - npass=$(grep "PASS" ${EFCSGRP} | wc -l) -fi -echo "${npass}/${NMEM_EFCSGRP} members successfull in efcs.grp${ENSGRP}" -if [ ${npass} -ne ${NMEM_EFCSGRP} ]; then - echo "FATAL ERROR: Failed members in group ${ENSGRP}, ABORT!" - cat ${EFCSGRP} - exit 99 -fi - - -############################################## -# Send Alerts -############################################## -if [ ${SENDDBN} = YES ] ; then - ${DBNROOT}/bin/dbn_alert MODEL ENKF1_MSC_fcsstat ${job} ${EFCSGRP} -fi - - -############################################## -# End JOB SPECIFIC work -############################################## - -############################################## -# Final processing -############################################## -if [ -e "${pgmout}" ] ; then - cat ${pgmout} -fi - -########################################## -# Remove the Temporary working directory -########################################## -cd ${DATAROOT} -[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA} - -exit 0 diff --git a/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT b/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT index 4a8242abfb..20b669ab4c 100755 --- a/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT +++ b/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT @@ -3,7 +3,7 @@ source "${HOMEgfs}/ush/preamble.sh" export WIPE_DATA="NO" export DATA="${DATAROOT}/${RUN}ocnanal_${cyc}" -source "${HOMEgfs}/ush/jjob_header.sh" -e "ocnanalrun" -c "base ocnanal ocnanalrun" +source "${HOMEgfs}/ush/jjob_header.sh" -e "ocnanalbmat" -c "base ocnanal ocnanalbmat" ############################################## diff --git a/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_ECEN b/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_ECEN new file mode 100755 index 0000000000..c4ad80c9e3 --- /dev/null +++ b/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_ECEN @@ -0,0 +1,38 @@ +#!/bin/bash +source "${HOMEgfs}/ush/preamble.sh" +source 
"${HOMEgfs}/ush/jjob_header.sh" -e "ocnanalecen" -c "base ocnanal ocnanalecen" + +############################################## +# Set variables used in the script +############################################## + +############################################## +# Begin JOB SPECIFIC work +############################################## + +############################################################### +# Run relevant script + +EXSCRIPT=${GDASOCNCENPY:-${HOMEgfs}/scripts/exgdas_global_marine_analysis_ecen.py} +${EXSCRIPT} +status=$? +[[ ${status} -ne 0 ]] && exit "${status}" + +############################################## +# End JOB SPECIFIC work +############################################## + +############################################## +# Final processing +############################################## +if [[ -e "${pgmout}" ]] ; then + cat "${pgmout}" +fi + +########################################## +# Remove the Temporary working directory +########################################## +cd "${DATAROOT}" || exit 1 +[[ "${KEEPDATA}" = "NO" ]] && rm -rf "${DATA}" + +exit 0 diff --git a/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_PREP b/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_PREP index 2e49a9f14d..bad646bf2d 100755 --- a/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_PREP +++ b/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_PREP @@ -34,7 +34,8 @@ RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \ # Add UFSDA to PYTHONPATH ufsdaPATH="${HOMEgfs}/sorc/gdas.cd/ush/" -pyiodaPATH="${HOMEgfs}/sorc/gdas.cd/build/lib/python3.7/" +# shellcheck disable=SC2311 +pyiodaPATH="${HOMEgfs}/sorc/gdas.cd/build/lib/python$(detect_py_ver)/" PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}${ufsdaPATH}:${pyiodaPATH}" export PYTHONPATH diff --git a/jobs/JGFS_ATMOS_AWIPS_20KM_1P0DEG b/jobs/JGFS_ATMOS_AWIPS_20KM_1P0DEG index 516c7a403b..60f21c469b 100755 --- a/jobs/JGFS_ATMOS_AWIPS_20KM_1P0DEG +++ b/jobs/JGFS_ATMOS_AWIPS_20KM_1P0DEG @@ -5,17 +5,6 @@ source "${HOMEgfs}/ush/jjob_header.sh" -e "awips" -c "base awips" export 
OMP_NUM_THREADS=${OMP_NUM_THREADS:-1} -################################ -# Set up the HOME directory -################################ -export HOMEgfs=${HOMEgfs:-${PACKAGEROOT}/gfs.${gfs_ver}} -export USHgfs=${USHgfs:-${HOMEgfs}/ush} -export EXECgfs=${EXECgfs:-${HOMEgfs}/exec} -export PARMgfs=${PARMgfs:-${HOMEgfs}/parm} -export PARMwmo=${PARMwmo:-${HOMEgfs}/parm/wmo} -export PARMproduct=${PARMproduct:-${HOMEgfs}/parm/product} -export FIXgfs=${FIXgfs:-${HOMEgfs}/fix} - ################################### # Specify NET and RUN Name and model #################################### @@ -47,7 +36,7 @@ export pgmout=OUTPUT.$$ ######################################################## # Execute the script. -"${HOMEgfs}/scripts/exgfs_atmos_awips_20km_1p0deg.sh" "${fcsthrs}" +"${SCRgfs}/exgfs_atmos_awips_20km_1p0deg.sh" "${fcsthrs}" export err=$?; err_chk ######################################################## diff --git a/jobs/JGFS_ATMOS_AWIPS_G2 b/jobs/JGFS_ATMOS_AWIPS_G2 index 5bd7749997..27e37d7214 100755 --- a/jobs/JGFS_ATMOS_AWIPS_G2 +++ b/jobs/JGFS_ATMOS_AWIPS_G2 @@ -9,17 +9,6 @@ source "${HOMEgfs}/ush/jjob_header.sh" -e "awips" -c "base awips" export OMP_NUM_THREADS=${OMP_NUM_THREADS:-1} -################################ -# Set up the HOME directory -################################ -export USHgfs=${USHgfs:-${HOMEgfs}/ush} -export EXECgfs=${EXECgfs:-${HOMEgfs}/exec} -export PARMgfs=${PARMgfs:-${HOMEgfs}/parm} -export PARMwmo=${PARMwmo:-${HOMEgfs}/parm/wmo} -export PARMproduct=${PARMproduct:-${HOMEgfs}/parm/product} -export FIXgfs=${FIXgfs:-${HOMEgfs}/fix} -export UTILgfs=${UTILgfs:-${HOMEgfs}/util} - ################################### # Specify NET and RUN Name and model #################################### @@ -46,7 +35,7 @@ export pgmout=OUTPUT.$$ ######################################################### mkdir -m 775 awips_g1 cd ${DATA}/awips_g1 -${HOMEgfs}/scripts/exgfs_atmos_grib_awips.sh ${fcsthrs} +"${SCRgfs}/exgfs_atmos_grib_awips.sh" "${fcsthrs}" export err=$?; 
err_chk ############################################ diff --git a/jobs/JGFS_ATMOS_FBWIND b/jobs/JGFS_ATMOS_FBWIND index e04b06c0d6..dbfca49610 100755 --- a/jobs/JGFS_ATMOS_FBWIND +++ b/jobs/JGFS_ATMOS_FBWIND @@ -8,17 +8,6 @@ source "${HOMEgfs}/ush/preamble.sh" source "${HOMEgfs}/ush/jjob_header.sh" -e "fbwind" -c "base" -################################ -# Set up the HOME directory -################################ -export USHgfs=${USHgfs:-${HOMEgfs}/ush} -export EXECgfs=${EXECgfs:-${HOMEgfs}/exec} -export PARMgfs=${PARMgfs:-${HOMEgfs}/parm} -export PARMwmo=${PARMwmo:-${HOMEgfs}/parm/wmo} -export PARMproduct=${PARMproduct:-${HOMEgfs}/parm/product} -export FIXgfs=${FIXgfs:-${HOMEgfs}/fix} -export UTILgfs=${UTILgfs:-${HOMEgfs}/util} - ################################### # Specify NET and RUN Name and model #################################### @@ -40,7 +29,7 @@ mkdir -m 775 -p ${COMOUT} ${COMOUTwmo} ######################################################## # Execute the script. -${HOMEgfs}/scripts/exgfs_atmos_fbwind.sh +${SCRgfs}/exgfs_atmos_fbwind.sh export err=$?;err_chk ######################################################## diff --git a/jobs/JGFS_ATMOS_GEMPAK b/jobs/JGFS_ATMOS_GEMPAK index ddf10342d2..5aa6c6826f 100755 --- a/jobs/JGFS_ATMOS_GEMPAK +++ b/jobs/JGFS_ATMOS_GEMPAK @@ -3,20 +3,6 @@ source "${HOMEgfs}/ush/preamble.sh" source "${HOMEgfs}/ush/jjob_header.sh" -e "gempak" -c "base gempak" - -################################ -# Set up the HOME directory -################################ -export EXECgfs=${EXECgfs:-${HOMEgfs}/exec} -export PARMgfs=${PARMgfs:-${HOMEgfs}/parm} -export EXPDIR=${EXPDIR:-${HOMEgfs}/parm/config} -export FIXgempak=${FIXgempak:-${HOMEgfs}/gempak/fix} -export USHgempak=${USHgempak:-${HOMEgfs}/gempak/ush} -export SRCgfs=${SRCgfs:-${HOMEgfs}/scripts} - -# For half-degree P Grib files -export DO_HD_PGRB=${DO_HD_PGRB:-YES} - ############################################ # Set up model and cycle specific variables 
############################################ @@ -27,6 +13,9 @@ export GRIB=${GRIB:-pgrb2f} export EXT="" export DBN_ALERT_TYPE=${DBN_ALERT_TYPE:-GFS_GEMPAK} +# For half-degree P Grib files +export DO_HD_PGRB=${DO_HD_PGRB:-YES} + ################################### # Specify NET and RUN Name and model #################################### @@ -61,93 +50,75 @@ if (( ocean_domain_max > FHMAX_GFS )); then ocean_domain_max=${FHMAX_GFS} fi -################################################################# -# Execute the script for the 384 hour 1 degree grib -################################################################## -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_1p00} &> ${DATA}/gfs_1p0.$$.1 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_1p00} &> ${DATA}/gfs_1p0.$$.2 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_1p00} &> ${DATA}/gfs_1p0.$$.3 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_1p00} &> ${DATA}/gfs_1p0.$$.4 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_1p00} &> ${DATA}/gfs_1p0.$$.5 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_1p00} &> ${DATA}/gfs_1p0.$$.6 " >> poescript - -################################################################# -# Execute the script for the half-degree grib -################################################################## -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p50 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p50} &> ${DATA}/gfs_0p5.$$.1 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p50 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p50} &> ${DATA}/gfs_0p5.$$.2 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p50 ${FHMAX_GFS} GFS_GEMPAK 
${COM_ATMOS_GEMPAK_0p50} &> ${DATA}/gfs_0p5.$$.3 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p50 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p50} &> ${DATA}/gfs_0p5.$$.4 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p50 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p50} &> ${DATA}/gfs_0p5.$$.5 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p50 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p50} &> ${DATA}/gfs_0p5.$$.6 " >> poescript - -################################################################# -# Execute the script for the quater-degree grib -#################################################################### -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}&> ${DATA}/gfs_0p25.$$.1 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}&> ${DATA}/gfs_0p25.$$.2 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}&> ${DATA}/gfs_0p25.$$.3 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}&> ${DATA}/gfs_0p25.$$.4 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}&> ${DATA}/gfs_0p25.$$.5 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}&> ${DATA}/gfs_0p25.$$.6 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}&> ${DATA}/gfs_0p25.$$.7 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}&> ${DATA}/gfs_0p25.$$.8 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}&> ${DATA}/gfs_0p25.$$.9 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs_0p25 
${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}&> ${DATA}/gfs_0p25.$$.10 " >> poescript - -#################################################################### -# Execute the script to create the 35km Pacific grids for OPC -##################################################################### -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs35_pac ${ocean_domain_max} GFS_GEMPAK_WWB ${COM_ATMOS_GEMPAK_35km_pac} &> ${DATA}/gfs35_pac.$$.1 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs35_pac ${ocean_domain_max} GFS_GEMPAK_WWB ${COM_ATMOS_GEMPAK_35km_pac} &> ${DATA}/gfs35_pac.$$.2 " >> poescript - -#################################################################### -# Execute the script to create the 35km Atlantic grids for OPC -##################################################################### -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs35_atl ${ocean_domain_max} GFS_GEMPAK_WWB ${COM_ATMOS_GEMPAK_35km_atl} &> ${DATA}/gfs35_atl.$$.1 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs35_atl ${ocean_domain_max} GFS_GEMPAK_WWB ${COM_ATMOS_GEMPAK_35km_atl} &> ${DATA}/gfs35_atl.$$.2 " >> poescript - -##################################################################### -# Execute the script to create the 40km grids for HPC -###################################################################### -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs40 ${ocean_domain_max} GFS_GEMPAK_WWB ${COM_ATMOS_GEMPAK_40km} &> ${DATA}/gfs40.$$.1 " >> poescript -echo "time ${SRCgfs}/exgfs_atmos_nawips.sh gfs40 ${ocean_domain_max} GFS_GEMPAK_WWB ${COM_ATMOS_GEMPAK_40km} &> ${DATA}/gfs40.$$.2 " >> poescript - -if [[ ${CFP_MP:-"NO"} == "YES" ]]; then - # Add task number to the MPMD script - nl -n ln -v 0 poescript > poescript.new - mv poescript.new poescript -fi +{ + ################################################################# + # Execute the script for the 384 hour 1 degree grib + ################################################################## + echo "time 
${SCRgfs}/exgfs_atmos_nawips.sh 1p00 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_1p00}" + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 1p00 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_1p00}" + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 1p00 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_1p00}" + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 1p00 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_1p00}" + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 1p00 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_1p00}" + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 1p00 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_1p00}" + + ################################################################# + # Execute the script for the half-degree grib + ################################################################## + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 0p50 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p50}" + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 0p50 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p50}" + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 0p50 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p50}" + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 0p50 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p50}" + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 0p50 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p50}" + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 0p50 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p50}" + + ################################################################# + # Execute the script for the quarter-degree grib + #################################################################### + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}" + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}" + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}" + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}" + echo "time 
${SCRgfs}/exgfs_atmos_nawips.sh 0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}" + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}" + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}" + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}" + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}" + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 0p25 ${FHMAX_GFS} GFS_GEMPAK ${COM_ATMOS_GEMPAK_0p25}" + + #################################################################### + # Execute the script to create the 35km Pacific grids for OPC + ##################################################################### + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 35km_pac ${ocean_domain_max} GFS_GEMPAK_WWB ${COM_ATMOS_GEMPAK_35km_pac}" + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 35km_pac ${ocean_domain_max} GFS_GEMPAK_WWB ${COM_ATMOS_GEMPAK_35km_pac}" + + #################################################################### + # Execute the script to create the 35km Atlantic grids for OPC + ##################################################################### + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 35km_atl ${ocean_domain_max} GFS_GEMPAK_WWB ${COM_ATMOS_GEMPAK_35km_atl}" + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 35km_atl ${ocean_domain_max} GFS_GEMPAK_WWB ${COM_ATMOS_GEMPAK_35km_atl}" + + ##################################################################### + # Execute the script to create the 40km grids for HPC + ###################################################################### + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 40km ${ocean_domain_max} GFS_GEMPAK_WWB ${COM_ATMOS_GEMPAK_40km}" + echo "time ${SCRgfs}/exgfs_atmos_nawips.sh 40km ${ocean_domain_max} GFS_GEMPAK_WWB ${COM_ATMOS_GEMPAK_40km}" +} > poescript cat poescript -chmod 775 ${DATA}/poescript -export MP_PGMMODEL=mpmd -export 
MP_CMDFILE=${DATA}/poescript - -ntasks=$(cat ${DATA}/poescript | wc -l) -ptile=${PTILE_GEMPAK:-4} -threads=${NTHREADS_GEMPAK:-1} -export OMP_NUM_THREADS=${threads} -APRUN=${APRUN:-"mpiexec -l -np ${ntasks} --cpu-bind verbose,core cfp"} - -APRUN_GEMPAKCFP=${APRUN_GEMPAKCFP:-${APRUN}} -APRUNCFP=${APRUN_GEMPAKCFP} - -${APRUNCFP} ${DATA}/poescript +"${HOMEgfs}/ush/run_mpmd.sh" poescript export err=$?; err_chk ############################################ # print exec I/O output ############################################ -if [ -e "${pgmout}" ] ; then - cat ${pgmout} +if [[ -e "${pgmout}" ]] ; then + cat "${pgmout}" fi ################################### # Remove temp directories ################################### -if [ "${KEEPDATA}" != "YES" ] ; then - rm -rf ${DATA} +if [[ "${KEEPDATA}" != "YES" ]] ; then + rm -rf "${DATA}" fi - diff --git a/jobs/JGFS_ATMOS_GEMPAK_META b/jobs/JGFS_ATMOS_GEMPAK_META index 8e1c05763f..ac49a1e9ae 100755 --- a/jobs/JGFS_ATMOS_GEMPAK_META +++ b/jobs/JGFS_ATMOS_GEMPAK_META @@ -18,25 +18,19 @@ export MP_LABELIO=yes export MP_PULSE=0 export MP_DEBUG_NOTIMEOUT=yes -################################ -# Set up the HOME directory -################################ -export EXECgfs=${EXECgfs:-${HOMEgfs}/exec} -export PARMgfs=${PARMgfs:-${HOMEgfs}/parm} -export EXPDIR=${EXPDIR:-${HOMEgfs}/parm/config} -export FIXgempak=${FIXgempak:-${HOMEgfs}/gempak/fix} -export USHgempak=${USHgempak:-${HOMEgfs}/gempak/ush} -export SRCgfs=${SRCgfs:-${HOMEgfs}/scripts} - -cp ${FIXgempak}/datatype.tbl datatype.tbl +cp "${HOMEgfs}/gempak/fix/datatype.tbl" datatype.tbl ############################################# #set the fcst hrs for all the cycles ############################################# -export fhbeg=00 +export fhbeg=0 export fhend=384 export fhinc=12 +if (( fhend > FHMAX_GFS )); then + export fhend=${FHMAX_GFS} +fi + ################################### # Specify NET and RUN Name and model #################################### @@ -51,37 +45,37 @@ export 
DBN_ALERT_TYPE=GFS_METAFILE ############################################## # Define COM directories ############################################## -export COMIN=${COMIN:-$(compath.py ${envir}/${NET}/${gfs_ver})/${RUN}.${PDY}/${cyc}/${COMPONENT}/gempak} -export COMOUT=${COMOUT:-$(compath.py -o ${NET}/${gfs_ver}/${RUN}.${PDY})/${cyc}/${COMPONENT}/gempak/meta} -export COMINgempak=${COMINgempak:-$(compath.py ${envir}/${NET}/${gfs_ver})} - -export COMINukmet=${COMINukmet:-$(compath.py ${envir}/ukmet/${ukmet_ver})/ukmet} -export COMINecmwf=${COMINecmwf:-$(compath.py ${envir}/ecmwf/${ecmwf_ver})/ecmwf} -export COMINnam=${COMINnam:-$(compath.py ${envir}/nam/${nam_ver})/nam} +export COMINukmet=${COMINukmet:-$(compath.py "${envir}/ukmet/${ukmet_ver}")/ukmet} +export COMINecmwf=${COMINecmwf:-$(compath.py "${envir}/ecmwf/${ecmwf_ver}")/ecmwf} +export COMINnam=${COMINnam:-$(compath.py "${envir}/nam/${nam_ver}")/nam} export SENDDBN=${SENDDBN:-NO} export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn} -mkdir -m 775 -p ${COMOUT} +GRID=1p00 YMD=${PDY} HH=${cyc} generate_com -rx "COM_ATMOS_GEMPAK_1p00:COM_ATMOS_GEMPAK_TMPL" +GRID="meta" YMD=${PDY} HH=${cyc} generate_com -rx "COM_ATMOS_GEMPAK_META:COM_ATMOS_GEMPAK_TMPL" +if [[ ! -d "${COM_ATMOS_GEMPAK_META}" ]] ; then + mkdir -m 775 -p "${COM_ATMOS_GEMPAK_META}" +fi ######################################################## # Execute the script. 
-${SRCgfs}/exgfs_atmos_gempak_meta.sh +"${SCRgfs}/exgfs_atmos_gempak_meta.sh" export err=$?; err_chk ######################################################## ############################################ # print exec I/O output ############################################ -if [ -e "${pgmout}" ] ; then - cat ${pgmout} +if [[ -e "${pgmout}" ]] ; then + cat "${pgmout}" fi ################################### # Remove temp directories ################################### -if [ "${KEEPDATA}" != "YES" ] ; then - rm -rf ${DATA} +if [[ "${KEEPDATA}" != "YES" ]] ; then + rm -rf "${DATA}" fi diff --git a/jobs/JGFS_ATMOS_GEMPAK_NCDC_UPAPGIF b/jobs/JGFS_ATMOS_GEMPAK_NCDC_UPAPGIF index 58b24c5e49..4be68abcd3 100755 --- a/jobs/JGFS_ATMOS_GEMPAK_NCDC_UPAPGIF +++ b/jobs/JGFS_ATMOS_GEMPAK_NCDC_UPAPGIF @@ -1,41 +1,20 @@ #! /usr/bin/env bash -# TODO (#1222) This job is not part of the rocoto suite - ############################################ # GFS GEMPAK NCDC PRODUCT GENERATION ############################################ source "${HOMEgfs}/ush/preamble.sh" source "${HOMEgfs}/ush/jjob_header.sh" -e "gempak_gif" -c "base" - -################################ -# Set up the HOME directory -################################ -export EXECgfs=${EXECgfs:-${HOMEgfs}/exec} -export PARMgfs=${PARMgfs:-${HOMEgfs}/parm} -export EXPDIR=${EXPDIR:-${HOMEgfs}/parm/config} -export FIXgfs=${FIXgfs:-${HOMEgfs}/gempak/fix} -export USHgempak=${USHgempak:-${HOMEgfs}/gempak/ush} -export SRCgfs=${SRCgfs:-${HOMEgfs}/scripts} -export UTILgfs=${UTILgfs:-${HOMEgfs}/util} - -###################################### -# Set up the GEMPAK directory -####################################### -export HOMEgempak=${HOMEgempak:-${HOMEgfs}/gempak} -export FIXgempak=${FIXgempak:-${HOMEgempak}/fix} -export USHgempak=${USHgempak:-${HOMEgempak}/ush} - export MP_PULSE=0 export MP_TIMEOUT=2000 - -# # Set up model and cycle specific variables -# export MODEL=GFS export fend=384 +if (( fend > FHMAX_GFS )); then + export 
fend="${FHMAX_GFS}" +fi # set increment to 6 hours -- 3 hours is available. export finc=6 @@ -50,37 +29,40 @@ export COMPONENT="atmos" ############################################## # Define COM directories ############################################## -export COMIN=${COMIN:-$(compath.py ${envir}/${NET}/${gfs_ver})/${RUN}.${PDY}/${cyc}/${COMPONENT}/gempak} -export COMINgfs=${COMINgfs:-$(compath.py ${envir}/${NET}/${gfs_ver}/${RUN}.${PDY})/${cyc}/${COMPONENT}} -export COMINobsproc=${COMINobsproc:-$(compath.py ${envir}/obsproc/${obsproc_ver})/${RUN}.${PDY}/${cyc}/${COMPONENT}} -export COMOUT=${COMOUT:-$(compath.py -o ${NET}/${gfs_ver}/${RUN}.${PDY})/${cyc}/${COMPONENT}} -export COMOUTwmo=${COMOUTwmo:-${COMOUT}/wmo} +YMD=${PDY} HH=${cyc} generate_com -rx "COM_OBS" +GRID=1p00 YMD=${PDY} HH=${cyc} generate_com -rx "COM_ATMOS_GEMPAK_1p00:COM_ATMOS_GEMPAK_TMPL" + +for grid in gif upper_air; do + gempak_dir="COM_ATMOS_GEMPAK_${grid^^}" + GRID=${grid} YMD=${PDY} HH=${cyc} generate_com -rx "${gempak_dir}:COM_ATMOS_GEMPAK_TMPL" + if [[ ! -d "${!gempak_dir}" ]]; then mkdir -m 775 -p "${!gempak_dir}"; fi +done + +YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_WMO +if [[ ! -d "${COM_ATMOS_WMO}" ]]; then mkdir -m 775 -p "${COM_ATMOS_WMO}"; fi export SENDDBN=${SENDDBN:-NO} export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn} -mkdir -m 775 -p ${COMOUT} ${COMOUTwmo} - export pgmout=OUTPUT.$$ - ######################################################## # Execute the script. 
-${SRCgfs}/exgfs_atmos_gempak_gif_ncdc_skew_t.sh +"${SCRgfs}/exgfs_atmos_gempak_gif_ncdc_skew_t.sh" export err=$?; err_chk ######################################################## ############################################ # print exec I/O output ############################################ -if [ -e "${pgmout}" ] ; then - cat ${pgmout} +if [[ -e "${pgmout}" ]] ; then + cat "${pgmout}" fi ################################### # Remove temp directories ################################### -if [ "${KEEPDATA}" != "YES" ] ; then - rm -rf ${DATA} +if [[ "${KEEPDATA}" != "YES" ]] ; then + rm -rf "${DATA}" fi diff --git a/jobs/JGFS_ATMOS_GEMPAK_PGRB2_SPEC b/jobs/JGFS_ATMOS_GEMPAK_PGRB2_SPEC index 47415a39ff..2412731cab 100755 --- a/jobs/JGFS_ATMOS_GEMPAK_PGRB2_SPEC +++ b/jobs/JGFS_ATMOS_GEMPAK_PGRB2_SPEC @@ -6,20 +6,6 @@ source "${HOMEgfs}/ush/preamble.sh" source "${HOMEgfs}/ush/jjob_header.sh" -e "gempak_spec" -c "base" - -################################ -# Set up the HOME directory -################################ -export EXECgfs="${EXECgfs:-${HOMEgfs}/exec}" -export PARMgfs="${PARMgfs:-${HOMEgfs}/parm}" -export EXPDIR="${EXPDIR:-${HOMEgfs}/parm/config}" -export FIXgempak="${FIXgempak:-${HOMEgfs}/gempak/fix}" -export USHgempak="${USHgempak:-${HOMEgfs}/gempak/ush}" -export SRCgfs="${SRCgfs:-${HOMEgfs}/scripts}" - -# For half-degree P Grib files -#export DO_HD_PGRB=YES - ################################### # Specify NET and RUN Name and model #################################### @@ -28,17 +14,19 @@ export finc=3 export model=gfs export EXT="" +# For half-degree P Grib files +#export DO_HD_PGRB=YES + ############################################## # Define COM directories ############################################## -export COMIN="${COMIN:-$(compath.py "${envir}"/"${NET}"/"${gfs_ver}")/${RUN}.${PDY}/${cyc}/${COMPONENT}}" -export COMOUT="${COMOUT:-$(compath.py -o "${NET}"/"${gfs_ver}"/"${NET}"."${PDY}")/${cyc}/${COMPONENT}/gempak}" +YMD=${PDY} HH=${cyc} generate_com -rx 
COM_ATMOS_GOES +GRID=0p25 YMD=${PDY} HH=${cyc} generate_com -rx "COM_ATMOS_GEMPAK_0p25:COM_ATMOS_GEMPAK_TMPL" +if [[ ! -d "${COM_ATMOS_GEMPAK_0p25}" ]]; then mkdir -m 775 -p "${COM_ATMOS_GEMPAK_0p25}"; fi export SENDDBN="${SENDDBN:-NO}" export DBNROOT="${DBNROOT:-${UTILROOT}/fakedbn}" -mkdir -m 775 -p "${COMOUT}" - ################################################################# # Execute the script for the regular grib ################################################################# @@ -49,16 +37,19 @@ cd "${DATA_SPECIAL}" || exit 1 export DBN_ALERT_TYPE=GFS_GOESSIM_GEMPAK export RUN2=gfs_goessim export GRIB=goessimpgrb2.0p25.f -export EXT=" " +export EXT="" export fend=180 +if (( fend > FHMAX_GFS )); then + fend=${FHMAX_GFS} +fi export finc=3 -export fstart=000 +export fstart=0 echo "RUNS the Program" ######################################################## # Execute the script. -"${SRCgfs}/exgfs_atmos_goes_nawips.sh" +"${SCRgfs}/exgfs_atmos_goes_nawips.sh" ################################################################# # Execute the script for the 221 grib @@ -72,14 +63,17 @@ export RUN2=gfs_goessim221 export GRIB=goessimpgrb2f export EXT=".grd221" export fend=180 +if (( fend > FHMAX_GFS )); then + fend=${FHMAX_GFS} +fi export finc=3 -export fstart=000 +export fstart=0 echo "RUNS the Program" ######################################################## # Execute the script. 
-"${SRCgfs}/exgfs_atmos_goes_nawips.sh" +"${SCRgfs}/exgfs_atmos_goes_nawips.sh" export err=$?; err_chk ######################################################## diff --git a/jobs/JGFS_ATMOS_PGRB2_SPEC_NPOESS b/jobs/JGFS_ATMOS_PGRB2_SPEC_NPOESS index a98835ada2..fba33bb75c 100755 --- a/jobs/JGFS_ATMOS_PGRB2_SPEC_NPOESS +++ b/jobs/JGFS_ATMOS_PGRB2_SPEC_NPOESS @@ -10,17 +10,6 @@ source "${HOMEgfs}/ush/jjob_header.sh" -e "npoess" -c "base" export OMP_NUM_THREADS=${OMP_NUM_THREADS:-1} -################################ -# Set up the HOME directory -################################ -export USHgfs=${USHgfs:-${HOMEgfs}/ush} -export EXECgfs=${EXECgfs:-${HOMEgfs}/exec} -export PARMgfs=${PARMgfs:-${HOMEgfs}/parm} -export EXPDIR=${EXPDIR:-${HOMEgfs}/parm/config} -export PARMwmo=${PARMwmo:-${HOMEgfs}/parm/wmo} -export PARMproduct=${PARMproduct:-${HOMEgfs}/parm/product} -export FIXgfs=${FIXgfs:-${HOMEgfs}/fix} - ################################### # Specify NET and RUN Name and model #################################### @@ -32,14 +21,14 @@ export model=${model:-gfs} export SENDDBN=${SENDDBN:-NO} export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn} -YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_GOES +YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_MASTER COM_ATMOS_GOES GRID="0p50" YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_GRIB_0p50:COM_ATMOS_GRIB_GRID_TMPL mkdir -m 775 -p "${COM_ATMOS_GOES}" ############################################################# # Execute the script -"${HOMEgfs}/scripts/exgfs_atmos_grib2_special_npoess.sh" +"${SCRgfs}/exgfs_atmos_grib2_special_npoess.sh" export err=$?;err_chk ############################################################# diff --git a/jobs/JGFS_ATMOS_POSTSND b/jobs/JGFS_ATMOS_POSTSND index 721dd27628..e6411709fa 100755 --- a/jobs/JGFS_ATMOS_POSTSND +++ b/jobs/JGFS_ATMOS_POSTSND @@ -9,7 +9,6 @@ source "${HOMEgfs}/ush/jjob_header.sh" -e "postsnd" -c "base postsnd" ############################################## export CDUMP=${RUN/enkf} - 
######################################## # Runs GFS BUFR SOUNDINGS ######################################## @@ -18,17 +17,6 @@ export model=${model:-gfs} export SENDDBN=${SENDDBN:-YES} export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn} -################################### -# Set up the source directories -################################### - -export HOMEbufrsnd=${HOMEbufrsnd:-${HOMEgfs}} -export EXECbufrsnd=${EXECbufrsnd:-${HOMEbufrsnd}/exec} -export FIXbufrsnd=${FIXbufrsnd:-${HOMEbufrsnd}/fix/product} -export PARMbufrsnd=${PARMbufrsnd:-${HOMEbufrsnd}/parm/product} -export USHbufrsnd=${USHbufrsnd:-${HOMEbufrsnd}/ush} -export SCRbufrsnd=${SCRbufrsnd:-${HOMEbufrsnd}/scripts} - ############################## # Define COM Directories ############################## @@ -44,7 +32,7 @@ YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_HISTORY COM_ATMOS_BUFR \ ######################################################## # Execute the script. -${SCRbufrsnd}/exgfs_atmos_postsnd.sh +${SCRgfs}/exgfs_atmos_postsnd.sh status=$? 
[[ ${status} -ne 0 ]] && exit ${status} diff --git a/jobs/JGFS_ATMOS_WAFS b/jobs/JGFS_ATMOS_WAFS new file mode 100755 index 0000000000..35a916bf1a --- /dev/null +++ b/jobs/JGFS_ATMOS_WAFS @@ -0,0 +1,96 @@ +#!/bin/sh + +######################################## +# GFS AWIPS PRODUCT GENERATION +######################################## +date +export PS4='$SECONDS + ' +set -xa + +export KEEPDATA=${KEEPDATA:-NO} + +############################################ +# Working Directory +############################################ +export DATA=${DATA:-${DATAROOT}/${jobid:?}} +mkdir -p $DATA +cd $DATA + +############################################ +# Output for executables +############################################ +export pgmout=OUTPUT.$$ + +############################################ +# Load the UTILITIES module +############################################ +#### module load prod_util +#### module load grib_util + +########################################### +# Run setpdy and initialize PDY variables +########################################### +export cycle=t${cyc}z +setpdy.sh +. 
./PDY + +export RERUN=${RERUN:-NO} + +############################################ +# Set up the NET and RUN +############################################ +export NET=${NET:-gfs} +export RUN=${RUN:-gfs} +export COMPONENT=${COMPONENT:-atmos} + +############################################ +# Specify HOME Directory +############################################ +export gfs_ver=${gfs_ver:-v16.3.0} +export HOMEgfs=${HOMEgfs:-${NWROOT}/gfs.${gfs_ver}} +export EXECgfs=$HOMEgfs/exec +export FIXgfs=$HOMEgfs/fix/wafs +export PARMgfs=$HOMEgfs/parm/wafs +export USHgfs=$HOMEgfs/ush +export SCRIPTSgfs=$HOMEgfs/scripts + +################################################ +# Set up the input/output directory +################################################ +export COMIN=${COMIN:-$(compath.py ${envir}/${NET}/${gfs_ver})/${RUN}.${PDY}/${cyc}/$COMPONENT} +export COMOUT=${COMOUT:-$(compath.py -o $NET/$gfs_ver)/$RUN.$PDY/$cyc/$COMPONENT} +export PCOM=${PCOM:-$COMOUT/wmo} + +if [ $SENDCOM = YES ] ; then + mkdir -p $COMOUT $PCOM +fi + +############################################ +# print current environment +############################################ +env + +############################################ +# Execute the script. +############################################ + +${SCRIPTSgfs}/exgfs_atmos_wafs_grib.sh $fcsthrs +export err=$?; err_chk + +echo "JOB $job HAS COMPLETED NORMALLY!" 
+ +############################################ +# print exec output +############################################ +if [ -e "$pgmout" ] ; then + cat $pgmout +fi + +############################################ +# remove temporary working directory +############################################ +if [ $KEEPDATA != YES ] ; then + rm -rf $DATA +fi + +date diff --git a/jobs/JGFS_ATMOS_WAFS_BLENDING_0P25 b/jobs/JGFS_ATMOS_WAFS_BLENDING_0P25 new file mode 100755 index 0000000000..7367ce5a2c --- /dev/null +++ b/jobs/JGFS_ATMOS_WAFS_BLENDING_0P25 @@ -0,0 +1,153 @@ +#!/bin/sh +######################################################## +# This job runs the code to blend US's and UK's WAFS products at 0.25 deg +######################################################## + +date +export PS4='$SECONDS + ' +set -x + +# keep the working directory or not +export KEEPDATA=${KEEPDATA:-NO} + +############################################ +# Working Directory +############################################ +export DATA=${DATA:-${DATAROOT}/${jobid:?}} +mkdir -p $DATA +cd $DATA + +############################################ +# Output for executables +############################################ +export pgmout=OUTPUT.$$ + +########################################### +# Run setpdy and initialize PDY variables +########################################### +export cycle=t${cyc}z +setpdy.sh +. 
./PDY + +export RERUN=${RERUN:-NO} + +############################################ +# Set up the NET and RUN +############################################ +export NET=${NET:-gfs} +export RUN=${RUN:-gfs} +export COMPONENT=${COMPONENT:-atmos} + +############################################ +# Specify HOME Directory +############################################ +export gfs_ver=${gfs_ver:-v16.3.0} +export HOMEgfs=${HOMEgfs:-${NWROOT}/gfs.${gfs_ver}} +export EXECgfs=$HOMEgfs/exec +export FIXgfs=$HOMEgfs/fix/wafs +export PARMgfs=$HOMEgfs/parm/wafs +export USHgfs=$HOMEgfs/ush +export SCRIPTSgfs=$HOMEgfs/scripts + +################################################ +# Set up the INPUT and OUTPUT directories +################################################ +export COMIN=${COMIN:-$(compath.py ${envir}/${NET}/${gfs_ver})/${RUN}.${PDY}/${cyc}/$COMPONENT} +export COMOUT=${COMOUT:-$(compath.py -o $NET/$gfs_ver)/$RUN.$PDY/$cyc/$COMPONENT} +export PCOM=${PCOM:-$COMOUT/wmo} + +if [ $SENDCOM = YES ] ; then + mkdir -p $COMOUT $PCOM +fi + +export COMINus=${COMINus:-$COMIN} +export COMINuk=${COMINuk:-$DCOMROOT/$PDY/wgrbbul/ukmet_wafs} + +############################################ +# print current environment +############################################ +env + +############################################## +# Set up the forecast hours +############################################## +export SHOUR=${SHOUR:-06} +export EHOUR=${EHOUR:-48} +export FHOUT_GFS=${FHOUT_GFS:-1} + +############################################### +# Specify Timeout Behavior of WAFS blending +# +# SLEEP_TIME - Amount of time to wait for +# an input file before exiting +# SLEEP_INT - Amount of time to wait between +# checking for input files +############################################### +# export SLEEP_TIME=300 # changed to 60 to avoid hitting wall_clock when missing ukmet wafs files ... 
+# JY -0129: export SLEEP_TIME=600 +export SLEEP_TIME=900 +export SLEEP_INT=10 + +#################################### +# Check if this is a restart +#################################### +if test -f $COMOUT/$RUN.t${cyc}z.control.wafsblending_0p25 +then + modelrecvy=`cat < $COMOUT/${RUN}.t${cyc}z.control.wafsblending_0p25` + recvy_pdy=`echo $modelrecvy | cut -c1-8` + recvy_cyc=`echo $modelrecvy | cut -c9-10` + recvy_shour=`echo $modelrecvy | cut -c11-` + + if [ $FHOUT_GFS -eq 3 ] ; then + FHINC=03 + else + if [ $recvy_shour -lt 24 ] ; then + FHINC=01 + else + FHINC=03 + fi + fi + + if test $RERUN = "NO" + then + if [ $recvy_shour -lt $EHOUR ] + then + new_shour=`expr $recvy_shour + $FHINC` + fi + if test $new_shour -ge $SHOUR + then + export SHOUR=$new_shour + if [ $SHOUR -lt 10 ]; then SHOUR=0$SHOUR; fi + fi + if test $recvy_shour -ge $EHOUR + then + echo "WAFS blending Already Completed to $EHOUR" + else + echo "Starting: PDY=$PDY cycle=t${recvy_cyc}z SHOUR=$SHOUR ." + fi + fi +fi + +############################################ +# Execute the script. +############################################ +${SCRIPTSgfs}/exgfs_atmos_wafs_blending_0p25.sh +export err=$?; err_chk + +echo "JOB $job HAS COMPLETED NORMALLY." 
+ +############################################ +# print exec output +############################################ +if [ -e "$pgmout" ] ; then + cat $pgmout +fi + +############################################ +# remove temporary working directory +############################################ +if [ $KEEPDATA != YES ] ; then + rm -rf $DATA +fi + +date diff --git a/jobs/JGFS_ATMOS_WAFS_GCIP b/jobs/JGFS_ATMOS_WAFS_GCIP new file mode 100755 index 0000000000..d4e1a4529f --- /dev/null +++ b/jobs/JGFS_ATMOS_WAFS_GCIP @@ -0,0 +1,140 @@ +#!/bin/sh + +############################################ +# GFS GCIP PRODUCT GENERATION +############################################ + +date +export PS4='$SECONDS + ' +set -xa + +# keep the working directory or not +export KEEPDATA=${KEEPDATA:-NO} + +############################################ +# Working Directory +############################################ +export DATA=${DATA:-${DATAROOT}/${jobid:?}} +mkdir -p $DATA +cd $DATA + +############################################ +# Output for executables +############################################ +export pgmout=OUTPUT.$$ + +############################################ +# Load the UTILITIES module +############################################ +#### module load prod_util +#### module load grib_util + +############################################ +# Run setpdy and initialize PDY variables +############################################ +export cycle=t${cyc}z +setpdy.sh +. 
./PDY + +############################################ +# Set up the NET and RUN +############################################ +export NET=${NET:-gfs} +export RUN=${RUN:-gfs} +export COMPONENT=${COMPONENT:-atmos} + +############################################ +# Specify HOME Directory +############################################ +export gfs_ver=${gfs_ver:-v16.3.0} +export HOMEgfs=${HOMEgfs:-${NWROOT}/gfs.${gfs_ver}} +export EXECgfs=$HOMEgfs/exec +export FIXgfs=$HOMEgfs/fix/wafs +export PARMgfs=$HOMEgfs/parm/wafs +export USHgfs=$HOMEgfs/ush +export SCRIPTSgfs=$HOMEgfs/scripts + +# For BUFR dump, TMPDIR must be defined +export TMPDIR=$DATA # will be overwritten in exgfs script for parallel runs on ffhr +# For BUFR dump, these two environment variables are defined by module load +# HOMEobsproc_shared_bufr_dumplist <= module load bufr_dumplist/1.5.0 +# HOMEobsproc_dump <= module load dumpjb/4.0.0 + + +################################################ +# Set up the input/output directory +################################################ +# model data +export COMINgfs=${COMINgfs:-$(compath.py ${envir}/${NET}/${gfs_ver})/${RUN}.${PDY}/${cyc}/$COMPONENT} + +# satellite data +#ftp://satepsanone.nesdis.noaa.gov/2day/gmosaic/ +# Have to change IP address to digital ones, which BSUB can identify +#export COMINsat=${COMINsat:-ftp://140.90.213.161/2day/gmosaic} +export COMINsat=${COMINsat:-$DCOMROOT/$PDY/mcidas} + +# radar data +export radarl2_ver=${radarl2_ver:-v1.2} +export COMINradar=${COMINradar:-$(compath.py ${envir}/radarl2/$radarl2_ver)/radar.$PDY} + +# metar/ships/lightning/pireps +# data are dumped by $USHobsproc_dump/dumpjb +# + +# COMOUT +export COMOUT=${COMOUT:-$(compath.py -o $NET/$gfs_ver)/$RUN.$PDY/$cyc/$COMPONENT} + +mkdir -p $COMOUT + +############################################### +# Specify Timeout Behavior of WAFS GCIP +# +# SLEEP_TIME - how long to wait for inputs before exiting +# SLEEP_INT - time interval for checking for inputs 
+############################################### +# JY export SLEEP_TIME=300 +export SLEEP_TIME=600 +export SLEEP_INT=10 + +############################################ +# Execute the script, parallel run for 000 003 +############################################ +export MPIRUN=${MPIRUN:-"mpiexec -l -np 2 --cpu-bind verbose,core cfp"} + +# GCIP runs f000 f003 for each cycle, 4 times/day, +# to make the output valid every 3 hours +if [ `echo $MPIRUN | cut -d " " -f1` = 'srun' ] ; then + echo 0 ${SCRIPTSgfs}/exgfs_atmos_wafs_gcip.sh 000 >> gcip.cmdfile + echo 1 ${SCRIPTSgfs}/exgfs_atmos_wafs_gcip.sh 003 >> gcip.cmdfile +else + echo ${SCRIPTSgfs}/exgfs_atmos_wafs_gcip.sh 000 >> gcip.cmdfile + echo ${SCRIPTSgfs}/exgfs_atmos_wafs_gcip.sh 003 >> gcip.cmdfile + export MP_PGMMODEL=mpmd +fi + +$MPIRUN gcip.cmdfile + +export err=$? +if [ $err -eq 0 ] ; then + echo "JOB $job HAS COMPLETED NORMALLY!" +elif [ $err -eq 1 ] ; then + echo "WARNING!!! JOB $job incomplete. Missing satellite data." +else + echo "JOB $job FAILED!!!!" 
+fi + +############################################ +# print exec output +############################################ +if [ -e "$pgmout" ] ; then + cat $pgmout +fi + +############################################ +# remove temporary working directory +############################################ +if [ $KEEPDATA != YES ] ; then + rm -rf $DATA +fi + +date diff --git a/jobs/JGFS_ATMOS_WAFS_GRIB2 b/jobs/JGFS_ATMOS_WAFS_GRIB2 new file mode 100755 index 0000000000..ed4c92979e --- /dev/null +++ b/jobs/JGFS_ATMOS_WAFS_GRIB2 @@ -0,0 +1,124 @@ +#!/bin/sh + +######################################## +# GFS AWIPS PRODUCT GENERATION +######################################## + +date +export PS4='$SECONDS + ' +set -x + +# keep the working directory or not +export KEEPDATA=${KEEPDATA:-NO} + +############################################ +# Working Directory +############################################ +export DATA=${DATA:-${DATAROOT}/${jobid:?}} +mkdir -p $DATA +cd $DATA + +############################################ +# Output for executables +############################################ +export pgmout=OUTPUT.$$ + +############################################ +# Load the UTILITIES module +############################################ +#### module load prod_util +#### module load grib_util + +########################################### +# Run setpdy and initialize PDY variables +########################################### +export cycle=t${cyc}z +setpdy.sh +. 
./PDY + +############################################ +# Set up the NET and RUN +############################################ +export NET=${NET:-gfs} +export RUN=${RUN:-gfs} +export COMPONENT=${COMPONENT:-atmos} + +############################################ +# Specify HOME Directory +############################################ +export gfs_ver=${gfs_ver:-v16.3.0} +export HOMEgfs=${HOMEgfs:-${NWROOT}/gfs.${gfs_ver}} +export EXECgfs=$HOMEgfs/exec +export FIXgfs=$HOMEgfs/fix/wafs +export PARMgfs=$HOMEgfs/parm/wafs +export USHgfs=$HOMEgfs/ush +export SCRIPTSgfs=$HOMEgfs/scripts + +################################################ +# Set up the input/output directory +################################################ +#### if [ $envir = "prod" ] || [ $envir = "para" ] ; then +#### export COMIN=${COMIN:-$COMROOT/${NET}/${envir}/$RUN.$PDY} +#### else +#### export COMIN=${COMIN:-$COMROOT/${NET}/prod/$RUN.$PDY} +#### fi + +export COMIN=${COMIN:-$(compath.py ${envir}/${NET}/${gfs_ver})/${RUN}.${PDY}/${cyc}/$COMPONENT} +export COMOUT=${COMOUT:-$(compath.py -o $NET/$gfs_ver)/$RUN.$PDY/$cyc/$COMPONENT} +export PCOM=${PCOM:-$COMOUT/wmo} + +if [ $SENDCOM = YES ] ; then + mkdir -p $COMOUT $PCOM +fi + +############################################ +# print current environment +############################################ +env + +############################################## +# Set up the forecast hours +############################################## +export FHOURS=${FHOURS:-"00 06 09 12 15 18 21 24 27 30 33 36 42 48 54 60 66 72"} + +############################################ +# Execute the script. 
+############################################ + +NP=`echo $FHOURS | wc -w` +export MPIRUN=${MPIRUN:-"mpiexec -np $NP -cpu-bind verbose,core cfp"} + +rm wafsgrib2.cmdfile +ic=0 +for fcsthrs in $FHOURS ; do + if [ `echo $MPIRUN | cut -d " " -f1` = 'srun' ] ; then + echo $ic ${SCRIPTSgfs}/exgfs_atmos_wafs_grib2.sh $fcsthrs >> wafsgrib2.cmdfile + else + echo ${SCRIPTSgfs}/exgfs_atmos_wafs_grib2.sh $fcsthrs >> wafsgrib2.cmdfile + export MP_PGMMODEL=mpmd + fi + ic=`expr $ic + 1` +done + +$MPIRUN wafsgrib2.cmdfile + +export err=$?; err_chk + +echo "JOB $job HAS COMPLETED NORMALLY!" + +############################################ +# print exec output +############################################ +if [ -e "$pgmout" ] ; then + cat $pgmout +fi + +############################################ +# remove temporary working directory +############################################ +if [ $KEEPDATA != YES ] ; then + rm -rf $DATA +fi + +date + diff --git a/jobs/JGFS_ATMOS_WAFS_GRIB2_0P25 b/jobs/JGFS_ATMOS_WAFS_GRIB2_0P25 new file mode 100755 index 0000000000..64570bbf5d --- /dev/null +++ b/jobs/JGFS_ATMOS_WAFS_GRIB2_0P25 @@ -0,0 +1,133 @@ +#!/bin/sh + +######################################## +# GFS WAFS GRIB 0P25 PRODUCT GENERATION +######################################## + +date +export PS4='$SECONDS + ' +set -x + +# keep the working directory or not +export KEEPDATA=${KEEPDATA:-NO} + +############################################ +# Working Directory +############################################ +export DATA=${DATA:-${DATAROOT}/${jobid:?}} +mkdir -p $DATA +cd $DATA + +############################################ +# Output for executables +############################################ +export pgmout=OUTPUT.$$ + +########################################### +# Run setpdy and initialize PDY variables +########################################### +export cycle=t${cyc}z +setpdy.sh +. 
./PDY + +############################################ +# Set up the NET and RUN +############################################ +export NET=${NET:-gfs} +export RUN=${RUN:-gfs} +export COMPONENT=${COMPONENT:-atmos} + +############################################ +# Specify HOME Directory +############################################ +export gfs_ver=${gfs_ver:-v16.3.0} +export HOMEgfs=${HOMEgfs:-${NWROOT}/gfs.${gfs_ver}} +export EXECgfs=$HOMEgfs/exec +export FIXgfs=$HOMEgfs/fix/wafs +export PARMgfs=$HOMEgfs/parm/wafs +export USHgfs=$HOMEgfs/ush +export SCRIPTSgfs=$HOMEgfs/scripts + +################################################ +# Set up the input/output directory +################################################ +export COMIN=${COMIN:-$(compath.py ${envir}/${NET}/${gfs_ver})/${RUN}.${PDY}/${cyc}/$COMPONENT} +export COMOUT=${COMOUT:-$(compath.py -o $NET/$gfs_ver)/$RUN.$PDY/$cyc/$COMPONENT} +export PCOM=${PCOM:-$COMOUT/wmo} + +if [ $SENDCOM = YES ] ; then + mkdir -p $COMOUT $PCOM +fi + +############################################ +# print current environment +############################################ +env + +############################################## +# Set up the forecast hours +############################################## +#export SHOUR=${SHOUR:-06} +# Will change to 120 for 2023 ICAO standard +#export EHOUR=${EHOUR:-120} +#export EHOUR=${EHOUR:-36} + +export FHOUT_GFS=${FHOUT_GFS:-1} +if [ $FHOUT_GFS -eq 3 ] ; then #27 + export FHOURS=${FHOURS:-"6 9 12 15 18 21 24 27 30 33 36 39 42 45 48 54 60 66 72 78 84 90 96 102 108 114 120"} +else #39 + export FHOURS=${FHOURS:-"6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 27 30 33 36 39 42 45 48 54 60 66 72 78 84 90 96 102 108 114 120"} +fi + +############################################### +# Specify Timeout Behavior of WAFS blending +# +# SLEEP_TIME - Amount of time to wait for +# an input file before exiting +# SLEEP_INT - Amount of time to wait between +# checking for input files 
+############################################### +# export SLEEP_TIME=300 # changed to 60 to avoid hitting wall_clock when missing ukmet wafs files ... +export SLEEP_TIME=600 +export SLEEP_INT=10 + +############################################ +# Execute the script. +############################################ +NP=`echo $FHOURS | wc -w` +export MPIRUN=${MPIRUN:-"mpiexec -np $NP -cpu-bind verbose,core cfp"} + +rm wafsgrib2_0p25.cmdfile +ic=0 +for fcsthrs in $FHOURS ; do + if [ `echo $MPIRUN | cut -d " " -f1` = 'srun' ] ; then + echo $ic ${SCRIPTSgfs}/exgfs_atmos_wafs_grib2_0p25.sh $fcsthrs >> wafsgrib2_0p25.cmdfile + else + echo ${SCRIPTSgfs}/exgfs_atmos_wafs_grib2_0p25.sh $fcsthrs >> wafsgrib2_0p25.cmdfile + export MP_PGMMODEL=mpmd + fi + ic=`expr $ic + 1` +done + +$MPIRUN wafsgrib2_0p25.cmdfile + +export err=$?; err_chk + +echo "JOB $job HAS COMPLETED NORMALLY!" + +############################################ +# print exec output +############################################ +if [ -e "$pgmout" ] ; then + cat $pgmout +fi + +############################################ +# remove temporary working directory +############################################ +if [ $KEEPDATA != YES ] ; then + rm -rf $DATA +fi + +date + + diff --git a/jobs/JGLOBAL_AERO_ANALYSIS_FINALIZE b/jobs/JGLOBAL_AERO_ANALYSIS_FINALIZE index ff8e2e9569..aaf5792bc2 100755 --- a/jobs/JGLOBAL_AERO_ANALYSIS_FINALIZE +++ b/jobs/JGLOBAL_AERO_ANALYSIS_FINALIZE @@ -31,7 +31,7 @@ mkdir -m 775 -p "${COM_CHEM_ANALYSIS}" ############################################################### # Run relevant script -EXSCRIPT=${GDASAEROFINALPY:-${HOMEgfs}/scripts/exglobal_aero_analysis_finalize.py} +EXSCRIPT=${GDASAEROFINALPY:-${SCRgfs}/exglobal_aero_analysis_finalize.py} ${EXSCRIPT} status=$? 
[[ ${status} -ne 0 ]] && exit "${status}" diff --git a/jobs/JGLOBAL_AERO_ANALYSIS_INITIALIZE b/jobs/JGLOBAL_AERO_ANALYSIS_INITIALIZE index 79320b77ee..61a99e3137 100755 --- a/jobs/JGLOBAL_AERO_ANALYSIS_INITIALIZE +++ b/jobs/JGLOBAL_AERO_ANALYSIS_INITIALIZE @@ -30,7 +30,7 @@ mkdir -m 775 -p "${COM_CHEM_ANALYSIS}" ############################################################### # Run relevant script -EXSCRIPT=${GDASAEROINITPY:-${HOMEgfs}/scripts/exglobal_aero_analysis_initialize.py} +EXSCRIPT=${GDASAEROINITPY:-${SCRgfs}/exglobal_aero_analysis_initialize.py} ${EXSCRIPT} status=$? [[ ${status} -ne 0 ]] && exit "${status}" diff --git a/jobs/JGLOBAL_AERO_ANALYSIS_RUN b/jobs/JGLOBAL_AERO_ANALYSIS_RUN index 853909dc03..43749b78c5 100755 --- a/jobs/JGLOBAL_AERO_ANALYSIS_RUN +++ b/jobs/JGLOBAL_AERO_ANALYSIS_RUN @@ -16,7 +16,7 @@ source "${HOMEgfs}/ush/jjob_header.sh" -e "aeroanlrun" -c "base aeroanl aeroanlr ############################################################### # Run relevant script -EXSCRIPT=${GDASAERORUNSH:-${HOMEgfs}/scripts/exglobal_aero_analysis_run.py} +EXSCRIPT=${GDASAERORUNSH:-${SCRgfs}/exglobal_aero_analysis_run.py} ${EXSCRIPT} status=$? 
[[ ${status} -ne 0 ]] && exit "${status}" diff --git a/jobs/JGLOBAL_ARCHIVE b/jobs/JGLOBAL_ARCHIVE index e6c016e703..0925c55c8c 100755 --- a/jobs/JGLOBAL_ARCHIVE +++ b/jobs/JGLOBAL_ARCHIVE @@ -14,16 +14,16 @@ YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_ANALYSIS COM_ATMOS_BUFR COM_ATMO COM_ATMOS_TRACK COM_ATMOS_WMO \ COM_CHEM_HISTORY COM_CHEM_ANALYSIS\ COM_MED_RESTART \ - COM_ICE_HISTORY COM_ICE_INPUT COM_ICE_RESTART \ + COM_SNOW_ANALYSIS \ + COM_ICE_HISTORY COM_ICE_INPUT COM_ICE_RESTART COM_ICE_GRIB \ COM_OBS COM_TOP \ - COM_OCEAN_HISTORY COM_OCEAN_INPUT COM_OCEAN_RESTART COM_OCEAN_XSECT COM_OCEAN_2D COM_OCEAN_3D \ + COM_OCEAN_HISTORY COM_OCEAN_INPUT COM_OCEAN_RESTART COM_OCEAN_GRIB COM_OCEAN_NETCDF \ COM_OCEAN_ANALYSIS \ COM_WAVE_GRID COM_WAVE_HISTORY COM_WAVE_STATION \ - COM_ATMOS_OZNMON COM_ATMOS_RADMON COM_ATMOS_MINMON + COM_ATMOS_OZNMON COM_ATMOS_RADMON COM_ATMOS_MINMON COM_CONF for grid in "0p25" "0p50" "1p00"; do YMD=${PDY} HH=${cyc} GRID=${grid} generate_com -rx "COM_ATMOS_GRIB_${grid}:COM_ATMOS_GRIB_GRID_TMPL" - YMD=${PDY} HH=${cyc} GRID=${grid} generate_com -rx "COM_OCEAN_GRIB_${grid}:COM_OCEAN_GRIB_GRID_TMPL" done ############################################################### diff --git a/jobs/JGLOBAL_ATMENS_ANALYSIS_FINALIZE b/jobs/JGLOBAL_ATMENS_ANALYSIS_FINALIZE index 37a49e0ae0..5411b2dd8b 100755 --- a/jobs/JGLOBAL_ATMENS_ANALYSIS_FINALIZE +++ b/jobs/JGLOBAL_ATMENS_ANALYSIS_FINALIZE @@ -23,7 +23,7 @@ mkdir -m 755 -p "${COM_ATMOS_ANALYSIS_ENS}" ############################################################### # Run relevant script -EXSCRIPT=${GDASATMENSFINALPY:-${HOMEgfs}/scripts/exglobal_atmens_analysis_finalize.py} +EXSCRIPT=${GDASATMENSFINALPY:-${SCRgfs}/exglobal_atmens_analysis_finalize.py} ${EXSCRIPT} status=$? 
[[ ${status} -ne 0 ]] && exit "${status}" diff --git a/jobs/JGLOBAL_ATMENS_ANALYSIS_INITIALIZE b/jobs/JGLOBAL_ATMENS_ANALYSIS_INITIALIZE index c50214aad1..69a1239e61 100755 --- a/jobs/JGLOBAL_ATMENS_ANALYSIS_INITIALIZE +++ b/jobs/JGLOBAL_ATMENS_ANALYSIS_INITIALIZE @@ -25,7 +25,7 @@ RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \ ############################################################### # Run relevant script -EXSCRIPT=${GDASATMENSINITPY:-${HOMEgfs}/scripts/exglobal_atmens_analysis_initialize.py} +EXSCRIPT=${GDASATMENSINITPY:-${SCRgfs}/exglobal_atmens_analysis_initialize.py} ${EXSCRIPT} status=$? [[ ${status} -ne 0 ]] && exit "${status}" diff --git a/jobs/JGLOBAL_ATMENS_ANALYSIS_RUN b/jobs/JGLOBAL_ATMENS_ANALYSIS_RUN index 0d10c76b05..65eeb5e0d8 100755 --- a/jobs/JGLOBAL_ATMENS_ANALYSIS_RUN +++ b/jobs/JGLOBAL_ATMENS_ANALYSIS_RUN @@ -16,7 +16,7 @@ source "${HOMEgfs}/ush/jjob_header.sh" -e "atmensanlrun" -c "base atmensanl atme ############################################################### # Run relevant script -EXSCRIPT=${GDASATMENSRUNSH:-${HOMEgfs}/scripts/exglobal_atmens_analysis_run.py} +EXSCRIPT=${GDASATMENSRUNSH:-${SCRgfs}/exglobal_atmens_analysis_run.py} ${EXSCRIPT} status=$? [[ ${status} -ne 0 ]] && exit "${status}" diff --git a/jobs/JGLOBAL_ATMOS_ANALYSIS b/jobs/JGLOBAL_ATMOS_ANALYSIS index 9e5850bfc3..3d7a4278a2 100755 --- a/jobs/JGLOBAL_ATMOS_ANALYSIS +++ b/jobs/JGLOBAL_ATMOS_ANALYSIS @@ -79,7 +79,7 @@ export PREPQCPF="${COM_OBS}/${OPREFIX}prepbufr.acft_profiles" # Copy fix file for obsproc # TODO: Why is this necessary? 
if [[ ${RUN} = "gfs" ]]; then mkdir -p ${ROTDIR}/fix - cp ${FIXgsi}/prepobs_errtable.global ${ROTDIR}/fix/ + cp ${FIXgfs}/gsi/prepobs_errtable.global ${ROTDIR}/fix/ fi diff --git a/jobs/JGLOBAL_ATMOS_POST_MANAGER b/jobs/JGLOBAL_ATMOS_POST_MANAGER index 7c726bc2ad..d836de5d05 100755 --- a/jobs/JGLOBAL_ATMOS_POST_MANAGER +++ b/jobs/JGLOBAL_ATMOS_POST_MANAGER @@ -12,15 +12,6 @@ source "${HOMEgfs}/ush/jjob_header.sh" -e "post" -c "base post" export NET=${NET:-gfs} export RUN=${RUN:-gfs} -#################################### -# Specify Execution Areas -#################################### -export HOMEgfs=${HOMEgfs:-${PACKAGEROOT}/gfs.${gfs_ver}} -export EXECgfs=${HOMEgfs:-${HOMEgfs}/exec} -export FIXgfs=${HOMEgfs:-${HOMEgfs}/fix} -export PARMgfs=${HOMEgfs:-${HOMEgfs}/parm} -export USHgfs=${HOMEgfs:-${HOMEgfs}/ush} - ########################### # Set up EXT variable ########################### @@ -30,6 +21,6 @@ YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_HISTORY ######################################################## # Execute the script. -${HOMEgfs}/scripts/exglobal_atmos_pmgr.sh +${SCRgfs}/exglobal_atmos_pmgr.sh ######################################################## diff --git a/jobs/JGLOBAL_ATMOS_PRODUCTS b/jobs/JGLOBAL_ATMOS_PRODUCTS index 24e7edacdd..6fa00618d6 100755 --- a/jobs/JGLOBAL_ATMOS_PRODUCTS +++ b/jobs/JGLOBAL_ATMOS_PRODUCTS @@ -22,7 +22,7 @@ export PREFIX="${RUN}.t${cyc}z." ############################################################### # Run exglobal script -"${HOMEgfs}/scripts/exglobal_atmos_products.sh" +"${SCRgfs}/exglobal_atmos_products.sh" status=$? (( status != 0 )) && exit "${status}" diff --git a/jobs/JGLOBAL_ATMOS_SFCANL b/jobs/JGLOBAL_ATMOS_SFCANL index 0d709e56dd..3d897db4c3 100755 --- a/jobs/JGLOBAL_ATMOS_SFCANL +++ b/jobs/JGLOBAL_ATMOS_SFCANL @@ -26,7 +26,7 @@ export GPREFIX="${GDUMP}.t${gcyc}z." export APREFIX="${CDUMP}.t${cyc}z." 
YMD=${PDY} HH=${cyc} generate_com -rx COM_OBS COM_ATMOS_ANALYSIS COM_ATMOS_RESTART \ - COM_LAND_ANALYSIS + COM_SNOW_ANALYSIS RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \ COM_OBS_PREV:COM_OBS_TMPL \ diff --git a/jobs/JGLOBAL_ATMOS_UPP b/jobs/JGLOBAL_ATMOS_UPP index 9364f33225..1a1ecbc2a1 100755 --- a/jobs/JGLOBAL_ATMOS_UPP +++ b/jobs/JGLOBAL_ATMOS_UPP @@ -19,7 +19,7 @@ if [[ ! -d ${COM_ATMOS_MASTER} ]]; then mkdir -m 775 -p "${COM_ATMOS_MASTER}"; f ############################################################### # Run relevant exglobal script -"${HOMEgfs}/scripts/exglobal_atmos_upp.py" +"${SCRgfs}/exglobal_atmos_upp.py" status=$? (( status != 0 )) && exit "${status}" diff --git a/jobs/JGLOBAL_ATM_ANALYSIS_FINALIZE b/jobs/JGLOBAL_ATM_ANALYSIS_FINALIZE index 52a782d7c4..087eab604a 100755 --- a/jobs/JGLOBAL_ATM_ANALYSIS_FINALIZE +++ b/jobs/JGLOBAL_ATM_ANALYSIS_FINALIZE @@ -33,7 +33,7 @@ mkdir -m 775 -p "${COM_ATMOS_ANALYSIS}" ############################################################### # Run relevant script -EXSCRIPT=${GDASATMFINALPY:-${HOMEgfs}/scripts/exglobal_atm_analysis_finalize.py} +EXSCRIPT=${GDASATMFINALPY:-${SCRgfs}/exglobal_atm_analysis_finalize.py} ${EXSCRIPT} status=$? [[ ${status} -ne 0 ]] && exit "${status}" diff --git a/jobs/JGLOBAL_ATM_ANALYSIS_INITIALIZE b/jobs/JGLOBAL_ATM_ANALYSIS_INITIALIZE index 4ef5e6392d..b7e2eeacc6 100755 --- a/jobs/JGLOBAL_ATM_ANALYSIS_INITIALIZE +++ b/jobs/JGLOBAL_ATM_ANALYSIS_INITIALIZE @@ -36,7 +36,7 @@ mkdir -m 775 -p "${COM_ATMOS_ANALYSIS}" ############################################################### # Run relevant script -EXSCRIPT=${GDASATMINITPY:-${HOMEgfs}/scripts/exglobal_atm_analysis_initialize.py} +EXSCRIPT=${GDASATMINITPY:-${SCRgfs}/exglobal_atm_analysis_initialize.py} ${EXSCRIPT} status=$? 
[[ ${status} -ne 0 ]] && exit "${status}" diff --git a/jobs/JGLOBAL_ATM_ANALYSIS_RUN b/jobs/JGLOBAL_ATM_ANALYSIS_RUN index bbfdbe4a1f..2105d719de 100755 --- a/jobs/JGLOBAL_ATM_ANALYSIS_RUN +++ b/jobs/JGLOBAL_ATM_ANALYSIS_RUN @@ -18,7 +18,7 @@ source "${HOMEgfs}/ush/jjob_header.sh" -e "atmanlrun" -c "base atmanl atmanlrun" ############################################################### # Run relevant script -EXSCRIPT=${GDASATMRUNSH:-${HOMEgfs}/scripts/exglobal_atm_analysis_run.py} +EXSCRIPT=${GDASATMRUNSH:-${SCRgfs}/exglobal_atm_analysis_run.py} ${EXSCRIPT} status=$? [[ ${status} -ne 0 ]] && exit "${status}" diff --git a/jobs/JGLOBAL_ATM_PREP_IODA_OBS b/jobs/JGLOBAL_ATM_PREP_IODA_OBS index ef0e682468..a3b23e859f 100755 --- a/jobs/JGLOBAL_ATM_PREP_IODA_OBS +++ b/jobs/JGLOBAL_ATM_PREP_IODA_OBS @@ -17,8 +17,8 @@ YMD=${PDY} HH=${cyc} generate_com -rx COM_OBS ############################################################### # Run relevant script -EXSCRIPT=${BUFR2IODASH:-${HOMEgfs}/ush/run_bufr2ioda.py} -${EXSCRIPT} "${PDY}${cyc}" "${RUN}" "${DMPDIR}" "${IODAPARM}" "${COM_OBS}/" +EXSCRIPT=${BUFR2IODASH:-${USHgfs}/run_bufr2ioda.py} +${EXSCRIPT} "${PDY}${cyc}" "${RUN}" "${DMPDIR}" "${PARMgfs}/gdas/ioda/bufr2ioda" "${COM_OBS}/" status=$? [[ ${status} -ne 0 ]] && (echo "FATAL ERROR: Error executing ${EXSCRIPT}, ABORT!"; exit "${status}") diff --git a/jobs/JGLOBAL_CLEANUP b/jobs/JGLOBAL_CLEANUP index ad938ccf60..2f81003989 100755 --- a/jobs/JGLOBAL_CLEANUP +++ b/jobs/JGLOBAL_CLEANUP @@ -3,7 +3,7 @@ source "${HOMEgfs}/ush/preamble.sh" source "${HOMEgfs}/ush/jjob_header.sh" -e "cleanup" -c "base cleanup" -"${HOMEgfs}/scripts/exglobal_cleanup.sh" +"${SCRgfs}/exglobal_cleanup.sh" status=$? [[ ${status} -ne 0 ]] && exit "${status}" diff --git a/jobs/JGLOBAL_FORECAST b/jobs/JGLOBAL_FORECAST index b2825af54f..d99712d421 100755 --- a/jobs/JGLOBAL_FORECAST +++ b/jobs/JGLOBAL_FORECAST @@ -1,40 +1,19 @@ #! 
/usr/bin/env bash source "${HOMEgfs}/ush/preamble.sh" -source "${HOMEgfs}/ush/jjob_header.sh" -e "fcst" -c "base fcst" - -############################################## -# Set variables used in the script -############################################## -export CDUMP=${RUN/enkf} +if (( 10#${ENSMEM:-0} > 0 )); then + source "${HOMEgfs}/ush/jjob_header.sh" -e "efcs" -c "base fcst efcs" +else + source "${HOMEgfs}/ush/jjob_header.sh" -e "fcst" -c "base fcst" +fi ############################################## # Begin JOB SPECIFIC work ############################################## # Restart conditions for GFS cycle come from GDAS -rCDUMP=${CDUMP} -[[ ${CDUMP} = "gfs" ]] && export rCDUMP="gdas" - -# Forecast length for GFS forecast -case ${RUN} in - *gfs | *gefs) - # shellcheck disable=SC2153 - export FHMAX=${FHMAX_GFS} - # shellcheck disable=SC2153 - export FHOUT=${FHOUT_GFS} - export FHMAX_HF=${FHMAX_HF_GFS} - export FHOUT_HF=${FHOUT_HF_GFS} - ;; - *gdas) - export FHMAX_HF=0 - export FHOUT_HF=0 - ;; - *) - echo "FATAL ERROR: Unsupported RUN '${RUN}'" - exit 1 -esac - +rCDUMP=${RUN} +export rCDUMP="${RUN/gfs/gdas}" # Ignore possible spelling error (nothing is misspelled) # shellcheck disable=SC2153 @@ -78,11 +57,21 @@ fi ############################################################### # Run relevant exglobal script - +############################################################### ${FORECASTSH:-${SCRgfs}/exglobal_forecast.sh} status=$? -[[ ${status} -ne 0 ]] && exit ${status} - +[[ ${status} -ne 0 ]] && exit "${status}" + +# Send DBN alerts for EnKF +# TODO: Should these be in post manager instead? 
+if [[ "${RUN}" =~ "enkf" ]] && [[ "${SENDDBN}" = YES ]]; then + for (( fhr = FHOUT; fhr <= FHMAX; fhr = fhr + FHOUT )); do + if (( fhr % 3 == 0 )); then + fhr3=$(printf %03i "${fhr}") + "${DBNROOT}/bin/dbn_alert" MODEL GFS_ENKF "${job}" "${COM_ATMOS_HISTORY}/${RUN}.t${cyc}z.sfcf${fhr3}.nc" + fi + done +fi ############################################## # End JOB SPECIFIC work @@ -91,15 +80,14 @@ status=$? ############################################## # Final processing ############################################## -if [ -e "${pgmout}" ] ; then - cat ${pgmout} +if [[ -e "${pgmout}" ]] ; then + cat "${pgmout}" fi ########################################## # Remove the Temporary working directory ########################################## -cd ${DATAROOT} -[[ ${KEEPDATA} = "NO" ]] && rm -rf ${DATA} - +cd "${DATAROOT}" || true +[[ ${KEEPDATA} = "NO" ]] && rm -rf "${DATA}" exit 0 diff --git a/jobs/JGLOBAL_OCEANICE_PRODUCTS b/jobs/JGLOBAL_OCEANICE_PRODUCTS new file mode 100755 index 0000000000..69e5f69448 --- /dev/null +++ b/jobs/JGLOBAL_OCEANICE_PRODUCTS @@ -0,0 +1,40 @@ +#! /usr/bin/env bash + +source "${HOMEgfs}/ush/preamble.sh" +source "${HOMEgfs}/ush/jjob_header.sh" -e "oceanice_products" -c "base oceanice_products" + + +############################################## +# Begin JOB SPECIFIC work +############################################## + +# Construct COM variables from templates +YMD="${PDY}" HH="${cyc}" generate_com -rx "COM_${COMPONENT^^}_HISTORY" +YMD="${PDY}" HH="${cyc}" generate_com -rx "COM_${COMPONENT^^}_GRIB" +YMD="${PDY}" HH="${cyc}" generate_com -rx "COM_${COMPONENT^^}_NETCDF" + +############################################################### +# Run exglobal script +"${SCRgfs}/exglobal_oceanice_products.py" +status=$? 
+(( status != 0 )) && exit "${status}" + +############################################## +# End JOB SPECIFIC work +############################################## + +############################################## +# Final processing +############################################## +if [[ -e "${pgmout}" ]]; then + cat "${pgmout}" +fi + +########################################## +# Remove the Temporary working directory +########################################## +cd "${DATAROOT}" || exit 1 +[[ "${KEEPDATA:-NO}" == "NO" ]] && rm -rf "${DATA}" + + +exit 0 diff --git a/jobs/JGLOBAL_PREP_OCEAN_OBS b/jobs/JGLOBAL_PREP_OCEAN_OBS index a100aca89c..52f202d72a 100755 --- a/jobs/JGLOBAL_PREP_OCEAN_OBS +++ b/jobs/JGLOBAL_PREP_OCEAN_OBS @@ -15,12 +15,12 @@ YMD=${PDY} HH=${cyc} generate_com -rx COMOUT_OBS:COM_OBS_TMPL ############################################## # Add prep_marine_obs.py to PYTHONPATH -export PYTHONPATH=${HOMEgfs}/sorc/gdas.cd/ush/soca:${PYTHONPATH} +export PYTHONPATH=${HOMEgfs}/sorc/gdas.cd/ush:${PYTHONPATH} ############################################################### # Run relevant script -EXSCRIPT=${GDASPREPOCNOBSPY:-${HOMEgfs}/ush/exglobal_prep_ocean_obs.py} +EXSCRIPT=${GDASPREPOCNOBSPY:-${USHgfs}/exglobal_prep_ocean_obs.py} ${EXSCRIPT} status=$? [[ ${status} -ne 0 ]] && exit "${status}" @@ -38,7 +38,7 @@ if [[ -e "${pgmout}" ]] ; then fi ########################################## -# Handle the temporary working directory +# Handle the temporary working directory ########################################## cd "${DATAROOT}" || (echo "FATAL ERROR: ${DATAROOT} does not exist. ABORT!"; exit 1) [[ ${KEEPDATA} = "NO" ]] && rm -rf "${DATA}" diff --git a/jobs/JGLOBAL_PREP_LAND_OBS b/jobs/JGLOBAL_PREP_SNOW_OBS similarity index 86% rename from jobs/JGLOBAL_PREP_LAND_OBS rename to jobs/JGLOBAL_PREP_SNOW_OBS index 025adae529..1fc7e3e5c3 100755 --- a/jobs/JGLOBAL_PREP_LAND_OBS +++ b/jobs/JGLOBAL_PREP_SNOW_OBS @@ -1,7 +1,8 @@ #! 
/usr/bin/env bash source "${HOMEgfs}/ush/preamble.sh" -source "${HOMEgfs}/ush/jjob_header.sh" -e "preplandobs" -c "base preplandobs" +export DATA=${DATA:-${DATAROOT}/${RUN}snowanl_${cyc}} +source "${HOMEgfs}/ush/jjob_header.sh" -e "prepsnowobs" -c "base prepsnowobs" ############################################## # Set variables used in the script @@ -24,7 +25,7 @@ RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \ ############################################################### # Run relevant script -EXSCRIPT=${GDASLANDPREPSH:-${HOMEgfs}/scripts/exglobal_prep_land_obs.py} +EXSCRIPT=${GDASSNOWPREPPY:-${SCRgfs}/exglobal_prep_snow_obs.py} ${EXSCRIPT} status=$? [[ ${status} -ne 0 ]] && (echo "FATAL ERROR: Error executing ${EXSCRIPT}, ABORT!"; exit "${status}") diff --git a/jobs/JGLOBAL_LAND_ANALYSIS b/jobs/JGLOBAL_SNOW_ANALYSIS similarity index 78% rename from jobs/JGLOBAL_LAND_ANALYSIS rename to jobs/JGLOBAL_SNOW_ANALYSIS index 3ff7e72a35..50372d1342 100755 --- a/jobs/JGLOBAL_LAND_ANALYSIS +++ b/jobs/JGLOBAL_SNOW_ANALYSIS @@ -1,7 +1,8 @@ #! 
/usr/bin/env bash source "${HOMEgfs}/ush/preamble.sh" -source "${HOMEgfs}/ush/jjob_header.sh" -e "landanl" -c "base landanl" +export DATA=${DATA:-${DATAROOT}/${RUN}snowanl_${cyc}} +source "${HOMEgfs}/ush/jjob_header.sh" -e "snowanl" -c "base snowanl" ############################################## # Set variables used in the script @@ -17,17 +18,17 @@ GDUMP="gdas" # Begin JOB SPECIFIC work ############################################## # Generate COM variables from templates -YMD=${PDY} HH=${cyc} generate_com -rx COM_OBS COM_LAND_ANALYSIS COM_CONF +YMD=${PDY} HH=${cyc} generate_com -rx COM_OBS COM_SNOW_ANALYSIS COM_CONF RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \ COM_ATMOS_RESTART_PREV:COM_ATMOS_RESTART_TMPL -mkdir -m 775 -p "${COM_LAND_ANALYSIS}" "${COM_CONF}" +mkdir -m 775 -p "${COM_SNOW_ANALYSIS}" "${COM_CONF}" ############################################################### # Run relevant script -EXSCRIPT=${LANDANLPY:-${HOMEgfs}/scripts/exglobal_land_analysis.py} +EXSCRIPT=${SNOWANLPY:-${SCRgfs}/exglobal_snow_analysis.py} ${EXSCRIPT} status=$? [[ ${status} -ne 0 ]] && exit "${status}" diff --git a/jobs/JGLOBAL_STAGE_IC b/jobs/JGLOBAL_STAGE_IC index 4c94990fde..c460e91c9e 100755 --- a/jobs/JGLOBAL_STAGE_IC +++ b/jobs/JGLOBAL_STAGE_IC @@ -10,7 +10,7 @@ rCDUMP=${CDUMP} export rCDUMP # Execute the Script -"${HOMEgfs}/scripts/exglobal_stage_ic.sh" +"${SCRgfs}/exglobal_stage_ic.sh" ########################################## # Remove the Temporary working directory diff --git a/jobs/JGLOBAL_WAVE_GEMPAK b/jobs/JGLOBAL_WAVE_GEMPAK index 89c389fa11..f02bf3fce9 100755 --- a/jobs/JGLOBAL_WAVE_GEMPAK +++ b/jobs/JGLOBAL_WAVE_GEMPAK @@ -19,7 +19,7 @@ if [[ ! -d ${COM_WAVE_GEMPAK} ]]; then mkdir -p "${COM_WAVE_GEMPAK}"; fi ######################################################## # Execute the script. -${HOMEgfs}/scripts/exgfs_wave_nawips.sh +${SCRgfs}/exgfs_wave_nawips.sh status=$? 
[[ ${status} -ne 0 ]] && exit ${status} ################################### diff --git a/jobs/JGLOBAL_WAVE_INIT b/jobs/JGLOBAL_WAVE_INIT index 7ad742f25a..02d4009c65 100755 --- a/jobs/JGLOBAL_WAVE_INIT +++ b/jobs/JGLOBAL_WAVE_INIT @@ -9,12 +9,6 @@ export errchk=${errchk:-err_chk} export MP_PULSE=0 -# Path to HOME Directory -export FIXwave=${FIXwave:-${HOMEgfs}/fix/wave} -export PARMwave=${PARMwave:-${HOMEgfs}/parm/wave} -export USHwave=${USHwave:-${HOMEgfs}/ush} -export EXECwave=${EXECwave:-${HOMEgfs}/exec} - # Set COM Paths YMD=${PDY} HH=${cyc} generate_com -rx COM_WAVE_PREP @@ -25,7 +19,7 @@ export wavempexec=${wavempexec:-"mpirun -n"} export wave_mpmd=${wave_mpmd:-"cfp"} # Execute the Script -${HOMEgfs}/scripts/exgfs_wave_init.sh +${SCRgfs}/exgfs_wave_init.sh ########################################## # Remove the Temporary working directory diff --git a/jobs/JGLOBAL_WAVE_POST_BNDPNT b/jobs/JGLOBAL_WAVE_POST_BNDPNT index 9d404077fd..cdf2ad3728 100755 --- a/jobs/JGLOBAL_WAVE_POST_BNDPNT +++ b/jobs/JGLOBAL_WAVE_POST_BNDPNT @@ -8,12 +8,6 @@ export errchk=${errchk:-err_chk} export MP_PULSE=0 -# Path to HOME Directory -export FIXwave=${FIXwave:-${HOMEgfs}/fix/wave} -export PARMwave=${PARMwave:-${HOMEgfs}/parm/wave} -export USHwave=${USHwave:-${HOMEgfs}/ush} -export EXECwave=${EXECwave:-${HOMEgfs}/exec} - # Set COM Paths and GETGES environment YMD=${PDY} HH=${cyc} generate_com -rx COM_WAVE_PREP COM_WAVE_HISTORY COM_WAVE_STATION @@ -34,7 +28,7 @@ export DOBLL_WAV='NO' # Bulletin post export DOBNDPNT_WAV='YES' # Do boundary points # Execute the Script -${HOMEgfs}/scripts/exgfs_wave_post_pnt.sh +${SCRgfs}/exgfs_wave_post_pnt.sh err=$? if [ ${err} -ne 0 ]; then echo "FATAL ERROR: ex-script of GWES_POST failed!" 
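Editor's note: most hunks above make the same substitution — ex-script calls move from ${HOMEgfs}/scripts/... to ${SCRgfs}/..., and the per-job FIXwave/PARMwave/USHwave/EXECwave defaults are deleted. A minimal sketch of the directory convention implied by the diff follows; the derivation from HOMEgfs shown here is an assumption for illustration (the real definitions live in the workflow's sourced setup files), and the root path is hypothetical.

```shell
#! /usr/bin/env bash
# Sketch (assumed layout): derive the standard subdirectory variables from
# HOMEgfs once, e.g. in a sourced preamble, so individual j-jobs no longer
# need to redefine their own FIXwave/USHwave/... defaults.
HOMEgfs=${HOMEgfs:-/opt/global-workflow}   # hypothetical root, for illustration

export SCRgfs="${HOMEgfs}/scripts"   # ex-scripts, e.g. ${SCRgfs}/exgfs_wave_post_pnt.sh
export USHgfs="${HOMEgfs}/ush"       # utility scripts
export EXECgfs="${HOMEgfs}/exec"     # compiled executables
export FIXgfs="${HOMEgfs}/fix"       # fixed input files
export PARMgfs="${HOMEgfs}/parm"     # parameter/namelist files
```

With this convention, a rename of a fix or parm subtree is a one-line change in the setup file rather than an edit to every j-job.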
diff --git a/jobs/JGLOBAL_WAVE_POST_BNDPNTBLL b/jobs/JGLOBAL_WAVE_POST_BNDPNTBLL index 3de49fcc3b..c091d67aca 100755 --- a/jobs/JGLOBAL_WAVE_POST_BNDPNTBLL +++ b/jobs/JGLOBAL_WAVE_POST_BNDPNTBLL @@ -12,12 +12,6 @@ export CDATE=${PDY}${cyc} export MP_PULSE=0 -# Path to HOME Directory -export FIXwave=${FIXwave:-${HOMEgfs}/fix/wave} -export PARMwave=${PARMwave:-${HOMEgfs}/parm/wave} -export USHwave=${USHwave:-${HOMEgfs}/ush} -export EXECwave=${EXECwave:-${HOMEgfs}/exec} - # Set COM Paths and GETGES environment YMD=${PDY} HH=${cyc} generate_com -rx COM_WAVE_PREP COM_WAVE_HISTORY COM_WAVE_STATION @@ -38,7 +32,7 @@ export DOBLL_WAV='YES' # Bulletin post export DOBNDPNT_WAV='YES' #boundary points # Execute the Script -${HOMEgfs}/scripts/exgfs_wave_post_pnt.sh +${SCRgfs}/exgfs_wave_post_pnt.sh err=$? if [ ${err} -ne 0 ]; then echo "FATAL ERROR: ex-script of GFS_WAVE_POST_PNT failed!" diff --git a/jobs/JGLOBAL_WAVE_POST_PNT b/jobs/JGLOBAL_WAVE_POST_PNT index 1b573435a3..800a58fbd8 100755 --- a/jobs/JGLOBAL_WAVE_POST_PNT +++ b/jobs/JGLOBAL_WAVE_POST_PNT @@ -8,12 +8,6 @@ export errchk=${errchk:-err_chk} export MP_PULSE=0 -# Path to HOME Directory -export FIXwave=${FIXwave:-${HOMEgfs}/fix/wave} -export PARMwave=${PARMwave:-${HOMEgfs}/parm/wave} -export USHwave=${USHwave:-${HOMEgfs}/ush} -export EXECwave=${EXECwave:-${HOMEgfs}/exec} - # Set COM Paths and GETGES environment YMD=${PDY} HH=${cyc} generate_com -rx COM_WAVE_PREP COM_WAVE_HISTORY COM_WAVE_STATION @@ -35,7 +29,7 @@ export DOBNDPNT_WAV='NO' #not boundary points # Execute the Script -${HOMEgfs}/scripts/exgfs_wave_post_pnt.sh +${SCRgfs}/exgfs_wave_post_pnt.sh err=$? if [ ${err} -ne 0 ]; then echo "FATAL ERROR: ex-script of GWES_POST failed!" 
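Editor's note: nearly every job above repeats the same run-and-check idiom — invoke the ex-script, capture $? into status, and exit non-zero on failure. A hedged sketch of that idiom factored into a helper; run_exscript is a hypothetical name, not part of the workflow, and the real scripts simply inline the check after each ${SCRgfs}/... call.

```shell
#! /usr/bin/env bash
# Hypothetical helper illustrating the status-check idiom repeated in these
# j-jobs: run a script, and on failure print a FATAL ERROR line and
# propagate the non-zero exit code to the caller.
run_exscript() {
    local script=$1
    shift
    "${script}" "$@"
    local status=$?
    if (( status != 0 )); then
        echo "FATAL ERROR: ${script} failed with status ${status}"
        return "${status}"
    fi
    return 0
}

# Usage sketch with stand-in commands instead of real ex-scripts:
run_exscript true && echo "ok"               # prints "ok"
run_exscript false || echo "caught failure"  # prints the FATAL ERROR line, then "caught failure"
```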
diff --git a/jobs/JGLOBAL_WAVE_POST_SBS b/jobs/JGLOBAL_WAVE_POST_SBS index 231b793de7..662b6e4395 100755 --- a/jobs/JGLOBAL_WAVE_POST_SBS +++ b/jobs/JGLOBAL_WAVE_POST_SBS @@ -8,12 +8,6 @@ export errchk=${errchk:-err_chk} export MP_PULSE=0 -# Path to HOME Directory -export FIXwave=${FIXwave:-${HOMEgfs}/fix/wave} -export PARMwave=${PARMwave:-${HOMEgfs}/parm/wave} -export USHwave=${USHwave:-${HOMEgfs}/ush} -export EXECwave=${EXECwave:-${HOMEgfs}/exec} - # Set COM Paths and GETGES environment YMD=${PDY} HH=${cyc} generate_com -rx COM_WAVE_PREP COM_WAVE_HISTORY COM_WAVE_GRID @@ -32,7 +26,7 @@ export WAV_MOD_TAG=${RUN}wave${waveMEMB} export CFP_VERBOSE=1 # Execute the Script -${HOMEgfs}/scripts/exgfs_wave_post_gridded_sbs.sh +${SCRgfs}/exgfs_wave_post_gridded_sbs.sh err=$? if [ ${err} -ne 0 ]; then echo "FATAL ERROR: ex-script of GWES_POST failed!" diff --git a/jobs/JGLOBAL_WAVE_PRDGEN_BULLS b/jobs/JGLOBAL_WAVE_PRDGEN_BULLS index 3a2947af56..5e3756e276 100755 --- a/jobs/JGLOBAL_WAVE_PRDGEN_BULLS +++ b/jobs/JGLOBAL_WAVE_PRDGEN_BULLS @@ -20,7 +20,7 @@ if [[ ! -d ${COM_WAVE_WMO} ]]; then mkdir -p "${COM_WAVE_WMO}"; fi ################################### # Execute the Script -${HOMEgfs}/scripts/exgfs_wave_prdgen_bulls.sh +${SCRgfs}/exgfs_wave_prdgen_bulls.sh status=$? [[ ${status} -ne 0 ]] && exit ${status} diff --git a/jobs/JGLOBAL_WAVE_PRDGEN_GRIDDED b/jobs/JGLOBAL_WAVE_PRDGEN_GRIDDED index 4b32c709bf..d2cfd363d2 100755 --- a/jobs/JGLOBAL_WAVE_PRDGEN_GRIDDED +++ b/jobs/JGLOBAL_WAVE_PRDGEN_GRIDDED @@ -22,7 +22,7 @@ mkdir -p "${COM_WAVE_WMO}" ################################### # Execute the Script ################################### -${HOMEgfs}/scripts/exgfs_wave_prdgen_gridded.sh +${SCRgfs}/exgfs_wave_prdgen_gridded.sh status=$? 
[[ ${status} -ne 0 ]] && exit ${status} diff --git a/jobs/JGLOBAL_WAVE_PREP b/jobs/JGLOBAL_WAVE_PREP index f246045f53..9fbc4b601b 100755 --- a/jobs/JGLOBAL_WAVE_PREP +++ b/jobs/JGLOBAL_WAVE_PREP @@ -16,19 +16,13 @@ export MP_PULSE=0 # CDO required for processing RTOFS currents export CDO=${CDO_ROOT}/bin/cdo -# Path to HOME Directory -export FIXwave=${FIXwave:-${HOMEgfs}/fix/wave} -export PARMwave=${PARMwave:-${HOMEgfs}/parm/wave} -export USHwave=${USHwave:-${HOMEgfs}/ush} -export EXECwave=${EXECwave:-${HOMEgfs}/exec} - # Set COM Paths and GETGES environment YMD=${PDY} HH=${cyc} generate_com -rx COM_OBS COM_WAVE_PREP generate_com -rx COM_RTOFS [[ ! -d ${COM_WAVE_PREP} ]] && mkdir -m 775 -p "${COM_WAVE_PREP}" # Execute the Script -${HOMEgfs}/scripts/exgfs_wave_prep.sh +${SCRgfs}/exgfs_wave_prep.sh ########################################## # Remove the Temporary working directory diff --git a/jobs/rocoto/awips_20km_1p0deg.sh b/jobs/rocoto/awips_20km_1p0deg.sh index e1bf623883..b2a291e37e 100755 --- a/jobs/rocoto/awips_20km_1p0deg.sh +++ b/jobs/rocoto/awips_20km_1p0deg.sh @@ -45,7 +45,7 @@ for fhr3 in ${fhrlst}; do if (( fhr >= fhmin && fhr <= fhmax )); then if ((fhr % 3 == 0)); then export fcsthrs="${fhr3}" - "${AWIPS20KM1P0DEGSH}" + "${HOMEgfs}/jobs/JGFS_ATMOS_AWIPS_20KM_1P0DEG" fi fi @@ -54,7 +54,7 @@ for fhr3 in ${fhrlst}; do if (( fhr >= fhmin && fhr <= fhmax )); then if ((fhr % 6 == 0)); then export fcsthrs="${fhr3}" - "${AWIPS20KM1P0DEGSH}" + "${HOMEgfs}/jobs/JGFS_ATMOS_AWIPS_20KM_1P0DEG" fi fi done diff --git a/jobs/rocoto/awips_g2.sh b/jobs/rocoto/awips_g2.sh index 121c96d63f..6bb004048d 100755 --- a/jobs/rocoto/awips_g2.sh +++ b/jobs/rocoto/awips_g2.sh @@ -44,7 +44,8 @@ for fhr3 in ${fhrlst}; do fhmax=240 if (( fhr >= fhmin && fhr <= fhmax )); then if ((fhr % 6 == 0)); then - "${AWIPSG2SH}" + export fcsthrs="${fhr3}" + "${HOMEgfs}/jobs/JGFS_ATMOS_AWIPS_G2" fi fi done diff --git a/jobs/rocoto/efcs.sh b/jobs/rocoto/efcs.sh deleted file mode 100755 index 
c5667cb970..0000000000 --- a/jobs/rocoto/efcs.sh +++ /dev/null @@ -1,25 +0,0 @@ -#! /usr/bin/env bash - -source "${HOMEgfs}/ush/preamble.sh" - -############################################################### -# Source FV3GFS workflow modules -# TODO clean this up once ncdiag/1.1.2 is installed on WCOSS2 -source "${HOMEgfs}/ush/detect_machine.sh" -if [[ "${MACHINE_ID}" == "wcoss2" ]]; then - . ${HOMEgfs}/ush/load_ufswm_modules.sh -else - . ${HOMEgfs}/ush/load_fv3gfs_modules.sh -fi -status=$? -[[ ${status} -ne 0 ]] && exit ${status} - -export job="efcs" -export jobid="${job}.$$" - -############################################################### -# Execute the JJOB -"${HOMEgfs}/jobs/JGDAS_ENKF_FCST" -status=$? - -exit ${status} diff --git a/jobs/rocoto/gempakpgrb2spec.sh b/jobs/rocoto/gempakgrb2spec.sh similarity index 100% rename from jobs/rocoto/gempakpgrb2spec.sh rename to jobs/rocoto/gempakgrb2spec.sh diff --git a/jobs/rocoto/oceanice_products.sh b/jobs/rocoto/oceanice_products.sh new file mode 100755 index 0000000000..48816fb3a1 --- /dev/null +++ b/jobs/rocoto/oceanice_products.sh @@ -0,0 +1,37 @@ +#! /usr/bin/env bash + +source "${HOMEgfs}/ush/preamble.sh" + +############################################################### +## ocean ice products driver script +## FHRLST : forecast hour list to post-process (e.g. f000, f000_f001_f002, ...) +############################################################### + +# Source FV3GFS workflow modules +. "${HOMEgfs}/ush/load_fv3gfs_modules.sh" +status=$? 
+if (( status != 0 )); then exit "${status}"; fi + +############################################################### +# setup python path for workflow utilities and tasks +wxflowPATH="${HOMEgfs}/ush/python:${HOMEgfs}/ush/python/wxflow/src" +PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}${wxflowPATH}" +export PYTHONPATH + +export job="oceanice_products" +export jobid="${job}.$$" + +############################################################### +# shellcheck disable=SC2153,SC2001 +IFS='_' read -ra fhrs <<< "${FHRLST//f}" # strip off the 'f's and convert to array + +#--------------------------------------------------------------- +# Execute the JJOB +for fhr in "${fhrs[@]}"; do + export FORECAST_HOUR=$(( 10#${fhr} )) + "${HOMEgfs}/jobs/JGLOBAL_OCEANICE_PRODUCTS" + status=$? + if (( status != 0 )); then exit "${status}"; fi +done + +exit 0 diff --git a/jobs/rocoto/ocnanalecen.sh b/jobs/rocoto/ocnanalecen.sh new file mode 100755 index 0000000000..c5fdbbbf32 --- /dev/null +++ b/jobs/rocoto/ocnanalecen.sh @@ -0,0 +1,23 @@ +#! /usr/bin/env bash + +source "${HOMEgfs}/ush/preamble.sh" + +############################################################### +# Source UFSDA workflow modules +. "${HOMEgfs}/ush/load_ufsda_modules.sh" +status=$? +[[ ${status} -ne 0 ]] && exit "${status}" + +export job="ocnanalecen" +export jobid="${job}.$$" + +############################################################### +# Setup Python path for GDASApp ush +PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}${HOMEgfs}/sorc/gdas.cd/ush" +export PYTHONPATH + +############################################################### +# Execute the JJOB +"${HOMEgfs}"/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_ECEN +status=$? +exit "${status}" diff --git a/jobs/rocoto/ocnpost.sh b/jobs/rocoto/ocnpost.sh deleted file mode 100755 index 5a2dc091cf..0000000000 --- a/jobs/rocoto/ocnpost.sh +++ /dev/null @@ -1,119 +0,0 @@ -#! 
/usr/bin/env bash - -source "${HOMEgfs}/ush/preamble.sh" - -############################################################### -## CICE5/MOM6 post driver script -## FHRGRP : forecast hour group to post-process (e.g. 0, 1, 2 ...) -## FHRLST : forecast hourlist to be post-process (e.g. anl, f000, f000_f001_f002, ...) -############################################################### - -# Source FV3GFS workflow modules -source "${HOMEgfs}/ush/load_fv3gfs_modules.sh" -status=$? -(( status != 0 )) && exit "${status}" - -export job="ocnpost" -export jobid="${job}.$$" -source "${HOMEgfs}/ush/jjob_header.sh" -e "ocnpost" -c "base ocnpost" - -############################################## -# Set variables used in the exglobal script -############################################## -export CDUMP=${RUN/enkf} - -############################################## -# Begin JOB SPECIFIC work -############################################## -YMD=${PDY} HH=${cyc} generate_com -rx COM_OCEAN_HISTORY COM_OCEAN_2D COM_OCEAN_3D \ - COM_OCEAN_XSECT COM_ICE_HISTORY - -for grid in "0p50" "0p25"; do - YMD=${PDY} HH=${cyc} GRID=${grid} generate_com -rx "COM_OCEAN_GRIB_${grid}:COM_OCEAN_GRIB_GRID_TMPL" -done - -for outdir in COM_OCEAN_2D COM_OCEAN_3D COM_OCEAN_XSECT COM_OCEAN_GRIB_0p25 COM_OCEAN_GRIB_0p50; do - if [[ ! 
-d "${!outdir}" ]]; then - mkdir -p "${!outdir}" - fi -done - -fhrlst=$(echo ${FHRLST} | sed -e 's/_/ /g; s/f/ /g; s/,/ /g') - -export OMP_NUM_THREADS=1 -export ENSMEM=${ENSMEM:-000} - -export IDATE=${PDY}${cyc} - -for fhr in ${fhrlst}; do - export fhr=${fhr} - # Ignore possible spelling error (nothing is misspelled) - # shellcheck disable=SC2153 - VDATE=$(${NDATE} "${fhr}" "${IDATE}") - # shellcheck disable= - declare -x VDATE - cd "${DATA}" || exit 2 - if (( 10#${fhr} > 0 )); then - # TODO: This portion calls NCL scripts that are deprecated (see Issue #923) - if [[ "${MAKE_OCN_GRIB:-YES}" == "YES" ]]; then - export MOM6REGRID=${MOM6REGRID:-${HOMEgfs}} - "${MOM6REGRID}/scripts/run_regrid.sh" - status=$? - [[ ${status} -ne 0 ]] && exit "${status}" - - # Convert the netcdf files to grib2 - export executable=${MOM6REGRID}/exec/reg2grb2.x - "${MOM6REGRID}/scripts/run_reg2grb2.sh" - status=$? - [[ ${status} -ne 0 ]] && exit "${status}" - ${NMV} "ocn_ice${VDATE}.${ENSMEM}.${IDATE}_0p25x0p25.grb2" "${COM_OCEAN_GRIB_0p25}/" - ${NMV} "ocn_ice${VDATE}.${ENSMEM}.${IDATE}_0p5x0p5.grb2" "${COM_OCEAN_GRIB_0p50}/" - fi - - #break up ocn netcdf into multiple files: - if [[ -f "${COM_OCEAN_2D}/ocn_2D_${VDATE}.${ENSMEM}.${IDATE}.nc" ]]; then - echo "File ${COM_OCEAN_2D}/ocn_2D_${VDATE}.${ENSMEM}.${IDATE}.nc already exists" - else - ncks -x -v vo,uo,so,temp \ - "${COM_OCEAN_HISTORY}/ocn${VDATE}.${ENSMEM}.${IDATE}.nc" \ - "${COM_OCEAN_2D}/ocn_2D_${VDATE}.${ENSMEM}.${IDATE}.nc" - status=$? 
- [[ ${status} -ne 0 ]] && exit "${status}" - fi - if [[ -f "${COM_OCEAN_3D}/ocn_3D_${VDATE}.${ENSMEM}.${IDATE}.nc" ]]; then - echo "File ${COM_OCEAN_3D}/ocn_3D_${VDATE}.${ENSMEM}.${IDATE}.nc already exists" - else - ncks -x -v Heat_PmE,LW,LwLatSens,MLD_003,MLD_0125,SSH,SSS,SST,SSU,SSV,SW,cos_rot,ePBL,evap,fprec,frazil,latent,lprec,lrunoff,sensible,sin_rot,speed,taux,tauy,wet_c,wet_u,wet_v \ - "${COM_OCEAN_HISTORY}/ocn${VDATE}.${ENSMEM}.${IDATE}.nc" \ - "${COM_OCEAN_3D}/ocn_3D_${VDATE}.${ENSMEM}.${IDATE}.nc" - status=$? - [[ ${status} -ne 0 ]] && exit "${status}" - fi - if [[ -f "${COM_OCEAN_XSECT}/ocn-temp-EQ_${VDATE}.${ENSMEM}.${IDATE}.nc" ]]; then - echo "File ${COM_OCEAN_XSECT}/ocn-temp-EQ_${VDATE}.${ENSMEM}.${IDATE}.nc already exists" - else - ncks -v temp -d yh,0.0 \ - "${COM_OCEAN_3D}/ocn_3D_${VDATE}.${ENSMEM}.${IDATE}.nc" \ - "${COM_OCEAN_XSECT}/ocn-temp-EQ_${VDATE}.${ENSMEM}.${IDATE}.nc" - status=$? - [[ ${status} -ne 0 ]] && exit "${status}" - fi - if [[ -f "${COM_OCEAN_XSECT}/ocn-uo-EQ_${VDATE}.${ENSMEM}.${IDATE}.nc" ]]; then - echo "File ${COM_OCEAN_XSECT}/ocn-uo-EQ_${VDATE}.${ENSMEM}.${IDATE}.nc already exists" - else - ncks -v uo -d yh,0.0 \ - "${COM_OCEAN_3D}/ocn_3D_${VDATE}.${ENSMEM}.${IDATE}.nc" \ - "${COM_OCEAN_XSECT}/ocn-uo-EQ_${VDATE}.${ENSMEM}.${IDATE}.nc" - status=$? 
-      [[ ${status} -ne 0 ]] && exit "${status}"
-    fi
-  fi
-done
-
-# clean up working folder
-if [[ ${KEEPDATA:-"NO"} = "NO" ]] ; then rm -rf "${DATA}" ; fi
-###############################################################
-# Exit out cleanly
-
-
-exit 0
diff --git a/jobs/rocoto/prepatmiodaobs.sh b/jobs/rocoto/prepatmiodaobs.sh
index d424df9261..0e69eda5c9 100755
--- a/jobs/rocoto/prepatmiodaobs.sh
+++ b/jobs/rocoto/prepatmiodaobs.sh
@@ -14,8 +14,9 @@ export jobid="${job}.$$"
 ###############################################################
 # setup python path for workflow and ioda utilities
 wxflowPATH="${HOMEgfs}/ush/python:${HOMEgfs}/ush/python/wxflow/src"
-PYIODALIB="${HOMEgfs}/sorc/gdas.cd/build/lib/python3.7"
-PYTHONPATH="${PYIODALIB}:${wxflowPATH}:${PYTHONPATH}"
+# shellcheck disable=SC2311
+pyiodaPATH="${HOMEgfs}/sorc/gdas.cd/build/lib/python$(detect_py_ver)/"
+PYTHONPATH="${pyiodaPATH}:${wxflowPATH}:${PYTHONPATH}"
 export PYTHONPATH
 
 ###############################################################
diff --git a/jobs/rocoto/preplandobs.sh b/jobs/rocoto/prepsnowobs.sh
similarity index 73%
rename from jobs/rocoto/preplandobs.sh
rename to jobs/rocoto/prepsnowobs.sh
index 6304dd611b..cff082bab2 100755
--- a/jobs/rocoto/preplandobs.sh
+++ b/jobs/rocoto/prepsnowobs.sh
@@ -8,18 +8,20 @@ source "${HOMEgfs}/ush/preamble.sh"
 status=$?
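The prepatmiodaobs hunk above replaces the hard-coded `python3.7` library directory with `$(detect_py_ver)`. The `detect_py_ver` helper comes from the workflow's sourced utility scripts and is not shown in this diff; a plausible standalone equivalent (an assumption — the real helper may differ) is:

```shell
#!/usr/bin/env bash
# Print the major.minor version of the python3 on PATH (e.g. "3.11"),
# so paths like .../build/lib/pythonX.Y can track the loaded python.
detect_py_ver() {
  python3 -c 'import sys; print(f"{sys.version_info.major}.{sys.version_info.minor}")'
}

HOMEgfs="${HOMEgfs:-/path/to/global-workflow}"  # placeholder value
pyiodaPATH="${HOMEgfs}/sorc/gdas.cd/build/lib/python$(detect_py_ver)/"
echo "${pyiodaPATH}"
```

This keeps PYTHONPATH in sync when the python module version changes, instead of pinning a stale `python3.7` directory.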
 [[ ${status} -ne 0 ]] && exit "${status}"
-export job="preplandobs"
+export job="prepsnowobs"
 export jobid="${job}.$$"
 
 ###############################################################
 # setup python path for workflow utilities and tasks
 wxflowPATH="${HOMEgfs}/ush/python:${HOMEgfs}/ush/python/wxflow/src"
-gdasappPATH="${HOMEgfs}/sorc/gdas.cd/iodaconv/src:${HOMEgfs}/sorc/gdas.cd/build/lib/python3.7"
+# shellcheck disable=SC2311
+pyiodaPATH="${HOMEgfs}/sorc/gdas.cd/build/lib/python$(detect_py_ver)/"
+gdasappPATH="${HOMEgfs}/sorc/gdas.cd/sorc/iodaconv/src:${pyiodaPATH}"
 PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}${wxflowPATH}:${gdasappPATH}"
 export PYTHONPATH
 
 ###############################################################
 # Execute the JJOB
-"${HOMEgfs}/jobs/JGLOBAL_PREP_LAND_OBS"
+"${HOMEgfs}/jobs/JGLOBAL_PREP_SNOW_OBS"
 status=$?
 exit "${status}"
diff --git a/jobs/rocoto/landanl.sh b/jobs/rocoto/snowanl.sh
similarity index 91%
rename from jobs/rocoto/landanl.sh
rename to jobs/rocoto/snowanl.sh
index f49b6f9f8b..627dd860f4 100755
--- a/jobs/rocoto/landanl.sh
+++ b/jobs/rocoto/snowanl.sh
@@ -8,7 +8,7 @@ source "${HOMEgfs}/ush/preamble.sh"
 status=$?
 [[ ${status} -ne 0 ]] && exit "${status}"
-export job="landanl"
+export job="snowanl"
 export jobid="${job}.$$"
 
 ###############################################################
@@ -19,6 +19,6 @@ export PYTHONPATH
 
 ###############################################################
 # Execute the JJOB
-"${HOMEgfs}/jobs/JGLOBAL_LAND_ANALYSIS"
+"${HOMEgfs}/jobs/JGLOBAL_SNOW_ANALYSIS"
 status=$?
 exit "${status}"
diff --git a/modulefiles/module-setup.csh.inc b/modulefiles/module-setup.csh.inc
deleted file mode 100644
index 7086326627..0000000000
--- a/modulefiles/module-setup.csh.inc
+++ /dev/null
@@ -1,87 +0,0 @@
-set __ms_shell=csh
-
-eval "if ( -d / ) set __ms_shell=tcsh"
-
-if ( { test -d /lfs/f1 } ) then
-    # We are on NOAA Cactus or Dogwood
-    if ( ! { module help >& /dev/null } ) then
-        source /usr/share/lmod/lmod/init/$__ms_shell
-    fi
-    module reset
-else if ( { test -d /lfs3 } ) then
-    if ( ! { module help >& /dev/null } ) then
-        source /apps/lmod/lmod/init/$__ms_shell
-    endif
-    module purge
-else if ( { test -d /scratch1 } ) then
-    # We are on NOAA Hera
-    if ( ! { module help >& /dev/null } ) then
-        source /apps/lmod/lmod/init/$__ms_shell
-    endif
-    module purge
-elif [[ -d /work ]] ; then
-    # We are on MSU Orion or Hercules
-    if [[ -d /apps/other ]] ; then
-        # Hercules
-        init_path="/apps/other/lmod/lmod/init/$__ms_shell"
-    else
-        # Orion
-        init_path="/apps/lmod/lmod/init/$__ms_shell"
-    fi
-    if ( ! eval module help > /dev/null 2>&1 ) ; then
-        source "${init_path}"
-    fi
-    module purge
-else if ( { test -d /data/prod } ) then
-    # We are on SSEC S4
-    if ( ! { module help >& /dev/null } ) then
-        source /usr/share/lmod/lmod/init/$__ms_shell
-    endif
-    source /etc/profile
-    module purge
-else if ( { test -d /glade } ) then
-    # We are on NCAR Yellowstone
-    if ( ! { module help >& /dev/null } ) then
-        source /usr/share/Modules/init/$__ms_shell
-    endif
-    module purge
-else if ( { test -d /lustre -a -d /ncrc } ) then
-    # We are on GAEA.
-    if ( ! { module help >& /dev/null } ) then
-        # We cannot simply load the module command.  The GAEA
-        # /etc/csh.login modifies a number of module-related variables
-        # before loading the module command.  Without those variables,
-        # the module command fails.  Hence we actually have to source
-        # /etc/csh.login here.
-        source /etc/csh.login
-        set __ms_source_etc_csh_login=yes
-    else
-        set __ms_source_etc_csh_login=no
-    endif
-    module purge
-    unsetenv _LMFILES_
-    unsetenv _LMFILES_000
-    unsetenv _LMFILES_001
-    unsetenv LOADEDMODULES
-    module load modules
-    if ( { test -d /opt/cray/ari/modulefiles } ) then
-        module use -a /opt/cray/ari/modulefiles
-    endif
-    if ( { test -d /opt/cray/pe/ari/modulefiles } ) then
-        module use -a /opt/cray/pe/ari/modulefiles
-    endif
-    if ( { test -d /opt/cray/pe/craype/default/modulefiles } ) then
-        module use -a /opt/cray/pe/craype/default/modulefiles
-    endif
-    setenv NCEPLIBS /lustre/f1/pdata/ncep_shared/NCEPLIBS/lib
-    if ( { test -d /lustre/f1/pdata/ncep_shared/NCEPLIBS/lib } ) then
-        module use $NCEPLIBS/modulefiles
-    endif
-    if ( "$__ms_source_etc_csh_login" == yes ) then
-        source /etc/csh.login
-        unset __ms_source_etc_csh_login
-    endif
-else
-    # Workaround for csh limitation.  Use sh to print to stderr.
-    sh -c 'echo WARNING: UNKNOWN PLATFORM 1>&2'
-endif
diff --git a/modulefiles/module-setup.sh.inc b/modulefiles/module-setup.sh.inc
deleted file mode 100644
index db9dabffe1..0000000000
--- a/modulefiles/module-setup.sh.inc
+++ /dev/null
@@ -1,110 +0,0 @@
-# Create a test function for sh vs. bash detection.  The name is
-# randomly generated to reduce the chances of name collision.
-__ms_function_name="setup__test_function__$$"
-eval "$__ms_function_name() { /bin/true ; }"
-
-# Determine which shell we are using
-__ms_ksh_test=$( eval '__text="text" ; if [[ $__text =~ ^(t).* ]] ; then printf "%s" ${.sh.match[1]} ; fi' 2> /dev/null | cat )
-__ms_bash_test=$( eval 'if ( set | grep '$__ms_function_name' | grep -v name > /dev/null 2>&1 ) ; then echo t ; fi ' 2> /dev/null | cat )
-
-if [[ ! -z "$__ms_ksh_test" ]] ; then
-    __ms_shell=ksh
-elif [[ ! -z "$__ms_bash_test" ]] ; then
-    __ms_shell=bash
-else
-    # Not bash or ksh, so assume sh.
-    __ms_shell=sh
-fi
-
-if [[ -d /lfs/f1 ]] ; then
-    # We are on NOAA Cactus or Dogwood
-    if ( ! eval module help > /dev/null 2>&1 ) ; then
-        source /usr/share/lmod/lmod/init/$__ms_shell
-    fi
-    module reset
-elif [[ -d /mnt/lfs1 ]] ; then
-    # We are on NOAA Jet
-    if ( ! eval module help > /dev/null 2>&1 ) ; then
-        source /apps/lmod/lmod/init/$__ms_shell
-    fi
-    module purge
-elif [[ -d /scratch1 ]] ; then
-    # We are on NOAA Hera
-    if ( ! eval module help > /dev/null 2>&1 ) ; then
-        source /apps/lmod/lmod/init/$__ms_shell
-    fi
-    module purge
-elif [[ -d /work ]] ; then
-    # We are on MSU Orion or Hercules
-    if [[ -d /apps/other ]] ; then
-        # Hercules
-        init_path="/apps/other/lmod/lmod/init/$__ms_shell"
-    else
-        # Orion
-        init_path="/apps/lmod/lmod/init/$__ms_shell"
-    fi
-    if ( ! eval module help > /dev/null 2>&1 ) ; then
-        source "${init_path}"
-    fi
-    module purge
-elif [[ -d /glade ]] ; then
-    # We are on NCAR Yellowstone
-    if ( ! eval module help > /dev/null 2>&1 ) ; then
-        . /usr/share/Modules/init/$__ms_shell
-    fi
-    module purge
-elif [[ -d /lustre && -d /ncrc ]] ; then
-    # We are on GAEA.
-    if ( ! eval module help > /dev/null 2>&1 ) ; then
-        # We cannot simply load the module command.  The GAEA
-        # /etc/profile modifies a number of module-related variables
-        # before loading the module command.  Without those variables,
-        # the module command fails.  Hence we actually have to source
-        # /etc/profile here.
-        source /etc/profile
-        __ms_source_etc_profile=yes
-    else
-        __ms_source_etc_profile=no
-    fi
-    module purge
-    # clean up after purge
-    unset _LMFILES_
-    unset _LMFILES_000
-    unset _LMFILES_001
-    unset LOADEDMODULES
-    module load modules
-    if [[ -d /opt/cray/ari/modulefiles ]] ; then
-        module use -a /opt/cray/ari/modulefiles
-    fi
-    if [[ -d /opt/cray/pe/ari/modulefiles ]] ; then
-        module use -a /opt/cray/pe/ari/modulefiles
-    fi
-    if [[ -d /opt/cray/pe/craype/default/modulefiles ]] ; then
-        module use -a /opt/cray/pe/craype/default/modulefiles
-    fi
-    if [[ -s /etc/opt/cray/pe/admin-pe/site-config ]] ; then
-        source /etc/opt/cray/pe/admin-pe/site-config
-    fi
-    export NCEPLIBS=/lustre/f1/pdata/ncep_shared/NCEPLIBS/lib
-    if [[ -d "$NCEPLIBS" ]] ; then
-        module use $NCEPLIBS/modulefiles
-    fi
-    if [[ "$__ms_source_etc_profile" == yes ]] ; then
-        source /etc/profile
-        unset __ms_source_etc_profile
-    fi
-elif [[ -d /data/prod ]] ; then
-    # We are on SSEC's S4
-    if ( ! eval module help > /dev/null 2>&1 ) ; then
-        source /usr/share/lmod/lmod/init/$__ms_shell
-    fi
-    module purge
-else
-    echo WARNING: UNKNOWN PLATFORM 1>&2
-fi
-
-unset __ms_shell
-unset __ms_ksh_test
-unset __ms_bash_test
-unset $__ms_function_name
-unset __ms_function_name
diff --git a/modulefiles/module_base.hera.lua b/modulefiles/module_base.hera.lua
index 311fb0a1cf..9a542f8f4f 100644
--- a/modulefiles/module_base.hera.lua
+++ b/modulefiles/module_base.hera.lua
@@ -2,9 +2,8 @@ help([[
 Load environment to run GFS on Hera
 ]])
 
-spack_stack_ver=(os.getenv("spack_stack_ver") or "None")
-spack_env=(os.getenv("spack_env") or "None")
-prepend_path("MODULEPATH", "/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-" .. spack_stack_ver .. "/envs/" .. spack_env .. "/install/modulefiles/Core")
+local spack_mod_path=(os.getenv("spack_mod_path") or "None")
+prepend_path("MODULEPATH", spack_mod_path)
 
 load(pathJoin("stack-intel", (os.getenv("stack_intel_ver") or "None")))
 load(pathJoin("stack-intel-oneapi-mpi", (os.getenv("stack_impi_ver") or "None")))
@@ -17,6 +16,7 @@ load(pathJoin("jasper", (os.getenv("jasper_ver") or "None")))
 load(pathJoin("libpng", (os.getenv("libpng_ver") or "None")))
 load(pathJoin("cdo", (os.getenv("cdo_ver") or "None")))
 load(pathJoin("R", (os.getenv("R_ver") or "None")))
+load(pathJoin("perl", (os.getenv("perl_ver") or "None")))
 
 load(pathJoin("hdf5", (os.getenv("hdf5_ver") or "None")))
 load(pathJoin("netcdf-c", (os.getenv("netcdf_c_ver") or "None")))
@@ -33,16 +33,17 @@ load(pathJoin("wgrib2", (os.getenv("wgrib2_ver") or "None")))
 load(pathJoin("py-netcdf4", (os.getenv("py_netcdf4_ver") or "None")))
 load(pathJoin("py-pyyaml", (os.getenv("py_pyyaml_ver") or "None")))
 load(pathJoin("py-jinja2", (os.getenv("py_jinja2_ver") or "None")))
-
--- MET/METplus are not available for use with spack-stack, yet
---load(pathJoin("met", (os.getenv("met_ver") or "None")))
---load(pathJoin("metplus", (os.getenv("metplus_ver") or "None")))
+load(pathJoin("py-pandas", (os.getenv("py_pandas_ver") or "None")))
+load(pathJoin("py-python-dateutil", (os.getenv("py_python_dateutil_ver") or "None")))
+load(pathJoin("met", (os.getenv("met_ver") or "None")))
+load(pathJoin("metplus", (os.getenv("metplus_ver") or "None")))
+load(pathJoin("py-xarray", (os.getenv("py_xarray_ver") or "None")))
 
 setenv("WGRIB2","wgrib2")
 setenv("UTILROOT",(os.getenv("prod_util_ROOT") or "None"))
 
 --prepend_path("MODULEPATH", pathJoin("/scratch1/NCEPDEV/global/glopara/git/prepobs/v" .. (os.getenv("prepobs_run_ver") or "None"), "modulefiles"))
-prepend_path("MODULEPATH", pathJoin("/scratch1/NCEPDEV/global/glopara/git/prepobs/feature-GFSv17_com_reorg_log_update/modulefiles"))
+prepend_path("MODULEPATH", pathJoin("/scratch1/NCEPDEV/global/glopara/git/prepobs/dev-gfsv17/modulefiles"))
 load(pathJoin("prepobs", (os.getenv("prepobs_run_ver") or "None")))
 
 prepend_path("MODULEPATH", pathJoin("/scratch1/NCEPDEV/global/glopara/git/Fit2Obs/v" .. (os.getenv("fit2obs_ver") or "None"), "modulefiles"))
diff --git a/modulefiles/module_base.hercules.lua b/modulefiles/module_base.hercules.lua
index d9c8f5ed0b..77a77f5505 100644
--- a/modulefiles/module_base.hercules.lua
+++ b/modulefiles/module_base.hercules.lua
@@ -2,20 +2,14 @@ help([[
 Load environment to run GFS on Hercules
 ]])
 
-spack_stack_ver=(os.getenv("spack_stack_ver") or "None")
-spack_env=(os.getenv("spack_env") or "None")
-prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-" .. spack_stack_ver .. "/envs/" .. spack_env .. "/install/modulefiles/Core")
+local spack_mod_path=(os.getenv("spack_mod_path") or "None")
+prepend_path("MODULEPATH", spack_mod_path)
 
 load(pathJoin("stack-intel", (os.getenv("stack_intel_ver") or "None")))
 load(pathJoin("stack-intel-oneapi-mpi", (os.getenv("stack_impi_ver") or "None")))
 load(pathJoin("intel-oneapi-mkl", (os.getenv("intel_mkl_ver") or "None")))
 load(pathJoin("python", (os.getenv("python_ver") or "None")))
-load(pathJoin("perl", (os.getenv("perl_ver") or "None")))
--- TODO load NCL once the SAs remove the 'depends_on' statements within it
--- NCL is a static installation and does not depend on any libraries
--- but as is will load, among others, the system netcdf-c/4.9.0 module
---load(pathJoin("ncl", (os.getenv("ncl_ver") or "None")))
 load(pathJoin("jasper", (os.getenv("jasper_ver") or "None")))
 load(pathJoin("libpng", (os.getenv("libpng_ver") or "None")))
 load(pathJoin("cdo", (os.getenv("cdo_ver") or "None")))
@@ -35,6 +29,11 @@ load(pathJoin("wgrib2", (os.getenv("wgrib2_ver") or "None")))
 load(pathJoin("py-netcdf4", (os.getenv("py_netcdf4_ver") or "None")))
 load(pathJoin("py-pyyaml", (os.getenv("py_pyyaml_ver") or "None")))
 load(pathJoin("py-jinja2", (os.getenv("py_jinja2_ver") or "None")))
+load(pathJoin("py-pandas", (os.getenv("py_pandas_ver") or "None")))
+load(pathJoin("py-python-dateutil", (os.getenv("py_python_dateutil_ver") or "None")))
+load(pathJoin("met", (os.getenv("met_ver") or "None")))
+load(pathJoin("metplus", (os.getenv("metplus_ver") or "None")))
+load(pathJoin("py-xarray", (os.getenv("py_xarray_ver") or "None")))
 
 setenv("WGRIB2","wgrib2")
 setenv("UTILROOT",(os.getenv("prod_util_ROOT") or "None"))
diff --git a/modulefiles/module_base.jet.lua b/modulefiles/module_base.jet.lua
index 64d35da57a..55a1eb1c68 100644
--- a/modulefiles/module_base.jet.lua
+++ b/modulefiles/module_base.jet.lua
@@ -2,9 +2,8 @@ help([[
 Load environment to run GFS on Jet
 ]])
 
-spack_stack_ver=(os.getenv("spack_stack_ver") or "None")
-spack_env=(os.getenv("spack_env") or "None")
-prepend_path("MODULEPATH", "/mnt/lfs4/HFIP/hfv3gfs/role.epic/spack-stack/spack-stack-" .. spack_stack_ver .. "/envs/" .. spack_env .. "/install/modulefiles/Core")
+local spack_mod_path=(os.getenv("spack_mod_path") or "None")
+prepend_path("MODULEPATH", spack_mod_path)
 
 load(pathJoin("stack-intel", (os.getenv("stack_intel_ver") or "None")))
 load(pathJoin("stack-intel-oneapi-mpi", (os.getenv("stack_impi_ver") or "None")))
@@ -33,6 +32,11 @@ load(pathJoin("wgrib2", (os.getenv("wgrib2_ver") or "None")))
 load(pathJoin("py-netcdf4", (os.getenv("py_netcdf4_ver") or "None")))
 load(pathJoin("py-pyyaml", (os.getenv("py_pyyaml_ver") or "None")))
 load(pathJoin("py-jinja2", (os.getenv("py_jinja2_ver") or "None")))
+load(pathJoin("py-pandas", (os.getenv("py_pandas_ver") or "None")))
+load(pathJoin("py-python-dateutil", (os.getenv("py_python_dateutil_ver") or "None")))
+load(pathJoin("met", (os.getenv("met_ver") or "None")))
+load(pathJoin("metplus", (os.getenv("metplus_ver") or "None")))
+load(pathJoin("py-xarray", (os.getenv("py_xarray_ver") or "None")))
 
 setenv("WGRIB2","wgrib2")
 setenv("UTILROOT",(os.getenv("prod_util_ROOT") or "None"))
diff --git a/modulefiles/module_base.orion.lua b/modulefiles/module_base.orion.lua
index 65486855d0..4e2e24b82f 100644
--- a/modulefiles/module_base.orion.lua
+++ b/modulefiles/module_base.orion.lua
@@ -2,9 +2,8 @@ help([[
 Load environment to run GFS on Orion
 ]])
 
-spack_stack_ver=(os.getenv("spack_stack_ver") or "None")
-spack_env=(os.getenv("spack_env") or "None")
-prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/orion/spack-stack-" .. spack_stack_ver .. "/envs/" .. spack_env .. "/install/modulefiles/Core")
+local spack_mod_path=(os.getenv("spack_mod_path") or "None")
+prepend_path("MODULEPATH", spack_mod_path)
 
 load(pathJoin("stack-intel", (os.getenv("stack_intel_ver") or "None")))
 load(pathJoin("stack-intel-oneapi-mpi", (os.getenv("stack_impi_ver") or "None")))
@@ -31,10 +30,11 @@ load(pathJoin("wgrib2", (os.getenv("wgrib2_ver") or "None")))
 load(pathJoin("py-netcdf4", (os.getenv("py_netcdf4_ver") or "None")))
 load(pathJoin("py-pyyaml", (os.getenv("py_pyyaml_ver") or "None")))
 load(pathJoin("py-jinja2", (os.getenv("py_jinja2_ver") or "None")))
-
--- MET/METplus are not yet supported with spack-stack
---load(pathJoin("met", (os.getenv("met_ver") or "None")))
---load(pathJoin("metplus", (os.getenv("metplus_ver") or "None")))
+load(pathJoin("py-pandas", (os.getenv("py_pandas_ver") or "None")))
+load(pathJoin("py-python-dateutil", (os.getenv("py_python_dateutil_ver") or "None")))
+load(pathJoin("met", (os.getenv("met_ver") or "None")))
+load(pathJoin("metplus", (os.getenv("metplus_ver") or "None")))
+load(pathJoin("py-xarray", (os.getenv("py_xarray_ver") or "None")))
 
 setenv("WGRIB2","wgrib2")
 setenv("UTILROOT",(os.getenv("prod_util_ROOT") or "None"))
diff --git a/modulefiles/module_base.s4.lua b/modulefiles/module_base.s4.lua
index d99a93c3f4..d8dccc89ba 100644
--- a/modulefiles/module_base.s4.lua
+++ b/modulefiles/module_base.s4.lua
@@ -2,9 +2,8 @@ help([[
 Load environment to run GFS on S4
 ]])
 
-spack_stack_ver=(os.getenv("spack_stack_ver") or "None")
-spack_env=(os.getenv("spack_env") or "None")
-prepend_path("MODULEPATH", "/data/prod/jedi/spack-stack/spack-stack-" .. spack_stack_ver .. "/envs/" .. spack_env .. "/install/modulefiles/Core")
+local spack_mod_path=(os.getenv("spack_mod_path") or "None")
+prepend_path("MODULEPATH", spack_mod_path)
 
 load(pathJoin("stack-intel", (os.getenv("stack_intel_ver") or "None")))
 load(pathJoin("stack-intel-oneapi-mpi", (os.getenv("stack_impi_ver") or "None")))
@@ -30,6 +29,11 @@ load(pathJoin("wgrib2", (os.getenv("wgrib2_ver") or "None")))
 load(pathJoin("py-netcdf4", (os.getenv("py_netcdf4_ver") or "None")))
 load(pathJoin("py-pyyaml", (os.getenv("py_pyyaml_ver") or "None")))
 load(pathJoin("py-jinja2", (os.getenv("py_jinja2_ver") or "None")))
+load(pathJoin("py-pandas", (os.getenv("py_pandas_ver") or "None")))
+load(pathJoin("py-python-dateutil", (os.getenv("py_python_dateutil_ver") or "None")))
+load(pathJoin("met", (os.getenv("met_ver") or "None")))
+load(pathJoin("metplus", (os.getenv("metplus_ver") or "None")))
+load(pathJoin("py-xarray", (os.getenv("py_xarray_ver") or "None")))
 
 setenv("WGRIB2","wgrib2")
 setenv("UTILROOT",(os.getenv("prod_util_ROOT") or "None"))
diff --git a/modulefiles/module_base.wcoss2.lua b/modulefiles/module_base.wcoss2.lua
index ee4ee6a5fb..43b21ccc25 100644
--- a/modulefiles/module_base.wcoss2.lua
+++ b/modulefiles/module_base.wcoss2.lua
@@ -31,6 +31,11 @@ load(pathJoin("ncdiag", (os.getenv("ncdiag_ver") or "None")))
 load(pathJoin("crtm", (os.getenv("crtm_ver") or "None")))
 load(pathJoin("wgrib2", (os.getenv("wgrib2_ver") or "None")))
 
+prepend_path("MODULEPATH", "/apps/ops/para/libs/modulefiles/compiler/intel/19.1.3.304")
+setenv("HPC_OPT", "/apps/ops/para/libs")
+load(pathJoin("met", (os.getenv("met_ver") or "None")))
+load(pathJoin("metplus", (os.getenv("metplus_ver") or "None")))
+
 --prepend_path("MODULEPATH", pathJoin("/lfs/h2/emc/global/save/emc.global/git/prepobs/v" .. (os.getenv("prepobs_run_ver") or "None"), "modulefiles"))
 prepend_path("MODULEPATH", pathJoin("/lfs/h2/emc/global/save/emc.global/git/prepobs/feature-GFSv17_com_reorg_log_update/modulefiles"))
 load(pathJoin("prepobs", (os.getenv("prepobs_run_ver") or "None")))
diff --git a/modulefiles/module_gwci.hera.lua b/modulefiles/module_gwci.hera.lua
index 1aecddf549..75b972b69b 100644
--- a/modulefiles/module_gwci.hera.lua
+++ b/modulefiles/module_gwci.hera.lua
@@ -2,13 +2,13 @@ help([[
 Load environment to run GFS workflow setup scripts on Hera
 ]])
 
-prepend_path("MODULEPATH", "/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.5.1/envs/gsi-addon/install/modulefiles/Core")
+prepend_path("MODULEPATH", "/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.6.0/envs/gsi-addon-dev-rocky8/install/modulefiles/Core")
 
 load(pathJoin("stack-intel", os.getenv("2021.5.0")))
 load(pathJoin("stack-intel-oneapi-mpi", os.getenv("2021.5.1")))
 
 load(pathJoin("netcdf-c", os.getenv("4.9.2")))
-load(pathJoin("netcdf-fortran", os.getenv("4.6.0")))
+load(pathJoin("netcdf-fortran", os.getenv("4.6.1")))
 load(pathJoin("nccmp","1.9.0.1"))
 load(pathJoin("wgrib2", "2.0.8"))
diff --git a/modulefiles/module_gwci.hercules.lua b/modulefiles/module_gwci.hercules.lua
index 9c60aed467..179bbef114 100644
--- a/modulefiles/module_gwci.hercules.lua
+++ b/modulefiles/module_gwci.hercules.lua
@@ -2,7 +2,7 @@ help([[
 Load environment to run GFS workflow ci scripts on Hercules
 ]])
 
-prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-1.5.1/envs/gsi-addon/install/modulefiles/Core")
+prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-1.6.0/envs/gsi-addon-env/install/modulefiles/Core")
 
 load(pathJoin("stack-intel", os.getenv("2021.9.0")))
 load(pathJoin("stack-intel-oneapi-mpi", os.getenv("2021.9.0")))
diff --git a/modulefiles/module_gwci.orion.lua b/modulefiles/module_gwci.orion.lua
index 18851ba7d4..cef7acf308 100644
--- a/modulefiles/module_gwci.orion.lua
+++ b/modulefiles/module_gwci.orion.lua
@@ -2,13 +2,13 @@ help([[
 Load environment to run GFS workflow ci scripts on Orion
 ]])
 
-prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/orion/spack-stack-1.5.1/envs/gsi-addon/install/modulefiles/Core")
+prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/orion/spack-stack-1.6.0/envs/gsi-addon-env/install/modulefiles/Core")
 
 load(pathJoin("stack-intel", os.getenv("2022.0.2")))
 load(pathJoin("stack-intel-oneapi-mpi", os.getenv("2021.5.1")))
 
 load(pathJoin("netcdf-c", os.getenv("4.9.2")))
-load(pathJoin("netcdf-fortran", os.getenv("4.6.0")))
+load(pathJoin("netcdf-fortran", os.getenv("4.6.1")))
 load(pathJoin("nccmp","1.9.0.1"))
 load(pathJoin("wgrib2", "2.0.8"))
diff --git a/modulefiles/module_gwsetup.hera.lua b/modulefiles/module_gwsetup.hera.lua
index c86cac7b02..696f9577be 100644
--- a/modulefiles/module_gwsetup.hera.lua
+++ b/modulefiles/module_gwsetup.hera.lua
@@ -4,15 +4,17 @@ Load environment to run GFS workflow setup scripts on Hera
 
 load(pathJoin("rocoto"))
 
-prepend_path("MODULEPATH", "/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.5.1/envs/gsi-addon/install/modulefiles/Core")
+prepend_path("MODULEPATH", "/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.6.0/envs/gsi-addon-dev-rocky8/install/modulefiles/Core")
 
 local stack_intel_ver=os.getenv("stack_intel_ver") or "2021.5.0"
-local python_ver=os.getenv("python_ver") or "3.10.8"
+local python_ver=os.getenv("python_ver") or "3.11.6"
 
 load(pathJoin("stack-intel", stack_intel_ver))
 load(pathJoin("python", python_ver))
 load("py-jinja2")
 load("py-pyyaml")
 load("py-numpy")
+local git_ver=os.getenv("git_ver") or "2.18.0"
+load(pathJoin("git", git_ver))
 
 whatis("Description: GFS run setup environment")
diff --git a/modulefiles/module_gwsetup.hercules.lua b/modulefiles/module_gwsetup.hercules.lua
index 673928605c..795b295b30 100644
--- a/modulefiles/module_gwsetup.hercules.lua
+++ b/modulefiles/module_gwsetup.hercules.lua
@@ -5,10 +5,10 @@ Load environment to run GFS workflow ci scripts on Hercules
 load(pathJoin("contrib","0.1"))
 load(pathJoin("rocoto","1.3.5"))
 
-prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-1.5.1/envs/gsi-addon/install/modulefiles/Core")
+prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-1.6.0/envs/gsi-addon-env/install/modulefiles/Core")
 
 local stack_intel_ver=os.getenv("stack_intel_ver") or "2021.9.0"
-local python_ver=os.getenv("python_ver") or "3.10.8"
+local python_ver=os.getenv("python_ver") or "3.11.6"
 
 load(pathJoin("stack-intel", stack_intel_ver))
 load(pathJoin("python", python_ver))
diff --git a/modulefiles/module_gwsetup.jet.lua b/modulefiles/module_gwsetup.jet.lua
index d08389c711..72c40469e4 100644
--- a/modulefiles/module_gwsetup.jet.lua
+++ b/modulefiles/module_gwsetup.jet.lua
@@ -2,12 +2,12 @@ help([[
 Load environment to run GFS workflow setup scripts on Jet
 ]])
 
-load(pathJoin("rocoto", "1.3.3"))
+load(pathJoin("rocoto"))
 
-prepend_path("MODULEPATH", "/mnt/lfs4/HFIP/hfv3gfs/role.epic/spack-stack/spack-stack-1.5.1/envs/gsi-addon/install/modulefiles/Core")
+prepend_path("MODULEPATH", "/mnt/lfs4/HFIP/hfv3gfs/role.epic/spack-stack/spack-stack-1.6.0/envs/gsi-addon-dev/install/modulefiles/Core")
 
 local stack_intel_ver=os.getenv("stack_intel_ver") or "2021.5.0"
-local python_ver=os.getenv("python_ver") or "3.10.8"
+local python_ver=os.getenv("python_ver") or "3.11.6"
 
 load(pathJoin("stack-intel", stack_intel_ver))
 load(pathJoin("python", python_ver))
diff --git a/modulefiles/module_gwsetup.orion.lua b/modulefiles/module_gwsetup.orion.lua
index 93a59c8e50..96ed50f7f0 100644
--- a/modulefiles/module_gwsetup.orion.lua
+++ b/modulefiles/module_gwsetup.orion.lua
@@ -7,10 +7,10 @@ load(pathJoin("contrib","0.1"))
 load(pathJoin("rocoto","1.3.3"))
 load(pathJoin("git","2.28.0"))
 
-prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/orion/spack-stack-1.5.1/envs/gsi-addon/install/modulefiles/Core")
+prepend_path("MODULEPATH", "/work/noaa/epic/role-epic/spack-stack/orion/spack-stack-1.6.0/envs/gsi-addon-env/install/modulefiles/Core")
 
 local stack_intel_ver=os.getenv("stack_intel_ver") or "2022.0.2"
-local python_ver=os.getenv("python_ver") or "3.10.8"
+local python_ver=os.getenv("python_ver") or "3.11.6"
 
 load(pathJoin("stack-intel", stack_intel_ver))
 load(pathJoin("python", python_ver))
diff --git a/modulefiles/module_gwsetup.s4.lua b/modulefiles/module_gwsetup.s4.lua
index 291c654bb3..77a647006f 100644
--- a/modulefiles/module_gwsetup.s4.lua
+++ b/modulefiles/module_gwsetup.s4.lua
@@ -5,10 +5,10 @@ Load environment to run GFS workflow setup scripts on S4
 load(pathJoin("rocoto","1.3.5"))
 load(pathJoin("git","2.30.0"))
 
-prepend_path("MODULEPATH", "/data/prod/jedi/spack-stack/spack-stack-1.5.1/envs/gsi-addon/install/modulefiles/Core")
+prepend_path("MODULEPATH", "/data/prod/jedi/spack-stack/spack-stack-1.6.0/envs/gsi-addon-env/install/modulefiles/Core")
 
 local stack_intel_ver=os.getenv("stack_intel_ver") or "2021.5.0"
-local python_ver=os.getenv("python_ver") or "3.10.8"
+local python_ver=os.getenv("python_ver") or "3.11.6"
 
 load(pathJoin("stack-intel", stack_intel_ver))
 load(pathJoin("python", python_ver))
diff --git a/parm/config/gefs/config.atmos_products b/parm/config/gefs/config.atmos_products
new file mode 100644
index 0000000000..4a0fb8b49f
--- /dev/null
+++ b/parm/config/gefs/config.atmos_products
@@ -0,0 +1,28 @@
+#! /usr/bin/env bash
+
+########## config.atmos_products ##########
+# atmosphere grib2 products specific
+
+echo "BEGIN: config.atmos_products"
+
+# Get task specific resources
+. "${EXPDIR}/config.resources" atmos_products
+
+# No. of forecast hours to process in a single job
+export NFHRS_PER_GROUP=3
+
+# Scripts used by this job
+export INTERP_ATMOS_MASTERSH="${USHgfs}/interp_atmos_master.sh"
+export INTERP_ATMOS_SFLUXSH="${USHgfs}/interp_atmos_sflux.sh"
+
+export downset=2
+export FHOUT_PGBS=${FHOUT_GFS:-3} # Output frequency of supplemental gfs pgb file at 1.0 and 0.5 deg
+export FLXGF="NO" # Create interpolated sflux.1p00 file
+
+# paramlist files for the different forecast hours and downsets
+export paramlista="${PARMgfs}/product/gefs.0p25.fFFF.paramlist.a.txt"
+export paramlista_anl="${PARMgfs}/product/gefs.0p25.anl.paramlist.a.txt"
+export paramlista_f000="${PARMgfs}/product/gefs.0p25.f000.paramlist.a.txt"
+export paramlistb="${PARMgfs}/product/gefs.0p25.fFFF.paramlist.b.txt"
+
+echo "END: config.atmos_products"
diff --git a/parm/config/gefs/config.base.emc.dyn b/parm/config/gefs/config.base
similarity index 92%
rename from parm/config/gefs/config.base.emc.dyn
rename to parm/config/gefs/config.base
index 5358a37768..07664f15ff 100644
--- a/parm/config/gefs/config.base.emc.dyn
+++ b/parm/config/gefs/config.base
@@ -23,22 +23,11 @@ export HPSS_PROJECT="@HPSS_PROJECT@"
 # Directories relative to installation areas:
 export HOMEgfs=@HOMEgfs@
-export PARMgfs=${HOMEgfs}/parm
-export FIXgfs=${HOMEgfs}/fix
-export USHgfs=${HOMEgfs}/ush
-export UTILgfs=${HOMEgfs}/util
 export EXECgfs=${HOMEgfs}/exec
+export FIXgfs=${HOMEgfs}/fix
+export PARMgfs=${HOMEgfs}/parm
 export SCRgfs=${HOMEgfs}/scripts
-
-export FIXam="${FIXgfs}/am"
-export FIXaer="${FIXgfs}/aer"
-export FIXcpl="${FIXgfs}/cpl"
-export FIXlut="${FIXgfs}/lut"
-export FIXorog="${FIXgfs}/orog"
-export FIXcice="${FIXgfs}/cice"
-export FIXmom="${FIXgfs}/mom6"
-export FIXreg2grb2="${FIXgfs}/reg2grb2"
-export FIXugwd="${FIXgfs}/ugwd"
+export USHgfs=${HOMEgfs}/ush
 
 ########################################################################
@@ -85,7 +74,8 @@ export NCP="/bin/cp -p"
 export NMV="/bin/mv"
 export NLN="/bin/ln -sf"
 export VERBOSE="YES"
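The new config.atmos_products above defines `paramlista`/`paramlistb` with an `fFFF` token alongside dedicated `anl` and `f000` files. Assuming the products job substitutes the zero-padded 3-digit forecast hour for `FFF` (an inference from the file naming, not shown in this diff), the per-hour lookup could work like:

```shell
#!/usr/bin/env bash
# Resolve the per-hour paramlist from the fFFF template, assuming the
# token stands for the zero-padded 3-digit forecast hour.
PARMgfs="${PARMgfs:-/parm}"  # placeholder default
paramlista="${PARMgfs}/product/gefs.0p25.fFFF.paramlist.a.txt"
fhr=6
fhr3="$(printf "%03d" "${fhr}")"
echo "${paramlista/fFFF/f${fhr3}}"
```

With `PARMgfs=/parm` this prints `/parm/product/gefs.0p25.f006.paramlist.a.txt`; hour 0 and the analysis use the dedicated `f000`/`anl` files instead.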
-export KEEPDATA="NO" +export KEEPDATA="@KEEPDATA@" +export DEBUG_POSTSCRIPT="NO" # PBS only; sets debug=true export CHGRP_RSTPROD="@CHGRP_RSTPROD@" export CHGRP_CMD="@CHGRP_CMD@" export NCDUMP="${NETCDF:-${netcdf_c_ROOT:-}}/bin/ncdump" @@ -120,6 +110,7 @@ export RUN="gefs" # RUN is defined in the job-card (ecf); CDUMP is used at EMC # Get all the COM path templates source "${EXPDIR}/config.com" +# shellcheck disable=SC2016 export ERRSCRIPT=${ERRSCRIPT:-'eval [[ $err = 0 ]]'} export LOGSCRIPT=${LOGSCRIPT:-""} #export ERRSCRIPT=${ERRSCRIPT:-"err_chk"} @@ -144,7 +135,7 @@ export DO_OCN="NO" export DO_ICE="NO" export DO_AERO="NO" export WAVE_CDUMP="" # When to include wave suite: gdas, gfs, or both -export DOBNDPNT_WAVE="NO" +export DOBNDPNT_WAVE="NO" # The GEFS buoys file does not currently have any boundary points export FRAC_GRID=".true." # Set operational resolution @@ -218,16 +209,12 @@ export gfs_cyc=@gfs_cyc@ # 0: no GFS cycle, 1: 00Z only, 2: 00Z and 12Z only, 4: # GFS output and frequency export FHMIN_GFS=0 - -export FHMAX_GFS_00=120 -export FHMAX_GFS_06=120 -export FHMAX_GFS_12=120 -export FHMAX_GFS_18=120 -current_fhmax_var=FHMAX_GFS_${cyc}; declare -x FHMAX_GFS=${!current_fhmax_var} - -export FHOUT_GFS=6 # Must be 6 for S2S until #1629 is addressed; 3 for ops +export FHMIN=${FHMIN_GFS} +export FHMAX_GFS=@FHMAX_GFS@ +export FHOUT_GFS=6 export FHMAX_HF_GFS=0 export FHOUT_HF_GFS=1 +export FHOUT_OCNICE_GFS=6 if (( gfs_cyc != 0 )); then export STEP_GFS=$(( 24 / gfs_cyc )) else @@ -262,7 +249,7 @@ export imp_physics=8 export DO_JEDIATMVAR="NO" export DO_JEDIATMENS="NO" export DO_JEDIOCNVAR="NO" -export DO_JEDILANDDA="NO" +export DO_JEDISNOWDA="NO" export DO_MERGENSST="NO" # Hybrid related @@ -270,9 +257,13 @@ export NMEM_ENS=@NMEM_ENS@ # set default member number memdir for control # this will be overwritten for the perturbed members -export ENSMEM="000" +export ENSMEM=${ENSMEM:-"000"} export MEMDIR="mem${ENSMEM}" +# initialize ocean ensemble members with 
perturbations +# if true, only occurs for members greater than zero +export USE_OCN_PERTURB_FILES=@USE_OCN_PERTURB_FILES@ + export DOIAU="NO" # While we are not doing IAU, we may want to warm start w/ IAU in the future # Check if cycle is cold starting if [[ "${EXP_WARM_START}" = ".false." ]]; then diff --git a/parm/config/gefs/config.efcs b/parm/config/gefs/config.efcs index 9593408848..ad90fa864c 100644 --- a/parm/config/gefs/config.efcs +++ b/parm/config/gefs/config.efcs @@ -5,14 +5,16 @@ echo "BEGIN: config.efcs" -# Turn off components in ensemble via _ENKF, or use setting from deterministic -export DO_AERO=${DO_AERO_ENKF:-${DO_AERO:-"NO"}} -export DO_OCN=${DO_OCN_ENKF:-${DO_OCN:-"NO"}} -export DO_ICE=${DO_ICE_ENKF:-${DO_ICE:-"NO"}} -export DO_WAVE=${DO_WAVE_ENKF:-${DO_WAVE:-"NO"}} +# Turn off components in ensemble +# export DO_AERO="NO" +# export DO_OCN="NO" +# export DO_ICE="NO" +# export DO_WAVE="NO" + +export CASE="${CASE_ENS}" # Source model specific information that is resolution dependent -string="--fv3 ${CASE_ENS}" +string="--fv3 ${CASE}" # Ocean/Ice/Waves ensemble configurations are identical to deterministic member [[ "${DO_OCN}" == "YES" ]] && string="${string} --mom6 ${OCNRES}" [[ "${DO_ICE}" == "YES" ]] && string="${string} --cice6 ${ICERES}" @@ -24,23 +26,24 @@ source "${EXPDIR}/config.ufs" ${string} # Get task specific resources source "${EXPDIR}/config.resources" efcs -# Use serial I/O for ensemble (lustre?) -export OUTPUT_FILETYPE_ATM="netcdf" -export OUTPUT_FILETYPE_SFC="netcdf" - -# Number of enkf members per fcst job -export NMEM_EFCSGRP=1 -export RERUN_EFCSGRP="NO" +# nggps_diag_nml +export FHOUT=${FHOUT_ENKF:-3} +if [[ "${RUN}" == "enkfgfs" ]]; then + export FHOUT=${FHOUT_ENKF_GFS:-${FHOUT}} +fi -# Turn off inline UPP for EnKF forecast -export WRITE_DOPOST=".true." 
+# model_configure +export FHMAX=${FHMAX_ENKF:-9} +if [[ "${RUN}" == "enkfgfs" ]]; then + export FHMAX=${FHMAX_ENKF_GFS:-${FHMAX}} +fi # Stochastic physics parameters (only for ensemble forecasts) export DO_SKEB="YES" -export SKEB=0.3 -export SKEB_TAU=21600. -export SKEB_LSCALE=250000. -export SKEBNORM=0 +export SKEB="0.8,-999,-999,-999,-999" +export SKEB_TAU="2.16E4,2.592E5,2.592E6,7.776E6,3.1536E7" +export SKEB_LSCALE="500.E3,1000.E3,2000.E3,2000.E3,2000.E3" +export SKEBNORM=1 export SKEB_NPASS=30 export SKEB_VDOF=5 export DO_SHUM="YES" @@ -48,12 +51,33 @@ export SHUM=0.005 export SHUM_TAU=21600. export SHUM_LSCALE=500000. export DO_SPPT="YES" -export SPPT=0.5 -export SPPT_TAU=21600. -export SPPT_LSCALE=500000. +export SPPT="0.56,0.28,0.14,0.056,0.028" +export SPPT_TAU="2.16E4,2.592E5,2.592E6,7.776E6,3.1536E7" +export SPPT_LSCALE="500.E3,1000.E3,2000.E3,2000.E3,2000.E3" export SPPT_LOGIT=".true." export SPPT_SFCLIMIT=".true." +export DO_CA="YES" +# OCN options +export DO_OCN_SPPT="YES" +export OCNSPPT="0.8,0.4,0.2,0.08,0.04" +export OCNSPPT_TAU="2.16E4,2.592E5,2.592E6,7.776E6,3.1536E7" +export OCNSPPT_LSCALE="500.E3,1000.E3,2000.E3,2000.E3,2000.E3" +export DO_OCN_PERT_EPBL="YES" +export EPBL="0.8,0.4,0.2,0.08,0.04" +export EPBL_TAU="2.16E4,2.592E5,2.592E6,7.776E6,3.1536E7" +export EPBL_LSCALE="500.E3,1000.E3,2000.E3,2000.E3,2000.E3" -export restart_interval=${restart_interval_gfs} +if [[ "${USE_OCN_PERTURB_FILES:-false}" == "true" ]]; then + export ODA_INCUPD="True" + export ODA_TEMPINC_VAR='t_pert' + export ODA_SALTINC_VAR='s_pert' + export ODA_THK_VAR='h_anl' + export ODA_UINC_VAR='u_pert' + export ODA_VINC_VAR='v_pert' + export ODA_INCUPD_NHOURS=0.0 +else + export ODA_INCUPD="False" +fi +export restart_interval="${restart_interval_gfs}" echo "END: config.efcs" diff --git a/parm/config/gefs/config.fcst b/parm/config/gefs/config.fcst index 4c8d3be99f..5c592556c8 100644 --- a/parm/config/gefs/config.fcst +++ b/parm/config/gefs/config.fcst @@ -5,6 +5,8 @@ echo 
"BEGIN: config.fcst" +export USE_ESMF_THREADING="YES" # Toggle to use ESMF-managed threading or traditional threading in UFSWM + # Turn off waves if not used for this CDUMP case ${WAVE_CDUMP} in both | "${CDUMP/enkf}" ) ;; # Don't change @@ -21,6 +23,13 @@ string="--fv3 ${CASE}" # shellcheck disable=SC2086 source "${EXPDIR}/config.ufs" ${string} +# shellcheck disable=SC2153 +export FHMAX=${FHMAX_GFS} +# shellcheck disable=SC2153 +export FHOUT=${FHOUT_GFS} +export FHMAX_HF=${FHMAX_HF_GFS} +export FHOUT_HF=${FHOUT_HF_GFS} +export FHOUT_OCNICE=${FHOUT_OCNICE_GFS} # Get task specific resources source "${EXPDIR}/config.resources" fcst @@ -37,9 +46,8 @@ export esmf_logkind="ESMF_LOGKIND_MULTI_ON_ERROR" #Options: ESMF_LOGKIND_MULTI_O ####################################################################### -export FORECASTSH="${HOMEgfs}/scripts/exglobal_forecast.sh" -#export FORECASTSH="${HOMEgfs}/scripts/exglobal_forecast.py" # Temp. while this is worked on -export FCSTEXECDIR="${HOMEgfs}/exec" +export FORECASTSH="${SCRgfs}/exglobal_forecast.sh" +#export FORECASTSH="${SCRgfs}/exglobal_forecast.py" # Temp. while this is worked on export FCSTEXEC="ufs_model.x" ####################################################################### @@ -97,9 +105,6 @@ if (( gwd_opt == 2 )); then export do_ugwp_v1_orog_only=".false." launch_level=$(echo "${LEVS}/2.35" |bc) export launch_level - if [[ ${do_gsl_drag_ls_bl} == ".true." ]]; then - export cdmbgwd=${cdmbgwd_gsl} - fi fi # Sponge layer settings @@ -129,7 +134,11 @@ tbp="" if [[ "${progsigma}" == ".true." 
]]; then tbp="_progsigma" ; fi # Radiation options -export IAER=1011 ; #spectral band mapping method for aerosol optical properties +if [[ "${DO_AERO}" == "YES" ]]; then + export IAER=2011 # spectral band mapping method for aerosol optical properties +else + export IAER=1011 +fi export iovr_lw=3 ; #de-correlation length cloud overlap method (Barker, 2008) export iovr_sw=3 ; #de-correlation length cloud overlap method (Barker, 2008) export iovr=3 ; #de-correlation length cloud overlap method (Barker, 2008) @@ -156,17 +165,17 @@ export random_clds=".true." case ${imp_physics} in 99) # ZhaoCarr export ncld=1 - export FIELD_TABLE="${HOMEgfs}/parm/ufs/fv3/field_table_zhaocarr${tbf}${tbp}" + export FIELD_TABLE="${PARMgfs}/ufs/fv3/field_table_zhaocarr${tbf}${tbp}" export nwat=2 ;; 6) # WSM6 export ncld=2 - export FIELD_TABLE="${HOMEgfs}/parm/ufs/fv3/field_table_wsm6${tbf}${tbp}" + export FIELD_TABLE="${PARMgfs}/ufs/fv3/field_table_wsm6${tbf}${tbp}" export nwat=6 ;; 8) # Thompson export ncld=2 - export FIELD_TABLE="${HOMEgfs}/parm/ufs/fv3/field_table_thompson_noaero_tke${tbp}" + export FIELD_TABLE="${PARMgfs}/ufs/fv3/field_table_thompson_noaero_tke${tbp}" export nwat=6 export cal_pre=".false." @@ -189,7 +198,7 @@ case ${imp_physics} in ;; 11) # GFDL export ncld=5 - export FIELD_TABLE="${HOMEgfs}/parm/ufs/fv3/field_table_gfdl${tbf}${tbp}" + export FIELD_TABLE="${PARMgfs}/ufs/fv3/field_table_gfdl${tbf}${tbp}" export nwat=6 export dnats=1 export cal_pre=".false." 
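The config.fcst hunks above swap `${HOMEgfs}/parm` for `${PARMgfs}` in the per-microphysics `FIELD_TABLE` selection. That mapping can be exercised in isolation; this is a minimal sketch, and the `pick_field_table` helper (plus the `/path/to/parm` root) is illustrative, not part of the workflow:

```shell
#!/usr/bin/env bash
# Map an imp_physics scheme number to its FV3 field table, mirroring the
# case statement in config.fcst (paths relative to PARMgfs, per the diff).
pick_field_table() {
  local imp_physics=$1 tbf=${2:-} tbp=${3:-}
  case ${imp_physics} in
    99) echo "${PARMgfs}/ufs/fv3/field_table_zhaocarr${tbf}${tbp}" ;;       # Zhao-Carr
    6)  echo "${PARMgfs}/ufs/fv3/field_table_wsm6${tbf}${tbp}" ;;           # WSM6
    8)  echo "${PARMgfs}/ufs/fv3/field_table_thompson_noaero_tke${tbp}" ;;  # Thompson
    11) echo "${PARMgfs}/ufs/fv3/field_table_gfdl${tbf}${tbp}" ;;           # GFDL
    *)  echo "FATAL ERROR: Unsupported imp_physics = ${imp_physics}" >&2; return 1 ;;
  esac
}

PARMgfs=/path/to/parm
pick_field_table 8 "" "_progsigma"  # → /path/to/parm/ufs/fv3/field_table_thompson_noaero_tke_progsigma
```

Note that the Thompson (8) entry intentionally ignores `tbf`, matching the diff, where only the `${tbp}` (progsigma) suffix applies to the no-aero TKE table.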
@@ -231,22 +240,8 @@ export FSICL="0" export FSICS="0" #--------------------------------------------------------------------- - -# ideflate: netcdf zlib lossless compression (0-9): 0 no compression -# nbits: netcdf lossy compression level (0-32): 0 lossless -export ideflate=1 -export nbits=14 -export ishuffle=0 -# compression for RESTART files written by FMS -export shuffle=1 -export deflate_level=1 - -#--------------------------------------------------------------------- -# Disable the use of coupler.res; get model start time from model_configure -export USE_COUPLER_RES="NO" - # Write more variables to output -export DIAG_TABLE="${HOMEgfs}/parm/ufs/fv3/diag_table" +export DIAG_TABLE="${PARMgfs}/ufs/fv3/diag_table" # Write gfs restart files to rerun fcst from any break point export restart_interval=${restart_interval_gfs:-12} diff --git a/parm/config/gefs/config.oceanice_products b/parm/config/gefs/config.oceanice_products new file mode 120000 index 0000000000..f6cf9cd60b --- /dev/null +++ b/parm/config/gefs/config.oceanice_products @@ -0,0 +1 @@ +../gfs/config.oceanice_products \ No newline at end of file diff --git a/parm/config/gefs/config.resources b/parm/config/gefs/config.resources index a50418d23a..0be7e864a1 100644 --- a/parm/config/gefs/config.resources +++ b/parm/config/gefs/config.resources @@ -4,17 +4,26 @@ # Set resource information for job tasks # e.g. walltime, node, cores per node, memory etc. -if [[ $# -ne 1 ]]; then +if (( $# != 1 )); then echo "Must specify an input task argument to set resource variables!" 
echo "argument can be any one of the following:" echo "stage_ic aerosol_init" - echo "sfcanl analcalc analdiag fcst fit2obs metp arch echgres" - echo "ecen esfc efcs epos earc" - echo "init_chem mom6ic ocnpost" + echo "prep prepsnowobs prepatmiodaobs" + echo "atmanlinit atmanlrun atmanlfinal" + echo "atmensanlinit atmensanlrun atmensanlfinal" + echo "snowanl" + echo "aeroanlinit aeroanlrun aeroanlfinal" + echo "anal sfcanl analcalc analdiag fcst echgres" + echo "upp atmos_products" + echo "tracker genesis genesis_fsu" + echo "verfozn verfrad vminmon fit2obs metp arch cleanup" + echo "eobs ediag eomg eupd ecen esfc efcs epos earc" + echo "init_chem mom6ic" echo "waveinit waveprep wavepostsbs wavepostbndpnt wavepostbndpntbll wavepostpnt" echo "wavegempak waveawipsbulls waveawipsgridded" - echo "postsnd awips gempak" + echo "postsnd awips gempak npoess" + echo "ocnanalprep prepoceanobs ocnanalbmat ocnanalrun ocnanalchkpt ocnanalpost ocnanalvrfy" exit 1 fi @@ -23,281 +32,172 @@ step=$1 echo "BEGIN: config.resources" -if [[ "${machine}" = "WCOSS2" ]]; then - export npe_node_max=128 -elif [[ "${machine}" = "JET" ]]; then - if [[ ${PARTITION_BATCH} = "xjet" ]]; then - export npe_node_max=24 - elif [[ ${PARTITION_BATCH} = "vjet" || ${PARTITION_BATCH} = "sjet" ]]; then - export npe_node_max=16 - elif [[ ${PARTITION_BATCH} = "kjet" ]]; then - export npe_node_max=40 - fi -elif [[ ${machine} = "HERA" ]]; then - export npe_node_max=40 -elif [[ ${machine} = "S4" ]]; then - if [[ ${PARTITION_BATCH} = "s4" ]]; then - export npe_node_max=32 - elif [[ ${PARTITION_BATCH} = "ivy" ]]; then - export npe_node_max=20 - fi -elif [[ ${machine} = "ORION" ]]; then - export npe_node_max=40 -elif [[ ${machine} = "HERCULES" ]]; then - export npe_node_max=40 -fi +case ${machine} in + "WCOSS2") npe_node_max=128;; + "HERA") npe_node_max=40;; + "ORION") npe_node_max=40;; + "HERCULES") npe_node_max=80;; + "JET") + case ${PARTITION_BATCH} in + "xjet") npe_node_max=24;; + "vjet" | "sjet") 
npe_node_max=16;; + "kjet") npe_node_max=40;; + *) + echo "FATAL ERROR: Unknown partition ${PARTITION_BATCH} specified for ${machine}" + exit 3 + esac + ;; + "S4") + case ${PARTITION_BATCH} in + "s4") npe_node_max=32;; + "ivy") npe_node_max=20;; + *) + echo "FATAL ERROR: Unknown partition ${PARTITION_BATCH} specified for ${machine}" + exit 3 + esac + ;; + "AWSPW") + export PARTITION_BATCH="compute" + npe_node_max=40 + ;; + *) + echo "FATAL ERROR: Unknown machine encountered by ${BASH_SOURCE[0]}" + exit 2 + ;; +esac +export npe_node_max -if [[ ${step} = "prep" ]]; then - export wtime_prep='00:30:00' - export npe_prep=4 - export npe_node_prep=2 - export nth_prep=1 - if [[ "${machine}" = "WCOSS2" ]]; then - export is_exclusive=True - else - export memory_prep="40G" - fi - -elif [[ "${step}" = "aerosol_init" ]]; then - export wtime_aerosol_init="00:05:00" - export npe_aerosol_init=1 - export nth_aerosol_init=1 - npe_node_aerosol_init=$(echo "${npe_node_max} / ${nth_aerosol_init}" | bc) - export npe_node_aerosol_init - export NTASKS=${npe_aerosol_init} - export memory_aerosol_init="6G" - -elif [[ ${step} = "waveinit" ]]; then +case ${step} in + + "stage_ic") + export wtime_stage_ic="00:15:00" + export npe_stage_ic=1 + export npe_node_stage_ic=1 + export nth_stage_ic=1 + export is_exclusive=True + ;; + "waveinit") export wtime_waveinit="00:10:00" export npe_waveinit=12 export nth_waveinit=1 - npe_node_waveinit=$(echo "${npe_node_max} / ${nth_waveinit}" | bc) - export npe_node_waveinit + export npe_node_waveinit=$(( npe_node_max / nth_waveinit )) export NTASKS=${npe_waveinit} export memory_waveinit="2GB" + ;; -elif [[ ${step} = "waveprep" ]]; then - - export wtime_waveprep="00:10:00" - export npe_waveprep=5 - export npe_waveprep_gfs=65 - export nth_waveprep=1 - export nth_waveprep_gfs=1 - npe_node_waveprep=$(echo "${npe_node_max} / ${nth_waveprep}" | bc) - export npe_node_waveprep - npe_node_waveprep_gfs=$(echo "${npe_node_max} / ${nth_waveprep_gfs}" | bc) - export 
npe_node_waveprep_gfs - export NTASKS=${npe_waveprep} - export NTASKS_gfs=${npe_waveprep_gfs} - export memory_waveprep="100GB" - export memory_waveprep_gfs="150GB" - -elif [[ ${step} = "wavepostsbs" ]]; then - - export wtime_wavepostsbs="00:20:00" - export wtime_wavepostsbs_gfs="03:00:00" - export npe_wavepostsbs=8 - export nth_wavepostsbs=1 - npe_node_wavepostsbs=$(echo "${npe_node_max} / ${nth_wavepostsbs}" | bc) - export npe_node_wavepostsbs - export NTASKS=${npe_wavepostsbs} - export memory_wavepostsbs="10GB" - export memory_wavepostsbs_gfs="10GB" - -elif [[ ${step} = "wavepostbndpnt" ]]; then - - export wtime_wavepostbndpnt="01:00:00" - export npe_wavepostbndpnt=240 - export nth_wavepostbndpnt=1 - npe_node_wavepostbndpnt=$(echo "${npe_node_max} / ${nth_wavepostbndpnt}" | bc) - export npe_node_wavepostbndpnt - export NTASKS=${npe_wavepostbndpnt} + "fcst" | "efcs") export is_exclusive=True -elif [[ ${step} = "wavepostbndpntbll" ]]; then - - export wtime_wavepostbndpntbll="01:00:00" - export npe_wavepostbndpntbll=448 - export nth_wavepostbndpntbll=1 - npe_node_wavepostbndpntbll=$(echo "${npe_node_max} / ${nth_wavepostbndpntbll}" | bc) - export npe_node_wavepostbndpntbll - export NTASKS=${npe_wavepostbndpntbll} - export is_exclusive=True - -elif [[ ${step} = "wavepostpnt" ]]; then - - export wtime_wavepostpnt="01:30:00" - export npe_wavepostpnt=200 - export nth_wavepostpnt=1 - npe_node_wavepostpnt=$(echo "${npe_node_max} / ${nth_wavepostpnt}" | bc) - export npe_node_wavepostpnt - export NTASKS=${npe_wavepostpnt} - export is_exclusive=True - -elif [[ ${step} = "wavegempak" ]]; then - - export wtime_wavegempak="02:00:00" - export npe_wavegempak=1 - export nth_wavegempak=1 - npe_node_wavegempak=$(echo "${npe_node_max} / ${nth_wavegempak}" | bc) - export npe_node_wavegempak - export NTASKS=${npe_wavegempak} - export memory_wavegempak="1GB" - -elif [[ ${step} = "waveawipsbulls" ]]; then - - export wtime_waveawipsbulls="00:20:00" - export npe_waveawipsbulls=1 - export 
nth_waveawipsbulls=1 - npe_node_waveawipsbulls=$(echo "${npe_node_max} / ${nth_waveawipsbulls}" | bc) - export npe_node_waveawipsbulls - export NTASKS=${npe_waveawipsbulls} - export is_exclusive=True - -elif [[ ${step} = "waveawipsgridded" ]]; then - - export wtime_waveawipsgridded="02:00:00" - export npe_waveawipsgridded=1 - export nth_waveawipsgridded=1 - npe_node_waveawipsgridded=$(echo "${npe_node_max} / ${nth_waveawipsgridded}" | bc) - export npe_node_waveawipsgridded - export NTASKS=${npe_waveawipsgridded} - export memory_waveawipsgridded_gfs="1GB" - -elif [[ ${step} = "analcalc" ]]; then - - export wtime_analcalc="00:10:00" - export npe_analcalc=127 - export ntasks="${npe_analcalc}" - export nth_analcalc=1 - export nth_echgres=4 - export nth_echgres_gfs=12 - npe_node_analcalc=$(echo "${npe_node_max} / ${nth_analcalc}" | bc) - export npe_node_analcalc - export is_exclusive=True - -elif [[ ${step} = "analdiag" ]]; then - - export wtime_analdiag="00:15:00" - export npe_analdiag=96 # Should be at least twice npe_ediag - export nth_analdiag=1 - npe_node_analdiag=$(echo "${npe_node_max} / ${nth_analdiag}" | bc) - export npe_node_analdiag - export memory_analdiag="48GB" - -elif [[ ${step} = "sfcanl" ]]; then - - export wtime_sfcanl="00:10:00" - export npe_sfcanl=6 - export nth_sfcanl=1 - npe_node_sfcanl=$(echo "${npe_node_max} / ${nth_sfcanl}" | bc) - export npe_node_sfcanl - export is_exclusive=True - -elif [[ "${step}" = "fcst" || "${step}" = "efcs" ]]; then - - export is_exclusive=True - - if [[ "${step}" = "fcst" ]]; then - _CDUMP_LIST=${CDUMP:-"gdas gfs"} - elif [[ "${step}" = "efcs" ]]; then - _CDUMP_LIST=${CDUMP:-"enkfgdas enkfgfs"} - fi + _CDUMP_LIST=${CDUMP:-"gdas gfs"} # During workflow creation, we need resources for all CDUMPs and CDUMP is undefined for _CDUMP in ${_CDUMP_LIST}; do - if [[ "${_CDUMP}" =~ "gfs" ]]; then - export layout_x=${layout_x_gfs} - export layout_y=${layout_y_gfs} - export WRITE_GROUP=${WRITE_GROUP_GFS} - export 
WRTTASK_PER_GROUP_PER_THREAD=${WRTTASK_PER_GROUP_PER_THREAD_GFS} - ntasks_fv3=${ntasks_fv3_gfs} - ntasks_quilt=${ntasks_quilt_gfs} - nthreads_fv3=${nthreads_fv3_gfs} - fi - - # PETS for the atmosphere dycore - (( FV3PETS = ntasks_fv3 * nthreads_fv3 )) - echo "FV3 using (nthreads, PETS) = (${nthreads_fv3}, ${FV3PETS})" - - # PETS for quilting - if [[ "${QUILTING:-}" = ".true." ]]; then - (( QUILTPETS = ntasks_quilt * nthreads_fv3 )) - (( WRTTASK_PER_GROUP = WRTTASK_PER_GROUP_PER_THREAD )) - export WRTTASK_PER_GROUP - else - QUILTPETS=0 - fi - echo "QUILT using (nthreads, PETS) = (${nthreads_fv3}, ${QUILTPETS})" - - # Total PETS for the atmosphere component - ATMTHREADS=${nthreads_fv3} - (( ATMPETS = FV3PETS + QUILTPETS )) - export ATMPETS ATMTHREADS - echo "FV3ATM using (nthreads, PETS) = (${ATMTHREADS}, ${ATMPETS})" - - # Total PETS for the coupled model (starting w/ the atmosphere) - NTASKS_TOT=${ATMPETS} - - # The mediator PETS can overlap with other components, usually it lands on the atmosphere tasks. - # However, it is suggested limiting mediator PETS to 300, as it may cause the slow performance. - # See https://docs.google.com/document/d/1bKpi-52t5jIfv2tuNHmQkYUe3hkKsiG_DG_s6Mnukog/edit - # TODO: Update reference when moved to ufs-weather-model RTD - MEDTHREADS=${nthreads_mediator:-1} - MEDPETS=${MEDPETS:-ATMPETS} - [[ "${MEDPETS}" -gt 300 ]] && MEDPETS=300 - export MEDPETS MEDTHREADS - echo "MEDIATOR using (threads, PETS) = (${MEDTHREADS}, ${MEDPETS})" - - CHMPETS=0; CHMTHREADS=0 - if [[ "${DO_AERO}" = "YES" ]]; then - # GOCART shares the same grid and forecast tasks as FV3 (do not add write grid component tasks). 
- (( CHMTHREADS = ATMTHREADS )) - (( CHMPETS = FV3PETS )) - # Do not add to NTASKS_TOT - echo "GOCART using (threads, PETS) = (${CHMTHREADS}, ${CHMPETS})" - fi - export CHMPETS CHMTHREADS - - WAVPETS=0; WAVTHREADS=0 - if [[ "${DO_WAVE}" = "YES" ]]; then - (( WAVPETS = ntasks_ww3 * nthreads_ww3 )) - (( WAVTHREADS = nthreads_ww3 )) - echo "WW3 using (threads, PETS) = (${WAVTHREADS}, ${WAVPETS})" - (( NTASKS_TOT = NTASKS_TOT + WAVPETS )) - fi - export WAVPETS WAVTHREADS - - OCNPETS=0; OCNTHREADS=0 - if [[ "${DO_OCN}" = "YES" ]]; then - (( OCNPETS = ntasks_mom6 * nthreads_mom6 )) - (( OCNTHREADS = nthreads_mom6 )) - echo "MOM6 using (threads, PETS) = (${OCNTHREADS}, ${OCNPETS})" - (( NTASKS_TOT = NTASKS_TOT + OCNPETS )) - fi - export OCNPETS OCNTHREADS - - ICEPETS=0; ICETHREADS=0 - if [[ "${DO_ICE}" = "YES" ]]; then - (( ICEPETS = ntasks_cice6 * nthreads_cice6 )) - (( ICETHREADS = nthreads_cice6 )) - echo "CICE6 using (threads, PETS) = (${ICETHREADS}, ${ICEPETS})" - (( NTASKS_TOT = NTASKS_TOT + ICEPETS )) - fi - export ICEPETS ICETHREADS - - echo "Total PETS for ${_CDUMP} = ${NTASKS_TOT}" - - if [[ "${_CDUMP}" =~ "gfs" ]]; then - declare -x "npe_${step}_gfs"="${NTASKS_TOT}" - declare -x "nth_${step}_gfs"=1 # ESMF handles threading for the UFS-weather-model - declare -x "npe_node_${step}_gfs"="${npe_node_max}" - else - declare -x "npe_${step}"="${NTASKS_TOT}" - declare -x "nth_${step}"=1 # ESMF handles threading for the UFS-weather-model - declare -x "npe_node_${step}"="${npe_node_max}" - fi + if [[ "${_CDUMP}" =~ "gfs" ]]; then + export layout_x=${layout_x_gfs} + export layout_y=${layout_y_gfs} + export WRITE_GROUP=${WRITE_GROUP_GFS} + export WRTTASK_PER_GROUP_PER_THREAD=${WRTTASK_PER_GROUP_PER_THREAD_GFS} + ntasks_fv3=${ntasks_fv3_gfs} + ntasks_quilt=${ntasks_quilt_gfs} + nthreads_fv3=${nthreads_fv3_gfs} + nthreads_ufs=${nthreads_ufs_gfs} + fi + + # Determine if using ESMF-managed threading or traditional threading + # If using traditional threading, set them to 1 + 
if [[ "${USE_ESMF_THREADING:-}" == "YES" ]]; then + export UFS_THREADS=1 + else # traditional threading + export UFS_THREADS=${nthreads_ufs:-1} + nthreads_fv3=1 + nthreads_mediator=1 + [[ "${DO_WAVE}" == "YES" ]] && nthreads_ww3=1 + [[ "${DO_OCN}" == "YES" ]] && nthreads_mom6=1 + [[ "${DO_ICE}" == "YES" ]] && nthreads_cice6=1 + fi + + # PETS for the atmosphere dycore + (( FV3PETS = ntasks_fv3 * nthreads_fv3 )) + echo "FV3 using (nthreads, PETS) = (${nthreads_fv3}, ${FV3PETS})" + + # PETS for quilting + if [[ "${QUILTING:-}" == ".true." ]]; then + (( QUILTPETS = ntasks_quilt * nthreads_fv3 )) + (( WRTTASK_PER_GROUP = WRTTASK_PER_GROUP_PER_THREAD )) + export WRTTASK_PER_GROUP + else + QUILTPETS=0 + fi + echo "QUILT using (nthreads, PETS) = (${nthreads_fv3}, ${QUILTPETS})" + + # Total PETS for the atmosphere component + ATMTHREADS=${nthreads_fv3} + (( ATMPETS = FV3PETS + QUILTPETS )) + export ATMPETS ATMTHREADS + echo "FV3ATM using (nthreads, PETS) = (${ATMTHREADS}, ${ATMPETS})" + + # Total PETS for the coupled model (starting w/ the atmosphere) + NTASKS_TOT=${ATMPETS} + + # The mediator PETS can overlap with other components; they usually land on the atmosphere tasks. + # However, it is suggested to limit mediator PETS to 300, as more may cause slow performance. + # See https://docs.google.com/document/d/1bKpi-52t5jIfv2tuNHmQkYUe3hkKsiG_DG_s6Mnukog/edit + # TODO: Update reference when moved to ufs-weather-model RTD + MEDTHREADS=${nthreads_mediator:-1} + MEDPETS=${MEDPETS:-${FV3PETS}} + (( "${MEDPETS}" > 300 )) && MEDPETS=300 + export MEDPETS MEDTHREADS + echo "MEDIATOR using (threads, PETS) = (${MEDTHREADS}, ${MEDPETS})" + + CHMPETS=0; CHMTHREADS=0 + if [[ "${DO_AERO}" == "YES" ]]; then + # GOCART shares the same grid and forecast tasks as FV3 (do not add write grid component tasks). 
+ (( CHMTHREADS = ATMTHREADS )) + (( CHMPETS = FV3PETS )) + # Do not add to NTASKS_TOT + echo "GOCART using (threads, PETS) = (${CHMTHREADS}, ${CHMPETS})" + fi + export CHMPETS CHMTHREADS + + WAVPETS=0; WAVTHREADS=0 + if [[ "${DO_WAVE}" == "YES" ]]; then + (( WAVPETS = ntasks_ww3 * nthreads_ww3 )) + (( WAVTHREADS = nthreads_ww3 )) + echo "WW3 using (threads, PETS) = (${WAVTHREADS}, ${WAVPETS})" + (( NTASKS_TOT = NTASKS_TOT + WAVPETS )) + fi + export WAVPETS WAVTHREADS + + OCNPETS=0; OCNTHREADS=0 + if [[ "${DO_OCN}" == "YES" ]]; then + (( OCNPETS = ntasks_mom6 * nthreads_mom6 )) + (( OCNTHREADS = nthreads_mom6 )) + echo "MOM6 using (threads, PETS) = (${OCNTHREADS}, ${OCNPETS})" + (( NTASKS_TOT = NTASKS_TOT + OCNPETS )) + fi + export OCNPETS OCNTHREADS + + ICEPETS=0; ICETHREADS=0 + if [[ "${DO_ICE}" == "YES" ]]; then + (( ICEPETS = ntasks_cice6 * nthreads_cice6 )) + (( ICETHREADS = nthreads_cice6 )) + echo "CICE6 using (threads, PETS) = (${ICETHREADS}, ${ICEPETS})" + (( NTASKS_TOT = NTASKS_TOT + ICEPETS )) + fi + export ICEPETS ICETHREADS + + echo "Total PETS for ${_CDUMP} = ${NTASKS_TOT}" + + if [[ "${_CDUMP}" =~ "gfs" ]]; then + declare -x "npe_${step}_gfs"="${NTASKS_TOT}" + declare -x "nth_${step}_gfs"="${UFS_THREADS}" + declare -x "npe_node_${step}_gfs"="${npe_node_max}" + else + declare -x "npe_${step}"="${NTASKS_TOT}" + declare -x "nth_${step}"="${UFS_THREADS}" + declare -x "npe_node_${step}"="${npe_node_max}" + fi done @@ -311,167 +211,76 @@ elif [[ "${step}" = "fcst" || "${step}" = "efcs" ]]; then declare -x "wtime_${step}_gfs"="06:00:00" ;; *) - echo "FATAL ERROR: Resolution ${CASE} not supported in ${step}" - exit 1 + echo "FATAL ERROR: Resources not defined for job ${job} at resolution ${CASE}" + exit 4 ;; esac unset _CDUMP _CDUMP_LIST unset NTASKS_TOT - -elif [[ ${step} = "ocnpost" ]]; then - - export wtime_ocnpost="00:30:00" - export npe_ocnpost=1 - export npe_node_ocnpost=1 - export nth_ocnpost=1 - export memory_ocnpost="96G" - if [[ ${machine} == "JET" 
]]; then - # JET only has 88GB of requestable memory per node - # so a second node is required to meet the requiremtn - npe_ocnpost=2 - fi - -elif [[ "${step}" = "fit2obs" ]]; then - - export wtime_fit2obs="00:20:00" - export npe_fit2obs=3 - export nth_fit2obs=1 - export npe_node_fit2obs=1 - export memory_fit2obs="20G" - if [[ ${machine} == "WCOSS2" ]]; then export npe_node_fit2obs=3 ; fi - -elif [[ "${step}" = "metp" ]]; then - - export nth_metp=1 - export wtime_metp="03:00:00" - export npe_metp=4 - export npe_node_metp=4 - export wtime_metp_gfs="06:00:00" - export npe_metp_gfs=4 - export npe_node_metp_gfs=4 - export is_exclusive=True - -elif [[ ${step} = "echgres" ]]; then - - export wtime_echgres="00:10:00" - export npe_echgres=3 - export nth_echgres=${npe_node_max} - export npe_node_echgres=1 - if [[ "${machine}" = "WCOSS2" ]]; then - export memory_echgres="200GB" - fi - -elif [[ ${step} = "init_chem" ]]; then - - export wtime_init_chem="00:30:00" - export npe_init_chem=1 - export npe_node_init_chem=1 - export is_exclusive=True - -elif [[ ${step} = "mom6ic" ]]; then - - export wtime_mom6ic="00:30:00" - export npe_mom6ic=24 - export npe_node_mom6ic=24 - export is_exclusive=True - -elif [[ ${step} = "arch" || ${step} = "earc" ]]; then - - eval "export wtime_${step}='06:00:00'" - eval "export npe_${step}=1" - eval "export npe_node_${step}=1" - eval "export nth_${step}=1" - eval "export memory_${step}=4096M" - if [[ "${machine}" = "WCOSS2" ]]; then - eval "export memory_${step}=50GB" - fi - -elif [[ ${step} = "stage_ic" ]]; then - - export wtime_stage_ic="00:15:00" - export npe_stage_ic=1 - export npe_node_stage_ic=1 - export nth_stage_ic=1 + ;; + + "atmos_products") + export wtime_atmos_products="00:15:00" + export npe_atmos_products=24 + export nth_atmos_products=1 + export npe_node_atmos_products="${npe_atmos_products}" + export wtime_atmos_products_gfs="${wtime_atmos_products}" + export npe_atmos_products_gfs="${npe_atmos_products}" + export 
nth_atmos_products_gfs="${nth_atmos_products}" + export npe_node_atmos_products_gfs="${npe_node_atmos_products}" export is_exclusive=True + ;; + + "oceanice_products") + export wtime_oceanice_products="00:15:00" + export npe_oceanice_products=1 + export npe_node_oceanice_products=1 + export nth_oceanice_products=1 + export memory_oceanice_products="96GB" + ;; + + "wavepostsbs") + export wtime_wavepostsbs="03:00:00" + export npe_wavepostsbs=1 + export nth_wavepostsbs=1 + export npe_node_wavepostsbs=$(( npe_node_max / nth_wavepostsbs )) + export NTASKS=${npe_wavepostsbs} + export memory_wavepostsbs="10GB" + ;; -elif [[ ${step} = "ecen" ]]; then - - export wtime_ecen="00:10:00" - export npe_ecen=80 - export nth_ecen=4 - if [[ "${machine}" = "HERA" ]]; then export nth_ecen=6; fi - if [[ ${CASE} = "C384" || ${CASE} = "C192" || ${CASE} = "C96" || ${CASE} = "C48" ]]; then export nth_ecen=2; fi - npe_node_ecen=$(echo "${npe_node_max} / ${nth_ecen}" | bc) - export npe_node_ecen - export nth_cycle=${nth_ecen} - npe_node_cycle=$(echo "${npe_node_max} / ${nth_cycle}" | bc) - export npe_node_cycle + "wavepostbndpnt") + export wtime_wavepostbndpnt="01:00:00" + export npe_wavepostbndpnt=240 + export nth_wavepostbndpnt=1 + export npe_node_wavepostbndpnt=$(( npe_node_max / nth_wavepostbndpnt )) + export NTASKS=${npe_wavepostbndpnt} export is_exclusive=True + ;; -elif [[ ${step} = "esfc" ]]; then - - export wtime_esfc="00:06:00" - export npe_esfc=80 - export nth_esfc=1 - npe_node_esfc=$(echo "${npe_node_max} / ${nth_esfc}" | bc) - export npe_node_esfc - export nth_cycle=${nth_esfc} - npe_node_cycle=$(echo "${npe_node_max} / ${nth_cycle}" | bc) - export npe_node_cycle - export memory_esfc="80GB" - -elif [[ ${step} = "epos" ]]; then - - export wtime_epos="00:15:00" - export npe_epos=80 - export nth_epos=4 - if [[ "${machine}" == "HERA" ]]; then - export nth_epos=6 - fi - npe_node_epos=$(echo "${npe_node_max} / ${nth_epos}" | bc) - export npe_node_epos + "wavepostbndpntbll") + export 
wtime_wavepostbndpntbll="01:00:00" + export npe_wavepostbndpntbll=448 + export nth_wavepostbndpntbll=1 + export npe_node_wavepostbndpntbll=$(( npe_node_max / nth_wavepostbndpntbll )) + export NTASKS=${npe_wavepostbndpntbll} export is_exclusive=True + ;; -elif [[ ${step} = "postsnd" ]]; then - - export wtime_postsnd="02:00:00" - export npe_postsnd=40 - export nth_postsnd=8 - export npe_node_postsnd=10 - export npe_postsndcfp=9 - export npe_node_postsndcfp=1 - postsnd_req_cores=$(echo "${npe_node_postsnd} * ${nth_postsnd}" | bc) - if [[ ${postsnd_req_cores} -gt "${npe_node_max}" ]]; then - npe_node_postsnd=$(echo "${npe_node_max} / ${nth_postsnd}" | bc) - export npe_node_postsnd - fi + "wavepostpnt") + export wtime_wavepostpnt="04:00:00" + export npe_wavepostpnt=200 + export nth_wavepostpnt=1 + export npe_node_wavepostpnt=$(( npe_node_max / nth_wavepostpnt )) + export NTASKS=${npe_wavepostpnt} export is_exclusive=True + ;; -elif [[ ${step} = "awips" ]]; then - - export wtime_awips="03:30:00" - export npe_awips=1 - export npe_node_awips=1 - export nth_awips=1 - export memory_awips="3GB" - -elif [[ ${step} = "gempak" ]]; then - - export wtime_gempak="03:00:00" - export npe_gempak=2 - export npe_gempak_gfs=28 - export npe_node_gempak=2 - export npe_node_gempak_gfs=28 - export nth_gempak=1 - export memory_gempak="4GB" - export memory_gempak_gfs="2GB" - -else - - echo "Invalid step = ${step}, ABORT!" 
- exit 2 + *) + echo "FATAL ERROR: Invalid job ${step} passed to ${BASH_SOURCE[0]}" + exit 1 + ;; -fi +esac echo "END: config.resources" diff --git a/parm/config/gefs/config.stage_ic b/parm/config/gefs/config.stage_ic index e2bb0af2b8..b332ee1826 100644 --- a/parm/config/gefs/config.stage_ic +++ b/parm/config/gefs/config.stage_ic @@ -8,6 +8,12 @@ echo "BEGIN: config.stage_ic" source "${EXPDIR}/config.resources" stage_ic case "${CASE}" in + "C384") + export CPL_ATMIC="" + export CPL_ICEIC="" + export CPL_OCNIC="" + export CPL_WAVIC="" + ;; "C48") export CPL_ATMIC="gefs_test" export CPL_ICEIC="gefs_test" diff --git a/parm/config/gefs/config.ufs b/parm/config/gefs/config.ufs index 68b364529e..b8695b6dbb 100644 --- a/parm/config/gefs/config.ufs +++ b/parm/config/gefs/config.ufs @@ -68,51 +68,6 @@ if [[ "${skip_mom6}" == "false" ]] || [[ "${skip_cice6}" == "false" ]] || [[ "${ skip_mediator=false fi -case "${machine}" in - "WCOSS2") - npe_node_max=128 - ;; - "HERA" | "ORION" | "HERCULES" ) - npe_node_max=40 - ;; - "JET") - case "${PARTITION_BATCH}" in - "xjet") - npe_node_max=24 - ;; - "vjet" | "sjet") - npe_node_max=16 - ;; - "kjet") - npe_node_max=40 - ;; - *) - echo "FATAL ERROR: Unsupported ${machine} PARTITION_BATCH = ${PARTITION_BATCH}, ABORT!" - exit 1 - ;; - esac - ;; - "S4") - case "${PARTITION_BATCH}" in - "s4") - npe_node_max=32 - ;; - "ivy") - npe_node_max=20 - ;; - *) - echo "FATAL ERROR: Unsupported ${machine} PARTITION_BATCH = ${PARTITION_BATCH}, ABORT!" 
- exit 1 - ;; - esac - ;; - *) - echo "FATAL ERROR: Unrecognized machine ${machine}" - exit 14 - ;; -esac -export npe_node_max - # (Standard) Model resolution dependent variables case "${fv3_res}" in "C48") @@ -123,6 +78,8 @@ case "${fv3_res}" in export layout_y_gfs=1 export nthreads_fv3=1 export nthreads_fv3_gfs=1 + export nthreads_ufs=1 + export nthreads_ufs_gfs=1 export cdmbgwd="0.071,2.1,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export cdmbgwd_gsl="40.0,1.77,1.0,1.0" # settings for GSL drag suite export knob_ugwp_tauamp=6.0e-3 # setting for UGWPv1 non-stationary GWD @@ -139,6 +96,8 @@ case "${fv3_res}" in export layout_y_gfs=2 export nthreads_fv3=1 export nthreads_fv3_gfs=1 + export nthreads_ufs=1 + export nthreads_ufs_gfs=1 export cdmbgwd="0.14,1.8,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export cdmbgwd_gsl="20.0,2.5,1.0,1.0" # settings for GSL drag suite export knob_ugwp_tauamp=3.0e-3 # setting for UGWPv1 non-stationary GWD @@ -155,6 +114,8 @@ case "${fv3_res}" in export layout_y_gfs=6 export nthreads_fv3=1 export nthreads_fv3_gfs=2 + export nthreads_ufs=1 + export nthreads_ufs_gfs=2 export cdmbgwd="0.23,1.5,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export cdmbgwd_gsl="10.0,3.5,1.0,1.0" # settings for GSL drag suite export knob_ugwp_tauamp=1.5e-3 # setting for UGWPv1 non-stationary GWD @@ -171,6 +132,8 @@ case "${fv3_res}" in export layout_y_gfs=8 export nthreads_fv3=1 export nthreads_fv3_gfs=2 + export nthreads_ufs=1 + export nthreads_ufs_gfs=2 export cdmbgwd="1.1,0.72,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export cdmbgwd_gsl="5.0,5.0,1.0,1.0" # settings for GSL drag suite export knob_ugwp_tauamp=0.8e-3 # setting for UGWPv1 non-stationary GWD @@ -187,13 +150,15 @@ case "${fv3_res}" in export layout_y_gfs=16 export nthreads_fv3=4 export nthreads_fv3_gfs=4 + export nthreads_ufs=4 + export nthreads_ufs_gfs=4 export cdmbgwd="4.0,0.15,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling 
export cdmbgwd_gsl="2.5,7.5,1.0,1.0" # settings for GSL drag suite export knob_ugwp_tauamp=0.5e-3 # setting for UGWPv1 non-stationary GWD export WRITE_GROUP=2 export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE=10 export WRITE_GROUP_GFS=4 - export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE_GFS=10 + export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE_GFS=20 #Note this should be 10 for WCOSS2 ;; "C1152") export DELTIM=120 @@ -203,13 +168,15 @@ case "${fv3_res}" in export layout_y_gfs=16 export nthreads_fv3=4 export nthreads_fv3_gfs=4 + export nthreads_ufs=4 + export nthreads_ufs_gfs=4 export cdmbgwd="4.0,0.10,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export cdmbgwd_gsl="1.67,8.8,1.0,1.0" # settings for GSL drag suite export knob_ugwp_tauamp=0.35e-3 # setting for UGWPv1 non-stationary GWD export WRITE_GROUP=4 export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE=10 # TODO: refine these numbers when a case is available export WRITE_GROUP_GFS=4 - export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE_GFS=10 # TODO: refine these numbers when a case is available + export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE_GFS=20 # TODO: refine these numbers when a case is available ;; "C3072") export DELTIM=90 @@ -219,6 +186,8 @@ case "${fv3_res}" in export layout_y_gfs=32 export nthreads_fv3=4 export nthreads_fv3_gfs=4 + export nthreads_ufs=4 + export nthreads_ufs_gfs=4 export cdmbgwd="4.0,0.05,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export cdmbgwd_gsl="0.625,14.1,1.0,1.0" # settings for GSL drag suite export knob_ugwp_tauamp=0.13e-3 # setting for UGWPv1 non-stationary GWD @@ -258,6 +227,10 @@ case ${fv3_res} in OUTPUT_FILETYPE_ATM="netcdf_parallel" OUTPUT_FILETYPE_SFC="netcdf_parallel" ;; + *) + echo "FATAL ERROR: Unrecognized FV3 resolution ${fv3_res}" + exit 15 + ;; esac export OUTPUT_FILETYPE_ATM OUTPUT_FILETYPE_SFC @@ -309,11 +282,12 @@ if [[ "${skip_mom6}" == "false" ]]; then NY_GLB=320 DT_DYNAM_MOM6='1800' DT_THERM_MOM6='3600' - FRUNOFF="" + FRUNOFF="runoff.daitren.clim.1deg.nc" 
             CHLCLIM="seawifs_1998-2006_smoothed_2X.nc"
-            MOM6_RESTART_SETTING='n'
+            MOM6_RESTART_SETTING='r'
             MOM6_RIVER_RUNOFF='False'
             eps_imesh="2.5e-1"
+            TOPOEDITS="ufs.topo_edits_011818.nc"
             if [[ "${DO_JEDIOCNVAR:-NO}" = "YES" ]]; then
                 MOM6_DIAG_COORD_DEF_Z_FILE="oceanda_zgrid_75L.nc"
                 MOM6_DIAG_MISVAL="0.0"
@@ -321,6 +295,7 @@ if [[ "${skip_mom6}" == "false" ]]; then
                 MOM6_DIAG_COORD_DEF_Z_FILE="interpolate_zgrid_40L.nc"
                 MOM6_DIAG_MISVAL="-1e34"
             fi
+            MOM6_ALLOW_LANDMASK_CHANGES='True'
             ;;
         "050")
             ntasks_mom6=60
@@ -334,7 +309,6 @@ if [[ "${skip_mom6}" == "false" ]]; then
             MOM6_RESTART_SETTING='n'
             MOM6_RIVER_RUNOFF='True'
             eps_imesh="1.0e-1"
-            TOPOEDITS="ufs.topo_edits_011818.nc"
             if [[ "${DO_JEDIOCNVAR:-NO}" = "YES" ]]; then
                 MOM6_DIAG_COORD_DEF_Z_FILE="oceanda_zgrid_75L.nc"
                 MOM6_DIAG_MISVAL="0.0"
@@ -342,7 +316,8 @@ if [[ "${skip_mom6}" == "false" ]]; then
                 MOM6_DIAG_COORD_DEF_Z_FILE="interpolate_zgrid_40L.nc"
                 MOM6_DIAG_MISVAL="-1e34"
             fi
-            MOM6_ALLOW_LANDMASK_CHANGES='True'
+            MOM6_ALLOW_LANDMASK_CHANGES='False'
+            TOPOEDITS=""
             ;;
         "025")
             ntasks_mom6=220
@@ -356,7 +331,6 @@ if [[ "${skip_mom6}" == "false" ]]; then
             MOM6_RIVER_RUNOFF='True'
             MOM6_RESTART_SETTING="r"
             eps_imesh="1.0e-1"
-            TOPOEDITS=""
             if [[ "${DO_JEDIOCNVAR:-NO}" = "YES" ]]; then
                 MOM6_DIAG_COORD_DEF_Z_FILE="oceanda_zgrid_75L.nc"
                 MOM6_DIAG_MISVAL="0.0"
@@ -364,7 +338,8 @@ if [[ "${skip_mom6}" == "false" ]]; then
                 MOM6_DIAG_COORD_DEF_Z_FILE="interpolate_zgrid_40L.nc"
                 MOM6_DIAG_MISVAL="-1e34"
             fi
-            MOM6_ALLOW_LANDMASK_CHANGES='True'
+            MOM6_ALLOW_LANDMASK_CHANGES='False'
+            TOPOEDITS=""
             ;;
         *)
             echo "FATAL ERROR: Unsupported MOM6 resolution = ${mom6_res}, ABORT!"
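Both new `*)` branches above (for `${fv3_res}` and `${mom6_res}`) turn an unrecognized resolution into a hard failure with a distinct exit code instead of silently falling through. A minimal sketch of that pattern, with an illustrative function and resolution names that are not part of the workflow:

```shell
#!/usr/bin/env bash
# Sketch: validate a resolution string with a case catch-all default.
# Known values are accepted; anything else is a fatal error with a
# distinct exit code, mirroring the `exit 15` style added in config.ufs.
check_res() {
    local res="$1"
    case "${res}" in
        "C48" | "C96")
            echo "resolution ${res} OK"
            ;;
        *)
            echo "FATAL ERROR: Unrecognized FV3 resolution ${res}" >&2
            return 15
            ;;
    esac
}

check_res "C96"                                   # prints "resolution C96 OK"
check_res "C999" 2>/dev/null || echo "caught exit code $?"   # prints "caught exit code 15"
```

Because each valid branch ends in `;;`, the catch-all only fires when nothing else matched, so adding a new supported resolution is a one-line change.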
@@ -378,10 +353,10 @@ if [[ "${skip_mom6}" == "false" ]]; then
     export DT_DYNAM_MOM6 DT_THERM_MOM6
     export FRUNOFF
     export CHLCLIM
+    export TOPOEDITS
     export MOM6_RIVER_RUNOFF
     export MOM6_RESTART_SETTING
     export eps_imesh
-    export TOPOEDITS
     export MOM6_DIAG_COORD_DEF_Z_FILE
     export MOM6_DIAG_MISVAL
     export MOM6_ALLOW_LANDMASK_CHANGES
@@ -397,6 +372,7 @@ if [[ "${skip_cice6}" == "false" ]]; then
         echo "FATAL ERROR: CICE6 cannot be configured without MOM6, ABORT!"
         exit 1
     fi
+    nthreads_cice6=${nthreads_mom6}  # CICE6 needs to run on same threads as MOM6
     case "${cice6_res}" in
         "500")
@@ -470,39 +446,45 @@ if [[ "${skip_gocart}" == "false" ]]; then
 fi
 
 # Set the name of the UFS (previously nems) configure template to use
+# Default ufs.configure templates for supported model configurations
+if [[ "${USE_ESMF_THREADING:-}" == "YES" ]]; then
+    tmpl_suffix="_esmf"
+fi
 case "${model_list}" in
     atm)
-        export ufs_configure_template="${HOMEgfs}/parm/ufs/ufs.configure.atm.IN"
+        default_template="${PARMgfs}/ufs/ufs.configure.atm${tmpl_suffix:-}.IN"
         ;;
     atm.aero)
-        export ufs_configure_template="${HOMEgfs}/parm/ufs/ufs.configure.atm_aero.IN"
+        default_template="${PARMgfs}/ufs/ufs.configure.atmaero${tmpl_suffix:-}.IN"
         ;;
     atm.wave)
-        export ufs_configure_template="${HOMEgfs}/parm/ufs/ufs.configure.leapfrog_atm_wav.IN"
+        default_template="${PARMgfs}/ufs/ufs.configure.leapfrog_atm_wav${tmpl_suffix:-}.IN"
         ;;
     atm.ocean.ice)
-        export ufs_configure_template="${HOMEgfs}/parm/ufs/ufs.configure.cpld.IN"
+        default_template="${PARMgfs}/ufs/ufs.configure.s2s${tmpl_suffix:-}.IN"
         ;;
     atm.ocean.ice.aero)
-        export ufs_configure_template="${HOMEgfs}/parm/ufs/ufs.configure.cpld_aero.IN"
+        default_template="${PARMgfs}/ufs/ufs.configure.s2sa${tmpl_suffix:-}.IN"
        ;;
     atm.ocean.ice.wave)
-        export ufs_configure_template="${HOMEgfs}/parm/ufs/ufs.configure.cpld_outerwave.IN"
+        default_template="${PARMgfs}/ufs/ufs.configure.s2sw${tmpl_suffix:-}.IN"
        ;;
     atm.ocean.ice.wave.aero)
-        export ufs_configure_template="${HOMEgfs}/parm/ufs/ufs.configure.cpld_aero_outerwave.IN"
+        default_template="${PARMgfs}/ufs/ufs.configure.s2swa${tmpl_suffix:-}.IN"
        ;;
     *)
-        echo "FATAL ERROR: Unable to determine appropriate UFS configure template for ${model_list}"
+        echo "FATAL ERROR: Unsupported UFSWM configuration for ${model_list}"
         exit 16
         ;;
 esac
 
+# Allow user to override the default template
+export ufs_configure_template=${ufs_configure_template:-${default_template:-"/dev/null"}}
+unset model_list default_template
+
 if [[ ! -r "${ufs_configure_template}" ]]; then
     echo "FATAL ERROR: ${ufs_configure_template} either doesn't exist or is not readable."
     exit 17
 fi
 
-unset model_list
-
 echo "END: config.ufs"
diff --git a/parm/config/gefs/config.wave b/parm/config/gefs/config.wave
index e04331e533..7298b12aec 100644
--- a/parm/config/gefs/config.wave
+++ b/parm/config/gefs/config.wave
@@ -6,15 +6,6 @@ echo "BEGIN: config.wave"
 
 # Parameters that are common to all wave model steps
-
-# System and version
-export wave_sys_ver=v1.0.0
-
-export EXECwave="${HOMEgfs}/exec"
-export FIXwave="${HOMEgfs}/fix/wave"
-export PARMwave="${HOMEgfs}/parm/wave"
-export USHwave="${HOMEgfs}/ush"
-
 # This config contains variables/parameters used in the fcst step
 # Some others are also used across the workflow in wave component scripts
diff --git a/parm/config/gefs/config.wavepostbndpnt b/parm/config/gefs/config.wavepostbndpnt
new file mode 100644
index 0000000000..412c5fb42a
--- /dev/null
+++ b/parm/config/gefs/config.wavepostbndpnt
@@ -0,0 +1,11 @@
+#! /usr/bin/env bash
+
+########## config.wavepostbndpnt ##########
+# Wave steps specific
+
+echo "BEGIN: config.wavepostbndpnt"
+
+# Get task specific resources
+source "${EXPDIR}/config.resources" wavepostbndpnt
+
+echo "END: config.wavepostbndpnt"
diff --git a/parm/config/gefs/config.wavepostbndpntbll b/parm/config/gefs/config.wavepostbndpntbll
new file mode 100644
index 0000000000..6695ab0f84
--- /dev/null
+++ b/parm/config/gefs/config.wavepostbndpntbll
@@ -0,0 +1,11 @@
+#! /usr/bin/env bash
+
+########## config.wavepostbndpntbll ##########
+# Wave steps specific
+
+echo "BEGIN: config.wavepostbndpntbll"
+
+# Get task specific resources
+source "${EXPDIR}/config.resources" wavepostbndpntbll
+
+echo "END: config.wavepostbndpntbll"
diff --git a/parm/config/gefs/config.wavepostpnt b/parm/config/gefs/config.wavepostpnt
new file mode 100644
index 0000000000..e87237da82
--- /dev/null
+++ b/parm/config/gefs/config.wavepostpnt
@@ -0,0 +1,11 @@
+#! /usr/bin/env bash
+
+########## config.wavepostpnt ##########
+# Wave steps specific
+
+echo "BEGIN: config.wavepostpnt"
+
+# Get task specific resources
+source "${EXPDIR}/config.resources" wavepostpnt
+
+echo "END: config.wavepostpnt"
diff --git a/parm/config/gefs/config.wavepostsbs b/parm/config/gefs/config.wavepostsbs
new file mode 100644
index 0000000000..b3c5902e3c
--- /dev/null
+++ b/parm/config/gefs/config.wavepostsbs
@@ -0,0 +1,28 @@
+#! /usr/bin/env bash
+
+########## config.wavepostsbs ##########
+# Wave steps specific
+
+echo "BEGIN: config.wavepostsbs"
+
+# Get task specific resources
+source "${EXPDIR}/config.resources" wavepostsbs
+
+# Subgrid info for grib2 encoding
+export WAV_SUBGRBSRC=""
+export WAV_SUBGRB=""
+
+# Options for point output (switch on/off boundary point output)
+export DOIBP_WAV='NO'  # Input boundary points
+export DOFLD_WAV='YES' # Field data
+export DOPNT_WAV='YES' # Station data
+export DOGRB_WAV='YES' # Create grib2 files
+if [[ -n "${waveinterpGRD}" ]]; then
+  export DOGRI_WAV='YES' # Create interpolated grids
+else
+  export DOGRI_WAV='NO'  # Do not create interpolated grids
+fi
+export DOSPC_WAV='YES' # Spectral post
+export DOBLL_WAV='YES' # Bulletin post
+
+echo "END: config.wavepostsbs"
diff --git a/parm/config/gefs/yaml/defaults.yaml b/parm/config/gefs/yaml/defaults.yaml
index ce5d8aeb3d..d252e0d1b2 100644
--- a/parm/config/gefs/yaml/defaults.yaml
+++ b/parm/config/gefs/yaml/defaults.yaml
@@ -2,5 +2,9 @@ base:
   DO_JEDIATMVAR: "NO"
   DO_JEDIATMENS: "NO"
   DO_JEDIOCNVAR: "NO"
-  DO_JEDILANDDA: "NO"
+  DO_JEDISNOWDA: "NO"
   DO_MERGENSST: "NO"
+  KEEPDATA: "NO"
+  FHMAX_GFS: 120
+  USE_OCN_PERTURB_FILES: "false"
+
diff --git a/parm/config/gfs/config.aero b/parm/config/gfs/config.aero
index 32993554b4..c152fafd12 100644
--- a/parm/config/gfs/config.aero
+++ b/parm/config/gfs/config.aero
@@ -30,12 +30,12 @@ case ${machine} in
 esac
 export AERO_INPUTS_DIR
 
-export AERO_DIAG_TABLE="${HOMEgfs}/parm/ufs/fv3/diag_table.aero"
-export AERO_FIELD_TABLE="${HOMEgfs}/parm/ufs/fv3/field_table.aero"
+export AERO_DIAG_TABLE="${PARMgfs}/ufs/fv3/diag_table.aero"
+export AERO_FIELD_TABLE="${PARMgfs}/ufs/fv3/field_table.aero"
 # Biomass burning emission dataset. Choose from: gbbepx, qfed, none
 export AERO_EMIS_FIRE="qfed"
 # Directory containing GOCART configuration files
-export AERO_CONFIG_DIR="${HOMEgfs}/parm/ufs/gocart"
+export AERO_CONFIG_DIR="${PARMgfs}/ufs/gocart"
 
 # Aerosol convective scavenging factors (list of string array elements)
 # Element syntax: ':'. Use = * to set default factor for all aerosol tracers
diff --git a/parm/config/gfs/config.aeroanl b/parm/config/gfs/config.aeroanl
index 32ba43b7ba..972f393feb 100644
--- a/parm/config/gfs/config.aeroanl
+++ b/parm/config/gfs/config.aeroanl
@@ -6,25 +6,26 @@ echo "BEGIN: config.aeroanl"
 
 export CASE_ANL=${CASE}
-export OBS_YAML_DIR=${HOMEgfs}/sorc/gdas.cd/parm/aero/obs/config/
-export OBS_LIST=${HOMEgfs}/sorc/gdas.cd/parm/aero/obs/lists/gdas_aero_prototype.yaml
+export OBS_LIST="${PARMgfs}/gdas/aero/obs/lists/gdas_aero.yaml.j2"
 export STATICB_TYPE='identity'
-export BERROR_YAML=${HOMEgfs}/sorc/gdas.cd/parm/aero/berror/staticb_${STATICB_TYPE}.yaml
-export FIXgdas=${HOMEgfs}/fix/gdas
-export BERROR_DATA_DIR=${FIXgdas}/bump/aero/${CASE_ANL}/
+export BERROR_YAML="${PARMgfs}/gdas/aero/berror/staticb_${STATICB_TYPE}.yaml.j2"
+export BERROR_DATA_DIR="${FIXgfs}/gdas/bump/aero/${CASE_ANL}/"
 export BERROR_DATE="20160630.000000"
 
+export CRTM_FIX_YAML="${PARMgfs}/gdas/aero_crtm_coeff.yaml.j2"
+export JEDI_FIX_YAML="${PARMgfs}/gdas/aero_jedi_fix.yaml.j2"
+
 export io_layout_x=@IO_LAYOUT_X@
 export io_layout_y=@IO_LAYOUT_Y@
 
-export JEDIEXE=${HOMEgfs}/exec/fv3jedi_var.x
+export JEDIEXE="${EXECgfs}/fv3jedi_var.x"
 
 if [[ "${DOIAU}" == "YES" ]]; then
     export aero_bkg_times="3,6,9"
-    export AEROVARYAML=${HOMEgfs}/sorc/gdas.cd/parm/aero/variational/3dvar_fgat_gfs_aero.yaml
+    export JEDIYAML="${PARMgfs}/gdas/aero/variational/3dvar_fgat_gfs_aero.yaml.j2"
 else
     export aero_bkg_times="6"
-    export AEROVARYAML=${HOMEgfs}/sorc/gdas.cd/parm/aero/variational/3dvar_gfs_aero.yaml
+    export JEDIYAML="${PARMgfs}/gdas/aero/variational/3dvar_gfs_aero.yaml.j2"
 fi
 
 echo "END: config.aeroanl"
diff --git a/parm/config/gfs/config.aeroanlfinal b/parm/config/gfs/config.aeroanlfinal
index 230ec5205a..34e5d8f116 100644
--- a/parm/config/gfs/config.aeroanlfinal
+++ b/parm/config/gfs/config.aeroanlfinal
@@ -6,5 +6,5 @@ echo "BEGIN: config.aeroanlfinal"
 
 # Get task specific resources
-. $EXPDIR/config.resources aeroanlfinal
+source "${EXPDIR}/config.resources" aeroanlfinal
 
 echo "END: config.aeroanlfinal"
diff --git a/parm/config/gfs/config.aeroanlinit b/parm/config/gfs/config.aeroanlinit
index 72175b8d0c..7036d3d27b 100644
--- a/parm/config/gfs/config.aeroanlinit
+++ b/parm/config/gfs/config.aeroanlinit
@@ -6,5 +6,5 @@ echo "BEGIN: config.aeroanlinit"
 
 # Get task specific resources
-. $EXPDIR/config.resources aeroanlinit
+source "${EXPDIR}/config.resources" aeroanlinit
 
 echo "END: config.aeroanlinit"
diff --git a/parm/config/gfs/config.aeroanlrun b/parm/config/gfs/config.aeroanlrun
index da13df2831..012e5b79f3 100644
--- a/parm/config/gfs/config.aeroanlrun
+++ b/parm/config/gfs/config.aeroanlrun
@@ -6,6 +6,6 @@ echo "BEGIN: config.aeroanlrun"
 
 # Get task specific resources
-. $EXPDIR/config.resources aeroanlrun
+source "${EXPDIR}/config.resources" aeroanlrun
 
 echo "END: config.aeroanlrun"
diff --git a/parm/config/gfs/config.anal b/parm/config/gfs/config.anal
index e3a17f9c6a..09aaa15a98 100644
--- a/parm/config/gfs/config.anal
+++ b/parm/config/gfs/config.anal
@@ -45,51 +45,58 @@ export AMSR2BF=${AMSR2BF:-/dev/null}
 
 # Set default values for info files and observation error
 # NOTE: Remember to set PRVT in config.prep as OBERROR is set below
-export CONVINFO=${FIXgsi}/global_convinfo.txt
-export OZINFO=${FIXgsi}/global_ozinfo.txt
-export SATINFO=${FIXgsi}/global_satinfo.txt
-export OBERROR=${FIXgsi}/prepobs_errtable.global
-
+export CONVINFO=${FIXgfs}/gsi/global_convinfo.txt
+export OZINFO=${FIXgfs}/gsi/global_ozinfo.txt
+export SATINFO=${FIXgfs}/gsi/global_satinfo.txt
+export OBERROR=${FIXgfs}/gsi/prepobs_errtable.global
+
+if [[ ${GSI_SOILANAL} = "YES" ]]; then
+    export hofx_2m_sfcfile=".true."
+    export reducedgrid=".false." # not possible for sfc analysis, Jeff Whitaker says it's not useful anyway
+    export paranc=".false." # temporary until sfc io coded for parance (PR being prepared by T. Gichamo)
+    export CONVINFO=${FIXgfs}/gsi/global_convinfo_2mObs.txt
+    export ANAVINFO=${FIXgfs}/gsi/global_anavinfo_soilanal.l127.txt
+fi
 
 # Use experimental dumps in EMC GFS v16 parallels
 if [[ ${RUN_ENVIR} == "emc" ]]; then
     # Set info files and prepobs.errtable.global for GFS v16 retrospective parallels
     if [[ "${PDY}${cyc}" -ge "2019021900" && "${PDY}${cyc}" -lt "2019110706" ]]; then
-        export CONVINFO=${FIXgsi}/gfsv16_historical/global_convinfo.txt.2019021900
-        export OBERROR=${FIXgsi}/gfsv16_historical/prepobs_errtable.global.2019021900
+        export CONVINFO=${FIXgfs}/gsi/gfsv16_historical/global_convinfo.txt.2019021900
+        export OBERROR=${FIXgfs}/gsi/gfsv16_historical/prepobs_errtable.global.2019021900
     fi
 
     # Place GOES-15 AMVs in monitor, assimilate GOES-17 AMVs, assimilate KOMPSAT-5 gps
     if [[ "${PDY}${cyc}" -ge "2019110706" && "${PDY}${cyc}" -lt "2020040718" ]]; then
-        export CONVINFO=${FIXgsi}/gfsv16_historical/global_convinfo.txt.2019110706
-        export OBERROR=${FIXgsi}/gfsv16_historical/prepobs_errtable.global.2019110706
+        export CONVINFO=${FIXgfs}/gsi/gfsv16_historical/global_convinfo.txt.2019110706
+        export OBERROR=${FIXgfs}/gsi/gfsv16_historical/prepobs_errtable.global.2019110706
     fi
 
     # Assimilate 135 (T) & 235 (uv) Canadian AMDAR observations
     if [[ "${PDY}${cyc}" -ge "2020040718" && "${PDY}${cyc}" -lt "2020052612" ]]; then
-        export CONVINFO=${FIXgsi}/gfsv16_historical/global_convinfo.txt.2020040718
-        export OBERROR=${FIXgsi}/gfsv16_historical/prepobs_errtable.global.2020040718
+        export CONVINFO=${FIXgfs}/gsi/gfsv16_historical/global_convinfo.txt.2020040718
+        export OBERROR=${FIXgfs}/gsi/gfsv16_historical/prepobs_errtable.global.2020040718
     fi
 
     # Assimilate COSMIC-2
     if [[ "${PDY}${cyc}" -ge "2020052612" && "${PDY}${cyc}" -lt "2020082412" ]]; then
-        export CONVINFO=${FIXgsi}/gfsv16_historical/global_convinfo.txt.2020052612
-        export OBERROR=${FIXgsi}/gfsv16_historical/prepobs_errtable.global.2020040718
+        export CONVINFO=${FIXgfs}/gsi/gfsv16_historical/global_convinfo.txt.2020052612
+        export OBERROR=${FIXgfs}/gsi/gfsv16_historical/prepobs_errtable.global.2020040718
     fi
 
     # Assimilate HDOB
     if [[ "${PDY}${cyc}" -ge "2020082412" && "${PDY}${cyc}" -lt "2020091612" ]]; then
-        export CONVINFO=${FIXgsi}/gfsv16_historical/global_convinfo.txt.2020082412
+        export CONVINFO=${FIXgfs}/gsi/gfsv16_historical/global_convinfo.txt.2020082412
     fi
 
     # Assimilate Metop-C GNSSRO
     if [[ "${PDY}${cyc}" -ge "2020091612" && "${PDY}${cyc}" -lt "2021031712" ]]; then
-        export CONVINFO=${FIXgsi}/gfsv16_historical/global_convinfo.txt.2020091612
+        export CONVINFO=${FIXgfs}/gsi/gfsv16_historical/global_convinfo.txt.2020091612
     fi
 
     # Assimilate DO-2 GeoOptics
     if [[ "${PDY}${cyc}" -ge "2021031712" && "${PDY}${cyc}" -lt "2021091612" ]]; then
-        export CONVINFO=${FIXgsi}/gfsv16_historical/global_convinfo.txt.2021031712
+        export CONVINFO=${FIXgfs}/gsi/gfsv16_historical/global_convinfo.txt.2021031712
     fi
 
     # NOTE:
@@ -98,38 +105,38 @@ if [[ ${RUN_ENVIR} == "emc" ]]; then
     # needed at this time.
     # Assimilate COSMIC-2 GPS
     # if [[ "${PDY}${cyc}" -ge "2021110312" && "${PDY}${cyc}" -lt "YYYYMMDDHH" ]]; then
-    #     export CONVINFO=$FIXgsi/gfsv16_historical/global_convinfo.txt.2021110312
+    #     export CONVINFO=${FIXgfs}/gsi/gfsv16_historical/global_convinfo.txt.2021110312
     # fi
 
     # Turn off assmilation of OMPS during period of bad data
     if [[ "${PDY}${cyc}" -ge "2020011600" && "${PDY}${cyc}" -lt "2020011806" ]]; then
-        export OZINFO=${FIXgsi}/gfsv16_historical/global_ozinfo.txt.2020011600
+        export OZINFO=${FIXgfs}/gsi/gfsv16_historical/global_ozinfo.txt.2020011600
     fi
 
     # Set satinfo for start of GFS v16 parallels
     if [[ "${PDY}${cyc}" -ge "2019021900" && "${PDY}${cyc}" -lt "2019110706" ]]; then
-        export SATINFO=${FIXgsi}/gfsv16_historical/global_satinfo.txt.2019021900
+        export SATINFO=${FIXgfs}/gsi/gfsv16_historical/global_satinfo.txt.2019021900
     fi
 
     # Turn on assimilation of Metop-C AMSUA and MHS
     if [[ "${PDY}${cyc}" -ge "2019110706" && "${PDY}${cyc}" -lt "2020022012" ]]; then
-        export SATINFO=${FIXgsi}/gfsv16_historical/global_satinfo.txt.2019110706
+        export SATINFO=${FIXgfs}/gsi/gfsv16_historical/global_satinfo.txt.2019110706
     fi
 
     # Turn off assimilation of Metop-A MHS
     if [[ "${PDY}${cyc}" -ge "2020022012" && "${PDY}${cyc}" -lt "2021052118" ]]; then
-        export SATINFO=${FIXgsi}/gfsv16_historical/global_satinfo.txt.2020022012
+        export SATINFO=${FIXgfs}/gsi/gfsv16_historical/global_satinfo.txt.2020022012
     fi
 
     # Turn off assimilation of S-NPP CrIS
     if [[ "${PDY}${cyc}" -ge "2021052118" && "${PDY}${cyc}" -lt "2021092206" ]]; then
-        export SATINFO=${FIXgsi}/gfsv16_historical/global_satinfo.txt.2021052118
+        export SATINFO=${FIXgfs}/gsi/gfsv16_historical/global_satinfo.txt.2021052118
     fi
 
     # Turn off assimilation of MetOp-A IASI
     if [[ "${PDY}${cyc}" -ge "2021092206" && "${PDY}${cyc}" -lt "2021102612" ]]; then
-        export SATINFO=${FIXgsi}/gfsv16_historical/global_satinfo.txt.2021092206
+        export SATINFO=${FIXgfs}/gsi/gfsv16_historical/global_satinfo.txt.2021092206
     fi
 
     # NOTE:
@@ -139,7 +146,7 @@ if [[ ${RUN_ENVIR} == "emc" ]]; then
     #
     # Turn off assmilation of all Metop-A MHS
     # if [[ "${PDY}${cyc}" -ge "2021110312" && "${PDY}${cyc}" -lt "YYYYMMDDHH" ]]; then
-    #     export SATINFO=$FIXgsi/gfsv16_historical/global_satinfo.txt.2021110312
+    #     export SATINFO=${FIXgfs}/gsi/gfsv16_historical/global_satinfo.txt.2021110312
     # fi
 fi
diff --git a/parm/config/gfs/config.atmanl b/parm/config/gfs/config.atmanl
index abfbd80734..7cfd0cb47f 100644
--- a/parm/config/gfs/config.atmanl
+++ b/parm/config/gfs/config.atmanl
@@ -5,17 +5,29 @@
 
 echo "BEGIN: config.atmanl"
 
-export CASE_ANL=${CASE}
-export OBS_YAML_DIR=${HOMEgfs}/sorc/gdas.cd/parm/atm/obs/config/
-export OBS_LIST=${HOMEgfs}/sorc/gdas.cd/parm/atm/obs/lists/gdas_prototype_3d.yaml
-export ATMVARYAML=${HOMEgfs}/sorc/gdas.cd/parm/atm/variational/3dvar_dripcg.yaml
+export OBS_LIST="${PARMgfs}/gdas/atm/obs/lists/gdas_prototype_3d.yaml.j2"
+export JEDIYAML="${PARMgfs}/gdas/atm/variational/3dvar_drpcg.yaml.j2"
 export STATICB_TYPE="gsibec"
-export BERROR_YAML=${HOMEgfs}/sorc/gdas.cd/parm/atm/berror/staticb_${STATICB_TYPE}.yaml
 export INTERP_METHOD='barycentric'
 
+if [[ ${DOHYBVAR} = "YES" ]]; then
+    # shellcheck disable=SC2153
+    export CASE_ANL=${CASE_ENS}
+    export BERROR_YAML="${PARMgfs}/gdas/atm/berror/hybvar_${STATICB_TYPE}.yaml.j2"
+else
+    export CASE_ANL=${CASE}
+    export BERROR_YAML="${PARMgfs}/gdas/atm/berror/staticb_${STATICB_TYPE}.yaml.j2"
+fi
+
+export CRTM_FIX_YAML="${PARMgfs}/gdas/atm_crtm_coeff.yaml.j2"
+export JEDI_FIX_YAML="${PARMgfs}/gdas/atm_jedi_fix.yaml.j2"
+
+export layout_x_atmanl=@LAYOUT_X_ATMANL@
+export layout_y_atmanl=@LAYOUT_Y_ATMANL@
+
 export io_layout_x=@IO_LAYOUT_X@
 export io_layout_y=@IO_LAYOUT_Y@
 
-export JEDIEXE=${HOMEgfs}/exec/fv3jedi_var.x
+export JEDIEXE=${EXECgfs}/fv3jedi_var.x
 
 echo "END: config.atmanl"
diff --git a/parm/config/gfs/config.atmanlinit b/parm/config/gfs/config.atmanlinit
index bc95ef4962..1aec88bcc2 100644
--- a/parm/config/gfs/config.atmanlinit
+++ b/parm/config/gfs/config.atmanlinit
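The `ufs_configure_template` override added in config.ufs above relies on nesting `${var:-default}` expansions: a user-exported value wins, otherwise the computed default is used, and `/dev/null` is the final sentinel. A minimal sketch of that chain, using illustrative variable names and paths rather than the workflow's own:

```shell
#!/usr/bin/env bash
# Sketch of the nested ${var:-default} chain used for user-overridable
# defaults, as in:
#   export ufs_configure_template=${ufs_configure_template:-${default_template:-"/dev/null"}}
default_template="/parm/ufs/ufs.configure.s2s.IN"

# No user override: fall back to the computed default.
unset user_template
resolved=${user_template:-${default_template:-"/dev/null"}}
echo "${resolved}"   # -> /parm/ufs/ufs.configure.s2s.IN

# User override wins over the default.
user_template="/my/custom.IN"
resolved=${user_template:-${default_template:-"/dev/null"}}
echo "${resolved}"   # -> /my/custom.IN

# Neither set: the /dev/null sentinel survives the chain, so the
# later readability check (`[[ ! -r ... ]]`) still has something to test.
unset user_template default_template
resolved=${user_template:-${default_template:-"/dev/null"}}
echo "${resolved}"   # -> /dev/null
```

The `/dev/null` sentinel matters because the script then fails fast with `exit 17` if the resolved template is unreadable, rather than failing obscurely later when the template is parsed.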
@@ -7,4 +7,5 @@ echo "BEGIN: config.atmanlinit"
 
 # Get task specific resources
 . "${EXPDIR}/config.resources" atmanlinit
+
 echo "END: config.atmanlinit"
diff --git a/parm/config/gfs/config.atmensanl b/parm/config/gfs/config.atmensanl
index 58fd7b6e22..8e824b22f6 100644
--- a/parm/config/gfs/config.atmensanl
+++ b/parm/config/gfs/config.atmensanl
@@ -5,14 +5,19 @@
 
 echo "BEGIN: config.atmensanl"
 
-export OBS_YAML_DIR=${HOMEgfs}/sorc/gdas.cd/parm/atm/obs/config/
-export OBS_LIST=${HOMEgfs}/sorc/gdas.cd/parm/atm/obs/lists/lgetkf_prototype.yaml
-export ATMENSYAML=${HOMEgfs}/sorc/gdas.cd/parm/atm/lgetkf/lgetkf.yaml
+export OBS_LIST="${PARMgfs}/gdas/atm/obs/lists/lgetkf_prototype.yaml.j2"
+export JEDIYAML="${PARMgfs}/gdas/atm/lgetkf/lgetkf.yaml.j2"
 export INTERP_METHOD='barycentric'
 
+export CRTM_FIX_YAML="${PARMgfs}/gdas/atm_crtm_coeff.yaml.j2"
+export JEDI_FIX_YAML="${PARMgfs}/gdas/atm_jedi_fix.yaml.j2"
+
+export layout_x_atmensanl=@LAYOUT_X_ATMENSANL@
+export layout_y_atmensanl=@LAYOUT_Y_ATMENSANL@
+
 export io_layout_x=@IO_LAYOUT_X@
 export io_layout_y=@IO_LAYOUT_Y@
 
-export JEDIEXE=${HOMEgfs}/exec/fv3jedi_letkf.x
+export JEDIEXE=${EXECgfs}/fv3jedi_letkf.x
 
 echo "END: config.atmensanl"
diff --git a/parm/config/gfs/config.atmensanlinit b/parm/config/gfs/config.atmensanlinit
index 34429023bb..0eee2ffa82 100644
--- a/parm/config/gfs/config.atmensanlinit
+++ b/parm/config/gfs/config.atmensanlinit
@@ -7,4 +7,5 @@ echo "BEGIN: config.atmensanlinit"
 
 # Get task specific resources
 . "${EXPDIR}/config.resources" atmensanlinit
+
 echo "END: config.atmensanlinit"
diff --git a/parm/config/gfs/config.atmos_products b/parm/config/gfs/config.atmos_products
index c3e861b281..70f9112edb 100644
--- a/parm/config/gfs/config.atmos_products
+++ b/parm/config/gfs/config.atmos_products
@@ -12,24 +12,31 @@ echo "BEGIN: config.atmos_products"
 export NFHRS_PER_GROUP=3
 
 # Scripts used by this job
-export INTERP_ATMOS_MASTERSH="${HOMEgfs}/ush/interp_atmos_master.sh"
-export INTERP_ATMOS_SFLUXSH="${HOMEgfs}/ush/interp_atmos_sflux.sh"
+export INTERP_ATMOS_MASTERSH="${USHgfs}/interp_atmos_master.sh"
+export INTERP_ATMOS_SFLUXSH="${USHgfs}/interp_atmos_sflux.sh"
 
 if [[ "${RUN:-}" == "gdas" ]]; then
     export downset=1
     export FHOUT_PGBS=${FHOUT:-1}  # Output frequency of supplemental gfs pgb file at 1.0 and 0.5 deg
     export FLXGF="NO"  # Create interpolated sflux.1p00 file
+    export WGNE="NO"   # WGNE products are created for first FHMAX_WGNE forecast hours
+    export FHMAX_WGNE=0
 elif [[ "${RUN:-}" == "gfs" ]]; then
     #JKHexport downset=2  ## create pgrb2b files
     export downset=1      ## JKH
     export FHOUT_PGBS=${FHOUT_GFS:-3}  # Output frequency of supplemental gfs pgb file at 1.0 and 0.5 deg
+#JKH    export FLXGF="YES"  # Create interpolated sflux.1p00 file
     export FLXGF="NO"  # Create interpolated sflux.1p00 file
+    export WGNE="YES"  # WGNE products are created for first FHMAX_WGNE forecast hours
+    export FHMAX_WGNE=180
 fi
 
+export APCP_MSG="597"  # Message number for APCP in GFSv16. Look for TODO in exglobal_atmos_products.sh
+
 # paramlist files for the different forecast hours and downsets
-export paramlista="${HOMEgfs}/parm/post/global_1x1_paramlist_g2"
-export paramlista_anl="${HOMEgfs}/parm/post/global_1x1_paramlist_g2.anl"
-export paramlista_f000="${HOMEgfs}/parm/post/global_1x1_paramlist_g2.f000"
-export paramlistb="${HOMEgfs}/parm/post/global_master-catchup_parmlist_g2"
+export paramlista="${PARMgfs}/product/gfs.fFFF.paramlist.a.txt"
+export paramlista_anl="${PARMgfs}/product/gfs.anl.paramlist.a.txt"
+export paramlista_f000="${PARMgfs}/product/gfs.f000.paramlist.a.txt"
+export paramlistb="${PARMgfs}/product/gfs.fFFF.paramlist.b.txt"
 
 echo "END: config.atmos_products"
diff --git a/parm/config/gfs/config.awips b/parm/config/gfs/config.awips
index 3b78d4bb4b..61f0dc5652 100644
--- a/parm/config/gfs/config.awips
+++ b/parm/config/gfs/config.awips
@@ -8,9 +8,6 @@ echo "BEGIN: config.awips"
 # Get task specific resources
 . "${EXPDIR}/config.resources" awips
 
-export AWIPS20KM1P0DEGSH="${HOMEgfs}/jobs/JGFS_ATMOS_AWIPS_20KM_1P0DEG"
-export AWIPSG2SH="${HOMEgfs}/jobs/JGFS_ATMOS_AWIPS_G2"
-
 # No. of concurrent awips jobs
 export NAWIPSGRP=42
diff --git a/parm/config/gfs/config.base.emc.dyn b/parm/config/gfs/config.base
similarity index 100%
rename from parm/config/gfs/config.base.emc.dyn
rename to parm/config/gfs/config.base
diff --git a/parm/config/gfs/config.base.emc.dyn_emc b/parm/config/gfs/config.base.emc.dyn_emc
index 88a9643ab8..a7359cd632 100644
--- a/parm/config/gfs/config.base.emc.dyn_emc
+++ b/parm/config/gfs/config.base.emc.dyn_emc
@@ -23,12 +23,11 @@ export HPSS_PROJECT="@HPSS_PROJECT@"
 
 # Directories relative to installation areas:
 export HOMEgfs=@HOMEgfs@
-export PARMgfs="${HOMEgfs}/parm"
-export FIXgfs="${HOMEgfs}/fix"
-export USHgfs="${HOMEgfs}/ush"
-export UTILgfs="${HOMEgfs}/util"
 export EXECgfs="${HOMEgfs}/exec"
+export FIXgfs="${HOMEgfs}/fix"
+export PARMgfs="${HOMEgfs}/parm"
 export SCRgfs="${HOMEgfs}/scripts"
+export USHgfs="${HOMEgfs}/ush"
 
 export FIXam="${FIXgfs}/am"
 export FIXaer="${FIXgfs}/aer"
@@ -39,6 +38,7 @@ export FIXcice="${FIXgfs}/cice"
 export FIXmom="${FIXgfs}/mom6"
 export FIXreg2grb2="${FIXgfs}/reg2grb2"
 export FIXugwd="${FIXgfs}/ugwd"
+export FIXgdas="${FIXgfs}/gdas"
 
 ########################################################################
 
@@ -49,6 +49,12 @@ export COMINsyn="@COMINsyn@"
 export DMPDIR="@DMPDIR@"
 export BASE_CPLIC="@BASE_CPLIC@"
 
+# Gempak from external models
+# Default locations are to dummy locations for testing
+export COMINecmwf=@COMINecmwf@
+export COMINnam=@COMINnam@
+export COMINukmet=@COMINukmet@
+
 # USER specific paths
 export HOMEDIR="@HOMEDIR@"
 export STMP="@STMP@"
@@ -67,16 +73,9 @@ export DO_NPOESS="NO"       # NPOESS products
 export DO_TRACKER="YES"     # Hurricane track verification
 export DO_GENESIS="YES"     # Cyclone genesis verification
 export DO_GENESIS_FSU="NO"  # Cyclone genesis verification (FSU)
-# The monitor is not yet supported on Hercules
-if [[ "${machine}" == "HERCULES" ]]; then
-    export DO_VERFOZN="NO"   # Ozone data assimilation monitoring
-    export DO_VERFRAD="NO"   # Radiance data assimilation monitoring
-    export DO_VMINMON="NO"   # GSI minimization monitoring
-else
-    export DO_VERFOZN="YES"  # Ozone data assimilation monitoring
-    export DO_VERFRAD="YES"  # Radiance data assimilation monitoring
-    export DO_VMINMON="YES"  # GSI minimization monitoring
-fi
+export DO_VERFOZN="YES"  # Ozone data assimilation monitoring
+export DO_VERFRAD="YES"  # Radiance data assimilation monitoring
+export DO_VMINMON="YES"  # GSI minimization monitoring
 export DO_MOS="NO"       # GFS Model Output Statistics - Only supported on WCOSS2
 
 # NO for retrospective parallel; YES for real-time parallel
@@ -105,6 +104,7 @@ export NMV="/bin/mv"
 export NLN="/bin/ln -sf"
 export VERBOSE="YES"
 export KEEPDATA="NO"
+export DEBUG_POSTSCRIPT="NO"  # PBS only; sets debug=true
 export CHGRP_RSTPROD="@CHGRP_RSTPROD@"
 export CHGRP_CMD="@CHGRP_CMD@"
 export NCDUMP="${NETCDF:-${netcdf_c_ROOT:-}}/bin/ncdump"
@@ -143,6 +143,7 @@ export RUN=${RUN:-${CDUMP:-"gfs"}}  # RUN is defined in the job-card (ecf); CDUM
 # Get all the COM path templates
 source "${EXPDIR}/config.com"
 
+# shellcheck disable=SC2016
 export ERRSCRIPT=${ERRSCRIPT:-'eval [[ $err = 0 ]]'}
 export LOGSCRIPT=${LOGSCRIPT:-""}
 #export ERRSCRIPT=${ERRSCRIPT:-"err_chk"}
@@ -159,6 +160,20 @@ export DBNROOT=${DBNROOT:-${UTILROOT:-}/fakedbn}
 # APP settings
 export APP=@APP@
 
+shopt -s extglob
+# Adjust APP based on RUN
+case "${RUN}" in
+    gfs)  # Turn off aerosols
+        APP="${APP/%A}"
+        ;;
+    enkf*)  # Turn off aerosols and waves
+        APP="${APP/%+([WA])}"
+        ;;
+    *)  # Keep app unchanged
+        ;;
+esac
+shopt -u extglob
+
 # Defaults:
 export DO_ATM="YES"
 export DO_COUPLED="NO"
@@ -179,19 +194,20 @@ export CASE="@CASECTL@"
 export CASE_ENS="@CASEENS@"
 export OCNRES="@OCNRES@"
 export ICERES="${OCNRES}"
+
 # These are the currently recommended grid-combinations
 case "${CASE}" in
     "C48")
-        export waveGRD='glo_500'
+        export waveGRD='uglo_100km'
         ;;
     "C96" | "C192")
-        export waveGRD='glo_200'
+        export waveGRD='uglo_100km'
         ;;
     "C384")
-        export waveGRD='glo_025'
+        export waveGRD='uglo_100km'
         ;;
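The `APP` adjustment added to config.base above strips component suffixes with extglob-enabled parameter expansion: `${APP/%pattern}` deletes a match of `pattern` anchored at the end of the string, and `+([WA])` matches one or more `W`/`A` characters. A standalone sketch (the `S2SWA` value is illustrative):

```shell
#!/usr/bin/env bash
# Sketch of the extglob suffix-stripping used to adjust APP per RUN.
# ${APP/%pattern} deletes a match of pattern anchored at the end of APP;
# +([WA]) matches one or more 'W' or 'A' characters (requires extglob).
shopt -s extglob
APP="S2SWA"
gfs_app="${APP/%A}"         # gfs run: drop trailing aerosols  -> S2SW
enkf_app="${APP/%+([WA])}"  # enkf run: drop waves and aerosols -> S2S
shopt -u extglob
echo "${gfs_app} ${enkf_app}"   # -> S2SW S2S
```

Note that `shopt -s extglob` must be in effect when the expansion runs, which is why the config brackets the `case` block with `shopt -s extglob` / `shopt -u extglob`.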
"C768" | "C1152") - export waveGRD='mx025' + export waveGRD='uglo_m1g16' ;; *) echo "FATAL ERROR: Unrecognized CASE ${CASE}, ABORT!" @@ -247,35 +263,34 @@ fi export FHMIN=0 export FHMAX=9 export FHOUT=3 # Will be changed to 1 in config.base if (DOHYBVAR set to NO and l4densvar set to false) +export FHOUT_OCNICE=3 # Cycle to run EnKF (set to BOTH for both gfs and gdas) -export EUPD_CYC="gdas" +export EUPD_CYC="@EUPD_CYC@" # GFS cycle info export gfs_cyc=@gfs_cyc@ # 0: no GFS cycle, 1: 00Z only, 2: 00Z and 12Z only, 4: all 4 cycles. # GFS output and frequency export FHMIN_GFS=0 - -export FHMAX_GFS_00=120 -export FHMAX_GFS_06=120 -export FHMAX_GFS_12=120 -export FHMAX_GFS_18=120 -current_fhmax_var=FHMAX_GFS_${cyc}; declare -x FHMAX_GFS=${!current_fhmax_var} - -export FHOUT_GFS=6 # Must be 6 for S2S until #1629 is addressed; 3 for ops +export FHMAX_GFS=@FHMAX_GFS@ +export FHOUT_GFS=3 # Must be 6 for S2S until #1629 is addressed; 3 for ops export FHMAX_HF_GFS=0 export FHOUT_HF_GFS=1 +export FHOUT_OCNICE_GFS=6 if (( gfs_cyc != 0 )); then export STEP_GFS=$(( 24 / gfs_cyc )) else export STEP_GFS="0" fi -export ILPOST=1 # gempak output frequency up to F120 +# TODO: Change gempak to use standard out variables (#2348) +export ILPOST=${FHOUT_HF_GFS} # gempak output frequency up to F120 +if (( FHMAX_HF_GFS < 120 )); then + export ILPOST=${FHOUT_GFS} +fi # GFS restart interval in hours -#JKHexport restart_interval_gfs=12 -export restart_interval_gfs=-1 ## JKH +export restart_interval_gfs=12 # NOTE: Do not set this to zero. Instead set it to $FHMAX_GFS # TODO: Remove this variable from config.base and reference from config.fcst # TODO: rework logic in config.wave and push it to parsing_nameslist_WW3.sh where it is actually used @@ -317,16 +332,20 @@ export DO_MERGENSST="@DO_MERGENSST@" # Hybrid related export DOHYBVAR="@DOHYBVAR@" export NMEM_ENS=@NMEM_ENS@ -export NMEM_ENS_GFS=@NMEM_ENS@ export SMOOTH_ENKF="NO" export l4densvar=".true." export lwrite4danl=".true." 
+# Early-cycle EnKF parameters
+export NMEM_ENS_GFS=30
+export NMEM_ENS_GFS_OFFSET=20
+export DO_CALC_INCREMENT_ENKF_GFS="NO"
+
 # EnKF output frequency
 if [[ ${DOHYBVAR} = "YES" ]]; then
     export FHMIN_ENKF=3
     export FHMAX_ENKF=9
-    export FHMAX_ENKF_GFS=120
+    export FHMAX_ENKF_GFS=@FHMAX_ENKF_GFS@
     export FHOUT_ENKF_GFS=3
     if [[ ${l4densvar} = ".true." ]]; then
         export FHOUT=1
@@ -353,6 +372,8 @@ fi
 if [[ "${DOIAU_ENKF}" = "NO" ]]; then export IAUFHRS_ENKF="6"; fi
 
+export GSI_SOILANAL=@GSI_SOILANAL@
+
 # turned on nsst in anal and/or fcst steps, and turn off rtgsst
 export DONST="YES"
 if [[ ${DONST} = "YES" ]]; then export FNTSFA=" "; fi
@@ -367,13 +388,10 @@ export MAKE_NSSTBUFR="@MAKE_NSSTBUFR@"
 export MAKE_ACFTBUFR="@MAKE_ACFTBUFR@"
 
 # Analysis increments to zero in CALCINCEXEC
-export INCREMENTS_TO_ZERO="'liq_wat_inc','icmr_inc'"
-
-# Write analysis files for early cycle EnKF
-export DO_CALC_INCREMENT_ENKF_GFS="YES"
+export INCREMENTS_TO_ZERO="'liq_wat_inc','icmr_inc','rwmr_inc','snmr_inc','grle_inc'"
 
 # Stratospheric increments to zero
-export INCVARS_ZERO_STRAT="'sphum_inc','liq_wat_inc','icmr_inc'"
+export INCVARS_ZERO_STRAT="'sphum_inc','liq_wat_inc','icmr_inc','rwmr_inc','snmr_inc','grle_inc'"
 export INCVARS_EFOLD="5"
 
 # Swith to generate netcdf or binary diagnostic files.  If not specified,
@@ -385,6 +403,7 @@ export binary_diag=".false."
 # Verification options
 export DO_METP="NO"      # Run METPLUS jobs - set METPLUS settings in config.metp; not supported with spack-stack
 export DO_FIT2OBS="YES"  # Run fit to observations package
+export DO_VRFY_OCEANDA="@DO_VRFY_OCEANDA@"  # Run SOCA Ocean and Seaice DA verification tasks
 
 # Archiving options
 export HPSSARCH="@HPSSARCH@"  # save data to HPSS archive
@@ -402,4 +421,11 @@ export FITSARC="YES"
 export FHMAX_FITS=132
 [[ "${FHMAX_FITS}" -gt "${FHMAX_GFS}" ]] && export FHMAX_FITS=${FHMAX_GFS}
 
+# The monitor jobs are not yet supported for JEDIATMVAR.
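The `FHMAX_FITS` line in the config.base hunk above is a common one-liner clamp: if the fit-to-obs window exceeds the forecast length, shrink it to match. A standalone sketch with illustrative values:

```shell
#!/usr/bin/env bash
# Sketch of the one-line clamp pattern from config.base:
#   [[ "${FHMAX_FITS}" -gt "${FHMAX_GFS}" ]] && export FHMAX_FITS=${FHMAX_GFS}
# The && short-circuit only reassigns when the test succeeds.
FHMAX_GFS=120
FHMAX_FITS=132
[[ "${FHMAX_FITS}" -gt "${FHMAX_GFS}" ]] && export FHMAX_FITS=${FHMAX_GFS}
echo "${FHMAX_FITS}"   # -> 120
```

One caveat with this idiom: when the test fails, the `&&` list as a whole returns non-zero, which can trip scripts running under `set -e` if the clamp is the last command in a sourced file.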
+if [[ ${DO_JEDIATMVAR} = "YES" ]]; then
+  export DO_VERFOZN="NO"   # Ozone data assimilation monitoring
+  export DO_VERFRAD="NO"   # Radiance data assimilation monitoring
+  export DO_VMINMON="NO"   # GSI minimization monitoring
+fi
+
 echo "END: config.base"
diff --git a/parm/config/gfs/config.base.emc.dyn_hera b/parm/config/gfs/config.base.emc.dyn_hera
index 231b48b0b2..facb9c1dc1 100644
--- a/parm/config/gfs/config.base.emc.dyn_hera
+++ b/parm/config/gfs/config.base.emc.dyn_hera
@@ -23,12 +23,11 @@ export HPSS_PROJECT="@HPSS_PROJECT@"
 # Directories relative to installation areas:
 export HOMEgfs=@HOMEgfs@
-export PARMgfs="${HOMEgfs}/parm"
-export FIXgfs="${HOMEgfs}/fix"
-export USHgfs="${HOMEgfs}/ush"
-export UTILgfs="${HOMEgfs}/util"
 export EXECgfs="${HOMEgfs}/exec"
+export FIXgfs="${HOMEgfs}/fix"
+export PARMgfs="${HOMEgfs}/parm"
 export SCRgfs="${HOMEgfs}/scripts"
+export USHgfs="${HOMEgfs}/ush"
 
 export FIXam="${FIXgfs}/am"
 export FIXaer="${FIXgfs}/aer"
@@ -39,6 +38,7 @@ export FIXcice="${FIXgfs}/cice"
 export FIXmom="${FIXgfs}/mom6"
 export FIXreg2grb2="${FIXgfs}/reg2grb2"
 export FIXugwd="${FIXgfs}/ugwd"
+export FIXgdas="${FIXgfs}/gdas"
 
 ########################################################################
 
@@ -49,6 +49,12 @@ export COMINsyn="@COMINsyn@"
 export DMPDIR="@DMPDIR@"
 export BASE_CPLIC="@BASE_CPLIC@"
 
+# Gempak from external models
+# Default locations are to dummy locations for testing
+export COMINecmwf=@COMINecmwf@
+export COMINnam=@COMINnam@
+export COMINukmet=@COMINukmet@
+
 # USER specific paths
 export HOMEDIR="@HOMEDIR@"
 export STMP="@STMP@"
@@ -67,16 +73,9 @@ export DO_NPOESS="NO"  # NPOESS products
 export DO_TRACKER="NO"  # Hurricane track verification  ## JKH
 export DO_GENESIS="NO"  # Cyclone genesis verification  ## JKH
 export DO_GENESIS_FSU="NO"  # Cyclone genesis verification (FSU)
-# The monitor is not yet supported on Hercules
-if [[ "${machine}" == "HERCULES" ]]; then
-  export DO_VERFOZN="NO"   # Ozone data assimilation monitoring
-  export DO_VERFRAD="NO"   # Radiance data assimilation monitoring
-  export DO_VMINMON="NO"   # GSI minimization monitoring
-else
-  export DO_VERFOZN="YES"  # Ozone data assimilation monitoring
-  export DO_VERFRAD="YES"  # Radiance data assimilation monitoring
-  export DO_VMINMON="YES"  # GSI minimization monitoring
-fi
+export DO_VERFOZN="YES"  # Ozone data assimilation monitoring
+export DO_VERFRAD="YES"  # Radiance data assimilation monitoring
+export DO_VMINMON="YES"  # GSI minimization monitoring
 export DO_MOS="NO"  # GFS Model Output Statistics - Only supported on WCOSS2
 
 # NO for retrospective parallel; YES for real-time parallel
@@ -105,6 +104,7 @@ export NMV="/bin/mv"
 export NLN="/bin/ln -sf"
 export VERBOSE="YES"
 export KEEPDATA="NO"
+export DEBUG_POSTSCRIPT="NO"  # PBS only; sets debug=true
 export CHGRP_RSTPROD="@CHGRP_RSTPROD@"
 export CHGRP_CMD="@CHGRP_CMD@"
 export NCDUMP="${NETCDF:-${netcdf_c_ROOT:-}}/bin/ncdump"
@@ -143,6 +143,7 @@ export RUN=${RUN:-${CDUMP:-"gfs"}}  # RUN is defined in the job-card (ecf); CDUM
 # Get all the COM path templates
 source "${EXPDIR}/config.com"
 
+# shellcheck disable=SC2016
 export ERRSCRIPT=${ERRSCRIPT:-'eval [[ $err = 0 ]]'}
 export LOGSCRIPT=${LOGSCRIPT:-""}
 #export ERRSCRIPT=${ERRSCRIPT:-"err_chk"}
@@ -159,6 +160,20 @@ export DBNROOT=${DBNROOT:-${UTILROOT:-}/fakedbn}
 # APP settings
 export APP=@APP@
 
+shopt -s extglob
+# Adjust APP based on RUN
+case "${RUN}" in
+  gfs)  # Turn off aerosols
+    APP="${APP/%A}"
+    ;;
+  enkf*)  # Turn off aerosols and waves
+    APP="${APP/%+([WA])}"
+    ;;
+  *)  # Keep app unchanged
+    ;;
+esac
+shopt -u extglob
+
 # Defaults:
 export DO_ATM="YES"
 export DO_COUPLED="NO"
@@ -179,19 +194,20 @@ export CASE="@CASECTL@"
 export CASE_ENS="@CASEENS@"
 export OCNRES="@OCNRES@"
 export ICERES="${OCNRES}"
+
 # These are the currently recommended grid-combinations
 case "${CASE}" in
   "C48")
-    export waveGRD='glo_500'
+    export waveGRD='uglo_100km'
     ;;
   "C96" | "C192")
-    export waveGRD='glo_200'
+    export waveGRD='uglo_100km'
     ;;
   "C384")
-    export waveGRD='glo_025'
+    export waveGRD='uglo_100km'
     ;;
   "C768" | "C1152")
-    export waveGRD='mx025'
+    export waveGRD='uglo_m1g16'
     ;;
   *)
     echo "FATAL ERROR: Unrecognized CASE ${CASE}, ABORT!"
@@ -247,31 +263,31 @@ fi
 export FHMIN=0
 export FHMAX=9
 export FHOUT=3  # Will be changed to 1 in config.base if (DOHYBVAR set to NO and l4densvar set to false)
+export FHOUT_OCNICE=3
 
 # Cycle to run EnKF  (set to BOTH for both gfs and gdas)
-export EUPD_CYC="gdas"
+export EUPD_CYC="@EUPD_CYC@"
 
 # GFS cycle info
 export gfs_cyc=@gfs_cyc@  # 0: no GFS cycle, 1: 00Z only, 2: 00Z and 12Z only, 4: all 4 cycles.
 
 # GFS output and frequency
 export FHMIN_GFS=0
-
-export FHMAX_GFS_00=120
-export FHMAX_GFS_06=120
-export FHMAX_GFS_12=120
-export FHMAX_GFS_18=120
-current_fhmax_var=FHMAX_GFS_${cyc}; declare -x FHMAX_GFS=${!current_fhmax_var}
-
-export FHOUT_GFS=6  # Must be 6 for S2S until #1629 is addressed; 3 for ops
+export FHMAX_GFS=@FHMAX_GFS@
+export FHOUT_GFS=6  ##JKH # Must be 6 for S2S until #1629 is addressed; 3 for ops
 export FHMAX_HF_GFS=0
 export FHOUT_HF_GFS=1
+export FHOUT_OCNICE_GFS=6
 if (( gfs_cyc != 0 )); then
   export STEP_GFS=$(( 24 / gfs_cyc ))
 else
   export STEP_GFS="0"
 fi
-export ILPOST=1  # gempak output frequency up to F120
+# TODO: Change gempak to use standard out variables (#2348)
+export ILPOST=${FHOUT_HF_GFS}  # gempak output frequency up to F120
+if (( FHMAX_HF_GFS < 120 )); then
+  export ILPOST=${FHOUT_GFS}
+fi
 
 # GFS restart interval in hours
 #JKHexport restart_interval_gfs=12
@@ -317,16 +333,21 @@ export DO_MERGENSST="@DO_MERGENSST@"
 # Hybrid related
 export DOHYBVAR="@DOHYBVAR@"
 export NMEM_ENS=@NMEM_ENS@
-export NMEM_ENS_GFS=@NMEM_ENS@
+export NMEM_ENS_GFS=@NMEM_ENS@  #JKH
 export SMOOTH_ENKF="NO"
 export l4densvar=".true."
 export lwrite4danl=".true."
+# Early-cycle EnKF parameters
+export NMEM_ENS_GFS=30
+export NMEM_ENS_GFS_OFFSET=20
+export DO_CALC_INCREMENT_ENKF_GFS="NO"
+
 # EnKF output frequency
 if [[ ${DOHYBVAR} = "YES" ]]; then
     export FHMIN_ENKF=3
     export FHMAX_ENKF=9
-    export FHMAX_ENKF_GFS=120
+    export FHMAX_ENKF_GFS=@FHMAX_ENKF_GFS@
     export FHOUT_ENKF_GFS=3
     if [[ ${l4densvar} = ".true." ]]; then
         export FHOUT=1
@@ -353,6 +374,8 @@ fi
 if [[ "${DOIAU_ENKF}" = "NO" ]]; then export IAUFHRS_ENKF="6"; fi
 
+export GSI_SOILANAL=@GSI_SOILANAL@
+
 # turned on nsst in anal and/or fcst steps, and turn off rtgsst
 export DONST="YES"
 if [[ ${DONST} = "YES" ]]; then export FNTSFA=" "; fi
@@ -361,19 +384,16 @@ if [[ ${DONST} = "YES" ]]; then export FNTSFA=" "; fi
 export nst_anl=.true.
 
 # Make the nsstbufr file on the fly or use the GDA version
-export MAKE_NSSTBUFR="@MAKE_NSSTBUFR@"
+export MAKE_NSSTBUFR="@MAKE_NSSTBUFR@"  #JKH
 
 # Make the aircraft prepbufr file on the fly or use the GDA version
 export MAKE_ACFTBUFR="@MAKE_ACFTBUFR@"
 
 # Analysis increments to zero in CALCINCEXEC
-export INCREMENTS_TO_ZERO="'liq_wat_inc','icmr_inc'"
-
-# Write analysis files for early cycle EnKF
-export DO_CALC_INCREMENT_ENKF_GFS="YES"
+export INCREMENTS_TO_ZERO="'liq_wat_inc','icmr_inc','rwmr_inc','snmr_inc','grle_inc'"
 
 # Stratospheric increments to zero
-export INCVARS_ZERO_STRAT="'sphum_inc','liq_wat_inc','icmr_inc'"
+export INCVARS_ZERO_STRAT="'sphum_inc','liq_wat_inc','icmr_inc','rwmr_inc','snmr_inc','grle_inc'"
 export INCVARS_EFOLD="5"
 
 # Swith to generate netcdf or binary diagnostic files. If not specified,
@@ -385,6 +405,7 @@ export binary_diag=".false."
 # Verification options
 export DO_METP="NO"     # Run METPLUS jobs - set METPLUS settings in config.metp; not supported with spack-stack
 export DO_FIT2OBS="NO"  # Run fit to observations package  ## JKH
+export DO_VRFY_OCEANDA="@DO_VRFY_OCEANDA@"  # Run SOCA Ocean and Seaice DA verification tasks
 
 # Archiving options
 export HPSSARCH="@HPSSARCH@"  # save data to HPSS archive
@@ -402,4 +423,11 @@ export FITSARC="YES"
 export FHMAX_FITS=132
 [[ "${FHMAX_FITS}" -gt "${FHMAX_GFS}" ]] && export FHMAX_FITS=${FHMAX_GFS}
 
+# The monitor jobs are not yet supported for JEDIATMVAR.
+if [[ ${DO_JEDIATMVAR} = "YES" ]]; then
+  export DO_VERFOZN="NO"   # Ozone data assimilation monitoring
+  export DO_VERFRAD="NO"   # Radiance data assimilation monitoring
+  export DO_VMINMON="NO"   # GSI minimization monitoring
+fi
+
 echo "END: config.base"
diff --git a/parm/config/gfs/config.cleanup b/parm/config/gfs/config.cleanup
index 1908c91bb5..44e2690f65 100644
--- a/parm/config/gfs/config.cleanup
+++ b/parm/config/gfs/config.cleanup
@@ -12,6 +12,11 @@ export CLEANUP_COM="YES"  # NO=retain ROTDIR. YES default in cleanup.sh
 export RMOLDSTD=144
 export RMOLDEND=24
 
+if [[ "${DO_GEMPAK}" == "YES" ]]; then
+  export RMOLDSTD=346
+  export RMOLDEND=222
+fi
+
 # Specify the list of files to exclude from the first stage of cleanup
 # Because arrays cannot be exported, list is a single string of comma-
 # separated values. This string is split to form an array at runtime.
@@ -22,4 +27,4 @@ case ${RUN} in
 esac
 export exclude_string
 
-echo "END: config.cleanup"
\ No newline at end of file
+echo "END: config.cleanup"
diff --git a/parm/config/gfs/config.com b/parm/config/gfs/config.com
index db648b5866..2f99e709ea 100644
--- a/parm/config/gfs/config.com
+++ b/parm/config/gfs/config.com
@@ -52,7 +52,7 @@ declare -rx COM_CONF_TMPL=${COM_BASE}'/conf'
 declare -rx COM_ATMOS_INPUT_TMPL=${COM_BASE}'/model_data/atmos/input'
 declare -rx COM_ATMOS_RESTART_TMPL=${COM_BASE}'/model_data/atmos/restart'
 declare -rx COM_ATMOS_ANALYSIS_TMPL=${COM_BASE}'/analysis/atmos'
-declare -rx COM_LAND_ANALYSIS_TMPL=${COM_BASE}'/analysis/land'
+declare -rx COM_SNOW_ANALYSIS_TMPL=${COM_BASE}'/analysis/snow'
 declare -rx COM_ATMOS_HISTORY_TMPL=${COM_BASE}'/model_data/atmos/history'
 declare -rx COM_ATMOS_MASTER_TMPL=${COM_BASE}'/model_data/atmos/master'
 declare -rx COM_ATMOS_GRIB_TMPL=${COM_BASE}'/products/atmos/grib2'
@@ -80,15 +80,16 @@ declare -rx COM_OCEAN_HISTORY_TMPL=${COM_BASE}'/model_data/ocean/history'
 declare -rx COM_OCEAN_RESTART_TMPL=${COM_BASE}'/model_data/ocean/restart'
 declare -rx COM_OCEAN_INPUT_TMPL=${COM_BASE}'/model_data/ocean/input'
 declare -rx COM_OCEAN_ANALYSIS_TMPL=${COM_BASE}'/analysis/ocean'
-declare -rx COM_OCEAN_2D_TMPL=${COM_BASE}'/products/ocean/2D'
-declare -rx COM_OCEAN_3D_TMPL=${COM_BASE}'/products/ocean/3D'
-declare -rx COM_OCEAN_XSECT_TMPL=${COM_BASE}'/products/ocean/xsect'
+declare -rx COM_OCEAN_NETCDF_TMPL=${COM_BASE}'/products/ocean/netcdf'
 declare -rx COM_OCEAN_GRIB_TMPL=${COM_BASE}'/products/ocean/grib2'
 declare -rx COM_OCEAN_GRIB_GRID_TMPL=${COM_OCEAN_GRIB_TMPL}'/${GRID}'
 
 declare -rx COM_ICE_INPUT_TMPL=${COM_BASE}'/model_data/ice/input'
 declare -rx COM_ICE_HISTORY_TMPL=${COM_BASE}'/model_data/ice/history'
 declare -rx COM_ICE_RESTART_TMPL=${COM_BASE}'/model_data/ice/restart'
+declare -rx COM_ICE_NETCDF_TMPL=${COM_BASE}'/products/ice/netcdf'
+declare -rx COM_ICE_GRIB_TMPL=${COM_BASE}'/products/ice/grib2'
+declare -rx COM_ICE_GRIB_GRID_TMPL=${COM_ICE_GRIB_TMPL}'/${GRID}'
 
 declare -rx COM_CHEM_HISTORY_TMPL=${COM_BASE}'/model_data/chem/history'
 declare -rx COM_CHEM_ANALYSIS_TMPL=${COM_BASE}'/analysis/chem'
diff --git a/parm/config/gfs/config.efcs b/parm/config/gfs/config.efcs
index 283ec3ab7e..402ba64087 100644
--- a/parm/config/gfs/config.efcs
+++ b/parm/config/gfs/config.efcs
@@ -5,14 +5,10 @@
 echo "BEGIN: config.efcs"
 
-# Turn off components in ensemble via _ENKF, or use setting from deterministic
-export DO_AERO=${DO_AERO_ENKF:-${DO_AERO:-"NO"}}
-export DO_OCN=${DO_OCN_ENKF:-${DO_OCN:-"NO"}}
-export DO_ICE=${DO_ICE_ENKF:-${DO_ICE:-"NO"}}
-export DO_WAVE=${DO_WAVE_ENKF:-${DO_WAVE:-"NO"}}
+export CASE="${CASE_ENS}"
 
 # Source model specific information that is resolution dependent
-string="--fv3 ${CASE_ENS}"
+string="--fv3 ${CASE}"
 # Ocean/Ice/Waves ensemble configurations are identical to deterministic member
 [[ "${DO_OCN}" == "YES" ]] && string="${string} --mom6 ${OCNRES}"
 [[ "${DO_ICE}" == "YES" ]] && string="${string} --cice6 ${ICERES}"
@@ -25,15 +21,23 @@ source "${EXPDIR}/config.ufs" ${string}
 # Get task specific resources
 . "${EXPDIR}/config.resources" efcs
 
+# nggps_diag_nml
+export FHOUT=${FHOUT_ENKF:-3}
+if [[ ${RUN} == "enkfgfs" ]]; then
+  export FHOUT=${FHOUT_ENKF_GFS:-${FHOUT}}
+fi
+
+# model_configure
+export FHMIN=${FHMIN_ENKF:-3}
+export FHMAX=${FHMAX_ENKF:-9}
+if [[ ${RUN} == "enkfgfs" ]]; then
+  export FHMAX=${FHMAX_ENKF_GFS:-${FHMAX}}
+fi
+
 # Use serial I/O for ensemble (lustre?)
 export OUTPUT_FILETYPE_ATM="netcdf"
 export OUTPUT_FILETYPE_SFC="netcdf"
 
-# Number of enkf members per fcst job
-export NMEM_EFCSGRP=2
-export NMEM_EFCSGRP_GFS=1
-export RERUN_EFCSGRP="NO"
-
 # Turn off inline UPP for EnKF forecast
 export WRITE_DOPOST=".false."
 
@@ -56,14 +60,33 @@ export SPPT_LSCALE=500000.
 export SPPT_LOGIT=".true."
 export SPPT_SFCLIMIT=".true."
 
-if [[ "${QUILTING}" = ".true." ]] && [[ "${OUTPUT_GRID}" = "gaussian_grid" ]]; then
-  export DIAG_TABLE="${HOMEgfs}/parm/ufs/fv3/diag_table_da"
+if [[ "${QUILTING}" == ".true." ]] && [[ "${OUTPUT_GRID}" == "gaussian_grid" ]]; then
+  export DIAG_TABLE="${PARMgfs}/ufs/fv3/diag_table_da"
 else
-  export DIAG_TABLE="${HOMEgfs}/parm/ufs/fv3/diag_table_da_orig"
+  export DIAG_TABLE="${PARMgfs}/ufs/fv3/diag_table_da_orig"
+fi
+
+# Model config option for Ensemble
+# export TYPE=nh        # choices: nh, hydro
+# export MONO=non-mono  # choices: mono, non-mono
+
+# gfs_physics_nml
+export FHSWR=3600.
+export FHLWR=3600.
+export IEMS=1
+export ISOL=2
+export ICO2=2
+export dspheat=".true."
+export shal_cnv=".true."
+export FHZER=6
+
+# Set PREFIX_ATMINC to r when recentering on
+if [[ ${RECENTER_ENKF:-"YES"} == "YES" ]]; then
+  export PREFIX_ATMINC="r"
 fi
 
 # For IAU, write restarts at beginning of window also
-if [[ "${DOIAU_ENKF:-}" = "YES" ]]; then
+if [[ "${DOIAU_ENKF:-}" == "YES" ]]; then
   export restart_interval="3"
 else
   export restart_interval="6"
diff --git a/parm/config/gfs/config.esfc b/parm/config/gfs/config.esfc
index 2bb3d48bb4..684dea4ee3 100644
--- a/parm/config/gfs/config.esfc
+++ b/parm/config/gfs/config.esfc
@@ -12,8 +12,19 @@ echo "BEGIN: config.esfc"
 # Set DOSFCANL_ENKF=NO to prevent creation of sfcanl at
 # center of analysis window.
 
-if [ $DOIAU_ENKF = "YES" ]; then
+if [[ ${DOIAU_ENKF} = "YES" ]]; then
   export DOSFCANL_ENKF="NO"
 fi
 
+# Turn off NST in JEDIATMENS
+if [[ "${DO_JEDIATMENS}" == "YES" ]]; then
+  export DONST="NO"
+fi
+
+# set up soil analysis
+if [[ ${GSI_SOILANAL} = "YES" ]]; then
+  export DO_LNDINC=".true."
+  export LND_SOI_FILE="lnd_incr"
+fi
+
 echo "END: config.esfc"
diff --git a/parm/config/gfs/config.fcst b/parm/config/gfs/config.fcst
index a4c4ee8072..580bc1a06b 100644
--- a/parm/config/gfs/config.fcst
+++ b/parm/config/gfs/config.fcst
@@ -5,6 +5,8 @@
 echo "BEGIN: config.fcst"
 
+export USE_ESMF_THREADING="YES"  # Toggle to use ESMF-managed threading or traditional threading in UFSWM
+
 # Turn off waves if not used for this CDUMP
 case ${WAVE_CDUMP} in
   both | "${CDUMP/enkf}" ) ;;  # Don't change
@@ -21,6 +23,25 @@ string="--fv3 ${CASE}"
 # shellcheck disable=SC2086
 source "${EXPDIR}/config.ufs" ${string}
 
+# Forecast length for GFS forecast
+case ${RUN} in
+  *gfs)
+    # shellcheck disable=SC2153
+    export FHMAX=${FHMAX_GFS}
+    # shellcheck disable=SC2153
+    export FHOUT=${FHOUT_GFS}
+    export FHMAX_HF=${FHMAX_HF_GFS}
+    export FHOUT_HF=${FHOUT_HF_GFS}
+    export FHOUT_OCNICE=${FHOUT_OCNICE_GFS}
+    ;;
+  *gdas)
+    export FHMAX_HF=0
+    export FHOUT_HF=0
+    ;;
+  *)
+    echo "FATAL ERROR: Unsupported RUN '${RUN}'"
+    exit 1
+esac
 
 # Get task specific resources
 source "${EXPDIR}/config.resources" fcst
@@ -37,16 +58,14 @@ export esmf_logkind="ESMF_LOGKIND_MULTI_ON_ERROR" #Options: ESMF_LOGKIND_MULTI_O
 
 #######################################################################
 
-export FORECASTSH="${HOMEgfs}/scripts/exglobal_forecast.sh"
-#export FORECASTSH="${HOMEgfs}/scripts/exglobal_forecast.py"  # Temp. while this is worked on
-export FCSTEXECDIR="${HOMEgfs}/exec"
+export FORECASTSH="${SCRgfs}/exglobal_forecast.sh"
+#export FORECASTSH="${SCRgfs}/exglobal_forecast.py"  # Temp. while this is worked on
 export FCSTEXEC="ufs_model.x"
 
 #######################################################################
 # Model configuration
 export TYPE="nh"
 export MONO="non-mono"
-#JKHexport range_warn=".false."  ## JKH
 
 # Use stratosphere h2o physics
 export h2o_phys=".true."
@@ -98,9 +117,6 @@ if (( gwd_opt == 2 )); then
   export do_ugwp_v1_orog_only=".false."
   launch_level=$(echo "${LEVS}/2.35" |bc)
   export launch_level
-  if [[ ${do_gsl_drag_ls_bl} == ".true." ]]; then
-    export cdmbgwd=${cdmbgwd_gsl}
-  fi
 fi
 
 # Sponge layer settings
@@ -143,7 +159,11 @@ tbp=""
 if [[ "${progsigma}" == ".true." ]]; then tbp="_progsigma" ; fi
 
 # Radiation options
-export IAER=1011  ; #spectral band mapping method for aerosol optical properties
+if [[ "${DO_AERO}" == "YES" ]]; then
+  export IAER=2011  # spectral band mapping method for aerosol optical properties
+else
+  export IAER=1011
+fi
 export iovr_lw=3  ; #de-correlation length cloud overlap method (Barker, 2008)
 export iovr_sw=3  ; #de-correlation length cloud overlap method (Barker, 2008)
 export iovr=3     ; #de-correlation length cloud overlap method (Barker, 2008)
@@ -199,26 +219,31 @@ export random_clds=".true."
 case ${imp_physics} in
   99) # ZhaoCarr
     export ncld=1
-    export FIELD_TABLE="${HOMEgfs}/parm/ufs/fv3/field_table_zhaocarr${tbf}${tbp}"
+    export FIELD_TABLE="${PARMgfs}/ufs/fv3/field_table_zhaocarr${tbf}${tbp}"
     export nwat=2
     ;;
   6) # WSM6
     export ncld=2
-    export FIELD_TABLE="${HOMEgfs}/parm/ufs/fv3/field_table_wsm6${tbf}${tbp}"
+    export FIELD_TABLE="${PARMgfs}/ufs/fv3/field_table_wsm6${tbf}${tbp}"
    export nwat=6
    ;;
  8) # Thompson
    export ncld=2
+    export FIELD_TABLE="${PARMgfs}/ufs/fv3/field_table_thompson_noaero_tke${tbp}"
    export nwat=6
 
    export cal_pre=".false."
    export random_clds=".false."
    export effr_in=".true."
+    #JHK export lradar=".true."
    export ttendlim="-999"
+    #JHK export sedi_semi=.true.
+    #JHK export decfl=10
+#########JHK session###########
    if [[ "${CCPP_SUITE}" == "FV3_GFS_v17_p8_ugwpv1_mynn" || "${CCPP_SUITE}" == "FV3_GFS_v17_p8_ugwpv1_c3_mynn" || "${CCPP_SUITE}" == "FV3_GFS_v17_p8_mynn" || "${CCPP_SUITE}" == "FV3_GFS_v17_p8_c3_mynn" || "${CCPP_SUITE}" == "FV3_GFS_v17_p8_thompson" ]] ; then
      #JKH keep dt_inner $DELTIM/2 (75) if running aerosol-aware Thompson
@@ -241,7 +266,7 @@ case ${imp_physics} in
    ;;
  11) # GFDL
    export ncld=5
-    export FIELD_TABLE="${HOMEgfs}/parm/ufs/fv3/field_table_gfdl${tbf}${tbp}"
+    export FIELD_TABLE="${PARMgfs}/ufs/fv3/field_table_gfdl${tbf}${tbp}"
    export nwat=6
    export dnats=1
    export cal_pre=".false."
@@ -283,7 +308,7 @@ export FSICL="0"
 export FSICS="0"
 
 #---------------------------------------------------------------------
-
+#JHK kept this session since 2019!!!
 # ideflate: netcdf zlib lossless compression (0-9): 0 no compression
 # nbits: netcdf lossy compression level (0-32): 0 lossless
 export ideflate=1
@@ -300,7 +325,7 @@ export USE_COUPLER_RES="NO"
 
 if [[ "${CDUMP}" =~ "gdas" ]] ; then # GDAS cycle specific parameters
   # Variables used in DA cycling
-  export DIAG_TABLE="${HOMEgfs}/parm/ufs/fv3/diag_table_da"
+  export DIAG_TABLE="${PARMgfs}/ufs/fv3/diag_table_da"
 
   if [[ "${DOIAU}" == "YES" ]]; then
     export restart_interval="3"
@@ -314,7 +339,7 @@ if [[ "${CDUMP}" =~ "gdas" ]] ; then # GDAS cycle specific parameters
 elif [[ "${CDUMP}" =~ "gfs" ]] ; then # GFS cycle specific parameters
   # Write more variables to output
-  export DIAG_TABLE="${HOMEgfs}/parm/ufs/fv3/diag_table"
+  export DIAG_TABLE="${PARMgfs}/ufs/fv3/diag_table"
 
   # Write gfs restart files to rerun fcst from any break point
   export restart_interval=${restart_interval_gfs:-12}
diff --git a/parm/config/gfs/config.fit2obs b/parm/config/gfs/config.fit2obs
index 46baaa9e45..9b3fb87ead 100644
--- a/parm/config/gfs/config.fit2obs
+++ b/parm/config/gfs/config.fit2obs
@@ -8,8 +8,8 @@ echo "BEGIN: config.fit2obs"
 # Get task specific resources
 . "${EXPDIR}/config.resources" fit2obs
 
-export PRVT=${HOMEgfs}/fix/gsi/prepobs_errtable.global
-export HYBLEVS=${HOMEgfs}/fix/am/global_hyblev.l${LEVS}.txt
+export PRVT=${FIXgfs}/gsi/prepobs_errtable.global
+export HYBLEVS=${FIXgfs}/am/global_hyblev.l${LEVS}.txt
 
 export VBACKUP_FITS=24
 export OUTPUT_FILETYPE="netcdf"
diff --git a/parm/config/gfs/config.ice b/parm/config/gfs/config.ice
index 205458020f..055bd1e2bb 100644
--- a/parm/config/gfs/config.ice
+++ b/parm/config/gfs/config.ice
@@ -6,4 +6,9 @@ echo "BEGIN: config.ice"
 export min_seaice="1.0e-6"
 export use_cice_alb=".true."
 
+export MESH_ICE="mesh.mx${ICERES}.nc"
+
+export CICE_GRID="grid_cice_NEMS_mx${ICERES}.nc"
+export CICE_MASK="kmtu_cice_NEMS_mx${ICERES}.nc"
+
 echo "END: config.ice"
diff --git a/parm/config/gfs/config.landanl b/parm/config/gfs/config.landanl
deleted file mode 100644
index 70ebae7529..0000000000
--- a/parm/config/gfs/config.landanl
+++ /dev/null
@@ -1,34 +0,0 @@
-#! /usr/bin/env bash
-
-########## config.landanl ##########
-# configuration common to land analysis tasks
-
-echo "BEGIN: config.landanl"
-
-# Get task specific resources
-. "${EXPDIR}/config.resources" landanl
-
-obs_list_name=gdas_land_gts_only.yaml
-if [[ "${cyc}" = "18" ]]; then
-  obs_list_name=gdas_land_prototype.yaml
-fi
-
-export OBS_YAML_DIR=${HOMEgfs}/sorc/gdas.cd/parm/land/obs/config/
-export OBS_LIST=${HOMEgfs}/sorc/gdas.cd/parm/land/obs/lists/${obs_list_name}
-
-# Name of the JEDI executable and its yaml template
-export JEDIEXE="${HOMEgfs}/exec/fv3jedi_letkf.x"
-export JEDIYAML="${HOMEgfs}/sorc/gdas.cd/parm/land/letkfoi/letkfoi.yaml"
-
-# Ensemble member properties
-export SNOWDEPTHVAR="snodl"
-export BESTDDEV="30."  # Background Error Std. Dev. for LETKFOI
-
-# Name of the executable that applies increment to bkg and its namelist template
-export APPLY_INCR_EXE="${HOMEgfs}/exec/apply_incr.exe"
-export APPLY_INCR_NML_TMPL="${HOMEgfs}/sorc/gdas.cd/parm/land/letkfoi/apply_incr_nml.j2"
-
-export io_layout_x=@IO_LAYOUT_X@
-export io_layout_y=@IO_LAYOUT_Y@
-
-echo "END: config.landanl"
diff --git a/parm/config/gfs/config.metp b/parm/config/gfs/config.metp
index c90903f6a5..8260d1c472 100644
--- a/parm/config/gfs/config.metp
+++ b/parm/config/gfs/config.metp
@@ -23,6 +23,7 @@ export VERIF_GLOBALSH=${HOMEverif_global}/ush/run_verif_global_in_global_workflo
 export model=${PSLOT}
 export model_file_format="pgbf{lead?fmt=%2H}.${CDUMP}.{init?fmt=%Y%m%d%H}.grib2"
 export model_hpss_dir=${ATARDIR}/..
+export model_dir=${ARCDIR}/..
 export get_data_from_hpss="NO"
 export hpss_walltime="10"
 ## OUTPUT SETTINGS
diff --git a/parm/config/gfs/config.nsst b/parm/config/gfs/config.nsst
index db4367b2c0..7bda81f058 100644
--- a/parm/config/gfs/config.nsst
+++ b/parm/config/gfs/config.nsst
@@ -10,6 +10,11 @@ echo "BEGIN: config.nsst"
 # nstf_name(1) : NST_MODEL (NSST Model) : 0 = OFF, 1 = ON but uncoupled, 2 = ON and coupled
 export NST_MODEL=2
 
+# Set NST_MODEL for JEDIATMVAR or JEDIATMENS
+if [[ "${DO_JEDIATMVAR}" == "YES" || "${DO_JEDIATMENS}" == "YES" ]]; then
+  export NST_MODEL=1
+fi
+
 # nstf_name(2) : NST_SPINUP : 0 = OFF, 1 = ON,
 export NST_SPINUP=0
 cdate="${PDY}${cyc}"
diff --git a/parm/config/gfs/config.oceanice_products b/parm/config/gfs/config.oceanice_products
new file mode 100644
index 0000000000..9e5c5b1c68
--- /dev/null
+++ b/parm/config/gfs/config.oceanice_products
@@ -0,0 +1,15 @@
+#! /usr/bin/env bash
+
+########## config.oceanice_products ##########
+
+echo "BEGIN: config.oceanice_products"
+
+# Get task specific resources
+source "${EXPDIR}/config.resources" oceanice_products
+
+export OCEANICEPRODUCTS_CONFIG="${PARMgfs}/post/oceanice_products.yaml"
+
+# No. of forecast hours to process in a single job
+export NFHRS_PER_GROUP=3
+
+echo "END: config.oceanice_products"
diff --git a/parm/config/gfs/config.ocn b/parm/config/gfs/config.ocn
index 37f6a966aa..317a76e58a 100644
--- a/parm/config/gfs/config.ocn
+++ b/parm/config/gfs/config.ocn
@@ -2,8 +2,7 @@
 echo "BEGIN: config.ocn"
 
-# MOM_input template to use
-export MOM_INPUT="MOM_input_template_${OCNRES}"
+export MESH_OCN="mesh.mx${OCNRES}.nc"
 
 export DO_OCN_SPPT="NO"       # In MOM_input, this variable is determines OCN_SPPT (OCN_SPPT = True|False)
 export DO_OCN_PERT_EPBL="NO"  # In MOM_input, this variable determines PERT_EPBL (PERT_EPBL = True|False)
@@ -17,6 +16,14 @@ if [[ "${DO_JEDIOCNVAR}" == "YES" ]]; then
 else
   export ODA_INCUPD="False"
 fi
-export ODA_INCUPD_NHOURS="3.0" # In MOM_input, this is time interval for applying increment
+
+# Time interval for applying the increment
+if [[ "${DOIAU}" == "YES" ]]; then
+  export ODA_INCUPD_NHOURS="6.0"
+else
+  export ODA_INCUPD_NHOURS="3.0"
+fi
+
 
 echo "END: config.ocn"
diff --git a/parm/config/gfs/config.ocnanalecen b/parm/config/gfs/config.ocnanalecen
new file mode 100644
index 0000000000..b64c2bcf62
--- /dev/null
+++ b/parm/config/gfs/config.ocnanalecen
@@ -0,0 +1,11 @@
+#!/bin/bash
+
+########## config.ocnanalecen ##########
+# Ocn Analysis specific
+
+echo "BEGIN: config.ocnanalecen"
+
+# Get task specific resources
+. "${EXPDIR}/config.resources" ocnanalecen
+
+echo "END: config.ocnanalecen"
diff --git a/parm/config/gfs/config.ocnpost b/parm/config/gfs/config.ocnpost
deleted file mode 100644
index 851c476e6c..0000000000
--- a/parm/config/gfs/config.ocnpost
+++ /dev/null
@@ -1,29 +0,0 @@
-#! /usr/bin/env bash
-
-########## config.ocnpost ##########
-
-echo "BEGIN: config.ocnpost"
-
-# Get task specific resources
-source "${EXPDIR}/config.resources" ocnpost
-
-# Convert netcdf files to grib files using post job
-#-------------------------------------------
-case "${OCNRES}" in
-  "025") export MAKE_OCN_GRIB="YES";;
-  "050") export MAKE_OCN_GRIB="NO";;
-  "100") export MAKE_OCN_GRIB="NO";;
-  "500") export MAKE_OCN_GRIB="NO";;
-  *) export MAKE_OCN_GRIB="NO";;
-esac
-
-if [[ "${machine}" = "WCOSS2" ]] || [[ "${machine}" = "HERCULES" ]]; then
-  #Currently the conversion to netcdf uses NCL which is not on WCOSS2 or HERCULES
-  #This should be removed when this is updated
-  export MAKE_OCN_GRIB="NO"
-fi
-
-# No. of forecast hours to process in a single job
-export NFHRS_PER_GROUP=3
-
-echo "END: config.ocnpost"
diff --git a/parm/config/gfs/config.postsnd b/parm/config/gfs/config.postsnd
index 53d66bf4f6..7ec0ad6321 100644
--- a/parm/config/gfs/config.postsnd
+++ b/parm/config/gfs/config.postsnd
@@ -8,7 +8,6 @@ echo "BEGIN: config.postsnd"
 # Get task specific resources
 . $EXPDIR/config.resources postsnd
 
-export POSTSNDSH=$HOMEgfs/jobs/JGFS_ATMOS_POSTSND
 export ENDHOUR=180
 if [[ "$FHMAX_GFS" -lt "$ENDHOUR" ]] ; then export ENDHOUR=$FHMAX_GFS ; fi
diff --git a/parm/config/gfs/config.prep b/parm/config/gfs/config.prep
index d5ac1925f7..6009280db0 100644
--- a/parm/config/gfs/config.prep
+++ b/parm/config/gfs/config.prep
@@ -13,33 +13,28 @@ export cdate10=${PDY}${cyc}
 # Relocation and syndata QC
 export PROCESS_TROPCY=${PROCESS_TROPCY:-NO}
-export TROPCYQCRELOSH="$HOMEgfs/scripts/exglobal_atmos_tropcy_qc_reloc.sh"
+export TROPCYQCRELOSH="${SCRgfs}/exglobal_atmos_tropcy_qc_reloc.sh"
 
 export COMINtcvital=${COMINtcvital:-${DMPDIR}/${CDUMP}.${PDY}/${cyc}/atmos}
 export COMINsyn=${COMINsyn:-$(compath.py ${envir}/com/gfs/${gfs_ver})/syndat}
 
-export HOMERELO=$HOMEgfs
-export EXECRELO=${HOMERELO}/exec
-export FIXRELO=${HOMERELO}/fix/am
-export USHRELO=${HOMERELO}/ush
-
 # Adjust observation error for GFS v16 parallels
 #
 # NOTE: Remember to set OBERROR in config.anal as PRVT is set below
 #
 # Set default prepobs_errtable.global
-export PRVT=$FIXgsi/prepobs_errtable.global
+export PRVT=${FIXgfs}/gsi/prepobs_errtable.global
 
 # Set prepobs.errtable.global for GFS v16 retrospective parallels
 if [[ $RUN_ENVIR == "emc" ]]; then
   if [[ "${PDY}${cyc}" -ge "2019021900" && "${PDY}${cyc}" -lt "2019110706" ]]; then
-    export PRVT=$FIXgsi/gfsv16_historical/prepobs_errtable.global.2019021900
+    export PRVT=${FIXgfs}/gsi/gfsv16_historical/prepobs_errtable.global.2019021900
   fi
 
   # Place GOES-15 AMVs in monitor, assimilate GOES-17 AMVs, assimilate KOMPSAT-5 gps
   if [[ "${PDY}${cyc}" -ge "2019110706" && "${PDY}${cyc}" -lt "2020040718" ]]; then
-    export PRVT=$FIXgsi/gfsv16_historical/prepobs_errtable.global.2019110706
+    export PRVT=${FIXgfs}/gsi/gfsv16_historical/prepobs_errtable.global.2019110706
   fi
 
 # NOTE:
diff --git a/parm/config/gfs/config.prepatmiodaobs b/parm/config/gfs/config.prepatmiodaobs
index ed9b246120..e29cf67b07 100644
--- a/parm/config/gfs/config.prepatmiodaobs
+++ b/parm/config/gfs/config.prepatmiodaobs
@@ -8,7 +8,4 @@ echo "BEGIN: config.prepatmiodaobs"
 # Get task specific resources
 . "${EXPDIR}/config.resources" prepatmiodaobs
 
-export BUFR2IODASH="${HOMEgfs}/ush/run_bufr2ioda.py"
-export IODAPARM="${HOMEgfs}/sorc/gdas.cd/parm/ioda/bufr2ioda"
-
 echo "END: config.prepatmiodaobs"
diff --git a/parm/config/gfs/config.preplandobs b/parm/config/gfs/config.preplandobs
deleted file mode 100644
index 20ae20b5ad..0000000000
--- a/parm/config/gfs/config.preplandobs
+++ /dev/null
@@ -1,18 +0,0 @@
-#! /usr/bin/env bash
-
-########## config.preplandobs ##########
-# Land Obs Prep specific
-
-echo "BEGIN: config.preplandobs"
-
-# Get task specific resources
-. "${EXPDIR}/config.resources" preplandobs
-
-export GTS_OBS_LIST="${HOMEgfs}/sorc/gdas.cd/parm/land/prep/prep_gts.yaml"
-export BUFR2IODAX="${HOMEgfs}/exec/bufr2ioda.x"
-export FIMS_NML_TMPL="${HOMEgfs}/sorc/gdas.cd/parm/land/prep/fims.nml.j2"
-export IMS_OBS_LIST="${HOMEgfs}/sorc/gdas.cd/parm/land/prep/prep_ims.yaml"
-export CALCFIMSEXE="${HOMEgfs}/exec/calcfIMS.exe"
-export IMS2IODACONV="${HOMEgfs}/ush/imsfv3_scf2ioda.py"
-
-echo "END: config.preplandobs"
diff --git a/parm/config/gfs/config.prepoceanobs b/parm/config/gfs/config.prepoceanobs
index d7c4e37bb9..56fc349ce2 100644
--- a/parm/config/gfs/config.prepoceanobs
+++ b/parm/config/gfs/config.prepoceanobs
@@ -7,7 +7,7 @@ echo "BEGIN: config.prepoceanobs"
 
 export OCNOBS2IODAEXEC=${HOMEgfs}/sorc/gdas.cd/build/bin/gdas_obsprovider2ioda.x
 export OBS_YAML_DIR=${HOMEgfs}/sorc/gdas.cd/parm/soca/obs/config
-export OBSPROC_YAML=@OBSPROC_YAML@
+export OBSPREP_YAML=@OBSPREP_YAML@
 export OBS_LIST=@SOCA_OBS_LIST@
 [[ -n "${OBS_LIST}" ]] || export OBS_LIST=${HOMEgfs}/sorc/gdas.cd/parm/soca/obs/obs_list.yaml
 export OBS_YAML=${OBS_LIST}
diff --git a/parm/config/gfs/config.prepsnowobs b/parm/config/gfs/config.prepsnowobs
new file mode 100644
index 0000000000..60ca16ce9e
--- /dev/null
+++ b/parm/config/gfs/config.prepsnowobs
@@ -0,0 +1,21 @@
+#! /usr/bin/env bash
+
+########## config.prepsnowobs ##########
+# Snow Obs Prep specific
+
+echo "BEGIN: config.prepsnowobs"
+
+# Get task specific resources
+. "${EXPDIR}/config.resources" prepsnowobs
+
+export GTS_OBS_LIST="${PARMgfs}/gdas/snow/prep/prep_gts.yaml.j2"
+export IMS_OBS_LIST="${PARMgfs}/gdas/snow/prep/prep_ims.yaml.j2"
+
+export BUFR2IODAX="${EXECgfs}/bufr2ioda.x"
+
+export CALCFIMSEXE="${EXECgfs}/calcfIMS.exe"
+export FIMS_NML_TMPL="${PARMgfs}/gdas/snow/prep/fims.nml.j2"
+
+export IMS2IODACONV="${USHgfs}/imsfv3_scf2ioda.py"
+
+echo "END: config.prepsnowobs"
diff --git a/parm/config/gfs/config.resources b/parm/config/gfs/config.resources
index c179c33df4..98fc3b2668 100644
--- a/parm/config/gfs/config.resources
+++ b/parm/config/gfs/config.resources
@@ -4,26 +4,26 @@
 # Set resource information for job tasks
 # e.g. walltime, node, cores per node, memory etc.
 
-if [[ $# -ne 1 ]]; then
+if (( $# != 1 )); then
   echo "Must specify an input task argument to set resource variables!"
   echo "argument can be any one of the following:"
   echo "stage_ic aerosol_init"
-  echo "prep preplandobs prepatmiodaobs"
+  echo "prep prepsnowobs prepatmiodaobs"
   echo "atmanlinit atmanlrun atmanlfinal"
   echo "atmensanlinit atmensanlrun atmensanlfinal"
-  echo "landanl"
+  echo "snowanl"
   echo "aeroanlinit aeroanlrun aeroanlfinal"
   echo "anal sfcanl analcalc analdiag fcst echgres"
   echo "upp atmos_products"
   echo "tracker genesis genesis_fsu"
   echo "verfozn verfrad vminmon fit2obs metp arch cleanup"
   echo "eobs ediag eomg eupd ecen esfc efcs epos earc"
-  echo "init_chem mom6ic ocnpost"
+  echo "init_chem mom6ic oceanice_products"
   echo "waveinit waveprep wavepostsbs wavepostbndpnt wavepostbndpntbll wavepostpnt"
   echo "wavegempak waveawipsbulls waveawipsgridded"
   echo "postsnd awips gempak npoess"
-  echo "ocnanalprep prepoceanobs ocnanalbmat ocnanalrun ocnanalchkpt ocnanalpost ocnanalvrfy"
+  echo "ocnanalprep prepoceanobs ocnanalbmat ocnanalrun ocnanalecen ocnanalchkpt ocnanalpost ocnanalvrfy"
   exit 1
 fi
 
@@ -32,672 +32,692 @@ step=$1
 echo "BEGIN: config.resources"
 
-if [[ "${machine}" = "WCOSS2" ]]; then
-  export npe_node_max=128
-elif [[ "${machine}" = "JET" ]]; then
-  if [[ "${PARTITION_POST_BATCH}" = "sjet" ]]; then
-    export npe_node_max=16
-  elif [[ "${PARTITION_BATCH}" = "xjet" ]]; then
-    export npe_node_max=24
-  elif [[ ${PARTITION_BATCH} = "vjet" || ${PARTITION_BATCH} = "sjet" ]]; then
-    export npe_node_max=16
-  elif [[ ${PARTITION_BATCH} = "kjet" ]]; then
-    export npe_node_max=40
-  fi
-elif [[ "${machine}" = "HERA" ]]; then
-  export npe_node_max=40
-elif [[ "${machine}" = "S4" ]]; then
-  if [[ ${PARTITION_BATCH} = "s4" ]]; then
-    export npe_node_max=32
-  elif [[ ${PARTITION_BATCH} = "ivy" ]]; then
-    export npe_node_max=20
-  fi
-elif [[ "${machine}" = "AWSPW" ]]; then
-  export PARTITION_BATCH="compute"
-  export npe_node_max=40
-elif [[ "${machine}" = "ORION" ]]; then
-  export npe_node_max=40
-elif [[ "${machine}" = "HERCULES" ]]; then
-  export npe_node_max=80
-fi
+case ${machine} in
+  "WCOSS2")   npe_node_max=128;;
+  "HERA")     npe_node_max=40;;
+  "ORION")    npe_node_max=40;;
+  "HERCULES") npe_node_max=80;;
+  "JET")
+    case ${PARTITION_BATCH} in
+      "xjet")          npe_node_max=24;;
+      "vjet" | "sjet") npe_node_max=16;;
+      "kjet")          npe_node_max=40;;
+      *)
+        echo "FATAL ERROR: Unknown partition ${PARTITION_BATCH} specified for ${machine}"
+        exit 3
+    esac
+    ;;
+  "S4")
+    case ${PARTITION_BATCH} in
+      "s4")  npe_node_max=32;;
+      "ivy") npe_node_max=20;;
+      *)
+        echo "FATAL ERROR: Unknown partition ${PARTITION_BATCH} specified for ${machine}"
+        exit 3
+    esac
+    ;;
+  "AWSPW")
+    export PARTITION_BATCH="compute"
+    npe_node_max=40
+    ;;
+  "CONTAINER")
+    npe_node_max=1
+    ;;
+  *)
+    echo "FATAL ERROR: Unknown machine encountered by ${BASH_SOURCE[0]}"
+    exit 2
+    ;;
+esac
+export npe_node_max
 
-if [[ "${step}" = "prep" ]]; then
+case ${step} in
+  "prep")
    export wtime_prep='00:30:00'
    export npe_prep=4
    export npe_node_prep=2
    export nth_prep=1
-    if [[ "${machine}" = "WCOSS2" ]]; then
+    if [[ "${machine}" == "WCOSS2" ]]; then
      export is_exclusive=True
    else
-      export memory_prep="40G"
+      export memory_prep="40GB"
    fi
+    ;;
 
-elif [[ "${step}" = "preplandobs" ]]; then
-  export wtime_preplandobs="00:05:00"
-  npe_preplandobs=1
-  export npe_preplandobs
-  export nth_preplandobs=1
-  npe_node_preplandobs=1
-  export npe_node_preplandobs
+  "prepsnowobs")
+    export wtime_prepsnowobs="00:05:00"
+    export npe_prepsnowobs=1
+    export nth_prepsnowobs=1
+    export npe_node_prepsnowobs=1
+    ;;
 
-elif [[ "${step}" = "prepatmiodaobs" ]]; then
-  export wtime_prepatmiodaobs="00:10:00"
+  "prepatmiodaobs")
+    export wtime_prepatmiodaobs="00:30:00"
    export npe_prepatmiodaobs=1
    export nth_prepatmiodaobs=1
-  npe_node_prepatmiodaobs=$(echo "${npe_node_max} / ${nth_prepatmiodaobs}" | bc)
-  export npe_node_prepatmiodaobs
+    export npe_node_prepatmiodaobs=$(( npe_node_max / nth_prepatmiodaobs ))
+    ;;
 
-elif [[ "${step}" = "aerosol_init" ]]; then
+  "aerosol_init")
    export wtime_aerosol_init="00:05:00"
    export npe_aerosol_init=1
    export nth_aerosol_init=1
-  npe_node_aerosol_init=$(echo "${npe_node_max} / ${nth_aerosol_init}" | bc)
-  export npe_node_aerosol_init
+    export npe_node_aerosol_init=$(( npe_node_max / nth_aerosol_init ))
    export NTASKS=${npe_aerosol_init}
-  export memory_aerosol_init="6G"
-
-elif [[ "${step}" = "waveinit" ]]; then
+    export memory_aerosol_init="6GB"
+    ;;
 
+  "waveinit")
    export wtime_waveinit="00:10:00"
    export npe_waveinit=12
    export nth_waveinit=1
-  npe_node_waveinit=$(echo "${npe_node_max} / ${nth_waveinit}" | bc)
-  export npe_node_waveinit
+    export npe_node_waveinit=$(( npe_node_max / nth_waveinit ))
    export NTASKS=${npe_waveinit}
    export memory_waveinit="2GB"
+    ;;
 
-elif [[ "${step}" = "waveprep" ]]; then
-
+  "waveprep")
    export wtime_waveprep="00:10:00"
    export npe_waveprep=5
    export npe_waveprep_gfs=65
    export nth_waveprep=1
    export nth_waveprep_gfs=1
-  npe_node_waveprep=$(echo "${npe_node_max} / ${nth_waveprep}" | bc)
-  export npe_node_waveprep
-  npe_node_waveprep_gfs=$(echo "${npe_node_max} / ${nth_waveprep_gfs}" | bc)
-  export npe_node_waveprep_gfs
+    export npe_node_waveprep=$(( npe_node_max / nth_waveprep ))
+    export npe_node_waveprep_gfs=$(( npe_node_max / nth_waveprep_gfs ))
    export NTASKS=${npe_waveprep}
    export NTASKS_gfs=${npe_waveprep_gfs}
    export memory_waveprep="100GB"
    export memory_waveprep_gfs="150GB"
+    ;;
 
-elif [[ "${step}" = "wavepostsbs" ]]; then
-
+  "wavepostsbs")
    export wtime_wavepostsbs="00:20:00"
    export wtime_wavepostsbs_gfs="03:00:00"
    export npe_wavepostsbs=8
    export nth_wavepostsbs=1
-  npe_node_wavepostsbs=$(echo "${npe_node_max} / ${nth_wavepostsbs}" | bc)
-  export npe_node_wavepostsbs
+    export npe_node_wavepostsbs=$(( npe_node_max / nth_wavepostsbs ))
    export NTASKS=${npe_wavepostsbs}
    export memory_wavepostsbs="10GB"
    export memory_wavepostsbs_gfs="10GB"
+    ;;
 
-elif [[ "${step}" = "wavepostbndpnt" ]]; then
-
+  "wavepostbndpnt")
    export wtime_wavepostbndpnt="01:00:00"
    export npe_wavepostbndpnt=240
    export nth_wavepostbndpnt=1
-  npe_node_wavepostbndpnt=$(echo "${npe_node_max} / ${nth_wavepostbndpnt}" | bc)
-  export npe_node_wavepostbndpnt
+    export npe_node_wavepostbndpnt=$(( npe_node_max / nth_wavepostbndpnt ))
    export NTASKS=${npe_wavepostbndpnt}
    export is_exclusive=True
+    ;;
 
-elif [[ "${step}" = "wavepostbndpntbll" ]]; then
-
+  "wavepostbndpntbll")
    export wtime_wavepostbndpntbll="01:00:00"
    export npe_wavepostbndpntbll=448
    export nth_wavepostbndpntbll=1
-  npe_node_wavepostbndpntbll=$(echo "${npe_node_max} / ${nth_wavepostbndpntbll}" | bc)
-  export npe_node_wavepostbndpntbll
+    export npe_node_wavepostbndpntbll=$(( npe_node_max / nth_wavepostbndpntbll ))
    export NTASKS=${npe_wavepostbndpntbll}
    export is_exclusive=True
+    ;;
 
-elif [[ "${step}" = "wavepostpnt" ]]; then
-
+  "wavepostpnt")
    export wtime_wavepostpnt="04:00:00"
    export npe_wavepostpnt=200
    export nth_wavepostpnt=1
-  npe_node_wavepostpnt=$(echo "${npe_node_max} / ${nth_wavepostpnt}" | bc)
-  export npe_node_wavepostpnt
+    export npe_node_wavepostpnt=$(( npe_node_max / nth_wavepostpnt ))
    export NTASKS=${npe_wavepostpnt}
    export is_exclusive=True
+    ;;
 
-elif [[ "${step}" = "wavegempak" ]]; then
-
+  "wavegempak")
    export wtime_wavegempak="02:00:00"
    export npe_wavegempak=1
    export nth_wavegempak=1
-  npe_node_wavegempak=$(echo "${npe_node_max} / ${nth_wavegempak}" | bc)
-  export npe_node_wavegempak
+    export npe_node_wavegempak=$(( npe_node_max / nth_wavegempak ))
    export NTASKS=${npe_wavegempak}
    export memory_wavegempak="1GB"
+    ;;
 
-elif [[ "${step}" = "waveawipsbulls" ]]; then
-
+  "waveawipsbulls")
    export wtime_waveawipsbulls="00:20:00"
    export npe_waveawipsbulls=1
    export nth_waveawipsbulls=1
-  npe_node_waveawipsbulls=$(echo "${npe_node_max} / ${nth_waveawipsbulls}" | bc)
-  export npe_node_waveawipsbulls
+    export npe_node_waveawipsbulls=$(( npe_node_max / nth_waveawipsbulls ))
    export NTASKS=${npe_waveawipsbulls}
    export is_exclusive=True
+    ;;
 
-elif [[ ${step} = "waveawipsgridded" ]]; then
-
+  "waveawipsgridded")
    export
wtime_waveawipsgridded="02:00:00" export npe_waveawipsgridded=1 export nth_waveawipsgridded=1 - npe_node_waveawipsgridded=$(echo "${npe_node_max} / ${nth_waveawipsgridded}" | bc) - export npe_node_waveawipsgridded + export npe_node_waveawipsgridded=$(( npe_node_max / nth_waveawipsgridded )) export NTASKS=${npe_waveawipsgridded} export memory_waveawipsgridded_gfs="1GB" + ;; -elif [[ "${step}" = "atmanlinit" ]]; then + "atmanlinit") + export layout_x=${layout_x_atmanl} + export layout_y=${layout_y_atmanl} - # make below case dependent later - export layout_x=1 - export layout_y=1 - - layout_gsib_x=$(echo "${layout_x} * 3" | bc) - export layout_gsib_x - layout_gsib_y=$(echo "${layout_y} * 2" | bc) - export layout_gsib_y + export layout_gsib_x=$(( layout_x * 3 )) + export layout_gsib_y=$(( layout_y * 2 )) export wtime_atmanlinit="00:10:00" export npe_atmanlinit=1 export nth_atmanlinit=1 - npe_node_atmanlinit=$(echo "${npe_node_max} / ${nth_atmanlinit}" | bc) + export npe_node_atmanlinit=$(( npe_node_max / nth_atmanlinit )) export npe_node_atmanlinit export memory_atmanlinit="3072M" + ;; -elif [[ "${step}" = "atmanlrun" ]]; then - - # make below case dependent later - export layout_x=1 - export layout_y=1 + "atmanlrun") + export layout_x=${layout_x_atmanl} + export layout_y=${layout_y_atmanl} export wtime_atmanlrun="00:30:00" - npe_atmanlrun=$(echo "${layout_x} * ${layout_y} * 6" | bc) - export npe_atmanlrun - npe_atmanlrun_gfs=$(echo "${layout_x} * ${layout_y} * 6" | bc) - export npe_atmanlrun_gfs + export npe_atmanlrun=$(( layout_x * layout_y * 6 )) + export npe_atmanlrun_gfs=$(( layout_x * layout_y * 6 )) export nth_atmanlrun=1 export nth_atmanlrun_gfs=${nth_atmanlrun} - npe_node_atmanlrun=$(echo "${npe_node_max} / ${nth_atmanlrun}" | bc) - export npe_node_atmanlrun + export npe_node_atmanlrun=$(( npe_node_max / nth_atmanlrun )) + export memory_atmanlrun="96GB" export is_exclusive=True + ;; -elif [[ "${step}" = "atmanlfinal" ]]; then - + "atmanlfinal") export 
wtime_atmanlfinal="00:30:00" export npe_atmanlfinal=${npe_node_max} export nth_atmanlfinal=1 - npe_node_atmanlfinal=$(echo "${npe_node_max} / ${nth_atmanlfinal}" | bc) - export npe_node_atmanlfinal + export npe_node_atmanlfinal=$(( npe_node_max / nth_atmanlfinal )) export is_exclusive=True + ;; -elif [[ "${step}" = "landanl" ]]; then - # below lines are for creating JEDI YAML - case ${CASE} in - C768) + "snowanl") + # below lines are for creating JEDI YAML + case ${CASE} in + "C768") layout_x=6 layout_y=6 ;; - C384) + "C384") layout_x=5 layout_y=5 ;; - C192 | C96 | C48) + "C192" | "C96" | "C48") layout_x=1 layout_y=1 ;; - *) - echo "FATAL ERROR: Resolution not supported for land analysis'" - exit 1 - esac - - export layout_x - export layout_y - - export wtime_landanl="00:15:00" - npe_landanl=$(echo "${layout_x} * ${layout_y} * 6" | bc) - export npe_landanl - export nth_landanl=1 - npe_node_landanl=$(echo "${npe_node_max} / ${nth_landanl}" | bc) - export npe_node_landanl - -elif [[ "${step}" = "aeroanlinit" ]]; then - - # below lines are for creating JEDI YAML - case ${CASE} in - C768) + *) + echo "FATAL ERROR: Resources not defined for job ${job} at resolution ${CASE}" + exit 4 + esac + + export layout_x + export layout_y + + export wtime_snowanl="00:15:00" + export npe_snowanl=$(( layout_x * layout_y * 6 )) + export nth_snowanl=1 + export npe_node_snowanl=$(( npe_node_max / nth_snowanl )) + ;; + + "aeroanlinit") + # below lines are for creating JEDI YAML + case ${CASE} in + "C768") layout_x=8 layout_y=8 ;; - C384) + "C384") layout_x=8 layout_y=8 ;; - C192 | C96) + "C192" | "C96") layout_x=8 layout_y=8 ;; - C48 ) + "C48" ) # this case is for testing only layout_x=1 layout_y=1 ;; *) - echo "FATAL ERROR: Resolution not supported for aerosol analysis'" - exit 1 + echo "FATAL ERROR: Resources not defined for job ${job} at resolution ${CASE}" + exit 4 esac export layout_x export layout_y - export wtime_aeroanlinit="00:10:00" export npe_aeroanlinit=1 export 
nth_aeroanlinit=1 - npe_node_aeroanlinit=$(echo "${npe_node_max} / ${nth_aeroanlinit}" | bc) - export npe_node_aeroanlinit + export npe_node_aeroanlinit=$(( npe_node_max / nth_aeroanlinit )) export memory_aeroanlinit="3072M" + ;; -elif [[ "${step}" = "aeroanlrun" ]]; then - - case ${CASE} in - C768) + "aeroanlrun") + case ${CASE} in + "C768") layout_x=8 layout_y=8 ;; - C384) + "C384") layout_x=8 layout_y=8 ;; - C192 | C96) + "C192" | "C96") layout_x=8 layout_y=8 ;; - C48 ) + "C48" ) # this case is for testing only layout_x=1 layout_y=1 ;; *) - echo "FATAL ERROR: Resolution ${CASE} is not supported, ABORT!" - exit 1 + echo "FATAL ERROR: Resources not defined for job ${job} at resolution ${CASE}" + exit 4 esac export layout_x export layout_y export wtime_aeroanlrun="00:30:00" - npe_aeroanlrun=$(echo "${layout_x} * ${layout_y} * 6" | bc) - export npe_aeroanlrun - npe_aeroanlrun_gfs=$(echo "${layout_x} * ${layout_y} * 6" | bc) - export npe_aeroanlrun_gfs + export npe_aeroanlrun=$(( layout_x * layout_y * 6 )) + export npe_aeroanlrun_gfs=$(( layout_x * layout_y * 6 )) export nth_aeroanlrun=1 export nth_aeroanlrun_gfs=1 - npe_node_aeroanlrun=$(echo "${npe_node_max} / ${nth_aeroanlrun}" | bc) - export npe_node_aeroanlrun + export npe_node_aeroanlrun=$(( npe_node_max / nth_aeroanlrun )) export is_exclusive=True + ;; -elif [[ "${step}" = "aeroanlfinal" ]]; then - + "aeroanlfinal") export wtime_aeroanlfinal="00:10:00" export npe_aeroanlfinal=1 export nth_aeroanlfinal=1 - npe_node_aeroanlfinal=$(echo "${npe_node_max} / ${nth_aeroanlfinal}" | bc) - export npe_node_aeroanlfinal + export npe_node_aeroanlfinal=$(( npe_node_max / nth_aeroanlfinal )) export memory_aeroanlfinal="3072M" + ;; -elif [[ "${step}" = "ocnanalprep" ]]; then - + "ocnanalprep") export wtime_ocnanalprep="00:10:00" export npe_ocnanalprep=1 export nth_ocnanalprep=1 - npe_node_ocnanalprep=$(echo "${npe_node_max} / ${nth_ocnanalprep}" | bc) - export npe_node_ocnanalprep + export npe_node_ocnanalprep=$(( 
npe_node_max / nth_ocnanalprep )) export memory_ocnanalprep="24GB" + ;; -elif [[ "${step}" = "prepoceanobs" ]]; then - + "prepoceanobs") export wtime_prepoceanobs="00:10:00" export npe_prepoceanobs=1 export nth_prepoceanobs=1 - npe_node_prepoceanobs=$(echo "${npe_node_max} / ${nth_prepoceanobs}" | bc) - export npe_node_prepoceanobs - export memory_prepoceanobs="24GB" - - -elif [[ "${step}" = "ocnanalbmat" ]]; then - npes=16 - case ${CASE} in - C384) - npes=480 - ;; - C96) - npes=16 - ;; - C48) - npes=16 - ;; + export npe_node_prepoceanobs=$(( npe_node_max / nth_prepoceanobs )) + export memory_prepoceanobs="48GB" + ;; + + "ocnanalbmat") + npes=16 + case ${OCNRES} in + "025") npes=480;; + "050") npes=16;; + "500") npes=16;; *) - echo "FATAL: Resolution not supported'" - exit 1 + echo "FATAL ERROR: Resources not defined for job ${job} at resolution ${OCNRES}" + exit 4 esac export wtime_ocnanalbmat="00:30:00" export npe_ocnanalbmat=${npes} export nth_ocnanalbmat=1 export is_exclusive=True - npe_node_ocnanalbmat=$(echo "${npe_node_max} / ${nth_ocnanalbmat}" | bc) - export npe_node_ocnanalbmat + export npe_node_ocnanalbmat=$(( npe_node_max / nth_ocnanalbmat )) + ;; -elif [[ "${step}" = "ocnanalrun" ]]; then - npes=16 - case ${CASE} in - C384) + "ocnanalrun") + npes=16 + case ${OCNRES} in + "025") npes=480 - memory_ocnanalrun="128GB" + memory_ocnanalrun="96GB" ;; - C96) + "050") npes=16 + memory_ocnanalrun="96GB" ;; - C48) + "500") npes=16 - memory_ocnanalrun="64GB" + memory_ocnanalrun="24GB" ;; *) - echo "FATAL: Resolution not supported'" - exit 1 + echo "FATAL ERROR: Resources not defined for job ${job} at resolution ${OCNRES}" + exit 4 esac export wtime_ocnanalrun="00:15:00" export npe_ocnanalrun=${npes} - export nth_ocnanalrun=2 + export nth_ocnanalrun=1 export is_exclusive=True - npe_node_ocnanalrun=$(echo "${npe_node_max} / ${nth_ocnanalrun}" | bc) - export npe_node_ocnanalrun + export npe_node_ocnanalrun=$(( npe_node_max / nth_ocnanalrun )) export memory_ocnanalrun 
- -elif [[ "${step}" = "ocnanalchkpt" ]]; then - - export wtime_ocnanalchkpt="00:10:00" - export npe_ocnanalchkpt=1 - export nth_ocnanalchkpt=1 - npe_node_ocnanalchkpt=$(echo "${npe_node_max} / ${nth_ocnanalchkpt}" | bc) - export npe_node_ocnanalchkpt - case ${CASE} in - C384) - export memory_ocnanalchkpt="128GB" + ;; + + "ocnanalecen") + npes=16 + case ${OCNRES} in + "025") + npes=40 + memory_ocnanalecen="96GB" ;; - C96) - export memory_ocnanalchkpt="32GB" + "050") + npes=16 + memory_ocnanalecen="96GB" ;; - C48) - export memory_ocnanalchkpt="32GB" + "500") + npes=16 + memory_ocnanalecen="24GB" ;; *) - echo "FATAL: Resolution not supported'" - exit 1 + echo "FATAL ERROR: Resources not defined for job ${job} at resolution ${OCNRES}" + exit 4 esac -elif [[ "${step}" = "ocnanalpost" ]]; then + export wtime_ocnanalecen="00:10:00" + export npe_ocnanalecen=${npes} + export nth_ocnanalecen=1 + export is_exclusive=True + export npe_node_ocnanalecen=$(( npe_node_max / nth_ocnanalecen )) + export memory_ocnanalecen + ;; + + "ocnanalchkpt") + export wtime_ocnanalchkpt="00:10:00" + export npe_ocnanalchkpt=1 + export nth_ocnanalchkpt=1 + export npe_node_ocnanalchkpt=$(( npe_node_max / nth_ocnanalchkpt )) + case ${OCNRES} in + "025") + memory_ocnanalchkpt="128GB" + npes=40;; + "050") + memory_ocnanalchkpt="32GB" + npes=16;; + "500") + memory_ocnanalchkpt="32GB" + npes=8;; + *) + echo "FATAL ERROR: Resources not defined for job ${job} at resolution ${OCNRES}" + exit 4 + esac + export npe_ocnanalchkpt=${npes} + export memory_ocnanalchkpt + ;; + "ocnanalpost") export wtime_ocnanalpost="00:30:00" export npe_ocnanalpost=${npe_node_max} export nth_ocnanalpost=1 - npe_node_ocnanalpost=$(echo "${npe_node_max} / ${nth_ocnanalpost}" | bc) - export npe_node_ocnanalpost - -elif [[ "${step}" = "ocnanalvrfy" ]]; then + export npe_node_ocnanalpost=$(( npe_node_max / nth_ocnanalpost )) + ;; + "ocnanalvrfy") export wtime_ocnanalvrfy="00:35:00" export npe_ocnanalvrfy=1 export nth_ocnanalvrfy=1 - 
npe_node_ocnanalvrfy=$(echo "${npe_node_max} / ${nth_ocnanalvrfy}" | bc) - export npe_node_ocnanalvrfy + export npe_node_ocnanalvrfy=$(( npe_node_max / nth_ocnanalvrfy )) export memory_ocnanalvrfy="24GB" + ;; -elif [[ "${step}" = "anal" ]]; then - + "anal") export wtime_anal="00:50:00" export wtime_anal_gfs="00:40:00" export npe_anal=780 export nth_anal=5 export npe_anal_gfs=825 export nth_anal_gfs=5 - if [[ "${machine}" = "WCOSS2" ]]; then + if [[ "${machine}" == "WCOSS2" ]]; then export nth_anal=8 export nth_anal_gfs=8 fi - if [[ "${CASE}" = "C384" ]]; then - export npe_anal=160 - export npe_anal_gfs=160 - export nth_anal=10 - export nth_anal_gfs=10 - if [[ "${machine}" = "S4" ]]; then - #On the S4-s4 partition, this is accomplished by increasing the task - #count to a multiple of 32 - if [[ ${PARTITION_BATCH} = "s4" ]]; then + case ${CASE} in + "C384") + export npe_anal=160 + export npe_anal_gfs=160 + export nth_anal=10 + export nth_anal_gfs=10 + if [[ ${machine} = "S4" ]]; then + #On the S4-s4 partition, this is accomplished by increasing the task + #count to a multiple of 32 + if [[ ${PARTITION_BATCH} = "s4" ]]; then export npe_anal=416 export npe_anal_gfs=416 - fi - #S4 is small, so run this task with just 1 thread - export nth_anal=1 - export nth_anal_gfs=1 - export wtime_anal="02:00:00" - fi - fi - if [[ "${CASE}" = "C192" || "${CASE}" = "C96" || "${CASE}" = "C48" ]]; then - export npe_anal=84 - export npe_anal_gfs=84 - if [[ "${machine}" = "S4" ]]; then - export nth_anal=4 - export nth_anal_gfs=4 - #Adjust job count for S4 - if [[ "${PARTITION_BATCH}" = "s4" ]]; then + fi + #S4 is small, so run this task with just 1 thread + export nth_anal=1 + export nth_anal_gfs=1 + export wtime_anal="02:00:00" + fi + ;; + "C192" | "C96" | "C48") + export npe_anal=84 + export npe_anal_gfs=84 + if [[ ${machine} == "S4" ]]; then + export nth_anal=4 + export nth_anal_gfs=4 + #Adjust job count for S4 + if [[ ${PARTITION_BATCH} == "s4" ]]; then export npe_anal=88 export 
npe_anal_gfs=88 - elif [[ ${PARTITION_BATCH} = "ivy" ]]; then + elif [[ ${PARTITION_BATCH} == "ivy" ]]; then export npe_anal=90 export npe_anal_gfs=90 - fi - fi - fi - npe_node_anal=$(echo "${npe_node_max} / ${nth_anal}" | bc) - export npe_node_anal + fi + fi + ;; + *) + echo "FATAL ERROR: Resources not defined for job ${job} at resolution ${CASE}" + exit 4 + ;; + esac + export npe_node_anal=$(( npe_node_max / nth_anal )) export nth_cycle=${nth_anal} - npe_node_cycle=$(echo "${npe_node_max} / ${nth_cycle}" | bc) - export npe_node_cycle + export npe_node_cycle=$(( npe_node_max / nth_cycle )) export is_exclusive=True + ;; -elif [[ "${step}" = "analcalc" ]]; then - + "analcalc") export wtime_analcalc="00:10:00" export npe_analcalc=127 export ntasks="${npe_analcalc}" export nth_analcalc=1 export nth_echgres=4 export nth_echgres_gfs=12 - npe_node_analcalc=$(echo "${npe_node_max} / ${nth_analcalc}" | bc) - export npe_node_analcalc + export npe_node_analcalc=$(( npe_node_max / nth_analcalc )) export is_exclusive=True export memory_analcalc="48GB" + ;; -elif [[ "${step}" = "analdiag" ]]; then - + "analdiag") export wtime_analdiag="00:15:00" export npe_analdiag=96 # Should be at least twice npe_ediag export nth_analdiag=1 - npe_node_analdiag=$(echo "${npe_node_max} / ${nth_analdiag}" | bc) - export npe_node_analdiag + export npe_node_analdiag=$(( npe_node_max / nth_analdiag )) export memory_analdiag="48GB" + ;; -elif [[ "${step}" = "sfcanl" ]]; then - + "sfcanl") export wtime_sfcanl="00:10:00" export npe_sfcanl=6 export nth_sfcanl=1 - npe_node_sfcanl=$(echo "${npe_node_max} / ${nth_sfcanl}" | bc) - export npe_node_sfcanl + export npe_node_sfcanl=$(( npe_node_max / nth_sfcanl )) export is_exclusive=True + ;; -elif [[ "${step}" = "fcst" || "${step}" = "efcs" ]]; then - + "fcst" | "efcs") export is_exclusive=True - if [[ "${step}" = "fcst" ]]; then - _CDUMP_LIST=${CDUMP:-"gdas gfs"} - elif [[ "${step}" = "efcs" ]]; then - _CDUMP_LIST=${CDUMP:-"enkfgdas enkfgfs"} + if [[ 
"${step}" == "fcst" ]]; then + _CDUMP_LIST=${CDUMP:-"gdas gfs"} + elif [[ "${step}" == "efcs" ]]; then + _CDUMP_LIST=${CDUMP:-"enkfgdas enkfgfs"} fi # During workflow creation, we need resources for all CDUMPs and CDUMP is undefined for _CDUMP in ${_CDUMP_LIST}; do - if [[ "${_CDUMP}" =~ "gfs" ]]; then - export layout_x=${layout_x_gfs} - export layout_y=${layout_y_gfs} - export WRITE_GROUP=${WRITE_GROUP_GFS} - export WRTTASK_PER_GROUP_PER_THREAD=${WRTTASK_PER_GROUP_PER_THREAD_GFS} - ntasks_fv3=${ntasks_fv3_gfs} - ntasks_quilt=${ntasks_quilt_gfs} - nthreads_fv3=${nthreads_fv3_gfs} - fi + if [[ "${_CDUMP}" =~ "gfs" ]]; then + export layout_x=${layout_x_gfs} + export layout_y=${layout_y_gfs} + export WRITE_GROUP=${WRITE_GROUP_GFS} + export WRTTASK_PER_GROUP_PER_THREAD=${WRTTASK_PER_GROUP_PER_THREAD_GFS} + ntasks_fv3=${ntasks_fv3_gfs} + ntasks_quilt=${ntasks_quilt_gfs} + nthreads_fv3=${nthreads_fv3_gfs} + nthreads_ufs=${nthreads_ufs_gfs} + fi - # PETS for the atmosphere dycore - (( FV3PETS = ntasks_fv3 * nthreads_fv3 )) - echo "FV3 using (nthreads, PETS) = (${nthreads_fv3}, ${FV3PETS})" - - # PETS for quilting - if [[ "${QUILTING:-}" = ".true." ]]; then - (( QUILTPETS = ntasks_quilt * nthreads_fv3 )) - (( WRTTASK_PER_GROUP = WRTTASK_PER_GROUP_PER_THREAD )) - export WRTTASK_PER_GROUP - else - QUILTPETS=0 - fi - echo "QUILT using (nthreads, PETS) = (${nthreads_fv3}, ${QUILTPETS})" - - # Total PETS for the atmosphere component - ATMTHREADS=${nthreads_fv3} - (( ATMPETS = FV3PETS + QUILTPETS )) - export ATMPETS ATMTHREADS - echo "FV3ATM using (nthreads, PETS) = (${ATMTHREADS}, ${ATMPETS})" - - # Total PETS for the coupled model (starting w/ the atmosphere) - NTASKS_TOT=${ATMPETS} - - # The mediator PETS can overlap with other components, usually it lands on the atmosphere tasks. - # However, it is suggested limiting mediator PETS to 300, as it may cause the slow performance. 
- # See https://docs.google.com/document/d/1bKpi-52t5jIfv2tuNHmQkYUe3hkKsiG_DG_s6Mnukog/edit - # TODO: Update reference when moved to ufs-weather-model RTD - MEDTHREADS=${nthreads_mediator:-1} - MEDPETS=${MEDPETS:-${FV3PETS}} - [[ "${MEDPETS}" -gt 300 ]] && MEDPETS=300 - export MEDPETS MEDTHREADS - echo "MEDIATOR using (threads, PETS) = (${MEDTHREADS}, ${MEDPETS})" - - CHMPETS=0; CHMTHREADS=0 - if [[ "${DO_AERO}" = "YES" ]]; then - # GOCART shares the same grid and forecast tasks as FV3 (do not add write grid component tasks). - (( CHMTHREADS = ATMTHREADS )) - (( CHMPETS = FV3PETS )) - # Do not add to NTASKS_TOT - echo "GOCART using (threads, PETS) = (${CHMTHREADS}, ${CHMPETS})" - fi - export CHMPETS CHMTHREADS - - WAVPETS=0; WAVTHREADS=0 - if [[ "${DO_WAVE}" = "YES" ]]; then - (( WAVPETS = ntasks_ww3 * nthreads_ww3 )) - (( WAVTHREADS = nthreads_ww3 )) - echo "WW3 using (threads, PETS) = (${WAVTHREADS}, ${WAVPETS})" - (( NTASKS_TOT = NTASKS_TOT + WAVPETS )) - fi - export WAVPETS WAVTHREADS - - OCNPETS=0; OCNTHREADS=0 - if [[ "${DO_OCN}" = "YES" ]]; then - (( OCNPETS = ntasks_mom6 * nthreads_mom6 )) - (( OCNTHREADS = nthreads_mom6 )) - echo "MOM6 using (threads, PETS) = (${OCNTHREADS}, ${OCNPETS})" - (( NTASKS_TOT = NTASKS_TOT + OCNPETS )) - fi - export OCNPETS OCNTHREADS - - ICEPETS=0; ICETHREADS=0 - if [[ "${DO_ICE}" = "YES" ]]; then - (( ICEPETS = ntasks_cice6 * nthreads_cice6 )) - (( ICETHREADS = nthreads_cice6 )) - echo "CICE6 using (threads, PETS) = (${ICETHREADS}, ${ICEPETS})" - (( NTASKS_TOT = NTASKS_TOT + ICEPETS )) - fi - export ICEPETS ICETHREADS - - echo "Total PETS for ${_CDUMP} = ${NTASKS_TOT}" - - if [[ "${_CDUMP}" =~ "gfs" ]]; then - declare -x "npe_${step}_gfs"="${NTASKS_TOT}" - declare -x "nth_${step}_gfs"=1 # ESMF handles threading for the UFS-weather-model - declare -x "npe_node_${step}_gfs"="${npe_node_max}" - else - declare -x "npe_${step}"="${NTASKS_TOT}" - declare -x "nth_${step}"=1 # ESMF handles threading for the UFS-weather-model - declare 
-x "npe_node_${step}"="${npe_node_max}" - fi + # Determine if using ESMF-managed threading or traditional threading + # If using traditional threading, set them to 1 + if [[ "${USE_ESMF_THREADING:-}" == "YES" ]]; then + export UFS_THREADS=1 + else # traditional threading + export UFS_THREADS=${nthreads_ufs:-1} + nthreads_fv3=1 + nthreads_mediator=1 + [[ "${DO_WAVE}" == "YES" ]] && nthreads_ww3=1 + [[ "${DO_OCN}" == "YES" ]] && nthreads_mom6=1 + [[ "${DO_ICE}" == "YES" ]] && nthreads_cice6=1 + fi + + # PETS for the atmosphere dycore + (( FV3PETS = ntasks_fv3 * nthreads_fv3 )) + echo "FV3 using (nthreads, PETS) = (${nthreads_fv3}, ${FV3PETS})" + + # PETS for quilting + if [[ "${QUILTING:-}" == ".true." ]]; then + (( QUILTPETS = ntasks_quilt * nthreads_fv3 )) + (( WRTTASK_PER_GROUP = WRTTASK_PER_GROUP_PER_THREAD )) + export WRTTASK_PER_GROUP + else + QUILTPETS=0 + fi + echo "QUILT using (nthreads, PETS) = (${nthreads_fv3}, ${QUILTPETS})" + + # Total PETS for the atmosphere component + ATMTHREADS=${nthreads_fv3} + (( ATMPETS = FV3PETS + QUILTPETS )) + export ATMPETS ATMTHREADS + echo "FV3ATM using (nthreads, PETS) = (${ATMTHREADS}, ${ATMPETS})" + + # Total PETS for the coupled model (starting w/ the atmosphere) + NTASKS_TOT=${ATMPETS} + + # The mediator PETS can overlap with other components, usually it lands on the atmosphere tasks. + # However, it is suggested limiting mediator PETS to 300, as it may cause the slow performance. + # See https://docs.google.com/document/d/1bKpi-52t5jIfv2tuNHmQkYUe3hkKsiG_DG_s6Mnukog/edit + # TODO: Update reference when moved to ufs-weather-model RTD + MEDTHREADS=${nthreads_mediator:-1} + MEDPETS=${MEDPETS:-${FV3PETS}} + (( "${MEDPETS}" > 300 )) && MEDPETS=300 + export MEDPETS MEDTHREADS + echo "MEDIATOR using (threads, PETS) = (${MEDTHREADS}, ${MEDPETS})" + + CHMPETS=0; CHMTHREADS=0 + if [[ "${DO_AERO}" == "YES" ]]; then + # GOCART shares the same grid and forecast tasks as FV3 (do not add write grid component tasks). 
+ (( CHMTHREADS = ATMTHREADS )) + (( CHMPETS = FV3PETS )) + # Do not add to NTASKS_TOT + echo "GOCART using (threads, PETS) = (${CHMTHREADS}, ${CHMPETS})" + fi + export CHMPETS CHMTHREADS + + WAVPETS=0; WAVTHREADS=0 + if [[ "${DO_WAVE}" == "YES" ]]; then + (( WAVPETS = ntasks_ww3 * nthreads_ww3 )) + (( WAVTHREADS = nthreads_ww3 )) + echo "WW3 using (threads, PETS) = (${WAVTHREADS}, ${WAVPETS})" + (( NTASKS_TOT = NTASKS_TOT + WAVPETS )) + fi + export WAVPETS WAVTHREADS + + OCNPETS=0; OCNTHREADS=0 + if [[ "${DO_OCN}" == "YES" ]]; then + (( OCNPETS = ntasks_mom6 * nthreads_mom6 )) + (( OCNTHREADS = nthreads_mom6 )) + echo "MOM6 using (threads, PETS) = (${OCNTHREADS}, ${OCNPETS})" + (( NTASKS_TOT = NTASKS_TOT + OCNPETS )) + fi + export OCNPETS OCNTHREADS + + ICEPETS=0; ICETHREADS=0 + if [[ "${DO_ICE}" == "YES" ]]; then + (( ICEPETS = ntasks_cice6 * nthreads_cice6 )) + (( ICETHREADS = nthreads_cice6 )) + echo "CICE6 using (threads, PETS) = (${ICETHREADS}, ${ICEPETS})" + (( NTASKS_TOT = NTASKS_TOT + ICEPETS )) + fi + export ICEPETS ICETHREADS + + echo "Total PETS for ${_CDUMP} = ${NTASKS_TOT}" + + if [[ "${_CDUMP}" =~ "gfs" ]]; then + declare -x "npe_${step}_gfs"="${NTASKS_TOT}" + declare -x "nth_${step}_gfs"="${UFS_THREADS}" + declare -x "npe_node_${step}_gfs"="${npe_node_max}" + else + declare -x "npe_${step}"="${NTASKS_TOT}" + declare -x "nth_${step}"="${UFS_THREADS}" + declare -x "npe_node_${step}"="${npe_node_max}" + fi done case "${CASE}" in "C48" | "C96" | "C192") - declare -x "wtime_${step}"="00:30:00" + declare -x "wtime_${step}"="00:15:00" declare -x "wtime_${step}_gfs"="03:00:00" ;; "C384") - declare -x "wtime_${step}"="00:20:00" + declare -x "wtime_${step}"="00:30:00" declare -x "wtime_${step}_gfs"="06:00:00" ;; "C768" | "C1152") - declare -x "wtime_${step}"="01:00:00" + declare -x "wtime_${step}"="00:30:00" declare -x "wtime_${step}_gfs"="06:00:00" ;; *) - echo "FATAL ERROR: Resolution ${CASE} not supported in ${step}" - exit 1 + echo "FATAL ERROR: Resources 
not defined for job ${job} at resolution ${CASE}" + exit 4 ;; esac unset _CDUMP _CDUMP_LIST unset NTASKS_TOT + ;; -elif [[ "${step}" = "ocnpost" ]]; then - - export wtime_ocnpost="00:30:00" - export npe_ocnpost=1 - export npe_node_ocnpost=1 - export nth_ocnpost=1 - export memory_ocnpost="96G" - if [[ "${machine}" == "JET" ]]; then - # JET only has 88GB of requestable memory per node - # so a second node is required to meet the requiremtn - npe_ocnpost=2 - fi - -elif [[ "${step}" = "upp" ]]; then + "oceanice_products") + export wtime_oceanice_products="00:15:00" + export npe_oceanice_products=1 + export npe_node_oceanice_products=1 + export nth_oceanice_products=1 + export memory_oceanice_products="96GB" + ;; + "upp") case "${CASE}" in "C48" | "C96") export npe_upp=${CASE:1} ;; - "C192" | "C384" | "C768") + "C192" | "C384") + export npe_upp=120 + export memory_upp="96GB" + ;; + "C768") export npe_upp=120 + export memory_upp="96GB" + if [[ ${machine} == "WCOSS2" ]]; then export memory_upp="480GB" ; fi ;; *) - echo "FATAL ERROR: Resolution '${CASE}' not supported for UPP'" - exit 1 + echo "FATAL ERROR: Resources not defined for job ${job} at resolution ${CASE}" + exit 4 ;; esac export npe_node_upp=${npe_upp} @@ -705,13 +725,13 @@ elif [[ "${step}" = "upp" ]]; then export nth_upp=1 export wtime_upp="00:15:00" - if [[ "${npe_node_upp}" -gt "${npe_node_max}" ]]; then + if (( npe_node_upp > npe_node_max )); then export npe_node_upp=${npe_node_max} fi export is_exclusive=True + ;; -elif [[ ${step} = "atmos_products" ]]; then - + "atmos_products") export wtime_atmos_products="00:15:00" export npe_atmos_products=24 export nth_atmos_products=1 @@ -721,25 +741,25 @@ elif [[ ${step} = "atmos_products" ]]; then export nth_atmos_products_gfs="${nth_atmos_products}" export npe_node_atmos_products_gfs="${npe_node_atmos_products}" export is_exclusive=True + ;; -elif [[ ${step} = "verfozn" ]]; then - + "verfozn") export wtime_verfozn="00:05:00" export npe_verfozn=1 export 
nth_verfozn=1 export npe_node_verfozn=1 export memory_verfozn="1G" + ;; -elif [[ ${step} = "verfrad" ]]; then - + "verfrad") export wtime_verfrad="00:40:00" export npe_verfrad=1 export nth_verfrad=1 export npe_node_verfrad=1 export memory_verfrad="5G" + ;; -elif [[ ${step} = "vminmon" ]]; then - + "vminmon") export wtime_vminmon="00:05:00" export npe_vminmon=1 export nth_vminmon=1 @@ -749,42 +769,42 @@ elif [[ ${step} = "vminmon" ]]; then export nth_vminmon_gfs=1 export npe_node_vminmon_gfs=1 export memory_vminmon="1G" + ;; -elif [[ ${step} = "tracker" ]]; then - + "tracker") export wtime_tracker="00:10:00" export npe_tracker=1 export nth_tracker=1 export npe_node_tracker=1 export memory_tracker="4G" + ;; -elif [[ ${step} = "genesis" ]]; then - + "genesis") export wtime_genesis="00:25:00" export npe_genesis=1 export nth_genesis=1 export npe_node_genesis=1 - export memory_genesis="4G" - -elif [[ ${step} = "genesis_fsu" ]]; then + export memory_genesis="10G" + ;; + "genesis_fsu") export wtime_genesis_fsu="00:10:00" export npe_genesis_fsu=1 export nth_genesis_fsu=1 export npe_node_genesis_fsu=1 - export memory_genesis_fsu="4G" - -elif [[ "${step}" = "fit2obs" ]]; then + export memory_genesis_fsu="10G" + ;; + "fit2obs") export wtime_fit2obs="00:20:00" export npe_fit2obs=3 export nth_fit2obs=1 export npe_node_fit2obs=1 export memory_fit2obs="20G" - if [[ "${machine}" == "WCOSS2" ]]; then export npe_node_fit2obs=3 ; fi - -elif [[ "${step}" = "metp" ]]; then + if [[ ${machine} == "WCOSS2" ]]; then export npe_node_fit2obs=3 ; fi + ;; + "metp") export nth_metp=1 export wtime_metp="03:00:00" export npe_metp=4 @@ -793,242 +813,240 @@ elif [[ "${step}" = "metp" ]]; then export npe_metp_gfs=4 export npe_node_metp_gfs=4 export is_exclusive=True + ;; -elif [[ "${step}" = "echgres" ]]; then - + "echgres") export wtime_echgres="00:10:00" export npe_echgres=3 export nth_echgres=${npe_node_max} export npe_node_echgres=1 - if [[ "${machine}" = "WCOSS2" ]]; then + if [[ "${machine}" == 
"WCOSS2" ]]; then export memory_echgres="200GB" fi + ;; -elif [[ "${step}" = "init" ]]; then - + "init") export wtime_init="00:30:00" export npe_init=24 export nth_init=1 export npe_node_init=6 - export memory_init="70G" - -elif [[ "${step}" = "init_chem" ]]; then + export memory_init="70GB" + ;; + "init_chem") export wtime_init_chem="00:30:00" export npe_init_chem=1 export npe_node_init_chem=1 export is_exclusive=True + ;; -elif [[ "${step}" = "mom6ic" ]]; then - + "mom6ic") export wtime_mom6ic="00:30:00" export npe_mom6ic=24 export npe_node_mom6ic=24 export is_exclusive=True - -elif [[ ${step} = "arch" || ${step} = "earc" || ${step} = "getic" ]]; then - - eval "export wtime_${step}='06:00:00'" - eval "export npe_${step}=1" - eval "export npe_node_${step}=1" - eval "export nth_${step}=1" - eval "export memory_${step}=4096M" - if [[ "${machine}" = "WCOSS2" ]]; then - eval "export memory_${step}=50GB" + ;; + + "arch" | "earc" | "getic") + declare -x "wtime_${step}"="06:00:00" + declare -x "npe_${step}"="1" + declare -x "npe_node_${step}"="1" + declare -x "nth_${step}"="1" + declare -x "memory_${step}"="4096M" + if [[ "${machine}" == "WCOSS2" ]]; then + declare -x "memory_${step}"="50GB" fi + ;; -elif [[ ${step} == "cleanup" ]]; then - export wtime_cleanup="01:00:00" + "cleanup") + export wtime_cleanup="00:15:00" export npe_cleanup=1 export npe_node_cleanup=1 export nth_cleanup=1 export memory_cleanup="4096M" + ;; -elif [[ ${step} = "stage_ic" ]]; then - + "stage_ic") export wtime_stage_ic="00:15:00" export npe_stage_ic=1 export npe_node_stage_ic=1 export nth_stage_ic=1 export is_exclusive=True + ;; -elif [[ "${step}" = "atmensanlinit" ]]; then - - # make below case dependent later - export layout_x=1 - export layout_y=1 + "atmensanlinit") + export layout_x=${layout_x_atmensanl} + export layout_y=${layout_y_atmensanl} export wtime_atmensanlinit="00:10:00" export npe_atmensanlinit=1 export nth_atmensanlinit=1 - npe_node_atmensanlinit=$(echo "${npe_node_max} / 
${nth_atmensanlinit}" | bc) - export npe_node_atmensanlinit + export npe_node_atmensanlinit=$(( npe_node_max / nth_atmensanlinit )) export memory_atmensanlinit="3072M" + ;; -elif [[ "${step}" = "atmensanlrun" ]]; then - - # make below case dependent later - export layout_x=1 - export layout_y=1 + "atmensanlrun") + export layout_x=${layout_x_atmensanl} + export layout_y=${layout_y_atmensanl} export wtime_atmensanlrun="00:30:00" - npe_atmensanlrun=$(echo "${layout_x} * ${layout_y} * 6" | bc) - export npe_atmensanlrun - npe_atmensanlrun_gfs=$(echo "${layout_x} * ${layout_y} * 6" | bc) - export npe_atmensanlrun_gfs + export npe_atmensanlrun=$(( layout_x * layout_y * 6 )) + export npe_atmensanlrun_gfs=$(( layout_x * layout_y * 6 )) export nth_atmensanlrun=1 export nth_atmensanlrun_gfs=${nth_atmensanlrun} - npe_node_atmensanlrun=$(echo "${npe_node_max} / ${nth_atmensanlrun}" | bc) - export npe_node_atmensanlrun + export npe_node_atmensanlrun=$(( npe_node_max / nth_atmensanlrun )) + export memory_atmensanlrun="96GB" export is_exclusive=True + ;; -elif [[ "${step}" = "atmensanlfinal" ]]; then - + "atmensanlfinal") export wtime_atmensanlfinal="00:30:00" export npe_atmensanlfinal=${npe_node_max} export nth_atmensanlfinal=1 - npe_node_atmensanlfinal=$(echo "${npe_node_max} / ${nth_atmensanlfinal}" | bc) - export npe_node_atmensanlfinal + export npe_node_atmensanlfinal=$(( npe_node_max / nth_atmensanlfinal )) export is_exclusive=True + ;; -elif [[ "${step}" = "eobs" || "${step}" = "eomg" ]]; then - + "eobs" | "eomg") export wtime_eobs="00:15:00" - export wtime_eomg="01:00:00" - if [[ "${CASE}" = "C768" ]]; then - export npe_eobs=200 - elif [[ "${CASE}" = "C384" ]]; then - export npe_eobs=100 - elif [[ "${CASE}" = "C192" || "${CASE}" = "C96" || "${CASE}" = "C48" ]]; then - export npe_eobs=40 - fi + export wtime_eomg="00:30:00" + case ${CASE} in + "C768") export npe_eobs=200;; + "C384") export npe_eobs=100;; + "C192" | "C96" | "C48") export npe_eobs=40;; + *) + echo "FATAL 
ERROR: Resources not defined for job ${job} at resolution ${CASE}" + exit 4 + ;; + esac export npe_eomg=${npe_eobs} export nth_eobs=2 export nth_eomg=${nth_eobs} - npe_node_eobs=$(echo "${npe_node_max} / ${nth_eobs}" | bc) - export npe_node_eobs + export npe_node_eobs=$(( npe_node_max / nth_eobs )) export is_exclusive=True # The number of tasks and cores used must be the same for eobs # See https://github.com/NOAA-EMC/global-workflow/issues/2092 for details # For S4, this is accomplished by running 10 tasks/node if [[ ${machine} = "S4" ]]; then - export npe_node_eobs=10 + export npe_node_eobs=10 elif [[ ${machine} = "HERCULES" ]]; then - # For Hercules, this is only an issue at C384; use 20 tasks/node - if [[ ${CASE} = "C384" ]]; then - export npe_node_eobs=20 - fi + # For Hercules, this is only an issue at C384; use 20 tasks/node + if [[ ${CASE} = "C384" ]]; then + export npe_node_eobs=20 + fi fi export npe_node_eomg=${npe_node_eobs} + ;; -elif [[ "${step}" = "ediag" ]]; then - + "ediag") export wtime_ediag="00:15:00" export npe_ediag=48 export nth_ediag=1 - npe_node_ediag=$(echo "${npe_node_max} / ${nth_ediag}" | bc) - export npe_node_ediag + export npe_node_ediag=$(( npe_node_max / nth_ediag )) export memory_ediag="30GB" + ;; -elif [[ "${step}" = "eupd" ]]; then - + "eupd") export wtime_eupd="00:30:00" - if [[ "${CASE}" = "C768" ]]; then - export npe_eupd=480 - export nth_eupd=6 - if [[ "${machine}" = "WCOSS2" ]]; then - export npe_eupd=315 - export nth_eupd=14 - fi - elif [[ "${CASE}" = "C384" ]]; then - export npe_eupd=270 - export nth_eupd=8 - if [[ "${machine}" = "WCOSS2" ]]; then - export npe_eupd=315 - export nth_eupd=14 - elif [[ "${machine}" = "S4" ]]; then - export npe_eupd=160 - export nth_eupd=2 - fi - elif [[ "${CASE}" = "C192" || "${CASE}" = "C96" || "${CASE}" = "C48" ]]; then - export npe_eupd=42 - export nth_eupd=2 - if [[ "${machine}" = "HERA" || "${machine}" = "JET" ]]; then - export nth_eupd=4 - fi - fi - npe_node_eupd=$(echo "${npe_node_max} / 
${nth_eupd}" | bc) - export npe_node_eupd + case ${CASE} in + "C768") + export npe_eupd=480 + export nth_eupd=6 + if [[ "${machine}" == "WCOSS2" ]]; then + export npe_eupd=315 + export nth_eupd=14 + fi + ;; + "C384") + export npe_eupd=270 + export nth_eupd=8 + if [[ "${machine}" == "WCOSS2" ]]; then + export npe_eupd=315 + export nth_eupd=14 + elif [[ ${machine} == "S4" ]]; then + export npe_eupd=160 + export nth_eupd=2 + fi + ;; + "C192" | "C96" | "C48") + export npe_eupd=42 + export nth_eupd=2 + if [[ "${machine}" == "HERA" || "${machine}" == "JET" ]]; then + export nth_eupd=4 + fi + ;; + *) + echo "FATAL ERROR: Resources not defined for job ${job} at resolution ${CASE}" + exit 4 + ;; + esac + export npe_node_eupd=$(( npe_node_max / nth_eupd )) export is_exclusive=True + ;; -elif [[ "${step}" = "ecen" ]]; then - + "ecen") export wtime_ecen="00:10:00" export npe_ecen=80 export nth_ecen=4 - if [[ "${machine}" = "HERA" ]]; then export nth_ecen=6; fi - if [[ "${CASE}" = "C384" || "${CASE}" = "C192" || "${CASE}" = "C96" || "${CASE}" = "C48" ]]; then export nth_ecen=2; fi - npe_node_ecen=$(echo "${npe_node_max} / ${nth_ecen}" | bc) - export npe_node_ecen + if [[ "${machine}" == "HERA" ]]; then export nth_ecen=6; fi + if [[ ${CASE} == "C384" || ${CASE} == "C192" || ${CASE} == "C96" || ${CASE} == "C48" ]]; then + export nth_ecen=2 + fi + export npe_node_ecen=$(( npe_node_max / nth_ecen )) export nth_cycle=${nth_ecen} - npe_node_cycle=$(echo "${npe_node_max} / ${nth_cycle}" | bc) - export npe_node_cycle + export npe_node_cycle=$(( npe_node_max / nth_cycle )) export is_exclusive=True + ;; -elif [[ "${step}" = "esfc" ]]; then - + "esfc") export wtime_esfc="00:08:00" export npe_esfc=80 export nth_esfc=1 - npe_node_esfc=$(echo "${npe_node_max} / ${nth_esfc}" | bc) - export npe_node_esfc + export npe_node_esfc=$(( npe_node_max / nth_esfc )) export nth_cycle=${nth_esfc} - npe_node_cycle=$(echo "${npe_node_max} / ${nth_cycle}" | bc) - export npe_node_cycle + export 
npe_node_cycle=$(( npe_node_max / nth_cycle )) export memory_esfc="80GB" + ;; -elif [[ "${step}" = "epos" ]]; then - + "epos") export wtime_epos="00:15:00" export npe_epos=80 export nth_epos=1 - npe_node_epos=$(echo "${npe_node_max} / ${nth_epos}" | bc) - export npe_node_epos + export npe_node_epos=$(( npe_node_max / nth_epos )) export is_exclusive=True + ;; -elif [[ "${step}" = "postsnd" ]]; then - + "postsnd") export wtime_postsnd="02:00:00" export npe_postsnd=40 export nth_postsnd=8 export npe_node_postsnd=10 export npe_postsndcfp=9 export npe_node_postsndcfp=1 - postsnd_req_cores=$(echo "${npe_node_postsnd} * ${nth_postsnd}" | bc) - if [[ ${postsnd_req_cores} -gt "${npe_node_max}" ]]; then - npe_node_postsnd=$(echo "${npe_node_max} / ${nth_postsnd}" | bc) - export npe_node_postsnd + postsnd_req_cores=$(( npe_node_postsnd * nth_postsnd )) + if (( postsnd_req_cores > npe_node_max )); then + export npe_node_postsnd=$(( npe_node_max / nth_postsnd )) fi export is_exclusive=True + ;; -elif [[ "${step}" = "awips" ]]; then - + "awips") export wtime_awips="03:30:00" export npe_awips=1 export npe_node_awips=1 export nth_awips=1 export memory_awips="3GB" + ;; -elif [[ ${step} = "npoess" ]]; then - + "npoess") export wtime_npoess="03:30:00" export npe_npoess=1 export npe_node_npoess=1 export nth_npoess=1 export memory_npoess="3GB" + ;; -elif [[ ${step} = "gempak" ]]; then - + "gempak") export wtime_gempak="03:00:00" export npe_gempak=2 export npe_gempak_gfs=28 @@ -1037,9 +1055,9 @@ elif [[ ${step} = "gempak" ]]; then export nth_gempak=1 export memory_gempak="4GB" export memory_gempak_gfs="2GB" + ;; -elif [[ ${step} = "mos_stn_prep" ]]; then - + "mos_stn_prep") export wtime_mos_stn_prep="00:10:00" export npe_mos_stn_prep=3 export npe_node_mos_stn_prep=3 @@ -1047,9 +1065,9 @@ elif [[ ${step} = "mos_stn_prep" ]]; then export memory_mos_stn_prep="5GB" export NTASK="${npe_mos_stn_prep}" export PTILE="${npe_node_mos_stn_prep}" + ;; -elif [[ ${step} = "mos_grd_prep" ]]; then - + 
"mos_grd_prep") export wtime_mos_grd_prep="00:10:00" export npe_mos_grd_prep=4 export npe_node_mos_grd_prep=4 @@ -1057,9 +1075,9 @@ elif [[ ${step} = "mos_grd_prep" ]]; then export memory_mos_grd_prep="16GB" export NTASK="${npe_mos_grd_prep}" export PTILE="${npe_node_mos_grd_prep}" + ;; -elif [[ ${step} = "mos_ext_stn_prep" ]]; then - + "mos_ext_stn_prep") export wtime_mos_ext_stn_prep="00:15:00" export npe_mos_ext_stn_prep=2 export npe_node_mos_ext_stn_prep=2 @@ -1067,9 +1085,9 @@ elif [[ ${step} = "mos_ext_stn_prep" ]]; then export memory_mos_ext_stn_prep="5GB" export NTASK="${npe_mos_ext_stn_prep}" export PTILE="${npe_node_mos_ext_stn_prep}" + ;; -elif [[ ${step} = "mos_ext_grd_prep" ]]; then - + "mos_ext_grd_prep") export wtime_mos_ext_grd_prep="00:10:00" export npe_mos_ext_grd_prep=7 export npe_node_mos_ext_grd_prep=7 @@ -1077,9 +1095,9 @@ elif [[ ${step} = "mos_ext_grd_prep" ]]; then export memory_mos_ext_grd_prep="3GB" export NTASK="${npe_mos_ext_grd_prep}" export PTILE="${npe_node_mos_ext_grd_prep}" + ;; -elif [[ ${step} = "mos_stn_fcst" ]]; then - + "mos_stn_fcst") export wtime_mos_stn_fcst="00:10:00" export npe_mos_stn_fcst=5 export npe_node_mos_stn_fcst=5 @@ -1087,9 +1105,9 @@ elif [[ ${step} = "mos_stn_fcst" ]]; then export memory_mos_stn_fcst="40GB" export NTASK="${npe_mos_stn_fcst}" export PTILE="${npe_node_mos_stn_fcst}" + ;; -elif [[ ${step} = "mos_grd_fcst" ]]; then - + "mos_grd_fcst") export wtime_mos_grd_fcst="00:10:00" export npe_mos_grd_fcst=7 export npe_node_mos_grd_fcst=7 @@ -1097,9 +1115,9 @@ elif [[ ${step} = "mos_grd_fcst" ]]; then export memory_mos_grd_fcst="50GB" export NTASK="${npe_mos_grd_fcst}" export PTILE="${npe_node_mos_grd_fcst}" + ;; -elif [[ ${step} = "mos_ext_stn_fcst" ]]; then - + "mos_ext_stn_fcst") export wtime_mos_ext_stn_fcst="00:20:00" export npe_mos_ext_stn_fcst=3 export npe_node_mos_ext_stn_fcst=3 @@ -1108,9 +1126,9 @@ elif [[ ${step} = "mos_ext_stn_fcst" ]]; then export NTASK="${npe_mos_ext_stn_fcst}" export 
PTILE="${npe_node_mos_ext_stn_fcst}" export prepost=True + ;; -elif [[ ${step} = "mos_ext_grd_fcst" ]]; then - + "mos_ext_grd_fcst") export wtime_mos_ext_grd_fcst="00:10:00" export npe_mos_ext_grd_fcst=7 export npe_node_mos_ext_grd_fcst=7 @@ -1118,9 +1136,9 @@ elif [[ ${step} = "mos_ext_grd_fcst" ]]; then export memory_mos_ext_grd_fcst="50GB" export NTASK="${npe_mos_ext_grd_fcst}" export PTILE="${npe_node_mos_ext_grd_fcst}" + ;; -elif [[ ${step} = "mos_stn_prdgen" ]]; then - + "mos_stn_prdgen") export wtime_mos_stn_prdgen="00:10:00" export npe_mos_stn_prdgen=1 export npe_node_mos_stn_prdgen=1 @@ -1129,9 +1147,9 @@ elif [[ ${step} = "mos_stn_prdgen" ]]; then export NTASK="${npe_mos_stn_prdgen}" export PTILE="${npe_node_mos_stn_prdgen}" export prepost=True + ;; -elif [[ ${step} = "mos_grd_prdgen" ]]; then - + "mos_grd_prdgen") export wtime_mos_grd_prdgen="00:40:00" export npe_mos_grd_prdgen=72 export npe_node_mos_grd_prdgen=18 @@ -1140,9 +1158,9 @@ elif [[ ${step} = "mos_grd_prdgen" ]]; then export NTASK="${npe_mos_grd_prdgen}" export PTILE="${npe_node_mos_grd_prdgen}" export OMP_NUM_THREADS="${nth_mos_grd_prdgen}" + ;; -elif [[ ${step} = "mos_ext_stn_prdgen" ]]; then - + "mos_ext_stn_prdgen") export wtime_mos_ext_stn_prdgen="00:10:00" export npe_mos_ext_stn_prdgen=1 export npe_node_mos_ext_stn_prdgen=1 @@ -1151,9 +1169,9 @@ elif [[ ${step} = "mos_ext_stn_prdgen" ]]; then export NTASK="${npe_mos_ext_stn_prdgen}" export PTILE="${npe_node_mos_ext_stn_prdgen}" export prepost=True + ;; -elif [[ ${step} = "mos_ext_grd_prdgen" ]]; then - + "mos_ext_grd_prdgen") export wtime_mos_ext_grd_prdgen="00:30:00" export npe_mos_ext_grd_prdgen=96 export npe_node_mos_ext_grd_prdgen=6 @@ -1162,9 +1180,9 @@ elif [[ ${step} = "mos_ext_grd_prdgen" ]]; then export NTASK="${npe_mos_ext_grd_prdgen}" export PTILE="${npe_node_mos_ext_grd_prdgen}" export OMP_NUM_THREADS="${nth_mos_ext_grd_prdgen}" + ;; -elif [[ ${step} = "mos_wx_prdgen" ]]; then - + "mos_wx_prdgen") export 
wtime_mos_wx_prdgen="00:10:00" export npe_mos_wx_prdgen=4 export npe_node_mos_wx_prdgen=2 @@ -1173,9 +1191,9 @@ elif [[ ${step} = "mos_wx_prdgen" ]]; then export NTASK="${npe_mos_wx_prdgen}" export PTILE="${npe_node_mos_wx_prdgen}" export OMP_NUM_THREADS="${nth_mos_wx_prdgen}" + ;; -elif [[ ${step} = "mos_wx_ext_prdgen" ]]; then - + "mos_wx_ext_prdgen") export wtime_mos_wx_ext_prdgen="00:10:00" export npe_mos_wx_ext_prdgen=4 export npe_node_mos_wx_ext_prdgen=2 @@ -1184,12 +1202,13 @@ elif [[ ${step} = "mos_wx_ext_prdgen" ]]; then export NTASK="${npe_mos_wx_ext_prdgen}" export PTILE="${npe_node_mos_wx_ext_prdgen}" export OMP_NUM_THREADS="${nth_mos_wx_ext_prdgen}" + ;; -else - - echo "Invalid step = ${step}, ABORT!" - exit 2 + *) + echo "FATAL ERROR: Invalid job ${step} passed to ${BASH_SOURCE[0]}" + exit 1 + ;; -fi +esac echo "END: config.resources" diff --git a/parm/config/gfs/config.sfcanl b/parm/config/gfs/config.sfcanl index 9592fb77c9..e2fde8992a 100644 --- a/parm/config/gfs/config.sfcanl +++ b/parm/config/gfs/config.sfcanl @@ -8,4 +8,9 @@ echo "BEGIN: config.sfcanl" # Get task specific resources . $EXPDIR/config.resources sfcanl +# Turn off NST in JEDIATMVAR +if [[ "${DO_JEDIATMVAR}" == "YES" ]]; then + export DONST="NO" +fi + echo "END: config.sfcanl" diff --git a/parm/config/gfs/config.snowanl b/parm/config/gfs/config.snowanl new file mode 100644 index 0000000000..7b3ffa47f3 --- /dev/null +++ b/parm/config/gfs/config.snowanl @@ -0,0 +1,30 @@ +#! 
/usr/bin/env bash + +########## config.snowanl ########## +# configuration common to snow analysis tasks + +echo "BEGIN: config.snowanl" + +# Get task specific resources +source "${EXPDIR}/config.resources" snowanl + +export OBS_LIST="${PARMgfs}/gdas/snow/obs/lists/gdas_snow.yaml.j2" + +# Name of the JEDI executable and its yaml template +export JEDIEXE="${EXECgfs}/fv3jedi_letkf.x" +export JEDIYAML="${PARMgfs}/gdas/snow/letkfoi/letkfoi.yaml.j2" + +# Ensemble member properties +export SNOWDEPTHVAR="snodl" +export BESTDDEV="30." # Background Error Std. Dev. for LETKFOI + +# Name of the executable that applies increment to bkg and its namelist template +export APPLY_INCR_EXE="${EXECgfs}/apply_incr.exe" +export APPLY_INCR_NML_TMPL="${PARMgfs}/gdas/snow/letkfoi/apply_incr_nml.j2" + +export JEDI_FIX_YAML="${PARMgfs}/gdas/snow_jedi_fix.yaml.j2" + +export io_layout_x=@IO_LAYOUT_X@ +export io_layout_y=@IO_LAYOUT_Y@ + +echo "END: config.snowanl" diff --git a/parm/config/gfs/config.stage_ic b/parm/config/gfs/config.stage_ic index 7f3956af4d..63d0e4a5cf 100644 --- a/parm/config/gfs/config.stage_ic +++ b/parm/config/gfs/config.stage_ic @@ -8,7 +8,7 @@ echo "BEGIN: config.stage_ic" source "${EXPDIR}/config.resources" stage_ic case "${CASE}" in - "C48" | "C96") + "C48" | "C96" | "C192") export CPL_ATMIC="workflow_${CASE}_refactored" export CPL_ICEIC="workflow_${CASE}_refactored" export CPL_OCNIC="workflow_${CASE}_refactored" @@ -21,16 +21,16 @@ case "${CASE}" in export CPL_WAVIC=workflow_C384_refactored ;; "C768") - export CPL_ATMIC=HR2_refactored - export CPL_ICEIC=HR1_refactored - export CPL_OCNIC=HR1_refactored - export CPL_WAVIC=HR1_refactored + export CPL_ATMIC=HR3C768 + export CPL_ICEIC=HR3marine + export CPL_OCNIC=HR3marine + export CPL_WAVIC=HR3marine ;; "C1152") - export CPL_ATMIC=HR2_C1152_refactored - export CPL_ICEIC=HR3_refactored - export CPL_OCNIC=HR3_refactored - export CPL_WAVIC=HR1_refactored + export CPL_ATMIC=HR3C1152 + export CPL_ICEIC=HR3marine + export 
CPL_OCNIC=HR3marine + export CPL_WAVIC=HR3marine ;; *) echo "FATAL ERROR Unrecognized resolution: ${CASE}" diff --git a/parm/config/gfs/config.ufs_c768_12x12_2th_1wg40wt b/parm/config/gfs/config.ufs_c768_12x12_2th_1wg40wt index 5b3dab7a98..38b4f7a8e7 100644 --- a/parm/config/gfs/config.ufs_c768_12x12_2th_1wg40wt +++ b/parm/config/gfs/config.ufs_c768_12x12_2th_1wg40wt @@ -15,7 +15,7 @@ if (( $# <= 1 )); then echo "--fv3 C48|C96|C192|C384|C768|C1152|C3072" echo "--mom6 500|100|025" echo "--cice6 500|100|025" - echo "--ww3 gnh_10m;aoc_9km;gsh_15m|gwes_30m|glo_025|glo_200|glo_500|mx025" + echo "--ww3 gnh_10m;aoc_9km;gsh_15m|gwes_30m|glo_025|glo_200|glo_500|mx025|uglo_100km|uglo_m1g16" echo "--gocart" exit 1 @@ -68,54 +68,6 @@ if [[ "${skip_mom6}" == "false" ]] || [[ "${skip_cice6}" == "false" ]] || [[ "${ skip_mediator=false fi -case "${machine}" in - "WCOSS2") - npe_node_max=128 - ;; - "HERA" | "ORION" ) - npe_node_max=40 - ;; - "HERCULES" ) - npe_node_max=80 - ;; - "JET") - case "${PARTITION_BATCH}" in - "xjet") - npe_node_max=24 - ;; - "vjet" | "sjet") - npe_node_max=16 - ;; - "kjet") - npe_node_max=40 - ;; - *) - echo "FATAL ERROR: Unsupported ${machine} PARTITION_BATCH = ${PARTITION_BATCH}, ABORT!" - exit 1 - ;; - esac - ;; - "S4") - case "${PARTITION_BATCH}" in - "s4") - npe_node_max=32 - ;; - "ivy") - npe_node_max=20 - ;; - *) - echo "FATAL ERROR: Unsupported ${machine} PARTITION_BATCH = ${PARTITION_BATCH}, ABORT!" 
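The hunk above begins deleting the per-machine `npe_node_max` case block from the variant config.ufs files (the value is now supplied centrally rather than recomputed in each config). A minimal standalone sketch of the nested `case` dispatch being removed — machine and partition names and core counts are taken from the deleted hunk itself, and the function wrapper is illustrative, not part of the workflow:

```shell
#!/usr/bin/env bash
# Sketch of the removed per-machine npe_node_max lookup.
# Machine/partition names and core counts come from the deleted hunk.
npe_node_max_for() {
  local machine="$1" partition="${2:-}"
  case "${machine}" in
    "WCOSS2") echo 128 ;;
    "HERA" | "ORION") echo 40 ;;
    "HERCULES") echo 80 ;;
    "JET")
      # JET's count depends on which partition the job lands on
      case "${partition}" in
        "xjet") echo 24 ;;
        "vjet" | "sjet") echo 16 ;;
        "kjet") echo 40 ;;
        *) echo "FATAL ERROR: Unsupported ${machine} PARTITION_BATCH = ${partition}, ABORT!" >&2; return 1 ;;
      esac
      ;;
    *) echo "FATAL ERROR: Unrecognized machine ${machine}" >&2; return 14 ;;
  esac
}

npe_node_max_for WCOSS2     # -> 128
npe_node_max_for JET vjet   # -> 16
```

Centralizing this lookup is what lets the two `config.ufs_c768_*` variants shrink by the same ~48 lines each in this patch.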
- exit 1 - ;; - esac - ;; - *) - echo "FATAL ERROR: Unrecognized machine ${machine}" - exit 14 - ;; -esac -export npe_node_max - # (Standard) Model resolution dependent variables case "${fv3_res}" in "C48") @@ -126,6 +78,8 @@ case "${fv3_res}" in export layout_y_gfs=1 export nthreads_fv3=1 export nthreads_fv3_gfs=1 + export nthreads_ufs=1 + export nthreads_ufs_gfs=1 export cdmbgwd="0.071,2.1,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export cdmbgwd_gsl="40.0,1.77,1.0,1.0" # settings for GSL drag suite export knob_ugwp_tauamp=6.0e-3 # setting for UGWPv1 non-stationary GWD @@ -142,6 +96,8 @@ case "${fv3_res}" in export layout_y_gfs=2 export nthreads_fv3=1 export nthreads_fv3_gfs=1 + export nthreads_ufs=1 + export nthreads_ufs_gfs=1 export cdmbgwd="0.14,1.8,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export cdmbgwd_gsl="20.0,2.5,1.0,1.0" # settings for GSL drag suite export knob_ugwp_tauamp=3.0e-3 # setting for UGWPv1 non-stationary GWD @@ -158,6 +114,8 @@ case "${fv3_res}" in export layout_y_gfs=6 export nthreads_fv3=1 export nthreads_fv3_gfs=2 + export nthreads_ufs=1 + export nthreads_ufs_gfs=2 export cdmbgwd="0.23,1.5,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export cdmbgwd_gsl="10.0,3.5,1.0,1.0" # settings for GSL drag suite export knob_ugwp_tauamp=1.5e-3 # setting for UGWPv1 non-stationary GWD @@ -174,6 +132,8 @@ case "${fv3_res}" in export layout_y_gfs=8 export nthreads_fv3=2 export nthreads_fv3_gfs=2 + export nthreads_ufs=2 + export nthreads_ufs_gfs=2 export cdmbgwd="1.1,0.72,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export cdmbgwd_gsl="5.0,5.0,1.0,1.0" # settings for GSL drag suite export knob_ugwp_tauamp=0.8e-3 # setting for UGWPv1 non-stationary GWD @@ -187,11 +147,13 @@ case "${fv3_res}" in export layout_x=8 export layout_y=12 export layout_x_gfs=12 - export layout_y_gfs=12 - #JKHexport layout_y_gfs=16 + export layout_y_gfs=12 + #JKHexport layout_y_gfs=16 export nthreads_fv3=4 #JKHexport 
nthreads_fv3_gfs=4 export nthreads_fv3_gfs=2 + export nthreads_ufs=4 + export nthreads_ufs_gfs=4 export cdmbgwd="4.0,0.15,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export cdmbgwd_gsl="2.5,7.5,1.0,1.0" # settings for GSL drag suite export knob_ugwp_tauamp=0.5e-3 # setting for UGWPv1 non-stationary GWD @@ -210,6 +172,8 @@ case "${fv3_res}" in export layout_y_gfs=16 export nthreads_fv3=4 export nthreads_fv3_gfs=4 + export nthreads_ufs=4 + export nthreads_ufs_gfs=4 export cdmbgwd="4.0,0.10,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export cdmbgwd_gsl="1.67,8.8,1.0,1.0" # settings for GSL drag suite export knob_ugwp_tauamp=0.35e-3 # setting for UGWPv1 non-stationary GWD @@ -226,6 +190,8 @@ case "${fv3_res}" in export layout_y_gfs=32 export nthreads_fv3=4 export nthreads_fv3_gfs=4 + export nthreads_ufs=4 + export nthreads_ufs_gfs=4 export cdmbgwd="4.0,0.05,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export cdmbgwd_gsl="0.625,14.1,1.0,1.0" # settings for GSL drag suite export knob_ugwp_tauamp=0.13e-3 # setting for UGWPv1 non-stationary GWD @@ -468,6 +434,14 @@ if [[ "${skip_ww3}" == "false" ]]; then "mx025") ntasks_ww3=80 ;; + "uglo_100km") + ntasks_ww3=40 + nthreads_ww3=1 + ;; + "uglo_m1g16") + ntasks_ww3=1000 + nthreads_ww3=1 + ;; *) echo "FATAL ERROR: Unsupported WW3 resolution = ${ww3_res}, ABORT!" 
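A pattern repeated throughout the config.resources hunks is replacing `var=$(echo "..." | bc)` pipelines with native arithmetic expansion, e.g. `npe_atmensanlrun=$(( layout_x * layout_y * 6 ))` (six cube faces times the per-face MPI layout). A sketch of the equivalence — the variable values here are illustrative, not taken from any real configuration:

```shell
#!/usr/bin/env bash
# Illustrative values only; real layouts come from config.ufs / config.resources.
layout_x=8
layout_y=12
npe_node_max=128
nth_eupd=4

# Old style removed by the patch (forks a subshell plus external bc):
#   npe=$(echo "${layout_x} * ${layout_y} * 6" | bc)

# New style: builtin arithmetic expansion, no external processes
npe=$(( layout_x * layout_y * 6 ))        # 6 cube faces x per-face layout
npe_node=$(( npe_node_max / nth_eupd ))   # integer division, as bc defaulted to

echo "${npe} ${npe_node}"   # -> 576 32
```

Besides avoiding two forks per assignment, `$(( ))` also lets `export npe_node_eupd=$(( ... ))` be written on one line, which is why the separate `export` statements disappear in these hunks.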
exit 1 @@ -484,39 +458,45 @@ if [[ "${skip_gocart}" == "false" ]]; then fi # Set the name of the UFS (previously nems) configure template to use +# Default ufs.configure templates for supported model configurations +if [[ "${USE_ESMF_THREADING:-}" == "YES" ]]; then + tmpl_suffix="_esmf" +fi case "${model_list}" in atm) - export ufs_configure_template="${HOMEgfs}/parm/ufs/ufs.configure.atm.IN" + default_template="${PARMgfs}/ufs/ufs.configure.atm${tmpl_suffix:-}.IN" ;; atm.aero) - export ufs_configure_template="${HOMEgfs}/parm/ufs/ufs.configure.atm_aero.IN" + default_template="${PARMgfs}/ufs/ufs.configure.atmaero${tmpl_suffix:-}.IN" ;; atm.wave) - export ufs_configure_template="${HOMEgfs}/parm/ufs/ufs.configure.leapfrog_atm_wav.IN" + default_template="${PARMgfs}/ufs/ufs.configure.leapfrog_atm_wav${tmpl_suffix:-}.IN" ;; atm.ocean.ice) - export ufs_configure_template="${HOMEgfs}/parm/ufs/ufs.configure.cpld.IN" + default_template="${PARMgfs}/ufs/ufs.configure.s2s${tmpl_suffix:-}.IN" ;; atm.ocean.ice.aero) - export ufs_configure_template="${HOMEgfs}/parm/ufs/ufs.configure.cpld_aero.IN" + default_template="${PARMgfs}/ufs/ufs.configure.s2sa${tmpl_suffix:-}.IN" ;; atm.ocean.ice.wave) - export ufs_configure_template="${HOMEgfs}/parm/ufs/ufs.configure.cpld_outerwave.IN" + default_template="${PARMgfs}/ufs/ufs.configure.s2sw${tmpl_suffix:-}.IN" ;; atm.ocean.ice.wave.aero) - export ufs_configure_template="${HOMEgfs}/parm/ufs/ufs.configure.cpld_aero_outerwave.IN" + default_template="${PARMgfs}/ufs/ufs.configure.s2swa${tmpl_suffix:-}.IN" ;; *) - echo "FATAL ERROR: Unable to determine appropriate UFS configure template for ${model_list}" + echo "FATAL ERROR: Unsupported UFSWM configuration for ${model_list}" exit 16 ;; esac +# Allow user to override the default template +export ufs_configure_template=${ufs_configure_template:-${default_template:-"/dev/null"}} +unset model_list default_template + if [[ ! 
-r "${ufs_configure_template}" ]]; then echo "FATAL ERROR: ${ufs_configure_template} either doesn't exist or is not readable." exit 17 fi -unset model_list - echo "END: config.ufs" diff --git a/parm/config/gfs/config.ufs_c768_16x16_2th_2wg40wt b/parm/config/gfs/config.ufs_c768_16x16_2th_2wg40wt index ad3f472873..80f3e2a3c6 100644 --- a/parm/config/gfs/config.ufs_c768_16x16_2th_2wg40wt +++ b/parm/config/gfs/config.ufs_c768_16x16_2th_2wg40wt @@ -15,7 +15,7 @@ if (( $# <= 1 )); then echo "--fv3 C48|C96|C192|C384|C768|C1152|C3072" echo "--mom6 500|100|025" echo "--cice6 500|100|025" - echo "--ww3 gnh_10m;aoc_9km;gsh_15m|gwes_30m|glo_025|glo_200|glo_500|mx025" + echo "--ww3 gnh_10m;aoc_9km;gsh_15m|gwes_30m|glo_025|glo_200|glo_500|mx025|uglo_100km|uglo_m1g16" echo "--gocart" exit 1 @@ -68,54 +68,6 @@ if [[ "${skip_mom6}" == "false" ]] || [[ "${skip_cice6}" == "false" ]] || [[ "${ skip_mediator=false fi -case "${machine}" in - "WCOSS2") - npe_node_max=128 - ;; - "HERA" | "ORION" ) - npe_node_max=40 - ;; - "HERCULES" ) - npe_node_max=80 - ;; - "JET") - case "${PARTITION_BATCH}" in - "xjet") - npe_node_max=24 - ;; - "vjet" | "sjet") - npe_node_max=16 - ;; - "kjet") - npe_node_max=40 - ;; - *) - echo "FATAL ERROR: Unsupported ${machine} PARTITION_BATCH = ${PARTITION_BATCH}, ABORT!" - exit 1 - ;; - esac - ;; - "S4") - case "${PARTITION_BATCH}" in - "s4") - npe_node_max=32 - ;; - "ivy") - npe_node_max=20 - ;; - *) - echo "FATAL ERROR: Unsupported ${machine} PARTITION_BATCH = ${PARTITION_BATCH}, ABORT!" 
- exit 1 - ;; - esac - ;; - *) - echo "FATAL ERROR: Unrecognized machine ${machine}" - exit 14 - ;; -esac -export npe_node_max - # (Standard) Model resolution dependent variables case "${fv3_res}" in "C48") @@ -126,6 +78,8 @@ case "${fv3_res}" in export layout_y_gfs=1 export nthreads_fv3=1 export nthreads_fv3_gfs=1 + export nthreads_ufs=1 + export nthreads_ufs_gfs=1 export cdmbgwd="0.071,2.1,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export cdmbgwd_gsl="40.0,1.77,1.0,1.0" # settings for GSL drag suite export knob_ugwp_tauamp=6.0e-3 # setting for UGWPv1 non-stationary GWD @@ -142,6 +96,8 @@ case "${fv3_res}" in export layout_y_gfs=2 export nthreads_fv3=1 export nthreads_fv3_gfs=1 + export nthreads_ufs=1 + export nthreads_ufs_gfs=1 export cdmbgwd="0.14,1.8,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export cdmbgwd_gsl="20.0,2.5,1.0,1.0" # settings for GSL drag suite export knob_ugwp_tauamp=3.0e-3 # setting for UGWPv1 non-stationary GWD @@ -158,6 +114,8 @@ case "${fv3_res}" in export layout_y_gfs=6 export nthreads_fv3=1 export nthreads_fv3_gfs=2 + export nthreads_ufs=1 + export nthreads_ufs_gfs=2 export cdmbgwd="0.23,1.5,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export cdmbgwd_gsl="10.0,3.5,1.0,1.0" # settings for GSL drag suite export knob_ugwp_tauamp=1.5e-3 # setting for UGWPv1 non-stationary GWD @@ -174,6 +132,8 @@ case "${fv3_res}" in export layout_y_gfs=8 export nthreads_fv3=2 export nthreads_fv3_gfs=2 + export nthreads_ufs=2 + export nthreads_ufs_gfs=2 export cdmbgwd="1.1,0.72,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export cdmbgwd_gsl="5.0,5.0,1.0,1.0" # settings for GSL drag suite export knob_ugwp_tauamp=0.8e-3 # setting for UGWPv1 non-stationary GWD @@ -192,6 +152,8 @@ case "${fv3_res}" in export nthreads_fv3=4 #JKHexport nthreads_fv3_gfs=4 export nthreads_fv3_gfs=2 + export nthreads_ufs=4 + export nthreads_ufs_gfs=4 export cdmbgwd="4.0,0.15,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src 
scaling export cdmbgwd_gsl="2.5,7.5,1.0,1.0" # settings for GSL drag suite export knob_ugwp_tauamp=0.5e-3 # setting for UGWPv1 non-stationary GWD @@ -210,6 +172,8 @@ case "${fv3_res}" in export layout_y_gfs=16 export nthreads_fv3=4 export nthreads_fv3_gfs=4 + export nthreads_ufs=4 + export nthreads_ufs_gfs=4 export cdmbgwd="4.0,0.10,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export cdmbgwd_gsl="1.67,8.8,1.0,1.0" # settings for GSL drag suite export knob_ugwp_tauamp=0.35e-3 # setting for UGWPv1 non-stationary GWD @@ -226,6 +190,8 @@ case "${fv3_res}" in export layout_y_gfs=32 export nthreads_fv3=4 export nthreads_fv3_gfs=4 + export nthreads_ufs=4 + export nthreads_ufs_gfs=4 export cdmbgwd="4.0,0.05,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export cdmbgwd_gsl="0.625,14.1,1.0,1.0" # settings for GSL drag suite export knob_ugwp_tauamp=0.13e-3 # setting for UGWPv1 non-stationary GWD @@ -468,6 +434,14 @@ if [[ "${skip_ww3}" == "false" ]]; then "mx025") ntasks_ww3=80 ;; + "uglo_100km") + ntasks_ww3=40 + nthreads_ww3=1 + ;; + "uglo_m1g16") + ntasks_ww3=1000 + nthreads_ww3=1 + ;; *) echo "FATAL ERROR: Unsupported WW3 resolution = ${ww3_res}, ABORT!" 
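The `"arch" | "earc" | "getic"` hunk near the top of this patch swaps `eval "export wtime_${step}='06:00:00'"` for `declare -x "wtime_${step}"="06:00:00"`. Both create an exported variable whose name is computed at runtime, but `declare -x` assigns the value directly instead of re-parsing (and potentially re-executing) a constructed command string. A minimal sketch; the step name is illustrative:

```shell
#!/usr/bin/env bash
step="arch"   # illustrative step name

# eval re-parses the whole string; quoting mistakes can execute data
eval "export wtime_${step}='06:00:00'"

# declare -x assigns and exports directly; the value is never re-parsed
declare -x "npe_${step}"="1"
declare -x "memory_${step}"="4096M"

# Indirect expansion reads a computed name back
name="memory_${step}"
echo "${wtime_arch} ${npe_arch} ${!name}"   # -> 06:00:00 1 4096M
```

Note that `declare` inside a function creates a function-local variable by default; at top level, as in config.resources, it behaves like a plain exported assignment.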
exit 1 @@ -484,39 +458,45 @@ if [[ "${skip_gocart}" == "false" ]]; then fi # Set the name of the UFS (previously nems) configure template to use +# Default ufs.configure templates for supported model configurations +if [[ "${USE_ESMF_THREADING:-}" == "YES" ]]; then + tmpl_suffix="_esmf" +fi case "${model_list}" in atm) - export ufs_configure_template="${HOMEgfs}/parm/ufs/ufs.configure.atm.IN" + default_template="${PARMgfs}/ufs/ufs.configure.atm${tmpl_suffix:-}.IN" ;; atm.aero) - export ufs_configure_template="${HOMEgfs}/parm/ufs/ufs.configure.atm_aero.IN" + default_template="${PARMgfs}/ufs/ufs.configure.atmaero${tmpl_suffix:-}.IN" ;; atm.wave) - export ufs_configure_template="${HOMEgfs}/parm/ufs/ufs.configure.leapfrog_atm_wav.IN" + default_template="${PARMgfs}/ufs/ufs.configure.leapfrog_atm_wav${tmpl_suffix:-}.IN" ;; atm.ocean.ice) - export ufs_configure_template="${HOMEgfs}/parm/ufs/ufs.configure.cpld.IN" + default_template="${PARMgfs}/ufs/ufs.configure.s2s${tmpl_suffix:-}.IN" ;; atm.ocean.ice.aero) - export ufs_configure_template="${HOMEgfs}/parm/ufs/ufs.configure.cpld_aero.IN" + default_template="${PARMgfs}/ufs/ufs.configure.s2sa${tmpl_suffix:-}.IN" ;; atm.ocean.ice.wave) - export ufs_configure_template="${HOMEgfs}/parm/ufs/ufs.configure.cpld_outerwave.IN" + default_template="${PARMgfs}/ufs/ufs.configure.s2sw${tmpl_suffix:-}.IN" ;; atm.ocean.ice.wave.aero) - export ufs_configure_template="${HOMEgfs}/parm/ufs/ufs.configure.cpld_aero_outerwave.IN" + default_template="${PARMgfs}/ufs/ufs.configure.s2swa${tmpl_suffix:-}.IN" ;; *) - echo "FATAL ERROR: Unable to determine appropriate UFS configure template for ${model_list}" + echo "FATAL ERROR: Unsupported UFSWM configuration for ${model_list}" exit 16 ;; esac +# Allow user to override the default template +export ufs_configure_template=${ufs_configure_template:-${default_template:-"/dev/null"}} +unset model_list default_template + if [[ ! 
-r "${ufs_configure_template}" ]]; then echo "FATAL ERROR: ${ufs_configure_template} either doesn't exist or is not readable." exit 17 fi -unset model_list - echo "END: config.ufs" diff --git a/parm/config/gfs/config.ufs_emc b/parm/config/gfs/config.ufs_emc new file mode 100644 index 0000000000..0db6f090a5 --- /dev/null +++ b/parm/config/gfs/config.ufs_emc @@ -0,0 +1,498 @@ +#! /usr/bin/env bash + +########## config.ufs ########## +# UFS model resolution specific parameters +# e.g. time-step, processor layout, physics and dynamics parameters +# This config sets default variables for FV3, MOM6, CICE6 for their resolutions +# User can over-ride after sourcing this config file + +echo "BEGIN: config.ufs" + +if (( $# <= 1 )); then + + echo "Must specify an input resolution argument to set variables!" + echo "argument can be any one of the following:" + echo "--fv3 C48|C96|C192|C384|C768|C1152|C3072" + echo "--mom6 500|100|025" + echo "--cice6 500|100|025" + echo "--ww3 gnh_10m;aoc_9km;gsh_15m|gwes_30m|glo_025|glo_200|glo_500|mx025|uglo_100km|uglo_m1g16" + echo "--gocart" + + exit 1 + +fi + +# Initialize +skip_mom6=true +skip_cice6=true +skip_ww3=true +skip_gocart=true +skip_mediator=true + +# Loop through named arguments +while (( $# > 0 )); do + key="$1" + case "${key}" in + "--fv3") + fv3_res="$2" + shift + ;; + "--mom6") + mom6_res="$2" + skip_mom6=false + shift + ;; + "--cice6") + cice6_res="$2" + skip_cice6=false + shift + ;; + "--ww3") + ww3_res="$2" + skip_ww3=false + shift + ;; + "--gocart") + skip_gocart=false + ;; + *) # unknown option + echo "FATAL ERROR: Unknown option: ${key}, ABORT!" 
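The new config.ufs_emc above parses its named arguments with a `while`/`case`/`shift` loop: value-taking flags (`--fv3`, `--mom6`, ...) `shift` an extra time inside the `case` arm to consume their argument, while boolean flags (`--gocart`) do not. A standalone sketch of the same pattern, wrapped in an illustrative function (the wrapper and the trimmed flag set are not part of the actual script):

```shell
#!/usr/bin/env bash
# Sketch of the config.ufs_emc argument loop, reduced to three flags.
parse_ufs_args() {
  skip_mom6=true skip_gocart=true fv3_res=""
  while (( $# > 0 )); do
    local key="$1"
    case "${key}" in
      "--fv3")  fv3_res="$2"; shift ;;                 # value flag: consume its argument
      "--mom6") mom6_res="$2"; skip_mom6=false; shift ;;
      "--gocart") skip_gocart=false ;;                 # boolean flag: nothing extra to shift
      *) echo "FATAL ERROR: Unknown option: ${key}, ABORT!" >&2; return 1 ;;
    esac
    shift                                              # consume the flag itself
  done
}

parse_ufs_args --fv3 C96 --gocart
echo "${fv3_res} ${skip_mom6} ${skip_gocart}"   # -> C96 true false
```

The double-shift convention is why the usage message lists each resolution flag with an argument but `--gocart` bare.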
+      exit 1
+      ;;
+  esac
+  shift
+done
+
+# Mediator is required if any of the non-ATM components are used
+if [[ "${skip_mom6}" == "false" ]] || [[ "${skip_cice6}" == "false" ]] || [[ "${skip_ww3}" == "false" ]]; then
+  skip_mediator=false
+fi
+
+# (Standard) Model resolution dependent variables
+case "${fv3_res}" in
+  "C48")
+    export DELTIM=1200
+    export layout_x=1
+    export layout_y=1
+    export layout_x_gfs=1
+    export layout_y_gfs=1
+    export nthreads_fv3=1
+    export nthreads_fv3_gfs=1
+    export nthreads_ufs=1
+    export nthreads_ufs_gfs=1
+    export cdmbgwd="0.071,2.1,1.0,1.0"  # mountain blocking, ogwd, cgwd, cgwd src scaling
+    export cdmbgwd_gsl="40.0,1.77,1.0,1.0"  # settings for GSL drag suite
+    export knob_ugwp_tauamp=6.0e-3  # setting for UGWPv1 non-stationary GWD
+    export WRITE_GROUP=1
+    export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE=1
+    export WRITE_GROUP_GFS=1
+    export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE_GFS=1
+    ;;
+  "C96")
+    export DELTIM=600
+    export layout_x=2
+    export layout_y=2
+    export layout_x_gfs=2
+    export layout_y_gfs=2
+    export nthreads_fv3=1
+    export nthreads_fv3_gfs=1
+    export nthreads_ufs=1
+    export nthreads_ufs_gfs=1
+    export cdmbgwd="0.14,1.8,1.0,1.0"  # mountain blocking, ogwd, cgwd, cgwd src scaling
+    export cdmbgwd_gsl="20.0,2.5,1.0,1.0"  # settings for GSL drag suite
+    export knob_ugwp_tauamp=3.0e-3  # setting for UGWPv1 non-stationary GWD
+    export WRITE_GROUP=1
+    export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE=1
+    export WRITE_GROUP_GFS=1
+    export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE_GFS=1
+    ;;
+  "C192")
+    export DELTIM=450
+    export layout_x=4
+    export layout_y=6
+    export layout_x_gfs=4
+    export layout_y_gfs=6
+    export nthreads_fv3=1
+    export nthreads_fv3_gfs=2
+    export nthreads_ufs=1
+    export nthreads_ufs_gfs=2
+    export cdmbgwd="0.23,1.5,1.0,1.0"  # mountain blocking, ogwd, cgwd, cgwd src scaling
+    export cdmbgwd_gsl="10.0,3.5,1.0,1.0"  # settings for GSL drag suite
+    export knob_ugwp_tauamp=1.5e-3  # setting for UGWPv1 non-stationary GWD
+    export WRITE_GROUP=1
+    export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE=10
+    export WRITE_GROUP_GFS=2
+    export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE_GFS=5
+    ;;
+  "C384")
+    export DELTIM=300
+    export layout_x=8
+    export layout_y=8
+    export layout_x_gfs=8
+    export layout_y_gfs=8
+    export nthreads_fv3=2
+    export nthreads_fv3_gfs=2
+    export nthreads_ufs=2
+    export nthreads_ufs_gfs=2
+    export cdmbgwd="1.1,0.72,1.0,1.0"  # mountain blocking, ogwd, cgwd, cgwd src scaling
+    export cdmbgwd_gsl="5.0,5.0,1.0,1.0"  # settings for GSL drag suite
+    export knob_ugwp_tauamp=0.8e-3  # setting for UGWPv1 non-stationary GWD
+    export WRITE_GROUP=4
+    export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE=10
+    export WRITE_GROUP_GFS=4
+    export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE_GFS=10
+    ;;
+  "C768")
+    export DELTIM=150
+    export layout_x=8
+    export layout_y=12
+    export layout_x_gfs=12
+    export layout_y_gfs=16
+    export nthreads_fv3=4
+    export nthreads_fv3_gfs=4
+    export nthreads_ufs=4
+    export nthreads_ufs_gfs=4
+    export cdmbgwd="4.0,0.15,1.0,1.0"  # mountain blocking, ogwd, cgwd, cgwd src scaling
+    export cdmbgwd_gsl="2.5,7.5,1.0,1.0"  # settings for GSL drag suite
+    export knob_ugwp_tauamp=0.5e-3  # setting for UGWPv1 non-stationary GWD
+    export WRITE_GROUP=2
+    export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE=10
+    export WRITE_GROUP_GFS=4
+    export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE_GFS=20  #Note this should be 10 for WCOSS2
+    ;;
+  "C1152")
+    export DELTIM=120
+    export layout_x=8
+    export layout_y=16
+    export layout_x_gfs=8
+    export layout_y_gfs=16
+    export nthreads_fv3=4
+    export nthreads_fv3_gfs=4
+    export nthreads_ufs=4
+    export nthreads_ufs_gfs=4
+    export cdmbgwd="4.0,0.10,1.0,1.0"  # mountain blocking, ogwd, cgwd, cgwd src scaling
+    export cdmbgwd_gsl="1.67,8.8,1.0,1.0"  # settings for GSL drag suite
+    export knob_ugwp_tauamp=0.35e-3  # setting for UGWPv1 non-stationary GWD
+    export WRITE_GROUP=4
+    export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE=10  # TODO: refine these numbers when a case is available
+    export WRITE_GROUP_GFS=4
+    export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE_GFS=20  # TODO: refine these numbers when a case is available
+    ;;
+  "C3072")
+    export DELTIM=90
+    export layout_x=16
+    export layout_y=32
+    export layout_x_gfs=16
+    export layout_y_gfs=32
+    export nthreads_fv3=4
+    export nthreads_fv3_gfs=4
+    export nthreads_ufs=4
+    export nthreads_ufs_gfs=4
+    export cdmbgwd="4.0,0.05,1.0,1.0"  # mountain blocking, ogwd, cgwd, cgwd src scaling
+    export cdmbgwd_gsl="0.625,14.1,1.0,1.0"  # settings for GSL drag suite
+    export knob_ugwp_tauamp=0.13e-3  # setting for UGWPv1 non-stationary GWD
+    export WRITE_GROUP=4
+    export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE=10  # TODO: refine these numbers when a case is available
+    export WRITE_GROUP_GFS=4
+    export WRTTASK_PER_GROUP_PER_THREAD_PER_TILE_GFS=10  # TODO: refine these numbers when a case is available
+    ;;
+  *)
+    echo "FATAL ERROR: Unsupported FV3 resolution = ${fv3_res}, ABORT!"
+    exit 1
+    ;;
+esac
+
+(( WRTTASK_PER_GROUP_PER_THREAD = WRTTASK_PER_GROUP_PER_THREAD_PER_TILE * 6 ))
+(( WRTTASK_PER_GROUP_PER_THREAD_GFS = WRTTASK_PER_GROUP_PER_THREAD_PER_TILE_GFS * 6 ))
+export WRTTASK_PER_GROUP_PER_THREAD
+export WRTTASK_PER_GROUP_PER_THREAD_GFS
+
+(( ntasks_fv3 = layout_x * layout_y * 6 ))
+(( ntasks_fv3_gfs = layout_x_gfs * layout_y_gfs * 6 ))
+export ntasks_fv3
+export ntasks_fv3_gfs
+
+(( ntasks_quilt = WRITE_GROUP * WRTTASK_PER_GROUP_PER_THREAD ))
+(( ntasks_quilt_gfs = WRITE_GROUP_GFS * WRTTASK_PER_GROUP_PER_THREAD_GFS ))
+export ntasks_quilt
+export ntasks_quilt_gfs
+
+# Determine whether to use parallel NetCDF based on resolution
+case ${fv3_res} in
+  "C48" | "C96" | "C192" | "C384")
+    OUTPUT_FILETYPE_ATM="netcdf"
+    OUTPUT_FILETYPE_SFC="netcdf"
+    ;;
+  "C768" | "C1152" | "C3072")
+    OUTPUT_FILETYPE_ATM="netcdf_parallel"
+    OUTPUT_FILETYPE_SFC="netcdf_parallel"
+    ;;
+  *)
+    echo "FATAL ERROR: Unrecognized FV3 resolution ${fv3_res}"
+    exit 15
+    ;;
+esac
+export OUTPUT_FILETYPE_ATM OUTPUT_FILETYPE_SFC
+
+# cpl defaults
+export cpl=".false."
+export cplflx=".false."
+export cplice=".false."
+export cplchm=".false."
+export cplwav=".false."
+export cplwav2atm=".false."
+export CCPP_SUITE="FV3_GFS_v17_p8_ugwpv1"
+model_list="atm"
+
+# Mediator specific settings
+if [[ "${skip_mediator}" == "false" ]]; then
+  export cpl=".true."
+  export nthreads_mediator=${nthreads_fv3}  # Use same threads as FV3
+  export CCPP_SUITE="FV3_GFS_v17_coupled_p8_ugwpv1"  # TODO: Does this include FV3_GFS_v17_p8? Can this be used instead of FV3_GFS_v17_p8?
+fi
+
+# MOM6 specific settings
+if [[ "${skip_mom6}" == "false" ]]; then
+  source "${EXPDIR}/config.ocn"
+  export cplflx=".true."
+  model_list="${model_list}.ocean"
+  nthreads_mom6=1
+  case "${mom6_res}" in
+    "500")
+      ntasks_mom6=8
+      OCNTIM=3600
+      NX_GLB=72
+      NY_GLB=35
+      DT_DYNAM_MOM6='3600'
+      DT_THERM_MOM6='3600'
+      FRUNOFF=""
+      CHLCLIM="seawifs_1998-2006_smoothed_2X.nc"
+      MOM6_RESTART_SETTING='r'
+      MOM6_RIVER_RUNOFF='False'
+      eps_imesh="4.0e-1"
+      MOM6_DIAG_COORD_DEF_Z_FILE="oceanda_zgrid_25L.nc"
+      MOM6_DIAG_MISVAL="0.0"
+      MOM6_ALLOW_LANDMASK_CHANGES='False'
+      TOPOEDITS=""
+      ;;
+    "100")
+      ntasks_mom6=20
+      OCNTIM=3600
+      NX_GLB=360
+      NY_GLB=320
+      DT_DYNAM_MOM6='1800'
+      DT_THERM_MOM6='3600'
+      FRUNOFF="runoff.daitren.clim.1deg.nc"
+      CHLCLIM="seawifs_1998-2006_smoothed_2X.nc"
+      MOM6_RESTART_SETTING='r'
+      MOM6_RIVER_RUNOFF='False'
+      eps_imesh="2.5e-1"
+      TOPOEDITS="ufs.topo_edits_011818.nc"
+      if [[ "${DO_JEDIOCNVAR:-NO}" = "YES" ]]; then
+        MOM6_DIAG_COORD_DEF_Z_FILE="oceanda_zgrid_75L.nc"
+        MOM6_DIAG_MISVAL="0.0"
+      else
+        MOM6_DIAG_COORD_DEF_Z_FILE="interpolate_zgrid_40L.nc"
+        MOM6_DIAG_MISVAL="-1e34"
+      fi
+      MOM6_ALLOW_LANDMASK_CHANGES='True'
+      ;;
+    "050")
+      ntasks_mom6=60
+      OCNTIM=3600
+      NX_GLB=720
+      NY_GLB=576
+      DT_DYNAM_MOM6='1800'
+      DT_THERM_MOM6='3600'
+      FRUNOFF="runoff.daitren.clim.${NX_GLB}x${NY_GLB}.v20180328.nc"
+      CHLCLIM="seawifs-clim-1997-2010.${NX_GLB}x${NY_GLB}.v20180328.nc"
+      MOM6_RESTART_SETTING='n'
+      MOM6_RIVER_RUNOFF='True'
+      eps_imesh="1.0e-1"
+      if [[ "${DO_JEDIOCNVAR:-NO}" = "YES" ]]; then
+        MOM6_DIAG_COORD_DEF_Z_FILE="oceanda_zgrid_75L.nc"
+        MOM6_DIAG_MISVAL="0.0"
+      else
+        MOM6_DIAG_COORD_DEF_Z_FILE="interpolate_zgrid_40L.nc"
+        MOM6_DIAG_MISVAL="-1e34"
+      fi
+      MOM6_ALLOW_LANDMASK_CHANGES='False'
+      TOPOEDITS=""
+      ;;
+    "025")
+      ntasks_mom6=220
+      OCNTIM=1800
+      NX_GLB=1440
+      NY_GLB=1080
+      DT_DYNAM_MOM6='900'
+      DT_THERM_MOM6='1800'
+      FRUNOFF="runoff.daitren.clim.${NX_GLB}x${NY_GLB}.v20180328.nc"
+      CHLCLIM="seawifs-clim-1997-2010.${NX_GLB}x${NY_GLB}.v20180328.nc"
+      MOM6_RIVER_RUNOFF='True'
+      MOM6_RESTART_SETTING="r"
+      eps_imesh="1.0e-1"
+      if [[ "${DO_JEDIOCNVAR:-NO}" = "YES" ]]; then
+        MOM6_DIAG_COORD_DEF_Z_FILE="oceanda_zgrid_75L.nc"
+        MOM6_DIAG_MISVAL="0.0"
+      else
+        MOM6_DIAG_COORD_DEF_Z_FILE="interpolate_zgrid_40L.nc"
+        MOM6_DIAG_MISVAL="-1e34"
+      fi
+      MOM6_ALLOW_LANDMASK_CHANGES='False'
+      TOPOEDITS=""
+      ;;
+    *)
+      echo "FATAL ERROR: Unsupported MOM6 resolution = ${mom6_res}, ABORT!"
+      exit 1
+      ;;
+  esac
+
+  export nthreads_mom6 ntasks_mom6
+  export OCNTIM
+  export NX_GLB NY_GLB
+  export DT_DYNAM_MOM6 DT_THERM_MOM6
+  export FRUNOFF
+  export CHLCLIM
+  export TOPOEDITS
+  export MOM6_RIVER_RUNOFF
+  export MOM6_RESTART_SETTING
+  export eps_imesh
+  export MOM6_DIAG_COORD_DEF_Z_FILE
+  export MOM6_DIAG_MISVAL
+  export MOM6_ALLOW_LANDMASK_CHANGES
+fi
+
+# CICE6 specific settings
+if [[ "${skip_cice6}" == "false" ]]; then
+  source "${EXPDIR}/config.ice"
+  export cplice=".true."
+  model_list="${model_list}.ice"
+  # Ensure we sourced the MOM6 section
+  if [[ "${skip_mom6}" == "true" ]]; then
+    echo "FATAL ERROR: CICE6 cannot be configured without MOM6, ABORT!"
+    exit 1
+  fi
+
+  nthreads_cice6=${nthreads_mom6}  # CICE6 needs to run on same threads as MOM6
+  case "${cice6_res}" in
+    "500")
+      ntasks_cice6=4
+      cice6_processor_shape="slenderX1"
+      ;;
+    "100")
+      ntasks_cice6=10
+      cice6_processor_shape="slenderX2"
+      ;;
+    "050")
+      ntasks_cice6=30
+      cice6_processor_shape="slenderX2"
+      ;;
+    "025")
+      ntasks_cice6=120
+      cice6_processor_shape="slenderX2"
+      ;;
+    *)
+      echo "FATAL ERROR: Unsupported CICE6 resolution = ${cice6_res}, ABORT!"
+      exit 1
+      ;;
+  esac
+  # NX_GLB and NY_GLB are set in the MOM6 section above
+  # CICE6 runs on the same domain decomposition as MOM6
+  export nthreads_cice6 ntasks_cice6
+  export cice6_processor_shape
+fi
+
+# WW3 specific settings
+if [[ "${skip_ww3}" == "false" ]]; then
+  source "${EXPDIR}/config.wave"
+  export cplwav=".true."
+  export cplwav2atm=".true."
+  model_list="${model_list}.wave"
+  nthreads_ww3=2
+  case "${ww3_res}" in
+    "gnh_10m;aoc_9km;gsh_15m")
+      ntasks_ww3=140
+      ;;
+    "gwes_30m")
+      ntasks_ww3=100
+      ;;
+    "glo_025")
+      ntasks_ww3=262
+      ;;
+    "glo_200")
+      ntasks_ww3=30
+      nthreads_ww3=1
+      ;;
+    "glo_500")
+      ntasks_ww3=12
+      nthreads_ww3=1
+      ;;
+    "mx025")
+      ntasks_ww3=80
+      ;;
+    "uglo_100km")
+      ntasks_ww3=40
+      nthreads_ww3=1
+      ;;
+    "uglo_m1g16")
+      ntasks_ww3=1000
+      nthreads_ww3=1
+      ;;
+    *)
+      echo "FATAL ERROR: Unsupported WW3 resolution = ${ww3_res}, ABORT!"
+      exit 1
+      ;;
+  esac
+  export ntasks_ww3 nthreads_ww3
+fi
+
+# GOCART specific settings
+if [[ "${skip_gocart}" == "false" ]]; then
+  source "${EXPDIR}/config.aero"
+  export cplchm=".true."
+  model_list="${model_list}.aero"
+fi
+
+# Set the name of the UFS (previously nems) configure template to use
+# Default ufs.configure templates for supported model configurations
+if [[ "${USE_ESMF_THREADING:-}" == "YES" ]]; then
+  tmpl_suffix="_esmf"
+fi
+case "${model_list}" in
+  atm)
+    default_template="${PARMgfs}/ufs/ufs.configure.atm${tmpl_suffix:-}.IN"
+    ;;
+  atm.aero)
+    default_template="${PARMgfs}/ufs/ufs.configure.atmaero${tmpl_suffix:-}.IN"
+    ;;
+  atm.wave)
+    default_template="${PARMgfs}/ufs/ufs.configure.leapfrog_atm_wav${tmpl_suffix:-}.IN"
+    ;;
+  atm.ocean.ice)
+    default_template="${PARMgfs}/ufs/ufs.configure.s2s${tmpl_suffix:-}.IN"
+    ;;
+  atm.ocean.ice.aero)
+    default_template="${PARMgfs}/ufs/ufs.configure.s2sa${tmpl_suffix:-}.IN"
+    ;;
+  atm.ocean.ice.wave)
+    default_template="${PARMgfs}/ufs/ufs.configure.s2sw${tmpl_suffix:-}.IN"
+    ;;
+  atm.ocean.ice.wave.aero)
+    default_template="${PARMgfs}/ufs/ufs.configure.s2swa${tmpl_suffix:-}.IN"
+    ;;
+  *)
+    echo "FATAL ERROR: Unsupported UFSWM configuration for ${model_list}"
+    exit 16
+    ;;
+esac
+
+# Allow user to override the default template
+export ufs_configure_template=${ufs_configure_template:-${default_template:-"/dev/null"}}
+unset model_list default_template
+
+if [[ ! -r "${ufs_configure_template}" ]]; then
+  echo "FATAL ERROR: ${ufs_configure_template} either doesn't exist or is not readable."
+  exit 17
+fi
+
+echo "END: config.ufs"
diff --git a/parm/config/gfs/config.upp b/parm/config/gfs/config.upp
index a1bd0a7d34..41015c2fee 100644
--- a/parm/config/gfs/config.upp
+++ b/parm/config/gfs/config.upp
@@ -8,7 +8,7 @@ echo "BEGIN: config.upp"
 # Get task specific resources
 . "${EXPDIR}/config.resources" upp
 
-export UPP_CONFIG="${HOMEgfs}/parm/post/upp.yaml"
+export UPP_CONFIG="${PARMgfs}/post/upp.yaml"
 
 # No. of forecast hours to process in a single job
 export NFHRS_PER_GROUP=3
diff --git a/parm/config/gfs/config.verfozn b/parm/config/gfs/config.verfozn
index 9eea0f25a3..df7d18012d 100644
--- a/parm/config/gfs/config.verfozn
+++ b/parm/config/gfs/config.verfozn
@@ -9,15 +9,14 @@ echo "BEGIN: config.verfozn"
 export DO_DATA_RPT=1
 export OZN_AREA="glb"
 export OZNMON_SUFFIX=${NET}
-export PARMmonitor=${PARMgfs}/monitor
-export SATYPE_FILE=${PARMmonitor}/gdas_oznmon_satype.txt
+export SATYPE_FILE=${PARMgfs}/monitor/gdas_oznmon_satype.txt
 
 # Source the parm file
-. "${PARMmonitor}/gdas_oznmon.parm"
+. "${PARMgfs}/monitor/gdas_oznmon.parm"
 
 # Set up validation file
 if [[ ${VALIDATE_DATA} -eq 1 ]]; then
-  export ozn_val_file=${PARMmonitor}/gdas_oznmon_base.tar
+  export ozn_val_file=${PARMgfs}/monitor/gdas_oznmon_base.tar
 fi
 
 echo "END: config.verfozn"
diff --git a/parm/config/gfs/config.verfrad b/parm/config/gfs/config.verfrad
index dd65020180..506ce50b4f 100644
--- a/parm/config/gfs/config.verfrad
+++ b/parm/config/gfs/config.verfrad
@@ -6,11 +6,10 @@ echo "BEGIN: config.verfrad"
 # Get task specific resources
 . "${EXPDIR}/config.resources" verfrad
 
-export PARMmonitor=${PARMgfs}/monitor
-export satype_file=${PARMmonitor}/gdas_radmon_satype.txt
+export satype_file=${PARMgfs}/monitor/gdas_radmon_satype.txt
 
 # Source the parm file
-. "${PARMmonitor}/da_mon.parm"
+. "${PARMgfs}/monitor/da_mon.parm"
 
 # Other variables
 export RAD_AREA="glb"
diff --git a/parm/config/gfs/config.vminmon b/parm/config/gfs/config.vminmon
index 8929c36e0e..7c7d362161 100644
--- a/parm/config/gfs/config.vminmon
+++ b/parm/config/gfs/config.vminmon
@@ -9,8 +9,7 @@ echo "BEGIN: config.vminmon"
 export MINMON_SUFFIX=${MINMON_SUFFIX:-${NET}}
 export CYCLE_INTERVAL=${assim_freq:-6}
 
-export PARMmonitor=${PARMgfs}/monitor
-export mm_gnormfile=${PARMmonitor}/${RUN}_minmon_gnorm.txt
-export mm_costfile=${PARMmonitor}/${RUN}_minmon_cost.txt
+export mm_gnormfile=${PARMgfs}/monitor/${RUN}_minmon_gnorm.txt
+export mm_costfile=${PARMgfs}/monitor/${RUN}_minmon_cost.txt
 
 echo "END: config.vminmon"
diff --git a/parm/config/gfs/config.wave b/parm/config/gfs/config.wave
index acb4c518ba..6fbce69996 100644
--- a/parm/config/gfs/config.wave
+++ b/parm/config/gfs/config.wave
@@ -6,15 +6,6 @@ echo "BEGIN: config.wave"
 # Parameters that are common to all wave model steps
-
-# System and version
-export wave_sys_ver=v1.0.0
-
-export EXECwave="${HOMEgfs}/exec"
-export FIXwave="${HOMEgfs}/fix/wave"
-export PARMwave="${HOMEgfs}/parm/wave"
-export USHwave="${HOMEgfs}/ush"
-
 # This config contains variables/parameters used in the fcst step
 # Some others are also used across the workflow in wave component scripts
 
@@ -80,7 +71,19 @@ case "${waveGRD}" in
     export wavepostGRD='glo_500'
     export waveuoutpGRD=${waveGRD}
     ;;
-  *)
+  "uglo_100km")
+    #unstructured 100km grid
+    export waveinterpGRD='glo_200'
+    export wavepostGRD=''
+    export waveuoutpGRD=${waveGRD}
+    ;;
+  "uglo_m1g16")
+    #unstructured m1v16 grid
+    export waveinterpGRD='glo_15mxt'
+    export wavepostGRD=''
+    export waveuoutpGRD=${waveGRD}
+    ;;
+  *)
     echo "FATAL ERROR: No grid specific wave config values exist for ${waveGRD}. Aborting."
     exit 1
     ;;
diff --git a/parm/config/gfs/config.wavepostbndpnt b/parm/config/gfs/config.wavepostbndpnt
index dfeddc79b2..412c5fb42a 100644
--- a/parm/config/gfs/config.wavepostbndpnt
+++ b/parm/config/gfs/config.wavepostbndpnt
@@ -6,6 +6,6 @@ echo "BEGIN: config.wavepostbndpnt"
 # Get task specific resources
-. $EXPDIR/config.resources wavepostbndpnt
+source "${EXPDIR}/config.resources" wavepostbndpnt
 
 echo "END: config.wavepostbndpnt"
diff --git a/parm/config/gfs/config.wavepostbndpntbll b/parm/config/gfs/config.wavepostbndpntbll
index bb7224cc70..6695ab0f84 100644
--- a/parm/config/gfs/config.wavepostbndpntbll
+++ b/parm/config/gfs/config.wavepostbndpntbll
@@ -6,6 +6,6 @@ echo "BEGIN: config.wavepostbndpntbll"
 # Get task specific resources
-. $EXPDIR/config.resources wavepostbndpntbll
+source "${EXPDIR}/config.resources" wavepostbndpntbll
 
 echo "END: config.wavepostbndpntbll"
diff --git a/parm/config/gfs/config.wavepostpnt b/parm/config/gfs/config.wavepostpnt
index 8befb91760..e87237da82 100644
--- a/parm/config/gfs/config.wavepostpnt
+++ b/parm/config/gfs/config.wavepostpnt
@@ -6,6 +6,6 @@ echo "BEGIN: config.wavepostpnt"
 # Get task specific resources
-. $EXPDIR/config.resources wavepostpnt
+source "${EXPDIR}/config.resources" wavepostpnt
 
 echo "END: config.wavepostpnt"
diff --git a/parm/config/gfs/config.wavepostsbs b/parm/config/gfs/config.wavepostsbs
index 8e74aae069..b3c5902e3c 100644
--- a/parm/config/gfs/config.wavepostsbs
+++ b/parm/config/gfs/config.wavepostsbs
@@ -6,7 +6,7 @@ echo "BEGIN: config.wavepostsbs"
 # Get task specific resources
-. $EXPDIR/config.resources wavepostsbs
+source "${EXPDIR}/config.resources" wavepostsbs
 
 # Subgrid info for grib2 encoding
 export WAV_SUBGRBSRC=""
diff --git a/parm/config/gfs/yaml/defaults.yaml b/parm/config/gfs/yaml/defaults.yaml
index ade83fa484..521c7a03ba 100644
--- a/parm/config/gfs/yaml/defaults.yaml
+++ b/parm/config/gfs/yaml/defaults.yaml
@@ -3,15 +3,24 @@ base:
   DO_JEDIATMVAR: "NO"
   DO_JEDIATMENS: "NO"
   DO_JEDIOCNVAR: "NO"
-  DO_JEDILANDDA: "NO"
+  DO_JEDISNOWDA: "NO"
   DO_MERGENSST: "NO"
   DO_GOES: "NO"
+  FHMAX_GFS: 120
+  DO_VRFY_OCEANDA: "NO"
+  GSI_SOILANAL: "NO"
+  EUPD_CYC: "gdas"
+  FHMAX_ENKF_GFS: 12
 
 atmanl:
+  LAYOUT_X_ATMANL: 8
+  LAYOUT_Y_ATMANL: 8
   IO_LAYOUT_X: 1
   IO_LAYOUT_Y: 1
 
 atmensanl:
+  LAYOUT_X_ATMENSANL: 8
+  LAYOUT_Y_ATMENSANL: 8
   IO_LAYOUT_X: 1
   IO_LAYOUT_Y: 1
 
@@ -19,18 +28,19 @@ aeroanl:
   IO_LAYOUT_X: 1
   IO_LAYOUT_Y: 1
 
-landanl:
+snowanl:
   IO_LAYOUT_X: 1
   IO_LAYOUT_Y: 1
 
 ocnanal:
-  SOCA_INPUT_FIX_DIR: "/scratch2/NCEPDEV/ocean/Guillaume.Vernieres/data/static/72x35x25/soca"  # TODO: These need to go to glopara fix space. @guillaumevernieres will open an issue
-  CASE_ANL: "C48"
-  COMIN_OBS: "/scratch2/NCEPDEV/marineda/r2d2-v2-v3"  # TODO: make platform agnostic
-  SOCA_OBS_LIST: "{{ HOMEgfs }}/sorc/gdas.cd/parm/soca/obs/obs_list.yaml"
+  SOCA_INPUT_FIX_DIR: "/scratch2/NCEPDEV/ocean/Guillaume.Vernieres/data/static/72x35x25/soca"  # TODO: These need to go to glopara fix space.
+  CASE_ANL: "C48"  # TODO: Check in gdasapp if used anywhere for SOCA
+  SOCA_OBS_LIST: "${PARMgfs}/gdas/soca/obs/obs_list.yaml"  # TODO: This is also repeated in oceanprepobs
   SOCA_NINNER: 100
-  R2D2_OBS_SRC: "gdas_marine"
-  R2D2_OBS_DUMP: "s2s_v1"
   SABER_BLOCKS_YAML: ""
   NICAS_RESOL: 1
   NICAS_GRID_SIZE: 15000
+prepoceanobs:
+  SOCA_OBS_LIST: "${PARMgfs}/gdas/soca/obs/obs_list.yaml"  # TODO: This is also repeated in ocnanal
+  OBSPREP_YAML: "${PARMgfs}/gdas/soca/obsprep/obsprep_config.yaml"
+  DMPDIR: "/scratch1/NCEPDEV/global/glopara/data/experimental_obs"
diff --git a/parm/config/gfs/yaml/test_ci.yaml b/parm/config/gfs/yaml/test_ci.yaml
index bb9602be59..7425d4d029 100644
--- a/parm/config/gfs/yaml/test_ci.yaml
+++ b/parm/config/gfs/yaml/test_ci.yaml
@@ -1,4 +1,4 @@
 defaults:
-  !INC {{ HOMEgfs }}/parm/config/gfs/yaml/defaults.yaml
+  !INC {{ PARMgfs }}/config/gfs/yaml/defaults.yaml
 base:
   ACCOUNT: "nems"
diff --git a/parm/gdas/aero_crtm_coeff.yaml b/parm/gdas/aero_crtm_coeff.yaml
deleted file mode 100644
index 75b54c3741..0000000000
--- a/parm/gdas/aero_crtm_coeff.yaml
+++ /dev/null
@@ -1,13 +0,0 @@
-mkdir:
-- {{ DATA }}/crtm/
-copy:
-- [{{ CRTM_FIX }}/AerosolCoeff.bin, {{ DATA }}/crtm/]
-- [{{ CRTM_FIX }}/CloudCoeff.bin, {{ DATA }}/crtm/]
-- [{{ CRTM_FIX }}/v.viirs-m_npp.SpcCoeff.bin, {{ DATA }}/crtm/]
-- [{{ CRTM_FIX }}/v.viirs-m_npp.TauCoeff.bin, {{ DATA }}/crtm/]
-- [{{ CRTM_FIX }}/v.viirs-m_j1.SpcCoeff.bin, {{ DATA }}/crtm/]
-- [{{ CRTM_FIX }}/v.viirs-m_j1.TauCoeff.bin, {{ DATA }}/crtm/]
-- [{{ CRTM_FIX }}/NPOESS.VISice.EmisCoeff.bin, {{ DATA }}/crtm/]
-- [{{ CRTM_FIX }}/NPOESS.VISland.EmisCoeff.bin, {{ DATA }}/crtm/]
-- [{{ CRTM_FIX }}/NPOESS.VISsnow.EmisCoeff.bin, {{ DATA }}/crtm/]
-- [{{ CRTM_FIX }}/NPOESS.VISwater.EmisCoeff.bin, {{ DATA }}/crtm/]
diff --git a/parm/gdas/aero_crtm_coeff.yaml.j2 b/parm/gdas/aero_crtm_coeff.yaml.j2
new file mode 100644
index 0000000000..b48d8ff231
--- /dev/null
+++ b/parm/gdas/aero_crtm_coeff.yaml.j2
@@ -0,0 +1,13 @@
+mkdir:
+- '{{ DATA }}/crtm/'
+copy:
+- ['{{ CRTM_FIX }}/AerosolCoeff.bin', '{{ DATA }}/crtm/']
+- ['{{ CRTM_FIX }}/CloudCoeff.bin', '{{ DATA }}/crtm/']
+- ['{{ CRTM_FIX }}/v.viirs-m_npp.SpcCoeff.bin', '{{ DATA }}/crtm/']
+- ['{{ CRTM_FIX }}/v.viirs-m_npp.TauCoeff.bin', '{{ DATA }}/crtm/']
+- ['{{ CRTM_FIX }}/v.viirs-m_j1.SpcCoeff.bin', '{{ DATA }}/crtm/']
+- ['{{ CRTM_FIX }}/v.viirs-m_j1.TauCoeff.bin', '{{ DATA }}/crtm/']
+- ['{{ CRTM_FIX }}/NPOESS.VISice.EmisCoeff.bin', '{{ DATA }}/crtm/']
+- ['{{ CRTM_FIX }}/NPOESS.VISland.EmisCoeff.bin', '{{ DATA }}/crtm/']
+- ['{{ CRTM_FIX }}/NPOESS.VISsnow.EmisCoeff.bin', '{{ DATA }}/crtm/']
+- ['{{ CRTM_FIX }}/NPOESS.VISwater.EmisCoeff.bin', '{{ DATA }}/crtm/']
diff --git a/parm/gdas/aero_jedi_fix.yaml b/parm/gdas/aero_jedi_fix.yaml
deleted file mode 100644
index 85a00c3c30..0000000000
--- a/parm/gdas/aero_jedi_fix.yaml
+++ /dev/null
@@ -1,11 +0,0 @@
-mkdir:
-- !ENV ${DATA}/fv3jedi
-copy:
-- - !ENV ${FIXgdas}/fv3jedi/fv3files/akbk$(npz).nc4
-  - !ENV ${DATA}/fv3jedi/akbk.nc4
-- - !ENV ${FIXgdas}/fv3jedi/fv3files/fmsmpp.nml
-  - !ENV ${DATA}/fv3jedi/fmsmpp.nml
-- - !ENV ${FIXgdas}/fv3jedi/fv3files/field_table_gfdl
-  - !ENV ${DATA}/fv3jedi/field_table
-- - !ENV $(HOMEgfs)/sorc/gdas.cd/parm/io/fv3jedi_fieldmetadata_restart.yaml
-  - !ENV ${DATA}/fv3jedi/fv3jedi_fieldmetadata_restart.yaml
diff --git a/parm/gdas/aero_jedi_fix.yaml.j2 b/parm/gdas/aero_jedi_fix.yaml.j2
new file mode 100644
index 0000000000..69039baddf
--- /dev/null
+++ b/parm/gdas/aero_jedi_fix.yaml.j2
@@ -0,0 +1,7 @@
+mkdir:
+- '{{ DATA }}/fv3jedi'
+copy:
+- ['{{ FIXgfs }}/gdas/fv3jedi/fv3files/akbk{{ npz }}.nc4', '{{ DATA }}/fv3jedi/akbk.nc4']
+- ['{{ FIXgfs }}/gdas/fv3jedi/fv3files/fmsmpp.nml', '{{ DATA }}/fv3jedi/fmsmpp.nml']
+- ['{{ FIXgfs }}/gdas/fv3jedi/fv3files/field_table_gfdl', '{{ DATA }}/fv3jedi/field_table']
+- ['{{ PARMgfs }}/gdas/io/fv3jedi_fieldmetadata_restart.yaml', '{{ DATA }}/fv3jedi/fv3jedi_fieldmetadata_restart.yaml']
diff --git a/parm/gdas/atm_crtm_coeff.yaml b/parm/gdas/atm_crtm_coeff.yaml.j2
similarity index 100%
rename from parm/gdas/atm_crtm_coeff.yaml
rename to parm/gdas/atm_crtm_coeff.yaml.j2
diff --git a/parm/gdas/atm_jedi_fix.yaml b/parm/gdas/atm_jedi_fix.yaml
deleted file mode 100644
index 3d1ca79f33..0000000000
--- a/parm/gdas/atm_jedi_fix.yaml
+++ /dev/null
@@ -1,7 +0,0 @@
-mkdir:
-- $(DATA)/fv3jedi
-copy:
-- [$(HOMEgfs)/fix/gdas/fv3jedi/fv3files/akbk$(npz).nc4, $(DATA)/fv3jedi/akbk.nc4]
-- [$(HOMEgfs)/fix/gdas/fv3jedi/fv3files/fmsmpp.nml, $(DATA)/fv3jedi/fmsmpp.nml]
-- [$(HOMEgfs)/fix/gdas/fv3jedi/fv3files/field_table_gfdl, $(DATA)/fv3jedi/field_table]
-- [$(HOMEgfs)/sorc/gdas.cd/parm/io/fv3jedi_fieldmetadata_restart.yaml, $(DATA)/fv3jedi/fv3jedi_fieldmetadata_restart.yaml]
diff --git a/parm/gdas/atm_jedi_fix.yaml.j2 b/parm/gdas/atm_jedi_fix.yaml.j2
new file mode 100644
index 0000000000..69039baddf
--- /dev/null
+++ b/parm/gdas/atm_jedi_fix.yaml.j2
@@ -0,0 +1,7 @@
+mkdir:
+- '{{ DATA }}/fv3jedi'
+copy:
+- ['{{ FIXgfs }}/gdas/fv3jedi/fv3files/akbk{{ npz }}.nc4', '{{ DATA }}/fv3jedi/akbk.nc4']
+- ['{{ FIXgfs }}/gdas/fv3jedi/fv3files/fmsmpp.nml', '{{ DATA }}/fv3jedi/fmsmpp.nml']
+- ['{{ FIXgfs }}/gdas/fv3jedi/fv3files/field_table_gfdl', '{{ DATA }}/fv3jedi/field_table']
+- ['{{ PARMgfs }}/gdas/io/fv3jedi_fieldmetadata_restart.yaml', '{{ DATA }}/fv3jedi/fv3jedi_fieldmetadata_restart.yaml']
diff --git a/parm/gdas/land_jedi_fix.yaml b/parm/gdas/land_jedi_fix.yaml
deleted file mode 100644
index 3d1ca79f33..0000000000
--- a/parm/gdas/land_jedi_fix.yaml
+++ /dev/null
@@ -1,7 +0,0 @@
-mkdir:
-- $(DATA)/fv3jedi
-copy:
-- [$(HOMEgfs)/fix/gdas/fv3jedi/fv3files/akbk$(npz).nc4, $(DATA)/fv3jedi/akbk.nc4]
-- [$(HOMEgfs)/fix/gdas/fv3jedi/fv3files/fmsmpp.nml, $(DATA)/fv3jedi/fmsmpp.nml]
-- [$(HOMEgfs)/fix/gdas/fv3jedi/fv3files/field_table_gfdl, $(DATA)/fv3jedi/field_table]
-- [$(HOMEgfs)/sorc/gdas.cd/parm/io/fv3jedi_fieldmetadata_restart.yaml, $(DATA)/fv3jedi/fv3jedi_fieldmetadata_restart.yaml]
diff --git a/parm/gdas/snow_jedi_fix.yaml.j2 b/parm/gdas/snow_jedi_fix.yaml.j2
new file mode 100644
index 0000000000..69039baddf
--- /dev/null
+++ b/parm/gdas/snow_jedi_fix.yaml.j2
@@ -0,0 +1,7 @@
+mkdir:
+- '{{ DATA }}/fv3jedi'
+copy:
+- ['{{ FIXgfs }}/gdas/fv3jedi/fv3files/akbk{{ npz }}.nc4', '{{ DATA }}/fv3jedi/akbk.nc4']
+- ['{{ FIXgfs }}/gdas/fv3jedi/fv3files/fmsmpp.nml', '{{ DATA }}/fv3jedi/fmsmpp.nml']
+- ['{{ FIXgfs }}/gdas/fv3jedi/fv3files/field_table_gfdl', '{{ DATA }}/fv3jedi/field_table']
+- ['{{ PARMgfs }}/gdas/io/fv3jedi_fieldmetadata_restart.yaml', '{{ DATA }}/fv3jedi/fv3jedi_fieldmetadata_restart.yaml']
diff --git a/parm/post/oceanice_products.yaml b/parm/post/oceanice_products.yaml
new file mode 100644
index 0000000000..8d8bcd4b51
--- /dev/null
+++ b/parm/post/oceanice_products.yaml
@@ -0,0 +1,75 @@
+ocnicepost:
+  executable: "ocnicepost.x"
+  namelist:
+    debug: False
+  fix_data:
+    mkdir:
+    - "{{ DATA }}"
+    copy:
+    - ["{{ EXECgfs }}/ocnicepost.x", "{{ DATA }}/"]
+    - ["{{ PARMgfs }}/post/ocnicepost.nml.jinja2", "{{ DATA }}/"]
+    - ["{{ PARMgfs }}/post/{{ component }}.csv", "{{ DATA }}/"]
+    - ["{{ FIXgfs }}/mom6/post/{{ model_grid }}/tripole.{{ model_grid }}.Bu.to.Ct.bilinear.nc", "{{ DATA }}/"]
+    - ["{{ FIXgfs }}/mom6/post/{{ model_grid }}/tripole.{{ model_grid }}.Cu.to.Ct.bilinear.nc", "{{ DATA }}/"]
+    - ["{{ FIXgfs }}/mom6/post/{{ model_grid }}/tripole.{{ model_grid }}.Cv.to.Ct.bilinear.nc", "{{ DATA }}/"]
+    {% for grid in product_grids %}
+    - ["{{ FIXgfs }}/mom6/post/{{ model_grid }}/tripole.{{ model_grid }}.Ct.to.rect.{{ grid }}.bilinear.nc", "{{ DATA }}/"]
+    - ["{{ FIXgfs }}/mom6/post/{{ model_grid }}/tripole.{{ model_grid }}.Ct.to.rect.{{ grid }}.conserve.nc", "{{ DATA }}/"]
+    - ["{{ FIXgfs }}/mom6/post/template.global.{{ grid }}.gb2", "{{ DATA }}/"]
+    {% endfor %}
+
+nc2grib2:
+  script: "{{ USHgfs }}/oceanice_nc2grib2.sh"
+
+ocean:
+  namelist:
+    ftype: "ocean"
+    maskvar: "temp"
+    sinvar: "sin_rot"
+    cosvar: "cos_rot"
+    angvar: ""
+    {% if model_grid == 'mx025' or model_grid == 'mx050' or model_grid == 'mx100' %}
+    ocean_levels: [5, 15, 25, 35, 45, 55, 65, 75, 85, 95, 105, 115, 125, 135, 145, 155, 165, 175, 185, 195, 205, 215, 225.86945, 241.06255, 266.5239, 308.7874, 373.9288, 467.3998, 593.87915, 757.1453, 959.97325, 1204.059, 1489.9735, 1817.1455, 2183.879, 2587.3995, 3023.9285, 3488.7875, 3976.524, 4481.0625]
+    {% elif model_grid == 'mx500' %}
+    ocean_levels: [5, 15, 25, 35, 45, 55, 65, 75, 85, 95, 105, 115, 125, 135, 145, 155, 165, 175, 185, 195, 205, 215, 225.86945, 241.06255, 266.5239]
+    {% endif %}
+  subset: ['SSH', 'SST', 'SSS', 'speed', 'MLD_003', 'latent', 'sensible', 'SW', 'LW', 'LwLatSens', 'Heat_PmE', 'SSU', 'SSV', 'taux', 'tauy', 'temp', 'so', 'uo', 'vo']
+  data_in:
+    copy:
+    - ["{{ COM_OCEAN_HISTORY }}/{{ RUN }}.ocean.t{{ current_cycle | strftime('%H') }}z.6hr_avg.f{{ '%03d' % forecast_hour }}.nc", "{{ DATA }}/ocean.nc"]
+  data_out:
+    mkdir:
+    - "{{ COM_OCEAN_NETCDF }}"
+    {% for grid in product_grids %}
+    - "{{ COM_OCEAN_GRIB }}/{{ grid }}"
+    {% endfor %}
+    copy:
+    - ["{{ DATA }}/ocean_subset.nc", "{{ COM_OCEAN_NETCDF }}/{{ RUN }}.ocean.t{{ current_cycle | strftime('%H') }}z.native.f{{ '%03d' % forecast_hour }}.nc"]
+    {% for grid in product_grids %}
+    - ["{{ DATA }}/ocean.{{ grid }}.grib2", "{{ COM_OCEAN_GRIB }}/{{ grid }}/{{ RUN }}.ocean.t{{ current_cycle | strftime('%H') }}z.{{ grid }}.f{{ '%03d' % forecast_hour }}.grib2"]
+    - ["{{ DATA }}/ocean.{{ grid }}.grib2.idx", "{{ COM_OCEAN_GRIB }}/{{ grid }}/{{ RUN }}.ocean.t{{ current_cycle | strftime('%H') }}z.{{ grid }}.f{{ '%03d' % forecast_hour }}.grib2.idx"]
+    {% endfor %}
+
+ice:
+  namelist:
+    ftype: "ice"
+    maskvar: "tmask"
+    sinvar: ""
+    cosvar: ""
+    angvar: "ANGLET"
+  subset: ['hi_h', 'hs_h', 'aice_h', 'Tsfc_h', 'uvel_h', 'vvel_h', 'frzmlt_h', 'albsni_h', 'mlt_onset_h', 'frz_onset_h']
+  data_in:
+    copy:
+    - ["{{ COM_ICE_HISTORY }}/{{ RUN }}.ice.t{{ current_cycle | strftime('%H') }}z.6hr_avg.f{{ '%03d' % forecast_hour }}.nc", "{{ DATA }}/ice.nc"]
+  data_out:
+    mkdir:
+    - "{{ COM_ICE_NETCDF }}"
+    {% for grid in product_grids %}
+    - "{{ COM_ICE_GRIB }}/{{ grid }}"
+    {% endfor %}
+    copy:
+    - ["{{ DATA }}/ice_subset.nc", "{{ COM_ICE_NETCDF }}/{{ RUN }}.ice.t{{ current_cycle | strftime('%H') }}z.native.f{{ '%03d' % forecast_hour }}.nc"]
+    {% for grid in product_grids %}
+    - ["{{ DATA }}/ice.{{ grid }}.grib2", "{{ COM_ICE_GRIB }}/{{ grid }}/{{ RUN }}.ice.t{{ current_cycle | strftime('%H') }}z.{{ grid }}.f{{ '%03d' % forecast_hour }}.grib2"]
+    - ["{{ DATA }}/ice.{{ grid }}.grib2.idx", "{{ COM_ICE_GRIB }}/{{ grid }}/{{ RUN }}.ice.t{{ current_cycle | strftime('%H') }}z.{{ grid }}.f{{ '%03d' % forecast_hour }}.grib2.idx"]
+    {% endfor %}
diff --git a/parm/post/upp.yaml b/parm/post/upp.yaml
index 651f3c12a8..c2f0315d86 100644
--- a/parm/post/upp.yaml
+++ b/parm/post/upp.yaml
@@ -9,16 +9,16 @@ upp:
     - "{{ DATA }}"
     copy:
     - ["{{ 'g2tmpl_ROOT' | getenv }}/share/params_grib2_tbl_new", "{{ DATA }}/params_grib2_tbl_new"]
-    - ["{{ HOMEgfs }}/parm/post/nam_micro_lookup.dat", "{{ DATA }}/eta_micro_lookup.dat"]
-    - ["{{ HOMEgfs }}/exec/upp.x", "{{ DATA }}/"]
-    - ["{{ HOMEgfs }}/parm/post/itag.jinja", "{{ DATA }}/"]
+    - ["{{ PARMgfs }}/post/nam_micro_lookup.dat", "{{ DATA }}/eta_micro_lookup.dat"]
+    - ["{{ EXECgfs }}/upp.x", "{{ DATA }}/"]
+    - ["{{ PARMgfs }}/post/itag.jinja", "{{ DATA }}/"]
 
 analysis:
   config:
     rdaod: True
   data_in:
     copy:
-    - ["{{ HOMEgfs }}/parm/post/postxconfig-NT-GFS-ANL.txt", "{{ DATA }}/postxconfig-NT.txt"]
+    - ["{{ PARMgfs }}/post/postxconfig-NT-GFS-ANL.txt", "{{ DATA }}/postxconfig-NT.txt"]
     - ["{{ COM_ATMOS_ANALYSIS }}/{{ RUN }}.t{{ current_cycle | strftime('%H') }}z.atmanl.nc", "{{ DATA }}/{{ atmos_filename }}"]
     - ["{{ COM_ATMOS_ANALYSIS }}/{{ RUN }}.t{{ current_cycle | strftime('%H') }}z.sfcanl.nc", "{{ DATA }}/{{ flux_filename }}"]
   data_out:
@@ -28,13 +28,13 @@ analysis:
 
 forecast:
   config:
-    rdaod: False
+    rdaod: True
   data_in:
     copy:
     {% if forecast_hour == 0 %}
-    - ["{{ HOMEgfs }}/parm/post/postxconfig-NT-GFS-F00-TWO.txt", "{{ DATA }}/postxconfig-NT.txt"]
+    - ["{{ PARMgfs }}/post/postxconfig-NT-GFS-F00-TWO.txt", "{{ DATA }}/postxconfig-NT.txt"]
     {% else %}
-    - ["{{ HOMEgfs }}/parm/post/postxconfig-NT-GFS-TWO.txt", "{{ DATA }}/postxconfig-NT.txt"]
+    - ["{{ PARMgfs }}/post/postxconfig-NT-GFS-TWO.txt", "{{ DATA }}/postxconfig-NT.txt"]
     {% endif %}
     - ["{{ COM_ATMOS_HISTORY }}/{{ RUN }}.t{{ current_cycle | strftime('%H') }}z.atmf{{ '%03d' % forecast_hour }}.nc", "{{ DATA }}/{{ atmos_filename }}"]
     - ["{{ COM_ATMOS_HISTORY }}/{{ RUN }}.t{{ current_cycle | strftime('%H') }}z.sfcf{{ '%03d' % forecast_hour }}.nc", "{{ DATA }}/{{ flux_filename }}"]
@@ -47,7 +47,7 @@ forecast:
 
 goes:
   config:
-    rdaod: True
+    rdaod: False
   data_in:
     copy:
     {% set crtm_coefficients = [
@@ -81,7 +81,7 @@ goes:
     {% endfor %}
     - ["{{ 'CRTM_FIX' | getenv }}/AerosolCoeff.bin", "{{ DATA }}/"]
    - ["{{ 'CRTM_FIX' | getenv }}/CloudCoeff.bin", "{{ DATA }}/"]
-    - ["{{ HOMEgfs }}/parm/post/postxconfig-NT-GFS-GOES.txt", "{{ DATA }}/postxconfig-NT.txt"]
+    - ["{{ PARMgfs }}/post/postxconfig-NT-GFS-GOES.txt", "{{ DATA }}/postxconfig-NT.txt"]
     - ["{{ COM_ATMOS_HISTORY }}/{{ RUN }}.t{{ current_cycle | strftime('%H') }}z.atmf{{ '%03d' % forecast_hour }}.nc", "{{ DATA }}/{{ atmos_filename }}"]
     - ["{{ COM_ATMOS_HISTORY }}/{{ RUN }}.t{{ current_cycle | strftime('%H') }}z.sfcf{{ '%03d' % forecast_hour }}.nc", "{{ DATA }}/{{ flux_filename }}"]
   data_out:
diff --git a/parm/product/bufr_ij9km.txt b/parm/product/bufr_ij9km.txt
new file mode 100644
index 0000000000..321026f3d1
--- /dev/null
+++ b/parm/product/bufr_ij9km.txt
@@ -0,0 +1,2115 @@
+   1 2814 261 69.58 -140.18
+   2 2958 257 69.90 -128.97
+   3 3063 261 69.58 -120.75
+   4 2919 320 65.00 -132.00
+   5 3072 320 65.00 -120.00
+   6 3297 319 65.10 -102.43
+   7 2816 384 60.00 -140.00
+   8 3072 384 60.00 -120.00
+   9 3328 384 60.00 -100.00
+  10 2983 422 57.00 -127.00
+  11 2708 253 70.20 -148.47
+  12 3249 465 53.63 -106.20
+  13 2588 293 67.10 -157.85
+  14 3263 461 53.99 -105.12
+  15 3097 475 52.88 -118.07
+  16 3239 352 62.50 -107.00
+  17 3200 590 43.90 -110.00
+  18 3361 566 45.80 -97.45
+  19 3385 595 43.50 -95.60
+  20 3421 727 33.22 -92.80
+  21 3395 614 42.04 -94.79
+  22 3419 613 42.11 -92.92
+  23 3506 715 34.10 -86.10
+  24 3523 701 35.20 -84.80
+  25 3533 706 34.80 -84.05
+  26 3572 682 36.72 -80.97
+  27 3158 533 48.32 -113.35
+  28 3236 612 42.20 -107.20
+  29 3272 605 42.76 -104.45
+  30 3298 610 42.30 -102.40
+  31 3330 607 42.60 -99.90
+  32 3249 651 39.10 -106.20
+  33 3471 674 37.30 -88.90
+  34 3423 637 40.20 -92.60
+  35 3353 705 34.90 -98.10
+  36 3263 533 48.31 -105.10
+  37 3458 714 34.25 -89.87
+  38 3261 638 40.13 -105.24
+  39 3368 672 37.48 -96.93
+  40 3530 692 35.96 -84.29
+  41 3323 603 42.86 -100.41
+  42 3401 627 41.02 -94.36
+  43 3613 614 42.04 -77.76
+  44 3591 651 39.17 -79.52
+  45 3464 746 31.71 -89.41
+  46 3579 577 44.92 -80.42
+  47 3283 572 45.27 -103.54
+  48 3268 579 44.79 -104.73
+  49 3279 583 44.46 -103.85
+  50 3268 597 43.35 -104.69
+  51 3282 595 43.53 -103.65
+  52 3381 569 45.50 -95.90
+  53 3412 587 44.10 -93.50
+  54 3112 700 35.34 -116.88
+  55 3487 712 34.40 -87.60
+  56 3058 544 47.50 -121.10
+  57 3038 582 44.50 -122.70
+  58 3136 589 44.00 -115.00
+  59 3056 552 46.90 -121.30
+  60 3066 587 44.10 -120.50
+  61 3055 605 42.70 -121.40
+  62 3191 737 32.42 -110.73
+  63 3361 683 36.61 -97.49
+  64 3056 567 45.68 -121.27
+  65 3053 567 45.72 -121.56
+  66 3062 567 45.68 -120.82
+  67 3070 567 45.72 -120.21
+  68 3050 665 38.06 -121.77
+  69 3043 677 37.10 -122.28
+  70 3041 668 37.82 -122.47
+  71 3040 666 37.94 -122.50
+  72 3058 677 37.07 -121.12
+  73 3158 637 40.20 -113.30
+  74 3662 635 40.42 -73.98
+  75 3651 676 37.20 -74.80
+  76 3658 690 36.05 -74.25
+  77 3620 732 32.80 -77.20
+  78 3664 658 38.60 -73.75
+  79 3492 778 29.20 -87.25
+  80 3515 559 46.31 -85.46
+  81 3494 549 47.18 -87.22
+  82 3475 602 42.97 -88.55
+  83 3680 577 44.94 -72.51
+  84 3558 675 37.27 -82.10
+  85 3558 675 90.00 0.00
+  86 2604 201 74.30 -156.60
+  87 2816 192 75.00 -140.00
+  88 2390 376 60.60 -173.30
+  89 2538 385 59.90 -161.75
+  90 2691 347 62.88 -149.83
+  91 2703 374 60.79 -148.83
+  92 2691 378 60.49 -149.79
+  93 2700 372 60.95 -149.14
+  94 2705 374 60.78 -148.72
+  95 3146 778 29.22 -114.28
+  96 3563 689 36.20 -81.65
+  97 3483 607 42.59 -87.94
+  98 3100 607 42.59 -117.87
+  99 3042 549 47.08 -122.36
+ 100 3020 551 46.91 -124.11
+ 101 3445 542 47.66 -90.91
+ 102 3440 553 46.77 -91.25
+ 103 2554 444 55.31 -160.52
+ 104 3615 700 35.33 -77.60
+ 105 3679 582 44.53 -72.61
+ 106 3582 619 41.63 -80.21
+ 107 3576 657 38.69 -80.65
+ 108 3102 709 34.57 -117.67
+ 109 3103 709 34.63 -117.61
+ 110 3506 563 45.97 -86.17
+ 111 3499 558 46.42 -86.65
+ 112 3461 651 39.16 -89.67
+ 113 3308 588 44.05 -101.60
+ 114 3505 594 43.58 -86.24
+ 115 3517 596 43.43 -85.30
+ 116 3528 597 43.37 -84.44
+ 117 3274 712 34.38 -104.23
+ 118 3994 554 46.70 -48.00
+ 119 3903 548 47.20 -55.10
+ 120 3506 605 42.75 -86.10
+ 121 3504 609 42.41 -86.28
+ 122 3108 571 45.35 -117.23
+ 123 3388 620 41.59 -95.34
+ 124 3484 591 43.78 -87.85
+ 125 3292 756 30.90 -102.85
+ 126 3434 721 33.64 -91.75
+ 127 3427 688 36.22 -92.28
+ 128 3284 750 31.38 -103.51
+ 129 3566 814 26.42 -81.44
+ 130 3572 810 26.75 -80.94
+ 131 3337 753 31.18 -99.32
+ 132 3320 760 30.59 -100.65
+ 133 3060 652 39.08 -120.94
+ 134 3072 672 37.51 -120.04
+ 135 3069 671 37.58 -120.27
+ 136 3081 680 36.83 -119.33
+ 137 3056 693 35.85 -121.31
+ 138 3049 546 47.31 -121.85
+ 139 3057 659 38.49 -121.22
+ 140 3062 651 39.13 -120.80
+ 141 3041 631 40.72 -122.43
+ 142 3656 633 40.50 -74.45
+ 143 3705 602 42.99 -70.62
+ 144 3667 569 45.50 -73.57
+ 145 3676 626 41.10 -72.89
+ 146 3048 644 39.69 -121.91
+ 147 3058 651 39.17 -121.11
+ 148 3076 696 35.62 -119.69
+ 149 3056 695 35.66 -121.28
+ 150 3031 658 38.61 -123.21
+ 151 3031 659 38.51 -123.22
+ 152 3034 656 38.78 -122.99
+ 153 3069 677 37.11 -120.24
+ 154 3046 640 39.99 -122.06
+ 155 3033 653 39.00 -123.12
+ 156 3035 659 38.51 -122.96
+ 157 3249 723 33.50 -106.18
+ 158 3243 719 33.82 -106.65
+ 159 3180 632 40.58 -111.63
+ 160 3546 591 43.78 -82.99
+ 161 2791 409 58.00 -142.00
+ 162 2778 461 54.00 -143.00
+ 163 2855 435 56.00 -137.00
+ 164 2624 448 55.00 -155.00
+ 165 2829 473 53.00 -139.00
+ 166 2722 498 51.10 -147.40
+ 167 2535 486 52.00 -162.00
+ 168 2641 518 49.50 -153.70
+ 169 2448 522 49.20 -168.80
+ 170 2880 512 50.00 -135.00
+ 171 2957 659 38.50 -129.00
+ 172 2996 717 34.00 -126.00
+ 173 3034 774 29.50 -123.00
+ 174 3072 832 25.00 -120.00
+ 175 3136 832 25.00 -115.00
+ 176 2806 585 44.30 -140.80
+ 177 2764 541 47.70 -144.10
+ 178 2855 673 37.40 -137.00
+ 179 2842 550 47.00 -138.00
+ 180 2874 594 43.60 -135.50
+ 181 2902 644 39.70 -133.30
+ 182 2951 531 48.50 -129.50
+ 183 2976 581 44.60 -127.50
+ 184 2799 505 50.50 -141.38
+ 185 2983 633 40.50 -126.98
+ 186 2176 448 55.00 170.00
+ 187 2304 448 55.00 180.00
+ 188 2240 384 60.00 175.00
+ 189 3203 604 42.80 -109.81
+ 190 3045 651 39.15 -122.15
+ 191 3051 629 40.88 -121.66
+ 192 3068 666 37.99 -120.38
+ 193 3091 669 37.74 -118.59
+ 194 3084 689 36.20 -119.10
+ 195 3091 691 35.97 -118.54
+ 196 3086 706 34.83 -118.95
+ 197 3068 689 36.14 -120.35
+ 198 3092 683 36.65 -118.48
+ 199 3670 608 42.47 -73.29
+ 200 3556 650 39.21 -82.23
+ 201 3182 634 40.48 -111.43
+ 202 3146 729 33.02 -114.24
+ 203 3197 749 31.49 -110.30
+ 204 3228 742 32.02 -107.87
+ 205 3273 762 30.43 -104.33
+ 206 3325 789 28.39 -100.29
+ 207 3344 812 26.57 -98.82
+ 208 3380 784 28.71 -95.96
+ 209 3435 770 29.81 -91.66
+ 210 3565 836 24.70 -81.51
+ 211 3750 922 17.98 -67.08
+ 212 3051 561 46.19 -121.70
+ 213 3043 559 46.28 -122.28
+ 214 3048 571 45.35 -121.94
+ 215 3046 587 44.17 -122.06
+ 216 3340 575 45.03 -99.11
+ 217 3367 570 45.46 -96.99
+ 218 3390 662 38.28 -95.22
+ 219 3372 642 39.86 -96.63
+ 220 3242 584 44.38 -106.72
+ 221 3225 582 44.52 -108.08
+ 222 3453 575 45.10 -90.30
+ 223 3427 583 44.46 -92.29
+ 224 3238 675 37.29 -107.06
+ 225 3219 663 38.23 -108.56
+ 226 3228 639 40.05 -107.89
+
227 3437 619 41.64 -91.54 + 228 3258 652 39.05 -105.51 + 229 3247 639 40.05 -106.36 + 230 3487 656 38.76 -87.61 + 231 3475 652 39.07 -88.53 + 232 3451 628 40.94 -90.43 + 233 3453 643 39.77 -90.24 + 234 3479 647 39.48 -88.28 + 235 3331 596 43.39 -99.84 + 236 3296 601 43.02 -102.52 + 237 3273 590 43.89 -104.32 + 238 3294 582 44.56 -102.66 + 239 3255 592 43.74 -105.74 + 240 3467 561 46.15 -89.21 + 241 3490 556 46.54 -87.39 + 242 3508 554 46.68 -85.97 + 243 3470 555 46.61 -88.91 + 244 3495 794 28.00 -87.00 + 245 3409 791 28.20 -93.70 + 246 3427 787 28.50 -92.30 + 247 3444 795 27.90 -91.00 + 248 3469 794 28.00 -89.00 + 249 3520 794 28.00 -85.00 + 250 3597 755 31.00 -79.00 + 251 3776 537 48.00 -65.00 + 252 3182 649 39.30 -111.46 + 253 3269 682 36.74 -104.65 + 254 3623 678 36.99 -77.00 + 255 3624 667 37.86 -76.89 + 256 3635 659 38.54 -76.03 + 257 3613 681 36.77 -77.79 + 258 3727 591 43.78 -68.86 + 259 3584 660 38.40 -80.00 + 260 3104 703 35.10 -117.56 + 261 3285 597 43.37 -103.39 + 262 3284 583 44.41 -103.48 + 263 3478 612 42.21 -88.32 + 264 3481 634 40.46 -88.10 + 265 3586 732 32.78 -79.92 + 266 3585 734 32.66 -79.93 + 267 3573 742 32.03 -80.89 + 268 3105 538 47.97 -117.43 + 269 3570 688 36.22 -81.10 + 270 3549 608 42.47 -82.76 + 271 3537 591 43.80 -83.72 + 272 3549 588 44.02 -82.79 + 273 3493 618 41.69 -87.15 + 274 3481 619 41.61 -88.10 + 275 3577 685 36.46 -80.55 + 276 3162 676 37.20 -112.99 + 277 3184 662 38.29 -111.26 + 278 3174 656 38.74 -112.10 + 279 3088 676 37.20 -118.80 + 280 3287 542 47.61 -103.26 + 281 3367 668 37.80 -97.01 + 282 3401 709 34.60 -94.30 + 283 3390 667 37.90 -95.20 + 284 3309 659 38.50 -101.50 + 285 3262 673 37.40 -105.20 + 286 3261 691 36.00 -105.30 + 287 3307 714 34.20 -101.70 + 288 3337 722 33.60 -99.30 + 289 3362 721 33.70 -97.40 + 290 3427 677 37.10 -92.30 + 291 3239 627 41.00 -107.00 + 292 3147 560 46.25 -114.15 + 293 3168 550 47.00 -112.50 + 294 3149 525 49.00 -114.00 + 295 3226 518 49.50 -108.00 + 296 3277 525 49.00 -104.00 + 297 3341 544 
47.50 -99.00 + 298 3360 561 46.20 -97.50 + 299 3392 563 46.00 -95.00 + 300 3531 630 40.80 -84.20 + 301 3584 665 38.00 -80.00 + 302 3552 710 34.50 -82.50 + 303 3354 618 41.73 -98.01 + 304 3388 620 41.58 -95.34 + 305 3367 611 42.24 -96.98 + 306 3344 669 37.70 -98.75 + 307 3349 528 48.75 -98.39 + 308 3368 528 48.75 -96.94 + 309 3376 583 44.46 -96.25 + 310 3390 599 43.17 -95.21 + 311 3336 599 43.22 -99.40 + 312 3344 598 43.26 -98.76 + 313 3359 628 40.90 -97.62 + 314 3331 630 40.79 -99.78 + 315 3359 638 40.15 -97.58 + 316 3337 643 39.73 -99.32 + 317 3452 681 36.77 -90.32 + 318 3490 674 37.36 -87.40 + 319 3513 652 39.05 -85.61 + 320 3551 669 37.75 -82.64 + 321 3537 683 36.61 -83.74 + 322 3522 680 36.86 -84.86 + 323 3534 665 38.06 -83.98 + 324 3306 626 41.12 -101.77 + 325 3321 615 41.96 -100.57 + 326 3204 594 43.55 -109.69 + 327 3188 605 42.71 -110.94 + 328 3193 617 41.82 -110.56 + 329 3216 578 44.87 -108.79 + 330 3220 577 44.91 -108.45 + 331 3244 592 43.71 -106.63 + 332 3459 595 43.52 -89.77 + 333 3476 592 43.77 -88.49 + 334 3453 603 42.89 -90.24 + 335 3479 601 43.04 -88.24 + 336 3465 588 44.04 -89.31 + 337 3467 623 41.35 -89.15 + 338 3497 618 41.70 -86.82 + 339 3484 626 41.07 -87.85 + 340 3468 616 41.89 -89.08 + 341 3434 570 45.42 -91.77 + 342 3399 575 45.10 -94.51 + 343 3376 579 44.73 -96.27 + 344 3300 639 40.10 -102.24 + 345 3293 656 38.76 -102.79 + 346 3495 572 45.29 -86.98 + 347 3459 584 44.36 -89.84 + 348 3659 640 39.99 -74.17 + 349 3509 669 37.70 -85.87 + 350 3353 670 37.67 -98.12 + 351 3354 675 37.28 -98.04 + 352 3383 662 38.30 -95.72 + 353 3376 667 37.85 -96.29 + 354 3351 656 38.75 -98.23 + 355 3663 646 39.55 -73.90 + 356 3692 644 39.70 -71.60 + 357 3385 677 37.09 -95.57 + 358 3358 661 38.35 -97.69 + 359 3337 674 37.35 -99.35 + 360 3343 700 35.30 -98.90 + 361 3367 698 35.50 -97.00 + 362 3396 704 34.98 -94.69 + 363 3308 683 36.60 -101.60 + 364 3601 608 42.50 -78.68 + 365 3640 592 43.76 -75.68 + 366 3443 738 32.35 -91.03 + 367 3567 821 25.86 -81.38 + 368 3555 783 
28.84 -82.33 + 369 3547 741 32.07 -82.90 + 370 3485 746 31.71 -87.78 + 371 3472 724 33.45 -88.82 + 372 3441 705 34.89 -91.20 + 373 3507 698 35.48 -86.09 + 374 3484 696 35.62 -87.84 + 375 3517 679 36.95 -85.26 + 376 3290 722 33.62 -103.02 + 377 3325 701 35.21 -100.25 + 378 3317 651 39.13 -100.87 + 379 3439 678 37.01 -91.36 + 380 3408 661 38.37 -93.79 + 381 3425 647 39.42 -92.44 + 382 3597 709 34.61 -79.06 + 383 3706 619 41.65 -70.52 + 384 3574 724 33.46 -80.85 + 385 3444 569 45.50 -91.00 + 386 3415 564 45.89 -93.27 + 387 3424 574 45.15 -92.54 + 388 3380 573 45.23 -96.00 + 389 3455 588 44.03 -90.08 + 390 3043 623 41.32 -122.32 + 391 3077 607 42.55 -119.66 + 392 3062 597 43.33 -120.84 + 393 3028 622 41.39 -123.49 + 394 3054 622 41.43 -121.46 + 395 3108 571 45.36 -117.25 + 396 3086 584 44.40 -118.96 + 397 3615 687 36.33 -77.64 + 398 3588 705 34.89 -79.76 + 399 3566 634 40.47 -81.42 + 400 3591 629 40.82 -79.53 + 401 3596 632 40.63 -79.11 + 402 3593 645 39.58 -79.34 + 403 3581 638 40.14 -80.29 + 404 3648 608 42.46 -75.06 + 405 3309 682 36.68 -101.50 + 406 3498 727 33.17 -86.77 + 407 3466 752 31.27 -89.26 + 408 3566 790 28.29 -81.44 + 409 3561 783 28.82 -81.81 + 410 3387 763 30.36 -95.41 + 411 3386 759 30.73 -95.47 + 412 3303 576 45.02 -102.02 + 413 3332 778 29.21 -99.74 + 414 3349 693 35.87 -98.42 + 415 3474 628 40.92 -88.62 + 416 3673 620 41.56 -73.05 + 417 3120 529 48.69 -116.32 + 418 3089 529 48.65 -118.73 + 419 3064 541 47.76 -120.65 + 420 3069 531 48.49 -120.24 + 421 3122 543 47.54 -116.14 + 422 3035 631 40.73 -122.94 + 423 3026 628 40.94 -123.63 + 424 3033 636 40.34 -123.07 + 425 3031 643 39.75 -123.21 + 426 3416 684 36.54 -93.20 + 427 2426 337 63.68 -170.50 + 428 2666 280 68.13 -151.73 + 429 2410 336 63.77 -171.73 + 430 2480 379 60.37 -166.27 + 431 2543 427 56.65 -161.37 + 432 2041 460 54.05 159.43 + 433 3684 577 44.89 -72.23 + 434 3309 596 43.46 -101.50 + 435 3280 589 43.99 -103.79 + 436 3533 777 29.30 -84.04 + 437 3372 795 27.90 -96.64 + 438 3528 713 34.31 
-84.42 + 439 3535 713 34.27 -83.83 + 440 3528 623 41.34 -84.43 + 441 3510 624 41.28 -85.84 + 442 3515 617 41.81 -85.44 + 443 3498 620 41.57 -86.73 + 444 3286 526 48.93 -103.30 + 445 3333 526 48.88 -99.62 + 446 3285 561 46.19 -103.43 + 447 3337 563 46.02 -99.35 + 448 3353 561 46.17 -98.07 + 449 3537 563 46.01 -83.74 + 450 3465 552 46.88 -89.32 + 451 3495 568 45.58 -87.00 + 452 3103 712 34.36 -117.63 + 453 3076 705 34.94 -119.69 + 454 3070 711 34.48 -120.23 + 455 3071 709 34.61 -120.08 + 456 3186 592 43.74 -111.10 + 457 3533 706 34.85 -84.00 + 458 3280 584 44.35 -103.77 + 459 3273 583 44.41 -104.36 + 460 3395 668 37.80 -94.77 + 461 3423 670 37.64 -92.65 + 462 3414 661 38.35 -93.34 + 463 3052 646 39.49 -121.61 + 464 3478 718 33.90 -88.33 + 465 3282 709 34.64 -103.63 + 466 3643 655 38.83 -75.43 + 467 3338 735 32.54 -99.25 + 468 3402 655 38.81 -94.26 + 469 3355 687 36.34 -97.92 + 470 3342 712 34.36 -98.98 + 471 3509 631 40.72 -85.93 + 472 3362 759 30.72 -97.38 + 473 3348 636 40.32 -98.44 + 474 3515 749 31.46 -85.46 + 475 3510 735 32.54 -85.79 + 476 3311 531 48.50 -101.40 + 477 3541 689 36.17 -83.40 + 478 3643 595 43.47 -75.46 + 479 3542 734 32.68 -83.35 + 480 3491 682 36.74 -87.29 + 481 3143 750 31.40 -114.49 + 482 3164 738 32.37 -112.87 + 483 3205 732 32.82 -109.68 + 484 3129 532 48.39 -115.55 + 485 3271 528 48.76 -104.52 + 486 3034 669 37.70 -123.00 + 487 3033 661 38.32 -123.07 + 488 3274 558 46.37 -104.28 + 489 3183 583 44.42 -111.37 + 490 3171 612 42.17 -112.28 + 491 3124 646 39.50 -115.95 + 492 3083 662 38.30 -119.16 + 493 3668 580 44.65 -73.49 + 494 3644 625 41.14 -75.38 + 495 3638 640 39.98 -75.82 + 496 3295 675 37.28 -102.61 + 497 3709 568 45.64 -70.26 + 498 3722 544 47.46 -69.22 + 499 3745 569 45.56 -67.43 + 500 3672 614 42.05 -73.20 + 501 3208 613 42.11 -109.45 + 502 3228 608 42.49 -107.83 + 503 3215 608 42.48 -108.84 + 504 3195 599 43.20 -110.40 + 505 3381 632 40.61 -95.87 + 506 3372 624 41.24 -96.59 + 507 3495 621 41.45 -87.01 + 508 3477 622 41.42 -88.41 + 
509 3344 643 39.76 -98.79 + 510 3241 633 40.51 -106.87 + 511 3234 647 39.43 -107.38 + 512 3250 647 39.48 -106.15 + 513 3232 633 40.50 -107.52 + 514 3464 627 41.02 -89.39 + 515 3480 656 38.72 -88.18 + 516 3352 652 39.06 -98.17 + 517 3377 677 37.13 -96.19 + 518 3373 661 38.37 -96.54 + 519 3622 606 42.64 -77.05 + 520 3630 617 41.77 -76.45 + 521 3248 631 40.73 -106.28 + 522 3629 596 43.45 -76.51 + 523 3402 600 43.08 -94.27 + 524 3396 596 43.40 -94.75 + 525 3394 635 40.35 -94.92 + 526 3539 663 38.22 -83.59 + 527 3416 656 38.71 -93.18 + 528 3416 647 39.42 -93.13 + 529 3221 592 43.71 -108.39 + 530 3187 599 43.18 -111.04 + 531 3283 568 45.59 -103.55 + 532 2654 373 60.82 -152.72 + 533 2560 269 69.00 -160.00 + 534 2273 486 52.00 177.55 + 535 2484 422 57.00 -166.00 + 536 2649 355 62.22 -153.08 + 537 2596 387 59.73 -157.26 + 538 2723 360 61.89 -147.32 + 539 2483 304 66.27 -166.05 + 540 3108 715 34.10 -117.23 + 541 3407 632 40.63 -93.90 + 542 3191 724 33.40 -110.77 + 543 3165 717 33.97 -112.74 + 544 3635 655 38.80 -76.07 + 545 3653 626 41.05 -74.63 + 546 2766 349 62.72 -143.97 + 547 2764 285 67.75 -144.11 + 548 2761 312 65.59 -144.36 + 549 2676 359 61.95 -151.00 + 550 2630 419 57.27 -154.56 + 551 2714 305 66.15 -148.03 + 552 2774 294 67.03 -143.29 + 553 2546 317 65.20 -161.15 + 554 2671 387 59.75 -151.37 + 555 2575 340 63.39 -158.83 + 556 2893 423 56.97 -134.00 + 557 2531 282 67.95 -162.31 + 558 2684 376 60.59 -150.32 + 559 2696 379 60.37 -149.41 + 560 2783 367 61.32 -142.59 + 561 2667 383 60.03 -151.66 + 562 2577 281 68.07 -158.71 + 563 2566 364 61.58 -159.54 + 564 2746 345 63.03 -145.49 + 565 2501 315 65.41 -164.66 + 566 2645 340 63.44 -153.36 + 567 2874 393 59.25 -135.52 + 568 2905 440 55.58 -133.10 + 569 2677 339 63.49 -150.88 + 570 2774 346 62.97 -143.34 + 571 2792 332 64.05 -141.93 + 572 2633 296 66.85 -154.34 + 573 2661 306 66.08 -152.17 + 574 2606 363 61.64 -156.44 + 575 2605 262 69.50 -156.50 + 576 2701 269 69.00 -149.00 + 577 2701 285 67.75 -149.00 + 578 2612 315 
65.34 -155.95 + 579 2617 331 64.10 -155.56 + 580 2675 329 64.31 -151.08 + 581 2558 380 60.32 -160.20 + 582 2512 357 62.10 -163.80 + 583 2536 251 70.40 -161.90 + 584 2604 239 71.32 -156.62 + 585 3742 610 42.35 -67.70 + 586 3687 649 39.30 -72.00 + 587 3768 649 39.30 -65.70 + 588 3711 674 37.30 -70.10 + 589 3654 687 36.30 -74.60 + 590 3664 696 35.60 -73.80 + 591 3702 705 34.90 -70.80 + 592 3636 719 33.80 -76.00 + 593 3683 732 32.80 -72.30 + 594 2938 536 48.10 -130.50 + 595 2988 536 48.10 -126.60 + 596 2948 572 45.30 -129.70 + 597 3001 572 45.30 -125.60 + 598 2946 617 41.75 -129.90 + 599 2998 616 41.90 -125.80 + 600 3002 650 39.20 -125.50 + 601 3003 686 36.40 -125.40 + 602 3008 726 33.30 -125.00 + 603 3043 709 34.60 -122.30 + 604 3053 756 30.90 -121.50 + 605 3111 773 29.60 -117.00 + 606 3635 671 37.54 -76.01 + 607 3239 730 32.99 -106.97 + 608 3203 740 32.15 -109.84 + 609 3534 606 42.63 -83.98 + 610 3532 602 42.99 -84.14 + 611 3544 613 42.10 -83.16 + 612 3532 616 41.87 -84.07 + 613 3539 588 44.02 -83.54 + 614 3540 596 43.46 -83.45 + 615 3541 615 41.94 -83.43 + 616 3287 632 40.61 -103.26 + 617 3280 636 40.34 -103.80 + 618 3411 639 40.08 -93.59 + 619 3418 634 40.48 -93.01 + 620 3401 637 40.25 -94.33 + 621 3419 631 40.68 -92.90 + 622 3422 618 41.71 -92.73 + 623 3415 608 42.47 -93.27 + 624 3476 657 38.66 -88.45 + 625 3487 652 39.02 -87.65 + 626 3465 646 39.53 -89.33 + 627 3211 598 43.31 -109.19 + 628 3226 619 41.67 -107.98 + 629 3188 595 43.50 -110.96 + 630 3352 647 39.47 -98.13 + 631 3401 667 37.85 -94.31 + 632 3473 615 41.93 -88.71 + 633 3193 673 37.44 -110.56 + 634 3242 634 40.45 -106.75 + 635 3362 674 37.32 -97.39 + 636 3241 673 37.45 -106.80 + 637 3233 588 44.03 -107.45 + 638 3188 613 42.08 -110.96 + 639 3225 655 38.79 -108.06 + 640 3243 635 40.35 -106.70 + 641 3471 777 29.30 -88.84 + 642 4508 741 32.13 -7.88 + 643 3678 616 41.89 -72.71 + 644 143 381 60.20 11.10 + 645 230 388 59.67 17.93 + 646 230 392 59.35 17.95 + 647 320 380 60.32 24.97 + 648 4593 382 60.13 -1.18 + 
649 4550 429 56.50 -4.58 + 650 3879 1574 -33.00 -57.00 + 651 4586 490 51.68 -1.78 + 652 4588 489 51.75 -1.58 + 653 4605 495 51.29 -0.27 + 654 4527 483 52.25 -6.33 + 655 4319 333 63.97 -22.60 + 656 4522 358 62.02 -6.76 + 657 161 438 55.77 12.53 + 658 56 486 52.03 4.35 + 659 58 500 50.90 4.47 + 660 89 553 46.82 6.95 + 661 31 528 48.73 2.40 + 662 4501 597 43.37 -8.42 + 663 4595 619 41.67 -1.02 + 664 4537 676 37.17 -5.62 + 665 4315 938 16.75 -22.95 + 666 184 291 67.27 14.37 + 667 2323 243 70.97 -178.53 + 668 2311 270 68.92 -179.48 + 669 2435 305 66.17 -169.83 + 670 2182 324 64.68 170.42 + 671 2272 323 64.73 177.50 + 672 2296 345 63.05 179.32 + 673 814 471 53.21 63.55 + 674 4528 407 58.22 -6.32 + 675 17 478 52.63 1.32 + 676 856 509 50.22 66.83 + 677 937 514 49.80 73.15 + 678 4541 511 50.08 -5.25 + 679 4529 452 54.65 -6.22 + 680 3674 705 34.90 -73.00 + 681 3646 738 32.30 -75.20 + 682 3596 736 32.50 -79.07 + 683 3618 777 29.30 -77.40 + 684 3573 750 31.40 -80.87 + 685 3604 782 28.90 -78.50 + 686 3589 732 32.80 -79.62 + 687 3579 739 32.28 -80.41 + 688 3460 820 25.90 -89.70 + 689 3410 820 25.90 -93.60 + 690 3509 820 25.90 -85.90 + 691 3392 795 27.90 -95.00 + 692 3373 806 27.00 -96.50 + 693 3704 659 38.50 -70.70 + 694 3730 607 42.60 -68.60 + 695 3712 595 43.53 -70.14 + 696 3720 633 40.50 -69.40 + 697 3652 659 38.50 -74.70 + 698 3756 626 41.10 -66.60 + 699 3651 683 36.60 -74.80 + 700 3686 631 40.70 -72.10 + 701 3672 636 40.30 -73.20 + 702 3747 585 44.30 -67.30 + 703 3665 635 40.37 -73.70 + 704 3644 669 37.76 -75.33 + 705 3485 537 48.06 -87.78 + 706 3503 572 45.33 -86.42 + 707 3549 571 45.35 -82.84 + 708 3501 543 47.56 -86.55 + 709 3554 618 41.68 -82.40 + 710 3458 546 47.32 -89.87 + 711 3495 606 42.67 -87.02 + 712 3555 584 44.28 -82.42 + 713 3618 594 43.62 -77.41 + 714 3569 608 42.47 -81.22 + 715 3592 596 43.40 -79.45 + 716 3593 605 42.74 -79.35 + 717 3486 612 42.14 -87.66 + 718 2712 431 56.30 -148.20 + 719 2938 608 42.50 -130.50 + 720 2613 488 51.90 -155.90 + 721 2932 562 
46.10 -131.00 + 722 2848 628 40.90 -137.50 + 723 3021 650 39.20 -124.00 + 724 3011 605 42.75 -124.82 + 725 3019 561 46.20 -124.20 + 726 3015 635 40.40 -124.50 + 727 2334 422 57.00 -177.70 + 728 3079 737 32.40 -119.50 + 729 3098 736 32.49 -118.03 + 730 1267 912 18.77 98.96 + 731 1316 941 16.47 102.78 + 732 1282 950 15.77 100.14 + 733 1343 957 15.25 104.87 + 734 1288 977 13.67 100.61 + 735 1288 1060 7.19 100.61 + 736 2531 852 23.40 -162.30 + 737 2589 932 17.20 -157.80 + 738 2550 905 19.30 -160.80 + 739 2656 929 17.40 -152.50 + 740 110 441 55.52 8.55 + 741 61 482 52.31 4.76 + 742 89 483 52.28 6.89 + 743 2674 387 59.77 -151.17 + 744 80 525 48.98 6.25 + 745 3354 772 29.70 -98.01 + 746 3500 688 36.25 -86.57 + 747 3385 689 36.18 -95.56 + 748 3085 701 35.24 -119.03 + 749 3454 658 38.62 -90.18 + 750 3275 660 38.46 -104.18 + 751 3564 621 41.50 -81.60 + 752 3166 600 43.11 -112.68 + 753 3126 544 47.47 -115.80 + 754 27 624 41.28 2.07 + 755 4554 635 40.42 -4.25 + 756 4602 646 39.50 -0.47 + 757 3318 685 36.50 -100.80 + 758 3307 764 30.30 -101.70 + 759 3296 797 27.70 -102.50 + 760 3340 762 30.50 -99.10 + 761 3345 788 28.40 -98.70 + 762 3344 815 26.30 -98.80 + 763 3362 840 24.40 -97.40 + 764 3389 820 25.90 -95.30 + 765 3400 838 24.50 -94.40 + 766 3385 854 23.30 -95.60 + 767 3403 918 18.30 -94.20 + 768 3417 878 21.40 -93.10 + 769 3436 854 23.30 -91.60 + 770 3473 849 23.70 -88.70 + 771 3520 858 23.00 -85.00 + 772 3354 851 23.50 -98.00 + 773 3438 563 46.02 -91.45 + 774 3549 655 38.83 -82.80 + 775 3603 654 38.88 -78.52 + 776 3600 645 39.62 -78.76 + 777 3623 645 39.61 -77.01 + 778 113 473 53.05 8.79 + 779 172 480 52.47 13.40 + 780 92 501 50.87 7.15 + 781 110 511 50.05 8.58 + 782 119 529 48.68 9.22 + 783 151 533 48.35 11.78 + 784 182 535 48.23 14.19 + 785 213 536 48.12 16.57 + 786 183 511 50.10 14.26 + 787 237 456 54.38 18.47 + 788 269 484 52.17 20.97 + 789 217 498 51.10 16.89 + 790 246 545 47.43 19.18 + 791 263 579 44.78 20.53 + 792 300 605 42.69 23.41 + 793 353 607 42.57 27.52 + 794 
357 599 43.23 27.83 + 795 319 613 42.07 24.86 + 796 334 581 44.57 26.09 + 797 158 569 45.50 12.33 + 798 162 617 41.80 12.60 + 799 157 617 41.80 12.23 + 800 310 697 35.53 24.15 + 801 373 627 40.97 29.08 + 802 423 638 40.13 33.00 + 803 515 669 37.75 40.20 + 804 426 702 35.15 33.28 + 805 2391 327 64.43 -173.23 + 806 387 389 59.58 30.18 + 807 481 438 55.75 37.57 + 808 1062 447 55.03 82.90 + 809 1689 600 43.12 131.90 + 810 985 599 43.23 76.93 + 811 576 618 41.68 44.95 + 812 954 601 43.07 74.47 + 813 887 624 41.27 69.27 + 814 461 743 31.98 35.98 + 815 451 744 31.87 35.22 + 816 599 833 24.88 46.77 + 817 599 836 24.72 46.72 + 818 615 778 29.22 47.98 + 819 553 687 36.32 43.15 + 820 567 727 33.22 44.23 + 821 612 761 30.57 47.78 + 822 628 794 28.00 49.00 + 823 640 806 27.00 50.00 + 824 657 695 35.68 51.32 + 825 787 754 31.05 61.47 + 826 903 711 34.42 70.47 + 827 843 751 31.31 65.85 + 828 661 829 25.25 51.57 + 829 709 829 25.25 55.33 + 830 700 839 24.42 54.65 + 831 3728 172 76.53 -68.75 + 832 989 786 28.58 77.20 + 833 1067 925 17.72 83.30 + 834 1372 599 43.20 107.17 + 835 1093 797 27.70 85.37 + 836 1462 866 22.32 114.17 + 837 1556 832 25.03 121.52 + 838 1483 1015 10.72 115.83 + 839 1624 671 37.55 126.80 + 840 1651 702 35.18 128.93 + 841 1810 631 40.70 141.37 + 842 1753 701 35.25 136.93 + 843 1790 697 35.55 139.78 + 844 1797 694 35.76 140.38 + 845 1735 708 34.68 135.53 + 846 1284 1084 5.30 100.27 + 847 1303 1117 2.75 101.72 + 848 1328 1134 1.38 103.72 + 849 1344 860 22.82 104.97 + 850 1355 883 21.02 105.80 + 851 1366 1013 10.82 106.67 + 852 1622 490 51.72 126.65 + 853 1491 642 39.80 116.47 + 854 1541 690 36.07 120.33 + 855 1332 759 30.67 104.02 + 856 1395 713 34.30 108.93 + 857 1458 710 34.52 113.83 + 858 1555 753 31.17 121.43 + 859 4412 794 27.93 -15.38 + 860 4510 722 33.57 -7.67 + 861 4506 747 31.62 -8.03 + 862 71 860 22.82 5.47 + 863 28 979 13.48 2.17 + 864 4570 938 16.72 -3.00 + 865 4384 963 14.73 -17.50 + 866 4393 981 13.35 -16.80 + 867 927 1245 -7.30 72.42 + 868 299 842 
24.22 23.30 + 869 349 751 31.33 27.22 + 870 472 1168 -1.28 36.83 + 871 475 1196 -3.42 37.07 + 872 490 1215 -4.92 38.23 + 873 198 1208 -4.38 15.45 + 874 43 1068 6.58 3.33 + 875 4606 1080 5.60 -0.17 + 876 4512 1058 7.38 -7.53 + 877 4558 1085 5.25 -3.93 + 878 4476 1072 6.23 -10.37 + 879 170 1265 -8.85 13.23 + 880 398 1464 -24.37 31.05 + 881 239 1587 -33.97 18.60 + 882 2602 239 71.30 -156.78 + 883 2770 254 70.13 -143.63 + 884 2482 270 68.88 -166.13 + 885 2527 296 66.87 -162.63 + 886 2661 264 69.37 -152.13 + 887 2669 295 66.92 -151.52 + 888 2662 318 65.17 -152.10 + 889 2749 300 66.57 -145.27 + 890 2491 326 64.50 -165.43 + 891 2550 334 63.88 -160.80 + 892 2483 361 61.78 -166.03 + 893 2537 374 60.78 -161.80 + 894 2600 323 64.73 -156.93 + 895 2617 346 62.97 -155.62 + 896 2612 347 62.90 -155.98 + 897 2617 370 61.10 -155.58 + 898 2687 354 62.30 -150.10 + 899 2673 359 61.97 -151.18 + 900 2672 377 60.57 -151.25 + 901 2716 322 64.82 -147.87 + 902 2726 324 64.67 -147.10 + 903 2744 333 63.97 -145.70 + 904 2743 333 64.00 -145.73 + 905 2747 356 62.15 -145.45 + 906 2691 368 61.25 -149.80 + 907 2688 369 61.17 -150.02 + 908 2700 363 61.60 -149.08 + 909 2735 369 61.13 -146.35 + 910 2696 382 60.12 -149.45 + 911 2802 323 64.78 -141.15 + 912 2792 346 62.97 -141.93 + 913 2746 377 60.50 -145.50 + 914 2534 401 58.65 -162.07 + 915 2430 420 57.15 -170.22 + 916 2526 445 55.20 -162.73 + 917 2579 396 59.05 -158.52 + 918 2603 401 58.68 -156.65 + 919 2626 387 59.75 -154.92 + 920 2669 389 59.63 -151.50 + 921 2735 391 59.43 -146.33 + 922 2656 413 57.75 -152.50 + 923 2821 390 59.52 -139.67 + 924 2877 391 59.47 -135.30 + 925 2871 404 58.42 -135.73 + 926 2876 421 57.07 -135.35 + 927 2886 405 58.37 -134.58 + 928 2906 425 56.82 -132.97 + 929 2914 429 56.48 -132.37 + 930 2923 443 55.35 -131.70 + 931 2924 447 55.03 -131.57 + 932 2229 477 52.72 174.12 + 933 2347 488 51.88 -176.65 + 934 2447 474 52.95 -168.85 + 935 2477 462 53.90 -166.55 + 936 2931 487 51.93 -131.02 + 937 2986 316 65.28 -126.75 + 938 3005 230 
72.00 -125.28 + 939 3105 432 56.23 -117.43 + 940 3081 176 76.23 -119.33 + 941 3315 424 56.87 -101.08 + 942 3356 438 55.75 -97.87 + 943 3568 271 68.78 -81.25 + 944 3811 96 82.50 -62.33 + 945 3042 524 49.03 -122.37 + 946 2978 503 50.68 -127.37 + 947 3148 466 53.55 -114.10 + 948 3155 470 53.30 -113.58 + 949 3219 459 54.13 -108.52 + 950 3185 508 50.27 -111.18 + 951 3277 469 53.33 -104.00 + 952 3693 552 46.90 -71.50 + 953 3851 543 47.57 -59.17 + 954 3080 513 49.95 -119.40 + 955 2944 404 58.42 -130.00 + 956 3585 599 43.17 -79.93 + 957 3580 595 43.47 -80.38 + 958 3796 577 44.88 -63.50 + 959 3778 576 44.98 -64.92 + 960 3773 592 43.72 -65.25 + 961 3184 310 65.77 -111.25 + 962 3546 611 42.27 -82.97 + 963 3840 590 43.93 -60.02 + 964 3796 581 44.63 -63.50 + 965 3762 590 43.87 -66.10 + 966 3765 572 45.32 -65.88 + 967 3570 601 43.03 -81.15 + 968 3589 593 43.67 -79.63 + 969 3619 564 45.95 -77.32 + 970 3664 570 45.47 -73.75 + 971 3661 567 45.68 -74.03 + 972 3640 572 45.32 -75.67 + 973 3605 586 44.23 -78.37 + 974 3593 576 44.97 -79.30 + 975 3570 579 44.75 -81.10 + 976 3758 565 45.83 -66.43 + 977 3781 562 46.12 -64.68 + 978 3800 559 46.28 -63.13 + 979 3840 561 46.17 -60.05 + 980 3695 553 46.80 -71.40 + 981 3636 558 46.38 -75.97 + 982 3613 537 48.05 -77.78 + 983 3592 558 46.37 -79.42 + 984 3523 538 47.97 -84.78 + 985 3567 530 48.57 -81.37 + 986 3484 483 52.23 -87.88 + 987 3465 533 48.37 -89.32 + 988 3029 529 48.65 -123.43 + 989 3934 542 47.62 -52.73 + 990 3910 525 48.95 -54.57 + 991 3874 522 49.22 -57.40 + 992 3760 509 50.22 -66.27 + 993 3859 531 48.53 -58.55 + 994 3835 469 53.32 -60.42 + 995 3655 515 49.77 -74.53 + 996 3666 464 53.75 -73.67 + 997 3496 515 49.78 -86.93 + 998 3576 496 51.27 -80.65 + 999 3577 495 51.28 -80.60 + 1000 3454 493 51.45 -90.20 + 1001 3458 463 53.83 -89.87 + 1002 3401 515 49.78 -94.37 + 1003 3364 513 49.90 -97.23 + 1004 3333 515 49.78 -99.65 + 1005 3307 452 54.68 -101.68 + 1006 3269 506 50.43 -104.67 + 1007 3243 484 52.17 -106.68 + 1008 3314 461 53.97 -101.10 
+ 1009 3256 471 53.22 -105.68 + 1010 3191 512 50.02 -110.72 + 1011 3165 517 49.63 -112.80 + 1012 3149 498 51.12 -114.02 + 1013 3127 517 49.62 -115.78 + 1014 3103 521 49.30 -117.63 + 1015 3032 522 49.18 -123.17 + 1016 3010 515 49.72 -124.90 + 1017 3038 462 53.88 -122.68 + 1018 3733 408 58.10 -68.42 + 1019 3609 404 58.45 -78.12 + 1020 3731 336 63.75 -68.53 + 1021 3404 400 58.75 -94.07 + 1022 3541 330 64.20 -83.37 + 1023 3509 128 79.98 -85.93 + 1024 3237 418 57.35 -107.13 + 1025 3393 195 74.72 -94.95 + 1026 3263 267 69.10 -105.12 + 1027 3380 329 64.30 -96.00 + 1028 3175 451 54.77 -112.02 + 1029 3176 384 60.02 -111.95 + 1030 3144 352 62.50 -114.40 + 1031 3039 399 58.83 -122.58 + 1032 3057 361 61.80 -121.20 + 1033 2960 382 60.12 -128.82 + 1034 2899 277 68.32 -133.53 + 1035 2880 375 60.72 -135.07 + 1036 2828 332 64.05 -139.13 + 1037 2819 287 67.57 -139.82 + 1038 3562 838 24.55 -81.75 + 1039 3571 835 24.73 -81.05 + 1040 3581 821 25.82 -80.28 + 1041 3581 820 25.90 -80.28 + 1042 3583 818 26.07 -80.15 + 1043 3579 824 25.65 -80.43 + 1044 3583 810 26.68 -80.12 + 1045 3582 817 26.20 -80.17 + 1046 3576 792 28.10 -80.65 + 1047 3579 798 27.65 -80.42 + 1048 3568 788 28.43 -81.32 + 1049 3571 778 29.18 -81.05 + 1050 3563 762 30.50 -81.69 + 1051 3560 765 30.22 -81.88 + 1052 3569 741 32.13 -81.19 + 1053 3576 739 32.22 -80.70 + 1054 3584 731 32.90 -80.03 + 1055 3554 797 27.70 -82.38 + 1056 3580 800 27.50 -80.37 + 1057 3561 812 26.58 -81.87 + 1058 3562 812 26.53 -81.75 + 1059 3552 794 27.97 -82.53 + 1060 3552 801 27.40 -82.55 + 1061 3550 795 27.92 -82.68 + 1062 3559 794 27.99 -82.02 + 1063 3545 773 29.62 -83.10 + 1064 3554 752 31.25 -82.40 + 1065 3566 752 31.25 -81.47 + 1066 3567 753 31.15 -81.37 + 1067 3529 763 30.38 -84.37 + 1068 3555 772 29.68 -82.27 + 1069 3526 725 33.36 -84.57 + 1070 3531 748 31.53 -84.18 + 1071 3543 758 30.78 -83.28 + 1072 3538 733 32.70 -83.65 + 1073 3559 725 33.37 -81.97 + 1074 3528 721 33.65 -84.42 + 1075 3527 720 33.78 -84.52 + 1076 3529 718 33.88 -84.30 + 1077 
3521 771 29.73 -84.98 + 1078 3509 761 30.56 -85.92 + 1079 3493 762 30.47 -87.18 + 1080 3479 759 30.68 -88.25 + 1081 3481 760 30.63 -88.07 + 1082 3512 765 30.22 -85.68 + 1083 3523 738 32.33 -84.83 + 1084 3521 736 32.52 -84.93 + 1085 3503 738 32.30 -86.40 + 1086 3515 751 31.32 -85.45 + 1087 3498 722 33.57 -86.75 + 1088 3487 727 33.22 -87.62 + 1089 3510 722 33.58 -85.85 + 1090 3492 731 32.90 -87.25 + 1091 3414 771 29.78 -93.30 + 1092 3498 727 33.17 -86.77 + 1093 3454 779 29.10 -90.20 + 1094 3453 768 29.98 -90.25 + 1095 3432 768 30.03 -91.88 + 1096 3456 767 30.05 -90.03 + 1097 3442 761 30.53 -91.15 + 1098 3464 776 29.33 -89.40 + 1099 3459 764 30.33 -89.82 + 1100 3472 738 32.33 -88.75 + 1101 3465 749 31.47 -89.33 + 1102 3455 738 32.32 -90.08 + 1103 3444 723 33.48 -90.98 + 1104 3450 753 31.18 -90.47 + 1105 3455 723 33.50 -90.08 + 1106 3416 754 31.05 -93.20 + 1107 3415 766 30.12 -93.22 + 1108 3431 765 30.20 -91.98 + 1109 3405 769 29.95 -94.02 + 1110 3395 777 29.30 -94.80 + 1111 3388 768 29.97 -95.35 + 1112 3389 772 29.65 -95.28 + 1113 3375 760 30.58 -96.37 + 1114 3396 752 31.23 -94.75 + 1115 3387 738 32.34 -95.40 + 1116 3397 738 32.34 -94.65 + 1117 3408 736 32.47 -93.82 + 1118 3431 736 32.52 -92.03 + 1119 3427 750 31.40 -92.30 + 1120 3363 732 32.83 -97.30 + 1121 3361 820 25.90 -97.43 + 1122 3359 816 26.23 -97.65 + 1123 3351 817 26.18 -98.23 + 1124 3360 796 27.77 -97.50 + 1125 3354 797 27.73 -98.03 + 1126 3335 799 27.55 -99.47 + 1127 3386 779 29.12 -95.47 + 1128 3348 774 29.53 -98.47 + 1129 3358 764 30.30 -97.70 + 1130 3368 783 28.85 -96.92 + 1131 3364 747 31.62 -97.22 + 1132 3356 754 31.07 -97.83 + 1133 3369 732 32.84 -96.85 + 1134 3387 721 33.63 -95.45 + 1135 3367 731 32.90 -97.03 + 1136 3096 703 35.07 -118.15 + 1137 3362 732 32.82 -97.37 + 1138 3352 739 32.22 -98.18 + 1139 3317 776 29.37 -100.92 + 1140 3267 744 31.83 -104.80 + 1141 3322 750 31.37 -100.50 + 1142 3296 691 36.02 -102.55 + 1143 3277 763 30.37 -104.02 + 1144 3301 743 31.95 -102.18 + 1145 3288 745 31.78 
-103.20 + 1146 3333 737 32.41 -99.68 + 1147 3305 721 33.65 -101.82 + 1148 3271 726 33.30 -104.53 + 1149 3274 738 32.33 -104.27 + 1150 3287 734 32.68 -103.22 + 1151 3249 739 32.24 -106.22 + 1152 3240 739 32.28 -106.92 + 1153 3247 745 31.80 -106.40 + 1154 3235 727 33.23 -107.27 + 1155 3230 739 32.27 -107.72 + 1156 3196 748 31.57 -110.33 + 1157 3206 749 31.47 -109.60 + 1158 3189 741 32.12 -110.93 + 1159 3175 724 33.43 -112.02 + 1160 3142 734 32.65 -114.60 + 1161 3103 716 34.05 -117.60 + 1162 3117 719 33.83 -116.50 + 1163 3094 714 34.20 -118.35 + 1164 3092 714 34.22 -118.48 + 1165 3064 701 35.23 -120.63 + 1166 3109 733 32.73 -117.17 + 1167 3111 735 32.57 -116.98 + 1168 3080 726 33.25 -119.45 + 1169 3093 724 33.40 -118.42 + 1170 3107 728 33.13 -117.28 + 1171 3109 731 32.85 -117.12 + 1172 3093 719 33.93 -118.40 + 1173 3096 719 33.82 -118.15 + 1174 3100 721 33.68 -117.87 + 1175 3611 713 34.27 -77.90 + 1176 3597 702 35.17 -79.02 + 1177 3599 704 34.98 -78.87 + 1178 3641 700 35.27 -75.55 + 1179 3625 707 34.78 -76.87 + 1180 3600 693 35.87 -78.78 + 1181 3618 696 35.64 -77.39 + 1182 3610 700 35.33 -77.97 + 1183 3611 693 35.84 -77.90 + 1184 3615 706 34.82 -77.61 + 1185 3686 681 36.82 -72.10 + 1186 3633 680 36.90 -76.19 + 1187 3629 677 37.13 -76.50 + 1188 3624 705 34.90 -76.88 + 1189 3622 703 35.07 -77.05 + 1190 3570 717 33.95 -81.12 + 1191 3588 714 34.18 -79.72 + 1192 3542 717 33.95 -83.32 + 1193 3554 706 34.84 -82.35 + 1194 3556 705 34.90 -82.22 + 1195 3550 710 34.50 -82.72 + 1196 3573 701 35.22 -80.93 + 1197 3567 695 35.73 -81.37 + 1198 3552 698 35.43 -82.55 + 1199 3585 690 36.08 -79.94 + 1200 3579 676 37.21 -80.41 + 1201 3554 685 36.48 -82.40 + 1202 3582 689 36.13 -80.22 + 1203 3518 712 34.35 -85.16 + 1204 3498 708 34.65 -86.77 + 1205 3487 707 34.75 -87.62 + 1206 3518 704 35.03 -85.20 + 1207 3534 693 35.82 -83.98 + 1208 3519 692 35.95 -85.08 + 1209 3499 689 36.13 -86.68 + 1210 3472 713 34.27 -88.77 + 1211 3456 703 35.05 -90.00 + 1212 3470 696 35.59 -88.92 + 1213 3428 706 
34.83 -92.25 + 1214 3428 707 34.73 -92.23 + 1215 3448 693 35.83 -90.65 + 1216 3417 711 34.48 -93.10 + 1217 3432 714 34.18 -91.93 + 1218 3406 724 33.45 -93.98 + 1219 3421 727 33.22 -92.80 + 1220 3401 700 35.33 -94.37 + 1221 3403 691 36.00 -94.17 + 1222 3416 688 36.27 -93.15 + 1223 3425 689 36.20 -92.47 + 1224 3435 695 35.73 -91.65 + 1225 3432 680 36.88 -91.90 + 1226 3462 675 37.23 -89.57 + 1227 3405 679 36.91 -94.02 + 1228 3399 676 37.15 -94.50 + 1229 3348 717 33.98 -98.50 + 1230 3341 704 34.98 -99.05 + 1231 3339 700 35.33 -99.20 + 1232 3331 687 36.30 -99.77 + 1233 3359 699 35.40 -97.60 + 1234 3366 682 36.73 -97.10 + 1235 3349 709 34.60 -98.40 + 1236 3367 713 34.30 -97.02 + 1237 3381 689 36.20 -95.90 + 1238 3380 681 36.76 -96.01 + 1239 3383 705 34.88 -95.78 + 1240 3361 701 35.23 -97.47 + 1241 3288 685 36.45 -103.15 + 1242 3325 711 34.43 -100.28 + 1243 3216 697 35.52 -108.78 + 1244 3307 701 35.23 -101.70 + 1245 3243 744 31.87 -106.70 + 1246 3244 703 35.05 -106.62 + 1247 3251 696 35.62 -106.08 + 1248 3223 681 36.75 -108.23 + 1249 3282 702 35.18 -103.60 + 1250 3263 696 35.65 -105.15 + 1251 3150 700 35.27 -113.95 + 1252 3182 679 36.93 -111.45 + 1253 3169 710 34.53 -112.47 + 1254 3170 708 34.65 -112.42 + 1255 3191 704 35.02 -110.73 + 1256 3200 713 34.27 -110.00 + 1257 3208 710 34.51 -109.38 + 1258 3179 702 35.13 -111.67 + 1259 3177 701 35.23 -111.82 + 1260 3173 692 35.95 -112.15 + 1261 3141 707 34.77 -114.62 + 1262 3099 705 34.92 -117.90 + 1263 3114 706 34.84 -116.78 + 1264 3095 707 34.73 -118.22 + 1265 3097 709 34.63 -118.08 + 1266 3085 698 35.43 -119.05 + 1267 3134 690 36.08 -115.17 + 1268 3123 683 36.62 -116.02 + 1269 3076 681 36.77 -119.72 + 1270 3084 715 34.12 -119.12 + 1271 3075 711 34.43 -119.83 + 1272 3083 714 34.21 -119.20 + 1273 3065 707 34.75 -120.57 + 1274 3067 705 34.90 -120.45 + 1275 3064 695 35.66 -120.63 + 1276 3619 672 37.50 -77.33 + 1277 3604 664 38.13 -78.44 + 1278 3605 674 37.35 -78.43 + 1279 3642 666 37.93 -75.48 + 1280 3617 653 38.95 -77.44 + 1281 
3617 662 38.27 -77.45 + 1282 3631 662 38.28 -76.40 + 1283 3642 661 38.33 -75.51 + 1284 3623 655 38.84 -77.03 + 1285 3627 650 39.18 -76.67 + 1286 3614 644 39.70 -77.73 + 1287 3630 648 39.33 -76.42 + 1288 3654 647 39.45 -74.57 + 1289 3648 648 39.37 -75.07 + 1290 3645 641 39.88 -75.25 + 1291 3648 639 40.08 -75.01 + 1292 3641 644 39.68 -75.60 + 1293 3651 636 40.28 -74.82 + 1294 3654 640 40.02 -74.60 + 1295 3656 630 40.80 -74.42 + 1296 3595 674 37.33 -79.19 + 1297 3593 684 36.57 -79.33 + 1298 3585 674 37.32 -79.97 + 1299 3587 666 37.95 -79.83 + 1300 3570 668 37.78 -81.12 + 1301 3569 674 37.30 -81.19 + 1302 3579 667 37.87 -80.40 + 1303 3564 661 38.37 -81.60 + 1304 3586 654 38.88 -79.85 + 1305 3582 649 39.30 -80.23 + 1306 3586 644 39.65 -79.92 + 1307 3610 648 39.40 -77.98 + 1308 3525 652 39.05 -84.67 + 1309 3526 665 38.03 -84.60 + 1310 3511 663 38.18 -85.73 + 1311 3542 671 37.59 -83.32 + 1312 3493 669 37.75 -87.16 + 1313 3508 667 37.91 -85.97 + 1314 3532 677 37.08 -84.08 + 1315 3552 661 38.37 -82.55 + 1316 3535 647 39.42 -83.83 + 1317 3496 662 38.25 -86.95 + 1318 3566 648 39.34 -81.43 + 1319 3576 638 40.18 -80.65 + 1320 3548 640 40.00 -82.88 + 1321 3547 642 39.82 -82.93 + 1322 3560 641 39.95 -81.90 + 1323 3531 641 39.90 -84.20 + 1324 3528 652 39.09 -84.42 + 1325 3488 665 38.05 -87.53 + 1326 3470 657 38.65 -88.97 + 1327 3466 668 37.78 -89.25 + 1328 3452 656 38.75 -90.37 + 1329 3448 657 38.66 -90.65 + 1330 3472 677 37.07 -88.77 + 1331 3525 678 37.05 -84.61 + 1332 3491 647 39.45 -87.32 + 1333 3500 651 39.15 -86.62 + 1334 3504 643 39.73 -86.27 + 1335 3496 635 40.41 -86.93 + 1336 3461 642 39.84 -89.67 + 1337 3441 641 39.95 -91.20 + 1338 3470 634 40.48 -88.92 + 1339 3487 638 40.12 -87.60 + 1340 3413 675 37.23 -93.38 + 1341 3428 655 38.82 -92.22 + 1342 3451 668 37.77 -90.43 + 1343 3434 664 38.13 -91.77 + 1344 3424 664 38.10 -92.55 + 1345 3396 649 39.32 -94.72 + 1346 3398 651 39.12 -94.60 + 1347 3394 655 38.83 -94.89 + 1348 3394 643 39.77 -94.92 + 1349 3361 670 37.65 -97.43 + 
1350 3356 665 38.07 -97.87 + 1351 3386 670 37.66 -95.48 + 1352 3363 665 38.06 -97.28 + 1353 3329 668 37.77 -99.97 + 1354 3319 666 37.93 -100.72 + 1355 3316 678 37.04 -100.97 + 1356 3343 661 38.34 -98.86 + 1357 3338 655 38.85 -99.27 + 1358 3347 675 37.27 -98.55 + 1359 3371 651 39.13 -96.67 + 1360 3377 661 38.33 -96.19 + 1361 3385 652 39.07 -95.62 + 1362 3384 653 38.95 -95.67 + 1363 3359 646 39.55 -97.65 + 1364 3344 654 38.87 -98.82 + 1365 3359 655 38.80 -97.65 + 1366 3304 678 37.01 -101.88 + 1367 3253 673 37.45 -105.87 + 1368 3229 676 37.15 -107.75 + 1369 3227 666 37.95 -107.90 + 1370 3283 665 38.05 -103.52 + 1371 3294 665 38.07 -102.68 + 1372 3271 662 38.28 -104.52 + 1373 3307 648 39.37 -101.70 + 1374 3331 648 39.38 -99.83 + 1375 3268 655 38.82 -104.72 + 1376 3266 645 39.57 -104.85 + 1377 3240 644 39.65 -106.92 + 1378 3241 650 39.22 -106.87 + 1379 3240 659 38.53 -106.93 + 1380 3267 657 38.70 -104.77 + 1381 3266 643 39.75 -104.87 + 1382 3287 638 40.17 -103.22 + 1383 3263 641 39.91 -105.12 + 1384 3191 645 39.62 -110.75 + 1385 3207 670 37.62 -109.47 + 1386 3191 661 38.37 -110.72 + 1387 3156 678 37.04 -113.50 + 1388 3161 669 37.70 -113.10 + 1389 3173 669 37.70 -112.15 + 1390 3219 651 39.12 -108.53 + 1391 3227 659 38.50 -107.90 + 1392 3218 674 37.30 -108.67 + 1393 3269 634 40.43 -104.63 + 1394 3264 634 40.45 -105.01 + 1395 3198 653 39.00 -110.17 + 1396 3204 656 38.76 -109.75 + 1397 3167 648 39.33 -112.58 + 1398 3177 617 41.78 -111.85 + 1399 3093 674 37.37 -118.37 + 1400 3066 675 37.28 -120.52 + 1401 3053 659 38.52 -121.50 + 1402 3056 658 38.55 -121.30 + 1403 3052 657 38.70 -121.58 + 1404 3110 665 38.05 -117.08 + 1405 3090 658 38.55 -118.63 + 1406 3138 649 39.28 -114.85 + 1407 3143 670 37.62 -114.52 + 1408 3075 646 39.50 -119.78 + 1409 3075 645 39.57 -119.79 + 1410 3049 684 36.58 -121.85 + 1411 3052 683 36.66 -121.60 + 1412 3056 667 37.90 -121.25 + 1413 3049 669 37.70 -121.82 + 1414 3044 669 37.73 -122.22 + 1415 3042 670 37.62 -122.38 + 1416 3048 674 37.37 -121.93 + 1417 
3036 659 38.52 -122.82 + 1418 3676 629 40.87 -72.86 + 1419 3659 631 40.70 -74.17 + 1420 3660 629 40.84 -74.07 + 1421 3663 630 40.77 -73.90 + 1422 3673 630 40.80 -73.10 + 1423 3663 619 41.63 -73.87 + 1424 3665 626 41.07 -73.69 + 1425 3660 621 41.50 -74.10 + 1426 3673 625 41.17 -73.12 + 1427 3676 624 41.27 -72.87 + 1428 3686 623 41.33 -72.05 + 1429 3692 625 41.17 -71.58 + 1430 3706 619 41.65 -70.52 + 1431 3712 625 41.25 -70.07 + 1432 3703 615 41.92 -70.73 + 1433 3700 618 41.68 -70.97 + 1434 3709 619 41.67 -70.28 + 1435 3694 618 41.73 -71.43 + 1436 3694 619 41.60 -71.42 + 1437 3678 615 41.93 -72.68 + 1438 3685 618 41.73 -72.18 + 1439 3683 607 42.57 -72.27 + 1440 3679 618 41.73 -72.65 + 1441 3701 607 42.58 -70.92 + 1442 3699 610 42.37 -71.03 + 1443 3689 611 42.27 -71.87 + 1444 3636 635 40.38 -75.97 + 1445 3626 637 40.20 -76.76 + 1446 3632 638 40.12 -76.29 + 1447 3599 625 41.18 -78.90 + 1448 3606 636 40.30 -78.32 + 1449 3599 636 40.32 -78.83 + 1450 3612 629 40.84 -77.85 + 1451 3639 623 41.33 -75.73 + 1452 3624 624 41.25 -76.92 + 1453 3651 618 41.70 -74.80 + 1454 3636 611 42.22 -75.98 + 1455 3630 608 42.48 -76.44 + 1456 3624 612 42.17 -76.90 + 1457 3675 595 43.53 -72.95 + 1458 3643 632 40.65 -75.43 + 1459 3664 605 42.75 -73.80 + 1460 3666 597 43.33 -73.62 + 1461 3642 580 44.68 -75.47 + 1462 3634 600 43.12 -76.12 + 1463 3644 600 43.15 -75.37 + 1464 3582 633 40.50 -80.22 + 1465 3579 630 40.77 -80.40 + 1466 3585 636 40.34 -79.93 + 1467 3592 636 40.28 -79.40 + 1468 3566 628 40.91 -81.43 + 1469 3594 612 42.15 -79.26 + 1470 3561 622 41.42 -81.87 + 1471 3552 629 40.82 -82.52 + 1472 3576 624 41.27 -80.67 + 1473 3582 613 42.08 -80.18 + 1474 3602 617 41.80 -78.62 + 1475 3586 622 41.38 -79.87 + 1476 3601 602 42.93 -78.73 + 1477 3598 600 43.10 -78.94 + 1478 3614 600 43.12 -77.67 + 1479 3483 615 41.98 -87.90 + 1480 3479 615 41.92 -88.25 + 1481 3479 640 40.03 -88.28 + 1482 3471 642 39.83 -88.87 + 1483 3461 631 40.66 -89.68 + 1484 3461 618 41.74 -89.68 + 1485 3518 627 41.00 -85.20 + 
1486 3515 637 40.25 -85.40 + 1487 3490 619 41.62 -87.42 + 1488 3485 617 41.78 -87.75 + 1489 3487 616 41.87 -87.60 + 1490 3484 609 42.42 -87.87 + 1491 3504 618 41.70 -86.32 + 1492 3536 619 41.60 -83.80 + 1493 3538 627 41.02 -83.67 + 1494 3542 611 42.23 -83.33 + 1495 3546 609 42.42 -83.02 + 1496 3552 603 42.92 -82.53 + 1497 3483 545 47.45 -87.90 + 1498 3526 604 42.77 -84.60 + 1499 3527 611 42.27 -84.47 + 1500 3517 610 42.30 -85.25 + 1501 3468 612 42.20 -89.10 + 1502 3450 621 41.45 -90.52 + 1503 3435 616 41.88 -91.70 + 1504 3442 630 40.78 -91.13 + 1505 3410 620 41.53 -93.65 + 1506 3425 626 41.10 -92.45 + 1507 3387 630 40.75 -95.41 + 1508 3448 609 42.40 -90.70 + 1509 3434 598 43.28 -91.74 + 1510 3426 607 42.55 -92.40 + 1511 3414 600 43.15 -93.33 + 1512 3403 607 42.55 -94.20 + 1513 3390 607 42.60 -95.23 + 1514 3381 623 41.30 -95.90 + 1515 3370 629 40.84 -96.75 + 1516 3370 636 40.30 -96.75 + 1517 3350 627 40.97 -98.32 + 1518 3342 619 41.62 -98.95 + 1519 3341 631 40.73 -99.00 + 1520 3377 617 41.76 -96.18 + 1521 3385 639 40.08 -95.60 + 1522 3333 621 41.44 -99.64 + 1523 3361 615 41.98 -97.43 + 1524 3363 621 41.45 -97.34 + 1525 3345 608 42.47 -98.69 + 1526 3375 609 42.40 -96.38 + 1527 3375 623 41.32 -96.37 + 1528 3290 626 41.10 -102.98 + 1529 3320 625 41.13 -100.68 + 1530 3308 633 40.51 -101.62 + 1531 3337 634 40.45 -99.33 + 1532 3293 614 42.05 -102.80 + 1533 3289 604 42.83 -103.10 + 1534 3267 625 41.15 -104.82 + 1535 3256 623 41.32 -105.67 + 1536 3269 642 39.87 -104.67 + 1537 3282 616 41.87 -103.60 + 1538 3246 603 42.92 -106.47 + 1539 3207 634 40.43 -109.52 + 1540 3236 634 40.48 -107.22 + 1541 3230 646 39.53 -107.73 + 1542 3175 630 40.78 -111.97 + 1543 3178 637 40.22 -111.72 + 1544 3212 619 41.60 -109.07 + 1545 3236 617 41.80 -107.20 + 1546 3175 625 41.20 -112.02 + 1547 3217 604 42.82 -108.73 + 1548 3220 601 43.07 -108.47 + 1549 3187 624 41.28 -111.03 + 1550 3191 594 43.60 -110.73 + 1551 3167 603 42.92 -112.60 + 1552 3174 595 43.52 -112.07 + 1553 3122 577 44.88 -116.10 + 
1554 3091 639 40.07 -118.57 + 1555 3149 631 40.73 -114.03 + 1556 3127 629 40.87 -115.73 + 1557 3127 629 40.83 -115.78 + 1558 3127 619 41.67 -115.78 + 1559 3101 628 40.90 -117.80 + 1560 3065 635 40.38 -120.57 + 1561 3064 649 39.28 -120.70 + 1562 3071 649 39.32 -120.13 + 1563 3072 654 38.90 -120.00 + 1564 3145 595 43.50 -114.30 + 1565 3143 608 42.48 -114.48 + 1566 3152 607 42.55 -113.77 + 1567 3050 612 42.15 -121.73 + 1568 3032 651 39.13 -123.20 + 1569 3044 638 40.15 -122.25 + 1570 3043 633 40.50 -122.30 + 1571 3020 627 40.98 -124.10 + 1572 3018 617 41.78 -124.23 + 1573 3066 621 41.50 -120.53 + 1574 3036 610 42.37 -122.87 + 1575 3693 599 43.20 -71.50 + 1576 3702 600 43.08 -70.82 + 1577 3708 593 43.65 -70.32 + 1578 3703 596 43.40 -70.72 + 1579 3733 583 44.45 -68.37 + 1580 3724 588 44.07 -69.10 + 1581 3751 577 44.92 -67.00 + 1582 3734 547 47.28 -68.32 + 1583 3727 578 44.80 -68.83 + 1584 3680 597 43.35 -72.52 + 1585 3683 593 43.63 -72.30 + 1586 3693 584 44.36 -71.55 + 1587 3687 583 44.42 -72.02 + 1588 3680 586 44.20 -72.57 + 1589 3694 594 43.57 -71.42 + 1590 3697 581 44.58 -71.18 + 1591 3683 603 42.90 -72.27 + 1592 3672 583 44.47 -73.15 + 1593 3706 582 44.53 -70.53 + 1594 3709 588 44.05 -70.28 + 1595 3715 585 44.32 -69.80 + 1596 3718 570 45.47 -69.58 + 1597 3729 568 45.65 -68.68 + 1598 3668 580 44.65 -73.47 + 1599 3650 577 44.93 -74.85 + 1600 3636 589 44.00 -76.01 + 1601 3659 584 44.38 -74.19 + 1602 3540 605 42.70 -83.47 + 1603 3524 577 44.90 -84.72 + 1604 3514 603 42.88 -85.52 + 1605 3502 612 42.14 -86.44 + 1606 3513 611 42.23 -85.55 + 1607 3504 599 43.17 -86.25 + 1608 3536 602 42.97 -83.75 + 1609 3541 606 42.67 -83.42 + 1610 3532 595 43.53 -84.08 + 1611 3525 584 44.36 -84.67 + 1612 3515 585 44.28 -85.42 + 1613 3504 585 44.28 -86.25 + 1614 3513 579 44.73 -85.58 + 1615 3539 575 45.07 -83.57 + 1616 3541 583 44.45 -83.40 + 1617 3483 602 42.95 -87.90 + 1618 3465 600 43.13 -89.33 + 1619 3469 606 42.62 -89.04 + 1620 3454 599 43.21 -90.18 + 1621 3440 590 43.87 -91.25 + 1622 
3438 578 44.87 -91.48 + 1623 3424 590 43.92 -92.50 + 1624 3480 583 44.48 -88.13 + 1625 3486 587 44.13 -87.68 + 1626 3475 589 43.98 -88.55 + 1627 3461 577 44.93 -89.63 + 1628 3461 579 44.78 -89.67 + 1629 3494 567 45.73 -87.08 + 1630 3487 574 45.12 -87.63 + 1631 3411 578 44.85 -93.57 + 1632 3370 594 43.58 -96.73 + 1633 3369 585 44.31 -96.82 + 1634 3362 603 42.92 -97.38 + 1635 3337 591 43.80 -99.32 + 1636 3351 584 44.38 -98.22 + 1637 3354 592 43.77 -98.03 + 1638 3365 577 44.92 -97.15 + 1639 3404 569 45.55 -94.07 + 1640 3404 558 46.40 -94.13 + 1641 3391 582 44.55 -95.08 + 1642 3387 565 45.87 -95.40 + 1643 3382 583 44.45 -95.82 + 1644 3399 585 44.32 -94.50 + 1645 3401 564 45.95 -94.35 + 1646 3415 577 44.88 -93.22 + 1647 3417 577 44.95 -93.07 + 1648 3406 586 44.22 -93.91 + 1649 3400 593 43.65 -94.42 + 1650 3413 593 43.68 -93.37 + 1651 3349 570 45.45 -98.43 + 1652 3289 588 44.06 -103.05 + 1653 3258 584 44.35 -105.53 + 1654 3239 579 44.77 -106.97 + 1655 3195 582 44.54 -110.42 + 1656 3227 589 43.97 -107.95 + 1657 3259 570 45.45 -105.40 + 1658 3267 549 47.13 -104.80 + 1659 3323 569 45.55 -100.41 + 1660 3325 584 44.38 -100.28 + 1661 3301 564 45.93 -102.17 + 1662 3213 582 44.52 -109.02 + 1663 3199 607 42.58 -110.11 + 1664 3186 580 44.68 -111.12 + 1665 3219 566 45.80 -108.53 + 1666 3207 550 47.05 -109.47 + 1667 3168 564 45.95 -112.50 + 1668 3168 573 45.25 -112.55 + 1669 3186 566 45.78 -111.15 + 1670 3195 567 45.70 -110.45 + 1671 3121 594 43.57 -116.22 + 1672 3086 594 43.58 -118.95 + 1673 3058 585 44.25 -121.15 + 1674 3151 574 45.12 -113.88 + 1675 3122 564 45.95 -116.13 + 1676 3087 567 45.68 -118.85 + 1677 3100 578 44.83 -117.82 + 1678 3030 599 43.23 -123.35 + 1679 3018 596 43.42 -124.25 + 1680 3031 587 44.12 -123.22 + 1681 3034 577 44.92 -123.00 + 1682 3021 581 44.58 -124.06 + 1683 3039 568 45.60 -122.60 + 1684 3042 569 45.55 -122.40 + 1685 3035 569 45.53 -122.95 + 1686 3058 568 45.62 -121.17 + 1687 3741 562 46.12 -67.80 + 1688 3719 555 46.62 -69.53 + 1689 3738 552 46.87 -68.01 
+ 1690 3737 554 46.68 -68.05 + 1691 3529 557 46.47 -84.37 + 1692 3527 560 46.25 -84.47 + 1693 3523 569 45.57 -84.80 + 1694 3463 568 45.63 -89.47 + 1695 3488 556 46.53 -87.55 + 1696 3490 559 46.35 -87.40 + 1697 3481 565 45.82 -88.12 + 1698 3476 548 47.17 -88.50 + 1699 3455 556 46.53 -90.13 + 1700 3429 552 46.83 -92.18 + 1701 3420 545 47.38 -92.83 + 1702 3381 552 46.83 -95.89 + 1703 3433 540 47.82 -91.83 + 1704 3413 530 48.57 -93.38 + 1705 3397 528 48.73 -94.62 + 1706 3369 552 46.90 -96.80 + 1707 3345 551 46.93 -98.68 + 1708 3393 544 47.50 -94.93 + 1709 3388 526 48.93 -95.33 + 1710 3365 538 47.95 -97.18 + 1711 3343 536 48.10 -98.87 + 1712 3319 553 46.77 -100.75 + 1713 3293 553 46.80 -102.80 + 1714 3282 535 48.18 -103.63 + 1715 3312 534 48.27 -101.28 + 1716 3310 542 47.65 -101.43 + 1717 3244 535 48.22 -106.62 + 1718 3240 546 47.33 -106.93 + 1719 3257 536 48.10 -105.58 + 1720 3275 541 47.70 -104.20 + 1721 3175 555 46.60 -112.00 + 1722 3148 551 46.92 -114.08 + 1723 3183 544 47.48 -111.37 + 1724 3203 530 48.55 -109.77 + 1725 3146 534 48.30 -114.27 + 1726 3170 530 48.60 -112.37 + 1727 3066 556 46.57 -120.53 + 1728 3055 547 47.28 -121.33 + 1729 3072 545 47.40 -120.02 + 1730 3070 545 47.40 -120.20 + 1731 3079 546 47.30 -119.52 + 1732 3081 548 47.20 -119.32 + 1733 3111 558 46.38 -117.02 + 1734 3113 540 47.77 -116.82 + 1735 3082 559 46.32 -119.27 + 1736 3084 560 46.27 -119.12 + 1737 3095 562 46.10 -118.28 + 1738 3104 542 47.63 -117.53 + 1739 3107 542 47.68 -117.32 + 1740 3109 553 46.75 -117.12 + 1741 3103 541 47.70 -117.60 + 1742 3100 530 48.55 -117.88 + 1743 3066 550 47.03 -120.53 + 1744 3028 536 48.12 -123.50 + 1745 3079 532 48.42 -119.53 + 1746 3023 561 46.15 -123.88 + 1747 3035 551 46.97 -122.90 + 1748 3022 551 46.97 -123.93 + 1749 3035 562 46.12 -122.94 + 1750 3043 545 47.45 -122.30 + 1751 3044 544 47.50 -122.22 + 1752 3043 543 47.53 -122.30 + 1753 3043 539 47.90 -122.28 + 1754 3039 547 47.27 -122.58 + 1755 3567 798 27.65 -81.33 + 1756 3014 538 47.95 -124.55 + 1757 3040 
527 48.80 -122.53 + 1758 3163 638 40.17 -112.93 + 1759 3253 558 46.43 -105.87 + 1760 3387 580 44.67 -95.45 + 1761 3450 599 43.22 -90.53 + 1762 3639 588 44.05 -75.73 + 1763 3709 590 43.90 -70.25 + 1764 3694 602 42.93 -71.43 + 1765 3249 616 41.90 -106.19 + 1766 3320 639 40.09 -100.65 + 1767 3307 603 42.91 -101.69 + 1768 3363 639 40.10 -97.34 + 1769 3357 612 42.21 -97.79 + 1770 3409 616 41.90 -93.70 + 1771 3449 619 41.61 -90.57 + 1772 3533 642 39.82 -84.03 + 1773 3476 617 41.77 -88.48 + 1774 3494 630 40.81 -87.05 + 1775 3515 655 38.83 -85.42 + 1776 3664 632 40.65 -73.78 + 1777 3696 608 42.47 -71.28 + 1778 3698 605 42.72 -71.12 + 1779 3680 612 42.20 -72.53 + 1780 3678 612 42.15 -72.72 + 1781 3714 619 41.67 -69.97 + 1782 3662 603 42.85 -73.93 + 1783 3301 668 37.77 -102.18 + 1784 3267 653 38.97 -104.82 + 1785 3281 649 39.26 -103.70 + 1786 3268 638 40.18 -104.72 + 1787 3340 670 37.65 -99.09 + 1788 3384 673 37.38 -95.63 + 1789 3363 662 38.31 -97.30 + 1790 3422 672 37.52 -92.70 + 1791 3403 645 39.58 -94.19 + 1792 3450 644 39.66 -90.48 + 1793 3465 638 40.15 -89.33 + 1794 3650 652 39.02 -74.92 + 1795 3102 695 35.68 -117.68 + 1796 3247 737 32.41 -106.35 + 1797 3227 680 36.84 -107.91 + 1798 3338 690 36.07 -99.22 + 1799 3335 686 36.43 -99.53 + 1800 3361 682 36.69 -97.48 + 1801 3381 695 35.68 -95.86 + 1802 3360 704 34.98 -97.52 + 1803 3457 680 36.88 -89.97 + 1804 3502 679 36.97 -86.42 + 1805 3633 688 36.27 -76.18 + 1806 3129 732 32.83 -115.58 + 1807 3122 721 33.63 -116.17 + 1808 3140 722 33.62 -114.72 + 1809 3282 703 35.08 -103.61 + 1810 3250 728 33.08 -106.12 + 1811 3247 731 32.90 -106.40 + 1812 3316 729 33.02 -100.98 + 1813 3331 762 30.50 -99.77 + 1814 3383 745 31.78 -95.71 + 1815 3402 715 34.11 -94.29 + 1816 3421 744 31.90 -92.78 + 1817 3468 763 30.40 -89.07 + 1818 3471 716 34.09 -88.86 + 1819 3564 744 31.90 -81.63 + 1820 3546 757 30.89 -83.01 + 1821 3578 717 33.97 -80.47 + 1822 3598 721 33.68 -78.93 + 1823 3601 719 33.82 -78.72 + 1824 3577 788 28.47 -80.55 + 1825 3111 735 
32.55 -116.97 + 1826 3133 734 32.63 -115.24 + 1827 3246 747 31.63 -106.43 + 1828 3094 782 28.88 -118.30 + 1829 3188 780 29.07 -110.97 + 1830 3252 785 28.70 -105.97 + 1831 3171 802 27.32 -112.30 + 1832 3189 794 27.97 -110.93 + 1833 3190 794 27.95 -110.80 + 1834 3334 801 27.43 -99.57 + 1835 3351 819 26.02 -98.23 + 1836 3284 825 25.53 -103.45 + 1837 3326 821 25.87 -100.20 + 1838 3327 822 25.78 -100.10 + 1839 3360 822 25.77 -97.53 + 1840 3195 843 24.17 -110.42 + 1841 3196 844 24.07 -110.37 + 1842 3204 856 23.15 -109.70 + 1843 3234 834 24.82 -107.40 + 1844 3271 843 24.13 -104.53 + 1845 3246 855 23.20 -106.42 + 1846 3248 855 23.17 -106.27 + 1847 3340 848 23.73 -99.13 + 1848 3342 848 23.72 -98.97 + 1849 3294 859 22.90 -102.68 + 1850 3316 868 22.15 -100.98 + 1851 3356 867 22.28 -97.87 + 1852 3299 872 21.88 -102.30 + 1853 3497 883 21.03 -86.87 + 1854 3261 887 20.68 -105.25 + 1855 3286 889 20.52 -103.32 + 1856 3461 883 20.98 -89.65 + 1857 3496 889 20.53 -86.93 + 1858 3270 907 19.15 -104.57 + 1859 3315 898 19.85 -101.03 + 1860 3334 904 19.35 -99.57 + 1861 3340 903 19.43 -99.10 + 1862 3377 907 19.15 -96.18 + 1863 3398 920 18.10 -94.58 + 1864 3433 913 18.65 -91.80 + 1865 3310 927 17.60 -101.47 + 1866 3330 937 16.83 -99.92 + 1867 3332 937 16.77 -99.75 + 1868 3376 950 15.78 -96.27 + 1869 3426 963 14.78 -92.38 + 1870 3781 738 32.37 -64.68 + 1871 3598 810 26.70 -78.97 + 1872 3601 812 26.55 -78.69 + 1873 3593 823 25.73 -79.30 + 1874 3617 831 25.05 -77.47 + 1875 3639 851 23.50 -75.76 + 1876 3549 873 21.83 -82.78 + 1877 3554 858 22.98 -82.40 + 1878 3568 856 23.13 -81.28 + 1879 3612 878 21.42 -77.85 + 1880 3621 892 20.33 -77.12 + 1881 3628 891 20.40 -76.62 + 1882 3638 896 19.96 -75.85 + 1883 3647 895 20.08 -75.15 + 1884 3655 891 20.35 -74.50 + 1885 3650 888 20.65 -74.92 + 1886 3565 875 21.62 -81.55 + 1887 3600 873 21.78 -78.78 + 1888 3624 884 20.95 -76.94 + 1889 3647 897 19.90 -75.12 + 1890 3567 905 19.28 -81.35 + 1891 3611 915 18.50 -77.92 + 1892 3626 922 17.93 -76.78 + 1893 3685 899 
19.75 -72.18 + 1894 3683 914 18.57 -72.30 + 1895 3705 899 19.75 -70.55 + 1896 3704 903 19.46 -70.69 + 1897 3733 914 18.57 -68.37 + 1898 3717 916 18.43 -69.67 + 1899 3714 916 18.47 -69.88 + 1900 3749 915 18.50 -67.12 + 1901 3749 918 18.27 -67.15 + 1902 3756 921 18.02 -66.57 + 1903 3764 916 18.43 -66.00 + 1904 3777 917 18.33 -64.97 + 1905 3779 925 17.70 -64.80 + 1906 3783 916 18.45 -64.53 + 1907 3478 928 17.53 -88.30 + 1908 3458 935 16.92 -89.88 + 1909 3438 956 15.32 -91.47 + 1910 3474 951 15.72 -88.60 + 1911 3450 965 14.58 -90.52 + 1912 3446 974 13.92 -90.82 + 1913 3459 978 13.57 -89.83 + 1914 3468 977 13.70 -89.12 + 1915 3469 980 13.43 -89.05 + 1916 3486 982 13.28 -87.67 + 1917 3509 941 16.46 -85.92 + 1918 3501 943 16.32 -86.53 + 1919 3497 951 15.73 -86.87 + 1920 3489 951 15.72 -87.48 + 1921 3493 958 15.17 -87.12 + 1922 3483 954 15.45 -87.93 + 1923 3536 957 15.22 -83.80 + 1924 3509 961 14.90 -85.93 + 1925 3472 963 14.78 -88.78 + 1926 3480 969 14.33 -88.17 + 1927 3492 972 14.05 -87.22 + 1928 3493 982 13.30 -87.18 + 1929 3541 972 14.05 -83.37 + 1930 3506 996 12.15 -86.17 + 1931 3523 1024 9.97 -84.78 + 1932 3530 1024 10.00 -84.22 + 1933 3531 1025 9.95 -84.15 + 1934 3545 1024 10.00 -83.05 + 1935 3513 1016 10.60 -85.55 + 1936 3552 1031 9.43 -82.52 + 1937 3593 1036 9.05 -79.37 + 1938 3554 1045 8.39 -82.42 + 1939 3556 1032 9.35 -82.25 + 1940 3572 1049 8.08 -80.94 + 1941 3591 1037 8.97 -79.51 + 1942 3801 919 18.20 -63.05 + 1943 3813 938 16.75 -62.17 + 1944 3806 931 17.29 -62.68 + 1945 3807 932 17.20 -62.58 + 1946 3818 933 17.12 -61.78 + 1947 3801 921 18.04 -63.12 + 1948 3802 928 17.48 -62.98 + 1949 3804 923 17.90 -62.85 + 1950 3821 944 16.27 -61.52 + 1951 3824 953 15.53 -61.30 + 1952 3823 953 15.53 -61.40 + 1953 3823 956 15.30 -61.40 + 1954 3847 985 13.07 -59.48 + 1955 3712 992 12.50 -70.01 + 1956 3726 996 12.20 -68.97 + 1957 3735 996 12.15 -68.28 + 1958 3562 991 12.58 -81.72 + 1959 3658 1010 11.13 -74.23 + 1960 3642 1018 10.45 -75.52 + 1961 3651 1012 10.90 -74.77 + 1962 
3672 1061 7.10 -73.20 + 1963 3641 1072 6.22 -75.60 + 1964 3643 1073 6.18 -75.43 + 1965 3638 1090 4.82 -75.80 + 1966 3660 1092 4.70 -74.13 + 1967 3690 1017 10.57 -71.73 + 1968 3751 1016 10.60 -66.98 + 1969 3832 1188 -2.83 -60.70 + 1970 3988 1170 -1.43 -48.48 + 1971 3841 1192 -3.15 -59.98 + 1972 4115 1200 -3.78 -38.53 + 1973 3979 1223 -5.53 -49.15 + 1974 4060 1217 -5.05 -42.82 + 1975 4157 1228 -5.92 -35.25 + 1976 4095 1253 -7.88 -40.08 + 1977 3791 1263 -8.70 -63.90 + 1978 3978 1258 -8.27 -49.28 + 1979 4041 1268 -9.07 -44.37 + 1980 3989 1289 -10.70 -48.40 + 1981 4115 1318 -13.00 -38.52 + 1982 3890 1352 -15.65 -56.10 + 1983 4005 1446 -23.00 -47.13 + 1984 3980 1438 -22.32 -49.07 + 1985 4056 1445 -22.90 -43.17 + 1986 4011 1454 -23.62 -46.65 + 1987 3953 1537 -30.08 -51.18 + 1988 3461 1164 -0.90 -89.62 + 1989 3606 1154 -0.12 -78.35 + 1990 3586 1180 -2.15 -79.88 + 1991 3596 1255 -8.08 -79.12 + 1992 3623 1306 -12.02 -77.03 + 1993 3779 1341 -14.75 -64.80 + 1994 3736 1363 -16.50 -68.17 + 1995 3703 1579 -33.38 -70.78 + 1996 3697 1600 -34.97 -71.22 + 1997 3874 1474 -25.16 -57.38 + 1998 3853 1503 -27.45 -59.05 + 1999 3831 1573 -32.92 -60.78 + 2000 3859 1598 -34.82 -58.53 + 2001 3745 1738 -45.78 -67.45 + 2002 2569 871 21.98 -159.35 + 2003 2585 879 21.32 -158.07 + 2004 2587 879 21.35 -157.93 + 2005 2598 881 21.15 -157.10 + 2006 2606 884 20.90 -156.43 + 2007 1854 981 13.35 144.80 + 2008 1866 958 15.12 145.73 + 2009 2134 905 19.28 166.65 + 2010 2624 900 19.72 -155.07 + 2011 1944 1056 7.47 151.85 + 2012 2026 1063 6.97 158.22 + 2013 2087 1084 5.33 163.03 + 2014 2147 1040 8.73 167.73 + 2015 2194 1061 7.08 171.38 + 2016 1722 1058 7.33 134.48 + 2017 1768 1031 9.48 138.08 + 2018 1859 970 14.20 145.20 + 2019 1864 960 15.00 145.60 + 2020 1865 911 18.80 145.70 + 2021 1725 1048 8.10 134.70 + 2022 1693 1084 5.30 132.20 + 2023 1760 1046 8.30 137.50 + 2024 1790 1024 10.00 139.80 + 2025 1799 1027 9.80 140.50 + 2026 1727 1057 7.40 134.90 + 2027 1851 1042 8.60 144.60 + 2028 1883 1057 7.40 147.10 + 
2029 1910 1057 7.40 149.20 + 2030 1917 1042 8.60 149.70 + 2031 1945 1042 8.60 151.90 + 2032 1955 1064 6.90 152.70 + 2033 1969 1082 5.50 153.80 + 2034 1984 1103 3.80 155.00 + 2035 2014 1078 5.80 157.30 + 2036 2022 1062 7.00 157.90 + 2037 2046 1065 6.80 159.80 + 2038 2057 1073 6.20 160.70 + 2039 2080 1001 11.80 162.50 + 2040 2121 1038 8.90 165.70 + 2041 2161 1059 7.30 168.80 + 2042 2171 1076 5.90 169.60 + 2043 2174 1009 11.20 169.80 + 2044 2179 1030 9.50 170.20 + 2045 2200 1074 6.10 171.80 + 2046 2804 1383 -18.07 -140.95 + 2047 2238 1626 -37.02 174.80 + 2048 1676 1311 -12.42 130.87 + 2049 1955 1506 -27.63 152.72 + 2050 1485 1561 -31.92 115.97 + 2051 1936 1587 -33.95 151.18 + 2052 1854 1634 -37.67 144.83 + 2053 1907 1605 -35.40 148.98 + 2054 1888 1700 -42.83 147.50 + 2055 1368 1231 -6.15 106.85 + 2056 3494 750 31.42 -87.05 + 2057 1544 958 15.18 120.57 + 2058 1549 966 14.52 121.00 + 2059 1563 1064 6.90 122.07 + 2060 3615 723 33.49 -77.59 + 2061 3143 776 29.37 -114.47 + 2062 3156 636 40.33 -113.50 + 2063 3591 702 35.17 -79.50 + 2064 3307 527 48.83 -101.67 + 2065 4600 586 44.22 -0.67 + 2066 276 1510 -28.00 21.50 + 2067 3738 1671 -40.50 -68.00 + 2068 3847 1502 -27.33 -59.50 + 2069 1728 1513 -28.23 134.98 + 2070 4262 657 38.70 -27.10 + 2071 2691 368 61.20 -149.80 + 2072 4424 1253 -7.90 -14.40 + 2073 1784 695 35.70 139.30 + 2074 3911 526 48.90 -54.50 + 2075 63 595 43.50 4.90 + 2076 4495 477 52.70 -8.90 + 2077 1694 1338 -14.50 132.30 + 2078 4507 664 38.10 -7.90 + 2079 1544 842 24.20 120.60 + 2080 1293 989 12.70 101.00 + 2081 533 892 20.30 41.60 + 2082 2424 1335 -14.30 -170.70 + 2083 3721 1813 -51.60 -69.30 + 2084 3208 1500 -27.20 -109.40 + 2085 1133 863 22.60 88.50 + 2086 4162 1256 -8.10 -34.90 + 2087 4013 1452 -23.40 -46.50 + 2088 1023 1060 7.20 79.90 + 2089 3751 1016 10.60 -67.00 + 2090 3938 1091 4.80 -52.40 + 2091 932 907 19.10 72.80 + 2092 608 1393 -18.80 47.50 + 2093 3189 740 32.20 -110.90 + 2094 3289 587 44.10 -103.10 + 2095 2727 325 64.60 -147.00 + 2096 3065 706 34.80 
-120.60 + 2097 3097 709 34.60 -118.10 + 2098 3156 637 40.19 -113.47 + 2099 887 705 34.95 69.27 + 2100 4592 476 52.83 -1.32 + 2101 93 1037 9.01 7.26 + 2102 3118 850 23.61 -116.48 + 2103 814 563 46.00 63.56 + 2104 867 542 47.67 67.73 + 2105 2611 899 19.73 -156.05 + 2106 2875 394 59.23 -135.43 + 2107 2736 369 61.13 -146.25 + 2108 2600 886 20.78 -156.95 + 2109 2604 883 21.02 -156.63 + 2110 3757 923 17.85 -66.52 + 2111 3749 919 18.17 -67.15 + 2112 3417 701 35.25 -93.09 + 2113 1641 1351 -15.51 128.15 + 2114 4514 416 57.48 -7.36 + 2115 130 456 54.38 10.13 diff --git a/parm/product/gefs.0p25.f000.paramlist.a.txt b/parm/product/gefs.0p25.f000.paramlist.a.txt new file mode 100644 index 0000000000..4fdb8f9713 --- /dev/null +++ b/parm/product/gefs.0p25.f000.paramlist.a.txt @@ -0,0 +1,39 @@ +:HGT:surface: +:PRMSL:mean sea level: +:PRES:surface: +:TMP:2 m above ground: +:TMAX:2 m above ground: +:TMIN:2 m above ground: +:RH:2 m above ground: +:DPT:2 m above ground: +:UGRD:10 m above ground: +:VGRD:10 m above ground: +:APCP:surface: +:CRAIN:surface: +:CSNOW:surface: +:CFRZR:surface: +:CICEP:surface: +:PWAT:entire atmosphere (considered as a single layer): +:CAPE:180-0 mb above ground: +:CAPE:surface: +:CIN:180-0 mb above ground: +:CIN:surface: +:CPOFP:surface: +:HLCY:3000-0 m above ground: +:TCDC:entire atmosphere: +:WEASD:surface: +:SNOD:surface: +:ULWRF:top of atmosphere: +:DSWRF:surface: +:DLWRF:surface: +:USWRF:surface: +:ULWRF:surface: +:GUST:surface: +:SHTFL:surface: +:LHTFL:surface: +:ICETK:surface: +:TSOIL:0-0.1 +:SOILW:0-0.1 +:MSLET:mean sea level: +:VIS:surface: +:HGT:cloud ceiling: diff --git a/parm/product/gefs.0p25.f000.paramlist.b.txt b/parm/product/gefs.0p25.f000.paramlist.b.txt new file mode 100644 index 0000000000..b94b4ab8a3 --- /dev/null +++ b/parm/product/gefs.0p25.f000.paramlist.b.txt @@ -0,0 +1,522 @@ +############################# sorted pgrb2a + pgrb2b 201502 +:4LFTX:surface: +:5WAVH:500 mb: +:ABSV:1000 mb: +:ABSV:100 mb: +:ABSV:10 mb: +:ABSV:150 mb: 
+:ABSV:200 mb: +:ABSV:20 mb: +:ABSV:250 mb: +:ABSV:300 mb: +:ABSV:30 mb: +:ABSV:350 mb: +:ABSV:400 mb: +:ABSV:450 mb: +:ABSV:500 mb: +:ABSV:50 mb: +:ABSV:550 mb: +:ABSV:600 mb: +:ABSV:650 mb: +:ABSV:700 mb: +:ABSV:70 mb: +:ABSV:750 mb: +:ABSV:800 mb: +:ABSV:850 mb: +:ABSV:900 mb: +:ABSV:925 mb: +:ABSV:950 mb: +:ABSV:975 mb: +:BRTMP:top of atmosphere: +:CAPE:255-0 mb above ground: +:CIN:255-0 mb above ground: +:CLWMR:1000 mb: +:CLWMR:100 mb: +:CLWMR:10 mb: +:CLWMR:150 mb: +:CLWMR:200 mb: +:CLWMR:20 mb: +:CLWMR:250 mb: +:CLWMR:300 mb: +:CLWMR:30 mb: +:CLWMR:350 mb: +:CLWMR:400 mb: +:CLWMR:450 mb: +:CLWMR:500 mb: +:CLWMR:50 mb: +:CLWMR:550 mb: +:CLWMR:600 mb: +:CLWMR:650 mb: +:CLWMR:700 mb: +:CLWMR:70 mb: +:CLWMR:750 mb: +:CLWMR:800 mb: +:CLWMR:850 mb: +:CLWMR:900 mb: +:CLWMR:925 mb: +:CLWMR:950 mb: +:CLWMR:975 mb: +:CNWAT:surface: +:CWAT:entire atmosphere (considered as a single layer): +:DPT:30-0 mb above ground: +:FLDCP:surface: +:FRICV:surface: +:HGT:0C isotherm: +:HGT:1000 mb: +:HGT:100 mb: +:HGT:10 mb: +:HGT:1 mb: +:HGT:150 mb: +:HGT:200 mb: +:HGT:20 mb: +:HGT:2 mb: +:HGT:250 mb: +:HGT:300 mb: +:HGT:30 mb: +:HGT:3 mb: +:HGT:350 mb: +:HGT:400 mb: +:HGT:450 mb: +:HGT:500 mb: +:HGT:50 mb: +:HGT:5 mb: +:HGT:550 mb: +:HGT:600 mb: +:HGT:650 mb: +:HGT:700 mb: +:HGT:70 mb: +:HGT:7 mb: +:HGT:750 mb: +:HGT:800 mb: +:HGT:850 mb: +:HGT:900 mb: +:HGT:925 mb: +:HGT:950 mb: +:HGT:975 mb: +:HGT:highest tropospheric freezing level: +:HGT:max wind: +:HGT:PV=-1.5e-06 (Km^2/kg/s) surface: +:HGT:PV=1.5e-06 (Km^2/kg/s) surface: +:HGT:PV=-1e-06 (Km^2/kg/s) surface: +:HGT:PV=1e-06 (Km^2/kg/s) surface: +:HGT:PV=-2e-06 (Km^2/kg/s) surface: +:HGT:PV=2e-06 (Km^2/kg/s) surface: +:HGT:PV=-5e-07 (Km^2/kg/s) surface: +:HGT:PV=5e-07 (Km^2/kg/s) surface: +:HGT:tropopause: +:HINDEX:surface: +:HPBL:surface: +:ICAHT:max wind: +:ICAHT:tropopause: +:ICEC:surface: +:ICIP:300 mb: +:ICIP:400 mb: +:ICIP:500 mb: +:ICIP:600 mb: +:ICIP:700 mb: +:ICIP:800 mb: +:ICSEV:300 mb: +:ICSEV:400 mb: +:ICSEV:500 mb: 
+:ICSEV:600 mb: +:ICSEV:700 mb: +:ICSEV:800 mb: +:LAND:surface: +:LFTX:surface: +:MNTSF:320 K isentropic level: +:O3MR:100 mb: +:O3MR:10 mb: +:O3MR:125 mb: +:O3MR:150 mb: +:O3MR:1 mb: +:O3MR:200 mb: +:O3MR:20 mb: +:O3MR:250 mb: +:O3MR:2 mb: +:O3MR:300 mb: +:O3MR:30 mb: +:O3MR:350 mb: +:O3MR:3 mb: +:O3MR:400 mb: +:O3MR:50 mb: +:O3MR:5 mb: +:O3MR:70 mb: +:O3MR:7 mb: +:PEVPR:surface: +:PLI:30-0 mb above ground: +:PLPL:255-0 mb above ground: +:POT:0.995 sigma level: +:PRES:80 m above ground: +:PRES:max wind: +:PRES:mean sea level: +:PRES:PV=-1.5e-06 (Km^2/kg/s) surface: +:PRES:PV=1.5e-06 (Km^2/kg/s) surface: +:PRES:PV=-1e-06 (Km^2/kg/s) surface: +:PRES:PV=1e-06 (Km^2/kg/s) surface: +:PRES:PV=-2e-06 (Km^2/kg/s) surface: +:PRES:PV=2e-06 (Km^2/kg/s) surface: +:PRES:PV=-5e-07 (Km^2/kg/s) surface: +:PRES:PV=5e-07 (Km^2/kg/s) surface: +:PRES:tropopause: +:PVORT:310 K isentropic level: +:PVORT:320 K isentropic level: +:PVORT:350 K isentropic level: +:PVORT:450 K isentropic level: +:PVORT:550 K isentropic level: +:PVORT:650 K isentropic level: +:PWAT:30-0 mb above ground: +:RH:0.33-1 sigma layer: +:RH:0.44-0.72 sigma layer: +:RH:0.44-1 sigma layer: +:RH:0.72-0.94 sigma layer: +:RH:0.995 sigma level: +:RH:0C isotherm: +:RH:1000 mb: +:RH:100 mb: +:RH:10 mb: +:RH:120-90 mb above ground: +:RH:150-120 mb above ground: +:RH:150 mb: +:RH:180-150 mb above ground: +:RH:200 mb: +:RH:20 mb: +:RH:250 mb: +:RH:300 mb: +:RH:30-0 mb above ground: +:RH:30 mb: +:RH:350 mb: +:RH:400 mb: +:RH:450 mb: +:RH:500 mb: +:RH:50 mb: +:RH:550 mb: +:RH:600 mb: +:RH:60-30 mb above ground: +:RH:650 mb: +:RH:700 mb: +:RH:70 mb: +:RH:750 mb: +:RH:800 mb: +:RH:850 mb: +:RH:900 mb: +:RH:90-60 mb above ground: +:RH:925 mb: +:RH:950 mb: +:RH:975 mb: +:RH:entire atmosphere (considered as a single layer): +:RH:highest tropospheric freezing level: +:SFCR:surface: +:SNOHF:surface: +:SNOWC:surface: +:SOILL:0-0.1 m below ground: +:SOILL:0.1-0.4 m below ground: +:SOILL:0.4-1 m below ground: +:SOILL:1-2 m below ground: 
+:SOILW:0.1-0.4 m below ground: +:SOILW:0.4-1 m below ground: +:SOILW:1-2 m below ground: +:SPFH:1000 mb: +:SPFH:100 mb: +:SPFH:10 mb: +:SPFH:1 mb: +:SPFH:120-90 mb above ground: +:SPFH:150-120 mb above ground: +:SPFH:150 mb: +:SPFH:180-150 mb above ground: +:SPFH:200 mb: +:SPFH:20 mb: +:SPFH:2 mb: +:SPFH:250 mb: +:SPFH:2 m above ground: +:SPFH:300 mb: +:SPFH:30-0 mb above ground: +:SPFH:30 mb: +:SPFH:3 mb: +:SPFH:350 mb: +:SPFH:400 mb: +:SPFH:450 mb: +:SPFH:500 mb: +:SPFH:50 mb: +:SPFH:5 mb: +:SPFH:550 mb: +:SPFH:600 mb: +:SPFH:60-30 mb above ground: +:SPFH:650 mb: +:SPFH:700 mb: +:SPFH:70 mb: +:SPFH:7 mb: +:SPFH:750 mb: +:SPFH:800 mb: +:SPFH:80 m above ground: +:SPFH:850 mb: +:SPFH:900 mb: +:SPFH:90-60 mb above ground: +:SPFH:925 mb: +:SPFH:950 mb: +:SPFH:975 mb: +:SUNSD:surface: +:TCDC:475 mb: +:TMP:0.995 sigma level: +:TMP:1000 mb: +:TMP:100 m above ground: +:TMP:100 mb: +:TMP:10 mb: +:TMP:1 mb: +:TMP:120-90 mb above ground: +:TMP:150-120 mb above ground: +:TMP:150 mb: +:TMP:180-150 mb above ground: +:TMP:1829 m above mean sea level: +:TMP:200 mb: +:TMP:20 mb: +:TMP:2 mb: +:TMP:250 mb: +:TMP:2743 m above mean sea level: +:TMP:300 mb: +:TMP:30-0 mb above ground: +:TMP:305 m above mean sea level: +:TMP:30 mb: +:TMP:3 mb: +:TMP:320 K isentropic level: +:TMP:350 mb: +:TMP:3658 m above mean sea level: +:TMP:400 mb: +:TMP:450 mb: +:TMP:450 K isentropic level: +:TMP:4572 m above mean sea level: +:TMP:457 m above mean sea level: +:TMP:500 mb: +:TMP:50 mb: +:TMP:5 mb: +:TMP:550 mb: +:TMP:550 K isentropic level: +:TMP:600 mb: +:TMP:60-30 mb above ground: +:TMP:610 m above mean sea level: +:TMP:650 mb: +:TMP:650 K isentropic level: +:TMP:700 mb: +:TMP:70 mb: +:TMP:7 mb: +:TMP:750 mb: +:TMP:800 mb: +:TMP:80 m above ground: +:TMP:850 mb: +:TMP:900 mb: +:TMP:90-60 mb above ground: +:TMP:914 m above mean sea level: +:TMP:925 mb: +:TMP:950 mb: +:TMP:975 mb: +:TMP:max wind: +:TMP:PV=-1.5e-06 (Km^2/kg/s) surface: +:TMP:PV=1.5e-06 (Km^2/kg/s) surface: +:TMP:PV=-1e-06 (Km^2/kg/s) 
surface:
+:TMP:PV=1e-06 (Km^2/kg/s) surface:
+:TMP:PV=-2e-06 (Km^2/kg/s) surface:
+:TMP:PV=2e-06 (Km^2/kg/s) surface:
+:TMP:PV=-5e-07 (Km^2/kg/s) surface:
+:TMP:PV=5e-07 (Km^2/kg/s) surface:
+:TMP:surface:
+:TMP:tropopause:
+:TOZNE:entire atmosphere (considered as a single layer):
+:TSOIL:0.1-0.4 m below ground:
+:TSOIL:0.4-1 m below ground:
+:TSOIL:1-2 m below ground:
+:UGRD:0.995 sigma level:
+:UGRD:1000 mb:
+:UGRD:100 m above ground:
+:UGRD:100 mb:
+:UGRD:10 mb:
+:UGRD:1 mb:
+:UGRD:120-90 mb above ground:
+:UGRD:150-120 mb above ground:
+:UGRD:150 mb:
+:UGRD:180-150 mb above ground:
+:UGRD:1829 m above mean sea level:
+:UGRD:200 mb:
+:UGRD:20 mb:
+:UGRD:2 mb:
+:UGRD:250 mb:
+:UGRD:2743 m above mean sea level:
+:UGRD:300 mb:
+:UGRD:30-0 mb above ground:
+:UGRD:305 m above mean sea level:
+:UGRD:30 mb:
+:UGRD:3 mb:
+:UGRD:320 K isentropic level:
+:UGRD:350 mb:
+:UGRD:3658 m above mean sea level:
+:UGRD:400 mb:
+:UGRD:450 mb:
+:UGRD:450 K isentropic level:
+:UGRD:4572 m above mean sea level:
+:UGRD:457 m above mean sea level:
+:UGRD:500 mb:
+:UGRD:50 mb:
+:UGRD:5 mb:
+:UGRD:550 mb:
+:UGRD:550 K isentropic level:
+:UGRD:600 mb:
+:UGRD:60-30 mb above ground:
+:UGRD:610 m above mean sea level:
+:UGRD:650 mb:
+:UGRD:650 K isentropic level:
+:UGRD:700 mb:
+:UGRD:70 mb:
+:UGRD:7 mb:
+:UGRD:750 mb:
+:UGRD:800 mb:
+:UGRD:80 m above ground:
+:UGRD:850 mb:
+:UGRD:900 mb:
+:UGRD:90-60 mb above ground:
+:UGRD:914 m above mean sea level:
+:UGRD:925 mb:
+:UGRD:950 mb:
+:UGRD:975 mb:
+:UGRD:max wind:
+:UGRD:planetary boundary layer:
+:UGRD:PV=-1.5e-06 (Km^2/kg/s) surface:
+:UGRD:PV=1.5e-06 (Km^2/kg/s) surface:
+:UGRD:PV=-1e-06 (Km^2/kg/s) surface:
+:UGRD:PV=1e-06 (Km^2/kg/s) surface:
+:UGRD:PV=-2e-06 (Km^2/kg/s) surface:
+:UGRD:PV=2e-06 (Km^2/kg/s) surface:
+:UGRD:PV=-5e-07 (Km^2/kg/s) surface:
+:UGRD:PV=5e-07 (Km^2/kg/s) surface:
+:UGRD:tropopause:
+:USTM:6000-0 m above ground:
+:APTMP:2 m above ground:
+:VGRD:0.995 sigma level:
+:VGRD:1000 mb:
+:VGRD:100 m above ground:
+:VGRD:100 mb:
+:VGRD:10 mb:
+:VGRD:1 mb:
+:VGRD:120-90 mb above ground:
+:VGRD:150-120 mb above ground:
+:VGRD:150 mb:
+:VGRD:180-150 mb above ground:
+:VGRD:1829 m above mean sea level:
+:VGRD:200 mb:
+:VGRD:20 mb:
+:VGRD:2 mb:
+:VGRD:250 mb:
+:VGRD:2743 m above mean sea level:
+:VGRD:300 mb:
+:VGRD:30-0 mb above ground:
+:VGRD:305 m above mean sea level:
+:VGRD:30 mb:
+:VGRD:3 mb:
+:VGRD:320 K isentropic level:
+:VGRD:350 mb:
+:VGRD:3658 m above mean sea level:
+:VGRD:400 mb:
+:VGRD:450 mb:
+:VGRD:450 K isentropic level:
+:VGRD:4572 m above mean sea level:
+:VGRD:457 m above mean sea level:
+:VGRD:500 mb:
+:VGRD:50 mb:
+:VGRD:5 mb:
+:VGRD:550 mb:
+:VGRD:550 K isentropic level:
+:VGRD:600 mb:
+:VGRD:60-30 mb above ground:
+:VGRD:610 m above mean sea level:
+:VGRD:650 mb:
+:VGRD:650 K isentropic level:
+:VGRD:700 mb:
+:VGRD:70 mb:
+:VGRD:7 mb:
+:VGRD:750 mb:
+:VGRD:800 mb:
+:VGRD:80 m above ground:
+:VGRD:850 mb:
+:VGRD:900 mb:
+:VGRD:90-60 mb above ground:
+:VGRD:914 m above mean sea level:
+:VGRD:925 mb:
+:VGRD:950 mb:
+:VGRD:975 mb:
+:VGRD:max wind:
+:VGRD:planetary boundary layer:
+:VGRD:PV=-1.5e-06 (Km^2/kg/s) surface:
+:VGRD:PV=1.5e-06 (Km^2/kg/s) surface:
+:VGRD:PV=-1e-06 (Km^2/kg/s) surface:
+:VGRD:PV=1e-06 (Km^2/kg/s) surface:
+:VGRD:PV=-2e-06 (Km^2/kg/s) surface:
+:VGRD:PV=2e-06 (Km^2/kg/s) surface:
+:VGRD:PV=-5e-07 (Km^2/kg/s) surface:
+:VGRD:PV=5e-07 (Km^2/kg/s) surface:
+:VGRD:tropopause:
+:VRATE:planetary boundary layer:
+:VSTM:6000-0 m above ground:
+:VVEL:0.995 sigma level:
+:VVEL:1 mb:
+:VVEL:2 mb:
+:VVEL:3 mb:
+:VVEL:5 mb:
+:VVEL:7 mb:
+:VVEL:10 mb:
+:VVEL:20 mb:
+:VVEL:30 mb:
+:VVEL:50 mb:
+:VVEL:70 mb:
+:VVEL:1000 mb:
+:VVEL:100 mb:
+:VVEL:150 mb:
+:VVEL:200 mb:
+:VVEL:250 mb:
+:VVEL:300 mb:
+:VVEL:350 mb:
+:VVEL:400 mb:
+:VVEL:450 mb:
+:VVEL:500 mb:
+:VVEL:550 mb:
+:VVEL:600 mb:
+:VVEL:650 mb:
+:VVEL:700 mb:
+:VVEL:750 mb:
+:VVEL:800 mb:
+:VVEL:850 mb:
+:VVEL:900 mb:
+:VVEL:925 mb:
+:VVEL:950 mb:
+:VVEL:975 mb:
+:VWSH:PV=-1.5e-06 (Km^2/kg/s) surface:
+:VWSH:PV=1.5e-06 (Km^2/kg/s) surface:
+:VWSH:PV=-1e-06 (Km^2/kg/s) surface:
+:VWSH:PV=1e-06 (Km^2/kg/s) surface:
+:VWSH:PV=-2e-06 (Km^2/kg/s) surface:
+:VWSH:PV=2e-06 (Km^2/kg/s) surface:
+:VWSH:PV=-5e-07 (Km^2/kg/s) surface:
+:VWSH:PV=5e-07 (Km^2/kg/s) surface:
+:VWSH:tropopause:
+:WILT:surface:
+:PRES:1 hybrid level:
+:HGT:1 hybrid level:
+:TMP:1 hybrid level:
+:RH:1 hybrid level:
+:UGRD:1 hybrid level:
+:VGRD:1 hybrid level:
+:PRES:2 hybrid level:
+:HGT:2 hybrid level:
+:TMP:2 hybrid level:
+:RH:2 hybrid level:
+:UGRD:2 hybrid level:
+:VGRD:2 hybrid level:
+:PRES:3 hybrid level:
+:HGT:3 hybrid level:
+:TMP:3 hybrid level:
+:RH:3 hybrid level:
+:UGRD:3 hybrid level:
+:VGRD:3 hybrid level:
+:PRES:4 hybrid level:
+:HGT:4 hybrid level:
+:TMP:4 hybrid level:
+:RH:4 hybrid level:
+:UGRD:4 hybrid level:
+:VGRD:4 hybrid level:
+;############################ do not leave a blank line at the end
diff --git a/parm/product/gefs.0p25.fFFF.paramlist.a.txt b/parm/product/gefs.0p25.fFFF.paramlist.a.txt
new file mode 100644
index 0000000000..a4a3ace385
--- /dev/null
+++ b/parm/product/gefs.0p25.fFFF.paramlist.a.txt
@@ -0,0 +1,38 @@
+:PRMSL:mean sea level:
+:PRES:surface:
+:TMP:2 m above ground:
+:TMAX:2 m above ground:
+:TMIN:2 m above ground:
+:RH:2 m above ground:
+:DPT:2 m above ground:
+:UGRD:10 m above ground:
+:VGRD:10 m above ground:
+:APCP:surface:
+:CRAIN:surface:
+:CSNOW:surface:
+:CFRZR:surface:
+:CICEP:surface:
+:PWAT:entire atmosphere (considered as a single layer):
+:CAPE:180-0 mb above ground:
+:CAPE:surface:
+:CIN:180-0 mb above ground:
+:CIN:surface:
+:HLCY:3000-0 m above ground:
+:TCDC:entire atmosphere:
+:WEASD:surface:
+:SNOD:surface:
+:ULWRF:top of atmosphere:
+:DSWRF:surface:
+:CPOFP:surface:
+:DLWRF:surface:
+:USWRF:surface:
+:ULWRF:surface:
+:GUST:surface:
+:SHTFL:surface:
+:LHTFL:surface:
+:ICETK:surface:
+:TSOIL:0-0.1
+:SOILW:0-0.1
+:MSLET:mean sea level:
+:VIS:surface:
+:HGT:cloud ceiling:
diff --git a/parm/product/gefs.0p25.fFFF.paramlist.b.txt b/parm/product/gefs.0p25.fFFF.paramlist.b.txt
new file mode 100644
index 0000000000..f7fdb73ddf
--- /dev/null
+++ b/parm/product/gefs.0p25.fFFF.paramlist.b.txt
@@ -0,0 +1,554 @@
+############################# sorted pgrb2a + pgrb2b 201502
+:4LFTX:surface:
+:5WAVH:500 mb:
+:ABSV:1000 mb:
+:ABSV:100 mb:
+:ABSV:10 mb:
+:ABSV:150 mb:
+:ABSV:200 mb:
+:ABSV:20 mb:
+:ABSV:250 mb:
+:ABSV:300 mb:
+:ABSV:30 mb:
+:ABSV:350 mb:
+:ABSV:400 mb:
+:ABSV:450 mb:
+:ABSV:500 mb:
+:ABSV:50 mb:
+:ABSV:550 mb:
+:ABSV:600 mb:
+:ABSV:650 mb:
+:ABSV:700 mb:
+:ABSV:70 mb:
+:ABSV:750 mb:
+:ABSV:800 mb:
+:ABSV:850 mb:
+:ABSV:900 mb:
+:ABSV:925 mb:
+:ABSV:950 mb:
+:ABSV:975 mb:
+:ACPCP:surface:
+:ALBDO:surface:
+:BRTMP:top of atmosphere:
+:CAPE:255-0 mb above ground:
+:CDUVB:surface:
+:CIN:255-0 mb above ground:
+:CLWMR:1000 mb:
+:CLWMR:100 mb:
+:CLWMR:10 mb:
+:CLWMR:150 mb:
+:CLWMR:200 mb:
+:CLWMR:20 mb:
+:CLWMR:250 mb:
+:CLWMR:300 mb:
+:CLWMR:30 mb:
+:CLWMR:350 mb:
+:CLWMR:400 mb:
+:CLWMR:450 mb:
+:CLWMR:500 mb:
+:CLWMR:50 mb:
+:CLWMR:550 mb:
+:CLWMR:600 mb:
+:CLWMR:650 mb:
+:CLWMR:700 mb:
+:CLWMR:70 mb:
+:CLWMR:750 mb:
+:CLWMR:800 mb:
+:CLWMR:850 mb:
+:CLWMR:900 mb:
+:CLWMR:925 mb:
+:CLWMR:950 mb:
+:CLWMR:975 mb:
+:CNWAT:surface:
+:CPRAT:surface:
+:CWAT:entire atmosphere (considered as a single layer):
+:CWORK:entire atmosphere (considered as a single layer):
+:DPT:30-0 mb above ground:
+:DUVB:surface:
+:FLDCP:surface:
+:FRICV:surface:
+:GFLUX:surface:
+:HGT:0C isotherm:
+:HGT:1000 mb:
+:HGT:100 mb:
+:HGT:10 mb:
+:HGT:1 mb:
+:HGT:150 mb:
+:HGT:200 mb:
+:HGT:20 mb:
+:HGT:2 mb:
+:HGT:250 mb:
+:HGT:300 mb:
+:HGT:30 mb:
+:HGT:3 mb:
+:HGT:350 mb:
+:HGT:400 mb:
+:HGT:450 mb:
+:HGT:500 mb:
+:HGT:50 mb:
+:HGT:5 mb:
+:HGT:550 mb:
+:HGT:600 mb:
+:HGT:650 mb:
+:HGT:700 mb:
+:HGT:70 mb:
+:HGT:7 mb:
+:HGT:750 mb:
+:HGT:800 mb:
+:HGT:850 mb:
+:HGT:900 mb:
+:HGT:925 mb:
+:HGT:950 mb:
+:HGT:975 mb:
+:HGT:highest tropospheric freezing level:
+:HGT:max wind:
+:HGT:PV=-1.5e-06 (Km^2/kg/s) surface:
+:HGT:PV=1.5e-06 (Km^2/kg/s) surface:
+:HGT:PV=-1e-06 (Km^2/kg/s) surface:
+:HGT:PV=1e-06 (Km^2/kg/s) surface:
+:HGT:PV=-2e-06 (Km^2/kg/s) surface:
+:HGT:PV=2e-06 (Km^2/kg/s) surface:
+:HGT:PV=-5e-07 (Km^2/kg/s) surface:
+:HGT:PV=5e-07 (Km^2/kg/s) surface:
+:HGT:surface:
+:HGT:tropopause:
+:HINDEX:surface:
+:HPBL:surface:
+:ICAHT:max wind:
+:ICAHT:tropopause:
+:ICEC:surface:
+:ICIP:300 mb:
+:ICIP:400 mb:
+:ICIP:500 mb:
+:ICIP:600 mb:
+:ICIP:700 mb:
+:ICIP:800 mb:
+:ICSEV:300 mb:
+:ICSEV:400 mb:
+:ICSEV:500 mb:
+:ICSEV:600 mb:
+:ICSEV:700 mb:
+:ICSEV:800 mb:
+:LAND:surface:
+:LFTX:surface:
+:MNTSF:320 K isentropic level:
+:NCPCP:surface:
+:O3MR:100 mb:
+:O3MR:10 mb:
+:O3MR:125 mb:
+:O3MR:150 mb:
+:O3MR:1 mb:
+:O3MR:200 mb:
+:O3MR:20 mb:
+:O3MR:250 mb:
+:O3MR:2 mb:
+:O3MR:300 mb:
+:O3MR:30 mb:
+:O3MR:350 mb:
+:O3MR:3 mb:
+:O3MR:400 mb:
+:O3MR:50 mb:
+:O3MR:5 mb:
+:O3MR:70 mb:
+:O3MR:7 mb:
+:PEVPR:surface:
+:PLI:30-0 mb above ground:
+:PLPL:255-0 mb above ground:
+:POT:0.995 sigma level:
+:PRATE:surface:
+:PRES:80 m above ground:
+:PRES:convective cloud bottom level:
+:PRES:convective cloud top level:
+:PRES:high cloud bottom level:
+:PRES:high cloud top level:
+:PRES:low cloud bottom level:
+:PRES:low cloud top level:
+:PRES:max wind:
+:PRES:mean sea level:
+:PRES:middle cloud bottom level:
+:PRES:middle cloud top level:
+:PRES:PV=-1.5e-06 (Km^2/kg/s) surface:
+:PRES:PV=1.5e-06 (Km^2/kg/s) surface:
+:PRES:PV=-1e-06 (Km^2/kg/s) surface:
+:PRES:PV=1e-06 (Km^2/kg/s) surface:
+:PRES:PV=-2e-06 (Km^2/kg/s) surface:
+:PRES:PV=2e-06 (Km^2/kg/s) surface:
+:PRES:PV=-5e-07 (Km^2/kg/s) surface:
+:PRES:PV=5e-07 (Km^2/kg/s) surface:
+:PRES:tropopause:
+:PVORT:310 K isentropic level:
+:PVORT:320 K isentropic level:
+:PVORT:350 K isentropic level:
+:PVORT:450 K isentropic level:
+:PVORT:550 K isentropic level:
+:PVORT:650 K isentropic level:
+:PWAT:30-0 mb above ground:
+:RH:0.33-1 sigma layer:
+:RH:0.44-0.72 sigma layer:
+:RH:0.44-1 sigma layer:
+:RH:0.72-0.94 sigma layer:
+:RH:0.995 sigma level:
+:RH:0C isotherm:
+:RH:1000 mb:
+:RH:100 mb:
+:RH:10 mb:
+:RH:120-90 mb above ground:
+:RH:150-120 mb above ground:
+:RH:150 mb:
+:RH:180-150 mb above ground:
+:RH:200 mb:
+:RH:20 mb:
+:RH:250 mb:
+:RH:300 mb:
+:RH:30-0 mb above ground:
+:RH:30 mb:
+:RH:350 mb:
+:RH:400 mb:
+:RH:450 mb:
+:RH:500 mb:
+:RH:50 mb:
+:RH:550 mb:
+:RH:600 mb:
+:RH:60-30 mb above ground:
+:RH:650 mb:
+:RH:700 mb:
+:RH:70 mb:
+:RH:750 mb:
+:RH:800 mb:
+:RH:850 mb:
+:RH:900 mb:
+:RH:90-60 mb above ground:
+:RH:925 mb:
+:RH:950 mb:
+:RH:975 mb:
+:RH:entire atmosphere (considered as a single layer):
+:RH:highest tropospheric freezing level:
+:SFCR:surface:
+:SNOWC:surface:
+:SNOHF:surface:
+:SOILL:0-0.1 m below ground:
+:SOILL:0.1-0.4 m below ground:
+:SOILL:0.4-1 m below ground:
+:SOILL:1-2 m below ground:
+:SOILW:0.1-0.4 m below ground:
+:SOILW:0.4-1 m below ground:
+:SOILW:1-2 m below ground:
+:SPFH:1000 mb:
+:SPFH:100 mb:
+:SPFH:10 mb:
+:SPFH:1 mb:
+:SPFH:120-90 mb above ground:
+:SPFH:150-120 mb above ground:
+:SPFH:150 mb:
+:SPFH:180-150 mb above ground:
+:SPFH:200 mb:
+:SPFH:20 mb:
+:SPFH:2 mb:
+:SPFH:250 mb:
+:SPFH:2 m above ground:
+:SPFH:300 mb:
+:SPFH:30-0 mb above ground:
+:SPFH:30 mb:
+:SPFH:3 mb:
+:SPFH:350 mb:
+:SPFH:400 mb:
+:SPFH:450 mb:
+:SPFH:500 mb:
+:SPFH:50 mb:
+:SPFH:5 mb:
+:SPFH:550 mb:
+:SPFH:600 mb:
+:SPFH:60-30 mb above ground:
+:SPFH:650 mb:
+:SPFH:700 mb:
+:SPFH:70 mb:
+:SPFH:7 mb:
+:SPFH:750 mb:
+:SPFH:800 mb:
+:SPFH:80 m above ground:
+:SPFH:850 mb:
+:SPFH:900 mb:
+:SPFH:90-60 mb above ground:
+:SPFH:925 mb:
+:SPFH:950 mb:
+:SPFH:975 mb:
+:SUNSD:surface:
+:TCDC:475 mb:
+:TCDC:boundary layer cloud layer:
+:TCDC:convective cloud layer:
+:TCDC:high cloud layer:
+:TCDC:low cloud layer:
+:TCDC:middle cloud layer:
+:TMP:0.995 sigma level:
+:TMP:1000 mb:
+:TMP:100 m above ground:
+:TMP:100 mb:
+:TMP:10 mb:
+:TMP:1 mb:
+:TMP:120-90 mb above ground:
+:TMP:150-120 mb above ground:
+:TMP:150 mb:
+:TMP:180-150 mb above ground:
+:TMP:1829 m above mean sea level:
+:TMP:200 mb:
+:TMP:20 mb:
+:TMP:2 mb:
+:TMP:250 mb:
+:TMP:2743 m above mean sea level:
+:TMP:300 mb:
+:TMP:30-0 mb above ground:
+:TMP:305 m above mean sea level:
+:TMP:30 mb:
+:TMP:3 mb:
+:TMP:320 K isentropic level:
+:TMP:350 mb:
+:TMP:3658 m above mean sea level:
+:TMP:400 mb:
+:TMP:450 K isentropic level:
+:TMP:450 mb:
+:TMP:4572 m above mean sea level:
+:TMP:457 m above mean sea level:
+:TMP:500 mb:
+:TMP:50 mb:
+:TMP:5 mb:
+:TMP:550 K isentropic level:
+:TMP:550 mb:
+:TMP:600 mb:
+:TMP:60-30 mb above ground:
+:TMP:610 m above mean sea level:
+:TMP:650 K isentropic level:
+:TMP:650 mb:
+:TMP:700 mb:
+:TMP:70 mb:
+:TMP:7 mb:
+:TMP:750 mb:
+:TMP:800 mb:
+:TMP:80 m above ground:
+:TMP:850 mb:
+:TMP:900 mb:
+:TMP:90-60 mb above ground:
+:TMP:914 m above mean sea level:
+:TMP:925 mb:
+:TMP:950 mb:
+:TMP:975 mb:
+:TMP:high cloud top level:
+:TMP:low cloud top level:
+:TMP:max wind:
+:TMP:middle cloud top level:
+:TMP:PV=-1.5e-06 (Km^2/kg/s) surface:
+:TMP:PV=1.5e-06 (Km^2/kg/s) surface:
+:TMP:PV=-1e-06 (Km^2/kg/s) surface:
+:TMP:PV=1e-06 (Km^2/kg/s) surface:
+:TMP:PV=-2e-06 (Km^2/kg/s) surface:
+:TMP:PV=2e-06 (Km^2/kg/s) surface:
+:TMP:PV=-5e-07 (Km^2/kg/s) surface:
+:TMP:PV=5e-07 (Km^2/kg/s) surface:
+:TMP:surface:
+:TMP:tropopause:
+:TOZNE:entire atmosphere (considered as a single layer):
+:TSOIL:0.1-0.4 m below ground:
+:TSOIL:0.4-1 m below ground:
+:TSOIL:1-2 m below ground:
+:UFLX:surface:
+:UGRD:0.995 sigma level:
+:UGRD:1000 mb:
+:UGRD:100 m above ground:
+:UGRD:100 mb:
+:UGRD:10 mb:
+:UGRD:1 mb:
+:UGRD:120-90 mb above ground:
+:UGRD:150-120 mb above ground:
+:UGRD:150 mb:
+:UGRD:180-150 mb above ground:
+:UGRD:1829 m above mean sea level:
+:UGRD:200 mb:
+:UGRD:20 mb:
+:UGRD:2 mb:
+:UGRD:250 mb:
+:UGRD:2743 m above mean sea level:
+:UGRD:300 mb:
+:UGRD:30-0 mb above ground:
+:UGRD:305 m above mean sea level:
+:UGRD:30 mb:
+:UGRD:3 mb:
+:UGRD:320 K isentropic level:
+:UGRD:350 mb:
+:UGRD:3658 m above mean sea level:
+:UGRD:400 mb:
+:UGRD:450 K isentropic level:
+:UGRD:450 mb:
+:UGRD:4572 m above mean sea level:
+:UGRD:457 m above mean sea level:
+:UGRD:500 mb:
+:UGRD:50 mb:
+:UGRD:5 mb:
+:UGRD:550 K isentropic level:
+:UGRD:550 mb:
+:UGRD:600 mb:
+:UGRD:60-30 mb above ground:
+:UGRD:610 m above mean sea level:
+:UGRD:650 K isentropic level:
+:UGRD:650 mb:
+:UGRD:700 mb:
+:UGRD:70 mb:
+:UGRD:7 mb:
+:UGRD:750 mb:
+:UGRD:800 mb:
+:UGRD:80 m above ground:
+:UGRD:850 mb:
+:UGRD:900 mb:
+:UGRD:90-60 mb above ground:
+:UGRD:914 m above mean sea level:
+:UGRD:925 mb:
+:UGRD:950 mb:
+:UGRD:975 mb:
+:UGRD:max wind:
+:UGRD:planetary boundary layer:
+:UGRD:PV=-1.5e-06 (Km^2/kg/s) surface:
+:UGRD:PV=1.5e-06 (Km^2/kg/s) surface:
+:UGRD:PV=-1e-06 (Km^2/kg/s) surface:
+:UGRD:PV=1e-06 (Km^2/kg/s) surface:
+:UGRD:PV=-2e-06 (Km^2/kg/s) surface:
+:UGRD:PV=2e-06 (Km^2/kg/s) surface:
+:UGRD:PV=-5e-07 (Km^2/kg/s) surface:
+:UGRD:PV=5e-07 (Km^2/kg/s) surface:
+:UGRD:tropopause:
+:U-GWD:surface:
+:USTM:6000-0 m above ground:
+:USWRF:top of atmosphere:
+:APTMP:2 m above ground
+:VFLX:surface:
+:VGRD:0.995 sigma level:
+:VGRD:1000 mb:
+:VGRD:100 m above ground:
+:VGRD:100 mb:
+:VGRD:10 mb:
+:VGRD:1 mb:
+:VGRD:120-90 mb above ground:
+:VGRD:150-120 mb above ground:
+:VGRD:150 mb:
+:VGRD:180-150 mb above ground:
+:VGRD:1829 m above mean sea level:
+:VGRD:200 mb:
+:VGRD:20 mb:
+:VGRD:2 mb:
+:VGRD:250 mb:
+:VGRD:2743 m above mean sea level:
+:VGRD:300 mb:
+:VGRD:30-0 mb above ground:
+:VGRD:305 m above mean sea level:
+:VGRD:30 mb:
+:VGRD:3 mb:
+:VGRD:320 K isentropic level:
+:VGRD:350 mb:
+:VGRD:3658 m above mean sea level:
+:VGRD:400 mb:
+:VGRD:450 K isentropic level:
+:VGRD:450 mb:
+:VGRD:4572 m above mean sea level:
+:VGRD:457 m above mean sea level:
+:VGRD:500 mb:
+:VGRD:50 mb:
+:VGRD:5 mb:
+:VGRD:550 K isentropic level:
+:VGRD:550 mb:
+:VGRD:600 mb:
+:VGRD:60-30 mb above ground:
+:VGRD:610 m above mean sea level:
+:VGRD:650 K isentropic level:
+:VGRD:650 mb:
+:VGRD:700 mb:
+:VGRD:70 mb:
+:VGRD:7 mb:
+:VGRD:750 mb:
+:VGRD:800 mb:
+:VGRD:80 m above ground:
+:VGRD:850 mb:
+:VGRD:900 mb:
+:VGRD:90-60 mb above ground:
+:VGRD:914 m above mean sea level:
+:VGRD:925 mb:
+:VGRD:950 mb:
+:VGRD:975 mb:
+:VGRD:max wind:
+:VGRD:planetary boundary layer:
+:VGRD:PV=-1.5e-06 (Km^2/kg/s) surface:
+:VGRD:PV=1.5e-06 (Km^2/kg/s) surface:
+:VGRD:PV=-1e-06 (Km^2/kg/s) surface:
+:VGRD:PV=1e-06 (Km^2/kg/s) surface:
+:VGRD:PV=-2e-06 (Km^2/kg/s) surface:
+:VGRD:PV=2e-06 (Km^2/kg/s) surface:
+:VGRD:PV=-5e-07 (Km^2/kg/s) surface:
+:VGRD:PV=5e-07 (Km^2/kg/s) surface:
+:VGRD:tropopause:
+:V-GWD:surface:
+:VRATE:planetary boundary layer:
+:VSTM:6000-0 m above ground:
+:VVEL:0.995 sigma level:
+:VVEL:1 mb:
+:VVEL:2 mb:
+:VVEL:3 mb:
+:VVEL:5 mb:
+:VVEL:7 mb:
+:VVEL:10 mb:
+:VVEL:20 mb:
+:VVEL:30 mb:
+:VVEL:50 mb:
+:VVEL:70 mb:
+:VVEL:1000 mb:
+:VVEL:100 mb:
+:VVEL:150 mb:
+:VVEL:200 mb:
+:VVEL:250 mb:
+:VVEL:300 mb:
+:VVEL:350 mb:
+:VVEL:400 mb:
+:VVEL:450 mb:
+:VVEL:500 mb:
+:VVEL:550 mb:
+:VVEL:600 mb:
+:VVEL:650 mb:
+:VVEL:700 mb:
+:VVEL:750 mb:
+:VVEL:800 mb:
+:VVEL:850 mb:
+:VVEL:900 mb:
+:VVEL:925 mb:
+:VVEL:950 mb:
+:VVEL:975 mb:
+:VWSH:PV=-1.5e-06 (Km^2/kg/s) surface:
+:VWSH:PV=1.5e-06 (Km^2/kg/s) surface:
+:VWSH:PV=-1e-06 (Km^2/kg/s) surface:
+:VWSH:PV=1e-06 (Km^2/kg/s) surface:
+:VWSH:PV=-2e-06 (Km^2/kg/s) surface:
+:VWSH:PV=2e-06 (Km^2/kg/s) surface:
+:VWSH:PV=-5e-07 (Km^2/kg/s) surface:
+:VWSH:PV=5e-07 (Km^2/kg/s) surface:
+:VWSH:tropopause:
+:WATR:surface:
+:WILT:surface:
+:PRES:1 hybrid level:
+:HGT:1 hybrid level:
+:TMP:1 hybrid level:
+:RH:1 hybrid level:
+:UGRD:1 hybrid level:
+:VGRD:1 hybrid level:
+:PRES:2 hybrid level:
+:HGT:2 hybrid level:
+:TMP:2 hybrid level:
+:RH:2 hybrid level:
+:UGRD:2 hybrid level:
+:VGRD:2 hybrid level:
+:PRES:3 hybrid level:
+:HGT:3 hybrid level:
+:TMP:3 hybrid level:
+:RH:3 hybrid level:
+:UGRD:3 hybrid level:
+:VGRD:3 hybrid level:
+:PRES:4 hybrid level:
+:HGT:4 hybrid level:
+:TMP:4 hybrid level:
+:RH:4 hybrid level:
+:UGRD:4 hybrid level:
+:VGRD:4 hybrid level:
+;############################ do not leave a blank line at the end
diff --git a/parm/product/gefs.0p50.f000.paramlist.a.txt b/parm/product/gefs.0p50.f000.paramlist.a.txt
new file mode 100644
index 0000000000..ab8e73f552
--- /dev/null
+++ b/parm/product/gefs.0p50.f000.paramlist.a.txt
@@ -0,0 +1,80 @@
+############################# sorted pgrb2a 201408
+:CAPE:180-0 mb above ground:
+:CIN:180-0 mb above ground:
+:DLWRF:surface:
+:DSWRF:surface:
+:HGT:10 mb:
+:HGT:100 mb:
+:HGT:1000 mb:
+:HGT:200 mb:
+:HGT:250 mb:
+:HGT:300 mb:
+:HGT:50 mb:
+:HGT:500 mb:
+:HGT:700 mb:
+:HGT:850 mb:
+:HGT:925 mb:
+:HGT:surface:
+:ICETK:surface:
+:LHTFL:surface:
+:PRES:surface:
+:PRMSL:mean sea level:
+:PWAT:entire atmosphere (considered as a single layer):
+:RH:10 mb:
+:RH:100 mb:
+:RH:1000 mb:
+:RH:2 m above ground:
+:RH:200 mb:
+:RH:250 mb:
+:RH:50 mb:
+:RH:500 mb:
+:RH:700 mb:
+:RH:850 mb:
+:RH:925 mb:
+:SHTFL:surface:
+:SNOD:surface:
+:SOILW:0-0.1 m below ground:
+:TMP:10 mb:
+:TMP:100 mb:
+:TMP:1000 mb:
+:TMP:2 m above ground:
+:TMP:200 mb:
+:TMP:250 mb:
+:TMP:50 mb:
+:TMP:500 mb:
+:TMP:700 mb:
+:TMP:850 mb:
+:TMP:925 mb:
+:TSOIL:0-0.1 m below ground:
+:UGRD:10 m above ground:
+:UGRD:10 mb:
+:UGRD:100 mb:
+:UGRD:1000 mb:
+:UGRD:200 mb:
+:UGRD:250 mb:
+:UGRD:300 mb:
+:UGRD:400 mb:
+:UGRD:50 mb:
+:UGRD:500 mb:
+:UGRD:700 mb:
+:UGRD:850 mb:
+:UGRD:925 mb:
+:ULWRF:surface:
+:ULWRF:top of atmosphere:
+:USWRF:surface:
+:VGRD:10 m above ground:
+:VGRD:10 mb:
+:VGRD:100 mb:
+:VGRD:1000 mb:
+:VGRD:200 mb:
+:VGRD:250 mb:
+:VGRD:300 mb:
+:VGRD:400 mb:
+:VGRD:50 mb:
+:VGRD:500 mb:
+:VGRD:700 mb:
+:VGRD:850 mb:
+:VGRD:925 mb:
+:VVEL:850 mb:
+:WEASD:surface:
+;############################ do not leave a blank line at the end
diff --git a/parm/product/gefs.0p50.f000.paramlist.b.txt b/parm/product/gefs.0p50.f000.paramlist.b.txt
new file mode 100644
index 0000000000..8fd65468ae
--- /dev/null
+++ b/parm/product/gefs.0p50.f000.paramlist.b.txt
@@ -0,0 +1,474 @@
+############################# sorted pgrb2a + pgrb2b 201502
+:4LFTX:surface:
+:5WAVH:500 mb:
+:ABSV:1000 mb:
+:ABSV:100 mb:
+:ABSV:10 mb:
+:ABSV:150 mb:
+:ABSV:200 mb:
+:ABSV:20 mb:
+:ABSV:250 mb:
+:ABSV:300 mb:
+:ABSV:30 mb:
+:ABSV:350 mb:
+:ABSV:400 mb:
+:ABSV:450 mb:
+:ABSV:500 mb:
+:ABSV:50 mb:
+:ABSV:550 mb:
+:ABSV:600 mb:
+:ABSV:650 mb:
+:ABSV:700 mb:
+:ABSV:70 mb:
+:ABSV:750 mb:
+:ABSV:800 mb:
+:ABSV:850 mb:
+:ABSV:900 mb:
+:ABSV:925 mb:
+:ABSV:950 mb:
+:ABSV:975 mb:
+:BRTMP:top of atmosphere:
+:CAPE:255-0 mb above ground:
+:CAPE:surface:
+:CIN:255-0 mb above ground:
+:CIN:surface:
+:CLWMR:1000 mb:
+:CLWMR:100 mb:
+:CLWMR:10 mb:
+:CLWMR:150 mb:
+:CLWMR:200 mb:
+:CLWMR:20 mb:
+:CLWMR:250 mb:
+:CLWMR:300 mb:
+:CLWMR:30 mb:
+:CLWMR:350 mb:
+:CLWMR:400 mb:
+:CLWMR:450 mb:
+:CLWMR:500 mb:
+:CLWMR:50 mb:
+:CLWMR:550 mb:
+:CLWMR:600 mb:
+:CLWMR:650 mb:
+:CLWMR:700 mb:
+:CLWMR:70 mb:
+:CLWMR:750 mb:
+:CLWMR:800 mb:
+:CLWMR:850 mb:
+:CLWMR:900 mb:
+:CLWMR:925 mb:
+:CLWMR:950 mb:
+:CLWMR:975 mb:
+:CNWAT:surface:
+:CPOFP:surface:
+:CWAT:entire atmosphere (considered as a single layer):
+:DPT:2 m above ground:
+:DPT:30-0 mb above ground:
+:FLDCP:surface:
+:FRICV:surface:
+:GUST:surface:
+:HGT:0C isotherm:
+:HGT:1 mb:
+:HGT:150 mb:
+:HGT:20 mb:
+:HGT:2 mb:
+:HGT:30 mb:
+:HGT:3 mb:
+:HGT:350 mb:
+:HGT:400 mb:
+:HGT:450 mb:
+:HGT:5 mb:
+:HGT:550 mb:
+:HGT:600 mb:
+:HGT:650 mb:
+:HGT:70 mb:
+:HGT:7 mb:
+:HGT:750 mb:
+:HGT:800 mb:
+:HGT:900 mb:
+:HGT:950 mb:
+:HGT:975 mb:
+:HGT:highest tropospheric freezing level:
+:HGT:max wind:
+:HGT:PV=-1.5e-06 (Km^2/kg/s) surface:
+:HGT:PV=1.5e-06 (Km^2/kg/s) surface:
+:HGT:PV=-1e-06 (Km^2/kg/s) surface:
+:HGT:PV=1e-06 (Km^2/kg/s) surface:
+:HGT:PV=-2e-06 (Km^2/kg/s) surface:
+:HGT:PV=2e-06 (Km^2/kg/s) surface:
+:HGT:PV=-5e-07 (Km^2/kg/s) surface:
+:HGT:PV=5e-07 (Km^2/kg/s) surface:
+:HGT:tropopause:
+:HINDEX:surface:
+:HLCY:3000-0 m above ground:
+:HPBL:surface:
+:ICAHT:max wind:
+:ICAHT:tropopause:
+:ICEC:surface:
+:ICIP:300 mb:
+:ICIP:400 mb:
+:ICIP:500 mb:
+:ICIP:600 mb:
+:ICIP:700 mb:
+:ICIP:800 mb:
+:ICSEV:300 mb:
+:ICSEV:400 mb:
+:ICSEV:500 mb:
+:ICSEV:600 mb:
+:ICSEV:700 mb:
+:ICSEV:800 mb:
+:LAND:surface:
+:LFTX:surface:
+:MNTSF:320 K isentropic level:
+:MSLET:mean sea level:
+:O3MR:100 mb:
+:O3MR:10 mb:
+:O3MR:125 mb:
+:O3MR:150 mb:
+:O3MR:1 mb:
+:O3MR:200 mb:
+:O3MR:20 mb:
+:O3MR:250 mb:
+:O3MR:2 mb:
+:O3MR:300 mb:
+:O3MR:30 mb:
+:O3MR:350 mb:
+:O3MR:3 mb:
+:O3MR:400 mb:
+:O3MR:50 mb:
+:O3MR:5 mb:
+:O3MR:70 mb:
+:O3MR:7 mb:
+:PEVPR:surface:
+:PLI:30-0 mb above ground:
+:PLPL:255-0 mb above ground:
+:POT:0.995 sigma level:
+:PRES:80 m above ground:
+:PRES:max wind:
+:PRES:mean sea level:
+:PRES:PV=-1.5e-06 (Km^2/kg/s) surface:
+:PRES:PV=1.5e-06 (Km^2/kg/s) surface:
+:PRES:PV=-1e-06 (Km^2/kg/s) surface:
+:PRES:PV=1e-06 (Km^2/kg/s) surface:
+:PRES:PV=-2e-06 (Km^2/kg/s) surface:
+:PRES:PV=2e-06 (Km^2/kg/s) surface:
+:PRES:PV=-5e-07 (Km^2/kg/s) surface:
+:PRES:PV=5e-07 (Km^2/kg/s) surface:
+:PRES:tropopause:
+:PVORT:310 K isentropic level:
+:PVORT:320 K isentropic level:
+:PVORT:350 K isentropic level:
+:PVORT:450 K isentropic level:
+:PVORT:550 K isentropic level:
+:PVORT:650 K isentropic level:
+:PWAT:30-0 mb above ground:
+:RH:0.33-1 sigma layer:
+:RH:0.44-0.72 sigma layer:
+:RH:0.44-1 sigma layer:
+:RH:0.72-0.94 sigma layer:
+:RH:0.995 sigma level:
+:RH:0C isotherm:
+:RH:120-90 mb above ground:
+:RH:150-120 mb above ground:
+:RH:150 mb:
+:RH:180-150 mb above ground:
+:RH:20 mb:
+:RH:300 mb:
+:RH:30-0 mb above ground:
+:RH:30 mb:
+:RH:350 mb:
+:RH:400 mb:
+:RH:450 mb:
+:RH:550 mb:
+:RH:600 mb:
+:RH:60-30 mb above ground:
+:RH:650 mb:
+:RH:70 mb:
+:RH:750 mb:
+:RH:800 mb:
+:RH:900 mb:
+:RH:90-60 mb above ground:
+:RH:950 mb:
+:RH:975 mb:
+:RH:entire atmosphere (considered as a single layer):
+:RH:highest tropospheric freezing level:
+:SFCR:surface:
+:SNOHF:surface:
+:SNOWC:surface:
+:SOILL:0-0.1 m below ground:
+:SOILL:0.1-0.4 m below ground:
+:SOILL:0.4-1 m below ground:
+:SOILL:1-2 m below ground:
+:SOILW:0.1-0.4 m below ground:
+:SOILW:0.4-1 m below ground:
+:SOILW:1-2 m below ground:
+:SPFH:1000 mb:
+:SPFH:100 mb:
+:SPFH:10 mb:
+:SPFH:1 mb:
+:SPFH:120-90 mb above ground:
+:SPFH:150-120 mb above ground:
+:SPFH:150 mb:
+:SPFH:180-150 mb above ground:
+:SPFH:200 mb:
+:SPFH:20 mb:
+:SPFH:2 mb:
+:SPFH:250 mb:
+:SPFH:2 m above ground:
+:SPFH:300 mb:
+:SPFH:30-0 mb above ground:
+:SPFH:30 mb:
+:SPFH:3 mb:
+:SPFH:350 mb:
+:SPFH:400 mb:
+:SPFH:450 mb:
+:SPFH:500 mb:
+:SPFH:50 mb:
+:SPFH:5 mb:
+:SPFH:550 mb:
+:SPFH:600 mb:
+:SPFH:60-30 mb above ground:
+:SPFH:650 mb:
+:SPFH:700 mb:
+:SPFH:70 mb:
+:SPFH:7 mb:
+:SPFH:750 mb:
+:SPFH:800 mb:
+:SPFH:80 m above ground:
+:SPFH:850 mb:
+:SPFH:900 mb:
+:SPFH:90-60 mb above ground:
+:SPFH:925 mb:
+:SPFH:950 mb:
+:SPFH:975 mb:
+:SUNSD:surface:
+:TCDC:475 mb:
+:TMP:0.995 sigma level:
+:TMP:100 m above ground:
+:TMP:1 mb:
+:TMP:120-90 mb above ground:
+:TMP:150-120 mb above ground:
+:TMP:150 mb:
+:TMP:180-150 mb above ground:
+:TMP:1829 m above mean sea level:
+:TMP:20 mb:
+:TMP:2 mb:
+:TMP:2743 m above mean sea level:
+:TMP:300 mb:
+:TMP:30-0 mb above ground:
+:TMP:305 m above mean sea level:
+:TMP:30 mb:
+:TMP:3 mb:
+:TMP:320 K isentropic level:
+:TMP:350 mb:
+:TMP:3658 m above mean sea level:
+:TMP:400 mb:
+:TMP:450 mb:
+:TMP:450 K isentropic level:
+:TMP:4572 m above mean sea level:
+:TMP:457 m above mean sea level:
+:TMP:5 mb:
+:TMP:550 mb:
+:TMP:550 K isentropic level:
+:TMP:600 mb:
+:TMP:60-30 mb above ground:
+:TMP:610 m above mean sea level:
+:TMP:650 mb:
+:TMP:650 K isentropic level:
+:TMP:70 mb:
+:TMP:7 mb:
+:TMP:750 mb:
+:TMP:800 mb:
+:TMP:80 m above ground:
+:TMP:900 mb:
+:TMP:90-60 mb above ground:
+:TMP:914 m above mean sea level:
+:TMP:950 mb:
+:TMP:975 mb:
+:TMP:max wind:
+:TMP:PV=-1.5e-06 (Km^2/kg/s) surface:
+:TMP:PV=1.5e-06 (Km^2/kg/s) surface:
+:TMP:PV=-1e-06 (Km^2/kg/s) surface:
+:TMP:PV=1e-06 (Km^2/kg/s) surface:
+:TMP:PV=-2e-06 (Km^2/kg/s) surface:
+:TMP:PV=2e-06 (Km^2/kg/s) surface:
+:TMP:PV=-5e-07 (Km^2/kg/s) surface:
+:TMP:PV=5e-07 (Km^2/kg/s) surface:
+:TMP:surface:
+:TMP:tropopause:
+:TOZNE:entire atmosphere (considered as a single layer):
+:TSOIL:0.1-0.4 m below ground:
+:TSOIL:0.4-1 m below ground:
+:TSOIL:1-2 m below ground:
+:UGRD:0.995 sigma level:
+:UGRD:100 m above ground:
+:UGRD:1 mb:
+:UGRD:120-90 mb above ground:
+:UGRD:150-120 mb above ground:
+:UGRD:150 mb:
+:UGRD:180-150 mb above ground:
+:UGRD:1829 m above mean sea level:
+:UGRD:20 mb:
+:UGRD:2 mb:
+:UGRD:2743 m above mean sea level:
+:UGRD:30-0 mb above ground:
+:UGRD:305 m above mean sea level:
+:UGRD:30 mb:
+:UGRD:3 mb:
+:UGRD:320 K isentropic level:
+:UGRD:350 mb:
+:UGRD:3658 m above mean sea level:
+:UGRD:450 mb:
+:UGRD:450 K isentropic level:
+:UGRD:4572 m above mean sea level:
+:UGRD:457 m above mean sea level:
+:UGRD:5 mb:
+:UGRD:550 mb:
+:UGRD:550 K isentropic level:
+:UGRD:600 mb:
+:UGRD:60-30 mb above ground:
+:UGRD:610 m above mean sea level:
+:UGRD:650 mb:
+:UGRD:650 K isentropic level:
+:UGRD:70 mb:
+:UGRD:7 mb:
+:UGRD:750 mb:
+:UGRD:800 mb:
+:UGRD:80 m above ground:
+:UGRD:900 mb:
+:UGRD:90-60 mb above ground:
+:UGRD:914 m above mean sea level:
+:UGRD:950 mb:
+:UGRD:975 mb:
+:UGRD:max wind:
+:UGRD:planetary boundary layer:
+:UGRD:PV=-1.5e-06 (Km^2/kg/s) surface:
+:UGRD:PV=1.5e-06 (Km^2/kg/s) surface:
+:UGRD:PV=-1e-06 (Km^2/kg/s) surface:
+:UGRD:PV=1e-06 (Km^2/kg/s) surface:
+:UGRD:PV=-2e-06 (Km^2/kg/s) surface:
+:UGRD:PV=2e-06 (Km^2/kg/s) surface:
+:UGRD:PV=-5e-07 (Km^2/kg/s) surface:
+:UGRD:PV=5e-07 (Km^2/kg/s) surface:
+:UGRD:tropopause:
+:USTM:6000-0 m above ground:
+:APTMP:2 m above ground:
+:VGRD:0.995 sigma level:
+:VGRD:100 m above ground:
+:VGRD:1 mb:
+:VGRD:120-90 mb above ground:
+:VGRD:150-120 mb above ground:
+:VGRD:150 mb:
+:VGRD:180-150 mb above ground:
+:VGRD:1829 m above mean sea level:
+:VGRD:20 mb:
+:VGRD:2 mb:
+:VGRD:2743 m above mean sea level:
+:VGRD:30-0 mb above ground:
+:VGRD:305 m above mean sea level:
+:VGRD:30 mb:
+:VGRD:3 mb:
+:VGRD:320 K isentropic level:
+:VGRD:350 mb:
+:VGRD:3658 m above mean sea level:
+:VGRD:450 mb:
+:VGRD:450 K isentropic level:
+:VGRD:4572 m above mean sea level:
+:VGRD:457 m above mean sea level:
+:VGRD:5 mb:
+:VGRD:550 mb:
+:VGRD:550 K isentropic level:
+:VGRD:600 mb:
+:VGRD:60-30 mb above ground:
+:VGRD:610 m above mean sea level:
+:VGRD:650 mb:
+:VGRD:650 K isentropic level:
+:VGRD:70 mb:
+:VGRD:7 mb:
+:VGRD:750 mb:
+:VGRD:800 mb:
+:VGRD:80 m above ground:
+:VGRD:900 mb:
+:VGRD:90-60 mb above ground:
+:VGRD:914 m above mean sea level:
+:VGRD:950 mb:
+:VGRD:975 mb:
+:VGRD:max wind:
+:VGRD:planetary boundary layer:
+:VGRD:PV=-1.5e-06 (Km^2/kg/s) surface:
+:VGRD:PV=1.5e-06 (Km^2/kg/s) surface:
+:VGRD:PV=-1e-06 (Km^2/kg/s) surface:
+:VGRD:PV=1e-06 (Km^2/kg/s) surface:
+:VGRD:PV=-2e-06 (Km^2/kg/s) surface:
+:VGRD:PV=2e-06 (Km^2/kg/s) surface:
+:VGRD:PV=-5e-07 (Km^2/kg/s) surface:
+:VGRD:PV=5e-07 (Km^2/kg/s) surface:
+:VGRD:tropopause:
+:VIS:surface:
+:VRATE:planetary boundary layer:
+:VSTM:6000-0 m above ground:
+:VVEL:0.995 sigma level:
+:VVEL:1 mb:
+:VVEL:2 mb:
+:VVEL:3 mb:
+:VVEL:5 mb:
+:VVEL:7 mb:
+:VVEL:10 mb:
+:VVEL:20 mb:
+:VVEL:30 mb:
+:VVEL:50 mb:
+:VVEL:70 mb:
+:VVEL:1000 mb:
+:VVEL:100 mb:
+:VVEL:150 mb:
+:VVEL:200 mb:
+:VVEL:250 mb:
+:VVEL:300 mb:
+:VVEL:350 mb:
+:VVEL:400 mb:
+:VVEL:450 mb:
+:VVEL:500 mb:
+:VVEL:550 mb:
+:VVEL:600 mb:
+:VVEL:650 mb:
+:VVEL:700 mb:
+:VVEL:750 mb:
+:VVEL:800 mb:
+:VVEL:900 mb:
+:VVEL:925 mb:
+:VVEL:950 mb:
+:VVEL:975 mb:
+:VWSH:PV=-1.5e-06 (Km^2/kg/s) surface:
+:VWSH:PV=1.5e-06 (Km^2/kg/s) surface:
+:VWSH:PV=-1e-06 (Km^2/kg/s) surface:
+:VWSH:PV=1e-06 (Km^2/kg/s) surface:
+:VWSH:PV=-2e-06 (Km^2/kg/s) surface:
+:VWSH:PV=2e-06 (Km^2/kg/s) surface:
+:VWSH:PV=-5e-07 (Km^2/kg/s) surface:
+:VWSH:PV=5e-07 (Km^2/kg/s) surface:
+:VWSH:tropopause:
+:WILT:surface:
+:HGT:cloud ceiling:
+:PRES:1 hybrid level:
+:HGT:1 hybrid level:
+:TMP:1 hybrid level:
+:RH:1 hybrid level:
+:UGRD:1 hybrid level:
+:VGRD:1 hybrid level:
+:PRES:2 hybrid level:
+:HGT:2 hybrid level:
+:TMP:2 hybrid level:
+:RH:2 hybrid level:
+:UGRD:2 hybrid level:
+:VGRD:2 hybrid level:
+:PRES:3 hybrid level:
+:HGT:3 hybrid level:
+:TMP:3 hybrid level:
+:RH:3 hybrid level:
+:UGRD:3 hybrid level:
+:VGRD:3 hybrid level:
+:PRES:4 hybrid level:
+:HGT:4 hybrid level:
+:TMP:4 hybrid level:
+:RH:4 hybrid level:
+:UGRD:4 hybrid level:
+:VGRD:4 hybrid level:
diff --git a/parm/product/gefs.0p50.fFFF.paramlist.a.txt b/parm/product/gefs.0p50.fFFF.paramlist.a.txt
new file mode 100644
index 0000000000..dde635408c
--- /dev/null
+++ b/parm/product/gefs.0p50.fFFF.paramlist.a.txt
@@ -0,0 +1,87 @@
+############################# sorted pgrb2a 201408
+:APCP:surface:
+:CAPE:180-0 mb above ground:
+:CFRZR:surface:
+:CICEP:surface:
+:CIN:180-0 mb above ground:
+:CRAIN:surface:
+:CSNOW:surface:
+:DLWRF:surface:
+:DSWRF:surface:
+:HGT:10 mb:
+:HGT:100 mb:
+:HGT:1000 mb:
+:HGT:200 mb:
+:HGT:250 mb:
+:HGT:300 mb:
+:HGT:50 mb:
+:HGT:500 mb:
+:HGT:700 mb:
+:HGT:850 mb:
+:HGT:925 mb:
+:LHTFL:surface:
+:ICETK:surface:
+:PRES:surface:
+:PRMSL:mean sea level:
+:PWAT:entire atmosphere (considered as a single layer):
+:RH:10 mb:
+:RH:100 mb:
+:RH:1000 mb:
+:RH:2 m above ground:
+:RH:200 mb:
+:RH:250 mb:
+:RH:50 mb:
+:RH:500 mb:
+:RH:700 mb:
+:RH:850 mb:
+:RH:925 mb:
+:SHTFL:surface:
+:SNOD:surface:
+:SOILW:0-0.1 m below ground:
+:TCDC:entire atmosphere:
+:TMAX:2 m above ground:
+:TMIN:2 m above ground:
+:TMP:10 mb:
+:TMP:100 mb:
+:TMP:1000 mb:
+:TMP:2 m above ground:
+:TMP:200 mb:
+:TMP:250 mb:
+:TMP:50 mb:
+:TMP:500 mb:
+:TMP:700 mb:
+:TMP:850 mb:
+:TMP:925 mb:
+:TSOIL:0-0.1 m below ground:
+:UGRD:10 m above ground:
+:UGRD:10 mb:
+:UGRD:100 mb:
+:UGRD:1000 mb:
+:UGRD:200 mb:
+:UGRD:250 mb:
+:UGRD:300 mb:
+:UGRD:400 mb:
+:UGRD:50 mb:
+:UGRD:500 mb:
+:UGRD:700 mb:
+:UGRD:850 mb:
+:UGRD:925 mb:
+:ULWRF:surface:
+:ULWRF:top of atmosphere:
+:USWRF:surface:
+:VGRD:10 m above ground:
+:VGRD:10 mb:
+:VGRD:100 mb:
+:VGRD:1000 mb:
+:VGRD:200 mb:
+:VGRD:250 mb:
+:VGRD:300 mb:
+:VGRD:400 mb:
+:VGRD:50 mb:
+:VGRD:500 mb:
+:VGRD:700 mb:
+:VGRD:850 mb:
+:VGRD:925 mb:
+:VVEL:850 mb:
+:WEASD:surface:
+;############################ do not leave a blank line at the end
diff --git a/parm/product/gefs.0p50.fFFF.paramlist.b.txt b/parm/product/gefs.0p50.fFFF.paramlist.b.txt
new file mode 100644
index 0000000000..28b98db7d5
--- /dev/null
+++ b/parm/product/gefs.0p50.fFFF.paramlist.b.txt
@@ -0,0 +1,506 @@
+############################# sorted pgrb2a + pgrb2b 201502
+:4LFTX:surface:
+:5WAVH:500 mb:
+:ABSV:1000 mb:
+:ABSV:100 mb:
+:ABSV:10 mb:
+:ABSV:150 mb:
+:ABSV:200 mb:
+:ABSV:20 mb:
+:ABSV:250 mb:
+:ABSV:300 mb:
+:ABSV:30 mb:
+:ABSV:350 mb:
+:ABSV:400 mb:
+:ABSV:450 mb:
+:ABSV:500 mb:
+:ABSV:50 mb:
+:ABSV:550 mb:
+:ABSV:600 mb:
+:ABSV:650 mb:
+:ABSV:700 mb:
+:ABSV:70 mb:
+:ABSV:750 mb:
+:ABSV:800 mb:
+:ABSV:850 mb:
+:ABSV:900 mb:
+:ABSV:925 mb:
+:ABSV:950 mb:
+:ABSV:975 mb:
+:ACPCP:surface:
+:ALBDO:surface:
+:BRTMP:top of atmosphere:
+:CAPE:255-0 mb above ground:
+:CAPE:surface:
+:CDUVB:surface:
+:CIN:255-0 mb above ground:
+:CIN:surface:
+:CLWMR:1000 mb:
+:CLWMR:100 mb:
+:CLWMR:10 mb:
+:CLWMR:150 mb:
+:CLWMR:200 mb:
+:CLWMR:20 mb:
+:CLWMR:250 mb:
+:CLWMR:300 mb:
+:CLWMR:30 mb:
+:CLWMR:350 mb:
+:CLWMR:400 mb:
+:CLWMR:450 mb:
+:CLWMR:500 mb:
+:CLWMR:50 mb:
+:CLWMR:550 mb:
+:CLWMR:600 mb:
+:CLWMR:650 mb:
+:CLWMR:700 mb:
+:CLWMR:70 mb:
+:CLWMR:750 mb:
+:CLWMR:800 mb:
+:CLWMR:850 mb:
+:CLWMR:900 mb:
+:CLWMR:925 mb:
+:CLWMR:950 mb:
+:CLWMR:975 mb:
+:CNWAT:surface:
+:CPOFP:surface:
+:CPRAT:surface:
+:CWAT:entire atmosphere (considered as a single layer):
+:CWORK:entire atmosphere (considered as a single layer):
+:DPT:2 m above ground:
+:DPT:30-0 mb above ground:
+:DUVB:surface:
+:FLDCP:surface:
+:FRICV:surface:
+:GFLUX:surface:
+:GUST:surface:
+:HGT:0C isotherm:
+:HGT:1 mb:
+:HGT:150 mb:
+:HGT:20 mb:
+:HGT:2 mb:
+:HGT:30 mb:
+:HGT:3 mb:
+:HGT:350 mb:
+:HGT:400 mb:
+:HGT:450 mb:
+:HGT:5 mb:
+:HGT:550 mb:
+:HGT:600 mb:
+:HGT:650 mb:
+:HGT:70 mb:
+:HGT:7 mb:
+:HGT:750 mb:
+:HGT:800 mb:
+:HGT:900 mb:
+:HGT:950 mb:
+:HGT:975 mb:
+:HGT:highest tropospheric freezing level:
+:HGT:max wind:
+:HGT:PV=-1.5e-06 (Km^2/kg/s) surface:
+:HGT:PV=1.5e-06 (Km^2/kg/s) surface:
+:HGT:PV=-1e-06 (Km^2/kg/s) surface:
+:HGT:PV=1e-06 (Km^2/kg/s) surface:
+:HGT:PV=-2e-06 (Km^2/kg/s) surface:
+:HGT:PV=2e-06 (Km^2/kg/s) surface:
+:HGT:PV=-5e-07 (Km^2/kg/s) surface:
+:HGT:PV=5e-07 (Km^2/kg/s) surface:
+:HGT:surface:
+:HGT:tropopause:
+:HINDEX:surface:
+:HLCY:3000-0 m above ground:
+:HPBL:surface:
+:ICAHT:max wind:
+:ICAHT:tropopause:
+:ICEC:surface:
+:ICIP:300 mb:
+:ICIP:400 mb:
+:ICIP:500 mb:
+:ICIP:600 mb:
+:ICIP:700 mb:
+:ICIP:800 mb:
+:ICSEV:300 mb:
+:ICSEV:400 mb:
+:ICSEV:500 mb:
+:ICSEV:600 mb:
+:ICSEV:700 mb:
+:ICSEV:800 mb:
+:LAND:surface:
+:LFTX:surface:
+:MNTSF:320 K isentropic level:
+:MSLET:mean sea level:
+:NCPCP:surface:
+:O3MR:100 mb:
+:O3MR:10 mb:
+:O3MR:125 mb:
+:O3MR:150 mb:
+:O3MR:1 mb:
+:O3MR:200 mb:
+:O3MR:20 mb:
+:O3MR:250 mb:
+:O3MR:2 mb:
+:O3MR:300 mb:
+:O3MR:30 mb:
+:O3MR:350 mb:
+:O3MR:3 mb:
+:O3MR:400 mb:
+:O3MR:50 mb:
+:O3MR:5 mb:
+:O3MR:70 mb:
+:O3MR:7 mb:
+:PEVPR:surface:
+:PLI:30-0 mb above ground:
+:PLPL:255-0 mb above ground:
+:POT:0.995 sigma level:
+:PRATE:surface:
+:PRES:80 m above ground:
+:PRES:convective cloud bottom level:
+:PRES:convective cloud top level:
+:PRES:high cloud bottom level:
+:PRES:high cloud top level:
+:PRES:low cloud bottom level:
+:PRES:low cloud top level:
+:PRES:max wind:
+:PRES:mean sea level:
+:PRES:middle cloud bottom level:
+:PRES:middle cloud top level:
+:PRES:PV=-1.5e-06 (Km^2/kg/s) surface:
+:PRES:PV=1.5e-06 (Km^2/kg/s) surface:
+:PRES:PV=-1e-06 (Km^2/kg/s) surface:
+:PRES:PV=1e-06 (Km^2/kg/s) surface:
+:PRES:PV=-2e-06 (Km^2/kg/s) surface:
+:PRES:PV=2e-06 (Km^2/kg/s) surface:
+:PRES:PV=-5e-07 (Km^2/kg/s) surface:
+:PRES:PV=5e-07 (Km^2/kg/s) surface:
+:PRES:tropopause:
+:PVORT:310 K isentropic level:
+:PVORT:320 K isentropic level:
+:PVORT:350 K isentropic level:
+:PVORT:450 K isentropic level:
+:PVORT:550 K isentropic level:
+:PVORT:650 K isentropic level:
+:PWAT:30-0 mb above ground:
+:RH:0.33-1 sigma layer:
+:RH:0.44-0.72 sigma layer:
+:RH:0.44-1 sigma layer:
+:RH:0.72-0.94 sigma layer:
+:RH:0.995 sigma level:
+:RH:0C isotherm:
+:RH:120-90 mb above ground:
+:RH:150-120 mb above ground:
+:RH:150 mb:
+:RH:180-150 mb above ground:
+:RH:20 mb:
+:RH:300 mb:
+:RH:30-0 mb above ground:
+:RH:30 mb:
+:RH:350 mb:
+:RH:400 mb:
+:RH:450 mb:
+:RH:550 mb:
+:RH:600 mb:
+:RH:60-30 mb above ground:
+:RH:650 mb:
+:RH:70 mb:
+:RH:750 mb:
+:RH:800 mb:
+:RH:900 mb:
+:RH:90-60 mb above ground:
+:RH:950 mb:
+:RH:975 mb:
+:RH:entire atmosphere (considered as a single layer):
+:RH:highest tropospheric freezing level:
+:SFCR:surface:
+:SNOWC:surface:
+:SNOHF:surface:
+:SOILL:0-0.1 m below ground:
+:SOILL:0.1-0.4 m below ground:
+:SOILL:0.4-1 m below ground:
+:SOILL:1-2 m below ground:
+:SOILW:0.1-0.4 m below ground:
+:SOILW:0.4-1 m below ground:
+:SOILW:1-2 m below ground:
+:SPFH:1000 mb:
+:SPFH:100 mb:
+:SPFH:10 mb:
+:SPFH:1 mb:
+:SPFH:120-90 mb above ground:
+:SPFH:150-120 mb above ground:
+:SPFH:150 mb:
+:SPFH:180-150 mb above ground:
+:SPFH:200 mb:
+:SPFH:20 mb:
+:SPFH:2 mb:
+:SPFH:250 mb:
+:SPFH:2 m above ground:
+:SPFH:300 mb:
+:SPFH:30-0 mb above ground:
+:SPFH:30 mb:
+:SPFH:3 mb:
+:SPFH:350 mb:
+:SPFH:400 mb:
+:SPFH:450 mb:
+:SPFH:500 mb:
+:SPFH:50 mb:
+:SPFH:5 mb:
+:SPFH:550 mb:
+:SPFH:600 mb:
+:SPFH:60-30 mb above ground:
+:SPFH:650 mb:
+:SPFH:700 mb:
+:SPFH:70 mb:
+:SPFH:7 mb:
+:SPFH:750 mb:
+:SPFH:800 mb:
+:SPFH:80 m above ground:
+:SPFH:850 mb:
+:SPFH:900 mb:
+:SPFH:90-60 mb above ground:
+:SPFH:925 mb:
+:SPFH:950 mb:
+:SPFH:975 mb:
+:SUNSD:surface:
+:TCDC:475 mb:
+:TCDC:boundary layer cloud layer:
+:TCDC:convective cloud layer:
+:TCDC:high cloud layer:
+:TCDC:low cloud layer:
+:TCDC:middle cloud layer:
+:TMP:0.995 sigma level:
+:TMP:100 m above ground:
+:TMP:1 mb:
+:TMP:120-90 mb above ground:
+:TMP:150-120 mb above ground:
+:TMP:150 mb:
+:TMP:180-150 mb above ground:
+:TMP:1829 m above mean sea level:
+:TMP:20 mb:
+:TMP:2 mb:
+:TMP:2743 m above mean sea level:
+:TMP:300 mb:
+:TMP:30-0 mb above ground:
+:TMP:305 m above mean sea level:
+:TMP:30 mb:
+:TMP:3 mb:
+:TMP:320 K isentropic level:
+:TMP:350 mb:
+:TMP:3658 m above mean sea level:
+:TMP:400 mb:
+:TMP:450 K isentropic level:
+:TMP:450 mb:
+:TMP:4572 m above mean sea level:
+:TMP:457 m above mean sea level:
+:TMP:5 mb:
+:TMP:550 K isentropic level:
+:TMP:550 mb:
+:TMP:600 mb:
+:TMP:60-30 mb above ground:
+:TMP:610 m above mean sea level:
+:TMP:650 K isentropic level:
+:TMP:650 mb:
+:TMP:70 mb:
+:TMP:7 mb:
+:TMP:750 mb:
+:TMP:800 mb:
+:TMP:80 m above ground:
+:TMP:900 mb:
+:TMP:90-60 mb above ground:
+:TMP:914 m above mean sea level:
+:TMP:950 mb:
+:TMP:975 mb:
+:TMP:high cloud top level:
+:TMP:low cloud top level:
+:TMP:max wind:
+:TMP:middle cloud top level:
+:TMP:PV=-1.5e-06 (Km^2/kg/s) surface:
+:TMP:PV=1.5e-06 (Km^2/kg/s) surface:
+:TMP:PV=-1e-06 (Km^2/kg/s) surface:
+:TMP:PV=1e-06 (Km^2/kg/s) surface:
+:TMP:PV=-2e-06 (Km^2/kg/s) surface:
+:TMP:PV=2e-06 (Km^2/kg/s) surface:
+:TMP:PV=-5e-07 (Km^2/kg/s) surface:
+:TMP:PV=5e-07 (Km^2/kg/s) surface:
+:TMP:surface:
+:TMP:tropopause:
+:TOZNE:entire atmosphere (considered as a single layer):
+:TSOIL:0.1-0.4 m below ground:
+:TSOIL:0.4-1 m below ground:
+:TSOIL:1-2 m below ground:
+:UFLX:surface:
+:UGRD:0.995 sigma level:
+:UGRD:100 m above ground:
+:UGRD:1 mb:
+:UGRD:120-90 mb above ground:
+:UGRD:150-120 mb above ground:
+:UGRD:150 mb:
+:UGRD:180-150 mb above ground:
+:UGRD:1829 m above mean sea level:
+:UGRD:20 mb:
+:UGRD:2 mb:
+:UGRD:2743 m above mean sea level:
+:UGRD:30-0 mb above ground:
+:UGRD:305 m above mean sea level:
+:UGRD:30 mb:
+:UGRD:3 mb:
+:UGRD:320 K isentropic level:
+:UGRD:350 mb:
+:UGRD:3658 m above mean sea level:
+:UGRD:450 K isentropic level:
+:UGRD:450 mb:
+:UGRD:4572 m above mean sea level:
+:UGRD:457 m above mean sea level:
+:UGRD:5 mb:
+:UGRD:550 K isentropic level:
+:UGRD:550 mb:
+:UGRD:600 mb:
+:UGRD:60-30 mb above ground:
+:UGRD:610 m above mean sea level:
+:UGRD:650 K isentropic level:
+:UGRD:650 mb:
+:UGRD:70 mb:
+:UGRD:7 mb:
+:UGRD:750 mb:
+:UGRD:800 mb:
+:UGRD:80 m above ground:
+:UGRD:900 mb:
+:UGRD:90-60 mb above ground:
+:UGRD:914 m above mean sea level:
+:UGRD:950 mb:
+:UGRD:975 mb:
+:UGRD:max wind:
+:UGRD:planetary boundary layer:
+:UGRD:PV=-1.5e-06 (Km^2/kg/s) surface:
+:UGRD:PV=1.5e-06 (Km^2/kg/s) surface:
+:UGRD:PV=-1e-06 (Km^2/kg/s) surface:
+:UGRD:PV=1e-06 (Km^2/kg/s) surface:
+:UGRD:PV=-2e-06 (Km^2/kg/s) surface:
+:UGRD:PV=2e-06 (Km^2/kg/s) surface:
+:UGRD:PV=-5e-07 (Km^2/kg/s) surface:
+:UGRD:PV=5e-07 (Km^2/kg/s) surface:
+:UGRD:tropopause:
+:U-GWD:surface:
+:USTM:6000-0 m above ground:
+:USWRF:top of atmosphere:
+:APTMP:2 m above ground
+:VFLX:surface:
+:VGRD:0.995 sigma level:
+:VGRD:100 m above ground:
+:VGRD:1 mb:
+:VGRD:120-90 mb above ground:
+:VGRD:150-120 mb above ground:
+:VGRD:150 mb:
+:VGRD:180-150 mb above ground:
+:VGRD:1829 m above mean sea level:
+:VGRD:20 mb:
+:VGRD:2 mb:
+:VGRD:2743 m above mean sea level:
+:VGRD:30-0 mb above ground:
+:VGRD:305 m above mean sea level:
+:VGRD:30 mb:
+:VGRD:3 mb:
+:VGRD:320 K isentropic level:
+:VGRD:350 mb:
+:VGRD:3658 m above mean sea level:
+:VGRD:450 K isentropic level:
+:VGRD:450 mb:
+:VGRD:4572 m above mean sea level:
+:VGRD:457 m above mean sea level:
+:VGRD:5 mb:
+:VGRD:550 K isentropic level:
+:VGRD:550 mb:
+:VGRD:600 mb:
+:VGRD:60-30 mb above ground:
+:VGRD:610 m above mean sea level:
+:VGRD:650 K isentropic level:
+:VGRD:650 mb:
+:VGRD:70 mb:
+:VGRD:7 mb:
+:VGRD:750 mb:
+:VGRD:800 mb:
+:VGRD:80 m above ground:
+:VGRD:900 mb:
+:VGRD:90-60 mb above ground:
+:VGRD:914 m above mean sea level:
+:VGRD:950 mb:
+:VGRD:975 mb:
+:VGRD:max wind:
+:VGRD:planetary boundary layer:
+:VGRD:PV=-1.5e-06 (Km^2/kg/s) surface:
+:VGRD:PV=1.5e-06 (Km^2/kg/s) surface:
+:VGRD:PV=-1e-06 (Km^2/kg/s) surface:
+:VGRD:PV=1e-06 (Km^2/kg/s) surface:
+:VGRD:PV=-2e-06 (Km^2/kg/s) surface: +:VGRD:PV=2e-06 (Km^2/kg/s) surface: +:VGRD:PV=-5e-07 (Km^2/kg/s) surface: +:VGRD:PV=5e-07 (Km^2/kg/s) surface: +:VGRD:tropopause: +:V-GWD:surface: +:VIS:surface: +:VRATE:planetary boundary layer: +:VSTM:6000-0 m above ground: +:VVEL:0.995 sigma level: +:VVEL:1 mb: +:VVEL:2 mb: +:VVEL:3 mb: +:VVEL:5 mb: +:VVEL:7 mb: +:VVEL:10 mb: +:VVEL:20 mb: +:VVEL:30 mb: +:VVEL:50 mb: +:VVEL:70 mb: +:VVEL:1000 mb: +:VVEL:100 mb: +:VVEL:150 mb: +:VVEL:200 mb: +:VVEL:250 mb: +:VVEL:300 mb: +:VVEL:350 mb: +:VVEL:400 mb: +:VVEL:450 mb: +:VVEL:500 mb: +:VVEL:550 mb: +:VVEL:600 mb: +:VVEL:650 mb: +:VVEL:700 mb: +:VVEL:750 mb: +:VVEL:800 mb: +:VVEL:900 mb: +:VVEL:925 mb: +:VVEL:950 mb: +:VVEL:975 mb: +:VWSH:PV=-1.5e-06 (Km^2/kg/s) surface: +:VWSH:PV=1.5e-06 (Km^2/kg/s) surface: +:VWSH:PV=-1e-06 (Km^2/kg/s) surface: +:VWSH:PV=1e-06 (Km^2/kg/s) surface: +:VWSH:PV=-2e-06 (Km^2/kg/s) surface: +:VWSH:PV=2e-06 (Km^2/kg/s) surface: +:VWSH:PV=-5e-07 (Km^2/kg/s) surface: +:VWSH:PV=5e-07 (Km^2/kg/s) surface: +:VWSH:tropopause: +:WATR:surface: +:WILT:surface: +:HGT:cloud ceiling: +:PRES:1 hybrid level: +:HGT:1 hybrid level: +:TMP:1 hybrid level: +:RH:1 hybrid level: +:UGRD:1 hybrid level: +:VGRD:1 hybrid level: +:PRES:2 hybrid level: +:HGT:2 hybrid level: +:TMP:2 hybrid level: +:RH:2 hybrid level: +:UGRD:2 hybrid level: +:VGRD:2 hybrid level: +:PRES:3 hybrid level: +:HGT:3 hybrid level: +:TMP:3 hybrid level: +:RH:3 hybrid level: +:UGRD:3 hybrid level: +:VGRD:3 hybrid level: +:PRES:4 hybrid level: +:HGT:4 hybrid level: +:TMP:4 hybrid level: +:RH:4 hybrid level: +:UGRD:4 hybrid level: +:VGRD:4 hybrid level: diff --git a/parm/product/gefs.1p00.f000.paramlist.a.txt b/parm/product/gefs.1p00.f000.paramlist.a.txt new file mode 120000 index 0000000000..69265297d3 --- /dev/null +++ b/parm/product/gefs.1p00.f000.paramlist.a.txt @@ -0,0 +1 @@ +gefs.0p50.f000.paramlist.a.txt \ No newline at end of file diff --git a/parm/product/gefs.1p00.f000.paramlist.b.txt 
b/parm/product/gefs.1p00.f000.paramlist.b.txt new file mode 120000 index 0000000000..a51f2079e2 --- /dev/null +++ b/parm/product/gefs.1p00.f000.paramlist.b.txt @@ -0,0 +1 @@ +gefs.0p50.f000.paramlist.b.txt \ No newline at end of file diff --git a/parm/product/gefs.1p00.fFFF.paramlist.a.txt b/parm/product/gefs.1p00.fFFF.paramlist.a.txt new file mode 120000 index 0000000000..c131b24c02 --- /dev/null +++ b/parm/product/gefs.1p00.fFFF.paramlist.a.txt @@ -0,0 +1 @@ +gefs.0p50.fFFF.paramlist.a.txt \ No newline at end of file diff --git a/parm/product/gefs.1p00.fFFF.paramlist.b.txt b/parm/product/gefs.1p00.fFFF.paramlist.b.txt new file mode 120000 index 0000000000..0f2fb179cb --- /dev/null +++ b/parm/product/gefs.1p00.fFFF.paramlist.b.txt @@ -0,0 +1 @@ +gefs.0p50.fFFF.paramlist.b.txt \ No newline at end of file diff --git a/parm/product/gefs.2p50.f000.paramlist.a.txt b/parm/product/gefs.2p50.f000.paramlist.a.txt new file mode 100644 index 0000000000..4d2219ce8c --- /dev/null +++ b/parm/product/gefs.2p50.f000.paramlist.a.txt @@ -0,0 +1,23 @@ +############################# sorted pgrb2a 201408 +:HGT:surface: +:HGT:1000 mb: +:HGT:500 mb: +:PRMSL:mean sea level: +:RH:700 mb: +:TMP:2 m above ground: +:TMP:850 mb: +:UGRD:10 m above ground: +:UGRD:200 mb: +:UGRD:250 mb: +:UGRD:850 mb: +:VGRD:10 m above ground: +:VGRD:200 mb: +:VGRD:250 mb: +:VGRD:850 mb: +:APCP:surface +:CSNOW:surface +:CRAIN:surface +:CICEP:surface +:CFRZR:surface +:ULWRF:top +;############################ do not leave a blank line at the end diff --git a/parm/product/gefs.2p50.f000.paramlist.b.txt b/parm/product/gefs.2p50.f000.paramlist.b.txt new file mode 100644 index 0000000000..f2610c5f77 --- /dev/null +++ b/parm/product/gefs.2p50.f000.paramlist.b.txt @@ -0,0 +1,530 @@ +############################# sorted pgrb2a + pgrb2b 201502 +:4LFTX:surface: +:5WAVH:500 mb: +:ABSV:1000 mb: +:ABSV:100 mb: +:ABSV:10 mb: +:ABSV:150 mb: +:ABSV:200 mb: +:ABSV:20 mb: +:ABSV:250 mb: +:ABSV:300 mb: +:ABSV:30 mb: +:ABSV:350 mb: 
+:ABSV:400 mb: +:ABSV:450 mb: +:ABSV:500 mb: +:ABSV:50 mb: +:ABSV:550 mb: +:ABSV:600 mb: +:ABSV:650 mb: +:ABSV:700 mb: +:ABSV:70 mb: +:ABSV:750 mb: +:ABSV:800 mb: +:ABSV:850 mb: +:ABSV:900 mb: +:ABSV:925 mb: +:ABSV:950 mb: +:ABSV:975 mb: +:BRTMP:top of atmosphere: +:CAPE:180-0 mb above ground: +:CAPE:255-0 mb above ground: +:CAPE:surface: +:CIN:180-0 mb above ground: +:CIN:255-0 mb above ground: +:CIN:surface: +:CLWMR:1000 mb: +:CLWMR:100 mb: +:CLWMR:10 mb: +:CLWMR:150 mb: +:CLWMR:200 mb: +:CLWMR:20 mb: +:CLWMR:250 mb: +:CLWMR:300 mb: +:CLWMR:30 mb: +:CLWMR:350 mb: +:CLWMR:400 mb: +:CLWMR:450 mb: +:CLWMR:500 mb: +:CLWMR:50 mb: +:CLWMR:550 mb: +:CLWMR:600 mb: +:CLWMR:650 mb: +:CLWMR:700 mb: +:CLWMR:70 mb: +:CLWMR:750 mb: +:CLWMR:800 mb: +:CLWMR:850 mb: +:CLWMR:900 mb: +:CLWMR:925 mb: +:CLWMR:950 mb: +:CLWMR:975 mb: +:CNWAT:surface: +:CPOFP:surface: +:CWAT:entire atmosphere (considered as a single layer): +:DPT:2 m above ground: +:DPT:30-0 mb above ground: +:FLDCP:surface: +:FRICV:surface: +:GUST:surface: +:HGT:0C isotherm: +:HGT:100 mb: +:HGT:10 mb: +:HGT:1 mb: +:HGT:150 mb: +:HGT:200 mb: +:HGT:20 mb: +:HGT:2 mb: +:HGT:250 mb: +:HGT:300 mb: +:HGT:30 mb: +:HGT:3 mb: +:HGT:350 mb: +:HGT:400 mb: +:HGT:450 mb: +:HGT:50 mb: +:HGT:5 mb: +:HGT:550 mb: +:HGT:600 mb: +:HGT:650 mb: +:HGT:700 mb: +:HGT:70 mb: +:HGT:7 mb: +:HGT:750 mb: +:HGT:800 mb: +:HGT:850 mb: +:HGT:900 mb: +:HGT:925 mb: +:HGT:950 mb: +:HGT:975 mb: +:HGT:highest tropospheric freezing level: +:HGT:max wind: +:HGT:PV=-1.5e-06 (Km^2/kg/s) surface: +:HGT:PV=1.5e-06 (Km^2/kg/s) surface: +:HGT:PV=-1e-06 (Km^2/kg/s) surface: +:HGT:PV=1e-06 (Km^2/kg/s) surface: +:HGT:PV=-2e-06 (Km^2/kg/s) surface: +:HGT:PV=2e-06 (Km^2/kg/s) surface: +:HGT:PV=-5e-07 (Km^2/kg/s) surface: +:HGT:PV=5e-07 (Km^2/kg/s) surface: +:HGT:tropopause: +:HINDEX:surface: +:HLCY:3000-0 m above ground: +:HPBL:surface: +:ICAHT:max wind: +:ICAHT:tropopause: +:ICEC:surface: +:ICETK:surface: +:ICIP:300 mb: +:ICIP:400 mb: +:ICIP:500 mb: +:ICIP:600 mb: 
+:ICIP:700 mb: +:ICIP:800 mb: +:ICSEV:300 mb: +:ICSEV:400 mb: +:ICSEV:500 mb: +:ICSEV:600 mb: +:ICSEV:700 mb: +:ICSEV:800 mb: +:LAND:surface: +:LFTX:surface: +:MNTSF:320 K isentropic level: +:MSLET:mean sea level: +:O3MR:100 mb: +:O3MR:10 mb: +:O3MR:125 mb: +:O3MR:150 mb: +:O3MR:1 mb: +:O3MR:200 mb: +:O3MR:20 mb: +:O3MR:250 mb: +:O3MR:2 mb: +:O3MR:300 mb: +:O3MR:30 mb: +:O3MR:350 mb: +:O3MR:3 mb: +:O3MR:400 mb: +:O3MR:50 mb: +:O3MR:5 mb: +:O3MR:70 mb: +:O3MR:7 mb: +:PEVPR:surface: +:PLI:30-0 mb above ground: +:PLPL:255-0 mb above ground: +:POT:0.995 sigma level: +:PRES:80 m above ground: +:PRES:max wind: +:PRES:mean sea level: +:PRES:PV=-1.5e-06 (Km^2/kg/s) surface: +:PRES:PV=1.5e-06 (Km^2/kg/s) surface: +:PRES:PV=-1e-06 (Km^2/kg/s) surface: +:PRES:PV=1e-06 (Km^2/kg/s) surface: +:PRES:PV=-2e-06 (Km^2/kg/s) surface: +:PRES:PV=2e-06 (Km^2/kg/s) surface: +:PRES:PV=-5e-07 (Km^2/kg/s) surface: +:PRES:PV=5e-07 (Km^2/kg/s) surface: +:PRES:surface: +:PRES:tropopause: +:PVORT:310 K isentropic level: +:PVORT:320 K isentropic level: +:PVORT:350 K isentropic level: +:PVORT:450 K isentropic level: +:PVORT:550 K isentropic level: +:PVORT:650 K isentropic level: +:PWAT:30-0 mb above ground: +:PWAT:entire atmosphere (considered as a single layer): +:RH:0.33-1 sigma layer: +:RH:0.44-0.72 sigma layer: +:RH:0.44-1 sigma layer: +:RH:0.72-0.94 sigma layer: +:RH:0.995 sigma level: +:RH:0C isotherm: +:RH:1000 mb: +:RH:100 mb: +:RH:10 mb: +:RH:120-90 mb above ground: +:RH:150-120 mb above ground: +:RH:150 mb: +:RH:180-150 mb above ground: +:RH:200 mb: +:RH:20 mb: +:RH:250 mb: +:RH:2 m above ground: +:RH:300 mb: +:RH:30-0 mb above ground: +:RH:30 mb: +:RH:350 mb: +:RH:400 mb: +:RH:450 mb: +:RH:500 mb: +:RH:50 mb: +:RH:550 mb: +:RH:600 mb: +:RH:60-30 mb above ground: +:RH:650 mb: +:RH:70 mb: +:RH:750 mb: +:RH:800 mb: +:RH:850 mb: +:RH:900 mb: +:RH:90-60 mb above ground: +:RH:925 mb: +:RH:950 mb: +:RH:975 mb: +:RH:entire atmosphere (considered as a single layer): +:RH:highest tropospheric 
freezing level: +:SFCR:surface: +:SNOD:surface: +:SNOHF:surface: +:SNOWC:surface: +:SOILL:0-0.1 m below ground: +:SOILL:0.1-0.4 m below ground: +:SOILL:0.4-1 m below ground: +:SOILL:1-2 m below ground: +:SOILW:0-0.1 m below ground: +:SOILW:0.1-0.4 m below ground: +:SOILW:0.4-1 m below ground: +:SOILW:1-2 m below ground: +:SPFH:1000 mb: +:SPFH:100 mb: +:SPFH:10 mb: +:SPFH:1 mb: +:SPFH:120-90 mb above ground: +:SPFH:150-120 mb above ground: +:SPFH:150 mb: +:SPFH:180-150 mb above ground: +:SPFH:200 mb: +:SPFH:20 mb: +:SPFH:2 mb: +:SPFH:250 mb: +:SPFH:2 m above ground: +:SPFH:300 mb: +:SPFH:30-0 mb above ground: +:SPFH:30 mb: +:SPFH:3 mb: +:SPFH:350 mb: +:SPFH:400 mb: +:SPFH:450 mb: +:SPFH:500 mb: +:SPFH:50 mb: +:SPFH:5 mb: +:SPFH:550 mb: +:SPFH:600 mb: +:SPFH:60-30 mb above ground: +:SPFH:650 mb: +:SPFH:700 mb: +:SPFH:70 mb: +:SPFH:7 mb: +:SPFH:750 mb: +:SPFH:800 mb: +:SPFH:80 m above ground: +:SPFH:850 mb: +:SPFH:900 mb: +:SPFH:90-60 mb above ground: +:SPFH:925 mb: +:SPFH:950 mb: +:SPFH:975 mb: +:SUNSD:surface: +:TCDC:475 mb: +:TMP:0.995 sigma level: +:TMP:1000 mb: +:TMP:100 m above ground: +:TMP:100 mb: +:TMP:10 mb: +:TMP:1 mb: +:TMP:120-90 mb above ground: +:TMP:150-120 mb above ground: +:TMP:150 mb: +:TMP:180-150 mb above ground: +:TMP:1829 m above mean sea level: +:TMP:200 mb: +:TMP:20 mb: +:TMP:2 mb: +:TMP:250 mb: +:TMP:2743 m above mean sea level: +:TMP:300 mb: +:TMP:30-0 mb above ground: +:TMP:305 m above mean sea level: +:TMP:30 mb: +:TMP:3 mb: +:TMP:320 K isentropic level: +:TMP:350 mb: +:TMP:3658 m above mean sea level: +:TMP:400 mb: +:TMP:450 mb: +:TMP:450 K isentropic level: +:TMP:4572 m above mean sea level: +:TMP:457 m above mean sea level: +:TMP:500 mb: +:TMP:50 mb: +:TMP:5 mb: +:TMP:550 mb: +:TMP:550 K isentropic level: +:TMP:600 mb: +:TMP:60-30 mb above ground: +:TMP:610 m above mean sea level: +:TMP:650 mb: +:TMP:650 K isentropic level: +:TMP:700 mb: +:TMP:70 mb: +:TMP:7 mb: +:TMP:750 mb: +:TMP:800 mb: +:TMP:80 m above ground: +:TMP:900 mb: 
+:TMP:90-60 mb above ground: +:TMP:914 m above mean sea level: +:TMP:925 mb: +:TMP:950 mb: +:TMP:975 mb: +:TMP:max wind: +:TMP:PV=-1.5e-06 (Km^2/kg/s) surface: +:TMP:PV=1.5e-06 (Km^2/kg/s) surface: +:TMP:PV=-1e-06 (Km^2/kg/s) surface: +:TMP:PV=1e-06 (Km^2/kg/s) surface: +:TMP:PV=-2e-06 (Km^2/kg/s) surface: +:TMP:PV=2e-06 (Km^2/kg/s) surface: +:TMP:PV=-5e-07 (Km^2/kg/s) surface: +:TMP:PV=5e-07 (Km^2/kg/s) surface: +:TMP:surface: +:TMP:tropopause: +:TOZNE:entire atmosphere (considered as a single layer): +:TSOIL:0-0.1 m below ground: +:TSOIL:0.1-0.4 m below ground: +:TSOIL:0.4-1 m below ground: +:TSOIL:1-2 m below ground: +:UGRD:0.995 sigma level: +:UGRD:1000 mb: +:UGRD:100 m above ground: +:UGRD:100 mb: +:UGRD:10 mb: +:UGRD:1 mb: +:UGRD:120-90 mb above ground: +:UGRD:150-120 mb above ground: +:UGRD:150 mb: +:UGRD:180-150 mb above ground: +:UGRD:1829 m above mean sea level: +:UGRD:20 mb: +:UGRD:2 mb: +:UGRD:2743 m above mean sea level: +:UGRD:300 mb: +:UGRD:30-0 mb above ground: +:UGRD:305 m above mean sea level: +:UGRD:30 mb: +:UGRD:3 mb: +:UGRD:320 K isentropic level: +:UGRD:350 mb: +:UGRD:3658 m above mean sea level: +:UGRD:400 mb: +:UGRD:450 mb: +:UGRD:450 K isentropic level: +:UGRD:4572 m above mean sea level: +:UGRD:457 m above mean sea level: +:UGRD:500 mb: +:UGRD:50 mb: +:UGRD:5 mb: +:UGRD:550 mb: +:UGRD:550 K isentropic level: +:UGRD:600 mb: +:UGRD:60-30 mb above ground: +:UGRD:610 m above mean sea level: +:UGRD:650 mb: +:UGRD:650 K isentropic level: +:UGRD:700 mb: +:UGRD:70 mb: +:UGRD:7 mb: +:UGRD:750 mb: +:UGRD:800 mb: +:UGRD:80 m above ground: +:UGRD:900 mb: +:UGRD:90-60 mb above ground: +:UGRD:914 m above mean sea level: +:UGRD:925 mb: +:UGRD:950 mb: +:UGRD:975 mb: +:UGRD:max wind: +:UGRD:planetary boundary layer: +:UGRD:PV=-1.5e-06 (Km^2/kg/s) surface: +:UGRD:PV=1.5e-06 (Km^2/kg/s) surface: +:UGRD:PV=-1e-06 (Km^2/kg/s) surface: +:UGRD:PV=1e-06 (Km^2/kg/s) surface: +:UGRD:PV=-2e-06 (Km^2/kg/s) surface: +:UGRD:PV=2e-06 (Km^2/kg/s) surface: 
+:UGRD:PV=-5e-07 (Km^2/kg/s) surface: +:UGRD:PV=5e-07 (Km^2/kg/s) surface: +:UGRD:tropopause: +:USTM:6000-0 m above ground: +:APTMP:2 m above ground: +:VGRD:0.995 sigma level: +:VGRD:1000 mb: +:VGRD:100 m above ground: +:VGRD:100 mb: +:VGRD:10 mb: +:VGRD:1 mb: +:VGRD:120-90 mb above ground: +:VGRD:150-120 mb above ground: +:VGRD:150 mb: +:VGRD:180-150 mb above ground: +:VGRD:1829 m above mean sea level: +:VGRD:20 mb: +:VGRD:2 mb: +:VGRD:2743 m above mean sea level: +:VGRD:300 mb: +:VGRD:30-0 mb above ground: +:VGRD:305 m above mean sea level: +:VGRD:30 mb: +:VGRD:3 mb: +:VGRD:320 K isentropic level: +:VGRD:350 mb: +:VGRD:3658 m above mean sea level: +:VGRD:400 mb: +:VGRD:450 mb: +:VGRD:450 K isentropic level: +:VGRD:4572 m above mean sea level: +:VGRD:457 m above mean sea level: +:VGRD:500 mb: +:VGRD:50 mb: +:VGRD:5 mb: +:VGRD:550 mb: +:VGRD:550 K isentropic level: +:VGRD:600 mb: +:VGRD:60-30 mb above ground: +:VGRD:610 m above mean sea level: +:VGRD:650 mb: +:VGRD:650 K isentropic level: +:VGRD:700 mb: +:VGRD:70 mb: +:VGRD:7 mb: +:VGRD:750 mb: +:VGRD:800 mb: +:VGRD:80 m above ground: +:VGRD:900 mb: +:VGRD:90-60 mb above ground: +:VGRD:914 m above mean sea level: +:VGRD:925 mb: +:VGRD:950 mb: +:VGRD:975 mb: +:VGRD:max wind: +:VGRD:planetary boundary layer: +:VGRD:PV=-1.5e-06 (Km^2/kg/s) surface: +:VGRD:PV=1.5e-06 (Km^2/kg/s) surface: +:VGRD:PV=-1e-06 (Km^2/kg/s) surface: +:VGRD:PV=1e-06 (Km^2/kg/s) surface: +:VGRD:PV=-2e-06 (Km^2/kg/s) surface: +:VGRD:PV=2e-06 (Km^2/kg/s) surface: +:VGRD:PV=-5e-07 (Km^2/kg/s) surface: +:VGRD:PV=5e-07 (Km^2/kg/s) surface: +:VGRD:tropopause: +:VIS:surface: +:VRATE:planetary boundary layer: +:VSTM:6000-0 m above ground: +:VVEL:0.995 sigma level: +:VVEL:1 mb: +:VVEL:2 mb: +:VVEL:3 mb: +:VVEL:5 mb: +:VVEL:7 mb: +:VVEL:10 mb: +:VVEL:20 mb: +:VVEL:30 mb: +:VVEL:50 mb: +:VVEL:70 mb: +:VVEL:1000 mb: +:VVEL:100 mb: +:VVEL:150 mb: +:VVEL:200 mb: +:VVEL:250 mb: +:VVEL:300 mb: +:VVEL:350 mb: +:VVEL:400 mb: +:VVEL:450 mb: +:VVEL:500 mb: 
+:VVEL:550 mb: +:VVEL:600 mb: +:VVEL:650 mb: +:VVEL:700 mb: +:VVEL:750 mb: +:VVEL:800 mb: +:VVEL:850 mb: +:VVEL:900 mb: +:VVEL:925 mb: +:VVEL:950 mb: +:VVEL:975 mb: +:VWSH:PV=-1.5e-06 (Km^2/kg/s) surface: +:VWSH:PV=1.5e-06 (Km^2/kg/s) surface: +:VWSH:PV=-1e-06 (Km^2/kg/s) surface: +:VWSH:PV=1e-06 (Km^2/kg/s) surface: +:VWSH:PV=-2e-06 (Km^2/kg/s) surface: +:VWSH:PV=2e-06 (Km^2/kg/s) surface: +:VWSH:PV=-5e-07 (Km^2/kg/s) surface: +:VWSH:PV=5e-07 (Km^2/kg/s) surface: +:VWSH:tropopause: +:WEASD:surface: +:WILT:surface: +:HGT:cloud ceiling: +:PRES:1 hybrid level: +:HGT:1 hybrid level: +:TMP:1 hybrid level: +:RH:1 hybrid level: +:UGRD:1 hybrid level: +:VGRD:1 hybrid level: +:PRES:2 hybrid level: +:HGT:2 hybrid level: +:TMP:2 hybrid level: +:RH:2 hybrid level: +:UGRD:2 hybrid level: +:VGRD:2 hybrid level: +:PRES:3 hybrid level: +:HGT:3 hybrid level: +:TMP:3 hybrid level: +:RH:3 hybrid level: +:UGRD:3 hybrid level: +:VGRD:3 hybrid level: +:PRES:4 hybrid level: +:HGT:4 hybrid level: +:TMP:4 hybrid level: +:RH:4 hybrid level: +:UGRD:4 hybrid level: +:VGRD:4 hybrid level: diff --git a/parm/product/gefs.2p50.fFFF.paramlist.a.txt b/parm/product/gefs.2p50.fFFF.paramlist.a.txt new file mode 100644 index 0000000000..11b6a8aef3 --- /dev/null +++ b/parm/product/gefs.2p50.fFFF.paramlist.a.txt @@ -0,0 +1,22 @@ +############################# sorted pgrb2a 201408 +:HGT:1000 mb: +:HGT:500 mb: +:PRMSL:mean sea level: +:RH:700 mb: +:TMP:2 m above ground: +:TMP:850 mb: +:UGRD:10 m above ground: +:UGRD:200 mb: +:UGRD:250 mb: +:UGRD:850 mb: +:VGRD:10 m above ground: +:VGRD:200 mb: +:VGRD:250 mb: +:VGRD:850 mb: +:APCP:surface +:CSNOW:surface +:CRAIN:surface +:CICEP:surface +:CFRZR:surface +:ULWRF:top +;############################ do not leave a blank line at the end diff --git a/parm/product/gefs.2p50.fFFF.paramlist.b.txt b/parm/product/gefs.2p50.fFFF.paramlist.b.txt new file mode 100644 index 0000000000..8c05d49271 --- /dev/null +++ b/parm/product/gefs.2p50.fFFF.paramlist.b.txt @@ -0,0 
+1,571 @@ +############################# sorted pgrb2a + pgrb2b 201502 +:4LFTX:surface: +:5WAVH:500 mb: +:ABSV:1000 mb: +:ABSV:100 mb: +:ABSV:10 mb: +:ABSV:150 mb: +:ABSV:200 mb: +:ABSV:20 mb: +:ABSV:250 mb: +:ABSV:300 mb: +:ABSV:30 mb: +:ABSV:350 mb: +:ABSV:400 mb: +:ABSV:450 mb: +:ABSV:500 mb: +:ABSV:50 mb: +:ABSV:550 mb: +:ABSV:600 mb: +:ABSV:650 mb: +:ABSV:700 mb: +:ABSV:70 mb: +:ABSV:750 mb: +:ABSV:800 mb: +:ABSV:850 mb: +:ABSV:900 mb: +:ABSV:925 mb: +:ABSV:950 mb: +:ABSV:975 mb: +:ACPCP:surface: +:ALBDO:surface: +:BRTMP:top of atmosphere: +:CAPE:180-0 mb above ground: +:CAPE:255-0 mb above ground: +:CAPE:surface: +:CDUVB:surface: +:CIN:180-0 mb above ground: +:CIN:255-0 mb above ground: +:CIN:surface: +:CLWMR:1000 mb: +:CLWMR:100 mb: +:CLWMR:10 mb: +:CLWMR:150 mb: +:CLWMR:200 mb: +:CLWMR:20 mb: +:CLWMR:250 mb: +:CLWMR:300 mb: +:CLWMR:30 mb: +:CLWMR:350 mb: +:CLWMR:400 mb: +:CLWMR:450 mb: +:CLWMR:500 mb: +:CLWMR:50 mb: +:CLWMR:550 mb: +:CLWMR:600 mb: +:CLWMR:650 mb: +:CLWMR:700 mb: +:CLWMR:70 mb: +:CLWMR:750 mb: +:CLWMR:800 mb: +:CLWMR:850 mb: +:CLWMR:900 mb: +:CLWMR:925 mb: +:CLWMR:950 mb: +:CLWMR:975 mb: +:CNWAT:surface: +:CPOFP:surface: +:CPRAT:surface: +:CWAT:entire atmosphere (considered as a single layer): +:CWORK:entire atmosphere (considered as a single layer): +:DLWRF:surface: +:DPT:2 m above ground: +:DPT:30-0 mb above ground: +:DSWRF:surface: +:DUVB:surface: +:FLDCP:surface: +:FRICV:surface: +:GFLUX:surface: +:GUST:surface: +:HGT:0C isotherm: +:HGT:100 mb: +:HGT:10 mb: +:HGT:1 mb: +:HGT:150 mb: +:HGT:200 mb: +:HGT:20 mb: +:HGT:2 mb: +:HGT:250 mb: +:HGT:300 mb: +:HGT:30 mb: +:HGT:3 mb: +:HGT:350 mb: +:HGT:400 mb: +:HGT:450 mb: +:HGT:50 mb: +:HGT:5 mb: +:HGT:550 mb: +:HGT:600 mb: +:HGT:650 mb: +:HGT:700 mb: +:HGT:70 mb: +:HGT:7 mb: +:HGT:750 mb: +:HGT:800 mb: +:HGT:850 mb: +:HGT:900 mb: +:HGT:925 mb: +:HGT:950 mb: +:HGT:975 mb: +:HGT:highest tropospheric freezing level: +:HGT:max wind: +:HGT:PV=-1.5e-06 (Km^2/kg/s) surface: +:HGT:PV=1.5e-06 
(Km^2/kg/s) surface: +:HGT:PV=-1e-06 (Km^2/kg/s) surface: +:HGT:PV=1e-06 (Km^2/kg/s) surface: +:HGT:PV=-2e-06 (Km^2/kg/s) surface: +:HGT:PV=2e-06 (Km^2/kg/s) surface: +:HGT:PV=-5e-07 (Km^2/kg/s) surface: +:HGT:PV=5e-07 (Km^2/kg/s) surface: +:HGT:surface: +:HGT:tropopause: +:HINDEX:surface: +:HLCY:3000-0 m above ground: +:HPBL:surface: +:ICAHT:max wind: +:ICAHT:tropopause: +:ICEC:surface: +:ICETK:surface: +:ICIP:300 mb: +:ICIP:400 mb: +:ICIP:500 mb: +:ICIP:600 mb: +:ICIP:700 mb: +:ICIP:800 mb: +:ICSEV:300 mb: +:ICSEV:400 mb: +:ICSEV:500 mb: +:ICSEV:600 mb: +:ICSEV:700 mb: +:ICSEV:800 mb: +:LAND:surface: +:LFTX:surface: +:LHTFL:surface: +:MNTSF:320 K isentropic level: +:MSLET:mean sea level: +:NCPCP:surface: +:O3MR:100 mb: +:O3MR:10 mb: +:O3MR:125 mb: +:O3MR:150 mb: +:O3MR:1 mb: +:O3MR:200 mb: +:O3MR:20 mb: +:O3MR:250 mb: +:O3MR:2 mb: +:O3MR:300 mb: +:O3MR:30 mb: +:O3MR:350 mb: +:O3MR:3 mb: +:O3MR:400 mb: +:O3MR:50 mb: +:O3MR:5 mb: +:O3MR:70 mb: +:O3MR:7 mb: +:PEVPR:surface: +:PLI:30-0 mb above ground: +:PLPL:255-0 mb above ground: +:POT:0.995 sigma level: +:PRATE:surface: +:PRES:80 m above ground: +:PRES:convective cloud bottom level: +:PRES:convective cloud top level: +:PRES:high cloud bottom level: +:PRES:high cloud top level: +:PRES:low cloud bottom level: +:PRES:low cloud top level: +:PRES:max wind: +:PRES:mean sea level: +:PRES:middle cloud bottom level: +:PRES:middle cloud top level: +:PRES:PV=-1.5e-06 (Km^2/kg/s) surface: +:PRES:PV=1.5e-06 (Km^2/kg/s) surface: +:PRES:PV=-1e-06 (Km^2/kg/s) surface: +:PRES:PV=1e-06 (Km^2/kg/s) surface: +:PRES:PV=-2e-06 (Km^2/kg/s) surface: +:PRES:PV=2e-06 (Km^2/kg/s) surface: +:PRES:PV=-5e-07 (Km^2/kg/s) surface: +:PRES:PV=5e-07 (Km^2/kg/s) surface: +:PRES:surface: +:PRES:tropopause: +:PVORT:310 K isentropic level: +:PVORT:320 K isentropic level: +:PVORT:350 K isentropic level: +:PVORT:450 K isentropic level: +:PVORT:550 K isentropic level: +:PVORT:650 K isentropic level: +:PWAT:30-0 mb above ground: +:PWAT:entire atmosphere 
(considered as a single layer): +:RH:0.33-1 sigma layer: +:RH:0.44-0.72 sigma layer: +:RH:0.44-1 sigma layer: +:RH:0.72-0.94 sigma layer: +:RH:0.995 sigma level: +:RH:0C isotherm: +:RH:1000 mb: +:RH:100 mb: +:RH:10 mb: +:RH:120-90 mb above ground: +:RH:150-120 mb above ground: +:RH:150 mb: +:RH:180-150 mb above ground: +:RH:200 mb: +:RH:20 mb: +:RH:250 mb: +:RH:2 m above ground: +:RH:300 mb: +:RH:30-0 mb above ground: +:RH:30 mb: +:RH:350 mb: +:RH:400 mb: +:RH:450 mb: +:RH:500 mb: +:RH:50 mb: +:RH:550 mb: +:RH:600 mb: +:RH:60-30 mb above ground: +:RH:650 mb: +:RH:70 mb: +:RH:750 mb: +:RH:800 mb: +:RH:850 mb: +:RH:900 mb: +:RH:90-60 mb above ground: +:RH:925 mb: +:RH:950 mb: +:RH:975 mb: +:RH:entire atmosphere (considered as a single layer): +:RH:highest tropospheric freezing level: +:SFCR:surface: +:SHTFL:surface: +:SNOD:surface: +:SNOWC:surface: +:SNOHF:surface: +:SOILL:0-0.1 m below ground: +:SOILL:0.1-0.4 m below ground: +:SOILL:0.4-1 m below ground: +:SOILL:1-2 m below ground: +:SOILW:0-0.1 m below ground: +:SOILW:0.1-0.4 m below ground: +:SOILW:0.4-1 m below ground: +:SOILW:1-2 m below ground: +:SPFH:1000 mb: +:SPFH:100 mb: +:SPFH:10 mb: +:SPFH:1 mb: +:SPFH:120-90 mb above ground: +:SPFH:150-120 mb above ground: +:SPFH:150 mb: +:SPFH:180-150 mb above ground: +:SPFH:200 mb: +:SPFH:20 mb: +:SPFH:2 mb: +:SPFH:250 mb: +:SPFH:2 m above ground: +:SPFH:300 mb: +:SPFH:30-0 mb above ground: +:SPFH:30 mb: +:SPFH:3 mb: +:SPFH:350 mb: +:SPFH:400 mb: +:SPFH:450 mb: +:SPFH:500 mb: +:SPFH:50 mb: +:SPFH:5 mb: +:SPFH:550 mb: +:SPFH:600 mb: +:SPFH:60-30 mb above ground: +:SPFH:650 mb: +:SPFH:700 mb: +:SPFH:70 mb: +:SPFH:7 mb: +:SPFH:750 mb: +:SPFH:800 mb: +:SPFH:80 m above ground: +:SPFH:850 mb: +:SPFH:900 mb: +:SPFH:90-60 mb above ground: +:SPFH:925 mb: +:SPFH:950 mb: +:SPFH:975 mb: +:SUNSD:surface: +:TCDC:475 mb: +:TCDC:boundary layer cloud layer: +:TCDC:convective cloud layer: +:TCDC:entire atmosphere: +:TCDC:high cloud layer: +:TCDC:low cloud layer: +:TCDC:middle cloud 
layer: +:TMAX:2 m above ground: +:TMIN:2 m above ground: +:TMP:0.995 sigma level: +:TMP:1000 mb: +:TMP:100 m above ground: +:TMP:100 mb: +:TMP:10 mb: +:TMP:1 mb: +:TMP:120-90 mb above ground: +:TMP:150-120 mb above ground: +:TMP:150 mb: +:TMP:180-150 mb above ground: +:TMP:1829 m above mean sea level: +:TMP:200 mb: +:TMP:20 mb: +:TMP:2 mb: +:TMP:250 mb: +:TMP:2743 m above mean sea level: +:TMP:300 mb: +:TMP:30-0 mb above ground: +:TMP:305 m above mean sea level: +:TMP:30 mb: +:TMP:3 mb: +:TMP:320 K isentropic level: +:TMP:350 mb: +:TMP:3658 m above mean sea level: +:TMP:400 mb: +:TMP:450 K isentropic level: +:TMP:450 mb: +:TMP:4572 m above mean sea level: +:TMP:457 m above mean sea level: +:TMP:500 mb: +:TMP:50 mb: +:TMP:5 mb: +:TMP:550 K isentropic level: +:TMP:550 mb: +:TMP:600 mb: +:TMP:60-30 mb above ground: +:TMP:610 m above mean sea level: +:TMP:650 K isentropic level: +:TMP:650 mb: +:TMP:700 mb: +:TMP:70 mb: +:TMP:7 mb: +:TMP:750 mb: +:TMP:800 mb: +:TMP:80 m above ground: +:TMP:900 mb: +:TMP:90-60 mb above ground: +:TMP:914 m above mean sea level: +:TMP:925 mb: +:TMP:950 mb: +:TMP:975 mb: +:TMP:high cloud top level: +:TMP:low cloud top level: +:TMP:max wind: +:TMP:middle cloud top level: +:TMP:PV=-1.5e-06 (Km^2/kg/s) surface: +:TMP:PV=1.5e-06 (Km^2/kg/s) surface: +:TMP:PV=-1e-06 (Km^2/kg/s) surface: +:TMP:PV=1e-06 (Km^2/kg/s) surface: +:TMP:PV=-2e-06 (Km^2/kg/s) surface: +:TMP:PV=2e-06 (Km^2/kg/s) surface: +:TMP:PV=-5e-07 (Km^2/kg/s) surface: +:TMP:PV=5e-07 (Km^2/kg/s) surface: +:TMP:surface: +:TMP:tropopause: +:TOZNE:entire atmosphere (considered as a single layer): +:TSOIL:0-0.1 m below ground: +:TSOIL:0.1-0.4 m below ground: +:TSOIL:0.4-1 m below ground: +:TSOIL:1-2 m below ground: +:UFLX:surface: +:UGRD:0.995 sigma level: +:UGRD:1000 mb: +:UGRD:100 m above ground: +:UGRD:100 mb: +:UGRD:10 mb: +:UGRD:1 mb: +:UGRD:120-90 mb above ground: +:UGRD:150-120 mb above ground: +:UGRD:150 mb: +:UGRD:180-150 mb above ground: +:UGRD:1829 m above mean sea level: 
+:UGRD:20 mb: +:UGRD:2 mb: +:UGRD:2743 m above mean sea level: +:UGRD:300 mb: +:UGRD:30-0 mb above ground: +:UGRD:305 m above mean sea level: +:UGRD:30 mb: +:UGRD:3 mb: +:UGRD:320 K isentropic level: +:UGRD:350 mb: +:UGRD:3658 m above mean sea level: +:UGRD:400 mb: +:UGRD:450 K isentropic level: +:UGRD:450 mb: +:UGRD:4572 m above mean sea level: +:UGRD:457 m above mean sea level: +:UGRD:500 mb: +:UGRD:50 mb: +:UGRD:5 mb: +:UGRD:550 K isentropic level: +:UGRD:550 mb: +:UGRD:600 mb: +:UGRD:60-30 mb above ground: +:UGRD:610 m above mean sea level: +:UGRD:650 K isentropic level: +:UGRD:650 mb: +:UGRD:700 mb: +:UGRD:70 mb: +:UGRD:7 mb: +:UGRD:750 mb: +:UGRD:800 mb: +:UGRD:80 m above ground: +:UGRD:900 mb: +:UGRD:90-60 mb above ground: +:UGRD:914 m above mean sea level: +:UGRD:925 mb: +:UGRD:950 mb: +:UGRD:975 mb: +:UGRD:max wind: +:UGRD:planetary boundary layer: +:UGRD:PV=-1.5e-06 (Km^2/kg/s) surface: +:UGRD:PV=1.5e-06 (Km^2/kg/s) surface: +:UGRD:PV=-1e-06 (Km^2/kg/s) surface: +:UGRD:PV=1e-06 (Km^2/kg/s) surface: +:UGRD:PV=-2e-06 (Km^2/kg/s) surface: +:UGRD:PV=2e-06 (Km^2/kg/s) surface: +:UGRD:PV=-5e-07 (Km^2/kg/s) surface: +:UGRD:PV=5e-07 (Km^2/kg/s) surface: +:UGRD:tropopause: +:U-GWD:surface: +:ULWRF:surface: +:USTM:6000-0 m above ground: +:USWRF:surface: +:USWRF:top of atmosphere: +:APTMP:2 m above ground +:VFLX:surface: +:VGRD:0.995 sigma level: +:VGRD:1000 mb: +:VGRD:100 m above ground: +:VGRD:100 mb: +:VGRD:10 mb: +:VGRD:1 mb: +:VGRD:120-90 mb above ground: +:VGRD:150-120 mb above ground: +:VGRD:150 mb: +:VGRD:180-150 mb above ground: +:VGRD:1829 m above mean sea level: +:VGRD:20 mb: +:VGRD:2 mb: +:VGRD:2743 m above mean sea level: +:VGRD:300 mb: +:VGRD:30-0 mb above ground: +:VGRD:305 m above mean sea level: +:VGRD:30 mb: +:VGRD:3 mb: +:VGRD:320 K isentropic level: +:VGRD:350 mb: +:VGRD:3658 m above mean sea level: +:VGRD:400 mb: +:VGRD:450 K isentropic level: +:VGRD:450 mb: +:VGRD:4572 m above mean sea level: +:VGRD:457 m above mean sea level: +:VGRD:500 mb: 
+:VGRD:50 mb: +:VGRD:5 mb: +:VGRD:550 K isentropic level: +:VGRD:550 mb: +:VGRD:600 mb: +:VGRD:60-30 mb above ground: +:VGRD:610 m above mean sea level: +:VGRD:650 K isentropic level: +:VGRD:650 mb: +:VGRD:700 mb: +:VGRD:70 mb: +:VGRD:7 mb: +:VGRD:750 mb: +:VGRD:800 mb: +:VGRD:80 m above ground: +:VGRD:900 mb: +:VGRD:90-60 mb above ground: +:VGRD:914 m above mean sea level: +:VGRD:925 mb: +:VGRD:950 mb: +:VGRD:975 mb: +:VGRD:max wind: +:VGRD:planetary boundary layer: +:VGRD:PV=-1.5e-06 (Km^2/kg/s) surface: +:VGRD:PV=1.5e-06 (Km^2/kg/s) surface: +:VGRD:PV=-1e-06 (Km^2/kg/s) surface: +:VGRD:PV=1e-06 (Km^2/kg/s) surface: +:VGRD:PV=-2e-06 (Km^2/kg/s) surface: +:VGRD:PV=2e-06 (Km^2/kg/s) surface: +:VGRD:PV=-5e-07 (Km^2/kg/s) surface: +:VGRD:PV=5e-07 (Km^2/kg/s) surface: +:VGRD:tropopause: +:V-GWD:surface: +:VIS:surface: +:VRATE:planetary boundary layer: +:VSTM:6000-0 m above ground: +:VVEL:0.995 sigma level: +:VVEL:1 mb: +:VVEL:2 mb: +:VVEL:3 mb: +:VVEL:5 mb: +:VVEL:7 mb: +:VVEL:10 mb: +:VVEL:20 mb: +:VVEL:30 mb: +:VVEL:50 mb: +:VVEL:70 mb: +:VVEL:1000 mb: +:VVEL:100 mb: +:VVEL:150 mb: +:VVEL:200 mb: +:VVEL:250 mb: +:VVEL:300 mb: +:VVEL:350 mb: +:VVEL:400 mb: +:VVEL:450 mb: +:VVEL:500 mb: +:VVEL:550 mb: +:VVEL:600 mb: +:VVEL:650 mb: +:VVEL:700 mb: +:VVEL:750 mb: +:VVEL:800 mb: +:VVEL:850 mb: +:VVEL:900 mb: +:VVEL:925 mb: +:VVEL:950 mb: +:VVEL:975 mb: +:VWSH:PV=-1.5e-06 (Km^2/kg/s) surface: +:VWSH:PV=1.5e-06 (Km^2/kg/s) surface: +:VWSH:PV=-1e-06 (Km^2/kg/s) surface: +:VWSH:PV=1e-06 (Km^2/kg/s) surface: +:VWSH:PV=-2e-06 (Km^2/kg/s) surface: +:VWSH:PV=2e-06 (Km^2/kg/s) surface: +:VWSH:PV=-5e-07 (Km^2/kg/s) surface: +:VWSH:PV=5e-07 (Km^2/kg/s) surface: +:VWSH:tropopause: +:WATR:surface: +:WEASD:surface: +:WILT:surface: +:HGT:cloud ceiling: +:PRES:1 hybrid level: +:HGT:1 hybrid level: +:TMP:1 hybrid level: +:RH:1 hybrid level: +:UGRD:1 hybrid level: +:VGRD:1 hybrid level: +:PRES:2 hybrid level: +:HGT:2 hybrid level: +:TMP:2 hybrid level: +:RH:2 hybrid level: +:UGRD:2 hybrid 
level: +:VGRD:2 hybrid level: +:PRES:3 hybrid level: +:HGT:3 hybrid level: +:TMP:3 hybrid level: +:RH:3 hybrid level: +:UGRD:3 hybrid level: +:VGRD:3 hybrid level: +:PRES:4 hybrid level: +:HGT:4 hybrid level: +:TMP:4 hybrid level: +:RH:4 hybrid level: +:UGRD:4 hybrid level: +:VGRD:4 hybrid level: diff --git a/parm/post/global_1x1_paramlist_g2.anl b/parm/product/gfs.anl.paramlist.a.txt similarity index 100% rename from parm/post/global_1x1_paramlist_g2.anl rename to parm/product/gfs.anl.paramlist.a.txt diff --git a/parm/post/global_1x1_paramlist_g2.f000 b/parm/product/gfs.f000.paramlist.a.txt similarity index 100% rename from parm/post/global_1x1_paramlist_g2.f000 rename to parm/product/gfs.f000.paramlist.a.txt diff --git a/parm/post/global_1x1_paramlist_g2 b/parm/product/gfs.fFFF.paramlist.a.txt similarity index 100% rename from parm/post/global_1x1_paramlist_g2 rename to parm/product/gfs.fFFF.paramlist.a.txt diff --git a/parm/post/global_master-catchup_parmlist_g2 b/parm/product/gfs.fFFF.paramlist.b.txt similarity index 100% rename from parm/post/global_master-catchup_parmlist_g2 rename to parm/product/gfs.fFFF.paramlist.b.txt diff --git a/parm/ufs/fix/gfs/atmos.fixed_files.yaml b/parm/ufs/fix/gfs/atmos.fixed_files.yaml index 7d901fe17b..8db691b49c 100644 --- a/parm/ufs/fix/gfs/atmos.fixed_files.yaml +++ b/parm/ufs/fix/gfs/atmos.fixed_files.yaml @@ -1,85 +1,85 @@ copy: # Atmosphere mosaic file linked as the grid_spec file (atm only) - - [$(FIXorog)/$(CASE)/$(CASE)_mosaic.nc, $(DATA)/INPUT/grid_spec.nc] + - [$(FIXgfs)/orog/$(CASE)/$(CASE)_mosaic.nc, $(DATA)/INPUT/grid_spec.nc] # Atmosphere grid tile files - - [$(FIXorog)/$(CASE)/$(CASE)_grid.tile1.nc, $(DATA)/INPUT/] - - [$(FIXorog)/$(CASE)/$(CASE)_grid.tile2.nc, $(DATA)/INPUT/] - - [$(FIXorog)/$(CASE)/$(CASE)_grid.tile3.nc, $(DATA)/INPUT/] - - [$(FIXorog)/$(CASE)/$(CASE)_grid.tile4.nc, $(DATA)/INPUT/] - - [$(FIXorog)/$(CASE)/$(CASE)_grid.tile5.nc, $(DATA)/INPUT/] - - [$(FIXorog)/$(CASE)/$(CASE)_grid.tile6.nc, 
$(DATA)/INPUT/] + - [$(FIXgfs)/orog/$(CASE)/$(CASE)_grid.tile1.nc, $(DATA)/INPUT/] + - [$(FIXgfs)/orog/$(CASE)/$(CASE)_grid.tile2.nc, $(DATA)/INPUT/] + - [$(FIXgfs)/orog/$(CASE)/$(CASE)_grid.tile3.nc, $(DATA)/INPUT/] + - [$(FIXgfs)/orog/$(CASE)/$(CASE)_grid.tile4.nc, $(DATA)/INPUT/] + - [$(FIXgfs)/orog/$(CASE)/$(CASE)_grid.tile5.nc, $(DATA)/INPUT/] + - [$(FIXgfs)/orog/$(CASE)/$(CASE)_grid.tile6.nc, $(DATA)/INPUT/] - # oro_data_ls and oro_data_ss files from FIXugwd - - [$(FIXugwd)/$(CASE)/$(CASE)_oro_data_ls.tile1.nc, $(DATA)/INPUT/oro_data_ls.tile1.nc] - - [$(FIXugwd)/$(CASE)/$(CASE)_oro_data_ls.tile2.nc, $(DATA)/INPUT/oro_data_ls.tile2.nc] - - [$(FIXugwd)/$(CASE)/$(CASE)_oro_data_ls.tile3.nc, $(DATA)/INPUT/oro_data_ls.tile3.nc] - - [$(FIXugwd)/$(CASE)/$(CASE)_oro_data_ls.tile4.nc, $(DATA)/INPUT/oro_data_ls.tile4.nc] - - [$(FIXugwd)/$(CASE)/$(CASE)_oro_data_ls.tile5.nc, $(DATA)/INPUT/oro_data_ls.tile5.nc] - - [$(FIXugwd)/$(CASE)/$(CASE)_oro_data_ls.tile6.nc, $(DATA)/INPUT/oro_data_ls.tile6.nc] - - [$(FIXugwd)/$(CASE)/$(CASE)_oro_data_ss.tile1.nc, $(DATA)/INPUT/oro_data_ss.tile1.nc] - - [$(FIXugwd)/$(CASE)/$(CASE)_oro_data_ss.tile2.nc, $(DATA)/INPUT/oro_data_ss.tile2.nc] - - [$(FIXugwd)/$(CASE)/$(CASE)_oro_data_ss.tile3.nc, $(DATA)/INPUT/oro_data_ss.tile3.nc] - - [$(FIXugwd)/$(CASE)/$(CASE)_oro_data_ss.tile4.nc, $(DATA)/INPUT/oro_data_ss.tile4.nc] - - [$(FIXugwd)/$(CASE)/$(CASE)_oro_data_ss.tile5.nc, $(DATA)/INPUT/oro_data_ss.tile5.nc] - - [$(FIXugwd)/$(CASE)/$(CASE)_oro_data_ss.tile6.nc, $(DATA)/INPUT/oro_data_ss.tile6.nc] + # oro_data_ls and oro_data_ss files from FIXgfs/ugwd + - [$(FIXgfs)/ugwd/$(CASE)/$(CASE)_oro_data_ls.tile1.nc, $(DATA)/INPUT/oro_data_ls.tile1.nc] + - [$(FIXgfs)/ugwd/$(CASE)/$(CASE)_oro_data_ls.tile2.nc, $(DATA)/INPUT/oro_data_ls.tile2.nc] + - [$(FIXgfs)/ugwd/$(CASE)/$(CASE)_oro_data_ls.tile3.nc, $(DATA)/INPUT/oro_data_ls.tile3.nc] + - [$(FIXgfs)/ugwd/$(CASE)/$(CASE)_oro_data_ls.tile4.nc, $(DATA)/INPUT/oro_data_ls.tile4.nc] + - 
[$(FIXgfs)/ugwd/$(CASE)/$(CASE)_oro_data_ls.tile5.nc, $(DATA)/INPUT/oro_data_ls.tile5.nc] + - [$(FIXgfs)/ugwd/$(CASE)/$(CASE)_oro_data_ls.tile6.nc, $(DATA)/INPUT/oro_data_ls.tile6.nc] + - [$(FIXgfs)/ugwd/$(CASE)/$(CASE)_oro_data_ss.tile1.nc, $(DATA)/INPUT/oro_data_ss.tile1.nc] + - [$(FIXgfs)/ugwd/$(CASE)/$(CASE)_oro_data_ss.tile2.nc, $(DATA)/INPUT/oro_data_ss.tile2.nc] + - [$(FIXgfs)/ugwd/$(CASE)/$(CASE)_oro_data_ss.tile3.nc, $(DATA)/INPUT/oro_data_ss.tile3.nc] + - [$(FIXgfs)/ugwd/$(CASE)/$(CASE)_oro_data_ss.tile4.nc, $(DATA)/INPUT/oro_data_ss.tile4.nc] + - [$(FIXgfs)/ugwd/$(CASE)/$(CASE)_oro_data_ss.tile5.nc, $(DATA)/INPUT/oro_data_ss.tile5.nc] + - [$(FIXgfs)/ugwd/$(CASE)/$(CASE)_oro_data_ss.tile6.nc, $(DATA)/INPUT/oro_data_ss.tile6.nc] # GWD?? - - [$(FIXugwd)/ugwp_limb_tau.nc, $(DATA)/ugwp_limb_tau.nc] + - [$(FIXgfs)/ugwd/ugwp_limb_tau.nc, $(DATA)/ugwp_limb_tau.nc] # CO2 climatology - - [$(FIXam)/co2monthlycyc.txt, $(DATA)/co2monthlycyc.txt] - - [$(FIXam)/global_co2historicaldata_glob.txt, $(DATA)/co2historicaldata_glob.txt] - - [$(FIXam)/fix_co2_proj/global_co2historicaldata_2009.txt, $(DATA)/co2historicaldata_2009.txt] - - [$(FIXam)/fix_co2_proj/global_co2historicaldata_2010.txt, $(DATA)/co2historicaldata_2010.txt] - - [$(FIXam)/fix_co2_proj/global_co2historicaldata_2011.txt, $(DATA)/co2historicaldata_2011.txt] - - [$(FIXam)/fix_co2_proj/global_co2historicaldata_2012.txt, $(DATA)/co2historicaldata_2012.txt] - - [$(FIXam)/fix_co2_proj/global_co2historicaldata_2013.txt, $(DATA)/co2historicaldata_2013.txt] - - [$(FIXam)/fix_co2_proj/global_co2historicaldata_2014.txt, $(DATA)/co2historicaldata_2014.txt] - - [$(FIXam)/fix_co2_proj/global_co2historicaldata_2015.txt, $(DATA)/co2historicaldata_2015.txt] - - [$(FIXam)/fix_co2_proj/global_co2historicaldata_2016.txt, $(DATA)/co2historicaldata_2016.txt] - - [$(FIXam)/fix_co2_proj/global_co2historicaldata_2017.txt, $(DATA)/co2historicaldata_2017.txt] - - [$(FIXam)/fix_co2_proj/global_co2historicaldata_2018.txt, 
$(DATA)/co2historicaldata_2018.txt] - - [$(FIXam)/fix_co2_proj/global_co2historicaldata_2019.txt, $(DATA)/co2historicaldata_2019.txt] - - [$(FIXam)/fix_co2_proj/global_co2historicaldata_2020.txt, $(DATA)/co2historicaldata_2020.txt] - - [$(FIXam)/fix_co2_proj/global_co2historicaldata_2021.txt, $(DATA)/co2historicaldata_2021.txt] - - [$(FIXam)/fix_co2_proj/global_co2historicaldata_2022.txt, $(DATA)/co2historicaldata_2022.txt] - - [$(FIXam)/fix_co2_proj/global_co2historicaldata_2023.txt, $(DATA)/co2historicaldata_2023.txt] + - [$(FIXgfs)/am/co2monthlycyc.txt, $(DATA)/co2monthlycyc.txt] + - [$(FIXgfs)/am/global_co2historicaldata_glob.txt, $(DATA)/co2historicaldata_glob.txt] + - [$(FIXgfs)/am/fix_co2_proj/global_co2historicaldata_2009.txt, $(DATA)/co2historicaldata_2009.txt] + - [$(FIXgfs)/am/fix_co2_proj/global_co2historicaldata_2010.txt, $(DATA)/co2historicaldata_2010.txt] + - [$(FIXgfs)/am/fix_co2_proj/global_co2historicaldata_2011.txt, $(DATA)/co2historicaldata_2011.txt] + - [$(FIXgfs)/am/fix_co2_proj/global_co2historicaldata_2012.txt, $(DATA)/co2historicaldata_2012.txt] + - [$(FIXgfs)/am/fix_co2_proj/global_co2historicaldata_2013.txt, $(DATA)/co2historicaldata_2013.txt] + - [$(FIXgfs)/am/fix_co2_proj/global_co2historicaldata_2014.txt, $(DATA)/co2historicaldata_2014.txt] + - [$(FIXgfs)/am/fix_co2_proj/global_co2historicaldata_2015.txt, $(DATA)/co2historicaldata_2015.txt] + - [$(FIXgfs)/am/fix_co2_proj/global_co2historicaldata_2016.txt, $(DATA)/co2historicaldata_2016.txt] + - [$(FIXgfs)/am/fix_co2_proj/global_co2historicaldata_2017.txt, $(DATA)/co2historicaldata_2017.txt] + - [$(FIXgfs)/am/fix_co2_proj/global_co2historicaldata_2018.txt, $(DATA)/co2historicaldata_2018.txt] + - [$(FIXgfs)/am/fix_co2_proj/global_co2historicaldata_2019.txt, $(DATA)/co2historicaldata_2019.txt] + - [$(FIXgfs)/am/fix_co2_proj/global_co2historicaldata_2020.txt, $(DATA)/co2historicaldata_2020.txt] + - [$(FIXgfs)/am/fix_co2_proj/global_co2historicaldata_2021.txt, 
$(DATA)/co2historicaldata_2021.txt] + - [$(FIXgfs)/am/fix_co2_proj/global_co2historicaldata_2022.txt, $(DATA)/co2historicaldata_2022.txt] + - [$(FIXgfs)/am/fix_co2_proj/global_co2historicaldata_2023.txt, $(DATA)/co2historicaldata_2023.txt] - # FIXam files - - [$(FIXam)/global_climaeropac_global.txt, $(DATA)/aerosol.dat] - - [$(FIXam)/ozprdlos_2015_new_sbuvO3_tclm15_nuchem.f77, $(DATA)/global_o3prdlos.f77] - - [$(FIXam)/global_h2o_pltc.f77, $(DATA)/global_h2oprdlos.f77] - - [$(FIXam)/global_glacier.2x2.grb, $(DATA)/global_glacier.2x2.grb] - - [$(FIXam)/global_maxice.2x2.grb, $(DATA)/global_maxice.2x2.grb] - - [$(FIXam)/global_snoclim.1.875.grb, $(DATA)/global_snoclim.1.875.grb] - - [$(FIXam)/global_slmask.t1534.3072.1536.grb, $(DATA)/global_slmask.t1534.3072.1536.grb] - - [$(FIXam)/global_soilmgldas.statsgo.t1534.3072.1536.grb, $(DATA)/global_soilmgldas.statsgo.t1534.3072.1536.grb] - - [$(FIXam)/global_solarconstant_noaa_an.txt, $(DATA)/solarconstant_noaa_an.txt] - - [$(FIXam)/global_sfc_emissivity_idx.txt, $(DATA)/sfc_emissivity_idx.txt] - - [$(FIXam)/RTGSST.1982.2012.monthly.clim.grb, $(DATA)/RTGSST.1982.2012.monthly.clim.grb] - - [$(FIXam)/IMS-NIC.blended.ice.monthly.clim.grb, $(DATA)/IMS-NIC.blended.ice.monthly.clim.grb] + # FIXgfs/am files + - [$(FIXgfs)/am/global_climaeropac_global.txt, $(DATA)/aerosol.dat] + - [$(FIXgfs)/am/ozprdlos_2015_new_sbuvO3_tclm15_nuchem.f77, $(DATA)/global_o3prdlos.f77] + - [$(FIXgfs)/am/global_h2o_pltc.f77, $(DATA)/global_h2oprdlos.f77] + - [$(FIXgfs)/am/global_glacier.2x2.grb, $(DATA)/global_glacier.2x2.grb] + - [$(FIXgfs)/am/global_maxice.2x2.grb, $(DATA)/global_maxice.2x2.grb] + - [$(FIXgfs)/am/global_snoclim.1.875.grb, $(DATA)/global_snoclim.1.875.grb] + - [$(FIXgfs)/am/global_slmask.t1534.3072.1536.grb, $(DATA)/global_slmask.t1534.3072.1536.grb] + - [$(FIXgfs)/am/global_soilmgldas.statsgo.t1534.3072.1536.grb, $(DATA)/global_soilmgldas.statsgo.t1534.3072.1536.grb] + - [$(FIXgfs)/am/global_solarconstant_noaa_an.txt, 
$(DATA)/solarconstant_noaa_an.txt] + - [$(FIXgfs)/am/global_sfc_emissivity_idx.txt, $(DATA)/sfc_emissivity_idx.txt] + - [$(FIXgfs)/am/RTGSST.1982.2012.monthly.clim.grb, $(DATA)/RTGSST.1982.2012.monthly.clim.grb] + - [$(FIXgfs)/am/IMS-NIC.blended.ice.monthly.clim.grb, $(DATA)/IMS-NIC.blended.ice.monthly.clim.grb] # MERRA2 Aerosol Climatology - - [$(FIXaer)/merra2.aerclim.2003-2014.m01.nc, $(DATA)/aeroclim.m01.nc] - - [$(FIXaer)/merra2.aerclim.2003-2014.m02.nc, $(DATA)/aeroclim.m02.nc] - - [$(FIXaer)/merra2.aerclim.2003-2014.m03.nc, $(DATA)/aeroclim.m03.nc] - - [$(FIXaer)/merra2.aerclim.2003-2014.m04.nc, $(DATA)/aeroclim.m04.nc] - - [$(FIXaer)/merra2.aerclim.2003-2014.m05.nc, $(DATA)/aeroclim.m05.nc] - - [$(FIXaer)/merra2.aerclim.2003-2014.m06.nc, $(DATA)/aeroclim.m06.nc] - - [$(FIXaer)/merra2.aerclim.2003-2014.m07.nc, $(DATA)/aeroclim.m07.nc] - - [$(FIXaer)/merra2.aerclim.2003-2014.m08.nc, $(DATA)/aeroclim.m08.nc] - - [$(FIXaer)/merra2.aerclim.2003-2014.m09.nc, $(DATA)/aeroclim.m09.nc] - - [$(FIXaer)/merra2.aerclim.2003-2014.m10.nc, $(DATA)/aeroclim.m10.nc] - - [$(FIXaer)/merra2.aerclim.2003-2014.m11.nc, $(DATA)/aeroclim.m11.nc] - - [$(FIXaer)/merra2.aerclim.2003-2014.m12.nc, $(DATA)/aeroclim.m12.nc] + - [$(FIXgfs)/aer/merra2.aerclim.2003-2014.m01.nc, $(DATA)/aeroclim.m01.nc] + - [$(FIXgfs)/aer/merra2.aerclim.2003-2014.m02.nc, $(DATA)/aeroclim.m02.nc] + - [$(FIXgfs)/aer/merra2.aerclim.2003-2014.m03.nc, $(DATA)/aeroclim.m03.nc] + - [$(FIXgfs)/aer/merra2.aerclim.2003-2014.m04.nc, $(DATA)/aeroclim.m04.nc] + - [$(FIXgfs)/aer/merra2.aerclim.2003-2014.m05.nc, $(DATA)/aeroclim.m05.nc] + - [$(FIXgfs)/aer/merra2.aerclim.2003-2014.m06.nc, $(DATA)/aeroclim.m06.nc] + - [$(FIXgfs)/aer/merra2.aerclim.2003-2014.m07.nc, $(DATA)/aeroclim.m07.nc] + - [$(FIXgfs)/aer/merra2.aerclim.2003-2014.m08.nc, $(DATA)/aeroclim.m08.nc] + - [$(FIXgfs)/aer/merra2.aerclim.2003-2014.m09.nc, $(DATA)/aeroclim.m09.nc] + - [$(FIXgfs)/aer/merra2.aerclim.2003-2014.m10.nc, $(DATA)/aeroclim.m10.nc] + - 
[$(FIXgfs)/aer/merra2.aerclim.2003-2014.m11.nc, $(DATA)/aeroclim.m11.nc] + - [$(FIXgfs)/aer/merra2.aerclim.2003-2014.m12.nc, $(DATA)/aeroclim.m12.nc] # Optical depth - - [$(FIXlut)/optics_BC.v1_3.dat, $(DATA)/optics_BC.dat] - - [$(FIXlut)/optics_DU.v15_3.dat, $(DATA)/optics_DU.dat] - - [$(FIXlut)/optics_OC.v1_3.dat, $(DATA)/optics_OC.dat] - - [$(FIXlut)/optics_SS.v3_3.dat, $(DATA)/optics_SS.dat] - - [$(FIXlut)/optics_SU.v1_3.dat, $(DATA)/optics_SU.dat] + - [$(FIXgfs)/lut/optics_BC.v1_3.dat, $(DATA)/optics_BC.dat] + - [$(FIXgfs)/lut/optics_DU.v15_3.dat, $(DATA)/optics_DU.dat] + - [$(FIXgfs)/lut/optics_OC.v1_3.dat, $(DATA)/optics_OC.dat] + - [$(FIXgfs)/lut/optics_SS.v3_3.dat, $(DATA)/optics_SS.dat] + - [$(FIXgfs)/lut/optics_SU.v1_3.dat, $(DATA)/optics_SU.dat] # fd_ufs.yaml file - [$(HOMEgfs)/sorc/ufs_model.fd/tests/parm/fd_ufs.yaml, $(DATA)/] diff --git a/parm/ufs/fix/gfs/land.fixed_files.yaml b/parm/ufs/fix/gfs/land.fixed_files.yaml index bb2d060963..8e4d221dbc 100644 --- a/parm/ufs/fix/gfs/land.fixed_files.yaml +++ b/parm/ufs/fix/gfs/land.fixed_files.yaml @@ -1,58 +1,58 @@ copy: - # Files from FIXorog/C??/sfc - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).facsf.tile1.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).facsf.tile2.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).facsf.tile3.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).facsf.tile4.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).facsf.tile5.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).facsf.tile6.nc, $(DATA)/] + # Files from FIXgfs/orog/C??/sfc + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).facsf.tile1.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).facsf.tile2.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).facsf.tile3.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).facsf.tile4.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).facsf.tile5.nc, $(DATA)/] + - 
[$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).facsf.tile6.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).maximum_snow_albedo.tile1.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).maximum_snow_albedo.tile2.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).maximum_snow_albedo.tile3.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).maximum_snow_albedo.tile4.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).maximum_snow_albedo.tile5.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).maximum_snow_albedo.tile6.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).maximum_snow_albedo.tile1.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).maximum_snow_albedo.tile2.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).maximum_snow_albedo.tile3.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).maximum_snow_albedo.tile4.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).maximum_snow_albedo.tile5.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).maximum_snow_albedo.tile6.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).slope_type.tile1.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).slope_type.tile2.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).slope_type.tile3.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).slope_type.tile4.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).slope_type.tile5.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).slope_type.tile6.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).slope_type.tile1.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).slope_type.tile2.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).slope_type.tile3.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).slope_type.tile4.nc, $(DATA)/] + - 
[$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).slope_type.tile5.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).slope_type.tile6.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).snowfree_albedo.tile1.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).snowfree_albedo.tile2.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).snowfree_albedo.tile3.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).snowfree_albedo.tile4.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).snowfree_albedo.tile5.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).snowfree_albedo.tile6.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).snowfree_albedo.tile1.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).snowfree_albedo.tile2.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).snowfree_albedo.tile3.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).snowfree_albedo.tile4.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).snowfree_albedo.tile5.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).snowfree_albedo.tile6.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).soil_type.tile1.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).soil_type.tile2.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).soil_type.tile3.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).soil_type.tile4.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).soil_type.tile5.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).soil_type.tile6.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).soil_type.tile1.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).soil_type.tile2.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).soil_type.tile3.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).soil_type.tile4.nc, $(DATA)/] + - 
[$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).soil_type.tile5.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).soil_type.tile6.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).substrate_temperature.tile1.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).substrate_temperature.tile2.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).substrate_temperature.tile3.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).substrate_temperature.tile4.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).substrate_temperature.tile5.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).substrate_temperature.tile6.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).substrate_temperature.tile1.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).substrate_temperature.tile2.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).substrate_temperature.tile3.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).substrate_temperature.tile4.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).substrate_temperature.tile5.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).substrate_temperature.tile6.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_greenness.tile1.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_greenness.tile2.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_greenness.tile3.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_greenness.tile4.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_greenness.tile5.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_greenness.tile6.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_greenness.tile1.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_greenness.tile2.nc, $(DATA)/] + - 
[$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_greenness.tile3.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_greenness.tile4.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_greenness.tile5.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_greenness.tile6.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_type.tile1.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_type.tile2.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_type.tile3.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_type.tile4.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_type.tile5.nc, $(DATA)/] - - [$(FIXorog)/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_type.tile6.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_type.tile1.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_type.tile2.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_type.tile3.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_type.tile4.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_type.tile5.nc, $(DATA)/] + - [$(FIXgfs)/orog/$(CASE)/sfc/$(CASE).mx$(OCNRES).vegetation_type.tile6.nc, $(DATA)/] diff --git a/parm/ufs/fix/gfs/ocean.fixed_files.yaml b/parm/ufs/fix/gfs/ocean.fixed_files.yaml index 1ca8ce7a68..4ef19bab0d 100644 --- a/parm/ufs/fix/gfs/ocean.fixed_files.yaml +++ b/parm/ufs/fix/gfs/ocean.fixed_files.yaml @@ -1,9 +1,9 @@ copy: # Orography data tile files - - [$(FIXorog)/$(CASE)/$(CASE).mx$(OCNRES)_oro_data.tile1.nc, $(DATA)/INPUT/oro_data.tile1.nc] - - [$(FIXorog)/$(CASE)/$(CASE).mx$(OCNRES)_oro_data.tile2.nc, $(DATA)/INPUT/oro_data.tile2.nc] - - [$(FIXorog)/$(CASE)/$(CASE).mx$(OCNRES)_oro_data.tile3.nc, $(DATA)/INPUT/oro_data.tile3.nc] - - 
[$(FIXorog)/$(CASE)/$(CASE).mx$(OCNRES)_oro_data.tile4.nc, $(DATA)/INPUT/oro_data.tile4.nc] - - [$(FIXorog)/$(CASE)/$(CASE).mx$(OCNRES)_oro_data.tile5.nc, $(DATA)/INPUT/oro_data.tile5.nc] - - [$(FIXorog)/$(CASE)/$(CASE).mx$(OCNRES)_oro_data.tile6.nc, $(DATA)/INPUT/oro_data.tile6.nc] + - [$(FIXgfs)/orog/$(CASE)/$(CASE).mx$(OCNRES)_oro_data.tile1.nc, $(DATA)/INPUT/oro_data.tile1.nc] + - [$(FIXgfs)/orog/$(CASE)/$(CASE).mx$(OCNRES)_oro_data.tile2.nc, $(DATA)/INPUT/oro_data.tile2.nc] + - [$(FIXgfs)/orog/$(CASE)/$(CASE).mx$(OCNRES)_oro_data.tile3.nc, $(DATA)/INPUT/oro_data.tile3.nc] + - [$(FIXgfs)/orog/$(CASE)/$(CASE).mx$(OCNRES)_oro_data.tile4.nc, $(DATA)/INPUT/oro_data.tile4.nc] + - [$(FIXgfs)/orog/$(CASE)/$(CASE).mx$(OCNRES)_oro_data.tile5.nc, $(DATA)/INPUT/oro_data.tile5.nc] + - [$(FIXgfs)/orog/$(CASE)/$(CASE).mx$(OCNRES)_oro_data.tile6.nc, $(DATA)/INPUT/oro_data.tile6.nc] diff --git a/parm/ufs/fv3/data_table b/parm/ufs/fv3/data_table deleted file mode 100644 index 4ca9128415..0000000000 --- a/parm/ufs/fv3/data_table +++ /dev/null @@ -1 +0,0 @@ -"OCN", "runoff", "runoff", "./INPUT/@[FRUNOFF]", "none" , 1.0 diff --git a/parm/ufs/fv3/diag_table b/parm/ufs/fv3/diag_table index b972b3470c..47106cb294 100644 --- a/parm/ufs/fv3/diag_table +++ b/parm/ufs/fv3/diag_table @@ -1,7 +1,7 @@ "fv3_history", 0, "hours", 1, "hours", "time" "fv3_history2d", 0, "hours", 1, "hours", "time" -"ocn%4yr%2mo%2dy%2hr", 6, "hours", 1, "hours", "time", 6, "hours", "1901 1 1 0 0 0" -"ocn_daily%4yr%2mo%2dy", 1, "days", 1, "days", "time", 1, "days", "1901 1 1 0 0 0" +"ocn%4yr%2mo%2dy%2hr", @[FHOUT_OCNICE], "hours", 1, "hours", "time", @[FHOUT_OCNICE], "hours", "@[SYEAR] @[SMONTH] @[SDAY] @[CHOUR] 0 0" +"ocn_daily%4yr%2mo%2dy", 1, "days", 1, "days", "time", 1, "days", "@[SYEAR] @[SMONTH] @[SDAY] @[CHOUR] 0 0" ############## # Ocean fields diff --git a/parm/ufs/fv3/diag_table_aod b/parm/ufs/fv3/diag_table_aod index 0de51b66d8..fd8aee1791 100644 --- a/parm/ufs/fv3/diag_table_aod +++ 
b/parm/ufs/fv3/diag_table_aod @@ -3,4 +3,4 @@ "gfs_phys", "SU_AOD_550", "su_aod550", "fv3_history2d", "all", .false., "none", 2 "gfs_phys", "BC_AOD_550", "bc_aod550", "fv3_history2d", "all", .false., "none", 2 "gfs_phys", "OC_AOD_550", "oc_aod550", "fv3_history2d", "all", .false., "none", 2 -"gfs_phys", "SS_AOD_550", "ss_aod550", "fv3_history2d", "all", .false., "none", 2 \ No newline at end of file +"gfs_phys", "SS_AOD_550", "ss_aod550", "fv3_history2d", "all", .false., "none", 2 diff --git a/parm/ufs/gocart/ExtData.other b/parm/ufs/gocart/ExtData.other index 789576305e..7a0d63d6ca 100644 --- a/parm/ufs/gocart/ExtData.other +++ b/parm/ufs/gocart/ExtData.other @@ -17,12 +17,12 @@ DU_UTHRES '1' Y E - none none uthres ExtData/n #====== Sulfate Sources ================================================= # Anthropogenic (BF & FF) emissions -- allowed to input as two layers -SU_ANTHROL1 NA N Y %y4-%m2-%d2t12:00:00 none none SO2 ExtData/nexus/CEDS/v2019/%y4/CEDS.2019.emis.%y4%m2%d2.nc -SU_ANTHROL2 NA N Y %y4-%m2-%d2t12:00:00 none none SO2_elev ExtData/nexus/CEDS/v2019/%y4/CEDS.2019.emis.%y4%m2%d2.nc +SU_ANTHROL1 NA N Y %y4-%m2-%d2t12:00:00 none none SO2 ExtData/nexus/CEDS/v2019/monthly/%y4/CEDS_2019_monthly.%y4%m2.nc +SU_ANTHROL2 NA N Y %y4-%m2-%d2t12:00:00 none none SO2_elev ExtData/nexus/CEDS/v2019/monthly/%y4/CEDS_2019_monthly.%y4%m2.nc # Ship emissions -SU_SHIPSO2 NA N Y %y4-%m2-%d2t12:00:00 none none SO2_ship ExtData/nexus/CEDS/v2019/%y4/CEDS.2019.emis.%y4%m2%d2.nc -SU_SHIPSO4 NA N Y %y4-%m2-%d2t12:00:00 none none SO4_ship ExtData/nexus/CEDS/v2019/%y4/CEDS.2019.emis.%y4%m2%d2.nc +SU_SHIPSO2 NA N Y %y4-%m2-%d2t12:00:00 none none SO2_ship ExtData/nexus/CEDS/v2019/monthly/%y4/CEDS_2019_monthly.%y4%m2.nc +SU_SHIPSO4 NA N Y %y4-%m2-%d2t12:00:00 none none SO4_ship ExtData/nexus/CEDS/v2019/monthly/%y4/CEDS_2019_monthly.%y4%m2.nc # Aircraft fuel consumption SU_AIRCRAFT NA Y Y %y4-%m2-%d2t12:00:00 none none none /dev/null @@ -54,20 +54,20 @@ pSO2_OCS NA Y Y 
%y4-%m2-%d2t12:00:00 none none biofuel /dev/null # --------------- # # VOCs - OFFLINE MEGAN BIOG -OC_ISOPRENE NA N Y %y4-%m2-%d2t12:00:00 none none isoprene ExtData/nexus/MEGAN_OFFLINE_BVOC/v2019-10/%y4/MEGAN.OFFLINE.BIOVOC.%y4.emis.%y4%m2%d2.nc -OC_LIMO NA N Y %y4-%m2-%d2t12:00:00 none none limo ExtData/nexus/MEGAN_OFFLINE_BVOC/v2019-10/%y4/MEGAN.OFFLINE.BIOVOC.%y4.emis.%y4%m2%d2.nc -OC_MTPA NA N Y %y4-%m2-%d2t12:00:00 none none mtpa ExtData/nexus/MEGAN_OFFLINE_BVOC/v2019-10/%y4/MEGAN.OFFLINE.BIOVOC.%y4.emis.%y4%m2%d2.nc -OC_MTPO NA N Y %y4-%m2-%d2t12:00:00 none none mtpo ExtData/nexus/MEGAN_OFFLINE_BVOC/v2019-10/%y4/MEGAN.OFFLINE.BIOVOC.%y4.emis.%y4%m2%d2.nc +OC_ISOPRENE NA Y Y %y4-%m2-%d2t12:00:00 none none isoprene ExtData/nexus/MEGAN_OFFLINE_BVOC/v2021-12/MEGAN_OFFLINE_CLIMO_2000_2022_%m2.nc +OC_LIMO NA Y Y %y4-%m2-%d2t12:00:00 none none limo ExtData/nexus/MEGAN_OFFLINE_BVOC/v2021-12/MEGAN_OFFLINE_CLIMO_2000_2022_%m2.nc +OC_MTPA NA Y Y %y4-%m2-%d2t12:00:00 none none mtpa ExtData/nexus/MEGAN_OFFLINE_BVOC/v2021-12/MEGAN_OFFLINE_CLIMO_2000_2022_%m2.nc +OC_MTPO NA Y Y %y4-%m2-%d2t12:00:00 none none mtpo ExtData/nexus/MEGAN_OFFLINE_BVOC/v2021-12/MEGAN_OFFLINE_CLIMO_2000_2022_%m2.nc # Biofuel Source -- Included in AeroCom anthropogenic emissions OC_BIOFUEL NA Y Y %y4-%m2-%d2t12:00:00 none none biofuel /dev/null # Anthropogenic (BF & FF) emissions -- allowed to input as two layers -OC_ANTEOC1 NA N Y %y4-%m2-%d2t12:00:00 none none OC ExtData/nexus/CEDS/v2019/%y4/CEDS.2019.emis.%y4%m2%d2.nc -OC_ANTEOC2 NA N Y %y4-%m2-%d2t12:00:00 none none OC_elev ExtData/nexus/CEDS/v2019/%y4/CEDS.2019.emis.%y4%m2%d2.nc +OC_ANTEOC1 NA N Y %y4-%m2-%d2t12:00:00 none none OC ExtData/nexus/CEDS/v2019/monthly/%y4/CEDS_2019_monthly.%y4%m2.nc +OC_ANTEOC2 NA N Y %y4-%m2-%d2t12:00:00 none none OC_elev ExtData/nexus/CEDS/v2019/monthly/%y4/CEDS_2019_monthly.%y4%m2.nc # EDGAR based ship emissions -OC_SHIP NA N Y %y4-%m2-%d2t12:00:00 none none OC_ship 
ExtData/nexus/CEDS/v2019/%y4/CEDS.2019.emis.%y4%m2%d2.nc +OC_SHIP NA N Y %y4-%m2-%d2t12:00:00 none none OC_ship ExtData/nexus/CEDS/v2019/monthly/%y4/CEDS_2019_monthly.%y4%m2.nc # Aircraft fuel consumption OC_AIRCRAFT NA N Y %y4-%m2-%d2t12:00:00 none none oc_aviation /dev/null @@ -88,11 +88,11 @@ pSOA_ANTHRO_VOC NA Y Y %y4-%m2-%d2t12:00:00 none none biofuel /dev/null BC_BIOFUEL NA Y Y %y4-%m2-%d2t12:00:00 none none biofuel /dev/null # Anthropogenic (BF & FF) emissions -- allowed to input as two layers -BC_ANTEBC1 NA N Y %y4-%m2-%d2t12:00:00 none none BC ExtData/nexus/CEDS/v2019/%y4/CEDS.2019.emis.%y4%m2%d2.nc -BC_ANTEBC2 NA N Y %y4-%m2-%d2t12:00:00 none none BC_elev ExtData/nexus/CEDS/v2019/%y4/CEDS.2019.emis.%y4%m2%d2.nc +BC_ANTEBC1 NA N Y %y4-%m2-%d2t12:00:00 none none BC ExtData/nexus/CEDS/v2019/monthly/%y4/CEDS_2019_monthly.%y4%m2.nc +BC_ANTEBC2 NA N Y %y4-%m2-%d2t12:00:00 none none BC_elev ExtData/nexus/CEDS/v2019/monthly/%y4/CEDS_2019_monthly.%y4%m2.nc # EDGAR based ship emissions -BC_SHIP NA N Y %y4-%m2-%d2t12:00:00 none none BC_ship ExtData/nexus/CEDS/v2019/%y4/CEDS.2019.emis.%y4%m2%d2.nc +BC_SHIP NA N Y %y4-%m2-%d2t12:00:00 none none BC_ship ExtData/nexus/CEDS/v2019/monthly/%y4/CEDS_2019_monthly.%y4%m2.nc # Aircraft fuel consumption BC_AIRCRAFT NA N Y %y4-%m2-%d2t12:00:00 none none bc_aviation /dev/null @@ -133,11 +133,11 @@ BRC_AVIATION_CRS NA Y Y %y4-%m2-%d2t12:00:00 none none oc_aviation /dev/null pSOA_BIOB_VOC NA Y Y %y4-%m2-%d2t12:00:00 none none biofuel /dev/null # # ======= Nitrate Sources ======== -# EMI_NH3_AG 'kg m-2 s-1' N Y %y4-%m2-%d2T12:00:00 none none NH3_ag ExtData/nexus/CEDS/v2019/%y4/CEDS.2019.emis.%y4%m2%d2.nc +# EMI_NH3_AG 'kg m-2 s-1' N Y %y4-%m2-%d2T12:00:00 none none NH3_ag ExtData/nexus/CEDS/v2019/monthly/%y4/CEDS_2019_monthly.%y4%m2.nc # EMI_NH3_EN 'kg m-2 s-1' N Y %y4-%m2-%d2T12:00:00 none none NH3_en /dev/null -# EMI_NH3_IN 'kg m-2 s-1' N Y %y4-%m2-%d2T12:00:00 none none NH3_in 
ExtData/nexus/CEDS/v2019/%y4/CEDS.2019.emis.%y4%m2%d2.nc -# EMI_NH3_RE 'kg m-2 s-1' N Y %y4-%m2-%d2T12:00:00 none none NH3_re ExtData/nexus/CEDS/v2019/%y4/CEDS.2019.emis.%y4%m2%d2.nc -# EMI_NH3_TR 'kg m-2 s-1' N Y %y4-%m2-%d2T12:00:00 none none NH3_tr ExtData/nexus/CEDS/v2019/%y4/CEDS.2019.emis.%y4%m2%d2.nc +# EMI_NH3_IN 'kg m-2 s-1' N Y %y4-%m2-%d2T12:00:00 none none NH3_in ExtData/nexus/CEDS/v2019/monthly/%y4/CEDS_2019_monthly.%y4%m2.nc +# EMI_NH3_RE 'kg m-2 s-1' N Y %y4-%m2-%d2T12:00:00 none none NH3_re ExtData/nexus/CEDS/v2019/monthly/%y4/CEDS_2019_monthly.%y4%m2.nc +# EMI_NH3_TR 'kg m-2 s-1' N Y %y4-%m2-%d2T12:00:00 none none NH3_tr ExtData/nexus/CEDS/v2019/monthly/%y4/CEDS_2019_monthly.%y4%m2.nc # EMI_NH3_OC 'kg m-2 s-1' Y Y %y4-%m2-%d2T12:00:00 none none emiss_ocn ExtData/PIESA/sfc/GEIA.emis_NH3.ocean.x576_y361.t12.20080715_12z.nc4 # # -------------------------------------------------------------- diff --git a/parm/ufs/gocart/SU2G_instance_SU.rc b/parm/ufs/gocart/SU2G_instance_SU.rc index e365827760..79484b3068 100644 --- a/parm/ufs/gocart/SU2G_instance_SU.rc +++ b/parm/ufs/gocart/SU2G_instance_SU.rc @@ -7,8 +7,8 @@ aerosol_monochromatic_optics_file: ExtData/monochromatic/optics_SU.v1_3.nc nbins: 4 -# Volcanic pointwise sources -volcano_srcfilen: ExtData/nexus/VOLCANO/v2021-09/%y4/%m2/so2_volcanic_emissions_Carns.%y4%m2%d2.rc +# Volcanic pointwise sources | degassing only | replace with path to historical volcanic emissions for scientific runs +volcano_srcfilen: ExtData/nexus/VOLCANO/v2021-09/so2_volcanic_emissions_CARN_v202005.degassing_only.rc # Heights [m] of LTO, CDS and CRS aviation emissions layers aviation_vertical_layers: 0.0 100.0 9.0e3 10.0e3 diff --git a/parm/ufs/mom6/MOM_input_template_025 b/parm/ufs/mom6/MOM_input_template_025 deleted file mode 100644 index df56a3f486..0000000000 --- a/parm/ufs/mom6/MOM_input_template_025 +++ /dev/null @@ -1,902 +0,0 @@ -! 
This input file provides the adjustable run-time parameters for version 6 of the Modular Ocean Model (MOM6). -! Where appropriate, parameters use usually given in MKS units. - -! This particular file is for the example in ice_ocean_SIS2/OM4_025. - -! This MOM_input file typically contains only the non-default values that are needed to reproduce this example. -! A full list of parameters for this example can be found in the corresponding MOM_parameter_doc.all file -! which is generated by the model at run-time. -! === module MOM_domains === -TRIPOLAR_N = True ! [Boolean] default = False - ! Use tripolar connectivity at the northern edge of the domain. With - ! TRIPOLAR_N, NIGLOBAL must be even. -NIGLOBAL = @[NX_GLB] ! - ! The total number of thickness grid points in the x-direction in the physical - ! domain. With STATIC_MEMORY_ this is set in MOM_memory.h at compile time. -NJGLOBAL = @[NY_GLB] ! - ! The total number of thickness grid points in the y-direction in the physical - ! domain. With STATIC_MEMORY_ this is set in MOM_memory.h at compile time. -NIHALO = 4 ! default = 4 - ! The number of halo points on each side in the x-direction. With - ! STATIC_MEMORY_ this is set as NIHALO_ in MOM_memory.h at compile time; without - ! STATIC_MEMORY_ the default is NIHALO_ in MOM_memory.h (if defined) or 2. -NJHALO = 4 ! default = 4 - ! The number of halo points on each side in the y-direction. With - ! STATIC_MEMORY_ this is set as NJHALO_ in MOM_memory.h at compile time; without - ! STATIC_MEMORY_ the default is NJHALO_ in MOM_memory.h (if defined) or 2. -! LAYOUT = 32, 18 ! - ! The processor layout that was actually used. -! IO_LAYOUT = 1, 1 ! default = 1 - ! The processor layout to be used, or 0,0 to automatically set the io_layout to - ! be the same as the layout. - -! === module MOM === -USE_REGRIDDING = True ! [Boolean] default = False - ! If True, use the ALE algorithm (regridding/remapping). If False, use the - ! layered isopycnal algorithm. 
-THICKNESSDIFFUSE = True ! [Boolean] default = False - ! If true, interface heights are diffused with a coefficient of KHTH. -THICKNESSDIFFUSE_FIRST = True ! [Boolean] default = False - ! If true, do thickness diffusion before dynamics. This is only used if - ! THICKNESSDIFFUSE is true. -DT = @[DT_DYNAM_MOM6] ! [s] - ! The (baroclinic) dynamics time step. The time-step that is actually used will - ! be an integer fraction of the forcing time-step (DT_FORCING in ocean-only mode - ! or the coupling timestep in coupled mode.) -DT_THERM = @[DT_THERM_MOM6] ! [s] default = 1800.0 - ! The thermodynamic and tracer advection time step. Ideally DT_THERM should be - ! an integer multiple of DT and less than the forcing or coupling time-step, - ! unless THERMO_SPANS_COUPLING is true, in which case DT_THERM can be an integer - ! multiple of the coupling timestep. By default DT_THERM is set to DT. -THERMO_SPANS_COUPLING = @[MOM6_THERMO_SPAN] ! [Boolean] default = False - ! If true, the MOM will take thermodynamic and tracer timesteps that can be - ! longer than the coupling timestep. The actual thermodynamic timestep that is - ! used in this case is the largest integer multiple of the coupling timestep - ! that is less than or equal to DT_THERM. -HFREEZE = 20.0 ! [m] default = -1.0 - ! If HFREEZE > 0, melt potential will be computed. The actual depth - ! over which melt potential is computed will be min(HFREEZE, OBLD) - ! where OBLD is the boundary layer depth. If HFREEZE <= 0 (default) - ! melt potential will not be computed. -USE_PSURF_IN_EOS = False ! [Boolean] default = False - ! If true, always include the surface pressure contributions in equation of - ! state calculations. -FRAZIL = True ! [Boolean] default = False - ! If true, water freezes if it gets too cold, and the accumulated heat deficit - ! is returned in the surface state. FRAZIL is only used if - ! ENABLE_THERMODYNAMICS is true. -DO_GEOTHERMAL = True ! [Boolean] default = False - ! 
If true, apply geothermal heating. -BOUND_SALINITY = True ! [Boolean] default = False - ! If true, limit salinity to being positive. (The sea-ice model may ask for more - ! salt than is available and drive the salinity negative otherwise.) -MIN_SALINITY = 0.01 ! [PPT] default = 0.01 - ! The minimum value of salinity when BOUND_SALINITY=True. The default is 0.01 - ! for backward compatibility but ideally should be 0. -C_P = 3992.0 ! [J kg-1 K-1] default = 3991.86795711963 - ! The heat capacity of sea water, approximated as a constant. This is only used - ! if ENABLE_THERMODYNAMICS is true. The default value is from the TEOS-10 - ! definition of conservative temperature. -CHECK_BAD_SURFACE_VALS = True ! [Boolean] default = False - ! If true, check the surface state for ridiculous values. -BAD_VAL_SSH_MAX = 50.0 ! [m] default = 20.0 - ! The value of SSH above which a bad value message is triggered, if - ! CHECK_BAD_SURFACE_VALS is true. -BAD_VAL_SSS_MAX = 75.0 ! [PPT] default = 45.0 - ! The value of SSS above which a bad value message is triggered, if - ! CHECK_BAD_SURFACE_VALS is true. -BAD_VAL_SST_MAX = 55.0 ! [deg C] default = 45.0 - ! The value of SST above which a bad value message is triggered, if - ! CHECK_BAD_SURFACE_VALS is true. -BAD_VAL_SST_MIN = -3.0 ! [deg C] default = -2.1 - ! The value of SST below which a bad value message is triggered, if - ! CHECK_BAD_SURFACE_VALS is true. -DEFAULT_2018_ANSWERS = True ! [Boolean] default = True - ! This sets the default value for the various _2018_ANSWERS parameters. -WRITE_GEOM = 2 ! default = 1 - ! If =0, never write the geometry and vertical grid files. If =1, write the - ! geometry and vertical grid files only for a new simulation. If =2, always - ! write the geometry and vertical grid files. Other values are invalid. -SAVE_INITIAL_CONDS = False ! [Boolean] default = False - ! If true, write the initial conditions to a file given by IC_OUTPUT_FILE. - -! === module MOM_hor_index === -! 
Sets the horizontal array index types. - -! === module MOM_fixed_initialization === -INPUTDIR = "INPUT" ! default = "." - ! The directory in which input files are found. - -! === module MOM_grid_init === -GRID_CONFIG = "mosaic" ! - ! A character string that determines the method for defining the horizontal - ! grid. Current options are: - ! mosaic - read the grid from a mosaic (supergrid) - ! file set by GRID_FILE. - ! cartesian - use a (flat) Cartesian grid. - ! spherical - use a simple spherical grid. - ! mercator - use a Mercator spherical grid. -GRID_FILE = "ocean_hgrid.nc" ! - ! Name of the file from which to read horizontal grid data. -GRID_ROTATION_ANGLE_BUGS = False ! [Boolean] default = True - ! If true, use an older algorithm to calculate the sines and - ! cosines needed to rotate between grid-oriented directions and - ! true north and east. Differences arise at the tripolar fold. -USE_TRIPOLAR_GEOLONB_BUG = False ! [Boolean] default = True - ! If true, use older code that incorrectly sets the longitude in some points - ! along the tripolar fold to be off by 360 degrees. -TOPO_CONFIG = "file" ! - ! This specifies how bathymetry is specified: - ! file - read bathymetric information from the file - ! specified by (TOPO_FILE). - ! flat - flat bottom set to MAXIMUM_DEPTH. - ! bowl - an analytically specified bowl-shaped basin - ! ranging between MAXIMUM_DEPTH and MINIMUM_DEPTH. - ! spoon - a similar shape to 'bowl', but with a vertical - ! wall at the southern face. - ! halfpipe - a zonally uniform channel with a half-sine - ! profile in the meridional direction. - ! benchmark - use the benchmark test case topography. - ! Neverland - use the Neverland test case topography. - ! DOME - use a slope and channel configuration for the - ! DOME sill-overflow test case. - ! ISOMIP - use a slope and channel configuration for the - ! ISOMIP test case. - ! DOME2D - use a shelf and slope configuration for the - ! DOME2D gravity current/overflow test case. - ! 
Kelvin - flat but with rotated land mask. - ! seamount - Gaussian bump for spontaneous motion test case. - ! dumbbell - Sloshing channel with reservoirs on both ends. - ! shelfwave - exponential slope for shelfwave test case. - ! Phillips - ACC-like idealized topography used in the Phillips config. - ! dense - Denmark Strait-like dense water formation and overflow. - ! USER - call a user modified routine. -TOPO_FILE = "ocean_topog.nc" ! default = "topog.nc" - ! The file from which the bathymetry is read. -TOPO_EDITS_FILE = "All_edits.nc" ! default = "" - ! The file from which to read a list of i,j,z topography overrides. -ALLOW_LANDMASK_CHANGES = @[MOM6_ALLOW_LANDMASK_CHANGES] ! default = "False" - ! If true, allow topography overrides to change ocean points to land -MAXIMUM_DEPTH = 6500.0 ! [m] - ! The maximum depth of the ocean. -MINIMUM_DEPTH = 9.5 ! [m] default = 0.0 - ! If MASKING_DEPTH is unspecified, then anything shallower than MINIMUM_DEPTH is - ! assumed to be land and all fluxes are masked out. If MASKING_DEPTH is - ! specified, then all depths shallower than MINIMUM_DEPTH but deeper than - ! MASKING_DEPTH are rounded to MINIMUM_DEPTH. - -! === module MOM_open_boundary === -! Controls where open boundaries are located, what kind of boundary condition to impose, and what data to apply, -! if any. -MASKING_DEPTH = 0.0 ! [m] default = -9999.0 - ! The depth below which to mask points as land points, for which all fluxes are - ! zeroed out. MASKING_DEPTH is ignored if negative. -CHANNEL_CONFIG = "list" ! default = "none" - ! A parameter that determines which set of channels are - ! restricted to specific widths. Options are: - ! none - All channels have the grid width. - ! global_1deg - Sets 16 specific channels appropriate - ! for a 1-degree model, as used in CM2G. - ! list - Read the channel locations and widths from a - ! text file, like MOM_channel_list in the MOM_SIS - ! test case. - ! file - Read open face widths everywhere from a - ! 
NetCDF file on the model grid. -CHANNEL_LIST_FILE = "MOM_channels_global_025" ! default = "MOM_channel_list" - ! The file from which the list of narrowed channels is read. - -! === module MOM_verticalGrid === -! Parameters providing information about the vertical grid. -NK = 75 ! [nondim] - ! The number of model layers. - -! === module MOM_tracer_registry === - -! === module MOM_EOS === -DTFREEZE_DP = -7.75E-08 ! [deg C Pa-1] default = 0.0 - ! When TFREEZE_FORM=LINEAR, this is the derivative of the freezing potential - ! temperature with pressure. - -! === module MOM_restart === -PARALLEL_RESTARTFILES = True ! [Boolean] default = False - ! If true, each processor writes its own restart file, otherwise a single - ! restart file is generated - -! === module MOM_tracer_flow_control === -USE_IDEAL_AGE_TRACER = False ! [Boolean] default = False - ! If true, use the ideal_age_example tracer package. - -! === module ideal_age_example === - -! === module MOM_coord_initialization === -COORD_CONFIG = "file" ! - ! This specifies how layers are to be defined: - ! ALE or none - used to avoid defining layers in ALE mode - ! file - read coordinate information from the file - ! specified by (COORD_FILE). - ! BFB - Custom coords for buoyancy-forced basin case - ! based on SST_S, T_BOT and DRHO_DT. - ! linear - linear based on interfaces not layers - ! layer_ref - linear based on layer densities - ! ts_ref - use reference temperature and salinity - ! ts_range - use range of temperature and salinity - ! (T_REF and S_REF) to determine surface density - ! and GINT calculate internal densities. - ! gprime - use reference density (RHO_0) for surface - ! density and GINT calculate internal densities. - ! ts_profile - use temperature and salinity profiles - ! (read from COORD_FILE) to set layer densities. - ! USER - call a user modified routine. -COORD_FILE = "layer_coord.nc" ! - ! The file from which the coordinate densities are read. -REMAP_UV_USING_OLD_ALG = True ! 
[Boolean] default = True - ! If true, uses the old remapping-via-a-delta-z method for remapping u and v. If - ! false, uses the new method that remaps between grids described by an old and - ! new thickness. -REGRIDDING_COORDINATE_MODE = "HYCOM1" ! default = "LAYER" - ! Coordinate mode for vertical regridding. Choose among the following - ! possibilities: LAYER - Isopycnal or stacked shallow water layers - ! ZSTAR, Z* - stretched geopotential z* - ! SIGMA_SHELF_ZSTAR - stretched geopotential z* ignoring shelf - ! SIGMA - terrain following coordinates - ! RHO - continuous isopycnal - ! HYCOM1 - HyCOM-like hybrid coordinate - ! SLIGHT - stretched coordinates above continuous isopycnal - ! ADAPTIVE - optimize for smooth neutral density surfaces -BOUNDARY_EXTRAPOLATION = True ! [Boolean] default = False - ! When defined, a proper high-order reconstruction scheme is used within - ! boundary cells rather than PCM. E.g., if PPM is used for remapping, a PPM - ! reconstruction will also be used within boundary cells. -ALE_COORDINATE_CONFIG = "HYBRID:hycom1_75_800m.nc,sigma2,FNC1:2,4000,4.5,.01" ! default = "UNIFORM" - ! Determines how to specify the coordinate resolution. Valid options are: - ! PARAM - use the vector-parameter ALE_RESOLUTION - ! UNIFORM[:N] - uniformly distributed - ! FILE:string - read from a file. The string specifies - ! the filename and variable name, separated - ! by a comma or space, e.g. FILE:lev.nc,dz - ! or FILE:lev.nc,interfaces=zw - ! WOA09[:N] - the WOA09 vertical grid (approximately) - ! FNC1:string - FNC1:dz_min,H_total,power,precision - ! HYBRID:string - read from a file. The string specifies - ! the filename and two variable names, separated - ! by a comma or space, for sigma-2 and dz. e.g. - ! 
HYBRID:vgrid.nc,sigma2,dz -!ALE_RESOLUTION = 7*2.0, 2*2.01, 2.02, 2.03, 2.05, 2.08, 2.11, 2.15, 2.21, 2.2800000000000002, 2.37, 2.48, 2.61, 2.77, 2.95, 3.17, 3.4299999999999997, 3.74, 4.09, 4.49, 4.95, 5.48, 6.07, 6.74, 7.5, 8.34, 9.280000000000001, 10.33, 11.49, 12.77, 14.19, 15.74, 17.450000000000003, 19.31, 21.35, 23.56, 25.97, 28.580000000000002, 31.41, 34.47, 37.77, 41.32, 45.14, 49.25, 53.65, 58.370000000000005, 63.42, 68.81, 74.56, 80.68, 87.21000000000001, 94.14, 101.51, 109.33, 117.62, 126.4, 135.68, 145.5, 155.87, 166.81, 178.35, 190.51, 203.31, 216.78, 230.93, 245.8, 261.42, 277.83 ! [m] - ! The distribution of vertical resolution for the target - ! grid used for Eulerian-like coordinates. For example, - ! in z-coordinate mode, the parameter is a list of level - ! thicknesses (in m). In sigma-coordinate mode, the list - ! is of non-dimensional fractions of the water column. -!TARGET_DENSITIES = 1010.0, 1014.3034, 1017.8088, 1020.843, 1023.5566, 1025.813, 1027.0275, 1027.9114, 1028.6422, 1029.2795, 1029.852, 1030.3762, 1030.8626, 1031.3183, 1031.7486, 1032.1572, 1032.5471, 1032.9207, 1033.2798, 1033.6261, 1033.9608, 1034.2519, 1034.4817, 1034.6774, 1034.8508, 1035.0082, 1035.1533, 1035.2886, 1035.4159, 1035.5364, 1035.6511, 1035.7608, 1035.8661, 1035.9675, 1036.0645, 1036.1554, 1036.2411, 1036.3223, 1036.3998, 1036.4739, 1036.5451, 1036.6137, 1036.68, 1036.7441, 1036.8062, 1036.8526, 1036.8874, 1036.9164, 1036.9418, 1036.9647, 1036.9857, 1037.0052, 1037.0236, 1037.0409, 1037.0574, 1037.0738, 1037.0902, 1037.1066, 1037.123, 1037.1394, 1037.1558, 1037.1722, 1037.1887, 1037.206, 1037.2241, 1037.2435, 1037.2642, 1037.2866, 1037.3112, 1037.3389, 1037.3713, 1037.4118, 1037.475, 1037.6332, 1037.8104, 1038.0 ! [m] - ! HYBRID target densities for interfaces -REGRID_COMPRESSIBILITY_FRACTION = 0.01 ! [nondim] default = 0.0 - ! When interpolating potential density profiles we can add some artificial - ! 
compressibility solely to make homogeneous regions appear stratified. -MAXIMUM_INT_DEPTH_CONFIG = "FNC1:5,8000.0,1.0,.01" ! default = "NONE" - ! Determines how to specify the maximum interface depths. - ! Valid options are: - ! NONE - there are no maximum interface depths - ! PARAM - use the vector-parameter MAXIMUM_INTERFACE_DEPTHS - ! FILE:string - read from a file. The string specifies - ! the filename and variable name, separated - ! by a comma or space, e.g. FILE:lev.nc,Z - ! FNC1:string - FNC1:dz_min,H_total,power,precision -!MAXIMUM_INT_DEPTHS = 0.0, 5.0, 12.75, 23.25, 36.49, 52.480000000000004, 71.22, 92.71000000000001, 116.94000000000001, 143.92000000000002, 173.65, 206.13, 241.36, 279.33000000000004, 320.05000000000007, 363.5200000000001, 409.7400000000001, 458.7000000000001, 510.4100000000001, 564.8700000000001, 622.0800000000002, 682.0300000000002, 744.7300000000002, 810.1800000000003, 878.3800000000003, 949.3300000000004, 1023.0200000000004, 1099.4600000000005, 1178.6500000000005, 1260.5900000000006, 1345.2700000000007, 1432.7000000000007, 1522.8800000000008, 1615.8100000000009, 1711.490000000001, 1809.910000000001, 1911.080000000001, 2015.0000000000011, 2121.670000000001, 2231.080000000001, 2343.2400000000007, 2458.1500000000005, 2575.8100000000004, 2696.2200000000003, 2819.3700000000003, 2945.2700000000004, 3073.9200000000005, 3205.3200000000006, 3339.4600000000005, 3476.3500000000004, 3615.9900000000002, 3758.38, 3903.52, 4051.4, 4202.03, 4355.41, 4511.54, 4670.41, 4832.03, 4996.4, 5163.5199999999995, 5333.379999999999, 5505.989999999999, 5681.3499999999985, 5859.459999999998, 6040.319999999998, 6223.919999999998, 6410.269999999999, 6599.369999999999, 6791.219999999999, 6985.8099999999995, 7183.15, 7383.24, 7586.08, 7791.67, 8000.0 - ! The list of maximum depths for each interface. -MAX_LAYER_THICKNESS_CONFIG = "FNC1:400,31000.0,0.1,.01" ! default = "NONE" - ! Determines how to specify the maximum layer thicknesses. - ! Valid options are: - ! 
NONE - there are no maximum layer thicknesses - ! PARAM - use the vector-parameter MAX_LAYER_THICKNESS - ! FILE:string - read from a file. The string specifies - ! the filename and variable name, separated - ! by a comma or space, e.g. FILE:lev.nc,Z - ! FNC1:string - FNC1:dz_min,H_total,power,precision -!MAX_LAYER_THICKNESS = 400.0, 409.63, 410.32, 410.75, 411.07, 411.32, 411.52, 411.7, 411.86, 412.0, 412.13, 412.24, 412.35, 412.45, 412.54, 412.63, 412.71, 412.79, 412.86, 412.93, 413.0, 413.06, 413.12, 413.18, 413.24, 413.29, 413.34, 413.39, 413.44, 413.49, 413.54, 413.58, 413.62, 413.67, 413.71, 413.75, 413.78, 413.82, 413.86, 413.9, 413.93, 413.97, 414.0, 414.03, 414.06, 414.1, 414.13, 414.16, 414.19, 414.22, 414.24, 414.27, 414.3, 414.33, 414.35, 414.38, 414.41, 414.43, 414.46, 414.48, 414.51, 414.53, 414.55, 414.58, 414.6, 414.62, 414.65, 414.67, 414.69, 414.71, 414.73, 414.75, 414.77, 414.79, 414.83 ! [m] - ! The list of maximum thickness for each layer. -REMAPPING_SCHEME = "PPM_H4" ! default = "PLM" - ! This sets the reconstruction scheme used for vertical remapping for all - ! variables. It can be one of the following schemes: PCM (1st-order - ! accurate) - ! PLM (2nd-order accurate) - ! PPM_H4 (3rd-order accurate) - ! PPM_IH4 (3rd-order accurate) - ! PQM_IH4IH3 (4th-order accurate) - ! PQM_IH6IH5 (5th-order accurate) - -! === module MOM_grid === -! Parameters providing information about the lateral grid. - -! === module MOM_state_initialization === -INIT_LAYERS_FROM_Z_FILE = True ! [Boolean] default = False - ! If true, initialize the layer thicknesses, temperatures, and salinities from a - ! Z-space file on a latitude-longitude grid. - -! === module MOM_initialize_layers_from_Z === -TEMP_SALT_Z_INIT_FILE = "MOM6_IC_TS.nc" ! default = "temp_salt_z.nc" - ! The name of the z-space input file used to initialize - ! temperatures (T) and salinities (S). If T and S are not - ! in the same file, TEMP_Z_INIT_FILE and SALT_Z_INIT_FILE - ! must be set. 
-Z_INIT_FILE_PTEMP_VAR = "temp" ! default = "ptemp" - ! The name of the potential temperature variable in - ! TEMP_Z_INIT_FILE. -Z_INIT_FILE_SALT_VAR = "salt" ! default = "salt" - ! The name of the salinity variable in - ! SALT_Z_INIT_FILE. - -Z_INIT_ALE_REMAPPING = True ! [Boolean] default = False - ! If True, then remap straight to model coordinate from file. -Z_INIT_REMAP_OLD_ALG = True ! [Boolean] default = True - ! If false, uses the preferred remapping algorithm for initialization. If true, - ! use an older, less robust algorithm for remapping. - -! === module MOM_diag_mediator === -!Jiande NUM_DIAG_COORDS = 2 ! default = 1 -NUM_DIAG_COORDS = 1 - ! The number of diagnostic vertical coordinates to use. - ! For each coordinate, an entry in DIAG_COORDS must be provided. -!Jiande DIAG_COORDS = "z Z ZSTAR", "rho2 RHO2 RHO" ! -DIAG_COORDS = "z Z ZSTAR" - ! A list of string tuples associating diag_table modules to - ! a coordinate definition used for diagnostics. Each string - ! is of the form "MODULE_SUFFIX,PARAMETER_SUFFIX,COORDINATE_NAME". -DIAG_COORD_DEF_Z="FILE:@[MOM6_DIAG_COORD_DEF_Z_FILE],interfaces=zw" -DIAG_MISVAL = @[MOM6_DIAG_MISVAL] -!DIAG_COORD_DEF_RHO2 = "FILE:diag_rho2.nc,interfaces=rho2" ! default = "WOA09" - ! Determines how to specify the coordinate resolution. Valid options are: - ! PARAM - use the vector-parameter DIAG_COORD_RES_RHO2 - ! UNIFORM[:N] - uniformly distributed - ! FILE:string - read from a file. The string specifies - ! the filename and variable name, separated - ! by a comma or space, e.g. FILE:lev.nc,dz - ! or FILE:lev.nc,interfaces=zw - ! WOA09[:N] - the WOA09 vertical grid (approximately) - ! FNC1:string - FNC1:dz_min,H_total,power,precision - ! HYBRID:string - read from a file. The string specifies - ! the filename and two variable names, separated - ! by a comma or space, for sigma-2 and dz. e.g. - ! HYBRID:vgrid.nc,sigma2,dz - -! === module MOM_MEKE === -USE_MEKE = True ! [Boolean] default = False - ! 
If true, turns on the MEKE scheme which calculates a sub-grid mesoscale eddy - ! kinetic energy budget. -MEKE_GMCOEFF = 1.0 ! [nondim] default = -1.0 - ! The efficiency of the conversion of potential energy into MEKE by the - ! thickness mixing parameterization. If MEKE_GMCOEFF is negative, this - ! conversion is not used or calculated. -MEKE_BGSRC = 1.0E-13 ! [W kg-1] default = 0.0 - ! A background energy source for MEKE. -MEKE_KHMEKE_FAC = 1.0 ! [nondim] default = 0.0 - ! A factor that maps MEKE%Kh to Kh for MEKE itself. -MEKE_ALPHA_RHINES = 0.15 ! [nondim] default = 0.05 - ! If positive, is a coefficient weighting the Rhines scale in the expression for - ! mixing length used in MEKE-derived diffusivity. -MEKE_ALPHA_EADY = 0.15 ! [nondim] default = 0.05 - ! If positive, is a coefficient weighting the Eady length scale in the - ! expression for mixing length used in MEKE-derived diffusivity. - -! === module MOM_lateral_mixing_coeffs === -USE_VARIABLE_MIXING = True ! [Boolean] default = False - ! If true, the variable mixing code will be called. This allows diagnostics to - ! be created even if the scheme is not used. If KHTR_SLOPE_CFF>0 or - ! KhTh_Slope_Cff>0, this is set to true regardless of what is in the parameter - ! file. -RESOLN_SCALED_KH = True ! [Boolean] default = False - ! If true, the Laplacian lateral viscosity is scaled away when the first - ! baroclinic deformation radius is well resolved. -RESOLN_SCALED_KHTH = True ! [Boolean] default = False - ! If true, the interface depth diffusivity is scaled away when the first - ! baroclinic deformation radius is well resolved. -KHTR_SLOPE_CFF = 0.25 ! [nondim] default = 0.0 - ! The nondimensional coefficient in the Visbeck formula for the epipycnal tracer - ! diffusivity -USE_STORED_SLOPES = True ! [Boolean] default = False - ! If true, the isopycnal slopes are calculated once and stored for re-use. This - ! uses more memory but avoids calling the equation of state more times than - ! should be necessary. 
-INTERPOLATE_RES_FN = False ! [Boolean] default = True - ! If true, interpolate the resolution function to the velocity points from the - ! thickness points; otherwise interpolate the wave speed and calculate the - ! resolution function independently at each point. -GILL_EQUATORIAL_LD = True ! [Boolean] default = False - ! If true, uses Gill's definition of the baroclinic equatorial deformation - ! radius, otherwise, if false, use Pedlosky's definition. These definitions - ! differ by a factor of 2 in front of the beta term in the denominator. Gill's - ! is the more appropriate definition. -INTERNAL_WAVE_SPEED_BETTER_EST = False ! [Boolean] default = True - ! If true, use a more robust estimate of the first mode wave speed as the - ! starting point for iterations. - -! === module MOM_set_visc === -CHANNEL_DRAG = True ! [Boolean] default = False - ! If true, the bottom drag is exerted directly on each layer proportional to the - ! fraction of the bottom it overlies. -PRANDTL_TURB = 1.25 ! [nondim] default = 1.0 - ! The turbulent Prandtl number applied to shear instability. -HBBL = 10.0 ! [m] - ! The thickness of a bottom boundary layer with a viscosity of KVBBL if - ! BOTTOMDRAGLAW is not defined, or the thickness over which near-bottom - ! velocities are averaged for the drag law if BOTTOMDRAGLAW is defined but - ! LINEAR_DRAG is not. -DRAG_BG_VEL = 0.1 ! [m s-1] default = 0.0 - ! DRAG_BG_VEL is either the assumed bottom velocity (with LINEAR_DRAG) or an - ! unresolved velocity that is combined with the resolved velocity to estimate - ! the velocity magnitude. DRAG_BG_VEL is only used when BOTTOMDRAGLAW is - ! defined. -BBL_USE_EOS = True ! [Boolean] default = False - ! If true, use the equation of state in determining the properties of the bottom - ! boundary layer. Otherwise use the layer target potential densities. -BBL_THICK_MIN = 0.1 ! [m] default = 0.0 - ! The minimum bottom boundary layer thickness that can be used with - ! BOTTOMDRAGLAW. 
This might be Kv/(cdrag*drag_bg_vel) to give Kv as the minimum - ! near-bottom viscosity. -KV = 1.0E-04 ! [m2 s-1] - ! The background kinematic viscosity in the interior. The molecular value, ~1e-6 - ! m2 s-1, may be used. -KV_BBL_MIN = 0.0 ! [m2 s-1] default = 1.0E-04 - ! The minimum viscosities in the bottom boundary layer. -KV_TBL_MIN = 0.0 ! [m2 s-1] default = 1.0E-04 - ! The minimum viscosities in the top boundary layer. - -! === module MOM_thickness_diffuse === -KHTH_MAX_CFL = 0.1 ! [nondimensional] default = 0.8 - ! The maximum value of the local diffusive CFL ratio that is permitted for the - ! thickness diffusivity. 1.0 is the marginally unstable value in a pure layered - ! model, but much smaller numbers (e.g. 0.1) seem to work better for ALE-based - ! models. -USE_GM_WORK_BUG = True ! [Boolean] default = True - ! If true, compute the top-layer work tendency on the u-grid with the incorrect - ! sign, for legacy reproducibility. - -! === module MOM_continuity === - -! === module MOM_continuity_PPM === -ETA_TOLERANCE = 1.0E-06 ! [m] default = 3.75E-09 - ! The tolerance for the differences between the barotropic and baroclinic - ! estimates of the sea surface height due to the fluxes through each face. The - ! total tolerance for SSH is 4 times this value. The default is - ! 0.5*NK*ANGSTROM, and this should not be set less than about - ! 10^-15*MAXIMUM_DEPTH. -ETA_TOLERANCE_AUX = 0.001 ! [m] default = 1.0E-06 - ! The tolerance for free-surface height discrepancies between the barotropic - ! solution and the sum of the layer thicknesses when calculating the auxiliary - ! corrected velocities. By default, this is the same as ETA_TOLERANCE, but can - ! be made larger for efficiency. - -! === module MOM_CoriolisAdv === -CORIOLIS_SCHEME = "SADOURNY75_ENSTRO" ! default = "SADOURNY75_ENERGY" - ! CORIOLIS_SCHEME selects the discretization for the Coriolis terms. Valid - ! values are: - ! SADOURNY75_ENERGY - Sadourny, 1975; energy cons. - ! 
ARAKAWA_HSU90 - Arakawa & Hsu, 1990 - ! SADOURNY75_ENSTRO - Sadourny, 1975; enstrophy cons. - ! ARAKAWA_LAMB81 - Arakawa & Lamb, 1981; En. + Enst. - ! ARAKAWA_LAMB_BLEND - A blend of Arakawa & Lamb with - ! Arakawa & Hsu and Sadourny energy -BOUND_CORIOLIS = True ! [Boolean] default = False - ! If true, the Coriolis terms at u-points are bounded by the four estimates of - ! (f+rv)v from the four neighboring v-points, and similarly at v-points. This - ! option would have no effect on the SADOURNY Coriolis scheme if it were - ! possible to use centered difference thickness fluxes. - -! === module MOM_PressureForce === - -! === module MOM_PressureForce_AFV === -MASS_WEIGHT_IN_PRESSURE_GRADIENT = True ! [Boolean] default = False - ! If true, use mass weighting when interpolating T/S for integrals near the - ! bathymetry in AFV pressure gradient calculations. - -! === module MOM_hor_visc === -LAPLACIAN = True ! [Boolean] default = False - ! If true, use a Laplacian horizontal viscosity. -AH_VEL_SCALE = 0.01 ! [m s-1] default = 0.0 - ! The velocity scale which is multiplied by the cube of the grid spacing to - ! calculate the biharmonic viscosity. The final viscosity is the largest of this - ! scaled viscosity, the Smagorinsky and Leith viscosities, and AH. -SMAGORINSKY_AH = True ! [Boolean] default = False - ! If true, use a biharmonic Smagorinsky nonlinear eddy viscosity. -SMAG_BI_CONST = 0.06 ! [nondim] default = 0.0 - ! The nondimensional biharmonic Smagorinsky constant, typically 0.015 - 0.06. -USE_LAND_MASK_FOR_HVISC = False ! [Boolean] default = False - ! If true, use the land mask for the computation of thicknesses at velocity - ! locations. This eliminates the dependence on arbitrary values over land or - ! outside of the domain. Default is False in order to maintain answers with - ! legacy experiments but should be changed to True for new experiments. - -! === module MOM_vert_friction === -HMIX_FIXED = 0.5 ! [m] - ! 
The prescribed depth over which the near-surface viscosity and diffusivity are - ! elevated when the bulk mixed layer is not used. -KVML = 1.0E-04 ! [m2 s-1] default = 1.0E-04 - ! The kinematic viscosity in the mixed layer. A typical value is ~1e-2 m2 s-1. - ! KVML is not used if BULKMIXEDLAYER is true. The default is set by KV. -MAXVEL = 6.0 ! [m s-1] default = 3.0E+08 - ! The maximum velocity allowed before the velocity components are truncated. - -! === module MOM_PointAccel === -U_TRUNC_FILE = "U_velocity_truncations" ! default = "" - ! The absolute path to a file into which the accelerations leading to zonal - ! velocity truncations are written. Undefine this for efficiency if this - ! diagnostic is not needed. -V_TRUNC_FILE = "V_velocity_truncations" ! default = "" - ! The absolute path to a file into which the accelerations leading to meridional - ! velocity truncations are written. Undefine this for efficiency if this - ! diagnostic is not needed. - -! === module MOM_barotropic === -BOUND_BT_CORRECTION = True ! [Boolean] default = False - ! If true, the corrective pseudo mass-fluxes into the barotropic solver are - ! limited to values that require less than maxCFL_BT_cont to be accommodated. -BT_PROJECT_VELOCITY = True ! [Boolean] default = False - ! If true, step the barotropic velocity first and project out the velocity - ! tendency by 1+BEBT when calculating the transport. The default (false) is to - ! use a predictor continuity step to find the pressure field, and then to do a - ! corrector continuity step using a weighted average of the old and new - ! velocities, with weights of (1-BEBT) and BEBT. -DYNAMIC_SURFACE_PRESSURE = True ! [Boolean] default = False - ! If true, add a dynamic pressure due to a viscous ice shelf, for instance. -BEBT = 0.2 ! [nondim] default = 0.1 - ! BEBT determines whether the barotropic time stepping uses the forward-backward - ! time-stepping scheme or a backward Euler scheme. BEBT is valid in the range - ! 
from 0 (for a forward-backward treatment of nonrotating gravity waves) to 1 - ! (for a backward Euler treatment). In practice, BEBT must be greater than about - ! 0.05. -DTBT = -0.9 ! [s or nondim] default = -0.98 - ! The barotropic time step, in s. DTBT is only used with the split explicit time - ! stepping. To set the time step automatically based on the maximum stable value - ! use 0, or a negative value gives the fraction of the stable value. Setting - ! DTBT to 0 is the same as setting it to -0.98. The value of DTBT that will - ! actually be used is an integer fraction of DT, rounding down. -BT_USE_OLD_CORIOLIS_BRACKET_BUG = True ! [Boolean] default = False - ! If True, use an order of operations that is not bitwise rotationally symmetric - ! in the meridional Coriolis term of the barotropic solver. - -! === module MOM_mixed_layer_restrat === -MIXEDLAYER_RESTRAT = True ! [Boolean] default = False - ! If true, a density-gradient dependent re-stratifying flow is imposed in the - ! mixed layer. Can be used in ALE mode without restriction but in layer mode can - ! only be used if BULKMIXEDLAYER is true. -FOX_KEMPER_ML_RESTRAT_COEF = 1.0 ! [nondim] default = 0.0 - ! A nondimensional coefficient that is proportional to the ratio of the - ! deformation radius to the dominant lengthscale of the submesoscale mixed layer - ! instabilities, times the minimum of the ratio of the mesoscale eddy kinetic - ! energy to the large-scale geostrophic kinetic energy or 1 plus the square of - ! the grid spacing over the deformation radius, as detailed by Fox-Kemper et al. - ! (2010) -MLE_FRONT_LENGTH = 500.0 ! [m] default = 0.0 - ! If non-zero, is the frontal-length scale used to calculate the upscaling of - ! buoyancy gradients that is otherwise represented by the parameter - ! FOX_KEMPER_ML_RESTRAT_COEF. If MLE_FRONT_LENGTH is non-zero, it is recommended - ! to set FOX_KEMPER_ML_RESTRAT_COEF=1.0. -MLE_USE_PBL_MLD = True ! [Boolean] default = False - ! 
If true, the MLE parameterization will use the mixed-layer depth provided by - ! the active PBL parameterization. If false, MLE will estimate a MLD based on a - ! density difference with the surface using the parameter MLE_DENSITY_DIFF. -MLE_MLD_DECAY_TIME = 2.592E+06 ! [s] default = 0.0 - ! The time-scale for a running-mean filter applied to the mixed-layer depth used - ! in the MLE restratification parameterization. When the MLD deepens below the - ! current running-mean the running-mean is instantaneously set to the current - ! MLD. - -! === module MOM_diabatic_driver === -! The following parameters are used for diabatic processes. -ENERGETICS_SFC_PBL = True ! [Boolean] default = False - ! If true, use an implied energetics planetary boundary layer scheme to - ! determine the diffusivity and viscosity in the surface boundary layer. -EPBL_IS_ADDITIVE = False ! [Boolean] default = True - ! If true, the diffusivity from ePBL is added to all other diffusivities. - ! Otherwise, the larger of kappa-shear and ePBL diffusivities are used. - -! === module MOM_CVMix_KPP === -! This is the MOM wrapper to CVMix:KPP -! See http://cvmix.github.io/ - -! === module MOM_tidal_mixing === -! Vertical Tidal Mixing Parameterization -INT_TIDE_DISSIPATION = True ! [Boolean] default = False - ! If true, use an internal tidal dissipation scheme to drive diapycnal mixing, - ! along the lines of St. Laurent et al. (2002) and Simmons et al. (2004). -INT_TIDE_PROFILE = "POLZIN_09" ! default = "STLAURENT_02" - ! INT_TIDE_PROFILE selects the vertical profile of energy dissipation with - ! INT_TIDE_DISSIPATION. Valid values are: - ! STLAURENT_02 - Use the St. Laurent et al exponential - ! decay profile. - ! POLZIN_09 - Use the Polzin WKB-stretched algebraic - ! decay profile. -INT_TIDE_DECAY_SCALE = 300.3003003003003 ! [m] default = 500.0 - ! The decay scale away from the bottom for tidal TKE with the new coding when - ! INT_TIDE_DISSIPATION is used. -KAPPA_ITIDES = 6.28319E-04 ! 
[m-1] default = 6.283185307179586E-04 - ! A topographic wavenumber used with INT_TIDE_DISSIPATION. The default is 2pi/10 - ! km, as in St.Laurent et al. 2002. -KAPPA_H2_FACTOR = 0.84 ! [nondim] default = 1.0 - ! A scaling factor for the roughness amplitude with INT_TIDE_DISSIPATION. -TKE_ITIDE_MAX = 0.1 ! [W m-2] default = 1000.0 - ! The maximum internal tide energy source available to mix above the bottom - ! boundary layer with INT_TIDE_DISSIPATION. -READ_TIDEAMP = True ! [Boolean] default = False - ! If true, read a file (given by TIDEAMP_FILE) containing the tidal amplitude - ! with INT_TIDE_DISSIPATION. -TIDEAMP_FILE = "tidal_amplitude.v20140616.nc" ! default = "tideamp.nc" - ! The path to the file containing the spatially varying tidal amplitudes with - ! INT_TIDE_DISSIPATION. -H2_FILE = "ocean_topog.nc" ! - ! The path to the file containing the sub-grid-scale topographic roughness - ! amplitude with INT_TIDE_DISSIPATION. - -! === module MOM_CVMix_conv === -! Parameterization of enhanced mixing due to convection via CVMix - -! === module MOM_geothermal === -GEOTHERMAL_SCALE = 1.0 ! [W m-2 or various] default = 0.0 - ! The constant geothermal heat flux, a rescaling factor for the heat flux read - ! from GEOTHERMAL_FILE, or 0 to disable the geothermal heating. -GEOTHERMAL_FILE = "geothermal_davies2013_v1.nc" ! default = "" - ! The file from which the geothermal heating is to be read, or blank to use a - ! constant heating rate. -GEOTHERMAL_VARNAME = "geothermal_hf" ! default = "geo_heat" - ! The name of the geothermal heating variable in GEOTHERMAL_FILE. - -! === module MOM_set_diffusivity === -BBL_MIXING_AS_MAX = False ! [Boolean] default = True - ! If true, take the maximum of the diffusivity from the BBL mixing and the other - ! diffusivities. Otherwise, diffusivity from the BBL_mixing is simply added. -USE_LOTW_BBL_DIFFUSIVITY = True ! [Boolean] default = False - ! If true, uses a simple, imprecise but non-coordinate dependent, model of BBL - ! 
mixing diffusivity based on Law of the Wall. Otherwise, uses the original BBL - ! scheme. -SIMPLE_TKE_TO_KD = True ! [Boolean] default = False - ! If true, uses a simple estimate of Kd/TKE that will work for arbitrary - ! vertical coordinates. If false, calculates Kd/TKE and bounds based on exact - ! energetics for an isopycnal layer-formulation. - -! === module MOM_bkgnd_mixing === -! Adding static vertical background mixing coefficients -KD = 1.5E-05 ! [m2 s-1] - ! The background diapycnal diffusivity of density in the interior. Zero or the - ! molecular value, ~1e-7 m2 s-1, may be used. -KD_MIN = 2.0E-06 ! [m2 s-1] default = 1.5E-07 - ! The minimum diapycnal diffusivity. -HENYEY_IGW_BACKGROUND = True ! [Boolean] default = False - ! If true, use a latitude-dependent scaling for the near surface background - ! diffusivity, as described in Harrison & Hallberg, JPO 2008. -KD_MAX = 0.1 ! [m2 s-1] default = -1.0 - ! The maximum permitted increment for the diapycnal diffusivity from TKE-based - ! parameterizations, or a negative value for no limit. - -! === module MOM_kappa_shear === -! Parameterization of shear-driven turbulence following Jackson, Hallberg and Legg, JPO 2008 -USE_JACKSON_PARAM = True ! [Boolean] default = False - ! If true, use the Jackson-Hallberg-Legg (JPO 2008) shear mixing - ! parameterization. -MAX_RINO_IT = 25 ! [nondim] default = 50 - ! The maximum number of iterations that may be used to estimate the Richardson - ! number driven mixing. -VERTEX_SHEAR = False ! [Boolean] default = False - ! If true, do the calculations of the shear-driven mixing - ! at the cell vertices (i.e., the vorticity points). -KAPPA_SHEAR_ITER_BUG = True ! [Boolean] default = True - ! If true, use an older, dimensionally inconsistent estimate of the derivative - ! of diffusivity with energy in the Newton's method iteration. The bug causes - ! undercorrections when dz > 1 m. -KAPPA_SHEAR_ALL_LAYER_TKE_BUG = True ! [Boolean] default = True - ! 
If true, report back the latest estimate of TKE instead of the time average - ! TKE when there is mass in all layers. Otherwise always report the time - ! averaged TKE, as is currently done when there are some massless layers. - -! === module MOM_CVMix_shear === -! Parameterization of shear-driven turbulence via CVMix (various options) - -! === module MOM_CVMix_ddiff === -! Parameterization of mixing due to double diffusion processes via CVMix - -! === module MOM_diabatic_aux === -! The following parameters are used for auxiliary diabatic processes. -PRESSURE_DEPENDENT_FRAZIL = False ! [Boolean] default = False - ! If true, use a pressure dependent freezing temperature when making frazil. The - ! default is false, which will be faster but is inappropriate with ice-shelf - ! cavities. -VAR_PEN_SW = True ! [Boolean] default = False - ! If true, use one of the CHL_A schemes specified by OPACITY_SCHEME to determine - ! the e-folding depth of incoming short wave radiation. -CHL_FILE = @[CHLCLIM] ! - ! CHL_FILE is the file containing chl_a concentrations in the variable CHL_A. It - ! is used when VAR_PEN_SW and CHL_FROM_FILE are true. -CHL_VARNAME = "chlor_a" ! default = "CHL_A" - ! Name of CHL_A variable in CHL_FILE. - -! === module MOM_energetic_PBL === -ML_OMEGA_FRAC = 0.001 ! [nondim] default = 0.0 - ! When setting the decay scale for turbulence, use this fraction of the absolute - ! rotation rate blended with the local value of f, as sqrt((1-of)*f^2 + - ! of*4*omega^2). -TKE_DECAY = 0.01 ! [nondim] default = 2.5 - ! TKE_DECAY relates the vertical rate of decay of the TKE available for - ! mechanical entrainment to the natural Ekman depth. -EPBL_MSTAR_SCHEME = "OM4" ! default = "CONSTANT" - ! EPBL_MSTAR_SCHEME selects the method for setting mstar. Valid values are: - ! CONSTANT - Use a fixed mstar given by MSTAR - ! OM4 - Use L_Ekman/L_Obukhov in the stabilizing limit, as in OM4 - ! REICHL_H18 - Use the scheme documented in Reichl & Hallberg, 2018. -MSTAR_CAP = 10.0 ! 
[nondim] default = -1.0 - ! If this value is positive, it sets the maximum value of mstar allowed in ePBL. - ! (This is not used if EPBL_MSTAR_SCHEME = CONSTANT). -MSTAR2_COEF1 = 0.29 ! [nondim] default = 0.3 - ! Coefficient in computing mstar when rotation and stabilizing effects are both - ! important (used if EPBL_MSTAR_SCHEME = OM4). -MSTAR2_COEF2 = 0.152 ! [nondim] default = 0.085 - ! Coefficient in computing mstar when only rotation limits the total mixing - ! (used if EPBL_MSTAR_SCHEME = OM4) -NSTAR = 0.06 ! [nondim] default = 0.2 - ! The portion of the buoyant potential energy imparted by surface fluxes that is - ! available to drive entrainment at the base of mixed layer when that energy is - ! positive. -EPBL_MLD_BISECTION = True ! [Boolean] default = False - ! If true, use bisection with the iterative determination of the self-consistent - ! mixed layer depth. Otherwise use the false position after a maximum and - ! minimum bound have been evaluated and the returned value or bisection before - ! this. -MSTAR_CONV_ADJ = 0.667 ! [nondim] default = 0.0 - ! Coefficient used for reducing mstar during convection due to reduction of - ! stable density gradient. -USE_MLD_ITERATION = True ! [Boolean] default = False - ! A logical that specifies whether or not to use the distance to the bottom of - ! the actively turbulent boundary layer to help set the EPBL length scale. -EPBL_TRANSITION_SCALE = 0.01 ! [nondim] default = 0.1 - ! A scale for the mixing length in the transition layer at the edge of the - ! boundary layer as a fraction of the boundary layer thickness. -MIX_LEN_EXPONENT = 1.0 ! [nondim] default = 2.0 - ! The exponent applied to the ratio of the distance to the MLD and the MLD depth - ! which determines the shape of the mixing length. This is only used if - ! USE_MLD_ITERATION is True. -USE_LA_LI2016 = @[MOM6_USE_LI2016] ! [nondim] default = False - ! A logical to use the Li et al. 2016 (submitted) formula to determine the - ! Langmuir number. 
-USE_WAVES = @[MOM6_USE_WAVES] ! [Boolean] default = False - ! If true, enables surface wave modules. -WAVE_METHOD = "SURFACE_BANDS" ! default = "EMPTY" - ! Choice of wave method, valid options include: - ! TEST_PROFILE - Prescribed from surface Stokes drift - ! and a decay wavelength. - ! SURFACE_BANDS - Computed from multiple surface values - ! and decay wavelengths. - ! DHH85 - Uses Donelan et al. 1985 empirical - ! wave spectrum with prescribed values. - ! LF17 - Infers Stokes drift profile from wind - ! speed following Li and Fox-Kemper 2017. -SURFBAND_SOURCE = "COUPLER" ! default = "EMPTY" - ! Choice of SURFACE_BANDS data mode, valid options include: - ! DATAOVERRIDE - Read from NetCDF using FMS DataOverride. - ! COUPLER - Look for variables from coupler pass - ! INPUT - Testing with fixed values. -STK_BAND_COUPLER = 3 ! default = 1 - ! STK_BAND_COUPLER is the number of Stokes drift bands in the coupler. This has - ! to be consistent with the number of Stokes drift bands in WW3, or the model - ! will fail. -SURFBAND_WAVENUMBERS = 0.04, 0.11, 0.3305 ! [rad/m] default = 0.12566 - ! Central wavenumbers for surface Stokes drift bands. -EPBL_LANGMUIR_SCHEME = "ADDITIVE" ! default = "NONE" - ! EPBL_LANGMUIR_SCHEME selects the method for including Langmuir turbulence. - ! Valid values are: - ! NONE - Do not do any extra mixing due to Langmuir turbulence - ! RESCALE - Use a multiplicative rescaling of mstar to account for Langmuir - ! turbulence - ! ADDITIVE - Add a Langmuir turbulence contribution to mstar to other - ! contributions -LT_ENHANCE_COEF = 0.044 ! [nondim] default = 0.447 - ! Coefficient for Langmuir enhancement of mstar -LT_ENHANCE_EXP = -1.5 ! [nondim] default = -1.33 - ! Exponent for Langmuir enhancement of mstar -LT_MOD_LAC1 = 0.0 ! [nondim] default = -0.87 - ! Coefficient for modification of Langmuir number due to MLD approaching Ekman - ! depth. -LT_MOD_LAC4 = 0.0 ! [nondim] default = 0.95 - ! 
Coefficient for modification of Langmuir number due to ratio of Ekman to - ! stable Obukhov depth. -LT_MOD_LAC5 = 0.22 ! [nondim] default = 0.95 - ! Coefficient for modification of Langmuir number due to ratio of Ekman to - ! unstable Obukhov depth. - -! === module MOM_regularize_layers === - -! === module MOM_opacity === -PEN_SW_NBANDS = 3 ! default = 1 - ! The number of bands of penetrating shortwave radiation. - -! === module MOM_tracer_advect === -TRACER_ADVECTION_SCHEME = "PPM:H3" ! default = "PLM" - ! The horizontal transport scheme for tracers: - ! PLM - Piecewise Linear Method - ! PPM:H3 - Piecewise Parabolic Method (Huynh 3rd order) - ! PPM - Piecewise Parabolic Method (Colella-Woodward) - -! === module MOM_tracer_hor_diff === -CHECK_DIFFUSIVE_CFL = True ! [Boolean] default = False - ! If true, use enough iterations of the diffusion to ensure that the diffusive - ! equivalent of the CFL limit is not violated. If false, always use the greater - ! of 1 or MAX_TR_DIFFUSION_CFL iterations. - -! === module MOM_neutral_diffusion === -! This module implements neutral diffusion of tracers - -! === module MOM_lateral_boundary_diffusion === -! This module implements lateral diffusion of tracers near boundaries - -! === module MOM_sum_output === -MAXTRUNC = 100000 ! [truncations save_interval-1] default = 0 - ! The run will be stopped, and the day set to a very large value if the velocity - ! is truncated more than MAXTRUNC times between energy saves. Set MAXTRUNC to 0 - ! to stop if there is any truncation of velocities. -ENERGYSAVEDAYS = 1.00 ! [days] default = 1.0 - ! The interval in units of TIMEUNIT between saves of the energies of the run and - ! other globally summed diagnostics. - -! === module ocean_model_init === - -! === module MOM_oda_incupd === -ODA_INCUPD = @[ODA_INCUPD] ! [Boolean] default = False - ! If true, oda incremental updates will be applied - ! everywhere in the domain. -ODA_INCUPD_FILE = "mom6_increment.nc" ! 
The name of the file with the T,S,h increments. - -ODA_TEMPINC_VAR = "Temp" ! default = "ptemp_inc" - ! The name of the potential temperature inc. variable in - ! ODA_INCUPD_FILE. -ODA_SALTINC_VAR = "Salt" ! default = "sal_inc" - ! The name of the salinity inc. variable in - ! ODA_INCUPD_FILE. -ODA_THK_VAR = "h" ! default = "h" - ! The name of the int. depth inc. variable in - ! ODA_INCUPD_FILE. -ODA_INCUPD_UV = true ! -ODA_UINC_VAR = "u" ! default = "u_inc" - ! The name of the zonal vel. inc. variable in - ! ODA_INCUPD_UV_FILE. -ODA_VINC_VAR = "v" ! default = "v_inc" - ! The name of the meridional vel. inc. variable in - ! ODA_INCUPD_UV_FILE. -ODA_INCUPD_NHOURS = @[ODA_INCUPD_NHOURS] ! default=3.0 - -! === module MOM_surface_forcing === -OCEAN_SURFACE_STAGGER = "A" ! default = "C" - ! A case-insensitive character string to indicate the - ! staggering of the surface velocity field that is - ! returned to the coupler. Valid values include - ! 'A', 'B', or 'C'. - -MAX_P_SURF = 0.0 ! [Pa] default = -1.0 - ! The maximum surface pressure that can be exerted by the atmosphere and - ! floating sea-ice or ice shelves. This is needed because the FMS coupling - ! structure does not limit the water that can be frozen out of the ocean and the - ! ice-ocean heat fluxes are treated explicitly. No limit is applied if a - ! negative value is used. -WIND_STAGGER = "A" ! default = "C" - ! A case-insensitive character string to indicate the - ! staggering of the input wind stress field. Valid - ! values are 'A', 'B', or 'C'. -CD_TIDES = 0.0018 ! [nondim] default = 1.0E-04 - ! The drag coefficient that applies to the tides. -GUST_CONST = 0.0 ! [Pa] default = 0.02 - ! The background gustiness in the winds. -FIX_USTAR_GUSTLESS_BUG = False ! [Boolean] default = False - ! If true, correct a bug in the time-averaging of the gustless wind friction - ! velocity. -USE_RIGID_SEA_ICE = True ! [Boolean] default = False - ! 
If true, sea-ice is rigid enough to exert a nonhydrostatic pressure that - ! resists vertical motion. -SEA_ICE_RIGID_MASS = 100.0 ! [kg m-2] default = 1000.0 - ! The mass of sea-ice per unit area at which the sea-ice starts to exhibit - ! rigidity. -LIQUID_RUNOFF_FROM_DATA = @[MOM6_RIVER_RUNOFF] ! [Boolean] default = False - ! If true, allows liquid river runoff to be specified via - ! the data_table using the component name 'OCN'. -! === module ocean_stochastics === -DO_SPPT = @[DO_OCN_SPPT] ! [Boolean] default = False - ! If true, perturb the diabatic tendencies in MOM_diabatic_driver -PERT_EPBL = @[PERT_EPBL] ! [Boolean] default = False - ! If true, perturb the KE dissipation and destruction in MOM_energetic_PBL -! === module MOM_restart === -RESTART_CHECKSUMS_REQUIRED = False -! === module MOM_file_parser === diff --git a/parm/ufs/mom6/MOM_input_template_050 b/parm/ufs/mom6/MOM_input_template_050 deleted file mode 100644 index 4c39198c02..0000000000 --- a/parm/ufs/mom6/MOM_input_template_050 +++ /dev/null @@ -1,947 +0,0 @@ -! This input file provides the adjustable run-time parameters for version 6 of the Modular Ocean Model (MOM6). -! Where appropriate, parameters are usually given in MKS units. - -! This particular file is for the example in ice_ocean_SIS2/OM4_05. - -! This MOM_input file typically contains only the non-default values that are needed to reproduce this example. -! A full list of parameters for this example can be found in the corresponding MOM_parameter_doc.all file -! which is generated by the model at run-time. -! === module MOM_domains === -TRIPOLAR_N = True ! [Boolean] default = False - ! Use tripolar connectivity at the northern edge of the domain. With - ! TRIPOLAR_N, NIGLOBAL must be even. -NIGLOBAL = @[NX_GLB] ! - ! The total number of thickness grid points in the x-direction in the physical - ! domain. With STATIC_MEMORY_ this is set in MOM_memory.h at compile time. -NJGLOBAL = @[NY_GLB] ! - ! 
The total number of thickness grid points in the y-direction in the physical - ! domain. With STATIC_MEMORY_ this is set in MOM_memory.h at compile time. -NIHALO = 4 ! default = 4 - ! The number of halo points on each side in the x-direction. With - ! STATIC_MEMORY_ this is set as NIHALO_ in MOM_memory.h at compile time; without - ! STATIC_MEMORY_ the default is NIHALO_ in MOM_memory.h (if defined) or 2. -NJHALO = 4 ! default = 4 - ! The number of halo points on each side in the y-direction. With - ! STATIC_MEMORY_ this is set as NJHALO_ in MOM_memory.h at compile time; without - ! STATIC_MEMORY_ the default is NJHALO_ in MOM_memory.h (if defined) or 2. -! LAYOUT = 21, 20 ! - ! The processor layout that was actually used. -! IO_LAYOUT = 1, 1 ! default = 1 - ! The processor layout to be used, or 0,0 to automatically set the io_layout to - ! be the same as the layout. - -! === module MOM === -USE_REGRIDDING = True ! [Boolean] default = False - ! If True, use the ALE algorithm (regridding/remapping). If False, use the - ! layered isopycnal algorithm. -THICKNESSDIFFUSE = True ! [Boolean] default = False - ! If true, interface heights are diffused with a coefficient of KHTH. -THICKNESSDIFFUSE_FIRST = True ! [Boolean] default = False - ! If true, do thickness diffusion before dynamics. This is only used if - ! THICKNESSDIFFUSE is true. -DT = @[DT_DYNAM_MOM6] ! [s] - ! The (baroclinic) dynamics time step. The time-step that is actually used will - ! be an integer fraction of the forcing time-step (DT_FORCING in ocean-only mode - ! or the coupling timestep in coupled mode.) -DT_THERM = @[DT_THERM_MOM6] ! [s] default = 1800.0 - ! The thermodynamic and tracer advection time step. Ideally DT_THERM should be - ! an integer multiple of DT and less than the forcing or coupling time-step, - ! unless THERMO_SPANS_COUPLING is true, in which case DT_THERM can be an integer - ! multiple of the coupling timestep. By default DT_THERM is set to DT. 
-THERMO_SPANS_COUPLING = @[MOM6_THERMO_SPAN] ! [Boolean] default = False - ! If true, the MOM will take thermodynamic and tracer timesteps that can be - ! longer than the coupling timestep. The actual thermodynamic timestep that is - ! used in this case is the largest integer multiple of the coupling timestep - ! that is less than or equal to DT_THERM. -HFREEZE = 20.0 ! [m] default = -1.0 - ! If HFREEZE > 0, melt potential will be computed. The actual depth - ! over which melt potential is computed will be min(HFREEZE, OBLD) - ! where OBLD is the boundary layer depth. If HFREEZE <= 0 (default) - ! melt potential will not be computed. -USE_PSURF_IN_EOS = False ! [Boolean] default = False - ! If true, always include the surface pressure contributions in equation of - ! state calculations. -FRAZIL = True ! [Boolean] default = False - ! If true, water freezes if it gets too cold, and the accumulated heat deficit - ! is returned in the surface state. FRAZIL is only used if - ! ENABLE_THERMODYNAMICS is true. -DO_GEOTHERMAL = True ! [Boolean] default = False - ! If true, apply geothermal heating. -BOUND_SALINITY = True ! [Boolean] default = False - ! If true, limit salinity to being positive. (The sea-ice model may ask for more - ! salt than is available and drive the salinity negative otherwise.) -MIN_SALINITY = 0.01 ! [PPT] default = 0.01 - ! The minimum value of salinity when BOUND_SALINITY=True. The default is 0.01 - ! for backward compatibility but ideally should be 0. -C_P = 3992.0 ! [J kg-1 K-1] default = 3991.86795711963 - ! The heat capacity of sea water, approximated as a constant. This is only used - ! if ENABLE_THERMODYNAMICS is true. The default value is from the TEOS-10 - ! definition of conservative temperature. -CHECK_BAD_SURFACE_VALS = True ! [Boolean] default = False - ! If true, check the surface state for ridiculous values. -BAD_VAL_SSH_MAX = 50.0 ! [m] default = 20.0 - ! The value of SSH above which a bad value message is triggered, if - ! 
CHECK_BAD_SURFACE_VALS is true. -BAD_VAL_SSS_MAX = 75.0 ! [PPT] default = 45.0 - ! The value of SSS above which a bad value message is triggered, if - ! CHECK_BAD_SURFACE_VALS is true. -BAD_VAL_SST_MAX = 55.0 ! [deg C] default = 45.0 - ! The value of SST above which a bad value message is triggered, if - ! CHECK_BAD_SURFACE_VALS is true. -BAD_VAL_SST_MIN = -3.0 ! [deg C] default = -2.1 - ! The value of SST below which a bad value message is triggered, if - ! CHECK_BAD_SURFACE_VALS is true. -DEFAULT_2018_ANSWERS = True ! [Boolean] default = True - ! This sets the default value for the various _2018_ANSWERS parameters. -WRITE_GEOM = 2 ! default = 1 - ! If =0, never write the geometry and vertical grid files. If =1, write the - ! geometry and vertical grid files only for a new simulation. If =2, always - ! write the geometry and vertical grid files. Other values are invalid. -SAVE_INITIAL_CONDS = False ! [Boolean] default = False - ! If true, write the initial conditions to a file given by IC_OUTPUT_FILE. - -! === module MOM_hor_index === -! Sets the horizontal array index types. - -! === module MOM_fixed_initialization === -INPUTDIR = "INPUT" ! default = "." - ! The directory in which input files are found. - -! === module MOM_grid_init === -GRID_CONFIG = "mosaic" ! - ! A character string that determines the method for defining the horizontal - ! grid. Current options are: - ! mosaic - read the grid from a mosaic (supergrid) - ! file set by GRID_FILE. - ! cartesian - use a (flat) Cartesian grid. - ! spherical - use a simple spherical grid. - ! mercator - use a Mercator spherical grid. -GRID_FILE = "ocean_hgrid.nc" ! - ! Name of the file from which to read horizontal grid data. -GRID_ROTATION_ANGLE_BUGS = False ! [Boolean] default = True - ! If true, use an older algorithm to calculate the sines and - ! cosines needed to rotate between grid-oriented directions and - ! true north and east. Differences arise at the tripolar fold. -USE_TRIPOLAR_GEOLONB_BUG = False ! 
[Boolean] default = True - ! If true, use older code that incorrectly sets the longitude in some points - ! along the tripolar fold to be off by 360 degrees. -TOPO_CONFIG = "file" ! - ! This specifies how bathymetry is specified: - ! file - read bathymetric information from the file - ! specified by (TOPO_FILE). - ! flat - flat bottom set to MAXIMUM_DEPTH. - ! bowl - an analytically specified bowl-shaped basin - ! ranging between MAXIMUM_DEPTH and MINIMUM_DEPTH. - ! spoon - a similar shape to 'bowl', but with a vertical - ! wall at the southern face. - ! halfpipe - a zonally uniform channel with a half-sine - ! profile in the meridional direction. - ! benchmark - use the benchmark test case topography. - ! Neverland - use the Neverland test case topography. - ! DOME - use a slope and channel configuration for the - ! DOME sill-overflow test case. - ! ISOMIP - use a slope and channel configuration for the - ! ISOMIP test case. - ! DOME2D - use a shelf and slope configuration for the - ! DOME2D gravity current/overflow test case. - ! Kelvin - flat but with rotated land mask. - ! seamount - Gaussian bump for spontaneous motion test case. - ! dumbbell - Sloshing channel with reservoirs on both ends. - ! shelfwave - exponential slope for shelfwave test case. - ! Phillips - ACC-like idealized topography used in the Phillips config. - ! dense - Denmark Strait-like dense water formation and overflow. - ! USER - call a user modified routine. -TOPO_FILE = "ocean_topog.nc" ! default = "topog.nc" - ! The file from which the bathymetry is read. -ALLOW_LANDMASK_CHANGES = @[MOM6_ALLOW_LANDMASK_CHANGES] ! default = "False" - ! If true, allow topography overrides to change ocean points to land. -MAXIMUM_DEPTH = 6500.0 ! [m] - ! The maximum depth of the ocean. -MINIMUM_DEPTH = 9.5 ! [m] default = 0.0 - ! If MASKING_DEPTH is unspecified, then anything shallower than MINIMUM_DEPTH is - ! assumed to be land and all fluxes are masked out. If MASKING_DEPTH is - ! 
specified, then all depths shallower than MINIMUM_DEPTH but deeper than - ! MASKING_DEPTH are rounded to MINIMUM_DEPTH. - -! === module MOM_open_boundary === -! Controls where open boundaries are located, what kind of boundary condition to impose, and what data to apply, -! if any. -MASKING_DEPTH = 0.0 ! [m] default = -9999.0 - ! The depth below which to mask points as land points, for which all fluxes are - ! zeroed out. MASKING_DEPTH is ignored if negative. -CHANNEL_CONFIG = "list" ! default = "none" - ! A parameter that determines which set of channels are - ! restricted to specific widths. Options are: - ! none - All channels have the grid width. - ! global_1deg - Sets 16 specific channels appropriate - ! for a 1-degree model, as used in CM2G. - ! list - Read the channel locations and widths from a - ! text file, like MOM_channel_list in the MOM_SIS - ! test case. - ! file - Read open face widths everywhere from a - ! NetCDF file on the model grid. -CHANNEL_LIST_FILE = "MOM_channels_global_025" ! default = "MOM_channel_list" - ! The file from which the list of narrowed channels is read. - -! === module MOM_verticalGrid === -! Parameters providing information about the vertical grid. -NK = 75 ! [nondim] - ! The number of model layers. - -! === module MOM_tracer_registry === - -! === module MOM_EOS === -DTFREEZE_DP = -7.75E-08 ! [deg C Pa-1] default = 0.0 - ! When TFREEZE_FORM=LINEAR, this is the derivative of the freezing potential - ! temperature with pressure. - -! === module MOM_restart === -PARALLEL_RESTARTFILES = True ! [Boolean] default = False - ! If true, each processor writes its own restart file, otherwise a single - ! restart file is generated - -! === module MOM_tracer_flow_control === -USE_IDEAL_AGE_TRACER = False ! [Boolean] default = False - ! If true, use the ideal_age_example tracer package. - -! === module ideal_age_example === - -! === module MOM_coord_initialization === -COORD_CONFIG = "file" ! - ! 
This specifies how layers are to be defined: - ! ALE or none - used to avoid defining layers in ALE mode - ! file - read coordinate information from the file - ! specified by (COORD_FILE). - ! BFB - Custom coords for buoyancy-forced basin case - ! based on SST_S, T_BOT and DRHO_DT. - ! linear - linear based on interfaces not layers - ! layer_ref - linear based on layer densities - ! ts_ref - use reference temperature and salinity - ! ts_range - use range of temperature and salinity - ! (T_REF and S_REF) to determine surface density - ! and GINT to calculate internal densities. - ! gprime - use reference density (RHO_0) for surface - ! density and GINT to calculate internal densities. - ! ts_profile - use temperature and salinity profiles - ! (read from COORD_FILE) to set layer densities. - ! USER - call a user modified routine. -COORD_FILE = "layer_coord.nc" ! - ! The file from which the coordinate densities are read. -REMAP_UV_USING_OLD_ALG = True ! [Boolean] default = True - ! If true, uses the old remapping-via-a-delta-z method for remapping u and v. If - ! false, uses the new method that remaps between grids described by an old and - ! new thickness. -REGRIDDING_COORDINATE_MODE = "HYCOM1" ! default = "LAYER" - ! Coordinate mode for vertical regridding. Choose among the following - ! possibilities: LAYER - Isopycnal or stacked shallow water layers - ! ZSTAR, Z* - stretched geopotential z* - ! SIGMA_SHELF_ZSTAR - stretched geopotential z* ignoring shelf - ! SIGMA - terrain following coordinates - ! RHO - continuous isopycnal - ! HYCOM1 - HyCOM-like hybrid coordinate - ! SLIGHT - stretched coordinates above continuous isopycnal - ! ADAPTIVE - optimize for smooth neutral density surfaces -BOUNDARY_EXTRAPOLATION = True ! [Boolean] default = False - ! When defined, a proper high-order reconstruction scheme is used within - ! boundary cells rather than PCM. E.g., if PPM is used for remapping, a PPM - ! reconstruction will also be used within boundary cells. 
-ALE_COORDINATE_CONFIG = "HYBRID:hycom1_75_800m.nc,sigma2,FNC1:2,4000,4.5,.01" ! default = "UNIFORM" - ! Determines how to specify the coordinate resolution. Valid options are: - ! PARAM - use the vector-parameter ALE_RESOLUTION - ! UNIFORM[:N] - uniformly distributed - ! FILE:string - read from a file. The string specifies - ! the filename and variable name, separated - ! by a comma or space, e.g. FILE:lev.nc,dz - ! or FILE:lev.nc,interfaces=zw - ! WOA09[:N] - the WOA09 vertical grid (approximately) - ! FNC1:string - FNC1:dz_min,H_total,power,precision - ! HYBRID:string - read from a file. The string specifies - ! the filename and two variable names, separated - ! by a comma or space, for sigma-2 and dz. e.g. - ! HYBRID:vgrid.nc,sigma2,dz -!ALE_RESOLUTION = 7*2.0, 2*2.01, 2.02, 2.03, 2.05, 2.08, 2.11, 2.15, 2.21, 2.2800000000000002, 2.37, 2.48, 2.61, 2.77, 2.95, 3.17, 3.4299999999999997, 3.74, 4.09, 4.49, 4.95, 5.48, 6.07, 6.74, 7.5, 8.34, 9.280000000000001, 10.33, 11.49, 12.77, 14.19, 15.74, 17.450000000000003, 19.31, 21.35, 23.56, 25.97, 28.580000000000002, 31.41, 34.47, 37.77, 41.32, 45.14, 49.25, 53.65, 58.370000000000005, 63.42, 68.81, 74.56, 80.68, 87.21000000000001, 94.14, 101.51, 109.33, 117.62, 126.4, 135.68, 145.5, 155.87, 166.81, 178.35, 190.51, 203.31, 216.78, 230.93, 245.8, 261.42, 277.83 ! [m] - ! The distribution of vertical resolution for the target - ! grid used for Eulerian-like coordinates. For example, - ! in z-coordinate mode, the parameter is a list of level - ! thicknesses (in m). In sigma-coordinate mode, the list - ! is of non-dimensional fractions of the water column. 
-!TARGET_DENSITIES = 1010.0, 1014.3034, 1017.8088, 1020.843, 1023.5566, 1025.813, 1027.0275, 1027.9114, 1028.6422, 1029.2795, 1029.852, 1030.3762, 1030.8626, 1031.3183, 1031.7486, 1032.1572, 1032.5471, 1032.9207, 1033.2798, 1033.6261, 1033.9608, 1034.2519, 1034.4817, 1034.6774, 1034.8508, 1035.0082, 1035.1533, 1035.2886, 1035.4159, 1035.5364, 1035.6511, 1035.7608, 1035.8661, 1035.9675, 1036.0645, 1036.1554, 1036.2411, 1036.3223, 1036.3998, 1036.4739, 1036.5451, 1036.6137, 1036.68, 1036.7441, 1036.8062, 1036.8526, 1036.8874, 1036.9164, 1036.9418, 1036.9647, 1036.9857, 1037.0052, 1037.0236, 1037.0409, 1037.0574, 1037.0738, 1037.0902, 1037.1066, 1037.123, 1037.1394, 1037.1558, 1037.1722, 1037.1887, 1037.206, 1037.2241, 1037.2435, 1037.2642, 1037.2866, 1037.3112, 1037.3389, 1037.3713, 1037.4118, 1037.475, 1037.6332, 1037.8104, 1038.0 ! [m] - ! HYBRID target densities for interfaces -REGRID_COMPRESSIBILITY_FRACTION = 0.01 ! [nondim] default = 0.0 - ! When interpolating potential density profiles we can add some artificial - ! compressibility solely to make homogeneous regions appear stratified. -MAXIMUM_INT_DEPTH_CONFIG = "FNC1:5,8000.0,1.0,.01" ! default = "NONE" - ! Determines how to specify the maximum interface depths. - ! Valid options are: - ! NONE - there are no maximum interface depths - ! PARAM - use the vector-parameter MAXIMUM_INTERFACE_DEPTHS - ! FILE:string - read from a file. The string specifies - ! the filename and variable name, separated - ! by a comma or space, e.g. FILE:lev.nc,Z - ! 
FNC1:string - FNC1:dz_min,H_total,power,precision -!MAXIMUM_INT_DEPTHS = 0.0, 5.0, 12.75, 23.25, 36.49, 52.480000000000004, 71.22, 92.71000000000001, 116.94000000000001, 143.92000000000002, 173.65, 206.13, 241.36, 279.33000000000004, 320.05000000000007, 363.5200000000001, 409.7400000000001, 458.7000000000001, 510.4100000000001, 564.8700000000001, 622.0800000000002, 682.0300000000002, 744.7300000000002, 810.1800000000003, 878.3800000000003, 949.3300000000004, 1023.0200000000004, 1099.4600000000005, 1178.6500000000005, 1260.5900000000006, 1345.2700000000007, 1432.7000000000007, 1522.8800000000008, 1615.8100000000009, 1711.490000000001, 1809.910000000001, 1911.080000000001, 2015.0000000000011, 2121.670000000001, 2231.080000000001, 2343.2400000000007, 2458.1500000000005, 2575.8100000000004, 2696.2200000000003, 2819.3700000000003, 2945.2700000000004, 3073.9200000000005, 3205.3200000000006, 3339.4600000000005, 3476.3500000000004, 3615.9900000000002, 3758.38, 3903.52, 4051.4, 4202.03, 4355.41, 4511.54, 4670.41, 4832.03, 4996.4, 5163.5199999999995, 5333.379999999999, 5505.989999999999, 5681.3499999999985, 5859.459999999998, 6040.319999999998, 6223.919999999998, 6410.269999999999, 6599.369999999999, 6791.219999999999, 6985.8099999999995, 7183.15, 7383.24, 7586.08, 7791.67, 8000.0 - ! The list of maximum depths for each interface. -MAX_LAYER_THICKNESS_CONFIG = "FNC1:400,31000.0,0.1,.01" ! default = "NONE" - ! Determines how to specify the maximum layer thicknesses. - ! Valid options are: - ! NONE - there are no maximum layer thicknesses - ! PARAM - use the vector-parameter MAX_LAYER_THICKNESS - ! FILE:string - read from a file. The string specifies - ! the filename and variable name, separated - ! by a comma or space, e.g. FILE:lev.nc,Z - ! 
FNC1:string - FNC1:dz_min,H_total,power,precision -!MAX_LAYER_THICKNESS = 400.0, 409.63, 410.32, 410.75, 411.07, 411.32, 411.52, 411.7, 411.86, 412.0, 412.13, 412.24, 412.35, 412.45, 412.54, 412.63, 412.71, 412.79, 412.86, 412.93, 413.0, 413.06, 413.12, 413.18, 413.24, 413.29, 413.34, 413.39, 413.44, 413.49, 413.54, 413.58, 413.62, 413.67, 413.71, 413.75, 413.78, 413.82, 413.86, 413.9, 413.93, 413.97, 414.0, 414.03, 414.06, 414.1, 414.13, 414.16, 414.19, 414.22, 414.24, 414.27, 414.3, 414.33, 414.35, 414.38, 414.41, 414.43, 414.46, 414.48, 414.51, 414.53, 414.55, 414.58, 414.6, 414.62, 414.65, 414.67, 414.69, 414.71, 414.73, 414.75, 414.77, 414.79, 414.83 ! [m] - ! The list of maximum thickness for each layer. -REMAPPING_SCHEME = "PPM_H4" ! default = "PLM" - ! This sets the reconstruction scheme used for vertical remapping for all - ! variables. It can be one of the following schemes: PCM (1st-order - ! accurate) - ! PLM (2nd-order accurate) - ! PPM_H4 (3rd-order accurate) - ! PPM_IH4 (3rd-order accurate) - ! PQM_IH4IH3 (4th-order accurate) - ! PQM_IH6IH5 (5th-order accurate) - -! === module MOM_grid === -! Parameters providing information about the lateral grid. - -! === module MOM_state_initialization === -INIT_LAYERS_FROM_Z_FILE = True ! [Boolean] default = False - ! If true, initialize the layer thicknesses, temperatures, and salinities from a - ! Z-space file on a latitude-longitude grid. - -! === module MOM_initialize_layers_from_Z === -TEMP_SALT_Z_INIT_FILE = "MOM6_IC_TS.nc" ! default = "temp_salt_z.nc" - ! The name of the z-space input file used to initialize - ! temperatures (T) and salinities (S). If T and S are not - ! in the same file, TEMP_Z_INIT_FILE and SALT_Z_INIT_FILE - ! must be set. -Z_INIT_FILE_PTEMP_VAR = "temp" ! default = "ptemp" - ! The name of the potential temperature variable in - ! TEMP_Z_INIT_FILE. -Z_INIT_FILE_SALT_VAR = "salt" ! default = "salt" - ! The name of the salinity variable in - ! SALT_Z_INIT_FILE. 
- -Z_INIT_ALE_REMAPPING = True ! [Boolean] default = False - ! If True, then remap straight to model coordinate from file. -Z_INIT_REMAP_OLD_ALG = True ! [Boolean] default = True - ! If false, uses the preferred remapping algorithm for initialization. If true, - ! use an older, less robust algorithm for remapping. - -! === module MOM_diag_mediator === -!Jiande NUM_DIAG_COORDS = 2 ! default = 1 -NUM_DIAG_COORDS = 1 ! default = 1 - ! The number of diagnostic vertical coordinates to use. - ! For each coordinate, an entry in DIAG_COORDS must be provided. -!Jiande DIAG_COORDS = "z Z ZSTAR", "rho2 RHO2 RHO" ! -DIAG_COORDS = "z Z ZSTAR" - ! A list of string tuples associating diag_table modules to - ! a coordinate definition used for diagnostics. Each string - ! is of the form "MODULE_SUFFIX,PARAMETER_SUFFIX,COORDINATE_NAME". -DIAG_COORD_DEF_Z="FILE:interpolate_zgrid_40L.nc,interfaces=zw" -DIAG_MISVAL = -1e34 -!DIAG_COORD_DEF_RHO2 = "RFNC1:35,999.5,1028,1028.5,8.,1038.,0.0078125" ! default = "WOA09" - ! Determines how to specify the coordinate resolution. Valid options are: - ! PARAM - use the vector-parameter DIAG_COORD_RES_RHO2 - ! UNIFORM[:N] - uniformly distributed - ! FILE:string - read from a file. The string specifies - ! the filename and variable name, separated - ! by a comma or space, e.g. FILE:lev.nc,dz - ! or FILE:lev.nc,interfaces=zw - ! WOA09[:N] - the WOA09 vertical grid (approximately) - ! FNC1:string - FNC1:dz_min,H_total,power,precision - ! HYBRID:string - read from a file. The string specifies - ! the filename and two variable names, separated - ! by a comma or space, for sigma-2 and dz. e.g. - ! HYBRID:vgrid.nc,sigma2,dz - -! === module MOM_MEKE === -USE_MEKE = True ! [Boolean] default = False - ! If true, turns on the MEKE scheme which calculates a sub-grid mesoscale eddy - ! kinetic energy budget. -MEKE_GMCOEFF = 1.0 ! [nondim] default = -1.0 - ! The efficiency of the conversion of potential energy into MEKE by the - ! 
thickness mixing parameterization. If MEKE_GMCOEFF is negative, this - ! conversion is not used or calculated. -MEKE_BGSRC = 1.0E-13 ! [W kg-1] default = 0.0 - ! A background energy source for MEKE. -MEKE_KHTH_FAC = 0.5 ! [nondim] default = 0.0 - ! A factor that maps MEKE%Kh to KhTh. -MEKE_KHTR_FAC = 0.5 ! [nondim] default = 0.0 - ! A factor that maps MEKE%Kh to KhTr. -MEKE_KHMEKE_FAC = 1.0 ! [nondim] default = 0.0 - ! A factor that maps MEKE%Kh to Kh for MEKE itself. -MEKE_VISCOSITY_COEFF_KU = 1.0 ! [nondim] default = 0.0 - ! If non-zero, is the scaling coefficient in the expression for viscosity used to - ! parameterize harmonic lateral momentum mixing by unresolved eddies represented - ! by MEKE. Can be negative to represent backscatter from the unresolved eddies. -MEKE_ALPHA_RHINES = 0.15 ! [nondim] default = 0.05 - ! If positive, is a coefficient weighting the Rhines scale in the expression for - ! mixing length used in MEKE-derived diffusivity. -MEKE_ALPHA_EADY = 0.15 ! [nondim] default = 0.05 - ! If positive, is a coefficient weighting the Eady length scale in the - ! expression for mixing length used in MEKE-derived diffusivity. - -! === module MOM_lateral_mixing_coeffs === -USE_VARIABLE_MIXING = True ! [Boolean] default = False - ! If true, the variable mixing code will be called. This allows diagnostics to - ! be created even if the scheme is not used. If KHTR_SLOPE_CFF>0 or - ! KhTh_Slope_Cff>0, this is set to true regardless of what is in the parameter - ! file. -RESOLN_SCALED_KH = True ! [Boolean] default = False - ! If true, the Laplacian lateral viscosity is scaled away when the first - ! baroclinic deformation radius is well resolved. -RESOLN_SCALED_KHTH = True ! [Boolean] default = False - ! If true, the interface depth diffusivity is scaled away when the first - ! baroclinic deformation radius is well resolved. -KHTH_USE_EBT_STRUCT = True ! [Boolean] default = False - ! 
If true, uses the equivalent barotropic structure as the vertical structure of - ! thickness diffusivity. -KHTR_SLOPE_CFF = 0.25 ! [nondim] default = 0.0 - ! The nondimensional coefficient in the Visbeck formula for the epipycnal tracer - ! diffusivity -USE_STORED_SLOPES = True ! [Boolean] default = False - ! If true, the isopycnal slopes are calculated once and stored for re-use. This - ! uses more memory but avoids calling the equation of state more times than - ! should be necessary. -KH_RES_FN_POWER = 100 ! [nondim] default = 2 - ! The power of dx/Ld in the Kh resolution function. Any positive integer may be - ! used, although even integers are more efficient to calculate. Setting this - ! greater than 100 results in a step-function being used. -INTERPOLATE_RES_FN = False ! [Boolean] default = True - ! If true, interpolate the resolution function to the velocity points from the - ! thickness points; otherwise interpolate the wave speed and calculate the - ! resolution function independently at each point. -GILL_EQUATORIAL_LD = True ! [Boolean] default = False - ! If true, uses Gill's definition of the baroclinic equatorial deformation - ! radius, otherwise, if false, use Pedlosky's definition. These definitions - ! differ by a factor of 2 in front of the beta term in the denominator. Gill's - ! is the more appropriate definition. -INTERNAL_WAVE_SPEED_BETTER_EST = False ! [Boolean] default = True - ! If true, use a more robust estimate of the first mode wave speed as the - ! starting point for iterations. - -! === module MOM_set_visc === -CHANNEL_DRAG = True ! [Boolean] default = False - ! If true, the bottom drag is exerted directly on each layer proportional to the - ! fraction of the bottom it overlies. -PRANDTL_TURB = 1.25 ! [nondim] default = 1.0 - ! The turbulent Prandtl number applied to shear instability. -HBBL = 10.0 ! [m] - ! The thickness of a bottom boundary layer with a viscosity of KVBBL if - ! 
BOTTOMDRAGLAW is not defined, or the thickness over which near-bottom - ! velocities are averaged for the drag law if BOTTOMDRAGLAW is defined but - ! LINEAR_DRAG is not. -DRAG_BG_VEL = 0.1 ! [m s-1] default = 0.0 - ! DRAG_BG_VEL is either the assumed bottom velocity (with LINEAR_DRAG) or an - ! unresolved velocity that is combined with the resolved velocity to estimate - ! the velocity magnitude. DRAG_BG_VEL is only used when BOTTOMDRAGLAW is - ! defined. -BBL_USE_EOS = True ! [Boolean] default = False - ! If true, use the equation of state in determining the properties of the bottom - ! boundary layer. Otherwise use the layer target potential densities. -BBL_THICK_MIN = 0.1 ! [m] default = 0.0 - ! The minimum bottom boundary layer thickness that can be used with - ! BOTTOMDRAGLAW. This might be Kv/(cdrag*drag_bg_vel) to give Kv as the minimum - ! near-bottom viscosity. -KV = 1.0E-04 ! [m2 s-1] - ! The background kinematic viscosity in the interior. The molecular value, ~1e-6 - ! m2 s-1, may be used. -KV_BBL_MIN = 0.0 ! [m2 s-1] default = 1.0E-04 - ! The minimum viscosities in the bottom boundary layer. -KV_TBL_MIN = 0.0 ! [m2 s-1] default = 1.0E-04 - ! The minimum viscosities in the top boundary layer. - -! === module MOM_thickness_diffuse === -KHTH_MAX_CFL = 0.1 ! [nondimensional] default = 0.8 - ! The maximum value of the local diffusive CFL ratio that is permitted for the - ! thickness diffusivity. 1.0 is the marginally unstable value in a pure layered - ! model, but much smaller numbers (e.g. 0.1) seem to work better for ALE-based - ! models. -KHTH_USE_FGNV_STREAMFUNCTION = True ! [Boolean] default = False - ! If true, use the streamfunction formulation of Ferrari et al., 2010, which - ! effectively emphasizes graver vertical modes by smoothing in the vertical. -FGNV_FILTER_SCALE = 0.1 ! [nondim] default = 1.0 - ! A coefficient scaling the vertical smoothing term in the Ferrari et al., 2010, - ! streamfunction formulation. -USE_GM_WORK_BUG = True ! 
[Boolean] default = True - ! If true, compute the top-layer work tendency on the u-grid with the incorrect - ! sign, for legacy reproducibility. - -! === module MOM_continuity === - -! === module MOM_continuity_PPM === -ETA_TOLERANCE = 1.0E-06 ! [m] default = 3.75E-09 - ! The tolerance for the differences between the barotropic and baroclinic - ! estimates of the sea surface height due to the fluxes through each face. The - ! total tolerance for SSH is 4 times this value. The default is - ! 0.5*NK*ANGSTROM, and this should not be set less than about - ! 10^-15*MAXIMUM_DEPTH. -ETA_TOLERANCE_AUX = 0.001 ! [m] default = 1.0E-06 - ! The tolerance for free-surface height discrepancies between the barotropic - ! solution and the sum of the layer thicknesses when calculating the auxiliary - ! corrected velocities. By default, this is the same as ETA_TOLERANCE, but can - ! be made larger for efficiency. - -! === module MOM_CoriolisAdv === -CORIOLIS_SCHEME = "SADOURNY75_ENSTRO" ! default = "SADOURNY75_ENERGY" - ! CORIOLIS_SCHEME selects the discretization for the Coriolis terms. Valid - ! values are: - ! SADOURNY75_ENERGY - Sadourny, 1975; energy cons. - ! ARAKAWA_HSU90 - Arakawa & Hsu, 1990 - ! SADOURNY75_ENSTRO - Sadourny, 1975; enstrophy cons. - ! ARAKAWA_LAMB81 - Arakawa & Lamb, 1981; En. + Enst. - ! ARAKAWA_LAMB_BLEND - A blend of Arakawa & Lamb with - ! Arakawa & Hsu and Sadourny energy -BOUND_CORIOLIS = True ! [Boolean] default = False - ! If true, the Coriolis terms at u-points are bounded by the four estimates of - ! (f+rv)v from the four neighboring v-points, and similarly at v-points. This - ! option would have no effect on the SADOURNY Coriolis scheme if it were - ! possible to use centered difference thickness fluxes. - -! === module MOM_PressureForce === - -! === module MOM_PressureForce_AFV === -MASS_WEIGHT_IN_PRESSURE_GRADIENT = True ! [Boolean] default = False - ! If true, use mass weighting when interpolating T/S for integrals near the - ! 
bathymetry in AFV pressure gradient calculations. - -! === module MOM_hor_visc === -LAPLACIAN = True ! [Boolean] default = False - ! If true, use a Laplacian horizontal viscosity. -KH_VEL_SCALE = 0.01 ! [m s-1] default = 0.0 - ! The velocity scale which is multiplied by the grid spacing to calculate the - ! Laplacian viscosity. The final viscosity is the largest of this scaled - ! viscosity, the Smagorinsky and Leith viscosities, and KH. -KH_SIN_LAT = 2000.0 ! [m2 s-1] default = 0.0 - ! The amplitude of a latitudinally-dependent background viscosity of the form - ! KH_SIN_LAT*(SIN(LAT)**KH_PWR_OF_SINE). -SMAGORINSKY_KH = True ! [Boolean] default = False - ! If true, use a Smagorinsky nonlinear eddy viscosity. -SMAG_LAP_CONST = 0.15 ! [nondim] default = 0.0 - ! The nondimensional Laplacian Smagorinsky constant, often 0.15. -AH_VEL_SCALE = 0.01 ! [m s-1] default = 0.0 - ! The velocity scale which is multiplied by the cube of the grid spacing to - ! calculate the biharmonic viscosity. The final viscosity is the largest of this - ! scaled viscosity, the Smagorinsky and Leith viscosities, and AH. -SMAGORINSKY_AH = True ! [Boolean] default = False - ! If true, use a biharmonic Smagorinsky nonlinear eddy viscosity. -SMAG_BI_CONST = 0.06 ! [nondim] default = 0.0 - ! The nondimensional biharmonic Smagorinsky constant, typically 0.015 - 0.06. -USE_LAND_MASK_FOR_HVISC = False ! [Boolean] default = False - ! If true, use the land mask for the computation of thicknesses at velocity - ! locations. This eliminates the dependence on arbitrary values over land or - ! outside of the domain. Default is False in order to maintain answers with - ! legacy experiments but should be changed to True for new experiments. - -! === module MOM_vert_friction === -HMIX_FIXED = 0.5 ! [m] - ! The prescribed depth over which the near-surface viscosity and diffusivity are - ! elevated when the bulk mixed layer is not used. -KVML = 1.0E-04 ! [m2 s-1] default = 1.0E-04 - ! 
The kinematic viscosity in the mixed layer. A typical value is ~1e-2 m2 s-1. - ! KVML is not used if BULKMIXEDLAYER is true. The default is set by KV. -MAXVEL = 6.0 ! [m s-1] default = 3.0E+08 - ! The maximum velocity allowed before the velocity components are truncated. - -! === module MOM_PointAccel === -U_TRUNC_FILE = "U_velocity_truncations" ! default = "" - ! The absolute path to a file into which the accelerations leading to zonal - ! velocity truncations are written. Undefine this for efficiency if this - ! diagnostic is not needed. -V_TRUNC_FILE = "V_velocity_truncations" ! default = "" - ! The absolute path to a file into which the accelerations leading to meridional - ! velocity truncations are written. Undefine this for efficiency if this - ! diagnostic is not needed. - -! === module MOM_barotropic === -BOUND_BT_CORRECTION = True ! [Boolean] default = False - ! If true, the corrective pseudo mass-fluxes into the barotropic solver are - ! limited to values that require less than maxCFL_BT_cont to be accommodated. -BT_PROJECT_VELOCITY = True ! [Boolean] default = False - ! If true, step the barotropic velocity first and project out the velocity - ! tendency by 1+BEBT when calculating the transport. The default (false) is to - ! use a predictor continuity step to find the pressure field, and then to do a - ! corrector continuity step using a weighted average of the old and new - ! velocities, with weights of (1-BEBT) and BEBT. -DYNAMIC_SURFACE_PRESSURE = True ! [Boolean] default = False - ! If true, add a dynamic pressure due to a viscous ice shelf, for instance. -BEBT = 0.2 ! [nondim] default = 0.1 - ! BEBT determines whether the barotropic time stepping uses the forward-backward - ! time-stepping scheme or a backward Euler scheme. BEBT is valid in the range - ! from 0 (for a forward-backward treatment of nonrotating gravity waves) to 1 - ! (for a backward Euler treatment). In practice, BEBT must be greater than about - ! 0.05. -DTBT = -0.9 ! 
[s or nondim] default = -0.98 - ! The barotropic time step, in s. DTBT is only used with the split explicit time - ! stepping. To set the time step automatically based on the maximum stable value - ! use 0, or a negative value gives the fraction of the stable value. Setting - ! DTBT to 0 is the same as setting it to -0.98. The value of DTBT that will - ! actually be used is an integer fraction of DT, rounding down. -BT_USE_OLD_CORIOLIS_BRACKET_BUG = True ! [Boolean] default = False - ! If True, use an order of operations that is not bitwise rotationally symmetric - ! in the meridional Coriolis term of the barotropic solver. - -! === module MOM_mixed_layer_restrat === -MIXEDLAYER_RESTRAT = True ! [Boolean] default = False - ! If true, a density-gradient dependent re-stratifying flow is imposed in the - ! mixed layer. Can be used in ALE mode without restriction but in layer mode can - ! only be used if BULKMIXEDLAYER is true. -FOX_KEMPER_ML_RESTRAT_COEF = 1.0 ! [nondim] default = 0.0 - ! A nondimensional coefficient that is proportional to the ratio of the - ! deformation radius to the dominant lengthscale of the submesoscale mixed layer - ! instabilities, times the minimum of the ratio of the mesoscale eddy kinetic - ! energy to the large-scale geostrophic kinetic energy or 1 plus the square of - ! the grid spacing over the deformation radius, as detailed by Fox-Kemper et al. - ! (2010) -MLE_FRONT_LENGTH = 200.0 ! [m] default = 0.0 - ! If non-zero, is the frontal-length scale used to calculate the upscaling of - ! buoyancy gradients that is otherwise represented by the parameter - ! FOX_KEMPER_ML_RESTRAT_COEF. If MLE_FRONT_LENGTH is non-zero, it is recommended - ! to set FOX_KEMPER_ML_RESTRAT_COEF=1.0. -MLE_USE_PBL_MLD = True ! [Boolean] default = False - ! If true, the MLE parameterization will use the mixed-layer depth provided by - ! the active PBL parameterization. If false, MLE will estimate a MLD based on a - ! 
density difference with the surface using the parameter MLE_DENSITY_DIFF. -MLE_MLD_DECAY_TIME = 2.592E+06 ! [s] default = 0.0 - ! The time-scale for a running-mean filter applied to the mixed-layer depth used - ! in the MLE restratification parameterization. When the MLD deepens below the - ! current running-mean the running-mean is instantaneously set to the current - ! MLD. - -! === module MOM_diabatic_driver === -! The following parameters are used for diabatic processes. -ENERGETICS_SFC_PBL = True ! [Boolean] default = False - ! If true, use an implied energetics planetary boundary layer scheme to - ! determine the diffusivity and viscosity in the surface boundary layer. -EPBL_IS_ADDITIVE = False ! [Boolean] default = True - ! If true, the diffusivity from ePBL is added to all other diffusivities. - ! Otherwise, the larger of kappa-shear and ePBL diffusivities are used. - -! === module MOM_CVMix_KPP === -! This is the MOM wrapper to CVMix:KPP -! See http://cvmix.github.io/ - -! === module MOM_tidal_mixing === -! Vertical Tidal Mixing Parameterization -INT_TIDE_DISSIPATION = True ! [Boolean] default = False - ! If true, use an internal tidal dissipation scheme to drive diapycnal mixing, - ! along the lines of St. Laurent et al. (2002) and Simmons et al. (2004). -INT_TIDE_PROFILE = "POLZIN_09" ! default = "STLAURENT_02" - ! INT_TIDE_PROFILE selects the vertical profile of energy dissipation with - ! INT_TIDE_DISSIPATION. Valid values are: - ! STLAURENT_02 - Use the St. Laurent et al exponential - ! decay profile. - ! POLZIN_09 - Use the Polzin WKB-stretched algebraic - ! decay profile. -INT_TIDE_DECAY_SCALE = 300.3003003003003 ! [m] default = 500.0 - ! The decay scale away from the bottom for tidal TKE with the new coding when - ! INT_TIDE_DISSIPATION is used. -KAPPA_ITIDES = 6.28319E-04 ! [m-1] default = 6.283185307179586E-04 - ! A topographic wavenumber used with INT_TIDE_DISSIPATION. The default is 2pi/10 - ! km, as in St.Laurent et al. 2002. 
-KAPPA_H2_FACTOR = 0.84 ! [nondim] default = 1.0 - ! A scaling factor for the roughness amplitude with INT_TIDE_DISSIPATION. -TKE_ITIDE_MAX = 0.1 ! [W m-2] default = 1000.0 - ! The maximum internal tide energy source available to mix above the bottom - ! boundary layer with INT_TIDE_DISSIPATION. -READ_TIDEAMP = True ! [Boolean] default = False - ! If true, read a file (given by TIDEAMP_FILE) containing the tidal amplitude - ! with INT_TIDE_DISSIPATION. -TIDEAMP_FILE = "tidal_amplitude.nc" ! default = "tideamp.nc" - ! The path to the file containing the spatially varying tidal amplitudes with - ! INT_TIDE_DISSIPATION. -H2_FILE = "ocean_topog.nc" ! - ! The path to the file containing the sub-grid-scale topographic roughness - ! amplitude with INT_TIDE_DISSIPATION. - -! === module MOM_CVMix_conv === -! Parameterization of enhanced mixing due to convection via CVMix - -! === module MOM_geothermal === -GEOTHERMAL_SCALE = 1.0 ! [W m-2 or various] default = 0.0 - ! The constant geothermal heat flux, a rescaling factor for the heat flux read - ! from GEOTHERMAL_FILE, or 0 to disable the geothermal heating. -GEOTHERMAL_FILE = "geothermal_davies2013_v1.nc" ! default = "" - ! The file from which the geothermal heating is to be read, or blank to use a - ! constant heating rate. -GEOTHERMAL_VARNAME = "geothermal_hf" ! default = "geo_heat" - ! The name of the geothermal heating variable in GEOTHERMAL_FILE. - -! === module MOM_set_diffusivity === -BBL_MIXING_AS_MAX = False ! [Boolean] default = True - ! If true, take the maximum of the diffusivity from the BBL mixing and the other - ! diffusivities. Otherwise, diffusivity from the BBL_mixing is simply added. -USE_LOTW_BBL_DIFFUSIVITY = True ! [Boolean] default = False - ! If true, uses a simple, imprecise but non-coordinate dependent, model of BBL - ! mixing diffusivity based on Law of the Wall. Otherwise, uses the original BBL - ! scheme. -SIMPLE_TKE_TO_KD = True ! [Boolean] default = False - ! 
If true, uses a simple estimate of Kd/TKE that will work for arbitrary - ! vertical coordinates. If false, calculates Kd/TKE and bounds based on exact - ! energetics for an isopycnal layer-formulation. - -! === module MOM_bkgnd_mixing === -! Adding static vertical background mixing coefficients -KD = 1.5E-05 ! [m2 s-1] - ! The background diapycnal diffusivity of density in the interior. Zero or the - ! molecular value, ~1e-7 m2 s-1, may be used. -KD_MIN = 2.0E-06 ! [m2 s-1] default = 1.5E-07 - ! The minimum diapycnal diffusivity. -HENYEY_IGW_BACKGROUND = True ! [Boolean] default = False - ! If true, use a latitude-dependent scaling for the near surface background - ! diffusivity, as described in Harrison & Hallberg, JPO 2008. -KD_MAX = 0.1 ! [m2 s-1] default = -1.0 - ! The maximum permitted increment for the diapycnal diffusivity from TKE-based - ! parameterizations, or a negative value for no limit. - -! === module MOM_kappa_shear === -! Parameterization of shear-driven turbulence following Jackson, Hallberg and Legg, JPO 2008 -USE_JACKSON_PARAM = True ! [Boolean] default = False - ! If true, use the Jackson-Hallberg-Legg (JPO 2008) shear mixing - ! parameterization. -MAX_RINO_IT = 25 ! [nondim] default = 50 - ! The maximum number of iterations that may be used to estimate the Richardson - ! number driven mixing. -VERTEX_SHEAR = False ! [Boolean] default = False - ! If true, do the calculations of the shear-driven mixing - ! at the cell vertices (i.e., the vorticity points). -KAPPA_SHEAR_ITER_BUG = True ! [Boolean] default = True - ! If true, use an older, dimensionally inconsistent estimate of the derivative - ! of diffusivity with energy in the Newton's method iteration. The bug causes - ! undercorrections when dz > 1 m. -KAPPA_SHEAR_ALL_LAYER_TKE_BUG = True ! [Boolean] default = True - ! If true, report back the latest estimate of TKE instead of the time average - ! TKE when there is mass in all layers. Otherwise always report the time - ! 
averaged TKE, as is currently done when there are some massless layers. - -! === module MOM_CVMix_shear === -! Parameterization of shear-driven turbulence via CVMix (various options) - -! === module MOM_CVMix_ddiff === -! Parameterization of mixing due to double diffusion processes via CVMix - -! === module MOM_diabatic_aux === -! The following parameters are used for auxiliary diabatic processes. -PRESSURE_DEPENDENT_FRAZIL = False ! [Boolean] default = False - ! If true, use a pressure dependent freezing temperature when making frazil. The - ! default is false, which will be faster but is inappropriate with ice-shelf - ! cavities. -VAR_PEN_SW = True ! [Boolean] default = False - ! If true, use one of the CHL_A schemes specified by OPACITY_SCHEME to determine - ! the e-folding depth of incoming short wave radiation. -CHL_FILE = @[CHLCLIM] ! - ! CHL_FILE is the file containing chl_a concentrations in the variable CHL_A. It - ! is used when VAR_PEN_SW and CHL_FROM_FILE are true. -CHL_VARNAME = "chlor_a" ! default = "CHL_A" - ! Name of CHL_A variable in CHL_FILE. - -! === module MOM_energetic_PBL === -ML_OMEGA_FRAC = 0.001 ! [nondim] default = 0.0 - ! When setting the decay scale for turbulence, use this fraction of the absolute - ! rotation rate blended with the local value of f, as sqrt((1-of)*f^2 + - ! of*4*omega^2). -TKE_DECAY = 0.01 ! [nondim] default = 2.5 - ! TKE_DECAY relates the vertical rate of decay of the TKE available for - ! mechanical entrainment to the natural Ekman depth. -EPBL_MSTAR_SCHEME = "OM4" ! default = "CONSTANT" - ! EPBL_MSTAR_SCHEME selects the method for setting mstar. Valid values are: - ! CONSTANT - Use a fixed mstar given by MSTAR - ! OM4 - Use L_Ekman/L_Obukhov in the stabilizing limit, as in OM4 - ! REICHL_H18 - Use the scheme documented in Reichl & Hallberg, 2018. -MSTAR_CAP = 10.0 ! [nondim] default = -1.0 - ! If this value is positive, it sets the maximum value of mstar allowed in ePBL. - ! 
(This is not used if EPBL_MSTAR_SCHEME = CONSTANT). -MSTAR2_COEF1 = 0.29 ! [nondim] default = 0.3 - ! Coefficient in computing mstar when rotation and stabilizing effects are both - ! important (used if EPBL_MSTAR_SCHEME = OM4). -MSTAR2_COEF2 = 0.152 ! [nondim] default = 0.085 - ! Coefficient in computing mstar when only rotation limits the total mixing - ! (used if EPBL_MSTAR_SCHEME = OM4) -EPBL_MLD_BISECTION = True ! [Boolean] default = False - ! If true, use bisection with the iterative determination of the self-consistent - ! mixed layer depth. Otherwise use the false position after a maximum and - ! minimum bound have been evaluated and the returned value or bisection before - ! this. -NSTAR = 0.06 ! [nondim] default = 0.2 - ! The portion of the buoyant potential energy imparted by surface fluxes that is - ! available to drive entrainment at the base of mixed layer when that energy is - ! positive. -MSTAR_CONV_ADJ = 0.667 ! [nondim] default = 0.0 - ! Coefficient used for reducing mstar during convection due to reduction of - ! stable density gradient. -USE_MLD_ITERATION = True ! [Boolean] default = False - ! A logical that specifies whether or not to use the distance to the bottom of - ! the actively turbulent boundary layer to help set the EPBL length scale. -EPBL_TRANSITION_SCALE = 0.01 ! [nondim] default = 0.1 - ! A scale for the mixing length in the transition layer at the edge of the - ! boundary layer as a fraction of the boundary layer thickness. -MIX_LEN_EXPONENT = 1.0 ! [nondim] default = 2.0 - ! The exponent applied to the ratio of the distance to the MLD and the MLD depth - ! which determines the shape of the mixing length. This is only used if - ! USE_MLD_ITERATION is True. -USE_LA_LI2016 = @[MOM6_USE_LI2016] ! [nondim] default = False - ! A logical to use the Li et al. 2016 (submitted) formula to determine the - ! Langmuir number. -USE_WAVES = @[MOM6_USE_WAVES] ! [Boolean] default = False - ! If true, enables surface wave modules. 
-WAVE_METHOD = "SURFACE_BANDS" ! default = "EMPTY" - ! Choice of wave method, valid options include: - ! TEST_PROFILE - Prescribed from surface Stokes drift - ! and a decay wavelength. - ! SURFACE_BANDS - Computed from multiple surface values - ! and decay wavelengths. - ! DHH85 - Uses Donelan et al. 1985 empirical - ! wave spectrum with prescribed values. - ! LF17 - Infers Stokes drift profile from wind - ! speed following Li and Fox-Kemper 2017. -SURFBAND_SOURCE = "COUPLER" ! default = "EMPTY" - ! Choice of SURFACE_BANDS data mode, valid options include: - ! DATAOVERRIDE - Read from NetCDF using FMS DataOverride. - ! COUPLER - Look for variables from coupler pass - ! INPUT - Testing with fixed values. -STK_BAND_COUPLER = 3 ! default = 1 - ! STK_BAND_COUPLER is the number of Stokes drift bands in the coupler. This has - ! to be consistent with the number of Stokes drift bands in WW3, or the model - ! will fail. -SURFBAND_WAVENUMBERS = 0.04, 0.11, 0.3305 ! [rad/m] default = 0.12566 - ! Central wavenumbers for surface Stokes drift bands. -EPBL_LANGMUIR_SCHEME = "ADDITIVE" ! default = "NONE" - ! EPBL_LANGMUIR_SCHEME selects the method for including Langmuir turbulence. - ! Valid values are: - ! NONE - Do not do any extra mixing due to Langmuir turbulence - ! RESCALE - Use a multiplicative rescaling of mstar to account for Langmuir - ! turbulence - ! ADDITIVE - Add a Langmuir turbulence contribution to mstar to other - ! contributions -LT_ENHANCE_COEF = 0.044 ! [nondim] default = 0.447 - ! Coefficient for Langmuir enhancement of mstar -LT_ENHANCE_EXP = -1.5 ! [nondim] default = -1.33 - ! Exponent for Langmuir enhancement of mstar -LT_MOD_LAC1 = 0.0 ! [nondim] default = -0.87 - ! Coefficient for modification of Langmuir number due to MLD approaching Ekman - ! depth. -LT_MOD_LAC4 = 0.0 ! [nondim] default = 0.95 - ! Coefficient for modification of Langmuir number due to ratio of Ekman to - ! stable Obukhov depth. -LT_MOD_LAC5 = 0.22 ! [nondim] default = 0.95 - ! 
Coefficient for modification of Langmuir number due to ratio of Ekman to - ! unstable Obukhov depth. - -! === module MOM_regularize_layers === - -! === module MOM_opacity === -PEN_SW_NBANDS = 3 ! default = 1 - ! The number of bands of penetrating shortwave radiation. - -! === module MOM_tracer_advect === -TRACER_ADVECTION_SCHEME = "PPM:H3" ! default = "PLM" - ! The horizontal transport scheme for tracers: - ! PLM - Piecewise Linear Method - ! PPM:H3 - Piecewise Parabolic Method (Huynh 3rd order) - ! PPM - Piecewise Parabolic Method (Colella-Woodward) - -! === module MOM_tracer_hor_diff === -KHTR = 50.0 ! [m2 s-1] default = 0.0 - ! The background along-isopycnal tracer diffusivity. -CHECK_DIFFUSIVE_CFL = True ! [Boolean] default = False - ! If true, use enough iterations of the diffusion to ensure that the diffusive - ! equivalent of the CFL limit is not violated. If false, always use the greater - ! of 1 or MAX_TR_DIFFUSION_CFL iterations. -MAX_TR_DIFFUSION_CFL = 2.0 ! [nondim] default = -1.0 - ! If positive, locally limit the along-isopycnal tracer diffusivity to keep the - ! diffusive CFL locally at or below this value. The number of diffusive - ! iterations is often this value or the next greater integer. - -! === module MOM_neutral_diffusion === -! This module implements neutral diffusion of tracers -USE_NEUTRAL_DIFFUSION = True ! [Boolean] default = False - ! If true, enables the neutral diffusion module. - -! === module ocean_model_init === -RESTART_CHECKSUMS_REQUIRED = False - -! === module MOM_oda_incupd === -ODA_INCUPD = @[ODA_INCUPD] ! [Boolean] default = False - ! If true, oda incremental updates will be applied - ! everywhere in the domain. -ODA_INCUPD_FILE = "mom6_increment.nc" ! The name of the file with the T,S,h increments. - -ODA_TEMPINC_VAR = "Temp" ! default = "ptemp_inc" - ! The name of the potential temperature inc. variable in - ! ODA_INCUPD_FILE. -ODA_SALTINC_VAR = "Salt" ! default = "sal_inc" - ! The name of the salinity inc. variable in - ! 
ODA_INCUPD_FILE. -ODA_THK_VAR = "h" ! default = "h" - ! The name of the int. depth inc. variable in - ! ODA_INCUPD_FILE. -ODA_INCUPD_UV = false ! -!ODA_UINC_VAR = "u" ! default = "u_inc" - ! The name of the zonal vel. inc. variable in - ! ODA_INCUPD_UV_FILE. -!ODA_VINC_VAR = "v" ! default = "v_inc" - ! The name of the meridional vel. inc. variable in - ! ODA_INCUPD_UV_FILE. -ODA_INCUPD_NHOURS = @[ODA_INCUPD_NHOURS] ! default=3.0 - -! === module MOM_lateral_boundary_diffusion === -! This module implements lateral diffusion of tracers near boundaries - -! === module MOM_sum_output === -MAXTRUNC = 100000 ! [truncations save_interval-1] default = 0 - ! The run will be stopped, and the day set to a very large value if the velocity - ! is truncated more than MAXTRUNC times between energy saves. Set MAXTRUNC to 0 - ! to stop if there is any truncation of velocities. -ENERGYSAVEDAYS = 1.0 ! [days] default = 1.0 - ! The interval in units of TIMEUNIT between saves of the energies of the run and - ! other globally summed diagnostics. -ENERGYSAVEDAYS_GEOMETRIC = 0.25 ! [days] default = 0.0 - ! The starting interval in units of TIMEUNIT for the first call to save the - ! energies of the run and other globally summed diagnostics. The interval - ! increases by a factor of 2. after each call to write_energy. - -! === module ocean_model_init === - -! === module MOM_surface_forcing === -OCEAN_SURFACE_STAGGER = "A" ! default = "C" - ! A case-insensitive character string to indicate the - ! staggering of the surface velocity field that is - ! returned to the coupler. Valid values include - ! 'A', 'B', or 'C'. - -MAX_P_SURF = 0.0 ! [Pa] default = -1.0 - ! The maximum surface pressure that can be exerted by the atmosphere and - ! floating sea-ice or ice shelves. This is needed because the FMS coupling - ! structure does not limit the water that can be frozen out of the ocean and the - ! ice-ocean heat fluxes are treated explicitly. No limit is applied if a - ! negative value is used. 
-WIND_STAGGER = "A" ! default = "C" - ! A case-insensitive character string to indicate the - ! staggering of the input wind stress field. Valid - ! values are 'A', 'B', or 'C'. -CD_TIDES = 0.0018 ! [nondim] default = 1.0E-04 - ! The drag coefficient that applies to the tides. -GUST_CONST = 0.0 ! [Pa] default = 0.02 - ! The background gustiness in the winds. -FIX_USTAR_GUSTLESS_BUG = False ! [Boolean] default = False - ! If true, correct a bug in the time-averaging of the gustless wind friction - ! velocity. -USE_RIGID_SEA_ICE = True ! [Boolean] default = False - ! If true, sea-ice is rigid enough to exert a nonhydrostatic pressure that - ! resists vertical motion. -SEA_ICE_RIGID_MASS = 100.0 ! [kg m-2] default = 1000.0 - ! The mass of sea-ice per unit area at which the sea-ice starts to exhibit - ! rigidity. -LIQUID_RUNOFF_FROM_DATA = @[MOM6_RIVER_RUNOFF] ! [Boolean] default = False - ! If true, allows liquid river runoff to be specified via - ! the data_table using the component name 'OCN'. -! === module ocean_stochastics === -DO_SPPT = @[DO_OCN_SPPT] ! [Boolean] default = False - ! If true, perturb the diabatic tendencies in MOM_diabatic_driver. -PERT_EPBL = @[PERT_EPBL] ! [Boolean] default = False - ! If true, perturb the KE dissipation and destruction in MOM_energetic_PBL. -! === module MOM_restart === - -! === module MOM_file_parser === diff --git a/parm/ufs/mom6/MOM_input_template_100 b/parm/ufs/mom6/MOM_input_template_100 deleted file mode 100644 index f26d6e4bfb..0000000000 --- a/parm/ufs/mom6/MOM_input_template_100 +++ /dev/null @@ -1,866 +0,0 @@ -! This file was written by the model and records all non-layout or debugging parameters used at run-time. -! === module MOM === - -! === module MOM_unit_scaling === -! Parameters for doing unit scaling of variables. -USE_REGRIDDING = True ! [Boolean] default = False - ! If True, use the ALE algorithm (regridding/remapping). If False, use the - ! layered isopycnal algorithm. -THICKNESSDIFFUSE = True ! 
[Boolean] default = False - ! If true, interface heights are diffused with a coefficient of KHTH. -THICKNESSDIFFUSE_FIRST = True ! [Boolean] default = False - ! If true, do thickness diffusion before dynamics. This is only used if - ! THICKNESSDIFFUSE is true. -DT = @[DT_DYNAM_MOM6] ! [s] - ! The (baroclinic) dynamics time step. The time-step that is actually used will - ! be an integer fraction of the forcing time-step (DT_FORCING in ocean-only mode - ! or the coupling timestep in coupled mode.) -DT_THERM = @[DT_THERM_MOM6] ! [s] default = 1800.0 - ! The thermodynamic and tracer advection time step. Ideally DT_THERM should be - ! an integer multiple of DT and less than the forcing or coupling time-step, - ! unless THERMO_SPANS_COUPLING is true, in which case DT_THERM can be an integer - ! multiple of the coupling timestep. By default DT_THERM is set to DT. -THERMO_SPANS_COUPLING = @[MOM6_THERMO_SPAN] ! [Boolean] default = False - ! If true, the MOM will take thermodynamic and tracer timesteps that can be - ! longer than the coupling timestep. The actual thermodynamic timestep that is - ! used in this case is the largest integer multiple of the coupling timestep - ! that is less than or equal to DT_THERM. -HFREEZE = 20.0 ! [m] default = -1.0 - ! If HFREEZE > 0, melt potential will be computed. The actual depth - ! over which melt potential is computed will be min(HFREEZE, OBLD) - ! where OBLD is the boundary layer depth. If HFREEZE <= 0 (default) - ! melt potential will not be computed. -DTBT_RESET_PERIOD = -1.0 ! [s] default = 7200.0 - ! The period between recalculations of DTBT (if DTBT <= 0). If DTBT_RESET_PERIOD - ! is negative, DTBT is set based only on information available at - ! initialization. If 0, DTBT will be set every dynamics time step. The default - ! is set by DT_THERM. This is only used if SPLIT is true. -FRAZIL = True ! [Boolean] default = False - ! If true, water freezes if it gets too cold, and the accumulated heat deficit - ! 
is returned in the surface state. FRAZIL is only used if - ! ENABLE_THERMODYNAMICS is true. -BOUND_SALINITY = True ! [Boolean] default = False - ! If true, limit salinity to being positive. (The sea-ice model may ask for more - ! salt than is available and drive the salinity negative otherwise.) -MIN_SALINITY = 0.01 ! [PPT] default = 0.0 - ! The minimum value of salinity when BOUND_SALINITY=True. -C_P = 3925.0 ! [J kg-1 K-1] default = 3991.86795711963 - ! The heat capacity of sea water, approximated as a constant. This is only used - ! if ENABLE_THERMODYNAMICS is true. The default value is from the TEOS-10 - ! definition of conservative temperature. -USE_PSURF_IN_EOS = False ! [Boolean] default = True - ! If true, always include the surface pressure contributions in equation of - ! state calculations. -CHECK_BAD_SURFACE_VALS = True ! [Boolean] default = False - ! If true, check the surface state for ridiculous values. -BAD_VAL_SSH_MAX = 50.0 ! [m] default = 20.0 - ! The value of SSH above which a bad value message is triggered, if - ! CHECK_BAD_SURFACE_VALS is true. -BAD_VAL_SSS_MAX = 75.0 ! [PPT] default = 45.0 - ! The value of SSS above which a bad value message is triggered, if - ! CHECK_BAD_SURFACE_VALS is true. -BAD_VAL_SST_MAX = 55.0 ! [deg C] default = 45.0 - ! The value of SST above which a bad value message is triggered, if - ! CHECK_BAD_SURFACE_VALS is true. -BAD_VAL_SST_MIN = -3.0 ! [deg C] default = -2.1 - ! The value of SST below which a bad value message is triggered, if - ! CHECK_BAD_SURFACE_VALS is true. -DEFAULT_2018_ANSWERS = True ! [Boolean] default = False - ! This sets the default value for the various _2018_ANSWERS parameters. -WRITE_GEOM = 2 ! default = 1 - ! If =0, never write the geometry and vertical grid files. If =1, write the - ! geometry and vertical grid files only for a new simulation. If =2, always - ! write the geometry and vertical grid files. Other values are invalid. -SAVE_INITIAL_CONDS = False ! [Boolean] default = False - ! 
If true, write the initial conditions to a file given by IC_OUTPUT_FILE. - -! === module MOM_domains === -TRIPOLAR_N = True ! [Boolean] default = False - ! Use tripolar connectivity at the northern edge of the domain. With - ! TRIPOLAR_N, NIGLOBAL must be even. -NIGLOBAL = @[NX_GLB] ! - ! The total number of thickness grid points in the x-direction in the physical - ! domain. With STATIC_MEMORY_ this is set in MOM_memory.h at compile time. -NJGLOBAL = @[NY_GLB] ! - ! The total number of thickness grid points in the y-direction in the physical - ! domain. With STATIC_MEMORY_ this is set in MOM_memory.h at compile time. - -! === module MOM_hor_index === -! Sets the horizontal array index types. - -! === module MOM_fixed_initialization === -INPUTDIR = "INPUT" ! default = "." - ! The directory in which input files are found. - -! === module MOM_grid_init === -GRID_CONFIG = "mosaic" ! - ! A character string that determines the method for defining the horizontal - ! grid. Current options are: - ! mosaic - read the grid from a mosaic (supergrid) - ! file set by GRID_FILE. - ! cartesian - use a (flat) Cartesian grid. - ! spherical - use a simple spherical grid. - ! mercator - use a Mercator spherical grid. -GRID_FILE = "ocean_hgrid.nc" ! - ! Name of the file from which to read horizontal grid data. -GRID_ROTATION_ANGLE_BUGS = False ! [Boolean] default = True - ! If true, use an older algorithm to calculate the sines and - ! cosines needed to rotate between grid-oriented directions and - ! true north and east. Differences arise at the tripolar fold. -USE_TRIPOLAR_GEOLONB_BUG = False ! [Boolean] default = True - ! If true, use older code that incorrectly sets the longitude at some points - ! along the tripolar fold to be off by 360 degrees. -TOPO_CONFIG = "file" ! - ! This specifies how bathymetry is specified: - ! file - read bathymetric information from the file - ! specified by (TOPO_FILE). - ! flat - flat bottom set to MAXIMUM_DEPTH. - ! 
bowl - an analytically specified bowl-shaped basin - ! ranging between MAXIMUM_DEPTH and MINIMUM_DEPTH. - ! spoon - a similar shape to 'bowl', but with a vertical - ! wall at the southern face. - ! halfpipe - a zonally uniform channel with a half-sine - ! profile in the meridional direction. - ! bbuilder - build topography from a list of functions. - ! benchmark - use the benchmark test case topography. - ! Neverworld - use the Neverworld test case topography. - ! DOME - use a slope and channel configuration for the - ! DOME sill-overflow test case. - ! ISOMIP - use a slope and channel configuration for the - ! ISOMIP test case. - ! DOME2D - use a shelf and slope configuration for the - ! DOME2D gravity current/overflow test case. - ! Kelvin - flat but with rotated land mask. - ! seamount - Gaussian bump for spontaneous motion test case. - ! dumbbell - Sloshing channel with reservoirs on both ends. - ! shelfwave - exponential slope for shelfwave test case. - ! Phillips - ACC-like idealized topography used in the Phillips config. - ! dense - Denmark Strait-like dense water formation and overflow. - ! USER - call a user modified routine. -TOPO_EDITS_FILE = "@[TOPOEDITS]" ! default = "" - ! The file from which to read a list of i,j,z topography overrides. -ALLOW_LANDMASK_CHANGES = @[MOM6_ALLOW_LANDMASK_CHANGES] ! default = "False" - ! If true, allow topography overrides to change ocean points to land. -MAXIMUM_DEPTH = 6500.0 ! [m] - ! The maximum depth of the ocean. -MINIMUM_DEPTH = 9.5 ! [m] default = 0.0 - ! If MASKING_DEPTH is unspecified, then anything shallower than MINIMUM_DEPTH is - ! assumed to be land and all fluxes are masked out. If MASKING_DEPTH is - ! specified, then all depths shallower than MINIMUM_DEPTH but deeper than - ! MASKING_DEPTH are rounded to MINIMUM_DEPTH. - -! === module MOM_open_boundary === -! Controls where open boundaries are located, what kind of boundary condition to impose, and what data to apply, -! if any. -MASKING_DEPTH = 0.0 ! 
[m] default = -9999.0 - ! The depth below which to mask points as land points, for which all fluxes are - ! zeroed out. MASKING_DEPTH is ignored if negative. -CHANNEL_CONFIG = "list" ! default = "none" - ! A parameter that determines which set of channels are - ! restricted to specific widths. Options are: - ! none - All channels have the grid width. - ! global_1deg - Sets 16 specific channels appropriate - ! for a 1-degree model, as used in CM2G. - ! list - Read the channel locations and widths from a - ! text file, like MOM_channel_list in the MOM_SIS - ! test case. - ! file - Read open face widths everywhere from a - ! NetCDF file on the model grid. -CHANNEL_LIST_FILE = "MOM_channels_SPEAR" ! default = "MOM_channel_list" - ! The file from which the list of narrowed channels is read. - -! === module MOM_verticalGrid === -! Parameters providing information about the vertical grid. -NK = 75 ! [nondim] - ! The number of model layers. - -! === module MOM_tracer_registry === - -! === module MOM_EOS === -TFREEZE_FORM = "MILLERO_78" ! default = "LINEAR" - ! TFREEZE_FORM determines which expression should be used for the freezing - ! point. Currently, the valid choices are "LINEAR", "MILLERO_78", "TEOS10" - -! === module MOM_restart === -PARALLEL_RESTARTFILES = True ! [Boolean] default = False - ! If true, each processor writes its own restart file, otherwise a single - ! restart file is generated - -! === module MOM_tracer_flow_control === -USE_IDEAL_AGE_TRACER = False ! [Boolean] default = False - ! If true, use the ideal_age_example tracer package. - -! === module ideal_age_example === - -! === module MOM_coord_initialization === -COORD_CONFIG = "file" ! default = "none" - ! This specifies how layers are to be defined: - ! ALE or none - used to avoid defining layers in ALE mode - ! file - read coordinate information from the file - ! specified by (COORD_FILE). - ! BFB - Custom coords for buoyancy-forced basin case - ! based on SST_S, T_BOT and DRHO_DT. - ! 
linear - linear based on interfaces not layers - ! layer_ref - linear based on layer densities - ! ts_ref - use reference temperature and salinity - ! ts_range - use range of temperature and salinity - ! (T_REF and S_REF) to determine surface density - ! and GINT to calculate internal densities. - ! gprime - use reference density (RHO_0) for surface - ! density and GINT to calculate internal densities. - ! ts_profile - use temperature and salinity profiles - ! (read from COORD_FILE) to set layer densities. - ! USER - call a user modified routine. -COORD_FILE = "layer_coord.nc" ! - ! The file from which the coordinate densities are read. -REMAP_UV_USING_OLD_ALG = True ! [Boolean] default = False - ! If true, uses the old remapping-via-a-delta-z method for remapping u and v. If - ! false, uses the new method that remaps between grids described by an old and - ! new thickness. -REGRIDDING_COORDINATE_MODE = "HYCOM1" ! default = "LAYER" - ! Coordinate mode for vertical regridding. Choose among the following - ! possibilities: LAYER - Isopycnal or stacked shallow water layers - ! ZSTAR, Z* - stretched geopotential z* - ! SIGMA_SHELF_ZSTAR - stretched geopotential z* ignoring shelf - ! SIGMA - terrain following coordinates - ! RHO - continuous isopycnal - ! HYCOM1 - HyCOM-like hybrid coordinate - ! SLIGHT - stretched coordinates above continuous isopycnal - ! ADAPTIVE - optimize for smooth neutral density surfaces -BOUNDARY_EXTRAPOLATION = True ! [Boolean] default = False - ! When defined, a proper high-order reconstruction scheme is used within - ! boundary cells rather than PCM. E.g., if PPM is used for remapping, a PPM - ! reconstruction will also be used within boundary cells. -ALE_COORDINATE_CONFIG = "HYBRID:hycom1_75_800m.nc,sigma2,FNC1:2,4000,4.5,.01" ! default = "UNIFORM" - ! Determines how to specify the coordinate resolution. Valid options are: - ! PARAM - use the vector-parameter ALE_RESOLUTION - ! UNIFORM[:N] - uniformly distributed - ! 
FILE:string - read from a file. The string specifies - ! the filename and variable name, separated - ! by a comma or space, e.g. FILE:lev.nc,dz - ! or FILE:lev.nc,interfaces=zw - ! WOA09[:N] - the WOA09 vertical grid (approximately) - ! FNC1:string - FNC1:dz_min,H_total,power,precision - ! HYBRID:string - read from a file. The string specifies - ! the filename and two variable names, separated - ! by a comma or space, for sigma-2 and dz. e.g. - ! HYBRID:vgrid.nc,sigma2,dz -!ALE_RESOLUTION = 7*2.0, 2*2.01, 2.02, 2.03, 2.05, 2.08, 2.11, 2.15, 2.21, 2.2800000000000002, 2.37, 2.48, 2.61, 2.77, 2.95, 3.17, 3.4299999999999997, 3.74, 4.09, 4.49, 4.95, 5.48, 6.07, 6.74, 7.5, 8.34, 9.280000000000001, 10.33, 11.49, 12.77, 14.19, 15.74, 17.450000000000003, 19.31, 21.35, 23.56, 25.97, 28.580000000000002, 31.41, 34.47, 37.77, 41.32, 45.14, 49.25, 53.65, 58.370000000000005, 63.42, 68.81, 74.56, 80.68, 87.21000000000001, 94.14, 101.51, 109.33, 117.62, 126.4, 135.68, 145.5, 155.87, 166.81, 178.35, 190.51, 203.31, 216.78, 230.93, 245.8, 261.42, 277.83 ! [m] - ! The distribution of vertical resolution for the target - ! grid used for Eulerian-like coordinates. For example, - ! in z-coordinate mode, the parameter is a list of level - ! thicknesses (in m). In sigma-coordinate mode, the list - ! is of non-dimensional fractions of the water column. 
-!TARGET_DENSITIES = 1010.0, 1014.3034, 1017.8088, 1020.843, 1023.5566, 1025.813, 1027.0275, 1027.9114, 1028.6422, 1029.2795, 1029.852, 1030.3762, 1030.8626, 1031.3183, 1031.7486, 1032.1572, 1032.5471, 1032.9207, 1033.2798, 1033.6261, 1033.9608, 1034.2519, 1034.4817, 1034.6774, 1034.8508, 1035.0082, 1035.1533, 1035.2886, 1035.4159, 1035.5364, 1035.6511, 1035.7608, 1035.8661, 1035.9675, 1036.0645, 1036.1554, 1036.2411, 1036.3223, 1036.3998, 1036.4739, 1036.5451, 1036.6137, 1036.68, 1036.7441, 1036.8062, 1036.8526, 1036.8874, 1036.9164, 1036.9418, 1036.9647, 1036.9857, 1037.0052, 1037.0236, 1037.0409, 1037.0574, 1037.0738, 1037.0902, 1037.1066, 1037.123, 1037.1394, 1037.1558, 1037.1722, 1037.1887, 1037.206, 1037.2241, 1037.2435, 1037.2642, 1037.2866, 1037.3112, 1037.3389, 1037.3713, 1037.4118, 1037.475, 1037.6332, 1037.8104, 1038.0 ! [m] - ! HYBRID target densities for interfaces -MAXIMUM_INT_DEPTH_CONFIG = "FNC1:5,8000.0,1.0,.01" ! default = "NONE" - ! Determines how to specify the maximum interface depths. - ! Valid options are: - ! NONE - there are no maximum interface depths - ! PARAM - use the vector-parameter MAXIMUM_INTERFACE_DEPTHS - ! FILE:string - read from a file. The string specifies - ! the filename and variable name, separated - ! by a comma or space, e.g. FILE:lev.nc,Z - ! 
FNC1:string - FNC1:dz_min,H_total,power,precision -!MAXIMUM_INT_DEPTHS = 0.0, 5.0, 12.75, 23.25, 36.49, 52.480000000000004, 71.22, 92.71000000000001, 116.94000000000001, 143.92000000000002, 173.65, 206.13, 241.36, 279.33000000000004, 320.05000000000007, 363.5200000000001, 409.7400000000001, 458.7000000000001, 510.4100000000001, 564.8700000000001, 622.0800000000002, 682.0300000000002, 744.7300000000002, 810.1800000000003, 878.3800000000003, 949.3300000000004, 1023.0200000000004, 1099.4600000000005, 1178.6500000000005, 1260.5900000000006, 1345.2700000000007, 1432.7000000000007, 1522.8800000000008, 1615.8100000000009, 1711.490000000001, 1809.910000000001, 1911.080000000001, 2015.0000000000011, 2121.670000000001, 2231.080000000001, 2343.2400000000007, 2458.1500000000005, 2575.8100000000004, 2696.2200000000003, 2819.3700000000003, 2945.2700000000004, 3073.9200000000005, 3205.3200000000006, 3339.4600000000005, 3476.3500000000004, 3615.9900000000002, 3758.38, 3903.52, 4051.4, 4202.03, 4355.41, 4511.54, 4670.41, 4832.03, 4996.4, 5163.5199999999995, 5333.379999999999, 5505.989999999999, 5681.3499999999985, 5859.459999999998, 6040.319999999998, 6223.919999999998, 6410.269999999999, 6599.369999999999, 6791.219999999999, 6985.8099999999995, 7183.15, 7383.24, 7586.08, 7791.67, 8000.0 - ! The list of maximum depths for each interface. -MAX_LAYER_THICKNESS_CONFIG = "FNC1:400,31000.0,0.1,.01" ! default = "NONE" - ! Determines how to specify the maximum layer thicknesses. - ! Valid options are: - ! NONE - there are no maximum layer thicknesses - ! PARAM - use the vector-parameter MAX_LAYER_THICKNESS - ! FILE:string - read from a file. The string specifies - ! the filename and variable name, separated - ! by a comma or space, e.g. FILE:lev.nc,Z - ! 
FNC1:string - FNC1:dz_min,H_total,power,precision -!MAX_LAYER_THICKNESS = 400.0, 409.63, 410.32, 410.75, 411.07, 411.32, 411.52, 411.7, 411.86, 412.0, 412.13, 412.24, 412.35, 412.45, 412.54, 412.63, 412.71, 412.79, 412.86, 412.93, 413.0, 413.06, 413.12, 413.18, 413.24, 413.29, 413.34, 413.39, 413.44, 413.49, 413.54, 413.58, 413.62, 413.67, 413.71, 413.75, 413.78, 413.82, 413.86, 413.9, 413.93, 413.97, 414.0, 414.03, 414.06, 414.1, 414.13, 414.16, 414.19, 414.22, 414.24, 414.27, 414.3, 414.33, 414.35, 414.38, 414.41, 414.43, 414.46, 414.48, 414.51, 414.53, 414.55, 414.58, 414.6, 414.62, 414.65, 414.67, 414.69, 414.71, 414.73, 414.75, 414.77, 414.79, 414.83 ! [m] - ! The list of maximum thickness for each layer. -REMAPPING_SCHEME = "PPM_H4" ! default = "PLM" - ! This sets the reconstruction scheme used for vertical remapping for all - ! variables. It can be one of the following schemes: PCM (1st-order - ! accurate) - ! PLM (2nd-order accurate) - ! PPM_H4 (3rd-order accurate) - ! PPM_IH4 (3rd-order accurate) - ! PQM_IH4IH3 (4th-order accurate) - ! PQM_IH6IH5 (5th-order accurate) - -! === module MOM_grid === -! Parameters providing information about the lateral grid. - -! === module MOM_state_initialization === -INIT_LAYERS_FROM_Z_FILE = True ! [Boolean] default = False - ! If true, initialize the layer thicknesses, temperatures, and salinities from a - ! Z-space file on a latitude-longitude grid. - -! === module MOM_initialize_layers_from_Z === -TEMP_SALT_Z_INIT_FILE = "MOM6_IC_TS.nc" ! default = "temp_salt_z.nc" - ! The name of the z-space input file used to initialize - ! temperatures (T) and salinities (S). If T and S are not - ! in the same file, TEMP_Z_INIT_FILE and SALT_Z_INIT_FILE - ! must be set. -Z_INIT_FILE_PTEMP_VAR = "temp" ! default = "ptemp" - ! The name of the potential temperature variable in - ! TEMP_Z_INIT_FILE. -Z_INIT_FILE_SALT_VAR = "salt" ! default = "salt" - ! The name of the salinity variable in - ! SALT_Z_INIT_FILE. 
-Z_INIT_ALE_REMAPPING = True ! [Boolean] default = False - ! If True, then remap straight to model coordinate from file. -Z_INIT_REMAP_OLD_ALG = True ! [Boolean] default = False - ! If false, uses the preferred remapping algorithm for initialization. If true, - ! use an older, less robust algorithm for remapping. - -! === module MOM_diag_mediator === -!Jiande NUM_DIAG_COORDS = 2 ! default = 1 -NUM_DIAG_COORDS = 1 - ! The number of diagnostic vertical coordinates to use. - ! For each coordinate, an entry in DIAG_COORDS must be provided. -!Jiande DIAG_COORDS = "z Z ZSTAR", "rho2 RHO2 RHO" ! -DIAG_COORDS = "z Z ZSTAR" - ! A list of string tuples associating diag_table modules to - ! a coordinate definition used for diagnostics. Each string - ! is of the form "MODULE_SUFFIX,PARAMETER_SUFFIX,COORDINATE_NAME". -DIAG_COORD_DEF_Z="FILE:@[MOM6_DIAG_COORD_DEF_Z_FILE],interfaces=zw" -DIAG_MISVAL = @[MOM6_DIAG_MISVAL] -!AVAILABLE_DIAGS_FILE = "available_diags.002160" ! default = "available_diags.000000" - ! A file into which to write a list of all available ocean diagnostics that can - ! be included in a diag_table. -!DIAG_COORD_DEF_Z = "FILE:vgrid_75_2m.nc,dz" ! default = "WOA09" - ! Determines how to specify the coordinate resolution. Valid options are: - ! PARAM - use the vector-parameter DIAG_COORD_RES_Z - ! UNIFORM[:N] - uniformly distributed - ! FILE:string - read from a file. The string specifies - ! the filename and variable name, separated - ! by a comma or space, e.g. FILE:lev.nc,dz - ! or FILE:lev.nc,interfaces=zw - ! WOA09[:N] - the WOA09 vertical grid (approximately) - ! FNC1:string - FNC1:dz_min,H_total,power,precision - ! HYBRID:string - read from a file. The string specifies - ! the filename and two variable names, separated - ! by a comma or space, for sigma-2 and dz. e.g. - ! HYBRID:vgrid.nc,sigma2,dz -!DIAG_COORD_DEF_RHO2 = "RFNC1:35,999.5,1028,1028.5,8.,1038.,0.0078125" ! default = "WOA09" - ! Determines how to specify the coordinate resolution. 
Valid options are: - ! PARAM - use the vector-parameter DIAG_COORD_RES_RHO2 - ! UNIFORM[:N] - uniformly distributed - ! FILE:string - read from a file. The string specifies - ! the filename and variable name, separated - ! by a comma or space, e.g. FILE:lev.nc,dz - ! or FILE:lev.nc,interfaces=zw - ! WOA09[:N] - the WOA09 vertical grid (approximately) - ! FNC1:string - FNC1:dz_min,H_total,power,precision - ! HYBRID:string - read from a file. The string specifies - ! the filename and two variable names, separated - ! by a comma or space, for sigma-2 and dz. e.g. - ! HYBRID:vgrid.nc,sigma2,dz - -! === module MOM_MEKE === -USE_MEKE = True ! [Boolean] default = False - ! If true, turns on the MEKE scheme which calculates a sub-grid mesoscale eddy - ! kinetic energy budget. -MEKE_GMCOEFF = 1.0 ! [nondim] default = -1.0 - ! The efficiency of the conversion of potential energy into MEKE by the - ! thickness mixing parameterization. If MEKE_GMCOEFF is negative, this - ! conversion is not used or calculated. -MEKE_BGSRC = 1.0E-13 ! [W kg-1] default = 0.0 - ! A background energy source for MEKE. -MEKE_KHTH_FAC = 0.8 ! [nondim] default = 0.0 - ! A factor that maps MEKE%Kh to KhTh. -MEKE_KHTR_FAC = 0.8 ! [nondim] default = 0.0 - ! A factor that maps MEKE%Kh to KhTr. -MEKE_ALPHA_RHINES = 0.05 ! [nondim] default = 0.0 - ! If positive, is a coefficient weighting the Rhines scale in the expression for - ! mixing length used in MEKE-derived diffusivity. -MEKE_ALPHA_EADY = 0.05 ! [nondim] default = 0.0 - ! If positive, is a coefficient weighting the Eady length scale in the - ! expression for mixing length used in MEKE-derived diffusivity. - -! === module MOM_lateral_mixing_coeffs === -USE_VARIABLE_MIXING = True ! [Boolean] default = False - ! If true, the variable mixing code will be called. This allows diagnostics to - ! be created even if the scheme is not used. If KHTR_SLOPE_CFF>0 or - ! KhTh_Slope_Cff>0, this is set to true regardless of what is in the parameter - ! file. 
-RESOLN_SCALED_KH = True ! [Boolean] default = False - ! If true, the Laplacian lateral viscosity is scaled away when the first - ! baroclinic deformation radius is well resolved. -RESOLN_SCALED_KHTH = True ! [Boolean] default = False - ! If true, the interface depth diffusivity is scaled away when the first - ! baroclinic deformation radius is well resolved. -KHTR_SLOPE_CFF = 0.25 ! [nondim] default = 0.0 - ! The nondimensional coefficient in the Visbeck formula for the epipycnal tracer - ! diffusivity -USE_STORED_SLOPES = True ! [Boolean] default = False - ! If true, the isopycnal slopes are calculated once and stored for re-use. This - ! uses more memory but avoids calling the equation of state more times than - ! should be necessary. -KH_RES_FN_POWER = 100 ! [nondim] default = 2 - ! The power of dx/Ld in the Kh resolution function. Any positive integer may be - ! used, although even integers are more efficient to calculate. Setting this - ! greater than 100 results in a step-function being used. -VISC_RES_FN_POWER = 2 ! [nondim] default = 100 - ! The power of dx/Ld in the Kh resolution function. Any positive integer may be - ! used, although even integers are more efficient to calculate. Setting this - ! greater than 100 results in a step-function being used. This function affects - ! lateral viscosity, Kh, and not KhTh. -INTERNAL_WAVE_SPEED_BETTER_EST = False ! [Boolean] default = True - ! If true, use a more robust estimate of the first mode wave speed as the - ! starting point for iterations. - -! === module MOM_set_visc === -CHANNEL_DRAG = True ! [Boolean] default = False - ! If true, the bottom drag is exerted directly on each layer proportional to the - ! fraction of the bottom it overlies. -HBBL = 10.0 ! [m] - ! The thickness of a bottom boundary layer with a viscosity of KVBBL if - ! BOTTOMDRAGLAW is not defined, or the thickness over which near-bottom - ! velocities are averaged for the drag law if BOTTOMDRAGLAW is defined but - ! LINEAR_DRAG is not. 
-DRAG_BG_VEL = 0.1 ! [m s-1] default = 0.0 - ! DRAG_BG_VEL is either the assumed bottom velocity (with LINEAR_DRAG) or an - ! unresolved velocity that is combined with the resolved velocity to estimate - ! the velocity magnitude. DRAG_BG_VEL is only used when BOTTOMDRAGLAW is - ! defined. -BBL_USE_EOS = True ! [Boolean] default = False - ! If true, use the equation of state in determining the properties of the bottom - ! boundary layer. Otherwise use the layer target potential densities. -BBL_THICK_MIN = 0.1 ! [m] default = 0.0 - ! The minimum bottom boundary layer thickness that can be used with - ! BOTTOMDRAGLAW. This might be Kv/(cdrag*drag_bg_vel) to give Kv as the minimum - ! near-bottom viscosity. -KV = 1.0E-04 ! [m2 s-1] - ! The background kinematic viscosity in the interior. The molecular value, ~1e-6 - ! m2 s-1, may be used. -KV_BBL_MIN = 0.0 ! [m2 s-1] default = 1.0E-04 - ! The minimum viscosities in the bottom boundary layer. -KV_TBL_MIN = 0.0 ! [m2 s-1] default = 1.0E-04 - ! The minimum viscosities in the top boundary layer. - -! === module MOM_thickness_diffuse === -USE_GM_WORK_BUG = True ! [Boolean] default = False - ! If true, compute the top-layer work tendency on the u-grid with the incorrect - ! sign, for legacy reproducibility. - -! === module MOM_dynamics_split_RK2 === - -! === module MOM_continuity === - -! === module MOM_continuity_PPM === -ETA_TOLERANCE = 1.0E-06 ! [m] default = 3.75E-09 - ! The tolerance for the differences between the barotropic and baroclinic - ! estimates of the sea surface height due to the fluxes through each face. The - ! total tolerance for SSH is 4 times this value. The default is - ! 0.5*NK*ANGSTROM, and this should not be set less than about - ! 10^-15*MAXIMUM_DEPTH. -ETA_TOLERANCE_AUX = 0.001 ! [m] default = 1.0E-06 - ! The tolerance for free-surface height discrepancies between the barotropic - ! solution and the sum of the layer thicknesses when calculating the auxiliary - ! corrected velocities. 
By default, this is the same as ETA_TOLERANCE, but can - ! be made larger for efficiency. - -! === module MOM_CoriolisAdv === -CORIOLIS_SCHEME = "SADOURNY75_ENSTRO" ! default = "SADOURNY75_ENERGY" - ! CORIOLIS_SCHEME selects the discretization for the Coriolis terms. Valid - ! values are: - ! SADOURNY75_ENERGY - Sadourny, 1975; energy cons. - ! ARAKAWA_HSU90 - Arakawa & Hsu, 1990 - ! SADOURNY75_ENSTRO - Sadourny, 1975; enstrophy cons. - ! ARAKAWA_LAMB81 - Arakawa & Lamb, 1981; En. + Enst. - ! ARAKAWA_LAMB_BLEND - A blend of Arakawa & Lamb with - ! Arakawa & Hsu and Sadourny energy -BOUND_CORIOLIS = True ! [Boolean] default = False - ! If true, the Coriolis terms at u-points are bounded by the four estimates of - ! (f+rv)v from the four neighboring v-points, and similarly at v-points. This - ! option would have no effect on the SADOURNY Coriolis scheme if it were - ! possible to use centered difference thickness fluxes. - -! === module MOM_PressureForce === - -! === module MOM_PressureForce_AFV === -MASS_WEIGHT_IN_PRESSURE_GRADIENT = True ! [Boolean] default = False - ! If true, use mass weighting when interpolating T/S for integrals near the - ! bathymetry in AFV pressure gradient calculations. - -! === module MOM_hor_visc === -LAPLACIAN = True ! [Boolean] default = False - ! If true, use a Laplacian horizontal viscosity. -SMAGORINSKY_KH = True ! [Boolean] default = False - ! If true, use a Smagorinsky nonlinear eddy viscosity. -SMAG_LAP_CONST = 0.15 ! [nondim] default = 0.0 - ! The nondimensional Laplacian Smagorinsky constant, often 0.15. -AH_VEL_SCALE = 0.05 ! [m s-1] default = 0.0 - ! The velocity scale which is multiplied by the cube of the grid spacing to - ! calculate the biharmonic viscosity. The final viscosity is the largest of this - ! scaled viscosity, the Smagorinsky and Leith viscosities, and AH. -SMAGORINSKY_AH = True ! [Boolean] default = False - ! If true, use a biharmonic Smagorinsky nonlinear eddy viscosity. -SMAG_BI_CONST = 0.06 ! 
[nondim] default = 0.0 - ! The nondimensional biharmonic Smagorinsky constant, typically 0.015 - 0.06. -USE_KH_BG_2D = True ! [Boolean] default = False - ! If true, read a file containing 2-d background harmonic viscosities. The final - ! viscosity is the maximum of the other terms and this background value. - -! === module MOM_vert_friction === -HMIX_FIXED = 0.5 ! [m] - ! The prescribed depth over which the near-surface viscosity and diffusivity are - ! elevated when the bulk mixed layer is not used. -KVML = 1.0E-04 ! [m2 s-1] default = 1.0E-04 - ! The kinematic viscosity in the mixed layer. A typical value is ~1e-2 m2 s-1. - ! KVML is not used if BULKMIXEDLAYER is true. The default is set by KV. -MAXVEL = 6.0 ! [m s-1] default = 3.0E+08 - ! The maximum velocity allowed before the velocity components are truncated. - -! === module MOM_barotropic === -BOUND_BT_CORRECTION = True ! [Boolean] default = False - ! If true, the corrective pseudo mass-fluxes into the barotropic solver are - ! limited to values that require less than maxCFL_BT_cont to be accommodated. -BT_PROJECT_VELOCITY = True ! [Boolean] default = False - ! If true, step the barotropic velocity first and project out the velocity - ! tendency by 1+BEBT when calculating the transport. The default (false) is to - ! use a predictor continuity step to find the pressure field, and then to do a - ! corrector continuity step using a weighted average of the old and new - ! velocities, with weights of (1-BEBT) and BEBT. -BT_STRONG_DRAG = True ! [Boolean] default = False - ! If true, use a stronger estimate of the retarding effects of strong bottom - ! drag, by making it implicit with the barotropic time-step instead of implicit - ! with the baroclinic time-step and dividing by the number of barotropic steps. -BEBT = 0.2 ! [nondim] default = 0.1 - ! BEBT determines whether the barotropic time stepping uses the forward-backward - ! time-stepping scheme or a backward Euler scheme. BEBT is valid in the range - ! 
from 0 (for a forward-backward treatment of nonrotating gravity waves) to 1 - ! (for a backward Euler treatment). In practice, BEBT must be greater than about - ! 0.05. -DTBT = -0.9 ! [s or nondim] default = -0.98 - ! The barotropic time step, in s. DTBT is only used with the split explicit time - ! stepping. To set the time step automatically based the maximum stable value - ! use 0, or a negative value gives the fraction of the stable value. Setting - ! DTBT to 0 is the same as setting it to -0.98. The value of DTBT that will - ! actually be used is an integer fraction of DT, rounding down. - -! === module MOM_mixed_layer_restrat === -MIXEDLAYER_RESTRAT = True ! [Boolean] default = False - ! If true, a density-gradient dependent re-stratifying flow is imposed in the - ! mixed layer. Can be used in ALE mode without restriction but in layer mode can - ! only be used if BULKMIXEDLAYER is true. -FOX_KEMPER_ML_RESTRAT_COEF = 60.0 ! [nondim] default = 0.0 - ! A nondimensional coefficient that is proportional to the ratio of the - ! deformation radius to the dominant lengthscale of the submesoscale mixed layer - ! instabilities, times the minimum of the ratio of the mesoscale eddy kinetic - ! energy to the large-scale geostrophic kinetic energy or 1 plus the square of - ! the grid spacing over the deformation radius, as detailed by Fox-Kemper et al. - ! (2010) -MLE_USE_PBL_MLD = True ! [Boolean] default = False - ! If true, the MLE parameterization will use the mixed-layer depth provided by - ! the active PBL parameterization. If false, MLE will estimate a MLD based on a - ! density difference with the surface using the parameter MLE_DENSITY_DIFF. -MLE_MLD_DECAY_TIME = 2.592E+06 ! [s] default = 0.0 - ! The time-scale for a running-mean filter applied to the mixed-layer depth used - ! in the MLE restratification parameterization. When the MLD deepens below the - ! current running-mean the running-mean is instantaneously set to the current - ! MLD. - -! 
=== module MOM_diabatic_driver === -! The following parameters are used for diabatic processes. -ENERGETICS_SFC_PBL = True ! [Boolean] default = False - ! If true, use an implied energetics planetary boundary layer scheme to - ! determine the diffusivity and viscosity in the surface boundary layer. -EPBL_IS_ADDITIVE = False ! [Boolean] default = True - ! If true, the diffusivity from ePBL is added to all other diffusivities. - ! Otherwise, the larger of kappa-shear and ePBL diffusivities are used. -KD_MIN_TR = 2.0E-06 ! [m2 s-1] default = 2.0E-06 - ! A minimal diffusivity that should always be applied to tracers, especially in - ! massless layers near the bottom. The default is 0.1*KD. - -! === module MOM_CVMix_KPP === -! This is the MOM wrapper to CVMix:KPP -! See http://cvmix.github.io/ - -! === module MOM_tidal_mixing === -! Vertical Tidal Mixing Parameterization -INT_TIDE_DISSIPATION = True ! [Boolean] default = False - ! If true, use an internal tidal dissipation scheme to drive diapycnal mixing, - ! along the lines of St. Laurent et al. (2002) and Simmons et al. (2004). -INT_TIDE_PROFILE = "POLZIN_09" ! default = "STLAURENT_02" - ! INT_TIDE_PROFILE selects the vertical profile of energy dissipation with - ! INT_TIDE_DISSIPATION. Valid values are: - ! STLAURENT_02 - Use the St. Laurent et al exponential - ! decay profile. - ! POLZIN_09 - Use the Polzin WKB-stretched algebraic - ! decay profile. -KAPPA_ITIDES = 6.28319E-04 ! [m-1] default = 6.283185307179586E-04 - ! A topographic wavenumber used with INT_TIDE_DISSIPATION. The default is 2pi/10 - ! km, as in St.Laurent et al. 2002. -KAPPA_H2_FACTOR = 0.84 ! [nondim] default = 1.0 - ! A scaling factor for the roughness amplitude with INT_TIDE_DISSIPATION. -TKE_ITIDE_MAX = 0.1 ! [W m-2] default = 1000.0 - ! The maximum internal tide energy source available to mix above the bottom - ! boundary layer with INT_TIDE_DISSIPATION. -READ_TIDEAMP = True ! [Boolean] default = False - ! 
If true, read a file (given by TIDEAMP_FILE) containing the tidal amplitude - ! with INT_TIDE_DISSIPATION. -TIDEAMP_FILE = "tidal_amplitude.nc" ! default = "tideamp.nc" - ! The path to the file containing the spatially varying tidal amplitudes with - ! INT_TIDE_DISSIPATION. -H2_FILE = "topog.nc" ! - ! The path to the file containing the sub-grid-scale topographic roughness - ! amplitude with INT_TIDE_DISSIPATION. - -! === module MOM_CVMix_conv === -! Parameterization of enhanced mixing due to convection via CVMix - -! === module MOM_set_diffusivity === -BBL_MIXING_AS_MAX = False ! [Boolean] default = True - ! If true, take the maximum of the diffusivity from the BBL mixing and the other - ! diffusivities. Otherwise, diffusivity from the BBL_mixing is simply added. -USE_LOTW_BBL_DIFFUSIVITY = True ! [Boolean] default = False - ! If true, uses a simple, imprecise but non-coordinate dependent, model of BBL - ! mixing diffusivity based on Law of the Wall. Otherwise, uses the original BBL - ! scheme. -SIMPLE_TKE_TO_KD = True ! [Boolean] default = False - ! If true, uses a simple estimate of Kd/TKE that will work for arbitrary - ! vertical coordinates. If false, calculates Kd/TKE and bounds based on exact - ! energetics for an isopycnal layer-formulation. - -! === module MOM_bkgnd_mixing === -! Adding static vertical background mixing coefficients -KD = 2.0E-05 ! [m2 s-1] default = 0.0 - ! The background diapycnal diffusivity of density in the interior. Zero or the - ! molecular value, ~1e-7 m2 s-1, may be used. -KD_MIN = 2.0E-06 ! [m2 s-1] default = 2.0E-07 - ! The minimum diapycnal diffusivity. -HENYEY_IGW_BACKGROUND = True ! [Boolean] default = False - ! If true, use a latitude-dependent scaling for the near surface background - ! diffusivity, as described in Harrison & Hallberg, JPO 2008. -KD_MAX = 0.1 ! [m2 s-1] default = -1.0 - ! The maximum permitted increment for the diapycnal diffusivity from TKE-based - ! parameterizations, or a negative value for no limit. 
- -! === module MOM_kappa_shear === -! Parameterization of shear-driven turbulence following Jackson, Hallberg and Legg, JPO 2008 -USE_JACKSON_PARAM = True ! [Boolean] default = False - ! If true, use the Jackson-Hallberg-Legg (JPO 2008) shear mixing - ! parameterization. -MAX_RINO_IT = 25 ! [nondim] default = 50 - ! The maximum number of iterations that may be used to estimate the Richardson - ! number driven mixing. -VERTEX_SHEAR = False ! [Boolean] default = False - ! If true, do the calculations of the shear-driven mixing - ! at the cell vertices (i.e., the vorticity points). -KD_TRUNC_KAPPA_SHEAR = 2.0E-07 ! [m2 s-1] default = 2.0E-07 - ! The value of shear-driven diffusivity that is considered negligible and is - ! rounded down to 0. The default is 1% of KD_KAPPA_SHEAR_0. -KAPPA_SHEAR_ITER_BUG = True ! [Boolean] default = False - ! If true, use an older, dimensionally inconsistent estimate of the derivative - ! of diffusivity with energy in the Newton's method iteration. The bug causes - ! undercorrections when dz > 1 m. -KAPPA_SHEAR_ALL_LAYER_TKE_BUG = True ! [Boolean] default = False - ! If true, report back the latest estimate of TKE instead of the time average - ! TKE when there is mass in all layers. Otherwise always report the time - ! averaged TKE, as is currently done when there are some massless layers. - -! === module MOM_CVMix_shear === -! Parameterization of shear-driven turbulence via CVMix (various options) - -! === module MOM_CVMix_ddiff === -! Parameterization of mixing due to double diffusion processes via CVMix - -! === module MOM_diabatic_aux === -! The following parameters are used for auxiliary diabatic processes. -PRESSURE_DEPENDENT_FRAZIL = False ! [Boolean] default = False - ! If true, use a pressure dependent freezing temperature when making frazil. The - ! default is false, which will be faster but is inappropriate with ice-shelf - ! cavities. -VAR_PEN_SW = True ! [Boolean] default = False - ! 
If true, use one of the CHL_A schemes specified by OPACITY_SCHEME to determine - ! the e-folding depth of incoming short wave radiation. -CHL_FILE = @[CHLCLIM] ! - ! CHL_FILE is the file containing chl_a concentrations in the variable CHL_A. It - ! is used when VAR_PEN_SW and CHL_FROM_FILE are true. - -! === module MOM_energetic_PBL === -ML_OMEGA_FRAC = 0.001 ! [nondim] default = 0.0 - ! When setting the decay scale for turbulence, use this fraction of the absolute - ! rotation rate blended with the local value of f, as sqrt((1-of)*f^2 + - ! of*4*omega^2). -TKE_DECAY = 0.01 ! [nondim] default = 2.5 - ! TKE_DECAY relates the vertical rate of decay of the TKE available for - ! mechanical entrainment to the natural Ekman depth. -EPBL_MSTAR_SCHEME = "OM4" ! default = "CONSTANT" - ! EPBL_MSTAR_SCHEME selects the method for setting mstar. Valid values are: - ! CONSTANT - Use a fixed mstar given by MSTAR - ! OM4 - Use L_Ekman/L_Obukhov in the stabilizing limit, as in OM4 - ! REICHL_H18 - Use the scheme documented in Reichl & Hallberg, 2018. -MSTAR_CAP = 10.0 ! [nondim] default = -1.0 - ! If this value is positive, it sets the maximum value of mstar allowed in ePBL. - ! (This is not used if EPBL_MSTAR_SCHEME = CONSTANT). -MSTAR2_COEF1 = 0.29 ! [nondim] default = 0.3 - ! Coefficient in computing mstar when rotation and stabilizing effects are both - ! important (used if EPBL_MSTAR_SCHEME = OM4). -MSTAR2_COEF2 = 0.152 ! [nondim] default = 0.085 - ! Coefficient in computing mstar when only rotation limits the total mixing - ! (used if EPBL_MSTAR_SCHEME = OM4) -NSTAR = 0.06 ! [nondim] default = 0.2 - ! The portion of the buoyant potential energy imparted by surface fluxes that is - ! available to drive entrainment at the base of mixed layer when that energy is - ! positive. -MSTAR_CONV_ADJ = 0.667 ! [nondim] default = 0.0 - ! Coefficient used for reducing mstar during convection due to reduction of - ! stable density gradient. -USE_MLD_ITERATION = False ! 
[Boolean] default = True - ! A logical that specifies whether or not to use the distance to the bottom of - ! the actively turbulent boundary layer to help set the EPBL length scale. -EPBL_TRANSITION_SCALE = 0.01 ! [nondim] default = 0.1 - ! A scale for the mixing length in the transition layer at the edge of the - ! boundary layer as a fraction of the boundary layer thickness. -MIX_LEN_EXPONENT = 1.0 ! [nondim] default = 2.0 - ! The exponent applied to the ratio of the distance to the MLD and the MLD depth - ! which determines the shape of the mixing length. This is only used if - ! USE_MLD_ITERATION is True. -USE_LA_LI2016 = @[MOM6_USE_LI2016] ! [nondim] default = False - ! A logical to use the Li et al. 2016 (submitted) formula to determine the - ! Langmuir number. -USE_WAVES = @[MOM6_USE_WAVES] ! [Boolean] default = False - ! If true, enables surface wave modules. -WAVE_METHOD = "SURFACE_BANDS" ! default = "EMPTY" - ! Choice of wave method, valid options include: - ! TEST_PROFILE - Prescribed from surface Stokes drift - ! and a decay wavelength. - ! SURFACE_BANDS - Computed from multiple surface values - ! and decay wavelengths. - ! DHH85 - Uses Donelan et al. 1985 empirical - ! wave spectrum with prescribed values. - ! LF17 - Infers Stokes drift profile from wind - ! speed following Li and Fox-Kemper 2017. -SURFBAND_SOURCE = "COUPLER" ! default = "EMPTY" - ! Choice of SURFACE_BANDS data mode, valid options include: - ! DATAOVERRIDE - Read from NetCDF using FMS DataOverride. - ! COUPLER - Look for variables from coupler pass - ! INPUT - Testing with fixed values. -STK_BAND_COUPLER = 3 ! default = 1 - ! STK_BAND_COUPLER is the number of Stokes drift bands in the coupler. This has - ! to be consistent with the number of Stokes drift bands in WW3, or the model - ! will fail. -SURFBAND_WAVENUMBERS = 0.04, 0.11, 0.3305 ! [rad/m] default = 0.12566 - ! Central wavenumbers for surface Stokes drift bands. -EPBL_LANGMUIR_SCHEME = "ADDITIVE" ! default = "NONE" - ! 
EPBL_LANGMUIR_SCHEME selects the method for including Langmuir turbulence. - ! Valid values are: - ! NONE - Do not do any extra mixing due to Langmuir turbulence - ! RESCALE - Use a multiplicative rescaling of mstar to account for Langmuir - ! turbulence - ! ADDITIVE - Add a Langmuir turbulence contribution to mstar to other - ! contributions -LT_ENHANCE_COEF = 0.044 ! [nondim] default = 0.447 - ! Coefficient for Langmuir enhancement of mstar -LT_ENHANCE_EXP = -1.5 ! [nondim] default = -1.33 - ! Exponent for Langmuir enhancement of mstar -LT_MOD_LAC1 = 0.0 ! [nondim] default = -0.87 - ! Coefficient for modification of Langmuir number due to MLD approaching Ekman - ! depth. -LT_MOD_LAC4 = 0.0 ! [nondim] default = 0.95 - ! Coefficient for modification of Langmuir number due to ratio of Ekman to - ! stable Obukhov depth. -LT_MOD_LAC5 = 0.22 ! [nondim] default = 0.95 - ! Coefficient for modification of Langmuir number due to ratio of Ekman to - ! unstable Obukhov depth. - -! === module MOM_regularize_layers === - -! === module MOM_opacity === -PEN_SW_NBANDS = 3 ! default = 1 - ! The number of bands of penetrating shortwave radiation. - -! === module MOM_tracer_advect === -TRACER_ADVECTION_SCHEME = "PPM:H3" ! default = "PLM" - ! The horizontal transport scheme for tracers: - ! PLM - Piecewise Linear Method - ! PPM:H3 - Piecewise Parabolic Method (Huynh 3rd order) - ! PPM - Piecewise Parabolic Method (Colella-Woodward) - -! === module MOM_tracer_hor_diff === -CHECK_DIFFUSIVE_CFL = True ! [Boolean] default = False - ! If true, use enough iterations of the diffusion to ensure that the diffusive - ! equivalent of the CFL limit is not violated. If false, always use the greater - ! of 1 or MAX_TR_DIFFUSION_CFL iterations. - -! === module MOM_neutral_diffusion === -! This module implements neutral diffusion of tracers -USE_NEUTRAL_DIFFUSION = True ! [Boolean] default = False - ! If true, enables the neutral diffusion module. - -! === module MOM_lateral_boundary_diffusion === -! 
This module implements lateral diffusion of tracers near boundaries - -! === module MOM_sum_output === -CALCULATE_APE = False ! [Boolean] default = True - ! If true, calculate the available potential energy of the interfaces. Setting - ! this to false reduces the memory footprint of high-PE-count models - ! dramatically. -MAXTRUNC = 100000 ! [truncations save_interval-1] default = 0 - ! The run will be stopped, and the day set to a very large value if the velocity - ! is truncated more than MAXTRUNC times between energy saves. Set MAXTRUNC to 0 - ! to stop if there is any truncation of velocities. -ENERGYSAVEDAYS = 0.25 ! [days] default = 1.0 - ! The interval in units of TIMEUNIT between saves of the energies of the run and - ! other globally summed diagnostics. - -! === module ocean_model_init === - -! === module MOM_oda_incupd === -ODA_INCUPD = @[ODA_INCUPD] ! [Boolean] default = False - ! If true, oda incremental updates will be applied - ! everywhere in the domain. -ODA_INCUPD_FILE = "mom6_increment.nc" ! The name of the file with the T,S,h increments. - -ODA_TEMPINC_VAR = "Temp" ! default = "ptemp_inc" - ! The name of the potential temperature inc. variable in - ! ODA_INCUPD_FILE. -ODA_SALTINC_VAR = "Salt" ! default = "sal_inc" - ! The name of the salinity inc. variable in - ! ODA_INCUPD_FILE. -ODA_THK_VAR = "h" ! default = "h" - ! The name of the int. depth inc. variable in - ! ODA_INCUPD_FILE. -ODA_INCUPD_UV = true ! -ODA_UINC_VAR = "u" ! default = "u_inc" - ! The name of the zonal vel. inc. variable in - ! ODA_INCUPD_UV_FILE. -ODA_VINC_VAR = "v" ! default = "v_inc" - ! The name of the meridional vel. inc. variable in - ! ODA_INCUPD_UV_FILE. -ODA_INCUPD_NHOURS = @[ODA_INCUPD_NHOURS] ! default=3.0 - -! === module MOM_surface_forcing === -OCEAN_SURFACE_STAGGER = "A" ! default = "C" - ! A case-insensitive character string to indicate the - ! staggering of the surface velocity field that is - ! returned to the coupler. Valid values include - ! 'A', 'B', or 'C'. 
- -MAX_P_SURF = 0.0 ! [Pa] default = -1.0 - ! The maximum surface pressure that can be exerted by the atmosphere and - ! floating sea-ice or ice shelves. This is needed because the FMS coupling - ! structure does not limit the water that can be frozen out of the ocean and the - ! ice-ocean heat fluxes are treated explicitly. No limit is applied if a - ! negative value is used. -WIND_STAGGER = "A" ! default = "C" - ! A case-insensitive character string to indicate the - ! staggering of the input wind stress field. Valid - ! values are 'A', 'B', or 'C'. -CD_TIDES = 0.0018 ! [nondim] default = 1.0E-04 - ! The drag coefficient that applies to the tides. -GUST_CONST = 0.02 ! [Pa] default = 0.0 - ! The background gustiness in the winds. -FIX_USTAR_GUSTLESS_BUG = False ! [Boolean] default = True - ! If true correct a bug in the time-averaging of the gustless wind friction - ! velocity -! === module ocean_stochastics === -DO_SPPT = @[DO_OCN_SPPT] ! [Boolean] default = False - ! If true perturb the diabatic tendencies in MOM_diabatic_driver -PERT_EPBL = @[PERT_EPBL] ! [Boolean] default = False - ! If true perturb the KE dissipation and destruction in MOM_energetic_PBL - -! === module MOM_restart === - -! === module MOM_file_parser === diff --git a/parm/ufs/mom6/MOM_input_template_500 b/parm/ufs/mom6/MOM_input_template_500 deleted file mode 100644 index dde805d247..0000000000 --- a/parm/ufs/mom6/MOM_input_template_500 +++ /dev/null @@ -1,592 +0,0 @@ -! This file was written by the model and records the non-default parameters used at run-time. -! === module MOM === - -! === module MOM_unit_scaling === -! Parameters for doing unit scaling of variables. -USE_REGRIDDING = True ! [Boolean] default = False - ! If True, use the ALE algorithm (regridding/remapping). If False, use the - ! layered isopycnal algorithm. -THICKNESSDIFFUSE = True ! [Boolean] default = False - ! If true, interface heights are diffused with a coefficient of KHTH. -THICKNESSDIFFUSE_FIRST = True ! 
[Boolean] default = False - ! If true, do thickness diffusion before dynamics. This is only used if - ! THICKNESSDIFFUSE is true. -DT = @[DT_DYNAM_MOM6] ! [s] - ! The (baroclinic) dynamics time step. The time-step that is actually used will - ! be an integer fraction of the forcing time-step (DT_FORCING in ocean-only mode - ! or the coupling timestep in coupled mode.) -DT_THERM = @[DT_THERM_MOM6] ! [s] default = 1800.0 - ! The thermodynamic and tracer advection time step. Ideally DT_THERM should be - ! an integer multiple of DT and less than the forcing or coupling time-step, - ! unless THERMO_SPANS_COUPLING is true, in which case DT_THERM can be an integer - ! multiple of the coupling timestep. By default DT_THERM is set to DT. -THERMO_SPANS_COUPLING = @[MOM6_THERMO_SPAN] ! [Boolean] default = False - ! If true, the MOM will take thermodynamic and tracer timesteps that can be - ! longer than the coupling timestep. The actual thermodynamic timestep that is - ! used in this case is the largest integer multiple of the coupling timestep - ! that is less than or equal to DT_THERM. -HFREEZE = 20.0 ! [m] default = -1.0 - ! If HFREEZE > 0, melt potential will be computed. The actual depth - ! over which melt potential is computed will be min(HFREEZE, OBLD) - ! where OBLD is the boundary layer depth. If HFREEZE <= 0 (default) - ! melt potential will not be computed. -FRAZIL = True ! [Boolean] default = False - ! If true, water freezes if it gets too cold, and the accumulated heat deficit - ! is returned in the surface state. FRAZIL is only used if - ! ENABLE_THERMODYNAMICS is true. -BOUND_SALINITY = True ! [Boolean] default = False - ! If true, limit salinity to being positive. (The sea-ice model may ask for more - ! salt than is available and drive the salinity negative otherwise.) - -! === module MOM_domains === -TRIPOLAR_N = True ! [Boolean] default = False - ! Use tripolar connectivity at the northern edge of the domain. With - ! TRIPOLAR_N, NIGLOBAL must be even. 
-NIGLOBAL = @[NX_GLB] ! - ! The total number of thickness grid points in the x-direction in the physical - ! domain. With STATIC_MEMORY_ this is set in MOM_memory.h at compile time. -NJGLOBAL = @[NY_GLB] ! - ! The total number of thickness grid points in the y-direction in the physical - ! domain. With STATIC_MEMORY_ this is set in MOM_memory.h at compile time. - -! === module MOM_hor_index === -! Sets the horizontal array index types. - -! === module MOM_fixed_initialization === -INPUTDIR = "INPUT" ! default = "." - ! The directory in which input files are found. - -! === module MOM_grid_init === -GRID_CONFIG = "mosaic" ! - ! A character string that determines the method for defining the horizontal - ! grid. Current options are: - ! mosaic - read the grid from a mosaic (supergrid) - ! file set by GRID_FILE. - ! cartesian - use a (flat) Cartesian grid. - ! spherical - use a simple spherical grid. - ! mercator - use a Mercator spherical grid. -GRID_FILE = "ocean_hgrid.nc" ! - ! Name of the file from which to read horizontal grid data. -GRID_ROTATION_ANGLE_BUGS = False ! [Boolean] default = True - ! If true, use an older algorithm to calculate the sine and - ! cosines needed to rotate between grid-oriented directions and - ! true north and east. Differences arise at the tripolar fold. -USE_TRIPOLAR_GEOLONB_BUG = False ! [Boolean] default = True - ! If true, use older code that incorrectly sets the longitude in some points - ! along the tripolar fold to be off by 360 degrees. -TOPO_CONFIG = "file" ! - ! This specifies how bathymetry is specified: - ! file - read bathymetric information from the file - ! specified by (TOPO_FILE). - ! flat - flat bottom set to MAXIMUM_DEPTH. - ! bowl - an analytically specified bowl-shaped basin - ! ranging between MAXIMUM_DEPTH and MINIMUM_DEPTH. - ! spoon - a similar shape to 'bowl', but with a vertical - ! wall at the southern face. - ! halfpipe - a zonally uniform channel with a half-sine - ! profile in the meridional direction. - ! 
bbuilder - build topography from list of functions. - ! benchmark - use the benchmark test case topography. - ! Neverworld - use the Neverworld test case topography. - ! DOME - use a slope and channel configuration for the - ! DOME sill-overflow test case. - ! ISOMIP - use a slope and channel configuration for the - ! ISOMIP test case. - ! DOME2D - use a shelf and slope configuration for the - ! DOME2D gravity current/overflow test case. - ! Kelvin - flat but with rotated land mask. - ! seamount - Gaussian bump for spontaneous motion test case. - ! dumbbell - Sloshing channel with reservoirs on both ends. - ! shelfwave - exponential slope for shelfwave test case. - ! Phillips - ACC-like idealized topography used in the Phillips config. - ! dense - Denmark Strait-like dense water formation and overflow. - ! USER - call a user modified routine. -TOPO_FILE = "ocean_topog.nc" ! default = "topog.nc" - ! The file from which the bathymetry is read. -!MAXIMUM_DEPTH = 5801.341919389728 ! [m] - ! The (diagnosed) maximum depth of the ocean. -MINIMUM_DEPTH = 10.0 ! [m] default = 0.0 - ! If MASKING_DEPTH is unspecified, then anything shallower than MINIMUM_DEPTH is - ! assumed to be land and all fluxes are masked out. If MASKING_DEPTH is - ! specified, then all depths shallower than MINIMUM_DEPTH but deeper than - ! MASKING_DEPTH are rounded to MINIMUM_DEPTH. - -! === module MOM_open_boundary === -! Controls where open boundaries are located, what kind of boundary condition to impose, and what data to apply, -! if any. -MASKING_DEPTH = 0.0 ! [m] default = -9999.0 - ! The depth below which to mask points as land points, for which all fluxes are - ! zeroed out. MASKING_DEPTH is ignored if negative. - -! === module MOM_verticalGrid === -! Parameters providing information about the vertical grid. -NK = 25 ! [nondim] - ! The number of model layers. - -! === module MOM_tracer_registry === - -! === module MOM_EOS === -TFREEZE_FORM = "MILLERO_78" ! default = "LINEAR" - ! 
TFREEZE_FORM determines which expression should be used for the freezing - ! point. Currently, the valid choices are "LINEAR", "MILLERO_78", "TEOS10" - -! === module MOM_restart === -RESTART_CHECKSUMS_REQUIRED = False -! === module MOM_tracer_flow_control === - -! === module MOM_coord_initialization === -COORD_CONFIG = "file" ! default = "none" - ! This specifies how layers are to be defined: - ! ALE or none - used to avoid defining layers in ALE mode - ! file - read coordinate information from the file - ! specified by (COORD_FILE). - ! BFB - Custom coords for buoyancy-forced basin case - ! based on SST_S, T_BOT and DRHO_DT. - ! linear - linear based on interfaces not layers - ! layer_ref - linear based on layer densities - ! ts_ref - use reference temperature and salinity - ! ts_range - use range of temperature and salinity - ! (T_REF and S_REF) to determine surface density - ! and GINT calculate internal densities. - ! gprime - use reference density (RHO_0) for surface - ! density and GINT calculate internal densities. - ! ts_profile - use temperature and salinity profiles - ! (read from COORD_FILE) to set layer densities. - ! USER - call a user modified routine. -COORD_FILE = "layer_coord25.nc" ! - ! The file from which the coordinate densities are read. -REGRIDDING_COORDINATE_MODE = "HYCOM1" ! default = "LAYER" - ! Coordinate mode for vertical regridding. Choose among the following - ! possibilities: LAYER - Isopycnal or stacked shallow water layers - ! ZSTAR, Z* - stretched geopotential z* - ! SIGMA_SHELF_ZSTAR - stretched geopotential z* ignoring shelf - ! SIGMA - terrain following coordinates - ! RHO - continuous isopycnal - ! HYCOM1 - HyCOM-like hybrid coordinate - ! SLIGHT - stretched coordinates above continuous isopycnal - ! ADAPTIVE - optimize for smooth neutral density surfaces -BOUNDARY_EXTRAPOLATION = True ! [Boolean] default = False - ! When defined, a proper high-order reconstruction scheme is used within - ! boundary cells rather than PCM. 
E.g., if PPM is used for remapping, a PPM - ! reconstruction will also be used within boundary cells. -ALE_COORDINATE_CONFIG = "HYBRID:hycom1_25.nc,sigma2,FNC1:5,4000,4.5,.01" ! default = "UNIFORM" - ! Determines how to specify the coordinate - ! resolution. Valid options are: - ! PARAM - use the vector-parameter ALE_RESOLUTION - ! UNIFORM[:N] - uniformly distributed - ! FILE:string - read from a file. The string specifies - ! the filename and variable name, separated - ! by a comma or space, e.g. FILE:lev.nc,dz - ! or FILE:lev.nc,interfaces=zw - ! WOA09[:N] - the WOA09 vertical grid (approximately) - ! FNC1:string - FNC1:dz_min,H_total,power,precision - ! HYBRID:string - read from a file. The string specifies - ! the filename and two variable names, separated - ! by a comma or space, for sigma-2 and dz. e.g. - ! HYBRID:vgrid.nc,sigma2,dz -!ALE_RESOLUTION = 2*5.0, 5.01, 5.07, 5.25, 5.68, 6.55, 8.1, 10.66, 14.620000000000001, 20.450000000000003, 28.73, 40.1, 55.32, 75.23, 100.8, 133.09, 173.26, 222.62, 282.56, 354.62, 440.47, 541.87, 660.76, 799.1800000000001 ! [m] - ! The distribution of vertical resolution for the target - ! grid used for Eulerian-like coordinates. For example, - ! in z-coordinate mode, the parameter is a list of level - ! thicknesses (in m). In sigma-coordinate mode, the list - ! is of non-dimensional fractions of the water column. -!TARGET_DENSITIES = 1010.0, 1020.843017578125, 1027.0274658203125, 1029.279541015625, 1030.862548828125, 1032.1572265625, 1033.27978515625, 1034.251953125, 1034.850830078125, 1035.28857421875, 1035.651123046875, 1035.967529296875, 1036.2410888671875, 1036.473876953125, 1036.6800537109375, 1036.8525390625, 1036.9417724609375, 1037.0052490234375, 1037.057373046875, 1037.1065673828125, 1037.15576171875, 1037.2060546875, 1037.26416015625, 1037.3388671875, 1037.4749755859375, 1038.0 ! [m] - ! HYBRID target densities for interfaces -REGRID_COMPRESSIBILITY_FRACTION = 0.01 ! [not defined] default = 0.0 - ! 
When interpolating potential density profiles we can add - ! some artificial compressibility solely to make homogenous - ! regions appear stratified. -MAXIMUM_INT_DEPTH_CONFIG = "FNC1:5,8000.0,1.0,.125" ! default = "NONE" - ! Determines how to specify the maximum interface depths. - ! Valid options are: - ! NONE - there are no maximum interface depths - ! PARAM - use the vector-parameter MAXIMUM_INTERFACE_DEPTHS - ! FILE:string - read from a file. The string specifies - ! the filename and variable name, separated - ! by a comma or space, e.g. FILE:lev.nc,Z - ! FNC1:string - FNC1:dz_min,H_total,power,precision -!MAXIMUM_INT_DEPTHS = 0.0, 5.0, 36.25, 93.75, 177.5, 287.5, 423.75, 586.25, 775.0, 990.0, 1231.25, 1498.75, 1792.5, 2112.5, 2458.75, 2831.25, 3230.0, 3655.0, 4106.25, 4583.75, 5087.5, 5617.5, 6173.75, 6756.25, 7365.0, 8000.0 ! [m] - ! The list of maximum depths for each interface. -MAX_LAYER_THICKNESS_CONFIG = "FNC1:400,31000.0,0.1,.01" ! default = "NONE" - ! Determines how to specify the maximum layer thicknesses. - ! Valid options are: - ! NONE - there are no maximum layer thicknesses - ! PARAM - use the vector-parameter MAX_LAYER_THICKNESS - ! FILE:string - read from a file. The string specifies - ! the filename and variable name, separated - ! by a comma or space, e.g. FILE:lev.nc,Z - ! FNC1:string - FNC1:dz_min,H_total,power,precision -!MAX_LAYER_THICKNESS = 400.0, 1094.2, 1144.02, 1174.81, 1197.42, 1215.4099999999999, 1230.42, 1243.3200000000002, 1254.65, 1264.78, 1273.94, 1282.31, 1290.02, 1297.17, 1303.85, 1310.1, 1316.0, 1321.5700000000002, 1326.85, 1331.87, 1336.67, 1341.25, 1345.6399999999999, 1349.85, 1353.88 ! [m] - ! The list of maximum thickness for each layer. -REMAPPING_SCHEME = "PPM_H4" ! default = "PLM" - ! This sets the reconstruction scheme used for vertical remapping for all - ! variables. It can be one of the following schemes: PCM (1st-order - ! accurate) - ! PLM (2nd-order accurate) - ! PPM_H4 (3rd-order accurate) - ! 
PPM_IH4 (3rd-order accurate) - ! PQM_IH4IH3 (4th-order accurate) - ! PQM_IH6IH5 (5th-order accurate) - -! === module MOM_grid === -! Parameters providing information about the lateral grid. - -! === module MOM_state_initialization === -INIT_LAYERS_FROM_Z_FILE = True ! [Boolean] default = False - ! If true, initialize the layer thicknesses, temperatures, and salinities from a - ! Z-space file on a latitude-longitude grid. - -! === module MOM_initialize_layers_from_Z === -TEMP_SALT_Z_INIT_FILE = "" ! default = "temp_salt_z.nc" - ! The name of the z-space input file used to initialize - ! temperatures (T) and salinities (S). If T and S are not - ! in the same file, TEMP_Z_INIT_FILE and SALT_Z_INIT_FILE - ! must be set. -TEMP_Z_INIT_FILE = "woa18_decav_t00_01.nc" ! default = "" - ! The name of the z-space input file used to initialize - ! temperatures, only. -SALT_Z_INIT_FILE = "woa18_decav_s00_01.nc" ! default = "" - ! The name of the z-space input file used to initialize - ! salinities, only. -Z_INIT_FILE_PTEMP_VAR = "t_an" ! default = "ptemp" - ! The name of the potential temperature variable in - ! TEMP_Z_INIT_FILE. -Z_INIT_FILE_SALT_VAR = "s_an" ! default = "salt" - ! The name of the salinity variable in - ! SALT_Z_INIT_FILE. -Z_INIT_ALE_REMAPPING = True ! [Boolean] default = False - ! If True, then remap straight to model coordinate from file. - -! === module MOM_diag_mediator === -NUM_DIAG_COORDS = 1 - ! The number of diagnostic vertical coordinates to use. - ! For each coordinate, an entry in DIAG_COORDS must be provided. -DIAG_COORDS = "z Z ZSTAR" - ! A list of string tuples associating diag_table modules to - ! a coordinate definition used for diagnostics. Each string - ! is of the form "MODULE_SUFFIX,PARAMETER_SUFFIX,COORDINATE_NAME". -DIAG_COORD_DEF_Z="FILE:@[MOM6_DIAG_COORD_DEF_Z_FILE],interfaces=zw" -DIAG_MISVAL = @[MOM6_DIAG_MISVAL] - -! === module MOM_MEKE === -USE_MEKE = True ! [Boolean] default = False - ! 
If true, turns on the MEKE scheme which calculates a sub-grid mesoscale eddy - ! kinetic energy budget. - -! === module MOM_lateral_mixing_coeffs === -USE_VARIABLE_MIXING = True ! [Boolean] default = False - ! If true, the variable mixing code will be called. This allows diagnostics to - ! be created even if the scheme is not used. If KHTR_SLOPE_CFF>0 or - ! KhTh_Slope_Cff>0, this is set to true regardless of what is in the parameter - ! file. -! === module MOM_set_visc === -CHANNEL_DRAG = True ! [Boolean] default = False - ! If true, the bottom drag is exerted directly on each layer proportional to the - ! fraction of the bottom it overlies. -HBBL = 10.0 ! [m] - ! The thickness of a bottom boundary layer with a viscosity of KVBBL if - ! BOTTOMDRAGLAW is not defined, or the thickness over which near-bottom - ! velocities are averaged for the drag law if BOTTOMDRAGLAW is defined but - ! LINEAR_DRAG is not. -KV = 1.0E-04 ! [m2 s-1] - ! The background kinematic viscosity in the interior. The molecular value, ~1e-6 - ! m2 s-1, may be used. - -! === module MOM_continuity === - -! === module MOM_continuity_PPM === - -! === module MOM_CoriolisAdv === -CORIOLIS_SCHEME = "SADOURNY75_ENSTRO" ! default = "SADOURNY75_ENERGY" - ! CORIOLIS_SCHEME selects the discretization for the Coriolis terms. Valid - ! values are: - ! SADOURNY75_ENERGY - Sadourny, 1975; energy cons. - ! ARAKAWA_HSU90 - Arakawa & Hsu, 1990 - ! SADOURNY75_ENSTRO - Sadourny, 1975; enstrophy cons. - ! ARAKAWA_LAMB81 - Arakawa & Lamb, 1981; En. + Enst. - ! ARAKAWA_LAMB_BLEND - A blend of Arakawa & Lamb with - ! Arakawa & Hsu and Sadourny energy -BOUND_CORIOLIS = True ! [Boolean] default = False - ! If true, the Coriolis terms at u-points are bounded by the four estimates of - ! (f+rv)v from the four neighboring v-points, and similarly at v-points. This - ! option would have no effect on the SADOURNY Coriolis scheme if it were - ! possible to use centered difference thickness fluxes. - -! 
=== module MOM_PressureForce === - -! === module MOM_PressureForce_AFV === -MASS_WEIGHT_IN_PRESSURE_GRADIENT = True ! [Boolean] default = False - ! If true, use mass weighting when interpolating T/S for integrals near the - ! bathymetry in AFV pressure gradient calculations. - -! === module MOM_hor_visc === -LAPLACIAN = True ! [Boolean] default = False - ! If true, use a Laplacian horizontal viscosity. -KH_VEL_SCALE = 0.01 ! [m s-1] default = 0.0 - ! The velocity scale which is multiplied by the grid spacing to calculate the - ! Laplacian viscosity. The final viscosity is the largest of this scaled - ! viscosity, the Smagorinsky and Leith viscosities, and KH. -KH_SIN_LAT = 2000.0 ! [m2 s-1] default = 0.0 - ! The amplitude of a latitudinally-dependent background viscosity of the form - ! KH_SIN_LAT*(SIN(LAT)**KH_PWR_OF_SINE). -SMAGORINSKY_KH = True ! [Boolean] default = False - ! If true, use a Smagorinsky nonlinear eddy viscosity. -SMAG_LAP_CONST = 0.15 ! [nondim] default = 0.0 - ! The nondimensional Laplacian Smagorinsky constant, often 0.15. -AH_VEL_SCALE = 0.01 ! [m s-1] default = 0.0 - ! The velocity scale which is multiplied by the cube of the grid spacing to - ! calculate the biharmonic viscosity. The final viscosity is the largest of this - ! scaled viscosity, the Smagorinsky and Leith viscosities, and AH. -SMAGORINSKY_AH = True ! [Boolean] default = False - ! If true, use a biharmonic Smagorinsky nonlinear eddy viscosity. -SMAG_BI_CONST = 0.06 ! [nondim] default = 0.0 - ! The nondimensional biharmonic Smagorinsky constant, typically 0.015 - 0.06. -USE_LAND_MASK_FOR_HVISC = True ! [Boolean] default = False - ! If true, use the land mask for the computation of thicknesses at velocity - ! locations. This eliminates the dependence on arbitrary values over land or - ! outside of the domain. - -! === module MOM_vert_friction === -HMIX_FIXED = 0.5 ! [m] - ! The prescribed depth over which the near-surface viscosity and diffusivity are - ! 
elevated when the bulk mixed layer is not used. -KVML = 1.0E-04 ! [m2 s-1] default = 1.0E-04 - ! The kinematic viscosity in the mixed layer. A typical value is ~1e-2 m2 s-1. - ! KVML is not used if BULKMIXEDLAYER is true. The default is set by KV. -MAXVEL = 6.0 ! [m s-1] default = 3.0E+08 - ! The maximum velocity allowed before the velocity components are truncated. - -! === module MOM_barotropic === -BOUND_BT_CORRECTION = True ! [Boolean] default = False - ! If true, the corrective pseudo mass-fluxes into the barotropic solver are - ! limited to values that require less than maxCFL_BT_cont to be accommodated. -BT_PROJECT_VELOCITY = True ! [Boolean] default = False - ! If true, step the barotropic velocity first and project out the velocity - ! tendency by 1+BEBT when calculating the transport. The default (false) is to - ! use a predictor continuity step to find the pressure field, and then to do a - ! corrector continuity step using a weighted average of the old and new - ! velocities, with weights of (1-BEBT) and BEBT. -DYNAMIC_SURFACE_PRESSURE = False ! [Boolean] default = False - ! If true, add a dynamic pressure due to a viscous ice shelf, for instance. -BEBT = 0.2 ! [nondim] default = 0.1 - ! BEBT determines whether the barotropic time stepping uses the forward-backward - ! time-stepping scheme or a backward Euler scheme. BEBT is valid in the range - ! from 0 (for a forward-backward treatment of nonrotating gravity waves) to 1 - ! (for a backward Euler treatment). In practice, BEBT must be greater than about - ! 0.05. -DTBT = -0.9 ! [s or nondim] default = -0.98 - ! The barotropic time step, in s. DTBT is only used with the split explicit time - ! stepping. To set the time step automatically based on the maximum stable value - ! use 0, or a negative value gives the fraction of the stable value. Setting - ! DTBT to 0 is the same as setting it to -0.98. The value of DTBT that will - ! actually be used is an integer fraction of DT, rounding down. - -! 
=== module MOM_mixed_layer_restrat === -MIXEDLAYER_RESTRAT = False ! [Boolean] default = False - ! If true, a density-gradient dependent re-stratifying flow is imposed in the - ! mixed layer. Can be used in ALE mode without restriction but in layer mode can - ! only be used if BULKMIXEDLAYER is true. -FOX_KEMPER_ML_RESTRAT_COEF = 60.0 ! [nondim] default = 0.0 - ! A nondimensional coefficient that is proportional to the ratio of the - ! deformation radius to the dominant lengthscale of the submesoscale mixed layer - ! instabilities, times the minimum of the ratio of the mesoscale eddy kinetic - ! energy to the large-scale geostrophic kinetic energy or 1 plus the square of - ! the grid spacing over the deformation radius, as detailed by Fox-Kemper et al. - ! (2010) -MLE_FRONT_LENGTH = 200.0 ! [m] default = 0.0 - ! If non-zero, is the frontal-length scale used to calculate the upscaling of - ! buoyancy gradients that is otherwise represented by the parameter - ! FOX_KEMPER_ML_RESTRAT_COEF. If MLE_FRONT_LENGTH is non-zero, it is recommended - ! to set FOX_KEMPER_ML_RESTRAT_COEF=1.0. -MLE_USE_PBL_MLD = True ! [Boolean] default = False - ! If true, the MLE parameterization will use the mixed-layer depth provided by - ! the active PBL parameterization. If false, MLE will estimate a MLD based on a - ! density difference with the surface using the parameter MLE_DENSITY_DIFF. -MLE_MLD_DECAY_TIME = 2.592E+06 ! [s] default = 0.0 - ! The time-scale for a running-mean filter applied to the mixed-layer depth used - ! in the MLE restratification parameterization. When the MLD deepens below the - ! current running-mean the running-mean is instantaneously set to the current - ! MLD. - -! === module MOM_diabatic_driver === -! The following parameters are used for diabatic processes. -ENERGETICS_SFC_PBL = True ! [Boolean] default = False - ! If true, use an implied energetics planetary boundary layer scheme to - ! determine the diffusivity and viscosity in the surface boundary layer. 
-EPBL_IS_ADDITIVE = False ! [Boolean] default = True - ! If true, the diffusivity from ePBL is added to all other diffusivities. - ! Otherwise, the larger of kappa-shear and ePBL diffusivities are used. - -! === module MOM_CVMix_KPP === -! This is the MOM wrapper to CVMix:KPP -! See http://cvmix.github.io/ - -! === module MOM_tidal_mixing === -! Vertical Tidal Mixing Parameterization - -! === module MOM_CVMix_conv === -! Parameterization of enhanced mixing due to convection via CVMix - -! === module MOM_set_diffusivity === - -! === module MOM_bkgnd_mixing === -! Adding static vertical background mixing coefficients -KD = 1.5E-05 ! [m2 s-1] default = 0.0 - ! The background diapycnal diffusivity of density in the interior. Zero or the - ! molecular value, ~1e-7 m2 s-1, may be used. -KD_MIN = 2.0E-06 ! [m2 s-1] default = 2.0E-07 - ! The minimum diapycnal diffusivity. -HENYEY_IGW_BACKGROUND = True ! [Boolean] default = False - ! If true, use a latitude-dependent scaling for the near surface background - ! diffusivity, as described in Harrison & Hallberg, JPO 2008. - -! === module MOM_kappa_shear === -! Parameterization of shear-driven turbulence following Jackson, Hallberg and Legg, JPO 2008 -USE_JACKSON_PARAM = True ! [Boolean] default = False - ! If true, use the Jackson-Hallberg-Legg (JPO 2008) shear mixing - ! parameterization. -MAX_RINO_IT = 25 ! [nondim] default = 50 - ! The maximum number of iterations that may be used to estimate the Richardson - ! number driven mixing. - -! === module MOM_CVMix_shear === -! Parameterization of shear-driven turbulence via CVMix (various options) - -! === module MOM_CVMix_ddiff === -! Parameterization of mixing due to double diffusion processes via CVMix - -! === module MOM_diabatic_aux === -! The following parameters are used for auxiliary diabatic processes. - -! === module MOM_energetic_PBL === -EPBL_USTAR_MIN = 1.45842E-18 ! [m s-1] - ! The (tiny) minimum friction velocity used within the ePBL code, derived from - ! 
OMEGA and ANGSTROM. -USE_LA_LI2016 = @[MOM6_USE_LI2016] ! [nondim] default = False - ! A logical to use the Li et al. 2016 (submitted) formula to determine the - ! Langmuir number. -USE_WAVES = @[MOM6_USE_WAVES] ! [Boolean] default = False - ! If true, enables surface wave modules. -WAVE_METHOD = "SURFACE_BANDS" ! default = "EMPTY" - ! Choice of wave method, valid options include: - ! TEST_PROFILE - Prescribed from surface Stokes drift - ! and a decay wavelength. - ! SURFACE_BANDS - Computed from multiple surface values - ! and decay wavelengths. - ! DHH85 - Uses Donelan et al. 1985 empirical - ! wave spectrum with prescribed values. - ! LF17 - Infers Stokes drift profile from wind - ! speed following Li and Fox-Kemper 2017. -SURFBAND_SOURCE = "COUPLER" ! default = "EMPTY" - ! Choice of SURFACE_BANDS data mode, valid options include: - ! DATAOVERRIDE - Read from NetCDF using FMS DataOverride. - ! COUPLER - Look for variables from coupler pass - ! INPUT - Testing with fixed values. -STK_BAND_COUPLER = 3 ! default = 1 - ! STK_BAND_COUPLER is the number of Stokes drift bands in the coupler. This has - ! to be consistent with the number of Stokes drift bands in WW3, or the model - ! will fail. -SURFBAND_WAVENUMBERS = 0.04, 0.11, 0.3305 ! [rad/m] default = 0.12566 - ! Central wavenumbers for surface Stokes drift bands. -EPBL_LANGMUIR_SCHEME = "ADDITIVE" ! default = "NONE" - ! EPBL_LANGMUIR_SCHEME selects the method for including Langmuir turbulence. - ! Valid values are: - ! NONE - Do not do any extra mixing due to Langmuir turbulence - ! RESCALE - Use a multiplicative rescaling of mstar to account for Langmuir - ! turbulence - ! ADDITIVE - Add a Langmuir turbulence contribution to mstar to other - ! contributions -LT_ENHANCE_COEF = 0.044 ! [nondim] default = 0.447 - ! Coefficient for Langmuir enhancement of mstar -LT_ENHANCE_EXP = -1.5 ! [nondim] default = -1.33 - ! Exponent for Langmuir enhancement of mstar -LT_MOD_LAC1 = 0.0 ! [nondim] default = -0.87 - ! 
Coefficient for modification of Langmuir number due to MLD approaching Ekman - ! depth. -LT_MOD_LAC4 = 0.0 ! [nondim] default = 0.95 - ! Coefficient for modification of Langmuir number due to ratio of Ekman to - ! stable Obukhov depth. -LT_MOD_LAC5 = 0.22 ! [nondim] default = 0.95 - ! Coefficient for modification of Langmuir number due to ratio of Ekman to - ! unstable Obukhov depth. - -! === module MOM_regularize_layers === - -! === module MOM_opacity === - -! === module MOM_tracer_advect === -TRACER_ADVECTION_SCHEME = "PPM:H3" ! default = "PLM" - ! The horizontal transport scheme for tracers: - ! PLM - Piecewise Linear Method - ! PPM:H3 - Piecewise Parabolic Method (Huynh 3rd order) - ! PPM - Piecewise Parabolic Method (Colella-Woodward) - -! === module MOM_tracer_hor_diff === -KHTR = 50.0 ! [m2 s-1] default = 0.0 - ! The background along-isopycnal tracer diffusivity. -CHECK_DIFFUSIVE_CFL = True ! [Boolean] default = False - ! If true, use enough iterations of the diffusion to ensure that the diffusive - ! equivalent of the CFL limit is not violated. If false, always use the greater - ! of 1 or MAX_TR_DIFFUSION_CFL iterations. -MAX_TR_DIFFUSION_CFL = 2.0 ! [nondim] default = -1.0 - ! If positive, locally limit the along-isopycnal tracer diffusivity to keep the - ! diffusive CFL locally at or below this value. The number of diffusive - ! iterations is often this value or the next greater integer. - -! === module MOM_neutral_diffusion === -! This module implements neutral diffusion of tracers -USE_NEUTRAL_DIFFUSION = True ! [Boolean] default = False - ! If true, enables the neutral diffusion module. - -! === module MOM_sum_output === -MAXTRUNC = 1000 ! [truncations save_interval-1] default = 0 - ! The run will be stopped, and the day set to a very large value if the velocity - ! is truncated more than MAXTRUNC times between energy saves. Set MAXTRUNC to 0 - ! to stop if there is any truncation of velocities. - -! === module ocean_model_init === - -! 
=== module MOM_oda_incupd === -ODA_INCUPD = @[ODA_INCUPD] ! [Boolean] default = False - ! If true, oda incremental updates will be applied - ! everywhere in the domain. -ODA_INCUPD_FILE = "mom6_increment.nc" ! The name of the file with the T,S,h increments. - -ODA_TEMPINC_VAR = "Temp" ! default = "ptemp_inc" - ! The name of the potential temperature inc. variable in - ! ODA_INCUPD_FILE. -ODA_SALTINC_VAR = "Salt" ! default = "sal_inc" - ! The name of the salinity inc. variable in - ! ODA_INCUPD_FILE. -ODA_THK_VAR = "h" ! default = "h" - ! The name of the int. depth inc. variable in - ! ODA_INCUPD_FILE. -ODA_INCUPD_UV = true ! -ODA_UINC_VAR = "u" ! default = "u_inc" - ! The name of the zonal vel. inc. variable in - ! ODA_INCUPD_UV_FILE. -ODA_VINC_VAR = "v" ! default = "v_inc" - ! The name of the meridional vel. inc. variable in - ! ODA_INCUPD_UV_FILE. -ODA_INCUPD_NHOURS = @[ODA_INCUPD_NHOURS] ! default=3.0 - -! === module MOM_surface_forcing === -OCEAN_SURFACE_STAGGER = "A" ! default = "C" - ! A case-insensitive character string to indicate the - ! staggering of the surface velocity field that is - ! returned to the coupler. Valid values include - ! 'A', 'B', or 'C'. - -MAX_P_SURF = 0.0 ! [Pa] default = -1.0 - ! The maximum surface pressure that can be exerted by the atmosphere and - ! floating sea-ice or ice shelves. This is needed because the FMS coupling - ! structure does not limit the water that can be frozen out of the ocean and the - ! ice-ocean heat fluxes are treated explicitly. No limit is applied if a - ! negative value is used. -WIND_STAGGER = "A" ! default = "C" - ! A case-insensitive character string to indicate the - ! staggering of the input wind stress field. Valid - ! values are 'A', 'B', or 'C'. -! === module MOM_restart === - -! 
=== module MOM_file_parser === diff --git a/parm/ufs/ufs.configure.atm.IN b/parm/ufs/ufs.configure.atm.IN deleted file mode 100644 index 3457d8cf53..0000000000 --- a/parm/ufs/ufs.configure.atm.IN +++ /dev/null @@ -1,22 +0,0 @@ -# ESMF # -logKindFlag: @[esmf_logkind] -globalResourceControl: true - -EARTH_component_list: ATM -EARTH_attributes:: - Verbosity = 0 -:: - -# ATM # -ATM_model: @[atm_model] -ATM_petlist_bounds: @[atm_petlist_bounds] -ATM_omp_num_threads: @[atm_omp_num_threads] -ATM_attributes:: - Verbosity = 0 - Diagnostic = 0 -:: - -# Run Sequence # -runSeq:: - ATM -:: diff --git a/parm/ufs/ufs.configure.atm_aero.IN b/parm/ufs/ufs.configure.atm_aero.IN deleted file mode 100644 index 629cc156ce..0000000000 --- a/parm/ufs/ufs.configure.atm_aero.IN +++ /dev/null @@ -1,40 +0,0 @@ -############################################# -#### UFS Run-Time Configuration File ##### -############################################# - -# ESMF # -logKindFlag: @[esmf_logkind] -globalResourceControl: true - -# EARTH # -EARTH_component_list: ATM CHM -EARTH_attributes:: - Verbosity = 0 -:: - -# ATM # -ATM_model: @[atm_model] -ATM_petlist_bounds: @[atm_petlist_bounds] -ATM_omp_num_threads: @[atm_omp_num_threads] -ATM_attributes:: - Verbosity = 0 -:: - -# CHM # -CHM_model: @[chm_model] -CHM_petlist_bounds: @[chm_petlist_bounds] -CHM_omp_num_threads: @[chm_omp_num_threads] -CHM_attributes:: - Verbosity = 0 -:: - -# Run Sequence # -runSeq:: - @@[coupling_interval_fast_sec] - ATM phase1 - ATM -> CHM - CHM - CHM -> ATM - ATM phase2 - @ -:: diff --git a/parm/ufs/ufs.configure.blocked_atm_wav.IN b/parm/ufs/ufs.configure.blocked_atm_wav.IN deleted file mode 100644 index b68aa2e735..0000000000 --- a/parm/ufs/ufs.configure.blocked_atm_wav.IN +++ /dev/null @@ -1,41 +0,0 @@ -############################################# -#### UFS Run-Time Configuration File ##### -############################################# - -# ESMF # -logKindFlag: @[esmf_logkind] -globalResourceControl: true - -# EARTH # 
-EARTH_component_list: ATM WAV -EARTH_attributes:: - Verbosity = max -:: - -# ATM # -ATM_model: @[atm_model] -ATM_petlist_bounds: @[atm_petlist_bounds] -ATM_omp_num_threads: @[atm_omp_num_threads] -ATM_attributes:: - Verbosity = max - DumpFields = true -:: - -# WAV # -WAV_model: @[wav_model] -WAV_petlist_bounds: @[wav_petlist_bounds] -WAV_omp_num_threads: @[wav_omp_num_threads] -WAV_attributes:: - Verbosity = max -:: - - - -# Run Sequence # -runSeq:: - @@[coupling_interval_sec] - ATM -> WAV - ATM - WAV - @ -:: diff --git a/parm/ufs/ufs.configure.cpld.IN b/parm/ufs/ufs.configure.cpld.IN deleted file mode 100644 index e473fb2a03..0000000000 --- a/parm/ufs/ufs.configure.cpld.IN +++ /dev/null @@ -1,122 +0,0 @@ -############################################# -#### UFS Run-Time Configuration File ##### -############################################# - -# ESMF # -logKindFlag: @[esmf_logkind] -globalResourceControl: true - -# EARTH # -EARTH_component_list: MED ATM OCN ICE -EARTH_attributes:: - Verbosity = 0 -:: - -# MED # -MED_model: @[med_model] -MED_petlist_bounds: @[med_petlist_bounds] -MED_omp_num_threads: @[med_omp_num_threads] -:: - -# ATM # -ATM_model: @[atm_model] -ATM_petlist_bounds: @[atm_petlist_bounds] -ATM_omp_num_threads: @[atm_omp_num_threads] -ATM_attributes:: - Verbosity = 0 - DumpFields = @[DumpFields] - ProfileMemory = false - OverwriteSlice = true -:: - -# OCN # -OCN_model: @[ocn_model] -OCN_petlist_bounds: @[ocn_petlist_bounds] -OCN_omp_num_threads: @[ocn_omp_num_threads] -OCN_attributes:: - Verbosity = 0 - DumpFields = @[DumpFields] - ProfileMemory = false - OverwriteSlice = true - mesh_ocn = @[MESH_OCN_ICE] - use_coldstart = @[use_coldstart] - use_mommesh = @[use_mommesh] -:: - -# ICE # -ICE_model: @[ice_model] -ICE_petlist_bounds: @[ice_petlist_bounds] -ICE_omp_num_threads: @[ice_omp_num_threads] -ICE_attributes:: - Verbosity = 0 - DumpFields = @[DumpFields] - ProfileMemory = false - OverwriteSlice = true - mesh_ice = @[MESH_OCN_ICE] - eps_imesh = 
@[eps_imesh] - stop_n = @[RESTART_N] - stop_option = nhours - stop_ymd = -999 -:: - -# CMEPS warm run sequence -runSeq:: -@@[coupling_interval_slow_sec] - MED med_phases_prep_ocn_avg - MED -> OCN :remapMethod=redist - OCN - @@[coupling_interval_fast_sec] - MED med_phases_prep_atm - MED med_phases_prep_ice - MED -> ATM :remapMethod=redist - MED -> ICE :remapMethod=redist - ATM - ICE - ATM -> MED :remapMethod=redist - MED med_phases_post_atm - ICE -> MED :remapMethod=redist - MED med_phases_post_ice - MED med_phases_ocnalb_run - MED med_phases_prep_ocn_accum - @ - OCN -> MED :remapMethod=redist - MED med_phases_post_ocn - MED med_phases_restart_write -@ -:: - -# CMEPS variables - -DRIVER_attributes:: -:: -MED_attributes:: - ATM_model = @[atm_model] - ICE_model = @[ice_model] - OCN_model = @[ocn_model] - coupling_mode = @[CPLMODE] - history_tile_atm = @[ATMTILESIZE] - pio_rearranger = box - ocean_albedo_limit = @[ocean_albedo_limit] -:: -ALLCOMP_attributes:: - ScalarFieldCount = 2 - ScalarFieldIdxGridNX = 1 - ScalarFieldIdxGridNY = 2 - ScalarFieldName = cpl_scalars - start_type = @[RUNTYPE] - restart_dir = RESTART/ - case_name = ufs.cpld - restart_n = @[RESTART_N] - restart_option = nhours - restart_ymd = -999 - dbug_flag = @[cap_dbug_flag] - stop_n = @[FHMAX] - stop_option = nhours - stop_ymd = -999 - orb_eccen = 1.e36 - orb_iyear = 2000 - orb_iyear_align = 2000 - orb_mode = fixed_year - orb_mvelp = 1.e36 - orb_obliq = 1.e36 -:: diff --git a/parm/ufs/ufs.configure.cpld_aero.IN b/parm/ufs/ufs.configure.cpld_aero.IN deleted file mode 100644 index d90d377006..0000000000 --- a/parm/ufs/ufs.configure.cpld_aero.IN +++ /dev/null @@ -1,134 +0,0 @@ -############################################# -#### UFS Run-Time Configuration File ##### -############################################# - -# ESMF # -logKindFlag: @[esmf_logkind] -globalResourceControl: true - -# EARTH # -EARTH_component_list: MED ATM CHM OCN ICE -EARTH_attributes:: - Verbosity = 0 -:: - -# MED # -MED_model: 
@[med_model] -MED_petlist_bounds: @[med_petlist_bounds] -MED_omp_num_threads: @[med_omp_num_threads] -:: - -# ATM # -ATM_model: @[atm_model] -ATM_petlist_bounds: @[atm_petlist_bounds] -ATM_omp_num_threads: @[atm_omp_num_threads] -ATM_attributes:: - Verbosity = 0 - DumpFields = @[DumpFields] - ProfileMemory = false - OverwriteSlice = true -:: - -# CHM # -CHM_model: @[chm_model] -CHM_petlist_bounds: @[chm_petlist_bounds] -CHM_omp_num_threads: @[chm_omp_num_threads] -CHM_attributes:: - Verbosity = 0 -:: - -# OCN # -OCN_model: @[ocn_model] -OCN_petlist_bounds: @[ocn_petlist_bounds] -OCN_omp_num_threads: @[ocn_omp_num_threads] -OCN_attributes:: - Verbosity = 0 - DumpFields = @[DumpFields] - ProfileMemory = false - OverwriteSlice = true - mesh_ocn = @[MESH_OCN_ICE] - use_coldstart = @[use_coldstart] - use_mommesh = @[use_mommesh] -:: - -# ICE # -ICE_model: @[ice_model] -ICE_petlist_bounds: @[ice_petlist_bounds] -ICE_omp_num_threads: @[ice_omp_num_threads] -ICE_attributes:: - Verbosity = 0 - DumpFields = @[DumpFields] - ProfileMemory = false - OverwriteSlice = true - mesh_ice = @[MESH_OCN_ICE] - eps_imesh = @[eps_imesh] - stop_n = @[RESTART_N] - stop_option = nhours - stop_ymd = -999 -:: - -# CMEPS warm run sequence -runSeq:: -@@[coupling_interval_slow_sec] - MED med_phases_prep_ocn_avg - MED -> OCN :remapMethod=redist - OCN - @@[coupling_interval_fast_sec] - MED med_phases_prep_atm - MED med_phases_prep_ice - MED -> ATM :remapMethod=redist - MED -> ICE :remapMethod=redist - ATM phase1 - ATM -> CHM - CHM - CHM -> ATM - ATM phase2 - ICE - ATM -> MED :remapMethod=redist - MED med_phases_post_atm - ICE -> MED :remapMethod=redist - MED med_phases_post_ice - MED med_phases_ocnalb_run - MED med_phases_prep_ocn_accum - @ - OCN -> MED :remapMethod=redist - MED med_phases_post_ocn - MED med_phases_restart_write -@ -:: - -# CMEPS variables - -DRIVER_attributes:: -:: -MED_attributes:: - ATM_model = @[atm_model] - ICE_model = @[ice_model] - OCN_model = @[ocn_model] - coupling_mode = 
@[CPLMODE] - history_tile_atm = @[ATMTILESIZE] - pio_rearranger = box - ocean_albedo_limit = @[ocean_albedo_limit] -:: -ALLCOMP_attributes:: - ScalarFieldCount = 2 - ScalarFieldIdxGridNX = 1 - ScalarFieldIdxGridNY = 2 - ScalarFieldName = cpl_scalars - start_type = @[RUNTYPE] - restart_dir = RESTART/ - case_name = ufs.cpld - restart_n = @[RESTART_N] - restart_option = nhours - restart_ymd = -999 - dbug_flag = @[cap_dbug_flag] - stop_n = @[FHMAX] - stop_option = nhours - stop_ymd = -999 - orb_eccen = 1.e36 - orb_iyear = 2000 - orb_iyear_align = 2000 - orb_mode = fixed_year - orb_mvelp = 1.e36 - orb_obliq = 1.e36 -:: diff --git a/parm/ufs/ufs.configure.cpld_aero_outerwave.IN b/parm/ufs/ufs.configure.cpld_aero_outerwave.IN deleted file mode 100644 index 23e7751112..0000000000 --- a/parm/ufs/ufs.configure.cpld_aero_outerwave.IN +++ /dev/null @@ -1,151 +0,0 @@ -############################################# -#### UFS Run-Time Configuration File ##### -############################################# - -# ESMF # -logKindFlag: @[esmf_logkind] -globalResourceControl: true - -# EARTH # -EARTH_component_list: MED ATM CHM OCN ICE WAV -EARTH_attributes:: - Verbosity = 0 -:: - -# MED # -MED_model: @[med_model] -MED_petlist_bounds: @[med_petlist_bounds] -MED_omp_num_threads: @[med_omp_num_threads] -:: - -# ATM # -ATM_model: @[atm_model] -ATM_petlist_bounds: @[atm_petlist_bounds] -ATM_omp_num_threads: @[atm_omp_num_threads] -ATM_attributes:: - Verbosity = 0 - DumpFields = @[DumpFields] - ProfileMemory = false - OverwriteSlice = true -:: - -# CHM # -CHM_model: @[chm_model] -CHM_petlist_bounds: @[chm_petlist_bounds] -CHM_omp_num_threads: @[chm_omp_num_threads] -CHM_attributes:: - Verbosity = 0 -:: - -# OCN # -OCN_model: @[ocn_model] -OCN_petlist_bounds: @[ocn_petlist_bounds] -OCN_omp_num_threads: @[ocn_omp_num_threads] -OCN_attributes:: - Verbosity = 0 - DumpFields = @[DumpFields] - ProfileMemory = false - OverwriteSlice = true - mesh_ocn = @[MESH_OCN_ICE] - use_coldstart = 
@[use_coldstart] - use_mommesh = @[use_mommesh] -:: - -# ICE # -ICE_model: @[ice_model] -ICE_petlist_bounds: @[ice_petlist_bounds] -ICE_omp_num_threads: @[ice_omp_num_threads] -ICE_attributes:: - Verbosity = 0 - DumpFields = @[DumpFields] - ProfileMemory = false - OverwriteSlice = true - mesh_ice = @[MESH_OCN_ICE] - eps_imesh = @[eps_imesh] - stop_n = @[RESTART_N] - stop_option = nhours - stop_ymd = -999 -:: - -# WAV # -WAV_model: @[wav_model] -WAV_petlist_bounds: @[wav_petlist_bounds] -WAV_omp_num_threads: @[wav_omp_num_threads] -WAV_attributes:: - Verbosity = 0 - OverwriteSlice = false - mesh_wav = @[MESH_WAV] -:: - -# CMEPS warm run sequence -runSeq:: -@@[coupling_interval_slow_sec] - MED med_phases_prep_wav_avg - MED med_phases_prep_ocn_avg - MED -> WAV :remapMethod=redist - MED -> OCN :remapMethod=redist - WAV - OCN - @@[coupling_interval_fast_sec] - MED med_phases_prep_atm - MED med_phases_prep_ice - MED -> ATM :remapMethod=redist - MED -> ICE :remapMethod=redist - ATM phase1 - ATM -> CHM - CHM - CHM -> ATM - ATM phase2 - ICE - ATM -> MED :remapMethod=redist - MED med_phases_post_atm - ICE -> MED :remapMethod=redist - MED med_phases_post_ice - MED med_phases_ocnalb_run - MED med_phases_prep_ocn_accum - MED med_phases_prep_wav_accum - @ - OCN -> MED :remapMethod=redist - WAV -> MED :remapMethod=redist - MED med_phases_post_ocn - MED med_phases_post_wav - MED med_phases_restart_write -@ -:: - -# CMEPS variables - -DRIVER_attributes:: -:: -MED_attributes:: - ATM_model = @[atm_model] - ICE_model = @[ice_model] - OCN_model = @[ocn_model] - WAV_model = @[wav_model] - coupling_mode = @[CPLMODE] - history_tile_atm = @[ATMTILESIZE] - pio_rearranger = box - ocean_albedo_limit = @[ocean_albedo_limit] -:: -ALLCOMP_attributes:: - ScalarFieldCount = 2 - ScalarFieldIdxGridNX = 1 - ScalarFieldIdxGridNY = 2 - ScalarFieldName = cpl_scalars - start_type = @[RUNTYPE] - restart_dir = RESTART/ - case_name = ufs.cpld - restart_n = @[RESTART_N] - restart_option = nhours - 
restart_ymd = -999 - dbug_flag = @[cap_dbug_flag] - stop_n = @[FHMAX] - stop_option = nhours - stop_ymd = -999 - orb_eccen = 1.e36 - orb_iyear = 2000 - orb_iyear_align = 2000 - orb_mode = fixed_year - orb_mvelp = 1.e36 - orb_obliq = 1.e36 -:: diff --git a/parm/ufs/ufs.configure.cpld_aero_wave.IN b/parm/ufs/ufs.configure.cpld_aero_wave.IN deleted file mode 100644 index ab0f6a9f8d..0000000000 --- a/parm/ufs/ufs.configure.cpld_aero_wave.IN +++ /dev/null @@ -1,151 +0,0 @@ -############################################# -#### UFS Run-Time Configuration File ##### -############################################# - -# ESMF # -logKindFlag: @[esmf_logkind] -globalResourceControl: true - -# EARTH # -EARTH_component_list: MED ATM CHM OCN ICE WAV -EARTH_attributes:: - Verbosity = 0 -:: - -# MED # -MED_model: @[med_model] -MED_petlist_bounds: @[med_petlist_bounds] -MED_omp_num_threads: @[med_omp_num_threads] -:: - -# ATM # -ATM_model: @[atm_model] -ATM_petlist_bounds: @[atm_petlist_bounds] -ATM_omp_num_threads: @[atm_omp_num_threads] -ATM_attributes:: - Verbosity = 0 - DumpFields = @[DumpFields] - ProfileMemory = false - OverwriteSlice = true -:: - -# CHM # -CHM_model: @[chm_model] -CHM_petlist_bounds: @[chm_petlist_bounds] -CHM_omp_num_threads: @[chm_omp_num_threads] -CHM_attributes:: - Verbosity = 0 -:: - -# OCN # -OCN_model: @[ocn_model] -OCN_petlist_bounds: @[ocn_petlist_bounds] -OCN_omp_num_threads: @[ocn_omp_num_threads] -OCN_attributes:: - Verbosity = 0 - DumpFields = @[DumpFields] - ProfileMemory = false - OverwriteSlice = true - mesh_ocn = @[MESH_OCN_ICE] - use_coldstart = @[use_coldstart] - use_mommesh = @[use_mommesh] -:: - -# ICE # -ICE_model: @[ice_model] -ICE_petlist_bounds: @[ice_petlist_bounds] -ICE_omp_num_threads: @[ice_omp_num_threads] -ICE_attributes:: - Verbosity = 0 - DumpFields = @[DumpFields] - ProfileMemory = false - OverwriteSlice = true - mesh_ice = @[MESH_OCN_ICE] - eps_imesh = @[eps_imesh] - stop_n = @[RESTART_N] - stop_option = nhours - stop_ymd = 
-999 -:: - -# WAV # -WAV_model: @[wav_model] -WAV_petlist_bounds: @[wav_petlist_bounds] -WAV_omp_num_threads: @[wav_omp_num_threads] -WAV_attributes:: - Verbosity = 0 - OverwriteSlice = false - mesh_wav = @[MESH_WAV] -:: - -# CMEPS warm run sequence -runSeq:: -@@[coupling_interval_slow_sec] - MED med_phases_prep_ocn_avg - MED -> OCN :remapMethod=redist - OCN - @@[coupling_interval_fast_sec] - MED med_phases_prep_atm - MED med_phases_prep_ice - MED med_phases_prep_wav_accum - MED med_phases_prep_wav_avg - MED -> ATM :remapMethod=redist - MED -> ICE :remapMethod=redist - MED -> WAV :remapMethod=redist - ATM phase1 - ATM -> CHM - CHM - CHM -> ATM - ATM phase2 - ICE - WAV - ATM -> MED :remapMethod=redist - MED med_phases_post_atm - ICE -> MED :remapMethod=redist - MED med_phases_post_ice - WAV -> MED :remapMethod=redist - MED med_phases_post_wav - MED med_phases_ocnalb_run - MED med_phases_prep_ocn_accum - @ - OCN -> MED :remapMethod=redist - MED med_phases_post_ocn - MED med_phases_restart_write -@ -:: - -# CMEPS variables - -DRIVER_attributes:: -:: -MED_attributes:: - ATM_model = @[atm_model] - ICE_model = @[ice_model] - OCN_model = @[ocn_model] - WAV_model = @[wav_model] - coupling_mode = @[CPLMODE] - history_tile_atm = @[ATMTILESIZE] - pio_rearranger = box - ocean_albedo_limit = @[ocean_albedo_limit] -:: -ALLCOMP_attributes:: - ScalarFieldCount = 2 - ScalarFieldIdxGridNX = 1 - ScalarFieldIdxGridNY = 2 - ScalarFieldName = cpl_scalars - start_type = @[RUNTYPE] - restart_dir = RESTART/ - case_name = ufs.cpld - restart_n = @[RESTART_N] - restart_option = nhours - restart_ymd = -999 - dbug_flag = @[cap_dbug_flag] - stop_n = @[FHMAX] - stop_option = nhours - stop_ymd = -999 - orb_eccen = 1.e36 - orb_iyear = 2000 - orb_iyear_align = 2000 - orb_mode = fixed_year - orb_mvelp = 1.e36 - orb_obliq = 1.e36 -:: diff --git a/parm/ufs/ufs.configure.cpld_outerwave.IN b/parm/ufs/ufs.configure.cpld_outerwave.IN deleted file mode 100644 index 9a45d5ff9a..0000000000 --- 
a/parm/ufs/ufs.configure.cpld_outerwave.IN +++ /dev/null @@ -1,139 +0,0 @@ -############################################# -#### UFS Run-Time Configuration File ##### -############################################# - -# ESMF # -logKindFlag: @[esmf_logkind] -globalResourceControl: true - -# EARTH # -EARTH_component_list: MED ATM OCN ICE WAV -EARTH_attributes:: - Verbosity = 0 -:: - -# MED # -MED_model: @[med_model] -MED_petlist_bounds: @[med_petlist_bounds] -MED_omp_num_threads: @[med_omp_num_threads] -:: - -# ATM # -ATM_model: @[atm_model] -ATM_petlist_bounds: @[atm_petlist_bounds] -ATM_omp_num_threads: @[atm_omp_num_threads] -ATM_attributes:: - Verbosity = 0 - DumpFields = @[DumpFields] - ProfileMemory = false - OverwriteSlice = true -:: - -# OCN # -OCN_model: @[ocn_model] -OCN_petlist_bounds: @[ocn_petlist_bounds] -OCN_omp_num_threads: @[ocn_omp_num_threads] -OCN_attributes:: - Verbosity = 0 - DumpFields = @[DumpFields] - ProfileMemory = false - OverwriteSlice = true - mesh_ocn = @[MESH_OCN_ICE] - use_coldstart = @[use_coldstart] - use_mommesh = @[use_mommesh] -:: - -# ICE # -ICE_model: @[ice_model] -ICE_petlist_bounds: @[ice_petlist_bounds] -ICE_omp_num_threads: @[ice_omp_num_threads] -ICE_attributes:: - Verbosity = 0 - DumpFields = @[DumpFields] - ProfileMemory = false - OverwriteSlice = true - mesh_ice = @[MESH_OCN_ICE] - eps_imesh = @[eps_imesh] - stop_n = @[RESTART_N] - stop_option = nhours - stop_ymd = -999 -:: - -# WAV # -WAV_model: @[wav_model] -WAV_petlist_bounds: @[wav_petlist_bounds] -WAV_omp_num_threads: @[wav_omp_num_threads] -WAV_attributes:: - Verbosity = 0 - OverwriteSlice = false - mesh_wav = @[MESH_WAV] -:: - -# CMEPS warm run sequence -runSeq:: -@@[coupling_interval_slow_sec] - MED med_phases_prep_wav_avg - MED med_phases_prep_ocn_avg - MED -> WAV :remapMethod=redist - MED -> OCN :remapMethod=redist - WAV - OCN - @@[coupling_interval_fast_sec] - MED med_phases_prep_atm - MED med_phases_prep_ice - MED -> ATM :remapMethod=redist - MED -> ICE 
:remapMethod=redist
-  ATM
-  ICE
-  ATM -> MED :remapMethod=redist
-  MED med_phases_post_atm
-  ICE -> MED :remapMethod=redist
-  MED med_phases_post_ice
-  MED med_phases_ocnalb_run
-  MED med_phases_prep_ocn_accum
-  MED med_phases_prep_wav_accum
-  @
-  OCN -> MED :remapMethod=redist
-  WAV -> MED :remapMethod=redist
-  MED med_phases_post_ocn
-  MED med_phases_post_wav
-  MED med_phases_restart_write
-@
-::
-
-# CMEPS variables
-
-DRIVER_attributes::
-::
-MED_attributes::
-  ATM_model = @[atm_model]
-  ICE_model = @[ice_model]
-  OCN_model = @[ocn_model]
-  WAV_model = @[wav_model]
-  coupling_mode = @[CPLMODE]
-  history_tile_atm = @[ATMTILESIZE]
-  pio_rearranger = box
-  ocean_albedo_limit = @[ocean_albedo_limit]
-::
-ALLCOMP_attributes::
-  ScalarFieldCount = 2
-  ScalarFieldIdxGridNX = 1
-  ScalarFieldIdxGridNY = 2
-  ScalarFieldName = cpl_scalars
-  start_type = @[RUNTYPE]
-  restart_dir = RESTART/
-  case_name = ufs.cpld
-  restart_n = @[RESTART_N]
-  restart_option = nhours
-  restart_ymd = -999
-  dbug_flag = @[cap_dbug_flag]
-  stop_n = @[FHMAX]
-  stop_option = nhours
-  stop_ymd = -999
-  orb_eccen = 1.e36
-  orb_iyear = 2000
-  orb_iyear_align = 2000
-  orb_mode = fixed_year
-  orb_mvelp = 1.e36
-  orb_obliq = 1.e36
-::
diff --git a/parm/ufs/ufs.configure.cpld_wave.IN b/parm/ufs/ufs.configure.cpld_wave.IN
deleted file mode 100644
index 37a462a5d4..0000000000
--- a/parm/ufs/ufs.configure.cpld_wave.IN
+++ /dev/null
@@ -1,139 +0,0 @@
-#############################################
-####  UFS Run-Time Configuration File  #####
-#############################################
-
-# ESMF #
-logKindFlag: @[esmf_logkind]
-globalResourceControl: true
-
-# EARTH #
-EARTH_component_list: MED ATM OCN ICE WAV
-EARTH_attributes::
-  Verbosity = 0
-::
-
-# MED #
-MED_model: @[med_model]
-MED_petlist_bounds: @[med_petlist_bounds]
-MED_omp_num_threads: @[med_omp_num_threads]
-::
-
-# ATM #
-ATM_model: @[atm_model]
-ATM_petlist_bounds: @[atm_petlist_bounds]
-ATM_omp_num_threads: @[atm_omp_num_threads]
-ATM_attributes::
-  Verbosity = 0
-  DumpFields = @[DumpFields]
-  ProfileMemory = false
-  OverwriteSlice = true
-::
-
-# OCN #
-OCN_model: @[ocn_model]
-OCN_petlist_bounds: @[ocn_petlist_bounds]
-OCN_omp_num_threads: @[ocn_omp_num_threads]
-OCN_attributes::
-  Verbosity = 0
-  DumpFields = @[DumpFields]
-  ProfileMemory = false
-  OverwriteSlice = true
-  mesh_ocn = @[MESH_OCN_ICE]
-  use_coldstart = @[use_coldstart]
-  use_mommesh = @[use_mommesh]
-::
-
-# ICE #
-ICE_model: @[ice_model]
-ICE_petlist_bounds: @[ice_petlist_bounds]
-ICE_omp_num_threads: @[ice_omp_num_threads]
-ICE_attributes::
-  Verbosity = 0
-  DumpFields = @[DumpFields]
-  ProfileMemory = false
-  OverwriteSlice = true
-  mesh_ice = @[MESH_OCN_ICE]
-  eps_imesh = @[eps_imesh]
-  stop_n = @[RESTART_N]
-  stop_option = nhours
-  stop_ymd = -999
-::
-
-# WAV #
-WAV_model: @[wav_model]
-WAV_petlist_bounds: @[wav_petlist_bounds]
-WAV_omp_num_threads: @[wav_omp_num_threads]
-WAV_attributes::
-  Verbosity = 0
-  OverwriteSlice = false
-  mesh_wav = @[MESH_WAV]
-::
-
-# CMEPS warm run sequence
-runSeq::
-@@[coupling_interval_slow_sec]
-  MED med_phases_prep_ocn_avg
-  MED -> OCN :remapMethod=redist
-  OCN
-  @@[coupling_interval_fast_sec]
-    MED med_phases_prep_atm
-    MED med_phases_prep_ice
-    MED med_phases_prep_wav_accum
-    MED med_phases_prep_wav_avg
-    MED -> ATM :remapMethod=redist
-    MED -> ICE :remapMethod=redist
-    MED -> WAV :remapMethod=redist
-    ATM
-    ICE
-    WAV
-    ATM -> MED :remapMethod=redist
-    MED med_phases_post_atm
-    ICE -> MED :remapMethod=redist
-    MED med_phases_post_ice
-    WAV -> MED :remapMethod=redist
-    MED med_phases_post_wav
-    MED med_phases_ocnalb_run
-    MED med_phases_prep_ocn_accum
-  @
-  OCN -> MED :remapMethod=redist
-  MED med_phases_post_ocn
-  MED med_phases_restart_write
-@
-::
-
-# CMEPS variables
-
-DRIVER_attributes::
-::
-MED_attributes::
-  ATM_model = @[atm_model]
-  ICE_model = @[ice_model]
-  OCN_model = @[ocn_model]
-  WAV_model = @[wav_model]
-  coupling_mode = @[CPLMODE]
-  history_tile_atm = @[ATMTILESIZE]
-  pio_rearranger = box
-  ocean_albedo_limit = @[ocean_albedo_limit]
-::
-ALLCOMP_attributes::
-  ScalarFieldCount = 2
-  ScalarFieldIdxGridNX = 1
-  ScalarFieldIdxGridNY = 2
-  ScalarFieldName = cpl_scalars
-  start_type = @[RUNTYPE]
-  restart_dir = RESTART/
-  case_name = ufs.cpld
-  restart_n = @[RESTART_N]
-  restart_option = nhours
-  restart_ymd = -999
-  dbug_flag = @[cap_dbug_flag]
-  stop_n = @[FHMAX]
-  stop_option = nhours
-  stop_ymd = -999
-  orb_eccen = 1.e36
-  orb_iyear = 2000
-  orb_iyear_align = 2000
-  orb_mode = fixed_year
-  orb_mvelp = 1.e36
-  orb_obliq = 1.e36
-::
diff --git a/parm/ufs/ufs.configure.leapfrog_atm_wav.IN b/parm/ufs/ufs.configure.leapfrog_atm_wav.IN
deleted file mode 100644
index ec22c9478c..0000000000
--- a/parm/ufs/ufs.configure.leapfrog_atm_wav.IN
+++ /dev/null
@@ -1,41 +0,0 @@
-#############################################
-####  UFS Run-Time Configuration File  #####
-#############################################
-
-# ESMF #
-logKindFlag: @[esmf_logkind]
-globalResourceControl: true
-
-# EARTH #
-EARTH_component_list: ATM WAV
-EARTH_attributes::
-  Verbosity = max
-::
-
-# ATM #
-ATM_model: @[atm_model]
-ATM_petlist_bounds: @[atm_petlist_bounds]
-ATM_omp_num_threads: @[atm_omp_num_threads]
-ATM_attributes::
-  Verbosity = max
-  DumpFields = true
-::
-
-# WAV #
-WAV_model: @[wav_model]
-WAV_petlist_bounds: @[wav_petlist_bounds]
-WAV_omp_num_threads: @[wav_omp_num_threads]
-WAV_attributes::
-  Verbosity = max
-::
-
-
-
-# Run Sequence #
-runSeq::
-  @@[coupling_interval_slow_sec]
-    ATM
-    ATM -> WAV
-    WAV
-  @
-::
diff --git a/parm/wave/at_10m_interp.inp.tmpl b/parm/wave/at_10m_interp.inp.tmpl
index b2a80081e1..6f4c1f7099 100755
--- a/parm/wave/at_10m_interp.inp.tmpl
+++ b/parm/wave/at_10m_interp.inp.tmpl
@@ -5,7 +5,7 @@ $ Start Time DT NSteps
 $ Total number of grids
  2
 $ Grid extensions
- 'gnh_10m'
+ 'uglo_m1g16'
  'at_10m'
 $
  0
diff --git a/parm/wave/ep_10m_interp.inp.tmpl b/parm/wave/ep_10m_interp.inp.tmpl
index 0848854ccf..23cfd50c2e 100755
--- a/parm/wave/ep_10m_interp.inp.tmpl
+++ b/parm/wave/ep_10m_interp.inp.tmpl
@@ -5,7 +5,7 @@ $ Start Time DT NSteps
 $ Total number of grids
  2
 $ Grid extensions
- 'gnh_10m'
+ 'uglo_m1g16'
  'ep_10m'
 $
  0
diff --git a/parm/wave/glo_15mxt_interp.inp.tmpl b/parm/wave/glo_15mxt_interp.inp.tmpl
index 74bc9eebf4..19e9dae684 100755
--- a/parm/wave/glo_15mxt_interp.inp.tmpl
+++ b/parm/wave/glo_15mxt_interp.inp.tmpl
@@ -3,11 +3,9 @@ $------------------------------------------------
 $ Start Time DT NSteps
  TIME DT NSTEPS
 $ Total number of grids
-  4
+  2
 $ Grid extensions
- 'gnh_10m'
- 'aoc_9km'
- 'gsh_15m'
+ 'uglo_m1g16'
  'glo_15mxt'
 $
  0
diff --git a/parm/wave/glo_200_interp.inp.tmpl b/parm/wave/glo_200_interp.inp.tmpl
new file mode 100755
index 0000000000..c238a6fe0b
--- /dev/null
+++ b/parm/wave/glo_200_interp.inp.tmpl
@@ -0,0 +1,12 @@
+$ Input file for interpolation of GLO30m_ext Grid
+$------------------------------------------------
+$ Start Time DT NSteps
+ TIME DT NSTEPS
+$ Total number of grids
+ 2
+$ Grid extensions
+ 'uglo_100km'
+ 'glo_200'
+$
+ 0
+$
diff --git a/parm/wave/glo_30m_interp.inp.tmpl b/parm/wave/glo_30m_interp.inp.tmpl
index ea1baf7fc4..c62881202c 100755
--- a/parm/wave/glo_30m_interp.inp.tmpl
+++ b/parm/wave/glo_30m_interp.inp.tmpl
@@ -3,11 +3,9 @@ $------------------------------------------------
 $ Start Time DT NSteps
  TIME DT NSTEPS
 $ Total number of grids
-  4
+  2
 $ Grid extensions
- 'gnh_10m'
- 'aoc_9km'
- 'gsh_15m'
+ 'uglo_m1g16'
  'glo_30m'
 $
  0
diff --git a/parm/wave/wc_10m_interp.inp.tmpl b/parm/wave/wc_10m_interp.inp.tmpl
index abb51b4dfc..8338c91d0c 100755
--- a/parm/wave/wc_10m_interp.inp.tmpl
+++ b/parm/wave/wc_10m_interp.inp.tmpl
@@ -5,7 +5,7 @@ $ Start Time DT NSteps
 $ Total number of grids
  2
 $ Grid extensions
- 'gnh_10m'
+ 'uglo_m1g16'
  'wc_10m'
 $
  0
diff --git a/scripts/exgdas_atmos_chgres_forenkf.sh b/scripts/exgdas_atmos_chgres_forenkf.sh
index d48d58947e..45c6be973d 100755
--- a/scripts/exgdas_atmos_chgres_forenkf.sh
+++ b/scripts/exgdas_atmos_chgres_forenkf.sh
@@ -17,11 +17,10 @@
 #
 ################################################################################
-source "$HOMEgfs/ush/preamble.sh"
+source "${USHgfs}/preamble.sh"
 
 # Directories.
 pwd=$(pwd)
-export FIXam=${FIXam:-$HOMEgfs/fix/am}
 
 # Base variables
 CDATE=${CDATE:-"2001010100"}
@@ -41,7 +40,7 @@ export NCP=${NCP:-"/bin/cp"}
 export NMV=${NMV:-"/bin/mv"}
 export NLN=${NLN:-"/bin/ln -sf"}
 export CHGRP_CMD=${CHGRP_CMD:-"chgrp ${group_name:-rstprod}"}
-export NCLEN=${NCLEN:-$HOMEgfs/ush/getncdimlen}
+export NCLEN=${NCLEN:-${USHgfs}/getncdimlen}
 
 # IAU
 DOIAU=${DOIAU:-"NO"}
@@ -49,7 +48,7 @@ export IAUFHRS=${IAUFHRS:-"6"}
 
 # Dependent Scripts and Executables
 export APRUN_CHGRES=${APRUN_CHGRES:-${APRUN:-""}}
-export CHGRESNCEXEC=${CHGRESNCEXEC:-$HOMEgfs/exec/enkf_chgres_recenter_nc.x}
+export CHGRESNCEXEC=${CHGRESNCEXEC:-${EXECgfs}/enkf_chgres_recenter_nc.x}
 export NTHREADS_CHGRES=${NTHREADS_CHGRES:-1}
 APRUNCFP=${APRUNCFP:-""}
@@ -59,7 +58,7 @@ SENDECF=${SENDECF:-"NO"}
 SENDDBN=${SENDDBN:-"NO"}
 
 # level info file
-SIGLEVEL=${SIGLEVEL:-${FIXam}/global_hyblev.l${LEVS}.txt}
+SIGLEVEL=${SIGLEVEL:-${FIXgfs}/am/global_hyblev.l${LEVS}.txt}
 
 # forecast files
 APREFIX=${APREFIX:-""}
@@ -129,7 +128,7 @@ if [ $DO_CALC_ANALYSIS == "YES" ]; then
       $NLN $ATMF09ENS fcst.ensres.09
    fi
    export OMP_NUM_THREADS=$NTHREADS_CHGRES
-   SIGLEVEL=${SIGLEVEL:-${FIXam}/global_hyblev.l${LEVS_ENKF}.txt}
+   SIGLEVEL=${SIGLEVEL:-${FIXgfs}/am/global_hyblev.l${LEVS_ENKF}.txt}
 
    if [ $USE_CFP = "YES" ]; then
       [[ -f $DATA/mp_chgres.sh ]] && rm $DATA/mp_chgres.sh
diff --git a/scripts/exgdas_atmos_gempak_gif_ncdc.sh b/scripts/exgdas_atmos_gempak_gif_ncdc.sh
index 63a7475a0e..2dc460cc55 100755
--- a/scripts/exgdas_atmos_gempak_gif_ncdc.sh
+++ b/scripts/exgdas_atmos_gempak_gif_ncdc.sh
@@ -6,55 +6,34 @@
 #     in the future, we should move it above somewhere else.
 ##############################################################
-source "$HOMEgfs/ush/preamble.sh"
-
-cd $DATA
-
-export NTS=$USHgempak/restore
-
-if [ $MODEL = GDAS ]
-then
-   case $MODEL in
-      GDAS) fcsthrs="000";;
-   esac
-
-   export fhr
-   for fhr in $fcsthrs
-   do
-      icnt=1
-      maxtries=180
-      while [ $icnt -lt 1000 ]
-      do
-         if [ -r ${COMIN}/${RUN}_${PDY}${cyc}f${fhr} ] ; then
-            break
-         else
-            sleep 20
-            let "icnt=icnt+1"
-         fi
-         if [ $icnt -ge $maxtries ]
-         then
-            msg="ABORTING after 1 hour of waiting for F$fhr to end."
-            err_exit $msg
-         fi
-      done
-
-      cp ${COMIN}/${RUN}_${PDY}${cyc}f${fhr} gem_grids${fhr}.gem
-      export err=$?
-      if [[ $err -ne 0 ]] ; then
-         echo " File: ${COMIN}/${RUN}_${PDY}${cyc}f${fhr} does not exist."
-         exit $err
-      fi
-
-      if [ $cyc -eq 00 -o $cyc -eq 12 ]
-      then
-         $USHgempak/gempak_${RUN}_f${fhr}_gif.sh
-         if [ ! -f $USHgempak/gempak_${RUN}_f${fhr}_gif.sh ] ; then
-            echo "WARNING: $USHgempak/gempak_${RUN}_f${fhr}_gif.sh FILE is missing"
-         fi
-      fi
+source "${HOMEgfs}/ush/preamble.sh"
+cd "${DATA}" || exit 2
+
+export NTS="${HOMEgfs}/gempak/ush/restore"
+
+if [[ ${MODEL} == GDAS ]]; then
+   fcsthrs="000"
+
+   sleep_interval=20
+   max_tries=180
+   export fhr3
+   for fhr3 in ${fcsthrs}; do
+      gempak_file="${COM_ATMOS_GEMPAK_1p00}/${RUN}_1p00_${PDY}${cyc}f${fhr3}"
+      if ! wait_for_file "${gempak_file}" "${sleep_interval}" "${max_tries}" ; then
+         echo "FATAL ERROR: ${gempak_file} not found after ${max_tries} iterations"
+         exit 10
+      fi
+
+      cp "${gempak_file}" "gem_grids${fhr3}.gem"
+      export err=$?
+      if (( err != 0 )) ; then
+         echo "FATAL: Could not copy ${gempak_file}"
+         exit "${err}"
+      fi
+
+      "${HOMEgfs}/gempak/ush/gempak_${RUN}_f${fhr3}_gif.sh"
    done
 fi
-
 exit
diff --git a/scripts/exgdas_atmos_nawips.sh b/scripts/exgdas_atmos_nawips.sh
index 94a23f2a85..ea350239c1 100755
--- a/scripts/exgdas_atmos_nawips.sh
+++ b/scripts/exgdas_atmos_nawips.sh
@@ -10,53 +10,24 @@ # echo "    data on the CCS is properly protected."
 #####################################################################
-source "$HOMEgfs/ush/preamble.sh" "${2}"
+source "${USHgfs}/preamble.sh" "${2}"
 
-cd $DATA
-RUN2=$1
+cd "${DATA}" || exit 1
+grid=$1
 fend=$2
 DBN_ALERT_TYPE=$3
 destination=$4
 
-DATA_RUN=$DATA/$RUN2
-mkdir -p $DATA_RUN
-cd $DATA_RUN
-
-cp $FIXgempak/g2varswmo2.tbl g2varswmo2.tbl
-export err=$?
-if [[ $err -ne 0 ]] ; then
-   echo " File g2varswmo2.tbl file is missing."
-   exit $err
-fi
-cp $FIXgempak/g2vcrdwmo2.tbl g2vcrdwmo2.tbl
-export err=$?
-if [[ $err -ne 0 ]] ; then
-   echo " File g2vcrdwmo2.tbl file is missing."
-   exit $err
-fi
-
-cp $FIXgempak/g2varsncep1.tbl g2varsncep1.tbl
-export err=$?
-if [[ $err -ne 0 ]] ; then
-   echo " File g2varsncep1.tbl file is missing."
-   exit $err
-fi
-
-cp $FIXgempak/g2vcrdncep1.tbl g2vcrdncep1.tbl
-export err=$?
-if [[ $err -ne 0 ]] ; then
-   echo " File g2vcrdncep1.tbl file is missing."
-   exit $err
-fi
-
-#
-NAGRIB=$GEMEXE/nagrib2_nc
-export err=$?
-if [[ $err -ne 0 ]] ; then
-   echo " File $GEMEXE/nagrib2_nc is missing."
-   echo " WARNING: module GEMPAK was not loaded"
-   exit $err
-fi
+DATA_RUN="${DATA}/${grid}"
+mkdir -p "${DATA_RUN}"
+cd "${DATA_RUN}" || exit 1
+
+for table in g2varswmo2.tbl g2vcrdwmo2.tbl g2varsncep1.tbl g2vcrdncep1.tbl; do
+   cp "${HOMEgfs}/gempak/fix/${table}" "${table}" || \
+      ( echo "FATAL ERROR: ${table} is missing" && exit 2 )
+done
+
+NAGRIB="${GEMEXE}/nagrib2"
 
 cpyfil=gds
 garea=dset
@@ -68,81 +39,55 @@ proj=
 output=T
 pdsext=no
 
-maxtries=180
-fhcnt=$fstart
-while [ $fhcnt -le $fend ] ; do
-   fhr=$(printf "%03d" $fhcnt)
-   fhcnt3=$(expr $fhr % 3)
+sleep_interval=10
+max_tries=180
 
-   fhr3=$(printf "%03d" $fhcnt)
+fhr=$(( 10#${fstart} ))
+while (( fhr <= 10#${fend} )); do
+   fhr3=$(printf "%03d" "${fhr}")
 
-   GEMGRD=${RUN2}_${PDY}${cyc}f${fhr3}
+   source_dirvar="COM_ATMOS_GRIB_${grid}"
+   GEMGRD="${RUN}_${grid}_${PDY}${cyc}f${fhr3}"
+   export GRIBIN="${!source_dirvar}/${model}.${cycle}.pgrb2.${grid}.f${fhr3}"
+   GRIBIN_chk="${GRIBIN}.idx"
 
-   if [[ ${RUN2} = "gdas_0p25" ]]; then
-      export GRIBIN=${COM_ATMOS_GRIB_0p25}/${model}.${cycle}.pgrb2.0p25.f${fhr}
-      if [[ ! -f ${GRIBIN} ]] ; then
-         echo "WARNING: ${GRIBIN} FILE is missing"
-      fi
-      GRIBIN_chk=${COM_ATMOS_GRIB_0p25}${model}.${cycle}.pgrb2.0p25.f${fhr}.idx
-   else
-      export GRIBIN=${COM_ATMOS_GRIB_1p00}/${model}.${cycle}.pgrb2.1p00.f${fhr}
-      if [[ ! -f ${GRIBIN} ]] ; then
-         echo "WARNING: ${GRIBIN} FILE is missing"
-      fi
-      GRIBIN_chk=${COM_ATMOS_GRIB_1p00}/${model}.${cycle}.pgrb2.1p00.f${fhr}.idx
+   if ! wait_for_file "${GRIBIN_chk}" "${sleep_interval}" "${max_tries}"; then
+      echo "FATAL ERROR: after 1 hour of waiting for ${GRIBIN_chk} file at F${fhr3} to end."
+      export err=7 ; err_chk
+      exit "${err}"
    fi
 
-   icnt=1
-   while [ $icnt -lt 1000 ]
-   do
-      if [ -r $GRIBIN_chk ] ; then
-         sleep 5
-         break
-      else
-         echo "The process is waiting ... ${GRIBIN_chk} file to proceed."
-         sleep 20
-         let "icnt=icnt+1"
-      fi
-      if [ $icnt -ge $maxtries ]
-      then
-         echo "ABORTING: after 1 hour of waiting for ${GRIBIN_chk} file at F$fhr to end."
-         export err=7 ; err_chk
-         exit $err
-      fi
-   done
-
-   cp $GRIBIN grib$fhr
-
-   export pgm="nagrib2 F$fhr"
+   cp "${GRIBIN}" "grib${fhr3}"
+
+   export pgm="nagrib2 F${fhr3}"
    startmsg
 
-   $NAGRIB << EOF
-   GBFILE   = grib$fhr
+   ${NAGRIB} << EOF
+   GBFILE   = grib${fhr3}
    INDXFL   =
-   GDOUTF   = $GEMGRD
-   PROJ     = $proj
-   GRDAREA  = $grdarea
-   KXKY     = $kxky
-   MAXGRD   = $maxgrd
-   CPYFIL   = $cpyfil
-   GAREA    = $garea
-   OUTPUT   = $output
-   GBTBLS   = $gbtbls
+   GDOUTF   = ${GEMGRD}
+   PROJ     = ${proj}
+   GRDAREA  = ${grdarea}
+   KXKY     = ${kxky}
+   MAXGRD   = ${maxgrd}
+   CPYFIL   = ${cpyfil}
+   GAREA    = ${garea}
+   OUTPUT   = ${output}
+   GBTBLS   = ${gbtbls}
   GBDIAG   =
-   PDSEXT   = $pdsext
+   PDSEXT   = ${pdsext}
   l
   r
 EOF
-   export err=$?;err_chk
+   export err=$?; err_chk
 
-   cp "${GEMGRD}" "${destination}/.${GEMGRD}"
+   cp "${GEMGRD}" "${destination}/${GEMGRD}"
    export err=$?
-   if [[ ${err} -ne 0 ]] ; then
-      echo " File ${GEMGRD} does not exist."
+   if (( err != 0 )) ; then
+      echo "FATAL ERROR: ${GEMGRD} does not exist."
       exit "${err}"
    fi
-   mv "${destination}/.${GEMGRD}" "${destination}/${GEMGRD}"
 
   if [[ ${SENDDBN} = "YES" ]] ; then
      "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \
       "${destination}/${GEMGRD}"
@@ -150,14 +95,14 @@ EOF
      echo "##### DBN_ALERT_TYPE is: ${DBN_ALERT_TYPE} #####"
   fi
 
-   if [ $fhcnt -ge 240 ] ; then
-      let fhcnt=fhcnt+12
+   if (( fhr >= 240 )) ; then
+      fhr=$((fhr+12))
    else
-      let fhcnt=fhcnt+finc
+      fhr=$((fhr+finc))
    fi
 done
 
-$GEMEXE/gpend
+"${GEMEXE}/gpend"
 
 #####################################################################
diff --git a/scripts/exgdas_atmos_verfozn.sh b/scripts/exgdas_atmos_verfozn.sh
index 1810fdef5d..e681fc55c5 100755
--- a/scripts/exgdas_atmos_verfozn.sh
+++ b/scripts/exgdas_atmos_verfozn.sh
@@ -1,6 +1,6 @@
 #! /usr/bin/env bash
 
-source "${HOMEgfs}/ush/preamble.sh"
+source "${USHgfs}/preamble.sh"
 
 ################################################################################
 # exgdas_atmos_verfozn.sh
diff --git a/scripts/exgdas_atmos_verfrad.sh b/scripts/exgdas_atmos_verfrad.sh
index 50320ffba1..bad8715acd 100755
--- a/scripts/exgdas_atmos_verfrad.sh
+++ b/scripts/exgdas_atmos_verfrad.sh
@@ -1,6 +1,6 @@
 #! /usr/bin/env bash
 
-source "${HOMEgfs}/ush/preamble.sh"
+source "${USHgfs}/preamble.sh"
 
 ################################################################################
 ####  UNIX Script Documentation Block
@@ -37,9 +37,9 @@ if [[ -s ${radstat} && -s ${biascr} ]]; then
   #------------------------------------------------------------------
   #   SATYPE is the list of expected satellite/instrument sources
-  #   in the radstat file.  It should be stored in the $TANKverf
-  #   directory.  If it isn't there then use the $FIXgdas copy.  In all
-  #   cases write it back out to the radmon.$PDY directory.  Add any
+  #   in the radstat file.  It should be stored in the $TANKverf
+  #   directory.  If it isn't there then use the gdas fix copy.  In all
+  #   cases write it back out to the radmon.$PDY directory.  Add any
   #   new sources to the list before writing back out.
   #------------------------------------------------------------------
@@ -131,15 +131,6 @@ if [[ -s ${radstat} && -s ${biascr} ]]; then
     "${USHgfs}/radmon_verf_time.sh"
     rc_time=$?
 
-    #--------------------------------------
-    #  optionally run clean_tankdir script
-    #
-    if [[ ${CLEAN_TANKVERF:-0} -eq 1 ]]; then
-      "${USHradmon}/clean_tankdir.sh" glb 60
-      rc_clean_tankdir=$?
-      echo "rc_clean_tankdir = ${rc_clean_tankdir}"
-    fi
-
 fi
diff --git a/scripts/exgdas_enkf_earc.sh b/scripts/exgdas_enkf_earc.sh
index 199b5609a2..3e54c658e9 100755
--- a/scripts/exgdas_enkf_earc.sh
+++ b/scripts/exgdas_enkf_earc.sh
@@ -1,6 +1,6 @@
 #! /usr/bin/env bash
 
-source "${HOMEgfs}/ush/preamble.sh"
+source "${USHgfs}/preamble.sh"
 
 ##############################################
 # Begin JOB SPECIFIC work
@@ -15,16 +15,16 @@ if [ "${EARCICS_CYC}" -lt 0 ]; then
     EARCICS_CYC=$((EARCICS_CYC+24))
 fi
 
-"${HOMEgfs}/ush/hpssarch_gen.sh" "${RUN}"
+"${USHgfs}/hpssarch_gen.sh" "${RUN}"
 status=$?
 if [ "${status}" -ne 0 ]; then
-    echo "${HOMEgfs}/ush/hpssarch_gen.sh ${RUN} failed, ABORT!"
+    echo "${USHgfs}/hpssarch_gen.sh ${RUN} failed, ABORT!"
     exit "${status}"
 fi
 
 cd "${ROTDIR}" || exit 2
 
-source "${HOMEgfs}/ush/file_utils.sh"
+source "${USHgfs}/file_utils.sh"
 
 ###################################################################
 # ENSGRP > 0 archives a group of ensemble members
diff --git a/scripts/exgdas_enkf_ecen.sh b/scripts/exgdas_enkf_ecen.sh
index c20d1dec78..31ec6b81c4 100755
--- a/scripts/exgdas_enkf_ecen.sh
+++ b/scripts/exgdas_enkf_ecen.sh
@@ -17,7 +17,7 @@
 #
 ################################################################################
-source "$HOMEgfs/ush/preamble.sh"
+source "${USHgfs}/preamble.sh"
 
 # Directories.
 pwd=$(pwd)
@@ -31,16 +31,16 @@ ntiles=${ntiles:-6}
 # Utilities
 NCP=${NCP:-"/bin/cp -p"}
 NLN=${NLN:-"/bin/ln -sf"}
-NCLEN=${NCLEN:-$HOMEgfs/ush/getncdimlen}
+NCLEN=${NCLEN:-${USHgfs}/getncdimlen}
 
 # Scripts
 
 # Executables.
-GETATMENSMEANEXEC=${GETATMENSMEANEXEC:-$HOMEgfs/exec/getsigensmeanp_smooth.x}
-GETSFCENSMEANEXEC=${GETSFCENSMEANEXEC:-$HOMEgfs/exec/getsfcensmeanp.x}
-RECENATMEXEC=${RECENATMEXEC:-$HOMEgfs/exec/recentersigp.x}
-CALCINCNEMSEXEC=${CALCINCNEMSEXEC:-$HOMEgfs/exec/calc_increment_ens.x}
-CALCINCNCEXEC=${CALCINCEXEC:-$HOMEgfs/exec/calc_increment_ens_ncio.x}
+GETATMENSMEANEXEC=${GETATMENSMEANEXEC:-${EXECgfs}/getsigensmeanp_smooth.x}
+GETSFCENSMEANEXEC=${GETSFCENSMEANEXEC:-${EXECgfs}/getsfcensmeanp.x}
+RECENATMEXEC=${RECENATMEXEC:-${EXECgfs}/recentersigp.x}
+CALCINCNEMSEXEC=${CALCINCNEMSEXEC:-${EXECgfs}/calc_increment_ens.x}
+CALCINCNCEXEC=${CALCINCEXEC:-${EXECgfs}/calc_increment_ens_ncio.x}
 
 # Files.
 OPREFIX=${OPREFIX:-""}
@@ -51,7 +51,6 @@ GPREFIX=${GPREFIX:-""}
 GPREFIX_ENS=${GPREFIX_ENS:-$GPREFIX}
 
 # Variables
-NMEM_ENS=${NMEM_ENS:-80}
 imp_physics=${imp_physics:-99}
 INCREMENTS_TO_ZERO=${INCREMENTS_TO_ZERO:-"'NONE'"}
 DOIAU=${DOIAU_ENKF:-"NO"}
@@ -59,25 +58,29 @@ FHMIN=${FHMIN_ECEN:-3}
 FHMAX=${FHMAX_ECEN:-9}
 FHOUT=${FHOUT_ECEN:-3}
 FHSFC=${FHSFC_ECEN:-$FHMIN}
-if [ $RUN = "enkfgfs" ]; then
+NMEM_ENS_MAX=${NMEM_ENS:-80}
+if [ "${RUN}" = "enkfgfs" ]; then
    DO_CALC_INCREMENT=${DO_CALC_INCREMENT_ENKF_GFS:-"NO"}
+   NMEM_ENS=${NMEM_ENS_GFS:-30}
+   ec_offset=${NMEM_ENS_GFS_OFFSET:-20}
+   mem_offset=$((ec_offset * cyc/6))
 else
    DO_CALC_INCREMENT=${DO_CALC_INCREMENT:-"NO"}
+   NMEM_ENS=${NMEM_ENS:-80}
+   mem_offset=0
 fi
 
 # global_chgres stuff
-CHGRESNEMS=${CHGRESNEMS:-$HOMEgfs/exec/enkf_chgres_recenter.x}
-CHGRESNC=${CHGRESNC:-$HOMEgfs/exec/enkf_chgres_recenter_nc.x}
+CHGRESNEMS=${CHGRESNEMS:-${EXECgfs}/enkf_chgres_recenter.x}
+CHGRESNC=${CHGRESNC:-${EXECgfs}/enkf_chgres_recenter_nc.x}
 NTHREADS_CHGRES=${NTHREADS_CHGRES:-24}
 APRUN_CHGRES=${APRUN_CHGRES:-""}
 
 # global_cycle stuff
-CYCLESH=${CYCLESH:-$HOMEgfs/ush/global_cycle.sh}
-export CYCLEXEC=${CYCLEXEC:-$HOMEgfs/exec/global_cycle}
+CYCLESH=${CYCLESH:-${USHgfs}/global_cycle.sh}
+export CYCLEXEC=${CYCLEXEC:-${EXECgfs}/global_cycle}
 APRUN_CYCLE=${APRUN_CYCLE:-${APRUN:-""}}
 NTHREADS_CYCLE=${NTHREADS_CYCLE:-${NTHREADS:-1}}
-export FIXorog=${FIXorog:-$HOMEgfs/fix/orog}
-export FIXam=${FIXam:-$HOMEgfs/fix/am}
 export CYCLVARS=${CYCLVARS:-"FSNOL=-2.,FSNOS=99999.,"}
 export FHOUR=${FHOUR:-0}
 export DELTSFC=${DELTSFC:-6}
@@ -108,12 +111,17 @@ ENKF_SUFFIX="s"
 for FHR in $(seq $FHMIN $FHOUT $FHMAX); do
 
 for imem in $(seq 1 $NMEM_ENS); do
+   smem=$((imem + mem_offset))
+   if (( smem > NMEM_ENS_MAX )); then
+      smem=$((smem - NMEM_ENS_MAX))
+   fi
+   gmemchar="mem"$(printf %03i $smem)
    memchar="mem"$(printf %03i $imem)
 
    MEMDIR=${memchar} YMD=${PDY} HH=${cyc} generate_com -x \
       COM_ATMOS_ANALYSIS_MEM:COM_ATMOS_ANALYSIS_TMPL
-   MEMDIR=${memchar} RUN=${GDUMP_ENS} YMD=${gPDY} HH=${gcyc} generate_com -x \
+   MEMDIR=${gmemchar} RUN=${GDUMP_ENS} YMD=${gPDY} HH=${gcyc} generate_com -x \
       COM_ATMOS_HISTORY_MEM_PREV:COM_ATMOS_HISTORY_TMPL
 
    ${NLN} "${COM_ATMOS_HISTORY_MEM_PREV}/${GPREFIX_ENS}atmf00${FHR}${ENKF_SUFFIX}.nc" "./atmges_${memchar}"
@@ -241,7 +249,7 @@ if [ $RECENTER_ENKF = "YES" ]; then
    $NLN $ATMANL_GSI atmanl_gsi
    $NLN $ATMANL_GSI_ENSRES atmanl_gsi_ensres
-   SIGLEVEL=${SIGLEVEL:-${FIXam}/global_hyblev.l${LEVS}.txt}
+   SIGLEVEL=${SIGLEVEL:-${FIXgfs}/am/global_hyblev.l${LEVS}.txt}
 
    $NLN $CHGRESNC chgres.x
    chgresnml=chgres_nc_gauss.nml
   nmltitle=chgres
diff --git a/scripts/exgdas_enkf_fcst.sh b/scripts/exgdas_enkf_fcst.sh
deleted file mode 100755
index fd6136ddd2..0000000000
--- a/scripts/exgdas_enkf_fcst.sh
+++ /dev/null
@@ -1,225 +0,0 @@
-#! /usr/bin/env bash
-
-################################################################################
-####  UNIX Script Documentation Block
-#                      .                                             .
-# Script name:         exgdas_enkf_fcst.sh
-# Script description:  Run ensemble forecasts
-#
-# Author:        Rahul Mahajan      Org: NCEP/EMC     Date: 2017-03-02
-#
-# Abstract: This script runs ensemble forecasts serially one-after-another
-#
-# $Id$
-#
-# Attributes:
-#   Language: POSIX shell
-#
-####
-################################################################################
-
-source "${HOMEgfs}/ush/preamble.sh"
-
-# Enemble group, begin and end
-ENSGRP=${ENSGRP:-1}
-ENSBEG=${ENSBEG:-1}
-ENSEND=${ENSEND:-1}
-
-# Re-run failed members, or entire group
-RERUN_EFCSGRP=${RERUN_EFCSGRP:-"YES"}
-
-# Recenter flag and increment file prefix
-RECENTER_ENKF=${RECENTER_ENKF:-"YES"}
-export PREFIX_ATMINC=${PREFIX_ATMINC:-""}
-
-################################################################################
-# Preprocessing
-cd "${DATA}" || exit 99
-DATATOP=${DATA}
-
-################################################################################
-# Set output data
-EFCSGRP="${COM_TOP}/efcs.grp${ENSGRP}"
-if [[ -f ${EFCSGRP} ]]; then
-    if [[ ${RERUN_EFCSGRP} = "YES" ]]; then
-        rm -f "${EFCSGRP}"
-    else
-        echo "RERUN_EFCSGRP = ${RERUN_EFCSGRP}, will re-run FAILED members only!"
-        ${NMV} "${EFCSGRP}" "${EFCSGRP}.fail"
-    fi
-fi
-
-################################################################################
-# Set namelist/model config options common to all members once
-
-# There are many many model namelist options
-# Some are resolution (CASE) dependent, some depend on the model configuration
-# and will need to be added here before $FORECASTSH is called
-# For now assume that
-# 1. the ensemble and the deterministic are same resolution
-# 2. the ensemble runs with the same configuration as the deterministic
-
-# Model config option for Ensemble
-export TYPE=${TYPE_ENKF:-${TYPE:-nh}}                  # choices:  nh, hydro
-export MONO=${MONO_ENKF:-${MONO:-non-mono}}            # choices:  mono, non-mono
-
-# fv_core_nml
-export CASE=${CASE_ENS:-${CASE:-C768}}
-export layout_x=${layout_x_ENKF:-${layout_x:-8}}
-export layout_y=${layout_y_ENKF:-${layout_y:-16}}
-export LEVS=${LEVS_ENKF:-${LEVS:-64}}
-
-# nggps_diag_nml
-export FHOUT=${FHOUT_ENKF:-3}
-if [[ ${RUN} == "enkfgfs" ]]; then
-    export FHOUT=${FHOUT_ENKF_GFS:-${FHOUT_ENKF:${FHOUT:-3}}}
-fi
-# model_configure
-export DELTIM=${DELTIM_ENKF:-${DELTIM:-225}}
-export FHMAX=${FHMAX_ENKF:-9}
-if [[ ${RUN} == "enkfgfs" ]]; then
-    export FHMAX=${FHMAX_ENKF_GFS:-${FHMAX_ENKF:-${FHMAX}}}
-fi
-
-# gfs_physics_nml
-export FHSWR=${FHSWR_ENKF:-${FHSWR:-3600.}}
-export FHLWR=${FHLWR_ENKF:-${FHLWR:-3600.}}
-export IEMS=${IEMS_ENKF:-${IEMS:-1}}
-export ISOL=${ISOL_ENKF:-${ISOL:-2}}
-export IAER=${IAER_ENKF:-${IAER:-111}}
-export ICO2=${ICO2_ENKF:-${ICO2:-2}}
-export cdmbgwd=${cdmbgwd_ENKF:-${cdmbgwd:-"3.5,0.25"}}
-export dspheat=${dspheat_ENKF:-${dspheat:-".true."}}
-export shal_cnv=${shal_cnv_ENKF:-${shal_cnv:-".true."}}
-export FHZER=${FHZER_ENKF:-${FHZER:-6}}
-export FHCYC=${FHCYC_ENKF:-${FHCYC:-6}}
-
-# Set PREFIX_ATMINC to r when recentering on
-if [[ ${RECENTER_ENKF} = "YES" ]]; then
-    export PREFIX_ATMINC="r"
-fi
-
-# Ignore possible spelling error (nothing is misspelled)
-# shellcheck disable=SC2153
-GDATE=$(${NDATE} -"${assim_freq}" "${PDY}${cyc}")
-declare -x gPDY="${GDATE:0:8}"
-declare -x gcyc="${GDATE:8:2}"
-
-################################################################################
-# Run forecast for ensemble member
-rc=0
-for imem in $(seq "${ENSBEG}" "${ENSEND}"); do
-
-    cd "${DATATOP}"
-
-    ENSMEM=$(printf %03i "${imem}")
-    export ENSMEM
-    memchar="mem${ENSMEM}"
-
-    echo "Processing MEMBER: ${ENSMEM}"
-
-    ra=0
-
-    skip_mem="NO"
-    if [[ -f ${EFCSGRP}.fail ]]; then
-        set +e
-        memstat=$(grep "MEMBER ${ENSMEM}" "${EFCSGRP}.fail" | grep -c "PASS")
-        set_strict
-        [[ ${memstat} -eq 1 ]] && skip_mem="YES"
-    fi
-
-    # Construct COM variables from templates (see config.com)
-    # Can't make these read-only because we are looping over members
-    MEMDIR="${memchar}" YMD=${PDY} HH=${cyc} generate_com -x COM_ATMOS_RESTART COM_ATMOS_INPUT COM_ATMOS_ANALYSIS \
-        COM_ATMOS_HISTORY COM_ATMOS_MASTER COM_CONF
-
-    MEMDIR="${memchar}" YMD="${gPDY}" HH="${gcyc}" generate_com -x COM_ATMOS_RESTART_PREV:COM_ATMOS_RESTART_TMPL
-
-    if [[ ${DO_WAVE} == "YES" ]]; then
-        MEMDIR="${memchar}" YMD=${PDY} HH=${cyc} generate_com -x COM_WAVE_RESTART COM_WAVE_PREP COM_WAVE_HISTORY
-        MEMDIR="${memchar}" YMD="${gPDY}" HH="${gcyc}" generate_com -x COM_WAVE_RESTART_PREV:COM_WAVE_RESTART_TMPL
-    fi
-
-    if [[ ${DO_OCN} == "YES" ]]; then
-        MEMDIR="${memchar}" YMD=${PDY} HH=${cyc} generate_com -x COM_MED_RESTART COM_OCEAN_RESTART \
-            COM_OCEAN_INPUT COM_OCEAN_HISTORY COM_OCEAN_ANALYSIS
-        MEMDIR="${memchar}" YMD="${gPDY}" HH="${gcyc}" generate_com -x COM_OCEAN_RESTART_PREV:COM_OCEAN_RESTART_TMPL
-    fi
-
-    if [[ ${DO_ICE} == "YES" ]]; then
-        MEMDIR="${memchar}" YMD=${PDY} HH=${cyc} generate_com -x COM_ICE_HISTORY COM_ICE_INPUT COM_ICE_RESTART
-        MEMDIR="${memchar}" YMD="${gPDY}" HH="${gcyc}" generate_com -x COM_ICE_RESTART_PREV:COM_ICE_RESTART_TMPL
-    fi
-
-    if [[ ${DO_AERO} == "YES" ]]; then
-        MEMDIR="${memchar}" YMD=${PDY} HH=${cyc} generate_com -x COM_CHEM_HISTORY
-    fi
-
-
-    if [[ ${skip_mem} = "NO" ]]; then
-
-        ra=0
-
-        export MEMBER=${imem}
-        export DATA="${DATATOP}/${memchar}"
-        if [[ -d ${DATA} ]]; then rm -rf "${DATA}"; fi
-        mkdir -p "${DATA}"
-        ${FORECASTSH}
-        ra=$?
-
-        # Notify a member forecast failed and abort
-        if [[ ${ra} -ne 0 ]]; then
-            err_exit "FATAL ERROR: forecast of member ${ENSMEM} FAILED. Aborting job"
-        fi
-
-        rc=$((rc+ra))
-
-    fi
-
-    if [[ ${SENDDBN} = YES ]]; then
-        fhr=${FHOUT}
-        while [[ ${fhr} -le ${FHMAX} ]]; do
-            FH3=$(printf %03i "${fhr}")
-            if (( fhr % 3 == 0 )); then
-                "${DBNROOT}/bin/dbn_alert" MODEL GFS_ENKF "${job}" "${COM_ATMOS_HISTORY}/${RUN}.t${cyc}z.sfcf${FH3}.nc"
-            fi
-            fhr=$((fhr+FHOUT))
-        done
-    fi
-
-    cd "${DATATOP}"
-
-    if [[ -s ${EFCSGRP} ]]; then
-        ${NCP} "${EFCSGRP}" log_old
-    fi
-    [[ -f log ]] && rm log
-    [[ -f log_new ]] && rm log_new
-    if [[ ${ra} -ne 0 ]]; then
-        echo "MEMBER ${ENSMEM} : FAIL" > log
-    else
-        echo "MEMBER ${ENSMEM} : PASS" > log
-    fi
-    if [[ -s log_old ]] ; then
-        cat log_old log > log_new
-    else
-        cat log > log_new
-    fi
-    ${NCP} log_new "${EFCSGRP}"
-
-done
-
-################################################################################
-# Echo status of ensemble group
-cd "${DATATOP}"
-echo "Status of ensemble members in group ${ENSGRP}:"
-cat "${EFCSGRP}"
-[[ -f ${EFCSGRP}.fail ]] && rm "${EFCSGRP}".fail
-
-################################################################################
-# If any members failed, error out
-export err=${rc}; err_chk
-
-################################################################################
-#  Postprocessing
-
-exit "${err}"
diff --git a/scripts/exgdas_enkf_post.sh b/scripts/exgdas_enkf_post.sh
index 86ab9071a4..6f60068747 100755
--- a/scripts/exgdas_enkf_post.sh
+++ b/scripts/exgdas_enkf_post.sh
@@ -17,7 +17,7 @@
 #
 ################################################################################
-source "$HOMEgfs/ush/preamble.sh"
+source "${USHgfs}/preamble.sh"
 
 # Directories.
 pwd=$(pwd)
@@ -34,11 +34,11 @@ SENDDBN=${SENDDBN:-"NO"}
 
 # Fix files
 LEVS=${LEVS:-64}
-HYBENSMOOTH=${HYBENSMOOTH:-$FIXgsi/global_hybens_smoothinfo.l${LEVS}.txt}
+HYBENSMOOTH=${HYBENSMOOTH:-${FIXgfs}/gsi/global_hybens_smoothinfo.l${LEVS}.txt}
 
 # Executables.
-GETATMENSMEANEXEC=${GETATMENSMEANEXEC:-$HOMEgfs/exec/getsigensmeanp_smooth.x}
-GETSFCENSMEANEXEC=${GETSFCENSMEANEXEC:-$HOMEgfs/exec/getsfcensmeanp.x}
+GETATMENSMEANEXEC=${GETATMENSMEANEXEC:-${EXECgfs}/getsigensmeanp_smooth.x}
+GETSFCENSMEANEXEC=${GETSFCENSMEANEXEC:-${EXECgfs}/getsfcensmeanp.x}
 
 # Other variables.
 PREFIX=${PREFIX:-""}
@@ -46,10 +46,11 @@ FHMIN=${FHMIN_EPOS:-3}
 FHMAX=${FHMAX_EPOS:-9}
 FHOUT=${FHOUT_EPOS:-3}
 
-if [[ $CDUMP == "gfs" ]]; then
+if [[ "${RUN}" == "enkfgfs" ]]; then
    NMEM_ENS=${NMEM_ENS_GFS:-${NMEM_ENS:-30}}
+else
+   NMEM_ENS=${NMEM_ENS:-80}
 fi
-NMEM_ENS=${NMEM_ENS:-80}
 SMOOTH_ENKF=${SMOOTH_ENKF:-"NO"}
 ENKF_SPREAD=${ENKF_SPREAD:-"NO"}
diff --git a/scripts/exgdas_enkf_select_obs.sh b/scripts/exgdas_enkf_select_obs.sh
index 2ad624bcdb..782838df6d 100755
--- a/scripts/exgdas_enkf_select_obs.sh
+++ b/scripts/exgdas_enkf_select_obs.sh
@@ -17,7 +17,7 @@
 #
 ################################################################################
-source "$HOMEgfs/ush/preamble.sh"
+source "${USHgfs}/preamble.sh"
 
 # Directories.
 pwd=$(pwd)
@@ -26,7 +26,7 @@ pwd=$(pwd)
 export NLN=${NLN:-"/bin/ln -sf"}
 
 # Scripts.
-ANALYSISSH=${ANALYSISSH:-$HOMEgfs/scripts/exglobal_atmos_analysis.sh}
+ANALYSISSH=${ANALYSISSH:-${SCRgfs}/exglobal_atmos_analysis.sh}
 
 # Select obs
 export RUN_SELECT=${RUN_SELECT:-"YES"}
diff --git a/scripts/exgdas_enkf_sfc.sh b/scripts/exgdas_enkf_sfc.sh
index 81d68fb9fe..e94909862a 100755
--- a/scripts/exgdas_enkf_sfc.sh
+++ b/scripts/exgdas_enkf_sfc.sh
@@ -17,13 +17,14 @@
 #
 ################################################################################
-source "$HOMEgfs/ush/preamble.sh"
+source "${USHgfs}/preamble.sh"
 
 # Directories.
 pwd=$(pwd)
 
 # Base variables
 DONST=${DONST:-"NO"}
+GSI_SOILANAL=${GSI_SOILANAL:-"NO"}
 DOSFCANL_ENKF=${DOSFCANL_ENKF:-"YES"}
 export CASE=${CASE:-384}
 ntiles=${ntiles:-6}
@@ -31,7 +32,7 @@ ntiles=${ntiles:-6}
 # Utilities
 NCP=${NCP:-"/bin/cp -p"}
 NLN=${NLN:-"/bin/ln -sf"}
-NCLEN=${NCLEN:-$HOMEgfs/ush/getncdimlen}
+NCLEN=${NCLEN:-${USHgfs}/getncdimlen}
 
 # Scripts
@@ -46,16 +47,22 @@ GPREFIX=${GPREFIX:-""}
 GPREFIX_ENS=${GPREFIX_ENS:-${GPREFIX}}
 
 # Variables
-NMEM_ENS=${NMEM_ENS:-80}
+NMEM_ENS_MAX=${NMEM_ENS:-80}
+if [ "${RUN}" = "enkfgfs" ]; then
+   NMEM_ENS=${NMEM_ENS_GFS:-30}
+   ec_offset=${NMEM_ENS_GFS_OFFSET:-20}
+   mem_offset=$((ec_offset * cyc/6))
+else
+   NMEM_ENS=${NMEM_ENS:-80}
+   mem_offset=0
+fi
 DOIAU=${DOIAU_ENKF:-"NO"}
 
 # Global_cycle stuff
-CYCLESH=${CYCLESH:-$HOMEgfs/ush/global_cycle.sh}
-export CYCLEXEC=${CYCLEXEC:-$HOMEgfs/exec/global_cycle}
+CYCLESH=${CYCLESH:-${USHgfs}/global_cycle.sh}
+export CYCLEXEC=${CYCLEXEC:-${EXECgfs}/global_cycle}
 APRUN_CYCLE=${APRUN_CYCLE:-${APRUN:-""}}
 NTHREADS_CYCLE=${NTHREADS_CYCLE:-${NTHREADS:-1}}
-export FIXorog=${FIXorog:-$HOMEgfs/fix/orog}
-export FIXam=${FIXam:-$HOMEgfs/fix/am}
 export CYCLVARS=${CYCLVARS:-"FSNOL=-2.,FSNOS=99999.,"}
 export FHOUR=${FHOUR:-0}
 export DELTSFC=${DELTSFC:-6}
@@ -63,7 +70,6 @@ export DELTSFC=${DELTSFC:-6}
 APRUN_ESFC=${APRUN_ESFC:-${APRUN:-""}}
 NTHREADS_ESFC=${NTHREADS_ESFC:-${NTHREADS:-1}}
 
-
 ################################################################################
 # Preprocessing
 mkdata=NO
@@ -134,28 +140,39 @@ if [ $DOIAU = "YES" ]; then
         export TILE_NUM=$n
 
         for imem in $(seq 1 $NMEM_ENS); do
-
+            smem=$((imem + mem_offset))
+            if (( smem > NMEM_ENS_MAX )); then
+                smem=$((smem - NMEM_ENS_MAX))
+            fi
+            gmemchar="mem"$(printf %03i "$smem")
             cmem=$(printf %03i $imem)
             memchar="mem$cmem"
 
             MEMDIR=${memchar} YMD=${PDY} HH=${cyc} generate_com \
                 COM_ATMOS_RESTART_MEM:COM_ATMOS_RESTART_TMPL
 
-            MEMDIR=${memchar} RUN="enkfgdas" YMD=${gPDY} HH=${gcyc} generate_com \
+            MEMDIR=${gmemchar} RUN=${GDUMP_ENS} YMD=${gPDY} HH=${gcyc} generate_com \
                 COM_ATMOS_RESTART_MEM_PREV:COM_ATMOS_RESTART_TMPL
 
-            [[ ${TILE_NUM} -eq 1 ]] && mkdir -p "${COM_ATMOS_RESTART_MEM}"
+            MEMDIR=${memchar} YMD=${PDY} HH=${cyc} generate_com \
+                COM_ATMOS_ANALYSIS_MEM:COM_ATMOS_ANALYSIS_TMPL
 
+            [[ ${TILE_NUM} -eq 1 ]] && mkdir -p "${COM_ATMOS_RESTART_MEM}"
             ${NCP} "${COM_ATMOS_RESTART_MEM_PREV}/${bPDY}.${bcyc}0000.sfc_data.tile${n}.nc" \
                 "${COM_ATMOS_RESTART_MEM}/${bPDY}.${bcyc}0000.sfcanl_data.tile${n}.nc"
 
             ${NLN} "${COM_ATMOS_RESTART_MEM_PREV}/${bPDY}.${bcyc}0000.sfc_data.tile${n}.nc" \
                 "${DATA}/fnbgsi.${cmem}"
             ${NLN} "${COM_ATMOS_RESTART_MEM}/${bPDY}.${bcyc}0000.sfcanl_data.tile${n}.nc" \
                 "${DATA}/fnbgso.${cmem}"
-            ${NLN} "${FIXorog}/${CASE}/${CASE}_grid.tile${n}.nc" "${DATA}/fngrid.${cmem}"
-            ${NLN} "${FIXorog}/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile${n}.nc" "${DATA}/fnorog.${cmem}"
+            ${NLN} "${FIXgfs}/orog/${CASE}/${CASE}_grid.tile${n}.nc" "${DATA}/fngrid.${cmem}"
+            ${NLN} "${FIXgfs}/orog/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile${n}.nc" "${DATA}/fnorog.${cmem}"
 
-        done
+            if [[ ${GSI_SOILANAL} = "YES" ]]; then
+                FHR=6
+                ${NLN} "${COM_ATMOS_ANALYSIS_MEM}/${APREFIX_ENS}sfci00${FHR}.nc" \
+                    "${DATA}/lnd_incr.${cmem}"
+            fi
+        done # ensembles
 
         CDATE="${PDY}${cyc}" ${CYCLESH}
         export err=$?; err_chk
@@ -170,14 +187,18 @@ if [ $DOSFCANL_ENKF = "YES" ]; then
         export TILE_NUM=$n
 
         for imem in $(seq 1 $NMEM_ENS); do
-
+            smem=$((imem + mem_offset))
+            if (( smem > NMEM_ENS_MAX )); then
+                smem=$((smem - NMEM_ENS_MAX))
+            fi
+            gmemchar="mem"$(printf %03i "$smem")
             cmem=$(printf %03i $imem)
             memchar="mem$cmem"
 
             MEMDIR=${memchar} YMD=${PDY} HH=${cyc} generate_com \
                 COM_ATMOS_RESTART_MEM:COM_ATMOS_RESTART_TMPL
 
-            RUN="${GDUMP_ENS}" MEMDIR=${memchar} YMD=${gPDY} HH=${gcyc} generate_com \
+            RUN="${GDUMP_ENS}" MEMDIR=${gmemchar} YMD=${gPDY} HH=${gcyc} generate_com \
                 COM_ATMOS_RESTART_MEM_PREV:COM_ATMOS_RESTART_TMPL
 
             [[ ${TILE_NUM} -eq 1 ]] && mkdir -p "${COM_ATMOS_RESTART_MEM}"
@@ -188,8 +209,8 @@ if [ $DOSFCANL_ENKF = "YES" ]; then
                 "${DATA}/fnbgsi.${cmem}"
             ${NLN} "${COM_ATMOS_RESTART_MEM}/${PDY}.${cyc}0000.sfcanl_data.tile${n}.nc" \
                 "${DATA}/fnbgso.${cmem}"
-            ${NLN} "${FIXorog}/${CASE}/${CASE}_grid.tile${n}.nc" "${DATA}/fngrid.${cmem}"
-            ${NLN} "${FIXorog}/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile${n}.nc" "${DATA}/fnorog.${cmem}"
+            ${NLN} "${FIXgfs}/orog/${CASE}/${CASE}_grid.tile${n}.nc" "${DATA}/fngrid.${cmem}"
+            ${NLN} "${FIXgfs}/orog/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile${n}.nc" "${DATA}/fnorog.${cmem}"
 
         done
diff --git a/scripts/exgdas_enkf_update.sh b/scripts/exgdas_enkf_update.sh
index 1f11026ac4..9c42826383 100755
--- a/scripts/exgdas_enkf_update.sh
+++ b/scripts/exgdas_enkf_update.sh
@@ -17,7 +17,7 @@
 #
 ################################################################################
-source "$HOMEgfs/ush/preamble.sh"
+source "${USHgfs}/preamble.sh"
 
 # Directories.
 pwd=$(pwd)
@@ -25,7 +25,7 @@ pwd=$(pwd)
 # Utilities
 NCP=${NCP:-"/bin/cp -p"}
 NLN=${NLN:-"/bin/ln -sf"}
-NCLEN=${NCLEN:-$HOMEgfs/ush/getncdimlen}
+NCLEN=${NCLEN:-${USHgfs}/getncdimlen}
 USE_CFP=${USE_CFP:-"NO"}
 CFP_MP=${CFP_MP:-"NO"}
 nm=""
@@ -37,7 +37,7 @@ APRUN_ENKF=${APRUN_ENKF:-${APRUN:-""}}
 NTHREADS_ENKF=${NTHREADS_ENKF:-${NTHREADS:-1}}
 
 # Executables
-ENKFEXEC=${ENKFEXEC:-$HOMEgfs/exec/enkf.x}
+ENKFEXEC=${ENKFEXEC:-${EXECgfs}/enkf.x}
 
 # Cycling and forecast hour specific parameters
 CDATE=${CDATE:-"2001010100"}
@@ -56,7 +56,6 @@ ENKFSTAT=${ENKFSTAT:-${APREFIX}enkfstat}
 
 # Namelist parameters
 USE_CORRELATED_OBERRS=${USE_CORRELATED_OBERRS:-"NO"}
-NMEM_ENS=${NMEM_ENS:-80}
 NAM_ENKF=${NAM_ENKF:-""}
 SATOBS_ENKF=${SATOBS_ENKF:-""}
 OZOBS_ENKF=${OZOBS_ENKF:-""}
@@ -81,12 +80,19 @@ cnvw_option=${cnvw_option:-".false."}
 netcdf_diag=${netcdf_diag:-".true."}
 modelspace_vloc=${modelspace_vloc:-".false."} # if true, 'vlocal_eig.dat' is needed
 IAUFHRS_ENKF=${IAUFHRS_ENKF:-6}
-if [ $RUN = "enkfgfs" ]; then
+NMEM_ENS_MAX=${NMEM_ENS:-80}
+if [ "${RUN}" = "enkfgfs" ]; then
    DO_CALC_INCREMENT=${DO_CALC_INCREMENT_ENKF_GFS:-"NO"}
+   NMEM_ENS=${NMEM_ENS_GFS:-30}
+   ec_offset=${NMEM_ENS_GFS_OFFSET:-20}
+   mem_offset=$((ec_offset * cyc/6))
 else
    DO_CALC_INCREMENT=${DO_CALC_INCREMENT:-"NO"}
+   NMEM_ENS=${NMEM_ENS:-80}
+   mem_offset=0
 fi
 INCREMENTS_TO_ZERO=${INCREMENTS_TO_ZERO:-"'NONE'"}
+GSI_SOILANAL=${GSI_SOILANAL:-"NO"}
 
 ################################################################################
@@ -105,14 +111,14 @@ else
 fi
 LATA_ENKF=${LATA_ENKF:-$LATB_ENKF}
 LONA_ENKF=${LONA_ENKF:-$LONB_ENKF}
-SATANGL=${SATANGL:-${FIXgsi}/global_satangbias.txt}
-SATINFO=${SATINFO:-${FIXgsi}/global_satinfo.txt}
-CONVINFO=${CONVINFO:-${FIXgsi}/global_convinfo.txt}
-OZINFO=${OZINFO:-${FIXgsi}/global_ozinfo.txt}
-SCANINFO=${SCANINFO:-${FIXgsi}/global_scaninfo.txt}
-HYBENSINFO=${HYBENSINFO:-${FIXgsi}/global_hybens_info.l${LEVS_ENKF}.txt}
-ANAVINFO=${ANAVINFO:-${FIXgsi}/global_anavinfo.l${LEVS_ENKF}.txt}
-VLOCALEIG=${VLOCALEIG:-${FIXgsi}/vlocal_eig_l${LEVS_ENKF}.dat}
+SATANGL=${SATANGL:-${FIXgfs}/gsi/global_satangbias.txt}
+SATINFO=${SATINFO:-${FIXgfs}/gsi/global_satinfo.txt}
+CONVINFO=${CONVINFO:-${FIXgfs}/gsi/global_convinfo.txt}
+OZINFO=${OZINFO:-${FIXgfs}/gsi/global_ozinfo.txt}
+SCANINFO=${SCANINFO:-${FIXgfs}/gsi/global_scaninfo.txt}
+HYBENSINFO=${HYBENSINFO:-${FIXgfs}/gsi/global_hybens_info.l${LEVS_ENKF}.txt}
+ANAVINFO=${ANAVINFO:-${FIXgfs}/gsi/global_anavinfo.l${LEVS_ENKF}.txt}
+VLOCALEIG=${VLOCALEIG:-${FIXgfs}/gsi/vlocal_eig_l${LEVS_ENKF}.dat}
 
 ENKF_SUFFIX="s"
 [[ $SMOOTH_ENKF = "NO" ]] && ENKF_SUFFIX=""
@@ -178,9 +184,14 @@ else
 fi
 nfhrs=$(echo $IAUFHRS_ENKF | sed 's/,/ /g')
 for imem in $(seq 1 $NMEM_ENS); do
+   smem=$((imem + mem_offset))
+   if (( smem > NMEM_ENS_MAX )); then
+      smem=$((smem - NMEM_ENS_MAX))
+   fi
+   gmemchar="mem"$(printf %03i $smem)
    memchar="mem"$(printf %03i $imem)
 
-   MEMDIR=${memchar} RUN=${GDUMP_ENS} YMD=${gPDY} HH=${gcyc} generate_com -x \
+   MEMDIR=${gmemchar} RUN=${GDUMP_ENS} YMD=${gPDY} HH=${gcyc} generate_com -x \
       COM_ATMOS_HISTORY_MEM_PREV:COM_ATMOS_HISTORY_TMPL
 
   MEMDIR=${memchar} YMD=${PDY}
HH=${cyc} generate_com -x \ @@ -203,6 +214,10 @@ for imem in $(seq 1 $NMEM_ENS); do for FHR in $nfhrs; do ${NLN} "${COM_ATMOS_HISTORY_MEM_PREV}/${GPREFIX}atmf00${FHR}${ENKF_SUFFIX}.nc" \ "sfg_${PDY}${cyc}_fhr0${FHR}_${memchar}" + if [ $GSI_SOILANAL = "YES" ]; then + ${NLN} "${COM_ATMOS_HISTORY_MEM_PREV}/${GPREFIX}sfcf00${FHR}${ENKF_SUFFIX}.nc" \ + "bfg_${PDY}${cyc}_fhr0${FHR}_${memchar}" + fi if [ $cnvw_option = ".true." ]; then ${NLN} "${COM_ATMOS_HISTORY_MEM_PREV}/${GPREFIX}sfcf00${FHR}.nc" \ "sfgsfc_${PDY}${cyc}_fhr0${FHR}_${memchar}" @@ -224,6 +239,10 @@ for imem in $(seq 1 $NMEM_ENS); do "incr_${PDY}${cyc}_fhr0${FHR}_${memchar}" fi fi + if [ $GSI_SOILANAL = "YES" ]; then + ${NLN} "${COM_ATMOS_ANALYSIS_MEM}/${APREFIX}sfci00${FHR}.nc" \ + "sfcincr_${PDY}${cyc}_fhr0${FHR}_${memchar}" + fi done done @@ -238,10 +257,10 @@ for FHR in $nfhrs; do fi done -if [ $USE_CFP = "YES" ]; then +if [[ $USE_CFP = "YES" ]]; then chmod 755 $DATA/mp_untar.sh ncmd=$(cat $DATA/mp_untar.sh | wc -l) - if [ $ncmd -gt 0 ]; then + if [[ $ncmd -gt 0 ]]; then ncmd_max=$((ncmd < npe_node_max ? 
ncmd : npe_node_max)) APRUNCFP=$(eval echo $APRUNCFP) $APRUNCFP $DATA/mp_untar.sh @@ -398,8 +417,8 @@ cat stdout stderr > "${COM_ATMOS_ANALYSIS_STAT}/${ENKFSTAT}" ################################################################################ # Postprocessing -cd $pwd -[[ $mkdata = "YES" ]] && rm -rf $DATA +cd "$pwd" +[[ $mkdata = "YES" ]] && rm -rf "${DATA}" -exit $err +exit ${err} diff --git a/scripts/exgfs_atmos_awips_20km_1p0deg.sh b/scripts/exgfs_atmos_awips_20km_1p0deg.sh index 7546f3cabe..490875b2c4 100755 --- a/scripts/exgfs_atmos_awips_20km_1p0deg.sh +++ b/scripts/exgfs_atmos_awips_20km_1p0deg.sh @@ -19,7 +19,7 @@ # echo " " ############################################################################### -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" fcsthrs="$1" num=$# @@ -38,7 +38,7 @@ fi cd "${DATA}" || exit 2 # "Import" functions used in this script -source "${HOMEgfs}/ush/product_functions.sh" +source "${USHgfs}/product_functions.sh" ############################################### # Wait for the availability of the pgrb file @@ -91,7 +91,7 @@ export opt28=' -new_grid_interpolation budget -fi ' cp "${COM_ATMOS_GRIB_0p25}/gfs.t${cyc}z.pgrb2.0p25.f${fcsthrs}" "tmpfile2${fcsthrs}" cp "${COM_ATMOS_GRIB_0p25}/gfs.t${cyc}z.pgrb2b.0p25.f${fcsthrs}" "tmpfile2b${fcsthrs}" cat "tmpfile2${fcsthrs}" "tmpfile2b${fcsthrs}" > "tmpfile${fcsthrs}" -${WGRIB2} "tmpfile${fcsthrs}" | grep -F -f "${PARMproduct}/gfs_awips_parmlist_g2" | \ +${WGRIB2} "tmpfile${fcsthrs}" | grep -F -f "${PARMgfs}/product/gfs_awips_parmlist_g2" | \ ${WGRIB2} -i -grib masterfile "tmpfile${fcsthrs}" export err=$? 
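The rotating-member selection introduced in the exgdas_enkf_update.sh hunk above (offset each early-cycle member by mem_offset, wrapping past NMEM_ENS_MAX) can be exercised in isolation. The values below mirror the diff's defaults, but the script is only an illustrative sketch and touches no real COM directories:

```shell
# Sketch of the rotating-member index math from the hunks above.
NMEM_ENS_MAX=80      # total members produced by the late (enkfgdas) cycle
NMEM_ENS=30          # members consumed by the early (enkfgfs) cycle
ec_offset=20         # NMEM_ENS_GFS_OFFSET: stride applied per 6-h cycle
cyc=18               # cycle hour

mem_offset=$((ec_offset * cyc / 6))   # 60 at 18Z

for imem in $(seq 1 "${NMEM_ENS}"); do
  smem=$((imem + mem_offset))
  if (( smem > NMEM_ENS_MAX )); then
    smem=$((smem - NMEM_ENS_MAX))     # wrap back into 1..NMEM_ENS_MAX
  fi
  gmemchar="mem$(printf %03i "${smem}")"   # guess-member directory name
  memchar="mem$(printf %03i "${imem}")"
  echo "${memchar} <- ${gmemchar}"
done
```

At 18Z this maps members 1-20 onto guess members 61-80 and wraps members 21-30 onto 1-10, so successive early cycles rotate through the full late-cycle ensemble.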
if [[ $err -ne 0 ]]; then @@ -179,7 +179,7 @@ for GRID in conus ak prico pac 003; do export FORT31="awps_file_fi${fcsthrs}_${GRID}" export FORT51="grib2.awpgfs${fcsthrs}.${GRID}" - cp "${PARMwmo}/grib2_awpgfs${fcsthrs}.${GRID}" "parm_list" + cp "${PARMgfs}/wmo/grib2_awpgfs${fcsthrs}.${GRID}" "parm_list" if [[ ${DO_WAVE} != "YES" ]]; then # Remove wave field if not running wave model grep -vw "5WAVH" "parm_list" > "parm_list_temp" @@ -213,7 +213,7 @@ for GRID in conus ak prico pac 003; do export FORT31="awps_file_fi${fcsthrs}_${GRID}" export FORT51="grib2.awpgfs_20km_${GRID}_f${fcsthrs}" - cp "${PARMwmo}/grib2_awpgfs_20km_${GRID}f${fcsthrs}" "parm_list" + cp "${PARMgfs}/wmo/grib2_awpgfs_20km_${GRID}f${fcsthrs}" "parm_list" if [[ ${DO_WAVE} != "YES" ]]; then # Remove wave field if not running wave model grep -vw "5WAVH" "parm_list" > "parm_list_temp" diff --git a/scripts/exgfs_atmos_fbwind.sh b/scripts/exgfs_atmos_fbwind.sh index 735a906bff..de8e448a01 100755 --- a/scripts/exgfs_atmos_fbwind.sh +++ b/scripts/exgfs_atmos_fbwind.sh @@ -14,7 +14,7 @@ # echo " Nov 2019 - B Vuong Removed WINTEMV bulletin (retired)" ##################################################################### -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" cd $DATA @@ -42,7 +42,7 @@ do cp $COMIN/gfs.${cycle}.pgrb2.0p25.f${fhr} tmp_pgrb2_0p25${fhr} cp $COMIN/gfs.${cycle}.pgrb2b.0p25.f${fhr} tmp_pgrb2b_0p25${fhr} cat tmp_pgrb2_0p25${fhr} tmp_pgrb2b_0p25${fhr} > tmp0p25filef${fhr} - $WGRIB2 tmp0p25filef${fhr} | grep -F -f $PARMproduct/gfs_fbwnd_parmlist_g2 | $WGRIB2 -i -grib tmpfilef${fhr} tmp0p25filef${fhr} + $WGRIB2 tmp0p25filef${fhr} | grep -F -f ${PARMgfs}/product/gfs_fbwnd_parmlist_g2 | $WGRIB2 -i -grib tmpfilef${fhr} tmp0p25filef${fhr} $CNVGRIB -g21 tmpfilef${fhr} tmpfilef${fhr}.grib1 $GRBINDEX tmpfilef${fhr}.grib1 tmpfilef${fhr}.grib1i mv tmpfilef${fhr}.grib1 gfs.t${cyc}z.grbf${fhr}_grb1 @@ -68,7 +68,7 @@ export FORT51="tran.fbwnd_pacific" startmsg -$EXECgfs/fbwndgfs <
$PARMproduct/fbwnd_pacific.stnlist >> $pgmout 2> errfile +$EXECgfs/fbwndgfs < ${PARMgfs}/product/fbwnd_pacific.stnlist >> $pgmout 2> errfile export err=$?; err_chk diff --git a/scripts/exgfs_atmos_gempak_gif_ncdc_skew_t.sh b/scripts/exgfs_atmos_gempak_gif_ncdc_skew_t.sh index 2dd7fa886a..f7e981c6b6 100755 --- a/scripts/exgfs_atmos_gempak_gif_ncdc_skew_t.sh +++ b/scripts/exgfs_atmos_gempak_gif_ncdc_skew_t.sh @@ -7,107 +7,79 @@ # in the future, we should move it above somewhere else. ############################################################## -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" -cd $DATA +cd "${DATA}" || exit 1 -export NTS=$USHgempak/restore +export NTS="${HOMEgfs}/gempak/ush/restore" -if [ $MODEL = GDAS -o $MODEL = GFS ] -then - case $MODEL in - GDAS) fcsthrs="00";; - GFS) fcsthrs="00 12 24 36 48";; +if [[ "${MODEL}" == GDAS ]] || [[ "${MODEL}" == GFS ]]; then + case "${MODEL}" in + GDAS) fcsthrs="0";; + GFS) fcsthrs="0 12 24 36 48";; + *) + echo "FATAL ERROR: Unrecognized model type ${MODEL}" + exit 5 + ;; esac - export fhr - for fhr in $fcsthrs - do - icnt=1 - maxtries=180 - export GRIBFILE=${COMIN}/${RUN}_${PDY}${cyc}f0${fhr} - while [ $icnt -lt 1000 ] - do - if [ -r ${COMIN}/${RUN}_${PDY}${cyc}f0${fhr} ] ; then - sleep 5 - break - else - echo "The process is waiting ... ${GRIBFILE} file to proceed." - sleep 20 - let "icnt=icnt+1" - fi - if [ $icnt -ge $maxtries ] - then - echo "ABORTING: after 1 hour of waiting for ${GRIBFILE} file at F$fhr to end." - export err=7 ; err_chk - exit $err - fi - done - - cp ${COMIN}/${RUN}_${PDY}${cyc}f0${fhr} gem_grids${fhr}.gem - -# if [ $cyc -eq 00 -o $cyc -eq 12 ] - #then - $USHgempak/gempak_${RUN}_f${fhr}_gif.sh - #fi - + sleep_interval=20 + max_tries=180 + for fhr in ${fcsthrs}; do + fhr3=$(printf %03d "${fhr}") + export GRIBFILE=${COM_ATMOS_GEMPAK_1p00}/${RUN}_1p00_${PDY}${cyc}f${fhr3} + if ! 
wait_for_file "${GRIBFILE}" "${sleep_interval}" "${max_tries}" ; then + echo "FATAL ERROR: ${GRIBFILE} not found after ${max_tries} iterations" + exit 10 + fi + + cp "${GRIBFILE}" "gem_grids${fhr3}.gem" + export fhr3 + if (( fhr == 0 )); then + "${HOMEgfs}/gempak/ush/gempak_${RUN}_f000_gif.sh" + else + "${HOMEgfs}/gempak/ush/gempak_${RUN}_fhhh_gif.sh" + fi done fi -#################################################################################### -# echo "-----------------------------------------------------------------------------" -# echo "GFS MAG postprocessing script exmag_sigman_skew_k_gfs_gif_ncdc_skew_t.sh " -# echo "-----------------------------------------------------------------------------" -# echo "History: Mar 2012 added to processing for enhanced MAG skew_t" -# echo "2012-03-11 Mabe -- reworked script to add significant level " -# echo " data to existing mandatory level data in a new file" -# echo "2013-04-24 Mabe -- Reworked to remove unneeded output with " -# echo " conversion to WCOSS" -# Add ms to filename to make it different since it has both mandatory -# and significant level data $COMOUT/${RUN}.${cycle}.msupperair -# $COMOUT/${RUN}.${cycle}.msupperairtble -##################################################################################### - -cd $DATA - -export RSHPDY=$(echo $PDY | cut -c5-)$(echo $PDY | cut -c3-4) - -cp $HOMEgfs/gempak/dictionaries/sonde.land.tbl . -cp $HOMEgfs/gempak/dictionaries/metar.tbl . +cd "${DATA}" || exit 1 + +export RSHPDY="${PDY:4}${PDY:2:2}" + +cp "${HOMEgfs}/gempak/dictionaries/sonde.land.tbl" sonde.land.tbl +cp "${HOMEgfs}/gempak/dictionaries/metar.tbl" metar.tbl sort -k 2n,2 metar.tbl > metar_stnm.tbl -cp $COMINobsproc/${model}.$cycle.adpupa.tm00.bufr_d fort.40 -export err=$? -if [[ $err -ne 0 ]] ; then - echo " File ${model}.$cycle.adpupa.tm00.bufr_d does not exist." - exit $err +cp "${COM_OBS}/${model}.${cycle}.adpupa.tm00.bufr_d" fort.40 +err=$?
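Several hunks above replace hand-rolled polling loops with wait_for_file. The real helper ships in the workflow's ush/ directory; a minimal stand-in with the same calling convention (file, sleep interval, max tries) looks roughly like this sketch:

```shell
# Illustrative stand-in for the workflow's wait_for_file utility.
# Same argument order as the calls above: file, sleep seconds, max tries.
wait_for_file() {
  local file="$1" sleep_interval="${2:-10}" max_tries="${3:-360}"
  local try
  for (( try = 1; try <= max_tries; try++ )); do
    if [[ -r "${file}" ]]; then
      return 0   # file exists and is readable
    fi
    sleep "${sleep_interval}"
  done
  echo "Gave up on ${file} after ${max_tries} tries" >&2
  return 1       # caller treats this as fatal via err_chk/exit
}
```

Callers check the status directly, e.g. `if ! wait_for_file "${GRIBFILE}" "${sleep_interval}" "${max_tries}"; then ... fi`, which is the shape used throughout these hunks.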
+if (( err != 0 )) ; then + echo "FATAL ERROR: File ${model}.${cycle}.adpupa.tm00.bufr_d could not be copied (does it exist?)." + exit "${err}" fi -# $RDBFMSUA >> $pgmout 2> errfile -${UTILgfs}/exec/rdbfmsua >> $pgmout 2> errfile +"${HOMEgfs}/exec/rdbfmsua.x" >> "${pgmout}" 2> errfile err=$?;export err ;err_chk +# shellcheck disable=SC2012,SC2155 export filesize=$( ls -l rdbfmsua.out | awk '{print $5}' ) ################################################################ # only run script if rdbfmsua.out contained upper air data. ################################################################ -if [ $filesize -gt 40 ] -then - - cp rdbfmsua.out $COMOUT/${RUN}.${cycle}.msupperair - cp sonde.idsms.tbl $COMOUT/${RUN}.${cycle}.msupperairtble - if [ $SENDDBN = "YES" ]; then - $DBNROOT/bin/dbn_alert DATA MSUPPER_AIR $job $COMOUT/${RUN}.${cycle}.msupperair - $DBNROOT/bin/dbn_alert DATA MSUPPER_AIRTBL $job $COMOUT/${RUN}.${cycle}.msupperairtble +if (( filesize > 40 )); then + cp rdbfmsua.out "${COM_ATMOS_GEMPAK_UPPER_AIR}/${RUN}.${cycle}.msupperair" + cp sonde.idsms.tbl "${COM_ATMOS_GEMPAK_UPPER_AIR}/${RUN}.${cycle}.msupperairtble" + if [[ ${SENDDBN} = "YES" ]]; then + "${DBNROOT}/bin/dbn_alert" DATA MSUPPER_AIR "${job}" "${COM_ATMOS_GEMPAK_UPPER_AIR}/${RUN}.${cycle}.msupperair" + "${DBNROOT}/bin/dbn_alert" DATA MSUPPER_AIRTBL "${job}" "${COM_ATMOS_GEMPAK_UPPER_AIR}/${RUN}.${cycle}.msupperairtble" fi - fi ############################################################ -if [ -e "$pgmout" ] ; then - cat $pgmout +if [[ -e "${pgmout}" ]] ; then + cat "${pgmout}" fi diff --git a/scripts/exgfs_atmos_gempak_meta.sh b/scripts/exgfs_atmos_gempak_meta.sh index 04f4f1fc5c..6ae8c77cfb 100755 --- a/scripts/exgfs_atmos_gempak_meta.sh +++ b/scripts/exgfs_atmos_gempak_meta.sh @@ -1,138 +1,91 @@ #! 
/usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${HOMEgfs}/ush/preamble.sh" -cd $DATA - -GEMGRD1=${RUN}_${PDY}${cyc}f -#find out what fcst hr to start processing -fhr=$fhend +GEMGRD1="${RUN}_1p00_${PDY}${cyc}f" export numproc=23 -while [ $fhr -ge $fhbeg ] ; do - fhr=$(printf "%03d" $fhr) - ls -l $COMIN/$GEMGRD1${fhr} - err1=$? - if [ $err1 -eq 0 -o $fhr -eq $fhbeg ] ; then +# Find the last hour available +for (( fhr = fhend; fhr >= fhbeg; fhr = fhr - fhinc )) ; do + fhr3=$(printf "%03d" "${fhr}") + if [[ -r "${COM_ATMOS_GEMPAK_1p00}/${GEMGRD1}${fhr3}" ]]; then break fi - fhr=$(expr $fhr - $fhinc) done -maxtries=180 +sleep_interval=20 +max_tries=180 first_time=0 do_all=0 #loop through and process needed forecast hours -while [ $fhr -le $fhend ] -do - # - # First check to see if this is a rerun. If so make all Meta files - if [ $fhr -gt 126 -a $first_time -eq 0 ] ; then - do_all=1 - fi - first_time=1 - - if [ $fhr -eq 120 ] ; then - fhr=126 - fi - icnt=1 - - while [ $icnt -lt 1000 ] - do - ls -l $COMIN/$GEMGRD1${fhr} - err1=$? - if [ $err1 -eq 0 ] ; then - break - else - sleep 20 - let "icnt= icnt + 1" - fi - if [ $icnt -ge $maxtries ] - then - echo "ABORTING after 1 hour of waiting for gempak grid F$fhr to end." - export err=7 ; err_chk - exit $err - fi - done - - export fhr - - ######################################################## - # Create a script to be poe'd - # - # Note: The number of scripts to be run MUST match the number - # of total_tasks set in the ecf script, or the job will fail. - # -# if [ -f $DATA/poescript ]; then - rm $DATA/poescript -# fi - - fhr=$(printf "%02d" $fhr) - - if [ $do_all -eq 1 ] ; then - do_all=0 - awk '{print $1}' $FIXgempak/gfs_meta > $DATA/tmpscript - else - # - # Do not try to grep out 12, it will grab the 12 from 126. 
- # This will work as long as we don't need 12 fhr metafiles - # - if [ $fhr -ne 12 ] ; then - grep $fhr $FIXgempak/gfs_meta |awk -F" [0-9]" '{print $1}' > $DATA/tmpscript - fi - fi - - for script in $(cat $DATA/tmpscript) - do - eval "echo $script" >> $DATA/poescript - done - - num=$(cat $DATA/poescript |wc -l) - - while [ $num -lt $numproc ] ; do - echo "hostname" >>poescript - num=$(expr $num + 1) - done - - chmod 775 $DATA/poescript - cat $DATA/poescript - export MP_PGMMODEL=mpmd - export MP_CMDFILE=$DATA/poescript - -# If this is the final fcst hour, alert the -# file to all centers. -# - if [ 10#$fhr -ge $fhend ] ; then - export DBN_ALERT_TYPE=GFS_METAFILE_LAST - fi - - export fend=$fhr - - sleep 20 - ntasks=${NTASKS_META:-$(cat $DATA/poescript | wc -l)} - ptile=${PTILE_META:-4} - threads=${NTHREADS_META:-1} - export OMP_NUM_THREADS=$threads - APRUN="mpiexec -l -n $ntasks -ppn $ntasks --cpu-bind verbose,core cfp" - - APRUN_METACFP=${APRUN_METACFP:-$APRUN} - APRUNCFP=$(eval echo $APRUN_METACFP) - - $APRUNCFP $DATA/poescript - export err=$?; err_chk +while (( fhr <= fhend )); do + # + # First check to see if this is a rerun. If so make all Meta files + if (( fhr > 126 )) && (( first_time == 0 )); then + do_all=1 + fi + first_time=1 - fhr=$(printf "%03d" $fhr) - if [ $fhr -eq 126 ] ; then - let fhr=fhr+6 - else - let fhr=fhr+fhinc - fi -done + if (( fhr == 120 )); then + fhr=126 + fi -##################################################################### + gempak_file="${COM_ATMOS_GEMPAK_1p00}/${GEMGRD1}${fhr3}" + if ! wait_for_file "${gempak_file}" "${sleep_interval}" "${max_tries}"; then + echo "FATAL ERROR: gempak grid file ${gempak_file} not available after maximum wait time." + exit 7 + fi + + export fhr + ######################################################## + # Create a script to be poe'd + # + # Note: The number of scripts to be run MUST match the number + # of total_tasks set in the ecf script, or the job will fail. 
+ # + if [[ -f poescript ]]; then + rm poescript + fi + + fhr3=$(printf "%03d" "${fhr}") + + if (( do_all == 1 )) ; then + do_all=0 + # shellcheck disable=SC2312 + awk '{print $1}' "${HOMEgfs}/gempak/fix/gfs_meta" | envsubst > "poescript" + else + # + # Do not try to grep out 12, it will grab the 12 from 126. + # This will work as long as we don't need 12 fhr metafiles + # + if (( fhr != 12 )) ; then + # shellcheck disable=SC2312 + grep "${fhr}" "${HOMEgfs}/gempak/fix/gfs_meta" | awk -F" [0-9]" '{print $1}' | envsubst > "poescript" + fi + fi + + # If this is the final fcst hour, alert the + # file to all centers. + # + if (( fhr >= fhend )) ; then + export DBN_ALERT_TYPE=GFS_METAFILE_LAST + fi + + export fend=${fhr} + + cat poescript + + "${HOMEgfs}/ush/run_mpmd.sh" poescript + export err=$?; err_chk + + if (( fhr == 126 )) ; then + fhr=$((fhr + 6)) + else + fhr=$((fhr + fhinc)) + fi +done exit -# diff --git a/scripts/exgfs_atmos_goes_nawips.sh b/scripts/exgfs_atmos_goes_nawips.sh index 583593fef8..2c725a6402 100755 --- a/scripts/exgfs_atmos_goes_nawips.sh +++ b/scripts/exgfs_atmos_goes_nawips.sh @@ -11,31 +11,31 @@ # echo "C. Magee: 10/2013 - swap X and Y for rtgssthr Atl and Pac." 
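The poescript mechanism in the gempak meta hunk above writes one shell command per line and hands the file to run_mpmd.sh (cfp under the hood). A toy version of that pattern, with placeholder metafile script names and a serial stand-in for run_mpmd.sh:

```shell
# Build an MPMD-style command file: one independent command per line,
# as the awk/grep pipelines above do for the gfs_meta table.
rm -f poescript
for metascript in gfs_meta_us gfs_meta_sa gfs_meta_nh; do  # placeholder names
  echo "echo would run ${metascript}" >> poescript
done
chmod 755 poescript

# Serial stand-in for run_mpmd.sh: execute the command file line by line.
while IFS= read -r cmd; do
  bash -c "${cmd}"
done < poescript
```

In the real job each line runs as a separate MPI rank, which is why the note in the hunk warns that the command count must match the task count configured for the job.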
##################################################################### -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" -cd $DATA +cd "${DATA}" || exit 2 -cp $FIXgempak/g2varswmo2.tbl g2varswmo2.tbl -cp $FIXgempak/g2vcrdwmo2.tbl g2vcrdwmo2.tbl -cp $FIXgempak/g2varsncep1.tbl g2varsncep1.tbl -cp $FIXgempak/g2vcrdncep1.tbl g2vcrdncep1.tbl +for table in g2varswmo2.tbl g2vcrdwmo2.tbl g2varsncep1.tbl g2vcrdncep1.tbl; do + cp "${HOMEgfs}/gempak/fix/${table}" "${table}" || \ + ( echo "FATAL ERROR: ${table} is missing" && exit 2 ) +done # -# NAGRIB_TABLE=$FIXgempak/nagrib.tbl -NAGRIB=$GEMEXE/nagrib2 -# - -entry=$(grep "^$RUN2 " $NAGRIB_TABLE | awk 'index($1,"#") != 1 {print $0}') - -if [ "$entry" != "" ] ; then - cpyfil=$(echo $entry | awk 'BEGIN {FS="|"} {print $2}') - garea=$(echo $entry | awk 'BEGIN {FS="|"} {print $3}') - gbtbls=$(echo $entry | awk 'BEGIN {FS="|"} {print $4}') - maxgrd=$(echo $entry | awk 'BEGIN {FS="|"} {print $5}') - kxky=$(echo $entry | awk 'BEGIN {FS="|"} {print $6}') - grdarea=$(echo $entry | awk 'BEGIN {FS="|"} {print $7}') - proj=$(echo $entry | awk 'BEGIN {FS="|"} {print $8}') - output=$(echo $entry | awk 'BEGIN {FS="|"} {print $9}') +NAGRIB_TABLE="${HOMEgfs}/gempak/fix/nagrib.tbl" +NAGRIB="${GEMEXE}/nagrib2" + +# shellcheck disable=SC2312 +entry=$(grep "^${RUN2} " "${NAGRIB_TABLE}" | awk 'index($1,"#") != 1 {print $0}') + +if [[ "${entry}" != "" ]] ; then + cpyfil=$(echo "${entry}" | awk 'BEGIN {FS="|"} {print $2}') + garea=$(echo "${entry}" | awk 'BEGIN {FS="|"} {print $3}') + gbtbls=$(echo "${entry}" | awk 'BEGIN {FS="|"} {print $4}') + maxgrd=$(echo "${entry}" | awk 'BEGIN {FS="|"} {print $5}') + kxky=$(echo "${entry}" | awk 'BEGIN {FS="|"} {print $6}') + grdarea=$(echo "${entry}" | awk 'BEGIN {FS="|"} {print $7}') + proj=$(echo "${entry}" | awk 'BEGIN {FS="|"} {print $8}') + output=$(echo "${entry}" | awk 'BEGIN {FS="|"} {print $9}') else cpyfil=gds garea=dset @@ -48,71 +48,55 @@ else fi pdsext=no -maxtries=180 
-fhcnt=$fstart -while [ $fhcnt -le $fend ] ; do - fhr=$(printf "%03d" $fhcnt) - fhcnt3=$(expr $fhr % 3) - - fhr3=$(printf "03d" $fhcnt) - GRIBIN=$COMIN/${model}.${cycle}.${GRIB}${fhr}${EXT} - GEMGRD=${RUN2}_${PDY}${cyc}f${fhr3} - - GRIBIN_chk=$GRIBIN - - icnt=1 - while [ $icnt -lt 1000 ] - do - if [ -r $GRIBIN_chk ] ; then - break - else - sleep 20 - let "icnt=icnt+1" - fi - if [ $icnt -ge $maxtries ] - then - echo "ABORTING after 1 hour of waiting for F$fhr to end." - export err=7 ; err_chk - exit $err - fi - done - - cp $GRIBIN grib$fhr - - export pgm="nagrib_nc F$fhr" - startmsg - - $NAGRIB << EOF - GBFILE = grib$fhr +sleep_interval=20 +max_tries=180 +fhr=${fstart} +for (( fhr=fstart; fhr <= fend; fhr=fhr+finc )); do + fhr3=$(printf "%03d" "${fhr}") + GRIBIN="${COM_ATMOS_GOES}/${model}.${cycle}.${GRIB}${fhr3}${EXT}" + GEMGRD="${RUN2}_${PDY}${cyc}f${fhr3}" + + GRIBIN_chk="${GRIBIN}" + + if ! wait_for_file "${GRIBIN_chk}" "${sleep_interval}" "${max_tries}"; then + echo "FATAL ERROR: after 1 hour of waiting for ${GRIBIN_chk} file at F${fhr3} to end." 
+ export err=7 ; err_chk + exit "${err}" + fi + + cp "${GRIBIN}" "grib${fhr3}" + + export pgm="nagrib_nc F${fhr3}" + + ${NAGRIB} << EOF + GBFILE = grib${fhr3} INDXFL = - GDOUTF = $GEMGRD - PROJ = $proj - GRDAREA = $grdarea - KXKY = $kxky - MAXGRD = $maxgrd - CPYFIL = $cpyfil - GAREA = $garea - OUTPUT = $output - GBTBLS = $gbtbls + GDOUTF = ${GEMGRD} + PROJ = ${proj} + GRDAREA = ${grdarea} + KXKY = ${kxky} + MAXGRD = ${maxgrd} + CPYFIL = ${cpyfil} + GAREA = ${garea} + OUTPUT = ${output} + GBTBLS = ${gbtbls} GBDIAG = - PDSEXT = $pdsext + PDSEXT = ${pdsext} l r EOF export err=$?;err_chk - $GEMEXE/gpend + "${GEMEXE}/gpend" - cp $GEMGRD $COMOUT/.$GEMGRD - mv $COMOUT/.$GEMGRD $COMOUT/$GEMGRD - if [ $SENDDBN = "YES" ] ; then - $DBNROOT/bin/dbn_alert MODEL ${DBN_ALERT_TYPE} $job \ - $COMOUT/$GEMGRD + cpfs "${GEMGRD}" "${COM_ATMOS_GEMPAK_0p25}/${GEMGRD}" + if [[ ${SENDDBN} == "YES" ]] ; then + "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ + "${COM_ATMOS_GEMPAK_0p25}/${GEMGRD}" else echo "##### DBN_ALERT_TYPE is: ${DBN_ALERT_TYPE} #####" fi - let fhcnt=fhcnt+finc done ##################################################################### diff --git a/scripts/exgfs_atmos_grib2_special_npoess.sh b/scripts/exgfs_atmos_grib2_special_npoess.sh index a43c279ae6..3877b50b77 100755 --- a/scripts/exgfs_atmos_grib2_special_npoess.sh +++ b/scripts/exgfs_atmos_grib2_special_npoess.sh @@ -7,9 +7,9 @@ # echo "-----------------------------------------------------" ##################################################################### -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" -cd $DATA +cd "${DATA}" || exit 2 ############################################################ # Define Variables: @@ -80,17 +80,17 @@ fi ############################################################################## # Specify Forecast Hour Range F000 - F024 for GFS_NPOESS_PGRB2_0P5DEG ############################################################################## -export 
SHOUR=000 -export FHOUR=024 -export FHINC=003 -if [[ "${FHOUR}" -gt "${FHMAX_GFS}" ]]; then +export SHOUR=0 +export FHOUR=24 +export FHINC=3 +if (( FHOUR > FHMAX_GFS )); then export FHOUR="${FHMAX_GFS}" fi ############################################################ # Loop Through the Post Forecast Files ############################################################ -for (( fhr=$((10#${SHOUR})); fhr <= $((10#${FHOUR})); fhr = fhr + FHINC )); do +for (( fhr=SHOUR; fhr <= FHOUR; fhr = fhr + FHINC )); do fhr3=$(printf "%03d" "${fhr}") @@ -99,34 +99,22 @@ for (( fhr=$((10#${SHOUR})); fhr <= $((10#${FHOUR})); fhr = fhr + FHINC )); do # existence of the restart files ############################### export pgm="postcheck" - ic=1 - while (( ic <= SLEEP_LOOP_MAX )); do - if [[ -f "${COM_ATMOS_GRIB_0p50}/gfs.t${cyc}z.pgrb2b.0p50.f${fhr3}.idx" ]]; then - break - else - ic=$((ic + 1)) - sleep "${SLEEP_INT}" - fi - ############################### - # If we reach this point assume - # fcst job never reached restart - # period and error exit - ############################### - if (( ic == SLEEP_LOOP_MAX )); then - echo "FATAL ERROR: 0p50 grib file not available after max sleep time" - export err=9 - err_chk || exit "${err}" - fi - done + grib_file="${COM_ATMOS_GRIB_0p50}/gfs.t${cyc}z.pgrb2b.0p50.f${fhr3}.idx" + if ! 
wait_for_file "${grib_file}" "${SLEEP_INT}" "${SLEEP_LOOP_MAX}"; then + echo "FATAL ERROR: 0p50 grib file not available after max sleep time" + export err=9 + err_chk || exit "${err}" + fi ###################################################################### # Process Global NPOESS 0.50 GFS GRID PRODUCTS IN GRIB2 F000 - F024 # ###################################################################### - paramlist=${PARMproduct}/global_npoess_paramlist_g2 + paramlist="${PARMgfs}/product/global_npoess_paramlist_g2" cp "${COM_ATMOS_GRIB_0p50}/gfs.t${cyc}z.pgrb2.0p50.f${fhr3}" tmpfile2 cp "${COM_ATMOS_GRIB_0p50}/gfs.t${cyc}z.pgrb2b.0p50.f${fhr3}" tmpfile2b cat tmpfile2 tmpfile2b > tmpfile - ${WGRIB2} tmpfile | grep -F -f ${paramlist} | ${WGRIB2} -i -grib pgb2file tmpfile + # shellcheck disable=SC2312 + ${WGRIB2} tmpfile | grep -F -f "${paramlist}" | ${WGRIB2} -i -grib pgb2file tmpfile export err=$?; err_chk cp pgb2file "${COM_ATMOS_GOES}/${RUN}.${cycle}.pgrb2f${fhr3}.npoess" @@ -135,8 +123,7 @@ for (( fhr=$((10#${SHOUR})); fhr <= $((10#${FHOUR})); fhr = fhr + FHINC )); do "${DBNROOT}/bin/dbn_alert" MODEL GFS_PGBNPOESS "${job}" \ "${COM_ATMOS_GOES}/${RUN}.${cycle}.pgrb2f${fhr3}.npoess" else - msg="File ${RUN}.${cycle}.pgrb2f${fhr3}.npoess not posted to db_net." - postmsg "${msg}" || echo "${msg}" + echo "File ${RUN}.${cycle}.pgrb2f${fhr3}.npoess not posted to db_net." 
fi echo "${PDY}${cyc}${fhr3}" > "${COM_ATMOS_GOES}/${RUN}.t${cyc}z.control.halfdeg.npoess" rm tmpfile pgb2file @@ -146,10 +133,10 @@ done ################################################################ # Specify Forecast Hour Range F000 - F180 for GOESSIMPGRB files ################################################################ -export SHOUR=000 +export SHOUR=0 export FHOUR=180 -export FHINC=003 -if [[ "${FHOUR}" -gt "${FHMAX_GFS}" ]]; then +export FHINC=3 +if (( FHOUR > FHMAX_GFS )); then export FHOUR="${FHMAX_GFS}" fi @@ -157,7 +144,7 @@ fi # Process GFS PGRB2_SPECIAL_POST ################################# -for (( fhr=$((10#${SHOUR})); fhr <= $((10#${FHOUR})); fhr = fhr + FHINC )); do +for (( fhr=SHOUR; fhr <= FHOUR; fhr = fhr + FHINC )); do fhr3=$(printf "%03d" "${fhr}") @@ -165,38 +152,25 @@ for (( fhr=$((10#${SHOUR})); fhr <= $((10#${FHOUR})); fhr = fhr + FHINC )); do # Start Looping for the # existence of the restart files ############################### - set +x export pgm="postcheck" - ic=1 - while (( ic <= SLEEP_LOOP_MAX )); do - if [[ -f "${COM_ATMOS_GOES}/${RUN}.t${cyc}z.special.grb2if${fhr3}.idx" ]]; then - break - else - ic=$((ic + 1)) - sleep "${SLEEP_INT}" - fi - ############################### - # If we reach this point assume - # fcst job never reached restart - # period and error exit - ############################### - if (( ic == SLEEP_LOOP_MAX )); then - echo "FATAL ERROR: Special goes grib file not available after max sleep time" - export err=9 - err_chk || exit "${err}" - fi - done - set_trace + grib_file="${COM_ATMOS_MASTER}/${RUN}.t${cyc}z.goesmasterf${fhr3}.grb2" + if ! 
wait_for_file "${grib_file}" "${SLEEP_INT}" "${SLEEP_LOOP_MAX}"; then + echo "FATAL ERROR: GOES master grib file ${grib_file} not available after max sleep time" + export err=9 + err_chk || exit "${err}" + fi ############################### # Put restart files into /nwges # for backup to start Model Fcst ############################### - cp "${COM_ATMOS_GOES}/${RUN}.t${cyc}z.special.grb2if${fhr3}" masterfile + cp "${grib_file}" masterfile export grid0p25="latlon 0:1440:0.25 90:721:-0.25" + # shellcheck disable=SC2086,SC2248 ${WGRIB2} masterfile ${opt1} ${opt21} ${opt22} ${opt23} ${opt24} ${opt25} ${opt26} \ ${opt27} ${opt28} -new_grid ${grid0p25} pgb2file export gridconus="lambert:253.0:50.0:50.0 214.5:349:32463.0 1.0:277:32463.0" + # shellcheck disable=SC2086,SC2248 ${WGRIB2} masterfile ${opt1} ${opt21} ${opt22} ${opt23} ${opt24} ${opt25} ${opt26} \ ${opt27} ${opt28} -new_grid ${gridconus} pgb2file2 diff --git a/scripts/exgfs_atmos_grib_awips.sh b/scripts/exgfs_atmos_grib_awips.sh index 037b4ce191..bdfc625dce 100755 --- a/scripts/exgfs_atmos_grib_awips.sh +++ b/scripts/exgfs_atmos_grib_awips.sh @@ -21,7 +21,7 @@ # echo " FEB 2019 - Removed grid 225" ##################################################################### -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" fcsthrs="$1" num=$# @@ -40,7 +40,7 @@ fi cd "${DATA}/awips_g1" || exit 2 # "Import" functions used in this script -source "${HOMEgfs}/ush/product_functions.sh" +source "${USHgfs}/product_functions.sh" ############################################### # Wait for the availability of the pgrb file @@ -75,14 +75,14 @@ set_trace cp "${COM_ATMOS_GRIB_0p25}/gfs.t${cyc}z.pgrb2.0p25.f${fcsthrs}" "tmpfile2" cp "${COM_ATMOS_GRIB_0p25}/gfs.t${cyc}z.pgrb2b.0p25.f${fcsthrs}" "tmpfile2b" cat tmpfile2 tmpfile2b > tmpfile -${WGRIB2} tmpfile | grep -F -f "${PARMproduct}/gfs_awips_parmlist_g2" | \ +${WGRIB2} tmpfile | grep -F -f "${PARMgfs}/product/gfs_awips_parmlist_g2" | \ ${WGRIB2} -i -grib 
masterfile tmpfile scale_dec masterfile ${CNVGRIB} -g21 masterfile masterfile.grib1 ln -s masterfile.grib1 fort.11 -"${HOMEgfs}/exec/overgridid.x" << EOF +"${EXECgfs}/overgridid.x" << EOF 255 EOF @@ -105,8 +105,8 @@ export GRID=211 export FORT11="master.grbf${fcsthrs}" export FORT31="master.grbif${fcsthrs}" export FORT51="xtrn.awpgfs${fcsthrs}.${GRID}" -# $MKGFSAWPS < $PARMwmo/grib_awpgfs${fcsthrs}.${GRID} parm=KWBC >> $pgmout 2>errfile -"${HOMEgfs}/exec/mkgfsawps.x" < "${PARMwmo}/grib_awpgfs${fcsthrs}.${GRID}" parm=KWBC >> "${pgmout}" 2>errfile +# $MKGFSAWPS < ${PARMgfs}/wmo/grib_awpgfs${fcsthrs}.${GRID} parm=KWBC >> $pgmout 2>errfile +"${EXECgfs}/mkgfsawps.x" < "${PARMgfs}/wmo/grib_awpgfs${fcsthrs}.${GRID}" parm=KWBC >> "${pgmout}" 2>errfile export err=$?; err_chk ############################## # Post Files to ${COM_ATMOS_WMO} diff --git a/scripts/exgfs_atmos_nawips.sh b/scripts/exgfs_atmos_nawips.sh index ebb509d392..25873473a8 100755 --- a/scripts/exgfs_atmos_nawips.sh +++ b/scripts/exgfs_atmos_nawips.sh @@ -10,7 +10,7 @@ # echo " data on the CCS is properly protected." ##################################################################### -source "${HOMEgfs}/ush/preamble.sh" "${2}" +source "${USHgfs}/preamble.sh" "${2}" #### If EMC GFS PARA runs hourly file are not available, The ILPOST #### will set to 3 hour in EMC GFS PARA. 
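The wgrib2 -new_grid targets above pack the output grid into colon-separated triplets: "latlon lon0:nlon:dlon lat0:nlat:dlat". As a sanity check of the 0.25-degree spec, the triplets can be pulled apart in the shell:

```shell
# Decode the wgrib2 -new_grid spec used above:
#   "latlon lon0:nlon:dlon lat0:nlat:dlat"
grid0p25="latlon 0:1440:0.25 90:721:-0.25"
read -r _ lonspec latspec <<< "${grid0p25}"
IFS=: read -r lon0 nlon dlon <<< "${lonspec}"
IFS=: read -r lat0 nlat dlat <<< "${latspec}"
npts=$((nlon * nlat))   # 1440 longitudes x 721 latitudes
echo "global 0.25-deg grid: ${npts} points"
```

The negative dlat (-0.25) walks from 90N southward, matching the usual GRIB2 north-to-south scan order.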
@@ -18,17 +18,17 @@ source "${HOMEgfs}/ush/preamble.sh" "${2}" export ILPOST=${ILPOST:-1} cd "${DATA}" || exit 1 -RUN2=$1 +grid=$1 fend=$2 DBN_ALERT_TYPE=$3 destination=$4 -DATA_RUN="${DATA}/${RUN2}" +DATA_RUN="${DATA}/${grid}" mkdir -p "${DATA_RUN}" cd "${DATA_RUN}" || exit 1 # "Import" functions used in this script -source "${HOMEgfs}/ush/product_functions.sh" +source "${USHgfs}/product_functions.sh" # NAGRIB="${GEMEXE}/nagrib2" @@ -44,20 +44,22 @@ proj= output=T pdsext=no -maxtries=360 -fhcnt=${fstart} -while (( fhcnt <= fend )) ; do +sleep_interval=10 +max_tries=360 +fhr=$(( 10#${fstart} )) +while (( fhr <= 10#${fend} )) ; do - if mkdir "lock.${fhcnt}" ; then - cd "lock.${fhcnt}" || exit 1 - cp "${FIXgempak}/g2varswmo2.tbl" "g2varswmo2.tbl" - cp "${FIXgempak}/g2vcrdwmo2.tbl" "g2vcrdwmo2.tbl" - cp "${FIXgempak}/g2varsncep1.tbl" "g2varsncep1.tbl" - cp "${FIXgempak}/g2vcrdncep1.tbl" "g2vcrdncep1.tbl" + fhr3=$(printf "%03d" "${fhr}") - fhr=$(printf "%03d" "${fhcnt}") + if mkdir "lock.${fhr3}" ; then + cd "lock.${fhr3}" || exit 1 - GEMGRD="${RUN2}_${PDY}${cyc}f${fhr}" + for table in g2varswmo2.tbl g2vcrdwmo2.tbl g2varsncep1.tbl g2vcrdncep1.tbl; do + cp "${HOMEgfs}/gempak/fix/${table}" "${table}" || \ + { echo "FATAL ERROR: ${table} is missing" && exit 2 ; } + done + + GEMGRD="${RUN}_${grid}_${PDY}${cyc}f${fhr3}" # Set type of Interpolation for WGRIB2 export opt1=' -set_grib_type same -new_grid_winds earth ' @@ -71,63 +73,42 @@ while (( fhcnt <= fend )) ; do export opt27=":(APCP|ACPCP|PRATE|CPRAT|DZDT):" export opt28=' -new_grid_interpolation budget -fi ' - case ${RUN2} in + case ${grid} in # TODO: Why aren't we interpolating from the 0p25 grids for 35-km and 40-km?
- 'gfs_0p50' | 'gfs_0p25') res=${RUN2: -4};; - *) res="1p00";; + '0p50' | '0p25') grid_in=${grid};; + *) grid_in="1p00";; esac - source_var="COM_ATMOS_GRIB_${res}" - export GRIBIN="${!source_var}/${model}.${cycle}.pgrb2.${res}.f${fhr}" - GRIBIN_chk="${!source_var}/${model}.${cycle}.pgrb2.${res}.f${fhr}.idx" + source_var="COM_ATMOS_GRIB_${grid_in}" + export GRIBIN="${!source_var}/${model}.${cycle}.pgrb2.${grid_in}.f${fhr3}" + GRIBIN_chk="${!source_var}/${model}.${cycle}.pgrb2.${grid_in}.f${fhr3}.idx" - icnt=1 - while (( icnt < 1000 )); do - if [[ -r "${GRIBIN_chk}" ]] ; then - # File available, wait 5 seconds then proceed - sleep 5 - break - else - # File not available yet, wait 10 seconds and try again - echo "The process is waiting ... ${GRIBIN_chk} file to proceed." - sleep 10 - icnt=$((icnt+1)) - fi - if (( icnt >= maxtries )); then - echo "FATAL ERROR: after 1 hour of waiting for ${GRIBIN_chk} file at F${fhr} to end." - export err=7 ; err_chk - exit "${err}" - fi - done + if ! wait_for_file "${GRIBIN_chk}" "${sleep_interval}" "${max_tries}"; then + echo "FATAL ERROR: after 1 hour of waiting for ${GRIBIN_chk} file at F${fhr3} to end." 
+ export err=7 ; err_chk + exit "${err}" + fi - case "${RUN2}" in - gfs35_pac) - export gfs35_pac='latlon 130.0:416:0.312 75.125:186:-0.312' - # shellcheck disable=SC2086,SC2248 - "${WGRIB2}" "${GRIBIN}" ${opt1} ${opt21} ${opt22} ${opt23} ${opt24} ${opt25} ${opt26} ${opt27} ${opt28} -new_grid ${gfs35_pac} "grib${fhr}" - trim_rh "grib${fhr}" - ;; - gfs35_atl) - export gfs35_atl='latlon 230.0:480:0.312 75.125:242:-0.312' - # shellcheck disable=SC2086,SC2248 - "${WGRIB2}" "${GRIBIN}" ${opt1} ${opt21} ${opt22} ${opt23} ${opt24} ${opt25} ${opt26} ${opt27} ${opt28} -new_grid ${gfs35_atl} "grib${fhr}" - trim_rh "grib${fhr}" - ;; - gfs40) - export gfs40='lambert:265.0:25.0:25.0 226.541:185:40635.0 12.19:129:40635.0' - # shellcheck disable=SC2086,SC2248 - "${WGRIB2}" "${GRIBIN}" ${opt1uv} ${opt21} ${opt22} ${opt23} ${opt24} ${opt25} ${opt26} ${opt27} ${opt28} -new_grid ${gfs40} "grib${fhr}" - trim_rh "grib${fhr}" - ;; - *) - cp "${GRIBIN}" "grib${fhr}" + case "${grid}" in + 35km_pac) grid_spec='latlon 130.0:416:0.312 75.125:186:-0.312';; + 35km_atl) grid_spec='latlon 230.0:480:0.312 75.125:242:-0.312';; + 40km) grid_spec='lambert:265.0:25.0:25.0 226.541:185:40635.0 12.19:129:40635.0';; + *) grid_spec='';; esac - export pgm="nagrib2 F${fhr}" + if [[ "${grid_spec}" != "" ]]; then + # shellcheck disable=SC2086,SC2248 + "${WGRIB2}" "${GRIBIN}" ${opt1uv} ${opt21} ${opt22} ${opt23} ${opt24} ${opt25} ${opt26} ${opt27} ${opt28} -new_grid ${grid_spec} "grib${fhr3}" + trim_rh "grib${fhr3}" + else + cp "${GRIBIN}" "grib${fhr3}" + fi + + export pgm="nagrib2 F${fhr3}" startmsg ${NAGRIB} << EOF -GBFILE = grib${fhr} +GBFILE = grib${fhr3} INDXFL = GDOUTF = ${GEMGRD} PROJ = ${proj} @@ -148,21 +129,20 @@ EOF cpfs "${GEMGRD}" "${destination}/${GEMGRD}" if [[ ${SENDDBN} == "YES" ]] ; then "${DBNROOT}/bin/dbn_alert" MODEL "${DBN_ALERT_TYPE}" "${job}" \ - "${destination}/${GEMGRD}" + "${destination}/${GEMGRD}" fi cd "${DATA_RUN}" || exit 1 else - if (( fhcnt <= 240 )) ; then - if (( fhcnt < 276 
)) && [[ "${RUN2}" = "gfs_0p50" ]] ; then - fhcnt=$((fhcnt+6)) - else - fhcnt=$((fhcnt+12)) - fi - elif ((fhcnt < 120)) && [[ "${RUN2}" = "gfs_0p25" ]] ; then - #### let fhcnt=fhcnt+1 - fhcnt=$((hcnt + ILPOST)) + if (( fhr >= 240 )) ; then + if (( fhr < 276 )) && [[ "${grid}" = "0p50" ]] ; then + fhr=$((fhr+6)) + else + fhr=$((fhr+12)) + fi + elif ((fhr < 120)) && [[ "${grid}" = "0p25" ]] ; then + fhr=$((fhr + ILPOST)) else - fhcnt=$((ILPOST > finc ? fhcnt+ILPOST : fhcnt+finc )) + fhr=$((ILPOST > finc ? fhr+ILPOST : fhr+finc )) fi fi done diff --git a/scripts/exgfs_atmos_postsnd.sh b/scripts/exgfs_atmos_postsnd.sh index 368f001ed0..7aa97f3644 100755 --- a/scripts/exgfs_atmos_postsnd.sh +++ b/scripts/exgfs_atmos_postsnd.sh @@ -20,7 +20,7 @@ # 9) 2019-12-18 Guang Ping Lou generalizing to reading in NetCDF or nemsio ################################################################ -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" cd $DATA @@ -44,7 +44,7 @@ export NINT3=${FHOUT_GFS:-3} rm -f -r "${COM_ATMOS_BUFR}" mkdir -p "${COM_ATMOS_BUFR}" -GETDIM="${HOMEgfs}/ush/getncdimlen" +GETDIM="${USHgfs}/getncdimlen" LEVS=$(${GETDIM} "${COM_ATMOS_HISTORY}/${RUN}.${cycle}.atmf000.${atmfm}" pfull) declare -x LEVS @@ -88,8 +88,7 @@ export FINT=$NINT1 if [ $FEND -gt $NEND1 ]; then export FINT=$NINT3 fi -## $USHbufrsnd/gfs_bufr.sh - $USHbufrsnd/gfs_bufr.sh + ${USHgfs}/gfs_bufr.sh export FSTART=$FEND done @@ -115,7 +114,7 @@ fi ######################################## rm -rf poe_col for (( m = 1; m <10 ; m++ )); do - echo "sh ${USHbufrsnd}/gfs_sndp.sh ${m} " >> poe_col + echo "sh ${USHgfs}/gfs_sndp.sh ${m} " >> poe_col done if [[ ${CFP_MP:-"NO"} == "YES" ]]; then @@ -129,7 +128,7 @@ chmod +x cmdfile ${APRUN_POSTSNDCFP} cmdfile -sh "${USHbufrsnd}/gfs_bfr2gpk.sh" +sh "${USHgfs}/gfs_bfr2gpk.sh" ############## END OF SCRIPT ####################### diff --git a/scripts/exgfs_atmos_wafs_blending_0p25.sh b/scripts/exgfs_atmos_wafs_blending_0p25.sh new file mode 100755 
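The nawips hunks above replace a hand-rolled 1000-iteration polling loop with the shared `wait_for_file` helper, called as `wait_for_file "${GRIBIN_chk}" "${sleep_interval}" "${max_tries}"`. A minimal sketch of such a helper, assuming the same argument order; the real implementation lives in the workflow's ush utilities and may differ:

```shell
# Hypothetical sketch of wait_for_file: poll for a readable file,
# sleeping sleep_interval seconds between tries, up to max_tries tries.
# Returns 0 as soon as the file is readable, 1 if it never appears.
wait_for_file() {
  target_file=$1
  sleep_interval=${2:-10}
  max_tries=${3:-360}
  try=0
  while [ "${try}" -lt "${max_tries}" ]; do
    if [ -r "${target_file}" ]; then
      return 0
    fi
    sleep "${sleep_interval}"
    try=$((try + 1))
  done
  return 1
}
```

Centralizing the loop this way lets every caller get consistent timeout behavior instead of each script carrying its own counter arithmetic.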
index 0000000000..293325185e --- /dev/null +++ b/scripts/exgfs_atmos_wafs_blending_0p25.sh @@ -0,0 +1,298 @@ +#!/bin/ksh +################################################################################ +#### UNIX Script Documentation Block +# . . +# Script name: exgfs_atmos_wafs_blending_0p25.sh (copied from exgfs_atmos_wafs_blending.sh) +# Script description: This script looks for US and UK WAFS Grib2 products at 1/4 deg, +# waits for a specified period of time, and then runs $USHgfs/wafs_blending_0p25.sh +# if both WAFS data are available. Otherwise, the job aborts with an error message +# +# Author: Y Mao Org: EMC Date: 2020-04-02 +# +# +# Script history log: +# 2020-04-02 Y Mao +# Oct 2021 - Remove jlogfile +# 2022-05-25 | Y Mao | Add ICAO new milestone Nov 2023 + +set -x +echo "JOB $job HAS BEGUN" +export SEND_AWC_US_ALERT=NO +export SEND_AWC_UK_ALERT=NO +export SEND_US_WAFS=NO +export SEND_UK_WAFS=NO + +cd $DATA +export SLEEP_LOOP_MAX=`expr $SLEEP_TIME / $SLEEP_INT` + +echo "start blending US and UK WAFS products at 1/4 degree for " $cyc " z cycle" +export ffhr=$SHOUR + +export ic_uk=1 + +while test $ffhr -le $EHOUR +do + +########################## +# look for US WAFS data +########################## + + export ic=1 + while [ $ic -le $SLEEP_LOOP_MAX ] + do + if [ -s ${COMINus}/gfs.t${cyc}z.wafs_0p25_unblended.f${ffhr}.grib2 ] ; then + break + fi + if [ $ic -eq $SLEEP_LOOP_MAX ] ; then + echo "US WAFS GRIB2 file $COMINus/gfs.t${cyc}z.wafs_0p25_unblended.f${ffhr}.grib2 not found after waiting over $SLEEP_TIME seconds" + echo "US WAFS GRIB2 file " $COMINus/gfs.t${cyc}z.wafs_0p25_unblended.f${ffhr}.grib2 "not found after waiting ",$SLEEP_TIME, "exiting" + SEND_UK_WAFS=YES + break + else + ic=`expr $ic + 1` + sleep $SLEEP_INT + fi + done + +########################## +# look for UK WAFS data.
+########################## + + SLEEP_LOOP_MAX_UK=$SLEEP_LOOP_MAX + + # export ic=1 + while [ $ic_uk -le $SLEEP_LOOP_MAX_UK ] + do + # Three(3) unblended UK files for each cycle+fhour: icing, turb, cb + ukfiles=`ls $COMINuk/EGRR_WAFS_0p25_*_unblended_${PDY}_${cyc}z_t${ffhr}.grib2 | wc -l` + if [ $ukfiles -ge 3 ] ; then + break + fi + + if [ $ic_uk -eq $SLEEP_LOOP_MAX_UK ] ; then + echo "UK WAFS GRIB2 file $COMINuk/EGRR_WAFS_0p25_*_unblended_${PDY}_${cyc}z_t${ffhr}.grib2 not found" + echo "UK WAFS GRIB2 file " $COMINuk/EGRR_WAFS_0p25_*_unblended_${PDY}_${cyc}z_t${ffhr}.grib2 " not found" + export SEND_US_WAFS=YES + break + else + ic_uk=`expr $ic_uk + 1` + sleep $SLEEP_INT + fi + done + +########################## +# If both UK and US data are missing. +########################## + + if [ $SEND_UK_WAFS = 'YES' -a $SEND_US_WAFS = 'YES' ] ; then + SEND_US_WAFS=NO + SEND_UK_WAFS=NO + echo "BOTH UK and US data are missing, no blended for $PDY$cyc$ffhr" + export err=1; err_chk + continue + fi + +########################## +# Blending or unblended +########################## + + if [ $SEND_US_WAFS = 'YES' ] ; then + echo "turning back on dbn alert for unblended US WAFS product" + elif [ $SEND_UK_WAFS = 'YES' ] ; then + echo "turning back on dbn alert for unblended UK WAFS product" + # retrieve UK products + # Three(3) unblended UK files for each cycle+fhour: icing, turb, cb + cat $COMINuk/EGRR_WAFS_0p25_*_unblended_${PDY}_${cyc}z_t${ffhr}.grib2 > EGRR_WAFS_0p25_unblended_${PDY}_${cyc}z_t${ffhr}.grib2 + else # elif [ $SEND_US_WAFS = "NO" -a $SEND_UK_WAFS = "NO" ] ; then + # retrieve UK products + # Three(3) unblended UK files for each cycle+fhour: icing, turb, cb + cat $COMINuk/EGRR_WAFS_0p25_*_unblended_${PDY}_${cyc}z_t${ffhr}.grib2 > EGRR_WAFS_0p25_unblended_${PDY}_${cyc}z_t${ffhr}.grib2 + + # pick up US data + cp ${COMINus}/gfs.t${cyc}z.wafs_0p25_unblended.f${ffhr}.grib2 . + + # run blending code + export pgm=wafs_blending_0p25.x + . 
prep_step + + startmsg + $EXECgfs/$pgm gfs.t${cyc}z.wafs_0p25_unblended.f${ffhr}.grib2 \ + EGRR_WAFS_0p25_unblended_${PDY}_${cyc}z_t${ffhr}.grib2 \ + 0p25_blended_${PDY}${cyc}f${ffhr}.grib2 > f${ffhr}.out + + err1=$? + if test "$err1" -ne 0 + then + echo "WAFS blending 0p25 program failed at " ${PDY}${cyc}F${ffhr} " turning back on dbn alert for unblended US WAFS product" + SEND_US_WAFS=YES + fi + fi + +########################## +# Data dissemination +########################## + + if [ $SEND_US_WAFS = "YES" ] ; then + + ############################################################################################## + # + # checking any US WAFS product was sent due to No UK WAFS GRIB2 file or WAFS blending program + # (Alert once for all forecast hours) + # + if [ $SEND_AWC_US_ALERT = "NO" ] ; then + echo "WARNING! No UK WAFS GRIB2 0P25 file for WAFS blending. Send alert message to AWC ......" + make_NTC_file.pl NOXX10 KKCI $PDY$cyc NONE $FIXgfs/wafs_blending_0p25_admin_msg $PCOM/wifs_0p25_admin_msg + make_NTC_file.pl NOXX10 KWBC $PDY$cyc NONE $FIXgfs/wafs_blending_0p25_admin_msg $PCOM/iscs_0p25_admin_msg + if [ $SENDDBN_NTC = "YES" ] ; then + $DBNROOT/bin/dbn_alert NTC_LOW WAFS $job $PCOM/wifs_0p25_admin_msg + $DBNROOT/bin/dbn_alert NTC_LOW WAFS $job $PCOM/iscs_0p25_admin_msg + fi + + if [ $envir != prod ]; then + export maillist='nco.spa@noaa.gov' + fi + export maillist=${maillist:-'nco.spa@noaa.gov,ncep.sos@noaa.gov'} + export subject="WARNING! No UK WAFS GRIB2 0P25 file for WAFS blending, $PDY t${cyc}z $job" + echo "*************************************************************" > mailmsg + echo "*** WARNING! No UK WAFS GRIB2 0P25 file for WAFS blending ***" >> mailmsg + echo "*************************************************************" >> mailmsg + echo >> mailmsg + echo "Send alert message to AWC ......
" >> mailmsg + echo >> mailmsg + cat mailmsg > $COMOUT/${RUN}.t${cyc}z.wafs_blend_0p25_usonly.emailbody + cat $COMOUT/${RUN}.t${cyc}z.wafs_blend_0p25_usonly.emailbody | mail.py -s "$subject" $maillist -v + + export SEND_AWC_US_ALERT=YES + fi + ############################################################################################## + # + # Distribute US WAFS unblended data to NCEP FTP Server (WOC) and TOC + # + echo "alerting the unblended US WAFS products - $COMINus/gfs.t${cyc}z.wafs_0p25_unblended.f${ffhr}.grib2 " + echo "and $COMINus/gfs.t${cyc}z.wafs_0p25_unblended.f${ffhr}.grib2.idx " + + if [ $SENDDBN = "YES" ] ; then + $DBNROOT/bin/dbn_alert MODEL GFS_WAFS_0P25_UBL_GB2 $job $COMINus/gfs.t${cyc}z.wafs_0p25_unblended.f${ffhr}.grib2 + $DBNROOT/bin/dbn_alert MODEL GFS_WAFS_0P25_UBL_GB2_WIDX $job $COMINus/gfs.t${cyc}z.wafs_0p25_unblended.f${ffhr}.grib2.idx + fi + +# if [ $SENDDBN_NTC = "YES" ] ; then +# $DBNROOT/bin/dbn_alert NTC_LOW $NET $job $COMOUT/gfs.t${cyc}z.wafs_0p25_unblended.f${ffhr}.grib2 +# fi + + + export SEND_US_WAFS=NO + + elif [ $SEND_UK_WAFS = "YES" ] ; then + ############################################################################################## + # + # checking any UK WAFS product was sent due to No US WAFS GRIB2 file + # (Alert once for all forecast hours) + # + if [ $SEND_AWC_UK_ALERT = "NO" ] ; then + echo "WARNING: No US WAFS GRIB2 0P25 file for WAFS blending. Send alert message to AWC ......"
+ make_NTC_file.pl NOXX10 KKCI $PDY$cyc NONE $FIXgfs/wafs_blending_0p25_admin_msg $PCOM/wifs_0p25_admin_msg + make_NTC_file.pl NOXX10 KWBC $PDY$cyc NONE $FIXgfs/wafs_blending_0p25_admin_msg $PCOM/iscs_0p25_admin_msg + if [ $SENDDBN_NTC = "YES" ] ; then + $DBNROOT/bin/dbn_alert NTC_LOW WAFS $job $PCOM/wifs_0p25_admin_msg + $DBNROOT/bin/dbn_alert NTC_LOW WAFS $job $PCOM/iscs_0p25_admin_msg + fi + + if [ $envir != prod ]; then + export maillist='nco.spa@noaa.gov' + fi + export maillist=${maillist:-'nco.spa@noaa.gov,ncep.sos@noaa.gov'} + export subject="WARNING! No US WAFS GRIB2 0P25 file for WAFS blending, $PDY t${cyc}z $job" + echo "*************************************************************" > mailmsg + echo "*** WARNING! No US WAFS GRIB2 0P25 file for WAFS blending ***" >> mailmsg + echo "*************************************************************" >> mailmsg + echo >> mailmsg + echo "Send alert message to AWC ...... " >> mailmsg + echo >> mailmsg + cat mailmsg > $COMOUT/${RUN}.t${cyc}z.wafs_blend_0p25_ukonly.emailbody + cat $COMOUT/${RUN}.t${cyc}z.wafs_blend_0p25_ukonly.emailbody | mail.py -s "$subject" $maillist -v + + export SEND_AWC_UK_ALERT=YES + fi + ############################################################################################## + # + # Distribute UK WAFS unblended data to NCEP FTP Server (WOC) and TOC + # + echo "alerting the unblended UK WAFS products - EGRR_WAFS_0p25_unblended_${PDY}_${cyc}z_t${ffhr}.grib2" + + if [ $SENDDBN = "YES" ] ; then + $DBNROOT/bin/dbn_alert MODEL GFS_WAFS_UKMET_0P25_UBL_GB2 $job EGRR_WAFS_0p25_unblended_${PDY}_${cyc}z_t${ffhr}.grib2 + fi + +# if [ $SENDDBN_NTC = "YES" ] ; then +# $DBNROOT/bin/dbn_alert NTC_LOW $NET $job EGRR_WAFS_0p25_unblended_${PDY}_${cyc}z_t${ffhr}.grib2 +# fi + export SEND_UK_WAFS=NO + + + else + ############################################################################################## + # + # TOCGRIB2 Processing WAFS Blending GRIB2 (Icing, CB, GTG) + + # As of August 2020, no WMO header
is needed for WAFS data at 1/4 deg + ## . prep_step + ## export pgm=$TOCGRIB2 + ## startmsg + + ## export FORT11=0p25_blended_${PDY}${cyc}f${ffhr}.grib2 + ## export FORT31=" " + ## export FORT51=grib2.t${cyc}z.WAFS_0p25_blended_f${ffhr} + + ## $TOCGRIB2 < $FIXgfs/grib2_blended_wafs_wifs_f${ffhr}.0p25 >> $pgmout 2> errfile + + ## err=$?;export err ;err_chk + ## echo " error from tocgrib=",$err + + ############################################################################################## + # + # Distribute US WAFS unblend Data to NCEP FTP Server (WOC) and TOC + # + if [ $SENDCOM = YES ]; then + cp 0p25_blended_${PDY}${cyc}f${ffhr}.grib2 $COMOUT/WAFS_0p25_blended_${PDY}${cyc}f${ffhr}.grib2 + ## cp grib2.t${cyc}z.WAFS_0p25_blended_f${ffhr} $PCOM/grib2.t${cyc}z.WAFS_0p25_blended_f${ffhr} + fi + + if [ $SENDDBN_NTC = "YES" ] ; then + # Distribute Data to NCEP FTP Server (WOC) and TOC + echo "No WMO header yet" + ## $DBNROOT/bin/dbn_alert NTC_LOW $NET $job $PCOM/grib2.t${cyc}z.WAFS_0p25_blended_f${ffhr} + fi + + if [ $SENDDBN = "YES" ] ; then + $DBNROOT/bin/dbn_alert MODEL GFS_WAFS_0P25_BL_GB2 $job $COMOUT/WAFS_0p25_blended_${PDY}${cyc}f${ffhr}.grib2 + fi + fi + +########################## +# Next loop +########################## + + echo "$PDY$cyc$ffhr" > $COMOUT/${RUN}.t${cyc}z.control.wafsblending_0p25 + + if [ $FHOUT_GFS -eq 3 ] ; then + FHINC=03 + else + if [ $ffhr -lt 24 ] ; then + FHINC=01 + else + FHINC=03 + fi + fi + + ffhr=`expr $ffhr + $FHINC` + if test $ffhr -lt 10 + then + ffhr=0${ffhr} + fi + +done +################################################################################ + +exit 0 +# diff --git a/scripts/exgfs_atmos_wafs_gcip.sh b/scripts/exgfs_atmos_wafs_gcip.sh new file mode 100755 index 0000000000..ad91c47420 --- /dev/null +++ b/scripts/exgfs_atmos_wafs_gcip.sh @@ -0,0 +1,242 @@ +#!/bin/ksh +###################################################################### +# UTILITY SCRIPT NAME : exgfs_atmos_wafs_gcip.sh +# DATE WRITTEN : 01/28/2015 +# 
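The blending loop above steps `ffhr` by a computed `FHINC`: 3-hourly when `FHOUT_GFS` is 3, otherwise hourly out to f24 and 3-hourly beyond, re-padding to two digits. That stepping can be sketched as a small helper (hypothetical function name, same logic as the script):

```shell
# Sketch of the blending loop's forecast-hour stepping.
# Args: current zero-padded hour, FHOUT_GFS value; prints the next padded hour.
# The 10# prefix forces base-10 so padded hours like "08" are not read as octal.
next_ffhr() {
  ffhr=$1
  fhout_gfs=$2
  if [ "${fhout_gfs}" -eq 3 ]; then
    fhinc=3
  elif [ "$(( 10#${ffhr} ))" -lt 24 ]; then
    fhinc=1
  else
    fhinc=3
  fi
  printf "%02d" $(( 10#${ffhr} + fhinc ))
}
```

For example, `next_ffhr 06 1` yields `07`, while `next_ffhr 24 1` jumps to `27`.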
+# Abstract: This utility script produces the WAFS GCIP. +# +# GCIP runs f00 f03 for each cycle, 4 times/day, +# to make the output valid every 3 hours +# +# History: 01/28/2015 +# - GFS post master file as first guess +# /com/prod/gfs.YYYYMMDD +# - Nesdis composite global satellite data +# /dcom (ftp?) +# - Metar/ships/lightning/pireps +# ksh /nwprod/ush/dumpjb YYYYMMDDHH hours output >/dev/null +# - Radar data over CONUS +# /com/hourly/prod/radar.YYYYMMDD/refd3d.tHHz.grbf00 +# - output of current icing potential +##################################################################### +echo "-----------------------------------------------------" +echo "JGFS_ATMOS_WAFS_GCIP at 00Z/06Z/12Z/18Z GFS postprocessing" +echo "-----------------------------------------------------" +echo "History: 2015 - First implementation of this new script." +echo "Oct 2021 - Remove jlogfile" +echo " " +##################################################################### + +set -xa + +# Set up working dir for parallel runs based on ffhr +ffhr=$1 +DATA=$DATA/$ffhr +mkdir -p $DATA +cd $DATA +# Overwrite TMPDIR for dumpjb +export TMPDIR=$DATA + +SLEEP_LOOP_MAX=`expr $SLEEP_TIME / $SLEEP_INT` + +configFile=gcip.config + +echo 'before preparing data' `date` + +# valid time. no worry, it won't be across to another date +vhour=$(( $ffhr + $cyc )) +vhour="$(printf "%02d" $(( 10#$vhour )) )" + +######################################################## +# Preparing data + +if [ $RUN = "gfs" ] ; then + + # model data + masterFile=$COMINgfs/gfs.t${cyc}z.master.grb2f$ffhr + + # check the availability of model file + icnt=1 + while [ $icnt -lt $SLEEP_LOOP_MAX ] ; do + if [ -s $masterFile ] ; then + break + fi + sleep $SLEEP_INT + icnt=$((icnt + 1)) + if [ $icnt -ge $SLEEP_LOOP_MAX ] ; then + msg="ABORTING after $SLEEP_TIME seconds of waiting for gfs master file!" 
+ err_exit $msg + fi + done + + cpreq $PARMgfs/wafs_gcip_gfs.cfg $configFile + + modelFile=modelfile.grb +# ln -sf $masterFile $modelFile + $WGRIB2 $masterFile | egrep ":HGT:|:VVEL:|:CLMR:|:TMP:|:SPFH:|:RWMR:|:SNMR:|:GRLE:|:ICMR:|:RH:" | egrep "00 mb:|25 mb:|50 mb:|75 mb:|:HGT:surface" | $WGRIB2 -i $masterFile -grib $modelFile + + # metar / ships / lightning / pireps + # dumped data files' suffix is ".ibm" + obsfiles="metar ships ltngsr pirep" + for obsfile in $obsfiles ; do +# ksh $USHobsproc_dump/dumpjb ${PDY}${vhour} 1.5 $obsfile >/dev/null + ksh $DUMPJB ${PDY}${vhour} 1.5 $obsfile + done + metarFile=metar.ibm + shipFile=ships.ibm + lightningFile=ltngsr.ibm + pirepFile=pirep.ibm + + satFiles="" + channels="VIS SIR LIR SSR" + # If one channel is missing, satFiles will be empty + for channel in $channels ; do + satFile=GLOBCOMP$channel.${PDY}${vhour} + if [[ $COMINsat == *ftp:* ]] ; then + curl -O $COMINsat/$satFile + else + + # check the availability of satellite data file + icnt=1 + while [ $icnt -lt $SLEEP_LOOP_MAX ] ; do + if [ -s $COMINsat/$satFile ] ; then + break + fi + sleep $SLEEP_INT + icnt=$((icnt + 1)) + if [ $icnt -ge $SLEEP_LOOP_MAX ] ; then + msg="GCIP at ${vhour}z ABORTING after $SLEEP_TIME seconds of waiting for satellite $channel file!" + echo "$msg" + rc=1 + echo $msg >> $COMOUT/${RUN}.gcip.log + + if [ $envir != prod ]; then + export maillist='nco.spa@noaa.gov' + fi + export maillist=${maillist:-'nco.spa@noaa.gov,ncep.sos@noaa.gov'} + + export subject="Missing GLOBCOMPVIS Satellite Data for $PDY t${cyc}z $job" + echo "*************************************************************" > mailmsg + echo "*** WARNING !! 
COULD NOT FIND GLOBCOMPVIS Satellite Data *** " >> mailmsg + echo "*************************************************************" >> mailmsg + echo >> mailmsg + echo "One or more GLOBCOMPVIS Satellite Data files are missing, including " >> mailmsg + echo " $COMINsat/$satFile " >> mailmsg + echo >> mailmsg + echo "$job will gracefully exit" >> mailmsg + cat mailmsg > $COMOUT/${RUN}.t${cyc}z.gcip.emailbody + cat $COMOUT/${RUN}.t${cyc}z.gcip.emailbody | mail.py -s "$subject" $maillist -v + + exit $rc + fi + done + + cp $COMINsat/$satFile . + fi + if [[ -s $satFile ]] ; then + satFiles="$satFiles $satFile" + else + satFiles="" + break + fi + done + + # radar data + sourceRadar=$COMINradar/refd3d.t${vhour}z.grb2f00 + + # check the availability of radar data file + icnt=1 + while [ $icnt -lt $SLEEP_LOOP_MAX ] ; do + if [ -s $sourceRadar ] ; then + break + fi + sleep $SLEEP_INT + icnt=$((icnt + 1)) + if [ $icnt -ge $SLEEP_LOOP_MAX ] ; then + echo "WARNING: radar data is not available after $SLEEP_TIME seconds of waiting!" + fi + done + + radarFile=radarFile.grb + if [ -s $sourceRadar ] ; then + cp $sourceRadar $radarFile + fi + + fi # RUN model name + +######################################################## +# Composite gcip command options + +outputfile=gfs.t${vhour}z.gcip.f00.grib2 + +cmdoptions="-t ${PDY}${vhour} -c $configFile -model $modelFile" +if [[ -s $metarFile ]] ; then + cmdoptions="$cmdoptions -metar $metarFile" +else + err_exit "There are no METAR observations." +fi +if [[ -s $shipFile ]] ; then + cmdoptions="$cmdoptions -ship $shipFile" +fi +# empty if a channel data is missing +if [[ -n $satFiles ]] ; then + cmdoptions="$cmdoptions -sat $satFiles" +else + err_exit "Satellite data are not available or incomplete."
+fi +if [[ -s $lightningFile ]] ; then + cmdoptions="$cmdoptions -lightning $lightningFile" +fi +if [[ -s $pirepFile ]] ; then + cmdoptions="$cmdoptions -pirep $pirepFile" +fi +if [[ -s $radarFile ]] ; then + cmdoptions="$cmdoptions -radar $radarFile" +fi +cmdoptions="$cmdoptions -o $outputfile" + +####################################################### +# Run GCIP + +echo 'after preparing data' `date` + +export pgm=wafs_gcip.x + +cpreq $FIXgfs/gcip_near_ir_refl.table near_ir_refl.table + +startmsg +$EXECgfs/$pgm >> $pgmout $cmdoptions 2> errfile & +wait +export err=$?; err_chk + + +if [[ -s $outputfile ]] ; then + ############################## + # Post Files to COM + ############################## + if [ $SENDCOM = "YES" ] ; then + cp $outputfile $COMOUT/$outputfile + if [ $SENDDBN = "YES" ] ; then + # $DBNROOT/bin/dbn_alert GFS_WAFS GCIP $job $COMOUT/$outputfile +#alert removed in v15.0 $DBNROOT/bin/dbn_alert MODEL GFS_WAFS_GCIP $job $COMOUT/$outputfile + : + fi + fi +else + err_exit "Output $outputfile was not generated" +fi + + +################################################################################ +# GOOD RUN +set +x +echo "**************JOB EXGFS_ATMOS_WAFS_GCIP.SH COMPLETED NORMALLY ON THE IBM" +echo "**************JOB EXGFS_ATMOS_WAFS_GCIP.SH COMPLETED NORMALLY ON THE IBM" +echo "**************JOB EXGFS_ATMOS_WAFS_GCIP.SH COMPLETED NORMALLY ON THE IBM" +set -x +################################################################################ + +exit 0 + +############## END OF SCRIPT ####################### + diff --git a/scripts/exgfs_atmos_wafs_grib.sh b/scripts/exgfs_atmos_wafs_grib.sh new file mode 100755 index 0000000000..e81f0e99da --- /dev/null +++ b/scripts/exgfs_atmos_wafs_grib.sh @@ -0,0 +1,146 @@ +#!/bin/sh +###################################################################### +# UTILITY SCRIPT NAME : exgfs_atmos_wafs_grib.sh +# DATE WRITTEN : 10/04/2004 +# +# Abstract: This utility script produces the WAFS GRIB +# +# Input: 1 arguments 
are passed to this script. +# 1st argument - Forecast Hour - format of 2I +# +# Logic: If we are processing fcsthrss 12-30, we have the +# added variable of the a or b in the process accordingly. +# The other fcsthrss, the a or b is dropped. +# +##################################################################### +echo "------------------------------------------------" +echo "JWAFS_00/06/12/18 GFS postprocessing" +echo "------------------------------------------------" +echo "History: OCT 2004 - First implementation of this new script." +echo " Aug 2015 - Modified for Phase II" +echo " Dec 2015 - Modified for input model data in Grib2" +echo " Oct 2021 - Remove jlogfile" +echo " " +##################################################################### +set +x +fcsthrs_list="$1" +num=$# + +if test "$num" -ge 1 +then + echo " Appropriate number of arguments were passed" + set -x + export DBNALERT_TYPE=${DBNALERT_TYPE:-GRIB} +# export job=${job:-interactive} +else + echo "" + echo "Usage: exgfs_atmos_wafs_grib.sh \$fcsthrs " + echo "" + exit 16 +fi + +cd $DATA + +set -x + +# To fix bugzilla 628 ( removing 'j' ahead of $job ) +export jobsuffix=gfs_atmos_wafs_f${fcsthrs}_$cyc + +############################################### +# Wait for the availability of the pgrib file +############################################### +# file name and forecast hour of GFS model data in Grib2 are 3 digits +export fcsthrs000="$(printf "%03d" $(( 10#$fcsthrs )) )" +icnt=1 +while [ $icnt -lt 1000 ] +do +# if [ -s $COMIN/${RUN}.${cycle}.pgrbf$fcsthrs ] + if [ -s $COMIN/${RUN}.${cycle}.pgrb2.1p00.f$fcsthrs000 ] + then + break + fi + + sleep 10 + icnt=$((icnt + 1)) + if [ $icnt -ge 180 ] + then + msg="ABORTING after 30 min of waiting for the pgrib file!" + err_exit $msg + fi +done + +######################################## +echo "HAS BEGUN!"
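Several of these scripts build three-digit forecast-hour strings with the idiom `"$(printf "%03d" $(( 10#$fcsthrs )) )"`. The `10#` prefix matters: without it, zero-padded hours such as "08" and "09" are parsed as invalid octal literals inside `$(( ... ))`. A minimal illustration (helper name is mine, not from the scripts):

```shell
# Pad a forecast hour to three digits, forcing base-10 interpretation
# so zero-padded inputs like "08" do not trip octal parsing in $(( )).
pad_fhr() {
  printf "%03d" $(( 10#$1 ))
}
```

So `pad_fhr 08` gives `008`, where a bare `$(( 08 ))` would error out.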
+######################################## + +echo " ------------------------------------------" +echo " BEGIN MAKING GFS WAFS PRODUCTS" +echo " ------------------------------------------" + +#################################################### +# +# GFS WAFS PRODUCTS MUST RUN IN CERTAIN ORDER +# BY REQUIREMENT FROM FAA. +# PLEASE DO NOT ALTER ORDER OF PROCESSING WAFS +# PRODUCTS CONSULTING WITH MR. BRENT GORDON. +# +#################################################### + +set +x +echo " " +echo "#####################################" +echo " Process GRIB WAFS PRODUCTS (mkwafs)" +echo " FORECAST HOURS 00 - 72." +echo "#####################################" +echo " " +set -x + +if test $fcsthrs -eq 0 +then + echo " " +fi + +# If we are processing fcsthrss 12-30, we have the +# added variable of the a or b in the process. +# The other fcsthrss, the a or b is dropped. + +if test $fcsthrs -ge 12 -a $fcsthrs -le 24 +then + sh $USHgfs/wafs_mkgbl.sh ${fcsthrs} a +fi + +if test $fcsthrs -eq 30 +then + sh $USHgfs/wafs_mkgbl.sh ${fcsthrs} a + for fcsthrs in 12 18 24 30 + do + sh $USHgfs/wafs_mkgbl.sh ${fcsthrs} b + done + sh $USHgfs/wafs_mkgbl.sh 00 x + sh $USHgfs/wafs_mkgbl.sh 06 x +fi + +if test $fcsthrs -gt 30 -a $fcsthrs -le 48 +then + sh $USHgfs/wafs_mkgbl.sh ${fcsthrs} x +fi + +if test $fcsthrs -eq 60 -o $fcsthrs -eq 72 +then + sh $USHgfs/wafs_mkgbl.sh ${fcsthrs} x +fi + +################################################################################ +# GOOD RUN +set +x +echo "**************JOB EXGFS_ATMOS_WAFS_GRIB.SH COMPLETED NORMALLY ON THE IBM" +echo "**************JOB EXGFS_ATMOS_WAFS_GRIB.SH COMPLETED NORMALLY ON THE IBM" +echo "**************JOB EXGFS_ATMOS_WAFS_GRIB.SH COMPLETED NORMALLY ON THE IBM" +set -x +################################################################################ + +echo "HAS COMPLETED NORMALLY!" 
+ +exit 0 + +############## END OF SCRIPT ####################### diff --git a/scripts/exgfs_atmos_wafs_grib2.sh b/scripts/exgfs_atmos_wafs_grib2.sh new file mode 100755 index 0000000000..4631a10d8c --- /dev/null +++ b/scripts/exgfs_atmos_wafs_grib2.sh @@ -0,0 +1,227 @@ +#!/bin/sh +###################################################################### +# UTILITY SCRIPT NAME : exgfs_atmos_wafs_grib2.sh +# DATE WRITTEN : 07/15/2009 +# +# Abstract: This utility script produces the WAFS GRIB2. The output +# GRIB files are posted on NCEP ftp server and the grib2 files +# are pushed via dbnet to TOC to WAFS (ICSC). +# This is a joint project of WAFC London and WAFC Washington. +# +# We are processing WAFS grib2 for fcsthrs from 06 - 36 +# with 3-hour time increment. +# +# History: 08/20/2014 +# - ingest master file in grib2 (or grib1 if grib2 fails) +# - output of icng tcld cat cb are in grib2 +# 02/21/2020 +# - Prepare unblended icing severity and GTG turbulence +# for blending at 0.25 degree +# 02/22/2022 +# - Add grib2 data requested by FAA +# - Stop generating grib1 data for WAFS +##################################################################### +echo "-----------------------------------------------------" +echo "JGFS_ATMOS_WAFS_GRIB2 at 00Z/06Z/12Z/18Z GFS postprocessing" +echo "-----------------------------------------------------" +echo "History: AUGUST 2009 - First implementation of this new script."
+echo "Oct 2021 - Remove jlogfile" +echo "Feb 2022 - Add FAA data, stop grib1 data" +echo " " +##################################################################### + +set -x + +fcsthrs=$1 + +DATA=$DATA/$fcsthrs +mkdir -p $DATA +cd $DATA + +########################################################## +# Wait for the availability of the gfs master pgrib file +########################################################## +# file name and forecast hour of GFS model data in Grib2 are 3 digits +export fcsthrs000="$(printf "%03d" $(( 10#$fcsthrs )) )" + +# 2D data +master2=$COMIN/${RUN}.${cycle}.master.grb2f${fcsthrs000} +master2i=$COMIN/${RUN}.${cycle}.master.grb2if${fcsthrs000} +# 3D data +wafs2=$COMIN/${RUN}.${cycle}.wafs.grb2f${fcsthrs000} +wafs2i=$COMIN/${RUN}.${cycle}.wafs.grb2f${fcsthrs000}.idx +# 3D data (on ICAO standard level) +icao2=$COMIN/${RUN}.${cycle}.wafs_icao.grb2f${fcsthrs000} +icao2i=$COMIN/${RUN}.${cycle}.wafs_icao.grb2f${fcsthrs000}.idx + +icnt=1 +while [ $icnt -lt 1000 ] +do + if [[ -s $master2i && -s $wafs2i ]] ; then + break + fi + + sleep 10 + icnt=$((icnt + 1)) + if [ $icnt -ge 180 ] ; then + msg="ABORTING after 30 min of waiting for the gfs master and wafs file!" + err_exit $msg + fi +done + +######################################## +echo "HAS BEGUN!" +######################################## + +echo " ------------------------------------------" +echo " BEGIN MAKING GFS WAFS GRIB2 PRODUCTS" +echo " ------------------------------------------" + +set +x +echo " " +echo "#####################################" +echo " Process GRIB WAFS PRODUCTS " +echo " FORECAST HOURS 06 - 36." 
+echo "#####################################" +echo " " +set -x + + +if [ $fcsthrs -le 36 -a $fcsthrs -gt 0 ] ; then + wafs_timewindow=yes +else + wafs_timewindow=no +fi + +#--------------------------- +# 1) Grib2 data for FAA +#--------------------------- +$WGRIB2 $master2 | grep -F -f $FIXgfs/grib2_gfs_awf_master.list | $WGRIB2 -i $master2 -grib tmpfile_gfsf${fcsthrs} +# F006 master file has two records of 0-6 hour APCP and ACPCP each, keep only one +# FAA APCP ACPCP: included every 6 forecast hours (0, 48], every 12 forecast hours [48, 72] (controlled by $FIXgfs/grib2_gfs_awf_master3d.list) +if [ $fcsthrs -eq 6 ] ; then + $WGRIB2 tmpfile_gfsf${fcsthrs} -not "(APCP|ACPCP)" -grib tmp.grb2 + $WGRIB2 tmpfile_gfsf${fcsthrs} -match APCP -append -grib tmp.grb2 -quit + $WGRIB2 tmpfile_gfsf${fcsthrs} -match ACPCP -append -grib tmp.grb2 -quit + mv tmp.grb2 tmpfile_gfsf${fcsthrs} +fi +# U V will have the same grid message number by using -ncep_uv. +# U V will have the different grid message number without -ncep_uv. +$WGRIB2 tmpfile_gfsf${fcsthrs} \ + -set master_table 6 \ + -new_grid_winds earth -set_grib_type jpeg \ + -new_grid_interpolation bilinear -if ":(UGRD|VGRD):max wind" -new_grid_interpolation neighbor -fi \ + -new_grid latlon 0:288:1.25 90:145:-1.25 gfs.t${cyc}z.awf_grb45f${fcsthrs}.grib2 +$WGRIB2 -s gfs.t${cyc}z.awf_grb45f${fcsthrs}.grib2 > gfs.t${cyc}z.awf_grb45f${fcsthrs}.grib2.idx + +# For FAA, add WMO header. The header is different from WAFS +export pgm=$TOCGRIB2 +.
prep_step +startmsg +export FORT11=gfs.t${cyc}z.awf_grb45f${fcsthrs}.grib2 +export FORT31=" " +export FORT51=grib2.t${cyc}z.awf_grbf${fcsthrs}.45 +$TOCGRIB2 < $FIXgfs/grib2_gfs_awff${fcsthrs}.45 >> $pgmout 2> errfile +err=$?;export err ;err_chk +echo " error from tocgrib=$err" + +if [ $wafs_timewindow = 'yes' ] ; then +#--------------------------- +# 2) traditional WAFS fields +#--------------------------- + # 3D data from $wafs2, on exact model pressure levels + $WGRIB2 $wafs2 | grep -F -f $FIXgfs/grib2_gfs_wafs_wafsmaster.list | $WGRIB2 -i $wafs2 -grib tmpfile_gfsf${fcsthrs} + # 2D data from $master2 + tail -5 $FIXgfs/grib2_gfs_wafs_wafsmaster.list > grib2_gfs_wafs_wafsmaster.list.2D + $WGRIB2 $master2 | grep -F -f grib2_gfs_wafs_wafsmaster.list.2D | $WGRIB2 -i $master2 -grib tmpfile_gfsf${fcsthrs}.2D + # Complete list of WAFS data + cat tmpfile_gfsf${fcsthrs}.2D >> tmpfile_gfsf${fcsthrs} + # WMO header + cp $FIXgfs/grib2_gfs_wafsf${fcsthrs}.45 wafs_wmo_header45 + # U V will have the same grid message number by using -ncep_uv. + # U V will have different grid message numbers without -ncep_uv. + $WGRIB2 tmpfile_gfsf${fcsthrs} \ + -set master_table 6 \ + -new_grid_winds earth -set_grib_type jpeg \ + -new_grid_interpolation bilinear -if ":(UGRD|VGRD):max wind" -new_grid_interpolation neighbor -fi \ + -new_grid latlon 0:288:1.25 90:145:-1.25 gfs.t${cyc}z.wafs_grb45f${fcsthrs}.grib2 + $WGRIB2 -s gfs.t${cyc}z.wafs_grb45f${fcsthrs}.grib2 > gfs.t${cyc}z.wafs_grb45f${fcsthrs}.grib2.idx + + # For WAFS, add WMO header. Processing WAFS GRIB2 grid 45 for ISCS and WIFS + export pgm=$TOCGRIB2 + . 
prep_step + startmsg + export FORT11=gfs.t${cyc}z.wafs_grb45f${fcsthrs}.grib2 + export FORT31=" " + export FORT51=grib2.t${cyc}z.wafs_grbf${fcsthrs}.45 + $TOCGRIB2 < wafs_wmo_header45 >> $pgmout 2> errfile + err=$?;export err ;err_chk + echo " error from tocgrib=",$err + +fi # wafs_timewindow + +if [ $SENDCOM = "YES" ] ; then + + ############################## + # Post Files to COM + ############################## + + # FAA data + mv gfs.t${cyc}z.awf_grb45f${fcsthrs}.grib2 $COMOUT/gfs.t${cyc}z.awf_grb45f${fcsthrs}.grib2 + mv gfs.t${cyc}z.awf_grb45f${fcsthrs}.grib2.idx $COMOUT/gfs.t${cyc}z.awf_grb45f${fcsthrs}.grib2.idx + + # WAFS data + if [ $wafs_timewindow = 'yes' ] ; then + mv gfs.t${cyc}z.wafs_grb45f${fcsthrs}.grib2 $COMOUT/gfs.t${cyc}z.wafs_grb45f${fcsthrs}.grib2 + mv gfs.t${cyc}z.wafs_grb45f${fcsthrs}.grib2.idx $COMOUT/gfs.t${cyc}z.wafs_grb45f${fcsthrs}.grib2.idx + fi + + ############################## + # Post Files to PCOM + ############################## + + mv grib2.t${cyc}z.awf_grbf${fcsthrs}.45 $PCOM/grib2.t${cyc}z.awf_grbf${fcsthrs}.45 + + if [ $wafs_timewindow = 'yes' ] ; then + mv grib2.t${cyc}z.wafs_grbf${fcsthrs}.45 $PCOM/grib2.t${cyc}z.wafs_grbf${fcsthrs}.45 + fi +fi + +###################### +# Distribute Data +###################### + +if [ $SENDDBN = "YES" ] ; then + +# +# Distribute Data to WOC +# + if [ $wafs_timewindow = 'yes' ] ; then + $DBNROOT/bin/dbn_alert MODEL GFS_WAFS_1P25_GB2 $job $COMOUT/gfs.t${cyc}z.wafs_grb45f${fcsthrs}.grib2 +# +# Distribute Data to TOC TO WIFS FTP SERVER (AWC) +# + $DBNROOT/bin/dbn_alert NTC_LOW $NET $job $PCOM/grib2.t${cyc}z.wafs_grbf${fcsthrs}.45 + fi +# +# Distribute data to FAA +# + $DBNROOT/bin/dbn_alert NTC_LOW $NET $job $PCOM/grib2.t${cyc}z.awf_grbf${fcsthrs}.45 + + +fi + +################################################################################ +# GOOD RUN +set +x +echo "**************JOB EXGFS_ATMOS_WAFS_GRIB2.SH COMPLETED NORMALLY ON THE IBM" +echo "**************JOB EXGFS_ATMOS_WAFS_GRIB2.SH 
COMPLETED NORMALLY ON THE IBM" +echo "**************JOB EXGFS_ATMOS_WAFS_GRIB2.SH COMPLETED NORMALLY ON THE IBM" +set -x +################################################################################ + +echo "HAS COMPLETED NORMALLY!" + +exit 0 + +############## END OF SCRIPT ####################### diff --git a/scripts/exgfs_atmos_wafs_grib2_0p25.sh b/scripts/exgfs_atmos_wafs_grib2_0p25.sh new file mode 100755 index 0000000000..ec53966430 --- /dev/null +++ b/scripts/exgfs_atmos_wafs_grib2_0p25.sh @@ -0,0 +1,200 @@ +#!/bin/sh +###################################################################### +# UTILITY SCRIPT NAME : exgfs_atmos_wafs_grib2_0p25.sh +# DATE WRITTEN : 03/20/2020 +# +# Abstract: This utility script produces the WAFS GRIB2 at 0.25 degree. +# The output GRIB files are posted on NCEP ftp server and the +# grib2 files are pushed via dbnet to TOC to WAFS (ICSC). +# This is a joint project of WAFC London and WAFC Washington. +# +# We are processing WAFS grib2 for ffhr: +# hourly: 006 - 024 +# 3 hour: 027 - 048 +# 6 hour: 054 - 120 (for U/V/T/RH, not for turbulence/icing/CB) +# +# History: +##################################################################### +echo "-----------------------------------------------------" +echo "JGFS_ATMOS_WAFS_GRIB2_0P25 at 00Z/06Z/12Z/18Z GFS postprocessing" +echo "-----------------------------------------------------" +echo "History: MARCH 2020 - First implementation of this new script." 
+echo "Oct 2021 - Remove jlogfile" +echo "Aug 2022 - ffhr expanded from 36 to 120" +echo " " +##################################################################### + +cd $DATA + +set -x + + +ffhr=$1 +export ffhr="$(printf "%03d" $(( 10#$ffhr )) )" +export ffhr2="$(printf "%02d" $(( 10#$ffhr )) )" + +DATA=$DATA/$ffhr +mkdir -p $DATA +cd $DATA + + +if [ $ffhr -le 48 ] ; then + hazard_timewindow=yes +else + hazard_timewindow=no +fi + + +########################################################## +# Wait for the availability of the gfs WAFS file +########################################################## + +# 3D data (on new ICAO model pressure levels) and 2D data (CB) +wafs2=$COMIN/${RUN}.${cycle}.wafs.grb2f${ffhr} +wafs2i=$COMIN/${RUN}.${cycle}.wafs.grb2f${ffhr}.idx + +# 2D data from master file (U/V/H on max wind level, T/H at tropopause) +master2=$COMIN/${RUN}.${cycle}.master.grb2f${ffhr} + +# 3D data (on standard atmospheric pressure levels) +# Up to fhour=48 +# Will be removed in GFS.v17 +icao2=$COMIN/${RUN}.${cycle}.wafs_icao.grb2f${ffhr} + +icnt=1 +while [ $icnt -lt 1000 ] +do + if [[ -s $wafs2i ]] ; then + break + fi + + sleep 10 + icnt=$((icnt + 1)) + if [ $icnt -ge 180 ] ; then + msg="ABORTING after 30 min of waiting for the gfs wafs file!" + err_exit $msg + fi +done + + +######################################## +echo "HAS BEGUN!" 
+######################################## + +echo " ------------------------------------------" +echo " BEGIN MAKING GFS WAFS GRIB2 0.25 DEG PRODUCTS" +echo " ------------------------------------------" + +set +x +echo " " +echo "#####################################" +echo " Process GRIB2 WAFS 0.25 DEG PRODUCTS " +echo "#####################################" +echo " " +set -x + +opt1=' -set_grib_type same -new_grid_winds earth ' +opt21=' -new_grid_interpolation bilinear -if ' +opt22="(:ICESEV|parm=37):" +opt23=' -new_grid_interpolation neighbor -fi ' +opt24=' -set_bitmap 1 -set_grib_max_bits 16 ' +opt25=":(UGRD|VGRD):max wind" +newgrid="latlon 0:1440:0.25 90:721:-0.25" + +# WAFS 3D data +$WGRIB2 $wafs2 $opt1 $opt21 $opt22 $opt23 $opt24 -new_grid $newgrid tmp_wafs_0p25.grb2 +# Master 2D data +$WGRIB2 $master2 | grep -F -f $FIXgfs/grib2_0p25_gfs_master2d.list \ + | $WGRIB2 -i $master2 -set master_table 25 -grib tmp_master.grb2 +$WGRIB2 tmp_master.grb2 $opt1 $opt21 ":(UGRD|VGRD):max wind" $opt23 $opt24 -new_grid $newgrid tmp_master_0p25.grb2 + +#--------------------------- +# Product 1: WAFS u/v/t/rh gfs.tHHz.wafs_0p25.fFFF.grib2 +#--------------------------- +$WGRIB2 tmp_wafs_0p25.grb2 | egrep "UGRD|VGRD|TMP|HGT|RH" \ + | $WGRIB2 -i tmp_wafs_0p25.grb2 -set master_table 25 -grib tmp.gfs.t${cyc}z.wafs_0p25.f${ffhr}.grib2 +cat tmp_master_0p25.grb2 >> tmp.gfs.t${cyc}z.wafs_0p25.f${ffhr}.grib2 +# Convert template 5 to 5.40 +#$WGRIB2 tmp.gfs.t${cyc}z.wafs_0p25.f${ffhr}.grib2 -set_grib_type jpeg -grib_out gfs.t${cyc}z.wafs_0p25.f${ffhr}.grib2 +mv tmp.gfs.t${cyc}z.wafs_0p25.f${ffhr}.grib2 gfs.t${cyc}z.wafs_0p25.f${ffhr}.grib2 +$WGRIB2 -s gfs.t${cyc}z.wafs_0p25.f${ffhr}.grib2 > gfs.t${cyc}z.wafs_0p25.f${ffhr}.grib2.idx + +if [ $hazard_timewindow = 'yes' ] ; then +#--------------------------- +# Product 2: For AWC and Delta airline: EDPARM CAT MWT ICESEV CB gfs.tHHz.awf_0p25.fFFF.grib2 +#--------------------------- + criteria1=":EDPARM:|:ICESEV:|parm=37:" + 
criteria2=":CATEDR:|:MWTURB:" + criteria3=":CBHE:|:ICAHT:" + $WGRIB2 tmp_wafs_0p25.grb2 | egrep "${criteria1}|$criteria2|$criteria3" \ + | $WGRIB2 -i tmp_wafs_0p25.grb2 -grib gfs.t${cyc}z.awf_0p25.f${ffhr}.grib2 + $WGRIB2 -s gfs.t${cyc}z.awf_0p25.f${ffhr}.grib2 > gfs.t${cyc}z.awf_0p25.f${ffhr}.grib2.idx + +#--------------------------- +# Product 3: WAFS unblended EDPARM, ICESEV, CB (No CAT MWT) gfs.tHHz.wafs_0p25_unblended.fFF.grib2 +#--------------------------- + $WGRIB2 tmp_wafs_0p25.grb2 | grep -F -f $FIXgfs/grib2_0p25_gfs_hazard.list \ + | $WGRIB2 -i tmp_wafs_0p25.grb2 -set master_table 25 -grib tmp_wafs_0p25.grb2.forblend + + # Convert template 5 to 5.40 + #$WGRIB2 tmp_wafs_0p25.grb2.forblend -set_grib_type jpeg -grib_out gfs.t${cyc}z.wafs_0p25_unblended.f${ffhr2}.grib2 + mv tmp_wafs_0p25.grb2.forblend gfs.t${cyc}z.wafs_0p25_unblended.f${ffhr2}.grib2 + $WGRIB2 -s gfs.t${cyc}z.wafs_0p25_unblended.f${ffhr2}.grib2 > gfs.t${cyc}z.wafs_0p25_unblended.f${ffhr2}.grib2.idx +fi + +if [ $SENDCOM = "YES" ] ; then + + ############################## + # Post Files to COM + ############################## + + mv gfs.t${cyc}z.wafs_0p25.f${ffhr}.grib2 $COMOUT/gfs.t${cyc}z.wafs_0p25.f${ffhr}.grib2 + mv gfs.t${cyc}z.wafs_0p25.f${ffhr}.grib2.idx $COMOUT/gfs.t${cyc}z.wafs_0p25.f${ffhr}.grib2.idx + + if [ $hazard_timewindow = 'yes' ] ; then + mv gfs.t${cyc}z.awf_0p25.f${ffhr}.grib2 $COMOUT/gfs.t${cyc}z.awf_0p25.f${ffhr}.grib2 + mv gfs.t${cyc}z.awf_0p25.f${ffhr}.grib2.idx $COMOUT/gfs.t${cyc}z.awf_0p25.f${ffhr}.grib2.idx + + mv gfs.t${cyc}z.wafs_0p25_unblended.f${ffhr2}.grib2 $COMOUT/gfs.t${cyc}z.wafs_0p25_unblended.f${ffhr2}.grib2 + mv gfs.t${cyc}z.wafs_0p25_unblended.f${ffhr2}.grib2.idx $COMOUT/gfs.t${cyc}z.wafs_0p25_unblended.f${ffhr2}.grib2.idx + fi + + ############################# + # Post Files to PCOM + ############################## + ## mv gfs.t${cyc}z.wafs_0p25_unblended_wifs.f${ffhr2}.grib2 $PCOM/gfs.t${cyc}z.wafs_0p25_unblended_wifs.f${ffhr2}.grib2 +fi + + +if [ $SENDDBN 
= "YES" ] ; then + ###################### + # Distribute Data + ###################### + + if [ $hazard_timewindow = 'yes' ] ; then + # Hazard WAFS data (ICESEV EDR CAT MWT on 100mb to 1000mb or on new ICAO 2023 levels) sent to AWC and to NOMADS for US stakeholders + $DBNROOT/bin/dbn_alert MODEL GFS_AWF_0P25_GB2 $job $COMOUT/gfs.t${cyc}z.awf_0p25.f${ffhr}.grib2 + + # Unblended US WAFS data sent to UK for blending, to the same server as 1.25 deg unblended data: wmo/grib2.tCCz.wafs_grb_wifsfFF.45 + $DBNROOT/bin/dbn_alert MODEL GFS_WAFS_0P25_UBL_GB2 $job $COMOUT/gfs.t${cyc}z.wafs_0p25_unblended.f${ffhr2}.grib2 + fi + + # WAFS U/V/T/RH data sent to the same server as the unblended data as above + $DBNROOT/bin/dbn_alert MODEL GFS_WAFS_0P25_GB2 $job $COMOUT/gfs.t${cyc}z.wafs_0p25.f${ffhr}.grib2 + +fi + +################################################################################ +# GOOD RUN +set +x +echo "**************JOB EXGFS_ATMOS_WAFS_GRIB2_0P25.SH COMPLETED NORMALLY ON THE IBM" +echo "**************JOB EXGFS_ATMOS_WAFS_GRIB2_0P25.SH COMPLETED NORMALLY ON THE IBM" +echo "**************JOB EXGFS_ATMOS_WAFS_GRIB2_0P25.SH COMPLETED NORMALLY ON THE IBM" +set -x +################################################################################ + +echo "HAS COMPLETED NORMALLY!" 
+ +exit 0 + +############## END OF SCRIPT ####################### diff --git a/scripts/exgfs_pmgr.sh b/scripts/exgfs_pmgr.sh index a417bbed55..c3b9a5befa 100755 --- a/scripts/exgfs_pmgr.sh +++ b/scripts/exgfs_pmgr.sh @@ -6,7 +6,7 @@ # This script monitors the progress of the gfs_fcst job # -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" hour=00 TEND=384 diff --git a/scripts/exgfs_prdgen_manager.sh b/scripts/exgfs_prdgen_manager.sh index 7d0a95696b..01e8c58c87 100755 --- a/scripts/exgfs_prdgen_manager.sh +++ b/scripts/exgfs_prdgen_manager.sh @@ -6,7 +6,7 @@ # This script monitors the progress of the gfs_fcst job # -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" hour=00 TEND=384 diff --git a/scripts/exgfs_wave_init.sh b/scripts/exgfs_wave_init.sh index ce903a2284..79c2931889 100755 --- a/scripts/exgfs_wave_init.sh +++ b/scripts/exgfs_wave_init.sh @@ -26,7 +26,7 @@ # --------------------------------------------------------------------------- # # 0. Preparations -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # 0.a Basic modes of operation @@ -94,16 +94,16 @@ source "${HOMEgfs}/ush/preamble.sh" echo " Mod def file for ${grdID} not found in ${COM_WAVE_PREP}. Setting up to generate ..." echo ' ' set_trace - if [ -f $FIXwave/ww3_grid.inp.$grdID ] + if [ -f ${FIXgfs}/wave/ww3_grid.inp.$grdID ] then - cp $FIXwave/ww3_grid.inp.$grdID ww3_grid.inp.$grdID + cp ${FIXgfs}/wave/ww3_grid.inp.$grdID ww3_grid.inp.$grdID fi if [ -f ww3_grid.inp.$grdID ] then set +x echo ' ' - echo " ww3_grid.inp.$grdID copied ($FIXwave/ww3_grid.inp.$grdID)." + echo " ww3_grid.inp.$grdID copied (${FIXgfs}/wave/ww3_grid.inp.$grdID)." echo ' ' set_trace else @@ -118,11 +118,18 @@ source "${HOMEgfs}/ush/preamble.sh" err=2;export err;${errchk} fi + + if [ -f ${FIXgfs}/wave/${grdID}.msh ] + then + cp "${FIXgfs}/wave/${grdID}.msh" "${grdID}.msh" + fi + #TO DO: how do we say "it's unstructured, and therefore need to have error check here" + [[ ! 
-d "${COM_WAVE_PREP}" ]] && mkdir -m 775 -p "${COM_WAVE_PREP}" if [ ${CFP_MP:-"NO"} = "YES" ]; then - echo "$nmoddef $USHwave/wave_grid_moddef.sh $grdID > $grdID.out 2>&1" >> cmdfile + echo "$nmoddef ${USHgfs}/wave_grid_moddef.sh $grdID > $grdID.out 2>&1" >> cmdfile else - echo "$USHwave/wave_grid_moddef.sh $grdID > $grdID.out 2>&1" >> cmdfile + echo "${USHgfs}/wave_grid_moddef.sh $grdID > $grdID.out 2>&1" >> cmdfile fi nmoddef=$(expr $nmoddef + 1) @@ -166,7 +173,7 @@ source "${HOMEgfs}/ush/preamble.sh" exit=$? fi - if [ "$exit" != '0' ] + if [[ "$exit" != '0' ]] then set +x echo ' ' @@ -195,9 +202,9 @@ source "${HOMEgfs}/ush/preamble.sh" echo '********************************************** ' echo '*** FATAL ERROR : NO MODEL DEFINITION FILE *** ' echo '********************************************** ' - echo " grdID = $grdID" + echo " grdID = ${grdID}" echo ' ' - sed "s/^/$grdID.out : /g" $grdID.out + sed "s/^/${grdID}.out : /g" "${grdID}.out" set_trace err=3;export err;${errchk} fi diff --git a/scripts/exgfs_wave_nawips.sh b/scripts/exgfs_wave_nawips.sh index 63690ff1b0..69c4e54ebb 100755 --- a/scripts/exgfs_wave_nawips.sh +++ b/scripts/exgfs_wave_nawips.sh @@ -11,7 +11,7 @@ # March-2020 Roberto.Padilla@noaa.gov ##################################################################### -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" #export grids=${grids:-'glo_30m at_10m ep_10m wc_10m ao_9km'} #Interpolated grids export grids=${grids:-'glo_30m'} #Native grids @@ -24,7 +24,6 @@ export FHOUT_HF_WAV=${FHOUT_HF_WAV:-3} export maxtries=${maxtries:-720} export cycle=${cycle:-t${cyc}z} export GEMwave=${GEMwave:-${HOMEgfs}/gempak} -export FIXwave=${FIXwave:-${HOMEgfs}/fix/wave} export DATA=${DATA:-${DATAROOT:?}/${jobid}} if [ ! 
-d ${DATA} ];then mkdir -p ${DATA} diff --git a/scripts/exgfs_wave_post_gridded_sbs.sh b/scripts/exgfs_wave_post_gridded_sbs.sh index af362b1c45..be17967865 100755 --- a/scripts/exgfs_wave_post_gridded_sbs.sh +++ b/scripts/exgfs_wave_post_gridded_sbs.sh @@ -30,7 +30,7 @@ # --------------------------------------------------------------------------- # # 0. Preparations -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # 0.a Basic modes of operation @@ -139,9 +139,9 @@ source "$HOMEgfs/ush/preamble.sh" then for intGRD in $waveinterpGRD do - if [ -f $PARMwave/${intGRD}_interp.inp.tmpl ] + if [ -f ${PARMgfs}/wave/${intGRD}_interp.inp.tmpl ] then - cp -f $PARMwave/${intGRD}_interp.inp.tmpl ${intGRD}_interp.inp.tmpl + cp -f ${PARMgfs}/wave/${intGRD}_interp.inp.tmpl ${intGRD}_interp.inp.tmpl fi if [ -f ${intGRD}_interp.inp.tmpl ] @@ -168,9 +168,9 @@ source "$HOMEgfs/ush/preamble.sh" then for grbGRD in $waveinterpGRD $wavepostGRD do - if [ -f $PARMwave/ww3_grib2.${grbGRD}.inp.tmpl ] + if [ -f ${PARMgfs}/wave/ww3_grib2.${grbGRD}.inp.tmpl ] then - cp -f $PARMwave/ww3_grib2.${grbGRD}.inp.tmpl ww3_grib2.${grbGRD}.inp.tmpl + cp -f ${PARMgfs}/wave/ww3_grib2.${grbGRD}.inp.tmpl ww3_grib2.${grbGRD}.inp.tmpl fi if [ -f ww3_grib2.${grbGRD}.inp.tmpl ] @@ -279,7 +279,7 @@ source "$HOMEgfs/ush/preamble.sh" for grdID in $waveinterpGRD do ymdh_int=$($NDATE -${WAVHINDH} $ymdh); dt_int=3600.; n_int=9999 ; - echo "$USHwave/wave_grid_interp_sbs.sh $grdID $ymdh_int $dt_int $n_int > grint_$grdID.out 2>&1" >> ${fcmdigrd}.${nigrd} + echo "${USHgfs}/wave_grid_interp_sbs.sh $grdID $ymdh_int $dt_int $n_int > grint_$grdID.out 2>&1" >> ${fcmdigrd}.${nigrd} if [ "$DOGRB_WAV" = 'YES' ] then gribFL=\'$(echo ${OUTPARS_WAV})\' @@ -296,7 +296,7 @@ source "$HOMEgfs/ush/preamble.sh" wc_10m) GRDNAME='wcoast' ; GRDRES=0p16 ; GRIDNR=255 ; MODNR=11 ;; ak_10m) GRDNAME='alaska' ; GRDRES=0p16 ; GRIDNR=255 ; MODNR=11 ;; esac - echo "$USHwave/wave_grib2_sbs.sh $grdID $GRIDNR $MODNR $ymdh $fhr $GRDNAME 
$GRDRES $gribFL > grib_$grdID.out 2>&1" >> ${fcmdigrd}.${nigrd} + echo "${USHgfs}/wave_grib2_sbs.sh $grdID $GRIDNR $MODNR $ymdh $fhr $GRDNAME $GRDRES $gribFL > grib_$grdID.out 2>&1" >> ${fcmdigrd}.${nigrd} fi echo "${GRIBDATA}/${fcmdigrd}.${nigrd}" >> ${fcmdnow} chmod 744 ${fcmdigrd}.${nigrd} @@ -325,7 +325,7 @@ source "$HOMEgfs/ush/preamble.sh" glo_500) GRDNAME='global' ; GRDRES=5p00 ; GRIDNR=255 ; MODNR=11 ;; gwes_30m) GRDNAME='global' ; GRDRES=0p50 ; GRIDNR=255 ; MODNR=10 ;; esac - echo "$USHwave/wave_grib2_sbs.sh $grdID $GRIDNR $MODNR $ymdh $fhr $GRDNAME $GRDRES $gribFL > grib_$grdID.out 2>&1" >> ${fcmdnow} + echo "${USHgfs}/wave_grib2_sbs.sh $grdID $GRIDNR $MODNR $ymdh $fhr $GRDNAME $GRDRES $gribFL > grib_$grdID.out 2>&1" >> ${fcmdnow} done fi diff --git a/scripts/exgfs_wave_post_pnt.sh b/scripts/exgfs_wave_post_pnt.sh index a7aa957564..bc4c85ec74 100755 --- a/scripts/exgfs_wave_post_pnt.sh +++ b/scripts/exgfs_wave_post_pnt.sh @@ -32,7 +32,7 @@ # --------------------------------------------------------------------------- # # 0. Preparations -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # 0.a Basic modes of operation @@ -151,12 +151,16 @@ source "$HOMEgfs/ush/preamble.sh" rm -f buoy.loc - if [ -f $PARMwave/wave_${NET}.buoys ] + if [ -f ${PARMgfs}/wave/wave_${NET}.buoys ] then - cp -f $PARMwave/wave_${NET}.buoys buoy.loc.temp + cp -f ${PARMgfs}/wave/wave_${NET}.buoys buoy.loc.temp if [ "$DOBNDPNT_WAV" = YES ]; then #only do boundary points - sed -n '/^\$.*/!p' buoy.loc.temp | grep IBP > buoy.loc + sed -n '/^\$.*/!p' buoy.loc.temp | grep IBP > buoy.loc || { + echo "WARNING: No boundary points found in buoy file ${PARMgfs}/wave/wave_${NET}.buoys" + echo " Ending job without doing anything." 
+ exit 0 + } else #exclude boundary points sed -n '/^\$.*/!p' buoy.loc.temp | grep -v IBP > buoy.loc @@ -166,7 +170,7 @@ source "$HOMEgfs/ush/preamble.sh" if [ -s buoy.loc ] then set +x - echo " buoy.loc and buoy.ibp copied and processed ($PARMwave/wave_${NET}.buoys)." + echo " buoy.loc and buoy.ibp copied and processed (${PARMgfs}/wave/wave_${NET}.buoys)." set_trace else set +x @@ -184,9 +188,9 @@ source "$HOMEgfs/ush/preamble.sh" # 1.d Input template files - if [ -f $PARMwave/ww3_outp_spec.inp.tmpl ] + if [ -f ${PARMgfs}/wave/ww3_outp_spec.inp.tmpl ] then - cp -f $PARMwave/ww3_outp_spec.inp.tmpl ww3_outp_spec.inp.tmpl + cp -f ${PARMgfs}/wave/ww3_outp_spec.inp.tmpl ww3_outp_spec.inp.tmpl fi if [ -f ww3_outp_spec.inp.tmpl ] @@ -207,9 +211,9 @@ source "$HOMEgfs/ush/preamble.sh" DOBLL_WAV='NO' fi - if [ -f $PARMwave/ww3_outp_bull.inp.tmpl ] + if [ -f ${PARMgfs}/wave/ww3_outp_bull.inp.tmpl ] then - cp -f $PARMwave/ww3_outp_bull.inp.tmpl ww3_outp_bull.inp.tmpl + cp -f ${PARMgfs}/wave/ww3_outp_bull.inp.tmpl ww3_outp_bull.inp.tmpl fi if [ -f ww3_outp_bull.inp.tmpl ] @@ -262,7 +266,7 @@ source "$HOMEgfs/ush/preamble.sh" ln -fs ./out_pnt.${waveuoutpGRD} ./out_pnt.ww3 ln -fs ./mod_def.${waveuoutpGRD} ./mod_def.ww3 export pgm=ww3_outp;. prep_step - $EXECwave/ww3_outp > buoy_lst.loc 2>&1 + ${EXECgfs}/ww3_outp > buoy_lst.loc 2>&1 export err=$?;err_chk @@ -383,7 +387,7 @@ source "$HOMEgfs/ush/preamble.sh" export dtspec=3600. for buoy in $buoys do - echo "$USHwave/wave_outp_spec.sh $buoy $ymdh spec $SPECDATA > $SPECDATA/spec_$buoy.out 2>&1" >> tmpcmdfile.$FH3 + echo "${USHgfs}/wave_outp_spec.sh $buoy $ymdh spec $SPECDATA > $SPECDATA/spec_$buoy.out 2>&1" >> tmpcmdfile.$FH3 done fi @@ -392,7 +396,7 @@ source "$HOMEgfs/ush/preamble.sh" export dtspec=3600. 
for buoy in $buoys do - echo "$USHwave/wave_outp_spec.sh $buoy $ymdh bull $SPECDATA > $SPECDATA/bull_$buoy.out 2>&1" >> tmpcmdfile.$FH3 + echo "${USHgfs}/wave_outp_spec.sh $buoy $ymdh bull $SPECDATA > $SPECDATA/bull_$buoy.out 2>&1" >> tmpcmdfile.$FH3 done fi @@ -510,7 +514,7 @@ source "$HOMEgfs/ush/preamble.sh" then for buoy in $buoys do - echo "$USHwave/wave_outp_cat.sh $buoy $FHMAX_WAV_PNT spec > ${CATOUTDIR}/spec_cat_$buoy.out 2>&1" >> cmdfile.bouy + echo "${USHgfs}/wave_outp_cat.sh $buoy $FHMAX_WAV_PNT spec > ${CATOUTDIR}/spec_cat_$buoy.out 2>&1" >> cmdfile.bouy done fi @@ -518,7 +522,7 @@ source "$HOMEgfs/ush/preamble.sh" then for buoy in $buoys do - echo "$USHwave/wave_outp_cat.sh $buoy $FHMAX_WAV_PNT bull > ${CATOUTDIR}/bull_cat_$buoy.out 2>&1" >> cmdfile.bouy + echo "${USHgfs}/wave_outp_cat.sh $buoy $FHMAX_WAV_PNT bull > ${CATOUTDIR}/bull_cat_$buoy.out 2>&1" >> cmdfile.bouy done fi @@ -610,43 +614,43 @@ source "$HOMEgfs/ush/preamble.sh" if [ ${CFP_MP:-"NO"} = "YES" ] && [ "$DOBLL_WAV" = "YES" ]; then if [ "$DOBNDPNT_WAV" = YES ]; then if [ "$DOSPC_WAV" = YES ]; then - echo "$nm $USHwave/wave_tar.sh $WAV_MOD_TAG ibp $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile + echo "$nm ${USHgfs}/wave_tar.sh $WAV_MOD_TAG ibp $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile nm=$(( nm + 1 )) fi if [ "$DOBLL_WAV" = YES ]; then - echo "$nm $USHwave/wave_tar.sh $WAV_MOD_TAG ibpbull $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile + echo "$nm ${USHgfs}/wave_tar.sh $WAV_MOD_TAG ibpbull $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile nm=$(( nm + 1 )) - echo "$nm $USHwave/wave_tar.sh $WAV_MOD_TAG ibpcbull $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile + echo "$nm ${USHgfs}/wave_tar.sh $WAV_MOD_TAG ibpcbull $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile nm=$(( nm + 1 )) fi else if [ "$DOSPC_WAV" = YES ]; then - echo "$nm $USHwave/wave_tar.sh $WAV_MOD_TAG spec $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile + echo "$nm 
${USHgfs}/wave_tar.sh $WAV_MOD_TAG spec $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile nm=$(( nm + 1 )) fi if [ "$DOBLL_WAV" = YES ]; then - echo "$nm $USHwave/wave_tar.sh $WAV_MOD_TAG bull $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile + echo "$nm ${USHgfs}/wave_tar.sh $WAV_MOD_TAG bull $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile nm=$(( nm + 1 )) - echo "$nm $USHwave/wave_tar.sh $WAV_MOD_TAG cbull $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile + echo "$nm ${USHgfs}/wave_tar.sh $WAV_MOD_TAG cbull $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile nm=$(( nm + 1 )) fi fi else if [ "$DOBNDPNT_WAV" = YES ]; then if [ "$DOSPC_WAV" = YES ]; then - echo "$USHwave/wave_tar.sh $WAV_MOD_TAG ibp $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile + echo "${USHgfs}/wave_tar.sh $WAV_MOD_TAG ibp $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile fi if [ "$DOBLL_WAV" = YES ]; then - echo "$USHwave/wave_tar.sh $WAV_MOD_TAG ibpbull $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile - echo "$USHwave/wave_tar.sh $WAV_MOD_TAG ibpcbull $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile + echo "${USHgfs}/wave_tar.sh $WAV_MOD_TAG ibpbull $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile + echo "${USHgfs}/wave_tar.sh $WAV_MOD_TAG ibpcbull $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile fi else if [ "$DOSPC_WAV" = YES ]; then - echo "$USHwave/wave_tar.sh $WAV_MOD_TAG spec $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile + echo "${USHgfs}/wave_tar.sh $WAV_MOD_TAG spec $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile fi if [ "$DOBLL_WAV" = YES ]; then - echo "$USHwave/wave_tar.sh $WAV_MOD_TAG bull $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile - echo "$USHwave/wave_tar.sh $WAV_MOD_TAG cbull $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile + echo "${USHgfs}/wave_tar.sh $WAV_MOD_TAG bull $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile + echo "${USHgfs}/wave_tar.sh $WAV_MOD_TAG cbull $Nb > 
${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile fi fi fi diff --git a/scripts/exgfs_wave_prdgen_bulls.sh b/scripts/exgfs_wave_prdgen_bulls.sh index 2e6cb2071b..2bf90cdf2b 100755 --- a/scripts/exgfs_wave_prdgen_bulls.sh +++ b/scripts/exgfs_wave_prdgen_bulls.sh @@ -18,7 +18,7 @@ # --------------------------------------------------------------------------- # # 0. Preparations -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # 0.a Basic modes of operation @@ -29,12 +29,6 @@ source "$HOMEgfs/ush/preamble.sh" export cycle=${cycle:-t${cyc}z} export pgmout=OUTPUT.$$ export DATA=${DATA:-${DATAROOT:?}/${job}.$$} - #export CODEwave=${CODEwave:-${PACKAGEROOT}/${NET}_code.${wave_code_ver}/${code_pkg}} - export EXECwave=${EXECwave:-$HOMEgfs/exec} - export FIXwave=${FIXwave:-$HOMEgfs/fix} - export PARMwave=${PARMwave:-$HOMEgfs/parm/parm_wave} - export USHwave=${USHwave:-$HOMEgfs/ush} - #export EXECcode=${EXECcode:-CODEwave/exec} mkdir -p $DATA cd $DATA @@ -117,8 +111,8 @@ source "$HOMEgfs/ush/preamble.sh" echo ' --------------------------' echo ' ' # 1.c Get the datat cards - if [ -f $PARMwave/bull_awips_gfswave ]; then - cp $PARMwave/bull_awips_gfswave awipsbull.data + if [ -f ${PARMgfs}/wave/bull_awips_gfswave ]; then + cp ${PARMgfs}/wave/bull_awips_gfswave awipsbull.data else msg="ABNORMAL EXIT: NO AWIPS BULLETIN HEADER DATA FILE" set +x diff --git a/scripts/exgfs_wave_prdgen_gridded.sh b/scripts/exgfs_wave_prdgen_gridded.sh index b0cbc124ce..55c5b36827 100755 --- a/scripts/exgfs_wave_prdgen_gridded.sh +++ b/scripts/exgfs_wave_prdgen_gridded.sh @@ -19,7 +19,7 @@ # --------------------------------------------------------------------------- # # 0. 
Preparations -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # 0.a Basic modes of operation @@ -31,9 +31,6 @@ source "$HOMEgfs/ush/preamble.sh" export FHOUT_WAV=${FHOUT_WAV:-6} #from 72 to 180 inc=6 export FHOUT_HF_WAV=${FHOUT_HF_WAV:-3} export maxtries=720 - export FIXwave=${FIXwave:-$HOMEgfs/fix/wave} - export PARMwave=${PARMwave:-$HOMEgfs/parm/parm_wave} - export USHwave=${USHwave:-$HOMEgfs/ush} export cyc=${cyc:-00} export cycle=${cycle:-t${cyc}z} export pgmout=OUTPUT.$$ @@ -158,7 +155,7 @@ grids=${grids:-ak_10m at_10m ep_10m wc_10m glo_30m} # # 1.d Input template files - parmfile=$PARMwave/grib2_${RUNwave}.$grdOut.f${fhr} + parmfile=${PARMgfs}/wave/grib2_${RUNwave}.$grdOut.f${fhr} if [ -f $parmfile ]; then ln -s $parmfile awipsgrb.$grdID.f${fhr} else diff --git a/scripts/exgfs_wave_prep.sh b/scripts/exgfs_wave_prep.sh index be006c1c85..1fbe7dd767 100755 --- a/scripts/exgfs_wave_prep.sh +++ b/scripts/exgfs_wave_prep.sh @@ -23,7 +23,7 @@ # # # Update log # # Mar2007 HTolman - Added NCO note on resources on mist/dew # -# Apr2007 HTolman - Renaming mod_def files in $FIX_wave. # +# Apr2007 HTolman - Renaming mod_def files in ${FIXgfs}/wave. # # Mar2011 AChawla - Migrating to a vertical structure # # Nov2012 JHAlves - Transitioning to WCOSS # # Apr2019 JHAlves - Transitioning to GEFS workflow # @@ -40,7 +40,7 @@ # --------------------------------------------------------------------------- # # 0. Preparations -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # 0.a Basic modes of operation @@ -207,16 +207,16 @@ source "$HOMEgfs/ush/preamble.sh" ;; esac - if [ -f $PARMwave/ww3_prnc.${type}.$grdID.inp.tmpl ] + if [ -f ${PARMgfs}/wave/ww3_prnc.${type}.$grdID.inp.tmpl ] then - cp $PARMwave/ww3_prnc.${type}.$grdID.inp.tmpl . + cp ${PARMgfs}/wave/ww3_prnc.${type}.$grdID.inp.tmpl . fi if [ -f ww3_prnc.${type}.$grdID.inp.tmpl ] then set +x echo ' ' - echo " ww3_prnc.${type}.$grdID.inp.tmpl copied ($PARMwave)." 
+ echo " ww3_prnc.${type}.$grdID.inp.tmpl copied (${PARMgfs}/wave)." echo ' ' set_trace else @@ -247,7 +247,7 @@ source "$HOMEgfs/ush/preamble.sh" if [ "${RUNMEM}" = "-1" ] || [ "${WW3ICEIENS}" = "T" ] || [ "$waveMEMB" = "00" ] then - $USHwave/wave_prnc_ice.sh > wave_prnc_ice.out + ${USHgfs}/wave_prnc_ice.sh > wave_prnc_ice.out ERR=$? if [ -d ice ] @@ -389,10 +389,10 @@ source "$HOMEgfs/ush/preamble.sh" fi if [ ${CFP_MP:-"NO"} = "YES" ]; then - echo "$nm $USHwave/wave_prnc_cur.sh $ymdh_rtofs $curfile $fhr_rtofs $FLGFIRST > cur_$ymdh_rtofs.out 2>&1" >> cmdfile + echo "$nm ${USHgfs}/wave_prnc_cur.sh $ymdh_rtofs $curfile $fhr_rtofs $FLGFIRST > cur_$ymdh_rtofs.out 2>&1" >> cmdfile nm=$(expr $nm + 1) else - echo "$USHwave/wave_prnc_cur.sh $ymdh_rtofs $curfile $fhr_rtofs $FLGFIRST > cur_$ymdh_rtofs.out 2>&1" >> cmdfile + echo "${USHgfs}/wave_prnc_cur.sh $ymdh_rtofs $curfile $fhr_rtofs $FLGFIRST > cur_$ymdh_rtofs.out 2>&1" >> cmdfile fi if [ "${FLGFIRST}" = "T" ] ; then diff --git a/scripts/exglobal_archive_emc.sh b/scripts/exglobal_archive_emc.sh index 2f7e3be972..5842c76b57 100755 --- a/scripts/exglobal_archive_emc.sh +++ b/scripts/exglobal_archive_emc.sh @@ -1,6 +1,6 @@ #! /usr/bin/env bash -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" ############################################## # Begin JOB SPECIFIC work @@ -29,10 +29,11 @@ PDY_MOS="${CDATE_MOS:0:8}" ############################################################### # Archive online for verification and diagnostics ############################################################### -source "${HOMEgfs}/ush/file_utils.sh" +source "${USHgfs}/file_utils.sh" [[ ! 
-d ${ARCDIR} ]] && mkdir -p "${ARCDIR}" nb_copy "${COM_ATMOS_ANALYSIS}/${APREFIX}gsistat" "${ARCDIR}/gsistat.${RUN}.${PDY}${cyc}" +nb_copy "${COM_SNOW_ANALYSIS}/${APREFIX}snowstat" "${ARCDIR}/snowstat.${RUN}.${PDY}${cyc}" if [[ ${DO_AERO} = "YES" ]]; then nb_copy "${COM_CHEM_ANALYSIS}/${APREFIX}aerostat" "${ARCDIR}/aerostat.${RUN}.${PDY}${cyc}" fi @@ -158,10 +159,10 @@ if [[ ${HPSSARCH} = "YES" || ${LOCALARCH} = "YES" ]]; then cd "${DATA}" || exit 2 - "${HOMEgfs}/ush/hpssarch_gen.sh" "${RUN}" + "${USHgfs}/hpssarch_gen.sh" "${RUN}" status=$? if [ "${status}" -ne 0 ]; then - echo "${HOMEgfs}/ush/hpssarch_gen.sh ${RUN} failed, ABORT!" + echo "${USHgfs}/hpssarch_gen.sh ${RUN} failed, ABORT!" exit "${status}" fi @@ -182,12 +183,12 @@ if [[ ${HPSSARCH} = "YES" || ${LOCALARCH} = "YES" ]]; then targrp_list="${targrp_list} gfswave" fi - if [ "${DO_OCN}" = "YES" ]; then - targrp_list="${targrp_list} ocn_ice_grib2_0p5 ocn_ice_grib2_0p25 ocn_2D ocn_3D ocn_xsect ocn_daily gfs_flux_1p00" + if [[ "${DO_OCN}" == "YES" ]]; then + targrp_list="${targrp_list} ocean_6hravg ocean_daily ocean_grib2 gfs_flux_1p00" fi - if [ "${DO_ICE}" = "YES" ]; then - targrp_list="${targrp_list} ice" + if [[ "${DO_ICE}" == "YES" ]]; then + targrp_list="${targrp_list} ice_6hravg ice_grib2" fi # Aerosols @@ -291,7 +292,7 @@ if [[ ${HPSSARCH} = "YES" || ${LOCALARCH} = "YES" ]]; then stat_chgrp=$? ${HSICMD} chmod 640 "${tar_fl}" stat_chgrp=$((stat_chgrp+$?)) - if [ "${stat_chgrp}" -gt 0 ]; then + if [[ "${stat_chgrp}" -gt 0 ]]; then echo "FATAL ERROR: Unable to properly restrict ${tar_fl}!" echo "Attempting to delete ${tar_fl}" ${HSICMD} rm "${tar_fl}" diff --git a/scripts/exglobal_archive_gsl.sh b/scripts/exglobal_archive_gsl.sh index b84fe345c2..c1aac3a462 100755 --- a/scripts/exglobal_archive_gsl.sh +++ b/scripts/exglobal_archive_gsl.sh @@ -1,6 +1,6 @@ #! 
/usr/bin/env bash -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" ############################################## # Begin JOB SPECIFIC work @@ -158,17 +158,17 @@ if [[ ${HPSSARCH} = "YES" || ${LOCALARCH} = "YES" ]]; then cd "${DATA}" || exit 2 - "${HOMEgfs}/ush/hpssarch_gen.sh" "${RUN}" + "${USHgfs}/hpssarch_gen.sh" "${RUN}" status=$? - if [[ "${status}" -ne 0 ]]; then - echo "${HOMEgfs}/ush/hpssarch_gen.sh ${RUN} failed, ABORT!" + if [ "${status}" -ne 0 ]; then + echo "${USHgfs}/hpssarch_gen.sh ${RUN} failed, ABORT!" exit "${status}" fi cd "${ROTDIR}" || exit 2 if [[ "${RUN}" = "gfs" ]]; then - +#######JKH session starts here!!!! ######## targrp_list="gfs_pgrb2" if [[ "${ARCH_GAUSSIAN:-"NO"}" = "YES" ]]; then @@ -180,10 +180,11 @@ if [[ ${HPSSARCH} = "YES" || ${LOCALARCH} = "YES" ]]; then targrp_list="${targrp_list} gfs_ics" fi +#######JKH session ends here!!!! ######## fi # Turn on extended globbing options - yyyy="${PDY:0:4}" + yyyy="${PDY:0:4}" ##JKH shopt -s extglob for targrp in ${targrp_list}; do set +e @@ -208,7 +209,7 @@ if [[ ${HPSSARCH} = "YES" || ${LOCALARCH} = "YES" ]]; then esac # Create the tarball - tar_fl="${ATARDIR}/${yyyy}/${PDY}${cyc}/${targrp}.tar" + tar_fl="${ATARDIR}/${yyyy}/${PDY}${cyc}/${targrp}.tar" ##JKH ${TARCMD} -P -cvf "${tar_fl}" $(cat "${DATA}/${targrp}.txt") status=$? @@ -240,4 +241,4 @@ if [[ ${HPSSARCH} = "YES" || ${LOCALARCH} = "YES" ]]; then ############################################################### fi ##end of HPSS archive ############################################################### - + ##JKH diff --git a/scripts/exglobal_atmos_analysis.sh b/scripts/exglobal_atmos_analysis.sh index cb3c6467a1..575c112675 100755 --- a/scripts/exglobal_atmos_analysis.sh +++ b/scripts/exglobal_atmos_analysis.sh @@ -19,7 +19,7 @@ # Set environment. -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # Directories. 
 pwd=$(pwd)
@@ -42,7 +42,7 @@ export NCP=${NCP:-"/bin/cp"}
 export NMV=${NMV:-"/bin/mv"}
 export NLN=${NLN:-"/bin/ln -sf"}
 export CHGRP_CMD=${CHGRP_CMD:-"chgrp ${group_name:-rstprod}"}
-export NCLEN=${NCLEN:-${HOMEgfs}/ush/getncdimlen}
+export NCLEN=${NCLEN:-${USHgfs}/getncdimlen}
 COMPRESS=${COMPRESS:-gzip}
 UNCOMPRESS=${UNCOMPRESS:-gunzip}
 APRUNCFP=${APRUNCFP:-""}
@@ -68,19 +68,19 @@ DOIAU=${DOIAU:-"NO"}
 export IAUFHRS=${IAUFHRS:-"6"}
 # Dependent Scripts and Executables
-GSIEXEC=${GSIEXEC:-${HOMEgfs}/exec/gsi.x}
+GSIEXEC=${GSIEXEC:-${EXECgfs}/gsi.x}
 export NTHREADS_CALCINC=${NTHREADS_CALCINC:-1}
 export APRUN_CALCINC=${APRUN_CALCINC:-${APRUN:-""}}
 export APRUN_CALCANL=${APRUN_CALCANL:-${APRUN:-""}}
 export APRUN_CHGRES=${APRUN_CALCANL:-${APRUN:-""}}
-export CALCINCEXEC=${CALCINCEXEC:-${HOMEgfs}/exec/calc_increment_ens.x}
-export CALCINCNCEXEC=${CALCINCNCEXEC:-${HOMEgfs}/exec/calc_increment_ens_ncio.x}
-export CALCANLEXEC=${CALCANLEXEC:-${HOMEgfs}/exec/calc_analysis.x}
-export CHGRESNCEXEC=${CHGRESNCEXEC:-${HOMEgfs}/exec/enkf_chgres_recenter_nc.x}
-export CHGRESINCEXEC=${CHGRESINCEXEC:-${HOMEgfs}/exec/interp_inc.x}
-CHGRESEXEC=${CHGRESEXEC:-${HOMEgfs}/exec/enkf_chgres_recenter.x}
+export CALCINCEXEC=${CALCINCEXEC:-${EXECgfs}/calc_increment_ens.x}
+export CALCINCNCEXEC=${CALCINCNCEXEC:-${EXECgfs}/calc_increment_ens_ncio.x}
+export CALCANLEXEC=${CALCANLEXEC:-${EXECgfs}/calc_analysis.x}
+export CHGRESNCEXEC=${CHGRESNCEXEC:-${EXECgfs}/enkf_chgres_recenter_nc.x}
+export CHGRESINCEXEC=${CHGRESINCEXEC:-${EXECgfs}/interp_inc.x}
+CHGRESEXEC=${CHGRESEXEC:-${EXECgfs}/enkf_chgres_recenter.x}
 export NTHREADS_CHGRES=${NTHREADS_CHGRES:-24}
-CALCINCPY=${CALCINCPY:-${HOMEgfs}/ush/calcinc_gfs.py}
+CALCINCPY=${CALCINCPY:-${USHgfs}/calcinc_gfs.py}
 # OPS flags
 RUN=${RUN:-""}
@@ -89,6 +89,8 @@ SENDDBN=${SENDDBN:-"NO"}
 RUN_GETGES=${RUN_GETGES:-"NO"}
 GETGESSH=${GETGESSH:-"getges.sh"}
 export gesenvir=${gesenvir:-${envir}}
+
+export hofx_2m_sfcfile=${hofx_2m_sfcfile:-".false."}
 # Observations
 OPREFIX=${OPREFIX:-""}
@@ -289,21 +291,21 @@ else
 fi
 # GSI Fix files
-BERROR=${BERROR:-${FIXgsi}/Big_Endian/global_berror.l${LEVS}y${NLAT_A}.f77}
-SATANGL=${SATANGL:-${FIXgsi}/global_satangbias.txt}
-SATINFO=${SATINFO:-${FIXgsi}/global_satinfo.txt}
-RADCLOUDINFO=${RADCLOUDINFO:-${FIXgsi}/cloudy_radiance_info.txt}
-ATMSFILTER=${ATMSFILTER:-${FIXgsi}/atms_beamwidth.txt}
-ANAVINFO=${ANAVINFO:-${FIXgsi}/global_anavinfo.l${LEVS}.txt}
-CONVINFO=${CONVINFO:-${FIXgsi}/global_convinfo.txt}
-vqcdat=${vqcdat:-${FIXgsi}/vqctp001.dat}
-INSITUINFO=${INSITUINFO:-${FIXgsi}/global_insituinfo.txt}
-OZINFO=${OZINFO:-${FIXgsi}/global_ozinfo.txt}
-PCPINFO=${PCPINFO:-${FIXgsi}/global_pcpinfo.txt}
-AEROINFO=${AEROINFO:-${FIXgsi}/global_aeroinfo.txt}
-SCANINFO=${SCANINFO:-${FIXgsi}/global_scaninfo.txt}
-HYBENSINFO=${HYBENSINFO:-${FIXgsi}/global_hybens_info.l${LEVS}.txt}
-OBERROR=${OBERROR:-${FIXgsi}/prepobs_errtable.global}
+BERROR=${BERROR:-${FIXgfs}/gsi/Big_Endian/global_berror.l${LEVS}y${NLAT_A}.f77}
+SATANGL=${SATANGL:-${FIXgfs}/gsi/global_satangbias.txt}
+SATINFO=${SATINFO:-${FIXgfs}/gsi/global_satinfo.txt}
+RADCLOUDINFO=${RADCLOUDINFO:-${FIXgfs}/gsi/cloudy_radiance_info.txt}
+ATMSFILTER=${ATMSFILTER:-${FIXgfs}/gsi/atms_beamwidth.txt}
+ANAVINFO=${ANAVINFO:-${FIXgfs}/gsi/global_anavinfo.l${LEVS}.txt}
+CONVINFO=${CONVINFO:-${FIXgfs}/gsi/global_convinfo.txt}
+vqcdat=${vqcdat:-${FIXgfs}/gsi/vqctp001.dat}
+INSITUINFO=${INSITUINFO:-${FIXgfs}/gsi/global_insituinfo.txt}
+OZINFO=${OZINFO:-${FIXgfs}/gsi/global_ozinfo.txt}
+PCPINFO=${PCPINFO:-${FIXgfs}/gsi/global_pcpinfo.txt}
+AEROINFO=${AEROINFO:-${FIXgfs}/gsi/global_aeroinfo.txt}
+SCANINFO=${SCANINFO:-${FIXgfs}/gsi/global_scaninfo.txt}
+HYBENSINFO=${HYBENSINFO:-${FIXgfs}/gsi/global_hybens_info.l${LEVS}.txt}
+OBERROR=${OBERROR:-${FIXgfs}/gsi/prepobs_errtable.global}
 # GSI namelist
 SETUP=${SETUP:-""}
@@ -381,8 +383,8 @@ ${NLN} ${OBERROR} errtable
 #If using correlated error, link to the covariance files
 if [ ${USE_CORRELATED_OBERRS} == "YES" ]; then
 if grep -q "Rcov" ${ANAVINFO} ; then
- if ls ${FIXgsi}/Rcov* 1> /dev/null 2>&1; then
- ${NLN} ${FIXgsi}/Rcov* ${DATA}
+ if ls ${FIXgfs}/gsi/Rcov* 1> /dev/null 2>&1; then
+ ${NLN} ${FIXgfs}/gsi/Rcov* ${DATA}
 echo "using correlated obs error"
 else
 echo "FATAL ERROR: Satellite error covariance files (Rcov) are missing."
@@ -748,6 +750,7 @@ cat > gsiparm.anl << EOF
 /
 &OBS_INPUT
 dmesh(1)=145.0,dmesh(2)=150.0,dmesh(3)=100.0,dmesh(4)=50.0,time_window_max=3.0,
+ hofx_2m_sfcfile=${hofx_2m_sfcfile},
 ${OBSINPUT}
 /
 OBS_INPUT::
diff --git a/scripts/exglobal_atmos_analysis_calc.sh b/scripts/exglobal_atmos_analysis_calc.sh
index a2086aa927..771221ce1d 100755
--- a/scripts/exglobal_atmos_analysis_calc.sh
+++ b/scripts/exglobal_atmos_analysis_calc.sh
@@ -19,11 +19,10 @@
 # Set environment.
-source "$HOMEgfs/ush/preamble.sh"
+source "${USHgfs}/preamble.sh"
 # Directories.
 pwd=$(pwd)
-export FIXam=${FIXam:-$HOMEgfs/fix/am}
 # Base variables
 CDUMP=${CDUMP:-"gdas"}
@@ -34,7 +33,7 @@ export NCP=${NCP:-"/bin/cp"}
 export NMV=${NMV:-"/bin/mv"}
 export NLN=${NLN:-"/bin/ln -sf"}
 export CHGRP_CMD=${CHGRP_CMD:-"chgrp ${group_name:-rstprod}"}
-export NCLEN=${NCLEN:-$HOMEgfs/ush/getncdimlen}
+export NCLEN=${NCLEN:-${USHgfs}/getncdimlen}
 COMPRESS=${COMPRESS:-gzip}
 UNCOMPRESS=${UNCOMPRESS:-gunzip}
 APRUNCFP=${APRUNCFP:-""}
@@ -53,16 +52,16 @@ export APRUN_CALCINC=${APRUN_CALCINC:-${APRUN:-""}}
 export APRUN_CALCANL=${APRUN_CALCANL:-${APRUN:-""}}
 export APRUN_CHGRES=${APRUN_CALCANL:-${APRUN:-""}}
-export CALCANLEXEC=${CALCANLEXEC:-$HOMEgfs/exec/calc_analysis.x}
-export CHGRESNCEXEC=${CHGRESNCEXEC:-$HOMEgfs/exec/enkf_chgres_recenter_nc.x}
-export CHGRESINCEXEC=${CHGRESINCEXEC:-$HOMEgfs/exec/interp_inc.x}
+export CALCANLEXEC=${CALCANLEXEC:-${EXECgfs}/calc_analysis.x}
+export CHGRESNCEXEC=${CHGRESNCEXEC:-${EXECgfs}/enkf_chgres_recenter_nc.x}
+export CHGRESINCEXEC=${CHGRESINCEXEC:-${EXECgfs}/interp_inc.x}
 export NTHREADS_CHGRES=${NTHREADS_CHGRES:-1}
-CALCINCPY=${CALCINCPY:-$HOMEgfs/ush/calcinc_gfs.py}
-CALCANLPY=${CALCANLPY:-$HOMEgfs/ush/calcanl_gfs.py}
+CALCINCPY=${CALCINCPY:-${USHgfs}/calcinc_gfs.py}
+CALCANLPY=${CALCANLPY:-${USHgfs}/calcanl_gfs.py}
 DOGAUSFCANL=${DOGAUSFCANL-"NO"}
-GAUSFCANLSH=${GAUSFCANLSH:-$HOMEgfs/ush/gaussian_sfcanl.sh}
-export GAUSFCANLEXE=${GAUSFCANLEXE:-$HOMEgfs/exec/gaussian_sfcanl.x}
+GAUSFCANLSH=${GAUSFCANLSH:-${USHgfs}/gaussian_sfcanl.sh}
+export GAUSFCANLEXE=${GAUSFCANLEXE:-${EXECgfs}/gaussian_sfcanl.x}
 NTHREADS_GAUSFCANL=${NTHREADS_GAUSFCANL:-1}
 APRUN_GAUSFCANL=${APRUN_GAUSFCANL:-${APRUN:-""}}
diff --git a/scripts/exglobal_atmos_pmgr.sh b/scripts/exglobal_atmos_pmgr.sh
index 86afed962e..7f348474b6 100755
--- a/scripts/exglobal_atmos_pmgr.sh
+++ b/scripts/exglobal_atmos_pmgr.sh
@@ -6,7 +6,7 @@
 # This script monitors the progress of the gfs_fcst job
 #
-source "$HOMEgfs/ush/preamble.sh"
+source "${USHgfs}/preamble.sh"
 hour=00
diff --git a/scripts/exglobal_atmos_products.sh b/scripts/exglobal_atmos_products.sh
index 5f0b1db6cf..9067819380 100755
--- a/scripts/exglobal_atmos_products.sh
+++ b/scripts/exglobal_atmos_products.sh
@@ -1,18 +1,22 @@
 #! /usr/bin/env bash
-source "${HOMEgfs}/ush/preamble.sh"
+source "${USHgfs}/preamble.sh"
 # Programs used
 export WGRIB2=${WGRIB2:-${wgrib2_ROOT}/bin/wgrib2}
 # Scripts used
-INTERP_ATMOS_MASTERSH=${INTERP_ATMOS_MASTERSH:-"${HOMEgfs}/ush/interp_atmos_master.sh"}
-INTERP_ATMOS_SFLUXSH=${INTERP_ATMOS_SFLUXSH:-"${HOMEgfs}/ush/interp_atmos_sflux.sh"}
+INTERP_ATMOS_MASTERSH=${INTERP_ATMOS_MASTERSH:-"${USHgfs}/interp_atmos_master.sh"}
+INTERP_ATMOS_SFLUXSH=${INTERP_ATMOS_SFLUXSH:-"${USHgfs}/interp_atmos_sflux.sh"}
 # Variables used in this job
 downset=${downset:-1} # No. of groups of pressure grib2 products to create
 npe_atmos_products=${npe_atmos_products:-8} # no. of processors available to process each group
+# WGNE related options
+WGNE=${WGNE:-NO} # Create WGNE products
+FHMAX_WGNE=${FHMAX_WGNE:-0} # WGNE products are created for first FHMAX_WGNE forecast hours (except 0)
+
 cd "${DATA}" || exit 1
 # Set paramlist files based on FORECAST_HOUR (-1, 0, 3, 6, etc.)
@@ -129,7 +133,7 @@ for (( nset=1 ; nset <= downset ; nset++ )); do
 # Run with MPMD or serial
 if [[ "${USE_CFP:-}" = "YES" ]]; then
- OMP_NUM_THREADS=1 "${HOMEgfs}/ush/run_mpmd.sh" "${DATA}/poescript"
+ OMP_NUM_THREADS=1 "${USHgfs}/run_mpmd.sh" "${DATA}/poescript"
 export err=$?
 else
 chmod 755 "${DATA}/poescript"
@@ -167,14 +171,18 @@ done # for (( nset=1 ; nset <= downset ; nset++ ))
 #---------------------------------------------------------------
+# Create the index file for the sflux master, if it exists.
+FLUX_FILE="${COM_ATMOS_MASTER}/${PREFIX}sfluxgrb${fhr3}.grib2"
+if [[ -s "${FLUX_FILE}" ]]; then
+ ${WGRIB2} -s "${FLUX_FILE}" > "${FLUX_FILE}.idx"
+fi
+
 # Section creating slfux grib2 interpolated products
 # Create 1-degree sflux grib2 output
 # move to COM and index it
 if [[ "${FLXGF:-}" == "YES" ]]; then
 # Files needed by ${INTERP_ATMOS_SFLUXSH}
- FLUX_FILE="${COM_ATMOS_MASTER}/${PREFIX}sfluxgrb${fhr3}.grib2"
-
 input_file="${FLUX_FILE}"
 output_file_prefix="sflux_${fhr3}"
 grid_string="1p00"
@@ -190,6 +198,15 @@ if [[ "${FLXGF:-}" == "YES" ]]; then
 done
 fi
+# Section creating 0.25 degree WGNE products for nset=1, and fhr <= FHMAX_WGNE
+if [[ "${WGNE:-}" == "YES" ]]; then
+ grp="" # TODO: this should be "a" when we eventually rename the pressure grib2 files per EE2 convention
+ if (( FORECAST_HOUR > 0 & FORECAST_HOUR <= FHMAX_WGNE )); then
+ # TODO: 597 is the message number for APCP in GFSv16. GFSv17 may change this as more messages are added. This can be controlled via config.atmos_products
+ ${WGRIB2} "${COM_ATMOS_GRIB_0p25}/${PREFIX}pgrb2${grp}.0p25.${fhr3}" -d "${APCP_MSG:-597}" -grib "${COM_ATMOS_GRIB_0p25}/${PREFIX}wgne.${fhr3}"
+ fi
+fi
+
 #---------------------------------------------------------------
 # Start sending DBN alerts
@@ -200,18 +217,21 @@ if [[ "${SENDDBN:-}" == "YES" ]]; then
 if [[ "${RUN}" == "gfs" ]]; then
 "${DBNROOT}/bin/dbn_alert" MODEL "${RUN^^}_PGB2B_0P25" "${job}" "${COM_ATMOS_GRIB_0p25}/${PREFIX}pgrb2b.0p25.${fhr3}"
 "${DBNROOT}/bin/dbn_alert" MODEL "${RUN^^}_PGB2B_0P25_WIDX" "${job}" "${COM_ATMOS_GRIB_0p25}/${PREFIX}pgrb2b.0p25.${fhr3}.idx"
- if [[ -s "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2.0p50.f${fhr3}" ]]; then
+ if [[ -s "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2.0p50.${fhr3}" ]]; then
 "${DBNROOT}/bin/dbn_alert" MODEL "${RUN^^}_PGB2_0P5" "${job}" "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2.0p50.${fhr3}"
 "${DBNROOT}/bin/dbn_alert" MODEL "${RUN^^}_PGB2_0P5_WIDX" "${job}" "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2.0p50.${fhr3}.idx"
 "${DBNROOT}/bin/dbn_alert" MODEL "${RUN^^}_PGB2B_0P5" "${job}" "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2b.0p50.${fhr3}"
 "${DBNROOT}/bin/dbn_alert" MODEL "${RUN^^}_PGB2B_0P5_WIDX" "${job}" "${COM_ATMOS_GRIB_0p50}/${PREFIX}pgrb2b.0p50.${fhr3}.idx"
 fi
- if [[ -s "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2.1p00.f${fhr3}" ]]; then
+ if [[ -s "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2.1p00.${fhr3}" ]]; then
 "${DBNROOT}/bin/dbn_alert" MODEL "${RUN^^}_PGB2_1P0" "${job}" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2.1p00.${fhr3}"
 "${DBNROOT}/bin/dbn_alert" MODEL "${RUN^^}_PGB2_1P0_WIDX" "${job}" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2.1p00.${fhr3}.idx"
 "${DBNROOT}/bin/dbn_alert" MODEL "${RUN^^}_PGB2B_1P0" "${job}" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2b.1p00.${fhr3}"
 "${DBNROOT}/bin/dbn_alert" MODEL "${RUN^^}_PGB2B_1P0_WIDX" "${job}" "${COM_ATMOS_GRIB_1p00}/${PREFIX}pgrb2b.1p00.${fhr3}.idx"
 fi
+ if [[ "${WGNE:-}" == "YES" ]] && [[ -s "${COM_ATMOS_GRIB_0p25}/${PREFIX}wgne.${fhr3}" ]] ; then
+ "${DBNROOT}/bin/dbn_alert" MODEL "${RUN^^}_WGNE" "${job}" "${COM_ATMOS_GRIB_0p25}/${PREFIX}wgne.${fhr3}" + fi fi if [[ "${fhr3}" == "anl" ]]; then diff --git a/scripts/exglobal_atmos_sfcanl.sh b/scripts/exglobal_atmos_sfcanl.sh index 2997ac0d25..fd358f4508 100755 --- a/scripts/exglobal_atmos_sfcanl.sh +++ b/scripts/exglobal_atmos_sfcanl.sh @@ -19,7 +19,7 @@ # Set environment. -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # Directories. pwd=$(pwd) @@ -37,7 +37,7 @@ export NCP=${NCP:-"/bin/cp"} export NMV=${NMV:-"/bin/mv"} export NLN=${NLN:-"/bin/ln -sf"} export CHGRP_CMD=${CHGRP_CMD:-"chgrp ${group_name:-rstprod}"} -export NCLEN=${NCLEN:-$HOMEgfs/ush/getncdimlen} +export NCLEN=${NCLEN:-${USHgfs}/getncdimlen} COMPRESS=${COMPRESS:-gzip} UNCOMPRESS=${UNCOMPRESS:-gunzip} APRUNCFP=${APRUNCFP:-""} @@ -47,16 +47,14 @@ DOIAU=${DOIAU:-"NO"} export IAUFHRS=${IAUFHRS:-"6"} # Surface cycle related parameters -CYCLESH=${CYCLESH:-${HOMEgfs}/ush/global_cycle.sh} -export CYCLEXEC=${CYCLEXEC:-${HOMEgfs}/exec/global_cycle} +CYCLESH=${CYCLESH:-${USHgfs}/global_cycle.sh} +export CYCLEXEC=${CYCLEXEC:-${EXECgfs}/global_cycle} NTHREADS_CYCLE=${NTHREADS_CYCLE:-24} APRUN_CYCLE=${APRUN_CYCLE:-${APRUN:-""}} export SNOW_NUDGE_COEFF=${SNOW_NUDGE_COEFF:-'-2.'} export CYCLVARS=${CYCLVARS:-""} export FHOUR=${FHOUR:-0} export DELTSFC=${DELTSFC:-6} -export FIXam=${FIXam:-${HOMEgfs}/fix/am} -export FIXorog=${FIXorog:-${HOMEgfs}/fix/orog} # FV3 specific info (required for global_cycle) export CASE=${CASE:-"C384"} @@ -72,15 +70,15 @@ export APRUN_CALCINC=${APRUN_CALCINC:-${APRUN:-""}} export APRUN_CALCANL=${APRUN_CALCANL:-${APRUN:-""}} export APRUN_CHGRES=${APRUN_CALCANL:-${APRUN:-""}} -export CALCANLEXEC=${CALCANLEXEC:-${HOMEgfs}/exec/calc_analysis.x} -export CHGRESNCEXEC=${CHGRESNCEXEC:-${HOMEgfs}/exec/enkf_chgres_recenter_nc.x} -export CHGRESINCEXEC=${CHGRESINCEXEC:-${HOMEgfs}/exec/interp_inc.x} +export CALCANLEXEC=${CALCANLEXEC:-${EXECgfs}/calc_analysis.x} +export 
CHGRESNCEXEC=${CHGRESNCEXEC:-${EXECgfs}/enkf_chgres_recenter_nc.x} +export CHGRESINCEXEC=${CHGRESINCEXEC:-${EXECgfs}/interp_inc.x} export NTHREADS_CHGRES=${NTHREADS_CHGRES:-1} -CALCINCPY=${CALCINCPY:-${HOMEgfs}/ush/calcinc_gfs.py} -CALCANLPY=${CALCANLPY:-${HOMEgfs}/ush/calcanl_gfs.py} +CALCINCPY=${CALCINCPY:-${USHgfs}/calcinc_gfs.py} +CALCANLPY=${CALCANLPY:-${USHgfs}/calcanl_gfs.py} export APRUN_CHGRES=${APRUN_CALCANL:-${APRUN:-""}} -CHGRESEXEC=${CHGRESEXEC:-${HOMEgfs}/exec/enkf_chgres_recenter.x} +CHGRESEXEC=${CHGRESEXEC:-${EXECgfs}/enkf_chgres_recenter.x} # OPS flags RUN=${RUN:-""} @@ -176,8 +174,8 @@ if [[ ${DOIAU} = "YES" ]]; then "${COM_ATMOS_RESTART}/${bPDY}.${bcyc}0000.sfcanl_data.tile${n}.nc" ${NLN} "${COM_ATMOS_RESTART_PREV}/${bPDY}.${bcyc}0000.sfc_data.tile${n}.nc" "${DATA}/fnbgsi.00${n}" ${NLN} "${COM_ATMOS_RESTART}/${bPDY}.${bcyc}0000.sfcanl_data.tile${n}.nc" "${DATA}/fnbgso.00${n}" - ${NLN} "${FIXorog}/${CASE}/${CASE}_grid.tile${n}.nc" "${DATA}/fngrid.00${n}" - ${NLN} "${FIXorog}/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile${n}.nc" "${DATA}/fnorog.00${n}" + ${NLN} "${FIXgfs}/orog/${CASE}/${CASE}_grid.tile${n}.nc" "${DATA}/fngrid.00${n}" + ${NLN} "${FIXgfs}/orog/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile${n}.nc" "${DATA}/fnorog.00${n}" done export APRUNCY=${APRUN_CYCLE} @@ -190,8 +188,8 @@ fi # Update surface restarts at middle of window for n in $(seq 1 ${ntiles}); do - if [[ ${DO_JEDILANDDA:-"NO"} = "YES" ]]; then - ${NCP} "${COM_LAND_ANALYSIS}/${PDY}.${cyc}0000.sfc_data.tile${n}.nc" \ + if [[ ${DO_JEDISNOWDA:-"NO"} = "YES" ]]; then + ${NCP} "${COM_SNOW_ANALYSIS}/${PDY}.${cyc}0000.sfc_data.tile${n}.nc" \ "${COM_ATMOS_RESTART}/${PDY}.${cyc}0000.sfcanl_data.tile${n}.nc" else ${NCP} "${COM_ATMOS_RESTART_PREV}/${PDY}.${cyc}0000.sfc_data.tile${n}.nc" \ @@ -199,8 +197,8 @@ for n in $(seq 1 ${ntiles}); do fi ${NLN} "${COM_ATMOS_RESTART_PREV}/${PDY}.${cyc}0000.sfc_data.tile${n}.nc" "${DATA}/fnbgsi.00${n}" ${NLN} 
"${COM_ATMOS_RESTART}/${PDY}.${cyc}0000.sfcanl_data.tile${n}.nc" "${DATA}/fnbgso.00${n}" - ${NLN} "${FIXorog}/${CASE}/${CASE}_grid.tile${n}.nc" "${DATA}/fngrid.00${n}" - ${NLN} "${FIXorog}/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile${n}.nc" "${DATA}/fnorog.00${n}" + ${NLN} "${FIXgfs}/orog/${CASE}/${CASE}_grid.tile${n}.nc" "${DATA}/fngrid.00${n}" + ${NLN} "${FIXgfs}/orog/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile${n}.nc" "${DATA}/fnorog.00${n}" done export APRUNCY=${APRUN_CYCLE} diff --git a/scripts/exglobal_atmos_tropcy_qc_reloc.sh b/scripts/exglobal_atmos_tropcy_qc_reloc.sh index 380441a6c9..f1272b1844 100755 --- a/scripts/exglobal_atmos_tropcy_qc_reloc.sh +++ b/scripts/exglobal_atmos_tropcy_qc_reloc.sh @@ -10,7 +10,7 @@ # echo " Oct 2013 - Use main USH vars as part of minor pkg cleanup" ############################################################################ -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # Make sure we are in the $DATA directory cd $DATA @@ -50,7 +50,7 @@ if [ "$PROCESS_TROPCY" = 'YES' ]; then #echo $PDY - ${USHSYND:-$HOMEgfs/ush}/syndat_qctropcy.sh $cdate10 + ${USHgfs}/syndat_qctropcy.sh $cdate10 errsc=$? if [ "$errsc" -ne '0' ]; then echo "syndat_qctropcy.sh failed. exit" @@ -95,7 +95,7 @@ if [ "$DO_RELOCATE" = 'YES' ]; then ################################################### export MP_LABELIO=${MP_LABELIO:-yes} - $USHRELO/tropcy_relocate.sh $cdate10 + ${USHgfs}/tropcy_relocate.sh $cdate10 errsc=$? [ "$errsc" -ne '0' ] && exit $errsc diff --git a/scripts/exglobal_atmos_vminmon.sh b/scripts/exglobal_atmos_vminmon.sh index a4453dcf1a..b4307c8af9 100755 --- a/scripts/exglobal_atmos_vminmon.sh +++ b/scripts/exglobal_atmos_vminmon.sh @@ -1,6 +1,6 @@ #! 
/usr/bin/env bash -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" ################################################################################ #### UNIX Script Documentation Block @@ -44,15 +44,15 @@ if [[ -s ${gsistat} ]]; then #------------------------------------------------------------------ # Run the child sccripts. #------------------------------------------------------------------ - "${USHgfs}/minmon_xtrct_costs.pl" "${MINMON_SUFFIX}" "${PDY}" "${cyc}" "${gsistat}" dummy + "${USHgfs}/minmon_xtrct_costs.pl" "${MINMON_SUFFIX}" "${PDY}" "${cyc}" "${gsistat}" rc_costs=$? echo "rc_costs = ${rc_costs}" - "${USHgfs}/minmon_xtrct_gnorms.pl" "${MINMON_SUFFIX}" "${PDY}" "${cyc}" "${gsistat}" dummy + "${USHgfs}/minmon_xtrct_gnorms.pl" "${MINMON_SUFFIX}" "${PDY}" "${cyc}" "${gsistat}" rc_gnorms=$? echo "rc_gnorms = ${rc_gnorms}" - "${USHgfs}/minmon_xtrct_reduct.pl" "${MINMON_SUFFIX}" "${PDY}" "${cyc}" "${gsistat}" dummy + "${USHgfs}/minmon_xtrct_reduct.pl" "${MINMON_SUFFIX}" "${PDY}" "${cyc}" "${gsistat}" rc_reduct=$? echo "rc_reduct = ${rc_reduct}" diff --git a/scripts/exglobal_cleanup.sh b/scripts/exglobal_cleanup.sh index 5d7c0a9788..8141de771a 100755 --- a/scripts/exglobal_cleanup.sh +++ b/scripts/exglobal_cleanup.sh @@ -1,6 +1,6 @@ #! /usr/bin/env bash -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" ############################################################### # Clean up previous cycles; various depths diff --git a/scripts/exglobal_diag.sh b/scripts/exglobal_diag.sh index 3836643afc..ad4c4be4a8 100755 --- a/scripts/exglobal_diag.sh +++ b/scripts/exglobal_diag.sh @@ -19,7 +19,7 @@ # Set environment. -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # Directories. 
 pwd=$(pwd)
@@ -34,7 +34,7 @@ export NCP=${NCP:-"/bin/cp"}
 export NMV=${NMV:-"/bin/mv"}
 export NLN=${NLN:-"/bin/ln -sf"}
 export CHGRP_CMD=${CHGRP_CMD:-"chgrp ${group_name:-rstprod}"}
-export NCLEN=${NCLEN:-$HOMEgfs/ush/getncdimlen}
+export NCLEN=${NCLEN:-${USHgfs}/getncdimlen}
 export CATEXEC=${CATEXEC:-${ncdiag_ROOT:-${gsi_ncdiag_ROOT}}/bin/ncdiag_cat_serial.x}
 COMPRESS=${COMPRESS:-gzip}
 UNCOMPRESS=${UNCOMPRESS:-gunzip}
diff --git a/scripts/exglobal_forecast.sh b/scripts/exglobal_forecast.sh
index c50cde74f1..3555d4ef33 100755
--- a/scripts/exglobal_forecast.sh
+++ b/scripts/exglobal_forecast.sh
@@ -38,19 +38,19 @@
 ## Restart files:
 ##
 ## Fix files:
-## 1. computing grid, $FIXorog/$CASE/${CASE}_grid.tile${n}.nc
-## 2. orography data, $FIXorog/$CASE/${CASE}.mx${OCNRES}_oro_data.tile${n}.nc
-## 3. mosaic data, $FIXorog/$CASE/${CASE}_mosaic.nc
-## 4. Global O3 data, $FIXam/${O3FORC}
-## 5. Global H2O data, $FIXam/${H2OFORC}
-## 6. Global solar constant data, $FIXam/global_solarconstant_noaa_an.txt
-## 7. Global surface emissivity, $FIXam/global_sfc_emissivity_idx.txt
-## 8. Global CO2 historical data, $FIXam/global_co2historicaldata_glob.txt
-## 8. Global CO2 monthly data, $FIXam/co2monthlycyc.txt
-## 10. Additional global CO2 data, $FIXam/fix_co2_proj/global_co2historicaldata
+## 1. computing grid, ${FIXgfs}/orog/$CASE/${CASE}_grid.tile${n}.nc
+## 2. orography data, ${FIXgfs}/orog/$CASE/${CASE}.mx${OCNRES}_oro_data.tile${n}.nc
+## 3. mosaic data, ${FIXgfs}/orog/$CASE/${CASE}_mosaic.nc
+## 4. Global O3 data, ${FIXgfs}/am/${O3FORC}
+## 5. Global H2O data, ${FIXgfs}/am/${H2OFORC}
+## 6. Global solar constant data, ${FIXgfs}/am/global_solarconstant_noaa_an.txt
+## 7. Global surface emissivity, ${FIXgfs}/am/global_sfc_emissivity_idx.txt
+## 8. Global CO2 historical data, ${FIXgfs}/am/global_co2historicaldata_glob.txt
+## 8. Global CO2 monthly data, ${FIXgfs}/am/co2monthlycyc.txt
+## 10. Additional global CO2 data, ${FIXgfs}/am/fix_co2_proj/global_co2historicaldata
 ## 11. Climatological aerosol global distribution
-## $FIXam/global_climaeropac_global.txt
-## 12. Monthly volcanic forcing $FIXam/global_volcanic_aerosols_YYYY-YYYY.txt
+## ${FIXgfs}/am/global_climaeropac_global.txt
+## 12. Monthly volcanic forcing ${FIXgfs}/am/global_volcanic_aerosols_YYYY-YYYY.txt
 ##
 ## Data output (location, name)
 ## If quilting=true and output grid is gaussian grid:
@@ -77,14 +77,14 @@
 # Main body starts here
 #######################
-source "${HOMEgfs}/ush/preamble.sh"
+source "${USHgfs}/preamble.sh"
 # include all subroutines. Executions later.
-source "${HOMEgfs}/ush/forecast_predet.sh" # include functions for variable definition
-source "${HOMEgfs}/ush/forecast_det.sh" # include functions for run type determination
-source "${HOMEgfs}/ush/forecast_postdet.sh" # include functions for variables after run type determination
-source "${HOMEgfs}/ush/ufs_configure.sh" # include functions for ufs.configure processing
-source "${HOMEgfs}/ush/parsing_model_configure_FV3.sh"
+source "${USHgfs}/forecast_predet.sh" # include functions for variable definition
+source "${USHgfs}/forecast_det.sh" # include functions for run type determination
+source "${USHgfs}/forecast_postdet.sh" # include functions for variables after run type determination
+source "${USHgfs}/parsing_ufs_configure.sh" # include functions for ufs_configure processing
+source "${USHgfs}/parsing_model_configure_FV3.sh"
 # Coupling control switches, for coupling purpose, off by default
 cpl=${cpl:-.false.}
@@ -105,9 +105,11 @@ common_predet
 echo "MAIN: Loading variables before determination of run type"
 FV3_predet
+[[ ${cplflx} = .true. ]] && CMEPS_predet
 [[ ${cplflx} = .true. ]] && MOM6_predet
 [[ ${cplwav} = .true. ]] && WW3_predet
 [[ ${cplice} = .true. ]] && CICE_predet
+[[ ${cplchm} = .true. ]] && GOCART_predet
 echo "MAIN: Variables before determination of run type loaded"
 echo "MAIN: Determining run type"
@@ -119,6 +121,7 @@ echo "MAIN: RUN Type Determined"
 echo "MAIN: Post-determination set up of run type"
 FV3_postdet
+[[ ${cplflx} = .true. ]] && CMEPS_postdet
 [[ ${cplflx} = .true. ]] && MOM6_postdet
 [[ ${cplwav} = .true. ]] && WW3_postdet
 [[ ${cplice} = .true. ]] && CICE_postdet
@@ -146,7 +149,13 @@ if [[ "${esmf_profile:-}" = ".true." ]]; then
 export ESMF_RUNTIME_PROFILE_OUTPUT=SUMMARY
 fi
-${NCP} "${FCSTEXECDIR}/${FCSTEXEC}" "${DATA}/"
+if [[ "${USE_ESMF_THREADING:-}" == "YES" ]]; then
+ unset OMP_NUM_THREADS
+else
+ export OMP_NUM_THREADS=${UFS_THREADS:-1}
+fi
+
+${NCP} "${EXECgfs}/${FCSTEXEC}" "${DATA}/"
 ${APRUN_UFS} "${DATA}/${FCSTEXEC}" 1>&1 2>&2
 export ERR=$?
 export err=${ERR}
@@ -154,6 +163,7 @@ ${ERRSCRIPT} || exit "${err}"
 FV3_out
 [[ ${cplflx} = .true. ]] && MOM6_out
+[[ ${cplflx} = .true. ]] && CMEPS_out
 [[ ${cplwav} = .true. ]] && WW3_out
 [[ ${cplice} = .true. ]] && CICE_out
 [[ ${cplchm} = .true. ]] && GOCART_out
diff --git a/scripts/exglobal_oceanice_products.py b/scripts/exglobal_oceanice_products.py
new file mode 100755
index 0000000000..9bb2b09596
--- /dev/null
+++ b/scripts/exglobal_oceanice_products.py
@@ -0,0 +1,52 @@
+#!/usr/bin/env python3
+
+import os
+
+from wxflow import AttrDict, Logger, logit, cast_strdict_as_dtypedict
+from pygfs.task.oceanice_products import OceanIceProducts
+
+# initialize root logger
+logger = Logger(level=os.environ.get("LOGGING_LEVEL", "DEBUG"), colored_log=True)
+
+
+@logit(logger)
+def main():
+
+    config = cast_strdict_as_dtypedict(os.environ)
+
+    # Instantiate the OceanIce object
+    oceanice = OceanIceProducts(config)
+
+    # Pull out all the configuration keys needed to run the rest of steps
+    keys = ['HOMEgfs', 'DATA', 'current_cycle', 'RUN', 'NET',
+            f'COM_{oceanice.task_config.component.upper()}_HISTORY',
+            f'COM_{oceanice.task_config.component.upper()}_GRIB',
+            'APRUN_OCNICEPOST',
+            'component', 'forecast_hour', 'valid_datetime', 'avg_period',
+            'model_grid', 'product_grids', 'oceanice_yaml']
+    oceanice_dict = AttrDict()
+    for key in keys:
+        oceanice_dict[key] = oceanice.task_config[key]
+
+    # Initialize the DATA/ directory; copy static data
+    oceanice.initialize(oceanice_dict)
+
+    for grid in oceanice_dict.product_grids:
+
+        logger.info(f"Processing {grid} grid")
+
+        # Configure DATA/ directory for execution; prepare namelist etc.
+        oceanice.configure(oceanice_dict, grid)
+
+        # Run the oceanice post executable to interpolate and create grib2 files
+        oceanice.execute(oceanice_dict, grid)
+
+    # Subset raw model data to create netCDF products
+    oceanice.subset(oceanice_dict)
+
+    # Copy processed output from execute and subset
+    oceanice.finalize(oceanice_dict)
+
+
+if __name__ == '__main__':
+    main()
diff --git a/scripts/exglobal_prep_land_obs.py b/scripts/exglobal_prep_snow_obs.py
similarity index 59%
rename from scripts/exglobal_prep_land_obs.py
rename to scripts/exglobal_prep_snow_obs.py
index 3594771c8a..5107d9c935 100755
--- a/scripts/exglobal_prep_land_obs.py
+++ b/scripts/exglobal_prep_snow_obs.py
@@ -1,12 +1,12 @@
 #!/usr/bin/env python3
-# exglobal_land_analysis_prepare.py
-# This script creates a LandAnalysis object
+# exglobal_prep_snow_obs.py
+# This script creates a SnowAnalysis object
 # and runs the prepare_GTS and prepare_IMS method
 # which perform the pre-processing for GTS and IMS data
 import os
 from wxflow import Logger, cast_strdict_as_dtypedict
-from pygfs.task.land_analysis import LandAnalysis
+from pygfs.task.snow_analysis import SnowAnalysis
 # Initialize root logger
@@ -18,8 +18,8 @@
 # Take configuration from environment and cast it as python dictionary
 config = cast_strdict_as_dtypedict(os.environ)
- # Instantiate the land prepare task
- LandAnl = LandAnalysis(config)
- LandAnl.prepare_GTS()
- if f"{ LandAnl.runtime_config.cyc }" == '18':
- LandAnl.prepare_IMS()
+ # Instantiate the snow prepare task
+ SnowAnl = SnowAnalysis(config)
+ SnowAnl.prepare_GTS()
+ if f"{ SnowAnl.runtime_config.cyc }" == '18':
+ SnowAnl.prepare_IMS()
diff --git a/scripts/exglobal_land_analysis.py b/scripts/exglobal_snow_analysis.py
similarity index 66%
rename from scripts/exglobal_land_analysis.py
rename to scripts/exglobal_snow_analysis.py
index 70141475b0..fe050f5af5 100755
--- a/scripts/exglobal_land_analysis.py
+++ b/scripts/exglobal_snow_analysis.py
@@ -1,12 +1,12 @@
 #!/usr/bin/env python3
-# exglobal_land_analysis.py
-# This script creates an LandAnalysis class
+# exglobal_snow_analysis.py
+# This script creates an SnowAnalysis class
 # and runs the initialize, execute and finalize methods
-# for a global Land Snow Depth analysis
+# for a global Snow Depth analysis
 import os
 from wxflow import Logger, cast_strdict_as_dtypedict
-from pygfs.task.land_analysis import LandAnalysis
+from pygfs.task.snow_analysis import SnowAnalysis
 # Initialize root logger
 logger = Logger(level=os.environ.get("LOGGING_LEVEL", "DEBUG"), colored_log=True)
@@ -17,8 +17,8 @@
 # Take configuration from environment and cast it as python dictionary
 config = cast_strdict_as_dtypedict(os.environ)
- # Instantiate the land analysis task
- anl = LandAnalysis(config)
+ # Instantiate the snow analysis task
+ anl = SnowAnalysis(config)
 anl.initialize()
 anl.execute()
 anl.finalize()
diff --git a/scripts/exglobal_stage_ic.sh b/scripts/exglobal_stage_ic.sh
index 58b37f3114..e7dfc3c495 100755
--- a/scripts/exglobal_stage_ic.sh
+++ b/scripts/exglobal_stage_ic.sh
@@ -1,6 +1,6 @@
 #!/usr/bin/env bash
-source "${HOMEgfs}/ush/preamble.sh"
+source "${USHgfs}/preamble.sh"
 # Locally scoped variables and functions
 # shellcheck disable=SC2153
@@ -104,6 +104,17 @@ for MEMDIR in "${MEMDIR_ARRAY[@]}"; do
 ;;
 esac
+ # Ocean Perturbation Files
+ # Extra zero on MEMDIR ensure we have a number even if the string is empty
+ if (( 0${MEMDIR:3} > 0 )) && [[ "${USE_OCN_PERTURB_FILES:-false}" == "true" ]]; then
+ src="${BASE_CPLIC}/${CPL_OCNIC:-}/${PDY}${cyc}/${MEMDIR}/ocean/${PDY}.${cyc}0000.mom6_increment.nc"
+ tgt="${COM_OCEAN_RESTART_PREV}/${PDY}.${cyc}0000.mom6_increment.nc"
+ ${NCP} "${src}" "${tgt}"
+ rc=${?}
+ ((rc != 0)) && error_message "${src}" "${tgt}" "${rc}"
+ err=$((err + rc))
+ fi
+
 # TODO: Do mediator restarts exists in a ATMW configuration?
 # TODO: No mediator is presumably involved in an ATMA configuration
 if [[ ${EXP_WARM_START:-".false."} = ".true." ]]; then
diff --git a/scripts/run_reg2grb2.sh b/scripts/run_reg2grb2.sh
deleted file mode 100755
index ab2c80043e..0000000000
--- a/scripts/run_reg2grb2.sh
+++ /dev/null
@@ -1,72 +0,0 @@
-#! /usr/bin/env bash
-
-source "${HOMEgfs}/ush/preamble.sh"
-
-#requires grib_util module
-
-MOM6REGRID=${MOM6REGRID:-${HOMEgfs}}
-export mask_file="${MOM6REGRID}/fix/reg2grb2/mask.0p25x0p25.grb2"
-
-# offline testing:
-#export DATA=
-#export icefile=$DATA/DATA0p5/icer2012010106.01.2012010100_0p5x0p5.nc
-#export ocnfile=$DATA/DATA0p5/ocnr2012010106.01.2012010100_0p5x0p5.nc
-#export outfile=$DATA/DATA0p5/out/ocnh2012010106.01.2012010100.grb2
-#
-# workflow testing:
-export icefile="icer${VDATE}.${ENSMEM}.${IDATE}_0p25x0p25_CICE.nc"
-export ocnfile="ocnr${VDATE}.${ENSMEM}.${IDATE}_0p25x0p25_MOM6.nc"
-export outfile="ocn_ice${VDATE}.${ENSMEM}.${IDATE}_0p25x0p25.grb2"
-export outfile0p5="ocn_ice${VDATE}.${ENSMEM}.${IDATE}_0p5x0p5.grb2"
-
-export mfcstcpl=${mfcstcpl:-1}
-export IGEN_OCNP=${IGEN_OCNP:-197}
-
-# PT This is the forecast date
-export year=${VDATE:0:4}
-export month=${VDATE:4:2}
-export day=${VDATE:6:2}
-export hour=${VDATE:8:2}
-
-# PT This is the initialization date
-export syear=${IDATE:0:4}
-export smonth=${IDATE:4:2}
-export sday=${IDATE:6:2}
-export shour=${IDATE:8:2}
-
-# PT Need to get this from above - could be 6 or 1 hour
-export hh_inc_ocn=6
-#
-# set for 1p0 lat-lon
-#export im=360
-#export jm=181
-# export km=40
-#export imo=360
-#export jmo=181
-#
-# set for 0p5 lat-lon
-#export im=720
-#export jm=361
-#export km=40
-#export imo=720
-#export jmo=361
-#
-# set for 0p25 lat-lon
-export im=1440
-export jm=721
-export imo=1440
-export jmo=721
-export km=40
-
-export flats=-90.
-export flatn=90.
-export flonw=0.0 -export flone=359.75 - -ln -sf "${mask_file}" ./iceocnpost.g2 -${executable} > "reg2grb2.${VDATE}.${IDATE}.out" - -# interpolated from 0p25 to 0p5 grid -grid2p05="0 6 0 0 0 0 0 0 720 361 0 0 90000000 0 48 -90000000 359500000 500000 500000 0" -${COPYGB2} -g "${grid2p05}" -i0 -x "${outfile}" "${outfile0p5}" - diff --git a/scripts/run_regrid.sh b/scripts/run_regrid.sh deleted file mode 100755 index 103e9a759e..0000000000 --- a/scripts/run_regrid.sh +++ /dev/null @@ -1,27 +0,0 @@ -#! /usr/bin/env bash - -source "${HOMEgfs}/ush/preamble.sh" - -MOM6REGRID="${MOM6REGRID:-${HOMEgfs}}" -export EXEC_DIR="${MOM6REGRID}/exec" -export USH_DIR="${MOM6REGRID}/ush" -export COMOUTocean="${COM_OCEAN_HISTORY}" -export COMOUTice="${COM_ICE_HISTORY}" -export IDATE="${IDATE}" -export VDATE="${VDATE}" -export ENSMEM="${ENSMEM}" -export FHR="${fhr}" -export DATA="${DATA}" -export FIXreg2grb2="${FIXreg2grb2}" - -###### DO NOT MODIFY BELOW UNLESS YOU KNOW WHAT YOU ARE DOING ####### -#Need NCL module to be loaded: -echo "${NCARG_ROOT}" -export NCL="${NCARG_ROOT}/bin/ncl" - -ls -alrt - -${NCL} "${USH_DIR}/icepost.ncl" -${NCL} "${USH_DIR}/ocnpost.ncl" -##################################################################### - diff --git a/sorc/build_all.sh b/sorc/build_all.sh index 23cf420f1d..d8374c269f 100755 --- a/sorc/build_all.sh +++ b/sorc/build_all.sh @@ -16,41 +16,54 @@ function _usage() { Builds all of the global-workflow components by calling the individual build scripts in sequence. 
-Usage: ${BASH_SOURCE[0]} [-a UFS_app][-c build_config][-h][-j n][-v] +Usage: ${BASH_SOURCE[0]} [-a UFS_app][-c build_config][-d][-h][-j n][-v][-w] -a UFS_app: Build a specific UFS app instead of the default + -d: + Build in debug mode -g: Build GSI -h: Print this help message and exit -j: Specify maximum number of build jobs (n) + -k: + Kill all builds if any build fails -u: Build UFS-DA -v: Execute all build scripts with -v option to turn on verbose where supported + -w: + Use unstructured wave grid EOF exit 1 } -script_dir=$(cd "$(dirname "${BASH_SOURCE[0]}")" &> /dev/null && pwd) -cd "${script_dir}" || exit 1 +# shellcheck disable=SC2155 +readonly HOMEgfs=$(cd "$(dirname "$(readlink -f -n "${BASH_SOURCE[0]}" )" )/.." && pwd -P) +cd "${HOMEgfs}/sorc" || exit 1 _build_ufs_opt="" _build_ufsda="NO" _build_gsi="NO" +_build_debug="" _verbose_opt="" +_wave_unst="" _build_job_max=20 +_quick_kill="NO" # Reset option counter in case this script is sourced OPTIND=1 -while getopts ":a:ghj:uv" option; do +while getopts ":a:dghj:kuvw" option; do case "${option}" in a) _build_ufs_opt+="-a ${OPTARG} ";; + d) _build_debug="-d" ;; g) _build_gsi="YES" ;; h) _usage;; j) _build_job_max="${OPTARG} ";; + k) _quick_kill="YES" ;; u) _build_ufsda="YES" ;; v) _verbose_opt="-v";; + w) _wave_unst="-w";; :) echo "[${BASH_SOURCE[0]}]: ${option} requires an argument" _usage @@ -64,24 +77,24 @@ done shift $((OPTIND-1)) -logs_dir="${script_dir}/logs" +logs_dir="${HOMEgfs}/sorc/logs" if [[ ! -d "${logs_dir}" ]]; then echo "Creating logs folder" - mkdir "${logs_dir}" || exit 1 + mkdir -p "${logs_dir}" || exit 1 fi # Check final exec folder exists -if [[ ! -d "../exec" ]]; then - echo "Creating ../exec folder" - mkdir ../exec +if [[ ! 
-d "${HOMEgfs}/exec" ]]; then + echo "Creating ${HOMEgfs}/exec folder" + mkdir -p "${HOMEgfs}/exec" fi #------------------------------------ # GET MACHINE #------------------------------------ export COMPILER="intel" -source gfs_utils.fd/ush/detect_machine.sh -source gfs_utils.fd/ush/module-setup.sh +source "${HOMEgfs}/ush/detect_machine.sh" +source "${HOMEgfs}/ush/module-setup.sh" if [[ -z "${MACHINE_ID}" ]]; then echo "FATAL: Unable to determine target machine" exit 1 @@ -113,42 +126,42 @@ declare -A build_opts big_jobs=0 build_jobs["ufs"]=8 big_jobs=$((big_jobs+1)) -build_opts["ufs"]="${_verbose_opt} ${_build_ufs_opt}" +build_opts["ufs"]="${_wave_unst} ${_verbose_opt} ${_build_ufs_opt} ${_build_debug}" -build_jobs["upp"]=6 # The UPP is hardcoded to use 6 cores -build_opts["upp"]="" +build_jobs["upp"]=2 +build_opts["upp"]="${_build_debug}" -build_jobs["ufs_utils"]=3 -build_opts["ufs_utils"]="${_verbose_opt}" +build_jobs["ufs_utils"]=2 +build_opts["ufs_utils"]="${_verbose_opt} ${_build_debug}" build_jobs["gfs_utils"]=1 -build_opts["gfs_utils"]="${_verbose_opt}" +build_opts["gfs_utils"]="${_verbose_opt} ${_build_debug}" -build_jobs["ww3prepost"]=3 -build_opts["ww3prepost"]="${_verbose_opt} ${_build_ufs_opt}" +build_jobs["ww3prepost"]=2 +build_opts["ww3prepost"]="${_wave_unst} ${_verbose_opt} ${_build_ufs_opt} ${_build_debug}" # Optional DA builds if [[ "${_build_ufsda}" == "YES" ]]; then - if [[ "${MACHINE_ID}" != "orion" && "${MACHINE_ID}" != "hera" ]]; then + if [[ "${MACHINE_ID}" != "orion" && "${MACHINE_ID}" != "hera" && "${MACHINE_ID}" != "hercules" ]]; then echo "NOTE: The GDAS App is not supported on ${MACHINE_ID}. Disabling build." 
else build_jobs["gdas"]=8 big_jobs=$((big_jobs+1)) - build_opts["gdas"]="${_verbose_opt}" + build_opts["gdas"]="${_verbose_opt} ${_build_debug}" fi fi if [[ "${_build_gsi}" == "YES" ]]; then build_jobs["gsi_enkf"]=8 - build_opts["gsi_enkf"]="${_verbose_opt}" + build_opts["gsi_enkf"]="${_verbose_opt} ${_build_debug}" fi if [[ "${_build_gsi}" == "YES" || "${_build_ufsda}" == "YES" ]] ; then - build_jobs["gsi_utils"]=2 - build_opts["gsi_utils"]="${_verbose_opt}" + build_jobs["gsi_utils"]=1 + build_opts["gsi_utils"]="${_verbose_opt} ${_build_debug}" if [[ "${MACHINE_ID}" == "hercules" ]]; then echo "NOTE: The GSI Monitor is not supported on Hercules. Disabling build." else build_jobs["gsi_monitor"]=1 - build_opts["gsi_monitor"]="${_verbose_opt}" + build_opts["gsi_monitor"]="${_verbose_opt} ${_build_debug}" fi fi @@ -184,6 +197,31 @@ fi procs_in_use=0 declare -A build_ids +check_builds() +{ + for chk_build in "${!build_jobs[@]}"; do + # Check if the build is complete and if so what the status was + if [[ -n "${build_ids[${chk_build}]+0}" ]]; then + if ! ps -p "${build_ids[${chk_build}]}" > /dev/null; then + wait "${build_ids[${chk_build}]}" + build_stat=$? + if [[ ${build_stat} != 0 ]]; then + echo "build_${chk_build}.sh failed! Exiting!" + echo "Check logs/build_${chk_build}.log for details." + echo "logs/build_${chk_build}.log" > "${HOMEgfs}/sorc/logs/error.logs" + for kill_build in "${!build_jobs[@]}"; do + if [[ -n "${build_ids[${kill_build}]+0}" ]]; then + pkill -P "${build_ids[${kill_build}]}" + fi + done + return "${build_stat}" + fi + fi + fi + done + return 0 +} + builds_started=0 # Now start looping through all of the jobs until everything is done while [[ ${builds_started} -lt ${#build_jobs[@]} ]]; do @@ -192,13 +230,10 @@ while [[ ${builds_started} -lt ${#build_jobs[@]} ]]; do if [[ -n "${build_jobs[${build}]+0}" && -z "${build_ids[${build}]+0}" ]]; then # Do we have enough processors to run it? 
if [[ ${_build_job_max} -ge $(( build_jobs[build] + procs_in_use )) ]]; then - if [[ "${build}" != "upp" ]]; then - "./build_${build}.sh" -j "${build_jobs[${build}]}" "${build_opts[${build}]:-}" > \ - "${logs_dir}/build_${build}.log" 2>&1 & - else - "./build_${build}.sh" "${build_opts[${build}]}" > \ - "${logs_dir}/build_${build}.log" 2>&1 & - fi + # double-quoting build_opts here will not work since it is a string of options + #shellcheck disable=SC2086 + "./build_${build}.sh" ${build_opts[${build}]:-} -j "${build_jobs[${build}]}" > \ + "${logs_dir}/build_${build}.log" 2>&1 & build_ids["${build}"]=$! echo "Starting build_${build}.sh" procs_in_use=$(( procs_in_use + build_jobs[${build}] )) @@ -222,11 +257,31 @@ while [[ ${builds_started} -lt ${#build_jobs[@]} ]]; do fi done + # If requested, check if any build has failed and exit if so + if [[ "${_quick_kill}" == "YES" ]]; then + check_builds + build_stat=$? + if (( build_stat != 0 )); then + exit "${build_stat}" + fi + fi + sleep 5s done + # Wait for all jobs to complete and check return statuses -while [[ ${#build_jobs[@]} -gt 0 ]]; do +while [[ "${#build_jobs[@]}" -gt 0 ]]; do + + # If requested, check if any build has failed and exit if so + if [[ "${_quick_kill}" == "YES" ]]; then + check_builds + build_stat=$? 
+ if [[ ${build_stat} != 0 ]]; then + exit "${build_stat}" + fi + fi + for build in "${!build_jobs[@]}"; do # Test if each job is complete and if so, notify and remove from the array if [[ -n "${build_ids[${build}]+0}" ]]; then diff --git a/sorc/build_gdas.sh b/sorc/build_gdas.sh index b1a17c33dd..43c503ab4d 100755 --- a/sorc/build_gdas.sh +++ b/sorc/build_gdas.sh @@ -2,11 +2,12 @@ set -eux OPTIND=1 +_opts="-f " # forces a clean build while getopts ":j:dv" option; do case "${option}" in - d) export BUILD_TYPE="DEBUG";; - j) export BUILD_JOBS=${OPTARG};; - v) export BUILD_VERBOSE="YES";; + d) _opts+="-c -DCMAKE_BUILD_TYPE=Debug " ;; + j) BUILD_JOBS=${OPTARG};; + v) _opts+="-v ";; :) echo "[${BASH_SOURCE[0]}]: ${option} requires an argument" usage @@ -19,12 +20,10 @@ while getopts ":j:dv" option; do done shift $((OPTIND-1)) -# TODO: GDASApp does not presently handle BUILD_TYPE - -BUILD_TYPE=${BUILD_TYPE:-"Release"} \ -BUILD_VERBOSE=${BUILD_VERBOSE:-"NO"} \ +# double quoting opts will not work since it is a string of options +# shellcheck disable=SC2086 BUILD_JOBS="${BUILD_JOBS:-8}" \ WORKFLOW_BUILD="ON" \ -./gdas.cd/build.sh +./gdas.cd/build.sh ${_opts} -f exit diff --git a/sorc/build_gfs_utils.sh b/sorc/build_gfs_utils.sh index 09bd4a9656..e53f71ddcd 100755 --- a/sorc/build_gfs_utils.sh +++ b/sorc/build_gfs_utils.sh @@ -18,14 +18,12 @@ EOF exit 1 } -cwd=$(pwd) - OPTIND=1 while getopts ":j:dvh" option; do case "${option}" in - d) export BUILD_TYPE="DEBUG";; - v) export BUILD_VERBOSE="YES";; - j) export BUILD_JOBS="${OPTARG}";; + d) BUILD_TYPE="Debug";; + v) BUILD_VERBOSE="YES";; + j) BUILD_JOBS="${OPTARG}";; h) usage ;; @@ -44,6 +42,6 @@ shift $((OPTIND-1)) BUILD_TYPE=${BUILD_TYPE:-"Release"} \ BUILD_VERBOSE=${BUILD_VERBOSE:-"NO"} \ BUILD_JOBS=${BUILD_JOBS:-8} \ -"${cwd}/gfs_utils.fd/ush/build.sh" +"./gfs_utils.fd/ush/build.sh" exit diff --git a/sorc/build_gsi_enkf.sh b/sorc/build_gsi_enkf.sh index 9ba278e3ec..ba24cefa81 100755 --- a/sorc/build_gsi_enkf.sh +++ 
b/sorc/build_gsi_enkf.sh @@ -4,9 +4,9 @@ set -eux OPTIND=1 while getopts ":j:dv" option; do case "${option}" in - d) export BUILD_TYPE="DEBUG";; - j) export BUILD_JOBS="${OPTARG}";; - v) export BUILD_VERBOSE="YES";; + d) BUILD_TYPE="Debug";; + j) BUILD_JOBS="${OPTARG}";; + v) BUILD_VERBOSE="YES";; :) echo "[${BASH_SOURCE[0]}]: ${option} requires an argument" usage diff --git a/sorc/build_gsi_monitor.sh b/sorc/build_gsi_monitor.sh index 3de1262aac..31add1882a 100755 --- a/sorc/build_gsi_monitor.sh +++ b/sorc/build_gsi_monitor.sh @@ -1,14 +1,12 @@ #! /usr/bin/env bash set -eux -cwd=$(pwd) - OPTIND=1 while getopts ":j:dv" option; do case "${option}" in - d) export BUILD_TYPE="DEBUG";; - j) export BUILD_JOBS="${OPTARG}";; - v) export BUILD_VERBOSE="YES";; + d) BUILD_TYPE="Debug";; + j) BUILD_JOBS="${OPTARG}";; + v) BUILD_VERBOSE="YES";; :) echo "[${BASH_SOURCE[0]}]: ${option} requires an argument" usage @@ -24,6 +22,6 @@ shift $((OPTIND-1)) BUILD_TYPE=${BUILD_TYPE:-"Release"} \ BUILD_VERBOSE=${BUILD_VERBOSE:-"NO"} \ BUILD_JOBS=${BUILD_JOBS:-8} \ -"${cwd}/gsi_monitor.fd/ush/build.sh" +"./gsi_monitor.fd/ush/build.sh" exit diff --git a/sorc/build_gsi_utils.sh b/sorc/build_gsi_utils.sh index 81eab0f628..58c64e6e4a 100755 --- a/sorc/build_gsi_utils.sh +++ b/sorc/build_gsi_utils.sh @@ -1,14 +1,12 @@ #! 
/usr/bin/env bash set -eux -cwd=$(pwd) - OPTIND=1 while getopts ":j:dv" option; do case "${option}" in - d) export BUILD_TYPE="DEBUG";; - j) export BUILD_JOBS="${OPTARG}";; - v) export BUILD_VERBOSE="YES";; + d) BUILD_TYPE="Debug";; + j) BUILD_JOBS="${OPTARG}";; + v) BUILD_VERBOSE="YES";; :) echo "[${BASH_SOURCE[0]}]: ${option} requires an argument" usage @@ -25,6 +23,6 @@ BUILD_TYPE=${BUILD_TYPE:-"Release"} \ BUILD_VERBOSE=${BUILD_VERBOSE:-"NO"} \ BUILD_JOBS=${BUILD_JOBS:-8} \ UTIL_OPTS="-DBUILD_UTIL_ENKF_GFS=ON -DBUILD_UTIL_NCIO=ON" \ -"${cwd}/gsi_utils.fd/ush/build.sh" +"./gsi_utils.fd/ush/build.sh" exit diff --git a/sorc/build_ufs.sh b/sorc/build_ufs.sh index 59914d6b09..3055179f50 100755 --- a/sorc/build_ufs.sh +++ b/sorc/build_ufs.sh @@ -5,14 +5,15 @@ cwd=$(pwd) # Default settings APP="S2SWA" -CCPP_SUITES="FV3_GFS_v17_p8_ugwpv1,FV3_GFS_v17_coupled_p8_ugwpv1,FV3_GFS_v17_p8_ugwpv1_c3,FV3_GFS_v17_p8_ugwpv1_c3_mynn,FV3_GFS_v17_p8_ugwpv1_mynn" # TODO: does the g-w need to build with all these CCPP_SUITES? +CCPP_SUITES="FV3_GFS_v17_p8_ugwpv1,FV3_GFS_v17_coupled_p8_ugwpv1" # TODO: does the g-w need to build with all these CCPP_SUITES? 
-while getopts ":da:j:v" option; do +while getopts ":da:j:vw" option; do case "${option}" in - d) BUILD_TYPE="DEBUG";; + d) BUILD_TYPE="Debug";; a) APP="${OPTARG}";; j) BUILD_JOBS="${OPTARG}";; v) export BUILD_VERBOSE="YES";; + w) PDLIB="ON";; :) echo "[${BASH_SOURCE[0]}]: ${option} requires an argument" ;; @@ -28,13 +29,14 @@ source "./tests/detect_machine.sh" source "./tests/module-setup.sh" MAKE_OPT="-DAPP=${APP} -D32BIT=ON -DCCPP_SUITES=${CCPP_SUITES}" -[[ ${BUILD_TYPE:-"Release"} = "DEBUG" ]] && MAKE_OPT+=" -DDEBUG=ON" +[[ ${PDLIB:-"OFF"} = "ON" ]] && MAKE_OPT+=" -DPDLIB=ON" +[[ ${BUILD_TYPE:-"Release"} = "Debug" ]] && MAKE_OPT+=" -DDEBUG=ON" COMPILE_NR=0 CLEAN_BEFORE=YES CLEAN_AFTER=NO if [[ "${MACHINE_ID}" != "noaacloud" ]]; then - ./tests/compile.sh "${MACHINE_ID}" "${MAKE_OPT}" "${COMPILE_NR}" "intel" "${CLEAN_BEFORE}" "${CLEAN_AFTER}" + BUILD_JOBS=${BUILD_JOBS:-8} ./tests/compile.sh "${MACHINE_ID}" "${MAKE_OPT}" "${COMPILE_NR}" "intel" "${CLEAN_BEFORE}" "${CLEAN_AFTER}" mv "./tests/fv3_${COMPILE_NR}.exe" ./tests/ufs_model.x mv "./tests/modules.fv3_${COMPILE_NR}.lua" ./tests/modules.ufs_model.lua cp "./modulefiles/ufs_common.lua" ./tests/ufs_common.lua diff --git a/sorc/build_ufs_utils.sh b/sorc/build_ufs_utils.sh index e78ca3c180..63ec56cb41 100755 --- a/sorc/build_ufs_utils.sh +++ b/sorc/build_ufs_utils.sh @@ -4,8 +4,9 @@ set -eux OPTIND=1 while getopts ":j:dv" option; do case "${option}" in - j) export BUILD_JOBS="${OPTARG}";; - v) export BUILD_VERBOSE="YES";; + d) BUILD_TYPE="Debug" ;; + j) BUILD_JOBS="${OPTARG}";; + v) BUILD_VERBOSE="YES";; :) echo "[${BASH_SOURCE[0]}]: ${option} requires an argument" usage @@ -18,13 +19,11 @@ while getopts ":j:dv" option; do done shift $((OPTIND-1)) -script_dir=$(dirname "${BASH_SOURCE[0]}") -cd "${script_dir}/ufs_utils.fd" || exit 1 - CMAKE_OPTS="-DGFS=ON" \ +BUILD_TYPE=${BUILD_TYPE:-"Release"} \ BUILD_JOBS=${BUILD_JOBS:-8} \ BUILD_VERBOSE=${BUILD_VERBOSE:-} \ -./build_all.sh +./ufs_utils.fd/build_all.sh exit diff 
--git a/sorc/build_upp.sh b/sorc/build_upp.sh index a55e96ebc8..1dca0035fd 100755 --- a/sorc/build_upp.sh +++ b/sorc/build_upp.sh @@ -6,25 +6,26 @@ cd "${script_dir}" || exit 1 OPTIND=1 _opts="" -while getopts ":dv" option; do - case "${option}" in - d) _opts+="-d ";; - v) _opts+="-v ";; - :) - echo "[${BASH_SOURCE[0]}]: ${option} requires an argument" - ;; - *) - echo "[${BASH_SOURCE[0]}]: Unrecognized option: ${option}" - ;; - esac +while getopts ":dj:v" option; do + case "${option}" in + d) _opts+="-d " ;; + j) BUILD_JOBS="${OPTARG}" ;; + v) _opts+="-v ";; + :) + echo "[${BASH_SOURCE[0]}]: ${option} requires an argument" + ;; + *) + echo "[${BASH_SOURCE[0]}]: Unrecognized option: ${option}" + ;; + esac done shift $((OPTIND-1)) # Check final exec folder exists if [[ ! -d "../exec" ]]; then - mkdir ../exec + mkdir -p ../exec fi -cd ufs_model.fd/FV3/upp/tests +cd upp.fd/tests # shellcheck disable=SC2086 -./compile_upp.sh ${_opts} +BUILD_JOBS=${BUILD_JOBS:-8} ./compile_upp.sh ${_opts} diff --git a/sorc/build_ww3prepost.sh b/sorc/build_ww3prepost.sh index 919afaacb3..5b527a1641 100755 --- a/sorc/build_ww3prepost.sh +++ b/sorc/build_ww3prepost.sh @@ -6,12 +6,15 @@ cd "${script_dir}" || exit 1 # Default settings APP="S2SWA" +PDLIB="OFF" -while getopts ":j:a:v" option; do +while getopts ":j:a:dvw" option; do case "${option}" in a) APP="${OPTARG}";; + d) BUILD_TYPE="Debug";; j) BUILD_JOBS="${OPTARG}";; v) export BUILD_VERBOSE="YES";; + w) PDLIB="ON";; :) echo "[${BASH_SOURCE[0]}]: ${option} requires an argument" usage @@ -23,15 +26,17 @@ while getopts ":j:a:v" option; do esac done - # Determine which switch to use if [[ "${APP}" == "ATMW" ]]; then ww3switch="model/esmf/switch" else - ww3switch="model/bin/switch_meshcap" + if [[ "${PDLIB}" == "ON" ]]; then + ww3switch="model/bin/switch_meshcap_pdlib" + else + ww3switch="model/bin/switch_meshcap" + fi fi - # Check final exec folder exists if [[ ! 
-d "../exec" ]]; then mkdir ../exec @@ -64,6 +69,8 @@ mkdir -p "${path_build}" || exit 1 cd "${path_build}" || exit 1 echo "Forcing a SHRD build" +buildswitch="${path_build}/switch" + cat "${SWITCHFILE}" > "${path_build}/tempswitch" sed -e "s/DIST/SHRD/g"\ @@ -73,15 +80,22 @@ sed -e "s/DIST/SHRD/g"\ -e "s/MPI / /g"\ -e "s/B4B / /g"\ -e "s/PDLIB / /g"\ + -e "s/SCOTCH / /g"\ + -e "s/METIS / /g"\ -e "s/NOGRB/NCEP2/g"\ "${path_build}/tempswitch" > "${path_build}/switch" rm "${path_build}/tempswitch" -echo "Switch file is ${path_build}/switch with switches:" -cat "${path_build}/switch" +echo "Switch file is ${buildswitch} with switches:" +cat "${buildswitch}" + +#define cmake build options +MAKE_OPT="-DCMAKE_INSTALL_PREFIX=install" +[[ ${BUILD_TYPE:-"Release"} = "Debug" ]] && MAKE_OPT+=" -DCMAKE_BUILD_TYPE=Debug" #Build executables: -cmake "${WW3_DIR}" -DSWITCH="${path_build}/switch" -DCMAKE_INSTALL_PREFIX=install +# shellcheck disable=SC2086 +cmake "${WW3_DIR}" -DSWITCH="${buildswitch}" ${MAKE_OPT} rc=$? if (( rc != 0 )); then echo "Fatal error in cmake." 
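The switch-file rewrite in the build_ww3prepost.sh hunk above is easiest to follow on a sample line. A minimal sketch, assuming a made-up switch string and a `ww3_demo_build` scratch directory (neither is a real WW3 switch file or path); it strips the distributed-memory switches (DIST, MPI, PDLIB, SCOTCH, METIS) and swaps NOGRB for NCEP2, forcing a shared-memory (SHRD) build:

```shell
#!/usr/bin/env bash
# Illustrative sketch of the SHRD switch rewrite in build_ww3prepost.sh.
# The switch content below is a made-up sample, not a real WW3 switch file.
path_build="./ww3_demo_build"
mkdir -p "${path_build}"
echo "F90 NOGRB DIST MPI PDLIB SCOTCH METIS NC4" > "${path_build}/tempswitch"

# Same sed chain as the committed script: drop distributed-memory switches,
# force SHRD, and select NCEP2 GRIB handling.
sed -e "s/DIST/SHRD/g" \
    -e "s/MPI / /g" \
    -e "s/PDLIB / /g" \
    -e "s/SCOTCH / /g" \
    -e "s/METIS / /g" \
    -e "s/NOGRB/NCEP2/g" \
    "${path_build}/tempswitch" > "${path_build}/switch"
rm "${path_build}/tempswitch"
cat "${path_build}/switch"
```

After the rewrite, the switch line contains SHRD and NCEP2 and none of the distributed-memory switches, which is what the pre/post executables need.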
diff --git a/sorc/gdas.cd b/sorc/gdas.cd index f44a6d500d..2198b41956 160000 --- a/sorc/gdas.cd +++ b/sorc/gdas.cd @@ -1 +1 @@ -Subproject commit f44a6d500dda2aba491e4fa12c0bee428ddb7b80 +Subproject commit 2198b419567cf7efa7404cd076e76e01d86f9e58 diff --git a/sorc/gfs_utils.fd b/sorc/gfs_utils.fd index 7d3b08e87c..de3708bfb0 160000 --- a/sorc/gfs_utils.fd +++ b/sorc/gfs_utils.fd @@ -1 +1 @@ -Subproject commit 7d3b08e87c07cfa54079442d245ac7e9ab1cd9f4 +Subproject commit de3708bfb00cd51900e813b84fdf2a3be5d398b0 diff --git a/sorc/gsi_enkf.fd b/sorc/gsi_enkf.fd index c94bc72ff4..b53740a7bd 160000 --- a/sorc/gsi_enkf.fd +++ b/sorc/gsi_enkf.fd @@ -1 +1 @@ -Subproject commit c94bc72ff410b48c325abbfe92c9fcb601d89aed +Subproject commit b53740a7bd1cc416f634589075b8c8b89f0ef761 diff --git a/sorc/gsi_monitor.fd b/sorc/gsi_monitor.fd index ae256c0d69..8efe38eade 160000 --- a/sorc/gsi_monitor.fd +++ b/sorc/gsi_monitor.fd @@ -1 +1 @@ -Subproject commit ae256c0d69df3232ee9dd3e81b176bf2c3cda312 +Subproject commit 8efe38eadebbd5d50284aee44f6d8b6799a7f6e6 diff --git a/sorc/gsi_utils.fd b/sorc/gsi_utils.fd index 90481d9618..67b014d8d3 160000 --- a/sorc/gsi_utils.fd +++ b/sorc/gsi_utils.fd @@ -1 +1 @@ -Subproject commit 90481d961854e4412ecac49991721e6e63d4b82e +Subproject commit 67b014d8d3e5acc1d21aca15e3fe2d66d327a206 diff --git a/sorc/link_workflow.sh b/sorc/link_workflow.sh index ed33b17e72..b92ae0e757 100755 --- a/sorc/link_workflow.sh +++ b/sorc/link_workflow.sh @@ -107,7 +107,6 @@ for dir in aer \ lut \ mom6 \ orog \ - reg2grb2 \ sfc_climo \ ugwd \ verif \ @@ -122,12 +121,6 @@ do done -if [[ -d "${HOMEgfs}/sorc/ufs_utils.fd" ]]; then - cd "${HOMEgfs}/sorc/ufs_utils.fd/fix" || exit 1 - ./link_fixdirs.sh "${RUN_ENVIR}" "${machine}" 2> /dev/null -fi - - #--------------------------------------- #--add files from external repositories #--------------------------------------- @@ -136,15 +129,25 @@ cd "${HOMEgfs}/parm/ufs" || exit 1 ${LINK_OR_COPY} 
"${HOMEgfs}/sorc/ufs_model.fd/tests/parm/noahmptable.tbl" . cd "${HOMEgfs}/parm/post" || exit 1 -for file in postxconfig-NT-GEFS-ANL.txt postxconfig-NT-GEFS-F00.txt postxconfig-NT-GEFS.txt postxconfig-NT-GFS-ANL.txt \ - postxconfig-NT-GFS-F00-TWO.txt postxconfig-NT-GFS-F00.txt postxconfig-NT-GFS-FLUX-F00.txt postxconfig-NT-GFS-FLUX.txt \ - postxconfig-NT-GFS-GOES.txt postxconfig-NT-GFS-TWO.txt \ - postxconfig-NT-GFS.txt postxconfig-NT-gefs-aerosol.txt postxconfig-NT-gefs-chem.txt params_grib2_tbl_new \ - post_tag_gfs128 post_tag_gfs65 nam_micro_lookup.dat \ - AEROSOL_LUTS.dat optics_luts_DUST.dat optics_luts_SALT.dat optics_luts_SOOT.dat optics_luts_SUSO.dat optics_luts_WASO.dat +for file in postxconfig-NT-GEFS-F00.txt postxconfig-NT-GEFS.txt postxconfig-NT-GEFS-WAFS.txt \ + postxconfig-NT-GEFS-F00-aerosol.txt postxconfig-NT-GEFS-aerosol.txt \ + postxconfig-NT-GFS-ANL.txt postxconfig-NT-GFS-F00.txt postxconfig-NT-GFS-FLUX-F00.txt \ + postxconfig-NT-GFS.txt postxconfig-NT-GFS-FLUX.txt postxconfig-NT-GFS-GOES.txt \ + postxconfig-NT-GFS-F00-TWO.txt postxconfig-NT-GFS-TWO.txt \ + params_grib2_tbl_new post_tag_gfs128 post_tag_gfs65 nam_micro_lookup.dat do ${LINK_OR_COPY} "${HOMEgfs}/sorc/upp.fd/parm/${file}" . done +for file in optics_luts_DUST.dat optics_luts_DUST_nasa.dat optics_luts_NITR_nasa.dat \ + optics_luts_SALT.dat optics_luts_SALT_nasa.dat optics_luts_SOOT.dat optics_luts_SOOT_nasa.dat \ + optics_luts_SUSO.dat optics_luts_SUSO_nasa.dat optics_luts_WASO.dat optics_luts_WASO_nasa.dat +do + ${LINK_OR_COPY} "${HOMEgfs}/sorc/upp.fd/fix/chem/${file}" . +done +for file in ice.csv ocean.csv ocnicepost.nml.jinja2 +do + ${LINK_OR_COPY} "${HOMEgfs}/sorc/gfs_utils.fd/parm/ocnicepost/${file}" . +done cd "${HOMEgfs}/scripts" || exit 8 ${LINK_OR_COPY} "${HOMEgfs}/sorc/ufs_utils.fd/scripts/exemcsfc_global_sfc_prep.sh" . 
@@ -152,29 +155,36 @@ cd "${HOMEgfs}/ush" || exit 8 for file in emcsfc_ice_blend.sh global_cycle_driver.sh emcsfc_snow.sh global_cycle.sh; do ${LINK_OR_COPY} "${HOMEgfs}/sorc/ufs_utils.fd/ush/${file}" . done -for file in finddate.sh make_ntc_bull.pl make_NTC_file.pl make_tif.sh month_name.sh ; do +for file in make_ntc_bull.pl make_NTC_file.pl make_tif.sh month_name.sh ; do ${LINK_OR_COPY} "${HOMEgfs}/sorc/gfs_utils.fd/ush/${file}" . done -# TODO: Link these ufs.configure templates from ufs-weather-model -#cd "${HOMEgfs}/parm/ufs" || exit 1 -#declare -a ufs_configure_files=("ufs.configure.atm.IN" \ -# "ufs.configure.atm_aero.IN" \ -# "ufs.configure.atmw.IN" \ -# "ufs.configure.blocked_atm_wav_2way.IN" \ -# "ufs.configure.blocked_atm_wav.IN" \ -# "ufs.configure.cpld_agrid.IN" \ -# "ufs.configure.cpld_esmfthreads.IN" \ -# "ufs.configure.cpld.IN" \ -# "ufs.configure.cpld_noaero.IN" \ -# "ufs.configure.cpld_noaero_nowave.IN" \ -# "ufs.configure.cpld_noaero_outwav.IN" \ -# "ufs.configure.leapfrog_atm_wav.IN") -#for file in "${ufs_configure_files[@]}"; do -# [[ -s "${file}" ]] && rm -f "${file}" -# ${LINK_OR_COPY} "${HOMEgfs}/sorc/ufs_model.fd/tests/parm/${file}" . 
-#done +# Link these templates from ufs-weather-model +cd "${HOMEgfs}/parm/ufs" || exit 1 +declare -a ufs_templates=("model_configure.IN" \ + "MOM_input_025.IN" "MOM_input_050.IN" "MOM_input_100.IN" "MOM_input_500.IN" \ + "MOM6_data_table.IN" \ + "ice_in.IN" \ + "ufs.configure.atm.IN" \ + "ufs.configure.atm_esmf.IN" \ + "ufs.configure.atmaero.IN" \ + "ufs.configure.atmaero_esmf.IN" \ + "ufs.configure.s2s.IN" \ + "ufs.configure.s2s_esmf.IN" \ + "ufs.configure.s2sa.IN" \ + "ufs.configure.s2sa_esmf.IN" \ + "ufs.configure.s2sw.IN" \ + "ufs.configure.s2sw_esmf.IN" \ + "ufs.configure.s2swa.IN" \ + "ufs.configure.s2swa_esmf.IN" \ + "ufs.configure.leapfrog_atm_wav.IN" \ + "ufs.configure.leapfrog_atm_wav_esmf.IN" ) +for file in "${ufs_templates[@]}"; do + [[ -s "${file}" ]] && rm -f "${file}" + ${LINK_OR_COPY} "${HOMEgfs}/sorc/ufs_model.fd/tests/parm/${file}" . +done +# Link the script from ufs-weather-model that parses the templates cd "${HOMEgfs}/ush" || exit 1 [[ -s "atparse.bash" ]] && rm -f "atparse.bash" ${LINK_OR_COPY} "${HOMEgfs}/sorc/ufs_model.fd/tests/atparse.bash" . @@ -187,7 +197,7 @@ if [[ -d "${HOMEgfs}/sorc/gdas.cd" ]]; then cd "${HOMEgfs}/fix" || exit 1 [[ ! -d gdas ]] && mkdir -p gdas cd gdas || exit 1 - for gdas_sub in fv3jedi gsibec; do + for gdas_sub in fv3jedi gsibec obs; do if [[ -d "${gdas_sub}" ]]; then rm -rf "${gdas_sub}" fi @@ -196,6 +206,18 @@ if [[ -d "${HOMEgfs}/sorc/gdas.cd" ]]; then done fi +#------------------------------ +#--add GDASApp parm directory +#------------------------------ +if [[ -d "${HOMEgfs}/sorc/gdas.cd" ]]; then + cd "${HOMEgfs}/parm/gdas" || exit 1 + declare -a gdasapp_comps=("aero" "atm" "io" "ioda" "snow" "soca") + for comp in "${gdasapp_comps[@]}"; do + [[ -d "${comp}" ]] && rm -rf "${comp}" + ${LINK_OR_COPY} "${HOMEgfs}/sorc/gdas.cd/parm/${comp}" . + done +fi + #------------------------------ #--add GDASApp files #------------------------------ @@ -242,8 +264,8 @@ if [[ ! 
-d "${HOMEgfs}/exec" ]]; then mkdir "${HOMEgfs}/exec" || exit 1 ; fi cd "${HOMEgfs}/exec" || exit 1 for utilexe in fbwndgfs.x gaussian_sfcanl.x gfs_bufr.x supvit.x syndat_getjtbul.x \ - syndat_maksynrc.x syndat_qctropcy.x tocsbufr.x overgridid.x \ - mkgfsawps.x enkf_chgres_recenter_nc.x tave.x vint.x reg2grb2.x + syndat_maksynrc.x syndat_qctropcy.x tocsbufr.x overgridid.x rdbfmsua.x \ + mkgfsawps.x enkf_chgres_recenter_nc.x tave.x vint.x ocnicepost.x webtitle.x do [[ -s "${utilexe}" ]] && rm -f "${utilexe}" ${LINK_OR_COPY} "${HOMEgfs}/sorc/gfs_utils.fd/install/bin/${utilexe}" . @@ -308,6 +330,10 @@ if [[ -d "${HOMEgfs}/sorc/gdas.cd/build" ]]; then "fv3jedi_enshofx.x" \ "fv3jedi_hofx_nomodel.x" \ "fv3jedi_testdata_downloader.py" \ + "gdas_ens_handler.x" \ + "gdas_incr_handler.x" \ + "gdas_obsprovider2ioda.x" \ + "gdas_socahybridweights.x" \ "soca_convertincrement.x" \ "soca_error_covariance_training.x" \ "soca_setcorscales.x" \ @@ -326,10 +352,11 @@ fi #--link source code directories #------------------------------ cd "${HOMEgfs}/sorc" || exit 8 -if [[ -d ufs_model.fd ]]; then - [[ -d upp.fd ]] && rm -rf upp.fd - ${LINK} ufs_model.fd/FV3/upp upp.fd -fi +# TODO: Commenting out until UPP is up-to-date with Rocky-8. +#if [[ -d ufs_model.fd ]]; then +# [[ -d upp.fd ]] && rm -rf upp.fd +# ${LINK} ufs_model.fd/FV3/upp upp.fd +#fi if [[ -d gsi_enkf.fd ]]; then [[ -d gsi.fd ]] && rm -rf gsi.fd @@ -397,7 +424,6 @@ for prog in enkf_chgres_recenter_nc.fd \ mkgfsawps.fd \ overgridid.fd \ rdbfmsua.fd \ - reg2grb2.fd \ supvit.fd \ syndat_getjtbul.fd \ syndat_maksynrc.fd \ @@ -405,7 +431,8 @@ for prog in enkf_chgres_recenter_nc.fd \ tave.fd \ tocsbufr.fd \ vint.fd \ - webtitle.fd + webtitle.fd \ + ocnicepost.fd do if [[ -d "${prog}" ]]; then rm -rf "${prog}"; fi ${LINK_OR_COPY} "gfs_utils.fd/src/${prog}" . 
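The exec-linking loops in link_workflow.sh all follow one idiom: remove any stale file, then link (development installs) or copy (NCO installs). A self-contained sketch, assuming development mode and made-up `demo_src`/`demo_exec` directories and file names:

```shell
#!/usr/bin/env bash
# Sketch of the link-or-copy idiom from link_workflow.sh. In the committed
# script LINK_OR_COPY is "cp -rp" when RUN_ENVIR is nco and "ln -fs" otherwise;
# development mode is hard-coded here for illustration.
LINK_OR_COPY="ln -fs"
mkdir -p demo_src demo_exec
touch demo_src/ocnicepost.x demo_src/webtitle.x

cd demo_exec || exit 1
for utilexe in ocnicepost.x webtitle.x; do
  # Remove any stale copy before refreshing, as link_workflow.sh does
  if [[ -s "${utilexe}" ]]; then rm -f "${utilexe}"; fi
  ${LINK_OR_COPY} "../demo_src/${utilexe}" .
done
cd .. || exit 1
ls -l demo_exec
```

Running it twice is safe: `ln -fs` overwrites the existing symlink, so the loop is idempotent.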
diff --git a/sorc/ncl.setup b/sorc/ncl.setup deleted file mode 100644 index b4981689db..0000000000 --- a/sorc/ncl.setup +++ /dev/null @@ -1,12 +0,0 @@ -#!/bin/bash - -set +x -case ${target} in - 'jet'|'hera') - module load ncl/6.5.0 - export NCARG_LIB=${NCARG_ROOT}/lib - ;; - *) - echo "[${BASH_SOURCE[0]}]: unknown ${target}" - ;; -esac diff --git a/sorc/ufs_model.fd b/sorc/ufs_model.fd index 7088634d67..7fdb58cad0 160000 --- a/sorc/ufs_model.fd +++ b/sorc/ufs_model.fd @@ -1 +1 @@ -Subproject commit 7088634d67da1650de3f3f3789a1c8db250d49be +Subproject commit 7fdb58cad0dad2f62ce7813c6719554d1c5a17af diff --git a/sorc/ufs_utils.fd b/sorc/ufs_utils.fd index 7dcadcaf8d..f42fae239d 160000 --- a/sorc/ufs_utils.fd +++ b/sorc/ufs_utils.fd @@ -1 +1 @@ -Subproject commit 7dcadcaf8db44ecd4ef46a1802c666c526a5e8c8 +Subproject commit f42fae239d0824f7b9a83c9afdc3d980894c7df8 diff --git a/sorc/upp.fd b/sorc/upp.fd new file mode 160000 index 0000000000..4770a2f509 --- /dev/null +++ b/sorc/upp.fd @@ -0,0 +1 @@ +Subproject commit 4770a2f509b7122e76c4f004210031a58ae9502c diff --git a/sorc/verif-global.fd b/sorc/verif-global.fd index c267780a12..9377e84ba3 160000 --- a/sorc/verif-global.fd +++ b/sorc/verif-global.fd @@ -1 +1 @@ -Subproject commit c267780a1255fa7db052c745cf9c78b7dc6a2695 +Subproject commit 9377e84ba3fc9b2fd13c2c84cfd571855dee75ae diff --git a/sorc/wxflow b/sorc/wxflow index 528f5abb49..942b90bfaa 160000 --- a/sorc/wxflow +++ b/sorc/wxflow @@ -1 +1 @@ -Subproject commit 528f5abb49e80751f83ebd6eb0a87bc70012bb24 +Subproject commit 942b90bfaa14f6b6d7374310dbdfd421ddb30548 diff --git a/ush/bash_utils.sh b/ush/bash_utils.sh new file mode 100644 index 0000000000..bd351dcd36 --- /dev/null +++ b/ush/bash_utils.sh @@ -0,0 +1,127 @@ +#! /usr/bin/env bash + +function generate_com() { + # + # Generate a list COM variables from a template by substituting in env variables. + # + # Each argument must have a corresponding template with the name ${ARG}_TMPL. 
Any + # variables in the template are replaced with their values. Undefined variables + # are just removed without raising an error. + # + # Accepts as options `-r` and `-x`, which do the same thing as the same options in + # `declare`. Variables are automatically marked as `-g` so the variable is visible + # in the calling script. + # + # Syntax: + # generate_com [-rx] $var1[:$tmpl1] [$var2[:$tmpl2]] [...]] + # + # options: + # -r: Make variable read-only (same as `declare -r`) + # -x: Mark variable for export (same as `declare -x`) + # var1, var2, etc: Variable names whose values will be generated from a template + # and declared + # tmpl1, tmpl2, etc: Specify the template to use (default is "${var}_TMPL") + # + # Examples: + # # Current cycle and RUN, implicitly using template COM_ATMOS_ANALYSIS_TMPL + # YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_ANALYSIS + # + # # Previous cycle and gdas using an explicit template + # RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \ + # COM_ATMOS_HISTORY_PREV:COM_ATMOS_HISTORY_TMPL + # + # # Current cycle and COM for first member + # MEMDIR='mem001' YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_HISTORY + # + if [[ ${DEBUG_WORKFLOW:-"NO"} == "NO" ]]; then set +x; fi + local opts="-g" + local OPTIND=1 + while getopts "rx" option; do + opts="${opts}${option}" + done + shift $((OPTIND-1)) + + for input in "$@"; do + IFS=':' read -ra args <<< "${input}" + local com_var="${args[0]}" + local template + local value + if (( ${#args[@]} > 1 )); then + template="${args[1]}" + else + template="${com_var}_TMPL" + fi + if [[ ! -v "${template}" ]]; then + echo "FATAL ERROR in generate_com: Requested template ${template} not defined!"
+ exit 2 + fi + value=$(echo "${!template}" | envsubst) + # shellcheck disable=SC2086 + declare ${opts} "${com_var}"="${value}" + # shellcheck disable= + echo "generate_com :: ${com_var}=${value}" + done + set_trace +} + +function wait_for_file() { + # + # Wait for a file to exist and return the status. + # + # Checks if a file exists periodically up to a maximum number of attempts. When the file + # exists or the limit is reached, the status is returned (0 if the file exists, 1 if it + # does not). This allows it to be used as a conditional to handle missing files. + # + # Syntax: + # wait_for_file file_name [sleep_interval [max_tries]] + # + # file_name: File to check the existence of (must be readable) + # sleep_interval: Time to wait between each check (in seconds) [default: 60] + # max_tries: The maximum number of checks to make [default: 100] + # + # Example: + # ``` + # file_name=/path/to/foo + # sleep_interval=60 + # max_tries=30 + # if ! wait_for_file "${file_name}" "${sleep_interval}" "${max_tries}"; then + # echo "FATAL ERROR: ${file_name} still does not exist after waiting one-half hour."
+ # exit 1 + # fi + # # Code that depends on file existing + # ``` + # + set +x + local file_name=${1:?"wait_for_file() requires a file name"} + local sleep_interval=${2:-60} + local max_tries=${3:-100} + + for (( iter=0; iter> "${DATA}/INPUT/coupler.res" << EOF - 2 (Calendar: no_calendar=0, thirty_day_months=1, julian=2, gregorian=3, noleap=4) + 3 (Calendar: no_calendar=0, thirty_day_months=1, julian=2, gregorian=3, noleap=4) ${gPDY:0:4} ${gPDY:4:2} ${gPDY:6:2} ${gcyc} 0 0 Model start time: year, month, day, hour, minute, second ${sPDY:0:4} ${sPDY:4:2} ${sPDY:6:2} ${scyc} 0 0 Current model time: year, month, day, hour, minute, second EOF @@ -93,7 +93,9 @@ EOF ${NLN} "${file}" "${DATA}/INPUT/${file2}" done - local hour_rst=$(nhour "${CDATE_RST}" "${current_cycle}") + local hour_rst + hour_rst=$(nhour "${CDATE_RST}" "${current_cycle}") + # shellcheck disable=SC2034 IAU_FHROT=$((IAU_OFFSET+hour_rst)) if [[ ${DOIAU} = "YES" ]]; then IAUFHRS=-1 @@ -133,17 +135,15 @@ EOF #-------------------------------------------------------------------------- # Grid and orography data - FIXsfc=${FIXsfc:-"${FIXorog}/${CASE}/sfc"} - if [[ ${cplflx} = ".false." 
]] ; then - ${NLN} "${FIXorog}/${CASE}/${CASE}_mosaic.nc" "${DATA}/INPUT/grid_spec.nc" + ${NLN} "${FIXgfs}/orog/${CASE}/${CASE}_mosaic.nc" "${DATA}/INPUT/grid_spec.nc" else - ${NLN} "${FIXorog}/${CASE}/${CASE}_mosaic.nc" "${DATA}/INPUT/${CASE}_mosaic.nc" + ${NLN} "${FIXgfs}/orog/${CASE}/${CASE}_mosaic.nc" "${DATA}/INPUT/${CASE}_mosaic.nc" fi for n in $(seq 1 "${ntiles}"); do - ${NLN} "${FIXorog}/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile${n}.nc" "${DATA}/INPUT/oro_data.tile${n}.nc" - ${NLN} "${FIXorog}/${CASE}/${CASE}_grid.tile${n}.nc" "${DATA}/INPUT/${CASE}_grid.tile${n}.nc" + ${NLN} "${FIXgfs}/orog/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile${n}.nc" "${DATA}/INPUT/oro_data.tile${n}.nc" + ${NLN} "${FIXgfs}/orog/${CASE}/${CASE}_grid.tile${n}.nc" "${DATA}/INPUT/${CASE}_grid.tile${n}.nc" done _suite_file="${HOMEgfs}/sorc/ufs_model.fd/FV3/ccpp/suites/suite_${CCPP_SUITE}.xml" @@ -192,7 +192,7 @@ EOF fi # NoahMP table - local noahmptablefile="${HOMEgfs}/parm/ufs/noahmptable.tbl" + local noahmptablefile="${PARMgfs}/ufs/noahmptable.tbl" if [[ ! 
-f ${noahmptablefile} ]]; then echo "FATAL ERROR: missing noahmp table file ${noahmptablefile}" exit 1 @@ -201,10 +201,10 @@ EOF fi # Files for GWD - ${NLN} "${FIXugwd}/ugwp_limb_tau.nc" "${DATA}/ugwp_limb_tau.nc" + ${NLN} "${FIXgfs}/ugwd/ugwp_limb_tau.nc" "${DATA}/ugwp_limb_tau.nc" for n in $(seq 1 "${ntiles}"); do - ${NLN} "${FIXugwd}/${CASE}/${CASE}_oro_data_ls.tile${n}.nc" "${DATA}/INPUT/oro_data_ls.tile${n}.nc" - ${NLN} "${FIXugwd}/${CASE}/${CASE}_oro_data_ss.tile${n}.nc" "${DATA}/INPUT/oro_data_ss.tile${n}.nc" + ${NLN} "${FIXgfs}/ugwd/${CASE}/${CASE}_oro_data_ls.tile${n}.nc" "${DATA}/INPUT/oro_data_ls.tile${n}.nc" + ${NLN} "${FIXgfs}/ugwd/${CASE}/${CASE}_oro_data_ss.tile${n}.nc" "${DATA}/INPUT/oro_data_ss.tile${n}.nc" done # GFS standard input data @@ -225,51 +225,59 @@ EOF # imp_physics should be 8: #### if [[ ${imp_physics} -eq 8 ]]; then - ${NLN} "${FIXam}/CCN_ACTIVATE.BIN" "${DATA}/CCN_ACTIVATE.BIN" - ${NLN} "${FIXam}/freezeH2O.dat" "${DATA}/freezeH2O.dat" - ${NLN} "${FIXam}/qr_acr_qgV2.dat" "${DATA}/qr_acr_qgV2.dat" - ${NLN} "${FIXam}/qr_acr_qsV2.dat" "${DATA}/qr_acr_qsV2.dat" + ${NLN} "${FIXgfs}/am/CCN_ACTIVATE.BIN" "${DATA}/CCN_ACTIVATE.BIN" + ${NLN} "${FIXgfs}/am/freezeH2O.dat" "${DATA}/freezeH2O.dat" + ${NLN} "${FIXgfs}/am/qr_acr_qgV2.dat" "${DATA}/qr_acr_qgV2.dat" + ${NLN} "${FIXgfs}/am/qr_acr_qsV2.dat" "${DATA}/qr_acr_qsV2.dat" fi - ${NLN} "${FIXam}/${O3FORC}" "${DATA}/global_o3prdlos.f77" - ${NLN} "${FIXam}/${H2OFORC}" "${DATA}/global_h2oprdlos.f77" - ${NLN} "${FIXam}/global_solarconstant_noaa_an.txt" "${DATA}/solarconstant_noaa_an.txt" - ${NLN} "${FIXam}/global_sfc_emissivity_idx.txt" "${DATA}/sfc_emissivity_idx.txt" + ${NLN} "${FIXgfs}/am/${O3FORC}" "${DATA}/global_o3prdlos.f77" + ${NLN} "${FIXgfs}/am/${H2OFORC}" "${DATA}/global_h2oprdlos.f77" + ${NLN} "${FIXgfs}/am/global_solarconstant_noaa_an.txt" "${DATA}/solarconstant_noaa_an.txt" + ${NLN} "${FIXgfs}/am/global_sfc_emissivity_idx.txt" "${DATA}/sfc_emissivity_idx.txt" ## merra2 aerosol climo 
if [[ ${IAER} -eq "1011" ]]; then for month in $(seq 1 12); do MM=$(printf %02d "${month}") - ${NLN} "${FIXaer}/merra2.aerclim.2003-2014.m${MM}.nc" "aeroclim.m${MM}.nc" + ${NLN} "${FIXgfs}/aer/merra2.aerclim.2003-2014.m${MM}.nc" "aeroclim.m${MM}.nc" done - ${NLN} "${FIXlut}/optics_BC.v1_3.dat" "${DATA}/optics_BC.dat" - ${NLN} "${FIXlut}/optics_OC.v1_3.dat" "${DATA}/optics_OC.dat" - ${NLN} "${FIXlut}/optics_DU.v15_3.dat" "${DATA}/optics_DU.dat" - ${NLN} "${FIXlut}/optics_SS.v3_3.dat" "${DATA}/optics_SS.dat" - ${NLN} "${FIXlut}/optics_SU.v1_3.dat" "${DATA}/optics_SU.dat" fi - - ${NLN} "${FIXam}/global_co2historicaldata_glob.txt" "${DATA}/co2historicaldata_glob.txt" - ${NLN} "${FIXam}/co2monthlycyc.txt" "${DATA}/co2monthlycyc.txt" - if [[ ${ICO2} -gt 0 ]]; then - for file in $(ls "${FIXam}/fix_co2_proj/global_co2historicaldata"*) ; do + + ${NLN} "${FIXgfs}/lut/optics_BC.v1_3.dat" "${DATA}/optics_BC.dat" + ${NLN} "${FIXgfs}/lut/optics_OC.v1_3.dat" "${DATA}/optics_OC.dat" + ${NLN} "${FIXgfs}/lut/optics_DU.v15_3.dat" "${DATA}/optics_DU.dat" + ${NLN} "${FIXgfs}/lut/optics_SS.v3_3.dat" "${DATA}/optics_SS.dat" + ${NLN} "${FIXgfs}/lut/optics_SU.v1_3.dat" "${DATA}/optics_SU.dat" + + ${NLN} "${FIXgfs}/am/global_co2historicaldata_glob.txt" "${DATA}/co2historicaldata_glob.txt" + ${NLN} "${FIXgfs}/am/co2monthlycyc.txt" "${DATA}/co2monthlycyc.txt" + # Set historical CO2 values based on whether this is a reforecast run or not + # Ref. 
issue 2403 + local co2dir + co2dir="fix_co2_proj" + if [[ ${reforecast:-"NO"} == "YES" ]]; then + co2dir="co2dat_4a" + fi + if (( ICO2 > 0 )); then + for file in $(ls "${FIXgfs}/am/${co2dir}/global_co2historicaldata"*) ; do ${NLN} "${file}" "${DATA}/$(basename "${file//global_}")" done fi - ${NLN} "${FIXam}/global_climaeropac_global.txt" "${DATA}/aerosol.dat" + ${NLN} "${FIXgfs}/am/global_climaeropac_global.txt" "${DATA}/aerosol.dat" if [[ ${IAER} -gt 0 ]] ; then - for file in $(ls "${FIXam}/global_volcanic_aerosols"*) ; do + for file in $(ls "${FIXgfs}/am/global_volcanic_aerosols"*) ; do ${NLN} "${file}" "${DATA}/$(basename "${file//global_}")" done fi # inline post fix files if [[ ${WRITE_DOPOST} = ".true." ]]; then - ${NLN} "${PARM_POST}/post_tag_gfs${LEVS}" "${DATA}/itag" - ${NLN} "${FLTFILEGFS:-${PARM_POST}/postxconfig-NT-GFS-TWO.txt}" "${DATA}/postxconfig-NT.txt" - ${NLN} "${FLTFILEGFSF00:-${PARM_POST}/postxconfig-NT-GFS-F00-TWO.txt}" "${DATA}/postxconfig-NT_FH00.txt" - ${NLN} "${POSTGRB2TBL:-${PARM_POST}/params_grib2_tbl_new}" "${DATA}/params_grib2_tbl_new" + ${NLN} "${PARMgfs}/post/post_tag_gfs${LEVS}" "${DATA}/itag" + ${NLN} "${FLTFILEGFS:-${PARMgfs}/post/postxconfig-NT-GFS-TWO.txt}" "${DATA}/postxconfig-NT.txt" + ${NLN} "${FLTFILEGFSF00:-${PARMgfs}/post/postxconfig-NT-GFS-F00-TWO.txt}" "${DATA}/postxconfig-NT_FH00.txt" + ${NLN} "${POSTGRB2TBL:-${PARMgfs}/post/params_grib2_tbl_new}" "${DATA}/params_grib2_tbl_new" fi #------------------------------------------------------------------ @@ -296,28 +304,28 @@ EOF LATB_JMO=${LATB_JMO:-${LATB_CASE}} # Fix files - FNGLAC=${FNGLAC:-"${FIXam}/global_glacier.2x2.grb"} - FNMXIC=${FNMXIC:-"${FIXam}/global_maxice.2x2.grb"} - FNTSFC=${FNTSFC:-"${FIXam}/RTGSST.1982.2012.monthly.clim.grb"} - FNSNOC=${FNSNOC:-"${FIXam}/global_snoclim.1.875.grb"} + FNGLAC=${FNGLAC:-"${FIXgfs}/am/global_glacier.2x2.grb"} + FNMXIC=${FNMXIC:-"${FIXgfs}/am/global_maxice.2x2.grb"} + 
FNTSFC=${FNTSFC:-"${FIXgfs}/am/RTGSST.1982.2012.monthly.clim.grb"} + FNSNOC=${FNSNOC:-"${FIXgfs}/am/global_snoclim.1.875.grb"} FNZORC=${FNZORC:-"igbp"} - FNAISC=${FNAISC:-"${FIXam}/IMS-NIC.blended.ice.monthly.clim.grb"} - FNALBC2=${FNALBC2:-"${FIXsfc}/${CASE}.mx${OCNRES}.facsf.tileX.nc"} - FNTG3C=${FNTG3C:-"${FIXsfc}/${CASE}.mx${OCNRES}.substrate_temperature.tileX.nc"} - FNVEGC=${FNVEGC:-"${FIXsfc}/${CASE}.mx${OCNRES}.vegetation_greenness.tileX.nc"} - FNMSKH=${FNMSKH:-"${FIXam}/global_slmask.t1534.3072.1536.grb"} - FNVMNC=${FNVMNC:-"${FIXsfc}/${CASE}.mx${OCNRES}.vegetation_greenness.tileX.nc"} - FNVMXC=${FNVMXC:-"${FIXsfc}/${CASE}.mx${OCNRES}.vegetation_greenness.tileX.nc"} - FNSLPC=${FNSLPC:-"${FIXsfc}/${CASE}.mx${OCNRES}.slope_type.tileX.nc"} - FNALBC=${FNALBC:-"${FIXsfc}/${CASE}.mx${OCNRES}.snowfree_albedo.tileX.nc"} - FNVETC=${FNVETC:-"${FIXsfc}/${CASE}.mx${OCNRES}.vegetation_type.tileX.nc"} - FNSOTC=${FNSOTC:-"${FIXsfc}/${CASE}.mx${OCNRES}.soil_type.tileX.nc"} - FNSOCC=${FNSOCC:-"${FIXsfc}/${CASE}.mx${OCNRES}.soil_color.tileX.nc"} - FNABSC=${FNABSC:-"${FIXsfc}/${CASE}.mx${OCNRES}.maximum_snow_albedo.tileX.nc"} - FNSMCC=${FNSMCC:-"${FIXam}/global_soilmgldas.statsgo.t${JCAP}.${LONB}.${LATB}.grb"} + FNAISC=${FNAISC:-"${FIXgfs}/am/IMS-NIC.blended.ice.monthly.clim.grb"} + FNALBC2=${FNALBC2:-"${FIXgfs}/orog/${CASE}/sfc/${CASE}.mx${OCNRES}.facsf.tileX.nc"} + FNTG3C=${FNTG3C:-"${FIXgfs}/orog/${CASE}/sfc/${CASE}.mx${OCNRES}.substrate_temperature.tileX.nc"} + FNVEGC=${FNVEGC:-"${FIXgfs}/orog/${CASE}/sfc/${CASE}.mx${OCNRES}.vegetation_greenness.tileX.nc"} + FNMSKH=${FNMSKH:-"${FIXgfs}/am/global_slmask.t1534.3072.1536.grb"} + FNVMNC=${FNVMNC:-"${FIXgfs}/orog/${CASE}/sfc/${CASE}.mx${OCNRES}.vegetation_greenness.tileX.nc"} + FNVMXC=${FNVMXC:-"${FIXgfs}/orog/${CASE}/sfc/${CASE}.mx${OCNRES}.vegetation_greenness.tileX.nc"} + FNSLPC=${FNSLPC:-"${FIXgfs}/orog/${CASE}/sfc/${CASE}.mx${OCNRES}.slope_type.tileX.nc"} + 
FNALBC=${FNALBC:-"${FIXgfs}/orog/${CASE}/sfc/${CASE}.mx${OCNRES}.snowfree_albedo.tileX.nc"} + FNVETC=${FNVETC:-"${FIXgfs}/orog/${CASE}/sfc/${CASE}.mx${OCNRES}.vegetation_type.tileX.nc"} + FNSOTC=${FNSOTC:-"${FIXgfs}/orog/${CASE}/sfc/${CASE}.mx${OCNRES}.soil_type.tileX.nc"} + FNSOCC=${FNSOCC:-"${FIXgfs}/orog/${CASE}/sfc/${CASE}.mx${OCNRES}.soil_color.tileX.nc"} + FNABSC=${FNABSC:-"${FIXgfs}/orog/${CASE}/sfc/${CASE}.mx${OCNRES}.maximum_snow_albedo.tileX.nc"} + FNSMCC=${FNSMCC:-"${FIXgfs}/am/global_soilmgldas.statsgo.t${JCAP}.${LONB}.${LATB}.grb"} # If the appropriate resolution fix file is not present, use the highest resolution available (T1534) - [[ ! -f ${FNSMCC} ]] && FNSMCC="${FIXam}/global_soilmgldas.statsgo.t1534.3072.1536.grb" + [[ ! -f ${FNSMCC} ]] && FNSMCC="${FIXgfs}/am/global_soilmgldas.statsgo.t1534.3072.1536.grb" # NSST Options # nstf_name contains the NSST related parameters @@ -432,26 +440,26 @@ EOF fi # Stochastic Physics Options - if [[ ${SET_STP_SEED:-"YES"} = "YES" ]]; then - ISEED_SKEB=$((current_cycle*1000 + MEMBER*10 + 1)) - ISEED_SHUM=$((current_cycle*1000 + MEMBER*10 + 2)) - ISEED_SPPT=$((current_cycle*1000 + MEMBER*10 + 3)) - ISEED_CA=$(( (current_cycle*1000 + MEMBER*10 + 4) % 2147483647 )) - ISEED_LNDP=$(( (current_cycle*1000 + MEMBER*10 + 5) % 2147483647 )) + if [[ ${DO_SPPT:-"NO"} = "YES" ]]; then + do_sppt=".true." + ISEED_SPPT=$((current_cycle*10000 + ${MEMBER#0}*100 + 3)),$((current_cycle*10000 + ${MEMBER#0}*100 + 4)),$((current_cycle*10000 + ${MEMBER#0}*100 + 5)),$((current_cycle*10000 + ${MEMBER#0}*100 + 6)),$((current_cycle*10000 + ${MEMBER#0}*100 + 7)) else ISEED=${ISEED:-0} fi + if (( MEMBER > 0 )) && [[ ${DO_CA:-"NO"} = "YES" ]]; then + ISEED_CA=$(( (current_cycle*10000 + ${MEMBER#0}*100 + 18) % 2147483647 )) + fi if [[ ${DO_SKEB} = "YES" ]]; then do_skeb=".true." - fi - if [[ ${DO_SPPT} = "YES" ]]; then - do_sppt=".true." 
+ ISEED_SKEB=$((current_cycle*10000 + ${MEMBER#0}*100 + 1)) fi if [[ ${DO_SHUM} = "YES" ]]; then do_shum=".true." + ISEED_SHUM=$((current_cycle*1000 + MEMBER*10 + 2)) fi if [[ ${DO_LAND_PERT} = "YES" ]]; then lndp_type=${lndp_type:-2} + ISEED_LNDP=$(( (current_cycle*1000 + MEMBER*10 + 5) % 2147483647 )) LNDP_TAU=${LNDP_TAU:-21600} LNDP_SCALE=${LNDP_SCALE:-500000} ISEED_LNDP=${ISEED_LNDP:-${ISEED}} @@ -463,8 +471,6 @@ EOF LONB_STP=${LONB_STP:-${LONB_CASE}} LATB_STP=${LATB_STP:-${LATB_CASE}} cd "${DATA}" || exit 1 - if [[ ! -d ${COM_ATMOS_HISTORY} ]]; then mkdir -p "${COM_ATMOS_HISTORY}"; fi - if [[ ! -d ${COM_ATMOS_MASTER} ]]; then mkdir -p "${COM_ATMOS_MASTER}"; fi if [[ "${QUILTING}" = ".true." ]] && [[ "${OUTPUT_GRID}" = "gaussian_grid" ]]; then for fhr in ${FV3_OUTPUT_FH}; do local FH3=$(printf %03i "${fhr}") @@ -492,7 +498,7 @@ FV3_nml(){ # namelist output for a certain component echo "SUB ${FUNCNAME[0]}: Creating name lists and model configure file for FV3" # Call child scripts in current script directory - source "${HOMEgfs}/ush/parsing_namelists_FV3.sh" + source "${USHgfs}/parsing_namelists_FV3.sh" FV3_namelists echo "SUB ${FUNCNAME[0]}: FV3 name lists and model configure file created" } @@ -503,7 +509,6 @@ FV3_out() { # Copy FV3 restart files if [[ ${RUN} =~ "gdas" ]]; then cd "${DATA}/RESTART" - mkdir -p "${COM_ATMOS_RESTART}" local idate=$(date --utc -d "${current_cycle:0:8} ${current_cycle:8:2} + ${restart_interval} hours" +%Y%m%d%H) while [[ ${idate} -le ${forecast_end_cycle} ]]; do for file in "${idate:0:8}.${idate:8:2}0000."*; do @@ -516,7 +521,7 @@ FV3_out() { ${NCP} "${DATA}/input.nml" "${COM_CONF}/ufs.input.nml" ${NCP} "${DATA}/model_configure" "${COM_CONF}/ufs.model_configure" ${NCP} "${DATA}/ufs.configure" "${COM_CONF}/ufs.ufs.configure" - ${NCP} "${DATA}/diag_table" "${COM_CONF}/ufs.diag_table" + ${NCP} "${DATA}/diag_table" "${COM_CONF}/ufs.diag_table" fi echo "SUB ${FUNCNAME[0]}: Output data for FV3 copied" } @@ -540,12 +545,11 @@ WW3_postdet() 
{ fi - #if wave mesh is not the same as the ocn/ice mesh, linkk it in the file - local comparemesh=${MESH_OCN_ICE:-"mesh.mx${ICERES}.nc"} - if [[ "${MESH_WAV}" = "${comparemesh}" ]]; then - echo "Wave is on same mesh as ocean/ice" + #if wave mesh is not the same as the ocean mesh, link it in the file + if [[ "${MESH_WAV}" == "${MESH_OCN:-mesh.mx${OCNRES}.nc}" ]]; then + echo "Wave is on same mesh as ocean" else - ${NLN} "${FIXwave}/${MESH_WAV}" "${DATA}/" + ${NLN} "${FIXgfs}/wave/${MESH_WAV}" "${DATA}/" fi export wavprfx=${RUNwave}${WAV_MEMBER:-} @@ -605,8 +609,6 @@ WW3_postdet() { ${NLN} "${wavcurfile}" "${DATA}/current.${WAVECUR_FID}" fi - if [[ ! -d ${COM_WAVE_HISTORY} ]]; then mkdir -p "${COM_WAVE_HISTORY}"; fi - # Link output files cd "${DATA}" if [[ ${waveMULTIGRID} = ".true." ]]; then @@ -651,8 +653,8 @@ WW3_nml() { echo "SUB ${FUNCNAME[0]}: Copying input files for WW3" WAV_MOD_TAG=${RUN}wave${waveMEMB} if [[ "${USE_WAV_RMP:-YES}" = "YES" ]]; then - if (( $( ls -1 "${FIXwave}/rmp_src_to_dst_conserv_"* 2> /dev/null | wc -l) > 0 )); then - for file in $(ls "${FIXwave}/rmp_src_to_dst_conserv_"*) ; do + if (( $( ls -1 "${FIXgfs}/wave/rmp_src_to_dst_conserv_"* 2> /dev/null | wc -l) > 0 )); then + for file in $(ls "${FIXgfs}/wave/rmp_src_to_dst_conserv_"*) ; do ${NLN} "${file}" "${DATA}/" done else @@ -660,7 +662,7 @@ WW3_nml() { exit 4 fi fi - source "${HOMEgfs}/ush/parsing_namelists_WW3.sh" + source "${USHgfs}/parsing_namelists_WW3.sh" WW3_namelists } @@ -683,6 +685,7 @@ MOM6_postdet() { ${NLN} "${COM_OCEAN_RESTART_PREV}/${sPDY}.${scyc}0000.MOM.res.nc" "${DATA}/INPUT/MOM.res.nc" case ${OCNRES} in "025") + local nn for nn in $(seq 1 4); do if [[ -f "${COM_OCEAN_RESTART_PREV}/${sPDY}.${scyc}0000.MOM.res_${nn}.nc" ]]; then ${NLN} "${COM_OCEAN_RESTART_PREV}/${sPDY}.${scyc}0000.MOM.res_${nn}.nc" "${DATA}/INPUT/MOM.res_${nn}.nc" @@ -700,11 +703,17 @@ MOM6_postdet() { ${NLN} "${COM_OCEAN_ANALYSIS}/${RUN}.t${cyc}z.ocninc.nc" "${DATA}/INPUT/mom6_increment.nc" fi + # GEFS 
perturbations + # TODO: an if [[ ${RUN} == "gefs" ]] block may be needed + # to ensure it does not interfere with the GFS + if (( MEMBER > 0 )) && [[ "${ODA_INCUPD:-False}" == "True" ]]; then + ${NLN} "${COM_OCEAN_RESTART_PREV}/${sPDY}.${scyc}0000.mom6_increment.nc" "${DATA}/INPUT/mom6_increment.nc" + fi # Copy MOM6 fixed files - ${NCP} "${FIXmom}/${OCNRES}/"* "${DATA}/INPUT/" + ${NCP} "${FIXgfs}/mom6/${OCNRES}/"* "${DATA}/INPUT/" # TODO: These need to be explicit # Copy coupled grid_spec - spec_file="${FIXcpl}/a${CASE}o${OCNRES}/grid_spec.nc" + spec_file="${FIXgfs}/cpl/a${CASE}o${OCNRES}/grid_spec.nc" if [[ -s ${spec_file} ]]; then ${NCP} "${spec_file}" "${DATA}/INPUT/" else @@ -712,90 +721,62 @@ MOM6_postdet() { exit 3 fi - # Copy mediator restart files to RUNDIR # TODO: mediator should have its own CMEPS_postdet() function - if [[ ${warm_start} = ".true." ]]; then - local mediator_file="${COM_MED_RESTART}/${PDY}.${cyc}0000.ufs.cpld.cpl.r.nc" - if [[ -f "${mediator_file}" ]]; then - ${NCP} "${mediator_file}" "${DATA}/ufs.cpld.cpl.r.nc" - rm -f "${DATA}/rpointer.cpl" - touch "${DATA}/rpointer.cpl" - echo "ufs.cpld.cpl.r.nc" >> "${DATA}/rpointer.cpl" - else - # We have a choice to make here. - # Either we can FATAL ERROR out, or we can let the coupling fields initialize from zero - # cmeps_run_type is determined based on the availability of the mediator restart file - echo "WARNING: ${mediator_file} does not exist for warm_start = .true., initializing!" - #echo "FATAL ERROR: ${mediator_file} must exist for warm_start = .true. and does not, ABORT!"
- #exit 4 - fi - else - # This is a cold start, so initialize the coupling fields from zero - export cmeps_run_type="startup" - fi - # If using stochatic parameterizations, create a seed that does not exceed the # largest signed integer - if [[ "${DO_OCN_SPPT}" = "YES" ]] || [[ "${DO_OCN_PERT_EPBL}" = "YES" ]]; then - if [[ ${SET_STP_SEED:-"YES"} = "YES" ]]; then - ISEED_OCNSPPT=$(( (current_cycle*1000 + MEMBER*10 + 6) % 2147483647 )) - ISEED_EPBL=$(( (current_cycle*1000 + MEMBER*10 + 7) % 2147483647 )) - else - ISEED=${ISEED:-0} - fi + if [[ ${DO_OCN_SPPT} = "YES" ]]; then + ISEED_OCNSPPT=$((current_cycle*10000 + ${MEMBER#0}*100 + 8)),$((current_cycle*10000 + ${MEMBER#0}*100 + 9)),$((current_cycle*10000 + ${MEMBER#0}*100 + 10)),$((current_cycle*10000 + ${MEMBER#0}*100 + 11)),$((current_cycle*10000 + ${MEMBER#0}*100 + 12)) + fi + if [[ ${DO_OCN_PERT_EPBL} = "YES" ]]; then + ISEED_EPBL=$((current_cycle*10000 + ${MEMBER#0}*100 + 13)),$((current_cycle*10000 + ${MEMBER#0}*100 + 14)),$((current_cycle*10000 + ${MEMBER#0}*100 + 15)),$((current_cycle*10000 + ${MEMBER#0}*100 + 16)),$((current_cycle*10000 + ${MEMBER#0}*100 + 17)) fi - - # Create COMOUTocean - [[ ! -d ${COM_OCEAN_HISTORY} ]] && mkdir -p "${COM_OCEAN_HISTORY}" # Link output files if [[ "${RUN}" =~ "gfs" || "${RUN}" =~ "gefs" ]]; then - # Link output files for RUN = gfs + # Link output files for RUN = gfs|gefs - # TODO: get requirements on what files need to be written out and what these dates here are and what they mean - - if [[ ! 
-d ${COM_OCEAN_HISTORY} ]]; then mkdir -p "${COM_OCEAN_HISTORY}"; fi + # Looping over MOM6 output hours + local fhr fhr3 last_fhr interval midpoint vdate vdate_mid source_file dest_file + for fhr in ${MOM6_OUTPUT_FH}; do + fhr3=$(printf %03i "${fhr}") - # Looping over FV3 output hours - # TODO: Need to define MOM6_OUTPUT_FH and control at some point for issue #1629 - for fhr in ${FV3_OUTPUT_FH}; do if [[ -z ${last_fhr:-} ]]; then - local last_fhr=${fhr} + last_fhr=${fhr} continue fi + (( interval = fhr - last_fhr )) (( midpoint = last_fhr + interval/2 )) - local vdate=$(date --utc -d "${current_cycle:0:8} ${current_cycle:8:2} + ${fhr} hours" +%Y%m%d%H) - local vdate_mid=$(date --utc -d "${current_cycle:0:8} ${current_cycle:8:2} + ${midpoint} hours" +%Y%m%d%H) - + vdate=$(date --utc -d "${current_cycle:0:8} ${current_cycle:8:2} + ${fhr} hours" +%Y%m%d%H) + vdate_mid=$(date --utc -d "${current_cycle:0:8} ${current_cycle:8:2} + ${midpoint} hours" +%Y%m%d%H) # Native model output uses window midpoint in the filename, but we are mapping that to the end of the period for COM - local source_file="ocn_${vdate_mid:0:4}_${vdate_mid:4:2}_${vdate_mid:6:2}_${vdate_mid:8:2}.nc" - local dest_file="ocn${vdate}.${ENSMEM}.${current_cycle}.nc" + source_file="ocn_${vdate_mid:0:4}_${vdate_mid:4:2}_${vdate_mid:6:2}_${vdate_mid:8:2}.nc" + dest_file="${RUN}.ocean.t${cyc}z.${interval}hr_avg.f${fhr3}.nc" ${NLN} "${COM_OCEAN_HISTORY}/${dest_file}" "${DATA}/${source_file}" - local source_file="ocn_daily_${vdate:0:4}_${vdate:4:2}_${vdate:6:2}.nc" - local dest_file=${source_file} - if [[ ! 
-a "${DATA}/${source_file}" ]]; then + # Daily output + if (( fhr > 0 & fhr % 24 == 0 )); then + source_file="ocn_daily_${vdate:0:4}_${vdate:4:2}_${vdate:6:2}.nc" + dest_file="${RUN}.ocean.t${cyc}z.daily.f${fhr3}.nc" ${NLN} "${COM_OCEAN_HISTORY}/${dest_file}" "${DATA}/${source_file}" fi - local last_fhr=${fhr} + last_fhr=${fhr} + done elif [[ "${RUN}" =~ "gdas" ]]; then # Link output files for RUN = gdas - # Save MOM6 backgrounds - for fhr in ${FV3_OUTPUT_FH}; do - local idatestr=$(date --utc -d "${current_cycle:0:8} ${current_cycle:8:2} + ${fhr} hours" +%Y_%m_%d_%H) + # Save (instantaneous) MOM6 backgrounds + for fhr in ${MOM6_OUTPUT_FH}; do local fhr3=$(printf %03i "${fhr}") - ${NLN} "${COM_OCEAN_HISTORY}/${RUN}.t${cyc}z.ocnf${fhr3}.nc" "${DATA}/ocn_da_${idatestr}.nc" + local vdatestr=$(date --utc -d "${current_cycle:0:8} ${current_cycle:8:2} + ${fhr} hours" +%Y_%m_%d_%H) + ${NLN} "${COM_OCEAN_HISTORY}/${RUN}.ocean.t${cyc}z.inst.f${fhr3}.nc" "${DATA}/ocn_da_${vdatestr}.nc" done fi - mkdir -p "${COM_OCEAN_RESTART}" - # Link ocean restarts from DATA to COM # Coarser than 1/2 degree has a single MOM restart ${NLN} "${COM_OCEAN_RESTART}/${forecast_end_cycle:0:8}.${forecast_end_cycle:8:2}0000.MOM.res.nc" "${DATA}/MOM6_RESTART/" @@ -810,10 +791,16 @@ MOM6_postdet() { ;; esac - # Loop over restart_interval frequency and link restarts from DATA to COM - local idate=$(date --utc -d "${current_cycle:0:8} ${current_cycle:8:2} + ${restart_interval} hours" +%Y%m%d%H) - while [[ ${idate} -lt ${forecast_end_cycle} ]]; do - local idatestr=$(date +%Y-%m-%d-%H -d "${idate:0:8} ${idate:8:2}") + if [[ "${RUN}" =~ "gdas" ]]; then + local interval idate + if [[ "${DOIAU}" = "YES" ]]; then + # Link restarts at the beginning of the next cycle from DATA to COM + interval=$(( assim_freq / 2 )) + idate=$(date --utc -d "${next_cycle:0:8} ${next_cycle:8:2} - ${interval} hours" +%Y%m%d%H) + else + # Link restarts at the middle of the next cycle from DATA to COM + idate="${next_cycle}" + fi 
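The gdas restart-linking date arithmetic introduced above can be checked in isolation. This is an illustrative sketch, not the workflow's exact code: the cycle value and `assim_freq` below are made up, and GNU `date` is assumed.

```shell
#!/usr/bin/env bash
# With IAU on, gdas restarts are linked half an assimilation window before
# the next cycle (the start of the IAU window); without IAU, at the next
# cycle itself. All values below are illustrative.
next_cycle=2024040106
assim_freq=6
DOIAU="YES"

if [[ "${DOIAU}" = "YES" ]]; then
  interval=$(( assim_freq / 2 ))
  idate=$(date --utc -d "${next_cycle:0:8} ${next_cycle:8:2} - ${interval} hours" +%Y%m%d%H)
else
  idate="${next_cycle}"
fi
echo "${idate}"   # 2024040103 for the values above
```

With `DOIAU="NO"` the same sketch simply yields `2024040106`, i.e. the next cycle unchanged.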
${NLN} "${COM_OCEAN_RESTART}/${idate:0:8}.${idate:8:2}0000.MOM.res.nc" "${DATA}/MOM6_RESTART/" case ${OCNRES} in "025") @@ -822,23 +809,7 @@ MOM6_postdet() { done ;; esac - local idate=$(date --utc -d "${idate:0:8} ${idate:8:2} + ${restart_interval} hours" +%Y%m%d%H) - done - - # TODO: mediator should have its own CMEPS_postdet() function - # Link mediator restarts from DATA to COM - # DANGER DANGER DANGER - Linking mediator restarts to COM causes the model to fail with a message like this below: - # Abort with message NetCDF: File exists && NC_NOCLOBBER in file pio-2.5.7/src/clib/pioc_support.c at line 2173 - # Instead of linking, copy the mediator files after the model finishes - #local COMOUTmed="${ROTDIR}/${RUN}.${PDY}/${cyc}/med" - #mkdir -p "${COMOUTmed}/RESTART" - #local idate=$(date --utc -d "${current_cycle:0:8} ${current_cycle:8:2} + ${restart_interval} hours" +%Y%m%d%H) - #while [[ ${idate} -le ${forecast_end_cycle} ]]; do - # local seconds=$(to_seconds ${idate:8:2}0000) # use function to_seconds from forecast_predet.sh to convert HHMMSS to seconds - # local idatestr="${idate:0:4}-${idate:4:2}-${idate:6:2}-${seconds}" - # ${NLN} "${COMOUTmed}/RESTART/${idate:0:8}.${idate:8:2}0000.ufs.cpld.cpl.r.nc" "${DATA}/RESTART/ufs.cpld.cpl.r.${idatestr}.nc" - # local idate=$(date --utc -d "${idate:0:8} ${idate:8:2} + ${restart_interval} hours" +%Y%m%d%H) - #done + fi echo "SUB ${FUNCNAME[0]}: MOM6 input data linked/copied" @@ -846,7 +817,7 @@ MOM6_postdet() { MOM6_nml() { echo "SUB ${FUNCNAME[0]}: Creating name list for MOM6" - source "${HOMEgfs}/ush/parsing_namelists_MOM6.sh" + source "${USHgfs}/parsing_namelists_MOM6.sh" MOM6_namelists } @@ -854,56 +825,13 @@ MOM6_out() { echo "SUB ${FUNCNAME[0]}: Copying output data for MOM6" # Copy MOM_input from DATA to COM_OCEAN_INPUT after the forecast is run (and successfull) - if [[ ! 
-d ${COM_OCEAN_INPUT} ]]; then mkdir -p "${COM_OCEAN_INPUT}"; fi ${NCP} "${DATA}/INPUT/MOM_input" "${COM_CONF}/ufs.MOM_input" - # TODO: mediator should have its own CMEPS_out() function - # Copy mediator restarts from DATA to COM - # Linking mediator restarts to COM causes the model to fail with a message. - # See MOM6_postdet() function for error message - mkdir -p "${COM_MED_RESTART}" - local idate=$(date --utc -d "${current_cycle:0:8} ${current_cycle:8:2} + ${restart_interval} hours" +%Y%m%d%H) - while [[ ${idate} -le ${forecast_end_cycle} ]]; do - local seconds=$(to_seconds "${idate:8:2}"0000) # use function to_seconds from forecast_predet.sh to convert HHMMSS to seconds - local idatestr="${idate:0:4}-${idate:4:2}-${idate:6:2}-${seconds}" - local mediator_file="${DATA}/RESTART/ufs.cpld.cpl.r.${idatestr}.nc" - if [[ -f ${mediator_file} ]]; then - ${NCP} "${DATA}/RESTART/ufs.cpld.cpl.r.${idatestr}.nc" "${COM_MED_RESTART}/${idate:0:8}.${idate:8:2}0000.ufs.cpld.cpl.r.nc" - else - echo "Mediator restart ${mediator_file} not found." - fi - local idate=$(date --utc -d "${idate:0:8} ${idate:8:2} + ${restart_interval} hours" +%Y%m%d%H) - done } CICE_postdet() { echo "SUB ${FUNCNAME[0]}: CICE after run type determination" - # TODO: These settings should be elevated to config.ice - histfreq_n=${histfreq_n:-6} - dumpfreq_n=${dumpfreq_n:-1000} # Set this to a really large value, as cice, mom6 and cmeps restart interval is controlled by ufs.configure - dumpfreq=${dumpfreq:-"y"} # "h","d","m" or "y" for restarts at intervals of "hours", "days", "months" or "years" - - if [[ "${RUN}" =~ "gdas" ]]; then - cice_hist_avg=".false., .false., .false., .false., .false." # DA needs instantaneous - else - cice_hist_avg=".true., .true., .true., .true., .true." 
# P8 wants averaged over histfreq_n - fi - - FRAZIL_FWSALT=${FRAZIL_FWSALT:-".true."} - ktherm=${ktherm:-2} - tfrz_option=${tfrz_option:-"'mushy'"} - tr_pond_lvl=${tr_pond_lvl:-".true."} # Use level melt ponds tr_pond_lvl=true - - # restart_pond_lvl (if tr_pond_lvl=true): - # -- if true, initialize the level ponds from restart (if runtype=continue) - # -- if false, re-initialize level ponds to zero (if runtype=initial or continue) - restart_pond_lvl=${restart_pond_lvl:-".false."} - - ice_grid_file=${ice_grid_file:-"grid_cice_NEMS_mx${ICERES}.nc"} - ice_kmt_file=${ice_kmt_file:-"kmtu_cice_NEMS_mx${ICERES}.nc"} - export MESH_OCN_ICE=${MESH_OCN_ICE:-"mesh.mx${ICERES}.nc"} - # Copy CICE ICs echo "Link CICE ICs" cice_restart_file="${COM_ICE_RESTART_PREV}/${sPDY}.${scyc}0000.cice_model.res.nc" @@ -917,58 +845,44 @@ CICE_postdet() { echo "${DATA}/cice_model.res.nc" > "${DATA}/ice.restart_file" echo "Link CICE fixed files" - ${NLN} "${FIXcice}/${ICERES}/${ice_grid_file}" "${DATA}/" - ${NLN} "${FIXcice}/${ICERES}/${ice_kmt_file}" "${DATA}/" - ${NLN} "${FIXcice}/${ICERES}/${MESH_OCN_ICE}" "${DATA}/" - - # Link CICE output files - if [[ ! -d "${COM_ICE_HISTORY}" ]]; then mkdir -p "${COM_ICE_HISTORY}"; fi - mkdir -p "${COM_ICE_RESTART}" - - if [[ "${RUN}" =~ "gfs" || "${RUN}" =~ "gefs" ]]; then - # Link output files for RUN = gfs - - # TODO: make these forecast output files consistent w/ GFS output - # TODO: Work w/ NB to determine appropriate naming convention for these files - - # TODO: consult w/ NB on how to improve on this. 
Gather requirements and more information on what these files are and how they are used to properly catalog them - local vdate seconds vdatestr fhr last_fhr - for fhr in ${FV3_OUTPUT_FH}; do - vdate=$(date --utc -d "${current_cycle:0:8} ${current_cycle:8:2} + ${fhr} hours" +%Y%m%d%H) - seconds=$(to_seconds "${vdate:8:2}0000") # convert HHMMSS to seconds - vdatestr="${vdate:0:4}-${vdate:4:2}-${vdate:6:2}-${seconds}" - - if [[ 10#${fhr} -eq 0 ]]; then - ${NLN} "${COM_ICE_HISTORY}/iceic${vdate}.${ENSMEM}.${current_cycle}.nc" "${DATA}/CICE_OUTPUT/iceh_ic.${vdatestr}.nc" - else - (( interval = fhr - last_fhr )) # Umm.. isn't this histfreq_n? - ${NLN} "${COM_ICE_HISTORY}/ice${vdate}.${ENSMEM}.${current_cycle}.nc" "${DATA}/CICE_OUTPUT/iceh_$(printf "%0.2d" "${interval}")h.${vdatestr}.nc" - fi + ${NLN} "${FIXgfs}/cice/${ICERES}/${CICE_GRID}" "${DATA}/" + ${NLN} "${FIXgfs}/cice/${ICERES}/${CICE_MASK}" "${DATA}/" + ${NLN} "${FIXgfs}/cice/${ICERES}/${MESH_ICE}" "${DATA}/" + + # Link iceh_ic file to COM. This is the initial condition file from CICE (f000) + # TODO: Is this file needed in COM? Is this going to be used for generating any products? 
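The CICE history filenames handled below encode the valid time as `YYYY-MM-DD-SSSSS`, where the last field is seconds since 00Z zero-padded to five digits (the `to_seconds` helper from forecast_predet.sh). A minimal sketch with an illustrative cycle:

```shell
#!/usr/bin/env bash
# Build a CICE-style timestamp for an illustrative 06Z cycle.
# to_seconds converts HHMMSS to zero-padded seconds since 00Z, mirroring
# the helper in forecast_predet.sh.
to_seconds() {
  local hhmmss=${1:?}
  local seconds=$(( 10#${hhmmss:0:2}*3600 + 10#${hhmmss:2:2}*60 + 10#${hhmmss:4:2} ))
  printf "%05d" "${seconds}"
}

current_cycle=2024040106   # illustrative YYYYMMDDHH
seconds=$(to_seconds "${current_cycle:8:2}0000")
vdatestr="${current_cycle:0:4}-${current_cycle:4:2}-${current_cycle:6:2}-${seconds}"
echo "${vdatestr}"   # 2024-04-01-21600
```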
+ local vdate seconds vdatestr fhr fhr3 interval last_fhr + seconds=$(to_seconds "${current_cycle:8:2}0000") # convert HHMMSS to seconds + vdatestr="${current_cycle:0:4}-${current_cycle:4:2}-${current_cycle:6:2}-${seconds}" + ${NLN} "${COM_ICE_HISTORY}/${RUN}.ice.t${cyc}z.ic.nc" "${DATA}/CICE_OUTPUT/iceh_ic.${vdatestr}.nc" + + # Link CICE forecast output files from DATA/CICE_OUTPUT to COM + local source_file dest_file + for fhr in ${CICE_OUTPUT_FH}; do + fhr3=$(printf %03i "${fhr}") + + if [[ -z ${last_fhr:-} ]]; then last_fhr=${fhr} - done + continue + fi - elif [[ "${RUN}" =~ "gdas" ]]; then + (( interval = fhr - last_fhr )) - # Link CICE generated initial condition file from DATA/CICE_OUTPUT to COMOUTice - # This can be thought of as the f000 output from the CICE model - local seconds vdatestr - seconds=$(to_seconds "${current_cycle:8:2}0000") # convert HHMMSS to seconds - vdatestr="${current_cycle:0:4}-${current_cycle:4:2}-${current_cycle:6:2}-${seconds}" - ${NLN} "${COM_ICE_HISTORY}/${RUN}.t${cyc}z.iceic.nc" "${DATA}/CICE_OUTPUT/iceh_ic.${vdatestr}.nc" - - # Link instantaneous CICE forecast output files from DATA/CICE_OUTPUT to COMOUTice - local vdate vdatestr seconds fhr fhr3 - fhr="${FHOUT}" - while [[ "${fhr}" -le "${FHMAX}" ]]; do - vdate=$(date --utc -d "${current_cycle:0:8} ${current_cycle:8:2} + ${fhr} hours" +%Y%m%d%H) - seconds=$(to_seconds "${vdate:8:2}0000") # convert HHMMSS to seconds - vdatestr="${vdate:0:4}-${vdate:4:2}-${vdate:6:2}-${seconds}" - fhr3=$(printf %03i "${fhr}") - ${NLN} "${COM_ICE_HISTORY}/${RUN}.t${cyc}z.icef${fhr3}.nc" "${DATA}/CICE_OUTPUT/iceh_inst.${vdatestr}.nc" - fhr=$((fhr + FHOUT)) - done + vdate=$(date --utc -d "${current_cycle:0:8} ${current_cycle:8:2} + ${fhr} hours" +%Y%m%d%H) + seconds=$(to_seconds "${vdate:8:2}0000") # convert HHMMSS to seconds + vdatestr="${vdate:0:4}-${vdate:4:2}-${vdate:6:2}-${seconds}" - fi + if [[ "${RUN}" =~ "gfs" || "${RUN}" =~ "gefs" ]]; then + source_file="iceh_$(printf "%0.2d" 
"${interval}")h.${vdatestr}.nc" + dest_file="${RUN}.ice.t${cyc}z.${interval}hr_avg.f${fhr3}.nc" + elif [[ "${RUN}" =~ "gdas" ]]; then + source_file="iceh_inst.${vdatestr}.nc" + dest_file="${RUN}.ice.t${cyc}z.inst.f${fhr3}.nc" + fi + ${NLN} "${COM_ICE_HISTORY}/${dest_file}" "${DATA}/CICE_OUTPUT/${source_file}" + + last_fhr=${fhr} + done # Link CICE restarts from CICE_RESTART to COMOUTice/RESTART # Loop over restart_interval and link restarts from DATA to COM @@ -984,7 +898,7 @@ CICE_postdet() { CICE_nml() { echo "SUB ${FUNCNAME[0]}: Creating name list for CICE" - source "${HOMEgfs}/ush/parsing_namelists_CICE.sh" + source "${USHgfs}/parsing_namelists_CICE.sh" CICE_namelists } @@ -992,7 +906,6 @@ CICE_out() { echo "SUB ${FUNCNAME[0]}: Copying output data for CICE" # Copy ice_in namelist from DATA to COMOUTice after the forecast is run (and successfull) - if [[ ! -d "${COM_ICE_INPUT}" ]]; then mkdir -p "${COM_ICE_INPUT}"; fi ${NCP} "${DATA}/ice_in" "${COM_CONF}/ufs.ice_in" } @@ -1030,9 +943,7 @@ GOCART_rc() { GOCART_postdet() { echo "SUB ${FUNCNAME[0]}: Linking output data for GOCART" - if [[ ! 
-d "${COM_CHEM_HISTORY}" ]]; then mkdir -p "${COM_CHEM_HISTORY}"; fi - - for fhr in ${FV3_OUTPUT_FH}; do + for fhr in ${GOCART_OUTPUT_FH}; do local vdate=$(date --utc -d "${current_cycle:0:8} ${current_cycle:8:2} + ${fhr} hours" +%Y%m%d%H) # Temporarily delete existing files due to noclobber in GOCART @@ -1053,12 +964,62 @@ GOCART_out() { # TO DO: this should be linked but there were issues where gocart was crashing if it was linked local fhr local vdate - for fhr in ${FV3_OUTPUT_FH}; do + for fhr in ${GOCART_OUTPUT_FH}; do if (( fhr == 0 )); then continue; fi vdate=$(date --utc -d "${current_cycle:0:8} ${current_cycle:8:2} + ${fhr} hours" +%Y%m%d%H) ${NCP} "${DATA}/gocart.inst_aod.${vdate:0:8}_${vdate:8:2}00z.nc4" \ "${COM_CHEM_HISTORY}/gocart.inst_aod.${vdate:0:8}_${vdate:8:2}00z.nc4" done +} +CMEPS_postdet() { + echo "SUB ${FUNCNAME[0]}: Linking output data for CMEPS mediator" + + # Copy mediator restart files to RUNDIR + if [[ "${warm_start}" = ".true." ]]; then + local mediator_file="${COM_MED_RESTART}/${PDY}.${cyc}0000.ufs.cpld.cpl.r.nc" + if [[ -f "${mediator_file}" ]]; then + ${NCP} "${mediator_file}" "${DATA}/ufs.cpld.cpl.r.nc" + rm -f "${DATA}/rpointer.cpl" + touch "${DATA}/rpointer.cpl" + echo "ufs.cpld.cpl.r.nc" >> "${DATA}/rpointer.cpl" + else + # We have a choice to make here. + # Either we can FATAL ERROR out, or we can let the coupling fields initialize from zero + # cmeps_run_type is determined based on the availability of the mediator restart file + echo "WARNING: ${mediator_file} does not exist for warm_start = .true., initializing!" + #echo "FATAL ERROR: ${mediator_file} must exist for warm_start = .true. and does not, ABORT!" 
+ #exit 4 + fi + fi + + # Link mediator restarts from DATA to COM + # DANGER DANGER DANGER - Linking mediator restarts to COM causes the model to fail with a message like this below: + # Abort with message NetCDF: File exists && NC_NOCLOBBER in file pio-2.5.7/src/clib/pioc_support.c at line 2173 + # Instead of linking, copy the mediator files after the model finishes. See CMEPS_out() below. + #local rdate rdatestr seconds mediator_file + #rdate=${forecast_end_cycle} + #seconds=$(to_seconds "${rdate:8:2}"0000) # use function to_seconds from forecast_predet.sh to convert HHMMSS to seconds + #rdatestr="${rdate:0:4}-${rdate:4:2}-${rdate:6:2}-${seconds}" + #${NLN} "${COM_MED_RESTART}/${rdate:0:8}.${rdate:8:2}0000.ufs.cpld.cpl.r.nc" "${DATA}/CMEPS_RESTART/ufs.cpld.cpl.r.${rdatestr}.nc" + +} + +CMEPS_out() { + echo "SUB ${FUNCNAME[0]}: Copying output data for CMEPS mediator" + + # Linking mediator restarts to COM causes the model to fail with a message. + # Abort with message NetCDF: File exists && NC_NOCLOBBER in file pio-2.5.7/src/clib/pioc_support.c at line 2173 + # Copy mediator restarts from DATA to COM + local rdate rdatestr seconds mediator_file + rdate=${forecast_end_cycle} + seconds=$(to_seconds "${rdate:8:2}"0000) # use function to_seconds from forecast_predet.sh to convert HHMMSS to seconds + rdatestr="${rdate:0:4}-${rdate:4:2}-${rdate:6:2}-${seconds}" + mediator_file="${DATA}/CMEPS_RESTART/ufs.cpld.cpl.r.${rdatestr}.nc" + if [[ -f ${mediator_file} ]]; then + ${NCP} "${mediator_file}" "${COM_MED_RESTART}/${rdate:0:8}.${rdate:8:2}0000.ufs.cpld.cpl.r.nc" + else + echo "Mediator restart ${mediator_file} not found." + fi } diff --git a/ush/forecast_predet.sh b/ush/forecast_predet.sh index 9bb565919a..b5e1ad8e82 100755 --- a/ush/forecast_predet.sh +++ b/ush/forecast_predet.sh @@ -8,35 +8,35 @@ ## This script is a definition of functions. 
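The `CMEPS_postdet()` hunk above copies the mediator restart into the run directory and rewrites `rpointer.cpl`, the one-line pointer file CMEPS reads to locate its restart. A minimal sketch of that bookkeeping, using a temporary directory in place of `${DATA}` (all paths here are illustrative stand-ins, not the workflow's real COM layout):

```shell
#!/usr/bin/env bash
# Sketch of the rpointer.cpl handling in CMEPS_postdet(); DATA here is a
# throwaway temp dir, and the restart file is an empty placeholder.
DATA=$(mktemp -d)
mediator_restart="ufs.cpld.cpl.r.nc"
: > "${DATA}/${mediator_restart}"   # stand-in for ${NCP} of the real restart

# CMEPS reads rpointer.cpl to find the restart it should warm-start from,
# so the pointer is recreated to contain exactly one filename.
rm -f "${DATA}/rpointer.cpl"
touch "${DATA}/rpointer.cpl"
echo "${mediator_restart}" >> "${DATA}/rpointer.cpl"

cat "${DATA}/rpointer.cpl"   # -> ufs.cpld.cpl.r.nc
```

The `rm`/`touch`/append sequence mirrors the patch: it guarantees a stale pointer from a previous attempt never lists two restarts.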
##### -# For all non-evironment variables -# Cycling and forecast hour specific parameters - to_seconds() { # Function to convert HHMMSS to seconds since 00Z - local hhmmss=${1:?} - local hh=${hhmmss:0:2} - local mm=${hhmmss:2:2} - local ss=${hhmmss:4:2} - local seconds=$((10#${hh}*3600+10#${mm}*60+10#${ss})) - local padded_seconds=$(printf "%05d" "${seconds}") + local hhmmss hh mm ss seconds padded_seconds + hhmmss=${1:?} + hh=${hhmmss:0:2} + mm=${hhmmss:2:2} + ss=${hhmmss:4:2} + seconds=$((10#${hh}*3600+10#${mm}*60+10#${ss})) + padded_seconds=$(printf "%05d" "${seconds}") echo "${padded_seconds}" } middle_date(){ # Function to calculate mid-point date in YYYYMMDDHH between two dates also in YYYYMMDDHH - local date1=${1:?} - local date2=${2:?} - local date1s=$(date --utc -d "${date1:0:8} ${date1:8:2}:00:00" +%s) - local date2s=$(date --utc -d "${date2:0:8} ${date2:8:2}:00:00" +%s) - local dtsecsby2=$(( $((date2s - date1s)) / 2 )) - local mid_date=$(date --utc -d "${date1:0:8} ${date1:8:2} + ${dtsecsby2} seconds" +%Y%m%d%H%M%S) + local date1 date2 date1s date2s dtsecsby2 mid_date + date1=${1:?} + date2=${2:?} + date1s=$(date --utc -d "${date1:0:8} ${date1:8:2}:00:00" +%s) + date2s=$(date --utc -d "${date2:0:8} ${date2:8:2}:00:00" +%s) + dtsecsby2=$(( $((date2s - date1s)) / 2 )) + mid_date=$(date --utc -d "${date1:0:8} ${date1:8:2} + ${dtsecsby2} seconds" +%Y%m%d%H%M%S) echo "${mid_date:0:10}" } nhour(){ # Function to calculate hours between two dates (This replicates prod-util NHOUR) - local date1=${1:?} - local date2=${2:?} + local date1 date2 seconds1 seconds2 hours + date1=${1:?} + date2=${2:?} # Convert dates to UNIX timestamps seconds1=$(date --utc -d "${date1:0:8} ${date1:8:2}:00:00" +%s) seconds2=$(date --utc -d "${date2:0:8} ${date2:8:2}:00:00" +%s) @@ -44,30 +44,17 @@ nhour(){ echo "${hours}" } +# shellcheck disable=SC2034 common_predet(){ echo "SUB ${FUNCNAME[0]}: Defining variables for shared through model components" - # Ignore "not used" warning - # 
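The refactored `to_seconds()` above keeps the `10#` prefix on each arithmetic operand; without it, bash would treat zero-padded fields like `08` or `09` as invalid octal literals. A self-contained copy of the helper, exercising that edge case:

```shell
#!/usr/bin/env bash
# Re-implementation of to_seconds() from forecast_predet.sh.
# The 10# prefix forces base-10 so "08"/"09" are not parsed as octal.
to_seconds() {
  local hhmmss hh mm ss
  hhmmss=${1:?}
  hh=${hhmmss:0:2}
  mm=${hhmmss:2:2}
  ss=${hhmmss:4:2}
  # zero-pad to five digits, matching the restart filename convention
  printf "%05d\n" $((10#${hh}*3600 + 10#${mm}*60 + 10#${ss}))
}

to_seconds "090000"   # -> 32400 (would abort without 10#: "09" is bad octal)
to_seconds "000130"   # -> 00090 (padding matters for filename sorting)
```

The `%05d` padding is what lets the mediator restart names like `ufs.cpld.cpl.r.YYYY-MM-DD-SSSSS.nc` sort lexically by valid time.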
shellcheck disable=SC2034 pwd=$(pwd) CDUMP=${CDUMP:-gdas} - CASE=${CASE:-C768} - CDATE=${CDATE:-2017032500} + CDATE=${CDATE:-"${PDY}${cyc}"} ENSMEM=${ENSMEM:-000} - FCSTEXECDIR=${FCSTEXECDIR:-${HOMEgfs}/exec} - FCSTEXEC=${FCSTEXEC:-ufs_model.x} - - # Directories. - FIXgfs=${FIXgfs:-${HOMEgfs}/fix} - - # Model specific stuff - PARM_POST=${PARM_POST:-${HOMEgfs}/parm/post} - # Define significant cycles - current_cycle=${CDATE} + current_cycle="${PDY}${cyc}" previous_cycle=$(date --utc -d "${current_cycle:0:8} ${current_cycle:8:2} - ${assim_freq} hours" +%Y%m%d%H) - # ignore errors that variable isn't used - # shellcheck disable=SC2034 next_cycle=$(date --utc -d "${current_cycle:0:8} ${current_cycle:8:2} + ${assim_freq} hours" +%Y%m%d%H) forecast_end_cycle=$(date --utc -d "${current_cycle:0:8} ${current_cycle:8:2} + ${FHMAX} hours" +%Y%m%d%H) @@ -88,23 +75,29 @@ common_predet(){ tcyc=${scyc} fi - mkdir -p "${COM_CONF}" + FHMIN=${FHMIN:-0} + FHMAX=${FHMAX:-9} + FHOUT=${FHOUT:-3} + FHMAX_HF=${FHMAX_HF:-0} + FHOUT_HF=${FHOUT_HF:-1} + + # Several model components share DATA/INPUT for input data + if [[ ! -d "${DATA}/INPUT" ]]; then mkdir -p "${DATA}/INPUT"; fi + + if [[ ! -d "${COM_CONF}" ]]; then mkdir -p "${COM_CONF}"; fi cd "${DATA}" || ( echo "FATAL ERROR: Unable to 'cd ${DATA}', ABORT!"; exit 8 ) } +# shellcheck disable=SC2034 FV3_predet(){ echo "SUB ${FUNCNAME[0]}: Defining variables for FV3" - FHMIN=${FHMIN:-0} - FHMAX=${FHMAX:-9} - FHOUT=${FHOUT:-3} + + if [[ ! -d "${COM_ATMOS_HISTORY}" ]]; then mkdir -p "${COM_ATMOS_HISTORY}"; fi + if [[ ! -d "${COM_ATMOS_MASTER}" ]]; then mkdir -p "${COM_ATMOS_MASTER}"; fi + if [[ ! 
-d "${COM_ATMOS_RESTART}" ]]; then mkdir -p "${COM_ATMOS_RESTART}"; fi + FHZER=${FHZER:-6} FHCYC=${FHCYC:-24} - FHMAX_HF=${FHMAX_HF:-0} - FHOUT_HF=${FHOUT_HF:-1} - NSOUT=${NSOUT:-"-1"} - FDIAG=${FHOUT} - if (( FHMAX_HF > 0 && FHOUT_HF > 0 )); then FDIAG=${FHOUT_HF}; fi - WRITE_DOPOST=${WRITE_DOPOST:-".false."} restart_interval=${restart_interval:-${FHMAX}} # restart_interval = 0 implies write restart at the END of the forecast i.e. at FHMAX if [[ ${restart_interval} -eq 0 ]]; then @@ -112,30 +105,16 @@ FV3_predet(){ fi # Convert output settings into an explicit list for FV3 - # NOTE: FV3_OUTPUT_FH is also currently used in other components - # TODO: Have a seperate control for other components to address issue #1629 FV3_OUTPUT_FH="" local fhr=${FHMIN} if (( FHOUT_HF > 0 && FHMAX_HF > 0 )); then - for (( fh = FHMIN; fh < FHMAX_HF; fh = fh + FHOUT_HF )); do - FV3_OUTPUT_FH="${FV3_OUTPUT_FH} ${fh}" - done + FV3_OUTPUT_FH="${FV3_OUTPUT_FH} $(seq -s ' ' "${FHMIN}" "${FHOUT_HF}" "${FHMAX_HF}")" fhr=${FHMAX_HF} fi - for (( fh = fhr; fh <= FHMAX; fh = fh + FHOUT )); do - FV3_OUTPUT_FH="${FV3_OUTPUT_FH} ${fh}" - done - - - # Model resolution specific parameters - DELTIM=${DELTIM:-225} - layout_x=${layout_x:-8} - layout_y=${layout_y:-16} - LEVS=${LEVS:-65} + FV3_OUTPUT_FH="${FV3_OUTPUT_FH} $(seq -s ' ' "${fhr}" "${FHOUT}" "${FHMAX}")" # Other options - MEMBER=${MEMBER:-"-1"} # -1: control, 0: ensemble mean, >0: ensemble member $MEMBER - ENS_NUM=${ENS_NUM:-1} # Single executable runs multiple members (e.g. 
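The `FV3_OUTPUT_FH` hunk above replaces two arithmetic `for` loops with `seq -s ' '`. A runnable sketch with illustrative values (the defaults below are examples, not the workflow's configured settings); note that `seq` includes its endpoint, so when high-frequency output is enabled `FHMAX_HF` appears in both sub-lists, unlike the old `fh < FHMAX_HF` loop:

```shell
#!/usr/bin/env bash
# Sketch of the seq-based FV3_OUTPUT_FH assembly; values are illustrative.
FHMIN=0; FHOUT=3; FHMAX=12      # regular output: every 3h to 12h
FHOUT_HF=1; FHMAX_HF=6          # high-frequency output: hourly to 6h

FV3_OUTPUT_FH=""
fhr=${FHMIN}
if (( FHOUT_HF > 0 && FHMAX_HF > 0 )); then
  # hourly hours first; seq is endpoint-inclusive, so 6 is emitted here...
  FV3_OUTPUT_FH="${FV3_OUTPUT_FH} $(seq -s ' ' "${FHMIN}" "${FHOUT_HF}" "${FHMAX_HF}")"
  fhr=${FHMAX_HF}
fi
# ...and again here, since the regular list starts at fhr=FHMAX_HF
FV3_OUTPUT_FH="${FV3_OUTPUT_FH} $(seq -s ' ' "${fhr}" "${FHOUT}" "${FHMAX}")"

echo ${FV3_OUTPUT_FH}
```

With these values the list is `0 1 2 3 4 5 6 6 9 12`; whether the duplicated boundary hour is benign depends on how downstream consumers loop over the list.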
GEFS) + MEMBER=$(( 10#${ENSMEM:-"-1"} )) # -1: control, 0: ensemble mean, >0: ensemble member $MEMBER PREFIX_ATMINC=${PREFIX_ATMINC:-""} # allow ensemble to use recentered increment # IAU options @@ -145,18 +124,8 @@ FV3_predet(){ # Model config options ntiles=6 - TYPE=${TYPE:-"nh"} # choices: nh, hydro - MONO=${MONO:-"non-mono"} # choices: mono, non-mono - - QUILTING=${QUILTING:-".true."} - OUTPUT_GRID=${OUTPUT_GRID:-"gaussian_grid"} - WRITE_NEMSIOFLIP=${WRITE_NEMSIOFLIP:-".true."} - WRITE_FSYNCFLAG=${WRITE_FSYNCFLAG:-".true."} - rCDUMP=${rCDUMP:-${CDUMP}} - mkdir -p "${DATA}/INPUT" - #------------------------------------------------------------------ # changeable parameters # dycore definitions @@ -196,7 +165,6 @@ FV3_predet(){ nstf_name=${nstf_name:-"${NST_MODEL},${NST_SPINUP},${NST_RESV},${ZSEA1},${ZSEA2}"} nst_anl=${nst_anl:-".false."} - # blocking factor used for threading and general physics performance #nyblocks=$(expr \( $npy - 1 \) \/ $layout_y ) #nxblocks=$(expr \( $npx - 1 \) \/ $layout_x \/ 32) @@ -214,8 +182,7 @@ FV3_predet(){ print_freq=${print_freq:-6} #------------------------------------------------------- - if [[ ${RUN} =~ "gfs" || ${RUN} = "gefs" ]]; then - if [[ ! -d ${COM_ATMOS_RESTART} ]]; then mkdir -p "${COM_ATMOS_RESTART}" ; fi + if [[ "${RUN}" =~ "gfs" || "${RUN}" = "gefs" ]]; then ${NLN} "${COM_ATMOS_RESTART}" RESTART # The final restart written at the end doesn't include the valid date # Create links that keep the same name pattern for these files @@ -229,26 +196,69 @@ FV3_predet(){ ${NLN} "${file}" "${COM_ATMOS_RESTART}/${forecast_end_cycle:0:8}.${forecast_end_cycle:8:2}0000.${file}" done else - mkdir -p "${DATA}/RESTART" + if [[ ! -d "${DATA}/RESTART" ]]; then mkdir -p "${DATA}/RESTART"; fi fi - echo "SUB ${FUNCNAME[0]}: pre-determination variables set" } WW3_predet(){ echo "SUB ${FUNCNAME[0]}: WW3 before run type determination" + + if [[ ! -d "${COM_WAVE_HISTORY}" ]]; then mkdir -p "${COM_WAVE_HISTORY}"; fi if [[ ! 
-d "${COM_WAVE_RESTART}" ]]; then mkdir -p "${COM_WAVE_RESTART}" ; fi + ${NLN} "${COM_WAVE_RESTART}" "restart_wave" } +# shellcheck disable=SC2034 CICE_predet(){ echo "SUB ${FUNCNAME[0]}: CICE before run type determination" + + if [[ ! -d "${COM_ICE_HISTORY}" ]]; then mkdir -p "${COM_ICE_HISTORY}"; fi + if [[ ! -d "${COM_ICE_RESTART}" ]]; then mkdir -p "${COM_ICE_RESTART}"; fi + if [[ ! -d "${COM_ICE_INPUT}" ]]; then mkdir -p "${COM_ICE_INPUT}"; fi + if [[ ! -d "${DATA}/CICE_OUTPUT" ]]; then mkdir -p "${DATA}/CICE_OUTPUT"; fi if [[ ! -d "${DATA}/CICE_RESTART" ]]; then mkdir -p "${DATA}/CICE_RESTART"; fi + + # CICE does not have a concept of high frequency output like FV3 + # Convert output settings into an explicit list for CICE + CICE_OUTPUT_FH=$(seq -s ' ' "${FHMIN}" "${FHOUT_OCNICE}" "${FHMAX}") + } +# shellcheck disable=SC2034 MOM6_predet(){ echo "SUB ${FUNCNAME[0]}: MOM6 before run type determination" + + if [[ ! -d "${COM_OCEAN_HISTORY}" ]]; then mkdir -p "${COM_OCEAN_HISTORY}"; fi + if [[ ! -d "${COM_OCEAN_RESTART}" ]]; then mkdir -p "${COM_OCEAN_RESTART}"; fi + if [[ ! -d "${COM_OCEAN_INPUT}" ]]; then mkdir -p "${COM_OCEAN_INPUT}"; fi + if [[ ! -d "${DATA}/MOM6_OUTPUT" ]]; then mkdir -p "${DATA}/MOM6_OUTPUT"; fi if [[ ! -d "${DATA}/MOM6_RESTART" ]]; then mkdir -p "${DATA}/MOM6_RESTART"; fi + + # MOM6 does not have a concept of high frequency output like FV3 + # Convert output settings into an explicit list for MOM6 + MOM6_OUTPUT_FH=$(seq -s ' ' "${FHMIN}" "${FHOUT_OCNICE}" "${FHMAX}") + +} + +CMEPS_predet(){ + echo "SUB ${FUNCNAME[0]}: CMEPS before run type determination" + + if [[ ! -d "${COM_MED_RESTART}" ]]; then mkdir -p "${COM_MED_RESTART}"; fi + + if [[ ! -d "${DATA}/CMEPS_RESTART" ]]; then mkdir -p "${DATA}/CMEPS_RESTART"; fi + +} + +# shellcheck disable=SC2034 +GOCART_predet(){ + echo "SUB ${FUNCNAME[0]}: GOCART before run type determination" + + if [[ ! 
-d "${COM_CHEM_HISTORY}" ]]; then mkdir -p "${COM_CHEM_HISTORY}"; fi + + GOCART_OUTPUT_FH=$(seq -s ' ' "${FHMIN}" "6" "${FHMAX}") + # TODO: AERO_HISTORY.rc has hardwired output frequency to 6 hours } diff --git a/ush/fv3gfs_remap.sh b/ush/fv3gfs_remap.sh deleted file mode 100755 index 7986add331..0000000000 --- a/ush/fv3gfs_remap.sh +++ /dev/null @@ -1,118 +0,0 @@ -#! /usr/bin/env bash - -#-------------------------------------- -#-- remap FV3 6 tiles to global array -#-- Fanglin Yang, October 2016 -#-------------------------------------- - -source "$HOMEgfs/ush/preamble.sh" - -export CDATE=${CDATE:-"2016100300"} -export CASE=${CASE:-"C192"} # C48 C96 C192 C384 C768 C1152 C3072 -export GG=${master_grid:-"0p25deg"} # 1deg 0p5deg 0p25deg 0p125deg - -pwd=$(pwd) -export DATA=${DATA:-$pwd} -export HOMEgfs=${HOMEgfs:-$PACKAGEROOT} -export FIXgfs=${FIXgfs:-$HOMEgfs/fix} -export FIXorog=${FIXorog:-$FIXgfs/orog} -export REMAPEXE=${REMAPEXE:-$HOMEgfs/exec/fregrid_parallel} -export IPD4=${IPD4:-"YES"} - -cycn=$(echo $CDATE | cut -c 9-10) -export TCYC=${TCYC:-".t${cycn}z."} -export CDUMP=${CDUMP:-gfs} -export PREFIX=${PREFIX:-${CDUMP}${TCYC}} - -#-------------------------------------------------- -export grid_loc=${FIXorog}/${CASE}/${CASE}_mosaic.nc -export weight_file=${FIXorog}/${CASE}/remap_weights_${CASE}_${GG}.nc - -export APRUN_REMAP=${APRUN_REMAP:-${APRUN:-""}} -export NTHREADS_REMAP=${NTHREADS_REMAP:-${NTHREADS:-1}} - -#-------------------------------------------------- -if [ $GG = 1deg ]; then export nlon=360 ; export nlat=180 ; fi -if [ $GG = 0p5deg ]; then export nlon=720 ; export nlat=360 ; fi -if [ $GG = 0p25deg ]; then export nlon=1440 ; export nlat=720 ; fi -if [ $GG = 0p125deg ]; then export nlon=2880 ; export nlat=1440 ; fi - -#-------------------------------------------------- -hgt=h; if [ $IPD4 = YES ]; then hgt=z; fi - -#--for non-hydrostatic case -export atmos_4xdaily_nh="slp, vort850, vort200,\ - us, u1000, u850, u700, u500, u200, u100, u50, u10,\ - vs, 
v1000, v850, v700, v500, v200, v100, v50, v10,\ - tm, t1000, t850, t700, t500, t200, t100, t50, t10,\ - ${hgt}1000, ${hgt}850, ${hgt}700, ${hgt}500, ${hgt}200, ${hgt}100, ${hgt}50, ${hgt}10,\ - q1000, q850, q700, q500, q200, q100, q50, q10,\ - rh1000, rh850, rh700, rh500, rh200,\ - omg1000, omg850, omg700, omg500, omg200, omg100, omg50, omg10,\ - w700,w850,w500, w200" - -#--for hydrostatic case -export atmos_4xdaily_hy="slp, vort850, vort200,\ - us, u1000, u850, u700, u500, u200, u100, u50, u10,\ - vs, v1000, v850, v700, v500, v200, v100, v50, v10,\ - tm, t1000, t850, t700, t500, t200, t100, t50, t10,\ - ${hgt}1000, ${hgt}850, ${hgt}700, ${hgt}500, ${hgt}200, ${hgt}100, ${hgt}50, ${hgt}10,\ - q1000, q850, q700, q500, q200, q100, q50, q10,\ - rh1000, rh850, rh700, rh500, rh200,\ - omg1000, omg850, omg700, omg500, omg200, omg100, omg50, omg10,\ - w700" - -export nggps2d_nh="ALBDOsfc, CPRATsfc, PRATEsfc, DLWRFsfc, ULWRFsfc,\ - DSWRFsfc, USWRFsfc, DSWRFtoa, USWRFtoa, ULWRFtoa,\ - GFLUXsfc, HGTsfc, HPBLsfc, ICECsfc, SLMSKsfc,\ - LHTFLsfc, SHTFLsfc, PRESsfc, PWATclm, SOILM,\ - SOILW1, SOILW2, SOILW3, SOILW4, SPFH2m,\ - TCDCclm, TCDChcl, TCDClcl, TCDCmcl,\ - SOILT1, SOILT2, SOILT3, SOILT4,\ - TMP2m, TMPsfc, UGWDsfc, VGWDsfc, UFLXsfc,\ - VFLXsfc, UGRD10m, VGRD10m, WEASDsfc, SNODsfc,\ - ZORLsfc, VFRACsfc, F10Msfc, VTYPEsfc, STYPEsfc" -export nggps2d_hy="$nggps2d_nh" - -export nggps3d_nh="ucomp, vcomp, temp, delp, sphum, o3mr, clwmr, nhpres, w, delz" #for non-hydrostatic case -export nggps3d_hy="ucomp, vcomp, temp, delp, sphum, o3mr, clwmr, hypres" #for hydrostatic case - -#-------------------------------------------------- -cd $DATA || exit 8 - -testfile=nggps3d.tile4.nc -nhrun=$(ncdump -c $testfile | grep nhpres) -nhrun=$? 
- -export OMP_NUM_THREADS=$NTHREADS_REMAP - -#-------------------------------------------------- -err=0 -for type in atmos_4xdaily nggps2d nggps3d ; do - - export in_file="$type" - export out_file=${PREFIX}${type}.${GG}.nc - [[ -s $DATA/$out_file ]] && rm -f $DATA/$out_file - if [ $nhrun -eq 0 ]; then - export fld=$(eval echo \${${type}_nh}) - else - export fld=$(eval echo \${${type}_hy}) - fi - - $APRUN_REMAP $REMAPEXE --input_dir $DATA \ - --input_file $in_file \ - --output_dir $DATA \ - --output_file $out_file \ - --input_mosaic $grid_loc \ - --scalar_field "$fld" \ - --interp_method conserve_order1 \ - --remap_file $weight_file \ - --nlon $nlon \ - --nlat $nlat - rc=$? - ((err+=$rc)) - -done - -exit $err - diff --git a/ush/gaussian_sfcanl.sh b/ush/gaussian_sfcanl.sh index 1a0441a06f..5c6c842845 100755 --- a/ush/gaussian_sfcanl.sh +++ b/ush/gaussian_sfcanl.sh @@ -23,17 +23,7 @@ # OUTPUT_FILE Output gaussian analysis file format. Default is "nemsio" # Set to "netcdf" for netcdf output file # Otherwise, output in nemsio. -# BASEDIR Root directory where all scripts and fixed files reside. -# Default is /nwprod2. -# HOMEgfs Directory for gfs version. Default is -# $BASEDIR/gfs_ver.v15.0.0} -# FIXam Directory for the global fixed climatology files. -# Defaults to $HOMEgfs/fix/am -# FIXorog Directory for the model grid and orography netcdf -# files. Defaults to $HOMEgfs/fix/orog # FIXWGTS Weight file to use for interpolation -# EXECgfs Directory of the program executable. 
Defaults to -# $HOMEgfs/exec # DATA Working directory # (if nonexistent will be made, used and deleted) # Defaults to current working directory @@ -83,9 +73,9 @@ # # programs : $GAUSFCANLEXE # -# fixed data : ${FIXorog}/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile*.nc +# fixed data : ${FIXgfs}/orog/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile*.nc # ${FIXWGTS} -# ${FIXam}/global_hyblev.l65.txt +# ${FIXgfs}/am/global_hyblev.l65.txt # # input data : ${COM_ATMOS_RESTART}/${PDY}.${cyc}0000.sfcanl_data.tile*.nc # @@ -110,7 +100,7 @@ # ################################################################################ -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" CASE=${CASE:-C768} res=$(echo $CASE | cut -c2-) @@ -121,20 +111,13 @@ LATB_SFC=${LATB_SFC:-$LATB_CASE} DONST=${DONST:-"NO"} LEVS=${LEVS:-64} LEVSP1=$(($LEVS+1)) -# Directories. -gfs_ver=${gfs_ver:-v16.3.0} -BASEDIR=${BASEDIR:-${PACKAGEROOT:-/lfs/h1/ops/prod/packages}} -HOMEgfs=${HOMEgfs:-$BASEDIR/gfs.${gfs_ver}} -EXECgfs=${EXECgfs:-$HOMEgfs/exec} -FIXorog=${FIXorog:-$HOMEgfs/fix/orog} -FIXam=${FIXam:-$HOMEgfs/fix/am} -FIXWGTS=${FIXWGTS:-$FIXorog/$CASE/fv3_SCRIP_${CASE}_GRIDSPEC_lon${LONB_SFC}_lat${LATB_SFC}.gaussian.neareststod.nc} +FIXWGTS=${FIXWGTS:-${FIXgfs}/orog/${CASE}/fv3_SCRIP_${CASE}_GRIDSPEC_lon${LONB_SFC}_lat${LATB_SFC}.gaussian.neareststod.nc} DATA=${DATA:-$(pwd)} # Filenames. 
XC=${XC:-} GAUSFCANLEXE=${GAUSFCANLEXE:-$EXECgfs/gaussian_sfcanl.x} -SIGLEVEL=${SIGLEVEL:-$FIXam/global_hyblev.l${LEVSP1}.txt} +SIGLEVEL=${SIGLEVEL:-${FIXgfs}/am/global_hyblev.l${LEVSP1}.txt} CDATE=${CDATE:?} @@ -187,12 +170,12 @@ ${NLN} "${COM_ATMOS_RESTART}/${PDY}.${cyc}0000.sfcanl_data.tile5.nc" "./anal.til ${NLN} "${COM_ATMOS_RESTART}/${PDY}.${cyc}0000.sfcanl_data.tile6.nc" "./anal.tile6.nc" # input orography tiles -${NLN} "${FIXorog}/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile1.nc" "./orog.tile1.nc" -${NLN} "${FIXorog}/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile2.nc" "./orog.tile2.nc" -${NLN} "${FIXorog}/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile3.nc" "./orog.tile3.nc" -${NLN} "${FIXorog}/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile4.nc" "./orog.tile4.nc" -${NLN} "${FIXorog}/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile5.nc" "./orog.tile5.nc" -${NLN} "${FIXorog}/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile6.nc" "./orog.tile6.nc" +${NLN} "${FIXgfs}/orog/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile1.nc" "./orog.tile1.nc" +${NLN} "${FIXgfs}/orog/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile2.nc" "./orog.tile2.nc" +${NLN} "${FIXgfs}/orog/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile3.nc" "./orog.tile3.nc" +${NLN} "${FIXgfs}/orog/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile4.nc" "./orog.tile4.nc" +${NLN} "${FIXgfs}/orog/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile5.nc" "./orog.tile5.nc" +${NLN} "${FIXgfs}/orog/${CASE}/${CASE}.mx${OCNRES}_oro_data.tile6.nc" "./orog.tile6.nc" ${NLN} "${SIGLEVEL}" "./vcoord.txt" diff --git a/ush/getdump.sh b/ush/getdump.sh index 462ca5e755..7ab241ca1a 100755 --- a/ush/getdump.sh +++ b/ush/getdump.sh @@ -1,6 +1,6 @@ #! 
/usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" COMPONENT=${COMPONENT:-atmos} diff --git a/ush/getges.sh b/ush/getges.sh index 2fb54fccc7..d960354bf4 100755 --- a/ush/getges.sh +++ b/ush/getges.sh @@ -76,7 +76,7 @@ ################################################################################ #------------------------------------------------------------------------------- -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # Set some default parameters. fhbeg=03 # hour to begin searching backward for guess diff --git a/ush/gfs_bfr2gpk.sh b/ush/gfs_bfr2gpk.sh index add68536ec..dbd8defb0e 100755 --- a/ush/gfs_bfr2gpk.sh +++ b/ush/gfs_bfr2gpk.sh @@ -10,7 +10,7 @@ # Log: # # K. Brill/HPC 04/12/05 # ######################################################################### -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # Set GEMPAK paths. diff --git a/ush/gfs_bufr.sh b/ush/gfs_bufr.sh index 5ed05f9beb..5bd0a5e7f7 100755 --- a/ush/gfs_bufr.sh +++ b/ush/gfs_bufr.sh @@ -17,9 +17,10 @@ # 2018-05-22 Guang Ping Lou: Making it work for both GFS and FV3GFS # 2018-05-30 Guang Ping Lou: Make sure all files are available. # 2019-10-10 Guang Ping Lou: Read in NetCDF files +# 2024-03-03 Bo Cui: Add options to use different bufr table for different resolution NetCDF files # echo "History: February 2003 - First implementation of this utility script" # -source "${HOMEgfs:?}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" if [[ "${F00FLAG}" == "YES" ]]; then f00flag=".true." @@ -76,11 +77,23 @@ for (( hr = 10#${FSTART}; hr <= 10#${FEND}; hr = hr + 10#${FINT} )); do done # define input BUFR table file. 
-ln -sf "${PARMbufrsnd}/bufr_gfs_${CLASS}.tbl" fort.1 -ln -sf "${STNLIST:-${PARMbufrsnd}/bufr_stalist.meteo.gfs}" fort.8 -ln -sf "${PARMbufrsnd}/bufr_ij13km.txt" fort.7 +ln -sf "${PARMgfs}/product/bufr_gfs_${CLASS}.tbl" fort.1 +ln -sf "${STNLIST:-${PARMgfs}/product/bufr_stalist.meteo.gfs}" fort.8 -${APRUN_POSTSND} "${EXECbufrsnd}/${pgm}" < gfsparm > "out_gfs_bufr_${FEND}" +case "${CASE}" in + "C768") + ln -sf "${PARMgfs}/product/bufr_ij13km.txt" fort.7 + ;; + "C1152") + ln -sf "${PARMgfs}/product/bufr_ij9km.txt" fort.7 + ;; + *) + echo "WARNING: No bufr table for this resolution, using the one for C768" + ln -sf "${PARMgfs}/product/bufr_ij13km.txt" fort.7 + ;; +esac + +${APRUN_POSTSND} "${EXECgfs}/${pgm}" < gfsparm > "out_gfs_bufr_${FEND}" export err=$? if [ $err -ne 0 ]; then diff --git a/ush/gfs_bufr_netcdf.sh b/ush/gfs_bufr_netcdf.sh index b358c6b69a..843922e53b 100755 --- a/ush/gfs_bufr_netcdf.sh +++ b/ush/gfs_bufr_netcdf.sh @@ -19,7 +19,7 @@ # 2019-10-10 Guang Ping Lou: Read in NetCDF files # echo "History: February 2003 - First implementation of this utility script" # -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" if test "$F00FLAG" = "YES" then @@ -105,11 +105,11 @@ do done # define input BUFR table file. -ln -sf $PARMbufrsnd/bufr_gfs_${CLASS}.tbl fort.1 -ln -sf ${STNLIST:-$PARMbufrsnd/bufr_stalist.meteo.gfs} fort.8 -ln -sf $PARMbufrsnd/bufr_ij13km.txt fort.7 +ln -sf ${PARMgfs}/product/bufr_gfs_${CLASS}.tbl fort.1 +ln -sf ${STNLIST:-${PARMgfs}/product/bufr_stalist.meteo.gfs} fort.8 +ln -sf ${PARMgfs}/product/bufr_ij13km.txt fort.7 -${APRUN_POSTSND} "${EXECbufrsnd}/${pgm}" < gfsparm > "out_gfs_bufr_${FEND}" +${APRUN_POSTSND} "${EXECgfs}/${pgm}" < gfsparm > "out_gfs_bufr_${FEND}" export err=$? 
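The `case` statement added to `gfs_bufr.sh` above picks a BUFR station-index table by model resolution, falling back to the C768 table with a warning. The same selection, wrapped in a hypothetical `select_bufr_table` helper (the function name is an assumption for testability; the table filenames come from the patch):

```shell
#!/usr/bin/env bash
# Sketch of the per-resolution BUFR table selection from gfs_bufr.sh.
# select_bufr_table is a hypothetical wrapper; only the filenames are real.
select_bufr_table() {
  local CASE=${1:?}
  case "${CASE}" in
    "C768")  echo "bufr_ij13km.txt" ;;   # ~13 km grid
    "C1152") echo "bufr_ij9km.txt"  ;;   # ~9 km grid
    *)
      # warning goes to stderr so the table name on stdout stays clean
      echo "WARNING: No bufr table for ${CASE}, using the one for C768" >&2
      echo "bufr_ij13km.txt"
      ;;
  esac
}

select_bufr_table C1152   # -> bufr_ij9km.txt
```

Keeping the fallback explicit (rather than failing) matches the patch's choice to degrade gracefully for untabulated resolutions.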
exit ${err} diff --git a/ush/gfs_sndp.sh b/ush/gfs_sndp.sh index 99c5c68fa3..ade49eec36 100755 --- a/ush/gfs_sndp.sh +++ b/ush/gfs_sndp.sh @@ -7,7 +7,7 @@ # 1) 2004-09-10 Steve Gilbert First Implementation ################################################################ -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # Create "collectives" consisting of groupings of the soundings # into files designated by geographical region. Each input @@ -16,7 +16,7 @@ source "$HOMEgfs/ush/preamble.sh" export m=$1 mkdir $DATA/$m cd $DATA/$m - cp $FIXbufrsnd/gfs_collective${m}.list $DATA/$m/. + cp ${FIXgfs}/product/gfs_collective${m}.list $DATA/$m/. CCCC=KWBC file_list=gfs_collective${m}.list @@ -37,7 +37,7 @@ cd $DATA/$m #. prep_step export FORT11=$DATA/${m}/bufrin export FORT51=./bufrout - ${EXECbufrsnd}/${pgm} << EOF + ${EXECgfs}/${pgm} << EOF &INPUT BULHED="$WMOHEAD",KWBX="$CCCC", NCEP2STD=.TRUE., diff --git a/ush/gfs_truncate_enkf.sh b/ush/gfs_truncate_enkf.sh index 0a7d6fc0dd..6102ada75d 100755 --- a/ush/gfs_truncate_enkf.sh +++ b/ush/gfs_truncate_enkf.sh @@ -1,6 +1,6 @@ #! 
/usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" member=$1 export SIGINP=$2 @@ -14,17 +14,16 @@ mkdir -p $DATATMP cd $DATATMP export LEVS=${LEVS_LORES:-64} -export FIXam=${FIXam:-$HOMEgfs/fix/am} export CHGRESSH=${CHGRESSH:-${USHgfs}/global_chgres.sh} export CHGRESEXEC=${CHGRESEXEC-${EXECgfs}/global_chgres} -export OROGRAPHY=${OROGRAPHY_LORES:-$FIXam/global_orography.t$JCAP.$LONB.$LATB.grb} -export OROGRAPHY_UF=${OROGRAPHY_UF_LORES:-$FIXam/global_orography_uf.t$JCAP.$LONB.$LATB.grb} -export LONSPERLAT=${LONSPERLAT_LORES:-$FIXam/global_lonsperlat.t${JCAP}.$LONB.$LATB.txt} -export SLMASK=${SLMASK_LORES:-$FIXam/global_slmask.t$JCAP.$LONB.$LATB.grb} -export MTNVAR=${MTNVAR_LORES:-$FIXam/global_mtnvar.t$JCAP.$LONB.$LATB.f77} -export SIGLEVEL=${SIGLEVEL_LORES:-$FIXam/global_hyblev.l${LEVS}.txt} -export O3CLIM=${O3CLIM:-$FIXam/global_o3clim.txt} +export OROGRAPHY=${OROGRAPHY_LORES:-${FIXgfs}/am/global_orography.t$JCAP.$LONB.$LATB.grb} +export OROGRAPHY_UF=${OROGRAPHY_UF_LORES:-${FIXgfs}/am/global_orography_uf.t$JCAP.$LONB.$LATB.grb} +export LONSPERLAT=${LONSPERLAT_LORES:-${FIXgfs}/am/global_lonsperlat.t${JCAP}.$LONB.$LATB.txt} +export SLMASK=${SLMASK_LORES:-${FIXgfs}/am/global_slmask.t$JCAP.$LONB.$LATB.grb} +export MTNVAR=${MTNVAR_LORES:-${FIXgfs}/am/global_mtnvar.t$JCAP.$LONB.$LATB.f77} +export SIGLEVEL=${SIGLEVEL_LORES:-${FIXgfs}/am/global_hyblev.l${LEVS}.txt} +export O3CLIM=${O3CLIM:-${FIXgfs}/am/global_o3clim.txt} use_ufo=.true. 
diff --git a/ush/global_savefits.sh b/ush/global_savefits.sh index f26132dd8a..973d27a358 100755 --- a/ush/global_savefits.sh +++ b/ush/global_savefits.sh @@ -3,7 +3,7 @@ ######################################################## # save fit and horiz files for all analysis cycles ######################################################## -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" export FIT_DIR=${FIT_DIR:-$COMOUT/fits} export HORZ_DIR=${HORZ_DIR:-$COMOUT/horiz} diff --git a/ush/hpssarch_gen_emc.sh b/ush/hpssarch_gen_emc.sh index c34fff1a84..90f9398656 100755 --- a/ush/hpssarch_gen_emc.sh +++ b/ush/hpssarch_gen_emc.sh @@ -4,7 +4,7 @@ # Fanglin Yang, 20180318 # --create bunches of files to be archived to HPSS ################################################### -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" type=${1:-gfs} ##gfs, gdas, enkfgdas or enkfggfs @@ -92,7 +92,7 @@ if [[ ${type} = "gfs" ]]; then # This uses the bash extended globbing option { echo "./logs/${PDY}${cyc}/gfs!(arch).log" - echo "${COM_ATMOS_HISTORY/${ROTDIR}\//}/input.nml" + echo "${COM_CONF/${ROTDIR}\//}/ufs.input.nml" if [[ ${MODE} = "cycled" ]]; then if [[ -s "${COM_ATMOS_ANALYSIS}/${head}gsistat" ]]; then @@ -251,48 +251,64 @@ if [[ ${type} = "gfs" ]]; then } >> "${DATA}/gfswave.txt" fi - if [[ ${DO_OCN} = "YES" ]]; then + if [[ "${DO_OCN}" == "YES" ]]; then - head="gfs.t${cyc}z." + head="gfs.ocean.t${cyc}z." 
+ rm -f "${DATA}/ocean_6hravg.txt"; touch "${DATA}/ocean_6hravg.txt" + rm -f "${DATA}/ocean_daily.txt"; touch "${DATA}/ocean_daily.txt" + rm -f "${DATA}/ocean_grib2.txt"; touch "${DATA}/ocean_grib2.txt" - rm -f "${DATA}/gfs_flux_1p00.txt" - rm -f "${DATA}/ocn_ice_grib2_0p5.txt" - rm -f "${DATA}/ocn_ice_grib2_0p25.txt" - rm -f "${DATA}/ocn_2D.txt" - rm -f "${DATA}/ocn_3D.txt" - rm -f "${DATA}/ocn_xsect.txt" - rm -f "${DATA}/ocn_daily.txt" - touch "${DATA}/gfs_flux_1p00.txt" - touch "${DATA}/ocn_ice_grib2_0p5.txt" - touch "${DATA}/ocn_ice_grib2_0p25.txt" - touch "${DATA}/ocn_2D.txt" - touch "${DATA}/ocn_3D.txt" - touch "${DATA}/ocn_xsect.txt" - touch "${DATA}/ocn_daily.txt" - echo "${COM_OCEAN_INPUT/${ROTDIR}\//}/MOM_input" >> "${DATA}/ocn_2D.txt" - echo "${COM_OCEAN_2D/${ROTDIR}\//}/ocn_2D*" >> "${DATA}/ocn_2D.txt" - echo "${COM_OCEAN_3D/${ROTDIR}\//}/ocn_3D*" >> "${DATA}/ocn_3D.txt" - echo "${COM_OCEAN_XSECT/${ROTDIR}\//}/ocn*EQ*" >> "${DATA}/ocn_xsect.txt" - echo "${COM_OCEAN_HISTORY/${ROTDIR}\//}/ocn_daily*" >> "${DATA}/ocn_daily.txt" - echo "${COM_OCEAN_GRIB_0p50/${ROTDIR}\//}/ocn_ice*0p5x0p5.grb2" >> "${DATA}/ocn_ice_grib2_0p5.txt" - echo "${COM_OCEAN_GRIB_0p25/${ROTDIR}\//}/ocn_ice*0p25x0p25.grb2" >> "${DATA}/ocn_ice_grib2_0p25.txt" + echo "${COM_OCEAN_HISTORY/${ROTDIR}\//}/${head}6hr_avg.f*.nc" >> "${DATA}/ocean_6hravg.txt" + echo "${COM_OCEAN_HISTORY/${ROTDIR}\//}/${head}daily.f*.nc" >> "${DATA}/ocean_daily.txt" + + { + if [[ -d "${COM_OCEAN_GRIB}/5p00" ]]; then + echo "${COM_OCEAN_GRIB/${ROTDIR}\//}/5p00/${head}5p00.f*.grib2" + echo "${COM_OCEAN_GRIB/${ROTDIR}\//}/5p00/${head}5p00.f*.grib2.idx" + fi + if [[ -d "${COM_OCEAN_GRIB}/1p00" ]]; then + echo "${COM_OCEAN_GRIB/${ROTDIR}\//}/1p00/${head}1p00.f*.grib2" + echo "${COM_OCEAN_GRIB/${ROTDIR}\//}/1p00/${head}1p00.f*.grib2.idx" + fi + if [[ -d "${COM_OCEAN_GRIB}/0p25" ]]; then + echo "${COM_OCEAN_GRIB/${ROTDIR}\//}/0p25/${head}0p25.f*.grib2" + echo 
"${COM_OCEAN_GRIB/${ROTDIR}\//}/0p25/${head}0p25.f*.grib2.idx" + fi + } >> "${DATA}/ocean_grib2.txt" # Also save fluxes from atmosphere + head="gfs.t${cyc}z." + rm -f "${DATA}/gfs_flux_1p00.txt"; touch "${DATA}/gfs_flux_1p00.txt" { echo "${COM_ATMOS_GRIB_1p00/${ROTDIR}\//}/${head}flux.1p00.f???" echo "${COM_ATMOS_GRIB_1p00/${ROTDIR}\//}/${head}flux.1p00.f???.idx" } >> "${DATA}/gfs_flux_1p00.txt" fi - if [[ ${DO_ICE} = "YES" ]]; then - head="gfs.t${cyc}z." + if [[ "${DO_ICE}" == "YES" ]]; then + head="gfs.ice.t${cyc}z." + rm -f "${DATA}/ice_6hravg.txt"; touch "${DATA}/ice_6hravg.txt" + rm -f "${DATA}/ice_grib2.txt"; touch "${DATA}/ice_grib2.txt" - rm -f "${DATA}/ice.txt" - touch "${DATA}/ice.txt" { - echo "${COM_ICE_INPUT/${ROTDIR}\//}/ice_in" - echo "${COM_ICE_HISTORY/${ROTDIR}\//}/ice*nc" - } >> "${DATA}/ice.txt" + echo "${COM_ICE_HISTORY/${ROTDIR}\//}/${head}ic.nc" + echo "${COM_ICE_HISTORY/${ROTDIR}\//}/${head}6hr_avg.f*.nc" + } >> "${DATA}/ice_6hravg.txt" + + { + if [[ -d "${COM_ICE_GRIB}/5p00" ]]; then + echo "${COM_ICE_GRIB/${ROTDIR}\//}/5p00/${head}5p00.f*.grib2" + echo "${COM_ICE_GRIB/${ROTDIR}\//}/5p00/${head}5p00.f*.grib2.idx" + fi + if [[ -d "${COM_ICE_GRIB}/1p00" ]]; then + echo "${COM_ICE_GRIB/${ROTDIR}\//}/1p00/${head}1p00.f*.grib2" + echo "${COM_ICE_GRIB/${ROTDIR}\//}/1p00/${head}1p00.f*.grib2.idx" + fi + if [[ -d "${COM_ICE_GRIB}/0p25" ]]; then + echo "${COM_ICE_GRIB/${ROTDIR}\//}/0p25/${head}0p25.f*.grib2" + echo "${COM_ICE_GRIB/${ROTDIR}\//}/0p25/${head}0p25.f*.grib2.idx" + fi + } >> "${DATA}/ice_grib2.txt" fi if [[ ${DO_AERO} = "YES" ]]; then @@ -360,6 +376,9 @@ if [[ ${type} == "gdas" ]]; then echo "${COM_CHEM_ANALYSIS/${ROTDIR}\//}/${head}aerostat" fi fi + if [[ -s "${COM_SNOW_ANALYSIS}/${head}snowstat.tgz" ]]; then + echo "${COM_SNOW_ANALYSIS/${ROTDIR}\//}/${head}snowstat.tgz" + fi if [[ -s "${COM_ATMOS_ANALYSIS}/${head}radstat" ]]; then echo "${COM_ATMOS_ANALYSIS/${ROTDIR}\//}/${head}radstat" fi @@ -368,7 +387,10 @@ if [[ ${type} == "gdas" 
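The ocean/ice hunks above build HPSS archive lists inside `{ ... } >> file` groups, guarded by `[[ -d ... ]]` so only resolutions that were actually produced get listed. A sketch of that pattern with throwaway paths (the temp directories stand in for `${COM_OCEAN_GRIB}/<res>`):

```shell
#!/usr/bin/env bash
# Sketch of the existence-guarded archive-list pattern from hpssarch_gen_emc.sh.
# Temp paths stand in for ${COM_OCEAN_GRIB}; only 1p00 "exists" here.
COM_OCEAN_GRIB=$(mktemp -d)
mkdir -p "${COM_OCEAN_GRIB}/1p00"
head="gfs.ocean.t00z."

list_file=$(mktemp)
{
  for res in 5p00 1p00 0p25; do
    # skip resolutions whose COM directory was never created
    if [[ -d "${COM_OCEAN_GRIB}/${res}" ]]; then
      echo "${COM_OCEAN_GRIB}/${res}/${head}${res}.f*.grib2"
      echo "${COM_OCEAN_GRIB}/${res}/${head}${res}.f*.grib2.idx"
    fi
  done
} >> "${list_file}"

wc -l < "${list_file}"   # 2 lines: only the 1p00 patterns
```

Emitting glob patterns rather than resolved filenames matches how these `.txt` manifests are consumed later by the tar step.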
]]; then echo "./logs/${PDY}${cyc}/gdas${fstep}.log" fi done - echo "./logs/${PDY}${cyc}/gdaspost*.log" + echo "./logs/${PDY}${cyc}/gdas*prod*.log" + if [[ "${WRITE_DOPOST}" == ".false." ]]; then + echo "./logs/${PDY}${cyc}/gdas*upp*.log" + fi fh=0 while [[ ${fh} -le 9 ]]; do @@ -407,10 +429,13 @@ if [[ ${type} == "gdas" ]]; then fi subtyplist="gome_metop-b omi_aura ompslp_npp ompsnp_n20 ompsnp_npp ompstc8_n20 ompstc8_npp sbuv2_n19" for subtype in ${subtyplist}; do - echo "${COM_ATMOS_OZNMON/${ROTDIR}\//}/${type}/${subtype}.anl.${PDY}${cyc}.ieee_d${suffix}" - echo "${COM_ATMOS_OZNMON/${ROTDIR}\//}/${type}/${subtype}.anl.ctl" - echo "${COM_ATMOS_OZNMON/${ROTDIR}\//}/${type}/${subtype}.ges.${PDY}${cyc}.ieee_d${suffix}" - echo "${COM_ATMOS_OZNMON/${ROTDIR}\//}/${type}/${subtype}.ges.ctl" + # On occasion, data is not available for some of these satellites. Check for existence. + if [[ -s "${COM_ATMOS_OZNMON/${ROTDIR}\//}/${type}/${subtype}.ges.${PDY}${cyc}.ieee_d${suffix}" ]]; then + echo "${COM_ATMOS_OZNMON/${ROTDIR}\//}/${type}/${subtype}.anl.${PDY}${cyc}.ieee_d${suffix}" + echo "${COM_ATMOS_OZNMON/${ROTDIR}\//}/${type}/${subtype}.anl.ctl" + echo "${COM_ATMOS_OZNMON/${ROTDIR}\//}/${type}/${subtype}.ges.${PDY}${cyc}.ieee_d${suffix}" + echo "${COM_ATMOS_OZNMON/${ROTDIR}\//}/${type}/${subtype}.ges.ctl" + fi done echo "${COM_ATMOS_OZNMON/${ROTDIR}\//}/${type}/stdout.${type}.tar.gz" done @@ -478,6 +503,15 @@ if [[ ${type} == "gdas" ]]; then echo "${COM_ATMOS_RESTART/${ROTDIR}\//}/*0000.sfcanl_data.tile4.nc" echo "${COM_ATMOS_RESTART/${ROTDIR}\//}/*0000.sfcanl_data.tile5.nc" echo "${COM_ATMOS_RESTART/${ROTDIR}\//}/*0000.sfcanl_data.tile6.nc" + + [[ -s "${COM_CONF}/${head}letkfoi.yaml" ]] && echo "${COM_CONF/${ROTDIR}\//}/${head}letkfoi.yaml" + + echo "${COM_SNOW_ANALYSIS/${ROTDIR}\//}/*0000.sfc_data.tile1.nc" + echo "${COM_SNOW_ANALYSIS/${ROTDIR}\//}/*0000.sfc_data.tile2.nc" + echo "${COM_SNOW_ANALYSIS/${ROTDIR}\//}/*0000.sfc_data.tile3.nc" + echo
"${COM_SNOW_ANALYSIS/${ROTDIR}\//}/*0000.sfc_data.tile4.nc" + echo "${COM_SNOW_ANALYSIS/${ROTDIR}\//}/*0000.sfc_data.tile5.nc" + echo "${COM_SNOW_ANALYSIS/${ROTDIR}\//}/*0000.sfc_data.tile6.nc" } >> "${DATA}/gdas_restarta.txt" #.................. @@ -612,8 +646,17 @@ if [[ ${type} == "enkfgdas" || ${type} == "enkfgfs" ]]; then fi fi done # loop over FHR - for fstep in eobs ecen esfc eupd efcs epos ; do - echo "logs/${PDY}${cyc}/${RUN}${fstep}*.log" + for fstep in fcst epos ; do + echo "logs/${PDY}${cyc}/${RUN}${fstep}*.log" + done + + # eobs, ecen, esfc, and eupd are not run on the first cycle + for fstep in eobs ecen esfc eupd ; do + for log in "${ROTDIR}/logs/${PDY}${cyc}/${RUN}${fstep}"*".log"; do + if [[ -s "${log}" ]]; then + echo "logs/${PDY}${cyc}/${RUN}${fstep}*.log" + fi + done done # eomg* are optional jobs @@ -650,7 +693,7 @@ if [[ ${type} == "enkfgdas" || ${type} == "enkfgfs" ]]; then touch "${DATA}/${RUN}_restartb_grp${n}.txt" m=1 - while (( m <= NMEM_EARCGRP )); do + while (( m <= NMEM_EARCGRP && (n-1)*NMEM_EARCGRP+m <= NMEM_ENS )); do nm=$(((n-1)*NMEM_EARCGRP+m)) mem=$(printf %03i ${nm}) head="${RUN}.t${cyc}z." @@ -751,4 +794,3 @@ fi ##end of enkfgdas or enkfgfs #----------------------------------------------------- exit 0 - diff --git a/ush/hpssarch_gen_gsl.sh b/ush/hpssarch_gen_gsl.sh index a3d7d8206f..cd8eefd5f3 100755 --- a/ush/hpssarch_gen_gsl.sh +++ b/ush/hpssarch_gen_gsl.sh @@ -8,9 +8,9 @@ # --only echo name of file if file exists # --assume we only run cold starts ################################################### -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" -type=${1:-gfs} ##gfs +type=${1:-gfs} ##gfs ##JKH ARCH_GAUSSIAN=${ARCH_GAUSSIAN:-"YES"} ARCH_GAUSSIAN_FHMAX=${ARCH_GAUSSIAN_FHMAX:-36} @@ -31,7 +31,7 @@ if [[ ${type} = "gfs" ]]; then FHMAX_HF_GFS=${FHMAX_HF_GFS:-120} FHOUT_HF_GFS=${FHOUT_HF_GFS:-1} - rm -f "${DATA}/gfs_pgrb2.txt" + rm -f "${DATA}/gfs_pgrb2.txt" ##JKH session start here!!!! 
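The tightened restartb loop condition above (`m <= NMEM_EARCGRP && (n-1)*NMEM_EARCGRP+m <= NMEM_ENS`) can be exercised in isolation. This sketch reuses the script's variable names (`NMEM_ENS`, `NMEM_EARCGRP`, `n`, `m`) but with hypothetical values; it only collects the member labels rather than writing an archive list:

```shell
#!/usr/bin/env bash
# Hypothetical member counts, not taken from any real configuration
NMEM_ENS=10        # total ensemble members
NMEM_EARCGRP=4     # members archived per tar group
n=3                # current group number (1-based)

members=()
m=1
# Stop at the group size OR at the last real member, whichever comes first.
# Without the second condition the final group would reference members
# 011 and 012, which do not exist when NMEM_ENS is not a multiple of
# NMEM_EARCGRP.
while (( m <= NMEM_EARCGRP && (n-1)*NMEM_EARCGRP+m <= NMEM_ENS )); do
  nm=$(((n-1)*NMEM_EARCGRP+m))
  members+=("$(printf %03i "${nm}")")
  ((m = m + 1))
done
echo "${members[*]}"   # 009 010
```

With 10 members in groups of 4, group 3 now yields only members 009 and 010 instead of running past the ensemble size.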
rm -f "${DATA}/gfs_ics.txt" touch "${DATA}/gfs_pgrb2.txt" touch "${DATA}/gfs_ics.txt" @@ -121,4 +121,4 @@ if [[ ${type} = "gfs" ]]; then fi ##end of gfs #----------------------------------------------------- -#JKHexit 0 +#JKHexit 0 ##JKH session end here!!!! diff --git a/ush/icepost.ncl b/ush/icepost.ncl deleted file mode 100755 index ad102971c4..0000000000 --- a/ush/icepost.ncl +++ /dev/null @@ -1,382 +0,0 @@ -;------------------------------------------------------------------ -; Denise.Worthen@noaa.gov (Feb 2019) -; -; This script will remap CICE5 output on the tripole grid to -; a set of rectilinear grids using pre-computed ESMF weights to remap -; the listed fields to the destination grid and write the results -; to a new netCDF file -; -; See ocnpost.ncl for a complete description -; -; Bin.Li@noaa.gov (May 2019) -; This script is revised to be used in the coupled workflow. -; Revised parts are marked by - - load "$NCARG_ROOT/lib/ncarg/nclscripts/esmf/ESMF_regridding.ncl" - -;---------------------------------------------------------------------- -begin - -;************************************************ -; specify parameters -;************************************************ -; - - output_masks = False - ; destination grid sizes and name - dsttype = (/"rect."/) - ;dstgrds = (/"1p0", "0p5", "0p25"/) -; - - ; specify a location to use - ; nemsrc = "/scratch4/NCEPDEV/ocean/save/Denise.Worthen/NEMS_INPUT0.1/ocnicepost/" - ; interpolation methods - methods = (/"bilinear" ,"conserve"/) - ; ocean model output location - ;dirsrc = "/scratch3/NCEPDEV/stmp2/Denise.Worthen/BM1_ice/" - - - ; variables to be regridded with the native tripole stagger location - - varlist = (/ (/ "hi_h", "Ct", "bilinear"/) \ - ,(/ "hs_h", "Ct", "bilinear"/) \ - ,(/ "Tsfc_h", "Ct", "bilinear"/) \ - ,(/ "aice_h", "Ct", "bilinear"/) \ - ,(/ "sst_h", "Ct", "bilinear"/) \ - /) - dims = dimsizes(varlist) - nvars = dims(0) - delete(dims) - ;print(varlist) - - ; vectors to be regridded with the 
native tripole stagger location - ; and dimensionality - ; note: vectors are always unstaggered using bilinear weights, but can - ; be remapped using conservative - nvpairs = 1 - veclist = new( (/nvpairs,3,2/),"string") - veclist = (/ (/ (/"uvel_h", "vvel_h"/), (/"Bu", "Bu"/), (/"bilinear", "bilinear"/) /) \ - /) - ;print(veclist) - - begTime = get_cpu_time() -;---------------------------------------------------------------------- -; make a list of the directories and files from the run -;---------------------------------------------------------------------- -; idate = "20120101" -; icefilelist = systemfunc("ls "+dirsrc+"gfs."+idate+"/00/"+"ice*.nc") -; icef = addfiles(icefilelist,"r") -; nfiles = dimsizes(icefilelist) -; - - ; get the rotation angle - angleT = icef[0]->ANGLET - - ; get a 2 dimensional fields for creating the interpolation mask - ; the mask2d contain 1's on land and 0's at valid points. - mask2d = where(ismissing(icef[0]->sst_h), 1.0, 0.0) - ;printVarSummary(mask2d) - - ; create conformed rotation arrays to make vector rotations cleaner - angleT2d=conform_dims(dimsizes(mask2d),angleT,(/1,2/)) - -;---------------------------------------------------------------------- -; loop over the output resolutions -;---------------------------------------------------------------------- - - jj = 1 - ii = 0 - - do jj = 0,dimsizes(dstgrds)-1 - ;outres = "_"+dstgrds(jj)+"x"+dstgrds(jj) - outres = dstgrds(jj)+"x"+dstgrds(jj) - outgrid = dstgrds(jj) - - ; regrid a field to obtain the output xy dimensions - wgtsfile = nemsrc+"/"+"tripole.mx025.Ct.to."+dsttype+dstgrds(jj)+".bilinear.nc" - tt = ESMF_regrid_with_weights(angleT,wgtsfile,False) - tt!0 = "lat" - tt!1 = "lon" - lat = tt&lat - lon = tt&lon - dims = dimsizes(tt) - nlat = dims(0) - nlon = dims(1) - print("fields will be remapped to destination grid size "\ - +nlon+" "+nlat) - - delete(tt) - delete(dims) - - ; regrid the masks to obtain the interpolation masks. 
- ; the mask2d contain 1's on land and 0's at valid points. - ; when remapped, any mask value > 0 identifies land values that - ; have crept into the field. remapped model fields are then - ; masked with this interpolation mask - - wgtsfile = nemsrc+"/"+"tripole.mx025.Ct.to."+dsttype+dstgrds(jj)+".bilinear.nc" - rgmask2d = ESMF_regrid_with_weights(mask2d, wgtsfile,False) - - if(output_masks)then - testfile = "masks_"+dstgrds(jj)+".nc" - system("/bin/rm -f "+testfile) - ; create - testcdf = addfile(testfile,"c") - testcdf->rgmask2d = rgmask2d - ; close - delete(testcdf) - end if - - ; create the interpolation mask - rgmask2d = where(rgmask2d .gt. 0.0, rgmask2d@_FillValue, 1.0) - -;---------------------------------------------------------------------- -; loop over each file in the icefilelist -;---------------------------------------------------------------------- -; - ; retrieve the time stamp - time = icef[0]->time - delete(time@bounds) - -;---------------------------------------------------------------------- -; set up the output netcdf file -;---------------------------------------------------------------------- -; system("/bin/rm -f " + outfile) ; remove if exists -; outcdf = addfile (outfile, "c") ; open output file -; -; - - ; explicitly declare file definition mode. Improve efficiency. 
- setfileoption(outcdf,"DefineMode",True) - - ; create global attributes of the file - fAtt = True ; assign file attributes - fAtt@creation_date = systemfunc ("date") - fAtt@source_file = infile - fileattdef( outcdf, fAtt ) ; copy file attributes - - ; predefine the coordinate variables and their dimensionality - dimNames = (/"time", "lat", "lon"/) - dimSizes = (/ -1 , nlat, nlon/) - dimUnlim = (/ True , False, False/) - filedimdef(outcdf,dimNames,dimSizes,dimUnlim) - - ; predefine the the dimensionality of the variables to be written out - filevardef(outcdf, "time", typeof(time), getvardims(time)) - filevardef(outcdf, "lat", typeof(lat), getvardims(lat)) - filevardef(outcdf, "lon", typeof(lon), getvardims(lon)) - - ; Copy attributes associated with each variable to the file - filevarattdef(outcdf, "time", time) - filevarattdef(outcdf, "lat", lat) - filevarattdef(outcdf, "lon", lon) - - ; predefine variables - do nv = 0,nvars-1 - varname = varlist(nv,0) - odims = (/"time", "lat", "lon"/) - ;print("creating variable "+varname+" in file") - filevardef(outcdf, varname, "float", odims) - delete(odims) - end do - - do nv = 0,nvpairs-1 - do nn = 0,1 - vecname = veclist(nv,0,nn) - odims = (/"time", "lat", "lon"/) - ;print("creating variable "+vecname+" in file") - filevardef(outcdf, vecname, "float", odims) - delete(odims) - end do - end do - - ; explicitly exit file definition mode. 
- setfileoption(outcdf,"DefineMode",False) - - lat=lat(::-1) - ; write the dimensions to the file - outcdf->time = (/time/) - outcdf->lat = (/lat/) - outcdf->lon = (/lon/) - -;---------------------------------------------------------------------- -; loop over nvars variables -;---------------------------------------------------------------------- - - ;nv = 1 - do nv = 0,nvars-1 - varname = varlist(nv,0) - vargrid = varlist(nv,1) - varmeth = varlist(nv,2) - - ;print(nv+" "+varname+" "+vargrid+" "+varmeth) - icevar = icef[ii]->$varname$ - ndims = dimsizes(dimsizes(icevar)) - ;print(ndims+" "+dimsizes(icevar)) - - if(vargrid .ne. "Ct")then - ; print error if the variable is not on the Ct grid - print("Variable is not on Ct grid") - exit - end if - - ; regrid to dsttype+dstgrd with method - ;print("remapping "+varname+" to grid "+dsttype+dstgrds(jj)) - wgtsfile = nemsrc+"/"+"tripole.mx025.Ct.to."+dsttype+dstgrds(jj)+"."+varmeth+".nc" - - rgtt = ESMF_regrid_with_weights(icevar,wgtsfile,False) - rgtt = where(ismissing(rgmask2d),icevar@_FillValue,rgtt) - rgtt=rgtt(:,::-1,:) - - ; enter file definition mode to add variable attributes - setfileoption(outcdf,"DefineMode",True) - filevarattdef(outcdf, varname, rgtt) - setfileoption(outcdf,"DefineMode",False) - - - outcdf->$varname$ = (/rgtt/) - - delete(icevar) - delete(rgtt) - - ; nv, loop over number of variables - end do - -;---------------------------------------------------------------------- -; -;---------------------------------------------------------------------- - - ;nv = 0 - do nv = 0,nvpairs-1 - vecnames = veclist(nv,0,:) - vecgrids = veclist(nv,1,:) - vecmeth = veclist(nv,2,:) - ;print(nv+" "+vecnames+" "+vecgrids+" "+vecmeth) - - ; create a vector pair list - vecpairs = NewList("fifo") - n = 0 - uvel = icef[ii]->$vecnames(n)$ - vecfld = where(ismissing(uvel),0.0,uvel) - copy_VarAtts(uvel,vecfld) - ;print("unstagger "+vecnames(n)+" from "+vecgrids(n)+" to Ct") - wgtsfile = 
nemsrc+"/"+"tripole.mx025."+vecgrids(n)+".to.Ct.bilinear.nc" - ut = ESMF_regrid_with_weights(vecfld,wgtsfile,False) - delete(ut@remap) - - n = 1 - vvel = icef[ii]->$vecnames(n)$ - vecfld = where(ismissing(vvel),0.0,vvel) - copy_VarAtts(vvel,vecfld) - ;print("unstagger "+vecnames(n)+" from "+vecgrids(n)+" to Ct") - wgtsfile = nemsrc+"/"+"tripole.mx025."+vecgrids(n)+".to.Ct.bilinear.nc" - vt = ESMF_regrid_with_weights(vecfld,wgtsfile,False) - delete(vt@remap) - - ListAppend(vecpairs,ut) - ListAppend(vecpairs,vt) - ;print(vecpairs) - - ; rotate - ; first copy Metadata - urot = vecpairs[0] - vrot = vecpairs[1] - urot = cos(angleT2d)*ut - sin(angleT2d)*vt - vrot = sin(angleT2d)*ut + cos(angleT2d)*vt - - ; change attribute to indicate these are now rotated velocities - urot@long_name=str_sub_str(urot@long_name,"(x)","zonal") - vrot@long_name=str_sub_str(vrot@long_name,"(y)","meridional") - ; copy back - vecpairs[0] = urot - vecpairs[1] = vrot - delete([/urot, vrot/]) - - ; remap - do n = 0,1 - vecfld = vecpairs[n] - ; regrid to dsttype+dstgrd with method - ;print("remapping "+vecnames(n)+" to grid "+dsttype+dstgrds(jj)) - wgtsfile = nemsrc+"/"+"tripole.mx025.Ct.to."+dsttype+dstgrds(jj)+"."+vecmeth(n)+".nc" - - rgtt = ESMF_regrid_with_weights(vecfld,wgtsfile,False) - rgtt = where(ismissing(rgmask2d),vecfld@_FillValue,rgtt) - rgtt=rgtt(:,::-1,:) - - ; enter file definition mode to add variable attributes - setfileoption(outcdf,"DefineMode",True) - filevarattdef(outcdf, vecnames(n), rgtt) - setfileoption(outcdf,"DefineMode",False) - - outcdf->$vecnames(n)$ = (/rgtt/) - delete(rgtt) - end do - delete([/uvel,vvel,ut,vt,vecfld,vecpairs/]) - delete([/vecnames,vecgrids,vecmeth/]) - ; nv, loop over number of vector pairs - end do - -;---------------------------------------------------------------------- -; close the outcdf and continue through filelist -;---------------------------------------------------------------------- - - delete(outcdf) - - ; ii, loop over files - ;end do - 
;jj, loop over destination grids - delete([/lat,lon,nlon,nlat/]) - delete([/rgmask2d/]) - end do - print("One complete ice file in " + (get_cpu_time() - begTime) + " seconds") -exit -end diff --git a/ush/interp_atmos_master.sh b/ush/interp_atmos_master.sh index 0abc6ad185..4c4ee4b03c 100755 --- a/ush/interp_atmos_master.sh +++ b/ush/interp_atmos_master.sh @@ -4,7 +4,7 @@ # Generate 0.25 / 0.5 / 1 degree interpolated grib2 files for each input grib2 file # trim's RH and tweaks sea-ice cover -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" input_file=${1:-"pgb2file_in"} # Input pressure grib2 file output_file_prefix=${2:-"pgb2file_out"} # Prefix for output grib2 file; the prefix is appended by resolution e.g. _0p25 @@ -29,7 +29,7 @@ grid0p50="latlon 0:720:0.5 90:361:-0.5" grid1p00="latlon 0:360:1.0 90:181:-1.0" # "Import" functions used in this script -source "${HOMEgfs}/ush/product_functions.sh" +source "${USHgfs}/product_functions.sh" # Transform the input ${grid_string} into an array for processing IFS=':' read -ra grids <<< "${grid_string}" diff --git a/ush/interp_atmos_sflux.sh b/ush/interp_atmos_sflux.sh index 516a2f5e4a..cdf748f666 100755 --- a/ush/interp_atmos_sflux.sh +++ b/ush/interp_atmos_sflux.sh @@ -3,7 +3,7 @@ # This script takes in a master flux file and creates interpolated flux files at various interpolated resolutions # Generate 0.25 / 0.5 / 1 degree interpolated grib2 flux files for each input sflux grib2 file -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" input_file=${1:-"sfluxfile_in"} # Input sflux grib2 file output_file_prefix=${2:-"sfluxfile_out"} # Prefix for output sflux grib2 file; the prefix is appended by resolution e.g. 
_0p25 @@ -46,4 +46,4 @@ ${WGRIB2} "${input_file}" ${defaults} \ ${output_grids} export err=$?; err_chk -exit 0 \ No newline at end of file +exit 0 diff --git a/ush/link_crtm_fix.sh b/ush/link_crtm_fix.sh index 61ac3f7870..5204c3e3da 100755 --- a/ush/link_crtm_fix.sh +++ b/ush/link_crtm_fix.sh @@ -1,6 +1,6 @@ #! /usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # Get CRTM fix directory from (in this order): # 1. First argument to script, or diff --git a/ush/load_fv3gfs_modules.sh b/ush/load_fv3gfs_modules.sh index b4f23fa331..ae0e381db4 100755 --- a/ush/load_fv3gfs_modules.sh +++ b/ush/load_fv3gfs_modules.sh @@ -10,7 +10,8 @@ fi ulimit_s=$( ulimit -S -s ) # Find module command and purge: -source "${HOMEgfs}/modulefiles/module-setup.sh.inc" +source "${HOMEgfs}/ush/detect_machine.sh" +source "${HOMEgfs}/ush/module-setup.sh" # Source versions file for runtime source "${HOMEgfs}/versions/run.ver" @@ -18,36 +19,14 @@ source "${HOMEgfs}/versions/run.ver" # Load our modules: module use "${HOMEgfs}/modulefiles" -if [[ -d /lfs/f1 ]]; then - # We are on WCOSS2 (Cactus or Dogwood) - module load module_base.wcoss2 -elif [[ -d /mnt/lfs1 ]] ; then - # We are on NOAA Jet - module load module_base.jet -elif [[ -d /scratch1 ]] ; then - # We are on NOAA Hera - module load module_base.hera -elif [[ -d /work ]] ; then - # We are on MSU Orion or Hercules - if [[ -d /apps/other ]] ; then - # Hercules - module load module_base.hercules - else - # Orion - module load module_base.orion - fi -elif [[ -d /glade ]] ; then - # We are on NCAR Yellowstone - module load module_base.cheyenne -elif [[ -d /lustre && -d /ncrc ]] ; then - # We are on GAEA. 
- module load module_base.gaea -elif [[ -d /data/prod ]] ; then - # We are on SSEC S4 - module load module_base.s4 -else - echo WARNING: UNKNOWN PLATFORM -fi +case "${MACHINE_ID}" in + "wcoss2" | "hera" | "orion" | "hercules" | "gaea" | "jet" | "s4") + module load "module_base.${MACHINE_ID}" + ;; + *) + echo "WARNING: UNKNOWN PLATFORM" + ;; +esac module list diff --git a/ush/load_ufsda_modules.sh b/ush/load_ufsda_modules.sh index da8e2d8096..e8e72b8fbe 100755 --- a/ush/load_ufsda_modules.sh +++ b/ush/load_ufsda_modules.sh @@ -27,53 +27,26 @@ fi ulimit_s=$( ulimit -S -s ) # Find module command and purge: -source "${HOMEgfs}/modulefiles/module-setup.sh.inc" +source "${HOMEgfs}/ush/detect_machine.sh" +source "${HOMEgfs}/ush/module-setup.sh" # Load our modules: module use "${HOMEgfs}/sorc/gdas.cd/modulefiles" -if [[ -d /lfs/f1 ]]; then - # We are on WCOSS2 (Cactus or Dogwood) - echo WARNING: UFSDA NOT SUPPORTED ON THIS PLATFORM -elif [[ -d /lfs3 ]] ; then - # We are on NOAA Jet - echo WARNING: UFSDA NOT SUPPORTED ON THIS PLATFORM -elif [[ -d /scratch1 ]] ; then - # We are on NOAA Hera - module load "${MODS}/hera" - # set NETCDF variable based on ncdump location - NETCDF=$( which ncdump ) - export NETCDF - # prod_util stuff, find a better solution later... - module use /scratch2/NCEPDEV/nwprod/hpc-stack/libs/hpc-stack/modulefiles/compiler/intel/2022.1.2/ - module load prod_util -elif [[ -d /work ]] ; then - # We are on MSU Orion - # prod_util stuff, find a better solution later... 
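The pattern of the replacement in `load_fv3gfs_modules.sh` — a `case` on `MACHINE_ID` instead of probing filesystem paths — can be sketched standalone. Here `MACHINE_ID` is set by hand for illustration; in the workflow it is exported by `ush/detect_machine.sh`, and the real script runs `module load` rather than assigning a variable:

```shell
#!/usr/bin/env bash
# MACHINE_ID is normally set by ush/detect_machine.sh; hardcoded here.
MACHINE_ID="hera"

case "${MACHINE_ID}" in
  "wcoss2" | "hera" | "orion" | "hercules" | "gaea" | "jet" | "s4")
    # the real script does: module load "module_base.${MACHINE_ID}"
    modulefile="module_base.${MACHINE_ID}"
    ;;
  *)
    modulefile=""
    echo "WARNING: UNKNOWN PLATFORM" >&2
    ;;
esac
echo "${modulefile}"   # module_base.hera
```

Keying the dispatch on a single detected identifier avoids the brittle directory tests (`[[ -d /scratch1 ]]`, `[[ -d /work ]]`, ...) that broke whenever a site reorganized its filesystems.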
- #module use /apps/contrib/NCEP/hpc-stack/libs/hpc-stack/modulefiles/compiler/intel/2022.1.2/ - #module load prod_util - export UTILROOT=/work2/noaa/da/python/opt/intel-2022.1.2/prod_util/1.2.2 - export MDATE=/work2/noaa/da/python/opt/intel-2022.1.2/prod_util/1.2.2/bin/mdate - export NDATE=/work2/noaa/da/python/opt/intel-2022.1.2/prod_util/1.2.2/bin/ndate - export NHOUR=/work2/noaa/da/python/opt/intel-2022.1.2/prod_util/1.2.2/bin/nhour - export FSYNC=/work2/noaa/da/python/opt/intel-2022.1.2/prod_util/1.2.2/bin/fsync_file - module load "${MODS}/orion" - # set NETCDF variable based on ncdump location - ncdump=$( which ncdump ) - NETCDF=$( echo "${ncdump}" | cut -d " " -f 3 ) - export NETCDF -elif [[ -d /glade ]] ; then - # We are on NCAR Yellowstone - echo WARNING: UFSDA NOT SUPPORTED ON THIS PLATFORM -elif [[ -d /lustre && -d /ncrc ]] ; then - # We are on GAEA. - echo WARNING: UFSDA NOT SUPPORTED ON THIS PLATFORM -elif [[ -d /data/prod ]] ; then - # We are on SSEC S4 - echo WARNING: UFSDA NOT SUPPORTED ON THIS PLATFORM -else - echo WARNING: UNKNOWN PLATFORM -fi +case "${MACHINE_ID}" in + ("hera" | "orion" | "hercules") + module load "${MODS}/${MACHINE_ID}" + ncdump=$( command -v ncdump ) + NETCDF=$( echo "${ncdump}" | cut -d " " -f 3 ) + export NETCDF + ;; + ("wcoss2" | "acorn" | "jet" | "gaea" | "s4") + echo WARNING: UFSDA NOT SUPPORTED ON THIS PLATFORM + ;; + *) + echo "WARNING: UNKNOWN PLATFORM" + ;; +esac module list pip list diff --git a/ush/minmon_xtrct_costs.pl b/ush/minmon_xtrct_costs.pl index 502032da80..c56ac3bdad 100755 --- a/ush/minmon_xtrct_costs.pl +++ b/ush/minmon_xtrct_costs.pl @@ -22,8 +22,8 @@ # #--------------------------- -if ($#ARGV != 4 ) { - print "usage: minmon_xtrct_costs.pl SUFFIX PDY cyc infile jlogfile\n"; +if ($#ARGV != 3 ) { + print "usage: minmon_xtrct_costs.pl SUFFIX PDY cyc infile\n"; exit; } my $suffix = $ARGV[0]; @@ -31,7 +31,6 @@ my $pdy = $ARGV[1]; my $cyc = $ARGV[2]; my $infile = $ARGV[3]; -my $jlogfile = $ARGV[4]; my 
$use_costterms = 0; my $no_data = 0.00; diff --git a/ush/minmon_xtrct_gnorms.pl b/ush/minmon_xtrct_gnorms.pl index 0125c58ac8..ac83c08cd3 100755 --- a/ush/minmon_xtrct_gnorms.pl +++ b/ush/minmon_xtrct_gnorms.pl @@ -185,8 +185,8 @@ sub updateGnormData { # #--------------------------------------------------------------------------- -if ($#ARGV != 4 ) { - print "usage: minmon_xtrct_gnorms.pl SUFFIX pdy cyc infile jlogfile\n"; +if ($#ARGV != 3 ) { + print "usage: minmon_xtrct_gnorms.pl SUFFIX pdy cyc infile \n"; exit; } @@ -195,7 +195,6 @@ sub updateGnormData { my $pdy = $ARGV[1]; my $cyc = $ARGV[2]; my $infile = $ARGV[3]; -my $jlogfile = $ARGV[4]; my $scr = "minmon_xtrct_gnorms.pl"; diff --git a/ush/minmon_xtrct_reduct.pl b/ush/minmon_xtrct_reduct.pl index 1b8186b6ad..cc5da86af8 100755 --- a/ush/minmon_xtrct_reduct.pl +++ b/ush/minmon_xtrct_reduct.pl @@ -9,20 +9,18 @@ # reduction.ieee_d files ready for GrADS use. #--------------------------------------------------------------------------- -if ($#ARGV != 4 ) { - print "usage: minmon_xtrct_reduct.pl SUFFIX pdy cyc infile jlogfile\n"; +if ($#ARGV != 3 ) { + print "usage: minmon_xtrct_reduct.pl SUFFIX pdy cyc infile\n"; print " suffix is data source identifier\n"; print " pdy is YYYYMMDD of the cycle to be processed\n"; print " cyc is HH of the cycle to be processed\n"; print " infile is the data file containing the reduction stats\n"; - print " jlogfile is the job log file\n"; exit; } my $suffix = $ARGV[0]; my $pdy = $ARGV[1]; my $cyc = $ARGV[2]; my $infile = $ARGV[3]; -my $jlogfile = $ARGV[4]; my $scr = "minmon_xtrct_reduct.pl"; print "$scr has started\n"; diff --git a/ush/module-setup.sh b/ush/module-setup.sh index fd656966bf..b66e3622d0 100755 --- a/ush/module-setup.sh +++ b/ush/module-setup.sh @@ -1,6 +1,8 @@ #!/bin/bash set -u +source "${HOMEgfs}/ush/detect_machine.sh" + if [[ ${MACHINE_ID} = jet* ]] ; then # We are on NOAA Jet if ( ! 
eval module help > /dev/null 2>&1 ) ; then @@ -34,10 +36,10 @@ elif [[ ${MACHINE_ID} = orion* ]] ; then if ( ! eval module help > /dev/null 2>&1 ) ; then source /apps/lmod/lmod/init/bash fi - export LMOD_SYSTEM_DEFAULT_MODULES=contrib - set +u - module reset - set -u + #export LMOD_SYSTEM_DEFAULT_MODULES=git/2.28.0 # contrib has a lot of stuff we shouldn't put in MODULEPATH + #set +u + module purge # reset causes issues on Orion sometimes. + #set -u elif [[ ${MACHINE_ID} = s4* ]] ; then # We are on SSEC Wisconsin S4 @@ -123,7 +125,7 @@ elif [[ ${MACHINE_ID} = "noaacloud" ]]; then export SPACK_ROOT=/contrib/global-workflow/spack-stack/spack export PATH=${PATH}:${SPACK_ROOT}/bin . "${SPACK_ROOT}"/share/spack/setup-env.sh - + else echo WARNING: UNKNOWN PLATFORM 1>&2 fi diff --git a/ush/oceanice_nc2grib2.sh b/ush/oceanice_nc2grib2.sh new file mode 100755 index 0000000000..2afd0e07f2 --- /dev/null +++ b/ush/oceanice_nc2grib2.sh @@ -0,0 +1,319 @@ +#!/bin/bash + +# This script contains functions to convert ocean/ice rectilinear netCDF files to grib2 format +# This script uses the wgrib2 utility to convert the netCDF files to grib2 format and then indexes it + +source "${USHgfs}/preamble.sh" + +################################################################################ +function _ice_nc2grib2 { +# This function converts the ice rectilinear netCDF files to grib2 format + + # Set the inputs + local grid=${1} # 0p25, 0p50, 1p00, 5p00 + local latlon_dims=${2} # 0:721:0:1440, 0:361:0:720, 0:181:0:360, 0:36:0:72 + local current_cycle=${3} # YYYYMMDDHH + local aperiod=${4} # 0-6 + local infile=${5} # ice.0p25.nc + local outfile=${6} # ice.0p25.grib2 + local template=${7} # template.global.0p25.gb2 + + ${WGRIB2} "${template}" \ + -import_netcdf "${infile}" "hi_h" "0:1:${latlon_dims}" \ + -set_var ICETK -set center 7 \ + -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ + -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" \ + -import_netcdf 
"${infile}" "aice_h" "0:1:${latlon_dims}" \ + -set_var ICEC -set center 7 \ + -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ + -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" \ + -import_netcdf "${infile}" "Tsfc_h" "0:1:${latlon_dims}" \ + -set_var ICETMP -set center 7 -rpn "273.15:+" \ + -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ + -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" \ + -import_netcdf "${infile}" "uvel_h" "0:1:${latlon_dims}" \ + -set_var UICE -set center 7 \ + -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ + -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" \ + -import_netcdf "${infile}" "vvel_h" "0:1:${latlon_dims}" \ + -set_var VICE -set center 7 \ + -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ + -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" + +# Additional variables needed for GFSv17/GEFSv13 operational forecast +# files, but GRIB2 parameters not available in NCEP (-set center 7) +# tables in wgrib2 v2.0.8: + +# -import_netcdf "${infile}" "hs_h" "0:1:${latlon_dims}" \ +# -set_var ??? -set center 7 \ +# -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ +# -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" \ +# -import_netcdf "${infile}" "frzmlt_h" "0:1:${latlon_dims}" \ +# -set_var ??? -set center 7 \ +# -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ +# -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" \ +# -import_netcdf "${infile}" "albsni_h" "0:1:${latlon_dims}" \ +# -set_var ALBICE -set center 7 -rpn "100.0:/" \ +# -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ +# -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" \ +# -import_netcdf "${infile}" "mlt_onset_h" "0:1:${latlon_dims}" \ +# -set_var ??? 
-set center 7 \ +# -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ +# -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" \ +# -import_netcdf "${infile}" "frz_onset_h" "0:1:${latlon_dims}" \ +# -set_var ??? -set center 7 \ +# -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ +# -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" + + rc=$? + # Check if the conversion was successful + if (( rc != 0 )); then + echo "FATAL ERROR: Failed to convert the ice rectilinear netCDF file to grib2 format" + fi + return "${rc}" + +} + +################################################################################ +function _ocean2D_nc2grib2 { +# This function converts the ocean 2D rectilinear netCDF files to grib2 format + + # Set the inputs + local grid=${1} # 0p25, 0p50, 1p00, 5p00 + local latlon_dims=${2} # 0:721:0:1440, 0:361:0:720, 0:181:0:360, 0:36:0:72 + local current_cycle=${3} # YYYYMMDDHH + local aperiod=${4} # 0-6 + local infile=${5} # ocean.0p25.nc + local outfile=${6} # ocean_2D.0p25.grib2 + local template=${7} # template.global.0p25.gb2 + + ${WGRIB2} "${template}" \ + -import_netcdf "${infile}" "SSH" "0:1:${latlon_dims}" \ + -set_var SSHG -set center 7 \ + -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ + -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" \ + -import_netcdf "${infile}" "SST" "0:1:${latlon_dims}" \ + -set_var WTMP -set center 7 -rpn "273.15:+" \ + -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ + -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" \ + -import_netcdf "${infile}" "SSS" "0:1:${latlon_dims}" \ + -set_var SALIN -set center 7 \ + -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ + -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" \ + -import_netcdf "${infile}" "speed" "0:1:${latlon_dims}" \ + -set_var SPC -set center 7 \ + -set_date "${current_cycle}" -set_ftime "${aperiod} 
hour ave fcst" \ + -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" \ + -import_netcdf "${infile}" "SSU" "0:1:${latlon_dims}" \ + -set_var UOGRD -set center 7 \ + -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ + -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" \ + -import_netcdf "${infile}" "SSV" "0:1:${latlon_dims}" \ + -set_var VOGRD -set center 7 \ + -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ + -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" \ + -import_netcdf "${infile}" "latent" "0:1:${latlon_dims}" \ + -set_var LHTFL -set center 7 \ + -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ + -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" \ + -import_netcdf "${infile}" "sensible" "0:1:${latlon_dims}" \ + -set_var SHTFL -set center 7 \ + -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ + -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" \ + -import_netcdf "${infile}" "SW" "0:1:${latlon_dims}" \ + -set_var DSWRF -set center 7 \ + -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ + -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" \ + -import_netcdf "${infile}" "LW" "0:1:${latlon_dims}" \ + -set_var DLWRF -set center 7 \ + -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ + -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" \ + -import_netcdf "${infile}" "LwLatSens" "0:1:${latlon_dims}" \ + -set_var THFLX -set center 7 \ + -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ + -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" \ + -import_netcdf "${infile}" "MLD_003" "0:1:${latlon_dims}" \ + -set_var WDEPTH -set center 7 -set_lev "mixed layer depth" \ + -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ + -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" + +# Additional variables 
needed for GFSv17/GEFSv13 operational forecast +# files, but GRIB2 parameters not available in NCEP (-set center 7) +# tables in wgrib2 v2.0.8: +# +# -import_netcdf "${infile}" "Heat_PmE" "0:1:${latlon_dims}" \ +# -set_var DWHFLUX -set center 7 \ +# -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ +# -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" \ +# -import_netcdf "${infile}" "taux" "0:1:${latlon_dims}" \ +# -set_var XCOMPSS -set center 7 \ +# -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ +# -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" \ +# -import_netcdf "${infile}" "tauy" "0:1:${latlon_dims}" \ +# -set_var YCOMPSS -set center 7 \ +# -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ +# -set_scaling same same -set_grib_type c1 -grib_out "${outfile}" + + rc=$? + # Check if the conversion was successful + if (( rc != 0 )); then + echo "FATAL ERROR: Failed to convert the ocean rectilinear netCDF file to grib2 format" + fi + return "${rc}" + +} + +################################################################################ +function _ocean3D_nc2grib2 { +# This function converts the ocean 3D rectilinear netCDF files to grib2 format + + # Set the inputs + local grid=${1} # 0p25, 0p50, 1p00, 5p00 + local latlon_dims=${2} # 0:721:0:1440, 0:361:0:720, 0:181:0:360, 0:36:0:72 + local levels=${3} # 5:15:25:35:45:55:65:75:85:95:105:115:125 + local current_cycle=${4} # YYYYMMDDHH + local aperiod=${5} # 0-6 + local infile=${6} # ocean.0p25.nc + local outfile=${7} # ocean_3D.0p25.grib2 + local template=${8} # template.global.0p25.gb2 + + IFS=':' read -ra depths <<< "${levels}" + + zl=0 + for depth in "${depths[@]}"; do + + [[ -f "tmp.gb2" ]] && rm -f "tmp.gb2" + + ${WGRIB2} "${template}" \ + -import_netcdf "${infile}" "temp" "0:1:${zl}:1:${latlon_dims}" \ + -set_var WTMP -set center 7 -rpn "273.15:+" \ + -set_lev "${depth} m below water surface" \ + -set_date "${current_cycle}" 
-set_ftime "${aperiod} hour ave fcst" \ + -set_scaling same same -set_grib_type c1 -grib_out tmp.gb2 \ + -import_netcdf "${infile}" "so" "0:1:${zl}:1:${latlon_dims}" \ + -set_var SALIN -set center 7 \ + -set_lev "${depth} m below water surface" \ + -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ + -set_scaling same same -set_grib_type c1 -grib_out tmp.gb2 \ + -import_netcdf "${infile}" "uo" "0:1:${zl}:1:${latlon_dims}" \ + -set_var UOGRD -set center 7 \ + -set_lev "${depth} m below water surface" \ + -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ + -set_scaling same same -set_grib_type c1 -grib_out tmp.gb2 \ + -import_netcdf "${infile}" "vo" "0:1:${zl}:1:${latlon_dims}" \ + -set_var VOGRD -set center 7 \ + -set_lev "${depth} m below water surface" \ + -set_date "${current_cycle}" -set_ftime "${aperiod} hour ave fcst" \ + -set_scaling same same -set_grib_type c1 -grib_out tmp.gb2 + + rc=$? + # Check if the conversion was successful + if (( rc != 0 )); then + echo "FATAL ERROR: Failed to convert the ocean rectilinear netCDF file to grib2 format at depth ${depth}m, ABORT!" + return "${rc}" + fi + + cat tmp.gb2 >> "${outfile}" + rm -f tmp.gb2 + ((zl = zl + 1)) + + done + + # Notes: + # WATPTEMP (water potential temperature (theta)) may be a better + # GRIB2 parameter than WTMP (water temperature) if MOM6 outputs + # potential temperature. WATPTEMP is not available in NCEP + # (-set center 7) tables in wgrib2 v2.0.8. 
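The depth iteration in `_ocean3D_nc2grib2` can be shown without wgrib2: the colon-delimited level list is split into an array, and each depth is paired with its zero-based z index (`zl`), which the real function embeds in the `-import_netcdf` hyperslab spec `"0:1:${zl}:1:${latlon_dims}"`. The three-level list here is a hypothetical subset of the 12-level default:

```shell
#!/usr/bin/env bash
levels="5:15:25"   # hypothetical subset of "5:15:25:...:125"

# IFS is scoped to the read command only, so word splitting elsewhere
# is unaffected
IFS=':' read -ra depths <<< "${levels}"

zl=0
slices=()
for depth in "${depths[@]}"; do
  # the real script builds "0:1:${zl}:1:${latlon_dims}" for wgrib2 here
  slices+=("z=${zl},depth=${depth}m")
  ((zl = zl + 1))
done
echo "${slices[*]}"   # z=0,depth=5m z=1,depth=15m z=2,depth=25m
```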
+ + return "${rc}" + +} + +################################################################################ +# Input arguments +component=${1:?"Need a valid component; options: ice|ocean"} +grid=${2:-"0p25"} # Default to 0.25-degree grid +current_cycle=${3:-"2013100100"} # Default to 2013100100 +avg_period=${4:-"0-6"} # Default to 6-hourly average +ocean_levels=${5:-"5:15:25:35:45:55:65:75:85:95:105:115:125"} # Default to 13 levels + +case "${grid}" in + "0p25") + latlon_dims="0:721:0:1440" + ;; + "0p50") + latlon_dims="0:361:0:720" + ;; + "1p00") + latlon_dims="0:181:0:360" + ;; + "5p00") + latlon_dims="0:36:0:72" + ;; + *) + echo "FATAL ERROR: Unsupported grid '${grid}', ABORT!" + exit 1 + ;; +esac + +input_file="${component}.${grid}.nc" +template="template.global.${grid}.gb2" + +# Check if the template file exists +if [[ ! -f "${template}" ]]; then + echo "FATAL ERROR: '${template}' does not exist, ABORT!" + exit 127 +fi + +# Check if the input file exists +if [[ ! -f "${input_file}" ]]; then + echo "FATAL ERROR: '${input_file}' does not exist, ABORT!" + exit 127 +fi + +case "${component}" in + "ice") + rm -f "${component}.${grid}.grib2" || true + _ice_nc2grib2 "${grid}" "${latlon_dims}" "${current_cycle}" "${avg_period}" "${input_file}" "${component}.${grid}.grib2" "${template}" + rc=$? + if (( rc != 0 )); then + echo "FATAL ERROR: Failed to convert the ice rectilinear netCDF file to grib2 format" + exit "${rc}" + fi + ;; + "ocean") + rm -f "${component}_2D.${grid}.grib2" || true + _ocean2D_nc2grib2 "${grid}" "${latlon_dims}" "${current_cycle}" "${avg_period}" "${input_file}" "${component}_2D.${grid}.grib2" "${template}" + rc=$?
+ if (( rc != 0 )); then + echo "FATAL ERROR: Failed to convert the ocean 2D rectilinear netCDF file to grib2 format" + exit "${rc}" + fi + rm -f "${component}_3D.${grid}.grib2" || true + _ocean3D_nc2grib2 "${grid}" "${latlon_dims}" "${ocean_levels}" "${current_cycle}" "${avg_period}" "${input_file}" "${component}_3D.${grid}.grib2" "${template}" + rc=$? + if (( rc != 0 )); then + echo "FATAL ERROR: Failed to convert the ocean 3D rectilinear netCDF file to grib2 format" + exit "${rc}" + fi + # Combine the 2D and 3D grib2 files into a single file + rm -f "${component}.${grid}.grib2" || true + cat "${component}_2D.${grid}.grib2" "${component}_3D.${grid}.grib2" > "${component}.${grid}.grib2" + + ;; + *) + echo "FATAL ERROR: Unknown component: '${component}'. ABORT!" + exit 3 + ;; +esac + +# Index the output grib2 file +${WGRIB2} -s "${component}.${grid}.grib2" > "${component}.${grid}.grib2.idx" +rc=$? +# Check if the indexing was successful +if (( rc != 0 )); then + echo "FATAL ERROR: Failed to index the file '${component}.${grid}.grib2'" + exit "${rc}" +fi + +exit 0 diff --git a/ush/ocnpost.ncl b/ush/ocnpost.ncl deleted file mode 100755 index 27e60b0edf..0000000000 --- a/ush/ocnpost.ncl +++ /dev/null @@ -1,588 +0,0 @@ -;------------------------------------------------------------------ -; Denise.Worthen@noaa.gov (Feb 2019) -; -; This script will remap MOM6 ocean output on the tripole grid to -; a set of rectilinear grids using pre-computed ESMF weights to remap -; the listed fields to the destination grid and write the results -; to a new netCDF file -; -; Prior to running this script, files containing the conservative -; and bilinear regridding weights must be generated. These weights -; are created using the generate_iceocnpost_weights.ncl script. 
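For context on the pre-computed weight files mentioned above: applying them, which is what the script's `ESMF_regrid_with_weights` calls do for each field, reduces to a sparse matrix-vector product over (source index, destination index, weight) triplets. A hedged sketch under that assumption, with invented 0-based triplets standing in for the index and weight arrays of a real weight file:

```python
# Applying precomputed regrid weights as a sparse matrix-vector product:
# dst[row] += s * src[col] for each (col, row, s) triplet.
# The triplets below are invented for illustration; real ones come from
# an ESMF weight-generation file.
def regrid_with_weights(src, triplets, n_dst):
    dst = [0.0] * n_dst
    for col, row, s in triplets:
        dst[row] += s * src[col]
    return dst

# Each destination point here is a 50/50 blend of two source points
triplets = [(0, 0, 0.5), (1, 0, 0.5), (1, 1, 0.5), (2, 1, 0.5)]
print(regrid_with_weights([1.0, 3.0, 5.0], triplets, 2))  # [2.0, 4.0]
```

Bilinear and conservative weight files differ only in how the triplets are generated; the application step is the same product in both cases.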
-; -; Note: the descriptive text below assumes fortran type indexing -; where the variables are indexed as (i,j) and indices start at 1 -; NCL indices are (j,i) and start at 0 -; -; The post involves these steps -; -; a) unstaggering velocity points -; MOM6 is on an Arakawa C grid. MOM6 refers to these -; locations as "Ct" for the centers and "Cu", "Cv" -; "Bu" for the left-right, north-south and corner -; points, respectively. -; -; The indexing scheme in MOM6 is as follows: -; -; Cv@i,j -; ----X------X Bu@i,j -; | -; | -; Ct@i,j | -; X X Cu@i,j -; | -; | -; | -; -; CICE5 is on an Arakawa B grid. CICE5 refers to these -; locations as TLAT,TLON for the centers and ULAT,ULON -; for the corners -; -; In UFS, the CICE5 grid has been created using the MOM6 -; supergrid file. Therefore, all grid points are consistent -; between the two models. -; -; In the following, MOM6's nomenclature will be followed, -; so that CICE5's U-grid will be referred to as "Bu". -; -; b) rotation of tripole vectors to East-West -; MOM6 and CICE6 both output velocties on their native -; velocity points. For MOM6, that is u-velocities on the -; Cu grid and v-velocites on the Cv grid. For CICE5, it is -; both u and v-velocities on the Bu grid. -; -; The rotation angle for both models are defined at center -; grid points; therefore the velocities need to be first -; unstaggered before rotation. MOM6 and CICE5 also define -; opposite directions for the rotations. Finally, while the -; grid points are identical between the two models, CICE5 -; calculates the rotation angle at center grid points by -; averaging the four surrounding B grid points. MOM6 derives -; the rotation angle at the center directly from the latitude -; and longitude of the center grid points. The angles are therefor -; not identical between the two grids. -; -; c) conservative regridding of some fields -; Fields such as ice concentration or fluxes which inherently -; area area-weighted require conservative regridding. 
Most other -; variables are state variables and can be regridded using -; bilinear weighting. -; -; An efficient way to accomplish the unstaggering of velocities -; is to use the bilinear interpolation weights between grid -; points of the Arakawa C grid and the center grid points (for example -; Cu->Ct). These weights are generated by the weight generation script -; -; Remapping from the tripole to rectilinear uses either the bilinear -; or conservative weights from the weight generation script. Bilinear weights -; generated for the first vertical level can be used on other levels -; (where the masking changes) by utilizing the correct masking procedure. -; Set output_masks to true to examine the interpolation masks. -; -; Intermediate file output can easily be generated for debugging by -; follwing the example in the output_masks logical -; -; Bin.Li@noaa.gov (May 2019) -; The scripts is revised for use in the coupled workflow. -; - load "$NCARG_ROOT/lib/ncarg/nclscripts/esmf/ESMF_regridding.ncl" - -;---------------------------------------------------------------------- -begin -; - - ; warnings (generated by int2p_n_Wrap) can be supressed by - ; the following (comment out to get the warnings) - err = NhlGetErrorObjectId() - setvalues err -; "errLevel" : "Fatal" ; only report Fatal errors - "errLevel" : "Verbose" - end setvalues - - output_masks = False - - ; specify a location to use - ; nemsrc = "/scratch4/NCEPDEV/ocean/save/Denise.Worthen/NEMS_INPUT0.1/ocnicepost/" - ; interpolation methods - methods = (/"bilinear" ,"conserve"/) - ; ocean model output location - ;dirsrc = "/scratch3/NCEPDEV/stmp2/Denise.Worthen/BM1_ocn/" - - ; destination grid sizes and name - dsttype = (/"rect."/) - ;dstgrds = (/"1p0", "0p5", "0p25"/) - ;dstgrds = (/"0p5"/) - dstgrds = (/"0p25"/) - - ; variables to be regridded with the native tripole stagger location - ; and dimensionality - ; first BM contained only field "mld", which was actually ePBL - ; the remaining BMs contain ePBL, 
MLD_003 and MLD_0125 - ; the following NCO command will be issued at the end - ; to rename the variable mld to ePBL if the variable mld is found - ; ncocmd = "ncrename -O -v mld,ePBL " - ncocmd = "ncrename -O -v MLD_003,mld" - - varlist = (/ (/ "SSH", "Ct", "bilinear", "2"/) \ - ,(/ "SST", "Ct", "bilinear", "2"/) \ - ,(/ "SSS", "Ct", "bilinear", "2"/) \ - ,(/ "speed", "Ct", "bilinear", "2"/) \ - ,(/ "temp", "Ct", "bilinear", "3"/) \ - ,(/ "so", "Ct", "bilinear", "3"/) \ - ,(/ "latent", "Ct", "conserve", "2"/) \ - ,(/ "sensible", "Ct", "conserve", "2"/) \ - ,(/ "SW", "Ct", "conserve", "2"/) \ - ,(/ "LW", "Ct", "conserve", "2"/) \ - ,(/ "evap", "Ct", "conserve", "2"/) \ - ,(/ "lprec", "Ct", "conserve", "2"/) \ - ,(/ "fprec", "Ct", "conserve", "2"/) \ - ,(/"LwLatSens", "Ct", "conserve", "2"/) \ - ,(/ "Heat_PmE", "Ct", "conserve", "2"/) \ -; ,(/ "mld", "Ct", "bilinear", "2"/) \ - ,(/ "ePBL", "Ct", "bilinear", "2"/) \ - ,(/ "MLD_003", "Ct", "bilinear", "2"/) \ - ,(/ "MLD_0125", "Ct", "bilinear", "2"/) \ - /) - dims = dimsizes(varlist) - nvars = dims(0) - delete(dims) - ;print(varlist) - - ; vectors to be regridded with the native tripole stagger location - ; and dimensionality - ; note: vectors are always unstaggered using bilinear weights, but can - ; be remapped using conservative - nvpairs = 3 - veclist = new( (/nvpairs,4,2/),"string") - veclist = (/ (/ (/ "SSU", "SSV"/), (/"Cu", "Cv"/), (/"bilinear", "bilinear"/), (/"2", "2"/) /) \ - , (/ (/ "uo", "vo"/), (/"Cu", "Cv"/), (/"bilinear", "bilinear"/), (/"3", "3"/) /) \ - , (/ (/ "taux", "tauy"/), (/"Cu", "Cv"/), (/"conserve", "conserve"/), (/"2", "2"/) /) \ - /) - ;print(veclist) - - begTime = get_cpu_time() -;---------------------------------------------------------------------- -; make a list of the directories and files from the run -;---------------------------------------------------------------------- - -; idate = "20120101" - -; ocnfilelist = systemfunc("ls "+dirsrc+"gfs."+idate+"/00/"+"ocn*.nc") -; ocnf = 
addfiles(ocnfilelist,"r") -; nfiles = dimsizes(ocnfilelist) -; - - ; get the rotation angles and vertical grid from the first file - ; two different name were used for the angles, either sinrot,cosrot - ; or sin_rot,cos_rot - if(isfilevar(ocnf[0],"sin_rot"))then - sinrot = ocnf[0]->sin_rot - else - sinrot = ocnf[0]->sinrot - end if - if(isfilevar(ocnf[0],"cos_rot"))then - cosrot = ocnf[0]->cos_rot - else - cosrot = ocnf[0]->cosrot - end if - z_l = ocnf[0]->z_l - z_i = ocnf[0]->z_i - nlevs = dimsizes(z_l) - - ; get a 2 and 3 dimensional fields for creating the interpolation masks - ; the mask2d,mask3d contain 1's on land and 0's at valid points. - mask2d = where(ismissing(ocnf[0]->SST), 1.0, 0.0) - mask3d = where(ismissing(ocnf[0]->temp), 1.0, 0.0) - ;printVarSummary(mask2d) - ;printVarSummary(mask3d) - - ; create conformed rotation arrays to make vector rotations cleaner - sinrot2d=conform_dims(dimsizes(mask2d),sinrot,(/1,2/)) - cosrot2d=conform_dims(dimsizes(mask2d),cosrot,(/1,2/)) - - sinrot3d=conform_dims(dimsizes(mask3d),sinrot,(/2,3/)) - cosrot3d=conform_dims(dimsizes(mask3d),cosrot,(/2,3/)) - - ; check for variables in file. 
this is only required because - ; of the missing/misnamed MLD variables in the first BM - ; only the varlist is checked, since it is assumed there are - ; no other variables missing after the first benchmark - valid = new((/nvars/),"logical") - valid = False - do nv = 0,nvars-1 - varname = varlist(nv,0) - if(isfilevar(ocnf[0],varname))then - valid(nv) = True - end if - print(varlist(nv,0)+" "+valid(nv)) - end do - -;---------------------------------------------------------------------- -; loop over the output resolutions -;---------------------------------------------------------------------- - - jj = 1 - ii = 0 - - do jj = 0,dimsizes(dstgrds)-1 - ;outres = "_"+dstgrds(jj)+"x"+dstgrds(jj) - outres = dstgrds(jj)+"x"+dstgrds(jj) - outgrid = dstgrds(jj) - - ; regrid a field to obtain the output xy dimensions - wgtsfile = nemsrc+"/"+"tripole.mx025.Ct.to."+dsttype+dstgrds(jj)+".bilinear.nc" - tt = ESMF_regrid_with_weights(sinrot,wgtsfile,False) - tt!0 = "lat" - tt!1 = "lon" - lat = tt&lat - lon = tt&lon - dims = dimsizes(tt) - nlat = dims(0) - nlon = dims(1) - - print("fields will be remapped to destination grid size "\ - +nlon+" "+nlat) - - delete(tt) - delete(dims) - - ; regrid the masks to obtain the interpolation masks. - ; the mask2d,mask3d contain 1's on land and 0's at valid points. - ; when remapped, any mask value > 0 identifies land values that - ; have crept into the field. remapped model fields are then - ; masked with this interpolation mask - - wgtsfile = nemsrc+"/"+"tripole.mx025.Ct.to."+dsttype+dstgrds(jj)+".bilinear.nc" - rgmask2d = ESMF_regrid_with_weights(mask2d, wgtsfile,False) - rgmask3d = ESMF_regrid_with_weights(mask3d, wgtsfile,False) - - if(output_masks)then - testfile = "masks_"+dstgrds(jj)+".nc" - system("/bin/rm -f "+testfile) - ; create - testcdf = addfile(testfile,"c") - testcdf->rgmask2d = rgmask2d - testcdf->rgmask3d = rgmask3d - ; close - delete(testcdf) - end if - - ; create the interpolation mask - rgmask2d = where(rgmask2d .gt. 
0.0, rgmask2d@_FillValue, 1.0) - rgmask3d = where(rgmask3d .gt. 0.0, rgmask3d@_FillValue, 1.0) - - ; conformed depth array - depth = conform_dims(dimsizes(mask3d), z_l, (/1/)) - ;print(dimsizes(depth)) - -;---------------------------------------------------------------------- -; loop over each file in the ocnfilelist -;---------------------------------------------------------------------- -; - - ; retrieve the time stamp - time = ocnf[0]->time - delete(time@bounds) - -;---------------------------------------------------------------------- -; set up the output netcdf file -;---------------------------------------------------------------------- -; system("/bin/rm -f " + outfile) ; remove if exists -; outcdf = addfile (outfile, "c") ; open output file -; specify output file information and open file for output - FILENAME_REGRID = DATA_TMP+"/ocnr"+VDATE+"."+ENSMEM+"."+IDATE+"_"+outres+"_MOM6.nc" - if (isfilepresent(FILENAME_REGRID)) then - system("rm -f "+FILENAME_REGRID) - end if - outcdf = addfile(FILENAME_REGRID,"c") - outfile=FILENAME_REGRID - - ; explicitly declare file definition mode. Improve efficiency. 
- setfileoption(outcdf,"DefineMode",True) - - ; create global attributes of the file - fAtt = True ; assign file attributes - fAtt@creation_date = systemfunc ("date") - fAtt@source_file = infile - fileattdef( outcdf, fAtt ) ; copy file attributes - - ; predefine the coordinate variables and their dimensionality - ; dimNames = (/"time", "z_l", "z_i", "z_T", "lat", "lon"/) - dimNames = (/"time", "z_l", "z_i", "lat", "lon"/) - ;dimSizes = (/ -1 , nlevs, nlevs+1, nTd, nlat, nlon/) - dimSizes = (/ -1 , nlevs, nlevs+1, nlat, nlon/) - ;dimUnlim = (/ True , False, False, False, False, False/) - dimUnlim = (/ True , False, False, False, False/) - filedimdef(outcdf,dimNames,dimSizes,dimUnlim) - - ; predefine the the dimensionality of the variables to be written out - filevardef(outcdf, "time", typeof(time), getvardims(time)) - filevardef(outcdf, "z_l", typeof(z_l), getvardims(z_l)) - filevardef(outcdf, "z_i", typeof(z_i), getvardims(z_i)) - ;filevardef(outcdf, "z_T", typeof(z_T), getvardims(z_T)) - filevardef(outcdf, "lat", typeof(lat), getvardims(lat)) - filevardef(outcdf, "lon", typeof(lon), getvardims(lon)) - - ; Copy attributes associated with each variable to the file - filevarattdef(outcdf, "time", time) - filevarattdef(outcdf, "z_l", z_l) - filevarattdef(outcdf, "z_i", z_i) - ;filevarattdef(outcdf, "z_T", z_T) - filevarattdef(outcdf, "lat", lat) - filevarattdef(outcdf, "lon", lon) - - ; predefine variables - do nv = 0,nvars-1 - varname = varlist(nv,0) - vardims = varlist(nv,3) - if(valid(nv))then - if(vardims .eq. "2")then - odims = (/"time", "lat", "lon"/) - else - odims = (/"time", "z_l", "lat", "lon"/) - end if - ;print("creating variable "+varname+" in file") - filevardef(outcdf, varname, "float", odims) - delete(odims) - end if - end do - - do nv = 0,nvpairs-1 - do nn = 0,1 - vecname = veclist(nv,0,nn) - vecdims = veclist(nv,3,nn) - if(vecdims .eq. 
"2")then - odims = (/"time", "lat", "lon"/) - else - odims = (/"time", "z_l", "lat", "lon"/) - end if - ;print("creating variable "+vecname+" in file") - filevardef(outcdf, vecname, "float", odims) - delete(odims) - delete(vecdims) - end do - end do - - ; explicitly exit file definition mode. - setfileoption(outcdf,"DefineMode",False) - - ; write the dimensions to the file - outcdf->time = (/time/) - outcdf->z_l = (/z_l/) - outcdf->z_i = (/z_i/) -; outcdf->z_T = (/z_T/) -; - outcdf->lat = (/lat/) - outcdf->lon = (/lon/) - -;---------------------------------------------------------------------- -; loop over nvars variables -;---------------------------------------------------------------------- - - do nv = 0,nvars-1 - varname = varlist(nv,0) - vargrid = varlist(nv,1) - varmeth = varlist(nv,2) - vardims = varlist(nv,3) - - if(valid(nv))then - ;print(nv+" "+varname+" "+vargrid+" "+varmeth) - ocnvar = ocnf[ii]->$varname$ - ndims = dimsizes(dimsizes(ocnvar)) - ;print(ndims+" "+dimsizes(ocnvar)) - - if(vargrid .ne. "Ct")then - ; print error if the variable is not on the Ct grid - print("Variable is not on Ct grid") - exit - end if - - ; regrid to dsttype+dstgrd with method - ;print("remapping "+varname+" to grid "+dsttype+dstgrds(jj)) - wgtsfile = nemsrc+"/"+"tripole.mx025.Ct.to."+dsttype+dstgrds(jj)+"."+varmeth+".nc" - - rgtt = ESMF_regrid_with_weights(ocnvar,wgtsfile,False) - if(vardims .eq. 
"2")then - rgtt = where(ismissing(rgmask2d),ocnvar@_FillValue,rgtt) - rgtt=rgtt(:,::-1,:) - else - rgtt = where(ismissing(rgmask3d),ocnvar@_FillValue,rgtt) - rgtt=rgtt(:,:,::-1,:) - end if - - ; enter file definition mode to add variable attributes - setfileoption(outcdf,"DefineMode",True) - filevarattdef(outcdf, varname, rgtt) - setfileoption(outcdf,"DefineMode",False) - - outcdf->$varname$ = (/rgtt/) - - delete(ocnvar) - delete(rgtt) - - ; variable exists - end if - ; nv, loop over number of variables - end do - -;---------------------------------------------------------------------- -; -;---------------------------------------------------------------------- - - ;nv = 2 - do nv = 0,nvpairs-1 - vecnames = veclist(nv,0,:) - vecgrids = veclist(nv,1,:) - vecmeth = veclist(nv,2,:) - vecdims = veclist(nv,3,:) - ;print(nv+" "+vecnames+" "+vecgrids+" "+vecmeth) - - ; create a vector pair list - vecpairs = NewList("fifo") - n = 0 - uvel = ocnf[ii]->$vecnames(n)$ - vecfld = where(ismissing(uvel),0.0,uvel) - copy_VarAtts(uvel,vecfld) - ;print("unstagger "+vecnames(n)+" from "+vecgrids(n)+" to Ct") - wgtsfile = nemsrc+"/"+"tripole.mx025."+vecgrids(n)+".to.Ct.bilinear.nc" - ut = ESMF_regrid_with_weights(vecfld,wgtsfile,False) - delete(ut@remap) - - n = 1 - vvel = ocnf[ii]->$vecnames(n)$ - vecfld = where(ismissing(vvel),0.0,vvel) - copy_VarAtts(vvel,vecfld) - ;print("unstagger "+vecnames(n)+" from "+vecgrids(n)+" to Ct") - wgtsfile = nemsrc+"/"+"tripole.mx025."+vecgrids(n)+".to.Ct.bilinear.nc" - vt = ESMF_regrid_with_weights(vecfld,wgtsfile,False) - delete(vt@remap) - - ListAppend(vecpairs,ut) - ListAppend(vecpairs,vt) - ;print(vecpairs) - - ; rotate - ; first copy Metadata - urot = vecpairs[0] - vrot = vecpairs[1] - if(vecdims(0) .eq. 
"2")then - urot = ut*cosrot2d + vt*sinrot2d - vrot = vt*cosrot2d - ut*sinrot2d - else - urot = ut*cosrot3d + vt*sinrot3d - vrot = vt*cosrot3d - ut*sinrot3d - end if - ; change attribute to indicate these are now rotated velocities - urot@long_name=str_sub_str(urot@long_name,"X","Zonal") - vrot@long_name=str_sub_str(vrot@long_name,"Y","Meridional") - ; copy back - vecpairs[0] = urot - vecpairs[1] = vrot - delete([/urot, vrot/]) - - ; remap - do n = 0,1 - vecfld = vecpairs[n] - ; regrid to dsttype+dstgrd with method - ;print("remapping "+vecnames(n)+" to grid "+dsttype+dstgrds(jj)) - wgtsfile = nemsrc+"/"+"tripole.mx025.Ct.to."+dsttype+dstgrds(jj)+"."+vecmeth(n)+".nc" - - rgtt = ESMF_regrid_with_weights(vecfld,wgtsfile,False) - if(vecdims(n) .eq. "2")then - rgtt = where(ismissing(rgmask2d),vecfld@_FillValue,rgtt) - rgtt=rgtt(:,::-1,:) - else - rgtt = where(ismissing(rgmask3d),vecfld@_FillValue,rgtt) - rgtt=rgtt(:,:,::-1,:) - end if - - ; enter file definition mode to add variable attributes - setfileoption(outcdf,"DefineMode",True) - filevarattdef(outcdf, vecnames(n), rgtt) - setfileoption(outcdf,"DefineMode",False) - - outcdf->$vecnames(n)$ = (/rgtt/) - delete(rgtt) - end do - delete([/uvel,vvel,ut,vt,vecfld,vecpairs/]) - delete([/vecnames,vecgrids,vecmeth,vecdims/]) - ; nv, loop over number of vector pairs - end do - -;---------------------------------------------------------------------- -; close the outcdf and continue through filelist -;---------------------------------------------------------------------- - - delete(outcdf) - ; rename mld to ePBL if required - do nv = 0,nvars-1 - varname = varlist(nv,0) - ; if(varname .eq. "mld" .and. valid(nv))then - if(varname .eq. "MLD_003" .and. 
valid(nv))then - print("Renaming MLD_003 to mld") - ;print(ncocmd+" "+outfile) - system(ncocmd+" "+outfile) - end if - end do - - ; ii, loop over files -; - ;jj, loop over destination grids - delete([/lat,lon,nlon,nlat/]) - delete([/rgmask2d,rgmask3d/]) - end do - print("One complete ocn file in " + (get_cpu_time() - begTime) + " seconds") -exit -end diff --git a/ush/ozn_xtrct.sh b/ush/ozn_xtrct.sh index 57ff87be5f..0c623bf03c 100755 --- a/ush/ozn_xtrct.sh +++ b/ush/ozn_xtrct.sh @@ -1,6 +1,6 @@ #! /usr/bin/env bash -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" #------------------------------------------------------------------ # ozn_xtrct.sh @@ -132,12 +132,12 @@ else #-------------------------------------------------------------------- # Copy extraction programs to working directory # - ${NCP} "${HOMEgfs}/exec/oznmon_time.x" ./oznmon_time.x + ${NCP} "${EXECgfs}/oznmon_time.x" ./oznmon_time.x if [[ ! -e oznmon_time.x ]]; then iret=2 exit ${iret} fi - ${NCP} "${HOMEgfs}/exec/oznmon_horiz.x" ./oznmon_horiz.x + ${NCP} "${EXECgfs}/oznmon_horiz.x" ./oznmon_horiz.x if [[ ! -e oznmon_horiz.x ]]; then iret=3 exit ${iret} diff --git a/ush/parsing_model_configure_DATM.sh b/ush/parsing_model_configure_DATM.sh deleted file mode 100755 index ecd3fa6dd6..0000000000 --- a/ush/parsing_model_configure_DATM.sh +++ /dev/null @@ -1,38 +0,0 @@ -#! /usr/bin/env bash - -##### -## "parsing_model_configure_DATM.sh" -## This script writes model configure file -## for DATM model -## -## This is the child script of ex-global forecast, -## writing model configure file for DATM -## This script is a direct execution. -##### - -DATM_model_configure(){ - -rm -f model_configure -cat > model_configure <> model_configure <> "${DATA}/model_configure" +echo "Rendered model_configure" +cat "${DATA}/model_configure" -quilting: ${QUILTING} -quilting_restart: .true. 
-write_groups: ${WRITE_GROUP:-1} -write_tasks_per_group: ${WRTTASK_PER_GROUP:-24} -itasks: 1 -output_history: ${OUTPUT_HISTORY:-".true."} -history_file_on_native_grid: .false. -write_dopost: ${WRITE_DOPOST:-".false."} -write_nsflip: ${WRITE_NSFLIP:-".false."} -num_files: ${NUM_FILES:-2} -filename_base: 'atm' 'sfc' -output_grid: ${OUTPUT_GRID} -output_file: '${OUTPUT_FILETYPE_ATM}' '${OUTPUT_FILETYPE_SFC}' -zstandard_level: 0 -ichunk2d: ${ichunk2d:-0} -jchunk2d: ${jchunk2d:-0} -ichunk3d: ${ichunk3d:-0} -jchunk3d: ${jchunk3d:-0} -kchunk3d: ${kchunk3d:-0} -ideflate: ${ideflate:-1} -quantize_mode: 'quantize_bitround' -quantize_nsd: ${QUANTIZE_NSD:-0} -imo: ${LONB_IMO} -jmo: ${LATB_JMO} -output_fh: ${FV3_OUTPUT_FH} -iau_offset: ${IAU_OFFSET:-0} -EOF - -echo "$(cat model_configure)" } diff --git a/ush/parsing_namelists_CICE.sh b/ush/parsing_namelists_CICE.sh index 6ef743ebc9..ec44c7cdca 100755 --- a/ush/parsing_namelists_CICE.sh +++ b/ush/parsing_namelists_CICE.sh @@ -2,6 +2,8 @@ # parsing namelist of CICE +# Disable variable not used warnings +# shellcheck disable=SC2034 CICE_namelists(){ # "warm_start" here refers to whether CICE model is warm starting or not. @@ -37,374 +39,80 @@ if (( $(( NY_GLB % NPY )) == 0 )); then else local block_size_y=$(( (NY_GLB / NPY) + 1 )) fi -local max_blocks=-1 local sec stepsperhr npt sec=$(to_seconds "${current_cycle:8:2}0000") stepsperhr=$((3600/ICETIM)) npt=$((FHMAX*stepsperhr)) # Need this in order for dump_last to work -cat > ice_in <> "${DATA}/ice_in" +echo "Rendered ice_in:" +cat "${DATA}/ice_in" } diff --git a/ush/parsing_namelists_FV3.sh b/ush/parsing_namelists_FV3.sh index 531afaa55a..916dd6e890 100755 --- a/ush/parsing_namelists_FV3.sh +++ b/ush/parsing_namelists_FV3.sh @@ -9,13 +9,20 @@ ## This script is a direct execution. 
##### +# Disable variable not used warnings +# shellcheck disable=SC2034 FV3_namelists(){ # setup the tables -DIAG_TABLE=${DIAG_TABLE:-${HOMEgfs}/parm/ufs/fv3/diag_table} -DIAG_TABLE_APPEND=${DIAG_TABLE_APPEND:-${HOMEgfs}/parm/ufs/fv3/diag_table_aod} -DATA_TABLE=${DATA_TABLE:-${HOMEgfs}/parm/ufs/fv3/data_table} -FIELD_TABLE=${FIELD_TABLE:-${HOMEgfs}/parm/ufs/fv3/field_table} +DIAG_TABLE=${DIAG_TABLE:-${PARMgfs}/ufs/fv3/diag_table} +DIAG_TABLE_APPEND=${DIAG_TABLE_APPEND:-${PARMgfs}/ufs/fv3/diag_table_aod} +DATA_TABLE=${DATA_TABLE:-${PARMgfs}/ufs/MOM6_data_table.IN} +FIELD_TABLE=${FIELD_TABLE:-${PARMgfs}/ufs/fv3/field_table} + +# set cdmbgwd +if (( gwd_opt == 2 )) && [[ ${do_gsl_drag_ls_bl} == ".true." ]]; then + cdmbgwd=${cdmbgwd_gsl} +fi # ensure non-prognostic tracers are set dnats=${dnats:-0} @@ -33,7 +40,15 @@ if [[ -n "${AERO_DIAG_TABLE:-}" ]]; then cat "${AERO_DIAG_TABLE}" fi cat "${DIAG_TABLE_APPEND}" -} >> diag_table +} >> diag_table_template + +local template=diag_table_template +local SYEAR=${current_cycle:0:4} +local SMONTH=${current_cycle:4:2} +local SDAY=${current_cycle:6:2} +local CHOUR=${current_cycle:8:2} +source "${USHgfs}/atparse.bash" +atparse < "${template}" >> "diag_table" # copy data table @@ -94,22 +109,13 @@ cat > input.nml <> input.nml << EOF - nord_tr = ${nord_tr:-"2"} -EOF -fi - -cat >> input.nml << EOF grid_type = -1 make_nh = ${make_nh} fv_debug = ${fv_debug:-".false."} range_warn = ${range_warn:-".true."} reset_eta = .false. n_sponge = ${n_sponge:-"10"} - nudge_qv = ${nudge_qv:-".true."} + nudge_qv = ${nudge_qv:-".false."} nudge_dz = ${nudge_dz:-".false."} tau = ${tau:-10.} rf_cutoff = ${rf_cutoff:-"7.5e2"} @@ -198,41 +204,21 @@ case "${CCPP_SUITE:-}" in oz_phys_2015 = .true. 
EOF ;; - FV3_RAP_noah*) + "FV3_GSD_v0") cat >> input.nml << EOF iovr = ${iovr:-"3"} ltaerosol = ${ltaerosol:-".false."} lradar = ${lradar:-".false."} - dt_inner = ${dt_inner:-"40."} - ttendlim = ${ttendlim:-"-999"} + ttendlim = ${ttendlim:-0.005} oz_phys = ${oz_phys:-".false."} oz_phys_2015 = ${oz_phys_2015:-".true."} lsoil_lsm = ${lsoil_lsm:-"4"} do_mynnedmf = ${do_mynnedmf:-".false."} do_mynnsfclay = ${do_mynnsfclay:-".false."} icloud_bl = ${icloud_bl:-"1"} - tke_budget = ${tke_budget:-"0"} - bl_mynn_tkeadvect = ${bl_mynn_tkeadvect:=".true."} - bl_mynn_cloudpdf = ${bl_mynn_cloudpdf:="2"} - bl_mynn_mixlength = ${bl_mynn_mixlength:="1"} - bl_mynn_edmf = ${bl_mynn_edmf:="1"} - bl_mynn_edmf_mom = ${bl_mynn_edmf_mom:="1"} - bl_mynn_edmf_tke = ${bl_mynn_edmf_tke:="0"} - bl_mynn_cloudmix = ${bl_mynn_cloudmix:="1"} - bl_mynn_mixqt = ${bl_mynn_mixqt:="0"} - bl_mynn_output = ${bl_mynn_output:="0"} - bl_mynn_closure = ${bl_mynn_closure:="2.6"} - do_ugwp = ${do_ugwp:-".false."} - do_tofd = ${do_tofd:-".true."} - gwd_opt = ${gwd_opt:-"2"} - do_ugwp_v0 = ${do_ugwp_v0:-".true."} - do_ugwp_v1 = ${do_ugwp_v1:-".false."} - do_ugwp_v0_orog_only = ${do_ugwp_v0_orog_only:-".false."} - do_ugwp_v0_nst_only = ${do_ugwp_v0_nst_only:-".false."} - do_gsl_drag_ls_bl = ${do_gsl_drag_ls_bl:-".false."} - do_gsl_drag_ss = ${do_gsl_drag_ss:-".true."} - do_gsl_drag_tofd = ${do_gsl_drag_tofd:-".true."} - do_ugwp_v1_orog_only = ${do_ugwp_v1_orog_only:-".false."} + bl_mynn_edmf = ${bl_mynn_edmf:-"1"} + bl_mynn_tkeadvect=${bl_mynn_tkeadvect:-".true."} + bl_mynn_edmf_mom=${bl_mynn_edmf_mom:-"1"} min_lakeice = ${min_lakeice:-"0.15"} min_seaice = ${min_seaice:-"0.15"} use_cice_alb = ${use_cice_alb:-".false."} @@ -273,50 +259,6 @@ EOF bl_mynn_edmf_mom = ${bl_mynn_edmf_mom:-"1"} min_lakeice = ${min_lakeice:-"0.15"} min_seaice = ${min_seaice:-"0.15"} -EOF - ;; - FV3_GFS_v17_p8_*mynn) - local default_dt_inner=$(( DELTIM/2 )) - cat >> input.nml << EOF - iovr = ${iovr:-"3"} - ltaerosol = 
${ltaerosol:-".false."} - lradar = ${lradar:-".true."} - ttendlim = ${ttendlim:-"-999"} - dt_inner = ${dt_inner:-"${default_dt_inner}"} - sedi_semi = ${sedi_semi:-".true."} - decfl = ${decfl:-"10"} - oz_phys = ${oz_phys:-".false."} - oz_phys_2015 = ${oz_phys_2015:-".true."} - lsoil_lsm = ${lsoil_lsm:-"4"} - do_mynnedmf = ${do_mynnedmf:-".false."} - do_mynnsfclay = ${do_mynnsfclay:-".false."} - icloud_bl = ${icloud_bl:-"1"} - tke_budget = ${tke_budget:-"0"} - bl_mynn_tkeadvect = ${bl_mynn_tkeadvect:-".true."} - bl_mynn_cloudpdf = ${bl_mynn_cloudpdf:="2"} - bl_mynn_mixlength = ${bl_mynn_mixlength:="1"} - bl_mynn_edmf = ${bl_mynn_edmf:-"1"} - bl_mynn_edmf_mom = ${bl_mynn_edmf_mom:-"1"} - bl_mynn_edmf_tke = ${bl_mynn_edmf_tke:="0"} - bl_mynn_cloudmix = ${bl_mynn_cloudmix:="1"} - bl_mynn_mixqt = ${bl_mynn_mixqt:="0"} - bl_mynn_output = ${bl_mynn_output:="0"} - bl_mynn_closure = ${bl_mynn_closure:="2.6"} - lcnorm = ${lcnorm:-".true."} - do_ugwp = ${do_ugwp:-".false."} - do_tofd = ${do_tofd:-".false."} - gwd_opt = ${gwd_opt:-"2"} - do_ugwp_v0 = ${do_ugwp_v0:-".false."} - do_ugwp_v1 = ${do_ugwp_v1:-".true."} - do_ugwp_v0_orog_only = ${do_ugwp_v0_orog_only:-".false."} - do_ugwp_v0_nst_only = ${do_ugwp_v0_nst_only:-".false."} - do_gsl_drag_ls_bl = ${do_gsl_drag_ls_bl:-".true."} - do_gsl_drag_ss = ${do_gsl_drag_ss:-".true."} - do_gsl_drag_tofd = ${do_gsl_drag_tofd:-".true."} - do_ugwp_v1_orog_only = ${do_ugwp_v1_orog_only:-".false."} - min_lakeice = ${min_lakeice:-"0.15"} - min_seaice = ${min_seaice:-"0.15"} - use_cice_alb = ${use_cice_alb:-".false."} EOF ;; FV3_GFS_v17*) @@ -456,6 +398,14 @@ cat >> input.nml <> input.nml <> input.nml << EOF @@ -669,7 +619,7 @@ EOF skeb_tau = ${SKEB_TAU:-"-999."} skeb_lscale = ${SKEB_LSCALE:-"-999."} skebnorm = ${SKEBNORM:-"1"} - skeb_npass = ${SKEB_nPASS:-"30"} + skeb_npass = ${SKEB_NPASS:-"30"} skeb_vdof = ${SKEB_VDOF:-"5"} EOF fi @@ -692,6 +642,43 @@ EOF sppt_logit = ${SPPT_LOGIT:-".true."} sppt_sfclimit = ${SPPT_SFCLIMIT:-".true."} 
use_zmtnblck = ${use_zmtnblck:-".true."} + pbl_taper = ${pbl_taper:-"0,0,0,0.125,0.25,0.5,0.75"} +EOF + fi + + if [[ "${DO_OCN_SPPT:-NO}" == "YES" ]]; then + cat >> input.nml <> input.nml <> input.nml <> input.nml <> input.nml <> input.nml <> input.nml <> input.nml < "${DATA}/INPUT/MOM_input" -rm "${DATA}/INPUT/MOM_input_template_${OCNRES}" +# ================================================================ +# MOM_input +# --------- +# Prepare local variables for use in MOM_input.IN from UFSWM +# The ones already defined are left commented as a reminder +# == MOM_domains section == +# NX_GLB +# NY_GLB +# == MOM section == +# DT_DYNAM_MOM6 +# DT_THERM_MOM6 +# MOM6_THERMO_SPAN +# == MOM_grid_init section == +local MOM6_TOPOEDITS=${TOPOEDITS} +# MOM6_ALLOW_LANDMASK_CHANGES +# == MOM_diag_mediator section == +# MOM6_DIAG_COORD_DEF_Z_FILE +# MOM6_DIAG_MISVAL +# == MOM_diabatic_aux section == +local MOM6_CHLCLIM=${CHLCLIM} +# == MOM_energetic_PBL section == +# MOM6_USE_LI2016 +if [[ "${cplwav}" == ".true." ]] ; then + local MOM6_USE_WAVES="True" +else + local MOM6_USE_WAVES="False" +fi +# == MOM_oda_incupd section == +local ODA_TEMPINC_VAR=${ODA_TEMPINC_VAR:-"Temp"} +local ODA_SALTINC_VAR=${ODA_SALTINC_VAR:-"Salt"} +local ODA_THK_VAR=${ODA_THK_VAR:-"h"} +local ODA_INCUPD_UV="True" +local ODA_UINC_VAR=${ODA_UINC_VAR:-"u"} +local ODA_VINC_VAR=${ODA_VINC_VAR:-"v"} +# ODA_INCUPD +# ODA_INCUPD_NHOURS +# == MOM_surface_forcing section == +# MOM6_RIVER_RUNOFF +# == ocean_stochastics section == +if [[ "${DO_OCN_SPPT}" == "YES" ]]; then + local DO_OCN_SPPT="True" # TODO: This is problematic if DO_OCN_SPPT is going to be used elsewhere +else + local DO_OCN_SPPT="False" +fi +if [[ "${DO_OCN_PERT_EPBL}" == "YES" ]]; then + local PERT_EPBL="True" +else + local PERT_EPBL="False" +fi +# Ensure the template exists +local template=${MOM6_INPUT_TEMPLATE:-"${PARMgfs}/ufs/MOM_input_${OCNRES}.IN"} +if [[ ! 
-f "${template}" ]]; then + echo "FATAL ERROR: template '${template}' does not exist, ABORT!" + exit 1 +fi +rm -f "${DATA}/INPUT/MOM_input" +atparse < "${template}" >> "${DATA}/INPUT/MOM_input" +echo "Rendered MOM_input:" +cat "${DATA}/INPUT/MOM_input" -#data table for runoff: -DATA_TABLE=${DATA_TABLE:-${HOMEgfs}/parm/ufs/fv3/data_table} -${NCP} "${DATA_TABLE}" "${DATA}/data_table_template" -sed -e "s/@\[FRUNOFF\]/${FRUNOFF}/g" "${DATA}/data_table_template" > "${DATA}/data_table" -rm "${DATA}/data_table_template" +# ================================================================ +# data_table +# ---------- +# Prepare local variables for use in MOM6_data_table.IN from UFSWM +local MOM6_FRUNOFF=${FRUNOFF} + +# Ensure the template exists +local template=${MOM6_DATA_TABLE_TEMPLATE:-"${PARMgfs}/ufs/MOM6_data_table.IN"} +if [[ ! -f "${template}" ]]; then + echo "FATAL ERROR: template '${template}' does not exist, ABORT!" + exit 1 +fi +rm -f "${DATA}/data_table" +atparse < "${template}" >> "${DATA}/data_table" +echo "Rendered data_table:" +cat "${DATA}/data_table" } diff --git a/ush/parsing_namelists_WW3.sh b/ush/parsing_namelists_WW3.sh index 9b0a94695c..a01d694710 100755 --- a/ush/parsing_namelists_WW3.sh +++ b/ush/parsing_namelists_WW3.sh @@ -79,8 +79,8 @@ WW3_namelists(){ if [ $waveMULTIGRID = ".true." ]; then # ww3_multi template - if [ -f $PARMwave/ww3_multi.inp.tmpl ]; then - cp $PARMwave/ww3_multi.inp.tmpl ww3_multi.inp.tmpl + if [ -f ${PARMgfs}/wave/ww3_multi.inp.tmpl ]; then + cp ${PARMgfs}/wave/ww3_multi.inp.tmpl ww3_multi.inp.tmpl fi if [ ! -f ww3_multi.inp.tmpl ]; then echo "ABNORMAL EXIT: NO TEMPLATE FOR WW3 MULTI INPUT FILE" @@ -88,8 +88,8 @@ WW3_namelists(){ fi else # ww3_multi template - if [ -f $PARMwave/ww3_shel.inp.tmpl ]; then - cp $PARMwave/ww3_shel.inp.tmpl ww3_shel.inp.tmpl + if [ -f ${PARMgfs}/wave/ww3_shel.inp.tmpl ]; then + cp ${PARMgfs}/wave/ww3_shel.inp.tmpl ww3_shel.inp.tmpl fi if [ ! 
-f ww3_shel.inp.tmpl ]; then echo "ABNORMAL EXIT: NO TEMPLATE FOR WW3 SHEL INPUT FILE" @@ -99,18 +99,18 @@ WW3_namelists(){ # Buoy location file - if [ -f $PARMwave/wave_${NET}.buoys ] + if [ -f ${PARMgfs}/wave/wave_${NET}.buoys ] then - cp $PARMwave/wave_${NET}.buoys buoy.loc + cp ${PARMgfs}/wave/wave_${NET}.buoys buoy.loc fi if [ -f buoy.loc ] then set +x - echo " buoy.loc copied ($PARMwave/wave_${NET}.buoys)." + echo " buoy.loc copied (${PARMgfs}/wave/wave_${NET}.buoys)." set_trace else - echo " FATAL ERROR : buoy.loc ($PARMwave/wave_${NET}.buoys) NOT FOUND" + echo " FATAL ERROR : buoy.loc (${PARMgfs}/wave/wave_${NET}.buoys) NOT FOUND" exit 12 fi diff --git a/ush/ufs_configure.sh b/ush/parsing_ufs_configure.sh similarity index 80% rename from ush/ufs_configure.sh rename to ush/parsing_ufs_configure.sh index 8898d11162..ade07c52bf 100755 --- a/ush/ufs_configure.sh +++ b/ush/parsing_ufs_configure.sh @@ -1,36 +1,31 @@ #! /usr/bin/env bash ##### -## This script writes ufs.configure file -## first, select a "*.IN" templates based on -## $confignamevarforufs and parse values based on -## $cpl** switches. -## -## This is a child script of modular -## forecast script. This script is definition only (Is it? There is nothing defined here being used outside this script.) 
+## This script writes ufs.configure file based on a template defined in +## ${ufs_configure_template} ##### # Disable variable not used warnings # shellcheck disable=SC2034 writing_ufs_configure() { -echo "SUB ${FUNCNAME[0]}: ufs.configure.sh begins" +echo "SUB ${FUNCNAME[0]}: ufs.configure begins" # Setup ufs.configure -local DumpFields=${NEMSDumpFields:-false} +local esmf_logkind=${esmf_logkind:-"ESMF_LOGKIND_MULTI"} #options: ESMF_LOGKIND_MULTI_ON_ERROR, ESMF_LOGKIND_MULTI, ESMF_LOGKIND_NONE +local DumpFields=${DumpFields:-false} local cap_dbug_flag=${cap_dbug_flag:-0} + # Determine "cmeps_run_type" based on the availability of the mediator restart file # If it is a warm_start, we already copied the mediator restart to DATA, if it was present # If the mediator restart was not present, despite being a "warm_start", we put out a WARNING -# in forecast_postdet.sh +# in forecast_postdet.sh function CMEPS_postdet if [[ -f "${DATA}/ufs.cpld.cpl.r.nc" ]]; then local cmeps_run_type='continue' else local cmeps_run_type='startup' fi -local esmf_logkind=${esmf_logkind:-"ESMF_LOGKIND_MULTI"} #options: ESMF_LOGKIND_MULTI_ON_ERROR, ESMF_LOGKIND_MULTI, ESMF_LOGKIND_NONE - # Atm-related local atm_model="fv3" local atm_petlist_bounds="0 $(( ATMPETS-1 ))" @@ -53,12 +48,14 @@ if [[ "${cplflx}" = ".true." ]]; then local ocn_petlist_bounds="${ATMPETS} $(( ATMPETS+OCNPETS-1 ))" local ocn_omp_num_threads="${OCNTHREADS}" local RUNTYPE="${cmeps_run_type}" + local CMEPS_RESTART_DIR="CMEPS_RESTART/" local CPLMODE="${cplmode}" local coupling_interval_fast_sec="${CPL_FAST}" local RESTART_N="${restart_interval}" local ocean_albedo_limit=0.06 local ATMTILESIZE="${CASE:1}" local ocean_albedo_limit=0.06 + local pio_rearranger=${pio_rearranger:-"box"} fi if [[ "${cplice}" = ".true." ]]; then @@ -66,7 +63,6 @@ if [[ "${cplice}" = ".true." 
]]; then local ice_model="cice6" local ice_petlist_bounds="$(( ATMPETS+OCNPETS )) $(( ATMPETS+OCNPETS+ICEPETS-1 ))" local ice_omp_num_threads="${ICETHREADS}" - local MESH_OCN_ICE=${MESH_OCN_ICE:-"mesh.mx${ICERES}.nc"} local FHMAX="${FHMAX_GFS}" # TODO: How did this get in here hard-wired to FHMAX_GFS? fi @@ -76,6 +72,7 @@ if [[ "${cplwav}" = ".true." ]]; then local wav_petlist_bounds="$(( ATMPETS+OCNPETS+ICEPETS )) $(( ATMPETS+OCNPETS+ICEPETS+WAVPETS-1 ))" local wav_omp_num_threads="${WAVTHREADS}" local MULTIGRID="${waveMULTIGRID}" + local WW3_user_sets_restname="false" fi @@ -84,7 +81,7 @@ if [[ "${cplchm}" = ".true." ]]; then local chm_model="gocart" local chm_petlist_bounds="0 $(( CHMPETS-1 ))" local chm_omp_num_threads="${CHMTHREADS}" - local coupling_interval_fast_sec="${CPL_FAST}" + local coupling_interval_sec="${CPL_FAST}" fi @@ -92,9 +89,11 @@ fi if [[ ! -r "${ufs_configure_template}" ]]; then echo "FATAL ERROR: template '${ufs_configure_template}' does not exist, ABORT!" exit 1 +else + echo "INFO: using ufs.configure template: '${ufs_configure_template}'" fi -source "${HOMEgfs}/ush/atparse.bash" +source "${USHgfs}/atparse.bash" rm -f "${DATA}/ufs.configure" atparse < "${ufs_configure_template}" >> "${DATA}/ufs.configure" echo "Rendered ufs.configure:" @@ -102,6 +101,6 @@ cat ufs.configure ${NCP} "${HOMEgfs}/sorc/ufs_model.fd/tests/parm/fd_ufs.yaml" fd_ufs.yaml -echo "SUB ${FUNCNAME[0]}: ufs.configure.sh ends for ${ufs_configure_template}" +echo "SUB ${FUNCNAME[0]}: ufs.configure ends" } diff --git a/ush/preamble.sh b/ush/preamble.sh index be64684aa8..08d7659ad1 100644 --- a/ush/preamble.sh +++ b/ush/preamble.sh @@ -16,6 +16,8 @@ # TRACE (YES/NO): Whether to echo every command (set -x) [default: "YES"] # STRICT (YES/NO): Whether to exit immediately on error or undefined variable # (set -eu) [default: "YES"] +# POSTAMBLE_CMD (empty/set): A command to run at the end of the job +# [default: empty] # ####### set +x @@ -70,6 +72,24 @@ postamble() { 
start_time="${2}" rc="${3}" + # Execute postamble command + # + # Commands can be added to the postamble by appending them to $POSTAMBLE_CMD: + # POSTAMBLE_CMD="new_thing; ${POSTAMBLE_CMD:-}" # (before existing commands) + # POSTAMBLE_CMD="${POSTAMBLE_CMD:-}; new_thing" # (after existing commands) + # + # Always use this form so previous POSTAMBLE_CMD are not overwritten. This should + # only be used for commands that execute conditionally (i.e. on certain machines + # or jobs). Global changes should just be added to this function. + # These commands will be called when EACH SCRIPT terminates, so be mindful. Please + # consult with global-workflow CMs about permanent changes to $POSTAMBLE_CMD or + # this postamble function. + # + + if [[ -v 'POSTAMBLE_CMD' ]]; then + ${POSTAMBLE_CMD} + fi + # Calculate the elapsed time end_time=$(date +%s) end_time_human=$(date -d@"${end_time}" -u +%H:%M:%S) @@ -87,70 +107,7 @@ postamble() { trap "postamble ${_calling_script} ${start_time} \$?" EXIT # shellcheck disable= -function generate_com() { - # - # Generate a list COM variables from a template by substituting in env variables. - # - # Each argument must have a corresponding template with the name ${ARG}_TMPL. Any - # variables in the template are replaced with their values. Undefined variables - # are just removed without raising an error. - # - # Accepts as options `-r` and `-x`, which do the same thing as the same options in - # `declare`. Variables are automatically marked as `-g` so the variable is visible - # in the calling script. 
- # - # Syntax: - # generate_com [-rx] $var1[:$tmpl1] [$var2[:$tmpl2]] [...]] - # - # options: - # -r: Make variable read-only (same as `decalre -r`) - # -x: Mark variable for export (same as `declare -x`) - # var1, var2, etc: Variable names whose values will be generated from a template - # and declared - # tmpl1, tmpl2, etc: Specify the template to use (default is "${var}_TMPL") - # - # Examples: - # # Current cycle and RUN, implicitly using template COM_ATMOS_ANALYSIS_TMPL - # YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_ANALYSIS - # - # # Previous cycle and gdas using an explicit template - # RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} generate_com -rx \ - # COM_ATMOS_HISTORY_PREV:COM_ATMOS_HISTORY_TMPL - # - # # Current cycle and COM for first member - # MEMDIR='mem001' YMD=${PDY} HH=${cyc} generate_com -rx COM_ATMOS_HISTORY - # - if [[ ${DEBUG_WORKFLOW:-"NO"} == "NO" ]]; then set +x; fi - local opts="-g" - local OPTIND=1 - while getopts "rx" option; do - opts="${opts}${option}" - done - shift $((OPTIND-1)) - - for input in "$@"; do - IFS=':' read -ra args <<< "${input}" - local com_var="${args[0]}" - local template - local value - if (( ${#args[@]} > 1 )); then - template="${args[1]}" - else - template="${com_var}_TMPL" - fi - if [[ ! -v "${template}" ]]; then - echo "FATAL ERROR in generate_com: Requested template ${template} not defined!" 
- exit 2 - fi - value=$(echo "${!template}" | envsubst) - # shellcheck disable=SC2086 - declare ${opts} "${com_var}"="${value}" - echo "generate_com :: ${com_var}=${value}" - done - set_trace -} -# shellcheck disable= -declare -xf generate_com +source "${HOMEgfs}/ush/bash_utils.sh" # Turn on our settings set_strict diff --git a/ush/python/pygfs/task/aero_analysis.py b/ush/python/pygfs/task/aero_analysis.py index 0e515a0df4..a61b7c82f3 100644 --- a/ush/python/pygfs/task/aero_analysis.py +++ b/ush/python/pygfs/task/aero_analysis.py @@ -12,7 +12,7 @@ add_to_datetime, to_fv3time, to_timedelta, chdir, to_fv3time, - YAMLFile, parse_yamltmpl, parse_j2yaml, save_as_yaml, + YAMLFile, parse_j2yaml, save_as_yaml, logit, Executable, WorkflowException) @@ -32,7 +32,7 @@ def __init__(self, config): _res = int(self.config['CASE'][1:]) _res_anl = int(self.config['CASE_ANL'][1:]) _window_begin = add_to_datetime(self.runtime_config.current_cycle, -to_timedelta(f"{self.config['assim_freq']}H") / 2) - _fv3jedi_yaml = os.path.join(self.runtime_config.DATA, f"{self.runtime_config.CDUMP}.t{self.runtime_config['cyc']:02d}z.aerovar.yaml") + _jedi_yaml = os.path.join(self.runtime_config.DATA, f"{self.runtime_config.CDUMP}.t{self.runtime_config['cyc']:02d}z.aerovar.yaml") # Create a local dictionary that is repeatedly used across this class local_dict = AttrDict( @@ -46,11 +46,11 @@ def __init__(self, config): 'npz_anl': self.config['LEVS'] - 1, 'AERO_WINDOW_BEGIN': _window_begin, 'AERO_WINDOW_LENGTH': f"PT{self.config['assim_freq']}H", - 'aero_bkg_fhr': map(int, self.config['aero_bkg_times'].split(',')), + 'aero_bkg_fhr': map(int, str(self.config['aero_bkg_times']).split(',')), 'OPREFIX': f"{self.runtime_config.CDUMP}.t{self.runtime_config.cyc:02d}z.", # TODO: CDUMP is being replaced by RUN 'APREFIX': f"{self.runtime_config.CDUMP}.t{self.runtime_config.cyc:02d}z.", # TODO: CDUMP is being replaced by RUN 'GPREFIX': f"gdas.t{self.runtime_config.previous_cycle.hour:02d}z.", - 'fv3jedi_yaml': 
_fv3jedi_yaml, + 'jedi_yaml': _jedi_yaml, } ) @@ -73,15 +73,13 @@ def initialize(self: Analysis) -> None: super().initialize() # stage CRTM fix files - crtm_fix_list_path = os.path.join(self.task_config['HOMEgfs'], 'parm', 'gdas', 'aero_crtm_coeff.yaml') - logger.debug(f"Staging CRTM fix files from {crtm_fix_list_path}") - crtm_fix_list = parse_j2yaml(crtm_fix_list_path, self.task_config) + logger.info(f"Staging CRTM fix files from {self.task_config.CRTM_FIX_YAML}") + crtm_fix_list = parse_j2yaml(self.task_config.CRTM_FIX_YAML, self.task_config) FileHandler(crtm_fix_list).sync() # stage fix files - jedi_fix_list_path = os.path.join(self.task_config['HOMEgfs'], 'parm', 'gdas', 'aero_jedi_fix.yaml') - logger.debug(f"Staging JEDI fix files from {jedi_fix_list_path}") - jedi_fix_list = parse_j2yaml(jedi_fix_list_path, self.task_config) + logger.info(f"Staging JEDI fix files from {self.task_config.JEDI_FIX_YAML}") + jedi_fix_list = parse_j2yaml(self.task_config.JEDI_FIX_YAML, self.task_config) FileHandler(jedi_fix_list).sync() # stage berror files @@ -93,10 +91,9 @@ def initialize(self: Analysis) -> None: FileHandler(self.get_bkg_dict(AttrDict(self.task_config, **self.task_config))).sync() # generate variational YAML file - logger.debug(f"Generate variational YAML file: {self.task_config.fv3jedi_yaml}") - varda_yaml = parse_j2yaml(self.task_config['AEROVARYAML'], self.task_config) - save_as_yaml(varda_yaml, self.task_config.fv3jedi_yaml) - logger.info(f"Wrote variational YAML to: {self.task_config.fv3jedi_yaml}") + logger.debug(f"Generate variational YAML file: {self.task_config.jedi_yaml}") + save_as_yaml(self.task_config.jedi_config, self.task_config.jedi_yaml) + logger.info(f"Wrote variational YAML to: {self.task_config.jedi_yaml}") # need output dir for diags and anl logger.debug("Create empty output [anl, diags] directories to receive output from executable") @@ -114,7 +111,7 @@ def execute(self: Analysis) -> None: exec_cmd = 
Executable(self.task_config.APRUN_AEROANL) exec_name = os.path.join(self.task_config.DATA, 'fv3jedi_var.x') exec_cmd.add_default_arg(exec_name) - exec_cmd.add_default_arg(self.task_config.fv3jedi_yaml) + exec_cmd.add_default_arg(self.task_config.jedi_yaml) try: logger.debug(f"Executing {exec_cmd}") @@ -212,7 +209,7 @@ def _add_fms_cube_sphere_increments(self: Analysis) -> None: inc_template = os.path.join(self.task_config.DATA, 'anl', 'aeroinc.' + increment_template) bkg_template = os.path.join(self.task_config.COM_ATMOS_RESTART_PREV, restart_template) # get list of increment vars - incvars_list_path = os.path.join(self.task_config['HOMEgfs'], 'parm', 'gdas', 'aeroanl_inc_vars.yaml') + incvars_list_path = os.path.join(self.task_config['PARMgfs'], 'gdas', 'aeroanl_inc_vars.yaml') incvars = YAMLFile(path=incvars_list_path)['incvars'] super().add_fv3_increments(inc_template, bkg_template, incvars) diff --git a/ush/python/pygfs/task/analysis.py b/ush/python/pygfs/task/analysis.py index cfd1fb2206..2221fb7b34 100644 --- a/ush/python/pygfs/task/analysis.py +++ b/ush/python/pygfs/task/analysis.py @@ -4,6 +4,7 @@ import glob import tarfile from logging import getLogger +from pprint import pformat from netCDF4 import Dataset from typing import List, Dict, Any, Union @@ -25,9 +26,15 @@ class Analysis(Task): def __init__(self, config: Dict[str, Any]) -> None: super().__init__(config) self.config.ntiles = 6 + # Store location of GDASApp jinja2 templates + self.gdasapp_j2tmpl_dir = os.path.join(self.config.PARMgfs, 'gdas') def initialize(self) -> None: super().initialize() + + # all JEDI analyses need a JEDI config + self.task_config.jedi_config = self.get_jedi_config() + # all analyses need to stage observations obs_dict = self.get_obs_dict() FileHandler(obs_dict).sync() @@ -39,13 +46,33 @@ def initialize(self) -> None: # link jedi executable to run directory self.link_jediexe() + @logit(logger) + def get_jedi_config(self) -> Dict[str, Any]: + """Compile a dictionary of JEDI 
configuration from JEDIYAML template file + + Parameters + ---------- + + Returns + ---------- + jedi_config : Dict + a dictionary containing the fully rendered JEDI yaml configuration + """ + + # generate JEDI YAML file + logger.info(f"Generate JEDI YAML config: {self.task_config.jedi_yaml}") + jedi_config = parse_j2yaml(self.task_config.JEDIYAML, self.task_config, searchpath=self.gdasapp_j2tmpl_dir) + logger.debug(f"JEDI config:\n{pformat(jedi_config)}") + + return jedi_config + @logit(logger) def get_obs_dict(self) -> Dict[str, Any]: """Compile a dictionary of observation files to copy - This method uses the OBS_LIST configuration variable to generate a dictionary - from a list of YAML files that specify what observation files are to be - copied to the run directory from the observation input directory + This method extracts 'observers' from the JEDI yaml and from that list, extracts a list of + observation files that are to be copied to the run directory + from the observation input directory Parameters ---------- @@ -55,13 +82,13 @@ def get_obs_dict(self) -> Dict[str, Any]: obs_dict: Dict a dictionary containing the list of observation files to copy for FileHandler """ - logger.debug(f"OBS_LIST: {self.task_config['OBS_LIST']}") - obs_list_config = parse_j2yaml(self.task_config["OBS_LIST"], self.task_config) - logger.debug(f"obs_list_config: {obs_list_config}") - # get observers from master dictionary - observers = obs_list_config['observers'] + + logger.info(f"Extracting a list of observation files from {self.task_config.JEDIYAML}") + observations = find_value_in_nested_dict(self.task_config.jedi_config, 'observations') + logger.debug(f"observations:\n{pformat(observations)}") + copylist = [] - for ob in observers: + for ob in observations['observers']: obfile = ob['obs space']['obsdatain']['engine']['obsfile'] basename = os.path.basename(obfile) copylist.append([os.path.join(self.task_config['COM_OBS'], basename), obfile]) @@ -75,9 +102,11 @@ def 
get_obs_dict(self) -> Dict[str, Any]: def get_bias_dict(self) -> Dict[str, Any]: """Compile a dictionary of observation files to copy - This method uses the OBS_LIST configuration variable to generate a dictionary - from a list of YAML files that specify what observation bias correction files - are to be copied to the run directory from the observation input directory + This method extracts 'observers' from the JEDI yaml and from that list, extracts a list of + observation bias correction files that are to be copied to the run directory + from the component directory. + TODO: COM_ATMOS_ANALYSIS_PREV is hardwired here and this method is not appropriate in + `analysis.py` and should be implemented in the component where this is applicable. Parameters ---------- @@ -87,21 +116,22 @@ def get_bias_dict(self) -> Dict[str, Any]: bias_dict: Dict a dictionary containing the list of observation bias files to copy for FileHandler """ - logger.debug(f"OBS_LIST: {self.task_config['OBS_LIST']}") - obs_list_config = parse_j2yaml(self.task_config["OBS_LIST"], self.task_config) - logger.debug(f"obs_list_config: {obs_list_config}") - # get observers from master dictionary - observers = obs_list_config['observers'] + + logger.info(f"Extracting a list of bias correction files from {self.task_config.JEDIYAML}") + observations = find_value_in_nested_dict(self.task_config.jedi_config, 'observations') + logger.debug(f"observations:\n{pformat(observations)}") + copylist = [] - for ob in observers: + for ob in observations['observers']: if 'obs bias' in ob.keys(): obfile = ob['obs bias']['input file'] obdir = os.path.dirname(obfile) basename = os.path.basename(obfile) prefix = '.'.join(basename.split('.')[:-2]) - for file in ['satbias.nc4', 'satbias_cov.nc4', 'tlapse.txt']: + for file in ['satbias.nc', 'satbias_cov.nc', 'tlapse.txt']: bfile = f"{prefix}.{file}" copylist.append([os.path.join(self.task_config.COM_ATMOS_ANALYSIS_PREV, bfile), os.path.join(obdir, bfile)]) + # TODO: Why is this 
specific to ATMOS? bias_dict = { 'mkdir': [os.path.join(self.runtime_config.DATA, 'bc')], @@ -311,13 +341,13 @@ def tgz_diags(statfile: str, diagdir: str) -> None: Parameters ---------- statfile : str | os.PathLike - Path to the output .tar.gz .tgz file that will contain the diag*.nc4 files e.g. atmstat.tgz + Path to the output .tar.gz .tgz file that will contain the diag*.nc files e.g. atmstat.tgz diagdir : str | os.PathLike Directory containing JEDI diag files """ # get list of diag files to put in tarball - diags = glob.glob(os.path.join(diagdir, 'diags', 'diag*nc4')) + diags = glob.glob(os.path.join(diagdir, 'diags', 'diag*nc')) logger.info(f"Compressing {len(diags)} diag files to {statfile}") @@ -326,3 +356,74 @@ def tgz_diags(statfile: str, diagdir: str) -> None: # Add diag files to tarball for diagfile in diags: tgz.add(diagfile, arcname=os.path.basename(diagfile)) + + +@logit(logger) +def find_value_in_nested_dict(nested_dict: Dict, target_key: str) -> Any: + """ + Recursively search through a nested dictionary and return the value for the target key. + This returns the first target key it finds. So if a key exists in a subsequent + nested dictionary, it will not be found. + + Parameters + ---------- + nested_dict : Dict + Dictionary to search + target_key : str + Key to search for + + Returns + ------- + Any + Value of the target key + + Raises + ------ + KeyError + If key is not found in dictionary + + TODO: if this gives issues due to landing on an incorrect key in the nested + dictionary, we will have to implement a more concrete method to search for a key + given a more complete address. 
See resolved conversations in PR 2387 + + # Example usage: + nested_dict = { + 'a': { + 'b': { + 'c': 1, + 'd': { + 'e': 2, + 'f': 3 + } + }, + 'g': 4 + }, + 'h': { + 'i': 5 + }, + 'j': { + 'k': 6 + } + } + + user_key = input("Enter the key to search for: ") + result = find_value_in_nested_dict(nested_dict, user_key) + """ + + if not isinstance(nested_dict, dict): + raise TypeError(f"Input is not of type(dict)") + + result = nested_dict.get(target_key) + if result is not None: + return result + + for value in nested_dict.values(): + if isinstance(value, dict): + try: + result = find_value_in_nested_dict(value, target_key) + if result is not None: + return result + except KeyError: + pass + + raise KeyError(f"Key '{target_key}' not found in the nested dictionary") diff --git a/ush/python/pygfs/task/atm_analysis.py b/ush/python/pygfs/task/atm_analysis.py index da41574fc9..6348bdf319 100644 --- a/ush/python/pygfs/task/atm_analysis.py +++ b/ush/python/pygfs/task/atm_analysis.py @@ -11,7 +11,7 @@ FileHandler, add_to_datetime, to_fv3time, to_timedelta, to_YMDH, chdir, - parse_yamltmpl, parse_j2yaml, save_as_yaml, + parse_j2yaml, save_as_yaml, logit, Executable, WorkflowException) @@ -31,7 +31,7 @@ def __init__(self, config): _res = int(self.config.CASE[1:]) _res_anl = int(self.config.CASE_ANL[1:]) _window_begin = add_to_datetime(self.runtime_config.current_cycle, -to_timedelta(f"{self.config.assim_freq}H") / 2) - _fv3jedi_yaml = os.path.join(self.runtime_config.DATA, f"{self.runtime_config.CDUMP}.t{self.runtime_config.cyc:02d}z.atmvar.yaml") + _jedi_yaml = os.path.join(self.runtime_config.DATA, f"{self.runtime_config.CDUMP}.t{self.runtime_config.cyc:02d}z.atmvar.yaml") # Create a local dictionary that is repeatedly used across this class local_dict = AttrDict( @@ -48,7 +48,7 @@ def __init__(self, config): 'OPREFIX': f"{self.runtime_config.CDUMP}.t{self.runtime_config.cyc:02d}z.", # TODO: CDUMP is being replaced by RUN 'APREFIX': 
f"{self.runtime_config.CDUMP}.t{self.runtime_config.cyc:02d}z.", # TODO: CDUMP is being replaced by RUN 'GPREFIX': f"gdas.t{self.runtime_config.previous_cycle.hour:02d}z.", - 'fv3jedi_yaml': _fv3jedi_yaml, + 'jedi_yaml': _jedi_yaml, } ) @@ -71,19 +71,17 @@ def initialize(self: Analysis) -> None: super().initialize() # stage CRTM fix files - crtm_fix_list_path = os.path.join(self.task_config.HOMEgfs, 'parm', 'gdas', 'atm_crtm_coeff.yaml') - logger.debug(f"Staging CRTM fix files from {crtm_fix_list_path}") - crtm_fix_list = parse_j2yaml(crtm_fix_list_path, self.task_config) + logger.info(f"Staging CRTM fix files from {self.task_config.CRTM_FIX_YAML}") + crtm_fix_list = parse_j2yaml(self.task_config.CRTM_FIX_YAML, self.task_config) FileHandler(crtm_fix_list).sync() # stage fix files - jedi_fix_list_path = os.path.join(self.task_config.HOMEgfs, 'parm', 'gdas', 'atm_jedi_fix.yaml') - logger.debug(f"Staging JEDI fix files from {jedi_fix_list_path}") - jedi_fix_list = parse_j2yaml(jedi_fix_list_path, self.task_config) + logger.info(f"Staging JEDI fix files from {self.task_config.JEDI_FIX_YAML}") + jedi_fix_list = parse_j2yaml(self.task_config.JEDI_FIX_YAML, self.task_config) FileHandler(jedi_fix_list).sync() # stage static background error files, otherwise it will assume ID matrix - logger.debug(f"Stage files for STATICB_TYPE {self.task_config.STATICB_TYPE}") + logger.info(f"Stage files for STATICB_TYPE {self.task_config.STATICB_TYPE}") FileHandler(self.get_berror_dict(self.task_config)).sync() # stage ensemble files for use in hybrid background error @@ -94,7 +92,7 @@ def initialize(self: Analysis) -> None: 'NMEM_ENS', 'DATA', 'current_cycle', 'ntiles'] for key in keys: localconf[key] = self.task_config[key] - localconf.RUN = 'enkf' + self.task_config.RUN + localconf.RUN = 'enkfgdas' localconf.dirname = 'ens' FileHandler(self.get_fv3ens_dict(localconf)).sync() @@ -102,10 +100,9 @@ def initialize(self: Analysis) -> None: 
FileHandler(self.get_bkg_dict(AttrDict(self.task_config))).sync() # generate variational YAML file - logger.debug(f"Generate variational YAML file: {self.task_config.fv3jedi_yaml}") - varda_yaml = parse_j2yaml(self.task_config.ATMVARYAML, self.task_config) - save_as_yaml(varda_yaml, self.task_config.fv3jedi_yaml) - logger.info(f"Wrote variational YAML to: {self.task_config.fv3jedi_yaml}") + logger.debug(f"Generate variational YAML file: {self.task_config.jedi_yaml}") + save_as_yaml(self.task_config.jedi_config, self.task_config.jedi_yaml) + logger.info(f"Wrote variational YAML to: {self.task_config.jedi_yaml}") # need output dir for diags and anl logger.debug("Create empty output [anl, diags] directories to receive output from executable") @@ -123,7 +120,7 @@ def execute(self: Analysis) -> None: exec_cmd = Executable(self.task_config.APRUN_ATMANL) exec_name = os.path.join(self.task_config.DATA, 'fv3jedi_var.x') exec_cmd.add_default_arg(exec_name) - exec_cmd.add_default_arg(self.task_config.fv3jedi_yaml) + exec_cmd.add_default_arg(self.task_config.jedi_yaml) try: logger.debug(f"Executing {exec_cmd}") @@ -152,7 +149,7 @@ def finalize(self: Analysis) -> None: atmstat = os.path.join(self.task_config.COM_ATMOS_ANALYSIS, f"{self.task_config.APREFIX}atmstat") # get list of diag files to put in tarball - diags = glob.glob(os.path.join(self.task_config.DATA, 'diags', 'diag*nc4')) + diags = glob.glob(os.path.join(self.task_config.DATA, 'diags', 'diag*nc')) logger.info(f"Compressing {len(diags)} diag files to {atmstat}.gz") @@ -170,7 +167,7 @@ def finalize(self: Analysis) -> None: archive.add(diaggzip, arcname=os.path.basename(diaggzip)) # copy full YAML from executable to ROTDIR - logger.info(f"Copying {self.task_config.fv3jedi_yaml} to {self.task_config.COM_ATMOS_ANALYSIS}") + logger.info(f"Copying {self.task_config.jedi_yaml} to {self.task_config.COM_ATMOS_ANALYSIS}") src = os.path.join(self.task_config.DATA, 
f"{self.task_config.CDUMP}.t{self.task_config.cyc:02d}z.atmvar.yaml") dest = os.path.join(self.task_config.COM_ATMOS_ANALYSIS, f"{self.task_config.CDUMP}.t{self.task_config.cyc:02d}z.atmvar.yaml") logger.debug(f"Copying {src} to {dest}") diff --git a/ush/python/pygfs/task/atmens_analysis.py b/ush/python/pygfs/task/atmens_analysis.py index 9cf84c07c7..1037b557c2 100644 --- a/ush/python/pygfs/task/atmens_analysis.py +++ b/ush/python/pygfs/task/atmens_analysis.py @@ -11,7 +11,7 @@ FileHandler, add_to_datetime, to_fv3time, to_timedelta, to_YMDH, to_YMD, chdir, - parse_yamltmpl, parse_j2yaml, save_as_yaml, + parse_j2yaml, save_as_yaml, logit, Executable, WorkflowException, @@ -31,7 +31,7 @@ def __init__(self, config): _res = int(self.config.CASE_ENS[1:]) _window_begin = add_to_datetime(self.runtime_config.current_cycle, -to_timedelta(f"{self.config.assim_freq}H") / 2) - _fv3jedi_yaml = os.path.join(self.runtime_config.DATA, f"{self.runtime_config.CDUMP}.t{self.runtime_config.cyc:02d}z.atmens.yaml") + _jedi_yaml = os.path.join(self.runtime_config.DATA, f"{self.runtime_config.CDUMP}.t{self.runtime_config.cyc:02d}z.atmens.yaml") # Create a local dictionary that is repeatedly used across this class local_dict = AttrDict( @@ -45,7 +45,7 @@ def __init__(self, config): 'OPREFIX': f"{self.config.EUPD_CYC}.t{self.runtime_config.cyc:02d}z.", # TODO: CDUMP is being replaced by RUN 'APREFIX': f"{self.runtime_config.CDUMP}.t{self.runtime_config.cyc:02d}z.", # TODO: CDUMP is being replaced by RUN 'GPREFIX': f"gdas.t{self.runtime_config.previous_cycle.hour:02d}z.", - 'fv3jedi_yaml': _fv3jedi_yaml, + 'jedi_yaml': _jedi_yaml, } ) @@ -96,19 +96,17 @@ def initialize(self: Analysis) -> None: FileHandler({'mkdir': dirlist}).sync() # stage CRTM fix files - crtm_fix_list_path = os.path.join(self.task_config.HOMEgfs, 'parm', 'gdas', 'atm_crtm_coeff.yaml') - logger.debug(f"Staging CRTM fix files from {crtm_fix_list_path}") - crtm_fix_list = parse_j2yaml(crtm_fix_list_path, self.task_config) + 
logger.info(f"Staging CRTM fix files from {self.task_config.CRTM_FIX_YAML}") + crtm_fix_list = parse_j2yaml(self.task_config.CRTM_FIX_YAML, self.task_config) FileHandler(crtm_fix_list).sync() # stage fix files - jedi_fix_list_path = os.path.join(self.task_config.HOMEgfs, 'parm', 'gdas', 'atm_jedi_fix.yaml') - logger.debug(f"Staging JEDI fix files from {jedi_fix_list_path}") - jedi_fix_list = parse_j2yaml(jedi_fix_list_path, self.task_config) + logger.info(f"Staging JEDI fix files from {self.task_config.JEDI_FIX_YAML}") + jedi_fix_list = parse_j2yaml(self.task_config.JEDI_FIX_YAML, self.task_config) FileHandler(jedi_fix_list).sync() # stage backgrounds - logger.debug(f"Stage ensemble member background files") + logger.info(f"Stage ensemble member background files") localconf = AttrDict() keys = ['COM_ATMOS_RESTART_TMPL', 'previous_cycle', 'ROTDIR', 'RUN', 'NMEM_ENS', 'DATA', 'current_cycle', 'ntiles'] @@ -118,10 +116,9 @@ def initialize(self: Analysis) -> None: FileHandler(self.get_fv3ens_dict(localconf)).sync() # generate ensemble da YAML file - logger.debug(f"Generate ensemble da YAML file: {self.task_config.fv3jedi_yaml}") - ensda_yaml = parse_j2yaml(self.task_config.ATMENSYAML, self.task_config) - save_as_yaml(ensda_yaml, self.task_config.fv3jedi_yaml) - logger.info(f"Wrote ensemble da YAML to: {self.task_config.fv3jedi_yaml}") + logger.debug(f"Generate ensemble da YAML file: {self.task_config.jedi_yaml}") + save_as_yaml(self.task_config.jedi_config, self.task_config.jedi_yaml) + logger.info(f"Wrote ensemble da YAML to: {self.task_config.jedi_yaml}") # need output dir for diags and anl logger.debug("Create empty output [anl, diags] directories to receive output from executable") @@ -153,7 +150,7 @@ def execute(self: Analysis) -> None: exec_cmd = Executable(self.task_config.APRUN_ATMENSANL) exec_name = os.path.join(self.task_config.DATA, 'fv3jedi_letkf.x') exec_cmd.add_default_arg(exec_name) - exec_cmd.add_default_arg(self.task_config.fv3jedi_yaml) + 
exec_cmd.add_default_arg(self.task_config.jedi_yaml) try: logger.debug(f"Executing {exec_cmd}") @@ -188,7 +185,7 @@ def finalize(self: Analysis) -> None: atmensstat = os.path.join(self.task_config.COM_ATMOS_ANALYSIS_ENS, f"{self.task_config.APREFIX}atmensstat") # get list of diag files to put in tarball - diags = glob.glob(os.path.join(self.task_config.DATA, 'diags', 'diag*nc4')) + diags = glob.glob(os.path.join(self.task_config.DATA, 'diags', 'diag*nc')) logger.info(f"Compressing {len(diags)} diag files to {atmensstat}.gz") @@ -206,7 +203,7 @@ def finalize(self: Analysis) -> None: archive.add(diaggzip, arcname=os.path.basename(diaggzip)) # copy full YAML from executable to ROTDIR - logger.info(f"Copying {self.task_config.fv3jedi_yaml} to {self.task_config.COM_ATMOS_ANALYSIS_ENS}") + logger.info(f"Copying {self.task_config.jedi_yaml} to {self.task_config.COM_ATMOS_ANALYSIS_ENS}") src = os.path.join(self.task_config.DATA, f"{self.task_config.CDUMP}.t{self.task_config.cyc:02d}z.atmens.yaml") dest = os.path.join(self.task_config.COM_ATMOS_ANALYSIS_ENS, f"{self.task_config.CDUMP}.t{self.task_config.cyc:02d}z.atmens.yaml") logger.debug(f"Copying {src} to {dest}") diff --git a/ush/python/pygfs/task/oceanice_products.py b/ush/python/pygfs/task/oceanice_products.py new file mode 100644 index 0000000000..c865a9f408 --- /dev/null +++ b/ush/python/pygfs/task/oceanice_products.py @@ -0,0 +1,337 @@ +#!/usr/bin/env python3 + +import os +from logging import getLogger +from typing import List, Dict, Any +from pprint import pformat +import xarray as xr + +from wxflow import (AttrDict, + parse_j2yaml, + FileHandler, + Jinja, + logit, + Task, + add_to_datetime, to_timedelta, + WorkflowException, + Executable) + +logger = getLogger(__name__.split('.')[-1]) + + +class OceanIceProducts(Task): + """Ocean Ice Products Task + """ + + VALID_COMPONENTS = ['ocean', 'ice'] + COMPONENT_RES_MAP = {'ocean': 'OCNRES', 'ice': 'ICERES'} + VALID_PRODUCT_GRIDS = {'mx025': ['1p00', '0p25'], + 'mx050': 
['1p00', '0p50'], + 'mx100': ['1p00'], + 'mx500': ['5p00']} + + # These could be read from the yaml file + TRIPOLE_DIMS_MAP = {'mx025': [1440, 1080], 'mx050': [720, 526], 'mx100': [360, 320], 'mx500': [72, 35]} + LATLON_DIMS_MAP = {'0p25': [1440, 721], '0p50': [720, 361], '1p00': [360, 181], '5p00': [72, 36]} + + @logit(logger, name="OceanIceProducts") + def __init__(self, config: Dict[str, Any]) -> None: + """Constructor for the Ocean/Ice Productstask + + Parameters + ---------- + config : Dict[str, Any] + Incoming configuration for the task from the environment + + Returns + ------- + None + """ + super().__init__(config) + + if self.config.COMPONENT not in self.VALID_COMPONENTS: + raise NotImplementedError(f'{self.config.COMPONENT} is not a valid model component.\n' + + 'Valid model components are:\n' + + f'{", ".join(self.VALID_COMPONENTS)}') + + model_grid = f"mx{self.config[self.COMPONENT_RES_MAP[self.config.COMPONENT]]:03d}" + + valid_datetime = add_to_datetime(self.runtime_config.current_cycle, to_timedelta(f"{self.config.FORECAST_HOUR}H")) + + # TODO: This is a bit of a hack, but it works for now + # FIXME: find a better way to provide the averaging period + # This will be different for ocean and ice, so when they are made flexible, this will need to be addressed + avg_period = f"{self.config.FORECAST_HOUR-self.config.FHOUT_OCNICE_GFS:03d}-{self.config.FORECAST_HOUR:03d}" + + localdict = AttrDict( + {'component': self.config.COMPONENT, + 'forecast_hour': self.config.FORECAST_HOUR, + 'valid_datetime': valid_datetime, + 'avg_period': avg_period, + 'model_grid': model_grid, + 'product_grids': self.VALID_PRODUCT_GRIDS[model_grid]} + ) + self.task_config = AttrDict(**self.config, **self.runtime_config, **localdict) + + # Read the oceanice_products.yaml file for common configuration + logger.info(f"Read the ocean ice products configuration yaml file {self.config.OCEANICEPRODUCTS_CONFIG}") + self.task_config.oceanice_yaml = 
parse_j2yaml(self.config.OCEANICEPRODUCTS_CONFIG, self.task_config) + logger.debug(f"oceanice_yaml:\n{pformat(self.task_config.oceanice_yaml)}") + + @staticmethod + @logit(logger) + def initialize(config: Dict) -> None: + """Initialize the work directory by copying all the common fix data + + Parameters + ---------- + config : Dict + Configuration dictionary for the task + + Returns + ------- + None + """ + + # Copy static data to run directory + logger.info("Copy static data to run directory") + FileHandler(config.oceanice_yaml.ocnicepost.fix_data).sync() + + # Copy "component" specific model data to run directory (e.g. ocean/ice forecast output) + logger.info(f"Copy {config.component} data to run directory") + FileHandler(config.oceanice_yaml[config.component].data_in).sync() + + @staticmethod + @logit(logger) + def configure(config: Dict, product_grid: str) -> None: + """Configure the namelist for the product_grid in the work directory. + Create namelist 'ocnicepost.nml' from template + + Parameters + ---------- + config : Dict + Configuration dictionary for the task + product_grid : str + Target product grid to process + + Returns + ------- + None + """ + + # Make a localconf with the "component" specific configuration for parsing the namelist + localconf = AttrDict() + localconf.DATA = config.DATA + localconf.component = config.component + + localconf.source_tripole_dims = ', '.join(map(str, OceanIceProducts.TRIPOLE_DIMS_MAP[config.model_grid])) + localconf.target_latlon_dims = ', '.join(map(str, OceanIceProducts.LATLON_DIMS_MAP[product_grid])) + + localconf.maskvar = config.oceanice_yaml[config.component].namelist.maskvar + localconf.sinvar = config.oceanice_yaml[config.component].namelist.sinvar + localconf.cosvar = config.oceanice_yaml[config.component].namelist.cosvar + localconf.angvar = config.oceanice_yaml[config.component].namelist.angvar + localconf.debug = ".true." if config.oceanice_yaml.ocnicepost.namelist.debug else ".false." 
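The `configure()` step above flattens the task configuration into a small `localconf` and renders `ocnicepost.nml` from a Jinja2 template via wxflow's `Jinja` helper. A minimal standalone sketch of the same render-and-write pattern, using stdlib `string.Template` as a stand-in for the Jinja render; the template text and dimension values are illustrative, not the real `ocnicepost.nml.jinja2`:

```python
# Sketch of the render-and-write namelist step in configure().
# string.Template stands in for wxflow's Jinja helper; the template
# text and values below are illustrative only.
from string import Template

nml_template = Template(
    "&ocnicepost_nml\n"
    "  component = '$component'\n"
    "  source_dims = $source_tripole_dims\n"
    "  target_dims = $target_latlon_dims\n"
    "  debug = $debug\n"
    "/\n"
)

localconf = {
    "component": "ocean",
    # mx025 tripole dims and 0p25 lat-lon dims from the *_DIMS_MAP tables
    "source_tripole_dims": ", ".join(map(str, [1440, 1080])),
    "target_latlon_dims": ", ".join(map(str, [1440, 721])),
    "debug": ".false.",
}

nml_data = nml_template.substitute(localconf)
print(nml_data)
```

In the task itself the rendered text is then written to `ocnicepost.nml` in the run directory before `ocnicepost.x` executes.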
+ + logger.debug(f"localconf:\n{pformat(localconf)}") + + # Configure the namelist and write to file + logger.info("Create namelist for ocnicepost.x") + nml_template = os.path.join(localconf.DATA, "ocnicepost.nml.jinja2") + nml_data = Jinja(nml_template, localconf).render + logger.debug(f"ocnicepost_nml:\n{nml_data}") + nml_file = os.path.join(localconf.DATA, "ocnicepost.nml") + with open(nml_file, "w") as fho: + fho.write(nml_data) + + @staticmethod + @logit(logger) + def execute(config: Dict, product_grid: str) -> None: + """Run the ocnicepost.x executable to interpolate and convert to grib2 + + Parameters + ---------- + config : Dict + Configuration dictionary for the task + product_grid : str + Target product grid to process + + Returns + ------- + None + """ + + # Run the ocnicepost.x executable + OceanIceProducts.interp(config.DATA, config.APRUN_OCNICEPOST, exec_name="ocnicepost.x") + + # Convert interpolated netCDF file to grib2 + OceanIceProducts.netCDF_to_grib2(config, product_grid) + + @staticmethod + @logit(logger) + def interp(workdir: str, aprun_cmd: str, exec_name: str = "ocnicepost.x") -> None: + """ + Run the interpolation executable to generate rectilinear netCDF file + + Parameters + ---------- + config : Dict + Configuration dictionary for the task + workdir : str + Working directory for the task + aprun_cmd : str + aprun command to use + exec_name : str + Name of the executable e.g. 
ocnicepost.x + + Returns + ------- + None + """ + os.chdir(workdir) + logger.debug(f"Current working directory: {os.getcwd()}") + + exec_cmd = Executable(aprun_cmd) + exec_cmd.add_default_arg(os.path.join(workdir, exec_name)) + + OceanIceProducts._call_executable(exec_cmd) + + @staticmethod + @logit(logger) + def netCDF_to_grib2(config: Dict, grid: str) -> None: + """Convert interpolated netCDF file to grib2 + + Parameters + ---------- + config : Dict + Configuration dictionary for the task + grid : str + Target product grid to process + + Returns + ------ + None + """ + + os.chdir(config.DATA) + + exec_cmd = Executable(config.oceanice_yaml.nc2grib2.script) + arguments = [config.component, grid, config.current_cycle.strftime("%Y%m%d%H"), config.avg_period] + if config.component == 'ocean': + levs = config.oceanice_yaml.ocean.namelist.ocean_levels + arguments.append(':'.join(map(str, levs))) + + logger.info(f"Executing {exec_cmd} with arguments {arguments}") + try: + exec_cmd(*arguments) + except OSError: + logger.exception(f"FATAL ERROR: Failed to execute {exec_cmd}") + raise OSError(f"{exec_cmd}") + except Exception: + logger.exception(f"FATAL ERROR: Error occurred during execution of {exec_cmd}") + raise WorkflowException(f"{exec_cmd}") + + @staticmethod + @logit(logger) + def subset(config: Dict) -> None: + """ + Subset a list of variables from a netcdf file and save to a new netcdf file. 
+ Also save global attributes and history from the old netcdf file into new netcdf file + + Parameters + ---------- + config : Dict + Configuration dictionary for the task + + Returns + ------- + None + """ + + os.chdir(config.DATA) + + input_file = f"{config.component}.nc" + output_file = f"{config.component}_subset.nc" + varlist = config.oceanice_yaml[config.component].subset + + logger.info(f"Subsetting {varlist} from {input_file} to {output_file}") + + try: + # open the netcdf file + ds = xr.open_dataset(input_file) + + # subset the variables + ds_subset = ds[varlist] + + # save global attributes from the old netcdf file into new netcdf file + ds_subset.attrs = ds.attrs + + # save subsetted variables to a new netcdf file + ds_subset.to_netcdf(output_file) + + except FileNotFoundError: + logger.exception(f"FATAL ERROR: Input file not found: {input_file}") + raise FileNotFoundError(f"File not found: {input_file}") + + except IOError as err: + logger.exception(f"FATAL ERROR: IOError occurred during netCDF subset: {input_file}") + raise IOError(f"An I/O error occurred: {err}") + + except Exception as err: + logger.exception(f"FATAL ERROR: Error occurred during netCDF subset: {input_file}") + raise WorkflowException(f"{err}") + + finally: + # close the netcdf files + ds.close() + ds_subset.close() + + @staticmethod + @logit(logger) + def _call_executable(exec_cmd: Executable) -> None: + """Internal method to call executable + + Parameters + ---------- + exec_cmd : Executable + Executable to run + + Raises + ------ + OSError + Failure due to OS issues + WorkflowException + All other exceptions + """ + + logger.info(f"Executing {exec_cmd}") + try: + exec_cmd() + except OSError: + logger.exception(f"FATAL ERROR: Failed to execute {exec_cmd}") + raise OSError(f"{exec_cmd}") + except Exception: + logger.exception(f"FATAL ERROR: Error occurred during execution of {exec_cmd}") + raise WorkflowException(f"{exec_cmd}") + + @staticmethod + @logit(logger) + def 
finalize(config: Dict) -> None: + """Perform closing actions of the task. + Copy data back from the DATA/ directory to COM/ + + Parameters + ---------- + config: Dict + Configuration dictionary for the task + + Returns + ------- + None + """ + + # Copy "component" specific generated data to COM/ directory + data_out = config.oceanice_yaml[config.component].data_out + + logger.info(f"Copy processed data to COM/ directory") + FileHandler(data_out).sync() diff --git a/ush/python/pygfs/task/land_analysis.py b/ush/python/pygfs/task/snow_analysis.py similarity index 89% rename from ush/python/pygfs/task/land_analysis.py rename to ush/python/pygfs/task/snow_analysis.py index 307e875183..c149f140b6 100644 --- a/ush/python/pygfs/task/land_analysis.py +++ b/ush/python/pygfs/task/snow_analysis.py @@ -11,7 +11,7 @@ FileHandler, to_fv3time, to_YMD, to_YMDH, to_timedelta, add_to_datetime, rm_p, - parse_j2yaml, parse_yamltmpl, save_as_yaml, + parse_j2yaml, save_as_yaml, Jinja, logit, Executable, @@ -21,14 +21,14 @@ logger = getLogger(__name__.split('.')[-1]) -class LandAnalysis(Analysis): +class SnowAnalysis(Analysis): """ - Class for global land analysis tasks + Class for global snow analysis tasks """ - NMEM_LANDENS = 2 # The size of the land ensemble is fixed at 2. Does this need to be a variable? 
+ NMEM_SNOWENS = 2 - @logit(logger, name="LandAnalysis") + @logit(logger, name="SnowAnalysis") def __init__(self, config): super().__init__(config) @@ -43,8 +43,8 @@ def __init__(self, config): 'npy_ges': _res + 1, 'npz_ges': self.config.LEVS - 1, 'npz': self.config.LEVS - 1, - 'LAND_WINDOW_BEGIN': _window_begin, - 'LAND_WINDOW_LENGTH': f"PT{self.config['assim_freq']}H", + 'SNOW_WINDOW_BEGIN': _window_begin, + 'SNOW_WINDOW_LENGTH': f"PT{self.config['assim_freq']}H", 'OPREFIX': f"{self.runtime_config.RUN}.t{self.runtime_config.cyc:02d}z.", 'APREFIX': f"{self.runtime_config.RUN}.t{self.runtime_config.cyc:02d}z.", 'jedi_yaml': _letkfoi_yaml @@ -56,9 +56,9 @@ def __init__(self, config): @logit(logger) def prepare_GTS(self) -> None: - """Prepare the GTS data for a global land analysis + """Prepare the GTS data for a global snow analysis - This method will prepare GTS data for a global land analysis using JEDI. + This method will prepare GTS data for a global snow analysis using JEDI. This includes: - processing GTS bufr snow depth observation data to IODA format @@ -74,7 +74,7 @@ def prepare_GTS(self) -> None: # create a temporary dict of all keys needed in this method localconf = AttrDict() keys = ['HOMEgfs', 'DATA', 'current_cycle', 'COM_OBS', 'COM_ATMOS_RESTART_PREV', - 'OPREFIX', 'CASE', 'ntiles'] + 'OPREFIX', 'CASE', 'OCNRES', 'ntiles'] for key in keys: localconf[key] = self.task_config[key] @@ -99,7 +99,7 @@ def prepare_GTS(self) -> None: def _gtsbufr2iodax(exe, yaml_file): if not os.path.isfile(yaml_file): - logger.exception(f"{yaml_file} not found") + logger.exception(f"FATAL ERROR: {yaml_file} not found") raise FileNotFoundError(yaml_file) logger.info(f"Executing {exe}") @@ -133,9 +133,9 @@ def _gtsbufr2iodax(exe, yaml_file): @logit(logger) def prepare_IMS(self) -> None: - """Prepare the IMS data for a global land analysis + """Prepare the IMS data for a global snow analysis - This method will prepare IMS data for a global land analysis using JEDI. 
+ This method will prepare IMS data for a global snow analysis using JEDI. This includes: - staging model backgrounds - processing raw IMS observation data and prepare for conversion to IODA @@ -153,7 +153,7 @@ def prepare_IMS(self) -> None: # create a temporary dict of all keys needed in this method localconf = AttrDict() keys = ['DATA', 'current_cycle', 'COM_OBS', 'COM_ATMOS_RESTART_PREV', - 'OPREFIX', 'CASE', 'OCNRES', 'ntiles'] + 'OPREFIX', 'CASE', 'OCNRES', 'ntiles', 'FIXgfs'] for key in keys: localconf[key] = self.task_config[key] @@ -198,7 +198,7 @@ def prepare_IMS(self) -> None: raise WorkflowException(f"An error occured during execution of {exe}") # Ensure the snow depth IMS file is produced by the above executable - input_file = f"IMSscf.{to_YMD(localconf.current_cycle)}.{localconf.CASE}.mx{localconf.OCNRES}_oro_data.nc" + input_file = f"IMSscf.{to_YMD(localconf.current_cycle)}.{localconf.CASE}_oro_data.nc" if not os.path.isfile(f"{os.path.join(localconf.DATA, input_file)}"): logger.exception(f"{self.task_config.CALCFIMSEXE} failed to produce {input_file}") raise FileNotFoundError(f"{os.path.join(localconf.DATA, input_file)}") @@ -232,7 +232,7 @@ def prepare_IMS(self) -> None: @logit(logger) def initialize(self) -> None: - """Initialize method for Land analysis + """Initialize method for snow analysis This method: - creates artifacts in the DATA directory by copying fix files - creates the JEDI LETKF yaml from the template @@ -241,7 +241,7 @@ def initialize(self) -> None: Parameters ---------- self : Analysis - Instance of the LandAnalysis object + Instance of the SnowAnalysis object """ super().initialize() @@ -249,30 +249,27 @@ def initialize(self) -> None: # create a temporary dict of all keys needed in this method localconf = AttrDict() keys = ['DATA', 'current_cycle', 'COM_OBS', 'COM_ATMOS_RESTART_PREV', - 'OPREFIX', 'CASE', 'ntiles'] + 'OPREFIX', 'CASE', 'OCNRES', 'ntiles'] for key in keys: localconf[key] = self.task_config[key] # Make member 
directories in DATA for background dirlist = [] - for imem in range(1, LandAnalysis.NMEM_LANDENS + 1): + for imem in range(1, SnowAnalysis.NMEM_SNOWENS + 1): dirlist.append(os.path.join(localconf.DATA, 'bkg', f'mem{imem:03d}')) FileHandler({'mkdir': dirlist}).sync() # stage fix files - jedi_fix_list_path = os.path.join(self.task_config.HOMEgfs, 'parm', 'gdas', 'land_jedi_fix.yaml') - logger.info(f"Staging JEDI fix files from {jedi_fix_list_path}") - jedi_fix_list = parse_yamltmpl(jedi_fix_list_path, self.task_config) + logger.info(f"Staging JEDI fix files from {self.task_config.JEDI_FIX_YAML}") + jedi_fix_list = parse_j2yaml(self.task_config.JEDI_FIX_YAML, self.task_config) FileHandler(jedi_fix_list).sync() # stage backgrounds logger.info("Staging ensemble backgrounds") FileHandler(self.get_ens_bkg_dict(localconf)).sync() - # generate letkfoi YAML file - logger.info(f"Generate JEDI LETKF YAML file: {self.task_config.jedi_yaml}") - letkfoi_yaml = parse_j2yaml(self.task_config.JEDIYAML, self.task_config) - save_as_yaml(letkfoi_yaml, self.task_config.jedi_yaml) + # Write out letkfoi YAML file + save_as_yaml(self.task_config.jedi_config, self.task_config.jedi_yaml) logger.info(f"Wrote letkfoi YAML to: {self.task_config.jedi_yaml}") # need output dir for diags and anl @@ -294,15 +291,15 @@ def execute(self) -> None: Parameters ---------- self : Analysis - Instance of the LandAnalysis object + Instance of the SnowAnalysis object """ # create a temporary dict of all keys needed in this method localconf = AttrDict() keys = ['HOMEgfs', 'DATA', 'current_cycle', - 'COM_ATMOS_RESTART_PREV', 'COM_LAND_ANALYSIS', 'APREFIX', - 'SNOWDEPTHVAR', 'BESTDDEV', 'CASE', 'ntiles', - 'APRUN_LANDANL', 'JEDIEXE', 'jedi_yaml', + 'COM_ATMOS_RESTART_PREV', 'COM_SNOW_ANALYSIS', 'APREFIX', + 'SNOWDEPTHVAR', 'BESTDDEV', 'CASE', 'OCNRES', 'ntiles', + 'APRUN_SNOWANL', 'JEDIEXE', 'jedi_yaml', 'APPLY_INCR_NML_TMPL', 'APPLY_INCR_EXE', 'APRUN_APPLY_INCR'] for key in keys: localconf[key] = 
self.task_config[key] @@ -314,7 +311,7 @@ def execute(self) -> None: logger.info("Running JEDI LETKF") self.execute_jediexe(localconf.DATA, - localconf.APRUN_LANDANL, + localconf.APRUN_SNOWANL, os.path.basename(localconf.JEDIEXE), localconf.jedi_yaml) @@ -323,7 +320,7 @@ def execute(self) -> None: @logit(logger) def finalize(self) -> None: - """Performs closing actions of the Land analysis task + """Performs closing actions of the Snow analysis task This method: - tar and gzip the output diag files and place in COM/ - copy the generated YAML file from initialize to the COM/ @@ -333,11 +330,11 @@ def finalize(self) -> None: Parameters ---------- self : Analysis - Instance of the LandAnalysis object + Instance of the SnowAnalysis object """ logger.info("Create diagnostic tarball of diag*.nc4 files") - statfile = os.path.join(self.task_config.COM_LAND_ANALYSIS, f"{self.task_config.APREFIX}landstat.tgz") + statfile = os.path.join(self.task_config.COM_SNOW_ANALYSIS, f"{self.task_config.APREFIX}snowstat.tgz") self.tgz_diags(statfile, self.task_config.DATA) logger.info("Copy full YAML to COM") @@ -355,17 +352,17 @@ def finalize(self) -> None: for itile in range(1, self.task_config.ntiles + 1): filename = template.format(tilenum=itile) src = os.path.join(self.task_config.DATA, 'anl', filename) - dest = os.path.join(self.task_config.COM_LAND_ANALYSIS, filename) + dest = os.path.join(self.task_config.COM_SNOW_ANALYSIS, filename) anllist.append([src, dest]) FileHandler({'copy': anllist}).sync() logger.info('Copy increments to COM') - template = f'landinc.{to_fv3time(self.task_config.current_cycle)}.sfc_data.tile{{tilenum}}.nc' + template = f'snowinc.{to_fv3time(self.task_config.current_cycle)}.sfc_data.tile{{tilenum}}.nc' inclist = [] for itile in range(1, self.task_config.ntiles + 1): filename = template.format(tilenum=itile) src = os.path.join(self.task_config.DATA, 'anl', filename) - dest = os.path.join(self.task_config.COM_LAND_ANALYSIS, filename) + dest = 
os.path.join(self.task_config.COM_SNOW_ANALYSIS, filename) inclist.append([src, dest]) FileHandler({'copy': inclist}).sync() @@ -375,7 +372,7 @@ def get_bkg_dict(config: Dict) -> Dict[str, List[str]]: """Compile a dictionary of model background files to copy This method constructs a dictionary of FV3 RESTART files (coupler, sfc_data) - that are needed for global land DA and returns said dictionary for use by the FileHandler class. + that are needed for global snow DA and returns said dictionary for use by the FileHandler class. Parameters ---------- @@ -401,11 +398,11 @@ def get_bkg_dict(config: Dict) -> Dict[str, List[str]]: # Start accumulating list of background files to copy bkglist = [] - # land DA needs coupler + # snow DA needs coupler basename = f'{to_fv3time(config.current_cycle)}.coupler.res' bkglist.append([os.path.join(rst_dir, basename), os.path.join(run_dir, basename)]) - # land DA only needs sfc_data + # snow DA only needs sfc_data for ftype in ['sfc_data']: template = f'{to_fv3time(config.current_cycle)}.{ftype}.tile{{tilenum}}.nc' for itile in range(1, config.ntiles + 1): @@ -447,17 +444,17 @@ def get_ens_bkg_dict(config: Dict) -> Dict: # get FV3 sfc_data RESTART files; Note an ensemble is being created rst_dir = os.path.join(config.COM_ATMOS_RESTART_PREV) - for imem in range(1, LandAnalysis.NMEM_LANDENS + 1): + for imem in range(1, SnowAnalysis.NMEM_SNOWENS + 1): memchar = f"mem{imem:03d}" run_dir = os.path.join(config.DATA, 'bkg', memchar, 'RESTART') dirlist.append(run_dir) - # Land DA needs coupler + # Snow DA needs coupler basename = f'{to_fv3time(config.current_cycle)}.coupler.res' bkglist.append([os.path.join(rst_dir, basename), os.path.join(run_dir, basename)]) - # Land DA only needs sfc_data + # Snow DA only needs sfc_data for ftype in ['sfc_data']: template = f'{to_fv3time(config.current_cycle)}.{ftype}.tile{{tilenum}}.nc' for itile in range(1, config.ntiles + 1): @@ -491,7 +488,7 @@ def create_ensemble(vname: str, bestddev: float, config: 
Dict) -> None: """ # 2 ens members - offset = bestddev / np.sqrt(LandAnalysis.NMEM_LANDENS) + offset = bestddev / np.sqrt(SnowAnalysis.NMEM_SNOWENS) logger.info(f"Creating ensemble for LETKFOI by offsetting with {offset}") @@ -530,6 +527,7 @@ def add_increments(config: Dict) -> None: DATA current_cycle CASE + OCNRES ntiles APPLY_INCR_NML_TMPL APPLY_INCR_EXE diff --git a/ush/radmon_err_rpt.sh b/ush/radmon_err_rpt.sh index 6ae6505624..c3d251d5cd 100755 --- a/ush/radmon_err_rpt.sh +++ b/ush/radmon_err_rpt.sh @@ -1,6 +1,6 @@ #! /usr/bin/env bash -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" ################################################################################ #### UNIX Script Documentation Block @@ -55,9 +55,6 @@ cycle2=${5:-${cycle2:?}} diag_rpt=${6:-${diag_rpt:?}} outfile=${7:-${outfile:?}} -# Directories -HOMEradmon=${HOMEradmon:-$(pwd)} - # Other variables err=0 RADMON_SUFFIX=${RADMON_SUFFIX} diff --git a/ush/radmon_verf_angle.sh b/ush/radmon_verf_angle.sh index f68d7c88cc..3dff2a6f98 100755 --- a/ush/radmon_verf_angle.sh +++ b/ush/radmon_verf_angle.sh @@ -1,6 +1,6 @@ #! 
/usr/bin/env bash -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" ################################################################################ #### UNIX Script Documentation Block @@ -29,8 +29,6 @@ source "${HOMEgfs}/ush/preamble.sh" # Imported Shell Variables: # RADMON_SUFFIX data source suffix # defauls to opr -# EXECgfs executable directory -# PARMmonitor parm directory # RAD_AREA global or regional flag # defaults to global # TANKverf_rad data repository @@ -83,7 +81,6 @@ which prep_step which startmsg # File names -export pgmout=${pgmout:-${jlogfile}} touch "${pgmout}" # Other variables @@ -101,7 +98,7 @@ fi err=0 angle_exec=radmon_angle.x -shared_scaninfo="${shared_scaninfo:-${PARMmonitor}/gdas_radmon_scaninfo.txt}" +shared_scaninfo="${shared_scaninfo:-${PARMgfs}/monitor/gdas_radmon_scaninfo.txt}" scaninfo=scaninfo.txt #-------------------------------------------------------------------- diff --git a/ush/radmon_verf_bcoef.sh b/ush/radmon_verf_bcoef.sh index ab1058711e..4274436154 100755 --- a/ush/radmon_verf_bcoef.sh +++ b/ush/radmon_verf_bcoef.sh @@ -1,6 +1,6 @@ #! /usr/bin/env bash -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" ################################################################################ #### UNIX Script Documentation Block @@ -69,7 +69,6 @@ fi echo " RADMON_NETCDF, netcdf_boolean = ${RADMON_NETCDF}, ${netcdf_boolean}" # File names -pgmout=${pgmout:-${jlogfile}} touch "${pgmout}" # Other variables diff --git a/ush/radmon_verf_bcor.sh b/ush/radmon_verf_bcor.sh index f1f97c247e..ea0a7842e6 100755 --- a/ush/radmon_verf_bcor.sh +++ b/ush/radmon_verf_bcor.sh @@ -1,6 +1,6 @@ #! 
/usr/bin/env bash -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" ################################################################################ #### UNIX Script Documentation Block @@ -65,7 +65,6 @@ source "${HOMEgfs}/ush/preamble.sh" #################################################################### # File names -pgmout=${pgmout:-${jlogfile}} touch "${pgmout}" # Other variables diff --git a/ush/radmon_verf_time.sh b/ush/radmon_verf_time.sh index 7f98407ec5..0e935826dd 100755 --- a/ush/radmon_verf_time.sh +++ b/ush/radmon_verf_time.sh @@ -1,6 +1,6 @@ #! /usr/bin/env bash -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" ################################################################################ #### UNIX Script Documentation Block @@ -33,8 +33,6 @@ source "${HOMEgfs}/ush/preamble.sh" # defaults to 1 (on) # RADMON_SUFFIX data source suffix # defauls to opr -# EXECgfs executable directory -# PARMmonitor parm data directory # RAD_AREA global or regional flag # defaults to global # TANKverf_rad data repository @@ -75,11 +73,9 @@ source "${HOMEgfs}/ush/preamble.sh" #################################################################### # File names -#pgmout=${pgmout:-${jlogfile}} -#touch $pgmout radmon_err_rpt=${radmon_err_rpt:-${USHgfs}/radmon_err_rpt.sh} -base_file=${base_file:-${PARMmonitor}/gdas_radmon_base.tar} +base_file=${base_file:-${PARMgfs}/monitor/gdas_radmon_base.tar} report=report.txt disclaimer=disclaimer.txt diff --git a/ush/rstprod.sh b/ush/rstprod.sh index acac0340bb..b48a6817e0 100755 --- a/ush/rstprod.sh +++ b/ush/rstprod.sh @@ -1,6 +1,6 @@ #! 
/usr/bin/env bash -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" #--------------------------------------------------------- # rstprod.sh diff --git a/ush/run_mpmd.sh b/ush/run_mpmd.sh index 24cb3f2656..e3fc2b7512 100755 --- a/ush/run_mpmd.sh +++ b/ush/run_mpmd.sh @@ -1,6 +1,6 @@ #!/usr/bin/env bash -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" cmdfile=${1:?"run_mpmd requires an input file containing commands to execute in MPMD mode"} diff --git a/ush/syndat_getjtbul.sh b/ush/syndat_getjtbul.sh index c17067ff72..6596c6ef96 100755 --- a/ush/syndat_getjtbul.sh +++ b/ush/syndat_getjtbul.sh @@ -18,17 +18,10 @@ # Imported variables that must be passed in: # DATA - path to working directory # pgmout - string indicating path to for standard output file -# EXECSYND - path to syndat executable directory # TANK_TROPCY - path to home directory containing tropical cyclone record # data base -# Imported variables that can be passed in: -# jlogfile - path to job log file (skipped over by this script if not -# passed in) - -source "$HOMEgfs/ush/preamble.sh" - -EXECSYND=${EXECSYND:-${HOMESYND}/exec} +source "${USHgfs}/preamble.sh" cd $DATA @@ -52,8 +45,6 @@ hour=$(echo $CDATE10 | cut -c9-10) echo $PDYm1 pdym1=$PDYm1 -#pdym1=$(sh $utilscript/finddate.sh $pdy d-1) - echo " " >> $pgmout echo "Entering sub-shell syndat_getjtbul.sh to recover JTWC Bulletins" \ >> $pgmout @@ -123,7 +114,7 @@ fi [ -s jtwcbul ] && echo "Processing JTWC bulletin halfs into tcvitals records" >> $pgmout -pgm=$(basename $EXECSYND/syndat_getjtbul.x) +pgm=$(basename ${EXECgfs}/syndat_getjtbul.x) export pgm if [ -s prep_step ]; then set +u @@ -138,7 +129,7 @@ rm -f fnoc export FORT11=jtwcbul export FORT51=fnoc -time -p ${EXECSYND}/${pgm} >> $pgmout 2> errfile +time -p ${EXECgfs}/${pgm} >> $pgmout 2> errfile errget=$? 
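The run-and-capture idiom above (invoke `${EXECgfs}/${pgm}`, save the exit status, append `errfile` to `$pgmout`) is the shell counterpart of the `_call_executable` helper added in `oceanice_products.py`. A minimal `subprocess` sketch of that pattern, with a stand-in command rather than a real workflow executable:

```python
# Sketch of the run / capture-status / fail-loudly pattern shared by
# _call_executable and the syndat shell invocations. The echo command
# is a stand-in for a real executable such as ocnicepost.x.
import subprocess

def call_executable(cmd):
    """Run cmd; raise OSError if it cannot be launched and
    RuntimeError on a nonzero exit status, mirroring the
    OSError/WorkflowException split in the task code."""
    try:
        result = subprocess.run(cmd, capture_output=True, text=True)
    except OSError:
        # e.g. executable missing or not executable
        raise OSError(f"Failed to execute {cmd}")
    if result.returncode != 0:
        raise RuntimeError(f"Error occurred during execution of {cmd}")
    return result.stdout

out = call_executable(["echo", "done"])
print(out.strip())
```

Centralizing the invocation this way keeps the error handling (and the FATAL ERROR logging in the real helper) in one place instead of repeating the try/except around every executable call.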
###cat errfile cat errfile >> $pgmout diff --git a/ush/syndat_qctropcy.sh b/ush/syndat_qctropcy.sh index cda9030577..8ec8f70b14 100755 --- a/ush/syndat_qctropcy.sh +++ b/ush/syndat_qctropcy.sh @@ -44,10 +44,6 @@ # COMSP - path to both output jtwc-fnoc file and output tcvitals file (this # tcvitals file is read by subsequent relocation processing and/or # subsequent program SYNDAT_SYNDATA) -# PARMSYND - path to syndat parm field directory -# EXECSYND - path to syndat executable directory -# FIXam - path to syndat fix field directory -# USHSYND - path to syndat ush directory # Imported variables that can be passed in: # ARCHSYND - path to syndat archive directory @@ -59,7 +55,7 @@ # data base # (Default: /dcom/us007003) # slmask - path to t126 32-bit gaussian land/sea mask file -# (Default: $FIXam/syndat_slmask.t126.gaussian) +# (Default: ${FIXgfs}/am/syndat_slmask.t126.gaussian) # copy_back - switch to copy updated files back to archive directory and # to tcvitals directory # (Default: YES) @@ -67,19 +63,13 @@ # (Default: not set) # TIMEIT - optional time and resource reporting (Default: not set) -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" ARCHSYND=${ARCHSYND:-$COMROOTp3/gfs/prod/syndat} -HOMENHCp1=${HOMENHCp1:-/gpfs/?p1/nhc/save/guidance/storm-data/ncep} HOMENHC=${HOMENHC:-/gpfs/dell2/nhc/save/guidance/storm-data/ncep} TANK_TROPCY=${TANK_TROPCY:-${DCOMROOT}/us007003} -FIXam=${FIXam:-$HOMEgfs/fix/am} -USHSYND=${USHSYND:-$HOMEgfs/ush} -EXECSYND=${EXECSYND:-$HOMEgfs/exec} -PARMSYND=${PARMSYND:-$HOMEgfs/parm/relo} - -slmask=${slmask:-$FIXam/syndat_slmask.t126.gaussian} +slmask=${slmask:-${FIXgfs}/am/syndat_slmask.t126.gaussian} copy_back=${copy_back:-YES} files_override=${files_override:-""} @@ -188,12 +178,12 @@ if [ -n "$files_override" ]; then # for testing, typically want FILES=F fi echo " &INPUT RUNID = '${net}_${tmmark}_${cyc}', FILES = $files " > vitchk.inp -cat $PARMSYND/syndat_qctropcy.${RUN}.parm >> vitchk.inp +cat 
${PARMgfs}/relo/syndat_qctropcy.${RUN}.parm >> vitchk.inp -# Copy the fixed fields from FIXam +# Copy the fixed fields -cp $FIXam/syndat_fildef.vit fildef.vit -cp $FIXam/syndat_stmnames stmnames +cp ${FIXgfs}/am/syndat_fildef.vit fildef.vit +cp ${FIXgfs}/am/syndat_stmnames stmnames rm -f nhc fnoc lthistry @@ -205,12 +195,9 @@ rm -f nhc fnoc lthistry # All are input to program syndat_qctropcy # ------------------------------------------------------------------ -if [ -s $HOMENHC/tcvitals ]; then - echo "tcvitals found" >> $pgmout - cp $HOMENHC/tcvitals nhc -elif [ -s $HOMENHCp1/tcvitals ]; then +if [ -s ${HOMENHC}/tcvitals ]; then echo "tcvitals found" >> $pgmout - cp $HOMENHCp1/tcvitals nhc + cp ${HOMENHC}/tcvitals nhc else echo "WARNING: tcvitals not found, create empty tcvitals" >> $pgmout > nhc @@ -221,17 +208,17 @@ touch nhc [ "$copy_back" = 'YES' ] && cat nhc >> $ARCHSYND/syndat_tcvitals.$year mv -f nhc nhc1 -$USHSYND/parse-storm-type.pl nhc1 > nhc +${USHgfs}/parse-storm-type.pl nhc1 > nhc cp -p nhc nhc.ORIG # JTWC/FNOC ... execute syndat_getjtbul script to write into working directory # as fnoc; copy to archive -$USHSYND/syndat_getjtbul.sh $CDATE10 +${USHgfs}/syndat_getjtbul.sh $CDATE10 touch fnoc [ "$copy_back" = 'YES' ] && cat fnoc >> $ARCHSYND/syndat_tcvitals.$year mv -f fnoc fnoc1 -$USHSYND/parse-storm-type.pl fnoc1 > fnoc +${USHgfs}/parse-storm-type.pl fnoc1 > fnoc if [ $SENDDBN = YES ]; then $DBNROOT/bin/dbn_alert MODEL SYNDAT_TCVITALS $job $ARCHSYND/syndat_tcvitals.$year @@ -245,7 +232,7 @@ cp $slmask slmask.126 # Execute program syndat_qctropcy -pgm=$(basename $EXECSYND/syndat_qctropcy.x) +pgm=$(basename ${EXECgfs}/syndat_qctropcy.x) export pgm if [ -s prep_step ]; then set +u @@ -259,7 +246,7 @@ fi echo "$CDATE10" > cdate10.dat export FORT11=slmask.126 export FORT12=cdate10.dat -${EXECSYND}/${pgm} >> $pgmout 2> errfile +${EXECgfs}/${pgm} >> $pgmout 2> errfile errqct=$? 
###cat errfile cat errfile >> $pgmout @@ -323,28 +310,25 @@ diff nhc nhc.ORIG > /dev/null errdiff=$? ################################### -# Update NHC file in $HOMENHC +# Update NHC file in ${HOMENHC} ################################### if test "$errdiff" -ne '0' then if [ "$copy_back" = 'YES' -a ${envir} = 'prod' ]; then - if [ -s $HOMENHC/tcvitals ]; then - cp nhc $HOMENHC/tcvitals - fi - if [ -s $HOMENHCp1/tcvitals ]; then - cp nhc $HOMENHCp1/tcvitals + if [ -s ${HOMENHC}/tcvitals ]; then + cp nhc ${HOMENHC}/tcvitals fi err=$? if [ "$err" -ne '0' ]; then msg="###ERROR: Previous NHC Synthetic Data Record File \ -$HOMENHC/tcvitals not updated by syndat_qctropcy" +${HOMENHC}/tcvitals not updated by syndat_qctropcy" else msg="Previous NHC Synthetic Data Record File \ -$HOMENHC/tcvitals successfully updated by syndat_qctropcy" +${HOMENHC}/tcvitals successfully updated by syndat_qctropcy" fi set +x @@ -357,7 +341,7 @@ $HOMENHC/tcvitals successfully updated by syndat_qctropcy" else - msg="Previous NHC Synthetic Data Record File $HOMENHC/tcvitals \ + msg="Previous NHC Synthetic Data Record File ${HOMENHC}/tcvitals \ not changed by syndat_qctropcy" set +x echo diff --git a/ush/tropcy_relocate.sh b/ush/tropcy_relocate.sh index 01a21bd12c..29ffc32797 100755 --- a/ush/tropcy_relocate.sh +++ b/ush/tropcy_relocate.sh @@ -84,20 +84,13 @@ # envir String indicating environment under which job runs ('prod' # or 'test') # Default is "prod" -# HOMEALL String indicating parent directory path for some or -# all files under which job runs. -# If the imported variable MACHINE!=sgi, then the default is -# "/nw${envir}"; otherwise the default is -# "/disk1/users/snake/prepobs" -# HOMERELO String indicating parent directory path for relocation -# specific files. 
(May be under HOMEALL) # envir_getges String indicating environment under which GETGES utility -# ush runs (see documentation in $USHGETGES/getges.sh for +# ush runs (see documentation in ${USHgfs}/getges.sh for # more information) # Default is "$envir" # network_getges # String indicating job network under which GETGES utility -# ush runs (see documentation in $USHGETGES/getges.sh for +# ush runs (see documentation in ${USHgfs}/getges.sh for # more information) # Default is "global" unless the center relocation processing # date/time is not a multiple of 3-hrs, then the default is @@ -122,34 +115,20 @@ # POE_OPTS String indicating options to use with poe command # Default is "-pgmmodel mpmd -ilevel 2 -labelio yes \ # -stdoutmode ordered" -# USHGETGES String indicating directory path for GETGES utility ush -# file -# USHRELO String indicating directory path for RELOCATE ush files -# Default is "${HOMERELO}/ush" -# EXECRELO String indicating directory path for RELOCATE executables -# Default is "${HOMERELO}/exec" -# FIXRELO String indicating directory path for RELOCATE data fix- -# field files -# Default is "${HOMERELO}/fix" -# EXECUTIL String indicating directory path for utility program -# executables -# If the imported variable MACHINE!=sgi, then the default is -# "/nwprod/util/exec"; otherwise the default is -# "${HOMEALL}/util/exec" # RELOX String indicating executable path for RELOCATE_MV_NVORTEX # program -# Default is "$EXECRELO/relocate_mv_nvortex" +# Default is "${EXECgfs}/relocate_mv_nvortex" # SUPVX String indicating executable path for SUPVIT utility # program -# Default is "$EXECUTIL/supvit.x" +# Default is "${EXECgfs}/supvit.x" # GETTX String indicating executable path for GETTRK utility # program -# Default is "$EXECUTIL/gettrk" +# Default is "${EXECgfs}/gettrk" # BKGFREQ Frequency of background files for relocation # Default is "3" # SENDDBN String when set to "YES" alerts output files to $COMSP # NDATE String indicating executable path for NDATE 
utility program -# Default is "$EXECUTIL/ndate" +# Default is "${EXECgfs}/ndate" # # These do not have to be exported to this script. If they are, they will # be used by the script. If they are not, they will be skipped @@ -166,18 +145,18 @@ # # Modules and files referenced: # Herefile: RELOCATE_GES -# $USHRELO/tropcy_relocate_extrkr.sh -# $USHGETGES/getges.sh +# ${USHgfs}/tropcy_relocate_extrkr.sh +# ${USHgfs}/getges.sh # $NDATE (here and in child script -# $USHRELO/tropcy_relocate_extrkr.sh) +# ${USHgfs}/tropcy_relocate_extrkr.sh) # /usr/bin/poe # postmsg # $DATA/prep_step (here and in child script -# $USHRELO/tropcy_relocate_extrkr.sh) +# ${USHgfs}/tropcy_relocate_extrkr.sh) # $DATA/err_exit (here and in child script -# $USHRELO/tropcy_relocate_extrkr.sh) +# ${USHgfs}/tropcy_relocate_extrkr.sh) # $DATA/err_chk (here and in child script -# $USHRELO/tropcy_relocate_extrkr.sh) +# ${USHgfs}/tropcy_relocate_extrkr.sh) # NOTE: The last three scripts above are NOT REQUIRED utilities. # If $DATA/prep_step not found, a scaled down version of it is # executed in-line. 
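[editor's note, not part of the patch] The defaults being rewritten in this hunk (`RELOX=${RELOX:-${EXECgfs}/relocate_mv_nvortex}` and friends) all rely on bash default parameter expansion, so a value exported by the calling job always wins over the in-script default. A minimal sketch of the idiom — the `/opt/gfs/exec` prefix is illustrative, not the real install location:

```shell
# ${VAR:-default} keeps VAR when it is set and non-empty,
# otherwise substitutes the default.
EXECgfs="/opt/gfs/exec"          # hypothetical install prefix
unset RELOX
RELOX=${RELOX:-${EXECgfs}/relocate_mv_nvortex}
echo "${RELOX}"                  # prints /opt/gfs/exec/relocate_mv_nvortex

RELOX="/override/relocate"       # an operator-exported override
RELOX=${RELOX:-${EXECgfs}/relocate_mv_nvortex}
echo "${RELOX}"                  # prints /override/relocate
```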
If $DATA/err_exit or $DATA/err_chk are not @@ -188,7 +167,7 @@ # programs : # RELOCATE_MV_NVORTEX - executable $RELOX # T126 GRIB global land/sea mask: -# $FIXRELO/global_slmask.t126.grb +# ${FIXgfs}/am/global_slmask.t126.grb # SUPVIT - executable $SUPVX # GETTRK - executable $GETTX # @@ -204,7 +183,7 @@ # #### -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" MACHINE=${MACHINE:-$(hostname -s | cut -c 1-3)} @@ -275,14 +254,6 @@ set_trace envir=${envir:-prod} -if [ $MACHINE != sgi ]; then - HOMEALL=${HOMEALL:-$OPSROOT} -else - HOMEALL=${HOMEALL:-/disk1/users/snake/prepobs} -fi - -HOMERELO=${HOMERELO:-${shared_global_home}} - envir_getges=${envir_getges:-$envir} if [ $modhr -eq 0 ]; then network_getges=${network_getges:-global} @@ -295,21 +266,12 @@ pgmout=${pgmout:-/dev/null} tstsp=${tstsp:-/tmp/null/} tmmark=${tmmark:-tm00} -USHRELO=${USHRELO:-${HOMERELO}/ush} -##USHGETGES=${USHGETGES:-/nwprod/util/ush} -##USHGETGES=${USHGETGES:-${HOMERELO}/ush} -USHGETGES=${USHGETGES:-${USHRELO}} - -EXECRELO=${EXECRELO:-${HOMERELO}/exec} - -FIXRELO=${FIXRELO:-${HOMERELO}/fix} - -RELOX=${RELOX:-$EXECRELO/relocate_mv_nvortex} +RELOX=${RELOX:-${EXECgfs}/relocate_mv_nvortex} export BKGFREQ=${BKGFREQ:-1} -SUPVX=${SUPVX:-$EXECRELO/supvit.x} -GETTX=${GETTX:-$EXECRELO/gettrk} +SUPVX=${SUPVX:-${EXECgfs}/supvit.x} +GETTX=${GETTX:-${EXECgfs}/gettrk} ################################################ # EXECUTE TROPICAL CYCLONE RELOCATION PROCESSING @@ -355,7 +317,7 @@ echo " relocation processing date/time" echo "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA" echo set_trace - $USHGETGES/getges.sh -e $envir_getges -n $network_getges \ + ${USHgfs}/getges.sh -e $envir_getges -n $network_getges \ -v $CDATE10 -f $fhr -t tcvges tcvitals.m${fhr} set +x echo @@ -405,7 +367,7 @@ echo " relocation processing date/time" echo "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA" echo set_trace - $USHGETGES/getges.sh -e $envir_getges -n 
$network_getges \ + ${USHgfs}/getges.sh -e $envir_getges -n $network_getges \ -v $CDATE10 -t $stype $sges errges=$? if test $errges -ne 0; then @@ -439,7 +401,7 @@ to center relocation date/time;" # ---------------------------------------------------------------------------- if [ $fhr = "0" ]; then - "${USHGETGES}/getges.sh" -e "${envir_getges}" -n "${network_getges}" -v "${CDATE10}" \ + "${USHgfs}/getges.sh" -e "${envir_getges}" -n "${network_getges}" -v "${CDATE10}" \ -t "${stype}" > "${COM_OBS}/${RUN}.${cycle}.sgesprep_pre-relocate_pathname.${tmmark}" cp "${COM_OBS}/${RUN}.${cycle}.sgesprep_pre-relocate_pathname.${tmmark}" \ "${COM_OBS}/${RUN}.${cycle}.sgesprep_pathname.${tmmark}" @@ -459,7 +421,7 @@ echo " relocation processing date/time" echo "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA" echo set_trace - $USHGETGES/getges.sh -e $envir_getges -n $network_getges \ + ${USHgfs}/getges.sh -e $envir_getges -n $network_getges \ -v $CDATE10 -t $ptype $pges errges=$? if test $errges -ne 0; then @@ -541,7 +503,7 @@ else # $DATA/$RUN.$cycle.relocate.model_track.tm00 # -------------------------------------------- - $USHRELO/tropcy_relocate_extrkr.sh + ${USHgfs}/tropcy_relocate_extrkr.sh err=$? if [ $err -ne 0 ]; then @@ -550,12 +512,12 @@ else set +x echo - echo "$USHRELO/tropcy_relocate_extrkr.sh failed" + echo "${USHgfs}/tropcy_relocate_extrkr.sh failed" echo "ABNORMAL EXIT!!!!!!!!!!!" 
echo set_trace if [ -s $DATA/err_exit ]; then - $DATA/err_exit "Script $USHRELO/tropcy_relocate_extrkr.sh failed" + $DATA/err_exit "Script ${USHgfs}/tropcy_relocate_extrkr.sh failed" else exit 555 fi diff --git a/ush/tropcy_relocate_extrkr.sh b/ush/tropcy_relocate_extrkr.sh index ede2318c4a..8e6bc5283a 100755 --- a/ush/tropcy_relocate_extrkr.sh +++ b/ush/tropcy_relocate_extrkr.sh @@ -3,7 +3,7 @@ # This script is executed by the script tropcy_relocate.sh # -------------------------------------------------------- -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" export machine=${machine:-ZEUS} export machine=$(echo $machine|tr '[a-z]' '[A-Z]') @@ -1538,9 +1538,9 @@ ln -s -f ${vdir}/trak.${cmodel}.radii.${symdh} fort.63 ln -s -f ${vdir}/trak.${cmodel}.atcfunix.${symdh} fort.64 if [ $BKGFREQ -eq 1 ]; then - ln -s -f ${FIXRELO}/${cmodel}.tracker_leadtimes_hrly fort.15 + ln -s -f ${FIXgfs}/am/${cmodel}.tracker_leadtimes_hrly fort.15 elif [ $BKGFREQ -eq 3 ]; then - ln -s -f ${FIXRELO}/${cmodel}.tracker_leadtimes fort.15 + ln -s -f ${FIXgfs}/am/${cmodel}.tracker_leadtimes fort.15 fi ##$XLF_LINKSSH diff --git a/ush/wafs_mkgbl.sh b/ush/wafs_mkgbl.sh new file mode 100755 index 0000000000..026a01ffed --- /dev/null +++ b/ush/wafs_mkgbl.sh @@ -0,0 +1,152 @@ +# UTILITY SCRIPT NAME : wafs_mkgbl.sh +# AUTHOR : Mary Jacobs +# DATE WRITTEN : 11/06/96 +# +# Abstract: This utility script produces the GFS WAFS +# bulletins. +# +# Input: 2 arguments are passed to this script. +# 1st argument - Forecast Hour - format of 2I +# 2nd argument - In hours 12-30, the designator of +# a or b. +# +# Logic: If we are processing hours 12-30, we have the +# added variable of the a or b, and process +# accordingly. The other hours, the a or b is dropped. 
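[editor's note, not part of the patch] The new `ush/wafs_mkgbl.sh` pads forecast hours to three digits with `printf "%03d"` and a `10#` base prefix. The prefix matters because bash arithmetic treats a leading-zero numeral as octal, so an hour like `08` would otherwise be a "value too great for base" error. A minimal sketch (hour values illustrative):

```shell
# Pad forecast hours to three digits for GRIB2 file names.
# 10# forces base-10: without it, $(( 08 )) is an octal error in bash.
for hour in 6 08 120; do
  fhr3="$(printf "%03d" $(( 10#$hour )))"
  echo "${hour} -> ${fhr3}"
done
# prints: 6 -> 006, 08 -> 008, 120 -> 120
```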
+# +echo "History: SEPT 1996 - First implementation of this utility script" +echo "History: AUG 1999 - Modified for implementation on IBM SP" +echo " - Allows users to run interactively" +# + +set -x +hour_list="$1" +sets_key=$2 +num=$# + +if test $num -ge 2 +then + echo " Appropriate number of arguments were passed" + set -x + if [ -z "$DATA" ] + then + export DATA=`pwd` + cd $DATA + setpdy.sh + . PDY + fi +else + echo "" + echo "Usage: wafs_mkgbl.sh \$hour [a|b]" + echo "" + exit 16 +fi + +echo " ------------------------------------------" +echo " BEGIN MAKING ${NET} WAFS PRODUCTS" +echo " ------------------------------------------" + +echo "Enter Make WAFS utility." + +for hour in $hour_list +do + ############################## + # Copy Input Field to $DATA + ############################## + + if test ! -f pgrbf${hour} + then +# cp $COMIN/${RUN}.${cycle}.pgrbf${hour} pgrbf${hour} + +# file name and forecast hour of GFS model data in Grib2 are 3 digits +# export fhr3=$hour +# if test $fhr3 -lt 100 +# then +# export fhr3="0$fhr3" +# fi + fhr3="$(printf "%03d" $(( 10#$hour )) )" + +# To solve Bugzilla #408: remove the dependency of grib1 files in gfs wafs job in next GFS upgrade +# Reason: It's not efficient if simply converting from grib2 to grib1 (costs 6 seconds with 415 records) +# Solution: Need to grep 'selected fields on selected levels' before CNVGRIB (costs 1 second with 92 records) + ln -s $COMIN/${RUN}.${cycle}.pgrb2.1p00.f$fhr3 pgrb2f${hour} + $WGRIB2 pgrb2f${hour} | grep -F -f $FIXgfs/grib_wafs.grb2to1.list | $WGRIB2 -i pgrb2f${hour} -grib pgrb2f${hour}.tmp +# on Cray, IOBUF_PARAMS has to be used to speed up CNVGRIB +# export IOBUF_PARAMS='*:size=32M:count=4:verbose' + $CNVGRIB -g21 pgrb2f${hour}.tmp pgrbf${hour} +# unset IOBUF_PARAMS + fi + + # + # BAG - Put in fix on 20070925 to force the precision of U and V winds + # to default to 1 through the use of the grib_wafs.namelist file.
+ # + $COPYGB -g3 -i0 -N$FIXgfs/grib_wafs.namelist -x pgrbf${hour} tmp + mv tmp pgrbf${hour} + $GRBINDEX pgrbf${hour} pgrbif${hour} + + ############################## + # Process WAFS + ############################## + + if test $hour -ge '12' -a $hour -le '30' + then + sets=$sets_key + set +x + echo "We are processing the primary and secondary sets of hours." + echo "These sets are the a and b of hours 12-30." + set -x + else + # This is for hours 00/06 and 36-72. + unset sets + fi + + export pgm=wafs_makewafs + . prep_step + + export FORT11="pgrbf${hour}" + export FORT31="pgrbif${hour}" + export FORT51="xtrn.wfs${NET}${hour}${sets}" + export FORT53="com.wafs${hour}${sets}" + + startmsg + $EXECgfs/wafs_makewafs.x < $FIXgfs/grib_wfs${NET}${hour}${sets} >>$pgmout 2>errfile + export err=$?;err_chk + + + ############################## + # Post Files to PCOM + ############################## + + if test "$SENDCOM" = 'YES' + then + cp xtrn.wfs${NET}${hour}${sets} $PCOM/xtrn.wfs${NET}${cyc}${hour}${sets}.$jobsuffix +# cp com.wafs${hour}${sets} $PCOM/com.wafs${cyc}${hour}${sets}.$jobsuffix + +# if test "$SENDDBN_NTC" = 'YES' +# then +# if test "$NET" = 'gfs' +# then +# $DBNROOT/bin/dbn_alert MODEL GFS_WAFS $job \ +# $PCOM/com.wafs${cyc}${hour}${sets}.$jobsuffix +# $DBNROOT/bin/dbn_alert MODEL GFS_XWAFS $job \ +# $PCOM/xtrn.wfs${NET}${cyc}${hour}${sets}.$jobsuffix +# fi +# fi + fi + + ############################## + # Distribute Data + ############################## + + if [ "$SENDDBN_NTC" = 'YES' ] ; then + $DBNROOT/bin/dbn_alert GRIB_LOW $NET $job $PCOM/xtrn.wfs${NET}${cyc}${hour}${sets}.$jobsuffix + else + echo "xtrn.wfs${NET}${cyc}${hour}${sets}.$job file not posted to db_net." 
+ fi + + echo "Wafs Processing $hour hour completed normally" + +done + +exit diff --git a/ush/wave_grib2_sbs.sh b/ush/wave_grib2_sbs.sh index af28760269..6b459cb1b0 100755 --- a/ush/wave_grib2_sbs.sh +++ b/ush/wave_grib2_sbs.sh @@ -25,7 +25,7 @@ # --------------------------------------------------------------------------- # # 0. Preparations -source "${HOMEgfs}/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # 0.a Basic modes of operation @@ -82,7 +82,7 @@ if [[ ! -s "${COM_WAVE_GRID}/${outfile}.idx" ]]; then echo " Model ID : $WAV_MOD_TAG" set_trace - if [[ -z "${PDY}" ]] || [[ -z ${cyc} ]] || [[ -z "${cycle}" ]] || [[ -z "${EXECwave}" ]] || \ + if [[ -z "${PDY}" ]] || [[ -z ${cyc} ]] || [[ -z "${cycle}" ]] || [[ -z "${EXECgfs}" ]] || \ [[ -z "${COM_WAVE_GRID}" ]] || [[ -z "${WAV_MOD_TAG}" ]] || [[ -z "${gribflags}" ]] || \ [[ -z "${GRIDNR}" ]] || [[ -z "${MODNR}" ]] || \ [[ -z "${SENDDBN}" ]]; then @@ -138,11 +138,11 @@ if [[ ! -s "${COM_WAVE_GRID}/${outfile}.idx" ]]; then set +x echo " Run ww3_grib2" - echo " Executing ${EXECwave}/ww3_grib" + echo " Executing ${EXECgfs}/ww3_grib" set_trace export pgm=ww3_grib;. prep_step - "${EXECwave}/ww3_grib" > "grib2_${grdnam}_${FH3}.out" 2>&1 + "${EXECgfs}/ww3_grib" > "grib2_${grdnam}_${FH3}.out" 2>&1 export err=$?;err_chk if [ ! -s gribfile ]; then diff --git a/ush/wave_grid_interp_sbs.sh b/ush/wave_grid_interp_sbs.sh index c11a75f89d..db2918f924 100755 --- a/ush/wave_grid_interp_sbs.sh +++ b/ush/wave_grid_interp_sbs.sh @@ -25,7 +25,7 @@ # --------------------------------------------------------------------------- # # 0. 
Preparations -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # 0.a Basic modes of operation @@ -65,7 +65,7 @@ source "$HOMEgfs/ush/preamble.sh" echo " Model ID : $WAV_MOD_TAG" set_trace - if [[ -z "${PDY}" ]] || [[ -z "${cyc}" ]] || [[ -z "${cycle}" ]] || [[ -z "${EXECwave}" ]] || \ + if [[ -z "${PDY}" ]] || [[ -z "${cyc}" ]] || [[ -z "${cycle}" ]] || [[ -z "${EXECgfs}" ]] || \ [[ -z "${COM_WAVE_PREP}" ]] || [[ -z "${WAV_MOD_TAG}" ]] || [[ -z "${SENDDBN}" ]] || \ [ -z "${waveGRD}" ] then @@ -75,7 +75,7 @@ source "$HOMEgfs/ush/preamble.sh" echo '*** EXPORTED VARIABLES IN postprocessor NOT SET ***' echo '***************************************************' echo ' ' - echo "${PDY}${cyc} ${cycle} ${EXECwave} ${COM_WAVE_PREP} ${WAV_MOD_TAG} ${SENDDBN} ${waveGRD}" + echo "${PDY}${cyc} ${cycle} ${EXECgfs} ${COM_WAVE_PREP} ${WAV_MOD_TAG} ${SENDDBN} ${waveGRD}" set_trace exit 1 fi @@ -85,7 +85,7 @@ source "$HOMEgfs/ush/preamble.sh" rm -f ${DATA}/output_${ymdh}0000/out_grd.$grdID if [ ! -f ${DATA}/${grdID}_interp.inp.tmpl ]; then - cp $PARMwave/${grdID}_interp.inp.tmpl ${DATA} + cp ${PARMgfs}/wave/${grdID}_interp.inp.tmpl ${DATA} fi ln -sf ${DATA}/${grdID}_interp.inp.tmpl . @@ -113,18 +113,18 @@ source "$HOMEgfs/ush/preamble.sh" wht_OK='no' if [ ! 
-f ${DATA}/ww3_gint.WHTGRIDINT.bin.${grdID} ]; then - if [ -f $FIXwave/ww3_gint.WHTGRIDINT.bin.${grdID} ] + if [ -f ${FIXgfs}/wave/ww3_gint.WHTGRIDINT.bin.${grdID} ] then set +x echo ' ' - echo " Copying $FIXwave/ww3_gint.WHTGRIDINT.bin.${grdID} " + echo " Copying ${FIXgfs}/wave/ww3_gint.WHTGRIDINT.bin.${grdID} " set_trace - cp $FIXwave/ww3_gint.WHTGRIDINT.bin.${grdID} ${DATA} + cp ${FIXgfs}/wave/ww3_gint.WHTGRIDINT.bin.${grdID} ${DATA} wht_OK='yes' else set +x echo ' ' - echo " Not found: $FIXwave/ww3_gint.WHTGRIDINT.bin.${grdID} " + echo " Not found: ${FIXgfs}/wave/ww3_gint.WHTGRIDINT.bin.${grdID} " fi fi # Check and link weights file @@ -137,18 +137,18 @@ source "$HOMEgfs/ush/preamble.sh" set +x echo " Run ww3_gint - echo " Executing $EXECwave/ww3_gint + echo " Executing ${EXECgfs}/ww3_gint set_trace export pgm=ww3_gint;. prep_step - $EXECwave/ww3_gint 1> gint.${grdID}.out 2>&1 + ${EXECgfs}/ww3_gint 1> gint.${grdID}.out 2>&1 export err=$?;err_chk # Write interpolation file to main TEMP dir area if not there yet if [ "wht_OK" = 'no' ] then cp -f ./WHTGRIDINT.bin ${DATA}/ww3_gint.WHTGRIDINT.bin.${grdID} - cp -f ./WHTGRIDINT.bin ${FIXwave}/ww3_gint.WHTGRIDINT.bin.${grdID} + cp -f ./WHTGRIDINT.bin ${FIXgfs}/wave/ww3_gint.WHTGRIDINT.bin.${grdID} fi diff --git a/ush/wave_grid_moddef.sh b/ush/wave_grid_moddef.sh index 5b1b212a16..e895666d66 100755 --- a/ush/wave_grid_moddef.sh +++ b/ush/wave_grid_moddef.sh @@ -20,7 +20,7 @@ # --------------------------------------------------------------------------- # # 0. Preparations -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # 0.a Basic modes of operation @@ -59,7 +59,7 @@ source "$HOMEgfs/ush/preamble.sh" # 0.c Define directories and the search path. # The tested variables should be exported by the postprocessor script. 
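[editor's note, not part of the patch] The wave scripts touched in these hunks guard against a missing environment by testing every required variable with `[[ -z ... ]]` and exiting before doing any work. The same pattern can be sketched as a reusable helper — the function name and variable values below are illustrative, not part of the workflow:

```shell
# Report every required variable that is unset or empty, and fail
# if any is missing (uses bash indirect expansion ${!name}).
require_vars() {
  local rc=0 name
  for name in "$@"; do
    if [ -z "${!name:-}" ]; then
      echo "ERROR: required variable ${name} is not set" >&2
      rc=1
    fi
  done
  return "${rc}"
}

PDY=20240402 cyc=00 EXECgfs=/opt/gfs/exec   # illustrative values
require_vars PDY cyc EXECgfs && echo "all required variables set"
```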
- if [ -z "$grdID" ] || [ -z "$EXECwave" ] || [ -z "$wave_sys_ver" ] + if [ -z "$grdID" ] || [ -z "${EXECgfs}" ] then set +x echo ' ' @@ -77,14 +77,22 @@ source "$HOMEgfs/ush/preamble.sh" set +x echo ' ' echo ' Creating mod_def file ...' - echo " Executing $EXECwave/ww3_grid" + echo " Executing ${EXECgfs}/ww3_grid" echo ' ' set_trace rm -f ww3_grid.inp ln -sf ../ww3_grid.inp.$grdID ww3_grid.inp + + if [ -f ../${grdID}.msh ] + then + rm -f ${grdID}.msh + ln -sf ../${grdID}.msh ${grdID}.msh + fi + + - $EXECwave/ww3_grid 1> grid_${grdID}.out 2>&1 + "${EXECgfs}/ww3_grid" 1> "grid_${grdID}.out" 2>&1 err=$? if [ "$err" != '0' ] @@ -99,10 +107,10 @@ source "$HOMEgfs/ush/preamble.sh" exit 3 fi - if [ -f mod_def.ww3 ] + if [[ -f mod_def.ww3 ]] then cp mod_def.ww3 "${COM_WAVE_PREP}/${RUN}wave.mod_def.${grdID}" - mv mod_def.ww3 ../mod_def.$grdID + mv mod_def.ww3 "../mod_def.${grdID}" else set +x echo ' ' @@ -118,6 +126,6 @@ source "$HOMEgfs/ush/preamble.sh" # 3. Clean up cd .. -rm -rf moddef_$grdID +rm -rf "moddef_${grdID}" # End of ww3_mod_def.sh ------------------------------------------------- # diff --git a/ush/wave_outp_cat.sh b/ush/wave_outp_cat.sh index f4bf6b2294..6ce3ce06cf 100755 --- a/ush/wave_outp_cat.sh +++ b/ush/wave_outp_cat.sh @@ -21,7 +21,7 @@ # --------------------------------------------------------------------------- # # 0. Preparations -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # 0.a Basic modes of operation bloc=$1 diff --git a/ush/wave_outp_spec.sh b/ush/wave_outp_spec.sh index 5acc0f95ab..91cd722c10 100755 --- a/ush/wave_outp_spec.sh +++ b/ush/wave_outp_spec.sh @@ -22,7 +22,7 @@ # --------------------------------------------------------------------------- # # 0. Preparations -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # 0.a Basic modes of operation bloc=$1 @@ -104,7 +104,7 @@ source "$HOMEgfs/ush/preamble.sh" # 0.c Define directories and the search path. 
# The tested variables should be exported by the postprocessor script. - if [ -z "$CDATE" ] || [ -z "$dtspec" ] || [ -z "$EXECwave" ] || \ + if [ -z "$CDATE" ] || [ -z "$dtspec" ] || [ -z "${EXECgfs}" ] || \ [ -z "$WAV_MOD_TAG" ] || [ -z "${STA_DIR}" ] then set +x @@ -170,11 +170,11 @@ source "$HOMEgfs/ush/preamble.sh" # 2.b Run the postprocessor set +x - echo " Executing $EXECwave/ww3_outp" + echo " Executing ${EXECgfs}/ww3_outp" set_trace export pgm=ww3_outp;. prep_step - $EXECwave/ww3_outp 1> outp_${specdir}_${buoy}.out 2>&1 + ${EXECgfs}/ww3_outp 1> outp_${specdir}_${buoy}.out 2>&1 export err=$?;err_chk diff --git a/ush/wave_prnc_cur.sh b/ush/wave_prnc_cur.sh index 6b1ab19db2..652d1be817 100755 --- a/ush/wave_prnc_cur.sh +++ b/ush/wave_prnc_cur.sh @@ -22,7 +22,7 @@ ################################################################################ # -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" ymdh_rtofs=$1 curfile=$2 @@ -46,7 +46,7 @@ mv -f "cur_temp3.nc" "cur_uv_${PDY}_${fext}${fh3}_flat.nc" # Convert to regular lat lon file # If weights need to be regenerated due to CDO ver change, use: # $CDO genbil,r4320x2160 rtofs_glo_2ds_f000_3hrly_prog.nc weights.nc -cp ${FIXwave}/weights_rtofs_to_r4320x2160.nc ./weights.nc +cp ${FIXgfs}/wave/weights_rtofs_to_r4320x2160.nc ./weights.nc # Interpolate to regular 5 min grid ${CDO} remap,r4320x2160,weights.nc "cur_uv_${PDY}_${fext}${fh3}_flat.nc" "cur_5min_01.nc" @@ -65,9 +65,9 @@ rm -f cur_temp[123].nc cur_5min_??.nc "cur_glo_uv_${PDY}_${fext}${fh3}.nc weight if [ ${flagfirst} = "T" ] then - sed -e "s/HDRFL/T/g" ${PARMwave}/ww3_prnc.cur.${WAVECUR_FID}.inp.tmpl > ww3_prnc.inp + sed -e "s/HDRFL/T/g" ${PARMgfs}/wave/ww3_prnc.cur.${WAVECUR_FID}.inp.tmpl > ww3_prnc.inp else - sed -e "s/HDRFL/F/g" ${PARMwave}/ww3_prnc.cur.${WAVECUR_FID}.inp.tmpl > ww3_prnc.inp + sed -e "s/HDRFL/F/g" ${PARMgfs}/wave/ww3_prnc.cur.${WAVECUR_FID}.inp.tmpl > ww3_prnc.inp fi rm -f cur.nc @@ -75,7 +75,7 @@ ln -s 
"cur_glo_uv_${PDY}_${fext}${fh3}_5min.nc" "cur.nc" ln -s "${DATA}/mod_def.${WAVECUR_FID}" ./mod_def.ww3 export pgm=ww3_prnc;. prep_step -$EXECwave/ww3_prnc 1> prnc_${WAVECUR_FID}_${ymdh_rtofs}.out 2>&1 +${EXECgfs}/ww3_prnc 1> prnc_${WAVECUR_FID}_${ymdh_rtofs}.out 2>&1 export err=$?; err_chk diff --git a/ush/wave_prnc_ice.sh b/ush/wave_prnc_ice.sh index 5ec1d7fc2e..e5efaf3042 100755 --- a/ush/wave_prnc_ice.sh +++ b/ush/wave_prnc_ice.sh @@ -27,7 +27,7 @@ # --------------------------------------------------------------------------- # # 0. Preparations -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # 0.a Basic modes of operation @@ -55,7 +55,7 @@ source "$HOMEgfs/ush/preamble.sh" echo "Making ice fields." if [[ -z "${YMDH}" ]] || [[ -z "${cycle}" ]] || \ - [[ -z "${COM_WAVE_PREP}" ]] || [[ -z "${FIXwave}" ]] || [[ -z "${EXECwave}" ]] || \ + [[ -z "${COM_WAVE_PREP}" ]] || [[ -z "${FIXgfs}" ]] || [[ -z "${EXECgfs}" ]] || \ [[ -z "${WAV_MOD_TAG}" ]] || [[ -z "${WAVEICE_FID}" ]] || [[ -z "${COM_OBS}" ]]; then set +x @@ -144,7 +144,7 @@ source "$HOMEgfs/ush/preamble.sh" export pgm=ww3_prnc;. prep_step - $EXECwave/ww3_prnc 1> prnc_${WAVEICE_FID}_${cycle}.out 2>&1 + ${EXECgfs}/ww3_prnc 1> prnc_${WAVEICE_FID}_${cycle}.out 2>&1 export err=$?; err_chk if [ "$err" != '0' ] diff --git a/ush/wave_tar.sh b/ush/wave_tar.sh index 1a8d6d6cc5..bb8836df2c 100755 --- a/ush/wave_tar.sh +++ b/ush/wave_tar.sh @@ -25,7 +25,7 @@ # --------------------------------------------------------------------------- # # 0. 
Preparations -source "$HOMEgfs/ush/preamble.sh" +source "${USHgfs}/preamble.sh" # 0.a Basic modes of operation diff --git a/versions/build.hera.ver b/versions/build.hera.ver index ff85b1a801..337d5c32da 100644 --- a/versions/build.hera.ver +++ b/versions/build.hera.ver @@ -1,3 +1,5 @@ export stack_intel_ver=2021.5.0 export stack_impi_ver=2021.5.1 +export spack_env=gsi-addon-dev-rocky8 source "${HOMEgfs:-}/versions/build.spack.ver" +export spack_mod_path="/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-${spack_stack_ver}/envs/${spack_env}/install/modulefiles/Core" diff --git a/versions/build.hercules.ver b/versions/build.hercules.ver index 5513466631..cab0c92111 100644 --- a/versions/build.hercules.ver +++ b/versions/build.hercules.ver @@ -1,3 +1,6 @@ export stack_intel_ver=2021.9.0 export stack_impi_ver=2021.9.0 +export intel_mkl_ver=2023.1.0 +export spack_env=gsi-addon-env source "${HOMEgfs:-}/versions/build.spack.ver" +export spack_mod_path="/work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-${spack_stack_ver}/envs/${spack_env}/install/modulefiles/Core" diff --git a/versions/build.jet.ver b/versions/build.jet.ver index ff85b1a801..55c0ea0bd1 100644 --- a/versions/build.jet.ver +++ b/versions/build.jet.ver @@ -1,3 +1,5 @@ export stack_intel_ver=2021.5.0 export stack_impi_ver=2021.5.1 +export spack_env=gsi-addon-dev source "${HOMEgfs:-}/versions/build.spack.ver" +export spack_mod_path="/lfs4/HFIP/hfv3gfs/role.epic/spack-stack/spack-stack-${spack_stack_ver}/envs/${spack_env}/install/modulefiles/Core" diff --git a/versions/build.orion.ver b/versions/build.orion.ver index ff85b1a801..df7856110d 100644 --- a/versions/build.orion.ver +++ b/versions/build.orion.ver @@ -1,3 +1,5 @@ -export stack_intel_ver=2021.5.0 +export stack_intel_ver=2022.0.2 export stack_impi_ver=2021.5.1 +export spack_env=gsi-addon-env source "${HOMEgfs:-}/versions/build.spack.ver" +export 
spack_mod_path="/work/noaa/epic/role-epic/spack-stack/orion/spack-stack-${spack_stack_ver}/envs/${spack_env}/install/modulefiles/Core" diff --git a/versions/build.s4.ver b/versions/build.s4.ver index a0aae51d87..e2731ccfb3 100644 --- a/versions/build.s4.ver +++ b/versions/build.s4.ver @@ -1,3 +1,5 @@ export stack_intel_ver=2021.5.0 export stack_impi_ver=2021.5.0 +export spack_env=gsi-addon-env source "${HOMEgfs:-}/versions/build.spack.ver" +export spack_mod_path="/data/prod/jedi/spack-stack/spack-stack-${spack_stack_ver}/envs/${spack_env}/install/modulefiles/Core" diff --git a/versions/build.spack.ver b/versions/build.spack.ver index fb5b244bf5..808f85dd16 100644 --- a/versions/build.spack.ver +++ b/versions/build.spack.ver @@ -1,5 +1,4 @@ -export spack_stack_ver=1.5.1 -export spack_env=gsi-addon +export spack_stack_ver=1.6.0 export cmake_ver=3.23.1 @@ -11,7 +10,7 @@ export fms_ver=2023.02.01 export hdf5_ver=1.14.0 export netcdf_c_ver=4.9.2 -export netcdf_fortran_ver=4.6.0 +export netcdf_fortran_ver=4.6.1 export bacio_ver=2.4.1 export nemsio_ver=2.5.4 @@ -19,10 +18,10 @@ export sigio_ver=2.3.2 export w3emc_ver=2.10.0 export bufr_ver=11.7.0 export g2_ver=3.4.5 -export sp_ver=2.3.3 +export sp_ver=2.5.0 export ip_ver=4.3.0 export gsi_ncdiag_ver=1.1.2 export g2tmpl_ver=1.10.2 -export crtm_ver=2.4.0 +export crtm_ver=2.4.0.1 export wgrib2_ver=2.0.8 export grib_util_ver=1.3.0 diff --git a/versions/fix.ver b/versions/fix.ver index 13d9b56dd2..d2828518bc 100644 --- a/versions/fix.ver +++ b/versions/fix.ver @@ -10,13 +10,14 @@ export datm_ver=20220805 export gdas_crtm_ver=20220805 export gdas_fv3jedi_ver=20220805 export gdas_gsibec_ver=20221031 +export gdas_obs_ver=20240213 export glwu_ver=20220805 -export gsi_ver=20230911 +export gsi_ver=20240208 export lut_ver=20220805 export mom6_ver=20231219 export orog_ver=20231027 export reg2grb2_ver=20220805 export sfc_climo_ver=20220805 -export ugwd_ver=20220805 +export ugwd_ver=20231027 export verif_ver=20220805 export 
wave_ver=20240105 diff --git a/versions/run.hera.ver b/versions/run.hera.ver index 43443ba715..be6a594792 100644 --- a/versions/run.hera.ver +++ b/versions/run.hera.ver @@ -1,13 +1,12 @@ export stack_intel_ver=2021.5.0 export stack_impi_ver=2021.5.1 +export spack_env=gsi-addon-dev-rocky8 export hpss_ver=hpss export ncl_ver=6.6.2 export R_ver=3.5.0 export gempak_ver=7.4.2 - -#For metplus jobs, not currently working with spack-stack -#export met_ver=9.1.3 -#export metplus_ver=3.1.1 +export perl_ver=5.38.0 source "${HOMEgfs:-}/versions/run.spack.ver" +export spack_mod_path="/scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-${spack_stack_ver}/envs/${spack_env}/install/modulefiles/Core" diff --git a/versions/run.hercules.ver b/versions/run.hercules.ver index 43f1b2181d..ee8e4f8aea 100644 --- a/versions/run.hercules.ver +++ b/versions/run.hercules.ver @@ -1,12 +1,7 @@ export stack_intel_ver=2021.9.0 export stack_impi_ver=2021.9.0 export intel_mkl_ver=2023.1.0 - -export ncl_ver=6.6.2 -export perl_ver=5.36.0 +export spack_env=gsi-addon-env source "${HOMEgfs:-}/versions/run.spack.ver" - -# wgrib2 and cdo are different on Hercules from all the other systems -export wgrib2_ver=3.1.1 -export cdo_ver=2.2.0 +export spack_mod_path="/work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-${spack_stack_ver}/envs/${spack_env}/install/modulefiles/Core" diff --git a/versions/run.jet.ver b/versions/run.jet.ver index 18a82cab4f..d5b98bf514 100644 --- a/versions/run.jet.ver +++ b/versions/run.jet.ver @@ -1,5 +1,6 @@ export stack_intel_ver=2021.5.0 export stack_impi_ver=2021.5.1 +export spack_env=gsi-addon-dev export hpss_ver= export ncl_ver=6.6.2 @@ -7,3 +8,4 @@ export R_ver=4.0.2 export gempak_ver=7.4.2 source "${HOMEgfs:-}/versions/run.spack.ver" +export spack_mod_path="/lfs4/HFIP/hfv3gfs/role.epic/spack-stack/spack-stack-${spack_stack_ver}/envs/${spack_env}/install/modulefiles/Core" diff --git a/versions/run.orion.ver b/versions/run.orion.ver index 7671bc028d..2fdeae8888 
100644 --- a/versions/run.orion.ver +++ b/versions/run.orion.ver @@ -1,11 +1,12 @@ export stack_intel_ver=2022.0.2 export stack_impi_ver=2021.5.1 +export spack_env=gsi-addon-env export ncl_ver=6.6.2 export gempak_ver=7.5.1 -#For metplus jobs, not currently working with spack-stack -#export met_ver=9.1.3 -#export metplus_ver=3.1.1 - source "${HOMEgfs:-}/versions/run.spack.ver" +export spack_mod_path="/work/noaa/epic/role-epic/spack-stack/orion/spack-stack-${spack_stack_ver}/envs/${spack_env}/install/modulefiles/Core" + +#cdo is older on Orion +export cdo_ver=2.0.5 diff --git a/versions/run.s4.ver b/versions/run.s4.ver index 56817ef439..6d0f4cbaca 100644 --- a/versions/run.s4.ver +++ b/versions/run.s4.ver @@ -1,6 +1,8 @@ export stack_intel_ver=2021.5.0 export stack_impi_ver=2021.5.0 +export spack_env=gsi-addon-env export ncl_ver=6.4.0-precompiled source "${HOMEgfs:-}/versions/run.spack.ver" +export spack_mod_path="/data/prod/jedi/spack-stack/spack-stack-${spack_stack_ver}/envs/${spack_env}/install/modulefiles/Core" diff --git a/versions/run.spack.ver b/versions/run.spack.ver index c1c13f58df..c00b7483cd 100644 --- a/versions/run.spack.ver +++ b/versions/run.spack.ver @@ -1,26 +1,31 @@ -export spack_stack_ver=1.5.1 -export spack_env=gsi-addon -export python_ver=3.10.8 +export spack_stack_ver=1.6.0 +export python_ver=3.11.6 export jasper_ver=2.0.32 export libpng_ver=1.6.37 -export cdo_ver=2.0.5 +export cdo_ver=2.2.0 export nco_ver=5.0.6 export hdf5_ver=1.14.0 export netcdf_c_ver=4.9.2 -export netcdf_fortran_ver=4.6.0 +export netcdf_fortran_ver=4.6.1 export bufr_ver=11.7.0 export gsi_ncdiag_ver=1.1.2 export g2tmpl_ver=1.10.2 -export crtm_ver=2.4.0 +export crtm_ver=2.4.0.1 export wgrib2_ver=2.0.8 export grib_util_ver=1.3.0 -export prod_util_ver=1.2.2 +export prod_util_ver=2.1.1 export py_netcdf4_ver=1.5.8 -export py_pyyaml_ver=5.4.1 +export py_pyyaml_ver=6.0 export py_jinja2_ver=3.1.2 +export py_pandas_ver=1.5.3 +export py_python_dateutil_ver=2.8.2 + +export 
met_ver=9.1.3 +export metplus_ver=3.1.1 +export py_xarray_ver=2023.7.0 export obsproc_run_ver=1.1.2 export prepobs_run_ver=1.0.1 diff --git a/versions/run.wcoss2.ver b/versions/run.wcoss2.ver index a188cdea74..0aaad3ec3d 100644 --- a/versions/run.wcoss2.ver +++ b/versions/run.wcoss2.ver @@ -39,6 +39,8 @@ export g2tmpl_ver=1.10.2 export ncdiag_ver=1.0.0 export crtm_ver=2.4.0 export wgrib2_ver=2.0.8 +export met_ver=9.1.3 +export metplus_ver=3.1.1 # Development-only below diff --git a/workflow/applications/applications.py b/workflow/applications/applications.py index d45b6a9abc..b20c5a7c28 100644 --- a/workflow/applications/applications.py +++ b/workflow/applications/applications.py @@ -145,6 +145,10 @@ def _source_configs(self, conf: Configuration) -> Dict[str, Any]: files += ['config.anal', 'config.eupd'] elif config in ['efcs']: files += ['config.fcst', 'config.efcs'] + elif config in ['atmanlinit', 'atmanlrun']: + files += ['config.atmanl', f'config.{config}'] + elif config in ['atmensanlinit', 'atmensanlrun']: + files += ['config.atmensanl', f'config.{config}'] elif 'wave' in config: files += ['config.wave', f'config.{config}'] else: diff --git a/workflow/applications/gefs.py b/workflow/applications/gefs.py index b2369e8dfc..0be4dc7124 100644 --- a/workflow/applications/gefs.py +++ b/workflow/applications/gefs.py @@ -14,13 +14,18 @@ def _get_app_configs(self): """ Returns the config_files that are involved in gefs """ - configs = ['stage_ic', 'fcst'] + configs = ['stage_ic', 'fcst', 'atmos_products'] if self.nens > 0: configs += ['efcs'] if self.do_wave: - configs += ['waveinit'] + configs += ['waveinit', 'wavepostsbs', 'wavepostpnt'] + if self.do_wave_bnd: + configs += ['wavepostbndpnt', 'wavepostbndpntbll'] + + if self.do_ocean or self.do_ice: + configs += ['oceanice_products'] return configs @@ -45,4 +50,18 @@ def get_task_names(self): if self.nens > 0: tasks += ['efcs'] + tasks += ['atmos_prod'] + + if self.do_ocean: + tasks += ['ocean_prod'] + + if 
self.do_ice: + tasks += ['ice_prod'] + + if self.do_wave: + tasks += ['wavepostsbs'] + if self.do_wave_bnd: + tasks += ['wavepostbndpnt', 'wavepostbndpntbll'] + tasks += ['wavepostpnt'] + return {f"{self._base['CDUMP']}": tasks} diff --git a/workflow/applications/gfs_cycled.py b/workflow/applications/gfs_cycled.py index 1ff6cc3723..e492aa0bfd 100644 --- a/workflow/applications/gfs_cycled.py +++ b/workflow/applications/gfs_cycled.py @@ -16,8 +16,9 @@ def __init__(self, conf: Configuration): self.do_jediatmvar = self._base.get('DO_JEDIATMVAR', False) self.do_jediatmens = self._base.get('DO_JEDIATMENS', False) self.do_jediocnvar = self._base.get('DO_JEDIOCNVAR', False) - self.do_jedilandda = self._base.get('DO_JEDILANDDA', False) + self.do_jedisnowda = self._base.get('DO_JEDISNOWDA', False) self.do_mergensst = self._base.get('DO_MERGENSST', False) + self.do_vrfy_oceanda = self._base.get('DO_VRFY_OCEANDA', False) self.lobsdiag_forenkf = False self.eupd_cdumps = None @@ -43,11 +44,12 @@ def _get_app_configs(self): if self.do_jediocnvar: configs += ['prepoceanobs', 'ocnanalprep', 'ocnanalbmat', - 'ocnanalrun', 'ocnanalchkpt', 'ocnanalpost', - 'ocnanalvrfy'] + 'ocnanalrun', 'ocnanalchkpt', 'ocnanalpost'] + if self.do_vrfy_oceanda: + configs += ['ocnanalvrfy'] - if self.do_ocean: - configs += ['ocnpost'] + if self.do_ocean or self.do_ice: + configs += ['oceanice_products'] configs += ['sfcanl', 'analcalc', 'fcst', 'upp', 'atmos_products', 'arch', 'cleanup'] @@ -83,7 +85,9 @@ def _get_app_configs(self): configs += ['metp'] if self.do_gempak: - configs += ['gempak', 'npoess'] + configs += ['gempak'] + if self.do_goes: + configs += ['npoess'] if self.do_bufrsnd: configs += ['postsnd'] @@ -103,8 +107,8 @@ def _get_app_configs(self): if self.do_aero: configs += ['aeroanlinit', 'aeroanlrun', 'aeroanlfinal'] - if self.do_jedilandda: - configs += ['preplandobs', 'landanl'] + if self.do_jedisnowda: + configs += ['prepsnowobs', 'snowanl'] if self.do_mos: configs += ['mos_stn_prep', 
'mos_grd_prep', 'mos_ext_stn_prep', 'mos_ext_grd_prep', @@ -137,16 +141,17 @@ def get_task_names(self): if self.do_jediocnvar: gdas_gfs_common_tasks_before_fcst += ['prepoceanobs', 'ocnanalprep', 'ocnanalbmat', 'ocnanalrun', - 'ocnanalchkpt', 'ocnanalpost', - 'ocnanalvrfy'] + 'ocnanalchkpt', 'ocnanalpost'] + if self.do_vrfy_oceanda: + gdas_gfs_common_tasks_before_fcst += ['ocnanalvrfy'] gdas_gfs_common_tasks_before_fcst += ['sfcanl', 'analcalc'] if self.do_aero: gdas_gfs_common_tasks_before_fcst += ['aeroanlinit', 'aeroanlrun', 'aeroanlfinal'] - if self.do_jedilandda: - gdas_gfs_common_tasks_before_fcst += ['preplandobs', 'landanl'] + if self.do_jedisnowda: + gdas_gfs_common_tasks_before_fcst += ['prepsnowobs', 'snowanl'] wave_prep_tasks = ['waveinit', 'waveprep'] wave_bndpnt_tasks = ['wavepostbndpnt', 'wavepostbndpntbll'] @@ -175,7 +180,7 @@ def get_task_names(self): if self.do_upp: gdas_tasks += ['atmupp'] - gdas_tasks += ['atmprod'] + gdas_tasks += ['atmos_prod'] if self.do_wave and 'gdas' in self.wave_cdumps: if self.do_wave_bnd: @@ -207,9 +212,15 @@ def get_task_names(self): gfs_tasks += ['atmanlupp', 'atmanlprod', 'fcst'] + if self.do_ocean: + gfs_tasks += ['ocean_prod'] + + if self.do_ice: + gfs_tasks += ['ice_prod'] + if self.do_upp: gfs_tasks += ['atmupp'] - gfs_tasks += ['atmprod'] + gfs_tasks += ['atmos_prod'] if self.do_goes: gfs_tasks += ['goesupp'] @@ -245,8 +256,9 @@ def get_task_names(self): gfs_tasks += ['gempak'] gfs_tasks += ['gempakmeta'] gfs_tasks += ['gempakncdcupapgif'] - gfs_tasks += ['npoess_pgrb2_0p5deg'] - gfs_tasks += ['gempakpgrb2spec'] + if self.do_goes: + gfs_tasks += ['npoess_pgrb2_0p5deg'] + gfs_tasks += ['gempakpgrb2spec'] if self.do_awips: gfs_tasks += ['awips_20km_1p0deg', 'awips_g2', 'fbwind'] @@ -321,9 +333,4 @@ def get_gfs_cyc_dates(base: Dict[str, Any]) -> Dict[str, Any]: base_out['EDATE_GFS'] = edate_gfs base_out['INTERVAL_GFS'] = interval_gfs - fhmax_gfs = {} - for hh in ['00', '06', '12', '18']: - fhmax_gfs[hh] = 
base.get(f'FHMAX_GFS_{hh}', base.get('FHMAX_GFS_00', 120)) - base_out['FHMAX_GFS'] = fhmax_gfs - return base_out diff --git a/workflow/applications/gfs_forecast_only.py b/workflow/applications/gfs_forecast_only.py index 1145863210..0a9648ee65 100644 --- a/workflow/applications/gfs_forecast_only.py +++ b/workflow/applications/gfs_forecast_only.py @@ -49,7 +49,7 @@ def _get_app_configs(self): configs += ['awips'] if self.do_ocean or self.do_ice: - configs += ['ocnpost'] + configs += ['oceanice_products'] if self.do_wave: configs += ['waveinit', 'waveprep', 'wavepostsbs', 'wavepostpnt'] @@ -100,7 +100,7 @@ def get_task_names(self): if self.do_upp: tasks += ['atmupp'] - tasks += ['atmprod'] + tasks += ['atmos_prod'] if self.do_goes: tasks += ['goesupp'] @@ -126,8 +126,11 @@ def get_task_names(self): if self.do_awips: tasks += ['awips_20km_1p0deg', 'awips_g2', 'fbwind'] - if self.do_ocean or self.do_ice: - tasks += ['ocnpost'] + if self.do_ocean: + tasks += ['ocean_prod'] + + if self.do_ice: + tasks += ['ice_prod'] if self.do_wave: if self.do_wave_bnd: diff --git a/workflow/create_experiment.py b/workflow/create_experiment.py index 7e0f350c0f..708cf432bf 100755 --- a/workflow/create_experiment.py +++ b/workflow/create_experiment.py @@ -63,7 +63,9 @@ def input_args(): formatter_class=ArgumentDefaultsHelpFormatter) parser.add_argument( - '--yaml', help='full path to yaml file describing the experiment configuration', type=Path, required=True) + '-y', '--yaml', help='full path to yaml file describing the experiment configuration', type=Path, required=True) + parser.add_argument( + '-o', '--overwrite', help='overwrite previously created experiment', action="store_true", required=False) return parser.parse_args() @@ -89,6 +91,9 @@ def input_args(): setup_expt_args.append(f"--{kk}") setup_expt_args.append(str(vv)) + if user_inputs.overwrite: + setup_expt_args.append("--overwrite") + logger.info(f"Call: setup_expt.main()") logger.debug(f"setup_expt.py {' 
'.join(setup_expt_args)}") setup_expt.main(setup_expt_args) diff --git a/workflow/hosts/hera_emc.yaml b/workflow/hosts/hera_emc.yaml index 71d444ed00..1393694153 100644 --- a/workflow/hosts/hera_emc.yaml +++ b/workflow/hosts/hera_emc.yaml @@ -22,3 +22,6 @@ ATARDIR: '/NCEPDEV/${HPSS_PROJECT}/1year/${USER}/${machine}/scratch/${PSLOT}' MAKE_NSSTBUFR: 'NO' MAKE_ACFTBUFR: 'NO' SUPPORTED_RESOLUTIONS: ['C1152', 'C768', 'C384', 'C192', 'C96', 'C48'] +COMINecmwf: /scratch1/NCEPDEV/global/glopara/data/external_gempak/ecmwf +COMINnam: /scratch1/NCEPDEV/global/glopara/data/external_gempak/nam +COMINukmet: /scratch1/NCEPDEV/global/glopara/data/external_gempak/ukmet diff --git a/workflow/hosts/hera_gsl.yaml b/workflow/hosts/hera_gsl.yaml index c12cac1559..37c014d2dd 100644 --- a/workflow/hosts/hera_gsl.yaml +++ b/workflow/hosts/hera_gsl.yaml @@ -2,8 +2,7 @@ BASE_GIT: '/scratch1/NCEPDEV/global/glopara/git' DMPDIR: '/scratch1/NCEPDEV/global/glopara/dump' BASE_CPLIC: '/scratch1/NCEPDEV/global/glopara/data/ICSDIR/prototype_ICs' PACKAGEROOT: '/scratch1/NCEPDEV/global/glopara/nwpara' -COMROOT: '/scratch1/NCEPDEV/global/glopara/com' -COMINsyn: '${COMROOT}/gfs/prod/syndat' +COMINsyn: '/scratch1/NCEPDEV/global/glopara/com/gfs/prod/syndat' HOMEDIR: '/scratch1/BMC/gsd-fv3-dev/NCEPDEV/global/${USER}' STMP: '${HOMEgfs}/FV3GFSrun/' PTMP: '${HOMEgfs}/FV3GFSrun/' @@ -23,3 +22,6 @@ ATARDIR: '/BMC/${HPSS_PROJECT}/1year/${USER}/${machine}/scratch/${PSLOT}' MAKE_NSSTBUFR: 'NO' MAKE_ACFTBUFR: 'NO' SUPPORTED_RESOLUTIONS: ['C1152', 'C768', 'C384', 'C192', 'C96', 'C48'] +COMINecmwf: /scratch1/NCEPDEV/global/glopara/data/external_gempak/ecmwf +COMINnam: /scratch1/NCEPDEV/global/glopara/data/external_gempak/nam +COMINukmet: /scratch1/NCEPDEV/global/glopara/data/external_gempak/ukmet diff --git a/workflow/hosts/hercules.yaml b/workflow/hosts/hercules.yaml index 58a9589f2f..2623672709 100644 --- a/workflow/hosts/hercules.yaml +++ b/workflow/hosts/hercules.yaml @@ -22,3 +22,6 @@ ATARDIR: 
'${NOSCRUB}/archive_rotdir/${PSLOT}' MAKE_NSSTBUFR: 'NO' MAKE_ACFTBUFR: 'NO' SUPPORTED_RESOLUTIONS: ['C1152', 'C768', 'C384', 'C192', 'C96', 'C48'] +COMINecmwf: /work/noaa/global/glopara/data/external_gempak/ecmwf +COMINnam: /work/noaa/global/glopara/data/external_gempak/nam +COMINukmet: /work/noaa/global/glopara/data/external_gempak/ukmet diff --git a/workflow/hosts/jet_emc.yaml b/workflow/hosts/jet_emc.yaml index 00c89b60a1..28b7571b32 100644 --- a/workflow/hosts/jet_emc.yaml +++ b/workflow/hosts/jet_emc.yaml @@ -22,3 +22,6 @@ ATARDIR: '/NCEPDEV/${HPSS_PROJECT}/1year/${USER}/${machine}/scratch/${PSLOT}' MAKE_NSSTBUFR: 'NO' MAKE_ACFTBUFR: 'NO' SUPPORTED_RESOLUTIONS: ['C384', 'C192', 'C96', 'C48'] +COMINecmwf: /mnt/lfs4/HFIP/hfv3gfs/glopara/data/external_gempak/ecmwf +COMINnam: /mnt/lfs4/HFIP/hfv3gfs/glopara/data/external_gempak/nam +COMINukmet: /mnt/lfs4/HFIP/hfv3gfs/glopara/data/external_gempak/ukmet diff --git a/workflow/hosts/jet_gsl.yaml b/workflow/hosts/jet_gsl.yaml index 9da6f2b2aa..e556ca4663 100644 --- a/workflow/hosts/jet_gsl.yaml +++ b/workflow/hosts/jet_gsl.yaml @@ -23,3 +23,6 @@ ATARDIR: '/BMC/${HPSS_PROJECT}/1year/${USER}/${machine}/scratch/${PSLOT}' MAKE_NSSTBUFR: 'NO' MAKE_ACFTBUFR: 'NO' SUPPORTED_RESOLUTIONS: ['C768', 'C384', 'C192', 'C96', 'C48'] +COMINecmwf: /mnt/lfs4/HFIP/hfv3gfs/glopara/data/external_gempak/ecmwf +COMINnam: /mnt/lfs4/HFIP/hfv3gfs/glopara/data/external_gempak/nam +COMINukmet: /mnt/lfs4/HFIP/hfv3gfs/glopara/data/external_gempak/ukmet diff --git a/workflow/hosts/orion.yaml b/workflow/hosts/orion.yaml index 4c08a878dc..dd95def386 100644 --- a/workflow/hosts/orion.yaml +++ b/workflow/hosts/orion.yaml @@ -22,3 +22,6 @@ ATARDIR: '${NOSCRUB}/archive_rotdir/${PSLOT}' MAKE_NSSTBUFR: 'NO' MAKE_ACFTBUFR: 'NO' SUPPORTED_RESOLUTIONS: ['C1152', 'C768', 'C384', 'C192', 'C96', 'C48'] +COMINecmwf: /work/noaa/global/glopara/data/external_gempak/ecmwf +COMINnam: /work/noaa/global/glopara/data/external_gempak/nam +COMINukmet: 
/work/noaa/global/glopara/data/external_gempak/ukmet diff --git a/workflow/hosts/wcoss2.yaml b/workflow/hosts/wcoss2.yaml index cfb141061c..ba203a8413 100644 --- a/workflow/hosts/wcoss2.yaml +++ b/workflow/hosts/wcoss2.yaml @@ -22,3 +22,6 @@ ATARDIR: '/NCEPDEV/${HPSS_PROJECT}/1year/${USER}/${machine}/scratch/${PSLOT}' MAKE_NSSTBUFR: 'NO' MAKE_ACFTBUFR: 'NO' SUPPORTED_RESOLUTIONS: ['C1152', 'C768', 'C384', 'C192', 'C96', 'C48'] +COMINecmwf: /lfs/h2/emc/global/noscrub/emc.global/data/external_gempak/ecmwf +COMINnam: /lfs/h2/emc/global/noscrub/emc.global/data/external_gempak/nam +COMINukmet: /lfs/h2/emc/global/noscrub/emc.global/data/external_gempak/ukmet diff --git a/workflow/rocoto/gefs_tasks.py b/workflow/rocoto/gefs_tasks.py index c46d9ad452..50b24f3578 100644 --- a/workflow/rocoto/gefs_tasks.py +++ b/workflow/rocoto/gefs_tasks.py @@ -75,7 +75,7 @@ def stage_ic(self): def waveinit(self): resources = self.get_resource('waveinit') - task_name = f'waveinit' + task_name = f'wave_init' task_dict = {'task_name': task_name, 'resources': resources, 'envars': self.envars, @@ -90,20 +90,18 @@ def waveinit(self): return task def fcst(self): - - # TODO: Add real dependencies dependencies = [] dep_dict = {'type': 'task', 'name': f'stage_ic'} dependencies.append(rocoto.add_dependency(dep_dict)) if self.app_config.do_wave: - dep_dict = {'type': 'task', 'name': f'waveinit'} + dep_dict = {'type': 'task', 'name': f'wave_init'} dependencies.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=dependencies) resources = self.get_resource('fcst') - task_name = f'fcst' + task_name = f'fcst_mem000' task_dict = {'task_name': task_name, 'resources': resources, 'dependency': dependencies, @@ -124,36 +122,270 @@ def efcs(self): dependencies.append(rocoto.add_dependency(dep_dict)) if self.app_config.do_wave: - dep_dict = {'type': 'task', 'name': f'waveinit'} + dep_dict = {'type': 'task', 'name': f'wave_init'} 
dependencies.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=dependencies) efcsenvars = self.envars.copy() - efcsenvars.append(rocoto.create_envar(name='ENSGRP', value='#grp#')) - - groups = self._get_hybgroups(self._base['NMEM_ENS'], self._configs['efcs']['NMEM_EFCSGRP']) - var_dict = {'grp': groups} + efcsenvars_dict = {'ENSMEM': '#member#', + 'MEMDIR': 'mem#member#' + } + for key, value in efcsenvars_dict.items(): + efcsenvars.append(rocoto.create_envar(name=key, value=str(value))) resources = self.get_resource('efcs') - task_name = f'efcs#grp#' + task_name = f'fcst_mem#member#' task_dict = {'task_name': task_name, 'resources': resources, 'dependency': dependencies, 'envars': efcsenvars, 'cycledef': 'gefs', - 'command': f'{self.HOMEgfs}/jobs/rocoto/efcs.sh', + 'command': f'{self.HOMEgfs}/jobs/rocoto/fcst.sh', 'job_name': f'{self.pslot}_{task_name}_@H', 'log': f'{self.rotdir}/logs/@Y@m@d@H/{task_name}.log', 'maxtries': '&MAXTRIES;' } - metatask_dict = {'task_name': 'efmn', - 'var_dict': var_dict, + member_var_dict = {'member': ' '.join([f"{mem:03d}" for mem in range(1, self.nmem + 1)])} + metatask_dict = {'task_name': 'fcst_ens', + 'var_dict': member_var_dict, 'task_dict': task_dict } task = rocoto.create_task(metatask_dict) return task + + def atmos_prod(self): + return self._atmosoceaniceprod('atmos') + + def ocean_prod(self): + return self._atmosoceaniceprod('ocean') + + def ice_prod(self): + return self._atmosoceaniceprod('ice') + + def _atmosoceaniceprod(self, component: str): + + products_dict = {'atmos': {'config': 'atmos_products', + 'history_path_tmpl': 'COM_ATMOS_MASTER_TMPL', + 'history_file_tmpl': f'{self.cdump}.t@Hz.master.grb2f#fhr#'}, + 'ocean': {'config': 'oceanice_products', + 'history_path_tmpl': 'COM_OCEAN_HISTORY_TMPL', + 'history_file_tmpl': f'{self.cdump}.ocean.t@Hz.6hr_avg.f#fhr#.nc'}, + 'ice': {'config': 'oceanice_products', + 'history_path_tmpl': 'COM_ICE_HISTORY_TMPL', + 
'history_file_tmpl': f'{self.cdump}.ice.t@Hz.6hr_avg.f#fhr#.nc'}} + + component_dict = products_dict[component] + config = component_dict['config'] + history_path_tmpl = component_dict['history_path_tmpl'] + history_file_tmpl = component_dict['history_file_tmpl'] + + resources = self.get_resource(config) + + history_path = self._template_to_rocoto_cycstring(self._base[history_path_tmpl], {'MEMDIR': 'mem#member#'}) + deps = [] + data = f'{history_path}/{history_file_tmpl}' + dep_dict = {'type': 'data', 'data': data, 'age': 120} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep=deps) + + postenvars = self.envars.copy() + postenvar_dict = {'ENSMEM': '#member#', + 'MEMDIR': 'mem#member#', + 'FHRLST': '#fhr#', + 'COMPONENT': component} + for key, value in postenvar_dict.items(): + postenvars.append(rocoto.create_envar(name=key, value=str(value))) + + task_name = f'{component}_prod_mem#member#_f#fhr#' + task_dict = {'task_name': task_name, + 'resources': resources, + 'dependency': dependencies, + 'envars': postenvars, + 'cycledef': 'gefs', + 'command': f'{self.HOMEgfs}/jobs/rocoto/{config}.sh', + 'job_name': f'{self.pslot}_{task_name}_@H', + 'log': f'{self.rotdir}/logs/@Y@m@d@H/{task_name}.log', + 'maxtries': '&MAXTRIES;'} + + fhrs = self._get_forecast_hours('gefs', self._configs[config]) + + # ocean/ice components do not have fhr 0 as they are averaged output + if component in ['ocean', 'ice']: + fhrs.remove(0) + + fhr_var_dict = {'fhr': ' '.join([f"{fhr:03d}" for fhr in fhrs])} + + fhr_metatask_dict = {'task_name': f'{component}_prod_#member#', + 'task_dict': task_dict, + 'var_dict': fhr_var_dict} + + member_var_dict = {'member': ' '.join([f"{mem:03d}" for mem in range(0, self.nmem + 1)])} + member_metatask_dict = {'task_name': f'{component}_prod', + 'task_dict': fhr_metatask_dict, + 'var_dict': member_var_dict} + + task = rocoto.create_task(member_metatask_dict) + + return task + + def wavepostsbs(self): + deps = [] + for 
wave_grid in self._configs['wavepostsbs']['waveGRD'].split(): + wave_hist_path = self._template_to_rocoto_cycstring(self._base["COM_WAVE_HISTORY_TMPL"], {'MEMDIR': 'mem#member#'}) + data = f'{wave_hist_path}/gefswave.out_grd.{wave_grid}.@Y@m@d.@H0000' + dep_dict = {'type': 'data', 'data': data} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) + + wave_post_envars = self.envars.copy() + postenvar_dict = {'ENSMEM': '#member#', + 'MEMDIR': 'mem#member#', + } + for key, value in postenvar_dict.items(): + wave_post_envars.append(rocoto.create_envar(name=key, value=str(value))) + + resources = self.get_resource('wavepostsbs') + + task_name = f'wave_post_grid_mem#member#' + task_dict = {'task_name': task_name, + 'resources': resources, + 'dependency': dependencies, + 'envars': wave_post_envars, + 'cycledef': 'gefs', + 'command': f'{self.HOMEgfs}/jobs/rocoto/wavepostsbs.sh', + 'job_name': f'{self.pslot}_{task_name}_@H', + 'log': f'{self.rotdir}/logs/@Y@m@d@H/{task_name}.log', + 'maxtries': '&MAXTRIES;' + } + + member_var_dict = {'member': ' '.join([str(mem).zfill(3) for mem in range(0, self.nmem + 1)])} + member_metatask_dict = {'task_name': 'wave_post_grid', + 'task_dict': task_dict, + 'var_dict': member_var_dict + } + + task = rocoto.create_task(member_metatask_dict) + + return task + + def wavepostbndpnt(self): + deps = [] + dep_dict = {'type': 'task', 'name': f'fcst_mem#member#'} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep=deps) + + wave_post_bndpnt_envars = self.envars.copy() + postenvar_dict = {'ENSMEM': '#member#', + 'MEMDIR': 'mem#member#', + } + for key, value in postenvar_dict.items(): + wave_post_bndpnt_envars.append(rocoto.create_envar(name=key, value=str(value))) + + resources = self.get_resource('wavepostbndpnt') + task_name = f'wave_post_bndpnt_mem#member#' + task_dict = {'task_name': task_name, + 'resources': resources, + 'dependency': 
dependencies, + 'envars': wave_post_bndpnt_envars, + 'cycledef': 'gefs', + 'command': f'{self.HOMEgfs}/jobs/rocoto/wavepostbndpnt.sh', + 'job_name': f'{self.pslot}_{task_name}_@H', + 'log': f'{self.rotdir}/logs/@Y@m@d@H/{task_name}.log', + 'maxtries': '&MAXTRIES;' + } + + member_var_dict = {'member': ' '.join([str(mem).zfill(3) for mem in range(0, self.nmem + 1)])} + member_metatask_dict = {'task_name': 'wave_post_bndpnt', + 'task_dict': task_dict, + 'var_dict': member_var_dict + } + + task = rocoto.create_task(member_metatask_dict) + + return task + + def wavepostbndpntbll(self): + deps = [] + atmos_hist_path = self._template_to_rocoto_cycstring(self._base["COM_ATMOS_HISTORY_TMPL"], {'MEMDIR': 'mem#member#'}) + # Is there any reason this is 180? + data = f'{atmos_hist_path}/{self.cdump}.t@Hz.atm.logf180.txt' + dep_dict = {'type': 'data', 'data': data} + deps.append(rocoto.add_dependency(dep_dict)) + + dep_dict = {'type': 'task', 'name': f'fcst_mem#member#'} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep_condition='or', dep=deps) + + wave_post_bndpnt_bull_envars = self.envars.copy() + postenvar_dict = {'ENSMEM': '#member#', + 'MEMDIR': 'mem#member#', + } + for key, value in postenvar_dict.items(): + wave_post_bndpnt_bull_envars.append(rocoto.create_envar(name=key, value=str(value))) + + resources = self.get_resource('wavepostbndpntbll') + task_name = f'wave_post_bndpnt_bull_mem#member#' + task_dict = {'task_name': task_name, + 'resources': resources, + 'dependency': dependencies, + 'envars': wave_post_bndpnt_bull_envars, + 'cycledef': 'gefs', + 'command': f'{self.HOMEgfs}/jobs/rocoto/wavepostbndpntbll.sh', + 'job_name': f'{self.pslot}_{task_name}_@H', + 'log': f'{self.rotdir}/logs/@Y@m@d@H/{task_name}.log', + 'maxtries': '&MAXTRIES;' + } + + member_var_dict = {'member': ' '.join([str(mem).zfill(3) for mem in range(0, self.nmem + 1)])} + member_metatask_dict = {'task_name': 'wave_post_bndpnt_bull', + 'task_dict': 
task_dict, + 'var_dict': member_var_dict + } + + task = rocoto.create_task(member_metatask_dict) + + return task + + def wavepostpnt(self): + deps = [] + dep_dict = {'type': 'task', 'name': f'fcst_mem#member#'} + deps.append(rocoto.add_dependency(dep_dict)) + if self.app_config.do_wave_bnd: + dep_dict = {'type': 'task', 'name': f'wave_post_bndpnt_bull_mem#member#'} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) + + wave_post_pnt_envars = self.envars.copy() + postenvar_dict = {'ENSMEM': '#member#', + 'MEMDIR': 'mem#member#', + } + for key, value in postenvar_dict.items(): + wave_post_pnt_envars.append(rocoto.create_envar(name=key, value=str(value))) + + resources = self.get_resource('wavepostpnt') + task_name = f'wave_post_pnt_mem#member#' + task_dict = {'task_name': task_name, + 'resources': resources, + 'dependency': dependencies, + 'envars': wave_post_pnt_envars, + 'cycledef': 'gefs', + 'command': f'{self.HOMEgfs}/jobs/rocoto/wavepostpnt.sh', + 'job_name': f'{self.pslot}_{task_name}_@H', + 'log': f'{self.rotdir}/logs/@Y@m@d@H/{task_name}.log', + 'maxtries': '&MAXTRIES;' + } + + member_var_dict = {'member': ' '.join([str(mem).zfill(3) for mem in range(0, self.nmem + 1)])} + member_metatask_dict = {'task_name': 'wave_post_pnt', + 'task_dict': task_dict, + 'var_dict': member_var_dict + } + + task = rocoto.create_task(member_metatask_dict) + + return task diff --git a/workflow/rocoto/gfs_tasks.py b/workflow/rocoto/gfs_tasks.py index 0f5e184192..75760e55e1 100644 --- a/workflow/rocoto/gfs_tasks.py +++ b/workflow/rocoto/gfs_tasks.py @@ -99,7 +99,7 @@ def prep(self): gfs_enkf = True if self.app_config.do_hybvar and 'gfs' in self.app_config.eupd_cdumps else False deps = [] - dep_dict = {'type': 'metatask', 'name': 'gdasatmprod', 'offset': f"-{timedelta_to_HMS(self._base['cycle_interval'])}"} + dep_dict = {'type': 'metatask', 'name': 'gdasatmos_prod', 'offset': 
f"-{timedelta_to_HMS(self._base['cycle_interval'])}"} deps.append(rocoto.add_dependency(dep_dict)) data = f'{atm_hist_path}/gdas.t@Hz.atmf009.nc' dep_dict = {'type': 'data', 'data': data, 'offset': f"-{timedelta_to_HMS(self._base['cycle_interval'])}"} @@ -271,8 +271,8 @@ def sfcanl(self): else: dep_dict = {'type': 'task', 'name': f'{self.cdump}anal'} deps.append(rocoto.add_dependency(dep_dict)) - if self.app_config.do_jedilandda: - dep_dict = {'type': 'task', 'name': f'{self.cdump}landanl'} + if self.app_config.do_jedisnowda: + dep_dict = {'type': 'task', 'name': f'{self.cdump}snowanl'} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) else: @@ -531,21 +531,21 @@ def aeroanlfinal(self): return task - def preplandobs(self): + def prepsnowobs(self): deps = [] dep_dict = {'type': 'task', 'name': f'{self.cdump}prep'} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep=deps) - resources = self.get_resource('preplandobs') - task_name = f'{self.cdump}preplandobs' + resources = self.get_resource('prepsnowobs') + task_name = f'{self.cdump}prepsnowobs' task_dict = {'task_name': task_name, 'resources': resources, 'dependency': dependencies, 'envars': self.envars, 'cycledef': self.cdump.replace('enkf', ''), - 'command': f'{self.HOMEgfs}/jobs/rocoto/preplandobs.sh', + 'command': f'{self.HOMEgfs}/jobs/rocoto/prepsnowobs.sh', 'job_name': f'{self.pslot}_{task_name}_@H', 'log': f'{self.rotdir}/logs/@Y@m@d@H/{task_name}.log', 'maxtries': '&MAXTRIES;' @@ -555,21 +555,21 @@ def preplandobs(self): return task - def landanl(self): + def snowanl(self): deps = [] - dep_dict = {'type': 'task', 'name': f'{self.cdump}preplandobs'} + dep_dict = {'type': 'task', 'name': f'{self.cdump}prepsnowobs'} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep=deps) - resources = self.get_resource('landanl') - task_name = f'{self.cdump}landanl' + resources = 
self.get_resource('snowanl') + task_name = f'{self.cdump}snowanl' task_dict = {'task_name': task_name, 'resources': resources, 'dependency': dependencies, 'envars': self.envars, 'cycledef': self.cdump.replace('enkf', ''), - 'command': f'{self.HOMEgfs}/jobs/rocoto/landanl.sh', + 'command': f'{self.HOMEgfs}/jobs/rocoto/snowanl.sh', 'job_name': f'{self.pslot}_{task_name}_@H', 'log': f'{self.rotdir}/logs/@Y@m@d@H/{task_name}.log', 'maxtries': '&MAXTRIES;' @@ -583,7 +583,7 @@ def prepoceanobs(self): ocean_hist_path = self._template_to_rocoto_cycstring(self._base["COM_OCEAN_HISTORY_TMPL"], {'RUN': 'gdas'}) deps = [] - data = f'{ocean_hist_path}/gdas.t@Hz.ocnf009.nc' + data = f'{ocean_hist_path}/gdas.ocean.t@Hz.inst.f009.nc' dep_dict = {'type': 'data', 'data': data, 'offset': f"-{timedelta_to_HMS(self._base['cycle_interval'])}"} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep=deps) @@ -826,8 +826,8 @@ def _fcst_cycled(self): dep_dict = {'type': 'task', 'name': f'{self.cdump}aeroanlfinal'} dependencies.append(rocoto.add_dependency(dep_dict)) - if self.app_config.do_jedilandda: - dep_dict = {'type': 'task', 'name': f'{self.cdump}landanl'} + if self.app_config.do_jedisnowda: + dep_dict = {'type': 'task', 'name': f'{self.cdump}snowanl'} dependencies.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=dependencies) @@ -927,24 +927,9 @@ def atmanlprod(self): return task @staticmethod - def _get_ufs_postproc_grps(cdump, config): + def _get_ufs_postproc_grps(cdump, config, component='atmos'): - fhmin = config['FHMIN'] - fhmax = config['FHMAX'] - fhout = config['FHOUT'] - - # Get a list of all forecast hours - fhrs = [] - if cdump in ['gdas']: - fhrs = range(fhmin, fhmax + fhout, fhout) - elif cdump in ['gfs']: - fhmax = np.max( - [config['FHMAX_GFS_00'], config['FHMAX_GFS_06'], config['FHMAX_GFS_12'], config['FHMAX_GFS_18']]) - fhout = config['FHOUT_GFS'] - fhmax_hf = 
config['FHMAX_HF_GFS'] - fhout_hf = config['FHOUT_HF_GFS'] - fhrs_hf = range(fhmin, fhmax_hf + fhout_hf, fhout_hf) - fhrs = list(fhrs_hf) + list(range(fhrs_hf[-1] + fhout, fhmax + fhout, fhout)) + fhrs = Tasks._get_forecast_hours(cdump, config, component=component) nfhrs_per_grp = config.get('NFHRS_PER_GROUP', 1) ngrps = len(fhrs) // nfhrs_per_grp if len(fhrs) % nfhrs_per_grp == 0 else len(fhrs) // nfhrs_per_grp + 1 @@ -1017,83 +1002,63 @@ def _upptask(self, upp_run="forecast", task_id="atmupp"): return task - def atmprod(self): + def atmos_prod(self): + return self._atmosoceaniceprod('atmos') - varname1, varname2, varname3 = 'grp', 'dep', 'lst' - varval1, varval2, varval3 = self._get_ufs_postproc_grps(self.cdump, self._configs['atmos_products']) - var_dict = {varname1: varval1, varname2: varval2, varname3: varval3} + def ocean_prod(self): + return self._atmosoceaniceprod('ocean') - postenvars = self.envars.copy() - postenvar_dict = {'FHRLST': '#lst#'} - for key, value in postenvar_dict.items(): - postenvars.append(rocoto.create_envar(name=key, value=str(value))) + def ice_prod(self): + return self._atmosoceaniceprod('ice') - atm_master_path = self._template_to_rocoto_cycstring(self._base["COM_ATMOS_MASTER_TMPL"]) - deps = [] - data = f'{atm_master_path}/{self.cdump}.t@Hz.master.grb2#dep#' - dep_dict = {'type': 'data', 'data': data, 'age': 120} - deps.append(rocoto.add_dependency(dep_dict)) - dependencies = rocoto.create_dependency(dep=deps) - cycledef = 'gdas_half,gdas' if self.cdump in ['gdas'] else self.cdump - resources = self.get_resource('atmos_products') + def _atmosoceaniceprod(self, component: str): - task_name = f'{self.cdump}atmprod#{varname1}#' - task_dict = {'task_name': task_name, - 'resources': resources, - 'dependency': dependencies, - 'envars': postenvars, - 'cycledef': cycledef, - 'command': f'{self.HOMEgfs}/jobs/rocoto/atmos_products.sh', - 'job_name': f'{self.pslot}_{task_name}_@H', - 'log': f'{self.rotdir}/logs/@Y@m@d@H/{task_name}.log', - 
'maxtries': '&MAXTRIES;' - } + products_dict = {'atmos': {'config': 'atmos_products', + 'history_path_tmpl': 'COM_ATMOS_MASTER_TMPL', + 'history_file_tmpl': f'{self.cdump}.t@Hz.master.grb2#dep#'}, + 'ocean': {'config': 'oceanice_products', + 'history_path_tmpl': 'COM_OCEAN_HISTORY_TMPL', + 'history_file_tmpl': f'{self.cdump}.ocean.t@Hz.6hr_avg.#dep#.nc'}, + 'ice': {'config': 'oceanice_products', + 'history_path_tmpl': 'COM_ICE_HISTORY_TMPL', + 'history_file_tmpl': f'{self.cdump}.ice.t@Hz.6hr_avg.#dep#.nc'}} - metatask_dict = {'task_name': f'{self.cdump}atmprod', - 'task_dict': task_dict, - 'var_dict': var_dict - } - - task = rocoto.create_task(metatask_dict) - - return task - - def ocnpost(self): + component_dict = products_dict[component] + config = component_dict['config'] + history_path_tmpl = component_dict['history_path_tmpl'] + history_file_tmpl = component_dict['history_file_tmpl'] varname1, varname2, varname3 = 'grp', 'dep', 'lst' - varval1, varval2, varval3 = self._get_ufs_postproc_grps(self.cdump, self._configs['ocnpost']) + varval1, varval2, varval3 = self._get_ufs_postproc_grps(self.cdump, self._configs[config], component=component) var_dict = {varname1: varval1, varname2: varval2, varname3: varval3} postenvars = self.envars.copy() - postenvar_dict = {'FHRLST': '#lst#', - 'ROTDIR': self.rotdir} + postenvar_dict = {'FHRLST': '#lst#', 'COMPONENT': component} for key, value in postenvar_dict.items(): postenvars.append(rocoto.create_envar(name=key, value=str(value))) + history_path = self._template_to_rocoto_cycstring(self._base[history_path_tmpl]) deps = [] - atm_hist_path = self._template_to_rocoto_cycstring(self._base["COM_ATMOS_HISTORY_TMPL"]) - data = f'{atm_hist_path}/{self.cdump}.t@Hz.atm.log#dep#.txt' - dep_dict = {'type': 'data', 'data': data} - deps.append(rocoto.add_dependency(dep_dict)) - dep_dict = {'type': 'task', 'name': f'{self.cdump}fcst'} + data = f'{history_path}/{history_file_tmpl}' + dep_dict = {'type': 'data', 'data': data, 'age': 120} 
deps.append(rocoto.add_dependency(dep_dict)) - dependencies = rocoto.create_dependency(dep_condition='or', dep=deps) + dependencies = rocoto.create_dependency(dep=deps) cycledef = 'gdas_half,gdas' if self.cdump in ['gdas'] else self.cdump - resources = self.get_resource('ocnpost') + resources = self.get_resource(component_dict['config']) - task_name = f'{self.cdump}ocnpost#{varname1}#' + task_name = f'{self.cdump}{component}_prod#{varname1}#' task_dict = {'task_name': task_name, 'resources': resources, 'dependency': dependencies, 'envars': postenvars, 'cycledef': cycledef, - 'command': f'{self.HOMEgfs}/jobs/rocoto/ocnpost.sh', + 'command': f"{self.HOMEgfs}/jobs/rocoto/{config}.sh", 'job_name': f'{self.pslot}_{task_name}_@H', 'log': f'{self.rotdir}/logs/@Y@m@d@H/{task_name}.log', 'maxtries': '&MAXTRIES;' } - metatask_dict = {'task_name': f'{self.cdump}ocnpost', + metatask_dict = {'task_name': f'{self.cdump}{component}_prod', 'task_dict': task_dict, 'var_dict': var_dict } @@ -1345,8 +1310,7 @@ def _get_awipsgroups(cdump, config): if cdump in ['gdas']: fhrs = range(fhmin, fhmax + fhout, fhout) elif cdump in ['gfs']: - fhmax = np.max( - [config['FHMAX_GFS_00'], config['FHMAX_GFS_06'], config['FHMAX_GFS_12'], config['FHMAX_GFS_18']]) + fhmax = config['FHMAX_GFS'] fhout = config['FHOUT_GFS'] fhmax_hf = config['FHMAX_HF_GFS'] fhout_hf = config['FHOUT_HF_GFS'] @@ -1373,7 +1337,7 @@ def _get_awipsgroups(cdump, config): def awips_20km_1p0deg(self): deps = [] - dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmprod'} + dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmos_prod'} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep=deps) @@ -1414,7 +1378,7 @@ def awips_20km_1p0deg(self): def awips_g2(self): deps = [] - dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmprod'} + dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmos_prod'} deps.append(rocoto.add_dependency(dep_dict)) dependencies = 
rocoto.create_dependency(dep=deps) @@ -1455,7 +1419,7 @@ def awips_g2(self): def gempak(self): deps = [] - dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmprod'} + dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmos_prod'} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep=deps) @@ -1478,7 +1442,7 @@ def gempak(self): def gempakmeta(self): deps = [] - dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmprod'} + dep_dict = {'type': 'task', 'name': f'{self.cdump}gempak'} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep=deps) @@ -1501,7 +1465,7 @@ def gempakmeta(self): def gempakmetancdc(self): deps = [] - dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmprod'} + dep_dict = {'type': 'task', 'name': f'{self.cdump}gempak'} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep=deps) @@ -1524,7 +1488,7 @@ def gempakmetancdc(self): def gempakncdcupapgif(self): deps = [] - dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmprod'} + dep_dict = {'type': 'task', 'name': f'{self.cdump}gempak'} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep=deps) @@ -1573,7 +1537,9 @@ def npoess_pgrb2_0p5deg(self): deps = [] dep_dict = {'type': 'task', 'name': f'{self.cdump}atmanlprod'} deps.append(rocoto.add_dependency(dep_dict)) - dependencies = rocoto.create_dependency(dep=deps) + dep_dict = {'type': 'metatask', 'name': f'{self.cdump}goesupp'} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep=deps, dep_condition='and') resources = self.get_resource('npoess') task_name = f'{self.cdump}npoess_pgrb2_0p5deg' @@ -1582,7 +1548,7 @@ def npoess_pgrb2_0p5deg(self): 'dependency': dependencies, 'envars': self.envars, 'cycledef': self.cdump.replace('enkf', ''), - 'command': f'{self.HOMEgfs}/jobs/rocoto/npoess_pgrb2_0p5deg.sh', + 'command': 
                     f'{self.HOMEgfs}/jobs/rocoto/npoess.sh',
                     'job_name': f'{self.pslot}_{task_name}_@H',
                     'log': f'{self.rotdir}/logs/@Y@m@d@H/{task_name}.log',
                     'maxtries': '&MAXTRIES;'
@@ -1663,7 +1629,7 @@ def vminmon(self):

     def tracker(self):
         deps = []
-        dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmprod'}
+        dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmos_prod'}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep=deps)
@@ -1686,7 +1652,7 @@ def tracker(self):

     def genesis(self):
         deps = []
-        dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmprod'}
+        dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmos_prod'}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep=deps)
@@ -1709,7 +1675,7 @@ def genesis(self):

     def genesis_fsu(self):
         deps = []
-        dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmprod'}
+        dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmos_prod'}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep=deps)
@@ -1732,7 +1698,7 @@ def genesis_fsu(self):

     def fit2obs(self):
         deps = []
-        dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmprod'}
+        dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmos_prod'}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep=deps)
@@ -1797,7 +1763,7 @@ def metp(self):

     def mos_stn_prep(self):
         deps = []
-        dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmprod'}
+        dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmos_prod'}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep=deps)
@@ -1820,7 +1786,7 @@ def mos_stn_prep(self):

     def mos_grd_prep(self):
         deps = []
-        dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmprod'}
+        dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmos_prod'}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep=deps)
@@ -1843,7 +1809,7 @@ def mos_grd_prep(self):

     def mos_ext_stn_prep(self):
         deps = []
-        dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmprod'}
+        dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmos_prod'}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep=deps)
@@ -1866,7 +1832,7 @@ def mos_ext_stn_prep(self):

     def mos_ext_grd_prep(self):
         deps = []
-        dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmprod'}
+        dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmos_prod'}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep=deps)
@@ -2184,7 +2150,7 @@ def arch(self):
             dep_dict = {'type': 'task', 'name': f'{self.cdump}genesis_fsu'}
             deps.append(rocoto.add_dependency(dep_dict))
         # Post job dependencies
-        dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmprod'}
+        dep_dict = {'type': 'metatask', 'name': f'{self.cdump}atmos_prod'}
         deps.append(rocoto.add_dependency(dep_dict))
         if self.app_config.do_wave:
             dep_dict = {'type': 'task', 'name': f'{self.cdump}wavepostsbs'}
@@ -2195,8 +2161,12 @@ def arch(self):
                 dep_dict = {'type': 'task', 'name': f'{self.cdump}wavepostbndpnt'}
                 deps.append(rocoto.add_dependency(dep_dict))
         if self.app_config.do_ocean:
-            if self.app_config.mode in ['forecast-only']:  # TODO: fix ocnpost to run in cycled mode
-                dep_dict = {'type': 'metatask', 'name': f'{self.cdump}ocnpost'}
+            if self.cdump in ['gfs']:
+                dep_dict = {'type': 'metatask', 'name': f'{self.cdump}ocean_prod'}
+                deps.append(rocoto.add_dependency(dep_dict))
+        if self.app_config.do_ice:
+            if self.cdump in ['gfs']:
+                dep_dict = {'type': 'metatask', 'name': f'{self.cdump}ice_prod'}
                 deps.append(rocoto.add_dependency(dep_dict))
         # MOS job dependencies
         if self.cdump in ['gfs'] and self.app_config.do_mos:
@@ -2239,6 +2209,21 @@ def cleanup(self):
         dep_dict = {'type': 'task', 'name': f'{self.cdump}arch'}
         deps.append(rocoto.add_dependency(dep_dict))
+        if self.app_config.do_gempak:
+            if self.cdump in ['gdas']:
+                dep_dict = {'type': 'task', 'name': f'{self.cdump}gempakmetancdc'}
+                deps.append(rocoto.add_dependency(dep_dict))
+            elif self.cdump in ['gfs']:
+                dep_dict = {'type': 'task', 'name': f'{self.cdump}gempakmeta'}
+                deps.append(rocoto.add_dependency(dep_dict))
+                dep_dict = {'type': 'task', 'name': f'{self.cdump}gempakncdcupapgif'}
+                deps.append(rocoto.add_dependency(dep_dict))
+                if self.app_config.do_goes:
+                    dep_dict = {'type': 'task', 'name': f'{self.cdump}gempakgrb2spec'}
+                    deps.append(rocoto.add_dependency(dep_dict))
+                    dep_dict = {'type': 'task', 'name': f'{self.cdump}npoess_pgrb2_0p5deg'}
+                    deps.append(rocoto.add_dependency(dep_dict))
+
         dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)

         resources = self.get_resource('cleanup')
@@ -2291,14 +2276,14 @@ def eomg(self):
         dependencies = rocoto.create_dependency(dep=deps)

         eomgenvars = self.envars.copy()
-        eomgenvars.append(rocoto.create_envar(name='ENSGRP', value='#grp#'))
-
-        groups = self._get_hybgroups(self._base['NMEM_ENS'], self._configs['eobs']['NMEM_EOMGGRP'])
-
-        var_dict = {'grp': groups}
+        eomgenvars_dict = {'ENSMEM': '#member#',
+                           'MEMDIR': 'mem#member#'
+                           }
+        for key, value in eomgenvars_dict.items():
+            eomgenvars.append(rocoto.create_envar(name=key, value=str(value)))

         resources = self.get_resource('eomg')

-        task_name = f'{self.cdump}eomg#grp#'
+        task_name = f'{self.cdump}eomg_mem#member#'
         task_dict = {'task_name': task_name,
                      'resources': resources,
                      'dependency': dependencies,
@@ -2310,8 +2295,9 @@ def eomg(self):
                      'maxtries': '&MAXTRIES;'
                      }

-        metatask_dict = {'task_name': f'{self.cdump}eomn',
-                         'var_dict': var_dict,
+        member_var_dict = {'member': ' '.join([str(mem).zfill(3) for mem in range(1, self.nmem + 1)])}
+        metatask_dict = {'task_name': f'{self.cdump}eomg',
+                         'var_dict': member_var_dict,
                          'task_dict': task_dict,
                          }
@@ -2347,7 +2333,7 @@ def eupd(self):
         if self.app_config.lobsdiag_forenkf:
             dep_dict = {'type': 'task', 'name': f'{self.cdump}ediag'}
         else:
-            dep_dict = {'type': 'metatask', 'name': f'{self.cdump}eomn'}
+            dep_dict = {'type': 'metatask', 'name': f'{self.cdump}eomg'}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep=deps)
@@ -2555,31 +2541,30 @@ def efcs(self):
         dependencies = rocoto.create_dependency(dep_condition='or', dep=dependencies)

         efcsenvars = self.envars.copy()
-        efcsenvars.append(rocoto.create_envar(name='ENSGRP', value='#grp#'))
-
-        groups = self._get_hybgroups(self._base['NMEM_ENS'], self._configs['efcs']['NMEM_EFCSGRP'])
+        efcsenvars_dict = {'ENSMEM': '#member#',
+                           'MEMDIR': 'mem#member#'
+                           }
+        for key, value in efcsenvars_dict.items():
+            efcsenvars.append(rocoto.create_envar(name=key, value=str(value)))

-        if self.cdump == "enkfgfs":
-            groups = self._get_hybgroups(self._base['NMEM_ENS_GFS'], self._configs['efcs']['NMEM_EFCSGRP_GFS'])
         cycledef = 'gdas_half,gdas' if self.cdump in ['enkfgdas'] else self.cdump.replace('enkf', '')

         resources = self.get_resource('efcs')

-        var_dict = {'grp': groups}
-
-        task_name = f'{self.cdump}efcs#grp#'
+        task_name = f'{self.cdump}fcst_mem#member#'
         task_dict = {'task_name': task_name,
                      'resources': resources,
                      'dependency': dependencies,
                      'envars': efcsenvars,
                      'cycledef': cycledef,
-                     'command': f'{self.HOMEgfs}/jobs/rocoto/efcs.sh',
+                     'command': f'{self.HOMEgfs}/jobs/rocoto/fcst.sh',
                      'job_name': f'{self.pslot}_{task_name}_@H',
                      'log': f'{self.rotdir}/logs/@Y@m@d@H/{task_name}.log',
                      'maxtries': '&MAXTRIES;'
                      }

-        metatask_dict = {'task_name': f'{self.cdump}efmn',
-                         'var_dict': var_dict,
+        member_var_dict = {'member': ' '.join([str(mem).zfill(3) for mem in range(1, self.nmem + 1)])}
+        metatask_dict = {'task_name': f'{self.cdump}fcst',
+                         'var_dict': member_var_dict,
                          'task_dict': task_dict
                          }
@@ -2594,7 +2579,7 @@ def echgres(self):
         deps = []
         dep_dict = {'type': 'task', 'name': f'{self.cdump.replace("enkf","")}fcst'}
         deps.append(rocoto.add_dependency(dep_dict))
-        dep_dict = {'type': 'task', 'name': f'{self.cdump}efcs01'}
+        dep_dict = {'type': 'task', 'name': f'{self.cdump}fcst_mem001'}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
@@ -2642,7 +2627,7 @@ def _get_eposgroups(epos):
             return grp, dep, lst

         deps = []
-        dep_dict = {'type': 'metatask', 'name': f'{self.cdump}efmn'}
+        dep_dict = {'type': 'metatask', 'name': f'{self.cdump}fcst'}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep=deps)
@@ -2691,7 +2676,9 @@ def earc(self):
         earcenvars = self.envars.copy()
         earcenvars.append(rocoto.create_envar(name='ENSGRP', value='#grp#'))

-        groups = self._get_hybgroups(self._base['NMEM_ENS'], self._configs['earc']['NMEM_EARCGRP'], start_index=0)
+        # Integer division is floor division, but we need ceiling division
+        n_groups = -(self.nmem // -self._configs['earc']['NMEM_EARCGRP'])
+        groups = ' '.join([f'{grp:02d}' for grp in range(0, n_groups + 1)])

         cycledef = 'gdas_half,gdas' if self.cdump in ['enkfgdas'] else self.cdump.replace('enkf', '')
diff --git a/workflow/rocoto/tasks_emc.py b/workflow/rocoto/tasks_emc.py
index 1c79de0c19..75312ba77d 100644
--- a/workflow/rocoto/tasks_emc.py
+++ b/workflow/rocoto/tasks_emc.py
@@ -4,6 +4,7 @@
 from applications.applications import AppConfig
 import rocoto.rocoto as rocoto
 from wxflow import Template, TemplateConstants, to_timedelta
+from typing import List

 __all__ = ['Tasks']

@@ -19,10 +20,10 @@ class Tasks:
               'eobs', 'eomg', 'epos', 'esfc', 'eupd',
               'atmensanlinit', 'atmensanlrun', 'atmensanlfinal',
               'aeroanlinit', 'aeroanlrun', 'aeroanlfinal',
-              'preplandobs', 'landanl',
+              'prepsnowobs', 'snowanl',
               'fcst',
-              'atmanlupp', 'atmanlprod', 'atmupp', 'atmprod', 'goesupp',
-              'ocnpost',
+              'atmanlupp', 'atmanlprod', 'atmupp', 'goesupp',
+              'atmosprod', 'oceanprod', 'iceprod',
               'verfozn', 'verfrad', 'vminmon', 'metp', 'tracker', 'genesis', 'genesis_fsu',
@@ -46,6 +47,10 @@ def __init__(self, app_config: AppConfig, cdump: str) -> None:
         self.HOMEgfs = self._base['HOMEgfs']
         self.rotdir = self._base['ROTDIR']
         self.pslot = self._base['PSLOT']
+        if self.cdump == "enkfgfs":
+            self.nmem = int(self._base['NMEM_ENS_GFS'])
+        else:
+            self.nmem = int(self._base['NMEM_ENS'])
         self._base['cycle_interval'] = to_timedelta(f'{self._base["assim_freq"]}H')

         self.n_tiles = 6  # TODO - this needs to be elsewhere
@@ -61,6 +66,7 @@ def __init__(self, app_config: AppConfig, cdump: str) -> None:
                       'cyc': '@H',
                       'COMROOT': self._base.get('COMROOT'),
                       'DATAROOT': self._base.get('DATAROOT')}
+
         self.envars = self._set_envars(envar_dict)

     @staticmethod
@@ -72,12 +78,6 @@ def _set_envars(envar_dict) -> list:

         return envars

-    @staticmethod
-    def _get_hybgroups(nens: int, nmem_per_group: int, start_index: int = 1):
-        ngrps = nens / nmem_per_group
-        groups = ' '.join([f'{x:02d}' for x in range(start_index, int(ngrps) + 1)])
-        return groups
-
     def _template_to_rocoto_cycstring(self, template: str, subs_dict: dict = {}) -> str:
         '''
         Takes a string templated with ${ } and converts it into a string suitable
@@ -123,6 +123,40 @@ def _template_to_rocoto_cycstring(self, template: str, subs_dict: dict = {}) ->
                                 TemplateConstants.DOLLAR_CURLY_BRACE,
                                 rocoto_conversion_dict.get)

+    @staticmethod
+    def _get_forecast_hours(cdump, config, component='atmos') -> List[str]:
+        # Make a local copy of the config to avoid modifying the original
+        local_config = config.copy()
+
+        # Ocean/Ice components do not have a HF output option like the atmosphere
+        if component in ['ocean', 'ice']:
+            local_config['FHMAX_HF_GFS'] = config['FHMAX_GFS']
+            local_config['FHOUT_HF_GFS'] = config['FHOUT_OCNICE_GFS']
+            local_config['FHOUT_GFS'] = config['FHOUT_OCNICE_GFS']
+            local_config['FHOUT'] = config['FHOUT_OCNICE']
+
+        fhmin = local_config['FHMIN']
+
+        # Get a list of all forecast hours
+        fhrs = []
+        if cdump in ['gdas']:
+            fhmax = local_config['FHMAX']
+            fhout = local_config['FHOUT']
+            fhrs = list(range(fhmin, fhmax + fhout, fhout))
+        elif cdump in ['gfs', 'gefs']:
+            fhmax = local_config['FHMAX_GFS']
+            fhout = local_config['FHOUT_GFS']
+            fhmax_hf = local_config['FHMAX_HF_GFS']
+            fhout_hf = local_config['FHOUT_HF_GFS']
+            fhrs_hf = range(fhmin, fhmax_hf + fhout_hf, fhout_hf)
+            fhrs = list(fhrs_hf) + list(range(fhrs_hf[-1] + fhout, fhmax + fhout, fhout))
+
+        # ocean/ice components do not have fhr 0 as they are averaged output
+        if component in ['ocean', 'ice']:
+            fhrs.remove(0)
+
+        return fhrs
+
     def get_resource(self, task_name):
         """
         Given a task name (task_name) and its configuration (task_names),
@@ -162,7 +196,11 @@ def get_resource(self, task_name):

         native = None
         if scheduler in ['pbspro']:
-            native = '-l debug=true,place=vscatter'
+            # Set place=vscatter by default and debug=true if DEBUG_POSTSCRIPT="YES"
+            if self._base['DEBUG_POSTSCRIPT']:
+                native = '-l debug=true,place=vscatter'
+            else:
+                native = '-l place=vscatter'
             # Set either exclusive or shared - default on WCOSS2 is exclusive when not set
             if task_config.get('is_exclusive', False):
                 native += ':exclhost'
diff --git a/workflow/rocoto/tasks_gsl.py b/workflow/rocoto/tasks_gsl.py
index 371721bbb9..535c20c24c 100644
--- a/workflow/rocoto/tasks_gsl.py
+++ b/workflow/rocoto/tasks_gsl.py
@@ -4,6 +4,7 @@
 from applications.applications import AppConfig
 import rocoto.rocoto as rocoto
 from wxflow import Template, TemplateConstants, to_timedelta
+from typing import List

 __all__ = ['Tasks']

@@ -19,10 +20,10 @@ class Tasks:
               'eobs', 'eomg', 'epos', 'esfc', 'eupd',
               'atmensanlinit', 'atmensanlrun', 'atmensanlfinal',
               'aeroanlinit', 'aeroanlrun', 'aeroanlfinal',
-              'preplandobs', 'landanl',
+              'prepsnowobs', 'snowanl',
               'fcst',
-              'atmanlupp', 'atmanlprod', 'atmupp', 'atmprod', 'goesupp',
-              'ocnpost',
+              'atmanlupp', 'atmanlprod', 'atmupp', 'goesupp',
+              'atmosprod', 'oceanprod', 'iceprod',
               'verfozn', 'verfrad', 'vminmon', 'metp', 'tracker', 'genesis', 'genesis_fsu',
@@ -46,6 +47,10 @@ def __init__(self, app_config: AppConfig, cdump: str) -> None:
         self.HOMEgfs = self._base['HOMEgfs']
         self.rotdir = self._base['ROTDIR']
         self.pslot = self._base['PSLOT']
+        if self.cdump == "enkfgfs":
+            self.nmem = int(self._base['NMEM_ENS_GFS'])
+        else:
+            self.nmem = int(self._base['NMEM_ENS'])
         self._base['cycle_interval'] = to_timedelta(f'{self._base["assim_freq"]}H')

         self.n_tiles = 6  # TODO - this needs to be elsewhere
@@ -53,7 +58,7 @@ def __init__(self, app_config: AppConfig, cdump: str) -> None:
         envar_dict = {'RUN_ENVIR': self._base.get('RUN_ENVIR', 'emc'),
                       'HOMEgfs': self.HOMEgfs,
                       'EXPDIR': self._base.get('EXPDIR'),
-                      'ROTDIR': self._base.get('ROTDIR'),
+                      'ROTDIR': self._base.get('ROTDIR'),  ##JKH
                       'NET': self._base.get('NET'),
                       'CDUMP': self.cdump,
                       'RUN': self.cdump,
@@ -62,6 +67,7 @@ def __init__(self, app_config: AppConfig, cdump: str) -> None:
                       'cyc': '@H',
                       'COMROOT': self._base.get('COMROOT'),
                       'DATAROOT': self._base.get('DATAROOT')}
+
         self.envars = self._set_envars(envar_dict)

     @staticmethod
@@ -73,12 +79,6 @@ def _set_envars(envar_dict) -> list:

         return envars

-    @staticmethod
-    def _get_hybgroups(nens: int, nmem_per_group: int, start_index: int = 1):
-        ngrps = nens / nmem_per_group
-        groups = ' '.join([f'{x:02d}' for x in range(start_index, int(ngrps) + 1)])
-        return groups
-
     def _template_to_rocoto_cycstring(self, template: str, subs_dict: dict = {}) -> str:
         '''
         Takes a string templated with ${ } and converts it into a string suitable
@@ -124,6 +124,40 @@ def _template_to_rocoto_cycstring(self, template: str, subs_dict: dict = {}) ->
                                 TemplateConstants.DOLLAR_CURLY_BRACE,
                                 rocoto_conversion_dict.get)

+    @staticmethod
+    def _get_forecast_hours(cdump, config, component='atmos') -> List[str]:
+        # Make a local copy of the config to avoid modifying the original
+        local_config = config.copy()
+
+        # Ocean/Ice components do not have a HF output option like the atmosphere
+        if component in ['ocean', 'ice']:
+            local_config['FHMAX_HF_GFS'] = config['FHMAX_GFS']
+            local_config['FHOUT_HF_GFS'] = config['FHOUT_OCNICE_GFS']
+            local_config['FHOUT_GFS'] = config['FHOUT_OCNICE_GFS']
+            local_config['FHOUT'] = config['FHOUT_OCNICE']
+
+        fhmin = local_config['FHMIN']
+
+        # Get a list of all forecast hours
+        fhrs = []
+        if cdump in ['gdas']:
+            fhmax = local_config['FHMAX']
+            fhout = local_config['FHOUT']
+            fhrs = list(range(fhmin, fhmax + fhout, fhout))
+        elif cdump in ['gfs', 'gefs']:
+            fhmax = local_config['FHMAX_GFS']
+            fhout = local_config['FHOUT_GFS']
+            fhmax_hf = local_config['FHMAX_HF_GFS']
+            fhout_hf = local_config['FHOUT_HF_GFS']
+            fhrs_hf = range(fhmin, fhmax_hf + fhout_hf, fhout_hf)
+            fhrs = list(fhrs_hf) + list(range(fhrs_hf[-1] + fhout, fhmax + fhout, fhout))
+
+        # ocean/ice components do not have fhr 0 as they are averaged output
+        if component in ['ocean', 'ice']:
+            fhrs.remove(0)
+
+        return fhrs
+
     def get_resource(self, task_name):
         """
         Given a task name (task_name) and its configuration (task_names),
@@ -163,14 +197,18 @@ def get_resource(self, task_name):

         native = None
         if scheduler in ['pbspro']:
-            native = '-l debug=true,place=vscatter'
+            # Set place=vscatter by default and debug=true if DEBUG_POSTSCRIPT="YES"
+            if self._base['DEBUG_POSTSCRIPT']:
+                native = '-l debug=true,place=vscatter'
+            else:
+                native = '-l place=vscatter'
             # Set either exclusive or shared - default on WCOSS2 is exclusive when not set
             if task_config.get('is_exclusive', False):
                 native += ':exclhost'
             else:
                 native += ':shared'
         elif scheduler in ['slurm']:
-            native = '&NATIVE_STR;'
+            native = '&NATIVE_STR;'  ##JKH

         queue = task_config['QUEUE_SERVICE'] if task_name in Tasks.SERVICE_TASKS else task_config['QUEUE']
diff --git a/workflow/setup_expt.py b/workflow/setup_expt.py
index 7d7ac84aad..c689ec2ae7 100755
--- a/workflow/setup_expt.py
+++ b/workflow/setup_expt.py
@@ -224,6 +224,13 @@ def link_files_from_src_to_dst(src_dir, dst_dir):
             src_file = os.path.join(src_dir, fname)
             if os.path.exists(src_file):
                 os.symlink(src_file, os.path.join(dst_dir, fname))
+        # First 1/2 cycle also needs a atmos increment if doing warm start
+        if inputs.start in ['warm']:
+            for ftype in ['atmi003.nc', 'atminc.nc', 'atmi009.nc']:
+                fname = f'{inputs.cdump}.t{idatestr[8:]}z.{ftype}'
+                src_file = os.path.join(src_dir, fname)
+                if os.path.exists(src_file):
+                    os.symlink(src_file, os.path.join(dst_dir, fname))

     return

@@ -245,12 +252,6 @@ def fill_EXPDIR(inputs):
     expdir = os.path.join(inputs.expdir, inputs.pslot)

     configs = glob.glob(f'{configdir}/config.*')
-    exclude_configs = ['base', 'base.emc.dyn', 'base.nco.static', 'fv3.nco.static']
-    for exclude in exclude_configs:
-        try:
-            configs.remove(f'{configdir}/config.{exclude}')
-        except ValueError:
-            pass
     if len(configs) == 0:
         raise IOError(f'no config files found in {configdir}')
     for config in configs:
@@ -288,7 +289,8 @@ def _update_defaults(dict_in: dict) -> dict:

 def edit_baseconfig(host, inputs, yaml_dict):
     """
-    Parses and populates the templated `config.base.emc.dyn` to `config.base`
+    Parses and populates the templated `HOMEgfs/parm/config//config.base`
+    to `EXPDIR/pslot/config.base`
     """

     tmpl_dict = {
@@ -316,7 +318,8 @@ def edit_baseconfig(host, inputs, yaml_dict):
         "@EXP_WARM_START@": is_warm_start,
         "@MODE@": inputs.mode,
         "@gfs_cyc@": inputs.gfs_cyc,
-        "@APP@": inputs.app
+        "@APP@": inputs.app,
+        "@NMEM_ENS@": getattr(inputs, 'nens', 0)
     }

     tmpl_dict = dict(tmpl_dict, **extend_dict)
@@ -324,7 +327,6 @@ def edit_baseconfig(host, inputs, yaml_dict):
     if getattr(inputs, 'nens', 0) > 0:
         extend_dict = {
             "@CASEENS@": f'C{inputs.resensatmos}',
-            "@NMEM_ENS@": inputs.nens,
         }
         tmpl_dict = dict(tmpl_dict, **extend_dict)
@@ -340,7 +342,7 @@ def edit_baseconfig(host, inputs, yaml_dict):
         except KeyError:
             pass

-    base_input = f'{inputs.configdir}/config.base.emc.dyn'
+    base_input = f'{inputs.configdir}/config.base'
     base_output = f'{inputs.expdir}/{inputs.pslot}/config.base'
     edit_config(base_input, base_output, tmpl_dict)

@@ -383,7 +385,7 @@ def input_args(*argv):
     Method to collect user arguments for `setup_expt.py`
     """

-    ufs_apps = ['ATM', 'ATMA', 'ATMW', 'S2S', 'S2SA', 'S2SW']
+    ufs_apps = ['ATM', 'ATMA', 'ATMW', 'S2S', 'S2SA', 'S2SW', 'S2SWA']

     def _common_args(parser):
         parser.add_argument('--pslot', help='parallel experiment name',
@@ -399,6 +401,8 @@ def _common_args(parser):
         parser.add_argument('--idate', help='starting date of experiment, initial conditions must exist!',
                             required=True, type=lambda dd: to_datetime(dd))
         parser.add_argument('--edate', help='end date experiment', required=True, type=lambda dd: to_datetime(dd))
+        parser.add_argument('--overwrite', help='overwrite previously created experiment (if it exists)',
+                            action='store_true', required=False)
         return parser

     def _gfs_args(parser):
@@ -429,7 +433,7 @@ def _gfs_or_gefs_ensemble_args(parser):
     def _gfs_or_gefs_forecast_args(parser):
         parser.add_argument('--app', help='UFS application', type=str,
-                            choices=ufs_apps + ['S2SWA'], required=False, default='ATM')
+                            choices=ufs_apps, required=False, default='ATM')
         parser.add_argument('--gfs_cyc', help='Number of forecasts per day', type=int,
                             choices=[1, 2, 4], default=1, required=False)
         return parser
@@ -493,17 +497,19 @@ def _gefs_args(parser):

     return parser.parse_args(list(*argv) if len(argv) else None)


-def query_and_clean(dirname):
+def query_and_clean(dirname, force_clean=False):
     """
     Method to query if a directory exists and gather user input for further action
     """

     create_dir = True
     if os.path.exists(dirname):
-        print()
-        print(f'directory already exists in {dirname}')
-        print()
-        overwrite = input('Do you wish to over-write [y/N]: ')
+        print(f'\ndirectory already exists in {dirname}')
+        if force_clean:
+            overwrite = True
+            print(f'removing directory ........ {dirname}\n')
+        else:
+            overwrite = input('Do you wish to over-write [y/N]: ')
         create_dir = True if overwrite in ['y', 'yes', 'Y', 'YES'] else False
     if create_dir:
@@ -553,8 +559,8 @@ def main(*argv):
     rotdir = os.path.join(user_inputs.comroot, user_inputs.pslot)
     expdir = os.path.join(user_inputs.expdir, user_inputs.pslot)

-    create_rotdir = query_and_clean(rotdir)
-    create_expdir = query_and_clean(expdir)
+    create_rotdir = query_and_clean(rotdir, force_clean=user_inputs.overwrite)
+    create_expdir = query_and_clean(expdir, force_clean=user_inputs.overwrite)

     if create_rotdir:
         makedirs_if_missing(rotdir)
@@ -565,6 +571,11 @@ def main(*argv):
     fill_EXPDIR(user_inputs)
     update_configs(host, user_inputs)

+    print(f"*" * 100)
+    print(f'EXPDIR: {expdir}')
+    print(f'ROTDIR: {rotdir}')
+    print(f"*" * 100)
+

 if __name__ == '__main__':
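The `earc` change replaces `_get_hybgroups` with the negated-floor-division idiom noted in the patch's comment ("Integer division is floor division, but we need ceiling division"). A minimal standalone sketch of that idiom; the member and group counts here are made-up example values, not read from any real config:

```python
# Ceiling division without importing math: -(a // -b) == ceil(a / b) for positive b.
# This mirrors the group calculation in the earc task; nmem and nmem_per_group
# are hypothetical example values.
def n_arch_groups(nmem: int, nmem_per_group: int) -> int:
    return -(nmem // -nmem_per_group)

# Zero-padded group labels, starting at group 00 as in the patch
groups = ' '.join(f'{grp:02d}' for grp in range(0, n_arch_groups(80, 10) + 1))
```

Note that `range(0, n_groups + 1)` yields one more label than `n_groups`; with 80 members in groups of 10 this produces labels `00` through `08`.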
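The new `_get_forecast_hours` helper builds the output-hour list from a high-frequency (HF) window followed by a coarser cadence, with ocean/ice collapsed to a single cadence and hour 0 dropped. A self-contained sketch of that logic with made-up config values (the real method is a `@staticmethod` on `Tasks` and reads a much larger config dict):

```python
def get_forecast_hours(cdump, config, component='atmos'):
    # Work on a copy so the caller's config dict is not mutated
    local_config = config.copy()
    # Ocean/ice have no HF window; collapse HF settings to the ocean/ice cadence
    if component in ('ocean', 'ice'):
        local_config['FHMAX_HF_GFS'] = config['FHMAX_GFS']
        local_config['FHOUT_HF_GFS'] = config['FHOUT_OCNICE_GFS']
        local_config['FHOUT_GFS'] = config['FHOUT_OCNICE_GFS']

    fhmin = local_config['FHMIN']
    fhrs = []
    if cdump == 'gdas':
        fhrs = list(range(fhmin, local_config['FHMAX'] + local_config['FHOUT'],
                          local_config['FHOUT']))
    elif cdump in ('gfs', 'gefs'):
        fhout = local_config['FHOUT_GFS']
        # HF hours first, then the coarser cadence out to FHMAX_GFS
        fhrs_hf = range(fhmin, local_config['FHMAX_HF_GFS'] + local_config['FHOUT_HF_GFS'],
                        local_config['FHOUT_HF_GFS'])
        fhrs = list(fhrs_hf) + list(range(fhrs_hf[-1] + fhout,
                                          local_config['FHMAX_GFS'] + fhout, fhout))
    # Averaged ocean/ice output has no hour-0 file
    if component in ('ocean', 'ice'):
        fhrs.remove(0)
    return fhrs

# Hypothetical example values, not taken from any real experiment config
cfg = {'FHMIN': 0, 'FHMAX_GFS': 48, 'FHOUT_GFS': 6,
       'FHMAX_HF_GFS': 12, 'FHOUT_HF_GFS': 3, 'FHOUT_OCNICE_GFS': 6}
```

With these values the atmosphere gets 3-hourly output to hour 12, then 6-hourly to hour 48, while the ocean gets a flat 6-hourly list starting at hour 6.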
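The `eomg`/`efcs` rewrites switch from `#grp#` groups to a per-member Rocoto metatask: a space-separated list of zero-padded member IDs is handed to the metatask `var_dict`, and Rocoto expands `#member#` in the task name and envars once per value. A small sketch of that expansion; `nmem = 5` and the `enkfgdas` prefix are example values:

```python
# Build the metatask variable exactly as the patch does: zero-padded 3-digit
# member IDs joined by spaces. nmem is a hypothetical example value.
nmem = 5
member_var_dict = {'member': ' '.join(str(mem).zfill(3) for mem in range(1, nmem + 1))}

# Emulate Rocoto's substitution of '#member#' into the templated task name,
# producing one concrete task per member value.
task_names = ['enkfgdasfcst_mem#member#'.replace('#member#', m)
              for m in member_var_dict['member'].split()]
```

This is why `echgres` can now depend on the concrete task `fcst_mem001`: the metatask expansion guarantees a task with that exact name exists for member 001.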