Merge branch 'ufs-community:develop' into text/us-216
jdkublnick authored Sep 12, 2024
2 parents ebb2cb5 + 26cdad8 commit c2f31bf
Showing 42 changed files with 1,057 additions and 964 deletions.
12 changes: 9 additions & 3 deletions .cicd/Jenkinsfile
@@ -29,9 +29,15 @@ pipeline {
stage('Launch SonarQube') {
steps {
script {
echo "BRANCH_NAME=${env.CHANGE_BRANCH}"
echo "FORK_NAME=${env.CHANGE_FORK}"
echo "CHANGE_URL=${env.CHANGE_URL}"
echo "CHANGE_ID=${env.CHANGE_ID}"
build job: '/ufs-srweather-app/ufs-srw-sonarqube', parameters: [
string(name: 'BRANCH_NAME', value: env.CHANGE_BRANCH ?: 'develop'),
string(name: 'FORK_NAME', value: env.CHANGE_FORK ?: '')
string(name: 'FORK_NAME', value: env.CHANGE_FORK ?: ''),
string(name: 'CHANGE_URL', value: env.CHANGE_URL ?: ''),
string(name: 'CHANGE_ID', value: env.CHANGE_ID ?: '')
], wait: false
}
}
@@ -193,9 +199,9 @@ pipeline {
// Try a few Workflow Task scripts to make sure E2E tests can be launched in a follow-on 'Test' stage
stage('Functional WorkflowTaskTests') {
environment {
TASK_DEPTH = "${env.SRW_WRAPPER_TASK_DEPTH}"
TASK_DEPTH = "${params.SRW_WRAPPER_TASK_DEPTH}"
}

steps {
dir ("${env.SRW_PLATFORM}") {
echo "Running ${TASK_DEPTH} simple workflow script task tests on ${env.SRW_PLATFORM} (using ${env.WORKSPACE}/${env.SRW_PLATFORM})"
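The new `echo` lines and `build job:` parameters in this Jenkinsfile hunk rely on Groovy's Elvis operator (`env.CHANGE_BRANCH ?: 'develop'`) to fall back to a default when a variable is unset. The shell analog, used throughout this repository's scripts, is `${VAR:-default}`. A minimal sketch (variable names taken from the hunk, values illustrative):

```shell
#!/usr/bin/env bash
# Fall back to a default when a variable is unset or empty,
# mirroring Groovy's `env.CHANGE_BRANCH ?: 'develop'` above.
# Demonstrate with the variables deliberately unset:
unset CHANGE_BRANCH CHANGE_FORK
branch="${CHANGE_BRANCH:-develop}"   # unset -> "develop"
fork="${CHANGE_FORK:-}"              # unset -> empty string
echo "BRANCH_NAME=${branch}"         # prints: BRANCH_NAME=develop
echo "FORK_NAME=${fork}"             # prints: FORK_NAME=
```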
2 changes: 1 addition & 1 deletion .cicd/scripts/srw_metric.sh
@@ -107,7 +107,7 @@ if [[ ${RUN_STAT_ANLY_OPT} == true ]]; then
elif [[ -f Indy-Severe-Weather.tgz ]]; then
tar xvfz Indy-Severe-Weather.tgz
else
wget https://noaa-ufs-srw-pds.s3.amazonaws.com/sample_cases/release-public-v2.1.0/Indy-Severe-Weather.tgz
wget https://noaa-ufs-srw-pds.s3.amazonaws.com/experiment-user-cases/release-public-v2.1.0/METplus-vx-sample/Indy-Severe-Weather.tgz
tar xvfz Indy-Severe-Weather.tgz
fi
[[ -f ${SRW_PLATFORM,,}-${srw_compiler}-skill-score.txt ]] && rm ${SRW_PLATFORM,,}-${srw_compiler}-skill-score.txt
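The URL change above sits inside a reuse-or-download block: extract the tarball if it is already on disk, otherwise fetch it first. A hedged sketch of that pattern, with the network call stubbed out so the example stays offline (`fetch` is a stand-in for the `wget` call in `srw_metric.sh`):

```shell
#!/usr/bin/env bash
# Reuse a local archive when present; otherwise download it, then extract.
fetch () { wget "$1"; }   # stand-in for the wget invocation in srw_metric.sh

get_case_data () {
  local tarball="$1" url="$2"
  if [[ -f "${tarball}" ]]; then
    echo "reusing ${tarball}"      # skip the download entirely
  else
    fetch "${url}/${tarball}"
  fi
  tar xfz "${tarball}"             # extract into the current directory
}
```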
10 changes: 4 additions & 6 deletions .github/PULL_REQUEST_TEMPLATE
@@ -30,15 +30,13 @@
<!-- Explicitly state what tests were run on these changes, or if any are still pending (for README or other text-only changes, just put "None required"). Make note of the compilers used, the platform/machine, and other relevant details as necessary. For more complicated changes, or those resulting in scientific changes, please be explicit! -->
<!-- Add an X to check off a box. -->

- [ ] hera.intel
- [ ] orion.intel
- [ ] hercules.intel
- [ ] cheyenne.intel
- [ ] cheyenne.gnu
- [ ] derecho.intel
- [ ] gaea.intel
- [ ] gaeac5.intel
- [ ] hera.gnu
- [ ] hera.intel
- [ ] hercules.intel
- [ ] jet.intel
- [ ] orion.intel
- [ ] wcoss2.intel
- [ ] NOAA Cloud (indicate which platform)
- [ ] Jenkins
2 changes: 1 addition & 1 deletion Externals.cfg
@@ -12,7 +12,7 @@ protocol = git
repo_url = https://github.com/ufs-community/ufs-weather-model
# Specify either a branch name or a hash but not both.
#branch = develop
hash = 1c6b4d4
hash = b5a1976
local_path = sorc/ufs-weather-model
required = True

207 changes: 104 additions & 103 deletions devclean.sh
@@ -4,33 +4,31 @@
usage () {
cat << EOF_USAGE
Clean the UFS-SRW Application build
Clean the UFS-SRW Application build.
NOTE: If user included custom directories at build time, those directories must be deleted manually
Usage: $0 [OPTIONS] ...
OPTIONS
-h, --help
show this help guide
Show this help guide
-a, --all
removes "bin", "build" directories, and other build artifacts
--remove
removes the "build" directory, keeps the "bin", "lib" and other build artifacts intact
--clean
removes "bin", "build" directories, and other build artifacts (same as "-a", "--all")
--conda
removes "conda" directory and conda_loc file in SRW
--install-dir=INSTALL_DIR
installation directory name (\${SRW_DIR} by default)
--build-dir=BUILD_DIR
main build directory, absolute path (\${SRW_DIR}/build/ by default)
--bin-dir=BIN_DIR
binary directory name ("exec" by default); full path is \${INSTALL_DIR}/\${BIN_DIR})
--conda-dir=CONDA_DIR
directory where conda is installed. caution: if outside the SRW clone, it may have broader use
--sub-modules
remove sub-module directories. They will need to be checked out again by sourcing "\${SRW_DIR}/manage_externals/checkout_externals" before attempting subsequent builds
Remove all build artifacts, conda and submodules (equivalent to \`-b -c -s\`)
-b, --build
Remove build directories and artifacts: build/ exec/ share/ include/ lib/ lib64/
-c, --conda
Remove "conda" directory and conda_loc file in SRW main directory
--container
For cleaning builds within the SRW containers, will remove the "container-bin"
directory rather than "exec". Has no effect if \`-b\` is not specified.
-f, --force
Remove directories as requested, without asking for user confirmation of their deletion.
-s, --sub-modules
Remove sub-module directories. They need to be checked out again by sourcing "\${SRW_DIR}/manage_externals/checkout_externals" before attempting subsequent builds
-v, --verbose
provide more verbose output
Provide more verbose output
EOF_USAGE
}

@@ -39,17 +37,10 @@ settings () {
cat << EOF_SETTINGS
Settings:
INSTALL_DIR=${INSTALL_DIR}
BUILD_DIR=${BUILD_DIR}
BIN_DIR=${BIN_DIR}
CONDA_DIR=${CONDA_DIR}
REMOVE=${REMOVE}
FORCE=${REMOVE}
VERBOSE=${VERBOSE}
Default cleaning options: (if no arguments provided, then nothing is cleaned)
REMOVE=${REMOVE}
CLEAN=${CLEAN}
INCLUDE_SUB_MODULES=${INCLUDE_SUB_MODULES}
REMOVE_SUB_MODULES=${REMOVE_SUB_MODULES}
REMOVE_CONDA=${REMOVE_CONDA}
EOF_SETTINGS
}
@@ -63,113 +54,123 @@ usage_error () {

# default settings
SRW_DIR=$(cd "$(dirname "$(readlink -f -n "${BASH_SOURCE[0]}" )" )" && pwd -P)
INSTALL_DIR=${INSTALL_DIR:-${SRW_DIR}}
BUILD_DIR=${BUILD_DIR:-"${SRW_DIR}/build"}
BIN_DIR="exec"
CONDA_DIR=${CONDA_DIR:-"${SRW_DIR}/conda"}
REMOVE=false
VERBOSE=false

# default clean options
REMOVE=false
CLEAN=false
INCLUDE_SUB_MODULES=false #changes to true if '--sub-modules' option is provided
REMOVE_BUILD=false
REMOVE_CONDA=false
REMOVE_SUB_MODULES=false
CONTAINER=false

# process requires arguments
if [[ ("$1" == "--help") || ("$1" == "-h") ]]; then
usage
exit 0
fi

# process optional arguments
# process arguments
while :; do
case $1 in
--help|-h) usage; exit 0 ;;
--all|-a) ALL_CLEAN=true ;;
--remove) REMOVE=true ;;
--remove=?*|--remove=) usage_error "$1 argument ignored." ;;
--clean) CLEAN=true ;;
--conda) REMOVE_CONDA=true ;;
--install-dir=?*) INSTALL_DIR=${1#*=} ;;
--install-dir|--install-dir=) usage_error "$1 requires argument." ;;
--build-dir=?*) BUILD_DIR=${1#*=} ;;
--build-dir|--build-dir=) usage_error "$1 requires argument." ;;
--bin-dir=?*) BIN_DIR=${1#*=} ;;
--bin-dir|--bin-dir=) usage_error "$1 requires argument." ;;
--conda-dir=?*) CONDA_DIR=${1#*=} ;;
--conda-dir|--conda-dir=) usage_error "$1 requires argument." ;;
--sub-modules) INCLUDE_SUB_MODULES=true ;;
--all|-a) REMOVE_BUILD=true; REMOVE_CONDA=true; REMOVE_SUB_MODULES=true ;;
--build|-b) REMOVE_BUILD=true ;;
--conda|-c) REMOVE_CONDA=true ;;
--container) CONTAINER=true ;;
--force) REMOVE=true ;;
--force=?*|--force=) usage_error "$1 argument ignored." ;;
--sub-modules|-s) REMOVE_SUB_MODULES=true ;;
--sub-modules=?*|--sub-modules=) usage_error "$1 argument ignored." ;;
--verbose|-v) VERBOSE=true ;;
--verbose=?*|--verbose=) usage_error "$1 argument ignored." ;;
# targets
default) ALL_CLEAN=false ;;
# unknown
-?*|?*) usage_error "Unknown option $1" ;;
*) break ;;
esac
shift
done
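The rewritten loop above uses the standard pure-shell idiom for long options: a `--opt=?*` pattern to accept `=`-attached values, `${1#*=}` to strip everything through the first `=`, and a `--opt|--opt=` pattern to reject a missing value. A minimal self-contained sketch of the same idiom (option names hypothetical):

```shell
#!/usr/bin/env bash
# Minimal long-option parser using the same case/${1#*=} idiom as devclean.sh.
VERBOSE=false
DIR=""
parse () {
  while :; do
    case "$1" in
      --dir=?*) DIR="${1#*=}" ;;                             # value after the first '='
      --dir|--dir=) echo "--dir requires argument" >&2; return 1 ;;
      --verbose|-v) VERBOSE=true ;;
      -?*) echo "Unknown option $1" >&2; return 1 ;;
      *) break ;;                                            # no more options
    esac
    shift
  done
}
parse --dir=/tmp/work -v
echo "DIR=${DIR} VERBOSE=${VERBOSE}"   # prints: DIR=/tmp/work VERBOSE=true
```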

# choose defaults to clean
if [ "${ALL_CLEAN}" = true ]; then
CLEAN=true
fi

# print settings
if [ "${VERBOSE}" = true ] ; then
settings
fi

# clean if build directory already exists
if [ "${REMOVE}" = true ] && [ "${CLEAN}" = false ] ; then
printf '%s\n' "Remove the \"build\" directory only, BUILD_DIR = $BUILD_DIR "
[[ -d ${BUILD_DIR} ]] && rm -rf ${BUILD_DIR} && printf '%s\n' "rm -rf ${BUILD_DIR}"
elif [ "${CLEAN}" = true ]; then
printf '%s\n' "Remove build directory, bin directory, and other build artifacts "
printf '%s\n' " from the installation directory = ${INSTALL_DIR} "

directories=( \
"${BUILD_DIR}" \
"${INSTALL_DIR}/${BIN_DIR}" \
"${INSTALL_DIR}/share" \
"${INSTALL_DIR}/include" \
"${INSTALL_DIR}/lib" \
"${INSTALL_DIR}/lib64" \
# Populate "removal_list" as an array of files/directories to remove, based on user selections
declare -a removal_list='()'

# Clean standard build artifacts
if [ ${REMOVE_BUILD} == true ]; then
removal_list=( \
"${SRW_DIR}/build" \
"${SRW_DIR}/share" \
"${SRW_DIR}/include" \
"${SRW_DIR}/lib" \
"${SRW_DIR}/lib64" \
)
if [ ${#directories[@]} -ge 1 ]; then
for dir in ${directories[@]}; do
[[ -d "${dir}" ]] && rm -rfv ${dir}
done
echo " "
if [ ${CONTAINER} == true ]; then
removal_list+=("${SRW_DIR}/container-bin")
else
removal_list+=("${SRW_DIR}/exec")
fi
fi
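Accumulating targets in a bash array, with conditional `+=` appends as above, keeps the later confirm-and-delete loop generic: it only ever iterates over `removal_list`. A compact sketch of the pattern (paths illustrative):

```shell
#!/usr/bin/env bash
# Build a deletion set as an array with conditional appends,
# as devclean.sh does with removal_list.
CONTAINER=false
removal_list=( "./build" "./lib" )
if [ "${CONTAINER}" = true ]; then
  removal_list+=( "./container-bin" )
else
  removal_list+=( "./exec" )
fi
echo "${#removal_list[@]} targets: ${removal_list[*]}"   # prints: 3 targets: ./build ./lib ./exec
```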
# Clean all the submodules if requested. Note: Need to check out them again before attempting subsequent builds, by sourcing ${SRW_DIR}/manage_externals/checkout_externals
if [ ${INCLUDE_SUB_MODULES} == true ]; then
printf '%s\n' "Removing submodules ..."

# Clean all the submodules if requested.
if [ ${REMOVE_SUB_MODULES} == true ]; then
declare -a submodules='()'
submodules=(${SRW_DIR}/sorc/*)
# echo " submodules are: ${submodules[@]} (total of ${#submodules[@]}) "
if [ ${#submodules[@]} -ge 1 ]; then
for sub in ${submodules[@]}; do [[ -d "${sub}" ]] && ( rm -rf ${sub} && printf '%s\n' "rm -rf ${sub}" ); done
submodules=(./sorc/*)
# Only add directories to make sure we don't delete CMakeLists.txt
for sub in ${submodules[@]}; do [[ -d "${sub}" ]] && removal_list+=( "${sub}" ); done
if [ "${VERBOSE}" = true ] ; then
printf '%s\n' "Note: Need to check out submodules again for any subsequent builds, " \
" by running ${SRW_DIR}/manage_externals/checkout_externals "
fi
printf '%s\n' "Note: Need to check out submodules again for any subsequent builds, " \
" by sourcing ${SRW_DIR}/manage_externals/checkout_externals "
fi
#

# Clean conda if requested
if [ "${REMOVE_CONDA}" = true ] ; then
printf '%s\n' "Removing conda location file"
rm -rf ${SRW_DIR}/conda_loc
printf '%s\n' "Removing conda installation"
rm -rf ${CONDA_DIR}
# Do not read "conda_loc" file to determine location of conda install; if the user has changed it to a different location
# they likely do not want to remove it!
conda_location=$(<${SRW_DIR}/conda_loc)
if [ "${VERBOSE}" = true ] ; then
echo "conda_location=$conda_location"
fi
if [ "${conda_location}" == "${SRW_DIR}/conda" ]; then
removal_list+=("${SRW_DIR}/conda_loc")
removal_list+=("${SRW_DIR}/conda")
else
echo "WARNING: location of conda build in ${SRW_DIR}/conda_loc is not the default location!"
echo "Will not attempt to remove conda!"
fi
fi
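The `$(<file)` expansion above reads a file's contents without spawning `cat`, and the comparison refuses to schedule deletion when `conda_loc` points somewhere non-default. A sketch of that guard in isolation (paths illustrative, created in a temp directory):

```shell
#!/usr/bin/env bash
# Read a recorded install path with $(<file) and only mark it for removal
# when it matches the expected default, mirroring the conda_loc check.
workdir=$(mktemp -d)
echo "${workdir}/conda" > "${workdir}/conda_loc"

conda_location=$(<"${workdir}/conda_loc")   # trailing newline is stripped
if [ "${conda_location}" = "${workdir}/conda" ]; then
  verdict="remove"
else
  verdict="keep"
fi
echo "${verdict}"   # prints: remove
```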

# If array is empty, that means user has not selected any removal options
if [ ${#removal_list[@]} -eq 0 ]; then
usage_error "No removal options specified"
fi

while [ ${REMOVE} == false ]; do
# Make user confirm deletion of directories unless '--force' option was provided
printf "The following files/directories will be deleted:\n\n"
for i in "${removal_list[@]}"; do
echo "$i"
done
echo ""
read -p "Confirm that you want to delete these files/directories! (Yes/No): " choice
case ${choice} in
[Yy]* ) REMOVE=true ;;
[Nn]* ) echo "User chose not to delete, exiting..."; exit ;;
* ) printf "Invalid option selected.\n" ;;
esac
done
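The loop above re-prompts until the user answers Yes or No, flipping `REMOVE` only on an explicit Yes. A compact function-based sketch of the same pattern, fed canned input so it runs non-interactively:

```shell
#!/usr/bin/env bash
# Re-prompt until the answer is Yes or No, as devclean.sh does before deleting.
confirm () {
  local choice
  while true; do
    read -r -p "Delete these files? (Yes/No): " choice || return 1
    case "${choice}" in
      [Yy]* ) return 0 ;;
      [Nn]* ) return 1 ;;
      * ) echo "Invalid option selected." ;;   # loop and ask again
    esac
  done
}

# Simulate a user typing an invalid answer, then "Yes".
if printf 'maybe\nYes\n' | confirm; then
  echo "confirmed"
fi
```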

if [ ${REMOVE} == true ]; then
for dir in ${removal_list[@]}; do
echo "Removing ${dir}"
if [ "${VERBOSE}" = true ] ; then
rm -rfv ${dir}
else
rm -rf ${dir}
fi
done
echo " "
echo "All the requested cleaning tasks have been completed"
echo " "
fi

echo " "
echo "All the requested cleaning tasks have been completed"
echo " "

exit 0

10 changes: 4 additions & 6 deletions doc/ContribGuide/contributing.rst
@@ -227,15 +227,13 @@ Here is the template that is provided when developers click "Create pull request
<!-- Explicitly state what tests were run on these changes, or if any are still pending (for README or other text-only changes, just put "None required"). Make note of the compilers used, the platform/machine, and other relevant details as necessary. For more complicated changes, or those resulting in scientific changes, please be explicit! -->
<!-- Add an X to check off a box. -->
- [ ] hera.intel
- [ ] orion.intel
- [ ] hercules.intel
- [ ] cheyenne.intel
- [ ] cheyenne.gnu
- [ ] derecho.intel
- [ ] gaea.intel
- [ ] gaeac5.intel
- [ ] hera.gnu
- [ ] hera.intel
- [ ] hercules.intel
- [ ] jet.intel
- [ ] orion.intel
- [ ] wcoss2.intel
- [ ] NOAA Cloud (indicate which platform)
- [ ] Jenkins
2 changes: 1 addition & 1 deletion doc/UsersGuide/BackgroundInfo/Components.rst
@@ -22,7 +22,7 @@ UFS Preprocessing Utilities (UFS_UTILS)

The SRW Application includes a number of pre-processing utilities (UFS_UTILS) that initialize and prepare the model. Since the SRW App provides forecast predictions over a limited area (rather than globally), these utilities generate a regional grid (``regional_esg_grid/make_hgrid``) along with :term:`orography` (``orog``) and surface climatology (``sfc_climo_gen``) files on that grid. Grids include a strip, or "halo," of six cells that surround the regional grid and feed in lateral boundary condition data. Since different grid and orography files require different numbers of :term:`halo` cells, additional utilities handle topography filtering and shave the number of halo points (based on downstream workflow component requirements). The pre-processing software :term:`chgres_cube` is used to convert the raw external model data into initial and lateral boundary condition files in :term:`netCDF` format. These are needed as input to the :term:`FV3` limited area model (:term:`LAM`). Additional information about the UFS pre-processing utilities can be found in the :doc:`UFS_UTILS Technical Documentation <ufs-utils:index>` and in the `UFS_UTILS Scientific Documentation <https://ufs-community.github.io/UFS_UTILS/index.html>`__.

The SRW Application can be initialized from a range of operational initial condition files. It is possible to initialize the model from the Global Forecast System (:term:`GFS`), North American Mesoscale (:term:`NAM`) Forecast System, Rapid Refresh (:term:`RAP`), and High-Resolution Rapid Refresh (:term:`HRRR`) files in Gridded Binary v2 (:term:`GRIB2`) format. GFS files also come in :term:`NEMSIO` format for past dates.
The SRW Application can be initialized from a range of operational initial condition files. It is possible to initialize the model from the Global Forecast System (:term:`GFS`), North American Mesoscale (:term:`NAM`) Forecast System, Rapid Refresh (:term:`RAP`), High-Resolution Rapid Refresh (:term:`HRRR`), and Rapid Refresh Forecast System (:term:`RRFS`) files in Gridded Binary v2 (:term:`GRIB2`) format. GFS files also come in :term:`NEMSIO` format for past dates.

.. WARNING::
For GFS data, dates prior to 1 January 2018 may work but are not guaranteed. Public archives of model data can be accessed through the `NOAA Operational Model Archive and Distribution System <https://nomads.ncep.noaa.gov/>`__ (NOMADS). Raw external model data may be pre-staged on disk by the user.
2 changes: 1 addition & 1 deletion doc/UsersGuide/BuildingRunningTesting/AQM.rst
@@ -123,7 +123,7 @@ The community AQM configuration assumes that users have :term:`HPSS` access and
USE_USER_STAGED_EXTRN_FILES: true
EXTRN_MDL_SOURCE_BASEDIR_LBCS: /path/to/data
On Level 1 systems, users can find :term:`ICs/LBCs` in the usual :ref:`input data locations <Data>` under ``FV3GFS/netcdf/2023021700`` and ``FV3GFS/netcdf/2023021706``. Users can also download the data required for the community experiment from the `UFS SRW App Data Bucket <https://noaa-ufs-srw-pds.s3.amazonaws.com/index.html#input_model_data/FV3GFS/netcdf/>`__.
On Level 1 systems, users can find :term:`ICs/LBCs` in the usual :ref:`input data locations <Data>` under ``FV3GFS/netcdf/2023021700`` and ``FV3GFS/netcdf/2023021706``. Users can also download the data required for the community experiment from the `UFS SRW App Data Bucket <https://noaa-ufs-srw-pds.s3.amazonaws.com/index.html#develop-20240618/input_model_data/FV3GFS/netcdf/>`__.

Users may also wish to change :term:`cron`-related parameters in ``config.yaml``. In the ``config.aqm.community.yaml`` file, which was copied into ``config.yaml``, cron is used for automatic submission and resubmission of the workflow:

6 changes: 3 additions & 3 deletions doc/UsersGuide/BuildingRunningTesting/ContainerQuickstart.rst
@@ -188,8 +188,8 @@ The SRW App requires input files to run. These include static datasets, initial

.. code-block:: console
wget https://noaa-ufs-srw-pds.s3.amazonaws.com/current_srw_release_data/fix_data.tgz
wget https://noaa-ufs-srw-pds.s3.amazonaws.com/current_srw_release_data/gst_data.tgz
wget https://noaa-ufs-srw-pds.s3.amazonaws.com/experiment-user-cases/release-public-v2.2.0/out-of-the-box/fix_data.tgz
wget https://noaa-ufs-srw-pds.s3.amazonaws.com/experiment-user-cases/release-public-v2.2.0/out-of-the-box/gst_data.tgz
tar -xzf fix_data.tgz
tar -xzf gst_data.tgz
@@ -439,4 +439,4 @@ If users have the PBS resource manager installed on their system, the allocation
For more information on the ``qsub`` command options, see the `PBS Manual §2.59.3 <https://2021.help.altair.com/2021.1/PBSProfessional/PBS2021.1.pdf>`__, (p. 1416).

These commands should output a hostname. Users can then run ``ssh <hostname>``. After "ssh-ing" to the compute node, they can run the container from that node. To run larger experiments, it may be necessary to allocate multiple compute nodes.
These commands should output a hostname. Users can then run ``ssh <hostname>``. After "ssh-ing" to the compute node, they can run the container from that node. To run larger experiments, it may be necessary to allocate multiple compute nodes.