# SMT GEOS LocalBuild

*Florian Deconinck edited this page Apr 23, 2024*
- `tcsh` for the GEOS workflow
- A `gcc`/`g++`/`gfortran` compiler (gcc-12 is our current workhorse)

```shell
sudo apt install g++ gcc gfortran
# If you have multiple installations, you might have to set the preferred
# compiler using the combo:
# sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-12 12
# sudo update-alternatives --config gcc
```
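A quick way to confirm the gcc-12 recommendation is met is to parse `gcc -dumpversion`. The helper below is only a sketch; it takes the version string as an argument so it can be exercised even before `gcc` is installed, and the `12` threshold reflects this page's "current workhorse", not a hard requirement.

```shell
# Sketch: check that a GCC version string meets the gcc-12 recommendation.
# The version string is passed in explicitly (e.g. "$(gcc -dumpversion)").
gcc_major_ok() {
    version=$1              # e.g. "12.3.0"
    major=${version%%.*}    # keep the major version only
    [ "$major" -ge 12 ]
}

# Typical use once gcc is installed:
#   gcc_major_ok "$(gcc -dumpversion)" || echo "gcc too old, see update-alternatives above"
```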
- A `cuda` toolkit for GPU backends (nvhpc/23.5+ or appropriate for your hardware)
- Use `lmod` to manipulate the stack with modules like on HPCs: https://lmod.readthedocs.io/en/latest/030_installing.html

```shell
sudo apt install -y lua5.3 lua-bit32 lua-posix lua-posix-dev liblua5.3-0 liblua5.3-dev tcl tcl-dev tcl8.6 tcl8.6-dev libtcl8.6
wget https://github.com/TACC/Lmod/archive/refs/tags/8.7.37.tar.gz
tar -xvf 8.7.37.tar.gz
cd Lmod-8.7.37/
sudo ./configure --prefix=/opt/apps
sudo make install
sudo ln -s /opt/apps/lmod/lmod/init/profile /etc/profile.d/z00_lmod.sh
sudo ln -s /opt/apps/lmod/lmod/init/cshrc /etc/profile.d/z00_lmod.csh
sudo ln -s /opt/apps/lmod/lmod/init/profile.fish /etc/fish/conf.d/z00_lmod.fish
```
- Copy code from SMT from the `feature/sw` branch in `sw_stack/discover/sles15`
- Change all `/discover` paths to local ones
- In `build_0_on-node.sh`, remove the `--with-gdrcopy=` line for UCX (you don't have it installed and/or it doesn't matter outside of HPCs)
- Then run the pipeline:

```shell
./download.sh
./build_0_on-node.sh
./build_1_on-login.sh
```
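Since each stage depends on the previous one, it can help to run the three scripts fail-fast so a broken `download.sh` does not silently feed a broken build. This wrapper is just a convenience sketch, not part of the SMT stack:

```shell
# Sketch: run the stack build pipeline, stopping at the first failing step.
run_pipeline() {
    for step in "$@"; do
        echo ">>> $step"
        "$step" || { echo "step failed: $step" >&2; return 1; }
    done
}

# Intended use:
#   run_pipeline ./download.sh ./build_0_on-node.sh ./build_1_on-login.sh
```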
The stack will take some time to install, especially the baselibs part of it. To check that baselibs has been built and installed, you can run `./verify_baselibs.sh`, which should output:

```
-------+---------+---------+--------------
Config | Install |  Check  | Package
-------+---------+---------+--------------
  ok   |   ok    |   --    | jpeg
  ok   |   ok    |   --    | zlib
  ok   |   ok    |   --    | szlib
  ok   |   ok    |   --    | hdf5
  ok   |   ok    |   --    | netcdf
  ok   |   ok    |   --    | netcdf-fortran
  ok   |   ok    |   --    | udunits2
  ok   |   ok    |   --    | fortran_udunits2
  ok   |   ok    |   --    | esmf
  ok   |   ok    |   --    | GFE
-------+---------+---------+--------------
```
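If the `verify_baselibs.sh` output scrolls by, the table can be scanned mechanically. The filter below is a sketch assuming the `Config | Install | Check | Package` layout shown above; it prints the name of any package whose Config or Install column is not `ok`:

```shell
# Sketch: list baselibs packages whose Config or Install column is not "ok",
# assuming the table layout printed by ./verify_baselibs.sh above.
failed_baselibs() {
    awk -F'|' '/\|/ && $1 !~ /Config/ {
        gsub(/ /, "", $1); gsub(/ /, "", $2); gsub(/ /, "", $4)
        if ($1 != "ok" || $2 != "ok") print $4
    }'
}

# Intended use:
#   ./verify_baselibs.sh | failed_baselibs
```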
- Get and set up `mepo`, which is pre-installed on Discover
  - `mepo` stands for "Multiple rEPOsitory": a Python tool maintained by Matt T. in the SI team that handles the GEOS multi-repository strategy

```shell
git clone git@github.com:GEOS-ESM/mepo.git
# ... in your .bashrc ...
alias mepo='DIR_TO_MEPO/mepo'
```
- Git clone the root of GEOS; then, using `mepo`, clone the rest of the components, which pulls on the `components.yaml` at the root of `GEOSgcm`
  - There are two sources for a component: the default and the `develop` source
  - `v11.5.2` is our baseline, but we have a `dsl/develop` branch where needed
  - We do not use `develop` for `GEOSgcm_App` or `cmake` since those have been set up for OpenACC but are not up to date for v11.5.2

```shell
git clone -b update/geos_11.5.2 git@github.com:GEOS-ESM/GEOSgcm.git geos
cd geos
mepo clone --partial blobless
mepo develop env GEOSgcm GEOSgcm_GridComp FVdycoreCubed_GridComp pyFV3
```
- CMake GEOS using the stack and our custom-built `baselibs`, turning on the interface to `pyFV3` for the dynamical core
- Then `make install`. Grab a coffee (or 2)
  - We override the compilers with their `mpi` counterparts to make sure `libmpi` is pulled in properly. CMake should deal with that, but there are some failures floating around.

```shell
module use -a SW_STACK/modulefiles
ml SMTStack/2024.03.00
export BASEDIR=SW_STACK/src/2024.03.00/install/baselibs-7.17.1/install/x86_64-pc-linux-gnu
mkdir -p build
cd build
export TMP=GEOS_DIR/geos/build/tmp
export TMPDIR=$TMP
export TEMP=$TMP
mkdir -p $TMP
echo $TMP
export FC=mpif90
export CC=mpicc
export CXX=mpic++
cmake .. -DBASEDIR=$BASEDIR/Linux \
    -DCMAKE_Fortran_COMPILER=mpif90 \
    -DBUILD_PYFV3_INTERFACE=ON \
    -DCMAKE_INSTALL_PREFIX=../install \
    -DPython3_EXECUTABLE=`which python3`
make -j48 install
```
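After `make install` finishes, it is worth confirming that the executables actually landed under the install prefix. The binary name `GEOSgcm.x` below is an assumption based on what `gcm_run.j` launches; adjust it if your build names the executable differently:

```shell
# Sketch: confirm that an install prefix contains the expected GEOS binary.
# "GEOSgcm.x" is assumed here, not guaranteed by this page.
check_geos_install() {
    prefix=$1
    [ -x "$prefix/bin/GEOSgcm.x" ] || {
        echo "missing $prefix/bin/GEOSgcm.x" >&2
        return 1
    }
}

# Intended use from the build directory:
#   check_geos_install ../install
```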
Some code in GEOS requires MKL. It's unclear which (apart from a random number generator in Radiation), and it seems to be an optional dependency. The `cmake` process will look for it. If you want to install it, follow the steps for the standalone oneAPI MKL installer.
- Get data (courtesy of Matt T.) in its most compact form: TinyBC
  - WARNING: TinyBC is Matt's way to move a small example around and/or develop on a desktop. It's fragile and we should treat it as a dev tool; offer help when needed.
  - Do not rename it

```shell
wget https://portal.nccs.nasa.gov/datashare/astg/smt/geos-fp/TinyBCs-GitV10.2024Apr04.tar.gz
tar -xvf TinyBCs-GitV10.2024Apr04.tar.gz
rm TinyBCs-GitV10.2024Apr04.tar.gz
cd TinyBCs-GitV10
```
- Make an experiment using TinyBC. First we will make our lives easier by aliasing the relevant scripts in your `.bashrc` (remember to re-bash):

```shell
alias tinybc_create_exp=PATH_TO/TinyBCs-GitV10/scripts/create_expt.py
alias tinybc_makeoneday=PATH_TO/TinyBCs-GitV10/scripts/makeoneday.bash
```

- Then we can make an experiment for c24:

```shell
cd PATH_TO/geos/install/bin
tinybc_create_exp --horz c24 --vert 72 --nonhydro --nooserver --gocart C --moist GFDL --link --expdir /PATH_TO/geos/experiment TBC_C24_L72
cd /PATH_TO/geos/experiment/TBC_C24_L72
tinybc_makeoneday noext # Temporary: we deactivate ExtData to work around a crash
tcsh
```
- The setup is now ready to run a 24h simulation (with the original Fortran):

```shell
source PATH_TO/geos/@env/g5_modules.sh # source the env
./gcm_run.j
```
- To run the `pyFV3` version, modifications to `AGCM.rc` and `gcm_run.j` need to be made after `tinybc_makeoneday`. This runs the `numpy` backend.

`AGCM.rc`:

```diff
 ###########################################################
 # dynamics options
 # ----------------------------------------
 DYCORE: FV3
 FV3_CONFIG: HWT
+RUN_PYFV3: 1
 AdvCore_Advection: 0
```

`gcm_run.j`:

```diff
 setenv RUN_CMD "$GEOSBIN/esma_mpirun -np "
 setenv GCMVER `cat $GEOSETC/.AGCM_VERSION`
 echo VERSION: $GCMVER
+setenv FV3_DACEMODE Python
+setenv GEOS_PYFV3_BACKEND numpy
+setenv PACE_CONSTANTS GEOS
+setenv PACE_FLOAT_PRECISION 32
+setenv PYTHONOPTIMIZE 1
+setenv PACE_LOGLEVEL Debug
+setenv FVCOMP_DIR PATH_TO_GEOS/src/Components/@GEOSgcm_GridComp/GEOSagcm_GridComp/GEOSsuperdyn_GridComp/@FVdycoreCubed_GridComp/
+setenv PYTHONPATH $FVCOMP_DIR/python/interface:$FVCOMP_DIR/python/@pyFV3
 #######################################################################
 # Experiment Specific Environment Variables
 ######################################################################
```
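Before launching `gcm_run.j` with the pyFV3 path enabled, a small helper can confirm that the `setenv` additions above actually took effect in the current shell. This is only a convenience sketch; the variable names to check are the ones from the `gcm_run.j` diff above:

```shell
# Sketch: verify that a list of environment variables is set and non-empty.
require_env() {
    missing=0
    for v in "$@"; do
        eval "val=\${$v:-}"
        if [ -z "$val" ]; then
            echo "unset: $v" >&2
            missing=1
        fi
    done
    return $missing
}

# Intended use (variables mirror the gcm_run.j additions above):
#   require_env FV3_DACEMODE GEOS_PYFV3_BACKEND PACE_CONSTANTS \
#               PACE_FLOAT_PRECISION PYTHONPATH
```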