Technical documentation
The first step in coupling models together is the generation of remapping weights files. The weights files are used by the coupler (OASIS3-MCT or Tango) at runtime to transfer the coupling fields.
For the 1 and 0.25 degree ACCESS-OM configurations, OASIS3-MCT/SCRIP has always been used to generate the remapping weights. This approach does not scale well: at 1 deg the generation step takes several minutes, and at 0.25 deg it takes 5-6 hours, which makes it impractical for a 0.1 degree grid.
Fortunately ESMF can be used instead. It supports multiprocessing and outputs weights files in a format that OASIS can use at runtime, so in effect we are just replacing the SCRIP part of OASIS with ESMF. Furthermore, the latest versions also support higher-order interpolation schemes.
The ACCESS-OM2 repository includes a tool, `make_remap_weights.py`, that uses ESMF to create the weights files.
Regenerate the weights with:
```
qsub make_remap_weights.sh
```
ESMF itself must be built before the weights can be generated. You shouldn't need to do this, as `make_remap_weights.sh` loads the appropriate ESMF module, but for reference it is done with:
```
cd $ACCESS_OM_DIR/tools/contrib
./build_esmf_on_raijin.sh
```
Check that the new `ESMF_RegridWeightGen` executable has been built successfully:
```
ls $ACCESS_OM_DIR/tools/contrib/bin/ESMF_RegridWeightGen
```
and then regenerate the weights as above.
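The weights files are netCDF and can be sanity-checked directly; weights files of this kind store the remapping as a sparse matrix, which in ESMF_RegridWeightGen output lives in the variables `col`, `row` and `S`. A minimal sketch (the file name here is illustrative, not an actual output name):

```python
import netCDF4

# Inspect an ESMF-generated remapping weights file (example name only;
# actual names depend on the grids and interpolation method used).
with netCDF4.Dataset('rmp_core2_to_mom_conserve.nc') as f:
    col = f.variables['col'][:]  # 1-based source grid cell indices
    row = f.variables['row'][:]  # 1-based destination grid cell indices
    S = f.variables['S'][:]      # interpolation weights
    print(f'{len(S)} non-zero entries in the remapping matrix')
```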
This section shows some runtime and error comparisons between OASIS3-MCT/SCRIP and ESMF. The tests pass a field from a CORE2 grid to a MOM 1 degree grid and measure the conservation error and runtime of the remapping.
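A conservation check of this kind can be sketched as follows (a minimal illustration using the sparse-matrix weights layout above, not the actual test code):

```python
import numpy as np

def apply_weights(src, n_dst, col, row, s):
    """Remap a flattened source field with sparse weights (1-based
    indices, as stored in ESMF/SCRIP weights files)."""
    dst = np.zeros(n_dst)
    np.add.at(dst, row - 1, s * src[col - 1])  # accumulate repeated rows
    return dst

def conservation_error(src, dst, src_area, dst_area):
    """Relative error in the globally integrated field after remapping."""
    src_total = np.sum(src * src_area)
    dst_total = np.sum(dst * dst_area)
    return abs(dst_total - src_total) / abs(src_total)
```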
The tests on the 1 degree grid show that ESMF is more conservative than OASIS3-MCT/SCRIP:
Remapper | Relative Error | Runtime (s)
---|---|---
OASIS3-MCT/SCRIP | 9.5e-10 | 118.5
ESMF | 3.4e-16 | 23.2
From the CORE2 grid to a MOM 0.1 degree grid:

Remapper | Relative Error | Runtime (s)
---|---|---
OASIS3-MCT/SCRIP | - | -
ESMF | 1.6e-14 | 169.7
There are no numbers for OASIS3-MCT/SCRIP on the 0.1 degree grid because the run did not complete after several hours.
The runtimes give the time it takes to build the weights file and remap a single field. They are rough and unoptimised; for example, the OASIS remapping is done in Fortran, while the ESMF one uses Python.
The OASIS3-MCT/SCRIP error looks high and might be worth further investigation if that remapping approach is to be used. For the 0.1 degree configuration we are restricted to ESMF in any case, for performance reasons.
Remapping weight creation and remapping error are tested by ARCCSS Jenkins.
This section introduced new Python tools and packages that can:
- Generate weights files using the `ESMF_RegridWeightGen` tool. This overcomes the performance limitations of OASIS3-MCT/SCRIP and makes it easy to build weights files for high-resolution grids.
- Compare how well ESMF and SCRIP conserve fields passed between the atmosphere and ocean grids. This shows that there is no downside to using ESMF over SCRIP.
This section describes how to create the initial coupling field restarts.
A list of MOM5 diagnostics is here: MOM_diags.txt (best viewed as "raw").
A comprehensive tabulation of MOM5 diagnostics does not exist.
A set of Unix commands to generate such a list is offered on p. 413 of S. M. Griffies, "Elements of the Modular Ocean Model (MOM) 5 (2012 release with updates)" [Technical Report 7, NOAA/Geophysical Fluid Dynamics Laboratory Ocean Group, February 2012], but this does not deal well with diagnostic registrations that span multiple lines.
The list offered here is more complete, but not without its own problems. Hopefully it will be of some use.
This is a work in progress.
Some of these diags may not actually be available, because the functions that register them may not be called.
Also, many names are programmatically generated, so you have to dive into the code to figure out what they are, e.g.
```fortran
id_yflux_adv_int_z(n) = register_diag_field('ocean_model',                 &
    trim(t_prog(n)%name)//'_yflux_adv_int_z', grd%tracer_axes_flux_y(1:2), &
    time%model_time, 'z-integral of cp*rho*dxt*v*temp', 'Watts',           &
    missing_value=missing_value, range=(/-1.e18,1.e18/))
```
(we are working on dealing with that case)
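For instance, in the registration above `t_prog(n)%name` loops over the prognostic tracers, so it produces one diagnostic per tracer. A sketch of how such a templated name expands (the tracer list here is illustrative; the actual list is configuration-dependent):

```python
# Hypothetical expansion of a templated diag name over prognostic tracers.
prognostic_tracers = ['temp', 'salt']  # example only
suffix = '_yflux_adv_int_z'
for tracer in prognostic_tracers:
    print(tracer + suffix)  # temp_yflux_adv_int_z, salt_yflux_adv_int_z
```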
The diag list was produced by Marshall Ward using flint to parse the entire MOM codebase via this script (named `parse.py`):
```python
from flint.project import Project

proj = Project(verbose=True)
proj.parse('mom5/mom5')
```
The `parse.py` script was run (`python` works fine if 3.x is installed):
```
python3 parse.py > out
```
The parsed output `out` is then filtered with `grep` to list just the lines with diag_table field registrations:
```
grep -e "^mom5" -e "= *register_diag_field" out > MOM_diags.txt
```
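If you only want the diagnostic names rather than the full registration lines, a rough post-processing sketch (the regex is illustrative and leaves programmatically generated names unresolved):

```python
import re

# The diag name is the second argument of register_diag_field; capture it
# (it may be a quoted literal or a name-building expression).
pattern = re.compile(r"register_diag_field\s*\(\s*'[^']*'\s*,\s*([^,]+)")

with open('MOM_diags.txt') as f:
    for line in f:
        match = pattern.search(line)
        if match:
            print(match.group(1).strip())
```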
MOM5 and CICE5 normally run with processor land masking (no processors allocated to tiles that are all land), so grid data from production runs has regions that are NaN, which causes problems with plotting and analysis.
We therefore have separate grid files, specially created by a 1-timestep run with no processor land masking:
```
/g/data/ik11/grids/ocean_grid_01.nc
/g/data/ik11/grids/ocean_grid_025.nc
/g/data/ik11/grids/ocean_grid_10.nc
```
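For example, these files provide NaN-free cell areas for area-weighted analysis of production output. A sketch (the production output file and variable names are hypothetical; the grid file and output must share the same horizontal grid):

```python
import xarray as xr

# Unmasked grid file: cell areas with no NaN holes over land-masked processors.
grid = xr.open_dataset('/g/data/ik11/grids/ocean_grid_10.nc')

# Some 2D field from a production run (hypothetical file/variable names).
sst = xr.open_dataset('ocean_month.nc')['surface_temp'].isel(time=0)

# Area-weighted global mean, counting only ocean points present in the field.
weights = grid['area_t'].where(sst.notnull())
mean_sst = float((sst * weights).sum() / weights.sum())
print(mean_sst)
```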
Here's how to create them (using 1 deg as an example):
- clone the model repo:

  ```
  git clone https://github.com/COSIMA/1deg_jra55_ryf.git 1deg_jra55_ryf_grid
  cd 1deg_jra55_ryf_grid
  mkdir mom_1deg
  ```
- open `config.yaml` to identify the ocean input directory, make a local symbolic copy of it, and remove `ocean_mask_table` so processor masking won't be used, e.g.

  ```
  cp -s /g/data/ik11/inputs/access-om2/input_20201102/mom_1deg/* mom_1deg
  rm mom_1deg/ocean_mask_table
  ```
- replace the ocean input directory in `config.yaml` with the full path of the copy you just created
- set `restart_period = 0, 0, 86400` in `accessom2.nml`
- `cd ocean` and edit `diag_table_source.yaml`:
  - in the `defaults` section of `'static 2d grid data'`:
    - add `packing: 1` (for double precision)
    - set `file_name_dimension` to the file name you want (e.g. `ocean_grid_10`)
    - comment out everything but `file_name` in the `file_name_dimension` subsection
  - include all the fields you want in the `fields` section of `'static 2d grid data'`, e.g.
    ```yaml
    fields:
      area_t:
      area_u:
      drag_coeff:
      dxt:
      dxu:
      dyt:
      dyu:
      geolat_c:
      geolat_t:
      geolon_c:
      geolon_t:
      ht:
      hu:
      kmt:
      kmu:
    ```
  - delete everything after this in `diag_table_source.yaml`
- do `./make_diag_table.py` and check that `diag_table` looks ok
Try running it (`payu run`). It will fail with something like this in `access-om2.err`:
```
FATAL from PE 0: MPP_DEFINE_DOMAINS2D: incorrect number of PEs assigned for this layout and maskmap. Use 240 PEs for this domain decomposition for mom_domain
```
Edit `config.yaml` to set the ocean `ncpus` to the number of PEs suggested in the error message.
Run again:
```
payu sweep
payu run
```
Hopefully it will work this time! The grid data will be in `archive/output000/ocean/ocean_grid_10.nc`.
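A quick way to confirm the new grid file is free of the NaN holes that motivated this exercise (a minimal sketch; adjust the path to your run):

```python
import xarray as xr

# Count NaNs in every variable of the newly created grid file.
ds = xr.open_dataset('archive/output000/ocean/ocean_grid_10.nc')
for name, var in ds.data_vars.items():
    print(f'{name}: {int(var.isnull().sum())} NaNs')
```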