
Welcome to the SimpleLand wiki!

THIS IS A WORK IN PROGRESS!!!! Updating instructions as I have the chance!

Background Info:

The Simple Land Interface Model (SLIM) is an idealized land surface model that couples to CESM in place of CLM. A full model description is available in the supplementary information of Laguë et al. 2019: https://journals.ametsoc.org/jcli/article/32/18/5725/344078/Separating-the-Impact-of-Individual-Land-Surface

In its current form, SLIM works coupled to CESM2.1 - that is, if the configuration you want (except for the land) can be set up with the code from CESM2.1 (e.g. CAM5, CAM6), you should be able to swap out CLM for SLIM using the code here. A few caveats on that:

  • SLIM runoff isn't currently being passed to the coupler; if you're wanting to run with a dynamic ocean (not a slab ocean or fixed SSTs), we'll need to implement that. Straightforward enough for liquid water, but we'll likely have to devise some sort of "calving" scheme to avoid snow building up over Greenland/Antarctica.

  • I've typically used SLIM with a slab ocean in a realistic continental configuration, where the q-fluxes for the slab ocean come from a coupled CAM5 CESM simulation. These q-fluxes may or may not be suitable for your purposes.

Notable details:

Building a SLIM-CESM run is very similar to building a normal CESM run, just with a bit more tweaking of your inputs than an out-of-the-box CESM configuration requires before you're ready to go. A quick summary of the notable differences from a "normal" CESM run:

  • create your case using a "longname" compset, where you tell the model exactly what combination of atm/land/ocean/ice etc you'd like to use

  • you're going to tell CESM to build with CLM, but since you're using the code from this repository, SLIM is actually sitting where CLM would otherwise sit. As far as the rest of CESM knows, it is still using CLM, but the actual CLM code has been entirely replaced by SLIM. One day, hopefully we'll make this interface a little smoother so you can tell CESM "build with CLM" or "build with SLIM" from the same codebase.

  • Right now, if you're running CLM simulations AND SLIM simulations, you need to keep two separate copies of the source code - one for CLM, and one for SLIM.

  • The input files for SLIM are not currently in the main CESM inputdata folder on Cheyenne. However, the main input file is a fairly small netCDF file, and it is the main way you control what your land surface looks like - you'll likely want to edit this file to set up the land you want. An example file is linked below.


Getting Started:

Detailed instructions coming soon. Below is a work in progress that rapidly deteriorates!


These steps should take you from the "download the code from github" step all the way to "yay, I'm running a SLIM-CESM simulation!"

Get the code

  1. On github (here, welcome!), navigate to the repository for SLIM: https://github.com/ESCOMP/SimpleLand
  2. Make sure you're on the (as of Feb 2022, anyhow) branch called cesm2_1_slim_mml
  3. In the terminal/command line, log on to whatever server you'd like your copy of the SLIM source code to live on. I'm using Cheyenne - if you use that machine, you can see where I've downloaded the SLIM code here: /glade/u/home/mlague/cesm_source/slim_example

NOTE: do not try to download SLIM into an existing CESM codebase. Make a clean, new directory just for the version of the CESM source code that includes SLIM. You'll download SLIM first using these instructions, and then, as part of these instructions, you'll download the rest of CESM from the CESM github into that same folder. SLIM "borrows" CLM's coupling infrastructure, so right now SLIM and CLM can't co-exist in the same codebase. You need one copy for "regular" CESM and one for "SLIM" CESM.

  4. Copy the code from github to your folder! e.g.
    git clone https://github.com/ESCOMP/SimpleLand.git
  5. You'll now have a folder called "SimpleLand". Open it.
    cd SimpleLand
  6. Now you'll need to check out all the "external" components SLIM needs to run - in particular, this will fetch the source code for CAM. To do this, type:
    ./manage_externals/checkout_externals
    Note that this also downloads a bunch of other stuff, like all the FATES source code, that SLIM doesn't - and indeed can't - ever use. One day this will be tidied up so that it no longer needs to happen. (Not downloading it is easy; the problem is that when the model initializes it looks for that source code and is unhappy when it can't find it. I haven't taken the time to figure out how to stop that, since it isn't a problem if I'm willing to cart around a little extra model baggage. Not ideal, but it's on the list of things to clean up later!)
  7. this step no longer exists :)

Create a test case!

In particular, I'll walk through a SLIM-CAM-SOM example here. We're going to use longnames to configure the compset, so you can make whatever you want (e.g. an offline SLIM-only run forced with reanalysis), but the coupled case is (kind of) more involved, so that's the one I'll walk through. You can tweak this as necessary to get the setup you want. The most involved SLIM setup is one where you (a) run coupled with CAM and save all the coupler output, then (b) use that atmospheric output to force an offline SLIM run. Instructions for that will be added later.

  8. Navigate to the cime/scripts folder. If you're following along on Cheyenne, this is: /glade/u/home/mlague/cesm_source/slim_example/SimpleLand/cime/scripts
  9. Create a case! In this example, my casename is slim_cam5_som_example, and I'm creating it with year-2000 CO2, orbital forcing, etc., CAM5 (the CAM50 token in the longname), a slab ocean (DOCN%SOM), and SLIM. Note that in the "land" slot we're telling it to use CLM50%BGC - but in the actual source code for the land model, CLM has been clobbered and SLIM is sitting there instead. This is why you still have to use this github repo to run SLIM, and it's the main hurdle to getting SLIM into CESM properly...
    ./create_newcase --case /glade/u/home/mlague/cesmruns/simple_land/slim_cam5_som_example --compset 2000_CAM50_CLM50%BGC_CICE_DOCN%SOM_SROF_SGLC_SWAV --res f19_g16 --run-unsupported

Navigate to your new case!

I'm going to refer to this as your "case folder", e.g. for me:
/glade/u/home/mlague/cesmruns/simple_land/slim_cam5_som_example

You'll need to make (or copy) some surface data files to get your run going. This includes a file called mml_surdat, which is like fsurdat for CLM and basically tells SLIM what its land surface properties are on a lat x lon x month grid. Also, if you're running with a slab ocean (like this example), you may want a q-flux file. I've put an example of each of these files (a very idealized SLIM input file where all land looks the same and has an albedo of 0.1, and a q-flux file from a coupled CESM1.2 land-atm-ocean run) in a different repo, here: https://github.com/marysa/SLIM_files

Note, there are also sample files that are in the normal CESM inputdata mechanism and will be downloaded for you when you create a case.

  10. You'll need to make (or copy) a surface data file somewhere on the computer/server you're going to be running SLIM from. For example, I've got a file here (and also in the above repo):

/glade/u/home/mlague/cesmruns/simple_land/inputs/global_pert/Cheyenne_inputs/globalconst_alpha0.1_soilcv2e6_hc0.1_rs100.0_glc_hc0.01_f19_20170829.nc

** NOTE ** The above file has intermittently been producing a NaN in the land model a few months into the run. It has also... worked just fine. While I dig into what is going on there, if you just want to get SLIM running, use the following file instead! https://github.com/marysa/SLIM_files/blob/master/slim_realistic_fromCLM5_alb2000_hc2000_rs2000annual_f19_20190621.nc

To customize your land surface, you can use either of these files (to avoid the NaN issue, preferably the "realistic" one): load it into python or whatever language you like to edit netcdfs with, modify the surface properties to be what you want, then save it out as a new file and use that for your run.
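For example, here is a minimal sketch of that edit in Python, using the netCDF4 package. The file names and the uniform 0.2 albedo are just placeholders; the albedo variable names are the ones described under "Customizing your SLIM run" below.

    import shutil
    from netCDF4 import Dataset

    # Start from one of the example SLIM surface files and work on a copy
    # (both file names here are placeholders - point them at your own files).
    src = "slim_realistic_fromCLM5_alb2000_hc2000_rs2000annual_f19_20190621.nc"
    dst = "slim_myexperiment_alb0.2_f19.nc"
    shutil.copy(src, dst)

    # Open the copy in append mode and overwrite the four ground albedo fields
    # (visible/near-ir, direct/diffuse) with a uniform 0.2 for every month and
    # grid cell, just as an illustration of changing a surface property.
    nc = Dataset(dst, "a")
    for var in ("alb_gvd", "alb_gvf", "alb_gnd", "alb_gnf"):
        nc.variables[var][:] = 0.2
    nc.close()

Whatever file you save out is what you'll point mml_surdat at in user_nl_clm (below).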

  11. If you're using a slab ocean, you'll also need a file for q-fluxes. I've got a file here (and also in the above repo):
    /glade/p/work/mlague/CESM/som/qflux/pop_frc.b.e11.B1850C5CN.f09_g16.005.082914_j.nc
  12. If you're using a slab ocean, copy the file user_docn.streams.txt.som from the above repo (https://github.com/marysa/SLIM_files) into your case folder. Inside user_docn.streams.txt.som, update the appropriate lines so the location and name of your q-flux file are correct (there's a sketch of the relevant fragment just before step 15 below).
  13. Run ./case.setup
  14. Copy the file user_nl_clm from https://github.com/marysa/SLIM_files, or edit the user_nl_clm file that just popped up in your case folder, so that it points to the netCDF input file that will control what SLIM looks like. E.g. for me, this is the line:
    mml_surdat = '/glade/u/home/mlague/cesmruns/simple_land/inputs/global_pert/Cheyenne_inputs/globalconst_alpha0.1_soilcv2e6_hc0.1_rs100.0_glc_hc0.01_f19_20170829.nc'

At this point, use the file name of whichever surface data file you're actually using, e.g. slim_realistic_fromCLM5_alb2000_hc2000_rs2000annual_f19_20190621.nc instead of globalconst_alpha0.1_soilcv2e6_hc0.1_rs100.0_glc_hc0.01_f19_20170829.nc
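Going back to the q-flux stream file from step 12: the lines you're editing in user_docn.streams.txt.som are the <filePath> and <fileNames> entries. A sketch of the relevant fragment (take the actual file from the SLIM_files repo, and only change the path and file name to match wherever you put your q-flux file - the path below is a placeholder):

    <filePath>
        /glade/u/home/yourusername/your_qflux_directory
    </filePath>
    <fileNames>
        pop_frc.b.e11.B1850C5CN.f09_g16.005.082914_j.nc
    </fileNames>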

  15. Request SLIM output, and no CLM output.
    This is another "this is not elegant, we should fix this" thing - right now, even though CLM doesn't run, it initializes all of the CLM variables. So, to avoid printing a bunch of blank CLM variables to a netcdf file, and to get the SLIM variables instead, you'll want to (a) suppress all history output, then (b) explicitly request whatever variables you want from SLIM. These are the lines hist_empty_htapes = .true. and hist_fincl1 = ... in the example user_nl_clm file (there's a sketch of the resulting user_nl_clm at the end of this list).
  16. Another silly thing, which may be specific to Cheyenne and has been fixed in the main CESM release code! Before building, go into the xml file env_mach_specific.xml and move the line <arg name="labelstdout">-p "%g:"</arg> so it sits one line higher than it does right now, above <arg name="num_tasks"> -np {{ total_tasks }}</arg>. I'm not kidding. It won't run (on Cheyenne) if you don't do this (or it wouldn't...). I don't know why this is a problem, or why moving the lines fixes it. Erik Kluzek could tell you :p . Note, xml files are very picky about maintaining appropriate spacing - so keep the same indenting that it has right now! Just swap those two lines.
  17. At this point, you may have to modify env_mach_pes.xml and/or other files specifically for the computer you are using (if you are not using Cheyenne). There is an example file for this in https://github.com/marysa/SLIM_files, which you should use if you're running a CAM/SOM/SLIM case like this on Cheyenne (the current defaults will work, they'll just cost you a lot of wall-clock hours).
  18. Okay, the silly stuff is done, now we can build! Type:
    ./case.build
    (or, if you're on Cheyenne, qcmd -- ./case.build)
    Don't be concerned if this takes a while - like, 10 or 20 minutes. It goes much faster if you've been building a lot lately, but if it's the first time you're compiling this code, expect it to be kinda slow. If you're building with a DATM rather than CAM, this will be faster, though still slower than it needs to be, because in addition to compiling SLIM it also compiles a bunch of CLM - even though it won't run CLM. That needs to be tidied up. (It works the way it is; tidying it up so that it integrates properly with the rest of CESM is something I need help with!)
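For reference, here's a sketch of what the user_nl_clm from step 15 can end up looking like. The mml_surdat path is whatever surface file you're using, and the actual list of SLIM history fields should be copied from the example user_nl_clm in https://github.com/marysa/SLIM_files rather than from this placeholder.

    ! user_nl_clm (sketch - the path and the field list are placeholders)
    mml_surdat = '/path/to/your/slim_surface_file.nc'
    hist_empty_htapes = .true.
    ! hist_fincl1 = '...'   request the SLIM output fields here, copied from
    !                       the example user_nl_clm in the SLIM_files repo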

Configure your run

This section is about deciding how long you want to run for, how much wall time you want to ask for, what account to charge (on Cheyenne), etc... All of this is the same as a regular CESM run - see their MUCH MORE CONCISE / CLEAR documentation here: http://www.cesm.ucar.edu/models/cesm2/
19. Edit env_run.xml to set, in particular, STOP_OPTION and STOP_N to be whatever you want them to be (or use ./xmlchange - there's a sketch just after this list). It's your world, you decide how long it gets to exist :p .
20. SUBMIT YOUR CASE! How exciting. Unleash it on the queue, cross your fingers, and hope everything works...
./case.submit
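If you'd rather not edit env_run.xml by hand, the same settings can be made with ./xmlchange from your case folder. A sketch - the values are arbitrary, and PROJECT / JOB_WALLCLOCK_TIME only matter on a machine with a batch system like Cheyenne (the project code below is a placeholder):

    ./xmlchange STOP_OPTION=nyears,STOP_N=5      # run for 5 model years
    ./xmlchange PROJECT=P12345678                # account/project key to charge
    ./xmlchange JOB_WALLCLOCK_TIME=12:00:00      # wall time to request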

Check on your run

  21. If you're on Cheyenne, you can check on the status of your job using qstat -u $USER

Look for your output

  22. Your output will be written to scratch space (which, incidentally, is also where the model was compiled). For the example I've been building here, that is:
    /gpfs/fs1/scratch/mlague/slim_cam5_som_example/run (which will likely get purged before anybody reads this and goes looking for it, but I provide it more as an example of where to go looking for your own scratch/run folder than as a "go look here for something" example)

  23. Hopefully it is running. History files and log files will start showing up in your scratch/run folder. If you told it to archive your output (the default in CESM is to move your output out of your run folder to the "short-term archive"), then once the run finishes, it will take all your history files and move them to some other specified folder. For me, that is /gpfs/fs1/scratch/mlague/archive/slim_cam5_som_example

  24. That's it!

Customizing your SLIM run

  • Take a look at the example input file https://github.com/marysa/SLIM_files/blob/master/globalconst_alpha0.1_soilcv2e6_hc0.1_rs100.0_glc_hc0.01_f19_20170829.nc . You'll want to make a netcdf file like this (or, I suggest, edit this one), to reflect how you want your land surface to look. E.g. if you want your albedo to have a nice pattern, modify the alb_gvd (ground visible direct), alb_gvf (ground visible diffuse), alb_gnd, alb_gnf (ground near-ir direct/diffuse) variables. Right now, SLIM takes an input file that is lat x lon x month. If you wanted to prescribe a time-evolving albedo (e.g. a different albedo every year, not just every month), we'd probably have to tweak the code to handle that.
  • The file https://github.com/marysa/SLIM_files/blob/master/slim_realistic_fromCLM5_alb2000_hc2000_rs2000annual_f19_20190621.nc has a spatially varying set of land surface properties, as an example of a slightly less idealized setup for SLIM (spatially varying albedo, evaporative resistance, and aerodynamic roughness - but still spatially uniform soil thermal properties, bucket capacity, etc).
  • Tweaking the code: the primary file controlling all of the physics in SLIM is mml_main.F90, which sits in SimpleLand/src/main, or, in the example on Cheyenne, here:
    /glade/u/home/mlague/cesm_source/slim_example/SimpleLand/src/main/mml_main.F90
    If you want to mess with the physics (make sure you want to do that before starting haha), this is the file to dig into.
  • If you want to run with a different compset: usually I use the longname option to declare my compset, though I did actually name a few (you can find them in SimpleLand/cime_config/config_compsets.xml). For example, instead of using the longname option for your compset, you could use the H and K compsets, which are, respectively, coupled and offline SLIM runs in CESM: e.g. K_HIST_MMLOFFLINE_GSWP3, which has the longname HIST_DATM%GSWP3v1_CLM45%BGC_SICE_SOCN_SROF_SGLC_SWAV, or H_MML_2000_CAM5_BGC, which has the longname 2000_CAM50_CLM45%BGC_CICE_DOCN%SOM_SROF_SGLC_SWAV (see the sketch below).
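For example, a case using the named offline K compset could be created along the same lines as the coupled example above. This is just a sketch: the case path is a placeholder, and f19_g16 is simply the resolution used elsewhere on this page, so check that it suits your setup.

    ./create_newcase --case /path/to/your/cases/slim_offline_example --compset K_HIST_MMLOFFLINE_GSWP3 --res f19_g16 --run-unsupported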