Failures in GEOSCHEM/HEMCO configurations with different ocean & atm grids #1172

Open
billsacks opened this issue Oct 14, 2024 · 15 comments
Labels: bug (Something isn't working correctly)

@billsacks
Member

What happened?

In doing some testing with the exchange grid (setting aoflux_grid = "xgrid" in user_nl_cpl), I got a failure in SMS_D_Ln9.f09_f09_mg17.FCnudged_GC.derecho_intel.cam-outfrq9s:

dec2455.hsn.de.hpc.ucar.edu 484: forrtl: error (73): floating divide by zero
dec2455.hsn.de.hpc.ucar.edu 484: Image              PC                Routine            Line        Source
dec2455.hsn.de.hpc.ucar.edu 484: libpthread-2.31.s  0000149FBD5108C0  Unknown               Unknown  Unknown
dec2455.hsn.de.hpc.ucar.edu 484: cesm.exe           000000000315AECC  drydep_mod_mp_adu        4108  drydep_mod.F90
dec2455.hsn.de.hpc.ucar.edu 484: cesm.exe           000000000313FCB0  drydep_mod_mp_dep        1774  drydep_mod.F90
dec2455.hsn.de.hpc.ucar.edu 484: cesm.exe           000000000311CE7A  drydep_mod_mp_do_         316  drydep_mod.F90
dec2455.hsn.de.hpc.ucar.edu 484: cesm.exe           0000000002C89ECB  chemistry_mp_chem        3514  chemistry.F90
dec2455.hsn.de.hpc.ucar.edu 484: cesm.exe           00000000012ACFAC  physpkg_mp_tphysa        1604  physpkg.F90
dec2455.hsn.de.hpc.ucar.edu 484: cesm.exe           00000000012A7F77  physpkg_mp_phys_r        1284  physpkg.F90
dec2455.hsn.de.hpc.ucar.edu 484: cesm.exe           00000000009A9FA1  cam_comp_mp_cam_r         290  cam_comp.F90
dec2455.hsn.de.hpc.ucar.edu 484: cesm.exe           0000000000959FFF  atm_comp_nuopc_mp        1136  atm_comp_nuopc.F90
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC6533D86  execute                   377  ESMCI_MethodTable.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC6533932  execute                   563  ESMCI_MethodTable.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC653352A  c_esmc_methodtabl         317  ESMCI_MethodTable.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC68D388B  esmf_attachmethod        1287  ESMF_AttachMethods.F90
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC80493FD  Unknown               Unknown  Unknown
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC618F279  callVFuncPtr             2167  ESMCI_FTable.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC618E2B8  ESMCI_FTableCallE         824  ESMCI_FTable.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC662BAB2  enter                    2501  ESMCI_VMKernel.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC6614346  enter                    1216  ESMCI_VM.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC618F65F  c_esmc_ftablecall         981  ESMCI_FTable.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC6C134FC  esmf_compmod_mp_e        1252  ESMF_Comp.F90
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC74E3D6A  esmf_gridcompmod_        1903  ESMF_GridComp.F90
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC7F85B75  nuopc_driver_mp_r        3694  NUOPC_Driver.F90
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC7F8BDFA  nuopc_driver_mp_e        3940  NUOPC_Driver.F90
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC6533D86  execute                   377  ESMCI_MethodTable.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC6533932  execute                   563  ESMCI_MethodTable.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC653352A  c_esmc_methodtabl         317  ESMCI_MethodTable.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC68D388B  esmf_attachmethod        1287  ESMF_AttachMethods.F90
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC7F83B76  nuopc_driver_mp_r        3615  NUOPC_Driver.F90
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC618F279  callVFuncPtr             2167  ESMCI_FTable.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC618E2B8  ESMCI_FTableCallE         824  ESMCI_FTable.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC662BAB2  enter                    2501  ESMCI_VMKernel.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC6614346  enter                    1216  ESMCI_VM.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC618F65F  c_esmc_ftablecall         981  ESMCI_FTable.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC6C134FC  esmf_compmod_mp_e        1252  ESMF_Comp.F90
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC74E3D6A  esmf_gridcompmod_        1903  ESMF_GridComp.F90
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC7F85B75  nuopc_driver_mp_r        3694  NUOPC_Driver.F90
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC7F8BDFA  nuopc_driver_mp_e        3940  NUOPC_Driver.F90
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC6533D86  execute                   377  ESMCI_MethodTable.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC6533932  execute                   563  ESMCI_MethodTable.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC653352A  c_esmc_methodtabl         317  ESMCI_MethodTable.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC68D388B  esmf_attachmethod        1287  ESMF_AttachMethods.F90
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC7F83B76  nuopc_driver_mp_r        3615  NUOPC_Driver.F90
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC618F279  callVFuncPtr             2167  ESMCI_FTable.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC618E2B8  ESMCI_FTableCallE         824  ESMCI_FTable.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC662BAB2  enter                    2501  ESMCI_VMKernel.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC6614346  enter                    1216  ESMCI_VM.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC618F65F  c_esmc_ftablecall         981  ESMCI_FTable.C
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC6C134FC  esmf_compmod_mp_e        1252  ESMF_Comp.F90
dec2455.hsn.de.hpc.ucar.edu 484: libesmf.so         0000149FC74E3D6A  esmf_gridcompmod_        1903  ESMF_GridComp.F90
dec2455.hsn.de.hpc.ucar.edu 484: cesm.exe           000000000044E467  MAIN__                    141  esmApp.F90
dec2455.hsn.de.hpc.ucar.edu 484: cesm.exe           0000000000425D7D  Unknown               Unknown  Unknown
dec2455.hsn.de.hpc.ucar.edu 484: libc-2.31.so       0000149FB8E0229D  __libc_start_main     Unknown  Unknown
dec2455.hsn.de.hpc.ucar.edu 484: cesm.exe           0000000000425CAA  Unknown               Unknown  Unknown

I also reproduced this failure in a run with the default aoflux_grid = "ogrid" but with different ocean & atm grids: SMS_D_Ln9.f09_g17.FCnudged_GC.derecho_intel.cam-outfrq9s.

The run dies from a divide by zero in the GEOS-Chem drydep_mod, caused by zero values of ustar:

    ! surface resistance for particle
    RS   = 1.e0_f8 / (E0 * USTAR * (EB + EIM + EIN) * R1 )

It looks like the issue arises whenever there is a remapping between the ocean and atmosphere grids – either because the grids actually differ or because we're using the exchange grid (in which case an extra mapping is done even if the atm & ocn grids are the same). In that situation we get 0 values for ocean fields not only in land grid cells (which appear to be handled properly in CAM) but also in grid cells that are entirely sea ice-covered (or entirely sea ice + land).

Here is the relevant code for calculating USTAR in geoschem (in chemistry.F90):

    ! Field      : USTAR
    ! Description: Friction velocity
    ! Unit       : m/s
    ! Dimensions : nX, nY
    ! Note       : We here combine the land friction velocity (fv) with
    !              the ocean friction velocity (ustar)
    DO J = 1, nY
       State_Met(LCHNK)%USTAR     (1,J) =                       &
            cam_in%fv(J)    * ( cam_in%landFrac(J))             &
          + cam_in%uStar(J) * ( 1.0e+0_fp - cam_in%landFrac(J))
    ENDDO

Notice that this code assumes that cam_in%uStar applies in the complement of landFrac, but in fact there is a third relevant fraction, the ice fraction, which is not accounted for here.
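
For concreteness: in a grid cell that is entirely sea ice, landFrac = 0 and the mapped uStar is 0, so the expression above evaluates to USTAR = fv*0 + 0*(1 - 0) = 0, which then produces the divide by zero in the RS expression shown earlier.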

When I put in place a hack to work around this ustar issue:

diff --git a/src/chemistry/geoschem/chemistry.F90 b/src/chemistry/geoschem/chemistry.F90
index ab56200c..2c64670b 100644
--- a/src/chemistry/geoschem/chemistry.F90
+++ b/src/chemistry/geoschem/chemistry.F90
@@ -2817,9 +2817,13 @@ subroutine chem_timestep_tend( state, ptend, cam_in, cam_out, dT, pbuf, fh2o )
     ! Note       : We here combine the land friction velocity (fv) with
     !              the ocean friction velocity (ustar)
     DO J = 1, nY
-       State_Met(LCHNK)%USTAR     (1,J) =                       &
-            cam_in%fv(J)    * ( cam_in%landFrac(J))             &
-          + cam_in%uStar(J) * ( 1.0e+0_fp - cam_in%landFrac(J))
+       if (cam_in%uStar(J) == 0. .and. cam_in%landFrac(J) == 0.) then
+          State_Met(LCHNK)%USTAR(1,J) = 0.1
+       else
+          State_Met(LCHNK)%USTAR     (1,J) =                       &
+               cam_in%fv(J)    * ( cam_in%landFrac(J))             &
+             + cam_in%uStar(J) * ( 1.0e+0_fp - cam_in%landFrac(J))
+       end if
     ENDDO
 
     ! Field      : Z0

I got past the original crash, but got a new one here:

dec1643.hsn.de.hpc.ucar.edu 367: forrtl: error (73): floating divide by zero
dec1643.hsn.de.hpc.ucar.edu 367: Image              PC                Routine            Line        Source
dec1643.hsn.de.hpc.ucar.edu 367: libpthread-2.31.s  000014FBD087C8C0  Unknown               Unknown  Unknown
dec1643.hsn.de.hpc.ucar.edu 367: cesm.exe           0000000003126BE2  drydep_mod_mp_oce         705  drydep_mod.F90
dec1643.hsn.de.hpc.ucar.edu 367: cesm.exe           0000000003139B91  drydep_mod_mp_dep        1549  drydep_mod.F90
dec1643.hsn.de.hpc.ucar.edu 367: cesm.exe           000000000311DA7E  drydep_mod_mp_do_         316  drydep_mod.F90
dec1643.hsn.de.hpc.ucar.edu 367: cesm.exe           0000000002C8AACF  chemistry_mp_chem        3518  chemistry.F90
dec1643.hsn.de.hpc.ucar.edu 367: cesm.exe           00000000012ACFAC  physpkg_mp_tphysa        1604  physpkg.F90
dec1643.hsn.de.hpc.ucar.edu 367: cesm.exe           00000000012A7F77  physpkg_mp_phys_r        1284  physpkg.F90
dec1643.hsn.de.hpc.ucar.edu 367: cesm.exe           00000000009A9FA1  cam_comp_mp_cam_r         290  cam_comp.F90
dec1643.hsn.de.hpc.ucar.edu 367: cesm.exe           0000000000959FFF  atm_comp_nuopc_mp        1136  atm_comp_nuopc.F90

That's dying due to a divide-by-zero of TEMPK in OCEANO3, which I think comes from

    ! Field      : TSKIN
    ! Description: Surface skin temperature
    ! Remarks    : NOT to be confused with TS (T at 2m) (hplin, 3/20/23)
    ! Unit       : K
    ! Dimensions : nX, nY
    State_Met(LCHNK)%TSKIN     (1,:nY) = cam_in%SST(:nY)

which comes from So_t. Note that, like ustar, atmExp_So_t has 0 values over all-ice points in runs with different atm vs. ocn grids. (When the ocean is running on the same grid as the atmosphere, atmExp_So_t is 273.15 over land and doesn't seem to have patches of missing (or 273.15) values over sea ice points. But these 0 values appear when they're running on different grids.)

The 0 values come from the mapping from ocn to atm in the mediator, which uses a normalization of ofrac, which is the current ocean fraction (excluding both sea ice area and land area). @mvertens and I both feel that this ofrac normalization is the correct thing to do (and this was done in cpl7 as well as in CMEPS). Many fields (including "important" ones like the latent and sensible heat fluxes, etc.) are merged between ocn, ice and lnd before being passed to the atmosphere, so they are handled correctly. But a few fields – like ustar and So_t – are ocean-specific and so do not have this merge in the mediator. These fields need to be handled in CAM in a way that recognizes that they only apply over the ocean portion of the grid cell – not the sea ice portion or the land portion.
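
To illustrate the mechanism (schematic notation of my own, not the actual CMEPS code): with ofrac normalization, the ocn-to-atm mapping of an ocean-only field T (e.g. So_t) for an atmosphere cell a with mapping weights w_{ao} is roughly

$$ T_a = \frac{\sum_o w_{ao}\, \mathrm{ofrac}_o\, T_o}{\sum_o w_{ao}\, \mathrm{ofrac}_o} $$

so in atmosphere cells where every overlapping ocean point has ofrac = 0 (entirely sea ice and/or land), there is no ocean contribution and the mapped field comes through as 0, which matches the zero values described above.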

I see three options for the path forward, in order from least to most invasive in other components outside of CAM:

  • Option 1: Change geoschem to be more similar to bulk_aero and modal_aero in this respect, doing its own calculations of ustar at least over sea ice, and possibly over ocean
  • Option 2: Have sea ice send ustar, merge in CAM (see the sketch after this list)
    • CAM's geoschem will then use sea ice's ustar in the merge.
    • Note that bulk_aero and modal_aero could also be updated to use ustar from sea ice and ocean instead of calculating fv there.
  • Option 3: Have sea ice send ustar, do the merge of ustar / fv in the mediator
    • CAM's geoschem can be simplified to use this existing merged quantity everywhere
    • Note that bulk_aero and modal_aero could also be updated to use this merged quantity
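
To make option 2 concrete, here is a minimal sketch (my own illustration, not existing code) of what the merge in GEOS-Chem's chemistry.F90 could look like; cam_in%uStarIce is a hypothetical field that CICE would need to send through the mediator:

    DO J = 1, nY
       ! Hypothetical three-way merge: weight each surface's friction velocity
       ! by its own fraction rather than lumping sea ice in with open ocean.
       ! cam_in%uStarIce is a placeholder name for a sea-ice friction velocity
       ! that is not currently available in CAM.
       State_Met(LCHNK)%USTAR(1,J) =                             &
              cam_in%fv(J)       * cam_in%landFrac(J)            &
            + cam_in%uStar(J)    * cam_in%ocnFrac(J)             &
            + cam_in%uStarIce(J) * cam_in%iceFrac(J)
    ENDDO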

What are the steps to reproduce the bug?

Run SMS_D_Ln9.f09_g17.FCnudged_GC.derecho_intel.cam-outfrq9s

What CAM tag were you using?

cam6_4_032 in cesm3_0_alpha03c

What machine were you running CAM on?

CISL machine (e.g. cheyenne)

What compiler were you using?

Intel

Path to a case directory, if applicable

/glade/derecho/scratch/sacks/SMS_D_Ln9.f09_g17.FCnudged_GC.derecho_intel.cam-outfrq9s.20241011_100351_kob7r6

Will you be addressing this bug yourself?

No

Extra info

For the failing case in /glade/derecho/scratch/sacks/SMS_D_Ln9.f09_g17.FCnudged_GC.derecho_intel.cam-outfrq9s.20241011_100351_kob7r6, I ran it twice. The first time it failed as noted above. The second time I introduced changes to get around the ustar issue:

diff --git a/src/chemistry/geoschem/chemistry.F90 b/src/chemistry/geoschem/chemistry.F90
index ab56200c..2c64670b 100644
--- a/src/chemistry/geoschem/chemistry.F90
+++ b/src/chemistry/geoschem/chemistry.F90
@@ -2817,9 +2817,13 @@ subroutine chem_timestep_tend( state, ptend, cam_in, cam_out, dT, pbuf, fh2o )
     ! Note       : We here combine the land friction velocity (fv) with
     !              the ocean friction velocity (ustar)
     DO J = 1, nY
-       State_Met(LCHNK)%USTAR     (1,J) =                       &
-            cam_in%fv(J)    * ( cam_in%landFrac(J))             &
-          + cam_in%uStar(J) * ( 1.0e+0_fp - cam_in%landFrac(J))
+       if (cam_in%uStar(J) == 0. .and. cam_in%landFrac(J) == 0.) then
+          State_Met(LCHNK)%USTAR(1,J) = 0.1
+       else
+          State_Met(LCHNK)%USTAR     (1,J) =                       &
+               cam_in%fv(J)    * ( cam_in%landFrac(J))             &
+             + cam_in%uStar(J) * ( 1.0e+0_fp - cam_in%landFrac(J))
+       end if
     ENDDO
 
     ! Field      : Z0

and it died differently, I think due to 0 values in SSTs. See the two different cesm log files for backtraces for these two errors.

@jimmielin
Member

Thanks @billsacks, copying @lizziel from the GEOS-Chem Support Team who is in charge of software engineering of GEOS-Chem within CESM.

@billsacks
Member Author

Thanks, @jimmielin !

@lizziel - let me know if you'd like to talk more about this. It took me a while to get my head around what was going on here and what feels like possible solutions, and I'm not sure how clearly I managed to explain this.

Has GEOS-Chem been tested yet in fully-coupled (B compset) configurations? My sense is that it will fail similarly to what I saw in these F compsets with non-standard grids (non-standard in that I was running the ocean on a different grid than the atmosphere), but if GEOS-Chem has been run in B compsets and this problem hasn't appeared, then I'd like to look further to see why.

@jimmielin
Copy link
Member

As far as I know GEOS-Chem hasn't been tested with a B-compset. I wouldn't be surprised if it failed.

I have thought a little more about this since I realized my name is on one of the crashing snippets of code (the TSKIN one)... It looks like GEOS-Chem will need some form of USTAR and TSKIN over sea ice. This bug/oversight has probably been present for a while.

The GEOS-Chem within CESM description paper, Fritz et al. (2022), section 2.3.2 notes that GEOS-Chem computes its own dry deposition velocities over ocean and sea ice, which are merged with the CLM-computed velocities over land:

Dry-deposition velocities over land are computed by CLM and are passed to CAM through the coupler. They are then merged with dry-deposition velocities computed over ocean and ice by GEOS-Chem, identical to the procedure used in CAM-chem. Each of these are scaled by the land and ocean/ice fraction, respectively.

For some reason the GEOS-Chem code in chemistry.F90 does not (or no longer?) consider the sea ice fraction when 'merging' the dry deposition velocities. So for any of the proposed fixes 1-3, a fix to GEOS-Chem's handling of the dry deposition velocity merging would need to be added as well.

IF ( map2GC_dryDep(N) > 0 ) THEN
   ! State_Chm%DryDepVel is in m/s
   State_Chm(LCHNK)%DryDepVel(1,:nY,map2GC_dryDep(N)) = &
      ! This first bit corresponds to the dry deposition
      ! velocities over land as computed from CLM and
      ! converted to m/s. This is scaled by the fraction
      ! of land.
        cam_in%depVel(:nY,N) * 1.0e-02_fp &
         * MAX(0._fp, 1.0_fp - State_Met(LCHNK)%FROCEAN(1,:nY)) &
      ! This second bit corresponds to the dry deposition
      ! velocities over ocean and sea ice as computed from
      ! GEOS-Chem. This is scaled by the fraction of ocean
      ! and sea ice.
      + State_Chm(LCHNK)%DryDepVel(1,:nY,map2GC_dryDep(N)) &
        * State_Met(LCHNK)%FROCEAN(1,:nY)
ENDIF

@lizziel
Collaborator

lizziel commented Oct 15, 2024

The inclusion of ice in various land fraction terms in GEOS-Chem has changed over the past years, with minor discrepancies found in criteria for ice. I am not surprised that Thibaud's implementation is missing ice since it can get even more confusing with CESM's definitions thrown into the mix.

For the GEOS-Chem/CESM interface fix in chemistry.F90, I can incorporate FRSEAICE into both the land and ocean/ice terms to properly weight the GEOS-Chem and CLM computed dry dep velocities. It would look like this:

IF ( map2GC_dryDep(N) > 0 ) THEN
   ! State_Chm%DryDepVel is in m/s
   State_Chm(LCHNK)%DryDepVel(1,:nY,map2GC_dryDep(N)) = &
      ! This first bit corresponds to the dry deposition
      ! velocities over land as computed from CLM and
      ! converted to m/s. This is scaled by the fraction
      ! of land.
        cam_in%depVel(:nY,N) * 1.0e-02_fp &
         * MAX(0._fp, 1.0_fp - State_Met(LCHNK)%FROCEAN(1,:nY) - State_Met(LCHNK)%FRSEAICE(1,:nY)) &
      ! This second bit corresponds to the dry deposition
      ! velocities over ocean and sea ice as computed from
      ! GEOS-Chem. This is scaled by the fraction of ocean
      ! and sea ice.
      + State_Chm(LCHNK)%DryDepVel(1,:nY,map2GC_dryDep(N)) &
        * (State_Met(LCHNK)%FROCEAN(1,:nY) + State_Met(LCHNK)%FRSEAICE(1,:nY))
ENDIF

Here are my thoughts on the options to address the zero values for friction velocity and temperature (and maybe others?):

Option 1: Change geoschem to be more similar to bulk_aero and modal_aero in this respect, doing its own calculations of ustar at least over sea ice, and possibly over ocean

Could you point me to the code that does this? This would be a quick fix but not ideal. If something changes then we would need to remember to change it in all of these places. It also does not prevent this problem from creeping up in any new modules added to CESM that use USTAR. It is not intuitive there would be zero values.

Option 2: Have sea ice send ustar, merge in CAM
CAM's geoschem will then use sea ice's ustar in the merge.
Note that bulk_aero and modal_aero could also be updated to use ustar from sea ice and ocean instead of calculating fv there.

This one makes me think I am not entirely understanding friction velocity in CAM. Is the idea that there would be a new ustar that includes both ocean and sea ice, and friction velocity would be stored in fv for over land? It would be less confusing to have a field for friction velocity regardless of land type.

Option 3: Have sea ice send ustar, do the merge of ustar / fv in the mediator
CAM's geoschem can be simplified to use this existing merged quantity everywhere
Note that bulk_aero and modal_aero could also be updated to use this merged quantity

Yes, this is what I was driving at in my response to option 2. I think the cleanest, most elegant solution is to have a field for friction velocity for all grid cells globally available from the mediator.

@lizziel
Collaborator

lizziel commented Oct 15, 2024

My response above extends to any other fields that might have zero values but that are sized globally, e.g. surface temperature.

@billsacks
Member Author

@jimmielin - thanks for adding those points about dry deposition velocities. And @lizziel thank you for all of your thoughts here.

Option 1: Change geoschem to be more similar to bulk_aero and modal_aero in this respect, doing its own calculations of ustar at least over sea ice, and possibly over ocean

Could you point me to the code that does this? This would be a quick fix but not ideal. If something changes then we would need to remember to change it in all of these places. It also does not prevent this problem from creeping up in any new modules added to CESM that use USTAR. It is not intuitive there would be zero values.

I agree with you on this being a quick fix but not feeling ideal. With the major caveat that I'm not familiar with this code but just have been poking around a bit, what I found is this:

! calc ram and fv over ocean and sea ice ...
call calcram( ncol, landfrac, icefrac, ocnfrac, obklen,                 &
              ustar, ram1in, ram1, state%t(:,pver), state%pmid(:,pver), &
              state%pdel(:,pver), fvin, fv )

which calls:

subroutine calcram(ncol,landfrac,icefrac,ocnfrac,obklen,&

But actually, now that I look more closely at that code, I'm not convinced that it's necessarily doing the right thing in terms of land / ocean / ice fractions... but I haven't looked closely.

Option 2: Have sea ice send ustar, merge in CAM
CAM's geoschem will then use sea ice's ustar in the merge.
Note that bulk_aero and modal_aero could also be updated to use ustar from sea ice and ocean instead of calculating fv there.

This one makes me think I am not entirely understanding friction velocity in CAM. Is the idea that there would be a new ustar that includes both ocean and sea ice, and friction velocity would be stored in fv for over land? It would be less confusing to have a field for friction velocity regardless of land type.

I'll again add the caveat that I may not be understanding all of this correctly myself. My thinking here is that it looks like CICE doesn't even send ustar to the mediator currently, so a first step for either option 2 or 3 would be adding that. I'm not clear on whether there's a scientific reason why CAM computes its own ustar over sea ice rather than having CICE send it, or if that's just a historical artifact. I had been thinking with option 2 that CAM would get 3 separate fields – ustar / fv from each of land, ocean and sea ice – and then do the merge itself. I agree with you, though, that it could be cleaner to have the mediator do the merge; what I'm not sure about is whether that will work for all of the uses in CAM or if some parts of CAM actually want these separate values over the different surface types. As you point out, a compromise could be to have a merged sea ice / ocean ustar and a separate fv from land; whether this is the "right" thing to do may depend on uses elsewhere in CAM.

Option 3: Have sea ice send ustar, do the merge of ustar / fv in the mediator
CAM's geoschem can be simplified to use this existing merged quantity everywhere
Note that bulk_aero and modal_aero could also be updated to use this merged quantity

Yes, this is what I was driving at in my response to option 2. I think the cleanest, most elegant solution is to have a field for friction velocity for all grid cells globally available from the mediator.

I agree that Option 3 feels cleanest in some ways. This would take some broader discussion, though, and I'm particularly wondering if it will actually cause some problems in CAM to just have a single merged friction velocity field that applies to land, sea ice and ocean: This feels like the right thing to do for cases like this where you seem to want the grid cell average friction velocity, but I wonder if there are other cases that (at least currently) want just the friction velocity over land – in which case, sending the grid cell average will give the wrong values for points that are part land and part ocean. It may be that any CAM code that's currently set up that way can / should be rewritten to use grid cell average friction velocities, but I'm thinking this discussion may need to pull in some other people.

To summarize my sense at this point: Option 3 does feel cleanest, but both options 3 & 2 seem like they'll require broader discussions among other CAM developers. I'm not sure how critical this issue is – e.g., are there hopes to have GEOS-Chem be a supported option for B compsets for CESM3? Option 1 could get things working consistently with other chemistry options in CAM, but I agree that it may not be the right long-term solution. I don't know how much you and others want to just get something working now vs. holding off for a better long-term solution to avoid doing work now that just needs to be undone later....

@lizziel
Collaborator

lizziel commented Oct 16, 2024

I'm not sure how critical this issue is – e.g., are there hopes to have GEOS-Chem be a supported option for B compsets for CESM3?

I am not familiar with B compsets and have not heard any requests for them. Given the level of work, I think we would need a research need to pursue it. We currently do not have that in our research group at Harvard, but perhaps other groups do or will. That request may be brought up with me, or it might be presented to NCAR. Perhaps we should all be on the lookout and inform the other if a request shows up? This question should also be presented to Louisa Emmons, who is the CESM scientist for GEOS-Chem. We have a CESM-GC meeting on Nov 6th during which I can bring it up.

Option 1 could get things working consistently with other chemistry options in CAM, but I agree that it may not be the right long-term solution. I don't know how much you and others want to just get something working now vs. holding off for a better long-term solution to avoid doing work now that just needs to be undone later...

I like the idea of a quick fix should anyone want to play with this. Implementing a better solution later can be famous last words, forgotten about and rediscovered years later. Do you keep todo items like this in an open GitHub issue?

@billsacks
Member Author

I am not familiar with B compsets and have not heard any requests for them. Given the level of work, I think we would need a research need to pursue it. We currently do not have that in our research group at Harvard, but perhaps other groups do or will. That request may be brought up with me, or it might be presented to NCAR. Perhaps we should all be on the lookout and inform the other if a request shows up? This question should also be presented to Louisa Emmons, who is the CESM scientist for GEOS-Chem. We have a CESM-GC meeting on Nov 6th during which I can bring it up.

B compsets are fully-coupled configurations, with active ocean as well as atmosphere (and land, sea ice, etc.). The reason this issue appears in B compsets and not in F compsets is that B compsets almost always have different atm vs. ocean grids, whereas F compsets typically use the same grid for atm & ocean, and this issue only appears when the atm & ocean grids differ... though it may be that the results are subtly wrong even when the grids are the same due to not considering the ice fraction.

I like the idea of a quick fix should anyone want to play with this. Implementing a better solution later can be famous last words, forgotten about and rediscovered years later. Do you keep todo items like this in an open GitHub issue?

Personally, yes, but I'm not involved with CAM development and I don't know what the CAM group prefers for something like this. (@cacraigucar ?)

@cacraigucar
Collaborator

I am adding @PeterHjortLauritzen (the NCAR AMWG CAM rep), @adamrher (one of the scientists working on CAM) and @nusbaume to this thread, as this has grown outside of my expertise and into the science realm.

I should say that we have been looking at GEOS-Chem as a "draft" implementation, as until the restart issues are solved, it is not ready for full use. I was surprised to see that anyone was trying to use it in a B compset.

I can say that with the push to CESM3, we AMP SEs do not have time to work on any changes that would be required to solve this. We are certainly open to accepting changes from anyone who takes the time to fix and test the problems, but we will not have time to dive into this ourselves before the release.

Leaving this issue open is one way to ensure that it doesn't get forgotten. Unfortunately, until we take the time to review the backlog of issues post-release, it will probably get lost in the noise, unless someone brings it back to our attention.

@billsacks
Member Author

Thanks @cacraigucar . Just to clarify / summarize where this came from: This originated from my testing of using the exchange grid for atmosphere-ocean flux calculations. In that testing, I was getting failures in F compset tests with GEOS-Chem. I determined that these same failures appeared in F compset runs with different atm & ocn grids. From this, I speculated that this will be a problem for B compsets.

For now we are planning to avoid this issue by not using the exchange grid if the atm & ocn grids are identical, as they typically are in F compsets. (The exchange grid doesn't give any benefit in this situation, and just leads to extra overhead.) So if the use of GEOS-Chem in B compsets isn't planned for the near future, then I agree that this can be left as low priority for now.

@adamrher

For now we are planning to avoid this issue by not using the exchange grid if the atm & ocn grids are identical, as they typically are in F compsets.

Hi @billsacks. My not-so-deep reading of this thread indicates that there are two issues, and this is a workaround for the first one (only support GEOS-Chem for F compsets; turn off the xgrid for F compsets). The second issue is related to omitting the sea ice fraction in the grid-mean ustar calculation in src/chemistry/geoschem/chemistry.F90, for which a fix of some sort will be implemented imminently. A more satisfactory solution should be developed in the future, and so a GitHub issue should be opened as a reminder. Is that accurate?

@billsacks
Member Author

My not-so-deep reading of this thread indicates that there are two issues, and this is a workaround for the first one (only support GEOS-Chem for F compsets; turn off the xgrid for F compsets). The second issue is related to omitting the sea ice fraction in the grid-mean ustar calculation in src/chemistry/geoschem/chemistry.F90, for which a fix of some sort will be implemented imminently. A more satisfactory solution should be developed in the future, and so a GitHub issue should be opened as a reminder. Is that accurate?

That sounds right - thanks for summarizing.

@lizziel
Collaborator

lizziel commented Oct 18, 2024

Regarding my earlier suggested fix for weighting dry dep velocities between CLM on land and GEOS-Chem on ocean/ice, there is actually no need for a fix. State_Met(LCHNK)%FROCEAN(1,:nY) is defined to include ice, so the current weighting is correct.

   State_Met(LCHNK)%FRLAND    (1,:nY) = cam_in%landFrac(:nY)
   State_Met(LCHNK)%FROCEAN   (1,:nY) = cam_in%ocnFrac(:nY) + cam_in%iceFrac(:nY)
   State_Met(LCHNK)%FRSEAICE  (1,:nY) = cam_in%iceFrac(:nY)
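
Given those definitions, and assuming landFrac + ocnFrac + iceFrac = 1 in CAM, the land weight in the existing merge works out to MAX(0, 1 - FROCEAN) = 1 - ocnFrac - iceFrac = landFrac, so the two terms already partition each cell between the CLM-computed and GEOS-Chem-computed dry deposition velocities.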

I also checked the categorization of IsWater, IsIce, IsLand, and IsSnow. There is inconsistency between how they are set in CAM versus how they are used in GEOS-Chem. This is fine for IsIce (includes land ice in GC) and IsLand (excludes snow and ice in GC) since they are only used in dry deposition and we don't use GC dry dep velocities over land. IsWater might be more problematic since it includes lakes in GEOS-Chem and is used in chemistry. Is fraction lake available?

@billsacks
Member Author

Thanks for the follow-up @lizziel. I don't think lake fraction is available in CAM, since this is internal to CTSM and I don't think it is sent to CAM. If needed, it could be sent, but I also wonder if this is something that can / should be handled in CTSM, or maybe already is handled in CTSM, since CTSM handles dry dep velocities over land-owned areas in general, and lakes are land-owned.

@lizziel
Collaborator

lizziel commented Oct 21, 2024

IsWater without lakes is indeed fine for dry deposition in GEOS-Chem since we do not use GC dry dep vels from land. However, we use IsWater in GEOS-Chem chemistry, which is applied globally. I can check with GEOS-Chem scientists on whether this would be a problem in chemistry.
