Failures in GEOSCHEM/HEMCO configurations with different ocean & atm grids #1172
Thanks @billsacks, copying @lizziel from the GEOS-Chem Support Team who is in charge of software engineering of GEOS-Chem within CESM.
Thanks, @jimmielin! @lizziel - let me know if you'd like to talk more about this. It took me a while to get my head around what was going on here and what feels like possible solutions, and I'm not sure how clearly I managed to explain this. Has GEOS-Chem been tested yet in fully-coupled (B compset) configurations? My sense is that it will fail similarly to what I saw in these F compsets with non-standard grids (non-standard in that I was running the ocean on a different grid than the atmosphere), but if GEOS-Chem has been run in B compsets and this problem hasn't appeared, then I'd like to look further to see why.
As far as I know GEOS-Chem hasn't been tested with a B compset. I wouldn't be surprised if it failed. I have thought a little more about this since I realized I named myself on one of the crashing snippets of code (on TSKIN)... It looks like GEOS-Chem will need some form of ...

The GEOS-Chem within CESM description paper, Fritz et al. (2022), section 2.3.2, notes that GEOS-Chem computes its own dry deposition velocities over ocean and sea ice, which are merged with the CLM-computed velocities over land:
For some reason the GEOS-Chem code in chemistry.F90 does not (or no longer?) consider sea ice fraction when 'merging' the dry deposition velocities. So in any of the fixes 1-3 proposed, a fix to GEOS-Chem's handling of the dry deposition velocity merging would need to be added, as well.

IF ( map2GC_dryDep(N) > 0 ) THEN
! State_Chm%DryDepVel is in m/s
State_Chm(LCHNK)%DryDepVel(1,:nY,map2GC_dryDep(N)) = &
! This first bit corresponds to the dry deposition
! velocities over land as computed from CLM and
! converted to m/s. This is scaled by the fraction
! of land.
cam_in%depVel(:nY,N) * 1.0e-02_fp &
* MAX(0._fp, 1.0_fp - State_Met(LCHNK)%FROCEAN(1,:nY)) &
! This second bit corresponds to the dry deposition
! velocities over ocean and sea ice as computed from
! GEOS-Chem. This is scaled by the fraction of ocean
! and sea ice.
+ State_Chm(LCHNK)%DryDepVel(1,:nY,map2GC_dryDep(N)) &
* State_Met(LCHNK)%FROCEAN(1,:nY)
ENDIF
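Purely as an illustration of the concern (not the actual interface code, and the fraction names lndfrac/icefrac/ocnfrac are hypothetical), here is a minimal sketch of how a merge like the one above would behave if FROCEAN contained only open ocean, compared with a merge that explicitly credits the sea ice area to the GEOS-Chem velocity; as noted later in this thread, it may turn out that no change is actually needed:

program drydep_merge_sketch
  implicit none
  ! Hypothetical single-column fractions (land + sea ice + open ocean = 1)
  real, parameter :: lndfrac = 0.4, icefrac = 0.4, ocnfrac = 0.2
  ! CLM land velocity (already converted to m/s) and GEOS-Chem ocean/ice velocity
  real, parameter :: v_clm = 0.005, v_gc = 0.002
  real :: v_no_ice, v_with_ice

  ! How the merge would behave if FROCEAN held only open ocean:
  ! the sea ice area gets credited to the land (CLM) velocity.
  v_no_ice   = v_clm * (1.0 - ocnfrac) + v_gc * ocnfrac

  ! A merge that credits the sea ice area to the GEOS-Chem velocity instead.
  v_with_ice = v_clm * lndfrac + v_gc * (ocnfrac + icefrac)

  print *, 'merged velocity, ice lumped with land (m/s): ', v_no_ice
  print *, 'merged velocity, ice lumped with ocean (m/s):', v_with_ice
end program drydep_merge_sketch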
The inclusion of ice in various land fraction terms in GEOS-Chem has changed over the past years, with minor discrepancies found in criteria for ice. I am not surprised that Thibaud's implementation is missing ice since it can get even more confusing with CESM's definitions thrown into the mix. For the GEOS-Chem/CESM interface fix in
Here are my thoughts on the options to address the zero values for friction velocity and temperature (and maybe others?):
Could you point me to the code that does this? This would be a quick fix but not ideal. If something changes then we would need to remember to change it in all of these places. It also does not prevent this problem from creeping up in any new modules added to CESM that use
This one makes me think I am not entirely understanding friction velocity in CAM. Is the idea that there would be a new ustar that includes both ocean and sea ice, and friction velocity over land would be stored in fv? It would be less confusing to have a field for friction velocity regardless of land type.
Yes, this is what I was driving at in my response to 2. I think the cleanest, most elegant solution is to have a field for friction velocity for all grid cells, globally available from the mediator.
My response above extends to any other fields that might have zero values but that are sized globally, e.g. surface temperature.
@jimmielin - thanks for adding those points about dry deposition velocities. And @lizziel thank you for all of your thoughts here.
I agree with you on this being a quick fix but not feeling ideal. With the major caveat that I'm not familiar with this code but just have been poking around a bit, what I found is this: CAM/src/chemistry/bulk_aero/aero_model.F90, lines 453 to 456 (at 3e9a281),
which calls: CAM/src/chemistry/aerosol/aer_drydep_mod.F90, line 179 (at 3e9a281).
But actually, now that I look more closely at that code, I'm not convinced that it's necessarily doing the right thing in terms of land / ocean / ice fractions... but I haven't looked closely.
I'll again add the caveat that I may not be understanding all of this correctly myself. My thinking here is that it looks like CICE doesn't even send ustar to the mediator currently, so a first step for either option 2 or 3 would be adding that. I'm not clear on whether there's a scientific reason why CAM computes its own ustar over sea ice rather than having CICE send it, or if that's just a historical artifact. I had been thinking with option 2 that CAM would get 3 separate fields – ustar / fv from each of land, ocean and sea ice – and then do the merge itself. I agree with you, though, that it could be cleaner to have the mediator do the merge; what I'm not sure about is whether that will work for all of the uses in CAM or if some parts of CAM actually want these separate values over the different surface types. As you point out, a compromise could be to have a merged sea ice / ocean ustar and a separate fv from land; whether this is the "right" thing to do may depend on uses elsewhere in CAM.
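For concreteness, here is a minimal sketch of the option 2 merge described above, assuming CAM received a friction velocity from each of land, ocean and sea ice along with the corresponding fractions (none of these variable names are actual coupler fields; this is only an illustration of the idea):

program option2_ustar_sketch
  implicit none
  ! Hypothetical per-surface friction velocities (m/s) and fractions for a
  ! single column that is entirely covered by sea ice (no land, no open ocean).
  real, parameter :: lndfrac = 0.0, icefrac = 1.0
  real, parameter :: ustar_lnd = 0.0, ustar_ice = 0.22, ustar_ocn = 0.0
  real :: ocnfrac, ustar_merged

  ocnfrac = 1.0 - lndfrac - icefrac

  ! Area-fraction-weighted merge done inside CAM.  Unlike the current
  ! ocean-only field, this stays nonzero over an all-ice point.
  ustar_merged = lndfrac*ustar_lnd + icefrac*ustar_ice + ocnfrac*ustar_ocn

  print *, 'merged ustar over an all-ice cell (m/s):', ustar_merged
end program option2_ustar_sketch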
I agree that Option 3 feels cleanest in some ways. This would take some broader discussion, though, and I'm particularly wondering if it will actually cause some problems in CAM to just have a single merged friction velocity field that applies to land, sea ice and ocean: This feels like the right thing to do for cases like this where you seem to want the grid cell average friction velocity, but I wonder if there are other cases that (at least currently) want just the friction velocity over land – in which case, sending the grid cell average will give the wrong values for points that are part land and part ocean. It may be that any CAM code that's currently set up that way can / should be rewritten to use grid cell average friction velocities, but I'm thinking this discussion may need to pull in some other people.

To summarize my sense at this point: Option 3 does feel cleanest, but both options 3 & 2 seem like they'll require broader discussions among other CAM developers. I'm not sure how critical this issue is – e.g., are there hopes to have GEOS-Chem be a supported option for B compsets for CESM3? Option 1 could get things working consistently with other chemistry options in CAM, but I agree that it may not be the right long-term solution. I don't know how much you and others want to just get something working now vs. holding off for a better long-term solution to avoid doing work now that just needs to be undone later....
I am not familiar with B compsets and have not heard any requests for them. Given the level of work, I think we would need a research need to pursue it. We currently do not have that in our research group at Harvard, but perhaps other groups do or will. That request may be brought up with me, or it might be presented to NCAR. Perhaps we should all be on the lookout and inform each other if a request shows up? This question should also be presented to Louisa Emmons, who is the CESM scientist for GEOS-Chem. We have a CESM-GC meeting on Nov 6th during which I can bring it up.
I like the idea of a quick fix should anyone want to play with this. "Implement a better solution later" can be famous last words, forgotten about and rediscovered years later. Do you leave to-do items like this in a GitHub issue that is kept open?
B compsets are fully-coupled configurations, with active ocean as well as atmosphere (and land, sea ice, etc.). The reason this issue appears in B compsets and not in F compsets is that B compsets almost always have different atm vs. ocean grids, whereas F compsets typically use the same grid for atm & ocean, and this issue only appears when the atm & ocean grids differ... though it may be that the results are subtly wrong even when the grids are the same due to not considering the ice fraction.
Personally, yes, but I'm not involved with CAM development and I don't know what the CAM group prefers for something like this. (@cacraigucar ?)
I am adding @PeterHjortLauritzen (the NCAR AMWG CAM rep), @adamrher (one of the scientists working on CAM) and @nusbaume to this thread, as this has grown outside of my expertise and into the science realm. I should say that we have been looking at GEOS-Chem as a "draft" implementation, as until the restart issues are solved, it is not ready for full use. I was surprised to see that anyone was trying to use it in a B compset. I can say that with the push to CESM3, we AMP SEs do not have time to work on any changes that would be required to solve this. We are certainly open to accepting changes from anyone who takes the time to fix and test the problems, but we will not have time to dive into this ourselves before the release. Leaving this issue open is one way to ensure that it doesn't get forgotten. Unfortunately, until we take the time to review the backlog of issues post-release, it will probably get lost in the noise, unless someone brings it back to our attention.
Thanks @cacraigucar . Just to clarify / summarize where this came from: This originated from my testing of using the exchange grid for atmosphere-ocean flux calculations. In that testing, I was getting failures in F compset tests with GEOS-Chem. I determined that these same failures appeared in F compset runs with different atm & ocn grids. From this, I speculated that this will be a problem for B compsets. For now we are planning to avoid this issue by not using the exchange grid if the atm & ocn grids are identical, as they typically are in F compsets. (The exchange grid doesn't give any benefit in this situation, and just leads to extra overhead.) So if the use of GEOS-Chem in B compsets isn't planned for the near future, then I agree that this can be left as low priority for now.
Hi @billsacks. My not-so-deep reading of this thread indicates that there are two issues, and this is a workaround for the first one (only support GEOS-Chem for F compsets; turn off the xgrid for F compsets). The second issue is related to omitting the sea ice fraction in the grid mean ustar calculation in
That sounds right - thanks for summarizing.
Regarding my earlier suggested fix for weighting dry dep velocities between CLM on land and GEOS-Chem on ocean/ice, there actually is no need for a fix.
I also checked the categorization of
Thanks for the follow-up @lizziel . I don't think lake fraction is available in CAM, since this is internal to CTSM and I don't think it is sent to CAM. If needed, it could be sent, but I also wonder if this is something that can / should be handled in CTSM, or maybe already is handled in CTSM, since CTSM handles dry dep velocities over land-owned areas in general, and lakes are land-owned.
What happened?
In doing some testing with the exchange grid (setting aoflux_grid = "xgrid" in user_nl_cpl), I got a failure in SMS_D_Ln9.f09_f09_mg17.FCnudged_GC.derecho_intel.cam-outfrq9s.

I also reproduced this failure in a run with the default of aoflux_grid = "ogrid" but when running with different ocean & atm grids: SMS_D_Ln9.f09_g17.FCnudged_GC.derecho_intel.cam-outfrq9s.

This is dying due to a divide by 0 in the geoschem drydep_mod, due to 0 values of ustar:
It looks like the issue comes from the fact that, when the ocean grid differs from the atmosphere grid – either because these grids actually differ or because we're using the exchange grid (in which case there's an extra mapping done even if the atm & ocn grids are the same) – we get 0 values for ocean fields not only in land grid cells (which appear to be handled properly in CAM) but also in grid cells that are entirely sea ice-covered (or entirely sea ice + land).
Here is the relevant code for calculating USTAR in geoschem (in chemistry.F90):
Notice that this code assumes that cam_in%uStar applies in the complement of landFrac, but in fact there is a third relevant fraction, the ice fraction, which is not accounted for here.

When I put in place a hack to work around this ustar issue:
I got past the original crash, but got a new one here:
That's dying due to a divide-by-zero of TEMPK in OCEANO3, which I think comes from
which comes from So_t. Note that, like ustar, atmExp_So_t has 0 values over all-ice points in runs with different atm vs. ocn grids. (When the ocean is running on the same grid as the atmosphere, atmExp_So_t is 273.15 over land and doesn't seem to have patches of missing (or 273.15) values over sea ice points. But these 0 values appear when they're running on different grids.)

The 0 values come from the mapping from ocn to atm in the mediator, which uses a normalization of ofrac, which is the current ocean fraction (excluding both sea ice area and land area). @mvertens and I both feel that this ofrac normalization is the correct thing to do (and this was done in cpl7 as well as in CMEPS). Many fields (including "important" ones like the latent and sensible heat fluxes, etc.) are merged between ocn, ice and lnd before being passed to the atmosphere, so they are handled correctly. But a few fields – like ustar and So_t – are ocean-specific and so do not have this merge in the mediator. These fields need to be handled in CAM in a way that recognizes that they only apply over the ocean portion of the grid cell – not the sea ice portion or the land portion.
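To make the normalization concrete, here is a schematic sketch of the idea described above (not the actual CMEPS mapping code; the weights and values are made up): a fraction-normalized remap has nothing to divide by when every overlapping ocean source cell is fully ice covered, so the destination cell ends up with 0:

program ofrac_norm_sketch
  implicit none
  integer, parameter :: nsrc = 2
  ! Two hypothetical ocean source cells overlapping one atm destination cell,
  ! with equal mapping weights.
  real, parameter :: w(nsrc)    = (/ 0.5, 0.5 /)
  real, parameter :: so_t(nsrc) = (/ 271.4, 271.6 /)  ! ocean surface temperature (K)
  real :: ofrac(nsrc), den, mapped

  ! Both source cells are fully sea ice covered, so the open-ocean fraction
  ! used to normalize the mapping is zero everywhere.
  ofrac = (/ 0.0, 0.0 /)

  den = sum( w * ofrac )
  if ( den > 0.0 ) then
     mapped = sum( w * ofrac * so_t ) / den
  else
     ! No open ocean anywhere in the overlap: the mapped field comes through
     ! as 0 on the atm grid, which is the value GEOS-Chem then sees.
     mapped = 0.0
  end if

  print *, 'mapped So_t on the atm grid (K):', mapped
end program ofrac_norm_sketch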
I see three options for the path forward, in order from least to most invasive in other components outside of CAM:
What are the steps to reproduce the bug?
Run SMS_D_Ln9.f09_g17.FCnudged_GC.derecho_intel.cam-outfrq9s
What CAM tag were you using?
cam6_4_032 in cesm3_0_alpha03c
What machine were you running CAM on?
CISL machine (e.g. cheyenne)
What compiler were you using?
Intel
Path to a case directory, if applicable
/glade/derecho/scratch/sacks/SMS_D_Ln9.f09_g17.FCnudged_GC.derecho_intel.cam-outfrq9s.20241011_100351_kob7r6
Will you be addressing this bug yourself?
No
Extra info
For the failing case in /glade/derecho/scratch/sacks/SMS_D_Ln9.f09_g17.FCnudged_GC.derecho_intel.cam-outfrq9s.20241011_100351_kob7r6, I ran it twice. The first time it failed as noted above. The second time I introduced changes to get around the ustar issue:
and it died differently, I think due to 0 values in SSTs. See the two different cesm log files for backtraces for these two errors.