
CICE C-grid crash on 1/12 degree tripole #941

Open
daveh150 opened this issue Mar 12, 2024 · 11 comments

@daveh150
Contributor

I am working on updating to CICE 6.5.0 to enable use of CICE on the C-grid. At the moment we export ocean velocities on the ‘A’ grid. Before moving the ocean coupling location to the C-grid, I wanted to test running CICE on the C-grid with the ocean still on the ‘A’ grid. After 29 hours I get errors that I think look like what J-F was seeing in his testing. Here is the info from the logs. This is not at the tripole seam (the total grid size is 4500x3527, and this is happening at (2561, 3033)). (In email I mentioned it was in the southern hemisphere, but I was mistaken, sorry.) Any thoughts on what else to look for? Thank you for your help!

grid_ice = C
grid_ice_thrm = T
grid_ice_dynu = E
grid_ice_dynv = N
grid_atm = A
grid_atm_thrm = T
grid_atm_dynu = T
grid_atm_dynv = T
grid_ocn = A
grid_ocn_thrm = T
grid_ocn_dynu = T
grid_ocn_dynv = T

New mass < 0, i, j = 6 3034
Old mass = 1.009873647647687E-004
New mass = -1.723111112528535E-007
Net transport = -1.011596758760216E-004
istep1, my_task, iblk, cat = 7552661 426 1 1
(print_state) negative area (ice)
(print_state) istep1, my_task, i, j, iblk: 7552661 426 6 3034 1
(print_state) Global block: 427
(print_state) Global i and j: 2561 3033
(print_state) Lat, Lon (degrees): 58.5665063869671 -89.4533113919817

aice 6.447869072451422E-004
aice0 0.999355213092755

uvel(i,j) 3.33410265094825
vvel(i,j) -2.27173681352247
uvelE(i,j) 6.66673187374646
vvelN(i,j) 0.000000000000000E+000

atm states and fluxes
uatm = 10.7973263899519
vatm = -9.60914979831031
potT = 282.715820312500
Tair = 282.715820312500
Qa = 6.291424389928579E-003
rhoa = 1.23230516910553
swvdr = 150.374064941406
swvdf = 128.892055664062
swidr = 166.485571899414
swidf = 91.2985394287109
flw = 310.546752929688
frain = 4.701963916886598E-005
fsnow = 0.000000000000000E+000

ocn states and fluxes
frzmlt = -1000.00000000000
sst = 5.41451142398993
sss = 30.1864523562483
Tf = -1.63006842723741
uocn = 7.606549328525385E-002
vocn = -0.499165868790315
strtltxU= 0.000000000000000E+000
strtltyU= 0.000000000000000E+000
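
For reference, the printed diagnostic is internally consistent: the new mass equals the old mass plus the net transport, so the transport really did remove more mass than the cell held. A quick check with the values copied from the log above (a hypothetical snippet, not part of CICE):

old_mass = 1.009873647647687e-04
net_transport = -1.011596758760216e-04
new_mass = old_mass + net_transport
print(new_mass)  # approximately -1.7231e-07, matching the printed "New mass"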

@JFLemieux73
Contributor

Hi Dave,

In the end, I don't think it is the remapping that is causing the problem. The velocities are huge: 6.67 m/s is unrealistic. There might even be higher velocities elsewhere. What do you get in ice_std_out for the max velocity?

I guess there is an instability developing. It could be related to the coupling with the ocean model.

@daveh150
Contributor Author

daveh150 commented Mar 13, 2024 via email

@JFLemieux73
Contributor

Hi Dave,

I am adding @dupontf to the discussion. I think a good candidate for the instability is the explicit treatment of Coriolis for the C-grid (it is implicit for the B-grid).

This was already identified as a potential problem in this issue:

#917

We didn't have problems in uncoupled mode but it might be different when coupled to an ocean model. What I don't understand is that we followed Kimmritz et al. 2016 which is probably what is used in the MITgcm.

Here is what I propose:

- I will contact Martin Losch about the MITgcm.
- I will talk to @dupontf about this. We will try to come up with a semi-implicit approach (Bouillon et al. 2009).
- Could you please rerun the 29 hours and, if possible, make a plot of the max speed (in ice_std_out) as a function of time? (A rough extraction sketch follows this comment.)
- To make sure it is not the remapping, could you please do a second run with upwind? You could also add the max speed as a function of time (with upwind) to the previous plot.

Sounds good?

jf
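
Regarding the max-speed time series requested above, here is a rough sketch of how it could be pulled from CICE history output instead of parsing ice_std_out. The file pattern and the uvel/vvel history variable names are assumptions and may need adjusting for this setup:

# Hypothetical sketch: max ice speed per history record.
# File pattern and variable names (uvel, vvel) are assumptions; adjust as needed.
import glob
import numpy as np
import xarray as xr

max_speed = []
for fname in sorted(glob.glob("history/iceh_inst.*.nc")):
    with xr.open_dataset(fname) as ds:
        spd = np.sqrt(ds["uvel"] ** 2 + ds["vvel"] ** 2)
        max_speed.append(float(spd.max()))

for k, s in enumerate(max_speed):
    print(k, s)  # plot this series for the explicit vs. upwind runs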

@daveh150
Contributor Author

daveh150 commented Mar 14, 2024 via email

@JFLemieux73
Contributor

I emailed Martin Losch. Here is what he wrote:

"I haven’t had any issues with EVP in a very long time, so I am not quite sure how much I can help here. I always use explicit Coriolis (also for implicit solvers) and this was never a problem other than a timestep constraint (~1hour or so). The EVP subcycling “timesteps” are so short that I don’t expect this ever to be an issue.

If this is happening in the nearly free-drift of the MIZ I would expect that there’s something happening with the ice-ocean/ice-wind stress. Here I usually use a “semi-implicit” treatment, where I compute the linear drag from cDrag = Cd*abs(ui-uo) (with some minimal drag) and then add the “symmetric parts” (the cosine part when you rotate the drag to model some sort of Ekman spiral) to the LHS. But in a Python version of the EVP solver we tried both implicit and explicit treatment and didn’t find too much of a difference (in coarse simulations).

I hope that helps"
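
For readers unfamiliar with the treatment Martin describes, a schematic of the semi-implicit drag (the notation is generic, not the MITgcm code): with a quadratic, turning-angle ice-ocean stress

\tau_{io} = \rho_o C_d \, |\mathbf{u}_o - \mathbf{u}_i| \left[ (\mathbf{u}_o - \mathbf{u}_i)\cos\theta + \hat{\mathbf{k}} \times (\mathbf{u}_o - \mathbf{u}_i)\sin\theta \right],

the coefficient c_D = \rho_o C_d |\mathbf{u}_o - \mathbf{u}_i| (bounded below by a minimal drag) is evaluated with known velocities, and the term proportional to the new ice velocity, -c_D \cos\theta \, \mathbf{u}_i, is moved to the left-hand side of the momentum solve, so the drag damps the update instead of being lagged entirely.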

@JFLemieux73
Contributor

One more comment from Martin:

"HYCOM is on a C-grid, right? If CICE was on B then there needed to be some averaging to get the velocities onto B-points and the ice-ocean stress right, whereas now this averaging needs to be different, right? So I would put my money on the ice-ocean stress …"

@dupontf

dupontf commented Mar 14, 2024

Well, I am not totally dismissing the possibility of a Coriolis instability since, in the absence of friction, the explicit representation is unconditionally unstable. However, we had a coupling issue in the past in NEMO that is worth mentioning: the feedback through the ice-ocean stress computed in the sea-ice component and passed back to NEMO. NEMO uses a leapfrog scheme, which is stable for friction only if the friction uses the velocity at n-1 to compute the velocity at n+1 (leapfrog on n-1, n and n+1). However, NEMO was passing the ocean velocity at time n to the ice module; the ice-ocean stress computed from it is ultimately passed back to NEMO, and this is therefore unstable in the leapfrog framework... just a thought!
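
To make the Coriolis point concrete, a schematic stability argument (generic, not the exact CICE discretization): writing the unforced, frictionless momentum equation for the complex velocity w = u + iv gives dw/dt = -i f w. A forward (explicit) step yields

w^{n+1} = (1 - i f \Delta t)\, w^n, \qquad |G| = \sqrt{1 + (f \Delta t)^2} > 1,

i.e. weak but unconditional growth, whereas a backward (semi-implicit) evaluation, w^{n+1} = w^n / (1 + i f \Delta t), gives |G| = 1/\sqrt{1 + (f \Delta t)^2} \le 1. With the short EVP sub-cycling steps the explicit growth per step is tiny, consistent with Martin's comment, but after millions of steps it can matter if nothing else damps it.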

@JFLemieux73
Contributor

Hi Dave,
In case the problem is Coriolis, I coded a semi-implicit treatment instead of the explicit approach. Have a look at this branch:

https://github.com/JFLemieux73/CICE/tree/semi_imp_Coriolis

I ran gx1 for five days in stand-alone mode. The fields (aice, hi, uvelE, vvelN) are very similar between the standard (explicit) and new (semi-implicit) code.
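
As a rough illustration of what the semi-implicit treatment amounts to (a schematic, not necessarily the exact discretization in that branch): taking Coriolis at the new sub-cycle level turns each point update into a small 2x2 solve,

\begin{pmatrix} a & -m f \Delta t_e \\ m f \Delta t_e & a \end{pmatrix} \begin{pmatrix} u^{k+1} \\ v^{k+1} \end{pmatrix} = \begin{pmatrix} R_u \\ R_v \end{pmatrix},

where a collects the mass and any implicitly treated drag, and R_u, R_v hold the explicit terms; the system is inverted analytically, so the cost per sub-cycle is essentially unchanged. On the C-grid, u and v live at different (E and N) points, so the v entering the u equation has to be interpolated, which is one reason an explicit evaluation is attractive there.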

@anton-seaice
Contributor

For what it's worth ... I ran the latest CICE with the nuopc/cmeps driver on a 1 degree tripole grid for 12 months without any crashes. The uvel/vvel maxima look OK. The resolution is quite different but the configuration is similar, i.e. all coupling for the ocean and atmosphere is done on the A-grid but CICE is calculating on the C-grid (grid_ice="C").

@JFLemieux73
Contributor

Thanks Anton for the info.

@dabail10
Contributor

dabail10 commented Apr 5, 2024

I have just run CICE6-MOM6 coupled with JRA55 forcing for two cycles (122 years). CICE6 and MOM6 are both on the C-grid, but the coupling is still done at A-grid points. This was on the 2/3 degree tripole grid.
