
Blow-up in a WRF and ROMS coupling case while the WRF-only and ROMS-only cases succeeded #317

Open
qwdy opened this issue Sep 12, 2024 · 5 comments


qwdy commented Sep 12, 2024

Hi John,

I have run a WRF-only case and a ROMS-only case successfully, but the coupled case blew up due to a very high v velocity at grid point (30,93,1). Could you take a look at this? Thanks.

wrcouple_out.txt

@jcwarner-usgs (Collaborator)

It is difficult for me to diagnose this issue from the txt file. You will need to look at the ROMS his and rst files. It looks like it happened early in the run. Maybe save the his file every 5 minutes to see what is happening, save the rst every 5 minutes as well, and then look at rst time level 3.
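
(In ocean.in, NHIS and NRST count model time steps, so a 5-minute interval is 300/DT steps.) For the restart inspection, a minimal sketch in Python with netCDF4, assuming the restart file is named ocean_rst.nc (whatever RSTNAME points to in your ocean.in) and holds at least three records with the standard non-perfect-restart layout:

```python
# Hedged sketch: find the largest |v| in the third restart record.
# Assumes a ROMS restart file named ocean_rst.nc with standard
# variables ocean_time and v(ocean_time, s_rho, eta_v, xi_v).
import numpy as np
from netCDF4 import Dataset

with Dataset("ocean_rst.nc") as nc:
    v = nc.variables["v"][2]           # time level 3 is 0-based index 2
    t = nc.variables["ocean_time"][2]
    k, j, i = np.unravel_index(np.argmax(np.abs(v)), v.shape)
    print(f"ocean_time={float(t)}: max |v| = {float(abs(v[k, j, i])):.3f} m/s "
          f"at (s_rho={k}, eta_v={j}, xi_v={i})")
```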


qwdy commented Sep 13, 2024

Hi John

I changed the ROMS NRST and NHIS parameters to write the rst and his files every 5 minutes. In the first 25 minutes everything seemed fine, but at the 30-minute mark ROMS blew up because of one grid point with a very high vbar and v-momentum component. As I said before, the ROMS-only case ran successfully, which should mean my ROMS input files are basically reasonable.

My idea is to set this problematic grid point's mask to 0 to see if I can make it work. What do you think?
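
For reference, a sketch of that mask edit, assuming the grid file is named roms_grid.nc and that 30 and 93 are the 1-based (i, j) indices ROMS reported; after zeroing mask_rho, the u/v/psi masks are rebuilt so they stay consistent:

```python
# Hedged sketch: land-mask one rho point and rebuild the derived masks.
# Assumes a grid file named roms_grid.nc; adjust the name and indices.
from netCDF4 import Dataset

i, j = 30 - 1, 93 - 1  # ROMS reports 1-based (i, j); Python is 0-based

with Dataset("roms_grid.nc", "r+") as nc:
    rmask = nc.variables["mask_rho"][:]
    rmask[j, i] = 0.0                  # arrays are stored (eta, xi)
    nc.variables["mask_rho"][:] = rmask
    # A u/v/psi point is water only if all adjacent rho points are water.
    nc.variables["mask_u"][:] = rmask[:, 1:] * rmask[:, :-1]
    nc.variables["mask_v"][:] = rmask[1:, :] * rmask[:-1, :]
    nc.variables["mask_psi"][:] = (rmask[1:, 1:] * rmask[1:, :-1]
                                   * rmask[:-1, 1:] * rmask[:-1, :-1])
```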

Maybe the coupling with WRF causes the error, but I don't have a solution yet. My focus is mainly on the WRF output, so I made the ROMS domain larger than the WRF domain to ensure the SST over the WRF domain comes entirely from ROMS. If I swap their domain sizes, will it work?

namelist.input.txt
WRF-ONLY.out.txt
ROMS-ONLY.in.txt
ROMS-ONLY.out.txt

@jcwarner-usgs (Collaborator)

I would just mask that cell out, or change the bathymetry at it. This is a coarse resolution along the coast. Try that before you change the whole WRF grid; I doubt that would solve the problem.
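
If the bathymetry route is preferred, a sketch in the same spirit, assuming the same roms_grid.nc with the standard depth variable h (metres, positive down); replacing the cell's depth with the mean of its wet neighbours reduces the sharp gradient that often drives these blow-ups:

```python
# Hedged sketch: smooth the bathymetry at one cell by setting it to the
# mean depth of its wet neighbours. Assumes roms_grid.nc with standard
# variables h (depth, m, positive down) and mask_rho.
import numpy as np
from netCDF4 import Dataset

i, j = 30 - 1, 93 - 1  # 0-based indices of the problem cell

with Dataset("roms_grid.nc", "r+") as nc:
    h = nc.variables["h"][:]
    mask = nc.variables["mask_rho"][:]
    wet = [h[jj, ii] for jj, ii in
           [(j - 1, i), (j + 1, i), (j, i - 1), (j, i + 1)]
           if mask[jj, ii] > 0]
    if wet:
        h[j, i] = float(np.mean(wet))
        nc.variables["h"][:] = h
```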


qwdy commented Sep 23, 2024

Hi John,

I masked that cell out, but the simulation still failed, this time caused by the point next to the masked one. So I swapped the domain sizes of ROMS and WRF (i.e., WRF is now larger than ROMS), and it worked. I want to know whether COAWST has a constraint that the WRF domain must be larger than the ROMS domain.

Besides, I have a question about time in COAWST: in the wrfout* files and ocean_his.nc, are the times in the UTC time zone?

@jcwarner-usgs (Collaborator)

There is no constraint on the model grid sizes. If you changed the WRF grid, the point at issue is well within both grid domains, so I don't know why the grid extent would have any effect on values well inside the grid.

COAWST does not check the times; I just assume the user will start the models at the same time. Fields are exchanged based on the coupling intervals set in coupling.in. I did not make this too complicated. Other systems, such as ESMF-based ones, control the model clocks, etc.; I did not get that detailed.
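
Since the system trusts the user on this, a quick hedged check that the two clocks actually line up: read the first output instant from each model and compare (file names below are placeholders; WRF simulation times are conventionally UTC):

```python
# Hedged sketch: compare the first output instants of WRF and ROMS.
# File names are placeholders; adjust them to your run.
from netCDF4 import Dataset, num2date

with Dataset("wrfout_d01_2024-09-01_00:00:00") as nc:
    # WRF stores Times as a character array, 'YYYY-MM-DD_HH:MM:SS'
    wrf_start = nc.variables["Times"][0].tobytes().decode()

with Dataset("ocean_his.nc") as nc:
    t = nc.variables["ocean_time"]
    roms_start = num2date(t[0], t.units)  # uses the file's own reference time

print("WRF start :", wrf_start)
print("ROMS start:", roms_start)  # both should name the same instant
```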
