
add export of cpl_scalars #8

Merged · 2 commits · Apr 3, 2024

Conversation

@DeniseWorthen (Author) commented Mar 8, 2024

Export 2d-size of tile and number of tiles for use by CMEPS mediator history files.

See additional information in UFS Issue ufs-community/ufs-weather-model#2171

* export 2d size of tile and number of tiles for use by CMEPS
mediator history files
@DeniseWorthen (Author)

@uturuncoglu This is working fine for UFS. I am able to get the land history files on 6 tiles. I have a run here

 /glade/derecho/scratch/worthen/FV3_RT/rt_26609/lnd.db

The file ufs.cpld.cpl.hi.2021-03-22-43200.nc shows the fields on the 6 tiles.

@uturuncoglu (Collaborator) commented Mar 8, 2024

@DeniseWorthen I am not sure whether there is an issue or not, but please see the following plots. The first one is q2m from the land history (ufs.cpld.lnd.out.2021-03-23-21600.tile1.nc) and the second one is atmExp_Sl_q from the mediator history (ufs.cpld.cpl.hi.2021-03-23-21600.nc). The mediator one seems to have lots of zeros over land.

[Two screenshots attached]

I am not sure whether we had the same issue with the original implementation or not. It seems other variables have the same issue, but since the units differ between the model history and the mediator history, it is hard to tell.

@uturuncoglu (Collaborator)

The following is the same comparison for zvfun:

[Two screenshots attached]

@uturuncoglu (Collaborator)

@DeniseWorthen It would also be nice to run the /glade/derecho/scratch/worthen/FV3_RT/rt_26609/control_p8_atmlnd_intel case again, adding those options for the atm output. Then we could also check the fields exported to the land component.

@DeniseWorthen (Author)

Hm... so the ufs.cpld.lnd.out.2021-03-23-21600.tile1.nc files also contain the export-state variables on tiles?

One test I didn't do is to check against the current method of setting the tile size via config. I think you had used that previously. The new history files should be equivalent.

@uturuncoglu (Collaborator)

@DeniseWorthen Yes. The land output also includes the forcing fields coming from the atmosphere. For example, dswsfc is the shortwave, t1 is temperature and q1 is humidity. Wind is a little bit different since we are not using wind direction.

@uturuncoglu (Collaborator)

@DeniseWorthen Yes, if I remember correctly, I was using the config options for land as well, but it is hard to remember exactly.

@DeniseWorthen (Author) commented Mar 8, 2024

@uturuncoglu I made a couple of checks.

First, I wrote the mediator history files on the mesh and decomposed them into the tiles using NCL. Those fields match the import/export fields I get if I use the original method of setting a config variable:

  history_tile_atm = 96
  history_tile_lnd = 96 

This original method only worked for component history files, but even when the fields are split up into two files (one for atm history and one for lnd history), the fields are identical to the original mesh fields.

Then I checked that the fields using the new cpl_scalars and the fields from NCL are the same as those written directly via CMEPS. So I'm pretty confident the new cpl_scalars are writing the tiled history files correctly. I did find an issue that the fractions are not written on the tiles when you use a single history write phase, so I'll need to fix that in CMEPS. I can fix that later though.

Second, I looked at time=25200 on tile 1 and compared the zvfun in the "out" file to the lndImp_Sl_zvfun and the atmexp_Sl_zfun in the mediator history.

[Three screenshots attached]

So the lndimp looks the same as the lnd out file, but the atmexp is different. I don't know whether that is expected.

I also checked the DSWSFC field from the "out" and the LNDEXP_FAXA_SWDN and ATMIMP_FAXA_SWDN fields and those all look consistent.

@uturuncoglu (Collaborator)

@DeniseWorthen Thanks for checking. Yes, it seems there is an issue in the atm export. I think lndImp_Sl_zvfun needs to be the same as atmexp_Sl_zfun. I am not sure why we have this difference. Is it the same for the out-of-the-box configuration, i.e. without any modification? We could check the baseline to see whether this was an issue in the first place. If so, maybe there is a bug in the mediator, since lndImp_Sl_zvfun is fine but atmexp_Sl_zfun is not.

@uturuncoglu (Collaborator) commented Mar 8, 2024

@DeniseWorthen @barlage I was also looking at the debug-mode issue. I put the required checks at the place where we have the issue, and it is still failing. I think I know why: we are using C96.vegetation_type.tile to get the vegetation type and using it (along with the fraction) to construct the land-sea mask, but we are also using the initial-condition sfc_data.tile files to get the vegetation type, and I think the vegetation type found in C96.vegetation_type.tile is not consistent with the one found in sfc_data.tile. Look at the attached plot of the difference in vegetation type for tile 1 (static file - src file):

[Screenshot attached]

There are lots of differences, and I think there are also land-sea mask differences between those files, and that leads to the issue. I could modify the code and make it consistent by also using the src file to construct the land-sea mask. Just to be sure, I also need to check the datm coupled configuration since that uses a custom initial condition. @barlage let me know what you think. @DeniseWorthen once I fix the issue, I think this could be part of your PR since you have some modifications in the land model too. Of course we also need to solve the issue in atmexp_Sl_zfun and the other atmexp_ fields.
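The kind of static-file vs sfc-file mismatch described above can be checked mechanically. Below is a minimal Python sketch of such a check; the function name, the toy data, and the assumption that vegetation class 0 means water are illustrative only, not taken from the actual files:

```python
# Hypothetical consistency check: compare the land/water mask implied by
# the static vegetation_type file against the one implied by sfc_data,
# cell by cell. Class 0 is treated as water here; the real files may use
# a different convention.

def mask_from_vegtype(vegtype, water_class=0):
    """1 where the vegetation type implies land, 0 where it implies water."""
    return [0 if v == water_class else 1 for v in vegtype]

# Toy per-cell data standing in for one tile of each file
static_veg = [1, 7, 0, 12]
sfc_veg    = [1, 0, 0, 12]

mismatch = [i for i, (a, b)
            in enumerate(zip(mask_from_vegtype(static_veg),
                             mask_from_vegtype(sfc_veg)))
            if a != b]
print(mismatch)  # cells where the two files disagree on land vs water
```

In practice the arrays would be read per tile from the NetCDF files and compared together with the fraction field.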

@uturuncoglu (Collaborator)

@DeniseWorthen @barlage I think this is also related to ufs-community/ufs-weather-model#1423

@uturuncoglu (Collaborator)

@DeniseWorthen @barlage Okay. I fixed the issue with vegetation fraction but am now getting an error from soil type. This is also read from the C96.soil_type.tile*.nc files but is also found in sfc_data.tile*.nc (I think as stype; by the way, it is hard to know which variable is which since the surface file has no meaningful long names, just variable names). Maybe I could get the variable from the surface file, but it won't be consistent with the others. Perhaps for active-atmosphere coupled configurations all the information needs to be populated from the surface file, and for the data-atmosphere configuration I could use the static files (starting with C96). I am getting the following variables from static files:

geolat - oro_data.tile*.nc
soil_type - soil_type.tile*.nc
vegetation_type - vegetation_type.tile*.nc (this also needs to be read from sfc)
slope_type - slope_type.tile*.nc
substrate_temperature - substrate_temperature.tile*.nc
maximum_snow_albedo - maximum_snow_albedo.tile*.nc
vegetation_greenness - vegetation_greenness.tile*.nc
soil_color - soil_color.tile*.nc

but I think the surface file does not have all of those variables, and we will end up with inconsistencies in those variables in terms of land-sea mask and actual values, as I showed for zvfun.

@DeniseWorthen (Author)

> @DeniseWorthen Thanks for checking. Yes, it seems that there is an issue in atm export. I think that lndImp_Sl_zvfun needs to be same as atmexp_Sl_zfun. I am not sure why we have this difference. Is it same for the out-of-box configuration? I mean without any modification. We could check the baseline if this was an issue in the first place. If so maybe there is a bug in mediator since lndImp_Sl_zvfun is fine but atmexp_Sl_zfun not.

I did run the develop branch and used NCL to write out the fields on the tiles. I see the same thing: the lndImp and the atmExp of Sl_zfun are not the same, and I agree they should be. But I also don't understand what the "land fraction" fields mean. The lfrac in the atm looks the same as the lfrac in the land, but I don't understand how lfrac can be zero on land. It must mean something different than I think:

[Screenshot attached]

@uturuncoglu (Collaborator)

@barlage @DeniseWorthen I made some changes in the code to solve the issue under debug mode, which is triggered by inconsistencies in the input data. The following is the initial attempt to solve it:

https://github.com/NOAA-EMC/noahmp/compare/develop...NOAA-EMC:noahmp:hotfix/debug_mode?expand=1

This is working and passes the point where Denise had the issue, but it fails with another error like the following:

forrtl: severe (408): fort: (3): Subscript #1 of the array ALBSAT_TABLE has value -999 which is less than the lower bound of 1

Image              PC                Routine            Line        Source
fv3.exe            000000000AB02772  noahmpdrv_mp_tran        1402  noahmpdrv.F90
fv3.exe            000000000AADFE10  noahmpdrv_mp_noah         853  noahmpdrv.F90
fv3.exe            000000000E289048  lnd_comp_driver_m         601  lnd_comp_driver.F90
fv3.exe            000000000E25EF9C  lnd_comp_nuopc_mp         458  lnd_comp_nuopc.F90

This is basically caused by the soil color data. If you look at my changes closely, I am still getting the soil color data from the static file rather than the sfc file, and this causes an issue since those are not consistent. I am not sure how this can be solved. Two possible options are (1) create a new initial condition (sfc file) that also has the other variables like soil color (I am still not sure why the soil color does not match the sfc file in terms of masking), or (2) update the soil color static file. @barlage let me know what you think.
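One defensive option, purely illustrative and not what the hotfix branch does, is to guard the soil color class before it is used as a table subscript (the failing lookup above is ALBSAT_TABLE). The fill value below comes from the traceback; the table size and fallback class are assumptions:

```python
SOIL_COLOR_FILL = -999      # fill value seen in the traceback above
N_SOIL_COLOR_CLASSES = 20   # assumed number of classes; check the noahmp tables
DEFAULT_SOIL_COLOR = 4      # assumed fallback class

def safe_soil_color(raw):
    """Clamp an out-of-range soil color class to a usable default so a
    lookup like ALBSAT_TABLE(soilcol) cannot go out of bounds."""
    if raw < 1 or raw > N_SOIL_COLOR_CLASSES:
        return DEFAULT_SOIL_COLOR
    return raw

print(safe_soil_color(SOIL_COLOR_FILL))  # falls back to the default class
print(safe_soil_color(7))                # valid classes pass through unchanged
```

A guard like this hides the underlying data inconsistency rather than fixing it, which is why making the input files consistent is the preferred route.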

@uturuncoglu (Collaborator)

@DeniseWorthen I think this is not related to the land fraction field, and I think we have issues with other fields. Maybe some recent changes on the CMEPS side. The baseline created after the land PR is,

/glade/derecho/scratch/epicufsrt/ufs-weather-model/RT/NEMSfv3gfs/develop-20240124/control_p8_atmlnd_intel/

and current baseline is,

/glade/derecho/scratch/epicufsrt/ufs-weather-model/RT/NEMSfv3gfs/develop-20240301/control_p8_atmlnd_intel/

The control_p8_atmlnd_intel test does not check the mediator history file, so it is not easy to check them quickly. I could try to run the version of ufs-weather-model just after the land PR https://github.com/ufs-community/ufs-weather-model/tree/932a532516e8b9fd7aa13cc137f777cbcc8e8709 and check it.

BTW, if I check the FV3 output from your run /glade/derecho/scratch/worthen/FV3_RT/rt_26609/control_p8_atmlnd_intel/sfcf024.tile1.nc and look at spfh2m, it looks strange, and it is the same in develop-20240124. So maybe we missed the issue and CMEPS was buggy in the land PR too. Anyway, I'll look at the issue.

@uturuncoglu (Collaborator)

@DeniseWorthen Maybe there is something wrong with land frac, and that leads to an issue when data is transferred from land to atmosphere. Anyway, I am trying to run it on my side to test some ideas.

@DeniseWorthen (Author) commented Mar 11, 2024

@uturuncoglu Maybe Mike understands this better, but I know George recently committed a new "v2" surface file test. I don't know if those files are more appropriate to use (the old land-mask mismatch issue?).

I feel like I opened a can of worms, but I do think the cpl_scalars are working correctly (except for ESCOMP/CMEPS#436 (comment)). Let me know what I can do to help.

@uturuncoglu (Collaborator)

@DeniseWorthen Yes, file inconsistencies are the main issue here. I am glad that we hit the issue; it forces us to fix it. We have a weekly tag-up with @barlage today, so we could discuss it. I agree that the I/O part seems to be working. I am still working on the land fraction issue; I might solve it today.

@uturuncoglu (Collaborator)

@DeniseWorthen I investigated it a little bit further. Here is my finding: the following part of the code in CMEPS basically assigns lfrac: https://github.com/NOAA-EMC/CMEPS/blob/624920ddbd819c76ec37591c24e872308201810e/mediator/med_fraction_mod.F90#L257. I checked both is_local%wrap%FBImp(complnd,complnd) and is_local%wrap%FBFrac(complnd), and those seem fine. But the one (https://github.com/NOAA-EMC/CMEPS/blob/624920ddbd819c76ec37591c24e872308201810e/mediator/med_phases_post_lnd_mod.F90#L61) used in the med_map_field_packed routine is not. Somehow the land fraction used in the run phase is corrupted. At this point I have no idea why it is corrupted in the post phase. I'll keep checking to find the reason.

@DeniseWorthen (Author)

@uturuncoglu good digging. I wonder what the comment "this may be overwritten later" (in the fraction_mod) means.

@uturuncoglu (Collaborator)

@DeniseWorthen Yes. In the fraction file, the mediator tries to map the land fraction coming from the atmosphere as the land fraction. I put more debug code into CMEPS to find out whether this is the issue or not. If so, we might need to put some specialization here. I am not sure about the configuration under CESM; maybe there is some implicit assumption there. I think FV3 does not export the land fraction, but I need to check.

@barlage (Collaborator) commented Mar 11, 2024

@DeniseWorthen @uturuncoglu I did a little digging into the sfc_data and fix files being used in the RT. What I mean below by "not consistent" is that there are not valid land characteristics over all grids where land_frac>0. From what I can tell:

  • those used in the RT by default are not consistent, e.g.
fixpath = "/scratch1/NCEPDEV/stmp2/Michael.Barlage/FV3_RT/rt_114204/control_p8_atmlnd_intel/INPUT/"
sfcpath = "/scratch1/NCEPDEV/stmp2/Michael.Barlage/FV3_RT/rt_114204/control_p8_atmlnd_intel/INPUT/"
  • I believe these are coming from here, and they are also not consistent (it doesn't matter which oro file is used):
fixpath = "/scratch2/NAGAPE/epic/UFS-WM_RT/NEMSfv3gfs/input-data-20221101/FV3_fix_tiled/C96/"
sfcpath = "/scratch2/NAGAPE/epic/UFS-WM_RT/NEMSfv3gfs/input-data-20221101/FV3_input_data/INPUT_L127/"

Note these are consistent for non-fractional grids.

  • these "v2" data are consistent for fractional grids and probably should be used:
fixpath = "/scratch2/NAGAPE/epic/UFS-WM_RT/NEMSfv3gfs/input-data-20221101/FV3_input_data/INPUT_L127_v2_sfc/fix_sfc/"
sfcpath = "/scratch2/NAGAPE/epic/UFS-WM_RT/NEMSfv3gfs/input-data-20221101/FV3_input_data/INPUT_L127_v2_sfc/"
  • these also seem consistent, but may be dangerous to use since they are a mix of two dirs and may not be consistent in other ways
fixpath = "/scratch2/NAGAPE/epic/UFS-WM_RT/NEMSfv3gfs/input-data-20221101/FV3_input_data/INPUT_L127_v2_sfc/fix_sfc/"
sfcpath = "/scratch2/NAGAPE/epic/UFS-WM_RT/NEMSfv3gfs/input-data-20221101/FV3_input_data/INPUT_L127_mx100_v2_sfc/"

@uturuncoglu (Collaborator)

@barlage Thanks for checking. Since the test uses the tests/fv3_conf/control_run.IN file to copy files, I checked that one, and there is a special variable V2_SFC_FILE that needs to be set to .true. to use the V2 files. Anyway, let me try to include that variable in the land test and see what happens.

@uturuncoglu (Collaborator)

@DeniseWorthen The following part of the code changes the land fraction: https://github.com/NOAA-EMC/CMEPS/blob/624920ddbd819c76ec37591c24e872308201810e/mediator/med_fraction_mod.F90#L454. It seems that 'lfrac' in FBFrac(compatm) is set by https://github.com/NOAA-EMC/CMEPS/blob/624920ddbd819c76ec37591c24e872308201810e/mediator/med_fraction_mod.F90#L390 like the following:

          if (associated(lfrac)) then
             if (is_local%wrap%comp_present(complnd)) then
                do n = 1,size(lfrac)
                   lfrac(n) = 1.0_R8 - ofrac(n)
                   if (abs(lfrac(n)) < eps_fraclim) then
                      lfrac(n) = 0.0_R8
                   end if
                end do
             else
                lfrac(:) = 0.0_R8
             end if
          end if

using the ocean fraction; but since we don't have an ocean component, and because of eps_fraclim, it goes weird. I am planning to put some control here and skip the section starting at L454, since we have the land fraction coming from the land component and no ocean component. @mvertens Is there any CESM configuration that just couples the atmosphere with the land, without any ocean component (data or active)? Maybe this configuration was never tested under CESM. @DeniseWorthen @mvertens Please let me know what you think.
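The effect of that snippet is easier to see in isolation. Here is a minimal Python sketch of the same logic; the eps_fraclim value is an assumption:

```python
EPS_FRACLIM = 1.0e-3  # assumed threshold; the actual CMEPS value may differ

def lfrac_from_ofrac(ofrac, lnd_present=True):
    """Mirror the Fortran above: lfrac = 1 - ofrac, with tiny values
    clamped to zero; all zeros when no land component is present."""
    if not lnd_present:
        return [0.0] * len(ofrac)
    out = []
    for o in ofrac:
        l = 1.0 - o
        if abs(l) < EPS_FRACLIM:
            l = 0.0
        out.append(l)
    return out

# With a meaningful ocean fraction the result is sensible:
print(lfrac_from_ofrac([0.0, 0.3, 1.0]))  # [1.0, 0.7, 0.0]
# But without an ocean component ofrac carries no real information, so
# land points can come out with lfrac = 0, as described above.
```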

@uturuncoglu (Collaborator)

@DeniseWorthen Okay. The part that sets the land fraction using the ocean is only used when either the ocean or the ice model is present. I think we need to add extra protection at L822 to prevent getting the land fraction from the atmospheric model component.

@uturuncoglu (Collaborator)

Sorry, line 461 in med_fraction_mod.F90. Anyway, I also checked the land fraction coming from the atmospheric model component, and it does seem to be corrupted. See the following plot:

[Screenshot attached]

As you can see, we have lots of zeros over land, which is not realistic and creates the issue. I think in our case we have to use the land-provided fraction rather than the one interpolated from the atmosphere, which is not correct for the UFS case.

@uturuncoglu (Collaborator)

@DeniseWorthen Maybe the mapping in https://github.com/NOAA-EMC/CMEPS/blob/624920ddbd819c76ec37591c24e872308201810e/mediator/med_fraction_mod.F90#L431 is not working correctly in our case, since it is trying to map the land frac from the land component to the atm.

@uturuncoglu (Collaborator)

@DeniseWorthen Okay. I traced the issue further. It seems that the atmospheric model mask shows the same issue. The mediator log has the following:

 (module_med_map: med_map_routehandles_initfrom_field) creating RH consd for lnd to atm srcMask =    -987987 dstMask =          1

So, since the destination mask (the fv3 model) is corrupted, the interpolation also has an issue. I checked the fv3 configuration, and it sets the mask in addLsmask2grid under FV3/atmos_model.F90 like the following:

!$omp parallel do default(shared) private(i,j,nb,ix)
    do j=jsc,jec
      do i=isc,iec
        nb = Atm_block%blkno(i,j)
        ix = Atm_block%ixp(i,j)
! use land sea mask: land:1, ocean:0
        lsmask(i,j) = floor(one + epsln - GFS_data(nb)%SfcProp%oceanfrac(ix))
      enddo
    enddo

Also, oceanfrac is set under FV3/io/fv3atm_sfc_io.F90, and differently depending on whether the grid is fractional or not. In my case, frac_grid = .true., so I am not sure what is wrong here. I'll keep digging. Now I am starting to think that maybe the setting of the atmospheric model mask has an issue.
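To see what the floor expression does at a coastal cell, here is the same computation in Python (the epsln value is an assumption):

```python
import math

EPSLN = 1.0e-10  # small epsilon, standing in for epsln in the snippet above

def lsmask_from_oceanfrac(oceanfrac):
    """Land/sea mask as in addLsmask2grid: land = 1, ocean = 0.
    floor(1 + eps - oceanfrac) is 1 only when oceanfrac is (almost)
    exactly zero, so any cell with nonzero ocean fraction counts as ocean."""
    return math.floor(1.0 + EPSLN - oceanfrac)

print(lsmask_from_oceanfrac(0.0))  # 1: pure land
print(lsmask_from_oceanfrac(1.0))  # 0: pure ocean
print(lsmask_from_oceanfrac(0.4))  # 0: fractional coastal cell
```

This is why, on a fractional grid, a coastal cell with partial land can still end up masked as ocean in the coupling mask.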

@DeniseWorthen (Author) commented Mar 11, 2024

@uturuncoglu What I still don't understand is what the land fraction should be on land, away from a coast. Near a coast, "land frac" in FV3 has a meaning related to the mapped ocean mask. That is the 1-land_frac in the figure below (it is called land_frac, but it is really ocean_frac) used in the setting of the lsmask variable.

[Screenshot attached]

Does the land model have a concept of "fractional land", so that a grid cell in the middle of the Sahara would be something other than land_frac=1? Your satellite-view figure above agrees with the variable med_frac_atm_lfrac I posted above. It seems like the imported field (lndImp_Sl_zvfun) is getting masked by this atm_lfrac before being sent to the atm. I just don't know what land frac it should be using.

@DeniseWorthen (Author)

So what have we really changed here? We're not changing the lsmask in atmos_model, but you've fixed a couple of bugs in noahmp and we've changed the mapping masks. But you were testing all of that earlier today, when you were getting bad VTK files?

Is the only difference now that we're using v2 surface files in addition to the other changes?

I wanted to see if the fix really comes from the v2 surface files. I tried running the same executable (hotfix + cmeps mapping) with the old surface files and I still get a seg fault:

forrtl: severe (408): fort: (3): Subscript #1 of the array ALBSAT_TABLE has value -999 which is less than the lower bound of 1

Image              PC                Routine            Line        Source
fv3.exe            000000000AB0A142  noahmpdrv_mp_tran        1402  noahmpdrv.F90

@uturuncoglu (Collaborator)

@DeniseWorthen The fix does not fix the issue with the old surface files, since we are still getting the soil color from the fixed file, not from the sfc file. In the v2 case, those files are consistent (soil color too) and it works. I have not checked whether the v2 sfc files have soil color in them; if so, I could get that one too. Anyway, the code crashes because the old files are not consistent in terms of soil color.

@DeniseWorthen (Author)

@uturuncoglu I don't think it's necessary to make the current code work with the old input; I'm just wondering what really "fixed" the land mask issue?

We definitely need to commit a debug atm-land test.

@uturuncoglu (Collaborator) commented Mar 12, 2024

@DeniseWorthen Okay. Good news. It seems the optimized build is also working fine; we have no land fraction issue anymore. Please double-check on your end too if you can find time. Anyway, let me work a little bit more on the noahmp fix branch and clean it, and if soil color is available in the sfc file I could get it from there too. I'll update you about it. In the meantime you could get the mask change into your CMEPS PR. I also need to run the other cases to be sure that the sbs and datm coupled ones are fine. I am not expecting any issue, but just to be on the safe side. I think with these changes we will have answer changes in the land-related tests. I am also planning to add a land debug test. Do you want me to create a ufs-weather-model branch that you could pull into your PR?

@DeniseWorthen (Author)

Good morning @uturuncoglu. I was hoping to bring the cpl_scalars work into UFS in a single PR w/ no baseline changes (except for the single history file which is compared and which is now on tiles). So I'd like to do the following, if it works for you.

  1. update my ESCOMP CMEPS PR w/ the two changes (the mapping masks for LND and getting the fractions on tiles when you use the single mediator history phase). This probably means you'll need to re-test in CESM.

  2. Commit a UFS PR containing the cpl_scalars for FV3, NoahMP and CMEPs.

  3. You can have more time to confirm all the NoahMP bug fixes and V2 surface files are working correctly. Then we create a UFS PR w/ the NoahMP bug fixes + debug test + v2 surface files. Once you give me the go-ahead, I'm happy to do that and do all the testing to get it into the commit Q.

Let me know if this all works for you.

@uturuncoglu (Collaborator)

@DeniseWorthen Yes, that works. Then I could create the fix PR. BTW, it seems everything is fine except 2m specific humidity; we have some small values around the coastline. I am not sure whether this is a v2 data file issue or not. Are you seeing the same issue on your end?

@DeniseWorthen (Author)

Let me check. I didn't specifically look at q2m.

@DeniseWorthen (Author) commented Mar 13, 2024

The 2m specific humidity looks wrong in the sfc tile. In the mediator history, though, the atm import and the lnd export agree.
[Screenshot attached]

My test case is V2 + noahmp hotfix + cmeps mapping change.

@uturuncoglu (Collaborator)

@DeniseWorthen Thanks for checking. Did you also look at t2m? Does that have a similar issue? There might be a bug at the CCPP level too, so I'll check it. Anyway, please add any piece to your PR and I can start working from that point. Thanks for all your help.

@DeniseWorthen (Author)

I see the T2m temperature in the sfc file showing the same 0.0 values along the coast as the specific humidity.

I didn't want to push the two fixes to my ESCOMP CMEPS PR in case there is still an issue in the CMEPS mapping. I don't think there is, because the lnd imports and atm exports look OK. But I wanted to check w/ you first.

@uturuncoglu (Collaborator)

@DeniseWorthen Okay, it is good to know that we have an issue in t2m. In my case, I could not see it. How did you check it? BTW, I think the CMEPS mask change for lnd->atm is fine.

@uturuncoglu (Collaborator)

@DeniseWorthen In my case, if I check the sfc file, tmpsfc seems fine.

@uturuncoglu (Collaborator)

@DeniseWorthen Okay. Got it. tmp2m has the issue. q2m and t2m follow different paths in CCPP, so there could be an issue over there, since I am not seeing any issue in latent and sensible heat.

@uturuncoglu (Collaborator)

@DeniseWorthen BTW, I am planning to open a new issue on the UFS Weather Model side to track the issues specific to lnd->atm coupling and the inconsistencies in the input files. I could also use it in the fix PR that will come soon.

@DeniseWorthen (Author)

@uturuncoglu I'm guessing at how to match the mediator history variables with the surface file variables. But this is T2m from the sfc file and sa_ta from the mediator

[Screenshot attached]

and tmpsfc and tskn from the mediator
[Screenshot attached]

@uturuncoglu (Collaborator) left a comment


This looks fine. Thanks for adding cpl_scalars support. The GitHub action is also passing for the DATM+LND configuration. @barlage please let me know if you want to review this PR.

@uturuncoglu (Collaborator)

@DeniseWorthen @barlage JFYI, I created a new issue under UFS Weather Model to track the issues with inconsistent input and the debug-mode run: ufs-community/ufs-weather-model#2189

@DeniseWorthen (Author)

@uturuncoglu Great. I've run the control_p8_atmlnd with the CMEPS branch containing the mapping difference and it does change the baseline. Not unexpected, but just so you know.

@uturuncoglu (Collaborator)

@DeniseWorthen Yes, thanks for confirming. If I remember correctly, the datm coupled configurations and sbs were fine with this change in my tests. Anyway, I am working on it.

@uturuncoglu (Collaborator)

@DeniseWorthen I created a draft PR here: ufs-community/ufs-weather-model#2191. It will go after yours since it depends on the CMEPS changes.

@DeniseWorthen (Author)

@uturuncoglu Thanks for your follow-up work on this. Let me know if I can help test. I'll create the final sync-to-emc CMEPS PR once you merge (assuming no issues).

@zach1221 commented Apr 3, 2024

Hello, @barlage. Testing is complete on WM PR #2175. Could you please merge this noahmp PR for us?

@DeniseWorthen (Author)

Maybe @uturuncoglu can merge also?

@uturuncoglu merged commit 6a51f02 into NOAA-EMC:develop on Apr 3, 2024
2 checks passed
@uturuncoglu (Collaborator)

@DeniseWorthen @zach1221 Okay. I merged it.
