
Minutes for 2023 meetings


CCPP physics code management meeting notes


Date: Wed 6 Dec 2023

Attendees:

Topic 1: Mike Kavulich will summarize the recommendations to physics developers and code managers so that we can increase the SCIENTIFIC interoperability of the parameterizations/suites among hosts and applications.

Topic 2: Grant Firl will review feedback provided by NCEP Central Operations to improve the CCPP code for use in operational implementations (not about the science but about the code). Grant will also present the ongoing follow up actions.

Slides for topic 2

Date: Wed 8 Nov 2023

Attendees: Lisa, Mike K., Joe, Grant, Dustin, Pedro, Qingfu, Kate, Laura, Jim D., Jimy, Ligia, Alex R., Peter, Soren

Topic 1: Way forward with PUMAS. We heard that NCAR (CGD?) is making PUMAS CCPP-compliant, and we'd like to get more information about this effort so other groups can benefit from it.

Peter: Plans are for one of Jesse Nusbaumer, Cheryl Craig, or Courtney Peverley to work on this within the next 2 years. The plan for including aerosol/chemistry is longer than 2 years. Not a whole lot of details beyond this; this is one of the more complicated schemes to work with. The first "hacked" working version will prescribe aerosols (e.g., from climatology), so this work doesn't have to wait on a working aerosol/chemistry interaction with CCPP.

Lisa: It is challenging to move forward with PUMAS in UFS until it is available; can maybe work with UFS-based interstitials, but will still be difficult.

Topic 2: We have seen some CCPP Physics PR descriptions say that the changes are for tuning the code to host/application X. This is concerning because it may negatively impact other hosts/apps. We'd like to discuss how development can be made in a way that benefits the ecosystem.

Jim: Do host models using code tuned for other hosts pull in that code to stay up-to-date? Some skill may degrade due to updates in other hosts. Changes can range from bugfixes (which everyone should probably want), to new science changes, to host-specific tuning -- hard to differentiate as code progresses.

Grant: Documentation of which parameters and sections of algorithms are most sensitive, and how they might change results, would be ideal. Also, ranges of parameters that are realistic/physical would be helpful.

Several: There could be a need for namelist-controlled logicals to control in-code branching where possible. This is preferable to putting differences in institution-specific branches or something similar.
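
As an illustration of the namelist-logical idea, here is a minimal sketch only (the scheme, flag, and variable names are hypothetical, not existing CCPP code): the host reads a logical from its namelist and passes it into the scheme as a regular argument, so the host-specific tuning branch lives behind a run-time switch rather than in an institution-specific branch. As noted in the chat below, each such flag would need its own metadata entry and standard name.

```fortran
! Hypothetical sketch: a host-settable logical selects a tuning branch inside
! the scheme instead of hard-coding a host-specific value.
subroutine example_scheme_run(im, do_hostx_tuning, qtend, errmsg, errflg)
   implicit none
   integer,          intent(in)    :: im              ! horizontal loop extent
   logical,          intent(in)    :: do_hostx_tuning ! hypothetical namelist flag
   real,             intent(inout) :: qtend(im)       ! some tendency being tuned
   character(len=*), intent(out)   :: errmsg
   integer,          intent(out)   :: errflg
   real    :: tune_fac
   integer :: i

   errmsg = ''
   errflg = 0

   ! Choose the tuning factor at run time rather than in a host-specific branch
   if (do_hostx_tuning) then
      tune_fac = 1.2   ! value a particular host/application might prefer (illustrative)
   else
      tune_fac = 1.0   ! default behavior for all other hosts
   end if

   do i = 1, im
      qtend(i) = tune_fac * qtend(i)
   end do
end subroutine example_scheme_run
```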

Dustin: The issue can be viewed through the lens of which code belongs in primary vs. interstitial. Ideally, all tuning-type stuff should be in interstitial code so that primary schemes can truly be host-agnostic.

Alex: Documentation is necessary, but not sufficient since changes are still made in code for tuning, which makes development/maintenance more difficult.

Dustin: Branching in code has potential performance implications.

Qingfu: Sensitivities of parameters are different for different applications/hosts, although developers probably have a decent idea about most sensitive parameters and parts of the algorithm.

Jim: Documentation of parameter changes, and of any tests that were done (and their results) for PRs, would help others follow as well.

Laura: More concerned about "radical" changes in parameterization than parameter tuning. This leads to difficult decisions on the part of other folks using the scheme.

Peter: A Perturbed Parameter Ensemble is good for more systematic analysis, but relevance still depends on the horizontal and time scales of the testing. A more "loose" method of documenting parameter ranges would be nice (rather than publications, which might "nail" developers to a given range).

Chat:

Jimy Dudhia4:07 PM fix number concentrations

Dustin Swales - NOAA Federal4:19 PM "magic numbers" used for host specific tuning could be added as nml parameters.

Grant4:20 PM I agree about maybe relying on code branching controlled by namelist logicals where necessary.

Michael Kavulich4:24 PM I agree, this discussion is about more than just this specific case. Nearly all schemes have some level of "magic number" tuning.

Grant4:29 PM A lot of folks don't like seeing preprocessor directives in code for stylistic reasons. But tuning goes way beyond "parameters" -- usually changes in actual algorithms.

Dustin Swales - NOAA Federal4:32 PM RRTMGP is host agnostic, developed to get around all these RRTMG flavors floating around

Joseph Olson - NOAA Federal4:33 PM I have to go, but I'd be happy to discuss this with anyone offline.

Lisa Bengtsson - NOAA Federal4:34 PM Same for me, let me know if there's any action needed from the developers

Jimy Dudhia4:45 PM Namelist parameters would have to come in through the meta files, and likely each require whole new scheme-specific standard names.

Michael Kavulich4:46 PM If the developer doesn't know what a number is/does, that's still important to document!

Kate Zhang - NOAA Affiliate4:46 PM I have to leave now. Thanks.

Date: Wed 25 Oct 2023

Attendees: Lulin, Lisa, Mike K., Joe, Grant, Dustin, Fanglin, Pedro, Qingfu, Kate, Laura, Andrew H., Jim D., Ligia, Mike E., Peter, Soren

Today's topics: Proposal for Inclusion of new CCPP Physics Schemes (https://docs.google.com/presentation/d/13QwMEIK3ak6XNi6cLb0FY_vunuVoqiCX/edit#slide=id.p2)

Discussions:

Lisa: The "stakeholders" on the right side of the "onion" diagram need to be updated. It also needs to be made clear what it means to be listed as a "stakeholder".

Lisa: For the outer ring option #2, who would be responsible for using the "install template" that is filled out by the developer? The person who wants to use it with model X (when it is written for model Y)?

Kate: Which schemes belong in the "root" and which belong in submodules?

New business: Fanglin and Jim would like to discuss the Tiedtke convective scheme.

Chats: Grant Firl - NOAA Affiliate2:05 PM Slides: https://docs.google.com/presentation/d/13QwMEIK3ak6XNi6cLb0FY_vunuVoqiCX/edit?usp=sharing&ouid=114709562346133955835&rtpof=true&sd=true

Fanglin Yang - NOAA Federal2:15 PM EMC/DTC for code management, not meant for development, right ?

Grant Firl - NOAA Affiliate2:17 PM I think that the institutions listed on the right side of the oval probably need to be updated since it is confusing. They were originally "stakeholders", whatever that means -- not source of development or who manages it.

Ligia Bernardet - NOAA Federal2:17 PM EPIC does the code mgmt for the UFS Weather Model, which involves many submodules. DTC and EMC co-manage the UFS Fork for CCPP

Lisa Bengtsson - NOAA Federal2:18 PM That makes sense Grant, thanks.

Ligia, yes that's true, but currently to "get in" to the dark blue circle you have to follow the ufs-weather-model regression testing

Ligia Bernardet - NOAA Federal2:19 PM @Lisa: correct

Lisa Bengtsson - NOAA Federal2:20 PM (which is not only done by EPIC but all the UFS code managers)

Dustin Swales - NOAA Federal2:29 PM I support moving testing to the cloud

Ligia Bernardet - NOAA Federal2:30 PM We're getting feedback from Laura's mic

Lisa Bengtsson - NOAA Federal2:38 PM I'm still a bit confused by this diagram, where would NCAR's physics be in this diagram in the future?

Dustin Swales - NOAA Federal2:40 PM Which "NCAR" physics?

Grant Firl - NOAA Affiliate2:40 PM It depends on where they want it? MMM and CGD might have different decisions on where they want it.

Ligia Bernardet - NOAA Federal2:42 PM MMM/CGD can be brought into the Root CCPP repo selectively, as needed, via optional submodules And then it is inside the "red circle" and available to progress down the onion

Grant Firl - NOAA Affiliate2:46 PM The ufs-community/ufs-dev branch also contains more than it should, IMO -- those that are not actually operational or candidates

Dustin Swales - NOAA Federal2:47 PM @Grant Great point. Maybe we should move non-operationally supported physics from The Fork to The Root?

Ligia Bernardet - NOAA Federal2:48 PM The UFS is for research and operations, not just for operations. So, the UFS should have access to developmental schemes

Fanglin Yang - NOAA Federal2:48 PM @Grant@Dustin My preference is to keep UFS fork lean and clean

Dustin Swales - NOAA Federal2:50 PM Maybe operational UFS could point to the UFS Fork (skinny), and "research UFS" could point to The Root (thick)?

Grant Firl - NOAA Affiliate2:50 PM @Lisa The UFS can easily bring in the Root/main branch (which contains everything) instead of the ufs-community/ufs-dev (which can be lean/clean) to support "non-operational" research. @Dustin has the same idea

Dustin Swales - NOAA Federal2:51 PM I also missed Lauras question writing the same response :)

Lisa Bengtsson - NOAA Federal2:54 PM @Grant @Dustin, as long as the developer doesn't have to update several repositories I think it could work.

Date: Wed 11 Oct 2023

Attendees: Lulin, Sam, Tim, Andy, Lisa, Mike K., Joe, Jimy, Grant, Dustin, Fanglin, Pedro, Qingfu, Kate, Laura

Today's topics: 1. Running CCPP Physics in single precision mode. 2. Guidelines for accepting new CCPP physics schemes.

Discussion:

  1. 64-bit to 32-bit conversion (https://docs.google.com/presentation/d/1VrhHt-bXnqXwqYDTMxxoai1NrlofRdJqd73JtdnLOp0/edit#slide=id.g28b37fd4566_0_522): 32-bit support was added to NEPTUNE. Memory and file sizes are the motivation for the conversion. Non-vectorized computation converts all floating-point values loaded from memory to 80 bits in the CPU, then converts back to the original precision when transferring back to memory; in vectorized computation the CPU uses the native floating-point precision. Twice as many reals per kB, but more NaN errors. ESMF does not check point type. HAFS trial: heap corruption, signaling NaNs, Intel compiler issues...
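
As a minimal sketch of how a single code base can be built in either precision (the module and macro names below are hypothetical; this mirrors the working-precision kind-parameter pattern commonly used in ccpp-physics), all reals in the schemes reference one kind constant that a build-time macro switches between 64 and 32 bits:

```fortran
! Minimal sketch of build-time precision selection (hypothetical names):
! all scheme reals use one working-precision kind, so the same source can be
! compiled as 64-bit or 32-bit by toggling a single preprocessor macro.
module working_precision
   use iso_fortran_env, only: real32, real64
   implicit none
#ifdef SINGLE_PREC
   integer, parameter :: wp = real32   ! 32-bit build: half the memory and file size
#else
   integer, parameter :: wp = real64   ! 64-bit build (default)
#endif
end module working_precision

! Example use in a scheme:
!   use working_precision, only: wp
!   real(kind=wp) :: t(im,km)
!   t = t + 1.0_wp   ! literals carry the kind, so no silent promotion occurs
```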

Jimy: What does "vectorized" mean? Sam:

Fanglin: How does the acceleration differ between the dycore conversion and the physics conversion?

  2. 32-bit to 64-bit in RRFS (https://docs.google.com/presentation/d/1SyDIZ6_81nC0JONDtA-mvORuNjTcuCDg/edit#slide=id.p3): tests within RRFS. Basically the results are the same.

Fanglin and Jimy: Good to see maps of fields.

GFS suites and HAFS suites may be considered

  3. Guidelines for accepting new CCPP physics schemes: Need a format. Dustin: more inclusiveness (academia, operational, etc.). Mike K.: Sam: John M.'s work on the 32-bit problem is a good example.

Fanglin: From the operational perspective.

Jimy: What kinds of tests should be passed? SCM testing is needed, which requires interstitial schemes that are more geared towards the GFS regime.

Mike Ek: Systematic test and evaluation platform.

For next meeting. Lisa: PUMAS microphysics scheme inclusion. Hong's new scheme.

Chats:

Samuel Trahan - NOAA Affiliate 2:19 PM https://docs.google.com/presentation/d/1VrhHt-bXnqXwqYDTMxxoai1NrlofRdJqd73JtdnLOp0/edit#slide=id.g28b37fd4566_0_522 https://docs.google.com/presentation/d/1SyDIZ6_81nC0JONDtA-mvORuNjTcuCDg/edit#slide=id.p3

Jimy Dudhia 2:36 PM Good summary

Michael Kavulich 2:38 PM Great point, another win for community collaboration!

Jimy Dudhia 2:41 PM WRF criteria (1) sufficiently different from existing schemes, (2) sufficiently documented, ideally published, (3) sufficiently of interest to more than the developer

Joseph Olson - NOAA Federal 2:42 PM (4) preferable to have long term developer support

Jimy Dudhia 2:42 PM @joe yes, ideally code owner status

Jimy Dudhia 2:50 PM Hong may have a scheme soon too

Date: Wed 13 Sep 2023

Attendees: Lulin, Peter, Qingfu, Jimy, Laura, Weiwei, Andy, Grant, Lisa, Mike K.

Today's topics: 1. CCPP visioning workshop post-event survey results. 2. Versioning the CCPP commits. 3. Cleaning up the physics code to make other host models work with the CCPP repo more easily.

Discussion:

  1. Post-CCPP survey results: Jimy has a good point on multiple hosts with different namelist options and how to reconcile the CCPP SDF.
  2. Versioning the CCPP commits: Matching an operational model requires matching the hashes used in the past. Weiwei: how to differentiate the development version from the operational version? Need to develop a protocol. In the UFS, a challenge is pre-release testing with an SDF name that does not correspond to the code. Individual schemes as submodules to avoid confusion. Use the environment-module concept to inform the CCPP SDF structure. Communication is a key problem. HSD concept for the versioning system. The GFS workflow points to the version of the code with a different tag.
  3. Make sure other host models can access CCPP physics. De-centralize code management. MPAS CCPP physics interaction needs the CCPP framework to handle dynamics-physics consistency.

Chats:

You2:04 PM https://github.com/NCAR/ccpp-physics/wiki/Minutes-for-2023-meetings

Weiwei Li2:40 PM This is a good idea @laura

Weiwei Li2:43 PM I agree @lisa & @lulin - hard to control :) I guess the communication is so individual-dependent

Jimy Dudhia2:46 PM we can at least control public releases by just using CCPP versions, like v6.0 currently. the main purpose is to make it repeatable

Michael Kavulich2:47 PM Perhaps this is another argument for more frequent minor releases

Lisa Bengtsson - NOAA Federal2:49 PM @Jimy The problem with ccpp_v6 for the global model (GFS) was that it did not correspond to any prototype code. But it had a suite def file called _P8, while the code tagged in ccpp_v6 did not correspond to coupled UFS prototype 8 at the time.

Jimy Dudhia2:50 PM @Lisa the group is aware that this is something we want to avoid with naming suites in the future

Grant Firl - NOAA Affiliate2:52 PM FYI, the ccpp-physics repo organization PR is still a draft and open for comments: https://github.com/ufs-community/ccpp-physics/pull/99

Lisa Bengtsson - NOAA Federal2:53 PM @Jimy, yes. I'm aware that you're aware. :-) The tricky thing is to come up with a suite name that can explain what we're running, that is also not very long.

Michael Kavulich2:56 PM As unsatisfying as it sounds, I think the best solution is to forbid references to specific products in the names of CCPP suite files. The names do not need to describe the contents (the contents describe the contents!), they only need to be unambiguous. Documentation can be used to describe what suite files correspond to which operational products (GFS v17 prototype 8d corresponds to Suite_Applesauce.xml, code hash abcd123)

Peter Hjort Lauritzen2:57 PM If you are interested, here is a loooong paper discussing these issues:

https://agupubs.onlinelibrary.wiley.com/doi/full/10.1029/2022MS003117

Date: Wed 30 Aug 2023

Attendees: Ligia, Lulin, Dustin, Qingfu, Jimy, Laura, Pedro, Weiwei, Jesse, Grant, Lisa, Julio

Today's topic: CCPP-Physics repo reorganization: https://docs.google.com/presentation/d/1WnxjIJlp7L6YkIym5lfk-3nIl1yOX0lsOH3aTy1Ayng/edit#slide=id.g240a6f74dc6_0_3

Discussion:

Ligia: likes idea 1b instead of flattening everything; ideally, smoke_dust should be treated as a process like other schemes instead of sitting under the UFS

Jimy: Top level should be somewhere everyone can use, so Thomp_pre should be generic?

Laura: scheme with 1 file vs 10 files, stay at process-level or scheme-level?; Ligia: Thompson_pre is host-specific or not? Dustin: host-specific

Ligia: dependency shared among other schemes; Dustin: yes, e.g., radiation scheme shared between two parameterizations

Grant: in favor of 1b; naming standard is another topic discussed during the workshop (like how WRF does it, e.g., every PBL scheme starts with bl_*); Jimy: start with module_bl, everything PBL is under there

Ligia: Where would the interstitials live? Under the host? Or under each physics scheme?

Dustin: in forks or in mothership? UFS/CAM-SIMA only wants their own interstitials

Jesse: should keep everything in the same repo… if we can’t get access to the code, we won’t know how a model works

Pedro: use symbolic link (Dustin noted down)

Chats:

Lulin Xue 2:01 PM https://github.com/NCAR/ccpp-physics/wiki/Minutes-for-2023-meetings

Lulin Xue 2:06 PM https://docs.google.com/presentation/d/1WnxjIJlp7L6YkIym5lfk-3nIl1yOX0lsOH3aTy1Ayng/edit#slide=id.g240a6f74dc6_0_3

Grant Firl - NOAA Affiliate 2:12 PM https://github.com/NCAR/ccpp-framework/discussions/476

Lisa Bengtsson - NOAA Federal 2:17 PM If UFS and NRL share all interstitials and pre/post you could have a UFS_NRL/ directory?

Ligia Bernardet - NOAA Federal 2:18 PM Interstitials/UFS_NRL_SCM directory?

Jimy Dudhia 2:18 PM @Lisa yes that came up offline too, also the issue of ownership

Jimy Dudhia 2:19 PM OK, my example of thompson_pre would be under thompson/

James Doyle 2:20 PM I agree with Lisa's point. Also, yes, NEPTUNE and not NRL

Jimy Dudhia 2:25 PM I think the directory organization makes them easy enough to find

Jimy Dudhia 2:28 PM External hosts could use CCPP with their own interstitials

Jimy Dudhia 2:30 PM Good point. github forks come with host-specific things they don't need.

Laura Fowler 2:32 PM the magic sauce is the meta file.

Jimy Dudhia 2:37 PM Yes, it is a way to avoid duplicate files

Date: Wed 2 Aug 2023

Attendees: Ligia, Lulin, Dustin, Jim, Laura, Pedro, Weiwei, Lisa, Kate, Joe, Jesse, Matus, Mike K., Fanglin, Grant, Jimy

Today's topic: Tiedtke convection scheme consolidation and 3D physics recap (https://docs.google.com/presentation/d/1yTEWbAwGEcGXTuOEGA1wBzUTVeQyn2cKnd55AzDpj2k/edit#slide=id.p1)

Ligia opened the meeting and used the Tiedtke situation as an example for future guidance in code management. Fanglin clarified that the version contributed by Chunxi Zhang has been tested and used in the UFS. Jim, Matus, and Jimy confirmed that the current WRF Tiedtke by Chunxi originated from the ECMWF code and tech note. Laura pointed out that there are two Tiedtke schemes in the current MMM physics repo.

Fanglin: EMC is interested in testing it in HAFS and reached out to ECMWF for permission to use it.

Discussion:

Matus: NRL has been using the new Tiedtke and is happy with it. It is not easy to put it in MPAS from the NRL side.

Laura: MMM spent efforts in making the scheme CCPP-compliant. Issues related to units should be taken care of by the host models. Should have just one repo.

Jim: Wait for Wei's return to review and consolidate the code.

Fanglin: OK to wait for Wei's review. EMC has its own version for testing now. Scale-aware part contributed by Wei needs some thoughts.

Jimy: Significant differences between the NRL and MMM versions. Two types of differences: the dycore-dependent part and TTDs.

Ligia: Interested in having a dycore independent version.

Fanglin will reach out to Chunxi about the origin of the code. ECMWF has not replied to EMC's request yet.

Jimy presents the recap of 3D physics and CCPP related issues.

Jim mentioned that the stochastic part of the new physics includes some 3D aspects. The trend toward high resolution may require more attention to this problem.

Ligia and Jimy discussed implementation outside of the CCPP. Grant raised the question of what to include in the CCPP.

Meeting on August 16?

Chat:

Jimy Dudhia2:14 PM Correct

Fanglin Yang - NOAA Federal2:15 PM ntiedtke is now used by Central Weather Bureau of Taiwan for operation

Jimy Dudhia2:18 PM CWB is using WRF

Jimy Dudhia2:19 PM more accurate to say repo code was added to MPAS by Laura

James Doyle2:23 PM when will Wei be back from vacation?

Jimy Dudhia2:23 PM mid-Sept, she just left

Jimy Dudhia2:27 PM The other factor here is that the NRL version has changes to the MPAS version that we need to study more carefully

James Doyle2:28 PM I agree Jimy - we need to make sure those changes are OK. Best to wait and have Wei review it.

Jimy Dudhia2:29 PM she had a couple of days to look before she left, and I got some comments from her

Dustin Swales - NOAA Federal2:31 PM We have a way to handle this problem in the future, via ccpp-physics importing the MMM shared physics repository. For this development we are going to do it the "old way"

Sorry, not "old way" "hard way"

Grant Firl - NOAA Affiliate2:32 PM I see that the CCPP/NRL version is using physcons. That is host-specific and should be fixed.

Jimy Dudhia2:32 PM We need Wei to elaborate on what she told me

Jimy Dudhia2:33 PM It is possibly copied from a 15 year old version

Weiwei Li2:36 PM If the PR is merged but MPAS (say Wei Wang) updates the code in some foreseeable future (e.g., in their next public release), who would be responsible to update this code?

Jimy Dudhia2:36 PM I think it is closer to the MPAS version but has some original pieces too

Ligia Bernardet - NOAA Federal2:38 PM https://docs.google.com/presentation/d/1yTEWbAwGEcGXTuOEGA1wBzUTVeQyn2cKnd55AzDpj2k/edit#slide=id.p4

Dustin Swales - NOAA Federal2:55 PM I asked Lisa if she would have to redo the CA if there was a new dycore and she sighed

Fanglin Yang - NOAA Federal2:56 PM See https://journals.ametsoc.org/view/journals/mwre/139/11/mwr-d-10-05091.1.xml for the origin of Tiedtke in ARW-WRF (v3.2.1). It was copied over from iRAMP model.

Jimy Dudhia2:58 PM Yes, a generic operator

Ligia Bernardet - NOAA Federal2:58 PM We will continue the discussion about 3D Physics during the CCPP Visioning Wkshp in 2 weeks. Woo-hoo, it is coming up!

Ligia Bernardet - NOAA Federal2:59 PM Matus, thanks for coming!

Date: Wed 5 Jul 2023

Attendees: Ligia, Lulin, Dustin, Grant, Julio, Jimy, Laura, Pedro, Lisa, Kate, Joe, Jesse, Matt, Mike K.,

Today's topic: 3D physics in CCPP physics

Grant opened up with CCPP visioning workshop survey results.

Joe: Using tendencies from columns in the neighbouring cells for better simulation. For PBL at fine resolution, horizontal stress terms are non-negligible relative to the vertical term. These should be included in the model but are not necessarily a component of CCPP.

Lisa: Not sure if the 3D perspective is a consideration for the physics or the dycore. Doesn't know how to answer the survey question. Ligia responds:

Fanglin: Comment on time-split physics and computing tendency terms on different grids. Jimy agreed with Fanglin's concern and thinks the calculation of tendencies should be taken care of by the dycore when irregular grids are used.

Pedro: Adam, Jimy and Joe chimed in on the topic of horizontal diffusion between physics and dycore.

Sho: Jimy and Fanglin discuss where the gradient terms should be calculated.

Joe: Question on terms calculated over the terrain.

Chat: Grant Firl - NOAA Affiliate2:11 PM Pie chart of survey responses for what "3D physics" means to them: https://drive.google.com/file/d/1nAIxTI7xlgq7uEneq0wjybUXf_1IrY05/view?usp=sharing

Jimy Dudhia2:17 PM even more complex for MPAS grid

Dustin Swales - NOAA Federal2:18 PM Great points Fanglin. Lisa, convection is a time-split process. No?

Lisa Bengtsson - NOAA Federal2:20 PM Dustin, I'm not sure what you mean?

Grant Firl - NOAA Affiliate2:21 PM Convection is treated as time-split (as opposed to process-split) in the UFS.

Fanglin Yang - NOAA Federal2:22 PM Dustin, this schematic may help answer the question https://docs.google.com/presentation/d/1xJVq1AHLdi7Q8tk7t2LSS_uwTQWldHCbpQHZOYoMTE8/edit#slide=id.p

Ligia Bernardet - NOAA Federal2:25 PM @Dustin, can you summarize what you heard from Fanglin, I just want to make sure we capture this correctly

Dustin Swales - NOAA Federal2:25 PM The state passed to convection is not the same as the state passed to the physics. So for time-split schemes, like convection, any 3D derived fields computed between the dycore and physics are computed with a different state than in the schemes themselves?

Dustin Swales - NOAA Federal2:26 PM time-split process implies that it is using the "internal physics state", not the state from the dycore

Fanglin Yang - NOAA Federal2:26 PM correct

Lisa Bengtsson - NOAA Federal2:27 PM ah, I see, yes.

Ligia Bernardet - NOAA Federal2:27 PM Tks, that is what I tried to ask. Whether the horiz derivatives that a scheme needs can be computed by the dycore. For time split situations, the dycore will not have that state.

Lisa Bengtsson - NOAA Federal2:27 PM For the CA and cold pool work, we are not using tendencies, but rather moving "objects" in 2D.

Fanglin Yang - NOAA Federal2:28 PM @Ligia another point is that the grid configurations between physics and dycore are also different

Ligia Bernardet - NOAA Federal2:28 PM Unless we insert more calls to dycore between physics calls. @Fanglin Hmmm Something else to take into account

Sho Yokota - NOAA Affiliate2:29 PM https://docs.google.com/presentation/d/1CdxQSF20cUP-uLZ0MjqOMnnftcaFFQYh/edit?usp=sharing&ouid=112129678525705993946&rtpof=true&sd=true

Jimy Dudhia2:39 PM I think it depends on the slope of the coordinate

Fanglin Yang - NOAA Federal2:41 PM Jimy, yes, but the slope varies with height and depends on the coordinate definition

Jimy Dudhia2:42 PM I think it is to avoid strong slopes - we see problems in horizontal diffusion unless we restrict it in those cases. Useful info - thanks

Lisa Bengtsson - NOAA Federal2:49 PM @fanglin, in the CA scheme we compute advection and divergence terms on the A-grid given input from the physics at the previous time-step.

Fanglin Yang - NOAA Federal2:50 PM @Lisa thx for the clarification

Ligia Bernardet - NOAA Federal2:54 PM So in GD3 various columns need to be solved simultaneously because they affect each other in the timestep?

Lisa Bengtsson - NOAA Federal2:55 PM For operations it will probably be too expensive to not parallelize physics.

Grant Firl - NOAA Affiliate2:56 PM Does anybody know if the benefits of time-splitting are greater than the benefits of having 3D physics with dycore-calculated horizontal gradients?

Jimy Dudhia2:56 PM In GD3 they get summed after the column physics

Adam Herrington2:56 PM good question grant! time-splitting is important for long physics time-steps

Ligia Bernardet - NOAA Federal2:56 PM @Jimy OK, if they get summed AFTER, then it can be done outside of physics

Jimy Dudhia2:56 PM It might just be the K coefficients you need

Grant Firl - NOAA Affiliate2:57 PM We'll also be talking about this in the CCPP Visioning Workshop! We should save some discussion for then... LOL

James Doyle2:57 PM Great discussion everyone and very interesting - Thanks!

Dustin Swales - NOAA Federal2:58 PM Good stuff!

Fanglin Yang - NOAA Federal2:58 PM @Grant time splitting might be important for maintaining model stability as well

James Doyle2:58 PM Agree Fanglin with that point about stability

Date: Wed 7 Jun 2023

Attendees: Ligia, Lulin, Dustin, Grant, Julio, Jimy, Laura, Pedro, Lisa, Kate, Joe, Jesse, Matt, Mike K.,

Today's topic: going through the CCPP visioning workshop survey contents.

Part 1: CCPP overview (must have).

Part 2: CCPP code management. Collect email addresses as an option if participants want to help or engage in organizing breakout groups.

Part 3: New CCPP Functionality Requests (to be supported in the future). This section is optional. Change the language to be less CCPP-specific. May want to add one option on using the same constants throughout the physics, with standard names for constants as a solution. GPU framework and physics support may be combined.

Part 4: Preparing for Physics of the Future.

Part 5: Demographics.

Chat: You2:05 PM Sorry for joining the meeting late

Michael Kavulich2:18 PM An "I don't know what this is" option would be better than making responses optional

Jimy Dudhia2:18 PM Don't Know column could also go from don't know to very interested

Jimy Dudhia2:21 PM This question is optional?

Lisa Bengtsson - NOAA Federal2:41 PM Perhaps it can be combined with the topic of lookup tables or saturation functions?

Ligia Bernardet - NOAA Federal2:42 PM Yes, it is interesting to connect constants (and parametric constants) with lookup tables or saturation functions. And consistency in general.

Dustin Swales - NOAA Federal2:47 PM We have a GPU-compliant scheme now, but no way to exercise this functionality

Date: Wed 24 May 2023

Attendees: Ligia, Lulin, Dustin, Grant, Julio, Jeremy, Domingo, Tim, Weiwei, Fanglin, Jimy, Jim, Andy, Jeremy Gibbs, Lisa, Kate, Joe, Jesse, Matt, Mike K.,

Today's topic: GPU!!!

Talk by Jeremy and Domingo on the NCAR FastEddy model: a GPU-resident LES model. It needs more physics to become a full weather model; potential connections to CCPP are there. Jeremy: CCPP compliance relies on the Fortran standard. How much flexibility exists for CCPP to handle different languages? Grant: The framework is written in Python and can autogenerate code in any language, but the standards were set up, with limited resources, mostly for Fortran.

Talk by Isidora: CCPP-compliant physics initiative by the GSL scientific computing branch.

Talk by Grant: For CCPP to choose which physics schemes should be run on CPU or GPU. GSL-funded efforts.

Chat:

Julio Bacmeister 2:04 PM Hi Fanglin, Nice to see you

Fanglin Yang - NOAA Federal 2:04 PM yes, same here

Timothy Sliwinski - NOAA Affiliate 2:11 PM We're getting feedback from the OWL mic, I think. Can you mute yourself, Ligia? Thanks. Much better.

Julio Bacmeister 2:14 PM Any possibility for terrain elevation?

You 2:14 PM Yes, Julio. You will see the examples soon.

Fanglin Yang - NOAA Federal 2:16 PM or RRTMGP?

Jimy Dudhia 2:18 PM our rule for WRF is one minute per km grid size for radiation frequency

Jesse Nusbaumer 2:26 PM Does the CCPP assume a host model language? It seems (?) like the host model language will impact the CCPP interface as much as the physics parameterization language

Grant Firl - NOAA Affiliate 2:36 PM Another point RE: data locality is that, at least now, we cannot assume GPU-residence for the entire model when using the CCPP -- related to your separation of concerns bullet. We must be able to support all combinations of components on heterogeneous architectures, so data locality is a HUGE concern for us.

Grant Firl - NOAA Affiliate 2:49 PM A GPU-ized tridiagonal solver could be directly applicable to other PBL schemes used in the GFS, so that would be nice

Fanglin Yang - NOAA Federal 2:49 PM @Grant this is great to know

Jeremy Sauer 2:51 PM Really good point Grant. I think we'd be interested to contribute and help CCPP in addressing new complexity aspects like heterogeneous hardware, mixed languages, performance optimizations, memory layout, etc.

Jeremy Sauer 2:53 PM Re: i or k index in "Do loops": GPUs shouldn't be using loops. That said, memory layout does matter since coalesced memory access is still extremely important.

Fanglin Yang - NOAA Federal 2:54 PM It appears the scaling with model vertical layers is excellent. Science wise this has the potential to allow us to run the model with much higher vertical resolution to better resolve wave vertical propagations. Ideally we should be running GFS with 200 layers (up to 80km) to improve the simulation of the QBO

James Doyle 2:56 PM I have to drop off now. Thanks for the presentations. Are the slides available anywhere?

Jesse Nusbaumer 2:58 PM @Grant we at NCAR are about to have a new SE that is supposed to work on GPUs + CCPP on our end. It would be great if we could get him involved in your effort as well with the framework, even if it just means sitting in on meetings if possible

Grant Firl - NOAA Affiliate 2:59 PM Absolutely, Jesse. We should be pulling on the same rope as much as possible. We haven't started meetings yet for our project (just received funding), but when we do, we should talk about getting you all involved.

Jesse Nusbaumer 2:59 PM Yeah, that would be great, thanks!

Grant Firl - NOAA Affiliate 3:00 PM @Fanglin, I love the idea of more layers as a PBL person!

Date: Wed 10 May 2023

Attendees: Ligia, Lulin, Dustin, Grant, Julio, John, Alex, Andy, Jordan, Jesse, Kate, Qingfu, Weiwei, Jimy, Jim, Mike Ek

Today's topic: the need for stateless schemes. John: The need is rooted in the design of NEPTUNE, which runs multiple instances of the physics at the same time in its ensemble and DA system. Changes are needed at the CCPP governance level to include such requirements.
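
To make the "stateless" requirement concrete, here is a minimal, purely illustrative sketch (all names hypothetical): a scheme that keeps persistent data in module-level SAVE variables cannot safely run as multiple concurrent instances, whereas a stateless version has the host own that data and pass the per-instance copy in on every call.

```fortran
! Illustrative sketch only (hypothetical names): why module-level SAVE data
! breaks multi-instance use, and one way to keep a scheme stateless.

! NOT stateless: every physics instance (e.g. each concurrent ensemble/DA copy)
! would share and overwrite this module variable.
module bad_scheme_state
   implicit none
   real, save :: stored_flux(1000)
end module bad_scheme_state

! Stateless alternative: the host owns the persistent data and passes the
! per-instance copy in on every call (the framework's "instance" counter,
! mentioned in the chat below, lets the host keep one copy per model instance).
subroutine stateless_scheme_run(im, stored_flux, errmsg, errflg)
   implicit none
   integer,          intent(in)    :: im
   real,             intent(inout) :: stored_flux(im)  ! host-allocated, per instance
   character(len=*), intent(out)   :: errmsg
   integer,          intent(out)   :: errflg
   errmsg = ''
   errflg = 0
   stored_flux = 0.9 * stored_flux   ! placeholder update; touches only this instance's memory
end subroutine stateless_scheme_run
```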

Chat:

Ligia Bernardet - NOAA Federal 2:02 PM https://docs.google.com/presentation/d/1rI1R98UvZNARF5qiuOYQpITRI5mz2PSwaAAwns-ypZc/edit#slide=id.g1f45bbe2bd0_0_0

Grant Firl - NOAA Affiliate 2:04 PM last bullet should be "GPU-compliant"

Jimy Dudhia 2:08 PM also LSM difference?

Dustin Swales 2:09 PM I believe both use RUC

Jimy Dudhia 2:09 PM OK, RRFS uses NoahMP

Ligia Bernardet - NOAA Federal 2:10 PM Yes, RRFS uses NoahMP

Julio Bacmeister 2:12 PM I apologize, but the term "stateless physics" is unfamiliar to me

Dustin Swales 2:12 PM me too

Ligia Bernardet - NOAA Federal 2:15 PM Here is the PR John is talking about: https://github.com/NCAR/ccpp-physics/pull/1000

Julio Bacmeister 2:15 PM so if the state isn't updated by the physics it is stateless????

Grant Firl - NOAA Affiliate 2:17 PM Also https://github.com/NCAR/ccpp-framework/pull/463 which adds the "instance" variable

Jimy Dudhia 2:20 PM definitely need an example just to help visualize this

Dustin Swales 2:20 PM slides are coming... Stateless schemes do not rely on shared memory

Grant Firl - NOAA Affiliate 2:50 PM Thanks for the nice discussion, John and Dustin!

Lisa Bengtsson - NOAA Federal 2:50 PM Could you share the slides? thank you

Ligia Bernardet - NOAA Federal 2:51 PM https://docs.google.com/presentation/d/1rI1R98UvZNARF5qiuOYQpITRI5mz2PSwaAAwns-ypZc/edit#slide=id.g1f45bbe2bd0_0_0

Date: Wed 12 April 2023

Attendees: Ligia, Lulin, Dustin, Grant, Julio, Joe, Jesse, Kate, Qingfu, Andrew, Adam, Lisa, Jim, Mike Ek

Today's topic: recap of CCPP-physics code management goals and plans. Ligia went through some of the original plans.

What is the most desirable way of managing physics? Put common physics (e.g., YSU PBL) in a single repo? Put all physics in authoritative repo? Use submodules to combine the repos?

Code Management for CCPP Physics Authoritative Repo Goal: Can this process (as is or with changes) help us move toward a joint repo that can be used by the various modeling groups?

CCPP team is responsible for RTs for all models.

Discussions: Julio volunteered Jesse to respond: Agrees with the concept of centralized code management for CCPP physics, but the details of getting to the goal are not clear. Doesn't want to waste CCPP PR reviewers' time reviewing functions that may not be used by the UFS.

Julio: Questions on the convention for CCPP physics scheme files. How to organize the scheme structure.

Dustin: Organization of multiple versions of the same scheme by different host models.

Lulin: Group versions of a scheme that share the same origin but have variants for different hosts; a common version should be available for use across hosts.

Jim: Has similar comments and suggests defining a threshold for consolidating differences among versions of the physics. Depending on the host, the same physics can differ in the variables being passed in and used. Therefore, it may not be realistic to have common physics code for all hosts.

Ligia: There are ways to deal with the scenario just being discussed.

Adam: Comment on forks. Model X points to a model-specific fork. Would a branch structure be enough to satisfy the need? Ligia: NOAA's interests require separating code management from branches.

Jesse: On the CGD end, the worry is about syncing code back to the CCPP repo.

Dustin: Use the existing CCPP authoritative physics instead of converting everything from the CESM/CAM side.

Jesse and Adam: May refer to the CCPP authoritative physics to identify existing versions and work on those CCPP-compliant versions to update the code.

Dustin: Physics schemes differ in how interoperable they are by nature. Some are easy and some are impossible to rewrite as CCPP-interoperable versions.

Jesse: Some good suggestions on the practical side of code management and RT activities.

SIMA topic: single code base repo for applications.

Chat:

Ligia Bernardet - NOAA Federal2:04 PM

https://docs.google.com/presentation/d/14PSIpx5enC59H5nse93QAzOsXVAmCaYsN4hVl4fScjc/edit#slide=id.p1

Dustin Swales2:35 PM

You don't need to have a fork, you can use the mothership

NOAA wants a fork

Grant Firl - NOAA Affiliate2:53 PM

That's how GFS physics started in the CCPP. -- the "nuclear" option

Date: Wed 29 March 2023

Attendees: Dustin, Ligia, Laura, Joe, Jesse, Jimy, Qingfu, Pedro, Jordan, Andrew, Jim Doyle

Today's topic recap and discussion from last week's "Incorporating Physics From NCAR models in the CCPP". Presentation by Dustin: https://docs.google.com/presentation/d/1rI1R98UvZNARF5qiuOYQpITRI5mz2PSwaAAwns-ypZc/edit#slide=id.g1f45bbe2bd0_0_0

Laura said that in the long term MPAS will probably adopt the CCPP Framework. WRF will probably never adopt the Framework.

Questions?

  • Do you agree with these statements about a scheme that is used by both CCPP and non-CCPP hosts:
    • The scheme should be imported into the CCPP as git submodule
    • There should be a CCPP-compliant wrapper to call the scheme

Laura and Jimy: Do not agree. A scheme can be used both in hosts that use the CCPP Framework and in hosts that do not, without a wrapper. If a wrapper is not used, then there needs to be a conversion between host <-> scheme in some way, such as a pre- and post- for the scheme. What are the pros/cons of having a wrapper vs. pre- and post-? Wrappers keep the SDF more concise.

Pre-, post- and wrappers would be host-dependent, while physics is host agnostic.

Wrappers make the CCPP more like "the old way", like a driver.

Wrappers avoid having too many standard names (e.g., for chemistry).

In order not to use a wrapper, you need to modify the external scheme.
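
A minimal, purely illustrative sketch of the two patterns under discussion (the scheme names and the unit conversion are invented): a wrapper hides the conversions and the external scheme behind one SDF entry, while the pre/post approach exposes the same pieces as separate schemes in the SDF.

```fortran
! Hypothetical sketch (scheme names invented) contrasting the two patterns
! for calling an external, non-CCPP-ized scheme.
module someparam_wrapper
   implicit none
contains
   ! (a) Wrapper: one CCPP entry point hides the host<->scheme conversions and
   !     the scheme call, so the SDF lists a single scheme; the wrapper itself
   !     is host-aware and sits "below" the SDF.
   subroutine someparam_wrapper_run(im, t_host, errmsg, errflg)
      integer,          intent(in)    :: im
      real,             intent(inout) :: t_host(im)   ! temperature in host units (K)
      character(len=*), intent(out)   :: errmsg
      integer,          intent(out)   :: errflg
      real :: t_scheme(im)
      errmsg = ''
      errflg = 0
      t_scheme = t_host - 273.15            ! "pre" step: host units -> scheme units (deg C)
      call external_someparam(im, t_scheme) ! external scheme left untouched
      t_host = t_scheme + 273.15            ! "post" step: scheme units -> host units
   end subroutine someparam_wrapper_run

   subroutine external_someparam(im, t)     ! stand-in for the external, non-CCPP scheme
      integer, intent(in)    :: im
      real,    intent(inout) :: t(im)
      t = t + 0.0                           ! placeholder physics
   end subroutine external_someparam
end module someparam_wrapper

! (b) Pre/post interstitials: the same two conversions become separate
!     CCPP-compliant schemes (someparam_pre, someparam_post) listed in the SDF
!     around the scheme itself, keeping the scheme host-agnostic and visible at
!     the SDF level, at the cost of a longer SDF.
```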

*** Where should the wrapper be stored? In the submodule or in ccpp-physics?
  • When multiple hosts use a given scheme and interstitial code is used to connect the host to the scheme, should…
    • There be one wrapper for each host?
    • There be one wrapper for all hosts, with if statements inside as needed?

Pre- and post- are preferred as they make the SDF more exposed.

Shared repo has CCPP-compliant schemes. SCM can have pre- and post-files.

If a developer has a non-CCPP-ized scheme and we want to bring it into the CCPP, where would the meta file live?

Laura: keep the param as is. The host comes with the pre- and post- to use it, instead of the opposite, in which the param changes to become a wrapper to fit the SCM.

Folks like Dom, Steve may have an opinion? Could be discussed at CCPP visioning workshop.

Chat:

You 2:04 PM Slides from 2 weeks ago: https://docs.google.com/presentation/d/1BGPWMG0pigGNtPc0eM_MYNTMzJOKktyZNapHJ-FjKvg/edit#slide=id.g1f45bbe2bd0_0_0

You 2:06 PM Today's slides: https://docs.google.com/presentation/d/1rI1R98UvZNARF5qiuOYQpITRI5mz2PSwaAAwns-ypZc/edit#slide=id.g1f45bbe2bd0_0_0

Laura Fowler 2:23 PM I agree with Jimy.

Jesse Nusbaumer 2:24 PM Yeah, our plan with CAM was to also try and implement Jimy's strategy, of having multiple interstitials in the SDF around the "core" physics scheme, with a few key exceptions (like chemistry)

Jimy Dudhia 2:25 PM It is why we produced a meta file

Jimy Dudhia 2:29 PM pre and post would be host dependent which allows the physics to be agnostic

Jimy Dudhia 2:31 PM yes, that is the question,

Laura Fowler 2:34 PM It seems to me that using a wrapper is the same as doing it the same old fashion way.

Jordan Powers 2:35 PM Laura: Like a driver?

Laura Fowler 2:36 PM yes

Jordan Powers 2:37 PM Roger.

Jimy Dudhia 2:38 PM interstitials keep the scheme at the SDF level, and wrappers hide it below

  • How can we make sure that there is only one code base for each scheme (e.g., YSU PBL) in the CCPP?

Date: Wed 15 March 2023

Attendees: Qingfu Liu, Dustin Swales, Jesse Nusbaumer, Julio Bacmeister, Grant Firl, Lisa Bengtsson, Joe Olson, Ligia Bernardet, Lulin Xue, Jimy Dudhia, Matthew Dawson, Li Zhang, Andy Hazelton, Fanglin Yang, Pedro Jimenez, | Alex Reinecke, Jeff Mcqueen, Jeremy Gibbs, Jim Doyle, Laura Fowler, Mike Ek, Adam Herrington,

Today's topic: "Incorporating Physics From NCAR models in the CCPP" led by Dustin and Grant: https://docs.google.com/presentation/d/1BGPWMG0pigGNtPc0eM_MYNTMzJOKktyZNapHJ-FjKvg/edit#slide=id.g1f45bbe2bd0_0_0

Goals:
  • Provide an overview of CCPP interoperability concepts (to date).
  • Provide an overview of recent efforts to implement NCAR CGD and MMM physics schemes in the CCPP.

CCPP interoperability concepts: 

COSP example: CAM and CCPP-physics implementations.

MMM Physics in the CCPP: 

Questions and comments: Q from Fanglin: what are the outputs of COSP? Just brightness temperature? or others? A from Dustin: A whole suite of diagnostics are provided by COSP.

Q from Fanglin: Exception of WRF physics. A from Ligia: Wrapper over schemes that have been distributed outside of CCPP

Q from Julio: What are the plans for I/O inside CCPP? A from Grant: It is a general rule that I/O is not handled by CCPP physics. A from Dustin: Need interstitial codes to pass variables to I/O handling.

Q from Jesse: What is the lowest level of functionality inside a scheme? How do we define the level of functionality? Should it be left to developers to define? Where should we put the schemes? A from Dustin and Grant: Layers of wrappers may show up in a scheme over its history. The lowest level should be the original version of the scheme to work on. C from Jesse: C from Ligia: There is no clear-cut definition. The goal should be to put all CCPP physics in a central place. It is not necessary to have just one repo for all schemes. Submodules for some physics should be OK as long as parties collaborate on code management.

Q from Lisa: If it is not in ccpp/physics, then what is the role of ccpp? The host could just point to the repository where the scheme sits?

Save for next meeting.

Chat Notes:

You2:01 PM https://github.com/NCAR/ccpp-physics/wiki/Minutes-for-2023-meetings

Joseph Olson - NOAA Federal2:02 PM Welcome Master Qingfu!

Dustin Swales2:02 PM Welcome!

Lisa Bengtsson - NOAA Federal2:03 PM Welcome Qingfu! Where did Chunxi go?

Kate Zhang - NOAA Affiliate2:03 PM Welcome, Qingfu!

Fanglin Yang - NOAA Federal2:03 PM @Lisa, Chunxi joined a private company

Lisa Bengtsson - NOAA Federal2:04 PM Best of luck to him, sorry to see him go

Fanglin Yang - NOAA Federal2:04 PM still in the weather/climate community

Lisa Bengtsson - NOAA Federal2:04 PM nice

You2:06 PM https://docs.google.com/presentation/d/1BGPWMG0pigGNtPc0eM_MYNTMzJOKktyZNapHJ-FjKvg/edit#slide=id.g1f45bbe2bd0_0_0

Fanglin Yang - NOAA Federal2:51 PM Are the cloud and gas optics data read by RTE+RRTMGP or by the host in CCPP?

Dustin Swales2:54 PM Read during the init phases

Jimy Dudhia2:54 PM Examples of things staying outside are NoahMP and RRTMGP

Lisa Bengtsson - NOAA Federal2:55 PM If it is not in ccpp/physics, then what is the role of ccpp? The host could just point to the repository where the scheme sits?

Date: Wed 15 February 2023

Attendees: Jesse Nusbaumer, Julio Bacmeister, Grant Firl, Jim Doyle, Chunxi Zhang, Laura Fowler, Ligia Bernardet, Dustin Swales, Lulin Xue, Mike Ek, Jimy Dudhia, Matthew Dawson, Li Zhang, Andy Hazelton, Joe Olson, Adam Herrington, Fanglin Yang, | Pedro Jimenez, Alex Reinecke, Jeff Mcqueen, Jeremy Gibbs, Lisa Bengtsson,

Today's topic: coordination among different parties on CCPP development and application across model ecosystem. Julio, Jesse, and Adam: CGD reps of SIMA and CCPP efforts. Will present plan related to CCPP today.

Julio's presentation: Current CAM physics loop. In the physics loop, the state is constantly updated/adjusted by the physics sequentially. The physics subroutines only update the tendencies; the physics loop (tendency accumulator/state updater) updates both the state and the accumulated tendencies. These functions are outside of the physics parameterizations (answering Grant's question). Ligia: the communication between dynamics and physics is only through tendencies. The state updated within the step is a "virtual state" (Adam). Chunxi asked how dynamics provides state variables to physics (the UFS strategy differs from CAM's). Subcycling in CLUBB/microphysics: 300 s time step for CLUBB; every 6 CLUBB calls invoke 3 microphysics calls. The order of calls to physics does matter! Future plans: CAM4 and CAM5 physics CCPP-ization proposed to NSF CSSI. Laura's question on chemistry CCPP-ization.
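
For readers unfamiliar with the pattern described above, here is a rough sketch (hypothetical names, not actual CAM code) of a time-split physics loop in which schemes return only tendencies while the loop updates the state and accumulates the total tendency returned to dynamics:

```fortran
! Rough sketch of the update pattern described above (all names hypothetical):
! each parameterization only returns tendencies; the physics loop applies them
! to the (virtual) state and accumulates them, so dynamics only ever sees the
! total tendency.
subroutine physics_loop_step(im, km, nschemes, dt, state, tend_total)
   implicit none
   integer, intent(in)    :: im, km, nschemes
   real,    intent(in)    :: dt
   real,    intent(inout) :: state(im,km)        ! e.g. temperature
   real,    intent(out)   :: tend_total(im,km)   ! accumulated tendency for dynamics
   real    :: tend(im,km)
   integer :: n

   tend_total = 0.0
   do n = 1, nschemes
      call scheme_tendency(n, im, km, state, tend)  ! scheme sees the updated state (time-split)
      state      = state + dt * tend                ! state updater
      tend_total = tend_total + tend                ! tendency accumulator
   end do
contains
   subroutine scheme_tendency(n, im, km, state, tend)
      integer, intent(in)  :: n, im, km
      real,    intent(in)  :: state(im,km)
      real,    intent(out) :: tend(im,km)
      tend = 0.0   ! placeholder: a real scheme would compute a physical tendency
   end subroutine scheme_tendency
end subroutine physics_loop_step
```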

Jesse's presentation: CCPP implementation plan in CAM. SIMA and CAM physics. Many current physics source files are interface routines. CAMDEN is a new model infrastructure for CAM. CCPP implementation plan: for each physics scheme in CAM, the SEs will 1. save a snapshot of the model state before and after the CAM scheme, 2. create a metadata file for the scheme and pull out the source code to be CCPP-compliant, 3. add at least the "run" phase of the new CCPP scheme back into CAM and check b4b, 4. add the full CCPP scheme into CAMDEN, 5. test and validate. People and time: Courtney Peverley, Cheryl Craig, John Truesdale, Kate Thayer-Calder, Peter Lauritzen, and AMP scientists. Challenges: time/people, technical.

Questions and comments: Ligia: Curious when WRF physics will be part of the work? Jesse: test if the framework works as expected. Adam: it will depend on SIMA science solicitation proposal outcome for WRF physics to be available for CAM. Jordan: haven't seen WRF CCPP-ized physics being run with CCPP framework.

Ligia: b4b is a tough standard to meet. Laura: question to Jesse and Julio on whether CCPP-ized physics will change the order of physics calls. Julio: the long physics time step is the main issue rather than the dycore. Jimy: CAM physics for MPAS may have a different strategy. Julio: re-emphasized the time step length. Adam: surface flux and PBL need the same states.

Call for topic for next meeting! Please let us know what you want to discuss in the future!

Chat Notes:

Adam Herrington2:10 PM

slight detail -- cam_dev now emits the emissions where it's calculated

Jimy Dudhia2:48 PM

glad to hear about the SIMA proposals wanting to use WRF physics. We can be involved.

Fanglin Yang - NOAA Federal2:56 PM

same for GFS

Date: Wed 01 February 2023

Attendees: Grant Firl, Chunxi Zhang, Laura Fowler, Ligia Bernardet, Dustin Swales, Pedro Jimenez, Lulin Xue, Mike Ek, Jimy Dudhia, Lisa Bengtsson, Andy Hazelton, Joe Olson, | Alex Reinecke, Jeff Mcqueen, Jeremy Gibbs, Julio Bacmeister, Fanglin Yang, Matthew Dawson, Li Zhang

Today's topic: resuming "streamlining CCPP interstitial schemes" (https://docs.google.com/presentation/d/1KPGUAs2IpRrg2KxzvdfutYi8jNXbCOiDKiMXBnv_2rc/edit#slide=id.p1).

Grant: (1) Look more carefully and identify specific interstitial combination opportunities/strategies; (2) comb through existing interstitials, getting rid of hard-coded logic/constituents; (3) implement mandatory scheme tendency output for primary schemes and move tendency accumulation/application to interstitials, in preparation for the ccpp-framework taking this over.

SDF Reduction Opportunities: Scheme-specific, Scheme-generic, and Suite-level. Goals: Primary scheme interoperability, suite configurability, future maintainability, strict reproducibility.

From primary schemes only to maximally split, there are different levels of complexity in the SDF. We want to identify what level we expect.

SDF reduction opportunity 1 is the radiation schemes (RRTMG and RRTMGP), opportunity 2 is surface pre and post, and opportunity 3 is the GFS_suite_interstitial groups.

Method of cleaning the interstitial schemes: Mike Kavulich's script of tracking variables.

Discussion: If code is called in different places in the physics call-chain, it should be grouped into an interstitial scheme. If code is always called at the same place, it may be combined with the physics scheme. However, we should look at the pre and post interstitial schemes carefully so as not to impact the interoperability of CCPP. Jimy mentioned that some pre and post schemes are determined by host models rather than by physics schemes.
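
For reference, a hypothetical skeleton (names invented) of what a scheme-specific "pre" interstitial looks like as its own CCPP-compliant scheme; whether such code stays separate or is merged into the primary scheme follows the call-placement rule described above:

```fortran
! Hypothetical skeleton of a scheme-specific "pre" interstitial (names invented).
! In CCPP each such interstitial is its own compliant scheme: a module with a
! <name>_run subroutine (plus optional init/finalize phases), errmsg/errflg
! outputs, and a companion .meta file describing every argument by standard name.
module examplescheme_pre
   implicit none
   private
   public :: examplescheme_pre_run
contains
   subroutine examplescheme_pre_run(im, km, q_host, q_scheme, errmsg, errflg)
      integer,          intent(in)  :: im, km
      real,             intent(in)  :: q_host(im,km)    ! variable as the host carries it
      real,             intent(out) :: q_scheme(im,km)  ! variable as the scheme expects it
      character(len=*), intent(out) :: errmsg
      integer,          intent(out) :: errflg

      errmsg = ''
      errflg = 0

      ! Host-to-scheme conversion lives here, keeping the primary scheme free of
      ! host-specific logic. Because this code is only ever called immediately
      ! before the primary scheme, it is a candidate for being merged into that
      ! scheme; conversions called from several places in the call-chain stay
      ! as separate interstitials.
      q_scheme = max(q_host, 0.0)   ! illustrative clipping/conversion
   end subroutine examplescheme_pre_run
end module examplescheme_pre
```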

Chunxi: EMC wants to see the direction of simplifying and shortening SDF.

Dustin: Can the framework provide functionality to handle the way the physics is called? Grant: It is not trivial and requires a good amount of effort.

Lisa: Consolidate deep convection schemes since they share some common routines. Some generic interstitial schemes should be provided to handle this situation.

Chat Notes:

Jimy Dudhia2:39 PM

maybe I chose a bad example, but another one would be whether the host needs updates or tendencies in a post stage

Jimy Dudhia2:44 PM

some could also go into the suite's mp generic pre

Grant Firl - NOAA Affiliate2:49 PM

Dustin, that particular nut is harder to crack due to the potential need for the framework to allocate memory for intermediate states.

chunxi zhang - NOAA Affiliate2:49 PM

@Jimy Only scheme specific interstitials (e.g. mp_thompson_pre...) need to combine with the scheme. Other interstitials are still likely necessary.

Jimy Dudhia2:51 PM I think of interstitials as host-to-scheme conversions, so they would be host dependent. Some are scheme-to-scheme and internal to a suite.

Date: Wed 18 January 2023

Attendees: Julio Bacmeister, Grant Firl, Chunxi Zhang, Laura Fowler, Ligia Bernardet, Fanglin Yang, Dustin Swales, Pedro Jimenez, Joe Olson, Lisa Bengtsson, Matthew Dawson, Andy Hazelton, Lulin Xue, Mike Ek, Jimy Dudhia, Li Zhang | Alex Reinecke, Jeff Mcqueen, Jeremy Gibbs,

Lisa proposed a discussion on how to better use CCPP suites.

Ligia introduced the background: Individual CCPP physics packages can be used in any combination in a simulation; a suite consists of a combination of certain physics packages used by a host model through a suite definition file. Naming of suites needs some discussion here. The issue is that even when the suite and suite name stay the same, the code base has been continuously evolving. The results presented at a meeting may use code very different from the operational code. Different suite names should represent the actual code base to avoid confusion, or the tag should be mentioned in the presentation.

Changes happen between the initial naming of a suite and the final tag of the frozen code for operations.

Lisa: Clarification needed. The same hash, if old, may not work anymore now.

Use the same tag and hash across UFS and CCPP code base.

Julio: How does CESM atmospheric physics fit into the picture after CCPP is adopted? Should CESM contribute to the CCPP authoritative repo? What procedures are needed to contribute?

Ligia: Need another meeting to discuss Julio's questions.

Chunxi and Ligia: How long into the future do we need to support the code? The hash indicates the exact code being used. Old versions are still available for repeating experiments.

CCPP release and UFS release should be coordinated. In between releases, hard to track and coordinate. Name the suite in a better way.

Joe: Don't even consider taking care of user errors.

Lisa: There should be a UFS release that includes all the CCPP details, rather than checking out code from different repos.

Dustin: Codes are set up when they are under development. Other places release when codes are frozen.

Ligia: Change the experiment name to something other than the suite name. How often can we support new releases? How far back can we support old code?

Joe: If you want to spread the CCPP, it would be better to separate it from the UFS.

Chat Notes:

Jimy Dudhia2:28 PM

@Julio CCPP's repository is for others to access your physics. The minimal testing would involve running your suite through the CCPP framework in the SCM

Julio Bacmeister2:29 PM

Thanks Jimy

chunxi zhang - NOAA Affiliate 2:53 PM I believe people just randomly checked out the develop branch of the ufs-weather-model. The applications do have official tags in the ufs-weather-model repo.

Jimy Dudhia 2:56 PM sufficient to say this suite is in ccpp v6.0