
Questions about "X = Decode(X̃) is evaluated against Y; the network tendencies Ψ(X̃) are added to Φ(X̃)" #73

Open
weatherforecasterwhai opened this issue May 12, 2024 · 0 comments

weatherforecasterwhai commented May 12, 2024

Hi,
The sentence just above Fig. C5 reads: "X = Decode(X̃) is evaluated against Y."
Does this step run during inference or during training?
The sentences before it seem to describe inference. If it is inference, why is X evaluated against Y?
Y is the ERA5 input state at the beginning; during inference there are no further ERA5 states.
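To make my question concrete, here is how I currently picture the comparison during training. This is a minimal sketch: `decode` and `training_loss` are placeholder names I made up for illustration, not the actual NeuralGCM API.

```python
import numpy as np

def decode(x_tilde):
    # Stand-in for Decode(X~): identity for illustration only.
    return x_tilde

def training_loss(x_tilde_rollout, y_targets):
    # My understanding: during training, each decoded state X = Decode(X~)
    # is compared against the matching ERA5 target Y via a per-step loss,
    # and the per-step losses are averaged over the rollout.
    losses = []
    for x_tilde, y in zip(x_tilde_rollout, y_targets):
        x = decode(x_tilde)
        losses.append(np.mean((x - y) ** 2))
    return float(np.mean(losses))

rollout = [np.ones((4, 4)), np.full((4, 4), 2.0)]
targets = [np.ones((4, 4)), np.full((4, 4), 2.0)]
print(training_loss(rollout, targets))  # -> 0.0 when predictions match targets
```

If this picture is right, the "evaluated against Y" step would only exist at training time, which is what I want to confirm.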

And according to Fig. 1 and Fig. C5, "The network tendencies Ψ(X̃) are added to Φ(X̃)".
The code in class DivCurlNeuralParameterization(hk.Module) is not easy to follow:

  1. prediction_mask = self.prediction_mask
     if prediction_mask is None:
         prediction_mask = pytree_utils.tree_map_over_nonscalars(
             lambda _: True, inputs, scalar_fn=lambda _: False
         )

     What is the prediction mask?
     The paper only mentions a land-sea mask.
     And "Deep-dive into trained models" mentions a non-zero mask:
     "Here care should be taken to ensure that structural sparsity is handled properly, because many of the entries in these arrays are always fixed at zero." Why does NeuralGCM need such a mask?

  2. nodal_inputs = self.modal_to_nodal_features_fn(
         inputs, memory=memory, diagnostics=diagnostics,
         randomness=randomness, forcing=forcing,
     )

     Where does this "memory" come from, and memory of what? No other code mentions memory.
     Since the curl_and_div_tendencies method from dinosaur.MoistPrimitiveEquations does not "add" any other tendencies,
     could this "memory" contain the dynamical tendencies, so that the learned physical tendencies are "added" to this memory of dynamical tendencies?
     If not, where in the code are these two tendencies "added" together?
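To pin down question 1, here is my reading of what the default prediction_mask construction produces: an all-True mask over the array-valued fields of the state, with False for scalar leaves (such as a simulation-time scalar). This sketch uses a plain dict instead of the real pytree_utils helper, and the field names are made up.

```python
import numpy as np

def default_prediction_mask(inputs):
    # Mimics tree_map_over_nonscalars(lambda _: True, inputs,
    # scalar_fn=lambda _: False) for a flat dict of fields:
    # array leaves (fields the network predicts) map to True,
    # scalar leaves map to False.
    return {key: np.ndim(value) > 0 for key, value in inputs.items()}

# Hypothetical state with one array field and one scalar field.
state = {"vorticity": np.zeros((3, 3)), "sim_time": 0.0}
mask = default_prediction_mask(state)
print(mask)  # -> {'vorticity': True, 'sim_time': False}
```

Is that the intent, i.e. the prediction mask just selects which pytree leaves receive learned tendencies, independent of the land-sea and non-zero masks?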
