The initial state of the mountain wave test case differs depending on whether the init_atmosphere_model program is run serially or in parallel. When the program is run serially, everything looks fine; after five hours, the result is as expected:
When the init_atmosphere_model program is run in parallel (mpiexec -np 4 ./init_atmosphere_model), however, the result looks as follows:
The plots were created with MPAS 8.2.2, compiled with gfortran and with OpenMP and OpenACC switched off. In both cases, the model itself was run in parallel.
I looked more closely at the initial state and wrote the following Python program to detect differences:
# This script compares the init files produced by the serial and parallel runs
# of init_atmosphere_model.
import netCDF4 as nc
import numpy as np

filename_serial = "mtn_wave_init_serial.nc"
filename_parallel = "mtn_wave_init_parallel.nc"

array_serial = {}
array_parallel = {}

# Read every variable from the serial init file.
ds = nc.Dataset(filename_serial, "r", format="NETCDF4")
var_names = list(ds.variables.keys())
for var in var_names:
    array_serial[var] = ds[var][:]
ds.close()

# Read the same variables from the parallel init file.
ds = nc.Dataset(filename_parallel, "r", format="NETCDF4")
for var in var_names:
    array_parallel[var] = ds[var][:]
ds.close()

# Compare the rounded minima and maxima of each variable.
for var in var_names:
    try:
        min_serial = np.round(np.nanmin(array_serial[var]), 3)
    except (TypeError, ValueError):
        # Skip non-numeric variables (e.g. character arrays).
        continue
    max_serial = np.round(np.max(array_serial[var]), 3)
    min_parallel = np.round(np.min(array_parallel[var]), 3)
    max_parallel = np.round(np.max(array_parallel[var]), 3)
    if min_serial != min_parallel or max_serial != max_parallel:
        print(f"deviation detected for variable: {var}, serial init min, max: {min_serial}, {max_serial}, "
              f"parallel init min, max: {min_parallel}, {max_parallel}")

print(f"Maximum absolute zgrid diff: {np.max(np.abs(array_serial['zgrid'] - array_parallel['zgrid']))}")
It gave the following results:
deviation detected for variable: zxu, serial init min, max: -0.157, 0.157, parallel init min, max: -0.166, 0.166
deviation detected for variable: zb, serial init min, max: -0.056, 0.057, parallel init min, max: -0.056, 0.055
deviation detected for variable: zb3, serial init min, max: -0.005, 0.005, parallel init min, max: -0.006, 0.006
deviation detected for variable: w, serial init min, max: -1.94, 1.934, parallel init min, max: -1.831, 1.842
deviation detected for variable: rho, serial init min, max: 0.077, 1.193, parallel init min, max: 0.077, 1.2
deviation detected for variable: rho_base, serial init min, max: 0.056, 1.195, parallel init min, max: 0.056, 1.202
deviation detected for variable: surface_pressure, serial init min, max: 1731.304, 1777.552, parallel init min, max: 1743.454, 1788.171
Maximum absolute zgrid diff: 250.0
As you can see, there are differences that cannot be explained by machine precision. Something seems to be wrong with the orography when the initialization is run in parallel.
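Since the largest discrepancy shows up in zgrid, a possible next step is to locate the cells where the serial and parallel zgrid disagree. Below is a minimal, untested sketch along those lines; it assumes zgrid is dimensioned (nCells, nVertLevelsP1) and that the init files optionally contain a terrain field named ter and a coordinate xCell, so the variable names may need adjusting for this idealized mesh.

# Sketch: locate the cells with the largest zgrid difference between the two init files.
# Assumes zgrid is nCells x nVertLevelsP1; "ter" and "xCell" are used only if present.
import netCDF4 as nc
import numpy as np

ds_serial = nc.Dataset("mtn_wave_init_serial.nc", "r")
ds_parallel = nc.Dataset("mtn_wave_init_parallel.nc", "r")

zgrid_serial = np.squeeze(ds_serial["zgrid"][:])
zgrid_parallel = np.squeeze(ds_parallel["zgrid"][:])

# Largest zgrid difference per cell, taken over all vertical levels.
diff_per_cell = np.max(np.abs(zgrid_serial - zgrid_parallel), axis=1)
worst_cells = np.argsort(diff_per_cell)[::-1][:10]

have_ter = "ter" in ds_serial.variables and "ter" in ds_parallel.variables
if have_ter:
    ter_serial = np.squeeze(ds_serial["ter"][:])
    ter_parallel = np.squeeze(ds_parallel["ter"][:])
have_x = "xCell" in ds_serial.variables
if have_x:
    x_cell = np.squeeze(ds_serial["xCell"][:])

print("cells with the largest zgrid difference:")
for cell in worst_cells:
    line = f"cell {cell}: max |dzgrid| = {diff_per_cell[cell]:.3f}"
    if have_x:
        line += f", xCell = {x_cell[cell]:.1f}"
    if have_ter:
        line += f", ter serial = {ter_serial[cell]:.3f}, ter parallel = {ter_parallel[cell]:.3f}"
    print(line)

ds_serial.close()
ds_parallel.close()

If the cells with the largest differences line up with the boundaries of the 4-way domain decomposition, that would further support the suspicion that the parallel initialization mishandles the orography.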