
CoastalApp compilation on Cheyenne fails #156

Open
pvelissariou1 opened this issue Jan 30, 2023 · 98 comments
@pvelissariou1
Collaborator

Ufuk Turuncoglu
Jan 27, 2023, 1:16 PM (3 days ago)
to Saeed, me, Daniel

Hi All,

I am following steps in

https://github.com/noaa-ocs-modeling/CoastalApp-testsuite

to run the tests. I am getting the following error when I try to compile the model on Cheyenne, even though there seem to be module files under the modulefiles/ directory:

./build.sh --compiler intel --platform cheyenne --component "atmesh pahm adcirc ww3" -y
envmodules_intel.cheyenne :: This environment is either not configured or not supported
Exiting ...

Anyway, I just want to let you know. If you have no idea why it is not working, I could try to fix it, but I wanted to ask you first. I also tried with gnu and it does not work either. Any idea? I could also try on Orion if you want.

Regards,

—ufuk

Ufuk Turuncoglu
Jan 27, 2023, 1:17 PM (3 days ago)
to Saeed, me, Daniel

Do I also need to follow the steps in compile-NEMS.x.txt (under the CoastalApp directory)?

—ufuk

Panagiotis Velissariou - NOAA Affiliate [email protected]
Jan 27, 2023, 1:29 PM (3 days ago)
to Ufuk, Saeed, Daniel

Ufuk hi,

I haven't configured CoastalApp for "cheyenne" because I don't have access to this platform.
What you can do is go into CoastalApp/modulefiles, copy the *hera modulefiles to *cheyenne and modify them according
to cheyenne's module configuration.
I think it will be better if we could have a chat (google meet) to discuss details.
Are you free today?

Takis

Panagiotis Velissariou, Ph.D., P.E.
UCAR Scientist
National Ocean and Atmospheric Administration
National Ocean Service
Office of Coast Survey CSDL/CMMB
Physical Scientist - Project Lead
cell: (205) 227-9141
email: [email protected]
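The suggestion above can be sketched as a small shell helper; the function name and the usage path are illustrative, not part of CoastalApp:

```shell
# Hypothetical helper sketching the suggestion above: for every *hera*
# modulefile in a directory, create a matching *cheyenne* copy that can
# then be edited for cheyenne's module configuration.
clone_hera_modulefiles() {
  moddir="$1"                              # e.g. CoastalApp/modulefiles
  for f in "$moddir"/*hera*; do
    [ -e "$f" ] || continue                # glob matched nothing: skip
    base=$(basename "$f")
    cp "$f" "$moddir/$(printf '%s' "$base" | sed 's/hera/cheyenne/')"
  done
}

# Usage, inside a CoastalApp checkout:
# clone_hera_modulefiles CoastalApp/modulefiles
```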

Ufuk Turuncoglu
Jan 27, 2023, 1:36 PM (3 days ago)
to me, Saeed, Daniel

Hi,

I have just commented out the section in the module file (for Cheyenne, Intel) that causes the issue. Now I am trying to build to see how it goes. If that doesn't work I'll try your suggestion and use the Hera one as a base. I'll let you know if I need help with it, and then we could have a chat. Is there any other platform that is tested regularly, such as Orion?

Thanks for your help,

—ufuk

Ufuk Turuncoglu
Jan 27, 2023, 1:38 PM (3 days ago)
to me, Saeed, Daniel

BTW, which Hera module is the best one for the Intel compiler? I can see multiple under the modulefiles/ directory. Thanks.

—ufuk


Panagiotis Velissariou - NOAA Affiliate [email protected]
Jan 27, 2023, 1:49 PM (3 days ago)
to Ufuk

envmodules_intel.hera is the file currently used. Basically, the code in all modulefiles is the same;
only the modules that are loaded differ between platforms.
Could you please load the module manually and send me what NETCDF and HDF environment variables
are set?

Takis


Ufuk Turuncoglu
Jan 27, 2023, 3:11 PM (3 days ago)
to me

Hi,

I did the following:

  • fixed the thirdparty_open/parmetis/Makefile file and changed

cc = mpicc
cxx = mpicxx

To

if [[ -z "${CC}" ]]; then
cc = mpicc
else
cc = ${CC}
fi
if [[ -z "${CXX}" ]]; then
cxx = mpicxx
else
cxx = ${CXX}
fi

since these variables can be provided via the environment.

  • updated modulefiles/envmodules_intel.cheyenne using the modules from ufs-weather-model develop:

module purge
module load cmake/3.22.0
module load ncarenv/1.3
module load intel/2022.1
module load mpt/2.25
module load ncarcompilers/0.5.0
module use /glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/modulefiles/stack
module load hpc/1.2.0
module load hpc-intel/2022.1
module load hpc-mpt/2.25
export CC=mpicc
export PCC=mpicc
export CXX=mpicxx
export PCXX=mpicxx
export FC=mpif90
module load zlib/1.2.11
module load hdf5/1.10.6
module load netcdf/4.7.4
module load pio/2.5.7
module load esmf/8.3.0b09

When I run the script with the following command:

./build.sh --compiler intel --platform cheyenne --component "atmesh pahm adcirc ww3" --tp parmetis -y

I get the following error:

make[1]: Leaving directory '/glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/thirdparty_open/parmetis/metis/build/Linux-x86_64'
Makefile:11: *** missing separator. Stop.
ERROR:: compileMetis (main): called from: functions_build
the configuration of the ParMETIS library source failed
tried to run the make command as:
make config CC=mpicc CXX=mpicxx cc=mpicc cxx=mpicxx CFLAGS= CONFIG_FLAGS1=-DCMAKE_C_FLAGS='"-DIDXTYPEWIDTH=32 -DREALTYPEWIDTH=32"' gklib_path=/glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/thirdparty_open/parmetis/metis/GKlib metis_path=/glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/thirdparty_open/parmetis/metis prefix=/glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/thirdparty_open/parmetis/del
Exiting now …
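For reference, "missing separator" is what GNU make emits when shell `if` blocks appear at a Makefile's top level, where make expects rules or assignments. GNU make's conditional-assignment operator expresses the intent of the edit above in Makefile syntax (a sketch; note the modulefile exports the uppercase CC/CXX names, which would still need forwarding to the lowercase ones metis uses):

```makefile
# '?=' assigns only if the variable is not already defined, e.g. by the
# environment or the make command line, so the MPI wrappers act as defaults.
cc ?= mpicc
cxx ?= mpicxx
```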

I am still investigating. I'll update you when I can compile the model; then maybe you might want to push those changes to the main repository, or I could create a branch and we could have a PR later.

—ufuk

Panagiotis Velissariou - NOAA Affiliate [email protected]
Jan 27, 2023, 3:38 PM (3 days ago)
to Ufuk

You don't need to modify any of these in the Makefile. All of these are set
by the build script. Check the file scripts/functions_build, where the function
compileMetis defines all these variables and configures metis/parmetis properly.
It should work in your case; if not, please let me know the errors you are getting.

Takis


Ufuk Turuncoglu
Jan 29, 2023, 11:47 PM (12 hours ago)
to me

Hi,

I got past that issue, but I am having another error, as follows:

-- Configuration on "cheyenne" waits verification.
-- Found NetCDF: /glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/netcdf/4.7.4/include (found version "4.7.4") found components: C Fortran
-- FindNetCDF defines targets:
-- - NetCDF_VERSION [4.7.4]
-- - NetCDF_PARALLEL [TRUE]
-- - NetCDF_C_CONFIG_EXECUTABLE [/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/netcdf/4.7.4/bin/nc-config]
-- - NetCDF::NetCDF_C [STATIC] [Root: /glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/netcdf/4.7.4] Lib: /glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/netcdf/4.7.4/lib/libnetcdf.a
-- - NetCDF_Fortran_CONFIG_EXECUTABLE [/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/netcdf/4.7.4/bin/nf-config]
-- - NetCDF::NetCDF_Fortran [STATIC] [Root: /glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/netcdf/4.7.4] Lib: /glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/netcdf/4.7.4/lib/libnetcdff.a
CMake Error at cmake/check_netcdf.cmake:143 (message):
The NetCDF library specified is not compatible with the specified
compilers.It will not be enabled. Specify a different path or disable
NetCDF.Ensure that you specify the same compilers to build PAHM as were
used to build the NetCDF library.

Call Stack (most recent call first):
CMakeLists.txt:238 (include)

-- Configuring incomplete, errors occurred!

and then

pahm_mod.F90(15): error #7002: Error in opening the compiled module file. Check INCLUDE paths. [PAHM_MESH]
USE PaHM_Mesh
------^
pahm_mod.F90(107): error #6404: This name does not have a type, and must have an explicit type. [NP]
theData%numNd = np
--------------------^
pahm_mod.F90(108): error #6404: This name does not have a type, and must have an explicit type. [NE]
theData%numEl = ne
--------------------^
pahm_mod.F90(122): error #6404: This name does not have a type, and must have an explicit type. [SLAM]
theData%ndCoords((iCnt - 1) * nDims + 1) = slam(iCnt)
-------------------------------------------------^
pahm_mod.F90(123): error #6404: This name does not have a type, and must have an explicit type. [SFEA]
theData%ndCoords((iCnt - 1) * nDims + 2) = sfea(iCnt)
-------------------------------------------------^
pahm_mod.F90(130): error #6404: This name does not have a type, and must have an explicit type. [NM]
theData%elConnect((iCnt - 1) * numFaceEl + 1) = nm(iCnt, 1)
------------------------------------------------------^
pahm_mod.F90(136): error #6404: This name does not have a type, and must have an explicit type. [DP]
theData%bathymetry = dp
-------------------------^
compilation aborted for pahm_mod.F90 (code 1)

It seems that those parameters (e.g., np) are defined in PAHM/src/mesh.F90, which needs to be compiled before pahm_mod.F90. Maybe it is because of the netcdf issue, not sure.

At this point I put the following into modulefiles/envmodules_intel.cheyenne (trying to use the exact modules used for UFS):

module load cmake/3.22.0
module load ncarenv/1.3
module load intel/2022.1
module load mpt/2.25
module load ncarcompilers/0.5.0
module use /glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/modulefiles/stack
module load hpc/1.2.0
module load hpc-intel/2022.1
module load hpc-mpt/2.25
export CC=mpicc
export PCC=mpicc
export CXX=mpicxx
export PCXX=mpicxx
export FC=mpif90
export F90=mpif90
export PFC=mpif90
export PF90=mpif90
module load zlib/1.2.11
module load hdf5/1.10.6
module load netcdf/4.7.4
module load pio/2.5.7
module load esmf/8.3.0b09

So, maybe there are some restrictions on the CoastalApp side in terms of module versions. In my case, I am using the exact modules, and netcdf needs to be consistent with the compiler. Maybe there is something on the PAHM side. I'll try to find the source. In the meantime, if you have any suggestions just let me know.

Thanks,

—ufuk

Panagiotis Velissariou - NOAA Affiliate [email protected]
9:00 AM (3 hours ago)
to Ufuk

Hi Ufuk,

I have fixed these issues; there was a conflict with the schism model.
Let me update the github repo and I'll let you know.

Takis


Panagiotis Velissariou - NOAA Affiliate [email protected]
10:15 AM (2 hours ago)
to Ufuk

Hi,

Please clone the develop-updates branch for the time being:
git clone --recurse-submodules https://github.com/noaa-ocs-modeling/CoastalApp.git -b develop-updates

With this branch and your module file, it should compile on cheyenne.
Please let me know.

Takis


Ufuk Turuncoglu
10:43 AM (1 hour ago)
Thanks for the quick fix. I’ll try and let you know.

Ufuk Turuncoglu
10:58 AM (1 hour ago)
to me

Hi Again,

I am still getting the same error with your branch:

-- - NetCDF::NetCDF_Fortran [STATIC] [Root: /glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/netcdf/4.7.4] Lib: /glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/netcdf/4.7.4/lib/libnetcdff.a
CMake Error at cmake/check_netcdf.cmake:143 (message):
The NetCDF library specified is not compatible with the specified
compilers.It will not be enabled. Specify a different path or disable
NetCDF.Ensure that you specify the same compilers to build PAHM as were
used to build the NetCDF library.

Call Stack (most recent call first):
CMakeLists.txt:238 (include)

-- Configuring incomplete, errors occurred!
See also "/glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/PAHM/build/CMakeFiles/CMakeOutput.log".
Couldn't run "make" due to "cmake" errors.
make[1]: Entering directory '/glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/PAHM/nuopc'

mpif90 -c -g -traceback -fp-model precise -fPIC -O2 -fPIC -assume realloc_lhs -m64 -mcmodel=small -pthread -threads -qopenmp -I/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/esmf/8.3.0b09/mod -I/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/esmf/8.3.0b09/include -I/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/netcdf/4.7.4/include -DESMF_NO_INTEGER_1_BYTE -DESMF_NO_INTEGER_2_BYTE -DESMF_VERSION_STRING_GIT='v8.3.0b09' -DESMF_MOAB=1 -DESMF_LAPACK=1 -DESMF_LAPACK_INTERNAL=1 -DESMF_NO_ACC_SOFTWARE_STACK=1 -DESMF_NETCDF=1 -DESMF_YAMLCPP=1 -DESMF_YAML=1 -DESMF_PIO=1 -DESMF_MPIIO -DESMF_NO_OPENACC -DESMF_BOPT_O -DESMF_TESTCOMPTUNNEL -DSx86_64_small=1 -DESMF_OS_Linux=1 -DESMF_COMM=mpt -DESMF_DIR=/glade/work/jongkim/stacks/hash/hpc-stack-esmf/pkg/v8.3.0b09 -I/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/netcdf/4.7.4/include -I. -I/glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/PAHM_INSTALL -I/glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/PAHM_INSTALL/include -L/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/netcdf/4.7.4/lib -L. -L/glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/PAHM_INSTALL -L/glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/PAHM_INSTALL/lib -DWITHPETLISTS_on -DESMF_VERSION_MAJOR=8 pahm_mod.F90
pahm_mod.F90(15): error #7002: Error in opening the compiled module file. Check INCLUDE paths. [PAHM_MESH]
USE PaHM_Mesh
------^
pahm_mod.F90(107): error #6404: This name does not have a type, and must have an explicit type. [NP]
theData%numNd = np
--------------------^
pahm_mod.F90(108): error #6404: This name does not have a type, and must have an explicit type. [NE]
theData%numEl = ne
--------------------^
pahm_mod.F90(122): error #6404: This name does not have a type, and must have an explicit type. [SLAM]
theData%ndCoords((iCnt - 1) * nDims + 1) = slam(iCnt)
-------------------------------------------------^
pahm_mod.F90(123): error #6404: This name does not have a type, and must have an explicit type. [SFEA]
theData%ndCoords((iCnt - 1) * nDims + 2) = sfea(iCnt)
-------------------------------------------------^
pahm_mod.F90(130): error #6404: This name does not have a type, and must have an explicit type. [NM]
theData%elConnect((iCnt - 1) * numFaceEl + 1) = nm(iCnt, 1)
------------------------------------------------------^
pahm_mod.F90(136): error #6404: This name does not have a type, and must have an explicit type. [DP]
theData%bathymetry = dp
-------------------------^
compilation aborted for pahm_mod.F90 (code 1)
Makefile:48: recipe for target 'pahm_mod.o' failed
make[1]: *** [pahm_mod.o] Error 1
make[1]: Leaving directory '/glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/PAHM/nuopc'
/glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/NEMS/src/incmake/component_PAHM.mk:30: recipe for target '/glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/PAHM_INSTALL/pahm.mk' failed
make: *** [/glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/PAHM_INSTALL/pahm.mk] Error 2

—ufuk

Panagiotis Velissariou - NOAA Affiliate [email protected]
11:56 AM (26 minutes ago)
to Ufuk

Hi,

Unfortunately, CoastalApp does not work with the hpc-stack static libraries. Can you use the OS-supplied libraries for NetCDF/HDF5 and ESMF so all dependencies are resolved?
Let me check on hera to see what compiles in CoastalApp using hpc-stack.
For sure, ADCIRC cannot be compiled using hpc-stack.
Adding hpc-stack functionality to CoastalApp is on the to-do list.

Takis


Ufuk Turuncoglu
12:00 PM (21 minutes ago)
to me

Thanks for your help. Okay, I could try with the ones provided by the system. In any case, we need to fix this issue to make the components work under UFS, since they use these modules. If you can make it work under Hera, just let me know. One more question: I think there is a set of libraries used by CoastalApp that could be part of hpc-stack. Could you list them for me? I know parmetis is one of them; I think you don't need to compile it every time with the app itself, and it could be part of hpc-stack (or spack-stack in the near future). Right?

—ufuk

Panagiotis Velissariou - NOAA Affiliate
12:15 PM (7 minutes ago)
to Ufuk

ParMetis cannot be part of hpc-stack due to licensing issues. We had this discussion
with NCEP, and it was decided that ParMetis cannot be part of the official NOAA modeling
infrastructure due to its license. This is the reason I have the ParMetis functionality in CoastalApp.
Now, WW3 is moving away from ParMetis (I don't know when this will be done), and SCHISM
has the capability to use an externally compiled ParMetis or not.

As long as you have compiled ParMetis once in CoastalApp (it is installed in CoastalApp/THIRDPARTY_INSTALL),
you can rerun the build.sh script as:
PARMETISHOME=FULL_PATH_TO_CoastalApp/THIRDPARTY_INSTALL ./build.sh "your options"
(without the --tp parmetis option) and CoastalApp will pick up the installed ParMetis.
Just make sure that all libraries, including ParMetis, are compiled using the same compilers and MPI libraries.

Takis


Ufuk Turuncoglu
12:14 PM (7 minutes ago)
to me

I also tried to see why netcdf is failing. I printed the INFO variables from the CMake build (adding message(INFO ${NetCDF3_LOG}) to PAHM/cmake/check_netcdf.cmake), and it seems that it is failing as follows:

Change Dir: /glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/PAHM/build/CMakeFiles/CMakeTmp

Run Build Command(s):/usr/bin/gmake -f Makefile cmTC_95f2d/fast && gmake[1]: Entering directory '/glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/PAHM/build/CMakeFiles/CMakeTmp'
/usr/bin/gmake -f CMakeFiles/cmTC_95f2d.dir/build.make CMakeFiles/cmTC_95f2d.dir/build
gmake[2]: Entering directory '/glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/PAHM/build/CMakeFiles/CMakeTmp'
Building Fortran object CMakeFiles/cmTC_95f2d.dir/netcdf3check.f90.o
/glade/u/apps/ch/opt/ncarcompilers/0.5.0/intel/2022.1/mpi/mpif90 -I/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/netcdf/4.7.4/include -extend-source 132 -heap-arrays 0 -fno-alias -sox -qno-opt-dynamic-align -O2 -debug minimal -nowarn -fp-model source -assume byterecl -mcmodel=medium -c /glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/PAHM/build/CMakeFiles/netcdf3check.f90 -o CMakeFiles/cmTC_95f2d.dir/netcdf3check.f90.o
gmake[2]: *** No rule to make target '/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/netcdf/4.7.4/lib /glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/hdf5/1.10.6/lib -lhdf5_hl -lhdf5 /glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/zlib/1.2.11/lib -lz -ldl -lm -lnetcdf -lm', needed by 'cmTC_95f2d'. Stop.
gmake[2]: Leaving directory '/glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/PAHM/build/CMakeFiles/CMakeTmp'
Makefile:127: recipe for target 'cmTC_95f2d/fast' failed
gmake[1]: *** [cmTC_95f2d/fast] Error 2
gmake[1]: Leaving directory '/glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/PAHM/build/CMakeFiles/CMakeTmp'

Then, I copied the code that tests netcdf3 and ran the command manually:

/glade/u/apps/ch/opt/ncarcompilers/0.5.0/intel/2022.1/mpi/mpif90 -I/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/netcdf/4.7.4/include -extend-source 132 -heap-arrays 0 -fno-alias -sox -qno-opt-dynamic-align -O2 -debug minimal -nowarn -fp-model source -assume byterecl -mcmodel=medium -c netcdf3check.f90

and it compiled the code without any issue. It is the same with the netcdf4 one. So, I am not sure why this is not working under the CMake build. I think it is not related to the hpc-stack modules. Anyway, I'll try to find the source of the issue.

—ufuk

Ufuk Turuncoglu
12:20 PM (2 minutes ago)
to me

Hi,

Okay. It is good to know the background about the parmetis. This will help when we start to move the components to UFS. Thanks.

—ufuk

@pvelissariou1 pvelissariou1 self-assigned this Jan 30, 2023
@uturuncoglu

@pvelissariou1 I might have found the issue with the error related to the netcdf library. It is caused by the following part in the cmake file:

  try_compile(NETCDF4_TEST
              "${CMAKE_BINARY_DIR}"
              SOURCES "${CMAKE_BINARY_DIR}/CMakeFiles/netcdf4check.f90"
              CMAKE_FLAGS "-DINCLUDE_DIRECTORIES=${NetCDF_INCLUDE_DIRS}"
              LINK_LIBRARIES "${NetCDF_LIBRARIES}" 
              LINK_LIBRARIES "${NetCDF_AdditionalLibs}"
              OUTPUT_VARIABLE NetCDF4_LOG)

The LINK_LIBRARIES part basically causes the issue. It might be due to the cmake version used on Cheyenne, etc. If I remove them, then it finds the target and compiler but fails due to the missing netcdf library; I think I could also provide them under the CMAKE_FLAGS argument. Anyway, I'll try to fix it and see what happens.
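If the repeated LINK_LIBRARIES keyword is indeed what confuses try_compile, one possible fix (a sketch, not tested on Cheyenne) is to merge both lists into a single keyword argument:

```cmake
# Sketch: pass one merged list to try_compile's LINK_LIBRARIES
# instead of repeating the keyword.
try_compile(NETCDF4_TEST
            "${CMAKE_BINARY_DIR}"
            SOURCES "${CMAKE_BINARY_DIR}/CMakeFiles/netcdf4check.f90"
            CMAKE_FLAGS "-DINCLUDE_DIRECTORIES=${NetCDF_INCLUDE_DIRS}"
            LINK_LIBRARIES "${NetCDF_LIBRARIES}" "${NetCDF_AdditionalLibs}"
            OUTPUT_VARIABLE NetCDF4_LOG)
```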

@pvelissariou1
Collaborator Author

pvelissariou1 commented Jan 30, 2023 via email

@uturuncoglu

It is cmake/3.22.0. The build is fine but it fails when linking in the try_compile command.

@pvelissariou1
Collaborator Author

pvelissariou1 commented Jan 30, 2023 via email

@uturuncoglu

@pvelissariou1 I think I fixed the issue with the PAHM build. The issue was the ordering of the netcdf libraries in the link stage. The build requires adding the netcdf Fortran libraries first and then the netcdf C ones; otherwise, the build complains about missing netcdf functions. I updated the cmake/Modules/FindNetCDF.cmake file and also added list( REVERSE _search_components ) to change the order of the language components. Maybe this could be done in a more elegant way, but since the script first checks C and then the others, this was the easiest solution. Anyway, if you need more information I could create a branch and push the changes there. Or we could implement it another way.
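The ordering matters because the linker resolves static archives left to right: libnetcdff.a references C symbols that must still be unresolved when libnetcdf.a is reached. An illustrative link line (the library set and paths are examples, not the exact CoastalApp flags):

```shell
# Fortran wrapper first, then the C library it depends on, then HDF5/zlib
mpif90 prog.f90 -L"$NETCDF/lib" -lnetcdff -lnetcdf -lhdf5_hl -lhdf5 -lz -lm
```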

That allowed me to get past the issue, but then I hit another one from ADCIRC. It was complaining about finding the netcdf library, but I fixed it by setting export NETCDF_DIR=$NETCDF_ROOT in my Cheyenne module file. Then I got the following error from the LibXdmf libraries:

CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
XDMF_LibXdmf
    linked by target "cmTC_73a97" in directory /glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/ADCIRC/build/CMakeFiles/CMakeTmp
XDMF_LibXdmfCore
    linked by target "cmTC_73a97" in directory /glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/ADCIRC/build/CMakeFiles/CMakeTmp
XDMF_LibXdmfUtils
    linked by target "cmTC_73a97" in directory /glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/ADCIRC/build/CMakeFiles/CMakeTmp

CMake Error at cmake/xdmf_check.cmake:40 (try_compile):
  Failed to generate test project build system.
Call Stack (most recent call first):
  CMakeLists.txt:105 (include)

In the following documentation, it says it is required for writing data in Xdmf format:

https://wiki.adcirc.org/Compiling

I am not sure whether it is required for running the tests. Anyway, do I need to install it, or check it out somewhere so that ADCIRC can find and install it for me?

@uturuncoglu

Anyway, I think I fixed that one too. I edited ADCIRC/work/makefile and again changed the order of the netcdf link libraries from -lnetcdf -lnetcdff to -lnetcdff -lnetcdf. Then it could complete the build without any issue. This needs to be done for every occurrence of -lnetcdf -lnetcdff. I was also getting an error related to the HDF library (since netcdf is built with HDF), so I made some changes in the makefile to add those to the link stage too. I also modified the ADCIRC/work/makefile to use nf-config --flibs and nc-config --libs to get the netcdf-related libraries, and it seems to be working. I still have an issue like /usr/lib64/gcc/x86_64-suse-linux/4.8/../../../../x86_64-suse-linux/bin/ld: /glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/netcdf/4.7.4/lib/libnetcdf.a(libdispatch_la-dparallel.o): undefined reference to symbol 'MPI_Info_f2c' but I'll investigate that too.
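The nf-config/nc-config approach mentioned above could look roughly like this in a makefile (a sketch assuming the netCDF module puts both tools on PATH; the variable names are illustrative):

```makefile
# Let the config tools emit link flags in the right order instead of
# hard-coding -lnetcdf -lnetcdff by hand.
NETCDF_FLIBS := $(shell nf-config --flibs)
NETCDF_CLIBS := $(shell nc-config --libs)
LDLIBS       += $(NETCDF_FLIBS) $(NETCDF_CLIBS)
```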

@uturuncoglu

This could be related to using ifort rather than mpif90 in the ADCIRC build, and I think it can be fixed easily. Anyway, I'll keep you updated.

@pvelissariou1
Collaborator Author

pvelissariou1 commented Feb 2, 2023 via email

@uturuncoglu

@pvelissariou1 I was looking into the MPI error in the linking step of ADCIRC:

patch_la-dparallel.o): undefined reference to symbol 'MPI_Info_f2c'
/usr/lib64/gcc/x86_64-suse-linux/4.8/../../../../x86_64-suse-linux/bin/ld: /glade/u/apps/ch/opt/mpt/2.25/lib/libmpi.so: error adding symbols: DSO missing from command line

This seems related to the compiler used for ADCIRC; my build is using ifort rather than the MPI wrapper. It seems that there is no configure.nems.cheyenne.intel file under the conf/ directory. Do we need to create one? I could easily use configure.nems.hera.intel or configure.nems.orion.intel as a reference to create it. Maybe because of the missing file it falls back to the default compiler.

@uturuncoglu

uturuncoglu commented Feb 4, 2023

If I modify ADCIRC/work/cmplrflags.mk and change PPFC from ifort to mpif90 under the Intel section, ADCIRC gets past the MPI issue (there is no need to create configure.nems.cheyenne.intel) but then fails with the following:

mpif90 -c -g -traceback -fp-model precise   -fPIC -O2 -fPIC -assume realloc_lhs -m64 -mcmodel=small -pthread -threads  -qopenmp -I/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/esmf/8.3.0b09/mod -I/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/esmf/8.3.0b09/include -I/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/netcdf/4.7.4/include  -DESMF_NO_INTEGER_1_BYTE -DESMF_NO_INTEGER_2_BYTE -DESMF_VERSION_STRING_GIT='v8.3.0b09' -DESMF_MOAB=1 -DESMF_LAPACK=1 -DESMF_LAPACK_INTERNAL=1 -DESMF_NO_ACC_SOFTWARE_STACK=1 -DESMF_NETCDF=1 -DESMF_YAMLCPP=1 -DESMF_YAML=1 -DESMF_PIO=1 -DESMF_MPIIO -DESMF_NO_OPENACC -DESMF_BOPT_O -DESMF_TESTCOMPTUNNEL -DSx86_64_small=1 -DESMF_OS_Linux=1 -DESMF_COMM=mpt -DESMF_DIR=/glade/work/jongkim/stacks/hash/hpc-stack-esmf/pkg/v8.3.0b09 -I. -I/glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/ADCIRC/work/odircp -I/glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/ADCIRC/prep -I/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/netcdf/4.7.4/include    -L. -L/glade/scratch/turuncu/COASTAL_APP/CoastalApp-testsuite/CoastalApp/ADCIRC  -L/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/netcdf/4.7.4/lib -DWITHPETLISTS_on  -DESMF_VERSION_MAJOR=8 couple2swan_modif.F90
couple2swan_modif.F90(17): error #7002: Error in opening the compiled module file.  Check INCLUDE paths.   [SIZES]
      USE SIZES,  ONLY: SZ
----------^
couple2swan_modif.F90(18): error #7002: Error in opening the compiled module file.  Check INCLUDE paths.   [WRITE_OUTPUT]
      USE WRITE_OUTPUT, ONLY : terminate
----------^
couple2swan_modif.F90(19): error #7002: Error in opening the compiled module file.  Check INCLUDE paths.   [GLOBAL]

So, maybe the dependency needs to be defined in ADCIRC/thirdparty/nuopc/Makefile. The sizes.F is under ADCIRC/src.

@uturuncoglu

I am not sure how CoastalApp builds on other systems such as Hera without any issue. Maybe I am doing something totally wrong here, but in any case, if we want to fork UFS and port these components there, the build needs to work with those modules. We could still follow the same approach and wrap each component with CMake build interfaces in UFS, but I think it is better to use their own build systems. BTW, some components, like ADCIRC, have both cmake and make build systems. Am I right? I am asking because UFS uses CMake for component builds.

@pvelissariou1
Collaborator Author

pvelissariou1 commented Feb 4, 2023 via email

@pvelissariou1
Collaborator Author

An update:

  1. Clone CoastalApp develop branch
  2. Compile CoastalApp on Orion using the orion default module
  3. Compile CoastalApp on Orion using hpc-stack

@uturuncoglu

@pvelissariou1 I confirmed that I was able to build CoastalApp on Orion and run the test cases without any issue with the out-of-box modules. As a next step, I'll try to do the same with the module files used by UFS develop (hpc-stack). BTW, where are all those run directories? It might be good to check the results visually to be sure that they ran without any issue. I saw them in the queue, but I am not sure they ran successfully.

When I ran run_all.sh, I had an issue as follows:

sbatch: error: Batch job submission failed: Invalid account or account/partition combination specified
sbatch: error: Batch job submission failed: Invalid account or account/partition combination specified
sbatch: error: Batch job submission failed: Invalid account or account/partition combination specified
sbatch: error: Batch job submission failed: Invalid account or account/partition combination specified

Since I am not in the coastal group, I needed to change the job scripts using the following commands:

find ./ -name "model_setup.job" -type f -exec sed -i 's/--account=coastal/--account=nems/g' {} \;
find ./ -name "model_run.job" -type f -exec sed -i 's/--account=coastal/--account=nems/g' {} \;

Then, I got another error:

sbatch: error: Batch job submission failed: Requested time limit is invalid (missing or exceeds some limit)
sbatch: error: Batch job submission failed: Invalid account or account/partition combination specified

So, I had to add the following to the job scripts

#SBATCH --qos=batch
#SBATCH --partition=orion

using the following commands:

find ./ -name "model_setup.job" -type f -exec sed -i '2i #SBATCH --qos=batch\n#SBATCH --partition=orion' {} \;
find ./ -name "model_run.job" -type f -exec sed -i '2i #SBATCH --qos=batch\n#SBATCH --partition=orion' {} \;
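The two rounds of find/sed edits above could also be folded into one pass. A hypothetical helper sketching that (patch_job_scripts is not a CoastalApp script; the account, qos, and partition values are site-specific placeholders):

```shell
# Rewrite the Slurm account and inject qos/partition headers into every
# model_setup.job and model_run.job under a directory, in a single pass.
patch_job_scripts() {
  dir="$1"
  find "$dir" \( -name "model_setup.job" -o -name "model_run.job" \) -type f \
    -exec sed -i \
      -e 's/--account=coastal/--account=nems/g' \
      -e '2i #SBATCH --qos=batch\n#SBATCH --partition=orion' {} \;
}
```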

So, maybe that information could come from a configuration file, etc. Anyway, once we port the components to the UFS, I think it would be nice to port these tests to the UFS RT system, where all of this is handled without any additional development.

@uturuncoglu

@pvelissariou1 I think ADCIRC writes the data separately for each PET, and the pieces need to be combined to check the results. Anyway, is there any post-processing tool that I could use to check the results?

@pvelissariou1

pvelissariou1 commented Feb 9, 2023 via email

@uturuncoglu

uturuncoglu commented Feb 9, 2023

@pvelissariou1 I also tried with hpc-stack and got an error similar to the one I had on Cheyenne. It is related to NetCDF again.

--   - NetCDF::NetCDF_Fortran [STATIC] [Root: /work/noaa/epic-ps/role-epic-ps/hpc-stack/libs/intel-2022.1.2/intel-2022.1.2/impi-2022.1.2/netcdf/4.7.4] Lib: /work/noaa/epic-ps/role-epic-ps/hpc-stack/libs/intel-2022.1.2/intel-2022.1.2/impi-2022.1.2/netcdf/4.7.4/lib/libnetcdff.a 
CMake Error at cmake/check_netcdf.cmake:143 (message):
  The NetCDF library specified is not compatible with the specified
  compilers.It will not be enabled.  Specify a different path or disable
  NetCDF.Ensure that you specify the same compilers to build PAHM as were
  used to build the NetCDF library.

Call Stack (most recent call first):
  CMakeLists.txt:238 (include)

Probably, we need to work on the build system to make the components happy with hpc-stack before starting to move them under UFS. For your reference, here are my modules for modulefiles/envmodules_intel.orion:

module load cmake/3.22.1
module use /work/noaa/epic-ps/role-epic-ps/hpc-stack/libs/intel-2022.1.2/modulefiles/stack
module load hpc/1.2.0
module load hpc-intel/2022.1.2
module load hpc-impi/2022.1.2
module load zlib/1.2.11
module load hdf5/1.10.6
module load netcdf/4.7.4
module load pio/2.5.7
module load esmf/8.3.0b09

@uturuncoglu

@pvelissariou1 Do you want me to work on it? I could create forks of the components and work on branches that fix the build issues.

@pvelissariou1

pvelissariou1 commented Feb 9, 2023 via email

@pvelissariou1

pvelissariou1 commented Feb 9, 2023 via email

@uturuncoglu

Sure. feature/ncar will be fine.

@pvelissariou1

pvelissariou1 commented Feb 9, 2023 via email

@uturuncoglu

The following tests are still running. Is that expected?

           8925869     orion fl_wav_m    tufuk  R    2:22:37     15 Orion-01-[03,43],Orion-02-[27,40,51],Orion-06-[61-62],Orion-08-[08,47-48,72],Orion-09-[03,06],Orion-15-[06-07]
           8925867     orion fl_atm2w    tufuk  R    2:22:41     15 Orion-11-[32-46]
           8925871     orion fl_wav_n    tufuk  R    2:15:30     15 Orion-17-[38-46],Orion-23-[69-72],Orion-24-[01-02]

If I remember correctly they were very cheap tests, but maybe I am wrong.

@pvelissariou1

pvelissariou1 commented Feb 9, 2023 via email

@uturuncoglu

I can't see anything in the logs. They could be hanging. Anyway, I'll keep monitoring them. They have been running for more than 2 hours. Is that expected?

@uturuncoglu

@pvelissariou1 I am creating forks in my personal account for the components that I need to modify, due to the permission restrictions. I think this is a more flexible way to continue the development. I have already fixed PAHM (https://github.com/uturuncoglu/PaHM, feature/ncar branch; the fix just removes the C NetCDF library, which is not required by the model) and I'll go one by one.

@uturuncoglu

@pvelissariou1 It turns out I still have an issue with the PAHM build when it is triggered by the app.

@pvelissariou1

pvelissariou1 commented Feb 10, 2023 via email

@uturuncoglu

@pvelissariou1 I have good news. I was able to build ADCIRC under UFS on Cheyenne. As I told you before in the call, I went with the new CMake interface approach, which is also used by some other UFS components. I think this is the best way to support both UFS and other platforms outside of UFS at this point. We could discuss it more in the kick-off meeting.

I think the same version will also build on other platforms without any issue, but I have not tested that yet.

Here is my fork:

https://github.com/uturuncoglu/ufs-weather-model/tree/feature/coastal_app

At this point, I am only building ADCIRC (no adcprep etc.) with an external Metis (built with the Spack package manager), and this is preliminary work. I will modify the CMake interface (https://github.com/uturuncoglu/ufs-weather-model/blob/feature/coastal_app/ADCIRC-interface/CMakeLists.txt) to build with its internal Metis, so there won't be any Metis dependency, at least for ADCIRC. I also need to look at the CPP options used by CoastalApp and make sure that I am using the same ones.

I am planning to go step by step and bring in each component individually and test it. Once the components work individually and produce answers similar to the CoastalApp, we can couple them.

At this point my questions are:

  • Is there any standalone ADCIRC test (with forcing from files) under https://github.com/noaa-ocs-modeling/CoastalApp-testsuite ? If so, I could try to define it as a UFS Regression Test, and this would give us the capability to compare the results with the CoastalApp.

  • What are adcprep, adcswan, aswip, etc.? Are we planning to support them all under UFS? If so, I could also try to build them. We could add options to the UFS build to enable those configurations once I understand their purpose clearly.

@uturuncoglu

BTW, the ADCIRC configuration might need some tweaks. The CFS data is 6-hourly but WTIMINC is indicated as hourly in fort.15. It also says it is a 7-day run, so maybe I need to set RNDAY to 1. Anyway, we might need to modify this file, and maybe others, to fit this configuration. Then, I could start to define it as a UFS RT.
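As a quick way to eyeball those two settings, the annotated lines can be grepped out of fort.15. The fragment below is an illustrative sample in the usual ADCIRC comment style (values made up, not the real HSOFS input), assuming WTIMINC is given in seconds:

```shell
# Illustrative fort.15 fragment: RNDAY is the run length in days and
# WTIMINC the met forcing interval in seconds (6-hourly data = 21600 s).
cat > fort.15.sample <<'EOF'
1.0          ! RNDAY  - run length in days
21600.0      ! WTIMINC - met forcing interval in seconds
EOF
grep -E 'RNDAY|WTIMINC' fort.15.sample
```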

@pvelissariou1

pvelissariou1 commented Mar 6, 2023 via email

@uturuncoglu

@pvelissariou1 Sure. BTW, if you want to use a higher-temporal-resolution dataset to force ADCIRC, we could replace the CFS data after creating the RT and validate that ADCIRC produces reasonable results.

@pvelissariou1

pvelissariou1 commented Mar 7, 2023 via email

@uturuncoglu

@pvelissariou1 Okay. I modified the parameter file and ran it again. I still don't know how I could check the results. Could you give me some information?

@pvelissariou1

pvelissariou1 commented Mar 7, 2023 via email

@pvelissariou1

pvelissariou1 commented Mar 8, 2023 via email

@uturuncoglu

@pvelissariou1 Okay, let me check those tools. If the output looks fine then I'll define this case as an RT. Do you have any preference for the next configuration?

@pvelissariou1

pvelissariou1 commented Mar 8, 2023 via email

@uturuncoglu

Okay. I'll look at SCHISM first after this.

@pvelissariou1

pvelissariou1 commented Mar 8, 2023 via email

@uturuncoglu

@pvelissariou1 I can't see the attachment.

@uturuncoglu

@pvelissariou1 I am having trouble using the Python package. I downloaded the sample datasets and tried to point it at my run directory, but running plot_maps.py does not produce anything, just the following output:

rm: cannot remove 'base_info.pyc': No such file or directory
rm: cannot remove 'geo_regions.pyc': No such file or directory
/glade/scratch/turuncu/COASTAL_APP/florence_hsofs.atm2adc/
/glade/scratch/turuncu/COASTAL_APP/ca_adcirc_plot/plot_maps.py:461: DeprecationWarning: tostring() is deprecated. Use tobytes() instead.
  dates1 = netCDF4.num2date(ncv1['time'][:],ncv1['time'].units)
[info]: Fin

I need to create very simple plots that show the run is fine.

@uturuncoglu

@pvelissariou1 @saeed-moghimi-noaa I added coastal_florence_hsofs_atm2adc as a regression test under UFS. I did not check the results yet since I have an issue with the plotting packages, and I need a very simple tool (Python scripts etc.) to create some plots to verify it is working as expected. Maybe I could try to create it using NCL etc.

Anyway, the regression test basically runs with NUOPC connectors, not CMEPS, until the issue in the ADCIRC cap is solved. I also created a new regression file for the coastal app (tests/rt_coastal.conf). We could move the tests that we want to run regularly with UFS to rt.conf later. All the work is in the following UFS Weather Model fork:

https://github.com/uturuncoglu/ufs-weather-model/tree/feature/coastal_app

and the regression test can be run with the following command:

./rt.sh -l rt_coastal.conf -k -n coastal_florence_hsofs_atm2adc

To run the test you also need to stage the input files (there is no baseline yet). The adcprep is not part of this RT, and the files generated by adcprep are staged as part of the input files. So, if you plan to test it on your side, please let me know so I can give you the input file directory and the required modifications to rt.sh (this is required since I have no write access to the UFS common data directory on the platform, so I created one under my account). I am planning to modify the RT once we are able to run ADCIRC with CMEPS.

Once we push all this development to the UFS level, the input directory and baselines will be stored in a common place for all the supported platforms and RTs.

@uturuncoglu

@pvelissariou1 Could you give me information about the file that I need to look at for the output? There are lots of fort.* files and I could not be sure which one(s) to check. Is there any documentation that explains all those files and their contents?

@pvelissariou1

pvelissariou1 commented Mar 18, 2023 via email

@uturuncoglu

@pvelissariou1 I have an issue with those files. I tried to run florence_hsofs.atm2adc on Orion via CoastalApp-testsuite (so the existing model, not my version) and nothing is written to those files while the model is running. Maybe the time dimension is only updated once the simulation finishes, I am not sure, but the file size is not changing either. So, probably I have an issue with running the model. As I understand it, those two files are created outside of the model run, in the prep step. I think so because if I delete those two files and run the model again, it fails during file initialization (write_output.F:1603). Anyway, at this point I have no baseline to compare against (at least for the files; I know the model I am forcing with CDEPS uses different input). If you don't mind, could you run the florence_hsofs.atm2adc case on your end (I think it runs on Hera without any issue) and copy the run directory to Cheyenne or Orion? If I could see an actually running case, maybe I could find what is going wrong with it. Thanks.
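For what it's worth, a minimal way to watch whether records are being written (assuming ncdump is available in the environment): the time dimension in ADCIRC's NetCDF outputs is the unlimited dimension, so its current length in the file header shows how many output steps exist.

```shell
# Print the unlimited ("time") dimension of the ADCIRC NetCDF outputs;
# "time = UNLIMITED ; // (0 currently)" means nothing was written yet.
# Guarded so it is safe to run outside a run directory.
for f in fort.63.nc fort.64.nc; do
  if [ -f "$f" ]; then
    ncdump -h "$f" | grep -i 'unlimited'
  else
    echo "skip: $f not present"
  fi
done
```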

@uturuncoglu

Definitely there is an issue with the model or the florence_hsofs.atm2adc configuration, since PET0 (ATMMESH) seems to have finalized but the ADCIRC PETs are still running on Orion. Again, this is the current version of the coupled model, not the modified and updated one.

@uturuncoglu

@pvelissariou1 While we are solving the issues related to ADCIRC, I would like to also start bringing SCHISM into the UFS model. It seems there are two versions under CoastalApp: schism and schism-esmf. Which one do we need to port to the UFS model? schism might have a CMake build, but schism-esmf does not. Again, if you could give me more information about it, that would be great. Also, I'll try to use the CMake build provided by SCHISM, but if it doesn't work I could create a CMake interface for SCHISM under the UFS like I did for ADCIRC.

@pvelissariou1

pvelissariou1 commented Mar 20, 2023 via email

@uturuncoglu

@pvelissariou1 I am a little bit confused because that one also has the SCHISM source code: https://github.com/schism-dev/schism-esmf/tree/master/src/schism. Do we need to check out both of them under UFS?

@pvelissariou1

pvelissariou1 commented Mar 20, 2023 via email

@uturuncoglu

@pvelissariou1 I could not find any test with SCHISM under CoastalApp-testsuite. If you don't mind, could you point me to a test with SCHISM, so I could compare it against the one under UFS? BTW, any progress on running florence_hsofs.atm2adc successfully on Orion or any other platform? On my side, I could not run it.

@pvelissariou1

pvelissariou1 commented Mar 21, 2023 via email

@uturuncoglu

@pvelissariou1 Let me test it on Orion again with the latest version to see whether florence_hsofs.atm2adc will work or not.

@uturuncoglu

@pvelissariou1 I tried with the develop branch. The florence_hsofs.atm2adc test fails with the following error:

/work/noaa/nems/tufuk/CoastalApp-testsuite/CoastalApp/ALLBIN_INSTALL/NEMS.x: error while loading shared libraries: libpnetcdf.so.4: cannot open shared object file: No such file or directory
/work/noaa/nems/tufuk/CoastalApp-testsuite/CoastalApp/ALLBIN_INSTALL/NEMS.x: error while loading shared libraries: libpnetcdf.so.4: cannot open shared object file: No such file or directory

The setup job is fine.
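One way to diagnose this kind of loader error is to run ldd on the executable in the same (module) environment the job uses; any library the dynamic loader cannot resolve shows up as "not found". The path is the one from the error message above:

```shell
# List unresolved shared libraries of NEMS.x; guarded so the snippet is
# safe to run on a machine where the binary is not accessible.
BIN=/work/noaa/nems/tufuk/CoastalApp-testsuite/CoastalApp/ALLBIN_INSTALL/NEMS.x
if [ -x "$BIN" ]; then
  ldd "$BIN" | grep 'not found' || echo "all shared libraries resolved"
else
  echo "skip: $BIN not accessible from this host"
fi
```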

@uturuncoglu

I added the following to model_run_slurm.job:

module load cmake
module load intel/2018.4 impi/2018.4
module load hdf5/1.10.5-parallel pnetcdf/1.12.0 netcdf/4.7.2-parallel

and ran it to see what happens. The module issue is gone, but I am getting

/var/spool/slurmd/job9500254/slurm_script: line 93: NEMS.x: command not found

error now.

@pvelissariou1

pvelissariou1 commented Mar 22, 2023 via email

@uturuncoglu

@pvelissariou1 Okay. I fixed the issue and ran the model. I still have an issue with the output. The model seems to be working fine, but I don't have anything in the output files and the time dimension is still 0. If you don't mind, could you try to run the florence_hsofs.atm2adc case on Orion on your side and check the output? For example, I have no data in fort.64.nc and the time dimension size is zero.

@pvelissariou1

pvelissariou1 commented Mar 22, 2023 via email
