
Getting Started

This page will help you get to the point of running some basic test cases using MOM6 and SIS2. If you have any problems, please check the issues pages and the FAQ. There are a few broad steps:

  1. Downloading model source code and tools
  2. Downloading input data
  3. Compiling the models
  4. Running a test case
  5. Checking your results

Downloading the source code and tools

Check that git is configured

Most of the source code is hosted on GitHub. Check that git is set up on your machine and that you can "talk" to GitHub. See Set Up Git (to install git if you don't already have it, which is unlikely) and Git configuration and environment for details.
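
For example, a quick sanity check might look like the following (the name and email are placeholders to replace with your own; the SSH test is only relevant if you use SSH with GitHub):

git --version                                  # confirm git is installed
git config --global user.name "Your Name"     # placeholder value
git config --global user.email "you@example.com"
ssh -T git@github.com                          # optional: test SSH access to GitHub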

Where to download/install

All files need to be visible to the compute nodes on your machine. For example, on the gaea platform the files are best placed on the lustre file system. Be aware that many parts of the lustre file system are swept (i.e., your files are erased). To avoid having files swept, it is best to use the unswept area, e.g.:

cd /gpfs/f5/gfdl_o/scratch/$USER/

Note that the lustre file system is NOT backed up. The home file system is backed up but of limited size, and since it is not visible from the compute nodes, using home is more difficult.

In general, if you are making changes to the source code, it is important to use git to push to GitHub regularly. This is especially true when your source and edits are on a non-backed-up file system (like lustre).
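
A hedged sketch of that workflow, after forking the relevant repository on GitHub (the fork URL and branch name below are placeholders):

cd src/MOM6
git checkout -b my-changes                # work on your own branch
git remote add myfork https://github.com/YOUR_USERNAME/MOM6.git
git push -u myfork my-changes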

If you are running the model on a desktop/laptop then it shouldn't matter where you put the source.

Cloning the core repositories

The source code is split across several repositories. These are arranged in a hierarchy with MOM6-examples at the top as a "super repository", and the other repositories configured as submodules. To get a better understanding of the repository layout see Understanding the repository layout.

Even without a GitHub account you can clone (download) the repositories by executing this single command at the command-line in a shell:

git clone --recursive https://github.com/NOAA-GFDL/MOM6-examples.git MOM6-examples

That said, we strongly recommend having a GitHub account so you can version control your own modifications and contribute to MOM6.

The above will clone (download) the MOM6-examples repository and all its sub-modules into a directory called MOM6-examples/.

From here on, all commands will take place within the MOM6-examples/ directory you just created.
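
To verify that the submodules were populated, you can run the following (standard git; the reported hashes will match the checked-out revisions):

git submodule status --recursive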

Comment

By way of explanation, the above one-line "recursive clone" is equivalent to the following steps:

git clone https://github.com/NOAA-GFDL/MOM6-examples.git MOM6-examples
cd MOM6-examples
git submodule init
git submodule update --recursive

where the last line in turn is equivalent to

git submodule update src/FMS
git submodule update --init --recursive src/MOM6
git submodule update src/SIS2
git submodule update tools/matlab/gtools
git submodule update tools/python/MIDAS

Knowing this, you can avoid downloading some of the submodules. For instance, you could clone just MOM6-examples and none of the submodules by stopping before the git submodule init step, as sketched below.
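
A minimal sketch of such a selective checkout (the submodule paths chosen here are just examples):

git clone https://github.com/NOAA-GFDL/MOM6-examples.git MOM6-examples
cd MOM6-examples
# fetch only the submodules you need, e.g. MOM6 (recursively, for CVMix) and FMS
git submodule update --init --recursive src/MOM6
git submodule update --init src/FMS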

Note that MOM6 itself has a submodule (CVMix). If you haven't used a recursive clone or update and src/MOM6/pkg/CVMix-src is empty, it should be initialized and updated independently:

cd src/MOM6
git submodule init
git submodule update

Downloading additional component models for coupled models

Not all of the necessary component models for a fully coupled model can be downloaded as described above. If you would like to run any coupled model configurations (i.e. anything other than stand-alone MOM6), then see Obtaining other components.

Build tools and compilation templates

MOM6 relies on tools that are part of FRE (the FMS Runtime Environment) to help with the build process. Those tools are provided in the NOAA-GFDL/mkmf repository, which you should find in MOM6-examples/src/mkmf/.

The workflow we describe in these wiki pages generally compiles the model in a user-created directory, MOM6-examples/build/:

cd MOM6-examples
mkdir build

After downloading all of the above, the directory tree should look like:

% cd MOM6-examples
% tree -d -L 2
.
|-- build
|-- coupled_AM2_LM2_SIS
|   |-- AM2_MOM6i_1deg
|   `-- CM2G63L
|-- coupled_AM2_LM3_SIS
|   `-- AM2_MOM6i_1deg
|-- coupled_AM2_LM3_SIS2
|   |-- AM2_SIS2B_MOM6i_1deg
|   `-- AM2_SIS2_MOM6i_1deg
|-- ice_ocean_SIS
|   |-- Amery
|   |-- *
|   `-- OM4_025
|-- ice_ocean_SIS2
|   |-- Baltic
|   |-- *
|   `-- SIS2_icebergs
|-- ocean_only
|   |-- DOME
|   |-- *
|   `-- torus_advection_test
|-- src
|   |-- FMS
|   |-- MOM6
|   |-- SIS2
|   |-- atmos_null
|   |-- coupler
|   |-- ice_param
|   |-- icebergs
|   |-- land_null
|   `-- mkmf
`-- tools
    |-- analysis
    |-- matlab
    |-- python
    `-- tests

Downloading input data

Some experiments have input data sets that are placed in an INPUT/ subdirectory of the example. For example, running tree ocean_only/global yields:

% tree ocean_only/global
ocean_only/global
|-- INPUT
|   |-- 31Layer_zgrid.nc -> .datasets/global/siena_201204/INPUT/31Layer_zgrid.nc
|   |-- GOLD_IC.2010.11.15.nc -> .datasets/global/siena_201204/INPUT/GOLD_IC.2010.11.15.nc
|   |-- OM3_zgrid.nc -> .datasets/global/siena_201204/INPUT/OM3_zgrid.nc
|   |-- README -> .datasets/global/siena_201204/mosaic.unpacked/README
|   |-- atmos_hgrid.nc -> .datasets/global/siena_201204/mosaic.unpacked/atmos_hgrid.nc
|   |-- atmos_mosaic.nc -> .datasets/global/siena_201204/mosaic.unpacked/atmos_mosaic.nc
|   |-- atmos_mosaic_tile1Xland_mosaic_tile1.nc -> .datasets/global/siena_201204/mosaic.unpacked/atmos_mosaic_tile1Xland_mosaic_tile1.nc
|   |-- atmos_mosaic_tile1Xocean_mosaic_tile1.nc -> .datasets/global/siena_201204/mosaic.unpacked/atmos_mosaic_tile1Xocean_mosaic_tile1.nc
|   |-- geothermal_heating_cm2g.nc -> .datasets/global/siena_201204/INPUT/geothermal_heating_cm2g.nc
|   |-- grid_spec.nc -> .datasets/global/siena_201204/mosaic.unpacked/grid_spec.nc
|   |-- gustiness_qscat.nc -> .datasets/global/siena_201204/INPUT/gustiness_qscat.nc
|   |-- land_hgrid.nc -> .datasets/global/siena_201204/mosaic.unpacked/land_hgrid.nc
|   |-- land_mask.nc -> .datasets/global/siena_201204/mosaic.unpacked/land_mask.nc
|   |-- land_mosaic.nc -> .datasets/global/siena_201204/mosaic.unpacked/land_mosaic.nc
|   |-- land_mosaic_tile1Xocean_mosaic_tile1.nc -> .datasets/global/siena_201204/mosaic.unpacked/land_mosaic_tile1Xocean_mosaic_tile1.nc
|   |-- mosaic.nc -> .datasets/global/siena_201204/mosaic.unpacked/mosaic.nc
|   |-- ocean_forcing_daily.nc -> .datasets/global/siena_201204/INPUT/ocean_forcing_daily.nc
|   |-- ocean_forcing_monthly.nc -> .datasets/global/siena_201204/INPUT/ocean_forcing_monthly.nc
|   |-- ocean_hgrid.nc -> .datasets/global/siena_201204/mosaic.unpacked/ocean_hgrid.nc
|   |-- ocean_mask.nc -> .datasets/global/siena_201204/mosaic.unpacked/ocean_mask.nc
|   |-- ocean_mosaic.nc -> .datasets/global/siena_201204/mosaic.unpacked/ocean_mosaic.nc
|   |-- ocean_precip_monthly.nc -> .datasets/global/siena_201204/INPUT/ocean_precip_monthly.nc
|   |-- ocean_vgrid.nc -> .datasets/global/siena_201204/mosaic.unpacked/ocean_vgrid.nc
|   |-- seawifs_1998-2006_GOLD_smoothed_2X.nc -> .datasets/global/siena_201204/INPUT/seawifs_1998-2006_GOLD_smoothed_2X.nc
|   |-- sgs_h2.nc -> .datasets/global/siena_201204/INPUT/sgs_h2.nc
|   |-- tideamp.nc -> .datasets/global/siena_201204/INPUT/tideamp.nc
|   `-- topog.nc -> .datasets/global/siena_201204/mosaic.unpacked/topog.nc
|-- MOM_input
|-- MOM_memory.h
|-- MOM_override
|-- MOM_parameter_doc.all
|-- MOM_parameter_doc.short
|-- available_diags.000000
|-- diag_table
|-- input.nml
|-- ocean.stats.gnu
|-- ocean.stats.intel
`-- ocean.stats.pgi

The files are actually symbolic links that resolve through a link named .datasets, which does not exist until you create it. This link can point to either a local installation of the data sets or a shared installation (e.g., /archive/gold/datasets or /lustre/f1/pdata/gfdl_O/datasets).

Follow the instructions below to create a .datasets link.

Linking data sets on Gaea

cd MOM6-examples/

ln -sf /gpfs/f5/gfdl_o/world-shared/datasets .datasets

You only have to do this once.

Linking data sets on GFDL workstations or GFDL PAN

cd MOM6-examples/
ln -sf /archive/gold/datasets .datasets

You only have to do this once.

Installing data sets on another platform

You can construct your own datasets directory that the link .datasets points to by downloading and unpacking each of the tar files at ftp://ftp.gfdl.noaa.gov/perm/Alistair.Adcroft/MOM6-testing/ into your own central location and linking .datasets to that location.

Note that not all tar files are needed at once but some tests might use data from more than one file. We try to reuse data between tests and the directory names within the datasets location are not necessarily those of the tests in MOM6-examples.
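
A hedged sketch of that process (the archive name global.tar and the paths are illustrative; list the FTP directory for the actual file names and substitute your own locations):

mkdir -p /path/to/datasets
cd /path/to/datasets
wget ftp://ftp.gfdl.noaa.gov/perm/Alistair.Adcroft/MOM6-testing/global.tar
tar -xf global.tar
cd /path/to/MOM6-examples
ln -sf /path/to/datasets .datasets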

Compiling the models

Compilation method

There are two methods for building MOM6: either autoconf or the GFDL mkmf toolchain.

The autoconf approach will automatically configure the compiler and gather any dependencies, but can only be used for ocean-only builds. It offers a path to quickly getting started, but users who plan to either work with the coupled models or use the GFDL workflow may prefer to familiarize themselves with the mkmf approach.

To build with autoconf, consult the MOM6 README. The instructions below describe the mkmf build process.

Overview

We need to build different executables depending on the coupled mode of the model. All modes require the MOM6 and FMS source code. When used in ice-ocean mode, whether with SIS or SIS2, we must link in "null" land and atmosphere models. In fully coupled mode you also need the full land and atmosphere source code.

The phases of compilation are:

  1. Create a path_names file that contains the relative paths to the source files. This uses the list_paths tool, which ships with FRE but is also provided in the mkmf package that you installed in the Build tools and compilation templates section above.
  2. Create a Makefile that will compile your library or executable. This uses the mkmf tool which ships with FRE.
  3. Compile your library or executable.

We typically apply the above steps either component-by-component (the usual method within FRE scripts) or, as in these pages, to FMS first and then to everything else.

Setting up the compile environment

The first step is to ensure that you have all the necessary prerequisite software installed. As a minimum you'll need: a Fortran compiler (such as gfortran), an MPI implementation (such as OpenMPI or MPICH), and the netCDF libraries and headers. Here are some instructions for specific platforms:

Feel free to add instructions for your platform if they differ considerably from the above.
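
One way to sanity-check the prerequisites on a generic Linux system (the nc-config and nf-config utilities ship with the netCDF C and Fortran packages, respectively; availability varies by installation):

gfortran --version     # Fortran compiler
mpif90 --version       # MPI compiler wrapper
nc-config --version    # netCDF C library
nf-config --version    # netCDF Fortran library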

The following instructions are for GFDL's HPC "Gaea" using the GCC compiler within Cray's programming environment. They will need adapting to work on other platforms.

We'll create an "env" file for convenience, which is sourced at each compile step below. This one works only on Gaea, so be sure to create one appropriate for your platform (see the previous section):

mkdir build
cat > build/env << EOFA
module unload PrgEnv-pgi PrgEnv-intel PrgEnv-gnu darshan
module unload PrgEnv-pgi PrgEnv-intel PrgEnv-gnu
module load PrgEnv-gnu
module unload netcdf gcc
module load gcc/9.3.0 cray-hdf5 cray-netcdf
EOFA

Alternatively, you could run those module commands once in your shell and omit the source ../env commands that appear below.

Compiling the FMS shared code

It is best to compile the shared code separately from the model code (this proves quicker when building multiple versions of the model).

Instructions for Gaea are shown below:

mkdir -p build/fms/
(cd build/fms; rm -f path_names; \
../../src/mkmf/bin/list_paths -l ../../src/FMS; \
../../src/mkmf/bin/mkmf -t ../../src/mkmf/templates/ncrc-gnu.mk -p libfms.a -c "-Duse_libMPI -Duse_netCDF" path_names)

On your platform, replace ncrc-gnu.mk with the appropriate template file.

The above creates the file build/fms/Makefile which you can use to build the fms library:

(cd build/fms/; source ../env; make NETCDF=3 REPRO=1 libfms.a -j)

which creates the file build/fms/libfms.a. This is a library that we will link against after compiling the MOM6 source code. To create a debug compilation, replace REPRO=1 with DEBUG=1. To use a different compiler, change the template file passed to mkmf (e.g., ncrc-intel.mk instead of ncrc-gnu.mk).
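
For example, a debug build of the library might look like this (assuming your mkmf template honors the standard DEBUG macro):

(cd build/fms/; source ../env; make NETCDF=3 DEBUG=1 libfms.a -j)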

If this step fails, it is very likely that your build template - "ncrc-gnu.mk" in this case - does not match your environment. Please see Setting up build configuration.

Compiling MOM6 in ocean-only mode (without SIS2)

With the FMS library compiled, you can now build the MOM6 code. On Gaea:

mkdir -p build/ocean_only/
(cd build/ocean_only/; rm -f path_names; \
../../src/mkmf/bin/list_paths -l ./ ../../src/MOM6/{config_src/infra/FMS1,config_src/memory/dynamic_symmetric,config_src/drivers/solo_driver,config_src/external,src/{*,*/*}}/ ; \
../../src/mkmf/bin/mkmf -t ../../src/mkmf/templates/ncrc-gnu.mk -o '-I../fms' -p MOM6 -l '-L../fms -lfms' path_names)

As before, replace ncrc-gnu.mk with your template file.

One can also use dynamic_nonsymmetric instead of dynamic_symmetric. This step creates the path_names file, which lists the source code that will be compiled. One generally does NOT need to recreate the Makefile or path_names files, but if any new module has been added, or there have been changes to the dependencies between modules (e.g., changed use statements), then the path_names file must be recreated.

Finally, compile the MOM6 ocean-only model with:

(cd build/ocean_only/; source ../env; make REPRO=1 MOM6 -j)

If successful you should now have an ocean executable: build/ocean_only/MOM6.

Comments

Passing -I../fms via the -o option to mkmf ensures that the module files associated with libfms.a can be found. Similarly, passing '-L../fms -lfms' via the -l option to mkmf specifies where to look for libfms.a.

Again, to compile in debug mode, replace REPRO=1 with DEBUG=1 in each make command above.

The circle_obcs case requires compilation with config_src/memory/dynamic_symmetric.

The long list of directories under config_src/ on the command line points to specific variants of the code, such as different "caps" or drivers, different memory models, and stubs for packages that can be used by MOM6. This allows the user to construct different executables and/or compile with different packages represented by the stubs. For example, we selected the FMS1 infrastructure (infra/FMS1), the symmetric memory model (memory/dynamic_symmetric), which is our preferred shape for arrays, the solo driver (drivers/solo_driver) needed for uncoupled ocean-only mode, and everything under external, which contains blank stubs so that this model is cheaper to run than one with full data-assimilation and biogeochemical capabilities.

Compiling MOM6 in MOM6-SIS2 coupled mode

This step is an alternative to the above "Compiling MOM6 in ocean-only mode (without SIS2)".

mkdir -p build/ice_ocean_SIS2/
(cd build/ice_ocean_SIS2/; rm -f path_names; \
../../src/mkmf/bin/list_paths -l ./ ../../src/MOM6/config_src/{infra/FMS1,memory/dynamic_symmetric,drivers/FMS_cap,external} ../../src/SIS2/config_src/dynamic_symmetric ../../src/MOM6/src/{*,*/*}/ ../../src/{atmos_null,coupler,land_null,ice_param,icebergs/src,SIS2,FMS/coupler,FMS/include}/)
(cd build/ice_ocean_SIS2/; \
../../src/mkmf/bin/mkmf -t ../../src/mkmf/templates/ncrc-gnu.mk -o '-I../fms' -p MOM6 -l '-L../fms -lfms' -c '-Duse_AM3_physics -D_USE_LEGACY_LAND_' path_names )

Finally, compile the coupled MOM6-SIS2 ice-ocean model with:

(cd build/ice_ocean_SIS2/; source ../env; make REPRO=1 MOM6 -j)

A successful compilation will yield the file build/ice_ocean_SIS2/MOM6.

Running a test case

How a test is run depends a lot on your platform. On most systems you'll need to use the scheduling system to get access to compute nodes. For example, see Running on Gaea below. If this is not the case, or you already have an interactive session on a compute node, then simply execute the following:

cd ocean_only/double_gyre/
mkdir -p RESTART
mpirun -n 8 ../../build/ocean_only/MOM6

The mkdir -p RESTART is needed so that the model can write out a restart state at the end of the test. You may also need to replace mpirun with the name of your MPI launcher. The smaller test cases, like double_gyre, can be run on a single core with:

cd ocean_only/double_gyre/
mkdir -p RESTART
../../build/ocean_only/MOM6

Running on Gaea

On gaea you will be logged into a front-end node, which is where you compiled. The model executable can only run in parallel on the compute nodes. The executable is launched onto the compute nodes with the srun command, which is invoked from a batch node. Job scripts (batch scripts) run on the batch nodes and are normally submitted from the front-end nodes. In brief:

  • Front-end nodes - can see /home and /gpfs/f5. Use for editing, compiling and submitting jobs.
  • Batch nodes - can see /home and /gpfs/f5. Use for launching the executable with srun.
  • Compute nodes - can only see /gpfs/f5.
  • Use srun instead of mpirun.

1. Interactively

To run the model interactively you can obtain an interactive session on a batch node with (for example):

salloc -n 60 -t 125 -p batch --cluster c4

This allocates you 60 cores for 125 minutes on cluster c4 in the batch queue. You can also ask for N nodes (one node = 36 cores):

salloc -N 10 -t 60 -p batch --cluster c4

to get a session with 360 (10x36) processors for 1 hour. If you want to use the debugger, you can use:

salloc --x11=first -q interactive --cluster c4 -N 2

Although this shell can see /home, the executable will not be able to. Therefore the directory in which you run srun should be on /gpfs/f5, which is why we suggest working on the lustre file system when setting up MOM6. Running the model (here on 60 cores) is then done with:

srun -n 60 ./MOM6 
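
Putting this together for the double_gyre test from earlier (the core count here is illustrative; small tests run fine on 8 cores):

cd ocean_only/double_gyre/
mkdir -p RESTART
srun -n 8 ../../build/ocean_only/MOM6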

2. Simple batch job

The simplest batch script (say mom.sub) to run the model would look like:

#!/bin/bash

#SBATCH -n 60
## if you prefer to specify nodes, use instead:
##SBATCH -N 10
#SBATCH --time=0:60:00
#SBATCH --job-name="MOM6"
#SBATCH --output=slurm.out
#SBATCH --error=slurm.err
#SBATCH --qos=normal
#SBATCH --partition=batch
#SBATCH --clusters=c4
## obviously use your group account:
#SBATCH --account=gfdl_o

srun -n 60 ./MOM6

then submit to the scheduler with:

sbatch mom.sub
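
You can then monitor the job with standard Slurm commands, e.g.:

squeue -u $USER      # list your queued and running jobs
tail -f slurm.out    # follow the model's standard output once it starts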

Checking and examining your results

Once you have run the test case you will get some output in the directory from which you ran srun or mpirun (i.e., in MOM6-examples/ocean_only/double_gyre/ for the above example). A netCDF/HDF5 error at the end of the run usually indicates that you did not create the RESTART/ folder as described above.

The numerical results will depend on the specifics of your platform (chip, operating system, compiler vendor, versions, etc.), so reproducing "bit for bit" the results obtained on GFDL platforms is not guaranteed. Some basics are expected for the checked-in experiments (a quick check is sketched after this list):

  • The generated output documenting parameters and configuration (MOM_parameter_doc.*) should be identical (except for choices of processor layout).
  • There should be no U_velocity_truncations or V_velocity_truncations files; their presence indicates numerical instabilities.
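
A minimal sketch of that check from the experiment directory (the reference path is a placeholder for your own copy):

diff MOM_parameter_doc.short /path/to/reference/MOM_parameter_doc.short
ls *_velocity_truncations 2> /dev/null   # any files listed here signal trouble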

If you want an example of the regression files used by GFDL for testing, you can look in a demonstration regression repository. If you clone this regressions repository, or better yet, maintain one of your own for your platform, then you can check your results by comparing the newly generated ocean.stats file to the committed one:

diff ocean.stats ocean.stats.intel

If you are running on gaea and the ocean.stats files differ, try removing the Depth_list.nc file and ensure that you ran the initial test with 8 processors (i.e., -n 8). If you are not running on gaea, small differences in the ocean.stats files are expected, because no two compilers or computers can be relied upon to do all floating point calculations in the same way.

To look at the netCDF output, you might first need to combine the per-processor netCDF files if you ran on more than one processor. To combine the netCDF files on gaea, do the following:

module load fre
mppnccombine <fname>.nc

where "<fname>" is the root for the files to be combined. If you do not have access to the mppnccombine command then you can find the source code for the tool at https://github.com/NOAA-GFDL/FRE-NCtools.

The content of the netCDF files is controlled by the model input file diag_table. See Diagnostics in the MOM6 documentation. You will need a netCDF browser (such as ncview) or the appropriate netCDF libraries within your analysis software (e.g., scipy.io.netcdf or netCDF4 for Python).
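
For example, a quick look at a combined output file (the file name prog.nc depends on your diag_table; substitute whatever your run produced):

ncdump -h prog.nc | less    # inspect the header: variables, dimensions, attributes
ncview prog.nc              # browse the fields interactively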

See Running and changing the "benchmark" test for another example of running an ocean-only model and how to change the configuration.
