Ocean Dynamics Course: Computing
The primary tool for the course will be the MIT-GCM.
The newest version of the manual is here. You can follow Chapter 3 and install the
model yourself anywhere you like, and you may find that the
standard configuration scripts work fine for your environment. If
you want to use it on our local supercomputer, Max, you can follow
the steps below. The advantage is that here you'll be all set up
for running it in parallel, when we get to that point.
Running the MIT-GCM on Max:
- Get an account on Max (email email@example.com)
- Once logged into l1.ccss.nyu.edu, copy the tarball to your home
directory and untar it:
cp /home/MITgcm/MITgcm-cvs20070112.tgz ~
tar xvfz MITgcm-cvs20070112.tgz
- To prepare to build the model executable, you need to be on a
node that is set up like a compute node (l1 is a login node):
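The exact command depends on how Max's queues are configured, but with PBS an interactive session on a compute node is typically requested with qsub -I; the resource request below is only a placeholder, so check the Max documentation for the correct values:

```shell
# Request an interactive shell on one compute node (placeholder limits;
# adjust nodes/ppn/walltime to whatever Max's queues allow).
qsub -I -l nodes=1:ppn=1,walltime=01:00:00
```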
- You can look at
section 3.3 to see a description of the subdirectories in the newly
created ~/MITgcm/ directory. All of the pre-set
experiments are located in the verification
subdirectory. Here we'll go through the steps to run the
barotropic gyre example I covered in class (see Section
3.9 for all the details on this example simulation).
- Before building the executable, we need to change a few things.
There are two primary files you must deal with: SIZE.h
which sets model resolution and 'tiling' (distribution of fields in
a two-dimensional domain decomposition), and data, the
namelist file that sets physical parameters (like values of beta
and viscosity) as well as run-control parameters (such as the
timestep, the length of the run, and the frequency of saves of
model output). Read the manual section linked above for more
information about these files.
To remove tiling (but keep the resolution the same as in the
tutorial description), first diff against my copy of
SIZE.h, then simply replace yours with mine:
Now replace all namelists with my namelists (say yes to overwriting), since I have
swapped the ending '&' symbols with '/', as is
necessary with our compilers.
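For reference, each namelist group in data has this shape; note the closing '/' rather than a trailing '&' (the parameter values here are only illustrative, loosely based on the tutorial, so don't copy them blindly):

```
 &PARM01
 viscAh=4.E2,
 f0=1.E-4,
 beta=1.E-11,
 /
```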
- Note that the files windx.sin_y and topog.box
in input are created by the matlab script
gendata.m, also in input. Looking at that
file, you can see how easy it is to create your own inputs.
Now we will run a configuration script that 'makes' a
makefile, using a configuration file created by the system
administrator for Max, and build the model into the code directory.
These commands will take about 5-10 minutes.
../../../tools/genmake2 -of /home/MITgcm/linux_ppc64_xlf_ncar+max
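If genmake2 finishes without errors, the rest of the build follows the standard MITgcm sequence (this is the part that takes most of the 5-10 minutes):

```shell
# Run these in the same directory where genmake2 wrote the Makefile:
make depend   # generate the source-code dependency lists
make          # compile everything and link the executable mitgcmuv
```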
- Now we will make a specific directory for your first run. We
will copy the input files and the executable into that directory:
mkdir run1
cp input/* run1
cp code/mitgcmuv run1
- Now you can try to test-run the model. The line below will
redirect the STDOUT output to the file output.txt, and will finish
in just a few seconds, since in data, the end time
parameter is set as endTime=12000 (seconds), so the model
runs for only 10 timesteps.
cd run1
./mitgcmuv > output.txt
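As a sanity check on that step count: the tutorial's default timestep is deltaT=1200 seconds (verify deltaT in your own copy of data), so:

```shell
# endTime divided by the timestep gives the number of model steps:
echo $((12000 / 1200))   # prints 10
```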
- You can compare the resulting output.txt to the sample
one in the results subdirectory. You also should now find
a new directory called
mnc_test_*_0001 where '*' is some string of numbers. This
directory contains the model output in 'netcdf' format (see UCAR's Netcdf
page). There is nothing significant in it since the run was so short.
- To finally do a long run (like the one shown in class), you
first need to change the end time parameter in data to a
longer time. You can simply comment the endTime=12000
line and uncomment the endTime=311040000 line, which tells
the model to run for 3600 days (the value is given in seconds).
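The day count is just the end time converted from seconds:

```shell
# 86400 seconds per day, so 311040000 s of model time is:
echo $((311040000 / 86400))   # prints 3600
```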
One must use the compute nodes of the cluster for jobs that take
more than a few minutes, and these
are accessed by submitting jobs using the Portable Batch
System. See the Max page on
submitting jobs for more info.
Copy my PBS script and use this to run your job after
changing the end time above. Assuming you are still in the run1
directory, submit the job with the qsub command.
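If you prefer to write your own script instead, a minimal PBS script has roughly this shape; the job name and resource limits below are placeholders to adapt for Max:

```shell
#!/bin/sh
#PBS -N gyre_run1                        # job name (placeholder)
#PBS -l nodes=1:ppn=1,walltime=02:00:00  # placeholder resource request
#PBS -j oe                               # merge STDOUT and STDERR

cd $PBS_O_WORKDIR   # start in the directory the job was submitted from
./mitgcmuv > output.txt
```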
- You can see all submitted jobs by typing the command
pbstop. Eventually you should see an 'R' by your run,
showing that it is running. It will take about 1 hour. When it's done,
it won't be in the queue anymore. You will then find a new
mnc_test_*_0002 subdirectory, and this will contain new
output. All of the relevant output will be stored in the netcdf
file state.0000000000.t001.nc. You can look at it with
the tool ncview as follows:
ncview state.0000000000.t001.nc &
- If you are a Matlab user (as you should be!), you can get the
fields into your workspace by first putting the paths to some
NetCDF utilities into your Matlab path (I suggest adding this line
to your .cshrc, or wherever you set your path):
Then after restarting matlab and moving to the run1
directory, you can grab files with the menu-driven function
getnc as follows:
foo = getnc('state.0000000000.t001.nc');
Just follow the instructions and you can now do whatever you like
with the output.