OpenFOAM

OpenFOAM is a suite of computational fluid dynamics applications. It contains myriad solvers, both compressible and incompressible, as well as many utilities and libraries.

Availability & Restrictions

OpenFOAM is available to all OSC users without restriction.

The following versions of OpenFOAM are available on OSC clusters:

Version   Glenn   Oakley
1.7.x     X       X
2.1.0             X

Usage

Set-up

To configure the Glenn cluster for serial execution of OpenFOAM 1.7.x, use the following commands:

    module load gcc-4.4.5
    module switch mpi mvapich2-1.6-gnu
    module load openfoam-1.7.x

To configure the Oakley cluster for serial execution of OpenFOAM 1.7.x, use the following command:

    . /usr/local/OpenFOAM/OpenFOAM-1.7.x/etc/bashrc

To configure the Oakley cluster for serial execution of OpenFOAM 2.1.0, use the following command:

    . /usr/local/OpenFOAM/OpenFOAM-2.1.0/etc/bashrc
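The leading dot runs the bashrc in the current shell rather than in a subshell, so the environment variables it exports persist in your session. A minimal illustration of the mechanism (the file name and variable name here are made up for demonstration only):

```shell
# Write a tiny bashrc-style file, then source it with "." so the
# exported variable persists in the current shell.
# demo_bashrc and DEMO_VAR are hypothetical names for illustration.
echo 'export DEMO_VAR=set-by-sourcing' > demo_bashrc
. ./demo_bashrc
echo "$DEMO_VAR"    # prints: set-by-sourcing
```

Running the file as a normal script (without the dot) would set the variable only in a child process, and it would be gone when the script exits.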

Basic Structure for an OpenFOAM Case

The basic directory structure for an OpenFOAM case is shown below:

/home/yourusername/OpenFOAM_case
|-- 0
|   |-- U
|   |-- epsilon
|   |-- k
|   |-- p
|   `-- nut
|-- constant
|   |-- RASProperties
|   |-- polyMesh
|   |   |-- blockMeshDict
|   |   `-- boundary
|   |-- transportProperties
|   `-- turbulenceProperties
`-- system
    |-- controlDict
    |-- fvSchemes
    |-- fvSolution
    `-- snappyHexMeshDict
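The layout above can be created as an empty skeleton with a few shell commands. This is only a sketch ("OpenFOAM_case" is a placeholder name): the files it creates are empty, and each must still be filled with real dictionary content, typically by copying and adapting one of the OpenFOAM tutorial cases.

```shell
# Sketch: create the skeleton of an OpenFOAM case.
# "OpenFOAM_case" is a placeholder name; touch creates empty files
# that must still be filled with real dictionary content.
casedir="OpenFOAM_case"
mkdir -p "$casedir/0" "$casedir/constant/polyMesh" "$casedir/system"
for f in U epsilon k p nut; do
    touch "$casedir/0/$f"
done
for f in RASProperties transportProperties turbulenceProperties; do
    touch "$casedir/constant/$f"
done
touch "$casedir/constant/polyMesh/blockMeshDict" "$casedir/constant/polyMesh/boundary"
for f in controlDict fvSchemes fvSolution snappyHexMeshDict; do
    touch "$casedir/system/$f"
done
```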

IMPORTANT: To run in parallel, you need to also create the "decomposeParDict" file in the system directory. If you do not create this file, the decomposePar command will fail.
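A minimal "decomposeParDict" might look like the following sketch. The subdomain count and the simple-method coefficients are example values only; numberOfSubdomains must match the number of MPI processes you request in your batch script.

```
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

// Example values: 16 subdomains, split 4 x 2 x 2 along x, y, z.
// numberOfSubdomains must equal the number of MPI processes used.
numberOfSubdomains  16;

method              simple;

simpleCoeffs
{
    n               (4 2 2);
    delta           0.001;
}
```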

Using OpenFOAM

Once your environment has been configured with the commands above, you can start any OpenFOAM utility or solver by entering its name at the command line, for example:

blockMesh

or

icoFoam

or

sonicFoam

 

Batch Usage

OpenFOAM can be run in both serial and parallel modes.

Sample Batch Script (serial execution)

The following example batch script provides a template for running OpenFOAM in serial.

#PBS -N serial_OpenFOAM 
#PBS -l nodes=1:ppn=1 
#PBS -l walltime=24:00:00 
#PBS -j oe 
#PBS -S /bin/bash 

#Initialize OpenFOAM on Glenn Cluster
module load gcc-4.4.5
module switch mpi mvapich2-1.6-gnu
module load openfoam-1.7.x

#Move to the case directory, where the 0, constant and system directories reside
cd $PBS_O_WORKDIR

# Copy the case (including the 0, constant and system directories) to $TMPDIR
# and move there to execute the program
cp -r * $TMPDIR
cd $TMPDIR

#Mesh the geometry 
blockMesh

#Run the solver 
icoFoam

#Finally, copy the results back to the directory the job was submitted from
cp -r * $PBS_O_WORKDIR

IMPORTANT: This template is for running OpenFOAM on the Glenn cluster. To run OpenFOAM on the Oakley cluster, replace the following lines with the corresponding commands from the "Set-up" section:

    #Initialize OpenFOAM on Glenn Cluster
    module load gcc-4.4.5
    module switch mpi mvapich2-1.6-gnu
    module load openfoam-1.7.x

Sample Batch Script (parallel execution)

An example of a parallel batch script follows, which will run the model on 16 processors (two nodes, with 8 processors per node):

#PBS -N parallel_OpenFOAM
#PBS -l nodes=2:ppn=8
#PBS -l walltime=6:00:00
#PBS -j oe
#PBS -S /bin/bash 

#Initialize OpenFOAM on Glenn Cluster
module load gcc-4.4.5
module switch mpi mvapich2-1.6-gnu
module load openfoam-1.7.x

#Move to the case directory, where the 0, constant and system directories reside
cd $PBS_O_WORKDIR

#Mesh the geometry
blockMesh

#Decompose the mesh for parallel run
decomposePar

#Run the solver
mpirun -np 16 simpleFoam -parallel 

#Reconstruct the parallel results
reconstructPar

IMPORTANT: This template is for running OpenFOAM on the Glenn cluster. To run OpenFOAM on the Oakley cluster, make the following modifications:

  • Replace the following lines with the corresponding commands from the "Set-up" section:
    #Initialize OpenFOAM on Glenn Cluster
    module load gcc-4.4.5
    module switch mpi mvapich2-1.6-gnu
    module load openfoam-1.7.x
  • Use "mpiexec" instead of "mpirun":

    mpiexec simpleFoam -parallel

See Also

  • ParaView