OpenFOAM

OpenFOAM is a suite of computational fluid dynamics applications. It contains a wide range of solvers for both compressible and incompressible flows, as well as many utilities and libraries.

Availability and Restrictions

Versions

The following versions of OpenFOAM are available on OSC clusters:

Version  Oakley  Ruby  Owens
2.2.2            X
2.3.0    X       X
2.4.0    X       X
3.0.0    X       X

Feel free to contact OSC Help if you need other versions for your work.

Access 

OpenFOAM is available to all OSC users without restriction.

Basic Structure for an OpenFOAM Case

The basic directory structure for an OpenFOAM case is shown below:

/home/yourusername/OpenFOAM_case
|-- 0
|   |-- U
|   |-- epsilon
|   |-- k
|   |-- p
|   `-- nut
|-- constant
|   |-- RASProperties
|   |-- polyMesh
|   |   |-- blockMeshDict
|   |   `-- boundary
|   |-- transportProperties
|   `-- turbulenceProperties
`-- system
    |-- controlDict
    |-- fvSchemes
    |-- fvSolution
    `-- snappyHexMeshDict

IMPORTANT: To run in parallel, you also need to create a decomposeParDict file in the system directory. If you do not create this file, the decomposePar command will fail.
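
A minimal decomposeParDict can be written from the shell before the job is submitted. The sketch below is only an illustration: the numberOfSubdomains value and the simple decomposition coefficients are placeholders and must be adjusted so that numberOfSubdomains equals the total number of cores the job requests (nodes x ppn).

cat > system/decomposeParDict <<'EOF'
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains 16;     // must equal the total core count, e.g. nodes=2:ppn=8

method          simple;    // simple geometric decomposition; other methods such as scotch also exist

simpleCoeffs
{
    n           (4 2 2);   // 4 x 2 x 2 = 16 subdomains
    delta       0.001;
}
EOF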

Usage

Usage on Oakley

Setup on Oakley

To configure the Oakley cluster for OpenFOAM 1.7.x, use the following command:

. /usr/local/OpenFOAM/OpenFOAM-1.7.x/etc/bashrc

To configure the Oakley cluster for OpenFOAM 2.1.0, use the following command:

. /usr/local/OpenFOAM/OpenFOAM-2.1.0/etc/bashrc

To configure the Oakley cluster for OpenFOAM 2.3.0, use the following command:

module load openfoam/2.3.0 # permissible with default Intel and MVAPICH2 compilers

To configure the Oakley cluster for OpenFOAM 3.0.0, use the following commands:

module load openmpi/1.10 # currently only 3.0.0 is installed using OpenMPI libraries
module load openfoam/3.0.0
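
To verify that the environment is set up, you can check that the OpenFOAM executables are on your PATH and that the version variable is defined. This is only a quick sanity check and assumes the module sets the standard OpenFOAM environment variables:

which blockMesh icoFoam     # should resolve to the OpenFOAM installation
echo $WM_PROJECT_VERSION    # prints the loaded OpenFOAM version, e.g. 3.0.0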

Batch Usage on Oakley

Batch jobs can request multiple nodes/cores and compute time up to the limits of the OSC systems.

On Oakley, refer to Queues and Reservations for Oakley and Scheduling Policies and Limits for more info. 

Interactive Batch Session

For an interactive batch session on Oakley, one can run the following command:

qsub -I -l nodes=1:ppn=12 -l walltime=1:00:00

which requests 12 cores (-l nodes=1:ppn=12) for 1 hour (-l walltime=1:00:00). You may adjust these numbers to suit your needs.
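
Once the interactive session starts, you can load OpenFOAM and walk through the case steps by hand, which is useful for debugging a new case. A minimal sketch, assuming the default openfoam module and the example case directory shown above:

module load openfoam
cd ~/OpenFOAM_case     # case directory containing 0, constant and system
blockMesh              # mesh the geometry
icoFoam                # run the solver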

Non-interactive Batch Job (Serial Run)

A batch script can be created and submitted for a serial or parallel run. You can create the batch script with any text editor in a working directory on the system of your choice. Below is an example batch script (job.txt) for a serial run; note that it uses the default installation of OpenFOAM 2.3.0 on Oakley:

#PBS -N serial_OpenFOAM 
#PBS -l nodes=1:ppn=1 
#PBS -l walltime=24:00:00 
#PBS -j oe 
#PBS -S /bin/bash 
# Initialize OpenFOAM on Oakley Cluster
# This only works if you are using default modules
module load openfoam
# Move to the case directory, where the 0, constant and system directories reside
cd $PBS_O_WORKDIR
# Copy files to $TMPDIR and move there to execute the program (-r is needed to copy the case directories)
cp -r * $TMPDIR
cd $TMPDIR
# Mesh the geometry
blockMesh
# Run the solver
icoFoam
# Finally, copy the results back to the directory the job was submitted from
cp -r * $PBS_O_WORKDIR

To run it via the batch system, submit the job.txt file with the following command:

qsub job.txt
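
After submission, the job can be monitored with the standard PBS commands, for example:

qstat -u $USER     # list your queued and running jobs
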
Non-interactive Batch Job (Parallel Run)

Below is an example batch script (job.txt) for a parallel run:

#PBS -N parallel_OpenFOAM
#PBS -l nodes=2:ppn=8
#PBS -l walltime=6:00:00
#PBS -j oe
#PBS -S /bin/bash 
# Initialize OpenFOAM on Oakley Cluster
# This only works if you are using default modules
module load openfoam
# Move to the case directory, where the 0, constant and system directories reside
cd $PBS_O_WORKDIR
# Mesh the geometry
blockMesh
# Decompose the mesh for parallel run
decomposePar
# Run the solver
mpiexec simpleFoam -parallel
# Reconstruct the parallel results
reconstructPar
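
As with the serial case, submit the script with qsub job.txt. decomposePar writes each subdomain to its own processorN directory inside the case (the count is set by numberOfSubdomains in decomposeParDict), and reconstructPar merges the results back into the usual time directories. A quick way to inspect the decomposition afterwards:

ls -d processor*     # e.g. processor0 ... processor15 for a 16-way decomposition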

Usage on Ruby

Setup on Ruby

To configure the Ruby cluster for OpenFOAM 2.3.0, use the following command:

module load openfoam/2.3.0 # permissible with default Intel and MVAPICH2 compilers

To configure the Ruby cluster for OpenFOAM 3.0.0, use the following commands:

module load openmpi/1.10 # currently only 3.0.0 is installed using OpenMPI libraries
module load openfoam/3.0.0

Batch Usage on Ruby

Batch jobs can request multiple nodes/cores and compute time up to the limits of the OSC systems.

On Ruby, refer to Queues and Reservations for Ruby and Scheduling Policies and Limits for more info. 

Interactive Batch Session

For an interactive batch session on Ruby, one can run the following command:

qsub -I -l nodes=1:ppn=20 -l walltime=1:00:00

which requests 20 cores (-l nodes=1:ppn=20) for 1 hour (-l walltime=1:00:00). You may adjust these numbers to suit your needs.

Non-interactive Batch Job (Serial Run)

A batch script can be created and submitted for a serial or parallel run. You can create the batch script with any text editor in a working directory on the system of your choice. Below is an example batch script (job.txt) for a serial run:

#PBS -N serial_OpenFOAM 
#PBS -l nodes=1:ppn=1 
#PBS -l walltime=24:00:00 
#PBS -j oe 
#PBS -S /bin/bash 
# Initialize OpenFOAM on Ruby Cluster
# This only works if you are using default modules
module load openfoam
# Move to the case directory, where the 0, constant and system directories reside
cd $PBS_O_WORKDIR
# Copy files to $TMPDIR and move there to execute the program (-r is needed to copy the case directories)
cp -r * $TMPDIR
cd $TMPDIR
# Mesh the geometry
blockMesh
# Run the solver
icoFoam
# Finally, copy the results back to the directory the job was submitted from
cp -r * $PBS_O_WORKDIR

To run it via the batch system, submit the job.txt file with the following command:

qsub job.txt
Non-interactive Batch Job (Parallel Run)

Below is an example batch script (job.txt) for a parallel run:

#PBS -N parallel_OpenFOAM
#PBS -l nodes=2:ppn=8
#PBS -l walltime=6:00:00
#PBS -j oe
#PBS -S /bin/bash 
# Initialize OpenFOAM on Ruby Cluster
# This only works if you are using default modules
module load openfoam
# Move to the case directory, where the 0, constant and system directories reside
cd $PBS_O_WORKDIR
# Mesh the geometry
blockMesh
# Decompose the mesh for parallel run
decomposePar
# Run the solver
mpiexec simpleFoam -parallel 
# Reconstruct the parallel results
reconstructPar

Further Reading

The OpenFOAM home page
