WARP3D

From WARP3D's webpage:

WARP3D is under continuing development as a research code for the solution of large-scale, 3-D solid models subjected to static and dynamic loads. The capabilities of the code focus on fatigue & fracture analyses primarily in metals. WARP3D runs on laptops-to-supercomputers and can analyze models with several million nodes and elements.

Availability and Restrictions

Versions

The following versions of WARP3D are available on OSC clusters:

Version    Oakley    Owens
17.5.3     X
17.7.1               X
17.7.4     X         X


Feel free to contact OSC Help if you need other versions for your work.

Access 

WARP3D is available to all OSC users without restriction.

Usage

Usage on Owens

Setup on Owens

To configure the Owens cluster for the use of WARP3D, use the following commands:

module load intel
module load intelmpi
module load warp3d
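
If your work requires one of the specific versions listed above, you can load it explicitly rather than taking the default. This is a minimal sketch assuming the modules follow the usual name/version convention on OSC's Lmod-based clusters:

module spider warp3d        # list the WARP3D versions installed (Lmod)
module load warp3d/17.7.4   # request a specific version instead of the default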

Batch Usage on Owens

Batch jobs can request multiple nodes/cores and compute time up to the limits of the OSC systems. Refer to Queues and Reservations for Owens and to Scheduling Policies and Limits for more information.

Running WARP3D

Below is an example batch script (job.txt) for using WARP3D:

#PBS -N WARP3D
#PBS -l nodes=1:ppn=28
#PBS -l walltime=30:00
#PBS -j oe 
#PBS -S /bin/bash 
# Load the modules for WARP3D
module load intel
module load intelmpi
module load warp3d
# Copy files to $TMPDIR and move there to execute the program
cp $WARP3D_HOME/example_problems_for_READMEs/mt_cohes_*.inp $TMPDIR
cd $TMPDIR
# Run the solver using 4 MPI tasks and 6 threads per MPI task 
$WARP3D_HOME/warp3d_script_linux_hybrid 4 6 < mt_cohes_4_cpu.inp
# Finally, copy files back to your home directory 
cp -r * $PBS_O_WORKDIR
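
The script above hard-codes 4 MPI ranks with 6 threads per rank (24 cores) on a 28-core node. As a sketch only, assuming the standard Torque/PBS environment in which $PBS_NODEFILE contains one line per allocated core, the thread count can instead be derived from the allocation so that ranks times threads matches the cores the job was granted:

# Sketch (not part of the original example): size the hybrid run from the allocation.
NP=4                                # number of MPI ranks
NCORES=$(wc -l < $PBS_NODEFILE)     # total cores granted to the job
NT=$(( NCORES / NP ))               # threads per MPI rank
$WARP3D_HOME/warp3d_script_linux_hybrid $NP $NT < mt_cohes_4_cpu.inp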

To run it via the batch system, submit the job.txt file with the following command:

qsub job.txt
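
Once submitted, the standard Torque/PBS commands can be used to monitor or cancel the job, for example:

qstat -u $USER    # show the status of your queued and running jobs
qdel <jobid>      # cancel the job if needed (replace <jobid> with the ID printed by qsub)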

Further Reading

http://www.warp3d.net

https://github.com/rhdodds/warp3d
