LS-DYNA is a general purpose finite element code for simulating complex structural problems, specializing in nonlinear, transient dynamic problems using explicit integration. LS-DYNA is one of the codes developed at Livermore Software Technology Corporation (LSTC).
LS-DYNA is available to OSC users who have completed the appropriate academic license agreement form, found here.
Serial (smp, for single node jobs) and parallel (mpp, for multiple node jobs) versions of the LS-DYNA solvers are installed at OSC. You may check the available versions on Oakley using

module avail dyna

(On Glenn, use

module avail lsdyna

for smp solvers and

module avail mpp

for mpp solvers.) In the module name, '_s' indicates single precision and '_d' indicates double precision. For example, mpp971_d_R5 is the double precision mpp solver on Glenn. The following versions are currently available on OSC cluster systems (including smp/mpp, single/double precision):
LS-DYNA can be used via the batch system, by requesting the computing resources and launching solvers (smp/mpp) in a script file. See the following example.
To use LS-DYNA via the batch system, here's what you need to do:
1) copy your input files (explorer.k in the example below) to your work directory at OSC
2) create a batch script, similar to the following file, saved as 'test.job':
#PBS -N plate_test
#PBS -l walltime=5:00:00
#PBS -l nodes=2:ppn=8
#PBS -j oe
#PBS -S /bin/csh
# The following lines set up the LSDYNA environment
module unload mpi
module load mpp971_d_R5
#
# Move to the directory where the input files are located
#
cd $PBS_O_WORKDIR
#
# Run LSDYNA (number of cpus > 1)
#
mpiexec mpp971 I=explorer.k NCPU=16
This example script uses the mpp solver for a parallel job (nodes>1) on Glenn. For a parallel job on Oakley, use

module swap mvapich2/1.7 intelmpi

in place of

module unload mpi

in the above.
For a serial (smp) job on a single node, the script would instead include:

#PBS -l nodes=1:ppn=8
...
module load lsdyna971_d_R5
...
lsdyna I=explorer.k NCPU=8
3) submit the script to the batch queue with the command:

qsub test.job

When the job is finished, all the result files will be found in the directory where you submitted your job ($PBS_O_WORKDIR).
Alternatively, you can run your job in the temporary directory ($TMPDIR), which is faster for the system to access and might be beneficial for bigger jobs. Note that $TMPDIR is uniquely associated with the job submitted and will be cleared when the job ends, so you need to copy your results back to your work directory at the end of your script. An example script should include the following lines:
...
cd $TMPDIR
cp $PBS_O_WORKDIR/explorer.k .
... # launch the solver and execute
pbsdcp -g '*' $PBS_O_WORKDIR
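Filled out, the $TMPDIR pattern amounts to staging inputs in, running the solver, and copying everything back before the job ends. The following is a minimal sketch that uses local stand-in directories (and a placeholder result file) so the copy logic can be exercised outside the batch system; the solver invocation itself is elided since it depends on your module and input deck:

```shell
#!/bin/sh
# Sketch of the $TMPDIR staging pattern, using local stand-in
# directories so it can be run outside the batch system.
# In a real job, workdir is $PBS_O_WORKDIR and scratch is $TMPDIR.
set -e
workdir="$PWD/demo_workdir"    # stands in for $PBS_O_WORKDIR
scratch="$PWD/demo_scratch"    # stands in for $TMPDIR
mkdir -p "$workdir" "$scratch"
echo "*KEYWORD" > "$workdir/explorer.k"   # placeholder input deck

cd "$scratch"
cp "$workdir/explorer.k" .
# ... the solver would run here, e.g. mpiexec mpp971 I=explorer.k NCPU=16,
# writing its result files into the scratch directory ...
echo "placeholder result" > d3plot        # placeholder result file

# Copy everything back before the job ends and $TMPDIR is cleared.
# (In a real multi-node job, pbsdcp -g '*' $PBS_O_WORKDIR gathers
# files from the scratch space of every node.)
cp -- * "$workdir"
ls "$workdir"
```

In a real batch script the mkdir and placeholder lines disappear, since the scheduler creates $TMPDIR for you and the solver produces the result files.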
LS-PrePost is an advanced pre- and post-processor that is delivered free with LS-DYNA. The user interface is designed to be both efficient and intuitive. LS-PrePost runs on Windows, Linux, and Unix, utilizing OpenGL graphics to achieve fast rendering and XY plotting. The latest builds can be downloaded from LSTC's FTP Site.
The preferred way of accessing LS-PrePost is through OnDemand's Glenn desktop application. This gives you a preconfigured environment with GPU acceleration enabled.
LS-PrePost is available on both the Glenn and Oakley clusters. A module was created for LS-PrePost 4.0 on Oakley and can be loaded with 'module load lspp'. All other versions can be loaded with the corresponding dyna modules.
Below are instructions for running LS-PrePost on Oakley through Glenn's OnDemand desktop interface with GPU acceleration enabled. To run LS-PrePost with the slower X-tunneling procedure instead, see the instructions further below.
1) Log in to OnDemand with your HPC username and password.
2) Launch the Glenn Desktop from "Apps" menu.
3) Open a Terminal window (Applications > Accessories > Terminal)
4) Type the following command to connect to Oakley:

ssh -X username@oakley.osc.edu

* Where "username" is your username.
5) Once logged in to Oakley, submit an interactive job with the following command:
qsub -X -I -l nodes=1:ppn=12:gpus=2:vis -l walltime=hh:mm:ss
* pick a walltime that is close to the amount of time you will need for working in the GUI application.
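The walltime must be given in hh:mm:ss form. As a small, hypothetical convenience (not part of the OSC environment), a shell function can convert a number of minutes into that form:

```shell
#!/bin/sh
# Hypothetical helper: convert a whole number of minutes into the
# hh:mm:ss form expected by qsub's -l walltime option.
walltime() {
    printf '%02d:%02d:00\n' $(( $1 / 60 )) $(( $1 % 60 ))
}

walltime 90     # prints: 01:30:00
# Usage with the qsub line above would look like:
#   qsub -X -I -l nodes=1:ppn=12:gpus=2:vis -l walltime=$(walltime 90)
```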
6) Once your job starts, make a note of the hostname of the compute node your job is running on. You can find this information by typing the following command:

hostname
7) Open another Terminal window, and type the following commands:
module load VirtualGL
vglconnect username@job-hostname
* job-hostname is the hostname you found in step 6.
You'll be asked for a password to connect, which should be your HPC password.
8) Now, you should be connected to your running job's GPU node. Run the following commands to launch LS-PrePost version 4.0:
export LD_LIBRARY_PATH=/usr/local/MATLAB/R2013a/sys/opengl/lib/glnxa64
/usr/local/lstc/ls-prepost/lsprepost4.0_centos6/lspp4
At startup LS-PrePost displays a graphical interface for model generation and results post-processing.
The following procedure results in a much slower GUI experience, but may be useful if the above instructions are not working. It can be done entirely from the command line, with no need to log in to OnDemand. You may need to edit your terminal settings to enable X11 forwarding.
1) Log in to Oakley or Glenn with X11 forwarding enabled:

ssh -X username@oakley.osc.edu
ssh -X username@glenn.osc.edu
2) Submit an interactive job
qsub -I -X -l nodes=1:ppn=12 -l walltime=hh:mm:ss
and wait for it to start. Use ppn=8 for Glenn.
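Since the core count per node differs between the clusters (ppn=12 on Oakley, ppn=8 on Glenn), a small hypothetical helper can pick the right value from the host's name; the hostname patterns below are assumptions for illustration:

```shell
#!/bin/sh
# Hypothetical helper: pick processors-per-node for the interactive
# job based on which cluster we are logged into. Oakley nodes have
# 12 cores and Glenn nodes have 8; hostname patterns are assumed.
ppn_for_host() {
    case "$1" in
        *oakley*) echo 12 ;;
        *glenn*)  echo 8 ;;
        *)        echo 1 ;;   # unknown host: fall back to one core
    esac
}

ppn=$(ppn_for_host "$(hostname)")
echo "qsub -I -X -l nodes=1:ppn=$ppn -l walltime=01:00:00"
```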
3) Load the LS-DYNA module and start the LS-PrePost application:

module load ls-dyna
lspp3
For LS-PrePost 4.0 on Oakley, use the following commands instead:

module load lspp
lspp4
An X11 window should pop up. If you get an error along the lines of:
Error: Unable to initialize gtk, is DISPLAY set properly?
Double check that you:
1) logged in with X11 forwarding enabled
2) have configured your X11 settings for your terminal
3) included the -X flag with the qsub command
4) have an X11 server running on your computer (Xming, Xorg, XQuartz, etc.)
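A quick way to check items 1 and 3 from inside your session is to see whether DISPLAY is set at all; the following is a minimal sketch of that check:

```shell
#!/bin/sh
# Minimal check of whether X11 forwarding appears active: if DISPLAY
# is empty, the gtk initialization error above is the expected result.
check_display() {
    if [ -n "$1" ]; then
        echo "DISPLAY is set to '$1'; X11 forwarding looks active"
    else
        echo "DISPLAY is empty; reconnect with ssh -X and resubmit with qsub -X"
    fi
}

check_display "$DISPLAY"
```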
Documentation for LS-PrePost may be obtained at: http://www.lstc.com/lspp/
This page describes how to specify user defined material to use within LS-DYNA. The user-defined subroutines in LS-DYNA allow the program to be customized for particular applications. In order to define user material, LS-DYNA must be recompiled.
LS-DYNA with user defined material models is only available on the Glenn Cluster.
The following versions on Glenn are available for use with user defined material models:
The first step to running a simulation with user defined material is to build a new executable. The following is an example done with solver version mpp971_s_7600.2.398.
When you log into the Glenn system, load mpp971_s_7600.2.398 with the command:
module load mpp971_s_7600
Next, copy the mpp971_s_7600 object files and Makefile to your current directory:
cp /usr/local/lstc/mpp971_s_7600.2.398/usermat/* $PWD
Next, update the dyn21.f file with your user defined material model subroutine. Please see the LS-DYNA User's Manual (Keyword version) for details regarding the format and structure of this file.
Once your user defined model is set up correctly in dyn21.f, build the new mpp971 executable with the command:

make
To execute a multi-processor (ppn > 1) run with your new executable, follow these steps:
1) move your input file to a directory on Glenn (pipe.k in the example below)
2) copy your newly created mpp971 executable to this directory as well
3) create a batch script (lstc_umat.job) like the following:
#PBS -N LSDYNA_umat
#PBS -l walltime=1:00:00
#PBS -l nodes=2:ppn=8
#PBS -j oe
#PBS -S /bin/csh
# This is the template batch script for running a pre-compiled
# MPP 971 v7600 LS-DYNA.
# Total number of processors is ( nodes x ppn )
#
# The following lines set up the LSDYNA environment
# on the P4 cluster (glenn.osc.edu)
module load mpp971_s_7600
#
# Move to the directory where the job was submitted from
# (i.e. PBS_O_WORKDIR = directory where you typed qsub)
#
cd $PBS_O_WORKDIR
#
# Run LSDYNA
# NOTE: you have to put in your input file name
#
mpiexec mpp971 I=pipe.k NCPU=16
4) Next, submit this job to the batch queue with the command:

qsub lstc_umat.job
The output result files will be saved to the directory you ran the qsub command from (known as $PBS_O_WORKDIR).
Online documentation is available on the LSTC website.