Software Refresh - May 2020

OSC will refresh the software stack for Owens and Pitzer on May 19, 2020, during a system-wide downtime. During the software refresh, some default versions will be changed to be more up to date. Information about the new default versions, as well as all available versions of each software package, will be included on the corresponding OSC software webpage. See https://www.osc.edu/supercomputing/software-list.

Summary of Changes

  • New versions of the compilers and MPI have been installed and will become the defaults. Libraries will be rebuilt with the new default compilers and MPI. The newest stable version of each library will be used in most cases.
  • For software applications, the latest installed version will become the default in most cases, but the current default version will be kept and can be requested explicitly. New versions of some applications will be installed.
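
For example, you can list every installed version of a package and request an older one explicitly (a minimal sketch; module spider is an Lmod command, and the gromacs versions shown are taken from the applications table below):

    module spider gromacs        # list all installed versions of a package
    module load gromacs/2018.2   # request the previous default instead of the new one (2020.2)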

Impact on User-Built Software

If you compile and link your own software, you need to be particularly aware of the changes in the default modules. You will probably need to either rebuild your software or explicitly load the compiler and MPI modules your code was built with before running it.

After the refresh on May 19, 2020, you can load the previous default environment at login on Owens or Pitzer with the command module load modules/au2018. This environment was the default environment at login on Owens and Pitzer until 5/19/2020. If your code was built with a compiler other than the Intel compiler, you can explicitly load the old default module with the command module load name/version. Please refer to Compilers/MPI below or the corresponding OSC software webpage (see https://www.osc.edu/supercomputing/software-list) for more information.
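
For example (module names are taken from this page's text and tables; substitute the compiler/MPI pair your code was actually built with):

    # restore the pre-refresh default environment at login
    module load modules/au2018

    # or explicitly load an older compiler and MPI pair by full name/version
    module load intel/18.0.3
    module load mvapich2/2.3.2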

Known issues and changes

intelmpi/2019.3: MPI-IO issues on home directories

Certain MPI-IO operations with intelmpi/2019.3 may crash, fail, or proceed with errors on home directories. We do not expect the same issue on our GPFS file systems, such as the project and scratch spaces. The problem may be related to a known issue reported by the HDF5 group. Please read the section "Problem Reading A Collectively Written Dataset in Parallel" from HDF5 Known Issues for more detail.
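
One practical workaround is to run affected MPI-IO workloads from GPFS rather than your home directory; a minimal sketch (the scratch path is illustrative, and my_mpiio_app is a placeholder):

    # run from project or scratch space (GPFS) rather than $HOME
    cd /fs/scratch/<your-project>
    mpiexec ./my_mpiio_app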

intelmpi/2019.5: MPI-IO issues on GPFS file system

MPI-IO routines with intelmpi/2019.5 on our GPFS file systems may fail due to a known issue in Intel MPI. As a workaround, you can set the environment variable I_MPI_EXTRA_FILESYSTEM=0, or simply use intelmpi/2019.3, which is our new default version. Please read the section "Known Issues and Limitations, Intel MPI Library 2019 Update 5" from Intel MPI Known Issues for more detail.
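
For example, in a job script (a minimal sketch; my_mpiio_app is a placeholder):

    module load intelmpi/2019.5
    export I_MPI_EXTRA_FILESYSTEM=0   # workaround for the GPFS MPI-IO issue
    mpiexec ./my_mpiio_app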

pgi/20.1: LLVM back-end for code generation as default

PGI compilers later than version 19.1 use an LLVM-based back-end for code generation. OSC's previous default PGI compiler, pgi/18.4, used a non-LLVM back-end. For more detail, please read our PGI compiler page.

pgi/20.1: disabling memory registration

You may see a warning message when you run an MPI job with pgi/20.1 and mvapich2/2.3.3:

    WARNING: Error in initializing MVAPICH2 ptmalloc library. Continuing without InfiniBand registration cache support.

Please read about the impact of disabling memory registration cache on application performance in the MVAPICH2 2.3.3 user guide.
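
The run itself continues: MVAPICH2 simply proceeds without the registration cache, which can reduce communication performance. A minimal sketch of a run that may trigger the warning (my_app is a placeholder):

    module load pgi/20.1 mvapich2/2.3.3
    mpiexec ./my_app   # may print the ptmalloc warning above; the job continues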

Details

Compilers/MPI

The following table gives details about the default versions of compilers and MPI implementations. The versions refer to actual module names, except where otherwise noted.

  Software   Old default   New default   Notes
  intel      18.0.3        19.0.5
  gnu        7.3.0         9.1.0
  pgi        18.4          20.1
  mvapich2   2.3.2         2.3.3         available with the intel, gnu, and pgi compilers
  intelmpi   2018.3        2019.3        Intel compiler only
  openmpi    3.1.0-hpcx    4.0.3-hpcx    Intel and gnu compilers
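
If you rebuild your own code, load the compiler and MPI modules first; a minimal sketch using the new defaults from the table above (my_app.c is a placeholder):

    module load intel/19.0.5 mvapich2/2.3.3
    mpicc -O2 -o my_app my_app.c   # mpicc is the MVAPICH2 compiler wrapper for the loaded compiler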

Libraries

The following libraries will be built for the new default compiler/MPI versions.

  Software    Old default   New default   Notes
  boost       1.67.0        1.72.0
  fftw3       3.3.8         3.3.8
  hdf5        1.10.2        1.12.0        serial & parallel; the new version, 1.12.0, has an API
                                          compatibility issue (please read this page for more
                                          detail, and see the sketch after this table)
  metis       5.1.0         5.1.0
  mkl         2018.0.3      2019.0.5      modules only; not rebuilt
  netcdf      4.6.1         4.7.4         serial & parallel, with C, Fortran, and C++ interfaces
  parmetis    4.0.3         4.0.3
  scalapack   2.0.2         2.1.0
  ncarg       6.5.0         6.6.2
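
If your code still uses APIs that changed in HDF5 1.12, one common option is to build against the 1.10-compatible interface that the 1.12 headers can expose; a minimal sketch, assuming the hdf5/1.12.0 module provides the h5cc compiler wrapper (h5_app.c is a placeholder):

    module load hdf5/1.12.0
    h5cc -DH5_USE_110_API -o h5_app h5_app.c   # H5_USE_110_API maps calls to 1.10-style signatures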

Software/module to be removed

  Software/module   Versions   Notes
  lapack            3.8.0      Owens and Pitzer; we recommend using mkl instead
                               (see the sketch after this table)
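
For codes that previously linked the lapack module, MKL provides the same BLAS/LAPACK routines; a minimal sketch with the Intel 19 compiler (my_solver.c is a placeholder; link lines differ for other compilers):

    module load intel/19.0.5
    icc -o my_solver my_solver.c -mkl   # -mkl links against MKL's BLAS/LAPACK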

Applications

The following table gives details about the upcoming changes to software applications. All software names and version numbers refer to the actual module names.

  Software                 Old default    New default    Notes
  amber                    18             19             20 coming
  darshan                  3.1.6          3.1.8
  espresso                 6.3            6.5
  gromacs                  2018.2         2020.2
  lammps                   22Aug18        3Mar20
  mpp-dyna                 971_d_10.1.0   971_s_11.0.0   Owens only
  namd                     2.12           2.13
  nwchem                   6.8            7.0.0
  openfoam                 5.0            7.0
  bedtools                 2.25.0         2.29.2         Owens only
  rosetta                  3.10           3.12
  abaqus                   2018           2020           Owens only
  arm-ddt/arm-map/arm-pr   18.2.1         20.0.3
  bowtie2                  2.2.9          2.4.1
  cmake                    3.11.4         3.17.2
  comsol                   53a            5.5            Owens only
  cuda                     9.2.88         10.2.89        see the software page for GNU compiler support
  desmond                  2018.2         2019.1         Owens only
  gatk                     3.5            4.1.2.0
  gaussian                 g16a03         g16c01
  hyperworks               2017.1         2019.2         Owens only
  ls-dyna                  971_s_9.0.1    971_s_11.0.0   Owens only
  matlab                   r2018b         r2020a
  paraview                 5.5.2          5.8.0
  samtools                 1.3.1          1.10
  schrodinger              2018.3         2020.1         Owens only
  sratoolkit               2.9.0          2.9.6
  starccm                  13.02.011      15.02.007      Owens only
  vcftools                 0.1.14         0.1.16
