OpenMPI job stopped at 'There are not enough slots available in the system to satisfy the slots'

Users may encounter MPI job failures with openmpi/3.1.0-hpcx on Owens and Pitzer. The job stops with an error like "There are not enough slots available in the system to satisfy the slots". Please switch to openmpi/3.1.4-hpcx. The buggy version, openmpi/3.1.0-hpcx, will be removed on August 18, 2020.


Resolved: We removed openmpi/3.1.0-hpcx on August 18, 2020.

Error 'undefined reference to uuid@' with MVAPICH2 in Conda environments

Users may encounter an error like "undefined reference to `uuid_unparse@UUID_1.0'" while compiling MPI applications with mvapich2 in some Conda environments. We found that the pre-installed libuuid package from Conda conflicts with the system libuuid libraries. The affected Conda packages are python/2.7-conda5.2, python/3.6-conda5.2, and python/3.7-2019.10.
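A hedged way to confirm the conflict, assuming `conda` is on your PATH in the affected environment; removing the Conda copy so the linker falls back to the system library is a common workaround, not an OSC-documented fix (the `command -v` guard only keeps the sketch runnable where conda is absent):

```shell
# Check whether the active Conda environment ships its own libuuid.
if command -v conda >/dev/null 2>&1; then
    conda list libuuid
fi
# Possible workaround (removes only the Conda copy, not the system library):
#   conda remove --force libuuid
echo "libuuid check done"
```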

Incorrect MPI launcher and compiler wrappers with Conda environments python/2.7-conda5.2 and python/3.6-conda5.2

Users may encounter under-performing MPI jobs or failures when compiling MPI applications while using the system-provided Conda environments. We found that the pre-installed mpich2 package in some Conda environments overrides the default MPI path. The affected Conda packages are python/2.7-conda5.2 and python/3.6-conda5.2. If you experience these issues, please re-load the MPI module, e.g. module load mvapich2/2.3.2, after setting up your Conda environment.
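A hedged sketch of the ordering described above, assuming the Lmod `module` command is available (the guard only keeps the snippet runnable on systems without Lmod):

```shell
# Re-load the MPI module AFTER the Conda environment so the system MPI
# wrappers come first on PATH.
if command -v module >/dev/null 2>&1; then
    module load python/3.6-conda5.2   # one of the affected Conda environments
    module load mvapich2/2.3.2        # re-load so mvapich2 wins on PATH
    which mpicc mpiexec               # should point into the mvapich2 install
fi
echo "MPI path check done"
```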

Error when downloading SRA data on computing nodes

NCBI blocks connections from computing nodes because they sit behind a firewall. As a result, OSC users cannot use SRA tools to download data "on-the-fly" at runtime on computing nodes, e.g. 'fastq-dump -X 5 SRR390728'. Users must instead download SRA data on a login node with the command 'prefetch' before any sequence analysis. Please see the section 'Download SRA Data' in the SRA Toolkit software page for more detail.
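A hedged sketch of the two-step workflow; SRR390728 is the example accession from the text, and the guard keeps the sketch runnable where the SRA Toolkit is not installed:

```shell
# Step 1 (on a login node): fetch the data while network access is available.
# Step 2 (inside the batch job): fastq-dump reads the locally cached copy.
if command -v prefetch >/dev/null 2>&1; then
    prefetch SRR390728
    fastq-dump -X 5 SRR390728
fi
echo "SRA prefetch workflow sketched"
```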

Segmentation fault from openmpi/1.10-hpcx and 2.0-hpcx on Owens

We have found that recent MPI jobs using openmpi/1.10-hpcx and openmpi/2.0-hpcx on Owens may hang until the job is killed, or may complete but report a segmentation fault. Applications that may be affected when run with these OpenMPI versions include orca, openfoam, and lammps. OSC users can use other compatible versions instead, e.g. openmpi/1.10.7-hpcx and openmpi/2.1.6-hpcx. Please check the available versions on the software page.
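A hedged sketch of moving a job script to a compatible version, assuming the Lmod `module swap` command (the guard only keeps the snippet runnable where Lmod is absent):

```shell
# Swap an affected hpcx build for a compatible one before launching MPI.
if command -v module >/dev/null 2>&1; then
    module swap openmpi/2.0-hpcx openmpi/2.1.6-hpcx
fi
echo "openmpi version swapped"
```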