Known issues

Unresolved known issues

A known issue with an Unresolved Resolution state is an active problem under investigation; a temporary workaround may be available.

Resolved known issues

A known issue with a Resolved (workaround) Resolution state is an ongoing problem for which a permanent workaround is available; the workaround may involve using different software or hardware.

A known issue with a Resolved Resolution state has been corrected.

Known Issues

Each entry below lists its Title, Category, Resolution, Description, and Posted and Updated dates.
Title: Addressing CP2K 7.1 Memory Issues on Pitzer and Owens Clusters
Category: Owens, Pitzer
Resolution: Resolved (workaround)
Description: According to https://github.com/cp2k/cp2k/issues/1830 and user feedback, you may encounter Out-of-Memory (OOM) errors during long molecular... (see the sketch after this entry)
Posted: 1 week 3 days ago | Updated: 1 week 3 days ago
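The documented workaround sits behind the truncated description above, so the following is only a generic mitigation sketch for OOM failures in long CP2K runs, not OSC's confirmed fix: give each MPI rank more memory headroom by spreading the same rank count over more nodes. The module name, executable name, and resource numbers are assumptions.

    #!/bin/bash
    #SBATCH --job-name=cp2k_oom_mitigation
    #SBATCH --nodes=4                 # more nodes than strictly required...
    #SBATCH --ntasks-per-node=24      # ...with fewer ranks per node, so each rank gets more memory
    #SBATCH --time=12:00:00

    # Module and executable names are assumptions; check `module spider cp2k`.
    module load cp2k/7.1
    srun cp2k.popt -i input.inp -o output.out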
Title: Rolling reboots on all HPC systems starting Oct 31 2024
Category: Owens, Pitzer
Resolution: Resolved
Description: Updates on Nov 13 2024: Pitzer is completed. Updates...
Posted: 1 month 4 days ago | Updated: 2 weeks 3 days ago
Title: Handling full-node MPI warnings with MVAPICH 3.0
Category: Ascend, Cardinal
Resolution: Resolved (workaround)
Description: When running a full-node MPI job with MVAPICH 3.0, you may encounter the following warning message (a sketch of the triggering setup follows this entry):
    [][mvp_generate_implicit_cpu_mapping] WARNING: You appear to be running at full...
Posted: 1 month 1 day ago | Updated: 1 month 1 day ago
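The actual workaround is behind the truncated description, so this is only a hedged reconstruction of the situation that produces the warning: a job that places one MPI rank on every core of a node, where MVAPICH 3.0 then generates an implicit CPU mapping and reports it. The module name and core count are assumptions, and explicit binding through the scheduler is shown purely as a generic option, not as OSC's documented workaround.

    #!/bin/bash
    #SBATCH --nodes=1
    #SBATCH --ntasks-per-node=96      # illustrative full-node rank count; use your node type's core count
    #SBATCH --time=01:00:00

    # Assumed module name; check `module spider mvapich`.
    module load mvapich/3.0
    # Explicit core binding via Slurm is a generic way to control the CPU mapping yourself.
    srun --cpu-bind=cores ./my_mpi_app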
Title: HWloc warning: Failed with: intersection without inclusion
Category: Ascend, Cardinal
Resolution: Resolved (workaround)
Description: When running MPI+OpenMP hybrid code with the Intel Classic Compiler and MVAPICH 3.0, you may encounter the following warning message from hwloc (a sketch of a typical hybrid setup follows this entry): ...
Posted: 1 month 1 day ago | Updated: 1 month 1 day ago
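Both the warning text and the documented workaround are truncated above, so the block below only sketches a typical MPI+OpenMP hybrid job layout of the kind the entry describes; the module names, rank/thread split, and binary name are assumptions, not OSC's confirmed fix.

    #!/bin/bash
    #SBATCH --nodes=1
    #SBATCH --ntasks-per-node=4       # illustrative hybrid layout: 4 MPI ranks per node...
    #SBATCH --cpus-per-task=12        # ...each running 12 OpenMP threads
    #SBATCH --time=01:00:00

    # Assumed module names; check `module spider intel` and `module spider mvapich`.
    module load intel mvapich/3.0
    export OMP_NUM_THREADS=$SLURM_CPUS_PER_TASK
    srun ./my_hybrid_app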
Title: CP2K 6.1 Floating-point exception on Pitzer Cascade Lake (48-core) nodes
Category: Owens, Pitzer
Resolution: Resolved
Description: Program received signal SIGFPE: Floating-point exception - erroneous arithmetic operation. Backtrace for this error: ... This could be a bug in libxsmm 1.9.0, which was released on Mar 15... (see the sketch after this entry)
Posted: 1 month 1 week ago | Updated: 1 month 1 week ago
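The entry ties the failure to Pitzer's 48-core Cascade Lake nodes, so one generic way to keep affected CP2K 6.1 jobs running is to steer them onto the other node type. The feature name below is an assumption, not a confirmed OSC constraint name; check the Pitzer documentation or `sinfo` output for the actual node feature names.

    # Hypothetical sketch: constrain the job to Pitzer's 40-core (non-Cascade Lake) nodes.
    # "40core" is an assumed feature name.
    sbatch --constraint=40core cp2k_job.sh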
Title: STAR-CCM+ MPI job failure and workaround
Category: Cardinal, Software
Resolution: Resolved (workaround)
Description: STAR-CCM+ encounters errors when running MPI jobs with Intel MPI or OpenMPI, displaying the following message (a sketch follows this entry):
    ib_iface.c:1139 UCX ERROR Invalid active_speed on mlx5_0:1: 128
    ...
Posted: 1 month 2 weeks ago | Updated: 1 month 2 weeks ago
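The documented workaround is truncated above, so the block below is only a generic mitigation sometimes tried for UCX transport errors, not OSC's confirmed fix: forcing UCX onto TCP transports avoids the InfiniBand path that reports the invalid active_speed, usually at a performance cost. The STAR-CCM+ invocation and file name are illustrative.

    # Generic UCX mitigation sketch; not necessarily OSC's documented workaround.
    export UCX_TLS=tcp,self,sm
    starccm+ -batch -np "$SLURM_NTASKS" simulation.sim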
Title: HCOLL-related failures in OpenMPI applications
Category: Cardinal, Software
Resolution: Resolved (workaround)
Description: Several applications using OpenMPI, including HDF5, Boost, Rmpi, ORCA, and CP2K, may fail with errors such as (a sketch follows this entry):
    mca_coll_hcoll_module_enable() coll_hcol: mca_coll_hcoll_save_coll_handlers...
Posted: 1 month 2 weeks ago | Updated: 1 month 2 weeks ago
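The documented workaround is truncated above; a common general technique for HCOLL-related failures is to disable Open MPI's HCOLL collectives component so the stock collectives are used instead. This is offered as a widely used option, not as OSC's confirmed fix.

    # Disable the HCOLL collectives component for all Open MPI jobs in this shell:
    export OMPI_MCA_coll_hcoll_enable=0
    # or equivalently, per invocation:
    mpirun --mca coll_hcoll_enable 0 ./my_openmpi_app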
Title: GCC 13 compilation errors due to missing headers
Category: Cardinal, Pitzer, Software
Resolution: Resolved (workaround)
Description: Users may encounter the following errors when compiling a C++ program with GCC 13 (see the sketch after this entry):
    error: 'uint64_t' in namespace 'std' does not name a type
or
    error: 'std::...
Posted: 1 month 2 weeks ago | Updated: 1 month 2 weeks ago
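GCC 13's libstdc++ no longer pulls headers such as <cstdint> in transitively through other standard headers, so code that uses std::uint64_t or other fixed-width types without including <cstdint> itself stops compiling. The sketch below shows the usual fix of adding the explicit include; the module name and file name are assumptions.

    # Add the missing include to the affected source file, e.g. at the top of myfile.cpp:
    #     #include <cstdint>
    # then rebuild with GCC 13 (assumed module name; check `module spider gcc`):
    module load gcc/13.2.0
    g++ -std=c++17 -c myfile.cpp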
Title: Compatibility Issues with NumPy 2.0
Category: Software
Resolution: Resolved (workaround)
Description: The newly released NumPy 2.0 includes substantial internal changes, including the migration of code from C to C++. These modifications have led to significant issues... (see the sketch after this entry)
Posted: 4 months 2 days ago | Updated: 1 month 2 weeks ago
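The documented workaround is truncated above; a common general mitigation when downstream packages break against NumPy 2.0 is to pin the environment to the 1.x series until those packages are updated. The module and environment names below are illustrative assumptions.

    # Option 1: build a fresh conda environment pinned below NumPy 2.0
    module load miniconda3                     # assumed module name
    conda create -n numpy1-env python=3.11 "numpy<2"

    # Option 2: pin inside an existing pip-based environment
    pip install "numpy<2"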
Title: Login Issues with my.osc.edu
Category: Outage
Resolution: Resolved
Description: Update: this is fixed. Original post: We are aware that some users may be experiencing issues logging into ...
Posted: 2 months 2 weeks ago | Updated: 2 months 1 week ago
