Ollama
Ollama is an open-source inference server for large language models (LLMs). This module also includes Open-WebUI, which provides an easy-to-use web interface.
Star-CCM+ can be loaded with the following modules on Cardinal:
module load starccm/19.06.009
module load starccm/19.06.009-mixed
module load starccm/19.06.009-hbm
module load starccm/19.06.009-mixed-hbm
Ruby is a dynamic, open source programming language with a focus on simplicity and productivity.
The following versions of Ruby are available on OSC clusters:
bedtools2 2.31.0 is now available on Cardinal via module load bedtools2/2.31.0
The following components and versions are available on Cardinal:
osc-seff is a command developed at OSC for use on OSC's systems; it combines the CPU resource data of the seff command with the GPU resource data of gpu-seff.
gpu-seff is a command developed at OSC for use on OSC's systems; it reports GPU resource data, analogous to the CPU resource data that the seff command reports for CPUs.
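As a sketch of typical usage, assuming both commands follow the seff convention of taking a completed Slurm job ID as their argument (the job ID below is a placeholder, not a real job):

```shell
# Combined CPU and GPU efficiency report for a finished job
# (123456 is a placeholder job ID; substitute one of your own)
osc-seff 123456

# GPU-only efficiency report for the same job
gpu-seff 123456
```

Like seff, these tools report on jobs that have already completed, so run them after the job finishes rather than while it is still in the queue.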