ComfyUI

ComfyUI is an open-source inference server and web frontend for generative AI models, with a focus on image, video and audio generation.  Its node-based workflow offers highly flexible routing and model composition for complex generation pipelines.

ComfyUI is in an early user-testing phase, so not all functionality is guaranteed to work.  Contact oschelp@osc.edu with any questions.
ComfyUI is not currently suitable for use with protected or sensitive data; do not use it if you need the protected data service. See https://www.osc.edu/resources/protected_data_service for more details.

Availability and Restrictions

Versions

ComfyUI is available on OSC clusters. The versions currently available at OSC are:

Version    Ascend
20260121   X


You can use module spider comfyui to view available modules for a given machine.

Access

All OSC users may use ComfyUI, but individual models may have their own license restrictions.

Publisher/Vendor/Repository and License Type

GPL-3: https://github.com/comfy-org/ComfyUI

Prerequisites

  • GPU Usage: ComfyUI should be run with a GPU for best performance. 

Due to the need for GPUs, we recommend not running ComfyUI on login nodes or on OnDemand lightweight desktops.  The current best option is to run on a GPU-enabled OnDemand desktop with a minimum of 16 cores.

Running ComfyUI Overview

1. Load module

2. Start ComfyUI


Commands

ComfyUI is available through the module system and must be loaded prior to running any of the commands below:

Loading the ComfyUI module:
module load comfyui/20260121

Starting ComfyUI:
comfyui_start

On service start, a port number will be displayed.  Take note of this number, as you will need it if you close your browser. 

COMFYUI_PORT: 61234

This port number is only an example - your port number will differ from the one above.

The COMFYUI_PORT environment variable will be used to define the web service port.

A browser will open automatically after a minute or so.
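If you closed the browser, you can rebuild the local URL from the port number the service printed.  A minimal sketch follows; the fallback port value is illustrative only, since the real value is set by comfyui_start:

```shell
# COMFYUI_PORT is set when the service starts; the fallback here is only an
# example value for illustration.
COMFYUI_PORT="${COMFYUI_PORT:-61234}"
url="http://localhost:${COMFYUI_PORT}"
echo "Reopen ComfyUI at: $url"
```

Point a browser in the same desktop session at that URL to get back to the web frontend.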

Stopping ComfyUI:

ComfyUI can be manually stopped with the following command:

comfyui_stop

The service is also killed upon module unload, so to stop it you can simply unload the ComfyUI module:

module unload comfyui

Model Management

By default, ComfyUI uses three directories under your $HOME directory to store its files. These locations are set as environment variables defined upon module load:

COMFYUI_RUNDIR           includes the ComfyUI application and environment files
COMFYUI_BASEDIR          includes user-specific models, workflows, input and output files, and custom nodes
COMFYUI_USERSCRIPTS_DIR  includes user scripts for advanced configuration
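To see where these directories point on your account, you can print the variables after loading the module; the "unset" fallback below is only a placeholder shown when the module is not loaded:

```shell
# Print the three ComfyUI directory variables (set on module load;
# "unset" appears only if the module has not been loaded).
echo "COMFYUI_RUNDIR:          ${COMFYUI_RUNDIR:-unset}"
echo "COMFYUI_BASEDIR:         ${COMFYUI_BASEDIR:-unset}"
echo "COMFYUI_USERSCRIPTS_DIR: ${COMFYUI_USERSCRIPTS_DIR:-unset}"
```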

Installing a model:

Downloading large models can exceed your disk space quota.  Check model sizes before downloading!

Most model checkpoints are stored in $COMFYUI_BASEDIR/models/checkpoints.  You can manually download checkpoints there or use custom nodes to download and move the files for you.

$COMFYUI_BASEDIR/models/vae is where you can place your VAE model files.
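As a sketch, placing a manually downloaded checkpoint looks like the following.  The model filename is hypothetical, and in practice $COMFYUI_BASEDIR is set by the module; the $HOME fallback here is only for illustration:

```shell
# In practice COMFYUI_BASEDIR is set by the module; the fallback path
# below is an illustrative placeholder.
base="${COMFYUI_BASEDIR:-$HOME/comfyui}"
mkdir -p "$base/models/checkpoints" "$base/models/vae"
# Hypothetical checkpoint downloaded to the current directory:
# mv some-model.safetensors "$base/models/checkpoints/"
ls "$base/models"
```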


Some models require licensing agreements or are otherwise restricted and require a Hugging Face account and login.  With the ComfyUI module loaded, use the hf command-line tool to log in:

hf auth login

You will need your Hugging Face token.  For more details, see https://huggingface.co/docs/huggingface_hub/en/guides/cli.

Deleting a Model:

Models can be deleted manually by removing the target files under $COMFYUI_BASEDIR/models.

Consider removing unused models periodically to free disk space.
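A sketch of that cleanup step; the checkpoint filename is hypothetical, and the $HOME fallback stands in for the module-provided $COMFYUI_BASEDIR:

```shell
# In practice COMFYUI_BASEDIR is set by the module.
base="${COMFYUI_BASEDIR:-$HOME/comfyui}"
# See what is using space first:
du -sh "$base/models" 2>/dev/null
# Remove a specific unused checkpoint (hypothetical filename):
rm -f "$base/models/checkpoints/old-unused-model.safetensors"
```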

Custom Nodes

Beyond model installation, you may optionally install community extensions called 'custom nodes' to provide additional functionality.  These extensions are not vetted and may present a security risk.  Use them at your own discretion.

See installation instructions here: https://docs.comfy.org/installation/install_custom_node

Supercomputer: Ascend