# swarm-container

This repo builds a SwarmUI-ready container. It is built on top of the NVIDIA PyTorch images (`nvcr.io/nvidia/pytorch`).
## Requirements

- A Blackwell GPU:
  - RTX 50-series
  - RTX Pro 6000
  - RTX Pro 5000
- Docker or Podman
## Getting Started

The image is available on Docker Hub, so all you need is a local clone of the SwarmUI repo.
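If you don't have the repo yet, a clone along these lines should work (the `mcmonkeyprojects/SwarmUI` GitHub location is an assumption; verify it against the official SwarmUI docs):

```shell
# Clone SwarmUI to the path you will later mount into the container.
# Repo URL is an assumption -- confirm before use.
git clone https://github.com/mcmonkeyprojects/SwarmUI.git /path/to/SwarmUI
```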
Replace `/path/to/SwarmUI` with the path where you cloned SwarmUI and run one of the following:
### All model paths as default

```shell
docker run --gpus all --rm -it --shm-size=512m --name swarmui \
  -p 7801:7801 \
  -v /path/to/SwarmUI:/workspace \
  jtreminio/swarmui:latest
```
Then navigate to http://localhost:7801/.
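If you want to confirm the container is serving before opening a browser, a quick poll (assuming `curl` is installed on the host) might look like:

```shell
# Poll until SwarmUI answers on port 7801, giving up after ~60 seconds.
for i in $(seq 1 30); do
  curl -sf http://localhost:7801/ > /dev/null && { echo "SwarmUI is up"; break; }
  sleep 2
done
```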
### Define different model and config paths

```shell
docker run --gpus all --rm -it --shm-size=512m --name swarmui \
  -p 7801:7801 \
  -v /path/to/SwarmUI:/workspace \
  -v /path/to/local/output_directory:/workspace/Output \
  -v /path/to/local/wildcard_directory:/workspace/Data/Wildcards \
  jtreminio/swarmui:latest
```
Then navigate to http://localhost:7801/.
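Since Podman is listed in the requirements, a rough equivalent would be the following. Note this is a sketch: Podman exposes NVIDIA GPUs via CDI rather than `--gpus`, so it assumes the NVIDIA Container Toolkit has generated a CDI spec on the host.

```shell
# Podman variant (assumes CDI spec exists, e.g. via `nvidia-ctk cdi generate`).
podman run --device nvidia.com/gpu=all --rm -it --shm-size=512m --name swarmui \
  -p 7801:7801 \
  -v /path/to/SwarmUI:/workspace \
  docker.io/jtreminio/swarmui:latest
```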
## Building

If you would like to build the image for yourself, simply run:

```shell
# compiles flash_attn, sageattention, torchaudio, etc.
./step-1.sh

# builds the Docker image for reuse
./step-2.sh
```
The build is split into two steps because `docker build` has no `--gpus all` option, so anything that must be compiled against a GPU cannot be built inside the image build itself.
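Conceptually the split looks something like this. The script contents, image tag, and package list below are illustrative assumptions, not the actual contents of `step-1.sh` and `step-2.sh`:

```shell
# Step 1 (sketch): compile GPU-dependent wheels inside a container that *can* see the GPU.
docker run --gpus all --rm -v "$PWD/wheels:/wheels" nvcr.io/nvidia/pytorch:24.04-py3 \
  bash -c "pip wheel flash_attn -w /wheels"

# Step 2 (sketch): bake the prebuilt wheels into the final image; no GPU needed here.
docker build -t swarmui:local .
```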