
flash-attn wheels (v2.8.2 and v2.8.3) for current ComfyUI, compiled with cxx11abi=true. Also includes a deepspeed (v0.18.3) wheel for various needs, including TTS.

Tested with ComfyUI v0.3.62 installed via UmeAiRT (https://github.com/UmeAiRT/ComfyUI-Auto_installer). The deepspeed wheel is still being tested.

Mainly made for 50xx-series cards with these specs:

- Python: 3.12
- CUDA: 12.9
- Torch: 2.8
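As a quick sketch, you can check whether your environment matches these specs before downloading a wheel. The torch import is wrapped in a try/except since torch may not be installed yet:

```python
import sys

# Print the interpreter version; the wheels here target Python 3.12.
print("Python:", f"{sys.version_info.major}.{sys.version_info.minor}")

try:
    import torch
    # These wheels were built against Torch 2.8 and CUDA 12.9.
    print("Torch:", torch.__version__)
    print("CUDA:", torch.version.cuda)
except ImportError:
    print("Torch: not installed")
```

If the printed versions don't match, pick a wheel built for your stack instead.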

Deepspeed will work with most cards.

ATTENTION: flash_attn v2.8.3 is not yet supported in ComfyUI. You have to use v2.8.2.

Install by downloading the wheel, then running:

```
pip install flash_attn-2.8.2+cu129torch2.8.0cxx11abiTRUE-cp312-cp312-win_amd64.whl
```

or

```
pip install deepspeed-0.18.3+53e91a09-cp312-cp312-win_amd64.whl
```
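After installing, a minimal sketch to confirm the wheels are visible to your Python environment, using only the standard library (it reports "not installed" rather than crashing if a package is missing):

```python
from importlib import metadata

# Check that the installed wheels are registered with this interpreter.
for pkg in ("flash_attn", "deepspeed"):
    try:
        print(f"{pkg}: {metadata.version(pkg)}")
    except metadata.PackageNotFoundError:
        print(f"{pkg}: not installed")
```

Run this with the same Python that ComfyUI uses (e.g. its embedded or venv interpreter), or the check may look at the wrong environment.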

License pulled from the original source (https://github.com/Dao-AILab/flash-attention).

DeepSpeed is under Apache-2.0 license.


flash-attn is under the BSD-3-Clause license.
