Fix flash_attn wheel URL (cu122)
requirements.txt (+2 -2)
@@ -27,5 +27,5 @@ insightface==0.7.3
 transformers==4.52.0
 huggingface_hub
 ninja
-# Use the prebuilt flash_attn wheel (torch 2.6.0 + CUDA 12.
-https://github.com/Dao-AILab/flash-attention/releases/download/v2.8.1/flash_attn-2.8.1+
+# Use the prebuilt flash_attn wheel (torch 2.6.0 + CUDA 12.2 + Python 3.10)
+https://github.com/Dao-AILab/flash-attention/releases/download/v2.8.1/flash_attn-2.8.1+cu122torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
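
For reference, the wheel filename encodes the build it targets: cu122 (CUDA 12.2), torch2.6, cxx11abiFALSE, and cp310 (CPython 3.10). pip installs a direct wheel URL without checking these against the environment, so a mismatch only surfaces at import time. Below is a minimal sketch for checking the running environment against the wheel tags; it assumes torch is already installed, and the WHEEL_* constants are taken from the wheel name above, not from any flash_attn API:

    import sys
    import torch

    # Tags from the wheel filename above; adjust if the pinned wheel changes.
    WHEEL_PY, WHEEL_TORCH, WHEEL_CUDA = "cp310", "torch2.6", "cu122"

    py_tag = f"cp{sys.version_info.major}{sys.version_info.minor}"    # e.g. "cp310"
    torch_tag = "torch" + ".".join(torch.__version__.split(".")[:2])  # e.g. "torch2.6"
    cuda_tag = ("cu" + torch.version.cuda.replace(".", "")) if torch.version.cuda else "cpu"

    print(py_tag, torch_tag, cuda_tag, "| wheel wants:", WHEEL_PY, WHEEL_TORCH, WHEEL_CUDA)

    # Python and torch versions must match the wheel exactly; CUDA minor
    # versions within the same major release are often compatible, so that
    # tag is printed for inspection rather than asserted.
    if (py_tag, torch_tag) != (WHEEL_PY, WHEEL_TORCH):
        raise SystemExit("pinned flash_attn wheel does not match this Python/torch build")

Running this in the Space before the pinned install (or in a build step) turns an import-time ABI failure into an explicit, readable error.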