zzz66 committed
Commit e7afc0e · 1 Parent(s): 96465d6

Fix flash_attn wheel URL (cu122)

Files changed (1)
  1. requirements.txt +2 -2
requirements.txt CHANGED
@@ -27,5 +27,5 @@ insightface==0.7.3
 transformers==4.52.0
 huggingface_hub
 ninja
-# Use a prebuilt flash_attn wheel (torch 2.6.0 + CUDA 12.4 + Python 3.10)
-https://github.com/Dao-AILab/flash-attention/releases/download/v2.8.1/flash_attn-2.8.1+cu124torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
+# Use a prebuilt flash_attn wheel (torch 2.6.0 + CUDA 12.2 + Python 3.10)
+https://github.com/Dao-AILab/flash-attention/releases/download/v2.8.1/flash_attn-2.8.1+cu122torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
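
The wheel filename encodes the build it targets: cu122 (CUDA 12.2), torch2.6, cp310 (CPython 3.10), cxx11abiFALSE. A minimal sketch, assuming torch is already installed, that checks the running environment against those tags before installing the wheel; the asserted values are read off the URL above, and this is a strict check (some minor-version mismatches may still work in practice):

import sys
import torch

# Tags from the wheel filename above: cu122, torch2.6, cp310
assert torch.version.cuda and torch.version.cuda.startswith("12.2"), torch.version.cuda
assert torch.__version__.startswith("2.6"), torch.__version__
assert sys.version_info[:2] == (3, 10), sys.version_info
print("Environment matches the flash_attn cu122/torch2.6/cp310 wheel")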