zimhe committed on
Commit 04a47cb · 1 Parent(s): a521a3f

revise flash attention to py 3.10

Files changed (1)
  1. requirements.txt +1 -1
requirements.txt CHANGED
@@ -10,4 +10,4 @@ transformers
 xformers
 realesrgan
 py360convert
-https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.4cxx11abiTRUE-cp312-cp312-linux_x86_64.whl
+https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.4cxx11abiTRUE-cp310-cp310-linux_x86_64.whl
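Prebuilt flash-attn wheels are interpreter-specific: the `cp312`/`cp310` fields in the filename are PEP 427 compatibility tags, so the wheel pinned here only installs on a matching CPython (the swap above retargets the environment from Python 3.12 to 3.10). A minimal sketch of reading that tag out of the two filenames in this diff; the helper `wheel_python_tag` is illustrative, not part of the repo:

```python
def wheel_python_tag(wheel_name: str) -> str:
    """Return the Python tag (e.g. 'cp310') encoded in a wheel filename.

    Wheel names follow PEP 427:
    {name}-{version}(-{build})?-{python tag}-{abi tag}-{platform tag}.whl
    so the Python tag is the third-from-last dash-separated field.
    """
    stem = wheel_name.removesuffix(".whl")
    return stem.split("-")[-3]

# The two wheels from the diff above:
old = "flash_attn-2.7.4.post1+cu12torch2.4cxx11abiTRUE-cp312-cp312-linux_x86_64.whl"
new = "flash_attn-2.7.4.post1+cu12torch2.4cxx11abiTRUE-cp310-cp310-linux_x86_64.whl"

print(wheel_python_tag(old))  # cp312 -> requires CPython 3.12
print(wheel_python_tag(new))  # cp310 -> requires CPython 3.10
```

If the tag does not match the running interpreter, pip rejects the URL with a "not a supported wheel on this platform" error, which is why the pin had to change along with the Python version.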