Julian Bilcke committed · Commit 6ea6b16 · 1 Parent(s): 5c50d1d

switch to building flash-attn

Files changed (1):
  1. requirements.txt (+1 −1)
requirements.txt CHANGED
@@ -1,4 +1,4 @@
-flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
+#flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
 
 torch>=2.4.0
 torchvision>=0.19.0
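Commenting the prebuilt cp310/cu12 wheel out of requirements.txt suggests flash-attn is now compiled from source in a separate build step (per the commit message, "switch to building flash-attn"). A minimal sketch of such a step, assuming the standard flash-attn source-build invocation; the `MAX_JOBS` value is an assumption, not taken from this commit:

```shell
# Install torch first: building flash-attn from source requires an
# already-installed torch that matches the target CUDA toolkit.
pip install "torch>=2.4.0" "torchvision>=0.19.0"

# Build flash-attn from source instead of pulling the prebuilt wheel.
# --no-build-isolation lets the build see the installed torch;
# MAX_JOBS caps parallel nvcc jobs to bound memory use (assumed value).
MAX_JOBS=4 pip install flash-attn --no-build-isolation
```

Pinning `flash-attn==2.7.4.post1` here would keep the source build aligned with the wheel version the old requirements line referenced.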