jena-shreyas committed on
Commit 4333c80 · 1 Parent(s): 1c5650b

flash-attn ABI true -> false for HF spaces

Files changed (1): requirements.txt +1 -1
requirements.txt CHANGED
@@ -8,7 +8,7 @@ einops-exts==0.0.4
  torch==2.6.0
  torchvision==0.21.0
  tokenizers==0.19.0
- flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.3/flash_attn-2.7.3+cu12torch2.6cxx11abiTRUE-cp310-cp310-linux_x86_64.whl
+ flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.3/flash_attn-2.7.3+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
  GitPython==3.1.43
  gradio==6.2.0
  gradio_client==2.0.2
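The fix above works because the flash-attn wheel filename encodes the build's C++11 ABI setting (`cxx11abiTRUE`/`cxx11abiFALSE`), which must match the PyTorch build in the environment (PyTorch reports its own setting via `torch.compiled_with_cxx11_abi()`). As a minimal sketch, a hypothetical helper like the one below could parse those tags out of a wheel name before pinning it in `requirements.txt` — the function name and regex are illustrative assumptions, not part of this commit:

```python
import re

# Hypothetical parser for flash-attn release wheel filenames of the form:
#   flash_attn-<ver>+cu<cuda>torch<torch>cxx11abi<TRUE|FALSE>-cpXY-cpXY-<platform>.whl
# The extracted tags can be compared against the target environment
# (e.g. the ABI tag against torch.compiled_with_cxx11_abi()).
WHEEL_RE = re.compile(
    r"flash_attn-(?P<version>[\d.]+)"
    r"\+cu(?P<cuda>\d+)"
    r"torch(?P<torch>[\d.]+)"
    r"cxx11abi(?P<abi>TRUE|FALSE)"
    r"-(?P<py>cp\d+)-cp\d+-(?P<platform>\w+)\.whl"
)

def parse_flash_attn_wheel(filename: str) -> dict:
    """Return the build tags encoded in a flash-attn wheel filename."""
    m = WHEEL_RE.fullmatch(filename)
    if m is None:
        raise ValueError(f"unrecognized wheel name: {filename}")
    return m.groupdict()

tags = parse_flash_attn_wheel(
    "flash_attn-2.7.3+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl"
)
print(tags["abi"])   # FALSE
print(tags["torch"]) # 2.6
```

Picking the `cxx11abiFALSE` wheel here matches the pre-C++11-ABI PyTorch builds commonly found on hosted runtimes such as HF Spaces; an ABI mismatch typically surfaces as undefined-symbol errors at import time.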