remove flash attention
requirements.txt CHANGED (+1 -2)
@@ -6,5 +6,4 @@ torchvision
 qwen_vl_utils
 Pillow
 PyMuPDF
-accelerate
-https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu126torch2.7-cp310-cp310-linux_x86_64.whl
+accelerate
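With the prebuilt flash_attn wheel dropped from requirements, the model presumably has to fall back to PyTorch's built-in scaled-dot-product attention instead of FlashAttention 2. A minimal sketch of what that fallback looks like, assuming the repo loads a Qwen2-VL checkpoint through transformers (the model name and loading code here are illustrative, not taken from this repo):

import torch
from transformers import Qwen2VLForConditionalGeneration

# Hypothetical loading snippet: with flash_attn no longer installed,
# attn_implementation="flash_attention_2" would raise an ImportError,
# so request PyTorch's native SDPA backend instead.
model = Qwen2VLForConditionalGeneration.from_pretrained(
    "Qwen/Qwen2-VL-2B-Instruct",  # assumed checkpoint; the actual model may differ
    torch_dtype=torch.bfloat16,
    attn_implementation="sdpa",
    device_map="auto",  # needs accelerate, which stays in requirements.txt
)

SDPA is slower than FlashAttention on long sequences but requires no custom CUDA wheel, which avoids pinning the install to a specific CUDA/torch/Python combination as the removed wheel URL did.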