Tom1986 committed on
Commit bb8e91b · verified · 1 Parent(s): ff5d8c7

Update torch to >=2.1.0 and fix flash-attn dependency

Files changed (1): requirements.txt +3 -4
requirements.txt CHANGED

--- a/requirements.txt
+++ b/requirements.txt
@@ -1,4 +1,4 @@
-torch>=2.4.0
+torch>=2.1.0
 torchvision>=0.19.0
 opencv-python>=4.9.0.80
 diffusers>=0.31.0
@@ -11,6 +11,5 @@ easydict
 ftfy
 dashscope
 imageio-ffmpeg
-https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu128torch2.8-cp310-cp310-linux_x86_64.whl
-numpy>=1.23.5,<2
-
+flash-attn
+numpy>=1.23.5,<2
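For context, the effect of loosening the torch pin from `>=2.4.0` to `>=2.1.0` can be checked with the `packaging` library (the same specifier logic pip uses). This is an illustrative snippet, not part of the commit; the version strings tested are examples.

```python
# Sketch: compare which torch versions satisfy the old pin (>=2.4.0)
# versus the new one (>=2.1.0). Uses `packaging`, which ships alongside
# pip/setuptools; the candidate versions below are illustrative.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

old_pin = SpecifierSet(">=2.4.0")
new_pin = SpecifierSet(">=2.1.0")

for v in ["2.1.0", "2.3.1", "2.4.0"]:
    ver = Version(v)
    print(f"torch=={v}  old pin: {ver in old_pin}  new pin: {ver in new_pin}")
```

Under the new pin, torch 2.1.x and 2.3.x installs now satisfy the requirement, which is likely why the CUDA/torch-version-specific flash-attn wheel URL was replaced with the plain `flash-attn` package name: a wheel built against `cu128torch2.8` would not match those older torch builds.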