prithivMLmods committed on
Commit 515bb0e · verified · 1 Parent(s): 0fd32ad

Update requirements.txt

Files changed (1): requirements.txt +1 -0
requirements.txt CHANGED
@@ -1,5 +1,6 @@
 flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.3/flash_attn-2.7.3+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
 transformers-stream-generator
+compressed-tensors
 huggingface_hub
 qwen-vl-utils
 pyvips-binary
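After pulling this commit, the new `compressed-tensors` dependency can be picked up by reinstalling from the updated file. A minimal sketch; note that the pinned flash-attn wheel constrains the environment (CUDA 12, torch 2.6, Python 3.10, linux_x86_64 are requirements implied by the wheel filename, not choices made here):

```shell
# Reinstall from the updated requirements file. The flash-attn line points at a
# prebuilt wheel, so the environment must match its CUDA/torch/Python tags.
pip install -r requirements.txt

# Sanity-check that the newly added dependency is importable.
python -c "import compressed_tensors"
```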