# VLM-FO1-3B-Demo / requirements.txt
torch==2.6.0
torchvision==0.21.0
transformers==4.50.1
timm==1.0.9
accelerate==1.4.0
gradio
mmengine==0.8.2
einops
ninja
scikit-image
# Prebuilt wheels below are for CPython 3.10 on Linux x86_64; the flash-attn build
# additionally targets CUDA 12 and torch 2.6 (matching the torch pin above).
https://airesources.oss-cn-hangzhou.aliyuncs.com/lp/wheel/multiscaledeformableattention-1.0-cp310-cp310-linux_x86_64.whl
https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl