# CorridorKey / requirements.txt
# add ZeroGPU GPU inference (FP16, flash-attn, batch=32@1024/16@2048) (commit 0b6961f)
numpy
opencv-python-headless
huggingface-hub
onnxruntime-gpu
spaces
gradio[mcp]
torch
torchvision
timm
# prebuilt flash-attn 2.8.3 wheel, pinned to CUDA 12.6, torch 2.9, Python 3.10, linux x86_64
https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.9.0/flash_attn-2.8.3+cu126torch2.9-cp310-cp310-linux_x86_64.whl