1. Add flash-attn 2.8.3 release link with ABI=False, since it works on HF Spaces (f5837a1, jena-shreyas, committed 10 days ago)
2. Drop python_version to 3.10; fix the flash-attn .whl file for Python 3.10, torch 2.6 (f9a14b4, jena-shreyas, committed 12 days ago)
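
The commits above pin a prebuilt flash-attn wheel matched to the Space's environment (CPython 3.10, torch 2.6, ABI=False). A requirements.txt entry along these lines is a sketch of what such a pin looks like; the exact release URL and CUDA tag below are assumptions, not taken from the commits:

```text
# requirements.txt (sketch, not the actual commit contents)
# Prebuilt flash-attn wheel: cp310 matches Python 3.10, torch2.6 matches the pinned
# torch, and cxx11abiFALSE is the ABI=False build that loads on HF Spaces.
# The version/CUDA tag in this URL is hypothetical; pick the matching asset from the
# flash-attention GitHub releases page.
flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.8.3/flash_attn-2.8.3+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
```

Installing from a release wheel avoids compiling flash-attn from source, which typically fails or times out on HF Spaces build machines.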