Fix flash_attn_config import to use relative import in cu130 x86_64 builds aaa4e4f Varun committed on Mar 20
Update flash_attn_interface.py and flash_attn_config.py for cu130 x86_64 builds 4f71ea1 Varun committed on Mar 20
Add torch 2.11 and 2.12 builds (ABI-compatible with 2.9+) 0af69e1 verified varunneal committed on Feb 8
Update build/torch28-cxx11-cu126-x86_64-linux/flash_attention_3/flash_attn_interface.py 58d4df4 verified varunneal committed on Sep 25, 2025
Update build/torch28-cxx11-cu128-x86_64-linux/flash_attention_3/flash_attn_interface.py e307171 verified varunneal committed on Sep 25, 2025