Request for Python 3.12 wheel: flash-attn for CUDA 12.8 + PyTorch 2.8.0

#16
by Consider20241010 - opened

Hi @lldacing, thanks for this incredibly useful repo! The flash_attn-2.7.4.post1+cu128torch2.8.0 wheel is exactly what many of us need.

Since Python 3.12 has become very popular and is the default version for many new projects and virtual environments, would it be possible to add a cp312 (Python 3.12) build for the cu128+torch2.8.0 package?

A pre-compiled wheel for flash-attn on Windows + CUDA 12.8 + PyTorch 2.8.0 + Python 3.12 would be a huge help for the community and would spare many users the complex, error-prone process of compiling it from source.
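For reference, here's a minimal sketch (assuming torch is already installed) that prints the interpreter tag and CUDA/PyTorch versions a matching wheel would need to target; the example values in the comments are just the combination requested above:

```python
# Illustrative only: report the environment details a matching
# flash-attn wheel would have to be built for.
import sys
import torch

print(f"Python tag: cp{sys.version_info.major}{sys.version_info.minor}")  # e.g. cp312
print(f"torch:      {torch.__version__}")                                 # e.g. 2.8.0+cu128
print(f"CUDA:       {torch.version.cuda}")                                 # e.g. 12.8
print(f"Platform:   {sys.platform}")                                       # win32 on Windows
```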

Thank you for your time and for maintaining this essential project!

This is exactly what I'm looking for... Please pre-compile it.

thanks!!!
