Request for Python 3.12 wheel: flash-attn for CUDA 12.8 + PyTorch 2.9.1

#19
by StableDiffusion69 - opened

Could you please compile a flash-attn wheel for Python 3.12 with CUDA 12.8 + PyTorch 2.9.1?
Thank you.
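
In the meantime, a quick way to confirm a machine actually matches the requested combination before grabbing any prebuilt wheel (a minimal sketch; the expected values in the comments are just the versions from this request):

```python
# Print the ABI-relevant versions a flash-attn wheel must match.
import sys
import torch

print(f"Python:  {sys.version_info.major}.{sys.version_info.minor}")  # expect 3.12
print(f"PyTorch: {torch.__version__}")                                # expect 2.9.1
print(f"CUDA:    {torch.version.cuda}")                               # expect 12.8
print(f"GPU:     {torch.cuda.get_device_name(0)}")
```

If any of these differ from what a wheel was built against, the import will typically fail with an undefined-symbol or ABI error.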

I second this. :)

Doesn't do it for me, sadly. πŸ˜”
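
To narrow down whether the failure is the wheel or the environment, a minimal smoke test along these lines can help (a sketch assuming a CUDA GPU and flash-attn 2.x's `flash_attn_func`; the tensor shapes are arbitrary):

```python
# Smoke test: run one flash-attn forward pass on dummy fp16 tensors.
# Shapes are (batch, seqlen, nheads, headdim).
import torch
from flash_attn import flash_attn_func

q = torch.randn(1, 128, 8, 64, device="cuda", dtype=torch.float16)
k = torch.randn(1, 128, 8, 64, device="cuda", dtype=torch.float16)
v = torch.randn(1, 128, 8, 64, device="cuda", dtype=torch.float16)

out = flash_attn_func(q, k, v)
print(out.shape)  # expected: torch.Size([1, 128, 8, 64])
```

If the import itself fails, it is usually a Python/PyTorch/CUDA version mismatch; if the import succeeds but the kernel call fails, the GPU architecture may not be supported by that build.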
