Request flash attention 2.7.4 for windows, python 3.12 + cuda 128 + PyTorch 2.10

#21
by Koxae - opened
No description provided.
Koxae changed pull request title from Request flash attention 2.7.4 for windows, python 3.12, cuda 128, PyTorch 2.10 to Request flash attention 2.7.4 for windows, python 3.12 + cuda 128 + PyTorch 2.10