lym00 committed on
Commit 2f23133 · verified · 1 Parent(s): bea65b1

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED
@@ -5,7 +5,7 @@ language:
 
 | Prebuilt Wheels | Python Versions | PyTorch Versions | CUDA Versions | Source |
 |-----------------------------------------------|-----------------|------------------|----------------|---------------------------------------------------------------------------|
-| Flash-Attention 2.7.4.post1 | 3.12 | 2.8.0.dev | 12.8.1 | [Dao-AILab/flash-attention](https://github.com/Dao-AILab/flash-attention) |
+| [Flash-Attention 2.7.4.post1](https://huggingface.co/lym00/win_amd64_prebuilt_wheels/blob/main/flash_attn-2.7.4.post1-cp312-cp312-win_amd64.whl) | 3.12 | 2.8.0.dev | 12.8.1 | [Dao-AILab/flash-attention](https://github.com/Dao-AILab/flash-attention) |
 | Flash-Attention 3 (pending official release) | 3.12 | 2.8.0.dev | 12.8.1 | INSERT |
 | Sage-Attention 2++ (pending official release) | 3.12 | 2.8.0.dev | 12.8.1 | [jt-zhang/SageAttention2_plus](https://huggingface.co/jt-zhang/SageAttention2_plus) |
 | Sage-Attention 3 (pending official release) | 3.12 | 2.8.0.dev | 12.8.1 | INSERT |