Files changed (1)
  1. README.md +296 -128
README.md CHANGED
@@ -1,24 +1,24 @@
- ---
- language:
- - en
- tags:
- - flash_attention_2
- - pytorch
- - text-generation-inference
- license: bsd-3-clause
- ---

  ![flashattention_logo](https://cdn-uploads.huggingface.co/production/uploads/65bb837dbfb878f46c77de4c/yQ8ugTupP2yLUV9sZ4HXt.png)

  # **Flash Attention 2 Pre-built Wheels**

  > [!IMPORTANT]
- This repository contains a copy of all prebuilt wheels for **flash_attn-2.8.3** from [Dao-AILabs Flash Attention](https://github.com/Dao-AILab/flash-attention), organized for better categorization based on PyTorch versions and Python (cp) release versions.

  This repository provides pre-built wheels for `flash-attn` version **2.8.3** for various PyTorch versions, Python versions, and architectures (compiled with CUDA 12). You can install these directly using `pip install <url>` or add the provided strings directly to your `requirements.txt`.

  > [!NOTE]
- The detailed categories and structured view of the `strangertoolshf/flash_attention_2_wheelhouse` folders and files on the hf-tree shareable link are available here: [huggingface-tree](https://strangertoolshf-huggingface-tree.static.hf.space/index.html#models/strangertoolshf/flash_attention_2_wheelhouse/main)

  <div style="
  background: rgba(255, 61, 61, 0.15);
@@ -43,142 +43,310 @@ The detailed categories and structured view of the `strangertoolshf/flash_attent

  ### Torch 2.9
  **ABI: `TRUE` (Implied)**
- * **Python 3.12 (`cp312`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.9/cu12/cp312/flash_attn-2.8.3+cu12torch2.9cxx11abiTRUE-cp312-cp312-linux_x86_64.whl`

  > End of wheels for Torch 2.9.

  ### Torch 2.8
- **ABI: `FALSE`**
- * **Python 3.9 (`cp39`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.8/cu12/abiFALSE/cp39/flash_attn-2.8.3+cu12torch2.8cxx11abiFALSE-cp39-cp39-linux_x86_64.whl`
- * **Python 3.10 (`cp310`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.8/cu12/abiFALSE/cp310/flash_attn-2.8.3+cu12torch2.8cxx11abiFALSE-cp310-cp310-linux_x86_64.whl`
- * **Python 3.11 (`cp311`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.8/cu12/abiFALSE/cp311/flash_attn-2.8.3+cu12torch2.8cxx11abiFALSE-cp311-cp311-linux_x86_64.whl`
- * **Python 3.12 (`cp312`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.8/cu12/abiFALSE/cp312/flash_attn-2.8.3+cu12torch2.8cxx11abiFALSE-cp312-cp312-linux_x86_64.whl`
- * **Python 3.13 (`cp313`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.8/cu12/abiFALSE/cp313/flash_attn-2.8.3+cu12torch2.8cxx11abiFALSE-cp313-cp313-linux_x86_64.whl`
-
- **ABI: `TRUE`**
- * **Python 3.9 (`cp39`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.8/cu12/abiTRUE/cp39/flash_attn-2.8.3+cu12torch2.8cxx11abiTRUE-cp39-cp39-linux_x86_64.whl`
- * **Python 3.10 (`cp310`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.8/cu12/abiTRUE/cp310/flash_attn-2.8.3+cu12torch2.8cxx11abiTRUE-cp310-cp310-linux_x86_64.whl`
- * **Python 3.11 (`cp311`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.8/cu12/abiTRUE/cp311/flash_attn-2.8.3+cu12torch2.8cxx11abiTRUE-cp311-cp311-linux_x86_64.whl`
- * **Python 3.12 (`cp312`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.8/cu12/abiTRUE/cp312/flash_attn-2.8.3+cu12torch2.8cxx11abiTRUE-cp312-cp312-linux_x86_64.whl`
- * **Python 3.13 (`cp313`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.8/cu12/abiTRUE/cp313/flash_attn-2.8.3+cu12torch2.8cxx11abiTRUE-cp313-cp313-linux_x86_64.whl`

  ### Torch 2.7
- **ABI: `FALSE`**
- * **Python 3.9 (`cp39`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.7/cu12/abiFALSE/cp39/flash_attn-2.8.3+cu12torch2.7cxx11abiFALSE-cp39-cp39-linux_x86_64.whl`
- * **Python 3.10 (`cp310`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.7/cu12/abiFALSE/cp310/flash_attn-2.8.3+cu12torch2.7cxx11abiFALSE-cp310-cp310-linux_x86_64.whl`
- * **Python 3.11 (`cp311`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.7/cu12/abiFALSE/cp311/flash_attn-2.8.3+cu12torch2.7cxx11abiFALSE-cp311-cp311-linux_x86_64.whl`
- * **Python 3.12 (`cp312`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.7/cu12/abiFALSE/cp312/flash_attn-2.8.3+cu12torch2.7cxx11abiFALSE-cp312-cp312-linux_x86_64.whl`
- * **Python 3.13 (`cp313`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.7/cu12/abiFALSE/cp313/flash_attn-2.8.3+cu12torch2.7cxx11abiFALSE-cp313-cp313-linux_x86_64.whl`
-
- **ABI: `TRUE`**
- * **Python 3.9 (`cp39`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.7/cu12/abiTRUE/cp39/flash_attn-2.8.3+cu12torch2.7cxx11abiTRUE-cp39-cp39-linux_x86_64.whl`
- * **Python 3.10 (`cp310`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.7/cu12/abiTRUE/cp310/flash_attn-2.8.3+cu12torch2.7cxx11abiTRUE-cp310-cp310-linux_x86_64.whl`
- * **Python 3.11 (`cp311`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.7/cu12/abiTRUE/cp311/flash_attn-2.8.3+cu12torch2.7cxx11abiTRUE-cp311-cp311-linux_x86_64.whl`
- * **Python 3.12 (`cp312`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.7/cu12/abiTRUE/cp312/flash_attn-2.8.3+cu12torch2.7cxx11abiTRUE-cp312-cp312-linux_x86_64.whl`
- * **Python 3.13 (`cp313`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.7/cu12/abiTRUE/cp313/flash_attn-2.8.3+cu12torch2.7cxx11abiTRUE-cp313-cp313-linux_x86_64.whl`

  ### Torch 2.6
- **ABI: `FALSE`**
- * **Python 3.9 (`cp39`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.6/cu12/abiFALSE/cp39/flash_attn-2.8.3+cu12torch2.6cxx11abiFALSE-cp39-cp39-linux_x86_64.whl`
- * **Python 3.10 (`cp310`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.6/cu12/abiFALSE/cp310/flash_attn-2.8.3+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl`
- * **Python 3.11 (`cp311`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.6/cu12/abiFALSE/cp311/flash_attn-2.8.3+cu12torch2.6cxx11abiFALSE-cp311-cp311-linux_x86_64.whl`
- * **Python 3.12 (`cp312`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.6/cu12/abiFALSE/cp312/flash_attn-2.8.3+cu12torch2.6cxx11abiFALSE-cp312-cp312-linux_x86_64.whl`
- * **Python 3.13 (`cp313`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.6/cu12/abiFALSE/cp313/flash_attn-2.8.3+cu12torch2.6cxx11abiFALSE-cp313-cp313-linux_x86_64.whl`
-
- **ABI: `TRUE`**
- * **Python 3.9 (`cp39`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.6/cu12/abiTRUE/cp39/flash_attn-2.8.3+cu12torch2.6cxx11abiTRUE-cp39-cp39-linux_x86_64.whl`
- * **Python 3.10 (`cp310`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.6/cu12/abiTRUE/cp310/flash_attn-2.8.3+cu12torch2.6cxx11abiTRUE-cp310-cp310-linux_x86_64.whl`
- * **Python 3.11 (`cp311`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.6/cu12/abiTRUE/cp311/flash_attn-2.8.3+cu12torch2.6cxx11abiTRUE-cp311-cp311-linux_x86_64.whl`
- * **Python 3.12 (`cp312`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.6/cu12/abiTRUE/cp312/flash_attn-2.8.3+cu12torch2.6cxx11abiTRUE-cp312-cp312-linux_x86_64.whl`
- * **Python 3.13 (`cp313`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.6/cu12/abiTRUE/cp313/flash_attn-2.8.3+cu12torch2.6cxx11abiTRUE-cp313-cp313-linux_x86_64.whl`

  ### Torch 2.5
- **ABI: `FALSE`**
- * **Python 3.9 (`cp39`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.5/cu12/abiFALSE/cp39/flash_attn-2.8.3+cu12torch2.5cxx11abiFALSE-cp39-cp39-linux_x86_64.whl`
- * **Python 3.10 (`cp310`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.5/cu12/abiFALSE/cp310/flash_attn-2.8.3+cu12torch2.5cxx11abiFALSE-cp310-cp310-linux_x86_64.whl`
- * **Python 3.11 (`cp311`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.5/cu12/abiFALSE/cp311/flash_attn-2.8.3+cu12torch2.5cxx11abiFALSE-cp311-cp311-linux_x86_64.whl`
- * **Python 3.12 (`cp312`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.5/cu12/abiFALSE/cp312/flash_attn-2.8.3+cu12torch2.5cxx11abiFALSE-cp312-cp312-linux_x86_64.whl`
- * **Python 3.13 (`cp313`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.5/cu12/abiFALSE/cp313/flash_attn-2.8.3+cu12torch2.5cxx11abiFALSE-cp313-cp313-linux_x86_64.whl`
-
- **ABI: `TRUE`**
- * **Python 3.9 (`cp39`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.5/cu12/abiTRUE/cp39/flash_attn-2.8.3+cu12torch2.5cxx11abiTRUE-cp39-cp39-linux_x86_64.whl`
- * **Python 3.10 (`cp310`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.5/cu12/abiTRUE/cp310/flash_attn-2.8.3+cu12torch2.5cxx11abiTRUE-cp310-cp310-linux_x86_64.whl`
- * **Python 3.11 (`cp311`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.5/cu12/abiTRUE/cp311/flash_attn-2.8.3+cu12torch2.5cxx11abiTRUE-cp311-cp311-linux_x86_64.whl`
- * **Python 3.12 (`cp312`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.5/cu12/abiTRUE/cp312/flash_attn-2.8.3+cu12torch2.5cxx11abiTRUE-cp312-cp312-linux_x86_64.whl`
- * **Python 3.13 (`cp313`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.5/cu12/abiTRUE/cp313/flash_attn-2.8.3+cu12torch2.5cxx11abiTRUE-cp313-cp313-linux_x86_64.whl`

  ### Torch 2.4
- **ABI: `FALSE`**
- * **Python 3.9 (`cp39`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.4/cu12/abiFALSE/cp39/flash_attn-2.8.3+cu12torch2.4cxx11abiFALSE-cp39-cp39-linux_x86_64.whl`
- * **Python 3.10 (`cp310`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.4/cu12/abiFALSE/cp310/flash_attn-2.8.3+cu12torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl`
- * **Python 3.11 (`cp311`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.4/cu12/abiFALSE/cp311/flash_attn-2.8.3+cu12torch2.4cxx11abiFALSE-cp311-cp311-linux_x86_64.whl`
- * **Python 3.12 (`cp312`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.4/cu12/abiFALSE/cp312/flash_attn-2.8.3+cu12torch2.4cxx11abiFALSE-cp312-cp312-linux_x86_64.whl`
-
- **ABI: `TRUE`**
- * **Python 3.9 (`cp39`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.4/cu12/abiTRUE/cp39/flash_attn-2.8.3+cu12torch2.4cxx11abiTRUE-cp39-cp39-linux_x86_64.whl`
- * **Python 3.10 (`cp310`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.4/cu12/abiTRUE/cp310/flash_attn-2.8.3+cu12torch2.4cxx11abiTRUE-cp310-cp310-linux_x86_64.whl`
- * **Python 3.11 (`cp311`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.4/cu12/abiTRUE/cp311/flash_attn-2.8.3+cu12torch2.4cxx11abiTRUE-cp311-cp311-linux_x86_64.whl`
- * **Python 3.12 (`cp312`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.4/cu12/abiTRUE/cp312/flash_attn-2.8.3+cu12torch2.4cxx11abiTRUE-cp312-cp312-linux_x86_64.whl`

  ## Linux aarch64 (ARM)

  ### Torch 2.9 (Nightly/Pre-release)
- **ABI: `TRUE`**
- * **Python 3.12 (`cp312`)**:
- `flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_aarch64/torch2.9/cu12/abiTRUE/cp312/flash_attn-2.8.3+cu12torch2.9cxx11abiTRUE-cp312-cp312-linux_aarch64.whl`

  ---

  ## Acknowledgements and Note:

  - Dao-AILab flash-attention: [Faster Attention with Better Parallelism and Work Partitioning](https://github.com/Dao-AILab/flash-attention).
- - Note: This repository follows the same license, release notices, and other terms and conditions as the Dao-AILab [flash-attention](https://github.com/Dao-AILab/flash-attention) repository.
+ ---
+ language:
+ - en
+ tags:
+ - flash_attention_2
+ - pytorch
+ - text-generation-inference
+ license: bsd-3-clause
+ ---

  ![flashattention_logo](https://cdn-uploads.huggingface.co/production/uploads/65bb837dbfb878f46c77de4c/yQ8ugTupP2yLUV9sZ4HXt.png)

  # **Flash Attention 2 Pre-built Wheels**

  > [!IMPORTANT]
+ > This repository contains a copy of all prebuilt wheels for **flash_attn-2.8.3** from [Dao-AILab's Flash Attention](https://github.com/Dao-AILab/flash-attention), organized for better categorization based on PyTorch versions and Python (cp) release versions.

  This repository provides pre-built wheels for `flash-attn` version **2.8.3** for various PyTorch versions, Python versions, and architectures (compiled with CUDA 12). You can install these directly using `pip install <url>` or add the provided strings directly to your `requirements.txt`.

  > [!NOTE]
+ > The detailed categories and structured view of the `strangertoolshf/flash_attention_2_wheelhouse` folders and files on the hf-tree shareable link are available here: [huggingface-tree](https://strangertoolshf-huggingface-tree.static.hf.space/index.html#models/strangertoolshf/flash_attention_2_wheelhouse/main)

  <div style="
  background: rgba(255, 61, 61, 0.15);
 

  ### Torch 2.9
  **ABI: `TRUE` (Implied)**
+
+ #### Python 3.12 (`cp312`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.9/cu12/cp312/flash_attn-2.8.3+cu12torch2.9cxx11abiTRUE-cp312-cp312-linux_x86_64.whl
+ ```

  > End of wheels for Torch 2.9.

+ ---
+
  ### Torch 2.8
+
+ #### ABI: `FALSE`
+
+ ##### Python 3.9 (`cp39`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.8/cu12/abiFALSE/cp39/flash_attn-2.8.3+cu12torch2.8cxx11abiFALSE-cp39-cp39-linux_x86_64.whl
+ ```
+
+ ##### Python 3.10 (`cp310`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.8/cu12/abiFALSE/cp310/flash_attn-2.8.3+cu12torch2.8cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
+ ```
+
+ ##### Python 3.11 (`cp311`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.8/cu12/abiFALSE/cp311/flash_attn-2.8.3+cu12torch2.8cxx11abiFALSE-cp311-cp311-linux_x86_64.whl
+ ```
+
+ ##### Python 3.12 (`cp312`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.8/cu12/abiFALSE/cp312/flash_attn-2.8.3+cu12torch2.8cxx11abiFALSE-cp312-cp312-linux_x86_64.whl
+ ```
+
+ ##### Python 3.13 (`cp313`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.8/cu12/abiFALSE/cp313/flash_attn-2.8.3+cu12torch2.8cxx11abiFALSE-cp313-cp313-linux_x86_64.whl
+ ```
+
+ #### ABI: `TRUE`
+
+ ##### Python 3.9 (`cp39`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.8/cu12/abiTRUE/cp39/flash_attn-2.8.3+cu12torch2.8cxx11abiTRUE-cp39-cp39-linux_x86_64.whl
+ ```
+
+ ##### Python 3.10 (`cp310`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.8/cu12/abiTRUE/cp310/flash_attn-2.8.3+cu12torch2.8cxx11abiTRUE-cp310-cp310-linux_x86_64.whl
+ ```
+
+ ##### Python 3.11 (`cp311`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.8/cu12/abiTRUE/cp311/flash_attn-2.8.3+cu12torch2.8cxx11abiTRUE-cp311-cp311-linux_x86_64.whl
+ ```
+
+ ##### Python 3.12 (`cp312`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.8/cu12/abiTRUE/cp312/flash_attn-2.8.3+cu12torch2.8cxx11abiTRUE-cp312-cp312-linux_x86_64.whl
+ ```
+
+ ##### Python 3.13 (`cp313`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.8/cu12/abiTRUE/cp313/flash_attn-2.8.3+cu12torch2.8cxx11abiTRUE-cp313-cp313-linux_x86_64.whl
+ ```
+
+ ---

  ### Torch 2.7
+
+ #### ABI: `FALSE`
+
+ ##### Python 3.9 (`cp39`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.7/cu12/abiFALSE/cp39/flash_attn-2.8.3+cu12torch2.7cxx11abiFALSE-cp39-cp39-linux_x86_64.whl
+ ```
+
+ ##### Python 3.10 (`cp310`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.7/cu12/abiFALSE/cp310/flash_attn-2.8.3+cu12torch2.7cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
+ ```
+
+ ##### Python 3.11 (`cp311`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.7/cu12/abiFALSE/cp311/flash_attn-2.8.3+cu12torch2.7cxx11abiFALSE-cp311-cp311-linux_x86_64.whl
+ ```
+
+ ##### Python 3.12 (`cp312`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.7/cu12/abiFALSE/cp312/flash_attn-2.8.3+cu12torch2.7cxx11abiFALSE-cp312-cp312-linux_x86_64.whl
+ ```
+
+ ##### Python 3.13 (`cp313`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.7/cu12/abiFALSE/cp313/flash_attn-2.8.3+cu12torch2.7cxx11abiFALSE-cp313-cp313-linux_x86_64.whl
+ ```
+
+ #### ABI: `TRUE`
+
+ ##### Python 3.9 (`cp39`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.7/cu12/abiTRUE/cp39/flash_attn-2.8.3+cu12torch2.7cxx11abiTRUE-cp39-cp39-linux_x86_64.whl
+ ```
+
+ ##### Python 3.10 (`cp310`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.7/cu12/abiTRUE/cp310/flash_attn-2.8.3+cu12torch2.7cxx11abiTRUE-cp310-cp310-linux_x86_64.whl
+ ```
+
+ ##### Python 3.11 (`cp311`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.7/cu12/abiTRUE/cp311/flash_attn-2.8.3+cu12torch2.7cxx11abiTRUE-cp311-cp311-linux_x86_64.whl
+ ```
+
+ ##### Python 3.12 (`cp312`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.7/cu12/abiTRUE/cp312/flash_attn-2.8.3+cu12torch2.7cxx11abiTRUE-cp312-cp312-linux_x86_64.whl
+ ```
+
+ ##### Python 3.13 (`cp313`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.7/cu12/abiTRUE/cp313/flash_attn-2.8.3+cu12torch2.7cxx11abiTRUE-cp313-cp313-linux_x86_64.whl
+ ```
+
+ ---

  ### Torch 2.6
+
+ #### ABI: `FALSE`
+
+ ##### Python 3.9 (`cp39`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.6/cu12/abiFALSE/cp39/flash_attn-2.8.3+cu12torch2.6cxx11abiFALSE-cp39-cp39-linux_x86_64.whl
+ ```
+
+ ##### Python 3.10 (`cp310`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.6/cu12/abiFALSE/cp310/flash_attn-2.8.3+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
+ ```
+
+ ##### Python 3.11 (`cp311`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.6/cu12/abiFALSE/cp311/flash_attn-2.8.3+cu12torch2.6cxx11abiFALSE-cp311-cp311-linux_x86_64.whl
+ ```
+
+ ##### Python 3.12 (`cp312`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.6/cu12/abiFALSE/cp312/flash_attn-2.8.3+cu12torch2.6cxx11abiFALSE-cp312-cp312-linux_x86_64.whl
+ ```
+
+ ##### Python 3.13 (`cp313`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.6/cu12/abiFALSE/cp313/flash_attn-2.8.3+cu12torch2.6cxx11abiFALSE-cp313-cp313-linux_x86_64.whl
+ ```
+
+ #### ABI: `TRUE`
+
+ ##### Python 3.9 (`cp39`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.6/cu12/abiTRUE/cp39/flash_attn-2.8.3+cu12torch2.6cxx11abiTRUE-cp39-cp39-linux_x86_64.whl
+ ```
+
+ ##### Python 3.10 (`cp310`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.6/cu12/abiTRUE/cp310/flash_attn-2.8.3+cu12torch2.6cxx11abiTRUE-cp310-cp310-linux_x86_64.whl
+ ```
+
+ ##### Python 3.11 (`cp311`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.6/cu12/abiTRUE/cp311/flash_attn-2.8.3+cu12torch2.6cxx11abiTRUE-cp311-cp311-linux_x86_64.whl
+ ```
+
+ ##### Python 3.12 (`cp312`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.6/cu12/abiTRUE/cp312/flash_attn-2.8.3+cu12torch2.6cxx11abiTRUE-cp312-cp312-linux_x86_64.whl
+ ```
+
+ ##### Python 3.13 (`cp313`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.6/cu12/abiTRUE/cp313/flash_attn-2.8.3+cu12torch2.6cxx11abiTRUE-cp313-cp313-linux_x86_64.whl
+ ```
+
+ ---

  ### Torch 2.5
+
+ #### ABI: `FALSE`
+
+ ##### Python 3.9 (`cp39`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.5/cu12/abiFALSE/cp39/flash_attn-2.8.3+cu12torch2.5cxx11abiFALSE-cp39-cp39-linux_x86_64.whl
+ ```
+
+ ##### Python 3.10 (`cp310`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.5/cu12/abiFALSE/cp310/flash_attn-2.8.3+cu12torch2.5cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
+ ```
+
+ ##### Python 3.11 (`cp311`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.5/cu12/abiFALSE/cp311/flash_attn-2.8.3+cu12torch2.5cxx11abiFALSE-cp311-cp311-linux_x86_64.whl
+ ```
+
+ ##### Python 3.12 (`cp312`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.5/cu12/abiFALSE/cp312/flash_attn-2.8.3+cu12torch2.5cxx11abiFALSE-cp312-cp312-linux_x86_64.whl
+ ```
+
+ ##### Python 3.13 (`cp313`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.5/cu12/abiFALSE/cp313/flash_attn-2.8.3+cu12torch2.5cxx11abiFALSE-cp313-cp313-linux_x86_64.whl
+ ```
+
+ #### ABI: `TRUE`
+
+ ##### Python 3.9 (`cp39`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.5/cu12/abiTRUE/cp39/flash_attn-2.8.3+cu12torch2.5cxx11abiTRUE-cp39-cp39-linux_x86_64.whl
+ ```
+
+ ##### Python 3.10 (`cp310`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.5/cu12/abiTRUE/cp310/flash_attn-2.8.3+cu12torch2.5cxx11abiTRUE-cp310-cp310-linux_x86_64.whl
+ ```
+
+ ##### Python 3.11 (`cp311`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.5/cu12/abiTRUE/cp311/flash_attn-2.8.3+cu12torch2.5cxx11abiTRUE-cp311-cp311-linux_x86_64.whl
+ ```
+
+ ##### Python 3.12 (`cp312`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.5/cu12/abiTRUE/cp312/flash_attn-2.8.3+cu12torch2.5cxx11abiTRUE-cp312-cp312-linux_x86_64.whl
+ ```
+
+ ##### Python 3.13 (`cp313`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.5/cu12/abiTRUE/cp313/flash_attn-2.8.3+cu12torch2.5cxx11abiTRUE-cp313-cp313-linux_x86_64.whl
+ ```
+
+ ---

  ### Torch 2.4
+
+ #### ABI: `FALSE`
+
+ ##### Python 3.9 (`cp39`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.4/cu12/abiFALSE/cp39/flash_attn-2.8.3+cu12torch2.4cxx11abiFALSE-cp39-cp39-linux_x86_64.whl
+ ```
+
+ ##### Python 3.10 (`cp310`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.4/cu12/abiFALSE/cp310/flash_attn-2.8.3+cu12torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
+ ```
+
+ ##### Python 3.11 (`cp311`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.4/cu12/abiFALSE/cp311/flash_attn-2.8.3+cu12torch2.4cxx11abiFALSE-cp311-cp311-linux_x86_64.whl
+ ```
+
+ ##### Python 3.12 (`cp312`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.4/cu12/abiFALSE/cp312/flash_attn-2.8.3+cu12torch2.4cxx11abiFALSE-cp312-cp312-linux_x86_64.whl
+ ```
+
+ #### ABI: `TRUE`
+
+ ##### Python 3.9 (`cp39`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.4/cu12/abiTRUE/cp39/flash_attn-2.8.3+cu12torch2.4cxx11abiTRUE-cp39-cp39-linux_x86_64.whl
+ ```
+
+ ##### Python 3.10 (`cp310`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.4/cu12/abiTRUE/cp310/flash_attn-2.8.3+cu12torch2.4cxx11abiTRUE-cp310-cp310-linux_x86_64.whl
+ ```
+
+ ##### Python 3.11 (`cp311`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.4/cu12/abiTRUE/cp311/flash_attn-2.8.3+cu12torch2.4cxx11abiTRUE-cp311-cp311-linux_x86_64.whl
+ ```
+
+ ##### Python 3.12 (`cp312`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_x86_64/torch2.4/cu12/abiTRUE/cp312/flash_attn-2.8.3+cu12torch2.4cxx11abiTRUE-cp312-cp312-linux_x86_64.whl
+ ```
+
+ ---

  ## Linux aarch64 (ARM)

  ### Torch 2.9 (Nightly/Pre-release)
+
+ #### ABI: `TRUE`
+
+ ##### Python 3.12 (`cp312`)
+ ```
+ flash-attn @ https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/resolve/main/wheelhouse-flash_attn-2.8.3/linux_aarch64/torch2.9/cu12/abiTRUE/cp312/flash_attn-2.8.3+cu12torch2.9cxx11abiTRUE-cp312-cp312-linux_aarch64.whl
+ ```

  ---

  ## Acknowledgements and Note:

  - Dao-AILab flash-attention: [Faster Attention with Better Parallelism and Work Partitioning](https://github.com/Dao-AILab/flash-attention).
+ - Note: This repository follows the same license, release notices, and other terms and conditions as the Dao-AILab [flash-attention](https://github.com/Dao-AILab/flash-attention) repository.
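All of the wheel URLs in the README above follow a single naming scheme (base path, architecture, Torch version, CUDA tag, ABI folder, CPython tag, filename). A minimal Python sketch of how that scheme composes; `wheel_url` is a hypothetical helper for illustration, not part of this repository:

```python
# Hypothetical helper that rebuilds a wheel URL from the naming scheme
# used throughout the README above. Assumptions: flash_attn 2.8.3, cu12,
# and the observation that Torch 2.9 x86_64 wheels omit the abiTRUE/abiFALSE
# folder level (they only ship the cxx11abiTRUE build).
BASE = ("https://huggingface.co/strangertoolshf/flash_attention_2_wheelhouse/"
        "resolve/main/wheelhouse-flash_attn-2.8.3")

def wheel_url(torch: str, cp: str, abi: bool, arch: str = "linux_x86_64") -> str:
    """Compose the download URL for a flash_attn-2.8.3 wheel.

    torch: PyTorch minor version, e.g. "2.8"
    cp:    CPython tag, e.g. "cp312"
    abi:   CXX11 ABI flag (TRUE/FALSE in the path and filename)
    arch:  "linux_x86_64" or "linux_aarch64"
    """
    abi_str = "TRUE" if abi else "FALSE"
    # Torch 2.9 x86_64 paths have no abi* folder; every other path does.
    abi_dir = "" if (torch == "2.9" and arch == "linux_x86_64") else f"abi{abi_str}/"
    name = (f"flash_attn-2.8.3+cu12torch{torch}cxx11abi{abi_str}"
            f"-{cp}-{cp}-{arch}.whl")
    return f"{BASE}/{arch}/torch{torch}/cu12/{abi_dir}{cp}/{name}"

# Example: the Torch 2.8 / Python 3.12 / abiFALSE wheel listed above.
print(wheel_url("2.8", "cp312", abi=False))
```

The resulting URL can then be installed directly (`pip install "<url>"`) or pinned in `requirements.txt` using the `flash-attn @ <url>` direct-reference form the README shows.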