Update tag
README.md
CHANGED

@@ -1,5 +1,5 @@
 ---
 tags:
--
+- kernels
 ---
 This CUDA extension implements fused dropout + residual + LayerNorm from the [flash-attention](https://github.com/Dao-AILab/flash-attention/tree/main/csrc/layer_norm) repo.
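For context on what the tagged extension computes, here is a minimal unfused NumPy reference of the dropout + residual + LayerNorm pattern. This is a sketch of the operation's semantics only, not the extension's API; the function name, `eps` default, and argument order are illustrative assumptions.

```python
import numpy as np

def dropout_add_layer_norm_ref(x, residual, weight, bias, p,
                               eps=1e-5, training=True, rng=None):
    """Unfused reference: LayerNorm(dropout(x) + residual).

    Illustrative sketch only -- the CUDA extension fuses these three
    steps into a single kernel; names here are not its real API.
    """
    if training and p > 0:
        rng = rng or np.random.default_rng()
        # Inverted dropout: zero out elements, rescale the survivors.
        mask = rng.random(x.shape) >= p
        x = x * mask / (1.0 - p)
    h = x + residual                      # residual add
    mean = h.mean(-1, keepdims=True)      # LayerNorm over the last dim
    var = h.var(-1, keepdims=True)
    return (h - mean) / np.sqrt(var + eps) * weight + bias

# Example with (batch, seqlen, hidden) inputs; eval mode skips dropout.
x = np.random.default_rng(0).normal(size=(2, 4, 8))
residual = np.random.default_rng(1).normal(size=(2, 4, 8))
out = dropout_add_layer_norm_ref(x, residual,
                                 np.ones(8), np.zeros(8),
                                 p=0.1, training=False)
```

A fused kernel avoids materializing the intermediate dropout and residual tensors in global memory, which is the main win over running the three ops separately.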