danieldk (HF Staff) and lysandre (HF Staff) committed
Commit 56bca05 · verified · 1 parent: 2c8e98a

Update README.md (#1)

- Update README.md (b07548cf419fe6fa19eabc118c6cdf0275fbe9a3)


Co-authored-by: Lysandre <lysandre@users.noreply.huggingface.co>

Files changed (1):

- README.md (+5 −1)
README.md CHANGED

```diff
@@ -1,3 +1,7 @@
+---
+tags:
+- kernel
+---
 This CUDA extension implements fused dropout + residual + LayerNorm, building on
 Apex's [FastLayerNorm](https://github.com/NVIDIA/apex/tree/master/apex/contrib/layer_norm).
 Major changes:
@@ -17,4 +21,4 @@ cd csrc/layer_norm && pip install .
 
 As of 2024-01-05, this extension is no longer used in the FlashAttention repo.
 We've instead switched to a Triton-based
-[implementation](https://github.com/Dao-AILab/flash-attention/blob/main/flash_attn/ops/triton/layer_norm.py).
+[implementation](https://github.com/Dao-AILab/flash-attention/blob/main/flash_attn/ops/triton/layer_norm.py).
```
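The README describes a kernel that fuses dropout, a residual add, and LayerNorm into a single pass. For context, here is a minimal NumPy sketch of the unfused semantics that such a kernel computes; the function name, signature, and `eps` default are illustrative, not the extension's actual API.

```python
import numpy as np

def dropout_add_layer_norm_ref(x, residual, weight, bias, p, rng,
                               eps=1e-5, training=True):
    # Unfused reference for: out = LayerNorm(Dropout(x) + residual)
    if training and p > 0.0:
        # Inverted dropout: zero elements with probability p,
        # scale survivors by 1/(1-p) so the expectation is unchanged.
        mask = (rng.random(x.shape) >= p).astype(x.dtype)
        x = x * mask / (1.0 - p)
    h = x + residual  # residual add
    # LayerNorm over the last dimension, with learnable scale and shift.
    mu = h.mean(axis=-1, keepdims=True)
    var = h.var(axis=-1, keepdims=True)
    return (h - mu) / np.sqrt(var + eps) * weight + bias
```

The fused CUDA kernel avoids materializing the intermediate dropout and residual-add tensors in global memory, which is the main motivation for fusing these three memory-bound operations.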