danieldk HF Staff committed on
Commit 5d26454 · verified · 1 Parent(s): 933d590

Update tag

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -1,5 +1,5 @@
  ---
  tags:
- - kernel
+ - kernels
  ---
  This CUDA extension implements fused dropout + residual + LayerNorm from the [flash-attention](https://github.com/Dao-AILab/flash-attention/tree/main/csrc/layer_norm) repo.