---
tags:
- kernels
---
This CUDA extension implements the fused dropout + residual + LayerNorm kernel from the [flash-attention](https://github.com/Dao-AILab/flash-attention/tree/main/csrc/layer_norm) repo.
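
Mathematically, the fused kernel computes the same result as applying the three operations separately. A minimal NumPy sketch of that unfused reference (the parameter names here are illustrative, not the kernel's actual API; the real kernel fuses these steps into a single GPU pass):

```python
import numpy as np

def dropout_residual_layernorm(x, residual, gamma, beta,
                               p=0.1, eps=1e-5, training=True, seed=0):
    # Inverted dropout: zero elements with probability p, scale the rest by 1/(1-p).
    if training and p > 0:
        rng = np.random.default_rng(seed)
        mask = rng.random(x.shape) >= p
        x = x * mask / (1.0 - p)
    # Residual add.
    h = x + residual
    # LayerNorm over the last dimension.
    mean = h.mean(axis=-1, keepdims=True)
    var = h.var(axis=-1, keepdims=True)
    return (h - mean) / np.sqrt(var + eps) * gamma + beta
```

Fusing these steps avoids materializing the intermediate dropout and residual tensors in global memory, which is where the speedup over the unfused sequence comes from.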