---
tags:
- kernels
---

This CUDA extension implements fused dropout + residual + LayerNorm from the [flash-attention](https://github.com/Dao-AILab/flash-attention/tree/main/csrc/layer_norm) repo.
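For reference, the fused kernel computes the same result as the following unfused sequence: apply dropout to the input, add the residual, then LayerNorm the sum. A minimal NumPy sketch of that reference computation is below; the function name and signature are illustrative and do not reflect the extension's actual API.

```python
import numpy as np

def dropout_add_layer_norm_ref(x, residual, weight, bias,
                               p=0.1, eps=1e-5, training=True):
    """Unfused reference: LayerNorm(dropout(x) + residual).

    Illustrative sketch only -- not the CUDA extension's API.
    """
    if training and p > 0.0:
        # Inverted dropout: zero elements with probability p,
        # scale survivors by 1/(1-p) so the expectation is unchanged.
        mask = (np.random.rand(*x.shape) >= p).astype(x.dtype)
        x = x * mask / (1.0 - p)
    z = x + residual                       # residual add
    mean = z.mean(axis=-1, keepdims=True)  # per-row statistics
    var = z.var(axis=-1, keepdims=True)
    z_hat = (z - mean) / np.sqrt(var + eps)
    return z_hat * weight + bias           # learned affine transform

# Example: with weight=1, bias=0, and dropout disabled, each output
# row has (approximately) zero mean and unit variance.
x = np.random.randn(2, 8).astype(np.float32)
res = np.random.randn(2, 8).astype(np.float32)
w = np.ones(8, dtype=np.float32)
b = np.zeros(8, dtype=np.float32)
out = dropout_add_layer_norm_ref(x, res, w, b, p=0.0, training=False)
```

The fused CUDA kernel performs these three steps in a single pass over the data, avoiding the extra global-memory round trips of the unfused version.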