---
tags:
- kernels
---

> [!WARNING]
> This repository will soon be deleted as it's now deprecated. Please use [kernels-community/layer-norm](https://huggingface.co/kernels-community/layer-norm).

This CUDA extension implements fused dropout + residual + LayerNorm from the [flash-attention](https://github.com/Dao-AILab/flash-attention/tree/main/csrc/layer_norm) repo.
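
For orientation, the fused op computes `LayerNorm(dropout(x) + residual)` in a single kernel launch rather than three separate passes over the tensor. Below is a minimal unfused PyTorch sketch of the same semantics; the function name and shapes are illustrative only, not the extension's actual API:

```python
import torch
import torch.nn.functional as F

def dropout_add_layer_norm_ref(x, residual, weight, bias, p, eps=1e-5, training=True):
    # Unfused reference: dropout on the incoming activations, add the
    # residual branch, then LayerNorm over the hidden dimension --
    # the three ops the CUDA extension fuses into one kernel.
    x = F.dropout(x, p=p, training=training)
    x = x + residual
    return F.layer_norm(x, x.shape[-1:], weight, bias, eps)

# Illustrative shapes: (batch, seqlen, hidden)
hidden = 768
x = torch.randn(2, 128, hidden)
residual = torch.randn(2, 128, hidden)
weight = torch.ones(hidden)
bias = torch.zeros(hidden)
out = dropout_add_layer_norm_ref(x, residual, weight, bias, p=0.1)
print(out.shape)  # torch.Size([2, 128, 768])
```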