---
tags:
- kernels
---
> [!WARNING]
> This repository will soon be deleted as it's now deprecated. Please use [kernels-community/layer-norm](https://huggingface.co/kernels-community/layer-norm).
This CUDA extension implements fused dropout + residual + LayerNorm from the [flash-attention](https://github.com/Dao-AILab/flash-attention/tree/main/csrc/layer_norm) repo.
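The fused kernel applies dropout to an input, adds a residual connection, then layer-normalizes the result in a single pass. As a rough sketch of the same math (a pure-Python reference only, not the extension's actual API; the function name and signature here are illustrative):

```python
import math
import random

def dropout_residual_layernorm(x, residual, p=0.1, eps=1e-5,
                               gamma=None, beta=None, training=True):
    """Reference semantics: LayerNorm(dropout(x) + residual).

    Illustrative only -- the real CUDA kernel fuses these steps
    to avoid extra memory round-trips.
    """
    n = len(x)
    gamma = gamma if gamma is not None else [1.0] * n
    beta = beta if beta is not None else [0.0] * n

    # Inverted dropout: zero elements with probability p, rescale survivors
    if training and p > 0:
        scale = 1.0 / (1.0 - p)
        x = [xi * scale if random.random() >= p else 0.0 for xi in x]

    # Residual add
    z = [xi + ri for xi, ri in zip(x, residual)]

    # LayerNorm over the feature dimension
    mean = sum(z) / n
    var = sum((zi - mean) ** 2 for zi in z) / n
    inv_std = 1.0 / math.sqrt(var + eps)
    return [(zi - mean) * inv_std * g + b
            for zi, g, b in zip(z, gamma, beta)]
```

Fusing these three steps matters because each unfused op would otherwise read and write the full activation tensor from GPU memory; the kernel keeps intermediates in registers/shared memory instead.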