---
library_name: kernels
license: apache-2.0
---

This is the repository card of `kernels-community/sage-attention`, which has been pushed to the Hub. It was built to be used with the kernels library. This card was automatically generated.

## How to use

```python
# Make sure `kernels` is installed: `pip install -U kernels`
from kernels import get_kernel

kernel_module = get_kernel("kernels-community/sage-attention")  # <- change the ID if needed
per_block_int8 = kernel_module.per_block_int8

per_block_int8(...)
```

## Available functions

- `per_block_int8`
- `per_warp_int8`
- `sub_mean`
- `per_channel_fp8`
- `sageattn`
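As a rough illustration of the idea behind `per_block_int8` (each block of values gets its own INT8 scale), here is a pure-Python sketch. It mirrors the general per-block quantization technique only; it is not this CUDA kernel's actual algorithm, layout, or signature.

```python
# Illustrative sketch of per-block INT8 quantization: split the input into
# blocks and quantize each block with its own scale = max(|x|) / 127.
# This is NOT the CUDA kernel's implementation, just the general idea.

def quantize_per_block_int8(values, block_size):
    """Quantize `values` blockwise to int8, returning (blocks, scales)."""
    quantized, scales = [], []
    for start in range(0, len(values), block_size):
        block = values[start:start + block_size]
        scale = max(abs(x) for x in block) / 127 or 1.0  # avoid a zero scale
        scales.append(scale)
        quantized.append([round(x / scale) for x in block])
    return quantized, scales

def dequantize(quantized, scales):
    """Reconstruct approximate float values from int8 blocks and scales."""
    return [q * s for blk, s in zip(quantized, scales) for q in blk]

# Example: two blocks of two values, each with its own scale.
blocks, scales = quantize_per_block_int8([0.5, -1.0, 2.0, 0.25], block_size=2)
```

Per-block scales limit the blast radius of outliers: one large value only coarsens the quantization of its own block rather than the whole tensor.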

## Supported backends

- cuda

## CUDA Capabilities

- 8.0
- 8.9
- 9.0a
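Because the build targets only the compute capabilities listed above, it can help to fail fast on other GPUs before loading the kernel. A minimal sketch, with the supported set copied from this card (`torch.cuda.get_device_capability()` in the comment is the standard way to read a device's capability; the runtime guard itself is an assumption, not part of this kernel's API):

```python
# Sketch: fail fast on GPUs this kernel does not ship binaries for.
# The supported set below is copied from this card; "9.0a" is the
# arch-specific variant of compute capability 9.0 (Hopper).
SUPPORTED_CAPABILITIES = {(8, 0), (8, 9), (9, 0)}

def is_supported(major: int, minor: int) -> bool:
    """Return True if (major, minor) matches a capability this build targets."""
    return (major, minor) in SUPPORTED_CAPABILITIES

# At runtime (requires torch and a CUDA device):
#   import torch
#   major, minor = torch.cuda.get_device_capability()
#   if not is_supported(major, minor):
#       raise RuntimeError(f"this sage-attention build does not target sm_{major}{minor}")
```

Note that common consumer capabilities such as 8.6 are absent from the list above, so the guard would reject them.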

## Benchmarks

[TODO: provide benchmarks if available]

## Source code

[TODO: provide original source code and other relevant citations if available]

## Notes

[TODO: provide additional notes about this kernel if needed]