This page shows how to use sigmoid-neuron/flash-attention-1-triton with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Kernels
How to use sigmoid-neuron/flash-attention-1-triton with Kernels:

```python
# !pip install kernels
from kernels import get_kernel

kernel = get_kernel("sigmoid-neuron/flash-attention-1-triton")
```

- Notebooks
- Google Colab
- Kaggle