# SageAttention Wheels (CUDA 13.x) 🚀
Prebuilt SageAttention 2.2.0 wheels compiled for Linux x86_64 with CUDA 13.x support.
This repository provides ready-to-use binary wheels for different Python and PyTorch versions, optimized for modern NVIDIA GPUs (Ada / Hopper / Ampere).
---
# 📦 Available Wheels
| Python Version | PyTorch Version | CUDA | File |
|----------------|----------------|------|------|
| 3.11 | 2.10 | cu13 | sageattention-2.2.0-python3.11-pytorch2.10-cu13-linux_x86_64.whl |
| 3.12 | 2.10 | cu13 | sageattention-2.2.0-python3.12-pytorch2.10-cu13-linux_x86_64.whl |
| 3.12 | 2.11 | cu13 | sageattention-2.2.0-python3.12-pytorch2.11-cu13-linux_x86_64.whl |
| 3.13 | 2.11 | cu13 | sageattention-2.2.0-python3.13-pytorch2.11-cu13-linux_x86_64.whl |
---
# ⚡ Requirements
- Linux x86_64
- NVIDIA GPU (Ada / Ampere / Hopper tested)
- CUDA 13.x runtime / toolkit
- PyTorch version matching the wheel
- Python version matching the wheel
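To check which table row matches your setup, a quick sketch using only standard attributes (`torch.version.cuda` reports the CUDA version PyTorch was built against):

```python
# Report the Python / PyTorch / CUDA combination so you can pick the matching wheel.
import sys

print(f"Python: {sys.version_info.major}.{sys.version_info.minor}")

try:
    import torch
    print(f"PyTorch: {torch.__version__}")
    print(f"CUDA build: {torch.version.cuda}")
except ImportError:
    print("PyTorch is not installed yet")
```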
---
# 🧠 Installation
## 1. Create a virtual environment

```bash
# Use the Python version that matches your chosen wheel
python3.12 -m venv venv
source venv/bin/activate
```
---
## 2. Install PyTorch (CUDA 13)

```bash
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu130
```
---
## 3. Install SageAttention wheel

```bash
pip install sageattention-2.2.0-python3.12-pytorch2.11-cu13-linux_x86_64.whl
```
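To confirm the wheel landed in the active environment, a small check (assuming the installed distribution is named `sageattention`, as the wheel filename suggests):

```python
# Print the installed SageAttention version, if any.
from importlib.metadata import version, PackageNotFoundError

try:
    print("sageattention", version("sageattention"))
except PackageNotFoundError:
    print("sageattention is not installed in this environment")
```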
---
# 🧪 Quick Test

```bash
python -c "import torch; print(torch.cuda.is_available())"
python -c "import sageattention; print('SageAttention loaded successfully')"
```
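Beyond the import check, a functional smoke test is possible. This is a sketch: `sageattn` is the main entry point in the upstream SageAttention API, the tensor shapes are illustrative, and a CUDA GPU is required for the actual call, so the script skips gracefully when one is absent:

```python
# Smoke-test SageAttention; skips gracefully when PyTorch or a CUDA GPU is missing.
import importlib.util


def smoke_test() -> str:
    if importlib.util.find_spec("torch") is None:
        return "skipped: PyTorch not installed"
    import torch
    if not torch.cuda.is_available():
        return "skipped: no CUDA GPU"
    from sageattention import sageattn
    # (batch, heads, seq_len, head_dim) in the "HND" tensor layout
    q = torch.randn(2, 8, 1024, 64, dtype=torch.float16, device="cuda")
    k = torch.randn_like(q)
    v = torch.randn_like(q)
    out = sageattn(q, k, v, tensor_layout="HND", is_causal=False)
    assert out.shape == q.shape
    return "ok"


print(smoke_test())
```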
---
# 🚀 Notes
- Wheels are precompiled, so no local CUDA build is required
- Your Python and PyTorch versions must match the wheel exactly
- CUDA 13.x required
- Optimized for sm_80+ GPUs
---
# ⚠️ Troubleshooting
If CUDA is not found, point your environment at the CUDA installation (the path below is an example and may differ on your system):

```bash
export CUDA_HOME=/opt/cuda
export PATH=$CUDA_HOME/bin:$PATH
export LD_LIBRARY_PATH=$CUDA_HOME/lib64:$LD_LIBRARY_PATH
```
---
# 💬 Support
Matrix: @aimiko:mochiart.moe
---
# 📜 License
Refer to upstream SageAttention repository.
This repo contains only prebuilt binaries.