Update README.md
README.md
CHANGED:

@@ -112,7 +112,7 @@ print(f"# A:\n{output}\n")
 ```
 
 __System requirements:__
-* GPUs: H100, L40s
+* GPUs: H100, L40s, 4090, 5090
 * CPU: AMD, Intel
 * Python: 3.10-3.12
 
@@ -123,6 +123,14 @@ To work with our models just run these lines in your terminal:
 pip install thestage
 pip install 'thestage-elastic-models[nvidia]'
 pip install flash_attn==2.7.3 --no-build-isolation
+
+# Or, for Blackwell support:
+pip install 'thestage-elastic-models[blackwell]'
+pip install torch==2.7.0+cu128 torchvision torchaudio --index-url https://download.pytorch.org/whl/cu128
+# Download the flash_attn wheel that matches your system from https://github.com/Zarrac/flashattention-blackwell-wheels-whl-ONLY-5090-5080-5070-5060-flash-attention-/releases/tag/FlashAttention
+mv flash_attn-2.7.4.post1-rtx5090-torch2.7.0cu128cxx11abiTRUE-cp311-linux_x86_64.whl flash_attn-2.7.4.post1-0rtx5090torch270cu128cxx11abiTRUE-cp311-cp311-linux_x86_64.whl
+pip install flash_attn-2.7.4.post1-0rtx5090torch270cu128cxx11abiTRUE-cp311-cp311-linux_x86_64.whl
+
 pip uninstall apex
 ```
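A note on the `mv` step in the Blackwell instructions: pip only installs wheels whose filename matches the PEP 427 pattern `{name}-{version}[-{build tag}]-{python tag}-{abi tag}-{platform tag}.whl`, where the optional build tag must start with a digit. A plausible reading of the rename (a sketch, not part of the commit) is that it folds the release's free-form variant suffix into a digit-prefixed build tag and repeats `cp311` as the ABI tag, so the file parses as a valid wheel name:

```python
# The renamed filename from the README, minus the .whl suffix, splits into
# exactly the six PEP 427 segments pip expects:
#   name - version - build tag - python tag - abi tag - platform tag
renamed = "flash_attn-2.7.4.post1-0rtx5090torch270cu128cxx11abiTRUE-cp311-cp311-linux_x86_64.whl"
name, version, build, py_tag, abi_tag, plat_tag = renamed[: -len(".whl")].split("-")

assert build[0].isdigit()  # PEP 427 requires a build tag to start with a digit
print(name, version, py_tag, abi_tag, plat_tag)
# → flash_attn 2.7.4.post1 cp311 cp311 linux_x86_64
```

The original release filename fails this split: `rtx5090` would land in the build-tag slot without a leading digit, and `torch2.7.0cu128cxx11abiTRUE` contains dots that are illegal in a Python tag, so pip rejects it until renamed.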