---
title: "Installation"
format:
  html:
    toc: true
    toc-depth: 3
    number-sections: true
execute:
  enabled: false
---
|
|
This guide covers all the ways you can install and set up Axolotl for your environment.
|
|
## Requirements
|
|
- NVIDIA GPU (Ampere architecture or newer for `bf16` and Flash Attention) or AMD GPU
- Python ≥3.10
- PyTorch ≥2.4.1
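Before installing, you can confirm the basics are in place. This quick check is a convenience, not part of the official steps; it assumes `python3` is on your `PATH`:

```{.bash}
# Quick sanity check of the prerequisites.
python3 --version    # should report 3.10 or newer
# Print the installed PyTorch version, or note that it is missing.
python3 -c 'import torch; print(torch.__version__)' 2>/dev/null \
  || echo "PyTorch not installed yet"
```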
|
|
## Installation Methods
|
|
### PyPI (Recommended)
|
|
```{.bash}
pip3 install -U packaging setuptools wheel ninja
pip3 install --no-build-isolation axolotl[flash-attn,deepspeed]
```
|
|
We use `--no-build-isolation` so that the build can detect an already-installed
PyTorch version rather than clobbering it, and so that the versions of
PyTorch-specific dependencies and other co-dependencies are resolved against
what is already installed.
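Since the resolver keys off what is already present, it can help to see which PyTorch build (if any) is installed first. This check is a convenience, not part of the official steps:

```{.bash}
# Show the currently installed PyTorch, if any, before installing Axolotl.
python3 -m pip show torch 2>/dev/null || echo "torch not installed"
```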
|
|
### Edge Builds
|
|
For the latest features between releases:
|
|
```{.bash}
git clone https://github.com/axolotl-ai-cloud/axolotl.git
cd axolotl
pip3 install -U packaging setuptools wheel ninja
pip3 install --no-build-isolation -e '.[flash-attn,deepspeed]'
```
|
|
### Docker
|
|
```{.bash}
docker run --gpus '"all"' --rm -it axolotlai/axolotl:main-latest
```
|
|
For development with Docker:
|
|
```{.bash}
docker compose up -d
```
|
|
::: {.callout-tip}
A more powerful Docker run command, which mounts your working directory and Hugging Face cache into the container:

```{.bash}
docker run --privileged --gpus '"all"' --shm-size 10g --rm -it \
  --name axolotl --ipc=host \
  --ulimit memlock=-1 --ulimit stack=67108864 \
  --mount type=bind,src="${PWD}",target=/workspace/axolotl \
  -v ${HOME}/.cache/huggingface:/root/.cache/huggingface \
  axolotlai/axolotl:main-latest
```
:::
|
|
Please refer to the [Docker documentation](docker.qmd) for more information on the different Docker images that are available.
|
|
### Cloud Environments
|
|
#### Cloud GPU Providers
|
|
For providers supporting Docker:
|
|
- Use the `axolotlai/axolotl-cloud:main-latest` image
- Available on:
  - [Latitude.sh](https://latitude.sh/blueprint/989e0e79-3bf6-41ea-a46b-1f246e309d5c)
  - [JarvisLabs.ai](https://jarvislabs.ai/templates/axolotl)
  - [RunPod](https://runpod.io/gsc?template=v2ickqhz9s&ref=6i7fkpdz)
  - [Novita](https://novita.ai/gpus-console?templateId=311)
|
|
#### Google Colab
|
|
Use our [example notebook](../examples/colab-notebooks/colab-axolotl-example.ipynb).
|
|
## Platform-Specific Instructions
|
|
### macOS
|
|
```{.bash}
pip3 install --no-build-isolation -e '.'
```
|
|
See @sec-troubleshooting for Mac-specific issues.
|
|
### Windows
|
|
::: {.callout-important}
We recommend using WSL2 (Windows Subsystem for Linux) or Docker.
:::
|
|
## Environment Setup
|
|
### WSL2
|
|
1. Install Python ≥3.10
2. Install PyTorch: https://pytorch.org/get-started/locally/
3. Install Axolotl:

    ```{.bash}
    pip3 install -U packaging setuptools wheel ninja
    pip3 install --no-build-isolation -e '.[flash-attn,deepspeed]'
    ```

4. (Optional) Log in to Hugging Face:

    ```{.bash}
    huggingface-cli login
    ```
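If you prefer a non-interactive login (for example in provisioning scripts), the Hugging Face libraries also read a token from the `HF_TOKEN` environment variable. The token value below is a placeholder, not a real token:

```{.bash}
# Non-interactive alternative to `huggingface-cli login`:
# export a token for huggingface_hub to pick up. Replace the
# placeholder with a token from https://huggingface.co/settings/tokens
export HF_TOKEN="hf_xxxxxxxxxxxxxxxx"
```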
|
|
## Troubleshooting {#sec-troubleshooting}
|
|
If you encounter installation issues, see our [FAQ](faq.qmd) and [Debugging Guide](debugging.qmd).
|
|