llama-cpp-python can be built with hardware acceleration backends such as CUDA (via cuBLAS) and AVX-512. For example, to build and install from source with CUDA support:
```sh
$ git clone "https://github.com/abetlen/llama-cpp-python" --depth=1
$ cd llama-cpp-python
$ git submodule update --init --recursive
$ CMAKE_ARGS="-DLLAMA_CUBLAS=on" python -m pip install .
```
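Once installed, the bindings can be used from Python. A minimal sketch, assuming a GGUF model has already been downloaded to a local path (the model filename below is a hypothetical placeholder; `n_gpu_layers=-1` offloads all layers to the GPU when a CUDA build is in use):

```python
from pathlib import Path

# Hypothetical model path -- replace with a GGUF model you have downloaded.
MODEL_PATH = Path("models/llama-2-7b.Q4_K_M.gguf")

if MODEL_PATH.exists():
    from llama_cpp import Llama

    # n_gpu_layers=-1 offloads every layer to the GPU (requires a CUDA build).
    llm = Llama(model_path=str(MODEL_PATH), n_gpu_layers=-1)

    # The Llama object is callable for simple completion-style inference.
    out = llm("Q: What is the capital of France? A:", max_tokens=16)
    print(out["choices"][0]["text"])
else:
    print(f"model not found at {MODEL_PATH}; download a GGUF model first")
```

The completion call returns an OpenAI-style dict, so the generated text lives under `choices[0]["text"]`.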