---
license: mit
language:
- en
tags:
- llama.cpp
- gguf
- pytorch
- binaries
---

# Llama.cpp Spinoff - Python Binary Wheels

Pre-built Python wheel (`.whl`) binaries for [mention your specific fork name here], allowing easy installation without compiling any C++ code.
|
## Installation

Install the latest version via pip:

```bash
pip install turbocpp
```
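After installation, you can confirm the package is importable without loading a model. This is a minimal sketch; it assumes the import name matches the PyPI package name `turbocpp`, which forks sometimes change:

```python
import importlib.util

# `turbocpp` as the module name is an assumption; the fork may expose a
# different import name than its PyPI package name.
spec = importlib.util.find_spec("turbocpp")
print("installed" if spec is not None else "not installed")
```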

## Features
|
- **Pre-compiled**: No compiler toolchain (CMake, GCC, or Visual Studio) needed.
- **Optimized**: Built with performance optimizations such as AVX, AVX2, and cuBLAS.
- **GGUF supported**: Works with the latest GGUF model format.
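GGUF files start with the 4-byte magic `GGUF`, so a quick sanity check can catch a mislabeled model file before loading it. A small sketch (the helper name and the fabricated header bytes are illustrative, not part of any library API):

```python
import struct

GGUF_MAGIC = b"GGUF"  # per the GGUF spec, files begin with these 4 bytes


def looks_like_gguf(data: bytes) -> bool:
    """Return True if the byte string starts with the GGUF magic."""
    return data[:4] == GGUF_MAGIC


# A fabricated header for demonstration: magic + little-endian version field.
fake_header = GGUF_MAGIC + struct.pack("<I", 3)
print(looks_like_gguf(fake_header))   # True
print(looks_like_gguf(b"not a model"))  # False
```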