---
license: mit
tags:
- llama-cpp-python
- prebuilt-wheels
- huggingface-spaces
- cpu-only
- python-3.13
---

# Llama-CPP-Python Pre-built Wheels (CPU Only)

This repository provides pre-compiled Python wheels for `llama-cpp-python`, specifically optimized for **Hugging Face Spaces (Free CPU Tier)**.

## 🚀 Key Features
- **Zero Compilation:** Skips the 15-minute C++ build process on HF Spaces.
- **Python 3.13 Support:** Built for the latest Python version.
- **Generic CPU:** Compiled with `GGML_NATIVE=OFF` to ensure compatibility with older cloud processors (no "Illegal Instruction" errors).

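For reference, wheels like these can be produced via `llama-cpp-python`'s `CMAKE_ARGS` build mechanism. A hedged sketch (only `GGML_NATIVE=OFF` is confirmed by this repo; the version pin and output directory are assumptions):

```shell
# GGML_NATIVE=OFF disables -march=native, so the resulting binary
# avoids newer CPU instructions that older cloud processors lack.
CMAKE_ARGS="-DGGML_NATIVE=OFF" pip wheel llama-cpp-python==0.3.16 --no-deps -w dist/
```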
## 🛠 Usage in HF Spaces

### Dockerfile (Recommended)
Add this to your Dockerfile to install the wheel instantly:
```dockerfile
RUN pip install https://huggingface.co/Jameson040/llama-cpp-python-wheels/resolve/main/llama_cpp_python-0.3.16-cp313-cp313-linux_x86_64.whl
```
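For context, a fuller Dockerfile sketch around that line (the base image tag and app filename are assumptions, not part of this repo):

```dockerfile
# python:3.13-slim matches the wheel's cp313 tag (assumed base image)
FROM python:3.13-slim

# Install the pre-built wheel directly from the Hub -- no C++ toolchain needed
RUN pip install https://huggingface.co/Jameson040/llama-cpp-python-wheels/resolve/main/llama_cpp_python-0.3.16-cp313-cp313-linux_x86_64.whl

# Copy and run your app (placeholder filename)
COPY app.py .
CMD ["python", "app.py"]
```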