Soumik Bose committed on
Commit c5e3be7 · 1 Parent(s): fcda744
Files changed (1)
  1. requirements.txt +22 -3
requirements.txt CHANGED
@@ -1,7 +1,26 @@
-# Use JamePeng's pre-built CPU wheel for Linux
-llama-cpp-python @ https://github.com/JamePeng/llama-cpp-python/releases/download/v0.3.17-Basic-linux-20251202/llama_cpp_python-0.3.17-cp310-cp310-linux_x86_64.whl
+# Use mrzeeshanahmed's pre-built manylinux wheel
+llama-cpp-python @ https://github.com/mrzeeshanahmed/llama-cpp-python/releases/download/v0.3.17-manylinux-x86_64/llama_cpp_python-0.3.17-cp310-cp310-manylinux_2_17_x86_64.whl
 
 fastapi>=0.115.0
 uvicorn>=0.30.0
 pydantic>=2.8.0
-huggingface-hub>=0.24.0
+huggingface-hub>=0.24.0
+```
+
+**Note the underscore vs hyphen**: The package name in the wheel filename uses `llama_cpp_python` (with underscores), not `llama-cpp-python`.
+
+If this still gives 404, here's a **bulletproof fallback** - just build it from source but with a timeout workaround:
+
+## Alternative: Split the build into pre-requirements
+
+Create `packages.txt`:
+```
+build-essential
+cmake
+libopenblas-dev
+```
+
+Create `pre-requirements.txt`:
+```
+--extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cpu
+llama-cpp-python>=0.2.90
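
The underscore-vs-hyphen note in the committed text reflects the wheel filename convention: in a wheel filename, runs of `-`, `_`, and `.` in the project name are escaped to a single underscore, so the project `llama-cpp-python` appears as `llama_cpp_python`. A minimal sketch of checking this programmatically (the function names here are illustrative, not part of any file in this commit):

```python
import re

def dist_name_from_wheel(filename: str) -> str:
    """Extract the distribution-name field from a wheel filename.

    Wheel filenames have the form
    {dist}-{version}-{python tag}-{abi tag}-{platform tag}.whl,
    so the first hyphen-separated field is the (escaped) project name.
    """
    stem = filename.removesuffix(".whl")
    dist, _version, _tags = stem.split("-", 2)
    return dist

def escape_project_name(project: str) -> str:
    # Normalize the project name the way wheel filenames do:
    # collapse runs of -, _ and . into a single underscore.
    return re.sub(r"[-_.]+", "_", project).lower()

wheel = "llama_cpp_python-0.3.17-cp310-cp310-manylinux_2_17_x86_64.whl"
# The name embedded in the wheel matches the escaped form of the
# project name used in requirements.txt.
assert dist_name_from_wheel(wheel) == escape_project_name("llama-cpp-python")
```

This is why a `name @ URL` pin in `requirements.txt` keeps the hyphenated form on the left of `@` while the wheel URL itself contains underscores.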