Use pre-built llama-cpp-python wheel for cu124
requirements.txt (CHANGED, +2 -1)
@@ -1,4 +1,5 @@
 gradio>=4.0.0
 huggingface_hub>=0.20.0
 spaces
-llama-cpp-python
+--extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cu124
+llama-cpp-python
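The added line uses pip's `--extra-index-url` directive inside requirements.txt, pointing pip at abetlen's pre-built CUDA 12.4 wheel index so `llama-cpp-python` installs as a binary wheel instead of compiling from source. A minimal sketch of the equivalent one-off install (assumes a CUDA 12.4-compatible environment; not part of the original commit):

```shell
# Install llama-cpp-python from the pre-built cu124 wheel index;
# pip still falls back to PyPI for any package not on the extra index.
pip install llama-cpp-python \
  --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cu124
```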