Deploy FastAPI server with CodeLlama 7B
requirements.txt CHANGED (+7 −3):

```diff
@@ -1,3 +1,7 @@
-
-
-
+--extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cpu
+llama-cpp-python>=0.2.0
+fastapi>=0.104.0
+uvicorn[standard]>=0.24.0
+pydantic>=2.0.0
+huggingface-hub>=0.19.0
+python-multipart
```

(The three removed lines' contents were not preserved in the page capture.)