Deploy FastAPI server with CodeLlama 7B
requirements.txt CHANGED (+2 -2)

@@ -1,5 +1,5 @@
-
-llama-cpp-python
+# llama-cpp-python will be built from source in Dockerfile (not from wheel)
+# Removed: --extra-index-url and llama-cpp-python line
 fastapi>=0.104.0
 uvicorn[standard]>=0.24.0
 pydantic>=2.0.0
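The new comments say llama-cpp-python is compiled from source inside the Dockerfile instead of being pulled as a prebuilt wheel from an extra index. A minimal sketch of what that Dockerfile stage might look like — the base image, build flags, app module name, and port are assumptions for illustration, not taken from this repo:

```dockerfile
# Hypothetical sketch: compile llama-cpp-python from source instead of a wheel.
FROM python:3.11-slim

# Toolchain needed to build llama.cpp's native extension
RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential cmake git \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app
COPY requirements.txt .

# --no-binary forces a source build of llama-cpp-python;
# the remaining pinned requirements install normally afterwards.
RUN pip install --no-cache-dir --no-binary llama-cpp-python llama-cpp-python \
    && pip install --no-cache-dir -r requirements.txt

COPY . .
# 7860 is the default port Hugging Face Spaces expects; "main:app" is assumed.
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "7860"]
```

Building from source this way lets pip compile against the container's own toolchain (and any CPU features or CUDA setup it has), which is the usual reason to drop the wheel index line from requirements.txt.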