Update .env

.env CHANGED
@@ -162,7 +162,7 @@ MAX_PARALLEL_INSERT=2
 LLM_BINDING=openai
 LLM_MODEL=gpt-4o
 LLM_BINDING_HOST=https://api.openai.com/v1
-LLM_BINDING_API_KEY
+LLM_BINDING_API_KEY=${LLM_BINDING_API_KEY}
 
 ### Optional for Azure
 # AZURE_OPENAI_API_VERSION=2024-08-01-preview
@@ -213,20 +213,20 @@ OLLAMA_LLM_NUM_CTX=32768
 ### Embedding Configuration (Should not be changed after the first file processed)
 ### EMBEDDING_BINDING: ollama, openai, azure_openai, jina, lollms, aws_bedrock
 ####################################################################################
-
-EMBEDDING_BINDING=ollama
-EMBEDDING_MODEL=bge-m3:latest
-EMBEDDING_DIM=1024
-EMBEDDING_BINDING_API_KEY=your_api_key
+EMBEDDING_TIMEOUT=30
+# EMBEDDING_BINDING=ollama
+# EMBEDDING_MODEL=bge-m3:latest
+# EMBEDDING_DIM=1024
+# EMBEDDING_BINDING_API_KEY=your_api_key
 # If the embedding service is deployed within the same Docker stack, use host.docker.internal instead of localhost
-EMBEDDING_BINDING_HOST=http://localhost:11434
+# EMBEDDING_BINDING_HOST=http://localhost:11434
 
 ### OpenAI compatible (VoyageAI embedding openai compatible)
-
-
-
-
-
+EMBEDDING_BINDING=openai
+EMBEDDING_MODEL=text-embedding-3-small
+EMBEDDING_DIM=1536
+EMBEDDING_BINDING_HOST=https://api.openai.com/v1
+EMBEDDING_BINDING_API_KEY=${LLM_BINDING_API_KEY}
 
 ### Optional for Azure
 # AZURE_EMBEDDING_DEPLOYMENT=text-embedding-3-large
@@ -242,7 +242,7 @@ EMBEDDING_BINDING_HOST=http://localhost:11434
 # EMBEDDING_BINDING_API_KEY=your_api_key
 
 ### Optional for Ollama embedding
-OLLAMA_EMBEDDING_NUM_CTX=8192
+# OLLAMA_EMBEDDING_NUM_CTX=8192
 ### use the following command to see all support options for Ollama embedding
 ### lightrag-server --embedding-binding ollama --help
 
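The key change above is `LLM_BINDING_API_KEY=${LLM_BINDING_API_KEY}`: instead of a hard-coded secret, the `.env` value now references an environment variable, which keeps the real key out of the committed file (on Spaces it would typically come from a repository secret). A minimal sketch of how `${VAR}` substitution behaves, assuming the runtime expands it from the process environment (Python's `os.path.expandvars` mimics this):

```python
import os

# Hypothetical key value, standing in for a secret injected by the runtime.
os.environ["LLM_BINDING_API_KEY"] = "sk-demo"

# A line as it appears in the committed .env file.
line = "LLM_BINDING_API_KEY=${LLM_BINDING_API_KEY}"

# Split into key and raw value, then expand ${VAR} references.
key, _, raw = line.partition("=")
value = os.path.expandvars(raw)

print(f"{key}={value}")  # → LLM_BINDING_API_KEY=sk-demo
```

If the variable is not set in the environment, `os.path.expandvars` leaves the `${...}` placeholder untouched, so a missing secret shows up as a literal `${LLM_BINDING_API_KEY}` value rather than an empty string.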