TensorNull committed on
Commit · e9a58c8
Parent(s): c314689
feat: Add CometAPI provider
- conf/model_providers.yaml +6 -3
- docs/troubleshooting.md +1 -1
conf/model_providers.yaml CHANGED

@@ -27,6 +27,9 @@ chat:
   anthropic:
     name: Anthropic
     litellm_provider: anthropic
+  cometapi:
+    name: CometAPI
+    litellm_provider: cometapi
   deepseek:
     name: DeepSeek
     litellm_provider: deepseek

@@ -35,8 +38,8 @@ chat:
     litellm_provider: github_copilot
     kwargs:
       extra_headers:
-
-
+        "Editor-Version": "vscode/1.85.1"
+        "Copilot-Integration-Id": "vscode-chat"
   google:
     name: Google
     litellm_provider: gemini

@@ -109,4 +112,4 @@ embedding:
     litellm_provider: azure
   other:
     name: Other OpenAI compatible
-    litellm_provider: openai
+    litellm_provider: openai
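For reference, a sketch of how the chat section of conf/model_providers.yaml would read after this commit, assembled from the + lines above. The github_copilot key name, the omitted fields, and the exact indentation are assumptions inferred from the visible context, not part of the diff:

chat:
  anthropic:
    name: Anthropic
    litellm_provider: anthropic
  cometapi:                                      # new provider added by this commit
    name: CometAPI
    litellm_provider: cometapi
  deepseek:
    name: DeepSeek
    litellm_provider: deepseek
  github_copilot:                                # assumed key name; only the fields below appear in the diff
    litellm_provider: github_copilot
    kwargs:
      extra_headers:
        "Editor-Version": "vscode/1.85.1"        # header values set by this commit
        "Copilot-Integration-Id": "vscode-chat"
  google:
    name: Google
    litellm_provider: gemini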
docs/troubleshooting.md CHANGED

@@ -12,7 +12,7 @@ This page addresses frequently asked questions (FAQ) and provides troubleshootin
 Refer to the [Choosing your LLMs](installation.md#installing-and-using-ollama-local-models) section of the documentation for detailed instructions and examples for configuring different LLMs. Local models can be run using Ollama or LM Studio.

 > [!TIP]
-> Some LLM providers offer free usage of their APIs, for example Groq, Mistral or
+> Some LLM providers offer free usage of their APIs, for example Groq, Mistral, SambaNova or CometAPI.

 **6. How can I make Agent Zero retain memory between sessions?**
 Refer to the [How to update Agent Zero](installation.md#how-to-update-agent-zero) section of the documentation for instructions on how to update Agent Zero while retaining memory and data.