---
title: LiteLLM Proxy
sdk: docker
app_port: 7860
pinned: false
---
# LiteLLM Proxy (OpenAI-Compatible)
This Space runs the LiteLLM proxy, providing an OpenAI-compatible API that can route to multiple providers.
## Secrets to Set (Space Settings -> Secrets)
- `LITELLM_MASTER_KEY`: master key required in `Authorization: Bearer ...`
- `GMN_API_KEY`: for the third-party OpenAI-compatible gateway
- `GMN_API_BASE`: set to `https://gmn.chuangzuoli.com/v1`
- `ANTHROPIC_API_KEY`: for Anthropic routing
- `ARK_ANTHROPIC_BASE_URL`: set to `https://ark.cn-beijing.volces.com/api/coding`
- `ARK_ANTHROPIC_AUTH_TOKEN`: Volcengine ARK API key (Anthropic-compatible)
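These secrets are typically consumed by the proxy's `config.yaml` via LiteLLM's `os.environ/` references. A minimal sketch of what that config might look like (the upstream model identifiers and the `claude-sonnet` alias here are assumptions, not the Space's actual config):

```yaml
model_list:
  # OpenAI-compatible gateway (GMN)
  - model_name: gpt-4o-mini
    litellm_params:
      model: openai/gpt-4o-mini
      api_key: os.environ/GMN_API_KEY
      api_base: os.environ/GMN_API_BASE
  # Anthropic routing, exposed under the alias used in the examples below
  - model_name: claude-sonnet
    litellm_params:
      model: anthropic/claude-sonnet-4-20250514
      api_key: os.environ/ANTHROPIC_API_KEY

general_settings:
  master_key: os.environ/LITELLM_MASTER_KEY
```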
## Useful Endpoints
- `GET /health/liveliness` (no auth) - good for keep-alive pings
- `POST /chat/completions` (auth) - OpenAI-compatible chat completions
- `POST /v1/responses` (auth) - OpenAI-compatible Responses API (recommended)
## Example Curl
```bash
curl -sS https://YOUR-SPACE.hf.space/health/liveliness
```
```bash
curl -sS https://YOUR-SPACE.hf.space/chat/completions \
  -H "Authorization: Bearer $LITELLM_MASTER_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Say hi"}]
  }'
```
Responses API (some gateways require this exact path + OpenAI-Beta header):
```bash
curl -sS https://YOUR-SPACE.hf.space/v1/responses \
  -H "Authorization: Bearer $LITELLM_MASTER_KEY" \
  -H "Content-Type: application/json" \
  -H "OpenAI-Beta: responses=v1" \
  -d '{
    "model": "gpt-4o-mini",
    "input": "Say hi"
  }'
```
Anthropic via alias:
```bash
curl -sS https://YOUR-SPACE.hf.space/chat/completions \
  -H "Authorization: Bearer $LITELLM_MASTER_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet",
    "messages": [{"role": "user", "content": "Say hi from Claude"}]
  }'
```
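The same calls work from any OpenAI-compatible client. As a minimal Python-stdlib sketch (the Space URL and key are placeholders you must replace), this builds the equivalent of the chat-completions curl above:

```python
import json
import urllib.request

SPACE_URL = "https://YOUR-SPACE.hf.space"  # replace with your Space URL
MASTER_KEY = "sk-your-master-key"          # your LITELLM_MASTER_KEY

def build_chat_request(model: str, user_text: str) -> urllib.request.Request:
    """Build an OpenAI-compatible POST to the proxy's /chat/completions."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_text}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{SPACE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {MASTER_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("claude-sonnet", "Say hi from Claude")
# To actually send it:
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp)["choices"][0]["message"]["content"])
```

Libraries like the official `openai` SDK also work by pointing `base_url` at the Space and passing the master key as `api_key`.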