---
title: LiteLLM Proxy
sdk: docker
app_port: 7860
pinned: false
---
# LiteLLM Proxy (OpenAI-Compatible)
This Space runs the LiteLLM proxy, providing an OpenAI-compatible API that can route to multiple providers.
## Secrets to Set (Space Settings -> Secrets)
- `LITELLM_MASTER_KEY`: master key required in `Authorization: Bearer ...`
- `GMN_API_KEY`: for the third-party OpenAI-compatible gateway
- `GMN_API_BASE`: set to `https://gmn.chuangzuoli.com/v1`
- `ANTHROPIC_API_KEY`: for Anthropic routing
- `ARK_ANTHROPIC_BASE_URL`: set to `https://ark.cn-beijing.volces.com/api/coding`
- `ARK_ANTHROPIC_AUTH_TOKEN`: Volcengine ARK API key (Anthropic-compatible)
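These secrets are typically referenced from the proxy's `config.yaml` via LiteLLM's `os.environ/` syntax. A minimal sketch, assuming the alias names used in the curl examples below; the exact upstream model IDs (`openai/gpt-4o-mini`, the `anthropic/claude-...` version) are illustrative and depend on what your gateways expose:

```yaml
# config.yaml sketch (model IDs are assumptions; adjust to your gateways)
model_list:
  - model_name: gpt-4o-mini
    litellm_params:
      model: openai/gpt-4o-mini
      api_base: os.environ/GMN_API_BASE
      api_key: os.environ/GMN_API_KEY
  - model_name: claude-sonnet
    litellm_params:
      model: anthropic/claude-sonnet        # placeholder Anthropic model ID
      api_base: os.environ/ARK_ANTHROPIC_BASE_URL
      api_key: os.environ/ARK_ANTHROPIC_AUTH_TOKEN
```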
## Useful Endpoints
- `GET /health/liveliness` (no auth) - good for keep-alive pings
- `POST /chat/completions` (auth) - OpenAI-compatible chat completions
- `POST /v1/responses` (auth) - OpenAI-compatible Responses API (recommended)
## Example Curl
```bash
curl -sS https://YOUR-SPACE.hf.space/health/liveliness
```

```bash
curl -sS https://YOUR-SPACE.hf.space/chat/completions \
  -H "Authorization: Bearer $LITELLM_MASTER_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Say hi"}]
  }'
```
Responses API (some gateways require this exact path + OpenAI-Beta header):
```bash
curl -sS https://YOUR-SPACE.hf.space/v1/responses \
  -H "Authorization: Bearer $LITELLM_MASTER_KEY" \
  -H "Content-Type: application/json" \
  -H "OpenAI-Beta: responses=v1" \
  -d '{
    "model": "gpt-4o-mini",
    "input": "Say hi"
  }'
```
Anthropic via alias:
```bash
curl -sS https://YOUR-SPACE.hf.space/chat/completions \
  -H "Authorization: Bearer $LITELLM_MASTER_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet",
    "messages": [{"role": "user", "content": "Say hi from Claude"}]
  }'
```
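Because the proxy speaks the OpenAI wire format, any HTTP client can call it. A minimal Python sketch using only the standard library; the Space URL is a placeholder, and the commented-out `urlopen` call requires a live Space:

```python
import json
import os
import urllib.request


def chat_request(base_url: str, api_key: str, model: str, content: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request for the LiteLLM proxy."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": content}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Placeholder URL; the master key comes from the environment.
req = chat_request(
    "https://YOUR-SPACE.hf.space",
    os.environ.get("LITELLM_MASTER_KEY", "sk-test"),
    "gpt-4o-mini",
    "Say hi",
)
# with urllib.request.urlopen(req) as resp:   # needs a running Space
#     print(json.load(resp)["choices"][0]["message"]["content"])
```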