# ────────────────────────────────────────────────────────────────
# 📦 HuggingClaw - OpenClaw Gateway for HuggingFace Spaces
# ────────────────────────────────────────────────────────────────
# Copy this file to .env and fill in your values.
# For local development: cp .env.example .env && nano .env

# ── REQUIRED: Core Configuration ──
# [REQUIRED] LLM provider API key
#   - Anthropic:  sk-ant-v0-...
#   - OpenAI:     sk-...
#   - Google:     AIzaSy...
#   - OpenRouter: sk-or-v1-... (300+ models via a single key)
LLM_API_KEY=your_api_key_here

# [REQUIRED] LLM model to use (format: provider/model-name)
# The provider is auto-detected from the prefix - any provider is supported.
# Provider IDs from the OpenClaw docs: docs.openclaw.ai/concepts/model-providers
# ── Core Providers ──
# Anthropic (ANTHROPIC_API_KEY):
#   - anthropic/claude-opus-4-6
#   - anthropic/claude-sonnet-4-6
#   - anthropic/claude-sonnet-4-5
#   - anthropic/claude-haiku-4-5
# OpenAI (OPENAI_API_KEY):
#   - openai/gpt-5.4-pro
#   - openai/gpt-5.4
#   - openai/gpt-5.4-mini
#   - openai/gpt-5.4-nano
#   - openai/gpt-4.1
#   - openai/gpt-4.1-mini
# Google Gemini (GEMINI_API_KEY):
#   - google/gemini-3.1-pro-preview
#   - google/gemini-3-flash-preview
#   - google/gemini-2.5-pro
#   - google/gemini-2.5-flash
# DeepSeek (DEEPSEEK_API_KEY):
#   - deepseek/deepseek-v3.2
#   - deepseek/deepseek-r1-0528
#   - deepseek/deepseek-r1
# ── OpenCode Providers ──
# OpenCode Zen - tested & verified models (OPENCODE_API_KEY):
#   - opencode/claude-opus-4-6
#   - opencode/gpt-5.4
#   Get a key from: https://opencode.ai/auth
# OpenCode Go - low-cost open models (OPENCODE_API_KEY):
#   - opencode-go/kimi-k2.5

# ── Gateway/Router Providers ──
# OpenRouter - 300+ models via a single API key (OPENROUTER_API_KEY):
#   - openrouter/anthropic/claude-sonnet-4-6
#   - openrouter/openai/gpt-5.4
#   - openrouter/deepseek/deepseek-v3.2
#   - openrouter/meta-llama/llama-3.3-70b-instruct:free
#   Get a key from: https://openrouter.ai
# Kilo Gateway (KILOCODE_API_KEY):
#   - kilocode/anthropic/claude-opus-4.6
# ── Chinese/Asian Providers ──
# Z.ai / GLM (ZAI_API_KEY) - OpenClaw normalizes z-ai/z.ai → zai:
#   - zai/glm-5
#   - zai/glm-5-turbo
#   - zai/glm-4.7
#   - zai/glm-4.7-flash
# Moonshot / Kimi (MOONSHOT_API_KEY):
#   - moonshot/kimi-k2.5
#   - moonshot/kimi-k2-thinking
# MiniMax (MINIMAX_API_KEY):
#   - minimax/minimax-m2.7
#   - minimax/minimax-m2.5
# Xiaomi / MiMo (XIAOMI_API_KEY):
#   - xiaomi/mimo-v2-pro
#   - xiaomi/mimo-v2-omni
# Volcengine / Doubao (VOLCANO_ENGINE_API_KEY):
#   - volcengine/doubao-seed-1-8-251228
#   - volcengine/kimi-k2-5-260127
# BytePlus - international (BYTEPLUS_API_KEY):
#   - byteplus/seed-1-8-251228
# ── Western Providers ──
# Mistral (MISTRAL_API_KEY):
#   - mistral/mistral-large-latest
#   - mistral/mistral-small-2603
#   - mistral/devstral-medium
# xAI / Grok (XAI_API_KEY):
#   - xai/grok-4.20-beta
#   - xai/grok-4
# NVIDIA (NVIDIA_API_KEY):
#   - nvidia/nemotron-3-super-120b-a12b
# Groq (GROQ_API_KEY):
#   - groq/mixtral-8x7b-32768
# Cohere (COHERE_API_KEY):
#   - cohere/command-a
# Together (TOGETHER_API_KEY):
#   - together/meta-llama/llama-3.3-70b-instruct
# Cerebras (CEREBRAS_API_KEY):
#   - cerebras/zai-glm-4.7
# HuggingFace Inference (HUGGINGFACE_HUB_TOKEN):
#   - huggingface/deepseek-ai/DeepSeek-R1
# Or any other OpenClaw-supported provider (format: provider/model-name).
LLM_MODEL=anthropic/claude-sonnet-4-5
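Since the file says the provider is auto-detected from the `provider/model-name` prefix, splitting must happen at the first slash only, so gateway-style IDs like `openrouter/openai/gpt-5.4` keep their nested model path. A minimal sketch of that parsing (the helper name is ours, not part of OpenClaw):

```python
def split_model_id(model_id: str) -> tuple[str, str]:
    """Split "provider/model-name" at the FIRST slash only, so gateway
    IDs like "openrouter/openai/gpt-5.4" keep their nested model path."""
    provider, _, model = model_id.partition("/")
    if not model:
        raise ValueError(f"expected provider/model-name, got {model_id!r}")
    return provider, model

print(split_model_id("anthropic/claude-sonnet-4-5"))  # ('anthropic', 'claude-sonnet-4-5')
print(split_model_id("openrouter/openai/gpt-5.4"))    # ('openrouter', 'openai/gpt-5.4')
```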
# [REQUIRED] Gateway authentication token
# Generate one with: openssl rand -hex 32
GATEWAY_TOKEN=your_gateway_token_here
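If `openssl` isn't handy, the same token (32 random bytes as 64 hex characters) can be generated from Python's standard library:

```python
import secrets

# Equivalent of `openssl rand -hex 32`: 32 cryptographically random
# bytes, hex-encoded to a 64-character string.
token = secrets.token_hex(32)
print(token)
```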
# (Optional) Password auth - a simpler alternative to the token for casual users.
# If set, users can log in with this password instead of the token.
OPENCLAW_PASSWORD=your_password_here
# ── OPTIONAL: Chat Integrations ──

# Enable the WhatsApp pairing flow.
# Set to true only if you want WhatsApp enabled.
WHATSAPP_ENABLED=false

# Get a bot token from: https://t.me/BotFather
TELEGRAM_BOT_TOKEN=your_bot_token_here

# Single user ID (from https://t.me/userinfobot)
TELEGRAM_USER_ID=123456789

# Multiple user IDs (comma-separated, for team access)
TELEGRAM_USER_IDS=123456789,987654321,555555555
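How the gateway merges the single-ID and multi-ID variables is not documented here; a plausible sketch is that both feed one allow-list, ignoring blanks and stray whitespace (the function name and merge behavior are assumptions):

```python
def allowed_telegram_ids(single: str = "", many: str = "") -> set[int]:
    """Merge TELEGRAM_USER_ID and comma-separated TELEGRAM_USER_IDS
    into one allow-list, skipping empty entries and trimming spaces."""
    raw = [single] + many.split(",")
    return {int(part) for part in (p.strip() for p in raw) if part}

print(allowed_telegram_ids("123456789", "987654321, 555555555"))
```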
# ── OPTIONAL: Workspace Backup to HF Dataset ──
HF_USERNAME=your_hf_username
HF_TOKEN=hf_your_token_here

# Backup dataset name (auto-created if missing). Default: huggingclaw-backup
BACKUP_DATASET_NAME=huggingclaw-backup
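Dataset repos on the Hub are addressed as `<username>/<name>`, so the two variables above combine into the backup target. A sketch of that (auto-creation presumably goes through `huggingface_hub`'s `create_repo` with `repo_type="dataset"` and `exist_ok=True`, though the gateway's exact call is an assumption):

```python
def backup_repo_id(hf_username: str, dataset_name: str = "huggingclaw-backup") -> str:
    """Build the Hub repo ID the backup writes to, e.g. "alice/huggingclaw-backup".
    Auto-creation would then be roughly:
    HfApi(token=...).create_repo(repo_id, repo_type="dataset", exist_ok=True)"""
    return f"{hf_username}/{dataset_name}"

print(backup_repo_id("your_hf_username"))
```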
# Git commit identity for workspace syncs
WORKSPACE_GIT_USER=openclaw@example.com
WORKSPACE_GIT_NAME=OpenClaw Bot
# ── OPTIONAL: Background Services ──

# Keep-alive ping interval in seconds. Default: 300. Set to 0 to disable.
KEEP_ALIVE_INTERVAL=300

# Workspace auto-sync interval in seconds. Default: 180.
SYNC_INTERVAL=180
# Webhooks: standard POST notifications for lifecycle events
WEBHOOK_URL=https://your-webhook-endpoint.com/log

# Optional: external keep-alive via UptimeRobot.
# Use the Main API key from UptimeRobot -> Integrations;
# do not use the Read-Only API key or a monitor-specific API key.
# Run setup-uptimerobot.sh once from your own terminal to create the monitor.
UPTIMEROBOT_API_KEY=ur_your_api_key_here
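UptimeRobot's public v2 API creates monitors via a form-encoded POST to `/v2/newMonitor` (`type=1` is an HTTP(s) monitor). A sketch of the request body a script like setup-uptimerobot.sh might send; the friendly name and the script's exact behavior are assumptions:

```python
from urllib.parse import urlencode

def new_monitor_body(api_key: str, space_url: str, interval: int = 300) -> str:
    """Form-encoded body for UptimeRobot's v2 newMonitor endpoint.
    type=1 means an HTTP(s) monitor; interval is in seconds."""
    return urlencode({
        "api_key": api_key,
        "format": "json",
        "type": 1,
        "url": space_url,
        "friendly_name": "HuggingClaw keep-alive",  # hypothetical label
        "interval": interval,
    })

print(new_monitor_body("ur_your_api_key_here", "https://your-space.hf.space"))
```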
# Trusted proxies (comma-separated IPs).
# Fixes "Proxy headers detected from untrusted address" behind reverse proxies.
# Only set this if you see pairing/auth errors; find the IPs in the Space logs (remote=x.x.x.x).
TRUSTED_PROXIES=10.20.31.87,10.20.26.157
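The check behind that error is presumably along these lines: proxy headers are honored only when the connection's remote address appears in the trusted list. A sketch with exact-IP matching (OpenClaw's real matching rules are an assumption):

```python
from ipaddress import ip_address

def is_trusted_proxy(remote: str, trusted_proxies: str) -> bool:
    """Check a connection's remote address against the comma-separated
    TRUSTED_PROXIES list; only then would proxy headers be honored."""
    trusted = {ip_address(p.strip()) for p in trusted_proxies.split(",") if p.strip()}
    return ip_address(remote) in trusted

print(is_trusted_proxy("10.20.31.87", "10.20.31.87,10.20.26.157"))  # True
```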
# Allowed origins for the Control UI (comma-separated URLs).
# Locks the web UI down to these origins only.
ALLOWED_ORIGINS=https://your-space.hf.space
# ────────────────────────────────────────────────────────────────
# QUICK START: only 3 secrets are required!
#   1. LLM_API_KEY   - from your LLM provider
#   2. LLM_MODEL     - pick a model from the list above
#   3. GATEWAY_TOKEN - run: openssl rand -hex 32
# ────────────────────────────────────────────────────────────────