---
title: HuggingClaw
emoji: 🦞
colorFrom: blue
colorTo: purple
sdk: docker
app_port: 7861
pinned: true
license: mit
---
# HuggingClaw

Your always-on AI assistant: free, with no server needed. HuggingClaw runs OpenClaw on HuggingFace Spaces, giving you a 24/7 AI chat assistant on Telegram and WhatsApp. It works with any large language model (LLM), including Claude, ChatGPT, and Gemini, and even supports custom models via OpenRouter. Deploy in minutes on the free HF Spaces tier (2 vCPU, 16 GB RAM, 50 GB disk) with automatic workspace backup to a HuggingFace Dataset, so your chat history and settings persist across restarts.
## Table of Contents

- ✨ Features
- 🎥 Video Tutorial
- 🚀 Quick Start
- 📱 Telegram Setup (Optional)
- 💬 WhatsApp Setup (Optional)
- 💾 Workspace Backup (Optional)
- 🔄 Staying Alive
- 🔔 Webhooks (Optional)
- 🔒 Security & Advanced (Optional)
- 🤖 LLM Providers
- 💻 Local Development
- 🔌 CLI Access
- 🏗️ Architecture
- 🔧 Troubleshooting
- 🔗 Links
- 🤝 Contributing
- 📄 License
## ✨ Features

- 🧠 Any LLM: Use Claude, OpenAI GPT, Google Gemini, Grok, DeepSeek, Qwen, and 40+ providers (set `LLM_API_KEY` and `LLM_MODEL` accordingly).
- ⚡ Zero Config: Duplicate this Space and set just three secrets (`LLM_API_KEY`, `LLM_MODEL`, `GATEWAY_TOKEN`); no other setup needed.
- 🐳 Fast Builds: Uses a pre-built OpenClaw Docker image to deploy in minutes.
- 🌐 Built-In Browser: Headless Chromium is included in the Space, so browser actions work from the start.
- 💾 Workspace Backup: Chats, settings, and WhatsApp session state sync to a private HF Dataset via `huggingface_hub` (with a Git fallback), preserving data automatically.
- ⏰ External Keep-Alive: Set up a one-time UptimeRobot monitor from the dashboard to help keep free HF Spaces awake.
- 👥 Multi-User Messaging: Support for Telegram (multi-user) and WhatsApp (pairing).
- 📊 Visual Dashboard: Beautiful web UI to monitor uptime, sync status, and active models.
- 🔔 Webhooks: Get notified on restarts or backup failures via standard webhooks.
- 🔑 Flexible Auth: Secure the Control UI with either a gateway token or a password.
- 🤗 100% HF-Native: Runs entirely on HuggingFace's free infrastructure (2 vCPU, 16 GB RAM).
## 🎥 Video Tutorial

Watch a quick walkthrough on YouTube: Deploying HuggingClaw on HF Spaces.
## 🚀 Quick Start

### Step 1: Duplicate this Space

Click the button above to duplicate the template.

### Step 2: Add Your Secrets

Navigate to your new Space's Settings, scroll down to the Variables and secrets section, and add the following three under Secrets:

- `LLM_API_KEY`: Your provider API key (e.g., Anthropic, OpenAI, OpenRouter).
- `LLM_MODEL`: The model ID string you wish to use (e.g., `openai/gpt-4o` or `google/gemini-2.5-flash`).
- `GATEWAY_TOKEN`: A custom password or token to secure your Control UI. Any strong password works, or generate one with `openssl rand -hex 32`.

HuggingClaw is completely flexible! You only need these three secrets to get started; other secrets can be set later.
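As a quick sanity check, the `openssl` command suggested above prints a random 64-character hex string, which works well as a `GATEWAY_TOKEN`:

```shell
# Generate a random 256-bit token, printed as 64 hex characters
openssl rand -hex 32
```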
Optional: to pin a specific OpenClaw release instead of `latest`, add `OPENCLAW_VERSION` under Variables in your Space settings. For Docker Spaces, HF passes Variables as build args during the image build, so this should be a Variable, not a Secret.
### Step 3: Deploy & Run

That's it! The Space will build the container and start up automatically. You can monitor the build process in the Logs tab.
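Once the build finishes, you can confirm the gateway is reachable by hitting the `/health` endpoint used for uptime checks (a command template; replace the placeholder with your own Space subdomain):

```shell
# Replace YOUR_SPACE_NAME with your actual Space subdomain
curl https://YOUR_SPACE_NAME.hf.space/health
```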
### Step 4: Monitor & Manage

HuggingClaw features a built-in dashboard to track:

- Uptime: Real-time uptime monitoring.
- Sync Status: Visual indicators for workspace backup operations.
- Chat Status: Real-time connection status for WhatsApp and Telegram.
- Model Info: See which LLM is currently powering your assistant.
## 📱 Telegram Setup (Optional)

To chat via Telegram:

1. Create a bot via @BotFather: send `/newbot`, follow the prompts, and copy the bot token.
2. Find your Telegram user ID with @userinfobot.
3. Add these secrets in Settings → Secrets. After restarting, the bot should appear online on Telegram.
| Variable | Default | Description |
|---|---|---|
| `TELEGRAM_BOT_TOKEN` | – | Telegram bot token from BotFather |
| `TELEGRAM_USER_ID` | – | Single Telegram user ID allowlist |
| `TELEGRAM_USER_IDS` | – | Comma-separated Telegram user IDs for team access |
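To verify a bot token independently of HuggingClaw, you can call the Telegram Bot API's `getMe` method (a command template; substitute the token you received from BotFather):

```shell
# A valid token returns JSON with "ok": true and your bot's username
curl "https://api.telegram.org/bot<YOUR_BOT_TOKEN>/getMe"
```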
## 💬 WhatsApp Setup (Optional)

To use WhatsApp, enable the channel and scan the QR code from the Control UI (Channels → WhatsApp → Login):

| Variable | Default | Description |
|---|---|---|
| `WHATSAPP_ENABLED` | `false` | Enable WhatsApp pairing support |
## 💾 Workspace Backup (Optional)

For persistent chat history and configuration, HuggingClaw can sync your workspace to a private HuggingFace Dataset. On first run, it will automatically create (or use) the Dataset repo `HF_USERNAME/SPACE-backup`, restore your workspace on startup, and sync changes periodically.

| Variable | Default | Description |
|---|---|---|
| `HF_USERNAME` | – | Your HuggingFace username |
| `HF_TOKEN` | – | HF token with write access |
| `BACKUP_DATASET_NAME` | `huggingclaw-backup` | Dataset name for the backup repo |
| `SYNC_INTERVAL` | `180` | Sync interval in seconds |
| `WORKSPACE_GIT_USER` | `openclaw@example.com` | Git commit email for syncs |
| `WORKSPACE_GIT_NAME` | `OpenClaw Bot` | Git commit name for syncs |
This backup also stores a hidden copy of your WhatsApp session credentials, allowing paired logins to survive Space restarts automatically.
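Putting it together, a minimal backup configuration might look like this in your Space secrets (all values below are placeholders; the defaults come from the table above):

```shell
# Hypothetical example values -- substitute your own
HF_USERNAME=your-hf-username
HF_TOKEN=hf_your_write_token
BACKUP_DATASET_NAME=huggingclaw-backup   # default, shown for clarity
SYNC_INTERVAL=180                        # sync every 3 minutes (default)
```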
## 🔄 Staying Alive (Recommended on Free HF Spaces)

Free Hugging Face Spaces can still go to sleep. HuggingClaw no longer relies on internal self-pings. To help keep a public Space awake, set up an external UptimeRobot monitor from the dashboard.

Use the Main API key from UptimeRobot. Do not use the Read-Only API key or a monitor-specific API key.

Setup:

1. Open `/` (the dashboard).
2. Find Keep Space Awake.
3. Paste your UptimeRobot Main API key.
4. Click Create Monitor.

What happens next:

- HuggingClaw creates a monitor for `https://your-space.hf.space/health`
- UptimeRobot keeps pinging it from outside Hugging Face
- You only need to do this once

You do not need to add this key to Hugging Face Space Secrets.

Note:

- This works for public Spaces.
- It does not work reliably for private Spaces, because external monitors cannot access private HF health URLs.
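The dashboard's Create Monitor button handles this for you. For reference, the underlying UptimeRobot v2 API call looks roughly like the sketch below (API key and Space URL are placeholders; double-check field names against UptimeRobot's API documentation):

```shell
# Sketch only -- the dashboard performs this call on your behalf.
# type=1 means an HTTP(s) monitor in UptimeRobot's API.
curl -X POST https://api.uptimerobot.com/v2/newMonitor \
  -d "api_key=YOUR_MAIN_API_KEY" \
  -d "format=json" \
  -d "type=1" \
  -d "friendly_name=HuggingClaw" \
  -d "url=https://your-space.hf.space/health"
```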
## 🔔 Webhooks (Optional)

Get notified when your Space restarts or a backup fails:

| Variable | Default | Description |
|---|---|---|
| `WEBHOOK_URL` | – | Endpoint URL for POST JSON notifications |
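To check that your endpoint accepts such notifications, you can simulate a delivery with `curl`. The payload below is a hypothetical shape for illustration only; check your Space logs for the exact fields HuggingClaw sends:

```shell
# Hypothetical payload -- field names are illustrative, not the documented schema
curl -X POST https://your-endpoint.example.com/hook \
  -H "Content-Type: application/json" \
  -d '{"event": "restart", "space": "huggingclaw"}'
```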
## 🔒 Security & Advanced (Optional)

Configure password access and network restrictions:

| Variable | Default | Description |
|---|---|---|
| `OPENCLAW_PASSWORD` | – | Enable simple password auth instead of a token |
| `TRUSTED_PROXIES` | – | Comma-separated IPs of HF proxies |
| `ALLOWED_ORIGINS` | – | Comma-separated allowed origins for the Control UI |
| `OPENCLAW_VERSION` | `latest` | Build-time pin for the OpenClaw image tag |
## 🤖 LLM Providers

HuggingClaw supports all providers from OpenClaw. Set `LLM_MODEL=<provider/model>` and the provider is auto-detected. For example:

| Provider | Prefix | Example Model | API Key Source |
|---|---|---|---|
| Anthropic | `anthropic/` | `anthropic/claude-sonnet-4-6` | Anthropic Console |
| OpenAI | `openai/` | `openai/gpt-5.4` | OpenAI Platform |
| Google | `google/` | `google/gemini-2.5-flash` | AI Studio |
| DeepSeek | `deepseek/` | `deepseek/deepseek-v3.2` | DeepSeek |
| xAI (Grok) | `xai/` | `xai/grok-4` | xAI |
| Mistral | `mistral/` | `mistral/mistral-large-latest` | Mistral Console |
| Moonshot | `moonshot/` | `moonshot/kimi-k2.5` | Moonshot |
| Cohere | `cohere/` | `cohere/command-a` | Cohere Dashboard |
| Groq | `groq/` | `groq/mixtral-8x7b-32768` | Groq |
| MiniMax | `minimax/` | `minimax/minimax-m2.7` | MiniMax |
| NVIDIA | `nvidia/` | `nvidia/nemotron-3-super-120b-a12b` | NVIDIA API |
| Z.ai (GLM) | `zai/` | `zai/glm-5` | Z.ai |
| Volcengine | `volcengine/` | `volcengine/doubao-seed-1-8-251228` | Volcengine |
| HuggingFace | `huggingface/` | `huggingface/deepseek-ai/DeepSeek-R1` | HF Tokens |
| OpenCode Zen | `opencode/` | `opencode/claude-opus-4-6` | OpenCode.ai |
| OpenCode Go | `opencode-go/` | `opencode-go/kimi-k2.5` | OpenCode.ai |
| Kilo Gateway | `kilocode/` | `kilocode/anthropic/claude-opus-4.6` | Kilo.ai |
### OpenRouter: 200+ Models with One Key

Get an OpenRouter API key to use all providers. For example:

```
LLM_API_KEY=sk-or-v1-xxxxxxxx
LLM_MODEL=openrouter/openai/gpt-5.4
```

Popular options include `openrouter/google/gemini-2.5-flash` or `openrouter/meta-llama/llama-3.3-70b-instruct`.
### Any Other Provider

You can also use any custom provider:

```
LLM_API_KEY=your_api_key
LLM_MODEL=provider/model-name
```

The provider prefix in `LLM_MODEL` tells HuggingClaw how to call it. See OpenClaw Model Providers for the full list.
## 💻 Local Development

```shell
git clone https://github.com/somratpro/huggingclaw.git
cd huggingclaw
cp .env.example .env
# Edit .env with your secret values
```

With Docker:

```shell
docker build --build-arg OPENCLAW_VERSION=latest -t huggingclaw .
docker run -p 7861:7861 --env-file .env huggingclaw
```

Without Docker:

```shell
npm install -g openclaw@latest
export $(cat .env | xargs)   # simple values only; use `set -a; . .env; set +a` if values contain spaces
bash start.sh
```
## 🔌 CLI Access

After deploying, you can connect via the OpenClaw CLI (e.g., to onboard channels or run agents):

```shell
npm install -g openclaw@latest
openclaw channels login --gateway https://YOUR_SPACE_NAME.hf.space
# When prompted, enter your GATEWAY_TOKEN
```
## 🏗️ Architecture

```
HuggingClaw/
├── Dockerfile          # Multi-stage build using pre-built OpenClaw image
├── start.sh            # Config generator, validator, and orchestrator
├── workspace-sync.py   # Syncs workspace to HF Datasets (with Git fallback)
├── health-server.js    # /health endpoint for uptime checks
├── dns-fix.js          # DNS-over-HTTPS fallback (for blocked domains)
├── .env.example        # Environment variable reference
└── README.md           # (this file)
```
**Startup sequence:**
1. Validate required secrets (fail fast with clear error).
2. Check HF token (warn if expired or missing).
3. Auto-create backup dataset if missing.
4. Restore workspace from HF Dataset.
5. Generate `openclaw.json` from environment variables.
6. Print startup summary.
7. Launch background tasks (auto-sync and optional channel helpers).
8. Launch the OpenClaw gateway (start listening).
9. On `SIGTERM`, save workspace and exit cleanly.
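Steps 7 to 9 can be sketched as a small shell pattern (simplified; `sync_workspace` here is a hypothetical stand-in for the real `workspace-sync.py` call):

```shell
#!/usr/bin/env bash
# Simplified sketch of the launch-and-clean-shutdown flow above.
sync_workspace() { echo "workspace saved"; }   # hypothetical stand-in

# Step 9: on SIGTERM, save the workspace and exit cleanly
trap 'sync_workspace; exit 0' TERM

# Step 8: stand-in for the long-running OpenClaw gateway process
echo "gateway running"
sleep 30 &
wait $!
```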
## 🔧 Troubleshooting

- Missing secrets: Ensure `LLM_API_KEY`, `LLM_MODEL`, and `GATEWAY_TOKEN` are set in your Space Settings → Secrets.
- Telegram bot issues: Verify your `TELEGRAM_BOT_TOKEN`. Check Space logs for lines like `📱 Enabling Telegram`.
- Backup restore failing: Make sure `HF_USERNAME` and `HF_TOKEN` are correct (the token needs write access to your Dataset).
- Space keeps sleeping: Open `/` and use Keep Space Awake to create the external monitor.
- Auth errors / proxy: If you see reverse-proxy auth errors, add the logged IPs under `TRUSTED_PROXIES` (from log lines like `remote=x.x.x.x`).
- Control UI says "too many failed authentication attempts": Wait for the retry window to expire, then open the Space in an incognito window or clear site storage for your Space before logging in again with `GATEWAY_TOKEN`.
- WhatsApp lost its session after restart: Make sure `HF_USERNAME` and `HF_TOKEN` are configured so the hidden session backup can be restored on boot.
- UI blocked (CORS): Set `ALLOWED_ORIGINS=https://your-space-name.hf.space`.
- Version mismatches: Pin a specific OpenClaw build with the `OPENCLAW_VERSION` Variable in HF Spaces, or `--build-arg OPENCLAW_VERSION=...` locally.
## 🔗 Links
## 🤝 Contributing

Contributions are welcome! Please see CONTRIBUTING.md for guidelines.

## 📄 License

MIT; see LICENSE for details.

Made with ❤️ by @somratpro for the OpenClaw community.