---
title: HuggingClaw
emoji: 🦞
colorFrom: blue
colorTo: purple
sdk: docker
app_port: 7861
pinned: true
license: mit
secrets:
  - name: LLM_API_KEY
    description: Your LLM provider API key (e.g. Anthropic, OpenAI, Google, OpenRouter).
  - name: LLM_MODEL
    description: Model ID to use, e.g. google/gemini-2.5-flash or openai/gpt-4o.
  - name: GATEWAY_TOKEN
    description: >-
      Strong token to secure your OpenClaw Control UI (generate: openssl rand
      -hex 32).
  - name: CLOUDFLARE_WORKERS_TOKEN
    description: Cloudflare API token used to auto-create a Worker proxy and keep-alive monitor.
  - name: TELEGRAM_ALLOWED_USERS
    description: Comma-separated Telegram user IDs allowed access.
  - name: TELEGRAM_BOT_TOKEN
    description: Telegram bot token from BotFather.
---
Your always-on AI assistant: free, with no server needed. HuggingClaw runs OpenClaw on Hugging Face Spaces, giving you a 24/7 AI chat assistant on Telegram and WhatsApp. It works with any large language model (LLM), including Claude, ChatGPT, and Gemini, and supports custom models via OpenRouter. Deploy in minutes on the free HF Spaces tier (2 vCPU, 16 GB RAM, 50 GB storage), with automatic workspace backup to a Hugging Face Dataset so your chat history and settings persist across restarts.
## Table of Contents

- Features
- Video Tutorial
- Quick Start
- Telegram Setup (Optional)
- Cloudflare Proxy (Optional)
- WhatsApp Setup (Optional)
- Workspace Backup (Optional)
- Webhooks (Optional)
- Security & Advanced (Optional)
- LLM Providers
- Local Development
- CLI Access
- Architecture
- Staying Alive
- Troubleshooting
- Links
- Contributing
- License
## Features

- Any LLM: Use Claude, OpenAI GPT, Google Gemini, Grok, DeepSeek, Qwen, and 40+ providers (set `LLM_API_KEY` and `LLM_MODEL` accordingly).
- Zero Config: Duplicate this Space and set just three secrets (`LLM_API_KEY`, `LLM_MODEL`, `GATEWAY_TOKEN`); no other setup is needed.
- Fast Builds: Uses a pre-built OpenClaw Docker image to deploy in minutes.
- Cloudflare Outbound Proxy: HuggingClaw can automatically provision a Cloudflare Worker proxy for blocked outbound traffic such as Telegram API requests.
- Workspace Backup: Chats, settings, and WhatsApp session state sync to a private HF Dataset via `huggingface_hub`, preserving data automatically without storing your HF token in a git remote.
- Easy Keep-Alive: Uses `CLOUDFLARE_WORKERS_TOKEN` to automatically set up a cron-triggered keep-awake Worker at boot.
- Multi-User Messaging: Support for Telegram (multi-user) and WhatsApp (pairing).
- Visual Dashboard: A web UI to monitor uptime, sync status, and the active model.
- Webhooks: Get notified of restarts or backup failures via standard webhooks.
- Flexible Auth: Secure the Control UI with either a gateway token or a password.
- 100% HF-Native: Runs entirely on Hugging Face's free infrastructure (2 vCPU, 16 GB RAM).
## Video Tutorial

Watch a quick walkthrough on YouTube: Deploying HuggingClaw on HF Spaces.
## Quick Start

### Step 1: Duplicate this Space

Click the button above to duplicate the template.
### Step 2: Add Your Secrets

Navigate to your new Space's Settings, scroll down to the Variables and secrets section, and add the following three under Secrets:

- `LLM_API_KEY`: Your provider API key (e.g., Anthropic, OpenAI, OpenRouter).
- `LLM_MODEL`: The model ID string you wish to use (e.g., `openai/gpt-4o` or `google/gemini-2.5-flash`).
- `GATEWAY_TOKEN`: A custom password or token to secure your Control UI. Use any strong password, or generate one with `openssl rand -hex 32`.

HuggingClaw is flexible: only these three secrets are needed to get started. Other secrets can be added later.

Optional: to pin a specific OpenClaw release instead of `latest`, add `OPENCLAW_VERSION` under Variables in your Space settings. For Docker Spaces, HF passes Variables as build args during the image build, so this must be a Variable, not a Secret.
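If you prefer to script this step, `huggingface_hub` exposes `HfApi.add_space_secret`. The sketch below is illustrative, not part of HuggingClaw itself: `YOUR_USERNAME/huggingclaw` is a placeholder repo id, and `collect_secrets` simply mirrors the "three required secrets" rule above.

```python
import os

REQUIRED = ("LLM_API_KEY", "LLM_MODEL", "GATEWAY_TOKEN")

def collect_secrets(env) -> dict:
    """Return the three required secrets, failing fast if any is unset."""
    missing = [k for k in REQUIRED if not env.get(k)]
    if missing:
        raise ValueError(f"missing required secrets: {missing}")
    return {k: env[k] for k in REQUIRED}

if __name__ == "__main__":
    # Needs a write-scoped token in HF_TOKEN; the repo id is a placeholder.
    from huggingface_hub import HfApi  # pip install huggingface_hub
    api = HfApi()
    for key, value in collect_secrets(os.environ).items():
        api.add_space_secret("YOUR_USERNAME/huggingclaw", key, value)
```

Running it once from your machine populates all three secrets without touching the web UI.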
### Step 3: Deploy & Run

That's it! The Space builds the container and starts up automatically. You can monitor the build process in the Logs tab.

### Step 4: Monitor & Manage

HuggingClaw's built-in dashboard tracks:

- Uptime: real-time uptime monitoring.
- Sync Status: visual indicators for workspace backup operations.
- Chat Status: real-time connection status for WhatsApp and Telegram.
- Model Info: which LLM is currently powering your assistant.
## Telegram Setup (Optional)

To chat via Telegram:

- Create a bot via @BotFather: send `/newbot`, follow the prompts, and copy the bot token.
- Find your Telegram user ID with @userinfobot.
- Add `CLOUDFLARE_WORKERS_TOKEN` in Space secrets so HuggingClaw can auto-provision the outbound proxy, or set `CLOUDFLARE_PROXY_URL` manually if you already have a Worker.
- Add these secrets under Settings → Secrets. After a restart, the bot should appear online on Telegram.

| Variable | Default | Description |
|---|---|---|
| `TELEGRAM_BOT_TOKEN` | – | Telegram bot token from BotFather |
| `TELEGRAM_ALLOWED_USERS` | – | Comma-separated Telegram user IDs allowed access |
## Cloudflare Proxy Setup

Hugging Face's free tier often restricts outbound connections to services like Telegram, Discord, and WhatsApp. HuggingClaw solves this with a transparent outbound proxy via Cloudflare Workers.

### Automatic Setup (Recommended)

This is the easiest way; HuggingClaw handles the deployment for you.

- Create a Cloudflare API token:
  - Go to API Tokens.
  - Choose Create Token → the Edit Cloudflare Workers template.
  - Ensure it has `Account: Workers Scripts: Edit` permissions.
- Add the token as a secret named `CLOUDFLARE_WORKERS_TOKEN` in your Space Settings.

What happens next?

- HuggingClaw automatically creates a Worker named after your Space host.
- It generates a secure, private `CLOUDFLARE_PROXY_SECRET`.
- All restricted outbound traffic is automatically routed through this Worker.
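To make the routing idea concrete, here is one plausible shape of such an interceptor, purely for illustration: the blocked-host set and the `X-Proxy-Secret` header name are assumptions, not HuggingClaw's actual internals.

```python
from urllib.parse import urlsplit

# Hosts the free tier typically cannot reach directly (illustrative set).
BLOCKED_HOSTS = {"api.telegram.org"}

def route_via_worker(url: str, proxy_url: str, proxy_secret: str):
    """Rewrite a request to a blocked host so it goes through the Worker.

    Returns (url, extra_headers); requests to other hosts pass through unchanged.
    """
    parts = urlsplit(url)
    if parts.hostname not in BLOCKED_HOSTS:
        return url, {}
    # Forward the original path/query to the Worker, authenticated by secret.
    proxied = proxy_url.rstrip("/") + parts.path
    if parts.query:
        proxied += "?" + parts.query
    return proxied, {"X-Proxy-Secret": proxy_secret}
```

The key point is that only traffic to restricted hosts is rewritten; everything else leaves the Space directly.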
## WhatsApp Setup (Optional)

To use WhatsApp, enable the channel and scan the QR code from the Control UI (Channels → WhatsApp → Login):

| Variable | Default | Description |
|---|---|---|
| `WHATSAPP_ENABLED` | `false` | Enable WhatsApp pairing support |
## Workspace Backup (Optional)

HuggingClaw automatically syncs your workspace (chats, settings, sessions) to a private HF Dataset named `huggingclaw-backup`.

- Persistence: survives restarts and restores your state on boot.
- WhatsApp: stores session credentials so you don't have to scan the QR code every time.
- Interval: syncs every 3 minutes by default.

| Variable | Default | Description |
|---|---|---|
| `HF_TOKEN` | – | HF token with write access |
| `SYNC_INTERVAL` | `180` | Backup frequency in seconds |
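The interval behaviour described above can be modelled in a few lines. This is a minimal sketch of the "sync every `SYNC_INTERVAL` seconds" rule, not the actual sync engine; pushing the workspace itself could then be done with, e.g., `HfApi.upload_folder(..., repo_type="dataset")` from `huggingface_hub`.

```python
import os

def sync_due(last_sync: float, now: float, env=None) -> bool:
    """True when at least SYNC_INTERVAL seconds (default 180) have elapsed."""
    env = os.environ if env is None else env
    interval = int(env.get("SYNC_INTERVAL", "180"))
    return now - last_sync >= interval
```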
## Staying Alive (Recommended on Free HF Spaces)

When the `CLOUDFLARE_WORKERS_TOKEN` secret is configured, a background Cloudflare Worker automatically keeps your Space awake: a cron trigger regularly pings the Space's `/health` endpoint. The dashboard displays the current keep-alive Worker status.
## Webhooks (Optional)

Get notified when your Space restarts or a backup fails:

| Variable | Default | Description |
|---|---|---|
| `WEBHOOK_URL` | – | Endpoint URL for POST JSON notifications |
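A notification is just a JSON POST to `WEBHOOK_URL`. The exact fields HuggingClaw sends are not documented here, so the `event`/`detail`/`ts` payload below is an assumption used for illustration:

```python
import json
import time
import urllib.request

def build_event(event: str, detail: str, now=None) -> dict:
    """Assumed payload shape: event name, human-readable detail, unix timestamp."""
    return {"event": event, "detail": detail,
            "ts": now if now is not None else int(time.time())}

def post_event(webhook_url: str, payload: dict) -> None:
    """POST the payload as JSON (stdlib only, no extra dependencies)."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req, timeout=10)
```

Any endpoint that accepts JSON POSTs (e.g. a Slack or Discord relay you run) can consume these events.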
## Security & Advanced (Optional)

Configure password access and network restrictions:

| Variable | Default | Description |
|---|---|---|
| `OPENCLAW_PASSWORD` | – | Enable simple password auth instead of a token |
| `TRUSTED_PROXIES` | – | Comma-separated IPs of HF proxies |
| `ALLOWED_ORIGINS` | – | Comma-separated allowed origins for the Control UI |
| `CLOUDFLARE_KEEPALIVE_ENABLED` | `true` | Set to `false` to disable the automatic Cloudflare keep-alive Worker |
## LLM Providers

HuggingClaw supports all providers that OpenClaw does. Set `LLM_MODEL=<provider/model>` and the provider is auto-detected.

| Provider | Prefix | Example Model |
|---|---|---|
| Anthropic | `anthropic/` | `anthropic/claude-3-5-sonnet-latest` |
| OpenAI | `openai/` | `openai/gpt-4o` |
| Google | `google/` | `google/gemini-2.0-flash` |
| DeepSeek | `deepseek/` | `deepseek/deepseek-chat` |
| xAI (Grok) | `xai/` | `xai/grok-2-latest` |
| Mistral | `mistral/` | `mistral/mistral-large-latest` |
| HuggingFace | `huggingface/` | `huggingface/deepseek-ai/DeepSeek-R1` |
| OpenRouter | `openrouter/` | `openrouter/anthropic/claude-3.5-sonnet` |

And many more: Cohere, Groq, NVIDIA, Moonshot, etc.
### Any Other Provider

You can also use any custom provider:

```
LLM_API_KEY=your_api_key
LLM_MODEL=provider/model-name
```

The provider prefix in `LLM_MODEL` tells HuggingClaw how to call it. See OpenClaw Model Providers for the full list.
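The prefix rule can be sketched in a few lines; this is illustrative, not OpenClaw's actual parser. Note that only the first `/` separates the provider, since model IDs like `openrouter/anthropic/claude-3.5-sonnet` contain further slashes:

```python
def split_model(llm_model: str) -> tuple:
    """Split LLM_MODEL into (provider, model) at the FIRST '/' only."""
    provider, sep, model = llm_model.partition("/")
    if not sep or not provider or not model:
        raise ValueError(f"expected '<provider>/<model>', got {llm_model!r}")
    return provider, model
```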
### Custom OpenAI-Compatible Provider

Register a custom endpoint at startup without modifying the CLI.

| Variable | Description | Default |
|---|---|---|
| `CUSTOM_PROVIDER_NAME` | Unique provider prefix (e.g., `modal`) | Required |
| `CUSTOM_BASE_URL` | API base URL (e.g., `https://.../v1`) | Required |
| `CUSTOM_MODEL_ID` | Model ID on the server | Required |
| `LLM_MODEL` | Must match `{CUSTOM_PROVIDER_NAME}/{CUSTOM_MODEL_ID}` | Required |
| `CUSTOM_API_KEY` | Provider-specific key | `LLM_API_KEY` |
| `CUSTOM_CONTEXT_WINDOW` | Context limit | `128000` |

Note: `CUSTOM_PROVIDER_NAME` cannot override built-in providers (`openai`, `anthropic`, etc.).
Example (Modal):

```
CUSTOM_PROVIDER_NAME=modal
CUSTOM_BASE_URL=https://api.us-west-2.modal.direct/v1
CUSTOM_MODEL_ID=zai-org/GLM-5.1-FP8
LLM_MODEL=modal/zai-org/GLM-5.1-FP8
```
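The two constraints above (the `LLM_MODEL` must equal `{CUSTOM_PROVIDER_NAME}/{CUSTOM_MODEL_ID}`, and the prefix must not shadow a built-in provider) can be checked like this. The `BUILTIN` set here is a partial, illustrative list, not OpenClaw's authoritative one:

```python
# Partial set of built-in provider prefixes, for illustration only.
BUILTIN = {"openai", "anthropic", "google", "deepseek", "xai", "mistral"}

def check_custom_provider(name: str, model_id: str, llm_model: str) -> None:
    """Raise ValueError if the custom-provider settings are inconsistent."""
    if name in BUILTIN:
        raise ValueError(f"CUSTOM_PROVIDER_NAME {name!r} shadows a built-in provider")
    expected = f"{name}/{model_id}"
    if llm_model != expected:
        raise ValueError(f"LLM_MODEL must be {expected!r}, got {llm_model!r}")
```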
## Local Development

```shell
git clone https://github.com/somratpro/huggingclaw.git
cd huggingclaw
cp .env.example .env
# Edit .env with your secret values
```

With Docker:

```shell
docker build --build-arg OPENCLAW_VERSION=latest -t huggingclaw .
docker run -p 7861:7861 --env-file .env huggingclaw
```

Without Docker:

```shell
npm install -g openclaw@latest
export $(cat .env | xargs)  # works for simple .env files; quote values containing spaces
bash start.sh
```
## CLI Access

After deploying, you can connect via the OpenClaw CLI (e.g., to onboard channels or run agents):

```shell
npm install -g openclaw@latest
openclaw channels login --gateway https://YOUR_SPACE_NAME.hf.space
# When prompted, enter your GATEWAY_TOKEN
```
## Architecture

HuggingClaw uses a multi-layered approach to ensure stability and persistence on Hugging Face's ephemeral infrastructure.

- Dashboard (`/`): management, monitoring, and keep-alive tools.
- Control UI (`/gateway`): secure interface for managing agents and channels.
- Health Check (`/health`): endpoint for uptime monitoring and readiness probes.
- Sync Engine: Python background process managing HF Dataset persistence.
- Transparent Proxy: interceptor for requests to blocked domains (Telegram, etc.).
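For intuition, a `/health` readiness payload could look roughly like the sketch below. The field names are assumptions for illustration, not HuggingClaw's actual response schema:

```python
import time

START = time.monotonic()  # process start reference for uptime

def health_payload(gateway_up: bool, last_sync_ok: bool) -> dict:
    """Assumed shape of a readiness/health response."""
    return {
        "status": "ok" if gateway_up else "starting",
        "uptime_s": int(time.monotonic() - START),
        "sync": "ok" if last_sync_ok else "failed",
    }
```

A keep-alive pinger only needs the HTTP status to be 200; the body is for the dashboard.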
Startup sequence:

1. Validate required secrets and check the HF token.
2. Resolve the backup namespace and restore the workspace from the HF Dataset.
3. Generate the `openclaw.json` configuration.
4. Launch background tasks (auto-sync, channel helpers).
5. Start the OpenClaw gateway and listen for connections.
## Troubleshooting

- Missing secrets: Ensure `LLM_API_KEY`, `LLM_MODEL`, and `GATEWAY_TOKEN` are set in your Space Settings → Secrets.
- Telegram bot issues: Verify your `TELEGRAM_BOT_TOKEN`. Check the Space logs for lines like `Enabling Telegram`.
- Backup restore failing: Make sure `HF_TOKEN` is valid and has write access to your HF account's dataset. Set `HF_USERNAME` only if auto-detection is not available in your environment.
- Space keeps sleeping: Add `CLOUDFLARE_WORKERS_TOKEN` as a Space secret to enable automatic keep-awake monitoring via Cloudflare Workers.
- Auth errors / proxy: If you see reverse-proxy auth errors, add the logged IPs under `TRUSTED_PROXIES` (from log lines like `remote=x.x.x.x`).
- Control UI says "too many failed authentication attempts": Wait for the retry window to expire, then open the Space in an incognito window or clear site storage for your Space before logging in again with `GATEWAY_TOKEN`.
- WhatsApp lost its session after a restart: Make sure `HF_TOKEN` is configured so the hidden session backup can be restored on boot.
- UI blocked (CORS): Set `ALLOWED_ORIGINS=https://your-space-name.hf.space`.
- Version mismatches: Pin a specific OpenClaw build with the `OPENCLAW_VERSION` Variable in HF Spaces, or `--build-arg OPENCLAW_VERSION=...` locally.
## More Projects

Similar projects by @somratpro, all free with one-click deploy on HF Spaces:

| Project | What it runs | HF Space | GitHub |
|---|---|---|---|
| HuggingMes | Hermes, a self-hosted agent gateway | Space | Repo |
| Hugging8n | n8n, a workflow & automation platform | Space | Repo |
| HuggingClip | Paperclip, an AI agent orchestration platform | Space | Repo |
## Links
## Support

If HuggingClaw saves you time, consider buying me a coffee to keep the projects alive!

USDT (TRC-20 / TRON network only):

```
TELx8TJz1W1h7n6SgpgGNNGZXpJCEUZrdB
```

Send USDT on the TRC-20 network only; sending other tokens or using a different network will result in permanent loss.
## Contributing

Contributions are welcome! Please see CONTRIBUTING.md for guidelines.

## License

MIT, see LICENSE for details.

Made with ❤️ by @somratpro for the OpenClaw community.