HuggingClaw / README.md
---
title: HuggingClaw
emoji: 🦞
colorFrom: blue
colorTo: purple
sdk: docker
app_port: 7861
pinned: true
license: mit
secrets:
  - name: LLM_API_KEY
    description: Your LLM provider API key (e.g. Anthropic, OpenAI, Google, OpenRouter).
  - name: LLM_MODEL
    description: Model ID to use, e.g. google/gemini-2.5-flash or openai/gpt-4o.
  - name: GATEWAY_TOKEN
    description: >-
      Strong token to secure your OpenClaw Control UI (generate: openssl rand
      -hex 32).
  - name: CLOUDFLARE_WORKERS_TOKEN
    description: Cloudflare API token; auto-creates a Worker proxy and KeepAlive monitor.
  - name: TELEGRAM_ALLOWED_USERS
    description: Comma-separated Telegram user IDs for access
  - name: TELEGRAM_BOT_TOKEN
    description: Telegram bot token from BotFather
---


Your always-on AI assistant – free, no server needed. HuggingClaw runs OpenClaw on HuggingFace Spaces, giving you a 24/7 AI chat assistant on Telegram and WhatsApp. It works with any large language model (LLM) – Claude, ChatGPT, Gemini, etc. – and even supports custom models via OpenRouter. Deploy in minutes on the free HF Spaces tier (2 vCPU, 16 GB RAM, 50 GB disk) with automatic workspace backup to a HuggingFace Dataset, so your chat history and settings persist across restarts.


✨ Features

  • 🔌 Any LLM: Use Claude, OpenAI GPT, Google Gemini, Grok, DeepSeek, Qwen, and 40+ providers (set LLM_API_KEY and LLM_MODEL accordingly).
  • ⚡ Zero Config: Duplicate this Space and set just three secrets (LLM_API_KEY, LLM_MODEL, GATEWAY_TOKEN) – no other setup needed.
  • 🐳 Fast Builds: Uses a pre-built OpenClaw Docker image to deploy in minutes.
  • 🌐 Cloudflare Outbound Proxy: HuggingClaw can automatically provision a Cloudflare Worker proxy for blocked outbound traffic such as Telegram API requests.
  • 💾 Workspace Backup: Chats, settings, and WhatsApp session state sync to a private HF Dataset via huggingface_hub, preserving data automatically without storing your HF token in a git remote.
  • ⏰ Easy Keep-Alive: Uses CLOUDFLARE_WORKERS_TOKEN to automatically set up a cron-triggered keep-awake worker at boot.
  • 👥 Multi-User Messaging: Support for Telegram (multi-user) and WhatsApp (pairing).
  • 📊 Visual Dashboard: Web UI to monitor uptime, sync status, and active models.
  • 🔔 Webhooks: Get notified on restarts or backup failures via standard webhooks.
  • 🔐 Flexible Auth: Secure the Control UI with either a gateway token or a password.
  • 🏠 100% HF-Native: Runs entirely on HuggingFace's free infrastructure (2 vCPU, 16 GB RAM).

🎥 Video Tutorial

Watch a quick walkthrough on YouTube: Deploying HuggingClaw on HF Spaces.

🚀 Quick Start

Step 1: Duplicate this Space

Use the Duplicate this Space button on the Space page to create your own copy of the template.

Step 2: Add Your Secrets

Navigate to your new Space's Settings, scroll down to the Variables and secrets section, and add the following three under Secrets:

  • LLM_API_KEY – Your provider API key (e.g., Anthropic, OpenAI, OpenRouter).
  • LLM_MODEL – The model ID string you wish to use (e.g., openai/gpt-4o or google/gemini-2.5-flash).
  • GATEWAY_TOKEN – A custom password or token to secure your Control UI. (You can use any strong password, or generate one with openssl rand -hex 32 if you prefer).

HuggingClaw is flexible: only these three secrets are required to get started, and everything else can be added later.
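If you want a generated token rather than a hand-picked password, the openssl command mentioned above works anywhere openssl is installed:

```shell
# Generate a 64-character hex token suitable for GATEWAY_TOKEN.
GATEWAY_TOKEN="$(openssl rand -hex 32)"
echo "$GATEWAY_TOKEN"
```

Copy the printed value into the GATEWAY_TOKEN secret.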

Optional: if you want to pin a specific OpenClaw release instead of latest, add OPENCLAW_VERSION under Variables in your Space settings. For Docker Spaces, HF passes Variables as build args during image build, so this should be a Variable, not a Secret.

Step 3: Deploy & Run

That's it! The Space will build the container and start up automatically. You can monitor the build process in the Logs tab.

Step 4: Monitor & Manage

HuggingClaw features a built-in dashboard to track:

  • Uptime: Real-time uptime monitoring.
  • Sync Status: Visual indicators for workspace backup operations.
  • Chat Status: Real-time connection status for WhatsApp and Telegram.
  • Model Info: See which LLM is currently powering your assistant.

📱 Telegram Setup (Optional)

To chat via Telegram:

  1. Create a bot via @BotFather: send /newbot, follow prompts, and copy the bot token.
  2. Find your Telegram user ID with @userinfobot.
  3. Add CLOUDFLARE_WORKERS_TOKEN in Space secrets to let HuggingClaw auto-provision the outbound proxy, or set CLOUDFLARE_PROXY_URL manually if you already have a Worker.
  4. Add these secrets in Settings → Secrets. After restarting, the bot should appear online on Telegram.

| Variable | Default | Description |
|---|---|---|
| `TELEGRAM_BOT_TOKEN` | – | Telegram bot token from BotFather |
| `TELEGRAM_ALLOWED_USERS` | – | Comma-separated Telegram user IDs for access |
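Before restarting, you can sanity-check the token locally. The shape check below is an assumption (BotFather tokens are typically a numeric bot ID, a colon, and a long alphanumeric secret); the commented curl call is the authoritative check:

```shell
# Sanity-check a BotFather token's shape before saving it as a secret.
# The value below is a made-up example, not a real token.
TELEGRAM_BOT_TOKEN="123456789:AAE4ExampleExampleExampleExample123"
if printf '%s' "$TELEGRAM_BOT_TOKEN" | grep -Eq '^[0-9]+:[A-Za-z0-9_-]{30,}$'; then
  echo "token format looks OK"
else
  echo "token format looks wrong" >&2
fi
# To confirm the token is live (requires network access):
#   curl -s "https://api.telegram.org/bot${TELEGRAM_BOT_TOKEN}/getMe"
```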

🌐 Cloudflare Proxy Setup

Hugging Face Free Tier often restricts outbound connections to services like Telegram, Discord, and WhatsApp. HuggingClaw solves this with a Transparent Outbound Proxy via Cloudflare Workers.

⚡ Automatic Setup (Recommended)

This is the easiest way. HuggingClaw will handle the deployment for you.

  1. Create a Cloudflare API Token:
    • Go to API Tokens.
    • Create Token → Edit Cloudflare Workers template.
    • Ensure it has Account: Workers Scripts: Edit permissions.
  2. Add the token as a secret named CLOUDFLARE_WORKERS_TOKEN in your Space Settings.
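If you want to confirm the token works before handing it to HuggingClaw, Cloudflare exposes a token-verification endpoint (the token value below is a placeholder):

```shell
# Placeholder token for illustration; paste your real token when testing.
CLOUDFLARE_WORKERS_TOKEN="example_cloudflare_token"
[ -n "$CLOUDFLARE_WORKERS_TOKEN" ] && echo "token variable set"
# Live check (requires network); a valid token returns {"success":true,...}:
#   curl -s -H "Authorization: Bearer ${CLOUDFLARE_WORKERS_TOKEN}" \
#     "https://api.cloudflare.com/client/v4/user/tokens/verify"
```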

What happens next?

  • HuggingClaw automatically creates a Worker named after your Space host.
  • It generates a secure, private CLOUDFLARE_PROXY_SECRET.
  • All restricted outbound traffic is automatically routed through this Worker.

💬 WhatsApp Setup (Optional)

To use WhatsApp, enable the channel and scan the QR code from the Control UI (Channels → WhatsApp → Login):

| Variable | Default | Description |
|---|---|---|
| `WHATSAPP_ENABLED` | `false` | Enable WhatsApp pairing support |

💾 Workspace Backup (Optional)

HuggingClaw automatically syncs your workspace (chats, settings, sessions) to a private HF Dataset named huggingclaw-backup.

  • Persistence: Survives restarts and restores your state on boot.
  • WhatsApp: Stores session credentials so you don't have to scan the QR code every time.
  • Interval: Syncs every 3 minutes by default.

| Variable | Default | Description |
|---|---|---|
| `HF_TOKEN` | – | HF token with Write access |
| `SYNC_INTERVAL` | `180` | Backup frequency in seconds |
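A simplified sketch of the boot-time decision the sync engine makes (the real engine is a Python process; the variable names come from the table above and the dataset name from this README):

```shell
# If HF_TOKEN is present, restore the workspace from the private
# huggingclaw-backup dataset on boot, then re-sync on an interval.
HF_TOKEN="hf_example_token"        # placeholder; use a token with Write access
SYNC_INTERVAL="${SYNC_INTERVAL:-180}"

if [ -n "${HF_TOKEN}" ]; then
  echo "restoring workspace from huggingclaw-backup, syncing every ${SYNC_INTERVAL}s"
else
  echo "no HF_TOKEN set: workspace will not survive restarts"
fi
```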

💓 Staying Alive (Recommended on Free HF Spaces)

Your Space will automatically be kept awake by a background Cloudflare Worker when you configure the CLOUDFLARE_WORKERS_TOKEN secret. The worker uses a cron trigger to regularly ping your Space's /health endpoint. The dashboard displays the current keep-alive worker status.

🔔 Webhooks (Optional)

Get notified when your Space restarts or if a backup fails:

| Variable | Default | Description |
|---|---|---|
| `WEBHOOK_URL` | – | Endpoint URL for POST JSON notifications |

πŸ” Security & Advanced (Optional)

Configure password access and network restrictions:

| Variable | Default | Description |
|---|---|---|
| `OPENCLAW_PASSWORD` | – | Enable simple password auth instead of token |
| `TRUSTED_PROXIES` | – | Comma-separated IPs of HF proxies |
| `ALLOWED_ORIGINS` | – | Comma-separated allowed origins for Control UI |
| `CLOUDFLARE_KEEPALIVE_ENABLED` | `true` | Set to `false` to disable the automatic Cloudflare KeepAlive worker |

🤖 LLM Providers

HuggingClaw supports all providers from OpenClaw. Set LLM_MODEL=<provider/model> and the provider is auto-detected.
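The prefix convention can be sketched with plain shell parameter expansion (an illustration of the convention, not OpenClaw's actual parsing code):

```shell
# Everything before the first "/" names the provider; the rest is the model ID.
LLM_MODEL="openrouter/anthropic/claude-3.5-sonnet"
provider="${LLM_MODEL%%/*}"   # strip the longest suffix starting at the first "/"
model="${LLM_MODEL#*/}"       # strip the shortest prefix ending at the first "/"
echo "provider=${provider} model=${model}"
```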

Supported providers and examples:

| Provider | Prefix | Example Model |
|---|---|---|
| Anthropic | `anthropic/` | `anthropic/claude-3-5-sonnet-latest` |
| OpenAI | `openai/` | `openai/gpt-4o` |
| Google | `google/` | `google/gemini-2.0-flash` |
| DeepSeek | `deepseek/` | `deepseek/deepseek-chat` |
| xAI (Grok) | `xai/` | `xai/grok-2-latest` |
| Mistral | `mistral/` | `mistral/mistral-large-latest` |
| HuggingFace | `huggingface/` | `huggingface/deepseek-ai/DeepSeek-R1` |
| OpenRouter | `openrouter/` | `openrouter/anthropic/claude-3.5-sonnet` |

And many more: Cohere, Groq, NVIDIA, Moonshot, etc.

Any Other Provider

You can also use any custom provider:

```
LLM_API_KEY=your_api_key
LLM_MODEL=provider/model-name
```

The provider prefix in LLM_MODEL tells HuggingClaw how to call it. See OpenClaw Model Providers for the full list.

Custom OpenAI-Compatible Provider

Register a custom endpoint at startup without modifying the CLI.

| Variable | Description | Default |
|---|---|---|
| `CUSTOM_PROVIDER_NAME` | Unique provider prefix (e.g., `modal`) | Required |
| `CUSTOM_BASE_URL` | API base URL (e.g., `https://.../v1`) | Required |
| `CUSTOM_MODEL_ID` | Model ID on the server | Required |
| `LLM_MODEL` | Must match `{CUSTOM_PROVIDER_NAME}/{CUSTOM_MODEL_ID}` | Required |
| `CUSTOM_API_KEY` | Provider-specific key | `LLM_API_KEY` |
| `CUSTOM_CONTEXT_WINDOW` | Context limit | `128000` |

CUSTOM_PROVIDER_NAME cannot override built-in providers (openai, anthropic, etc.).

Example (Modal):

```
CUSTOM_PROVIDER_NAME=modal
CUSTOM_BASE_URL=https://api.us-west-2.modal.direct/v1
CUSTOM_MODEL_ID=zai-org/GLM-5.1-FP8
LLM_MODEL=modal/zai-org/GLM-5.1-FP8
```

💻 Local Development

```bash
git clone https://github.com/somratpro/huggingclaw.git
cd huggingclaw
cp .env.example .env
# Edit .env with your secret values
```

With Docker:

```bash
docker build --build-arg OPENCLAW_VERSION=latest -t huggingclaw .
docker run -p 7861:7861 --env-file .env huggingclaw
```

Without Docker:

```bash
npm install -g openclaw@latest
# Load .env safely (export $(cat .env | xargs) breaks on comments and quoted values):
set -a; source .env; set +a
bash start.sh
```

🔗 CLI Access

After deploying, you can connect via the OpenClaw CLI (e.g., to onboard channels or run agents):

```bash
npm install -g openclaw@latest
openclaw channels login --gateway https://YOUR_SPACE_NAME.hf.space
# When prompted, enter your GATEWAY_TOKEN
```

πŸ—οΈ Architecture

HuggingClaw uses a multi-layered approach to ensure stability and persistence on Hugging Face's ephemeral infrastructure.

Key components:
  • Dashboard (/): Management, monitoring, and keep-alive tools.
  • Control UI (/gateway): Secure interface for managing agents and channels.
  • Health Check (/health): Endpoint for uptime monitoring and readiness probes.
  • Sync Engine: Python background process managing HF Dataset persistence.
  • Transparent Proxy: Interceptor for requests to blocked domains (Telegram, etc.).

Startup sequence:

  1. Validate required secrets and check HF token.
  2. Resolve backup namespace and restore workspace from HF Dataset.
  3. Generate openclaw.json configuration.
  4. Launch background tasks (auto-sync, channel helpers).
  5. Start OpenClaw gateway and listen for connections.
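Step 1 of the sequence can be sketched in shell (a simplified illustration with placeholder values; the actual startup script does more than this):

```shell
# Sketch of step 1: fail fast when a required secret is missing.
require() {
  eval "val=\${$1:-}"
  [ -n "$val" ] || { echo "missing required secret: $1" >&2; return 1; }
}

# Placeholder values for illustration only.
LLM_API_KEY="sk-example"
LLM_MODEL="openai/gpt-4o"
GATEWAY_TOKEN="example-token"

require LLM_API_KEY && require LLM_MODEL && require GATEWAY_TOKEN \
  && echo "required secrets present"
```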

πŸ› Troubleshooting

  • Missing secrets: Ensure LLM_API_KEY, LLM_MODEL, and GATEWAY_TOKEN are set in your Space Settings → Secrets.
  • Telegram bot issues: Verify your TELEGRAM_BOT_TOKEN. Check Space logs for lines like 📱 Enabling Telegram.
  • Backup restore failing: Make sure HF_TOKEN is valid and has write access to your HF account dataset. Set HF_USERNAME only if auto-detection is not available in your environment.
  • Space keeps sleeping: Add CLOUDFLARE_WORKERS_TOKEN as a Space secret to enable automatic keep-awake monitoring via Cloudflare Workers.
  • Auth errors / proxy: If you see reverse-proxy auth errors, add the logged IPs under TRUSTED_PROXIES (from logs remote=x.x.x.x).
  • Control UI says too many failed authentication attempts: Wait for the retry window to expire, then open the Space in an incognito window or clear site storage for your Space before logging in again with GATEWAY_TOKEN.
  • WhatsApp lost its session after restart: Make sure HF_TOKEN is configured so the hidden session backup can be restored on boot.
  • UI blocked (CORS): Set ALLOWED_ORIGINS=https://your-space-name.hf.space.
  • Version mismatches: Pin a specific OpenClaw build with the OPENCLAW_VERSION Variable in HF Spaces, or --build-arg OPENCLAW_VERSION=... locally.

🌟 More Projects

Similar projects by @somratpro – all free, one-click deploy on HF Spaces:

| Project | What it runs | HF Space | GitHub |
|---|---|---|---|
| HuggingMes | Hermes – self-hosted agent gateway | Space | Repo |
| Hugging8n | n8n – workflow & automation platform | Space | Repo |
| HuggingClip | Paperclip – AI agent orchestration platform | Space | Repo |


❀️ Support

If HuggingClaw saves you time, consider buying me a coffee to keep the projects alive!

USDT (TRC-20 / TRON network only)

TELx8TJz1W1h7n6SgpgGNNGZXpJCEUZrdB

Send USDT on TRC-20 network only. Sending other tokens or using a different network will result in permanent loss.

🤝 Contributing

Contributions are welcome! Please see CONTRIBUTING.md for guidelines.

📄 License

MIT – see LICENSE for details.

Made with ❀️ by @somratpro for the OpenClaw community.