---
title: HuggingClaw
emoji: 🔥
colorFrom: yellow
colorTo: red
sdk: docker
pinned: false
license: mit
datasets:
  - tao-shen/HuggingClaw-data
short_description: Free always-on AI assistant, no hardware required
app_port: 7860
tags:
  - huggingface
  - openrouter
  - chatbot
  - llm
  - openclaw
  - ai-assistant
  - whatsapp
  - telegram
  - text-generation
  - openai-api
  - huggingface-spaces
  - docker
  - deployment
  - persistent-storage
  - agents
  - multi-channel
  - openai-compatible
  - free-tier
  - one-click-deploy
  - self-hosted
  - messaging-bot
---
# HuggingClaw

**Your always-on AI assistant — free, no server needed**

WhatsApp · Telegram · 40+ channels · 16 GB RAM · One-click deploy · Auto-persistent

[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](LICENSE) [![Hugging Face](https://img.shields.io/badge/🤗-Hugging%20Face-yellow)](https://huggingface.co) [![HF Spaces](https://img.shields.io/badge/Spaces-HuggingFace-blue)](https://huggingface.co/spaces/tao-shen/HuggingClaw) [![OpenClaw](https://img.shields.io/badge/OpenClaw-Powered-orange)](https://github.com/openclaw/openclaw) [![Docker](https://img.shields.io/badge/Docker-Ready-2496ED?logo=docker)](https://www.docker.com/) [![OpenAI Compatible](https://img.shields.io/badge/OpenAI--compatible-API-green)](https://openclawdoc.com/docs/reference/environment-variables) [![WhatsApp](https://img.shields.io/badge/WhatsApp-Enabled-25D366?logo=whatsapp)](https://www.whatsapp.com/) [![Telegram](https://img.shields.io/badge/Telegram-Enabled-26A5E4?logo=telegram)](https://telegram.org/) [![Free Tier](https://img.shields.io/badge/Free%20Tier-16GB%20RAM-brightgreen)](https://huggingface.co/spaces)
---

## What you get

In about 5 minutes, you'll have a **free, always-on AI assistant** connected to WhatsApp, Telegram, and 40+ other channels — no server, no subscription, no hardware required.

| | |
|---|---|
| **Free forever** | HuggingFace Spaces gives you 2 vCPU + 16 GB RAM at no cost |
| **Always online** | Your conversations, settings, and credentials survive every restart |
| **WhatsApp & Telegram** | Works reliably, including channels that HF Spaces normally blocks |
| **Any LLM** | OpenAI, Claude, Gemini, OpenRouter (200+ models, free tier available), or your own Ollama |
| **One-click deploy** | Duplicate the Space, set two secrets, done |

> **Powered by [OpenClaw](https://github.com/openclaw/openclaw)** — an open-source AI assistant that normally requires your own machine (e.g. a Mac Mini). HuggingClaw makes it run for free on HuggingFace Spaces by solving two Spaces limitations: data loss on restart (fixed via HF Dataset sync) and DNS failures for some domains like WhatsApp (fixed via DNS-over-HTTPS).

## Architecture

*(Architecture diagram)*
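The DNS-over-HTTPS workaround mentioned above can be sketched as follows. This is a minimal illustration using Cloudflare's public DoH JSON API, not HuggingClaw's actual code; the function names are hypothetical.

```python
import json
import urllib.request

# Cloudflare's public DNS-over-HTTPS resolver (JSON API).
DOH_ENDPOINT = "https://cloudflare-dns.com/dns-query"


def parse_doh_answer(payload: dict) -> list[str]:
    """Extract A-record IPs from a DoH JSON response (type 1 = A record)."""
    return [a["data"] for a in payload.get("Answer", []) if a.get("type") == 1]


def resolve_over_https(hostname: str) -> list[str]:
    """Resolve a hostname over HTTPS, bypassing the Space's blocked DNS."""
    req = urllib.request.Request(
        f"{DOH_ENDPOINT}?name={hostname}&type=A",
        headers={"Accept": "application/dns-json"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return parse_doh_answer(json.load(resp))
```

The resolved IPs can then be pinned (for example via an `/etc/hosts` entry) so that domains like `web.whatsapp.com` remain reachable even when the Space's default resolver fails.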
## Quick Start

### 1. Duplicate this Space

Click **Duplicate this Space** on the [HuggingClaw Space page](https://huggingface.co/spaces/tao-shen/HuggingClaw).

> **After duplicating:** Edit your Space's `README.md` and update the `datasets:` field in the YAML header to point to your own dataset repo (e.g. `your-name/YourSpace-data`), or remove it entirely. This prevents your Space from appearing as linked to the original dataset.

### 2. Set Secrets

Go to **Settings → Repository secrets** and add the following. The only two you *must* set are `HF_TOKEN` and one API key.

| Secret | Status | Description | Example |
|--------|:------:|-------------|---------|
| `HF_TOKEN` | **Required** | HF Access Token with write permission ([create one](https://huggingface.co/settings/tokens)) | `hf_AbCdEfGhIjKlMnOpQrStUvWxYz` |
| `AUTO_CREATE_DATASET` | **Recommended** | Set to `true` — HuggingClaw will automatically create a private backup dataset on first startup. No manual setup needed. | `true` |
| `GROQ_API_KEY` | Recommended | [Groq](https://console.groq.com) API key — fastest inference (Llama 3.3 70B) | `gsk_xxxxxxxxxxxx` |
| `OPENROUTER_API_KEY` | Recommended | [OpenRouter](https://openrouter.ai) API key — 200+ models, free tier available. Easiest way to get started. | `sk-or-v1-xxxxxxxxxxxx` |
| `XAI_API_KEY` | Optional | [xAI Grok](https://console.x.ai) API key — fast inference, Grok-beta model | `xai-xxxxxxxxxxxx` |
| `OPENAI_API_KEY` | Optional | OpenAI (or any [OpenAI-compatible](https://openclawdoc.com/docs/reference/environment-variables)) API key | `sk-proj-xxxxxxxxxxxx` |
| `ANTHROPIC_API_KEY` | Optional | Anthropic Claude API key | `sk-ant-xxxxxxxxxxxx` |
| `GOOGLE_API_KEY` | Optional | Google / Gemini API key | `AIzaSyXxXxXxXxXx` |
| `OPENCLAW_DEFAULT_MODEL` | Optional | Default model for new conversations | `groq/llama-3.3-70b-versatile` |

### Data Persistence

HuggingClaw syncs `~/.openclaw` (conversations, settings, credentials) to a private HuggingFace Dataset repo so your data survives every restart.

**Option A — Auto mode (recommended)**

1. Set `AUTO_CREATE_DATASET` = `true` in your Space secrets
2. Set `HF_TOKEN` with write permission
3. Done — on first startup, HuggingClaw automatically creates a private Dataset repo named `your-username/SpaceName-data`. Each duplicated Space gets its own isolated dataset.

> (Optional) Set `OPENCLAW_DATASET_REPO` = `your-name/custom-name` if you prefer a specific repo name.

**Option B — Manual mode**

1. Go to [huggingface.co/new-dataset](https://huggingface.co/new-dataset) and create a **private** Dataset repo (e.g. `your-name/HuggingClaw-data`)
2. Set `OPENCLAW_DATASET_REPO` = `your-name/HuggingClaw-data` in your Space secrets
3. Set `HF_TOKEN` with write permission
4. Done — HuggingClaw will sync to this repo every 60 seconds

> **Security note:** `AUTO_CREATE_DATASET` defaults to `false` — HuggingClaw will never create repos on your behalf unless you explicitly opt in.

### Running Local Models (CPU-Friendly)

HuggingClaw can run small models (≤1B parameters) **locally on CPU** — perfect for the HF Spaces free tier!
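As an aside on the Data Persistence behavior described above, the periodic backup can be sketched as a plain loop. This is a minimal illustration, not HuggingClaw's actual code: `default_dataset_repo` and `sync_loop` are hypothetical names, and in practice the upload step would call `huggingface_hub.HfApi().upload_folder(...)` against the Dataset repo.

```python
import time
from typing import Callable, Optional


def default_dataset_repo(space_id: str) -> str:
    """Derive the auto-created backup repo name from the Space id,
    e.g. 'alice/MySpace' -> 'alice/MySpace-data'."""
    owner, name = space_id.split("/", 1)
    return f"{owner}/{name}-data"


def sync_loop(upload: Callable[[], None],
              interval: int = 60,
              max_rounds: Optional[int] = None) -> int:
    """Call `upload` every `interval` seconds; returns rounds completed.

    `max_rounds` exists only so this sketch can terminate; the real loop
    would run forever alongside the OpenClaw process, with `upload` doing
    something like:
        HfApi().upload_folder(folder_path="~/.openclaw",
                              repo_id=repo, repo_type="dataset")
    """
    rounds = 0
    while max_rounds is None or rounds < max_rounds:
        upload()
        rounds += 1
        if max_rounds is None or rounds < max_rounds:
            time.sleep(interval)
    return rounds
```

The `interval` default of 60 matches `SYNC_INTERVAL`, and the repo-name rule matches the `your-username/SpaceName-data` convention used by auto mode.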
**Supported Models:**

- **NeuralNexusLab/HacKing** (0.6B) — ✅ Recommended
- TinyLlama-1.1B
- Qwen-1.5B
- Phi-2 (2.7B, may be slower)

**Quick Setup:**

1. **Set these secrets** in your Space:

   | Secret | Value |
   |--------|-------|
   | `LOCAL_MODEL_ENABLED` | `true` |
   | `LOCAL_MODEL_NAME` | `neuralnexuslab/hacking` |
   | `LOCAL_MODEL_ID` | `neuralnexuslab/hacking` |
   | `LOCAL_MODEL_NAME_DISPLAY` | `NeuralNexus HacKing 0.6B` |

2. **Wait for startup** — the model is pulled on first startup (~30 seconds for 0.6B)
3. **Connect to the Control UI** — the local model will appear in the model selector

**Performance Expectations:**

| Model Size | CPU Speed (tokens/s) | RAM Usage |
|------------|---------------------|-----------|
| 0.6B | 20-50 t/s | ~500 MB |
| 1B | 10-20 t/s | ~1 GB |
| 3B | 3-8 t/s | ~2 GB |

> **Note:** 0.6B models run very smoothly on the HF Spaces free tier (2 vCPU, 16 GB RAM)

### Environment Variables

Fine-tune persistence and performance. Set these as **Repository secrets** in HF Spaces, or in `.env` for local Docker.

| Variable | Default | Description |
|----------|---------|-------------|
| `GATEWAY_TOKEN` | `huggingclaw` | **Gateway token for Control UI access.** Override to set a custom token. |
| `AUTO_CREATE_DATASET` | `false` | **Auto-create the Dataset repo.** Set to `true` to auto-create a private Dataset repo on first startup. |
| `SYNC_INTERVAL` | `60` | **Backup interval in seconds.** How often data syncs to the Dataset repo. |

> For the full list (including `OPENAI_BASE_URL`, `OLLAMA_HOST`, proxy settings, etc.), see [`.env.example`](.env.example).

### 3. Open the Control UI

Visit your Space URL and enter the gateway token (default: `huggingclaw`) to connect. Customize the token via the `GATEWAY_TOKEN` secret. Messaging integrations (Telegram, WhatsApp) can be configured directly inside the Control UI after connecting.

> **Telegram note:** HF Spaces blocks DNS resolution for `api.telegram.org`. HuggingClaw automatically probes alternative API endpoints at startup and selects one that works — no manual configuration needed.

## Configuration

HuggingClaw supports **all OpenClaw environment variables** — it passes the entire environment to the OpenClaw process (`env=os.environ.copy()`), so any variable from the [OpenClaw docs](https://openclawdoc.com/docs/reference/environment-variables) works out of the box in HF Spaces. This includes:

- **API keys** — `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `GOOGLE_API_KEY`, `MISTRAL_API_KEY`, `COHERE_API_KEY`, `OPENROUTER_API_KEY`, `GROQ_API_KEY`, `XAI_API_KEY`
- **Server** — `OPENCLAW_API_PORT`, `OPENCLAW_WS_PORT`, `OPENCLAW_HOST`
- **Memory** — `OPENCLAW_MEMORY_BACKEND`, `OPENCLAW_REDIS_URL`, `OPENCLAW_SQLITE_PATH`
- **Network** — `OPENCLAW_HTTP_PROXY`, `OPENCLAW_HTTPS_PROXY`, `OPENCLAW_NO_PROXY`
- **Ollama** — `OLLAMA_HOST`, `OLLAMA_NUM_PARALLEL`, `OLLAMA_KEEP_ALIVE`
- **Secrets** — `OPENCLAW_SECRETS_BACKEND`, `VAULT_ADDR`, `VAULT_TOKEN`

HuggingClaw adds its own variables for persistence and deployment: `HF_TOKEN`, `OPENCLAW_DATASET_REPO`, `AUTO_CREATE_DATASET`, `SYNC_INTERVAL`, `OPENCLAW_DEFAULT_MODEL`, etc. See [`.env.example`](.env.example) for the complete reference.

## Security

- **Token authentication** — the Control UI requires a gateway token to connect (default: `huggingclaw`, customizable via `GATEWAY_TOKEN`)
- **Secrets stay server-side** — API keys and tokens are never exposed to the browser
- **Private backups** — the Dataset repo is created as private by default

## License

MIT