---
title: HuggingClaw
emoji: 🦞
colorFrom: blue
colorTo: purple
sdk: docker
app_port: 7861
pinned: true
license: mit
---

<!-- Badges -->
[![GitHub Stars](https://img.shields.io/github/stars/somratpro/huggingclaw?style=flat-square)](https://github.com/somratpro/huggingclaw)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg?style=flat-square)](https://opensource.org/licenses/MIT)
[![HF Space](https://img.shields.io/badge/🤗%20HuggingFace-Space-blue?style=flat-square)](https://huggingface.co/spaces)
[![OpenClaw](https://img.shields.io/badge/OpenClaw-Gateway-red?style=flat-square)](https://github.com/openclaw/openclaw)

**Your always-on AI assistant: free, no server needed.** HuggingClaw runs [OpenClaw](https://openclaw.ai) on HuggingFace Spaces, giving you a 24/7 AI chat assistant on Telegram and WhatsApp. It works with *any* large language model (Claude, ChatGPT, Gemini, and more) and even supports custom models via [OpenRouter](https://openrouter.ai). Deploy in minutes on the free HF Spaces tier (2 vCPU, 16GB RAM, 50GB disk), with automatic workspace backup to a HuggingFace Dataset so your chat history and settings persist across restarts.

## Table of Contents

- [✨ Features](#-features)
- [🎥 Video Tutorial](#-video-tutorial)
- [🚀 Quick Start](#-quick-start)
- [📱 Telegram Setup *(Optional)*](#-telegram-setup-optional)
- [💬 WhatsApp Setup *(Optional)*](#-whatsapp-setup-optional)
- [💾 Workspace Backup *(Optional)*](#-workspace-backup-optional)
- [💓 Staying Alive](#-staying-alive-recommended-on-free-hf-spaces)
- [🔔 Webhooks *(Optional)*](#-webhooks-optional)
- [🔐 Security & Advanced *(Optional)*](#-security--advanced-optional)
- [🤖 LLM Providers](#-llm-providers)
- [💻 Local Development](#-local-development)
- [🔗 CLI Access](#-cli-access)
- [🏗️ Architecture](#-architecture)
- [🐛 Troubleshooting](#-troubleshooting)
- [📚 Links](#-links)
- [🤝 Contributing](#-contributing)
- [📄 License](#-license)

## ✨ Features

- 🔌 **Any LLM:** Use Claude, OpenAI GPT, Google Gemini, Grok, DeepSeek, Qwen, and 40+ providers (set `LLM_API_KEY` and `LLM_MODEL` accordingly).
- ⚡ **Zero Config:** Duplicate this Space and set **just three** secrets (`LLM_API_KEY`, `LLM_MODEL`, `GATEWAY_TOKEN`); no other setup needed.
- 🐳 **Fast Builds:** Uses a pre-built OpenClaw Docker image to deploy in minutes.
- 🌐 **Built-In Browser:** Headless Chromium is included in the Space, so browser actions work from the start.
- 💾 **Workspace Backup:** Chats, settings, and WhatsApp session state sync to a private HF Dataset via the `huggingface_hub` library (with a Git fallback), preserving data automatically.
- ⏰ **External Keep-Alive:** Set up a one-time UptimeRobot monitor from the dashboard to help keep free HF Spaces awake.
- 👥 **Multi-User Messaging:** Support for Telegram (multi-user) and WhatsApp (pairing).
- 📊 **Visual Dashboard:** A clean web UI to monitor uptime, sync status, and the active model.
- 🔔 **Webhooks:** Get notified of restarts or backup failures via standard webhooks.
- 🔐 **Flexible Auth:** Secure the Control UI with either a gateway token or a password.
- 🏠 **100% HF-Native:** Runs entirely on HuggingFace's free infrastructure (2 vCPU, 16GB RAM).

## 🎥 Video Tutorial

Watch a quick walkthrough on YouTube: [Deploying HuggingClaw on HF Spaces](https://www.youtube.com/watch?v=S6pl7NmjX7g&t=73s).

## 🚀 Quick Start

### Step 1: Duplicate this Space

[![Duplicate this Space](https://huggingface.co/datasets/huggingface/badges/resolve/main/duplicate-this-space-xl.svg)](https://huggingface.co/spaces/somratpro/HuggingClaw?duplicate=true)

Click the button above to duplicate the template.

### Step 2: Add Your Secrets

Navigate to your new Space's **Settings**, scroll down to the **Variables and secrets** section, and add the following three under **Secrets**:

- `LLM_API_KEY` – Your provider API key (e.g., Anthropic, OpenAI, OpenRouter).
- `LLM_MODEL` – The model ID string you want to use (e.g., `openai/gpt-4o` or `google/gemini-2.5-flash`).
- `GATEWAY_TOKEN` – A custom password or token to secure your Control UI *(any strong password works, or generate one with `openssl rand -hex 32`)*.

> [!TIP]
> HuggingClaw is completely flexible! You only need these three secrets to get started. You can set other secrets later.
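
When developing locally (see [Local Development](#-local-development)), the same three settings go in your `.env` file. A sketch with hypothetical placeholder values (none of these keys or tokens are real):

```bash
# Hypothetical placeholder values; substitute your own
LLM_API_KEY="sk-ant-api03-EXAMPLE"
LLM_MODEL="anthropic/claude-sonnet-4-6"
# e.g. generated with: openssl rand -hex 32
GATEWAY_TOKEN="4f1d2a3b4c5d6e7f8091a2b3c4d5e6f70819aabbccddeeff0011223344556677"
```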

Optional: if you want to pin a specific OpenClaw release instead of `latest`, add `OPENCLAW_VERSION` under **Variables** in your Space settings. For Docker Spaces, HF passes Variables as build args during image build, so this should be a Variable, not a Secret.

### Step 3: Deploy & Run

That's it! The Space will build the container and start up automatically. You can monitor the build process in the **Logs** tab.

### Step 4: Monitor & Manage

HuggingClaw features a built-in dashboard to track:

- **Uptime:** Real-time uptime monitoring.
- **Sync Status:** Visual indicators for workspace backup operations.
- **Chat Status:** Real-time connection status for WhatsApp and Telegram.
- **Model Info:** See which LLM is currently powering your assistant.

## 📱 Telegram Setup *(Optional)*

To chat via Telegram:

1. Create a bot via [@BotFather](https://t.me/BotFather): send `/newbot`, follow prompts, and copy the bot token.
2. Find your Telegram user ID with [@userinfobot](https://t.me/userinfobot).
3. Add these secrets in Settings → Secrets. After restarting, the bot should appear online on Telegram.

| Variable | Default | Description |
| :--- | :--- | :--- |
| `TELEGRAM_BOT_TOKEN` | — | Telegram bot token from BotFather |
| `TELEGRAM_USER_ID` | — | Single Telegram user ID allowlist |
| `TELEGRAM_USER_IDS` | — | Comma-separated Telegram user IDs for team access |
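
For example, a single-user setup might use (hypothetical values; the real token comes from BotFather, the ID from @userinfobot):

```bash
# Hypothetical placeholder values; substitute your own
TELEGRAM_BOT_TOKEN="123456789:AAExampleTokenFromBotFather"
TELEGRAM_USER_ID="11111111"
# For team access, use a comma-separated list instead:
# TELEGRAM_USER_IDS="11111111,22222222"
```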

## 💬 WhatsApp Setup *(Optional)*

To use WhatsApp, enable the channel and scan the QR code from the Control UI (**Channels** → **WhatsApp** → **Login**):

| Variable | Default | Description |
| :--- | :--- | :--- |
| `WHATSAPP_ENABLED` | `false` | Enable WhatsApp pairing support |

## 💾 Workspace Backup *(Optional)*

For persistent chat history and configuration, HuggingClaw can sync your workspace to a private HuggingFace Dataset. On first run, it automatically creates (or reuses) the backup Dataset repo (named from `HF_USERNAME` and `BACKUP_DATASET_NAME`), restores your workspace on startup, and syncs changes periodically.

| Variable | Default | Description |
| :--- | :--- | :--- |
| `HF_USERNAME` | — | Your HuggingFace username |
| `HF_TOKEN` | — | HF token with write access |
| `BACKUP_DATASET_NAME` | `huggingclaw-backup` | Dataset name for backup repo |
| `SYNC_INTERVAL` | `180` | Sync interval in seconds |
| `WORKSPACE_GIT_USER` | `openclaw@example.com` | Git commit email for syncs |
| `WORKSPACE_GIT_NAME` | `OpenClaw Bot` | Git commit name for syncs |
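
A minimal backup configuration might look like this (hypothetical values; `SYNC_INTERVAL` and the dataset name just restate the defaults):

```bash
# Hypothetical placeholder values; substitute your own
HF_USERNAME="your-hf-username"
HF_TOKEN="hf_EXAMPLEWRITETOKEN"          # needs write access
BACKUP_DATASET_NAME="huggingclaw-backup" # default name
SYNC_INTERVAL="180"                      # seconds between syncs
```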

> [!TIP]
> This backup also stores a hidden copy of your WhatsApp session credentials, allowing paired logins to survive Space restarts automatically.

## 💓 Staying Alive *(Recommended on Free HF Spaces)*

Free Hugging Face Spaces go to sleep after inactivity, and HuggingClaw no longer relies on internal self-pings. To help keep a public Space awake, set up an external UptimeRobot monitor from the dashboard.

Use the **Main API key** from UptimeRobot, not the `Read-only API key` or a `Monitor-specific API key`.

Setup:

1. Open your Space dashboard at `/`.
2. Find **Keep Space Awake**.
3. Paste your UptimeRobot **Main API key**.
4. Click **Create Monitor**.

What happens next:

- HuggingClaw creates a monitor for `https://your-space.hf.space/health`
- UptimeRobot keeps pinging it from outside Hugging Face
- You only need to do this once

You do **not** need to add this key to Hugging Face Space Secrets.

Note:

- This works for **public** Spaces.
- It does **not** work reliably for **private** Spaces, because external monitors cannot access private HF health URLs.

## 🔔 Webhooks *(Optional)*

Get notified when your Space restarts or if a backup fails:

| Variable | Default | Description |
| :--- | :--- | :--- |
| `WEBHOOK_URL` | — | Endpoint URL for POST JSON notifications |

## πŸ” Security & Advanced *(Optional)*

Configure password access and network restrictions:

| Variable | Default | Description |
| :--- | :--- | :--- |
| `OPENCLAW_PASSWORD` | — | Enable simple password auth instead of token |
| `TRUSTED_PROXIES` | — | Comma-separated IPs of HF proxies |
| `ALLOWED_ORIGINS` | — | Comma-separated allowed origins for Control UI |
| `OPENCLAW_VERSION` | `latest` | Build-time pin for the OpenClaw image tag |
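
For example, to switch the Control UI to password auth and restrict browser origins to your Space (hypothetical values):

```bash
# Hypothetical placeholder values; substitute your own
OPENCLAW_PASSWORD="a-long-unique-password"
ALLOWED_ORIGINS="https://your-space-name.hf.space"
```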

## 🤖 LLM Providers

HuggingClaw supports **all providers** from OpenClaw. Set `LLM_MODEL=<provider/model>` and the provider is auto-detected. For example:

| Provider         | Prefix          | Example Model                         | API Key Source                                       |
| :--------------- | :-------------- | :------------------------------------ | :--------------------------------------------------- |
| **Anthropic**    | `anthropic/`    | `anthropic/claude-sonnet-4-6`         | [Anthropic Console](https://console.anthropic.com/) |
| **OpenAI**       | `openai/`       | `openai/gpt-5.4`                      | [OpenAI Platform](https://platform.openai.com/)     |
| **Google**       | `google/`       | `google/gemini-2.5-flash`             | [AI Studio](https://ai.google.dev/)                  |
| **DeepSeek**     | `deepseek/`     | `deepseek/deepseek-v3.2`              | [DeepSeek](https://platform.deepseek.com)            |
| **xAI (Grok)**   | `xai/`          | `xai/grok-4`                          | [xAI](https://console.x.ai)                          |
| **Mistral**      | `mistral/`      | `mistral/mistral-large-latest`        | [Mistral Console](https://console.mistral.ai)        |
| **Moonshot**     | `moonshot/`     | `moonshot/kimi-k2.5`                  | [Moonshot](https://platform.moonshot.cn)             |
| **Cohere**       | `cohere/`       | `cohere/command-a`                    | [Cohere Dashboard](https://dashboard.cohere.com)    |
| **Groq**         | `groq/`         | `groq/mixtral-8x7b-32768`             | [Groq](https://console.groq.com)                     |
| **MiniMax**      | `minimax/`      | `minimax/minimax-m2.7`                | [MiniMax](https://platform.minimax.io)               |
| **NVIDIA**       | `nvidia/`       | `nvidia/nemotron-3-super-120b-a12b`   | [NVIDIA API](https://api.nvidia.com)                |
| **Z.ai (GLM)**   | `zai/`          | `zai/glm-5`                           | [Z.ai](https://z.ai)                                 |
| **Volcengine**   | `volcengine/`   | `volcengine/doubao-seed-1-8-251228`   | [Volcengine](https://www.volcengine.com)            |
| **HuggingFace**  | `huggingface/`  | `huggingface/deepseek-ai/DeepSeek-R1` | [HF Tokens](https://huggingface.co/settings/tokens) |
| **OpenCode Zen** | `opencode/`     | `opencode/claude-opus-4-6`            | [OpenCode.ai](https://opencode.ai/auth)              |
| **OpenCode Go**  | `opencode-go/`  | `opencode-go/kimi-k2.5`               | [OpenCode.ai](https://opencode.ai/auth)              |
| **Kilo Gateway** | `kilocode/`     | `kilocode/anthropic/claude-opus-4.6`  | [Kilo.ai](https://kilo.ai)                           |

### OpenRouter – 200+ Models with One Key

Get an [OpenRouter](https://openrouter.ai) API key to use *all* providers. For example:

```bash
LLM_API_KEY=sk-or-v1-xxxxxxxx
LLM_MODEL=openrouter/openai/gpt-5.4
```

Popular options include `openrouter/google/gemini-2.5-flash` or `openrouter/meta-llama/llama-3.3-70b-instruct`.

### Any Other Provider

You can also use any custom provider:

```bash
LLM_API_KEY=your_api_key
LLM_MODEL=provider/model-name
```

The provider prefix in `LLM_MODEL` tells HuggingClaw how to call it. See [OpenClaw Model Providers](https://docs.openclaw.ai/concepts/model-providers) for the full list.

## 💻 Local Development

```bash
git clone https://github.com/somratpro/huggingclaw.git
cd huggingclaw
cp .env.example .env
# Edit .env with your secret values
```

**With Docker:**

```bash
docker build --build-arg OPENCLAW_VERSION=latest -t huggingclaw .
docker run -p 7861:7861 --env-file .env huggingclaw
```

**Without Docker:**

```bash
npm install -g openclaw@latest
set -a; source .env; set +a  # load .env (handles comments and quoted values)
bash start.sh
```

## 🔗 CLI Access

After deploying, you can connect via the OpenClaw CLI (e.g., to onboard channels or run agents):

```bash
npm install -g openclaw@latest
openclaw channels login --gateway https://YOUR_SPACE_NAME.hf.space
# When prompted, enter your GATEWAY_TOKEN
```

## πŸ—οΈ Architecture

```bash
HuggingClaw/
├── Dockerfile          # Multi-stage build using pre-built OpenClaw image
├── start.sh            # Config generator, validator, and orchestrator
├── workspace-sync.py   # Syncs workspace to HF Datasets (with Git fallback)
├── health-server.js    # /health endpoint for uptime checks
├── dns-fix.js          # DNS-over-HTTPS fallback (for blocked domains)
├── .env.example        # Environment variable reference
└── README.md           # (this file)
```

**Startup sequence:**

1. Validate required secrets (fail fast with a clear error).
2. Check the HF token (warn if expired or missing).
3. Auto-create the backup dataset if missing.
4. Restore the workspace from the HF Dataset.
5. Generate `openclaw.json` from environment variables.
6. Print a startup summary.
7. Launch background tasks (auto-sync and optional channel helpers).
8. Launch the OpenClaw gateway (start listening).
9. On `SIGTERM`, save the workspace and exit cleanly.

## πŸ› Troubleshooting

- **Missing secrets:** Ensure `LLM_API_KEY`, `LLM_MODEL`, and `GATEWAY_TOKEN` are set in your Space **Settings → Secrets**.
- **Telegram bot issues:** Verify your `TELEGRAM_BOT_TOKEN`. Check the Space logs for lines like `📱 Enabling Telegram`.
- **Backup restore failing:** Make sure `HF_USERNAME` and `HF_TOKEN` are correct (the token needs write access to your Dataset).
- **Space keeps sleeping:** Open `/` and use **Keep Space Awake** to create the external monitor.
- **Auth errors behind the proxy:** If you see reverse-proxy auth errors, add the proxy IPs that appear in the logs (look for `remote=x.x.x.x`) to `TRUSTED_PROXIES`.
- **Control UI reports too many failed authentication attempts:** Wait for the retry window to expire, then open the Space in an incognito window (or clear site storage for your Space) and log in again with your `GATEWAY_TOKEN`.
- **WhatsApp lost its session after a restart:** Make sure `HF_USERNAME` and `HF_TOKEN` are configured so the hidden session backup can be restored on boot.
- **UI blocked (CORS):** Set `ALLOWED_ORIGINS=https://your-space-name.hf.space`.
- **Version mismatches:** Pin a specific OpenClaw build with the `OPENCLAW_VERSION` Variable in HF Spaces, or `--build-arg OPENCLAW_VERSION=...` locally.

## 📚 Links

- [OpenClaw Docs](https://docs.openclaw.ai)  
- [OpenClaw GitHub](https://github.com/openclaw/openclaw)  
- [HuggingFace Spaces Docs](https://huggingface.co/docs/hub/spaces)  

## 🤝 Contributing

Contributions are welcome! Please see [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines.

## 📄 License

MIT – see [LICENSE](LICENSE) for details.

*Made with ❤️ by [@somratpro](https://github.com/somratpro) for the OpenClaw community.*