---
title: RustForHuggingClaw
emoji: 🦀
colorFrom: indigo
colorTo: red
sdk: docker
app_port: 7860
datasets:
  - your-username/your-space-name-data
---
# HuggingClaw 🦀

Deploy [OpenClaw](https://openclaw.ai) on HuggingFace Spaces for free:
2 vCPU · 16 GB RAM · 50 GB storage · always online.

AI provider calls are routed through **LiteLLM Proxy**, so you can plug in
any provider (OpenAI, Anthropic, Gemini, Azure, OpenRouter, Ollama, local
endpoints, …) with just two variables.
---
## Required Secrets

Set these in **Space → Settings → Repository secrets**:

| Secret | Example | Description |
|--------|---------|-------------|
| `HF_TOKEN` | `hf_xxx…` | HuggingFace token with **read + write** access |
| `OPENCLAW_DATASET_REPO` | `your-name/HuggingClaw-data` | Private dataset repo used to persist OpenClaw config |
| `LITELLM_API_KEY` | `sk-xxx…` | Your AI provider API key |
| `LITELLM_MODEL` | `openai/gpt-4o` | Model in LiteLLM format (see table below) |
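If you script your deployment, it can help to fail fast when a secret is missing. A minimal sketch; the `missing_secrets` helper is illustrative and not part of the image, but the names match the table above:

```python
import os

# The four secrets this Space requires (see the table above).
REQUIRED = ["HF_TOKEN", "OPENCLAW_DATASET_REPO", "LITELLM_API_KEY", "LITELLM_MODEL"]

def missing_secrets(env=None):
    """Return the names of required secrets that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED if not env.get(name)]
```

Running `missing_secrets()` inside the container before starting the proxy lets you surface a clear error instead of a cryptic provider failure.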
---
## Optional Secrets

| Secret | Default | Description |
|--------|---------|-------------|
| `LITELLM_API_BASE` | *(provider default)* | Custom API endpoint; required for Azure, OpenRouter, local, etc. |
| `AUTO_CREATE_DATASET` | `false` | Set `true` to auto-create the dataset repo on first boot |
| `SYNC_INTERVAL` | `60` | Seconds between workspace sync pushes to HuggingFace |
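How the container parses these values is an implementation detail of the image; a plausible sketch, where the exact truthy spellings accepted for `AUTO_CREATE_DATASET` are an assumption:

```python
import os

def env_flag(name, default=False, env=None):
    """Interpret an env var like AUTO_CREATE_DATASET as a boolean.
    Assumption: '1', 'true', and 'yes' (case-insensitive) count as true."""
    env = os.environ if env is None else env
    raw = env.get(name)
    if raw is None:
        return default
    return raw.strip().lower() in {"1", "true", "yes"}

def sync_interval(env=None):
    """SYNC_INTERVAL in seconds, falling back to the documented default of 60."""
    env = os.environ if env is None else env
    try:
        return int(env.get("SYNC_INTERVAL", "60"))
    except ValueError:
        return 60
```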
---
## Model Examples

| Provider | `LITELLM_MODEL` | `LITELLM_API_BASE` |
|----------|----------------|-------------------|
| OpenAI | `openai/gpt-4o` | *(leave empty)* |
| Anthropic | `anthropic/claude-3-5-sonnet-20241022` | *(leave empty)* |
| Google Gemini | `gemini/gemini-1.5-pro` | *(leave empty)* |
| Azure OpenAI | `azure/<deployment-name>` | `https://<resource>.openai.azure.com` |
| OpenRouter | `openrouter/openai/gpt-4o` | `https://openrouter.ai/api/v1` |
| Ollama (local) | `ollama/llama3` | `http://localhost:11434` |
| Any OpenAI-compat | `openai/<model-name>` | your custom base URL |
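A `LITELLM_MODEL` string is `<provider>/<model>`, and for OpenRouter the model part itself contains a slash. A small illustrative helper (not part of the image) that splits on the first `/` only:

```python
def split_model(model: str):
    """Split a LiteLLM-style model string into (provider, model name).
    Only the first '/' separates the provider, so OpenRouter-style
    strings keep their nested path intact."""
    provider, _, name = model.partition("/")
    return provider, name
```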
---
## Setup

1. **Duplicate this Space** on the HuggingFace Space page.
2. Create a **private Dataset repo** at [huggingface.co/new-dataset](https://huggingface.co/new-dataset).
3. Set all **Required Secrets** above.
4. Edit this README: update the `datasets:` field to match your dataset repo.
5. The Space will build; OpenClaw will be available on port **7860**.
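Step 2 can also be scripted. A hedged sketch: the naming helper below is illustrative, and the `create_repo` call from `huggingface_hub` is shown commented out because it needs a write-scoped token and network access:

```python
def dataset_repo_id(username: str, space_name: str) -> str:
    """Illustrative naming, matching the '<user>/<space>-data' pattern used above."""
    return f"{username}/{space_name}-data"

# Creating the private dataset repo from Python (requires a write token):
#
#   from huggingface_hub import create_repo
#   create_repo(dataset_repo_id("your-username", "your-space-name"),
#               repo_type="dataset", private=True)
```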
---
## Architecture

```
HF Space (Docker)
├── openclaw-hf-sync → Rust binary (pid 1 via tini)
│   ├── on boot  : pull ~/.openclaw from HF dataset
│   ├── every 60s: push ~/.openclaw changes to HF dataset
│   └── on exit  : final push
└── start.sh → child process
    ├── LiteLLM Proxy (127.0.0.1:4000) → routes to your AI provider
    └── OpenClaw Gateway (:7860) → points to LiteLLM Proxy
```
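The real sync component is a Rust binary, but the heart of its loop (detect changed files under `~/.openclaw`, then push) can be sketched in Python. This is illustrative only; the function names are not from the actual binary:

```python
import hashlib
import os

def snapshot(root):
    """Map every file under `root` to a content hash, the state a sync
    loop needs to decide whether anything changed since the last push."""
    state = {}
    for dirpath, _, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                state[os.path.relpath(path, root)] = hashlib.sha256(f.read()).hexdigest()
    return state

def changed_files(before, after):
    """Files added or modified between two snapshots (sorted for stable output)."""
    return sorted(p for p, h in after.items() if before.get(p) != h)
```

Each sync tick would take a fresh `snapshot`, push only `changed_files`, and keep the new snapshot as the baseline for the next tick.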