---
title: RustForHuggingClaw
emoji: 🦀
colorFrom: indigo
colorTo: red
sdk: docker
app_port: 7860
datasets:
  - your-username/your-space-name-data
---

# HuggingClaw 🦀

Deploy [OpenClaw](https://openclaw.ai) on HuggingFace Spaces for free: 2 vCPU · 16 GB RAM · 50 GB storage · always online.

AI provider calls are routed through **LiteLLM Proxy**, so you can plug in any provider (OpenAI, Anthropic, Gemini, Azure, OpenRouter, Ollama, local endpoints, …) with just two variables: `LITELLM_MODEL` and `LITELLM_API_KEY`.

---

## Required Secrets

Set these in **Space → Settings → Repository secrets**:

| Secret | Example | Description |
|--------|---------|-------------|
| `HF_TOKEN` | `hf_xxx…` | HuggingFace token with **read + write** access |
| `OPENCLAW_DATASET_REPO` | `your-name/HuggingClaw-data` | Private dataset repo used to persist OpenClaw config |
| `LITELLM_API_KEY` | `sk-xxx…` | Your AI provider API key |
| `LITELLM_MODEL` | `openai/gpt-4o` | Model in LiteLLM format (see table below) |

---

## Optional Secrets

| Secret | Default | Description |
|--------|---------|-------------|
| `LITELLM_API_BASE` | *(provider default)* | Custom API endpoint; required for Azure, OpenRouter, local, etc. |
| `AUTO_CREATE_DATASET` | `false` | Set `true` to auto-create the dataset repo on first boot |
| `SYNC_INTERVAL` | `60` | Seconds between workspace sync pushes to HuggingFace |

---

## Model Examples

| Provider | `LITELLM_MODEL` | `LITELLM_API_BASE` |
|----------|-----------------|--------------------|
| OpenAI | `openai/gpt-4o` | *(leave empty)* |
| Anthropic | `anthropic/claude-3-5-sonnet-20241022` | *(leave empty)* |
| Google Gemini | `gemini/gemini-1.5-pro` | *(leave empty)* |
| Azure OpenAI | `azure/<deployment-name>` | `https://<resource>.openai.azure.com` |
| OpenRouter | `openrouter/openai/gpt-4o` | `https://openrouter.ai/api/v1` |
| Ollama (local) | `ollama/llama3` | `http://localhost:11434` |
| Any OpenAI-compat | `openai/<model-name>` | your custom base URL |

---

## Setup

1. **Duplicate this Space** on the HuggingFace Space page.
2. Create a **private Dataset repo** at [huggingface.co/new-dataset](https://huggingface.co/new-dataset).
3. Set all **Required Secrets** above.
4. Edit this README: update the `datasets:` field to match your dataset repo.
5. The Space will build; OpenClaw will be available on port **7860**.

---

## Architecture

```
HF Space (Docker)
├── openclaw-hf-sync                     ← Rust binary (PID 1 via tini)
│   ├── on boot  : pull ~/.openclaw from HF dataset
│   ├── every 60s: push ~/.openclaw changes to HF dataset
│   └── on exit  : final push
└── start.sh                             ← child process
    ├── LiteLLM Proxy (127.0.0.1:4000)   ← routes to your AI provider
    └── OpenClaw Gateway (:7860)         ← points to LiteLLM Proxy
```
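To see how the two provider secrets drive the proxy, here is a minimal sketch of mapping them onto a LiteLLM `model_list` entry. The `build_litellm_config` helper and the `default` model name are illustrative (this Space's actual startup script may differ), but the field names follow LiteLLM's `config.yaml` schema:

```python
def build_litellm_config(env):
    """Map the Space secrets onto a minimal LiteLLM proxy
    model_list entry (illustrative sketch)."""
    params = {
        "model": env["LITELLM_MODEL"],    # e.g. "openai/gpt-4o"
        "api_key": env["LITELLM_API_KEY"],
    }
    # Only Azure, OpenRouter, Ollama, and other custom endpoints
    # need an explicit base URL; hosted providers use their default.
    if env.get("LITELLM_API_BASE"):
        params["api_base"] = env["LITELLM_API_BASE"]
    return {"model_list": [{"model_name": "default",
                            "litellm_params": params}]}

cfg = build_litellm_config({
    "LITELLM_MODEL": "openrouter/openai/gpt-4o",
    "LITELLM_API_KEY": "sk-xxx",
    "LITELLM_API_BASE": "https://openrouter.ai/api/v1",
})
```

Swapping providers is then just a matter of changing `LITELLM_MODEL` (and, for the self-hosted rows in the table above, `LITELLM_API_BASE`); OpenClaw itself never sees the provider-specific details.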
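The sync loop only needs to push when something under `~/.openclaw` has actually changed between `SYNC_INTERVAL` ticks. A minimal Python sketch of that change-detection idea follows; the real `openclaw-hf-sync` binary is written in Rust and may work differently, and `tree_digest`/`sync_loop` are hypothetical names:

```python
import hashlib
import time
from pathlib import Path

def tree_digest(root: Path) -> str:
    """Hash every file (path + contents) under `root`, so the loop
    can tell whether anything changed since the last push."""
    h = hashlib.sha256()
    for p in sorted(root.rglob("*")):
        if p.is_file():
            h.update(str(p.relative_to(root)).encode())
            h.update(p.read_bytes())
    return h.hexdigest()

def sync_loop(root: Path, push, interval: int = 60) -> None:
    """Every `interval` seconds, push `root` to the dataset repo,
    but only if its digest differs from the last pushed state."""
    last = None
    while True:
        digest = tree_digest(root)
        if digest != last:
            push()  # in practice: an upload of `root` to the HF dataset
            last = digest
        time.sleep(interval)
```

Skipping unchanged ticks keeps the dataset repo's commit history from filling up with empty pushes while still bounding data loss to one interval.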