---
title: RustForHuggingClaw
emoji: 🦞
colorFrom: indigo
colorTo: red
sdk: docker
app_port: 7860
datasets:
- your-username/your-space-name-data
---
# HuggingClaw 🦞
Deploy [OpenClaw](https://openclaw.ai) on HuggingFace Spaces for free:
2 vCPU · 16 GB RAM · 50 GB storage · always online.
AI provider calls are routed through **LiteLLM Proxy**, so you can plug in
any provider (OpenAI, Anthropic, Gemini, Azure, OpenRouter, Ollama, local
endpoints, …) with just two variables.
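Those two variables map directly onto LiteLLM's `config.yaml` schema (a `model_list` entry with `litellm_params`). The helper below is a hypothetical sketch of that mapping, not the Space's actual boot code:

```python
def litellm_config(env: dict) -> dict:
    """Build a minimal LiteLLM proxy config from the Space's secrets."""
    params = {
        "model": env["LITELLM_MODEL"],      # e.g. "openai/gpt-4o"
        "api_key": env["LITELLM_API_KEY"],
    }
    if env.get("LITELLM_API_BASE"):          # optional custom endpoint
        params["api_base"] = env["LITELLM_API_BASE"]
    return {"model_list": [{"model_name": "default", "litellm_params": params}]}

cfg = litellm_config({"LITELLM_MODEL": "openai/gpt-4o",
                      "LITELLM_API_KEY": "sk-xxx"})
print(cfg["model_list"][0]["litellm_params"]["model"])  # openai/gpt-4o
```

The proxy then exposes a single OpenAI-compatible endpoint inside the container, which is what OpenClaw talks to.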
---
## Required Secrets
Set these in **Space → Settings → Repository secrets**:
| Secret | Example | Description |
|--------|---------|-------------|
| `HF_TOKEN` | `hf_xxx…` | HuggingFace token with **read + write** access |
| `OPENCLAW_DATASET_REPO` | `your-name/HuggingClaw-data` | Private dataset repo used to persist OpenClaw config |
| `LITELLM_API_KEY` | `sk-xxx…` | Your AI provider API key |
| `LITELLM_MODEL` | `openai/gpt-4o` | Model in LiteLLM format (see table below) |
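A boot script could fail fast when any of these four secrets is missing; the helper below is a hypothetical sketch, not code shipped in the Space:

```python
import os

REQUIRED_SECRETS = ("HF_TOKEN", "OPENCLAW_DATASET_REPO",
                    "LITELLM_API_KEY", "LITELLM_MODEL")

def missing_secrets(env=None) -> list:
    """Return the names of required secrets that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED_SECRETS if not env.get(name)]

# Example: only HF_TOKEN set, so the other three are reported missing.
print(missing_secrets({"HF_TOKEN": "hf_xxx"}))
# ['OPENCLAW_DATASET_REPO', 'LITELLM_API_KEY', 'LITELLM_MODEL']
```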
---
## Optional Secrets
| Secret | Default | Description |
|--------|---------|-------------|
| `LITELLM_API_BASE` | *(provider default)* | Custom API endpoint; required for Azure, OpenRouter, local, etc. |
| `AUTO_CREATE_DATASET` | `false` | Set `true` to auto-create the dataset repo on first boot |
| `SYNC_INTERVAL` | `60` | Seconds between workspace sync pushes to HuggingFace |
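Applying the defaults from this table at boot could look like the following (names come from the tables above; the helper itself is illustrative):

```python
def optional_settings(env: dict) -> dict:
    """Apply the documented defaults for the optional secrets."""
    return {
        "api_base": env.get("LITELLM_API_BASE") or None,
        "auto_create_dataset": env.get("AUTO_CREATE_DATASET", "false").lower() == "true",
        "sync_interval": int(env.get("SYNC_INTERVAL", "60")),  # seconds
    }

print(optional_settings({}))
# {'api_base': None, 'auto_create_dataset': False, 'sync_interval': 60}
```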
---
## Model Examples
| Provider | `LITELLM_MODEL` | `LITELLM_API_BASE` |
|----------|----------------|-------------------|
| OpenAI | `openai/gpt-4o` | *(leave empty)* |
| Anthropic | `anthropic/claude-3-5-sonnet-20241022` | *(leave empty)* |
| Google Gemini | `gemini/gemini-1.5-pro` | *(leave empty)* |
| Azure OpenAI | `azure/<deployment-name>` | `https://<resource>.openai.azure.com` |
| OpenRouter | `openrouter/openai/gpt-4o` | `https://openrouter.ai/api/v1` |
| Ollama (local) | `ollama/llama3` | `http://localhost:11434` |
| Any OpenAI-compat | `openai/<model-name>` | your custom base URL |
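As the table suggests, whether `LITELLM_API_BASE` is needed can mostly be inferred from the provider prefix of `LITELLM_MODEL` (with the caveat that OpenAI-compatible custom endpoints also need one). A rough, purely illustrative check:

```python
# Providers whose rows in the table above require a custom endpoint.
NEEDS_API_BASE = {"azure", "openrouter", "ollama"}

def needs_api_base(litellm_model: str) -> bool:
    """True if the model's provider prefix requires LITELLM_API_BASE."""
    provider = litellm_model.split("/", 1)[0]
    return provider in NEEDS_API_BASE

print(needs_api_base("azure/my-deployment"))                   # True
print(needs_api_base("anthropic/claude-3-5-sonnet-20241022"))  # False
```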
---
## Setup
1. **Duplicate this Space** on the HuggingFace Space page.
2. Create a **private Dataset repo** at [huggingface.co/new-dataset](https://huggingface.co/new-dataset).
3. Set all **Required Secrets** above.
4. Edit this README: update the `datasets:` field to match your dataset repo.
5. The Space will build; OpenClaw will be available on port **7860**.
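Step 4 only touches the YAML front matter at the top of this file. A throwaway snippet like the one below would do it (simple string handling that assumes a single entry under `datasets:`; a manual edit works just as well):

```python
import re

def set_dataset(readme: str, repo: str) -> str:
    """Replace the first list entry under `datasets:` in the front matter."""
    return re.sub(r"(?m)^(datasets:\n)- .*$", rf"\g<1>- {repo}", readme, count=1)

front_matter = "---\nsdk: docker\ndatasets:\n- your-username/your-space-name-data\n---\n"
print(set_dataset(front_matter, "your-name/HuggingClaw-data"))
```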
---
## Architecture
```
HF Space (Docker)
├── openclaw-hf-sync – Rust binary (pid 1 via tini)
│   ├── on boot  : pull ~/.openclaw from HF dataset
│   ├── every 60s: push ~/.openclaw changes to HF dataset
│   └── on exit  : final push
└── start.sh – child process
    ├── LiteLLM Proxy (127.0.0.1:4000) – routes to your AI provider
    └── OpenClaw Gateway (:7860) – points to LiteLLM Proxy
```
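The sync lifecycle above (pull on boot, periodic push, final push on exit) can be sketched as follows; the real implementation is the Rust binary `openclaw-hf-sync`, and `pull`/`push` are stubbed here:

```python
import time

def pull(log):       log.append("pull")   # boot: fetch ~/.openclaw from the dataset
def push(log):       log.append("push")   # periodic: upload changed files
def final_push(log): log.append("final")  # exit: one last upload

def sync_lifecycle(ticks: int, interval: float = 0.0) -> list:
    """Run the boot -> loop -> exit sequence for a fixed number of ticks."""
    log = []
    pull(log)
    try:
        for _ in range(ticks):       # the real binary loops until shutdown
            time.sleep(interval)     # default SYNC_INTERVAL is 60 s
            push(log)
    finally:
        final_push(log)              # runs even if the loop is interrupted
    return log

print(sync_lifecycle(ticks=2))  # ['pull', 'push', 'push', 'final']
```

The `try/finally` mirrors the "on exit: final push" step, so no workspace changes are lost when the container stops.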