# HuggingClaw 🦀

Deploy [OpenClaw](https://openclaw.ai) on HuggingFace Spaces for free —
2 vCPU · 16 GB RAM · 50 GB storage · always online.

AI provider calls are routed through **LiteLLM Proxy** so you can plug in
any provider (OpenAI, Anthropic, Gemini, Azure, OpenRouter, Ollama, local
endpoints …) with just two variables.

---
## Required Secrets

Set these in **Space → Settings → Repository secrets**:

| Secret | Example | Description |
|--------|---------|-------------|
| `HF_TOKEN` | `hf_xxx…` | HuggingFace token with **read + write** access |
| `OPENCLAW_DATASET_REPO` | `your-name/HuggingClaw-data` | Private dataset repo used to persist OpenClaw config |
| `LITELLM_API_KEY` | `sk-xxx…` | Your AI provider API key |
| `LITELLM_MODEL` | `openai/gpt-4o` | Model in LiteLLM format (see table below) |

---
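Before setting the secrets, it can help to sanity-check locally that all four are present in your environment. A minimal sketch (the `missing_secrets` helper is illustrative, not part of the Space):

```python
import os

# The four secrets the Space expects (names from the table above).
REQUIRED = ["HF_TOKEN", "OPENCLAW_DATASET_REPO", "LITELLM_API_KEY", "LITELLM_MODEL"]

def missing_secrets(env):
    """Return the required secret names that are absent or empty in `env`."""
    return [name for name in REQUIRED if not env.get(name)]

missing = missing_secrets(os.environ)
if missing:
    print("Missing secrets:", ", ".join(missing))
else:
    print("All required secrets are set.")
```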
## Optional Secrets

| Secret | Default | Description |
|--------|---------|-------------|
| `LITELLM_API_BASE` | *(provider default)* | Custom API endpoint — required for Azure, OpenRouter, local, etc. |
| `AUTO_CREATE_DATASET` | `false` | Set `true` to auto-create the dataset repo on first boot |
| `SYNC_INTERVAL` | `60` | Seconds between workspace sync pushes to HuggingFace |

---
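The defaults above imply env parsing with a fallback, e.g. for `SYNC_INTERVAL`. A sketch of that logic in Python (the actual parsing lives in the Space's own code, so treat the helper name and clamping behavior as assumptions):

```python
import os

def sync_interval(env=None, default=60):
    """Seconds between sync pushes; fall back to `default` on a missing or
    non-numeric value, and never go below 1 second."""
    env = os.environ if env is None else env
    try:
        return max(1, int(env.get("SYNC_INTERVAL", "")))
    except ValueError:
        return default
```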
## Model Examples

| Provider | `LITELLM_MODEL` | `LITELLM_API_BASE` |
|----------|-----------------|--------------------|
| OpenAI | `openai/gpt-4o` | *(leave empty)* |
| Anthropic | `anthropic/claude-3-5-sonnet-20241022` | *(leave empty)* |
| Google Gemini | `gemini/gemini-1.5-pro` | *(leave empty)* |
| Azure OpenAI | `azure/<deployment-name>` | `https://<resource>.openai.azure.com` |
| OpenRouter | `openrouter/openai/gpt-4o` | `https://openrouter.ai/api/v1` |
| Ollama (local) | `ollama/llama3` | `http://localhost:11434` |
| Any OpenAI-compat | `openai/<model-name>` | your custom base URL |

---
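These two variables correspond to a standard LiteLLM Proxy `config.yaml`. A sketch of the equivalent static config, shown here only to illustrate the mapping (the exact file this Space generates is an assumption):

```yaml
# Roughly what the generated LiteLLM Proxy config amounts to.
model_list:
  - model_name: default                 # the alias OpenClaw talks to
    litellm_params:
      model: openai/gpt-4o              # ← LITELLM_MODEL
      api_key: os.environ/LITELLM_API_KEY
      # api_base: https://openrouter.ai/api/v1   # ← LITELLM_API_BASE, if set
```

The `os.environ/…` form is LiteLLM's documented way of reading a value from an environment variable instead of hard-coding it.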
## Setup

1. **Duplicate this Space** on the HuggingFace Space page.
2. Create a **private Dataset repo** at [huggingface.co/new-dataset](https://huggingface.co/new-dataset).
3. Set all **Required Secrets** above.
4. Edit this README: update the `datasets:` field to match your dataset repo.
5. The Space will build; OpenClaw will be available on port **7860**.

---
## Architecture

```
HF Space (Docker)
├── openclaw-hf-sync            ← Rust binary (pid 1 via tini)
│   ├── on boot  : pull ~/.openclaw from HF dataset
│   ├── every 60s: push ~/.openclaw changes to HF dataset
│   └── on exit  : final push
└── start.sh                    ← child process
    ├── LiteLLM Proxy (127.0.0.1:4000)  ← routes to your AI provider
    └── OpenClaw Gateway (:7860)        ← points to LiteLLM Proxy
```
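The periodic push only needs to upload files that actually changed since the last sync. A minimal sketch of that change-detection idea in Python (the real sync is the Rust binary above; `snapshot` and `changed_paths` are illustrative names, not its API):

```python
import hashlib
from pathlib import Path

def snapshot(root):
    """Map every file under `root` to a SHA-256 hash of its contents."""
    root = Path(root)
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in root.rglob("*")
        if p.is_file()
    }

def changed_paths(prev, curr):
    """Paths that are new, or whose hash differs from the previous snapshot."""
    return sorted(p for p, h in curr.items() if prev.get(p) != h)
```

Every `SYNC_INTERVAL` seconds the sync process would take a fresh snapshot of `~/.openclaw`, push only `changed_paths`, and keep the snapshot for the next round.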