Somrat Sorkar committed
Commit d85636c · 1 Parent(s): b723ff6

Add .env.example with complete configuration reference

Files changed (1):
.env.example +134 -0
.env.example ADDED
# ════════════════════════════════════════════════════════════════
# 🦞 HuggingClaw - OpenClaw Gateway for HuggingFace Spaces
# ════════════════════════════════════════════════════════════════
# Copy this file to .env and fill in your values.
# For local development: cp .env.example .env && nano .env

# ── REQUIRED: Core Configuration ──
# [REQUIRED] LLM provider API key
# - Anthropic: sk-ant-...
# - OpenAI: sk-...
# - Google: AIzaSy...
LLM_API_KEY=your_api_key_here

# [REQUIRED] LLM model to use (format: provider/model-name)
# Auto-detects provider from prefix (google/, openai/, anthropic/)
#
# Anthropic Claude:
# - anthropic/claude-opus-4-6
# - anthropic/claude-sonnet-4-5
# - anthropic/claude-haiku-4-5
#
# OpenAI:
# - openai/gpt-4-turbo
# - openai/gpt-4
# - openai/gpt-3.5-turbo
#
# Google Gemini:
# - google/gemini-2.5-flash
# - google/gemini-2.0-flash
# - google/gemini-1.5-pro
LLM_MODEL=anthropic/claude-sonnet-4-5
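The provider auto-detection described above amounts to splitting the value at the first `/`. A minimal shell sketch of that split, using POSIX parameter expansion (the helper names `provider_of` and `model_of` are illustrative, not part of OpenClaw):

```shell
# Split "provider/model-name" at the first slash.
provider_of() { printf '%s\n' "${1%%/*}"; }   # text before the first "/"
model_of()    { printf '%s\n' "${1#*/}"; }    # text after the first "/"

provider_of "anthropic/claude-sonnet-4-5"   # anthropic
model_of    "anthropic/claude-sonnet-4-5"   # claude-sonnet-4-5
```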

# [REQUIRED] Gateway authentication token
# Generate: openssl rand -hex 32
GATEWAY_TOKEN=your_gateway_token_here
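The `openssl rand -hex 32` step above can be scripted so the token goes straight into `.env`; a sketch, assuming `openssl` is installed:

```shell
# Generate a 64-character hex token and append it to .env directly.
token="$(openssl rand -hex 32)"
printf 'GATEWAY_TOKEN=%s\n' "$token" >> .env
echo "token length: ${#token}"   # 64
```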

# ── OPTIONAL: Telegram Integration ──
# Enable Telegram bot integration
# Get bot token from: https://t.me/BotFather
TELEGRAM_BOT_TOKEN=your_bot_token_here

# Single user ID for DM access
# Get your ID from: https://t.me/userinfobot
TELEGRAM_USER_ID=123456789

# Multiple user IDs (comma-separated for team access)
# TELEGRAM_USER_IDS=123456789,987654321,555555555
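For illustration, a comma-separated `TELEGRAM_USER_IDS` value like the one above can be split in plain POSIX shell with the `IFS` trick (a sketch, not gateway code):

```shell
# Split the comma-separated ID list into positional parameters.
TELEGRAM_USER_IDS="123456789,987654321,555555555"
old_ifs="$IFS"
IFS=','
set -- $TELEGRAM_USER_IDS   # unquoted on purpose: field splitting applies
IFS="$old_ifs"
echo "allowed users: $#"    # 3
echo "first: $1"            # 123456789
```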

# ── OPTIONAL: Workspace Backup to HF Dataset ──
# Enable automatic workspace backup & restore
# Your HuggingFace username
HF_USERNAME=your_hf_username

# HuggingFace API token (with write access)
# Get from: https://huggingface.co/settings/tokens
HF_TOKEN=hf_your_token_here

# Name of the backup dataset (auto-created if missing)
# Default: huggingclaw-backup
BACKUP_DATASET_NAME=huggingclaw-backup

# Git commit email for workspace syncs
# Default: openclaw@example.com
WORKSPACE_GIT_USER=openclaw@example.com

# Git commit name for workspace syncs
# Default: OpenClaw Bot
WORKSPACE_GIT_NAME=OpenClaw Bot

# ── OPTIONAL: Background Services Configuration ──
# Keep-alive ping interval in seconds (prevents HF Spaces sleep)
# Default: 300 (5 minutes)
# Set to 0 to disable (not recommended on HF Spaces)
KEEP_ALIVE_INTERVAL=300

# Workspace auto-sync interval in seconds
# Default: 600 (10 minutes)
SYNC_INTERVAL=600

# ── OPTIONAL: Advanced Configuration ──
# Pin OpenClaw version (default: latest)
# Example: OPENCLAW_VERSION=2026.3.24
OPENCLAW_VERSION=latest

# Health endpoint port (default: 7861)
HEALTH_PORT=7861

# ════════════════════════════════════════════════════════════════
# QUICK START CHECKLIST
# ════════════════════════════════════════════════════════════════
#
# ✅ Minimum setup (3 secrets):
#    1. LLM_API_KEY   - Get from your LLM provider
#    2. LLM_MODEL     - Choose a model above
#    3. GATEWAY_TOKEN - Run: openssl rand -hex 32
#
# ✅ Add Telegram (2 more secrets):
#    4. TELEGRAM_BOT_TOKEN - From @BotFather
#    5. TELEGRAM_USER_ID   - From @userinfobot
#
# ✅ Enable Backup (2 more secrets):
#    6. HF_USERNAME - Your HF account name
#    7. HF_TOKEN    - From HF Settings → Tokens
#
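To catch a missing secret before launch, the three required keys from the checklist can be verified with a small pre-flight function (a sketch; `check_required` is not part of HuggingClaw):

```shell
# Report "ok" when every required setting is non-empty,
# otherwise list the missing variable names.
check_required() {
  missing=""
  for var in LLM_API_KEY LLM_MODEL GATEWAY_TOKEN; do
    eval "val=\${$var:-}"                   # indirect lookup by name
    [ -n "$val" ] || missing="$missing $var"
  done
  if [ -z "$missing" ]; then echo "ok"; else echo "missing:$missing"; fi
}

# Example run with placeholder values set:
LLM_API_KEY="sk-test"
LLM_MODEL="anthropic/claude-sonnet-4-5"
GATEWAY_TOKEN="abc123"
check_required   # ok
```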
# ════════════════════════════════════════════════════════════════
# DEPLOYMENT OPTIONS
# ════════════════════════════════════════════════════════════════
#
# 📦 HuggingFace Spaces (Recommended):
#    - Click "Duplicate this Space"
#    - Go to Settings → Secrets
#    - Add LLM_API_KEY, LLM_MODEL, GATEWAY_TOKEN
#    - Deploy (automatic!)
#
# 🐳 Docker Local:
#    docker build -t huggingclaw .
#    docker run -p 7860:7860 --env-file .env huggingclaw
#
# 💻 Direct (without Docker):
#    npm install -g openclaw@latest
#    export $(grep -v '^#' .env | xargs)
#    bash start.sh
#
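A caveat on exporting via `export $(... | xargs)`: it breaks on values containing spaces, such as `WORKSPACE_GIT_NAME=OpenClaw Bot`. A more robust sketch lets the shell parse the file itself, assuming space-containing values are quoted in your .env (demonstrated below with a hypothetical `demo.env`):

```shell
# Write a demo .env with a comment and a quoted, space-containing value.
cat > demo.env <<'EOF'
# comment lines are handled natively by the shell
LLM_MODEL=anthropic/claude-sonnet-4-5
WORKSPACE_GIT_NAME="OpenClaw Bot"
EOF

set -a           # auto-export every variable assigned while sourcing
. ./demo.env
set +a

echo "$WORKSPACE_GIT_NAME"   # OpenClaw Bot
```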
124
+ # ════════════════════════════════════════════════════════════════
125
+ # VERIFY YOUR SETUP
126
+ # ════════════════════════════════════════════════════════════════
127
+ #
128
+ # After deployment, check:
129
+ # 1. Logs for "🦞 HuggingClaw Gateway" banner
130
+ # 2. Health endpoint: curl https://YOUR-SPACE-URL.hf.space/health
131
+ # 3. Control UI: https://YOUR-SPACE-URL.hf.space
132
+ # 4. (If Telegram) DM your bot to test
133
+ #
134
+ # ════════════════════════════════════════════════════════════════