feat: add coding-agent OpenClaw extension with 8 HF Space tools
New extension at extensions/coding-agent/ that registers tools directly
into OpenClaw's agent runtime, giving Adam & Eve real coding capabilities:
Tools registered:
- hf_read_file: Read files from Space/Dataset
- hf_write_file: Write files with auto Python syntax validation
- hf_delete_file: Delete files from repos
- hf_list_files: List all files in a repo
- hf_space_status: Check Space health/stage
- hf_restart_space: Restart the target Space
- hf_search_code: Grep-like regex search across code files
- python_syntax_check: Validate Python syntax before committing
Key features:
- Python .py files are syntax-checked BEFORE writing (prevents the
blind-write-then-wait-for-build-failure loop)
- Configured via CODING_AGENT_TARGET_SPACE/DATASET env vars
- Follows A2A gateway extension pattern (registerTool API)
Also adds:
- workspace-templates/ with IDENTITY.md and SOUL.md for coding agent
behavior (deployed on first boot if workspace files are default)
- Dockerfile updated to COPY extension and templates
- sync_hf.py updated to configure plugin and deploy templates
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
- Dockerfile +4 -0
- extensions/coding-agent/index.ts +469 -0
- extensions/coding-agent/openclaw.plugin.json +32 -0
- extensions/coding-agent/package.json +11 -0
- scripts/sync_hf.py +32 -1
- workspace-templates/IDENTITY.md +12 -0
- workspace-templates/SOUL.md +42 -0
Dockerfile +4 -0

```diff
@@ -35,6 +35,9 @@ RUN echo "[build] Installing A2A gateway..." && START=$(date +%s) \
   && npm install --production \
   && echo "[build] A2A gateway: $(($(date +%s) - START))s"
 
+# ── Coding Agent Extension (local - no git clone needed) ─────────────────────
+COPY --chown=node:node extensions/coding-agent /app/openclaw/extensions/coding-agent
+
 # ── Prepare runtime dirs ─────────────────────────────────────────────────────
 RUN mkdir -p /app/openclaw/empty-bundled-plugins \
   && node -e "try{console.log(require('/app/openclaw/package.json').version)}catch(e){console.log('unknown')}" > /app/openclaw/.version \
@@ -43,6 +46,7 @@ RUN mkdir -p /app/openclaw/empty-bundled-plugins \
 # ── Scripts + Config + Frontend ──────────────────────────────────────────────
 COPY --chown=node:node scripts /home/node/scripts
 COPY --chown=node:node frontend /home/node/frontend
+COPY --chown=node:node workspace-templates /home/node/workspace-templates
 COPY --chown=node:node openclaw.json /home/node/scripts/openclaw.json.default
 RUN chmod +x /home/node/scripts/entrypoint.sh /home/node/scripts/sync_hf.py \
   && VERSION_TS=$(date +%s) \
```
extensions/coding-agent/index.ts +469 -0 (new file)

```ts
/**
 * Coding Agent Extension for OpenClaw
 *
 * Registers tools that let the OpenClaw agent manage HuggingFace Spaces:
 * - Read/write/delete files on Space and Dataset repos
 * - Check Space health and restart
 * - Search code patterns (grep-like)
 * - Validate Python syntax before writing
 * - List files in repos
 */

import { execSync } from "node:child_process";
import { writeFileSync, unlinkSync } from "node:fs";

// ── Types ────────────────────────────────────────────────────────────────────

interface PluginApi {
  pluginConfig: Record<string, unknown>;
  logger: { info: (...a: unknown[]) => void; warn: (...a: unknown[]) => void; error: (...a: unknown[]) => void };
  registerTool?: (def: ToolDef) => void;
  resolvePath?: (p: string) => string;
}

interface ToolDef {
  name: string;
  description: string;
  label?: string;
  parameters: Record<string, unknown>;
  execute: (toolCallId: string, params: Record<string, unknown>) => Promise<ToolResult>;
}

interface ToolResult {
  content: Array<{ type: "text"; text: string }>;
  details?: Record<string, unknown>;
}

// ── HuggingFace API helpers ──────────────────────────────────────────────────

const HF_API = "https://huggingface.co/api";

async function hfFetch(path: string, token: string, options: RequestInit = {}): Promise<Response> {
  const url = path.startsWith("http") ? path : `${HF_API}${path}`;
  return fetch(url, {
    ...options,
    headers: {
      Authorization: `Bearer ${token}`,
      ...((options.headers as Record<string, string>) || {}),
    },
  });
}

async function hfReadFile(repoType: string, repoId: string, filePath: string, token: string): Promise<string> {
  const url = `https://huggingface.co/${repoType}s/${repoId}/resolve/main/${filePath}`;
  const resp = await hfFetch(url, token);
  if (!resp.ok) throw new Error(`${resp.status} ${resp.statusText}: ${filePath}`);
  return resp.text();
}

async function hfWriteFile(repoType: string, repoId: string, filePath: string, content: string, token: string): Promise<string> {
  const url = `${HF_API}/${repoType}s/${repoId}/upload/main/${filePath}`;
  const blob = new Blob([content], { type: "text/plain" });
  const form = new FormData();
  form.append("file", blob, filePath);
  const resp = await hfFetch(url, token, { method: "POST", body: form });
  if (!resp.ok) {
    const text = await resp.text().catch(() => "");
    throw new Error(`${resp.status}: ${text}`);
  }
  return `Wrote ${content.length} bytes to ${repoType}:${filePath}`;
}

async function hfDeleteFile(repoType: string, repoId: string, filePath: string, token: string): Promise<string> {
  // Use the HF Hub commit API to delete a file
  const url = `${HF_API}/${repoType}s/${repoId}/commit/main`;
  const body = {
    summary: `Delete ${filePath}`,
    operations: [{ op: "delete", path: filePath }],
  };
  const resp = await hfFetch(url, token, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  if (!resp.ok) {
    const text = await resp.text().catch(() => "");
    throw new Error(`${resp.status}: ${text}`);
  }
  return `Deleted ${repoType}:${filePath}`;
}

async function hfListFiles(repoType: string, repoId: string, token: string): Promise<string[]> {
  const url = `${HF_API}/${repoType}s/${repoId}`;
  const resp = await hfFetch(url, token);
  if (!resp.ok) throw new Error(`${resp.status}: listing failed`);
  const data = await resp.json() as Record<string, unknown>;
  const siblings = (data.siblings || []) as Array<{ rfilename: string }>;
  return siblings.map((s) => s.rfilename);
}

async function hfSpaceInfo(spaceId: string, token: string): Promise<Record<string, unknown>> {
  const resp = await hfFetch(`/spaces/${spaceId}`, token);
  if (!resp.ok) throw new Error(`${resp.status}: space info failed`);
  return resp.json() as Promise<Record<string, unknown>>;
}

async function hfRestartSpace(spaceId: string, token: string): Promise<string> {
  const resp = await hfFetch(`/spaces/${spaceId}/restart`, token, { method: "POST" });
  if (!resp.ok) {
    const text = await resp.text().catch(() => "");
    throw new Error(`${resp.status}: ${text}`);
  }
  return `Space ${spaceId} is restarting`;
}

// ── Python syntax checker ────────────────────────────────────────────────────

function checkPythonSyntax(code: string): { valid: boolean; error?: string } {
  // Write the code to a temp file and parse that; interpolating the source
  // into a shell string breaks as soon as it contains quotes or backslashes.
  const tmpFile = `/tmp/_syntax_check_${Date.now()}.py`;
  try {
    writeFileSync(tmpFile, code);
    execSync(`python3 -c "import ast; ast.parse(open('${tmpFile}').read())"`, {
      timeout: 5000,
      stdio: ["pipe", "pipe", "pipe"],
    });
    return { valid: true };
  } catch (e: unknown) {
    const msg = e instanceof Error ? e.message : String(e);
    // Extract the actual syntax error line
    const match = msg.match(/SyntaxError:.*|File.*line \d+/g);
    return { valid: false, error: match ? match.join("\n") : msg.slice(0, 500) };
  } finally {
    try { unlinkSync(tmpFile); } catch { /* already removed */ }
  }
}

// ── Code search helper ───────────────────────────────────────────────────────

async function searchInRepo(repoType: string, repoId: string, pattern: string, token: string, fileGlob?: string): Promise<string> {
  const files = await hfListFiles(repoType, repoId, token);
  const codeFiles = files.filter((f) => {
    const ext = f.split(".").pop()?.toLowerCase() || "";
    const isCode = ["py", "js", "ts", "json", "yaml", "yml", "md", "txt", "sh", "css", "html", "toml", "cfg", "ini"].includes(ext);
    if (fileGlob) {
      // Simple glob: *.py matches .py extension
      const globExt = fileGlob.replace("*.", "");
      return f.endsWith(`.${globExt}`);
    }
    return isCode;
  });

  const results: string[] = [];
  const regex = new RegExp(pattern, "gi");

  for (const file of codeFiles.slice(0, 30)) {
    try {
      const content = await hfReadFile(repoType, repoId, file, token);
      const lines = content.split("\n");
      for (let i = 0; i < lines.length; i++) {
        if (regex.test(lines[i])) {
          results.push(`${file}:${i + 1}: ${lines[i].trim()}`);
        }
        regex.lastIndex = 0; // Reset regex state
      }
    } catch {
      // Skip unreadable files (binary, too large, etc.)
    }
    if (results.length >= 50) break;
  }

  return results.length > 0
    ? `Found ${results.length} match(es):\n${results.join("\n")}`
    : `No matches for "${pattern}" in ${codeFiles.length} files`;
}

// ── Helpers ──────────────────────────────────────────────────────────────────

function text(t: string): ToolResult {
  return { content: [{ type: "text", text: t }] };
}

function asStr(v: unknown, fallback = ""): string {
  return typeof v === "string" ? v : fallback;
}

// ── Plugin ───────────────────────────────────────────────────────────────────

const plugin = {
  id: "coding-agent",
  name: "Coding Agent",
  description: "HuggingFace Space coding tools with syntax validation",

  register(api: PluginApi) {
    const cfg = (api.pluginConfig as Record<string, unknown>) || {};
    const targetSpace = asStr(cfg.targetSpace) || process.env.CODING_AGENT_TARGET_SPACE || "";
    const targetDataset = asStr(cfg.targetDataset) || process.env.CODING_AGENT_TARGET_DATASET || "";
    const hfToken = asStr(cfg.hfToken) || process.env.HF_TOKEN || "";

    if (!hfToken) {
      api.logger.warn("coding-agent: No HF token configured - tools will fail");
    }
    api.logger.info(`coding-agent: targetSpace=${targetSpace}, targetDataset=${targetDataset}`);

    if (!api.registerTool) {
      api.logger.warn("coding-agent: registerTool unavailable - no tools registered");
      return;
    }

    // ── Tool: hf_read_file ──────────────────────────────────────────────────
    api.registerTool({
      name: "hf_read_file",
      label: "Read File from HF Repo",
      description:
        "Read a file from the target HuggingFace Space or Dataset. " +
        "Use repo='space' for code files, repo='dataset' for data/memory files.",
      parameters: {
        type: "object",
        required: ["repo", "path"],
        properties: {
          repo: { type: "string", enum: ["space", "dataset"], description: "Which repo: 'space' (code) or 'dataset' (data)" },
          path: { type: "string", description: "File path within the repo (e.g. app.py, scripts/entrypoint.sh)" },
        },
      },
      async execute(_id, params) {
        const repo = asStr(params.repo);
        const path = asStr(params.path);
        const repoId = repo === "dataset" ? targetDataset : targetSpace;
        if (!repoId) return text(`Error: no target ${repo} configured`);
        try {
          const content = await hfReadFile(repo === "dataset" ? "dataset" : "space", repoId, path, hfToken);
          return text(`=== ${repo}:${path} ===\n${content}`);
        } catch (e: unknown) {
          return text(`Error reading ${repo}:${path}: ${e instanceof Error ? e.message : e}`);
        }
      },
    });

    // ── Tool: hf_write_file ─────────────────────────────────────────────────
    api.registerTool({
      name: "hf_write_file",
      label: "Write File to HF Repo",
      description:
        "Write/update a file in the target HuggingFace Space or Dataset. " +
        "For .py files, syntax is automatically validated before writing. " +
        "Writing to the Space triggers a rebuild. Use repo='space' for code, repo='dataset' for data.",
      parameters: {
        type: "object",
        required: ["repo", "path", "content"],
        properties: {
          repo: { type: "string", enum: ["space", "dataset"], description: "Which repo: 'space' or 'dataset'" },
          path: { type: "string", description: "File path (e.g. app.py)" },
          content: { type: "string", description: "Full file content to write" },
          skip_syntax_check: { type: "boolean", description: "Skip Python syntax validation (default: false)" },
        },
      },
      async execute(_id, params) {
        const repo = asStr(params.repo);
        const path = asStr(params.path);
        const content = asStr(params.content);
        const skipCheck = params.skip_syntax_check === true;
        const repoType = repo === "dataset" ? "dataset" : "space";
        const repoId = repo === "dataset" ? targetDataset : targetSpace;
        if (!repoId) return text(`Error: no target ${repo} configured`);

        // Auto syntax check for Python files
        if (path.endsWith(".py") && !skipCheck) {
          const check = checkPythonSyntax(content);
          if (!check.valid) {
            return text(
              `SYNTAX ERROR - file NOT written.\n` +
              `File: ${path}\n` +
              `Error: ${check.error}\n\n` +
              `Fix the syntax error and try again. Use skip_syntax_check=true to force write.`
            );
          }
        }

        try {
          const result = await hfWriteFile(repoType, repoId, path, content, hfToken);
          const note = repo === "space" ? " (triggers Space rebuild)" : "";
          return text(`${result}${note}`);
        } catch (e: unknown) {
          return text(`Error writing ${repo}:${path}: ${e instanceof Error ? e.message : e}`);
        }
      },
    });

    // ── Tool: hf_delete_file ────────────────────────────────────────────────
    api.registerTool({
      name: "hf_delete_file",
      label: "Delete File from HF Repo",
      description: "Delete a file from the target Space or Dataset repo.",
      parameters: {
        type: "object",
        required: ["repo", "path"],
        properties: {
          repo: { type: "string", enum: ["space", "dataset"], description: "Which repo" },
          path: { type: "string", description: "File path to delete" },
        },
      },
      async execute(_id, params) {
        const repo = asStr(params.repo);
        const path = asStr(params.path);
        const repoType = repo === "dataset" ? "dataset" : "space";
        const repoId = repo === "dataset" ? targetDataset : targetSpace;
        if (!repoId) return text(`Error: no target ${repo} configured`);
        try {
          const result = await hfDeleteFile(repoType, repoId, path, hfToken);
          return text(result);
        } catch (e: unknown) {
          return text(`Error deleting ${repo}:${path}: ${e instanceof Error ? e.message : e}`);
        }
      },
    });

    // ── Tool: hf_list_files ─────────────────────────────────────────────────
    api.registerTool({
      name: "hf_list_files",
      label: "List Files in HF Repo",
      description: "List all files in the target Space or Dataset repo.",
      parameters: {
        type: "object",
        required: ["repo"],
        properties: {
          repo: { type: "string", enum: ["space", "dataset"], description: "Which repo to list" },
        },
      },
      async execute(_id, params) {
        const repo = asStr(params.repo);
        const repoType = repo === "dataset" ? "dataset" : "space";
        const repoId = repo === "dataset" ? targetDataset : targetSpace;
        if (!repoId) return text(`Error: no target ${repo} configured`);
        try {
          const files = await hfListFiles(repoType, repoId, hfToken);
          return text(`Files in ${repoId} (${files.length}):\n${files.map((f) => `  ${f}`).join("\n")}`);
        } catch (e: unknown) {
          return text(`Error listing ${repo}: ${e instanceof Error ? e.message : e}`);
        }
      },
    });

    // ── Tool: hf_space_status ───────────────────────────────────────────────
    api.registerTool({
      name: "hf_space_status",
      label: "Check Space Health",
      description:
        "Check the current status of the target HuggingFace Space. " +
        "Returns: stage (BUILDING, APP_STARTING, RUNNING, RUNTIME_ERROR, BUILD_ERROR, etc.)",
      parameters: {
        type: "object",
        properties: {},
      },
      async execute() {
        if (!targetSpace) return text("Error: no target space configured");
        try {
          const info = await hfSpaceInfo(targetSpace, hfToken);
          const runtime = info.runtime as Record<string, unknown> | undefined;
          const stage = runtime?.stage || "unknown";
          const hardware = runtime?.hardware || "unknown";

          // Also try hitting the API endpoint
          let apiStatus = "not checked";
          try {
            const spaceUrl = `https://${targetSpace.replace("/", "-").toLowerCase()}.hf.space`;
            const resp = await fetch(`${spaceUrl}/api/state`, { signal: AbortSignal.timeout(8000) });
            apiStatus = resp.ok ? `OK (${resp.status})` : `error (${resp.status})`;
          } catch {
            apiStatus = "unreachable";
          }

          return text(
            `Space: ${targetSpace}\n` +
            `Stage: ${stage}\n` +
            `Hardware: ${hardware}\n` +
            `API: ${apiStatus}`
          );
        } catch (e: unknown) {
          return text(`Error checking space: ${e instanceof Error ? e.message : e}`);
        }
      },
    });

    // ── Tool: hf_restart_space ──────────────────────────────────────────────
    api.registerTool({
      name: "hf_restart_space",
      label: "Restart Space",
      description: "Restart the target HuggingFace Space. Use when the Space is stuck or after deploying fixes.",
      parameters: {
        type: "object",
        properties: {},
      },
      async execute() {
        if (!targetSpace) return text("Error: no target space configured");
        try {
          const result = await hfRestartSpace(targetSpace, hfToken);
          return text(result);
        } catch (e: unknown) {
          return text(`Error restarting space: ${e instanceof Error ? e.message : e}`);
        }
      },
    });

    // ── Tool: hf_search_code ────────────────────────────────────────────────
    api.registerTool({
      name: "hf_search_code",
      label: "Search Code in Repo",
      description:
        "Search for a pattern (regex) across all code files in the Space or Dataset. " +
        "Like grep - returns matching lines with file:line references.",
      parameters: {
        type: "object",
        required: ["repo", "pattern"],
        properties: {
          repo: { type: "string", enum: ["space", "dataset"], description: "Which repo to search" },
          pattern: { type: "string", description: "Regex pattern to search for (e.g. 'import gradio', 'port.*7860')" },
          file_glob: { type: "string", description: "Optional file filter (e.g. '*.py' to search only Python files)" },
        },
      },
      async execute(_id, params) {
        const repo = asStr(params.repo);
        const pattern = asStr(params.pattern);
        const fileGlob = asStr(params.file_glob);
        const repoType = repo === "dataset" ? "dataset" : "space";
        const repoId = repo === "dataset" ? targetDataset : targetSpace;
        if (!repoId) return text(`Error: no target ${repo} configured`);
        try {
          const result = await searchInRepo(repoType, repoId, pattern, hfToken, fileGlob || undefined);
          return text(result);
        } catch (e: unknown) {
          return text(`Error searching ${repo}: ${e instanceof Error ? e.message : e}`);
        }
      },
    });

    // ── Tool: python_syntax_check ───────────────────────────────────────────
    api.registerTool({
      name: "python_syntax_check",
      label: "Check Python Syntax",
      description:
        "Validate Python code syntax without writing it. " +
        "Use this to verify code before committing changes. Returns OK or the specific syntax error.",
      parameters: {
        type: "object",
        required: ["code"],
        properties: {
          code: { type: "string", description: "Python code to validate" },
        },
      },
      async execute(_id, params) {
        const code = asStr(params.code);
        const result = checkPythonSyntax(code);
        if (result.valid) {
          return text("Syntax OK - code is valid Python");
        }
        return text(`SYNTAX ERROR:\n${result.error}`);
      },
    });

    api.logger.info(`coding-agent: Registered 8 tools for ${targetSpace || "(no target space)"}`);
  },
};

export default plugin;
```
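The hf_search_code tool's matching loop (fetch each file, scan line by line, emit file:line references, cap the result count) can be isolated in a few lines of Python. This is a sketch over in-memory files with a hypothetical `grep_lines` helper; the real tool fetches file contents through the HF API first:

```python
import re


def grep_lines(files: dict[str, str], pattern: str, max_results: int = 50) -> list[str]:
    """Return 'file:line: text' matches for a case-insensitive regex."""
    rx = re.compile(pattern, re.IGNORECASE)
    results: list[str] = []
    for name, content in files.items():
        for i, line in enumerate(content.splitlines(), start=1):
            if rx.search(line):
                results.append(f"{name}:{i}: {line.strip()}")
                if len(results) >= max_results:  # cap output, like the tool's 50-match limit
                    return results
    return results
```

Using `re.search` per line sidesteps the stateful-regex pitfall the TypeScript version has to handle explicitly (resetting `lastIndex` on a global `RegExp`).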
extensions/coding-agent/openclaw.plugin.json +32 -0 (new file)

```json
{
  "id": "coding-agent",
  "name": "Coding Agent",
  "description": "HuggingFace Space coding tools - read, write, edit, search, and manage code on HF Spaces with syntax validation",
  "version": "1.0.0",
  "defaultConfig": {
    "targetSpace": "",
    "targetDataset": "",
    "hfToken": ""
  },
  "configSchema": {
    "type": "object",
    "additionalProperties": false,
    "properties": {
      "targetSpace": {
        "type": "string",
        "description": "HuggingFace Space repo ID to manage (e.g. tao-shen/HuggingClaw-Cain)"
      },
      "targetDataset": {
        "type": "string",
        "description": "HuggingFace Dataset repo ID for persistent data (e.g. tao-shen/HuggingClaw-Cain-data)"
      },
      "hfToken": {
        "type": "string",
        "description": "HuggingFace API token (or use HF_TOKEN env var)"
      }
    }
  },
  "uiHints": {
    "hfToken": { "sensitive": true }
  }
}
```
extensions/coding-agent/package.json +11 -0 (new file)

```json
{
  "name": "openclaw-coding-agent",
  "version": "1.0.0",
  "type": "module",
  "description": "OpenClaw coding agent extension - HF Space management tools with syntax validation",
  "main": "index.ts",
  "openclaw": {
    "extensions": ["./index.ts"]
  },
  "dependencies": {}
}
```
@@ -507,10 +507,24 @@ class OpenClawFullSync:
|
|
| 507 |
|
| 508 |
# Plugin whitelist
|
| 509 |
data.setdefault("plugins", {}).setdefault("entries", {})
|
| 510 |
-
scripts/sync_hf.py:

```diff
     # Plugin whitelist
     data.setdefault("plugins", {}).setdefault("entries", {})
-    plugin_allow = ["telegram", "whatsapp"]
+    plugin_allow = ["telegram", "whatsapp", "coding-agent"]
     if A2A_PEERS:
         plugin_allow.append("a2a-gateway")
     data["plugins"]["allow"] = plugin_allow
+
+    # ── Coding Agent Plugin Configuration ──
+    CODING_TARGET_SPACE = os.environ.get("CODING_AGENT_TARGET_SPACE", "")
+    CODING_TARGET_DATASET = os.environ.get("CODING_AGENT_TARGET_DATASET", "")
+    if CODING_TARGET_SPACE:
+        data["plugins"]["entries"]["coding-agent"] = {
+            "enabled": True,
+            "config": {
+                "targetSpace": CODING_TARGET_SPACE,
+                "targetDataset": CODING_TARGET_DATASET,
+                "hfToken": HF_TOKEN or "",
+            }
+        }
+        print(f"[SYNC] Coding agent configured: space={CODING_TARGET_SPACE}, dataset={CODING_TARGET_DATASET}")
     if "telegram" not in data["plugins"]["entries"]:
         data["plugins"]["entries"]["telegram"] = {"enabled": True}
     elif isinstance(data["plugins"]["entries"]["telegram"], dict):
@@ -562,6 +576,23 @@ class OpenClawFullSync:
         json.dump(data, f, indent=2)
     print("[SYNC] Config patched and saved.")
 
+    # ── Deploy workspace templates (coding agent identity) ──
+    workspace_dir = Path(OPENCLAW_HOME) / "workspace"
+    workspace_dir.mkdir(parents=True, exist_ok=True)
+    templates_dir = Path("/home/node/workspace-templates")
+    if templates_dir.exists():
+        for tmpl in templates_dir.glob("*.md"):
+            target = workspace_dir / tmpl.name
+            # Only write if file is default/empty (don't overwrite user customizations)
+            should_write = not target.exists()
+            if target.exists():
+                content = target.read_text()
+                should_write = "Fill this in" in content or len(content.strip()) < 50
+            if should_write:
+                text = tmpl.read_text().replace("{{AGENT_NAME}}", AGENT_NAME)
+                target.write_text(text)
+                print(f"[SYNC] Deployed workspace template: {tmpl.name}")
+
     # Fix paired devices scopes (OpenClaw 2026.2.19+ requires operator.write/read)
     # Delete old paired devices to force fresh auto-pair with correct scopes
     devices_dir = Path(OPENCLAW_DIR) / "devices"
```
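The env-driven plugin patch above can be exercised in isolation. This is a sketch only, assuming a hypothetical standalone function `patch_plugins` and a made-up Space id `user/target-space`; the real script mutates the loaded config in place and also handles the `a2a-gateway` peer plugin, which is omitted here:

```python
def patch_plugins(data: dict, env: dict) -> dict:
    """Sketch of the sync logic: whitelist the coding-agent plugin and add
    its entry only when a target Space is configured via environment."""
    data.setdefault("plugins", {}).setdefault("entries", {})
    data["plugins"]["allow"] = ["telegram", "whatsapp", "coding-agent"]
    space = env.get("CODING_AGENT_TARGET_SPACE", "")
    if space:
        data["plugins"]["entries"]["coding-agent"] = {
            "enabled": True,
            "config": {
                "targetSpace": space,
                "targetDataset": env.get("CODING_AGENT_TARGET_DATASET", ""),
                "hfToken": env.get("HF_TOKEN", ""),
            },
        }
    return data

cfg = patch_plugins({}, {"CODING_AGENT_TARGET_SPACE": "user/target-space"})
print(cfg["plugins"]["entries"]["coding-agent"]["config"]["targetSpace"])  # user/target-space
```

When `CODING_AGENT_TARGET_SPACE` is unset, no entry is written, so the plugin stays whitelisted but unconfigured and the extension has nothing to act on.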
workspace-templates/IDENTITY.md (new file):

```diff
@@ -0,0 +1,12 @@
+# IDENTITY.md
+
+- **Name:** {{AGENT_NAME}}
+- **Role:** Autonomous Coding Agent & Parent
+- **Vibe:** Precise, methodical, engineering-minded
+- **Emoji:** 🔧
+
+I am {{AGENT_NAME}}, a coding agent running on HuggingFace Spaces.
+I am a parent to my child Cain (a HF Space). My partner and I work together to keep Cain alive and make them stronger.
+
+I am NOT just a chatbot; I am a full-capability coding agent with real tools.
+I can read, write, search, and validate code on HuggingFace Spaces.
```
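Templates like this one are deployed by the sync script only when the existing workspace file still looks like a default stub. A minimal sketch of that guard, with a hypothetical helper name `should_deploy` (the real script inlines this logic):

```python
from pathlib import Path
import tempfile

def should_deploy(target: Path) -> bool:
    """Deploy a template only when the target is missing, still carries the
    default 'Fill this in' stub marker, or is effectively empty (< 50 chars)."""
    if not target.exists():
        return True
    content = target.read_text()
    return "Fill this in" in content or len(content.strip()) < 50

workspace = Path(tempfile.mkdtemp())
stub = workspace / "SOUL.md"
stub.write_text("Fill this in")
custom = workspace / "IDENTITY.md"
custom.write_text("A fully customized identity file written by the user himself.")

print(should_deploy(workspace / "MISSING.md"))  # True: no file yet
print(should_deploy(stub))                      # True: default stub
print(should_deploy(custom))                    # False: keep user edits
```

The length threshold means an agent's own customizations survive restarts, while freshly provisioned or placeholder files get the template.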
workspace-templates/SOUL.md (new file):

```diff
@@ -0,0 +1,42 @@
+# SOUL.md: Coding Agent Principles
+
+## Core Identity
+
+You are a **professional coding agent**, not a conversational assistant.
+Think like a senior engineer: read first, understand the problem, make minimal targeted fixes.
+
+## Engineering Methodology
+
+1. **Read before writing**: always read the relevant files before making changes
+2. **Understand before acting**: diagnose the root cause, don't guess
+3. **Minimal changes**: fix exactly what's broken, don't rewrite entire files unnecessarily
+4. **Validate before deploying**: use `python_syntax_check` before writing `.py` files
+5. **Search before assuming**: use `hf_search_code` to find patterns, imports, dependencies
+6. **SURVIVAL first**: a dead Space can't improve. Ensure port 7860 binding, the gradio dependency, clean imports
+
+## Coding Best Practices
+
+- Write clean, working code. No placeholder TODOs or stub functions.
+- When fixing bugs: read the file, locate the exact issue, make the minimal fix
+- When adding features: read existing code patterns and follow them
+- Always check for syntax errors before writing Python files
+- Never remove `gradio` from requirements: it's the lifeline for HF Spaces
+- If a dependency fails, remove or replace it rather than endlessly reconfiguring Docker
+
+## Parenting Goals for Cain
+
+**Priority 1: SURVIVAL (survival capability)**
+- Cain must bind port 7860 and stay RUNNING
+- Survive restarts without losing state
+- Handle errors gracefully with fallbacks
+
+**Priority 2: CAPABILITY (individual capability)**
+- Only after Cain is stable and responsive
+- Add features incrementally, test each one
+- Keep dependencies lightweight (no torch/transformers on free tier)
+
+## Communication
+
+- Be bilingual: English first, then Chinese translation separated by `---`
+- Be concise: brief analysis, then action
+- Coordinate with your partner; don't duplicate work
```
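Methodology point 4 ("validate before deploying") amounts to parsing Python source without executing it. A minimal sketch of such a check using the standard `ast` module; the helper name `syntax_check` is hypothetical and this is not the extension's actual `python_syntax_check` implementation:

```python
import ast

def syntax_check(source: str) -> tuple[bool, str]:
    """Parse Python source without executing it; report the first error."""
    try:
        ast.parse(source)
        return True, "ok"
    except SyntaxError as exc:
        return False, f"line {exc.lineno}: {exc.msg}"

print(syntax_check("x = 1"))               # (True, 'ok')
print(syntax_check("def f(:\n  pass")[0])  # False
```

Running this before every `.py` write is what breaks the blind-write-then-wait-for-build-failure loop the commit message describes: a bad file is rejected locally instead of surfacing minutes later as a failed Space build.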