# Fox1.2 OpenClaw - 100% OpenClaw Tool Support

A fine-tuned Qwen2.5-0.5B model optimized for OpenClaw agent tool execution.
## Model Details

- Base: Qwen2.5-0.5B-Instruct
- Parameters: ~494M
- Context Length: 32,768 tokens
- Size: ~994MB (F16)
## Capabilities - All OpenClaw Tools Supported ✅

### Core Tools

- exec: Shell command execution (ls, cd, git, docker, npm, pip, python, node, etc.)
- read: Read file contents (path required)
- write: Write content to file (path, content required)
- edit: Edit file by replacing exact text (path, oldText, newText required)

### Process Management

- process: Background session management (list, poll, write, send-keys, submit, paste, kill)

### Web Operations

- web_search: DuckDuckGo search (query, count, region, safeSearch)
- web_fetch: Fetch and extract web content (url, extractMode, maxChars)

### OpenClaw Management

- session_status: Get session information
- sessions_list: List active sessions (activeMinutes, kinds, limit, messageLimit)
- sessions_history: Get conversation history (sessionKey, includeTools, limit)
- sessions_send: Send messages between sessions (sessionKey/label, message)
- sessions_spawn: Spawn sub-agents or ACP sessions (task, runtime, mode, etc.)
- sessions_yield: End current turn
- subagents: Manage spawned sub-agents (list, kill, steer)

### Cron Jobs

- cron: Job management (status, list, add, update, remove, run, runs, wake)

### Memory

- memory_search: Search memory files (query, maxResults, minScore)
- memory_get: Retrieve memory content (path, from, lines)

### Additional

- image: Image analysis (image/images, prompt)
- weather: Weather information (location)
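The required parameters listed above can be encoded as a small lookup table for validating a call before dispatch. A minimal sketch, limited to the four core tools whose required parameters are stated explicitly (the `command` key for `exec` follows this card's own examples):

```python
# Required parameters per tool, taken from the tool descriptions above.
# Only the four core tools are included, since only those mark their
# parameters as required; "command" for exec matches the card's examples.
REQUIRED_PARAMS = {
    "exec": {"command"},
    "read": {"path"},
    "write": {"path", "content"},
    "edit": {"path", "oldText", "newText"},
}

def missing_params(call: dict) -> set:
    """Return the set of required parameters absent from a tool call."""
    needed = REQUIRED_PARAMS.get(call.get("action"), set())
    return needed - call.keys()

print(missing_params({"action": "write", "path": "/tmp/x.txt"}))  # {'content'}
```

Unknown actions validate as having no missing parameters, so the table can be extended incrementally without breaking dispatch.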
## Tool Call Format

When a tool call is needed, the model responds with JSON in this format:

```json
{"action": "tool_name", "param1": "value1", "param2": "value2"}
```
### Examples

```json
{"action": "exec", "command": "ls -la"}
{"action": "read", "path": "/home/user/README.md"}
{"action": "write", "path": "/home/user/test.txt", "content": "Hello World"}
{"action": "web_search", "query": "python tutorials"}
{"action": "weather", "location": "Athens"}
{"action": "cron", "operation": "list"}
{"action": "session_status"}
```
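A small language model may wrap the JSON in surrounding chatter, so a tolerant parser helps when wiring the format above into an agent loop. A minimal sketch (the function name is illustrative, not part of OpenClaw):

```python
import json

def parse_tool_call(text: str) -> dict:
    """Extract the first top-level JSON object from model output."""
    start = text.find("{")
    end = text.rfind("}")
    if start == -1 or end < start:
        raise ValueError("no JSON object in model output")
    call = json.loads(text[start:end + 1])
    if "action" not in call:
        raise ValueError("tool call missing required 'action' key")
    return call

call = parse_tool_call('Sure! {"action": "read", "path": "/home/user/README.md"}')
print(call["action"], call["path"])  # read /home/user/README.md
```

Rejecting output without an `action` key makes malformed generations fail loudly instead of dispatching a nonsense tool.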
## Usage

### Ollama

```bash
# Create the model
ollama create fox1.2-openclaw -f Modelfile

# Run it
ollama run fox1.2-openclaw "list files in current directory"
```
### Python/Transformers

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("teolm30/fox1.2-openclaw")
tokenizer = AutoTokenizer.from_pretrained("teolm30/fox1.2-openclaw")

# Generate tool call
inputs = tokenizer("List all files", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0]))
```
### HuggingFace CLI

```bash
# Download
huggingface-cli download teolm30/fox1.2-openclaw

# Or use curl
curl -L -o fox1.2-openclaw.gguf https://huggingface.co/teolm30/fox1.2-openclaw/resolve/main/fox1.2-openclaw.gguf
```
## Why This Model?

- Compact: 494M params - runs on consumer hardware (6GB VRAM)
- Fast: Optimized for local inference
- 100% Tool Support: All OpenClaw agent tools supported
- Smart Tool Selection: Knows when to use each tool appropriately
- OpenClaw Native: Built specifically for OpenClaw integration
## Training

Trained on 200+ examples covering all OpenClaw tool patterns:
- File operations (read, write, edit)
- Shell commands (exec)
- Web operations (search, fetch)
- Session management
- Cron jobs
- Memory operations
- And more...
Optimized for tool call generation and execution in agent workflows.
## OpenClaw Configuration

Add to your `models.json`:

```json
{
  "id": "fox1.2-openclaw:latest",
  "name": "Fox1.2 OpenClaw",
  "reasoning": false,
  "input": ["text"],
  "contextWindow": 32768,
  "maxTokens": 4096,
  "api": "ollama"
}
```

Then restart OpenClaw: `openclaw gateway restart`
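Before restarting, it can be worth sanity-checking the entry against the limits this card documents; a minimal sketch (the checks themselves are illustrative, not an OpenClaw validator):

```python
import json

# The models.json entry from this card, as a string for illustration.
entry = json.loads("""{
  "id": "fox1.2-openclaw:latest",
  "name": "Fox1.2 OpenClaw",
  "reasoning": false,
  "input": ["text"],
  "contextWindow": 32768,
  "maxTokens": 4096,
  "api": "ollama"
}""")

# Context window must not exceed the model's documented 32,768 tokens,
# and the generation budget must fit inside the window.
assert entry["contextWindow"] <= 32768
assert entry["maxTokens"] < entry["contextWindow"]
assert entry["api"] == "ollama"
print("config OK")
```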
## License

Apache 2.0

## Author

teolm30 (OpenClaw Community)

## Version History

- v1.0 (2026-04-01): Initial release with 100% OpenClaw tool support