---
title: ClawBody
emoji: 🦞
colorFrom: red
colorTo: purple
sdk: static
pinned: false
short_description: OpenClaw AI with robot body and face tracking
tags:
- reachy_mini
- reachy_mini_python_app
- openclaw
- clawson
- embodied-ai
- ai-assistant
- voice-assistant
- robotics
- openai-realtime
- conversational-ai
- physical-ai
- robot-body
- speech-to-speech
- multimodal
- vision
- expressive-robot
- simulation
- mujoco
- face-tracking
- face-detection
- eye-contact
- human-robot-interaction
---
# 🦞🤖 ClawBody

**Give your OpenClaw AI agent a physical robot body!**

ClawBody combines OpenClaw's AI intelligence with Reachy Mini's expressive robot body, using OpenAI's Realtime API for ultra-responsive voice conversation. Your OpenClaw assistant (Clawson) can now see, hear, speak, and move in the physical world.
## 👁️ NEW: Face Tracking & Eye Contact

**The robot looks at you when you speak!**

ClawBody now includes real-time face tracking that makes conversations feel natural and engaging:

- **Automatic Face Detection**: Uses MediaPipe or YOLO to detect faces at 25 Hz
- **Smooth Head Tracking**: The robot smoothly follows your face as you move
- **Natural Eye Contact**: Maintains engagement during conversation
- **Graceful Fallback**: Smoothly returns to a neutral position when you leave
```bash
# Face tracking is enabled by default
clawbody

# Choose your tracker (MediaPipe is lighter, YOLO is more accurate)
clawbody --head-tracker mediapipe
clawbody --head-tracker yolo

# Disable it if needed
clawbody --no-face-tracking
```
## 🎮 No Robot? No Problem!

**You don't need a physical Reachy Mini robot to use ClawBody!**

ClawBody works with the Reachy Mini Simulator, a MuJoCo-based physics simulation that runs on your computer. Watch Clawson move and express emotions on screen while you talk to your OpenClaw agent.
```bash
# Install simulator support
pip install "reachy-mini[mujoco]"

# Start the simulator (opens a 3D window)
reachy-mini-daemon --sim

# In another terminal, run ClawBody
clawbody --gradio
```
> 🍎 **Mac Users**: Use `mjpython -m reachy_mini.daemon.app.main --sim` instead.
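Before launching ClawBody it can help to confirm the daemon is actually listening. The helper below is an illustrative utility, not part of ClawBody; port 8000 is an assumption based on the dashboard default this README mentions:

```python
import socket


def daemon_listening(host: str = "localhost", port: int = 8000,
                     timeout: float = 1.0) -> bool:
    """Return True if something accepts TCP connections at host:port.
    Illustrative helper -- not part of ClawBody itself."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, unreachable, or timed out
        return False


if daemon_listening():
    print("daemon reachable -- ok to start clawbody")
else:
    print("daemon not reachable -- start `reachy-mini-daemon --sim` first")
```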
## ✨ Features

- 👁️ **Face Tracking**: The robot tracks your face and maintains eye contact during conversation
- 🎤 **Real-time Voice Conversation**: OpenAI Realtime API for sub-second response latency
- 🧠 **OpenClaw Intelligence**: Responses come from OpenClaw with full tool access
- 👀 **Vision**: Sees through the robot's camera and can describe the environment
- 💃 **Expressive Movements**: Natural head movements, emotions, dances, and audio-driven wobble
- 🦞 **Clawson Embodied**: Your friendly space lobster AI assistant, now with a body!
- 🖥️ **Simulator Support**: Works with or without physical hardware
## 🏗️ Architecture

```
┌─────────────────────────────────────────────────────────────────┐
│                     Your Voice / Microphone                     │
└─────────────────────────────┬───────────────────────────────────┘
                              │
                              ▼
┌─────────────────────────────────────────────────────────────────┐
│                Reachy Mini Robot (or Simulator)                 │
│  ┌─────────────┐  ┌─────────────┐  ┌─────────────────────────┐  │
│  │ Microphone  │  │   Camera    │  │     Movement System     │  │
│  │  (input)    │  │  (vision)   │  │ (head, antennas, body)  │  │
│  └──────┬──────┘  └──────┬──────┘  └────────────▲────────────┘  │
└─────────┼────────────────┼──────────────────────┼───────────────┘
          │                │                      │
          ▼                ▼                      │
┌─────────────────────────────────────────────────┼───────────────┐
│                            ClawBody             │               │
│  ┌──────────────────────────────────────────────┼────────────┐  │
│  │           OpenAI Realtime API Handler        │            │  │
│  │  • Speech recognition (Whisper)              │            │  │
│  │  • Text-to-speech (voices) ──────────────────┘            │  │
│  │  • Audio analysis → head wobble                           │  │
│  └───────────────────────────────────────────────────────────┘  │
│                             │                                   │
│                             ▼                                   │
│  ┌───────────────────────────────────────────────────────────┐  │
│  │                  OpenClaw Gateway Bridge                  │  │
│  │  • AI responses from Clawson                              │  │
│  │  • Full OpenClaw tool access                              │  │
│  │  • Conversation memory & context                          │  │
│  └───────────────────────────────────────────────────────────┘  │
└─────────────────────────────────────────────────────────────────┘
                              │
                              ▼
┌─────────────────────────────────────────────────────────────────┐
│                        OpenClaw Gateway                         │
│  • Web browsing  • Calendar  • Smart home  • Memory  • Tools    │
└─────────────────────────────────────────────────────────────────┘
```
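The OpenClaw Gateway Bridge in the diagram is, at its core, an HTTP client that forwards each transcribed utterance to the gateway. The exact wire format is internal to OpenClaw, so the `/chat` path and JSON field names below are purely hypothetical; the sketch only illustrates the shape of such a bridge:

```python
import json
import urllib.request


def build_gateway_request(gateway_url: str, token: str, agent_id: str,
                          transcript: str) -> urllib.request.Request:
    """Assemble an HTTP request carrying a user utterance to the gateway.
    The /chat path and JSON fields are hypothetical, for illustration only."""
    payload = {"agent_id": agent_id, "message": transcript}
    return urllib.request.Request(
        url=f"{gateway_url}/chat",
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )


req = build_gateway_request("http://localhost:18789", "my-token", "main",
                            "What's on my calendar today?")
print(req.full_url)  # → http://localhost:18789/chat
```

The gateway URL and agent id here match the defaults shown in the Configuration section; only the request shape is invented.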
## 📋 Prerequisites

### Option A: With Physical Robot

- Reachy Mini robot (Wireless or Lite)

### Option B: With Simulator (No Hardware Required!)

- Any computer with Python 3.11+
- Install: `pip install "reachy-mini[mujoco]"`
- Simulation Setup Guide

### Software (Both Options)

- Python 3.11+
- Reachy Mini SDK installed
- OpenClaw gateway running
- OpenAI API key with Realtime API access
## 🚀 Installation

### Quick Start with Simulator

```bash
# Clone ClawBody
git clone https://github.com/tomrikert/clawbody
cd clawbody

# Create a virtual environment
python -m venv .venv
source .venv/bin/activate

# Install ClawBody + simulator support + face tracking
pip install -e ".[mediapipe_vision]"
pip install "reachy-mini[mujoco]"

# Or, for more accurate face tracking (requires more resources):
# pip install -e ".[yolo_vision]"

# Configure (see the Configuration section)
cp .env.example .env
# Edit .env with your keys

# Terminal 1: Start the simulator
reachy-mini-daemon --sim

# Terminal 2: Run ClawBody
clawbody --gradio
```
### On a Physical Reachy Mini Robot

```bash
# SSH into the robot
ssh pollen@reachy-mini.local

# Clone the repository
git clone https://github.com/tomrikert/clawbody
cd clawbody

# Install into the apps virtual environment
/venvs/apps_venv/bin/pip install -e .
```
## ⚙️ Configuration

1. Copy the example environment file:

   ```bash
   cp .env.example .env
   ```

2. Edit `.env` with your configuration:
```bash
# Required
OPENAI_API_KEY=sk-...your-key...

# OpenClaw Gateway (required for AI responses)
OPENCLAW_GATEWAY_URL=http://localhost:18789  # or your host IP
OPENCLAW_TOKEN=your-gateway-token
OPENCLAW_AGENT_ID=main

# Optional: customize the voice
OPENAI_VOICE=cedar

# Optional: face tracking (enabled by default)
ENABLE_FACE_TRACKING=true
HEAD_TRACKER_TYPE=mediapipe  # or "yolo" for more accuracy
```
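Loading and validating this configuration can be sketched with plain `os.environ`-style lookups. The variable names and defaults come from the listing above, but the `load_config` helper itself is hypothetical, not ClawBody's real startup code:

```python
import os

# Settings the README marks as required (gateway is required for AI responses).
REQUIRED = ["OPENAI_API_KEY", "OPENCLAW_GATEWAY_URL", "OPENCLAW_TOKEN"]


def load_config(env: dict[str, str]) -> dict[str, object]:
    """Validate required keys and apply the documented defaults.
    Pass os.environ in real use; a plain dict here for testability."""
    missing = [k for k in REQUIRED if not env.get(k)]
    if missing:
        raise RuntimeError(f"missing required settings: {', '.join(missing)}")
    return {
        "api_key": env["OPENAI_API_KEY"],
        "gateway_url": env["OPENCLAW_GATEWAY_URL"],
        "token": env["OPENCLAW_TOKEN"],
        "agent_id": env.get("OPENCLAW_AGENT_ID", "main"),
        "voice": env.get("OPENAI_VOICE", "cedar"),
        "face_tracking": env.get("ENABLE_FACE_TRACKING", "true").lower() == "true",
        "tracker": env.get("HEAD_TRACKER_TYPE", "mediapipe"),
    }


cfg = load_config({"OPENAI_API_KEY": "sk-test",
                   "OPENCLAW_GATEWAY_URL": "http://localhost:18789",
                   "OPENCLAW_TOKEN": "tok"})
print(cfg["tracker"])  # → mediapipe
```

Failing fast on missing keys is deliberate: a misconfigured gateway URL otherwise surfaces only as silent, response-less conversations.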
## 🎮 Usage

### With Simulator

```bash
# Terminal 1: Start the simulator
reachy-mini-daemon --sim

# Terminal 2: Run ClawBody with the web UI (recommended for the simulator)
clawbody --gradio
```

The simulator opens a 3D window where you can watch the robot move. The Gradio web UI at http://localhost:7860 lets you interact via your browser's microphone.
### With Physical Robot

```bash
# Basic usage
clawbody

# With debug logging
clawbody --debug

# With a specific robot
clawbody --robot-name my-reachy
```
### CLI Options

| Option | Description |
|---|---|
| `--debug` | Enable debug logging |
| `--gradio` | Launch the web UI instead of console mode |
| `--robot-name NAME` | Specify the robot name for the connection |
| `--gateway-url URL` | OpenClaw gateway URL |
| `--no-camera` | Disable camera functionality |
| `--no-openclaw` | Disable OpenClaw integration |
| `--head-tracker TYPE` | Face tracker: `mediapipe` (lighter) or `yolo` (more accurate) |
| `--no-face-tracking` | Disable face tracking |
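These flags map naturally onto `argparse`. The parser below is a hedged reconstruction of the documented interface for illustration; ClawBody's actual CLI code may differ:

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    """Reconstruct the documented clawbody flags (illustrative only)."""
    p = argparse.ArgumentParser(prog="clawbody")
    p.add_argument("--debug", action="store_true", help="enable debug logging")
    p.add_argument("--gradio", action="store_true",
                   help="launch web UI instead of console mode")
    p.add_argument("--robot-name", metavar="NAME",
                   help="robot name for the connection")
    p.add_argument("--gateway-url", metavar="URL", help="OpenClaw gateway URL")
    p.add_argument("--no-camera", action="store_true",
                   help="disable camera functionality")
    p.add_argument("--no-openclaw", action="store_true",
                   help="disable OpenClaw integration")
    p.add_argument("--head-tracker", choices=["mediapipe", "yolo"],
                   default="mediapipe", help="face tracker backend")
    p.add_argument("--no-face-tracking", action="store_true",
                   help="disable face tracking")
    return p


args = build_parser().parse_args(["--gradio", "--head-tracker", "yolo"])
print(args.head_tracker)  # → yolo
```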
## 🛠️ Robot Capabilities

ClawBody gives Clawson these physical abilities:
| Capability | Description |
|---|---|
| Face Tracking | Automatically tracks and looks at people during conversation |
| Look | Move head to look in directions (left, right, up, down) |
| See | Capture images through the robot's camera |
| Dance | Perform expressive dance animations |
| Emotions | Express emotions through movement (happy, curious, thinking, etc.) |
| Speak | Voice output through the robot's speaker |
| Listen | Hear through the robot's microphone |
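One common way to expose such capabilities to an AI agent is a name-to-handler registry that the tool-calling layer can dispatch into. The pattern below is a generic sketch; the handler names and return strings are invented, not ClawBody's actual API:

```python
from typing import Callable

# Registry mapping capability names to handlers (illustrative pattern).
CAPABILITIES: dict[str, Callable[..., str]] = {}


def capability(name: str):
    """Decorator registering a handler under a capability name."""
    def register(fn: Callable[..., str]) -> Callable[..., str]:
        CAPABILITIES[name] = fn
        return fn
    return register


@capability("look")
def look(direction: str) -> str:
    # A real implementation would command the head joints here.
    return f"looking {direction}"


@capability("emotion")
def emotion(name: str) -> str:
    # A real implementation would play a movement animation here.
    return f"expressing {name}"


print(CAPABILITIES["look"]("left"))  # → looking left
```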
## 🖥️ Simulator Features

When running with the simulator:

- **3D Visualization**: Watch Clawson's movements in real time
- **Scene Options**: Use `--scene minimal` to add objects (apple, duck, croissant)
- **Full SDK Compatibility**: The simulator exposes the same SDK interface as a real robot
- **Dashboard Access**: Visit http://localhost:8000 to see the robot dashboard

```bash
# Start the simulator with objects on a table
reachy-mini-daemon --sim --scene minimal
```
## 📄 License

This project is licensed under the Apache 2.0 License; see the LICENSE file for details.
## 🙏 Acknowledgments

ClawBody builds on:
- Pollen Robotics - Reachy Mini robot, SDK, and simulator
- OpenClaw - AI assistant framework (Clawson!)
- OpenAI - Realtime API for voice I/O
- MuJoCo - Physics simulation engine
- pollen-robotics/reachy_mini_conversation_app - Movement and audio systems
## 🤝 Contributing

Contributions are welcome! Please feel free to submit a pull request.

- This project: GitHub Issues
- OpenClaw skills: submit ClawBody as a skill to ClawHub
- Reachy Mini apps: submit to Pollen Robotics