# Reachy Mini DanceML SDK Documentation

This documentation covers the SDK methods for controlling Reachy Mini movements and the data format for dance sequences.

## Table of Contents
- [Overview](#overview)
- [Available Datasets](#available-datasets)
- [Core Classes](#core-classes)
- [Movement Tools](#movement-tools)
- [Hybrid AI Workflow](#hybrid-ai-workflow)
- [Usage Examples](#usage-examples)
- [AI Agent Output Format](#ai-agent-output-format)

---

## Overview

The Reachy Mini DanceML SDK enables:
- **Voice-controlled movements** via OpenAI Realtime API
- **Keyframe-based animations** with cubic spline interpolation
- **Simple pose commands** for direct head positioning

---

## Available Datasets

Pollen Robotics provides two HuggingFace datasets with pre-recorded movements:

| Dataset | Records | Description |
|---------|---------|-------------|
| `pollen-robotics/reachy-mini-dances-library` | 20 | Dance moves (pecking, bobbing, swaying) |
| `pollen-robotics/reachy-mini-emotions-library` | 81 | Emotional expressions (wonder, fear, joy, etc.) |

Both datasets share the same schema and can be used interchangeably with this SDK.

### Dataset Schema

| Field | Type | Description |
|-------|------|-------------|
| `description` | `string` | Human-readable description of the movement |
| `time` | `List[float]` | Timestamps in seconds from animation start |
| `set_target_data` | `List[TargetData]` | Array of pose targets at each timestamp |

### TargetData Structure

Each element in `set_target_data` contains:

```python
{
    "head": [[4x4 homogeneous transformation matrix]],
    "antennas": [left_angle, right_angle],  # in radians
    "body_yaw": 0.0,                         # body rotation (typically 0)
    "check_collision": False                 # collision check flag
}
```

### Head Pose Matrix

The `head` field is a 4x4 homogeneous transformation matrix representing the head orientation:

```
[[r11, r12, r13, tx],
 [r21, r22, r23, ty],
 [r31, r32, r33, tz],
 [0,   0,   0,   1 ]]
```

Where:
- The 3x3 upper-left submatrix encodes rotation (roll, pitch, yaw)
- The last column `[tx, ty, tz, 1]` encodes translation
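For intuition, roll/pitch/yaw angles can be packed into such a matrix as follows. This is an illustrative sketch, not an SDK function, and the `Rz @ Ry @ Rx` composition order is an assumption that should be checked against the SDK's actual convention:

```python
import numpy as np

def make_head_pose(roll_deg, pitch_deg, yaw_deg, translation=(0.0, 0.0, 0.0)):
    """Build a 4x4 homogeneous head pose from Euler angles in degrees.

    Assumes a Rz(yaw) @ Ry(pitch) @ Rx(roll) rotation order; verify
    against the SDK before relying on it.
    """
    r, p, y = np.radians([roll_deg, pitch_deg, yaw_deg])
    cr, sr = np.cos(r), np.sin(r)
    cp, sp = np.cos(p), np.sin(p)
    cy, sy = np.cos(y), np.sin(y)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    pose = np.eye(4)
    pose[:3, :3] = Rz @ Ry @ Rx   # 3x3 rotation block
    pose[:3, 3] = translation     # translation column
    return pose
```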

### Loading the Datasets

```python
from datasets import load_dataset

# Dance moves (requires HuggingFace login)
dances = load_dataset("pollen-robotics/reachy-mini-dances-library")

# Emotions library (requires HuggingFace login)
emotions = load_dataset("pollen-robotics/reachy-mini-emotions-library")

# Access a dance move
dance = dances['train'][0]
print(f"Description: {dance['description']}")
# Output: "A sharp, forward, chicken-like pecking motion."

# Access an emotion
emotion = emotions['train'][0]
print(f"Description: {emotion['description']}")
# Output: "When you discover something extraordinary..."

# Both have the same structure
print(f"Duration: {emotion['time'][-1]} seconds")
print(f"Frames: {len(emotion['time'])}")
```
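A loaded record can be replayed by stepping through its parallel `time` and `set_target_data` arrays. The sketch below only handles frame timing; `send_pose` is a hypothetical stand-in for whatever call actually drives the robot (it is not an SDK function):

```python
import time

def replay_record(record, send_pose, speed=1.0):
    """Step through a dataset record, calling send_pose for each frame.

    send_pose(head, antennas, body_yaw) is a placeholder for the real
    robot command; this loop only handles the timing.
    """
    start = time.monotonic()
    for t, target in zip(record["time"], record["set_target_data"]):
        # Sleep until this frame's timestamp, scaled by playback speed.
        delay = t / speed - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        send_pose(target["head"], target["antennas"], target["body_yaw"])
```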

### Example Emotion Descriptions

The emotions library includes expressive movements such as:
- **Wonder**: "When you discover something extraordinary"
- **Fear**: "You look around without really knowing where to look"
- **Joy**: Celebratory movements
- **Surprise**: Reactive startle responses
- **Curiosity**: Investigative head tilts

---

## Core Classes

### KeyFrame

A single keyframe in an animation sequence.

```python
from reachy_mini_danceml.movement_tools import KeyFrame

@dataclass
class KeyFrame:
    t: float                      # Time in seconds from animation start
    head: dict                    # {"roll": 0, "pitch": 0, "yaw": 0} in degrees
    antennas: Tuple[float, float] # (left, right) antenna angles in degrees
```

#### Methods

| Method | Description |
|--------|-------------|
| `KeyFrame.from_dict(data)` | Create KeyFrame from a dictionary |

#### Example

```python
# Create keyframes for a nodding animation
keyframes = [
    KeyFrame(t=0.0, head={"roll": 0, "pitch": 0, "yaw": 0}, antennas=(0, 0)),
    KeyFrame(t=0.3, head={"roll": 0, "pitch": -15, "yaw": 0}, antennas=(10, 10)),
    KeyFrame(t=0.6, head={"roll": 0, "pitch": 10, "yaw": 0}, antennas=(-5, -5)),
    KeyFrame(t=1.0, head={"roll": 0, "pitch": 0, "yaw": 0}, antennas=(0, 0)),
]
```

---

### GeneratedMove

A Move generated from keyframes with cubic spline interpolation.

```python
from reachy_mini_danceml.movement_generator import GeneratedMove

class GeneratedMove(Move):
    def __init__(self, keyframes: List[KeyFrame])
    
    @property
    def duration(self) -> float
    
    def evaluate(self, t: float) -> Tuple[np.ndarray, np.ndarray, float]
```

#### Properties

| Property | Type | Description |
|----------|------|-------------|
| `duration` | `float` | Total animation duration in seconds |

#### Methods

| Method | Parameters | Returns | Description |
|--------|------------|---------|-------------|
| `evaluate(t)` | `t: float` (time in seconds) | `(head_pose, antennas, body_yaw)` | Interpolate pose at time t |

#### Return Values from `evaluate()`

- `head_pose`: 4x4 numpy array (homogeneous transformation matrix)
- `antennas`: numpy array `[left, right]` in radians
- `body_yaw`: float (always 0.0)

#### Example

```python
from reachy_mini_danceml.movement_generator import GeneratedMove
from reachy_mini_danceml.movement_tools import KeyFrame

keyframes = [
    KeyFrame(t=0.0, head={"yaw": 0}, antennas=(0, 0)),
    KeyFrame(t=1.0, head={"yaw": 30}, antennas=(0, 0)),
    KeyFrame(t=2.0, head={"yaw": 0}, antennas=(0, 0)),
]

move = GeneratedMove(keyframes)
print(f"Duration: {move.duration} seconds")

# Get pose at 0.5 seconds
head, antennas, body_yaw = move.evaluate(0.5)
```
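For intuition, the cubic spline interpolation can be sketched for a single channel (one scalar, such as yaw, over time). This natural-spline implementation is illustrative only; the SDK's actual solver and boundary conditions may differ:

```python
import numpy as np

def cubic_spline_eval(ts, ys, t):
    """Evaluate a natural cubic spline through knots (ts, ys) at time t.

    Illustrates the per-channel interpolation GeneratedMove presumably
    performs between keyframes.
    """
    ts, ys = np.asarray(ts, float), np.asarray(ys, float)
    n = len(ts)
    h = np.diff(ts)
    # Solve for second derivatives M with natural ends (M[0] = M[-1] = 0).
    A, rhs = np.zeros((n, n)), np.zeros(n)
    A[0, 0] = A[-1, -1] = 1.0
    for i in range(1, n - 1):
        A[i, i - 1] = h[i - 1]
        A[i, i] = 2 * (h[i - 1] + h[i])
        A[i, i + 1] = h[i]
        rhs[i] = 6 * ((ys[i + 1] - ys[i]) / h[i] - (ys[i] - ys[i - 1]) / h[i - 1])
    M = np.linalg.solve(A, rhs)
    # Locate the segment containing t and evaluate its cubic.
    i = max(min(np.searchsorted(ts, t, side="right") - 1, n - 2), 0)
    d = t - ts[i]
    b = (ys[i + 1] - ys[i]) / h[i] - h[i] * (2 * M[i] + M[i + 1]) / 6
    return ys[i] + b * d + M[i] / 2 * d**2 + (M[i + 1] - M[i]) / (6 * h[i]) * d**3
```

At the knots the spline reproduces the keyframe values exactly; between them it produces the smooth easing that makes the motion look natural.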

---

### MoveLibrary

Manages loading and indexing of dance and emotion datasets.

```python
from reachy_mini_danceml.dataset_loader import MoveLibrary

library = MoveLibrary()
library.load()

# Search
results = library.search_moves("happy")

# Get Record
record = library.get_move("joy_jump")
```
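MoveLibrary's internals are not shown here, but a minimal keyword search over record descriptions might look like the following. This is a naive stand-in for `search_moves`, not the SDK's actual ranking logic, and it assumes records are keyed by a name:

```python
def search_descriptions(records, query):
    """Rank records by how many query terms appear in their description.

    records: mapping of name -> record dict with a 'description' field
    (an assumed shape, for illustration only).
    """
    terms = query.lower().split()
    hits = []
    for name, rec in records.items():
        desc = rec["description"].lower()
        score = sum(term in desc for term in terms)
        if score:
            hits.append((score, name))
    # Best matches first; ties broken by name, descending.
    return [name for score, name in sorted(hits, reverse=True)]
```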

### MovementGenerator

Generates and executes movements on Reachy Mini.

```python
from reachy_mini_danceml.movement_generator import MovementGenerator

class MovementGenerator:
    def __init__(self, reachy: ReachyMini)
    
    def create_from_keyframes(self, keyframes) -> GeneratedMove
    
    async def goto_pose(self, roll=0, pitch=0, yaw=0, duration=0.5) -> None
    
    async def play_move(self, move: Move) -> None
    
    async def stop(self) -> None
```

#### Methods

| Method | Parameters | Description |
|--------|------------|-------------|
| `create_from_keyframes(keyframes)` | `List[KeyFrame]` or `List[dict]` | Create a GeneratedMove from keyframes |
| `goto_pose(roll, pitch, yaw, duration)` | Angles in degrees, duration in seconds | Move head to specific pose |
| `play_move(move)` | `Move` object | Play an animation asynchronously |
| `stop()` | None | Stop current movement, return to neutral |

#### Angle Limits

| Parameter | Range | Direction |
|-----------|-------|-----------|
| `roll` | -30° to 30° | Positive = tilt right |
| `pitch` | -30° to 30° | Positive = look up |
| `yaw` | -45° to 45° | Positive = look left |
| `antennas` | -60° to 60° | Each antenna independently |
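When composing poses programmatically, it can help to clamp requested angles to these limits before sending them. The helper below is hypothetical, not part of the SDK:

```python
# Hypothetical helper (not an SDK function): clamp a requested angle
# to the documented safe range before commanding the robot.
LIMITS = {"roll": 30.0, "pitch": 30.0, "yaw": 45.0, "antenna": 60.0}

def clamp_angle(name: str, value: float) -> float:
    limit = LIMITS[name]
    return max(-limit, min(limit, value))

safe_yaw = clamp_angle("yaw", 90.0)  # clamped to 45.0
```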

#### Example

```python
from reachy_mini import ReachyMini
from reachy_mini_danceml.movement_generator import MovementGenerator

async def demo(reachy: ReachyMini):
    generator = MovementGenerator(reachy)
    
    # Simple pose
    await generator.goto_pose(roll=0, pitch=10, yaw=-20, duration=0.5)
    
    # Keyframe animation
    keyframes = [
        {"t": 0.0, "head": {"yaw": 0}, "antennas": [0, 0]},
        {"t": 0.5, "head": {"yaw": 30}, "antennas": [20, -20]},
        {"t": 1.0, "head": {"yaw": 0}, "antennas": [0, 0]},
    ]
    move = generator.create_from_keyframes(keyframes)
    await generator.play_move(move)
```

---

## Movement Tools

These are OpenAI function-calling tool schemas for voice control integration.

### PLAY_MOVE_TOOL

Play a pre-defined movement from the library by its name/ID.

```python
{
    "type": "function",
    "name": "play_move",
    "description": "Play a pre-defined movement from the library by its name (e.g., 'joy', 'fear', 'chicken_dance'). Prefer this over creating sequences manually.",
    "parameters": {
        "properties": {
            "name": {"type": "string", "description": "Name or ID of the movement"}
        },
        "required": ["name"]
    }
}
```

### SEARCH_MOVES_TOOL

Search the library for available movements.

```python
{
    "type": "function",
    "name": "search_moves",
    "description": "Search the movement library for available expressions or dances.",
    "parameters": {
        "properties": {
            "query": {"type": "string", "description": "Keywords to search for"}
        },
        "required": ["query"]
    }
}
```

### GET_CHOREOGRAPHY_GUIDE_TOOL

Retrieve physics rules and examples for custom generation.

```python
{
    "type": "function",
    "name": "get_choreography_guide",
    "description": "Read the choreography guide to learn how to create safe and expressive custom movements. Call this BEFORE using create_sequence for new moves."
}
```

### GOTO_POSE_TOOL

Move the robot's head to a specific pose.

```python
{
    "type": "function",
    "name": "goto_pose",
    "parameters": {
        "properties": {
            "roll": {"type": "number", "description": "Roll angle (-30 to 30°)"},
            "pitch": {"type": "number", "description": "Pitch angle (-30 to 30°)"},
            "yaw": {"type": "number", "description": "Yaw angle (-45 to 45°)"},
            "duration": {"type": "number", "description": "Duration in seconds"}
        }
    }
}
```

### CREATE_SEQUENCE_TOOL

Create and play an animated movement sequence from keyframes.

```python
{
    "type": "function",
    "name": "create_sequence",
    "parameters": {
        "properties": {
            "keyframes": {
                "type": "array",
                "items": {
                    "type": "object",
                    "properties": {
                        "t": {"type": "number", "description": "Time in seconds"},
                        "head": {
                            "properties": {
                                "roll": {"type": "number"},
                                "pitch": {"type": "number"},
                                "yaw": {"type": "number"}
                            }
                        },
                        "antennas": {
                            "type": "array",
                            "items": {"type": "number"},
                            "description": "[left, right] in degrees (-60 to 60)"
                        }
                    },
                    "required": ["t"]
                }
            }
        },
        "required": ["keyframes"]
    }
}
```

### STOP_MOVEMENT_TOOL

Stop any currently playing movement and return to neutral position.

```python
{
    "type": "function",
    "name": "stop_movement",
    "description": "Stop current movement and return to neutral"
}
```

---

## Hybrid AI Workflow

The SDK is designed for a **Hybrid Generative/Retrieval** architecture to optimize context usage.

### Recommended Agent Logic

1.  **Retrieval First**: Always try `search_moves(query)` first.
2.  **Play by Name**: If a match is found, use `play_move(name)`. No movement data needs to enter the model's context.
3.  **On-Demand Learning**: If no match is found, call `get_choreography_guide()` to load physics rules.
4.  **Safe Generation**: Finally, use `create_sequence(keyframes)` to generate a custom move using the loaded rules.
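The four steps above can be sketched as an agent loop. All collaborators are injected; `generate_keyframes` and `load_guide` are placeholders for the model call and the choreography-guide tool (not SDK functions), and the record-to-Move conversion inside `play_move` is glossed over:

```python
async def choose_and_play(query, library, generator, generate_keyframes, load_guide):
    """Hybrid retrieval-then-generation flow (illustrative sketch)."""
    matches = library.search_moves(query)          # 1. retrieval first
    if matches:
        # 2. Play by name: no movement data enters the model context.
        await generator.play_move(library.get_move(matches[0]))
        return
    guide = load_guide()                           # 3. load physics rules
    keyframes = generate_keyframes(query, guide)   # 4. generate a safe custom move
    await generator.play_move(generator.create_from_keyframes(keyframes))
```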

---

## Usage Examples

### Example 1: Wave Animation

```python
wave_keyframes = [
    {"t": 0.0, "head": {"roll": 0, "yaw": 0}, "antennas": [0, 0]},
    {"t": 0.3, "head": {"roll": 0, "yaw": 0}, "antennas": [30, -30]},
    {"t": 0.6, "head": {"roll": 0, "yaw": 0}, "antennas": [-30, 30]},
    {"t": 0.9, "head": {"roll": 0, "yaw": 0}, "antennas": [30, -30]},
    {"t": 1.2, "head": {"roll": 0, "yaw": 0}, "antennas": [0, 0]},
]
```

### Example 2: Curious Head Tilt

```python
curious_keyframes = [
    {"t": 0.0, "head": {"roll": 0, "pitch": 0, "yaw": 0}},
    {"t": 0.4, "head": {"roll": 15, "pitch": 5, "yaw": 10}},
    {"t": 1.5, "head": {"roll": 15, "pitch": 5, "yaw": 10}},
    {"t": 2.0, "head": {"roll": 0, "pitch": 0, "yaw": 0}},
]
```

### Example 3: Excited Celebration

```python
excited_keyframes = [
    {"t": 0.0, "head": {"pitch": 0}, "antennas": [0, 0]},
    {"t": 0.2, "head": {"pitch": -10}, "antennas": [40, 40]},
    {"t": 0.4, "head": {"pitch": 5}, "antennas": [-20, -20]},
    {"t": 0.6, "head": {"pitch": -10}, "antennas": [40, 40]},
    {"t": 0.8, "head": {"pitch": 5}, "antennas": [-20, -20]},
    {"t": 1.0, "head": {"pitch": 0}, "antennas": [0, 0]},
]
```

---

## AI Agent Output Format

When building an AI agent to generate movements for Reachy Mini, the output should match this format:

### For Simple Poses

```json
{
    "function": "goto_pose",
    "arguments": {
        "roll": 0,
        "pitch": 10,
        "yaw": -20,
        "duration": 0.5
    }
}
```

### For Animated Sequences

```json
{
    "function": "create_sequence",
    "arguments": {
        "keyframes": [
            {"t": 0.0, "head": {"roll": 0, "pitch": 0, "yaw": 0}, "antennas": [0, 0]},
            {"t": 0.5, "head": {"roll": 10, "pitch": -5, "yaw": 20}, "antennas": [15, -15]},
            {"t": 1.0, "head": {"roll": 0, "pitch": 0, "yaw": 0}, "antennas": [0, 0]}
        ]
    }
}
```

This format allows seamless integration with the OpenAI Realtime API for voice-controlled robot movements.
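A receiving application then needs to route these JSON calls to the SDK. The dispatcher below is an illustrative sketch covering a subset of the tools; the record-to-Move handling for `play_move` is assumed to live in the library/generator:

```python
async def dispatch(call, generator, library):
    """Route an agent's JSON function call to SDK methods (sketch only)."""
    name, args = call["function"], call.get("arguments", {})
    if name == "goto_pose":
        await generator.goto_pose(**args)
    elif name == "create_sequence":
        move = generator.create_from_keyframes(args["keyframes"])
        await generator.play_move(move)
    elif name == "play_move":
        await generator.play_move(library.get_move(args["name"]))
    elif name == "stop_movement":
        await generator.stop()
    else:
        raise ValueError(f"Unhandled tool: {name}")
```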