---
task_categories:
- robotics
tags:
- lerobot
- cross-embodiment
---
# MOTIF: Learning Action Motifs for Few-shot Cross-Embodiment Transfer
This repository contains a minimal real-world dataset provided to reproduce the interleaved task setting described in the paper [MOTIF: Learning Action Motifs for Few-shot Cross-Embodiment Transfer](https://huggingface.co/papers/2602.13764).
[**GitHub**](https://github.com/buduz/MOTIF) | [**Paper**](https://huggingface.co/papers/2602.13764)
## Dataset Description
MOTIF is a framework for few-shot cross-embodiment robotic transfer. It learns reusable **action motifs**—embodiment-agnostic spatiotemporal patterns—that enable efficient policy generalization across different robot embodiments.
This example dataset includes:
- **Embodiments**: ARX5 and Piper.
- **Tasks**: Two distinct tasks across embodiments.
- **Format**: The dataset adheres to the [LeRobot](https://github.com/huggingface/lerobot) data format and includes a `modality.json` for detailed modality and annotation definitions (compatible with GR00T N1).
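The `modality.json` file maps named modalities (e.g. state and action groups) to index spans within the flat feature arrays. The snippet below is a stdlib-only sketch of inspecting such a file; the keys shown are hypothetical placeholders, not this dataset's actual schema, so check the downloaded file for the real GR00T N1-style definitions:

```python
import json

# Illustrative example only: the real modality.json in this dataset follows
# the GR00T N1 convention; the field names below are hypothetical placeholders.
example_modality = """
{
  "state": {"joint_positions": {"start": 0, "end": 7}},
  "action": {"joint_positions": {"start": 0, "end": 7}}
}
"""

spec = json.loads(example_modality)
for group, fields in spec.items():
    for name, span in fields.items():
        # Each entry names a slice [start, end) into the flat feature vector.
        print(f"{group}.{name}: dims [{span['start']}, {span['end']})")
```

To inspect the actual definitions, replace `example_modality` with the contents of `demo_data/meta/modality.json` (or wherever the file lives in your local copy).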
## Usage
### Download the Dataset
You can download the dataset locally using the `huggingface-cli`:
```bash
huggingface-cli download \
--repo-type dataset Crossingz/ARX5_Piper_Few_shot_Example \
--local-dir ./demo_data
```
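If you prefer to script the download, the same repository can be fetched with `huggingface_hub.snapshot_download`. A minimal sketch (the import is deferred so the snippet loads even without `huggingface_hub` installed):

```python
def download_demo_data(local_dir: str = "./demo_data") -> str:
    """Download the example dataset repository and return its local path."""
    # Deferred import: requires `pip install huggingface_hub`.
    from huggingface_hub import snapshot_download

    return snapshot_download(
        repo_id="Crossingz/ARX5_Piper_Few_shot_Example",
        repo_type="dataset",
        local_dir=local_dir,
    )
```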
### Kinematic Trajectory Canonicalization
To enable embodiment-agnostic motif learning, raw end-effector trajectories must be canonicalized into a shared reference frame. You can use the processing script provided in the [official repository](https://github.com/buduz/MOTIF):
```bash
python data/process/trajectory_canonicalization.py \
--dataset_path ./demo_data \
--save_path ./demo_data_processed
```
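Because the data follows the LeRobot format, the processed output can then be loaded with LeRobot's `LeRobotDataset` class. A hedged sketch: the module path and constructor arguments below match recent `lerobot` releases but are assumptions to verify against your installed version:

```python
def load_processed_dataset(root: str = "./demo_data_processed"):
    """Load a LeRobot-format dataset from a local directory."""
    # Deferred import: requires `pip install lerobot`. The module path may
    # differ across lerobot versions.
    from lerobot.common.datasets.lerobot_dataset import LeRobotDataset

    # A repo_id is still required for local data; `root` points at the folder.
    return LeRobotDataset("Crossingz/ARX5_Piper_Few_shot_Example", root=root)
```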
## Citation
If you find this dataset or the MOTIF framework useful, please consider citing:
```bibtex
@article{zhi2025motif,
  title={MOTIF: Learning Action Motifs for Few-shot Cross-Embodiment Transfer},
  author={Zhi, Heng and Tan, Wentao and Zhu, Lei and Li, Fengling and Li, Jingjing and Yang, Guoli and Shen, Heng Tao},
  journal={arXiv preprint arXiv:2602.13764},
  year={2025}
}
```