---
license: apache-2.0
---

# 🖊️ ACT-SO100-Draw

Action Chunking Transformer (ACT) checkpoint for **drawing with a custom pen-holding attachment on the SO-100 and SO-101 robotic arms**.

![pen_tool_photo](./assets/tool.jpg)
*3-D-printed pen mount designed for the SO-100 and SO-101 robotic arms.*

The pen-mount STL is available for download from the [SO-100 Tools repository](https://github.com/krohling/so-100-tools).

---

## Demo  

![Demo](./assets/demo.gif)

---

## Dataset

| Name                                                                                                                                                                             | Episodes | Frames / episode | Modalities                                |
| -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -------- | ---------------- | ----------------------------------------- |
| [370-drawn-to-caffeine-draw-smiley](https://huggingface.co/spaces/lerobot/visualize_dataset?path=%2FLeRobot-worldwide-hackathon%2F370-drawn-to-caffeine-draw-smiley%2Fepisode_0) | 42       | \~450            | RGB 640×480, proprio 5-DoF, gripper state |


## Training Details

Full training curves and run details are available on Weights & Biases: [wandb run](https://wandb.ai/kevin_ai/lerobot_hackathon/runs/ahu8fcc0).

| Hyper-parameter        | Value            |
| ---------------------- | ---------------- |
| Chunk size             | 100              |
| Dim feedforward        | 3200             |
| Dim model              | 512              |
| Dropout                | 0.1              |
| Feedforward activation | ReLU             |
| Decoder layers         | 1                |
| Encoder layers         | 4                |
| Attention heads        | 8                |
| VAE encoder layers     | 4                |
| Batch size             | 32               |
| Optimizer              | AdamW, lr = 1e-5 |
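A chunk size of 100 means each forward pass predicts the next 100 actions. At inference, ACT-style policies commonly smooth overlapping chunks with temporal ensembling, weighting all available predictions for the current timestep by exponentially decaying weights (as described in the original ACT paper). The sketch below illustrates only that weighting step; the function and variable names are illustrative and not taken from this checkpoint or the LeRobot codebase:

```python
import math

def temporal_ensemble(predictions, coeff=0.01):
    """Blend overlapping chunk predictions for one timestep.

    predictions: list of action vectors (oldest first), each a list of floats.
    Weights follow the ACT paper's scheme w_i = exp(-coeff * i),
    so the oldest prediction (i = 0) receives the largest weight.
    """
    weights = [math.exp(-coeff * i) for i in range(len(predictions))]
    total = sum(weights)
    dim = len(predictions[0])
    # Weighted average, computed per action dimension.
    return [
        sum(w * p[d] for w, p in zip(weights, predictions)) / total
        for d in range(dim)
    ]

# Example: two overlapping predictions for the same timestep.
blended = temporal_ensemble([[1.0], [3.0]], coeff=0.01)
```

With `coeff = 0` this reduces to a plain average; larger values bias the blend toward older predictions, trading responsiveness for smoother pen strokes.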


## Citation

If you use this checkpoint in your work, please cite the following:

```bibtex
@misc{Rohling2025ACTSO100Draw,
  author       = {Kevin Rohling},
  title        = {ACT Checkpoint for Pen-Drawing on SO-100},
  year         = {2025},
  howpublished = {\url{https://huggingface.co/kevin510/ACT-SO100-Draw}}
}
```