Lava8888 committed · verified
Commit c828b3b · 1 Parent(s): 8ce6565

Upload folder using huggingface_hub
Files changed (43)
  1. .gitattributes +1 -0
  2. README.md +52 -0
  3. smolvla_omy/checkpoints/000500/pretrained_model/config.json +83 -0
  4. smolvla_omy/checkpoints/000500/pretrained_model/model.safetensors +3 -0
  5. smolvla_omy/checkpoints/000500/pretrained_model/train_config.json +195 -0
  6. smolvla_omy/checkpoints/000500/training_state/optimizer_param_groups.json +527 -0
  7. smolvla_omy/checkpoints/000500/training_state/optimizer_state.safetensors +3 -0
  8. smolvla_omy/checkpoints/000500/training_state/rng_state.safetensors +3 -0
  9. smolvla_omy/checkpoints/000500/training_state/scheduler_state.json +15 -0
  10. smolvla_omy/checkpoints/000500/training_state/training_step.json +3 -0
  11. smolvla_omy/checkpoints/001000/pretrained_model/config.json +83 -0
  12. smolvla_omy/checkpoints/001000/pretrained_model/model.safetensors +3 -0
  13. smolvla_omy/checkpoints/001000/pretrained_model/train_config.json +195 -0
  14. smolvla_omy/checkpoints/001000/training_state/optimizer_param_groups.json +527 -0
  15. smolvla_omy/checkpoints/001000/training_state/optimizer_state.safetensors +3 -0
  16. smolvla_omy/checkpoints/001000/training_state/rng_state.safetensors +3 -0
  17. smolvla_omy/checkpoints/001000/training_state/scheduler_state.json +15 -0
  18. smolvla_omy/checkpoints/001000/training_state/training_step.json +3 -0
  19. smolvla_omy/checkpoints/001500/pretrained_model/config.json +83 -0
  20. smolvla_omy/checkpoints/001500/pretrained_model/model.safetensors +3 -0
  21. smolvla_omy/checkpoints/001500/pretrained_model/train_config.json +195 -0
  22. smolvla_omy/checkpoints/001500/training_state/optimizer_param_groups.json +527 -0
  23. smolvla_omy/checkpoints/001500/training_state/optimizer_state.safetensors +3 -0
  24. smolvla_omy/checkpoints/001500/training_state/rng_state.safetensors +3 -0
  25. smolvla_omy/checkpoints/001500/training_state/scheduler_state.json +15 -0
  26. smolvla_omy/checkpoints/001500/training_state/training_step.json +3 -0
  27. smolvla_omy/checkpoints/002000/pretrained_model/config.json +83 -0
  28. smolvla_omy/checkpoints/002000/pretrained_model/model.safetensors +3 -0
  29. smolvla_omy/checkpoints/002000/pretrained_model/train_config.json +195 -0
  30. smolvla_omy/checkpoints/002000/training_state/optimizer_param_groups.json +527 -0
  31. smolvla_omy/checkpoints/002000/training_state/optimizer_state.safetensors +3 -0
  32. smolvla_omy/checkpoints/002000/training_state/rng_state.safetensors +3 -0
  33. smolvla_omy/checkpoints/002000/training_state/scheduler_state.json +15 -0
  34. smolvla_omy/checkpoints/002000/training_state/training_step.json +3 -0
  35. smolvla_omy/wandb/debug-internal.log +6 -0
  36. smolvla_omy/wandb/debug.log +21 -0
  37. smolvla_omy/wandb/run-20250810_170118-7/files/output.log +0 -0
  38. smolvla_omy/wandb/run-20250810_170118-7/files/requirements.txt +238 -0
  39. smolvla_omy/wandb/run-20250810_170118-7/files/wandb-metadata.json +43 -0
  40. smolvla_omy/wandb/run-20250810_170118-7/logs/debug-core.log +7 -0
  41. smolvla_omy/wandb/run-20250810_170118-7/logs/debug-internal.log +6 -0
  42. smolvla_omy/wandb/run-20250810_170118-7/logs/debug.log +21 -0
  43. smolvla_omy/wandb/run-20250810_170118-7/run-7.wandb +3 -0
.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+smolvla_omy/wandb/run-20250810_170118-7/run-7.wandb filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,52 @@
+ # SmolVLA-OMY Model Checkpoints
+
+ This repository contains training checkpoints for a SmolVLA (Small Vision-Language-Action) model trained on the ArrangeVegetables task.
+
+ ## Model Details
+
+ - **Model Type**: SmolVLA (Vision-Language-Action model)
+ - **Task**: ArrangeVegetables manipulation task
+ - **Training Steps**: 20,000 (configured; checkpoints through step 2,000 are uploaded here)
+ - **Batch Size**: 350
+ - **Chunk Size**: 5 action steps
+ - **Input Features**:
+   - Visual observations: 256x256 RGB images (main camera and wrist camera)
+   - State observations: 6-dimensional state vector
+ - **Output Features**: 12-dimensional action space
+
+ ## Checkpoint Structure
+
+ The repository contains checkpoints saved at different training steps:
+ - `000500/`: Checkpoint at 500 steps
+ - `001000/`: Checkpoint at 1,000 steps
+ - `001500/`: Checkpoint at 1,500 steps
+ - `002000/`: Checkpoint at 2,000 steps
+
+ Each checkpoint contains:
+ - `pretrained_model/`: Model weights and configuration
+ - `training_state/`: Optimizer state, scheduler state, and training metadata
+
+ ## Training Configuration
+
+ - **Device**: CUDA
+ - **Seed**: 42
+ - **Workers**: 24
+ - **Evaluation Frequency**: Every 5 steps
+ - **Logging Frequency**: Every step
+ - **Image Resize**: 512x512 with padding
+ - **Normalization**: Identity for visual, mean-std for state/action
+
+ ## Usage
+
+ To load a checkpoint (replace `your_training_framework` with the framework used for training):
+
+ ```python
+ from your_training_framework import load_checkpoint
+
+ # Load the latest checkpoint (2,000 steps)
+ model = load_checkpoint("./002000/pretrained_model/")
+ ```
+
+ ## Dataset
+
+ Trained on the ArrangeVegetables dataset, available at: `lava8888/ArrangeVegetables`
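The README's loader import is a framework placeholder, but the zero-padded checkpoint directory layout can be resolved with the standard library alone. A minimal sketch (the `latest_checkpoint` helper name is mine, not part of the repository):

```python
from pathlib import Path

def latest_checkpoint(root: str) -> Path:
    """Return the highest-numbered checkpoint directory under `root`,
    assuming the zero-padded step-count layout (000500, 001000, ...)."""
    candidates = [p for p in Path(root).iterdir() if p.is_dir() and p.name.isdigit()]
    if not candidates:
        raise FileNotFoundError(f"no checkpoint directories under {root}")
    return max(candidates, key=lambda p: int(p.name))
```

For this repository, `latest_checkpoint("smolvla_omy/checkpoints")` would resolve to the `002000` directory.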
smolvla_omy/checkpoints/000500/pretrained_model/config.json ADDED
@@ -0,0 +1,83 @@
+ {
+     "type": "smolvla",
+     "n_obs_steps": 1,
+     "normalization_mapping": {
+         "VISUAL": "IDENTITY",
+         "STATE": "MEAN_STD",
+         "ACTION": "MEAN_STD"
+     },
+     "input_features": {
+         "observation.image": {"type": "VISUAL", "shape": [3, 256, 256]},
+         "observation.wrist_image": {"type": "VISUAL", "shape": [3, 256, 256]},
+         "observation.state": {"type": "STATE", "shape": [6]}
+     },
+     "output_features": {
+         "action": {"type": "ACTION", "shape": [12]}
+     },
+     "device": "cuda",
+     "use_amp": false,
+     "chunk_size": 5,
+     "n_action_steps": 5,
+     "max_state_dim": 32,
+     "max_action_dim": 32,
+     "resize_imgs_with_padding": [512, 512],
+     "empty_cameras": 0,
+     "adapt_to_pi_aloha": false,
+     "use_delta_joint_actions_aloha": false,
+     "tokenizer_max_length": 48,
+     "num_steps": 10,
+     "use_cache": true,
+     "freeze_vision_encoder": true,
+     "train_expert_only": true,
+     "train_state_proj": true,
+     "optimizer_lr": 0.0001,
+     "optimizer_betas": [0.9, 0.95],
+     "optimizer_eps": 1e-08,
+     "optimizer_weight_decay": 1e-10,
+     "optimizer_grad_clip_norm": 10,
+     "scheduler_warmup_steps": 1000,
+     "scheduler_decay_steps": 30000,
+     "scheduler_decay_lr": 2.5e-06,
+     "vlm_model_name": "HuggingFaceTB/SmolVLM2-500M-Video-Instruct",
+     "load_vlm_weights": false,
+     "add_image_special_tokens": false,
+     "attention_mode": "cross_attn",
+     "prefix_length": -1,
+     "pad_language_to": "longest",
+     "num_expert_layers": -1,
+     "num_vlm_layers": 16,
+     "self_attn_every_n_layers": 2,
+     "expert_width_multiplier": 0.75,
+     "min_period": 0.004,
+     "max_period": 4.0
+ }
smolvla_omy/checkpoints/000500/pretrained_model/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d87f75272f49d653f201305f355afcf20667e5caff46fccc061bc6245db6fb38
+ size 1197790120
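The `.safetensors` entries above are stored as Git LFS pointer files: three `key value` lines giving the spec version, the sha256 object id, and the real object's byte size. A small stdlib parser, as a sketch (the function name is illustrative, not an API from this repo):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its version, oid, and size fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    algo, _, digest = fields["oid"].partition(":")  # oid is "sha256:<hex>"
    return {
        "version": fields["version"],
        "oid_algo": algo,
        "oid": digest,
        "size": int(fields["size"]),  # bytes of the actual object in LFS storage
    }
```

Applied to the pointer above, this reports a `1197790120`-byte (~1.2 GB) weights file.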
smolvla_omy/checkpoints/000500/pretrained_model/train_config.json ADDED
@@ -0,0 +1,195 @@
+ {
+     "dataset": {
+         "repo_id": "lava8888/ArrangeVegetables",
+         "root": "./ArrangeVegetables",
+         "episodes": null,
+         "image_transforms": {
+             "enable": false,
+             "max_num_transforms": 3,
+             "random_order": false,
+             "tfs": {
+                 "brightness": {"weight": 1.0, "type": "ColorJitter", "kwargs": {"brightness": [0.8, 1.2]}},
+                 "contrast": {"weight": 1.0, "type": "ColorJitter", "kwargs": {"contrast": [0.8, 1.2]}},
+                 "saturation": {"weight": 1.0, "type": "ColorJitter", "kwargs": {"saturation": [0.5, 1.5]}},
+                 "hue": {"weight": 1.0, "type": "ColorJitter", "kwargs": {"hue": [-0.05, 0.05]}},
+                 "sharpness": {"weight": 1.0, "type": "SharpnessJitter", "kwargs": {"sharpness": [0.5, 1.5]}}
+             }
+         },
+         "revision": null,
+         "use_imagenet_stats": true,
+         "video_backend": "torchcodec"
+     },
+     "env": null,
+     "policy": {
+         "type": "smolvla",
+         "n_obs_steps": 1,
+         "normalization_mapping": {
+             "VISUAL": "IDENTITY",
+             "STATE": "MEAN_STD",
+             "ACTION": "MEAN_STD"
+         },
+         "input_features": {
+             "observation.image": {"type": "VISUAL", "shape": [3, 256, 256]},
+             "observation.wrist_image": {"type": "VISUAL", "shape": [3, 256, 256]},
+             "observation.state": {"type": "STATE", "shape": [6]}
+         },
+         "output_features": {
+             "action": {"type": "ACTION", "shape": [12]}
+         },
+         "device": "cuda",
+         "use_amp": false,
+         "chunk_size": 5,
+         "n_action_steps": 5,
+         "max_state_dim": 32,
+         "max_action_dim": 32,
+         "resize_imgs_with_padding": [512, 512],
+         "empty_cameras": 0,
+         "adapt_to_pi_aloha": false,
+         "use_delta_joint_actions_aloha": false,
+         "tokenizer_max_length": 48,
+         "num_steps": 10,
+         "use_cache": true,
+         "freeze_vision_encoder": true,
+         "train_expert_only": true,
+         "train_state_proj": true,
+         "optimizer_lr": 0.0001,
+         "optimizer_betas": [0.9, 0.95],
+         "optimizer_eps": 1e-08,
+         "optimizer_weight_decay": 1e-10,
+         "optimizer_grad_clip_norm": 10,
+         "scheduler_warmup_steps": 1000,
+         "scheduler_decay_steps": 30000,
+         "scheduler_decay_lr": 2.5e-06,
+         "vlm_model_name": "HuggingFaceTB/SmolVLM2-500M-Video-Instruct",
+         "load_vlm_weights": false,
+         "add_image_special_tokens": false,
+         "attention_mode": "cross_attn",
+         "prefix_length": -1,
+         "pad_language_to": "longest",
+         "num_expert_layers": -1,
+         "num_vlm_layers": 16,
+         "self_attn_every_n_layers": 2,
+         "expert_width_multiplier": 0.75,
+         "min_period": 0.004,
+         "max_period": 4.0
+     },
+     "output_dir": "ckpt/smolvla_omy",
+     "job_name": "smolvla_6",
+     "resume": false,
+     "seed": 42,
+     "num_workers": 24,
+     "batch_size": 350,
+     "steps": 20000,
+     "eval_freq": 5,
+     "log_freq": 1,
+     "save_checkpoint": true,
+     "save_freq": 500,
+     "use_policy_training_preset": true,
+     "optimizer": {
+         "type": "adamw",
+         "lr": 0.0001,
+         "weight_decay": 1e-10,
+         "grad_clip_norm": 10,
+         "betas": [0.9, 0.95],
+         "eps": 1e-08
+     },
+     "scheduler": {
+         "type": "cosine_decay_with_warmup",
+         "num_warmup_steps": 1000,
+         "num_decay_steps": 30000,
+         "peak_lr": 0.0001,
+         "decay_lr": 2.5e-06
+     },
+     "eval": {
+         "n_episodes": 50,
+         "batch_size": 50,
+         "use_async_envs": false
+     },
+     "wandb": {
+         "enable": true,
+         "disable_artifact": true,
+         "project": "smolVLA",
+         "entity": "qualiastudios",
+         "notes": "first",
+         "run_id": "7",
+         "mode": "online"
+     }
+ }
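From the `steps`, `batch_size`, and `save_freq` values in the training configuration above, the planned training volume follows by simple arithmetic (these are configured values, not measured results):

```python
steps = 20_000      # "steps" in train_config.json
batch_size = 350    # "batch_size"
save_freq = 500     # "save_freq": one checkpoint every 500 steps

total_samples = steps * batch_size      # sample draws over the full run
total_checkpoints = steps // save_freq  # checkpoints if the run completes
```

A completed run would draw 7,000,000 samples and write 40 checkpoints; only the first four checkpoints appear in this commit.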
smolvla_omy/checkpoints/000500/training_state/optimizer_param_groups.json ADDED
@@ -0,0 +1,527 @@
+ [
+     {
+         "lr": 5.0049950049950055e-05,
+         "betas": [0.9, 0.95],
+         "eps": 1e-08,
+         "weight_decay": 1e-10,
+         "amsgrad": false,
+         "maximize": false,
+         "foreach": null,
+         "capturable": false,
+         "differentiable": false,
+         "fused": null,
+         "decoupled_weight_decay": true,
+         "initial_lr": 0.0001,
+         "params": [
+             0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19,
+             20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39,
+             40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59,
+             60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79,
+             80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99,
+             100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119,
+             120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139,
+             140, 141, 142, 143, 144, 145, 146, 147, 148, 149, 150, 151, 152, 153, 154, 155, 156, 157, 158, 159,
+             160, 161, 162, 163, 164, 165, 166, 167, 168, 169, 170, 171, 172, 173, 174, 175, 176, 177, 178, 179,
+             180, 181, 182, 183, 184, 185, 186, 187, 188, 189, 190, 191, 192, 193, 194, 195, 196, 197, 198, 199,
+             200, 201, 202, 203, 204, 205, 206, 207, 208, 209, 210, 211, 212, 213, 214, 215, 216, 217, 218, 219,
+             220, 221, 222, 223, 224, 225, 226, 227, 228, 229, 230, 231, 232, 233, 234, 235, 236, 237, 238, 239,
+             240, 241, 242, 243, 244, 245, 246, 247, 248, 249, 250, 251, 252, 253, 254, 255, 256, 257, 258, 259,
+             260, 261, 262, 263, 264, 265, 266, 267, 268, 269, 270, 271, 272, 273, 274, 275, 276, 277, 278, 279,
+             280, 281, 282, 283, 284, 285, 286, 287, 288, 289, 290, 291, 292, 293, 294, 295, 296, 297, 298, 299,
+             300, 301, 302, 303, 304, 305, 306, 307, 308, 309, 310, 311, 312, 313, 314, 315, 316, 317, 318, 319,
+             320, 321, 322, 323, 324, 325, 326, 327, 328, 329, 330, 331, 332, 333, 334, 335, 336, 337, 338, 339,
+             340, 341, 342, 343, 344, 345, 346, 347, 348, 349, 350, 351, 352, 353, 354, 355, 356, 357, 358, 359,
+             360, 361, 362, 363, 364, 365, 366, 367, 368, 369, 370, 371, 372, 373, 374, 375, 376, 377, 378, 379,
+             380, 381, 382, 383, 384, 385, 386, 387, 388, 389, 390, 391, 392, 393, 394, 395, 396, 397, 398, 399,
+             400, 401, 402, 403, 404, 405, 406, 407, 408, 409, 410, 411, 412, 413, 414, 415, 416, 417, 418, 419,
+             420, 421, 422, 423, 424, 425, 426, 427, 428, 429, 430, 431, 432, 433, 434, 435, 436, 437, 438, 439,
+             440, 441, 442, 443, 444, 445, 446, 447, 448, 449, 450, 451, 452, 453, 454, 455, 456, 457, 458, 459,
+             460, 461, 462, 463, 464, 465, 466, 467, 468, 469, 470, 471, 472, 473, 474, 475, 476, 477, 478, 479,
+             480, 481, 482, 483, 484, 485, 486, 487, 488, 489, 490, 491, 492, 493, 494, 495, 496, 497, 498, 499,
+             500, 501, 502, 503, 504, 505
+         ]
+     }
+ ]
smolvla_omy/checkpoints/000500/training_state/optimizer_state.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f8d7858bdb6795a8c0199212983688c5d79de1848ba50ae88f9764ab0e284445
+ size 412659164
smolvla_omy/checkpoints/000500/training_state/rng_state.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d5396ab9f630e87e1691e4e0e33cb341647f60050242789daddf2b8d72d3876f
+ size 15708
smolvla_omy/checkpoints/000500/training_state/scheduler_state.json ADDED
@@ -0,0 +1,15 @@
+ {
+     "base_lrs": [0.0001],
+     "last_epoch": 500,
+     "_step_count": 501,
+     "_is_initial": false,
+     "_get_lr_called_within_step": false,
+     "_last_lr": [5.0049950049950055e-05],
+     "lr_lambdas": [null]
+ }
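The stored `_last_lr` at `last_epoch` 500 is consistent with a linear warmup toward the configured `peak_lr` of 1e-4 over 1000 warmup steps. Assuming a ramp of the form lr = peak * (step + 1) / (warmup + 1) (an assumption about the serialized lambda, since `lr_lambdas` is stored as `[null]`), the stored value is reproduced:

```python
peak_lr = 1e-4   # "peak_lr" in train_config.json
warmup = 1000    # "num_warmup_steps"
step = 500       # "last_epoch" in scheduler_state.json

# Hypothetical linear-warmup ramp that reproduces the stored "_last_lr":
lr = peak_lr * (step + 1) / (warmup + 1)
```

This gives roughly 5.005e-05, matching `_last_lr` above.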
smolvla_omy/checkpoints/000500/training_state/training_step.json ADDED
@@ -0,0 +1,3 @@
+ {
+     "step": 500
+ }
smolvla_omy/checkpoints/001000/pretrained_model/config.json ADDED
@@ -0,0 +1,83 @@
+ {
+     "type": "smolvla",
+     "n_obs_steps": 1,
+     "normalization_mapping": {
+         "VISUAL": "IDENTITY",
+         "STATE": "MEAN_STD",
+         "ACTION": "MEAN_STD"
+     },
+     "input_features": {
+         "observation.image": {"type": "VISUAL", "shape": [3, 256, 256]},
+         "observation.wrist_image": {"type": "VISUAL", "shape": [3, 256, 256]},
+         "observation.state": {"type": "STATE", "shape": [6]}
+     },
+     "output_features": {
+         "action": {"type": "ACTION", "shape": [12]}
+     },
+     "device": "cuda",
+     "use_amp": false,
+     "chunk_size": 5,
+     "n_action_steps": 5,
+     "max_state_dim": 32,
+     "max_action_dim": 32,
+     "resize_imgs_with_padding": [512, 512],
+     "empty_cameras": 0,
+     "adapt_to_pi_aloha": false,
+     "use_delta_joint_actions_aloha": false,
+     "tokenizer_max_length": 48,
+     "num_steps": 10,
+     "use_cache": true,
+     "freeze_vision_encoder": true,
+     "train_expert_only": true,
+     "train_state_proj": true,
+     "optimizer_lr": 0.0001,
+     "optimizer_betas": [0.9, 0.95],
+     "optimizer_eps": 1e-08,
+     "optimizer_weight_decay": 1e-10,
+     "optimizer_grad_clip_norm": 10,
+     "scheduler_warmup_steps": 1000,
+     "scheduler_decay_steps": 30000,
+     "scheduler_decay_lr": 2.5e-06,
+     "vlm_model_name": "HuggingFaceTB/SmolVLM2-500M-Video-Instruct",
+     "load_vlm_weights": false,
+     "add_image_special_tokens": false,
+     "attention_mode": "cross_attn",
+     "prefix_length": -1,
+     "pad_language_to": "longest",
+     "num_expert_layers": -1,
+     "num_vlm_layers": 16,
+     "self_attn_every_n_layers": 2,
+     "expert_width_multiplier": 0.75,
+     "min_period": 0.004,
+     "max_period": 4.0
+ }
smolvla_omy/checkpoints/001000/pretrained_model/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:39770941cb10c010a4cb934eaadfc5428112c3155ff7cda3179fc1545a2945c4
+ size 1197790120
smolvla_omy/checkpoints/001000/pretrained_model/train_config.json ADDED
@@ -0,0 +1,195 @@
+ {
+     "dataset": {
+         "repo_id": "lava8888/ArrangeVegetables",
+         "root": "./ArrangeVegetables",
+         "episodes": null,
+         "image_transforms": {
+             "enable": false,
+             "max_num_transforms": 3,
+             "random_order": false,
+             "tfs": {
+                 "brightness": {"weight": 1.0, "type": "ColorJitter", "kwargs": {"brightness": [0.8, 1.2]}},
+                 "contrast": {"weight": 1.0, "type": "ColorJitter", "kwargs": {"contrast": [0.8, 1.2]}},
+                 "saturation": {"weight": 1.0, "type": "ColorJitter", "kwargs": {"saturation": [0.5, 1.5]}},
+                 "hue": {"weight": 1.0, "type": "ColorJitter", "kwargs": {"hue": [-0.05, 0.05]}},
+                 "sharpness": {"weight": 1.0, "type": "SharpnessJitter", "kwargs": {"sharpness": [0.5, 1.5]}}
+             }
+         },
+         "revision": null,
+         "use_imagenet_stats": true,
+         "video_backend": "torchcodec"
+     },
+     "env": null,
+     "policy": {
+         "type": "smolvla",
+         "n_obs_steps": 1,
+         "normalization_mapping": {
+             "VISUAL": "IDENTITY",
+             "STATE": "MEAN_STD",
+             "ACTION": "MEAN_STD"
+         },
+         "input_features": {
+             "observation.image": {"type": "VISUAL", "shape": [3, 256, 256]},
+             "observation.wrist_image": {"type": "VISUAL", "shape": [3, 256, 256]},
+             "observation.state": {"type": "STATE", "shape": [6]}
+         },
+         "output_features": {
+             "action": {"type": "ACTION", "shape": [12]}
+         },
+         "device": "cuda",
+         "use_amp": false,
+         "chunk_size": 5,
+         "n_action_steps": 5,
+         "max_state_dim": 32,
+         "max_action_dim": 32,
+         "resize_imgs_with_padding": [512, 512],
+         "empty_cameras": 0,
+         "adapt_to_pi_aloha": false,
+         "use_delta_joint_actions_aloha": false,
+         "tokenizer_max_length": 48,
+         "num_steps": 10,
+         "use_cache": true,
+         "freeze_vision_encoder": true,
+         "train_expert_only": true,
+         "train_state_proj": true,
+         "optimizer_lr": 0.0001,
+         "optimizer_betas": [0.9, 0.95],
+         "optimizer_eps": 1e-08,
+         "optimizer_weight_decay": 1e-10,
+         "optimizer_grad_clip_norm": 10,
+         "scheduler_warmup_steps": 1000,
+         "scheduler_decay_steps": 30000,
+         "scheduler_decay_lr": 2.5e-06,
+         "vlm_model_name": "HuggingFaceTB/SmolVLM2-500M-Video-Instruct",
+         "load_vlm_weights": false,
+         "add_image_special_tokens": false,
+         "attention_mode": "cross_attn",
+         "prefix_length": -1,
+         "pad_language_to": "longest",
+         "num_expert_layers": -1,
+         "num_vlm_layers": 16,
+         "self_attn_every_n_layers": 2,
+         "expert_width_multiplier": 0.75,
+         "min_period": 0.004,
+         "max_period": 4.0
+     },
+     "output_dir": "ckpt/smolvla_omy",
+     "job_name": "smolvla_6",
+     "resume": false,
+     "seed": 42,
+     "num_workers": 24,
+     "batch_size": 350,
+     "steps": 20000,
+     "eval_freq": 5,
+     "log_freq": 1,
+     "save_checkpoint": true,
+     "save_freq": 500,
+     "use_policy_training_preset": true,
+     "optimizer": {
+         "type": "adamw",
+         "lr": 0.0001,
+         "weight_decay": 1e-10,
+         "grad_clip_norm": 10,
+         "betas": [0.9, 0.95],
+         "eps": 1e-08
+     },
+     "scheduler": {
+         "type": "cosine_decay_with_warmup",
+         "num_warmup_steps": 1000,
+         "num_decay_steps": 30000,
+         "peak_lr": 0.0001,
+         "decay_lr": 2.5e-06
+     },
+     "eval": {
+         "n_episodes": 50,
+         "batch_size": 50,
+         "use_async_envs": false
+     },
+     "wandb": {
+         "enable": true,
+         "disable_artifact": true,
+         "project": "smolVLA",
+         "entity": "qualiastudios",
+         "notes": "first",
+         "run_id": "7",
+         "mode": "online"
+     }
+ }
smolvla_omy/checkpoints/001000/training_state/optimizer_param_groups.json ADDED
@@ -0,0 +1,527 @@
+ [
+     {
+         "lr": 9.973294239920334e-05,
+         "betas": [0.9, 0.95],
+         "eps": 1e-08,
+         "weight_decay": 1e-10,
+         "amsgrad": false,
+         "maximize": false,
+         "foreach": null,
+         "capturable": false,
+         "differentiable": false,
+         "fused": null,
+         "decoupled_weight_decay": true,
+         "initial_lr": 0.0001,
+         "params": [
+             0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19,
+             20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39,
+             40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59,
+             60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79,
+             80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99,
+             100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119,
+             120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139,
+             140, 141, 142, 143, 144, 145, 146, 147, 148, 149, 150, 151, 152, 153, 154, 155, 156, 157, 158, 159,
+             160, 161, 162, 163, 164, 165, 166, 167, 168, 169, 170, 171, 172, 173, 174, 175, 176, 177, 178, 179,
+             180, 181, 182, 183, 184, 185, 186, 187, 188, 189, 190, 191, 192, 193, 194, 195, 196, 197, 198, 199,
+             200, 201, 202, 203, 204, 205, 206, 207, 208, 209, 210, 211, 212, 213, 214, 215, 216, 217, 218, 219,
+             220, 221, 222, 223, 224, 225, 226, 227, 228, 229, 230, 231, 232, 233, 234, 235, 236, 237, 238, 239,
+             240, 241, 242, 243, 244, 245, 246, 247, 248, 249, 250, 251, 252, 253, 254, 255, 256, 257, 258, 259,
+             260, 261, 262, 263, 264, 265, 266, 267, 268, 269, 270, 271, 272, 273, 274, 275, 276, 277, 278, 279,
+             280, 281, 282, 283, 284, 285, 286, 287, 288, 289, 290, 291, 292, 293, 294, 295, 296, 297, 298, 299,
+             300, 301, 302, 303, 304, 305, 306, 307, 308, 309, 310, 311, 312, 313, 314, 315, 316, 317, 318, 319,
+             320, 321, 322, 323, 324, 325, 326, 327, 328, 329, 330, 331, 332, 333, 334, 335, 336, 337, 338, 339,
+             340, 341, 342, 343, 344, 345, 346, 347, 348, 349, 350, 351, 352, 353, 354, 355, 356, 357, 358, 359,
+             360, 361, 362, 363, 364, 365, 366, 367, 368, 369, 370, 371, 372, 373, 374, 375, 376, 377, 378, 379,
+             380, 381, 382, 383, 384, 385, 386, 387, 388, 389, 390, 391, 392, 393, 394, 395, 396, 397, 398, 399,
+             400, 401, 402, 403, 404, 405, 406, 407, 408, 409, 410, 411, 412, 413, 414, 415, 416, 417, 418, 419,
+             420, 421, 422, 423, 424, 425, 426, 427, 428, 429, 430, 431, 432, 433, 434, 435, 436, 437, 438, 439,
+             440, 441, 442, 443, 444, 445, 446, 447, 448, 449, 450, 451, 452, 453, 454, 455, 456, 457, 458, 459,
+             460, 461, 462, 463, 464, 465, 466, 467, 468, 469, 470, 471, 472, 473, 474, 475, 476, 477, 478, 479,
+             480, 481, 482, 483, 484,
+ 485,
505
+ 486,
506
+ 487,
507
+ 488,
508
+ 489,
509
+ 490,
510
+ 491,
511
+ 492,
512
+ 493,
513
+ 494,
514
+ 495,
515
+ 496,
516
+ 497,
517
+ 498,
518
+ 499,
519
+ 500,
520
+ 501,
521
+ 502,
522
+ 503,
523
+ 504,
524
+ 505
525
+ ]
526
+ }
527
+ ]
smolvla_omy/checkpoints/001000/training_state/optimizer_state.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0653df4442f0e38d12aea9f9d9d570c52559ae67b1bd4202e30f6f5eda7baa43
+ size 412659164
smolvla_omy/checkpoints/001000/training_state/rng_state.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b0833a0516d28aed28c6e022a657c7919284907562097f19ac3bbc5bf59c342f
+ size 15708
smolvla_omy/checkpoints/001000/training_state/scheduler_state.json ADDED
@@ -0,0 +1,15 @@
+ {
+ "base_lrs": [
+ 0.0001
+ ],
+ "last_epoch": 1000,
+ "_step_count": 1001,
+ "_is_initial": false,
+ "_get_lr_called_within_step": false,
+ "_last_lr": [
+ 9.973294239920334e-05
+ ],
+ "lr_lambdas": [
+ null
+ ]
+ }
smolvla_omy/checkpoints/001000/training_state/training_step.json ADDED
@@ -0,0 +1,3 @@
+ {
+ "step": 1000
+ }
smolvla_omy/checkpoints/001500/pretrained_model/config.json ADDED
@@ -0,0 +1,83 @@
1
+ {
2
+ "type": "smolvla",
3
+ "n_obs_steps": 1,
4
+ "normalization_mapping": {
5
+ "VISUAL": "IDENTITY",
6
+ "STATE": "MEAN_STD",
7
+ "ACTION": "MEAN_STD"
8
+ },
9
+ "input_features": {
10
+ "observation.image": {
11
+ "type": "VISUAL",
12
+ "shape": [
13
+ 3,
14
+ 256,
15
+ 256
16
+ ]
17
+ },
18
+ "observation.wrist_image": {
19
+ "type": "VISUAL",
20
+ "shape": [
21
+ 3,
22
+ 256,
23
+ 256
24
+ ]
25
+ },
26
+ "observation.state": {
27
+ "type": "STATE",
28
+ "shape": [
29
+ 6
30
+ ]
31
+ }
32
+ },
33
+ "output_features": {
34
+ "action": {
35
+ "type": "ACTION",
36
+ "shape": [
37
+ 12
38
+ ]
39
+ }
40
+ },
41
+ "device": "cuda",
42
+ "use_amp": false,
43
+ "chunk_size": 5,
44
+ "n_action_steps": 5,
45
+ "max_state_dim": 32,
46
+ "max_action_dim": 32,
47
+ "resize_imgs_with_padding": [
48
+ 512,
49
+ 512
50
+ ],
51
+ "empty_cameras": 0,
52
+ "adapt_to_pi_aloha": false,
53
+ "use_delta_joint_actions_aloha": false,
54
+ "tokenizer_max_length": 48,
55
+ "num_steps": 10,
56
+ "use_cache": true,
57
+ "freeze_vision_encoder": true,
58
+ "train_expert_only": true,
59
+ "train_state_proj": true,
60
+ "optimizer_lr": 0.0001,
61
+ "optimizer_betas": [
62
+ 0.9,
63
+ 0.95
64
+ ],
65
+ "optimizer_eps": 1e-08,
66
+ "optimizer_weight_decay": 1e-10,
67
+ "optimizer_grad_clip_norm": 10,
68
+ "scheduler_warmup_steps": 1000,
69
+ "scheduler_decay_steps": 30000,
70
+ "scheduler_decay_lr": 2.5e-06,
71
+ "vlm_model_name": "HuggingFaceTB/SmolVLM2-500M-Video-Instruct",
72
+ "load_vlm_weights": false,
73
+ "add_image_special_tokens": false,
74
+ "attention_mode": "cross_attn",
75
+ "prefix_length": -1,
76
+ "pad_language_to": "longest",
77
+ "num_expert_layers": -1,
78
+ "num_vlm_layers": 16,
79
+ "self_attn_every_n_layers": 2,
80
+ "expert_width_multiplier": 0.75,
81
+ "min_period": 0.004,
82
+ "max_period": 4.0
83
+ }
smolvla_omy/checkpoints/001500/pretrained_model/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:33fa0ca7e6fd0c9f2cdb3540843551a45ae0f0fa62369c547288734f7d8d7d6c
+ size 1197790120
smolvla_omy/checkpoints/001500/pretrained_model/train_config.json ADDED
@@ -0,0 +1,195 @@
1
+ {
2
+ "dataset": {
3
+ "repo_id": "lava8888/ArrangeVegetables",
4
+ "root": "./ArrangeVegetables",
5
+ "episodes": null,
6
+ "image_transforms": {
7
+ "enable": false,
8
+ "max_num_transforms": 3,
9
+ "random_order": false,
10
+ "tfs": {
11
+ "brightness": {
12
+ "weight": 1.0,
13
+ "type": "ColorJitter",
14
+ "kwargs": {
15
+ "brightness": [
16
+ 0.8,
17
+ 1.2
18
+ ]
19
+ }
20
+ },
21
+ "contrast": {
22
+ "weight": 1.0,
23
+ "type": "ColorJitter",
24
+ "kwargs": {
25
+ "contrast": [
26
+ 0.8,
27
+ 1.2
28
+ ]
29
+ }
30
+ },
31
+ "saturation": {
32
+ "weight": 1.0,
33
+ "type": "ColorJitter",
34
+ "kwargs": {
35
+ "saturation": [
36
+ 0.5,
37
+ 1.5
38
+ ]
39
+ }
40
+ },
41
+ "hue": {
42
+ "weight": 1.0,
43
+ "type": "ColorJitter",
44
+ "kwargs": {
45
+ "hue": [
46
+ -0.05,
47
+ 0.05
48
+ ]
49
+ }
50
+ },
51
+ "sharpness": {
52
+ "weight": 1.0,
53
+ "type": "SharpnessJitter",
54
+ "kwargs": {
55
+ "sharpness": [
56
+ 0.5,
57
+ 1.5
58
+ ]
59
+ }
60
+ }
61
+ }
62
+ },
63
+ "revision": null,
64
+ "use_imagenet_stats": true,
65
+ "video_backend": "torchcodec"
66
+ },
67
+ "env": null,
68
+ "policy": {
69
+ "type": "smolvla",
70
+ "n_obs_steps": 1,
71
+ "normalization_mapping": {
72
+ "VISUAL": "IDENTITY",
73
+ "STATE": "MEAN_STD",
74
+ "ACTION": "MEAN_STD"
75
+ },
76
+ "input_features": {
77
+ "observation.image": {
78
+ "type": "VISUAL",
79
+ "shape": [
80
+ 3,
81
+ 256,
82
+ 256
83
+ ]
84
+ },
85
+ "observation.wrist_image": {
86
+ "type": "VISUAL",
87
+ "shape": [
88
+ 3,
89
+ 256,
90
+ 256
91
+ ]
92
+ },
93
+ "observation.state": {
94
+ "type": "STATE",
95
+ "shape": [
96
+ 6
97
+ ]
98
+ }
99
+ },
100
+ "output_features": {
101
+ "action": {
102
+ "type": "ACTION",
103
+ "shape": [
104
+ 12
105
+ ]
106
+ }
107
+ },
108
+ "device": "cuda",
109
+ "use_amp": false,
110
+ "chunk_size": 5,
111
+ "n_action_steps": 5,
112
+ "max_state_dim": 32,
113
+ "max_action_dim": 32,
114
+ "resize_imgs_with_padding": [
115
+ 512,
116
+ 512
117
+ ],
118
+ "empty_cameras": 0,
119
+ "adapt_to_pi_aloha": false,
120
+ "use_delta_joint_actions_aloha": false,
121
+ "tokenizer_max_length": 48,
122
+ "num_steps": 10,
123
+ "use_cache": true,
124
+ "freeze_vision_encoder": true,
125
+ "train_expert_only": true,
126
+ "train_state_proj": true,
127
+ "optimizer_lr": 0.0001,
128
+ "optimizer_betas": [
129
+ 0.9,
130
+ 0.95
131
+ ],
132
+ "optimizer_eps": 1e-08,
133
+ "optimizer_weight_decay": 1e-10,
134
+ "optimizer_grad_clip_norm": 10,
135
+ "scheduler_warmup_steps": 1000,
136
+ "scheduler_decay_steps": 30000,
137
+ "scheduler_decay_lr": 2.5e-06,
138
+ "vlm_model_name": "HuggingFaceTB/SmolVLM2-500M-Video-Instruct",
139
+ "load_vlm_weights": false,
140
+ "add_image_special_tokens": false,
141
+ "attention_mode": "cross_attn",
142
+ "prefix_length": -1,
143
+ "pad_language_to": "longest",
144
+ "num_expert_layers": -1,
145
+ "num_vlm_layers": 16,
146
+ "self_attn_every_n_layers": 2,
147
+ "expert_width_multiplier": 0.75,
148
+ "min_period": 0.004,
149
+ "max_period": 4.0
150
+ },
151
+ "output_dir": "ckpt/smolvla_omy",
152
+ "job_name": "smolvla_6",
153
+ "resume": false,
154
+ "seed": 42,
155
+ "num_workers": 24,
156
+ "batch_size": 350,
157
+ "steps": 20000,
158
+ "eval_freq": 5,
159
+ "log_freq": 1,
160
+ "save_checkpoint": true,
161
+ "save_freq": 500,
162
+ "use_policy_training_preset": true,
163
+ "optimizer": {
164
+ "type": "adamw",
165
+ "lr": 0.0001,
166
+ "weight_decay": 1e-10,
167
+ "grad_clip_norm": 10,
168
+ "betas": [
169
+ 0.9,
170
+ 0.95
171
+ ],
172
+ "eps": 1e-08
173
+ },
174
+ "scheduler": {
175
+ "type": "cosine_decay_with_warmup",
176
+ "num_warmup_steps": 1000,
177
+ "num_decay_steps": 30000,
178
+ "peak_lr": 0.0001,
179
+ "decay_lr": 2.5e-06
180
+ },
181
+ "eval": {
182
+ "n_episodes": 50,
183
+ "batch_size": 50,
184
+ "use_async_envs": false
185
+ },
186
+ "wandb": {
187
+ "enable": true,
188
+ "disable_artifact": true,
189
+ "project": "smolVLA",
190
+ "entity": "qualiastudios",
191
+ "notes": "first",
192
+ "run_id": "7",
193
+ "mode": "online"
194
+ }
195
+ }
smolvla_omy/checkpoints/001500/training_state/optimizer_param_groups.json ADDED
@@ -0,0 +1,527 @@
1
+ [
2
+ {
3
+ "lr": 9.939980660401297e-05,
4
+ "betas": [
5
+ 0.9,
6
+ 0.95
7
+ ],
8
+ "eps": 1e-08,
9
+ "weight_decay": 1e-10,
10
+ "amsgrad": false,
11
+ "maximize": false,
12
+ "foreach": null,
13
+ "capturable": false,
14
+ "differentiable": false,
15
+ "fused": null,
16
+ "decoupled_weight_decay": true,
17
+ "initial_lr": 0.0001,
18
+ "params": [
19
+ 0,
20
+ 1,
21
+ 2,
22
+ 3,
23
+ 4,
24
+ 5,
25
+ 6,
26
+ 7,
27
+ 8,
28
+ 9,
29
+ 10,
30
+ 11,
31
+ 12,
32
+ 13,
33
+ 14,
34
+ 15,
35
+ 16,
36
+ 17,
37
+ 18,
38
+ 19,
39
+ 20,
40
+ 21,
41
+ 22,
42
+ 23,
43
+ 24,
44
+ 25,
45
+ 26,
46
+ 27,
47
+ 28,
48
+ 29,
49
+ 30,
50
+ 31,
51
+ 32,
52
+ 33,
53
+ 34,
54
+ 35,
55
+ 36,
56
+ 37,
57
+ 38,
58
+ 39,
59
+ 40,
60
+ 41,
61
+ 42,
62
+ 43,
63
+ 44,
64
+ 45,
65
+ 46,
66
+ 47,
67
+ 48,
68
+ 49,
69
+ 50,
70
+ 51,
71
+ 52,
72
+ 53,
73
+ 54,
74
+ 55,
75
+ 56,
76
+ 57,
77
+ 58,
78
+ 59,
79
+ 60,
80
+ 61,
81
+ 62,
82
+ 63,
83
+ 64,
84
+ 65,
85
+ 66,
86
+ 67,
87
+ 68,
88
+ 69,
89
+ 70,
90
+ 71,
91
+ 72,
92
+ 73,
93
+ 74,
94
+ 75,
95
+ 76,
96
+ 77,
97
+ 78,
98
+ 79,
99
+ 80,
100
+ 81,
101
+ 82,
102
+ 83,
103
+ 84,
104
+ 85,
105
+ 86,
106
+ 87,
107
+ 88,
108
+ 89,
109
+ 90,
110
+ 91,
111
+ 92,
112
+ 93,
113
+ 94,
114
+ 95,
115
+ 96,
116
+ 97,
117
+ 98,
118
+ 99,
119
+ 100,
120
+ 101,
121
+ 102,
122
+ 103,
123
+ 104,
124
+ 105,
125
+ 106,
126
+ 107,
127
+ 108,
128
+ 109,
129
+ 110,
130
+ 111,
131
+ 112,
132
+ 113,
133
+ 114,
134
+ 115,
135
+ 116,
136
+ 117,
137
+ 118,
138
+ 119,
139
+ 120,
140
+ 121,
141
+ 122,
142
+ 123,
143
+ 124,
144
+ 125,
145
+ 126,
146
+ 127,
147
+ 128,
148
+ 129,
149
+ 130,
150
+ 131,
151
+ 132,
152
+ 133,
153
+ 134,
154
+ 135,
155
+ 136,
156
+ 137,
157
+ 138,
158
+ 139,
159
+ 140,
160
+ 141,
161
+ 142,
162
+ 143,
163
+ 144,
164
+ 145,
165
+ 146,
166
+ 147,
167
+ 148,
168
+ 149,
169
+ 150,
170
+ 151,
171
+ 152,
172
+ 153,
173
+ 154,
174
+ 155,
175
+ 156,
176
+ 157,
177
+ 158,
178
+ 159,
179
+ 160,
180
+ 161,
181
+ 162,
182
+ 163,
183
+ 164,
184
+ 165,
185
+ 166,
186
+ 167,
187
+ 168,
188
+ 169,
189
+ 170,
190
+ 171,
191
+ 172,
192
+ 173,
193
+ 174,
194
+ 175,
195
+ 176,
196
+ 177,
197
+ 178,
198
+ 179,
199
+ 180,
200
+ 181,
201
+ 182,
202
+ 183,
203
+ 184,
204
+ 185,
205
+ 186,
206
+ 187,
207
+ 188,
208
+ 189,
209
+ 190,
210
+ 191,
211
+ 192,
212
+ 193,
213
+ 194,
214
+ 195,
215
+ 196,
216
+ 197,
217
+ 198,
218
+ 199,
219
+ 200,
220
+ 201,
221
+ 202,
222
+ 203,
223
+ 204,
224
+ 205,
225
+ 206,
226
+ 207,
227
+ 208,
228
+ 209,
229
+ 210,
230
+ 211,
231
+ 212,
232
+ 213,
233
+ 214,
234
+ 215,
235
+ 216,
236
+ 217,
237
+ 218,
238
+ 219,
239
+ 220,
240
+ 221,
241
+ 222,
242
+ 223,
243
+ 224,
244
+ 225,
245
+ 226,
246
+ 227,
247
+ 228,
248
+ 229,
249
+ 230,
250
+ 231,
251
+ 232,
252
+ 233,
253
+ 234,
254
+ 235,
255
+ 236,
256
+ 237,
257
+ 238,
258
+ 239,
259
+ 240,
260
+ 241,
261
+ 242,
262
+ 243,
263
+ 244,
264
+ 245,
265
+ 246,
266
+ 247,
267
+ 248,
268
+ 249,
269
+ 250,
270
+ 251,
271
+ 252,
272
+ 253,
273
+ 254,
274
+ 255,
275
+ 256,
276
+ 257,
277
+ 258,
278
+ 259,
279
+ 260,
280
+ 261,
281
+ 262,
282
+ 263,
283
+ 264,
284
+ 265,
285
+ 266,
286
+ 267,
287
+ 268,
288
+ 269,
289
+ 270,
290
+ 271,
291
+ 272,
292
+ 273,
293
+ 274,
294
+ 275,
295
+ 276,
296
+ 277,
297
+ 278,
298
+ 279,
299
+ 280,
300
+ 281,
301
+ 282,
302
+ 283,
303
+ 284,
304
+ 285,
305
+ 286,
306
+ 287,
307
+ 288,
308
+ 289,
309
+ 290,
310
+ 291,
311
+ 292,
312
+ 293,
313
+ 294,
314
+ 295,
315
+ 296,
316
+ 297,
317
+ 298,
318
+ 299,
319
+ 300,
320
+ 301,
321
+ 302,
322
+ 303,
323
+ 304,
324
+ 305,
325
+ 306,
326
+ 307,
327
+ 308,
328
+ 309,
329
+ 310,
330
+ 311,
331
+ 312,
332
+ 313,
333
+ 314,
334
+ 315,
335
+ 316,
336
+ 317,
337
+ 318,
338
+ 319,
339
+ 320,
340
+ 321,
341
+ 322,
342
+ 323,
343
+ 324,
344
+ 325,
345
+ 326,
346
+ 327,
347
+ 328,
348
+ 329,
349
+ 330,
350
+ 331,
351
+ 332,
352
+ 333,
353
+ 334,
354
+ 335,
355
+ 336,
356
+ 337,
357
+ 338,
358
+ 339,
359
+ 340,
360
+ 341,
361
+ 342,
362
+ 343,
363
+ 344,
364
+ 345,
365
+ 346,
366
+ 347,
367
+ 348,
368
+ 349,
369
+ 350,
370
+ 351,
371
+ 352,
372
+ 353,
373
+ 354,
374
+ 355,
375
+ 356,
376
+ 357,
377
+ 358,
378
+ 359,
379
+ 360,
380
+ 361,
381
+ 362,
382
+ 363,
383
+ 364,
384
+ 365,
385
+ 366,
386
+ 367,
387
+ 368,
388
+ 369,
389
+ 370,
390
+ 371,
391
+ 372,
392
+ 373,
393
+ 374,
394
+ 375,
395
+ 376,
396
+ 377,
397
+ 378,
398
+ 379,
399
+ 380,
400
+ 381,
401
+ 382,
402
+ 383,
403
+ 384,
404
+ 385,
405
+ 386,
406
+ 387,
407
+ 388,
408
+ 389,
409
+ 390,
410
+ 391,
411
+ 392,
412
+ 393,
413
+ 394,
414
+ 395,
415
+ 396,
416
+ 397,
417
+ 398,
418
+ 399,
419
+ 400,
420
+ 401,
421
+ 402,
422
+ 403,
423
+ 404,
424
+ 405,
425
+ 406,
426
+ 407,
427
+ 408,
428
+ 409,
429
+ 410,
430
+ 411,
431
+ 412,
432
+ 413,
433
+ 414,
434
+ 415,
435
+ 416,
436
+ 417,
437
+ 418,
438
+ 419,
439
+ 420,
440
+ 421,
441
+ 422,
442
+ 423,
443
+ 424,
444
+ 425,
445
+ 426,
446
+ 427,
447
+ 428,
448
+ 429,
449
+ 430,
450
+ 431,
451
+ 432,
452
+ 433,
453
+ 434,
454
+ 435,
455
+ 436,
456
+ 437,
457
+ 438,
458
+ 439,
459
+ 440,
460
+ 441,
461
+ 442,
462
+ 443,
463
+ 444,
464
+ 445,
465
+ 446,
466
+ 447,
467
+ 448,
468
+ 449,
469
+ 450,
470
+ 451,
471
+ 452,
472
+ 453,
473
+ 454,
474
+ 455,
475
+ 456,
476
+ 457,
477
+ 458,
478
+ 459,
479
+ 460,
480
+ 461,
481
+ 462,
482
+ 463,
483
+ 464,
484
+ 465,
485
+ 466,
486
+ 467,
487
+ 468,
488
+ 469,
489
+ 470,
490
+ 471,
491
+ 472,
492
+ 473,
493
+ 474,
494
+ 475,
495
+ 476,
496
+ 477,
497
+ 478,
498
+ 479,
499
+ 480,
500
+ 481,
501
+ 482,
502
+ 483,
503
+ 484,
504
+ 485,
505
+ 486,
506
+ 487,
507
+ 488,
508
+ 489,
509
+ 490,
510
+ 491,
511
+ 492,
512
+ 493,
513
+ 494,
514
+ 495,
515
+ 496,
516
+ 497,
517
+ 498,
518
+ 499,
519
+ 500,
520
+ 501,
521
+ 502,
522
+ 503,
523
+ 504,
524
+ 505
525
+ ]
526
+ }
527
+ ]
smolvla_omy/checkpoints/001500/training_state/optimizer_state.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f69818c83e50953a1ba5f6ef94e3ac3f2e368ab3c02b239f371df3e401657e40
+ size 412659164
smolvla_omy/checkpoints/001500/training_state/rng_state.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2a179c8c79cdb88ba975817e9c82731e95cc15b6771637db1d5e601c13ca39d4
+ size 15708
smolvla_omy/checkpoints/001500/training_state/scheduler_state.json ADDED
@@ -0,0 +1,15 @@
+ {
+ "base_lrs": [
+ 0.0001
+ ],
+ "last_epoch": 1500,
+ "_step_count": 1501,
+ "_is_initial": false,
+ "_get_lr_called_within_step": false,
+ "_last_lr": [
+ 9.939980660401297e-05
+ ],
+ "lr_lambdas": [
+ null
+ ]
+ }
smolvla_omy/checkpoints/001500/training_state/training_step.json ADDED
@@ -0,0 +1,3 @@
+ {
+ "step": 1500
+ }
smolvla_omy/checkpoints/002000/pretrained_model/config.json ADDED
@@ -0,0 +1,83 @@
1
+ {
2
+ "type": "smolvla",
3
+ "n_obs_steps": 1,
4
+ "normalization_mapping": {
5
+ "VISUAL": "IDENTITY",
6
+ "STATE": "MEAN_STD",
7
+ "ACTION": "MEAN_STD"
8
+ },
9
+ "input_features": {
10
+ "observation.image": {
11
+ "type": "VISUAL",
12
+ "shape": [
13
+ 3,
14
+ 256,
15
+ 256
16
+ ]
17
+ },
18
+ "observation.wrist_image": {
19
+ "type": "VISUAL",
20
+ "shape": [
21
+ 3,
22
+ 256,
23
+ 256
24
+ ]
25
+ },
26
+ "observation.state": {
27
+ "type": "STATE",
28
+ "shape": [
29
+ 6
30
+ ]
31
+ }
32
+ },
33
+ "output_features": {
34
+ "action": {
35
+ "type": "ACTION",
36
+ "shape": [
37
+ 12
38
+ ]
39
+ }
40
+ },
41
+ "device": "cuda",
42
+ "use_amp": false,
43
+ "chunk_size": 5,
44
+ "n_action_steps": 5,
45
+ "max_state_dim": 32,
46
+ "max_action_dim": 32,
47
+ "resize_imgs_with_padding": [
48
+ 512,
49
+ 512
50
+ ],
51
+ "empty_cameras": 0,
52
+ "adapt_to_pi_aloha": false,
53
+ "use_delta_joint_actions_aloha": false,
54
+ "tokenizer_max_length": 48,
55
+ "num_steps": 10,
56
+ "use_cache": true,
57
+ "freeze_vision_encoder": true,
58
+ "train_expert_only": true,
59
+ "train_state_proj": true,
60
+ "optimizer_lr": 0.0001,
61
+ "optimizer_betas": [
62
+ 0.9,
63
+ 0.95
64
+ ],
65
+ "optimizer_eps": 1e-08,
66
+ "optimizer_weight_decay": 1e-10,
67
+ "optimizer_grad_clip_norm": 10,
68
+ "scheduler_warmup_steps": 1000,
69
+ "scheduler_decay_steps": 30000,
70
+ "scheduler_decay_lr": 2.5e-06,
71
+ "vlm_model_name": "HuggingFaceTB/SmolVLM2-500M-Video-Instruct",
72
+ "load_vlm_weights": false,
73
+ "add_image_special_tokens": false,
74
+ "attention_mode": "cross_attn",
75
+ "prefix_length": -1,
76
+ "pad_language_to": "longest",
77
+ "num_expert_layers": -1,
78
+ "num_vlm_layers": 16,
79
+ "self_attn_every_n_layers": 2,
80
+ "expert_width_multiplier": 0.75,
81
+ "min_period": 0.004,
82
+ "max_period": 4.0
83
+ }
smolvla_omy/checkpoints/002000/pretrained_model/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a7846f08cf57e1b6150715c56572a2832b2c45310272210a9918ea6c6e405c95
+ size 1197790120
smolvla_omy/checkpoints/002000/pretrained_model/train_config.json ADDED
@@ -0,0 +1,195 @@
1
+ {
2
+ "dataset": {
3
+ "repo_id": "lava8888/ArrangeVegetables",
4
+ "root": "./ArrangeVegetables",
5
+ "episodes": null,
6
+ "image_transforms": {
7
+ "enable": false,
8
+ "max_num_transforms": 3,
9
+ "random_order": false,
10
+ "tfs": {
11
+ "brightness": {
12
+ "weight": 1.0,
13
+ "type": "ColorJitter",
14
+ "kwargs": {
15
+ "brightness": [
16
+ 0.8,
17
+ 1.2
18
+ ]
19
+ }
20
+ },
21
+ "contrast": {
22
+ "weight": 1.0,
23
+ "type": "ColorJitter",
24
+ "kwargs": {
25
+ "contrast": [
26
+ 0.8,
27
+ 1.2
28
+ ]
29
+ }
30
+ },
31
+ "saturation": {
32
+ "weight": 1.0,
33
+ "type": "ColorJitter",
34
+ "kwargs": {
35
+ "saturation": [
36
+ 0.5,
37
+ 1.5
38
+ ]
39
+ }
40
+ },
41
+ "hue": {
42
+ "weight": 1.0,
43
+ "type": "ColorJitter",
44
+ "kwargs": {
45
+ "hue": [
46
+ -0.05,
47
+ 0.05
48
+ ]
49
+ }
50
+ },
51
+ "sharpness": {
52
+ "weight": 1.0,
53
+ "type": "SharpnessJitter",
54
+ "kwargs": {
55
+ "sharpness": [
56
+ 0.5,
57
+ 1.5
58
+ ]
59
+ }
60
+ }
61
+ }
62
+ },
63
+ "revision": null,
64
+ "use_imagenet_stats": true,
65
+ "video_backend": "torchcodec"
66
+ },
67
+ "env": null,
68
+ "policy": {
69
+ "type": "smolvla",
70
+ "n_obs_steps": 1,
71
+ "normalization_mapping": {
72
+ "VISUAL": "IDENTITY",
73
+ "STATE": "MEAN_STD",
74
+ "ACTION": "MEAN_STD"
75
+ },
76
+ "input_features": {
77
+ "observation.image": {
78
+ "type": "VISUAL",
79
+ "shape": [
80
+ 3,
81
+ 256,
82
+ 256
83
+ ]
84
+ },
85
+ "observation.wrist_image": {
86
+ "type": "VISUAL",
87
+ "shape": [
88
+ 3,
89
+ 256,
90
+ 256
91
+ ]
92
+ },
93
+ "observation.state": {
94
+ "type": "STATE",
95
+ "shape": [
96
+ 6
97
+ ]
98
+ }
99
+ },
100
+ "output_features": {
101
+ "action": {
102
+ "type": "ACTION",
103
+ "shape": [
104
+ 12
105
+ ]
106
+ }
107
+ },
108
+ "device": "cuda",
109
+ "use_amp": false,
110
+ "chunk_size": 5,
111
+ "n_action_steps": 5,
112
+ "max_state_dim": 32,
113
+ "max_action_dim": 32,
114
+ "resize_imgs_with_padding": [
115
+ 512,
116
+ 512
117
+ ],
118
+ "empty_cameras": 0,
119
+ "adapt_to_pi_aloha": false,
120
+ "use_delta_joint_actions_aloha": false,
121
+ "tokenizer_max_length": 48,
122
+ "num_steps": 10,
123
+ "use_cache": true,
124
+ "freeze_vision_encoder": true,
125
+ "train_expert_only": true,
126
+ "train_state_proj": true,
127
+ "optimizer_lr": 0.0001,
128
+ "optimizer_betas": [
129
+ 0.9,
130
+ 0.95
131
+ ],
132
+ "optimizer_eps": 1e-08,
133
+ "optimizer_weight_decay": 1e-10,
134
+ "optimizer_grad_clip_norm": 10,
135
+ "scheduler_warmup_steps": 1000,
136
+ "scheduler_decay_steps": 30000,
137
+ "scheduler_decay_lr": 2.5e-06,
138
+ "vlm_model_name": "HuggingFaceTB/SmolVLM2-500M-Video-Instruct",
139
+ "load_vlm_weights": false,
140
+ "add_image_special_tokens": false,
141
+ "attention_mode": "cross_attn",
142
+ "prefix_length": -1,
143
+ "pad_language_to": "longest",
144
+ "num_expert_layers": -1,
145
+ "num_vlm_layers": 16,
146
+ "self_attn_every_n_layers": 2,
147
+ "expert_width_multiplier": 0.75,
148
+ "min_period": 0.004,
149
+ "max_period": 4.0
150
+ },
151
+ "output_dir": "ckpt/smolvla_omy",
152
+ "job_name": "smolvla_6",
153
+ "resume": false,
154
+ "seed": 42,
155
+ "num_workers": 24,
156
+ "batch_size": 350,
157
+ "steps": 20000,
158
+ "eval_freq": 5,
159
+ "log_freq": 1,
160
+ "save_checkpoint": true,
161
+ "save_freq": 500,
162
+ "use_policy_training_preset": true,
163
+ "optimizer": {
164
+ "type": "adamw",
165
+ "lr": 0.0001,
166
+ "weight_decay": 1e-10,
167
+ "grad_clip_norm": 10,
168
+ "betas": [
169
+ 0.9,
170
+ 0.95
171
+ ],
172
+ "eps": 1e-08
173
+ },
174
+ "scheduler": {
175
+ "type": "cosine_decay_with_warmup",
176
+ "num_warmup_steps": 1000,
177
+ "num_decay_steps": 30000,
178
+ "peak_lr": 0.0001,
179
+ "decay_lr": 2.5e-06
180
+ },
181
+ "eval": {
182
+ "n_episodes": 50,
183
+ "batch_size": 50,
184
+ "use_async_envs": false
185
+ },
186
+ "wandb": {
187
+ "enable": true,
188
+ "disable_artifact": true,
189
+ "project": "smolVLA",
190
+ "entity": "qualiastudios",
191
+ "notes": "first",
192
+ "run_id": "7",
193
+ "mode": "online"
194
+ }
195
+ }
smolvla_omy/checkpoints/002000/training_state/optimizer_param_groups.json ADDED
@@ -0,0 +1,527 @@
1
+ [
2
+ {
3
+ "lr": 9.893469553577303e-05,
4
+ "betas": [
5
+ 0.9,
6
+ 0.95
7
+ ],
8
+ "eps": 1e-08,
9
+ "weight_decay": 1e-10,
10
+ "amsgrad": false,
11
+ "maximize": false,
12
+ "foreach": null,
13
+ "capturable": false,
14
+ "differentiable": false,
15
+ "fused": null,
16
+ "decoupled_weight_decay": true,
17
+ "initial_lr": 0.0001,
18
+ "params": [
19
+ 0,
20
+ 1,
21
+ 2,
22
+ 3,
23
+ 4,
24
+ 5,
25
+ 6,
26
+ 7,
27
+ 8,
28
+ 9,
29
+ 10,
30
+ 11,
31
+ 12,
32
+ 13,
33
+ 14,
34
+ 15,
35
+ 16,
36
+ 17,
37
+ 18,
38
+ 19,
39
+ 20,
40
+ 21,
41
+ 22,
42
+ 23,
43
+ 24,
44
+ 25,
45
+ 26,
46
+ 27,
47
+ 28,
48
+ 29,
49
+ 30,
50
+ 31,
51
+ 32,
52
+ 33,
53
+ 34,
54
+ 35,
55
+ 36,
56
+ 37,
57
+ 38,
58
+ 39,
59
+ 40,
60
+ 41,
61
+ 42,
62
+ 43,
63
+ 44,
64
+ 45,
65
+ 46,
66
+ 47,
67
+ 48,
68
+ 49,
69
+ 50,
70
+ 51,
71
+ 52,
72
+ 53,
73
+ 54,
74
+ 55,
75
+ 56,
76
+ 57,
77
+ 58,
78
+ 59,
79
+ 60,
80
+ 61,
81
+ 62,
82
+ 63,
83
+ 64,
84
+ 65,
85
+ 66,
86
+ 67,
87
+ 68,
88
+ 69,
89
+ 70,
90
+ 71,
91
+ 72,
92
+ 73,
93
+ 74,
94
+ 75,
95
+ 76,
96
+ 77,
97
+ 78,
98
+ 79,
99
+ 80,
100
+ 81,
101
+ 82,
102
+ 83,
103
+ 84,
104
+ 85,
105
+ 86,
106
+ 87,
107
+ 88,
108
+ 89,
109
+ 90,
110
+ 91,
111
+ 92,
112
+ 93,
113
+ 94,
114
+ 95,
115
+ 96,
116
+ 97,
117
+ 98,
118
+ 99,
119
+ 100,
120
+ 101,
121
+ 102,
122
+ 103,
123
+ 104,
124
+ 105,
125
+ 106,
126
+ 107,
127
+ 108,
128
+ 109,
129
+ 110,
130
+ 111,
131
+ 112,
132
+ 113,
133
+ 114,
134
+ 115,
135
+ 116,
136
+ 117,
137
+ 118,
138
+ 119,
139
+ 120,
140
+ 121,
141
+ 122,
142
+ 123,
143
+ 124,
144
+ 125,
145
+ 126,
146
+ 127,
147
+ 128,
148
+ 129,
149
+ 130,
150
+ 131,
151
+ 132,
152
+ 133,
153
+ 134,
154
+ 135,
155
+ 136,
156
+ 137,
157
+ 138,
158
+ 139,
159
+ 140,
160
+ 141,
161
+ 142,
162
+ 143,
163
+ 144,
164
+ 145,
165
+ 146,
166
+ 147,
167
+ 148,
168
+ 149,
169
+ 150,
170
+ 151,
171
+ 152,
172
+ 153,
173
+ 154,
174
+ 155,
175
+ 156,
176
+ 157,
177
+ 158,
178
+ 159,
179
+ 160,
180
+ 161,
181
+ 162,
182
+ 163,
183
+ 164,
184
+ 165,
185
+ 166,
186
+ 167,
187
+ 168,
188
+ 169,
189
+ 170,
190
+ 171,
191
+ 172,
192
+ 173,
193
+ 174,
194
+ 175,
195
+ 176,
196
+ 177,
197
+ 178,
198
+ 179,
199
+ 180,
200
+ 181,
201
+ 182,
202
+ 183,
203
+ 184,
204
+ 185,
205
+ 186,
206
+ 187,
207
+ 188,
208
+ 189,
209
+ 190,
210
+ 191,
211
+ 192,
212
+ 193,
213
+ 194,
214
+ 195,
215
+ 196,
216
+ 197,
217
+ 198,
218
+ 199,
219
+ 200,
220
+ 201,
221
+ 202,
222
+ 203,
223
+ 204,
224
+ 205,
225
+ 206,
226
+ 207,
227
+ 208,
228
+ 209,
229
+ 210,
230
+ 211,
231
+ 212,
232
+ 213,
233
+ 214,
234
+ 215,
235
+ 216,
236
+ 217,
237
+ 218,
238
+ 219,
239
+ 220,
240
+ 221,
241
+ 222,
242
+ 223,
243
+ 224,
244
+ 225,
245
+ 226,
246
+ 227,
247
+ 228,
248
+ 229,
249
+ 230,
250
+ 231,
251
+ 232,
252
+ 233,
253
+ 234,
254
+ 235,
255
+ 236,
256
+ 237,
257
+ 238,
258
+ 239,
259
+ 240,
260
+ 241,
261
+ 242,
262
+ 243,
263
+ 244,
264
+ 245,
265
+ 246,
266
+ 247,
267
+ 248,
268
+ 249,
269
+ 250,
270
+ 251,
271
+ 252,
272
+ 253,
273
+ 254,
274
+ 255,
275
+ 256,
276
+ 257,
277
+ 258,
278
+ 259,
279
+ 260,
280
+ 261,
281
+ 262,
282
+ 263,
283
+ 264,
284
+ 265,
285
+ 266,
286
+ 267,
287
+ 268,
288
+ 269,
289
+ 270,
290
+ 271,
291
+ 272,
292
+ 273,
293
+ 274,
294
+ 275,
295
+ 276,
296
+ 277,
297
+ 278,
298
+ 279,
299
+ 280,
300
+ 281,
301
+ 282,
302
+ 283,
303
+ 284,
304
+ 285,
305
+ 286,
306
+ 287,
307
+ 288,
308
+ 289,
309
+ 290,
310
+ 291,
311
+ 292,
312
+ 293,
313
+ 294,
314
+ 295,
315
+ 296,
316
+ 297,
317
+ 298,
318
+ 299,
319
+ 300,
320
+ 301,
321
+ 302,
322
+ 303,
323
+ 304,
324
+ 305,
325
+ 306,
326
+ 307,
327
+ 308,
328
+ 309,
329
+ 310,
330
+ 311,
331
+ 312,
332
+ 313,
333
+ 314,
334
+ 315,
335
+ 316,
336
+ 317,
337
+ 318,
338
+ 319,
339
+ 320,
340
+ 321,
341
+ 322,
342
+ 323,
343
+ 324,
344
+ 325,
345
+ 326,
346
+ 327,
347
+ 328,
348
+ 329,
349
+ 330,
350
+ 331,
351
+ 332,
352
+ 333,
353
+ 334,
354
+ 335,
355
+ 336,
356
+ 337,
357
+ 338,
358
+ 339,
359
+ 340,
360
+ 341,
361
+ 342,
362
+ 343,
363
+ 344,
364
+ 345,
365
+ 346,
366
+ 347,
367
+ 348,
368
+ 349,
369
+ 350,
370
+ 351,
371
+ 352,
372
+ 353,
373
+ 354,
374
+ 355,
375
+ 356,
376
+ 357,
377
+ 358,
378
+ 359,
379
+ 360,
380
+ 361,
381
+ 362,
382
+ 363,
383
+ 364,
384
+ 365,
385
+ 366,
386
+ 367,
387
+ 368,
388
+ 369,
389
+ 370,
390
+ 371,
391
+ 372,
392
+ 373,
393
+ 374,
394
+ 375,
395
+ 376,
396
+ 377,
397
+ 378,
398
+ 379,
399
+ 380,
400
+ 381,
401
+ 382,
402
+ 383,
403
+ 384,
404
+ 385,
405
+ 386,
406
+ 387,
407
+ 388,
408
+ 389,
409
+ 390,
410
+ 391,
411
+ 392,
412
+ 393,
413
+ 394,
414
+ 395,
415
+ 396,
416
+ 397,
417
+ 398,
418
+ 399,
419
+ 400,
420
+ 401,
421
+ 402,
422
+ 403,
423
+ 404,
424
+ 405,
425
+ 406,
426
+ 407,
427
+ 408,
428
+ 409,
429
+ 410,
430
+ 411,
431
+ 412,
432
+ 413,
433
+ 414,
434
+ 415,
435
+ 416,
436
+ 417,
437
+ 418,
438
+ 419,
439
+ 420,
440
+ 421,
441
+ 422,
442
+ 423,
443
+ 424,
444
+ 425,
445
+ 426,
446
+ 427,
447
+ 428,
448
+ 429,
449
+ 430,
450
+ 431,
451
+ 432,
452
+ 433,
453
+ 434,
454
+ 435,
455
+ 436,
456
+ 437,
457
+ 438,
458
+ 439,
459
+ 440,
460
+ 441,
461
+ 442,
462
+ 443,
463
+ 444,
464
+ 445,
465
+ 446,
466
+ 447,
467
+ 448,
468
+ 449,
469
+ 450,
470
+ 451,
471
+ 452,
472
+ 453,
473
+ 454,
474
+ 455,
475
+ 456,
476
+ 457,
477
+ 458,
478
+ 459,
479
+ 460,
480
+ 461,
481
+ 462,
482
+ 463,
483
+ 464,
484
+ 465,
485
+ 466,
486
+ 467,
487
+ 468,
488
+ 469,
489
+ 470,
490
+ 471,
491
+ 472,
492
+ 473,
493
+ 474,
494
+ 475,
495
+ 476,
496
+ 477,
497
+ 478,
498
+ 479,
499
+ 480,
500
+ 481,
501
+ 482,
502
+ 483,
503
+ 484,
504
+ 485,
505
+ 486,
506
+ 487,
507
+ 488,
508
+ 489,
509
+ 490,
510
+ 491,
511
+ 492,
512
+ 493,
513
+ 494,
514
+ 495,
515
+ 496,
516
+ 497,
517
+ 498,
518
+ 499,
519
+ 500,
520
+ 501,
521
+ 502,
522
+ 503,
523
+ 504,
524
+ 505
525
+ ]
526
+ }
527
+ ]
smolvla_omy/checkpoints/002000/training_state/optimizer_state.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5229549d85011fd10f8a07f3bab05c46574efda6b5ed83fede75603df7cd6a63
+ size 412659164
smolvla_omy/checkpoints/002000/training_state/rng_state.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b6d56fa8d349abc86d627d060792a552901510d5975728bd5910d9fea0f7d16d
+ size 15708
smolvla_omy/checkpoints/002000/training_state/scheduler_state.json ADDED
@@ -0,0 +1,15 @@
+ {
+ "base_lrs": [
+ 0.0001
+ ],
+ "last_epoch": 2000,
+ "_step_count": 2001,
+ "_is_initial": false,
+ "_get_lr_called_within_step": false,
+ "_last_lr": [
+ 9.893469553577303e-05
+ ],
+ "lr_lambdas": [
+ null
+ ]
+ }
smolvla_omy/checkpoints/002000/training_state/training_step.json ADDED
@@ -0,0 +1,3 @@
+ {
+ "step": 2000
+ }
smolvla_omy/wandb/debug-internal.log ADDED
@@ -0,0 +1,6 @@
+ {"time":"2025-08-10T17:01:18.58226796Z","level":"INFO","msg":"stream: starting","core version":"0.21.1"}
+ {"time":"2025-08-10T17:01:18.781374731Z","level":"INFO","msg":"stream: created new stream","id":"7"}
+ {"time":"2025-08-10T17:01:18.781419518Z","level":"INFO","msg":"stream: started","id":"7"}
+ {"time":"2025-08-10T17:01:18.781467058Z","level":"INFO","msg":"writer: started","stream_id":"7"}
+ {"time":"2025-08-10T17:01:18.78151031Z","level":"INFO","msg":"sender: started","stream_id":"7"}
+ {"time":"2025-08-10T17:01:18.781544725Z","level":"INFO","msg":"handler: started","stream_id":"7"}
@@ -0,0 +1,21 @@
 
+ 2025-08-10 17:01:18,364 INFO MainThread:13371 [wandb_setup.py:_flush():80] Current SDK version is 0.21.1
+ 2025-08-10 17:01:18,364 INFO MainThread:13371 [wandb_setup.py:_flush():80] Configure stats pid to 13371
+ 2025-08-10 17:01:18,364 INFO MainThread:13371 [wandb_setup.py:_flush():80] Loading settings from /home/ubuntu/.config/wandb/settings
+ 2025-08-10 17:01:18,364 INFO MainThread:13371 [wandb_setup.py:_flush():80] Loading settings from /home/ubuntu/hyperstack_smolvla/wandb/settings
+ 2025-08-10 17:01:18,364 INFO MainThread:13371 [wandb_setup.py:_flush():80] Loading settings from environment variables
+ 2025-08-10 17:01:18,364 INFO MainThread:13371 [wandb_init.py:setup_run_log_directory():703] Logging user logs to ckpt/smolvla_omy/wandb/run-20250810_170118-7/logs/debug.log
+ 2025-08-10 17:01:18,364 INFO MainThread:13371 [wandb_init.py:setup_run_log_directory():704] Logging internal logs to ckpt/smolvla_omy/wandb/run-20250810_170118-7/logs/debug-internal.log
+ 2025-08-10 17:01:18,364 INFO MainThread:13371 [wandb_init.py:init():830] calling init triggers
+ 2025-08-10 17:01:18,365 INFO MainThread:13371 [wandb_init.py:init():835] wandb.init called with sweep_config: {}
+ config: {'dataset': {'repo_id': 'lava8888/ArrangeVegetables', 'root': './ArrangeVegetables', 'episodes': None, 'image_transforms': {'enable': False, 'max_num_transforms': 3, 'random_order': False, 'tfs': {'brightness': {'weight': 1.0, 'type': 'ColorJitter', 'kwargs': {'brightness': [0.8, 1.2]}}, 'contrast': {'weight': 1.0, 'type': 'ColorJitter', 'kwargs': {'contrast': [0.8, 1.2]}}, 'saturation': {'weight': 1.0, 'type': 'ColorJitter', 'kwargs': {'saturation': [0.5, 1.5]}}, 'hue': {'weight': 1.0, 'type': 'ColorJitter', 'kwargs': {'hue': [-0.05, 0.05]}}, 'sharpness': {'weight': 1.0, 'type': 'SharpnessJitter', 'kwargs': {'sharpness': [0.5, 1.5]}}}}, 'revision': None, 'use_imagenet_stats': True, 'video_backend': 'torchcodec'}, 'env': None, 'policy': {'type': 'smolvla', 'n_obs_steps': 1, 'normalization_mapping': {'VISUAL': <NormalizationMode.IDENTITY: 'IDENTITY'>, 'STATE': <NormalizationMode.MEAN_STD: 'MEAN_STD'>, 'ACTION': <NormalizationMode.MEAN_STD: 'MEAN_STD'>}, 'input_features': {}, 'output_features': {}, 'device': 'cuda', 'use_amp': False, 'chunk_size': 5, 'n_action_steps': 5, 'max_state_dim': 32, 'max_action_dim': 32, 'resize_imgs_with_padding': [512, 512], 'empty_cameras': 0, 'adapt_to_pi_aloha': False, 'use_delta_joint_actions_aloha': False, 'tokenizer_max_length': 48, 'num_steps': 10, 'use_cache': True, 'freeze_vision_encoder': True, 'train_expert_only': True, 'train_state_proj': True, 'optimizer_lr': 0.0001, 'optimizer_betas': [0.9, 0.95], 'optimizer_eps': 1e-08, 'optimizer_weight_decay': 1e-10, 'optimizer_grad_clip_norm': 10, 'scheduler_warmup_steps': 1000, 'scheduler_decay_steps': 30000, 'scheduler_decay_lr': 2.5e-06, 'vlm_model_name': 'HuggingFaceTB/SmolVLM2-500M-Video-Instruct', 'load_vlm_weights': False, 'add_image_special_tokens': False, 'attention_mode': 'cross_attn', 'prefix_length': -1, 'pad_language_to': 'longest', 'num_expert_layers': -1, 'num_vlm_layers': 16, 'self_attn_every_n_layers': 2, 'expert_width_multiplier': 0.75, 'min_period': 0.004, 'max_period': 4.0}, 'output_dir': 'ckpt/smolvla_omy', 'job_name': 'smolvla_6', 'resume': False, 'seed': 42, 'num_workers': 24, 'batch_size': 350, 'steps': 20000, 'eval_freq': 5, 'log_freq': 1, 'save_checkpoint': True, 'save_freq': 500, 'use_policy_training_preset': True, 'optimizer': {'type': 'adamw', 'lr': 0.0001, 'weight_decay': 1e-10, 'grad_clip_norm': 10, 'betas': [0.9, 0.95], 'eps': 1e-08}, 'scheduler': {'type': 'cosine_decay_with_warmup', 'num_warmup_steps': 1000, 'num_decay_steps': 30000, 'peak_lr': 0.0001, 'decay_lr': 2.5e-06}, 'eval': {'n_episodes': 50, 'batch_size': 50, 'use_async_envs': False}, 'wandb': {'enable': True, 'disable_artifact': True, 'project': 'smolVLA', 'entity': 'qualiastudios', 'notes': 'first', 'run_id': '7', 'mode': 'online'}, '_wandb': {}}
+ 2025-08-10 17:01:18,365 INFO MainThread:13371 [wandb_init.py:init():871] starting backend
+ 2025-08-10 17:01:18,574 INFO MainThread:13371 [wandb_init.py:init():874] sending inform_init request
+ 2025-08-10 17:01:18,579 INFO MainThread:13371 [wandb_init.py:init():882] backend started and connected
+ 2025-08-10 17:01:18,582 INFO MainThread:13371 [wandb_init.py:init():953] updated telemetry
+ 2025-08-10 17:01:18,586 INFO MainThread:13371 [wandb_init.py:init():977] communicating run to backend with 90.0 second timeout
+ 2025-08-10 17:01:18,955 INFO MainThread:13371 [wandb_init.py:init():1029] starting run threads in backend
+ 2025-08-10 17:01:19,105 INFO MainThread:13371 [wandb_run.py:_console_start():2494] atexit reg
+ 2025-08-10 17:01:19,105 INFO MainThread:13371 [wandb_run.py:_redirect():2342] redirect: wrap_raw
+ 2025-08-10 17:01:19,106 INFO MainThread:13371 [wandb_run.py:_redirect():2411] Wrapping output streams.
+ 2025-08-10 17:01:19,106 INFO MainThread:13371 [wandb_run.py:_redirect():2434] Redirects installed.
+ 2025-08-10 17:01:19,107 INFO MainThread:13371 [wandb_init.py:init():1075] run started, returning control to user process
smolvla_omy/wandb/run-20250810_170118-7/files/output.log ADDED
The diff for this file is too large to render. See raw diff
smolvla_omy/wandb/run-20250810_170118-7/files/requirements.txt ADDED
@@ -0,0 +1,238 @@
+ Pygments==2.19.2
+ numba==0.61.2
+ kiwisolver==1.4.8
+ nest-asyncio==1.6.0
+ pyarrow==21.0.0
+ nvidia-cuda-cupti-cu12==12.8.90
+ python-xlib==0.33
+ deepdiff==8.6.0
+ python-dateutil==2.9.0.post0
+ mergedeep==1.3.4
+ numcodecs==0.13.1
+ charset-normalizer==3.4.3
+ multidict==6.6.3
+ torchcodec==0.6.0
+ imageio==2.37.0
+ prompt_toolkit==3.0.51
+ GitPython==3.1.45
+ tqdm==4.67.1
+ typing-inspect==0.9.0
+ accelerate==1.10.0
+ jupyter_core==5.8.1
+ jupyter_client==8.6.3
+ cffi==1.17.1
+ beautifulsoup4==4.13.4
+ urllib3==2.5.0
+ mypy_extensions==1.1.0
+ debugpy==1.8.16
+ pfzy==0.3.4
+ decorator==5.2.1
+ matplotlib-inline==0.1.7
+ tokenizers==0.21.4
+ Flask==3.1.1
+ click==8.2.1
+ cloudpickle==3.1.1
+ nvidia-cuda-runtime-cu12==12.8.90
+ nvidia-cuda-nvrtc-cu12==12.8.93
+ nvidia-cusolver-cu12==11.7.3.90
+ multiprocess==0.70.16
+ gymnasium==0.29.1
+ typing_extensions==4.14.1
+ nvidia-nccl-cu12==2.27.3
+ toml==0.10.2
+ executing==2.2.0
+ fsspec==2025.3.0
+ itsdangerous==2.2.0
+ huggingface-hub==0.34.4
+ pandas==2.3.1
+ networkx==3.4.2
+ pyzmq==27.0.1
+ wandb==0.21.1
+ fasteners==0.19
+ smmap==5.0.2
+ torch==2.8.0
+ asciitree==0.3.3
+ evdev==1.6.1
+ aiohappyeyeballs==2.6.1
+ packaging==25.0
+ regex==2025.7.34
+ wcwidth==0.2.13
+ huggingface==0.0.1
+ annotated-types==0.7.0
+ nvidia-nvtx-cu12==12.8.90
+ jedi==0.19.2
+ PyYAML==6.0.2
+ PySocks==1.7.1
+ pyyaml-include==1.4.1
+ attrs==25.3.0
+ aiohttp==3.12.15
+ num2words==0.5.14
+ aiosignal==1.4.0
+ hf-xet==1.1.7
+ draccus==0.10.0
+ exceptiongroup==1.3.0
+ contourpy==1.3.2
+ sentry-sdk==2.34.1
+ xxhash==3.5.0
+ sympy==1.14.0
+ asttokens==3.0.0
+ nvidia-cusparselt-cu12==0.7.1
+ lerobot==0.1.0
+ soupsieve==2.7
+ frozenlist==1.7.0
+ gitdb==4.0.12
+ psutil==7.0.0
+ Werkzeug==3.1.3
+ PyScreeze==1.0.1
+ pymunk==6.11.1
+ nvidia-cublas-cu12==12.8.4.1
+ pillow==11.3.0
+ PyMsgBox==1.0.9
+ async-timeout==5.0.1
+ scipy==1.15.3
+ traitlets==5.14.3
+ h5py==3.14.0
+ python3-xlib==0.15
+ fonttools==4.59.0
+ requests==2.32.4
+ gdown==5.2.0
+ PyAutoGUI==0.9.54
+ typing-inspection==0.4.1
+ PyRect==0.2.0
+ einops==0.8.1
+ omegaconf==2.3.0
+ stack-data==0.6.3
+ numpy==2.2.6
+ jsonlines==4.0.0
+ parso==0.8.4
+ matplotlib==3.10.5
+ comm==0.2.3
+ tornado==6.5.2
+ ipython==8.37.0
+ pynput==1.8.1
+ zarr==2.18.3
+ pyperclip==1.9.0
+ nvidia-nvjitlink-cu12==12.8.93
+ orderly-set==5.5.0
+ MarkupSafe==3.0.2
+ pydantic==2.11.7
+ inquirerpy==0.3.4
+ diffusers==0.34.0
+ ipykernel==6.30.1
+ transformers==4.50.3
+ dill==0.3.8
+ Jinja2==3.1.6
+ rerun-sdk==0.24.1
+ blinker==1.9.0
+ cmake==4.0.3
+ termcolor==3.1.0
+ propcache==0.3.2
+ nvidia-cudnn-cu12==9.10.2.21
+ PyGetWindow==0.0.9
+ docopt==0.6.2
+ nvidia-cufile-cu12==1.13.1.3
+ pure_eval==0.2.3
+ nvidia-cusparse-cu12==12.5.8.93
+ pycparser==2.22
+ llvmlite==0.44.0
+ imageio-ffmpeg==0.6.0
+ yarl==1.20.1
+ safetensors==0.6.2
+ nvidia-curand-cu12==10.3.9.90
+ mpmath==1.3.0
+ triton==3.4.0
+ torchvision==0.23.0
+ av==15.0.0
+ antlr4-python3-runtime==4.9.3
+ nvidia-cufft-cu12==11.3.3.83
+ pytweening==1.2.0
+ Farama-Notifications==0.0.4
+ tzdata==2025.2
+ datasets==4.0.0
+ opencv-python-headless==4.12.0.88
+ MouseInfo==0.1.3
+ protobuf==6.31.1
+ cycler==0.12.1
+ pydantic_core==2.33.2
+ urllib3==1.26.5
+ pyasn1==0.4.8
+ zope.interface==5.4.0
+ cloud-init==24.2
+ ubuntu-pro-client==8001
+ MarkupSafe==2.0.1
+ idna==3.3
+ filelock==3.6.0
+ setuptools==59.6.0
+ unidiff==0.5.5
+ systemd-python==234
+ Twisted==22.1.0
+ pexpect==4.8.0
+ pyparsing==2.4.7
+ distro-info==1.1+ubuntu0.2
+ incremental==21.3.0
+ keyring==23.5.0
+ wheel==0.37.1
+ sos==4.5.6
+ constantly==15.1.0
+ importlib-metadata==4.6.4
+ unattended-upgrades==0.1
+ python-apt==2.4.0+ubuntu4
+ python-debian==0.1.43+ubuntu1.1
+ distro==1.7.0
+ ptyprocess==0.7.0
+ xdg==5
+ more-itertools==8.10.0
+ wadllib==1.3.6
+ pyrsistent==0.18.1
+ command-not-found==0.3
+ launchpadlib==1.10.16
+ devscripts==2.22.1ubuntu1
+ colorama==0.4.4
+ python-magic==0.4.24
+ lazr.uri==1.0.6
+ netifaces==0.11.0
+ dbus-python==1.2.18
+ bcrypt==3.2.0
+ click==8.0.3
+ distlib==0.3.4
+ pyserial==3.5
+ requests==2.25.1
+ SecretStorage==3.3.1
+ PyHamcrest==2.0.2
+ jeepney==0.7.1
+ pip==22.0.2
+ hyperlink==21.0.0
+ pyasn1-modules==0.2.1
+ ufw==0.36.1
+ Jinja2==3.0.3
+ virtualenvwrapper==4.8.4
+ stevedore==3.5.0
+ pyOpenSSL==21.0.0
+ oauthlib==3.2.0
+ httplib2==0.20.2
+ attrs==21.2.0
+ gpg==1.16.0
+ virtualenv==20.13.0+ds
+ six==1.16.0
+ virtualenv-clone==0.3.0
+ configobj==5.0.6
+ ssh-import-id==5.11
+ jsonschema==3.2.0
+ jsonpointer==2.0
+ pbr==5.8.0
+ platformdirs==2.5.1
+ zipp==1.0.0
+ certifi==2020.6.20
+ PyGObject==3.42.1
+ service-identity==18.1.0
+ cryptography==3.4.8
+ Automat==20.2.0
+ pytz==2022.1
+ lazr.restfulclient==0.14.4
+ pyxdg==0.27
+ blinker==1.4
+ Babel==2.8.0
+ PyJWT==2.3.0
+ chardet==4.0.0
+ PyYAML==5.4.1
+ jsonpatch==1.32
smolvla_omy/wandb/run-20250810_170118-7/files/wandb-metadata.json ADDED
@@ -0,0 +1,43 @@
+ {
+ "os": "Linux-6.8.0-40-generic-x86_64-with-glibc2.35",
+ "python": "CPython 3.10.12",
+ "startedAt": "2025-08-10T17:01:18.362970Z",
+ "args": [
+ "--config_path",
+ "smolvla_omy.yaml"
+ ],
+ "program": "/home/ubuntu/hyperstack_smolvla/train_model.py",
+ "codePath": "train_model.py",
+ "codePathLocal": "train_model.py",
+ "git": {
+ "remote": "https://github.com/LudvigEriksonBrangstrup/hyperstack_smolvla.git",
+ "commit": "ab95dc34a31924ce4783ea5d7bbebfba9afdb670"
+ },
+ "root": "ckpt/smolvla_omy",
+ "host": "quick-franklin",
+ "executable": "/bin/python3",
+ "cpu_count": 28,
+ "cpu_count_logical": 28,
+ "gpu": "NVIDIA A100 80GB PCIe",
+ "gpu_count": 1,
+ "disk": {
+ "/": {
+ "total": "103865303040",
+ "used": "31205855232"
+ }
+ },
+ "memory": {
+ "total": "126711267328"
+ },
+ "gpu_nvidia": [
+ {
+ "name": "NVIDIA A100 80GB PCIe",
+ "memoryTotal": "85899345920",
+ "cudaCores": 6912,
+ "architecture": "Ampere",
+ "uuid": "GPU-2f0e3d1f-2917-fc72-bc7e-8d4fe513dfc6"
+ }
+ ],
+ "cudaVersion": "12.2",
+ "writerId": "ay01d755inz748t5sr9ys3vc1t6vopqg"
+ }
smolvla_omy/wandb/run-20250810_170118-7/logs/debug-core.log ADDED
@@ -0,0 +1,7 @@
+ {"time":"2025-08-10T17:01:18.393971931Z","level":"INFO","msg":"main: starting server","port-filename":"/tmp/tmp20s3taen/port-13371.txt","pid":13371,"log-level":0,"disable-analytics":false,"shutdown-on-parent-exit":false,"enable-dcgm-profiling":false}
+ {"time":"2025-08-10T17:01:18.394661855Z","level":"INFO","msg":"server: will exit if parent process dies","ppid":13371}
+ {"time":"2025-08-10T17:01:18.394626317Z","level":"INFO","msg":"server: accepting connections","addr":{"Name":"/tmp/wandb-13371-13411-923330907/socket","Net":"unix"}}
+ {"time":"2025-08-10T17:01:18.574453402Z","level":"INFO","msg":"connection: ManageConnectionData: new connection created","id":"1(@)"}
+ {"time":"2025-08-10T17:01:18.582061467Z","level":"INFO","msg":"handleInformInit: received","streamId":"7","id":"1(@)"}
+ {"time":"2025-08-10T17:01:18.781425359Z","level":"INFO","msg":"handleInformInit: stream started","streamId":"7","id":"1(@)"}
+ {"time":"2025-08-10T20:21:49.024430523Z","level":"INFO","msg":"server: parent process exited, terminating service process"}
smolvla_omy/wandb/run-20250810_170118-7/logs/debug-internal.log ADDED
@@ -0,0 +1,6 @@
+ {"time":"2025-08-10T17:01:18.58226796Z","level":"INFO","msg":"stream: starting","core version":"0.21.1"}
+ {"time":"2025-08-10T17:01:18.781374731Z","level":"INFO","msg":"stream: created new stream","id":"7"}
+ {"time":"2025-08-10T17:01:18.781419518Z","level":"INFO","msg":"stream: started","id":"7"}
+ {"time":"2025-08-10T17:01:18.781467058Z","level":"INFO","msg":"writer: started","stream_id":"7"}
+ {"time":"2025-08-10T17:01:18.78151031Z","level":"INFO","msg":"sender: started","stream_id":"7"}
+ {"time":"2025-08-10T17:01:18.781544725Z","level":"INFO","msg":"handler: started","stream_id":"7"}
smolvla_omy/wandb/run-20250810_170118-7/logs/debug.log ADDED
@@ -0,0 +1,21 @@
+ 2025-08-10 17:01:18,364 INFO MainThread:13371 [wandb_setup.py:_flush():80] Current SDK version is 0.21.1
+ 2025-08-10 17:01:18,364 INFO MainThread:13371 [wandb_setup.py:_flush():80] Configure stats pid to 13371
+ 2025-08-10 17:01:18,364 INFO MainThread:13371 [wandb_setup.py:_flush():80] Loading settings from /home/ubuntu/.config/wandb/settings
+ 2025-08-10 17:01:18,364 INFO MainThread:13371 [wandb_setup.py:_flush():80] Loading settings from /home/ubuntu/hyperstack_smolvla/wandb/settings
+ 2025-08-10 17:01:18,364 INFO MainThread:13371 [wandb_setup.py:_flush():80] Loading settings from environment variables
+ 2025-08-10 17:01:18,364 INFO MainThread:13371 [wandb_init.py:setup_run_log_directory():703] Logging user logs to ckpt/smolvla_omy/wandb/run-20250810_170118-7/logs/debug.log
+ 2025-08-10 17:01:18,364 INFO MainThread:13371 [wandb_init.py:setup_run_log_directory():704] Logging internal logs to ckpt/smolvla_omy/wandb/run-20250810_170118-7/logs/debug-internal.log
+ 2025-08-10 17:01:18,364 INFO MainThread:13371 [wandb_init.py:init():830] calling init triggers
+ 2025-08-10 17:01:18,365 INFO MainThread:13371 [wandb_init.py:init():835] wandb.init called with sweep_config: {}
+ config: {'dataset': {'repo_id': 'lava8888/ArrangeVegetables', 'root': './ArrangeVegetables', 'episodes': None, 'image_transforms': {'enable': False, 'max_num_transforms': 3, 'random_order': False, 'tfs': {'brightness': {'weight': 1.0, 'type': 'ColorJitter', 'kwargs': {'brightness': [0.8, 1.2]}}, 'contrast': {'weight': 1.0, 'type': 'ColorJitter', 'kwargs': {'contrast': [0.8, 1.2]}}, 'saturation': {'weight': 1.0, 'type': 'ColorJitter', 'kwargs': {'saturation': [0.5, 1.5]}}, 'hue': {'weight': 1.0, 'type': 'ColorJitter', 'kwargs': {'hue': [-0.05, 0.05]}}, 'sharpness': {'weight': 1.0, 'type': 'SharpnessJitter', 'kwargs': {'sharpness': [0.5, 1.5]}}}}, 'revision': None, 'use_imagenet_stats': True, 'video_backend': 'torchcodec'}, 'env': None, 'policy': {'type': 'smolvla', 'n_obs_steps': 1, 'normalization_mapping': {'VISUAL': <NormalizationMode.IDENTITY: 'IDENTITY'>, 'STATE': <NormalizationMode.MEAN_STD: 'MEAN_STD'>, 'ACTION': <NormalizationMode.MEAN_STD: 'MEAN_STD'>}, 'input_features': {}, 'output_features': {}, 'device': 'cuda', 'use_amp': False, 'chunk_size': 5, 'n_action_steps': 5, 'max_state_dim': 32, 'max_action_dim': 32, 'resize_imgs_with_padding': [512, 512], 'empty_cameras': 0, 'adapt_to_pi_aloha': False, 'use_delta_joint_actions_aloha': False, 'tokenizer_max_length': 48, 'num_steps': 10, 'use_cache': True, 'freeze_vision_encoder': True, 'train_expert_only': True, 'train_state_proj': True, 'optimizer_lr': 0.0001, 'optimizer_betas': [0.9, 0.95], 'optimizer_eps': 1e-08, 'optimizer_weight_decay': 1e-10, 'optimizer_grad_clip_norm': 10, 'scheduler_warmup_steps': 1000, 'scheduler_decay_steps': 30000, 'scheduler_decay_lr': 2.5e-06, 'vlm_model_name': 'HuggingFaceTB/SmolVLM2-500M-Video-Instruct', 'load_vlm_weights': False, 'add_image_special_tokens': False, 'attention_mode': 'cross_attn', 'prefix_length': -1, 'pad_language_to': 'longest', 'num_expert_layers': -1, 'num_vlm_layers': 16, 'self_attn_every_n_layers': 2, 'expert_width_multiplier': 0.75, 'min_period': 0.004, 'max_period': 4.0}, 'output_dir': 'ckpt/smolvla_omy', 'job_name': 'smolvla_6', 'resume': False, 'seed': 42, 'num_workers': 24, 'batch_size': 350, 'steps': 20000, 'eval_freq': 5, 'log_freq': 1, 'save_checkpoint': True, 'save_freq': 500, 'use_policy_training_preset': True, 'optimizer': {'type': 'adamw', 'lr': 0.0001, 'weight_decay': 1e-10, 'grad_clip_norm': 10, 'betas': [0.9, 0.95], 'eps': 1e-08}, 'scheduler': {'type': 'cosine_decay_with_warmup', 'num_warmup_steps': 1000, 'num_decay_steps': 30000, 'peak_lr': 0.0001, 'decay_lr': 2.5e-06}, 'eval': {'n_episodes': 50, 'batch_size': 50, 'use_async_envs': False}, 'wandb': {'enable': True, 'disable_artifact': True, 'project': 'smolVLA', 'entity': 'qualiastudios', 'notes': 'first', 'run_id': '7', 'mode': 'online'}, '_wandb': {}}
+ 2025-08-10 17:01:18,365 INFO MainThread:13371 [wandb_init.py:init():871] starting backend
+ 2025-08-10 17:01:18,574 INFO MainThread:13371 [wandb_init.py:init():874] sending inform_init request
+ 2025-08-10 17:01:18,579 INFO MainThread:13371 [wandb_init.py:init():882] backend started and connected
+ 2025-08-10 17:01:18,582 INFO MainThread:13371 [wandb_init.py:init():953] updated telemetry
+ 2025-08-10 17:01:18,586 INFO MainThread:13371 [wandb_init.py:init():977] communicating run to backend with 90.0 second timeout
+ 2025-08-10 17:01:18,955 INFO MainThread:13371 [wandb_init.py:init():1029] starting run threads in backend
+ 2025-08-10 17:01:19,105 INFO MainThread:13371 [wandb_run.py:_console_start():2494] atexit reg
+ 2025-08-10 17:01:19,105 INFO MainThread:13371 [wandb_run.py:_redirect():2342] redirect: wrap_raw
+ 2025-08-10 17:01:19,106 INFO MainThread:13371 [wandb_run.py:_redirect():2411] Wrapping output streams.
+ 2025-08-10 17:01:19,106 INFO MainThread:13371 [wandb_run.py:_redirect():2434] Redirects installed.
+ 2025-08-10 17:01:19,107 INFO MainThread:13371 [wandb_init.py:init():1075] run started, returning control to user process
smolvla_omy/wandb/run-20250810_170118-7/run-7.wandb ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:fbaf6da1ebfa490eb21b84b604003717268ce32c613cd8e10064f414369adf6f
+ size 4128768