wintermelontree committed on
Commit 630d180 · verified · 1 Parent(s): 615f682

Upload folder using huggingface_hub

tool_hang_low_dim/_2025_12_30_15_49_12.log ADDED
@@ -0,0 +1,154 @@
+ Using robot0_eef_pos with dim 3 for observation
+ Using robot0_eef_quat with dim 4 for observation
+ Using robot0_gripper_qpos with dim 2 for observation
+ Using object with dim 44 for observation
+ Total low-dim observation dim: 53
+ Original action dim: 7
+ Final action dim: 7
+ ===== Basic stats =====
+ Total transitions: 95962
+ Total trajectories: 200
+ Traj length mean/std: 479.81, 88.47046908432215
+ Traj length min/max: 332, 744
+ obs min: [-2.52127122e-01 -3.07641201e-01 0.00000000e+00 0.00000000e+00
+ -9.91649085e-02 -7.60016492e-01 -3.01106725e-01 0.00000000e+00
+ -4.17679508e-02 -8.04781282e-02 -9.76209898e-05 0.00000000e+00
+ -2.03777831e-02 -3.52264643e-02 -2.20292196e-01 0.00000000e+00
+ -7.16841553e-02 -2.52492655e-01 -1.11481197e-01 -9.99909639e-01
+ -7.98503578e-01 -7.42919326e-01 0.00000000e+00 -2.68730730e-01
+ -3.27619294e-01 0.00000000e+00 -4.11453307e-01 -7.15811133e-01
+ -7.27672637e-01 0.00000000e+00 -1.82675895e-01 -3.11966754e-01
+ -3.26691579e-01 -9.99954820e-01 -7.09664166e-01 -7.08980560e-01
+ 0.00000000e+00 -2.46124441e-01 -2.33028617e-01 0.00000000e+00
+ -4.88239914e-01 -1.15251660e-01 -9.04773653e-01 0.00000000e+00
+ -2.20961330e-01 -1.44786243e-01 -2.26624309e-01 -6.41619921e-01
+ -9.99993920e-01 -7.50604928e-01 0.00000000e+00 0.00000000e+00
+ 0.00000000e+00]
+ obs max: [1.47632899e-01 1.26011762e-01 1.18004313e+00 9.99909619e-01
+ 8.00552734e-01 1.20881506e-01 4.54378396e-01 4.13186631e-02
+ 0.00000000e+00 0.00000000e+00 1.88628021e-02 8.25429750e-01
+ 1.47824243e-01 1.72425508e-02 7.36363363e-05 1.00000036e+00
+ 3.65600221e-01 2.08337948e-01 3.37257582e-01 9.99907792e-01
+ 8.00549448e-01 7.60016680e-01 4.54383522e-01 4.32860451e-03
+ 1.30913309e-01 1.21488420e+00 3.16708297e-01 6.25906646e-01
+ 8.86733294e-01 9.46352839e-01 2.78364920e-01 3.65295857e-01
+ 2.17046003e-01 9.99961853e-01 4.67488050e-01 2.55197793e-01
+ 6.43479943e-01 7.57403192e-02 1.24550600e-02 1.14192089e+00
+ 6.95465624e-01 6.72781169e-01 0.00000000e+00 8.45146418e-01
+ 3.62621336e-01 3.49493562e-01 2.44990615e-01 6.61285222e-01
+ 9.99998212e-01 7.17198074e-01 8.74303877e-01 1.00000000e+00
+ 1.00000000e+00]
+ action min: [-1. -1. -1. -0.35410371 -0.49649122 -0.72497994
+ -1. ]
+ action max: [1. 1. 1. 0.32611984 0.61583012 1.
+ 1. ]
+ Trajectory demo_0: cumulative reward = 1.0 (length: 681 -> 677)
+ Trajectory demo_1: cumulative reward = 1.0 (length: 613 -> 609)
+ Trajectory demo_2: cumulative reward = 1.0 (length: 737 -> 733)
+ Trajectory demo_3: cumulative reward = 1.0 (length: 582 -> 578)
+ Trajectory demo_4: cumulative reward = 1.0 (length: 552 -> 548)
+ Trajectory demo_5: cumulative reward = 1.0 (length: 681 -> 677)
+ Trajectory demo_6: cumulative reward = 1.0 (length: 587 -> 551)
+ Trajectory demo_7: cumulative reward = 1.0 (length: 622 -> 618)
+ Trajectory demo_8: cumulative reward = 1.0 (length: 514 -> 510)
+ Trajectory demo_9: cumulative reward = 1.0 (length: 623 -> 619)
+ Trajectory demo_10: cumulative reward = 1.0 (length: 497 -> 493)
+ Trajectory demo_11: cumulative reward = 1.0 (length: 478 -> 474)
+ Trajectory demo_12: cumulative reward = 1.0 (length: 496 -> 492)
+ Trajectory demo_13: cumulative reward = 1.0 (length: 514 -> 510)
+ Trajectory demo_14: cumulative reward = 1.0 (length: 708 -> 704)
+ Trajectory demo_15: cumulative reward = 1.0 (length: 510 -> 505)
+ DEBUG Trajectory 16 (demo_16): Original length=676, non-zero rewards=10
+ DEBUG Trajectory 16 (demo_16): Reward sum=10.0, unique rewards=[0. 1.]
+ DEBUG Trajectory 16 (demo_16): Last 20 rewards: [0. 0. 0. 0. 0. 0. 0. 0. 1. 1. 1. 1. 1. 0. 0. 1. 1. 1. 1. 1.]
+ DEBUG Trajectory 16 (demo_16): Truncating at index 665, cumsum at that point = 1.0
+ Trajectory demo_16: cumulative reward = 1.0 (length: 676 -> 665)
+ Trajectory demo_17: cumulative reward = 1.0 (length: 677 -> 668)
+ Trajectory demo_18: cumulative reward = 1.0 (length: 632 -> 605)
+ Trajectory demo_19: cumulative reward = 1.0 (length: 569 -> 565)
+ Trajectory demo_20: cumulative reward = 1.0 (length: 565 -> 561)
+ Trajectory demo_21: cumulative reward = 1.0 (length: 575 -> 572)
+ Trajectory demo_22: cumulative reward = 1.0 (length: 711 -> 707)
+ Trajectory demo_23: cumulative reward = 1.0 (length: 742 -> 738)
+ Trajectory demo_24: cumulative reward = 1.0 (length: 537 -> 533)
+ Trajectory demo_25: cumulative reward = 1.0 (length: 480 -> 475)
+ Trajectory demo_26: cumulative reward = 1.0 (length: 429 -> 425)
+ Trajectory demo_27: cumulative reward = 1.0 (length: 469 -> 465)
+ Trajectory demo_28: cumulative reward = 1.0 (length: 512 -> 505)
+ Trajectory demo_29: cumulative reward = 1.0 (length: 532 -> 526)
+ Trajectory demo_30: cumulative reward = 1.0 (length: 409 -> 405)
+ Trajectory demo_31: cumulative reward = 1.0 (length: 476 -> 458)
+ Trajectory demo_32: cumulative reward = 1.0 (length: 473 -> 467)
+ Trajectory demo_33: cumulative reward = 1.0 (length: 398 -> 394)
+ Trajectory demo_34: cumulative reward = 1.0 (length: 574 -> 570)
+ Trajectory demo_35: cumulative reward = 1.0 (length: 589 -> 585)
+ Trajectory demo_36: cumulative reward = 1.0 (length: 532 -> 528)
+ Trajectory demo_37: cumulative reward = 1.0 (length: 567 -> 563)
+ Trajectory demo_38: cumulative reward = 1.0 (length: 471 -> 466)
+ Trajectory demo_39: cumulative reward = 1.0 (length: 485 -> 481)
+ Trajectory demo_40: cumulative reward = 1.0 (length: 649 -> 645)
+ Trajectory demo_41: cumulative reward = 1.0 (length: 538 -> 534)
+ Trajectory demo_42: cumulative reward = 1.0 (length: 580 -> 576)
+ Trajectory demo_43: cumulative reward = 1.0 (length: 495 -> 491)
+ Trajectory demo_44: cumulative reward = 1.0 (length: 523 -> 516)
+ Trajectory demo_45: cumulative reward = 1.0 (length: 627 -> 623)
+ Trajectory demo_46: cumulative reward = 1.0 (length: 450 -> 446)
+ Trajectory demo_47: cumulative reward = 1.0 (length: 539 -> 535)
+ Trajectory demo_48: cumulative reward = 1.0 (length: 508 -> 504)
+ Trajectory demo_49: cumulative reward = 1.0 (length: 555 -> 551)
+ Trajectory demo_50: cumulative reward = 1.0 (length: 509 -> 432)
+ Trajectory demo_51: cumulative reward = 1.0 (length: 502 -> 498)
+ Trajectory demo_52: cumulative reward = 1.0 (length: 577 -> 573)
+ Trajectory demo_53: cumulative reward = 1.0 (length: 457 -> 453)
+ Trajectory demo_54: cumulative reward = 1.0 (length: 489 -> 485)
+ Trajectory demo_55: cumulative reward = 1.0 (length: 521 -> 517)
+ Trajectory demo_56: cumulative reward = 1.0 (length: 469 -> 465)
+ Trajectory demo_57: cumulative reward = 1.0 (length: 476 -> 472)
+ Trajectory demo_58: cumulative reward = 1.0 (length: 433 -> 429)
+ Trajectory demo_59: cumulative reward = 1.0 (length: 404 -> 379)
+ Trajectory demo_60: cumulative reward = 1.0 (length: 477 -> 451)
+ Trajectory demo_61: cumulative reward = 1.0 (length: 541 -> 537)
+ Trajectory demo_62: cumulative reward = 1.0 (length: 490 -> 486)
+ Trajectory demo_63: cumulative reward = 1.0 (length: 495 -> 487)
+ Trajectory demo_64: cumulative reward = 1.0 (length: 508 -> 504)
+ Trajectory demo_65: cumulative reward = 1.0 (length: 399 -> 395)
+ Trajectory demo_66: cumulative reward = 1.0 (length: 461 -> 458)
+ Trajectory demo_67: cumulative reward = 1.0 (length: 474 -> 470)
+ Trajectory demo_68: cumulative reward = 1.0 (length: 419 -> 415)
+ Trajectory demo_69: cumulative reward = 1.0 (length: 542 -> 538)
+ Trajectory demo_70: cumulative reward = 1.0 (length: 421 -> 417)
+ Trajectory demo_71: cumulative reward = 1.0 (length: 562 -> 555)
+ Trajectory demo_72: cumulative reward = 1.0 (length: 416 -> 412)
+ Trajectory demo_73: cumulative reward = 1.0 (length: 432 -> 428)
+ Trajectory demo_74: cumulative reward = 1.0 (length: 470 -> 466)
+ Trajectory demo_75: cumulative reward = 1.0 (length: 407 -> 403)
+ Trajectory demo_76: cumulative reward = 1.0 (length: 439 -> 412)
+ Trajectory demo_77: cumulative reward = 1.0 (length: 429 -> 425)
+ Trajectory demo_78: cumulative reward = 1.0 (length: 452 -> 448)
+ Trajectory demo_79: cumulative reward = 1.0 (length: 414 -> 410)
+ Trajectory demo_80: cumulative reward = 1.0 (length: 537 -> 533)
+ Trajectory demo_81: cumulative reward = 1.0 (length: 363 -> 359)
+ Trajectory demo_82: cumulative reward = 1.0 (length: 422 -> 416)
+ Trajectory demo_83: cumulative reward = 1.0 (length: 440 -> 408)
+ Trajectory demo_84: cumulative reward = 1.0 (length: 430 -> 426)
+ Trajectory demo_85: cumulative reward = 1.0 (length: 488 -> 484)
+ Trajectory demo_86: cumulative reward = 1.0 (length: 479 -> 475)
+ Trajectory demo_87: cumulative reward = 1.0 (length: 443 -> 438)
+ Trajectory demo_88: cumulative reward = 1.0 (length: 433 -> 429)
+ Trajectory demo_89: cumulative reward = 1.0 (length: 444 -> 439)
+ Trajectory demo_90: cumulative reward = 1.0 (length: 410 -> 406)
+ Trajectory demo_91: cumulative reward = 1.0 (length: 446 -> 421)
+ Trajectory demo_92: cumulative reward = 1.0 (length: 369 -> 365)
+ Trajectory demo_93: cumulative reward = 1.0 (length: 378 -> 374)
+ Trajectory demo_94: cumulative reward = 1.0 (length: 399 -> 395)
+ Trajectory demo_95: cumulative reward = 1.0 (length: 496 -> 490)
+ Trajectory demo_96: cumulative reward = 1.0 (length: 523 -> 515)
+ Trajectory demo_97: cumulative reward = 1.0 (length: 608 -> 601)
+ Trajectory demo_98: cumulative reward = 1.0 (length: 394 -> 390)
+ Trajectory demo_99: cumulative reward = 1.0 (length: 425 -> 421)
+ ===== Truncation Statistics =====
+ Original total steps: 95962
+ Truncated total steps: 94576
+ Reduction: 1386 steps (1.4%)
+ Train - Trajectories: 200, Transitions: 94576
+ Val - Trajectories: 0, Transitions: 0.0
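The truncation step logged above (e.g. demo_16: "Truncating at index 665, cumsum at that point = 1.0") appears to cut each trajectory at the first step where the sparse reward's cumulative sum reaches 1.0. A minimal sketch of that logic, assuming exactly this rule (the function name and the no-success fallback are assumptions, not from the log):

```python
import numpy as np

def truncate_at_first_success(rewards):
    # Hypothetical reconstruction: cut the trajectory at the first step
    # where cumulative reward reaches 1.0, so each demo keeps exactly
    # one unit of sparse success reward.
    cumsum = np.cumsum(rewards)
    if cumsum[-1] < 1.0:
        return len(rewards)  # never succeeds: keep the full trajectory
    idx = int(np.argmax(cumsum >= 1.0))  # first index where cumsum hits 1.0
    return idx + 1  # truncated length

# Reproduce demo_16's tail: 676 steps, success rewards at
# indices 664-668 and 671-675 (matching the "Last 20 rewards" line)
rewards = np.zeros(676)
rewards[664:669] = 1.0
rewards[671:676] = 1.0
print(truncate_at_first_success(rewards))  # 665, as in the log
```

This reproduces the logged 676 -> 665 truncation for demo_16; the other demos' "length: X -> Y" lines are consistent with the same rule applied per trajectory.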
tool_hang_low_dim/normalization.npz ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4663652d105e129f155bc8f50ee263e6b30ed7d17baa2e8347d3c4e67943c4a4
+ size 1451
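A file like normalization.npz typically stores per-dimension statistics such as the "obs min"/"obs max" arrays printed in the log. A sketch of producing and applying min-max normalization stats with NumPy, assuming that layout (the key names and the [-1, 1] target range are assumptions; the actual archive contents are not visible here):

```python
import numpy as np

# Placeholder observations with the logged low-dim size (53)
obs = np.random.uniform(-1.0, 1.0, size=(100, 53))

# Per-dimension min/max over all transitions, as in the log's obs min/max
np.savez("normalization_example.npz",
         obs_min=obs.min(axis=0), obs_max=obs.max(axis=0))

loaded = np.load("normalization_example.npz")
scale = loaded["obs_max"] - loaded["obs_min"]
# Min-max normalize to [-1, 1], guarding constant dimensions
normed = 2.0 * (obs - loaded["obs_min"]) / np.where(scale > 0, scale, 1.0) - 1.0
print(normed.min() >= -1.0 and normed.max() <= 1.0)  # True
```

The `np.where` guard matters for dimensions the log shows pinned at 0.0 in both min and max, where the scale would otherwise be zero.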
tool_hang_low_dim/train.npz ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3990178a296f8c1b8211a16e79a99ce1555db10f147da95173692a18697c1037
+ size 34520267
tool_hang_low_dim/val.npz ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:ef423294119533dd4d99cd9e9c6d933cd3cedf7ddf9070a66d603c05a24cffc2
+ size 964
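Each of the three .npz entries above is a Git LFS pointer (spec v1) rather than the binary itself: a `version` line, an `oid` naming the SHA-256 of the real file, and its `size` in bytes. A minimal sketch of parsing such a pointer (the function name is an assumption):

```python
def parse_lfs_pointer(text):
    # Parse a git-lfs pointer file (spec v1) into a dict of key -> value;
    # each line is "key value" separated by a single space.
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The val.npz pointer from this commit
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:ef423294119533dd4d99cd9e9c6d933cd3cedf7ddf9070a66d603c05a24cffc2
size 964
"""
info = parse_lfs_pointer(pointer)
print(info["size"])  # 964
```

Fetching the actual arrays requires an LFS-aware clone (e.g. `git lfs pull` or the huggingface_hub download utilities), after which the files load as ordinary NumPy archives.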