---
license: apache-2.0
tags:
- pose-estimation
- bop
- doper
- keypoints
- 6dof
---
# DOPER BOP -- Per-Object 6DoF Pose Estimation on BOP Datasets

Per-object DOPER-t (keypoint) and RTMDet-tiny (detector) models for all 168 objects across 9 BOP datasets. Trained on synthetic PBR data only.

Supports two evaluation modes:

1. **Keypoints-only (GT bounding box)** -- upper bound on pose accuracy
2. **Detection + Pose (full pipeline, no GT)** -- RTMDet detection, DOPER-t keypoints, PnP solve
## Results Summary

### Keypoints-Only (GT BBox)

| Dataset | Objects | Split | N GT | ADD-AUC | ADD-S-AUC | MSSD-AUC | MSPD-AUC | ARMSSD | ARMSPD |
|---|---|---|---|---|---|---|---|---|---|
| lm | 15 | test | 3000 | 0.9820 | 0.9931 | 0.9733 | 0.9766 | 0.6166 | 0.7718 |
| lmo | 8 | test | 1517 | 0.9655 | 0.9798 | 0.9565 | 0.9786 | 0.5042 | 0.7479 |
| tless | 30 | test_primesense | 6900 | 0.6816 | 0.7030 | 0.6662 | 0.9140 | 0.2245 | 0.4098 |
| tudl | 3 | test | 600 | 0.9635 | 0.9789 | 0.9431 | 0.9587 | 0.3465 | 0.6038 |
| ycbv | 21 | test | 4125 | 0.8671 | 0.8884 | 0.8503 | 0.9177 | 0.2528 | 0.4584 |
| hb | 33 | val_primesense | 23120 | 0.9538 | 0.9630 | 0.9451 | 0.9772 | 0.5961 | 0.7718 |
| itodd | 28 | val | 123 | 0.6352 | 0.6582 | 0.6123 | 0.6237 | 0.0925 | 0.1037 |
| icbin | 2 | test | 2250 | 0.8641 | 0.8921 | 0.8372 | 0.9192 | 0.1548 | 0.3391 |
| hope | 28 | val | 920 | 0.8759 | 0.8937 | 0.8581 | 0.8878 | 0.3331 | 0.4424 |
### Detection + Pose (Full Pipeline)

Per-object RTMDet-tiny detectors trained with ~30-40% negative images (cross-dataset).

| Dataset | Objects | Split | N GT | Det Rate | ADD-AUC | ADD-S-AUC | MSSD-AUC | MSPD-AUC | ARMSSD | ARMSPD |
|---|---|---|---|---|---|---|---|---|---|---|
| lm | 15 | test | 3000 | 97.4% | 0.9843 | 0.9950 | 0.9768 | 0.9783 | 0.6264 | 0.7813 |
| lmo | 8 | test | 1517 | 94.5% | 0.9701 | 0.9840 | 0.9622 | 0.9801 | 0.5234 | 0.7602 |
| tless | 30 | test_primesense | 6900 | 74.1% | 0.7079 | 0.7292 | 0.6940 | 0.9129 | 0.2581 | 0.4351 |
| tudl | 3 | test | 600 | 93.8% | 0.9064 | 0.9251 | 0.8831 | 0.9380 | 0.3032 | 0.5281 |
| ycbv | 21 | test | 4125 | 85.0% | 0.7650 | 0.7882 | 0.7473 | 0.8695 | 0.2001 | 0.3671 |
| itodd | 28 | val | 123 | 66.7% | 0.6369 | 0.6554 | 0.6202 | 0.5718 | 0.1609 | 0.1177 |
| icbin | 2 | test | 2250 | 9.1% | 0.9116 | 0.9414 | 0.8826 | 0.8452 | 0.1506 | 0.2443 |
| hope | 28 | val | 920 | 57.8% | 0.9199 | 0.9362 | 0.9065 | 0.8856 | 0.3605 | 0.4596 |

*Det Rate = fraction of GT instances matched by a detection (IoU > 0.1). AUC metrics are computed only over detected + solved instances.*
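The matching rule in the note above can be sketched in a few lines; `iou` and `detection_rate` are illustrative helpers, not part of the released code:

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def detection_rate(gt_boxes, det_boxes, thr=0.1):
    """Fraction of GT instances matched by at least one detection with IoU > thr."""
    hits = sum(any(iou(g, d) > thr for d in det_boxes) for g in gt_boxes)
    return hits / len(gt_boxes)
```

For example, one correct detection against two GT instances yields a Det Rate of 50%.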
## Metrics

| Metric | Description |
|---|---|
| ADD | Average Distance of model points (non-symmetric) |
| ADD-S | Average Distance of closest model points (symmetric-aware, a.k.a. ADI) |
| MSSD | Maximum Symmetry-Aware Surface Distance (BOP) |
| MSPD | Maximum Symmetry-Aware Projection Distance (BOP) |
| AUC | Area under the recall-vs-threshold curve (40 thresholds in (0, 10 × diameter]) |
| ARMSSD | Average recall at BOP thresholds: {0.05, 0.10, ..., 0.50} × diameter |
| ARMSPD | Average recall at BOP thresholds: {5, 10, ..., 50} pixels |
## Method

- **3D Keypoints**: 17 symmetry-aware keypoints per object from BOP meshes + `models_info.json`
- **Keypoint Training**: DOPER-t (CSPNeXt-tiny, 256x256, 300 epochs) on projected keypoints from PBR renders
- **Detector Training**: RTMDet-tiny (CSPNeXt-tiny, COCO pretrained, 20 epochs) per object, with ~30-40% cross-dataset negative images
- **Inference**: RTMDet detects the object bbox -> DOPER-t predicts 2D keypoints -> PnP + RANSAC solves the 6DoF pose
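The geometry the PnP step inverts is ordinary pinhole projection of the 17 model keypoints. A minimal sketch (the intrinsics and keypoint values below are made-up example numbers, not tied to any BOP dataset):

```python
import numpy as np

def project_keypoints(kpts_3d, R, t, K):
    """Project 3D model keypoints (N, 3, in mm) into pixel coordinates."""
    cam = kpts_3d @ R.T + t       # model frame -> camera frame
    uv = cam @ K.T                # apply pinhole intrinsics
    return uv[:, :2] / uv[:, 2:]  # perspective divide

# Example: object 1 m in front of the camera, no rotation.
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 1000.0])
kpts = np.array([[0.0, 0.0, 0.0], [50.0, 0.0, 0.0]])
uv = project_keypoints(kpts, R, t, K)
# the model origin lands at the principal point (320, 240)
```

At inference time the pipeline runs this mapping in reverse: given the predicted 2D keypoints and the known 3D keypoints, a RANSAC-wrapped PnP solver (e.g. OpenCV's `cv2.solvePnPRansac`) recovers (R, t).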
## Training Data

- **BOP core** (lm, lmo, tless, tudl, ycbv, hb, itodd, icbin): BOP BlenderProc PBR (`train_pbr`)
- **HOPE**: custom BlenderProc synthetic (~40K images/object)
## Symmetry Handling

| Type | Strategy |
|---|---|
| None | Farthest-point sampling (FPS) on the mesh surface |
| Discrete | Keypoints in the fundamental domain, replicated under symmetry transforms |
| Continuous | Axial keypoints + equidistant ring perpendicular to the symmetry axis |
## Per-Object Results (Keypoints-Only, GT BBox)

### LM (15 objects, test)

| ID | Diam (mm) | N GT | ADD-AUC | ADD-S-AUC | MSSD-AUC | MSPD-AUC | ARMSSD | ARMSPD |
|---|---|---|---|---|---|---|---|---|
| 1 | 102.1 | 200 | 0.9990 | 1.0000 | 0.9986 | 1.0000 | 0.8260 | 0.9965 |
| 2 | 247.5 | 200 | 0.9926 | 0.9995 | 0.9779 | 0.9622 | 0.6405 | 0.6985 |
| 3 | 167.4 | 200 | 0.9106 | 0.9558 | 0.9161 | 0.9319 | 0.2665 | 0.4410 |
| 4 | 172.5 | 200 | 0.9936 | 0.9990 | 0.9922 | 0.9980 | 0.6900 | 0.9295 |
| 5 | 201.4 | 200 | 0.9955 | 0.9986 | 0.9915 | 0.9914 | 0.8490 | 0.9080 |
| 6 | 154.5 | 200 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 0.9285 | 0.9950 |
| 7 | 124.3 | 200 | 0.9335 | 0.9620 | 0.9018 | 0.9450 | 0.1750 | 0.4435 |
| 8 | 261.5 | 200 | 0.9977 | 0.9995 | 0.9936 | 0.9939 | 0.8770 | 0.9205 |
| 9 | 109.0 | 200 | 0.9749 | 0.9949 | 0.9527 | 0.9672 | 0.3800 | 0.6435 |
| 10 | 164.6 | 200 | 0.9911 | 0.9975 | 0.9875 | 0.9878 | 0.6850 | 0.8915 |
| 11 | 175.9 | 200 | 0.9851 | 0.9965 | 0.9730 | 0.9940 | 0.4825 | 0.8785 |
| 12 | 145.5 | 200 | 0.9734 | 0.9950 | 0.9521 | 0.9456 | 0.3175 | 0.5175 |
| 13 | 278.1 | 200 | 0.9982 | 0.9994 | 0.9970 | 0.9968 | 0.8965 | 0.9090 |
| 14 | 282.6 | 200 | 0.9878 | 0.9991 | 0.9695 | 0.9370 | 0.4830 | 0.4875 |
| 15 | 212.4 | 200 | 0.9972 | 1.0000 | 0.9957 | 0.9989 | 0.7515 | 0.9165 |
| mean | | | 0.9820 | 0.9931 | 0.9733 | 0.9766 | 0.6166 | 0.7718 |
### LMO (8 objects, test)

| ID | Diam (mm) | N GT | ADD-AUC | ADD-S-AUC | MSSD-AUC | MSPD-AUC | ARMSSD | ARMSPD |
|---|---|---|---|---|---|---|---|---|
| 1 | 102.1 | 187 | 0.9828 | 0.9880 | 0.9790 | 0.9951 | 0.6567 | 0.8995 |
| 5 | 201.4 | 199 | 0.9935 | 0.9987 | 0.9861 | 0.9864 | 0.7111 | 0.8201 |
| 6 | 154.5 | 196 | 0.9858 | 0.9909 | 0.9811 | 0.9971 | 0.6143 | 0.9276 |
| 8 | 261.5 | 200 | 0.9981 | 0.9996 | 0.9949 | 0.9954 | 0.8350 | 0.8840 |
| 9 | 109.0 | 188 | 0.9608 | 0.9826 | 0.9368 | 0.9654 | 0.2883 | 0.5968 |
| 10 | 164.6 | 191 | 0.8719 | 0.9132 | 0.8753 | 0.9677 | 0.2435 | 0.6429 |
| 11 | 175.9 | 154 | 0.9625 | 0.9735 | 0.9524 | 0.9828 | 0.4422 | 0.7721 |
| 12 | 145.5 | 200 | 0.9689 | 0.9920 | 0.9468 | 0.9389 | 0.2430 | 0.4405 |
| mean | | | 0.9655 | 0.9798 | 0.9565 | 0.9786 | 0.5042 | 0.7479 |
### T-LESS (30 objects, test_primesense)

| ID | Diam (mm) | N GT | ADD-AUC | ADD-S-AUC | MSSD-AUC | MSPD-AUC | ARMSSD | ARMSPD |
|---|---|---|---|---|---|---|---|---|
| 1 | 63.5 | 900 | 0.7947 | 0.8240 | 0.7741 | 0.9646 | 0.1030 | 0.6047 |
| 2 | 66.2 | 500 | 0.1058 | 0.1158 | 0.0997 | 0.9357 | 0.0000 | 0.3038 |
| 3 | 65.3 | 400 | 0.8078 | 0.8419 | 0.7850 | 0.9506 | 0.0793 | 0.4778 |
| 4 | 80.7 | 650 | 0.8712 | 0.8968 | 0.8537 | 0.9638 | 0.2065 | 0.6006 |
| 5 | 108.7 | 200 | 0.9694 | 0.9824 | 0.9635 | 0.9778 | 0.5290 | 0.7625 |
| 6 | 108.3 | 100 | 0.9235 | 0.9435 | 0.9133 | 0.9665 | 0.2870 | 0.6640 |
| 7 | 178.6 | 250 | 0.8267 | 0.8514 | 0.8046 | 0.8572 | 0.3076 | 0.3760 |
| 8 | 217.2 | 150 | 0.9177 | 0.9372 | 0.9043 | 0.8987 | 0.3347 | 0.4493 |
| 9 | 144.5 | 250 | 0.8684 | 0.9066 | 0.8286 | 0.7812 | 0.0184 | 0.0160 |
| 10 | 90.2 | 150 | 0.9417 | 0.9542 | 0.9355 | 0.9800 | 0.5987 | 0.8020 |
| 11 | 76.6 | 200 | 0.9243 | 0.9453 | 0.9088 | 0.9610 | 0.3530 | 0.6250 |
| 12 | 86.0 | 150 | 0.9362 | 0.9558 | 0.9187 | 0.9440 | 0.3640 | 0.5140 |
| 13 | 58.1 | 150 | 0.8567 | 0.8872 | 0.8393 | 0.9662 | 0.1493 | 0.6033 |
| 14 | 71.9 | 150 | 0.2485 | 0.2687 | 0.2363 | 0.9418 | 0.0000 | 0.3580 |
| 15 | 68.6 | 150 | 0.8617 | 0.8970 | 0.8390 | 0.9553 | 0.0887 | 0.5220 |
| 16 | 69.2 | 200 | 0.0229 | 0.0271 | 0.0217 | 0.9280 | 0.0000 | 0.2115 |
| 17 | 112.8 | 150 | 0.1552 | 0.1765 | 0.1473 | 0.8477 | 0.0013 | 0.0133 |
| 18 | 111.0 | 150 | 0.9515 | 0.9695 | 0.9283 | 0.9027 | 0.3733 | 0.4573 |
| 19 | 89.1 | 200 | 0.0000 | 0.0000 | 0.0000 | 0.8629 | 0.0000 | 0.0005 |
| 20 | 98.9 | 250 | 0.4074 | 0.4462 | 0.3760 | 0.8706 | 0.0000 | 0.0324 |
| 21 | 92.3 | 200 | 0.8999 | 0.9182 | 0.8811 | 0.9386 | 0.4730 | 0.6105 |
| 22 | 92.3 | 200 | 0.9441 | 0.9588 | 0.9289 | 0.9469 | 0.5480 | 0.6815 |
| 23 | 142.6 | 250 | 0.9696 | 0.9815 | 0.9635 | 0.9632 | 0.5932 | 0.7012 |
| 24 | 84.7 | 200 | 0.9069 | 0.9315 | 0.8935 | 0.9665 | 0.3010 | 0.6300 |
| 25 | 108.8 | 100 | 0.9378 | 0.9470 | 0.9335 | 0.9770 | 0.7100 | 0.8230 |
| 26 | 108.8 | 100 | 0.9030 | 0.9297 | 0.8672 | 0.8755 | 0.0720 | 0.1180 |
| 27 | 152.5 | 100 | 0.2220 | 0.2417 | 0.2095 | 0.7245 | 0.0000 | 0.0030 |
| 28 | 124.8 | 200 | 0.2886 | 0.3205 | 0.2627 | 0.8064 | 0.0000 | 0.0015 |
| 29 | 134.2 | 100 | 0.8878 | 0.9270 | 0.8812 | 0.8858 | 0.2430 | 0.3030 |
| 30 | 88.8 | 150 | 0.0967 | 0.1075 | 0.0882 | 0.8798 | 0.0000 | 0.0293 |
| mean | | | 0.6816 | 0.7030 | 0.6662 | 0.9140 | 0.2245 | 0.4098 |
### TUD-L (3 objects, test)

| ID | Diam (mm) | N GT | ADD-AUC | ADD-S-AUC | MSSD-AUC | MSPD-AUC | ARMSSD | ARMSPD |
|---|---|---|---|---|---|---|---|---|
| 1 | 430.3 | 200 | 0.9711 | 0.9835 | 0.9545 | 0.9439 | 0.3925 | 0.5605 |
| 2 | 175.7 | 200 | 0.9324 | 0.9547 | 0.9096 | 0.9689 | 0.2365 | 0.6570 |
| 3 | 352.4 | 200 | 0.9870 | 0.9985 | 0.9651 | 0.9633 | 0.4105 | 0.5940 |
| mean | | | 0.9635 | 0.9789 | 0.9431 | 0.9587 | 0.3465 | 0.6038 |
### YCB-V (21 objects, test)

| ID | Diam (mm) | N GT | ADD-AUC | ADD-S-AUC | MSSD-AUC | MSPD-AUC | ARMSSD | ARMSPD |
|---|---|---|---|---|---|---|---|---|
| 1 | 172.1 | 300 | 0.9468 | 0.9800 | 0.9470 | 0.9587 | 0.1273 | 0.5660 |
| 2 | 269.6 | 225 | 0.9967 | 0.9996 | 0.9943 | 0.9876 | 0.6609 | 0.7849 |
| 3 | 198.4 | 375 | 0.9925 | 0.9975 | 0.9906 | 0.9956 | 0.6568 | 0.8539 |
| 4 | 120.5 | 450 | 0.9518 | 0.9670 | 0.9436 | 0.9778 | 0.4098 | 0.7502 |
| 5 | 196.5 | 150 | 0.9512 | 0.9622 | 0.9373 | 0.9408 | 0.3940 | 0.5693 |
| 6 | 89.8 | 300 | 0.7816 | 0.8251 | 0.7290 | 0.8697 | 0.0230 | 0.1147 |
| 7 | 142.5 | 75 | 0.9780 | 0.9923 | 0.9743 | 0.9943 | 0.3600 | 0.8467 |
| 8 | 114.1 | 75 | 0.9910 | 1.0000 | 0.9863 | 1.0000 | 0.5920 | 0.9480 |
| 9 | 129.5 | 225 | 0.8569 | 0.8846 | 0.8296 | 0.9054 | 0.2316 | 0.4089 |
| 10 | 197.8 | 150 | 0.7247 | 0.7437 | 0.6943 | 0.9063 | 0.0400 | 0.2027 |
| 11 | 259.5 | 225 | 0.7109 | 0.7407 | 0.6770 | 0.6787 | 0.0262 | 0.0236 |
| 12 | 259.6 | 300 | 0.7168 | 0.7312 | 0.6992 | 0.8213 | 0.2130 | 0.2900 |
| 13 | 161.9 | 150 | 0.8455 | 0.8883 | 0.8307 | 0.8267 | 0.0587 | 0.0527 |
| 14 | 125.0 | 150 | 0.9562 | 0.9832 | 0.9252 | 0.9317 | 0.0500 | 0.3567 |
| 15 | 226.2 | 300 | 0.9940 | 0.9972 | 0.9878 | 0.9849 | 0.5987 | 0.7610 |
| 16 | 237.3 | 75 | 0.6860 | 0.7153 | 0.6660 | 0.8560 | 0.0000 | 0.0320 |
| 17 | 204.0 | 75 | 0.9407 | 0.9567 | 0.9337 | 0.9587 | 0.1373 | 0.5240 |
| 18 | 121.4 | 150 | 0.9830 | 0.9893 | 0.9788 | 0.9983 | 0.5080 | 0.9033 |
| 19 | 174.7 | 150 | 0.9033 | 0.9292 | 0.8903 | 0.9232 | 0.0753 | 0.4100 |
| 20 | 217.1 | 150 | 0.9152 | 0.9480 | 0.8945 | 0.9010 | 0.1467 | 0.2180 |
| 21 | 102.9 | 75 | 0.3873 | 0.4257 | 0.3460 | 0.8553 | 0.0000 | 0.0093 |
| mean | | | 0.8671 | 0.8884 | 0.8503 | 0.9177 | 0.2528 | 0.4584 |
### HB (33 objects, val_primesense)

| ID | Diam (mm) | N GT | ADD-AUC | ADD-S-AUC | MSSD-AUC |
|---|---|---|---|---|---|
| 1 | 232.6 | 680 | 0.9906 | 0.9962 | 0.9803 |
| 2 | 257.4 | 340 | 1.0000 | 1.0000 | 1.0000 |
| 3 | 166.5 | 1020 | 0.9737 | 0.9834 | 0.9640 |
| 4 | 179.0 | 680 | 0.9849 | 0.9908 | 0.9793 |
| 5 | 205.4 | 680 | 0.9864 | 0.9899 | 0.9825 |
| 6 | 121.4 | 340 | 0.9741 | 0.9968 | 0.9471 |
| 7 | 263.7 | 340 | 0.9984 | 1.0000 | 0.9940 |
| 8 | 186.8 | 680 | 0.9931 | 0.9968 | 0.9903 |
| 9 | 166.6 | 680 | 0.9139 | 0.9348 | 0.8892 |
| 10 | 180.8 | 680 | 0.9338 | 0.9546 | 0.9307 |
| 11 | 238.5 | 340 | 0.9153 | 0.9433 | 0.9260 |
| 12 | 156.9 | 1360 | 0.9688 | 0.9823 | 0.9537 |
| 13 | 145.3 | 1020 | 0.9582 | 0.9733 | 0.9455 |
| 14 | 243.7 | 680 | 0.9922 | 0.9965 | 0.9857 |
| 15 | 113.0 | 1700 | 0.9483 | 0.9611 | 0.9374 |
| 16 | 101.6 | 1020 | 0.9519 | 0.9679 | 0.9296 |
| 17 | 132.8 | 1360 | 0.9448 | 0.9571 | 0.9293 |
| 18 | 211.1 | 680 | 0.1994 | 0.2258 | 0.1693 |
| 19 | 185.6 | 680 | 0.9987 | 0.9993 | 0.9982 |
| 20 | 244.8 | 340 | 1.0000 | 1.0000 | 1.0000 |
| 21 | 212.6 | 340 | 0.9979 | 0.9993 | 0.9963 |
| 22 | 190.2 | 1360 | 0.9874 | 0.9933 | 0.9823 |
| 23 | 233.9 | 1020 | 0.9987 | 0.9997 | 0.9971 |
| 24 | 252.3 | 340 | 0.9940 | 0.9999 | 0.9889 |
| 25 | 202.9 | 680 | 0.9753 | 0.9853 | 0.9612 |
| 26 | 183.8 | 680 | 0.9785 | 0.9893 | 0.9613 |
| 27 | 264.4 | 340 | 0.9989 | 0.9998 | 0.9983 |
| 28 | 477.5 | 340 | 1.0000 | 1.0000 | 1.0000 |
| 29 | 198.0 | 680 | 0.9720 | 0.9970 | 0.9489 |
| 30 | 416.2 | 340 | 1.0000 | 1.0000 | 1.0000 |
| 31 | 158.0 | 340 | 0.9912 | 0.9971 | 0.9817 |
| 32 | 201.8 | 680 | 0.9610 | 0.9701 | 0.9487 |
| 33 | 187.2 | 680 | 0.9945 | 0.9974 | 0.9916 |
| mean | | | 0.9538 | 0.9630 | 0.9451 |
### ITODD (28 objects, val)

| ID | Diam (mm) | N GT | ADD-AUC | ADD-S-AUC | MSSD-AUC | MSPD-AUC | ARMSSD | ARMSPD |
|---|---|---|---|---|---|---|---|---|
| 1 | 64.1 | 4 | 0.9125 | 0.9437 | 0.8562 | 0.5500 | 0.0000 | 0.0000 |
| 2 | 51.5 | 3 | 0.4833 | 0.5000 | 0.4500 | 0.7167 | 0.0000 | 0.0000 |
| 3 | 142.2 | 4 | 0.0000 | 0.0000 | 0.0000 | 0.2750 | 0.0000 | 0.0000 |
| 4 | 139.4 | 3 | 0.2583 | 0.3000 | 0.2250 | 0.3083 | 0.0000 | 0.0000 |
| 5 | 158.6 | 6 | 0.7167 | 0.7583 | 0.6833 | 0.2500 | 0.0000 | 0.0000 |
| 6 | 85.3 | 5 | 0.9650 | 0.9750 | 0.9650 | 0.9750 | 0.1800 | 0.6000 |
| 7 | 38.5 | 5 | 0.0000 | 0.0200 | 0.0000 | 0.8250 | 0.0000 | 0.0000 |
| 8 | 68.9 | 3 | 0.8417 | 0.8917 | 0.7917 | 0.5083 | 0.0000 | 0.0000 |
| 9 | 94.8 | 3 | 0.0000 | 0.0000 | 0.0000 | 0.4000 | 0.0000 | 0.0000 |
| 10 | 55.7 | 4 | 0.9563 | 0.9688 | 0.9313 | 0.8938 | 0.0000 | 0.1500 |
| 11 | 140.1 | 5 | 0.5950 | 0.6300 | 0.5850 | 0.4550 | 0.0600 | 0.0000 |
| 12 | 107.7 | 4 | 0.0000 | 0.0000 | 0.0000 | 0.1437 | 0.0000 | 0.0000 |
| 13 | 128.1 | 4 | 0.9750 | 0.9875 | 0.9563 | 0.7438 | 0.1750 | 0.1000 |
| 14 | 102.9 | 3 | 0.0000 | 0.0000 | 0.0000 | 0.3917 | 0.0000 | 0.0000 |
| 15 | 114.2 | 3 | 0.9583 | 0.9833 | 0.9167 | 0.4750 | 0.0333 | 0.0000 |
| 16 | 193.1 | 3 | 0.9833 | 1.0000 | 0.9667 | 0.5917 | 0.2333 | 0.0000 |
| 17 | 77.8 | 3 | 0.7333 | 0.7833 | 0.6833 | 0.6167 | 0.0000 | 0.0000 |
| 18 | 108.5 | 3 | 0.2083 | 0.2417 | 0.1917 | 0.6583 | 0.0000 | 0.0000 |
| 19 | 121.4 | 3 | 0.7917 | 0.8417 | 0.7500 | 0.5083 | 0.0000 | 0.0000 |
| 20 | 122.0 | 4 | 0.9812 | 0.9875 | 0.9750 | 0.9062 | 0.6250 | 0.4250 |
| 21 | 171.2 | 3 | 0.9917 | 1.0000 | 0.9833 | 0.7833 | 0.5333 | 0.1333 |
| 22 | 267.5 | 3 | 0.9333 | 0.9667 | 0.8833 | 0.3333 | 0.0000 | 0.0000 |
| 23 | 56.9 | 1 | 1.0000 | 1.0000 | 0.9750 | 0.9750 | 0.4000 | 0.7000 |
| 24 | 65.0 | 6 | 0.3333 | 0.3583 | 0.3292 | 0.8708 | 0.0000 | 0.0000 |
| 25 | 48.5 | 6 | 0.9375 | 0.9625 | 0.9083 | 0.7917 | 0.0000 | 0.3000 |
| 26 | 66.8 | 6 | 0.9792 | 0.9875 | 0.9667 | 0.9458 | 0.3500 | 0.4333 |
| 27 | 55.7 | 5 | 0.7200 | 0.7750 | 0.6700 | 0.6800 | 0.0000 | 0.0000 |
| 28 | 24.1 | 18 | 0.5306 | 0.5681 | 0.5014 | 0.8917 | 0.0000 | 0.0611 |
| mean | | | 0.6352 | 0.6582 | 0.6123 | 0.6237 | 0.0925 | 0.1037 |
### IC-BIN (2 objects, test)

| ID | Diam (mm) | N GT | ADD-AUC | ADD-S-AUC | MSSD-AUC | MSPD-AUC | ARMSSD | ARMSPD |
|---|---|---|---|---|---|---|---|---|
| 1 | 136.6 | 1800 | 0.8560 | 0.8871 | 0.8380 | 0.9422 | 0.2424 | 0.4592 |
| 2 | 220.6 | 450 | 0.8723 | 0.8971 | 0.8364 | 0.8962 | 0.0671 | 0.2191 |
| mean | | | 0.8641 | 0.8921 | 0.8372 | 0.9192 | 0.1548 | 0.3391 |
### HOPE (28 objects, val)

| ID | Diam (mm) | N GT | ADD-AUC | ADD-S-AUC | MSSD-AUC | MSPD-AUC | ARMSSD | ARMSPD |
|---|---|---|---|---|---|---|---|---|
| 1 | 107.9 | 45 | 0.8717 | 0.8867 | 0.8644 | 0.9417 | 0.4178 | 0.5844 |
| 2 | 153.9 | 45 | 0.8994 | 0.9150 | 0.8733 | 0.8478 | 0.3133 | 0.4000 |
| 3 | 115.0 | 45 | 0.8622 | 0.8778 | 0.8533 | 0.9439 | 0.4378 | 0.6200 |
| 4 | 89.6 | 20 | 0.9525 | 0.9713 | 0.9400 | 0.9475 | 0.3050 | 0.5300 |
| 5 | 98.0 | 30 | 0.8350 | 0.8458 | 0.8250 | 0.9417 | 0.5300 | 0.6633 |
| 6 | 208.7 | 25 | 0.9910 | 0.9960 | 0.9880 | 0.9370 | 0.5920 | 0.4880 |
| 7 | 89.6 | 40 | 0.8325 | 0.8663 | 0.8150 | 0.9187 | 0.1250 | 0.3600 |
| 8 | 115.6 | 25 | 0.8990 | 0.9190 | 0.8780 | 0.8470 | 0.2760 | 0.3600 |
| 9 | 206.0 | 25 | 0.8610 | 0.8910 | 0.8210 | 0.7250 | 0.2000 | 0.1880 |
| 10 | 89.7 | 35 | 0.6257 | 0.6493 | 0.5979 | 0.8479 | 0.1571 | 0.2286 |
| 11 | 153.9 | 40 | 0.9238 | 0.9338 | 0.9094 | 0.9256 | 0.5225 | 0.6525 |
| 12 | 207.1 | 50 | 0.9535 | 0.9710 | 0.9310 | 0.7805 | 0.3500 | 0.2800 |
| 13 | 153.4 | 30 | 0.9767 | 0.9917 | 0.9550 | 0.8525 | 0.3767 | 0.3733 |
| 14 | 204.5 | 25 | 0.9660 | 0.9760 | 0.9500 | 0.8920 | 0.4120 | 0.3520 |
| 15 | 75.7 | 35 | 0.8814 | 0.9150 | 0.8521 | 0.8929 | 0.0657 | 0.3943 |
| 16 | 161.5 | 50 | 0.9560 | 0.9660 | 0.9420 | 0.8980 | 0.5120 | 0.5220 |
| 17 | 205.7 | 25 | 0.7150 | 0.7290 | 0.6790 | 0.7060 | 0.1960 | 0.1840 |
| 18 | 122.8 | 30 | 0.9658 | 0.9783 | 0.9458 | 0.9133 | 0.4167 | 0.4467 |
| 19 | 89.2 | 20 | 0.9650 | 0.9800 | 0.9550 | 0.9750 | 0.3600 | 0.7350 |
| 20 | 89.9 | 30 | 0.7575 | 0.7900 | 0.7350 | 0.9050 | 0.0900 | 0.3400 |
| 21 | 89.2 | 55 | 0.8286 | 0.8509 | 0.8141 | 0.9223 | 0.2855 | 0.5655 |
| 22 | 152.4 | 20 | 0.7838 | 0.8137 | 0.7475 | 0.8462 | 0.0650 | 0.1350 |
| 23 | 151.3 | 40 | 0.8206 | 0.8475 | 0.7900 | 0.7162 | 0.0975 | 0.1500 |
| 24 | 151.3 | 10 | 1.0000 | 1.0000 | 0.9975 | 0.9750 | 0.8000 | 0.6500 |
| 25 | 252.8 | 35 | 0.9786 | 0.9836 | 0.9714 | 0.9257 | 0.5286 | 0.5400 |
| 26 | 107.1 | 35 | 0.7529 | 0.7679 | 0.7457 | 0.9364 | 0.3771 | 0.5800 |
| 27 | 76.1 | 35 | 0.7736 | 0.7971 | 0.7636 | 0.9443 | 0.1629 | 0.4886 |
| 28 | 82.9 | 20 | 0.8962 | 0.9150 | 0.8875 | 0.9525 | 0.3550 | 0.5750 |
| mean | | | 0.8759 | 0.8937 | 0.8581 | 0.8878 | 0.3331 | 0.4424 |
## File Structure

```
{dataset}/obj_{NNNNNN}/
├── best_coco_AP_epoch_NNN.pth   # DOPER-t keypoint checkpoint
├── keypoints_3d.json            # 17 symmetry-aware 3D keypoints (mm)
├── bop_summary.json             # evaluation metrics
└── vis_grid.jpg                 # qualitative results (GT-bbox eval)
```

Detector checkpoints and BOP submission CSVs are in the `bop_submission/` directory.
## Usage

```python
from huggingface_hub import hf_hub_download

ckpt = hf_hub_download("TontonTremblay/DOPER_BOP", "ycbv/obj_000001/best_coco_AP_epoch_200.pth", repo_type="dataset")
kpts = hf_hub_download("TontonTremblay/DOPER_BOP", "ycbv/obj_000001/keypoints_3d.json", repo_type="dataset")
```
## BOP Submission

Pre-computed detection + pose results for all 9 datasets, in BOP CSV format:

- `bop_submission/doper-t_bop_results_det_v2.zip` -- RTMDet + DOPER-t + PnP (no GT bbox)
## Citation

If you use these models, please cite the DOPER project and the BOP benchmark.