Commit addbf21 · 1 parent: 3192c09
committed by 123

upload ckpts
100_percent/LLM-Augmented-MTR/best_eval_record.txt ADDED
@@ -0,0 +1,40 @@
+ epoch_1 mAP 0.23161921567387053
+ best_epoch_1 mAP 0.23161921567387053
+ epoch_4 mAP 0.3319349918100569
+ best_epoch_4 mAP 0.3319349918100569
+ epoch_6 mAP 0.3404955714941025
+ best_epoch_6 mAP 0.3404955714941025
+ epoch_8 mAP 0.37202317184872097
+ best_epoch_8 mAP 0.37202317184872097
+ epoch_10 mAP 0.3758834236198001
+ best_epoch_10 mAP 0.3758834236198001
+ epoch_12 mAP 0.37917777564790517
+ best_epoch_12 mAP 0.37917777564790517
+ epoch_14 mAP 0.37314530710379284
+ best_epoch_12 mAP 0.37917777564790517
+ epoch_16 mAP 0.39642529024018175
+ best_epoch_16 mAP 0.39642529024018175
+ epoch_18 mAP 0.40432000160217285
+ best_epoch_18 mAP 0.40432000160217285
+ epoch_20 mAP 0.4059317045741611
+ best_epoch_20 mAP 0.4059317045741611
+ epoch_21 mAP 0.40459092126952273
+ best_epoch_20 mAP 0.4059317045741611
+ epoch_22 mAP 0.4035279485914442
+ best_epoch_20 mAP 0.4059317045741611
+ epoch_23 mAP 0.4111773471037547
+ best_epoch_23 mAP 0.4111773471037547
+ epoch_24 mAP 0.41981253690189785
+ best_epoch_24 mAP 0.41981253690189785
+ epoch_25 mAP 0.4195335838529799
+ best_epoch_24 mAP 0.41981253690189785
+ epoch_26 mAP 0.4268754555119409
+ best_epoch_26 mAP 0.4268754555119409
+ epoch_27 mAP 0.42183637287881637
+ best_epoch_26 mAP 0.4268754555119409
+ epoch_28 mAP 0.4209697412119972
+ best_epoch_26 mAP 0.4268754555119409
+ epoch_29 mAP 0.4222996963395012
+ best_epoch_26 mAP 0.4268754555119409
+ epoch_30 mAP 0.4242007202572293
+ best_epoch_26 mAP 0.4268754555119409
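The record above alternates per-epoch mAP with the running best. As a minimal sketch (a hypothetical helper, not part of the repository, assuming the two-tokens-plus-value format shown), the overall best epoch can be recovered like this:

```python
# Parse best_eval_record.txt-style lines ("epoch_N mAP x" alternating with
# "best_epoch_M mAP y") and return the epoch with the highest mAP.
def best_epoch(lines):
    best = (None, -1.0)  # (epoch, mAP)
    for line in lines:
        name, _, value = line.split()
        if name.startswith("epoch_"):  # skip the "best_epoch_" bookkeeping lines
            epoch, m = int(name.split("_")[1]), float(value)
            if m > best[1]:
                best = (epoch, m)
    return best

record = [
    "epoch_24 mAP 0.41981253690189785",
    "best_epoch_24 mAP 0.41981253690189785",
    "epoch_26 mAP 0.4268754555119409",
    "best_epoch_26 mAP 0.4268754555119409",
]
print(best_epoch(record))  # (26, 0.4268754555119409)
```

On the full record this agrees with the file's own bookkeeping: epoch 26 is the best checkpoint, which matches the `checkpoint_epoch_26.pth` uploaded below.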
100_percent/LLM-Augmented-MTR/checkpoint_epoch_26.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:fc53c6fbae714165ac201d28f8b6ea1fd558ffb45a9bca22cdfab00484d60d1e
+ size 843700702
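The checkpoint is stored as a Git LFS pointer: three `key value` lines giving the spec version, the content hash, and the byte size. A minimal sketch (an illustration of the pointer format above, not the git-lfs client) that parses those fields:

```python
# Parse a Git LFS pointer file: "version <url>", "oid <algo>:<hex>", "size <bytes>".
def parse_lfs_pointer(text):
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algo, digest = fields["oid"].split(":", 1)
    return {"version": fields["version"], "algo": algo,
            "digest": digest, "size": int(fields["size"])}

pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:fc53c6fbae714165ac201d28f8b6ea1fd558ffb45a9bca22cdfab00484d60d1e
size 843700702
"""
info = parse_lfs_pointer(pointer)
print(info["size"])  # 843700702 (about 844 MB)
```

The actual weights are fetched by the LFS client (e.g. `git lfs pull`); only this small pointer lives in the git history.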
100_percent/LLM-Augmented-MTR/log_train_20240516-100200.txt ADDED
The diff for this file is too large to render. See raw diff
 
100_percent/LLM-Augmented-MTR/log_train_20240516-143350.txt ADDED
The diff for this file is too large to render. See raw diff
 
100_percent/LLM-Augmented-MTR/log_train_20240518-102918.txt ADDED
@@ -0,0 +1,1038 @@
+ 2024-05-18 10:29:18,049 INFO **********************Start logging**********************
+ 2024-05-18 10:29:18,049 INFO CUDA_VISIBLE_DEVICES=4,5,6,7
+ 2024-05-18 10:29:18,050 INFO total_batch_size: 52
+ 2024-05-18 10:29:18,050 INFO cfg_file cfgs/waymo/mtr+100_percent_data_llm_augmented.yaml
+ 2024-05-18 10:29:18,050 INFO batch_size 13
+ 2024-05-18 10:29:18,050 INFO epochs 30
+ 2024-05-18 10:29:18,050 INFO workers 8
+ 2024-05-18 10:29:18,050 INFO extra_tag llm_augmented_mtr+100_percent_attn_loop_learned_gate_inside+change_context_window_8_layer_4
+ 2024-05-18 10:29:18,050 INFO ckpt None
+ 2024-05-18 10:29:18,050 INFO pretrained_model None
+ 2024-05-18 10:29:18,050 INFO launcher pytorch
+ 2024-05-18 10:29:18,050 INFO tcp_port 18888
+ 2024-05-18 10:29:18,050 INFO without_sync_bn False
+ 2024-05-18 10:29:18,050 INFO fix_random_seed False
+ 2024-05-18 10:29:18,050 INFO ckpt_save_interval 2
+ 2024-05-18 10:29:18,050 INFO local_rank 0
+ 2024-05-18 10:29:18,050 INFO max_ckpt_save_num 5
+ 2024-05-18 10:29:18,050 INFO merge_all_iters_to_one_epoch False
+ 2024-05-18 10:29:18,050 INFO set_cfgs None
+ 2024-05-18 10:29:18,050 INFO max_waiting_mins 0
+ 2024-05-18 10:29:18,050 INFO start_epoch 0
+ 2024-05-18 10:29:18,050 INFO save_to_file False
+ 2024-05-18 10:29:18,050 INFO not_eval_with_train False
+ 2024-05-18 10:29:18,050 INFO logger_iter_interval 50
+ 2024-05-18 10:29:18,050 INFO ckpt_save_time_interval 300
+ 2024-05-18 10:29:18,050 INFO add_worker_init_fn False
+ 2024-05-18 10:29:18,051 INFO dataset_type
+ 2024-05-18 10:29:18,051 INFO cfg.ROOT_DIR: /home/aidrive/zhengxj/projects_new/MTR_new
+ 2024-05-18 10:29:18,051 INFO cfg.LOCAL_RANK: 0
+ 2024-05-18 10:29:18,051 INFO
+ cfg.DATA_CONFIG = edict()
+ 2024-05-18 10:29:18,055 INFO cfg.DATA_CONFIG.DATASET: WaymoDataset
+ 2024-05-18 10:29:18,055 INFO cfg.DATA_CONFIG.OBJECT_TYPE: ['TYPE_VEHICLE', 'TYPE_PEDESTRIAN', 'TYPE_CYCLIST']
+ 2024-05-18 10:29:18,055 INFO cfg.DATA_CONFIG.DATA_ROOT: /home/DISCOVER/yanzj/workspace/code/MTR/data/waymo/mtr_processed
+ 2024-05-18 10:29:18,055 INFO
+ cfg.DATA_CONFIG.SPLIT_DIR = edict()
+ 2024-05-18 10:29:18,055 INFO cfg.DATA_CONFIG.SPLIT_DIR.train: processed_scenarios_training
+ 2024-05-18 10:29:18,055 INFO cfg.DATA_CONFIG.SPLIT_DIR.valid: processed_scenarios_validation
+ 2024-05-18 10:29:18,055 INFO cfg.DATA_CONFIG.SPLIT_DIR.test: processed_scenarios_testing
+ 2024-05-18 10:29:18,055 INFO
+ cfg.DATA_CONFIG.INFO_FILE = edict()
+ 2024-05-18 10:29:18,055 INFO cfg.DATA_CONFIG.INFO_FILE.train: processed_scenarios_training_infos.pkl
+ 2024-05-18 10:29:18,055 INFO cfg.DATA_CONFIG.INFO_FILE.valid: processed_scenarios_val_infos.pkl
+ 2024-05-18 10:29:18,055 INFO cfg.DATA_CONFIG.INFO_FILE.test: processed_scenarios_test_infos.pkl
+ 2024-05-18 10:29:18,055 INFO
+ cfg.DATA_CONFIG.SAMPLE_INTERVAL = edict()
+ 2024-05-18 10:29:18,055 INFO cfg.DATA_CONFIG.SAMPLE_INTERVAL.train: 1
+ 2024-05-18 10:29:18,055 INFO cfg.DATA_CONFIG.SAMPLE_INTERVAL.valid: 1
+ 2024-05-18 10:29:18,055 INFO cfg.DATA_CONFIG.SAMPLE_INTERVAL.test: 1
+ 2024-05-18 10:29:18,055 INFO
+ cfg.DATA_CONFIG.INFO_FILTER_DICT = edict()
+ 2024-05-18 10:29:18,055 INFO cfg.DATA_CONFIG.INFO_FILTER_DICT.filter_info_by_object_type: ['TYPE_VEHICLE', 'TYPE_PEDESTRIAN', 'TYPE_CYCLIST']
+ 2024-05-18 10:29:18,055 INFO cfg.DATA_CONFIG.POINT_SAMPLED_INTERVAL: 1
+ 2024-05-18 10:29:18,055 INFO cfg.DATA_CONFIG.NUM_POINTS_EACH_POLYLINE: 20
+ 2024-05-18 10:29:18,055 INFO cfg.DATA_CONFIG.VECTOR_BREAK_DIST_THRESH: 1.0
+ 2024-05-18 10:29:18,055 INFO cfg.DATA_CONFIG.NUM_OF_SRC_POLYLINES: 768
+ 2024-05-18 10:29:18,056 INFO cfg.DATA_CONFIG.CENTER_OFFSET_OF_MAP: [30.0, 0]
+ 2024-05-18 10:29:18,056 INFO cfg.DATA_CONFIG.LOAD_CONTEXT_DATA: True
+ 2024-05-18 10:29:18,056 INFO cfg.DATA_CONFIG.GENERATE_EMBEDDING: False
+ 2024-05-18 10:29:18,056 INFO cfg.DATA_CONFIG.RETRIEVAL_WINDOW_SIZE: 8
+ 2024-05-18 10:29:18,056 INFO cfg.DATA_CONFIG.ENCODER_FOR_CONTEXT: 100
+ 2024-05-18 10:29:18,056 INFO
+ cfg.MODEL = edict()
+ 2024-05-18 10:29:18,056 INFO
+ cfg.MODEL.CONTEXT_ENCODER = edict()
+ 2024-05-18 10:29:18,056 INFO cfg.MODEL.CONTEXT_ENCODER.NAME: MTREncoder
+ 2024-05-18 10:29:18,056 INFO cfg.MODEL.CONTEXT_ENCODER.NUM_OF_ATTN_NEIGHBORS: 16
+ 2024-05-18 10:29:18,056 INFO cfg.MODEL.CONTEXT_ENCODER.NUM_INPUT_ATTR_AGENT: 29
+ 2024-05-18 10:29:18,056 INFO cfg.MODEL.CONTEXT_ENCODER.NUM_INPUT_ATTR_MAP: 9
+ 2024-05-18 10:29:18,056 INFO cfg.MODEL.CONTEXT_ENCODER.NUM_CHANNEL_IN_MLP_AGENT: 256
+ 2024-05-18 10:29:18,056 INFO cfg.MODEL.CONTEXT_ENCODER.NUM_CHANNEL_IN_MLP_MAP: 64
+ 2024-05-18 10:29:18,056 INFO cfg.MODEL.CONTEXT_ENCODER.NUM_LAYER_IN_MLP_AGENT: 3
+ 2024-05-18 10:29:18,056 INFO cfg.MODEL.CONTEXT_ENCODER.NUM_LAYER_IN_MLP_MAP: 5
+ 2024-05-18 10:29:18,056 INFO cfg.MODEL.CONTEXT_ENCODER.NUM_LAYER_IN_PRE_MLP_MAP: 3
+ 2024-05-18 10:29:18,056 INFO cfg.MODEL.CONTEXT_ENCODER.D_MODEL: 256
+ 2024-05-18 10:29:18,056 INFO cfg.MODEL.CONTEXT_ENCODER.NUM_ATTN_LAYERS: 6
+ 2024-05-18 10:29:18,056 INFO cfg.MODEL.CONTEXT_ENCODER.NUM_ATTN_HEAD: 8
+ 2024-05-18 10:29:18,056 INFO cfg.MODEL.CONTEXT_ENCODER.DROPOUT_OF_ATTN: 0.1
+ 2024-05-18 10:29:18,056 INFO cfg.MODEL.CONTEXT_ENCODER.USE_LOCAL_ATTN: True
+ 2024-05-18 10:29:18,056 INFO
+ cfg.MODEL.MOTION_DECODER = edict()
+ 2024-05-18 10:29:18,057 INFO cfg.MODEL.MOTION_DECODER.NAME: MTRDecoder
+ 2024-05-18 10:29:18,057 INFO cfg.MODEL.MOTION_DECODER.OBJECT_TYPE: ['TYPE_VEHICLE', 'TYPE_PEDESTRIAN', 'TYPE_CYCLIST']
+ 2024-05-18 10:29:18,057 INFO cfg.MODEL.MOTION_DECODER.CENTER_OFFSET_OF_MAP: [30.0, 0]
+ 2024-05-18 10:29:18,057 INFO cfg.MODEL.MOTION_DECODER.NUM_FUTURE_FRAMES: 80
+ 2024-05-18 10:29:18,057 INFO cfg.MODEL.MOTION_DECODER.NUM_MOTION_MODES: 6
+ 2024-05-18 10:29:18,057 INFO cfg.MODEL.MOTION_DECODER.INTENTION_POINTS_FILE: data/waymo/cluster_64_center_dict.pkl
+ 2024-05-18 10:29:18,057 INFO cfg.MODEL.MOTION_DECODER.D_MODEL: 512
+ 2024-05-18 10:29:18,057 INFO cfg.MODEL.MOTION_DECODER.NUM_DECODER_LAYERS: 6
+ 2024-05-18 10:29:18,057 INFO cfg.MODEL.MOTION_DECODER.NUM_ATTN_HEAD: 8
+ 2024-05-18 10:29:18,057 INFO cfg.MODEL.MOTION_DECODER.MAP_D_MODEL: 256
+ 2024-05-18 10:29:18,057 INFO cfg.MODEL.MOTION_DECODER.DROPOUT_OF_ATTN: 0.1
+ 2024-05-18 10:29:18,057 INFO cfg.MODEL.MOTION_DECODER.NUM_BASE_MAP_POLYLINES: 256
+ 2024-05-18 10:29:18,057 INFO cfg.MODEL.MOTION_DECODER.NUM_WAYPOINT_MAP_POLYLINES: 128
+ 2024-05-18 10:29:18,057 INFO
+ cfg.MODEL.MOTION_DECODER.LOSS_WEIGHTS = edict()
+ 2024-05-18 10:29:18,057 INFO cfg.MODEL.MOTION_DECODER.LOSS_WEIGHTS.cls: 1.0
+ 2024-05-18 10:29:18,057 INFO cfg.MODEL.MOTION_DECODER.LOSS_WEIGHTS.reg: 1.0
+ 2024-05-18 10:29:18,057 INFO cfg.MODEL.MOTION_DECODER.LOSS_WEIGHTS.vel: 0.5
+ 2024-05-18 10:29:18,057 INFO cfg.MODEL.MOTION_DECODER.NMS_DIST_THRESH: 2.5
+ 2024-05-18 10:29:18,057 INFO cfg.MODEL.MOTION_DECODER.LOAD_CONTEXT_DATA: True
+ 2024-05-18 10:29:18,057 INFO cfg.MODEL.MOTION_DECODER.RETRIEVAL_WINDOW_SIZE: 8
+ 2024-05-18 10:29:18,057 INFO cfg.MODEL.GENERATE_EMBEDDING: False
+ 2024-05-18 10:29:18,057 INFO
+ cfg.OPTIMIZATION = edict()
+ 2024-05-18 10:29:18,057 INFO cfg.OPTIMIZATION.BATCH_SIZE_PER_GPU: 10
+ 2024-05-18 10:29:18,058 INFO cfg.OPTIMIZATION.NUM_EPOCHS: 30
+ 2024-05-18 10:29:18,058 INFO cfg.OPTIMIZATION.OPTIMIZER: AdamW
+ 2024-05-18 10:29:18,058 INFO cfg.OPTIMIZATION.LR: 0.0001
+ 2024-05-18 10:29:18,058 INFO cfg.OPTIMIZATION.WEIGHT_DECAY: 0.01
+ 2024-05-18 10:29:18,058 INFO cfg.OPTIMIZATION.SCHEDULER: lambdaLR
+ 2024-05-18 10:29:18,058 INFO cfg.OPTIMIZATION.DECAY_STEP_LIST: [22, 24, 26, 28]
+ 2024-05-18 10:29:18,058 INFO cfg.OPTIMIZATION.LR_DECAY: 0.5
+ 2024-05-18 10:29:18,058 INFO cfg.OPTIMIZATION.LR_CLIP: 1e-06
+ 2024-05-18 10:29:18,058 INFO cfg.OPTIMIZATION.GRAD_NORM_CLIP: 1000.0
+ 2024-05-18 10:29:18,058 INFO cfg.TAG: mtr+100_percent_data_llm_augmented
+ 2024-05-18 10:29:18,058 INFO cfg.EXP_GROUP_PATH: waymo
+ 2024-05-18 10:29:18,135 INFO Start to load infos from /home/DISCOVER/yanzj/workspace/code/MTR/data/waymo/mtr_processed/processed_scenarios_training_infos.pkl
+ 2024-05-18 10:29:31,419 INFO Total scenes before filters: 487002
+ 2024-05-18 10:29:41,038 INFO Total scenes after filter_info_by_object_type: 487002
+ 2024-05-18 10:29:41,063 INFO Total scenes after filters: 487002
+ 2024-05-18 10:29:41,064 INFO Start to load context from /home/aidrive/zhengxj/projects_new/MTR_new/LLM_integrate/context_data/train/context_data_encoder_100.pkl
+ 2024-05-18 10:30:18,067 INFO Total scenes in context info file: 487002
+ 2024-05-18 10:31:55,387 INFO ==> Loading parameters from checkpoint /home/aidrive/zhengxj/projects_new/MTR_new/output/waymo/mtr+100_percent_data_llm_augmented/llm_augmented_mtr+100_percent_attn_loop_learned_gate_inside+change_context_window_8_layer_4/ckpt/latest_model.pth to CPU
+ 2024-05-18 10:31:57,791 INFO ==> Loading optimizer parameters from checkpoint /home/aidrive/zhengxj/projects_new/MTR_new/output/waymo/mtr+100_percent_data_llm_augmented/llm_augmented_mtr+100_percent_attn_loop_learned_gate_inside+change_context_window_8_layer_4/ckpt/latest_model.pth to CPU
+ 2024-05-18 10:31:59,104 INFO ==> Done (loaded 894/894)
+ 2024-05-18 10:32:00,840 INFO DistributedDataParallel(
+ (module): MotionTransformer(
+ (context_encoder): MTREncoder(
+ (agent_polyline_encoder): PointNetPolylineEncoder(
+ (pre_mlps): Sequential(
+ (0): Linear(in_features=30, out_features=256, bias=False)
+ (1): SyncBatchNorm(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ )
+ (mlps): Sequential(
+ (0): Linear(in_features=512, out_features=256, bias=False)
+ (1): SyncBatchNorm(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=256, out_features=256, bias=False)
+ (4): SyncBatchNorm(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (5): ReLU()
+ )
+ (out_mlps): Sequential(
+ (0): Linear(in_features=256, out_features=256, bias=True)
+ (1): ReLU()
+ (2): Linear(in_features=256, out_features=256, bias=True)
+ )
+ )
+ (map_polyline_encoder): PointNetPolylineEncoder(
+ (pre_mlps): Sequential(
+ (0): Linear(in_features=9, out_features=64, bias=False)
+ (1): SyncBatchNorm(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=64, out_features=64, bias=False)
+ (4): SyncBatchNorm(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (5): ReLU()
+ (6): Linear(in_features=64, out_features=64, bias=False)
+ (7): SyncBatchNorm(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (8): ReLU()
+ )
+ (mlps): Sequential(
+ (0): Linear(in_features=128, out_features=64, bias=False)
+ (1): SyncBatchNorm(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=64, out_features=64, bias=False)
+ (4): SyncBatchNorm(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (5): ReLU()
+ )
+ (out_mlps): Sequential(
+ (0): Linear(in_features=64, out_features=64, bias=True)
+ (1): ReLU()
+ (2): Linear(in_features=64, out_features=256, bias=True)
+ )
+ )
+ (self_attn_layers): ModuleList(
+ (0): TransformerEncoderLayer(
+ (self_attn): MultiheadAttentionLocal(
+ (out_proj): Linear(in_features=256, out_features=256, bias=True)
+ )
+ (linear1): Linear(in_features=256, out_features=1024, bias=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ (linear2): Linear(in_features=1024, out_features=256, bias=True)
+ (norm1): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
+ (norm2): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
+ (dropout1): Dropout(p=0.1, inplace=False)
+ (dropout2): Dropout(p=0.1, inplace=False)
+ )
+ (1): TransformerEncoderLayer(
+ (self_attn): MultiheadAttentionLocal(
+ (out_proj): Linear(in_features=256, out_features=256, bias=True)
+ )
+ (linear1): Linear(in_features=256, out_features=1024, bias=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ (linear2): Linear(in_features=1024, out_features=256, bias=True)
+ (norm1): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
+ (norm2): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
+ (dropout1): Dropout(p=0.1, inplace=False)
+ (dropout2): Dropout(p=0.1, inplace=False)
+ )
+ (2): TransformerEncoderLayer(
+ (self_attn): MultiheadAttentionLocal(
+ (out_proj): Linear(in_features=256, out_features=256, bias=True)
+ )
+ (linear1): Linear(in_features=256, out_features=1024, bias=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ (linear2): Linear(in_features=1024, out_features=256, bias=True)
+ (norm1): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
+ (norm2): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
+ (dropout1): Dropout(p=0.1, inplace=False)
+ (dropout2): Dropout(p=0.1, inplace=False)
+ )
+ (3): TransformerEncoderLayer(
+ (self_attn): MultiheadAttentionLocal(
+ (out_proj): Linear(in_features=256, out_features=256, bias=True)
+ )
+ (linear1): Linear(in_features=256, out_features=1024, bias=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ (linear2): Linear(in_features=1024, out_features=256, bias=True)
+ (norm1): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
+ (norm2): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
+ (dropout1): Dropout(p=0.1, inplace=False)
+ (dropout2): Dropout(p=0.1, inplace=False)
+ )
+ (4): TransformerEncoderLayer(
+ (self_attn): MultiheadAttentionLocal(
+ (out_proj): Linear(in_features=256, out_features=256, bias=True)
+ )
+ (linear1): Linear(in_features=256, out_features=1024, bias=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ (linear2): Linear(in_features=1024, out_features=256, bias=True)
+ (norm1): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
+ (norm2): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
+ (dropout1): Dropout(p=0.1, inplace=False)
+ (dropout2): Dropout(p=0.1, inplace=False)
+ )
+ (5): TransformerEncoderLayer(
+ (self_attn): MultiheadAttentionLocal(
+ (out_proj): Linear(in_features=256, out_features=256, bias=True)
+ )
+ (linear1): Linear(in_features=256, out_features=1024, bias=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ (linear2): Linear(in_features=1024, out_features=256, bias=True)
+ (norm1): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
+ (norm2): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
+ (dropout1): Dropout(p=0.1, inplace=False)
+ (dropout2): Dropout(p=0.1, inplace=False)
+ )
+ )
+ )
+ (motion_decoder): MTRDecoder(
+ (in_proj_center_obj): Sequential(
+ (0): Linear(in_features=256, out_features=512, bias=True)
+ (1): ReLU()
+ (2): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (in_proj_obj): Sequential(
+ (0): Linear(in_features=256, out_features=512, bias=True)
+ (1): ReLU()
+ (2): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (obj_decoder_layers): ModuleList(
+ (0): TransformerDecoderLayer(
+ (sa_qcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_qpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_kcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_kpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_v_proj): Linear(in_features=512, out_features=512, bias=True)
+ (self_attn): MultiheadAttention(
+ (out_proj): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (norm1): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+ (dropout1): Dropout(p=0.1, inplace=False)
+ (ca_qcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_qpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_kcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_kpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_v_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_qpos_sine_proj): Linear(in_features=512, out_features=512, bias=True)
+ (cross_attn): MultiheadAttention(
+ (out_proj): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (linear1): Linear(in_features=512, out_features=2048, bias=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ (linear2): Linear(in_features=2048, out_features=512, bias=True)
+ (norm2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+ (norm3): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+ (dropout2): Dropout(p=0.1, inplace=False)
+ (dropout3): Dropout(p=0.1, inplace=False)
+ )
+ (1): TransformerDecoderLayer(
+ (sa_qcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_qpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_kcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_kpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_v_proj): Linear(in_features=512, out_features=512, bias=True)
+ (self_attn): MultiheadAttention(
+ (out_proj): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (norm1): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+ (dropout1): Dropout(p=0.1, inplace=False)
+ (ca_qcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_qpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_kcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_kpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_v_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_qpos_sine_proj): Linear(in_features=512, out_features=512, bias=True)
+ (cross_attn): MultiheadAttention(
+ (out_proj): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (linear1): Linear(in_features=512, out_features=2048, bias=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ (linear2): Linear(in_features=2048, out_features=512, bias=True)
+ (norm2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+ (norm3): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+ (dropout2): Dropout(p=0.1, inplace=False)
+ (dropout3): Dropout(p=0.1, inplace=False)
+ )
+ (2): TransformerDecoderLayer(
+ (sa_qcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_qpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_kcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_kpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_v_proj): Linear(in_features=512, out_features=512, bias=True)
+ (self_attn): MultiheadAttention(
+ (out_proj): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (norm1): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+ (dropout1): Dropout(p=0.1, inplace=False)
+ (ca_qcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_qpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_kcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_kpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_v_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_qpos_sine_proj): Linear(in_features=512, out_features=512, bias=True)
+ (cross_attn): MultiheadAttention(
+ (out_proj): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (linear1): Linear(in_features=512, out_features=2048, bias=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ (linear2): Linear(in_features=2048, out_features=512, bias=True)
+ (norm2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+ (norm3): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+ (dropout2): Dropout(p=0.1, inplace=False)
+ (dropout3): Dropout(p=0.1, inplace=False)
+ )
+ (3): TransformerDecoderLayer(
+ (sa_qcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_qpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_kcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_kpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_v_proj): Linear(in_features=512, out_features=512, bias=True)
+ (self_attn): MultiheadAttention(
+ (out_proj): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (norm1): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+ (dropout1): Dropout(p=0.1, inplace=False)
+ (ca_qcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_qpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_kcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_kpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_v_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_qpos_sine_proj): Linear(in_features=512, out_features=512, bias=True)
+ (cross_attn): MultiheadAttention(
+ (out_proj): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (linear1): Linear(in_features=512, out_features=2048, bias=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ (linear2): Linear(in_features=2048, out_features=512, bias=True)
+ (norm2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+ (norm3): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+ (dropout2): Dropout(p=0.1, inplace=False)
+ (dropout3): Dropout(p=0.1, inplace=False)
+ )
+ (4): TransformerDecoderLayer(
+ (sa_qcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_qpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_kcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_kpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_v_proj): Linear(in_features=512, out_features=512, bias=True)
+ (self_attn): MultiheadAttention(
+ (out_proj): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (norm1): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+ (dropout1): Dropout(p=0.1, inplace=False)
+ (ca_qcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_qpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_kcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_kpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_v_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_qpos_sine_proj): Linear(in_features=512, out_features=512, bias=True)
+ (cross_attn): MultiheadAttention(
+ (out_proj): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (linear1): Linear(in_features=512, out_features=2048, bias=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ (linear2): Linear(in_features=2048, out_features=512, bias=True)
+ (norm2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+ (norm3): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+ (dropout2): Dropout(p=0.1, inplace=False)
+ (dropout3): Dropout(p=0.1, inplace=False)
+ )
+ (5): TransformerDecoderLayer(
+ (sa_qcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_qpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_kcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_kpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (sa_v_proj): Linear(in_features=512, out_features=512, bias=True)
+ (self_attn): MultiheadAttention(
+ (out_proj): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (norm1): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+ (dropout1): Dropout(p=0.1, inplace=False)
+ (ca_qcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_qpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_kcontent_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_kpos_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_v_proj): Linear(in_features=512, out_features=512, bias=True)
+ (ca_qpos_sine_proj): Linear(in_features=512, out_features=512, bias=True)
+ (cross_attn): MultiheadAttention(
+ (out_proj): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (linear1): Linear(in_features=512, out_features=2048, bias=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ (linear2): Linear(in_features=2048, out_features=512, bias=True)
+ (norm2): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+ (norm3): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+ (dropout2): Dropout(p=0.1, inplace=False)
+ (dropout3): Dropout(p=0.1, inplace=False)
+ )
+ )
+ (in_proj_map): Sequential(
+ (0): Linear(in_features=256, out_features=256, bias=True)
+ (1): ReLU()
+ (2): Linear(in_features=256, out_features=256, bias=True)
+ )
+ (map_decoder_layers): ModuleList(
+ (0): TransformerDecoderLayer(
+ (sa_qcontent_proj): Linear(in_features=256, out_features=256, bias=True)
+ (sa_qpos_proj): Linear(in_features=256, out_features=256, bias=True)
+ (sa_kcontent_proj): Linear(in_features=256, out_features=256, bias=True)
+ (sa_kpos_proj): Linear(in_features=256, out_features=256, bias=True)
+ (sa_v_proj): Linear(in_features=256, out_features=256, bias=True)
+ (self_attn): MultiheadAttention(
+ (out_proj): Linear(in_features=256, out_features=256, bias=True)
+ )
+ (norm1): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
+ (dropout1): Dropout(p=0.1, inplace=False)
+ (ca_qcontent_proj): Linear(in_features=256, out_features=256, bias=True)
+ (ca_qpos_proj): Linear(in_features=256, out_features=256, bias=True)
+ (ca_kcontent_proj): Linear(in_features=256, out_features=256, bias=True)
+ (ca_kpos_proj): Linear(in_features=256, out_features=256, bias=True)
+ (ca_v_proj): Linear(in_features=256, out_features=256, bias=True)
+ (ca_qpos_sine_proj): Linear(in_features=256, out_features=256, bias=True)
+ (cross_attn): MultiheadAttentionLocal(
+ (out_proj): Linear(in_features=256, out_features=256, bias=True)
+ )
+ (linear1): Linear(in_features=256, out_features=1024, bias=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ (linear2): Linear(in_features=1024, out_features=256, bias=True)
+ (norm2): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
+ (norm3): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
+ (dropout2): Dropout(p=0.1, inplace=False)
+ (dropout3): Dropout(p=0.1, inplace=False)
+ )
+ (1): TransformerDecoderLayer(
+ (sa_qcontent_proj): Linear(in_features=256, out_features=256, bias=True)
+ (sa_qpos_proj): Linear(in_features=256, out_features=256, bias=True)
+ (sa_kcontent_proj): Linear(in_features=256, out_features=256, bias=True)
+ (sa_kpos_proj): Linear(in_features=256, out_features=256, bias=True)
+ (sa_v_proj): Linear(in_features=256, out_features=256, bias=True)
+ (self_attn): MultiheadAttention(
+ (out_proj): Linear(in_features=256, out_features=256, bias=True)
+ )
+ (norm1): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
+ (dropout1): Dropout(p=0.1, inplace=False)
+ (ca_qcontent_proj): Linear(in_features=256, out_features=256, bias=True)
+ (ca_qpos_proj): Linear(in_features=256, out_features=256, bias=True)
+ (ca_kcontent_proj): Linear(in_features=256, out_features=256, bias=True)
+ (ca_kpos_proj): Linear(in_features=256, out_features=256, bias=True)
+ (ca_v_proj): Linear(in_features=256, out_features=256, bias=True)
+ (ca_qpos_sine_proj): Linear(in_features=256, out_features=256, bias=True)
+ (cross_attn): MultiheadAttentionLocal(
+ (out_proj): Linear(in_features=256, out_features=256, bias=True)
+ )
+ (linear1): Linear(in_features=256, out_features=1024, bias=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ (linear2): Linear(in_features=1024, out_features=256, bias=True)
+ (norm2): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
+ (norm3): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
+ (dropout2): Dropout(p=0.1, inplace=False)
+ (dropout3): Dropout(p=0.1, inplace=False)
+ )
+ (2): TransformerDecoderLayer(
+ (sa_qcontent_proj): Linear(in_features=256, out_features=256, bias=True)
+ (sa_qpos_proj): Linear(in_features=256, out_features=256, bias=True)
+ (sa_kcontent_proj): Linear(in_features=256, out_features=256, bias=True)
+ (sa_kpos_proj): Linear(in_features=256, out_features=256, bias=True)
+ (sa_v_proj): Linear(in_features=256, out_features=256, bias=True)
500
+ (self_attn): MultiheadAttention(
501
+ (out_proj): Linear(in_features=256, out_features=256, bias=True)
502
+ )
503
+ (norm1): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
504
+ (dropout1): Dropout(p=0.1, inplace=False)
505
+ (ca_qcontent_proj): Linear(in_features=256, out_features=256, bias=True)
506
+ (ca_qpos_proj): Linear(in_features=256, out_features=256, bias=True)
507
+ (ca_kcontent_proj): Linear(in_features=256, out_features=256, bias=True)
508
+ (ca_kpos_proj): Linear(in_features=256, out_features=256, bias=True)
509
+ (ca_v_proj): Linear(in_features=256, out_features=256, bias=True)
510
+ (ca_qpos_sine_proj): Linear(in_features=256, out_features=256, bias=True)
511
+ (cross_attn): MultiheadAttentionLocal(
512
+ (out_proj): Linear(in_features=256, out_features=256, bias=True)
513
+ )
514
+ (linear1): Linear(in_features=256, out_features=1024, bias=True)
515
+ (dropout): Dropout(p=0.1, inplace=False)
516
+ (linear2): Linear(in_features=1024, out_features=256, bias=True)
517
+ (norm2): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
518
+ (norm3): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
519
+ (dropout2): Dropout(p=0.1, inplace=False)
520
+ (dropout3): Dropout(p=0.1, inplace=False)
521
+ )
522
+ (3): TransformerDecoderLayer(
523
+ (sa_qcontent_proj): Linear(in_features=256, out_features=256, bias=True)
524
+ (sa_qpos_proj): Linear(in_features=256, out_features=256, bias=True)
525
+ (sa_kcontent_proj): Linear(in_features=256, out_features=256, bias=True)
526
+ (sa_kpos_proj): Linear(in_features=256, out_features=256, bias=True)
527
+ (sa_v_proj): Linear(in_features=256, out_features=256, bias=True)
528
+ (self_attn): MultiheadAttention(
529
+ (out_proj): Linear(in_features=256, out_features=256, bias=True)
530
+ )
531
+ (norm1): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
532
+ (dropout1): Dropout(p=0.1, inplace=False)
533
+ (ca_qcontent_proj): Linear(in_features=256, out_features=256, bias=True)
534
+ (ca_qpos_proj): Linear(in_features=256, out_features=256, bias=True)
535
+ (ca_kcontent_proj): Linear(in_features=256, out_features=256, bias=True)
536
+ (ca_kpos_proj): Linear(in_features=256, out_features=256, bias=True)
537
+ (ca_v_proj): Linear(in_features=256, out_features=256, bias=True)
538
+ (ca_qpos_sine_proj): Linear(in_features=256, out_features=256, bias=True)
539
+ (cross_attn): MultiheadAttentionLocal(
540
+ (out_proj): Linear(in_features=256, out_features=256, bias=True)
541
+ )
542
+ (linear1): Linear(in_features=256, out_features=1024, bias=True)
543
+ (dropout): Dropout(p=0.1, inplace=False)
544
+ (linear2): Linear(in_features=1024, out_features=256, bias=True)
545
+ (norm2): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
546
+ (norm3): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
547
+ (dropout2): Dropout(p=0.1, inplace=False)
548
+ (dropout3): Dropout(p=0.1, inplace=False)
549
+ )
550
+ (4): TransformerDecoderLayer(
551
+ (sa_qcontent_proj): Linear(in_features=256, out_features=256, bias=True)
552
+ (sa_qpos_proj): Linear(in_features=256, out_features=256, bias=True)
553
+ (sa_kcontent_proj): Linear(in_features=256, out_features=256, bias=True)
554
+ (sa_kpos_proj): Linear(in_features=256, out_features=256, bias=True)
555
+ (sa_v_proj): Linear(in_features=256, out_features=256, bias=True)
556
+ (self_attn): MultiheadAttention(
557
+ (out_proj): Linear(in_features=256, out_features=256, bias=True)
558
+ )
559
+ (norm1): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
560
+ (dropout1): Dropout(p=0.1, inplace=False)
561
+ (ca_qcontent_proj): Linear(in_features=256, out_features=256, bias=True)
562
+ (ca_qpos_proj): Linear(in_features=256, out_features=256, bias=True)
563
+ (ca_kcontent_proj): Linear(in_features=256, out_features=256, bias=True)
564
+ (ca_kpos_proj): Linear(in_features=256, out_features=256, bias=True)
565
+ (ca_v_proj): Linear(in_features=256, out_features=256, bias=True)
566
+ (ca_qpos_sine_proj): Linear(in_features=256, out_features=256, bias=True)
567
+ (cross_attn): MultiheadAttentionLocal(
568
+ (out_proj): Linear(in_features=256, out_features=256, bias=True)
569
+ )
570
+ (linear1): Linear(in_features=256, out_features=1024, bias=True)
571
+ (dropout): Dropout(p=0.1, inplace=False)
572
+ (linear2): Linear(in_features=1024, out_features=256, bias=True)
573
+ (norm2): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
574
+ (norm3): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
575
+ (dropout2): Dropout(p=0.1, inplace=False)
576
+ (dropout3): Dropout(p=0.1, inplace=False)
577
+ )
578
+ (5): TransformerDecoderLayer(
579
+ (sa_qcontent_proj): Linear(in_features=256, out_features=256, bias=True)
580
+ (sa_qpos_proj): Linear(in_features=256, out_features=256, bias=True)
581
+ (sa_kcontent_proj): Linear(in_features=256, out_features=256, bias=True)
582
+ (sa_kpos_proj): Linear(in_features=256, out_features=256, bias=True)
583
+ (sa_v_proj): Linear(in_features=256, out_features=256, bias=True)
584
+ (self_attn): MultiheadAttention(
585
+ (out_proj): Linear(in_features=256, out_features=256, bias=True)
586
+ )
587
+ (norm1): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
588
+ (dropout1): Dropout(p=0.1, inplace=False)
589
+ (ca_qcontent_proj): Linear(in_features=256, out_features=256, bias=True)
590
+ (ca_qpos_proj): Linear(in_features=256, out_features=256, bias=True)
591
+ (ca_kcontent_proj): Linear(in_features=256, out_features=256, bias=True)
592
+ (ca_kpos_proj): Linear(in_features=256, out_features=256, bias=True)
593
+ (ca_v_proj): Linear(in_features=256, out_features=256, bias=True)
594
+ (ca_qpos_sine_proj): Linear(in_features=256, out_features=256, bias=True)
595
+ (cross_attn): MultiheadAttentionLocal(
596
+ (out_proj): Linear(in_features=256, out_features=256, bias=True)
597
+ )
598
+ (linear1): Linear(in_features=256, out_features=1024, bias=True)
599
+ (dropout): Dropout(p=0.1, inplace=False)
600
+ (linear2): Linear(in_features=1024, out_features=256, bias=True)
601
+ (norm2): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
602
+ (norm3): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
603
+ (dropout2): Dropout(p=0.1, inplace=False)
604
+ (dropout3): Dropout(p=0.1, inplace=False)
605
+ )
606
+ )
607
+ (map_query_content_mlps): ModuleList(
+ (0): Linear(in_features=512, out_features=256, bias=True)
+ (1): Linear(in_features=512, out_features=256, bias=True)
+ (2): Linear(in_features=512, out_features=256, bias=True)
+ (3): Linear(in_features=512, out_features=256, bias=True)
+ (4): Linear(in_features=512, out_features=256, bias=True)
+ (5): Linear(in_features=512, out_features=256, bias=True)
+ )
+ (map_query_embed_mlps): Linear(in_features=512, out_features=256, bias=True)
+ (obj_pos_encoding_layer): Sequential(
+ (0): Linear(in_features=2, out_features=512, bias=True)
+ (1): ReLU()
+ (2): Linear(in_features=512, out_features=512, bias=True)
+ (3): ReLU()
+ (4): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (dense_future_head): Sequential(
+ (0): Linear(in_features=1024, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=512, bias=False)
+ (4): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (5): ReLU()
+ (6): Linear(in_features=512, out_features=560, bias=True)
+ )
+ (future_traj_mlps): Sequential(
+ (0): Linear(in_features=320, out_features=512, bias=True)
+ (1): ReLU()
+ (2): Linear(in_features=512, out_features=512, bias=True)
+ (3): ReLU()
+ (4): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (traj_fusion_mlps): Sequential(
+ (0): Linear(in_features=1024, out_features=512, bias=True)
+ (1): ReLU()
+ (2): Linear(in_features=512, out_features=512, bias=True)
+ (3): ReLU()
+ (4): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (intention_query_mlps): Sequential(
+ (0): Linear(in_features=512, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (context_proj_layer): Sequential(
+ (0): Linear(in_features=17, out_features=512, bias=True)
+ (1): ReLU()
+ (2): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (context_multi_head_attn): ModuleList(
+ (0): MultiheadAttention(
+ (out_proj): NonDynamicallyQuantizableLinear(in_features=512, out_features=512, bias=True)
+ )
+ (1): MultiheadAttention(
+ (out_proj): NonDynamicallyQuantizableLinear(in_features=512, out_features=512, bias=True)
+ )
+ (2): MultiheadAttention(
+ (out_proj): NonDynamicallyQuantizableLinear(in_features=512, out_features=512, bias=True)
+ )
+ (3): MultiheadAttention(
+ (out_proj): NonDynamicallyQuantizableLinear(in_features=512, out_features=512, bias=True)
+ )
+ )
+ (gate_proj_layers): ModuleList(
+ (0): Sequential(
+ (0): Linear(in_features=512, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=1, bias=True)
+ )
+ (1): Sequential(
+ (0): Linear(in_features=512, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=1, bias=True)
+ )
+ (2): Sequential(
+ (0): Linear(in_features=512, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=1, bias=True)
+ )
+ (3): Sequential(
+ (0): Linear(in_features=512, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=1, bias=True)
+ )
+ )
+ (query_feature_fusion_layers): ModuleList(
+ (0): Sequential(
+ (0): Linear(in_features=1280, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (1): Sequential(
+ (0): Linear(in_features=1280, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (2): Sequential(
+ (0): Linear(in_features=1280, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (3): Sequential(
+ (0): Linear(in_features=1280, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (4): Sequential(
+ (0): Linear(in_features=1280, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=512, bias=True)
+ )
+ (5): Sequential(
+ (0): Linear(in_features=1280, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=512, bias=True)
+ )
+ )
+ (motion_reg_heads): ModuleList(
+ (0): Sequential(
+ (0): Linear(in_features=512, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=512, bias=False)
+ (4): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (5): ReLU()
+ (6): Linear(in_features=512, out_features=560, bias=True)
+ )
+ (1): Sequential(
+ (0): Linear(in_features=512, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=512, bias=False)
+ (4): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (5): ReLU()
+ (6): Linear(in_features=512, out_features=560, bias=True)
+ )
+ (2): Sequential(
+ (0): Linear(in_features=512, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=512, bias=False)
+ (4): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (5): ReLU()
+ (6): Linear(in_features=512, out_features=560, bias=True)
+ )
+ (3): Sequential(
+ (0): Linear(in_features=512, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=512, bias=False)
+ (4): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (5): ReLU()
+ (6): Linear(in_features=512, out_features=560, bias=True)
+ )
+ (4): Sequential(
+ (0): Linear(in_features=512, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=512, bias=False)
+ (4): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (5): ReLU()
+ (6): Linear(in_features=512, out_features=560, bias=True)
+ )
+ (5): Sequential(
+ (0): Linear(in_features=512, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=512, bias=False)
+ (4): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (5): ReLU()
+ (6): Linear(in_features=512, out_features=560, bias=True)
+ )
+ )
+ (motion_cls_heads): ModuleList(
+ (0): Sequential(
+ (0): Linear(in_features=512, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=512, bias=False)
+ (4): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (5): ReLU()
+ (6): Linear(in_features=512, out_features=1, bias=True)
+ )
+ (1): Sequential(
+ (0): Linear(in_features=512, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=512, bias=False)
+ (4): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (5): ReLU()
+ (6): Linear(in_features=512, out_features=1, bias=True)
+ )
+ (2): Sequential(
+ (0): Linear(in_features=512, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=512, bias=False)
+ (4): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (5): ReLU()
+ (6): Linear(in_features=512, out_features=1, bias=True)
+ )
+ (3): Sequential(
+ (0): Linear(in_features=512, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=512, bias=False)
+ (4): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (5): ReLU()
+ (6): Linear(in_features=512, out_features=1, bias=True)
+ )
+ (4): Sequential(
+ (0): Linear(in_features=512, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=512, bias=False)
+ (4): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (5): ReLU()
+ (6): Linear(in_features=512, out_features=1, bias=True)
+ )
+ (5): Sequential(
+ (0): Linear(in_features=512, out_features=512, bias=False)
+ (1): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (2): ReLU()
+ (3): Linear(in_features=512, out_features=512, bias=False)
+ (4): SyncBatchNorm(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
+ (5): ReLU()
+ (6): Linear(in_features=512, out_features=1, bias=True)
+ )
+ )
+ )
+ )
+ )
+ 2024-05-18 10:32:00,849 INFO Total number of parameters: 71310426
+ 2024-05-18 10:32:00,850 INFO Start to load infos from /home/DISCOVER/yanzj/workspace/code/MTR/data/waymo/mtr_processed/processed_scenarios_val_infos.pkl
+ 2024-05-18 10:32:01,586 INFO Total scenes before filters: 44097
+ 2024-05-18 10:32:02,404 INFO Total scenes after filter_info_by_object_type: 44097
+ 2024-05-18 10:32:02,409 INFO Total scenes after filters: 44097
+ 2024-05-18 10:32:02,409 INFO Start to load context from /home/aidrive/zhengxj/projects_new/MTR_new/LLM_integrate/context_data/valid/context_data_encoder_100.pkl
+ 2024-05-18 10:32:09,467 INFO Total scenes in context info file: 44097
+ 2024-05-18 10:32:18,343 INFO **********************Start training waymo/mtr+100_percent_data_llm_augmented(llm_augmented_mtr+100_percent_attn_loop_learned_gate_inside+change_context_window_8_layer_4)**********************
+ 2024-05-18 10:32:32,868 INFO epoch: 21/30, acc_iter=131034, cur_iter=9275/9366, batch_size=13, iter_cost=8.56s, time_cost(epoch): 00:08/12:58, time_cost(all): 00:14/178:16:33, ade_TYPE_VEHICLE_layer_5=0.691, ade_TYPE_PEDESTRIAN_layer_5=0.178, ade_TYPE_CYCLIST_layer_5=0.385, loss=106.815, lr=0.0001
+ 2024-05-18 10:32:53,257 INFO epoch: 21/30, acc_iter=131050, cur_iter=9291/9366, batch_size=13, iter_cost=1.70s, time_cost(epoch): 00:28/02:07, time_cost(all): 00:34/35:28:21, ade_TYPE_VEHICLE_layer_5=0.765, ade_TYPE_PEDESTRIAN_layer_5=0.331, ade_TYPE_CYCLIST_layer_5=0.484, loss=93.268, lr=0.0001
+ 2024-05-18 10:33:52,925 INFO epoch: 21/30, acc_iter=131100, cur_iter=9341/9366, batch_size=13, iter_cost=1.32s, time_cost(epoch): 01:28/00:33, time_cost(all): 01:34/27:32:10, ade_TYPE_VEHICLE_layer_5=0.654, ade_TYPE_PEDESTRIAN_layer_5=0.322, ade_TYPE_CYCLIST_layer_5=0.868, loss=54.603, lr=0.0001
+ 2024-05-18 10:34:20,693 INFO epoch: 21/30, acc_iter=131124, cur_iter=9365/9366, batch_size=13, iter_cost=1.28s, time_cost(epoch): 01:56/00:01, time_cost(all): 02:02/26:37:06, ade_TYPE_VEHICLE_layer_5=0.588, ade_TYPE_PEDESTRIAN_layer_5=0.244, ade_TYPE_CYCLIST_layer_5=0.272, loss=42.187, lr=0.0001
+ 2024-05-18 10:34:36,717 INFO *************** EPOCH 22 EVALUATION *****************
+ 2024-05-18 10:34:40,366 INFO eval: epoch=22, batch_iter=0/849, batch_size=13, iter_cost=3.64s, time_cost: 00:03/51:31,
+ 2024-05-18 10:34:42,700 INFO eval: epoch=22, batch_iter=10/849, batch_size=13, iter_cost=0.60s, time_cost: 00:05/08:21,
+ 2024-05-18 10:34:45,003 INFO eval: epoch=22, batch_iter=20/849, batch_size=13, iter_cost=0.41s, time_cost: 00:08/05:43,
+ 2024-05-18 10:34:47,257 INFO eval: epoch=22, batch_iter=30/849, batch_size=13, iter_cost=0.35s, time_cost: 00:10/04:47,
+ 2024-05-18 10:34:49,694 INFO eval: epoch=22, batch_iter=40/849, batch_size=13, iter_cost=0.32s, time_cost: 00:12/04:22,
+ 2024-05-18 10:34:52,057 INFO eval: epoch=22, batch_iter=50/849, batch_size=13, iter_cost=0.31s, time_cost: 00:15/04:05,
+ 2024-05-18 10:34:54,463 INFO eval: epoch=22, batch_iter=60/849, batch_size=13, iter_cost=0.30s, time_cost: 00:17/03:53,
+ 2024-05-18 10:34:56,687 INFO eval: epoch=22, batch_iter=70/849, batch_size=13, iter_cost=0.29s, time_cost: 00:19/03:42,
+ 2024-05-18 10:34:59,094 INFO eval: epoch=22, batch_iter=80/849, batch_size=13, iter_cost=0.28s, time_cost: 00:22/03:35,
+ 2024-05-18 10:35:01,361 INFO eval: epoch=22, batch_iter=90/849, batch_size=13, iter_cost=0.27s, time_cost: 00:24/03:27,
+ 2024-05-18 10:35:03,720 INFO eval: epoch=22, batch_iter=100/849, batch_size=13, iter_cost=0.27s, time_cost: 00:26/03:22,
+ 2024-05-18 10:35:06,003 INFO eval: epoch=22, batch_iter=110/849, batch_size=13, iter_cost=0.27s, time_cost: 00:29/03:16,
+ 2024-05-18 10:35:08,339 INFO eval: epoch=22, batch_iter=120/849, batch_size=13, iter_cost=0.26s, time_cost: 00:31/03:12,
+ 2024-05-18 10:35:10,738 INFO eval: epoch=22, batch_iter=130/849, batch_size=13, iter_cost=0.26s, time_cost: 00:34/03:08,
+ 2024-05-18 10:35:13,110 INFO eval: epoch=22, batch_iter=140/849, batch_size=13, iter_cost=0.26s, time_cost: 00:36/03:04,
+ 2024-05-18 10:35:15,416 INFO eval: epoch=22, batch_iter=150/849, batch_size=13, iter_cost=0.26s, time_cost: 00:38/03:00,
+ 2024-05-18 10:35:17,712 INFO eval: epoch=22, batch_iter=160/849, batch_size=13, iter_cost=0.26s, time_cost: 00:40/02:56,
+ 2024-05-18 10:35:20,083 INFO eval: epoch=22, batch_iter=170/849, batch_size=13, iter_cost=0.26s, time_cost: 00:43/02:53,
+ 2024-05-18 10:35:22,418 INFO eval: epoch=22, batch_iter=180/849, batch_size=13, iter_cost=0.25s, time_cost: 00:45/02:49,
+ 2024-05-18 10:35:24,709 INFO eval: epoch=22, batch_iter=190/849, batch_size=13, iter_cost=0.25s, time_cost: 00:47/02:46,
+ 2024-05-18 10:35:27,086 INFO eval: epoch=22, batch_iter=200/849, batch_size=13, iter_cost=0.25s, time_cost: 00:50/02:43,
+ 2024-05-18 10:35:29,527 INFO eval: epoch=22, batch_iter=210/849, batch_size=13, iter_cost=0.25s, time_cost: 00:52/02:40,
+ 2024-05-18 10:35:31,772 INFO eval: epoch=22, batch_iter=220/849, batch_size=13, iter_cost=0.25s, time_cost: 00:55/02:37,
+ 2024-05-18 10:35:34,195 INFO eval: epoch=22, batch_iter=230/849, batch_size=13, iter_cost=0.25s, time_cost: 00:57/02:34,
+ 2024-05-18 10:35:36,580 INFO eval: epoch=22, batch_iter=240/849, batch_size=13, iter_cost=0.25s, time_cost: 00:59/02:31,
+ 2024-05-18 10:35:38,983 INFO eval: epoch=22, batch_iter=250/849, batch_size=13, iter_cost=0.25s, time_cost: 01:02/02:29,
+ 2024-05-18 10:35:41,266 INFO eval: epoch=22, batch_iter=260/849, batch_size=13, iter_cost=0.25s, time_cost: 01:04/02:26,
+ 2024-05-18 10:35:43,514 INFO eval: epoch=22, batch_iter=270/849, batch_size=13, iter_cost=0.25s, time_cost: 01:06/02:23,
+ 2024-05-18 10:35:45,727 INFO eval: epoch=22, batch_iter=280/849, batch_size=13, iter_cost=0.25s, time_cost: 01:09/02:20,
+ 2024-05-18 10:35:48,131 INFO eval: epoch=22, batch_iter=290/849, batch_size=13, iter_cost=0.25s, time_cost: 01:11/02:17,
+ 2024-05-18 10:35:50,573 INFO eval: epoch=22, batch_iter=300/849, batch_size=13, iter_cost=0.25s, time_cost: 01:13/02:15,
+ 2024-05-18 10:35:53,226 INFO eval: epoch=22, batch_iter=310/849, batch_size=13, iter_cost=0.25s, time_cost: 01:16/02:13,
+ 2024-05-18 10:35:55,819 INFO eval: epoch=22, batch_iter=320/849, batch_size=13, iter_cost=0.25s, time_cost: 01:19/02:10,
+ 2024-05-18 10:35:58,139 INFO eval: epoch=22, batch_iter=330/849, batch_size=13, iter_cost=0.25s, time_cost: 01:21/02:08,
+ 2024-05-18 10:36:00,685 INFO eval: epoch=22, batch_iter=340/849, batch_size=13, iter_cost=0.25s, time_cost: 01:23/02:05,
+ 2024-05-18 10:36:03,115 INFO eval: epoch=22, batch_iter=350/849, batch_size=13, iter_cost=0.25s, time_cost: 01:26/02:03,
+ 2024-05-18 10:36:05,615 INFO eval: epoch=22, batch_iter=360/849, batch_size=13, iter_cost=0.25s, time_cost: 01:28/02:00,
+ 2024-05-18 10:36:08,071 INFO eval: epoch=22, batch_iter=370/849, batch_size=13, iter_cost=0.25s, time_cost: 01:31/01:58,
+ 2024-05-18 10:36:10,403 INFO eval: epoch=22, batch_iter=380/849, batch_size=13, iter_cost=0.25s, time_cost: 01:33/01:55,
+ 2024-05-18 10:36:12,569 INFO eval: epoch=22, batch_iter=390/849, batch_size=13, iter_cost=0.25s, time_cost: 01:35/01:52,
+ 2024-05-18 10:36:14,866 INFO eval: epoch=22, batch_iter=400/849, batch_size=13, iter_cost=0.25s, time_cost: 01:38/01:50,
+ 2024-05-18 10:36:17,176 INFO eval: epoch=22, batch_iter=410/849, batch_size=13, iter_cost=0.25s, time_cost: 01:40/01:47,
+ 2024-05-18 10:36:19,662 INFO eval: epoch=22, batch_iter=420/849, batch_size=13, iter_cost=0.25s, time_cost: 01:42/01:45,
+ 2024-05-18 10:36:22,073 INFO eval: epoch=22, batch_iter=430/849, batch_size=13, iter_cost=0.24s, time_cost: 01:45/01:42,
+ 2024-05-18 10:36:24,591 INFO eval: epoch=22, batch_iter=440/849, batch_size=13, iter_cost=0.25s, time_cost: 01:47/01:40,
+ 2024-05-18 10:36:27,184 INFO eval: epoch=22, batch_iter=450/849, batch_size=13, iter_cost=0.25s, time_cost: 01:50/01:37,
+ 2024-05-18 10:36:29,425 INFO eval: epoch=22, batch_iter=460/849, batch_size=13, iter_cost=0.25s, time_cost: 01:52/01:35,
+ 2024-05-18 10:36:31,789 INFO eval: epoch=22, batch_iter=470/849, batch_size=13, iter_cost=0.24s, time_cost: 01:55/01:32,
+ 2024-05-18 10:36:33,958 INFO eval: epoch=22, batch_iter=480/849, batch_size=13, iter_cost=0.24s, time_cost: 01:57/01:30,
+ 2024-05-18 10:36:36,304 INFO eval: epoch=22, batch_iter=490/849, batch_size=13, iter_cost=0.24s, time_cost: 01:59/01:27,
+ 2024-05-18 10:36:38,745 INFO eval: epoch=22, batch_iter=500/849, batch_size=13, iter_cost=0.24s, time_cost: 02:02/01:25,
+ 2024-05-18 10:36:40,967 INFO eval: epoch=22, batch_iter=510/849, batch_size=13, iter_cost=0.24s, time_cost: 02:04/01:22,
+ 2024-05-18 10:36:43,271 INFO eval: epoch=22, batch_iter=520/849, batch_size=13, iter_cost=0.24s, time_cost: 02:06/01:20,
+ 2024-05-18 10:36:45,723 INFO eval: epoch=22, batch_iter=530/849, batch_size=13, iter_cost=0.24s, time_cost: 02:08/01:17,
+ 2024-05-18 10:36:48,064 INFO eval: epoch=22, batch_iter=540/849, batch_size=13, iter_cost=0.24s, time_cost: 02:11/01:15,
+ 2024-05-18 10:36:50,563 INFO eval: epoch=22, batch_iter=550/849, batch_size=13, iter_cost=0.24s, time_cost: 02:13/01:12,
+ 2024-05-18 10:36:52,891 INFO eval: epoch=22, batch_iter=560/849, batch_size=13, iter_cost=0.24s, time_cost: 02:16/01:10,
+ 2024-05-18 10:36:55,339 INFO eval: epoch=22, batch_iter=570/849, batch_size=13, iter_cost=0.24s, time_cost: 02:18/01:07,
+ 2024-05-18 10:36:57,667 INFO eval: epoch=22, batch_iter=580/849, batch_size=13, iter_cost=0.24s, time_cost: 02:20/01:05,
+ 2024-05-18 10:37:08,450 INFO eval: epoch=22, batch_iter=590/849, batch_size=13, iter_cost=0.26s, time_cost: 02:31/01:06,
+ 2024-05-18 10:37:10,837 INFO eval: epoch=22, batch_iter=600/849, batch_size=13, iter_cost=0.26s, time_cost: 02:34/01:03,
+ 2024-05-18 10:37:13,271 INFO eval: epoch=22, batch_iter=610/849, batch_size=13, iter_cost=0.26s, time_cost: 02:36/01:01,
+ 2024-05-18 10:37:15,605 INFO eval: epoch=22, batch_iter=620/849, batch_size=13, iter_cost=0.26s, time_cost: 02:38/00:58,
+ 2024-05-18 10:37:17,972 INFO eval: epoch=22, batch_iter=630/849, batch_size=13, iter_cost=0.26s, time_cost: 02:41/00:56,
+ 2024-05-18 10:37:20,274 INFO eval: epoch=22, batch_iter=640/849, batch_size=13, iter_cost=0.26s, time_cost: 02:43/00:53,
+ 2024-05-18 10:37:22,605 INFO eval: epoch=22, batch_iter=650/849, batch_size=13, iter_cost=0.26s, time_cost: 02:45/00:50,
+ 2024-05-18 10:37:28,257 INFO eval: epoch=22, batch_iter=660/849, batch_size=13, iter_cost=0.26s, time_cost: 02:51/00:49,
+ 2024-05-18 10:37:30,781 INFO eval: epoch=22, batch_iter=670/849, batch_size=13, iter_cost=0.26s, time_cost: 02:54/00:46,
+ 2024-05-18 10:37:33,194 INFO eval: epoch=22, batch_iter=680/849, batch_size=13, iter_cost=0.26s, time_cost: 02:56/00:43,
+ 2024-05-18 10:37:35,626 INFO eval: epoch=22, batch_iter=690/849, batch_size=13, iter_cost=0.26s, time_cost: 02:58/00:41,
+ 2024-05-18 10:37:40,710 INFO eval: epoch=22, batch_iter=700/849, batch_size=13, iter_cost=0.26s, time_cost: 03:03/00:39,
+ 2024-05-18 10:37:43,109 INFO eval: epoch=22, batch_iter=710/849, batch_size=13, iter_cost=0.26s, time_cost: 03:06/00:36,
+ 2024-05-18 10:37:48,904 INFO eval: epoch=22, batch_iter=720/849, batch_size=13, iter_cost=0.27s, time_cost: 03:12/00:34,
+ 2024-05-18 10:37:51,449 INFO eval: epoch=22, batch_iter=730/849, batch_size=13, iter_cost=0.27s, time_cost: 03:14/00:31,
+ 2024-05-18 10:37:53,735 INFO eval: epoch=22, batch_iter=740/849, batch_size=13, iter_cost=0.27s, time_cost: 03:17/00:29,
+ 2024-05-18 10:37:56,387 INFO eval: epoch=22, batch_iter=750/849, batch_size=13, iter_cost=0.27s, time_cost: 03:19/00:26,
+ 2024-05-18 10:38:03,692 INFO eval: epoch=22, batch_iter=760/849, batch_size=13, iter_cost=0.27s, time_cost: 03:26/00:24,
+ 2024-05-18 10:38:06,114 INFO eval: epoch=22, batch_iter=770/849, batch_size=13, iter_cost=0.27s, time_cost: 03:29/00:21,
+ 2024-05-18 10:38:08,584 INFO eval: epoch=22, batch_iter=780/849, batch_size=13, iter_cost=0.27s, time_cost: 03:31/00:18,
+ 2024-05-18 10:38:11,063 INFO eval: epoch=22, batch_iter=790/849, batch_size=13, iter_cost=0.27s, time_cost: 03:34/00:16,
+ 2024-05-18 10:38:13,663 INFO eval: epoch=22, batch_iter=800/849, batch_size=13, iter_cost=0.27s, time_cost: 03:36/00:13,
+ 2024-05-18 10:38:16,095 INFO eval: epoch=22, batch_iter=810/849, batch_size=13, iter_cost=0.27s, time_cost: 03:39/00:10,
+ 2024-05-18 10:38:18,534 INFO eval: epoch=22, batch_iter=820/849, batch_size=13, iter_cost=0.27s, time_cost: 03:41/00:07,
+ 2024-05-18 10:38:26,364 INFO eval: epoch=22, batch_iter=830/849, batch_size=13, iter_cost=0.28s, time_cost: 03:49/00:05,
+ 2024-05-18 10:38:28,723 INFO eval: epoch=22, batch_iter=840/849, batch_size=13, iter_cost=0.28s, time_cost: 03:51/00:02,
+ 2024-05-18 10:38:30,428 INFO eval: epoch=22, batch_iter=848/849, batch_size=1, iter_cost=0.28s, time_cost: 03:53/00:00,
+ 2024-05-18 10:38:47,235 INFO Total number of samples before merging from multiple GPUs: 11025
+ 2024-05-18 10:38:57,649 INFO Total number of samples after merging from multiple GPUs (removing duplicate): 44097
+ 2024-05-18 10:38:57,652 INFO *************** Performance of EPOCH 22 *****************
+ 2024-05-18 10:38:57,652 INFO Generate label finished(sec_per_example: 0.0059 second).
+ 2024-05-18 10:45:37,545 INFO
+ minADE - TYPE_VEHICLE_5 : 0.3506
+ minADE - TYPE_VEHICLE_9 : 0.7024
+ minADE - TYPE_VEHICLE_15 : 1.3107
+ minADE - TYPE_PEDESTRIAN_5 : 0.1675
+ minADE - TYPE_PEDESTRIAN_9 : 0.3248
+ minADE - TYPE_PEDESTRIAN_15 : 0.5732
+ minADE - TYPE_CYCLIST_5 : 0.3610
+ minADE - TYPE_CYCLIST_9 : 0.6543
+ minADE - TYPE_CYCLIST_15 : 1.1300
+ minFDE - TYPE_VEHICLE_5 : 0.6329
+ minFDE - TYPE_VEHICLE_9 : 1.3764
+ minFDE - TYPE_VEHICLE_15 : 2.7524
+ minFDE - TYPE_PEDESTRIAN_5 : 0.3164
+ minFDE - TYPE_PEDESTRIAN_9 : 0.6600
+ minFDE - TYPE_PEDESTRIAN_15 : 1.2651
+ minFDE - TYPE_CYCLIST_5 : 0.6524
+ minFDE - TYPE_CYCLIST_9 : 1.2579
+ minFDE - TYPE_CYCLIST_15 : 2.4358
+ MissRate - TYPE_VEHICLE_5 : 0.1215
+ MissRate - TYPE_VEHICLE_9 : 0.1591
+ MissRate - TYPE_VEHICLE_15 : 0.2118
+ MissRate - TYPE_PEDESTRIAN_5 : 0.0584
+ MissRate - TYPE_PEDESTRIAN_9 : 0.0730
977
+ MissRate - TYPE_PEDESTRIAN_15 : 0.0951
978
+ MissRate - TYPE_CYCLIST_5 : 0.1826
979
+ MissRate - TYPE_CYCLIST_9 : 0.1779
980
+ MissRate - TYPE_CYCLIST_15 : 0.2000
981
+ OverlapRate - TYPE_VEHICLE_5 : 0.0058
982
+ OverlapRate - TYPE_VEHICLE_9 : 0.0158
983
+ OverlapRate - TYPE_VEHICLE_15 : 0.0405
984
+ OverlapRate - TYPE_PEDESTRIAN_5 : 0.0553
985
+ OverlapRate - TYPE_PEDESTRIAN_9 : 0.0652
986
+ OverlapRate - TYPE_PEDESTRIAN_15 : 0.0782
987
+ OverlapRate - TYPE_CYCLIST_5 : 0.0173
988
+ OverlapRate - TYPE_CYCLIST_9 : 0.0339
989
+ OverlapRate - TYPE_CYCLIST_15 : 0.0562
990
+ mAP - TYPE_VEHICLE_5 : 0.5047
991
+ mAP - TYPE_VEHICLE_9 : 0.4322
992
+ mAP - TYPE_VEHICLE_15 : 0.3590
993
+ mAP - TYPE_PEDESTRIAN_5 : 0.5245
994
+ mAP - TYPE_PEDESTRIAN_9 : 0.4485
995
+ mAP - TYPE_PEDESTRIAN_15 : 0.4139
996
+ mAP - TYPE_CYCLIST_5 : 0.3606
997
+ mAP - TYPE_CYCLIST_9 : 0.3207
998
+ mAP - TYPE_CYCLIST_15 : 0.2677
999
+ -------------------------------------------------------------: 0.0000
1000
+ minADE - VEHICLE: 0.7879
1001
+ minADE - PEDESTRIAN: 0.3552
1002
+ minADE - CYCLIST: 0.7151
1003
+ minFDE - VEHICLE: 1.5872
1004
+ minFDE - PEDESTRIAN: 0.7471
1005
+ minFDE - CYCLIST: 1.4487
1006
+ MissRate - VEHICLE: 0.1641
1007
+ MissRate - PEDESTRIAN: 0.0755
1008
+ MissRate - CYCLIST: 0.1868
1009
+ OverlapRate - VEHICLE: 0.0207
1010
+ OverlapRate - PEDESTRIAN: 0.0663
1011
+ OverlapRate - CYCLIST: 0.0358
1012
+ mAP - VEHICLE: 0.4320
1013
+ mAP - PEDESTRIAN: 0.4623
1014
+ mAP - CYCLIST: 0.3163
1015
+ --------------------------------------------------------------: 0.0000
1016
+ minADE: 0.6194
1017
+ minFDE: 1.2610
1018
+ MissRate: 0.1422
1019
+ mAP: 0.4035
1020
+ ---------------------------------------------------------------: 0.0000
1021
+ TYPE_UNSET: 0.0000
1022
+ TYPE_VEHICLE: 165676.0000
1023
+ TYPE_PEDESTRIAN: 21068.0000
1024
+ TYPE_CYCLIST: 5428.0000
1025
+ TYPE_OTHER: 0.0000
1026
+ -----Note that this evaluation may have marginal differences with the official Waymo evaluation server-----: 0.0000
1027
+
1028
+ Waymo mAP minADE minFDE MissRate
1029
+ VEHICLE 0.4320, 0.7879, 1.5872, 0.1641,
1030
+ PEDESTRIAN 0.4623, 0.3552, 0.7471, 0.0755,
1031
+ CYCLIST 0.3163, 0.7151, 1.4487, 0.1868,
1032
+ Avg 0.4035, 0.6194, 1.2610, 0.1422,
1033
+
1034
+ 2024-05-18 10:45:37,550 INFO Result is save to /home/aidrive/zhengxj/projects_new/MTR_new/output/waymo/mtr+100_percent_data_llm_augmented/llm_augmented_mtr+100_percent_attn_loop_learned_gate_inside+change_context_window_8_layer_4/eval/eval_with_train
1035
+ 2024-05-18 10:45:37,550 INFO ****************Evaluation done.*****************
1036
+ 2024-05-18 10:45:48,643 INFO epoch: 22/30, acc_iter=131125, cur_iter=0/9366, batch_size=13, iter_cost=2.71s, time_cost(epoch): 00:02/7:03:27, time_cost(all): 13:30/56:27:43, ade_TYPE_VEHICLE_layer_5=0.609, ade_TYPE_PEDESTRIAN_layer_5=0.244, ade_TYPE_CYCLIST_layer_5=-0.000, loss=84.691, lr=0.0001
1037
+ 2024-05-18 10:46:17,916 INFO epoch: 22/30, acc_iter=131150, cur_iter=25/9366, batch_size=13, iter_cost=1.23s, time_cost(epoch): 00:31/3:11:31, time_cost(all): 13:59/25:35:47, ade_TYPE_VEHICLE_layer_5=0.673, ade_TYPE_PEDESTRIAN_layer_5=0.205, ade_TYPE_CYCLIST_layer_5=0.427, loss=127.063, lr=0.0001
1038
+ 2024-05-18 10:47:16,199 INFO epoch: 22/30, acc_iter=131200, cur_iter=75/9366, batch_size=13, iter_cost=1.19s, time_cost(epoch): 01:30/3:03:55, time_cost(all): 14:57/24:41:45, ade_TYPE_VEHICLE_layer_5=0.635, ade_TYPE_PEDESTRIAN_layer_5=0.233, ade_TYPE_CYCLIST_layer_5=0.486, loss=94.115, lr=0.0001
100_percent/LLM-Augmented-MTR/log_train_20240518-104841.txt ADDED
The diff for this file is too large to render. See raw diff
 
100_percent/MTR/best_eval_record.txt ADDED
@@ -0,0 +1,40 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ epoch_2 mAP 0.2698469476567374
2
+ best_epoch_2 mAP 0.2698469476567374
3
+ epoch_4 mAP 0.30147600008381736
4
+ best_epoch_4 mAP 0.30147600008381736
5
+ epoch_6 mAP 0.31303117838170796
6
+ best_epoch_6 mAP 0.31303117838170796
7
+ epoch_8 mAP 0.3235940817329619
8
+ best_epoch_8 mAP 0.3235940817329619
9
+ epoch_10 mAP 0.36777884099218583
10
+ best_epoch_10 mAP 0.36777884099218583
11
+ epoch_12 mAP 0.3643510788679123
12
+ best_epoch_10 mAP 0.36777884099218583
13
+ epoch_14 mAP 0.3607478638490041
14
+ best_epoch_10 mAP 0.36777884099218583
15
+ epoch_16 mAP 0.3673184762398402
16
+ best_epoch_10 mAP 0.36777884099218583
17
+ epoch_18 mAP 0.3678972903225157
18
+ best_epoch_18 mAP 0.3678972903225157
19
+ epoch_20 mAP 0.3698185649183061
20
+ best_epoch_20 mAP 0.3698185649183061
21
+ epoch_21 mAP 0.3756412830617693
22
+ best_epoch_21 mAP 0.3756412830617693
23
+ epoch_22 mAP 0.384421490960651
24
+ best_epoch_22 mAP 0.384421490960651
25
+ epoch_23 mAP 0.3798180388079749
26
+ best_epoch_22 mAP 0.384421490960651
27
+ epoch_24 mAP 0.39770102169778615
28
+ best_epoch_24 mAP 0.39770102169778615
29
+ epoch_25 mAP 0.38524179326163405
30
+ best_epoch_24 mAP 0.39770102169778615
31
+ epoch_26 mAP 0.40852043694920015
32
+ best_epoch_26 mAP 0.40852043694920015
33
+ epoch_27 mAP 0.4085294571187761
34
+ best_epoch_27 mAP 0.4085294571187761
35
+ epoch_28 mAP 0.41052143772443134
36
+ best_epoch_28 mAP 0.41052143772443134
37
+ epoch_29 mAP 0.4173127942615085
38
+ best_epoch_29 mAP 0.4173127942615085
39
+ epoch_30 mAP 0.4181843135091994
40
+ best_epoch_30 mAP 0.4181843135091994
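The `best_eval_record.txt` files above alternate a per-evaluation line (`epoch_N mAP <value>`) with a running-best bookkeeping line (`best_epoch_N mAP <value>`). A minimal parsing sketch (assuming exactly this whitespace-separated format; the helper name is ours, not part of the repo):

```python
def best_epoch(record_text):
    """Return (epoch, mAP) of the best-scoring epoch in a record."""
    best = (None, float("-inf"))
    for line in record_text.splitlines():
        parts = line.split()
        # Skip blanks and the running "best_epoch_*" bookkeeping lines.
        if not parts or not parts[0].startswith("epoch_"):
            continue
        epoch = int(parts[0].split("_")[1])
        m_ap = float(parts[2])
        if m_ap > best[1]:
            best = (epoch, m_ap)
    return best

# Last three evaluations from 100_percent/MTR/best_eval_record.txt:
sample = """epoch_28 mAP 0.41052143772443134
best_epoch_28 mAP 0.41052143772443134
epoch_29 mAP 0.4173127942615085
best_epoch_29 mAP 0.4173127942615085
epoch_30 mAP 0.4181843135091994
best_epoch_30 mAP 0.4181843135091994"""
print(best_epoch(sample))  # (30, 0.4181843135091994)
```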
100_percent/MTR/checkpoint_epoch_30.pth ADDED
@@ -0,0 +1,3 @@
 
 
 
 
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:f14858a1ce6b4098eef8230db39f014b337ef5e158b591e0febe722d7ae4667a
3
+ size 777075481
100_percent/MTR/log_train_20230318-135944.txt ADDED
The diff for this file is too large to render. See raw diff
 
100_percent/MTR/log_train_20230323-015050.txt ADDED
The diff for this file is too large to render. See raw diff
 
100_percent/MTR/log_train_20230324-224338.txt ADDED
The diff for this file is too large to render. See raw diff
 
20_percent/LLM-Augmented-MTR/best_eval_record.txt ADDED
@@ -0,0 +1,42 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ epoch_1 mAP 0.20280200242996216 minADE 1.121805641386244 minFDE 2.3629365298483105 MissRate0.32461252974139315
2
+ best_epoch_1 mAP 0.20280200242996216 minADE 1.121805641386244 minFDE 2.3629365298483105 MissRate0.32461252974139315
3
+ epoch_2 mAP 0.2108259747425715 minADE 1.0475726491875117 minFDE 2.1439842118157277 MissRate0.28972430196073323
4
+ best_epoch_2 mAP 0.2108259747425715 minADE 1.0475726491875117 minFDE 2.1439842118157277 MissRate0.28972430196073323
5
+ epoch_4 mAP 0.2567381891939375 minADE 0.8995593041181564 minFDE 1.8451896177397833 MissRate0.24359686755471763
6
+ best_epoch_4 mAP 0.2567381891939375 minADE 0.8995593041181564 minFDE 1.8451896177397833 MissRate0.24359686755471763
7
+ epoch_6 mAP 0.280468452307913 minADE 0.8129338771104813 minFDE 1.6794896456930373 MissRate0.21835642390780982
8
+ best_epoch_6 mAP 0.280468452307913 minADE 0.8129338771104813 minFDE 1.6794896456930373 MissRate0.21835642390780982
9
+ epoch_8 mAP 0.3066303845908907 minADE 0.8265427615907458 minFDE 1.6715423862139385 MissRate0.20962182266844642
10
+ best_epoch_8 mAP 0.3066303845908907 minADE 0.8265427615907458 minFDE 1.6715423862139385 MissRate0.20962182266844642
11
+ epoch_10 mAP 0.30340896215703755 minADE 0.7995703717072805 minFDE 1.6467467910713618 MissRate0.21301125652260247
12
+ best_epoch_8 mAP 0.3066303845908907 minADE 0.8265427615907458 minFDE 1.6715423862139385 MissRate0.20962182266844642
13
+ epoch_12 mAP 0.2987242821190092 minADE 0.7453566574388081 minFDE 1.527520441346698 MissRate0.19389896177583274
14
+ best_epoch_8 mAP 0.3066303845908907 minADE 0.8265427615907458 minFDE 1.6715423862139385 MissRate0.20962182266844642
15
+ epoch_14 mAP 0.3013038718038135 minADE 0.7638516972462336 minFDE 1.539596176809735 MissRate0.19799110210604134
16
+ best_epoch_8 mAP 0.3066303845908907 minADE 0.8265427615907458 minFDE 1.6715423862139385 MissRate0.20962182266844642
17
+ epoch_16 mAP 0.31712262829144794 minADE 0.7448720667097305 minFDE 1.4916071097056072 MissRate0.18731188111835054
18
+ best_epoch_16 mAP 0.31712262829144794 minADE 0.7448720667097305 minFDE 1.4916071097056072 MissRate0.18731188111835054
19
+ epoch_18 mAP 0.29922616150644094 minADE 0.7549243850840464 minFDE 1.5041462249226043 MissRate0.19184935175710252
20
+ best_epoch_16 mAP 0.31712262829144794 minADE 0.7448720667097305 minFDE 1.4916071097056072 MissRate0.18731188111835054
21
+ epoch_20 mAP 0.32285115122795105 minADE 0.7543421751923031 minFDE 1.4892633656660716 MissRate0.18655251794391212
22
+ best_epoch_20 mAP 0.32285115122795105 minADE 0.7543421751923031 minFDE 1.4892633656660716 MissRate0.18655251794391212
23
+ epoch_21 mAP 0.3209489054150052 minADE 0.721488227446874 minFDE 1.4614114463329315 MissRate0.18441139078802535
24
+ best_epoch_20 mAP 0.32285115122795105 minADE 0.7543421751923031 minFDE 1.4892633656660716 MissRate0.18655251794391212
25
+ epoch_22 mAP 0.32682062354352737 minADE 0.7124388366937637 minFDE 1.4540120561917622 MissRate0.18042861835824117
26
+ best_epoch_22 mAP 0.32682062354352737 minADE 0.7124388366937637 minFDE 1.4540120561917622 MissRate0.18042861835824117
27
+ epoch_23 mAP 0.340921809275945 minADE 0.7024568070967993 minFDE 1.4272008803155687 MissRate0.17437677664889228
28
+ best_epoch_23 mAP 0.340921809275945 minADE 0.7024568070967993 minFDE 1.4272008803155687 MissRate0.17437677664889228
29
+ epoch_24 mAP 0.3347405940294266 minADE 0.7021272778511047 minFDE 1.405790156788296 MissRate0.17520727548334333
30
+ best_epoch_23 mAP 0.340921809275945 minADE 0.7024568070967993 minFDE 1.4272008803155687 MissRate0.17437677664889228
31
+ epoch_25 mAP 0.34384024805492824 minADE 0.6872262193097008 minFDE 1.386270996597078 MissRate0.1706915605399344
32
+ best_epoch_25 mAP 0.34384024805492824 minADE 0.6872262193097008 minFDE 1.386270996597078 MissRate0.1706915605399344
33
+ epoch_26 mAP 0.35064262317286604 minADE 0.6828386121326022 minFDE 1.3688042196962567 MissRate0.17072225858767828
34
+ best_epoch_26 mAP 0.35064262317286604 minADE 0.6828386121326022 minFDE 1.3688042196962567 MissRate0.17072225858767828
35
+ epoch_27 mAP 0.34423224296834737 minADE 0.6811849905384911 minFDE 1.3747731546560924 MissRate0.16926835477352142
36
+ best_epoch_26 mAP 0.35064262317286604 minADE 0.6828386121326022 minFDE 1.3688042196962567 MissRate0.17072225858767828
37
+ epoch_28 mAP 0.3391927546925015 minADE 0.6855861726734372 minFDE 1.3797330624527404 MissRate0.17102102521393037
38
+ best_epoch_26 mAP 0.35064262317286604 minADE 0.6828386121326022 minFDE 1.3688042196962567 MissRate0.17072225858767828
39
+ epoch_29 mAP 0.35265530480278867 minADE 0.6794825990994772 minFDE 1.3695374263657465 MissRate0.1699780879749192
40
+ best_epoch_29 mAP 0.35265530480278867 minADE 0.6794825990994772 minFDE 1.3695374263657465 MissRate0.1699780879749192
41
+ epoch_30 mAP 0.3505900684330199 minADE 0.6841080155637528 minFDE 1.3792778882715437 MissRate0.17003927462630805
42
+ best_epoch_29 mAP 0.35265530480278867 minADE 0.6794825990994772 minFDE 1.3695374263657465 MissRate0.1699780879749192
20_percent/LLM-Augmented-MTR/checkpoint_epoch_29.pth ADDED
@@ -0,0 +1,3 @@
 
 
 
 
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:528da69183bab17fd3ac0d8afafdd38d5d970e652f1d0af1fd52decd32066ab8
3
+ size 890718577
20_percent/LLM-Augmented-MTR/log_train_20240227-140250.txt ADDED
The diff for this file is too large to render. See raw diff
 
20_percent/MTR/best_eval_record.txt ADDED
@@ -0,0 +1,42 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ epoch_1 mAP 0.1780766099691391 minADE 1.2116148041354284 minFDE 2.463306056128608 MissRate0.34856364462110734
2
+ best_epoch_1 mAP 0.1780766099691391 minADE 1.2116148041354284 minFDE 2.463306056128608 MissRate0.34856364462110734
3
+ epoch_2 mAP 0.20629685951603782 minADE 1.029665822784106 minFDE 2.1169575320349803 MissRate0.2860292113489575
4
+ best_epoch_2 mAP 0.20629685951603782 minADE 1.029665822784106 minFDE 2.1169575320349803 MissRate0.2860292113489575
5
+ epoch_4 mAP 0.237494428952535 minADE 0.8865823778841232 minFDE 1.837194714281294 MissRate0.24316217501958212
6
+ best_epoch_4 mAP 0.237494428952535 minADE 0.8865823778841232 minFDE 1.837194714281294 MissRate0.24316217501958212
7
+ epoch_6 mAP 0.24996380673514473 minADE 0.8017858978774811 minFDE 1.6643226312266457 MissRate0.2179658810297648
8
+ best_epoch_6 mAP 0.24996380673514473 minADE 0.8017858978774811 minFDE 1.6643226312266457 MissRate0.2179658810297648
9
+ epoch_8 mAP 0.29360215034749776 minADE 0.7573151406314639 minFDE 1.578547431363 MissRate0.19708167016506195
10
+ best_epoch_8 mAP 0.29360215034749776 minADE 0.7573151406314639 minFDE 1.578547431363 MissRate0.19708167016506195
11
+ epoch_10 mAP 0.30071333050727844 minADE 0.7649337003628413 minFDE 1.5673650403817494 MissRate0.20276714944177202
12
+ best_epoch_10 mAP 0.30071333050727844 minADE 0.7649337003628413 minFDE 1.5673650403817494 MissRate0.20276714944177202
13
+ epoch_12 mAP 0.292175743314955 minADE 0.7380434456798767 minFDE 1.5039872195985582 MissRate0.18972881303893196
14
+ best_epoch_10 mAP 0.30071333050727844 minADE 0.7649337003628413 minFDE 1.5673650403817494 MissRate0.20276714944177202
15
+ epoch_14 mAP 0.31314270695050556 minADE 0.7305420802699195 minFDE 1.4838222993744745 MissRate0.18249268995391
16
+ best_epoch_14 mAP 0.31314270695050556 minADE 0.7305420802699195 minFDE 1.4838222993744745 MissRate0.18249268995391
17
+ epoch_16 mAP 0.31479272080792325 minADE 0.7203008400069343 minFDE 1.4792389836576252 MissRate0.18043349352147842
18
+ best_epoch_16 mAP 0.31479272080792325 minADE 0.7203008400069343 minFDE 1.4792389836576252 MissRate0.18043349352147842
19
+ epoch_18 mAP 0.30901829567220473 minADE 0.7110151019361285 minFDE 1.4722885092099507 MissRate0.1831517426504029
20
+ best_epoch_16 mAP 0.31479272080792325 minADE 0.7203008400069343 minFDE 1.4792389836576252 MissRate0.18043349352147842
21
+ epoch_20 mAP 0.3403045965565576 minADE 0.6804299834701751 minFDE 1.4013747837808397 MissRate0.1693890881207254
22
+ best_epoch_20 mAP 0.3403045965565576 minADE 0.6804299834701751 minFDE 1.4013747837808397 MissRate0.1693890881207254
23
+ epoch_21 mAP 0.34428356422318357 minADE 0.6813108093208736 minFDE 1.3953127794795568 MissRate0.1662339808212386
24
+ best_epoch_21 mAP 0.34428356422318357 minADE 0.6813108093208736 minFDE 1.3953127794795568 MissRate0.1662339808212386
25
+ epoch_22 mAP 0.34294628765847945 minADE 0.6738415045870675 minFDE 1.3911531501346166 MissRate0.165701354543368
26
+ best_epoch_21 mAP 0.34428356422318357 minADE 0.6813108093208736 minFDE 1.3953127794795568 MissRate0.1662339808212386
27
+ epoch_23 mAP 0.34789742694960707 minADE 0.6705604626072778 minFDE 1.3779481848080952 MissRate0.16484576877620485
28
+ best_epoch_23 mAP 0.34789742694960707 minADE 0.6705604626072778 minFDE 1.3779481848080952 MissRate0.16484576877620485
29
+ epoch_24 mAP 0.33808205359511906 minADE 0.6704978479279412 minFDE 1.3809854719373915 MissRate0.16550052000416648
30
+ best_epoch_23 mAP 0.34789742694960707 minADE 0.6705604626072778 minFDE 1.3779481848080952 MissRate0.16484576877620485
31
+ epoch_25 mAP 0.34635144968827564 minADE 0.6702178435193168 minFDE 1.374932004345788 MissRate0.16507353136936823
32
+ best_epoch_23 mAP 0.34789742694960707 minADE 0.6705604626072778 minFDE 1.3779481848080952 MissRate0.16484576877620485
33
+ epoch_26 mAP 0.3466732932461632 minADE 0.6752092076672448 minFDE 1.3828196260664196 MissRate0.16483944985601637
34
+ best_epoch_23 mAP 0.34789742694960707 minADE 0.6705604626072778 minFDE 1.3779481848080952 MissRate0.16484576877620485
35
+ epoch_27 mAP 0.34161571330494356 minADE 0.6706651995579401 minFDE 1.3762814667489793 MissRate0.1658488561709722
36
+ best_epoch_23 mAP 0.34789742694960707 minADE 0.6705604626072778 minFDE 1.3779481848080952 MissRate0.16484576877620485
37
+ epoch_28 mAP 0.3456234435240428 minADE 0.672044810321596 minFDE 1.3780042330423992 MissRate0.16570446474684608
38
+ best_epoch_23 mAP 0.34789742694960707 minADE 0.6705604626072778 minFDE 1.3779481848080952 MissRate0.16484576877620485
39
+ epoch_29 mAP 0.34985576404465574 minADE 0.6696184790796704 minFDE 1.377215094036526 MissRate0.1654756905304061
40
+ best_epoch_29 mAP 0.34985576404465574 minADE 0.6696184790796704 minFDE 1.377215094036526 MissRate0.1654756905304061
41
+ epoch_30 mAP 0.34322212139765423 minADE 0.6689203000730939 minFDE 1.3752210405137806 MissRate0.16523590435584387
42
+ best_epoch_29 mAP 0.34985576404465574 minADE 0.6696184790796704 minFDE 1.377215094036526 MissRate0.1654756905304061
20_percent/MTR/checkpoint_epoch_29.pth ADDED
@@ -0,0 +1,3 @@
 
 
 
 
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:0eb5c77d846d4d2d9ed8f4cd832cf8419a5d4481020e444575e0e0d389f633c8
3
+ size 777075481
20_percent/MTR/log_train_20240315-005422.txt ADDED
The diff for this file is too large to render. See raw diff
 
20_percent/MTR/log_train_20240315-075642.txt ADDED
The diff for this file is too large to render. See raw diff
 
5_percent/LLM-Augmented-MTR/best_eval_record.txt ADDED
@@ -0,0 +1,40 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ epoch_2 mAP 0.14707461351321804
2
+ best_epoch_2 mAP 0.14707461351321804
3
+ epoch_4 mAP 0.19665334704849455
4
+ best_epoch_4 mAP 0.19665334704849455
5
+ epoch_6 mAP 0.24278521206643844
6
+ best_epoch_6 mAP 0.24278521206643844
7
+ epoch_8 mAP 0.2573127862479952
8
+ best_epoch_8 mAP 0.2573127862479952
9
+ epoch_10 mAP 0.2338909415735139
10
+ best_epoch_8 mAP 0.2573127862479952
11
+ epoch_12 mAP 0.2445411466889911
12
+ best_epoch_8 mAP 0.2573127862479952
13
+ epoch_14 mAP 0.2600547605090671
14
+ best_epoch_14 mAP 0.2600547605090671
15
+ epoch_16 mAP 0.2669246411985821
16
+ best_epoch_16 mAP 0.2669246411985821
17
+ epoch_18 mAP 0.25567510227362317
18
+ best_epoch_16 mAP 0.2669246411985821
19
+ epoch_20 mAP 0.27623524599605137
20
+ best_epoch_20 mAP 0.27623524599605137
21
+ epoch_21 mAP 0.291039678785536
22
+ best_epoch_21 mAP 0.291039678785536
23
+ epoch_22 mAP 0.26954489284091526
24
+ best_epoch_21 mAP 0.291039678785536
25
+ epoch_23 mAP 0.28672027587890625
26
+ best_epoch_21 mAP 0.291039678785536
27
+ epoch_24 mAP 0.2953291071785821
28
+ best_epoch_24 mAP 0.2953291071785821
29
+ epoch_25 mAP 0.2961776968505648
30
+ best_epoch_25 mAP 0.2961776968505648
31
+ epoch_26 mAP 0.2938236908780204
32
+ best_epoch_25 mAP 0.2961776968505648
33
+ epoch_27 mAP 0.29479679961999256
34
+ best_epoch_25 mAP 0.2961776968505648
35
+ epoch_28 mAP 0.30382166306177777
36
+ best_epoch_28 mAP 0.30382166306177777
37
+ epoch_29 mAP 0.2993926422463523
38
+ best_epoch_28 mAP 0.30382166306177777
39
+ epoch_30 mAP 0.2966313726372189
40
+ best_epoch_28 mAP 0.30382166306177777
5_percent/LLM-Augmented-MTR/checkpoint_epoch_28.pth ADDED
@@ -0,0 +1,3 @@
 
 
 
 
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:e693293b7e1054056229c0b119921fa673c48f50148fdea3fd00af72829dca0a
3
+ size 793251102
5_percent/LLM-Augmented-MTR/log_train_20240519-174533.txt ADDED
The diff for this file is too large to render. See raw diff
 
5_percent/LLM-Augmented-MTR/log_train_20240520-084733.txt ADDED
The diff for this file is too large to render. See raw diff
 
5_percent/MTR/best_eval_record.txt ADDED
@@ -0,0 +1,42 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ epoch_1 mAP 0.11289571225643158
2
+ best_epoch_1 mAP 0.11289571225643158
3
+ epoch_2 mAP 0.15150262746546006
4
+ best_epoch_2 mAP 0.15150262746546006
5
+ epoch_4 mAP 0.19338538083765242
6
+ best_epoch_4 mAP 0.19338538083765242
7
+ epoch_6 mAP 0.20877731011973488
8
+ best_epoch_6 mAP 0.20877731011973488
9
+ epoch_8 mAP 0.2264716509315703
10
+ best_epoch_8 mAP 0.2264716509315703
11
+ epoch_10 mAP 0.23379969514078566
12
+ best_epoch_10 mAP 0.23379969514078566
13
+ epoch_12 mAP 0.2377154164844089
14
+ best_epoch_12 mAP 0.2377154164844089
15
+ epoch_14 mAP 0.23920134041044447
16
+ best_epoch_14 mAP 0.23920134041044447
17
+ epoch_16 mAP 0.19992810570531425
18
+ best_epoch_14 mAP 0.23920134041044447
19
+ epoch_18 mAP 0.23819062610467276
20
+ best_epoch_14 mAP 0.23920134041044447
21
+ epoch_20 mAP 0.24969915880097282
22
+ best_epoch_20 mAP 0.24969915880097282
23
+ epoch_21 mAP 0.2677012417051527
24
+ best_epoch_21 mAP 0.2677012417051527
25
+ epoch_22 mAP 0.2480028718709946
26
+ best_epoch_21 mAP 0.2677012417051527
27
+ epoch_23 mAP 0.26686604486571414
28
+ best_epoch_21 mAP 0.2677012417051527
29
+ epoch_24 mAP 0.27477139068974393
30
+ best_epoch_24 mAP 0.27477139068974393
31
+ epoch_25 mAP 0.2842640694644716
32
+ best_epoch_25 mAP 0.2842640694644716
33
+ epoch_26 mAP 0.28255397578080493
34
+ best_epoch_25 mAP 0.2842640694644716
35
+ epoch_27 mAP 0.2821702079640494
36
+ best_epoch_25 mAP 0.2842640694644716
37
+ epoch_28 mAP 0.2909945597251256
38
+ best_epoch_28 mAP 0.2909945597251256
39
+ epoch_29 mAP 0.284342681368192
40
+ best_epoch_28 mAP 0.2909945597251256
41
+ epoch_30 mAP 0.28522086474630565
42
+ best_epoch_28 mAP 0.2909945597251256
5_percent/MTR/checkpoint_epoch_28.pth ADDED
@@ -0,0 +1,3 @@
 
 
 
 
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:d7a7b99f064fea4af5c28c5118346042c977b9f549a1e4464074f8079b936651
3
+ size 777282554
5_percent/MTR/log_train_20240429-093927.txt ADDED
The diff for this file is too large to render. See raw diff
 
README.md CHANGED
@@ -1,3 +1,63 @@
1
  ---
2
  license: mit
3
  ---
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
  ---
2
  license: mit
3
  ---
4
+ Here we provide checkpoints for the baseline model (MTR) and our LLM-Augmented-MTR models.
5
+
6
+ We provide three sets of models, grouped by the fraction of the training data used:
7
+
8
+ ## 5%
9
+ MTR's performance (epoch_28):
10
+ ```
11
+ Waymo mAP minADE minFDE MissRate
12
+ VEHICLE 0.3288, 0.9097, 1.8636, 0.2156,
13
+ PEDESTRIAN 0.3192, 0.4202, 0.8957, 0.1134,
14
+ CYCLIST 0.2250, 0.9298, 1.9409, 0.2736,
15
+ Avg 0.2910, 0.7532, 1.5668, 0.2008,
16
+ ```
17
+
18
+ LLM-Augmented-MTR's performance (epoch_28):
19
+ ```
20
+ Waymo mAP minADE minFDE MissRate
21
+ VEHICLE 0.3367, 0.9216, 1.8960, 0.2236,
22
+ PEDESTRIAN 0.3613, 0.4295, 0.9059, 0.1103,
23
+ CYCLIST 0.2135, 0.9166, 1.9246, 0.2693,
24
+ Avg 0.3038, 0.7559, 1.5755, 0.2011,
25
+ ```
26
+
27
+ ## 20%
28
+ MTR's performance (epoch_29):
29
+ ```
30
+ Waymo mAP minADE minFDE MissRate
31
+ VEHICLE 0.3912, 0.8239, 1.6778, 0.1800,
32
+ PEDESTRIAN 0.3608, 0.3829, 0.8091, 0.0935,
33
+ CYCLIST 0.2975, 0.8020, 1.6448, 0.2230,
34
+ Avg 0.3499, 0.6696, 1.3772, 0.1655,
35
+ ```
36
+
37
+ LLM-Augmented-MTR's performance (epoch_29):
38
+ ```
39
+ Waymo mAP minADE minFDE MissRate
40
+ VEHICLE 0.4028, 0.8090, 1.5943, 0.1711,
41
+ PEDESTRIAN 0.3621, 0.3880, 0.8107, 0.0958,
42
+ CYCLIST 0.2930, 0.8415, 1.7036, 0.2430,
43
+ Avg 0.3527, 0.6795, 1.3695, 0.1700,
44
+ ```
45
+
46
+ ## 100%
47
+ MTR's performance (epoch_30):
48
+ ```
49
+ Waymo mAP minADE minFDE MissRate
50
+ VEHICLE 0.4464, 0.7545, 1.5161, 0.1523,
51
+ PEDESTRIAN 0.4149, 0.3456, 0.7252, 0.0750,
52
+ CYCLIST 0.3933, 0.6849, 1.3869, 0.1795,
53
+ Avg 0.4182, 0.5950, 1.2094, 0.1356,
54
+ ```
55
+
56
+ LLM-Augmented-MTR's performance (epoch_26):
57
+ ```
58
+ Waymo mAP minADE minFDE MissRate
59
+ VEHICLE 0.4578, 0.7570, 1.5308, 0.1523,
60
+ PEDESTRIAN 0.4794, 0.3535, 0.7376, 0.0765,
61
+ CYCLIST 0.3434, 0.7062, 1.4260, 0.1827,
62
+ Avg 0.4269, 0.6056, 1.2315, 0.1371,
63
+ ```
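The tables above report minADE and minFDE, i.e. the average and final displacement errors of the best of K predicted trajectory modes. A hedged sketch of those two metrics (the official Waymo evaluation averages over types, horizons, and confidence thresholds, so this is illustrative only):

```python
import math

def min_ade_fde(pred_modes, gt):
    """pred_modes: list of K trajectories, each a list of (x, y) points;
    gt: ground-truth trajectory of the same length.
    Returns (minADE, minFDE) taken over the K modes."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    # ADE: mean pointwise displacement; FDE: displacement at the final point.
    ades = [sum(dist(p, g) for p, g in zip(mode, gt)) / len(gt)
            for mode in pred_modes]
    fdes = [dist(mode[-1], gt[-1]) for mode in pred_modes]
    return min(ades), min(fdes)

gt = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
modes = [
    [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0)],  # drifts away from the ground truth
    [(0.0, 0.0), (1.0, 0.0), (2.0, 0.5)],  # stays close to the ground truth
]
print(min_ade_fde(modes, gt))
```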