[ INFO : 2025-05-29 06:40:43,303 ] - exp_dir is: exp/MHFA_wav2vec2_large_960-FT-1stage5
[ INFO : 2025-05-29 06:40:43,303 ] - <== Passed Arguments ==>
[ INFO : 2025-05-29 06:40:43,304 ] - {'data_type': 'shard',
[ INFO : 2025-05-29 06:40:43,304 ] -  'dataloader_args': {'batch_size': 16,
[ INFO : 2025-05-29 06:40:43,304 ] -                      'drop_last': True,
[ INFO : 2025-05-29 06:40:43,304 ] -                      'num_workers': 6,
[ INFO : 2025-05-29 06:40:43,304 ] -                      'pin_memory': False,
[ INFO : 2025-05-29 06:40:43,304 ] -                      'prefetch_factor': 8},
[ INFO : 2025-05-29 06:40:43,304 ] -  'dataset_args': {'aug_prob': 0,
[ INFO : 2025-05-29 06:40:43,304 ] -                   'cmvn': True,
[ INFO : 2025-05-29 06:40:43,304 ] -                   'cmvn_args': {'norm_mean': True, 'norm_var': False},
[ INFO : 2025-05-29 06:40:43,304 ] -                   'filter': True,
[ INFO : 2025-05-29 06:40:43,304 ] -                   'filter_args': {'max_num_frames': 400, 'min_num_frames': 50},
[ INFO : 2025-05-29 06:40:43,304 ] -                   'frontend': 's3prl',
[ INFO : 2025-05-29 06:40:43,304 ] -                   'num_frms': 150,
[ INFO : 2025-05-29 06:40:43,304 ] -                   'resample_rate': 16000,
[ INFO : 2025-05-29 06:40:43,304 ] -                   's3prl_args': {'download_dir': './s3prl_hub',
[ INFO : 2025-05-29 06:40:43,304 ] -                                  'frame_length': 20,
[ INFO : 2025-05-29 06:40:43,304 ] -                                  'frame_shift': 20,
[ INFO : 2025-05-29 06:40:43,304 ] -                                  'frozen': False,
[ INFO : 2025-05-29 06:40:43,304 ] -                                  'layer': -1,
[ INFO : 2025-05-29 06:40:43,305 ] -                                  'layerwise_feature': True,
[ INFO : 2025-05-29 06:40:43,305 ] -                                  'multilayer_feature': True,
[ INFO : 2025-05-29 06:40:43,305 ] -                                  'upstream_args': {'name': 'wav2vec2_large_960'}},
[ INFO : 2025-05-29 06:40:43,305 ] -                   'sample_num_per_epoch': 0,
[ INFO : 2025-05-29 06:40:43,305 ] -                   'shuffle': True,
[ INFO : 2025-05-29 06:40:43,305 ] -                   'shuffle_args': {'shuffle_size': 2500},
[ INFO : 2025-05-29 06:40:43,305 ] -                   'spec_aug': False,
[ INFO : 2025-05-29 06:40:43,305 ] -                   'spec_aug_args': {'max_f': 8,
[ INFO : 2025-05-29 06:40:43,305 ] -                                     'max_t': 10,
[ INFO : 2025-05-29 06:40:43,305 ] -                                     'num_f_mask': 1,
[ INFO : 2025-05-29 06:40:43,305 ] -                                     'num_t_mask': 1,
[ INFO : 2025-05-29 06:40:43,305 ] -                                     'prob': 0.6},
[ INFO : 2025-05-29 06:40:43,305 ] -                   'speed_perturb': False},
[ INFO : 2025-05-29 06:40:43,305 ] -  'enable_amp': False,
[ INFO : 2025-05-29 06:40:43,305 ] -  'exp_dir': 'exp/MHFA_wav2vec2_large_960-FT-1stage5',
[ INFO : 2025-05-29 06:40:43,305 ] -  'gpus': [1],
[ INFO : 2025-05-29 06:40:43,305 ] -  'log_batch_interval': 100,
[ INFO : 2025-05-29 06:40:43,305 ] -  'loss': 'CrossEntropyLoss',
[ INFO : 2025-05-29 06:40:43,305 ] -  'loss_args': {},
[ INFO : 2025-05-29 06:40:43,305 ] -  'margin_scheduler': 'MarginScheduler',
[ INFO : 2025-05-29 06:40:43,305 ] -  'margin_update': {'final_margin': 0.0,
[ INFO : 2025-05-29 06:40:43,305 ] -                    'fix_start_epoch': 1,
[ INFO : 2025-05-29 06:40:43,305 ] -                    'increase_start_epoch': 1,
[ INFO : 2025-05-29 06:40:43,305 ] -                    'increase_type': 'exp',
[ INFO : 2025-05-29 06:40:43,305 ] -                    'initial_margin': 0.0,
[ INFO : 2025-05-29 06:40:43,305 ] -                    'update_margin': True},
[ INFO : 2025-05-29 06:40:43,305 ] -  'model': 'SSL_BACKEND_MHFA',
[ INFO : 2025-05-29 06:40:43,305 ] -  'model_args': {'compression_dim': 128,
[ INFO : 2025-05-29 06:40:43,305 ] -                 'embed_dim': 256,
[ INFO : 2025-05-29 06:40:43,305 ] -                 'feat_dim': 1024,
[ INFO : 2025-05-29 06:40:43,305 ] -                 'feature_grad_mult': 0.05,
[ INFO : 2025-05-29 06:40:43,305 ] -                 'head_nb': 32,
[ INFO : 2025-05-29 06:40:43,305 ] -                 'nb_layer': 25},
[ INFO : 2025-05-29 06:40:43,305 ] -  'model_init': None,
[ INFO : 2025-05-29 06:40:43,305 ] -  'num_avg': 2,
[ INFO : 2025-05-29 06:40:43,305 ] -  'num_epochs': 5,
[ INFO : 2025-05-29 06:40:43,305 ] -  'optimizer': 'AdamW',
[ INFO : 2025-05-29 06:40:43,305 ] -  'optimizer_args': {'weight_decay': 1e-08},
[ INFO : 2025-05-29 06:40:43,305 ] -  'projection_args': {'easy_margin': False,
[ INFO : 2025-05-29 06:40:43,305 ] -                      'project_type': 'softmax',
[ INFO : 2025-05-29 06:40:43,305 ] -                      'scale': 32.0},
[ INFO : 2025-05-29 06:40:43,305 ] -  'save_epoch_interval': 1,
[ INFO : 2025-05-29 06:40:43,305 ] -  'scheduler': 'ExponentialDecrease',
[ INFO : 2025-05-29 06:40:43,305 ] -  'scheduler_args': {'final_lr': 5e-07,
[ INFO : 2025-05-29 06:40:43,305 ] -                     'initial_lr': 1e-05,
[ INFO : 2025-05-29 06:40:43,306 ] -                     'warm_from_zero': True,
[ INFO : 2025-05-29 06:40:43,306 ] -                     'warm_up_epoch': 1},
[ INFO : 2025-05-29 06:40:43,306 ] -  'seed': 42,
[ INFO : 2025-05-29 06:40:43,306 ] -  'train_data': 'data/asv5/train/shard.list',
[ INFO : 2025-05-29 06:40:43,306 ] -  'train_label': 'data/asv5/train/utt2cls'}
[ INFO : 2025-05-29 06:40:43,884 ] - <== Data statistics ==>
[ INFO : 2025-05-29 06:40:43,884 ] - train data num: 182357, class num: 2
[ INFO : 2025-05-29 06:40:43,885 ] - <== Dataloaders ==>
[ INFO : 2025-05-29 06:40:43,885 ] - train dataloaders created
[ INFO : 2025-05-29 06:40:43,885 ] - epoch iteration number: 11397
[ INFO : 2025-05-29 06:40:43,885 ] - <== Model ==>
[ INFO : 2025-05-29 06:40:51,352 ] - speaker_model size: 318696274
[ INFO : 2025-05-29 06:40:51,353 ] - Train model from scratch ...
[ INFO : 2025-05-29 06:40:51,355 ] - SSL_BACKEND_MHFA(
[ INFO : 2025-05-29 06:40:51,355 ] -   (cmp_linear_k): Linear(in_features=1024, out_features=128, bias=True)
[ INFO : 2025-05-29 06:40:51,355 ] -   (cmp_linear_v): Linear(in_features=1024, out_features=128, bias=True)
[ INFO : 2025-05-29 06:40:51,355 ] -   (att_head): Linear(in_features=128, out_features=32, bias=True)
[ INFO : 2025-05-29 06:40:51,355 ] -   (pooling_fc): Linear(in_features=4096, out_features=256, bias=True)
[ INFO : 2025-05-29 06:40:51,355 ] -   (frontend): S3prlFrontend(
[ INFO : 2025-05-29 06:40:51,355 ] -     (upstream): S3PRLUpstream(
[ INFO : 2025-05-29 06:40:51,355 ] -       (upstream): UpstreamExpert(
[ INFO : 2025-05-29 06:40:51,355 ] -         (model): Wav2Vec2Model(
[ INFO : 2025-05-29 06:40:51,355 ] -           (feature_extractor): ConvFeatureExtractionModel(
[ INFO : 2025-05-29 06:40:51,355 ] -             (conv_layers): ModuleList(
[ INFO : 2025-05-29 06:40:51,355 ] -               (0): Sequential(
[ INFO : 2025-05-29 06:40:51,355 ] -                 (0): Conv1d(1, 512, kernel_size=(10,), stride=(5,), bias=False)
[ INFO : 2025-05-29 06:40:51,355 ] -                 (1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,355 ] -                 (2): Fp32GroupNorm(512, 512, eps=1e-05, affine=True)
[ INFO : 2025-05-29 06:40:51,355 ] -                 (3): GELU(approximate=none)
[ INFO : 2025-05-29 06:40:51,355 ] -               )
[ INFO : 2025-05-29 06:40:51,356 ] -               (1): Sequential(
[ INFO : 2025-05-29 06:40:51,356 ] -                 (0): Conv1d(512, 512, kernel_size=(3,), stride=(2,), bias=False)
[ INFO : 2025-05-29 06:40:51,356 ] -                 (1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,356 ] -                 (2): GELU(approximate=none)
[ INFO : 2025-05-29 06:40:51,356 ] -               )
[ INFO : 2025-05-29 06:40:51,356 ] -               (2): Sequential(
[ INFO : 2025-05-29 06:40:51,356 ] -                 (0): Conv1d(512, 512, kernel_size=(3,), stride=(2,), bias=False)
[ INFO : 2025-05-29 06:40:51,356 ] -                 (1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,356 ] -                 (2): GELU(approximate=none)
[ INFO : 2025-05-29 06:40:51,356 ] -               )
[ INFO : 2025-05-29 06:40:51,356 ] -               (3): Sequential(
[ INFO : 2025-05-29 06:40:51,356 ] -                 (0): Conv1d(512, 512, kernel_size=(3,), stride=(2,), bias=False)
[ INFO : 2025-05-29 06:40:51,356 ] -                 (1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,356 ] -                 (2): GELU(approximate=none)
[ INFO : 2025-05-29 06:40:51,356 ] -               )
[ INFO : 2025-05-29 06:40:51,356 ] -               (4): Sequential(
[ INFO : 2025-05-29 06:40:51,356 ] -                 (0): Conv1d(512, 512, kernel_size=(3,), stride=(2,), bias=False)
[ INFO : 2025-05-29 06:40:51,356 ] -                 (1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,356 ] -                 (2): GELU(approximate=none)
[ INFO : 2025-05-29 06:40:51,356 ] -               )
[ INFO : 2025-05-29 06:40:51,356 ] -               (5): Sequential(
[ INFO : 2025-05-29 06:40:51,356 ] -                 (0): Conv1d(512, 512, kernel_size=(2,), stride=(2,), bias=False)
[ INFO : 2025-05-29 06:40:51,356 ] -                 (1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,356 ] -                 (2): GELU(approximate=none)
[ INFO : 2025-05-29 06:40:51,356 ] -               )
[ INFO : 2025-05-29 06:40:51,356 ] -               (6): Sequential(
[ INFO : 2025-05-29 06:40:51,356 ] -                 (0): Conv1d(512, 512, kernel_size=(2,), stride=(2,), bias=False)
[ INFO : 2025-05-29 06:40:51,356 ] -                 (1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,356 ] -                 (2): GELU(approximate=none)
[ INFO : 2025-05-29 06:40:51,356 ] -               )
[ INFO : 2025-05-29 06:40:51,356 ] -             )
[ INFO : 2025-05-29 06:40:51,356 ] -           )
[ INFO : 2025-05-29 06:40:51,356 ] -           (post_extract_proj): Linear(in_features=512, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,356 ] -           (dropout_input): Dropout(p=0.1, inplace=False)
[ INFO : 2025-05-29 06:40:51,356 ] -           (dropout_features): Dropout(p=0.1, inplace=False)
[ INFO : 2025-05-29 06:40:51,356 ] -           (quantizer): GumbelVectorQuantizer(
[ INFO : 2025-05-29 06:40:51,356 ] -             (weight_proj): Linear(in_features=512, out_features=640, bias=True)
[ INFO : 2025-05-29 06:40:51,356 ] -           )
[ INFO : 2025-05-29 06:40:51,356 ] -           (project_q): Linear(in_features=768, out_features=768, bias=True)
[ INFO : 2025-05-29 06:40:51,356 ] -           (encoder): TransformerEncoder(
[ INFO : 2025-05-29 06:40:51,356 ] -             (pos_conv): Sequential(
[ INFO : 2025-05-29 06:40:51,356 ] -               (0): Conv1d(1024, 1024, kernel_size=(128,), stride=(1,), padding=(64,), groups=16)
[ INFO : 2025-05-29 06:40:51,357 ] -               (1): SamePad()
[ INFO : 2025-05-29 06:40:51,357 ] -               (2): GELU(approximate=none)
[ INFO : 2025-05-29 06:40:51,357 ] -             )
[ INFO : 2025-05-29 06:40:51,357 ] -             (layers): ModuleList(
[ INFO : 2025-05-29 06:40:51,357 ] -               (0): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,357 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,357 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,357 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,357 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,357 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,357 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,357 ] -                 )
[ INFO : 2025-05-29 06:40:51,357 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,357 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,357 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,357 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,357 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,357 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,357 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,357 ] -               )
[ INFO : 2025-05-29 06:40:51,357 ] -               (1): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,357 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,357 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,357 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,357 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,357 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,357 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,357 ] -                 )
[ INFO : 2025-05-29 06:40:51,357 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,357 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,357 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,357 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,357 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,357 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,357 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,357 ] -               )
[ INFO : 2025-05-29 06:40:51,357 ] -               (2): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,357 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,357 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,357 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,357 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,357 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,357 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,357 ] -                 )
[ INFO : 2025-05-29 06:40:51,357 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,358 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,358 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,358 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,358 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,358 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,358 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,358 ] -               )
[ INFO : 2025-05-29 06:40:51,358 ] -               (3): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,358 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,358 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,358 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,358 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,358 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,358 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,358 ] -                 )
[ INFO : 2025-05-29 06:40:51,358 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,358 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,358 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,358 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,358 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,358 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,358 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,358 ] -               )
[ INFO : 2025-05-29 06:40:51,358 ] -               (4): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,358 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,358 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,358 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,358 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,358 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,358 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,358 ] -                 )
[ INFO : 2025-05-29 06:40:51,358 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,358 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,358 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,358 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,358 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,358 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,358 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,358 ] -               )
[ INFO : 2025-05-29 06:40:51,358 ] -               (5): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,358 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,358 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,358 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,359 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,359 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,359 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,359 ] -                 )
[ INFO : 2025-05-29 06:40:51,359 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,359 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,359 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,359 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,359 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,359 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,359 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,359 ] -               )
[ INFO : 2025-05-29 06:40:51,359 ] -               (6): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,359 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,359 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,359 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,359 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,359 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,359 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,359 ] -                 )
[ INFO : 2025-05-29 06:40:51,359 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,359 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,359 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,359 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,359 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,359 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,359 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,359 ] -               )
[ INFO : 2025-05-29 06:40:51,359 ] -               (7): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,359 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,359 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,359 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,359 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,359 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,359 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,359 ] -                 )
[ INFO : 2025-05-29 06:40:51,359 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,359 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,359 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,359 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,359 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,359 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,359 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,359 ] -               )
[ INFO : 2025-05-29 06:40:51,359 ] -               (8): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,360 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,360 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,360 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,360 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,360 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,360 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,360 ] -                 )
[ INFO : 2025-05-29 06:40:51,360 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,360 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,360 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,360 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,360 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,360 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,360 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,360 ] -               )
[ INFO : 2025-05-29 06:40:51,360 ] -               (9): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,360 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,360 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,360 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,360 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,360 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,360 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,360 ] -                 )
[ INFO : 2025-05-29 06:40:51,360 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,360 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,360 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,360 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,360 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,360 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,360 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,360 ] -               )
[ INFO : 2025-05-29 06:40:51,360 ] -               (10): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,360 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,360 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,360 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,360 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,360 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,360 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,360 ] -                 )
[ INFO : 2025-05-29 06:40:51,360 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,360 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,360 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,360 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,361 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,361 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,361 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,361 ] -               )
[ INFO : 2025-05-29 06:40:51,361 ] -               (11): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,361 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,361 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,361 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,361 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,361 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,361 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,361 ] -                 )
[ INFO : 2025-05-29 06:40:51,361 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,361 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,361 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,361 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,361 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,361 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,361 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,361 ] -               )
[ INFO : 2025-05-29 06:40:51,361 ] -               (12): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,361 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,361 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,361 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,361 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,361 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,361 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,361 ] -                 )
[ INFO : 2025-05-29 06:40:51,361 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,361 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,361 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,361 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,361 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,361 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,361 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,361 ] -               )
[ INFO : 2025-05-29 06:40:51,361 ] -               (13): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,361 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,361 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,361 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,361 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,361 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,361 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,361 ] -                 )
[ INFO : 2025-05-29 06:40:51,362 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,362 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,362 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,362 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,362 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,362 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,362 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,362 ] -               )
[ INFO : 2025-05-29 06:40:51,362 ] -               (14): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,362 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,362 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,362 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,362 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,362 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,362 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,362 ] -                 )
[ INFO : 2025-05-29 06:40:51,362 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,362 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,362 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,362 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,362 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,362 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,362 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,362 ] -               )
[ INFO : 2025-05-29 06:40:51,362 ] -               (15): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,362 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,362 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,362 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,362 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,362 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,362 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,362 ] -                 )
[ INFO : 2025-05-29 06:40:51,362 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,362 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,362 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,362 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,362 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,362 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,362 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,362 ] -               )
[ INFO : 2025-05-29 06:40:51,362 ] -               (16): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,362 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,362 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,362 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,362 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,362 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,362 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,363 ] -                 )
[ INFO : 2025-05-29 06:40:51,363 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,363 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,363 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,363 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,363 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,363 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,363 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,363 ] -               )
[ INFO : 2025-05-29 06:40:51,363 ] -               (17): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,363 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,363 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,363 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,363 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,363 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,363 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,363 ] -                 )
[ INFO : 2025-05-29 06:40:51,363 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,363 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,363 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,363 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,363 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,363 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,363 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,363 ] -               )
[ INFO : 2025-05-29 06:40:51,363 ] -               (18): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,363 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,363 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,363 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,363 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,363 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,363 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,363 ] -                 )
[ INFO : 2025-05-29 06:40:51,363 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,363 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,363 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,363 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,363 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,363 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,363 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,363 ] -               )
[ INFO : 2025-05-29 06:40:51,363 ] -               (19): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,363 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,363 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,363 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,364 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,364 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,364 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,364 ] -                 )
[ INFO : 2025-05-29 06:40:51,364 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,364 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,364 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,364 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,364 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,364 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,364 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,364 ] -               )
[ INFO : 2025-05-29 06:40:51,364 ] -               (20): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,364 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,364 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,364 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,364 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,364 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,364 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,364 ] -                 )
[ INFO : 2025-05-29 06:40:51,364 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,364 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,364 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,364 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,364 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,364 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,364 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,364 ] -               )
[ INFO : 2025-05-29 06:40:51,364 ] -               (21): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,364 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,364 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,364 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,364 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,364 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,364 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,364 ] -                 )
[ INFO : 2025-05-29 06:40:51,364 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,364 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,364 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,364 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,364 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,364 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,364 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,364 ] -               )
[ INFO : 2025-05-29 06:40:51,365 ] -               (22): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,365 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,365 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,365 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,365 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,365 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,365 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,365 ] -                 )
[ INFO : 2025-05-29 06:40:51,365 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,365 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,365 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,365 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,365 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,365 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,365 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,365 ] -               )
[ INFO : 2025-05-29 06:40:51,365 ] -               (23): TransformerSentenceEncoderLayer(
[ INFO : 2025-05-29 06:40:51,365 ] -                 (self_attn): MultiheadAttention(
[ INFO : 2025-05-29 06:40:51,365 ] -                   (dropout_module): FairseqDropout()
[ INFO : 2025-05-29 06:40:51,365 ] -                   (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,365 ] -                   (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,365 ] -                   (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,365 ] -                   (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,365 ] -                 )
[ INFO : 2025-05-29 06:40:51,365 ] -                 (dropout1): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,365 ] -                 (dropout2): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,365 ] -                 (dropout3): Dropout(p=0.0, inplace=False)
[ INFO : 2025-05-29 06:40:51,365 ] -                 (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,365 ] -                 (fc1): Linear(in_features=1024, out_features=4096, bias=True)
[ INFO : 2025-05-29 06:40:51,365 ] -                 (fc2): Linear(in_features=4096, out_features=1024, bias=True)
[ INFO : 2025-05-29 06:40:51,365 ] -                 (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,365 ] -               )
[ INFO : 2025-05-29 06:40:51,365 ] -             )
[ INFO : 2025-05-29 06:40:51,365 ] -             (layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,365 ] -           )
[ INFO : 2025-05-29 06:40:51,365 ] -           (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
[ INFO : 2025-05-29 06:40:51,365 ] -           (final_proj): Linear(in_features=1024, out_features=768, bias=True)
[ INFO : 2025-05-29 06:40:51,365 ] -         )
[ INFO : 2025-05-29 06:40:51,365 ] -       )
[ INFO : 2025-05-29 06:40:51,365 ] -     )
[ INFO : 2025-05-29 06:40:51,365 ] -   )
[ INFO : 2025-05-29 06:40:51,365 ] -   (projection): Linear(
[ INFO : 2025-05-29 06:40:51,365 ] -     (trans): Sequential(
[ INFO : 2025-05-29 06:40:51,365 ] -       (0): BatchNorm1d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
[ INFO : 2025-05-29 06:40:51,365 ] -       (1): ReLU(inplace=True)
[ INFO : 2025-05-29 06:40:51,366 ] -       (2): Linear(in_features=256, out_features=2, bias=True)
[ INFO : 2025-05-29 06:40:51,366 ] -     )
[ INFO : 2025-05-29 06:40:51,366 ] -   )
[ INFO : 2025-05-29 06:40:51,366 ] - )
[ INFO : 2025-05-29 06:40:51,366 ] - start_epoch: 1
[ INFO : 2025-05-29 06:40:51,697 ] - <== Loss ==>
[ INFO : 2025-05-29 06:40:51,697 ] - loss criterion is: CrossEntropyLoss
[ INFO : 2025-05-29 06:40:51,698 ] - <== Optimizer ==>
[ INFO : 2025-05-29 06:40:51,698 ] - optimizer is: AdamW
[ INFO : 2025-05-29 06:40:51,698 ] - <== Scheduler ==>
[ INFO : 2025-05-29 06:40:51,698 ] - scheduler is: ExponentialDecrease
[ INFO : 2025-05-29 06:40:51,698 ] - <== MarginScheduler ==>
[ INFO : 2025-05-29 06:40:51,705 ] - <========== Training process ==========>
[ INFO : 2025-05-29 06:40:51,705 ] - +----------+----------+----------+----------+----------+----------+
[ INFO : 2025-05-29 06:40:51,705 ] - |     Epoch|     Batch|        Lr|    Margin|      Loss|       Acc|
[ INFO : 2025-05-29 06:40:51,705 ] - +----------+----------+----------+----------+----------+----------+
[ INFO : 2025-05-29 06:41:59,654 ] - |         1|       100|2.1604e-08|         0|   0.67596|    57.438|
[ INFO : 2025-05-29 06:42:53,297 ] - |         1|       200|4.3198e-08|         0|   0.67501|    58.406|
[ INFO : 2025-05-29 06:43:47,158 ] - |         1|       300|6.4565e-08|         0|   0.67552|    58.563|
[ INFO : 2025-05-29 06:44:41,033 ] - |         1|       400|8.5706e-08|         0|   0.66946|    59.406|
[ INFO : 2025-05-29 06:45:34,928 ] - |         1|       500|1.0662e-07|         0|   0.66453|    60.312|
[ INFO : 2025-05-29 06:46:28,859 ] - |         1|       600|1.2732e-07|         0|   0.65905|    61.156|
[ INFO : 2025-05-29 06:47:22,782 ] - |         1|       700| 1.478e-07|         0|   0.65253|    62.438|
[ INFO : 2025-05-29 06:48:16,692 ] - |         1|       800|1.6806e-07|         0|   0.64673|    63.461|
[ INFO : 2025-05-29 06:49:10,764 ] - |         1|       900| 1.881e-07|         0|   0.64003|    64.472|
[ INFO : 2025-05-29 06:50:05,202 ] - |         1|      1000|2.0792e-07|         0|   0.63428|     65.35|
[ INFO : 2025-05-29 06:50:59,484 ] - |         1|      1100|2.2754e-07|         0|   0.62866|    66.352|
[ INFO : 2025-05-29 06:51:53,762 ] - |         1|      1200|2.4694e-07|         0|   0.62292|    67.146|
[ INFO : 2025-05-29 06:52:48,034 ] - |         1|      1300|2.6613e-07|         0|   0.61563|    68.192|
[ INFO : 2025-05-29 06:53:42,310 ] - |         1|      1400|2.8512e-07|         0|   0.60869|    69.045|
[ INFO : 2025-05-29 06:54:36,573 ] - |         1|      1500| 3.039e-07|         0|   0.60278|    69.721|
[ INFO : 2025-05-29 06:55:30,842 ] - |         1|      1600|3.2247e-07|         0|   0.59599|    70.469|
[ INFO : 2025-05-29 06:56:25,109 ] - |         1|      1700|3.4084e-07|         0|   0.59048|    71.103|
[ INFO : 2025-05-29 06:57:19,365 ] - |         1|      1800|3.5901e-07|         0|   0.58325|    71.899|
[ INFO : 2025-05-29 06:58:13,822 ] - |         1|      1900|3.7698e-07|         0|   0.57617|    72.599|
[ INFO : 2025-05-29 06:59:07,686 ] - |         1|      2000|3.9475e-07|         0|   0.56882|    73.334|
[ INFO : 2025-05-29 07:00:01,709 ] - |         1|      2100|4.1232e-07|         0|   0.56136|    74.062|
[ INFO : 2025-05-29 07:00:55,724 ] - |         1|      2200| 4.297e-07|         0|   0.55338|    74.795|
[ INFO : 2025-05-29 07:01:49,594 ] - |         1|      2300|4.4689e-07|         0|   0.54342|     75.62|
[ INFO : 2025-05-29 07:02:43,613 ] - |         1|      2400|4.6388e-07|         0|   0.53396|    76.331|
[ INFO : 2025-05-29 07:03:37,884 ] - |         1|      2500|4.8069e-07|         0|   0.52301|      77.1|
[ INFO : 2025-05-29 07:04:32,146 ] - |         1|      2600| 4.973e-07|         0|   0.51167|     77.82|
[ INFO : 2025-05-29 07:05:26,393 ] - |         1|      2700|5.1373e-07|         0|   0.50056|    78.523|
[ INFO : 2025-05-29 07:06:20,890 ] - |         1|      2800|5.2997e-07|         0|   0.49103|      79.1|
[ INFO : 2025-05-29 07:07:14,752 ] - |         1|      2900|5.4602e-07|         0|   0.48239|    79.647|
[ INFO : 2025-05-29 07:08:08,619 ] - |         1|      3000| 5.619e-07|         0|   0.47338|    80.204|
[ INFO : 2025-05-29 07:09:02,455 ] - |         1|      3100|5.7759e-07|         0|   0.46446|    80.724|
[ INFO : 2025-05-29 07:09:56,314 ] - |         1|      3200| 5.931e-07|         0|   0.45544|    81.236|
[ INFO : 2025-05-29 07:10:50,412 ] - |         1|      3300|6.0843e-07|         0|   0.44635|    81.746|
[ INFO : 2025-05-29 07:11:44,283 ] - |         1|      3400|6.2359e-07|         0|   0.43928|     82.18|
[ INFO : 2025-05-29 07:12:38,118 ] - |         1|      3500|6.3857e-07|         0|   0.43211|    82.575|
[ INFO : 2025-05-29 07:13:31,958 ] - |         1|      3600|6.5337e-07|         0|   0.42477|    82.997|
[ INFO : 2025-05-29 07:14:25,800 ] - |         1|      3700|6.6801e-07|         0|   0.41764|    83.378|
[ INFO : 2025-05-29 07:15:20,229 ] - |         1|      3800|6.8247e-07|         0|   0.41068|    83.757|
[ INFO : 2025-05-29 07:16:14,405 ] - |         1|      3900|6.9676e-07|         0|   0.40481|    84.079|
[ INFO : 2025-05-29 07:17:08,411 ] - |         1|      4000|7.1088e-07|         0|   0.39813|    84.423|
[ INFO : 2025-05-29 07:18:02,642 ] - |         1|      4100|7.2484e-07|         0|   0.39211|    84.736|
[ INFO : 2025-05-29 07:18:56,890 ] - |         1|      4200|7.3863e-07|         0|   0.38742|    85.013|
[ INFO : 2025-05-29 07:19:51,098 ] - |         1|      4300|7.5226e-07|         0|   0.38198|    85.323|
[ INFO : 2025-05-29 07:20:45,338 ] - |         1|      4400|7.6572e-07|         0|   0.37611|    85.619|
[ INFO : 2025-05-29 07:21:39,594 ] - |         1|      4500|7.7902e-07|         0|   0.37128|    85.876|
[ INFO : 2025-05-29 07:22:33,918 ] - |         1|      4600|7.9216e-07|         0|   0.36664|    86.135|
[ INFO : 2025-05-29 07:23:28,545 ] - |         1|      4700|8.0514e-07|         0|   0.36233|    86.358|
[ INFO : 2025-05-29 07:24:22,835 ] - |         1|      4800|8.1796e-07|         0|   0.35797|    86.587|
[ INFO : 2025-05-29 07:25:16,837 ] - |         1|      4900|8.3063e-07|         0|   0.35393|    86.832|
[ INFO : 2025-05-29 07:26:11,007 ] - |         1|      5000|8.4314e-07|         0|   0.34963|    87.066|
[ INFO : 2025-05-29 07:27:04,962 ] - |         1|      5100| 8.555e-07|         0|   0.34572|    87.277|
[ INFO : 2025-05-29 07:27:58,855 ] - |         1|      5200| 8.677e-07|         0|   0.34163|    87.476|
[ INFO : 2025-05-29 07:28:52,732 ] - |         1|      5300|8.7975e-07|         0|   0.33751|    87.688|
[ INFO : 2025-05-29 07:29:46,600 ] - |         1|      5400|8.9166e-07|         0|    0.3336|    87.883|
[ INFO : 2025-05-29 07:30:40,477 ] - |         1|      5500|9.0341e-07|         0|   0.33002|    88.085|
[ INFO : 2025-05-29 07:31:34,612 ] - |         1|      5600|9.1502e-07|         0|   0.32644|    88.272|
[ INFO : 2025-05-29 07:32:28,469 ] - |         1|      5700|9.2647e-07|         0|   0.32254|    88.471|
[ INFO : 2025-05-29 07:33:22,458 ] - |         1|      5800|9.3779e-07|         0|   0.31942|    88.628|
[ INFO : 2025-05-29 07:34:16,338 ] - |         1|      5900|9.4896e-07|         0|    0.3163|    88.786|
[ INFO : 2025-05-29 07:35:10,218 ] - |         1|      6000|9.5999e-07|         0|   0.31294|    88.961|
[ INFO : 2025-05-29 07:36:04,087 ] - |         1|      6100|9.7087e-07|         0|   0.30974|    89.127|
[ INFO : 2025-05-29 07:36:57,980 ] - |         1|      6200|9.8161e-07|         0|   0.30706|    89.255|
[ INFO : 2025-05-29 07:37:51,859 ] - |         1|      6300|9.9222e-07|         0|   0.30377|    89.417|
[ INFO : 2025-05-29 07:38:45,896 ] - |         1|      6400|1.0027e-06|         0|   0.30106|    89.563|
[ INFO : 2025-05-29 07:39:40,087 ] - |         1|      6500| 1.013e-06|         0|   0.29813|    89.715|
[ INFO : 2025-05-29 07:40:34,398 ] - |         1|      6600|1.0232e-06|         0|   0.29564|    89.852|
[ INFO : 2025-05-29 07:41:28,618 ] - |         1|      6700|1.0333e-06|         0|   0.29278|    89.998|
[ INFO : 2025-05-29 07:42:22,892 ] - |         1|      6800|1.0432e-06|         0|   0.29064|    90.095|
[ INFO : 2025-05-29 07:43:17,174 ] - |         1|      6900| 1.053e-06|         0|   0.28795|    90.228|
[ INFO : 2025-05-29 07:44:11,479 ] - |         1|      7000|1.0627e-06|         0|   0.28537|    90.361|
[ INFO : 2025-05-29 07:45:05,487 ] - |         1|      7100|1.0722e-06|         0|   0.28272|    90.486|
[ INFO : 2025-05-29 07:45:59,371 ] - |         1|      7200|1.0816e-06|         0|   0.28027|    90.607|
[ INFO : 2025-05-29 07:46:53,273 ] - |         1|      7300|1.0909e-06|         0|   0.27802|    90.715|
[ INFO : 2025-05-29 07:47:47,165 ] - |         1|      7400|   1.1e-06|         0|   0.27569|    90.833|
[ INFO : 2025-05-29 07:48:41,409 ] - |         1|      7500| 1.109e-06|         0|   0.27312|    90.953|
[ INFO : 2025-05-29 07:49:35,285 ] - |         1|      7600|1.1179e-06|         0|   0.27081|    91.065|
[ INFO : 2025-05-29 07:50:29,197 ] - |         1|      7700|1.1267e-06|         0|   0.26866|    91.175|
[ INFO : 2025-05-29 07:51:23,088 ] - |         1|      7800|1.1353e-06|         0|    0.2667|    91.276|
[ INFO : 2025-05-29 07:52:17,073 ] - |         1|      7900|1.1439e-06|         0|   0.26462|    91.384|
[ INFO : 2025-05-29 07:53:10,972 ] - |         1|      8000|1.1523e-06|         0|   0.26254|    91.489|
[ INFO : 2025-05-29 07:54:04,874 ] - |         1|      8100|1.1606e-06|         0|    0.2605|    91.593|
[ INFO : 2025-05-29 07:54:58,776 ] - |         1|      8200|1.1687e-06|         0|   0.25839|    91.692|
[ INFO : 2025-05-29 07:55:53,022 ] - |         1|      8300|1.1768e-06|         0|   0.25672|    91.776|
[ INFO : 2025-05-29 07:56:47,172 ] - |         1|      8400|1.1847e-06|         0|   0.25479|    91.868|
[ INFO : 2025-05-29 07:57:41,296 ] - |         1|      8500|1.1925e-06|         0|   0.25298|    91.955|
[ INFO : 2025-05-29 07:58:35,496 ] - |         1|      8600|1.2003e-06|         0|   0.25117|    92.041|
[ INFO : 2025-05-29 07:59:29,690 ] - |         1|      8700|1.2078e-06|         0|   0.24946|    92.126|
[ INFO : 2025-05-29 08:00:23,579 ] - |         1|      8800|1.2153e-06|         0|   0.24762|    92.215|
[ INFO : 2025-05-29 08:01:17,502 ] - |         1|      8900|1.2227e-06|         0|   0.24582|    92.301|
[ INFO : 2025-05-29 08:02:11,609 ] - |         1|      9000|1.2299e-06|         0|   0.24406|    92.382|
[ INFO : 2025-05-29 08:03:05,904 ] - |         1|      9100|1.2371e-06|         0|   0.24219|    92.464|
[ INFO : 2025-05-29 08:04:00,217 ] - |         1|      9200|1.2441e-06|         0|   0.24052|    92.541|
[ INFO : 2025-05-29 08:04:54,295 ] - |         1|      9300|1.2511e-06|         0|   0.23869|     92.62|
[ INFO : 2025-05-29 08:05:48,569 ] - |         1|      9400|1.2579e-06|         0|   0.23687|    92.697|
[ INFO : 2025-05-29 08:06:42,514 ] - |         1|      9500|1.2646e-06|         0|   0.23538|    92.774|
[ INFO : 2025-05-29 08:07:36,448 ] - |         1|      9600|1.2712e-06|         0|   0.23374|    92.844|
[ INFO : 2025-05-29 08:08:30,359 ] - |         1|      9700|1.2777e-06|         0|    0.2321|    92.918|
[ INFO : 2025-05-29 08:09:24,275 ] - |         1|      9800|1.2841e-06|         0|   0.23042|    92.989|
[ INFO : 2025-05-29 08:10:18,183 ] - |         1|      9900|1.2904e-06|         0|   0.22893|    93.059|
[ INFO : 2025-05-29 08:11:12,107 ] - |         1|     10000|1.2966e-06|         0|   0.22737|    93.126|
[ INFO : 2025-05-29 08:12:06,057 ] - |         1|     10100|1.3027e-06|         0|   0.22583|    93.194|
[ INFO : 2025-05-29 08:12:59,977 ] - |         1|     10200|1.3087e-06|         0|   0.22431|     93.26|
[ INFO : 2025-05-29 08:13:54,159 ] - |         1|     10300|1.3146e-06|         0|   0.22284|    93.325|
[ INFO : 2025-05-29 08:14:48,064 ] - |         1|     10400|1.3204e-06|         0|   0.22141|    93.389|
[ INFO : 2025-05-29 08:15:41,989 ] - |         1|     10500|1.3262e-06|         0|   0.21993|    93.452|
[ INFO : 2025-05-29 08:16:35,926 ] - |         1|     10600|1.3318e-06|         0|   0.21851|    93.513|
[ INFO : 2025-05-29 08:17:29,814 ] - |         1|     10700|1.3373e-06|         0|   0.21729|    93.572|
[ INFO : 2025-05-29 08:18:23,951 ] - |         1|     10800|1.3427e-06|         0|    0.2159|    93.632|
[ INFO : 2025-05-29 08:19:17,862 ] - |         1|     10900| 1.348e-06|         0|   0.21461|    93.689|
[ INFO : 2025-05-29 08:20:11,780 ] - |         1|     11000|1.3533e-06|         0|    0.2132|    93.745|
[ INFO : 2025-05-29 08:21:05,692 ] - |         1|     11100|1.3584e-06|         0|    0.2119|    93.798|
[ INFO : 2025-05-29 08:21:59,588 ] - |         1|     11200|1.3635e-06|         0|   0.21051|    93.853|
[ INFO : 2025-05-29 08:22:53,828 ] - |         1|     11300|1.3684e-06|         0|   0.20925|    93.907|
[ INFO : 2025-05-29 08:23:46,546 ] - |         1|     11397|1.3732e-06|         0|   0.20802|    93.956|
[ INFO : 2025-05-29 08:24:55,007 ] - |         2|       100|1.3661e-06|         0|  0.066344|     99.75|
[ INFO : 2025-05-29 08:25:48,869 ] - |         2|       200|1.3589e-06|         0|  0.062643|    99.594|
[ INFO : 2025-05-29 08:26:42,757 ] - |         2|       300|1.3518e-06|         0|  0.060587|    99.729|
[ INFO : 2025-05-29 08:27:36,644 ] - |         2|       400|1.3447e-06|         0|  0.060496|    99.734|
[ INFO : 2025-05-29 08:28:30,539 ] - |         2|       500|1.3376e-06|         0|  0.058355|     99.75|
[ INFO : 2025-05-29 08:29:24,433 ] - |         2|       600|1.3306e-06|         0|  0.056239|    99.792|
[ INFO : 2025-05-29 08:30:18,305 ] - |         2|       700|1.3237e-06|         0|  0.055371|    99.812|
[ INFO : 2025-05-29 08:31:12,180 ] - |         2|       800|1.3167e-06|         0|  0.055045|    99.836|
[ INFO : 2025-05-29 08:32:06,217 ] - |         2|       900|1.3098e-06|         0|  0.055822|    99.854|
[ INFO : 2025-05-29 08:33:00,313 ] - |         2|      1000|1.3029e-06|         0|  0.056063|    99.869|
[ INFO : 2025-05-29 08:33:54,201 ] - |         2|      1100|1.2961e-06|         0|  0.055283|    99.875|
[ INFO : 2025-05-29 08:34:48,095 ] - |         2|      1200|1.2893e-06|         0|  0.054846|    99.875|
[ INFO : 2025-05-29 08:35:41,994 ] - |         2|      1300|1.2826e-06|         0|  0.053766|     99.88|
[ INFO : 2025-05-29 08:36:35,882 ] - |         2|      1400|1.2758e-06|         0|  0.053288|    99.884|
[ INFO : 2025-05-29 08:37:29,745 ] - |         2|      1500|1.2691e-06|         0|  0.053381|    99.892|
[ INFO : 2025-05-29 08:38:23,606 ] - |         2|      1600|1.2625e-06|         0|  0.052703|    99.898|
[ INFO : 2025-05-29 08:39:17,482 ] - |         2|      1700|1.2559e-06|         0|  0.052517|    99.904|
[ INFO : 2025-05-29 08:40:11,419 ] - |         2|      1800|1.2493e-06|         0|  0.051893|     99.91|
[ INFO : 2025-05-29 08:41:05,597 ] - |         2|      1900|1.2427e-06|         0|   0.05245|    99.901|
[ INFO : 2025-05-29 08:41:59,461 ] - |         2|      2000|1.2362e-06|         0|  0.051726|    99.906|
[ INFO : 2025-05-29 08:42:53,343 ] - |         2|      2100|1.2297e-06|         0|  0.051435|    99.911|
[ INFO : 2025-05-29 08:43:47,210 ] - |         2|      2200|1.2233e-06|         0|  0.051502|    99.906|
[ INFO : 2025-05-29 08:44:41,085 ] - |         2|      2300|1.2169e-06|         0|  0.051428|     99.91|
[ INFO : 2025-05-29 08:45:34,972 ] - |         2|      2400|1.2105e-06|         0|  0.051169|    99.911|
[ INFO : 2025-05-29 08:46:28,858 ] - |         2|      2500|1.2041e-06|         0|   0.05121|    99.907|
[ INFO : 2025-05-29 08:47:22,747 ] - |         2|      2600|1.1978e-06|         0|  0.050938|    99.911|
[ INFO : 2025-05-29 08:48:16,646 ] - |         2|      2700|1.1916e-06|         0|  0.050661|    99.914|
[ INFO : 2025-05-29 08:49:11,031 ] - |         2|      2800|1.1853e-06|         0|   0.05038|    99.915|
[ INFO : 2025-05-29 08:50:05,021 ] - |         2|      2900|1.1791e-06|         0|  0.050187|    99.914|
[ INFO : 2025-05-29 08:50:58,936 ] - |         2|      3000|1.1729e-06|         0|  0.049805|    99.917|
[ INFO : 2025-05-29 08:51:52,807 ] - |         2|      3100|1.1668e-06|         0|    0.0492|    99.919|
[ INFO : 2025-05-29 08:52:46,695 ] - |         2|      3200|1.1606e-06|         0|   0.04925|    99.922|
[ INFO : 2025-05-29 08:53:40,571 ] - |         2|      3300|1.1546e-06|         0|  0.049152|    99.919|
[ INFO : 2025-05-29 08:54:34,458 ] - |         2|      3400|1.1485e-06|         0|  0.049046|    99.921|
[ INFO : 2025-05-29 08:55:28,337 ] - |         2|      3500|1.1425e-06|         0|  0.048691|     99.92|
[ INFO : 2025-05-29 08:56:22,334 ] - |         2|      3600|1.1365e-06|         0|  0.048403|    99.922|
[ INFO : 2025-05-29 08:57:16,269 ] - |         2|      3700|1.1305e-06|         0|  0.048211|    99.924|
[ INFO : 2025-05-29 08:58:10,528 ] - |         2|      3800|1.1246e-06|         0|  0.047885|    99.926|
[ INFO : 2025-05-29 08:59:04,456 ] - |         2|      3900|1.1187e-06|         0|  0.047618|    99.928|
[ INFO : 2025-05-29 08:59:58,362 ] - |         2|      4000|1.1128e-06|         0|   0.04733|     99.93|
[ INFO : 2025-05-29 09:00:52,314 ] - |         2|      4100| 1.107e-06|         0|  0.046997|    99.931|
[ INFO : 2025-05-29 09:01:46,205 ] - |         2|      4200|1.1012e-06|         0|  0.046746|    99.933|
[ INFO : 2025-05-29 09:02:40,266 ] - |         2|      4300|1.0954e-06|         0|  0.046405|    99.935|
[ INFO : 2025-05-29 09:03:34,563 ] - |         2|      4400|1.0897e-06|         0|  0.046099|    99.935|
[ INFO : 2025-05-29 09:04:28,869 ] - |         2|      4500| 1.084e-06|         0|  0.045704|    99.935|
[ INFO : 2025-05-29 09:05:23,159 ] - |         2|      4600|1.0783e-06|         0|  0.045569|    99.935|
[ INFO : 2025-05-29 09:06:17,848 ] - |         2|      4700|1.0726e-06|         0|  0.045219|    99.935|
[ INFO : 2025-05-29 09:07:12,197 ] - |         2|      4800| 1.067e-06|         0|  0.044952|    99.936|
[ INFO : 2025-05-29 09:08:06,512 ] - |         2|      4900|1.0614e-06|         0|  0.044719|    99.938|
[ INFO : 2025-05-29 09:09:00,810 ] - |         2|      5000|1.0558e-06|         0|  0.044543|    99.939|
[ INFO : 2025-05-29 09:09:55,106 ] - |         2|      5100|1.0503e-06|         0|  0.044252|    99.939|
[ INFO : 2025-05-29 09:10:49,412 ] - |         2|      5200|1.0448e-06|         0|  0.043954|     99.94|
[ INFO : 2025-05-29 09:11:43,377 ] - |         2|      5300|1.0393e-06|         0|  0.043616|    99.941|
[ INFO : 2025-05-29 09:12:37,292 ] - |         2|      5400|1.0339e-06|         0|  0.043458|     99.94|
[ INFO : 2025-05-29 09:13:31,207 ] - |         2|      5500|1.0285e-06|         0|  0.043159|    99.941|
[ INFO : 2025-05-29 09:14:25,520 ] - |         2|      5600|1.0231e-06|         0|   0.04285|    99.942|
[ INFO : 2025-05-29 09:15:19,454 ] - |         2|      5700|1.0177e-06|         0|  0.042605|    99.943|
[ INFO : 2025-05-29 09:16:13,373 ] - |         2|      5800|1.0124e-06|         0|  0.042356|    99.944|
[ INFO : 2025-05-29 09:17:07,291 ] - |         2|      5900|1.0071e-06|         0|  0.042154|    99.945|
[ INFO : 2025-05-29 09:18:01,263 ] - |         2|      6000|1.0018e-06|         0|   0.04182|    99.946|
[ INFO : 2025-05-29 09:18:55,583 ] - |         2|      6100|9.9652e-07|         0|  0.041539|    99.947|
[ INFO : 2025-05-29 09:19:49,896 ] - |         2|      6200| 9.913e-07|         0|  0.041255|    99.948|
[ INFO : 2025-05-29 09:20:44,217 ] - |         2|      6300| 9.861e-07|         0|  0.040998|    99.948|
[ INFO : 2025-05-29 09:21:38,536 ] - |         2|      6400|9.8093e-07|         0|  0.040721|    99.949|
[ INFO : 2025-05-29 09:22:32,904 ] - |         2|      6500|9.7579e-07|         0|  0.040482|     99.95|
[ INFO : 2025-05-29 09:23:27,269 ] - |         2|      6600|9.7067e-07|         0|  0.040228|    99.951|
[ INFO : 2025-05-29 09:24:21,454 ] - |         2|      6700|9.6558e-07|         0|  0.040036|    99.951|
[ INFO : 2025-05-29 09:25:15,663 ] - |         2|      6800|9.6052e-07|         0|  0.039871|    99.951|
[ INFO : 2025-05-29 09:26:09,984 ] - |         2|      6900|9.5548e-07|         0|  0.039765|    99.949|
[ INFO : 2025-05-29 09:27:04,292 ] - |         2|      7000|9.5047e-07|         0|  0.039558|     99.95|
[ INFO : 2025-05-29 09:27:58,598 ] - |         2|      7100|9.4549e-07|         0|  0.039348|    99.951|
[ INFO : 2025-05-29 09:28:52,921 ] - |         2|      7200|9.4053e-07|         0|  0.039131|    99.951|
[ INFO : 2025-05-29 09:29:47,233 ] - |         2|      7300| 9.356e-07|         0|  0.038893|    99.952|
[ INFO : 2025-05-29 09:30:41,538 ] - |         2|      7400|9.3069e-07|         0|  0.038655|    99.953|
[ INFO : 2025-05-29 09:31:36,240 ] - |         2|      7500|9.2581e-07|         0|  0.038451|    99.953|
[ INFO : 2025-05-29 09:32:30,605 ] - |         2|      7600|9.2096e-07|         0|  0.038275|    99.954|
[ INFO : 2025-05-29 09:33:24,523 ] - |         2|      7700|9.1613e-07|         0|  0.038059|    99.955|
[ INFO : 2025-05-29 09:34:18,403 ] - |         2|      7800|9.1133e-07|         0|  0.037848|    99.955|
[ INFO : 2025-05-29 09:35:12,271 ] - |         2|      7900|9.0655e-07|         0|  0.037671|    99.954|
[ INFO : 2025-05-29 09:36:06,117 ] - |         2|      8000| 9.018e-07|         0|  0.037502|    99.955|
[ INFO : 2025-05-29 09:37:00,205 ] - |         2|      8100|8.9707e-07|         0|  0.037266|    99.954|
[ INFO : 2025-05-29 09:37:54,450 ] - |         2|      8200|8.9236e-07|         0|  0.037081|    99.955|
[ INFO : 2025-05-29 09:38:48,704 ] - |         2|      8300|8.8768e-07|         0|   0.03691|    99.956|
[ INFO : 2025-05-29 09:39:43,127 ] - |         2|      8400|8.8303e-07|         0|  0.036781|    99.956|
[ INFO : 2025-05-29 09:40:37,351 ] - |         2|      8500| 8.784e-07|         0|  0.036608|    99.957|
[ INFO : 2025-05-29 09:41:31,214 ] - |         2|      8600|8.7379e-07|         0|  0.036476|    99.957|
[ INFO : 2025-05-29 09:42:25,068 ] - |         2|      8700|8.6921e-07|         0|  0.036277|    99.958|
[ INFO : 2025-05-29 09:43:18,939 ] - |         2|      8800|8.6466e-07|         0|  0.036146|    99.958|
[ INFO : 2025-05-29 09:44:12,796 ] - |         2|      8900|8.6012e-07|         0|   0.03597|    99.959|
[ INFO : 2025-05-29 09:45:06,665 ] - |         2|      9000|8.5561e-07|         0|   0.03578|    99.959|
[ INFO : 2025-05-29 09:46:00,543 ] - |         2|      9100|8.5113e-07|         0|   0.03557|    99.959|
[ INFO : 2025-05-29 09:46:54,393 ] - |         2|      9200|8.4666e-07|         0|  0.035385|     99.96|
[ INFO : 2025-05-29 09:47:48,243 ] - |         2|      9300|8.4222e-07|         0|  0.035174|     99.96|
[ INFO : 2025-05-29 09:48:42,444 ] - |         2|      9400|8.3781e-07|         0|  0.034991|     99.96|
[ INFO : 2025-05-29 09:49:36,317 ] - |         2|      9500|8.3342e-07|         0|  0.034818|     99.96|
[ INFO : 2025-05-29 09:50:30,173 ] - |         2|      9600|8.2905e-07|         0|  0.034672|     99.96|
[ INFO : 2025-05-29 09:51:24,035 ] - |         2|      9700| 8.247e-07|         0|  0.034509|    99.961|
[ INFO : 2025-05-29 09:52:17,900 ] - |         2|      9800|8.2037e-07|         0|  0.034358|    99.961|
[ INFO : 2025-05-29 09:53:11,750 ] - |         2|      9900|8.1607e-07|         0|  0.034227|    99.961|
[ INFO : 2025-05-29 09:54:05,606 ] - |         2|     10000|8.1179e-07|         0|  0.034054|    99.962|
[ INFO : 2025-05-29 09:54:59,469 ] - |         2|     10100|8.0754e-07|         0|  0.033946|    99.962|
[ INFO : 2025-05-29 09:55:53,316 ] - |         2|     10200| 8.033e-07|         0|  0.033768|    99.963|
[ INFO : 2025-05-29 09:56:47,487 ] - |         2|     10300|7.9909e-07|         0|   0.03361|    99.963|
[ INFO : 2025-05-29 09:57:41,369 ] - |         2|     10400| 7.949e-07|         0|  0.033445|    99.963|
[ INFO : 2025-05-29 09:58:35,219 ] - |         2|     10500|7.9073e-07|         0|  0.033304|    99.963|
[ INFO : 2025-05-29 09:59:29,050 ] - |         2|     10600|7.8659e-07|         0|  0.033194|    99.963|
[ INFO : 2025-05-29 10:00:22,891 ] - |         2|     10700|7.8246e-07|         0|  0.033028|    99.964|
[ INFO : 2025-05-29 10:01:16,717 ] - |         2|     10800|7.7836e-07|         0|   0.03289|    99.964|
[ INFO : 2025-05-29 10:02:10,545 ] - |         2|     10900|7.7428e-07|         0|  0.032776|    99.963|
[ INFO : 2025-05-29 10:03:04,378 ] - |         2|     11000|7.7022e-07|         0|  0.032675|    99.963|
[ INFO : 2025-05-29 10:03:58,224 ] - |         2|     11100|7.6618e-07|         0|  0.032567|    99.963|
[ INFO : 2025-05-29 10:04:52,070 ] - |         2|     11200|7.6216e-07|         0|  0.032412|    99.963|
[ INFO : 2025-05-29 10:05:46,650 ] - |         2|     11300|7.5817e-07|         0|  0.032296|    99.963|
[ INFO : 2025-05-29 10:06:39,593 ] - |         2|     11397|7.5431e-07|         0|  0.032161|    99.964|
[ INFO : 2025-05-29 10:07:47,958 ] - |         3|       100|7.5036e-07|         0|  0.019246|       100|
[ INFO : 2025-05-29 10:08:41,776 ] - |         3|       200|7.4642e-07|         0|  0.017758|       100|
[ INFO : 2025-05-29 10:09:35,634 ] - |         3|       300|7.4251e-07|         0|  0.016996|       100|
[ INFO : 2025-05-29 10:10:29,517 ] - |         3|       400|7.3862e-07|         0|  0.016896|       100|
[ INFO : 2025-05-29 10:11:23,431 ] - |         3|       500|7.3474e-07|         0|  0.017228|    99.975|
[ INFO : 2025-05-29 10:12:17,322 ] - |         3|       600|7.3089e-07|         0|  0.017965|    99.958|
[ INFO : 2025-05-29 10:13:11,205 ] - |         3|       700|7.2706e-07|         0|  0.017722|    99.964|
[ INFO : 2025-05-29 10:14:05,116 ] - |         3|       800|7.2325e-07|         0|  0.017277|    99.969|
[ INFO : 2025-05-29 10:14:59,129 ] - |         3|       900|7.1945e-07|         0|  0.016971|    99.972|
[ INFO : 2025-05-29 10:15:53,289 ] - |         3|      1000|7.1568e-07|         0|  0.016686|    99.975|
[ INFO : 2025-05-29 10:16:47,202 ] - |         3|      1100|7.1193e-07|         0|  0.016306|    99.977|
[ INFO : 2025-05-29 10:17:41,093 ] - |         3|      1200| 7.082e-07|         0|  0.016142|    99.979|
[ INFO : 2025-05-29 10:18:34,998 ] - |         3|      1300|7.0448e-07|         0|  0.016357|    99.981|
[ INFO : 2025-05-29 10:19:28,901 ] - |         3|      1400|7.0079e-07|         0|  0.016228|    99.982|
[ INFO : 2025-05-29 10:20:22,801 ] - |         3|      1500|6.9711e-07|         0|  0.016087|    99.983|
[ INFO : 2025-05-29 10:21:16,693 ] - |         3|      1600|6.9346e-07|         0|  0.016105|     99.98|
[ INFO : 2025-05-29 10:22:10,569 ] - |         3|      1700|6.8982e-07|         0|  0.015892|    99.982|
[ INFO : 2025-05-29 10:23:04,477 ] - |         3|      1800|6.8621e-07|         0|  0.015856|    99.979|
[ INFO : 2025-05-29 10:23:58,711 ] - |         3|      1900|6.8261e-07|         0|  0.015893|    99.977|
[ INFO : 2025-05-29 10:24:52,622 ] - |         3|      2000|6.7903e-07|         0|  0.015718|    99.978|
[ INFO : 2025-05-29 10:25:46,531 ] - |         3|      2100|6.7547e-07|         0|  0.015757|    99.979|
[ INFO : 2025-05-29 10:26:40,468 ] - |         3|      2200|6.7193e-07|         0|  0.015657|     99.98|
[ INFO : 2025-05-29 10:27:34,382 ] - |         3|      2300| 6.684e-07|         0|  0.015754|    99.978|
[ INFO : 2025-05-29 10:28:28,285 ] - |         3|      2400| 6.649e-07|         0|  0.015691|    99.977|
[ INFO : 2025-05-29 10:29:22,216 ] - |         3|      2500|6.6141e-07|         0|  0.015557|    99.977|
[ INFO : 2025-05-29 10:30:16,128 ] - |         3|      2600|6.5795e-07|         0|  0.015425|    99.978|
[ INFO : 2025-05-29 10:31:10,066 ] - |         3|      2700| 6.545e-07|         0|   0.01542|    99.979|
[ INFO : 2025-05-29 10:32:04,344 ] - |         3|      2800|6.5106e-07|         0|  0.015373|     99.98|
[ INFO : 2025-05-29 10:32:58,288 ] - |         3|      2900|6.4765e-07|         0|  0.015297|    99.981|
[ INFO : 2025-05-29 10:33:52,171 ] - |         3|      3000|6.4425e-07|         0|   0.01526|    99.981|
[ INFO : 2025-05-29 10:34:46,004 ] - |         3|      3100|6.4088e-07|         0|  0.015218|    99.982|
[ INFO : 2025-05-29 10:35:39,895 ] - |         3|      3200|6.3752e-07|         0|  0.015195|    99.982|
[ INFO : 2025-05-29 10:36:33,821 ] - |         3|      3300|6.3417e-07|         0|   0.01524|    99.983|
[ INFO : 2025-05-29 10:37:27,961 ] - |         3|      3400|6.3085e-07|         0|  0.015318|    99.983|
[ INFO : 2025-05-29 10:38:21,988 ] - |         3|      3500|6.2754e-07|         0|  0.015249|    99.984|
[ INFO : 2025-05-29 10:39:15,892 ] - |         3|      3600|6.2425e-07|         0|  0.015095|    99.984|
[ INFO : 2025-05-29 10:40:09,787 ] - |         3|      3700|6.2098e-07|         0|  0.014993|    99.985|
[ INFO : 2025-05-29 10:41:04,000 ] - |         3|      3800|6.1772e-07|         0|  0.014895|    99.985|
[ INFO : 2025-05-29 10:41:57,893 ] - |         3|      3900|6.1448e-07|         0|  0.014883|    99.986|
[ INFO : 2025-05-29 10:42:51,786 ] - |         3|      4000|6.1126e-07|         0|  0.014873|    99.986|
[ INFO : 2025-05-29 10:43:45,692 ] - |         3|      4100|6.0806e-07|         0|   0.01485|    99.986|
[ INFO : 2025-05-29 10:44:39,571 ] - |         3|      4200|6.0487e-07|         0|  0.014807|    99.987|
[ INFO : 2025-05-29 10:45:33,433 ] - |         3|      4300| 6.017e-07|         0|  0.014773|    99.985|
[ INFO : 2025-05-29 10:46:27,591 ] - |         3|      4400|5.9854e-07|         0|  0.014782|    99.984|
[ INFO : 2025-05-29 10:47:21,854 ] - |         3|      4500| 5.954e-07|         0|  0.014784|    99.985|
[ INFO : 2025-05-29 10:48:15,862 ] - |         3|      4600|5.9228e-07|         0|  0.014769|    99.984|
[ INFO : 2025-05-29 10:49:10,084 ] - |         3|      4700|5.8918e-07|         0|  0.014748|    99.984|
[ INFO : 2025-05-29 10:50:03,988 ] - |         3|      4800|5.8609e-07|         0|  0.014685|    99.982|
[ INFO : 2025-05-29 10:50:57,882 ] - |         3|      4900|5.8301e-07|         0|  0.014616|    99.982|
[ INFO : 2025-05-29 10:51:51,783 ] - |         3|      5000|5.7996e-07|         0|  0.014559|    99.983|
[ INFO : 2025-05-29 10:52:45,724 ] - |         3|      5100|5.7692e-07|         0|  0.014473|    99.983|
[ INFO : 2025-05-29 10:53:39,613 ] - |         3|      5200|5.7389e-07|         0|  0.014414|    99.983|
[ INFO : 2025-05-29 10:54:33,494 ] - |         3|      5300|5.7088e-07|         0|  0.014354|    99.983|
[ INFO : 2025-05-29 10:55:27,379 ] - |         3|      5400|5.6789e-07|         0|  0.014283|    99.984|
[ INFO : 2025-05-29 10:56:21,265 ] - |         3|      5500|5.6491e-07|         0|  0.014215|    99.984|
[ INFO : 2025-05-29 10:57:15,468 ] - |         3|      5600|5.6195e-07|         0|  0.014202|    99.984|
[ INFO : 2025-05-29 10:58:09,410 ] - |         3|      5700|  5.59e-07|         0|  0.014209|    99.984|
[ INFO : 2025-05-29 10:59:03,329 ] - |         3|      5800|5.5607e-07|         0|  0.014183|    99.984|
[ INFO : 2025-05-29 10:59:57,321 ] - |         3|      5900|5.5316e-07|         0|   0.01413|    99.984|
[ INFO : 2025-05-29 11:00:51,630 ] - |         3|      6000|5.5026e-07|         0|  0.014056|    99.984|
[ INFO : 2025-05-29 11:01:45,933 ] - |         3|      6100|5.4737e-07|         0|  0.014031|    99.985|
[ INFO : 2025-05-29 11:02:40,224 ] - |         3|      6200| 5.445e-07|         0|  0.013943|    99.985|
[ INFO : 2025-05-29 11:03:34,366 ] - |         3|      6300|5.4165e-07|         0|  0.013936|    99.985|
[ INFO : 2025-05-29 11:04:28,264 ] - |         3|      6400|5.3881e-07|         0|  0.013887|    99.985|
[ INFO : 2025-05-29 11:05:22,166 ] - |         3|      6500|5.3598e-07|         0|  0.013839|    99.986|
[ INFO : 2025-05-29 11:06:16,435 ] - |         3|      6600|5.3317e-07|         0|  0.013782|    99.986|
[ INFO : 2025-05-29 11:07:10,353 ] - |         3|      6700|5.3037e-07|         0|  0.013735|    99.986|
[ INFO : 2025-05-29 11:08:04,274 ] - |         3|      6800|5.2759e-07|         0|  0.013678|    99.986|
[ INFO : 2025-05-29 11:08:58,176 ] - |         3|      6900|5.2483e-07|         0|  0.013655|    99.986|
[ INFO : 2025-05-29 11:09:52,079 ] - |         3|      7000|5.2208e-07|         0|  0.013615|    99.987|
[ INFO : 2025-05-29 11:10:46,007 ] - |         3|      7100|5.1934e-07|         0|  0.013583|    99.986|
[ INFO : 2025-05-29 11:11:39,927 ] - |         3|      7200|5.1661e-07|         0|  0.013577|    99.984|
[ INFO : 2025-05-29 11:12:33,889 ] - |         3|      7300|5.1391e-07|         0|  0.013546|    99.985|
[ INFO : 2025-05-29 11:13:28,004 ] - |         3|      7400|5.1121e-07|         0|  0.013506|    99.985|
[ INFO : 2025-05-29 11:14:22,707 ] - |         3|      7500|5.0853e-07|         0|   0.01347|    99.985|
[ INFO : 2025-05-29 11:15:17,065 ] - |         3|      7600|5.0586e-07|         0|  0.013409|    99.985|
[ INFO : 2025-05-29 11:16:11,375 ] - |         3|      7700|5.0321e-07|         0|  0.013362|    99.985|
[ INFO : 2025-05-29 11:17:05,692 ] - |         3|      7800|5.0057e-07|         0|    0.0133|    99.986|
[ INFO : 2025-05-29 11:18:00,010 ] - |         3|      7900|4.9795e-07|         0|  0.013246|    99.986|
[ INFO : 2025-05-29 11:18:54,360 ] - |         3|      8000|4.9534e-07|         0|    0.0132|    99.986|
[ INFO : 2025-05-29 11:19:48,673 ] - |         3|      8100|4.9274e-07|         0|   0.01318|    99.985|
[ INFO : 2025-05-29 11:20:42,990 ] - |         3|      8200|4.9016e-07|         0|  0.013161|    99.986|
[ INFO : 2025-05-29 11:21:37,313 ] - |         3|      8300|4.8759e-07|         0|  0.013116|    99.986|
[ INFO : 2025-05-29 11:22:31,781 ] - |         3|      8400|4.8503e-07|         0|  0.013075|    99.986|
[ INFO : 2025-05-29 11:23:26,398 ] - |         3|      8500|4.8249e-07|         0|  0.013035|    99.986|
[ INFO : 2025-05-29 11:24:20,746 ] - |         3|      8600|4.7996e-07|         0|  0.013014|    99.985|
[ INFO : 2025-05-29 11:25:15,077 ] - |         3|      8700|4.7744e-07|         0|  0.012991|    99.986|
[ INFO : 2025-05-29 11:26:09,422 ] - |         3|      8800|4.7494e-07|         0|  0.012955|    99.986|
[ INFO : 2025-05-29 11:27:03,728 ] - |         3|      8900|4.7245e-07|         0|  0.012933|    99.986|
[ INFO : 2025-05-29 11:27:58,072 ] - |         3|      9000|4.6997e-07|         0|  0.012898|    99.986|
[ INFO : 2025-05-29 11:28:52,400 ] - |         3|      9100|4.6751e-07|         0|  0.012866|    99.986|
[ INFO : 2025-05-29 11:29:46,715 ] - |         3|      9200|4.6506e-07|         0|  0.012844|    99.986|
[ INFO : 2025-05-29 11:30:41,033 ] - |         3|      9300|4.6262e-07|         0|   0.01281|    99.987|
[ INFO : 2025-05-29 11:31:35,435 ] - |         3|      9400|4.6019e-07|         0|  0.012789|    99.987|
[ INFO : 2025-05-29 11:32:29,350 ] - |         3|      9500|4.5778e-07|         0|  0.012757|    99.987|
[ INFO : 2025-05-29 11:33:23,248 ] - |         3|      9600|4.5538e-07|         0|  0.012716|    99.987|
[ INFO : 2025-05-29 11:34:17,160 ] - |         3|      9700|4.5299e-07|         0|  0.012698|    99.987|
[ INFO : 2025-05-29 11:35:11,085 ] - |         3|      9800|4.5062e-07|         0|  0.012672|    99.987|
[ INFO : 2025-05-29 11:36:05,259 ] - |         3|      9900|4.4825e-07|         0|  0.012636|    99.987|
[ INFO : 2025-05-29 11:36:59,579 ] - |         3|     10000| 4.459e-07|         0|  0.012593|    99.987|
[ INFO : 2025-05-29 11:37:53,920 ] - |         3|     10100|4.4356e-07|         0|  0.012546|    99.988|
[ INFO : 2025-05-29 11:38:48,237 ] - |         3|     10200|4.4124e-07|         0|  0.012512|    99.988|
[ INFO : 2025-05-29 11:39:42,607 ] - |         3|     10300|4.3893e-07|         0|  0.012478|    99.988|
[ INFO : 2025-05-29 11:40:36,595 ] - |         3|     10400|4.3662e-07|         0|  0.012496|    99.987|
[ INFO : 2025-05-29 11:41:30,521 ] - |         3|     10500|4.3433e-07|         0|  0.012504|    99.986|
[ INFO : 2025-05-29 11:42:24,449 ] - |         3|     10600|4.3206e-07|         0|  0.012461|    99.986|
[ INFO : 2025-05-29 11:43:18,415 ] - |         3|     10700|4.2979e-07|         0|   0.01241|    99.987|
[ INFO : 2025-05-29 11:44:12,375 ] - |         3|     10800|4.2754e-07|         0|  0.012369|    99.987|
[ INFO : 2025-05-29 11:45:06,359 ] - |         3|     10900| 4.253e-07|         0|  0.012328|    99.987|
[ INFO : 2025-05-29 11:46:00,421 ] - |         3|     11000|4.2307e-07|         0|  0.012317|    99.986|
[ INFO : 2025-05-29 11:46:54,740 ] - |         3|     11100|4.2085e-07|         0|   0.01229|    99.986|
[ INFO : 2025-05-29 11:47:49,047 ] - |         3|     11200|4.1864e-07|         0|  0.012249|    99.987|
[ INFO : 2025-05-29 11:48:43,779 ] - |         3|     11300|4.1645e-07|         0|  0.012207|    99.987|
[ INFO : 2025-05-29 11:49:36,728 ] - |         3|     11397|4.1433e-07|         0|  0.012181|    99.987|
[ INFO : 2025-05-29 11:50:45,531 ] - |         4|       100|4.1216e-07|         0|  0.011305|       100|
[ INFO : 2025-05-29 11:51:39,787 ] - |         4|       200|   4.1e-07|         0|  0.011262|    99.969|
[ INFO : 2025-05-29 11:52:34,090 ] - |         4|       300|4.0785e-07|         0|  0.010273|    99.979|
[ INFO : 2025-05-29 11:53:28,427 ] - |         4|       400|4.0571e-07|         0|  0.010109|    99.984|
[ INFO : 2025-05-29 11:54:22,784 ] - |         4|       500|4.0358e-07|         0| 0.0095476|    99.987|
[ INFO : 2025-05-29 11:55:17,120 ] - |         4|       600|4.0146e-07|         0| 0.0095149|     99.99|
[ INFO : 2025-05-29 11:56:11,417 ] - |         4|       700|3.9936e-07|         0| 0.0095323|    99.991|
[ INFO : 2025-05-29 11:57:05,679 ] - |         4|       800|3.9726e-07|         0| 0.0093086|    99.992|
[ INFO : 2025-05-29 11:58:00,086 ] - |         4|       900|3.9518e-07|         0| 0.0092652|    99.993|
[ INFO : 2025-05-29 11:58:54,202 ] - |         4|      1000|3.9311e-07|         0| 0.0091379|    99.994|
[ INFO : 2025-05-29 11:59:48,123 ] - |         4|      1100|3.9105e-07|         0| 0.0089546|    99.994|
[ INFO : 2025-05-29 12:00:42,082 ] - |         4|      1200|  3.89e-07|         0| 0.0088099|    99.995|
[ INFO : 2025-05-29 12:01:35,968 ] - |         4|      1300|3.8696e-07|         0| 0.0088192|    99.995|
[ INFO : 2025-05-29 12:02:29,852 ] - |         4|      1400|3.8493e-07|         0| 0.0087789|    99.996|
[ INFO : 2025-05-29 12:03:23,741 ] - |         4|      1500|3.8291e-07|         0| 0.0089935|    99.992|
[ INFO : 2025-05-29 12:04:17,625 ] - |         4|      1600| 3.809e-07|         0| 0.0090024|    99.992|
[ INFO : 2025-05-29 12:05:11,530 ] - |         4|      1700|3.7891e-07|         0| 0.0089355|    99.989|
[ INFO : 2025-05-29 12:06:05,410 ] - |         4|      1800|3.7692e-07|         0| 0.0092182|    99.986|
[ INFO : 2025-05-29 12:06:59,610 ] - |         4|      1900|3.7494e-07|         0| 0.0091191|    99.987|
[ INFO : 2025-05-29 12:07:53,505 ] - |         4|      2000|3.7298e-07|         0| 0.0090715|    99.987|
[ INFO : 2025-05-29 12:08:47,408 ] - |         4|      2100|3.7102e-07|         0| 0.0090055|    99.988|
[ INFO : 2025-05-29 12:09:41,304 ] - |         4|      2200|3.6908e-07|         0| 0.0089165|    99.989|
[ INFO : 2025-05-29 12:10:35,208 ] - |         4|      2300|3.6714e-07|         0| 0.0088999|    99.989|
[ INFO : 2025-05-29 12:11:29,079 ] - |         4|      2400|3.6522e-07|         0| 0.0088756|     99.99|
[ INFO : 2025-05-29 12:12:22,970 ] - |         4|      2500| 3.633e-07|         0|  0.008934|     99.99|
[ INFO : 2025-05-29 12:13:16,880 ] - |         4|      2600| 3.614e-07|         0| 0.0089882|     99.99|
[ INFO : 2025-05-29 12:14:10,772 ] - |         4|      2700| 3.595e-07|         0| 0.0089811|    99.991|
[ INFO : 2025-05-29 12:15:04,970 ] - |         4|      2800|3.5762e-07|         0| 0.0089502|    99.991|
[ INFO : 2025-05-29 12:15:58,848 ] - |         4|      2900|3.5574e-07|         0| 0.0089147|    99.991|
[ INFO : 2025-05-29 12:16:52,727 ] - |         4|      3000|3.5388e-07|         0| 0.0088899|    99.992|
[ INFO : 2025-05-29 12:17:46,609 ] - |         4|      3100|3.5202e-07|         0| 0.0088333|    99.992|
[ INFO : 2025-05-29 12:18:40,473 ] - |         4|      3200|3.5018e-07|         0| 0.0087808|    99.992|
[ INFO : 2025-05-29 12:19:34,330 ] - |         4|      3300|3.4834e-07|         0| 0.0087097|    99.992|
[ INFO : 2025-05-29 12:20:28,207 ] - |         4|      3400|3.4651e-07|         0| 0.0086741|    99.993|
[ INFO : 2025-05-29 12:21:22,074 ] - |         4|      3500| 3.447e-07|         0| 0.0086248|    99.993|
[ INFO : 2025-05-29 12:22:15,949 ] - |         4|      3600|3.4289e-07|         0| 0.0085991|    99.993|
[ INFO : 2025-05-29 12:23:09,829 ] - |         4|      3700|3.4109e-07|         0| 0.0085524|    99.993|
[ INFO : 2025-05-29 12:24:04,033 ] - |         4|      3800| 3.393e-07|         0| 0.0086038|    99.992|
[ INFO : 2025-05-29 12:24:57,901 ] - |         4|      3900|3.3752e-07|         0| 0.0086074|    99.992|
[ INFO : 2025-05-29 12:25:51,760 ] - |         4|      4000|3.3575e-07|         0| 0.0085755|    99.992|
[ INFO : 2025-05-29 12:26:45,617 ] - |         4|      4100|3.3399e-07|         0| 0.0085458|    99.992|
[ INFO : 2025-05-29 12:27:39,853 ] - |         4|      4200|3.3224e-07|         0| 0.0085666|    99.993|
[ INFO : 2025-05-29 12:28:34,110 ] - |         4|      4300| 3.305e-07|         0|  0.008527|    99.993|
[ INFO : 2025-05-29 12:29:28,361 ] - |         4|      4400|3.2877e-07|         0| 0.0084722|    99.993|
[ INFO : 2025-05-29 12:30:22,629 ] - |         4|      4500|3.2704e-07|         0| 0.0084408|    99.993|
[ INFO : 2025-05-29 12:31:16,870 ] - |         4|      4600|3.2533e-07|         0| 0.0084195|    99.993|
[ INFO : 2025-05-29 12:32:11,183 ] - |         4|      4700|3.2362e-07|         0| 0.0083878|    99.993|
[ INFO : 2025-05-29 12:33:05,060 ] - |         4|      4800|3.2193e-07|         0| 0.0083804|    99.993|
[ INFO : 2025-05-29 12:33:58,937 ] - |         4|      4900|3.2024e-07|         0| 0.0083493|    99.994|
[ INFO : 2025-05-29 12:34:52,827 ] - |         4|      5000|3.1856e-07|         0| 0.0083211|    99.994|
[ INFO : 2025-05-29 12:35:46,701 ] - |         4|      5100|3.1689e-07|         0| 0.0083753|    99.994|
[ INFO : 2025-05-29 12:36:40,591 ] - |         4|      5200|3.1523e-07|         0|  0.008386|    99.994|
[ INFO : 2025-05-29 12:37:34,461 ] - |         4|      5300|3.1357e-07|         0| 0.0083899|    99.994|
[ INFO : 2025-05-29 12:38:28,334 ] - |         4|      5400|3.1193e-07|         0| 0.0083559|    99.994|
[ INFO : 2025-05-29 12:39:22,517 ] - |         4|      5500|3.1029e-07|         0| 0.0083394|    99.994|
[ INFO : 2025-05-29 12:40:16,920 ] - |         4|      5600|3.0867e-07|         0| 0.0083061|    99.994|
[ INFO : 2025-05-29 12:41:10,790 ] - |         4|      5700|3.0705e-07|         0| 0.0082919|    99.995|
[ INFO : 2025-05-29 12:42:04,633 ] - |         4|      5800|3.0544e-07|         0| 0.0082648|    99.995|
[ INFO : 2025-05-29 12:42:58,481 ] - |         4|      5900|3.0384e-07|         0| 0.0082792|    99.993|
[ INFO : 2025-05-29 12:43:52,320 ] - |         4|      6000|3.0224e-07|         0| 0.0082813|    99.993|
[ INFO : 2025-05-29 12:44:46,168 ] - |         4|      6100|3.0066e-07|         0| 0.0082519|    99.993|
[ INFO : 2025-05-29 12:45:40,019 ] - |         4|      6200|2.9908e-07|         0| 0.0082358|    99.993|
[ INFO : 2025-05-29 12:46:33,914 ] - |         4|      6300|2.9752e-07|         0| 0.0082109|    99.993|
[ INFO : 2025-05-29 12:47:28,168 ] - |         4|      6400|2.9596e-07|         0| 0.0081878|    99.993|
[ INFO : 2025-05-29 12:48:22,430 ] - |         4|      6500| 2.944e-07|         0| 0.0081639|    99.993|
[ INFO : 2025-05-29 12:49:16,996 ] - |         4|      6600|2.9286e-07|         0| 0.0081423|    99.993|
[ INFO : 2025-05-29 12:50:11,246 ] - |         4|      6700|2.9132e-07|         0| 0.0081187|    99.993|
[ INFO : 2025-05-29 12:51:05,473 ] - |         4|      6800| 2.898e-07|         0| 0.0080927|    99.994|
[ INFO : 2025-05-29 12:51:59,718 ] - |         4|      6900|2.8828e-07|         0| 0.0080934|    99.994|
[ INFO : 2025-05-29 12:52:53,959 ] - |         4|      7000|2.8677e-07|         0| 0.0080665|    99.994|
[ INFO : 2025-05-29 12:53:48,225 ] - |         4|      7100|2.8526e-07|         0| 0.0080416|    99.993|
[ INFO : 2025-05-29 12:54:42,470 ] - |         4|      7200|2.8377e-07|         0| 0.0080303|    99.993|
[ INFO : 2025-05-29 12:55:36,720 ] - |         4|      7300|2.8228e-07|         0| 0.0080104|    99.993|
[ INFO : 2025-05-29 12:56:30,974 ] - |         4|      7400| 2.808e-07|         0| 0.0080155|    99.993|
[ INFO : 2025-05-29 12:57:25,374 ] - |         4|      7500|2.7933e-07|         0| 0.0079889|    99.993|
[ INFO : 2025-05-29 12:58:19,264 ] - |         4|      7600|2.7786e-07|         0| 0.0079682|    99.993|
[ INFO : 2025-05-29 12:59:13,141 ] - |         4|      7700| 2.764e-07|         0| 0.0079616|    99.994|
[ INFO : 2025-05-29 13:00:07,015 ] - |         4|      7800|2.7496e-07|         0| 0.0079268|    99.994|
[ INFO : 2025-05-29 13:01:00,894 ] - |         4|      7900|2.7351e-07|         0| 0.0079391|    99.994|
[ INFO : 2025-05-29 13:01:54,773 ] - |         4|      8000|2.7208e-07|         0| 0.0079335|    99.994|
[ INFO : 2025-05-29 13:02:48,674 ] - |         4|      8100|2.7065e-07|         0|  0.007916|    99.994|
[ INFO : 2025-05-29 13:03:42,542 ] - |         4|      8200|2.6923e-07|         0| 0.0078933|    99.994|
[ INFO : 2025-05-29 13:04:36,402 ] - |         4|      8300|2.6782e-07|         0| 0.0078892|    99.994|
[ INFO : 2025-05-29 13:05:30,487 ] - |         4|      8400|2.6642e-07|         0| 0.0078704|    99.994|
[ INFO : 2025-05-29 13:06:24,984 ] - |         4|      8500|2.6502e-07|         0|  0.007864|    99.994|
[ INFO : 2025-05-29 13:07:19,285 ] - |         4|      8600|2.6363e-07|         0|  0.007837|    99.994|
[ INFO : 2025-05-29 13:08:13,604 ] - |         4|      8700|2.6225e-07|         0| 0.0078211|    99.994|
[ INFO : 2025-05-29 13:09:07,912 ] - |         4|      8800|2.6087e-07|         0| 0.0078095|    99.994|
[ INFO : 2025-05-29 13:10:02,219 ] - |         4|      8900|2.5951e-07|         0| 0.0078065|    99.994|
[ INFO : 2025-05-29 13:10:56,511 ] - |         4|      9000|2.5815e-07|         0| 0.0077907|    99.994|
[ INFO : 2025-05-29 13:11:50,804 ] - |         4|      9100|2.5679e-07|         0| 0.0077785|    99.993|
[ INFO : 2025-05-29 13:12:45,104 ] - |         4|      9200|2.5545e-07|         0| 0.0077792|    99.993|
[ INFO : 2025-05-29 13:13:39,417 ] - |         4|      9300|2.5411e-07|         0| 0.0077653|    99.993|
[ INFO : 2025-05-29 13:14:33,839 ] - |         4|      9400|2.5277e-07|         0| 0.0077576|    99.993|
[ INFO : 2025-05-29 13:15:28,088 ] - |         4|      9500|2.5145e-07|         0| 0.0077581|    99.993|
[ INFO : 2025-05-29 13:16:22,346 ] - |         4|      9600|2.5013e-07|         0| 0.0077351|    99.993|
[ INFO : 2025-05-29 13:17:16,628 ] - |         4|      9700|2.4882e-07|         0| 0.0077168|    99.994|
[ INFO : 2025-05-29 13:18:10,876 ] - |         4|      9800|2.4751e-07|         0| 0.0076977|    99.994|
[ INFO : 2025-05-29 13:19:05,193 ] - |         4|      9900|2.4622e-07|         0| 0.0076781|    99.994|
[ INFO : 2025-05-29 13:19:59,491 ] - |         4|     10000|2.4493e-07|         0| 0.0076679|    99.994|
[ INFO : 2025-05-29 13:20:53,758 ] - |         4|     10100|2.4364e-07|         0| 0.0076445|    99.994|
[ INFO : 2025-05-29 13:21:48,053 ] - |         4|     10200|2.4236e-07|         0| 0.0076314|    99.994|
[ INFO : 2025-05-29 13:22:42,636 ] - |         4|     10300|2.4109e-07|         0| 0.0076092|    99.994|
[ INFO : 2025-05-29 13:23:36,939 ] - |         4|     10400|2.3983e-07|         0| 0.0075944|    99.994|
[ INFO : 2025-05-29 13:24:31,216 ] - |         4|     10500|2.3857e-07|         0| 0.0075698|    99.994|
[ INFO : 2025-05-29 13:25:25,501 ] - |         4|     10600|2.3732e-07|         0| 0.0075635|    99.994|
[ INFO : 2025-05-29 13:26:19,782 ] - |         4|     10700|2.3608e-07|         0| 0.0075443|    99.994|
[ INFO : 2025-05-29 13:27:14,051 ] - |         4|     10800|2.3484e-07|         0|  0.007539|    99.994|
[ INFO : 2025-05-29 13:28:08,319 ] - |         4|     10900|2.3361e-07|         0| 0.0075309|    99.994|
[ INFO : 2025-05-29 13:29:02,603 ] - |         4|     11000|2.3238e-07|         0| 0.0075086|    99.994|
[ INFO : 2025-05-29 13:29:56,868 ] - |         4|     11100|2.3116e-07|         0| 0.0075116|    99.993|
[ INFO : 2025-05-29 13:30:50,950 ] - |         4|     11200|2.2995e-07|         0| 0.0075029|    99.993|
[ INFO : 2025-05-29 13:31:45,333 ] - |         4|     11300|2.2875e-07|         0|  0.007484|    99.993|
[ INFO : 2025-05-29 13:32:38,279 ] - |         4|     11397|2.2758e-07|         0|  0.007462|    99.993|
[ INFO : 2025-05-29 13:33:46,484 ] - |         5|       100|2.2639e-07|         0| 0.0062654|       100|
[ INFO : 2025-05-29 13:34:40,313 ] - |         5|       200| 2.252e-07|         0| 0.0059358|       100|
[ INFO : 2025-05-29 13:35:34,176 ] - |         5|       300|2.2402e-07|         0|  0.005884|       100|
[ INFO : 2025-05-29 13:36:28,065 ] - |         5|       400|2.2285e-07|         0| 0.0058216|       100|
[ INFO : 2025-05-29 13:37:21,947 ] - |         5|       500|2.2168e-07|         0|  0.005825|       100|
[ INFO : 2025-05-29 13:38:15,829 ] - |         5|       600|2.2052e-07|         0| 0.0061103|     99.99|
[ INFO : 2025-05-29 13:39:09,714 ] - |         5|       700|2.1936e-07|         0| 0.0060536|    99.991|
[ INFO : 2025-05-29 13:40:03,610 ] - |         5|       800|2.1821e-07|         0| 0.0059491|    99.992|
[ INFO : 2025-05-29 13:40:57,630 ] - |         5|       900|2.1707e-07|         0| 0.0059083|    99.993|
[ INFO : 2025-05-29 13:41:51,708 ] - |         5|      1000|2.1593e-07|         0|  0.006071|    99.994|
[ INFO : 2025-05-29 13:42:45,598 ] - |         5|      1100| 2.148e-07|         0| 0.0060048|    99.994|
[ INFO : 2025-05-29 13:43:39,479 ] - |         5|      1200|2.1367e-07|         0| 0.0060302|    99.995|
[ INFO : 2025-05-29 13:44:33,360 ] - |         5|      1300|2.1255e-07|         0| 0.0060569|    99.995|
[ INFO : 2025-05-29 13:45:27,220 ] - |         5|      1400|2.1143e-07|         0| 0.0060691|    99.996|
[ INFO : 2025-05-29 13:46:21,102 ] - |         5|      1500|2.1033e-07|         0|  0.005964|    99.996|
[ INFO : 2025-05-29 13:47:14,983 ] - |         5|      1600|2.0922e-07|         0| 0.0059117|    99.996|
[ INFO : 2025-05-29 13:48:08,849 ] - |         5|      1700|2.0813e-07|         0| 0.0058892|    99.996|
[ INFO : 2025-05-29 13:49:02,717 ] - |         5|      1800|2.0703e-07|         0| 0.0058621|    99.997|
[ INFO : 2025-05-29 13:49:56,872 ] - |         5|      1900|2.0595e-07|         0| 0.0058668|    99.997|
[ INFO : 2025-05-29 13:50:50,747 ] - |         5|      2000|2.0487e-07|         0| 0.0058335|    99.997|
[ INFO : 2025-05-29 13:51:44,626 ] - |         5|      2100| 2.038e-07|         0| 0.0058177|    99.997|
[ INFO : 2025-05-29 13:52:38,524 ] - |         5|      2200|2.0273e-07|         0| 0.0057914|    99.997|
[ INFO : 2025-05-29 13:53:32,420 ] - |         5|      2300|2.0166e-07|         0| 0.0058094|    99.995|
[ INFO : 2025-05-29 13:54:26,331 ] - |         5|      2400|2.0061e-07|         0| 0.0057814|    99.995|
[ INFO : 2025-05-29 13:55:20,495 ] - |         5|      2500|1.9955e-07|         0| 0.0058227|    99.995|
[ INFO : 2025-05-29 13:56:14,446 ] - |         5|      2600|1.9851e-07|         0| 0.0058013|    99.995|
[ INFO : 2025-05-29 13:57:08,627 ] - |         5|      2700|1.9747e-07|         0|  0.005776|    99.995|
[ INFO : 2025-05-29 13:58:02,835 ] - |         5|      2800|1.9643e-07|         0| 0.0057618|    99.996|
[ INFO : 2025-05-29 13:58:56,717 ] - |         5|      2900| 1.954e-07|         0| 0.0057288|    99.996|
[ INFO : 2025-05-29 13:59:50,618 ] - |         5|      3000|1.9438e-07|         0| 0.0057371|    99.996|
[ INFO : 2025-05-29 14:00:44,496 ] - |         5|      3100|1.9336e-07|         0| 0.0057527|    99.996|
[ INFO : 2025-05-29 14:01:38,397 ] - |         5|      3200|1.9234e-07|         0| 0.0057163|    99.996|
[ INFO : 2025-05-29 14:02:32,294 ] - |         5|      3300|1.9134e-07|         0| 0.0057195|    99.996|
[ INFO : 2025-05-29 14:03:26,369 ] - |         5|      3400|1.9033e-07|         0| 0.0057172|    99.996|
[ INFO : 2025-05-29 14:04:20,676 ] - |         5|      3500|1.8933e-07|         0| 0.0057323|    99.996|
[ INFO : 2025-05-29 14:05:14,985 ] - |         5|      3600|1.8834e-07|         0| 0.0057181|    99.997|
[ INFO : 2025-05-29 14:06:09,266 ] - |         5|      3700|1.8735e-07|         0| 0.0057173|    99.997|
[ INFO : 2025-05-29 14:07:03,859 ] - |         5|      3800|1.8637e-07|         0| 0.0057444|    99.997|
[ INFO : 2025-05-29 14:07:58,139 ] - |         5|      3900|1.8539e-07|         0| 0.0057346|    99.997|
[ INFO : 2025-05-29 14:08:52,424 ] - |         5|      4000|1.8442e-07|         0| 0.0057177|    99.997|
[ INFO : 2025-05-29 14:09:46,503 ] - |         5|      4100|1.8346e-07|         0| 0.0057146|    99.997|
[ INFO : 2025-05-29 14:10:40,800 ] - |         5|      4200|1.8249e-07|         0| 0.0057233|    99.997|
[ INFO : 2025-05-29 14:11:35,088 ] - |         5|      4300|1.8154e-07|         0| 0.0057011|    99.997|
[ INFO : 2025-05-29 14:12:29,213 ] - |         5|      4400|1.8059e-07|         0| 0.0057134|    99.997|
[ INFO : 2025-05-29 14:13:23,506 ] - |         5|      4500|1.7964e-07|         0| 0.0057202|    99.997|
[ INFO : 2025-05-29 14:14:17,839 ] - |         5|      4600| 1.787e-07|         0| 0.0057496|    99.997|
[ INFO : 2025-05-29 14:15:12,471 ] - |         5|      4700|1.7776e-07|         0| 0.0057486|    99.997|
[ INFO : 2025-05-29 14:16:06,735 ] - |         5|      4800|1.7683e-07|         0|  0.005729|    99.997|
[ INFO : 2025-05-29 14:17:01,007 ] - |         5|      4900| 1.759e-07|         0| 0.0057251|    99.997|
[ INFO : 2025-05-29 14:17:55,271 ] - |         5|      5000|1.7498e-07|         0| 0.0057119|    99.997|
[ INFO : 2025-05-29 14:18:49,539 ] - |         5|      5100|1.7406e-07|         0| 0.0057011|    99.998|
[ INFO : 2025-05-29 14:19:43,807 ] - |         5|      5200|1.7315e-07|         0| 0.0056945|    99.998|
[ INFO : 2025-05-29 14:20:38,084 ] - |         5|      5300|1.7224e-07|         0| 0.0056786|    99.998|
[ INFO : 2025-05-29 14:21:32,358 ] - |         5|      5400|1.7134e-07|         0|  0.005657|    99.998|
[ INFO : 2025-05-29 14:22:26,512 ] - |         5|      5500|1.7044e-07|         0| 0.0056339|    99.998|
[ INFO : 2025-05-29 14:23:20,742 ] - |         5|      5600|1.6954e-07|         0| 0.0056408|    99.998|
[ INFO : 2025-05-29 14:24:14,618 ] - |         5|      5700|1.6866e-07|         0| 0.0056216|    99.998|
[ INFO : 2025-05-29 14:25:08,523 ] - |         5|      5800|1.6777e-07|         0| 0.0056202|    99.998|
[ INFO : 2025-05-29 14:26:02,385 ] - |         5|      5900|1.6689e-07|         0| 0.0056103|    99.998|
[ INFO : 2025-05-29 14:26:56,231 ] - |         5|      6000|1.6602e-07|         0| 0.0055893|    99.998|
[ INFO : 2025-05-29 14:27:50,369 ] - |         5|      6100|1.6515e-07|         0| 0.0055787|    99.998|
[ INFO : 2025-05-29 14:28:44,622 ] - |         5|      6200|1.6428e-07|         0| 0.0055665|    99.998|
[ INFO : 2025-05-29 14:29:38,865 ] - |         5|      6300|1.6342e-07|         0| 0.0055706|    99.998|
[ INFO : 2025-05-29 14:30:33,092 ] - |         5|      6400|1.6256e-07|         0| 0.0055517|    99.998|
[ INFO : 2025-05-29 14:31:27,237 ] - |         5|      6500|1.6171e-07|         0| 0.0055442|    99.998|
[ INFO : 2025-05-29 14:32:21,524 ] - |         5|      6600|1.6086e-07|         0| 0.0055505|    99.998|
[ INFO : 2025-05-29 14:33:15,394 ] - |         5|      6700|1.6002e-07|         0| 0.0055472|    99.998|
[ INFO : 2025-05-29 14:34:09,297 ] - |         5|      6800|1.5918e-07|         0| 0.0055246|    99.998|
[ INFO : 2025-05-29 14:35:03,189 ] - |         5|      6900|1.5834e-07|         0| 0.0055159|    99.998|
[ INFO : 2025-05-29 14:35:57,089 ] - |         5|      7000|1.5751e-07|         0| 0.0055247|    99.998|
[ INFO : 2025-05-29 14:36:50,968 ] - |         5|      7100|1.5669e-07|         0| 0.0055288|    99.998|
[ INFO : 2025-05-29 14:37:44,866 ] - |         5|      7200|1.5587e-07|         0| 0.0055158|    99.998|
[ INFO : 2025-05-29 14:38:38,751 ] - |         5|      7300|1.5505e-07|         0| 0.0055152|    99.998|
[ INFO : 2025-05-29 14:39:32,676 ] - |         5|      7400|1.5424e-07|         0| 0.0055127|    99.998|
[ INFO : 2025-05-29 14:40:26,906 ] - |         5|      7500|1.5343e-07|         0| 0.0054964|    99.998|
[ INFO : 2025-05-29 14:41:20,841 ] - |         5|      7600|1.5262e-07|         0| 0.0054956|    99.998|
[ INFO : 2025-05-29 14:42:14,799 ] - |         5|      7700|1.5182e-07|         0| 0.0054908|    99.998|
[ INFO : 2025-05-29 14:43:08,781 ] - |         5|      7800|1.5103e-07|         0| 0.0054795|    99.998|
[ INFO : 2025-05-29 14:44:02,664 ] - |         5|      7900|1.5024e-07|         0| 0.0054846|    99.998|
[ INFO : 2025-05-29 14:44:56,546 ] - |         5|      8000|1.4945e-07|         0| 0.0054841|    99.998|
[ INFO : 2025-05-29 14:45:50,421 ] - |         5|      8100|1.4866e-07|         0| 0.0054763|    99.998|
[ INFO : 2025-05-29 14:46:44,308 ] - |         5|      8200|1.4788e-07|         0| 0.0054769|    99.997|
[ INFO : 2025-05-29 14:47:38,249 ] - |         5|      8300|1.4711e-07|         0|  0.005476|    99.997|
[ INFO : 2025-05-29 14:48:32,271 ] - |         5|      8400|1.4634e-07|         0| 0.0054643|    99.997|
[ INFO : 2025-05-29 14:49:26,399 ] - |         5|      8500|1.4557e-07|         0|  0.005462|    99.997|
[ INFO : 2025-05-29 14:50:20,304 ] - |         5|      8600|1.4481e-07|         0| 0.0054532|    99.997|
[ INFO : 2025-05-29 14:51:14,213 ] - |         5|      8700|1.4405e-07|         0| 0.0054479|    99.997|
[ INFO : 2025-05-29 14:52:08,156 ] - |         5|      8800|1.4329e-07|         0| 0.0054367|    99.997|
[ INFO : 2025-05-29 14:53:02,110 ] - |         5|      8900|1.4254e-07|         0| 0.0054292|    99.997|
[ INFO : 2025-05-29 14:53:56,064 ] - |         5|      9000|1.4179e-07|         0| 0.0054217|    99.997|
[ INFO : 2025-05-29 14:54:49,946 ] - |         5|      9100|1.4105e-07|         0| 0.0054331|    99.997|
[ INFO : 2025-05-29 14:55:43,876 ] - |         5|      9200|1.4031e-07|         0| 0.0054249|    99.997|
[ INFO : 2025-05-29 14:56:37,887 ] - |         5|      9300|1.3958e-07|         0| 0.0054144|    99.997|
[ INFO : 2025-05-29 14:57:32,100 ] - |         5|      9400|1.3884e-07|         0| 0.0054065|    99.997|
[ INFO : 2025-05-29 14:58:26,001 ] - |         5|      9500|1.3812e-07|         0| 0.0054114|    99.997|
[ INFO : 2025-05-29 14:59:19,895 ] - |         5|      9600|1.3739e-07|         0| 0.0054155|    99.997|
[ INFO : 2025-05-29 15:00:13,802 ] - |         5|      9700|1.3667e-07|         0| 0.0054132|    99.997|
[ INFO : 2025-05-29 15:01:07,698 ] - |         5|      9800|1.3595e-07|         0| 0.0054056|    99.997|
[ INFO : 2025-05-29 15:02:01,595 ] - |         5|      9900|1.3524e-07|         0| 0.0054006|    99.997|
[ INFO : 2025-05-29 15:02:55,468 ] - |         5|     10000|1.3453e-07|         0| 0.0053878|    99.997|
[ INFO : 2025-05-29 15:03:49,364 ] - |         5|     10100|1.3383e-07|         0| 0.0053721|    99.998|
[ INFO : 2025-05-29 15:04:43,274 ] - |         5|     10200|1.3313e-07|         0| 0.0053709|    99.998|
[ INFO : 2025-05-29 15:05:37,654 ] - |         5|     10300|1.3243e-07|         0| 0.0053773|    99.998|
[ INFO : 2025-05-29 15:06:31,519 ] - |         5|     10400|1.3173e-07|         0| 0.0053712|    99.998|
[ INFO : 2025-05-29 15:07:25,403 ] - |         5|     10500|1.3104e-07|         0| 0.0053666|    99.998|
[ INFO : 2025-05-29 15:08:19,274 ] - |         5|     10600|1.3036e-07|         0|  0.005362|    99.998|
[ INFO : 2025-05-29 15:09:13,139 ] - |         5|     10700|1.2967e-07|         0| 0.0053526|    99.998|
[ INFO : 2025-05-29 15:10:07,010 ] - |         5|     10800|1.2899e-07|         0| 0.0053456|    99.998|
[ INFO : 2025-05-29 15:11:00,880 ] - |         5|     10900|1.2832e-07|         0| 0.0053401|    99.998|
[ INFO : 2025-05-29 15:11:54,774 ] - |         5|     11000|1.2764e-07|         0| 0.0053376|    99.998|
[ INFO : 2025-05-29 15:12:48,678 ] - |         5|     11100|1.2697e-07|         0| 0.0053249|    99.998|
[ INFO : 2025-05-29 15:13:42,886 ] - |         5|     11200|1.2631e-07|         0| 0.0053186|    99.998|
[ INFO : 2025-05-29 15:14:37,440 ] - |         5|     11300|1.2565e-07|         0| 0.0053129|    99.998|
[ INFO : 2025-05-29 15:15:30,410 ] - |         5|     11397|1.2501e-07|         0| 0.0053079|    99.998|
[ INFO : 2025-05-29 15:15:34,369 ] - +----------+----------+----------+----------+----------+----------+