Commit 30f3787 (verified) by Junteng · 1 parent: 710d1ff

Upload folder using huggingface_hub
data/trained_openclip/negative_logs/plotqa_v2/2024_09_02-19_36_58-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp/checkpoints/epoch_1.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4fa61f66e5d6ebdbcf9adc05ef5d3c7ea3be2084b43ee1bc23ab857b7e65bc6d
+ size 5135890710
data/trained_openclip/negative_logs/plotqa_v2/2024_09_02-19_36_58-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp/checkpoints/epoch_2.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b0eb20d4f0b05d2a611270cdda9108652ddd709c81ee8b3fda78c3d68da37c18
+ size 5135890710
data/trained_openclip/negative_logs/plotqa_v2/2024_09_02-19_36_58-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp/checkpoints/epoch_3.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:ec5f7ec1f08fbc956c82d8f0574da0712b16d36627c6e17e30282cf7a7935eb1
+ size 5135890710
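Each `epoch_*.pt` entry above is a Git LFS pointer file, not the checkpoint itself: three text lines (`version`, `oid`, `size`) that tell Git LFS which blob to fetch. As a minimal sketch (the `parse_lfs_pointer` helper below is illustrative, not part of this repository), the pointer format can be read like this:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        # Each pointer line is "<key> <value>", split on the first space.
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:4fa61f66e5d6ebdbcf9adc05ef5d3c7ea3be2084b43ee1bc23ab857b7e65bc6d
size 5135890710
"""

info = parse_lfs_pointer(pointer)
print(info["size"])  # 5135890710 (about 5.1 GB per checkpoint)
```

Running `git lfs pull` (or downloading through `huggingface_hub`) replaces each pointer with the actual ~5.1 GB checkpoint blob identified by the `oid`.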
data/trained_openclip/negative_logs/plotqa_v2/2024_09_02-19_36_58-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp/out.log ADDED
@@ -0,0 +1,549 @@
1
+ 2024-09-02,19:36:58 | INFO | Running in distributed mode with multiple processes. Device: cuda:0.Process (global: 0, local 0), total 4.
2
+ 2024-09-02,19:36:58 | INFO | Loading ViT-L-14-336 model config.
3
+ 2024-09-02,19:37:00 | INFO | Loading pretrained ViT-L-14-336 weights (/project/deemreason/junteng/Vision4Math/data/openclip-vit-14-336/openclip_model.pt).
4
+ 2024-09-02,19:37:02 | INFO | Model:
5
+ 2024-09-02,19:37:02 | INFO | CLIP(
6
+ (visual): VisualTransformer(
7
+ (conv1): Conv2d(3, 1024, kernel_size=(14, 14), stride=(14, 14), bias=False)
8
+ (ln_pre): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
9
+ (transformer): Transformer(
10
+ (resblocks): ModuleList(
11
+ (0-23): 24 x ResidualAttentionBlock(
12
+ (attn): MultiheadAttention(
13
+ (out_proj): NonDynamicallyQuantizableLinear(in_features=1024, out_features=1024, bias=True)
14
+ )
15
+ (ln_1): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
16
+ (mlp): Sequential(
17
+ (c_fc): Linear(in_features=1024, out_features=4096, bias=True)
18
+ (gelu): QuickGELU()
19
+ (c_proj): Linear(in_features=4096, out_features=1024, bias=True)
20
+ )
21
+ (ln_2): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
22
+ )
23
+ )
24
+ )
25
+ (ln_post): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
26
+ )
27
+ (transformer): Transformer(
28
+ (resblocks): ModuleList(
29
+ (0-11): 12 x ResidualAttentionBlock(
30
+ (attn): MultiheadAttention(
31
+ (out_proj): NonDynamicallyQuantizableLinear(in_features=768, out_features=768, bias=True)
32
+ )
33
+ (ln_1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
34
+ (mlp): Sequential(
35
+ (c_fc): Linear(in_features=768, out_features=3072, bias=True)
36
+ (gelu): QuickGELU()
37
+ (c_proj): Linear(in_features=3072, out_features=768, bias=True)
38
+ )
39
+ (ln_2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
40
+ )
41
+ )
42
+ )
43
+ (token_embedding): Embedding(49408, 768)
44
+ (ln_final): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
45
+ )
46
+ 2024-09-02,19:37:02 | INFO | Params:
47
+ 2024-09-02,19:37:02 | INFO | batch_size: 64
48
+ 2024-09-02,19:37:02 | INFO | beta1: 0.9
49
+ 2024-09-02,19:37:02 | INFO | beta2: 0.98
50
+ 2024-09-02,19:37:02 | INFO | checkpoint_path: /project/deemreason/junteng/Vision4Math/train_clip/negative_logs/plotqa_v2/2024_09_02-19_36_58-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp/checkpoints
51
+ 2024-09-02,19:37:02 | INFO | copy_codebase: False
52
+ 2024-09-02,19:37:02 | INFO | csv_caption_key: caption
53
+ 2024-09-02,19:37:02 | INFO | csv_hard_captions_key: neg_caption
54
+ 2024-09-02,19:37:02 | INFO | csv_img_key: img_path
55
+ 2024-09-02,19:37:02 | INFO | csv_separator: ,
56
+ 2024-09-02,19:37:02 | INFO | dataset_resampled: False
57
+ 2024-09-02,19:37:02 | INFO | dataset_type: csv
58
+ 2024-09-02,19:37:02 | INFO | ddp_static_graph: False
59
+ 2024-09-02,19:37:02 | INFO | debug: False
60
+ 2024-09-02,19:37:02 | INFO | device: cuda:0
61
+ 2024-09-02,19:37:02 | INFO | dist_backend: nccl
62
+ 2024-09-02,19:37:02 | INFO | dist_url: env://
63
+ 2024-09-02,19:37:02 | INFO | distributed: True
64
+ 2024-09-02,19:37:02 | INFO | epochs: 3
65
+ 2024-09-02,19:37:02 | INFO | eps: 1e-06
66
+ 2024-09-02,19:37:02 | INFO | force_quick_gelu: True
67
+ 2024-09-02,19:37:02 | INFO | gather_with_grad: False
68
+ 2024-09-02,19:37:02 | INFO | grad_checkpointing: False
69
+ 2024-09-02,19:37:02 | INFO | horovod: False
70
+ 2024-09-02,19:37:02 | INFO | imagenet_v2: None
71
+ 2024-09-02,19:37:02 | INFO | imagenet_val: None
72
+ 2024-09-02,19:37:02 | INFO | local_loss: False
73
+ 2024-09-02,19:37:02 | INFO | local_rank: 0
74
+ 2024-09-02,19:37:02 | INFO | lock_image: False
75
+ 2024-09-02,19:37:02 | INFO | lock_image_freeze_bn_stats: False
76
+ 2024-09-02,19:37:02 | INFO | lock_image_unlocked_groups: 0
77
+ 2024-09-02,19:37:02 | INFO | log_level: 20
78
+ 2024-09-02,19:37:02 | INFO | log_local: False
79
+ 2024-09-02,19:37:02 | INFO | log_path: /project/deemreason/junteng/Vision4Math/train_clip/negative_logs/plotqa_v2/2024_09_02-19_36_58-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp/out.log
80
+ 2024-09-02,19:37:02 | INFO | logs: /project/deemreason/junteng/Vision4Math/train_clip/negative_logs/plotqa_v2
81
+ 2024-09-02,19:37:02 | INFO | lr: 1e-06
82
+ 2024-09-02,19:37:02 | INFO | model: ViT-L-14-336
83
+ 2024-09-02,19:37:02 | INFO | name: 2024_09_02-19_36_58-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp
84
+ 2024-09-02,19:37:02 | INFO | no_set_device_rank: False
85
+ 2024-09-02,19:37:02 | INFO | norm_gradient_clip: None
86
+ 2024-09-02,19:37:02 | INFO | precision: amp
87
+ 2024-09-02,19:37:02 | INFO | pretrained: /project/deemreason/junteng/Vision4Math/data/openclip-vit-14-336/openclip_model.pt
88
+ 2024-09-02,19:37:02 | INFO | pretrained_image: False
89
+ 2024-09-02,19:37:02 | INFO | rank: 0
90
+ 2024-09-02,19:37:02 | INFO | report_to: wandb
91
+ 2024-09-02,19:37:02 | INFO | resume: None
92
+ 2024-09-02,19:37:02 | INFO | save_frequency: 1
93
+ 2024-09-02,19:37:02 | INFO | save_most_recent: False
94
+ 2024-09-02,19:37:02 | INFO | seed: 0
95
+ 2024-09-02,19:37:02 | INFO | skip_scheduler: False
96
+ 2024-09-02,19:37:02 | INFO | tensorboard: False
97
+ 2024-09-02,19:37:02 | INFO | tensorboard_path:
98
+ 2024-09-02,19:37:02 | INFO | torchscript: False
99
+ 2024-09-02,19:37:02 | INFO | trace: False
100
+ 2024-09-02,19:37:02 | INFO | train_data: /project/deemreason/junteng/Vision4Math/csv_data/plotqa_train_v2.csv
101
+ 2024-09-02,19:37:02 | INFO | train_num_samples: None
102
+ 2024-09-02,19:37:02 | INFO | use_bn_sync: False
103
+ 2024-09-02,19:37:02 | INFO | val_data: None
104
+ 2024-09-02,19:37:02 | INFO | val_frequency: 1
105
+ 2024-09-02,19:37:02 | INFO | val_num_samples: None
106
+ 2024-09-02,19:37:02 | INFO | wandb: True
107
+ 2024-09-02,19:37:02 | INFO | wandb_notes:
108
+ 2024-09-02,19:37:02 | INFO | wandb_project: open-clip-sum
109
+ 2024-09-02,19:37:02 | INFO | warmup: 0
110
+ 2024-09-02,19:37:02 | INFO | wd: 0.1
111
+ 2024-09-02,19:37:02 | INFO | workers: 4
112
+ 2024-09-02,19:37:02 | INFO | world_size: 4
113
+ 2024-09-02,19:37:02 | INFO | zeroshot_frequency: 2
114
+ 2024-09-02,19:37:11 | INFO | Init a wandb project!
115
+ 2024-09-02,19:37:16 | INFO | Start epoch 0
116
+ 2024-09-02,19:37:19 | INFO | Train Epoch: 0 [ 256/3655823 (0%)] Loss: 5.5831 (5.583) Data (t): 1.467 Batch (t): 3.462, 73.9382/s LR: 0.000001 Logit Scale: 100.000 - V4
117
+ 2024-09-02,19:38:01 | INFO | Train Epoch: 0 [ 25856/3655823 (1%)] Loss: 3.2637 (4.423) Data (t): 0.000 Batch (t): 0.414, 627.302/s LR: 0.000001 Logit Scale: 99.997 - V4
118
+ 2024-09-02,19:38:42 | INFO | Train Epoch: 0 [ 51456/3655823 (1%)] Loss: 2.7307 (3.859) Data (t): 0.000 Batch (t): 0.407, 627.869/s LR: 0.000001 Logit Scale: 99.997 - V4
119
+ 2024-09-02,19:39:23 | INFO | Train Epoch: 0 [ 77056/3655823 (2%)] Loss: 2.4601 (3.509) Data (t): 0.000 Batch (t): 0.411, 629.819/s LR: 0.000001 Logit Scale: 99.998 - V4
120
+ 2024-09-02,19:40:04 | INFO | Train Epoch: 0 [ 102656/3655823 (3%)] Loss: 2.4481 (3.297) Data (t): 0.001 Batch (t): 0.415, 627.474/s LR: 0.000001 Logit Scale: 99.999 - V4
121
+ 2024-09-02,19:40:45 | INFO | Train Epoch: 0 [ 128256/3655823 (4%)] Loss: 2.3653 (3.142) Data (t): 0.000 Batch (t): 0.408, 645.102/s LR: 0.000001 Logit Scale: 100.000 - V4
122
+ 2024-09-02,19:41:26 | INFO | Train Epoch: 0 [ 153856/3655823 (4%)] Loss: 2.4479 (3.043) Data (t): 0.000 Batch (t): 0.408, 627.314/s LR: 0.000001 Logit Scale: 100.000 - V4
123
+ 2024-09-02,19:42:07 | INFO | Train Epoch: 0 [ 179456/3655823 (5%)] Loss: 2.4120 (2.964) Data (t): 0.000 Batch (t): 0.408, 627.802/s LR: 0.000001 Logit Scale: 100.000 - V4
124
+ 2024-09-02,19:42:48 | INFO | Train Epoch: 0 [ 205056/3655823 (6%)] Loss: 2.1145 (2.869) Data (t): 0.000 Batch (t): 0.410, 627.061/s LR: 0.000001 Logit Scale: 100.000 - V4
125
+ 2024-09-02,19:43:29 | INFO | Train Epoch: 0 [ 230656/3655823 (6%)] Loss: 2.1470 (2.797) Data (t): 0.000 Batch (t): 0.412, 627.200/s LR: 0.000001 Logit Scale: 100.000 - V4
126
+ 2024-09-02,19:44:10 | INFO | Train Epoch: 0 [ 256256/3655823 (7%)] Loss: 2.1355 (2.737) Data (t): 0.000 Batch (t): 0.407, 627.290/s LR: 0.000001 Logit Scale: 100.000 - V4
127
+ 2024-09-02,19:44:50 | INFO | Train Epoch: 0 [ 281856/3655823 (8%)] Loss: 1.9676 (2.673) Data (t): 0.001 Batch (t): 0.407, 627.746/s LR: 0.000001 Logit Scale: 100.000 - V4
128
+ 2024-09-02,19:45:31 | INFO | Train Epoch: 0 [ 307456/3655823 (8%)] Loss: 1.9708 (2.619) Data (t): 0.000 Batch (t): 0.407, 631.575/s LR: 0.000001 Logit Scale: 100.000 - V4
129
+ 2024-09-02,19:46:12 | INFO | Train Epoch: 0 [ 333056/3655823 (9%)] Loss: 1.8963 (2.567) Data (t): 0.000 Batch (t): 0.407, 628.187/s LR: 0.000001 Logit Scale: 100.000 - V4
130
+ 2024-09-02,19:46:53 | INFO | Train Epoch: 0 [ 358656/3655823 (10%)] Loss: 1.7813 (2.515) Data (t): 0.000 Batch (t): 0.416, 630.738/s LR: 0.000001 Logit Scale: 100.000 - V4
131
+ 2024-09-02,19:47:34 | INFO | Train Epoch: 0 [ 384256/3655823 (11%)] Loss: 1.9189 (2.478) Data (t): 0.000 Batch (t): 0.407, 628.081/s LR: 0.000001 Logit Scale: 100.000 - V4
132
+ 2024-09-02,19:48:15 | INFO | Train Epoch: 0 [ 409856/3655823 (11%)] Loss: 2.0016 (2.450) Data (t): 0.000 Batch (t): 0.407, 629.155/s LR: 0.000001 Logit Scale: 100.000 - V4
133
+ 2024-09-02,19:48:56 | INFO | Train Epoch: 0 [ 435456/3655823 (12%)] Loss: 1.8773 (2.418) Data (t): 0.000 Batch (t): 0.408, 626.731/s LR: 0.000001 Logit Scale: 100.000 - V4
134
+ 2024-09-02,19:49:36 | INFO | Train Epoch: 0 [ 461056/3655823 (13%)] Loss: 1.8882 (2.390) Data (t): 0.000 Batch (t): 0.408, 628.219/s LR: 0.000001 Logit Scale: 100.000 - V4
135
+ 2024-09-02,19:50:18 | INFO | Train Epoch: 0 [ 486656/3655823 (13%)] Loss: 2.1099 (2.376) Data (t): 0.000 Batch (t): 0.416, 630.040/s LR: 0.000001 Logit Scale: 100.000 - V4
136
+ 2024-09-02,19:50:59 | INFO | Train Epoch: 0 [ 512256/3655823 (14%)] Loss: 1.8348 (2.350) Data (t): 0.000 Batch (t): 0.407, 629.517/s LR: 0.000001 Logit Scale: 100.000 - V4
137
+ 2024-09-02,19:51:39 | INFO | Train Epoch: 0 [ 537856/3655823 (15%)] Loss: 1.9693 (2.333) Data (t): 0.000 Batch (t): 0.408, 630.885/s LR: 0.000001 Logit Scale: 100.000 - V4
138
+ 2024-09-02,19:52:20 | INFO | Train Epoch: 0 [ 563456/3655823 (15%)] Loss: 1.9957 (2.318) Data (t): 0.000 Batch (t): 0.408, 628.967/s LR: 0.000001 Logit Scale: 100.000 - V4
139
+ 2024-09-02,19:53:01 | INFO | Train Epoch: 0 [ 589056/3655823 (16%)] Loss: 1.8148 (2.297) Data (t): 0.001 Batch (t): 0.408, 628.276/s LR: 0.000001 Logit Scale: 100.000 - V4
140
+ 2024-09-02,19:53:43 | INFO | Train Epoch: 0 [ 614656/3655823 (17%)] Loss: 1.7455 (2.275) Data (t): 0.000 Batch (t): 0.416, 629.975/s LR: 0.000001 Logit Scale: 100.000 - V4
141
+ 2024-09-02,19:54:23 | INFO | Train Epoch: 0 [ 640256/3655823 (18%)] Loss: 1.8137 (2.257) Data (t): 0.000 Batch (t): 0.407, 630.226/s LR: 0.000001 Logit Scale: 100.000 - V4
142
+ 2024-09-02,19:55:04 | INFO | Train Epoch: 0 [ 665856/3655823 (18%)] Loss: 2.0148 (2.248) Data (t): 0.000 Batch (t): 0.407, 627.805/s LR: 0.000001 Logit Scale: 100.000 - V4
143
+ 2024-09-02,19:55:45 | INFO | Train Epoch: 0 [ 691456/3655823 (19%)] Loss: 1.9422 (2.238) Data (t): 0.000 Batch (t): 0.407, 627.260/s LR: 0.000001 Logit Scale: 100.000 - V4
144
+ 2024-09-02,19:56:26 | INFO | Train Epoch: 0 [ 717056/3655823 (20%)] Loss: 1.7487 (2.221) Data (t): 0.000 Batch (t): 0.407, 627.539/s LR: 0.000001 Logit Scale: 100.000 - V4
145
+ 2024-09-02,19:57:07 | INFO | Train Epoch: 0 [ 742656/3655823 (20%)] Loss: 1.7226 (2.204) Data (t): 0.000 Batch (t): 0.412, 629.291/s LR: 0.000001 Logit Scale: 100.000 - V4
146
+ 2024-09-02,19:57:48 | INFO | Train Epoch: 0 [ 768256/3655823 (21%)] Loss: 1.9393 (2.196) Data (t): 0.000 Batch (t): 0.409, 630.002/s LR: 0.000001 Logit Scale: 100.000 - V4
147
+ 2024-09-02,19:58:28 | INFO | Train Epoch: 0 [ 793856/3655823 (22%)] Loss: 1.7361 (2.181) Data (t): 0.000 Batch (t): 0.407, 628.506/s LR: 0.000001 Logit Scale: 100.000 - V4
148
+ 2024-09-02,19:59:09 | INFO | Train Epoch: 0 [ 819456/3655823 (22%)] Loss: 1.9322 (2.174) Data (t): 0.001 Batch (t): 0.407, 629.090/s LR: 0.000001 Logit Scale: 99.999 - V4
149
+ 2024-09-02,19:59:50 | INFO | Train Epoch: 0 [ 845056/3655823 (23%)] Loss: 1.9691 (2.168) Data (t): 0.000 Batch (t): 0.407, 628.442/s LR: 0.000001 Logit Scale: 100.000 - V4
150
+ 2024-09-02,20:00:31 | INFO | Train Epoch: 0 [ 870656/3655823 (24%)] Loss: 1.7442 (2.156) Data (t): 0.000 Batch (t): 0.410, 628.569/s LR: 0.000001 Logit Scale: 100.000 - V4
151
+ 2024-09-02,20:01:12 | INFO | Train Epoch: 0 [ 896256/3655823 (25%)] Loss: 1.5916 (2.140) Data (t): 0.000 Batch (t): 0.414, 630.010/s LR: 0.000001 Logit Scale: 100.000 - V4
152
+ 2024-09-02,20:01:53 | INFO | Train Epoch: 0 [ 921856/3655823 (25%)] Loss: 1.5470 (2.124) Data (t): 0.000 Batch (t): 0.407, 627.363/s LR: 0.000001 Logit Scale: 100.000 - V4
153
+ 2024-09-02,20:02:34 | INFO | Train Epoch: 0 [ 947456/3655823 (26%)] Loss: 1.9370 (2.119) Data (t): 0.000 Batch (t): 0.407, 631.444/s LR: 0.000001 Logit Scale: 100.000 - V4
154
+ 2024-09-02,20:03:14 | INFO | Train Epoch: 0 [ 973056/3655823 (27%)] Loss: 1.6025 (2.106) Data (t): 0.000 Batch (t): 0.407, 627.173/s LR: 0.000001 Logit Scale: 100.000 - V4
155
+ 2024-09-02,20:03:55 | INFO | Train Epoch: 0 [ 998656/3655823 (27%)] Loss: 1.9974 (2.103) Data (t): 0.000 Batch (t): 0.410, 627.276/s LR: 0.000001 Logit Scale: 99.999 - V4
156
+ 2024-09-02,20:04:37 | INFO | Train Epoch: 0 [1024256/3655823 (28%)] Loss: 1.7604 (2.095) Data (t): 0.000 Batch (t): 0.412, 629.020/s LR: 0.000001 Logit Scale: 100.000 - V4
157
+ 2024-09-02,20:05:17 | INFO | Train Epoch: 0 [1049856/3655823 (29%)] Loss: 1.7433 (2.086) Data (t): 0.001 Batch (t): 0.407, 630.089/s LR: 0.000001 Logit Scale: 100.000 - V4
158
+ 2024-09-02,20:05:58 | INFO | Train Epoch: 0 [1075456/3655823 (29%)] Loss: 1.7398 (2.078) Data (t): 0.001 Batch (t): 0.407, 629.147/s LR: 0.000001 Logit Scale: 100.000 - V4
159
+ 2024-09-02,20:06:39 | INFO | Train Epoch: 0 [1101056/3655823 (30%)] Loss: 1.7736 (2.071) Data (t): 0.000 Batch (t): 0.407, 627.362/s LR: 0.000001 Logit Scale: 100.000 - V4
160
+ 2024-09-02,20:07:20 | INFO | Train Epoch: 0 [1126656/3655823 (31%)] Loss: 1.7673 (2.065) Data (t): 0.001 Batch (t): 0.407, 628.700/s LR: 0.000001 Logit Scale: 100.000 - V4
161
+ 2024-09-02,20:08:01 | INFO | Train Epoch: 0 [1152256/3655823 (32%)] Loss: 1.7794 (2.058) Data (t): 0.001 Batch (t): 0.416, 629.627/s LR: 0.000001 Logit Scale: 100.000 - V4
162
+ 2024-09-02,20:08:42 | INFO | Train Epoch: 0 [1177856/3655823 (32%)] Loss: 1.7664 (2.052) Data (t): 0.001 Batch (t): 0.407, 627.477/s LR: 0.000001 Logit Scale: 100.000 - V4
163
+ 2024-09-02,20:09:23 | INFO | Train Epoch: 0 [1203456/3655823 (33%)] Loss: 1.9198 (2.049) Data (t): 0.000 Batch (t): 0.407, 628.791/s LR: 0.000001 Logit Scale: 100.000 - V4
164
+ 2024-09-02,20:10:03 | INFO | Train Epoch: 0 [1229056/3655823 (34%)] Loss: 1.7759 (2.044) Data (t): 0.000 Batch (t): 0.407, 628.510/s LR: 0.000001 Logit Scale: 100.000 - V4
165
+ 2024-09-02,20:10:44 | INFO | Train Epoch: 0 [1254656/3655823 (34%)] Loss: 1.5365 (2.034) Data (t): 0.001 Batch (t): 0.408, 628.667/s LR: 0.000001 Logit Scale: 100.000 - V4
166
+ 2024-09-02,20:11:26 | INFO | Train Epoch: 0 [1280256/3655823 (35%)] Loss: 1.8680 (2.030) Data (t): 0.000 Batch (t): 0.415, 627.007/s LR: 0.000001 Logit Scale: 100.000 - V4
167
+ 2024-09-02,20:12:06 | INFO | Train Epoch: 0 [1305856/3655823 (36%)] Loss: 1.6181 (2.022) Data (t): 0.001 Batch (t): 0.407, 627.473/s LR: 0.000001 Logit Scale: 100.000 - V4
168
+ 2024-09-02,20:12:47 | INFO | Train Epoch: 0 [1331456/3655823 (36%)] Loss: 1.8238 (2.019) Data (t): 0.000 Batch (t): 0.407, 628.349/s LR: 0.000001 Logit Scale: 100.000 - V4
169
+ 2024-09-02,20:13:28 | INFO | Train Epoch: 0 [1357056/3655823 (37%)] Loss: 1.5133 (2.009) Data (t): 0.000 Batch (t): 0.408, 629.197/s LR: 0.000001 Logit Scale: 100.000 - V4
170
+ 2024-09-02,20:14:09 | INFO | Train Epoch: 0 [1382656/3655823 (38%)] Loss: 1.6781 (2.003) Data (t): 0.000 Batch (t): 0.408, 629.223/s LR: 0.000001 Logit Scale: 100.000 - V4
171
+ 2024-09-02,20:14:50 | INFO | Train Epoch: 0 [1408256/3655823 (39%)] Loss: 1.7688 (1.999) Data (t): 0.000 Batch (t): 0.417, 626.672/s LR: 0.000001 Logit Scale: 100.000 - V4
172
+ 2024-09-02,20:15:31 | INFO | Train Epoch: 0 [1433856/3655823 (39%)] Loss: 1.8756 (1.997) Data (t): 0.000 Batch (t): 0.408, 628.428/s LR: 0.000001 Logit Scale: 100.000 - V4
173
+ 2024-09-02,20:16:12 | INFO | Train Epoch: 0 [1459456/3655823 (40%)] Loss: 1.7711 (1.993) Data (t): 0.000 Batch (t): 0.408, 628.928/s LR: 0.000001 Logit Scale: 100.000 - V4
174
+ 2024-09-02,20:16:53 | INFO | Train Epoch: 0 [1485056/3655823 (41%)] Loss: 1.7487 (1.989) Data (t): 0.000 Batch (t): 0.408, 628.357/s LR: 0.000001 Logit Scale: 100.000 - V4
175
+ 2024-09-02,20:17:33 | INFO | Train Epoch: 0 [1510656/3655823 (41%)] Loss: 1.5975 (1.982) Data (t): 0.000 Batch (t): 0.408, 628.963/s LR: 0.000001 Logit Scale: 100.000 - V4
176
+ 2024-09-02,20:18:15 | INFO | Train Epoch: 0 [1536256/3655823 (42%)] Loss: 1.3927 (1.973) Data (t): 0.000 Batch (t): 0.414, 628.931/s LR: 0.000001 Logit Scale: 100.000 - V4
177
+ 2024-09-02,20:18:56 | INFO | Train Epoch: 0 [1561856/3655823 (43%)] Loss: 1.6046 (1.967) Data (t): 0.000 Batch (t): 0.409, 629.768/s LR: 0.000001 Logit Scale: 100.000 - V4
178
+ 2024-09-02,20:19:36 | INFO | Train Epoch: 0 [1587456/3655823 (43%)] Loss: 1.6262 (1.961) Data (t): 0.000 Batch (t): 0.408, 620.834/s LR: 0.000001 Logit Scale: 99.999 - V4
179
+ 2024-09-02,20:20:17 | INFO | Train Epoch: 0 [1613056/3655823 (44%)] Loss: 1.5664 (1.955) Data (t): 0.000 Batch (t): 0.407, 629.223/s LR: 0.000001 Logit Scale: 100.000 - V4
180
+ 2024-09-02,20:20:58 | INFO | Train Epoch: 0 [1638656/3655823 (45%)] Loss: 1.7675 (1.952) Data (t): 0.000 Batch (t): 0.408, 628.223/s LR: 0.000001 Logit Scale: 100.000 - V4
181
+ 2024-09-02,20:21:39 | INFO | Train Epoch: 0 [1664256/3655823 (46%)] Loss: 1.6215 (1.947) Data (t): 0.000 Batch (t): 0.410, 628.139/s LR: 0.000001 Logit Scale: 100.000 - V4
182
+ 2024-09-02,20:22:20 | INFO | Train Epoch: 0 [1689856/3655823 (46%)] Loss: 1.9395 (1.947) Data (t): 0.000 Batch (t): 0.412, 627.024/s LR: 0.000001 Logit Scale: 100.000 - V4
183
+ 2024-09-02,20:23:01 | INFO | Train Epoch: 0 [1715456/3655823 (47%)] Loss: 1.5725 (1.942) Data (t): 0.000 Batch (t): 0.408, 627.084/s LR: 0.000001 Logit Scale: 100.000 - V4
184
+ 2024-09-02,20:23:42 | INFO | Train Epoch: 0 [1741056/3655823 (48%)] Loss: 1.4810 (1.935) Data (t): 0.000 Batch (t): 0.408, 627.155/s LR: 0.000001 Logit Scale: 100.000 - V4
185
+ 2024-09-02,20:24:22 | INFO | Train Epoch: 0 [1766656/3655823 (48%)] Loss: 1.7223 (1.932) Data (t): 0.000 Batch (t): 0.408, 627.320/s LR: 0.000001 Logit Scale: 100.000 - V4
186
+ 2024-09-02,20:25:03 | INFO | Train Epoch: 0 [1792256/3655823 (49%)] Loss: 1.4262 (1.925) Data (t): 0.000 Batch (t): 0.410, 628.574/s LR: 0.000001 Logit Scale: 100.000 - V4
187
+ 2024-09-02,20:25:45 | INFO | Train Epoch: 0 [1817856/3655823 (50%)] Loss: 1.5906 (1.920) Data (t): 0.000 Batch (t): 0.415, 628.732/s LR: 0.000001 Logit Scale: 100.000 - V4
188
+ 2024-09-02,20:26:26 | INFO | Train Epoch: 0 [1843456/3655823 (50%)] Loss: 1.8560 (1.919) Data (t): 0.000 Batch (t): 0.408, 630.705/s LR: 0.000001 Logit Scale: 100.000 - V4
189
+ 2024-09-02,20:27:06 | INFO | Train Epoch: 0 [1869056/3655823 (51%)] Loss: 1.5540 (1.914) Data (t): 0.000 Batch (t): 0.408, 627.215/s LR: 0.000001 Logit Scale: 100.000 - V4
190
+ 2024-09-02,20:27:47 | INFO | Train Epoch: 0 [1894656/3655823 (52%)] Loss: 1.5654 (1.910) Data (t): 0.000 Batch (t): 0.407, 626.752/s LR: 0.000001 Logit Scale: 100.000 - V4
191
+ 2024-09-02,20:28:28 | INFO | Train Epoch: 0 [1920256/3655823 (53%)] Loss: 1.4253 (1.903) Data (t): 0.000 Batch (t): 0.407, 628.087/s LR: 0.000001 Logit Scale: 100.000 - V4
192
+ 2024-09-02,20:29:09 | INFO | Train Epoch: 0 [1945856/3655823 (53%)] Loss: 1.6971 (1.901) Data (t): 0.000 Batch (t): 0.414, 629.113/s LR: 0.000001 Logit Scale: 100.000 - V4
193
+ 2024-09-02,20:29:50 | INFO | Train Epoch: 0 [1971456/3655823 (54%)] Loss: 1.7810 (1.899) Data (t): 0.000 Batch (t): 0.407, 627.489/s LR: 0.000001 Logit Scale: 100.000 - V4
194
+ 2024-09-02,20:30:31 | INFO | Train Epoch: 0 [1997056/3655823 (55%)] Loss: 1.5387 (1.895) Data (t): 0.000 Batch (t): 0.407, 627.345/s LR: 0.000001 Logit Scale: 100.000 - V4
195
+ 2024-09-02,20:31:11 | INFO | Train Epoch: 0 [2022656/3655823 (55%)] Loss: 1.4292 (1.889) Data (t): 0.000 Batch (t): 0.408, 628.520/s LR: 0.000001 Logit Scale: 100.000 - V4
196
+ 2024-09-02,20:31:52 | INFO | Train Epoch: 0 [2048256/3655823 (56%)] Loss: 1.7634 (1.887) Data (t): 0.000 Batch (t): 0.407, 628.523/s LR: 0.000001 Logit Scale: 100.000 - V4
197
+ 2024-09-02,20:32:34 | INFO | Train Epoch: 0 [2073856/3655823 (57%)] Loss: 1.4560 (1.882) Data (t): 0.000 Batch (t): 0.414, 629.622/s LR: 0.000001 Logit Scale: 100.000 - V4
198
+ 2024-09-02,20:33:14 | INFO | Train Epoch: 0 [2099456/3655823 (57%)] Loss: 1.4923 (1.877) Data (t): 0.000 Batch (t): 0.407, 628.921/s LR: 0.000001 Logit Scale: 100.000 - V4
199
+ 2024-09-02,20:33:55 | INFO | Train Epoch: 0 [2125056/3655823 (58%)] Loss: 1.6012 (1.874) Data (t): 0.000 Batch (t): 0.407, 628.008/s LR: 0.000001 Logit Scale: 100.000 - V4
200
+ 2024-09-02,20:34:36 | INFO | Train Epoch: 0 [2150656/3655823 (59%)] Loss: 1.5860 (1.871) Data (t): 0.000 Batch (t): 0.407, 629.937/s LR: 0.000001 Logit Scale: 100.000 - V4
201
+ 2024-09-02,20:35:17 | INFO | Train Epoch: 0 [2176256/3655823 (60%)] Loss: 1.6384 (1.868) Data (t): 0.000 Batch (t): 0.407, 628.786/s LR: 0.000001 Logit Scale: 100.000 - V4
202
+ 2024-09-02,20:35:58 | INFO | Train Epoch: 0 [2201856/3655823 (60%)] Loss: 1.7724 (1.867) Data (t): 0.000 Batch (t): 0.416, 629.551/s LR: 0.000001 Logit Scale: 100.000 - V4
203
+ 2024-09-02,20:36:39 | INFO | Train Epoch: 0 [2227456/3655823 (61%)] Loss: 1.6790 (1.865) Data (t): 0.000 Batch (t): 0.408, 627.271/s LR: 0.000001 Logit Scale: 100.000 - V4
204
+ 2024-09-02,20:37:20 | INFO | Train Epoch: 0 [2253056/3655823 (62%)] Loss: 1.5074 (1.861) Data (t): 0.000 Batch (t): 0.408, 629.176/s LR: 0.000001 Logit Scale: 100.000 - V4
205
+ 2024-09-02,20:38:00 | INFO | Train Epoch: 0 [2278656/3655823 (62%)] Loss: 1.5656 (1.857) Data (t): 0.000 Batch (t): 0.408, 624.286/s LR: 0.000001 Logit Scale: 100.000 - V4
206
+ 2024-09-02,20:38:41 | INFO | Train Epoch: 0 [2304256/3655823 (63%)] Loss: 1.3563 (1.852) Data (t): 0.000 Batch (t): 0.409, 624.346/s LR: 0.000001 Logit Scale: 100.000 - V4
207
+ 2024-09-02,20:39:23 | INFO | Train Epoch: 0 [2329856/3655823 (64%)] Loss: 1.5710 (1.849) Data (t): 0.000 Batch (t): 0.417, 628.211/s LR: 0.000001 Logit Scale: 100.000 - V4
208
+ 2024-09-02,20:40:04 | INFO | Train Epoch: 0 [2355456/3655823 (64%)] Loss: 1.7251 (1.847) Data (t): 0.000 Batch (t): 0.408, 626.981/s LR: 0.000001 Logit Scale: 100.000 - V4
209
+ 2024-09-02,20:40:45 | INFO | Train Epoch: 0 [2381056/3655823 (65%)] Loss: 1.5758 (1.845) Data (t): 0.000 Batch (t): 0.408, 628.556/s LR: 0.000001 Logit Scale: 100.000 - V4
210
+ 2024-09-02,20:41:25 | INFO | Train Epoch: 0 [2406656/3655823 (66%)] Loss: 1.6105 (1.842) Data (t): 0.000 Batch (t): 0.408, 627.109/s LR: 0.000001 Logit Scale: 100.000 - V4
211
+ 2024-09-02,20:42:06 | INFO | Train Epoch: 0 [2432256/3655823 (67%)] Loss: 1.7807 (1.841) Data (t): 0.000 Batch (t): 0.407, 628.508/s LR: 0.000001 Logit Scale: 100.000 - V4
212
+ 2024-09-02,20:42:47 | INFO | Train Epoch: 0 [2457856/3655823 (67%)] Loss: 1.5765 (1.839) Data (t): 0.000 Batch (t): 0.408, 622.872/s LR: 0.000001 Logit Scale: 100.000 - V4
213
+ 2024-09-02,20:43:29 | INFO | Train Epoch: 0 [2483456/3655823 (68%)] Loss: 1.9540 (1.840) Data (t): 0.000 Batch (t): 0.418, 628.559/s LR: 0.000001 Logit Scale: 100.000 - V4
214
+ 2024-09-02,20:44:10 | INFO | Train Epoch: 0 [2509056/3655823 (69%)] Loss: 1.5541 (1.837) Data (t): 0.000 Batch (t): 0.408, 627.764/s LR: 0.000001 Logit Scale: 100.000 - V4
215
+ 2024-09-02,20:44:50 | INFO | Train Epoch: 0 [2534656/3655823 (69%)] Loss: 1.8295 (1.837) Data (t): 0.000 Batch (t): 0.408, 626.775/s LR: 0.000001 Logit Scale: 100.000 - V4
216
+ 2024-09-02,20:45:31 | INFO | Train Epoch: 0 [2560256/3655823 (70%)] Loss: 1.6062 (1.835) Data (t): 0.000 Batch (t): 0.408, 627.681/s LR: 0.000001 Logit Scale: 100.000 - V4
217
+ 2024-09-02,20:46:12 | INFO | Train Epoch: 0 [2585856/3655823 (71%)] Loss: 1.5546 (1.832) Data (t): 0.000 Batch (t): 0.408, 627.299/s LR: 0.000001 Logit Scale: 100.000 - V4
218
+ 2024-09-02,20:46:54 | INFO | Train Epoch: 0 [2611456/3655823 (71%)] Loss: 1.6402 (1.830) Data (t): 0.000 Batch (t): 0.416, 628.529/s LR: 0.000001 Logit Scale: 100.000 - V4
219
+ 2024-09-02,20:47:34 | INFO | Train Epoch: 0 [2637056/3655823 (72%)] Loss: 1.6546 (1.828) Data (t): 0.000 Batch (t): 0.407, 629.015/s LR: 0.000001 Logit Scale: 100.000 - V4
220
+ 2024-09-02,20:48:15 | INFO | Train Epoch: 0 [2662656/3655823 (73%)] Loss: 1.7634 (1.828) Data (t): 0.000 Batch (t): 0.408, 627.676/s LR: 0.000001 Logit Scale: 100.000 - V4
221
+ 2024-09-02,20:48:56 | INFO | Train Epoch: 0 [2688256/3655823 (74%)] Loss: 1.5918 (1.826) Data (t): 0.000 Batch (t): 0.407, 625.348/s LR: 0.000001 Logit Scale: 100.000 - V4
222
+ 2024-09-02,20:49:37 | INFO | Train Epoch: 0 [2713856/3655823 (74%)] Loss: 1.5016 (1.823) Data (t): 0.000 Batch (t): 0.407, 630.085/s LR: 0.000001 Logit Scale: 100.000 - V4
223
+ 2024-09-02,20:50:18 | INFO | Train Epoch: 0 [2739456/3655823 (75%)] Loss: 1.7166 (1.822) Data (t): 0.000 Batch (t): 0.417, 629.134/s LR: 0.000001 Logit Scale: 100.000 - V4
224
+ 2024-09-02,20:50:59 | INFO | Train Epoch: 0 [2765056/3655823 (76%)] Loss: 1.7674 (1.821) Data (t): 0.000 Batch (t): 0.408, 626.911/s LR: 0.000001 Logit Scale: 100.000 - V4
225
+ 2024-09-02,20:51:40 | INFO | Train Epoch: 0 [2790656/3655823 (76%)] Loss: 1.7379 (1.820) Data (t): 0.000 Batch (t): 0.408, 628.030/s LR: 0.000001 Logit Scale: 100.000 - V4
226
+ 2024-09-02,20:52:21 | INFO | Train Epoch: 0 [2816256/3655823 (77%)] Loss: 1.9221 (1.821) Data (t): 0.000 Batch (t): 0.408, 626.829/s LR: 0.000001 Logit Scale: 100.000 - V4
227
+ 2024-09-02,20:53:01 | INFO | Train Epoch: 0 [2841856/3655823 (78%)] Loss: 1.8331 (1.821) Data (t): 0.000 Batch (t): 0.408, 628.023/s LR: 0.000001 Logit Scale: 100.000 - V4
228
+ 2024-09-02,20:53:43 | INFO | Train Epoch: 0 [2867456/3655823 (78%)] Loss: 1.4906 (1.818) Data (t): 0.000 Batch (t): 0.417, 630.973/s LR: 0.000001 Logit Scale: 100.000 - V4
229
+ 2024-09-02,20:54:24 | INFO | Train Epoch: 0 [2893056/3655823 (79%)] Loss: 1.6156 (1.817) Data (t): 0.000 Batch (t): 0.408, 627.483/s LR: 0.000001 Logit Scale: 100.000 - V4
230
+ 2024-09-02,20:55:05 | INFO | Train Epoch: 0 [2918656/3655823 (80%)] Loss: 1.7688 (1.816) Data (t): 0.000 Batch (t): 0.408, 626.426/s LR: 0.000001 Logit Scale: 100.000 - V4
231
+ 2024-09-02,20:55:45 | INFO | Train Epoch: 0 [2944256/3655823 (81%)] Loss: 1.6564 (1.815) Data (t): 0.000 Batch (t): 0.408, 628.140/s LR: 0.000001 Logit Scale: 100.000 - V4
232
+ 2024-09-02,20:56:26 | INFO | Train Epoch: 0 [2969856/3655823 (81%)] Loss: 1.5896 (1.813) Data (t): 0.000 Batch (t): 0.408, 628.286/s LR: 0.000001 Logit Scale: 100.000 - V4
233
+ 2024-09-02,20:57:08 | INFO | Train Epoch: 0 [2995456/3655823 (82%)] Loss: 1.7447 (1.812) Data (t): 0.000 Batch (t): 0.415, 631.621/s LR: 0.000001 Logit Scale: 100.000 - V4
234
+ 2024-09-02,20:57:49 | INFO | Train Epoch: 0 [3021056/3655823 (83%)] Loss: 1.7545 (1.812) Data (t): 0.000 Batch (t): 0.410, 628.349/s LR: 0.000001 Logit Scale: 100.000 - V4
235
+ 2024-09-02,20:58:29 | INFO | Train Epoch: 0 [3046656/3655823 (83%)] Loss: 1.5513 (1.810) Data (t): 0.000 Batch (t): 0.407, 626.934/s LR: 0.000001 Logit Scale: 100.000 - V4
236
+ 2024-09-02,20:59:10 | INFO | Train Epoch: 0 [3072256/3655823 (84%)] Loss: 1.6052 (1.808) Data (t): 0.000 Batch (t): 0.407, 628.211/s LR: 0.000001 Logit Scale: 100.000 - V4
237
+ 2024-09-02,20:59:51 | INFO | Train Epoch: 0 [3097856/3655823 (85%)] Loss: 1.5626 (1.806) Data (t): 0.000 Batch (t): 0.408, 628.421/s LR: 0.000001 Logit Scale: 100.000 - V4
238
+ 2024-09-02,21:00:32 | INFO | Train Epoch: 0 [3123456/3655823 (85%)] Loss: 1.3925 (1.803) Data (t): 0.000 Batch (t): 0.410, 627.462/s LR: 0.000001 Logit Scale: 100.000 - V4
239
+ 2024-09-02,21:01:13 | INFO | Train Epoch: 0 [3149056/3655823 (86%)] Loss: 1.4453 (1.800) Data (t): 0.000 Batch (t): 0.415, 626.779/s LR: 0.000001 Logit Scale: 100.000 - V4
240
+ 2024-09-02,21:01:54 | INFO | Train Epoch: 0 [3174656/3655823 (87%)] Loss: 1.7288 (1.799) Data (t): 0.000 Batch (t): 0.408, 626.773/s LR: 0.000001 Logit Scale: 100.000 - V4
241
+ 2024-09-02,21:02:35 | INFO | Train Epoch: 0 [3200256/3655823 (88%)] Loss: 1.6877 (1.798) Data (t): 0.000 Batch (t): 0.408, 627.124/s LR: 0.000001 Logit Scale: 100.000 - V4
242
+ 2024-09-02,21:03:16 | INFO | Train Epoch: 0 [3225856/3655823 (88%)] Loss: 1.5236 (1.796) Data (t): 0.000 Batch (t): 0.408, 628.616/s LR: 0.000001 Logit Scale: 100.000 - V4
243
+ 2024-09-02,21:03:56 | INFO | Train Epoch: 0 [3251456/3655823 (89%)] Loss: 1.5898 (1.794) Data (t): 0.000 Batch (t): 0.408, 627.820/s LR: 0.000001 Logit Scale: 100.000 - V4
244
+ 2024-09-02,21:04:38 | INFO | Train Epoch: 0 [3277056/3655823 (90%)] Loss: 1.7068 (1.794) Data (t): 0.000 Batch (t): 0.417, 629.565/s LR: 0.000001 Logit Scale: 100.000 - V4
245
+ 2024-09-02,21:05:19 | INFO | Train Epoch: 0 [3302656/3655823 (90%)] Loss: 1.9002 (1.795) Data (t): 0.000 Batch (t): 0.408, 626.629/s LR: 0.000001 Logit Scale: 100.000 - V4
246
+ 2024-09-02,21:06:00 | INFO | Train Epoch: 0 [3328256/3655823 (91%)] Loss: 1.5467 (1.793) Data (t): 0.000 Batch (t): 0.408, 627.183/s LR: 0.000001 Logit Scale: 100.000 - V4
247
+ 2024-09-02,21:06:41 | INFO | Train Epoch: 0 [3353856/3655823 (92%)] Loss: 1.4131 (1.790) Data (t): 0.000 Batch (t): 0.408, 629.609/s LR: 0.000001 Logit Scale: 100.000 - V4
248
+ 2024-09-02,21:07:21 | INFO | Train Epoch: 0 [3379456/3655823 (92%)] Loss: 1.4317 (1.787) Data (t): 0.000 Batch (t): 0.408, 628.202/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:08:03 | INFO | Train Epoch: 0 [3405056/3655823 (93%)] Loss: 1.4983 (1.785) Data (t): 0.000 Batch (t): 0.416, 629.785/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:08:44 | INFO | Train Epoch: 0 [3430656/3655823 (94%)] Loss: 1.5149 (1.783) Data (t): 0.000 Batch (t): 0.408, 628.086/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:09:24 | INFO | Train Epoch: 0 [3456256/3655823 (95%)] Loss: 1.6370 (1.782) Data (t): 0.000 Batch (t): 0.408, 627.030/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:10:05 | INFO | Train Epoch: 0 [3481856/3655823 (95%)] Loss: 1.4425 (1.779) Data (t): 0.000 Batch (t): 0.408, 628.987/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:10:46 | INFO | Train Epoch: 0 [3507456/3655823 (96%)] Loss: 1.8596 (1.780) Data (t): 0.000 Batch (t): 0.407, 627.770/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:11:28 | INFO | Train Epoch: 0 [3533056/3655823 (97%)] Loss: 1.5420 (1.778) Data (t): 0.000 Batch (t): 0.417, 630.324/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:12:08 | INFO | Train Epoch: 0 [3558656/3655823 (97%)] Loss: 1.6101 (1.777) Data (t): 0.000 Batch (t): 0.407, 627.708/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:12:49 | INFO | Train Epoch: 0 [3584256/3655823 (98%)] Loss: 1.8772 (1.778) Data (t): 0.000 Batch (t): 0.408, 628.813/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:13:30 | INFO | Train Epoch: 0 [3609856/3655823 (99%)] Loss: 1.5998 (1.777) Data (t): 0.000 Batch (t): 0.408, 626.051/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:14:11 | INFO | Train Epoch: 0 [3635456/3655823 (99%)] Loss: 1.4080 (1.774) Data (t): 0.000 Batch (t): 0.407, 629.103/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:14:43 | INFO | Train Epoch: 0 [3655680/3655823 (100%)] Loss: 1.6069 (1.773) Data (t): 0.001 Batch (t): 0.412, 634.622/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:14:52 | INFO | Start epoch 1
+ 2024-09-02,21:14:54 | INFO | Train Epoch: 1 [ 256/3655823 (0%)] Loss: 1.5133 (1.513) Data (t): 1.453 Batch (t): 1.874, 136.629/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:15:35 | INFO | Train Epoch: 1 [ 25856/3655823 (1%)] Loss: 1.5142 (1.514) Data (t): 0.000 Batch (t): 0.410, 630.191/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:16:16 | INFO | Train Epoch: 1 [ 51456/3655823 (1%)] Loss: 1.5544 (1.527) Data (t): 0.001 Batch (t): 0.407, 627.702/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:16:56 | INFO | Train Epoch: 1 [ 77056/3655823 (2%)] Loss: 1.7478 (1.582) Data (t): 0.000 Batch (t): 0.407, 629.082/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:17:37 | INFO | Train Epoch: 1 [ 102656/3655823 (3%)] Loss: 1.3177 (1.529) Data (t): 0.000 Batch (t): 0.408, 628.874/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:18:18 | INFO | Train Epoch: 1 [ 128256/3655823 (4%)] Loss: 1.6253 (1.545) Data (t): 0.000 Batch (t): 0.410, 630.915/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:19:00 | INFO | Train Epoch: 1 [ 153856/3655823 (4%)] Loss: 1.3942 (1.524) Data (t): 0.000 Batch (t): 0.414, 625.376/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:19:40 | INFO | Train Epoch: 1 [ 179456/3655823 (5%)] Loss: 1.4307 (1.512) Data (t): 0.000 Batch (t): 0.408, 628.076/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:20:21 | INFO | Train Epoch: 1 [ 205056/3655823 (6%)] Loss: 1.5396 (1.515) Data (t): 0.000 Batch (t): 0.408, 627.494/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:21:02 | INFO | Train Epoch: 1 [ 230656/3655823 (6%)] Loss: 1.7573 (1.539) Data (t): 0.000 Batch (t): 0.407, 627.089/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:21:43 | INFO | Train Epoch: 1 [ 256256/3655823 (7%)] Loss: 1.5486 (1.540) Data (t): 0.000 Batch (t): 0.407, 628.877/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:22:24 | INFO | Train Epoch: 1 [ 281856/3655823 (8%)] Loss: 1.3610 (1.525) Data (t): 0.000 Batch (t): 0.417, 628.258/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:23:05 | INFO | Train Epoch: 1 [ 307456/3655823 (8%)] Loss: 1.7699 (1.544) Data (t): 0.000 Batch (t): 0.408, 629.525/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:23:46 | INFO | Train Epoch: 1 [ 333056/3655823 (9%)] Loss: 1.7270 (1.557) Data (t): 0.000 Batch (t): 0.408, 628.146/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:24:27 | INFO | Train Epoch: 1 [ 358656/3655823 (10%)] Loss: 1.5177 (1.555) Data (t): 0.001 Batch (t): 0.408, 627.366/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:25:07 | INFO | Train Epoch: 1 [ 384256/3655823 (11%)] Loss: 1.6982 (1.564) Data (t): 0.000 Batch (t): 0.407, 627.758/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:25:49 | INFO | Train Epoch: 1 [ 409856/3655823 (11%)] Loss: 1.5244 (1.561) Data (t): 0.000 Batch (t): 0.416, 628.672/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:26:30 | INFO | Train Epoch: 1 [ 435456/3655823 (12%)] Loss: 1.4700 (1.556) Data (t): 0.000 Batch (t): 0.407, 628.596/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:27:10 | INFO | Train Epoch: 1 [ 461056/3655823 (13%)] Loss: 1.4835 (1.552) Data (t): 0.000 Batch (t): 0.407, 628.762/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:27:51 | INFO | Train Epoch: 1 [ 486656/3655823 (13%)] Loss: 1.5312 (1.551) Data (t): 0.000 Batch (t): 0.407, 629.042/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:28:32 | INFO | Train Epoch: 1 [ 512256/3655823 (14%)] Loss: 1.3163 (1.540) Data (t): 0.000 Batch (t): 0.407, 628.484/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:29:13 | INFO | Train Epoch: 1 [ 537856/3655823 (15%)] Loss: 1.6038 (1.543) Data (t): 0.000 Batch (t): 0.416, 627.924/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:29:54 | INFO | Train Epoch: 1 [ 563456/3655823 (15%)] Loss: 1.4335 (1.538) Data (t): 0.000 Batch (t): 0.407, 628.898/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:30:35 | INFO | Train Epoch: 1 [ 589056/3655823 (16%)] Loss: 1.6874 (1.544) Data (t): 0.000 Batch (t): 0.407, 627.771/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:31:16 | INFO | Train Epoch: 1 [ 614656/3655823 (17%)] Loss: 1.6103 (1.547) Data (t): 0.000 Batch (t): 0.407, 622.736/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:31:56 | INFO | Train Epoch: 1 [ 640256/3655823 (18%)] Loss: 1.4827 (1.545) Data (t): 0.000 Batch (t): 0.407, 627.594/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:32:38 | INFO | Train Epoch: 1 [ 665856/3655823 (18%)] Loss: 1.5158 (1.544) Data (t): 0.000 Batch (t): 0.416, 631.575/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:33:19 | INFO | Train Epoch: 1 [ 691456/3655823 (19%)] Loss: 1.4753 (1.541) Data (t): 0.000 Batch (t): 0.407, 626.767/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:33:59 | INFO | Train Epoch: 1 [ 717056/3655823 (20%)] Loss: 1.6635 (1.545) Data (t): 0.000 Batch (t): 0.407, 628.659/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:34:40 | INFO | Train Epoch: 1 [ 742656/3655823 (20%)] Loss: 1.5481 (1.545) Data (t): 0.000 Batch (t): 0.407, 629.260/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:35:21 | INFO | Train Epoch: 1 [ 768256/3655823 (21%)] Loss: 1.3125 (1.538) Data (t): 0.000 Batch (t): 0.408, 627.366/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:36:02 | INFO | Train Epoch: 1 [ 793856/3655823 (22%)] Loss: 1.5203 (1.537) Data (t): 0.000 Batch (t): 0.414, 626.264/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:36:43 | INFO | Train Epoch: 1 [ 819456/3655823 (22%)] Loss: 1.6740 (1.542) Data (t): 0.000 Batch (t): 0.410, 628.629/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:37:24 | INFO | Train Epoch: 1 [ 845056/3655823 (23%)] Loss: 1.4071 (1.538) Data (t): 0.000 Batch (t): 0.407, 629.449/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:38:05 | INFO | Train Epoch: 1 [ 870656/3655823 (24%)] Loss: 1.5560 (1.538) Data (t): 0.000 Batch (t): 0.407, 630.243/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:38:45 | INFO | Train Epoch: 1 [ 896256/3655823 (25%)] Loss: 1.2346 (1.530) Data (t): 0.000 Batch (t): 0.407, 627.806/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:39:26 | INFO | Train Epoch: 1 [ 921856/3655823 (25%)] Loss: 1.4627 (1.528) Data (t): 0.000 Batch (t): 0.407, 630.867/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:40:08 | INFO | Train Epoch: 1 [ 947456/3655823 (26%)] Loss: 1.2894 (1.522) Data (t): 0.000 Batch (t): 0.417, 629.256/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:40:49 | INFO | Train Epoch: 1 [ 973056/3655823 (27%)] Loss: 1.5275 (1.522) Data (t): 0.000 Batch (t): 0.408, 626.342/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:41:29 | INFO | Train Epoch: 1 [ 998656/3655823 (27%)] Loss: 1.5467 (1.522) Data (t): 0.000 Batch (t): 0.407, 627.584/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:42:10 | INFO | Train Epoch: 1 [1024256/3655823 (28%)] Loss: 1.4740 (1.521) Data (t): 0.000 Batch (t): 0.408, 628.038/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:42:51 | INFO | Train Epoch: 1 [1049856/3655823 (29%)] Loss: 1.3797 (1.518) Data (t): 0.000 Batch (t): 0.407, 627.314/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:43:33 | INFO | Train Epoch: 1 [1075456/3655823 (29%)] Loss: 1.3461 (1.514) Data (t): 0.000 Batch (t): 0.417, 627.971/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:44:13 | INFO | Train Epoch: 1 [1101056/3655823 (30%)] Loss: 1.4789 (1.513) Data (t): 0.000 Batch (t): 0.408, 626.148/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:44:54 | INFO | Train Epoch: 1 [1126656/3655823 (31%)] Loss: 1.3646 (1.510) Data (t): 0.000 Batch (t): 0.407, 629.987/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:45:35 | INFO | Train Epoch: 1 [1152256/3655823 (32%)] Loss: 1.5140 (1.510) Data (t): 0.000 Batch (t): 0.407, 628.557/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:46:16 | INFO | Train Epoch: 1 [1177856/3655823 (32%)] Loss: 1.6683 (1.513) Data (t): 0.000 Batch (t): 0.407, 628.529/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:46:57 | INFO | Train Epoch: 1 [1203456/3655823 (33%)] Loss: 1.6725 (1.517) Data (t): 0.000 Batch (t): 0.417, 629.136/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:47:38 | INFO | Train Epoch: 1 [1229056/3655823 (34%)] Loss: 1.4948 (1.516) Data (t): 0.000 Batch (t): 0.407, 628.064/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:48:19 | INFO | Train Epoch: 1 [1254656/3655823 (34%)] Loss: 1.5949 (1.518) Data (t): 0.000 Batch (t): 0.407, 629.528/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:48:59 | INFO | Train Epoch: 1 [1280256/3655823 (35%)] Loss: 1.3180 (1.514) Data (t): 0.000 Batch (t): 0.408, 628.289/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:49:40 | INFO | Train Epoch: 1 [1305856/3655823 (36%)] Loss: 1.5062 (1.514) Data (t): 0.000 Batch (t): 0.407, 627.331/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:50:22 | INFO | Train Epoch: 1 [1331456/3655823 (36%)] Loss: 1.4016 (1.511) Data (t): 0.000 Batch (t): 0.417, 627.811/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:51:03 | INFO | Train Epoch: 1 [1357056/3655823 (37%)] Loss: 1.7230 (1.515) Data (t): 0.000 Batch (t): 0.407, 630.733/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:51:43 | INFO | Train Epoch: 1 [1382656/3655823 (38%)] Loss: 1.5396 (1.516) Data (t): 0.000 Batch (t): 0.407, 627.915/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:52:24 | INFO | Train Epoch: 1 [1408256/3655823 (39%)] Loss: 1.7449 (1.520) Data (t): 0.000 Batch (t): 0.407, 629.595/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:53:05 | INFO | Train Epoch: 1 [1433856/3655823 (39%)] Loss: 1.2845 (1.516) Data (t): 0.000 Batch (t): 0.407, 630.280/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:53:46 | INFO | Train Epoch: 1 [1459456/3655823 (40%)] Loss: 1.6151 (1.517) Data (t): 0.000 Batch (t): 0.413, 628.299/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:54:27 | INFO | Train Epoch: 1 [1485056/3655823 (41%)] Loss: 1.5740 (1.518) Data (t): 0.000 Batch (t): 0.409, 629.952/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:55:08 | INFO | Train Epoch: 1 [1510656/3655823 (41%)] Loss: 1.5132 (1.518) Data (t): 0.000 Batch (t): 0.407, 629.294/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:55:48 | INFO | Train Epoch: 1 [1536256/3655823 (42%)] Loss: 1.4155 (1.517) Data (t): 0.000 Batch (t): 0.407, 628.534/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:56:29 | INFO | Train Epoch: 1 [1561856/3655823 (43%)] Loss: 1.3554 (1.514) Data (t): 0.000 Batch (t): 0.407, 629.873/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:57:10 | INFO | Train Epoch: 1 [1587456/3655823 (43%)] Loss: 1.4103 (1.512) Data (t): 0.000 Batch (t): 0.411, 629.178/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:57:51 | INFO | Train Epoch: 1 [1613056/3655823 (44%)] Loss: 1.4684 (1.512) Data (t): 0.000 Batch (t): 0.412, 629.864/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:58:32 | INFO | Train Epoch: 1 [1638656/3655823 (45%)] Loss: 1.5633 (1.513) Data (t): 0.000 Batch (t): 0.407, 629.829/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:59:13 | INFO | Train Epoch: 1 [1664256/3655823 (46%)] Loss: 1.4915 (1.512) Data (t): 0.000 Batch (t): 0.407, 627.540/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,21:59:53 | INFO | Train Epoch: 1 [1689856/3655823 (46%)] Loss: 1.4593 (1.511) Data (t): 0.000 Batch (t): 0.407, 629.971/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,22:00:34 | INFO | Train Epoch: 1 [1715456/3655823 (47%)] Loss: 1.2689 (1.508) Data (t): 0.000 Batch (t): 0.407, 628.891/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,22:01:16 | INFO | Train Epoch: 1 [1741056/3655823 (48%)] Loss: 1.5227 (1.508) Data (t): 0.000 Batch (t): 0.416, 628.608/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,22:01:56 | INFO | Train Epoch: 1 [1766656/3655823 (48%)] Loss: 1.5385 (1.509) Data (t): 0.000 Batch (t): 0.407, 627.905/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,22:02:37 | INFO | Train Epoch: 1 [1792256/3655823 (49%)] Loss: 1.6290 (1.510) Data (t): 0.000 Batch (t): 0.407, 631.010/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,22:03:18 | INFO | Train Epoch: 1 [1817856/3655823 (50%)] Loss: 1.5324 (1.511) Data (t): 0.000 Batch (t): 0.407, 630.453/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-09-02,22:03:59 | INFO | Train Epoch: 1 [1843456/3655823 (50%)] Loss: 1.5188 (1.511) Data (t): 0.000 Batch (t): 0.407, 628.681/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:04:40 | INFO | Train Epoch: 1 [1869056/3655823 (51%)] Loss: 1.7214 (1.513) Data (t): 0.000 Batch (t): 0.416, 629.958/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:05:21 | INFO | Train Epoch: 1 [1894656/3655823 (52%)] Loss: 1.4878 (1.513) Data (t): 0.000 Batch (t): 0.407, 628.809/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:06:02 | INFO | Train Epoch: 1 [1920256/3655823 (53%)] Loss: 1.5888 (1.514) Data (t): 0.000 Batch (t): 0.407, 628.475/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:06:42 | INFO | Train Epoch: 1 [1945856/3655823 (53%)] Loss: 1.5318 (1.514) Data (t): 0.000 Batch (t): 0.407, 628.192/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:07:23 | INFO | Train Epoch: 1 [1971456/3655823 (54%)] Loss: 1.6754 (1.516) Data (t): 0.000 Batch (t): 0.407, 630.166/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:08:05 | INFO | Train Epoch: 1 [1997056/3655823 (55%)] Loss: 1.4976 (1.516) Data (t): 0.000 Batch (t): 0.414, 408.668/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:08:45 | INFO | Train Epoch: 1 [2022656/3655823 (55%)] Loss: 1.7855 (1.520) Data (t): 0.000 Batch (t): 0.407, 630.855/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:09:26 | INFO | Train Epoch: 1 [2048256/3655823 (56%)] Loss: 1.6878 (1.522) Data (t): 0.000 Batch (t): 0.407, 629.100/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:10:07 | INFO | Train Epoch: 1 [2073856/3655823 (57%)] Loss: 1.6600 (1.523) Data (t): 0.000 Batch (t): 0.407, 627.637/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:10:47 | INFO | Train Epoch: 1 [2099456/3655823 (57%)] Loss: 1.6056 (1.524) Data (t): 0.000 Batch (t): 0.408, 628.494/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:11:29 | INFO | Train Epoch: 1 [2125056/3655823 (58%)] Loss: 1.3115 (1.522) Data (t): 0.000 Batch (t): 0.414, 630.900/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:12:10 | INFO | Train Epoch: 1 [2150656/3655823 (59%)] Loss: 1.8129 (1.525) Data (t): 0.000 Batch (t): 0.410, 628.077/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:12:51 | INFO | Train Epoch: 1 [2176256/3655823 (60%)] Loss: 1.6818 (1.527) Data (t): 0.000 Batch (t): 0.408, 628.288/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:13:31 | INFO | Train Epoch: 1 [2201856/3655823 (60%)] Loss: 1.4452 (1.526) Data (t): 0.000 Batch (t): 0.407, 629.634/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:14:12 | INFO | Train Epoch: 1 [2227456/3655823 (61%)] Loss: 1.4680 (1.525) Data (t): 0.000 Batch (t): 0.407, 628.668/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:14:53 | INFO | Train Epoch: 1 [2253056/3655823 (62%)] Loss: 1.5947 (1.526) Data (t): 0.000 Batch (t): 0.414, 628.227/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:15:34 | INFO | Train Epoch: 1 [2278656/3655823 (62%)] Loss: 1.4808 (1.526) Data (t): 0.000 Batch (t): 0.409, 627.235/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:16:15 | INFO | Train Epoch: 1 [2304256/3655823 (63%)] Loss: 1.4525 (1.525) Data (t): 0.000 Batch (t): 0.407, 628.953/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:16:56 | INFO | Train Epoch: 1 [2329856/3655823 (64%)] Loss: 1.4756 (1.524) Data (t): 0.000 Batch (t): 0.407, 627.547/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:17:37 | INFO | Train Epoch: 1 [2355456/3655823 (64%)] Loss: 1.6125 (1.525) Data (t): 0.000 Batch (t): 0.407, 629.699/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:18:17 | INFO | Train Epoch: 1 [2381056/3655823 (65%)] Loss: 1.6759 (1.527) Data (t): 0.000 Batch (t): 0.410, 627.931/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:18:59 | INFO | Train Epoch: 1 [2406656/3655823 (66%)] Loss: 1.6390 (1.528) Data (t): 0.000 Batch (t): 0.414, 627.236/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:19:40 | INFO | Train Epoch: 1 [2432256/3655823 (67%)] Loss: 1.4739 (1.528) Data (t): 0.000 Batch (t): 0.407, 627.855/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:20:20 | INFO | Train Epoch: 1 [2457856/3655823 (67%)] Loss: 1.4514 (1.527) Data (t): 0.000 Batch (t): 0.407, 627.891/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:21:01 | INFO | Train Epoch: 1 [2483456/3655823 (68%)] Loss: 1.5166 (1.527) Data (t): 0.000 Batch (t): 0.407, 626.913/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:21:42 | INFO | Train Epoch: 1 [2509056/3655823 (69%)] Loss: 1.4762 (1.526) Data (t): 0.000 Batch (t): 0.408, 627.541/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:22:23 | INFO | Train Epoch: 1 [2534656/3655823 (69%)] Loss: 1.3142 (1.524) Data (t): 0.000 Batch (t): 0.414, 630.252/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:23:04 | INFO | Train Epoch: 1 [2560256/3655823 (70%)] Loss: 1.6721 (1.525) Data (t): 0.000 Batch (t): 0.409, 630.476/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:23:45 | INFO | Train Epoch: 1 [2585856/3655823 (71%)] Loss: 1.6706 (1.527) Data (t): 0.000 Batch (t): 0.407, 628.681/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:24:26 | INFO | Train Epoch: 1 [2611456/3655823 (71%)] Loss: 1.5446 (1.527) Data (t): 0.000 Batch (t): 0.407, 627.126/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:25:06 | INFO | Train Epoch: 1 [2637056/3655823 (72%)] Loss: 1.7021 (1.529) Data (t): 0.000 Batch (t): 0.407, 628.812/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:25:48 | INFO | Train Epoch: 1 [2662656/3655823 (73%)] Loss: 1.7364 (1.531) Data (t): 0.000 Batch (t): 0.414, 622.986/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:26:29 | INFO | Train Epoch: 1 [2688256/3655823 (74%)] Loss: 1.2895 (1.528) Data (t): 0.000 Batch (t): 0.409, 628.695/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:27:09 | INFO | Train Epoch: 1 [2713856/3655823 (74%)] Loss: 1.6429 (1.530) Data (t): 0.000 Batch (t): 0.407, 630.669/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:27:50 | INFO | Train Epoch: 1 [2739456/3655823 (75%)] Loss: 1.5169 (1.529) Data (t): 0.000 Batch (t): 0.407, 628.794/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:28:31 | INFO | Train Epoch: 1 [2765056/3655823 (76%)] Loss: 1.5358 (1.529) Data (t): 0.000 Batch (t): 0.407, 628.273/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:29:12 | INFO | Train Epoch: 1 [2790656/3655823 (76%)] Loss: 1.5837 (1.530) Data (t): 0.000 Batch (t): 0.414, 627.414/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:29:53 | INFO | Train Epoch: 1 [2816256/3655823 (77%)] Loss: 1.5221 (1.530) Data (t): 0.000 Batch (t): 0.409, 629.769/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:30:34 | INFO | Train Epoch: 1 [2841856/3655823 (78%)] Loss: 1.8568 (1.533) Data (t): 0.000 Batch (t): 0.407, 630.064/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:31:15 | INFO | Train Epoch: 1 [2867456/3655823 (78%)] Loss: 1.5648 (1.533) Data (t): 0.000 Batch (t): 0.407, 631.715/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:31:55 | INFO | Train Epoch: 1 [2893056/3655823 (79%)] Loss: 1.4747 (1.533) Data (t): 0.000 Batch (t): 0.407, 629.037/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:32:37 | INFO | Train Epoch: 1 [2918656/3655823 (80%)] Loss: 1.4372 (1.532) Data (t): 0.000 Batch (t): 0.414, 627.094/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:33:18 | INFO | Train Epoch: 1 [2944256/3655823 (81%)] Loss: 1.6138 (1.532) Data (t): 0.000 Batch (t): 0.410, 627.375/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:33:58 | INFO | Train Epoch: 1 [2969856/3655823 (81%)] Loss: 1.5428 (1.533) Data (t): 0.000 Batch (t): 0.407, 629.974/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:34:39 | INFO | Train Epoch: 1 [2995456/3655823 (82%)] Loss: 1.4792 (1.532) Data (t): 0.000 Batch (t): 0.407, 628.819/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:35:20 | INFO | Train Epoch: 1 [3021056/3655823 (83%)] Loss: 1.6329 (1.533) Data (t): 0.000 Batch (t): 0.407, 630.009/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:36:01 | INFO | Train Epoch: 1 [3046656/3655823 (83%)] Loss: 1.4424 (1.532) Data (t): 0.000 Batch (t): 0.414, 629.763/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:36:42 | INFO | Train Epoch: 1 [3072256/3655823 (84%)] Loss: 1.3678 (1.531) Data (t): 0.000 Batch (t): 0.407, 629.572/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:37:23 | INFO | Train Epoch: 1 [3097856/3655823 (85%)] Loss: 1.7346 (1.532) Data (t): 0.000 Batch (t): 0.410, 631.033/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:38:03 | INFO | Train Epoch: 1 [3123456/3655823 (85%)] Loss: 1.6369 (1.533) Data (t): 0.000 Batch (t): 0.407, 628.910/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:38:44 | INFO | Train Epoch: 1 [3149056/3655823 (86%)] Loss: 1.7221 (1.535) Data (t): 0.000 Batch (t): 0.407, 628.351/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:39:25 | INFO | Train Epoch: 1 [3174656/3655823 (87%)] Loss: 1.5437 (1.535) Data (t): 0.000 Batch (t): 0.412, 630.017/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:40:06 | INFO | Train Epoch: 1 [3200256/3655823 (88%)] Loss: 1.3392 (1.533) Data (t): 0.000 Batch (t): 0.410, 628.299/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:40:47 | INFO | Train Epoch: 1 [3225856/3655823 (88%)] Loss: 1.4120 (1.532) Data (t): 0.000 Batch (t): 0.409, 630.385/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:41:28 | INFO | Train Epoch: 1 [3251456/3655823 (89%)] Loss: 1.4983 (1.532) Data (t): 0.000 Batch (t): 0.407, 629.850/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:42:09 | INFO | Train Epoch: 1 [3277056/3655823 (90%)] Loss: 1.5001 (1.532) Data (t): 0.000 Batch (t): 0.407, 631.704/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:42:49 | INFO | Train Epoch: 1 [3302656/3655823 (90%)] Loss: 1.6518 (1.533) Data (t): 0.000 Batch (t): 0.409, 629.730/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:43:31 | INFO | Train Epoch: 1 [3328256/3655823 (91%)] Loss: 1.5092 (1.533) Data (t): 0.000 Batch (t): 0.412, 630.069/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:44:12 | INFO | Train Epoch: 1 [3353856/3655823 (92%)] Loss: 1.2884 (1.531) Data (t): 0.000 Batch (t): 0.409, 627.948/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:44:52 | INFO | Train Epoch: 1 [3379456/3655823 (92%)] Loss: 1.4368 (1.530) Data (t): 0.000 Batch (t): 0.407, 630.895/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:45:33 | INFO | Train Epoch: 1 [3405056/3655823 (93%)] Loss: 1.5422 (1.530) Data (t): 0.000 Batch (t): 0.407, 628.972/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:46:14 | INFO | Train Epoch: 1 [3430656/3655823 (94%)] Loss: 1.5070 (1.530) Data (t): 0.000 Batch (t): 0.407, 628.455/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:46:55 | INFO | Train Epoch: 1 [3456256/3655823 (95%)] Loss: 1.3101 (1.528) Data (t): 0.000 Batch (t): 0.414, 628.779/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:47:36 | INFO | Train Epoch: 1 [3481856/3655823 (95%)] Loss: 1.3650 (1.527) Data (t): 0.000 Batch (t): 0.409, 628.301/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:48:17 | INFO | Train Epoch: 1 [3507456/3655823 (96%)] Loss: 1.3703 (1.526) Data (t): 0.000 Batch (t): 0.407, 630.465/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:48:57 | INFO | Train Epoch: 1 [3533056/3655823 (97%)] Loss: 1.4965 (1.526) Data (t): 0.000 Batch (t): 0.407, 628.362/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:49:38 | INFO | Train Epoch: 1 [3558656/3655823 (97%)] Loss: 1.5122 (1.526) Data (t): 0.000 Batch (t): 0.407, 629.408/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:50:19 | INFO | Train Epoch: 1 [3584256/3655823 (98%)] Loss: 1.4414 (1.525) Data (t): 0.000 Batch (t): 0.414, 628.142/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:51:00 | INFO | Train Epoch: 1 [3609856/3655823 (99%)] Loss: 1.4188 (1.524) Data (t): 0.000 Batch (t): 0.409, 630.100/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:51:41 | INFO | Train Epoch: 1 [3635456/3655823 (99%)] Loss: 1.5303 (1.524) Data (t): 0.000 Batch (t): 0.407, 627.737/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:52:13 | INFO | Train Epoch: 1 [3655680/3655823 (100%)] Loss: 1.5484 (1.525) Data (t): 0.001 Batch (t): 0.407, 631.627/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:52:22 | INFO | Start epoch 2
+ 2024-09-02,22:52:24 | INFO | Train Epoch: 2 [ 256/3655823 (0%)] Loss: 1.4037 (1.404) Data (t): 1.370 Batch (t): 1.815, 141.077/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:53:04 | INFO | Train Epoch: 2 [ 25856/3655823 (1%)] Loss: 1.5089 (1.456) Data (t): 0.000 Batch (t): 0.407, 630.111/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:53:46 | INFO | Train Epoch: 2 [ 51456/3655823 (1%)] Loss: 1.5214 (1.478) Data (t): 0.000 Batch (t): 0.415, 628.223/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:54:27 | INFO | Train Epoch: 2 [ 77056/3655823 (2%)] Loss: 1.5636 (1.499) Data (t): 0.000 Batch (t): 0.407, 629.353/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:55:08 | INFO | Train Epoch: 2 [ 102656/3655823 (3%)] Loss: 1.6539 (1.530) Data (t): 0.000 Batch (t): 0.410, 630.132/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:55:48 | INFO | Train Epoch: 2 [ 128256/3655823 (4%)] Loss: 1.5802 (1.539) Data (t): 0.000 Batch (t): 0.407, 631.567/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:56:29 | INFO | Train Epoch: 2 [ 153856/3655823 (4%)] Loss: 1.5986 (1.547) Data (t): 0.000 Batch (t): 0.407, 628.568/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:57:11 | INFO | Train Epoch: 2 [ 179456/3655823 (5%)] Loss: 1.4930 (1.540) Data (t): 0.000 Batch (t): 0.414, 626.270/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:57:51 | INFO | Train Epoch: 2 [ 205056/3655823 (6%)] Loss: 1.4261 (1.528) Data (t): 0.000 Batch (t): 0.407, 630.380/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:58:32 | INFO | Train Epoch: 2 [ 230656/3655823 (6%)] Loss: 1.4926 (1.524) Data (t): 0.000 Batch (t): 0.409, 629.050/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:59:13 | INFO | Train Epoch: 2 [ 256256/3655823 (7%)] Loss: 1.7086 (1.541) Data (t): 0.000 Batch (t): 0.407, 629.610/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,22:59:54 | INFO | Train Epoch: 2 [ 281856/3655823 (8%)] Loss: 1.7055 (1.555) Data (t): 0.000 Batch (t): 0.407, 630.368/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:00:35 | INFO | Train Epoch: 2 [ 307456/3655823 (8%)] Loss: 1.3944 (1.542) Data (t): 0.000 Batch (t): 0.409, 629.897/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:01:16 | INFO | Train Epoch: 2 [ 333056/3655823 (9%)] Loss: 1.4867 (1.538) Data (t): 0.000 Batch (t): 0.412, 630.043/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:01:57 | INFO | Train Epoch: 2 [ 358656/3655823 (10%)] Loss: 1.5009 (1.536) Data (t): 0.000 Batch (t): 0.410, 629.868/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:02:37 | INFO | Train Epoch: 2 [ 384256/3655823 (11%)] Loss: 1.4124 (1.528) Data (t): 0.000 Batch (t): 0.407, 629.839/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:03:18 | INFO | Train Epoch: 2 [ 409856/3655823 (11%)] Loss: 1.6669 (1.536) Data (t): 0.000 Batch (t): 0.407, 629.113/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:03:59 | INFO | Train Epoch: 2 [ 435456/3655823 (12%)] Loss: 1.4738 (1.533) Data (t): 0.000 Batch (t): 0.407, 629.178/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:04:40 | INFO | Train Epoch: 2 [ 461056/3655823 (13%)] Loss: 1.7301 (1.543) Data (t): 0.000 Batch (t): 0.414, 629.342/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:05:21 | INFO | Train Epoch: 2 [ 486656/3655823 (13%)] Loss: 1.5073 (1.541) Data (t): 0.000 Batch (t): 0.410, 629.457/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:06:02 | INFO | Train Epoch: 2 [ 512256/3655823 (14%)] Loss: 1.4765 (1.538) Data (t): 0.000 Batch (t): 0.408, 627.737/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:06:43 | INFO | Train Epoch: 2 [ 537856/3655823 (15%)] Loss: 1.6323 (1.543) Data (t): 0.000 Batch (t): 0.407, 627.787/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:07:23 | INFO | Train Epoch: 2 [ 563456/3655823 (15%)] Loss: 1.4247 (1.537) Data (t): 0.000 Batch (t): 0.407, 629.128/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:08:05 | INFO | Train Epoch: 2 [ 589056/3655823 (16%)] Loss: 1.5308 (1.537) Data (t): 0.000 Batch (t): 0.414, 630.767/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:08:46 | INFO | Train Epoch: 2 [ 614656/3655823 (17%)] Loss: 1.2977 (1.528) Data (t): 0.000 Batch (t): 0.409, 628.541/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:09:26 | INFO | Train Epoch: 2 [ 640256/3655823 (18%)] Loss: 1.7215 (1.535) Data (t): 0.000 Batch (t): 0.407, 627.081/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:10:07 | INFO | Train Epoch: 2 [ 665856/3655823 (18%)] Loss: 1.5919 (1.537) Data (t): 0.000 Batch (t): 0.407, 627.969/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:10:48 | INFO | Train Epoch: 2 [ 691456/3655823 (19%)] Loss: 1.6819 (1.542) Data (t): 0.000 Batch (t): 0.407, 626.797/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:11:29 | INFO | Train Epoch: 2 [ 717056/3655823 (20%)] Loss: 1.5473 (1.543) Data (t): 0.000 Batch (t): 0.414, 626.998/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:12:10 | INFO | Train Epoch: 2 [ 742656/3655823 (20%)] Loss: 1.5662 (1.543) Data (t): 0.000 Batch (t): 0.407, 629.791/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:12:51 | INFO | Train Epoch: 2 [ 768256/3655823 (21%)] Loss: 1.2745 (1.535) Data (t): 0.000 Batch (t): 0.409, 629.149/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:13:32 | INFO | Train Epoch: 2 [ 793856/3655823 (22%)] Loss: 1.5888 (1.536) Data (t): 0.000 Batch (t): 0.407, 627.713/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:14:12 | INFO | Train Epoch: 2 [ 819456/3655823 (22%)] Loss: 1.5075 (1.535) Data (t): 0.000 Batch (t): 0.407, 629.456/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:14:54 | INFO | Train Epoch: 2 [ 845056/3655823 (23%)] Loss: 1.3259 (1.529) Data (t): 0.000 Batch (t): 0.414, 627.704/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:15:34 | INFO | Train Epoch: 2 [ 870656/3655823 (24%)] Loss: 1.4892 (1.528) Data (t): 0.000 Batch (t): 0.407, 629.108/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:16:15 | INFO | Train Epoch: 2 [ 896256/3655823 (25%)] Loss: 1.3673 (1.524) Data (t): 0.000 Batch (t): 0.410, 630.858/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:16:56 | INFO | Train Epoch: 2 [ 921856/3655823 (25%)] Loss: 1.4061 (1.521) Data (t): 0.000 Batch (t): 0.407, 627.890/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:17:37 | INFO | Train Epoch: 2 [ 947456/3655823 (26%)] Loss: 1.3169 (1.515) Data (t): 0.000 Batch (t): 0.407, 628.742/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:18:18 | INFO | Train Epoch: 2 [ 973056/3655823 (27%)] Loss: 1.4882 (1.514) Data (t): 0.000 Batch (t): 0.412, 627.743/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:18:59 | INFO | Train Epoch: 2 [ 998656/3655823 (27%)] Loss: 1.3831 (1.511) Data (t): 0.000 Batch (t): 0.409, 629.521/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:19:40 | INFO | Train Epoch: 2 [1024256/3655823 (28%)] Loss: 1.4389 (1.509) Data (t): 0.000 Batch (t): 0.409, 630.280/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:20:21 | INFO | Train Epoch: 2 [1049856/3655823 (29%)] Loss: 1.4462 (1.508) Data (t): 0.000 Batch (t): 0.407, 628.252/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:21:01 | INFO | Train Epoch: 2 [1075456/3655823 (29%)] Loss: 1.8087 (1.515) Data (t): 0.000 Batch (t): 0.407, 627.748/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:21:42 | INFO | Train Epoch: 2 [1101056/3655823 (30%)] Loss: 1.6658 (1.518) Data (t): 0.000 Batch (t): 0.410, 627.804/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:22:23 | INFO | Train Epoch: 2 [1126656/3655823 (31%)] Loss: 1.6089 (1.520) Data (t): 0.000 Batch (t): 0.411, 629.073/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:23:04 | INFO | Train Epoch: 2 [1152256/3655823 (32%)] Loss: 1.4291 (1.518) Data (t): 0.000 Batch (t): 0.409, 629.290/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:23:45 | INFO | Train Epoch: 2 [1177856/3655823 (32%)] Loss: 1.7743 (1.524) Data (t): 0.000 Batch (t): 0.407, 628.358/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:24:26 | INFO | Train Epoch: 2 [1203456/3655823 (33%)] Loss: 1.5225 (1.524) Data (t): 0.000 Batch (t): 0.407, 627.118/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:25:06 | INFO | Train Epoch: 2 [1229056/3655823 (34%)] Loss: 1.6578 (1.527) Data (t): 0.000 Batch (t): 0.407, 631.498/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:25:48 | INFO | Train Epoch: 2 [1254656/3655823 (34%)] Loss: 1.4961 (1.526) Data (t): 0.000 Batch (t): 0.414, 627.631/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:26:29 | INFO | Train Epoch: 2 [1280256/3655823 (35%)] Loss: 1.4220 (1.524) Data (t): 0.000 Batch (t): 0.407, 629.668/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:27:10 | INFO | Train Epoch: 2 [1305856/3655823 (36%)] Loss: 1.7906 (1.529) Data (t): 0.000 Batch (t): 0.410, 628.641/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:27:50 | INFO | Train Epoch: 2 [1331456/3655823 (36%)] Loss: 1.5221 (1.529) Data (t): 0.000 Batch (t): 0.407, 628.897/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:28:31 | INFO | Train Epoch: 2 [1357056/3655823 (37%)] Loss: 1.2659 (1.524) Data (t): 0.000 Batch (t): 0.407, 630.465/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:29:12 | INFO | Train Epoch: 2 [1382656/3655823 (38%)] Loss: 1.7159 (1.527) Data (t): 0.000 Batch (t): 0.413, 628.141/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:29:53 | INFO | Train Epoch: 2 [1408256/3655823 (39%)] Loss: 1.3950 (1.525) Data (t): 0.000 Batch (t): 0.407, 627.700/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:30:34 | INFO | Train Epoch: 2 [1433856/3655823 (39%)] Loss: 1.5019 (1.525) Data (t): 0.000 Batch (t): 0.409, 628.393/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:31:15 | INFO | Train Epoch: 2 [1459456/3655823 (40%)] Loss: 1.4502 (1.523) Data (t): 0.000 Batch (t): 0.407, 630.965/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:31:55 | INFO | Train Epoch: 2 [1485056/3655823 (41%)] Loss: 1.4851 (1.523) Data (t): 0.000 Batch (t): 0.407, 629.367/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:32:37 | INFO | Train Epoch: 2 [1510656/3655823 (41%)] Loss: 1.3077 (1.519) Data (t): 0.000 Batch (t): 0.414, 629.548/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:33:17 | INFO | Train Epoch: 2 [1536256/3655823 (42%)] Loss: 1.2835 (1.515) Data (t): 0.000 Batch (t): 0.407, 628.345/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:33:58 | INFO | Train Epoch: 2 [1561856/3655823 (43%)] Loss: 1.3746 (1.513) Data (t): 0.000 Batch (t): 0.409, 626.895/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:34:39 | INFO | Train Epoch: 2 [1587456/3655823 (43%)] Loss: 1.4182 (1.512) Data (t): 0.000 Batch (t): 0.407, 628.771/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:35:20 | INFO | Train Epoch: 2 [1613056/3655823 (44%)] Loss: 1.6644 (1.514) Data (t): 0.000 Batch (t): 0.407, 627.470/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:36:01 | INFO | Train Epoch: 2 [1638656/3655823 (45%)] Loss: 1.5005 (1.514) Data (t): 0.000 Batch (t): 0.414, 629.166/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:36:42 | INFO | Train Epoch: 2 [1664256/3655823 (46%)] Loss: 1.3908 (1.512) Data (t): 0.000 Batch (t): 0.407, 628.847/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:37:23 | INFO | Train Epoch: 2 [1689856/3655823 (46%)] Loss: 1.3752 (1.510) Data (t): 0.000 Batch (t): 0.409, 628.394/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:38:03 | INFO | Train Epoch: 2 [1715456/3655823 (47%)] Loss: 1.3061 (1.507) Data (t): 0.000 Batch (t): 0.407, 629.893/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:38:44 | INFO | Train Epoch: 2 [1741056/3655823 (48%)] Loss: 1.6684 (1.509) Data (t): 0.000 Batch (t): 0.407, 629.377/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:39:25 | INFO | Train Epoch: 2 [1766656/3655823 (48%)] Loss: 1.5123 (1.509) Data (t): 0.000 Batch (t): 0.412, 627.347/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:40:06 | INFO | Train Epoch: 2 [1792256/3655823 (49%)] Loss: 1.5436 (1.510) Data (t): 0.000 Batch (t): 0.409, 629.354/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:40:47 | INFO | Train Epoch: 2 [1817856/3655823 (50%)] Loss: 1.3315 (1.507) Data (t): 0.000 Batch (t): 0.409, 631.814/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:41:28 | INFO | Train Epoch: 2 [1843456/3655823 (50%)] Loss: 1.4653 (1.507) Data (t): 0.000 Batch (t): 0.407, 630.276/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:42:09 | INFO | Train Epoch: 2 [1869056/3655823 (51%)] Loss: 1.6304 (1.508) Data (t): 0.000 Batch (t): 0.407, 628.864/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:42:50 | INFO | Train Epoch: 2 [1894656/3655823 (52%)] Loss: 1.6257 (1.510) Data (t): 0.000 Batch (t): 0.411, 629.561/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:43:31 | INFO | Train Epoch: 2 [1920256/3655823 (53%)] Loss: 1.4509 (1.509) Data (t): 0.000 Batch (t): 0.409, 629.451/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:44:11 | INFO | Train Epoch: 2 [1945856/3655823 (53%)] Loss: 1.6384 (1.511) Data (t): 0.000 Batch (t): 0.407, 628.176/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:44:52 | INFO | Train Epoch: 2 [1971456/3655823 (54%)] Loss: 1.7136 (1.513) Data (t): 0.000 Batch (t): 0.410, 627.612/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:45:33 | INFO | Train Epoch: 2 [1997056/3655823 (55%)] Loss: 1.4535 (1.513) Data (t): 0.000 Batch (t): 0.407, 630.598/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:46:14 | INFO | Train Epoch: 2 [2022656/3655823 (55%)] Loss: 1.4362 (1.512) Data (t): 0.000 Batch (t): 0.407, 630.524/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:46:55 | INFO | Train Epoch: 2 [2048256/3655823 (56%)] Loss: 1.5898 (1.513) Data (t): 0.000 Batch (t): 0.414, 630.467/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:47:36 | INFO | Train Epoch: 2 [2073856/3655823 (57%)] Loss: 1.5890 (1.514) Data (t): 0.000 Batch (t): 0.407, 629.275/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:48:17 | INFO | Train Epoch: 2 [2099456/3655823 (57%)] Loss: 1.3375 (1.511) Data (t): 0.000 Batch (t): 0.409, 630.843/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:48:57 | INFO | Train Epoch: 2 [2125056/3655823 (58%)] Loss: 1.5884 (1.512) Data (t): 0.000 Batch (t): 0.407, 629.078/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:49:38 | INFO | Train Epoch: 2 [2150656/3655823 (59%)] Loss: 1.4484 (1.512) Data (t): 0.000 Batch (t): 0.407, 628.168/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:50:20 | INFO | Train Epoch: 2 [2176256/3655823 (60%)] Loss: 1.4590 (1.511) Data (t): 0.000 Batch (t): 0.414, 627.246/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:51:00 | INFO | Train Epoch: 2 [2201856/3655823 (60%)] Loss: 1.4026 (1.510) Data (t): 0.000 Batch (t): 0.407, 629.318/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:51:41 | INFO | Train Epoch: 2 [2227456/3655823 (61%)] Loss: 1.6156 (1.511) Data (t): 0.000 Batch (t): 0.410, 627.409/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:52:22 | INFO | Train Epoch: 2 [2253056/3655823 (62%)] Loss: 1.7187 (1.513) Data (t): 0.000 Batch (t): 0.407, 630.345/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:53:03 | INFO | Train Epoch: 2 [2278656/3655823 (62%)] Loss: 1.4741 (1.513) Data (t): 0.000 Batch (t): 0.407, 628.182/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:53:44 | INFO | Train Epoch: 2 [2304256/3655823 (63%)] Loss: 1.4681 (1.512) Data (t): 0.000 Batch (t): 0.415, 627.790/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:54:25 | INFO | Train Epoch: 2 [2329856/3655823 (64%)] Loss: 1.3857 (1.511) Data (t): 0.000 Batch (t): 0.408, 629.276/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:55:06 | INFO | Train Epoch: 2 [2355456/3655823 (64%)] Loss: 1.5978 (1.512) Data (t): 0.000 Batch (t): 0.409, 627.194/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:55:47 | INFO | Train Epoch: 2 [2381056/3655823 (65%)] Loss: 1.6680 (1.514) Data (t): 0.000 Batch (t): 0.407, 628.069/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:56:27 | INFO | Train Epoch: 2 [2406656/3655823 (66%)] Loss: 1.5529 (1.514) Data (t): 0.000 Batch (t): 0.407, 630.765/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:57:09 | INFO | Train Epoch: 2 [2432256/3655823 (67%)] Loss: 1.3681 (1.512) Data (t): 0.000 Batch (t): 0.413, 630.806/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:57:49 | INFO | Train Epoch: 2 [2457856/3655823 (67%)] Loss: 1.5440 (1.513) Data (t): 0.000 Batch (t): 0.407, 628.749/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:58:30 | INFO | Train Epoch: 2 [2483456/3655823 (68%)] Loss: 1.4075 (1.512) Data (t): 0.000 Batch (t): 0.407, 628.206/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:59:11 | INFO | Train Epoch: 2 [2509056/3655823 (69%)] Loss: 1.8254 (1.515) Data (t): 0.000 Batch (t): 0.410, 628.974/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-02,23:59:52 | INFO | Train Epoch: 2 [2534656/3655823 (69%)] Loss: 1.6677 (1.516) Data (t): 0.000 Batch (t): 0.408, 626.462/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:00:33 | INFO | Train Epoch: 2 [2560256/3655823 (70%)] Loss: 1.4165 (1.515) Data (t): 0.000 Batch (t): 0.412, 629.517/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:01:14 | INFO | Train Epoch: 2 [2585856/3655823 (71%)] Loss: 1.7347 (1.518) Data (t): 0.000 Batch (t): 0.410, 629.107/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:01:55 | INFO | Train Epoch: 2 [2611456/3655823 (71%)] Loss: 1.2317 (1.515) Data (t): 0.000 Batch (t): 0.407, 630.817/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:02:36 | INFO | Train Epoch: 2 [2637056/3655823 (72%)] Loss: 1.5369 (1.515) Data (t): 0.000 Batch (t): 0.410, 625.949/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:03:16 | INFO | Train Epoch: 2 [2662656/3655823 (73%)] Loss: 1.4907 (1.515) Data (t): 0.000 Batch (t): 0.407, 629.731/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:03:58 | INFO | Train Epoch: 2 [2688256/3655823 (74%)] Loss: 1.4255 (1.514) Data (t): 0.000 Batch (t): 0.412, 398.724/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:04:39 | INFO | Train Epoch: 2 [2713856/3655823 (74%)] Loss: 1.4123 (1.513) Data (t): 0.000 Batch (t): 0.410, 631.559/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:05:19 | INFO | Train Epoch: 2 [2739456/3655823 (75%)] Loss: 1.5758 (1.514) Data (t): 0.000 Batch (t): 0.407, 628.544/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:06:00 | INFO | Train Epoch: 2 [2765056/3655823 (76%)] Loss: 1.4495 (1.513) Data (t): 0.000 Batch (t): 0.409, 626.030/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:06:41 | INFO | Train Epoch: 2 [2790656/3655823 (76%)] Loss: 1.5304 (1.513) Data (t): 0.000 Batch (t): 0.407, 627.283/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:07:22 | INFO | Train Epoch: 2 [2816256/3655823 (77%)] Loss: 1.4701 (1.513) Data (t): 0.000 Batch (t): 0.410, 629.953/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:08:03 | INFO | Train Epoch: 2 [2841856/3655823 (78%)] Loss: 1.5763 (1.513) Data (t): 0.000 Batch (t): 0.412, 627.543/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:08:44 | INFO | Train Epoch: 2 [2867456/3655823 (78%)] Loss: 1.7433 (1.515) Data (t): 0.000 Batch (t): 0.407, 627.613/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:09:25 | INFO | Train Epoch: 2 [2893056/3655823 (79%)] Loss: 1.3988 (1.514) Data (t): 0.000 Batch (t): 0.410, 630.525/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:10:06 | INFO | Train Epoch: 2 [2918656/3655823 (80%)] Loss: 1.4154 (1.513) Data (t): 0.000 Batch (t): 0.408, 627.753/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:10:46 | INFO | Train Epoch: 2 [2944256/3655823 (81%)] Loss: 1.3944 (1.512) Data (t): 0.000 Batch (t): 0.408, 627.055/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:11:28 | INFO | Train Epoch: 2 [2969856/3655823 (81%)] Loss: 1.5659 (1.513) Data (t): 0.000 Batch (t): 0.414, 629.686/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:12:09 | INFO | Train Epoch: 2 [2995456/3655823 (82%)] Loss: 1.5505 (1.513) Data (t): 0.000 Batch (t): 0.408, 630.069/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:12:49 | INFO | Train Epoch: 2 [3021056/3655823 (83%)] Loss: 1.5070 (1.513) Data (t): 0.000 Batch (t): 0.407, 628.991/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:13:30 | INFO | Train Epoch: 2 [3046656/3655823 (83%)] Loss: 1.5398 (1.513) Data (t): 0.000 Batch (t): 0.410, 629.726/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:14:11 | INFO | Train Epoch: 2 [3072256/3655823 (84%)] Loss: 1.3857 (1.512) Data (t): 0.000 Batch (t): 0.407, 629.560/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:14:52 | INFO | Train Epoch: 2 [3097856/3655823 (85%)] Loss: 1.5881 (1.513) Data (t): 0.000 Batch (t): 0.414, 631.045/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:15:33 | INFO | Train Epoch: 2 [3123456/3655823 (85%)] Loss: 1.5037 (1.513) Data (t): 0.000 Batch (t): 0.407, 629.088/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:16:14 | INFO | Train Epoch: 2 [3149056/3655823 (86%)] Loss: 1.4014 (1.512) Data (t): 0.001 Batch (t): 0.407, 630.044/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:16:55 | INFO | Train Epoch: 2 [3174656/3655823 (87%)] Loss: 1.2179 (1.510) Data (t): 0.000 Batch (t): 0.410, 629.206/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:17:36 | INFO | Train Epoch: 2 [3200256/3655823 (88%)] Loss: 1.3645 (1.508) Data (t): 0.000 Batch (t): 0.407, 627.603/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:18:17 | INFO | Train Epoch: 2 [3225856/3655823 (88%)] Loss: 1.5680 (1.509) Data (t): 0.000 Batch (t): 0.412, 627.229/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:18:58 | INFO | Train Epoch: 2 [3251456/3655823 (89%)] Loss: 1.3383 (1.508) Data (t): 0.000 Batch (t): 0.410, 626.713/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:19:39 | INFO | Train Epoch: 2 [3277056/3655823 (90%)] Loss: 1.3487 (1.506) Data (t): 0.000 Batch (t): 0.408, 626.343/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:20:20 | INFO | Train Epoch: 2 [3302656/3655823 (90%)] Loss: 1.7396 (1.508) Data (t): 0.000 Batch (t): 0.410, 630.255/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:21:00 | INFO | Train Epoch: 2 [3328256/3655823 (91%)] Loss: 1.5906 (1.509) Data (t): 0.000 Batch (t): 0.408, 626.238/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:21:41 | INFO | Train Epoch: 2 [3353856/3655823 (92%)] Loss: 1.3907 (1.508) Data (t): 0.000 Batch (t): 0.410, 625.774/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:22:23 | INFO | Train Epoch: 2 [3379456/3655823 (92%)] Loss: 1.7100 (1.509) Data (t): 0.000 Batch (t): 0.412, 627.232/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:23:03 | INFO | Train Epoch: 2 [3405056/3655823 (93%)] Loss: 1.5588 (1.510) Data (t): 0.000 Batch (t): 0.408, 629.864/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:23:44 | INFO | Train Epoch: 2 [3430656/3655823 (94%)] Loss: 1.4719 (1.510) Data (t): 0.000 Batch (t): 0.410, 626.668/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:24:25 | INFO | Train Epoch: 2 [3456256/3655823 (95%)] Loss: 1.5265 (1.510) Data (t): 0.000 Batch (t): 0.407, 626.014/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:25:06 | INFO | Train Epoch: 2 [3481856/3655823 (95%)] Loss: 1.4344 (1.509) Data (t): 0.000 Batch (t): 0.410, 628.206/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:25:47 | INFO | Train Epoch: 2 [3507456/3655823 (96%)] Loss: 1.4098 (1.508) Data (t): 0.000 Batch (t): 0.412, 627.392/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:26:28 | INFO | Train Epoch: 2 [3533056/3655823 (97%)] Loss: 1.4200 (1.508) Data (t): 0.000 Batch (t): 0.408, 626.110/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:27:09 | INFO | Train Epoch: 2 [3558656/3655823 (97%)] Loss: 1.5582 (1.508) Data (t): 0.000 Batch (t): 0.408, 626.130/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:27:50 | INFO | Train Epoch: 2 [3584256/3655823 (98%)] Loss: 1.6104 (1.509) Data (t): 0.000 Batch (t): 0.409, 631.655/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:28:30 | INFO | Train Epoch: 2 [3609856/3655823 (99%)] Loss: 1.4446 (1.508) Data (t): 0.000 Batch (t): 0.407, 627.898/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:29:12 | INFO | Train Epoch: 2 [3635456/3655823 (99%)] Loss: 1.3426 (1.507) Data (t): 0.000 Batch (t): 0.414, 627.434/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-09-03,00:29:44 | INFO | Train Epoch: 2 [3655680/3655823 (100%)] Loss: 1.1659 (1.505) Data (t): 0.001 Batch (t): 0.407, 631.296/s LR: 0.000000 Logit Scale: 100.000 - V4
data/trained_openclip/negative_logs/plotqa_v2/2024_09_02-19_36_58-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp/params.txt ADDED
@@ -0,0 +1,67 @@
+ batch_size: 64
+ beta1: 0.9
+ beta2: 0.98
+ checkpoint_path: /project/deemreason/junteng/Vision4Math/train_clip/negative_logs/plotqa_v2/2024_09_02-19_36_58-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp/checkpoints
+ copy_codebase: False
+ csv_caption_key: caption
+ csv_hard_captions_key: neg_caption
+ csv_img_key: img_path
+ csv_separator: ,
+ dataset_resampled: False
+ dataset_type: csv
+ ddp_static_graph: False
+ debug: False
+ device: cuda:0
+ dist_backend: nccl
+ dist_url: env://
+ distributed: True
+ epochs: 3
+ eps: 1e-06
+ force_quick_gelu: True
+ gather_with_grad: False
+ grad_checkpointing: False
+ horovod: False
+ imagenet_v2: None
+ imagenet_val: None
+ local_loss: False
+ local_rank: 0
+ lock_image: False
+ lock_image_freeze_bn_stats: False
+ lock_image_unlocked_groups: 0
+ log_level: 20
+ log_local: False
+ log_path: /project/deemreason/junteng/Vision4Math/train_clip/negative_logs/plotqa_v2/2024_09_02-19_36_58-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp/out.log
+ logs: /project/deemreason/junteng/Vision4Math/train_clip/negative_logs/plotqa_v2
+ lr: 1e-06
+ model: ViT-L-14-336
+ name: 2024_09_02-19_36_58-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp
+ no_set_device_rank: False
+ norm_gradient_clip: None
+ precision: amp
+ pretrained: /project/deemreason/junteng/Vision4Math/data/openclip-vit-14-336/openclip_model.pt
+ pretrained_image: False
+ rank: 0
+ report_to: wandb
+ resume: None
+ save_frequency: 1
+ save_most_recent: False
+ seed: 0
+ skip_scheduler: False
+ tensorboard: False
+ tensorboard_path:
+ torchscript: False
+ trace: False
+ train_data: /project/deemreason/junteng/Vision4Math/csv_data/plotqa_train_v2.csv
+ train_num_samples: None
+ use_bn_sync: False
+ val_data: None
+ val_frequency: 1
+ val_num_samples: None
+ wandb: True
+ wandb_notes:
+ wandb_project: open-clip-sum
+ warmup: 0
+ wd: 0.1
+ workers: 4
+ world_size: 4
+ zeroshot_frequency: 2