Junteng committed on
Commit 4e0e6dc · verified · 1 Parent(s): d3579e8

Upload folder using huggingface_hub

Files changed (8)
  1. data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-13_26_45-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp/checkpoints/epoch_1.pt +3 -0
  2. data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-13_26_45-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp/checkpoints/epoch_2.pt +3 -0
  3. data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-13_26_45-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp/out.log +534 -0
  4. data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-13_26_45-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp/params.txt +67 -0
  5. data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-23_59_33-model_ViT-L-14-336-lr_5e-06-b_64-j_4-p_amp/checkpoints/epoch_1.pt +3 -0
  6. data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-23_59_33-model_ViT-L-14-336-lr_5e-06-b_64-j_4-p_amp/checkpoints/epoch_2.pt +3 -0
  7. data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-23_59_33-model_ViT-L-14-336-lr_5e-06-b_64-j_4-p_amp/out.log +534 -0
  8. data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-23_59_33-model_ViT-L-14-336-lr_5e-06-b_64-j_4-p_amp/params.txt +67 -0
data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-13_26_45-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp/checkpoints/epoch_1.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1ef4d80ec486f5f01136fdf042e33e31d3e7a31ed27bc82dccb78a27ef52ec40
+ size 5135890710
data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-13_26_45-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp/checkpoints/epoch_2.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b8e076c064dbf722cda1c971b98755117cf36c95fd5a99626ca7a67cda409773
+ size 5135890710
data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-13_26_45-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp/out.log ADDED
@@ -0,0 +1,534 @@
+ 2024-11-26,13:26:45 | INFO | Running in distributed mode with multiple processes. Device: cuda:0.Process (global: 0, local 0), total 8.
+ 2024-11-26,13:26:45 | INFO | Loading ViT-L-14-336 model config.
+ 2024-11-26,13:26:48 | INFO | Loading pretrained ViT-L-14-336 weights (data/openclip-vit-14-336/openclip_model.pt).
+ 2024-11-26,13:26:55 | INFO | Model:
+ 2024-11-26,13:26:55 | INFO | CLIP(
+ (visual): VisualTransformer(
+ (conv1): Conv2d(3, 1024, kernel_size=(14, 14), stride=(14, 14), bias=False)
+ (ln_pre): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
+ (transformer): Transformer(
+ (resblocks): ModuleList(
+ (0-23): 24 x ResidualAttentionBlock(
+ (attn): MultiheadAttention(
+ (out_proj): NonDynamicallyQuantizableLinear(in_features=1024, out_features=1024, bias=True)
+ )
+ (ln_1): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
+ (mlp): Sequential(
+ (c_fc): Linear(in_features=1024, out_features=4096, bias=True)
+ (gelu): QuickGELU()
+ (c_proj): Linear(in_features=4096, out_features=1024, bias=True)
+ )
+ (ln_2): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
+ )
+ )
+ )
+ (ln_post): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
+ )
+ (transformer): Transformer(
+ (resblocks): ModuleList(
+ (0-11): 12 x ResidualAttentionBlock(
+ (attn): MultiheadAttention(
+ (out_proj): NonDynamicallyQuantizableLinear(in_features=768, out_features=768, bias=True)
+ )
+ (ln_1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
+ (mlp): Sequential(
+ (c_fc): Linear(in_features=768, out_features=3072, bias=True)
+ (gelu): QuickGELU()
+ (c_proj): Linear(in_features=3072, out_features=768, bias=True)
+ )
+ (ln_2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
+ )
+ )
+ )
+ (token_embedding): Embedding(49408, 768)
+ (ln_final): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
+ )
+ 2024-11-26,13:26:55 | INFO | Params:
+ 2024-11-26,13:26:55 | INFO | batch_size: 64
+ 2024-11-26,13:26:55 | INFO | beta1: 0.9
+ 2024-11-26,13:26:55 | INFO | beta2: 0.98
+ 2024-11-26,13:26:55 | INFO | checkpoint_path: data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-13_26_45-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp/checkpoints
+ 2024-11-26,13:26:55 | INFO | copy_codebase: False
+ 2024-11-26,13:26:55 | INFO | csv_caption_key: caption
+ 2024-11-26,13:26:55 | INFO | csv_hard_captions_key: neg_caption
+ 2024-11-26,13:26:55 | INFO | csv_img_key: img_path
+ 2024-11-26,13:26:55 | INFO | csv_separator: ,
+ 2024-11-26,13:26:55 | INFO | dataset_resampled: False
+ 2024-11-26,13:26:55 | INFO | dataset_type: csv
+ 2024-11-26,13:26:55 | INFO | ddp_static_graph: False
+ 2024-11-26,13:26:55 | INFO | debug: False
+ 2024-11-26,13:26:55 | INFO | device: cuda:0
+ 2024-11-26,13:26:55 | INFO | dist_backend: nccl
+ 2024-11-26,13:26:55 | INFO | dist_url: env://
+ 2024-11-26,13:26:55 | INFO | distributed: True
+ 2024-11-26,13:26:55 | INFO | epochs: 2
+ 2024-11-26,13:26:55 | INFO | eps: 1e-06
+ 2024-11-26,13:26:55 | INFO | force_quick_gelu: True
+ 2024-11-26,13:26:55 | INFO | gather_with_grad: False
+ 2024-11-26,13:26:55 | INFO | grad_checkpointing: False
+ 2024-11-26,13:26:55 | INFO | horovod: False
+ 2024-11-26,13:26:55 | INFO | imagenet_v2: None
+ 2024-11-26,13:26:55 | INFO | imagenet_val: None
+ 2024-11-26,13:26:55 | INFO | local_loss: False
+ 2024-11-26,13:26:55 | INFO | local_rank: 0
+ 2024-11-26,13:26:55 | INFO | lock_image: False
+ 2024-11-26,13:26:55 | INFO | lock_image_freeze_bn_stats: False
+ 2024-11-26,13:26:55 | INFO | lock_image_unlocked_groups: 0
+ 2024-11-26,13:26:55 | INFO | log_level: 20
+ 2024-11-26,13:26:55 | INFO | log_local: False
+ 2024-11-26,13:26:55 | INFO | log_path: data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-13_26_45-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp/out.log
+ 2024-11-26,13:26:55 | INFO | logs: data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2
+ 2024-11-26,13:26:55 | INFO | lr: 1e-06
+ 2024-11-26,13:26:55 | INFO | model: ViT-L-14-336
+ 2024-11-26,13:26:55 | INFO | name: 2024_11_26-13_26_45-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp
+ 2024-11-26,13:26:55 | INFO | no_set_device_rank: False
+ 2024-11-26,13:26:55 | INFO | norm_gradient_clip: None
+ 2024-11-26,13:26:55 | INFO | precision: amp
+ 2024-11-26,13:26:55 | INFO | pretrained: data/openclip-vit-14-336/openclip_model.pt
+ 2024-11-26,13:26:55 | INFO | pretrained_image: False
+ 2024-11-26,13:26:55 | INFO | rank: 0
+ 2024-11-26,13:26:55 | INFO | report_to: wandb
+ 2024-11-26,13:26:55 | INFO | resume: None
+ 2024-11-26,13:26:55 | INFO | save_frequency: 1
+ 2024-11-26,13:26:55 | INFO | save_most_recent: False
+ 2024-11-26,13:26:55 | INFO | seed: 0
+ 2024-11-26,13:26:55 | INFO | skip_scheduler: False
+ 2024-11-26,13:26:55 | INFO | tensorboard: False
+ 2024-11-26,13:26:55 | INFO | tensorboard_path:
+ 2024-11-26,13:26:55 | INFO | torchscript: False
+ 2024-11-26,13:26:55 | INFO | trace: False
+ 2024-11-26,13:26:55 | INFO | train_data: csv_data/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2.csv
+ 2024-11-26,13:26:55 | INFO | train_num_samples: None
+ 2024-11-26,13:26:55 | INFO | use_bn_sync: False
+ 2024-11-26,13:26:55 | INFO | val_data: None
+ 2024-11-26,13:26:55 | INFO | val_frequency: 1
+ 2024-11-26,13:26:55 | INFO | val_num_samples: None
+ 2024-11-26,13:26:55 | INFO | wandb: True
+ 2024-11-26,13:26:55 | INFO | wandb_notes:
+ 2024-11-26,13:26:55 | INFO | wandb_project: neg-clip-plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2
+ 2024-11-26,13:26:55 | INFO | warmup: 0
+ 2024-11-26,13:26:55 | INFO | wd: 0.1
+ 2024-11-26,13:26:55 | INFO | workers: 4
+ 2024-11-26,13:26:55 | INFO | world_size: 8
+ 2024-11-26,13:26:55 | INFO | zeroshot_frequency: 2
+ 2024-11-26,13:27:49 | INFO | Init a wandb project!
+ 2024-11-26,13:27:59 | INFO | Start epoch 0
+ 2024-11-26,13:28:05 | INFO | Train Epoch: 0 [ 512/10637090 (0%)] Loss: 6.0211 (6.021) Data (t): 2.675 Batch (t): 6.225, 82.2449/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,13:29:36 | INFO | Train Epoch: 0 [ 51712/10637090 (0%)] Loss: 2.6505 (4.336) Data (t): 0.001 Batch (t): 0.908, 568.068/s LR: 0.000001 Logit Scale: 99.998 - V4
+ 2024-11-26,13:31:06 | INFO | Train Epoch: 0 [ 102912/10637090 (1%)] Loss: 2.3597 (3.677) Data (t): 0.001 Batch (t): 0.901, 567.151/s LR: 0.000001 Logit Scale: 99.998 - V4
+ 2024-11-26,13:32:38 | INFO | Train Epoch: 0 [ 154112/10637090 (1%)] Loss: 2.0625 (3.273) Data (t): 0.001 Batch (t): 0.916, 571.802/s LR: 0.000001 Logit Scale: 99.998 - V4
+ 2024-11-26,13:34:11 | INFO | Train Epoch: 0 [ 205312/10637090 (2%)] Loss: 1.8719 (2.993) Data (t): 0.001 Batch (t): 0.933, 567.169/s LR: 0.000001 Logit Scale: 99.999 - V4
+ 2024-11-26,13:35:41 | INFO | Train Epoch: 0 [ 256512/10637090 (2%)] Loss: 1.7555 (2.787) Data (t): 0.001 Batch (t): 0.902, 570.080/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,13:37:11 | INFO | Train Epoch: 0 [ 307712/10637090 (3%)] Loss: 1.7410 (2.637) Data (t): 0.001 Batch (t): 0.901, 567.160/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,13:38:42 | INFO | Train Epoch: 0 [ 358912/10637090 (3%)] Loss: 1.7635 (2.528) Data (t): 0.001 Batch (t): 0.901, 569.098/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,13:40:13 | INFO | Train Epoch: 0 [ 410112/10637090 (4%)] Loss: 1.7196 (2.438) Data (t): 0.001 Batch (t): 0.909, 567.021/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,13:41:48 | INFO | Train Epoch: 0 [ 461312/10637090 (4%)] Loss: 1.9196 (2.386) Data (t): 0.001 Batch (t): 0.950, 570.077/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,13:43:18 | INFO | Train Epoch: 0 [ 512512/10637090 (5%)] Loss: 1.6487 (2.319) Data (t): 0.001 Batch (t): 0.901, 567.572/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,13:44:48 | INFO | Train Epoch: 0 [ 563712/10637090 (5%)] Loss: 1.6103 (2.260) Data (t): 0.001 Batch (t): 0.902, 569.064/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,13:46:18 | INFO | Train Epoch: 0 [ 614912/10637090 (6%)] Loss: 1.5905 (2.209) Data (t): 0.001 Batch (t): 0.901, 569.221/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,13:47:48 | INFO | Train Epoch: 0 [ 666112/10637090 (6%)] Loss: 1.5883 (2.164) Data (t): 0.001 Batch (t): 0.901, 570.760/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,13:49:24 | INFO | Train Epoch: 0 [ 717312/10637090 (7%)] Loss: 1.4338 (2.116) Data (t): 0.001 Batch (t): 0.957, 570.627/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,13:50:54 | INFO | Train Epoch: 0 [ 768512/10637090 (7%)] Loss: 1.6144 (2.084) Data (t): 0.001 Batch (t): 0.900, 569.146/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,13:52:24 | INFO | Train Epoch: 0 [ 819712/10637090 (8%)] Loss: 1.6176 (2.057) Data (t): 0.001 Batch (t): 0.902, 568.243/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,13:53:54 | INFO | Train Epoch: 0 [ 870912/10637090 (8%)] Loss: 1.7019 (2.037) Data (t): 0.001 Batch (t): 0.900, 570.904/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,13:55:24 | INFO | Train Epoch: 0 [ 922112/10637090 (9%)] Loss: 1.6732 (2.018) Data (t): 0.001 Batch (t): 0.900, 569.139/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,13:56:59 | INFO | Train Epoch: 0 [ 973312/10637090 (9%)] Loss: 1.5010 (1.992) Data (t): 0.001 Batch (t): 0.954, 569.706/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,13:58:29 | INFO | Train Epoch: 0 [ 1024512/10637090 (10%)] Loss: 1.5747 (1.972) Data (t): 0.001 Batch (t): 0.899, 568.905/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,13:59:59 | INFO | Train Epoch: 0 [ 1075712/10637090 (10%)] Loss: 1.5527 (1.953) Data (t): 0.001 Batch (t): 0.899, 569.620/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:01:29 | INFO | Train Epoch: 0 [ 1126912/10637090 (11%)] Loss: 1.6238 (1.939) Data (t): 0.001 Batch (t): 0.900, 567.832/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:02:59 | INFO | Train Epoch: 0 [ 1178112/10637090 (11%)] Loss: 1.3380 (1.914) Data (t): 0.001 Batch (t): 0.900, 570.975/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:04:34 | INFO | Train Epoch: 0 [ 1229312/10637090 (12%)] Loss: 1.2799 (1.889) Data (t): 0.001 Batch (t): 0.950, 327.981/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:06:06 | INFO | Train Epoch: 0 [ 1280512/10637090 (12%)] Loss: 1.3642 (1.868) Data (t): 0.001 Batch (t): 0.914, 569.657/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:07:36 | INFO | Train Epoch: 0 [ 1331712/10637090 (13%)] Loss: 1.5145 (1.855) Data (t): 0.001 Batch (t): 0.901, 567.738/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:09:06 | INFO | Train Epoch: 0 [ 1382912/10637090 (13%)] Loss: 1.6791 (1.849) Data (t): 0.001 Batch (t): 0.901, 569.797/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:10:36 | INFO | Train Epoch: 0 [ 1434112/10637090 (13%)] Loss: 1.4249 (1.834) Data (t): 0.001 Batch (t): 0.901, 568.832/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:12:07 | INFO | Train Epoch: 0 [ 1485312/10637090 (14%)] Loss: 1.4562 (1.822) Data (t): 0.001 Batch (t): 0.910, 566.126/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:13:42 | INFO | Train Epoch: 0 [ 1536512/10637090 (14%)] Loss: 1.4888 (1.811) Data (t): 0.001 Batch (t): 0.945, 564.916/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:15:12 | INFO | Train Epoch: 0 [ 1587712/10637090 (15%)] Loss: 1.4355 (1.799) Data (t): 0.001 Batch (t): 0.901, 570.004/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:16:42 | INFO | Train Epoch: 0 [ 1638912/10637090 (15%)] Loss: 1.4400 (1.788) Data (t): 0.001 Batch (t): 0.900, 568.622/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:18:12 | INFO | Train Epoch: 0 [ 1690112/10637090 (16%)] Loss: 1.2937 (1.774) Data (t): 0.001 Batch (t): 0.901, 569.205/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:19:43 | INFO | Train Epoch: 0 [ 1741312/10637090 (16%)] Loss: 1.4636 (1.765) Data (t): 0.001 Batch (t): 0.909, 271.898/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:21:18 | INFO | Train Epoch: 0 [ 1792512/10637090 (17%)] Loss: 1.3336 (1.753) Data (t): 0.001 Batch (t): 0.948, 571.164/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:22:48 | INFO | Train Epoch: 0 [ 1843712/10637090 (17%)] Loss: 1.4242 (1.744) Data (t): 0.001 Batch (t): 0.900, 569.579/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:24:17 | INFO | Train Epoch: 0 [ 1894912/10637090 (18%)] Loss: 1.2935 (1.732) Data (t): 0.001 Batch (t): 0.899, 569.778/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:25:47 | INFO | Train Epoch: 0 [ 1946112/10637090 (18%)] Loss: 1.3457 (1.722) Data (t): 0.001 Batch (t): 0.899, 570.664/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:27:17 | INFO | Train Epoch: 0 [ 1997312/10637090 (19%)] Loss: 1.4240 (1.715) Data (t): 0.001 Batch (t): 0.898, 566.842/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:28:53 | INFO | Train Epoch: 0 [ 2048512/10637090 (19%)] Loss: 1.4699 (1.709) Data (t): 0.001 Batch (t): 0.958, 571.469/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:30:23 | INFO | Train Epoch: 0 [ 2099712/10637090 (20%)] Loss: 1.5219 (1.704) Data (t): 0.001 Batch (t): 0.898, 569.506/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:31:53 | INFO | Train Epoch: 0 [ 2150912/10637090 (20%)] Loss: 1.3657 (1.697) Data (t): 0.001 Batch (t): 0.898, 569.664/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:33:22 | INFO | Train Epoch: 0 [ 2202112/10637090 (21%)] Loss: 1.3388 (1.688) Data (t): 0.001 Batch (t): 0.899, 570.094/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:34:52 | INFO | Train Epoch: 0 [ 2253312/10637090 (21%)] Loss: 1.4835 (1.684) Data (t): 0.001 Batch (t): 0.898, 568.711/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:36:28 | INFO | Train Epoch: 0 [ 2304512/10637090 (22%)] Loss: 1.4458 (1.679) Data (t): 0.001 Batch (t): 0.958, 327.824/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:37:58 | INFO | Train Epoch: 0 [ 2355712/10637090 (22%)] Loss: 1.4833 (1.675) Data (t): 0.001 Batch (t): 0.899, 571.156/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:39:28 | INFO | Train Epoch: 0 [ 2406912/10637090 (23%)] Loss: 1.5002 (1.671) Data (t): 0.001 Batch (t): 0.899, 571.241/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:40:58 | INFO | Train Epoch: 0 [ 2458112/10637090 (23%)] Loss: 1.5528 (1.669) Data (t): 0.001 Batch (t): 0.898, 568.659/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:42:28 | INFO | Train Epoch: 0 [ 2509312/10637090 (24%)] Loss: 1.4062 (1.663) Data (t): 0.001 Batch (t): 0.898, 571.460/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:44:00 | INFO | Train Epoch: 0 [ 2560512/10637090 (24%)] Loss: 1.4364 (1.659) Data (t): 0.001 Batch (t): 0.927, 571.364/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:45:33 | INFO | Train Epoch: 0 [ 2611712/10637090 (25%)] Loss: 1.2963 (1.652) Data (t): 0.001 Batch (t): 0.926, 570.582/s LR: 0.000001 Logit Scale: 99.999 - V4
+ 2024-11-26,14:47:03 | INFO | Train Epoch: 0 [ 2662912/10637090 (25%)] Loss: 1.2933 (1.645) Data (t): 0.001 Batch (t): 0.898, 569.676/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:48:33 | INFO | Train Epoch: 0 [ 2714112/10637090 (26%)] Loss: 1.2771 (1.638) Data (t): 0.001 Batch (t): 0.898, 569.279/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:50:02 | INFO | Train Epoch: 0 [ 2765312/10637090 (26%)] Loss: 1.3485 (1.633) Data (t): 0.001 Batch (t): 0.898, 568.489/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:51:34 | INFO | Train Epoch: 0 [ 2816512/10637090 (26%)] Loss: 1.2289 (1.626) Data (t): 0.001 Batch (t): 0.914, 572.484/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:53:08 | INFO | Train Epoch: 0 [ 2867712/10637090 (27%)] Loss: 1.4018 (1.622) Data (t): 0.001 Batch (t): 0.941, 572.417/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:54:38 | INFO | Train Epoch: 0 [ 2918912/10637090 (27%)] Loss: 1.3707 (1.618) Data (t): 0.001 Batch (t): 0.897, 571.299/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:56:07 | INFO | Train Epoch: 0 [ 2970112/10637090 (28%)] Loss: 1.3077 (1.612) Data (t): 0.001 Batch (t): 0.898, 569.293/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:57:37 | INFO | Train Epoch: 0 [ 3021312/10637090 (28%)] Loss: 1.3162 (1.607) Data (t): 0.001 Batch (t): 0.897, 569.669/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,14:59:07 | INFO | Train Epoch: 0 [ 3072512/10637090 (29%)] Loss: 1.1386 (1.600) Data (t): 0.001 Batch (t): 0.898, 570.275/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:00:43 | INFO | Train Epoch: 0 [ 3123712/10637090 (29%)] Loss: 1.3472 (1.596) Data (t): 0.001 Batch (t): 0.958, 570.992/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:02:12 | INFO | Train Epoch: 0 [ 3174912/10637090 (30%)] Loss: 1.3549 (1.592) Data (t): 0.001 Batch (t): 0.897, 571.916/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:03:42 | INFO | Train Epoch: 0 [ 3226112/10637090 (30%)] Loss: 1.3102 (1.587) Data (t): 0.001 Batch (t): 0.898, 569.407/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:05:12 | INFO | Train Epoch: 0 [ 3277312/10637090 (31%)] Loss: 1.2370 (1.582) Data (t): 0.001 Batch (t): 0.899, 570.822/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:06:42 | INFO | Train Epoch: 0 [ 3328512/10637090 (31%)] Loss: 1.3859 (1.579) Data (t): 0.001 Batch (t): 0.898, 571.059/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:08:16 | INFO | Train Epoch: 0 [ 3379712/10637090 (32%)] Loss: 1.3788 (1.576) Data (t): 0.001 Batch (t): 0.945, 569.700/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:09:47 | INFO | Train Epoch: 0 [ 3430912/10637090 (32%)] Loss: 1.2963 (1.572) Data (t): 0.001 Batch (t): 0.905, 571.613/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:11:17 | INFO | Train Epoch: 0 [ 3482112/10637090 (33%)] Loss: 1.3762 (1.569) Data (t): 0.001 Batch (t): 0.899, 568.268/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:12:47 | INFO | Train Epoch: 0 [ 3533312/10637090 (33%)] Loss: 1.3043 (1.565) Data (t): 0.001 Batch (t): 0.898, 569.454/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:14:16 | INFO | Train Epoch: 0 [ 3584512/10637090 (34%)] Loss: 1.3195 (1.562) Data (t): 0.001 Batch (t): 0.898, 571.086/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:15:50 | INFO | Train Epoch: 0 [ 3635712/10637090 (34%)] Loss: 1.3561 (1.559) Data (t): 0.001 Batch (t): 0.939, 571.159/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:17:22 | INFO | Train Epoch: 0 [ 3686912/10637090 (35%)] Loss: 1.5171 (1.558) Data (t): 0.001 Batch (t): 0.918, 569.509/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:18:52 | INFO | Train Epoch: 0 [ 3738112/10637090 (35%)] Loss: 1.2799 (1.555) Data (t): 0.001 Batch (t): 0.899, 571.044/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:20:22 | INFO | Train Epoch: 0 [ 3789312/10637090 (36%)] Loss: 1.1552 (1.549) Data (t): 0.001 Batch (t): 0.898, 569.819/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:21:52 | INFO | Train Epoch: 0 [ 3840512/10637090 (36%)] Loss: 1.3273 (1.546) Data (t): 0.001 Batch (t): 0.899, 568.020/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:23:24 | INFO | Train Epoch: 0 [ 3891712/10637090 (37%)] Loss: 1.2578 (1.543) Data (t): 0.001 Batch (t): 0.922, 571.115/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:24:58 | INFO | Train Epoch: 0 [ 3942912/10637090 (37%)] Loss: 1.2760 (1.539) Data (t): 0.001 Batch (t): 0.936, 570.522/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:26:27 | INFO | Train Epoch: 0 [ 3994112/10637090 (38%)] Loss: 1.2114 (1.535) Data (t): 0.001 Batch (t): 0.898, 571.387/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:27:57 | INFO | Train Epoch: 0 [ 4045312/10637090 (38%)] Loss: 1.2218 (1.531) Data (t): 0.001 Batch (t): 0.899, 568.116/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:29:27 | INFO | Train Epoch: 0 [ 4096512/10637090 (39%)] Loss: 1.2275 (1.527) Data (t): 0.001 Batch (t): 0.901, 569.249/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:30:58 | INFO | Train Epoch: 0 [ 4147712/10637090 (39%)] Loss: 1.2819 (1.524) Data (t): 0.001 Batch (t): 0.910, 570.455/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:32:33 | INFO | Train Epoch: 0 [ 4198912/10637090 (39%)] Loss: 1.2112 (1.521) Data (t): 0.001 Batch (t): 0.949, 568.145/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:34:03 | INFO | Train Epoch: 0 [ 4250112/10637090 (40%)] Loss: 1.3589 (1.519) Data (t): 0.001 Batch (t): 0.899, 570.476/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:35:33 | INFO | Train Epoch: 0 [ 4301312/10637090 (40%)] Loss: 1.4017 (1.517) Data (t): 0.001 Batch (t): 0.899, 571.676/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:37:03 | INFO | Train Epoch: 0 [ 4352512/10637090 (41%)] Loss: 1.2269 (1.514) Data (t): 0.001 Batch (t): 0.899, 568.670/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:38:33 | INFO | Train Epoch: 0 [ 4403712/10637090 (41%)] Loss: 1.2209 (1.511) Data (t): 0.001 Batch (t): 0.899, 570.317/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:40:08 | INFO | Train Epoch: 0 [ 4454912/10637090 (42%)] Loss: 1.2596 (1.508) Data (t): 0.001 Batch (t): 0.953, 571.961/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:41:39 | INFO | Train Epoch: 0 [ 4506112/10637090 (42%)] Loss: 1.2666 (1.505) Data (t): 0.001 Batch (t): 0.904, 570.514/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:43:08 | INFO | Train Epoch: 0 [ 4557312/10637090 (43%)] Loss: 1.3623 (1.503) Data (t): 0.001 Batch (t): 0.898, 569.718/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:44:38 | INFO | Train Epoch: 0 [ 4608512/10637090 (43%)] Loss: 1.1735 (1.500) Data (t): 0.001 Batch (t): 0.898, 570.883/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:46:08 | INFO | Train Epoch: 0 [ 4659712/10637090 (44%)] Loss: 1.3254 (1.498) Data (t): 0.001 Batch (t): 0.898, 571.044/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:47:42 | INFO | Train Epoch: 0 [ 4710912/10637090 (44%)] Loss: 1.3611 (1.496) Data (t): 0.001 Batch (t): 0.939, 570.032/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:49:14 | INFO | Train Epoch: 0 [ 4762112/10637090 (45%)] Loss: 1.4570 (1.496) Data (t): 0.001 Batch (t): 0.918, 569.304/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:50:44 | INFO | Train Epoch: 0 [ 4813312/10637090 (45%)] Loss: 1.2662 (1.494) Data (t): 0.001 Batch (t): 0.899, 568.471/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:52:14 | INFO | Train Epoch: 0 [ 4864512/10637090 (46%)] Loss: 1.2966 (1.492) Data (t): 0.001 Batch (t): 0.899, 569.787/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:53:44 | INFO | Train Epoch: 0 [ 4915712/10637090 (46%)] Loss: 1.4136 (1.491) Data (t): 0.001 Batch (t): 0.900, 570.394/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:55:17 | INFO | Train Epoch: 0 [ 4966912/10637090 (47%)] Loss: 1.2369 (1.488) Data (t): 0.001 Batch (t): 0.930, 569.732/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:56:50 | INFO | Train Epoch: 0 [ 5018112/10637090 (47%)] Loss: 1.2626 (1.486) Data (t): 0.001 Batch (t): 0.930, 566.710/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:58:20 | INFO | Train Epoch: 0 [ 5069312/10637090 (48%)] Loss: 1.1438 (1.482) Data (t): 0.001 Batch (t): 0.900, 569.983/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,15:59:50 | INFO | Train Epoch: 0 [ 5120512/10637090 (48%)] Loss: 1.3868 (1.482) Data (t): 0.001 Batch (t): 0.899, 570.512/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:01:20 | INFO | Train Epoch: 0 [ 5171712/10637090 (49%)] Loss: 1.3372 (1.480) Data (t): 0.001 Batch (t): 0.899, 572.159/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:02:52 | INFO | Train Epoch: 0 [ 5222912/10637090 (49%)] Loss: 1.1923 (1.477) Data (t): 0.001 Batch (t): 0.924, 567.108/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:04:26 | INFO | Train Epoch: 0 [ 5274112/10637090 (50%)] Loss: 1.2317 (1.475) Data (t): 0.001 Batch (t): 0.938, 570.208/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:05:56 | INFO | Train Epoch: 0 [ 5325312/10637090 (50%)] Loss: 1.2536 (1.473) Data (t): 0.001 Batch (t): 0.899, 569.669/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:07:26 | INFO | Train Epoch: 0 [ 5376512/10637090 (51%)] Loss: 1.2602 (1.471) Data (t): 0.001 Batch (t): 0.899, 568.815/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:08:56 | INFO | Train Epoch: 0 [ 5427712/10637090 (51%)] Loss: 1.2263 (1.469) Data (t): 0.001 Batch (t): 0.900, 570.632/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:10:27 | INFO | Train Epoch: 0 [ 5478912/10637090 (52%)] Loss: 1.2208 (1.466) Data (t): 0.001 Batch (t): 0.910, 569.131/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:12:02 | INFO | Train Epoch: 0 [ 5530112/10637090 (52%)] Loss: 1.3252 (1.465) Data (t): 0.001 Batch (t): 0.951, 569.417/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:13:32 | INFO | Train Epoch: 0 [ 5581312/10637090 (52%)] Loss: 1.3377 (1.464) Data (t): 0.001 Batch (t): 0.900, 568.483/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:15:02 | INFO | Train Epoch: 0 [ 5632512/10637090 (53%)] Loss: 1.3099 (1.462) Data (t): 0.001 Batch (t): 0.900, 569.297/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:16:32 | INFO | Train Epoch: 0 [ 5683712/10637090 (53%)] Loss: 1.2511 (1.461) Data (t): 0.001 Batch (t): 0.900, 570.130/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:18:02 | INFO | Train Epoch: 0 [ 5734912/10637090 (54%)] Loss: 1.2541 (1.459) Data (t): 0.001 Batch (t): 0.900, 568.688/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:19:37 | INFO | Train Epoch: 0 [ 5786112/10637090 (54%)] Loss: 1.1981 (1.456) Data (t): 0.001 Batch (t): 0.949, 568.508/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:21:08 | INFO | Train Epoch: 0 [ 5837312/10637090 (55%)] Loss: 1.2915 (1.455) Data (t): 0.001 Batch (t): 0.914, 569.130/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:22:38 | INFO | Train Epoch: 0 [ 5888512/10637090 (55%)] Loss: 1.2872 (1.454) Data (t): 0.001 Batch (t): 0.900, 569.196/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:24:08 | INFO | Train Epoch: 0 [ 5939712/10637090 (56%)] Loss: 1.2329 (1.452) Data (t): 0.001 Batch (t): 0.901, 566.003/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:25:38 | INFO | Train Epoch: 0 [ 5990912/10637090 (56%)] Loss: 1.2263 (1.450) Data (t): 0.001 Batch (t): 0.900, 569.279/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:27:11 | INFO | Train Epoch: 0 [ 6042112/10637090 (57%)] Loss: 1.3165 (1.449) Data (t): 0.001 Batch (t): 0.930, 570.634/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:28:44 | INFO | Train Epoch: 0 [ 6093312/10637090 (57%)] Loss: 1.2909 (1.447) Data (t): 0.001 Batch (t): 0.930, 570.690/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:30:14 | INFO | Train Epoch: 0 [ 6144512/10637090 (58%)] Loss: 1.1370 (1.445) Data (t): 0.001 Batch (t): 0.899, 569.606/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:31:44 | INFO | Train Epoch: 0 [ 6195712/10637090 (58%)] Loss: 1.3181 (1.444) Data (t): 0.001 Batch (t): 0.900, 571.386/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-26,16:33:14 | INFO | Train Epoch: 0 [ 6246912/10637090 (59%)] Loss: 1.2128 (1.442) Data (t): 0.001 Batch (t): 0.899, 566.859/s LR: 0.000001 Logit Scale: 100.000 - V4
239
+ 2024-11-26,16:34:46 | INFO | Train Epoch: 0 [ 6298112/10637090 (59%)] Loss: 1.1429 (1.439) Data (t): 0.001 Batch (t): 0.922, 568.686/s LR: 0.000001 Logit Scale: 100.000 - V4
240
+ 2024-11-26,16:36:20 | INFO | Train Epoch: 0 [ 6349312/10637090 (60%)] Loss: 1.2498 (1.438) Data (t): 0.001 Batch (t): 0.938, 568.535/s LR: 0.000001 Logit Scale: 100.000 - V4
241
+ 2024-11-26,16:37:50 | INFO | Train Epoch: 0 [ 6400512/10637090 (60%)] Loss: 1.2077 (1.436) Data (t): 0.001 Batch (t): 0.900, 569.745/s LR: 0.000001 Logit Scale: 100.000 - V4
242
+ 2024-11-26,16:39:20 | INFO | Train Epoch: 0 [ 6451712/10637090 (61%)] Loss: 1.2972 (1.435) Data (t): 0.001 Batch (t): 0.900, 571.371/s LR: 0.000001 Logit Scale: 100.000 - V4
243
+ 2024-11-26,16:40:50 | INFO | Train Epoch: 0 [ 6502912/10637090 (61%)] Loss: 1.1936 (1.433) Data (t): 0.001 Batch (t): 0.899, 569.002/s LR: 0.000001 Logit Scale: 100.000 - V4
244
+ 2024-11-26,16:42:22 | INFO | Train Epoch: 0 [ 6554112/10637090 (62%)] Loss: 1.2448 (1.432) Data (t): 0.001 Batch (t): 0.924, 570.664/s LR: 0.000001 Logit Scale: 100.000 - V4
245
+ 2024-11-26,16:43:56 | INFO | Train Epoch: 0 [ 6605312/10637090 (62%)] Loss: 1.1709 (1.430) Data (t): 0.001 Batch (t): 0.936, 570.478/s LR: 0.000001 Logit Scale: 100.000 - V4
246
+ 2024-11-26,16:45:26 | INFO | Train Epoch: 0 [ 6656512/10637090 (63%)] Loss: 1.2946 (1.429) Data (t): 0.001 Batch (t): 0.899, 569.146/s LR: 0.000001 Logit Scale: 100.000 - V4
247
+ 2024-11-26,16:46:56 | INFO | Train Epoch: 0 [ 6707712/10637090 (63%)] Loss: 1.3742 (1.428) Data (t): 0.001 Batch (t): 0.900, 568.575/s LR: 0.000001 Logit Scale: 100.000 - V4
248
+ 2024-11-26,16:48:26 | INFO | Train Epoch: 0 [ 6758912/10637090 (64%)] Loss: 1.1728 (1.426) Data (t): 0.001 Batch (t): 0.899, 569.071/s LR: 0.000001 Logit Scale: 100.000 - V4
249
+ 2024-11-26,16:49:57 | INFO | Train Epoch: 0 [ 6810112/10637090 (64%)] Loss: 1.2177 (1.425) Data (t): 0.001 Batch (t): 0.910, 568.539/s LR: 0.000001 Logit Scale: 100.000 - V4
250
+ 2024-11-26,16:51:31 | INFO | Train Epoch: 0 [ 6861312/10637090 (65%)] Loss: 1.2139 (1.423) Data (t): 0.001 Batch (t): 0.945, 568.533/s LR: 0.000001 Logit Scale: 100.000 - V4
251
+ 2024-11-26,16:53:02 | INFO | Train Epoch: 0 [ 6912512/10637090 (65%)] Loss: 1.2024 (1.422) Data (t): 0.001 Batch (t): 0.906, 570.082/s LR: 0.000001 Logit Scale: 100.000 - V4
252
+ 2024-11-26,16:54:32 | INFO | Train Epoch: 0 [ 6963712/10637090 (65%)] Loss: 1.2551 (1.420) Data (t): 0.001 Batch (t): 0.900, 567.906/s LR: 0.000001 Logit Scale: 100.000 - V4
253
+ 2024-11-26,16:56:02 | INFO | Train Epoch: 0 [ 7014912/10637090 (66%)] Loss: 1.4345 (1.420) Data (t): 0.001 Batch (t): 0.899, 570.889/s LR: 0.000001 Logit Scale: 100.000 - V4
254
+ 2024-11-26,16:57:32 | INFO | Train Epoch: 0 [ 7066112/10637090 (66%)] Loss: 1.1707 (1.419) Data (t): 0.001 Batch (t): 0.899, 567.717/s LR: 0.000001 Logit Scale: 100.000 - V4
255
+ 2024-11-26,16:59:06 | INFO | Train Epoch: 0 [ 7117312/10637090 (67%)] Loss: 1.2277 (1.417) Data (t): 0.001 Batch (t): 0.941, 569.467/s LR: 0.000001 Logit Scale: 100.000 - V4
256
+ 2024-11-26,17:00:38 | INFO | Train Epoch: 0 [ 7168512/10637090 (67%)] Loss: 1.1800 (1.416) Data (t): 0.001 Batch (t): 0.921, 569.605/s LR: 0.000001 Logit Scale: 100.000 - V4
257
+ 2024-11-26,17:02:08 | INFO | Train Epoch: 0 [ 7219712/10637090 (68%)] Loss: 1.1229 (1.414) Data (t): 0.001 Batch (t): 0.900, 569.514/s LR: 0.000001 Logit Scale: 100.000 - V4
258
+ 2024-11-26,17:03:38 | INFO | Train Epoch: 0 [ 7270912/10637090 (68%)] Loss: 1.3891 (1.413) Data (t): 0.001 Batch (t): 0.900, 569.009/s LR: 0.000001 Logit Scale: 100.000 - V4
259
+ 2024-11-26,17:05:08 | INFO | Train Epoch: 0 [ 7322112/10637090 (69%)] Loss: 1.0706 (1.411) Data (t): 0.001 Batch (t): 0.900, 566.596/s LR: 0.000001 Logit Scale: 100.000 - V4
260
+ 2024-11-26,17:06:40 | INFO | Train Epoch: 0 [ 7373312/10637090 (69%)] Loss: 1.3101 (1.410) Data (t): 0.001 Batch (t): 0.923, 571.396/s LR: 0.000001 Logit Scale: 100.000 - V4
261
+ 2024-11-26,17:08:13 | INFO | Train Epoch: 0 [ 7424512/10637090 (70%)] Loss: 1.1021 (1.408) Data (t): 0.001 Batch (t): 0.930, 570.699/s LR: 0.000001 Logit Scale: 100.000 - V4
262
+ 2024-11-26,17:09:43 | INFO | Train Epoch: 0 [ 7475712/10637090 (70%)] Loss: 1.1971 (1.407) Data (t): 0.001 Batch (t): 0.899, 569.514/s LR: 0.000001 Logit Scale: 100.000 - V4
263
+ 2024-11-26,17:11:13 | INFO | Train Epoch: 0 [ 7526912/10637090 (71%)] Loss: 1.0900 (1.405) Data (t): 0.001 Batch (t): 0.899, 568.834/s LR: 0.000001 Logit Scale: 100.000 - V4
264
+ 2024-11-26,17:12:43 | INFO | Train Epoch: 0 [ 7578112/10637090 (71%)] Loss: 1.2845 (1.404) Data (t): 0.001 Batch (t): 0.900, 569.436/s LR: 0.000001 Logit Scale: 100.000 - V4
265
+ 2024-11-26,17:14:15 | INFO | Train Epoch: 0 [ 7629312/10637090 (72%)] Loss: 1.0373 (1.401) Data (t): 0.001 Batch (t): 0.924, 570.141/s LR: 0.000001 Logit Scale: 100.000 - V4
266
+ 2024-11-26,17:15:49 | INFO | Train Epoch: 0 [ 7680512/10637090 (72%)] Loss: 1.1573 (1.400) Data (t): 0.001 Batch (t): 0.937, 571.663/s LR: 0.000001 Logit Scale: 100.000 - V4
267
+ 2024-11-26,17:17:19 | INFO | Train Epoch: 0 [ 7731712/10637090 (73%)] Loss: 1.2276 (1.399) Data (t): 0.001 Batch (t): 0.898, 571.793/s LR: 0.000001 Logit Scale: 100.000 - V4
268
+ 2024-11-26,17:18:49 | INFO | Train Epoch: 0 [ 7782912/10637090 (73%)] Loss: 1.2173 (1.397) Data (t): 0.001 Batch (t): 0.900, 569.486/s LR: 0.000001 Logit Scale: 100.000 - V4
269
+ 2024-11-26,17:20:19 | INFO | Train Epoch: 0 [ 7834112/10637090 (74%)] Loss: 1.1969 (1.396) Data (t): 0.001 Batch (t): 0.899, 567.732/s LR: 0.000001 Logit Scale: 100.000 - V4
270
+ 2024-11-26,17:21:51 | INFO | Train Epoch: 0 [ 7885312/10637090 (74%)] Loss: 1.3347 (1.396) Data (t): 0.001 Batch (t): 0.926, 568.903/s LR: 0.000001 Logit Scale: 100.000 - V4
271
+ 2024-11-26,17:23:24 | INFO | Train Epoch: 0 [ 7936512/10637090 (75%)] Loss: 1.2011 (1.394) Data (t): 0.001 Batch (t): 0.932, 571.177/s LR: 0.000001 Logit Scale: 100.000 - V4
272
+ 2024-11-26,17:24:55 | INFO | Train Epoch: 0 [ 7987712/10637090 (75%)] Loss: 1.2774 (1.394) Data (t): 0.001 Batch (t): 0.907, 569.049/s LR: 0.000001 Logit Scale: 100.000 - V4
273
+ 2024-11-26,17:26:25 | INFO | Train Epoch: 0 [ 8038912/10637090 (76%)] Loss: 1.0803 (1.392) Data (t): 0.001 Batch (t): 0.900, 571.377/s LR: 0.000001 Logit Scale: 100.000 - V4
274
+ 2024-11-26,17:27:55 | INFO | Train Epoch: 0 [ 8090112/10637090 (76%)] Loss: 1.2829 (1.391) Data (t): 0.001 Batch (t): 0.901, 568.095/s LR: 0.000001 Logit Scale: 100.000 - V4
275
+ 2024-11-26,17:29:25 | INFO | Train Epoch: 0 [ 8141312/10637090 (77%)] Loss: 1.1095 (1.389) Data (t): 0.001 Batch (t): 0.899, 569.986/s LR: 0.000001 Logit Scale: 100.000 - V4
276
+ 2024-11-26,17:31:00 | INFO | Train Epoch: 0 [ 8192512/10637090 (77%)] Loss: 1.2348 (1.388) Data (t): 0.001 Batch (t): 0.949, 571.927/s LR: 0.000001 Logit Scale: 100.000 - V4
277
+ 2024-11-26,17:32:31 | INFO | Train Epoch: 0 [ 8243712/10637090 (78%)] Loss: 1.1336 (1.387) Data (t): 0.001 Batch (t): 0.913, 570.498/s LR: 0.000001 Logit Scale: 100.000 - V4
278
+ 2024-11-26,17:34:01 | INFO | Train Epoch: 0 [ 8294912/10637090 (78%)] Loss: 1.2583 (1.386) Data (t): 0.001 Batch (t): 0.899, 570.770/s LR: 0.000001 Logit Scale: 100.000 - V4
279
+ 2024-11-26,17:35:31 | INFO | Train Epoch: 0 [ 8346112/10637090 (78%)] Loss: 1.1911 (1.385) Data (t): 0.001 Batch (t): 0.899, 571.924/s LR: 0.000001 Logit Scale: 100.000 - V4
280
+ 2024-11-26,17:37:01 | INFO | Train Epoch: 0 [ 8397312/10637090 (79%)] Loss: 1.1734 (1.383) Data (t): 0.001 Batch (t): 0.899, 565.386/s LR: 0.000001 Logit Scale: 100.000 - V4
281
+ 2024-11-26,17:38:34 | INFO | Train Epoch: 0 [ 8448512/10637090 (79%)] Loss: 1.2354 (1.383) Data (t): 0.001 Batch (t): 0.931, 570.636/s LR: 0.000001 Logit Scale: 100.000 - V4
282
+ 2024-11-26,17:40:07 | INFO | Train Epoch: 0 [ 8499712/10637090 (80%)] Loss: 1.1634 (1.381) Data (t): 0.001 Batch (t): 0.924, 567.707/s LR: 0.000001 Logit Scale: 100.000 - V4
283
+ 2024-11-26,17:41:36 | INFO | Train Epoch: 0 [ 8550912/10637090 (80%)] Loss: 1.2009 (1.380) Data (t): 0.001 Batch (t): 0.898, 569.702/s LR: 0.000001 Logit Scale: 100.000 - V4
284
+ 2024-11-26,17:43:06 | INFO | Train Epoch: 0 [ 8602112/10637090 (81%)] Loss: 1.0738 (1.378) Data (t): 0.001 Batch (t): 0.898, 570.831/s LR: 0.000001 Logit Scale: 100.000 - V4
285
+ 2024-11-26,17:44:36 | INFO | Train Epoch: 0 [ 8653312/10637090 (81%)] Loss: 1.2410 (1.378) Data (t): 0.001 Batch (t): 0.900, 569.181/s LR: 0.000001 Logit Scale: 100.000 - V4
286
+ 2024-11-26,17:46:08 | INFO | Train Epoch: 0 [ 8704512/10637090 (82%)] Loss: 1.3493 (1.377) Data (t): 0.001 Batch (t): 0.916, 571.074/s LR: 0.000001 Logit Scale: 100.000 - V4
287
+ 2024-11-26,17:47:41 | INFO | Train Epoch: 0 [ 8755712/10637090 (82%)] Loss: 1.1247 (1.376) Data (t): 0.001 Batch (t): 0.930, 570.210/s LR: 0.000001 Logit Scale: 100.000 - V4
288
+ 2024-11-26,17:49:11 | INFO | Train Epoch: 0 [ 8806912/10637090 (83%)] Loss: 1.1758 (1.375) Data (t): 0.001 Batch (t): 0.898, 569.424/s LR: 0.000001 Logit Scale: 100.000 - V4
289
+ 2024-11-26,17:50:41 | INFO | Train Epoch: 0 [ 8858112/10637090 (83%)] Loss: 1.4430 (1.375) Data (t): 0.001 Batch (t): 0.899, 570.117/s LR: 0.000001 Logit Scale: 100.000 - V4
290
+ 2024-11-26,17:52:10 | INFO | Train Epoch: 0 [ 8909312/10637090 (84%)] Loss: 1.2844 (1.375) Data (t): 0.001 Batch (t): 0.898, 569.938/s LR: 0.000001 Logit Scale: 100.000 - V4
291
+ 2024-11-26,17:53:43 | INFO | Train Epoch: 0 [ 8960512/10637090 (84%)] Loss: 1.2235 (1.374) Data (t): 0.001 Batch (t): 0.923, 569.088/s LR: 0.000001 Logit Scale: 100.000 - V4
292
+ 2024-11-26,17:55:16 | INFO | Train Epoch: 0 [ 9011712/10637090 (85%)] Loss: 1.3818 (1.374) Data (t): 0.001 Batch (t): 0.930, 570.654/s LR: 0.000001 Logit Scale: 100.000 - V4
293
+ 2024-11-26,17:56:46 | INFO | Train Epoch: 0 [ 9062912/10637090 (85%)] Loss: 1.2917 (1.373) Data (t): 0.001 Batch (t): 0.905, 568.413/s LR: 0.000001 Logit Scale: 100.000 - V4
294
+ 2024-11-26,17:58:16 | INFO | Train Epoch: 0 [ 9114112/10637090 (86%)] Loss: 1.4781 (1.374) Data (t): 0.001 Batch (t): 0.898, 571.441/s LR: 0.000001 Logit Scale: 100.000 - V4
295
+ 2024-11-26,17:59:46 | INFO | Train Epoch: 0 [ 9165312/10637090 (86%)] Loss: 1.1813 (1.373) Data (t): 0.001 Batch (t): 0.898, 572.722/s LR: 0.000001 Logit Scale: 100.000 - V4
296
+ 2024-11-26,18:01:17 | INFO | Train Epoch: 0 [ 9216512/10637090 (87%)] Loss: 1.2649 (1.372) Data (t): 0.001 Batch (t): 0.916, 571.388/s LR: 0.000001 Logit Scale: 100.000 - V4
297
+ 2024-11-26,18:02:51 | INFO | Train Epoch: 0 [ 9267712/10637090 (87%)] Loss: 1.1824 (1.371) Data (t): 0.001 Batch (t): 0.931, 568.484/s LR: 0.000001 Logit Scale: 100.000 - V4
298
+ 2024-11-26,18:04:22 | INFO | Train Epoch: 0 [ 9318912/10637090 (88%)] Loss: 1.1912 (1.370) Data (t): 0.001 Batch (t): 0.914, 568.624/s LR: 0.000001 Logit Scale: 100.000 - V4
299
+ 2024-11-26,18:05:52 | INFO | Train Epoch: 0 [ 9370112/10637090 (88%)] Loss: 1.4411 (1.371) Data (t): 0.001 Batch (t): 0.898, 571.071/s LR: 0.000001 Logit Scale: 100.000 - V4
300
+ 2024-11-26,18:07:21 | INFO | Train Epoch: 0 [ 9421312/10637090 (89%)] Loss: 0.98230 (1.369) Data (t): 0.001 Batch (t): 0.897, 573.989/s LR: 0.000001 Logit Scale: 100.000 - V4
301
+ 2024-11-26,18:08:51 | INFO | Train Epoch: 0 [ 9472512/10637090 (89%)] Loss: 1.1880 (1.368) Data (t): 0.001 Batch (t): 0.899, 572.881/s LR: 0.000001 Logit Scale: 100.000 - V4
302
+ 2024-11-26,18:10:26 | INFO | Train Epoch: 0 [ 9523712/10637090 (90%)] Loss: 1.3178 (1.367) Data (t): 0.001 Batch (t): 0.949, 314.720/s LR: 0.000001 Logit Scale: 100.000 - V4
303
+ 2024-11-26,18:11:57 | INFO | Train Epoch: 0 [ 9574912/10637090 (90%)] Loss: 1.2982 (1.367) Data (t): 0.001 Batch (t): 0.911, 571.868/s LR: 0.000001 Logit Scale: 100.000 - V4
304
+ 2024-11-26,18:13:27 | INFO | Train Epoch: 0 [ 9626112/10637090 (90%)] Loss: 1.2339 (1.366) Data (t): 0.001 Batch (t): 0.898, 570.843/s LR: 0.000001 Logit Scale: 100.000 - V4
305
+ 2024-11-26,18:14:57 | INFO | Train Epoch: 0 [ 9677312/10637090 (91%)] Loss: 1.3044 (1.366) Data (t): 0.001 Batch (t): 0.897, 570.286/s LR: 0.000001 Logit Scale: 100.000 - V4
306
+ 2024-11-26,18:16:27 | INFO | Train Epoch: 0 [ 9728512/10637090 (91%)] Loss: 1.2013 (1.365) Data (t): 0.001 Batch (t): 0.899, 571.575/s LR: 0.000001 Logit Scale: 100.000 - V4
307
+ 2024-11-26,18:18:00 | INFO | Train Epoch: 0 [ 9779712/10637090 (92%)] Loss: 1.1965 (1.364) Data (t): 0.001 Batch (t): 0.930, 573.865/s LR: 0.000001 Logit Scale: 100.000 - V4
308
+ 2024-11-26,18:19:33 | INFO | Train Epoch: 0 [ 9830912/10637090 (92%)] Loss: 1.2956 (1.364) Data (t): 0.001 Batch (t): 0.930, 570.286/s LR: 0.000001 Logit Scale: 100.000 - V4
309
+ 2024-11-26,18:21:02 | INFO | Train Epoch: 0 [ 9882112/10637090 (93%)] Loss: 1.2174 (1.363) Data (t): 0.001 Batch (t): 0.897, 571.233/s LR: 0.000001 Logit Scale: 100.000 - V4
310
+ 2024-11-26,18:22:32 | INFO | Train Epoch: 0 [ 9933312/10637090 (93%)] Loss: 1.2816 (1.363) Data (t): 0.001 Batch (t): 0.898, 570.071/s LR: 0.000001 Logit Scale: 100.000 - V4
311
+ 2024-11-26,18:24:02 | INFO | Train Epoch: 0 [ 9984512/10637090 (94%)] Loss: 1.2450 (1.362) Data (t): 0.001 Batch (t): 0.898, 570.906/s LR: 0.000001 Logit Scale: 100.000 - V4
312
+ 2024-11-26,18:25:34 | INFO | Train Epoch: 0 [10035712/10637090 (94%)] Loss: 1.3246 (1.362) Data (t): 0.001 Batch (t): 0.925, 571.460/s LR: 0.000001 Logit Scale: 100.000 - V4
313
+ 2024-11-26,18:27:08 | INFO | Train Epoch: 0 [10086912/10637090 (95%)] Loss: 1.0945 (1.361) Data (t): 0.001 Batch (t): 0.931, 571.226/s LR: 0.000001 Logit Scale: 100.000 - V4
314
+ 2024-11-26,18:28:38 | INFO | Train Epoch: 0 [10138112/10637090 (95%)] Loss: 1.3692 (1.361) Data (t): 0.001 Batch (t): 0.905, 570.400/s LR: 0.000001 Logit Scale: 100.000 - V4
315
+ 2024-11-26,18:30:08 | INFO | Train Epoch: 0 [10189312/10637090 (96%)] Loss: 1.2519 (1.360) Data (t): 0.001 Batch (t): 0.897, 571.391/s LR: 0.000001 Logit Scale: 100.000 - V4
316
+ 2024-11-26,18:31:37 | INFO | Train Epoch: 0 [10240512/10637090 (96%)] Loss: 1.2631 (1.360) Data (t): 0.001 Batch (t): 0.897, 572.820/s LR: 0.000001 Logit Scale: 100.000 - V4
317
+ 2024-11-26,18:33:09 | INFO | Train Epoch: 0 [10291712/10637090 (97%)] Loss: 1.2612 (1.359) Data (t): 0.001 Batch (t): 0.918, 570.538/s LR: 0.000001 Logit Scale: 100.000 - V4
318
+ 2024-11-26,18:34:42 | INFO | Train Epoch: 0 [10342912/10637090 (97%)] Loss: 1.1512 (1.358) Data (t): 0.001 Batch (t): 0.931, 570.150/s LR: 0.000001 Logit Scale: 100.000 - V4
319
+ 2024-11-26,18:36:13 | INFO | Train Epoch: 0 [10394112/10637090 (98%)] Loss: 1.2540 (1.358) Data (t): 0.001 Batch (t): 0.904, 569.754/s LR: 0.000001 Logit Scale: 100.000 - V4
320
+ 2024-11-26,18:37:43 | INFO | Train Epoch: 0 [10445312/10637090 (98%)] Loss: 1.3530 (1.357) Data (t): 0.001 Batch (t): 0.898, 570.079/s LR: 0.000001 Logit Scale: 100.000 - V4
321
+ 2024-11-26,18:39:12 | INFO | Train Epoch: 0 [10496512/10637090 (99%)] Loss: 1.1090 (1.356) Data (t): 0.001 Batch (t): 0.898, 570.845/s LR: 0.000001 Logit Scale: 100.000 - V4
322
+ 2024-11-26,18:40:44 | INFO | Train Epoch: 0 [10547712/10637090 (99%)] Loss: 1.3959 (1.356) Data (t): 0.001 Batch (t): 0.917, 570.875/s LR: 0.000001 Logit Scale: 100.000 - V4
323
+ 2024-11-26,18:42:17 | INFO | Train Epoch: 0 [10598912/10637090 (100%)] Loss: 1.1885 (1.356) Data (t): 0.001 Batch (t): 0.930, 572.573/s LR: 0.000001 Logit Scale: 100.000 - V4
324
+ 2024-11-26,18:43:25 | INFO | Train Epoch: 0 [10636800/10637090 (100%)] Loss: 1.1315 (1.355) Data (t): 0.002 Batch (t): 0.919, 570.887/s LR: 0.000001 Logit Scale: 100.000 - V4
325
+ 2024-11-26,18:43:32 | INFO | Start epoch 1
326
+ 2024-11-26,18:43:36 | INFO | Train Epoch: 1 [ 512/10637090 (0%)] Loss: 1.1272 (1.127) Data (t): 2.823 Batch (t): 3.742, 136.843/s LR: 0.000000 Logit Scale: 100.000 - V4
327
+ 2024-11-26,18:45:06 | INFO | Train Epoch: 1 [ 51712/10637090 (0%)] Loss: 1.3047 (1.216) Data (t): 0.001 Batch (t): 0.900, 570.502/s LR: 0.000000 Logit Scale: 100.000 - V4
328
+ 2024-11-26,18:46:35 | INFO | Train Epoch: 1 [ 102912/10637090 (1%)] Loss: 1.3060 (1.246) Data (t): 0.001 Batch (t): 0.899, 571.723/s LR: 0.000000 Logit Scale: 100.000 - V4
329
+ 2024-11-26,18:48:06 | INFO | Train Epoch: 1 [ 154112/10637090 (1%)] Loss: 1.2823 (1.255) Data (t): 0.001 Batch (t): 0.910, 570.631/s LR: 0.000000 Logit Scale: 100.000 - V4
330
+ 2024-11-26,18:49:39 | INFO | Train Epoch: 1 [ 205312/10637090 (2%)] Loss: 1.1136 (1.227) Data (t): 0.001 Batch (t): 0.924, 570.965/s LR: 0.000000 Logit Scale: 100.000 - V4
331
+ 2024-11-26,18:51:12 | INFO | Train Epoch: 1 [ 256512/10637090 (2%)] Loss: 1.1178 (1.209) Data (t): 0.001 Batch (t): 0.933, 568.766/s LR: 0.000000 Logit Scale: 100.000 - V4
332
+ 2024-11-26,18:52:42 | INFO | Train Epoch: 1 [ 307712/10637090 (3%)] Loss: 1.1960 (1.207) Data (t): 0.001 Batch (t): 0.899, 569.822/s LR: 0.000000 Logit Scale: 100.000 - V4
333
+ 2024-11-26,18:54:12 | INFO | Train Epoch: 1 [ 358912/10637090 (3%)] Loss: 1.1630 (1.201) Data (t): 0.001 Batch (t): 0.900, 568.075/s LR: 0.000000 Logit Scale: 100.000 - V4
334
+ 2024-11-26,18:55:43 | INFO | Train Epoch: 1 [ 410112/10637090 (4%)] Loss: 1.1219 (1.193) Data (t): 0.001 Batch (t): 0.909, 567.355/s LR: 0.000000 Logit Scale: 100.000 - V4
335
+ 2024-11-26,18:57:14 | INFO | Train Epoch: 1 [ 461312/10637090 (4%)] Loss: 1.2985 (1.203) Data (t): 0.001 Batch (t): 0.913, 566.980/s LR: 0.000000 Logit Scale: 100.000 - V4
336
+ 2024-11-26,18:58:47 | INFO | Train Epoch: 1 [ 512512/10637090 (5%)] Loss: 1.1123 (1.195) Data (t): 0.001 Batch (t): 0.928, 567.868/s LR: 0.000000 Logit Scale: 100.000 - V4
337
+ 2024-11-26,19:00:18 | INFO | Train Epoch: 1 [ 563712/10637090 (5%)] Loss: 1.1470 (1.191) Data (t): 0.001 Batch (t): 0.906, 571.082/s LR: 0.000000 Logit Scale: 100.000 - V4
338
+ 2024-11-26,19:01:48 | INFO | Train Epoch: 1 [ 614912/10637090 (6%)] Loss: 1.2744 (1.197) Data (t): 0.001 Batch (t): 0.899, 569.837/s LR: 0.000000 Logit Scale: 100.000 - V4
339
+ 2024-11-26,19:03:17 | INFO | Train Epoch: 1 [ 666112/10637090 (6%)] Loss: 1.0854 (1.189) Data (t): 0.001 Batch (t): 0.898, 568.425/s LR: 0.000000 Logit Scale: 100.000 - V4
340
+ 2024-11-26,19:04:50 | INFO | Train Epoch: 1 [ 717312/10637090 (7%)] Loss: 1.1137 (1.184) Data (t): 0.001 Batch (t): 0.921, 571.932/s LR: 0.000000 Logit Scale: 100.000 - V4
341
+ 2024-11-26,19:06:22 | INFO | Train Epoch: 1 [ 768512/10637090 (7%)] Loss: 1.0684 (1.177) Data (t): 0.001 Batch (t): 0.922, 567.544/s LR: 0.000000 Logit Scale: 100.000 - V4
342
+ 2024-11-26,19:07:53 | INFO | Train Epoch: 1 [ 819712/10637090 (8%)] Loss: 1.2528 (1.181) Data (t): 0.001 Batch (t): 0.912, 571.835/s LR: 0.000000 Logit Scale: 100.000 - V4
343
+ 2024-11-26,19:09:23 | INFO | Train Epoch: 1 [ 870912/10637090 (8%)] Loss: 1.0918 (1.176) Data (t): 0.001 Batch (t): 0.899, 567.959/s LR: 0.000000 Logit Scale: 100.000 - V4
344
+ 2024-11-26,19:10:53 | INFO | Train Epoch: 1 [ 922112/10637090 (9%)] Loss: 1.3715 (1.187) Data (t): 0.001 Batch (t): 0.900, 570.266/s LR: 0.000000 Logit Scale: 100.000 - V4
345
+ 2024-11-26,19:12:24 | INFO | Train Epoch: 1 [ 973312/10637090 (9%)] Loss: 1.1492 (1.185) Data (t): 0.001 Batch (t): 0.915, 570.129/s LR: 0.000000 Logit Scale: 100.000 - V4
346
+ 2024-11-26,19:13:57 | INFO | Train Epoch: 1 [ 1024512/10637090 (10%)] Loss: 1.0801 (1.180) Data (t): 0.001 Batch (t): 0.929, 564.189/s LR: 0.000000 Logit Scale: 100.000 - V4
347
+ 2024-11-26,19:15:28 | INFO | Train Epoch: 1 [ 1075712/10637090 (10%)] Loss: 1.1641 (1.179) Data (t): 0.001 Batch (t): 0.913, 563.636/s LR: 0.000000 Logit Scale: 100.000 - V4
348
+ 2024-11-26,19:16:58 | INFO | Train Epoch: 1 [ 1126912/10637090 (11%)] Loss: 1.0773 (1.175) Data (t): 0.001 Batch (t): 0.899, 571.633/s LR: 0.000000 Logit Scale: 100.000 - V4
349
+ 2024-11-26,19:18:28 | INFO | Train Epoch: 1 [ 1178112/10637090 (11%)] Loss: 1.1838 (1.175) Data (t): 0.001 Batch (t): 0.898, 569.161/s LR: 0.000000 Logit Scale: 100.000 - V4
350
+ 2024-11-26,19:19:59 | INFO | Train Epoch: 1 [ 1229312/10637090 (12%)] Loss: 1.1636 (1.175) Data (t): 0.001 Batch (t): 0.909, 566.818/s LR: 0.000000 Logit Scale: 100.000 - V4
351
+ 2024-11-26,19:21:31 | INFO | Train Epoch: 1 [ 1280512/10637090 (12%)] Loss: 1.2164 (1.176) Data (t): 0.001 Batch (t): 0.918, 569.866/s LR: 0.000000 Logit Scale: 100.000 - V4
352
+ 2024-11-26,19:23:04 | INFO | Train Epoch: 1 [ 1331712/10637090 (13%)] Loss: 1.1400 (1.175) Data (t): 0.001 Batch (t): 0.927, 572.217/s LR: 0.000000 Logit Scale: 100.000 - V4
353
+ 2024-11-26,19:24:33 | INFO | Train Epoch: 1 [ 1382912/10637090 (13%)] Loss: 1.3023 (1.179) Data (t): 0.001 Batch (t): 0.897, 570.665/s LR: 0.000000 Logit Scale: 100.000 - V4
354
+ 2024-11-26,19:26:03 | INFO | Train Epoch: 1 [ 1434112/10637090 (13%)] Loss: 1.0929 (1.176) Data (t): 0.001 Batch (t): 0.898, 569.468/s LR: 0.000000 Logit Scale: 100.000 - V4
355
+ 2024-11-26,19:27:34 | INFO | Train Epoch: 1 [ 1485312/10637090 (14%)] Loss: 1.1918 (1.177) Data (t): 0.001 Batch (t): 0.907, 570.668/s LR: 0.000000 Logit Scale: 100.000 - V4
356
+ 2024-11-26,19:29:06 | INFO | Train Epoch: 1 [ 1536512/10637090 (14%)] Loss: 1.1910 (1.177) Data (t): 0.001 Batch (t): 0.917, 570.185/s LR: 0.000000 Logit Scale: 100.000 - V4
357
+ 2024-11-26,19:30:38 | INFO | Train Epoch: 1 [ 1587712/10637090 (15%)] Loss: 1.2453 (1.180) Data (t): 0.001 Batch (t): 0.927, 571.132/s LR: 0.000000 Logit Scale: 100.000 - V4
358
+ 2024-11-26,19:32:08 | INFO | Train Epoch: 1 [ 1638912/10637090 (15%)] Loss: 1.3020 (1.183) Data (t): 0.001 Batch (t): 0.898, 570.613/s LR: 0.000000 Logit Scale: 100.000 - V4
359
+ 2024-11-26,19:33:38 | INFO | Train Epoch: 1 [ 1690112/10637090 (16%)] Loss: 1.1995 (1.184) Data (t): 0.001 Batch (t): 0.897, 571.345/s LR: 0.000000 Logit Scale: 100.000 - V4
360
+ 2024-11-26,19:35:07 | INFO | Train Epoch: 1 [ 1741312/10637090 (16%)] Loss: 1.2071 (1.184) Data (t): 0.001 Batch (t): 0.898, 569.574/s LR: 0.000000 Logit Scale: 100.000 - V4
361
+ 2024-11-26,19:36:40 | INFO | Train Epoch: 1 [ 1792512/10637090 (17%)] Loss: 1.2923 (1.187) Data (t): 0.001 Batch (t): 0.921, 568.688/s LR: 0.000000 Logit Scale: 100.000 - V4
362
+ 2024-11-26,19:38:12 | INFO | Train Epoch: 1 [ 1843712/10637090 (17%)] Loss: 1.1152 (1.185) Data (t): 0.001 Batch (t): 0.928, 327.312/s LR: 0.000000 Logit Scale: 100.000 - V4
363
+ 2024-11-26,19:39:43 | INFO | Train Epoch: 1 [ 1894912/10637090 (18%)] Loss: 1.0761 (1.183) Data (t): 0.001 Batch (t): 0.905, 571.211/s LR: 0.000000 Logit Scale: 100.000 - V4
364
+ 2024-11-26,19:41:13 | INFO | Train Epoch: 1 [ 1946112/10637090 (18%)] Loss: 1.3059 (1.186) Data (t): 0.001 Batch (t): 0.898, 569.130/s LR: 0.000000 Logit Scale: 100.000 - V4
365
+ 2024-11-26,19:42:42 | INFO | Train Epoch: 1 [ 1997312/10637090 (19%)] Loss: 1.1767 (1.186) Data (t): 0.001 Batch (t): 0.899, 567.882/s LR: 0.000000 Logit Scale: 100.000 - V4
366
+ 2024-11-26,19:44:15 | INFO | Train Epoch: 1 [ 2048512/10637090 (19%)] Loss: 1.1926 (1.186) Data (t): 0.001 Batch (t): 0.921, 571.107/s LR: 0.000000 Logit Scale: 100.000 - V4
367
+ 2024-11-26,19:45:47 | INFO | Train Epoch: 1 [ 2099712/10637090 (20%)] Loss: 0.98852 (1.181) Data (t): 0.001 Batch (t): 0.921, 565.473/s LR: 0.000000 Logit Scale: 100.000 - V4
368
+ 2024-11-26,19:47:18 | INFO | Train Epoch: 1 [ 2150912/10637090 (20%)] Loss: 1.1040 (1.179) Data (t): 0.001 Batch (t): 0.912, 570.252/s LR: 0.000000 Logit Scale: 100.000 - V4
369
+ 2024-11-26,19:48:48 | INFO | Train Epoch: 1 [ 2202112/10637090 (21%)] Loss: 1.1025 (1.177) Data (t): 0.001 Batch (t): 0.899, 568.608/s LR: 0.000000 Logit Scale: 100.000 - V4
370
+ 2024-11-26,19:50:18 | INFO | Train Epoch: 1 [ 2253312/10637090 (21%)] Loss: 1.1215 (1.176) Data (t): 0.001 Batch (t): 0.899, 571.476/s LR: 0.000000 Logit Scale: 100.000 - V4
371
+ 2024-11-26,19:51:49 | INFO | Train Epoch: 1 [ 2304512/10637090 (22%)] Loss: 1.2547 (1.178) Data (t): 0.001 Batch (t): 0.916, 571.678/s LR: 0.000000 Logit Scale: 100.000 - V4
372
+ 2024-11-26,19:53:21 | INFO | Train Epoch: 1 [ 2355712/10637090 (22%)] Loss: 1.1704 (1.178) Data (t): 0.001 Batch (t): 0.914, 569.301/s LR: 0.000000 Logit Scale: 100.000 - V4
373
+ 2024-11-26,19:54:53 | INFO | Train Epoch: 1 [ 2406912/10637090 (23%)] Loss: 1.1251 (1.177) Data (t): 0.001 Batch (t): 0.928, 571.243/s LR: 0.000000 Logit Scale: 100.000 - V4
374
+ 2024-11-26,19:56:23 | INFO | Train Epoch: 1 [ 2458112/10637090 (23%)] Loss: 1.3239 (1.180) Data (t): 0.001 Batch (t): 0.899, 570.420/s LR: 0.000000 Logit Scale: 100.000 - V4
375
+ 2024-11-26,19:57:53 | INFO | Train Epoch: 1 [ 2509312/10637090 (24%)] Loss: 1.1951 (1.180) Data (t): 0.001 Batch (t): 0.899, 571.080/s LR: 0.000000 Logit Scale: 100.000 - V4
376
+ 2024-11-26,19:59:24 | INFO | Train Epoch: 1 [ 2560512/10637090 (24%)] Loss: 1.2716 (1.182) Data (t): 0.001 Batch (t): 0.908, 568.033/s LR: 0.000000 Logit Scale: 100.000 - V4
377
+ 2024-11-26,20:00:56 | INFO | Train Epoch: 1 [ 2611712/10637090 (25%)] Loss: 1.1840 (1.182) Data (t): 0.001 Batch (t): 0.919, 571.383/s LR: 0.000000 Logit Scale: 100.000 - V4
378
+ 2024-11-26,20:02:29 | INFO | Train Epoch: 1 [ 2662912/10637090 (25%)] Loss: 1.1681 (1.182) Data (t): 0.001 Batch (t): 0.929, 570.933/s LR: 0.000000 Logit Scale: 100.000 - V4
379
+ 2024-11-26,20:03:59 | INFO | Train Epoch: 1 [ 2714112/10637090 (26%)] Loss: 1.2029 (1.182) Data (t): 0.001 Batch (t): 0.898, 571.058/s LR: 0.000000 Logit Scale: 100.000 - V4
380
+ 2024-11-26,20:05:29 | INFO | Train Epoch: 1 [ 2765312/10637090 (26%)] Loss: 1.1721 (1.182) Data (t): 0.001 Batch (t): 0.899, 566.658/s LR: 0.000000 Logit Scale: 100.000 - V4
381
+ 2024-11-26,20:06:59 | INFO | Train Epoch: 1 [ 2816512/10637090 (26%)] Loss: 1.1462 (1.181) Data (t): 0.001 Batch (t): 0.909, 569.733/s LR: 0.000000 Logit Scale: 100.000 - V4
382
+ 2024-11-26,20:08:32 | INFO | Train Epoch: 1 [ 2867712/10637090 (27%)] Loss: 1.1959 (1.181) Data (t): 0.001 Batch (t): 0.921, 569.731/s LR: 0.000000 Logit Scale: 100.000 - V4
383
+ 2024-11-26,20:10:05 | INFO | Train Epoch: 1 [ 2918912/10637090 (27%)] Loss: 1.2124 (1.182) Data (t): 0.001 Batch (t): 0.930, 325.826/s LR: 0.000000 Logit Scale: 100.000 - V4
384
+ 2024-11-26,20:11:34 | INFO | Train Epoch: 1 [ 2970112/10637090 (28%)] Loss: 1.2197 (1.183) Data (t): 0.001 Batch (t): 0.899, 571.573/s LR: 0.000000 Logit Scale: 100.000 - V4
385
+ 2024-11-26,20:13:04 | INFO | Train Epoch: 1 [ 3021312/10637090 (28%)] Loss: 1.2276 (1.183) Data (t): 0.001 Batch (t): 0.900, 568.050/s LR: 0.000000 Logit Scale: 100.000 - V4
386
+ 2024-11-26,20:14:34 | INFO | Train Epoch: 1 [ 3072512/10637090 (29%)] Loss: 1.0580 (1.181) Data (t): 0.001 Batch (t): 0.900, 568.648/s LR: 0.000000 Logit Scale: 100.000 - V4
387
+ 2024-11-26,20:16:07 | INFO | Train Epoch: 1 [ 3123712/10637090 (29%)] Loss: 1.1839 (1.181) Data (t): 0.001 Batch (t): 0.922, 569.204/s LR: 0.000000 Logit Scale: 100.000 - V4
388
+ 2024-11-26,20:17:39 | INFO | Train Epoch: 1 [ 3174912/10637090 (30%)] Loss: 1.3073 (1.183) Data (t): 0.001 Batch (t): 0.928, 569.760/s LR: 0.000000 Logit Scale: 100.000 - V4
389
+ 2024-11-26,20:19:10 | INFO | Train Epoch: 1 [ 3226112/10637090 (30%)] Loss: 1.2545 (1.184) Data (t): 0.001 Batch (t): 0.905, 567.984/s LR: 0.000000 Logit Scale: 100.000 - V4
390
+ 2024-11-26,20:20:40 | INFO | Train Epoch: 1 [ 3277312/10637090 (31%)] Loss: 1.1420 (1.184) Data (t): 0.001 Batch (t): 0.900, 572.471/s LR: 0.000000 Logit Scale: 100.000 - V4
391
+ 2024-11-26,20:22:10 | INFO | Train Epoch: 1 [ 3328512/10637090 (31%)] Loss: 1.3115 (1.186) Data (t): 0.001 Batch (t): 0.898, 572.822/s LR: 0.000000 Logit Scale: 100.000 - V4
392
+ 2024-11-26,20:23:42 | INFO | Train Epoch: 1 [ 3379712/10637090 (32%)] Loss: 1.1180 (1.185) Data (t): 0.001 Batch (t): 0.922, 569.949/s LR: 0.000000 Logit Scale: 100.000 - V4
393
+ 2024-11-26,20:25:14 | INFO | Train Epoch: 1 [ 3430912/10637090 (32%)] Loss: 1.1664 (1.184) Data (t): 0.001 Batch (t): 0.921, 568.426/s LR: 0.000000 Logit Scale: 100.000 - V4
394
+ 2024-11-26,20:26:45 | INFO | Train Epoch: 1 [ 3482112/10637090 (33%)] Loss: 1.0871 (1.183) Data (t): 0.001 Batch (t): 0.911, 571.307/s LR: 0.000000 Logit Scale: 100.000 - V4
395
+ 2024-11-26,20:28:15 | INFO | Train Epoch: 1 [ 3533312/10637090 (33%)] Loss: 1.3068 (1.185) Data (t): 0.001 Batch (t): 0.897, 572.100/s LR: 0.000000 Logit Scale: 100.000 - V4
396
+ 2024-11-26,20:29:45 | INFO | Train Epoch: 1 [ 3584512/10637090 (34%)] Loss: 1.1484 (1.184) Data (t): 0.001 Batch (t): 0.899, 571.685/s LR: 0.000000 Logit Scale: 100.000 - V4
397
+ 2024-11-26,20:31:16 | INFO | Train Epoch: 1 [ 3635712/10637090 (34%)] Loss: 1.2818 (1.186) Data (t): 0.001 Batch (t): 0.914, 573.222/s LR: 0.000000 Logit Scale: 100.000 - V4
398
+ 2024-11-26,20:32:49 | INFO | Train Epoch: 1 [ 3686912/10637090 (35%)] Loss: 1.1526 (1.185) Data (t): 0.001 Batch (t): 0.928, 571.248/s LR: 0.000000 Logit Scale: 100.000 - V4
399
+ 2024-11-26,20:34:20 | INFO | Train Epoch: 1 [ 3738112/10637090 (35%)] Loss: 0.99751 (1.183) Data (t): 0.001 Batch (t): 0.911, 571.760/s LR: 0.000000 Logit Scale: 100.000 - V4
400
+ 2024-11-26,20:35:50 | INFO | Train Epoch: 1 [ 3789312/10637090 (36%)] Loss: 1.1876 (1.183) Data (t): 0.001 Batch (t): 0.898, 569.914/s LR: 0.000000 Logit Scale: 100.000 - V4
401
+ 2024-11-26,20:37:20 | INFO | Train Epoch: 1 [ 3840512/10637090 (36%)] Loss: 1.1905 (1.183) Data (t): 0.001 Batch (t): 0.900, 570.957/s LR: 0.000000 Logit Scale: 100.000 - V4
402
+ 2024-11-26,20:38:51 | INFO | Train Epoch: 1 [ 3891712/10637090 (37%)] Loss: 1.1538 (1.182) Data (t): 0.001 Batch (t): 0.910, 565.558/s LR: 0.000000 Logit Scale: 100.000 - V4
403
+ 2024-11-26,20:40:23 | INFO | Train Epoch: 1 [ 3942912/10637090 (37%)] Loss: 1.0978 (1.181) Data (t): 0.001 Batch (t): 0.919, 569.475/s LR: 0.000000 Logit Scale: 100.000 - V4
404
+ 2024-11-26,20:41:56 | INFO | Train Epoch: 1 [ 3994112/10637090 (38%)] Loss: 1.1547 (1.181) Data (t): 0.001 Batch (t): 0.928, 570.513/s LR: 0.000000 Logit Scale: 100.000 - V4
405
+ 2024-11-26,20:43:25 | INFO | Train Epoch: 1 [ 4045312/10637090 (38%)] Loss: 1.4202 (1.184) Data (t): 0.001 Batch (t): 0.899, 569.742/s LR: 0.000000 Logit Scale: 100.000 - V4
406
+ 2024-11-26,20:44:55 | INFO | Train Epoch: 1 [ 4096512/10637090 (39%)] Loss: 1.2166 (1.184) Data (t): 0.001 Batch (t): 0.898, 574.082/s LR: 0.000000 Logit Scale: 100.000 - V4
407
+ 2024-11-26,20:46:25 | INFO | Train Epoch: 1 [ 4147712/10637090 (39%)] Loss: 1.1548 (1.184) Data (t): 0.001 Batch (t): 0.898, 573.561/s LR: 0.000000 Logit Scale: 100.000 - V4
408
+ 2024-11-26,20:47:57 | INFO | Train Epoch: 1 [ 4198912/10637090 (39%)] Loss: 1.2397 (1.185) Data (t): 0.001 Batch (t): 0.921, 569.802/s LR: 0.000000 Logit Scale: 100.000 - V4
409
+ 2024-11-26,20:49:31 | INFO | Train Epoch: 1 [ 4250112/10637090 (40%)] Loss: 1.2459 (1.185) Data (t): 0.001 Batch (t): 0.935, 569.457/s LR: 0.000000 Logit Scale: 100.000 - V4
410
+ 2024-11-26,20:51:00 | INFO | Train Epoch: 1 [ 4301312/10637090 (40%)] Loss: 1.1428 (1.185) Data (t): 0.001 Batch (t): 0.898, 572.439/s LR: 0.000000 Logit Scale: 100.000 - V4
411
+ 2024-11-26,20:52:30 | INFO | Train Epoch: 1 [ 4352512/10637090 (41%)] Loss: 1.1446 (1.184) Data (t): 0.001 Batch (t): 0.899, 570.299/s LR: 0.000000 Logit Scale: 100.000 - V4
412
+ 2024-11-26,20:54:00 | INFO | Train Epoch: 1 [ 4403712/10637090 (41%)] Loss: 1.0985 (1.183) Data (t): 0.001 Batch (t): 0.899, 570.359/s LR: 0.000000 Logit Scale: 100.000 - V4
413
+ 2024-11-26,20:55:32 | INFO | Train Epoch: 1 [ 4454912/10637090 (42%)] Loss: 1.1671 (1.183) Data (t): 0.001 Batch (t): 0.920, 572.689/s LR: 0.000000 Logit Scale: 100.000 - V4
414
+ 2024-11-26,20:57:05 | INFO | Train Epoch: 1 [ 4506112/10637090 (42%)] Loss: 1.1655 (1.183) Data (t): 0.001 Batch (t): 0.927, 322.234/s LR: 0.000000 Logit Scale: 100.000 - V4
415
+ 2024-11-26,20:58:35 | INFO | Train Epoch: 1 [ 4557312/10637090 (43%)] Loss: 1.1359 (1.183) Data (t): 0.001 Batch (t): 0.903, 571.277/s LR: 0.000000 Logit Scale: 100.000 - V4
416
+ 2024-11-26,21:00:05 | INFO | Train Epoch: 1 [ 4608512/10637090 (43%)] Loss: 1.2041 (1.183) Data (t): 0.001 Batch (t): 0.899, 569.163/s LR: 0.000000 Logit Scale: 100.000 - V4
417
+ 2024-11-26,21:01:35 | INFO | Train Epoch: 1 [ 4659712/10637090 (44%)] Loss: 1.1353 (1.182) Data (t): 0.001 Batch (t): 0.898, 570.520/s LR: 0.000000 Logit Scale: 100.000 - V4
418
+ 2024-11-26,21:03:07 | INFO | Train Epoch: 1 [ 4710912/10637090 (44%)] Loss: 1.2965 (1.184) Data (t): 0.001 Batch (t): 0.921, 568.051/s LR: 0.000000 Logit Scale: 100.000 - V4
419
+ 2024-11-26,21:04:39 | INFO | Train Epoch: 1 [ 4762112/10637090 (45%)] Loss: 1.1852 (1.184) Data (t): 0.001 Batch (t): 0.921, 571.169/s LR: 0.000000 Logit Scale: 100.000 - V4
420
+ 2024-11-26,21:06:10 | INFO | Train Epoch: 1 [ 4813312/10637090 (45%)] Loss: 1.1519 (1.183) Data (t): 0.001 Batch (t): 0.912, 568.091/s LR: 0.000000 Logit Scale: 100.000 - V4
421
+ 2024-11-26,21:07:40 | INFO | Train Epoch: 1 [ 4864512/10637090 (46%)] Loss: 1.2208 (1.184) Data (t): 0.001 Batch (t): 0.899, 569.083/s LR: 0.000000 Logit Scale: 100.000 - V4
422
+ 2024-11-26,21:09:10 | INFO | Train Epoch: 1 [ 4915712/10637090 (46%)] Loss: 1.0217 (1.182) Data (t): 0.001 Batch (t): 0.899, 568.266/s LR: 0.000000 Logit Scale: 100.000 - V4
423
+ 2024-11-26,21:10:41 | INFO | Train Epoch: 1 [ 4966912/10637090 (47%)] Loss: 1.2111 (1.182) Data (t): 0.001 Batch (t): 0.914, 572.742/s LR: 0.000000 Logit Scale: 100.000 - V4
424
+ 2024-11-26,21:12:14 | INFO | Train Epoch: 1 [ 5018112/10637090 (47%)] Loss: 1.1928 (1.182) Data (t): 0.001 Batch (t): 0.929, 570.259/s LR: 0.000000 Logit Scale: 100.000 - V4
425
+ 2024-11-26,21:13:46 | INFO | Train Epoch: 1 [ 5069312/10637090 (48%)] Loss: 1.1379 (1.182) Data (t): 0.001 Batch (t): 0.912, 566.363/s LR: 0.000000 Logit Scale: 100.000 - V4
426
+ 2024-11-26,21:15:15 | INFO | Train Epoch: 1 [ 5120512/10637090 (48%)] Loss: 0.98969 (1.180) Data (t): 0.001 Batch (t): 0.899, 566.979/s LR: 0.000000 Logit Scale: 100.000 - V4
427
+ 2024-11-26,21:16:45 | INFO | Train Epoch: 1 [ 5171712/10637090 (49%)] Loss: 1.2899 (1.181) Data (t): 0.001 Batch (t): 0.897, 570.488/s LR: 0.000000 Logit Scale: 100.000 - V4
428
+ 2024-11-26,21:18:15 | INFO | Train Epoch: 1 [ 5222912/10637090 (49%)] Loss: 1.1594 (1.181) Data (t): 0.001 Batch (t): 0.897, 571.862/s LR: 0.000000 Logit Scale: 100.000 - V4
429
+ 2024-11-26,21:19:48 | INFO | Train Epoch: 1 [ 5274112/10637090 (50%)] Loss: 1.1718 (1.181) Data (t): 0.001 Batch (t): 0.929, 569.649/s LR: 0.000000 Logit Scale: 100.000 - V4
430
+ 2024-11-26,21:21:21 | INFO | Train Epoch: 1 [ 5325312/10637090 (50%)] Loss: 1.1299 (1.180) Data (t): 0.001 Batch (t): 0.928, 572.603/s LR: 0.000000 Logit Scale: 100.000 - V4
431
+ 2024-11-26,21:22:50 | INFO | Train Epoch: 1 [ 5376512/10637090 (51%)] Loss: 1.2869 (1.181) Data (t): 0.001 Batch (t): 0.898, 569.349/s LR: 0.000000 Logit Scale: 100.000 - V4
432
+ 2024-11-26,21:24:20 | INFO | Train Epoch: 1 [ 5427712/10637090 (51%)] Loss: 1.1522 (1.181) Data (t): 0.001 Batch (t): 0.898, 570.411/s LR: 0.000000 Logit Scale: 100.000 - V4
433
+ 2024-11-26,21:25:50 | INFO | Train Epoch: 1 [ 5478912/10637090 (52%)] Loss: 1.2841 (1.182) Data (t): 0.001 Batch (t): 0.898, 569.983/s LR: 0.000000 Logit Scale: 100.000 - V4
434
+ 2024-11-26,21:27:22 | INFO | Train Epoch: 1 [ 5530112/10637090 (52%)] Loss: 1.2017 (1.182) Data (t): 0.001 Batch (t): 0.922, 569.563/s LR: 0.000000 Logit Scale: 100.000 - V4
435
+ 2024-11-26,21:28:55 | INFO | Train Epoch: 1 [ 5581312/10637090 (52%)] Loss: 1.2017 (1.182) Data (t): 0.001 Batch (t): 0.929, 319.869/s LR: 0.000000 Logit Scale: 100.000 - V4
436
+ 2024-11-26,21:30:25 | INFO | Train Epoch: 1 [ 5632512/10637090 (53%)] Loss: 1.2487 (1.183) Data (t): 0.001 Batch (t): 0.899, 570.889/s LR: 0.000000 Logit Scale: 100.000 - V4
437
+ 2024-11-26,21:31:55 | INFO | Train Epoch: 1 [ 5683712/10637090 (53%)] Loss: 1.2429 (1.183) Data (t): 0.001 Batch (t): 0.899, 571.607/s LR: 0.000000 Logit Scale: 100.000 - V4
438
+ 2024-11-26,21:33:25 | INFO | Train Epoch: 1 [ 5734912/10637090 (54%)] Loss: 1.0940 (1.183) Data (t): 0.001 Batch (t): 0.899, 568.140/s LR: 0.000000 Logit Scale: 100.000 - V4
439
+ 2024-11-26,21:34:57 | INFO | Train Epoch: 1 [ 5786112/10637090 (54%)] Loss: 1.0491 (1.181) Data (t): 0.001 Batch (t): 0.923, 572.406/s LR: 0.000000 Logit Scale: 100.000 - V4
440
+ 2024-11-26,21:36:29 | INFO | Train Epoch: 1 [ 5837312/10637090 (55%)] Loss: 1.1494 (1.181) Data (t): 0.001 Batch (t): 0.920, 573.115/s LR: 0.000000 Logit Scale: 100.000 - V4
441
+ 2024-11-26,21:38:00 | INFO | Train Epoch: 1 [ 5888512/10637090 (55%)] Loss: 1.0827 (1.180) Data (t): 0.001 Batch (t): 0.911, 569.390/s LR: 0.000000 Logit Scale: 100.000 - V4
442
+ 2024-11-26,21:39:30 | INFO | Train Epoch: 1 [ 5939712/10637090 (56%)] Loss: 1.2272 (1.181) Data (t): 0.001 Batch (t): 0.899, 569.449/s LR: 0.000000 Logit Scale: 100.000 - V4
443
+ 2024-11-26,21:41:00 | INFO | Train Epoch: 1 [ 5990912/10637090 (56%)] Loss: 1.2296 (1.181) Data (t): 0.001 Batch (t): 0.900, 569.298/s LR: 0.000000 Logit Scale: 100.000 - V4
444
+ 2024-11-26,21:42:32 | INFO | Train Epoch: 1 [ 6042112/10637090 (57%)] Loss: 1.1828 (1.181) Data (t): 0.001 Batch (t): 0.923, 569.099/s LR: 0.000000 Logit Scale: 100.000 - V4
445
+ 2024-11-26,21:44:04 | INFO | Train Epoch: 1 [ 6093312/10637090 (57%)] Loss: 1.1871 (1.181) Data (t): 0.001 Batch (t): 0.916, 569.842/s LR: 0.000000 Logit Scale: 100.000 - V4
446
+ 2024-11-26,21:45:35 | INFO | Train Epoch: 1 [ 6144512/10637090 (58%)] Loss: 1.1808 (1.181) Data (t): 0.001 Batch (t): 0.913, 570.897/s LR: 0.000000 Logit Scale: 100.000 - V4
447
+ 2024-11-26,21:47:05 | INFO | Train Epoch: 1 [ 6195712/10637090 (58%)] Loss: 1.2000 (1.181) Data (t): 0.001 Batch (t): 0.900, 569.644/s LR: 0.000000 Logit Scale: 100.000 - V4
448
+ 2024-11-26,21:48:35 | INFO | Train Epoch: 1 [ 6246912/10637090 (59%)] Loss: 1.3105 (1.182) Data (t): 0.001 Batch (t): 0.899, 569.345/s LR: 0.000000 Logit Scale: 100.000 - V4
449
+ 2024-11-26,21:50:06 | INFO | Train Epoch: 1 [ 6298112/10637090 (59%)] Loss: 1.0298 (1.181) Data (t): 0.001 Batch (t): 0.906, 569.081/s LR: 0.000000 Logit Scale: 100.000 - V4
450
+ 2024-11-26,21:51:38 | INFO | Train Epoch: 1 [ 6349312/10637090 (60%)] Loss: 1.0581 (1.180) Data (t): 0.001 Batch (t): 0.923, 569.032/s LR: 0.000000 Logit Scale: 100.000 - V4
451
+ 2024-11-26,21:53:11 | INFO | Train Epoch: 1 [ 6400512/10637090 (60%)] Loss: 1.1226 (1.180) Data (t): 0.001 Batch (t): 0.930, 573.111/s LR: 0.000000 Logit Scale: 100.000 - V4
452
+ 2024-11-26,21:54:41 | INFO | Train Epoch: 1 [ 6451712/10637090 (61%)] Loss: 1.0434 (1.179) Data (t): 0.001 Batch (t): 0.898, 570.992/s LR: 0.000000 Logit Scale: 100.000 - V4
453
+ 2024-11-26,21:56:11 | INFO | Train Epoch: 1 [ 6502912/10637090 (61%)] Loss: 1.1701 (1.179) Data (t): 0.001 Batch (t): 0.898, 569.764/s LR: 0.000000 Logit Scale: 100.000 - V4
454
+ 2024-11-26,21:57:41 | INFO | Train Epoch: 1 [ 6554112/10637090 (62%)] Loss: 1.0020 (1.177) Data (t): 0.001 Batch (t): 0.899, 568.551/s LR: 0.000000 Logit Scale: 100.000 - V4
455
+ 2024-11-26,21:59:13 | INFO | Train Epoch: 1 [ 6605312/10637090 (62%)] Loss: 1.1857 (1.177) Data (t): 0.001 Batch (t): 0.924, 568.507/s LR: 0.000000 Logit Scale: 100.000 - V4
456
+ 2024-11-26,22:00:47 | INFO | Train Epoch: 1 [ 6656512/10637090 (63%)] Loss: 1.1697 (1.177) Data (t): 0.001 Batch (t): 0.938, 571.196/s LR: 0.000000 Logit Scale: 100.000 - V4
457
+ 2024-11-26,22:02:17 | INFO | Train Epoch: 1 [ 6707712/10637090 (63%)] Loss: 1.3153 (1.178) Data (t): 0.001 Batch (t): 0.899, 570.651/s LR: 0.000000 Logit Scale: 100.000 - V4
458
+ 2024-11-26,22:03:47 | INFO | Train Epoch: 1 [ 6758912/10637090 (64%)] Loss: 1.2412 (1.179) Data (t): 0.001 Batch (t): 0.900, 566.658/s LR: 0.000000 Logit Scale: 100.000 - V4
459
+ 2024-11-26,22:05:16 | INFO | Train Epoch: 1 [ 6810112/10637090 (64%)] Loss: 1.2105 (1.179) Data (t): 0.001 Batch (t): 0.899, 569.025/s LR: 0.000000 Logit Scale: 100.000 - V4
460
+ 2024-11-26,22:06:49 | INFO | Train Epoch: 1 [ 6861312/10637090 (65%)] Loss: 1.2094 (1.179) Data (t): 0.001 Batch (t): 0.922, 572.277/s LR: 0.000000 Logit Scale: 100.000 - V4
461
+ 2024-11-26,22:08:22 | INFO | Train Epoch: 1 [ 6912512/10637090 (65%)] Loss: 1.1225 (1.179) Data (t): 0.001 Batch (t): 0.931, 569.517/s LR: 0.000000 Logit Scale: 100.000 - V4
462
+ 2024-11-26,22:09:52 | INFO | Train Epoch: 1 [ 6963712/10637090 (65%)] Loss: 1.2264 (1.179) Data (t): 0.001 Batch (t): 0.907, 566.350/s LR: 0.000000 Logit Scale: 100.000 - V4
463
+ 2024-11-26,22:11:22 | INFO | Train Epoch: 1 [ 7014912/10637090 (66%)] Loss: 1.1015 (1.179) Data (t): 0.001 Batch (t): 0.899, 568.496/s LR: 0.000000 Logit Scale: 100.000 - V4
464
+ 2024-11-26,22:12:52 | INFO | Train Epoch: 1 [ 7066112/10637090 (66%)] Loss: 1.1839 (1.179) Data (t): 0.001 Batch (t): 0.900, 568.224/s LR: 0.000000 Logit Scale: 100.000 - V4
465
+ 2024-11-26,22:14:25 | INFO | Train Epoch: 1 [ 7117312/10637090 (67%)] Loss: 1.2270 (1.179) Data (t): 0.001 Batch (t): 0.924, 569.192/s LR: 0.000000 Logit Scale: 100.000 - V4
466
+ 2024-11-26,22:15:57 | INFO | Train Epoch: 1 [ 7168512/10637090 (67%)] Loss: 1.1463 (1.179) Data (t): 0.001 Batch (t): 0.923, 569.831/s LR: 0.000000 Logit Scale: 100.000 - V4
467
+ 2024-11-26,22:17:28 | INFO | Train Epoch: 1 [ 7219712/10637090 (68%)] Loss: 1.0701 (1.178) Data (t): 0.001 Batch (t): 0.911, 571.012/s LR: 0.000000 Logit Scale: 100.000 - V4
468
+ 2024-11-26,22:18:58 | INFO | Train Epoch: 1 [ 7270912/10637090 (68%)] Loss: 1.1044 (1.177) Data (t): 0.001 Batch (t): 0.899, 567.547/s LR: 0.000000 Logit Scale: 100.000 - V4
469
+ 2024-11-26,22:20:28 | INFO | Train Epoch: 1 [ 7322112/10637090 (69%)] Loss: 1.2390 (1.178) Data (t): 0.001 Batch (t): 0.899, 566.791/s LR: 0.000000 Logit Scale: 100.000 - V4
470
+ 2024-11-26,22:21:59 | INFO | Train Epoch: 1 [ 7373312/10637090 (69%)] Loss: 1.1801 (1.178) Data (t): 0.001 Batch (t): 0.914, 569.889/s LR: 0.000000 Logit Scale: 100.000 - V4
471
+ 2024-11-26,22:23:33 | INFO | Train Epoch: 1 [ 7424512/10637090 (70%)] Loss: 1.1531 (1.178) Data (t): 0.001 Batch (t): 0.934, 570.462/s LR: 0.000000 Logit Scale: 100.000 - V4
472
+ 2024-11-26,22:25:04 | INFO | Train Epoch: 1 [ 7475712/10637090 (70%)] Loss: 1.2969 (1.179) Data (t): 0.001 Batch (t): 0.913, 570.918/s LR: 0.000000 Logit Scale: 100.000 - V4
473
+ 2024-11-26,22:26:34 | INFO | Train Epoch: 1 [ 7526912/10637090 (71%)] Loss: 1.1912 (1.179) Data (t): 0.001 Batch (t): 0.900, 570.391/s LR: 0.000000 Logit Scale: 100.000 - V4
474
+ 2024-11-26,22:28:04 | INFO | Train Epoch: 1 [ 7578112/10637090 (71%)] Loss: 1.1033 (1.178) Data (t): 0.001 Batch (t): 0.899, 570.299/s LR: 0.000000 Logit Scale: 100.000 - V4
475
+ 2024-11-26,22:29:35 | INFO | Train Epoch: 1 [ 7629312/10637090 (72%)] Loss: 1.0673 (1.177) Data (t): 0.001 Batch (t): 0.907, 570.048/s LR: 0.000000 Logit Scale: 100.000 - V4
476
+ 2024-11-26,22:31:07 | INFO | Train Epoch: 1 [ 7680512/10637090 (72%)] Loss: 1.1071 (1.177) Data (t): 0.001 Batch (t): 0.924, 567.562/s LR: 0.000000 Logit Scale: 100.000 - V4
477
+ 2024-11-26,22:32:40 | INFO | Train Epoch: 1 [ 7731712/10637090 (73%)] Loss: 1.1401 (1.177) Data (t): 0.001 Batch (t): 0.932, 570.268/s LR: 0.000000 Logit Scale: 100.000 - V4
478
+ 2024-11-26,22:34:10 | INFO | Train Epoch: 1 [ 7782912/10637090 (73%)] Loss: 1.0985 (1.176) Data (t): 0.001 Batch (t): 0.900, 568.763/s LR: 0.000000 Logit Scale: 100.000 - V4
479
+ 2024-11-26,22:35:40 | INFO | Train Epoch: 1 [ 7834112/10637090 (74%)] Loss: 1.2812 (1.177) Data (t): 0.001 Batch (t): 0.899, 570.524/s LR: 0.000000 Logit Scale: 100.000 - V4
480
+ 2024-11-26,22:37:10 | INFO | Train Epoch: 1 [ 7885312/10637090 (74%)] Loss: 1.2813 (1.178) Data (t): 0.001 Batch (t): 0.900, 569.218/s LR: 0.000000 Logit Scale: 100.000 - V4
481
+ 2024-11-26,22:38:43 | INFO | Train Epoch: 1 [ 7936512/10637090 (75%)] Loss: 1.2167 (1.178) Data (t): 0.001 Batch (t): 0.923, 568.970/s LR: 0.000000 Logit Scale: 100.000 - V4
482
+ 2024-11-26,22:40:17 | INFO | Train Epoch: 1 [ 7987712/10637090 (75%)] Loss: 1.0725 (1.177) Data (t): 0.001 Batch (t): 0.939, 569.014/s LR: 0.000000 Logit Scale: 100.000 - V4
483
+ 2024-11-26,22:41:46 | INFO | Train Epoch: 1 [ 8038912/10637090 (76%)] Loss: 1.1530 (1.177) Data (t): 0.001 Batch (t): 0.897, 569.460/s LR: 0.000000 Logit Scale: 100.000 - V4
484
+ 2024-11-26,22:43:16 | INFO | Train Epoch: 1 [ 8090112/10637090 (76%)] Loss: 1.1359 (1.177) Data (t): 0.001 Batch (t): 0.898, 568.609/s LR: 0.000000 Logit Scale: 100.000 - V4
485
+ 2024-11-26,22:44:46 | INFO | Train Epoch: 1 [ 8141312/10637090 (77%)] Loss: 1.3262 (1.178) Data (t): 0.001 Batch (t): 0.899, 567.700/s LR: 0.000000 Logit Scale: 100.000 - V4
486
+ 2024-11-26,22:46:18 | INFO | Train Epoch: 1 [ 8192512/10637090 (77%)] Loss: 1.3573 (1.179) Data (t): 0.001 Batch (t): 0.924, 571.296/s LR: 0.000000 Logit Scale: 100.000 - V4
487
+ 2024-11-26,22:47:51 | INFO | Train Epoch: 1 [ 8243712/10637090 (78%)] Loss: 1.3213 (1.180) Data (t): 0.001 Batch (t): 0.925, 570.037/s LR: 0.000000 Logit Scale: 100.000 - V4
488
+ 2024-11-26,22:49:22 | INFO | Train Epoch: 1 [ 8294912/10637090 (78%)] Loss: 1.2602 (1.180) Data (t): 0.001 Batch (t): 0.914, 570.789/s LR: 0.000000 Logit Scale: 100.000 - V4
489
+ 2024-11-26,22:50:52 | INFO | Train Epoch: 1 [ 8346112/10637090 (78%)] Loss: 1.1083 (1.180) Data (t): 0.001 Batch (t): 0.899, 568.988/s LR: 0.000000 Logit Scale: 100.000 - V4
490
+ 2024-11-26,22:52:22 | INFO | Train Epoch: 1 [ 8397312/10637090 (79%)] Loss: 1.1070 (1.179) Data (t): 0.001 Batch (t): 0.897, 571.238/s LR: 0.000000 Logit Scale: 100.000 - V4
491
+ 2024-11-26,22:53:53 | INFO | Train Epoch: 1 [ 8448512/10637090 (79%)] Loss: 1.3084 (1.180) Data (t): 0.001 Batch (t): 0.913, 569.511/s LR: 0.000000 Logit Scale: 100.000 - V4
492
+ 2024-11-26,22:55:26 | INFO | Train Epoch: 1 [ 8499712/10637090 (80%)] Loss: 1.2415 (1.180) Data (t): 0.001 Batch (t): 0.932, 570.018/s LR: 0.000000 Logit Scale: 100.000 - V4
493
+ 2024-11-26,22:56:58 | INFO | Train Epoch: 1 [ 8550912/10637090 (80%)] Loss: 1.1739 (1.180) Data (t): 0.001 Batch (t): 0.914, 565.601/s LR: 0.000000 Logit Scale: 100.000 - V4
494
+ 2024-11-26,22:58:28 | INFO | Train Epoch: 1 [ 8602112/10637090 (81%)] Loss: 1.2912 (1.181) Data (t): 0.001 Batch (t): 0.899, 567.853/s LR: 0.000000 Logit Scale: 100.000 - V4
495
+ 2024-11-26,22:59:58 | INFO | Train Epoch: 1 [ 8653312/10637090 (81%)] Loss: 1.0921 (1.181) Data (t): 0.001 Batch (t): 0.899, 573.454/s LR: 0.000000 Logit Scale: 100.000 - V4
496
+ 2024-11-26,23:01:29 | INFO | Train Epoch: 1 [ 8704512/10637090 (82%)] Loss: 1.1525 (1.180) Data (t): 0.001 Batch (t): 0.914, 571.670/s LR: 0.000000 Logit Scale: 100.000 - V4
497
+ 2024-11-26,23:03:02 | INFO | Train Epoch: 1 [ 8755712/10637090 (82%)] Loss: 1.1539 (1.180) Data (t): 0.001 Batch (t): 0.926, 262.009/s LR: 0.000000 Logit Scale: 100.000 - V4
498
+ 2024-11-26,23:04:34 | INFO | Train Epoch: 1 [ 8806912/10637090 (83%)] Loss: 1.2145 (1.180) Data (t): 0.001 Batch (t): 0.921, 570.100/s LR: 0.000000 Logit Scale: 100.000 - V4
499
+ 2024-11-26,23:06:04 | INFO | Train Epoch: 1 [ 8858112/10637090 (83%)] Loss: 1.0540 (1.180) Data (t): 0.001 Batch (t): 0.900, 567.721/s LR: 0.000000 Logit Scale: 100.000 - V4
500
+ 2024-11-26,23:07:34 | INFO | Train Epoch: 1 [ 8909312/10637090 (84%)] Loss: 1.2772 (1.180) Data (t): 0.001 Batch (t): 0.899, 569.287/s LR: 0.000000 Logit Scale: 100.000 - V4
501
+ 2024-11-26,23:09:04 | INFO | Train Epoch: 1 [ 8960512/10637090 (84%)] Loss: 1.2035 (1.180) Data (t): 0.001 Batch (t): 0.907, 569.264/s LR: 0.000000 Logit Scale: 100.000 - V4
502
+ 2024-11-26,23:10:37 | INFO | Train Epoch: 1 [ 9011712/10637090 (85%)] Loss: 1.2042 (1.180) Data (t): 0.001 Batch (t): 0.924, 568.662/s LR: 0.000000 Logit Scale: 100.000 - V4
503
+ 2024-11-26,23:12:10 | INFO | Train Epoch: 1 [ 9062912/10637090 (85%)] Loss: 1.0957 (1.180) Data (t): 0.001 Batch (t): 0.930, 569.348/s LR: 0.000000 Logit Scale: 100.000 - V4
504
+ 2024-11-26,23:13:39 | INFO | Train Epoch: 1 [ 9114112/10637090 (86%)] Loss: 1.2101 (1.180) Data (t): 0.001 Batch (t): 0.899, 572.392/s LR: 0.000000 Logit Scale: 100.000 - V4
505
+ 2024-11-26,23:15:09 | INFO | Train Epoch: 1 [ 9165312/10637090 (86%)] Loss: 1.0783 (1.180) Data (t): 0.001 Batch (t): 0.899, 570.228/s LR: 0.000000 Logit Scale: 100.000 - V4
506
+ 2024-11-26,23:16:39 | INFO | Train Epoch: 1 [ 9216512/10637090 (87%)] Loss: 1.2411 (1.180) Data (t): 0.001 Batch (t): 0.900, 562.989/s LR: 0.000000 Logit Scale: 100.000 - V4
507
+ 2024-11-26,23:18:12 | INFO | Train Epoch: 1 [ 9267712/10637090 (87%)] Loss: 1.2212 (1.180) Data (t): 0.001 Batch (t): 0.926, 567.625/s LR: 0.000000 Logit Scale: 100.000 - V4
508
+ 2024-11-26,23:19:46 | INFO | Train Epoch: 1 [ 9318912/10637090 (88%)] Loss: 1.2338 (1.180) Data (t): 0.001 Batch (t): 0.940, 569.754/s LR: 0.000000 Logit Scale: 100.000 - V4
509
+ 2024-11-26,23:21:16 | INFO | Train Epoch: 1 [ 9370112/10637090 (88%)] Loss: 1.2923 (1.181) Data (t): 0.001 Batch (t): 0.899, 567.528/s LR: 0.000000 Logit Scale: 100.000 - V4
510
+ 2024-11-26,23:22:46 | INFO | Train Epoch: 1 [ 9421312/10637090 (89%)] Loss: 1.0940 (1.181) Data (t): 0.001 Batch (t): 0.900, 569.919/s LR: 0.000000 Logit Scale: 100.000 - V4
511
+ 2024-11-26,23:24:16 | INFO | Train Epoch: 1 [ 9472512/10637090 (89%)] Loss: 1.2164 (1.181) Data (t): 0.001 Batch (t): 0.900, 570.008/s LR: 0.000000 Logit Scale: 100.000 - V4
512
+ 2024-11-26,23:25:47 | INFO | Train Epoch: 1 [ 9523712/10637090 (90%)] Loss: 1.0894 (1.180) Data (t): 0.001 Batch (t): 0.915, 568.146/s LR: 0.000000 Logit Scale: 100.000 - V4
513
+ 2024-11-26,23:27:21 | INFO | Train Epoch: 1 [ 9574912/10637090 (90%)] Loss: 1.1913 (1.180) Data (t): 0.001 Batch (t): 0.934, 569.573/s LR: 0.000000 Logit Scale: 100.000 - V4
514
+ 2024-11-26,23:28:52 | INFO | Train Epoch: 1 [ 9626112/10637090 (90%)] Loss: 1.0642 (1.180) Data (t): 0.001 Batch (t): 0.914, 568.430/s LR: 0.000000 Logit Scale: 100.000 - V4
515
+ 2024-11-26,23:30:22 | INFO | Train Epoch: 1 [ 9677312/10637090 (91%)] Loss: 1.1439 (1.180) Data (t): 0.001 Batch (t): 0.898, 572.201/s LR: 0.000000 Logit Scale: 100.000 - V4
516
+ 2024-11-26,23:31:52 | INFO | Train Epoch: 1 [ 9728512/10637090 (91%)] Loss: 1.1327 (1.179) Data (t): 0.001 Batch (t): 0.898, 570.895/s LR: 0.000000 Logit Scale: 100.000 - V4
517
+ 2024-11-26,23:33:23 | INFO | Train Epoch: 1 [ 9779712/10637090 (92%)] Loss: 1.1159 (1.179) Data (t): 0.001 Batch (t): 0.914, 570.508/s LR: 0.000000 Logit Scale: 100.000 - V4
518
+ 2024-11-26,23:34:57 | INFO | Train Epoch: 1 [ 9830912/10637090 (92%)] Loss: 1.2168 (1.179) Data (t): 0.001 Batch (t): 0.934, 567.726/s LR: 0.000000 Logit Scale: 100.000 - V4
519
+ 2024-11-26,23:36:28 | INFO | Train Epoch: 1 [ 9882112/10637090 (93%)] Loss: 1.1149 (1.179) Data (t): 0.001 Batch (t): 0.914, 570.335/s LR: 0.000000 Logit Scale: 100.000 - V4
520
+ 2024-11-26,23:37:58 | INFO | Train Epoch: 1 [ 9933312/10637090 (93%)] Loss: 1.1064 (1.178) Data (t): 0.001 Batch (t): 0.899, 568.050/s LR: 0.000000 Logit Scale: 100.000 - V4
521
+ 2024-11-26,23:39:28 | INFO | Train Epoch: 1 [ 9984512/10637090 (94%)] Loss: 1.1443 (1.178) Data (t): 0.001 Batch (t): 0.901, 568.709/s LR: 0.000000 Logit Scale: 100.000 - V4
522
+ 2024-11-26,23:40:59 | INFO | Train Epoch: 1 [10035712/10637090 (94%)] Loss: 1.2071 (1.178) Data (t): 0.001 Batch (t): 0.914, 569.486/s LR: 0.000000 Logit Scale: 100.000 - V4
523
+ 2024-11-26,23:42:31 | INFO | Train Epoch: 1 [10086912/10637090 (95%)] Loss: 1.2291 (1.179) Data (t): 0.001 Batch (t): 0.917, 568.210/s LR: 0.000000 Logit Scale: 100.000 - V4
524
+ 2024-11-26,23:44:04 | INFO | Train Epoch: 1 [10138112/10637090 (95%)] Loss: 0.94618 (1.178) Data (t): 0.001 Batch (t): 0.933, 566.726/s LR: 0.000000 Logit Scale: 100.000 - V4
525
+ 2024-11-26,23:45:34 | INFO | Train Epoch: 1 [10189312/10637090 (96%)] Loss: 1.1821 (1.178) Data (t): 0.001 Batch (t): 0.899, 567.719/s LR: 0.000000 Logit Scale: 100.000 - V4
526
+ 2024-11-26,23:47:04 | INFO | Train Epoch: 1 [10240512/10637090 (96%)] Loss: 1.1193 (1.177) Data (t): 0.001 Batch (t): 0.899, 567.518/s LR: 0.000000 Logit Scale: 100.000 - V4
527
+ 2024-11-26,23:48:35 | INFO | Train Epoch: 1 [10291712/10637090 (97%)] Loss: 1.0365 (1.177) Data (t): 0.001 Batch (t): 0.905, 571.630/s LR: 0.000000 Logit Scale: 100.000 - V4
528
+ 2024-11-26,23:50:06 | INFO | Train Epoch: 1 [10342912/10637090 (97%)] Loss: 1.0404 (1.176) Data (t): 0.001 Batch (t): 0.917, 569.601/s LR: 0.000000 Logit Scale: 100.000 - V4
529
+ 2024-11-26,23:51:40 | INFO | Train Epoch: 1 [10394112/10637090 (98%)] Loss: 1.1874 (1.176) Data (t): 0.001 Batch (t): 0.938, 567.593/s LR: 0.000000 Logit Scale: 100.000 - V4
530
+ 2024-11-26,23:53:10 | INFO | Train Epoch: 1 [10445312/10637090 (98%)] Loss: 1.0585 (1.175) Data (t): 0.001 Batch (t): 0.899, 568.669/s LR: 0.000000 Logit Scale: 100.000 - V4
531
+ 2024-11-26,23:54:40 | INFO | Train Epoch: 1 [10496512/10637090 (99%)] Loss: 1.1873 (1.175) Data (t): 0.001 Batch (t): 0.900, 565.194/s LR: 0.000000 Logit Scale: 100.000 - V4
532
+ 2024-11-26,23:56:10 | INFO | Train Epoch: 1 [10547712/10637090 (99%)] Loss: 1.0841 (1.175) Data (t): 0.001 Batch (t): 0.899, 570.082/s LR: 0.000000 Logit Scale: 100.000 - V4
533
+ 2024-11-26,23:57:42 | INFO | Train Epoch: 1 [10598912/10637090 (100%)] Loss: 0.96767 (1.174) Data (t): 0.001 Batch (t): 0.925, 570.929/s LR: 0.000000 Logit Scale: 100.000 - V4
534
+ 2024-11-26,23:58:51 | INFO | Train Epoch: 1 [10636800/10637090 (100%)] Loss: 1.2094 (1.174) Data (t): 0.002 Batch (t): 0.933, 573.472/s LR: 0.000000 Logit Scale: 100.000 - V4
data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-13_26_45-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp/params.txt ADDED
@@ -0,0 +1,67 @@
1
+ batch_size: 64
2
+ beta1: 0.9
3
+ beta2: 0.98
4
+ checkpoint_path: data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-13_26_45-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp/checkpoints
5
+ copy_codebase: False
6
+ csv_caption_key: caption
7
+ csv_hard_captions_key: neg_caption
8
+ csv_img_key: img_path
9
+ csv_separator: ,
10
+ dataset_resampled: False
11
+ dataset_type: csv
12
+ ddp_static_graph: False
13
+ debug: False
14
+ device: cuda:0
15
+ dist_backend: nccl
16
+ dist_url: env://
17
+ distributed: True
18
+ epochs: 2
19
+ eps: 1e-06
20
+ force_quick_gelu: True
21
+ gather_with_grad: False
22
+ grad_checkpointing: False
23
+ horovod: False
24
+ imagenet_v2: None
25
+ imagenet_val: None
26
+ local_loss: False
27
+ local_rank: 0
28
+ lock_image: False
29
+ lock_image_freeze_bn_stats: False
30
+ lock_image_unlocked_groups: 0
31
+ log_level: 20
32
+ log_local: False
33
+ log_path: data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-13_26_45-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp/out.log
34
+ logs: data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2
35
+ lr: 1e-06
36
+ model: ViT-L-14-336
37
+ name: 2024_11_26-13_26_45-model_ViT-L-14-336-lr_1e-06-b_64-j_4-p_amp
38
+ no_set_device_rank: False
39
+ norm_gradient_clip: None
40
+ precision: amp
41
+ pretrained: data/openclip-vit-14-336/openclip_model.pt
42
+ pretrained_image: False
43
+ rank: 0
44
+ report_to: wandb
45
+ resume: None
46
+ save_frequency: 1
47
+ save_most_recent: False
48
+ seed: 0
49
+ skip_scheduler: False
50
+ tensorboard: False
51
+ tensorboard_path:
52
+ torchscript: False
53
+ trace: False
54
+ train_data: csv_data/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2.csv
55
+ train_num_samples: None
56
+ use_bn_sync: False
57
+ val_data: None
58
+ val_frequency: 1
59
+ val_num_samples: None
60
+ wandb: True
61
+ wandb_notes:
62
+ wandb_project: neg-clip-plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2
63
+ warmup: 0
64
+ wd: 0.1
65
+ workers: 4
66
+ world_size: 8
67
+ zeroshot_frequency: 2
data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-23_59_33-model_ViT-L-14-336-lr_5e-06-b_64-j_4-p_amp/checkpoints/epoch_1.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:05200113a7debf4bb778eb5be911589ec276996de31db68df3d2b817036d39d6
3
+ size 5135890710
data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-23_59_33-model_ViT-L-14-336-lr_5e-06-b_64-j_4-p_amp/checkpoints/epoch_2.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:513b1979b9b90c2d986e67257ccc243c080330b27f272ed625c65555970f6a4b
3
+ size 5135890710
data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-23_59_33-model_ViT-L-14-336-lr_5e-06-b_64-j_4-p_amp/out.log ADDED
@@ -0,0 +1,534 @@
1
+ 2024-11-26,23:59:33 | INFO | Running in distributed mode with multiple processes. Device: cuda:0.Process (global: 0, local 0), total 8.
2
+ 2024-11-26,23:59:33 | INFO | Loading ViT-L-14-336 model config.
3
+ 2024-11-26,23:59:36 | INFO | Loading pretrained ViT-L-14-336 weights (data/openclip-vit-14-336/openclip_model.pt).
4
+ 2024-11-26,23:59:43 | INFO | Model:
5
+ 2024-11-26,23:59:43 | INFO | CLIP(
6
+ (visual): VisualTransformer(
7
+ (conv1): Conv2d(3, 1024, kernel_size=(14, 14), stride=(14, 14), bias=False)
8
+ (ln_pre): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
9
+ (transformer): Transformer(
10
+ (resblocks): ModuleList(
11
+ (0-23): 24 x ResidualAttentionBlock(
12
+ (attn): MultiheadAttention(
13
+ (out_proj): NonDynamicallyQuantizableLinear(in_features=1024, out_features=1024, bias=True)
14
+ )
15
+ (ln_1): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
16
+ (mlp): Sequential(
17
+ (c_fc): Linear(in_features=1024, out_features=4096, bias=True)
18
+ (gelu): QuickGELU()
19
+ (c_proj): Linear(in_features=4096, out_features=1024, bias=True)
20
+ )
21
+ (ln_2): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
22
+ )
23
+ )
24
+ )
25
+ (ln_post): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
26
+ )
27
+ (transformer): Transformer(
28
+ (resblocks): ModuleList(
29
+ (0-11): 12 x ResidualAttentionBlock(
30
+ (attn): MultiheadAttention(
31
+ (out_proj): NonDynamicallyQuantizableLinear(in_features=768, out_features=768, bias=True)
32
+ )
33
+ (ln_1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
34
+ (mlp): Sequential(
35
+ (c_fc): Linear(in_features=768, out_features=3072, bias=True)
36
+ (gelu): QuickGELU()
37
+ (c_proj): Linear(in_features=3072, out_features=768, bias=True)
38
+ )
39
+ (ln_2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
40
+ )
41
+ )
42
+ )
43
+ (token_embedding): Embedding(49408, 768)
44
+ (ln_final): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
45
+ )
46
+ 2024-11-26,23:59:43 | INFO | Params:
47
+ 2024-11-26,23:59:43 | INFO | batch_size: 64
48
+ 2024-11-26,23:59:43 | INFO | beta1: 0.9
49
+ 2024-11-26,23:59:43 | INFO | beta2: 0.98
50
+ 2024-11-26,23:59:43 | INFO | checkpoint_path: data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-23_59_33-model_ViT-L-14-336-lr_5e-06-b_64-j_4-p_amp/checkpoints
51
+ 2024-11-26,23:59:43 | INFO | copy_codebase: False
52
+ 2024-11-26,23:59:43 | INFO | csv_caption_key: caption
53
+ 2024-11-26,23:59:43 | INFO | csv_hard_captions_key: neg_caption
54
+ 2024-11-26,23:59:43 | INFO | csv_img_key: img_path
55
+ 2024-11-26,23:59:43 | INFO | csv_separator: ,
56
+ 2024-11-26,23:59:43 | INFO | dataset_resampled: False
57
+ 2024-11-26,23:59:43 | INFO | dataset_type: csv
58
+ 2024-11-26,23:59:43 | INFO | ddp_static_graph: False
59
+ 2024-11-26,23:59:43 | INFO | debug: False
60
+ 2024-11-26,23:59:43 | INFO | device: cuda:0
61
+ 2024-11-26,23:59:43 | INFO | dist_backend: nccl
62
+ 2024-11-26,23:59:43 | INFO | dist_url: env://
63
+ 2024-11-26,23:59:43 | INFO | distributed: True
64
+ 2024-11-26,23:59:43 | INFO | epochs: 2
65
+ 2024-11-26,23:59:43 | INFO | eps: 1e-06
66
+ 2024-11-26,23:59:43 | INFO | force_quick_gelu: True
67
+ 2024-11-26,23:59:43 | INFO | gather_with_grad: False
68
+ 2024-11-26,23:59:43 | INFO | grad_checkpointing: False
69
+ 2024-11-26,23:59:43 | INFO | horovod: False
70
+ 2024-11-26,23:59:43 | INFO | imagenet_v2: None
71
+ 2024-11-26,23:59:43 | INFO | imagenet_val: None
72
+ 2024-11-26,23:59:43 | INFO | local_loss: False
73
+ 2024-11-26,23:59:43 | INFO | local_rank: 0
74
+ 2024-11-26,23:59:43 | INFO | lock_image: False
75
+ 2024-11-26,23:59:43 | INFO | lock_image_freeze_bn_stats: False
76
+ 2024-11-26,23:59:43 | INFO | lock_image_unlocked_groups: 0
77
+ 2024-11-26,23:59:43 | INFO | log_level: 20
78
+ 2024-11-26,23:59:43 | INFO | log_local: False
79
+ 2024-11-26,23:59:43 | INFO | log_path: data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-23_59_33-model_ViT-L-14-336-lr_5e-06-b_64-j_4-p_amp/out.log
80
+ 2024-11-26,23:59:43 | INFO | logs: data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2
81
+ 2024-11-26,23:59:43 | INFO | lr: 5e-06
82
+ 2024-11-26,23:59:43 | INFO | model: ViT-L-14-336
83
+ 2024-11-26,23:59:43 | INFO | name: 2024_11_26-23_59_33-model_ViT-L-14-336-lr_5e-06-b_64-j_4-p_amp
84
+ 2024-11-26,23:59:43 | INFO | no_set_device_rank: False
85
+ 2024-11-26,23:59:43 | INFO | norm_gradient_clip: None
86
+ 2024-11-26,23:59:43 | INFO | precision: amp
87
+ 2024-11-26,23:59:43 | INFO | pretrained: data/openclip-vit-14-336/openclip_model.pt
88
+ 2024-11-26,23:59:43 | INFO | pretrained_image: False
89
+ 2024-11-26,23:59:43 | INFO | rank: 0
90
+ 2024-11-26,23:59:43 | INFO | report_to: wandb
91
+ 2024-11-26,23:59:43 | INFO | resume: None
92
+ 2024-11-26,23:59:43 | INFO | save_frequency: 1
93
+ 2024-11-26,23:59:43 | INFO | save_most_recent: False
94
+ 2024-11-26,23:59:43 | INFO | seed: 0
95
+ 2024-11-26,23:59:43 | INFO | skip_scheduler: False
96
+ 2024-11-26,23:59:43 | INFO | tensorboard: False
97
+ 2024-11-26,23:59:43 | INFO | tensorboard_path:
98
+ 2024-11-26,23:59:43 | INFO | torchscript: False
99
+ 2024-11-26,23:59:43 | INFO | trace: False
100
+ 2024-11-26,23:59:43 | INFO | train_data: csv_data/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2.csv
101
+ 2024-11-26,23:59:43 | INFO | train_num_samples: None
102
+ 2024-11-26,23:59:43 | INFO | use_bn_sync: False
103
+ 2024-11-26,23:59:43 | INFO | val_data: None
104
+ 2024-11-26,23:59:43 | INFO | val_frequency: 1
105
+ 2024-11-26,23:59:43 | INFO | val_num_samples: None
106
+ 2024-11-26,23:59:43 | INFO | wandb: True
107
+ 2024-11-26,23:59:43 | INFO | wandb_notes:
108
+ 2024-11-26,23:59:43 | INFO | wandb_project: neg-clip-plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2
109
+ 2024-11-26,23:59:43 | INFO | warmup: 0
110
+ 2024-11-26,23:59:43 | INFO | wd: 0.1
111
+ 2024-11-26,23:59:43 | INFO | workers: 4
112
+ 2024-11-26,23:59:43 | INFO | world_size: 8
113
+ 2024-11-26,23:59:43 | INFO | zeroshot_frequency: 2
114
+ 2024-11-27,00:00:38 | INFO | Init a wandb project!
115
+ 2024-11-27,00:00:58 | INFO | Start epoch 0
116
+ 2024-11-27,00:01:05 | INFO | Train Epoch: 0 [ 512/10637090 (0%)] Loss: 6.0211 (6.021) Data (t): 2.743 Batch (t): 6.422, 79.7278/s LR: 0.000005 Logit Scale: 100.000 - V4
+ 2024-11-27,00:02:36 | INFO | Train Epoch: 0 [ 51712/10637090 (0%)] Loss: 2.0704 (4.046) Data (t): 0.001 Batch (t): 0.909, 567.485/s LR: 0.000005 Logit Scale: 99.995 - V4
+ 2024-11-27,00:04:06 | INFO | Train Epoch: 0 [ 102912/10637090 (1%)] Loss: 1.8651 (3.319) Data (t): 0.001 Batch (t): 0.900, 569.403/s LR: 0.000005 Logit Scale: 99.996 - V4
+ 2024-11-27,00:05:37 | INFO | Train Epoch: 0 [ 154112/10637090 (1%)] Loss: 1.6311 (2.897) Data (t): 0.001 Batch (t): 0.915, 569.471/s LR: 0.000005 Logit Scale: 99.994 - V4
+ 2024-11-27,00:07:12 | INFO | Train Epoch: 0 [ 205312/10637090 (2%)] Loss: 1.5378 (2.625) Data (t): 0.001 Batch (t): 0.944, 567.919/s LR: 0.000005 Logit Scale: 99.992 - V4
+ 2024-11-27,00:08:42 | INFO | Train Epoch: 0 [ 256512/10637090 (2%)] Loss: 1.4296 (2.426) Data (t): 0.001 Batch (t): 0.899, 567.674/s LR: 0.000005 Logit Scale: 99.991 - V4
+ 2024-11-27,00:10:12 | INFO | Train Epoch: 0 [ 307712/10637090 (3%)] Loss: 1.4987 (2.293) Data (t): 0.001 Batch (t): 0.899, 572.843/s LR: 0.000005 Logit Scale: 99.987 - V4
+ 2024-11-27,00:11:42 | INFO | Train Epoch: 0 [ 358912/10637090 (3%)] Loss: 1.4410 (2.187) Data (t): 0.001 Batch (t): 0.900, 568.004/s LR: 0.000005 Logit Scale: 99.984 - V4
+ 2024-11-27,00:13:12 | INFO | Train Epoch: 0 [ 410112/10637090 (4%)] Loss: 1.4968 (2.110) Data (t): 0.001 Batch (t): 0.899, 570.547/s LR: 0.000005 Logit Scale: 99.983 - V4
+ 2024-11-27,00:14:49 | INFO | Train Epoch: 0 [ 461312/10637090 (4%)] Loss: 1.6677 (2.066) Data (t): 0.001 Batch (t): 0.972, 570.981/s LR: 0.000005 Logit Scale: 99.978 - V4
+ 2024-11-27,00:16:19 | INFO | Train Epoch: 0 [ 512512/10637090 (5%)] Loss: 1.3984 (2.005) Data (t): 0.001 Batch (t): 0.900, 569.236/s LR: 0.000005 Logit Scale: 99.977 - V4
+ 2024-11-27,00:17:49 | INFO | Train Epoch: 0 [ 563712/10637090 (5%)] Loss: 1.4217 (1.957) Data (t): 0.001 Batch (t): 0.899, 568.284/s LR: 0.000005 Logit Scale: 99.972 - V4
+ 2024-11-27,00:19:18 | INFO | Train Epoch: 0 [ 614912/10637090 (6%)] Loss: 1.3985 (1.914) Data (t): 0.001 Batch (t): 0.898, 569.348/s LR: 0.000005 Logit Scale: 99.970 - V4
+ 2024-11-27,00:20:48 | INFO | Train Epoch: 0 [ 666112/10637090 (6%)] Loss: 1.3929 (1.876) Data (t): 0.001 Batch (t): 0.899, 569.799/s LR: 0.000005 Logit Scale: 99.967 - V4
+ 2024-11-27,00:22:25 | INFO | Train Epoch: 0 [ 717312/10637090 (7%)] Loss: 1.2314 (1.833) Data (t): 0.001 Batch (t): 0.962, 570.341/s LR: 0.000005 Logit Scale: 99.965 - V4
+ 2024-11-27,00:23:54 | INFO | Train Epoch: 0 [ 768512/10637090 (7%)] Loss: 1.4314 (1.808) Data (t): 0.001 Batch (t): 0.899, 569.148/s LR: 0.000005 Logit Scale: 99.961 - V4
+ 2024-11-27,00:25:24 | INFO | Train Epoch: 0 [ 819712/10637090 (8%)] Loss: 1.4184 (1.785) Data (t): 0.001 Batch (t): 0.899, 569.275/s LR: 0.000005 Logit Scale: 99.959 - V4
+ 2024-11-27,00:26:54 | INFO | Train Epoch: 0 [ 870912/10637090 (8%)] Loss: 1.4606 (1.767) Data (t): 0.001 Batch (t): 0.899, 567.158/s LR: 0.000005 Logit Scale: 99.955 - V4
+ 2024-11-27,00:28:24 | INFO | Train Epoch: 0 [ 922112/10637090 (9%)] Loss: 1.4578 (1.751) Data (t): 0.001 Batch (t): 0.899, 566.798/s LR: 0.000005 Logit Scale: 99.953 - V4
+ 2024-11-27,00:30:00 | INFO | Train Epoch: 0 [ 973312/10637090 (9%)] Loss: 1.3601 (1.732) Data (t): 0.001 Batch (t): 0.956, 571.441/s LR: 0.000005 Logit Scale: 99.948 - V4
+ 2024-11-27,00:31:30 | INFO | Train Epoch: 0 [ 1024512/10637090 (10%)] Loss: 1.3931 (1.715) Data (t): 0.001 Batch (t): 0.898, 569.813/s LR: 0.000005 Logit Scale: 99.944 - V4
+ 2024-11-27,00:33:00 | INFO | Train Epoch: 0 [ 1075712/10637090 (10%)] Loss: 1.3680 (1.700) Data (t): 0.001 Batch (t): 0.900, 566.470/s LR: 0.000005 Logit Scale: 99.943 - V4
+ 2024-11-27,00:34:30 | INFO | Train Epoch: 0 [ 1126912/10637090 (11%)] Loss: 1.4400 (1.688) Data (t): 0.001 Batch (t): 0.899, 569.038/s LR: 0.000005 Logit Scale: 99.939 - V4
+ 2024-11-27,00:36:00 | INFO | Train Epoch: 0 [ 1178112/10637090 (11%)] Loss: 1.1873 (1.667) Data (t): 0.001 Batch (t): 0.899, 570.775/s LR: 0.000005 Logit Scale: 99.936 - V4
+ 2024-11-27,00:37:34 | INFO | Train Epoch: 0 [ 1229312/10637090 (12%)] Loss: 1.1434 (1.647) Data (t): 0.001 Batch (t): 0.946, 326.849/s LR: 0.000005 Logit Scale: 99.932 - V4
+ 2024-11-27,00:39:06 | INFO | Train Epoch: 0 [ 1280512/10637090 (12%)] Loss: 1.1847 (1.629) Data (t): 0.001 Batch (t): 0.920, 569.612/s LR: 0.000005 Logit Scale: 99.930 - V4
+ 2024-11-27,00:40:36 | INFO | Train Epoch: 0 [ 1331712/10637090 (13%)] Loss: 1.3127 (1.617) Data (t): 0.001 Batch (t): 0.900, 567.596/s LR: 0.000005 Logit Scale: 99.928 - V4
+ 2024-11-27,00:42:06 | INFO | Train Epoch: 0 [ 1382912/10637090 (13%)] Loss: 1.5171 (1.613) Data (t): 0.001 Batch (t): 0.900, 566.739/s LR: 0.000005 Logit Scale: 99.926 - V4
+ 2024-11-27,00:43:36 | INFO | Train Epoch: 0 [ 1434112/10637090 (13%)] Loss: 1.2625 (1.601) Data (t): 0.001 Batch (t): 0.900, 567.925/s LR: 0.000005 Logit Scale: 99.922 - V4
+ 2024-11-27,00:45:07 | INFO | Train Epoch: 0 [ 1485312/10637090 (14%)] Loss: 1.2965 (1.591) Data (t): 0.001 Batch (t): 0.910, 567.549/s LR: 0.000005 Logit Scale: 99.920 - V4
+ 2024-11-27,00:46:42 | INFO | Train Epoch: 0 [ 1536512/10637090 (14%)] Loss: 1.3683 (1.584) Data (t): 0.001 Batch (t): 0.950, 569.403/s LR: 0.000005 Logit Scale: 99.915 - V4
+ 2024-11-27,00:48:12 | INFO | Train Epoch: 0 [ 1587712/10637090 (15%)] Loss: 1.3209 (1.576) Data (t): 0.001 Batch (t): 0.898, 571.054/s LR: 0.000005 Logit Scale: 99.913 - V4
+ 2024-11-27,00:49:42 | INFO | Train Epoch: 0 [ 1638912/10637090 (15%)] Loss: 1.2952 (1.567) Data (t): 0.001 Batch (t): 0.897, 570.095/s LR: 0.000005 Logit Scale: 99.910 - V4
+ 2024-11-27,00:51:11 | INFO | Train Epoch: 0 [ 1690112/10637090 (16%)] Loss: 1.1427 (1.555) Data (t): 0.001 Batch (t): 0.897, 569.632/s LR: 0.000005 Logit Scale: 99.905 - V4
+ 2024-11-27,00:52:41 | INFO | Train Epoch: 0 [ 1741312/10637090 (16%)] Loss: 1.3267 (1.548) Data (t): 0.001 Batch (t): 0.898, 570.207/s LR: 0.000005 Logit Scale: 99.903 - V4
+ 2024-11-27,00:54:18 | INFO | Train Epoch: 0 [ 1792512/10637090 (17%)] Loss: 1.1937 (1.538) Data (t): 0.001 Batch (t): 0.965, 570.586/s LR: 0.000005 Logit Scale: 99.901 - V4
+ 2024-11-27,00:55:48 | INFO | Train Epoch: 0 [ 1843712/10637090 (17%)] Loss: 1.2775 (1.531) Data (t): 0.001 Batch (t): 0.899, 568.347/s LR: 0.000005 Logit Scale: 99.897 - V4
+ 2024-11-27,00:57:17 | INFO | Train Epoch: 0 [ 1894912/10637090 (18%)] Loss: 1.1753 (1.522) Data (t): 0.001 Batch (t): 0.899, 569.093/s LR: 0.000005 Logit Scale: 99.893 - V4
+ 2024-11-27,00:58:47 | INFO | Train Epoch: 0 [ 1946112/10637090 (18%)] Loss: 1.2124 (1.514) Data (t): 0.001 Batch (t): 0.898, 568.568/s LR: 0.000005 Logit Scale: 99.893 - V4
+ 2024-11-27,01:00:17 | INFO | Train Epoch: 0 [ 1997312/10637090 (19%)] Loss: 1.2806 (1.508) Data (t): 0.001 Batch (t): 0.898, 570.066/s LR: 0.000005 Logit Scale: 99.890 - V4
+ 2024-11-27,01:01:54 | INFO | Train Epoch: 0 [ 2048512/10637090 (19%)] Loss: 1.3294 (1.504) Data (t): 0.001 Batch (t): 0.966, 571.047/s LR: 0.000005 Logit Scale: 99.886 - V4
+ 2024-11-27,01:03:23 | INFO | Train Epoch: 0 [ 2099712/10637090 (20%)] Loss: 1.3688 (1.501) Data (t): 0.001 Batch (t): 0.897, 571.014/s LR: 0.000005 Logit Scale: 99.883 - V4
+ 2024-11-27,01:04:53 | INFO | Train Epoch: 0 [ 2150912/10637090 (20%)] Loss: 1.2393 (1.495) Data (t): 0.001 Batch (t): 0.898, 570.310/s LR: 0.000005 Logit Scale: 99.881 - V4
+ 2024-11-27,01:06:23 | INFO | Train Epoch: 0 [ 2202112/10637090 (21%)] Loss: 1.2518 (1.489) Data (t): 0.001 Batch (t): 0.898, 572.764/s LR: 0.000005 Logit Scale: 99.877 - V4
+ 2024-11-27,01:07:53 | INFO | Train Epoch: 0 [ 2253312/10637090 (21%)] Loss: 1.3710 (1.486) Data (t): 0.001 Batch (t): 0.898, 571.602/s LR: 0.000005 Logit Scale: 99.874 - V4
+ 2024-11-27,01:09:27 | INFO | Train Epoch: 0 [ 2304512/10637090 (22%)] Loss: 1.3049 (1.482) Data (t): 0.001 Batch (t): 0.945, 572.675/s LR: 0.000005 Logit Scale: 99.870 - V4
+ 2024-11-27,01:10:59 | INFO | Train Epoch: 0 [ 2355712/10637090 (22%)] Loss: 1.3114 (1.479) Data (t): 0.001 Batch (t): 0.919, 569.923/s LR: 0.000005 Logit Scale: 99.868 - V4
+ 2024-11-27,01:12:29 | INFO | Train Epoch: 0 [ 2406912/10637090 (23%)] Loss: 1.3374 (1.476) Data (t): 0.001 Batch (t): 0.898, 568.928/s LR: 0.000005 Logit Scale: 99.865 - V4
+ 2024-11-27,01:13:59 | INFO | Train Epoch: 0 [ 2458112/10637090 (23%)] Loss: 1.4219 (1.475) Data (t): 0.001 Batch (t): 0.897, 572.496/s LR: 0.000005 Logit Scale: 99.861 - V4
+ 2024-11-27,01:15:28 | INFO | Train Epoch: 0 [ 2509312/10637090 (24%)] Loss: 1.2642 (1.471) Data (t): 0.001 Batch (t): 0.896, 572.387/s LR: 0.000005 Logit Scale: 99.863 - V4
+ 2024-11-27,01:17:02 | INFO | Train Epoch: 0 [ 2560512/10637090 (24%)] Loss: 1.2675 (1.467) Data (t): 0.001 Batch (t): 0.936, 571.857/s LR: 0.000005 Logit Scale: 99.860 - V4
+ 2024-11-27,01:18:34 | INFO | Train Epoch: 0 [ 2611712/10637090 (25%)] Loss: 1.1737 (1.461) Data (t): 0.001 Batch (t): 0.923, 570.747/s LR: 0.000005 Logit Scale: 99.857 - V4
+ 2024-11-27,01:20:04 | INFO | Train Epoch: 0 [ 2662912/10637090 (25%)] Loss: 1.1781 (1.456) Data (t): 0.001 Batch (t): 0.897, 571.081/s LR: 0.000005 Logit Scale: 99.856 - V4
+ 2024-11-27,01:21:34 | INFO | Train Epoch: 0 [ 2714112/10637090 (26%)] Loss: 1.1610 (1.450) Data (t): 0.001 Batch (t): 0.897, 571.302/s LR: 0.000005 Logit Scale: 99.855 - V4
+ 2024-11-27,01:23:03 | INFO | Train Epoch: 0 [ 2765312/10637090 (26%)] Loss: 1.2686 (1.447) Data (t): 0.001 Batch (t): 0.897, 569.776/s LR: 0.000005 Logit Scale: 99.853 - V4
+ 2024-11-27,01:24:33 | INFO | Train Epoch: 0 [ 2816512/10637090 (26%)] Loss: 1.1229 (1.441) Data (t): 0.001 Batch (t): 0.897, 570.995/s LR: 0.000005 Logit Scale: 99.849 - V4
+ 2024-11-27,01:26:09 | INFO | Train Epoch: 0 [ 2867712/10637090 (27%)] Loss: 1.2637 (1.438) Data (t): 0.001 Batch (t): 0.963, 571.722/s LR: 0.000005 Logit Scale: 99.846 - V4
+ 2024-11-27,01:27:39 | INFO | Train Epoch: 0 [ 2918912/10637090 (27%)] Loss: 1.2818 (1.435) Data (t): 0.001 Batch (t): 0.897, 571.497/s LR: 0.000005 Logit Scale: 99.848 - V4
+ 2024-11-27,01:29:09 | INFO | Train Epoch: 0 [ 2970112/10637090 (28%)] Loss: 1.1737 (1.431) Data (t): 0.001 Batch (t): 0.897, 571.149/s LR: 0.000005 Logit Scale: 99.843 - V4
+ 2024-11-27,01:30:38 | INFO | Train Epoch: 0 [ 3021312/10637090 (28%)] Loss: 1.1583 (1.426) Data (t): 0.001 Batch (t): 0.897, 571.518/s LR: 0.000005 Logit Scale: 99.843 - V4
+ 2024-11-27,01:32:08 | INFO | Train Epoch: 0 [ 3072512/10637090 (29%)] Loss: 1.0435 (1.420) Data (t): 0.001 Batch (t): 0.896, 572.025/s LR: 0.000005 Logit Scale: 99.839 - V4
+ 2024-11-27,01:33:44 | INFO | Train Epoch: 0 [ 3123712/10637090 (29%)] Loss: 1.2389 (1.417) Data (t): 0.001 Batch (t): 0.958, 571.597/s LR: 0.000005 Logit Scale: 99.836 - V4
+ 2024-11-27,01:35:13 | INFO | Train Epoch: 0 [ 3174912/10637090 (30%)] Loss: 1.2484 (1.414) Data (t): 0.001 Batch (t): 0.896, 569.564/s LR: 0.000005 Logit Scale: 99.836 - V4
+ 2024-11-27,01:36:43 | INFO | Train Epoch: 0 [ 3226112/10637090 (30%)] Loss: 1.1787 (1.411) Data (t): 0.001 Batch (t): 0.896, 570.934/s LR: 0.000005 Logit Scale: 99.835 - V4
+ 2024-11-27,01:38:12 | INFO | Train Epoch: 0 [ 3277312/10637090 (31%)] Loss: 1.1424 (1.407) Data (t): 0.001 Batch (t): 0.895, 572.160/s LR: 0.000005 Logit Scale: 99.833 - V4
+ 2024-11-27,01:39:42 | INFO | Train Epoch: 0 [ 3328512/10637090 (31%)] Loss: 1.2771 (1.405) Data (t): 0.001 Batch (t): 0.896, 570.350/s LR: 0.000005 Logit Scale: 99.833 - V4
+ 2024-11-27,01:41:16 | INFO | Train Epoch: 0 [ 3379712/10637090 (32%)] Loss: 1.2625 (1.403) Data (t): 0.001 Batch (t): 0.936, 568.358/s LR: 0.000005 Logit Scale: 99.832 - V4
+ 2024-11-27,01:42:47 | INFO | Train Epoch: 0 [ 3430912/10637090 (32%)] Loss: 1.1774 (1.399) Data (t): 0.001 Batch (t): 0.916, 575.051/s LR: 0.000005 Logit Scale: 99.831 - V4
+ 2024-11-27,01:44:17 | INFO | Train Epoch: 0 [ 3482112/10637090 (33%)] Loss: 1.2596 (1.397) Data (t): 0.001 Batch (t): 0.897, 569.664/s LR: 0.000005 Logit Scale: 99.826 - V4
+ 2024-11-27,01:45:47 | INFO | Train Epoch: 0 [ 3533312/10637090 (33%)] Loss: 1.1953 (1.394) Data (t): 0.001 Batch (t): 0.897, 570.688/s LR: 0.000005 Logit Scale: 99.827 - V4
+ 2024-11-27,01:47:16 | INFO | Train Epoch: 0 [ 3584512/10637090 (34%)] Loss: 1.1942 (1.392) Data (t): 0.001 Batch (t): 0.896, 570.153/s LR: 0.000005 Logit Scale: 99.825 - V4
+ 2024-11-27,01:48:51 | INFO | Train Epoch: 0 [ 3635712/10637090 (34%)] Loss: 1.2278 (1.389) Data (t): 0.001 Batch (t): 0.943, 572.123/s LR: 0.000005 Logit Scale: 99.824 - V4
+ 2024-11-27,01:50:22 | INFO | Train Epoch: 0 [ 3686912/10637090 (35%)] Loss: 1.3798 (1.389) Data (t): 0.001 Batch (t): 0.916, 568.780/s LR: 0.000005 Logit Scale: 99.822 - V4
+ 2024-11-27,01:51:52 | INFO | Train Epoch: 0 [ 3738112/10637090 (35%)] Loss: 1.1573 (1.386) Data (t): 0.001 Batch (t): 0.897, 570.548/s LR: 0.000005 Logit Scale: 99.822 - V4
+ 2024-11-27,01:53:21 | INFO | Train Epoch: 0 [ 3789312/10637090 (36%)] Loss: 1.0215 (1.381) Data (t): 0.001 Batch (t): 0.896, 572.590/s LR: 0.000005 Logit Scale: 99.818 - V4
+ 2024-11-27,01:54:51 | INFO | Train Epoch: 0 [ 3840512/10637090 (36%)] Loss: 1.2224 (1.379) Data (t): 0.001 Batch (t): 0.896, 569.029/s LR: 0.000005 Logit Scale: 99.817 - V4
+ 2024-11-27,01:56:24 | INFO | Train Epoch: 0 [ 3891712/10637090 (37%)] Loss: 1.1264 (1.376) Data (t): 0.001 Batch (t): 0.926, 324.695/s LR: 0.000005 Logit Scale: 99.818 - V4
+ 2024-11-27,01:57:57 | INFO | Train Epoch: 0 [ 3942912/10637090 (37%)] Loss: 1.1852 (1.373) Data (t): 0.001 Batch (t): 0.934, 569.099/s LR: 0.000005 Logit Scale: 99.819 - V4
+ 2024-11-27,01:59:27 | INFO | Train Epoch: 0 [ 3994112/10637090 (38%)] Loss: 1.1099 (1.370) Data (t): 0.001 Batch (t): 0.896, 573.960/s LR: 0.000005 Logit Scale: 99.817 - V4
+ 2024-11-27,02:00:56 | INFO | Train Epoch: 0 [ 4045312/10637090 (38%)] Loss: 1.1226 (1.367) Data (t): 0.001 Batch (t): 0.897, 573.014/s LR: 0.000005 Logit Scale: 99.818 - V4
+ 2024-11-27,02:02:26 | INFO | Train Epoch: 0 [ 4096512/10637090 (39%)] Loss: 1.1286 (1.364) Data (t): 0.001 Batch (t): 0.898, 570.082/s LR: 0.000005 Logit Scale: 99.815 - V4
+ 2024-11-27,02:03:56 | INFO | Train Epoch: 0 [ 4147712/10637090 (39%)] Loss: 1.1437 (1.361) Data (t): 0.001 Batch (t): 0.896, 571.589/s LR: 0.000005 Logit Scale: 99.816 - V4
+ 2024-11-27,02:05:32 | INFO | Train Epoch: 0 [ 4198912/10637090 (39%)] Loss: 1.1229 (1.358) Data (t): 0.001 Batch (t): 0.961, 571.753/s LR: 0.000005 Logit Scale: 99.813 - V4
+ 2024-11-27,02:07:02 | INFO | Train Epoch: 0 [ 4250112/10637090 (40%)] Loss: 1.2819 (1.357) Data (t): 0.001 Batch (t): 0.896, 569.461/s LR: 0.000005 Logit Scale: 99.811 - V4
+ 2024-11-27,02:08:31 | INFO | Train Epoch: 0 [ 4301312/10637090 (40%)] Loss: 1.3135 (1.357) Data (t): 0.001 Batch (t): 0.896, 570.928/s LR: 0.000005 Logit Scale: 99.812 - V4
+ 2024-11-27,02:10:01 | INFO | Train Epoch: 0 [ 4352512/10637090 (41%)] Loss: 1.1356 (1.354) Data (t): 0.001 Batch (t): 0.895, 571.161/s LR: 0.000005 Logit Scale: 99.809 - V4
+ 2024-11-27,02:11:30 | INFO | Train Epoch: 0 [ 4403712/10637090 (41%)] Loss: 1.1187 (1.352) Data (t): 0.001 Batch (t): 0.895, 570.798/s LR: 0.000004 Logit Scale: 99.809 - V4
+ 2024-11-27,02:13:05 | INFO | Train Epoch: 0 [ 4454912/10637090 (42%)] Loss: 1.1785 (1.350) Data (t): 0.001 Batch (t): 0.943, 572.874/s LR: 0.000004 Logit Scale: 99.808 - V4
+ 2024-11-27,02:14:36 | INFO | Train Epoch: 0 [ 4506112/10637090 (42%)] Loss: 1.1717 (1.348) Data (t): 0.001 Batch (t): 0.917, 571.405/s LR: 0.000004 Logit Scale: 99.806 - V4
+ 2024-11-27,02:16:06 | INFO | Train Epoch: 0 [ 4557312/10637090 (43%)] Loss: 1.2548 (1.347) Data (t): 0.001 Batch (t): 0.895, 569.395/s LR: 0.000004 Logit Scale: 99.806 - V4
+ 2024-11-27,02:17:36 | INFO | Train Epoch: 0 [ 4608512/10637090 (43%)] Loss: 1.0788 (1.344) Data (t): 0.001 Batch (t): 0.896, 571.157/s LR: 0.000004 Logit Scale: 99.807 - V4
+ 2024-11-27,02:19:05 | INFO | Train Epoch: 0 [ 4659712/10637090 (44%)] Loss: 1.2446 (1.343) Data (t): 0.001 Batch (t): 0.896, 569.782/s LR: 0.000004 Logit Scale: 99.806 - V4
+ 2024-11-27,02:20:40 | INFO | Train Epoch: 0 [ 4710912/10637090 (44%)] Loss: 1.2120 (1.341) Data (t): 0.001 Batch (t): 0.945, 571.213/s LR: 0.000004 Logit Scale: 99.805 - V4
+ 2024-11-27,02:22:11 | INFO | Train Epoch: 0 [ 4762112/10637090 (45%)] Loss: 1.3526 (1.341) Data (t): 0.001 Batch (t): 0.917, 573.227/s LR: 0.000004 Logit Scale: 99.804 - V4
+ 2024-11-27,02:23:41 | INFO | Train Epoch: 0 [ 4813312/10637090 (45%)] Loss: 1.1652 (1.340) Data (t): 0.001 Batch (t): 0.896, 573.041/s LR: 0.000004 Logit Scale: 99.804 - V4
+ 2024-11-27,02:25:11 | INFO | Train Epoch: 0 [ 4864512/10637090 (46%)] Loss: 1.2001 (1.338) Data (t): 0.001 Batch (t): 0.897, 571.774/s LR: 0.000004 Logit Scale: 99.801 - V4
+ 2024-11-27,02:26:40 | INFO | Train Epoch: 0 [ 4915712/10637090 (46%)] Loss: 1.3084 (1.338) Data (t): 0.001 Batch (t): 0.896, 571.180/s LR: 0.000004 Logit Scale: 99.802 - V4
+ 2024-11-27,02:28:13 | INFO | Train Epoch: 0 [ 4966912/10637090 (47%)] Loss: 1.1183 (1.336) Data (t): 0.001 Batch (t): 0.934, 571.497/s LR: 0.000004 Logit Scale: 99.802 - V4
+ 2024-11-27,02:29:46 | INFO | Train Epoch: 0 [ 5018112/10637090 (47%)] Loss: 1.1509 (1.334) Data (t): 0.001 Batch (t): 0.927, 571.698/s LR: 0.000004 Logit Scale: 99.802 - V4
+ 2024-11-27,02:31:16 | INFO | Train Epoch: 0 [ 5069312/10637090 (48%)] Loss: 1.0520 (1.331) Data (t): 0.001 Batch (t): 0.897, 571.711/s LR: 0.000004 Logit Scale: 99.802 - V4
+ 2024-11-27,02:32:46 | INFO | Train Epoch: 0 [ 5120512/10637090 (48%)] Loss: 1.2611 (1.330) Data (t): 0.001 Batch (t): 0.896, 576.107/s LR: 0.000004 Logit Scale: 99.802 - V4
+ 2024-11-27,02:34:15 | INFO | Train Epoch: 0 [ 5171712/10637090 (49%)] Loss: 1.2434 (1.329) Data (t): 0.001 Batch (t): 0.896, 570.625/s LR: 0.000004 Logit Scale: 99.801 - V4
+ 2024-11-27,02:35:47 | INFO | Train Epoch: 0 [ 5222912/10637090 (49%)] Loss: 1.1106 (1.327) Data (t): 0.001 Batch (t): 0.920, 324.289/s LR: 0.000004 Logit Scale: 99.804 - V4
+ 2024-11-27,02:37:21 | INFO | Train Epoch: 0 [ 5274112/10637090 (50%)] Loss: 1.1516 (1.325) Data (t): 0.001 Batch (t): 0.941, 571.862/s LR: 0.000004 Logit Scale: 99.803 - V4
+ 2024-11-27,02:38:51 | INFO | Train Epoch: 0 [ 5325312/10637090 (50%)] Loss: 1.1647 (1.324) Data (t): 0.001 Batch (t): 0.896, 572.345/s LR: 0.000004 Logit Scale: 99.804 - V4
+ 2024-11-27,02:40:20 | INFO | Train Epoch: 0 [ 5376512/10637090 (51%)] Loss: 1.1783 (1.323) Data (t): 0.001 Batch (t): 0.896, 571.826/s LR: 0.000004 Logit Scale: 99.803 - V4
+ 2024-11-27,02:41:50 | INFO | Train Epoch: 0 [ 5427712/10637090 (51%)] Loss: 1.1427 (1.321) Data (t): 0.001 Batch (t): 0.897, 572.677/s LR: 0.000004 Logit Scale: 99.801 - V4
+ 2024-11-27,02:43:21 | INFO | Train Epoch: 0 [ 5478912/10637090 (52%)] Loss: 1.1340 (1.319) Data (t): 0.001 Batch (t): 0.906, 574.483/s LR: 0.000004 Logit Scale: 99.802 - V4
+ 2024-11-27,02:44:54 | INFO | Train Epoch: 0 [ 5530112/10637090 (52%)] Loss: 1.2224 (1.318) Data (t): 0.001 Batch (t): 0.934, 570.175/s LR: 0.000004 Logit Scale: 99.803 - V4
+ 2024-11-27,02:46:26 | INFO | Train Epoch: 0 [ 5581312/10637090 (52%)] Loss: 1.2584 (1.318) Data (t): 0.001 Batch (t): 0.917, 570.760/s LR: 0.000004 Logit Scale: 99.802 - V4
+ 2024-11-27,02:47:55 | INFO | Train Epoch: 0 [ 5632512/10637090 (53%)] Loss: 1.1864 (1.317) Data (t): 0.001 Batch (t): 0.896, 569.898/s LR: 0.000004 Logit Scale: 99.803 - V4
+ 2024-11-27,02:49:25 | INFO | Train Epoch: 0 [ 5683712/10637090 (53%)] Loss: 1.1288 (1.315) Data (t): 0.001 Batch (t): 0.896, 570.907/s LR: 0.000004 Logit Scale: 99.803 - V4
+ 2024-11-27,02:50:55 | INFO | Train Epoch: 0 [ 5734912/10637090 (54%)] Loss: 1.1840 (1.314) Data (t): 0.001 Batch (t): 0.897, 573.630/s LR: 0.000004 Logit Scale: 99.803 - V4
+ 2024-11-27,02:52:29 | INFO | Train Epoch: 0 [ 5786112/10637090 (54%)] Loss: 1.1072 (1.312) Data (t): 0.001 Batch (t): 0.944, 573.707/s LR: 0.000004 Logit Scale: 99.802 - V4
+ 2024-11-27,02:54:01 | INFO | Train Epoch: 0 [ 5837312/10637090 (55%)] Loss: 1.1986 (1.311) Data (t): 0.001 Batch (t): 0.917, 572.057/s LR: 0.000004 Logit Scale: 99.803 - V4
+ 2024-11-27,02:55:30 | INFO | Train Epoch: 0 [ 5888512/10637090 (55%)] Loss: 1.1697 (1.310) Data (t): 0.001 Batch (t): 0.896, 572.526/s LR: 0.000004 Logit Scale: 99.809 - V4
+ 2024-11-27,02:57:00 | INFO | Train Epoch: 0 [ 5939712/10637090 (56%)] Loss: 1.1323 (1.308) Data (t): 0.001 Batch (t): 0.896, 571.471/s LR: 0.000004 Logit Scale: 99.809 - V4
+ 2024-11-27,02:58:30 | INFO | Train Epoch: 0 [ 5990912/10637090 (56%)] Loss: 1.1166 (1.307) Data (t): 0.001 Batch (t): 0.897, 572.541/s LR: 0.000004 Logit Scale: 99.808 - V4
+ 2024-11-27,03:00:03 | INFO | Train Epoch: 0 [ 6042112/10637090 (57%)] Loss: 1.1826 (1.306) Data (t): 0.001 Batch (t): 0.935, 570.517/s LR: 0.000004 Logit Scale: 99.810 - V4
+ 2024-11-27,03:01:36 | INFO | Train Epoch: 0 [ 6093312/10637090 (57%)] Loss: 1.1965 (1.305) Data (t): 0.001 Batch (t): 0.927, 570.458/s LR: 0.000004 Logit Scale: 99.813 - V4
+ 2024-11-27,03:03:06 | INFO | Train Epoch: 0 [ 6144512/10637090 (58%)] Loss: 1.0597 (1.303) Data (t): 0.001 Batch (t): 0.897, 569.976/s LR: 0.000004 Logit Scale: 99.814 - V4
+ 2024-11-27,03:04:35 | INFO | Train Epoch: 0 [ 6195712/10637090 (58%)] Loss: 1.2225 (1.302) Data (t): 0.001 Batch (t): 0.895, 572.179/s LR: 0.000004 Logit Scale: 99.816 - V4
+ 2024-11-27,03:06:05 | INFO | Train Epoch: 0 [ 6246912/10637090 (59%)] Loss: 1.0985 (1.300) Data (t): 0.001 Batch (t): 0.896, 569.476/s LR: 0.000004 Logit Scale: 99.816 - V4
+ 2024-11-27,03:07:37 | INFO | Train Epoch: 0 [ 6298112/10637090 (59%)] Loss: 1.0678 (1.298) Data (t): 0.001 Batch (t): 0.927, 571.894/s LR: 0.000004 Logit Scale: 99.813 - V4
+ 2024-11-27,03:09:11 | INFO | Train Epoch: 0 [ 6349312/10637090 (60%)] Loss: 1.1614 (1.297) Data (t): 0.001 Batch (t): 0.935, 571.609/s LR: 0.000004 Logit Scale: 99.814 - V4
+ 2024-11-27,03:10:40 | INFO | Train Epoch: 0 [ 6400512/10637090 (60%)] Loss: 1.0997 (1.296) Data (t): 0.001 Batch (t): 0.896, 573.898/s LR: 0.000004 Logit Scale: 99.814 - V4
+ 2024-11-27,03:12:10 | INFO | Train Epoch: 0 [ 6451712/10637090 (61%)] Loss: 1.2194 (1.295) Data (t): 0.001 Batch (t): 0.896, 568.655/s LR: 0.000004 Logit Scale: 99.815 - V4
+ 2024-11-27,03:13:40 | INFO | Train Epoch: 0 [ 6502912/10637090 (61%)] Loss: 1.0856 (1.293) Data (t): 0.001 Batch (t): 0.895, 572.735/s LR: 0.000004 Logit Scale: 99.817 - V4
+ 2024-11-27,03:15:11 | INFO | Train Epoch: 0 [ 6554112/10637090 (62%)] Loss: 1.1479 (1.292) Data (t): 0.001 Batch (t): 0.914, 318.806/s LR: 0.000004 Logit Scale: 99.817 - V4
+ 2024-11-27,03:16:43 | INFO | Train Epoch: 0 [ 6605312/10637090 (62%)] Loss: 1.0956 (1.291) Data (t): 0.001 Batch (t): 0.920, 573.559/s LR: 0.000004 Logit Scale: 99.818 - V4
+ 2024-11-27,03:18:15 | INFO | Train Epoch: 0 [ 6656512/10637090 (63%)] Loss: 1.2245 (1.290) Data (t): 0.001 Batch (t): 0.917, 572.504/s LR: 0.000004 Logit Scale: 99.817 - V4
+ 2024-11-27,03:19:44 | INFO | Train Epoch: 0 [ 6707712/10637090 (63%)] Loss: 1.2737 (1.290) Data (t): 0.001 Batch (t): 0.896, 571.761/s LR: 0.000004 Logit Scale: 99.817 - V4
+ 2024-11-27,03:21:14 | INFO | Train Epoch: 0 [ 6758912/10637090 (64%)] Loss: 1.0758 (1.289) Data (t): 0.001 Batch (t): 0.895, 573.015/s LR: 0.000004 Logit Scale: 99.817 - V4
+ 2024-11-27,03:22:43 | INFO | Train Epoch: 0 [ 6810112/10637090 (64%)] Loss: 1.1316 (1.287) Data (t): 0.001 Batch (t): 0.897, 570.041/s LR: 0.000004 Logit Scale: 99.819 - V4
+ 2024-11-27,03:24:17 | INFO | Train Epoch: 0 [ 6861312/10637090 (65%)] Loss: 1.1231 (1.286) Data (t): 0.001 Batch (t): 0.938, 572.175/s LR: 0.000004 Logit Scale: 99.822 - V4
+ 2024-11-27,03:25:49 | INFO | Train Epoch: 0 [ 6912512/10637090 (65%)] Loss: 1.0891 (1.285) Data (t): 0.001 Batch (t): 0.917, 572.771/s LR: 0.000004 Logit Scale: 99.823 - V4
+ 2024-11-27,03:27:18 | INFO | Train Epoch: 0 [ 6963712/10637090 (65%)] Loss: 1.1356 (1.284) Data (t): 0.001 Batch (t): 0.895, 571.532/s LR: 0.000004 Logit Scale: 99.823 - V4
+ 2024-11-27,03:28:48 | INFO | Train Epoch: 0 [ 7014912/10637090 (66%)] Loss: 1.3145 (1.284) Data (t): 0.001 Batch (t): 0.896, 572.996/s LR: 0.000004 Logit Scale: 99.825 - V4
+ 2024-11-27,03:30:17 | INFO | Train Epoch: 0 [ 7066112/10637090 (66%)] Loss: 1.0851 (1.282) Data (t): 0.001 Batch (t): 0.896, 572.070/s LR: 0.000004 Logit Scale: 99.826 - V4
+ 2024-11-27,03:31:49 | INFO | Train Epoch: 0 [ 7117312/10637090 (67%)] Loss: 1.1289 (1.281) Data (t): 0.001 Batch (t): 0.920, 571.412/s LR: 0.000004 Logit Scale: 99.830 - V4
+ 2024-11-27,03:33:22 | INFO | Train Epoch: 0 [ 7168512/10637090 (67%)] Loss: 1.1010 (1.280) Data (t): 0.001 Batch (t): 0.927, 570.866/s LR: 0.000004 Logit Scale: 99.830 - V4
+ 2024-11-27,03:34:52 | INFO | Train Epoch: 0 [ 7219712/10637090 (68%)] Loss: 1.0506 (1.278) Data (t): 0.001 Batch (t): 0.896, 571.527/s LR: 0.000004 Logit Scale: 99.832 - V4
+ 2024-11-27,03:36:21 | INFO | Train Epoch: 0 [ 7270912/10637090 (68%)] Loss: 1.2806 (1.278) Data (t): 0.001 Batch (t): 0.895, 572.979/s LR: 0.000004 Logit Scale: 99.833 - V4
+ 2024-11-27,03:37:51 | INFO | Train Epoch: 0 [ 7322112/10637090 (69%)] Loss: 0.96458 (1.276) Data (t): 0.001 Batch (t): 0.895, 570.327/s LR: 0.000004 Logit Scale: 99.834 - V4
+ 2024-11-27,03:39:24 | INFO | Train Epoch: 0 [ 7373312/10637090 (69%)] Loss: 1.2042 (1.276) Data (t): 0.001 Batch (t): 0.928, 571.858/s LR: 0.000004 Logit Scale: 99.834 - V4
+ 2024-11-27,03:40:55 | INFO | Train Epoch: 0 [ 7424512/10637090 (70%)] Loss: 1.0352 (1.274) Data (t): 0.001 Batch (t): 0.917, 570.552/s LR: 0.000004 Logit Scale: 99.833 - V4
+ 2024-11-27,03:42:26 | INFO | Train Epoch: 0 [ 7475712/10637090 (70%)] Loss: 1.0972 (1.273) Data (t): 0.001 Batch (t): 0.907, 572.454/s LR: 0.000004 Logit Scale: 99.835 - V4
+ 2024-11-27,03:43:55 | INFO | Train Epoch: 0 [ 7526912/10637090 (71%)] Loss: 1.0199 (1.271) Data (t): 0.001 Batch (t): 0.894, 572.540/s LR: 0.000004 Logit Scale: 99.838 - V4
+ 2024-11-27,03:45:25 | INFO | Train Epoch: 0 [ 7578112/10637090 (71%)] Loss: 1.1705 (1.271) Data (t): 0.001 Batch (t): 0.896, 573.302/s LR: 0.000004 Logit Scale: 99.840 - V4
+ 2024-11-27,03:46:57 | INFO | Train Epoch: 0 [ 7629312/10637090 (72%)] Loss: 0.96109 (1.269) Data (t): 0.001 Batch (t): 0.921, 573.336/s LR: 0.000004 Logit Scale: 99.841 - V4
+ 2024-11-27,03:48:29 | INFO | Train Epoch: 0 [ 7680512/10637090 (72%)] Loss: 1.0640 (1.267) Data (t): 0.001 Batch (t): 0.924, 574.690/s LR: 0.000004 Logit Scale: 99.843 - V4
+ 2024-11-27,03:50:00 | INFO | Train Epoch: 0 [ 7731712/10637090 (73%)] Loss: 1.1384 (1.266) Data (t): 0.001 Batch (t): 0.906, 573.956/s LR: 0.000004 Logit Scale: 99.845 - V4
+ 2024-11-27,03:51:29 | INFO | Train Epoch: 0 [ 7782912/10637090 (73%)] Loss: 1.1253 (1.265) Data (t): 0.001 Batch (t): 0.895, 570.350/s LR: 0.000004 Logit Scale: 99.846 - V4
+ 2024-11-27,03:52:59 | INFO | Train Epoch: 0 [ 7834112/10637090 (74%)] Loss: 1.1012 (1.264) Data (t): 0.001 Batch (t): 0.895, 570.958/s LR: 0.000004 Logit Scale: 99.849 - V4
+ 2024-11-27,03:54:30 | INFO | Train Epoch: 0 [ 7885312/10637090 (74%)] Loss: 1.2335 (1.264) Data (t): 0.001 Batch (t): 0.914, 572.382/s LR: 0.000003 Logit Scale: 99.850 - V4
+ 2024-11-27,03:56:03 | INFO | Train Epoch: 0 [ 7936512/10637090 (75%)] Loss: 1.1014 (1.263) Data (t): 0.001 Batch (t): 0.928, 570.918/s LR: 0.000003 Logit Scale: 99.853 - V4
+ 2024-11-27,03:57:35 | INFO | Train Epoch: 0 [ 7987712/10637090 (75%)] Loss: 1.1607 (1.262) Data (t): 0.001 Batch (t): 0.916, 568.421/s LR: 0.000003 Logit Scale: 99.855 - V4
+ 2024-11-27,03:59:04 | INFO | Train Epoch: 0 [ 8038912/10637090 (76%)] Loss: 1.0089 (1.261) Data (t): 0.001 Batch (t): 0.895, 571.523/s LR: 0.000003 Logit Scale: 99.857 - V4
+ 2024-11-27,04:00:34 | INFO | Train Epoch: 0 [ 8090112/10637090 (76%)] Loss: 1.1753 (1.260) Data (t): 0.001 Batch (t): 0.896, 572.515/s LR: 0.000003 Logit Scale: 99.857 - V4
+ 2024-11-27,04:02:04 | INFO | Train Epoch: 0 [ 8141312/10637090 (77%)] Loss: 1.0299 (1.259) Data (t): 0.001 Batch (t): 0.897, 573.351/s LR: 0.000003 Logit Scale: 99.861 - V4
+ 2024-11-27,04:03:38 | INFO | Train Epoch: 0 [ 8192512/10637090 (77%)] Loss: 1.1359 (1.258) Data (t): 0.001 Batch (t): 0.946, 258.268/s LR: 0.000003 Logit Scale: 99.863 - V4
+ 2024-11-27,04:05:10 | INFO | Train Epoch: 0 [ 8243712/10637090 (78%)] Loss: 1.0495 (1.257) Data (t): 0.001 Batch (t): 0.917, 572.484/s LR: 0.000003 Logit Scale: 99.865 - V4
+ 2024-11-27,04:06:40 | INFO | Train Epoch: 0 [ 8294912/10637090 (78%)] Loss: 1.1191 (1.256) Data (t): 0.001 Batch (t): 0.897, 569.789/s LR: 0.000003 Logit Scale: 99.868 - V4
+ 2024-11-27,04:08:09 | INFO | Train Epoch: 0 [ 8346112/10637090 (78%)] Loss: 1.0729 (1.255) Data (t): 0.001 Batch (t): 0.897, 569.753/s LR: 0.000003 Logit Scale: 99.870 - V4
+ 2024-11-27,04:09:39 | INFO | Train Epoch: 0 [ 8397312/10637090 (79%)] Loss: 1.0643 (1.254) Data (t): 0.001 Batch (t): 0.896, 570.698/s LR: 0.000003 Logit Scale: 99.873 - V4
+ 2024-11-27,04:11:12 | INFO | Train Epoch: 0 [ 8448512/10637090 (79%)] Loss: 1.1415 (1.253) Data (t): 0.001 Batch (t): 0.934, 570.096/s LR: 0.000003 Logit Scale: 99.874 - V4
+ 2024-11-27,04:12:44 | INFO | Train Epoch: 0 [ 8499712/10637090 (80%)] Loss: 1.0682 (1.252) Data (t): 0.001 Batch (t): 0.917, 570.128/s LR: 0.000003 Logit Scale: 99.877 - V4
+ 2024-11-27,04:14:15 | INFO | Train Epoch: 0 [ 8550912/10637090 (80%)] Loss: 1.0792 (1.251) Data (t): 0.001 Batch (t): 0.907, 569.003/s LR: 0.000003 Logit Scale: 99.882 - V4
+ 2024-11-27,04:15:44 | INFO | Train Epoch: 0 [ 8602112/10637090 (81%)] Loss: 1.0108 (1.249) Data (t): 0.001 Batch (t): 0.897, 572.395/s LR: 0.000003 Logit Scale: 99.886 - V4
+ 2024-11-27,04:17:14 | INFO | Train Epoch: 0 [ 8653312/10637090 (81%)] Loss: 1.1330 (1.249) Data (t): 0.001 Batch (t): 0.897, 570.980/s LR: 0.000003 Logit Scale: 99.886 - V4
+ 2024-11-27,04:18:48 | INFO | Train Epoch: 0 [ 8704512/10637090 (82%)] Loss: 1.2478 (1.249) Data (t): 0.001 Batch (t): 0.937, 569.938/s LR: 0.000003 Logit Scale: 99.892 - V4
+ 2024-11-27,04:20:20 | INFO | Train Epoch: 0 [ 8755712/10637090 (82%)] Loss: 1.0364 (1.248) Data (t): 0.001 Batch (t): 0.917, 570.768/s LR: 0.000003 Logit Scale: 99.893 - V4
+ 2024-11-27,04:21:50 | INFO | Train Epoch: 0 [ 8806912/10637090 (83%)] Loss: 1.1142 (1.247) Data (t): 0.001 Batch (t): 0.907, 570.615/s LR: 0.000003 Logit Scale: 99.896 - V4
+ 2024-11-27,04:23:20 | INFO | Train Epoch: 0 [ 8858112/10637090 (83%)] Loss: 1.3697 (1.247) Data (t): 0.001 Batch (t): 0.896, 569.593/s LR: 0.000003 Logit Scale: 99.903 - V4
+ 2024-11-27,04:24:49 | INFO | Train Epoch: 0 [ 8909312/10637090 (84%)] Loss: 1.2082 (1.247) Data (t): 0.001 Batch (t): 0.895, 571.698/s LR: 0.000003 Logit Scale: 99.905 - V4
+ 2024-11-27,04:26:22 | INFO | Train Epoch: 0 [ 8960512/10637090 (84%)] Loss: 1.1149 (1.246) Data (t): 0.001 Batch (t): 0.930, 571.555/s LR: 0.000003 Logit Scale: 99.909 - V4
+ 2024-11-27,04:27:54 | INFO | Train Epoch: 0 [ 9011712/10637090 (85%)] Loss: 1.3058 (1.247) Data (t): 0.001 Batch (t): 0.915, 572.179/s LR: 0.000003 Logit Scale: 99.912 - V4
+ 2024-11-27,04:29:26 | INFO | Train Epoch: 0 [ 9062912/10637090 (85%)] Loss: 1.2046 (1.247) Data (t): 0.001 Batch (t): 0.918, 570.659/s LR: 0.000003 Logit Scale: 99.917 - V4
+ 2024-11-27,04:30:55 | INFO | Train Epoch: 0 [ 9114112/10637090 (86%)] Loss: 1.3685 (1.247) Data (t): 0.001 Batch (t): 0.895, 572.701/s LR: 0.000003 Logit Scale: 99.917 - V4
+ 2024-11-27,04:32:25 | INFO | Train Epoch: 0 [ 9165312/10637090 (86%)] Loss: 1.0527 (1.246) Data (t): 0.001 Batch (t): 0.895, 570.638/s LR: 0.000003 Logit Scale: 99.922 - V4
+ 2024-11-27,04:33:56 | INFO | Train Epoch: 0 [ 9216512/10637090 (87%)] Loss: 1.1506 (1.246) Data (t): 0.001 Batch (t): 0.914, 571.971/s LR: 0.000003 Logit Scale: 99.923 - V4
+ 2024-11-27,04:35:29 | INFO | Train Epoch: 0 [ 9267712/10637090 (87%)] Loss: 1.1005 (1.245) Data (t): 0.001 Batch (t): 0.928, 573.551/s LR: 0.000003 Logit Scale: 99.927 - V4
+ 2024-11-27,04:37:00 | INFO | Train Epoch: 0 [ 9318912/10637090 (88%)] Loss: 1.0531 (1.244) Data (t): 0.001 Batch (t): 0.917, 572.713/s LR: 0.000003 Logit Scale: 99.929 - V4
+ 2024-11-27,04:38:30 | INFO | Train Epoch: 0 [ 9370112/10637090 (88%)] Loss: 1.3258 (1.244) Data (t): 0.001 Batch (t): 0.895, 573.920/s LR: 0.000003 Logit Scale: 99.931 - V4
+ 2024-11-27,04:40:00 | INFO | Train Epoch: 0 [ 9421312/10637090 (89%)] Loss: 0.87377 (1.242) Data (t): 0.001 Batch (t): 0.895, 572.551/s LR: 0.000003 Logit Scale: 99.933 - V4
+ 2024-11-27,04:41:29 | INFO | Train Epoch: 0 [ 9472512/10637090 (89%)] Loss: 1.0942 (1.241) Data (t): 0.001 Batch (t): 0.895, 571.491/s LR: 0.000003 Logit Scale: 99.938 - V4
+ 2024-11-27,04:43:03 | INFO | Train Epoch: 0 [ 9523712/10637090 (90%)] Loss: 1.1874 (1.241) Data (t): 0.001 Batch (t): 0.936, 571.812/s LR: 0.000003 Logit Scale: 99.944 - V4
+ 2024-11-27,04:44:34 | INFO | Train Epoch: 0 [ 9574912/10637090 (90%)] Loss: 1.1915 (1.241) Data (t): 0.001 Batch (t): 0.917, 569.109/s LR: 0.000003 Logit Scale: 99.944 - V4
+ 2024-11-27,04:46:05 | INFO | Train Epoch: 0 [ 9626112/10637090 (90%)] Loss: 1.1215 (1.240) Data (t): 0.001 Batch (t): 0.905, 572.676/s LR: 0.000003 Logit Scale: 99.947 - V4
+ 2024-11-27,04:47:34 | INFO | Train Epoch: 0 [ 9677312/10637090 (91%)] Loss: 1.1917 (1.240) Data (t): 0.001 Batch (t): 0.895, 575.117/s LR: 0.000003 Logit Scale: 99.952 - V4
+ 2024-11-27,04:49:04 | INFO | Train Epoch: 0 [ 9728512/10637090 (91%)] Loss: 1.0777 (1.239) Data (t): 0.001 Batch (t): 0.895, 571.019/s LR: 0.000003 Logit Scale: 99.953 - V4
+ 2024-11-27,04:50:37 | INFO | Train Epoch: 0 [ 9779712/10637090 (92%)] Loss: 1.0869 (1.238) Data (t): 0.001 Batch (t): 0.935, 574.195/s LR: 0.000003 Logit Scale: 99.956 - V4
+ 2024-11-27,04:52:09 | INFO | Train Epoch: 0 [ 9830912/10637090 (92%)] Loss: 1.1759 (1.238) Data (t): 0.001 Batch (t): 0.916, 571.018/s LR: 0.000003 Logit Scale: 99.959 - V4
+ 2024-11-27,04:53:39 | INFO | Train Epoch: 0 [ 9882112/10637090 (93%)] Loss: 1.1170 (1.237) Data (t): 0.001 Batch (t): 0.904, 571.626/s LR: 0.000003 Logit Scale: 99.961 - V4
+ 2024-11-27,04:55:09 | INFO | Train Epoch: 0 [ 9933312/10637090 (93%)] Loss: 1.1807 (1.237) Data (t): 0.001 Batch (t): 0.895, 570.554/s LR: 0.000003 Logit Scale: 99.964 - V4
+ 2024-11-27,04:56:38 | INFO | Train Epoch: 0 [ 9984512/10637090 (94%)] Loss: 1.1203 (1.237) Data (t): 0.001 Batch (t): 0.894, 575.774/s LR: 0.000003 Logit Scale: 99.969 - V4
+ 2024-11-27,04:58:12 | INFO | Train Epoch: 0 [10035712/10637090 (94%)] Loss: 1.1971 (1.236) Data (t): 0.001 Batch (t): 0.936, 572.666/s LR: 0.000003 Logit Scale: 99.976 - V4
+ 2024-11-27,04:59:43 | INFO | Train Epoch: 0 [10086912/10637090 (95%)] Loss: 1.0003 (1.235) Data (t): 0.001 Batch (t): 0.907, 573.248/s LR: 0.000003 Logit Scale: 99.980 - V4
+ 2024-11-27,05:01:14 | INFO | Train Epoch: 0 [10138112/10637090 (95%)] Loss: 1.2638 (1.235) Data (t): 0.001 Batch (t): 0.917, 571.226/s LR: 0.000003 Logit Scale: 99.984 - V4
+ 2024-11-27,05:02:44 | INFO | Train Epoch: 0 [10189312/10637090 (96%)] Loss: 1.1377 (1.235) Data (t): 0.001 Batch (t): 0.896, 573.971/s LR: 0.000003 Logit Scale: 99.986 - V4
+ 2024-11-27,05:04:13 | INFO | Train Epoch: 0 [10240512/10637090 (96%)] Loss: 1.1627 (1.234) Data (t): 0.001 Batch (t): 0.895, 570.690/s LR: 0.000003 Logit Scale: 99.990 - V4
+ 2024-11-27,05:05:46 | INFO | Train Epoch: 0 [10291712/10637090 (97%)] Loss: 1.1491 (1.234) Data (t): 0.001 Batch (t): 0.922, 569.910/s LR: 0.000003 Logit Scale: 99.994 - V4
+ 2024-11-27,05:07:17 | INFO | Train Epoch: 0 [10342912/10637090 (97%)] Loss: 1.0414 (1.233) Data (t): 0.001 Batch (t): 0.910, 573.756/s LR: 0.000003 Logit Scale: 99.998 - V4
+ 2024-11-27,05:08:50 | INFO | Train Epoch: 0 [10394112/10637090 (98%)] Loss: 1.1568 (1.233) Data (t): 0.001 Batch (t): 0.929, 570.643/s LR: 0.000003 Logit Scale: 100.000 - V4
+ 2024-11-27,05:10:19 | INFO | Train Epoch: 0 [10445312/10637090 (98%)] Loss: 1.2443 (1.233) Data (t): 0.001 Batch (t): 0.895, 570.889/s LR: 0.000003 Logit Scale: 100.000 - V4
+ 2024-11-27,05:11:49 | INFO | Train Epoch: 0 [10496512/10637090 (99%)] Loss: 0.99356 (1.232) Data (t): 0.001 Batch (t): 0.896, 570.394/s LR: 0.000003 Logit Scale: 100.000 - V4
+ 2024-11-27,05:13:20 | INFO | Train Epoch: 0 [10547712/10637090 (99%)] Loss: 1.2998 (1.232) Data (t): 0.001 Batch (t): 0.916, 253.427/s LR: 0.000003 Logit Scale: 100.000 - V4
+ 2024-11-27,05:14:52 | INFO | Train Epoch: 0 [10598912/10637090 (100%)] Loss: 1.0761 (1.231) Data (t): 0.001 Batch (t): 0.919, 571.106/s LR: 0.000003 Logit Scale: 100.000 - V4
+ 2024-11-27,05:16:01 | INFO | Train Epoch: 0 [10636800/10637090 (100%)] Loss: 1.0335 (1.230) Data (t): 0.002 Batch (t): 0.926, 574.966/s LR: 0.000003 Logit Scale: 100.000 - V4
+ 2024-11-27,05:16:08 | INFO | Start epoch 1
+ 2024-11-27,05:16:12 | INFO | Train Epoch: 1 [ 512/10637090 (0%)] Loss: 1.0128 (1.013) Data (t): 2.981 Batch (t): 3.918, 130.688/s LR: 0.000003 Logit Scale: 100.000 - V4
+ 2024-11-27,05:17:43 | INFO | Train Epoch: 1 [ 51712/10637090 (0%)] Loss: 1.2033 (1.108) Data (t): 0.001 Batch (t): 0.909, 571.364/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:19:13 | INFO | Train Epoch: 1 [ 102912/10637090 (1%)] Loss: 1.2030 (1.140) Data (t): 0.001 Batch (t): 0.898, 572.273/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:20:42 | INFO | Train Epoch: 1 [ 154112/10637090 (1%)] Loss: 1.1652 (1.146) Data (t): 0.001 Batch (t): 0.898, 572.128/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:22:16 | INFO | Train Epoch: 1 [ 205312/10637090 (2%)] Loss: 1.0050 (1.118) Data (t): 0.001 Batch (t): 0.940, 573.144/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:23:48 | INFO | Train Epoch: 1 [ 256512/10637090 (2%)] Loss: 0.99667 (1.098) Data (t): 0.001 Batch (t): 0.918, 574.014/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:25:19 | INFO | Train Epoch: 1 [ 307712/10637090 (3%)] Loss: 1.0535 (1.091) Data (t): 0.001 Batch (t): 0.906, 568.583/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:26:49 | INFO | Train Epoch: 1 [ 358912/10637090 (3%)] Loss: 1.0576 (1.087) Data (t): 0.001 Batch (t): 0.897, 571.214/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:28:18 | INFO | Train Epoch: 1 [ 410112/10637090 (4%)] Loss: 1.0211 (1.080) Data (t): 0.001 Batch (t): 0.897, 569.649/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:29:51 | INFO | Train Epoch: 1 [ 461312/10637090 (4%)] Loss: 1.1738 (1.089) Data (t): 0.001 Batch (t): 0.926, 572.019/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:31:22 | INFO | Train Epoch: 1 [ 512512/10637090 (5%)] Loss: 1.0125 (1.082) Data (t): 0.001 Batch (t): 0.913, 571.222/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:32:54 | INFO | Train Epoch: 1 [ 563712/10637090 (5%)] Loss: 1.0086 (1.076) Data (t): 0.001 Batch (t): 0.916, 570.775/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:34:24 | INFO | Train Epoch: 1 [ 614912/10637090 (6%)] Loss: 1.1774 (1.084) Data (t): 0.001 Batch (t): 0.898, 568.716/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:35:53 | INFO | Train Epoch: 1 [ 666112/10637090 (6%)] Loss: 0.98214 (1.077) Data (t): 0.001 Batch (t): 0.898, 571.983/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:37:25 | INFO | Train Epoch: 1 [ 717312/10637090 (7%)] Loss: 1.0125 (1.072) Data (t): 0.001 Batch (t): 0.913, 574.215/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:38:57 | INFO | Train Epoch: 1 [ 768512/10637090 (7%)] Loss: 0.96567 (1.066) Data (t): 0.001 Batch (t): 0.926, 568.692/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:40:29 | INFO | Train Epoch: 1 [ 819712/10637090 (8%)] Loss: 1.1290 (1.069) Data (t): 0.001 Batch (t): 0.917, 567.079/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:41:59 | INFO | Train Epoch: 1 [ 870912/10637090 (8%)] Loss: 0.95952 (1.063) Data (t): 0.001 Batch (t): 0.899, 565.774/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:43:29 | INFO | Train Epoch: 1 [ 922112/10637090 (9%)] Loss: 1.2495 (1.073) Data (t): 0.001 Batch (t): 0.897, 567.043/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:44:59 | INFO | Train Epoch: 1 [ 973312/10637090 (9%)] Loss: 1.0532 (1.072) Data (t): 0.001 Batch (t): 0.904, 571.326/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:46:33 | INFO | Train Epoch: 1 [ 1024512/10637090 (10%)] Loss: 0.96439 (1.067) Data (t): 0.001 Batch (t): 0.936, 566.503/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:48:03 | INFO | Train Epoch: 1 [ 1075712/10637090 (10%)] Loss: 1.0617 (1.067) Data (t): 0.001 Batch (t): 0.906, 571.041/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:49:34 | INFO | Train Epoch: 1 [ 1126912/10637090 (11%)] Loss: 0.95454 (1.062) Data (t): 0.001 Batch (t): 0.906, 571.611/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:51:03 | INFO | Train Epoch: 1 [ 1178112/10637090 (11%)] Loss: 1.0707 (1.062) Data (t): 0.001 Batch (t): 0.896, 569.365/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:52:33 | INFO | Train Epoch: 1 [ 1229312/10637090 (12%)] Loss: 1.0439 (1.061) Data (t): 0.001 Batch (t): 0.897, 572.121/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:54:06 | INFO | Train Epoch: 1 [ 1280512/10637090 (12%)] Loss: 1.1100 (1.063) Data (t): 0.001 Batch (t): 0.933, 574.313/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:55:38 | INFO | Train Epoch: 1 [ 1331712/10637090 (13%)] Loss: 1.0108 (1.061) Data (t): 0.001 Batch (t): 0.915, 571.737/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:57:08 | INFO | Train Epoch: 1 [ 1382912/10637090 (13%)] Loss: 1.1779 (1.066) Data (t): 0.001 Batch (t): 0.905, 570.526/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,05:58:38 | INFO | Train Epoch: 1 [ 1434112/10637090 (13%)] Loss: 0.96992 (1.062) Data (t): 0.001 Batch (t): 0.894, 572.490/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:00:07 | INFO | Train Epoch: 1 [ 1485312/10637090 (14%)] Loss: 1.0453 (1.062) Data (t): 0.001 Batch (t): 0.895, 572.927/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:01:40 | INFO | Train Epoch: 1 [ 1536512/10637090 (14%)] Loss: 1.0654 (1.062) Data (t): 0.001 Batch (t): 0.924, 570.176/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:03:12 | INFO | Train Epoch: 1 [ 1587712/10637090 (15%)] Loss: 1.0905 (1.063) Data (t): 0.001 Batch (t): 0.922, 571.839/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:04:42 | INFO | Train Epoch: 1 [ 1638912/10637090 (15%)] Loss: 1.1550 (1.066) Data (t): 0.001 Batch (t): 0.905, 570.747/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:06:12 | INFO | Train Epoch: 1 [ 1690112/10637090 (16%)] Loss: 1.0767 (1.066) Data (t): 0.001 Batch (t): 0.896, 572.266/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:07:42 | INFO | Train Epoch: 1 [ 1741312/10637090 (16%)] Loss: 1.1005 (1.067) Data (t): 0.001 Batch (t): 0.895, 568.226/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:09:14 | INFO | Train Epoch: 1 [ 1792512/10637090 (17%)] Loss: 1.1870 (1.070) Data (t): 0.001 Batch (t): 0.925, 570.328/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:10:45 | INFO | Train Epoch: 1 [ 1843712/10637090 (17%)] Loss: 1.0125 (1.069) Data (t): 0.001 Batch (t): 0.912, 571.452/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:12:17 | INFO | Train Epoch: 1 [ 1894912/10637090 (18%)] Loss: 0.95886 (1.066) Data (t): 0.001 Batch (t): 0.916, 573.938/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:13:46 | INFO | Train Epoch: 1 [ 1946112/10637090 (18%)] Loss: 1.1727 (1.068) Data (t): 0.001 Batch (t): 0.896, 567.277/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:15:16 | INFO | Train Epoch: 1 [ 1997312/10637090 (19%)] Loss: 1.0444 (1.068) Data (t): 0.001 Batch (t): 0.896, 572.636/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:16:47 | INFO | Train Epoch: 1 [ 2048512/10637090 (19%)] Loss: 1.0698 (1.068) Data (t): 0.001 Batch (t): 0.912, 573.693/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:18:20 | INFO | Train Epoch: 1 [ 2099712/10637090 (20%)] Loss: 0.84103 (1.063) Data (t): 0.001 Batch (t): 0.926, 572.319/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:19:51 | INFO | Train Epoch: 1 [ 2150912/10637090 (20%)] Loss: 0.98125 (1.061) Data (t): 0.001 Batch (t): 0.914, 573.110/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:21:21 | INFO | Train Epoch: 1 [ 2202112/10637090 (21%)] Loss: 0.98067 (1.059) Data (t): 0.001 Batch (t): 0.895, 571.854/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:22:50 | INFO | Train Epoch: 1 [ 2253312/10637090 (21%)] Loss: 0.97608 (1.057) Data (t): 0.001 Batch (t): 0.895, 571.749/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:24:20 | INFO | Train Epoch: 1 [ 2304512/10637090 (22%)] Loss: 1.1066 (1.058) Data (t): 0.001 Batch (t): 0.902, 573.417/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:25:53 | INFO | Train Epoch: 1 [ 2355712/10637090 (22%)] Loss: 1.0162 (1.057) Data (t): 0.001 Batch (t): 0.924, 572.695/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:27:24 | INFO | Train Epoch: 1 [ 2406912/10637090 (23%)] Loss: 1.0336 (1.057) Data (t): 0.001 Batch (t): 0.914, 573.991/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:28:55 | INFO | Train Epoch: 1 [ 2458112/10637090 (23%)] Loss: 1.2035 (1.060) Data (t): 0.001 Batch (t): 0.904, 573.854/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:30:24 | INFO | Train Epoch: 1 [ 2509312/10637090 (24%)] Loss: 1.0746 (1.060) Data (t): 0.001 Batch (t): 0.894, 569.583/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:31:54 | INFO | Train Epoch: 1 [ 2560512/10637090 (24%)] Loss: 1.1349 (1.061) Data (t): 0.001 Batch (t): 0.895, 573.191/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:33:27 | INFO | Train Epoch: 1 [ 2611712/10637090 (25%)] Loss: 1.0716 (1.062) Data (t): 0.001 Batch (t): 0.931, 571.037/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:34:58 | INFO | Train Epoch: 1 [ 2662912/10637090 (25%)] Loss: 1.0197 (1.061) Data (t): 0.001 Batch (t): 0.914, 573.285/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:36:29 | INFO | Train Epoch: 1 [ 2714112/10637090 (26%)] Loss: 1.0800 (1.061) Data (t): 0.001 Batch (t): 0.905, 570.613/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:37:58 | INFO | Train Epoch: 1 [ 2765312/10637090 (26%)] Loss: 1.0601 (1.061) Data (t): 0.001 Batch (t): 0.896, 572.227/s LR: 0.000002 Logit Scale: 100.000 - V4
+ 2024-11-27,06:39:28 | INFO | Train Epoch: 1 [ 2816512/10637090 (26%)] Loss: 1.0374 (1.061) Data (t): 0.001 Batch (t): 0.896, 574.116/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,06:41:00 | INFO | Train Epoch: 1 [ 2867712/10637090 (27%)] Loss: 1.0835 (1.061) Data (t): 0.001 Batch (t): 0.925, 573.652/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,06:42:32 | INFO | Train Epoch: 1 [ 2918912/10637090 (27%)] Loss: 1.0796 (1.061) Data (t): 0.001 Batch (t): 0.921, 279.225/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,06:44:03 | INFO | Train Epoch: 1 [ 2970112/10637090 (28%)] Loss: 1.0603 (1.061) Data (t): 0.001 Batch (t): 0.905, 570.632/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,06:45:32 | INFO | Train Epoch: 1 [ 3021312/10637090 (28%)] Loss: 1.1166 (1.062) Data (t): 0.001 Batch (t): 0.896, 571.055/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,06:47:02 | INFO | Train Epoch: 1 [ 3072512/10637090 (29%)] Loss: 0.90431 (1.060) Data (t): 0.001 Batch (t): 0.896, 567.176/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,06:48:33 | INFO | Train Epoch: 1 [ 3123712/10637090 (29%)] Loss: 1.0275 (1.059) Data (t): 0.001 Batch (t): 0.912, 570.255/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,06:50:06 | INFO | Train Epoch: 1 [ 3174912/10637090 (30%)] Loss: 1.1707 (1.061) Data (t): 0.001 Batch (t): 0.925, 569.632/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,06:51:37 | INFO | Train Epoch: 1 [ 3226112/10637090 (30%)] Loss: 1.1179 (1.062) Data (t): 0.001 Batch (t): 0.915, 569.527/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,06:53:07 | INFO | Train Epoch: 1 [ 3277312/10637090 (31%)] Loss: 1.0479 (1.062) Data (t): 0.001 Batch (t): 0.895, 568.036/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,06:54:36 | INFO | Train Epoch: 1 [ 3328512/10637090 (31%)] Loss: 1.1544 (1.063) Data (t): 0.001 Batch (t): 0.894, 571.927/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,06:56:07 | INFO | Train Epoch: 1 [ 3379712/10637090 (32%)] Loss: 1.0102 (1.062) Data (t): 0.001 Batch (t): 0.913, 571.492/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,06:57:39 | INFO | Train Epoch: 1 [ 3430912/10637090 (32%)] Loss: 1.0633 (1.062) Data (t): 0.001 Batch (t): 0.916, 572.763/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,06:59:11 | INFO | Train Epoch: 1 [ 3482112/10637090 (33%)] Loss: 0.98734 (1.061) Data (t): 0.001 Batch (t): 0.925, 278.302/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:00:41 | INFO | Train Epoch: 1 [ 3533312/10637090 (33%)] Loss: 1.1443 (1.062) Data (t): 0.001 Batch (t): 0.896, 569.593/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:02:11 | INFO | Train Epoch: 1 [ 3584512/10637090 (34%)] Loss: 1.0147 (1.062) Data (t): 0.001 Batch (t): 0.896, 574.240/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:03:40 | INFO | Train Epoch: 1 [ 3635712/10637090 (34%)] Loss: 1.1478 (1.063) Data (t): 0.001 Batch (t): 0.897, 571.050/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:05:14 | INFO | Train Epoch: 1 [ 3686912/10637090 (35%)] Loss: 0.99775 (1.062) Data (t): 0.001 Batch (t): 0.933, 569.403/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:06:45 | INFO | Train Epoch: 1 [ 3738112/10637090 (35%)] Loss: 0.89432 (1.060) Data (t): 0.001 Batch (t): 0.915, 571.482/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:08:16 | INFO | Train Epoch: 1 [ 3789312/10637090 (36%)] Loss: 1.0577 (1.060) Data (t): 0.001 Batch (t): 0.905, 573.403/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:09:45 | INFO | Train Epoch: 1 [ 3840512/10637090 (36%)] Loss: 1.0405 (1.060) Data (t): 0.001 Batch (t): 0.896, 571.636/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:11:15 | INFO | Train Epoch: 1 [ 3891712/10637090 (37%)] Loss: 1.0512 (1.059) Data (t): 0.001 Batch (t): 0.895, 573.732/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:12:48 | INFO | Train Epoch: 1 [ 3942912/10637090 (37%)] Loss: 0.98693 (1.058) Data (t): 0.001 Batch (t): 0.931, 568.375/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:14:20 | INFO | Train Epoch: 1 [ 3994112/10637090 (38%)] Loss: 1.0355 (1.058) Data (t): 0.001 Batch (t): 0.921, 566.626/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:15:50 | INFO | Train Epoch: 1 [ 4045312/10637090 (38%)] Loss: 1.2874 (1.061) Data (t): 0.001 Batch (t): 0.905, 572.765/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:17:20 | INFO | Train Epoch: 1 [ 4096512/10637090 (39%)] Loss: 1.0966 (1.061) Data (t): 0.001 Batch (t): 0.897, 572.618/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:18:50 | INFO | Train Epoch: 1 [ 4147712/10637090 (39%)] Loss: 1.0360 (1.061) Data (t): 0.001 Batch (t): 0.895, 569.975/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:20:22 | INFO | Train Epoch: 1 [ 4198912/10637090 (39%)] Loss: 1.1183 (1.062) Data (t): 0.001 Batch (t): 0.919, 569.986/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:21:53 | INFO | Train Epoch: 1 [ 4250112/10637090 (40%)] Loss: 1.0804 (1.062) Data (t): 0.001 Batch (t): 0.918, 574.977/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:23:25 | INFO | Train Epoch: 1 [ 4301312/10637090 (40%)] Loss: 1.0074 (1.061) Data (t): 0.001 Batch (t): 0.914, 571.915/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:24:54 | INFO | Train Epoch: 1 [ 4352512/10637090 (41%)] Loss: 1.0147 (1.061) Data (t): 0.001 Batch (t): 0.895, 570.253/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:26:24 | INFO | Train Epoch: 1 [ 4403712/10637090 (41%)] Loss: 0.98719 (1.060) Data (t): 0.001 Batch (t): 0.895, 572.913/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:27:55 | INFO | Train Epoch: 1 [ 4454912/10637090 (42%)] Loss: 1.0311 (1.060) Data (t): 0.001 Batch (t): 0.911, 570.976/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:29:27 | INFO | Train Epoch: 1 [ 4506112/10637090 (42%)] Loss: 1.0330 (1.059) Data (t): 0.001 Batch (t): 0.919, 572.207/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:30:59 | INFO | Train Epoch: 1 [ 4557312/10637090 (43%)] Loss: 1.0003 (1.059) Data (t): 0.001 Batch (t): 0.924, 572.237/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:32:29 | INFO | Train Epoch: 1 [ 4608512/10637090 (43%)] Loss: 1.0510 (1.059) Data (t): 0.001 Batch (t): 0.894, 572.512/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:33:58 | INFO | Train Epoch: 1 [ 4659712/10637090 (44%)] Loss: 1.0159 (1.058) Data (t): 0.001 Batch (t): 0.895, 574.088/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:35:29 | INFO | Train Epoch: 1 [ 4710912/10637090 (44%)] Loss: 1.1518 (1.059) Data (t): 0.001 Batch (t): 0.911, 573.055/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:37:01 | INFO | Train Epoch: 1 [ 4762112/10637090 (45%)] Loss: 1.0437 (1.059) Data (t): 0.001 Batch (t): 0.917, 572.585/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:38:32 | INFO | Train Epoch: 1 [ 4813312/10637090 (45%)] Loss: 0.98672 (1.058) Data (t): 0.001 Batch (t): 0.913, 570.547/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:40:03 | INFO | Train Epoch: 1 [ 4864512/10637090 (46%)] Loss: 1.0978 (1.059) Data (t): 0.001 Batch (t): 0.904, 572.213/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:41:32 | INFO | Train Epoch: 1 [ 4915712/10637090 (46%)] Loss: 0.89812 (1.057) Data (t): 0.001 Batch (t): 0.894, 571.949/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:43:02 | INFO | Train Epoch: 1 [ 4966912/10637090 (47%)] Loss: 1.0941 (1.057) Data (t): 0.001 Batch (t): 0.895, 571.482/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:44:35 | INFO | Train Epoch: 1 [ 5018112/10637090 (47%)] Loss: 1.0506 (1.057) Data (t): 0.001 Batch (t): 0.935, 569.850/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:46:06 | INFO | Train Epoch: 1 [ 5069312/10637090 (48%)] Loss: 0.99763 (1.057) Data (t): 0.001 Batch (t): 0.914, 575.887/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:47:37 | INFO | Train Epoch: 1 [ 5120512/10637090 (48%)] Loss: 0.87316 (1.055) Data (t): 0.001 Batch (t): 0.904, 568.556/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:49:07 | INFO | Train Epoch: 1 [ 5171712/10637090 (49%)] Loss: 1.1561 (1.056) Data (t): 0.001 Batch (t): 0.896, 572.086/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:50:36 | INFO | Train Epoch: 1 [ 5222912/10637090 (49%)] Loss: 1.0303 (1.056) Data (t): 0.001 Batch (t): 0.896, 573.661/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:52:08 | INFO | Train Epoch: 1 [ 5274112/10637090 (50%)] Loss: 1.0331 (1.055) Data (t): 0.001 Batch (t): 0.919, 572.625/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:53:40 | INFO | Train Epoch: 1 [ 5325312/10637090 (50%)] Loss: 0.96590 (1.055) Data (t): 0.001 Batch (t): 0.923, 270.099/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:55:12 | INFO | Train Epoch: 1 [ 5376512/10637090 (51%)] Loss: 1.1712 (1.056) Data (t): 0.001 Batch (t): 0.915, 574.077/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:56:41 | INFO | Train Epoch: 1 [ 5427712/10637090 (51%)] Loss: 1.0271 (1.055) Data (t): 0.001 Batch (t): 0.895, 570.777/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:58:11 | INFO | Train Epoch: 1 [ 5478912/10637090 (52%)] Loss: 1.1131 (1.056) Data (t): 0.001 Batch (t): 0.896, 570.787/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,07:59:43 | INFO | Train Epoch: 1 [ 5530112/10637090 (52%)] Loss: 1.0604 (1.056) Data (t): 0.001 Batch (t): 0.920, 570.413/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,08:01:14 | INFO | Train Epoch: 1 [ 5581312/10637090 (52%)] Loss: 1.0678 (1.056) Data (t): 0.001 Batch (t): 0.914, 572.048/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,08:02:47 | INFO | Train Epoch: 1 [ 5632512/10637090 (53%)] Loss: 1.1293 (1.057) Data (t): 0.001 Batch (t): 0.924, 572.418/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,08:04:16 | INFO | Train Epoch: 1 [ 5683712/10637090 (53%)] Loss: 1.1303 (1.057) Data (t): 0.001 Batch (t): 0.896, 570.208/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,08:05:46 | INFO | Train Epoch: 1 [ 5734912/10637090 (54%)] Loss: 0.95027 (1.056) Data (t): 0.001 Batch (t): 0.896, 568.880/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,08:07:17 | INFO | Train Epoch: 1 [ 5786112/10637090 (54%)] Loss: 0.92660 (1.055) Data (t): 0.001 Batch (t): 0.913, 571.644/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,08:08:49 | INFO | Train Epoch: 1 [ 5837312/10637090 (55%)] Loss: 1.0374 (1.055) Data (t): 0.001 Batch (t): 0.920, 571.733/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,08:10:22 | INFO | Train Epoch: 1 [ 5888512/10637090 (55%)] Loss: 0.95528 (1.054) Data (t): 0.001 Batch (t): 0.927, 571.842/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,08:11:51 | INFO | Train Epoch: 1 [ 5939712/10637090 (56%)] Loss: 1.0986 (1.055) Data (t): 0.001 Batch (t): 0.897, 571.042/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,08:13:21 | INFO | Train Epoch: 1 [ 5990912/10637090 (56%)] Loss: 1.1118 (1.055) Data (t): 0.001 Batch (t): 0.896, 572.127/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,08:14:52 | INFO | Train Epoch: 1 [ 6042112/10637090 (57%)] Loss: 1.0610 (1.055) Data (t): 0.001 Batch (t): 0.912, 571.703/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,08:16:24 | INFO | Train Epoch: 1 [ 6093312/10637090 (57%)] Loss: 1.0524 (1.055) Data (t): 0.001 Batch (t): 0.921, 574.197/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,08:17:56 | INFO | Train Epoch: 1 [ 6144512/10637090 (58%)] Loss: 1.0528 (1.055) Data (t): 0.001 Batch (t): 0.915, 571.926/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,08:19:27 | INFO | Train Epoch: 1 [ 6195712/10637090 (58%)] Loss: 1.0440 (1.055) Data (t): 0.001 Batch (t): 0.906, 570.911/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,08:20:56 | INFO | Train Epoch: 1 [ 6246912/10637090 (59%)] Loss: 1.1593 (1.056) Data (t): 0.001 Batch (t): 0.895, 568.834/s LR: 0.000001 Logit Scale: 100.000 - V4
+ 2024-11-27,08:22:26 | INFO | Train Epoch: 1 [ 6298112/10637090 (59%)] Loss: 0.92146 (1.055) Data (t): 0.001 Batch (t): 0.895, 568.399/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:23:58 | INFO | Train Epoch: 1 [ 6349312/10637090 (60%)] Loss: 0.93241 (1.054) Data (t): 0.001 Batch (t): 0.920, 572.968/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:25:30 | INFO | Train Epoch: 1 [ 6400512/10637090 (60%)] Loss: 0.99950 (1.053) Data (t): 0.001 Batch (t): 0.922, 571.263/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:27:01 | INFO | Train Epoch: 1 [ 6451712/10637090 (61%)] Loss: 0.87824 (1.052) Data (t): 0.001 Batch (t): 0.915, 572.957/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:28:31 | INFO | Train Epoch: 1 [ 6502912/10637090 (61%)] Loss: 1.0529 (1.052) Data (t): 0.001 Batch (t): 0.895, 572.566/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:30:00 | INFO | Train Epoch: 1 [ 6554112/10637090 (62%)] Loss: 0.84983 (1.051) Data (t): 0.001 Batch (t): 0.895, 572.458/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:31:32 | INFO | Train Epoch: 1 [ 6605312/10637090 (62%)] Loss: 1.0097 (1.050) Data (t): 0.001 Batch (t): 0.920, 572.206/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:33:04 | INFO | Train Epoch: 1 [ 6656512/10637090 (63%)] Loss: 1.0204 (1.050) Data (t): 0.001 Batch (t): 0.912, 572.053/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:34:36 | INFO | Train Epoch: 1 [ 6707712/10637090 (63%)] Loss: 1.1329 (1.051) Data (t): 0.001 Batch (t): 0.924, 571.449/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:36:05 | INFO | Train Epoch: 1 [ 6758912/10637090 (64%)] Loss: 1.1183 (1.051) Data (t): 0.001 Batch (t): 0.894, 573.877/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:37:35 | INFO | Train Epoch: 1 [ 6810112/10637090 (64%)] Loss: 1.0636 (1.051) Data (t): 0.001 Batch (t): 0.894, 571.467/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:39:07 | INFO | Train Epoch: 1 [ 6861312/10637090 (65%)] Loss: 1.0463 (1.051) Data (t): 0.001 Batch (t): 0.918, 573.057/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:40:38 | INFO | Train Epoch: 1 [ 6912512/10637090 (65%)] Loss: 1.0026 (1.051) Data (t): 0.001 Batch (t): 0.911, 577.593/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:42:10 | INFO | Train Epoch: 1 [ 6963712/10637090 (65%)] Loss: 1.0808 (1.051) Data (t): 0.000 Batch (t): 0.923, 571.991/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:43:40 | INFO | Train Epoch: 1 [ 7014912/10637090 (66%)] Loss: 0.97769 (1.051) Data (t): 0.001 Batch (t): 0.895, 570.742/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:45:09 | INFO | Train Epoch: 1 [ 7066112/10637090 (66%)] Loss: 1.0644 (1.051) Data (t): 0.001 Batch (t): 0.895, 571.450/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:46:40 | INFO | Train Epoch: 1 [ 7117312/10637090 (67%)] Loss: 1.0796 (1.051) Data (t): 0.001 Batch (t): 0.912, 570.051/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:48:12 | INFO | Train Epoch: 1 [ 7168512/10637090 (67%)] Loss: 0.99263 (1.050) Data (t): 0.001 Batch (t): 0.918, 564.585/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:49:43 | INFO | Train Epoch: 1 [ 7219712/10637090 (68%)] Loss: 0.97418 (1.050) Data (t): 0.001 Batch (t): 0.913, 574.592/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:51:13 | INFO | Train Epoch: 1 [ 7270912/10637090 (68%)] Loss: 0.97256 (1.049) Data (t): 0.001 Batch (t): 0.901, 574.151/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:52:43 | INFO | Train Epoch: 1 [ 7322112/10637090 (69%)] Loss: 1.0953 (1.050) Data (t): 0.001 Batch (t): 0.892, 574.728/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:54:14 | INFO | Train Epoch: 1 [ 7373312/10637090 (69%)] Loss: 1.0402 (1.050) Data (t): 0.001 Batch (t): 0.910, 574.869/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:55:44 | INFO | Train Epoch: 1 [ 7424512/10637090 (70%)] Loss: 1.0422 (1.050) Data (t): 0.001 Batch (t): 0.907, 574.034/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:57:16 | INFO | Train Epoch: 1 [ 7475712/10637090 (70%)] Loss: 1.1315 (1.050) Data (t): 0.001 Batch (t): 0.912, 575.577/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,08:58:47 | INFO | Train Epoch: 1 [ 7526912/10637090 (71%)] Loss: 1.0247 (1.050) Data (t): 0.001 Batch (t): 0.913, 574.663/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:00:16 | INFO | Train Epoch: 1 [ 7578112/10637090 (71%)] Loss: 0.96505 (1.049) Data (t): 0.001 Batch (t): 0.893, 572.353/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:01:46 | INFO | Train Epoch: 1 [ 7629312/10637090 (72%)] Loss: 0.89964 (1.048) Data (t): 0.001 Batch (t): 0.893, 573.886/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:03:17 | INFO | Train Epoch: 1 [ 7680512/10637090 (72%)] Loss: 0.94812 (1.048) Data (t): 0.001 Batch (t): 0.919, 572.963/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:04:49 | INFO | Train Epoch: 1 [ 7731712/10637090 (73%)] Loss: 1.0112 (1.047) Data (t): 0.001 Batch (t): 0.912, 573.004/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:06:21 | INFO | Train Epoch: 1 [ 7782912/10637090 (73%)] Loss: 0.96896 (1.047) Data (t): 0.001 Batch (t): 0.923, 573.496/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:07:50 | INFO | Train Epoch: 1 [ 7834112/10637090 (74%)] Loss: 1.1535 (1.048) Data (t): 0.001 Batch (t): 0.893, 571.504/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:09:20 | INFO | Train Epoch: 1 [ 7885312/10637090 (74%)] Loss: 1.1186 (1.048) Data (t): 0.001 Batch (t): 0.892, 574.372/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:10:51 | INFO | Train Epoch: 1 [ 7936512/10637090 (75%)] Loss: 1.0656 (1.048) Data (t): 0.001 Batch (t): 0.917, 571.524/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:12:22 | INFO | Train Epoch: 1 [ 7987712/10637090 (75%)] Loss: 0.94982 (1.048) Data (t): 0.001 Batch (t): 0.910, 574.255/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:13:54 | INFO | Train Epoch: 1 [ 8038912/10637090 (76%)] Loss: 1.0177 (1.047) Data (t): 0.001 Batch (t): 0.922, 574.910/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:15:24 | INFO | Train Epoch: 1 [ 8090112/10637090 (76%)] Loss: 0.94190 (1.047) Data (t): 0.001 Batch (t): 0.893, 574.880/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:16:53 | INFO | Train Epoch: 1 [ 8141312/10637090 (77%)] Loss: 1.1812 (1.048) Data (t): 0.001 Batch (t): 0.892, 573.457/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:18:24 | INFO | Train Epoch: 1 [ 8192512/10637090 (77%)] Loss: 1.1853 (1.048) Data (t): 0.001 Batch (t): 0.910, 575.807/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:19:56 | INFO | Train Epoch: 1 [ 8243712/10637090 (78%)] Loss: 1.1836 (1.049) Data (t): 0.001 Batch (t): 0.917, 574.010/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:21:28 | INFO | Train Epoch: 1 [ 8294912/10637090 (78%)] Loss: 1.1244 (1.050) Data (t): 0.001 Batch (t): 0.923, 574.645/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:22:57 | INFO | Train Epoch: 1 [ 8346112/10637090 (78%)] Loss: 0.97844 (1.049) Data (t): 0.001 Batch (t): 0.894, 571.009/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:24:27 | INFO | Train Epoch: 1 [ 8397312/10637090 (79%)] Loss: 0.95430 (1.049) Data (t): 0.001 Batch (t): 0.894, 569.645/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:25:58 | INFO | Train Epoch: 1 [ 8448512/10637090 (79%)] Loss: 1.1523 (1.049) Data (t): 0.001 Batch (t): 0.910, 574.517/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:27:28 | INFO | Train Epoch: 1 [ 8499712/10637090 (80%)] Loss: 1.0875 (1.050) Data (t): 0.001 Batch (t): 0.908, 573.476/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:29:00 | INFO | Train Epoch: 1 [ 8550912/10637090 (80%)] Loss: 1.0188 (1.049) Data (t): 0.001 Batch (t): 0.915, 573.363/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:30:31 | INFO | Train Epoch: 1 [ 8602112/10637090 (81%)] Loss: 1.1405 (1.050) Data (t): 0.001 Batch (t): 0.915, 573.662/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:32:01 | INFO | Train Epoch: 1 [ 8653312/10637090 (81%)] Loss: 0.93292 (1.049) Data (t): 0.001 Batch (t): 0.895, 573.662/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:33:31 | INFO | Train Epoch: 1 [ 8704512/10637090 (82%)] Loss: 1.0593 (1.049) Data (t): 0.001 Batch (t): 0.903, 574.119/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:35:03 | INFO | Train Epoch: 1 [ 8755712/10637090 (82%)] Loss: 1.0038 (1.049) Data (t): 0.001 Batch (t): 0.920, 574.029/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:36:35 | INFO | Train Epoch: 1 [ 8806912/10637090 (83%)] Loss: 1.0629 (1.049) Data (t): 0.001 Batch (t): 0.916, 266.849/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:38:06 | INFO | Train Epoch: 1 [ 8858112/10637090 (83%)] Loss: 0.94171 (1.048) Data (t): 0.001 Batch (t): 0.915, 572.994/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:39:36 | INFO | Train Epoch: 1 [ 8909312/10637090 (84%)] Loss: 1.1578 (1.049) Data (t): 0.001 Batch (t): 0.894, 573.463/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:41:05 | INFO | Train Epoch: 1 [ 8960512/10637090 (84%)] Loss: 1.0284 (1.049) Data (t): 0.001 Batch (t): 0.895, 570.680/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:42:37 | INFO | Train Epoch: 1 [ 9011712/10637090 (85%)] Loss: 1.0820 (1.049) Data (t): 0.001 Batch (t): 0.920, 573.570/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:44:08 | INFO | Train Epoch: 1 [ 9062912/10637090 (85%)] Loss: 0.97469 (1.049) Data (t): 0.001 Batch (t): 0.913, 570.541/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:45:41 | INFO | Train Epoch: 1 [ 9114112/10637090 (86%)] Loss: 1.0719 (1.049) Data (t): 0.001 Batch (t): 0.925, 571.838/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:47:10 | INFO | Train Epoch: 1 [ 9165312/10637090 (86%)] Loss: 0.97399 (1.048) Data (t): 0.001 Batch (t): 0.894, 571.676/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:48:40 | INFO | Train Epoch: 1 [ 9216512/10637090 (87%)] Loss: 1.1124 (1.049) Data (t): 0.001 Batch (t): 0.894, 570.389/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:50:11 | INFO | Train Epoch: 1 [ 9267712/10637090 (87%)] Loss: 1.0737 (1.049) Data (t): 0.001 Batch (t): 0.918, 575.484/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:51:43 | INFO | Train Epoch: 1 [ 9318912/10637090 (88%)] Loss: 1.0940 (1.049) Data (t): 0.001 Batch (t): 0.913, 573.371/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:53:15 | INFO | Train Epoch: 1 [ 9370112/10637090 (88%)] Loss: 1.1696 (1.050) Data (t): 0.001 Batch (t): 0.925, 572.882/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:54:45 | INFO | Train Epoch: 1 [ 9421312/10637090 (89%)] Loss: 0.97639 (1.049) Data (t): 0.001 Batch (t): 0.896, 571.202/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:56:14 | INFO | Train Epoch: 1 [ 9472512/10637090 (89%)] Loss: 1.0731 (1.050) Data (t): 0.001 Batch (t): 0.896, 571.435/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:57:46 | INFO | Train Epoch: 1 [ 9523712/10637090 (90%)] Loss: 0.95788 (1.049) Data (t): 0.001 Batch (t): 0.912, 574.514/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,09:59:17 | INFO | Train Epoch: 1 [ 9574912/10637090 (90%)] Loss: 1.0155 (1.049) Data (t): 0.001 Batch (t): 0.911, 568.433/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,10:00:48 | INFO | Train Epoch: 1 [ 9626112/10637090 (90%)] Loss: 0.95145 (1.048) Data (t): 0.001 Batch (t): 0.916, 573.715/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,10:02:20 | INFO | Train Epoch: 1 [ 9677312/10637090 (91%)] Loss: 1.0364 (1.048) Data (t): 0.001 Batch (t): 0.916, 572.356/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,10:03:49 | INFO | Train Epoch: 1 [ 9728512/10637090 (91%)] Loss: 1.0150 (1.048) Data (t): 0.001 Batch (t): 0.896, 572.428/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,10:05:21 | INFO | Train Epoch: 1 [ 9779712/10637090 (92%)] Loss: 0.99979 (1.048) Data (t): 0.001 Batch (t): 0.913, 573.510/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,10:06:52 | INFO | Train Epoch: 1 [ 9830912/10637090 (92%)] Loss: 1.0624 (1.048) Data (t): 0.001 Batch (t): 0.910, 572.486/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,10:08:23 | INFO | Train Epoch: 1 [ 9882112/10637090 (93%)] Loss: 0.97977 (1.048) Data (t): 0.001 Batch (t): 0.915, 570.660/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,10:09:55 | INFO | Train Epoch: 1 [ 9933312/10637090 (93%)] Loss: 0.97347 (1.047) Data (t): 0.001 Batch (t): 0.914, 573.422/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,10:11:24 | INFO | Train Epoch: 1 [ 9984512/10637090 (94%)] Loss: 1.0213 (1.047) Data (t): 0.001 Batch (t): 0.893, 574.399/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,10:12:54 | INFO | Train Epoch: 1 [10035712/10637090 (94%)] Loss: 1.0902 (1.047) Data (t): 0.001 Batch (t): 0.902, 572.747/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,10:14:26 | INFO | Train Epoch: 1 [10086912/10637090 (95%)] Loss: 1.0825 (1.048) Data (t): 0.001 Batch (t): 0.920, 571.481/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,10:15:57 | INFO | Train Epoch: 1 [10138112/10637090 (95%)] Loss: 0.83598 (1.046) Data (t): 0.001 Batch (t): 0.905, 568.170/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,10:17:29 | INFO | Train Epoch: 1 [10189312/10637090 (96%)] Loss: 1.0455 (1.046) Data (t): 0.001 Batch (t): 0.925, 571.571/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,10:18:59 | INFO | Train Epoch: 1 [10240512/10637090 (96%)] Loss: 0.98382 (1.046) Data (t): 0.001 Batch (t): 0.894, 573.749/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,10:20:28 | INFO | Train Epoch: 1 [10291712/10637090 (97%)] Loss: 0.88477 (1.045) Data (t): 0.001 Batch (t): 0.894, 573.371/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,10:22:00 | INFO | Train Epoch: 1 [10342912/10637090 (97%)] Loss: 0.92619 (1.045) Data (t): 0.001 Batch (t): 0.921, 571.458/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,10:23:31 | INFO | Train Epoch: 1 [10394112/10637090 (98%)] Loss: 1.0476 (1.045) Data (t): 0.001 Batch (t): 0.914, 572.968/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,10:25:03 | INFO | Train Epoch: 1 [10445312/10637090 (98%)] Loss: 0.92127 (1.044) Data (t): 0.001 Batch (t): 0.916, 573.688/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,10:26:33 | INFO | Train Epoch: 1 [10496512/10637090 (99%)] Loss: 1.0239 (1.044) Data (t): 0.001 Batch (t): 0.904, 571.659/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,10:28:03 | INFO | Train Epoch: 1 [10547712/10637090 (99%)] Loss: 0.99158 (1.044) Data (t): 0.001 Batch (t): 0.894, 571.625/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,10:29:35 | INFO | Train Epoch: 1 [10598912/10637090 (100%)] Loss: 0.84240 (1.043) Data (t): 0.001 Batch (t): 0.920, 573.355/s LR: 0.000000 Logit Scale: 100.000 - V4
+ 2024-11-27,10:30:42 | INFO | Train Epoch: 1 [10636800/10637090 (100%)] Loss: 1.0801 (1.043) Data (t): 0.002 Batch (t): 0.905, 575.829/s LR: 0.000000 Logit Scale: 100.000 - V4
data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-23_59_33-model_ViT-L-14-336-lr_5e-06-b_64-j_4-p_amp/params.txt ADDED
@@ -0,0 +1,67 @@
+ batch_size: 64
+ beta1: 0.9
+ beta2: 0.98
+ checkpoint_path: data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-23_59_33-model_ViT-L-14-336-lr_5e-06-b_64-j_4-p_amp/checkpoints
+ copy_codebase: False
+ csv_caption_key: caption
+ csv_hard_captions_key: neg_caption
+ csv_img_key: img_path
+ csv_separator: ,
+ dataset_resampled: False
+ dataset_type: csv
+ ddp_static_graph: False
+ debug: False
+ device: cuda:0
+ dist_backend: nccl
+ dist_url: env://
+ distributed: True
+ epochs: 2
+ eps: 1e-06
+ force_quick_gelu: True
+ gather_with_grad: False
+ grad_checkpointing: False
+ horovod: False
+ imagenet_v2: None
+ imagenet_val: None
+ local_loss: False
+ local_rank: 0
+ lock_image: False
+ lock_image_freeze_bn_stats: False
+ lock_image_unlocked_groups: 0
+ log_level: 20
+ log_local: False
+ log_path: data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2/2024_11_26-23_59_33-model_ViT-L-14-336-lr_5e-06-b_64-j_4-p_amp/out.log
+ logs: data/trained_openclip/negative_logs/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2
+ lr: 5e-06
+ model: ViT-L-14-336
+ name: 2024_11_26-23_59_33-model_ViT-L-14-336-lr_5e-06-b_64-j_4-p_amp
+ no_set_device_rank: False
+ norm_gradient_clip: None
+ precision: amp
+ pretrained: data/openclip-vit-14-336/openclip_model.pt
+ pretrained_image: False
+ rank: 0
+ report_to: wandb
+ resume: None
+ save_frequency: 1
+ save_most_recent: False
+ seed: 0
+ skip_scheduler: False
+ tensorboard: False
+ tensorboard_path:
+ torchscript: False
+ trace: False
+ train_data: csv_data/plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2.csv
+ train_num_samples: None
+ use_bn_sync: False
+ val_data: None
+ val_frequency: 1
+ val_num_samples: None
+ wandb: True
+ wandb_notes:
+ wandb_project: neg-clip-plotqa_train_only_qa_v2_5false_formated_sampled_fixed_flaten_decimal2
+ warmup: 0
+ wd: 0.1
+ workers: 4
+ world_size: 8
+ zeroshot_frequency: 2