vse-infty / coco_wsl_grid_bert / test_log_cxc.txt
2021-03-24 02:09:23,091 loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at /home/tiger/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084
2021-03-24 02:09:24,637 Did not load external (non-ImageNet) checkpoints
2021-03-24 02:09:24,638 Resnet backbone now has fixed blocks 2
2021-03-24 02:09:26,206 loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json from cache at /home/tiger/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517
2021-03-24 02:09:26,206 Model config {
  "architectures": [
    "BertForMaskedLM"
  ],
  "attention_probs_dropout_prob": 0.1,
  "finetuning_task": null,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "num_labels": 2,
  "output_attentions": false,
  "output_hidden_states": false,
  "pad_token_id": 0,
  "pruned_heads": {},
  "torchscript": false,
  "type_vocab_size": 2,
  "use_bfloat16": false,
  "vocab_size": 30522
}
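(The block above is the stock bert-base-uncased configuration. From the opening brace onward it is plain JSON, so it can be parsed directly; a minimal sketch using only the standard library, with an abbreviated subset of the keys shown in the log:)

```python
import json

# Subset of the config printed in the log, from the opening brace onward.
config_text = """
{
  "architectures": ["BertForMaskedLM"],
  "attention_probs_dropout_prob": 0.1,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "intermediate_size": 3072,
  "max_position_embeddings": 512,
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "vocab_size": 30522
}
"""
cfg = json.loads(config_text)
# Sanity-check the standard bert-base dimensions reported in the log.
assert cfg["hidden_size"] == 768 and cfg["num_hidden_layers"] == 12
print(cfg["hidden_size"], cfg["num_attention_heads"])  # 768 12
```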
2021-03-24 02:09:27,670 loading weights file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-pytorch_model.bin from cache at /home/tiger/.cache/torch/transformers/aa1ef1aede4482d0dbcd4d52baad8ae300e60902e88fcb0bebdec09afd232066.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157
2021-03-24 02:09:30,120 Use adam as the optimizer, with init lr 0.0005
2021-03-24 02:09:30,121 Image encoder is data-parallelized now.
2021-03-24 02:09:30,233 Loading dataset
2021-03-24 02:09:30,271 Input images are scaled by factor 2.0
2021-03-24 02:09:30,272 Computing results...
2021-03-24 02:09:51,749 Test: [0/196] Le 63.6659 (63.6659) Time 21.475 (0.000)
2021-03-24 02:09:59,075 Test: [10/196] Le 64.0212 (62.9529) Time 0.769 (0.000)
2021-03-24 02:10:06,372 Test: [20/196] Le 62.5716 (62.7840) Time 0.707 (0.000)
2021-03-24 02:10:13,746 Test: [30/196] Le 63.2262 (63.1042) Time 0.732 (0.000)
2021-03-24 02:10:21,147 Test: [40/196] Le 64.0326 (62.8167) Time 0.704 (0.000)
2021-03-24 02:10:28,558 Test: [50/196] Le 63.5130 (62.8027) Time 0.717 (0.000)
2021-03-24 02:10:35,860 Test: [60/196] Le 62.8874 (62.9139) Time 0.715 (0.000)
2021-03-24 02:10:43,206 Test: [70/196] Le 63.8606 (62.8671) Time 0.745 (0.000)
2021-03-24 02:10:50,685 Test: [80/196] Le 64.2199 (62.9350) Time 0.735 (0.000)
2021-03-24 02:10:58,113 Test: [90/196] Le 63.8676 (63.0111) Time 0.772 (0.000)
2021-03-24 02:11:05,484 Test: [100/196] Le 64.4704 (63.0609) Time 0.785 (0.000)
2021-03-24 02:11:12,759 Test: [110/196] Le 60.2271 (63.0471) Time 0.764 (0.000)
2021-03-24 02:11:20,155 Test: [120/196] Le 66.3189 (63.0213) Time 0.725 (0.000)
2021-03-24 02:11:27,501 Test: [130/196] Le 61.8337 (63.0454) Time 0.740 (0.000)
2021-03-24 02:11:34,893 Test: [140/196] Le 64.4948 (63.0690) Time 0.711 (0.000)
2021-03-24 02:11:42,357 Test: [150/196] Le 59.8112 (63.0335) Time 0.715 (0.000)
2021-03-24 02:11:49,662 Test: [160/196] Le 64.5806 (63.0258) Time 0.711 (0.000)
2021-03-24 02:11:57,051 Test: [170/196] Le 63.7446 (62.9643) Time 0.729 (0.000)
2021-03-24 02:12:04,428 Test: [180/196] Le 62.9918 (62.9341) Time 0.792 (0.000)
2021-03-24 02:12:11,782 Test: [190/196] Le 61.2165 (63.0057) Time 0.711 (0.000)
2021-03-24 02:12:17,785 Images: 5000, Captions: 25000
2021-03-24 02:12:54,113 T2I R@1: 53.64, R@5: 81.08, R@10: 88.95
2021-03-24 02:12:54,114 I2T R@1: 67.90, R@5: 90.64, R@10: 95.46
2021-03-24 02:14:02,666 I2I R@1: 51.28, R@5: 83.18, R@10: 90.50
2021-03-24 02:14:02,666 T2T R@1: 46.66, R@5: 69.16, R@10: 78.19
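(The retrieval numbers above are Recall@K over 5,000 images with 5 captions each, hence 25,000 captions. A minimal sketch of how image-to-text Recall@K is typically computed from a similarity matrix — the function and variable names here are illustrative, not the actual VSE∞ evaluation code:)

```python
import numpy as np

def i2t_recall_at_k(sim, ks=(1, 5, 10)):
    """sim: (n_images, n_captions) similarity matrix, COCO-style layout
    where captions 5*i .. 5*i+4 are the ground truth for image i.
    Returns Recall@K in percent for each K in ks."""
    n_images = sim.shape[0]
    ranks = np.zeros(n_images, dtype=int)
    for i in range(n_images):
        order = np.argsort(sim[i])[::-1]           # caption indices, best first
        gt = set(range(5 * i, 5 * i + 5))          # the 5 matching captions
        # rank of the best-ranked ground-truth caption (0-based)
        ranks[i] = next(r for r, c in enumerate(order) if c in gt)
    return {k: 100.0 * np.mean(ranks < k) for k in ks}

# Toy check: each image's own caption gets the top score -> perfect recall.
sim = np.zeros((2, 10))
for i in range(2):
    sim[i, 5 * i] = 1.0
print(i2t_recall_at_k(sim))  # {1: 100.0, 5: 100.0, 10: 100.0}
```

Text-to-image recall works the same way with the transposed matrix, except each caption has exactly one ground-truth image (index `caption // 5`).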