2023-01-21 18:04:36,712 48k INFO {'train': {'log_interval': 200, 'eval_interval': 1000, 'seed': 1234, 'epochs': 10000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 12, 'fp16_run': False, 'lr_decay': 0.999875, 'segment_size': 17920, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0, 'use_sr': True, 'max_speclen': 384, 'port': '8001'}, 'data': {'training_files': 'filelists/train.txt', 'validation_files': 'filelists/val.txt', 'max_wav_value': 32768.0, 'sampling_rate': 48000, 'filter_length': 1280, 'hop_length': 320, 'win_length': 1280, 'n_mel_channels': 80, 'mel_fmin': 0.0, 'mel_fmax': None}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [10, 8, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4], 'n_layers_q': 3, 'use_spectral_norm': False, 'gin_channels': 256, 'ssl_dim': 256, 'n_speakers': 2}, 'spk': {'Asaki': 0}, 'model_dir': './logs/48k'}
2023-01-21 18:04:36,712 48k WARNING /root/so-vits-svc is not a git repository, therefore hash value comparison will be ignored.
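The `learning_rate` and `lr_decay` values in the config dump above fully determine the per-epoch learning rates that appear in the records below (0.0001, 9.99875e-05, 9.99750015625e-05, ...). A minimal sketch verifying this, assuming the standard exponential schedule lr(epoch) = learning_rate * lr_decay**(epoch - 1):

```python
import math

# Hyperparameters taken from the config dump above
LEARNING_RATE = 0.0001  # 'learning_rate'
LR_DECAY = 0.999875     # 'lr_decay'

def lr_at_epoch(epoch: int) -> float:
    """Exponential decay: one decay step per epoch, starting from epoch 1."""
    return LEARNING_RATE * LR_DECAY ** (epoch - 1)

# Reproduces the learning rates logged for epochs 1-3 of this run
assert math.isclose(lr_at_epoch(1), 0.0001)
assert math.isclose(lr_at_epoch(2), 9.99875e-05)
assert math.isclose(lr_at_epoch(3), 9.99750015625e-05)
```

The scheduler in the actual trainer may apply the decay by repeated multiplication rather than exponentiation, so agreement is to floating-point tolerance, not bit-exact.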
2023-01-21 18:05:17,789 48k INFO emb_g.weight is not in the checkpoint
2023-01-21 18:05:17,893 48k INFO Loaded checkpoint './logs/48k/G_0.pth' (iteration 1)
2023-01-21 18:05:18,041 48k INFO Loaded checkpoint './logs/48k/D_0.pth' (iteration 1)
2023-01-21 18:05:35,509 48k INFO {'train': {'log_interval': 200, 'eval_interval': 1000, 'seed': 1234, 'epochs': 10000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 12, 'fp16_run': False, 'lr_decay': 0.999875, 'segment_size': 17920, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0, 'use_sr': True, 'max_speclen': 384, 'port': '8001'}, 'data': {'training_files': 'filelists/train.txt', 'validation_files': 'filelists/val.txt', 'max_wav_value': 32768.0, 'sampling_rate': 48000, 'filter_length': 1280, 'hop_length': 320, 'win_length': 1280, 'n_mel_channels': 80, 'mel_fmin': 0.0, 'mel_fmax': None}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [10, 8, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4], 'n_layers_q': 3, 'use_spectral_norm': False, 'gin_channels': 256, 'ssl_dim': 256, 'n_speakers': 2}, 'spk': {'Asaki': 0}, 'model_dir': './logs/48k'}
2023-01-21 18:05:35,510 48k WARNING /root/so-vits-svc is not a git repository, therefore hash value comparison will be ignored.
2023-01-21 18:07:09,417 48k INFO Train Epoch: 1 [0%]
2023-01-21 18:07:09,419 48k INFO [2.78788423538208, 2.3910000324249268, 7.987979412078857, 49.07175064086914, 7.279245853424072, 0, 0.0001]
2023-01-21 18:07:26,003 48k INFO Saving model and optimizer state at iteration 1 to ./logs/48k/G_0.pth
2023-01-21 18:07:32,371 48k INFO Saving model and optimizer state at iteration 1 to ./logs/48k/D_0.pth
2023-01-21 18:11:03,546 48k INFO ====> Epoch: 1
2023-01-21 18:12:49,246 48k INFO Train Epoch: 2 [48%]
2023-01-21 18:12:49,247 48k INFO [2.5332446098327637, 2.26025390625, 6.446165084838867, 19.588043212890625, 1.3003804683685303, 200, 9.99875e-05]
2023-01-21 18:13:52,244 48k INFO ====> Epoch: 2
2023-01-21 18:16:35,440 48k INFO Train Epoch: 3 [96%]
2023-01-21 18:16:35,443 48k INFO [2.5118918418884277, 2.3020124435424805, 6.964531898498535, 21.382247924804688, 1.1781079769134521, 400, 9.99750015625e-05]
2023-01-21 18:16:39,867 48k INFO ====> Epoch: 3
2023-01-21 18:19:05,092 48k INFO {'train': {'log_interval': 200, 'eval_interval': 1000, 'seed': 1234, 'epochs': 10000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 12, 'fp16_run': False, 'lr_decay': 0.999875, 'segment_size': 17920, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0, 'use_sr': True, 'max_speclen': 384, 'port': '8001'}, 'data': {'training_files': 'filelists/train.txt', 'validation_files': 'filelists/val.txt', 'max_wav_value': 32768.0, 'sampling_rate': 48000, 'filter_length': 1280, 'hop_length': 320, 'win_length': 1280, 'n_mel_channels': 80, 'mel_fmin': 0.0, 'mel_fmax': None}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [10, 8, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4], 'n_layers_q': 3, 'use_spectral_norm': False, 'gin_channels': 256, 'ssl_dim': 256, 'n_speakers': 2}, 'spk': {'Asaki': 0}, 'model_dir': './logs/48k'}
2023-01-21 18:19:05,092 48k WARNING /root/so-vits-svc is not a git repository, therefore hash value comparison will be ignored.
2023-01-21 18:19:20,863 48k INFO ====> Epoch: 4
2023-01-21 18:21:28,824 48k INFO {'train': {'log_interval': 200, 'eval_interval': 1000, 'seed': 1234, 'epochs': 10000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 12, 'fp16_run': False, 'lr_decay': 0.999875, 'segment_size': 17920, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0, 'use_sr': True, 'max_speclen': 384, 'port': '8001'}, 'data': {'training_files': 'filelists/train.txt', 'validation_files': 'filelists/val.txt', 'max_wav_value': 32768.0, 'sampling_rate': 48000, 'filter_length': 1280, 'hop_length': 320, 'win_length': 1280, 'n_mel_channels': 80, 'mel_fmin': 0.0, 'mel_fmax': None}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [10, 8, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4], 'n_layers_q': 3, 'use_spectral_norm': False, 'gin_channels': 256, 'ssl_dim': 256, 'n_speakers': 2}, 'spk': {'Asaki': 0}, 'model_dir': './logs/48k'}
2023-01-21 18:21:28,825 48k WARNING /root/so-vits-svc is not a git repository, therefore hash value comparison will be ignored.
2023-01-21 18:22:14,827 48k INFO Loaded checkpoint './logs/48k/G_0.pth' (iteration 1)
2023-01-21 18:22:18,033 48k INFO Loaded checkpoint './logs/48k/D_0.pth' (iteration 1)
2023-01-21 18:23:47,919 48k INFO Train Epoch: 1 [0%]
2023-01-21 18:23:47,920 48k INFO [2.359910011291504, 2.8044304847717285, 8.25642204284668, 43.937801361083984, 5.3009514808654785, 0, 0.0001]
2023-01-21 18:24:01,307 48k INFO Saving model and optimizer state at iteration 1 to ./logs/48k/G_0.pth
2023-01-21 18:24:04,235 48k INFO Saving model and optimizer state at iteration 1 to ./logs/48k/D_0.pth
2023-01-21 18:26:27,936 48k INFO ====> Epoch: 1
2023-01-21 18:28:16,058 48k INFO Train Epoch: 2 [48%]
2023-01-21 18:28:16,060 48k INFO [2.5112764835357666, 2.1918115615844727, 6.411715507507324, 19.3953800201416, 1.3029295206069946, 200, 9.99875e-05]
2023-01-21 18:29:18,505 48k INFO ====> Epoch: 2
2023-01-21 18:31:58,017 48k INFO Train Epoch: 3 [96%]
2023-01-21 18:31:58,021 48k INFO [2.536815643310547, 2.2379751205444336, 6.896862506866455, 20.75645637512207, 1.1633890867233276, 400, 9.99750015625e-05]
2023-01-21 18:32:02,308 48k INFO ====> Epoch: 3
2023-01-21 18:34:40,706 48k INFO ====> Epoch: 4
2023-01-21 18:36:17,736 48k INFO Train Epoch: 5 [44%]
2023-01-21 18:36:17,737 48k INFO [2.454810857772827, 2.231981039047241, 6.8509931564331055, 22.396493911743164, 1.0622344017028809, 600, 9.995000937421877e-05]
2023-01-21 18:37:24,162 48k INFO ====> Epoch: 5
2023-01-21 18:40:03,179 48k INFO Train Epoch: 6 [93%]
2023-01-21 18:40:03,181 48k INFO [2.407399892807007, 2.395613670349121, 6.204927444458008, 19.234540939331055, 0.9698495864868164, 800, 9.993751562304699e-05]
2023-01-21 18:40:11,999 48k INFO ====> Epoch: 6
2023-01-21 18:42:53,269 48k INFO ====> Epoch: 7
2023-01-21 18:44:28,845 48k INFO Train Epoch: 8 [41%]
2023-01-21 18:44:28,846 48k INFO [2.494946241378784, 2.308384418487549, 6.459575176239014, 19.92708396911621, 1.1307897567749023, 1000, 9.991253280566489e-05]
2023-01-21 18:44:45,411 48k INFO Saving model and optimizer state at iteration 8 to ./logs/48k/G_1000.pth
2023-01-21 18:44:47,865 48k INFO Saving model and optimizer state at iteration 8 to ./logs/48k/D_1000.pth
2023-01-21 18:45:59,935 48k INFO ====> Epoch: 8
2023-01-21 18:48:34,562 48k INFO Train Epoch: 9 [89%]
2023-01-21 18:48:34,564 48k INFO [2.288059711456299, 2.4334986209869385, 6.657644748687744, 18.83081817626953, 1.0002739429473877, 1200, 9.990004373906418e-05]
2023-01-21 18:48:47,765 48k INFO ====> Epoch: 9
2023-01-21 18:51:29,773 48k INFO ====> Epoch: 10
2023-01-21 18:52:54,050 48k INFO Train Epoch: 11 [37%]
2023-01-21 18:52:54,051 48k INFO [2.6331095695495605, 2.1205484867095947, 5.91013240814209, 20.19317626953125, 0.860890805721283, 1400, 9.987507028906759e-05]
2023-01-21 18:54:09,630 48k INFO ====> Epoch: 11
2023-01-21 18:56:29,456 48k INFO Train Epoch: 12 [85%]
2023-01-21 18:56:29,457 48k INFO [2.4770467281341553, 2.0123064517974854, 6.768789768218994, 19.123401641845703, 0.841851532459259, 1600, 9.986258590528146e-05]
2023-01-21 18:56:47,066 48k INFO ====> Epoch: 12
2023-01-21 18:59:33,852 48k INFO ====> Epoch: 13
2023-01-21 19:00:59,237 48k INFO Train Epoch: 14 [33%]
2023-01-21 19:00:59,239 48k INFO [2.570930004119873, 2.231339454650879, 6.8405561447143555, 20.79842185974121, 1.3023321628570557, 1800, 9.983762181915804e-05]
2023-01-21 19:02:18,905 48k INFO ====> Epoch: 14
2023-01-21 19:04:35,908 48k INFO Train Epoch: 15 [81%]
2023-01-21 19:04:35,909 48k INFO [2.498138427734375, 2.230814218521118, 6.456470489501953, 18.991992950439453, 0.8417561650276184, 2000, 9.982514211643064e-05]
2023-01-21 19:04:53,401 48k INFO Saving model and optimizer state at iteration 15 to ./logs/48k/G_2000.pth
2023-01-21 19:04:55,484 48k INFO Saving model and optimizer state at iteration 15 to ./logs/48k/D_2000.pth
2023-01-21 19:05:18,677 48k INFO ====> Epoch: 15
2023-01-21 19:08:16,402 48k INFO ====> Epoch: 16
2023-01-21 19:09:37,263 48k INFO Train Epoch: 17 [30%]
2023-01-21 19:09:37,264 48k INFO [2.417067527770996, 2.208970785140991, 6.753937244415283, 18.541423797607422, 1.1583983898162842, 2200, 9.980018739066937e-05]
2023-01-21 19:11:01,142 48k INFO ====> Epoch: 17
2023-01-21 19:13:13,982 48k INFO Train Epoch: 18 [78%]
2023-01-21 19:13:13,983 48k INFO [2.4082534313201904, 2.3695671558380127, 8.824447631835938, 20.506267547607422, 1.017524242401123, 2400, 9.978771236724554e-05]
2023-01-21 19:13:40,413 48k INFO ====> Epoch: 18
2023-01-21 19:16:23,390 48k INFO ====> Epoch: 19
2023-01-21 19:17:45,828 48k INFO Train Epoch: 20 [26%]
2023-01-21 19:17:45,830 48k INFO [2.3429973125457764, 2.336138963699341, 7.183443546295166, 19.938209533691406, 1.11813223361969, 2600, 9.976276699833672e-05]
2023-01-21 19:19:14,454 48k INFO ====> Epoch: 20
2023-01-21 19:21:27,359 48k INFO Train Epoch: 21 [74%]
2023-01-21 19:21:27,361 48k INFO [2.4236397743225098, 2.297074794769287, 7.671711444854736, 20.581554412841797, 0.8658930659294128, 2800, 9.975029665246193e-05]
2023-01-21 19:21:58,167 48k INFO ====> Epoch: 21
2023-01-21 19:24:40,846 48k INFO ====> Epoch: 22
2023-01-21 19:25:56,604 48k INFO Train Epoch: 23 [22%]
2023-01-21 19:25:56,606 48k INFO [2.559892177581787, 2.1731812953948975, 6.826930999755859, 18.329904556274414, 0.6633304953575134, 3000, 9.972536063689719e-05]
2023-01-21 19:26:05,470 48k INFO Saving model and optimizer state at iteration 23 to ./logs/48k/G_3000.pth
2023-01-21 19:26:08,949 48k INFO Saving model and optimizer state at iteration 23 to ./logs/48k/D_3000.pth
2023-01-21 19:27:43,894 48k INFO ====> Epoch: 23
2023-01-21 19:29:50,652 48k INFO Train Epoch: 24 [70%]
2023-01-21 19:29:50,653 48k INFO [2.7978920936584473, 1.9214602708816528, 5.106021881103516, 13.230501174926758, 0.7694799304008484, 3200, 9.971289496681757e-05]
2023-01-21 19:30:26,082 48k INFO ====> Epoch: 24
2023-01-21 19:33:10,375 48k INFO ====> Epoch: 25
2023-01-21 19:34:19,108 48k INFO Train Epoch: 26 [19%]
2023-01-21 19:34:19,109 48k INFO [2.5172605514526367, 2.1951985359191895, 5.333796977996826, 16.90324592590332, 0.7694472670555115, 3400, 9.968796830108985e-05]
2023-01-21 19:35:56,710 48k INFO ====> Epoch: 26
2023-01-21 19:38:08,194 48k INFO Train Epoch: 27 [67%]
2023-01-21 19:38:08,195 48k INFO [2.5653388500213623, 2.3432114124298096, 7.813320636749268, 18.53550910949707, 1.0009058713912964, 3600, 9.967550730505221e-05]
2023-01-21 19:38:48,066 48k INFO ====> Epoch: 27
2023-01-21 19:41:31,180 48k INFO ====> Epoch: 28
2023-01-21 19:42:32,556 48k INFO Train Epoch: 29 [15%]
2023-01-21 19:42:32,557 48k INFO [2.522839307785034, 2.3247222900390625, 6.587640762329102, 18.705707550048828, 1.0434538125991821, 3800, 9.965058998565574e-05]
2023-01-21 19:44:14,050 48k INFO ====> Epoch: 29
2023-01-21 19:46:16,207 48k INFO Train Epoch: 30 [63%]
2023-01-21 19:46:16,208 48k INFO [2.744947910308838, 2.0575339794158936, 4.532804489135742, 15.108162879943848, 1.0357608795166016, 4000, 9.963813366190753e-05]
2023-01-21 19:46:27,097 48k INFO Saving model and optimizer state at iteration 30 to ./logs/48k/G_4000.pth
2023-01-21 19:46:30,302 48k INFO Saving model and optimizer state at iteration 30 to ./logs/48k/D_4000.pth
2023-01-21 19:47:16,675 48k INFO ====> Epoch: 30
2023-01-21 19:49:56,569 48k INFO ====> Epoch: 31
2023-01-21 19:50:50,278 48k INFO Train Epoch: 32 [11%]
2023-01-21 19:50:50,280 48k INFO [2.7185075283050537, 2.1597092151641846, 5.198480606079102, 13.74974536895752, 0.6658070087432861, 4200, 9.961322568533789e-05]
2023-01-21 19:52:36,896 48k INFO ====> Epoch: 32
2023-01-21 19:54:37,255 48k INFO Train Epoch: 33 [59%]
2023-01-21 19:54:37,257 48k INFO [2.370847463607788, 2.301891803741455, 6.454242706298828, 16.457263946533203, 1.0966852903366089, 4400, 9.960077403212722e-05]
2023-01-21 19:55:25,957 48k INFO ====> Epoch: 33
2023-01-21 19:58:05,688 48k INFO ====> Epoch: 34
2023-01-21 19:58:56,195 48k INFO Train Epoch: 35 [7%]
2023-01-21 19:58:56,197 48k INFO [2.4801878929138184, 2.185500383377075, 6.788366794586182, 15.576699256896973, 0.8841870427131653, 4600, 9.957587539488128e-05]
2023-01-21 20:00:46,706 48k INFO ====> Epoch: 35
2023-01-21 20:02:36,080 48k INFO Train Epoch: 36 [56%]
2023-01-21 20:02:36,081 48k INFO [2.679831027984619, 2.0548784732818604, 5.029653549194336, 16.398338317871094, 0.718271791934967, 4800, 9.956342841045691e-05]
2023-01-21 20:03:29,370 48k INFO ====> Epoch: 36
2023-01-21 20:06:20,967 48k INFO ====> Epoch: 37
2023-01-21 20:07:06,429 48k INFO Train Epoch: 38 [4%]
2023-01-21 20:07:06,433 48k INFO [2.5552399158477783, 2.1978342533111572, 6.492395401000977, 19.408016204833984, 0.8538599014282227, 5000, 9.953853910903285e-05]
2023-01-21 20:07:12,565 48k INFO Saving model and optimizer state at iteration 38 to ./logs/48k/G_5000.pth
2023-01-21 20:07:15,387 48k INFO Saving model and optimizer state at iteration 38 to ./logs/48k/D_5000.pth
2023-01-21 20:09:12,470 48k INFO ====> Epoch: 38
2023-01-21 20:10:55,128 48k INFO Train Epoch: 39 [52%]
2023-01-21 20:10:55,130 48k INFO [2.4816973209381104, 2.0630762577056885, 6.828281402587891, 20.028705596923828, 1.0210936069488525, 5200, 9.952609679164422e-05]
2023-01-21 20:11:52,586 48k INFO ====> Epoch: 39
2023-01-21 20:14:36,304 48k INFO ====> Epoch: 40
2023-01-21 20:15:17,849 48k INFO Train Epoch: 41 [0%]
2023-01-21 20:15:17,850 48k INFO [2.4473423957824707, 2.315026044845581, 6.20758581161499, 18.650897979736328, 0.7872772216796875, 5400, 9.950121682254156e-05]
2023-01-21 20:17:17,708 48k INFO ====> Epoch: 41
2023-01-21 20:18:58,229 48k INFO Train Epoch: 42 [48%]
2023-01-21 20:18:58,230 48k INFO [2.5761501789093018, 2.3296546936035156, 6.582129001617432, 16.524967193603516, 0.6928725838661194, 5600, 9.948877917043875e-05]
2023-01-21 20:20:00,056 48k INFO ====> Epoch: 42
2023-01-21 20:22:46,301 48k INFO Train Epoch: 43 [96%]
2023-01-21 20:22:46,304 48k INFO [2.4285154342651367, 2.264054536819458, 7.586179256439209, 19.46199607849121, 0.9597828388214111, 5800, 9.947634307304244e-05]
2023-01-21 20:22:50,640 48k INFO ====> Epoch: 43
2023-01-21 20:25:35,091 48k INFO ====> Epoch: 44
2023-01-21 20:27:09,037 48k INFO Train Epoch: 45 [44%]
2023-01-21 20:27:09,038 48k INFO [2.67507266998291, 2.1673824787139893, 5.574207305908203, 14.969029426574707, 0.9140932559967041, 6000, 9.945147554159202e-05]
2023-01-21 20:27:19,140 48k INFO Saving model and optimizer state at iteration 45 to ./logs/48k/G_6000.pth
2023-01-21 20:27:21,831 48k INFO Saving model and optimizer state at iteration 45 to ./logs/48k/D_6000.pth
2023-01-21 20:28:30,277 48k INFO ====> Epoch: 45
2023-01-21 20:31:08,153 48k INFO Train Epoch: 46 [93%]
2023-01-21 20:31:08,154 48k INFO [2.633803367614746, 2.0723886489868164, 5.611661434173584, 15.92891788482666, 0.5805487036705017, 6200, 9.943904410714931e-05]
2023-01-21 20:31:16,810 48k INFO ====> Epoch: 46
2023-01-21 20:33:57,081 48k INFO ====> Epoch: 47
2023-01-21 20:35:28,880 48k INFO Train Epoch: 48 [41%]
2023-01-21 20:35:28,881 48k INFO [2.399261713027954, 2.1058430671691895, 7.828022480010986, 21.01303482055664, 0.7018458247184753, 6400, 9.941418589985758e-05]
2023-01-21 20:36:39,568 48k INFO ====> Epoch: 48
2023-01-21 20:39:11,220 48k INFO Train Epoch: 49 [89%]
2023-01-21 20:39:11,223 48k INFO [2.4858646392822266, 2.297830581665039, 7.348237991333008, 19.16147804260254, 1.027444839477539, 6600, 9.940175912662009e-05]
2023-01-21 20:39:24,367 48k INFO ====> Epoch: 49
2023-01-21 20:42:12,722 48k INFO ====> Epoch: 50
2023-01-21 20:43:49,123 48k INFO Train Epoch: 51 [37%]
2023-01-21 20:43:49,125 48k INFO [2.6693339347839355, 2.0732462406158447, 6.496889591217041, 17.18926239013672, 0.6295542120933533, 6800, 9.937691023999092e-05]
2023-01-21 20:45:04,252 48k INFO ====> Epoch: 51
2023-01-21 20:47:28,747 48k INFO Train Epoch: 52 [85%]
2023-01-21 20:47:28,753 48k INFO [2.433626413345337, 2.3078064918518066, 6.976032257080078, 21.506155014038086, 1.0545254945755005, 7000, 9.936448812621091e-05]
2023-01-21 20:47:40,241 48k INFO Saving model and optimizer state at iteration 52 to ./logs/48k/G_7000.pth
2023-01-21 20:47:42,889 48k INFO Saving model and optimizer state at iteration 52 to ./logs/48k/D_7000.pth
2023-01-21 20:48:02,836 48k INFO ====> Epoch: 52
2023-01-21 20:50:55,789 48k INFO ====> Epoch: 53
2023-01-21 20:52:18,257 48k INFO Train Epoch: 54 [33%]
2023-01-21 20:52:18,258 48k INFO [2.4525606632232666, 2.2672998905181885, 8.37466812133789, 19.30447006225586, 0.9938163757324219, 7200, 9.933964855674948e-05]
2023-01-21 20:53:37,609 48k INFO ====> Epoch: 54
2023-01-21 20:55:53,748 48k INFO Train Epoch: 55 [81%]
2023-01-21 20:55:53,751 48k INFO [2.4006752967834473, 2.346210479736328, 6.938320636749268, 18.15106964111328, 0.6857073903083801, 7400, 9.932723110067987e-05]
2023-01-21 20:56:15,747 48k INFO ====> Epoch: 55
2023-01-21 20:59:00,493 48k INFO ====> Epoch: 56
2023-01-21 21:00:17,911 48k INFO Train Epoch: 57 [30%]
2023-01-21 21:00:17,914 48k INFO [2.2028679847717285, 2.6490511894226074, 7.324281215667725, 16.907730102539062, 0.7272804975509644, 7600, 9.930240084489267e-05]
2023-01-21 21:01:41,965 48k INFO ====> Epoch: 57
2023-01-21 21:03:56,163 48k INFO Train Epoch: 58 [78%]
2023-01-21 21:03:56,164 48k INFO [2.4505438804626465, 2.4747581481933594, 7.810667514801025, 19.97563362121582, 0.6851949691772461, 7800, 9.928998804478705e-05]
2023-01-21 21:04:22,429 48k INFO ====> Epoch: 58
2023-01-21 21:07:13,615 48k INFO ====> Epoch: 59
2023-01-21 21:08:59,017 48k INFO Train Epoch: 60 [26%]
2023-01-21 21:08:59,018 48k INFO [2.486494302749634, 2.199840545654297, 7.365984916687012, 18.430219650268555, 0.7499792575836182, 8000, 9.926516709918191e-05]
2023-01-21 21:09:28,413 48k INFO Saving model and optimizer state at iteration 60 to ./logs/48k/G_8000.pth
2023-01-21 21:09:31,610 48k INFO Saving model and optimizer state at iteration 60 to ./logs/48k/D_8000.pth
2023-01-21 21:11:01,009 48k INFO ====> Epoch: 60
2023-01-21 21:14:15,414 48k INFO Train Epoch: 61 [74%]
2023-01-21 21:14:15,443 48k INFO [2.6325886249542236, 2.0911941528320312, 5.727120399475098, 16.730710983276367, 0.5703538060188293, 8200, 9.92527589532945e-05]
2023-01-21 21:14:46,236 48k INFO ====> Epoch: 61
2023-01-21 21:18:24,391 48k INFO ====> Epoch: 62
2023-01-21 23:53:25,508 48k INFO {'train': {'log_interval': 200, 'eval_interval': 1000, 'seed': 1234, 'epochs': 10000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 12, 'fp16_run': False, 'lr_decay': 0.999875, 'segment_size': 17920, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0, 'use_sr': True, 'max_speclen': 384, 'port': '8001'}, 'data': {'training_files': 'filelists/train.txt', 'validation_files': 'filelists/val.txt', 'max_wav_value': 32768.0, 'sampling_rate': 48000, 'filter_length': 1280, 'hop_length': 320, 'win_length': 1280, 'n_mel_channels': 80, 'mel_fmin': 0.0, 'mel_fmax': None}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [10, 8, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4], 'n_layers_q': 3, 'use_spectral_norm': False, 'gin_channels': 256, 'ssl_dim': 256, 'n_speakers': 2}, 'spk': {'Asaki': 0}, 'model_dir': './logs/48k'}
2023-01-21 23:53:25,509 48k WARNING /root/so-vits-svc is not a git repository, therefore hash value comparison will be ignored.
2023-01-21 23:54:30,236 48k INFO Loaded checkpoint './logs/48k/G_8000.pth' (iteration 60)
2023-01-21 23:54:42,333 48k INFO Loaded checkpoint './logs/48k/D_8000.pth' (iteration 60)
2023-01-21 23:57:39,795 48k INFO Train Epoch: 60 [26%]
2023-01-21 23:57:39,796 48k INFO [2.525301456451416, 2.0419814586639404, 6.9045634269714355, 18.9658203125, 0.8763787746429443, 8000, 9.92527589532945e-05]
2023-01-21 23:58:09,857 48k INFO Saving model and optimizer state at iteration 60 to ./logs/48k/G_8000.pth
2023-01-21 23:58:14,303 48k INFO Saving model and optimizer state at iteration 60 to ./logs/48k/D_8000.pth
2023-01-22 00:00:03,698 48k INFO ====> Epoch: 60
2023-01-22 00:02:54,306 48k INFO Train Epoch: 61 [74%]
2023-01-22 00:02:54,307 48k INFO [2.589698076248169, 2.205421209335327, 6.1892900466918945, 17.996353149414062, 0.9285355806350708, 8200, 9.924035235842533e-05]
2023-01-22 00:03:25,516 48k INFO ====> Epoch: 61
2023-01-22 00:06:56,456 48k INFO ====> Epoch: 62
2023-01-22 00:08:26,712 48k INFO Train Epoch: 63 [22%]
2023-01-22 00:08:26,714 48k INFO [2.4841582775115967, 2.2755141258239746, 8.042802810668945, 19.29408073425293, 1.0683722496032715, 8400, 9.921554382096622e-05]
2023-01-22 00:09:59,635 48k INFO ====> Epoch: 63
2023-01-22 00:12:30,711 48k INFO Train Epoch: 64 [70%]
2023-01-22 00:12:30,910 48k INFO [2.4924073219299316, 2.350067377090454, 6.41464900970459, 16.699356079101562, 0.6877371668815613, 8600, 9.92031418779886e-05]
2023-01-22 00:13:06,247 48k INFO ====> Epoch: 64
2023-01-22 00:16:15,398 48k INFO ====> Epoch: 65
2023-01-22 00:17:51,028 48k INFO Train Epoch: 66 [19%]
2023-01-22 00:17:51,030 48k INFO [2.3635761737823486, 2.2696850299835205, 6.850845813751221, 16.303762435913086, 0.9442687630653381, 8800, 9.917834264256819e-05]
2023-01-22 00:19:28,721 48k INFO ====> Epoch: 66
2023-01-22 00:21:39,174 48k INFO Train Epoch: 67 [67%]
2023-01-22 00:21:39,181 48k INFO [2.457024097442627, 2.392209529876709, 9.030594825744629, 19.695812225341797, 0.6749056577682495, 9000, 9.916594534973787e-05]
2023-01-22 00:22:00,547 48k INFO Saving model and optimizer state at iteration 67 to ./logs/48k/G_9000.pth
2023-01-22 00:22:03,569 48k INFO Saving model and optimizer state at iteration 67 to ./logs/48k/D_9000.pth
2023-01-22 00:22:44,927 48k INFO ====> Epoch: 67
2023-01-22 00:26:09,963 48k INFO ====> Epoch: 68
2023-01-22 00:27:39,535 48k INFO Train Epoch: 69 [15%]
2023-01-22 00:27:39,536 48k INFO [2.4610981941223145, 2.2215635776519775, 7.027792930603027, 16.051708221435547, 0.7246941328048706, 9200, 9.914115541286833e-05]
2023-01-22 00:29:21,524 48k INFO ====> Epoch: 69
2023-01-22 00:31:53,716 48k INFO Train Epoch: 70 [63%]
2023-01-22 00:31:53,717 48k INFO [2.755732536315918, 2.032122850418091, 4.157487392425537, 12.618134498596191, 0.6738178730010986, 9400, 9.912876276844171e-05]
2023-01-22 00:32:38,163 48k INFO ====> Epoch: 70
2023-01-22 00:35:46,107 48k INFO ====> Epoch: 71
2023-01-22 00:37:40,344 48k INFO Train Epoch: 72 [11%]
2023-01-22 00:37:40,352 48k INFO [2.0691123008728027, 3.014491319656372, 8.477749824523926, 18.246536254882812, 0.7139840722084045, 9600, 9.910398212663652e-05]
2023-01-22 00:39:27,040 48k INFO ====> Epoch: 72
2023-01-22 00:41:43,590 48k INFO Train Epoch: 73 [59%]
2023-01-22 00:41:43,591 48k INFO [2.5539746284484863, 2.139866352081299, 4.984771251678467, 14.52645492553711, 0.6911921501159668, 9800, 9.909159412887068e-05]
2023-01-22 00:42:32,247 48k INFO ====> Epoch: 73
2023-01-22 00:45:34,088 48k INFO ====> Epoch: 74
2023-01-22 00:46:46,365 48k INFO Train Epoch: 75 [7%]
2023-01-22 00:46:46,370 48k INFO [2.5185959339141846, 2.310060501098633, 6.430722713470459, 16.572755813598633, 0.772114634513855, 10000, 9.906682277864462e-05]
2023-01-22 00:47:13,131 48k INFO Saving model and optimizer state at iteration 75 to ./logs/48k/G_10000.pth
2023-01-22 00:47:15,685 48k INFO Saving model and optimizer state at iteration 75 to ./logs/48k/D_10000.pth
2023-01-22 00:49:08,154 48k INFO ====> Epoch: 75
2023-01-22 00:51:46,494 48k INFO Train Epoch: 76 [56%]
2023-01-22 00:51:46,500 48k INFO [2.4656147956848145, 2.074436902999878, 7.349031448364258, 18.18931007385254, 0.8241748809814453, 10200, 9.905443942579728e-05]
2023-01-22 00:52:40,008 48k INFO ====> Epoch: 76
2023-01-22 00:56:17,365 48k INFO ====> Epoch: 77
2023-01-22 00:57:57,092 48k INFO Train Epoch: 78 [4%]
2023-01-22 00:57:57,094 48k INFO [2.4762134552001953, 2.1492390632629395, 7.48077917098999, 19.839807510375977, 0.7054923176765442, 10400, 9.902967736366644e-05]
2023-01-22 00:59:52,412 48k INFO ====> Epoch: 78
2023-01-22 01:02:48,444 48k INFO Train Epoch: 79 [52%]
2023-01-22 01:02:48,445 48k INFO [2.3755106925964355, 2.1339473724365234, 8.485089302062988, 20.205278396606445, 0.8104638457298279, 10600, 9.901729865399597e-05]
2023-01-22 01:03:46,580 48k INFO ====> Epoch: 79
2023-01-22 01:07:19,853 48k INFO ====> Epoch: 80
2023-01-22 01:08:57,434 48k INFO Train Epoch: 81 [0%]
2023-01-22 01:08:57,435 48k INFO [2.456996440887451, 2.1419756412506104, 6.801480770111084, 17.714597702026367, 0.7562506198883057, 10800, 9.899254587647776e-05]
2023-01-22 01:10:56,892 48k INFO ====> Epoch: 81
2023-01-22 01:13:33,567 48k INFO Train Epoch: 82 [48%]
2023-01-22 01:13:33,573 48k INFO [2.6736533641815186, 2.158252716064453, 5.721027851104736, 15.55154037475586, 0.8178394436836243, 11000, 9.89801718082432e-05]
2023-01-22 01:13:59,842 48k INFO Saving model and optimizer state at iteration 82 to ./logs/48k/G_11000.pth
2023-01-22 01:14:02,509 48k INFO Saving model and optimizer state at iteration 82 to ./logs/48k/D_11000.pth
2023-01-22 01:15:05,842 48k INFO ====> Epoch: 82
2023-01-22 01:18:52,015 48k INFO Train Epoch: 83 [96%]
2023-01-22 01:18:52,016 48k INFO [2.4652676582336426, 2.307220697402954, 7.07650089263916, 20.759014129638672, 0.6611120104789734, 11200, 9.896779928676716e-05]
2023-01-22 01:18:56,345 48k INFO ====> Epoch: 83
2023-01-22 01:22:04,703 48k INFO ====> Epoch: 84
2023-01-22 01:24:35,176 48k INFO Train Epoch: 85 [44%]
2023-01-22 01:24:35,178 48k INFO [2.485337495803833, 2.2664029598236084, 6.920342922210693, 16.13712501525879, 0.8188708424568176, 11400, 9.894305888331732e-05]
2023-01-22 01:25:41,796 48k INFO ====> Epoch: 85
2023-01-22 01:29:22,272 48k INFO Train Epoch: 86 [93%]
2023-01-22 01:29:22,274 48k INFO [2.4194915294647217, 2.3636951446533203, 6.283056735992432, 14.88568115234375, 0.7501641511917114, 11600, 9.89306910009569e-05]
2023-01-22 01:29:31,140 48k INFO ====> Epoch: 86
2023-01-22 01:33:03,270 48k INFO ====> Epoch: 87
2023-01-22 01:35:32,163 48k INFO Train Epoch: 88 [41%]
2023-01-22 01:35:32,164 48k INFO [2.435405731201172, 2.164144515991211, 6.78734827041626, 16.678043365478516, 1.0764909982681274, 11800, 9.89059598739987e-05]
2023-01-22 01:36:43,232 48k INFO ====> Epoch: 88
2023-01-22 01:40:20,413 48k INFO Train Epoch: 89 [89%]
2023-01-22 01:40:20,414 48k INFO [2.571382761001587, 2.2321105003356934, 6.196994304656982, 18.037050247192383, 0.5229658484458923, 12000, 9.889359662901445e-05]
2023-01-22 01:40:49,853 48k INFO Saving model and optimizer state at iteration 89 to ./logs/48k/G_12000.pth
2023-01-22 01:40:52,531 48k INFO Saving model and optimizer state at iteration 89 to ./logs/48k/D_12000.pth
2023-01-22 01:41:07,463 48k INFO ====> Epoch: 89
2023-01-22 01:44:26,203 48k INFO ====> Epoch: 90
2023-01-22 01:46:25,928 48k INFO Train Epoch: 91 [37%]
2023-01-22 01:46:25,930 48k INFO [2.4212565422058105, 2.342421293258667, 8.24842643737793, 21.763324737548828, 0.8787412643432617, 12200, 9.886887477506964e-05]
2023-01-22 01:47:41,694 48k INFO ====> Epoch: 91
2023-01-22 01:50:59,209 48k INFO Train Epoch: 92 [85%]
2023-01-22 01:50:59,210 48k INFO [2.452944278717041, 2.280447244644165, 7.893123149871826, 19.17333984375, 0.6785156726837158, 12400, 9.885651616572276e-05]
2023-01-22 01:51:16,903 48k INFO ====> Epoch: 92
2023-01-22 01:55:14,849 48k INFO ====> Epoch: 93
2023-01-22 01:57:51,888 48k INFO Train Epoch: 94 [33%]
2023-01-22 01:57:51,889 48k INFO [2.491302728652954, 2.1840858459472656, 7.335545063018799, 16.308502197265625, 0.9185124635696411, 12600, 9.883180358131438e-05]
2023-01-22 01:59:11,869 48k INFO ====> Epoch: 94
2023-01-22 02:02:40,753 48k INFO Train Epoch: 95 [81%]
2023-01-22 02:02:40,754 48k INFO [2.3356051445007324, 2.415994167327881, 6.539595127105713, 15.313440322875977, 0.7552074193954468, 12800, 9.881944960586671e-05]
2023-01-22 02:03:02,726 48k INFO ====> Epoch: 95
2023-01-22 02:06:40,692 48k INFO ====> Epoch: 96
2023-01-22 02:08:33,947 48k INFO Train Epoch: 97 [30%]
2023-01-22 02:08:33,948 48k INFO [2.3028030395507812, 2.4014933109283447, 7.01709508895874, 19.628679275512695, 0.69573974609375, 13000, 9.879474628751914e-05]
2023-01-22 02:08:49,948 48k INFO Saving model and optimizer state at iteration 97 to ./logs/48k/G_13000.pth
2023-01-22 02:08:53,352 48k INFO Saving model and optimizer state at iteration 97 to ./logs/48k/D_13000.pth
2023-01-22 02:10:19,221 48k INFO ====> Epoch: 97
2023-01-22 02:13:25,123 48k INFO Train Epoch: 98 [78%]
2023-01-22 02:13:25,141 48k INFO [2.350069046020508, 2.531834602355957, 7.963974952697754, 19.812700271606445, 0.5810386538505554, 13200, 9.87823969442332e-05]
2023-01-22 02:13:51,617 48k INFO ====> Epoch: 98
2023-01-22 02:16:53,335 48k INFO ====> Epoch: 99
2023-01-22 02:18:54,678 48k INFO Train Epoch: 100 [26%]
2023-01-22 02:18:54,679 48k INFO [2.562602996826172, 2.017732620239258, 6.309353351593018, 15.973162651062012, 0.5981793999671936, 13400, 9.875770288847208e-05]
2023-01-22 02:20:23,643 48k INFO ====> Epoch: 100
2023-01-22 02:22:55,177 48k INFO Train Epoch: 101 [74%]
2023-01-22 02:22:55,193 48k INFO [2.321093797683716, 2.340337038040161, 8.523782730102539, 19.372026443481445, 0.8616945147514343, 13600, 9.874535817561101e-05]
2023-01-22 02:23:26,497 48k INFO ====> Epoch: 101
2023-01-22 02:26:37,269 48k INFO ====> Epoch: 102
2023-01-22 02:28:07,704 48k INFO Train Epoch: 103 [22%]
2023-01-22 02:28:07,705 48k INFO [2.397919178009033, 2.359201431274414, 6.399106979370117, 16.85429573059082, 1.105242133140564, 13800, 9.872067337896332e-05]
2023-01-22 02:29:41,115 48k INFO ====> Epoch: 103
2023-01-22 02:32:29,190 48k INFO Train Epoch: 104 [70%]
2023-01-22 02:32:29,192 48k INFO [2.48563551902771, 2.2279274463653564, 6.698295593261719, 15.254925727844238, 0.8275677561759949, 14000, 9.870833329479095e-05]
2023-01-22 02:32:59,897 48k INFO Saving model and optimizer state at iteration 104 to ./logs/48k/G_14000.pth
2023-01-22 02:33:02,360 48k INFO Saving model and optimizer state at iteration 104 to ./logs/48k/D_14000.pth
2023-01-22 02:33:38,969 48k INFO ====> Epoch: 104
2023-01-22 02:36:41,677 48k INFO ====> Epoch: 105
2023-01-22 02:38:17,152 48k INFO Train Epoch: 106 [19%]
2023-01-22 02:38:17,153 48k INFO [2.4486289024353027, 2.3451008796691895, 6.8948974609375, 15.702513694763184, 0.8337259292602539, 14200, 9.868365775378495e-05]
2023-01-22 02:39:54,419 48k INFO ====> Epoch: 106
2023-01-22 02:42:22,340 48k INFO Train Epoch: 107 [67%]
2023-01-22 02:42:22,342 48k INFO [2.4994194507598877, 2.3838698863983154, 7.747100353240967, 17.850509643554688, 0.974573016166687, 14400, 9.867132229656573e-05]
2023-01-22 02:43:02,250 48k INFO ====> Epoch: 107
2023-01-22 02:46:17,401 48k INFO ====> Epoch: 108
2023-01-22 02:47:40,188 48k INFO Train Epoch: 109 [15%]
2023-01-22 02:47:40,197 48k INFO [2.57458758354187, 2.2239394187927246, 7.012730121612549, 17.30144691467285, 0.6685440540313721, 14600, 9.864665600773098e-05]
2023-01-22 02:49:22,229 48k INFO ====> Epoch: 109
2023-01-22 02:51:53,799 48k INFO Train Epoch: 110 [63%]
2023-01-22 02:51:53,800 48k INFO [2.6348493099212646, 2.197437047958374, 6.645849227905273, 15.593059539794922, 0.7109649777412415, 14800, 9.863432517573002e-05]
2023-01-22 02:52:37,838 48k INFO ====> Epoch: 110
2023-01-22 02:55:38,603 48k INFO ====> Epoch: 111
2023-01-22 02:57:14,163 48k INFO Train Epoch: 112 [11%]
2023-01-22 02:57:14,165 48k INFO [2.4782354831695557, 2.25980544090271, 5.911761283874512, 14.526992797851562, 0.6271955370903015, 15000, 9.86096681355974e-05]
2023-01-22 02:57:29,237 48k INFO Saving model and optimizer state at iteration 112 to ./logs/48k/G_15000.pth
2023-01-22 02:57:31,828 48k INFO Saving model and optimizer state at iteration 112 to ./logs/48k/D_15000.pth
2023-01-22 02:59:19,878 48k INFO ====> Epoch: 112
2023-01-22 03:01:52,171 48k INFO Train Epoch: 113 [59%]
2023-01-22 03:01:52,173 48k INFO [2.622976064682007, 2.117339849472046, 4.659737586975098, 15.5582857131958, 0.6520732045173645, 15200, 9.859734192708044e-05]
2023-01-22 03:02:41,016 48k INFO ====> Epoch: 113
2023-01-22 03:06:12,624 48k INFO ====> Epoch: 114
2023-01-22 03:08:00,673 48k INFO Train Epoch: 115 [7%]
2023-01-22 03:08:00,674 48k INFO [2.4388582706451416, 2.26151704788208, 7.095875263214111, 17.430084228515625, 0.803480327129364, 15400, 9.857269413218213e-05]
2023-01-22 03:09:51,446 48k INFO ====> Epoch: 115
2023-01-22 03:12:25,202 48k INFO Train Epoch: 116 [56%]
2023-01-22 03:12:25,446 48k INFO [2.5684196949005127, 2.272275447845459, 7.055837631225586, 18.466167449951172, 0.6179978251457214, 15600, 9.85603725454156e-05]
2023-01-22 03:13:18,850 48k INFO ====> Epoch: 116
2023-01-22 03:16:18,149 48k INFO ====> Epoch: 117
2023-01-22 03:17:45,392 48k INFO Train Epoch: 118 [4%]
2023-01-22 03:17:45,393 48k INFO [2.5602006912231445, 2.2561113834381104, 7.903483867645264, 17.666229248046875, 0.6731774806976318, 15800, 9.853573399228505e-05]
2023-01-22 03:19:40,163 48k INFO ====> Epoch: 118
2023-01-22 03:21:46,082 48k INFO Train Epoch: 119 [52%]
2023-01-22 03:21:46,083 48k INFO [2.5184006690979004, 2.2383458614349365, 8.089900016784668, 20.32976531982422, 0.9312099814414978, 16000, 9.8523417025536e-05]
2023-01-22 03:22:07,357 48k INFO Saving model and optimizer state at iteration 119 to ./logs/48k/G_16000.pth
2023-01-22 03:22:10,666 48k INFO Saving model and optimizer state at iteration 119 to ./logs/48k/D_16000.pth
2023-01-22 03:23:09,987 48k INFO ====> Epoch: 119
2023-01-22 03:26:34,576 48k INFO ====> Epoch: 120
2023-01-22 03:27:49,350 48k INFO Train Epoch: 121 [0%]
2023-01-22 03:27:49,351 48k INFO [2.560426712036133, 2.2733850479125977, 5.619736194610596, 15.718830108642578, 0.7331914901733398, 16200, 9.8498787710708e-05]
2023-01-22 03:29:48,533 48k INFO ====> Epoch: 121
2023-01-22 03:31:55,312 48k INFO Train Epoch: 122 [48%]
2023-01-22 03:31:55,314 48k INFO [2.449033260345459, 2.320108413696289, 7.434198379516602, 16.732603073120117, 0.9057919383049011, 16400, 9.848647536224416e-05]
2023-01-22 03:32:57,118 48k INFO ====> Epoch: 122
2023-01-22 03:36:24,925 48k INFO Train Epoch: 123 [96%]
2023-01-22 03:36:24,927 48k INFO [2.3237009048461914, 2.319559097290039, 8.862730026245117, 20.4947452545166, 0.8726757168769836, 16600, 9.847416455282387e-05]
2023-01-22 03:36:29,358 48k INFO ====> Epoch: 123
2023-01-22 03:39:30,125 48k INFO ====> Epoch: 124
2023-01-22 03:41:37,520 48k INFO Train Epoch: 125 [44%]
2023-01-22 03:41:37,522 48k INFO [2.5762555599212646, 2.3611881732940674, 6.96412992477417, 16.476716995239258, 0.9901931285858154, 16800, 9.84495475503445e-05]
2023-01-22 03:42:44,310 48k INFO ====> Epoch: 125
2023-01-22 03:45:43,202 48k INFO Train Epoch: 126 [93%]
2023-01-22 03:45:43,292 48k INFO [2.5661556720733643, 2.294110059738159, 6.857804775238037, 17.081287384033203, 0.47358566522598267, 17000, 9.84372413569007e-05]
2023-01-22 03:46:08,438 48k INFO Saving model and optimizer state at iteration 126 to ./logs/48k/G_17000.pth
2023-01-22 03:46:11,266 48k INFO Saving model and optimizer state at iteration 126 to ./logs/48k/D_17000.pth
2023-01-22 03:46:22,122 48k INFO ====> Epoch: 126
2023-01-22 03:49:23,103 48k INFO ====> Epoch: 127
2023-01-22 03:51:42,359 48k INFO Train Epoch: 128 [41%]
2023-01-22 03:51:42,360 48k INFO [2.4992475509643555, 2.2112784385681152, 6.907090187072754, 17.81713104248047, 0.5233902335166931, 17200, 9.841263358464336e-05]
2023-01-22 03:52:53,164 48k INFO ====> Epoch: 128
2023-01-22 03:55:44,823 48k INFO Train
Epoch: 129 [89%] 2023-01-22 03:55:44,825 48k INFO [2.517200469970703, 2.223841428756714, 7.324708461761475, 17.107942581176758, 0.609559178352356, 17400, 9.840033200544528e-05] 2023-01-22 03:55:58,119 48k INFO ====> Epoch: 129 2023-01-22 03:59:26,598 48k INFO ====> Epoch: 130 2023-01-22 04:01:23,703 48k INFO Train Epoch: 131 [37%] 2023-01-22 04:01:23,704 48k INFO [2.5788629055023193, 2.0731279850006104, 7.313710689544678, 18.551986694335938, 0.9027882814407349, 17600, 9.837573345994909e-05] 2023-01-22 04:02:39,334 48k INFO ====> Epoch: 131 2023-01-22 04:05:46,683 48k INFO Train Epoch: 132 [85%] 2023-01-22 04:05:46,684 48k INFO [2.3986008167266846, 2.526988983154297, 7.2613325119018555, 17.647071838378906, 0.9513104557991028, 17800, 9.836343649326659e-05] 2023-01-22 04:06:04,633 48k INFO ====> Epoch: 132 2023-01-22 04:09:09,961 48k INFO ====> Epoch: 133 2023-01-22 04:11:03,854 48k INFO Train Epoch: 134 [33%] 2023-01-22 04:11:03,856 48k INFO [2.418104887008667, 2.254361152648926, 8.838530540466309, 19.83738136291504, 0.8438720107078552, 18000, 9.833884717107196e-05] 2023-01-22 04:11:21,496 48k INFO Saving model and optimizer state at iteration 134 to ./logs/48k/G_18000.pth 2023-01-22 04:11:24,200 48k INFO Saving model and optimizer state at iteration 134 to ./logs/48k/D_18000.pth 2023-01-22 04:12:47,037 48k INFO ====> Epoch: 134 2023-01-22 04:15:58,661 48k INFO Train Epoch: 135 [81%] 2023-01-22 04:15:58,663 48k INFO [2.260188102722168, 2.641092300415039, 7.994870185852051, 18.271020889282227, 0.7274472713470459, 18200, 9.832655481517557e-05] 2023-01-22 04:16:20,712 48k INFO ====> Epoch: 135 2023-01-22 04:19:23,686 48k INFO ====> Epoch: 136 2023-01-22 04:21:19,451 48k INFO Train Epoch: 137 [30%] 2023-01-22 04:21:19,454 48k INFO [2.3936800956726074, 2.272400379180908, 7.838564395904541, 18.13138198852539, 0.7771925926208496, 18400, 9.830197471282419e-05] 2023-01-22 04:22:43,687 48k INFO ====> Epoch: 137 2023-01-22 04:25:23,172 48k INFO Train Epoch: 138 [78%] 2023-01-22 
04:25:23,175 48k INFO [2.3418500423431396, 2.629686117172241, 8.626249313354492, 17.842092514038086, 0.6093645095825195, 18600, 9.828968696598508e-05] 2023-01-22 04:25:49,822 48k INFO ====> Epoch: 138 2023-01-22 04:29:07,330 48k INFO ====> Epoch: 139 2023-01-22 04:30:39,199 48k INFO Train Epoch: 140 [26%] 2023-01-22 04:30:39,203 48k INFO [2.529461145401001, 2.4735360145568848, 6.89732027053833, 16.554645538330078, 0.7939026355743408, 18800, 9.826511608001993e-05] 2023-01-22 04:32:07,773 48k INFO ====> Epoch: 140 2023-01-22 04:34:44,953 48k INFO Train Epoch: 141 [74%] 2023-01-22 04:34:44,954 48k INFO [2.5293631553649902, 2.286747694015503, 6.57271671295166, 18.564292907714844, 0.7098046541213989, 19000, 9.825283294050992e-05] 2023-01-22 04:35:03,028 48k INFO Saving model and optimizer state at iteration 141 to ./logs/48k/G_19000.pth 2023-01-22 04:35:05,948 48k INFO Saving model and optimizer state at iteration 141 to ./logs/48k/D_19000.pth 2023-01-22 04:35:39,124 48k INFO ====> Epoch: 141 2023-01-22 04:38:42,381 48k INFO ====> Epoch: 142 2023-01-22 04:40:14,090 48k INFO Train Epoch: 143 [22%] 2023-01-22 04:40:14,094 48k INFO [2.3879966735839844, 2.299771547317505, 8.37502384185791, 19.9527587890625, 0.8313977718353271, 19200, 9.822827126747529e-05] 2023-01-22 04:41:47,200 48k INFO ====> Epoch: 143 2023-01-22 04:44:02,755 48k INFO Train Epoch: 144 [70%] 2023-01-22 04:44:02,757 48k INFO [2.6654202938079834, 2.2383499145507812, 6.162298679351807, 15.618973731994629, 0.5648946166038513, 19400, 9.821599273356685e-05] 2023-01-22 04:44:38,342 48k INFO ====> Epoch: 144 2023-01-22 04:47:52,558 48k INFO ====> Epoch: 145 2023-01-22 04:49:22,168 48k INFO Train Epoch: 146 [19%] 2023-01-22 04:49:22,169 48k INFO [2.472923755645752, 2.2588765621185303, 6.5103559494018555, 17.332674026489258, 0.7001518607139587, 19600, 9.819144027000834e-05] 2023-01-22 04:50:59,896 48k INFO ====> Epoch: 146 2023-01-22 04:53:24,755 48k INFO Train Epoch: 147 [67%] 2023-01-22 04:53:24,785 48k INFO 
[2.477975606918335, 2.292552947998047, 7.11348295211792, 17.6319580078125, 0.7713624835014343, 19800, 9.817916633997459e-05] 2023-01-22 04:54:04,754 48k INFO ====> Epoch: 147 2023-01-22 04:57:06,726 48k INFO ====> Epoch: 148 2023-01-22 04:58:37,525 48k INFO Train Epoch: 149 [15%] 2023-01-22 04:58:37,526 48k INFO [2.451296091079712, 2.3346211910247803, 7.485963344573975, 18.380931854248047, 0.8386195302009583, 20000, 9.815462308243906e-05] 2023-01-22 04:58:59,371 48k INFO Saving model and optimizer state at iteration 149 to ./logs/48k/G_20000.pth 2023-01-22 04:59:02,518 48k INFO Saving model and optimizer state at iteration 149 to ./logs/48k/D_20000.pth 2023-01-22 05:00:46,706 48k INFO ====> Epoch: 149 2023-01-22 05:03:24,601 48k INFO Train Epoch: 150 [63%] 2023-01-22 05:03:24,602 48k INFO [2.5784292221069336, 2.1291050910949707, 5.19167423248291, 12.592987060546875, 0.6221301555633545, 20200, 9.814235375455375e-05] 2023-01-22 05:04:09,009 48k INFO ====> Epoch: 150 2023-01-22 05:07:31,491 48k INFO ====> Epoch: 151 2023-01-22 05:08:45,403 48k INFO Train Epoch: 152 [11%] 2023-01-22 05:08:45,404 48k INFO [2.618713855743408, 2.0036542415618896, 6.612338066101074, 17.011075973510742, 0.8388224840164185, 20400, 9.811781969958938e-05] 2023-01-22 05:10:32,102 48k INFO ====> Epoch: 152 2023-01-22 05:13:08,615 48k INFO Train Epoch: 153 [59%] 2023-01-22 05:13:08,640 48k INFO [2.527005434036255, 2.328071117401123, 5.690831184387207, 15.966653823852539, 1.0628108978271484, 20600, 9.810555497212693e-05] 2023-01-22 05:13:57,316 48k INFO ====> Epoch: 153 2023-01-22 05:17:05,674 48k INFO ====> Epoch: 154 2023-01-22 05:18:28,797 48k INFO Train Epoch: 155 [7%] 2023-01-22 05:18:28,798 48k INFO [2.5062482357025146, 2.3023681640625, 7.1140618324279785, 16.61337661743164, 0.7241228818893433, 20800, 9.808103011628319e-05] 2023-01-22 05:20:19,887 48k INFO ====> Epoch: 155 2023-01-22 05:22:24,741 48k INFO Train Epoch: 156 [56%] 2023-01-22 05:22:24,742 48k INFO [2.3822405338287354, 
2.386378049850464, 7.236391544342041, 19.511474609375, 0.6870246529579163, 21000, 9.806876998751865e-05] 2023-01-22 05:22:46,685 48k INFO Saving model and optimizer state at iteration 156 to ./logs/48k/G_21000.pth 2023-01-22 05:22:49,650 48k INFO Saving model and optimizer state at iteration 156 to ./logs/48k/D_21000.pth 2023-01-22 05:23:44,387 48k INFO ====> Epoch: 156 2023-01-22 05:26:52,570 48k INFO ====> Epoch: 157 2023-01-22 05:28:24,275 48k INFO Train Epoch: 158 [4%] 2023-01-22 05:28:24,276 48k INFO [2.4722728729248047, 2.126098155975342, 7.6077423095703125, 18.62118148803711, 0.99229496717453, 21200, 9.804425432734629e-05] 2023-01-22 05:30:19,346 48k INFO ====> Epoch: 158 2023-01-22 05:32:23,823 48k INFO Train Epoch: 159 [52%] 2023-01-22 05:32:23,825 48k INFO [2.323911666870117, 2.3990256786346436, 8.89924144744873, 19.531360626220703, 0.8525999188423157, 21400, 9.803199879555537e-05] 2023-01-22 05:33:21,508 48k INFO ====> Epoch: 159 2023-01-22 05:36:52,141 48k INFO ====> Epoch: 160 2023-01-22 05:38:15,406 48k INFO Train Epoch: 161 [0%] 2023-01-22 05:38:15,413 48k INFO [2.641345977783203, 2.1503939628601074, 7.0000200271606445, 15.121925354003906, 0.8041406273841858, 21600, 9.800749232760646e-05] 2023-01-22 05:40:15,081 48k INFO ====> Epoch: 161 2023-01-22 05:42:19,218 48k INFO Train Epoch: 162 [48%] 2023-01-22 05:42:19,220 48k INFO [2.3542985916137695, 2.4592692852020264, 7.556517124176025, 18.219379425048828, 0.9150540828704834, 21800, 9.79952413910655e-05] 2023-01-22 05:43:21,676 48k INFO ====> Epoch: 162 2023-01-22 05:46:26,523 48k INFO Train Epoch: 163 [96%] 2023-01-22 05:46:26,525 48k INFO [2.327195167541504, 2.5564873218536377, 8.502983093261719, 19.635250091552734, 0.7463013529777527, 22000, 9.798299198589162e-05] 2023-01-22 05:46:47,804 48k INFO Saving model and optimizer state at iteration 163 to ./logs/48k/G_22000.pth 2023-01-22 05:46:51,118 48k INFO Saving model and optimizer state at iteration 163 to ./logs/48k/D_22000.pth 2023-01-22 
05:46:57,677 48k INFO ====> Epoch: 163 2023-01-22 05:50:10,137 48k INFO ====> Epoch: 164 2023-01-22 05:52:07,789 48k INFO Train Epoch: 165 [44%] 2023-01-22 05:52:07,790 48k INFO [2.4798994064331055, 2.2589170932769775, 7.685574054718018, 17.81836700439453, 0.7918965816497803, 22200, 9.795849776887939e-05] 2023-01-22 05:53:14,326 48k INFO ====> Epoch: 165 2023-01-22 05:56:23,033 48k INFO Train Epoch: 166 [93%] 2023-01-22 05:56:23,034 48k INFO [2.520526647567749, 2.271491527557373, 7.116887092590332, 17.537757873535156, 0.8250789642333984, 22400, 9.794625295665828e-05] 2023-01-22 05:56:31,973 48k INFO ====> Epoch: 166 2023-01-22 05:59:43,319 48k INFO ====> Epoch: 167 2023-01-22 06:01:36,738 48k INFO Train Epoch: 168 [41%] 2023-01-22 06:01:36,740 48k INFO [2.4691922664642334, 2.1712820529937744, 6.529301643371582, 19.012008666992188, 0.8846498727798462, 22600, 9.792176792382932e-05] 2023-01-22 06:02:48,139 48k INFO ====> Epoch: 168 2023-01-22 06:06:05,410 48k INFO Train Epoch: 169 [89%] 2023-01-22 06:06:05,411 48k INFO [2.375034809112549, 2.7028026580810547, 7.033033847808838, 17.064311981201172, 0.822642982006073, 22800, 9.790952770283884e-05] 2023-01-22 06:06:18,955 48k INFO ====> Epoch: 169 2023-01-22 06:09:28,166 48k INFO ====> Epoch: 170 2023-01-22 06:11:44,860 48k INFO Train Epoch: 171 [37%] 2023-01-22 06:11:44,861 48k INFO [2.439336061477661, 2.248835325241089, 7.995594024658203, 18.637996673583984, 0.8797584176063538, 23000, 9.78850518507495e-05] 2023-01-22 06:12:04,729 48k INFO Saving model and optimizer state at iteration 171 to ./logs/48k/G_23000.pth 2023-01-22 06:12:07,642 48k INFO Saving model and optimizer state at iteration 171 to ./logs/48k/D_23000.pth 2023-01-22 06:13:24,506 48k INFO ====> Epoch: 171 2023-01-22 06:16:05,121 48k INFO Train Epoch: 172 [85%] 2023-01-22 06:16:05,122 48k INFO [2.402244806289673, 2.406243324279785, 7.594982624053955, 19.036361694335938, 0.7359146475791931, 23200, 9.787281621926815e-05] 2023-01-22 06:16:22,798 48k INFO ====> 
Epoch: 172 2023-01-22 06:19:37,403 48k INFO ====> Epoch: 173 2023-01-22 06:21:19,497 48k INFO Train Epoch: 174 [33%] 2023-01-22 06:21:19,498 48k INFO [2.4317710399627686, 2.3903510570526123, 6.976230144500732, 17.176145553588867, 1.0084552764892578, 23400, 9.784834954447608e-05] 2023-01-22 06:22:39,312 48k INFO ====> Epoch: 174 2023-01-22 06:25:43,459 48k INFO Train Epoch: 175 [81%] 2023-01-22 06:25:43,460 48k INFO [2.2472660541534424, 2.5740716457366943, 7.536203384399414, 16.73565101623535, 0.7055643200874329, 23600, 9.783611850078301e-05] 2023-01-22 06:26:05,688 48k INFO ====> Epoch: 175 2023-01-22 06:29:06,672 48k INFO ====> Epoch: 176 2023-01-22 06:31:06,589 48k INFO Train Epoch: 177 [30%] 2023-01-22 06:31:06,595 48k INFO [2.3501670360565186, 2.364131450653076, 8.117849349975586, 16.83472442626953, 0.5952101945877075, 23800, 9.781166099984716e-05] 2023-01-22 06:32:30,889 48k INFO ====> Epoch: 177 2023-01-22 06:35:21,985 48k INFO Train Epoch: 178 [78%] 2023-01-22 06:35:21,987 48k INFO [2.3780722618103027, 2.3098182678222656, 7.440067768096924, 18.17629623413086, 0.8502250909805298, 24000, 9.779943454222217e-05] 2023-01-22 06:35:57,969 48k INFO Saving model and optimizer state at iteration 178 to ./logs/48k/G_24000.pth 2023-01-22 06:36:00,651 48k INFO Saving model and optimizer state at iteration 178 to ./logs/48k/D_24000.pth 2023-01-22 06:36:28,480 48k INFO ====> Epoch: 178 2023-01-22 06:40:14,382 48k INFO ====> Epoch: 179 2023-01-22 06:41:50,808 48k INFO Train Epoch: 180 [26%] 2023-01-22 06:41:50,838 48k INFO [2.493206262588501, 2.3761775493621826, 8.050931930541992, 18.229337692260742, 0.7468435764312744, 24200, 9.777498621170277e-05] 2023-01-22 06:43:19,748 48k INFO ====> Epoch: 180 2023-01-22 06:46:25,738 48k INFO Train Epoch: 181 [74%] 2023-01-22 06:46:25,739 48k INFO [2.3736653327941895, 2.211493730545044, 9.023890495300293, 17.625791549682617, 0.8236864805221558, 24400, 9.776276433842631e-05] 2023-01-22 06:46:56,778 48k INFO ====> Epoch: 181 2023-01-22 
06:50:11,644 48k INFO ====> Epoch: 182 2023-01-22 06:52:04,368 48k INFO Train Epoch: 183 [22%] 2023-01-22 06:52:04,369 48k INFO [2.430777072906494, 2.2571747303009033, 7.499727725982666, 16.950727462768555, 0.7711045742034912, 24600, 9.773832517488488e-05] 2023-01-22 06:53:37,493 48k INFO ====> Epoch: 183 2023-01-22 06:56:54,190 48k INFO Train Epoch: 184 [70%] 2023-01-22 06:56:54,192 48k INFO [2.542609453201294, 2.189286708831787, 6.360416889190674, 14.887353897094727, 0.44517409801483154, 24800, 9.772610788423802e-05] 2023-01-22 06:57:29,412 48k INFO ====> Epoch: 184 2023-01-22 07:00:44,464 48k INFO ====> Epoch: 185 2023-01-22 07:02:43,882 48k INFO Train Epoch: 186 [19%] 2023-01-22 07:02:43,883 48k INFO [2.5294599533081055, 2.1700663566589355, 6.489031791687012, 14.84936237335205, 0.6636635661125183, 25000, 9.77016778842374e-05] 2023-01-22 07:03:00,614 48k INFO Saving model and optimizer state at iteration 186 to ./logs/48k/G_25000.pth 2023-01-22 07:03:04,222 48k INFO Saving model and optimizer state at iteration 186 to ./logs/48k/D_25000.pth 2023-01-22 07:04:43,434 48k INFO ====> Epoch: 186 2023-01-22 07:07:36,975 48k INFO Train Epoch: 187 [67%] 2023-01-22 07:07:37,079 48k INFO [2.5327281951904297, 2.2112808227539062, 6.651467800140381, 15.817968368530273, 0.6885758638381958, 25200, 9.768946517450186e-05] 2023-01-22 07:08:17,329 48k INFO ====> Epoch: 187 2023-01-22 07:11:50,617 48k INFO ====> Epoch: 188 2023-01-22 07:13:40,322 48k INFO Train Epoch: 189 [15%] 2023-01-22 07:13:40,324 48k INFO [2.481234550476074, 2.1767075061798096, 8.824145317077637, 18.01640510559082, 0.8733751773834229, 25400, 9.766504433460612e-05] 2023-01-22 07:15:22,278 48k INFO ====> Epoch: 189 2023-01-22 07:18:15,904 48k INFO Train Epoch: 190 [63%] 2023-01-22 07:18:15,911 48k INFO [2.6454858779907227, 2.208667039871216, 5.610787868499756, 13.372602462768555, 0.38875046372413635, 25600, 9.765283620406429e-05] 2023-01-22 07:19:00,409 48k INFO ====> Epoch: 190 2023-01-22 07:22:33,376 48k INFO 
====> Epoch: 191 2023-01-22 07:23:58,497 48k INFO Train Epoch: 192 [11%] 2023-01-22 07:23:58,525 48k INFO [2.545492649078369, 2.1973745822906494, 4.482292175292969, 13.961978912353516, 0.5211648941040039, 25800, 9.762842452083883e-05] 2023-01-22 07:25:44,555 48k INFO ====> Epoch: 192 2023-01-22 07:28:39,112 48k INFO Train Epoch: 193 [59%] 2023-01-22 07:28:39,113 48k INFO [2.532291889190674, 2.4437904357910156, 6.170163631439209, 14.199174880981445, 0.697663426399231, 26000, 9.761622096777372e-05] 2023-01-22 07:29:14,337 48k INFO Saving model and optimizer state at iteration 193 to ./logs/48k/G_26000.pth 2023-01-22 07:29:17,227 48k INFO Saving model and optimizer state at iteration 193 to ./logs/48k/D_26000.pth 2023-01-22 07:30:07,411 48k INFO ====> Epoch: 193 2023-01-22 07:33:51,823 48k INFO ====> Epoch: 194 2023-01-22 07:35:31,596 48k INFO Train Epoch: 195 [7%] 2023-01-22 07:35:31,597 48k INFO [2.4549500942230225, 2.381260633468628, 7.578873157501221, 17.375896453857422, 0.9950110912322998, 26200, 9.759181843778522e-05] 2023-01-22 07:37:22,445 48k INFO ====> Epoch: 195 2023-01-22 07:40:13,760 48k INFO Train Epoch: 196 [56%] 2023-01-22 07:40:13,762 48k INFO [2.5595929622650146, 2.324709892272949, 6.432162761688232, 17.188098907470703, 0.7180476188659668, 26400, 9.757961946048049e-05] 2023-01-22 07:41:06,996 48k INFO ====> Epoch: 196 2023-01-22 07:44:24,869 48k INFO ====> Epoch: 197 2023-01-22 07:46:05,385 48k INFO Train Epoch: 198 [4%] 2023-01-22 07:46:05,386 48k INFO [2.39399790763855, 2.356853723526001, 7.550986289978027, 19.06048583984375, 0.7446596622467041, 26600, 9.755522608029692e-05] 2023-01-22 07:48:01,263 48k INFO ====> Epoch: 198 2023-01-22 07:50:05,942 48k INFO Train Epoch: 199 [52%] 2023-01-22 07:50:05,944 48k INFO [2.4923439025878906, 2.1890056133270264, 7.604694843292236, 19.055347442626953, 1.066550612449646, 26800, 9.754303167703689e-05] 2023-01-22 07:51:03,420 48k INFO ====> Epoch: 199 2023-01-22 07:54:34,337 48k INFO ====> Epoch: 200 2023-01-22 
07:55:58,198 48k INFO Train Epoch: 201 [0%] 2023-01-22 07:55:58,221 48k INFO [2.4305946826934814, 2.455082893371582, 8.23879337310791, 18.429059982299805, 0.6871993541717529, 27000, 9.75186474432275e-05] 2023-01-22 07:56:15,581 48k INFO Saving model and optimizer state at iteration 201 to ./logs/48k/G_27000.pth 2023-01-22 07:56:18,369 48k INFO Saving model and optimizer state at iteration 201 to ./logs/48k/D_27000.pth 2023-01-22 07:58:19,724 48k INFO ====> Epoch: 201 2023-01-22 08:00:43,977 48k INFO Train Epoch: 202 [48%] 2023-01-22 08:00:43,978 48k INFO [2.4814980030059814, 2.2718801498413086, 7.460274696350098, 15.51793384552002, 0.7232438325881958, 27200, 9.750645761229709e-05] 2023-01-22 08:01:45,988 48k INFO ====> Epoch: 202 2023-01-22 08:05:15,421 48k INFO Train Epoch: 203 [96%] 2023-01-22 08:05:15,422 48k INFO [2.285353183746338, 2.3923587799072266, 9.981049537658691, 20.62666893005371, 0.6629798412322998, 27400, 9.749426930509556e-05] 2023-01-22 08:05:19,999 48k INFO ====> Epoch: 203 2023-01-22 08:08:56,917 48k INFO ====> Epoch: 204 2023-01-22 08:11:27,688 48k INFO Train Epoch: 205 [44%] 2023-01-22 08:11:27,690 48k INFO [2.2753095626831055, 2.734102487564087, 7.607904434204102, 16.576171875, 0.776640772819519, 27600, 9.746989726111722e-05] 2023-01-22 08:12:34,096 48k INFO ====> Epoch: 205 2023-01-22 08:15:45,225 48k INFO Train Epoch: 206 [93%] 2023-01-22 08:15:45,233 48k INFO [2.3811511993408203, 2.363701820373535, 8.00710391998291, 17.106149673461914, 0.5795164108276367, 27800, 9.745771352395957e-05] 2023-01-22 08:15:54,040 48k INFO ====> Epoch: 206 2023-01-22 08:19:20,566 48k INFO ====> Epoch: 207 2023-01-22 08:21:15,161 48k INFO Train Epoch: 208 [41%] 2023-01-22 08:21:15,192 48k INFO [2.2225372791290283, 2.6000564098358154, 9.317414283752441, 19.5484619140625, 0.7637076377868652, 28000, 9.743335061835535e-05] 2023-01-22 08:21:41,786 48k INFO Saving model and optimizer state at iteration 208 to ./logs/48k/G_28000.pth 2023-01-22 08:21:44,570 48k INFO 
Saving model and optimizer state at iteration 208 to ./logs/48k/D_28000.pth 2023-01-22 08:22:57,589 48k INFO ====> Epoch: 208 2023-01-22 08:26:16,281 48k INFO Train Epoch: 209 [89%] 2023-01-22 08:26:16,282 48k INFO [2.5466623306274414, 2.280019998550415, 6.366764545440674, 16.429372787475586, 0.7919825315475464, 28200, 9.742117144952805e-05] 2023-01-22 08:26:29,799 48k INFO ====> Epoch: 209 2023-01-22 08:29:44,574 48k INFO ====> Epoch: 210 2023-01-22 08:32:08,252 48k INFO Train Epoch: 211 [37%] 2023-01-22 08:32:08,254 48k INFO [2.543309211730957, 2.161836624145508, 6.711134433746338, 17.572219848632812, 0.6301430463790894, 28400, 9.739681767887146e-05] 2023-01-22 08:33:23,796 48k INFO ====> Epoch: 211 2023-01-22 08:36:44,667 48k INFO Train Epoch: 212 [85%] 2023-01-22 08:36:44,668 48k INFO [2.321545124053955, 2.4296023845672607, 8.030949592590332, 18.253841400146484, 0.620351254940033, 28600, 9.73846430766616e-05] 2023-01-22 08:37:02,314 48k INFO ====> Epoch: 212 2023-01-22 08:40:11,674 48k INFO ====> Epoch: 213 2023-01-22 08:42:27,449 48k INFO Train Epoch: 214 [33%] 2023-01-22 08:42:27,451 48k INFO [2.2797422409057617, 2.548719644546509, 7.825155735015869, 17.908039093017578, 0.657192587852478, 28800, 9.736029843752747e-05] 2023-01-22 08:43:48,064 48k INFO ====> Epoch: 214 2023-01-22 08:46:37,239 48k INFO Train Epoch: 215 [81%] 2023-01-22 08:46:37,240 48k INFO [2.3596415519714355, 2.441544771194458, 7.363457679748535, 18.221845626831055, 0.5869831442832947, 29000, 9.734812840022278e-05] 2023-01-22 08:47:02,494 48k INFO Saving model and optimizer state at iteration 215 to ./logs/48k/G_29000.pth 2023-01-22 08:47:05,383 48k INFO Saving model and optimizer state at iteration 215 to ./logs/48k/D_29000.pth 2023-01-22 08:47:29,131 48k INFO ====> Epoch: 215 2023-01-22 08:50:44,141 48k INFO ====> Epoch: 216 2023-01-22 08:52:51,470 48k INFO Train Epoch: 217 [30%] 2023-01-22 08:52:51,471 48k INFO [2.516904830932617, 2.1090333461761475, 7.781053066253662, 17.962955474853516, 
0.7437824010848999, 29200, 9.732379288918723e-05] 2023-01-22 08:54:15,836 48k INFO ====> Epoch: 217 2023-01-22 08:56:55,327 48k INFO Train Epoch: 218 [78%] 2023-01-22 08:56:55,328 48k INFO [2.2108588218688965, 2.646613359451294, 8.852120399475098, 19.39406967163086, 0.7638500928878784, 29400, 9.731162741507607e-05] 2023-01-22 08:57:21,961 48k INFO ====> Epoch: 218 2023-01-22 09:01:01,627 48k INFO ====> Epoch: 219 2023-01-22 09:13:49,133 48k INFO {'train': {'log_interval': 200, 'eval_interval': 1000, 'seed': 1234, 'epochs': 10000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 12, 'fp16_run': False, 'lr_decay': 0.999875, 'segment_size': 17920, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0, 'use_sr': True, 'max_speclen': 384, 'port': '8001'}, 'data': {'training_files': 'filelists/train.txt', 'validation_files': 'filelists/val.txt', 'max_wav_value': 32768.0, 'sampling_rate': 48000, 'filter_length': 1280, 'hop_length': 320, 'win_length': 1280, 'n_mel_channels': 80, 'mel_fmin': 0.0, 'mel_fmax': None}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [10, 8, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4], 'n_layers_q': 3, 'use_spectral_norm': False, 'gin_channels': 256, 'ssl_dim': 256, 'n_speakers': 2}, 'spk': {'Asaki': 0}, 'model_dir': './logs/48k'} 2023-01-22 09:13:49,135 48k WARNING /root/so-vits-svc is not a git repository, therefore hash value comparison will be ignored. 
2023-01-22 09:14:46,147 48k INFO Loaded checkpoint './logs/48k/G_29000.pth' (iteration 215)
2023-01-22 09:14:48,694 48k INFO Loaded checkpoint './logs/48k/D_29000.pth' (iteration 215)
2023-01-22 09:18:48,036 48k INFO Train Epoch: 215 [81%]
2023-01-22 09:18:48,037 48k INFO [2.3785886764526367, 2.3731846809387207, 7.397822856903076, 17.803794860839844, 0.4226111173629761, 29000, 9.733595988417275e-05]
2023-01-22 09:19:34,338 48k INFO Saving model and optimizer state at iteration 215 to ./logs/48k/G_29000.pth
2023-01-22 09:19:37,315 48k INFO Saving model and optimizer state at iteration 215 to ./logs/48k/D_29000.pth
2023-01-22 09:20:18,049 48k INFO ====> Epoch: 215
2023-01-22 09:23:57,137 48k INFO ====> Epoch: 216
2023-01-22 09:25:29,848 48k INFO Train Epoch: 217 [30%]
2023-01-22 09:25:29,863 48k INFO [2.420741081237793, 2.4384684562683105, 7.4001922607421875, 16.251575469970703, 0.6255336999893188, 29200, 9.731162741507607e-05]
2023-01-22 09:26:55,195 48k INFO ====> Epoch: 217
2023-01-22 09:29:40,961 48k INFO Train Epoch: 218 [78%]
2023-01-22 09:29:41,220 48k INFO [2.289294958114624, 2.531303644180298, 9.353946685791016, 19.492435455322266, 0.6052765250205994, 29400, 9.729946346164919e-05]
2023-01-22 09:30:07,987 48k INFO ====> Epoch: 218
2023-01-22 09:33:08,812 48k INFO ====> Epoch: 219
2023-01-22 09:35:05,368 48k INFO Train Epoch: 220 [26%]
2023-01-22 09:35:05,369 48k INFO [2.6419517993927, 2.1512279510498047, 6.031922817230225, 15.157461166381836, 0.6757804155349731, 29600, 9.727514011608789e-05]
2023-01-22 09:36:34,617 48k INFO ====> Epoch: 220
2023-01-22 09:39:05,353 48k INFO Train Epoch: 221 [74%]
2023-01-22 09:39:05,355 48k INFO [2.4989914894104004, 2.185439109802246, 6.980371475219727, 15.880250930786133, 0.7117370963096619, 29800, 9.726298072357337e-05]
2023-01-22 09:39:36,627 48k INFO ====> Epoch: 221
2023-01-22 09:42:45,717 48k INFO ====> Epoch: 222
2023-01-22 09:44:23,393 48k INFO Train Epoch: 223 [22%]
2023-01-22 09:44:23,395 48k INFO [2.2948989868164062, 2.3704075813293457, 7.319271564483643, 19.12636375427246, 0.8146140575408936, 30000, 9.723866649812655e-05]
2023-01-22 09:44:39,740 48k INFO Saving model and optimizer state at iteration 223 to ./logs/48k/G_30000.pth
2023-01-22 09:44:42,605 48k INFO Saving model and optimizer state at iteration 223 to ./logs/48k/D_30000.pth
2023-01-22 09:46:18,352 48k INFO ====> Epoch: 223
2023-01-22 09:50:01,687 48k INFO Train Epoch: 224 [70%]
2023-01-22 09:50:01,689 48k INFO [2.562938690185547, 2.395346164703369, 4.526849269866943, 11.443775177001953, 0.31155121326446533, 30200, 9.722651166481428e-05]
2023-01-22 09:50:37,193 48k INFO ====> Epoch: 224
2023-01-22 09:56:50,450 48k INFO ====> Epoch: 225
2023-01-22 10:00:40,730 48k INFO Train Epoch: 226 [19%]
2023-01-22 10:00:40,732 48k INFO [2.507101058959961, 2.5883829593658447, 6.81978702545166, 16.16312599182129, 0.6929633617401123, 30400, 9.720220655606233e-05]
2023-01-22 10:02:18,626 48k INFO ====> Epoch: 226
2023-01-22 10:06:52,036 48k INFO Train Epoch: 227 [67%]
2023-01-22 10:06:52,038 48k INFO [2.43876051902771, 2.2823843955993652, 7.320391654968262, 15.781881332397461, 0.8624737858772278, 30600, 9.719005628024282e-05]
2023-01-22 10:07:31,989 48k INFO ====> Epoch: 227
2023-01-22 10:12:36,663 48k INFO ====> Epoch: 228
2023-01-22 10:15:21,989 48k INFO Train Epoch: 229 [15%]
2023-01-22 10:15:21,991 48k INFO [2.380829095840454, 2.3975629806518555, 7.482331275939941, 17.6983585357666, 0.673857569694519, 30800, 9.716576028476738e-05]
2023-01-22 10:17:04,845 48k INFO ====> Epoch: 229
2023-01-22 10:22:14,442 48k INFO Train Epoch: 230 [63%]
2023-01-22 10:22:14,445 48k INFO [2.4932522773742676, 2.160306930541992, 6.428990364074707, 15.630932807922363, 0.7413484454154968, 31000, 9.715361456473177e-05]
2023-01-22 10:23:05,752 48k INFO Saving model and optimizer state at iteration 230 to ./logs/48k/G_31000.pth
2023-01-22 10:23:08,740 48k INFO Saving model and optimizer state at iteration 230 to ./logs/48k/D_31000.pth
2023-01-22 10:23:55,116 48k INFO ====> Epoch: 230
2023-01-22 10:29:54,943 48k INFO ====> Epoch: 231
2023-01-22 10:34:58,678 48k INFO Train Epoch: 232 [11%]
2023-01-22 10:34:58,680 48k INFO [2.5397183895111084, 2.259108066558838, 5.593452453613281, 16.34769630432129, 0.7978397011756897, 31200, 9.71293276791158e-05]
2023-01-22 10:36:45,530 48k INFO ====> Epoch: 232
2023-01-22 10:42:00,550 48k INFO Train Epoch: 233 [59%]
2023-01-22 10:42:00,551 48k INFO [2.277923822402954, 2.6269798278808594, 8.837657928466797, 16.94891357421875, 0.5446866750717163, 31400, 9.711718651315591e-05]
2023-01-22 10:42:49,304 48k INFO ====> Epoch: 233
2023-01-22 10:47:27,347 48k INFO ====> Epoch: 234
2023-01-22 10:52:24,525 48k INFO Train Epoch: 235 [7%]
2023-01-22 10:52:24,526 48k INFO [2.485118865966797, 2.403068780899048, 8.079656600952148, 16.027925491333008, 0.676116943359375, 31600, 9.709290873398365e-05]
2023-01-22 10:54:16,281 48k INFO ====> Epoch: 235
2023-01-22 10:59:27,121 48k INFO Train Epoch: 236 [56%]
2023-01-22 10:59:27,123 48k INFO [2.4529805183410645, 2.3092803955078125, 7.376678943634033, 17.033023834228516, 0.77704256772995, 31800, 9.70807721203919e-05]
2023-01-22 11:00:20,697 48k INFO ====> Epoch: 236
2023-01-22 11:07:51,580 48k INFO ====> Epoch: 237
2023-01-22 11:11:56,256 48k INFO Train Epoch: 238 [4%]
2023-01-22 11:11:56,258 48k INFO [2.3923535346984863, 2.2982871532440186, 8.522915840148926, 18.49658966064453, 0.8651725053787231, 32000, 9.705650344424885e-05]
2023-01-22 11:12:42,714 48k INFO Saving model and optimizer state at iteration 238 to ./logs/48k/G_32000.pth
2023-01-22 11:12:45,902 48k INFO Saving model and optimizer state at iteration 238 to ./logs/48k/D_32000.pth
2023-01-22 11:14:43,591 48k INFO ====> Epoch: 238
2023-01-22 11:42:13,949 48k INFO {'train': {'log_interval': 200, 'eval_interval': 1000, 'seed': 1234, 'epochs': 10000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 12, 'fp16_run': False, 'lr_decay': 0.999875, 'segment_size': 17920, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0, 'use_sr': True, 'max_speclen': 384, 'port': '8001'}, 'data': {'training_files': 'filelists/train.txt', 'validation_files': 'filelists/val.txt', 'max_wav_value': 32768.0, 'sampling_rate': 48000, 'filter_length': 1280, 'hop_length': 320, 'win_length': 1280, 'n_mel_channels': 80, 'mel_fmin': 0.0, 'mel_fmax': None}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [10, 8, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4], 'n_layers_q': 3, 'use_spectral_norm': False, 'gin_channels': 256, 'ssl_dim': 256, 'n_speakers': 2}, 'spk': {'Asaki': 0}, 'model_dir': './logs/48k'}
2023-01-22 11:42:13,951 48k WARNING /root/so-vits-svc is not a git repository, therefore hash value comparison will be ignored.
2023-01-22 11:43:09,160 48k INFO Loaded checkpoint './logs/48k/G_32000.pth' (iteration 238)
2023-01-22 11:43:12,020 48k INFO Loaded checkpoint './logs/48k/D_32000.pth' (iteration 238)
2023-01-22 11:47:42,883 48k INFO Train Epoch: 238 [4%]
2023-01-22 11:47:42,884 48k INFO [2.2613136768341064, 2.4589107036590576, 8.430075645446777, 17.873971939086914, 0.8159986138343811, 32000, 9.704437138131832e-05]
2023-01-22 11:48:39,244 48k INFO Saving model and optimizer state at iteration 238 to ./logs/48k/G_32000.pth
2023-01-22 11:48:42,766 48k INFO Saving model and optimizer state at iteration 238 to ./logs/48k/D_32000.pth
2023-01-22 11:50:58,825 48k INFO ====> Epoch: 238
2023-01-22 11:55:45,427 48k INFO Train Epoch: 239 [52%]
2023-01-22 11:55:45,428 48k INFO [2.3614792823791504, 2.3841142654418945, 7.2908477783203125, 18.564212799072266, 0.8683855533599854, 32200, 9.703224083489565e-05]
2023-01-22 11:56:43,152 48k INFO ====> Epoch: 239
2023-01-22 12:01:25,010 48k INFO ====> Epoch: 240
2023-01-22 12:03:50,306 48k INFO Train Epoch: 241 [0%]
2023-01-22 12:03:50,307 48k INFO [2.42980694770813, 2.4430458545684814, 9.487882614135742, 19.02658462524414, 0.7641972899436951, 32400, 9.700798429081568e-05]
2023-01-22 12:05:50,424 48k INFO ====> Epoch: 241
2023-01-22 12:09:26,921 48k INFO Train Epoch: 242 [48%]
2023-01-22 12:09:26,923 48k INFO [2.480952024459839, 2.3397634029388428, 7.1838531494140625, 15.901000022888184, 1.0192186832427979, 32600, 9.699585829277933e-05]
2023-01-22 12:10:29,057 48k INFO ====> Epoch: 242
2023-01-22 12:15:57,879 48k INFO Train Epoch: 243 [96%]
2023-01-22 12:15:57,929 48k INFO [2.350839614868164, 2.6268553733825684, 8.62727165222168, 19.235857009887695, 0.6308739185333252, 32800, 9.698373381049272e-05]
2023-01-22 12:16:02,290 48k INFO ====> Epoch: 243
2023-01-22 12:20:23,461 48k INFO ====> Epoch: 244
2023-01-22 12:24:06,724 48k INFO Train Epoch: 245 [44%]
2023-01-22 12:24:06,726 48k INFO [2.468545913696289, 2.249843120574951, 6.784601211547852, 16.18385124206543, 0.7427247166633606, 33000, 9.695948939241093e-05]
2023-01-22 12:24:47,521 48k INFO Saving model and optimizer state at iteration 245 to ./logs/48k/G_33000.pth
2023-01-22 12:24:51,279 48k INFO Saving model and optimizer state at iteration 245 to ./logs/48k/D_33000.pth
2023-01-22 12:26:00,245 48k INFO ====> Epoch: 245
2023-01-22 12:30:39,556 48k INFO Train Epoch: 246 [93%]
2023-01-22 12:30:39,558 48k INFO [2.5003232955932617, 2.3093690872192383, 8.366510391235352, 17.251068115234375, 0.6487038135528564, 33200, 9.694736945623688e-05]
2023-01-22 12:30:48,514 48k INFO ====> Epoch: 246
2023-01-22 12:36:04,173 48k INFO ====> Epoch: 247
2023-01-22 12:40:07,179 48k INFO Train Epoch: 248 [41%]
2023-01-22 12:40:07,180 48k INFO [2.3906939029693604, 2.306753158569336, 8.893338203430176, 17.83040428161621, 0.7982873916625977, 33400, 9.692313412867544e-05]
2023-01-22 12:41:18,321 48k INFO ====> Epoch: 248
2023-01-22 12:45:09,336 48k INFO Train Epoch: 249 [89%]
2023-01-22 12:45:09,340 48k INFO [2.366762161254883, 2.511160373687744, 7.13370418548584, 17.698837280273438, 0.6363296508789062, 33600, 9.691101873690936e-05]
2023-01-22 12:45:22,494 48k INFO ====> Epoch: 249
2023-01-22 12:49:44,610 48k INFO ====> Epoch: 250
2023-01-22 12:52:27,384 48k INFO Train Epoch: 251 [37%]
2023-01-22 12:52:27,385 48k INFO [2.401773452758789, 2.2863190174102783, 7.359044075012207, 15.43443489074707, 0.6048851013183594, 33800, 9.68867924964598e-05]
2023-01-22 12:53:43,018 48k INFO ====> Epoch: 251
2023-01-22 12:57:19,934 48k INFO Train Epoch: 252 [85%]
2023-01-22 12:57:19,936 48k INFO [2.4856605529785156, 2.3468356132507324, 7.179285049438477, 16.397764205932617, 0.700822651386261, 34000, 9.687468164739773e-05]
2023-01-22 12:58:05,171 48k INFO Saving model and optimizer state at iteration 252 to ./logs/48k/G_34000.pth
2023-01-22 12:58:08,015 48k INFO Saving model and optimizer state at iteration 252 to ./logs/48k/D_34000.pth
2023-01-22 12:58:27,620 48k INFO ====> Epoch: 252
2023-01-22 13:02:46,209 48k INFO ====> Epoch: 253
2023-01-22 13:05:38,845 48k INFO Train Epoch: 254 [33%]
2023-01-22 13:05:38,847 48k INFO [2.4740426540374756, 2.368603467941284, 8.116990089416504, 18.18496322631836, 0.7356986999511719, 34200, 9.685046449065278e-05]
2023-01-22 13:06:58,379 48k INFO ====> Epoch: 254
2023-01-22 13:11:36,701 48k INFO Train Epoch: 255 [81%]
2023-01-22 13:11:36,703 48k INFO [2.314347505569458, 2.4388198852539062, 6.875949859619141, 14.967924118041992, 0.8673399686813354, 34400, 9.683835818259144e-05]
2023-01-22 13:11:58,860 48k INFO ====> Epoch: 255
2023-01-22 13:16:03,385 48k INFO ====> Epoch: 256
2023-01-22 13:18:47,286 48k INFO Train Epoch: 257 [30%]
2023-01-22 13:18:47,287 48k INFO [2.3185205459594727, 2.391451835632324, 8.362555503845215, 17.992568969726562, 0.7871115803718567, 34600, 9.681415010614512e-05]
2023-01-22 13:20:11,647 48k INFO ====> Epoch: 257
2023-01-22 13:24:03,182 48k INFO Train Epoch: 258 [78%]
2023-01-22 13:24:03,195 48k INFO [2.2283198833465576, 2.539550304412842, 9.251622200012207, 19.5689754486084, 0.842915415763855, 34800, 9.680204833738185e-05]
2023-01-22 13:24:29,710 48k INFO ====> Epoch: 258
2023-01-22 13:28:38,768 48k INFO ====> Epoch: 259
2023-01-22 13:31:46,668 48k INFO Train Epoch: 260 [26%]
2023-01-22 13:31:46,670 48k INFO [2.289602756500244, 2.483950614929199, 7.842837810516357, 16.14043617248535, 0.877478837966919, 35000, 9.67778493378295e-05]
2023-01-22 13:32:23,148 48k INFO Saving model and optimizer state at iteration 260 to ./logs/48k/G_35000.pth
2023-01-22 13:32:26,047 48k INFO Saving model and optimizer state at iteration 260 to ./logs/48k/D_35000.pth
2023-01-22 13:33:56,916 48k INFO ====> Epoch: 260
2023-01-22 13:37:55,621 48k INFO Train Epoch: 261 [74%]
2023-01-22 13:37:55,622 48k INFO [2.3270151615142822, 2.4291317462921143, 8.443516731262207, 19.155803680419922, 0.9110982418060303, 35200, 9.676575210666227e-05]
2023-01-22 13:38:26,590 48k INFO ====> Epoch: 261
2023-01-22 13:42:50,298 48k INFO ====> Epoch: 262
2023-01-22 13:46:00,461 48k INFO Train Epoch: 263 [22%]
2023-01-22 13:46:00,467 48k INFO [2.4476375579833984, 2.2606985569000244, 7.555863857269287, 18.846004486083984, 0.6997097134590149, 35400, 9.674156218060047e-05]
2023-01-22 13:47:33,478 48k INFO ====> Epoch: 263
2023-01-22 13:51:35,645 48k INFO Train Epoch: 264 [70%]
2023-01-22 13:51:35,647 48k INFO [2.3684072494506836, 2.3997764587402344, 7.387723922729492, 17.959917068481445, 0.46424421668052673, 35600, 9.67294694853279e-05]
2023-01-22 13:52:11,274 48k INFO ====> Epoch: 264
2023-01-22 13:56:29,079 48k INFO ====> Epoch: 265
2023-01-22 13:59:49,865 48k INFO Train Epoch: 266 [19%]
2023-01-22 13:59:49,867 48k INFO [2.4590985774993896, 2.43709397315979, 7.71727991104126, 16.326539993286133, 0.7760441303253174, 35800, 9.670528862935451e-05]
2023-01-22 14:01:27,749 48k INFO ====> Epoch: 266
2023-01-22 14:05:22,911 48k INFO Train Epoch: 267 [67%]
2023-01-22 14:05:22,912 48k INFO [2.529423236846924, 2.330211639404297, 8.158029556274414, 18.776729583740234, 0.896825909614563, 36000, 9.669320046827584e-05]
2023-01-22 14:06:09,893 48k INFO Saving model and optimizer state at iteration 267 to ./logs/48k/G_36000.pth
2023-01-22 14:06:13,659 48k INFO Saving model and optimizer state at iteration 267 to ./logs/48k/D_36000.pth
2023-01-22 14:06:55,763 48k INFO ====> Epoch: 267
2023-01-22 14:11:17,672 48k INFO ====> Epoch: 268
2023-01-22 14:14:34,453 48k INFO Train Epoch: 269 [15%]
2023-01-22 14:14:34,454 48k INFO [2.594207286834717, 2.40482497215271, 7.07330846786499, 15.730304718017578, 0.6758981347084045, 36200, 9.666902867899003e-05]
2023-01-22 14:16:16,548 48k INFO ====> Epoch: 269
2023-01-22 14:19:00,161 48k INFO Train Epoch: 270 [63%]
2023-01-22 14:19:00,162 48k INFO [2.584080696105957, 2.189208507537842, 7.746429920196533, 14.936728477478027, 0.6904215812683105, 36400, 9.665694505040515e-05]
2023-01-22 14:19:44,650 48k INFO ====> Epoch: 270
2023-01-22 14:23:17,182 48k INFO ====> Epoch: 271
2023-01-22 14:25:06,843 48k INFO Train Epoch: 272 [11%]
2023-01-22 14:25:06,844 48k INFO [2.4072866439819336, 2.4608421325683594, 6.526651382446289, 13.777335166931152, 0.5674250721931458, 36600, 9.663278232440732e-05]
2023-01-22 14:26:53,326 48k INFO ====> Epoch: 272
2023-01-22 14:29:22,069 48k INFO Train Epoch: 273 [59%]
2023-01-22 14:29:22,073 48k INFO [2.4348113536834717, 2.5120534896850586, 6.264192581176758, 14.798490524291992, 0.7323782444000244, 36800, 9.662070322661676e-05]
2023-01-22 14:30:11,213 48k INFO ====> Epoch: 273
2023-01-22 14:33:15,771 48k INFO ====> Epoch: 274
2023-01-22 14:34:27,028 48k INFO Train Epoch: 275 [7%]
2023-01-22 14:34:27,030 48k INFO [2.4231503009796143, 2.4943506717681885, 7.505504131317139, 16.735111236572266, 0.7811313271522522, 37000, 9.659654956050859e-05]
2023-01-22 14:34:43,701 48k INFO Saving model and optimizer state at iteration 275 to ./logs/48k/G_37000.pth
2023-01-22 14:34:46,872 48k INFO Saving model and optimizer state at iteration 275 to ./logs/48k/D_37000.pth
2023-01-22 14:36:40,418 48k INFO ====> Epoch: 275
2023-01-22 14:38:49,269 48k INFO Train Epoch: 276 [56%]
2023-01-22 14:38:49,270 48k INFO [2.477111339569092, 2.2743024826049805, 7.314972877502441, 18.483715057373047, 0.7627670764923096, 37200, 9.658447499181352e-05]
2023-01-22 14:39:42,537 48k INFO ====> Epoch: 276
2023-01-22 14:42:40,686 48k INFO ====> Epoch: 277
2023-01-22 14:43:43,995 48k INFO Train Epoch: 278 [4%]
2023-01-22 14:43:43,996 48k INFO [2.353349447250366, 2.3733224868774414, 9.344162940979004, 17.682641983032227, 0.6716291904449463, 37400, 9.656033038219798e-05]
2023-01-22 14:45:39,541 48k INFO ====> Epoch: 278
2023-01-22 14:47:38,735 48k INFO Train Epoch: 279 [52%]
2023-01-22 14:47:38,737 48k INFO [2.4274039268493652, 2.398956060409546, 8.283748626708984, 18.00942611694336, 1.0797754526138306, 37600, 9.65482603409002e-05]
2023-01-22 14:48:36,304 48k INFO ====> Epoch: 279
2023-01-22 14:51:33,488 48k INFO ====> Epoch: 280
2023-01-22 14:52:32,311 48k INFO Train Epoch: 281 [0%]
2023-01-22 14:52:32,312 48k INFO [2.448072910308838, 2.522920846939087, 8.10617446899414, 17.984914779663086, 0.6238106489181519, 37800, 9.652412478438153e-05]
2023-01-22 14:54:31,579 48k INFO ====> Epoch: 281
2023-01-22 14:56:31,000 48k INFO Train Epoch: 282 [48%]
2023-01-22 14:56:31,001 48k INFO [2.3328754901885986, 2.499995708465576, 8.391336441040039, 17.412683486938477, 0.6468909382820129, 38000, 9.651205926878348e-05]
2023-01-22 14:56:49,612 48k INFO Saving model and optimizer state at iteration 282 to ./logs/48k/G_38000.pth
2023-01-22 14:56:53,222 48k INFO Saving model and optimizer state at iteration 282 to ./logs/48k/D_38000.pth
2023-01-22 14:57:57,974 48k INFO ====> Epoch: 282
2023-01-22 15:00:51,353 48k INFO Train Epoch: 283 [96%]
2023-01-22 15:00:51,354 48k INFO [2.3887181282043457, 2.4443349838256836, 7.818312644958496, 19.522598266601562, 0.6072680354118347, 38200, 9.649999526137489e-05]
2023-01-22 15:00:55,701 48k INFO ====> Epoch: 283
2023-01-22 15:03:57,827 48k INFO ====> Epoch: 284
2023-01-22 15:05:51,611 48k INFO Train Epoch: 285 [44%]
2023-01-22 15:05:51,612 48k INFO [2.4296469688415527, 2.4187684059143066, 7.598639488220215, 17.00841522216797, 0.5899146795272827, 38400, 9.647587177037196e-05]
2023-01-22 15:06:57,998 48k INFO ====> Epoch: 285
2023-01-22 15:11:21,809 48k INFO Train Epoch: 286 [93%]
2023-01-22 15:11:21,810 48k INFO [2.47986102104187, 2.418102502822876, 8.120516777038574, 16.82959747314453, 0.5558894276618958, 38600, 9.646381228640066e-05]
2023-01-22 15:11:30,527 48k INFO ====> Epoch: 286
2023-01-22 15:16:52,714 48k INFO ====> Epoch: 287
2023-01-22 15:20:26,741 48k INFO Train Epoch: 288 [41%]
2023-01-22 15:20:26,742 48k INFO [2.417646884918213, 2.1912782192230225, 7.205511093139648, 17.26949691772461, 1.06976318359375, 38800, 9.643969784057613e-05]
2023-01-22 15:21:38,151 48k INFO ====> Epoch: 288
2023-01-22 15:25:47,521 48k INFO Train Epoch: 289 [89%]
2023-01-22 15:25:47,527 48k INFO [2.371002197265625, 2.5105152130126953, 8.714533805847168, 17.450605392456055, 0.5216397047042847, 39000, 9.642764287834605e-05]
2023-01-22 15:26:31,653 48k INFO Saving model and optimizer state at iteration 289 to ./logs/48k/G_39000.pth
2023-01-22 15:26:35,026 48k INFO Saving model and optimizer state at iteration 289 to ./logs/48k/D_39000.pth
2023-01-22 15:26:50,591 48k INFO ====> Epoch: 289
2023-01-22 15:31:56,677 48k INFO ====> Epoch: 290
2023-01-22 15:35:13,704 48k INFO Train Epoch: 291 [37%]
2023-01-22 15:35:13,705 48k INFO [2.3130550384521484, 2.456458330154419, 9.596668243408203, 19.601957321166992, 0.6780734658241272, 39200, 9.640353747430838e-05]
2023-01-22 15:36:28,799 48k INFO ====> Epoch: 291
2023-01-22 15:40:24,708 48k INFO Train Epoch: 292 [85%]
2023-01-22 15:40:24,709 48k INFO [2.2959342002868652, 2.400784730911255, 8.653800964355469, 17.960926055908203, 0.6029865741729736, 39400, 9.639148703212408e-05]
2023-01-22 15:40:42,372 48k INFO ====> Epoch: 292
2023-01-22 15:45:05,927 48k INFO ====> Epoch: 293
2023-01-22 15:47:45,120 48k INFO Train Epoch: 294 [33%]
2023-01-22 15:47:45,121 48k INFO [2.469186544418335, 2.334118127822876, 7.310582637786865, 16.590017318725586, 0.836621105670929, 39600, 9.636739066648303e-05]
2023-01-22 15:49:05,056 48k INFO ====> Epoch: 294
2023-01-22 15:52:54,903 48k INFO Train Epoch: 295 [81%]
2023-01-22 15:52:54,904 48k INFO [2.2566592693328857, 2.727398633956909, 8.405400276184082, 18.7299747467041, 0.6234728693962097, 39800, 9.635534474264972e-05]
2023-01-22 15:53:16,960 48k INFO ====> Epoch: 295
2023-01-22 15:57:08,482 48k INFO ====> Epoch: 296
2023-01-22 15:59:40,732 48k INFO Train Epoch: 297 [30%]
2023-01-22 15:59:40,734 48k INFO [2.2669360637664795, 2.4068658351898193, 9.069479942321777, 18.476909637451172, 0.56817227602005, 40000, 9.633125741201631e-05]
2023-01-22 16:00:09,875 48k INFO Saving model and optimizer state at iteration 297 to ./logs/48k/G_40000.pth
2023-01-22 16:00:13,530 48k INFO Saving model and optimizer state at iteration 297 to ./logs/48k/D_40000.pth
2023-01-22 16:01:39,876 48k INFO ====> Epoch: 297
2023-01-22 16:04:58,368 48k INFO Train Epoch: 298 [78%]
2023-01-22 16:04:58,369 48k INFO [2.2250146865844727, 2.633596181869507, 9.244190216064453, 18.637998580932617, 0.4720035195350647, 40200, 9.631921600483981e-05]
2023-01-22 16:05:24,824 48k INFO ====> Epoch: 298
2023-01-22 16:09:07,628 48k INFO ====> Epoch: 299
2023-01-22 16:11:25,471 48k INFO Train Epoch: 300 [26%]
2023-01-22 16:11:25,472 48k INFO [2.4519076347351074, 2.2256922721862793, 7.041754245758057, 15.106376647949219, 0.6140763759613037, 40400, 9.629513770582634e-05]
2023-01-22 16:12:54,422 48k INFO ====> Epoch: 300
2023-01-22 16:16:01,050 48k INFO Train Epoch: 301 [74%]
2023-01-22 16:16:01,052 48k INFO [2.3448362350463867, 2.3815178871154785, 8.741674423217773, 18.212007522583008, 0.5099637508392334, 40600, 9.628310081361311e-05]
2023-01-22 16:16:31,897 48k INFO ====> Epoch: 301
2023-01-22 16:20:21,965 48k INFO ====> Epoch: 302
2023-01-22 16:22:38,996 48k INFO Train Epoch: 303 [22%]
2023-01-22 16:22:38,997 48k INFO [2.4114632606506348, 2.3417625427246094, 7.886682987213135, 16.273374557495117, 0.7922483682632446, 40800, 9.625903154283315e-05]
2023-01-22 16:24:12,039 48k INFO ====> Epoch: 303
2023-01-22 16:27:16,195 48k INFO Train Epoch: 304 [70%]
2023-01-22 16:27:16,197 48k INFO [2.4813506603240967, 2.407681941986084, 8.287625312805176, 16.775705337524414, 0.8260019421577454, 41000, 9.62469991638903e-05]
2023-01-22 16:27:47,317 48k INFO Saving model and optimizer state at iteration 304 to ./logs/48k/G_41000.pth
2023-01-22 16:27:50,042 48k INFO Saving model and optimizer state at iteration 304 to ./logs/48k/D_41000.pth
2023-01-22 16:28:27,866 48k INFO ====> Epoch: 304
2023-01-22 16:32:01,219 48k INFO ====> Epoch: 305
2023-01-22 16:34:29,562 48k INFO Train Epoch: 306 [19%]
2023-01-22 16:34:29,564 48k INFO [2.4534859657287598, 2.2021894454956055, 8.257509231567383, 17.203853607177734, 0.7473194599151611, 41200, 9.622293891795867e-05]
2023-01-22 16:36:06,768 48k INFO ====> Epoch: 306
2023-01-22 16:39:14,597 48k INFO Train Epoch: 307 [67%]
2023-01-22 16:39:14,598 48k INFO [2.3421471118927, 2.4423537254333496, 8.335381507873535, 16.379396438598633, 0.5592942833900452, 41400, 9.621091105059392e-05]
2023-01-22 16:39:54,344 48k INFO ====> Epoch: 307
2023-01-22 16:43:52,238 48k INFO ====> Epoch: 308
2023-01-22 16:45:54,386 48k INFO Train Epoch: 309 [15%]
2023-01-22 16:45:54,387 48k INFO [2.372825860977173, 2.4120821952819824, 7.044921875, 15.132333755493164, 0.7050344347953796, 41600, 9.618685982612675e-05]
2023-01-22 16:47:36,858 48k INFO ====> Epoch: 309
2023-01-22 16:50:39,548 48k INFO Train Epoch: 310 [63%]
2023-01-22 16:50:39,549 48k INFO [2.524437427520752, 2.391568660736084, 4.835186004638672, 13.121316909790039, 0.5700317025184631, 41800, 9.617483646864849e-05]
2023-01-22 16:51:23,854 48k INFO ====> Epoch: 310
2023-01-22 16:55:02,343 48k INFO ====> Epoch: 311
2023-01-22 16:56:45,269 48k INFO Train Epoch: 312 [11%]
2023-01-22 16:56:45,270 48k INFO [2.516934633255005, 2.2302396297454834, 6.302649974822998, 14.910506248474121, 0.6865663528442383, 42000, 9.615079426226314e-05]
2023-01-22 16:57:03,289 48k INFO Saving model and optimizer state at iteration 312 to ./logs/48k/G_42000.pth
2023-01-22 16:57:06,946 48k INFO Saving model and optimizer state at iteration 312 to ./logs/48k/D_42000.pth
2023-01-22 16:58:55,200 48k INFO ====> Epoch: 312
2023-01-22 17:01:45,439 48k INFO Train Epoch: 313 [59%]
2023-01-22 17:01:45,451 48k INFO [2.2551987171173096, 2.97629714012146, 5.956668853759766, 13.055106163024902, 0.7470076680183411, 42200, 9.613877541298036e-05]
2023-01-22 17:02:34,025 48k INFO ====> Epoch: 313
2023-01-22 17:06:07,478 48k INFO ====> Epoch: 314
2023-01-22 17:07:57,374 48k INFO Train Epoch: 315 [7%]
2023-01-22 17:07:57,375 48k INFO [2.2620961666107178, 2.534827947616577, 9.351641654968262, 18.0697021484375, 0.8800477981567383, 42400, 9.611474222129547e-05]
2023-01-22 17:09:47,670 48k INFO ====> Epoch: 315
2023-01-22 17:12:30,305 48k INFO Train Epoch: 316 [56%]
2023-01-22 17:12:30,574 48k INFO [2.3954522609710693, 2.353015661239624, 6.899136066436768, 15.734223365783691, 0.40866079926490784, 42600, 9.61027278785178e-05]
2023-01-22 17:13:23,903 48k INFO ====> Epoch: 316
2023-01-22 17:17:08,457 48k INFO ====> Epoch: 317
2023-01-22 17:18:36,382 48k INFO Train Epoch: 318 [4%]
2023-01-22 17:18:36,384 48k INFO [2.333951711654663, 2.4426777362823486, 8.317424774169922, 18.092710494995117, 0.6056740283966064, 42800, 9.60787036981533e-05]
2023-01-22 17:20:31,508 48k INFO ====> Epoch: 318
2023-01-22 17:23:13,600 48k INFO Train Epoch: 319 [52%]
2023-01-22 17:23:13,605 48k INFO [2.318000316619873, 2.354292392730713, 8.948800086975098, 18.166889190673828, 0.7283632159233093, 43000, 9.606669386019102e-05]
2023-01-22 17:23:43,071 48k INFO Saving model and optimizer state at iteration 319 to ./logs/48k/G_43000.pth
2023-01-22 17:23:47,388 48k INFO Saving model and optimizer state at iteration 319 to ./logs/48k/D_43000.pth
2023-01-22 17:24:46,834 48k INFO ====> Epoch: 319
2023-01-22 17:28:37,212 48k INFO ====> Epoch: 320
2023-01-22 17:30:29,695 48k INFO Train Epoch: 321 [0%]
2023-01-22 17:30:29,696 48k INFO [2.3656821250915527, 2.551257610321045, 7.59739875793457, 16.959199905395508, 0.8131361603736877, 43200, 9.604267868776807e-05]
2023-01-22 17:32:29,657 48k INFO ====> Epoch: 321
2023-01-22 17:35:10,204 48k INFO Train Epoch: 322 [48%]
2023-01-22 17:35:10,205 48k INFO [2.4267737865448, 2.3376646041870117, 8.554890632629395, 17.099754333496094, 0.6729087829589844, 43400, 9.603067335293209e-05]
2023-01-22 17:36:12,540 48k INFO ====> Epoch: 322
2023-01-22 17:39:51,287 48k INFO Train Epoch: 323 [96%]
2023-01-22 17:39:51,289 48k INFO [2.1600894927978516, 2.575343370437622, 8.432794570922852, 18.25483512878418, 0.8089667558670044, 43600, 9.601866951876297e-05]
2023-01-22 17:39:55,703 48k INFO ====> Epoch: 323
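Since a loss row is emitted every `log_interval` (200) optimizer steps and each "Train Epoch: N [P%]" line records the within-epoch progress, two consecutive rows let you estimate the number of steps per epoch. For example, the rows above at epoch 232 [11%] (step 31200) and epoch 233 [59%] (step 31400) are 200 steps and 1.48 epochs apart. A small sketch of that arithmetic (the helper name and record tuples are illustrative, not part of the training script):

```python
def steps_per_epoch(rec_a, rec_b, log_interval=200):
    """Estimate optimizer steps per epoch from two consecutive logged
    progress records (epoch, percent). Assumes exactly log_interval
    steps elapse between the two records."""
    (ep_a, pct_a), (ep_b, pct_b) = rec_a, rec_b
    epochs_elapsed = (ep_b + pct_b / 100) - (ep_a + pct_a / 100)
    return log_interval / epochs_elapsed

print(round(steps_per_epoch((232, 11), (233, 59))))  # 135
```

At roughly 135 steps per epoch with `batch_size: 12`, the training set appears to hold on the order of 1,600 samples.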
2023-01-22 17:43:40,271 48k INFO ====> Epoch: 324
2023-01-22 17:46:14,375 48k INFO Train Epoch: 325 [44%]
2023-01-22 17:46:14,377 48k INFO [2.3632731437683105, 2.348932981491089, 8.134114265441895, 16.433855056762695, 0.70457923412323, 43800, 9.599466635167497e-05]
2023-01-22 17:47:20,501 48k INFO ====> Epoch: 325
2023-01-22 17:50:55,550 48k INFO Train Epoch: 326 [93%]
2023-01-22 17:50:55,551 48k INFO [2.383044719696045, 2.339278221130371, 7.963810443878174, 16.52117919921875, 0.6481860876083374, 44000, 9.5982667018381e-05]
2023-01-22 17:51:28,240 48k INFO Saving model and optimizer state at iteration 326 to ./logs/48k/G_44000.pth
2023-01-22 17:51:31,557 48k INFO Saving model and optimizer state at iteration 326 to ./logs/48k/D_44000.pth
2023-01-22 17:51:42,541 48k INFO ====> Epoch: 326
2023-01-22 17:55:33,188 48k INFO ====> Epoch: 327
2023-01-22 17:58:26,939 48k INFO Train Epoch: 328 [41%]
2023-01-22 17:58:26,940 48k INFO [2.4874587059020996, 2.2184391021728516, 7.857416152954102, 18.301340103149414, 0.8074116110801697, 44200, 9.595867285135558e-05]
2023-01-22 17:59:37,947 48k INFO ====> Epoch: 328
2023-01-22 18:03:13,454 48k INFO Train Epoch: 329 [89%]
2023-01-22 18:03:13,455 48k INFO [2.4864158630371094, 2.403803586959839, 7.286619663238525, 15.194061279296875, 0.6527774333953857, 44400, 9.594667801724916e-05]
2023-01-22 18:03:26,692 48k INFO ====> Epoch: 329
2023-01-22 18:07:29,922 48k INFO ====> Epoch: 330
2023-01-22 18:10:47,616 48k INFO Train Epoch: 331 [37%]
2023-01-22 18:10:47,622 48k INFO [2.3572330474853516, 2.6455559730529785, 8.697616577148438, 18.971073150634766, 0.740763247013092, 44600, 9.592269284691169e-05]
2023-01-22 18:12:02,503 48k INFO ====> Epoch: 331
2023-01-22 18:15:22,465 48k INFO Train Epoch: 332 [85%]
2023-01-22 18:15:22,467 48k INFO [2.443180799484253, 2.3018908500671387, 7.5917768478393555, 17.607200622558594, 0.632247805595398, 44800, 9.591070251030582e-05]
2023-01-22 18:15:40,014 48k INFO ====> Epoch: 332
2023-01-22 18:19:19,724 48k INFO ====> Epoch: 333
2023-01-22 18:22:05,425 48k INFO Train Epoch: 334 [33%]
2023-01-22 18:22:05,426 48k INFO [2.376763343811035, 2.4500439167022705, 8.75696086883545, 16.75783348083496, 0.6433892846107483, 45000, 9.588672633328296e-05]
2023-01-22 18:22:36,876 48k INFO Saving model and optimizer state at iteration 334 to ./logs/48k/G_45000.pth
2023-01-22 18:22:39,736 48k INFO Saving model and optimizer state at iteration 334 to ./logs/48k/D_45000.pth
2023-01-22 18:24:01,389 48k INFO ====> Epoch: 334
2023-01-22 18:27:26,592 48k INFO Train Epoch: 335 [81%]
2023-01-22 18:27:26,593 48k INFO [2.405405044555664, 2.3900609016418457, 7.327393054962158, 16.003793716430664, 0.6529723405838013, 45200, 9.58747404924913e-05]
2023-01-22 18:27:48,662 48k INFO ====> Epoch: 335
2023-01-22 18:32:00,972 48k INFO ====> Epoch: 336
2023-01-22 18:34:25,150 48k INFO Train Epoch: 337 [30%]
2023-01-22 18:34:25,151 48k INFO [2.500805139541626, 2.380406141281128, 7.1822052001953125, 15.658002853393555, 0.6731748580932617, 45400, 9.5850773305411e-05]
2023-01-22 18:35:49,632 48k INFO ====> Epoch: 337
2023-01-22 18:38:57,692 48k INFO Train Epoch: 338 [78%]
2023-01-22 18:38:57,699 48k INFO [2.4773645401000977, 2.4101974964141846, 7.406733512878418, 16.831851959228516, 0.8341858983039856, 45600, 9.583879195874782e-05]
2023-01-22 18:39:24,373 48k INFO ====> Epoch: 338
2023-01-22 18:42:45,701 48k INFO ====> Epoch: 339
2023-01-22 18:45:08,920 48k INFO Train Epoch: 340 [26%]
2023-01-22 18:45:08,922 48k INFO [2.6184966564178467, 2.255016326904297, 6.836249828338623, 13.85340404510498, 0.5831400752067566, 45800, 9.581483375823925e-05]
2023-01-22 18:46:37,793 48k INFO ====> Epoch: 340
2023-01-22 18:49:34,077 48k INFO Train Epoch: 341 [74%]
2023-01-22 18:49:34,096 48k INFO [2.4269752502441406, 2.4917781352996826, 8.80415153503418, 19.02927589416504, 0.6437544226646423, 46000, 9.580285690401946e-05]
2023-01-22 18:50:02,035 48k INFO Saving model and optimizer state at iteration 341 to ./logs/48k/G_46000.pth
2023-01-22 18:50:05,719 48k INFO Saving model and optimizer state at iteration 341 to ./logs/48k/D_46000.pth
2023-01-22 18:50:38,735 48k INFO ====> Epoch: 341
2023-01-22 18:54:05,014 48k INFO ====> Epoch: 342
2023-01-22 18:55:39,680 48k INFO Train Epoch: 343 [22%]
2023-01-22 18:55:39,681 48k INFO [2.2519588470458984, 2.7247397899627686, 8.072264671325684, 16.854413986206055, 0.6617252826690674, 46200, 9.577890768671308e-05]
2023-01-22 18:57:12,413 48k INFO ====> Epoch: 343
2023-01-22 18:59:58,002 48k INFO Train Epoch: 344 [70%]
2023-01-22 18:59:58,003 48k INFO [2.596748113632202, 2.186664581298828, 6.420111656188965, 13.523789405822754, 0.6469960808753967, 46400, 9.576693532325224e-05]
2023-01-22 19:00:33,359 48k INFO ====> Epoch: 344
2023-01-22 19:03:37,049 48k INFO ====> Epoch: 345
2023-01-22 19:05:08,347 48k INFO Train Epoch: 346 [19%]
2023-01-22 19:05:08,349 48k INFO [2.485302686691284, 2.2631027698516846, 6.422457218170166, 11.714428901672363, 0.864237904548645, 46600, 9.574299508577979e-05]
2023-01-22 19:06:46,303 48k INFO ====> Epoch: 346
2023-01-22 19:09:44,194 48k INFO Train Epoch: 347 [67%]
2023-01-22 19:09:44,200 48k INFO [2.309687852859497, 2.4901983737945557, 9.9243745803833, 18.756410598754883, 0.5765377283096313, 46800, 9.573102721139406e-05]
2023-01-22 19:10:24,294 48k INFO ====> Epoch: 347
2023-01-22 19:13:49,697 48k INFO ====> Epoch: 348
2023-01-22 19:15:26,292 48k INFO Train Epoch: 349 [15%]
2023-01-22 19:15:26,294 48k INFO [2.4820961952209473, 2.3990478515625, 6.847670078277588, 16.006511688232422, 0.6788283586502075, 47000, 9.570709595038851e-05]
2023-01-22 19:15:44,786 48k INFO Saving model and optimizer state at iteration 349 to ./logs/48k/G_47000.pth
2023-01-22 19:15:47,482 48k INFO Saving model and optimizer state at iteration 349 to ./logs/48k/D_47000.pth
2023-01-22 19:17:31,289 48k INFO ====> Epoch: 349
2023-01-22 19:19:50,751 48k INFO Train Epoch: 350 [63%]
2023-01-22 19:19:50,752 48k INFO [2.392808198928833, 2.4501442909240723, 7.621319770812988, 15.952756881713867, 0.7500029802322388, 47200, 9.569513256339471e-05]
2023-01-22 19:20:34,741 48k INFO ====> Epoch: 350
2023-01-22 19:23:40,623 48k INFO ====> Epoch: 351
2023-01-22 19:25:22,715 48k INFO Train Epoch: 352 [11%]
2023-01-22 19:25:22,717 48k INFO [2.034245491027832, 3.184155225753784, 7.303689956665039, 13.532115936279297, 0.7092576026916504, 47400, 9.56712102754903e-05]
2023-01-22 19:27:08,215 48k INFO ====> Epoch: 352
2023-01-22 19:29:31,047 48k INFO Train Epoch: 353 [59%]
2023-01-22 19:29:31,053 48k INFO [2.7139081954956055, 2.187431573867798, 6.128696441650391, 15.494945526123047, 0.5425180792808533, 47600, 9.565925137420586e-05]
2023-01-22 19:30:20,224 48k INFO ====> Epoch: 353
2023-01-22 19:34:01,314 48k INFO ====> Epoch: 354
2023-01-22 19:35:14,228 48k INFO Train Epoch: 355 [7%]
2023-01-22 19:35:14,230 48k INFO [2.2400426864624023, 2.626953601837158, 9.33132266998291, 18.713666915893555, 0.8385408520698547, 47800, 9.56353380560381e-05]
2023-01-22 19:37:04,807 48k INFO ====> Epoch: 355
2023-01-22 19:39:13,438 48k INFO Train Epoch: 356 [56%]
2023-01-22 19:39:13,439 48k INFO [2.329470634460449, 2.384976387023926, 8.607851028442383, 17.461442947387695, 0.6669854521751404, 48000, 9.562338363878108e-05]
2023-01-22 19:39:34,671 48k INFO Saving model and optimizer state at iteration 356 to ./logs/48k/G_48000.pth
2023-01-22 19:39:38,774 48k INFO Saving model and optimizer state at iteration 356 to ./logs/48k/D_48000.pth
2023-01-22 19:40:33,803 48k INFO ====> Epoch: 356
2023-01-22 19:43:36,797 48k INFO ====> Epoch: 357
2023-01-22 19:44:46,848 48k INFO Train Epoch: 358 [4%]
2023-01-22 19:44:46,856 48k INFO [2.3734679222106934, 2.39916729927063, 8.32049560546875, 17.051219940185547, 0.6743294596672058, 48200, 9.559947928698674e-05]
2023-01-22 19:46:41,988 48k INFO ====> Epoch: 358
2023-01-22 19:48:45,310 48k INFO Train Epoch: 359 [52%]
2023-01-22 19:48:45,312 48k INFO [2.2945504188537598, 2.458164930343628, 9.259329795837402, 17.520519256591797, 0.8739093542098999, 48400, 9.558752935207586e-05]
2023-01-22 19:49:42,736 48k INFO ====> Epoch: 359
2023-01-22 19:52:53,280 48k INFO ====> Epoch: 360
2023-01-22 19:54:19,388 48k INFO Train Epoch: 361 [0%]
2023-01-22 19:54:19,389 48k INFO [2.413972854614258, 2.3302712440490723, 7.629361629486084, 14.748807907104492, 0.6676719188690186, 48600, 9.556363396329299e-05]
2023-01-22 19:56:18,906 48k INFO ====> Epoch: 361
2023-01-22 19:58:40,490 48k INFO Train Epoch: 362 [48%]
2023-01-22 19:58:40,491 48k INFO [2.326883316040039, 2.574125289916992, 7.825387001037598, 16.380937576293945, 0.8805585503578186, 48800, 9.555168850904757e-05]
2023-01-22 19:59:43,004 48k INFO ====> Epoch: 362
2023-01-22 20:02:49,302 48k INFO Train Epoch: 363 [96%]
2023-01-22 20:02:49,303 48k INFO [2.2305524349212646, 2.5671908855438232, 8.702371597290039, 17.94123649597168, 0.22314974665641785, 49000, 9.553974454798393e-05]
2023-01-22 20:03:09,642 48k INFO Saving model and optimizer state at iteration 363 to ./logs/48k/G_49000.pth
2023-01-22 20:03:13,492 48k INFO Saving model and optimizer state at iteration 363 to ./logs/48k/D_49000.pth
2023-01-22 20:03:22,765 48k INFO ====> Epoch: 363
2023-01-22 20:06:33,174 48k INFO ====> Epoch: 364
2023-01-22 20:09:17,753 48k INFO Train Epoch: 365 [44%]
2023-01-22 20:09:17,754 48k INFO [2.5504512786865234, 2.061028480529785, 5.690334320068359, 12.943085670471191, 0.4153297245502472, 49200, 9.551586110465545e-05]
2023-01-22 20:10:23,750 48k INFO ====> Epoch: 365
2023-01-22 20:13:55,129 48k INFO Train Epoch: 366 [93%]
2023-01-22 20:13:55,130 48k INFO [2.3691818714141846, 2.4165148735046387, 7.433866024017334, 14.679097175598145, 0.5044909715652466, 49400, 9.550392162201736e-05]
2023-01-22 20:14:03,968 48k INFO ====> Epoch: 366
2023-01-22 20:17:22,609 48k INFO ====> Epoch: 367
2023-01-22 20:19:42,288 48k INFO Train Epoch: 368 [41%]
2023-01-22 20:19:42,289 48k INFO [2.424231767654419, 2.3427834510803223, 7.549695014953613, 17.186979293823242, 0.8195695281028748, 49600, 9.548004713386062e-05]
2023-01-22 20:20:53,504 48k INFO ====> Epoch: 368
2023-01-22 20:24:09,938 48k INFO Train Epoch: 369 [89%]
2023-01-22 20:24:09,939 48k INFO [2.5482380390167236, 2.3470144271850586, 6.29485559463501, 16.27666473388672, 0.6958893537521362, 49800, 9.546811212796888e-05]
2023-01-22 20:24:22,913 48k INFO ====> Epoch: 369
2023-01-22 20:28:06,576 48k INFO ====> Epoch: 370
2023-01-22 20:30:41,757 48k INFO Train Epoch: 371 [37%]
2023-01-22 20:30:41,758 48k INFO [2.378931999206543, 2.2879815101623535, 8.373250961303711, 17.704442977905273, 0.6974684596061707, 50000, 9.544424659162614e-05]
2023-01-22 20:31:16,693 48k INFO Saving model and optimizer state at iteration 371 to ./logs/48k/G_50000.pth
2023-01-22 20:31:19,618 48k INFO Saving model and optimizer state at iteration 371 to ./logs/48k/D_50000.pth
2023-01-22 20:32:36,644 48k INFO ====> Epoch: 371
2023-01-22 20:36:05,787 48k INFO Train Epoch: 372 [85%]
2023-01-22 20:36:05,788 48k INFO [2.309500217437744, 2.3511455059051514, 7.46088171005249, 18.75249481201172, 0.6919755935668945, 50200, 9.543231606080218e-05]
2023-01-22 20:36:23,471 48k INFO ====> Epoch: 372
2023-01-22 20:39:58,742 48k INFO ====> Epoch: 373
2023-01-22 20:42:26,575 48k INFO Train Epoch: 374 [33%]
2023-01-22 20:42:26,576 48k INFO [2.272925615310669, 2.6266238689422607, 8.974039077758789, 18.0629825592041, 0.620621919631958, 50400, 9.540845947291691e-05]
2023-01-22 20:43:46,613 48k INFO ====> Epoch: 374
2023-01-22 20:47:13,680 48k INFO Train Epoch: 375 [81%]
2023-01-22 20:47:13,681 48k INFO [2.252105236053467, 2.4372036457061768, 7.6792378425598145, 17.018348693847656, 0.5078848004341125, 50600, 9.53965334154828e-05]
2023-01-22 20:47:36,154 48k INFO ====> Epoch: 375
2023-01-22 20:51:04,204 48k INFO ====> Epoch: 376
2023-01-22 20:53:33,382 48k INFO Train Epoch: 377 [30%]
2023-01-22 20:53:33,383 48k INFO [2.4920310974121094, 2.292401075363159, 6.733639240264893, 15.295342445373535, 0.6535497903823853, 50800, 9.537268577269974e-05]
2023-01-22 20:54:58,207 48k INFO ====> Epoch: 377
2023-01-22 20:57:47,631 48k INFO Train Epoch: 378 [78%]
2023-01-22 20:57:47,632 48k INFO [2.213726758956909, 2.594534397125244, 9.34267520904541, 17.764270782470703, 0.8651587963104248, 51000, 9.536076418697815e-05]
2023-01-22 20:58:07,851 48k INFO Saving model and optimizer state at iteration 378 to ./logs/48k/G_51000.pth
2023-01-22 20:58:11,293 48k INFO Saving model and optimizer state at iteration 378 to ./logs/48k/D_51000.pth
2023-01-22 20:58:39,752 48k INFO ====> Epoch: 378
2023-01-22 21:01:47,033 48k INFO ====> Epoch: 379
2023-01-22 21:03:25,600 48k INFO Train Epoch: 380 [26%]
2023-01-22 21:03:25,601 48k INFO [2.3485701084136963, 2.557446002960205, 9.270715713500977, 17.831764221191406, 0.8496134281158447, 51200, 9.533692548594333e-05]
2023-01-22 21:04:54,197 48k INFO ====> Epoch: 380
2023-01-22 21:07:31,699 48k INFO Train Epoch: 381 [74%]
2023-01-22 21:07:31,700 48k INFO [2.2504372596740723, 2.8843283653259277, 8.759394645690918, 18.14085578918457, 0.641109824180603, 51400, 9.532500837025758e-05]
2023-01-22 21:08:02,589 48k INFO ====> Epoch: 381
2023-01-22 21:11:12,758 48k INFO ====> Epoch: 382
2023-01-22 21:12:56,793 48k INFO Train Epoch: 383 [22%]
2023-01-22 21:12:56,795 48k INFO [2.2730185985565186, 2.4812185764312744, 9.189831733703613, 18.620134353637695, 0.6183872818946838, 51600, 9.530117860761828e-05]
2023-01-22 21:14:30,053 48k INFO ====> Epoch: 383
2023-01-22 21:17:14,129 48k INFO Train Epoch: 384 [70%]
2023-01-22 21:17:14,136 48k INFO [2.4577383995056152, 2.3072867393493652, 7.165156364440918, 15.110790252685547, 0.3102637529373169, 51800, 9.528926596029232e-05]
2023-01-22 21:17:49,371 48k INFO ====> Epoch: 384
2023-01-22 21:20:53,098 48k INFO ====> Epoch: 385
2023-01-22 21:22:26,521 48k INFO Train Epoch: 386 [19%]
2023-01-22 21:22:26,522 48k INFO [2.4013285636901855, 2.209219217300415, 8.431103706359863, 16.141674041748047, 0.3340992033481598, 52000, 9.526544513269702e-05]
2023-01-22 21:22:50,724 48k INFO Saving model and optimizer state at iteration 386 to ./logs/48k/G_52000.pth
2023-01-22 21:22:53,896 48k INFO Saving model and optimizer state at iteration 386 to ./logs/48k/D_52000.pth
2023-01-22 21:24:33,981 48k INFO ====> Epoch: 386
2023-01-22 21:27:02,161 48k INFO Train Epoch: 387 [67%]
2023-01-22 21:27:02,163 48k INFO [2.3817572593688965, 2.4587137699127197, 8.96212100982666, 17.186725616455078, 0.7489089369773865, 52200, 9.525353695205543e-05]
2023-01-22 21:27:42,028 48k INFO ====> Epoch: 387
2023-01-22 21:30:51,412 48k INFO ====> Epoch: 388
2023-01-22 21:32:19,590 48k INFO Train Epoch: 389 [15%]
2023-01-22 21:32:19,592 48k INFO [2.4070582389831543, 2.4686810970306396, 8.145015716552734, 16.723182678222656, 0.7111635208129883, 52400, 9.522972505615393e-05]
2023-01-22 21:34:01,344 48k INFO ====> Epoch: 389
2023-01-22 21:36:21,650 48k INFO Train Epoch: 390 [63%]
2023-01-22 21:36:21,652 48k INFO [2.6142497062683105, 2.260935068130493, 7.7933197021484375, 15.09135627746582, 0.5993373394012451, 52600, 9.52178213405219e-05]
2023-01-22 21:37:05,895 48k INFO ====> Epoch: 390
2023-01-22 21:40:12,802 48k INFO ====> Epoch: 391
2023-01-22 21:41:31,691 48k INFO Train Epoch: 392 [11%]
2023-01-22 21:41:31,720 48k INFO [2.4525582790374756, 2.2930748462677, 7.187772750854492, 17.190383911132812, 0.9463768005371094, 52800, 9.519401837296521e-05]
2023-01-22 21:43:17,115 48k INFO ====> Epoch: 392
2023-01-22 21:45:33,409 48k INFO Train Epoch: 393 [59%]
2023-01-22 21:45:33,411 48k INFO [2.334266185760498, 2.483794689178467, 7.741704940795898, 17.896568298339844, 0.6831472516059875, 53000, 9.51821191206686e-05]
2023-01-22 21:45:54,587 48k INFO Saving model and optimizer state at iteration 393 to ./logs/48k/G_53000.pth
2023-01-22 21:45:56,803 48k INFO Saving model and optimizer state at iteration 393 to ./logs/48k/D_53000.pth
2023-01-22 21:46:47,710 48k INFO ====> Epoch: 393
2023-01-22 21:49:51,981 48k INFO ====> Epoch: 394
2023-01-22 21:51:30,694 48k INFO Train Epoch: 395 [7%]
2023-01-22 21:51:30,695 48k INFO [2.423403739929199, 2.3523902893066406, 8.129036903381348, 17.8643856048584, 0.7813327312469482, 53200, 9.515832507810904e-05]
2023-01-22 21:53:21,576 48k INFO ====> Epoch: 395
2023-01-22 21:55:39,044 48k INFO Train Epoch: 396 [56%]
2023-01-22 21:55:39,046 48k INFO [2.372114896774292, 2.2718143463134766, 7.284083843231201, 16.500965118408203, 0.7166744470596313, 53400, 9.514643028747427e-05]
2023-01-22 21:56:32,298 48k INFO ====> Epoch: 396
2023-01-22 21:59:38,789 48k INFO ====> Epoch: 397
2023-01-22 22:00:53,678 48k INFO Train Epoch: 398 [4%]
2023-01-22 22:00:53,680 48k INFO [2.530900239944458, 2.180014133453369, 8.217077255249023, 15.510058403015137, 0.5506068468093872, 53600, 9.512264516656537e-05]
2023-01-22 22:02:48,890 48k INFO ====> Epoch: 398
2023-01-22 22:04:55,759 48k INFO Train Epoch: 399 [52%]
2023-01-22 22:04:55,760 48k INFO [2.3812646865844727, 2.300353765487671, 7.995172023773193, 18.48531723022461, 0.5640478730201721, 53800, 9.511075483591955e-05]
2023-01-22 22:05:53,127 48k INFO ====> Epoch: 399
2023-01-22 22:09:00,189 48k INFO ====> Epoch: 400
2023-01-22 22:10:07,878 48k INFO Train Epoch: 401 [0%]
2023-01-22 22:10:07,879 48k INFO [2.383000612258911, 2.3352184295654297, 6.403345584869385, 14.253823280334473, 0.6330124735832214, 54000, 9.508697863331611e-05]
2023-01-22 22:10:24,097 48k INFO Saving model and optimizer state at iteration 401 to ./logs/48k/G_54000.pth
2023-01-22 22:10:26,794 48k INFO Saving model and optimizer state at iteration 401 to ./logs/48k/D_54000.pth
2023-01-22 22:12:27,708 48k INFO ====> Epoch: 401
2023-01-22 22:15:03,944 48k INFO Train Epoch: 402 [48%]
2023-01-22 22:15:04,219 48k INFO [2.4328720569610596, 2.4769058227539062, 8.421703338623047, 18.248504638671875, 0.6509317755699158, 54200, 9.507509276098694e-05]
2023-01-22 22:16:05,993 48k INFO ====> Epoch: 402
2023-01-22 22:19:06,617 48k INFO Train Epoch: 403 [96%]
2023-01-22
22:19:06,618 48k INFO [2.318669319152832, 2.5995075702667236, 9.268884658813477, 19.276948928833008, 0.8183263540267944, 54400, 9.506320837439182e-05] 2023-01-22 22:19:11,079 48k INFO ====> Epoch: 403 2023-01-22 22:22:16,974 48k INFO ====> Epoch: 404 2023-01-22 22:24:18,998 48k INFO Train Epoch: 405 [44%] 2023-01-22 22:24:18,999 48k INFO [2.3085920810699463, 2.480414390563965, 9.08579158782959, 16.507625579833984, 0.6533172130584717, 54600, 9.503944405766085e-05] 2023-01-22 22:25:25,269 48k INFO ====> Epoch: 405 2023-01-22 22:28:36,520 48k INFO Train Epoch: 406 [93%] 2023-01-22 22:28:36,521 48k INFO [2.5581226348876953, 2.226229667663574, 7.423098087310791, 14.937605857849121, 0.5044761896133423, 54800, 9.502756412715364e-05] 2023-01-22 22:28:45,516 48k INFO ====> Epoch: 406 2023-01-22 22:31:49,182 48k INFO ====> Epoch: 407 2023-01-22 22:33:45,194 48k INFO Train Epoch: 408 [41%] 2023-01-22 22:33:45,195 48k INFO [2.3719305992126465, 2.4215621948242188, 7.81636381149292, 18.027067184448242, 0.800922691822052, 55000, 9.500380872092753e-05] 2023-01-22 22:34:08,525 48k INFO Saving model and optimizer state at iteration 408 to ./logs/48k/G_55000.pth 2023-01-22 22:34:12,253 48k INFO Saving model and optimizer state at iteration 408 to ./logs/48k/D_55000.pth 2023-01-22 22:35:24,853 48k INFO ====> Epoch: 408 2023-01-22 22:38:18,961 48k INFO Train Epoch: 409 [89%] 2023-01-22 22:38:18,962 48k INFO [2.4410510063171387, 2.3868722915649414, 7.151692867279053, 16.575334548950195, 0.801203191280365, 55200, 9.49919332448374e-05] 2023-01-22 22:38:32,271 48k INFO ====> Epoch: 409 2023-01-22 22:41:40,591 48k INFO ====> Epoch: 410 2023-01-22 22:43:32,762 48k INFO Train Epoch: 411 [37%] 2023-01-22 22:43:32,768 48k INFO [2.307041883468628, 2.3303682804107666, 8.98763656616211, 17.92055892944336, 0.8689233660697937, 55400, 9.496818674577514e-05] 2023-01-22 22:44:47,715 48k INFO ====> Epoch: 411 2023-01-22 22:47:33,460 48k INFO Train Epoch: 412 [85%] 2023-01-22 22:47:33,462 48k INFO 
[2.3667988777160645, 2.37882661819458, 7.951331615447998, 17.686573028564453, 0.6935745477676392, 55600, 9.495631572243191e-05] 2023-01-22 22:47:51,179 48k INFO ====> Epoch: 412 2023-01-22 22:51:24,548 48k INFO ====> Epoch: 413 2023-01-22 22:53:44,851 48k INFO Train Epoch: 414 [33%] 2023-01-22 22:53:44,852 48k INFO [2.514413595199585, 2.494786024093628, 7.504847526550293, 15.524942398071289, 0.8808487057685852, 55800, 9.493257812719373e-05] 2023-01-22 22:55:04,969 48k INFO ====> Epoch: 414 2023-01-22 22:58:06,869 48k INFO Train Epoch: 415 [81%] 2023-01-22 22:58:06,871 48k INFO [2.4126620292663574, 2.339552879333496, 7.5960373878479, 17.79773712158203, 0.4057232439517975, 56000, 9.492071155492783e-05] 2023-01-22 22:58:28,167 48k INFO Saving model and optimizer state at iteration 415 to ./logs/48k/G_56000.pth 2023-01-22 22:58:31,677 48k INFO Saving model and optimizer state at iteration 415 to ./logs/48k/D_56000.pth 2023-01-22 22:58:55,453 48k INFO ====> Epoch: 415 2023-01-22 23:02:08,946 48k INFO ====> Epoch: 416 2023-01-22 23:03:54,024 48k INFO Train Epoch: 417 [30%] 2023-01-22 23:03:55,838 48k INFO [2.36519718170166, 2.4014482498168945, 8.030113220214844, 17.69798469543457, 0.4719547629356384, 56200, 9.489698286017521e-05] 2023-01-22 23:05:20,440 48k INFO ====> Epoch: 417 2023-01-22 23:08:02,187 48k INFO Train Epoch: 418 [78%] 2023-01-22 23:08:02,194 48k INFO [2.210477590560913, 2.684577226638794, 8.964897155761719, 18.925643920898438, 0.7870686054229736, 56400, 9.488512073731768e-05] 2023-01-22 23:08:28,821 48k INFO ====> Epoch: 418 2023-01-22 23:11:26,612 48k INFO ====> Epoch: 419 2023-01-22 23:13:24,490 48k INFO Train Epoch: 420 [26%] 2023-01-22 23:13:24,491 48k INFO [2.4035556316375732, 2.4795188903808594, 9.226785659790039, 17.452316284179688, 0.6268768310546875, 56600, 9.486140093971337e-05] 2023-01-22 23:14:53,071 48k INFO ====> Epoch: 420 2023-01-22 23:17:23,518 48k INFO Train Epoch: 421 [74%] 2023-01-22 23:17:23,519 48k INFO [2.383875846862793, 
2.418855905532837, 8.918020248413086, 17.813169479370117, 0.7755799889564514, 56800, 9.484954326459589e-05] 2023-01-22 23:17:54,532 48k INFO ====> Epoch: 421 2023-01-22 23:20:57,027 48k INFO ====> Epoch: 422 2023-01-22 23:22:39,103 48k INFO Train Epoch: 423 [22%] 2023-01-22 23:22:39,105 48k INFO [2.4390835762023926, 2.2981722354888916, 7.517523765563965, 16.94045066833496, 1.0891990661621094, 57000, 9.482583236080386e-05] 2023-01-22 23:23:01,672 48k INFO Saving model and optimizer state at iteration 423 to ./logs/48k/G_57000.pth 2023-01-22 23:23:05,078 48k INFO Saving model and optimizer state at iteration 423 to ./logs/48k/D_57000.pth 2023-01-22 23:24:40,456 48k INFO ====> Epoch: 423 2023-01-22 23:27:09,764 48k INFO Train Epoch: 424 [70%] 2023-01-22 23:27:09,765 48k INFO [2.416020393371582, 2.2847819328308105, 7.797627925872803, 16.58034896850586, 0.7516502141952515, 57200, 9.481397913175876e-05] 2023-01-22 23:27:45,314 48k INFO ====> Epoch: 424 2023-01-22 23:30:52,198 48k INFO ====> Epoch: 425 2023-01-22 23:32:19,917 48k INFO Train Epoch: 426 [19%] 2023-01-22 23:32:19,919 48k INFO [2.2528889179229736, 2.594204902648926, 7.402072906494141, 15.54792308807373, 0.6624400019645691, 57400, 9.479027711844423e-05] 2023-01-22 23:33:57,294 48k INFO ====> Epoch: 426 2023-01-22 23:36:50,439 48k INFO Train Epoch: 427 [67%] 2023-01-22 23:36:50,440 48k INFO [2.3424465656280518, 2.5035247802734375, 9.568531036376953, 16.264623641967773, 0.7429888248443604, 57600, 9.477842833380443e-05] 2023-01-22 23:37:30,121 48k INFO ====> Epoch: 427 2023-01-22 23:40:36,851 48k INFO ====> Epoch: 428 2023-01-22 23:42:27,666 48k INFO Train Epoch: 429 [15%] 2023-01-22 23:42:27,667 48k INFO [2.357149124145508, 2.5013837814331055, 6.176361560821533, 15.052521705627441, 0.9897030591964722, 57800, 9.475473520763392e-05] 2023-01-22 23:44:09,609 48k INFO ====> Epoch: 429 2023-01-22 23:46:30,783 48k INFO Train Epoch: 430 [63%] 2023-01-22 23:46:30,784 48k INFO [2.3712730407714844, 2.49556040763855, 
9.23398494720459, 17.886489868164062, 0.690373957157135, 58000, 9.474289086573296e-05] 2023-01-22 23:47:06,764 48k INFO Saving model and optimizer state at iteration 430 to ./logs/48k/G_58000.pth 2023-01-22 23:47:09,090 48k INFO Saving model and optimizer state at iteration 430 to ./logs/48k/D_58000.pth 2023-01-22 23:47:55,907 48k INFO ====> Epoch: 430 2023-01-22 23:51:01,670 48k INFO ====> Epoch: 431 2023-01-22 23:52:22,084 48k INFO Train Epoch: 432 [11%] 2023-01-22 23:52:22,090 48k INFO [2.594696521759033, 2.230950117111206, 6.00062894821167, 16.495588302612305, 0.8937126994132996, 58200, 9.471920662337418e-05] 2023-01-22 23:54:09,111 48k INFO ====> Epoch: 432 2023-01-22 23:56:26,617 48k INFO Train Epoch: 433 [59%] 2023-01-22 23:56:26,619 48k INFO [2.4132039546966553, 2.3234856128692627, 7.32626485824585, 18.2718505859375, 0.9831944704055786, 58400, 9.470736672254626e-05] 2023-01-22 23:57:15,238 48k INFO ====> Epoch: 433 2023-01-23 00:00:55,142 48k INFO ====> Epoch: 434 2023-01-23 00:03:07,499 48k INFO Train Epoch: 435 [7%] 2023-01-23 00:03:07,500 48k INFO [2.475004196166992, 2.545865058898926, 7.233957290649414, 16.334447860717773, 0.6956380009651184, 58600, 9.468369136066823e-05] 2023-01-23 00:04:58,250 48k INFO ====> Epoch: 435 2023-01-23 00:07:35,490 48k INFO Train Epoch: 436 [56%] 2023-01-23 00:07:35,491 48k INFO [2.5066936016082764, 2.1352639198303223, 6.077179908752441, 16.713815689086914, 0.8870971202850342, 58800, 9.467185589924815e-05] 2023-01-23 00:08:28,629 48k INFO ====> Epoch: 436 2023-01-23 00:11:42,842 48k INFO ====> Epoch: 437 2023-01-23 00:12:50,156 48k INFO Train Epoch: 438 [4%] 2023-01-23 00:12:50,160 48k INFO [2.275291681289673, 2.5353684425354004, 8.892474174499512, 17.51678466796875, 0.32377031445503235, 59000, 9.464818941452107e-05] 2023-01-23 00:13:05,379 48k INFO Saving model and optimizer state at iteration 438 to ./logs/48k/G_59000.pth 2023-01-23 00:13:07,993 48k INFO Saving model and optimizer state at iteration 438 to 
./logs/48k/D_59000.pth 2023-01-23 00:15:04,342 48k INFO ====> Epoch: 438 2023-01-23 00:17:10,060 48k INFO Train Epoch: 439 [52%] 2023-01-23 00:17:10,061 48k INFO [2.4641830921173096, 2.2607421875, 8.466349601745605, 18.851123809814453, 0.882507860660553, 59200, 9.463635839084426e-05] 2023-01-23 00:18:07,662 48k INFO ====> Epoch: 439 2023-01-23 00:21:12,242 48k INFO ====> Epoch: 440 2023-01-23 00:23:00,285 48k INFO Train Epoch: 441 [0%] 2023-01-23 00:23:00,286 48k INFO [2.2754013538360596, 2.4661455154418945, 8.888656616210938, 17.7296085357666, 0.9613738656044006, 59400, 9.461270077993965e-05] 2023-01-23 00:25:00,178 48k INFO ====> Epoch: 441 2023-01-23 00:27:48,760 48k INFO Train Epoch: 442 [48%] 2023-01-23 00:27:48,762 48k INFO [2.40460205078125, 2.505282402038574, 8.782103538513184, 17.396421432495117, 0.9132993221282959, 59600, 9.460087419234215e-05] 2023-01-23 00:28:50,788 48k INFO ====> Epoch: 442 2023-01-23 00:32:29,597 48k INFO Train Epoch: 443 [96%] 2023-01-23 00:32:29,598 48k INFO [2.1229450702667236, 2.666767120361328, 9.826761245727539, 18.330766677856445, 0.7474295496940613, 59800, 9.458904908306811e-05] 2023-01-23 00:32:33,857 48k INFO ====> Epoch: 443 2023-01-23 00:36:32,961 48k INFO ====> Epoch: 444 2023-01-23 00:39:04,933 48k INFO Train Epoch: 445 [44%] 2023-01-23 00:39:04,934 48k INFO [2.3201799392700195, 2.4325859546661377, 8.579071044921875, 18.35289764404297, 0.9164303541183472, 60000, 9.456540329875122e-05] 2023-01-23 00:39:33,858 48k INFO Saving model and optimizer state at iteration 445 to ./logs/48k/G_60000.pth 2023-01-23 00:39:36,475 48k INFO Saving model and optimizer state at iteration 445 to ./logs/48k/D_60000.pth 2023-01-23 00:40:44,383 48k INFO ====> Epoch: 445 2023-01-23 00:43:58,890 48k INFO Train Epoch: 446 [93%] 2023-01-23 00:43:58,891 48k INFO [2.7426693439483643, 2.0262110233306885, 6.592021465301514, 14.024862289428711, 0.4380919337272644, 60200, 9.455358262333887e-05] 2023-01-23 00:44:07,685 48k INFO ====> Epoch: 446 
2023-01-23 00:48:11,771 48k INFO ====> Epoch: 447 2023-01-23 00:50:29,491 48k INFO Train Epoch: 448 [41%] 2023-01-23 00:50:29,492 48k INFO [2.411860466003418, 2.363800525665283, 9.18837833404541, 17.263263702392578, 1.1086256504058838, 60400, 9.452994570508276e-05] 2023-01-23 00:51:40,537 48k INFO ====> Epoch: 448 2023-01-23 00:55:11,635 48k INFO Train Epoch: 449 [89%] 2023-01-23 00:55:11,636 48k INFO [2.394672155380249, 2.4029970169067383, 6.962438583374023, 15.489338874816895, 0.5273309946060181, 60600, 9.451812946186962e-05] 2023-01-23 00:55:24,864 48k INFO ====> Epoch: 449 2023-01-23 00:59:26,194 48k INFO ====> Epoch: 450 2023-01-23 01:01:59,674 48k INFO Train Epoch: 451 [37%] 2023-01-23 01:01:59,676 48k INFO [2.49904465675354, 2.2242441177368164, 7.90507173538208, 15.120101928710938, 0.7286273241043091, 60800, 9.44945014063499e-05] 2023-01-23 01:03:14,823 48k INFO ====> Epoch: 451 2023-01-23 01:06:46,251 48k INFO Train Epoch: 452 [85%] 2023-01-23 01:06:46,252 48k INFO [2.409268379211426, 2.272257089614868, 7.155418395996094, 15.779254913330078, 0.6481053829193115, 61000, 9.448268959367411e-05] 2023-01-23 01:07:14,279 48k INFO Saving model and optimizer state at iteration 452 to ./logs/48k/G_61000.pth 2023-01-23 01:07:18,284 48k INFO Saving model and optimizer state at iteration 452 to ./logs/48k/D_61000.pth 2023-01-23 01:07:37,961 48k INFO ====> Epoch: 452 2023-01-23 01:11:39,261 48k INFO ====> Epoch: 453 2023-01-23 01:13:29,615 48k INFO Train Epoch: 454 [33%] 2023-01-23 01:13:29,616 48k INFO [2.3891849517822266, 2.392873764038086, 8.792804718017578, 17.50490379333496, 0.6778994798660278, 61200, 9.445907039756771e-05] 2023-01-23 01:14:49,037 48k INFO ====> Epoch: 454 2023-01-23 01:18:04,038 48k INFO Train Epoch: 455 [81%] 2023-01-23 01:18:04,040 48k INFO [2.2337732315063477, 2.5298757553100586, 8.708442687988281, 15.645561218261719, 0.7498693466186523, 61400, 9.4447263013768e-05] 2023-01-23 01:18:26,153 48k INFO ====> Epoch: 455 2023-01-23 01:21:38,786 48k 
INFO ====> Epoch: 456 2023-01-23 01:23:54,269 48k INFO Train Epoch: 457 [30%] 2023-01-23 01:23:54,271 48k INFO [2.315610885620117, 2.4691522121429443, 8.564940452575684, 15.713743209838867, 0.8432081937789917, 61600, 9.442365267375304e-05] 2023-01-23 01:25:18,666 48k INFO ====> Epoch: 457 2023-01-23 01:28:22,771 48k INFO Train Epoch: 458 [78%] 2023-01-23 01:28:22,772 48k INFO [2.3445239067077637, 2.660304546356201, 9.66295337677002, 17.594961166381836, 0.5806844234466553, 61800, 9.441184971716882e-05] 2023-01-23 01:28:49,221 48k INFO ====> Epoch: 458 2023-01-23 01:32:27,239 48k INFO ====> Epoch: 459 2023-01-23 01:34:47,328 48k INFO Train Epoch: 460 [26%] 2023-01-23 01:34:47,329 48k INFO [2.4636003971099854, 2.6390609741210938, 7.9141974449157715, 16.073108673095703, 0.6279911398887634, 62000, 9.438824822992467e-05] 2023-01-23 01:35:13,159 48k INFO Saving model and optimizer state at iteration 460 to ./logs/48k/G_62000.pth 2023-01-23 01:35:17,021 48k INFO Saving model and optimizer state at iteration 460 to ./logs/48k/D_62000.pth 2023-01-23 01:36:46,829 48k INFO ====> Epoch: 460 2023-01-23 01:39:45,761 48k INFO Train Epoch: 461 [74%] 2023-01-23 01:39:45,762 48k INFO [2.4827582836151123, 2.40217661857605, 8.489347457885742, 18.092926025390625, 1.0102205276489258, 62200, 9.437644969889592e-05] 2023-01-23 01:40:16,796 48k INFO ====> Epoch: 461 2023-01-23 01:43:29,490 48k INFO ====> Epoch: 462 2023-01-23 01:45:44,120 48k INFO Train Epoch: 463 [22%] 2023-01-23 01:45:44,121 48k INFO [2.2780494689941406, 2.4932522773742676, 9.63774299621582, 18.86756134033203, 0.9262866973876953, 62400, 9.435285706110322e-05] 2023-01-23 01:47:16,940 48k INFO ====> Epoch: 463 2023-01-23 01:50:24,941 48k INFO Train Epoch: 464 [70%] 2023-01-23 01:50:24,942 48k INFO [2.2163138389587402, 2.804476499557495, 6.592695713043213, 12.299654006958008, 0.4662669003009796, 62600, 9.434106295397058e-05] 2023-01-23 01:51:00,327 48k INFO ====> Epoch: 464 2023-01-23 01:54:31,641 48k INFO ====> Epoch: 465 
2023-01-23 01:56:57,669 48k INFO Train Epoch: 466 [19%] 2023-01-23 01:56:57,671 48k INFO [2.676191568374634, 2.3066391944885254, 6.232028961181641, 14.163414001464844, 0.8368260860443115, 62800, 9.431747916231119e-05] 2023-01-23 01:58:35,643 48k INFO ====> Epoch: 466 2023-01-23 02:01:55,112 48k INFO Train Epoch: 467 [67%] 2023-01-23 02:01:55,113 48k INFO [2.402942657470703, 2.630358934402466, 8.812569618225098, 16.382644653320312, 0.7541434168815613, 63000, 9.430568947741589e-05] 2023-01-23 02:02:24,445 48k INFO Saving model and optimizer state at iteration 467 to ./logs/48k/G_63000.pth 2023-01-23 02:02:28,331 48k INFO Saving model and optimizer state at iteration 467 to ./logs/48k/D_63000.pth 2023-01-23 02:03:10,058 48k INFO ====> Epoch: 467 2023-01-23 02:06:50,725 48k INFO ====> Epoch: 468 2023-01-23 02:08:32,293 48k INFO Train Epoch: 469 [15%] 2023-01-23 02:08:32,294 48k INFO [2.2730259895324707, 2.497195243835449, 10.085423469543457, 19.390012741088867, 0.7676803469657898, 63200, 9.428211452857292e-05] 2023-01-23 02:10:14,188 48k INFO ====> Epoch: 469 2023-01-23 02:13:07,314 48k INFO Train Epoch: 470 [63%] 2023-01-23 02:13:07,317 48k INFO [2.572646141052246, 2.421330451965332, 6.504598617553711, 13.939603805541992, 0.8870522975921631, 63400, 9.427032926425684e-05] 2023-01-23 02:13:51,448 48k INFO ====> Epoch: 470 2023-01-23 02:17:03,950 48k INFO ====> Epoch: 471 2023-01-23 02:19:00,852 48k INFO Train Epoch: 472 [11%] 2023-01-23 02:19:00,853 48k INFO [2.492624044418335, 2.1990275382995605, 6.288263320922852, 15.28076171875, 0.45137277245521545, 63600, 9.424676315491467e-05] 2023-01-23 02:20:47,339 48k INFO ====> Epoch: 472 2023-01-23 02:23:32,233 48k INFO Train Epoch: 473 [59%] 2023-01-23 02:23:32,234 48k INFO [2.457491636276245, 2.2580392360687256, 8.079216957092285, 16.433982849121094, 0.9397866725921631, 63800, 9.423498230952031e-05] 2023-01-23 02:24:21,104 48k INFO ====> Epoch: 473 2023-01-23 02:27:35,271 48k INFO ====> Epoch: 474 2023-01-23 02:28:52,920 48k 
INFO Train Epoch: 475 [7%] 2023-01-23 02:28:52,923 48k INFO [2.370532989501953, 2.4813404083251953, 8.601213455200195, 18.101593017578125, 0.6695253849029541, 64000, 9.421142503636453e-05] 2023-01-23 02:29:20,326 48k INFO Saving model and optimizer state at iteration 475 to ./logs/48k/G_64000.pth 2023-01-23 02:29:24,025 48k INFO Saving model and optimizer state at iteration 475 to ./logs/48k/D_64000.pth 2023-01-23 02:31:17,487 48k INFO ====> Epoch: 475 2023-01-23 02:34:07,729 48k INFO Train Epoch: 476 [56%] 2023-01-23 02:34:07,732 48k INFO [2.4547312259674072, 2.510133743286133, 7.57282829284668, 15.834877014160156, 0.7404948472976685, 64200, 9.419964860823498e-05] 2023-01-23 02:35:01,180 48k INFO ====> Epoch: 476 2023-01-23 02:38:22,097 48k INFO ====> Epoch: 477 2023-01-23 02:40:02,357 48k INFO Train Epoch: 478 [4%] 2023-01-23 02:40:02,359 48k INFO [2.1692864894866943, 2.597463846206665, 9.827874183654785, 17.81121253967285, 0.606740415096283, 64400, 9.417610016795242e-05] 2023-01-23 02:41:57,687 48k INFO ====> Epoch: 478 2023-01-23 02:44:43,469 48k INFO Train Epoch: 479 [52%] 2023-01-23 02:44:43,470 48k INFO [2.3955013751983643, 2.411597967147827, 9.038177490234375, 18.39568328857422, 0.7376821637153625, 64600, 9.416432815543143e-05] 2023-01-23 02:45:41,236 48k INFO ====> Epoch: 479 2023-01-23 02:49:27,195 48k INFO ====> Epoch: 480 2023-01-23 02:51:37,147 48k INFO Train Epoch: 481 [0%] 2023-01-23 02:51:37,149 48k INFO [2.3897790908813477, 2.5631942749023438, 9.113100051879883, 19.897499084472656, 0.6526987552642822, 64800, 9.41407885447102e-05] 2023-01-23 02:53:36,957 48k INFO ====> Epoch: 481 2023-01-23 02:56:24,452 48k INFO Train Epoch: 482 [48%] 2023-01-23 02:56:24,453 48k INFO [2.348032236099243, 2.3229308128356934, 8.423750877380371, 16.210403442382812, 0.4674111604690552, 65000, 9.412902094614211e-05] 2023-01-23 02:56:53,003 48k INFO Saving model and optimizer state at iteration 482 to ./logs/48k/G_65000.pth 2023-01-23 02:56:56,517 48k INFO Saving model and 
optimizer state at iteration 482 to ./logs/48k/D_65000.pth 2023-01-23 02:57:59,544 48k INFO ====> Epoch: 482 2023-01-23 03:01:24,362 48k INFO Train Epoch: 483 [96%] 2023-01-23 03:01:24,363 48k INFO [2.241405487060547, 2.531721591949463, 9.409074783325195, 18.028432846069336, 0.5777483582496643, 65200, 9.411725481852385e-05] 2023-01-23 03:01:28,935 48k INFO ====> Epoch: 483 2023-01-23 03:05:23,277 48k INFO ====> Epoch: 484 2023-01-23 03:08:07,350 48k INFO Train Epoch: 485 [44%] 2023-01-23 03:08:07,351 48k INFO [2.314054012298584, 2.5191404819488525, 8.369872093200684, 15.345575332641602, 0.514629602432251, 65400, 9.409372697540131e-05] 2023-01-23 03:09:13,698 48k INFO ====> Epoch: 485 2023-01-23 03:12:43,418 48k INFO Train Epoch: 486 [93%] 2023-01-23 03:12:43,420 48k INFO [2.7442984580993652, 2.0573577880859375, 7.521128177642822, 13.543753623962402, 0.748651385307312, 65600, 9.408196525952938e-05] 2023-01-23 03:12:52,160 48k INFO ====> Epoch: 486 2023-01-23 03:16:30,863 48k INFO ====> Epoch: 487 2023-01-23 03:18:47,783 48k INFO Train Epoch: 488 [41%] 2023-01-23 03:18:47,784 48k INFO [2.2368924617767334, 2.4866597652435303, 9.987361907958984, 18.462007522583008, 0.6101175546646118, 65800, 9.405844623824521e-05] 2023-01-23 03:19:58,659 48k INFO ====> Epoch: 488 2023-01-23 03:23:02,247 48k INFO Train Epoch: 489 [89%] 2023-01-23 03:23:02,248 48k INFO [2.591611862182617, 2.164257049560547, 7.288784503936768, 16.24890899658203, 0.6817768216133118, 66000, 9.404668893246542e-05] 2023-01-23 03:23:32,115 48k INFO Saving model and optimizer state at iteration 489 to ./logs/48k/G_66000.pth 2023-01-23 03:23:35,925 48k INFO Saving model and optimizer state at iteration 489 to ./logs/48k/D_66000.pth 2023-01-23 03:23:50,907 48k INFO ====> Epoch: 489 2023-01-23 03:27:08,658 48k INFO ====> Epoch: 490 2023-01-23 03:29:38,226 48k INFO Train Epoch: 491 [37%] 2023-01-23 03:29:38,234 48k INFO [2.336954355239868, 2.380932331085205, 9.086402893066406, 19.221088409423828, 
0.8439057469367981, 66200, 9.402317872971181e-05] 2023-01-23 03:30:53,386 48k INFO ====> Epoch: 491 2023-01-23 03:34:14,754 48k INFO Train Epoch: 492 [85%] 2023-01-23 03:34:14,756 48k INFO [2.229433536529541, 2.39129638671875, 8.22130012512207, 17.659345626831055, 0.7189197540283203, 66400, 9.401142583237059e-05] 2023-01-23 03:34:32,465 48k INFO ====> Epoch: 492 2023-01-23 03:38:10,583 48k INFO ====> Epoch: 493 2023-01-23 03:40:20,566 48k INFO Train Epoch: 494 [33%] 2023-01-23 03:40:20,567 48k INFO [2.30479097366333, 2.601863145828247, 8.857730865478516, 18.17171859741211, 0.7724751234054565, 66600, 9.398792444484102e-05] 2023-01-23 03:41:40,486 48k INFO ====> Epoch: 494 2023-01-23 03:44:56,073 48k INFO Train Epoch: 495 [81%] 2023-01-23 03:44:56,075 48k INFO [2.2343363761901855, 2.5464093685150146, 7.585329055786133, 14.777770042419434, 0.4459614157676697, 66800, 9.397617595428541e-05] 2023-01-23 03:45:18,205 48k INFO ====> Epoch: 495 2023-01-23 03:48:37,254 48k INFO ====> Epoch: 496 2023-01-23 03:50:46,600 48k INFO Train Epoch: 497 [30%] 2023-01-23 03:50:46,601 48k INFO [2.2515339851379395, 2.563539505004883, 9.423137664794922, 19.225181579589844, 0.7892696857452393, 67000, 9.395268337867458e-05] 2023-01-23 03:51:04,872 48k INFO Saving model and optimizer state at iteration 497 to ./logs/48k/G_67000.pth 2023-01-23 03:51:08,765 48k INFO Saving model and optimizer state at iteration 497 to ./logs/48k/D_67000.pth 2023-01-23 03:52:36,378 48k INFO ====> Epoch: 497 2023-01-23 03:55:50,407 48k INFO Train Epoch: 498 [78%] 2023-01-23 03:55:50,408 48k INFO [2.207946538925171, 2.5758650302886963, 9.834657669067383, 18.78272247314453, 0.9229573011398315, 67200, 9.394093929325224e-05] 2023-01-23 03:56:17,010 48k INFO ====> Epoch: 498 2023-01-23 04:00:03,315 48k INFO ====> Epoch: 499 2023-01-23 04:02:20,967 48k INFO Train Epoch: 500 [26%] 2023-01-23 04:02:20,969 48k INFO [2.437617778778076, 2.546577215194702, 8.479930877685547, 16.95292091369629, 0.6588883399963379, 67400, 
9.39174555262561e-05] 2023-01-23 04:03:49,490 48k INFO ====> Epoch: 500 2023-01-23 04:07:14,609 48k INFO Train Epoch: 501 [74%] 2023-01-23 04:07:14,610 48k INFO [2.3738203048706055, 2.4791338443756104, 9.325396537780762, 18.01314353942871, 0.6414987444877625, 67600, 9.39057158443153e-05] 2023-01-23 04:07:45,536 48k INFO ====> Epoch: 501 2023-01-23 04:11:19,466 48k INFO ====> Epoch: 502 2023-01-23 04:13:18,276 48k INFO Train Epoch: 503 [22%] 2023-01-23 04:13:18,277 48k INFO [2.2449514865875244, 2.7135398387908936, 9.00589370727539, 17.360939025878906, 0.6913172602653503, 67800, 9.388224088263103e-05] 2023-01-23 04:14:51,079 48k INFO ====> Epoch: 503 2023-01-23 04:17:54,613 48k INFO Train Epoch: 504 [70%] 2023-01-23 04:17:54,615 48k INFO [2.4780049324035645, 2.4273104667663574, 7.033159255981445, 15.005268096923828, 0.5343117117881775, 68000, 9.38705056025207e-05] 2023-01-23 04:18:19,855 48k INFO Saving model and optimizer state at iteration 504 to ./logs/48k/G_68000.pth 2023-01-23 04:18:23,064 48k INFO Saving model and optimizer state at iteration 504 to ./logs/48k/D_68000.pth 2023-01-23 04:19:00,554 48k INFO ====> Epoch: 504 2023-01-23 04:22:36,617 48k INFO ====> Epoch: 505 2023-01-23 04:24:32,295 48k INFO Train Epoch: 506 [19%] 2023-01-23 04:24:32,394 48k INFO [2.792174816131592, 1.978759765625, 5.7947564125061035, 11.720703125, 0.7894920110702515, 68200, 9.384703944284672e-05] 2023-01-23 04:26:10,650 48k INFO ====> Epoch: 506 2023-01-23 04:29:26,640 48k INFO Train Epoch: 507 [67%] 2023-01-23 04:29:26,641 48k INFO [2.3609468936920166, 2.3993513584136963, 9.340760231018066, 17.154266357421875, 0.7342668175697327, 68400, 9.383530856291636e-05] 2023-01-23 04:30:06,425 48k INFO ====> Epoch: 507 2023-01-23 04:33:51,729 48k INFO ====> Epoch: 508 2023-01-23 04:35:50,299 48k INFO Train Epoch: 509 [15%] 2023-01-23 04:35:50,300 48k INFO [2.2937166690826416, 2.4604575634002686, 8.376726150512695, 17.816755294799805, 0.7440769672393799, 68600, 9.381185120195232e-05] 
2023-01-23 04:37:32,840 48k INFO ====> Epoch: 509 2023-01-23 04:40:12,531 48k INFO Train Epoch: 510 [63%] 2023-01-23 04:40:12,538 48k INFO [2.513740062713623, 2.4181125164031982, 5.764969348907471, 11.68647575378418, 0.5526787638664246, 68800, 9.380012472055207e-05] 2023-01-23 04:40:56,781 48k INFO ====> Epoch: 510 2023-01-23 04:44:23,575 48k INFO ====> Epoch: 511 2023-01-23 04:46:14,243 48k INFO Train Epoch: 512 [11%] 2023-01-23 04:46:14,244 48k INFO [2.247480869293213, 2.5991368293762207, 8.123600959777832, 13.356539726257324, 0.6554232239723206, 69000, 9.377667615499888e-05] 2023-01-23 04:46:28,279 48k INFO Saving model and optimizer state at iteration 512 to ./logs/48k/G_69000.pth 2023-01-23 04:46:31,354 48k INFO Saving model and optimizer state at iteration 512 to ./logs/48k/D_69000.pth 2023-01-23 04:48:19,357 48k INFO ====> Epoch: 512 2023-01-23 04:51:17,981 48k INFO Train Epoch: 513 [59%] 2023-01-23 04:51:17,982 48k INFO [2.4786553382873535, 2.396721839904785, 7.631348609924316, 15.291622161865234, 0.7242208123207092, 69200, 9.376495407047951e-05] 2023-01-23 04:52:06,636 48k INFO ====> Epoch: 513 2023-01-23 04:56:02,679 48k INFO ====> Epoch: 514 2023-01-23 04:57:52,541 48k INFO Train Epoch: 515 [7%] 2023-01-23 04:57:52,542 48k INFO [2.0901215076446533, 2.9044952392578125, 8.52492618560791, 17.741355895996094, 0.8403676152229309, 69400, 9.374151429703929e-05] 2023-01-23 04:59:43,478 48k INFO ====> Epoch: 515 2023-01-23 05:02:35,968 48k INFO Train Epoch: 516 [56%] 2023-01-23 05:02:35,974 48k INFO [2.3474578857421875, 2.459120035171509, 8.149603843688965, 15.909011840820312, 0.3790472447872162, 69600, 9.372979660775216e-05] 2023-01-23 05:03:29,217 48k INFO ====> Epoch: 516 2023-01-23 05:07:07,670 48k INFO ====> Epoch: 517 2023-01-23 05:08:23,168 48k INFO Train Epoch: 518 [4%] 2023-01-23 05:08:23,170 48k INFO [2.349428176879883, 2.305008888244629, 9.471578598022461, 18.863718032836914, 0.5991547703742981, 69800, 9.370636562312829e-05] 2023-01-23 05:10:18,980 48k 
INFO ====> Epoch: 518 2023-01-23 05:12:35,088 48k INFO Train Epoch: 519 [52%] 2023-01-23 05:12:35,090 48k INFO [2.3059561252593994, 2.3011035919189453, 9.124948501586914, 18.73409080505371, 0.9174631834030151, 70000, 9.36946523274254e-05] 2023-01-23 05:13:03,491 48k INFO Saving model and optimizer state at iteration 519 to ./logs/48k/G_70000.pth 2023-01-23 05:13:07,255 48k INFO Saving model and optimizer state at iteration 519 to ./logs/48k/D_70000.pth 2023-01-23 05:14:07,258 48k INFO ====> Epoch: 519 2023-01-23 05:17:37,166 48k INFO ====> Epoch: 520 2023-01-23 05:19:14,166 48k INFO Train Epoch: 521 [0%] 2023-01-23 05:19:14,167 48k INFO [2.391169786453247, 2.298710584640503, 7.874879360198975, 17.222780227661133, 0.7691891193389893, 70200, 9.367123012832248e-05] 2023-01-23 05:21:14,037 48k INFO ====> Epoch: 521 2023-01-23 05:23:30,549 48k INFO Train Epoch: 522 [48%] 2023-01-23 05:23:30,550 48k INFO [2.4699225425720215, 2.38809871673584, 7.038280963897705, 16.74960708618164, 0.467206746339798, 70400, 9.365952122455643e-05] 2023-01-23 05:24:32,578 48k INFO ====> Epoch: 522 2023-01-23 05:27:57,000 48k INFO Train Epoch: 523 [96%] 2023-01-23 05:27:57,002 48k INFO [2.2196826934814453, 2.603968620300293, 8.215780258178711, 19.168991088867188, 0.6022151112556458, 70600, 9.364781378440336e-05] 2023-01-23 05:28:01,344 48k INFO ====> Epoch: 523 2023-01-23 05:31:19,217 48k INFO ====> Epoch: 524 2023-01-23 05:33:33,015 48k INFO Train Epoch: 525 [44%] 2023-01-23 05:33:33,017 48k INFO [2.413567543029785, 2.460486650466919, 7.819330215454102, 16.708255767822266, 0.5553939938545227, 70800, 9.362440329420433e-05] 2023-01-23 05:34:39,891 48k INFO ====> Epoch: 525 2023-01-23 05:38:09,889 48k INFO Train Epoch: 526 [93%] 2023-01-23 05:38:09,890 48k INFO [2.5558125972747803, 2.0516128540039062, 6.010700225830078, 13.286131858825684, 0.523678183555603, 71000, 9.361270024379255e-05] 2023-01-23 05:38:36,260 48k INFO Saving model and optimizer state at iteration 526 to ./logs/48k/G_71000.pth 
2023-01-23 05:38:39,142 48k INFO Saving model and optimizer state at iteration 526 to ./logs/48k/D_71000.pth
2023-01-23 05:38:50,320 48k INFO ====> Epoch: 526
2023-01-23 05:42:00,626 48k INFO ====> Epoch: 527
2023-01-23 05:44:02,542 48k INFO Train Epoch: 528 [41%]
2023-01-23 05:44:02,543 48k INFO [2.4207816123962402, 2.3192920684814453, 8.085917472839355, 17.483858108520508, 0.5888242125511169, 71200, 9.358929853143005e-05]
2023-01-23 05:45:13,195 48k INFO ====> Epoch: 528
2023-01-23 05:48:19,428 48k INFO Train Epoch: 529 [89%]
2023-01-23 05:48:19,429 48k INFO [2.2477660179138184, 2.590639114379883, 9.059660911560059, 18.565515518188477, 0.6304738521575928, 71400, 9.357759986911361e-05]
2023-01-23 05:48:32,618 48k INFO ====> Epoch: 529
2023-01-23 05:51:55,033 48k INFO ====> Epoch: 530
2023-01-23 05:54:18,716 48k INFO Train Epoch: 531 [37%]
2023-01-23 05:54:18,717 48k INFO [2.259197473526001, 2.7553253173828125, 10.081747055053711, 19.390836715698242, 0.7929471135139465, 71600, 9.355420693129632e-05]
2023-01-23 05:55:34,148 48k INFO ====> Epoch: 531
2023-01-23 05:58:52,084 48k INFO Train Epoch: 532 [85%]
2023-01-23 05:58:52,085 48k INFO [2.34936785697937, 2.593371868133545, 9.750057220458984, 18.457355499267578, 0.47896596789360046, 71800, 9.35425126554299e-05]
2023-01-23 05:59:09,649 48k INFO ====> Epoch: 532
2023-01-23 06:02:38,362 48k INFO ====> Epoch: 533
2023-01-23 06:04:46,238 48k INFO Train Epoch: 534 [33%]
2023-01-23 06:04:46,240 48k INFO [2.465785503387451, 2.3006303310394287, 8.187129020690918, 17.370882034301758, 0.726603090763092, 72000, 9.351912848886779e-05]
2023-01-23 06:05:14,998 48k INFO Saving model and optimizer state at iteration 534 to ./logs/48k/G_72000.pth
2023-01-23 06:05:18,639 48k INFO Saving model and optimizer state at iteration 534 to ./logs/48k/D_72000.pth
2023-01-23 06:06:40,943 48k INFO ====> Epoch: 534
2023-01-23 06:09:36,812 48k INFO Train Epoch: 535 [81%]
2023-01-23 06:09:37,298 48k INFO [2.334657669067383, 2.3674590587615967, 7.939729690551758, 16.107160568237305, 0.8264689445495605, 72200, 9.350743859780667e-05]
2023-01-23 06:09:59,807 48k INFO ====> Epoch: 535
2023-01-23 06:13:37,977 48k INFO ====> Epoch: 536
2023-01-23 06:15:17,684 48k INFO Train Epoch: 537 [30%]
2023-01-23 06:15:17,685 48k INFO [2.3952043056488037, 2.4368011951446533, 8.693976402282715, 15.688526153564453, 0.665168046951294, 72400, 9.348406319921095e-05]
2023-01-23 06:16:41,819 48k INFO ====> Epoch: 537
2023-01-23 06:19:25,511 48k INFO Train Epoch: 538 [78%]
2023-01-23 06:19:25,512 48k INFO [2.2839550971984863, 2.526076316833496, 8.797012329101562, 17.412857055664062, 0.8043933510780334, 72600, 9.347237769131105e-05]
2023-01-23 06:19:52,160 48k INFO ====> Epoch: 538
2023-01-23 06:23:05,243 48k INFO ====> Epoch: 539
2023-01-23 06:25:08,027 48k INFO Train Epoch: 540 [26%]
2023-01-23 06:25:08,028 48k INFO [2.3396079540252686, 2.4294543266296387, 8.018697738647461, 16.36534881591797, 0.7491115927696228, 72800, 9.344901105739411e-05]
2023-01-23 06:26:36,757 48k INFO ====> Epoch: 540
2023-01-23 06:29:36,980 48k INFO Train Epoch: 541 [74%]
2023-01-23 06:29:36,982 48k INFO [2.3457224369049072, 2.595045328140259, 8.457950592041016, 17.99509620666504, 0.5498891472816467, 73000, 9.343732993101193e-05]
2023-01-23 06:30:05,868 48k INFO Saving model and optimizer state at iteration 541 to ./logs/48k/G_73000.pth
2023-01-23 06:30:09,397 48k INFO Saving model and optimizer state at iteration 541 to ./logs/48k/D_73000.pth
2023-01-23 06:30:42,402 48k INFO ====> Epoch: 541
2023-01-23 06:34:28,237 48k INFO ====> Epoch: 542
2023-01-23 06:36:13,223 48k INFO Train Epoch: 543 [22%]
2023-01-23 06:36:13,224 48k INFO [2.330172538757324, 2.5891201496124268, 8.219502449035645, 16.42380142211914, 0.7907260060310364, 73200, 9.341397205848746e-05]
2023-01-23 06:37:47,498 48k INFO ====> Epoch: 543
2023-01-23 06:40:45,680 48k INFO Train Epoch: 544 [70%]
2023-01-23 06:40:45,681 48k INFO [2.329918384552002, 2.450347423553467, 5.9266533851623535, 11.582715034484863, 0.5389988422393799, 73400, 9.340229531198015e-05]
2023-01-23 06:41:21,108 48k INFO ====> Epoch: 544
2023-01-23 06:44:37,578 48k INFO ====> Epoch: 545
2023-01-23 06:46:43,954 48k INFO Train Epoch: 546 [19%]
2023-01-23 06:46:43,960 48k INFO [2.170058012008667, 2.939147710800171, 7.199021816253662, 12.189772605895996, 0.36329540610313416, 73600, 9.337894619756301e-05]
2023-01-23 06:48:21,480 48k INFO ====> Epoch: 546
2023-01-23 06:50:48,477 48k INFO Train Epoch: 547 [67%]
2023-01-23 06:50:48,478 48k INFO [2.2676734924316406, 2.567783832550049, 10.414104461669922, 18.03912353515625, 0.7712804079055786, 73800, 9.336727382928831e-05]
2023-01-23 06:51:28,124 48k INFO ====> Epoch: 547
2023-01-23 06:55:06,325 48k INFO ====> Epoch: 548
2023-01-23 06:57:11,232 48k INFO Train Epoch: 549 [15%]
2023-01-23 06:57:11,234 48k INFO [2.284647226333618, 2.6116702556610107, 7.897822380065918, 16.129924774169922, 0.6521420478820801, 74000, 9.334393346969463e-05]
2023-01-23 06:57:36,009 48k INFO Saving model and optimizer state at iteration 549 to ./logs/48k/G_74000.pth
2023-01-23 06:57:39,329 48k INFO Saving model and optimizer state at iteration 549 to ./logs/48k/D_74000.pth
2023-01-23 06:59:23,300 48k INFO ====> Epoch: 549
2023-01-23 07:02:09,565 48k INFO Train Epoch: 550 [63%]
2023-01-23 07:02:09,567 48k INFO [2.562563419342041, 2.1167540550231934, 5.93740177154541, 13.063631057739258, 0.8302456736564636, 74200, 9.33322654780109e-05]
2023-01-23 07:02:53,762 48k INFO ====> Epoch: 550
2023-01-23 07:06:24,869 48k INFO ====> Epoch: 551
2023-01-23 07:08:06,869 48k INFO Train Epoch: 552 [11%]
2023-01-23 07:08:06,870 48k INFO [2.6706295013427734, 2.0981557369232178, 5.807929992675781, 12.750184059143066, 0.660175085067749, 74400, 9.330893386995804e-05]
2023-01-23 07:09:53,888 48k INFO ====> Epoch: 552
2023-01-23 07:12:19,864 48k INFO Train Epoch: 553 [59%]
2023-01-23 07:12:19,865 48k INFO [2.375337600708008, 2.560523509979248, 7.181933403015137, 14.025781631469727, 0.9454037547111511, 74600, 9.32972702532243e-05]
2023-01-23 07:13:08,498 48k INFO ====> Epoch: 553
2023-01-23 07:16:33,287 48k INFO ====> Epoch: 554
2023-01-23 07:18:12,027 48k INFO Train Epoch: 555 [7%]
2023-01-23 07:18:12,028 48k INFO [2.2871885299682617, 2.412287950515747, 8.505861282348633, 16.921663284301758, 0.6331974267959595, 74800, 9.327394739343082e-05]
2023-01-23 07:20:03,448 48k INFO ====> Epoch: 555
2023-01-23 07:23:00,761 48k INFO Train Epoch: 556 [56%]
2023-01-23 07:23:00,762 48k INFO [2.5349063873291016, 2.3069913387298584, 7.597678184509277, 16.98713493347168, 0.57831871509552, 75000, 9.326228815000664e-05]
2023-01-23 07:23:28,637 48k INFO Saving model and optimizer state at iteration 556 to ./logs/48k/G_75000.pth
2023-01-23 07:23:33,118 48k INFO Saving model and optimizer state at iteration 556 to ./logs/48k/D_75000.pth
2023-01-23 07:24:28,285 48k INFO ====> Epoch: 556
2023-01-23 07:27:58,262 48k INFO ====> Epoch: 557
2023-01-23 07:29:41,230 48k INFO Train Epoch: 558 [4%]
2023-01-23 07:29:41,232 48k INFO [2.4764013290405273, 2.2352867126464844, 8.54216480255127, 16.876190185546875, 0.5928930640220642, 75200, 9.323897403519238e-05]
2023-01-23 07:31:36,316 48k INFO ====> Epoch: 558
2023-01-23 07:34:06,635 48k INFO Train Epoch: 559 [52%]
2023-01-23 07:34:06,636 48k INFO [2.260918140411377, 2.3559529781341553, 9.802104949951172, 18.758684158325195, 0.56284099817276, 75400, 9.322731916343797e-05]
2023-01-23 07:35:04,328 48k INFO ====> Epoch: 559
2023-01-23 07:38:10,876 48k INFO ====> Epoch: 560
2023-01-23 07:39:19,536 48k INFO Train Epoch: 561 [0%]
2023-01-23 07:39:19,537 48k INFO [2.3466763496398926, 2.3197426795959473, 8.889543533325195, 16.620197296142578, 0.778913140296936, 75600, 9.320401379032397e-05]
2023-01-23 07:41:18,996 48k INFO ====> Epoch: 561
2023-01-23 07:43:48,415 48k INFO Train Epoch: 562 [48%]
2023-01-23 07:43:48,417 48k INFO [2.3352317810058594, 2.455029010772705, 7.023183345794678, 16.8964786529541, 1.0707758665084839, 75800, 9.319236328860017e-05]
2023-01-23 07:44:50,525 48k INFO ====> Epoch: 562
2023-01-23 07:48:23,156 48k INFO Train Epoch: 563 [96%]
2023-01-23 07:48:23,157 48k INFO [2.260845422744751, 2.819899559020996, 9.82557487487793, 19.687292098999023, 0.8775888085365295, 76000, 9.318071424318909e-05]
2023-01-23 07:48:51,376 48k INFO Saving model and optimizer state at iteration 563 to ./logs/48k/G_76000.pth
2023-01-23 07:48:55,330 48k INFO Saving model and optimizer state at iteration 563 to ./logs/48k/D_76000.pth
2023-01-23 07:49:01,749 48k INFO ====> Epoch: 563
2023-01-23 07:52:09,323 48k INFO ====> Epoch: 564
2023-01-23 07:54:15,032 48k INFO Train Epoch: 565 [44%]
2023-01-23 07:54:15,033 48k INFO [2.249080181121826, 2.559215784072876, 7.469959735870361, 14.57598876953125, 0.5874495506286621, 76200, 9.315742052057694e-05]
2023-01-23 07:55:21,168 48k INFO ====> Epoch: 565
2023-01-23 07:58:46,780 48k INFO Train Epoch: 566 [93%]
2023-01-23 07:58:46,781 48k INFO [2.5025992393493652, 2.2122788429260254, 7.982885360717773, 14.74073314666748, 0.7472951412200928, 76400, 9.314577584301187e-05]
2023-01-23 07:58:55,481 48k INFO ====> Epoch: 566
2023-01-23 08:02:31,360 48k INFO ====> Epoch: 567
2023-01-23 08:04:50,088 48k INFO Train Epoch: 568 [41%]
2023-01-23 08:04:50,090 48k INFO [2.279555559158325, 2.440805435180664, 9.013542175292969, 18.103361129760742, 0.6954818367958069, 76600, 9.312249085445385e-05]
2023-01-23 08:06:01,053 48k INFO ====> Epoch: 568
2023-01-23 08:09:28,966 48k INFO Train Epoch: 569 [89%]
2023-01-23 08:09:28,968 48k INFO [2.2280197143554688, 2.892700433731079, 7.713043689727783, 16.520448684692383, 0.6331877112388611, 76800, 9.311085054309703e-05]
2023-01-23 08:09:42,100 48k INFO ====> Epoch: 569
2023-01-23 08:13:09,807 48k INFO ====> Epoch: 570
2023-01-23 08:15:20,649 48k INFO Train Epoch: 571 [37%]
2023-01-23 08:15:20,650 48k INFO [2.299046754837036, 2.4935965538024902, 10.082944869995117, 18.427770614624023, 0.7462549209594727, 77000, 9.30875742853183e-05]
2023-01-23 08:15:49,041 48k INFO Saving model and optimizer state at iteration 571 to ./logs/48k/G_77000.pth
2023-01-23 08:15:52,304 48k INFO Saving model and optimizer state at iteration 571 to ./logs/48k/D_77000.pth
2023-01-23 08:17:09,495 48k INFO ====> Epoch: 571
2023-01-23 08:20:40,169 48k INFO Train Epoch: 572 [85%]
2023-01-23 08:20:40,171 48k INFO [2.2871062755584717, 2.4942123889923096, 9.397868156433105, 17.350862503051758, 0.9620381593704224, 77200, 9.307593833853263e-05]
2023-01-23 08:20:57,836 48k INFO ====> Epoch: 572
2023-01-23 08:24:14,298 48k INFO ====> Epoch: 573
2023-01-23 08:26:09,210 48k INFO Train Epoch: 574 [33%]
2023-01-23 08:26:09,212 48k INFO [2.256251811981201, 2.545170307159424, 9.29013729095459, 17.560558319091797, 0.8410231471061707, 77400, 9.305267080825953e-05]
2023-01-23 08:27:29,029 48k INFO ====> Epoch: 574
2023-01-23 08:30:15,163 48k INFO Train Epoch: 575 [81%]
2023-01-23 08:30:15,164 48k INFO [2.370176315307617, 2.435363531112671, 8.493626594543457, 17.78217887878418, 0.8170899152755737, 77600, 9.304103922440849e-05]
2023-01-23 08:30:37,092 48k INFO ====> Epoch: 575
2023-01-23 08:33:58,104 48k INFO ====> Epoch: 576
2023-01-23 08:36:02,021 48k INFO Train Epoch: 577 [30%]
2023-01-23 08:36:02,022 48k INFO [2.239204168319702, 2.547520399093628, 9.621316909790039, 19.087575912475586, 1.1108367443084717, 77800, 9.301778041836861e-05]
2023-01-23 08:37:25,892 48k INFO ====> Epoch: 577
2023-01-23 08:40:16,401 48k INFO Train Epoch: 578 [78%]
2023-01-23 08:40:16,411 48k INFO [2.218193769454956, 2.678682804107666, 10.40750503540039, 18.72446632385254, 1.0528074502944946, 78000, 9.300615319581631e-05]
2023-01-23 08:40:43,130 48k INFO Saving model and optimizer state at iteration 578 to ./logs/48k/G_78000.pth
2023-01-23 08:40:45,945 48k INFO Saving model and optimizer state at iteration 578 to ./logs/48k/D_78000.pth
2023-01-23 08:41:14,232 48k INFO ====> Epoch: 578
2023-01-23 08:44:26,549 48k INFO ====> Epoch: 579
2023-01-23 08:46:28,100 48k INFO Train Epoch: 580 [26%]
2023-01-23 08:46:28,102 48k INFO [2.378347396850586, 2.4849836826324463, 10.180054664611816, 16.343280792236328, 0.3610348105430603, 78200, 9.29829031107385e-05]
2023-01-23 08:47:56,577 48k INFO ====> Epoch: 580
2023-01-23 08:50:51,713 48k INFO Train Epoch: 581 [74%]
2023-01-23 08:50:51,715 48k INFO [2.3561880588531494, 2.415515661239624, 7.880939960479736, 15.963616371154785, 0.7567977905273438, 78400, 9.297128024784965e-05]
2023-01-23 08:51:22,498 48k INFO ====> Epoch: 581
2023-01-23 08:54:58,950 48k INFO ====> Epoch: 582
2023-01-23 08:56:58,358 48k INFO Train Epoch: 583 [22%]
2023-01-23 08:56:58,359 48k INFO [2.385441541671753, 2.3792145252227783, 8.17920207977295, 16.826969146728516, 0.9153580069541931, 78600, 9.294803888046393e-05]
2023-01-23 08:58:31,439 48k INFO ====> Epoch: 583
2023-01-23 09:01:17,124 48k INFO Train Epoch: 584 [70%]
2023-01-23 09:01:17,125 48k INFO [2.2009034156799316, 2.828523635864258, 9.102944374084473, 17.104888916015625, 0.7088967561721802, 78800, 9.293642037560387e-05]
2023-01-23 09:01:52,305 48k INFO ====> Epoch: 584
2023-01-23 09:05:23,953 48k INFO ====> Epoch: 585
2023-01-23 09:07:18,852 48k INFO Train Epoch: 586 [19%]
2023-01-23 09:07:18,853 48k INFO [2.526090383529663, 2.31742787361145, 7.489372730255127, 15.54841136932373, 0.9547833204269409, 79000, 9.291318772264153e-05]
2023-01-23 09:07:36,135 48k INFO Saving model and optimizer state at iteration 586 to ./logs/48k/G_79000.pth
2023-01-23 09:07:39,594 48k INFO Saving model and optimizer state at iteration 586 to ./logs/48k/D_79000.pth
2023-01-23 09:09:20,641 48k INFO ====> Epoch: 586
2023-01-23 09:12:11,036 48k INFO Train Epoch: 587 [67%]
2023-01-23 09:12:11,192 48k INFO [2.48903751373291, 2.3355085849761963, 8.752163887023926, 16.37012481689453, 0.6926398873329163, 79200, 9.29015735741762e-05]
2023-01-23 09:12:50,841 48k INFO ====> Epoch: 587
2023-01-23 09:16:25,243 48k INFO ====> Epoch: 588
2023-01-23 09:17:52,599 48k INFO Train Epoch: 589 [15%]
2023-01-23 09:17:52,600 48k INFO [2.4560537338256836, 2.3058207035064697, 7.657466411590576, 15.429879188537598, 0.9578604102134705, 79400, 9.287834963236974e-05]
2023-01-23 09:19:34,894 48k INFO ====> Epoch: 589
2023-01-23 09:22:07,851 48k INFO Train Epoch: 590 [63%]
2023-01-23 09:22:07,852 48k INFO [2.4067280292510986, 2.4139997959136963, 5.7375407218933105, 11.630255699157715, 0.17634831368923187, 79600, 9.286673983866569e-05]
2023-01-23 09:22:52,101 48k INFO ====> Epoch: 590
2023-01-23 09:25:58,239 48k INFO ====> Epoch: 591
2023-01-23 09:27:14,975 48k INFO Train Epoch: 592 [11%]
2023-01-23 09:27:15,017 48k INFO [2.5092360973358154, 2.212297201156616, 5.302035331726074, 14.694295883178711, 0.8002123832702637, 79800, 9.284352460474882e-05]
2023-01-23 09:29:00,953 48k INFO ====> Epoch: 592
2023-01-23 09:31:39,884 48k INFO Train Epoch: 593 [59%]
2023-01-23 09:31:39,885 48k INFO [2.284442663192749, 2.894716262817383, 7.415098190307617, 15.763818740844727, 0.5646786093711853, 80000, 9.283191916417322e-05]
2023-01-23 09:32:06,229 48k INFO Saving model and optimizer state at iteration 593 to ./logs/48k/G_80000.pth
2023-01-23 09:32:10,438 48k INFO Saving model and optimizer state at iteration 593 to ./logs/48k/D_80000.pth
2023-01-23 09:33:00,789 48k INFO ====> Epoch: 593
2023-01-23 09:36:17,322 48k INFO ====> Epoch: 594
2023-01-23 09:37:57,267 48k INFO Train Epoch: 595 [7%]
2023-01-23 09:37:57,268 48k INFO [2.325518846511841, 2.4845781326293945, 8.73952579498291, 17.72337532043457, 0.39521557092666626, 80200, 9.28087126348809e-05]
2023-01-23 09:39:47,756 48k INFO ====> Epoch: 595
2023-01-23 09:47:43,266 48k INFO Train Epoch: 596 [56%]
2023-01-23 09:47:43,267 48k INFO [2.458754301071167, 2.4023544788360596, 8.908976554870605, 18.174755096435547, 0.6748313903808594, 80400, 9.279711154580154e-05]
2023-01-23 09:48:36,583 48k INFO ====> Epoch: 596
2023-01-23 09:54:44,340 48k INFO ====> Epoch: 597
2023-01-23 09:56:05,823 48k INFO Train Epoch: 598 [4%]
2023-01-23 09:56:05,828 48k INFO [2.306227207183838, 2.5555543899536133, 8.862488746643066, 17.059858322143555, 0.5011264085769653, 80600, 9.277391371786995e-05]
2023-01-23 09:58:00,949 48k INFO ====> Epoch: 598
2023-01-23 10:00:27,351 48k INFO Train Epoch: 599 [52%]
2023-01-23 10:00:27,353 48k INFO [2.3713431358337402, 2.324467658996582, 9.048022270202637, 17.33318519592285, 1.0491530895233154, 80800, 9.276231697865521e-05]
2023-01-23 10:01:24,983 48k INFO ====> Epoch: 599
2023-01-23 10:05:13,570 48k INFO ====> Epoch: 600
2023-01-23 10:06:43,634 48k INFO Train Epoch: 601 [0%]
2023-01-23 10:06:43,635 48k INFO [2.2902629375457764, 2.66019344329834, 7.314599990844727, 15.387389183044434, 0.5805667042732239, 81000, 9.273912784882175e-05]
2023-01-23 10:06:55,166 48k INFO Saving model and optimizer state at iteration 601 to ./logs/48k/G_81000.pth
2023-01-23 10:06:58,194 48k INFO Saving model and optimizer state at iteration 601 to ./logs/48k/D_81000.pth
2023-01-23 10:09:01,248 48k INFO ====> Epoch: 601
2023-01-23 10:11:49,446 48k INFO Train Epoch: 602 [48%]
2023-01-23 10:11:49,447 48k INFO [2.120929718017578, 2.594834327697754, 9.416112899780273, 17.658924102783203, 0.6664719581604004, 81200, 9.272753545784065e-05]
2023-01-23 10:12:51,401 48k INFO ====> Epoch: 602
2023-01-23 10:16:38,749 48k INFO Train Epoch: 603 [96%]
2023-01-23 10:16:38,750 48k INFO [2.2888693809509277, 2.672717571258545, 9.259673118591309, 19.07910919189453, 0.7934699058532715, 81400, 9.27159445159084e-05]
2023-01-23 10:16:43,075 48k INFO ====> Epoch: 603
2023-01-23 10:20:26,861 48k INFO ====> Epoch: 604
2023-01-23 10:22:52,742 48k INFO Train Epoch: 605 [44%]
2023-01-23 10:22:52,744 48k INFO [2.1657214164733887, 2.8009016513824463, 9.758602142333984, 16.538991928100586, 0.5339023470878601, 81600, 9.269276697846605e-05]
2023-01-23 10:24:01,717 48k INFO ====> Epoch: 605
2023-01-23 10:27:54,724 48k INFO Train Epoch: 606 [93%]
2023-01-23 10:27:54,725 48k INFO [2.356616973876953, 2.419138193130493, 10.1941556930542, 17.311622619628906, 0.6290832161903381, 81800, 9.268118038259374e-05]
2023-01-23 10:28:03,468 48k INFO ====> Epoch: 606
2023-01-23 10:31:41,886 48k INFO ====> Epoch: 607
2023-01-23 10:34:15,781 48k INFO Train Epoch: 608 [41%]
2023-01-23 10:34:15,782 48k INFO [2.2607390880584717, 2.3102474212646484, 9.26540470123291, 18.3974609375, 0.6629650592803955, 82000, 9.265801153564152e-05]
2023-01-23 10:34:44,081 48k INFO Saving model and optimizer state at iteration 608 to ./logs/48k/G_82000.pth
2023-01-23 10:34:47,408 48k INFO Saving model and optimizer state at iteration 608 to ./logs/48k/D_82000.pth
2023-01-23 10:36:00,808 48k INFO ====> Epoch: 608
2023-01-23 10:39:49,608 48k INFO Train Epoch: 609 [89%]
2023-01-23 10:39:49,610 48k INFO [2.394526481628418, 2.5951595306396484, 6.737144470214844, 13.23776912689209, 0.7622970342636108, 82200, 9.264642928419956e-05]
2023-01-23 10:40:02,894 48k INFO ====> Epoch: 609
2023-01-23 10:44:01,363 48k INFO ====> Epoch: 610
2023-01-23 10:45:59,385 48k INFO Train Epoch: 611 [37%]
2023-01-23 10:45:59,387 48k INFO [2.4009838104248047, 2.6458544731140137, 7.874640941619873, 15.9624605178833, 0.45186084508895874, 82400, 9.262326912447895e-05]
2023-01-23 10:47:15,171 48k INFO ====> Epoch: 611
2023-01-23 10:50:30,862 48k INFO Train Epoch: 612 [85%]
2023-01-23 10:50:30,863 48k INFO [2.327263355255127, 2.4288601875305176, 7.984612941741943, 15.039701461791992, 0.7114595770835876, 82600, 9.261169121583839e-05]
2023-01-23 10:50:48,873 48k INFO ====> Epoch: 612
2023-01-23 10:54:45,857 48k INFO ====> Epoch: 613
2023-01-23 10:57:16,182 48k INFO Train Epoch: 614 [33%]
2023-01-23 10:57:16,184 48k INFO [2.233272075653076, 2.565967082977295, 9.913457870483398, 18.056718826293945, 0.7459328174591064, 82800, 9.25885397400921e-05]
2023-01-23 10:58:36,002 48k INFO ====> Epoch: 614
2023-01-23 11:02:07,590 48k INFO Train Epoch: 615 [81%]
2023-01-23 11:02:07,591 48k INFO [2.4625437259674072, 2.256674289703369, 7.244801044464111, 16.27364158630371, 0.49542298913002014, 83000, 9.257696617262459e-05]
2023-01-23 11:02:36,949 48k INFO Saving model and optimizer state at iteration 615 to ./logs/48k/G_83000.pth
2023-01-23 11:02:39,467 48k INFO Saving model and optimizer state at iteration 615 to ./logs/48k/D_83000.pth
2023-01-23 11:03:03,972 48k INFO ====> Epoch: 615
2023-01-23 11:06:45,079 48k INFO ====> Epoch: 616
2023-01-23 11:09:19,615 48k INFO Train Epoch: 617 [30%]
2023-01-23 11:09:19,616 48k INFO [2.280289649963379, 2.5142030715942383, 9.423628807067871, 17.162843704223633, 0.7757919430732727, 83200, 9.255382337759651e-05]
2023-01-23 11:10:43,998 48k INFO ====> Epoch: 617
2023-01-23 11:14:26,804 48k INFO Train Epoch: 618 [78%]
2023-01-23 11:14:26,805 48k INFO [2.252939462661743, 2.6416707038879395, 9.928266525268555, 18.81395721435547, 0.7950023412704468, 83400, 9.254225414967431e-05]
2023-01-23 11:14:53,182 48k INFO ====> Epoch: 618
2023-01-23 11:18:40,028 48k INFO ====> Epoch: 619
2023-01-23 11:21:04,414 48k INFO Train Epoch: 620 [26%]
2023-01-23 11:21:04,415 48k INFO [2.330713987350464, 2.4435784816741943, 9.328523635864258, 16.41358184814453, 0.5452368259429932, 83600, 9.25191200321096e-05]
2023-01-23 11:22:32,529 48k INFO ====> Epoch: 620
2023-01-23 11:26:07,083 48k INFO Train Epoch: 621 [74%]
2023-01-23 11:26:07,084 48k INFO [2.465080738067627, 2.477177858352661, 9.134875297546387, 16.464344024658203, 0.6981496810913086, 83800, 9.250755514210558e-05]
2023-01-23 11:26:38,207 48k INFO ====> Epoch: 621
2023-01-23 11:30:48,111 48k INFO ====> Epoch: 622
2023-01-23 11:33:08,029 48k INFO Train Epoch: 623 [22%]
2023-01-23 11:33:08,030 48k INFO [2.281447410583496, 2.502004861831665, 8.606794357299805, 16.007204055786133, 0.8194517493247986, 84000, 9.24844296987506e-05]
2023-01-23 11:33:55,568 48k INFO Saving model and optimizer state at iteration 623 to ./logs/48k/G_84000.pth
2023-01-23 11:33:59,023 48k INFO Saving model and optimizer state at iteration 623 to ./logs/48k/D_84000.pth
2023-01-23 11:35:35,305 48k INFO ====> Epoch: 623
2023-01-23 11:39:33,658 48k INFO Train Epoch: 624 [70%]
2023-01-23 11:39:33,659 48k INFO [2.401902198791504, 2.3257033824920654, 7.123854160308838, 14.747583389282227, 0.5908711552619934, 84200, 9.247286914503825e-05]
2023-01-23 11:40:09,081 48k INFO ====> Epoch: 624
2023-01-23 11:44:31,373 48k INFO ====> Epoch: 625
2023-01-23 11:47:09,292 48k INFO Train Epoch: 626 [19%]
2023-01-23 11:47:09,298 48k INFO [2.5194787979125977, 2.1450979709625244, 7.368860721588135, 14.262621879577637, 0.6622946858406067, 84400, 9.244975237264057e-05]
2023-01-23 11:48:46,903 48k INFO ====> Epoch: 626
2023-01-23 11:52:29,563 48k INFO Train Epoch: 627 [67%]
2023-01-23 11:52:29,564 48k INFO [2.2619121074676514, 2.5718159675598145, 9.377521514892578, 18.50860023498535, 0.6075581908226013, 84600, 9.243819615359399e-05]
2023-01-23 11:53:09,519 48k INFO ====> Epoch: 627
2023-01-23 11:57:23,810 48k INFO ====> Epoch: 628
2023-01-23 12:00:04,897 48k INFO Train Epoch: 629 [15%]
2023-01-23 12:00:04,898 48k INFO [2.4113378524780273, 2.133976936340332, 7.6676740646362305, 14.943703651428223, 0.6422217488288879, 84800, 9.24150880489024e-05]
2023-01-23 12:01:47,129 48k INFO ====> Epoch: 629
2023-01-23 12:05:07,140 48k INFO Train Epoch: 630 [63%]
2023-01-23 12:05:07,147 48k INFO [2.53289532661438, 2.2660720348358154, 5.899055480957031, 12.31200885772705, 0.5604547262191772, 85000, 9.240353616289628e-05]
2023-01-23 12:05:29,913 48k INFO Saving model and optimizer state at iteration 630 to ./logs/48k/G_85000.pth
2023-01-23 12:05:33,493 48k INFO Saving model and optimizer state at iteration 630 to ./logs/48k/D_85000.pth
2023-01-23 12:06:19,972 48k INFO ====> Epoch: 630
2023-01-23 12:09:50,763 48k INFO ====> Epoch: 631
2023-01-23 12:11:30,252 48k INFO Train Epoch: 632 [11%]
2023-01-23 12:11:30,253 48k INFO [2.566802740097046, 2.3807997703552246, 5.538755893707275, 13.934245109558105, 0.8906274437904358, 85200, 9.23804367226608e-05]
2023-01-23 12:13:16,623 48k INFO ====> Epoch: 632
2023-01-23 12:15:42,853 48k INFO Train Epoch: 633 [59%]
2023-01-23 12:15:42,854 48k INFO [2.4868736267089844, 2.628262996673584, 6.566684246063232, 16.536779403686523, 0.6331160068511963, 85400, 9.236888916807045e-05]
2023-01-23 12:16:31,393 48k INFO ====> Epoch: 633
2023-01-23 12:19:39,866 48k INFO ====> Epoch: 634
2023-01-23 12:21:23,304 48k INFO Train Epoch: 635 [7%]
2023-01-23 12:21:23,305 48k INFO [2.3033621311187744, 2.577363967895508, 9.784289360046387, 18.88309669494629, 0.6458785533905029, 85600, 9.234579838904232e-05]
2023-01-23 12:23:13,841 48k INFO ====> Epoch: 635
2023-01-23 12:26:31,338 48k INFO Train Epoch: 636 [56%]
2023-01-23 12:26:31,339 48k INFO [2.3117527961730957, 2.541654586791992, 8.412480354309082, 17.84021759033203, 0.6069367527961731, 85800, 9.233425516424368e-05]
2023-01-23 12:27:24,569 48k INFO ====> Epoch: 636
2023-01-23 12:30:37,244 48k INFO ====> Epoch: 637
2023-01-23 12:31:47,999 48k INFO Train Epoch: 638 [4%]
2023-01-23 12:31:48,000 48k INFO [2.3077545166015625, 2.347388744354248, 10.04366683959961, 18.771944046020508, 1.0110491514205933, 86000, 9.231117304317535e-05]
2023-01-23 12:32:10,599 48k INFO Saving model and optimizer state at iteration 638 to ./logs/48k/G_86000.pth
2023-01-23 12:32:15,723 48k INFO Saving model and optimizer state at iteration 638 to ./logs/48k/D_86000.pth
2023-01-23 12:34:13,608 48k INFO ====> Epoch: 638
2023-01-23 12:36:39,149 48k INFO Train Epoch: 639 [52%]
2023-01-23 12:36:39,150 48k INFO [2.3034706115722656, 2.473806858062744, 8.806417465209961, 18.38829231262207, 0.7272040247917175, 86200, 9.229963414654495e-05]
2023-01-23 12:37:36,721 48k INFO ====> Epoch: 639
2023-01-23 12:41:01,677 48k INFO ====> Epoch: 640
2023-01-23 12:42:04,175 48k INFO Train Epoch: 641 [0%]
2023-01-23 12:42:04,176 48k INFO [2.448392629623413, 2.3364408016204834, 8.66163158416748, 17.695932388305664, 0.6137610673904419, 86400, 9.22765606801901e-05]
2023-01-23 12:44:03,364 48k INFO ====> Epoch: 641
2023-01-23 12:46:06,254 48k INFO Train Epoch: 642 [48%]
2023-01-23 12:46:06,255 48k INFO [2.5083346366882324, 2.333986520767212, 5.9104485511779785, 12.765799522399902, 0.8800625205039978, 86600, 9.226502611010507e-05]
2023-01-23 12:47:08,526 48k INFO ====> Epoch: 642
2023-01-23 12:50:28,705 48k INFO Train Epoch: 643 [96%]
2023-01-23 12:50:28,707 48k INFO [2.307321786880493, 2.554363250732422, 9.555883407592773, 18.9908504486084, 0.4043273627758026, 86800, 9.22534929818413e-05]
2023-01-23 12:50:33,112 48k INFO ====> Epoch: 643
2023-01-23 12:53:39,632 48k INFO ====> Epoch: 644
2023-01-23 12:55:47,548 48k INFO Train Epoch: 645 [44%]
2023-01-23 12:55:47,549 48k INFO [2.030200481414795, 3.689149856567383, 8.897171974182129, 13.396315574645996, 0.5087249279022217, 87000, 9.223043105005667e-05]
2023-01-23 12:56:11,993 48k INFO Saving model and optimizer state at iteration 645 to ./logs/48k/G_87000.pth
2023-01-23 12:56:15,289 48k INFO Saving model and optimizer state at iteration 645 to ./logs/48k/D_87000.pth
2023-01-23 12:57:23,245 48k INFO ====> Epoch: 645
2023-01-23 13:00:26,360 48k INFO Train Epoch: 646 [93%]
2023-01-23 13:00:26,361 48k INFO [2.503373861312866, 2.3530757427215576, 7.709852695465088, 14.199324607849121, 0.719214677810669, 87200, 9.221890224617541e-05]
2023-01-23 13:00:35,082 48k INFO ====> Epoch: 646
2023-01-23 13:03:41,372 48k INFO ====> Epoch: 647
2023-01-23 13:05:45,284 48k INFO Train Epoch: 648 [41%]
2023-01-23 13:05:45,285 48k INFO [2.3297483921051025, 2.400326728820801, 9.339561462402344, 18.135873794555664, 0.6591439843177795, 87400, 9.21958489615342e-05]
2023-01-23 13:06:56,102 48k INFO ====> Epoch: 648
2023-01-23 13:09:52,562 48k INFO Train Epoch: 649 [89%]
2023-01-23 13:09:52,563 48k INFO [2.203615427017212, 2.6745376586914062, 9.407502174377441, 16.500959396362305, 0.4853845238685608, 87600, 9.218432448041401e-05]
2023-01-23 13:10:05,908 48k INFO ====> Epoch: 649
2023-01-23 13:13:10,969 48k INFO ====> Epoch: 650
2023-01-23 13:15:20,475 48k INFO Train Epoch: 651 [37%]
2023-01-23 13:15:20,477 48k INFO [2.319074869155884, 2.5735580921173096, 8.80582046508789, 17.370121002197266, 0.5249329805374146, 87800, 9.216127983967398e-05]
2023-01-23 13:16:35,516 48k INFO ====> Epoch: 651
2023-01-23 13:19:26,716 48k INFO Train Epoch: 652 [85%]
2023-01-23 13:19:26,717 48k INFO [2.413025379180908, 2.322674036026001, 7.0642266273498535, 15.272393226623535, 0.6074795126914978, 88000, 9.214975967969402e-05]
2023-01-23 13:20:13,098 48k INFO Saving model and optimizer state at iteration 652 to ./logs/48k/G_88000.pth
2023-01-23 13:20:15,608 48k INFO Saving model and optimizer state at iteration 652 to ./logs/48k/D_88000.pth
2023-01-23 13:20:35,186 48k INFO ====> Epoch: 652
2023-01-23 13:24:05,514 48k INFO ====> Epoch: 653
2023-01-23 13:25:51,692 48k INFO Train Epoch: 654 [33%]
2023-01-23 13:25:51,694 48k INFO [2.2720584869384766, 2.6400303840637207, 10.407285690307617, 18.18247413635254, 0.8616445064544678, 88200, 9.212672367961408e-05]
2023-01-23 13:27:11,219 48k INFO ====> Epoch: 654
2023-01-23 13:30:01,691 48k INFO Train Epoch: 655 [81%]
2023-01-23 13:30:01,692 48k INFO [2.2425336837768555, 2.4033803939819336, 8.104988098144531, 14.639413833618164, 0.7983376979827881, 88400, 9.211520783915413e-05]
2023-01-23 13:30:24,068 48k INFO ====> Epoch: 655
2023-01-23 13:33:48,593 48k INFO ====> Epoch: 656
2023-01-23 13:35:51,934 48k INFO Train Epoch: 657 [30%]
2023-01-23 13:35:51,936 48k INFO [2.286266565322876, 2.5105576515197754, 8.431275367736816, 14.967750549316406, 0.4286090135574341, 88600, 9.209218047649445e-05]
2023-01-23 13:37:16,269 48k INFO ====> Epoch: 657
2023-01-23 13:40:06,743 48k INFO Train Epoch: 658 [78%]
2023-01-23 13:40:06,750 48k INFO [2.262540340423584, 2.6764657497406006, 9.515693664550781, 17.725317001342773, 0.5965756177902222, 88800, 9.208066895393489e-05]
2023-01-23 13:40:33,205 48k INFO ====> Epoch: 658
2023-01-23 13:43:37,458 48k INFO ====> Epoch: 659
2023-01-23 13:45:12,577 48k INFO Train Epoch: 660 [26%]
2023-01-23 13:45:12,579 48k INFO [2.4291131496429443, 2.539032459259033, 8.754891395568848, 16.244508743286133, 0.8218885064125061, 89000, 9.205765022545685e-05]
2023-01-23 13:45:34,910 48k INFO Saving model and optimizer state at iteration 660 to ./logs/48k/G_89000.pth
2023-01-23 13:45:38,161 48k INFO Saving model and optimizer state at iteration 660 to ./logs/48k/D_89000.pth
2023-01-23 13:47:09,203 48k INFO ====> Epoch: 660
2023-01-23 13:49:40,391 48k INFO Train Epoch: 661 [74%]
2023-01-23 13:49:40,744 48k INFO [2.372612237930298, 2.252922534942627, 6.7332963943481445, 14.210121154785156, 0.4247722327709198, 89200, 9.204614301917867e-05]
2023-01-23 13:50:11,852 48k INFO ====> Epoch: 661
2023-01-23 13:53:41,128 48k INFO ====> Epoch: 662
2023-01-23 13:55:26,383 48k INFO Train Epoch: 663 [22%]
2023-01-23 13:55:26,388 48k INFO [2.2618978023529053, 2.5745720863342285, 9.383424758911133, 18.767181396484375, 0.8378880620002747, 89400, 9.202313292164485e-05]
2023-01-23 13:56:59,346 48k INFO ====> Epoch: 663
2023-01-23 13:59:43,463 48k INFO Train Epoch: 664 [70%]
2023-01-23 13:59:43,464 48k INFO [2.5094504356384277, 2.284512519836426, 6.434460163116455, 13.793341636657715, 0.6798518896102905, 89600, 9.201163003002964e-05]
2023-01-23 14:00:18,748 48k INFO ====> Epoch: 664
2023-01-23 14:03:44,468 48k INFO ====> Epoch: 665
2023-01-23 14:05:24,645 48k INFO Train Epoch: 666 [19%]
2023-01-23 14:05:24,646 48k INFO [2.5879433155059814, 2.173961877822876, 6.476892471313477, 14.379911422729492, 0.6830874681472778, 89800, 9.198862856020383e-05]
2023-01-23 14:07:09,186 48k INFO ====> Epoch: 666
2023-01-23 14:09:44,568 48k INFO Train Epoch: 667 [67%]
2023-01-23 14:09:44,573 48k INFO [2.221569538116455, 2.5455260276794434, 9.3875150680542, 18.626310348510742, 0.704677164554596, 90000, 9.19771299816338e-05]
2023-01-23 14:10:09,267 48k INFO Saving model and optimizer state at iteration 667 to ./logs/48k/G_90000.pth
2023-01-23 14:10:13,099 48k INFO Saving model and optimizer state at iteration 667 to ./logs/48k/D_90000.pth
2023-01-23 14:10:55,272 48k INFO ====> Epoch: 667
2023-01-23 14:14:15,306 48k INFO ====> Epoch: 668
2023-01-23 14:15:40,414 48k INFO Train Epoch: 669 [15%]
2023-01-23 14:15:40,431 48k INFO [2.3661084175109863, 2.4566454887390137, 7.640926837921143, 16.941137313842773, 0.7208341956138611, 90200, 9.195413713628104e-05]
2023-01-23 14:17:22,674 48k INFO ====> Epoch: 669
2023-01-23 14:20:01,080 48k INFO Train Epoch: 670 [63%]
2023-01-23 14:20:01,081 48k INFO [2.4071245193481445, 2.41774845123291, 6.338243007659912, 12.75311279296875, 0.23847471177577972, 90400, 9.194264286913901e-05]
2023-01-23 14:20:45,258 48k INFO ====> Epoch: 670
2023-01-23 14:24:23,282 48k INFO ====> Epoch: 671
2023-01-23 14:25:43,323 48k INFO Train Epoch: 672 [11%]
2023-01-23 14:25:43,324 48k INFO [2.365208625793457, 2.429495334625244, 6.957775115966797, 15.659441947937012, 0.7415834069252014, 90600, 9.191965864502551e-05]
2023-01-23 14:27:29,929 48k INFO ====> Epoch: 672
2023-01-23 14:29:59,083 48k INFO Train Epoch: 673 [59%]
2023-01-23 14:29:59,089 48k INFO [2.2973029613494873, 2.634345054626465, 8.438095092773438, 15.495898246765137, 0.7165843844413757, 90800, 9.190816868769488e-05]
2023-01-23 14:30:47,705 48k INFO ====> Epoch: 673
2023-01-23 14:34:00,104 48k INFO ====> Epoch: 674
2023-01-23 14:35:50,667 48k INFO Train Epoch: 675 [7%]
2023-01-23 14:35:50,668 48k INFO [2.4199812412261963, 2.625131130218506, 8.837600708007812, 15.795662879943848, 0.9739019870758057, 91000, 9.188519308158808e-05]
2023-01-23 14:36:11,381 48k INFO Saving model and optimizer state at iteration 675 to ./logs/48k/G_91000.pth
2023-01-23 14:36:14,664 48k INFO Saving model and optimizer state at iteration 675 to ./logs/48k/D_91000.pth
2023-01-23 14:38:10,341 48k INFO ====> Epoch: 675
2023-01-23 14:41:14,991 48k INFO Train Epoch: 676 [56%]
2023-01-23 14:41:14,992 48k INFO [2.2650890350341797, 2.5563158988952637, 8.704974174499512, 16.164230346679688, 0.7905020117759705, 91200, 9.187370743245287e-05]
2023-01-23 14:42:08,095 48k INFO ====> Epoch: 676
2023-01-23 14:45:42,133 48k INFO ====> Epoch: 677
2023-01-23 14:47:25,986 48k INFO Train Epoch: 678 [4%]
2023-01-23 14:47:25,987 48k INFO [2.3577237129211426, 2.533660411834717, 8.930459976196289, 16.319442749023438, 0.6588347554206848, 91400, 9.185074044112143e-05]
2023-01-23 14:49:21,405 48k INFO ====> Epoch: 678
2023-01-23 14:51:33,182 48k INFO Train Epoch: 679 [52%]
2023-01-23 14:51:33,183 48k INFO [2.3559813499450684, 2.3830463886260986, 9.294419288635254, 18.223573684692383, 0.6694890856742859, 91600, 9.183925909856629e-05]
2023-01-23 14:52:30,402 48k INFO ====> Epoch: 679
2023-01-23 14:55:51,630 48k INFO ====> Epoch: 680
2023-01-23 14:57:20,027 48k INFO Train Epoch: 681 [0%]
2023-01-23 14:57:20,028 48k INFO [2.5543010234832764, 2.3302440643310547, 7.266804218292236, 14.586856842041016, 0.5788456797599792, 91800, 9.181630071878007e-05]
2023-01-23 14:59:19,621 48k INFO ====> Epoch: 681
2023-01-23 15:01:41,002 48k INFO Train Epoch: 682 [48%]
2023-01-23 15:01:41,003 48k INFO [2.2725534439086914, 2.514932155609131, 9.626895904541016, 18.32183837890625, 0.8567736744880676, 92000, 9.180482368119022e-05]
2023-01-23 15:02:03,443 48k INFO Saving model and optimizer state at iteration 682 to ./logs/48k/G_92000.pth
2023-01-23 15:02:07,784 48k INFO Saving model and optimizer state at iteration 682 to ./logs/48k/D_92000.pth
2023-01-23 15:03:13,537 48k INFO ====> Epoch: 682
2023-01-23 15:06:35,126 48k INFO Train Epoch: 683 [96%]
2023-01-23 15:06:35,128 48k INFO [2.2784862518310547, 2.7096762657165527, 9.165365219116211, 18.17491912841797, 0.8631377816200256, 92200, 9.179334807823006e-05]
2023-01-23 15:06:39,342 48k INFO ====> Epoch: 683
2023-01-23 15:09:46,247 48k INFO ====> Epoch: 684
2023-01-23 15:12:15,604 48k INFO Train Epoch: 685 [44%]
2023-01-23 15:12:15,605 48k INFO [2.6750905513763428, 2.2956862449645996, 6.515520095825195, 13.413148880004883, 0.33466899394989014, 92400, 9.177040117548157e-05]
2023-01-23 15:13:21,734 48k INFO ====> Epoch: 685 2023-01-23 15:16:28,194 48k INFO Train Epoch: 686 [93%] 2023-01-23 15:16:28,195 48k INFO [2.476912498474121, 2.323096752166748, 7.453498363494873, 15.133732795715332, 0.8782281875610352, 92600, 9.175892987533463e-05] 2023-01-23 15:16:37,037 48k INFO ====> Epoch: 686 2023-01-23 15:19:42,350 48k INFO ====> Epoch: 687 2023-01-23 15:21:33,647 48k INFO Train Epoch: 688 [41%] 2023-01-23 15:21:33,648 48k INFO [2.3036961555480957, 2.4185056686401367, 8.856172561645508, 16.794601440429688, 0.5078306198120117, 92800, 9.173599157659907e-05] 2023-01-23 15:22:44,412 48k INFO ====> Epoch: 688 2023-01-23 15:25:46,193 48k INFO Train Epoch: 689 [89%] 2023-01-23 15:25:46,303 48k INFO [2.2456166744232178, 2.5809035301208496, 8.885001182556152, 16.810626983642578, 0.6562636494636536, 93000, 9.172452457765199e-05] 2023-01-23 15:26:16,494 48k INFO Saving model and optimizer state at iteration 689 to ./logs/48k/G_93000.pth 2023-01-23 15:26:19,296 48k INFO Saving model and optimizer state at iteration 689 to ./logs/48k/D_93000.pth 2023-01-23 15:26:34,487 48k INFO ====> Epoch: 689 2023-01-23 15:29:46,219 48k INFO ====> Epoch: 690 2023-01-23 15:31:39,312 48k INFO Train Epoch: 691 [37%] 2023-01-23 15:31:39,313 48k INFO [2.3286421298980713, 2.452043056488037, 9.014930725097656, 17.73033905029297, 0.31700393557548523, 93200, 9.170159487970326e-05] 2023-01-23 15:32:54,912 48k INFO ====> Epoch: 691 2023-01-23 15:36:06,378 48k INFO Train Epoch: 692 [85%] 2023-01-23 15:36:06,379 48k INFO [2.203701972961426, 2.5992093086242676, 10.308894157409668, 18.300077438354492, 0.7520074248313904, 93400, 9.169013218034329e-05] 2023-01-23 15:36:23,924 48k INFO ====> Epoch: 692 2023-01-23 15:39:46,387 48k INFO ====> Epoch: 693 2023-01-23 15:41:32,943 48k INFO Train Epoch: 694 [33%] 2023-01-23 15:41:32,950 48k INFO [2.286574602127075, 2.5418381690979004, 9.841751098632812, 18.389131546020508, 0.7673541903495789, 93600, 9.166721107995651e-05] 2023-01-23 
15:42:52,763 48k INFO ====> Epoch: 694 2023-01-23 15:45:34,358 48k INFO Train Epoch: 695 [81%] 2023-01-23 15:45:34,535 48k INFO [2.233466863632202, 2.598067045211792, 8.920974731445312, 16.20260238647461, 0.5351613759994507, 93800, 9.16557526785715e-05] 2023-01-23 15:45:56,615 48k INFO ====> Epoch: 695 2023-01-23 15:49:18,254 48k INFO ====> Epoch: 696 2023-01-23 15:51:09,169 48k INFO Train Epoch: 697 [30%] 2023-01-23 15:51:09,171 48k INFO [2.3939268589019775, 2.4089672565460205, 9.220158576965332, 17.064083099365234, 0.6973056793212891, 94000, 9.163284017252299e-05] 2023-01-23 15:51:27,190 48k INFO Saving model and optimizer state at iteration 697 to ./logs/48k/G_94000.pth 2023-01-23 15:51:30,547 48k INFO Saving model and optimizer state at iteration 697 to ./logs/48k/D_94000.pth 2023-01-23 15:52:56,061 48k INFO ====> Epoch: 697 2023-01-23 15:55:40,331 48k INFO Train Epoch: 698 [78%] 2023-01-23 15:55:40,339 48k INFO [2.233152151107788, 2.568537950515747, 9.785449028015137, 17.041868209838867, 0.7452932000160217, 94200, 9.162138606750142e-05] 2023-01-23 15:56:06,919 48k INFO ====> Epoch: 698 2023-01-23 15:59:42,192 48k INFO ====> Epoch: 699 2023-01-23 16:01:42,313 48k INFO Train Epoch: 700 [26%] 2023-01-23 16:01:42,314 48k INFO [2.3292768001556396, 2.7579808235168457, 9.651118278503418, 15.359038352966309, 0.5842181444168091, 94400, 9.15984821525687e-05] 2023-01-23 16:03:10,590 48k INFO ====> Epoch: 700 2023-01-23 16:06:06,856 48k INFO Train Epoch: 701 [74%] 2023-01-23 16:06:06,857 48k INFO [2.359689712524414, 2.3016927242279053, 8.284928321838379, 16.516754150390625, 0.5996209979057312, 94600, 9.158703234229962e-05] 2023-01-23 16:06:37,800 48k INFO ====> Epoch: 701 2023-01-23 16:10:18,548 48k INFO ====> Epoch: 702 2023-01-23 16:12:33,679 48k INFO Train Epoch: 703 [22%] 2023-01-23 16:12:33,680 48k INFO [2.247359275817871, 2.4822838306427, 10.054429054260254, 18.489530563354492, 0.7176375389099121, 94800, 9.156413701526141e-05] 2023-01-23 16:14:06,364 48k INFO ====> 
Epoch: 703 2023-01-23 16:16:50,664 48k INFO Train Epoch: 704 [70%] 2023-01-23 16:16:50,669 48k INFO [2.362586736679077, 2.597709894180298, 8.650516510009766, 16.46657943725586, 0.7182358503341675, 95000, 9.155269149813449e-05] 2023-01-23 16:17:19,850 48k INFO Saving model and optimizer state at iteration 704 to ./logs/48k/G_95000.pth 2023-01-23 16:17:23,722 48k INFO Saving model and optimizer state at iteration 704 to ./logs/48k/D_95000.pth 2023-01-23 16:18:01,543 48k INFO ====> Epoch: 704 2023-01-23 16:21:41,418 48k INFO ====> Epoch: 705 2023-01-23 16:23:41,082 48k INFO Train Epoch: 706 [19%] 2023-01-23 16:23:41,597 48k INFO [2.5595579147338867, 2.1619532108306885, 6.170368194580078, 14.25611400604248, 0.681039571762085, 95200, 9.152980475577075e-05] 2023-01-23 16:25:18,521 48k INFO ====> Epoch: 706 2023-01-23 16:28:26,473 48k INFO Train Epoch: 707 [67%] 2023-01-23 16:28:26,485 48k INFO [2.266728639602661, 2.5745716094970703, 9.15714168548584, 15.492210388183594, 0.6026924848556519, 95400, 9.151836353017629e-05] 2023-01-23 16:29:06,289 48k INFO ====> Epoch: 707 2023-01-23 16:32:43,947 48k INFO ====> Epoch: 708 2023-01-23 16:34:43,214 48k INFO Train Epoch: 709 [15%] 2023-01-23 16:34:43,219 48k INFO [2.2529520988464355, 2.6039986610412598, 9.70850944519043, 17.94719123840332, 0.9164374470710754, 95600, 9.149548536926816e-05] 2023-01-23 16:36:24,777 48k INFO ====> Epoch: 709 2023-01-23 16:39:19,559 48k INFO Train Epoch: 710 [63%] 2023-01-23 16:39:19,608 48k INFO [2.3715319633483887, 2.7192444801330566, 6.14512825012207, 13.518601417541504, 0.44435185194015503, 95800, 9.1484048433597e-05] 2023-01-23 16:40:03,796 48k INFO ====> Epoch: 710 2023-01-23 16:43:41,496 48k INFO ====> Epoch: 711 2023-01-23 16:45:51,515 48k INFO Train Epoch: 712 [11%] 2023-01-23 16:45:51,516 48k INFO [2.3720388412475586, 2.4731528759002686, 7.763126850128174, 13.446093559265137, 0.5640010833740234, 96000, 9.146117885092685e-05] 2023-01-23 16:46:15,780 48k INFO Saving model and optimizer state 
at iteration 712 to ./logs/48k/G_96000.pth 2023-01-23 16:46:18,817 48k INFO Saving model and optimizer state at iteration 712 to ./logs/48k/D_96000.pth 2023-01-23 16:48:07,318 48k INFO ====> Epoch: 712 2023-01-23 16:50:55,295 48k INFO Train Epoch: 713 [59%] 2023-01-23 16:50:55,301 48k INFO [2.394923686981201, 2.376936674118042, 6.987839221954346, 14.217169761657715, 0.7358266711235046, 96200, 9.144974620357048e-05] 2023-01-23 16:51:44,149 48k INFO ====> Epoch: 713 2023-01-23 16:55:47,952 48k INFO ====> Epoch: 714 2023-01-23 16:57:56,614 48k INFO Train Epoch: 715 [7%] 2023-01-23 16:57:56,615 48k INFO [2.2142672538757324, 2.8764772415161133, 9.021501541137695, 19.501338958740234, 0.708393394947052, 96400, 9.142688519592185e-05] 2023-01-23 16:59:47,304 48k INFO ====> Epoch: 715 2023-01-23 17:02:33,661 48k INFO Train Epoch: 716 [56%] 2023-01-23 17:02:33,678 48k INFO [2.2781355381011963, 2.6946065425872803, 8.663657188415527, 17.959291458129883, 0.5412201285362244, 96600, 9.141545683527236e-05] 2023-01-23 17:03:26,893 48k INFO ====> Epoch: 716 2023-01-23 17:07:09,083 48k INFO ====> Epoch: 717 2023-01-23 17:09:12,855 48k INFO Train Epoch: 718 [4%] 2023-01-23 17:09:12,856 48k INFO [2.416106700897217, 2.427372932434082, 9.13365650177002, 18.081199645996094, 0.5625664591789246, 96800, 9.139260439943005e-05] 2023-01-23 17:11:08,675 48k INFO ====> Epoch: 718 2023-01-23 17:14:02,816 48k INFO Train Epoch: 719 [52%] 2023-01-23 17:14:02,817 48k INFO [2.336932897567749, 2.4239323139190674, 10.29831314086914, 18.191335678100586, 0.8012924194335938, 97000, 9.138118032388012e-05] 2023-01-23 17:14:31,465 48k INFO Saving model and optimizer state at iteration 719 to ./logs/48k/G_97000.pth 2023-01-23 17:14:35,100 48k INFO Saving model and optimizer state at iteration 719 to ./logs/48k/D_97000.pth 2023-01-23 17:15:35,841 48k INFO ====> Epoch: 719 2023-01-23 17:19:34,101 48k INFO ====> Epoch: 720 2023-01-23 17:21:00,320 48k INFO Train Epoch: 721 [0%] 2023-01-23 17:21:00,325 48k INFO 
[2.4317708015441895, 2.411318302154541, 8.04038143157959, 18.796098709106445, 0.5673758387565613, 97200, 9.13583364566301e-05] 2023-01-23 17:22:59,150 48k INFO ====> Epoch: 721 2023-01-23 17:25:47,042 48k INFO Train Epoch: 722 [48%] 2023-01-23 17:25:47,045 48k INFO [2.257328510284424, 2.577617645263672, 8.60241413116455, 15.19523811340332, 0.6654015183448792, 97400, 9.134691666457301e-05] 2023-01-23 17:26:48,844 48k INFO ====> Epoch: 722 2023-01-23 17:30:38,823 48k INFO Train Epoch: 723 [96%] 2023-01-23 17:30:39,028 48k INFO [2.219017744064331, 2.6116533279418945, 10.647038459777832, 18.646547317504883, 0.8299684524536133, 97600, 9.133549829998994e-05] 2023-01-23 17:30:43,486 48k INFO ====> Epoch: 723 2023-01-23 17:34:22,816 48k INFO ====> Epoch: 724 2023-01-23 17:36:45,696 48k INFO Train Epoch: 725 [44%] 2023-01-23 17:36:45,697 48k INFO [2.272353172302246, 2.5331063270568848, 10.364089012145996, 16.416751861572266, 0.6031312346458435, 97800, 9.13126658525321e-05] 2023-01-23 17:37:52,231 48k INFO ====> Epoch: 725 2023-01-23 17:41:12,881 48k INFO Train Epoch: 726 [93%] 2023-01-23 17:41:12,882 48k INFO [2.440281391143799, 2.2429893016815186, 7.904738903045654, 15.634512901306152, 0.6011893153190613, 98000, 9.130125176930053e-05] 2023-01-23 17:41:49,849 48k INFO Saving model and optimizer state at iteration 726 to ./logs/48k/G_98000.pth 2023-01-23 17:41:53,128 48k INFO Saving model and optimizer state at iteration 726 to ./logs/48k/D_98000.pth 2023-01-23 17:42:04,865 48k INFO ====> Epoch: 726 2023-01-23 17:45:22,248 48k INFO ====> Epoch: 727 2023-01-23 17:47:22,879 48k INFO Train Epoch: 728 [41%] 2023-01-23 17:47:22,880 48k INFO [2.2874960899353027, 2.417243242263794, 9.45569133758545, 16.317270278930664, 0.6977587938308716, 98200, 9.127842788294025e-05] 2023-01-23 17:48:33,960 48k INFO ====> Epoch: 728 2023-01-23 17:51:27,887 48k INFO Train Epoch: 729 [89%] 2023-01-23 17:51:27,888 48k INFO [2.3048431873321533, 2.680650234222412, 8.348795890808105, 16.993009567260742, 
0.35977423191070557, 98400, 9.126701807945488e-05] 2023-01-23 17:51:41,249 48k INFO ====> Epoch: 729 2023-01-23 17:55:00,949 48k INFO ====> Epoch: 730 2023-01-23 17:57:07,040 48k INFO Train Epoch: 731 [37%] 2023-01-23 17:57:07,041 48k INFO [2.367094039916992, 2.3933725357055664, 9.107186317443848, 14.271458625793457, 1.0356553792953491, 98600, 9.124420275098216e-05] 2023-01-23 17:58:22,743 48k INFO ====> Epoch: 731 2023-01-23 18:01:43,802 48k INFO Train Epoch: 732 [85%] 2023-01-23 18:01:43,803 48k INFO [2.28037691116333, 2.5676751136779785, 9.520210266113281, 17.415977478027344, 0.6486546993255615, 98800, 9.123279722563828e-05] 2023-01-23 18:02:01,619 48k INFO ====> Epoch: 732 2023-01-23 18:06:08,174 48k INFO ====> Epoch: 733 2023-01-23 18:09:03,885 48k INFO Train Epoch: 734 [33%] 2023-01-23 18:09:03,893 48k INFO [2.360475540161133, 2.340808629989624, 9.394254684448242, 16.78240203857422, 0.8319499492645264, 99000, 9.120999045184433e-05] 2023-01-23 18:09:36,664 48k INFO Saving model and optimizer state at iteration 734 to ./logs/48k/G_99000.pth 2023-01-23 18:09:40,771 48k INFO Saving model and optimizer state at iteration 734 to ./logs/48k/D_99000.pth 2023-01-23 18:11:02,914 48k INFO ====> Epoch: 734 2023-01-23 18:14:50,710 48k INFO Train Epoch: 735 [81%] 2023-01-23 18:14:50,712 48k INFO [2.3580970764160156, 2.508890151977539, 7.750563144683838, 16.80018424987793, 0.7157800197601318, 99200, 9.119858920303784e-05] 2023-01-23 18:15:12,991 48k INFO ====> Epoch: 735 2023-01-23 18:19:31,874 48k INFO ====> Epoch: 736 2023-01-23 18:22:17,140 48k INFO Train Epoch: 737 [30%] 2023-01-23 18:22:17,141 48k INFO [2.2907395362854004, 2.6198179721832275, 8.582846641540527, 15.994983673095703, 0.8249756693840027, 99400, 9.117579098071503e-05] 2023-01-23 18:24:05,641 48k INFO ====> Epoch: 737 2023-01-23 18:27:41,940 48k INFO Train Epoch: 738 [78%] 2023-01-23 18:27:41,941 48k INFO [2.151040554046631, 2.6538069248199463, 10.364665031433105, 17.489656448364258, 0.8400857448577881, 
99600, 9.116439400684243e-05] 2023-01-23 18:28:08,257 48k INFO ====> Epoch: 738 2023-01-23 18:32:05,181 48k INFO ====> Epoch: 739 2023-01-23 18:34:37,691 48k INFO Train Epoch: 740 [26%] 2023-01-23 18:34:37,693 48k INFO [2.383901834487915, 2.405245780944824, 9.31235122680664, 16.296445846557617, 0.7319443821907043, 99800, 9.114160433278438e-05] 2023-01-23 18:36:06,408 48k INFO ====> Epoch: 740 2023-01-23 18:39:25,015 48k INFO Train Epoch: 741 [74%] 2023-01-23 18:39:25,017 48k INFO [2.1883418560028076, 2.590717077255249, 9.276054382324219, 17.12923240661621, 0.5096774101257324, 100000, 9.113021163224278e-05] 2023-01-23 18:39:55,803 48k INFO Saving model and optimizer state at iteration 741 to ./logs/48k/G_100000.pth 2023-01-23 18:39:59,491 48k INFO Saving model and optimizer state at iteration 741 to ./logs/48k/D_100000.pth 2023-01-23 18:40:32,350 48k INFO ====> Epoch: 741 2023-01-23 18:44:37,085 48k INFO ====> Epoch: 742 2023-01-23 18:46:54,919 48k INFO Train Epoch: 743 [22%] 2023-01-23 18:46:54,921 48k INFO [2.2805190086364746, 2.5671565532684326, 11.115105628967285, 17.8770751953125, 0.6484877467155457, 100200, 9.110743050324427e-05] 2023-01-23 18:48:27,708 48k INFO ====> Epoch: 743 2023-01-23 18:51:36,990 48k INFO Train Epoch: 744 [70%] 2023-01-23 18:51:36,991 48k INFO [2.4817636013031006, 2.4993042945861816, 7.4215497970581055, 12.953827857971191, 0.6485880017280579, 100400, 9.109604207443135e-05] 2023-01-23 18:52:12,843 48k INFO ====> Epoch: 744 2023-01-23 18:57:13,122 48k INFO ====> Epoch: 745 2023-01-23 18:59:38,508 48k INFO Train Epoch: 746 [19%] 2023-01-23 18:59:38,510 48k INFO [2.3838703632354736, 2.392402172088623, 8.455521583557129, 14.668843269348145, 0.4412974417209625, 100600, 9.107326948728839e-05] 2023-01-23 19:01:16,085 48k INFO ====> Epoch: 746 2023-01-23 19:04:40,996 48k INFO Train Epoch: 747 [67%] 2023-01-23 19:04:40,997 48k INFO [2.4019527435302734, 2.401427984237671, 7.69476318359375, 14.105962753295898, 0.7036499381065369, 100800, 
9.106188532860248e-05] 2023-01-23 19:05:21,020 48k INFO ====> Epoch: 747 2023-01-23 19:10:07,319 48k INFO ====> Epoch: 748 2023-01-23 19:12:46,559 48k INFO Train Epoch: 749 [15%] 2023-01-23 19:12:46,560 48k INFO [2.3553926944732666, 2.4884707927703857, 8.283745765686035, 16.98600959777832, 0.7987759113311768, 101000, 9.103912128011228e-05] 2023-01-23 19:13:20,153 48k INFO Saving model and optimizer state at iteration 749 to ./logs/48k/G_101000.pth 2023-01-23 19:13:22,699 48k INFO Saving model and optimizer state at iteration 749 to ./logs/48k/D_101000.pth 2023-01-23 19:15:06,189 48k INFO ====> Epoch: 749 2023-01-23 19:18:11,204 48k INFO Train Epoch: 750 [63%] 2023-01-23 19:18:11,352 48k INFO [2.465074062347412, 2.3123202323913574, 6.646108150482178, 14.734275817871094, 0.7837478518486023, 101200, 9.102774138995226e-05] 2023-01-23 19:18:55,718 48k INFO ====> Epoch: 750 2023-01-23 19:22:56,693 48k INFO ====> Epoch: 751 2023-01-23 19:24:46,296 48k INFO Train Epoch: 752 [11%] 2023-01-23 19:24:46,297 48k INFO [2.4925448894500732, 2.316079616546631, 6.217012405395508, 15.693580627441406, 0.765614926815033, 101400, 9.100498587691323e-05] 2023-01-23 19:26:36,248 48k INFO ====> Epoch: 752 2023-01-23 19:29:23,333 48k INFO Train Epoch: 753 [59%] 2023-01-23 19:29:23,334 48k INFO [2.4430408477783203, 2.4428393840789795, 6.9307990074157715, 13.179892539978027, 0.6235496401786804, 101600, 9.09936102536786e-05] 2023-01-23 19:30:12,051 48k INFO ====> Epoch: 753 2023-01-23 19:33:18,389 48k INFO ====> Epoch: 754 2023-01-23 19:34:57,563 48k INFO Train Epoch: 755 [7%] 2023-01-23 19:34:57,564 48k INFO [2.365800619125366, 2.6078872680664062, 8.889287948608398, 15.693405151367188, 0.9483134150505066, 101800, 9.097086327289034e-05] 2023-01-23 19:36:53,716 48k INFO ====> Epoch: 755 2023-01-23 19:39:09,795 48k INFO Train Epoch: 756 [56%] 2023-01-23 19:39:09,796 48k INFO [2.407108783721924, 2.4701902866363525, 8.130342483520508, 15.760852813720703, 0.6597813963890076, 102000, 
9.095949191498122e-05] 2023-01-23 19:39:32,916 48k INFO Saving model and optimizer state at iteration 756 to ./logs/48k/G_102000.pth 2023-01-23 19:39:36,919 48k INFO Saving model and optimizer state at iteration 756 to ./logs/48k/D_102000.pth 2023-01-23 19:40:35,307 48k INFO ====> Epoch: 756 2023-01-23 19:44:12,302 48k INFO ====> Epoch: 757 2023-01-23 19:45:43,719 48k INFO Train Epoch: 758 [4%] 2023-01-23 19:45:43,720 48k INFO [2.3647003173828125, 2.5615267753601074, 9.09508228302002, 18.372989654541016, 0.8811466693878174, 102200, 9.093675346324454e-05] 2023-01-23 19:47:39,509 48k INFO ====> Epoch: 758 2023-01-23 19:49:49,192 48k INFO Train Epoch: 759 [52%] 2023-01-23 19:49:49,193 48k INFO [2.256014823913574, 2.5433013439178467, 10.51237678527832, 19.53571891784668, 0.5511649250984192, 102400, 9.092538636906162e-05] 2023-01-23 19:50:46,732 48k INFO ====> Epoch: 759 2023-01-23 19:53:52,347 48k INFO ====> Epoch: 760 2023-01-23 19:55:16,053 48k INFO Train Epoch: 761 [0%] 2023-01-23 19:55:16,055 48k INFO [2.302044153213501, 2.532933473587036, 7.830555438995361, 13.32126235961914, 0.9294160008430481, 102600, 9.09026564431785e-05] 2023-01-23 19:57:15,930 48k INFO ====> Epoch: 761 2023-01-23 19:59:39,745 48k INFO Train Epoch: 762 [48%] 2023-01-23 19:59:39,746 48k INFO [2.3015100955963135, 2.4473941326141357, 10.608293533325195, 17.100566864013672, 0.8943692445755005, 102800, 9.08912936111231e-05] 2023-01-23 20:00:41,823 48k INFO ====> Epoch: 762 2023-01-23 20:04:00,384 48k INFO Train Epoch: 763 [96%] 2023-01-23 20:04:00,385 48k INFO [2.2620701789855957, 2.549402952194214, 8.693544387817383, 17.048097610473633, 0.4190557897090912, 103000, 9.087993219942171e-05] 2023-01-23 20:04:22,955 48k INFO Saving model and optimizer state at iteration 763 to ./logs/48k/G_103000.pth 2023-01-23 20:04:26,156 48k INFO Saving model and optimizer state at iteration 763 to ./logs/48k/D_103000.pth 2023-01-23 20:04:33,230 48k INFO ====> Epoch: 763 2023-01-23 20:08:05,638 48k INFO ====> Epoch: 
764 2023-01-23 20:10:02,704 48k INFO Train Epoch: 765 [44%] 2023-01-23 20:10:02,706 48k INFO [2.300236701965332, 2.6558985710144043, 8.273863792419434, 14.544082641601562, 0.8363839387893677, 103200, 9.085721363637077e-05] 2023-01-23 20:11:09,939 48k INFO ====> Epoch: 765 2023-01-23 20:14:08,631 48k INFO Train Epoch: 766 [93%] 2023-01-23 20:14:08,632 48k INFO [2.399754047393799, 2.3258862495422363, 8.307746887207031, 14.138287544250488, 0.4690006971359253, 103400, 9.084585648466622e-05] 2023-01-23 20:14:17,527 48k INFO ====> Epoch: 766 2023-01-23 20:17:21,576 48k INFO ====> Epoch: 767 2023-01-23 20:19:15,187 48k INFO Train Epoch: 768 [41%] 2023-01-23 20:19:15,188 48k INFO [2.2829699516296387, 2.4089457988739014, 8.567840576171875, 17.770273208618164, 0.6051163077354431, 103600, 9.082314644001155e-05] 2023-01-23 20:20:26,266 48k INFO ====> Epoch: 768 2023-01-23 20:23:18,549 48k INFO Train Epoch: 769 [89%] 2023-01-23 20:23:18,550 48k INFO [2.4307332038879395, 2.4980249404907227, 7.882088661193848, 17.581850051879883, 0.9962572455406189, 103800, 9.081179354670654e-05] 2023-01-23 20:23:31,801 48k INFO ====> Epoch: 769 2023-01-23 20:26:39,580 48k INFO ====> Epoch: 770 2023-01-23 20:28:50,113 48k INFO Train Epoch: 771 [37%] 2023-01-23 20:28:50,114 48k INFO [2.24526309967041, 2.5614848136901855, 8.414196014404297, 16.569507598876953, 0.8677699565887451, 104000, 9.078909201725413e-05] 2023-01-23 20:29:14,735 48k INFO Saving model and optimizer state at iteration 771 to ./logs/48k/G_104000.pth 2023-01-23 20:29:18,023 48k INFO Saving model and optimizer state at iteration 771 to ./logs/48k/D_104000.pth 2023-01-23 20:30:36,104 48k INFO ====> Epoch: 771 2023-01-23 20:33:45,591 48k INFO Train Epoch: 772 [85%] 2023-01-23 20:33:45,593 48k INFO [2.2887308597564697, 2.448087215423584, 8.947030067443848, 16.417455673217773, 0.513584554195404, 104200, 9.077774338075196e-05] 2023-01-23 20:34:03,363 48k INFO ====> Epoch: 772 2023-01-23 20:37:12,610 48k INFO ====> Epoch: 773 2023-01-23 
20:39:26,592 48k INFO Train Epoch: 774 [33%] 2023-01-23 20:39:26,593 48k INFO [2.2627902030944824, 2.6199052333831787, 10.435575485229492, 17.311084747314453, 0.6526522040367126, 104400, 9.0755050363309e-05] 2023-01-23 20:40:46,469 48k INFO ====> Epoch: 774 2023-01-23 20:43:51,550 48k INFO Train Epoch: 775 [81%] 2023-01-23 20:43:51,552 48k INFO [2.2382192611694336, 2.625276803970337, 8.908227920532227, 16.998428344726562, 0.47642654180526733, 104600, 9.074370598201358e-05] 2023-01-23 20:44:13,670 48k INFO ====> Epoch: 775 2023-01-23 20:47:38,847 48k INFO ====> Epoch: 776 2023-01-23 20:49:21,681 48k INFO Train Epoch: 777 [30%] 2023-01-23 20:49:21,683 48k INFO [2.4550867080688477, 2.267293691635132, 7.084274768829346, 15.668388366699219, 0.5841166973114014, 104800, 9.072102147338848e-05] 2023-01-23 20:50:45,875 48k INFO ====> Epoch: 777 2023-01-23 20:53:42,678 48k INFO Train Epoch: 778 [78%] 2023-01-23 20:53:42,684 48k INFO [2.322422742843628, 2.4155473709106445, 9.413861274719238, 17.722524642944336, 1.0372014045715332, 105000, 9.07096813457043e-05] 2023-01-23 20:54:07,134 48k INFO Saving model and optimizer state at iteration 778 to ./logs/48k/G_105000.pth 2023-01-23 20:54:11,473 48k INFO Saving model and optimizer state at iteration 778 to ./logs/48k/D_105000.pth 2023-01-23 20:54:39,985 48k INFO ====> Epoch: 778 2023-01-23 20:57:58,880 48k INFO ====> Epoch: 779 2023-01-23 20:59:57,495 48k INFO Train Epoch: 780 [26%] 2023-01-23 20:59:57,496 48k INFO [2.1292200088500977, 3.0171303749084473, 9.80234146118164, 14.902020454406738, 0.331755667924881, 105200, 9.068700534270665e-05] 2023-01-23 21:01:25,995 48k INFO ====> Epoch: 780 2023-01-23 21:04:00,256 48k INFO Train Epoch: 781 [74%] 2023-01-23 21:04:00,258 48k INFO [2.413804531097412, 2.3606672286987305, 7.202935695648193, 14.715460777282715, 0.611225426197052, 105400, 9.067566946703881e-05] 2023-01-23 21:04:31,131 48k INFO ====> Epoch: 781 2023-01-23 21:07:53,301 48k INFO ====> Epoch: 782 2023-01-23 21:09:30,041 48k 
INFO Train Epoch: 783 [22%] 2023-01-23 21:09:30,042 48k INFO [2.4738693237304688, 2.2146594524383545, 7.586230278015137, 15.18497085571289, 0.7123989462852478, 105600, 9.065300196647938e-05] 2023-01-23 21:11:03,345 48k INFO ====> Epoch: 783 2023-01-23 21:13:49,977 48k INFO Train Epoch: 784 [70%] 2023-01-23 21:13:49,978 48k INFO [2.5067989826202393, 2.38822340965271, 7.479330062866211, 15.907941818237305, 0.4839233160018921, 105800, 9.064167034123356e-05] 2023-01-23 21:14:25,513 48k INFO ====> Epoch: 784 2023-01-23 21:18:34,949 48k INFO ====> Epoch: 785 2023-01-23 21:20:51,820 48k INFO Train Epoch: 786 [19%] 2023-01-23 21:20:51,842 48k INFO [2.539746046066284, 2.2941107749938965, 6.687013626098633, 13.735808372497559, 0.723725438117981, 106000, 9.061901133992436e-05] 2023-01-23 21:21:22,453 48k INFO Saving model and optimizer state at iteration 786 to ./logs/48k/G_106000.pth 2023-01-23 21:21:26,189 48k INFO Saving model and optimizer state at iteration 786 to ./logs/48k/D_106000.pth 2023-01-23 21:23:06,238 48k INFO ====> Epoch: 786 2023-01-23 21:26:48,914 48k INFO Train Epoch: 787 [67%] 2023-01-23 21:26:48,915 48k INFO [2.2655673027038574, 2.5925285816192627, 9.324825286865234, 15.519052505493164, 0.536005437374115, 106200, 9.060768396350687e-05] 2023-01-23 21:27:28,888 48k INFO ====> Epoch: 787 2023-01-23 21:31:37,447 48k INFO ====> Epoch: 788 2023-01-23 21:33:55,339 48k INFO Train Epoch: 789 [15%] 2023-01-23 21:33:55,340 48k INFO [2.4093775749206543, 2.4645743370056152, 8.267797470092773, 15.866006851196289, 0.7299731373786926, 106400, 9.058503345826105e-05] 2023-01-23 21:35:36,828 48k INFO ====> Epoch: 789 2023-01-23 21:38:41,931 48k INFO Train Epoch: 790 [63%] 2023-01-23 21:38:41,937 48k INFO [2.7450902462005615, 2.3070931434631348, 7.660112380981445, 13.955930709838867, 0.651479959487915, 106600, 9.057371032907876e-05] 2023-01-23 21:39:26,367 48k INFO ====> Epoch: 790 2023-01-23 21:43:19,945 48k INFO ====> Epoch: 791 2023-01-23 21:45:29,144 48k INFO Train 
Epoch: 792 [11%] 2023-01-23 21:45:29,146 48k INFO [2.4782862663269043, 2.1259524822235107, 5.530925750732422, 12.878185272216797, 0.7871171236038208, 106800, 9.055106831671071e-05] 2023-01-23 21:47:15,696 48k INFO ====> Epoch: 792 2023-01-23 21:49:58,160 48k INFO Train Epoch: 793 [59%] 2023-01-23 21:49:58,193 48k INFO [2.3106513023376465, 2.7054877281188965, 8.865534782409668, 16.30118751525879, 0.5343518853187561, 107000, 9.053974943317111e-05] 2023-01-23 21:50:28,958 48k INFO Saving model and optimizer state at iteration 793 to ./logs/48k/G_107000.pth 2023-01-23 21:50:32,493 48k INFO Saving model and optimizer state at iteration 793 to ./logs/48k/D_107000.pth 2023-01-23 21:51:23,949 48k INFO ====> Epoch: 793 2023-01-23 21:55:10,292 48k INFO ====> Epoch: 794 2023-01-23 21:57:03,333 48k INFO Train Epoch: 795 [7%] 2023-01-23 21:57:03,335 48k INFO [2.4271700382232666, 2.403296709060669, 7.992094039916992, 16.226186752319336, 0.7988741993904114, 107200, 9.05171159104964e-05] 2023-01-23 21:58:54,633 48k INFO ====> Epoch: 795 2023-01-23 22:01:56,020 48k INFO Train Epoch: 796 [56%] 2023-01-23 22:01:56,026 48k INFO [2.317974090576172, 2.627061605453491, 8.404244422912598, 18.135326385498047, 1.001375436782837, 107400, 9.050580127100758e-05] 2023-01-23 22:02:49,346 48k INFO ====> Epoch: 796 2023-01-23 22:07:29,285 48k INFO ====> Epoch: 797 2023-01-23 22:09:25,921 48k INFO Train Epoch: 798 [4%] 2023-01-23 22:09:25,922 48k INFO [2.216732978820801, 2.6783523559570312, 10.906787872314453, 18.60639190673828, 0.5157018303871155, 107600, 9.048317623484297e-05] 2023-01-23 22:11:21,422 48k INFO ====> Epoch: 798 2023-01-23 22:14:13,807 48k INFO Train Epoch: 799 [52%] 2023-01-23 22:14:13,808 48k INFO [2.2588605880737305, 2.430006265640259, 8.718581199645996, 18.441492080688477, 1.1163910627365112, 107800, 9.04718658378136e-05] 2023-01-23 22:15:11,691 48k INFO ====> Epoch: 799 2023-01-23 22:18:53,615 48k INFO ====> Epoch: 800 2023-01-23 22:20:13,677 48k INFO Train Epoch: 801 [0%] 
2023-01-23 22:20:13,679 48k INFO [2.242990016937256, 2.5456955432891846, 8.594388008117676, 18.1732234954834, 0.786427915096283, 108000, 9.044924928497705e-05] 2023-01-23 22:20:25,030 48k INFO Saving model and optimizer state at iteration 801 to ./logs/48k/G_108000.pth 2023-01-23 22:20:27,647 48k INFO Saving model and optimizer state at iteration 801 to ./logs/48k/D_108000.pth 2023-01-23 22:22:29,887 48k INFO ====> Epoch: 801 2023-01-23 22:25:16,513 48k INFO Train Epoch: 802 [48%] 2023-01-23 22:25:16,514 48k INFO [2.416868209838867, 2.386709690093994, 8.4788179397583, 16.190696716308594, 0.6273730397224426, 108200, 9.043794312881642e-05] 2023-01-23 22:26:18,479 48k INFO ====> Epoch: 802 2023-01-23 22:30:19,128 48k INFO Train Epoch: 803 [96%] 2023-01-23 22:30:19,129 48k INFO [2.2032034397125244, 2.601439952850342, 9.411138534545898, 19.025257110595703, 0.814903974533081, 108400, 9.042663838592532e-05] 2023-01-23 22:30:23,529 48k INFO ====> Epoch: 803 2023-01-23 22:34:09,072 48k INFO ====> Epoch: 804 2023-01-23 22:36:40,385 48k INFO Train Epoch: 805 [44%] 2023-01-23 22:36:40,386 48k INFO [2.335099935531616, 2.438112735748291, 8.741422653198242, 14.464008331298828, 0.5869823694229126, 108600, 9.040403313924505e-05] 2023-01-23 22:37:46,790 48k INFO ====> Epoch: 805 2023-01-23 22:41:13,498 48k INFO Train Epoch: 806 [93%] 2023-01-23 22:41:13,499 48k INFO [2.3780741691589355, 2.3096375465393066, 8.054452896118164, 14.047882080078125, 0.7813425064086914, 108800, 9.039273263510263e-05] 2023-01-23 22:41:22,399 48k INFO ====> Epoch: 806 2023-01-23 22:44:54,057 48k INFO ====> Epoch: 807 2023-01-23 22:46:45,877 48k INFO Train Epoch: 808 [41%] 2023-01-23 22:46:45,879 48k INFO [2.2231600284576416, 2.8166794776916504, 8.997668266296387, 17.033491134643555, 0.6101499199867249, 109000, 9.03701358643303e-05] 2023-01-23 22:47:18,590 48k INFO Saving model and optimizer state at iteration 808 to ./logs/48k/G_109000.pth 2023-01-23 22:47:21,796 48k INFO Saving model and optimizer state at 
iteration 808 to ./logs/48k/D_109000.pth 2023-01-23 22:48:35,150 48k INFO ====> Epoch: 808 2023-01-23 22:52:12,230 48k INFO Train Epoch: 809 [89%] 2023-01-23 22:52:12,232 48k INFO [2.230499267578125, 2.817568302154541, 9.486656188964844, 17.208528518676758, 0.32377469539642334, 109200, 9.035883959734726e-05] 2023-01-23 22:52:25,481 48k INFO ====> Epoch: 809 2023-01-23 22:56:40,262 48k INFO ====> Epoch: 810 2023-01-23 22:59:29,027 48k INFO Train Epoch: 811 [37%] 2023-01-23 22:59:29,034 48k INFO [2.2585108280181885, 2.6317780017852783, 10.032166481018066, 17.05348777770996, 0.6616057753562927, 109400, 9.033625129930478e-05] 2023-01-23 23:00:43,800 48k INFO ====> Epoch: 811 2023-01-23 23:04:10,681 48k INFO Train Epoch: 812 [85%] 2023-01-23 23:04:10,682 48k INFO [2.3010611534118652, 2.3847973346710205, 8.644647598266602, 17.65654945373535, 0.6315338611602783, 109600, 9.032495926789236e-05] 2023-01-23 23:04:28,272 48k INFO ====> Epoch: 812 2023-01-23 23:08:34,248 48k INFO ====> Epoch: 813 2023-01-23 23:10:55,548 48k INFO Train Epoch: 814 [33%] 2023-01-23 23:10:55,549 48k INFO [2.3265511989593506, 2.412606716156006, 9.143632888793945, 15.982678413391113, 0.7747553586959839, 109800, 9.030237943940286e-05] 2023-01-23 23:12:15,471 48k INFO ====> Epoch: 814 2023-01-23 23:15:50,251 48k INFO Train Epoch: 815 [81%] 2023-01-23 23:15:50,252 48k INFO [2.2453720569610596, 2.551546096801758, 9.328926086425781, 15.795797348022461, 0.6768169403076172, 110000, 9.029109164197293e-05] 2023-01-23 23:16:18,671 48k INFO Saving model and optimizer state at iteration 815 to ./logs/48k/G_110000.pth 2023-01-23 23:16:22,005 48k INFO Saving model and optimizer state at iteration 815 to ./logs/48k/D_110000.pth 2023-01-23 23:16:46,252 48k INFO ====> Epoch: 815 2023-01-23 23:20:43,713 48k INFO ====> Epoch: 816 2023-01-23 23:23:00,541 48k INFO Train Epoch: 817 [30%] 2023-01-23 23:23:00,542 48k INFO [2.2335877418518066, 2.558457374572754, 9.779723167419434, 16.714113235473633, 0.6603650450706482, 
110200, 9.026852027986074e-05]
2023-01-23 23:24:25,140 48k INFO ====> Epoch: 817
2023-01-23 23:27:23,687 48k INFO Train Epoch: 818 [78%]
2023-01-23 23:27:23,688 48k INFO [2.3177363872528076, 2.583004951477051, 8.723631858825684, 16.569381713867188, 0.6803258061408997, 110400, 9.025723671482575e-05]
2023-01-23 23:27:50,322 48k INFO ====> Epoch: 818
2023-01-23 23:31:07,181 48k INFO ====> Epoch: 819
2023-01-23 23:33:10,588 48k INFO Train Epoch: 820 [26%]
2023-01-23 23:33:10,597 48k INFO [2.180530548095703, 2.6893579959869385, 9.442015647888184, 16.360984802246094, 0.6001001000404358, 110600, 9.023467381591636e-05]
2023-01-23 23:34:40,060 48k INFO ====> Epoch: 820
2023-01-23 23:37:37,224 48k INFO Train Epoch: 821 [74%]
2023-01-23 23:37:37,226 48k INFO [2.429344415664673, 2.469391345977783, 8.504593849182129, 15.802223205566406, 0.8509594798088074, 110800, 9.022339448168936e-05]
2023-01-23 23:38:08,569 48k INFO ====> Epoch: 821
2023-01-23 23:41:45,699 48k INFO ====> Epoch: 822
2023-01-23 23:43:22,525 48k INFO Train Epoch: 823 [22%]
2023-01-23 23:43:22,526 48k INFO [2.302804946899414, 2.9666621685028076, 9.757888793945312, 16.805309295654297, 0.3481007516384125, 111000, 9.020084004280947e-05]
2023-01-23 23:43:45,539 48k INFO Saving model and optimizer state at iteration 823 to ./logs/48k/G_111000.pth
2023-01-23 23:43:48,498 48k INFO Saving model and optimizer state at iteration 823 to ./logs/48k/D_111000.pth
2023-01-23 23:45:23,232 48k INFO ====> Epoch: 823
2023-01-23 23:48:24,286 48k INFO Train Epoch: 824 [70%]
2023-01-23 23:48:24,304 48k INFO [2.384350538253784, 2.370419979095459, 7.297173023223877, 12.693126678466797, 0.7103585600852966, 111200, 9.018956493780411e-05]
2023-01-23 23:48:59,543 48k INFO ====> Epoch: 824
2023-01-23 23:52:01,052 48k INFO ====> Epoch: 825
2023-01-23 23:53:36,670 48k INFO Train Epoch: 826 [19%]
2023-01-23 23:53:36,672 48k INFO [2.519937515258789, 2.3213868141174316, 7.739459037780762, 16.474294662475586, 0.6826459765434265, 111400, 9.01670189557816e-05]
2023-01-23 23:55:14,405 48k INFO ====> Epoch: 826
2023-01-23 23:57:40,769 48k INFO Train Epoch: 827 [67%]
2023-01-23 23:57:40,770 48k INFO [2.2914669513702393, 2.43734073638916, 10.430819511413574, 16.141550064086914, 0.7005162835121155, 111600, 9.015574807841212e-05]
2023-01-23 23:58:20,680 48k INFO ====> Epoch: 827
2023-01-24 00:01:33,064 48k INFO ====> Epoch: 828
2023-01-24 00:02:58,745 48k INFO Train Epoch: 829 [15%]
2023-01-24 00:02:58,748 48k INFO [2.4368526935577393, 2.582282066345215, 10.05846118927002, 18.454212188720703, 0.7947224974632263, 111800, 9.013321055007607e-05]
2023-01-24 00:04:40,682 48k INFO ====> Epoch: 829
2023-01-24 00:07:34,636 48k INFO Train Epoch: 830 [63%]
2023-01-24 00:07:34,640 48k INFO [2.5450875759124756, 2.1548542976379395, 6.448204517364502, 14.959210395812988, 0.6817785501480103, 112000, 9.01219438987573e-05]
2023-01-24 00:07:56,836 48k INFO Saving model and optimizer state at iteration 830 to ./logs/48k/G_112000.pth
2023-01-24 00:07:59,683 48k INFO Saving model and optimizer state at iteration 830 to ./logs/48k/D_112000.pth
2023-01-24 00:08:45,355 48k INFO ====> Epoch: 830
2023-01-24 00:11:42,172 48k INFO ====> Epoch: 831
2023-01-24 00:13:07,766 48k INFO Train Epoch: 832 [11%]
2023-01-24 00:13:07,767 48k INFO [2.5174612998962402, 2.3639719486236572, 6.3932037353515625, 16.25598907470703, 0.5915343165397644, 112200, 9.009941482093798e-05]
2023-01-24 00:14:54,059 48k INFO ====> Epoch: 832
2023-01-24 00:17:10,187 48k INFO Train Epoch: 833 [59%]
2023-01-24 00:17:10,188 48k INFO [2.4338536262512207, 2.4430651664733887, 6.507922649383545, 16.227386474609375, 0.9524767994880676, 112400, 9.008815239408536e-05]
2023-01-24 00:17:58,942 48k INFO ====> Epoch: 833
2023-01-24 00:20:58,386 48k INFO ====> Epoch: 834
2023-01-24 00:22:08,073 48k INFO Train Epoch: 835 [7%]
2023-01-24 00:22:08,075 48k INFO [2.4427719116210938, 2.418487548828125, 7.161908149719238, 14.553746223449707, 0.6536037921905518, 112600, 9.00656317636142e-05]
2023-01-24 00:23:59,161 48k INFO ====> Epoch: 835
2023-01-24 00:26:13,022 48k INFO Train Epoch: 836 [56%]
2023-01-24 00:26:13,023 48k INFO [2.367368698120117, 2.5420982837677, 8.923907279968262, 17.811725616455078, 0.6699185967445374, 112800, 9.005437355964375e-05]
2023-01-24 00:27:06,488 48k INFO ====> Epoch: 836
2023-01-24 00:30:11,493 48k INFO ====> Epoch: 837
2023-01-24 00:31:22,334 48k INFO Train Epoch: 838 [4%]
2023-01-24 00:31:22,335 48k INFO [2.569601535797119, 2.2253735065460205, 8.054095268249512, 14.679471015930176, 0.6668626070022583, 113000, 9.003186137335341e-05]
2023-01-24 00:31:46,928 48k INFO Saving model and optimizer state at iteration 838 to ./logs/48k/G_113000.pth
2023-01-24 00:31:50,642 48k INFO Saving model and optimizer state at iteration 838 to ./logs/48k/D_113000.pth
2023-01-24 00:33:48,663 48k INFO ====> Epoch: 838
2023-01-24 00:35:54,763 48k INFO Train Epoch: 839 [52%]
2023-01-24 00:35:54,764 48k INFO [2.2586615085601807, 2.454270601272583, 9.788749694824219, 17.908628463745117, 0.7760854363441467, 113200, 9.002060739068175e-05]
2023-01-24 00:36:52,494 48k INFO ====> Epoch: 839
2023-01-24 00:39:58,483 48k INFO ====> Epoch: 840
2023-01-24 00:41:02,795 48k INFO Train Epoch: 841 [0%]
2023-01-24 00:41:02,796 48k INFO [2.4731833934783936, 2.3403024673461914, 7.69821310043335, 13.924576759338379, 0.7539843320846558, 113400, 8.999810364540606e-05]
2023-01-24 00:43:03,184 48k INFO ====> Epoch: 841
2023-01-24 00:45:08,930 48k INFO Train Epoch: 842 [48%]
2023-01-24 00:45:08,932 48k INFO [2.223781108856201, 2.5664613246917725, 8.507553100585938, 15.987773895263672, 0.7819296717643738, 113600, 8.998685388245039e-05]
2023-01-24 00:46:11,455 48k INFO ====> Epoch: 842
2023-01-24 00:49:28,590 48k INFO Train Epoch: 843 [96%]
2023-01-24 00:49:28,591 48k INFO [2.3999855518341064, 2.657106637954712, 8.847610473632812, 17.07795524597168, 0.3960340619087219, 113800, 8.997560552571508e-05]
2023-01-24 00:49:33,015 48k INFO ====> Epoch: 843
2023-01-24 00:52:39,283 48k INFO ====> Epoch: 844
2023-01-24 00:55:08,580 48k INFO Train Epoch: 845 [44%]
2023-01-24 00:55:08,581 48k INFO [2.372727870941162, 2.6006765365600586, 8.677314758300781, 17.38389015197754, 0.5757224559783936, 114000, 8.995311303020248e-05]
2023-01-24 00:55:31,577 48k INFO Saving model and optimizer state at iteration 845 to ./logs/48k/G_114000.pth
2023-01-24 00:55:35,368 48k INFO Saving model and optimizer state at iteration 845 to ./logs/48k/D_114000.pth
2023-01-24 00:56:44,771 48k INFO ====> Epoch: 845
2023-01-24 00:59:50,817 48k INFO Train Epoch: 846 [93%]
2023-01-24 00:59:50,819 48k INFO [2.3617682456970215, 2.5258708000183105, 8.982871055603027, 16.72141456604004, 0.9010707139968872, 114200, 8.99418688910737e-05]
2023-01-24 00:59:59,514 48k INFO ====> Epoch: 846
2023-01-24 01:03:22,857 48k INFO ====> Epoch: 847
2023-01-24 01:05:13,250 48k INFO Train Epoch: 848 [41%]
2023-01-24 01:05:13,255 48k INFO [2.1620430946350098, 2.5359201431274414, 9.383580207824707, 16.489540100097656, 0.4719196856021881, 114400, 8.991938482919262e-05]
2023-01-24 01:06:24,676 48k INFO ====> Epoch: 848
2023-01-24 01:09:31,655 48k INFO Train Epoch: 849 [89%]
2023-01-24 01:09:31,656 48k INFO [2.593916893005371, 2.2794933319091797, 8.342040061950684, 14.17483901977539, 0.6111498475074768, 114600, 8.990814490608897e-05]
2023-01-24 01:09:44,665 48k INFO ====> Epoch: 849
2023-01-24 01:12:49,319 48k INFO ====> Epoch: 850
2023-01-24 01:14:52,389 48k INFO Train Epoch: 851 [37%]
2023-01-24 01:14:52,390 48k INFO [2.2306602001190186, 2.740450620651245, 11.313689231872559, 17.76605224609375, 0.6868647336959839, 114800, 8.98856692746772e-05]
2023-01-24 01:16:07,653 48k INFO ====> Epoch: 851
2023-01-24 01:18:59,863 48k INFO Train Epoch: 852 [85%]
2023-01-24 01:18:59,867 48k INFO [2.1297261714935303, 2.6171114444732666, 10.697726249694824, 17.379894256591797, 0.6091629266738892, 115000, 8.987443356601786e-05]
2023-01-24 01:19:47,007 48k INFO Saving model and optimizer state at iteration 852 to ./logs/48k/G_115000.pth
2023-01-24 01:19:49,744 48k INFO Saving model and optimizer state at iteration 852 to ./logs/48k/D_115000.pth
2023-01-24 01:20:09,671 48k INFO ====> Epoch: 852
2023-01-24 01:26:14,036 48k INFO ====> Epoch: 853
2023-01-24 01:29:56,181 48k INFO Train Epoch: 854 [33%]
2023-01-24 01:29:56,182 48k INFO [2.2280683517456055, 2.597025156021118, 9.969812393188477, 18.624692916870117, 0.7157706022262573, 115200, 8.985196636191438e-05]
2023-01-24 01:31:16,192 48k INFO ====> Epoch: 854
2023-01-24 01:35:21,893 48k INFO Train Epoch: 855 [81%]
2023-01-24 01:35:21,895 48k INFO [2.267620086669922, 2.379413366317749, 8.467971801757812, 15.736746788024902, 0.7561547756195068, 115400, 8.984073486611914e-05]
2023-01-24 01:35:44,092 48k INFO ====> Epoch: 855
2023-01-24 01:40:12,038 48k INFO ====> Epoch: 856
2023-01-24 01:43:18,461 48k INFO Train Epoch: 857 [30%]
2023-01-24 01:43:19,093 48k INFO [2.296933889389038, 2.3579909801483154, 9.327986717224121, 16.19449234008789, 0.395641952753067, 115600, 8.981827608616408e-05]
2023-01-24 01:44:43,169 48k INFO ====> Epoch: 857
2023-01-24 01:48:35,253 48k INFO Train Epoch: 858 [78%]
2023-01-24 01:48:35,255 48k INFO [2.1075844764709473, 2.552182197570801, 10.29212474822998, 18.045150756835938, 0.6683355569839478, 115800, 8.98070488016533e-05]
2023-01-24 01:49:02,114 48k INFO ====> Epoch: 858
2023-01-24 01:53:34,236 48k INFO ====> Epoch: 859
2023-01-24 01:58:44,541 48k INFO Train Epoch: 860 [26%]
2023-01-24 01:58:44,551 48k INFO [2.343454122543335, 2.690607786178589, 9.490361213684082, 16.70221710205078, 0.3772018849849701, 116000, 8.978459844268802e-05]
2023-01-24 01:59:34,394 48k INFO Saving model and optimizer state at iteration 860 to ./logs/48k/G_116000.pth
2023-01-24 01:59:37,489 48k INFO Saving model and optimizer state at iteration 860 to ./logs/48k/D_116000.pth
2023-01-24 02:01:07,657 48k INFO ====> Epoch: 860
2023-01-24 02:05:53,596 48k INFO Train Epoch: 861 [74%]
2023-01-24 02:05:53,597 48k INFO [2.2088632583618164, 2.590723752975464, 9.74864673614502, 16.056129455566406, 0.8654630184173584, 116200, 8.977337536788267e-05]
2023-01-24 02:06:25,039 48k INFO ====> Epoch: 861
2023-01-24 02:11:21,406 48k INFO ====> Epoch: 862
2023-01-24 02:14:05,926 48k INFO Train Epoch: 863 [22%]
2023-01-24 02:14:05,927 48k INFO [2.41601300239563, 2.3777568340301514, 8.778372764587402, 14.703840255737305, 0.6797971129417419, 116400, 8.97509334267497e-05]
2023-01-24 02:15:38,971 48k INFO ====> Epoch: 863
2023-01-24 02:20:23,932 48k INFO Train Epoch: 864 [70%]
2023-01-24 02:20:23,933 48k INFO [2.5174648761749268, 2.422945976257324, 8.272797584533691, 13.170279502868652, 0.591995358467102, 116600, 8.973971456007135e-05]
2023-01-24 02:20:59,687 48k INFO ====> Epoch: 864
2023-01-24 02:25:30,865 48k INFO ====> Epoch: 865
2023-01-24 02:27:57,042 48k INFO Train Epoch: 866 [19%]
2023-01-24 02:27:57,043 48k INFO [2.2302703857421875, 2.573716402053833, 8.639214515686035, 15.60397720336914, 0.5089837908744812, 116800, 8.971728103361437e-05]
2023-01-24 02:29:34,287 48k INFO ====> Epoch: 866
2023-01-24 02:34:03,104 48k INFO Train Epoch: 867 [67%]
2023-01-24 02:34:03,105 48k INFO [2.1469473838806152, 2.5985772609710693, 10.048354148864746, 17.292098999023438, 0.5127435922622681, 117000, 8.970606637348517e-05]
2023-01-24 02:34:41,908 48k INFO Saving model and optimizer state at iteration 867 to ./logs/48k/G_117000.pth
2023-01-24 02:34:45,551 48k INFO Saving model and optimizer state at iteration 867 to ./logs/48k/D_117000.pth
2023-01-24 02:35:27,263 48k INFO ====> Epoch: 867
2023-01-24 02:40:03,028 48k INFO ====> Epoch: 868
2023-01-24 02:43:26,349 48k INFO Train Epoch: 869 [15%]
2023-01-24 02:43:26,350 48k INFO [2.3361477851867676, 2.445676326751709, 8.770320892333984, 15.492831230163574, 0.6163167953491211, 117200, 8.968364125854907e-05]
2023-01-24 02:45:08,631 48k INFO ====> Epoch: 869
2023-01-24 02:48:48,068 48k INFO Train Epoch: 870 [63%]
2023-01-24 02:48:48,070 48k INFO [2.4114487171173096, 2.437115430831909, 7.162383556365967, 14.654071807861328, 0.6696683168411255, 117400, 8.967243080339174e-05]
2023-01-24 02:49:33,047 48k INFO ====> Epoch: 870
2023-01-24 02:55:13,861 48k INFO ====> Epoch: 871
2023-01-24 02:59:05,921 48k INFO Train Epoch: 872 [11%]
2023-01-24 02:59:05,923 48k INFO [2.508545160293579, 2.245074987411499, 4.707458972930908, 13.092405319213867, 0.5073220133781433, 117600, 8.965001409682262e-05]
2023-01-24 03:00:51,714 48k INFO ====> Epoch: 872
2023-01-24 03:05:30,778 48k INFO Train Epoch: 873 [59%]
2023-01-24 03:05:30,779 48k INFO [2.374379873275757, 2.426685333251953, 7.1901750564575195, 14.402620315551758, 0.7548645734786987, 117800, 8.963880784506051e-05]
2023-01-24 03:06:19,562 48k INFO ====> Epoch: 873
2023-01-24 03:12:09,019 48k INFO ====> Epoch: 874
2023-01-24 03:14:35,209 48k INFO Train Epoch: 875 [7%]
2023-01-24 03:14:35,210 48k INFO [2.270081043243408, 2.7288336753845215, 10.495768547058105, 16.178165435791016, 0.7657378911972046, 118000, 8.961639954370562e-05]
2023-01-24 03:15:15,440 48k INFO Saving model and optimizer state at iteration 875 to ./logs/48k/G_118000.pth
2023-01-24 03:15:19,415 48k INFO Saving model and optimizer state at iteration 875 to ./logs/48k/D_118000.pth
2023-01-24 03:17:13,350 48k INFO ====> Epoch: 875
2023-01-24 03:21:24,292 48k INFO Train Epoch: 876 [56%]
2023-01-24 03:21:24,293 48k INFO [2.2705562114715576, 2.4452388286590576, 9.000776290893555, 15.994904518127441, 0.6686641573905945, 118200, 8.960519749376266e-05]
2023-01-24 03:22:17,747 48k INFO ====> Epoch: 876
2023-01-24 03:26:43,355 48k INFO ====> Epoch: 877
2023-01-24 03:29:24,953 48k INFO Train Epoch: 878 [4%]
2023-01-24 03:29:24,955 48k INFO [2.345506191253662, 2.4703733921051025, 9.772196769714355, 15.719067573547363, 0.4990023970603943, 118400, 8.958279759447042e-05]
2023-01-24 03:31:19,940 48k INFO ====> Epoch: 878
2023-01-24 03:35:28,717 48k INFO Train Epoch: 879 [52%]
2023-01-24 03:35:28,719 48k INFO [2.211075782775879, 2.7101547718048096, 9.99200439453125, 18.777149200439453, 0.7156846523284912, 118600, 8.957159974477111e-05]
2023-01-24 03:36:26,514 48k INFO ====> Epoch: 879
2023-01-24 03:40:53,740 48k INFO ====> Epoch: 880
2023-01-24 03:43:36,509 48k INFO Train Epoch: 881 [0%]
2023-01-24 03:43:36,523 48k INFO [2.373516798019409, 2.498539447784424, 8.041048049926758, 17.992860794067383, 0.6574445366859436, 118800, 8.954920824439115e-05]
2023-01-24 03:45:35,640 48k INFO ====> Epoch: 881
2023-01-24 03:48:42,071 48k INFO Train Epoch: 882 [48%]
2023-01-24 03:48:42,072 48k INFO [2.4019670486450195, 2.2995152473449707, 9.145539283752441, 15.478026390075684, 0.5866989493370056, 119000, 8.95380145933606e-05]
2023-01-24 03:49:26,483 48k INFO Saving model and optimizer state at iteration 882 to ./logs/48k/G_119000.pth
2023-01-24 03:49:29,843 48k INFO Saving model and optimizer state at iteration 882 to ./logs/48k/D_119000.pth
2023-01-24 03:50:34,245 48k INFO ====> Epoch: 882
2023-01-24 03:55:45,968 48k INFO Train Epoch: 883 [96%]
2023-01-24 03:55:45,969 48k INFO [2.1911134719848633, 2.934497356414795, 9.792662620544434, 18.165023803710938, 0.682482898235321, 119200, 8.952682234153643e-05]
2023-01-24 03:55:50,436 48k INFO ====> Epoch: 883
2023-01-24 04:01:40,679 48k INFO ====> Epoch: 884
2023-01-24 04:07:16,753 48k INFO Train Epoch: 885 [44%]
2023-01-24 04:07:16,754 48k INFO [2.2698683738708496, 2.7041776180267334, 9.945980072021484, 17.77116584777832, 0.45994144678115845, 119400, 8.950444203480763e-05]
2023-01-24 04:08:23,085 48k INFO ====> Epoch: 885
2023-01-24 04:13:30,180 48k INFO Train Epoch: 886 [93%]
2023-01-24 04:13:30,181 48k INFO [2.334658145904541, 2.4805824756622314, 9.90750789642334, 17.84634017944336, 0.8184427618980408, 119600, 8.949325397955328e-05]
2023-01-24 04:13:38,935 48k INFO ====> Epoch: 886
2023-01-24 04:17:55,572 48k INFO ====> Epoch: 887
2023-01-24 04:21:01,860 48k INFO Train Epoch: 888 [41%]
2023-01-24 04:21:01,861 48k INFO [2.2883315086364746, 2.5971925258636475, 9.481980323791504, 16.296920776367188, 0.6769079566001892, 119800, 8.947088206439049e-05]
2023-01-24 04:22:13,417 48k INFO ====> Epoch: 888
2023-01-24 04:27:03,471 48k INFO Train Epoch: 889 [89%]
2023-01-24 04:27:03,472 48k INFO [2.416964292526245, 2.354635715484619, 7.445323944091797, 15.93439769744873, 0.7168738842010498, 120000, 8.945969820413243e-05]
2023-01-24 04:27:46,687 48k INFO Saving model and optimizer state at iteration 889 to ./logs/48k/G_120000.pth
2023-01-24 04:27:50,522 48k INFO Saving model and optimizer state at iteration 889 to ./logs/48k/D_120000.pth
2023-01-24 04:28:06,428 48k INFO ====> Epoch: 889
2023-01-24 04:35:09,404 48k INFO ====> Epoch: 890
2023-01-24 04:39:27,875 48k INFO Train Epoch: 891 [37%]
2023-01-24 04:39:27,876 48k INFO [2.2515034675598145, 2.700775146484375, 11.125267028808594, 19.41693115234375, 0.5761598944664001, 120200, 8.943733467738917e-05]
2023-01-24 04:40:43,031 48k INFO ====> Epoch: 891
2023-01-24 04:45:22,634 48k INFO Train Epoch: 892 [85%]
2023-01-24 04:45:22,641 48k INFO [2.3725247383117676, 2.5496602058410645, 7.10774040222168, 13.974165916442871, 0.46760162711143494, 120400, 8.942615501055449e-05]
2023-01-24 04:45:40,276 48k INFO ====> Epoch: 892
2023-01-24 04:50:33,938 48k INFO ====> Epoch: 893
2023-01-24 04:53:36,296 48k INFO Train Epoch: 894 [33%]
2023-01-24 04:53:36,297 48k INFO [2.0922863483428955, 2.846620559692383, 11.545022010803223, 17.604333877563477, 0.7251648902893066, 120600, 8.940379986908551e-05]
2023-01-24 04:54:55,441 48k INFO ====> Epoch: 894
2023-01-24 04:59:46,035 48k INFO Train Epoch: 895 [81%]
2023-01-24 04:59:46,049 48k INFO [2.405564785003662, 2.336188316345215, 8.035738945007324, 14.186262130737305, 0.542035698890686, 120800, 8.939262439410188e-05]
2023-01-24 05:00:08,372 48k INFO ====> Epoch: 895
2023-01-24 05:05:05,874 48k INFO ====> Epoch: 896
2023-01-24 05:08:27,324 48k INFO Train Epoch: 897 [30%]
2023-01-24 05:08:27,325 48k INFO [2.2987334728240967, 2.4006593227386475, 9.552640914916992, 17.416976928710938, 0.7351034879684448, 121000, 8.93702776347631e-05]
2023-01-24 05:09:07,320 48k INFO Saving model and optimizer state at iteration 897 to ./logs/48k/G_121000.pth
2023-01-24 05:09:11,852 48k INFO Saving model and optimizer state at iteration 897 to ./logs/48k/D_121000.pth
2023-01-24 05:10:43,365 48k INFO ====> Epoch: 897
2023-01-24 05:15:46,837 48k INFO Train Epoch: 898 [78%]
2023-01-24 05:15:46,838 48k INFO [2.1878488063812256, 2.8730270862579346, 10.587799072265625, 18.22486114501953, 0.8790169358253479, 121200, 8.935910635005875e-05]
2023-01-24 05:16:13,315 48k INFO ====> Epoch: 898
2023-01-24 05:21:06,893 48k INFO ====> Epoch: 899
2023-01-24 05:24:35,241 48k INFO Train Epoch: 900 [26%]
2023-01-24 05:24:35,243 48k INFO [2.4396626949310303, 2.5926125049591064, 10.181572914123535, 16.392560958862305, 0.41187185049057007, 121400, 8.933676796970726e-05]
2023-01-24 05:26:04,370 48k INFO ====> Epoch: 900
2023-01-24 05:30:52,881 48k INFO Train Epoch: 901 [74%]
2023-01-24 05:30:52,882 48k INFO [2.418576717376709, 2.288241147994995, 7.332502365112305, 15.196415901184082, 0.6782897114753723, 121600, 8.932560087371105e-05]
2023-01-24 05:31:23,965 48k INFO ====> Epoch: 901
2023-01-24 05:36:17,087 48k INFO ====> Epoch: 902
2023-01-24 05:39:11,578 48k INFO Train Epoch: 903 [22%]
2023-01-24 05:39:11,579 48k INFO [2.3848416805267334, 2.5196380615234375, 9.137702941894531, 18.253910064697266, 0.5907771587371826, 121800, 8.930327086920513e-05]
2023-01-24 05:40:45,071 48k INFO ====> Epoch: 903
2023-01-24 05:44:14,350 48k INFO Train Epoch: 904 [70%]
2023-01-24 05:44:14,352 48k INFO [2.7734320163726807, 2.398744583129883, 7.537172794342041, 14.753615379333496, 0.2564801871776581, 122000, 8.929210796034647e-05]
2023-01-24 05:44:56,208 48k INFO Saving model and optimizer state at iteration 904 to ./logs/48k/G_122000.pth
2023-01-24 05:44:59,402 48k INFO Saving model and optimizer state at iteration 904 to ./logs/48k/D_122000.pth
2023-01-24 05:45:39,404 48k INFO ====> Epoch: 904
2023-01-24 05:50:02,869 48k INFO ====> Epoch: 905
2023-01-24 05:53:45,443 48k INFO Train Epoch: 906 [19%]
2023-01-24 05:53:45,444 48k INFO [2.3772075176239014, 2.4224138259887695, 7.717105388641357, 15.658172607421875, 0.6481421589851379, 122200, 8.926978632854556e-05]
2023-01-24 05:55:22,928 48k INFO ====> Epoch: 906
2023-01-24 06:00:17,686 48k INFO Train Epoch: 907 [67%]
2023-01-24 06:00:17,687 48k INFO [2.3330178260803223, 2.7028262615203857, 10.125439643859863, 15.950325012207031, 0.7774159908294678, 122400, 8.925862760525449e-05]
2023-01-24 06:00:57,582 48k INFO ====> Epoch: 907
2023-01-24 06:06:08,205 48k INFO ====> Epoch: 908
2023-01-24 06:09:01,324 48k INFO Train Epoch: 909 [15%]
2023-01-24 06:09:01,326 48k INFO [2.363442897796631, 2.453665256500244, 8.809946060180664, 16.151973724365234, 0.6405269503593445, 122600, 8.923631434301922e-05]
2023-01-24 06:10:43,030 48k INFO ====> Epoch: 909
2023-01-24 06:15:22,069 48k INFO Train Epoch: 910 [63%]
2023-01-24 06:15:22,070 48k INFO [2.2864959239959717, 2.7289490699768066, 8.321824073791504, 16.145069122314453, 0.7761829495429993, 122800, 8.922515980372634e-05]
2023-01-24 06:16:06,377 48k INFO ====> Epoch: 910
2023-01-24 06:21:31,902 48k INFO ====> Epoch: 911
2023-01-24 06:25:53,811 48k INFO Train Epoch: 912 [11%]
2023-01-24 06:25:53,818 48k INFO [2.4300360679626465, 2.2835588455200195, 7.115235805511475, 16.413192749023438, 0.8921555280685425, 123000, 8.920285490791852e-05]
2023-01-24 06:26:22,035 48k INFO Saving model and optimizer state at iteration 912 to ./logs/48k/G_123000.pth
2023-01-24 06:26:25,734 48k INFO Saving model and optimizer state at iteration 912 to ./logs/48k/D_123000.pth
2023-01-24 06:28:14,297 48k INFO ====> Epoch: 912
2023-01-24 06:32:33,874 48k INFO Train Epoch: 913 [59%]
2023-01-24 06:32:33,875 48k INFO [2.505786418914795, 2.1730003356933594, 7.551631450653076, 14.180998802185059, 0.7393680810928345, 123200, 8.919170455105502e-05]
2023-01-24 06:33:22,933 48k INFO ====> Epoch: 913
2023-01-24 06:38:47,352 48k INFO ====> Epoch: 914
2023-01-24 06:41:37,494 48k INFO Train Epoch: 915 [7%]
2023-01-24 06:41:37,496 48k INFO [2.3994598388671875, 2.4770824909210205, 9.938379287719727, 17.74770736694336, 0.4967344105243683, 123400, 8.916940801853763e-05]
2023-01-24 06:43:28,091 48k INFO ====> Epoch: 915
2023-01-24 06:46:54,722 48k INFO Train Epoch: 916 [56%]
2023-01-24 06:46:54,724 48k INFO [2.3733296394348145, 2.632746458053589, 7.958065986633301, 14.964531898498535, 0.7075625658035278, 123600, 8.915826184253531e-05]
2023-01-24 06:47:47,937 48k INFO ====> Epoch: 916
2023-01-24 06:52:22,104 48k INFO ====> Epoch: 917
2023-01-24 06:56:17,792 48k INFO Train Epoch: 918 [4%]
2023-01-24 06:56:17,794 48k INFO [2.224777936935425, 2.7068967819213867, 10.945693016052246, 17.748506546020508, 0.7510553598403931, 123800, 8.913597367017252e-05]
2023-01-24 06:58:12,533 48k INFO ====> Epoch: 918
2023-01-24 07:01:31,686 48k INFO Train Epoch: 919 [52%]
2023-01-24 07:01:31,687 48k INFO [2.2899041175842285, 2.6243979930877686, 10.251276969909668, 18.107534408569336, 0.6714720129966736, 124000, 8.912483167346374e-05]
2023-01-24 07:02:09,575 48k INFO Saving model and optimizer state at iteration 919 to ./logs/48k/G_124000.pth
2023-01-24 07:02:12,959 48k INFO Saving model and optimizer state at iteration 919 to ./logs/48k/D_124000.pth
2023-01-24 07:03:12,282 48k INFO ====> Epoch: 919
2023-01-24 07:07:59,643 48k INFO ====> Epoch: 920
2023-01-24 07:10:39,097 48k INFO Train Epoch: 921 [0%]
2023-01-24 07:10:39,098 48k INFO [2.342294931411743, 2.707080125808716, 8.5452880859375, 16.920276641845703, 0.602349579334259, 124200, 8.910255185812085e-05]
2023-01-24 07:12:38,882 48k INFO ====> Epoch: 921
2023-01-24 07:15:42,650 48k INFO Train Epoch: 922 [48%]
2023-01-24 07:15:42,652 48k INFO [2.350806474685669, 2.406909942626953, 10.102716445922852, 15.105365753173828, 0.8311254978179932, 124400, 8.909141403913858e-05]
2023-01-24 07:16:44,929 48k INFO ====> Epoch: 922
2023-01-24 07:20:57,927 48k INFO Train Epoch: 923 [96%]
2023-01-24 07:20:57,928 48k INFO [2.1908011436462402, 2.7757363319396973, 9.267443656921387, 17.5227108001709, 0.6477222442626953, 124600, 8.908027761238368e-05]
2023-01-24 07:21:02,283 48k INFO ====> Epoch: 923
2023-01-24 07:25:33,983 48k INFO ====> Epoch: 924
2023-01-24 07:29:35,453 48k INFO Train Epoch: 925 [44%]
2023-01-24 07:29:35,454 48k INFO [2.325500965118408, 2.615245819091797, 9.679145812988281, 16.105714797973633, 0.6716615557670593, 124800, 8.90580089348599e-05]
2023-01-24 07:30:41,884 48k INFO ====> Epoch: 925
2023-01-24 07:34:54,828 48k INFO Train Epoch: 926 [93%]
2023-01-24 07:34:54,844 48k INFO [2.238098621368408, 2.652151584625244, 9.994158744812012, 16.58289909362793, 0.9154826402664185, 125000, 8.904687668374304e-05]
2023-01-24 07:35:35,760 48k INFO Saving model and optimizer state at iteration 926 to ./logs/48k/G_125000.pth
2023-01-24 07:35:39,241 48k INFO Saving model and optimizer state at iteration 926 to ./logs/48k/D_125000.pth
2023-01-24 07:35:49,787 48k INFO ====> Epoch: 926
2023-01-24 07:40:15,990 48k INFO ====> Epoch: 927
2023-01-24 07:43:20,275 48k INFO Train Epoch: 928 [41%]
2023-01-24 07:43:20,276 48k INFO [2.3467118740081787, 2.285156726837158, 8.909309387207031, 16.642812728881836, 0.7600739002227783, 125200, 8.902461635592956e-05]
2023-01-24 07:44:31,034 48k INFO ====> Epoch: 928
2023-01-24 07:48:52,115 48k INFO Train Epoch: 929 [89%]
2023-01-24 07:48:52,129 48k INFO [2.4505679607391357, 2.2626490592956543, 6.712107181549072, 14.5064058303833, 0.6893880367279053, 125400, 8.901348827888507e-05]
2023-01-24 07:49:05,385 48k INFO ====> Epoch: 929
2023-01-24 07:53:15,299 48k INFO ====> Epoch: 930
2023-01-24 07:56:31,487 48k INFO Train Epoch: 931 [37%]
2023-01-24 07:56:31,488 48k INFO [2.37424373626709, 2.7452735900878906, 9.533987998962402, 19.105085372924805, 0.49708059430122375, 125600, 8.899123629765109e-05]
2023-01-24 07:57:46,709 48k INFO ====> Epoch: 931
2023-01-24 08:03:03,258 48k INFO Train Epoch: 932 [85%]
2023-01-24 08:03:03,361 48k INFO [2.223029613494873, 2.493638038635254, 10.30270004272461, 16.073158264160156, 0.551666796207428, 125800, 8.898011239311388e-05]
2023-01-24 08:03:21,247 48k INFO ====> Epoch: 932
2023-01-24 08:09:47,276 48k INFO ====> Epoch: 933
2023-01-24 08:12:30,725 48k INFO Train Epoch: 934 [33%]
2023-01-24 08:12:30,732 48k INFO [2.3147997856140137, 2.4733617305755615, 8.894704818725586, 16.155611038208008, 0.7579798102378845, 126000, 8.895786875532985e-05]
2023-01-24 08:13:14,448 48k INFO Saving model and optimizer state at iteration 934 to ./logs/48k/G_126000.pth
2023-01-24 08:13:17,161 48k INFO Saving model and optimizer state at iteration 934 to ./logs/48k/D_126000.pth
2023-01-24 08:14:38,932 48k INFO ====> Epoch: 934
2023-01-24 08:22:22,581 48k INFO Train Epoch: 935 [81%]
2023-01-24 08:22:22,587 48k INFO [2.170680522918701, 2.4684135913848877, 9.750361442565918, 16.441162109375, 0.5245339870452881, 126200, 8.894674902173544e-05]
2023-01-24 08:22:44,543 48k INFO ====> Epoch: 935
2023-01-24 08:30:06,997 48k INFO ====> Epoch: 936
2023-01-24 08:36:15,753 48k INFO Train Epoch: 937 [30%]
2023-01-24 08:36:15,950 48k INFO [2.4388816356658936, 2.3768088817596436, 8.005547523498535, 14.96677017211914, 0.7244198322296143, 126400, 8.892451372427295e-05]
2023-01-24 08:37:40,468 48k INFO ====> Epoch: 937
2023-01-24 08:45:03,649 48k INFO Train Epoch: 938 [78%]
2023-01-24 08:45:03,650 48k INFO [2.2467257976531982, 2.674919605255127, 11.087282180786133, 17.88282012939453, 0.6499661207199097, 126600, 8.891339816005741e-05]
2023-01-24 08:45:30,807 48k INFO ====> Epoch: 938
2023-01-24 08:52:38,176 48k INFO ====> Epoch: 939
2023-01-24 08:58:39,142 48k INFO Train Epoch: 940 [26%]
2023-01-24 08:58:39,143 48k INFO [2.3014354705810547, 2.5369577407836914, 9.574796676635742, 15.779595375061035, 0.654959499835968, 126800, 8.889117119978924e-05]
2023-01-24 09:00:08,258 48k INFO ====> Epoch: 940
2023-01-24 09:07:17,331 48k INFO Train Epoch: 941 [74%]
2023-01-24 09:07:17,332 48k INFO [2.383192777633667, 2.3288331031799316, 8.108811378479004, 16.69241714477539, 0.7560097575187683, 127000, 8.888005980338925e-05]
2023-01-24 09:08:15,013 48k INFO Saving model and optimizer state at iteration 941 to ./logs/48k/G_127000.pth
2023-01-24 09:08:18,855 48k INFO Saving model and optimizer state at iteration 941 to ./logs/48k/D_127000.pth
2023-01-24 09:08:51,203 48k INFO ====> Epoch: 941
2023-01-24 09:16:24,029 48k INFO ====> Epoch: 942
2023-01-24 09:21:03,835 48k INFO Train Epoch: 943 [22%]
2023-01-24 09:21:03,844 48k INFO [2.234377384185791, 2.5192151069641113, 10.002690315246582, 17.26537322998047, 0.5986073613166809, 127200, 8.885784117718933e-05]
2023-01-24 09:22:36,987 48k INFO ====> Epoch: 943
2023-01-24 09:27:19,025 48k INFO Train Epoch: 944 [70%]
2023-01-24 09:27:19,026 48k INFO [2.2593607902526855, 2.591383934020996, 8.27655029296875, 15.443259239196777, 0.5219033360481262, 127400, 8.884673394704218e-05]
2023-01-24 09:27:54,483 48k INFO ====> Epoch: 944
2023-01-24 09:34:01,692 48k INFO ====> Epoch: 945
2023-01-24 09:41:46,773 48k INFO Train Epoch: 946 [19%]
2023-01-24 09:41:46,774 48k INFO [2.372929573059082, 2.4876441955566406, 8.323272705078125, 14.689545631408691, 0.3364385962486267, 127600, 8.882452365178563e-05]
2023-01-24 09:43:23,811 48k INFO ====> Epoch: 946
2023-01-24 09:48:29,430 48k INFO Train Epoch: 947 [67%]
2023-01-24 09:48:29,431 48k INFO [2.3850855827331543, 2.4961462020874023, 9.868062019348145, 15.769991874694824, 0.6395869255065918, 127800, 8.881342058632916e-05]
2023-01-24 09:49:09,140 48k INFO ====> Epoch: 947
2023-01-24 09:55:58,916 48k INFO ====> Epoch: 948
2023-01-24 10:01:06,968 48k INFO Train Epoch: 949 [15%]
2023-01-24 10:01:06,995 48k INFO [2.2966606616973877, 2.576371908187866, 9.027908325195312, 18.13482666015625, 0.7445420026779175, 128000, 8.879121861889226e-05]
2023-01-24 10:01:52,277 48k INFO Saving model and optimizer state at iteration 949 to ./logs/48k/G_128000.pth
2023-01-24 10:01:55,888 48k INFO Saving model and optimizer state at iteration 949 to ./logs/48k/D_128000.pth
2023-01-24 10:03:39,861 48k INFO ====> Epoch: 949
2023-01-24 10:09:32,747 48k INFO Train Epoch: 950 [63%]
2023-01-24 10:09:32,749 48k INFO [2.3970704078674316, 2.344463348388672, 8.405228614807129, 14.75860595703125, 0.4083884656429291, 128200, 8.87801197165649e-05]
2023-01-24 10:10:17,138 48k INFO ====> Epoch: 950
2023-01-24 10:17:27,509 48k INFO ====> Epoch: 951
2023-01-24 10:22:18,117 48k INFO Train Epoch: 952 [11%]
2023-01-24 10:22:18,118 48k INFO [2.368924617767334, 2.3328959941864014, 9.289410591125488, 16.353334426879883, 0.4902975261211395, 128400, 8.875792607382512e-05]
2023-01-24 10:24:04,298 48k INFO ====> Epoch: 952
2023-01-24 10:28:22,008 48k INFO Train Epoch: 953 [59%]
2023-01-24 10:28:22,015 48k INFO [2.7066121101379395, 2.051213502883911, 5.616269588470459, 11.749345779418945, 0.8217427134513855, 128600, 8.874683133306588e-05]
2023-01-24 10:29:10,712 48k INFO ====> Epoch: 953
2023-01-24 10:33:11,506 48k INFO ====> Epoch: 954
2023-01-24 10:35:09,182 48k INFO Train Epoch: 955 [7%]
2023-01-24 10:35:09,183 48k INFO [2.3767642974853516, 2.4731106758117676, 8.445024490356445, 15.759056091308594, 0.6567166447639465, 128800, 8.872464601190185e-05]
2023-01-24 10:36:59,903 48k INFO ====> Epoch: 955
2023-01-24 10:41:04,206 48k INFO Train Epoch: 956 [56%]
2023-01-24 10:41:04,207 48k INFO [2.383204460144043, 2.4699764251708984, 7.346189975738525, 16.22126579284668, 0.6849616169929504, 129000, 8.871355543115036e-05]
2023-01-24 10:41:29,850 48k INFO Saving model and optimizer state at iteration 956 to ./logs/48k/G_129000.pth
2023-01-24 10:41:33,568 48k INFO Saving model and optimizer state at iteration 956 to ./logs/48k/D_129000.pth
2023-01-24 10:42:29,954 48k INFO ====> Epoch: 956
2023-01-24 10:46:20,901 48k INFO ====> Epoch: 957
2023-01-24 10:48:28,436 48k INFO Train Epoch: 958 [4%]
2023-01-24 10:48:28,438 48k INFO [2.1142971515655518, 2.581911563873291, 10.823942184448242, 16.858585357666016, 0.4343978464603424, 129200, 8.869137842844187e-05]
2023-01-24 10:50:23,930 48k INFO ====> Epoch: 958
2023-01-24 10:54:06,377 48k INFO Train Epoch: 959 [52%]
2023-01-24 10:54:06,378 48k INFO [2.2246387004852295, 2.58884334564209, 11.0679292678833, 17.955554962158203, 0.5920310616493225, 129400, 8.868029200613832e-05]
2023-01-24 10:55:04,207 48k INFO ====> Epoch: 959
2023-01-24 10:59:57,221 48k INFO ====> Epoch: 960
2023-01-24 11:02:01,611 48k INFO Train Epoch: 961 [0%]
2023-01-24 11:02:01,612 48k INFO [2.4025728702545166, 2.43831205368042, 10.938800811767578, 18.383474349975586, 0.7033311128616333, 129600, 8.865812331876634e-05]
2023-01-24 11:04:00,411 48k INFO ====> Epoch: 961
2023-01-24 11:06:58,646 48k INFO Train Epoch: 962 [48%]
2023-01-24 11:06:58,647 48k INFO [2.336359977722168, 2.5269980430603027, 9.261215209960938, 16.754287719726562, 0.6092328429222107, 129800, 8.864704105335148e-05]
2023-01-24 11:08:00,504 48k INFO ====> Epoch: 962
2023-01-24 11:11:48,284 48k INFO Train Epoch: 963 [96%]
2023-01-24 11:11:48,285 48k INFO [2.165881872177124, 2.5876612663269043, 10.29104995727539, 18.259506225585938, 0.6724473237991333, 130000, 8.86359601732198e-05]
2023-01-24 11:12:20,425 48k INFO Saving model and optimizer state at iteration 963 to ./logs/48k/G_130000.pth
2023-01-24 11:12:23,880 48k INFO Saving model and optimizer state at iteration 963 to ./logs/48k/D_130000.pth
2023-01-24 11:12:30,930 48k INFO ====> Epoch: 963
2023-01-24 11:16:40,509 48k INFO ====> Epoch: 964
2023-01-24 11:19:22,701 48k INFO Train Epoch: 965 [44%]
2023-01-24 11:19:22,702 48k INFO [2.1998541355133057, 2.6500535011291504, 10.452553749084473, 16.73428726196289, 0.7708006501197815, 130200, 8.861380256811337e-05]
2023-01-24 11:20:29,274 48k INFO ====> Epoch: 965
2023-01-24 11:24:06,341 48k INFO Train Epoch: 966 [93%]
2023-01-24 11:24:06,343 48k INFO [2.5896778106689453, 2.2461495399475098, 7.796726226806641, 13.170557975769043, 0.6700208187103271, 130400, 8.860272584279235e-05]
2023-01-24 11:24:15,204 48k INFO ====> Epoch: 966
2023-01-24 11:27:57,220 48k INFO ====> Epoch: 967
2023-01-24 11:30:32,794 48k INFO Train Epoch: 968 [41%]
2023-01-24 11:30:32,795 48k INFO [2.4782228469848633, 2.33512282371521, 8.22910213470459, 16.27603530883789, 0.790490448474884, 130600, 8.858057654574923e-05]
2023-01-24 11:31:43,745 48k INFO ====> Epoch: 968
2023-01-24 11:35:19,267 48k INFO Train Epoch: 969 [89%]
2023-01-24 11:35:19,268 48k INFO [2.3688297271728516, 2.5484979152679443, 9.604639053344727, 17.837507247924805, 0.8454071879386902, 130800, 8.856950397368101e-05]
2023-01-24 11:35:32,684 48k INFO ====> Epoch: 969
2023-01-24 11:39:30,586 48k INFO ====> Epoch: 970
2023-01-24 11:42:20,538 48k INFO Train Epoch: 971 [37%]
2023-01-24 11:42:20,539 48k INFO [2.4913182258605957, 2.3272712230682373, 7.661468505859375, 14.440009117126465, 0.9113873243331909, 131000, 8.854736298158609e-05]
2023-01-24 11:42:49,133 48k INFO Saving model and optimizer state at iteration 971 to ./logs/48k/G_131000.pth
2023-01-24 11:42:53,078 48k INFO Saving model and optimizer state at iteration 971 to ./logs/48k/D_131000.pth
2023-01-24 11:44:11,131 48k INFO ====> Epoch: 971
2023-01-24 11:47:56,345 48k INFO Train Epoch: 972 [85%]
2023-01-24 11:47:56,346 48k INFO [2.167165517807007, 3.0989508628845215, 8.541565895080566, 15.141590118408203, 0.8158867359161377, 131200, 8.853629456121339e-05]
2023-01-24 11:48:13,969 48k INFO ====> Epoch: 972
2023-01-24 11:52:33,696 48k INFO ====> Epoch: 973
2023-01-24 11:56:00,002 48k INFO Train Epoch: 974 [33%]
2023-01-24 11:56:00,003 48k INFO [2.2601571083068848, 2.554918050765991, 8.996829986572266, 15.853609085083008, 0.7885333895683289, 131400, 8.851416187095268e-05]
2023-01-24 11:57:19,817 48k INFO ====> Epoch: 974
2023-01-24 12:01:13,619 48k INFO Train Epoch: 975 [81%]
2023-01-24 12:01:13,620 48k INFO [2.2058591842651367, 2.613236427307129, 9.450592994689941, 16.132877349853516, 0.7862616181373596, 131600, 8.850309760071881e-05]
2023-01-24 12:01:36,080 48k INFO ====> Epoch: 975
2023-01-24 12:05:54,781 48k INFO ====> Epoch: 976
2023-01-24 12:08:16,964 48k INFO Train Epoch: 977 [30%]
2023-01-24 12:08:16,965 48k INFO [2.3652162551879883, 2.402324676513672, 8.518515586853027, 14.98678970336914, 0.5272234678268433, 131800, 8.848097320917952e-05]
2023-01-24 12:09:41,404 48k INFO ====> Epoch: 977
2023-01-24 12:13:05,076 48k INFO Train Epoch: 978 [78%]
2023-01-24 12:13:05,084 48k INFO [2.2876675128936768, 2.491758346557617, 9.781346321105957, 16.77726173400879, 0.7414491176605225, 132000, 8.846991308752837e-05]
2023-01-24 12:13:34,305 48k INFO Saving model and optimizer state at iteration 978 to ./logs/48k/G_132000.pth
2023-01-24 12:13:37,791 48k INFO Saving model and optimizer state at iteration 978 to ./logs/48k/D_132000.pth
2023-01-24 12:14:06,268 48k INFO ====> Epoch: 978
2023-01-24 12:18:09,862 48k INFO ====> Epoch: 979
2023-01-24 12:20:36,380 48k INFO Train Epoch: 980 [26%]
2023-01-24 12:20:36,381 48k INFO [2.239737033843994, 2.6695556640625, 9.444403648376465, 16.566242218017578, 0.6127023696899414, 132200, 8.844779699159887e-05]
2023-01-24 12:22:04,747 48k INFO ====> Epoch: 980
2023-01-24 12:25:33,459 48k INFO Train Epoch: 981 [74%]
2023-01-24 12:25:33,460 48k INFO [1.9290368556976318, 3.0351319313049316, 11.313934326171875, 17.534996032714844, 0.8880496621131897, 132400, 8.843674101697492e-05]
2023-01-24 12:26:04,639 48k INFO ====> Epoch: 981
2023-01-24 12:30:01,254 48k INFO ====> Epoch: 982
2023-01-24 12:32:17,143 48k INFO Train Epoch: 983 [22%]
2023-01-24 12:32:17,145 48k INFO [2.2383365631103516, 2.6990013122558594, 10.648435592651367, 16.387910842895508, 0.6811309456825256, 132600, 8.841463321354475e-05]
2023-01-24 12:33:49,930 48k INFO ====> Epoch: 983
2023-01-24 12:37:05,418 48k INFO Train Epoch: 984 [70%]
2023-01-24 12:37:05,419 48k INFO [2.4130959510803223, 2.22031831741333, 7.079253196716309, 11.777743339538574, 0.6533821821212769, 132800, 8.840358138439305e-05]
2023-01-24 12:37:40,780 48k INFO ====> Epoch: 984
2023-01-24 12:41:48,614 48k INFO ====> Epoch: 985
2023-01-24 12:44:01,718 48k INFO Train Epoch: 986 [19%]
2023-01-24 12:44:01,720 48k INFO [2.3187692165374756, 2.5434961318969727, 8.59481143951416, 13.836150169372559, 0.775833785533905, 133000, 8.83814818703529e-05]
2023-01-24 12:44:24,726 48k INFO Saving model and optimizer state at iteration 986 to ./logs/48k/G_133000.pth
2023-01-24 12:44:28,043 48k INFO Saving model and optimizer state at iteration 986 to ./logs/48k/D_133000.pth
2023-01-24 12:46:08,946 48k INFO ====> Epoch: 986
2023-01-24 12:49:39,419 48k INFO Train Epoch: 987 [67%]
2023-01-24 12:49:39,420 48k INFO [2.203028917312622, 2.5676159858703613, 10.867269515991211, 17.426687240600586, 0.7839927077293396, 133200, 8.83704341851191e-05]
2023-01-24 12:50:19,423 48k INFO ====> Epoch: 987
2023-01-24 12:54:20,973 48k INFO ====> Epoch: 988
2023-01-24 12:57:14,031 48k INFO Train Epoch: 989 [15%]
2023-01-24 12:57:14,039 48k INFO [2.3636884689331055, 2.5603020191192627, 8.482573509216309, 16.34256362915039, 0.7993679046630859, 133400, 8.834834295736085e-05]
2023-01-24 12:58:55,547 48k INFO ====> Epoch: 989
2023-01-24 13:02:19,389 48k INFO Train Epoch: 990 [63%]
2023-01-24 13:02:19,391 48k INFO [2.371615171432495, 2.3435933589935303, 7.38176155090332, 15.602129936218262, 0.7076907157897949, 133600, 8.833729941449117e-05]
2023-01-24 13:03:03,787 48k INFO ====> Epoch: 990
2023-01-24 13:07:08,242 48k INFO ====> Epoch: 991
2023-01-24 13:09:35,532 48k INFO Train Epoch: 992 [11%]
2023-01-24 13:09:35,533 48k INFO [2.552978754043579, 2.128122568130493, 4.91049337387085, 11.373445510864258, 0.5417351126670837, 133800, 8.831521646990785e-05]
2023-01-24 13:11:21,612 48k INFO ====> Epoch: 992
2023-01-24 13:14:58,910 48k INFO Train Epoch: 993 [59%]
2023-01-24 13:14:58,912 48k INFO [2.3740394115448, 2.5235350131988525, 9.102145195007324, 16.863126754760742, 0.71214759349823, 134000, 8.83041770678491e-05]
2023-01-24 13:15:25,865 48k INFO Saving model and optimizer state at iteration 993 to ./logs/48k/G_134000.pth
2023-01-24 13:15:29,239 48k INFO Saving model and optimizer state at iteration 993 to ./logs/48k/D_134000.pth
2023-01-24 13:16:20,759 48k INFO ====> Epoch: 993
2023-01-24 13:20:13,346 48k INFO ====> Epoch: 994
2023-01-24 13:22:14,556 48k INFO Train Epoch: 995 [7%]
2023-01-24 13:22:14,557 48k INFO [2.3349390029907227, 2.5828394889831543, 9.273600578308105, 15.772781372070312, 0.47690698504447937, 134200, 8.82821024033349e-05]
2023-01-24 13:24:05,987 48k INFO ====> Epoch: 995
2023-01-24 13:27:39,750 48k INFO Train Epoch: 996 [56%]
2023-01-24 13:27:39,752 48k INFO [2.281371593475342, 2.48960542678833, 8.571503639221191, 16.64261817932129, 0.7958012819290161, 134400, 8.827106714053447e-05]
2023-01-24 13:28:32,872 48k INFO ====> Epoch: 996
2023-01-24 13:32:31,140 48k INFO ====> Epoch: 997
2023-01-24 13:34:36,167 48k INFO Train Epoch: 998 [4%]
2023-01-24 13:34:36,168 48k INFO [2.2668139934539795, 2.446507215499878, 9.691059112548828, 17.032270431518555, 0.7010385990142822, 134600, 8.824900075298475e-05]
2023-01-24 13:36:31,313 48k INFO ====> Epoch: 998
2023-01-24 13:39:58,986 48k INFO Train Epoch: 999 [52%]
2023-01-24 13:39:58,987 48k INFO [2.290771961212158, 2.5353729724884033, 10.958755493164062, 17.421985626220703, 0.8024544715881348, 134800, 8.823796962789062e-05]
2023-01-24 13:40:56,592 48k INFO ====> Epoch: 999
2023-01-24 13:44:46,333 48k INFO ====> Epoch: 1000
2023-01-24 13:46:40,027 48k INFO Train Epoch: 1001 [0%]
2023-01-24 13:46:40,028 48k INFO [2.192291021347046, 2.6594882011413574, 10.650835990905762, 17.17722511291504, 0.7607368230819702, 135000, 8.821591151420192e-05]
2023-01-24 13:46:57,956 48k INFO Saving model and optimizer state at iteration 1001 to ./logs/48k/G_135000.pth
2023-01-24 13:47:01,829 48k INFO Saving model and optimizer state at iteration 1001 to ./logs/48k/D_135000.pth
2023-01-24
13:49:04,592 48k INFO ====> Epoch: 1001 2023-01-24 13:52:18,615 48k INFO Train Epoch: 1002 [48%] 2023-01-24 13:52:18,616 48k INFO [2.460209608078003, 2.318105697631836, 8.28864574432373, 15.640586853027344, 0.5816314220428467, 135200, 8.820488452526264e-05] 2023-01-24 13:53:20,825 48k INFO ====> Epoch: 1002 2023-01-24 13:57:31,252 48k INFO Train Epoch: 1003 [96%] 2023-01-24 13:57:31,254 48k INFO [2.2349467277526855, 2.60292649269104, 10.696944236755371, 16.865493774414062, 0.8913800120353699, 135400, 8.819385891469698e-05] 2023-01-24 13:57:35,626 48k INFO ====> Epoch: 1003 2023-01-24 14:01:36,030 48k INFO ====> Epoch: 1004 2023-01-24 14:04:19,439 48k INFO Train Epoch: 1005 [44%] 2023-01-24 14:04:19,440 48k INFO [2.260882616043091, 2.5552151203155518, 7.34102725982666, 13.149063110351562, 0.5009228587150574, 135600, 8.817181182799734e-05] 2023-01-24 14:05:25,843 48k INFO ====> Epoch: 1005 2023-01-24 14:09:11,486 48k INFO Train Epoch: 1006 [93%] 2023-01-24 14:09:11,487 48k INFO [2.264019012451172, 2.5430655479431152, 9.107340812683105, 15.649896621704102, 0.6938570141792297, 135800, 8.816079035151885e-05] 2023-01-24 14:09:20,534 48k INFO ====> Epoch: 1006 2023-01-24 14:13:10,458 48k INFO ====> Epoch: 1007 2023-01-24 14:15:44,344 48k INFO Train Epoch: 1008 [41%] 2023-01-24 14:15:44,345 48k INFO [2.2075512409210205, 2.412741184234619, 11.469490051269531, 18.373615264892578, 0.5486322045326233, 136000, 8.813875153144332e-05] 2023-01-24 14:16:09,689 48k INFO Saving model and optimizer state at iteration 1008 to ./logs/48k/G_136000.pth 2023-01-24 14:16:14,201 48k INFO Saving model and optimizer state at iteration 1008 to ./logs/48k/D_136000.pth 2023-01-24 14:17:27,247 48k INFO ====> Epoch: 1008 2023-01-24 14:20:50,118 48k INFO Train Epoch: 1009 [89%] 2023-01-24 14:20:50,119 48k INFO [2.3657827377319336, 2.620313882827759, 9.303169250488281, 14.954667091369629, 0.4428693950176239, 136200, 8.812773418750188e-05] 2023-01-24 14:21:03,442 48k INFO ====> Epoch: 1009 2023-01-24 
14:24:48,368 48k INFO ====> Epoch: 1010 2023-01-24 14:27:56,055 48k INFO Train Epoch: 1011 [37%] 2023-01-24 14:27:56,064 48k INFO [2.2675838470458984, 2.6346065998077393, 10.902297019958496, 18.432661056518555, 0.7014081478118896, 136400, 8.810570363095084e-05] 2023-01-24 14:29:11,234 48k INFO ====> Epoch: 1011 2023-01-24 14:32:28,310 48k INFO Train Epoch: 1012 [85%] 2023-01-24 14:32:28,312 48k INFO [2.196852922439575, 2.8430356979370117, 8.90884017944336, 14.575113296508789, 0.6819460988044739, 136600, 8.809469041799697e-05] 2023-01-24 14:32:46,163 48k INFO ====> Epoch: 1012 2023-01-24 14:36:31,043 48k INFO ====> Epoch: 1013 2023-01-24 14:38:56,043 48k INFO Train Epoch: 1014 [33%] 2023-01-24 14:38:56,044 48k INFO [2.2347311973571777, 2.494896173477173, 10.056912422180176, 15.360034942626953, 0.7576830983161926, 136800, 8.8072668121872e-05] 2023-01-24 14:40:15,731 48k INFO ====> Epoch: 1014 2023-01-24 14:43:42,708 48k INFO Train Epoch: 1015 [81%] 2023-01-24 14:43:42,709 48k INFO [2.21812105178833, 2.509401798248291, 8.317375183105469, 14.258973121643066, 0.4731205403804779, 137000, 8.806165903835676e-05] 2023-01-24 14:44:09,293 48k INFO Saving model and optimizer state at iteration 1015 to ./logs/48k/G_137000.pth 2023-01-24 14:44:13,072 48k INFO Saving model and optimizer state at iteration 1015 to ./logs/48k/D_137000.pth 2023-01-24 14:44:37,131 48k INFO ====> Epoch: 1015 2023-01-24 14:48:17,366 48k INFO ====> Epoch: 1016 2023-01-24 14:51:30,890 48k INFO Train Epoch: 1017 [30%] 2023-01-24 14:51:30,891 48k INFO [2.4263815879821777, 2.294304370880127, 8.365973472595215, 15.908145904541016, 0.46071305871009827, 137200, 8.803964499956059e-05] 2023-01-24 14:52:55,597 48k INFO ====> Epoch: 1017 2023-01-24 14:57:01,920 48k INFO Train Epoch: 1018 [78%] 2023-01-24 14:57:01,921 48k INFO [2.2131989002227783, 2.8089892864227295, 9.916055679321289, 16.79903793334961, 0.5719242691993713, 137400, 8.802864004393564e-05] 2023-01-24 14:57:28,404 48k INFO ====> Epoch: 1018 2023-01-24 
15:01:16,046 48k INFO ====> Epoch: 1019 2023-01-24 15:03:41,154 48k INFO Train Epoch: 1020 [26%] 2023-01-24 15:03:41,155 48k INFO [2.4002487659454346, 2.525844097137451, 9.291467666625977, 16.415414810180664, 0.5278704762458801, 137600, 8.800663425937214e-05] 2023-01-24 15:05:09,534 48k INFO ====> Epoch: 1020 2023-01-24 15:08:30,688 48k INFO Train Epoch: 1021 [74%] 2023-01-24 15:08:30,699 48k INFO [2.3566534519195557, 2.3901896476745605, 8.346665382385254, 15.186899185180664, 0.5874903202056885, 137800, 8.799563343008971e-05] 2023-01-24 15:09:01,652 48k INFO ====> Epoch: 1021 2023-01-24 15:12:51,392 48k INFO ====> Epoch: 1022 2023-01-24 15:15:25,803 48k INFO Train Epoch: 1023 [22%] 2023-01-24 15:15:25,804 48k INFO [2.322918653488159, 2.4504711627960205, 10.039124488830566, 17.09600830078125, 0.6732893586158752, 138000, 8.797363589666394e-05] 2023-01-24 15:15:53,365 48k INFO Saving model and optimizer state at iteration 1023 to ./logs/48k/G_138000.pth 2023-01-24 15:15:56,867 48k INFO Saving model and optimizer state at iteration 1023 to ./logs/48k/D_138000.pth 2023-01-24 15:17:35,722 48k INFO ====> Epoch: 1023 2023-01-24 15:21:21,087 48k INFO Train Epoch: 1024 [70%] 2023-01-24 15:21:21,088 48k INFO [2.443931818008423, 2.328697919845581, 7.2632575035095215, 13.535918235778809, 0.5625838041305542, 138200, 8.796263919217686e-05] 2023-01-24 15:21:56,512 48k INFO ====> Epoch: 1024 2023-01-24 15:25:57,038 48k INFO ====> Epoch: 1025 2023-01-24 15:28:48,726 48k INFO Train Epoch: 1026 [19%] 2023-01-24 15:28:48,728 48k INFO [2.398498296737671, 2.2994558811187744, 7.148616790771484, 15.693490028381348, 0.7121820449829102, 138400, 8.794064990679505e-05] 2023-01-24 15:30:26,972 48k INFO ====> Epoch: 1026 2023-01-24 15:33:33,854 48k INFO Train Epoch: 1027 [67%] 2023-01-24 15:33:33,855 48k INFO [2.328423500061035, 2.5641744136810303, 10.024524688720703, 17.750375747680664, 0.6620914936065674, 138600, 8.79296573255567e-05] 2023-01-24 15:34:13,875 48k INFO ====> Epoch: 1027 
2023-01-24 15:37:56,293 48k INFO ====> Epoch: 1028 2023-01-24 15:39:45,508 48k INFO Train Epoch: 1029 [15%] 2023-01-24 15:39:45,510 48k INFO [2.419628381729126, 2.444660186767578, 9.568704605102539, 17.04450035095215, 0.8644347786903381, 138800, 8.79076762851262e-05] 2023-01-24 15:41:27,037 48k INFO ====> Epoch: 1029 2023-01-24 15:44:51,336 48k INFO Train Epoch: 1030 [63%] 2023-01-24 15:44:51,337 48k INFO [2.5772767066955566, 2.294473171234131, 7.28826904296875, 13.532031059265137, 0.4998149871826172, 139000, 8.789668782559057e-05] 2023-01-24 15:45:19,363 48k INFO Saving model and optimizer state at iteration 1030 to ./logs/48k/G_139000.pth 2023-01-24 15:45:22,583 48k INFO Saving model and optimizer state at iteration 1030 to ./logs/48k/D_139000.pth 2023-01-24 15:46:09,451 48k INFO ====> Epoch: 1030 2023-01-24 15:50:05,605 48k INFO ====> Epoch: 1031 2023-01-24 15:52:20,771 48k INFO Train Epoch: 1032 [11%] 2023-01-24 15:52:20,773 48k INFO [2.315751314163208, 2.513489246368408, 6.801383972167969, 11.861603736877441, 0.4303009510040283, 139200, 8.787471502701991e-05] 2023-01-24 15:54:08,167 48k INFO ====> Epoch: 1032 2023-01-24 15:57:07,946 48k INFO Train Epoch: 1033 [59%] 2023-01-24 15:57:08,345 48k INFO [2.2760746479034424, 2.8039894104003906, 10.41810131072998, 17.939348220825195, 0.9081893563270569, 139400, 8.786373068764153e-05] 2023-01-24 15:57:57,829 48k INFO ====> Epoch: 1033 2023-01-24 16:01:47,451 48k INFO ====> Epoch: 1034 2023-01-24 16:03:38,012 48k INFO Train Epoch: 1035 [7%] 2023-01-24 16:03:38,014 48k INFO [2.3704216480255127, 2.5675487518310547, 8.54083251953125, 15.581677436828613, 0.9121060967445374, 139600, 8.784176612784041e-05] 2023-01-24 16:05:30,296 48k INFO ====> Epoch: 1035 2023-01-24 16:08:21,611 48k INFO Train Epoch: 1036 [56%] 2023-01-24 16:08:21,629 48k INFO [2.4463436603546143, 2.22824764251709, 6.642850399017334, 14.234620094299316, 0.7040808796882629, 139800, 8.783078590707442e-05] 2023-01-24 16:09:15,369 48k INFO ====> Epoch: 1036 
2023-01-24 16:12:59,100 48k INFO ====> Epoch: 1037 2023-01-24 16:14:47,571 48k INFO Train Epoch: 1038 [4%] 2023-01-24 16:14:47,572 48k INFO [2.301335573196411, 2.4427950382232666, 9.12661075592041, 17.540971755981445, 0.7221667766571045, 140000, 8.780882958295367e-05] 2023-01-24 16:15:02,764 48k INFO Saving model and optimizer state at iteration 1038 to ./logs/48k/G_140000.pth 2023-01-24 16:15:05,667 48k INFO Saving model and optimizer state at iteration 1038 to ./logs/48k/D_140000.pth 2023-01-24 16:17:03,916 48k INFO ====> Epoch: 1038 2023-01-24 16:20:05,348 48k INFO Train Epoch: 1039 [52%] 2023-01-24 16:20:05,349 48k INFO [2.1551902294158936, 2.7152256965637207, 12.012781143188477, 17.54714584350586, 0.4605304002761841, 140200, 8.779785347925579e-05] 2023-01-24 16:21:03,217 48k INFO ====> Epoch: 1039 2023-01-24 16:25:07,910 48k INFO ====> Epoch: 1040 2023-01-24 16:27:04,944 48k INFO Train Epoch: 1041 [0%] 2023-01-24 16:27:04,945 48k INFO [2.3772571086883545, 2.412208080291748, 7.792888641357422, 13.902185440063477, 0.6131341457366943, 140400, 8.777590538772743e-05] 2023-01-24 16:29:05,475 48k INFO ====> Epoch: 1041 2023-01-24 16:32:07,460 48k INFO Train Epoch: 1042 [48%] 2023-01-24 16:32:07,466 48k INFO [2.3742785453796387, 2.4044880867004395, 10.338272094726562, 16.354406356811523, 0.6182767748832703, 140600, 8.776493339955396e-05] 2023-01-24 16:33:09,415 48k INFO ====> Epoch: 1042 2023-01-24 16:37:02,785 48k INFO Train Epoch: 1043 [96%] 2023-01-24 16:37:02,790 48k INFO [2.2375733852386475, 2.5731706619262695, 10.212307929992676, 18.089509963989258, 0.9894091486930847, 140800, 8.775396278287901e-05] 2023-01-24 16:37:07,192 48k INFO ====> Epoch: 1043 2023-01-24 16:41:04,870 48k INFO ====> Epoch: 1044 2023-01-24 16:43:38,864 48k INFO Train Epoch: 1045 [44%] 2023-01-24 16:43:38,865 48k INFO [2.467703342437744, 2.476367950439453, 9.43134880065918, 13.58586597442627, 0.5820820927619934, 141000, 8.773202566333896e-05] 2023-01-24 16:44:08,710 48k INFO Saving model and 
optimizer state at iteration 1045 to ./logs/48k/G_141000.pth 2023-01-24 16:44:11,936 48k INFO Saving model and optimizer state at iteration 1045 to ./logs/48k/D_141000.pth 2023-01-24 16:45:21,091 48k INFO ====> Epoch: 1045 2023-01-24 16:49:22,514 48k INFO Train Epoch: 1046 [93%] 2023-01-24 16:49:22,522 48k INFO [2.2248809337615967, 2.8249564170837402, 12.018491744995117, 18.074153900146484, 0.6494466066360474, 141200, 8.772105916013104e-05] 2023-01-24 16:49:31,566 48k INFO ====> Epoch: 1046 2023-01-24 16:54:16,220 48k INFO ====> Epoch: 1047 2023-01-24 16:57:26,231 48k INFO Train Epoch: 1048 [41%] 2023-01-24 16:57:26,233 48k INFO [2.306882858276367, 2.2669126987457275, 8.630667686462402, 15.971885681152344, 0.5158030986785889, 141400, 8.769913026598255e-05] 2023-01-24 16:58:37,427 48k INFO ====> Epoch: 1048 2023-01-24 17:02:33,906 48k INFO Train Epoch: 1049 [89%] 2023-01-24 17:02:33,929 48k INFO [2.2589516639709473, 2.552764892578125, 9.740279197692871, 14.73220157623291, 0.7936055660247803, 141600, 8.76881678746993e-05] 2023-01-24 17:02:47,170 48k INFO ====> Epoch: 1049 2023-01-24 17:06:37,905 48k INFO ====> Epoch: 1050 2023-01-24 17:09:22,863 48k INFO Train Epoch: 1051 [37%] 2023-01-24 17:09:22,864 48k INFO [2.1441922187805176, 2.907041072845459, 11.131209373474121, 16.9514217376709, 0.6796467900276184, 141800, 8.766624720285824e-05] 2023-01-24 17:10:38,859 48k INFO ====> Epoch: 1051 2023-01-24 17:14:24,487 48k INFO Train Epoch: 1052 [85%] 2023-01-24 17:14:24,489 48k INFO [2.241797924041748, 2.624675750732422, 10.281003952026367, 16.836729049682617, 0.5666868686676025, 142000, 8.765528892195788e-05] 2023-01-24 17:14:49,476 48k INFO Saving model and optimizer state at iteration 1052 to ./logs/48k/G_142000.pth 2023-01-24 17:14:52,584 48k INFO Saving model and optimizer state at iteration 1052 to ./logs/48k/D_142000.pth 2023-01-24 17:15:12,920 48k INFO ====> Epoch: 1052 2023-01-24 17:18:53,924 48k INFO ====> Epoch: 1053 2023-01-24 17:21:44,838 48k INFO Train Epoch: 
1054 [33%] 2023-01-24 17:21:45,009 48k INFO [2.326155185699463, 2.725893259048462, 9.850942611694336, 17.722524642944336, 0.6583234071731567, 142200, 8.763337646934128e-05] 2023-01-24 17:23:05,478 48k INFO ====> Epoch: 1054 2023-01-24 17:27:17,342 48k INFO Train Epoch: 1055 [81%] 2023-01-24 17:27:17,343 48k INFO [2.2548110485076904, 2.397754192352295, 8.394743919372559, 15.185770034790039, 0.3242291808128357, 142400, 8.76224222972826e-05] 2023-01-24 17:27:39,471 48k INFO ====> Epoch: 1055 2023-01-24 17:31:39,963 48k INFO ====> Epoch: 1056 2023-01-24 17:34:14,476 48k INFO Train Epoch: 1057 [30%] 2023-01-24 17:34:14,477 48k INFO [2.333590507507324, 2.3811352252960205, 9.771147727966309, 17.385190963745117, 0.730596125125885, 142600, 8.760051806080861e-05] 2023-01-24 17:35:38,969 48k INFO ====> Epoch: 1057 2023-01-24 17:39:17,563 48k INFO Train Epoch: 1058 [78%] 2023-01-24 17:39:17,564 48k INFO [2.1574301719665527, 2.6842057704925537, 10.799875259399414, 17.912256240844727, 0.8442974090576172, 142800, 8.758956799605101e-05] 2023-01-24 17:39:44,398 48k INFO ====> Epoch: 1058 2023-01-24 17:44:12,742 48k INFO ====> Epoch: 1059 2023-01-24 17:46:27,523 48k INFO Train Epoch: 1060 [26%] 2023-01-24 17:46:27,525 48k INFO [2.2499756813049316, 2.8974809646606445, 9.544875144958496, 15.728507041931152, 0.6234217286109924, 143000, 8.756767197263899e-05] 2023-01-24 17:46:55,956 48k INFO Saving model and optimizer state at iteration 1060 to ./logs/48k/G_143000.pth 2023-01-24 17:46:59,304 48k INFO Saving model and optimizer state at iteration 1060 to ./logs/48k/D_143000.pth 2023-01-24 17:48:30,350 48k INFO ====> Epoch: 1060 2023-01-24 17:51:47,848 48k INFO Train Epoch: 1061 [74%] 2023-01-24 17:51:47,849 48k INFO [2.2480645179748535, 2.5877647399902344, 11.084808349609375, 17.529296875, 0.7315841317176819, 143200, 8.75567260136424e-05] 2023-01-24 17:52:18,910 48k INFO ====> Epoch: 1061 2023-01-24 17:56:12,994 48k INFO ====> Epoch: 1062 2023-01-24 17:58:28,972 48k INFO Train Epoch: 
1063 [22%] 2023-01-24 17:58:28,973 48k INFO [2.3343071937561035, 2.443188190460205, 8.241781234741211, 15.75684642791748, 0.64064621925354, 143400, 8.753483820021283e-05] 2023-01-24 18:00:02,827 48k INFO ====> Epoch: 1063 2023-01-24 18:03:56,239 48k INFO Train Epoch: 1064 [70%] 2023-01-24 18:03:56,240 48k INFO [2.319941282272339, 2.4856839179992676, 7.868932247161865, 13.647297859191895, 0.3802858889102936, 143600, 8.75238963454378e-05] 2023-01-24 18:04:31,834 48k INFO ====> Epoch: 1064 2023-01-24 18:08:29,135 48k INFO ====> Epoch: 1065 2023-01-24 18:10:47,986 48k INFO Train Epoch: 1066 [19%] 2023-01-24 18:10:47,988 48k INFO [2.453183174133301, 2.5420143604278564, 9.898748397827148, 17.704105377197266, 0.5305092930793762, 143800, 8.750201673891232e-05] 2023-01-24 18:12:26,742 48k INFO ====> Epoch: 1066 2023-01-24 18:15:40,705 48k INFO Train Epoch: 1067 [67%] 2023-01-24 18:15:40,706 48k INFO [2.18045711517334, 2.919733762741089, 10.72185230255127, 16.216617584228516, 0.6501802206039429, 144000, 8.749107898681995e-05] 2023-01-24 18:16:06,455 48k INFO Saving model and optimizer state at iteration 1067 to ./logs/48k/G_144000.pth 2023-01-24 18:16:09,632 48k INFO Saving model and optimizer state at iteration 1067 to ./logs/48k/D_144000.pth 2023-01-24 18:16:51,450 48k INFO ====> Epoch: 1067 2023-01-24 18:20:41,127 48k INFO ====> Epoch: 1068 2023-01-24 18:23:07,831 48k INFO Train Epoch: 1069 [15%] 2023-01-24 18:23:07,832 48k INFO [2.22476863861084, 2.802706480026245, 8.417036056518555, 15.677046775817871, 0.5930490493774414, 144200, 8.746920758412135e-05] 2023-01-24 18:24:49,825 48k INFO ====> Epoch: 1069 2023-01-24 18:28:03,032 48k INFO Train Epoch: 1070 [63%] 2023-01-24 18:28:03,034 48k INFO [2.4590110778808594, 2.504945755004883, 7.904061794281006, 12.534356117248535, 0.7614327669143677, 144400, 8.745827393317333e-05] 2023-01-24 18:28:47,466 48k INFO ====> Epoch: 1070 2023-01-24 18:32:54,602 48k INFO ====> Epoch: 1071 2023-01-24 18:35:05,327 48k INFO Train Epoch: 1072 
[11%] 2023-01-24 18:35:05,328 48k INFO [2.245537757873535, 2.4772446155548096, 7.482198238372803, 16.925077438354492, 0.5987738370895386, 144600, 8.743641073122557e-05] 2023-01-24 18:36:52,290 48k INFO ====> Epoch: 1072 2023-01-24 18:40:02,555 48k INFO Train Epoch: 1073 [59%] 2023-01-24 18:40:02,557 48k INFO [2.3360278606414795, 2.5952558517456055, 7.202507495880127, 13.977287292480469, 0.5498557686805725, 144800, 8.742548117988416e-05] 2023-01-24 18:40:51,703 48k INFO ====> Epoch: 1073 2023-01-24 18:44:37,645 48k INFO ====> Epoch: 1074 2023-01-24 18:46:26,157 48k INFO Train Epoch: 1075 [7%] 2023-01-24 18:46:26,158 48k INFO [2.2057528495788574, 2.7454776763916016, 12.11148452758789, 18.246980667114258, 0.6439139246940613, 145000, 8.740362617561233e-05] 2023-01-24 18:46:42,103 48k INFO Saving model and optimizer state at iteration 1075 to ./logs/48k/G_145000.pth 2023-01-24 18:46:46,007 48k INFO Saving model and optimizer state at iteration 1075 to ./logs/48k/D_145000.pth 2023-01-24 18:48:39,445 48k INFO ====> Epoch: 1075 2023-01-24 18:51:37,865 48k INFO Train Epoch: 1076 [56%] 2023-01-24 18:51:37,872 48k INFO [2.2833662033081055, 2.5619521141052246, 7.042793273925781, 15.581974983215332, 0.6418136954307556, 145200, 8.739270072234037e-05] 2023-01-24 18:52:31,068 48k INFO ====> Epoch: 1076 2023-01-24 18:56:15,924 48k INFO ====> Epoch: 1077 2023-01-24 18:58:16,072 48k INFO Train Epoch: 1078 [4%] 2023-01-24 18:58:16,073 48k INFO [2.2614712715148926, 2.487532615661621, 10.590919494628906, 16.971864700317383, 0.7808196544647217, 145400, 8.737085391267073e-05] 2023-01-24 19:00:11,951 48k INFO ====> Epoch: 1078 2023-01-24 19:03:19,502 48k INFO Train Epoch: 1079 [52%] 2023-01-24 19:03:19,504 48k INFO [2.2330892086029053, 2.533282518386841, 9.623794555664062, 16.27862548828125, 0.5698922872543335, 145600, 8.735993255593163e-05] 2023-01-24 19:04:17,608 48k INFO ====> Epoch: 1079 2023-01-24 19:08:02,076 48k INFO ====> Epoch: 1080 2023-01-24 19:10:04,439 48k INFO Train Epoch: 
1081 [0%] 2023-01-24 19:10:04,440 48k INFO [2.3971638679504395, 2.288071632385254, 8.538543701171875, 15.733111381530762, 0.49310338497161865, 145800, 8.73380939377916e-05] 2023-01-24 19:12:05,176 48k INFO ====> Epoch: 1081 2023-01-24 19:15:21,756 48k INFO Train Epoch: 1082 [48%] 2023-01-24 19:15:21,758 48k INFO [2.259730815887451, 2.55236554145813, 10.0763578414917, 16.95671272277832, 0.5885329842567444, 146000, 8.732717667604937e-05] 2023-01-24 19:15:45,587 48k INFO Saving model and optimizer state at iteration 1082 to ./logs/48k/G_146000.pth 2023-01-24 19:15:49,323 48k INFO Saving model and optimizer state at iteration 1082 to ./logs/48k/D_146000.pth 2023-01-24 19:16:54,821 48k INFO ====> Epoch: 1082 2023-01-24 19:21:20,776 48k INFO Train Epoch: 1083 [96%] 2023-01-24 19:21:20,777 48k INFO [2.322312831878662, 2.6005892753601074, 10.539690017700195, 16.488826751708984, 0.7264026403427124, 146200, 8.731626077896486e-05] 2023-01-24 19:21:25,411 48k INFO ====> Epoch: 1083 2023-01-24 19:25:15,450 48k INFO ====> Epoch: 1084 2023-01-24 19:27:56,097 48k INFO Train Epoch: 1085 [44%] 2023-01-24 19:27:56,098 48k INFO [2.320936441421509, 2.4007115364074707, 9.070075035095215, 13.514190673828125, 0.6026538014411926, 146400, 8.729443307808668e-05] 2023-01-24 19:29:02,690 48k INFO ====> Epoch: 1085 2023-01-24 19:32:46,526 48k INFO Train Epoch: 1086 [93%] 2023-01-24 19:32:46,528 48k INFO [2.498831272125244, 2.473767042160034, 9.091024398803711, 14.835451126098633, 0.6217287182807922, 146600, 8.728352127395191e-05] 2023-01-24 19:32:55,391 48k INFO ====> Epoch: 1086 2023-01-24 19:36:52,796 48k INFO ====> Epoch: 1087 2023-01-24 19:39:54,546 48k INFO Train Epoch: 1088 [41%] 2023-01-24 19:39:54,547 48k INFO [2.293199062347412, 2.4486160278320312, 9.011993408203125, 15.346458435058594, 0.9360194802284241, 146800, 8.726170175743843e-05] 2023-01-24 19:41:05,508 48k INFO ====> Epoch: 1088 2023-01-24 19:44:47,401 48k INFO Train Epoch: 1089 [89%] 2023-01-24 19:44:47,402 48k INFO 
[2.2038888931274414, 2.632293224334717, 9.060490608215332, 15.069991111755371, 0.6554641723632812, 147000, 8.725079404471875e-05] 2023-01-24 19:45:14,560 48k INFO Saving model and optimizer state at iteration 1089 to ./logs/48k/G_147000.pth 2023-01-24 19:45:19,632 48k INFO Saving model and optimizer state at iteration 1089 to ./logs/48k/D_147000.pth 2023-01-24 19:45:36,623 48k INFO ====> Epoch: 1089 2023-01-24 19:50:04,199 48k INFO ====> Epoch: 1090 2023-01-24 19:53:15,534 48k INFO Train Epoch: 1091 [37%] 2023-01-24 19:53:15,535 48k INFO [2.327693223953247, 2.524149179458618, 10.83772087097168, 18.794031143188477, 0.5778418779373169, 147200, 8.722898270950122e-05] 2023-01-24 19:54:31,847 48k INFO ====> Epoch: 1091 2023-01-24 19:59:18,546 48k INFO Train Epoch: 1092 [85%] 2023-01-24 19:59:18,547 48k INFO [2.0821471214294434, 2.7069554328918457, 11.190604209899902, 17.751188278198242, 0.5578516721725464, 147400, 8.721807908666253e-05] 2023-01-24 19:59:36,454 48k INFO ====> Epoch: 1092 2023-01-24 20:04:59,004 48k INFO ====> Epoch: 1093 2023-01-24 20:07:45,731 48k INFO Train Epoch: 1094 [33%] 2023-01-24 20:07:45,743 48k INFO [2.3116462230682373, 2.7662606239318848, 9.210739135742188, 16.792251586914062, 0.6563571691513062, 147600, 8.719627592967335e-05] 2023-01-24 20:09:06,062 48k INFO ====> Epoch: 1094 2023-01-24 20:13:06,966 48k INFO Train Epoch: 1095 [81%] 2023-01-24 20:13:07,011 48k INFO [2.235905647277832, 2.6622352600097656, 8.775749206542969, 13.731940269470215, 0.5359253287315369, 147800, 8.718537639518214e-05] 2023-01-24 20:13:28,990 48k INFO ====> Epoch: 1095 2023-01-24 20:17:56,488 48k INFO ====> Epoch: 1096 2023-01-24 20:20:46,530 48k INFO Train Epoch: 1097 [30%] 2023-01-24 20:20:46,531 48k INFO [2.216320037841797, 3.0365407466888428, 8.981840133666992, 15.61364459991455, 0.5994224548339844, 148000, 8.716358141335484e-05] 2023-01-24 20:21:17,305 48k INFO Saving model and optimizer state at iteration 1097 to ./logs/48k/G_148000.pth 2023-01-24 20:21:20,607 48k 
INFO Saving model and optimizer state at iteration 1097 to ./logs/48k/D_148000.pth 2023-01-24 20:22:47,039 48k INFO ====> Epoch: 1097 2023-01-24 20:26:58,205 48k INFO Train Epoch: 1098 [78%] 2023-01-24 20:26:58,208 48k INFO [2.317005157470703, 2.4634902477264404, 10.055709838867188, 17.27083969116211, 0.6432420015335083, 148200, 8.715268596567818e-05] 2023-01-24 20:27:24,904 48k INFO ====> Epoch: 1098 2023-01-24 20:31:48,910 48k INFO ====> Epoch: 1099 2023-01-24 20:35:16,692 48k INFO Train Epoch: 1100 [26%] 2023-01-24 20:35:16,697 48k INFO [2.336351156234741, 2.5676229000091553, 9.012084007263184, 16.950502395629883, 0.5003083348274231, 148400, 8.713089915594747e-05] 2023-01-24 20:36:45,933 48k INFO ====> Epoch: 1100 2023-01-24 20:40:16,310 48k INFO Train Epoch: 1101 [74%] 2023-01-24 20:40:16,311 48k INFO [2.2568371295928955, 2.7232203483581543, 10.080810546875, 17.183330535888672, 0.6865756511688232, 148600, 8.712000779355297e-05] 2023-01-24 20:40:47,640 48k INFO ====> Epoch: 1101 2023-01-24 20:45:45,785 48k INFO ====> Epoch: 1102
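The bracketed records above follow a fixed shape: several loss values, then the global step (a multiple of 200, matching `log_interval: 200` in the config), then the current learning rate. A minimal parsing sketch follows; `parse_loss_lines` is a hypothetical helper written for these logs, and reading the five leading floats as individual loss terms (rather than the step and learning rate, which are unambiguous) is an assumption about the trainer's logging order, not something stated in the log itself.

```python
import re

# Matches lines like:
# 2023-01-24 11:30:32,795 48k INFO [2.478..., ..., 130600, 8.858...e-05]
LOSS_RE = re.compile(
    r"^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}),\d+ \S+ INFO \[(.+)\]$"
)

def parse_loss_lines(lines):
    """Extract (timestamp, losses, global step, lr) from bracketed records."""
    records = []
    for line in lines:
        m = LOSS_RE.match(line.strip())
        if not m:
            continue  # skip "====> Epoch" / "Saving model" / other records
        ts, body = m.groups()
        values = [float(v) for v in body.split(",")]
        *losses, step, lr = values  # assumption: trailing fields are step, lr
        records.append({"time": ts, "losses": losses,
                        "step": int(step), "lr": lr})
    return records

sample = ("2023-01-24 11:30:32,795 48k INFO [2.4782228469848633, "
          "2.33512282371521, 8.22910213470459, 16.27603530883789, "
          "0.790490448474884, 130600, 8.858057654574923e-05]")
rec = parse_loss_lines([sample])[0]
print(rec["step"], rec["lr"])
```

Fed the whole log, this yields one record per optimizer-step report, which is enough to plot loss curves or confirm that the learning rate is decaying by roughly the configured `lr_decay` factor (0.999875) per epoch.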