/home/yuqian_fu
{'gpu': '0', 'data': 'mnist', 'ntr': None, 'translate': None, 'autoaug': 'CA_multiple', 'n': 3, 'stride': 3, 'factor_num': 14, 'epochs': 100, 'nbatch': 100, 'batchsize': 32, 'lr': 0.0001, 'lr_scheduler': 'Step', 'svroot': '/data/work-gcp-europe-west4-a/yuqian_fu/datasets/SingleSourceDG/saved-digit/CA_multiple_14fa_all_ep100_lr1e-4_lr_schedulerStep0.8_bs32_lamCa_1_lamRe_1_cls1_adt2_EW2_100_rmTrue_rnTrue_str3_pipelineAugWoNorm', 'clsadapt': True, 'lambda_causal': 1.0, 'lambda_re': 1.0, 'randm': True, 'randn': True, 'network': 'resnet18'}
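Judging from the learning rates printed below (1e-4 through epoch 79, 1e-5 from epoch 80 onward), the 'Step' scheduler appears to decay the LR once, at 80% of the 100 epochs. A minimal sketch of such a schedule — the `step_lr` helper, the 0.8 fraction, and the x0.1 decay factor are assumptions inferred from this log, not taken from the actual training code:

```python
# Hypothetical reconstruction of the LR schedule observed in this log:
# lr = 1e-4 for epochs 0-79, then 1e-5 for epochs 80-99.
# The "0.8" in the run name is assumed to mean "decay at 80% of training".

def step_lr(epoch, base_lr=1e-4, total_epochs=100, frac=0.8, gamma=0.1):
    """Return the learning rate for a given epoch under a one-step decay."""
    return base_lr * gamma if epoch >= int(frac * total_epochs) else base_lr

for epoch in (0, 79, 80, 99):
    print(epoch, step_lr(epoch))
```

With `torch.optim.lr_scheduler.StepLR(optimizer, step_size=80, gamma=0.1)` the optimizer would produce the same sequence, if that is indeed what the script uses.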
--------------------------CA_multiple--------------------------
---------------------------14 factors-----------------
randm: True
randn: True
n: 3
randm: False
100
0.0001
changing lr
---------------------saving model at epoch 0----------------------------------------------------
epoch 0, time 183.25, cls_loss 2.1515
100
0.0001
changing lr
---------------------saving model at epoch 1----------------------------------------------------
epoch 1, time 183.70, cls_loss 1.7865
100
0.0001
changing lr
epoch 2, time 183.59, cls_loss 1.5733
100
0.0001
changing lr
---------------------saving model at epoch 3----------------------------------------------------
epoch 3, time 183.31, cls_loss 1.4407
100
0.0001
changing lr
---------------------saving model at epoch 4----------------------------------------------------
epoch 4, time 183.01, cls_loss 1.3369
100
0.0001
changing lr
---------------------saving model at epoch 5----------------------------------------------------
epoch 5, time 183.85, cls_loss 1.3080
100
0.0001
changing lr
epoch 6, time 182.43, cls_loss 1.2082
100
0.0001
changing lr
---------------------saving model at epoch 7----------------------------------------------------
epoch 7, time 182.60, cls_loss 1.1517
100
0.0001
changing lr
---------------------saving model at epoch 8----------------------------------------------------
epoch 8, time 183.05, cls_loss 1.0938
100
0.0001
changing lr
---------------------saving model at epoch 9----------------------------------------------------
epoch 9, time 183.04, cls_loss 1.0485
100
0.0001
changing lr
epoch 10, time 182.31, cls_loss 1.0636
100
0.0001
changing lr
epoch 11, time 182.08, cls_loss 0.9913
100
0.0001
changing lr
epoch 12, time 182.44, cls_loss 0.9240
100
0.0001
changing lr
---------------------saving model at epoch 13----------------------------------------------------
epoch 13, time 182.56, cls_loss 0.8962
100
0.0001
changing lr
epoch 14, time 182.83, cls_loss 0.8474
100
0.0001
changing lr
epoch 15, time 182.24, cls_loss 0.8730
100
0.0001
changing lr
---------------------saving model at epoch 16----------------------------------------------------
epoch 16, time 182.40, cls_loss 0.8184
100
0.0001
changing lr
epoch 17, time 182.12, cls_loss 0.8083
100
0.0001
changing lr
epoch 18, time 182.02, cls_loss 0.7381
100
0.0001
changing lr
epoch 19, time 182.19, cls_loss 0.7326
100
0.0001
changing lr
epoch 20, time 181.69, cls_loss 0.6649
100
0.0001
changing lr
epoch 21, time 181.62, cls_loss 0.6849
100
0.0001
changing lr
epoch 22, time 181.68, cls_loss 0.6675
100
0.0001
changing lr
---------------------saving model at epoch 23----------------------------------------------------
epoch 23, time 182.29, cls_loss 0.6101
100
0.0001
changing lr
epoch 24, time 182.13, cls_loss 0.6237
100
0.0001
changing lr
epoch 25, time 182.23, cls_loss 0.6229
100
0.0001
changing lr
epoch 26, time 182.24, cls_loss 0.5664
100
0.0001
changing lr
epoch 27, time 182.13, cls_loss 0.5588
100
0.0001
changing lr
epoch 28, time 182.14, cls_loss 0.5539
100
0.0001
changing lr
epoch 29, time 182.35, cls_loss 0.5198
100
0.0001
changing lr
epoch 30, time 182.22, cls_loss 0.5153
100
0.0001
changing lr
epoch 31, time 182.36, cls_loss 0.4764
100
0.0001
changing lr
epoch 32, time 182.13, cls_loss 0.4748
100
0.0001
changing lr
epoch 33, time 181.83, cls_loss 0.4448
100
0.0001
changing lr
epoch 34, time 182.32, cls_loss 0.4358
100
0.0001
changing lr
epoch 35, time 181.92, cls_loss 0.4201
100
0.0001
changing lr
epoch 36, time 181.91, cls_loss 0.3949
100
0.0001
changing lr
epoch 37, time 182.01, cls_loss 0.3818
100
0.0001
changing lr
---------------------saving model at epoch 38----------------------------------------------------
epoch 38, time 182.02, cls_loss 0.3651
100
0.0001
changing lr
epoch 39, time 182.07, cls_loss 0.3656
100
0.0001
changing lr
epoch 40, time 181.87, cls_loss 0.3864
100
0.0001
changing lr
epoch 41, time 182.33, cls_loss 0.3647
100
0.0001
changing lr
epoch 42, time 182.58, cls_loss 0.3301
100
0.0001
changing lr
---------------------saving model at epoch 43----------------------------------------------------
epoch 43, time 182.56, cls_loss 0.3279
100
0.0001
changing lr
epoch 44, time 185.15, cls_loss 0.3470
100
0.0001
changing lr
epoch 45, time 182.28, cls_loss 0.2938
100
0.0001
changing lr
epoch 46, time 182.03, cls_loss 0.2920
100
0.0001
changing lr
epoch 47, time 182.53, cls_loss 0.2780
100
0.0001
changing lr
epoch 48, time 182.87, cls_loss 0.2592
100
0.0001
changing lr
epoch 49, time 182.61, cls_loss 0.2725
100
0.0001
changing lr
epoch 50, time 182.34, cls_loss 0.2344
100
0.0001
changing lr
epoch 51, time 182.13, cls_loss 0.2686
100
0.0001
changing lr
epoch 52, time 183.03, cls_loss 0.2475
100
0.0001
changing lr
epoch 53, time 182.25, cls_loss 0.2359
100
0.0001
changing lr
epoch 54, time 182.39, cls_loss 0.2279
100
0.0001
changing lr
epoch 55, time 182.38, cls_loss 0.2340
100
0.0001
changing lr
epoch 56, time 182.19, cls_loss 0.2217
100
0.0001
changing lr
epoch 57, time 182.01, cls_loss 0.2188
100
0.0001
changing lr
epoch 58, time 182.23, cls_loss 0.2269
100
0.0001
changing lr
epoch 59, time 182.47, cls_loss 0.2212
100
0.0001
changing lr
epoch 60, time 182.34, cls_loss 0.1887
100
0.0001
changing lr
epoch 61, time 182.11, cls_loss 0.1859
100
0.0001
changing lr
epoch 62, time 182.40, cls_loss 0.2021
100
0.0001
changing lr
epoch 63, time 182.09, cls_loss 0.1756
100
0.0001
changing lr
epoch 64, time 182.38, cls_loss 0.1737
100
0.0001
changing lr
epoch 65, time 182.21, cls_loss 0.1648
100
0.0001
changing lr
epoch 66, time 182.02, cls_loss 0.1613
100
0.0001
changing lr
epoch 67, time 182.29, cls_loss 0.1569
100
0.0001
changing lr
epoch 68, time 182.29, cls_loss 0.1487
100
0.0001
changing lr
---------------------saving model at epoch 69----------------------------------------------------
epoch 69, time 182.61, cls_loss 0.1538
100
0.0001
changing lr
epoch 70, time 182.28, cls_loss 0.1653
100
0.0001
changing lr
epoch 71, time 181.94, cls_loss 0.1639
100
0.0001
changing lr
epoch 72, time 181.84, cls_loss 0.1784
100
0.0001
changing lr
epoch 73, time 181.70, cls_loss 0.1843
100
0.0001
changing lr
epoch 74, time 180.53, cls_loss 0.1832
100
0.0001
changing lr
epoch 75, time 180.51, cls_loss 0.1421
100
0.0001
changing lr
epoch 76, time 180.07, cls_loss 0.1224
100
0.0001
changing lr
epoch 77, time 180.21, cls_loss 0.1187
100
0.0001
changing lr
epoch 78, time 180.07, cls_loss 0.1058
100
0.0001
changing lr
epoch 79, time 180.76, cls_loss 0.1301
100
1e-05
changing lr
---------------------saving model at epoch 80----------------------------------------------------
epoch 80, time 181.07, cls_loss 0.0915
100
1e-05
changing lr
epoch 81, time 180.00, cls_loss 0.0845
100
1e-05
changing lr
epoch 82, time 180.09, cls_loss 0.0767
100
1e-05
changing lr
epoch 83, time 180.14, cls_loss 0.0711
100
1e-05
changing lr
epoch 84, time 180.25, cls_loss 0.0698
100
1e-05
changing lr
epoch 85, time 180.12, cls_loss 0.0682
100
1e-05
changing lr
epoch 86, time 179.91, cls_loss 0.0590
100
1e-05
changing lr
epoch 87, time 179.84, cls_loss 0.0607
100
1e-05
changing lr
epoch 88, time 179.82, cls_loss 0.0634
100
1e-05
changing lr
epoch 89, time 180.04, cls_loss 0.0718
100
1e-05
changing lr
epoch 90, time 179.62, cls_loss 0.0704
100
1e-05
changing lr
epoch 91, time 179.77, cls_loss 0.0669
100
1e-05
changing lr
epoch 92, time 179.87, cls_loss 0.0574
100
1e-05
changing lr
epoch 93, time 179.66, cls_loss 0.0556
100
1e-05
changing lr
epoch 94, time 179.87, cls_loss 0.0631
100
1e-05
changing lr
epoch 95, time 179.67, cls_loss 0.0525
100
1e-05
changing lr
epoch 96, time 179.69, cls_loss 0.0473
100
1e-05
changing lr
epoch 97, time 179.39, cls_loss 0.0470
100
1e-05
changing lr
epoch 98, time 179.75, cls_loss 0.0529
100
1e-05
changing lr
epoch 99, time 180.06, cls_loss 0.0541
---------------------saving last model at epoch 99----------------------------------------------------
/home/yuqian_fu
{'gpu': '0', 'svroot': '/data/work-gcp-europe-west4-a/yuqian_fu/datasets/SingleSourceDG/saved-digit/CA_multiple_14fa_all_ep100_lr1e-4_lr_schedulerStep0.8_bs32_lamCa_1_lamRe_1_cls1_adt2_EW2_100_rmTrue_rnTrue_str3_pipelineAugWoNorm', 'svpath': '/data/work-gcp-europe-west4-a/yuqian_fu/datasets/SingleSourceDG/saved-digit/CA_multiple_14fa_all_ep100_lr1e-4_lr_schedulerStep0.8_bs32_lamCa_1_lamRe_1_cls1_adt2_EW2_100_rmTrue_rnTrue_str3_pipelineAugWoNorm/14factor_best.csv', 'channels': 3, 'factor_num': 14, 'stride': 3, 'epoch': 'best', 'eval_mapping': True}
loading weight of best
Using downloaded and verified file: /home/yuqian_fu/.pytorch/SVHN/test_32x32.mat
                     mnist       svhn  ...       usps       Avg
w/o do (original x)  93.89  13.579441  ...  89.436971  40.16719

[1 rows x 6 columns]
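The pandas printout above is truncated ("..."), so the hidden per-domain columns cannot be recovered here. In single-source digit DG evaluations like this one, the "Avg" column is typically the mean accuracy over the target domains, excluding the source (mnist). A hedged sketch of that computation — the mnist/svhn/usps values come from the table, while the two remaining domain names and values are placeholders, not the truncated entries:

```python
# Hypothetical illustration of the "Avg" column: mean accuracy over target
# domains, with the source domain (mnist) excluded. Values marked
# "placeholder" are invented for the example, not read from the real table.
accuracies = {
    "mnist": 93.89,       # source domain, from the table above
    "svhn": 13.579441,    # target domain, from the table above
    "mnist_m": 50.0,      # placeholder (truncated out of the printout)
    "syn": 40.0,          # placeholder (truncated out of the printout)
    "usps": 89.436971,    # target domain, from the table above
}
targets = [k for k in accuracies if k != "mnist"]
avg = sum(accuracies[k] for k in targets) / len(targets)
print(f"Avg over {len(targets)} target domains: {avg:.2f}")
```

Setting `pd.set_option("display.max_columns", None)` before printing would show all six columns instead of the elided view.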