xianghe committed
Commit b6e3ea0 · 1 Parent(s): 51e2d35

Well-trained model weights

This view is limited to 50 files because the commit contains too many changes.
Files changed (50)
  1. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-LIFNode-4/args.yaml +162 -0
  2. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-LIFNode-4/checkpoint-18.pth.tar +3 -0
  3. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-LIFNode-4/events.out.tfevents.1745049404.af1fd63cd999.1304305.0 +3 -0
  4. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-LIFNode-4/last.pth.tar +3 -0
  5. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-LIFNode-4/log.txt +0 -0
  6. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-LIFNode-4/model_best.pth.tar +3 -0
  7. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-LIFNode-4/summary.csv +101 -0
  8. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/_weights_xignore=biasbn_xnorm=filter_yignore=biasbn_ynorm=filter.h5 +3 -0
  9. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/_weights_xignore=biasbn_xnorm=filter_yignore=biasbn_ynorm=filter.h5_[-10.0,10.0,51]x[-10.0,10.0,51].h5 +3 -0
  10. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/args.yaml +162 -0
  11. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/checkpoint-23.pth.tar +3 -0
  12. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/events.out.tfevents.1744967800.af1fd63cd999.621245.0 +3 -0
  13. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/last.pth.tar +3 -0
  14. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/log.txt +0 -0
  15. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/model_best.pth.tar +3 -0
  16. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/summary.csv +101 -0
  17. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-LIFNode-4/args.yaml +162 -0
  18. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-LIFNode-4/checkpoint-72.pth.tar +3 -0
  19. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-LIFNode-4/events.out.tfevents.1745049404.af1fd63cd999.1304306.0 +3 -0
  20. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-LIFNode-4/last.pth.tar +3 -0
  21. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-LIFNode-4/log.txt +0 -0
  22. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-LIFNode-4/model_best.pth.tar +3 -0
  23. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-LIFNode-4/summary.csv +101 -0
  24. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/_weights_xignore=biasbn_xnorm=filter_yignore=biasbn_ynorm=filter.h5 +3 -0
  25. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/_weights_xignore=biasbn_xnorm=filter_yignore=biasbn_ynorm=filter.h5_[-10.0,10.0,51]x[-10.0,10.0,51].h5 +3 -0
  26. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/args.yaml +162 -0
  27. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/checkpoint-74.pth.tar +3 -0
  28. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/events.out.tfevents.1744967800.af1fd63cd999.621246.0 +3 -0
  29. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/last.pth.tar +3 -0
  30. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/log.txt +0 -0
  31. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/model_best.pth.tar +3 -0
  32. Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/summary.csv +101 -0
  33. Audio Visual Classification/exp_results/readme.md +45 -0
  34. Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/fig/audio-visual_train_loss_step_0.png +0 -0
  35. Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/fig/audio-visual_train_loss_step_1.png +0 -0
  36. Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/fig/audio-visual_train_loss_step_2.png +0 -0
  37. Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/fig/audio-visual_train_loss_step_3.png +0 -0
  38. Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/fig/audio-visual_val_acc_step_0.png +0 -0
  39. Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/fig/audio-visual_val_acc_step_1.png +0 -0
  40. Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/fig/audio-visual_val_acc_step_2.png +0 -0
  41. Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/fig/audio-visual_val_acc_step_3.png +0 -0
  42. Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/step_0_best_audio-visual_model.pkl +3 -0
  43. Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/step_1_best_audio-visual_model.pkl +3 -0
  44. Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/step_2_best_audio-visual_model.pkl +3 -0
  45. Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/step_3_best_audio-visual_model.pkl +3 -0
  46. Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/train.log +885 -0
  47. Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_True-seed_0/fig/audio-visual_train_loss_step_0.png +0 -0
  48. Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_True-seed_0/fig/audio-visual_train_loss_step_1.png +0 -0
  49. Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_True-seed_0/fig/audio-visual_train_loss_step_2.png +0 -0
  50. Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_True-seed_0/fig/audio-visual_train_loss_step_3.png +0 -0
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-LIFNode-4/args.yaml ADDED
@@ -0,0 +1,162 @@
+ aa: rand-m9-mstd0.5-inc1
+ act_fun: QGateGrad
+ adam_epoch: 1000
+ adaptation_info: false
+ adaptive_node: false
+ alpha: 0.8
+ amp: false
+ apex_amp: false
+ audio_path: /mnt/home/hexiang/datasets/CREMA-D/AudioWAV/
+ aug_splits: 0
+ batch_size: 32
+ bn_eps: null
+ bn_momentum: null
+ bn_tf: false
+ channels_last: false
+ clip_grad: null
+ color_jitter: 0.4
+ conf_mat: false
+ conv_type: normal
+ cooldown_epochs: 10
+ critical_loss: false
+ crop_pct: null
+ cut_mix: false
+ cutmix: 0.0
+ cutmix_beta: 2.0
+ cutmix_minmax: null
+ cutmix_noise: 0.0
+ cutmix_num: 1
+ cutmix_prob: 0.5
+ dataset: KineticSound
+ decay_epochs: 70
+ decay_rate: 0.1
+ device: 0
+ dist_bn: ''
+ drop: 0.0
+ drop_block: null
+ drop_connect: null
+ drop_path: 0.1
+ encode: direct
+ epochs: 100
+ eval: false
+ eval_checkpoint: ''
+ eval_metric: top1
+ event_mix: false
+ event_size: 48
+ fps: 1
+ fusion_method: concat
+ gaussian_n: 3
+ gp: null
+ hflip: 0.5
+ img_size: 224
+ initial_checkpoint: ''
+ interpolation: ''
+ inverse: false
+ inverse_ends: 100
+ inverse_starts: 0
+ jsd: false
+ kernel_method: cuda
+ layer_by_layer: false
+ local_rank: 0
+ log_interval: 50
+ loss_fn: ce
+ lr: 0.005
+ lr_cycle_limit: 1
+ lr_cycle_mul: 1.0
+ lr_noise: null
+ lr_noise_pct: 0.67
+ lr_noise_std: 1.0
+ mean: null
+ mem_dist: false
+ meta_ratio: -1.0
+ min_lr: 1.0e-05
+ mix_up: false
+ mixup: 0.0
+ mixup_mode: batch
+ mixup_off_epoch: 0
+ mixup_prob: 0.0
+ mixup_switch_prob: 0.5
+ modality: audio-visual
+ model: AVresnet18
+ model_ema: false
+ model_ema_decay: 0.99996
+ model_ema_force_cpu: false
+ modulation: Normal
+ modulation_ends: 50
+ modulation_starts: 0
+ momentum: 0.9
+ n_encode_type: linear
+ n_groups: 1
+ n_preact: false
+ native_amp: false
+ newton_maxiter: 20
+ no_aug: false
+ no_prefetcher: false
+ no_resume_opt: false
+ node_resume: ''
+ node_type: LIFNode
+ noisy_grad: 0.0
+ num_classes: 31
+ num_gpu: 1
+ opt: sgd
+ opt_betas: null
+ opt_eps: 1.0e-08
+ output: ./exp_results
+ patience_epochs: 10
+ pin_mem: false
+ power: 1
+ pretrained: false
+ psai: 1.0
+ rand_aug: false
+ rand_step: false
+ randaug_m: 15
+ randaug_n: 3
+ ratio:
+ - 0.75
+ - 1.3333333333333333
+ recount: 1
+ recovery_interval: 0
+ remode: pixel
+ reprob: 0.25
+ requires_thres_grad: false
+ reset_drop: false
+ resplit: false
+ resume: ''
+ save_images: false
+ scale:
+ - 0.08
+ - 1.0
+ sched: step
+ seed: 2025
+ sew_cnf: ADD
+ sigmoid_thres: false
+ smoothing: 0.1
+ snr: -100
+ snrModality: null
+ spike_output: false
+ spike_rate: false
+ split_bn: false
+ start_epoch: null
+ std: null
+ step: 4
+ suffix: ''
+ sync_bn: false
+ tau: 2.0
+ temporal_flatten: false
+ tensorboard_dir: ./exp_results
+ tet_loss: false
+ threshold: 0.5
+ train_interpolation: random
+ train_portion: 0.9
+ tsne: false
+ tta: 0
+ use_multi_epochs_loader: false
+ use_video_frames: 3
+ validation_batch_size_multiplier: 1
+ vflip: 0.0
+ visual_path: /mnt/home/hexiang/datasets/CREMA-D/
+ visualize: false
+ warmup_epochs: 0
+ warmup_lr: 1.0e-06
+ weight_decay: 0.0005
+ workers: 8
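The args.yaml above records every hyperparameter of the run as a flat YAML mapping. A minimal sketch of reading it back, assuming PyYAML is available; the sample string stands in for the real file, whose path would be the exp_results directory shown above:

```python
# Minimal sketch: parse a run's args.yaml dump into a plain dict.
# Assumes PyYAML; the sample string mimics a few lines from the diff above.
import yaml

def load_run_args(path):
    """Read an args.yaml dump from disk and return it as a dict."""
    with open(path) as f:
        return yaml.safe_load(f)

# The same parsing applied to a few lines taken from the diff above:
sample = "model: AVresnet18\nnode_type: LIFNode\nstep: 4\nseed: 2025\n"
args = yaml.safe_load(sample)
print(args["model"], args["node_type"], args["step"])
```

`yaml.safe_load` infers scalar types, so `step` and `seed` come back as ints and `null` entries as `None`.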
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-LIFNode-4/checkpoint-18.pth.tar ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:725a2ecb9fbf7fe1401787f341a2c5634359273ed94e8e79027b54d10685ace4
+ size 179373193
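Each `+3 -0` weight file in this commit is stored as a Git LFS pointer like the one above: three "key value" lines describing the payload, which git-lfs fetches separately (e.g. via `git lfs pull`). A minimal stdlib sketch of parsing such a pointer:

```python
# Parse a Git LFS pointer file (version / oid / size lines) into a dict.
def parse_lfs_pointer(text):
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    return {
        "version": fields["version"],
        "oid": fields["oid"].split(":", 1)[1],  # drop the "sha256:" prefix
        "size": int(fields["size"]),            # payload size in bytes
    }

# The checkpoint-18 pointer from the diff above:
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:725a2ecb9fbf7fe1401787f341a2c5634359273ed94e8e79027b54d10685ace4
size 179373193
"""
print(parse_lfs_pointer(pointer))
```

The `size` here (179373193 bytes, about 171 MiB) is the actual checkpoint size, not the 3-line pointer committed to the repo.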
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-LIFNode-4/events.out.tfevents.1745049404.af1fd63cd999.1304305.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:449918997efa67e11a0332b2d92578c97f8db82b1e24be00ea74eb4537163319
+ size 12954544
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-LIFNode-4/last.pth.tar ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:46acbe5047813ca9be0c7cc0b68fba0fefd1850ce945611d4f22f493cd2221e6
+ size 179373193
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-LIFNode-4/log.txt ADDED
The diff for this file is too large to render. See raw diff
 
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-LIFNode-4/model_best.pth.tar ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:725a2ecb9fbf7fe1401787f341a2c5634359273ed94e8e79027b54d10685ace4
+ size 179373193
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-LIFNode-4/summary.csv ADDED
@@ -0,0 +1,101 @@
+ epoch,train_loss,train_loss_single,train_loss_inverse,eval_loss,eval_top1,eval_top1_a,eval_top1_v
+ 0,2.842534065246582,0,0,2.413750838571635,32.15111796453354,3.353893600616808,2.197378565921357
+ 1,2.3762368722395464,0,0,2.262783727741462,35.50501156515035,2.351580570547417,1.4649190439475712
+ 2,2.1767641847783867,0,0,1.9476832593507554,43.79336931380108,2.004626060138782,2.7370855821125675
+ 3,1.9504853161898525,0,0,1.9157202124319908,44.41017733230532,1.9275250578257517,3.430994602929838
+ 4,1.9574416225606746,0,0,1.9150981292783433,44.75713184271395,2.3901310717039324,3.199691595990748
+ 5,1.896279356696389,0,0,1.8586478707601037,46.87740940632228,2.6599845797995374,3.700848111025443
+ 6,1.861331289464777,0,0,1.8515102280592495,47.84117193523516,2.043176561295297,3.623747108712413
+ 7,1.6339806751771406,0,0,1.812731942858435,49.38319198149576,1.6962220508866614,3.469545104086353
+ 8,1.7054612203077837,0,0,1.7926980197291789,49.80724749421743,1.5420200462606013,4.356206630686199
+ 9,1.55348078771071,0,0,1.829169128891497,49.88434849653046,3.0069390902081725,3.8164996144949885
+ 10,1.5169338638132268,0,0,1.8783849820598053,48.57363145720895,2.235929067077872,4.89591364687741
+ 11,1.432269356467507,0,0,1.8311538615406893,49.61449498843485,2.043176561295297,3.5080956052428682
+ 12,1.159691420468417,0,0,1.8690683553103034,49.15188897455667,2.235929067077872,4.973014649190439
+ 13,1.08019005168568,0,0,1.9097255660463686,49.03623747108713,2.235929067077872,4.7417116422513494
+ 14,0.9812171079895713,0,0,1.8703448171328836,50.07710100231303,3.430994602929838,4.009252120277564
+ 15,0.8970372460105203,0,0,1.8974868955296007,48.920585967617576,3.0840400925212026,4.163454124903624
+ 16,0.8476931290193037,0,0,1.8760049203403564,49.76869699306091,2.197378565921357,3.7779491133384733
+ 17,0.7974537340077487,0,0,1.815794890535365,51.5420200462606,2.3901310717039324,4.047802621434078
+ 18,0.8030600222674283,0,0,1.7802978856065408,52.85273708558211,2.9298380878951424,3.6622976098689284
+ 19,0.7467837821353566,0,0,1.826355043336255,51.50346954510409,2.5057825751734772,4.433307632999229
+ 20,0.7559020681814714,0,0,1.8337618179659523,50.38550501156515,2.9683885890516577,4.163454124903624
+ 21,0.7487536831335588,0,0,1.8160734474429188,51.464919043947575,3.199691595990748,5.0115651503469545
+ 22,0.7532314495606856,0,0,1.81891066824002,52.582883577486506,2.8141865844255975,4.7417116422513494
+ 23,0.7439591396938671,0,0,1.8133797384540025,51.61912104857363,2.9683885890516577,4.548959136468774
+ 24,0.7267032590779391,0,0,1.8581266364595388,51.15651503469545,3.353893600616808,4.664610639938319
+ 25,0.7342003854838285,0,0,1.8453405074367362,50.886661526599845,2.7756360832690823,4.6260601387818046
+ 26,0.7209941853176464,0,0,1.8723233523696041,50.0,2.351580570547417,3.9707016191210487
+ 27,0.7215252789584073,0,0,1.8430919454202528,50.96376252891287,3.122590593677718,4.356206630686199
+ 28,0.7253152132034302,0,0,1.8472302351534688,51.079414032382424,3.122590593677718,5.0115651503469545
+ 29,0.7342924096367576,0,0,1.872821488744402,50.424055512721665,2.274479568234387,3.8550501156515034
+ 30,0.7284909324212507,0,0,1.863040382161725,51.15651503469545,2.8141865844255975,4.124903623747109
+ 31,0.7121335105462507,0,0,1.8921648047939115,50.26985350809561,2.3130300693909023,4.279105628373169
+ 32,0.7131044051863931,0,0,1.8635855410406015,51.349267540478024,2.158828064764842,3.8936006168080186
+ 33,0.7211100133982572,0,0,1.9082302536151918,50.61680801850424,3.0454895913646878,4.356206630686199
+ 34,0.7061974189498208,0,0,1.8796920517176596,51.23361603700848,2.3901310717039324,5.165767154973015
+ 35,0.7145656184716658,0,0,1.9016714730259079,50.26985350809561,2.5057825751734772,4.163454124903624
+ 36,0.7123901139606129,0,0,1.8908633457851483,51.61912104857363,2.197378565921357,3.5851966075558983
+ 37,0.7166036800904707,0,0,1.924300892498278,49.961449498843486,2.4672320740169624,3.623747108712413
+ 38,0.7160452441735701,0,0,1.9210228401601728,50.07710100231303,2.274479568234387,4.124903623747109
+ 39,0.7101800821044228,0,0,1.8868946819555419,51.349267540478024,2.5828835774865073,4.548959136468774
+ 40,0.7167113152417269,0,0,1.9452573532495667,49.07478797224364,2.351580570547417,3.469545104086353
+ 41,0.7237472805109891,0,0,1.9285885649823002,50.61680801850424,2.3901310717039324,3.469545104086353
+ 42,0.7169086878949945,0,0,1.9345158828435352,49.498843484965306,2.5828835774865073,3.623747108712413
+ 43,0.7100526040250604,0,0,1.923240745628257,50.07710100231303,2.7756360832690823,3.0454895913646878
+ 44,0.7085476463491266,0,0,1.9688613126162116,47.76407093292213,2.274479568234387,3.0840400925212026
+ 45,0.7085912227630615,0,0,1.9245480682267164,49.92289899768697,3.2382420971472627,3.5080956052428682
+ 46,0.7055061622099443,0,0,1.9364356520365271,49.2289899768697,2.698535080956052,3.2382420971472627
+ 47,0.7136128707365557,0,0,1.917493371662031,50.23130300693909,2.6599845797995374,3.353893600616808
+ 48,0.7223523801023309,0,0,1.9193070572711177,50.30840400925212,2.5443330763299925,4.009252120277564
+ 49,0.7110848697749051,0,0,1.9484993907793173,49.961449498843486,2.8141865844255975,2.698535080956052
+ 50,0.7217407876794989,0,0,1.943437856072725,50.61680801850424,2.9683885890516577,3.122590593677718
+ 51,0.7090224894610319,0,0,1.9510596476799722,50.07710100231303,2.8141865844255975,3.0069390902081725
+ 52,0.7060390168970282,0,0,1.9396294253370627,50.7710100231303,2.7756360832690823,2.8912875867386276
+ 53,0.7102431763302196,0,0,1.9259352441374484,50.38550501156515,2.698535080956052,2.9683885890516577
+ 54,0.7168587879701094,0,0,1.936278174818711,49.38319198149576,2.5057825751734772,3.469545104086353
+ 55,0.7223646911707792,0,0,1.9537237346769758,49.61449498843485,2.3130300693909023,2.5057825751734772
+ 56,0.7149513147094033,0,0,1.9377238414061464,51.42636854279105,2.158828064764842,2.7370855821125675
+ 57,0.7087189500982111,0,0,1.9575189408101572,49.30609097918273,2.3130300693909023,2.698535080956052
+ 58,0.701707043431022,0,0,1.9589374427530706,49.07478797224364,2.043176561295297,3.0069390902081725
+ 59,0.7103474031795155,0,0,1.976736840047373,48.49653045489591,2.6214340786430226,2.274479568234387
+ 60,0.7034232128750194,0,0,1.9397829856519617,50.23130300693909,2.235929067077872,2.3901310717039324
+ 61,0.7162309722466902,0,0,1.9330084149251099,50.65535851966076,2.4672320740169624,2.4672320740169624
+ 62,0.7023949568921869,0,0,1.9759877814452467,49.69159599074788,2.004626060138782,3.5080956052428682
+ 63,0.710597650571303,0,0,1.9781732340454596,49.30609097918273,2.120277563608327,3.2382420971472627
+ 64,0.7039800882339478,0,0,1.9892029861532547,48.61218195836546,2.235929067077872,3.1611410948342327
+ 65,0.7132458307526328,0,0,2.0308918272796372,47.301464919043944,2.428681572860447,2.8141865844255975
+ 66,0.7060372233390808,0,0,1.982478394357627,48.14957594448728,1.9275250578257517,2.4672320740169624
+ 67,0.7156617749821056,0,0,1.9636854708148777,49.07478797224364,2.4672320740169624,2.9298380878951424
+ 68,0.6998460726304487,0,0,1.9446603768406783,49.65304548959136,2.4672320740169624,3.199691595990748
+ 69,0.7028526826338335,0,0,2.0081009813336657,48.14957594448728,2.081727062451812,3.122590593677718
+ 70,0.6947443160143766,0,0,1.9576997668722913,50.15420200462606,2.235929067077872,3.392444101773323
+ 71,0.6898265480995178,0,0,1.9384402094571151,51.00231303006939,2.235929067077872,3.276792598303778
+ 72,0.6869108893654563,0,0,1.9484899123080803,51.19506553585197,2.081727062451812,2.9298380878951424
+ 73,0.6917671452869069,0,0,1.9716085108225403,50.886661526599845,2.5057825751734772,3.6622976098689284
+ 74,0.6779362992806868,0,0,1.96859073933033,51.00231303006939,1.8889745566692366,3.469545104086353
+ 75,0.6803765622052279,0,0,1.9765976646264516,50.07710100231303,2.3130300693909023,3.6622976098689284
+ 76,0.6840457916259766,0,0,1.9753385332794307,50.809560524286816,2.274479568234387,3.6622976098689284
+ 77,0.6730025789954446,0,0,1.975600912098528,50.84811102544333,2.3901310717039324,3.5080956052428682
+ 78,0.6753658923235807,0,0,1.9920381619182108,50.11565150346954,2.351580570547417,3.430994602929838
+ 79,0.675371148369529,0,0,1.9859073994797565,50.26985350809561,1.9275250578257517,3.700848111025443
+ 80,0.6712266206741333,0,0,1.9835092754849306,50.73245952197379,2.081727062451812,3.7779491133384733
+ 81,0.6742876822298224,0,0,1.997756567935164,49.69159599074788,2.4672320740169624,3.276792598303778
+ 82,0.6806832172653892,0,0,1.9956035683866455,50.34695451040864,2.3901310717039324,3.392444101773323
+ 83,0.671293009411205,0,0,2.0078757572100545,50.26985350809561,2.5057825751734772,3.5080956052428682
+ 84,0.6695648052475669,0,0,1.9975304324165526,50.19275250578257,2.197378565921357,3.469545104086353
+ 85,0.6714890382506631,0,0,1.9962509038729583,50.69390902081727,2.3130300693909023,3.276792598303778
+ 86,0.6703818819739602,0,0,2.0085214722956155,49.65304548959136,2.3130300693909023,3.199691595990748
+ 87,0.6715712384744124,0,0,2.01004432804693,50.11565150346954,2.274479568234387,3.276792598303778
+ 88,0.6723676052960482,0,0,2.0045646898729945,50.38550501156515,2.6599845797995374,4.124903623747109
+ 89,0.6792512915351174,0,0,2.0146556974100718,49.76869699306091,2.5443330763299925,3.430994602929838
+ 90,0.6771803769198331,0,0,2.0149489555711826,49.42174248265228,2.274479568234387,3.276792598303778
+ 91,0.6715349013155157,0,0,2.0269692107724886,49.42174248265228,2.274479568234387,3.6622976098689284
+ 92,0.6671275062994524,0,0,2.0215061569360926,49.76869699306091,2.698535080956052,3.392444101773323
+ 93,0.6908258199691772,0,0,2.023453015957599,49.92289899768697,2.3130300693909023,3.353893600616808
+ 94,0.6682428825985302,0,0,2.0289967890969587,49.69159599074788,2.3901310717039324,3.353893600616808
+ 95,0.6812275973233309,0,0,2.024159437339859,49.575944487278335,2.5828835774865073,3.392444101773323
+ 96,0.675935918634588,0,0,2.0231098336445523,50.19275250578257,2.4672320740169624,3.199691595990748
+ 97,0.6654033064842224,0,0,2.035943557228965,49.61449498843485,2.428681572860447,3.2382420971472627
+ 98,0.6672264337539673,0,0,2.0301050680273756,50.11565150346954,2.5443330763299925,3.7393986121819585
+ 99,0.6689940094947815,0,0,2.0271076736950193,50.23130300693909,2.428681572860447,3.0840400925212026
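summary.csv logs one row per epoch, with columns taken from the header in the diff. A small stdlib sketch for locating the best epoch by `eval_top1` (the sample rows below are an abbreviated subset of the table above):

```python
# Sketch: find the epoch with the highest eval_top1 in a summary.csv dump.
import csv
import io

def best_epoch(csv_text, metric="eval_top1"):
    """Return (epoch, metric value) for the row with the largest metric."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    best = max(rows, key=lambda r: float(r[metric]))
    return int(best["epoch"]), float(best[metric])

# Abbreviated sample rows (subset of columns) from the table above:
sample = """epoch,train_loss,eval_loss,eval_top1
17,0.797,1.816,51.542
18,0.803,1.780,52.853
19,0.747,1.826,51.503
"""
print(best_epoch(sample))  # epoch 18 has the highest eval_top1 here
```

On the full table, eval_top1 peaks at epoch 18 (about 52.85), consistent with checkpoint-18.pth.tar and model_best.pth.tar sharing the same sha256 in this commit.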
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/_weights_xignore=biasbn_xnorm=filter_yignore=biasbn_ynorm=filter.h5 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0500a7fb7843992db7dfc3575610fd1f4fa41517040b8c5387a460c8e51579be
+ size 179467884
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/_weights_xignore=biasbn_xnorm=filter_yignore=biasbn_ynorm=filter.h5_[-10.0,10.0,51]x[-10.0,10.0,51].h5 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2014ac2b7660e3d6d64297cd3b8a889a7fe7a116d27b36fbd918cb3e332f0577
+ size 50640
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/args.yaml ADDED
@@ -0,0 +1,162 @@
+ aa: rand-m9-mstd0.5-inc1
+ act_fun: QGateGrad
+ adam_epoch: 1000
+ adaptation_info: false
+ adaptive_node: false
+ alpha: 0.8
+ amp: false
+ apex_amp: false
+ audio_path: /mnt/home/hexiang/datasets/CREMA-D/AudioWAV/
+ aug_splits: 0
+ batch_size: 32
+ bn_eps: null
+ bn_momentum: null
+ bn_tf: false
+ channels_last: false
+ clip_grad: null
+ color_jitter: 0.4
+ conf_mat: false
+ conv_type: normal
+ cooldown_epochs: 10
+ critical_loss: false
+ crop_pct: null
+ cut_mix: false
+ cutmix: 0.0
+ cutmix_beta: 2.0
+ cutmix_minmax: null
+ cutmix_noise: 0.0
+ cutmix_num: 1
+ cutmix_prob: 0.5
+ dataset: KineticSound
+ decay_epochs: 70
+ decay_rate: 0.1
+ device: 0
+ dist_bn: ''
+ drop: 0.0
+ drop_block: null
+ drop_connect: null
+ drop_path: 0.1
+ encode: direct
+ epochs: 100
+ eval: false
+ eval_checkpoint: ''
+ eval_metric: top1
+ event_mix: false
+ event_size: 48
+ fps: 1
+ fusion_method: concat
+ gaussian_n: 3
+ gp: null
+ hflip: 0.5
+ img_size: 224
+ initial_checkpoint: ''
+ interpolation: ''
+ inverse: false
+ inverse_ends: 100
+ inverse_starts: 0
+ jsd: false
+ kernel_method: cuda
+ layer_by_layer: false
+ local_rank: 0
+ log_interval: 50
+ loss_fn: ce
+ lr: 0.005
+ lr_cycle_limit: 1
+ lr_cycle_mul: 1.0
+ lr_noise: null
+ lr_noise_pct: 0.67
+ lr_noise_std: 1.0
+ mean: null
+ mem_dist: false
+ meta_ratio: -1.0
+ min_lr: 1.0e-05
+ mix_up: false
+ mixup: 0.0
+ mixup_mode: batch
+ mixup_off_epoch: 0
+ mixup_prob: 0.0
+ mixup_switch_prob: 0.5
+ modality: audio-visual
+ model: AVresnet18
+ model_ema: false
+ model_ema_decay: 0.99996
+ model_ema_force_cpu: false
+ modulation: Normal
+ modulation_ends: 50
+ modulation_starts: 0
+ momentum: 0.9
+ n_encode_type: linear
+ n_groups: 1
+ n_preact: false
+ native_amp: false
+ newton_maxiter: 20
+ no_aug: false
+ no_prefetcher: false
+ no_resume_opt: false
+ node_resume: ''
+ node_type: ReLUNode
+ noisy_grad: 0.0
+ num_classes: 31
+ num_gpu: 1
+ opt: sgd
+ opt_betas: null
+ opt_eps: 1.0e-08
+ output: ./exp_results
+ patience_epochs: 10
+ pin_mem: false
+ power: 1
+ pretrained: false
+ psai: 1.0
+ rand_aug: false
+ rand_step: false
+ randaug_m: 15
+ randaug_n: 3
+ ratio:
+ - 0.75
+ - 1.3333333333333333
+ recount: 1
+ recovery_interval: 0
+ remode: pixel
+ reprob: 0.25
+ requires_thres_grad: false
+ reset_drop: false
+ resplit: false
+ resume: ''
+ save_images: false
+ scale:
+ - 0.08
+ - 1.0
+ sched: step
+ seed: 2025
+ sew_cnf: ADD
+ sigmoid_thres: false
+ smoothing: 0.1
+ snr: -100
+ snrModality: null
+ spike_output: false
+ spike_rate: false
+ split_bn: false
+ start_epoch: null
+ std: null
+ step: 1
+ suffix: ''
+ sync_bn: false
+ tau: 2.0
+ temporal_flatten: false
+ tensorboard_dir: ./exp_results
+ tet_loss: false
+ threshold: 0.5
+ train_interpolation: random
+ train_portion: 0.9
+ tsne: false
+ tta: 0
+ use_multi_epochs_loader: false
+ use_video_frames: 3
+ validation_batch_size_multiplier: 1
+ vflip: 0.0
+ visual_path: /mnt/home/hexiang/datasets/CREMA-D/
+ visualize: false
+ warmup_epochs: 0
+ warmup_lr: 1.0e-06
+ weight_decay: 0.0005
+ workers: 8
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/checkpoint-23.pth.tar ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5e37a7e4a985649da5c2ee3eb338970317ec8466040e0bc54a23d9d96e94dc97
+ size 179373193
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/events.out.tfevents.1744967800.af1fd63cd999.621245.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1bb9b357bcaa134dee315d5ca15d71fd18777a60171fe5d1419610a751c65483
+ size 12954544
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/last.pth.tar ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0193865170edd532847252831b4bc3838d6f440ec420a24e0c0dc0fc73f4af78
+ size 179373193
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/log.txt ADDED
The diff for this file is too large to render. See raw diff
 
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/model_best.pth.tar ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5e37a7e4a985649da5c2ee3eb338970317ec8466040e0bc54a23d9d96e94dc97
+ size 179373193
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_False-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/summary.csv ADDED
@@ -0,0 +1,101 @@
+ epoch,train_loss,train_loss_single,train_loss_inverse,eval_loss,eval_top1,eval_top1_a,eval_top1_v
+ 0,2.84503173828125,0,0,2.4356729096417804,30.724749421742484,2.158828064764842,2.197378565921357
+ 1,2.3694887377999048,0,0,2.8122676358189875,28.064764841942946,2.9683885890516577,1.1565150346954511
+ 2,2.1565536043860694,0,0,2.046155484345847,41.17193523515806,1.6191210485736314,2.5443330763299925
+ 3,1.9681163050911643,0,0,1.9930287789820522,44.79568234387047,2.5828835774865073,2.8527370855821124
+ 4,1.95731660452756,0,0,2.1865640749082074,38.858905165767155,1.6191210485736314,3.0454895913646878
+ 5,1.8815919377587058,0,0,1.935083419245394,44.83423284502698,2.3901310717039324,3.5080956052428682
+ 6,1.8242558674378828,0,0,2.073585390401236,43.87047031611411,1.6191210485736314,2.6599845797995374
+ 7,1.6322505257346414,0,0,1.9467310784133287,45.41249036237471,1.5034695451040863,2.428681572860447
+ 8,1.6926553140987048,0,0,1.8838076499580143,47.417116422513494,1.040863531225906,3.199691595990748
+ 9,1.541946898807179,0,0,2.1788414941903897,43.06090979182729,1.9660755589822667,3.623747108712413
+ 10,1.4783494472503662,0,0,1.9968265025727825,46.106399383191984,2.081727062451812,2.9683885890516577
+ 11,1.4341550306840376,0,0,1.903367721951734,47.10871241326137,2.7370855821125675,4.857363145720894
+ 12,1.1161095825108616,0,0,2.335887564818677,43.90902081727062,2.235929067077872,3.7779491133384733
+ 13,1.0882130752910266,0,0,2.07908983017356,46.26060138781804,2.5057825751734772,4.240555127216654
+ 14,0.9629263390194286,0,0,1.968355684346572,48.49653045489591,2.197378565921357,2.6599845797995374
+ 15,0.897862434387207,0,0,2.0055829440067616,46.72320740169622,2.6214340786430226,4.009252120277564
+ 16,0.8452200510285117,0,0,1.9178168904165533,49.92289899768697,1.9275250578257517,2.5443330763299925
+ 17,0.8071429025043141,0,0,1.9286873154945343,49.03623747108713,2.428681572860447,4.163454124903624
+ 18,0.8018280701203779,0,0,1.8817640207506825,49.73014649190439,2.5057825751734772,3.9321511179645334
+ 19,0.7431348508054559,0,0,1.8692893261346986,50.61680801850424,2.4672320740169624,3.9321511179645334
+ 20,0.7488561055876992,0,0,1.9402227703203305,48.03392444101773,2.3901310717039324,4.2020046260601385
+ 21,0.7464116595008157,0,0,1.8860773106400748,49.46029298380879,2.6599845797995374,3.276792598303778
+ 22,0.7589777978983793,0,0,1.8588327532468985,50.809560524286816,2.3901310717039324,3.8550501156515034
+ 23,0.7375390204516324,0,0,1.834703096905946,51.58057054741712,2.120277563608327,4.279105628373169
+ 24,0.7235157110474326,0,0,1.8571111029078617,50.84811102544333,2.8141865844255975,4.009252120277564
+ 25,0.7282718907703053,0,0,1.9045326055337028,50.0,2.158828064764842,3.9321511179645334
+ 26,0.7249117168513212,0,0,1.934991702173155,48.265227447956825,2.3130300693909023,3.7779491133384733
+ 27,0.7170815467834473,0,0,1.8986565024464885,50.809560524286816,2.081727062451812,4.124903623747109
+ 28,0.7267518639564514,0,0,1.9323314469689667,48.99768696993061,3.1611410948342327,3.8164996144949885
+ 29,0.7307774207808755,0,0,1.907023419576511,50.73245952197379,2.235929067077872,4.471858134155744
+ 30,0.7234120856631886,0,0,1.8932903774352652,51.00231303006939,2.428681572860447,4.587509637625289
+ 31,0.7121647054498846,0,0,1.9132159633092358,50.30840400925212,2.4672320740169624,3.7393986121819585
+ 32,0.717554737221111,0,0,1.8778468887899689,50.11565150346954,2.120277563608327,4.240555127216654
+ 33,0.7193075039170005,0,0,1.933656499858259,49.42174248265228,2.6599845797995374,4.2020046260601385
+ 34,0.7088963443582709,0,0,1.917556003505851,50.69390902081727,2.274479568234387,3.9707016191210487
+ 35,0.7105227329514243,0,0,1.9279127674279253,50.07710100231303,3.0840400925212026,4.587509637625289
+ 36,0.7185628576712175,0,0,1.9110201952911472,51.349267540478024,1.9660755589822667,3.8164996144949885
+ 37,0.7170974937352267,0,0,1.892678547934927,50.92521202775636,2.5828835774865073,4.7417116422513494
+ 38,0.7181021300229159,0,0,1.9402496645610519,49.46029298380879,2.043176561295297,4.2020046260601385
+ 39,0.7050957517190413,0,0,1.9436388714274904,49.15188897455667,1.9660755589822667,3.8936006168080186
+ 40,0.7154142260551453,0,0,1.9956428967904567,47.91827293754819,2.235929067077872,3.8550501156515034
+ 41,0.7271967963738875,0,0,1.9499217061327343,49.190439475713184,1.6962220508866614,4.394757131842714
+ 42,0.7188805504278704,0,0,1.952874204426797,50.53970701619121,2.081727062451812,4.471858134155744
+ 43,0.7167698578401045,0,0,1.9593764147394515,49.498843484965306,2.3901310717039324,3.353893600616808
+ 44,0.7061739997430281,0,0,1.9899023658234796,47.9568234387047,2.043176561295297,3.9707016191210487
+ 45,0.712029679255052,0,0,1.951217144934508,49.73014649190439,2.081727062451812,4.009252120277564
+ 46,0.7068618535995483,0,0,1.9516625058770456,48.4579799537394,2.3901310717039324,4.356206630686199
+ 47,0.7123134515502236,0,0,1.9463807610061046,50.11565150346954,2.004626060138782,3.700848111025443
+ 48,0.7261221300471913,0,0,1.9640198419345922,50.19275250578257,2.428681572860447,4.394757131842714
+ 49,0.7144224047660828,0,0,1.9756056315731616,49.42174248265228,2.5443330763299925,4.973014649190439
+ 50,0.7240266691554677,0,0,2.002313643854769,47.686969930609095,2.351580570547417,4.703161141094834
+ 51,0.7079346559264443,0,0,2.0168485358391894,49.07478797224364,3.0069390902081725,4.433307632999229
+ 52,0.704542029987682,0,0,1.9849119728679554,48.920585967617576,2.4672320740169624,4.009252120277564
+ 53,0.7049224972724915,0,0,1.9769741869377189,49.07478797224364,1.6962220508866614,5.088666152659985
+ 54,0.7113959301601757,0,0,1.972036603605554,49.61449498843485,1.9660755589822667,4.356206630686199
+ 55,0.7194026925347068,0,0,1.929792857114958,50.26985350809561,2.120277563608327,4.086353122590594
+ 56,0.7168085791847922,0,0,1.9944675268351157,48.88203546646106,3.0454895913646878,4.086353122590594
+ 57,0.7093271558934992,0,0,1.9509136568332692,49.113338473400155,2.197378565921357,3.8936006168080186
+ 58,0.7040835293856534,0,0,2.0025861290848614,47.14726291441789,2.274479568234387,4.124903623747109
+ 59,0.70566542040218,0,0,1.9888466498258028,49.15188897455667,2.8912875867386276,5.05011565150347
+ 60,0.7014055577191439,0,0,1.952616757030384,49.03623747108713,2.235929067077872,3.9321511179645334
+ 61,0.7207257043231617,0,0,1.9701330084569104,49.38319198149576,2.004626060138782,4.086353122590594
+ 62,0.7087216160514138,0,0,1.978608385084589,49.2289899768697,2.7370855821125675,4.587509637625289
+ 63,0.7110950459133495,0,0,1.9746223124707765,49.03623747108713,2.197378565921357,4.973014649190439
+ 64,0.7021869150075045,0,0,2.028183111523148,47.45566692367001,3.392444101773323,3.5851966075558983
+ 65,0.7081769650632684,0,0,2.0330357338339895,46.80030840400925,2.8527370855821124,3.8936006168080186
+ 66,0.7077403393658724,0,0,2.028001334544044,47.1858134155744,3.0454895913646878,4.471858134155744
+ 67,0.7116573615507646,0,0,1.995275044128724,49.07478797224364,2.5057825751734772,4.086353122590594
+ 68,0.6999574357813055,0,0,1.9978079214956361,48.188126445643796,2.5057825751734772,3.430994602929838
+ 69,0.7015819224444303,0,0,2.0715577633636406,46.22205088666153,3.0454895913646878,3.0454895913646878
+ 70,0.6959977637637745,0,0,1.9542799288250432,50.038550501156514,2.5057825751734772,3.8936006168080186
+ 71,0.6889803409576416,0,0,1.9396044842169298,50.65535851966076,2.3130300693909023,3.7393986121819585
+ 72,0.6849353584376249,0,0,1.9520756356423876,50.69390902081727,2.428681572860447,4.009252120277564
+ 73,0.6939813657240435,0,0,1.958347300336098,50.73245952197379,2.698535080956052,4.163454124903624
+ 74,0.6781938997181979,0,0,1.9602921382224643,50.809560524286816,2.428681572860447,3.9707016191210487
+ 75,0.6772383343089711,0,0,1.96580110077869,50.69390902081727,2.7370855821125675,4.510408635312259
+ 76,0.6841544996608387,0,0,1.9617228759833272,50.38550501156515,2.5828835774865073,4.2020046260601385
+ 77,0.672706739469008,0,0,1.9716180815729805,50.886661526599845,2.5828835774865073,4.317656129529684
+ 78,0.67627506906336,0,0,1.9800843243977613,50.96376252891287,2.6214340786430226,3.9707016191210487
+ 79,0.6769991787997159,0,0,1.981784750152388,50.57825751734772,2.7370855821125675,4.009252120277564
+ 80,0.6697457270188765,0,0,1.9825532362841605,51.11796453353894,2.7756360832690823,4.124903623747109
+ 81,0.6728830500082537,0,0,1.98822685545375,50.38550501156515,2.5057825751734772,4.163454124903624
+ 82,0.6793067888780073,0,0,1.9885198796080366,50.96376252891287,2.5828835774865073,4.433307632999229
+ 83,0.6716876246712424,0,0,1.9963799706770073,51.19506553585197,2.698535080956052,4.587509637625289
+ 84,0.6683548255400225,0,0,1.998742670303321,50.84811102544333,2.698535080956052,4.2020046260601385
+ 85,0.670335666699843,0,0,1.9908146990567974,50.69390902081727,2.7370855821125675,4.317656129529684
+ 86,0.669741234996102,0,0,2.0019844271351763,50.53970701619121,2.3901310717039324,4.356206630686199
+ 87,0.6702776442874562,0,0,1.999634550457839,50.424055512721665,2.5443330763299925,4.047802621434078
+ 88,0.673119458285245,0,0,2.0064249389798436,50.38550501156515,2.6214340786430226,4.2020046260601385
+ 89,0.6788747256452387,0,0,2.0116112706471885,50.26985350809561,2.5828835774865073,4.279105628373169
+ 90,0.6745892329649492,0,0,2.00881067605412,50.34695451040864,2.5828835774865073,4.394757131842714
+ 91,0.6717266494577582,0,0,2.0071605238991697,50.53970701619121,2.8527370855821124,4.279105628373169
+ 92,0.668257393620231,0,0,2.0048912540250496,50.26985350809561,2.7756360832690823,4.317656129529684
+ 93,0.6899418993429705,0,0,2.0180595345743453,50.34695451040864,2.5057825751734772,4.471858134155744
+ 94,0.6683707724918019,0,0,2.013797598061602,50.57825751734772,2.7370855821125675,4.240555127216654
+ 95,0.6818247220732949,0,0,2.020260846881381,50.0,2.698535080956052,4.356206630686199
+ 96,0.675048903985457,0,0,2.019830103736707,50.501156515034694,2.9298380878951424,4.356206630686199
+ 97,0.665759574283253,0,0,2.024168534010488,50.53970701619121,2.8527370855821124,4.587509637625289
+ 98,0.6662651246244257,0,0,2.0211983741754374,50.038550501156514,2.4672320740169624,3.9707016191210487
+ 99,0.6692075783556158,0,0,2.017012972600109,50.501156515034694,2.5828835774865073,4.510408635312259
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-LIFNode-4/args.yaml ADDED
@@ -0,0 +1,162 @@
+ aa: rand-m9-mstd0.5-inc1
+ act_fun: QGateGrad
+ adam_epoch: 1000
+ adaptation_info: false
+ adaptive_node: false
+ alpha: 0.8
+ amp: false
+ apex_amp: false
+ audio_path: /mnt/home/hexiang/datasets/CREMA-D/AudioWAV/
+ aug_splits: 0
+ batch_size: 32
+ bn_eps: null
+ bn_momentum: null
+ bn_tf: false
+ channels_last: false
+ clip_grad: null
+ color_jitter: 0.4
+ conf_mat: false
+ conv_type: normal
+ cooldown_epochs: 10
+ critical_loss: false
+ crop_pct: null
+ cut_mix: false
+ cutmix: 0.0
+ cutmix_beta: 2.0
+ cutmix_minmax: null
+ cutmix_noise: 0.0
+ cutmix_num: 1
+ cutmix_prob: 0.5
+ dataset: KineticSound
+ decay_epochs: 70
+ decay_rate: 0.1
+ device: 0
+ dist_bn: ''
+ drop: 0.0
+ drop_block: null
+ drop_connect: null
+ drop_path: 0.1
+ encode: direct
+ epochs: 100
+ eval: false
+ eval_checkpoint: ''
+ eval_metric: top1
+ event_mix: false
+ event_size: 48
+ fps: 1
+ fusion_method: concat
+ gaussian_n: 3
+ gp: null
+ hflip: 0.5
+ img_size: 224
+ initial_checkpoint: ''
+ interpolation: ''
+ inverse: true
+ inverse_ends: 100
+ inverse_starts: 0
+ jsd: false
+ kernel_method: cuda
+ layer_by_layer: false
+ local_rank: 0
+ log_interval: 50
+ loss_fn: ce
+ lr: 0.005
+ lr_cycle_limit: 1
+ lr_cycle_mul: 1.0
+ lr_noise: null
+ lr_noise_pct: 0.67
+ lr_noise_std: 1.0
+ mean: null
+ mem_dist: false
+ meta_ratio: -1.0
+ min_lr: 1.0e-05
+ mix_up: false
+ mixup: 0.0
+ mixup_mode: batch
+ mixup_off_epoch: 0
+ mixup_prob: 0.0
+ mixup_switch_prob: 0.5
+ modality: audio-visual
+ model: AVresnet18
+ model_ema: false
+ model_ema_decay: 0.99996
+ model_ema_force_cpu: false
+ modulation: Normal
+ modulation_ends: 50
+ modulation_starts: 0
+ momentum: 0.9
+ n_encode_type: linear
+ n_groups: 1
+ n_preact: false
+ native_amp: false
+ newton_maxiter: 20
+ no_aug: false
+ no_prefetcher: false
+ no_resume_opt: false
+ node_resume: ''
+ node_type: LIFNode
+ noisy_grad: 0.0
+ num_classes: 31
+ num_gpu: 1
+ opt: sgd
+ opt_betas: null
+ opt_eps: 1.0e-08
+ output: ./exp_results
+ patience_epochs: 10
+ pin_mem: false
+ power: 1
+ pretrained: false
+ psai: 1.0
+ rand_aug: false
+ rand_step: false
+ randaug_m: 15
+ randaug_n: 3
+ ratio:
+ - 0.75
+ - 1.3333333333333333
+ recount: 1
+ recovery_interval: 0
+ remode: pixel
+ reprob: 0.25
+ requires_thres_grad: false
+ reset_drop: false
+ resplit: false
+ resume: ''
+ save_images: false
+ scale:
+ - 0.08
+ - 1.0
+ sched: step
+ seed: 2025
+ sew_cnf: ADD
+ sigmoid_thres: false
+ smoothing: 0.1
+ snr: -100
+ snrModality: null
+ spike_output: false
+ spike_rate: false
+ split_bn: false
+ start_epoch: null
+ std: null
+ step: 4
+ suffix: ''
+ sync_bn: false
+ tau: 2.0
+ temporal_flatten: false
+ tensorboard_dir: ./exp_results
+ tet_loss: false
+ threshold: 0.5
+ train_interpolation: random
+ train_portion: 0.9
+ tsne: false
+ tta: 0
+ use_multi_epochs_loader: false
+ use_video_frames: 3
+ validation_batch_size_multiplier: 1
+ vflip: 0.0
+ visual_path: /mnt/home/hexiang/datasets/CREMA-D/
+ visualize: false
+ warmup_epochs: 0
+ warmup_lr: 1.0e-06
+ weight_decay: 0.0005
+ workers: 8
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:24cb3180593213d5da34e5e4db609b0c7173396a1cd9ac9253be42749e72c26f
+ size 179501441
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-LIFNode-4/events.out.tfevents.1745049404.af1fd63cd999.1304306.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:bc68609d01e486ef2c2d5a0276d80dfa89005df0b8683a02b0110bec6cbd03a2
+ size 12954544
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-LIFNode-4/last.pth.tar ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a9c3ceb72e9051580964cbb79bdf78153beaab8c57ef343e6be17867888eeccd
+ size 179501441
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-LIFNode-4/log.txt ADDED
The diff for this file is too large to render. See raw diff
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-LIFNode-4/model_best.pth.tar ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:24cb3180593213d5da34e5e4db609b0c7173396a1cd9ac9253be42749e72c26f
+ size 179501441
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-LIFNode-4/summary.csv ADDED
@@ -0,0 +1,101 @@
+ epoch,train_loss,train_loss_single,train_loss_inverse,eval_loss,eval_top1,eval_top1_a,eval_top1_v
+ 0,8.913422064347701,6.0999579863114795,0.0,2.4186245882244597,30.146491904394757,23.323053199691596,16.345412490362374
+ 1,7.956188808787953,5.571812456304377,0.0,2.2643452720083266,37.39398612181958,28.79722436391673,18.889745566692365
+ 2,7.426820234818892,5.228178977966309,0.0,1.8895571652796972,45.990747879722434,31.071703932151117,25.636083269082498
+ 3,6.896188909357244,4.9416091658852315,0.0,1.968559294548417,43.02235929067078,34.04009252120277,25.250578257517347
+ 4,6.867864045229825,4.900304100730202,0.0,1.9174804566176744,46.80030840400925,36.43022359290671,26.060138781804163
+ 5,6.724006436087868,4.797688137401234,0.0,1.82541454582832,48.727833461835004,38.62760215882806,28.643022359290672
+ 6,6.4798290079290215,4.642946460030296,0.0,1.80135458179318,49.53739398612182,38.04934464148034,29.838087895142635
+ 7,5.993401917544278,4.3660430908203125,0.0,1.7473658622000525,51.38781804163454,37.62528912875867,31.033153430994602
+ 8,6.148419033397328,4.43590387431058,0.0,1.6255658770306072,54.3562066306862,39.707016191210485,34.96530454895914
+ 9,5.871526111255992,4.290742592378096,0.0,1.7117016254798578,51.69622205088666,36.19892058596762,34.734001542020046
+ 10,5.74381685256958,4.264637123454701,0.0,1.7374070306511777,50.69390902081727,36.700077101002314,35.8905165767155
+ 11,5.452688043767756,4.045811913230202,0.0,1.7055520404736262,52.08172706245181,37.12413261372398,37.27833461835004
+ 12,4.628481518138539,3.517678585919467,0.0,1.805424495394081,51.811873554356204,34.65690053970702,37.933693138010796
+ 13,4.455186367034912,3.410898880525069,0.0,1.7752763259934754,53.19969159599075,35.81341557440247,41.480339244410175
+ 14,4.104010061784224,3.163467688993974,0.0,1.829189393183959,52.23592906707787,37.548188126445645,37.933693138010796
+ 15,3.9113983891227027,3.044983213598078,0.0,1.8905051587265826,49.76869699306091,35.774865073245955,39.745566692367
+ 16,3.696666110645641,2.892118519002741,0.0,1.867321021021929,49.80724749421743,37.12413261372398,41.017733230531995
+ 17,3.6323231133547695,2.8632971156727183,0.0,1.7900664131735138,53.35389360061681,39.01310717039321,42.02004626060139
+ 18,3.6393686641346323,2.869971990585327,0.0,1.8415700799240549,52.62143407864302,40.24672320740169,40.70932922127988
+ 19,3.48906887661327,2.771279074928977,0.0,1.880367352398157,51.00231303006939,40.32382420971473,44.44872783346184
+ 20,3.4191886511715976,2.6914755647832695,0.0,1.9175427671386907,50.57825751734772,39.93831919814958,44.680030840400924
+ 21,3.4538497057828037,2.7280169400301846,0.0,1.8990509401584645,50.57825751734772,39.745566692367,44.64148033924441
+ 22,3.4887968843633477,2.755061994899403,0.0,1.9076726092865766,50.886661526599845,40.67077872012336,44.64148033924441
+ 23,3.385315851731734,2.6678260673176157,0.0,1.9233964123722214,50.38550501156515,39.745566692367,45.065535851966075
+ 24,3.419461033561013,2.715679948980158,0.0,1.902960644010957,51.77332305319969,40.24672320740169,47.57131842713955
+ 25,3.3484152880581943,2.625418923117898,0.0,1.8944436185802969,51.5420200462606,40.40092521202776,47.14726291441789
+ 26,3.346046296032992,2.638855804096569,0.0,1.9281684747547394,50.46260601387818,39.43716268311488,49.961449498843486
+ 27,3.2982512387362393,2.594924666664817,0.0,1.9507075506811797,49.92289899768697,40.28527370855821,48.188126445643796
+ 28,3.303452795202082,2.5863844481381504,0.0,1.8950384676042848,51.31071703932151,39.66846569005397,45.18118735543562
+ 29,3.367577487772161,2.643175081773238,0.0,1.9412081901532645,50.69390902081727,40.20817270624518,46.68465690053971
+ 30,3.2480053251439873,2.532977429303256,0.0,1.8963493973637142,52.46723207401696,40.13107170393215,47.57131842713955
+ 31,3.3074469566345215,2.6020596027374268,0.0,1.8821703823327833,51.85042405551272,39.39861218195836,47.10871241326137
+ 32,3.187801946293224,2.4844328490170566,0.0,1.8810275139584391,52.92983808789514,39.93831919814958,49.88434849653046
+ 33,3.2401338057084517,2.5143410726027056,0.0,1.8981733062952963,51.657671549730146,41.094834232845024,46.60755589822668
+ 34,3.0894840197129683,2.3931705301458184,0.0,1.9030943005105212,51.69622205088666,40.05397070161912,50.96376252891287
+ 35,3.156486056067727,2.458013881336559,0.0,1.931700356298901,51.23361603700848,40.555127216653815,49.38319198149576
+ 36,3.259755849838257,2.5482546632940117,0.0,1.9167086475154669,51.58057054741712,40.20817270624518,48.61218195836546
+ 37,3.147493449124423,2.4452531120993872,0.0,1.9012195858113474,51.927525057825754,39.745566692367,49.69159599074788
+ 38,3.195098400115967,2.4867431900717993,0.0,1.9306611038669772,52.00462606013878,39.55281418658443,50.038550501156514
+ 39,3.0399088859558105,2.3429606827822598,0.0,1.9120628173845773,51.811873554356204,39.629915188897456,49.30609097918273
+ 40,3.1596357388929888,2.447877515446056,0.0,1.9090616583548423,51.85042405551272,39.59136468774094,52.35158057054742
+ 41,3.118578217246316,2.393803661519831,0.0,1.9004772409808204,52.15882806476484,41.2875867386276,49.88434849653046
+ 42,3.1229797059839424,2.4038038470528345,0.0,1.9236741179949335,52.27447956823439,40.32382420971473,49.46029298380879
+ 43,3.064956534992565,2.354787913235751,0.0,1.893450268985863,52.8141865844256,39.97686969930609,49.80724749421743
+ 44,3.0591601675206963,2.364843498576771,0.0,1.9310902580446525,51.11796453353894,38.43484965304549,49.76869699306091
+ 45,3.040640419179743,2.3460383631966333,0.0,1.89587194035397,51.927525057825754,39.707016191210485,49.961449498843486
+ 46,2.9290889609943735,2.2308849638158623,0.0,1.8957584202059803,51.88897455666924,41.094834232845024,51.734772552043175
+ 47,3.0294453230771152,2.325488112189553,0.0,1.8707045902172785,52.54433307632999,39.97686969930609,52.27447956823439
+ 48,3.1317387494173916,2.405963897705078,0.0,1.9066181032126375,52.42868157286045,39.55281418658443,51.85042405551272
+ 49,2.9616283069957388,2.265440572391857,0.0,1.8994978067961306,53.00693909020817,40.555127216653815,51.5420200462606
+ 50,3.1455485820770264,2.4227043281901968,0.0,1.882923813351504,53.31534309946029,40.01542020046261,52.582883577486506
+ 51,2.9158610213886607,2.2247644771229136,0.0,1.8807359642493513,52.73708558211257,40.13107170393215,52.27447956823439
+ 52,2.9131210934032095,2.2226133671673862,0.0,1.8805447238725062,53.00693909020817,40.05397070161912,51.657671549730146
+ 53,2.893153017217463,2.1985829418355767,0.0,1.866078942577565,52.968388589051656,40.43947571318427,52.775636083269085
+ 54,2.983793995597146,2.2885990142822266,0.0,1.8809903658805853,53.700848111025444,40.747879722436394,53.39244410177332
+ 55,2.9351968331770464,2.222713134505532,0.0,1.8819090033276042,53.238242097147264,39.36006168080185,50.84811102544333
+ 56,2.867612058466131,2.1726393808018076,0.0,1.848045437723835,53.77794911333847,40.092521202775636,52.0431765612953
+ 57,2.88020075451244,2.1844664270227607,0.0,1.8708817611406101,52.62143407864302,40.82498072474942,52.0431765612953
+ 58,2.8559555357152764,2.1746761040254072,0.0,1.8831700686776096,52.0431765612953,39.55281418658443,51.927525057825754
+ 59,2.873554988340898,2.1813508814031426,0.0,1.9156408129789873,52.46723207401696,39.43716268311488,52.42868157286045
+ 60,2.8872393044558438,2.2007272135127676,0.0,1.8843953909098228,54.39475713184272,39.321511179645334,51.23361603700848
+ 61,2.8424318486993965,2.141035556793213,0.0,1.8908357570607017,52.8141865844256,40.13107170393215,52.390131071703934
+ 62,2.8343656930056484,2.14149995283647,0.0,1.9011504901953633,52.15882806476484,39.321511179645334,53.19969159599075
+ 63,2.904526775533503,2.201468998735601,0.0,1.9070615084610631,53.045489591364685,40.13107170393215,52.0431765612953
+ 64,2.869010014967485,2.1730627796866675,0.0,1.9148258545257169,52.313030069390905,39.09020817270625,51.927525057825754
+ 65,2.900116508657282,2.2072539112784644,0.0,1.8638444685807298,53.19969159599075,39.51426368542791,53.161141094834235
+ 66,2.8059865778142754,2.1229736588217993,0.0,1.8920320319320574,52.582883577486506,39.86121819583654,50.7710100231303
+ 67,2.7948883230035957,2.0860411904074927,0.0,1.862464604785834,53.893600616808016,39.707016191210485,54.43330763299923
+ 68,2.819111693989147,2.139646064151417,0.0,1.9011826450859663,51.85042405551272,40.362374710871244,51.19506553585197
+ 69,2.7721076878634365,2.095326152714816,0.0,1.8980726456770827,52.698535080956056,38.74325366229761,53.508095605242865
+ 70,2.5916575735265557,1.9112647555091165,0.0,1.8891348250572186,53.93215111796454,40.70932922127988,57.51734772552043
+ 71,2.5009771910580723,1.8191040754318237,0.0,1.877019375953292,54.3562066306862,41.017733230531995,57.97995373939861
+ 72,2.5318940336054023,1.8504195104945789,0.0,1.8751355013483382,55.47417116422513,40.747879722436394,58.28835774865073
+ 73,2.559442325071855,1.86715059930628,0.0,1.8776340866971217,55.011565150346954,40.32382420971473,58.09560524286816
+ 74,2.4455772096460517,1.7696142846887761,0.0,1.8859577449175056,54.20200462606014,40.40092521202776,57.170393215111794
+ 75,2.5098052241585473,1.8350491632114758,0.0,1.8917851683352669,54.58750963762529,40.24672320740169,57.20894371626831
+ 76,2.5211455171758477,1.8380828120491721,0.0,1.8915326236851324,54.51040863531226,40.632228218966844,57.20894371626831
+ 77,2.4673178412697534,1.7988011945377698,0.0,1.8845585563500844,54.934464148033925,40.13107170393215,57.90285273708558
+ 78,2.5254632993177935,1.8543669202110984,0.0,1.894946760274671,54.31765612952968,40.940632228218966,57.748650732459524
+ 79,2.4496282013979824,1.7808785221793435,0.0,1.894252816805402,55.43562066306862,41.210485736314574,56.55358519660756
+ 80,2.484100016680631,1.8123636787587947,0.0,1.887810735584868,55.35851966075559,41.17193523515806,58.018504240555124
+ 81,2.445179592479359,1.7747538523240523,0.0,1.9026271706833322,54.548959136468774,39.97686969930609,57.170393215111794
+ 82,2.530971787192605,1.8478322462602095,0.0,1.8959243119268483,54.70316114109483,40.940632228218966,58.134155744024675
+ 83,2.444200949235396,1.7722663662650369,0.0,1.908030605904396,54.78026214340787,40.40092521202776,57.24749421742483
+ 84,2.4377663785761055,1.7739294225519353,0.0,1.902572711431346,54.81881264456438,39.86121819583654,57.71010023130301
+ 85,2.4400927153500644,1.7682537165555088,0.0,1.9093585935297064,54.58750963762529,40.32382420971473,57.24749421742483
+ 86,2.498041716488925,1.8211543993516401,0.0,1.9054594642856806,54.39475713184272,40.28527370855821,56.6692367000771
+ 87,2.48664927482605,1.810897328636863,0.0,1.9057708282882466,54.857363145720896,40.90208172706245,56.93909020817271
+ 88,2.473065961490978,1.7910620407624678,0.0,1.9121413475380737,54.39475713184272,40.32382420971473,57.286044718581344
+ 89,2.4410992102189497,1.75708345933394,0.0,1.9126712123338498,54.009252120277566,40.362374710871244,56.36083269082498
+ 90,2.4862161549654873,1.7980807044289329,0.0,1.8990497668155266,54.471858134155745,40.40092521202776,57.170393215111794
+ 91,2.4170630411668257,1.7487245039506392,0.0,1.9122141334030751,54.20200462606014,40.32382420971473,56.245181187355435
+ 92,2.3978711691769687,1.7318652868270874,0.0,1.9051065198623316,54.548959136468774,40.20817270624518,57.093292212798765
+ 93,2.551802331751043,1.854498570615595,0.0,1.9124730269359274,54.39475713184272,40.43947571318427,56.245181187355435
+ 94,2.4188556454398413,1.7558914422988892,0.0,1.9174483050358875,53.93215111796454,40.32382420971473,56.36083269082498
+ 95,2.4394628784873267,1.75409244407307,0.0,1.915972285024368,54.548959136468774,40.632228218966844,56.51503469545104
+ 96,2.4506113962693648,1.772583159533414,0.0,1.895423756623691,55.08866615265998,40.362374710871244,56.86198920585968
+ 97,2.354710665616122,1.693143205209212,0.0,1.9127150442201355,54.39475713184272,40.28527370855821,56.51503469545104
+ 98,2.3787700696425005,1.712969801642678,0.0,1.908131160640496,54.741711642251346,40.555127216653815,57.24749421742483
+ 99,2.406543905084783,1.7342622171748767,0.0,1.9111042888511947,54.3562066306862,40.43947571318427,56.39938319198149
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/_weights_xignore=biasbn_xnorm=filter_yignore=biasbn_ynorm=filter.h5 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:85e0279ce2e13be3567ac0a6d9930bbe1d046b40293096074be77a1c4916209f
+ size 179467884
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/_weights_xignore=biasbn_xnorm=filter_yignore=biasbn_ynorm=filter.h5_[-10.0,10.0,51]x[-10.0,10.0,51].h5 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:443db5c73240ab8e5d6ef2f156f74df05e26ce8e5a1b7b8a82f55a5d0fbfff39
+ size 50640
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/args.yaml ADDED
@@ -0,0 +1,162 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ aa: rand-m9-mstd0.5-inc1
+ act_fun: QGateGrad
+ adam_epoch: 1000
+ adaptation_info: false
+ adaptive_node: false
+ alpha: 0.8
+ amp: false
+ apex_amp: false
+ audio_path: /mnt/home/hexiang/datasets/CREMA-D/AudioWAV/
+ aug_splits: 0
+ batch_size: 32
+ bn_eps: null
+ bn_momentum: null
+ bn_tf: false
+ channels_last: false
+ clip_grad: null
+ color_jitter: 0.4
+ conf_mat: false
+ conv_type: normal
+ cooldown_epochs: 10
+ critical_loss: false
+ crop_pct: null
+ cut_mix: false
+ cutmix: 0.0
+ cutmix_beta: 2.0
+ cutmix_minmax: null
+ cutmix_noise: 0.0
+ cutmix_num: 1
+ cutmix_prob: 0.5
+ dataset: KineticSound
+ decay_epochs: 70
+ decay_rate: 0.1
+ device: 0
+ dist_bn: ''
+ drop: 0.0
+ drop_block: null
+ drop_connect: null
+ drop_path: 0.1
+ encode: direct
+ epochs: 100
+ eval: false
+ eval_checkpoint: ''
+ eval_metric: top1
+ event_mix: false
+ event_size: 48
+ fps: 1
+ fusion_method: concat
+ gaussian_n: 3
+ gp: null
+ hflip: 0.5
+ img_size: 224
+ initial_checkpoint: ''
+ interpolation: ''
+ inverse: true
+ inverse_ends: 100
+ inverse_starts: 0
+ jsd: false
+ kernel_method: cuda
+ layer_by_layer: false
+ local_rank: 0
+ log_interval: 50
+ loss_fn: ce
+ lr: 0.005
+ lr_cycle_limit: 1
+ lr_cycle_mul: 1.0
+ lr_noise: null
+ lr_noise_pct: 0.67
+ lr_noise_std: 1.0
+ mean: null
+ mem_dist: false
+ meta_ratio: -1.0
+ min_lr: 1.0e-05
+ mix_up: false
+ mixup: 0.0
+ mixup_mode: batch
+ mixup_off_epoch: 0
+ mixup_prob: 0.0
+ mixup_switch_prob: 0.5
+ modality: audio-visual
+ model: AVresnet18
+ model_ema: false
+ model_ema_decay: 0.99996
+ model_ema_force_cpu: false
+ modulation: Normal
+ modulation_ends: 50
+ modulation_starts: 0
+ momentum: 0.9
+ n_encode_type: linear
+ n_groups: 1
+ n_preact: false
+ native_amp: false
+ newton_maxiter: 20
+ no_aug: false
+ no_prefetcher: false
+ no_resume_opt: false
+ node_resume: ''
+ node_type: ReLUNode
+ noisy_grad: 0.0
+ num_classes: 31
+ num_gpu: 1
+ opt: sgd
+ opt_betas: null
+ opt_eps: 1.0e-08
+ output: ./exp_results
+ patience_epochs: 10
+ pin_mem: false
+ power: 1
+ pretrained: false
+ psai: 1.0
+ rand_aug: false
+ rand_step: false
+ randaug_m: 15
+ randaug_n: 3
+ ratio:
+ - 0.75
+ - 1.3333333333333333
+ recount: 1
+ recovery_interval: 0
+ remode: pixel
+ reprob: 0.25
+ requires_thres_grad: false
+ reset_drop: false
+ resplit: false
+ resume: ''
+ save_images: false
+ scale:
+ - 0.08
+ - 1.0
+ sched: step
+ seed: 2025
+ sew_cnf: ADD
+ sigmoid_thres: false
+ smoothing: 0.1
+ snr: -100
+ snrModality: null
+ spike_output: false
+ spike_rate: false
+ split_bn: false
+ start_epoch: null
+ std: null
+ step: 1
+ suffix: ''
+ sync_bn: false
+ tau: 2.0
+ temporal_flatten: false
+ tensorboard_dir: ./exp_results
+ tet_loss: false
+ threshold: 0.5
+ train_interpolation: random
+ train_portion: 0.9
+ tsne: false
+ tta: 0
+ use_multi_epochs_loader: false
+ use_video_frames: 3
+ validation_batch_size_multiplier: 1
+ vflip: 0.0
+ visual_path: /mnt/home/hexiang/datasets/CREMA-D/
+ visualize: false
+ warmup_epochs: 0
+ warmup_lr: 1.0e-06
+ weight_decay: 0.0005
+ workers: 8
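The args.yaml above is mostly flat `key: value` pairs. As a minimal stdlib-only sketch (PyYAML's `yaml.safe_load` would handle the full file, including the nested `ratio` and `scale` lists, which this simple parser skips), the flat entries can be read like this; the sample keys below are copied from the config above:

```python
# Minimal sketch: parse flat "key: value" lines of an args.yaml like the one
# above without external dependencies. Nested list keys ("ratio:", "scale:")
# are skipped for simplicity; use yaml.safe_load for the full file.
def parse_flat_yaml(text: str) -> dict:
    def convert(v: str):
        if v == "null":        # YAML null
            return None
        if v == "''":          # empty string literal
            return ""
        if v in ("true", "false"):
            return v == "true"
        for cast in (int, float):
            try:
                return cast(v)
            except ValueError:
                pass
        return v               # plain string (paths, model names, ...)

    cfg = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("-") or ":" not in line:
            continue           # skip blanks and list items
        key, _, value = line.partition(":")
        value = value.strip()
        if value:              # parent keys of nested lists have no value
            cfg[key.strip()] = convert(value)
    return cfg

sample = """
model: AVresnet18
dataset: KineticSound
lr: 0.005
epochs: 100
pretrained: false
bn_eps: null
"""
cfg = parse_flat_yaml(sample)
print(cfg["model"], cfg["lr"], cfg["epochs"], cfg["pretrained"])
```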
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/checkpoint-74.pth.tar ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a283adbc9717a11534fbe7859f2d754208cfde3c345f5d190e77e300706897f7
+ size 179501441
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/events.out.tfevents.1744967800.af1fd63cd999.621246.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a1dcb17c1d35eb8a6b56bf5e078f4db34199784c8e72c35d0cde9e0f752e0de9
+ size 12954544
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/last.pth.tar ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:aaccc9ba71d70c66cd49d0ed6c68668006fd5bd48c0c5f1cfaffb8cfac78ae48
+ size 179501441
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/log.txt ADDED
The diff for this file is too large to render. See raw diff
 
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/model_best.pth.tar ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a283adbc9717a11534fbe7859f2d754208cfde3c345f5d190e77e300706897f7
+ size 179501441
Audio Visual Classification/exp_results/AVresnet18-KineticSound-audio-visual-Normal-inverse_True-psai_1.0-fusion_concat-seed_2025-ReLUNode-1/summary.csv ADDED
@@ -0,0 +1,101 @@
+ epoch,train_loss,train_loss_single,train_loss_inverse,eval_loss,eval_top1,eval_top1_a,eval_top1_v
+ 0,8.864304672588002,6.08072449944236,0.0,2.4609338110377443,28.79722436391673,20.470316114109483,16.499614494988435
+ 1,7.950204502452504,5.571583401073109,0.0,2.5369057806069426,32.960678488820356,23.053199691595992,18.272937548188125
+ 2,7.465798247944225,5.2439094456759365,0.0,1.9737686717519414,43.562066306861986,29.02852737085582,23.708558211256747
+ 3,6.879950046539307,4.938333164561879,0.0,1.934290700631594,45.60524286815728,32.22821896684657,25.79028527370856
+ 4,6.857426296580922,4.895855253393,0.0,1.9962589374945177,43.677717810331536,36.77717810331534,25.096376252891286
+ 5,6.64566724950617,4.749696904962713,0.0,1.8239913012490607,47.879722436391674,36.969930609097915,26.522744795682343
+ 6,6.462749784643,4.6391071839766065,0.0,1.9780334705743958,46.491904394757135,35.92906707787201,28.95142636854279
+ 7,5.949556827545166,4.33861069245772,0.0,1.8036506596949804,48.88203546646106,34.811102544333075,33.0763299922899
+ 8,6.214459592645818,4.489969036795876,0.0,1.7482086602963203,51.079414032382424,37.50963762528913,32.035466461064
+ 9,5.713632020083341,4.203526193445379,0.0,2.1416370473463164,45.990747879722434,31.5343099460293,31.727062451811875
+ 10,5.700679822401567,4.24209003014998,0.0,1.8843012444313068,48.76638396299152,32.99922898997687,34.11719352351581
+ 11,5.345557733015581,3.97409573468295,0.0,1.8489470847680556,50.809560524286816,31.84271395528142,37.74094063222822
+ 12,4.599525473334572,3.502567182887684,0.0,2.222502092076157,44.79568234387047,31.84271395528142,32.845026985350806
+ 13,4.392306696284901,3.359241984107278,0.0,1.8432699939151314,51.85042405551272,33.26908249807248,38.7047031611411
+ 14,4.017167394811457,3.1027332219210537,0.0,1.9642711887933146,49.575944487278335,36.00616808018504,37.4325366229761
+ 15,3.8412289402701636,2.991567026485096,0.0,1.9268970074061715,48.84348496530455,36.46877409406322,36.00616808018504
+ 16,3.688096588308161,2.890348954634233,0.0,1.84404759013662,51.69622205088666,38.04934464148034,41.403238242097146
+ 17,3.567133079875599,2.809812545776367,0.0,1.8459300803697007,51.50346954510409,39.629915188897456,38.16499614494988
+ 18,3.63045672936873,2.8590270172465932,0.0,1.8357023385826587,52.00462606013878,40.169622205088665,42.21279876638396
+ 19,3.478744246742942,2.7622139670632104,0.0,1.9979014415049792,47.686969930609095,38.396299151888975,39.55281418658443
+ 20,3.435373826460405,2.704470157623291,0.0,1.9600261314703853,49.61449498843485,40.67077872012336,42.17424826522745
+ 21,3.3963942527770996,2.6749524636702104,0.0,1.8978821540117816,51.31071703932151,39.66846569005397,40.28527370855821
+ 22,3.4935771118510854,2.764287666840987,0.0,1.937524498106428,50.53970701619121,39.97686969930609,40.5165767154973
+ 23,3.3549781279130415,2.6343413266268643,0.0,1.9078881617408583,50.57825751734772,39.784117193523514,44.79568234387047
+ 24,3.4593291716142134,2.7532327391884546,0.0,1.9024209023992558,51.19506553585197,40.01542020046261,44.29452582883577
+ 25,3.350887493653731,2.6288786151192407,0.0,1.951225899142675,50.26985350809561,41.480339244410175,41.981495759444876
+ 26,3.334369811144742,2.6313998915932397,0.0,1.9888019819119938,49.15188897455667,38.35774865073246,46.1449498843485
+ 27,3.291086348620328,2.583132570440119,0.0,1.9502920466564946,50.46260601387818,40.01542020046261,46.18350038550501
+ 28,3.2810999263416636,2.5619078549471768,0.0,1.994886975152362,46.60755589822668,37.39398612181958,36.160370084811106
+ 29,3.3454058170318604,2.620034868066961,0.0,1.9683376987254335,48.84348496530455,39.51426368542791,44.75713184271395
+ 30,3.2298372008583764,2.512613144787875,0.0,1.9181220057199988,51.464919043947575,40.43947571318427,45.913646877409406
+ 31,3.3129579587416216,2.611720323562622,0.0,1.870143467955343,53.19969159599075,40.32382420971473,42.13569776407093
+ 32,3.2366027615287085,2.5308499986475166,0.0,1.9119959745208575,51.23361603700848,40.20817270624518,46.761757902852736
+ 33,3.2572227608073843,2.5311226628043433,0.0,1.9109867598887307,52.0431765612953,39.93831919814958,46.26060138781804
+ 34,3.10352180220864,2.40699104829268,0.0,1.9203283924641017,52.120277563608326,39.66846569005397,47.031611410948344
+ 35,3.2090905146165327,2.5057059851559726,0.0,1.9533046666897529,51.77332305319969,41.017733230531995,40.555127216653815
+ 36,3.2580124248157847,2.546559767289595,0.0,1.9498492386493669,50.19275250578257,39.82266769468003,46.68465690053971
+ 37,3.1492070934989234,2.4481518268585205,0.0,1.9257320449640867,51.77332305319969,40.32382420971473,46.60755589822668
+ 38,3.1474460688504307,2.4407600706273858,0.0,1.945646116525831,51.58057054741712,40.82498072474942,43.79336931380108
+ 39,3.0527275042100386,2.3544228727167305,0.0,1.9284831297425187,51.811873554356204,39.707016191210485,46.18350038550501
+ 40,3.1385399861769243,2.4237148978493432,0.0,1.9083975874283907,52.08172706245181,40.43947571318427,48.49653045489591
+ 41,3.1563447388735684,2.4285459301688452,0.0,1.9166246578154789,51.15651503469545,39.59136468774094,46.68465690053971
+ 42,3.139063683423129,2.419008320028132,0.0,1.931701413833277,51.85042405551272,40.169622205088665,46.26060138781804
+ 43,3.073895064267245,2.3654291196302935,0.0,1.9050379573333787,51.85042405551272,39.321511179645334,46.18350038550501
+ 44,3.0890694098039106,2.393331462686712,0.0,1.9250956267325254,51.85042405551272,38.858905165767155,47.031611410948344
+ 45,3.0218137827786533,2.32620247927579,0.0,1.9026443897620844,52.775636083269085,39.43716268311488,48.342328450269854
+ 46,2.9242090528661553,2.222837903282859,0.0,1.9298404046178508,50.7710100231303,38.35774865073246,47.49421742482652
+ 47,3.0270766778425737,2.321492065082897,0.0,1.9029926213284318,51.811873554356204,38.66615265998458,49.73014649190439
+ 48,3.1161446137861772,2.392368294975974,0.0,1.909401678175033,53.238242097147264,40.59367771781033,48.22667694680031
+ 49,2.94148512320085,2.242118250239979,0.0,1.8753940750289717,52.23592906707787,39.93831919814958,50.7710100231303
+ 50,3.080849300731312,2.3651200207796963,0.0,1.9229632420271474,51.657671549730146,39.39861218195836,48.727833461835004
+ 51,2.925799109719016,2.2314494956623423,0.0,1.957751393593911,50.57825751734772,39.321511179645334,48.03392444101773
+ 52,2.934916778044267,2.2428288893266157,0.0,1.906772012239984,52.15882806476484,39.16730917501928,48.57363145720895
+ 53,2.8512535962191494,2.1539934765208852,0.0,1.886652306032438,52.582883577486506,40.47802621434079,48.03392444101773
+ 54,2.957517060366544,2.261018688028509,0.0,1.903125951358145,52.968388589051656,40.59367771781033,48.072474942174246
+ 55,2.9226718599146064,2.213052207773382,0.0,1.8549808779402521,54.43330763299923,39.93831919814958,51.61912104857363
+ 56,2.8513040976090864,2.155928308313543,0.0,1.8836915241173808,52.968388589051656,39.82266769468003,50.07710100231303
+ 57,2.8467988751151343,2.1464450359344482,0.0,1.8788552984606788,53.430994602929836,39.16730917501928,50.61680801850424
+ 58,2.8491393436085093,2.1661354953592475,0.0,1.9000250405316732,51.734772552043175,38.62760215882806,48.84348496530455
+ 59,2.8890515890988437,2.1955313465811988,0.0,1.885846688366892,53.00693909020817,39.86121819583654,52.50578257517348
+ 60,2.8658947294408623,2.1779997565529565,0.0,1.8707261260142212,53.73939861218196,39.86121819583654,49.26754047802621
+ 61,2.849040313200517,2.1465529636903242,0.0,1.9295746870195305,52.120277563608326,40.28527370855821,49.46029298380879
+ 62,2.8189338554035532,2.1276246200908315,0.0,1.906982354044271,53.0840400925212,40.47802621434079,52.659984579799534
+ 63,2.909959749741988,2.206061428243464,0.0,1.8958956412195516,53.77794911333847,40.555127216653815,50.19275250578257
+ 64,2.872427073392001,2.1745346134359185,0.0,1.907307339155775,51.58057054741712,40.092521202775636,51.657671549730146
+ 65,2.8842995817011055,2.191982475194064,0.0,1.885983072985294,52.62143407864302,40.20817270624518,49.498843484965306
+ 66,2.8263662078163843,2.1425382007252085,0.0,1.9078462031959293,51.58057054741712,39.707016191210485,49.07478797224364
+ 67,2.831496065313166,2.1193384365601973,0.0,1.9043025220460679,52.35158057054742,39.51426368542791,49.498843484965306
+ 68,2.8422492850910532,2.1539700356396763,0.0,1.9129531686822543,52.00462606013878,38.396299151888975,51.96607555898227
+ 69,2.7777761112559927,2.094324762170965,0.0,1.914720185279111,52.27447956823439,38.58905165767155,49.80724749421743
+ 70,2.643687508322976,1.95837524804202,0.0,1.8350490071540073,55.97532767925983,40.5165767154973,57.093292212798765
+ 71,2.536225535652854,1.8452885367653586,0.0,1.8322941990383974,56.09097918272938,41.017733230531995,57.05474171164225
+ 72,2.5369892553849653,1.8501596667549827,0.0,1.8352445920063334,55.70547417116423,40.70932922127988,58.48111025443331
+ 73,2.5544137087735264,1.859866825017062,0.0,1.8371086400016603,56.12952968388589,40.82498072474942,57.4402467232074
+ 74,2.443770408630371,1.7645706155083396,0.0,1.8418604382755395,56.168080185042406,40.632228218966844,57.170393215111794
+ 75,2.504184137691151,1.8221033594825051,0.0,1.8495566074721337,55.70547417116423,40.5165767154973,56.97764070932922
+ 76,2.4907146367159756,1.805579965764826,0.0,1.8432391805656156,55.74402467232074,40.67077872012336,57.51734772552043
+ 77,2.463254841891202,1.792728770862926,0.0,1.8504272639613566,55.35851966075559,40.82498072474942,57.24749421742483
+ 78,2.5283753221685235,1.855991471897472,0.0,1.8559216546754977,55.66692367000771,41.017733230531995,56.322282189668464
+ 79,2.43806977705522,1.7679215994748203,0.0,1.8561896619745282,55.66692367000771,40.70932922127988,55.93677717810331
+ 80,2.4995392886075107,1.8273977583104914,0.0,1.8482537063710398,55.97532767925983,40.940632228218966,57.24749421742483
+ 81,2.447273861278187,1.7757724848660557,0.0,1.865626056468202,55.397070161912104,40.32382420971473,56.322282189668464
+ 82,2.5413402210582388,1.8561630790883845,0.0,1.8671063946684967,55.58982266769468,40.59367771781033,57.170393215111794
+ 83,2.4394211769104004,1.7662001631476663,0.0,1.8754245503277804,55.24286815728605,40.20817270624518,56.36083269082498
+ 84,2.440536694093184,1.7737461978738958,0.0,1.8709225954233726,55.20431765612953,40.28527370855821,56.86198920585968
+ 85,2.419894131747159,1.7468709837306629,0.0,1.86890560236911,55.05011565150347,40.59367771781033,57.170393215111794
+ 86,2.457799803126942,1.777259263125333,0.0,1.8767318196175369,55.16576715497302,40.32382420971473,57.01619121048574
+ 87,2.4696818481792104,1.7921430414373225,0.0,1.8706875987115417,55.51272166538165,40.555127216653815,56.437933693138014
+ 88,2.4394044442610308,1.757690668106079,0.0,1.878381196958061,55.24286815728605,40.32382420971473,56.90053970701619
+ 89,2.4448861208829014,1.7584844827651978,0.0,1.8813223504248084,55.28141865844256,40.05397070161912,56.47648419429453
+ 90,2.463672464544123,1.7734671939503064,0.0,1.8762813513335945,55.782575173477255,40.24672320740169,56.55358519660756
+ 91,2.431365034796975,1.762659495527094,0.0,1.884416566597285,55.24286815728605,40.092521202775636,56.59213569776407
+ 92,2.405337181958285,1.7371771769090132,0.0,1.8751984506546762,55.319969159599076,40.28527370855821,56.36083269082498
+ 93,2.5075244253331963,1.8076894608410923,0.0,1.8877609048150739,55.319969159599076,40.28527370855821,55.782575173477255
+ 94,2.398684783415361,1.7342462756417014,0.0,1.8853463648646818,55.20431765612953,40.05397070161912,57.093292212798765
+ 95,2.429596424102783,1.7440044446425005,0.0,1.8894190295623097,55.397070161912104,39.86121819583654,56.86198920585968
+ 96,2.4506863897497,1.7709588029167869,0.0,1.8789105163506572,55.47417116422513,40.32382420971473,56.168080185042406
+ 97,2.3225276036696,1.6597023660486394,0.0,1.8905925945584923,55.397070161912104,40.32382420971473,56.36083269082498
+ 98,2.3825330300764604,1.716467402198098,0.0,1.8879966691722296,55.47417116422513,40.43947571318427,56.36083269082498
+ 99,2.418989723378962,1.7442512620579114,0.0,1.8918366263072677,55.47417116422513,40.092521202775636,55.8982266769468
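A summary.csv in the format above can be post-processed with the stdlib `csv` module; a minimal sketch, selecting the epoch with the highest `eval_top1` (the two data rows below are copied from epochs 0 and 74 of the table above):

```python
import csv
import io

# Sketch: pick the epoch with the highest eval_top1 from a summary.csv in the
# format above. The column names come from the real header; the two data rows
# are verbatim excerpts (epochs 0 and 74).
sample = """epoch,train_loss,train_loss_single,train_loss_inverse,eval_loss,eval_top1,eval_top1_a,eval_top1_v
0,8.864304672588002,6.08072449944236,0.0,2.4609338110377443,28.79722436391673,20.470316114109483,16.499614494988435
74,2.443770408630371,1.7645706155083396,0.0,1.8418604382755395,56.168080185042406,40.632228218966844,57.170393215111794
"""
rows = list(csv.DictReader(io.StringIO(sample)))
best = max(rows, key=lambda r: float(r["eval_top1"]))
print(best["epoch"], best["eval_top1"])
```

For the full file, replace `io.StringIO(sample)` with an open file handle on summary.csv.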
Audio Visual Classification/exp_results/readme.md ADDED
@@ -0,0 +1,45 @@
+ Due to upload speed limitations, we only upload the results of Normal (concat fusion) on the Kinetics-Sound dataset here; the rest of the results in the paper are uploaded to Baidu Netdisk. The decompressed archive has the following structure:
+
+ ```
+ ├── AVresnet18-CREMAD-audio-visual-OGM_GE-inverse_False-psai_1.0-fusion_concat-seed_2025-LIFNode-4
+ │   ├── args.yaml
+ │   ├── checkpoint-56.pth.tar
+ │   ├── events.out.tfevents.1745036026.af1fd63cd999.968888.0
+ │   ├── last.pth.tar
+ │   ├── log.txt
+ │   ├── model_best.pth.tar
+ │   └── summary.csv
+ ├── AVresnet18-CREMAD-audio-visual-OGM_GE-inverse_False-psai_1.0-fusion_concat-seed_2025-ReLUNode-1
+ │   ├── args.yaml
+ │   ├── checkpoint-56.pth.tar
+ │   ├── events.out.tfevents.1744958014.af1fd63cd999.329244.0
+ │   ├── last.pth.tar
+ │   ├── log.txt
+ │   ├── model_best.pth.tar
+ │   └── summary.csv
+ ├── AVresnet18-CREMAD-audio-visual-OGM_GE-inverse_True-psai_1.0-fusion_concat-seed_2025-LIFNode-4
+ │   ├── args.yaml
+ │   ├── checkpoint-56.pth.tar
+ │   ├── events.out.tfevents.1745036026.af1fd63cd999.968889.0
+ │   ├── last.pth.tar
+ │   ├── log.txt
+ │   ├── model_best.pth.tar
+ │   └── summary.csv
+ ├── AVresnet18-CREMAD-audio-visual-OGM_GE-inverse_True-psai_1.0-fusion_concat-seed_2025-ReLUNode-1
+ │   ├── args.yaml
+ │   ├── checkpoint-55.pth.tar
+ │   ├── events.out.tfevents.1744958014.af1fd63cd999.329245.0
+ │   ├── last.pth.tar
+ │   ├── log.txt
+ │   ├── model_best.pth.tar
+ │   └── summary.csv
+ ...
+
+
+ ```
+
+ There are 62 directories in total.
+
+
+
+ You can find all our experimental results [here](https://pan.baidu.com/s/1myFj4XVNIgdZIFXsN0pL1A) (extraction code: q372).
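After unpacking the Baidu Netdisk archive, a quick sanity check is to count the top-level run directories against the "62 directories" stated above. A minimal sketch (the local path `results/` is hypothetical, and only two of the real directory names are recreated here as stand-ins):

```python
from pathlib import Path

# Hedged sketch: recreate two of the per-run directory names from the tree
# above under a hypothetical "results/" root, then count top-level
# directories -- the same check one would run on the real extracted archive.
root = Path("results")
for name in (
    "AVresnet18-CREMAD-audio-visual-OGM_GE-inverse_False-psai_1.0-fusion_concat-seed_2025-LIFNode-4",
    "AVresnet18-CREMAD-audio-visual-OGM_GE-inverse_True-psai_1.0-fusion_concat-seed_2025-LIFNode-4",
):
    (root / name).mkdir(parents=True, exist_ok=True)

n_dirs = sum(1 for p in root.iterdir() if p.is_dir())
print(n_dirs)  # 62 on the full archive; 2 for this stand-in
```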
Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/fig/audio-visual_train_loss_step_0.png ADDED
Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/fig/audio-visual_train_loss_step_1.png ADDED
Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/fig/audio-visual_train_loss_step_2.png ADDED
Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/fig/audio-visual_train_loss_step_3.png ADDED
Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/fig/audio-visual_val_acc_step_0.png ADDED
Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/fig/audio-visual_val_acc_step_1.png ADDED
Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/fig/audio-visual_val_acc_step_2.png ADDED
Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/fig/audio-visual_val_acc_step_3.png ADDED
Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/step_0_best_audio-visual_model.pkl ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d9114a8babaf4afb1c2c1b67243be750a277e44570f8a6b9937420654d882123
+ size 114115345
Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/step_1_best_audio-visual_model.pkl ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3c29e36d7dc7d1b0283d7baf227744cdc78537fb27b199102c65476ea17275a5
+ size 114179921
Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/step_2_best_audio-visual_model.pkl ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:cfd31bbb96a44249bc14f4ed96e6b58f79949fa76393248030d720768edda071
+ size 114244625
Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/step_3_best_audio-visual_model.pkl ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4a1a910e37c69bc613655b86b2d20d249d7a86e50677c17fc22f80b6431f04bb
+ size 114309137
Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_False-seed_0/train.log ADDED
@@ -0,0 +1,885 @@
+ 2025-04-19 03:46:47,992 INFO Training start time: 2025-04-19 03:46:47.992610
+ 2025-04-19 04:06:48,306 INFO Incremental step: 0
+ 2025-04-19 04:07:08,530 INFO Epoch:0 train_loss:2.26734
+ 2025-04-19 04:07:13,724 INFO Epoch:0 val_res:0.257143
+ 2025-04-19 04:07:13,724 INFO Saving best model at Epoch 0
+ 2025-04-19 04:07:28,405 INFO Epoch:1 train_loss:1.55042
+ 2025-04-19 04:07:33,779 INFO Epoch:1 val_res:0.466667
+ 2025-04-19 04:07:33,779 INFO Saving best model at Epoch 1
+ 2025-04-19 04:07:47,208 INFO Epoch:2 train_loss:1.27226
+ 2025-04-19 04:07:52,431 INFO Epoch:2 val_res:0.628571
+ 2025-04-19 04:07:52,431 INFO Saving best model at Epoch 2
+ 2025-04-19 04:08:05,561 INFO Epoch:3 train_loss:1.00044
+ 2025-04-19 04:08:10,667 INFO Epoch:3 val_res:0.552381
+ 2025-04-19 04:08:20,574 INFO Epoch:4 train_loss:0.93713
+ 2025-04-19 04:08:26,069 INFO Epoch:4 val_res:0.742857
+ 2025-04-19 04:08:26,070 INFO Saving best model at Epoch 4
+ 2025-04-19 04:08:40,943 INFO Epoch:5 train_loss:0.77391
+ 2025-04-19 04:08:45,824 INFO Epoch:5 val_res:0.685714
+ 2025-04-19 04:08:55,774 INFO Epoch:6 train_loss:0.73917
+ 2025-04-19 04:09:00,828 INFO Epoch:6 val_res:0.723810
+ 2025-04-19 04:09:11,159 INFO Epoch:7 train_loss:0.65520
+ 2025-04-19 04:09:16,636 INFO Epoch:7 val_res:0.761905
+ 2025-04-19 04:09:16,636 INFO Saving best model at Epoch 7
+ 2025-04-19 04:09:29,546 INFO Epoch:8 train_loss:0.61123
+ 2025-04-19 04:09:35,331 INFO Epoch:8 val_res:0.742857
+ 2025-04-19 04:09:46,027 INFO Epoch:9 train_loss:0.59487
+ 2025-04-19 04:09:51,676 INFO Epoch:9 val_res:0.790476
+ 2025-04-19 04:09:51,676 INFO Saving best model at Epoch 9
+ 2025-04-19 04:10:05,613 INFO Epoch:10 train_loss:0.52868
+ 2025-04-19 04:10:11,031 INFO Epoch:10 val_res:0.771429
+ 2025-04-19 04:10:21,310 INFO Epoch:11 train_loss:0.48344
+ 2025-04-19 04:10:26,767 INFO Epoch:11 val_res:0.800000
+ 2025-04-19 04:10:26,767 INFO Saving best model at Epoch 11
+ 2025-04-19 04:10:40,613 INFO Epoch:12 train_loss:0.46746
+ 2025-04-19 04:10:45,904 INFO Epoch:12 val_res:0.800000
+ 2025-04-19 04:10:56,594 INFO Epoch:13 train_loss:0.43783
+ 2025-04-19 04:11:01,844 INFO Epoch:13 val_res:0.790476
+ 2025-04-19 04:11:12,707 INFO Epoch:14 train_loss:0.39908
+ 2025-04-19 04:11:17,978 INFO Epoch:14 val_res:0.800000
+ 2025-04-19 04:11:28,705 INFO Epoch:15 train_loss:0.37672
+ 2025-04-19 04:11:34,134 INFO Epoch:15 val_res:0.819048
+ 2025-04-19 04:11:34,134 INFO Saving best model at Epoch 15
+ 2025-04-19 04:11:47,257 INFO Epoch:16 train_loss:0.35163
+ 2025-04-19 04:11:52,283 INFO Epoch:16 val_res:0.828571
+ 2025-04-19 04:11:52,283 INFO Saving best model at Epoch 16
+ 2025-04-19 04:12:05,078 INFO Epoch:17 train_loss:0.33205
+ 2025-04-19 04:12:10,132 INFO Epoch:17 val_res:0.819048
+ 2025-04-19 04:12:20,482 INFO Epoch:18 train_loss:0.30715
+ 2025-04-19 04:12:25,760 INFO Epoch:18 val_res:0.828571
+ 2025-04-19 04:12:36,410 INFO Epoch:19 train_loss:0.29065
+ 2025-04-19 04:12:41,355 INFO Epoch:19 val_res:0.809524
+ 2025-04-19 04:12:51,712 INFO Epoch:20 train_loss:0.27594
+ 2025-04-19 04:12:57,011 INFO Epoch:20 val_res:0.800000
+ 2025-04-19 04:13:08,078 INFO Epoch:21 train_loss:0.26882
+ 2025-04-19 04:13:13,505 INFO Epoch:21 val_res:0.828571
+ 2025-04-19 04:13:23,719 INFO Epoch:22 train_loss:0.23536
+ 2025-04-19 04:13:28,826 INFO Epoch:22 val_res:0.800000
+ 2025-04-19 04:13:39,214 INFO Epoch:23 train_loss:0.22374
+ 2025-04-19 04:13:44,226 INFO Epoch:23 val_res:0.828571
+ 2025-04-19 04:13:54,686 INFO Epoch:24 train_loss:0.21201
+ 2025-04-19 04:14:00,938 INFO Epoch:24 val_res:0.809524
+ 2025-04-19 04:14:11,923 INFO Epoch:25 train_loss:0.20243
+ 2025-04-19 04:14:17,557 INFO Epoch:25 val_res:0.828571
+ 2025-04-19 04:14:28,816 INFO Epoch:26 train_loss:0.17748
+ 2025-04-19 04:14:34,260 INFO Epoch:26 val_res:0.819048
+ 2025-04-19 04:14:44,947 INFO Epoch:27 train_loss:0.17828
+ 2025-04-19 04:14:50,159 INFO Epoch:27 val_res:0.838095
+ 2025-04-19 04:14:50,159 INFO Saving best model at Epoch 27
+ 2025-04-19 04:15:03,264 INFO Epoch:28 train_loss:0.15906
+ 2025-04-19 04:15:08,513 INFO Epoch:28 val_res:0.838095
+ 2025-04-19 04:15:21,592 INFO Epoch:29 train_loss:0.15265
+ 2025-04-19 04:15:27,054 INFO Epoch:29 val_res:0.838095
+ 2025-04-19 04:15:38,317 INFO Epoch:30 train_loss:0.14531
+ 2025-04-19 04:15:43,782 INFO Epoch:30 val_res:0.819048
+ 2025-04-19 04:15:54,807 INFO Epoch:31 train_loss:0.12771
+ 2025-04-19 04:16:00,007 INFO Epoch:31 val_res:0.828571
+ 2025-04-19 04:16:11,550 INFO Epoch:32 train_loss:0.12039
+ 2025-04-19 04:16:17,028 INFO Epoch:32 val_res:0.828571
+ 2025-04-19 04:16:27,979 INFO Epoch:33 train_loss:0.10778
+ 2025-04-19 04:16:33,437 INFO Epoch:33 val_res:0.828571
+ 2025-04-19 04:16:44,655 INFO Epoch:34 train_loss:0.10372
+ 2025-04-19 04:16:50,128 INFO Epoch:34 val_res:0.838095
+ 2025-04-19 04:17:01,494 INFO Epoch:35 train_loss:0.09483
+ 2025-04-19 04:17:06,788 INFO Epoch:35 val_res:0.828571
+ 2025-04-19 04:17:18,230 INFO Epoch:36 train_loss:0.09487
+ 2025-04-19 04:17:23,691 INFO Epoch:36 val_res:0.828571
+ 2025-04-19 04:17:35,288 INFO Epoch:37 train_loss:0.09211
+ 2025-04-19 04:17:40,417 INFO Epoch:37 val_res:0.828571
+ 2025-04-19 04:17:51,908 INFO Epoch:38 train_loss:0.07836
+ 2025-04-19 04:17:57,044 INFO Epoch:38 val_res:0.838095
+ 2025-04-19 04:18:07,918 INFO Epoch:39 train_loss:0.06967
+ 2025-04-19 04:18:13,246 INFO Epoch:39 val_res:0.819048
+ 2025-04-19 04:18:24,780 INFO Epoch:40 train_loss:0.06636
+ 2025-04-19 04:18:30,638 INFO Epoch:40 val_res:0.819048
+ 2025-04-19 04:18:41,916 INFO Epoch:41 train_loss:0.06588
+ 2025-04-19 04:18:47,293 INFO Epoch:41 val_res:0.828571
+ 2025-04-19 04:18:58,318 INFO Epoch:42 train_loss:0.05647
+ 2025-04-19 04:19:03,745 INFO Epoch:42 val_res:0.819048
+ 2025-04-19 04:19:15,597 INFO Epoch:43 train_loss:0.05239
+ 2025-04-19 04:19:20,853 INFO Epoch:43 val_res:0.828571
+ 2025-04-19 04:19:32,594 INFO Epoch:44 train_loss:0.04980
+ 2025-04-19 04:19:37,909 INFO Epoch:44 val_res:0.828571
+ 2025-04-19 04:19:49,004 INFO Epoch:45 train_loss:0.04698
+ 2025-04-19 04:19:54,178 INFO Epoch:45 val_res:0.819048
+ 2025-04-19 04:20:05,007 INFO Epoch:46 train_loss:0.04317
+ 2025-04-19 04:20:10,096 INFO Epoch:46 val_res:0.828571
+ 2025-04-19 04:20:21,179 INFO Epoch:47 train_loss:0.04270
+ 2025-04-19 04:20:26,433 INFO Epoch:47 val_res:0.828571
+ 2025-04-19 04:20:36,879 INFO Epoch:48 train_loss:0.03995
+ 2025-04-19 04:20:41,984 INFO Epoch:48 val_res:0.828571
+ 2025-04-19 04:20:52,896 INFO Epoch:49 train_loss:0.03744
+ 2025-04-19 04:20:57,942 INFO Epoch:49 val_res:0.828571
+ 2025-04-19 04:21:08,409 INFO Epoch:50 train_loss:0.03656
+ 2025-04-19 04:21:13,340 INFO Epoch:50 val_res:0.828571
+ 2025-04-19 04:21:24,112 INFO Epoch:51 train_loss:0.03274
+ 2025-04-19 04:21:29,394 INFO Epoch:51 val_res:0.819048
+ 2025-04-19 04:21:40,494 INFO Epoch:52 train_loss:0.02987
+ 2025-04-19 04:21:46,047 INFO Epoch:52 val_res:0.809524
+ 2025-04-19 04:21:57,747 INFO Epoch:53 train_loss:0.03041
+ 2025-04-19 04:22:03,187 INFO Epoch:53 val_res:0.819048
+ 2025-04-19 04:22:13,893 INFO Epoch:54 train_loss:0.02775
+ 2025-04-19 04:22:19,245 INFO Epoch:54 val_res:0.819048
+ 2025-04-19 04:22:30,004 INFO Epoch:55 train_loss:0.02792
+ 2025-04-19 04:22:35,117 INFO Epoch:55 val_res:0.819048
+ 2025-04-19 04:22:46,301 INFO Epoch:56 train_loss:0.02554
+ 2025-04-19 04:22:51,684 INFO Epoch:56 val_res:0.828571
+ 2025-04-19 04:23:03,137 INFO Epoch:57 train_loss:0.02422
+ 2025-04-19 04:23:08,363 INFO Epoch:57 val_res:0.809524
+ 2025-04-19 04:23:20,597 INFO Epoch:58 train_loss:0.02448
+ 2025-04-19 04:23:25,970 INFO Epoch:58 val_res:0.809524
+ 2025-04-19 04:23:37,592 INFO Epoch:59 train_loss:0.02383
+ 2025-04-19 04:23:42,653 INFO Epoch:59 val_res:0.819048
+ 2025-04-19 04:23:53,823 INFO Epoch:60 train_loss:0.02276
+ 2025-04-19 04:23:58,941 INFO Epoch:60 val_res:0.828571
+ 2025-04-19 04:24:10,052 INFO Epoch:61 train_loss:0.02232
+ 2025-04-19 04:24:15,066 INFO Epoch:61 val_res:0.828571
+ 2025-04-19 04:24:25,649 INFO Epoch:62 train_loss:0.02243
+ 2025-04-19 04:24:30,632 INFO Epoch:62 val_res:0.819048
+ 2025-04-19 04:24:41,632 INFO Epoch:63 train_loss:0.02103
+ 2025-04-19 04:24:46,663 INFO Epoch:63 val_res:0.828571
+ 2025-04-19 04:24:57,444 INFO Epoch:64 train_loss:0.02037
+ 2025-04-19 04:25:02,501 INFO Epoch:64 val_res:0.819048
+ 2025-04-19 04:25:13,255 INFO Epoch:65 train_loss:0.02047
+ 2025-04-19 04:25:18,271 INFO Epoch:65 val_res:0.819048
+ 2025-04-19 04:25:31,635 INFO Epoch:66 train_loss:0.01881
+ 2025-04-19 04:25:36,977 INFO Epoch:66 val_res:0.819048
+ 2025-04-19 04:25:47,631 INFO Epoch:67 train_loss:0.01795
+ 2025-04-19 04:25:52,557 INFO Epoch:67 val_res:0.819048
+ 2025-04-19 04:26:03,465 INFO Epoch:68 train_loss:0.01733
+ 2025-04-19 04:26:08,413 INFO Epoch:68 val_res:0.809524
+ 2025-04-19 04:26:20,026 INFO Epoch:69 train_loss:0.01720
+ 2025-04-19 04:26:25,748 INFO Epoch:69 val_res:0.819048
+ 2025-04-19 04:26:37,248 INFO Epoch:70 train_loss:0.01672
+ 2025-04-19 04:26:42,801 INFO Epoch:70 val_res:0.819048
+ 2025-04-19 04:26:54,448 INFO Epoch:71 train_loss:0.01631
+ 2025-04-19 04:26:59,830 INFO Epoch:71 val_res:0.809524
+ 2025-04-19 04:27:10,908 INFO Epoch:72 train_loss:0.01544
+ 2025-04-19 04:27:15,940 INFO Epoch:72 val_res:0.809524
+ 2025-04-19 04:27:26,785 INFO Epoch:73 train_loss:0.01552
+ 2025-04-19 04:27:31,725 INFO Epoch:73 val_res:0.819048
+ 2025-04-19 04:27:42,621 INFO Epoch:74 train_loss:0.01578
+ 2025-04-19 04:27:47,627 INFO Epoch:74 val_res:0.819048
+ 2025-04-19 04:27:58,371 INFO Epoch:75 train_loss:0.01469
+ 2025-04-19 04:28:03,839 INFO Epoch:75 val_res:0.809524
+ 2025-04-19 04:28:15,439 INFO Epoch:76 train_loss:0.01454
+ 2025-04-19 04:28:20,682 INFO Epoch:76 val_res:0.809524
+ 2025-04-19 04:28:32,873 INFO Epoch:77 train_loss:0.01468
+ 2025-04-19 04:28:38,716 INFO Epoch:77 val_res:0.809524
+ 2025-04-19 04:28:50,666 INFO Epoch:78 train_loss:0.01445
+ 2025-04-19 04:28:55,784 INFO Epoch:78 val_res:0.819048
+ 2025-04-19 04:29:06,783 INFO Epoch:79 train_loss:0.01457
+ 2025-04-19 04:29:11,637 INFO Epoch:79 val_res:0.819048
+ 2025-04-19 04:29:23,069 INFO Epoch:80 train_loss:0.01342
+ 2025-04-19 04:29:28,186 INFO Epoch:80 val_res:0.819048
+ 2025-04-19 04:29:39,226 INFO Epoch:81 train_loss:0.01392
+ 2025-04-19 04:29:44,840 INFO Epoch:81 val_res:0.819048
+ 2025-04-19 04:29:56,590 INFO Epoch:82 train_loss:0.01361
+ 2025-04-19 04:30:01,663 INFO Epoch:82 val_res:0.819048
+ 2025-04-19 04:30:14,720 INFO Epoch:83 train_loss:0.01345
+ 2025-04-19 04:30:21,042 INFO Epoch:83 val_res:0.819048
+ 2025-04-19 04:30:32,340 INFO Epoch:84 train_loss:0.01321
+ 2025-04-19 04:30:37,387 INFO Epoch:84 val_res:0.809524
+ 2025-04-19 04:30:48,391 INFO Epoch:85 train_loss:0.01316
+ 2025-04-19 04:30:53,961 INFO Epoch:85 val_res:0.819048
+ 2025-04-19 04:31:06,125 INFO Epoch:86 train_loss:0.01257
+ 2025-04-19 04:31:12,059 INFO Epoch:86 val_res:0.819048
+ 2025-04-19 04:31:25,191 INFO Epoch:87 train_loss:0.01234
+ 2025-04-19 04:31:30,665 INFO Epoch:87 val_res:0.819048
+ 2025-04-19 04:31:41,904 INFO Epoch:88 train_loss:0.01236
+ 2025-04-19 04:31:49,031 INFO Epoch:88 val_res:0.809524
+ 2025-04-19 04:32:00,985 INFO Epoch:89 train_loss:0.01229
+ 2025-04-19 04:32:06,592 INFO Epoch:89 val_res:0.809524
+ 2025-04-19 04:32:20,335 INFO Epoch:90 train_loss:0.01214
+ 2025-04-19 04:32:25,597 INFO Epoch:90 val_res:0.809524
+ 2025-04-19 04:32:36,888 INFO Epoch:91 train_loss:0.01165
+ 2025-04-19 04:32:42,294 INFO Epoch:91 val_res:0.809524
+ 2025-04-19 04:32:57,072 INFO Epoch:92 train_loss:0.01149
198
+ 2025-04-19 04:33:02,740 INFO Epoch:92 val_res:0.809524
199
+ 2025-04-19 04:33:15,993 INFO Epoch:93 train_loss:0.01121
200
+ 2025-04-19 04:33:21,325 INFO Epoch:93 val_res:0.809524
201
+ 2025-04-19 04:33:34,305 INFO Epoch:94 train_loss:0.01164
202
+ 2025-04-19 04:33:40,145 INFO Epoch:94 val_res:0.819048
203
+ 2025-04-19 04:33:54,227 INFO Epoch:95 train_loss:0.01101
204
+ 2025-04-19 04:34:00,106 INFO Epoch:95 val_res:0.809524
205
+ 2025-04-19 04:34:11,688 INFO Epoch:96 train_loss:0.01099
206
+ 2025-04-19 04:34:17,683 INFO Epoch:96 val_res:0.809524
207
+ 2025-04-19 04:34:32,261 INFO Epoch:97 train_loss:0.01085
208
+ 2025-04-19 04:34:38,236 INFO Epoch:97 val_res:0.809524
209
+ 2025-04-19 04:34:51,108 INFO Epoch:98 train_loss:0.01092
210
+ 2025-04-19 04:34:56,412 INFO Epoch:98 val_res:0.809524
211
+ 2025-04-19 04:35:07,600 INFO Epoch:99 train_loss:0.01068
212
+ 2025-04-19 04:35:13,438 INFO Epoch:99 val_res:0.809524
213
+ 2025-04-19 04:35:13,967 INFO =====================================
214
+ 2025-04-19 04:35:13,968 INFO Start testing...
215
+ 2025-04-19 04:35:13,968 INFO =====================================
216
+ 2025-04-19 04:35:23,848 INFO Incremental step 0 Testing res: 0.778846
217
+ 2025-04-19 04:35:23,850 INFO Incremental step: 1
218
+ 2025-04-19 04:35:53,902 INFO Epoch:0 train_loss:2.87659
+ 2025-04-19 04:36:00,605 INFO Epoch:0 val_res:0.399061
+ 2025-04-19 04:36:00,605 INFO Saving best model at Epoch 0
+ 2025-04-19 04:36:26,503 INFO Epoch:1 train_loss:2.49178
+ 2025-04-19 04:36:32,993 INFO Epoch:1 val_res:0.483568
+ 2025-04-19 04:36:32,994 INFO Saving best model at Epoch 1
+ 2025-04-19 04:36:58,149 INFO Epoch:2 train_loss:2.08639
+ 2025-04-19 04:37:04,528 INFO Epoch:2 val_res:0.488263
+ 2025-04-19 04:37:04,528 INFO Saving best model at Epoch 2
+ 2025-04-19 04:37:28,234 INFO Epoch:3 train_loss:1.91849
+ 2025-04-19 04:37:34,577 INFO Epoch:3 val_res:0.507042
+ 2025-04-19 04:37:34,577 INFO Saving best model at Epoch 3
+ 2025-04-19 04:38:00,495 INFO Epoch:4 train_loss:1.74276
+ 2025-04-19 04:38:06,583 INFO Epoch:4 val_res:0.568075
+ 2025-04-19 04:38:06,583 INFO Saving best model at Epoch 4
+ 2025-04-19 04:38:30,997 INFO Epoch:5 train_loss:1.69453
+ 2025-04-19 04:38:36,689 INFO Epoch:5 val_res:0.535211
+ 2025-04-19 04:39:00,311 INFO Epoch:6 train_loss:1.59656
+ 2025-04-19 04:39:06,101 INFO Epoch:6 val_res:0.553991
+ 2025-04-19 04:39:28,517 INFO Epoch:7 train_loss:1.52932
+ 2025-04-19 04:39:34,405 INFO Epoch:7 val_res:0.615023
+ 2025-04-19 04:39:34,406 INFO Saving best model at Epoch 7
+ 2025-04-19 04:40:00,329 INFO Epoch:8 train_loss:1.47690
+ 2025-04-19 04:40:06,573 INFO Epoch:8 val_res:0.596244
+ 2025-04-19 04:40:30,224 INFO Epoch:9 train_loss:1.45488
+ 2025-04-19 04:40:36,365 INFO Epoch:9 val_res:0.615023
+ 2025-04-19 04:40:58,455 INFO Epoch:10 train_loss:1.41195
+ 2025-04-19 04:41:04,734 INFO Epoch:10 val_res:0.586854
+ 2025-04-19 04:41:27,533 INFO Epoch:11 train_loss:1.37424
+ 2025-04-19 04:41:34,133 INFO Epoch:11 val_res:0.582160
+ 2025-04-19 04:41:58,395 INFO Epoch:12 train_loss:1.35671
+ 2025-04-19 04:42:04,503 INFO Epoch:12 val_res:0.582160
+ 2025-04-19 04:42:26,057 INFO Epoch:13 train_loss:1.33651
+ 2025-04-19 04:42:32,035 INFO Epoch:13 val_res:0.629108
+ 2025-04-19 04:42:32,036 INFO Saving best model at Epoch 13
+ 2025-04-19 04:42:57,012 INFO Epoch:14 train_loss:1.30658
+ 2025-04-19 04:43:02,977 INFO Epoch:14 val_res:0.624413
+ 2025-04-19 04:43:27,890 INFO Epoch:15 train_loss:1.29354
+ 2025-04-19 04:43:34,531 INFO Epoch:15 val_res:0.624413
+ 2025-04-19 04:43:56,974 INFO Epoch:16 train_loss:1.26306
+ 2025-04-19 04:44:03,610 INFO Epoch:16 val_res:0.661972
+ 2025-04-19 04:44:03,610 INFO Saving best model at Epoch 16
+ 2025-04-19 04:44:28,145 INFO Epoch:17 train_loss:1.25631
+ 2025-04-19 04:44:36,189 INFO Epoch:17 val_res:0.638498
+ 2025-04-19 04:44:59,391 INFO Epoch:18 train_loss:1.24186
+ 2025-04-19 04:45:05,336 INFO Epoch:18 val_res:0.647887
+ 2025-04-19 04:45:30,886 INFO Epoch:19 train_loss:1.22242
+ 2025-04-19 04:45:37,967 INFO Epoch:19 val_res:0.652582
+ 2025-04-19 04:46:00,525 INFO Epoch:20 train_loss:1.21569
+ 2025-04-19 04:46:07,647 INFO Epoch:20 val_res:0.633803
+ 2025-04-19 04:46:31,420 INFO Epoch:21 train_loss:1.20165
+ 2025-04-19 04:46:37,592 INFO Epoch:21 val_res:0.647887
+ 2025-04-19 04:47:01,663 INFO Epoch:22 train_loss:1.18124
+ 2025-04-19 04:47:08,775 INFO Epoch:22 val_res:0.647887
+ 2025-04-19 04:47:32,825 INFO Epoch:23 train_loss:1.16940
+ 2025-04-19 04:47:39,997 INFO Epoch:23 val_res:0.633803
+ 2025-04-19 04:48:03,925 INFO Epoch:24 train_loss:1.15707
+ 2025-04-19 04:48:10,242 INFO Epoch:24 val_res:0.647887
+ 2025-04-19 04:48:31,357 INFO Epoch:25 train_loss:1.13481
+ 2025-04-19 04:48:37,371 INFO Epoch:25 val_res:0.647887
+ 2025-04-19 04:49:01,705 INFO Epoch:26 train_loss:1.13422
+ 2025-04-19 04:49:08,468 INFO Epoch:26 val_res:0.643192
+ 2025-04-19 04:49:33,860 INFO Epoch:27 train_loss:1.12034
+ 2025-04-19 04:49:40,481 INFO Epoch:27 val_res:0.638498
+ 2025-04-19 04:50:02,700 INFO Epoch:28 train_loss:1.10088
+ 2025-04-19 04:50:09,162 INFO Epoch:28 val_res:0.647887
+ 2025-04-19 04:50:33,673 INFO Epoch:29 train_loss:1.09147
+ 2025-04-19 04:50:40,713 INFO Epoch:29 val_res:0.652582
+ 2025-04-19 04:51:05,590 INFO Epoch:30 train_loss:1.09043
+ 2025-04-19 04:51:12,641 INFO Epoch:30 val_res:0.643192
+ 2025-04-19 04:51:36,958 INFO Epoch:31 train_loss:1.07506
+ 2025-04-19 04:51:43,248 INFO Epoch:31 val_res:0.652582
+ 2025-04-19 04:52:07,363 INFO Epoch:32 train_loss:1.06859
+ 2025-04-19 04:52:14,710 INFO Epoch:32 val_res:0.652582
+ 2025-04-19 04:52:40,388 INFO Epoch:33 train_loss:1.05373
+ 2025-04-19 04:52:47,520 INFO Epoch:33 val_res:0.647887
+ 2025-04-19 04:53:13,113 INFO Epoch:34 train_loss:1.03862
+ 2025-04-19 04:53:20,825 INFO Epoch:34 val_res:0.647887
+ 2025-04-19 04:53:44,452 INFO Epoch:35 train_loss:1.02759
+ 2025-04-19 04:53:51,990 INFO Epoch:35 val_res:0.652582
+ 2025-04-19 04:54:16,157 INFO Epoch:36 train_loss:1.02545
+ 2025-04-19 04:54:23,837 INFO Epoch:36 val_res:0.661972
+ 2025-04-19 04:54:47,536 INFO Epoch:37 train_loss:1.02286
+ 2025-04-19 04:54:55,120 INFO Epoch:37 val_res:0.647887
+ 2025-04-19 04:55:20,024 INFO Epoch:38 train_loss:1.01146
+ 2025-04-19 04:55:28,647 INFO Epoch:38 val_res:0.676056
+ 2025-04-19 04:55:28,647 INFO Saving best model at Epoch 38
+ 2025-04-19 04:55:55,808 INFO Epoch:39 train_loss:0.99833
+ 2025-04-19 04:56:03,657 INFO Epoch:39 val_res:0.671362
+ 2025-04-19 04:56:30,150 INFO Epoch:40 train_loss:0.99172
+ 2025-04-19 04:56:38,249 INFO Epoch:40 val_res:0.666667
+ 2025-04-19 04:57:03,738 INFO Epoch:41 train_loss:0.97500
+ 2025-04-19 04:57:10,548 INFO Epoch:41 val_res:0.657277
+ 2025-04-19 04:57:35,931 INFO Epoch:42 train_loss:0.97483
+ 2025-04-19 04:57:43,440 INFO Epoch:42 val_res:0.643192
+ 2025-04-19 04:58:06,863 INFO Epoch:43 train_loss:0.96685
+ 2025-04-19 04:58:13,337 INFO Epoch:43 val_res:0.661972
+ 2025-04-19 04:58:37,132 INFO Epoch:44 train_loss:0.95945
+ 2025-04-19 04:58:44,179 INFO Epoch:44 val_res:0.647887
+ 2025-04-19 04:59:09,746 INFO Epoch:45 train_loss:0.95177
+ 2025-04-19 04:59:17,392 INFO Epoch:45 val_res:0.657277
+ 2025-04-19 04:59:42,917 INFO Epoch:46 train_loss:0.93911
+ 2025-04-19 04:59:49,547 INFO Epoch:46 val_res:0.657277
+ 2025-04-19 05:00:15,416 INFO Epoch:47 train_loss:0.93742
+ 2025-04-19 05:00:24,089 INFO Epoch:47 val_res:0.643192
+ 2025-04-19 05:00:51,476 INFO Epoch:48 train_loss:0.93398
+ 2025-04-19 05:00:59,974 INFO Epoch:48 val_res:0.666667
+ 2025-04-19 05:01:26,184 INFO Epoch:49 train_loss:0.92497
+ 2025-04-19 05:01:34,426 INFO Epoch:49 val_res:0.652582
+ 2025-04-19 05:02:00,782 INFO Epoch:50 train_loss:0.91717
+ 2025-04-19 05:02:09,023 INFO Epoch:50 val_res:0.666667
+ 2025-04-19 05:02:36,706 INFO Epoch:51 train_loss:0.91261
+ 2025-04-19 05:02:43,401 INFO Epoch:51 val_res:0.652582
+ 2025-04-19 05:03:10,428 INFO Epoch:52 train_loss:0.89786
+ 2025-04-19 05:03:17,675 INFO Epoch:52 val_res:0.657277
+ 2025-04-19 05:03:42,906 INFO Epoch:53 train_loss:0.90636
+ 2025-04-19 05:03:50,534 INFO Epoch:53 val_res:0.657277
+ 2025-04-19 05:04:18,081 INFO Epoch:54 train_loss:0.90125
+ 2025-04-19 05:04:27,362 INFO Epoch:54 val_res:0.657277
+ 2025-04-19 05:04:53,831 INFO Epoch:55 train_loss:0.88390
+ 2025-04-19 05:05:02,135 INFO Epoch:55 val_res:0.657277
+ 2025-04-19 05:05:31,278 INFO Epoch:56 train_loss:0.88072
+ 2025-04-19 05:05:38,796 INFO Epoch:56 val_res:0.666667
+ 2025-04-19 05:06:05,109 INFO Epoch:57 train_loss:0.86665
+ 2025-04-19 05:06:13,291 INFO Epoch:57 val_res:0.657277
+ 2025-04-19 05:06:40,100 INFO Epoch:58 train_loss:0.86555
+ 2025-04-19 05:06:48,050 INFO Epoch:58 val_res:0.652582
+ 2025-04-19 05:07:14,682 INFO Epoch:59 train_loss:0.86187
+ 2025-04-19 05:07:22,794 INFO Epoch:59 val_res:0.666667
+ 2025-04-19 05:07:49,474 INFO Epoch:60 train_loss:0.85403
+ 2025-04-19 05:07:57,732 INFO Epoch:60 val_res:0.652582
+ 2025-04-19 05:08:23,177 INFO Epoch:61 train_loss:0.85392
+ 2025-04-19 05:08:31,495 INFO Epoch:61 val_res:0.666667
+ 2025-04-19 05:08:58,439 INFO Epoch:62 train_loss:0.84927
+ 2025-04-19 05:09:06,017 INFO Epoch:62 val_res:0.657277
+ 2025-04-19 05:09:31,537 INFO Epoch:63 train_loss:0.84145
+ 2025-04-19 05:09:38,699 INFO Epoch:63 val_res:0.671362
+ 2025-04-19 05:10:05,458 INFO Epoch:64 train_loss:0.84488
+ 2025-04-19 05:10:13,604 INFO Epoch:64 val_res:0.657277
+ 2025-04-19 05:10:39,343 INFO Epoch:65 train_loss:0.84491
+ 2025-04-19 05:10:46,791 INFO Epoch:65 val_res:0.657277
+ 2025-04-19 05:11:11,645 INFO Epoch:66 train_loss:0.83197
+ 2025-04-19 05:11:19,564 INFO Epoch:66 val_res:0.661972
+ 2025-04-19 05:11:44,555 INFO Epoch:67 train_loss:0.83615
+ 2025-04-19 05:11:51,284 INFO Epoch:67 val_res:0.657277
+ 2025-04-19 05:12:17,123 INFO Epoch:68 train_loss:0.82324
+ 2025-04-19 05:12:25,738 INFO Epoch:68 val_res:0.657277
+ 2025-04-19 05:12:50,748 INFO Epoch:69 train_loss:0.81473
+ 2025-04-19 05:12:59,173 INFO Epoch:69 val_res:0.657277
+ 2025-04-19 05:13:27,317 INFO Epoch:70 train_loss:0.81410
+ 2025-04-19 05:13:34,969 INFO Epoch:70 val_res:0.666667
+ 2025-04-19 05:14:00,177 INFO Epoch:71 train_loss:0.79444
+ 2025-04-19 05:14:08,009 INFO Epoch:71 val_res:0.661972
+ 2025-04-19 05:14:33,916 INFO Epoch:72 train_loss:0.79974
+ 2025-04-19 05:14:41,399 INFO Epoch:72 val_res:0.661972
+ 2025-04-19 05:15:08,015 INFO Epoch:73 train_loss:0.79409
+ 2025-04-19 05:15:16,674 INFO Epoch:73 val_res:0.657277
+ 2025-04-19 05:15:41,448 INFO Epoch:74 train_loss:0.78789
+ 2025-04-19 05:15:49,130 INFO Epoch:74 val_res:0.671362
+ 2025-04-19 05:16:17,273 INFO Epoch:75 train_loss:0.77231
+ 2025-04-19 05:16:25,035 INFO Epoch:75 val_res:0.661972
+ 2025-04-19 05:16:49,375 INFO Epoch:76 train_loss:0.77250
+ 2025-04-19 05:16:57,720 INFO Epoch:76 val_res:0.666667
+ 2025-04-19 05:17:24,770 INFO Epoch:77 train_loss:0.77266
+ 2025-04-19 05:17:31,656 INFO Epoch:77 val_res:0.671362
+ 2025-04-19 05:17:57,962 INFO Epoch:78 train_loss:0.77180
+ 2025-04-19 05:18:06,414 INFO Epoch:78 val_res:0.676056
+ 2025-04-19 05:18:32,270 INFO Epoch:79 train_loss:0.76215
+ 2025-04-19 05:18:39,708 INFO Epoch:79 val_res:0.657277
+ 2025-04-19 05:19:05,348 INFO Epoch:80 train_loss:0.76284
+ 2025-04-19 05:19:12,466 INFO Epoch:80 val_res:0.661972
+ 2025-04-19 05:19:38,441 INFO Epoch:81 train_loss:0.76598
+ 2025-04-19 05:19:46,633 INFO Epoch:81 val_res:0.661972
+ 2025-04-19 05:20:14,697 INFO Epoch:82 train_loss:0.75792
+ 2025-04-19 05:20:21,201 INFO Epoch:82 val_res:0.671362
+ 2025-04-19 05:20:44,985 INFO Epoch:83 train_loss:0.76565
+ 2025-04-19 05:20:53,634 INFO Epoch:83 val_res:0.676056
+ 2025-04-19 05:21:19,665 INFO Epoch:84 train_loss:0.76576
+ 2025-04-19 05:21:27,691 INFO Epoch:84 val_res:0.661972
+ 2025-04-19 05:21:53,456 INFO Epoch:85 train_loss:0.75056
+ 2025-04-19 05:22:01,419 INFO Epoch:85 val_res:0.661972
+ 2025-04-19 05:22:26,536 INFO Epoch:86 train_loss:0.74927
+ 2025-04-19 05:22:32,928 INFO Epoch:86 val_res:0.661972
+ 2025-04-19 05:22:59,981 INFO Epoch:87 train_loss:0.73571
+ 2025-04-19 05:23:07,038 INFO Epoch:87 val_res:0.676056
+ 2025-04-19 05:23:33,157 INFO Epoch:88 train_loss:0.74306
+ 2025-04-19 05:23:42,022 INFO Epoch:88 val_res:0.666667
+ 2025-04-19 05:24:08,549 INFO Epoch:89 train_loss:0.78038
+ 2025-04-19 05:24:15,896 INFO Epoch:89 val_res:0.690141
+ 2025-04-19 05:24:15,896 INFO Saving best model at Epoch 89
+ 2025-04-19 05:24:44,019 INFO Epoch:90 train_loss:0.78606
+ 2025-04-19 05:24:51,696 INFO Epoch:90 val_res:0.690141
+ 2025-04-19 05:25:19,169 INFO Epoch:91 train_loss:0.79329
+ 2025-04-19 05:25:26,965 INFO Epoch:91 val_res:0.671362
+ 2025-04-19 05:25:53,651 INFO Epoch:92 train_loss:0.75971
+ 2025-04-19 05:26:01,756 INFO Epoch:92 val_res:0.676056
+ 2025-04-19 05:26:28,735 INFO Epoch:93 train_loss:0.76555
+ 2025-04-19 05:26:36,608 INFO Epoch:93 val_res:0.680751
+ 2025-04-19 05:27:04,958 INFO Epoch:94 train_loss:0.76633
+ 2025-04-19 05:27:11,992 INFO Epoch:94 val_res:0.680751
+ 2025-04-19 05:27:35,931 INFO Epoch:95 train_loss:0.77622
+ 2025-04-19 05:27:43,220 INFO Epoch:95 val_res:0.666667
+ 2025-04-19 05:28:08,339 INFO Epoch:96 train_loss:0.75844
+ 2025-04-19 05:28:16,109 INFO Epoch:96 val_res:0.671362
+ 2025-04-19 05:28:40,738 INFO Epoch:97 train_loss:0.74834
+ 2025-04-19 05:28:48,113 INFO Epoch:97 val_res:0.671362
+ 2025-04-19 05:29:13,674 INFO Epoch:98 train_loss:0.72671
+ 2025-04-19 05:29:21,756 INFO Epoch:98 val_res:0.676056
+ 2025-04-19 05:29:46,600 INFO Epoch:99 train_loss:0.72193
+ 2025-04-19 05:29:53,808 INFO Epoch:99 val_res:0.690141
+ 2025-04-19 05:29:54,247 INFO =====================================
+ 2025-04-19 05:29:54,248 INFO Start testing...
+ 2025-04-19 05:29:54,248 INFO =====================================
+ 2025-04-19 05:30:02,060 INFO Incremental step 1 Testing res: 0.628571
+ 2025-04-19 05:30:02,061 INFO forgetting: 0.346154
+ 2025-04-19 05:30:02,065 INFO Incremental step: 2
+ 2025-04-19 05:30:28,743 INFO Epoch:0 train_loss:3.39496
+ 2025-04-19 05:30:37,190 INFO Epoch:0 val_res:0.471154
+ 2025-04-19 05:30:37,191 INFO Saving best model at Epoch 0
+ 2025-04-19 05:31:04,740 INFO Epoch:1 train_loss:2.90468
+ 2025-04-19 05:31:13,320 INFO Epoch:1 val_res:0.467949
+ 2025-04-19 05:31:42,034 INFO Epoch:2 train_loss:2.39375
+ 2025-04-19 05:31:50,636 INFO Epoch:2 val_res:0.503205
+ 2025-04-19 05:31:50,637 INFO Saving best model at Epoch 2
+ 2025-04-19 05:32:18,349 INFO Epoch:3 train_loss:1.95143
+ 2025-04-19 05:32:27,606 INFO Epoch:3 val_res:0.500000
+ 2025-04-19 05:32:53,897 INFO Epoch:4 train_loss:1.73963
+ 2025-04-19 05:33:01,305 INFO Epoch:4 val_res:0.512821
+ 2025-04-19 05:33:01,305 INFO Saving best model at Epoch 4
+ 2025-04-19 05:33:27,309 INFO Epoch:5 train_loss:1.65191
+ 2025-04-19 05:33:35,122 INFO Epoch:5 val_res:0.538462
+ 2025-04-19 05:33:35,122 INFO Saving best model at Epoch 5
+ 2025-04-19 05:34:02,037 INFO Epoch:6 train_loss:1.50180
+ 2025-04-19 05:34:10,006 INFO Epoch:6 val_res:0.522436
+ 2025-04-19 05:34:36,232 INFO Epoch:7 train_loss:1.42775
+ 2025-04-19 05:34:45,846 INFO Epoch:7 val_res:0.544872
+ 2025-04-19 05:34:45,847 INFO Saving best model at Epoch 7
+ 2025-04-19 05:35:14,477 INFO Epoch:8 train_loss:1.38460
+ 2025-04-19 05:35:22,631 INFO Epoch:8 val_res:0.554487
+ 2025-04-19 05:35:22,631 INFO Saving best model at Epoch 8
+ 2025-04-19 05:35:51,035 INFO Epoch:9 train_loss:1.33024
+ 2025-04-19 05:35:59,700 INFO Epoch:9 val_res:0.541667
+ 2025-04-19 05:36:24,271 INFO Epoch:10 train_loss:1.27901
+ 2025-04-19 05:36:32,807 INFO Epoch:10 val_res:0.541667
+ 2025-04-19 05:36:59,839 INFO Epoch:11 train_loss:1.25141
+ 2025-04-19 05:37:08,419 INFO Epoch:11 val_res:0.548077
+ 2025-04-19 05:37:33,622 INFO Epoch:12 train_loss:1.21144
+ 2025-04-19 05:37:41,387 INFO Epoch:12 val_res:0.554487
+ 2025-04-19 05:38:07,619 INFO Epoch:13 train_loss:1.19474
+ 2025-04-19 05:38:16,403 INFO Epoch:13 val_res:0.557692
+ 2025-04-19 05:38:16,404 INFO Saving best model at Epoch 13
+ 2025-04-19 05:38:46,565 INFO Epoch:14 train_loss:1.17113
+ 2025-04-19 05:38:55,721 INFO Epoch:14 val_res:0.560897
+ 2025-04-19 05:38:55,722 INFO Saving best model at Epoch 14
+ 2025-04-19 05:39:23,223 INFO Epoch:15 train_loss:1.14082
+ 2025-04-19 05:39:32,085 INFO Epoch:15 val_res:0.573718
+ 2025-04-19 05:39:32,086 INFO Saving best model at Epoch 15
+ 2025-04-19 05:40:00,792 INFO Epoch:16 train_loss:1.13575
+ 2025-04-19 05:40:07,720 INFO Epoch:16 val_res:0.564103
+ 2025-04-19 05:40:34,313 INFO Epoch:17 train_loss:1.11016
+ 2025-04-19 05:40:42,602 INFO Epoch:17 val_res:0.554487
+ 2025-04-19 05:41:08,583 INFO Epoch:18 train_loss:1.09610
+ 2025-04-19 05:41:16,354 INFO Epoch:18 val_res:0.564103
+ 2025-04-19 05:41:42,164 INFO Epoch:19 train_loss:1.07779
+ 2025-04-19 05:41:50,433 INFO Epoch:19 val_res:0.560897
+ 2025-04-19 05:42:18,140 INFO Epoch:20 train_loss:1.06378
+ 2025-04-19 05:42:26,010 INFO Epoch:20 val_res:0.557692
+ 2025-04-19 05:42:52,621 INFO Epoch:21 train_loss:1.05497
+ 2025-04-19 05:43:01,675 INFO Epoch:21 val_res:0.564103
+ 2025-04-19 05:43:26,977 INFO Epoch:22 train_loss:1.03563
+ 2025-04-19 05:43:35,189 INFO Epoch:22 val_res:0.560897
+ 2025-04-19 05:44:01,220 INFO Epoch:23 train_loss:1.01866
+ 2025-04-19 05:44:10,494 INFO Epoch:23 val_res:0.564103
+ 2025-04-19 05:44:36,691 INFO Epoch:24 train_loss:1.01375
+ 2025-04-19 05:44:44,015 INFO Epoch:24 val_res:0.573718
+ 2025-04-19 05:45:10,976 INFO Epoch:25 train_loss:1.00169
+ 2025-04-19 05:45:18,956 INFO Epoch:25 val_res:0.560897
+ 2025-04-19 05:45:42,273 INFO Epoch:26 train_loss:0.99573
+ 2025-04-19 05:45:49,210 INFO Epoch:26 val_res:0.554487
+ 2025-04-19 05:46:12,558 INFO Epoch:27 train_loss:0.98392
+ 2025-04-19 05:46:20,577 INFO Epoch:27 val_res:0.570513
+ 2025-04-19 05:46:45,309 INFO Epoch:28 train_loss:0.97057
+ 2025-04-19 05:46:52,261 INFO Epoch:28 val_res:0.576923
+ 2025-04-19 05:46:52,261 INFO Saving best model at Epoch 28
+ 2025-04-19 05:47:19,078 INFO Epoch:29 train_loss:0.95906
+ 2025-04-19 05:47:27,805 INFO Epoch:29 val_res:0.573718
+ 2025-04-19 05:47:54,263 INFO Epoch:30 train_loss:0.95423
+ 2025-04-19 05:48:03,066 INFO Epoch:30 val_res:0.570513
+ 2025-04-19 05:48:30,642 INFO Epoch:31 train_loss:0.94521
+ 2025-04-19 05:48:39,422 INFO Epoch:31 val_res:0.567308
+ 2025-04-19 05:49:02,550 INFO Epoch:32 train_loss:0.93007
+ 2025-04-19 05:49:10,950 INFO Epoch:32 val_res:0.580128
+ 2025-04-19 05:49:10,953 INFO Saving best model at Epoch 32
+ 2025-04-19 05:49:37,473 INFO Epoch:33 train_loss:0.92772
+ 2025-04-19 05:49:45,318 INFO Epoch:33 val_res:0.586538
+ 2025-04-19 05:49:45,319 INFO Saving best model at Epoch 33
+ 2025-04-19 05:50:11,179 INFO Epoch:34 train_loss:0.91153
+ 2025-04-19 05:50:19,719 INFO Epoch:34 val_res:0.570513
+ 2025-04-19 05:50:42,310 INFO Epoch:35 train_loss:0.91071
+ 2025-04-19 05:50:49,136 INFO Epoch:35 val_res:0.580128
+ 2025-04-19 05:51:13,298 INFO Epoch:36 train_loss:0.89606
+ 2025-04-19 05:51:20,889 INFO Epoch:36 val_res:0.583333
+ 2025-04-19 05:51:43,869 INFO Epoch:37 train_loss:0.89081
+ 2025-04-19 05:51:51,102 INFO Epoch:37 val_res:0.583333
+ 2025-04-19 05:52:15,170 INFO Epoch:38 train_loss:0.86181
+ 2025-04-19 05:52:24,584 INFO Epoch:38 val_res:0.586538
+ 2025-04-19 05:52:49,739 INFO Epoch:39 train_loss:0.85872
+ 2025-04-19 05:52:57,095 INFO Epoch:39 val_res:0.589744
+ 2025-04-19 05:52:57,095 INFO Saving best model at Epoch 39
+ 2025-04-19 05:53:25,309 INFO Epoch:40 train_loss:0.86122
+ 2025-04-19 05:53:33,600 INFO Epoch:40 val_res:0.592949
+ 2025-04-19 05:53:33,601 INFO Saving best model at Epoch 40
+ 2025-04-19 05:54:00,869 INFO Epoch:41 train_loss:0.85885
+ 2025-04-19 05:54:08,836 INFO Epoch:41 val_res:0.592949
+ 2025-04-19 05:54:34,430 INFO Epoch:42 train_loss:0.84400
+ 2025-04-19 05:54:41,703 INFO Epoch:42 val_res:0.586538
+ 2025-04-19 05:55:06,612 INFO Epoch:43 train_loss:0.83154
+ 2025-04-19 05:55:14,354 INFO Epoch:43 val_res:0.592949
+ 2025-04-19 05:55:39,333 INFO Epoch:44 train_loss:0.83377
+ 2025-04-19 05:55:46,566 INFO Epoch:44 val_res:0.589744
+ 2025-04-19 05:56:11,992 INFO Epoch:45 train_loss:0.82829
+ 2025-04-19 05:56:20,118 INFO Epoch:45 val_res:0.596154
+ 2025-04-19 05:56:20,118 INFO Saving best model at Epoch 45
+ 2025-04-19 05:56:50,517 INFO Epoch:46 train_loss:0.82146
+ 2025-04-19 05:56:57,879 INFO Epoch:46 val_res:0.589744
+ 2025-04-19 05:57:22,727 INFO Epoch:47 train_loss:0.81898
+ 2025-04-19 05:57:30,974 INFO Epoch:47 val_res:0.602564
+ 2025-04-19 05:57:30,974 INFO Saving best model at Epoch 47
+ 2025-04-19 05:58:02,530 INFO Epoch:48 train_loss:0.81417
+ 2025-04-19 05:58:09,622 INFO Epoch:48 val_res:0.599359
+ 2025-04-19 05:58:32,399 INFO Epoch:49 train_loss:0.79968
+ 2025-04-19 05:58:39,577 INFO Epoch:49 val_res:0.589744
+ 2025-04-19 05:59:02,867 INFO Epoch:50 train_loss:0.79189
+ 2025-04-19 05:59:10,692 INFO Epoch:50 val_res:0.602564
+ 2025-04-19 05:59:33,667 INFO Epoch:51 train_loss:0.78620
+ 2025-04-19 05:59:41,491 INFO Epoch:51 val_res:0.602564
+ 2025-04-19 06:00:04,629 INFO Epoch:52 train_loss:0.78777
+ 2025-04-19 06:00:11,286 INFO Epoch:52 val_res:0.599359
+ 2025-04-19 06:00:35,373 INFO Epoch:53 train_loss:0.78290
+ 2025-04-19 06:00:42,397 INFO Epoch:53 val_res:0.602564
+ 2025-04-19 06:01:06,525 INFO Epoch:54 train_loss:0.76779
+ 2025-04-19 06:01:14,138 INFO Epoch:54 val_res:0.605769
+ 2025-04-19 06:01:14,139 INFO Saving best model at Epoch 54
+ 2025-04-19 06:01:39,655 INFO Epoch:55 train_loss:0.77010
+ 2025-04-19 06:01:47,572 INFO Epoch:55 val_res:0.599359
+ 2025-04-19 06:02:11,586 INFO Epoch:56 train_loss:0.77016
+ 2025-04-19 06:02:19,100 INFO Epoch:56 val_res:0.605769
+ 2025-04-19 06:02:43,595 INFO Epoch:57 train_loss:0.76770
+ 2025-04-19 06:02:50,301 INFO Epoch:57 val_res:0.605769
+ 2025-04-19 06:03:16,981 INFO Epoch:58 train_loss:0.76170
+ 2025-04-19 06:03:25,588 INFO Epoch:58 val_res:0.608974
+ 2025-04-19 06:03:25,588 INFO Saving best model at Epoch 58
+ 2025-04-19 06:03:49,874 INFO Epoch:59 train_loss:0.76463
+ 2025-04-19 06:03:57,441 INFO Epoch:59 val_res:0.612179
+ 2025-04-19 06:03:57,441 INFO Saving best model at Epoch 59
+ 2025-04-19 06:04:24,914 INFO Epoch:60 train_loss:0.74682
+ 2025-04-19 06:04:33,201 INFO Epoch:60 val_res:0.605769
+ 2025-04-19 06:04:59,354 INFO Epoch:61 train_loss:0.74742
+ 2025-04-19 06:05:07,370 INFO Epoch:61 val_res:0.608974
+ 2025-04-19 06:05:31,816 INFO Epoch:62 train_loss:0.73755
+ 2025-04-19 06:05:40,596 INFO Epoch:62 val_res:0.608974
+ 2025-04-19 06:06:09,716 INFO Epoch:63 train_loss:0.73293
+ 2025-04-19 06:06:17,832 INFO Epoch:63 val_res:0.602564
+ 2025-04-19 06:06:44,264 INFO Epoch:64 train_loss:0.73606
+ 2025-04-19 06:06:51,681 INFO Epoch:64 val_res:0.615385
+ 2025-04-19 06:06:51,681 INFO Saving best model at Epoch 64
+ 2025-04-19 06:07:23,209 INFO Epoch:65 train_loss:0.73033
+ 2025-04-19 06:07:32,082 INFO Epoch:65 val_res:0.612179
+ 2025-04-19 06:07:58,878 INFO Epoch:66 train_loss:0.72776
+ 2025-04-19 06:08:07,106 INFO Epoch:66 val_res:0.608974
+ 2025-04-19 06:08:32,928 INFO Epoch:67 train_loss:0.72701
+ 2025-04-19 06:08:42,448 INFO Epoch:67 val_res:0.618590
+ 2025-04-19 06:08:42,449 INFO Saving best model at Epoch 67
+ 2025-04-19 06:09:11,423 INFO Epoch:68 train_loss:0.72055
+ 2025-04-19 06:09:19,364 INFO Epoch:68 val_res:0.615385
+ 2025-04-19 06:09:46,317 INFO Epoch:69 train_loss:0.71438
+ 2025-04-19 06:09:54,572 INFO Epoch:69 val_res:0.615385
+ 2025-04-19 06:10:19,773 INFO Epoch:70 train_loss:0.71515
+ 2025-04-19 06:10:29,110 INFO Epoch:70 val_res:0.612179
+ 2025-04-19 06:10:54,332 INFO Epoch:71 train_loss:0.70814
+ 2025-04-19 06:11:02,854 INFO Epoch:71 val_res:0.615385
+ 2025-04-19 06:11:28,692 INFO Epoch:72 train_loss:0.70507
+ 2025-04-19 06:11:37,002 INFO Epoch:72 val_res:0.618590
+ 2025-04-19 06:12:02,768 INFO Epoch:73 train_loss:0.70506
+ 2025-04-19 06:12:11,569 INFO Epoch:73 val_res:0.621795
+ 2025-04-19 06:12:11,569 INFO Saving best model at Epoch 73
+ 2025-04-19 06:12:40,296 INFO Epoch:74 train_loss:0.70299
+ 2025-04-19 06:12:47,814 INFO Epoch:74 val_res:0.612179
+ 2025-04-19 06:13:13,758 INFO Epoch:75 train_loss:0.70639
+ 2025-04-19 06:13:22,575 INFO Epoch:75 val_res:0.618590
+ 2025-04-19 06:13:49,032 INFO Epoch:76 train_loss:0.68869
+ 2025-04-19 06:13:59,111 INFO Epoch:76 val_res:0.618590
+ 2025-04-19 06:14:26,232 INFO Epoch:77 train_loss:0.69248
+ 2025-04-19 06:14:34,432 INFO Epoch:77 val_res:0.612179
+ 2025-04-19 06:15:00,964 INFO Epoch:78 train_loss:0.68906
+ 2025-04-19 06:15:09,648 INFO Epoch:78 val_res:0.618590
+ 2025-04-19 06:15:36,595 INFO Epoch:79 train_loss:0.68877
+ 2025-04-19 06:15:44,783 INFO Epoch:79 val_res:0.628205
+ 2025-04-19 06:15:44,783 INFO Saving best model at Epoch 79
+ 2025-04-19 06:16:13,941 INFO Epoch:80 train_loss:0.68197
+ 2025-04-19 06:16:23,178 INFO Epoch:80 val_res:0.625000
+ 2025-04-19 06:16:49,583 INFO Epoch:81 train_loss:0.68469
+ 2025-04-19 06:16:57,439 INFO Epoch:81 val_res:0.628205
+ 2025-04-19 06:17:24,294 INFO Epoch:82 train_loss:0.67239
+ 2025-04-19 06:17:31,668 INFO Epoch:82 val_res:0.631410
+ 2025-04-19 06:17:31,669 INFO Saving best model at Epoch 82
+ 2025-04-19 06:17:58,769 INFO Epoch:83 train_loss:0.67410
+ 2025-04-19 06:18:07,857 INFO Epoch:83 val_res:0.621795
+ 2025-04-19 06:18:35,031 INFO Epoch:84 train_loss:0.66848
+ 2025-04-19 06:18:42,843 INFO Epoch:84 val_res:0.618590
+ 2025-04-19 06:19:09,063 INFO Epoch:85 train_loss:0.67331
+ 2025-04-19 06:19:17,362 INFO Epoch:85 val_res:0.631410
+ 2025-04-19 06:19:45,521 INFO Epoch:86 train_loss:0.67536
+ 2025-04-19 06:19:53,065 INFO Epoch:86 val_res:0.634615
+ 2025-04-19 06:19:53,066 INFO Saving best model at Epoch 86
+ 2025-04-19 06:20:23,866 INFO Epoch:87 train_loss:0.67924
+ 2025-04-19 06:20:32,610 INFO Epoch:87 val_res:0.618590
+ 2025-04-19 06:20:58,185 INFO Epoch:88 train_loss:0.67201
+ 2025-04-19 06:21:07,402 INFO Epoch:88 val_res:0.618590
+ 2025-04-19 06:21:33,121 INFO Epoch:89 train_loss:0.66505
+ 2025-04-19 06:21:41,941 INFO Epoch:89 val_res:0.628205
+ 2025-04-19 06:22:08,479 INFO Epoch:90 train_loss:0.66443
+ 2025-04-19 06:22:16,464 INFO Epoch:90 val_res:0.625000
+ 2025-04-19 06:22:43,890 INFO Epoch:91 train_loss:0.66156
+ 2025-04-19 06:22:53,139 INFO Epoch:91 val_res:0.628205
+ 2025-04-19 06:23:20,445 INFO Epoch:92 train_loss:0.66185
+ 2025-04-19 06:23:28,513 INFO Epoch:92 val_res:0.621795
+ 2025-04-19 06:23:53,979 INFO Epoch:93 train_loss:0.65537
+ 2025-04-19 06:24:02,675 INFO Epoch:93 val_res:0.631410
+ 2025-04-19 06:24:29,425 INFO Epoch:94 train_loss:0.65422
+ 2025-04-19 06:24:38,231 INFO Epoch:94 val_res:0.625000
+ 2025-04-19 06:25:04,726 INFO Epoch:95 train_loss:0.65197
+ 2025-04-19 06:25:13,179 INFO Epoch:95 val_res:0.628205
+ 2025-04-19 06:25:38,099 INFO Epoch:96 train_loss:0.64868
+ 2025-04-19 06:25:47,621 INFO Epoch:96 val_res:0.621795
+ 2025-04-19 06:26:13,688 INFO Epoch:97 train_loss:0.64192
+ 2025-04-19 06:26:21,746 INFO Epoch:97 val_res:0.625000
+ 2025-04-19 06:26:47,987 INFO Epoch:98 train_loss:0.64612
+ 2025-04-19 06:26:56,572 INFO Epoch:98 val_res:0.625000
+ 2025-04-19 06:27:22,086 INFO Epoch:99 train_loss:0.64826
+ 2025-04-19 06:27:30,417 INFO Epoch:99 val_res:0.628205
+ 2025-04-19 06:27:30,702 INFO =====================================
+ 2025-04-19 06:27:30,707 INFO Start testing...
+ 2025-04-19 06:27:30,707 INFO =====================================
+ 2025-04-19 06:27:40,938 INFO Incremental step 2 Testing res: 0.574603
+ 2025-04-19 06:27:40,940 INFO forgetting: 0.257892
+ 2025-04-19 06:27:40,944 INFO Incremental step: 3
+ 2025-04-19 06:28:07,675 INFO Epoch:0 train_loss:3.38011
+ 2025-04-19 06:28:17,315 INFO Epoch:0 val_res:0.452685
+ 2025-04-19 06:28:17,316 INFO Saving best model at Epoch 0
+ 2025-04-19 06:28:42,233 INFO Epoch:1 train_loss:4.13826
+ 2025-04-19 06:28:51,979 INFO Epoch:1 val_res:0.437340
+ 2025-04-19 06:29:17,173 INFO Epoch:2 train_loss:2.86830
+ 2025-04-19 06:29:26,940 INFO Epoch:2 val_res:0.521739
+ 2025-04-19 06:29:26,941 INFO Saving best model at Epoch 2
+ 2025-04-19 06:29:55,287 INFO Epoch:3 train_loss:2.50787
+ 2025-04-19 06:30:04,280 INFO Epoch:3 val_res:0.529412
+ 2025-04-19 06:30:04,280 INFO Saving best model at Epoch 3
+ 2025-04-19 06:30:30,991 INFO Epoch:4 train_loss:2.32324
+ 2025-04-19 06:30:40,408 INFO Epoch:4 val_res:0.498721
+ 2025-04-19 06:31:05,066 INFO Epoch:5 train_loss:1.87596
+ 2025-04-19 06:31:14,842 INFO Epoch:5 val_res:0.488491
+ 2025-04-19 06:31:37,799 INFO Epoch:6 train_loss:1.74407
+ 2025-04-19 06:31:46,920 INFO Epoch:6 val_res:0.519182
+ 2025-04-19 06:32:12,440 INFO Epoch:7 train_loss:1.73869
+ 2025-04-19 06:32:21,858 INFO Epoch:7 val_res:0.537084
+ 2025-04-19 06:32:21,858 INFO Saving best model at Epoch 7
+ 2025-04-19 06:32:48,457 INFO Epoch:8 train_loss:1.46716
+ 2025-04-19 06:32:59,142 INFO Epoch:8 val_res:0.554987
+ 2025-04-19 06:32:59,143 INFO Saving best model at Epoch 8
+ 2025-04-19 06:33:27,833 INFO Epoch:9 train_loss:1.53554
+ 2025-04-19 06:33:37,816 INFO Epoch:9 val_res:0.552430
+ 2025-04-19 06:34:01,436 INFO Epoch:10 train_loss:1.43512
+ 2025-04-19 06:34:10,850 INFO Epoch:10 val_res:0.554987
+ 2025-04-19 06:34:36,081 INFO Epoch:11 train_loss:1.28979
+ 2025-04-19 06:34:44,355 INFO Epoch:11 val_res:0.557545
+ 2025-04-19 06:34:44,356 INFO Saving best model at Epoch 11
+ 2025-04-19 06:35:14,645 INFO Epoch:12 train_loss:1.29425
+ 2025-04-19 06:35:23,722 INFO Epoch:12 val_res:0.531969
+ 2025-04-19 06:35:46,413 INFO Epoch:13 train_loss:1.25942
+ 2025-04-19 06:35:55,900 INFO Epoch:13 val_res:0.557545
+ 2025-04-19 06:36:20,556 INFO Epoch:14 train_loss:1.23337
+ 2025-04-19 06:36:30,537 INFO Epoch:14 val_res:0.567775
+ 2025-04-19 06:36:30,538 INFO Saving best model at Epoch 14
+ 2025-04-19 06:36:57,673 INFO Epoch:15 train_loss:1.19821
+ 2025-04-19 06:37:06,693 INFO Epoch:15 val_res:0.588235
+ 2025-04-19 06:37:06,693 INFO Saving best model at Epoch 15
+ 2025-04-19 06:37:33,649 INFO Epoch:16 train_loss:1.16489
+ 2025-04-19 06:37:42,692 INFO Epoch:16 val_res:0.575448
+ 2025-04-19 06:38:08,056 INFO Epoch:17 train_loss:1.12865
+ 2025-04-19 06:38:16,411 INFO Epoch:17 val_res:0.560102
+ 2025-04-19 06:38:42,899 INFO Epoch:18 train_loss:1.12054
+ 2025-04-19 06:38:51,899 INFO Epoch:18 val_res:0.552430
+ 2025-04-19 06:39:16,235 INFO Epoch:19 train_loss:1.11969
+ 2025-04-19 06:39:25,238 INFO Epoch:19 val_res:0.560102
713
+ 2025-04-19 06:39:51,189 INFO Epoch:20 train_loss:1.10122
714
+ 2025-04-19 06:40:00,118 INFO Epoch:20 val_res:0.570332
715
+ 2025-04-19 06:40:25,631 INFO Epoch:21 train_loss:1.05360
716
+ 2025-04-19 06:40:34,888 INFO Epoch:21 val_res:0.570332
717
+ 2025-04-19 06:41:00,346 INFO Epoch:22 train_loss:1.04817
718
+ 2025-04-19 06:41:09,231 INFO Epoch:22 val_res:0.572890
719
+ 2025-04-19 06:41:33,770 INFO Epoch:23 train_loss:1.05853
720
+ 2025-04-19 06:41:43,550 INFO Epoch:23 val_res:0.580563
721
+ 2025-04-19 06:42:09,383 INFO Epoch:24 train_loss:1.03578
722
+ 2025-04-19 06:42:18,298 INFO Epoch:24 val_res:0.567775
723
+ 2025-04-19 06:42:44,112 INFO Epoch:25 train_loss:1.01370
724
+ 2025-04-19 06:42:53,329 INFO Epoch:25 val_res:0.567775
725
+ 2025-04-19 06:43:18,264 INFO Epoch:26 train_loss:1.00191
726
+ 2025-04-19 06:43:27,498 INFO Epoch:26 val_res:0.578005
727
+ 2025-04-19 06:43:54,334 INFO Epoch:27 train_loss:1.01539
728
+ 2025-04-19 06:44:03,303 INFO Epoch:27 val_res:0.588235
729
+ 2025-04-19 06:44:27,639 INFO Epoch:28 train_loss:1.00665
730
+ 2025-04-19 06:44:37,140 INFO Epoch:28 val_res:0.590793
731
+ 2025-04-19 06:44:37,141 INFO Saving best model at Epoch 28
732
+ 2025-04-19 06:45:04,098 INFO Epoch:29 train_loss:0.98958
733
+ 2025-04-19 06:45:14,005 INFO Epoch:29 val_res:0.588235
734
+ 2025-04-19 06:45:36,066 INFO Epoch:30 train_loss:0.96916
735
+ 2025-04-19 06:45:45,563 INFO Epoch:30 val_res:0.590793
736
+ 2025-04-19 06:46:10,948 INFO Epoch:31 train_loss:0.97828
737
+ 2025-04-19 06:46:19,780 INFO Epoch:31 val_res:0.598466
738
+ 2025-04-19 06:46:19,780 INFO Saving best model at Epoch 31
739
+ 2025-04-19 06:46:46,996 INFO Epoch:32 train_loss:0.96839
740
+ 2025-04-19 06:46:56,369 INFO Epoch:32 val_res:0.606138
741
+ 2025-04-19 06:46:56,369 INFO Saving best model at Epoch 32
742
+ 2025-04-19 06:47:21,764 INFO Epoch:33 train_loss:0.96190
743
+ 2025-04-19 06:47:31,214 INFO Epoch:33 val_res:0.598466
744
+ 2025-04-19 06:47:56,782 INFO Epoch:34 train_loss:0.95478
745
+ 2025-04-19 06:48:06,828 INFO Epoch:34 val_res:0.595908
746
+ 2025-04-19 06:48:32,387 INFO Epoch:35 train_loss:0.93850
747
+ 2025-04-19 06:48:41,781 INFO Epoch:35 val_res:0.598466
748
+ 2025-04-19 06:49:06,597 INFO Epoch:36 train_loss:0.93392
749
+ 2025-04-19 06:49:16,116 INFO Epoch:36 val_res:0.603581
750
+ 2025-04-19 06:49:41,218 INFO Epoch:37 train_loss:0.93435
751
+ 2025-04-19 06:49:51,087 INFO Epoch:37 val_res:0.598466
752
+ 2025-04-19 06:50:15,628 INFO Epoch:38 train_loss:0.91670
753
+ 2025-04-19 06:50:25,424 INFO Epoch:38 val_res:0.606138
754
+ 2025-04-19 06:50:51,078 INFO Epoch:39 train_loss:0.91194
755
+ 2025-04-19 06:51:00,384 INFO Epoch:39 val_res:0.603581
756
+ 2025-04-19 06:51:26,043 INFO Epoch:40 train_loss:0.91345
757
+ 2025-04-19 06:51:35,844 INFO Epoch:40 val_res:0.601023
758
+ 2025-04-19 06:51:59,273 INFO Epoch:41 train_loss:0.89999
759
+ 2025-04-19 06:52:08,545 INFO Epoch:41 val_res:0.603581
760
+ 2025-04-19 06:52:32,470 INFO Epoch:42 train_loss:0.88086
761
+ 2025-04-19 06:52:41,797 INFO Epoch:42 val_res:0.595908
762
+ 2025-04-19 06:53:05,579 INFO Epoch:43 train_loss:0.88301
763
+ 2025-04-19 06:53:15,569 INFO Epoch:43 val_res:0.603581
764
+ 2025-04-19 06:53:40,550 INFO Epoch:44 train_loss:0.90119
765
+ 2025-04-19 06:53:50,051 INFO Epoch:44 val_res:0.585678
766
+ 2025-04-19 06:54:15,454 INFO Epoch:45 train_loss:0.87934
767
+ 2025-04-19 06:54:25,376 INFO Epoch:45 val_res:0.593350
768
+ 2025-04-19 06:54:51,101 INFO Epoch:46 train_loss:0.87032
769
+ 2025-04-19 06:54:59,053 INFO Epoch:46 val_res:0.595908
770
+ 2025-04-19 06:55:23,802 INFO Epoch:47 train_loss:0.86929
771
+ 2025-04-19 06:55:34,268 INFO Epoch:47 val_res:0.595908
772
+ 2025-04-19 06:55:58,418 INFO Epoch:48 train_loss:0.86531
773
+ 2025-04-19 06:56:08,280 INFO Epoch:48 val_res:0.598466
774
+ 2025-04-19 06:56:30,956 INFO Epoch:49 train_loss:0.84910
775
+ 2025-04-19 06:56:40,843 INFO Epoch:49 val_res:0.595908
776
+ 2025-04-19 06:57:06,215 INFO Epoch:50 train_loss:0.85603
777
+ 2025-04-19 06:57:15,678 INFO Epoch:50 val_res:0.588235
778
+ 2025-04-19 06:57:42,527 INFO Epoch:51 train_loss:0.87060
779
+ 2025-04-19 06:57:51,734 INFO Epoch:51 val_res:0.588235
780
+ 2025-04-19 06:58:15,573 INFO Epoch:52 train_loss:0.84394
781
+ 2025-04-19 06:58:25,642 INFO Epoch:52 val_res:0.588235
782
+ 2025-04-19 06:58:50,882 INFO Epoch:53 train_loss:0.84886
783
+ 2025-04-19 06:59:01,183 INFO Epoch:53 val_res:0.590793
784
+ 2025-04-19 06:59:26,477 INFO Epoch:54 train_loss:0.83513
785
+ 2025-04-19 06:59:36,892 INFO Epoch:54 val_res:0.588235
786
+ 2025-04-19 07:00:03,728 INFO Epoch:55 train_loss:0.82280
787
+ 2025-04-19 07:00:13,386 INFO Epoch:55 val_res:0.590793
788
+ 2025-04-19 07:00:38,866 INFO Epoch:56 train_loss:0.83593
789
+ 2025-04-19 07:00:48,895 INFO Epoch:56 val_res:0.598466
790
+ 2025-04-19 07:01:14,183 INFO Epoch:57 train_loss:0.82257
791
+ 2025-04-19 07:01:24,494 INFO Epoch:57 val_res:0.598466
792
+ 2025-04-19 07:01:49,940 INFO Epoch:58 train_loss:0.81759
793
+ 2025-04-19 07:01:58,130 INFO Epoch:58 val_res:0.601023
794
+ 2025-04-19 07:02:23,417 INFO Epoch:59 train_loss:0.81637
795
+ 2025-04-19 07:02:33,276 INFO Epoch:59 val_res:0.598466
796
+ 2025-04-19 07:02:59,402 INFO Epoch:60 train_loss:0.80930
797
+ 2025-04-19 07:03:09,009 INFO Epoch:60 val_res:0.595908
798
+ 2025-04-19 07:03:32,564 INFO Epoch:61 train_loss:0.79513
799
+ 2025-04-19 07:03:42,385 INFO Epoch:61 val_res:0.593350
800
+ 2025-04-19 07:04:07,873 INFO Epoch:62 train_loss:0.80163
801
+ 2025-04-19 07:04:18,108 INFO Epoch:62 val_res:0.611253
802
+ 2025-04-19 07:04:18,109 INFO Saving best model at Epoch 62
803
+ 2025-04-19 07:04:44,805 INFO Epoch:63 train_loss:0.79376
804
+ 2025-04-19 07:04:54,688 INFO Epoch:63 val_res:0.603581
805
+ 2025-04-19 07:05:18,118 INFO Epoch:64 train_loss:0.77790
806
+ 2025-04-19 07:05:27,655 INFO Epoch:64 val_res:0.611253
807
+ 2025-04-19 07:05:52,250 INFO Epoch:65 train_loss:0.78912
808
+ 2025-04-19 07:06:01,885 INFO Epoch:65 val_res:0.590793
809
+ 2025-04-19 07:06:24,610 INFO Epoch:66 train_loss:0.78291
810
+ 2025-04-19 07:06:34,209 INFO Epoch:66 val_res:0.611253
811
+ 2025-04-19 07:06:57,346 INFO Epoch:67 train_loss:0.76917
812
+ 2025-04-19 07:07:07,344 INFO Epoch:67 val_res:0.616368
813
+ 2025-04-19 07:07:07,344 INFO Saving best model at Epoch 67
814
+ 2025-04-19 07:07:34,416 INFO Epoch:68 train_loss:0.77430
815
+ 2025-04-19 07:07:44,622 INFO Epoch:68 val_res:0.608696
816
+ 2025-04-19 07:08:08,002 INFO Epoch:69 train_loss:0.77168
817
+ 2025-04-19 07:08:17,451 INFO Epoch:69 val_res:0.616368
818
+ 2025-04-19 07:08:44,178 INFO Epoch:70 train_loss:0.76969
819
+ 2025-04-19 07:08:54,078 INFO Epoch:70 val_res:0.616368
820
+ 2025-04-19 07:09:19,537 INFO Epoch:71 train_loss:0.76282
821
+ 2025-04-19 07:09:29,878 INFO Epoch:71 val_res:0.611253
822
+ 2025-04-19 07:09:55,249 INFO Epoch:72 train_loss:0.76299
823
+ 2025-04-19 07:10:04,911 INFO Epoch:72 val_res:0.618926
824
+ 2025-04-19 07:10:04,915 INFO Saving best model at Epoch 72
825
+ 2025-04-19 07:10:29,756 INFO Epoch:73 train_loss:0.73843
826
+ 2025-04-19 07:10:39,161 INFO Epoch:73 val_res:0.616368
827
+ 2025-04-19 07:11:03,929 INFO Epoch:74 train_loss:0.75582
828
+ 2025-04-19 07:11:13,364 INFO Epoch:74 val_res:0.613811
829
+ 2025-04-19 07:11:37,982 INFO Epoch:75 train_loss:0.74723
830
+ 2025-04-19 07:11:48,363 INFO Epoch:75 val_res:0.618926
831
+ 2025-04-19 07:12:14,206 INFO Epoch:76 train_loss:0.74862
832
+ 2025-04-19 07:12:23,822 INFO Epoch:76 val_res:0.608696
833
+ 2025-04-19 07:12:47,393 INFO Epoch:77 train_loss:0.74423
834
+ 2025-04-19 07:12:56,868 INFO Epoch:77 val_res:0.613811
835
+ 2025-04-19 07:13:21,519 INFO Epoch:78 train_loss:0.74976
836
+ 2025-04-19 07:13:30,379 INFO Epoch:78 val_res:0.608696
837
+ 2025-04-19 07:13:52,299 INFO Epoch:79 train_loss:0.75697
838
+ 2025-04-19 07:14:00,567 INFO Epoch:79 val_res:0.618926
839
+ 2025-04-19 07:14:25,873 INFO Epoch:80 train_loss:0.74760
840
+ 2025-04-19 07:14:34,191 INFO Epoch:80 val_res:0.616368
841
+ 2025-04-19 07:14:57,626 INFO Epoch:81 train_loss:0.72786
842
+ 2025-04-19 07:15:06,048 INFO Epoch:81 val_res:0.611253
843
+ 2025-04-19 07:15:30,182 INFO Epoch:82 train_loss:0.73431
844
+ 2025-04-19 07:15:39,272 INFO Epoch:82 val_res:0.613811
845
+ 2025-04-19 07:16:03,206 INFO Epoch:83 train_loss:0.73968
846
+ 2025-04-19 07:16:12,381 INFO Epoch:83 val_res:0.611253
847
+ 2025-04-19 07:16:38,276 INFO Epoch:84 train_loss:0.72041
848
+ 2025-04-19 07:16:47,185 INFO Epoch:84 val_res:0.608696
849
+ 2025-04-19 07:17:11,032 INFO Epoch:85 train_loss:0.71967
850
+ 2025-04-19 07:17:19,252 INFO Epoch:85 val_res:0.608696
851
+ 2025-04-19 07:17:43,802 INFO Epoch:86 train_loss:0.72109
852
+ 2025-04-19 07:17:52,775 INFO Epoch:86 val_res:0.611253
853
+ 2025-04-19 07:18:17,663 INFO Epoch:87 train_loss:0.72062
854
+ 2025-04-19 07:18:26,378 INFO Epoch:87 val_res:0.611253
855
+ 2025-04-19 07:18:50,553 INFO Epoch:88 train_loss:0.72032
856
+ 2025-04-19 07:19:00,805 INFO Epoch:88 val_res:0.603581
857
+ 2025-04-19 07:19:26,943 INFO Epoch:89 train_loss:0.71547
858
+ 2025-04-19 07:19:36,376 INFO Epoch:89 val_res:0.611253
859
+ 2025-04-19 07:20:01,393 INFO Epoch:90 train_loss:0.71745
860
+ 2025-04-19 07:20:10,609 INFO Epoch:90 val_res:0.611253
861
+ 2025-04-19 07:20:35,409 INFO Epoch:91 train_loss:0.71683
862
+ 2025-04-19 07:20:44,675 INFO Epoch:91 val_res:0.608696
863
+ 2025-04-19 07:21:08,581 INFO Epoch:92 train_loss:0.72381
864
+ 2025-04-19 07:21:17,672 INFO Epoch:92 val_res:0.613811
865
+ 2025-04-19 07:21:42,679 INFO Epoch:93 train_loss:0.71326
866
+ 2025-04-19 07:21:51,389 INFO Epoch:93 val_res:0.616368
867
+ 2025-04-19 07:22:16,601 INFO Epoch:94 train_loss:0.70276
868
+ 2025-04-19 07:22:25,754 INFO Epoch:94 val_res:0.618926
869
+ 2025-04-19 07:22:50,436 INFO Epoch:95 train_loss:0.71010
870
+ 2025-04-19 07:23:00,343 INFO Epoch:95 val_res:0.611253
871
+ 2025-04-19 07:23:26,028 INFO Epoch:96 train_loss:0.69572
872
+ 2025-04-19 07:23:35,572 INFO Epoch:96 val_res:0.613811
873
+ 2025-04-19 07:24:00,818 INFO Epoch:97 train_loss:0.69438
874
+ 2025-04-19 07:24:09,503 INFO Epoch:97 val_res:0.611253
875
+ 2025-04-19 07:24:34,501 INFO Epoch:98 train_loss:0.70946
876
+ 2025-04-19 07:24:44,504 INFO Epoch:98 val_res:0.611253
877
+ 2025-04-19 07:25:10,567 INFO Epoch:99 train_loss:0.70041
878
+ 2025-04-19 07:25:20,875 INFO Epoch:99 val_res:0.601023
879
+ 2025-04-19 07:25:21,161 INFO =====================================
880
+ 2025-04-19 07:25:21,162 INFO Start testing...
881
+ 2025-04-19 07:25:21,162 INFO =====================================
882
+ 2025-04-19 07:25:32,775 INFO Incremental step 3 Testing res: 0.543147
883
+ 2025-04-19 07:25:32,776 INFO forgetting: 0.172017
884
+ 2025-04-19 07:25:32,777 INFO Average Accuracy: 0.628596
885
+ 2025-04-19 07:25:32,777 INFO Average Forgetting: 0.258687
Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_True-seed_0/fig/audio-visual_train_loss_step_0.png ADDED
Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_True-seed_0/fig/audio-visual_train_loss_step_1.png ADDED
Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_True-seed_0/fig/audio-visual_train_loss_step_2.png ADDED
Audio Visual Continual Learning/AV-CIL/save/AVE/audio-visual/use-inverse_True-seed_0/fig/audio-visual_train_loss_step_3.png ADDED