ctaguchi committed (verified)
Commit c776cd2 · 1 Parent(s): 7440d2c

End of training

Files changed (4):
  1. README.md +26 -87
  2. adapter.ttj.safetensors +1 -1
  3. model.safetensors +1 -1
  4. training_args.bin +1 -1
README.md CHANGED
@@ -1,7 +1,5 @@
 ---
 library_name: transformers
-license: cc-by-nc-4.0
-base_model: facebook/mms-1b-all
 tags:
 - generated_from_trainer
 metrics:
@@ -16,11 +14,11 @@ should probably proofread and complete it, then remove this comment. -->
 
 # ssc-ttj-mms-model-mix-adapt-max3-devtrain
 
-This model is a fine-tuned version of [facebook/mms-1b-all](https://huggingface.co/facebook/mms-1b-all) on an unknown dataset.
+This model was trained from scratch on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.1190
-- Cer: 0.0590
-- Wer: 0.3410
+- Loss: 0.1235
+- Cer: 0.0581
+- Wer: 0.3438
 
 ## Model description
 
@@ -40,97 +38,38 @@
 
 The following hyperparameters were used during training:
 - learning_rate: 0.0005
-- train_batch_size: 8
+- train_batch_size: 1
 - eval_batch_size: 6
 - seed: 42
 - gradient_accumulation_steps: 2
-- total_train_batch_size: 16
+- total_train_batch_size: 2
 - optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_steps: 100
-- num_epochs: 20
+- num_epochs: 5
 - mixed_precision_training: Native AMP
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Cer | Wer |
-|:-------------:|:-------:|:-----:|:---------------:|:------:|:------:|
-| 0.7128 | 0.2602 | 200 | 0.2917 | 0.0890 | 0.4935 |
-| 0.5556 | 0.5205 | 400 | 0.2645 | 0.0906 | 0.5049 |
-| 0.5374 | 0.7807 | 600 | 0.2462 | 0.0864 | 0.4736 |
-| 0.485 | 1.0403 | 800 | 0.2209 | 0.0818 | 0.4473 |
-| 0.4974 | 1.3006 | 1000 | 0.2127 | 0.0797 | 0.4365 |
-| 0.4915 | 1.5608 | 1200 | 0.2094 | 0.0788 | 0.4319 |
-| 0.4822 | 1.8211 | 1400 | 0.2037 | 0.0767 | 0.4238 |
-| 0.4566 | 2.0807 | 1600 | 0.2007 | 0.0776 | 0.4306 |
-| 0.4499 | 2.3409 | 1800 | 0.1891 | 0.0740 | 0.4121 |
-| 0.4539 | 2.6012 | 2000 | 0.1904 | 0.0746 | 0.4131 |
-| 0.4726 | 2.8614 | 2200 | 0.1835 | 0.0737 | 0.4066 |
-| 0.4478 | 3.1210 | 2400 | 0.1822 | 0.0725 | 0.4034 |
-| 0.4316 | 3.3813 | 2600 | 0.1813 | 0.0729 | 0.4011 |
-| 0.4333 | 3.6415 | 2800 | 0.1782 | 0.0723 | 0.4018 |
-| 0.4448 | 3.9018 | 3000 | 0.1786 | 0.0724 | 0.4002 |
-| 0.3983 | 4.1614 | 3200 | 0.1747 | 0.0718 | 0.3945 |
-| 0.4093 | 4.4216 | 3400 | 0.1691 | 0.0705 | 0.3918 |
-| 0.4011 | 4.6818 | 3600 | 0.1625 | 0.0680 | 0.3827 |
-| 0.4155 | 4.9421 | 3800 | 0.1642 | 0.0693 | 0.3895 |
-| 0.396 | 5.2017 | 4000 | 0.1658 | 0.0693 | 0.3876 |
-| 0.391 | 5.4619 | 4200 | 0.1658 | 0.0692 | 0.3913 |
-| 0.3835 | 5.7222 | 4400 | 0.1624 | 0.0680 | 0.3793 |
-| 0.4102 | 5.9824 | 4600 | 0.1595 | 0.0679 | 0.3832 |
-| 0.3711 | 6.2420 | 4800 | 0.1542 | 0.0673 | 0.3765 |
-| 0.3836 | 6.5023 | 5000 | 0.1533 | 0.0673 | 0.3776 |
-| 0.3727 | 6.7625 | 5200 | 0.1566 | 0.0676 | 0.3804 |
-| 0.3726 | 7.0221 | 5400 | 0.1519 | 0.0665 | 0.3721 |
-| 0.3851 | 7.2824 | 5600 | 0.1482 | 0.0650 | 0.3715 |
-| 0.3535 | 7.5426 | 5800 | 0.1492 | 0.0661 | 0.3722 |
-| 0.3885 | 7.8029 | 6000 | 0.1501 | 0.0655 | 0.3683 |
-| 0.3744 | 8.0625 | 6200 | 0.1452 | 0.0655 | 0.3700 |
-| 0.3388 | 8.3227 | 6400 | 0.1466 | 0.0650 | 0.3683 |
-| 0.3751 | 8.5830 | 6600 | 0.1477 | 0.0658 | 0.3698 |
-| 0.347 | 8.8432 | 6800 | 0.1413 | 0.0643 | 0.3680 |
-| 0.3181 | 9.1028 | 7000 | 0.1416 | 0.0641 | 0.3626 |
-| 0.3619 | 9.3630 | 7200 | 0.1419 | 0.0638 | 0.3609 |
-| 0.3474 | 9.6233 | 7400 | 0.1438 | 0.0644 | 0.3677 |
-| 0.3326 | 9.8835 | 7600 | 0.1375 | 0.0632 | 0.3606 |
-| 0.3288 | 10.1431 | 7800 | 0.1394 | 0.0634 | 0.3595 |
-| 0.3368 | 10.4034 | 8000 | 0.1384 | 0.0638 | 0.3617 |
-| 0.342 | 10.6636 | 8200 | 0.1351 | 0.0629 | 0.3596 |
-| 0.3267 | 10.9239 | 8400 | 0.1342 | 0.0631 | 0.3634 |
-| 0.3062 | 11.1835 | 8600 | 0.1327 | 0.0626 | 0.3578 |
-| 0.3336 | 11.4437 | 8800 | 0.1312 | 0.0621 | 0.3573 |
-| 0.3268 | 11.7040 | 9000 | 0.1329 | 0.0628 | 0.3605 |
-| 0.3336 | 11.9642 | 9200 | 0.1329 | 0.0629 | 0.3573 |
-| 0.3173 | 12.2238 | 9400 | 0.1297 | 0.0622 | 0.3575 |
-| 0.3082 | 12.4841 | 9600 | 0.1301 | 0.0619 | 0.3550 |
-| 0.3188 | 12.7443 | 9800 | 0.1294 | 0.0618 | 0.3546 |
-| 0.3397 | 13.0039 | 10000 | 0.1281 | 0.0615 | 0.3566 |
-| 0.316 | 13.2642 | 10200 | 0.1315 | 0.0620 | 0.3576 |
-| 0.3084 | 13.5244 | 10400 | 0.1282 | 0.0609 | 0.3517 |
-| 0.3152 | 13.7846 | 10600 | 0.1292 | 0.0612 | 0.3533 |
-| 0.2819 | 14.0442 | 10800 | 0.1277 | 0.0610 | 0.3500 |
-| 0.3114 | 14.3045 | 11000 | 0.1255 | 0.0607 | 0.3502 |
-| 0.3106 | 14.5647 | 11200 | 0.1264 | 0.0606 | 0.3473 |
-| 0.2999 | 14.8250 | 11400 | 0.1243 | 0.0608 | 0.3499 |
-| 0.3029 | 15.0846 | 11600 | 0.1243 | 0.0605 | 0.3506 |
-| 0.3004 | 15.3448 | 11800 | 0.1255 | 0.0606 | 0.3515 |
-| 0.2935 | 15.6051 | 12000 | 0.1244 | 0.0608 | 0.3522 |
-| 0.3084 | 15.8653 | 12200 | 0.1232 | 0.0605 | 0.3495 |
-| 0.2796 | 16.1249 | 12400 | 0.1216 | 0.0597 | 0.3458 |
-| 0.2988 | 16.3852 | 12600 | 0.1213 | 0.0594 | 0.3458 |
-| 0.2863 | 16.6454 | 12800 | 0.1221 | 0.0598 | 0.3455 |
-| 0.3014 | 16.9057 | 13000 | 0.1233 | 0.0598 | 0.3436 |
-| 0.2972 | 17.1653 | 13200 | 0.1203 | 0.0589 | 0.3417 |
-| 0.3098 | 17.4255 | 13400 | 0.1214 | 0.0594 | 0.3430 |
-| 0.284 | 17.6858 | 13600 | 0.1206 | 0.0594 | 0.3438 |
-| 0.2847 | 17.9460 | 13800 | 0.1200 | 0.0595 | 0.3448 |
-| 0.2849 | 18.2056 | 14000 | 0.1192 | 0.0588 | 0.3414 |
-| 0.2744 | 18.4658 | 14200 | 0.1189 | 0.0590 | 0.3436 |
-| 0.2737 | 18.7261 | 14400 | 0.1207 | 0.0591 | 0.3423 |
-| 0.2859 | 18.9863 | 14600 | 0.1197 | 0.0590 | 0.3414 |
-| 0.2772 | 19.2459 | 14800 | 0.1187 | 0.0590 | 0.3409 |
-| 0.2844 | 19.5062 | 15000 | 0.1190 | 0.0590 | 0.3417 |
-| 0.2815 | 19.7664 | 15200 | 0.1190 | 0.0590 | 0.3410 |
+| Training Loss | Epoch | Step | Validation Loss | Cer | Wer |
+|:-------------:|:------:|:----:|:---------------:|:------:|:------:|
+| 0.2291 | 0.2789 | 200 | 0.1283 | 0.0593 | 0.3495 |
+| 0.1982 | 0.5579 | 400 | 0.1392 | 0.0651 | 0.3722 |
+| 0.1952 | 0.8368 | 600 | 0.1330 | 0.0624 | 0.3571 |
+| 0.1812 | 1.1158 | 800 | 0.1317 | 0.0594 | 0.3482 |
+| 0.18 | 1.3947 | 1000 | 0.1322 | 0.0595 | 0.3517 |
+| 0.2008 | 1.6736 | 1200 | 0.1312 | 0.0600 | 0.3537 |
+| 0.2339 | 1.9526 | 1400 | 0.1276 | 0.0586 | 0.3470 |
+| 0.1732 | 2.2315 | 1600 | 0.1282 | 0.0593 | 0.3513 |
+| 0.1745 | 2.5105 | 1800 | 0.1289 | 0.0597 | 0.3513 |
+| 0.1585 | 2.7894 | 2000 | 0.1272 | 0.0593 | 0.3503 |
+| 0.166 | 3.0683 | 2200 | 0.1267 | 0.0587 | 0.3456 |
+| 0.1561 | 3.3473 | 2400 | 0.1278 | 0.0587 | 0.3469 |
+| 0.1936 | 3.6262 | 2600 | 0.1272 | 0.0587 | 0.3466 |
+| 0.188 | 3.9052 | 2800 | 0.1258 | 0.0582 | 0.3451 |
+| 0.1743 | 4.1841 | 3000 | 0.1259 | 0.0584 | 0.3446 |
+| 0.1552 | 4.4630 | 3200 | 0.1244 | 0.0582 | 0.3432 |
+| 0.1444 | 4.7420 | 3400 | 0.1235 | 0.0581 | 0.3438 |
 
 
 ### Framework versions
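
Note: Cer and Wer above are character and word error rates on the evaluation set. A minimal inference sketch for this checkpoint follows; the repo id is assumed from the model name in the card, the architecture is assumed to be Wav2Vec2ForCTC (implied by the removed `base_model: facebook/mms-1b-all` line), and the audio path is a placeholder.

```python
# Hypothetical usage sketch: transcribe one utterance with this checkpoint.
# Assumptions: repo id matches the card's model name; input audio is 16 kHz mono.
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2ForCTC

repo_id = "ctaguchi/ssc-ttj-mms-model-mix-adapt-max3-devtrain"  # assumed repo id

processor = AutoProcessor.from_pretrained(repo_id)
model = Wav2Vec2ForCTC.from_pretrained(repo_id)
model.eval()

# MMS-style CTC models expect 16 kHz mono input.
speech, _ = librosa.load("sample.wav", sr=16_000, mono=True)  # placeholder path
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```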
adapter.ttj.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:d723162f29e3fe48a78f07fcf60bdc13dae57b26e9acf8203c7557e8cadd73fe
+oid sha256:f9f51800d9897a6940a8e1ccbbde031b286c197536379424d367960ab3e45765
 size 9003508
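
The updated `adapter.ttj.safetensors` holds the language-adapter weights for the `ttj` target language. A sketch of how such adapters are selected at load time, assuming this checkpoint follows the MMS convention where `adapter.<lang>.safetensors` files are swapped via `target_lang` (the repo id is again an assumption):

```python
# Sketch of the MMS language-adapter mechanism in transformers.
# Assumption: this checkpoint uses the MMS convention, so the ttj adapter
# file shown above is what target_lang="ttj" loads into the model.
from transformers import AutoProcessor, Wav2Vec2ForCTC

repo_id = "ctaguchi/ssc-ttj-mms-model-mix-adapt-max3-devtrain"  # assumed repo id

processor = AutoProcessor.from_pretrained(repo_id)
model = Wav2Vec2ForCTC.from_pretrained(repo_id, target_lang="ttj")

# An already-instantiated model can also switch adapters in place:
# model.load_adapter("ttj")
```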
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:b1897f6f4ddec5ec458281da7522ff3dbf291635b9de7745b6fb50e2da4539d8
+oid sha256:6bd702e015745ed15552a471e406bdb55fe0ccf9e6e385003023d6f11cc3968f
 size 3859095884
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:e6b2a22adc9e078f38755893451a83e88c13278511a2768eb00a7545551059ba
+oid sha256:4e9b7d7f8778a15ad28d1b7f93db11370af5528e3875fb54d3c427fd853296ca
 size 5969
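
`training_args.bin` is the serialized `TrainingArguments` object used by the Trainer. A rough sketch of equivalent arguments, reconstructed from the hyperparameter list in the updated card; only the listed values are grounded, and `output_dir` is an assumption:

```python
# Rough reconstruction of the updated hyperparameters as TrainingArguments.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="ssc-ttj-mms-model-mix-adapt-max3-devtrain",  # assumed
    learning_rate=5e-4,             # 0.0005
    per_device_train_batch_size=1,
    per_device_eval_batch_size=6,
    gradient_accumulation_steps=2,  # effective train batch size: 1 * 2 = 2
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=5,
    fp16=True,                      # "Native AMP" mixed precision
)
```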