ctaguchi committed on
Commit 9ef00a6 · verified · 1 Parent(s): b5948fe

Model save

Files changed (2):
  1. README.md +131 -0
  2. model.safetensors +1 -1
README.md ADDED
@@ -0,0 +1,131 @@
---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: ssc-bew-model
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# ssc-bew-model

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.2698
- Cer: 0.4050
- Wer: 0.9687
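
The card reports both CER (character error rate) and WER (word error rate). As a minimal sketch (not the Trainer's actual metric code, which typically delegates to `evaluate`/`jiwer`), both are Levenshtein edit distances normalized by reference length, differing only in the unit counted:

```python
def edit_distance(ref, hyp):
    # Classic Levenshtein distance via dynamic programming (one rolling row).
    m, n = len(ref), len(hyp)
    dp = list(range(n + 1))
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,          # deletion
                        dp[j - 1] + 1,      # insertion
                        prev + (ref[i - 1] != hyp[j - 1]))  # substitution
            prev = cur
    return dp[n]


def error_rate(refs, hyps, units=str.split):
    # WER with the default units=str.split; CER with units=list (characters).
    errors = sum(edit_distance(units(r), units(h)) for r, h in zip(refs, hyps))
    total = sum(len(units(r)) for r in refs)
    return errors / total
```

With this framing, the final WER of 0.9687 means nearly one word-level edit per reference word, while the CER of 0.4050 shows the model recovers most of the character sequence.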

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 30
- mixed_precision_training: Native AMP
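
The hyperparameters above can be expressed as a `transformers.TrainingArguments` sketch. This is a reconstruction, not the author's actual training script; the `output_dir` is an assumption:

```python
from transformers import TrainingArguments

# Hedged sketch mirroring the listed hyperparameters; output_dir is assumed.
training_args = TrainingArguments(
    output_dir="ssc-bew-model",
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,   # 16 x 2 = effective batch size 32
    seed=42,
    optim="adamw_torch_fused",       # AdamW, betas=(0.9, 0.999), eps=1e-08
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=30,
    fp16=True,                       # Native AMP mixed precision
)
```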

### Training results

| Training Loss | Epoch | Step | Validation Loss | Cer | Wer |
|:-------------:|:-------:|:----:|:---------------:|:------:|:------:|
| 4.8123 | 0.4484 | 100 | 3.0133 | 0.9969 | 1.0 |
| 3.1265 | 0.8969 | 200 | 2.9396 | 0.9969 | 1.0 |
| 3.119 | 1.3453 | 300 | 2.8454 | 0.9969 | 1.0 |
| 3.0907 | 1.7937 | 400 | 2.8643 | 0.9969 | 1.0 |
| 3.0756 | 2.2422 | 500 | 2.9632 | 0.9969 | 1.0 |
| 2.9941 | 2.6906 | 600 | 2.8723 | 0.9969 | 1.0 |
| 2.8902 | 3.1390 | 700 | 2.6961 | 0.9892 | 1.0 |
| 2.7428 | 3.5874 | 800 | 2.4343 | 0.8631 | 0.9999 |
| 2.5463 | 4.0359 | 900 | 2.3476 | 0.7009 | 1.0 |
| 2.3062 | 4.4843 | 1000 | 2.1183 | 0.6930 | 1.0 |
| 2.2073 | 4.9327 | 1100 | 2.0210 | 0.7117 | 1.0 |
| 2.0765 | 5.3812 | 1200 | 2.0935 | 0.6305 | 0.9960 |
| 2.031 | 5.8296 | 1300 | 1.9194 | 0.6221 | 0.9994 |
| 1.9854 | 6.2780 | 1400 | 1.8179 | 0.5811 | 0.9999 |
| 1.8907 | 6.7265 | 1500 | 1.7837 | 0.5585 | 0.9998 |
| 1.8584 | 7.1749 | 1600 | 1.8593 | 0.5447 | 0.9931 |
| 1.7839 | 7.6233 | 1700 | 1.8705 | 0.5141 | 1.0417 |
| 1.785 | 8.0717 | 1800 | 1.7395 | 0.5033 | 0.9858 |
| 1.6759 | 8.5202 | 1900 | 1.7391 | 0.4723 | 0.9718 |
| 1.7218 | 8.9686 | 2000 | 1.7175 | 0.4940 | 0.9723 |
| 1.609 | 9.4170 | 2100 | 1.7110 | 0.4781 | 0.9772 |
| 1.5811 | 9.8655 | 2200 | 1.7124 | 0.4658 | 0.9843 |
| 1.5396 | 10.3139 | 2300 | 1.6364 | 0.4590 | 0.9560 |
| 1.5133 | 10.7623 | 2400 | 1.6455 | 0.4580 | 0.9590 |
| 1.4639 | 11.2108 | 2500 | 1.7604 | 0.4770 | 0.9661 |
| 1.4046 | 11.6592 | 2600 | 1.7846 | 0.4839 | 1.0466 |
| 1.3616 | 12.1076 | 2700 | 1.6502 | 0.4380 | 0.9452 |
| 1.3035 | 12.5561 | 2800 | 1.6649 | 0.4458 | 0.9521 |
| 1.3431 | 13.0045 | 2900 | 1.6487 | 0.4386 | 0.9449 |
| 1.2072 | 13.4529 | 3000 | 1.6000 | 0.4358 | 0.9402 |
| 1.1978 | 13.9013 | 3100 | 1.6768 | 0.4380 | 0.9613 |
| 1.1108 | 14.3498 | 3200 | 1.7206 | 0.4302 | 0.9412 |
| 1.1131 | 14.7982 | 3300 | 1.6861 | 0.4315 | 0.9715 |
| 1.0874 | 15.2466 | 3400 | 1.5880 | 0.4253 | 0.9327 |
| 1.0333 | 15.6951 | 3500 | 1.5706 | 0.4154 | 0.9273 |
| 0.9514 | 16.1435 | 3600 | 1.7136 | 0.4228 | 0.9538 |
| 0.9313 | 16.5919 | 3700 | 1.8036 | 0.4289 | 1.0255 |
| 0.9902 | 17.0404 | 3800 | 1.6053 | 0.4120 | 0.9242 |
| 0.8452 | 17.4888 | 3900 | 1.5672 | 0.4104 | 0.9327 |
| 0.872 | 17.9372 | 4000 | 1.6529 | 0.4131 | 0.9536 |
| 0.7782 | 18.3857 | 4100 | 1.8549 | 0.4384 | 1.0661 |
| 0.7984 | 18.8341 | 4200 | 1.8437 | 0.4153 | 1.0265 |
| 0.7615 | 19.2825 | 4300 | 1.7319 | 0.4127 | 0.9696 |
| 0.7246 | 19.7309 | 4400 | 1.7560 | 0.4035 | 0.9411 |
| 0.698 | 20.1794 | 4500 | 1.8200 | 0.4254 | 0.9628 |
| 0.6601 | 20.6278 | 4600 | 1.8046 | 0.4220 | 0.9428 |
| 0.6534 | 21.0762 | 4700 | 1.9306 | 0.4108 | 0.9498 |
| 0.6144 | 21.5247 | 4800 | 1.8637 | 0.4024 | 0.9563 |
| 0.6418 | 21.9731 | 4900 | 1.9459 | 0.4060 | 0.9565 |
| 0.5814 | 22.4215 | 5000 | 1.9776 | 0.4344 | 1.0098 |
| 0.577 | 22.8700 | 5100 | 2.0336 | 0.4109 | 0.9729 |
| 0.5315 | 23.3184 | 5200 | 2.1168 | 0.4078 | 0.9863 |
| 0.5363 | 23.7668 | 5300 | 2.0074 | 0.4066 | 0.9515 |
| 0.5649 | 24.2152 | 5400 | 2.0267 | 0.4149 | 0.9734 |
| 0.5181 | 24.6637 | 5500 | 1.9906 | 0.4076 | 0.9469 |
| 0.4848 | 25.1121 | 5600 | 2.1771 | 0.4014 | 0.9471 |
| 0.4804 | 25.5605 | 5700 | 2.0896 | 0.3972 | 0.9382 |
| 0.5048 | 26.0090 | 5800 | 2.1047 | 0.3946 | 0.9295 |
| 0.4581 | 26.4574 | 5900 | 2.1378 | 0.4016 | 0.9482 |
| 0.4616 | 26.9058 | 6000 | 2.1853 | 0.4071 | 0.9466 |
| 0.4075 | 27.3543 | 6100 | 2.2196 | 0.3982 | 0.9473 |
| 0.4338 | 27.8027 | 6200 | 2.1815 | 0.4027 | 0.9511 |
| 0.4475 | 28.2511 | 6300 | 2.2522 | 0.4019 | 0.9564 |
| 0.4115 | 28.6996 | 6400 | 2.2593 | 0.4024 | 0.9580 |
| 0.3882 | 29.1480 | 6500 | 2.2502 | 0.4072 | 0.9666 |
| 0.4021 | 29.5964 | 6600 | 2.2698 | 0.4050 | 0.9687 |

### Framework versions

- Transformers 4.57.2
- Pytorch 2.9.1+cu128
- Datasets 3.6.0
- Tokenizers 0.22.0
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:a77e8abcc24f9c00b5cac2414ecdb1c3e103710025c1afd4cffdb3394f759d9e
+ oid sha256:96663faaea21846b646f6229b850ec4160b47b4890d4f4db3c0c3abf4730bb99
  size 1261950980
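
The git-LFS pointer above identifies the updated weights by their SHA-256 digest. A small sketch (function name is illustrative) for checking that a downloaded `model.safetensors` matches the `oid` in the pointer:

```python
import hashlib


def lfs_oid(path, chunk_size=1 << 20):
    # Hash the file in 1 MiB chunks so a 1.2 GB checkpoint
    # is never loaded into memory at once.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return "sha256:" + h.hexdigest()
```

Comparing `lfs_oid("model.safetensors")` against the `+ oid` line confirms the download matches this commit's weights.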