Siyong committed
Commit 096663f · Parent: 14acd39

update model card README.md

Files changed (1): README.md (+42 -15)
README.md CHANGED
@@ -14,9 +14,9 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 2.3511
- - Wer: 1.0931
- - Cer: 0.5953
+ - Loss: 3.0545
+ - Wer: 0.8861
+ - Cer: 0.5014
 
 ## Model description
 
@@ -42,22 +42,49 @@ The following hyperparameters were used during training:
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_steps: 1000
- - num_epochs: 30
+ - num_epochs: 120
 - mixed_precision_training: Native AMP
 
 ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
- |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
- | 4.3968 | 3.33 | 500 | 4.2752 | 1.0 | 0.9823 |
- | 3.0974 | 6.67 | 1000 | 3.4132 | 1.0 | 0.9823 |
- | 2.9198 | 10.0 | 1500 | 3.3316 | 1.0 | 0.9823 |
- | 2.7158 | 13.33 | 2000 | 2.7627 | 0.9983 | 0.9278 |
- | 2.4643 | 16.67 | 2500 | 2.5953 | 1.1455 | 0.7010 |
- | 2.2255 | 20.0 | 3000 | 2.3924 | 1.0732 | 0.6792 |
- | 1.9933 | 23.33 | 3500 | 2.3755 | 1.1056 | 0.6173 |
- | 1.8317 | 26.67 | 4000 | 2.3958 | 1.1031 | 0.6041 |
- | 1.7245 | 30.0 | 4500 | 2.3511 | 1.0931 | 0.5953 |
+ | Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
+ |:-------------:|:------:|:-----:|:---------------:|:------:|:------:|
+ | No log | 3.33 | 500 | 4.0654 | 1.0 | 0.9823 |
+ | No log | 6.67 | 1000 | 3.4532 | 1.0 | 0.9823 |
+ | No log | 10.0 | 1500 | 3.0707 | 0.9992 | 0.9781 |
+ | No log | 13.33 | 2000 | 2.7335 | 1.0017 | 0.9027 |
+ | No log | 16.67 | 2500 | 2.5896 | 1.0690 | 0.7302 |
+ | No log | 20.0 | 3000 | 2.3315 | 1.0690 | 0.6677 |
+ | No log | 23.33 | 3500 | 2.2217 | 1.0150 | 0.5966 |
+ | No log | 26.67 | 4000 | 2.3802 | 1.0549 | 0.5948 |
+ | No log | 30.0 | 4500 | 2.2208 | 0.9975 | 0.5681 |
+ | 2.4224 | 33.33 | 5000 | 2.2687 | 0.9800 | 0.5537 |
+ | 2.4224 | 36.67 | 5500 | 2.3169 | 0.9476 | 0.5493 |
+ | 2.4224 | 40.0 | 6000 | 2.5196 | 0.9900 | 0.5509 |
+ | 2.4224 | 43.33 | 6500 | 2.4816 | 0.9501 | 0.5272 |
+ | 2.4224 | 46.67 | 7000 | 2.4894 | 0.9485 | 0.5276 |
+ | 2.4224 | 50.0 | 7500 | 2.4555 | 0.9418 | 0.5305 |
+ | 2.4224 | 53.33 | 8000 | 2.7326 | 0.9559 | 0.5255 |
+ | 2.4224 | 56.67 | 8500 | 2.5514 | 0.9227 | 0.5209 |
+ | 2.4224 | 60.0 | 9000 | 2.9135 | 0.9717 | 0.5455 |
+ | 2.4224 | 63.33 | 9500 | 3.0465 | 0.8346 | 0.5002 |
+ | 0.8569 | 66.67 | 10000 | 2.8177 | 0.9302 | 0.5216 |
+ | 0.8569 | 70.0 | 10500 | 2.9908 | 0.9310 | 0.5128 |
+ | 0.8569 | 73.33 | 11000 | 3.1752 | 0.9235 | 0.5284 |
+ | 0.8569 | 76.67 | 11500 | 2.7412 | 0.8886 | 0.5 |
+ | 0.8569 | 80.0 | 12000 | 2.7362 | 0.9127 | 0.5040 |
+ | 0.8569 | 83.33 | 12500 | 2.9636 | 0.9152 | 0.5093 |
+ | 0.8569 | 86.67 | 13000 | 3.0139 | 0.9011 | 0.5097 |
+ | 0.8569 | 90.0 | 13500 | 2.8325 | 0.8853 | 0.5032 |
+ | 0.8569 | 93.33 | 14000 | 3.0383 | 0.8845 | 0.5056 |
+ | 0.8569 | 96.67 | 14500 | 2.7931 | 0.8795 | 0.4965 |
+ | 0.3881 | 100.0 | 15000 | 2.8972 | 0.8928 | 0.5012 |
+ | 0.3881 | 103.33 | 15500 | 2.7780 | 0.8736 | 0.4947 |
+ | 0.3881 | 106.67 | 16000 | 3.1081 | 0.9036 | 0.5109 |
+ | 0.3881 | 110.0 | 16500 | 3.0078 | 0.8928 | 0.5032 |
+ | 0.3881 | 113.33 | 17000 | 3.0245 | 0.8886 | 0.5009 |
+ | 0.3881 | 116.67 | 17500 | 3.0739 | 0.8928 | 0.5065 |
+ | 0.3881 | 120.0 | 18000 | 3.0545 | 0.8861 | 0.5014 |
 
 
 ### Framework versions
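
The tables report WER (word error rate) and CER (character error rate), and several intermediate checkpoints show WER above 1.0 (e.g. 1.0690 at step 3000). That is expected: both metrics are edit distance (substitutions + deletions + insertions) divided by the reference length, so insertion-heavy transcripts can exceed 1.0. A minimal pure-Python sketch of the two metrics (an illustration, not the evaluation code used for this model, which typically comes from a library such as `jiwer` or `evaluate`):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two token sequences."""
    # prev[j] holds the distance between ref[:i-1] and hyp[:j]
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i]
        for j, h in enumerate(hyp, 1):
            cur.append(min(prev[j] + 1,                # deletion
                           cur[j - 1] + 1,             # insertion
                           prev[j - 1] + (r != h)))    # substitution
        prev = cur
    return prev[-1]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    return edit_distance(ref, hyp) / len(ref)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: character-level edit distance / reference length."""
    return edit_distance(list(reference), list(hypothesis)) / len(reference)
```

For example, `wer("a", "a b c")` is 2.0, since two insertions are divided by a single reference word; the same mechanism explains the WER values above 1.0 early in training, before the CTC head learns when to emit blanks.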