zainulhakim committed
Commit 1a157e2 · verified · 1 Parent(s): b2be2df

End of training

Files changed (1)
1. README.md +8 -106
README.md CHANGED
@@ -1,5 +1,6 @@
  ---
- base_model: zainulhakim/my_checkpoints_cl_250203
+ license: apache-2.0
+ base_model: zainulhakim/241103_wav2vec2_Augmented_Dataset
  tags:
  - generated_from_trainer
  metrics:
@@ -14,11 +15,11 @@ should probably proofread and complete it, then remove this comment. -->

  # my_checkpoints_cl_250203

- This model is a fine-tuned version of [zainulhakim/my_checkpoints_cl_250203](https://huggingface.co/zainulhakim/my_checkpoints_cl_250203) on an unknown dataset.
+ This model is a fine-tuned version of [zainulhakim/241103_wav2vec2_Augmented_Dataset](https://huggingface.co/zainulhakim/241103_wav2vec2_Augmented_Dataset) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 7.6020
+ - Loss: nan
  - Wer: 1.0
- - Cer: 0.9971
+ - Cer: 0.9985

  ## Model description

@@ -37,120 +38,21 @@ More information needed
  ### Training hyperparameters

  The following hyperparameters were used during training:
- - learning_rate: 0.0001
+ - learning_rate: 10
  - train_batch_size: 4
  - eval_batch_size: 8
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - lr_scheduler_warmup_steps: 500
- - num_epochs: 100
+ - num_epochs: 1
  - mixed_precision_training: Native AMP

  ### Training results

  | Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
  |:-------------:|:-----:|:----:|:---------------:|:---:|:------:|
- | No log | 1.0 | 79 | 7.6020 | 1.0 | 0.9971 |
- | No log | 2.0 | 158 | 7.6020 | 1.0 | 0.9971 |
- | No log | 3.0 | 237 | 7.6020 | 1.0 | 0.9971 |
- | No log | 4.0 | 316 | 7.6020 | 1.0 | 0.9971 |
- | No log | 5.0 | 395 | 7.6020 | 1.0 | 0.9971 |
- | No log | 6.0 | 474 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 7.0 | 553 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 8.0 | 632 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 9.0 | 711 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 10.0 | 790 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 11.0 | 869 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 12.0 | 948 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 13.0 | 1027 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 14.0 | 1106 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 15.0 | 1185 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 16.0 | 1264 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 17.0 | 1343 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 18.0 | 1422 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 19.0 | 1501 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 20.0 | 1580 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 21.0 | 1659 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 22.0 | 1738 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 23.0 | 1817 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 24.0 | 1896 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 25.0 | 1975 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 26.0 | 2054 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 27.0 | 2133 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 28.0 | 2212 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 29.0 | 2291 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 30.0 | 2370 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 31.0 | 2449 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 32.0 | 2528 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 33.0 | 2607 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 34.0 | 2686 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 35.0 | 2765 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 36.0 | 2844 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 37.0 | 2923 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 38.0 | 3002 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 39.0 | 3081 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 40.0 | 3160 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 41.0 | 3239 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 42.0 | 3318 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 43.0 | 3397 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 44.0 | 3476 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 45.0 | 3555 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 46.0 | 3634 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 47.0 | 3713 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 48.0 | 3792 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 49.0 | 3871 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 50.0 | 3950 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 51.0 | 4029 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 52.0 | 4108 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 53.0 | 4187 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 54.0 | 4266 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 55.0 | 4345 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 56.0 | 4424 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 57.0 | 4503 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 58.0 | 4582 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 59.0 | 4661 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 60.0 | 4740 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 61.0 | 4819 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 62.0 | 4898 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 63.0 | 4977 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 64.0 | 5056 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 65.0 | 5135 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 66.0 | 5214 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 67.0 | 5293 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 68.0 | 5372 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 69.0 | 5451 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 70.0 | 5530 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 71.0 | 5609 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 72.0 | 5688 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 73.0 | 5767 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 74.0 | 5846 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 75.0 | 5925 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 76.0 | 6004 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 77.0 | 6083 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 78.0 | 6162 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 79.0 | 6241 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 80.0 | 6320 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 81.0 | 6399 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 82.0 | 6478 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 83.0 | 6557 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 84.0 | 6636 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 85.0 | 6715 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 86.0 | 6794 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 87.0 | 6873 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 88.0 | 6952 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 89.0 | 7031 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 90.0 | 7110 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 91.0 | 7189 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 92.0 | 7268 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 93.0 | 7347 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 94.0 | 7426 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 95.0 | 7505 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 96.0 | 7584 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 97.0 | 7663 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 98.0 | 7742 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 99.0 | 7821 | 7.6020 | 1.0 | 0.9971 |
- | 0.0 | 100.0 | 7900 | 7.6020 | 1.0 | 0.9971 |
+ | No log | 1.0 | 79 | nan | 1.0 | 0.9985 |


  ### Framework versions
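For reference, the Wer and Cer values in this card are word and character error rates: the Levenshtein edit distance between the model's transcript and the reference, normalized by reference length in words and characters respectively. A minimal sketch of how such scores are computed, using the jiwer package on a hypothetical transcript pair (the card itself reports only aggregate scores):

```python
import jiwer

# Hypothetical reference/hypothesis pair, purely to illustrate the metrics;
# no example transcripts are included in this model card.
reference = "turn on the kitchen lights"
hypothesis = "turn of the kitten light"

print(f"WER: {jiwer.wer(reference, hypothesis):.4f}")  # word edits / reference words
print(f"CER: {jiwer.cer(reference, hypothesis):.4f}")  # char edits / reference chars
```

A Wer of 1.0, as reported above, means essentially no reference word was recovered intact.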
 
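The hyperparameter lists in the last hunk map onto a transformers TrainingArguments configuration. A sketch of the updated run's settings, assuming the standard transformers argument names (output_dir is a placeholder; fp16 stands in for "mixed_precision_training: Native AMP"):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="my_checkpoints_cl_250203",  # placeholder path
    learning_rate=10,                       # new run; the old run used 0.0001
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=1,                     # new run; the old run used 100
    fp16=True,                              # Native AMP mixed precision
)
```

A learning rate of 10 sits several orders of magnitude above the 1e-4 of the previous run, which is consistent with the nan validation loss in the new results table.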
 
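Assuming the checkpoint follows the standard wav2vec2 CTC layout, as the base-model name suggests, inference would look roughly like this (the repo id is taken from this card; the audio input is placeholder silence standing in for a real 16 kHz waveform):

```python
import numpy as np
import torch
from transformers import AutoProcessor, Wav2Vec2ForCTC

# Assumes a standard wav2vec2 CTC checkpoint; repo id from the card title.
repo_id = "zainulhakim/my_checkpoints_cl_250203"
processor = AutoProcessor.from_pretrained(repo_id)
model = Wav2Vec2ForCTC.from_pretrained(repo_id)

# Placeholder input: one second of 16 kHz silence instead of real speech.
audio = np.zeros(16000, dtype=np.float32)
inputs = processor(audio, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits      # shape: (batch, time, vocab)
pred_ids = torch.argmax(logits, dim=-1)  # greedy CTC decoding
print(processor.batch_decode(pred_ids))
```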