# wav2vec2-E10_pause
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.2932
- CER: 28.7124
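The CER above is reported on a 0–100 scale. The card does not say how it was computed (a library such as `evaluate` or `jiwer` is typical), but a minimal pure-Python sketch of a character error rate of this form — edit distance divided by reference length — looks like this:

```python
def levenshtein(ref, hyp):
    """Edit distance between two sequences via dynamic programming."""
    # prev[j] holds the distance between ref[:i-1] and hyp[:j]
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, start=1):
        curr = [i]
        for j, h in enumerate(hyp, start=1):
            curr.append(min(
                prev[j] + 1,             # deletion
                curr[j - 1] + 1,         # insertion
                prev[j - 1] + (r != h),  # substitution
            ))
        prev = curr
    return prev[-1]


def cer(reference, hypothesis):
    """Character error rate as a percentage, matching the table's scale."""
    return 100.0 * levenshtein(reference, hypothesis) / len(reference)
```

For example, `cer("hello", "hallo")` is one substitution over five characters, i.e. 20.0.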
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- num_epochs: 3
- mixed_precision_training: Native AMP
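The linear scheduler with 50 warmup steps means the learning rate ramps from 0 to 1e-4 over the first 50 optimizer steps, then decays linearly to 0 by the end of training. A sketch of that schedule (the `total_steps` value is an assumption, estimated from the results table at roughly 1,550 steps per epoch over 3 epochs):

```python
def linear_warmup_lr(step, base_lr=1e-4, warmup_steps=50, total_steps=4650):
    """Linear warmup followed by linear decay, in the style of
    transformers' get_linear_schedule_with_warmup.

    total_steps is a rough estimate from the training log, not a
    value stated on the card.
    """
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

The rate peaks at `base_lr` exactly at the end of warmup: `linear_warmup_lr(50)` returns 1e-4, and `linear_warmup_lr(4650)` returns 0.0.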
### Training results

| Training Loss | Epoch | Step | Validation Loss | CER (%) |
|---|---|---|---|---|
| 27.1594 | 0.1289 | 200 | 4.8622 | 100.0 |
| 4.9998 | 0.2579 | 400 | 4.7518 | 100.0 |
| 4.8719 | 0.3868 | 600 | 4.7669 | 100.0 |
| 4.809 | 0.5158 | 800 | 4.6593 | 100.0 |
| 4.7354 | 0.6447 | 1000 | 4.5911 | 100.0 |
| 4.6679 | 0.7737 | 1200 | 4.6423 | 99.3773 |
| 4.136 | 0.9026 | 1400 | 3.9276 | 77.9077 |
| 3.1108 | 1.0316 | 1600 | 2.9616 | 56.7845 |
| 2.6314 | 1.1605 | 1800 | 2.7039 | 51.9619 |
| 2.2786 | 1.2895 | 2000 | 2.3306 | 45.8823 |
| 2.0348 | 1.4184 | 2200 | 2.1354 | 40.5721 |
| 1.8952 | 1.5474 | 2400 | 1.9727 | 39.7086 |
| 1.7053 | 1.6763 | 2600 | 1.8535 | 37.7996 |
| 1.5809 | 1.8053 | 2800 | 1.7608 | 36.7246 |
| 1.4968 | 1.9342 | 3000 | 1.6229 | 33.2531 |
| 1.349 | 2.0632 | 3200 | 1.6171 | 33.6290 |
| 1.2592 | 2.1921 | 3400 | 1.5156 | 32.9300 |
| 1.2043 | 2.3211 | 3600 | 1.4406 | 30.7977 |
| 1.1418 | 2.4500 | 3800 | 1.3878 | 29.5172 |
| 1.1157 | 2.5790 | 4000 | 1.3441 | 29.1060 |
| 1.0653 | 2.7079 | 4200 | 1.3052 | 27.9605 |
| 1.0451 | 2.8369 | 4400 | 1.2943 | 28.5656 |
| 1.0225 | 2.9658 | 4600 | 1.2932 | 28.7124 |
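The CER scored in the table is computed on text produced by CTC decoding of the model's frame-level predictions. In practice this is done with the model's processor (e.g. `processor.batch_decode`), but the greedy decoding rule itself is simple: collapse consecutive repeats, then drop blanks. A sketch, assuming blank id 0 (the actual blank id depends on the tokenizer's vocabulary):

```python
def ctc_greedy_decode(token_ids, blank_id=0, id_to_char=None):
    """Greedy CTC decoding: collapse repeated tokens, then remove blanks.

    token_ids is the per-frame argmax of the model's logits; blank_id=0
    is an assumption for illustration.
    """
    out = []
    prev = None
    for t in token_ids:
        if t != prev and t != blank_id:
            out.append(t)
        prev = t
    if id_to_char is not None:
        return "".join(id_to_char[t] for t in out)
    return out
```

For example, the frame sequence `[0, 1, 1, 0, 2, 2, 2, 0, 1]` decodes to `[1, 2, 1]`: the repeated `1`s and `2`s collapse, the blanks separate the final `1` from the first.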
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.0.1
- Tokenizers 0.19.1