# torgo_xlsr_finetune_M01_word_only_2
This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.5197
- Wer: 0.4523
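The reported Wer is the word error rate: the word-level edit distance between the model's transcription and the reference, divided by the number of reference words (so 0.4523 means roughly 45 words changed, inserted, or deleted per 100 reference words). The card does not say which library computed it; as an illustration only, here is a minimal pure-Python sketch of the metric:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference word count."""
    ref = reference.split()
    hyp = hypothesis.split()
    # d[i][j] = edit distance between ref[:i] and hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # delete all i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # insert all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,         # deletion
                d[i][j - 1] + 1,         # insertion
                d[i - 1][j - 1] + cost,  # substitution (or match)
            )
    return d[len(ref)][len(hyp)] / len(ref)

# One substitution ("sit" for "sat") and one deletion ("the") over 6 reference words
print(wer("the cat sat on the mat", "the cat sit on mat"))  # 2/6 ≈ 0.3333
```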
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 10
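The original training script is not included in this card, but the hyperparameters above map roughly onto the following `transformers` `TrainingArguments` (a hedged reconstruction: `output_dir` is a placeholder, and the 250-step evaluation interval is inferred from the results table below):

```python
from transformers import TrainingArguments

# Hedged reconstruction of the listed hyperparameters; not the original script.
training_args = TrainingArguments(
    output_dir="./torgo_xlsr_finetune_M01_word_only_2",  # placeholder path
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=10,
    evaluation_strategy="steps",  # inferred: evaluated every 250 steps
    eval_steps=250,
)
```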
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|---|---|---|---|---|
| No log | 0.28 | 250 | 5.5883 | 1.0 |
| 32.6551 | 0.57 | 500 | 3.4592 | 1.0 |
| 32.6551 | 0.85 | 750 | 3.4807 | 1.0 |
| 3.536 | 1.13 | 1000 | 3.4312 | 1.0 |
| 3.536 | 1.42 | 1250 | 3.4500 | 1.0 |
| 3.4905 | 1.7 | 1500 | 3.4075 | 1.0 |
| 3.4905 | 1.98 | 1750 | 3.1965 | 1.0 |
| 3.1337 | 2.27 | 2000 | 2.4374 | 1.0 |
| 3.1337 | 2.55 | 2250 | 1.9984 | 0.9512 |
| 1.796 | 2.83 | 2500 | 1.7787 | 0.8337 |
| 1.796 | 3.11 | 2750 | 1.7683 | 0.8248 |
| 1.0903 | 3.4 | 3000 | 1.4810 | 0.6452 |
| 1.0903 | 3.68 | 3250 | 1.5670 | 0.6408 |
| 0.8207 | 3.96 | 3500 | 1.4513 | 0.6585 |
| 0.8207 | 4.25 | 3750 | 1.2161 | 0.5632 |
| 0.6562 | 4.53 | 4000 | 1.5037 | 0.6208 |
| 0.6562 | 4.81 | 4250 | 1.2742 | 0.5698 |
| 0.5897 | 5.1 | 4500 | 1.4467 | 0.5543 |
| 0.5897 | 5.38 | 4750 | 1.5197 | 0.5477 |
| 0.5008 | 5.66 | 5000 | 1.6018 | 0.5654 |
| 0.5008 | 5.95 | 5250 | 1.4544 | 0.5299 |
| 0.4653 | 6.23 | 5500 | 1.4768 | 0.5122 |
| 0.4653 | 6.51 | 5750 | 1.4434 | 0.5277 |
| 0.4165 | 6.8 | 6000 | 1.4197 | 0.4834 |
| 0.4165 | 7.08 | 6250 | 1.5469 | 0.5299 |
| 0.3873 | 7.36 | 6500 | 1.5274 | 0.5233 |
| 0.3873 | 7.64 | 6750 | 1.4026 | 0.4812 |
| 0.355 | 7.93 | 7000 | 1.4866 | 0.4723 |
| 0.355 | 8.21 | 7250 | 1.5028 | 0.4989 |
| 0.3382 | 8.49 | 7500 | 1.5790 | 0.4834 |
| 0.3382 | 8.78 | 7750 | 1.4626 | 0.4590 |
| 0.2974 | 9.06 | 8000 | 1.5658 | 0.4856 |
| 0.2974 | 9.34 | 8250 | 1.5510 | 0.4701 |
| 0.2975 | 9.63 | 8500 | 1.5392 | 0.4501 |
| 0.2975 | 9.91 | 8750 | 1.5197 | 0.4523 |
### Framework versions
- Transformers 4.26.1
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.13.3