# torgo_xlsr_finetune_F03
This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.3758
- Wer: 0.1234
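For completeness, a minimal inference sketch using the Transformers CTC API. The repository id and the `transcribe` helper are placeholders (the namespace for this checkpoint is not stated in the card), and 16 kHz mono input is assumed, matching XLSR-53 pretraining:

```python
def transcribe(audio, sampling_rate=16000,
               model_id="your-namespace/torgo_xlsr_finetune_F03"):
    """Transcribe a 1-D float waveform with the fine-tuned CTC model.

    `model_id` is a placeholder; substitute the actual repository id.
    Imports live inside the function so the sketch stays readable
    without torch/transformers installed.
    """
    import torch
    from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

    processor = Wav2Vec2Processor.from_pretrained(model_id)
    model = Wav2Vec2ForCTC.from_pretrained(model_id)

    inputs = processor(audio, sampling_rate=sampling_rate,
                       return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    # Greedy CTC decoding: pick the most likely token per frame.
    pred_ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(pred_ids)[0]
```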
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 20
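With `lr_scheduler_type: linear` and 1000 warmup steps, the learning rate ramps linearly from zero to 1e-4 and then decays linearly back to zero over training (roughly 18 000 optimizer steps, per the results table). A plain-Python sketch of that schedule and of the effective batch size; the function is an illustration, not the Trainer's internal implementation:

```python
def linear_schedule_lr(step, base_lr=1e-4, warmup_steps=1000, total_steps=18000):
    """Linear warmup to base_lr, then linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0, total_steps - step) / (total_steps - warmup_steps)

# Effective batch: per-device batch of 4 x 2 gradient-accumulation steps.
effective_batch = 4 * 2  # matches total_train_batch_size: 8
```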
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|---|---|---|---|---|
| 3.5793 | 1.06 | 1000 | 3.5545 | 1.0 |
| 2.1557 | 2.11 | 2000 | 0.9675 | 0.7402 |
| 0.9155 | 3.17 | 3000 | 0.5581 | 0.4278 |
| 0.6488 | 4.23 | 4000 | 0.4753 | 0.2703 |
| 0.4806 | 5.29 | 5000 | 0.5099 | 0.1864 |
| 0.4012 | 6.34 | 6000 | 0.3734 | 0.1549 |
| 0.3717 | 7.4 | 7000 | 0.4955 | 0.1732 |
| 0.3247 | 8.46 | 8000 | 0.4187 | 0.1339 |
| 0.3028 | 9.51 | 9000 | 0.3920 | 0.1234 |
| 0.2716 | 10.57 | 10000 | 0.6216 | 0.1785 |
| 0.255 | 11.63 | 11000 | 0.3310 | 0.1417 |
| 0.2418 | 12.68 | 12000 | 0.3653 | 0.1339 |
| 0.2111 | 13.74 | 13000 | 0.3677 | 0.1155 |
| 0.1996 | 14.8 | 14000 | 0.4039 | 0.1286 |
| 0.1869 | 15.86 | 15000 | 0.3811 | 0.1339 |
| 0.175 | 16.91 | 16000 | 0.3736 | 0.1286 |
| 0.1651 | 17.97 | 17000 | 0.3848 | 0.1260 |
| 0.179 | 19.03 | 18000 | 0.3758 | 0.1234 |
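The Wer column is word error rate: the word-level edit distance (substitutions + insertions + deletions) divided by the number of reference words. A self-contained sketch of the metric; in practice libraries such as `jiwer` or `evaluate` are used instead:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate via word-level Levenshtein distance."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution/match
    return dp[len(ref)][len(hyp)] / len(ref)
```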
### Framework versions
- Transformers 4.26.1
- Pytorch 2.2.0
- Datasets 2.16.1
- Tokenizers 0.13.3