# w2v2-base-pretrained_lr1e-4_at0.8_da0.8
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 3.0247
- Wer: 1.0

Note that a WER of 1.0 means the model transcribed no words correctly on the evaluation set.
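For reference, word error rate is the word-level edit distance between the hypothesis and the reference, divided by the reference length. A minimal sketch of the computation (a generic illustration, not the exact metric implementation used during this training run):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i  # deleting all i reference words
    for j in range(len(hyp) + 1):
        dp[0][j] = j  # inserting all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(
                dp[i - 1][j] + 1,        # deletion
                dp[i][j - 1] + 1,        # insertion
                dp[i - 1][j - 1] + cost, # substitution (or match)
            )
    return dp[len(ref)][len(hyp)] / len(ref)

# A hypothesis sharing no words with the reference gives WER 1.0,
# matching the evaluation result above.
print(wer("the cat sat", "a dog ran"))  # 1.0
```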
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- training_steps: 3500
- mixed_precision_training: Native AMP
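The hyperparameters above map onto the `transformers` `TrainingArguments` roughly as follows. This is a sketch assuming the standard `Trainer` API; the output directory and any model/data wiring are assumptions, not taken from this card:

```python
from transformers import TrainingArguments

# Hypothetical reproduction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="w2v2-base-pretrained_lr1e-4_at0.8_da0.8",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    max_steps=3500,
    fp16=True,  # "Native AMP" mixed-precision training
)
```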
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|---|---|---|---|---|
| 14.2507 | 6.76 | 250 | 3.8646 | 1.0 |
| 3.1822 | 13.51 | 500 | 3.1594 | 1.0 |
| 3.104 | 20.27 | 750 | 3.1306 | 1.0 |
| 3.0856 | 27.03 | 1000 | 3.1454 | 1.0 |
| 3.0745 | 33.78 | 1250 | 3.1263 | 1.0 |
| 3.0518 | 40.54 | 1500 | 3.0992 | 1.0 |
| 3.0403 | 47.3 | 1750 | 3.0938 | 1.0 |
| 3.023 | 54.05 | 2000 | 3.1334 | 1.0 |
| 3.0097 | 60.81 | 2250 | 3.0640 | 1.0 |
| 2.9931 | 67.57 | 2500 | 3.0890 | 1.0 |
| 2.9778 | 74.32 | 2750 | 3.0379 | 1.0 |
| 2.9657 | 81.08 | 3000 | 3.0294 | 1.0 |
| 2.9529 | 87.84 | 3250 | 3.0291 | 1.0 |
| 2.9447 | 94.59 | 3500 | 3.0247 | 1.0 |
### Framework versions

- Transformers 4.35.0
- PyTorch 2.0.0
- Datasets 2.14.6
- Tokenizers 0.14.1