---
license: apache-2.0
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: apac_5sents_XLS-R_2_e-4
    results: []
---

# apac_5sents_XLS-R_2_e-4

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 1.4066
- Wer: 0.2188
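
Word error rate (WER) is the word-level edit distance between the model transcript and the reference, divided by the reference length. The card does not say which implementation produced the number above (libraries such as `jiwer` or `evaluate` are common, but that is an assumption); a minimal pure-Python sketch of the metric:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution
    return dp[len(ref)][len(hyp)] / len(ref)

print(wer("the cat sat", "the cat sat"))  # 0.0
```

A WER of 0.2188 therefore means roughly one word-level error per 4.6 reference words.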

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 20000
- mixed_precision_training: Native AMP
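
Two derived quantities follow directly from these settings: the effective batch size (per-device batch times accumulation steps) and the warmup length (warmup ratio times total steps). A small sketch that mirrors, as an assumption, how a linear scheduler with warmup typically behaves (e.g. `transformers.get_linear_schedule_with_warmup`):

```python
LEARNING_RATE = 1e-4
TRAINING_STEPS = 20_000
WARMUP_RATIO = 0.1
WARMUP_STEPS = int(TRAINING_STEPS * WARMUP_RATIO)  # 0.1 * 20000 = 2000

# Effective batch: per-device batch * gradient accumulation steps.
TRAIN_BATCH_SIZE = 32
GRAD_ACCUM = 4
TOTAL_BATCH = TRAIN_BATCH_SIZE * GRAD_ACCUM  # 32 * 4 = 128, matching the card

def lr_at(step: int) -> float:
    """Linear warmup from 0 to LEARNING_RATE, then linear decay to 0."""
    if step < WARMUP_STEPS:
        return LEARNING_RATE * step / WARMUP_STEPS
    return LEARNING_RATE * (TRAINING_STEPS - step) / (TRAINING_STEPS - WARMUP_STEPS)

print(lr_at(1000))   # halfway through warmup: 5e-05
print(lr_at(20000))  # end of training: 0.0
```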

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Wer    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|
| 194.7445      | 11.11   | 200   | 81.6865         | 1.0    |
| 61.9115       | 22.22   | 400   | 39.3596         | 1.0    |
| 22.518        | 33.32   | 600   | 9.0332          | 1.0    |
| 6.0776        | 44.43   | 800   | 5.0634          | 1.0    |
| 4.555         | 55.54   | 1000  | 4.1552          | 1.0    |
| 2.402         | 66.65   | 1200  | 2.1760          | 0.7746 |
| 0.8349        | 77.76   | 1400  | 2.0442          | 0.8326 |
| 0.5484        | 88.86   | 1600  | 1.4219          | 0.8415 |
| 0.4242        | 99.97   | 1800  | 3.1197          | 0.8817 |
| 0.3173        | 111.11  | 2000  | 1.7508          | 0.7388 |
| 0.1989        | 122.22  | 2200  | 2.4075          | 0.6920 |
| 0.1169        | 133.32  | 2400  | 6.3769          | 0.5670 |
| 0.0925        | 144.43  | 2600  | 1.6440          | 0.4710 |
| 0.0681        | 155.54  | 2800  | 1.4864          | 0.3103 |
| 0.0561        | 166.65  | 3000  | 2.2973          | 0.3996 |
| 0.0445        | 177.76  | 3200  | 1.8107          | 0.4219 |
| 0.0336        | 188.86  | 3400  | 1.3867          | 0.3728 |
| 0.0344        | 199.97  | 3600  | 1.7830          | 0.3281 |
| 0.0345        | 211.11  | 3800  | 2.3773          | 0.3638 |
| 0.0304        | 222.22  | 4000  | 1.4448          | 0.1987 |
| 0.0357        | 233.32  | 4200  | 2.5893          | 0.3125 |
| 0.0253        | 244.43  | 4400  | 2.4619          | 0.3013 |
| 0.0212        | 255.54  | 4600  | 2.8144          | 0.2790 |
| 0.0186        | 266.65  | 4800  | 2.1155          | 0.2545 |
| 0.0196        | 277.76  | 5000  | 1.7306          | 0.2254 |
| 0.0153        | 288.86  | 5200  | 1.6247          | 0.2121 |
| 0.0146        | 299.97  | 5400  | 3.0580          | 0.4442 |
| 0.0296        | 311.11  | 5600  | 1.7865          | 0.2857 |
| 0.017         | 322.22  | 5800  | 5.4352          | 0.3795 |
| 0.0162        | 333.32  | 6000  | 1.9186          | 0.25   |
| 0.0139        | 344.43  | 6200  | 2.6566          | 0.2589 |
| 0.0139        | 355.54  | 6400  | 2.6532          | 0.2946 |
| 0.0111        | 366.65  | 6600  | 1.9131          | 0.2567 |
| 0.0119        | 377.76  | 6800  | 1.8914          | 0.3214 |
| 0.012         | 388.86  | 7000  | 2.2985          | 0.3371 |
| 0.0117        | 399.97  | 7200  | 3.3127          | 0.3393 |
| 0.0185        | 411.11  | 7400  | 3.1641          | 0.3862 |
| 0.0096        | 422.22  | 7600  | 2.3704          | 0.3973 |
| 0.007         | 433.32  | 7800  | 5.5839          | 0.4375 |
| 0.0104        | 444.43  | 8000  | 3.1441          | 0.3973 |
| 0.0098        | 455.54  | 8200  | 1.8188          | 0.2768 |
| 0.009         | 466.65  | 8400  | 1.8589          | 0.3058 |
| 0.0142        | 477.76  | 8600  | 3.9817          | 0.3772 |
| 0.0095        | 488.86  | 8800  | 2.1353          | 0.3237 |
| 0.0071        | 499.97  | 9000  | 1.5266          | 0.2902 |
| 0.0071        | 511.11  | 9200  | 1.4713          | 0.2746 |
| 0.0068        | 522.22  | 9400  | 2.2041          | 0.3125 |
| 0.0046        | 533.32  | 9600  | 1.4471          | 0.2522 |
| 0.0078        | 544.43  | 9800  | 1.6511          | 0.2946 |
| 0.0077        | 555.54  | 10000 | 1.7329          | 0.2121 |
| 0.004         | 566.65  | 10200 | 1.8652          | 0.2031 |
| 0.0049        | 577.76  | 10400 | 1.3661          | 0.2210 |
| 0.0066        | 588.86  | 10600 | 1.7544          | 0.2321 |
| 0.0072        | 599.97  | 10800 | 1.8081          | 0.2835 |
| 0.0055        | 611.11  | 11000 | 1.5139          | 0.2232 |
| 0.0053        | 622.22  | 11200 | 1.6138          | 0.2991 |
| 0.0052        | 633.32  | 11400 | 1.4865          | 0.2924 |
| 0.0067        | 644.43  | 11600 | 2.4807          | 0.3705 |
| 0.0044        | 655.54  | 11800 | 1.4097          | 0.3371 |
| 0.0026        | 666.65  | 12000 | 1.5313          | 0.3348 |
| 0.0055        | 677.76  | 12200 | 2.1968          | 0.3661 |
| 0.0034        | 688.86  | 12400 | 1.5198          | 0.3839 |
| 0.0028        | 699.97  | 12600 | 1.5379          | 0.3683 |
| 0.0033        | 711.11  | 12800 | 2.1355          | 0.3571 |
| 0.0044        | 722.22  | 13000 | 1.4440          | 0.3371 |
| 0.0024        | 733.32  | 13200 | 3.5154          | 0.3438 |
| 0.0012        | 744.43  | 13400 | 2.8505          | 0.3214 |
| 0.002         | 755.54  | 13600 | 2.9340          | 0.3304 |
| 0.0029        | 766.65  | 13800 | 2.8148          | 0.3214 |
| 0.0034        | 777.76  | 14000 | 2.7587          | 0.2835 |
| 0.0025        | 788.86  | 14200 | 2.8232          | 0.3638 |
| 0.0012        | 799.97  | 14400 | 2.6047          | 0.375  |
| 0.0015        | 811.11  | 14600 | 2.6364          | 0.3772 |
| 0.0018        | 822.22  | 14800 | 2.5143          | 0.3929 |
| 0.0032        | 833.32  | 15000 | 2.9826          | 0.5469 |
| 0.0035        | 844.43  | 15200 | 1.5761          | 0.4420 |
| 0.0023        | 855.54  | 15400 | 1.7465          | 0.4598 |
| 0.0016        | 866.65  | 15600 | 1.7740          | 0.4397 |
| 0.0007        | 877.76  | 15800 | 1.8296          | 0.4286 |
| 0.0012        | 888.86  | 16000 | 2.2368          | 0.3906 |
| 0.0027        | 899.97  | 16200 | 1.7112          | 0.3527 |
| 0.0009        | 911.11  | 16400 | 2.5084          | 0.375  |
| 0.001         | 922.22  | 16600 | 2.3311          | 0.3304 |
| 0.0019        | 933.32  | 16800 | 1.6653          | 0.3080 |
| 0.0018        | 944.43  | 17000 | 1.4620          | 0.2768 |
| 0.0007        | 955.54  | 17200 | 1.8509          | 0.2723 |
| 0.0005        | 966.65  | 17400 | 1.9279          | 0.2879 |
| 0.0009        | 977.76  | 17600 | 2.3558          | 0.2812 |
| 0.0004        | 988.86  | 17800 | 2.8907          | 0.2924 |
| 0.0011        | 999.97  | 18000 | 2.4847          | 0.2902 |
| 0.0016        | 1011.11 | 18200 | 2.2670          | 0.3058 |
| 0.0004        | 1022.22 | 18400 | 2.2399          | 0.3125 |
| 0.0004        | 1033.32 | 18600 | 2.4376          | 0.3192 |
| 0.001         | 1044.43 | 18800 | 2.4744          | 0.3214 |
| 0.0006        | 1055.54 | 19000 | 2.4975          | 0.3147 |
| 0.0006        | 1066.65 | 19200 | 2.6372          | 0.3259 |
| 0.0005        | 1077.76 | 19400 | 2.5817          | 0.3304 |
| 0.0004        | 1088.86 | 19600 | 2.5573          | 0.3326 |
| 0.0001        | 1099.97 | 19800 | 2.5579          | 0.3348 |
| 0.0003        | 1111.11 | 20000 | 2.5641          | 0.3326 |
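
Note that the final row's validation WER (0.3326) is higher than the evaluation WER reported at the top of the card (0.2188), which suggests the headline numbers come from an earlier, better checkpoint rather than the last one (an assumption; the card does not say). Picking the best checkpoint from (step, WER) pairs is straightforward:

```python
# A few (step, validation WER) rows sampled from the table above.
results = [(3800, 0.3638), (4000, 0.1987), (4200, 0.3125), (20000, 0.3326)]

# Select the checkpoint with the lowest validation WER.
best_step, best_wer = min(results, key=lambda row: row[1])
print(best_step, best_wer)  # 4000 0.1987
```

Across the full table, the lowest validation WER is 0.1987 at step 4000.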

### Framework versions

- Transformers 4.26.1
- Pytorch 2.0.1+cu118
- Datasets 2.11.0
- Tokenizers 0.13.3