apac_5sents_XLS-R_2_e-4_unfreeze

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9903
  • Wer: 0.2031
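
For reference, a minimal transcription sketch using the transformers library is shown below. This is an assumption-laden example, not the authors' script: the checkpoint path is taken from the model name above and may differ from the actual repository id, the audio file name is a placeholder, and the input is assumed to be 16 kHz mono audio as expected by XLS-R.

```python
# Minimal inference sketch for a CTC fine-tune of facebook/wav2vec2-xls-r-300m.
# The model path and audio file below are placeholders.
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "apac_5sents_XLS-R_2_e-4_unfreeze"  # assumed local/hub path
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Load audio at the 16 kHz sampling rate the model was pretrained with.
speech, _ = librosa.load("example.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values).logits

pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```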

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding TrainingArguments follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 20000
  • mixed_precision_training: Native AMP
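
The settings above map roughly onto a transformers TrainingArguments configuration; the sketch below is not the exact training script. The output_dir is a placeholder, fp16 is assumed for the reported "Native AMP" mixed precision, and the 200-step evaluation/logging interval is inferred from the results table below.

```python
# Hedged sketch of TrainingArguments matching the hyperparameters listed above.
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults, so they are not set here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="apac_5sents_XLS-R_2_e-4_unfreeze",  # placeholder output directory
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,   # total train batch size: 32 * 4 = 128
    max_steps=20_000,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    fp16=True,                       # assumption: fp16 for "Native AMP" mixed precision
    evaluation_strategy="steps",     # assumption: evaluate every 200 steps (see table below)
    eval_steps=200,
    logging_steps=200,
)
```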

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:---:|:---:|:---:|:---:|:---:|
| 189.2127 | 11.11 | 200 | 73.2720 | 1.0 |
| 54.2997 | 22.22 | 400 | 34.4132 | 1.0 |
| 20.0866 | 33.32 | 600 | 8.5765 | 1.0 |
| 6.0035 | 44.43 | 800 | 4.9753 | 1.0 |
| 4.7213 | 55.54 | 1000 | 4.3482 | 1.0 |
| 3.4429 | 66.65 | 1200 | 3.1173 | 1.0 |
| 1.5676 | 77.76 | 1400 | 1.7944 | 0.9353 |
| 0.7909 | 88.86 | 1600 | 1.6430 | 0.9308 |
| 0.6096 | 99.97 | 1800 | 1.5703 | 0.9330 |
| 0.5121 | 111.11 | 2000 | 1.5256 | 0.9129 |
| 0.4618 | 122.22 | 2200 | 1.6791 | 0.8638 |
| 0.3701 | 133.32 | 2400 | 1.5437 | 0.6920 |
| 0.2753 | 144.43 | 2600 | 1.3421 | 0.8304 |
| 0.1842 | 155.54 | 2800 | 1.1704 | 0.7455 |
| 0.0994 | 166.65 | 3000 | 1.0909 | 0.4710 |
| 0.0719 | 177.76 | 3200 | 1.2866 | 0.5335 |
| 0.0457 | 188.86 | 3400 | 1.1482 | 0.5580 |
| 0.0373 | 199.97 | 3600 | 0.9844 | 0.6183 |
| 0.0384 | 211.11 | 3800 | 1.1314 | 0.6451 |
| 0.0329 | 222.22 | 4000 | 1.3817 | 0.6406 |
| 0.03 | 233.32 | 4200 | 1.0517 | 0.6049 |
| 0.026 | 244.43 | 4400 | 1.0459 | 0.5022 |
| 0.0339 | 255.54 | 4600 | 1.2741 | 0.4732 |
| 0.0231 | 266.65 | 4800 | 1.5190 | 0.5424 |
| 0.0244 | 277.76 | 5000 | 1.3602 | 0.6451 |
| 0.017 | 288.86 | 5200 | 1.9874 | 0.6473 |
| 0.0204 | 299.97 | 5400 | 1.4477 | 0.6540 |
| 0.0221 | 311.11 | 5600 | 1.1082 | 0.5960 |
| 0.019 | 322.22 | 5800 | 1.5668 | 0.5737 |
| 0.0148 | 333.32 | 6000 | 0.9671 | 0.5871 |
| 0.0171 | 344.43 | 6200 | 1.0411 | 0.6406 |
| 0.0135 | 355.54 | 6400 | 1.0979 | 0.5938 |
| 0.0236 | 366.65 | 6600 | 1.2793 | 0.5670 |
| 0.0178 | 377.76 | 6800 | 1.6185 | 0.5982 |
| 0.0145 | 388.86 | 7000 | 1.1957 | 0.5491 |
| 0.015 | 399.97 | 7200 | 1.6764 | 0.6272 |
| 0.0122 | 411.11 | 7400 | 1.0150 | 0.6183 |
| 0.0112 | 422.22 | 7600 | 1.0519 | 0.6429 |
| 0.0083 | 433.32 | 7800 | 1.2191 | 0.5714 |
| 0.0086 | 444.43 | 8000 | 1.3643 | 0.6116 |
| 0.0149 | 455.54 | 8200 | 0.8175 | 0.6116 |
| 0.0113 | 466.65 | 8400 | 1.3835 | 0.6161 |
| 0.0085 | 477.76 | 8600 | 1.1749 | 0.5781 |
| 0.0074 | 488.86 | 8800 | 1.4239 | 0.4754 |
| 0.0064 | 499.97 | 9000 | 1.2462 | 0.5871 |
| 0.0089 | 511.11 | 9200 | 0.9149 | 0.4754 |
| 0.0084 | 522.22 | 9400 | 0.9334 | 0.5759 |
| 0.0056 | 533.32 | 9600 | 1.2457 | 0.6585 |
| 0.007 | 544.43 | 9800 | 1.1244 | 0.6518 |
| 0.0106 | 555.54 | 10000 | 1.2021 | 0.6272 |
| 0.0082 | 566.65 | 10200 | 1.0176 | 0.6094 |
| 0.0094 | 577.76 | 10400 | 1.0082 | 0.6161 |
| 0.0085 | 588.86 | 10600 | 1.0985 | 0.5580 |
| 0.0087 | 599.97 | 10800 | 0.8951 | 0.5067 |
| 0.0088 | 611.11 | 11000 | 1.0105 | 0.6094 |
| 0.0073 | 622.22 | 11200 | 1.1884 | 0.6071 |
| 0.0058 | 633.32 | 11400 | 1.0784 | 0.5804 |
| 0.008 | 644.43 | 11600 | 0.9484 | 0.6228 |
| 0.0044 | 655.54 | 11800 | 0.9366 | 0.6295 |
| 0.0049 | 666.65 | 12000 | 0.9724 | 0.6562 |
| 0.0055 | 677.76 | 12200 | 1.0735 | 0.6272 |
| 0.0057 | 688.86 | 12400 | 1.4276 | 0.5670 |
| 0.0069 | 699.97 | 12600 | 0.8726 | 0.6004 |
| 0.0042 | 711.11 | 12800 | 1.2411 | 0.5402 |
| 0.0031 | 722.22 | 13000 | 0.9987 | 0.5603 |
| 0.0033 | 733.32 | 13200 | 1.0106 | 0.5781 |
| 0.0046 | 744.43 | 13400 | 1.0548 | 0.4241 |
| 0.0042 | 755.54 | 13600 | 0.9983 | 0.2009 |
| 0.0025 | 766.65 | 13800 | 1.0933 | 0.2879 |
| 0.0067 | 777.76 | 14000 | 1.0665 | 0.3482 |
| 0.0061 | 788.86 | 14200 | 1.3937 | 0.4085 |
| 0.0058 | 799.97 | 14400 | 1.4818 | 0.3036 |
| 0.003 | 811.11 | 14600 | 1.3054 | 0.2723 |
| 0.0036 | 822.22 | 14800 | 1.1874 | 0.4062 |
| 0.0032 | 833.32 | 15000 | 1.4134 | 0.3571 |
| 0.0024 | 844.43 | 15200 | 1.4457 | 0.3661 |
| 0.0019 | 855.54 | 15400 | 1.2084 | 0.4821 |
| 0.0014 | 866.65 | 15600 | 1.2791 | 0.3058 |
| 0.0018 | 877.76 | 15800 | 1.1289 | 0.3616 |
| 0.0013 | 888.86 | 16000 | 1.3066 | 0.2567 |
| 0.001 | 899.97 | 16200 | 1.3964 | 0.2991 |
| 0.0016 | 911.11 | 16400 | 1.3610 | 0.2522 |
| 0.001 | 922.22 | 16600 | 1.4205 | 0.2812 |
| 0.0015 | 933.32 | 16800 | 1.2090 | 0.2946 |
| 0.0009 | 944.43 | 17000 | 1.2032 | 0.2344 |
| 0.0018 | 955.54 | 17200 | 1.2981 | 0.3013 |
| 0.0014 | 966.65 | 17400 | 1.2368 | 0.3549 |
| 0.0016 | 977.76 | 17600 | 1.3524 | 0.3683 |
| 0.0009 | 988.86 | 17800 | 1.4152 | 0.3460 |
| 0.0007 | 999.97 | 18000 | 1.3977 | 0.3125 |
| 0.0015 | 1011.11 | 18200 | 1.3225 | 0.2857 |
| 0.0009 | 1022.22 | 18400 | 1.2686 | 0.2812 |
| 0.0009 | 1033.32 | 18600 | 1.2383 | 0.3036 |
| 0.0011 | 1044.43 | 18800 | 1.2066 | 0.2902 |
| 0.0009 | 1055.54 | 19000 | 1.2048 | 0.2812 |
| 0.0007 | 1066.65 | 19200 | 1.2001 | 0.2812 |
| 0.0009 | 1077.76 | 19400 | 1.2631 | 0.3013 |
| 0.0005 | 1088.86 | 19600 | 1.2438 | 0.3036 |
| 0.0005 | 1099.97 | 19800 | 1.2146 | 0.3036 |
| 0.0005 | 1111.11 | 20000 | 1.2095 | 0.2991 |
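
The Wer column above (and the headline Wer of 0.2031) is the word error rate on the evaluation set. The following is a minimal sketch of how WER can be computed with the jiwer package; the reference and prediction strings are placeholders, not samples from the actual evaluation data, and the training run may have computed the metric through a different wrapper.

```python
# Word error rate (WER): word-level edit distance between reference and
# prediction, normalized by the number of reference words. Strings are placeholders.
import jiwer

references = ["the cat sat on the mat"]
predictions = ["the cat sat on a mat"]  # one substitution out of six words

print(jiwer.wer(references, predictions))  # ~0.1667
```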

Framework versions

  • Transformers 4.26.1
  • Pytorch 2.0.1+cu118
  • Datasets 2.11.0
  • Tokenizers 0.13.3