ssc-lth-model

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unknown dataset. It achieves the following results on the evaluation set (a brief inference sketch follows these results):

  • Loss: 1.5683
  • CER: 0.3085
  • WER: 0.7821

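The base model and the CER/WER metrics suggest this is a CTC speech-recognition checkpoint. Below is a minimal inference sketch under that assumption, using the repo id from this card; the audio path is a placeholder.

```python
# Minimal inference sketch. Assumes this checkpoint exposes the standard
# Wav2Vec2 CTC interface; "sample.wav" is a placeholder path.
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "ctaguchi/ssc-lth-model"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# wav2vec2-xls-r expects 16 kHz mono audio.
speech, sr = librosa.load("sample.wav", sr=16_000)

inputs = processor(speech, sampling_rate=sr, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: take the argmax token per frame, then let
# batch_decode collapse repeats and blanks into text.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```
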
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 30
  • mixed_precision_training: Native AMP

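The values above map directly onto transformers.TrainingArguments. Here is a hedged sketch of that mapping; output_dir and any setting not listed above (e.g. evaluation/save cadence) are assumptions.

```python
from transformers import TrainingArguments

# Sketch only: values mirror the hyperparameter list above; output_dir
# and anything not listed there are assumptions.
training_args = TrainingArguments(
    output_dir="ssc-lth-model",        # assumed
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,     # total train batch size: 16
    optim="adamw_torch_fused",         # fused AdamW
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=30,
    fp16=True,                         # native AMP mixed precision
)
```
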
Training results

| Training Loss | Epoch | Step | Validation Loss | CER | WER |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|
| 5.7582 | 0.1546 | 100 | 3.1629 | 0.9943 | 1.0 |
| 3.4065 | 0.3091 | 200 | 3.1093 | 0.9943 | 1.0 |
| 3.3576 | 0.4637 | 300 | 3.0748 | 0.9943 | 1.0 |
| 3.1832 | 0.6182 | 400 | 3.0296 | 0.9943 | 1.0 |
| 3.1613 | 0.7728 | 500 | 2.9966 | 0.9943 | 1.0 |
| 3.1446 | 0.9274 | 600 | 3.0421 | 0.9943 | 1.0 |
| 3.1082 | 1.0819 | 700 | 3.0424 | 0.9943 | 1.0 |
| 3.0153 | 1.2365 | 800 | 2.8610 | 0.9943 | 1.0 |
| 2.8793 | 1.3910 | 900 | 2.8591 | 0.8887 | 1.0 |
| 2.7603 | 1.5456 | 1000 | 2.5400 | 0.8918 | 1.0 |
| 2.6694 | 1.7002 | 1100 | 2.4441 | 0.8234 | 1.0 |
| 2.4981 | 1.8547 | 1200 | 2.1997 | 0.7659 | 1.0 |
| 2.3687 | 2.0093 | 1300 | 2.0222 | 0.7077 | 1.0 |
| 2.3179 | 2.1638 | 1400 | 1.9736 | 0.6633 | 1.0 |
| 2.2323 | 2.3184 | 1500 | 2.0691 | 0.5410 | 0.9733 |
| 2.1893 | 2.4730 | 1600 | 1.8443 | 0.5961 | 0.9948 |
| 2.1184 | 2.6275 | 1700 | 1.8428 | 0.5332 | 0.9794 |
| 2.0585 | 2.7821 | 1800 | 1.7327 | 0.5545 | 0.9920 |
| 2.0365 | 2.9366 | 1900 | 1.7461 | 0.5172 | 0.9742 |
| 1.9224 | 3.0912 | 2000 | 1.6829 | 0.4921 | 0.9501 |
| 1.9396 | 3.2457 | 2100 | 1.6787 | 0.4800 | 0.9399 |
| 1.9326 | 3.4003 | 2200 | 1.6375 | 0.5063 | 0.9644 |
| 1.8535 | 3.5549 | 2300 | 1.6068 | 0.4837 | 0.9530 |
| 1.859 | 3.7094 | 2400 | 1.5812 | 0.4838 | 0.9517 |
| 1.8666 | 3.8640 | 2500 | 1.6099 | 0.4670 | 0.9356 |
| 1.8084 | 4.0185 | 2600 | 1.5219 | 0.4707 | 0.9381 |
| 1.7369 | 4.1731 | 2700 | 1.5108 | 0.4377 | 0.9229 |
| 1.7619 | 4.3277 | 2800 | 1.4897 | 0.4434 | 0.9216 |
| 1.7268 | 4.4822 | 2900 | 1.5166 | 0.4525 | 0.9377 |
| 1.6908 | 4.6368 | 3000 | 1.5185 | 0.4541 | 0.9247 |
| 1.7478 | 4.7913 | 3100 | 1.4809 | 0.4382 | 0.9254 |
| 1.7223 | 4.9459 | 3200 | 1.4538 | 0.4354 | 0.9231 |
| 1.6529 | 5.1005 | 3300 | 1.4845 | 0.4355 | 0.9185 |
| 1.633 | 5.2550 | 3400 | 1.4235 | 0.4358 | 0.9252 |
| 1.6486 | 5.4096 | 3500 | 1.3745 | 0.4156 | 0.9040 |
| 1.6362 | 5.5641 | 3600 | 1.4383 | 0.4047 | 0.9683 |
| 1.552 | 5.7187 | 3700 | 1.4306 | 0.4195 | 0.9150 |
| 1.5794 | 5.8733 | 3800 | 1.4064 | 0.4153 | 0.9155 |
| 1.5882 | 6.0278 | 3900 | 1.4026 | 0.4055 | 0.9205 |
| 1.4888 | 6.1824 | 4000 | 1.4291 | 0.4131 | 0.9315 |
| 1.5515 | 6.3369 | 4100 | 1.4047 | 0.4018 | 0.9229 |
| 1.5229 | 6.4915 | 4200 | 1.3459 | 0.3977 | 0.8988 |
| 1.5235 | 6.6461 | 4300 | 1.3754 | 0.3957 | 0.9233 |
| 1.4919 | 6.8006 | 4400 | 1.3355 | 0.3897 | 0.9222 |
| 1.5215 | 6.9552 | 4500 | 1.3391 | 0.3882 | 0.9048 |
| 1.4768 | 7.1097 | 4600 | 1.3710 | 0.3982 | 0.8937 |
| 1.4441 | 7.2643 | 4700 | 1.3814 | 0.3961 | 0.9112 |
| 1.3782 | 7.4189 | 4800 | 1.3806 | 0.4105 | 0.9028 |
| 1.4264 | 7.5734 | 4900 | 1.3845 | 0.3897 | 0.9491 |
| 1.4418 | 7.7280 | 5000 | 1.2996 | 0.3800 | 0.8863 |
| 1.4022 | 7.8825 | 5100 | 1.3480 | 0.3957 | 0.8869 |
| 1.4414 | 8.0371 | 5200 | 1.3391 | 0.3847 | 0.9246 |
| 1.3637 | 8.1917 | 5300 | 1.3786 | 0.3991 | 0.8862 |
| 1.365 | 8.3462 | 5400 | 1.2898 | 0.3729 | 0.8711 |
| 1.3352 | 8.5008 | 5500 | 1.2853 | 0.3774 | 0.8765 |
| 1.3346 | 8.6553 | 5600 | 1.3410 | 0.3974 | 0.8972 |
| 1.3163 | 8.8099 | 5700 | 1.3299 | 0.3836 | 0.9205 |
| 1.38 | 8.9645 | 5800 | 1.4417 | 0.3892 | 0.9467 |
| 1.3366 | 9.1190 | 5900 | 1.4005 | 0.3906 | 0.8904 |
| 1.287 | 9.2736 | 6000 | 1.3038 | 0.3713 | 0.8920 |
| 1.2572 | 9.4281 | 6100 | 1.3145 | 0.3853 | 0.8755 |
| 1.2785 | 9.5827 | 6200 | 1.3774 | 0.3848 | 0.8795 |
| 1.237 | 9.7372 | 6300 | 1.3252 | 0.3853 | 0.8768 |
| 1.2779 | 9.8918 | 6400 | 1.2781 | 0.3737 | 0.8769 |
| 1.2935 | 10.0464 | 6500 | 1.3950 | 0.3881 | 0.8870 |
| 1.1689 | 10.2009 | 6600 | 1.3417 | 0.3744 | 0.8764 |
| 1.2232 | 10.3555 | 6700 | 1.3423 | 0.3755 | 0.8663 |
| 1.2236 | 10.5100 | 6800 | 1.2751 | 0.3684 | 0.8837 |
| 1.2102 | 10.6646 | 6900 | 1.2827 | 0.3716 | 0.8638 |
| 1.2025 | 10.8192 | 7000 | 1.3115 | 0.3707 | 0.8959 |
| 1.2348 | 10.9737 | 7100 | 1.3197 | 0.3633 | 0.8900 |
| 1.1198 | 11.1283 | 7200 | 1.3162 | 0.3682 | 0.8566 |
| 1.1254 | 11.2828 | 7300 | 1.3437 | 0.3709 | 0.8776 |
| 1.1438 | 11.4374 | 7400 | 1.3546 | 0.3753 | 0.8628 |
| 1.1528 | 11.5920 | 7500 | 1.2441 | 0.3541 | 0.8558 |
| 1.1301 | 11.7465 | 7600 | 1.2736 | 0.3697 | 0.8635 |
| 1.1229 | 11.9011 | 7700 | 1.2383 | 0.3597 | 0.8637 |
| 1.1234 | 12.0556 | 7800 | 1.2476 | 0.3583 | 0.8518 |
| 1.0714 | 12.2102 | 7900 | 1.2173 | 0.3515 | 0.8411 |
| 1.0748 | 12.3648 | 8000 | 1.1947 | 0.3478 | 0.8346 |
| 1.0588 | 12.5193 | 8100 | 1.3123 | 0.3533 | 0.8537 |
| 1.0727 | 12.6739 | 8200 | 1.2479 | 0.3562 | 0.8472 |
| 1.0688 | 12.8284 | 8300 | 1.2803 | 0.3600 | 0.8502 |
| 1.1059 | 12.9830 | 8400 | 1.2793 | 0.3529 | 0.8476 |
| 1.0258 | 13.1376 | 8500 | 1.2690 | 0.3636 | 0.8521 |
| 1.012 | 13.2921 | 8600 | 1.2987 | 0.3478 | 0.8704 |
| 0.9649 | 13.4467 | 8700 | 1.4050 | 0.3718 | 0.8624 |
| 0.9875 | 13.6012 | 8800 | 1.2196 | 0.3451 | 0.8436 |
| 1.0292 | 13.7558 | 8900 | 1.2596 | 0.3529 | 0.8320 |
| 1.0342 | 13.9104 | 9000 | 1.2486 | 0.3464 | 0.8497 |
| 1.0011 | 14.0649 | 9100 | 1.2429 | 0.3533 | 0.8423 |
| 0.9401 | 14.2195 | 9200 | 1.3330 | 0.3603 | 0.8569 |
| 0.9597 | 14.3740 | 9300 | 1.2858 | 0.3636 | 0.8501 |
| 0.9457 | 14.5286 | 9400 | 1.1997 | 0.3444 | 0.8472 |
| 0.9661 | 14.6832 | 9500 | 1.1897 | 0.3401 | 0.8425 |
| 0.9555 | 14.8377 | 9600 | 1.2808 | 0.3527 | 0.8334 |
| 0.9833 | 14.9923 | 9700 | 1.2827 | 0.3486 | 0.8392 |
| 0.8874 | 15.1468 | 9800 | 1.4258 | 0.3675 | 0.8566 |
| 0.8955 | 15.3014 | 9900 | 1.2527 | 0.3462 | 0.8229 |
| 0.8936 | 15.4560 | 10000 | 1.2336 | 0.3363 | 0.8394 |
| 0.8948 | 15.6105 | 10100 | 1.2048 | 0.3366 | 0.8244 |
| 0.9023 | 15.7651 | 10200 | 1.3404 | 0.3410 | 0.8237 |
| 0.9056 | 15.9196 | 10300 | 1.3217 | 0.3543 | 0.8379 |
| 0.904 | 16.0742 | 10400 | 1.3742 | 0.3638 | 0.8376 |
| 0.7977 | 16.2287 | 10500 | 1.2720 | 0.3305 | 0.8096 |
| 0.8473 | 16.3833 | 10600 | 1.3408 | 0.3376 | 0.8546 |
| 0.8579 | 16.5379 | 10700 | 1.2352 | 0.3421 | 0.8193 |
| 0.8524 | 16.6924 | 10800 | 1.3313 | 0.3500 | 0.8385 |
| 0.8388 | 16.8470 | 10900 | 1.2586 | 0.3315 | 0.8296 |
| 0.8291 | 17.0015 | 11000 | 1.2370 | 0.3517 | 0.8368 |
| 0.7695 | 17.1561 | 11100 | 1.2832 | 0.3339 | 0.8271 |
| 0.7844 | 17.3107 | 11200 | 1.2764 | 0.3453 | 0.8468 |
| 0.7764 | 17.4652 | 11300 | 1.2639 | 0.3428 | 0.8409 |
| 0.8082 | 17.6198 | 11400 | 1.2235 | 0.3359 | 0.8198 |
| 0.7982 | 17.7743 | 11500 | 1.2658 | 0.3352 | 0.8140 |
| 0.7763 | 17.9289 | 11600 | 1.2210 | 0.3287 | 0.8186 |
| 0.7947 | 18.0835 | 11700 | 1.2735 | 0.3292 | 0.7998 |
| 0.7261 | 18.2380 | 11800 | 1.2937 | 0.3400 | 0.8121 |
| 0.745 | 18.3926 | 11900 | 1.2504 | 0.3385 | 0.8136 |
| 0.7626 | 18.5471 | 12000 | 1.2307 | 0.3260 | 0.8023 |
| 0.725 | 18.7017 | 12100 | 1.2764 | 0.3369 | 0.8021 |
| 0.7459 | 18.8563 | 12200 | 1.2189 | 0.3309 | 0.8020 |
| 0.7395 | 19.0108 | 12300 | 1.2120 | 0.3197 | 0.7927 |
| 0.689 | 19.1654 | 12400 | 1.2412 | 0.3241 | 0.7974 |
| 0.6776 | 19.3199 | 12500 | 1.1921 | 0.3202 | 0.7936 |
| 0.6629 | 19.4745 | 12600 | 1.2220 | 0.3220 | 0.7943 |
| 0.7292 | 19.6291 | 12700 | 1.2844 | 0.3186 | 0.7953 |
| 0.7135 | 19.7836 | 12800 | 1.2520 | 0.3211 | 0.7932 |
| 0.7042 | 19.9382 | 12900 | 1.2142 | 0.3127 | 0.7783 |
| 0.6766 | 20.0927 | 13000 | 1.2213 | 0.3176 | 0.7884 |
| 0.6471 | 20.2473 | 13100 | 1.3010 | 0.3232 | 0.8033 |
| 0.6842 | 20.4019 | 13200 | 1.3410 | 0.3273 | 0.8023 |
| 0.6308 | 20.5564 | 13300 | 1.2506 | 0.3159 | 0.7932 |
| 0.6551 | 20.7110 | 13400 | 1.3087 | 0.3197 | 0.7979 |
| 0.6413 | 20.8655 | 13500 | 1.2666 | 0.3168 | 0.7988 |
| 0.6279 | 21.0201 | 13600 | 1.3987 | 0.3141 | 0.7816 |
| 0.6173 | 21.1747 | 13700 | 1.3371 | 0.3125 | 0.7974 |
| 0.5904 | 21.3292 | 13800 | 1.3081 | 0.3274 | 0.8015 |
| 0.6126 | 21.4838 | 13900 | 1.3683 | 0.3247 | 0.8083 |
| 0.6055 | 21.6383 | 14000 | 1.3113 | 0.3157 | 0.7977 |
| 0.6225 | 21.7929 | 14100 | 1.2706 | 0.3110 | 0.7865 |
| 0.6007 | 21.9474 | 14200 | 1.3128 | 0.3264 | 0.7931 |
| 0.5837 | 22.1020 | 14300 | 1.3846 | 0.3198 | 0.7923 |
| 0.5531 | 22.2566 | 14400 | 1.3394 | 0.3265 | 0.8053 |
| 0.5748 | 22.4111 | 14500 | 1.3888 | 0.3220 | 0.7889 |
| 0.555 | 22.5657 | 14600 | 1.3188 | 0.3242 | 0.7950 |
| 0.5632 | 22.7202 | 14700 | 1.3423 | 0.3216 | 0.7884 |
| 0.5717 | 22.8748 | 14800 | 1.3724 | 0.3133 | 0.8009 |
| 0.5513 | 23.0294 | 14900 | 1.3964 | 0.3183 | 0.7924 |
| 0.5324 | 23.1839 | 15000 | 1.4374 | 0.3187 | 0.8119 |
| 0.5538 | 23.3385 | 15100 | 1.3371 | 0.3176 | 0.7888 |
| 0.5098 | 23.4930 | 15200 | 1.3456 | 0.3139 | 0.7812 |
| 0.5437 | 23.6476 | 15300 | 1.3673 | 0.3183 | 0.7910 |
| 0.5621 | 23.8022 | 15400 | 1.3920 | 0.3204 | 0.7893 |
| 0.5265 | 23.9567 | 15500 | 1.3835 | 0.3211 | 0.7995 |
| 0.4964 | 24.1113 | 15600 | 1.3911 | 0.3153 | 0.7877 |
| 0.489 | 24.2658 | 15700 | 1.4370 | 0.3224 | 0.8030 |
| 0.4907 | 24.4204 | 15800 | 1.4016 | 0.3268 | 0.7973 |
| 0.49 | 24.5750 | 15900 | 1.4152 | 0.3175 | 0.7901 |
| 0.5104 | 24.7295 | 16000 | 1.4569 | 0.3200 | 0.7846 |
| 0.5075 | 24.8841 | 16100 | 1.4782 | 0.3175 | 0.7957 |
| 0.4939 | 25.0386 | 16200 | 1.4216 | 0.3155 | 0.7935 |
| 0.4501 | 25.1932 | 16300 | 1.4857 | 0.3234 | 0.7995 |
| 0.4498 | 25.3478 | 16400 | 1.4586 | 0.3161 | 0.7969 |
| 0.4521 | 25.5023 | 16500 | 1.5067 | 0.3195 | 0.7895 |
| 0.4624 | 25.6569 | 16600 | 1.4472 | 0.3151 | 0.7927 |
| 0.4881 | 25.8114 | 16700 | 1.4872 | 0.3175 | 0.7999 |
| 0.4802 | 25.9660 | 16800 | 1.4425 | 0.3118 | 0.7969 |
| 0.4441 | 26.1206 | 16900 | 1.5231 | 0.3177 | 0.8007 |
| 0.4641 | 26.2751 | 17000 | 1.4957 | 0.3152 | 0.7935 |
| 0.4417 | 26.4297 | 17100 | 1.5059 | 0.3151 | 0.7932 |
| 0.4492 | 26.5842 | 17200 | 1.4759 | 0.3136 | 0.7847 |
| 0.4443 | 26.7388 | 17300 | 1.5226 | 0.3121 | 0.7882 |
| 0.4635 | 26.8934 | 17400 | 1.4899 | 0.3090 | 0.7768 |
| 0.4432 | 27.0479 | 17500 | 1.5173 | 0.3156 | 0.7818 |
| 0.4349 | 27.2025 | 17600 | 1.5088 | 0.3122 | 0.7817 |
| 0.4235 | 27.3570 | 17700 | 1.5343 | 0.3149 | 0.8004 |
| 0.4174 | 27.5116 | 17800 | 1.5429 | 0.3090 | 0.7906 |
| 0.4293 | 27.6662 | 17900 | 1.5188 | 0.3122 | 0.7838 |
| 0.4164 | 27.8207 | 18000 | 1.5407 | 0.3090 | 0.7850 |
| 0.4202 | 27.9753 | 18100 | 1.5115 | 0.3092 | 0.7797 |
| 0.393 | 28.1298 | 18200 | 1.5165 | 0.3082 | 0.7799 |
| 0.3997 | 28.2844 | 18300 | 1.5150 | 0.3093 | 0.7825 |
| 0.4368 | 28.4389 | 18400 | 1.5381 | 0.3092 | 0.7765 |
| 0.372 | 28.5935 | 18500 | 1.5266 | 0.3096 | 0.7876 |
| 0.4138 | 28.7481 | 18600 | 1.5435 | 0.3085 | 0.7823 |
| 0.4082 | 28.9026 | 18700 | 1.5375 | 0.3079 | 0.7825 |
| 0.4009 | 29.0572 | 18800 | 1.5643 | 0.3098 | 0.7856 |
| 0.3883 | 29.2117 | 18900 | 1.5731 | 0.3063 | 0.7829 |
| 0.3836 | 29.3663 | 19000 | 1.5718 | 0.3073 | 0.7820 |
| 0.3939 | 29.5209 | 19100 | 1.5822 | 0.3079 | 0.7827 |
| 0.3915 | 29.6754 | 19200 | 1.5766 | 0.3087 | 0.7829 |
| 0.3958 | 29.8300 | 19300 | 1.5688 | 0.3086 | 0.7827 |
| 0.4069 | 29.9845 | 19400 | 1.5683 | 0.3085 | 0.7821 |
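
The CER and WER columns above can be reproduced from transcripts with the Hugging Face `evaluate` library. Here is a short sketch; the strings are made up purely for illustration.

```python
import evaluate

# Load the standard WER and CER metrics from the evaluate hub.
wer = evaluate.load("wer")
cer = evaluate.load("cer")

# Hypothetical transcripts, for illustration only.
predictions = ["the cat sat on the mat"]
references = ["the cat sat on a mat"]

print("WER:", wer.compute(predictions=predictions, references=references))
print("CER:", cer.compute(predictions=predictions, references=references))
```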

Framework versions

  • Transformers 4.57.2
  • PyTorch 2.9.1+cu128
  • Datasets 3.6.0
  • Tokenizers 0.22.0