ssc-el-CY-model

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 3.7970
  • CER: 0.9445
  • WER: 0.9999
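For readers unfamiliar with the metrics: CER and WER are normalized edit distances over characters and whitespace-split words, respectively. A minimal illustrative sketch (this is not the exact implementation used during this model's evaluation):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (space-optimized)."""
    dp = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, dp[0] = dp[0], i
        for j, h in enumerate(hyp, 1):
            # min of deletion, insertion, substitution
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (r != h))
    return dp[-1]

def cer(ref, hyp):
    """Character Error Rate: char-level edits / reference length."""
    return edit_distance(list(ref), list(hyp)) / max(len(ref), 1)

def wer(ref, hyp):
    """Word Error Rate: word-level edits / reference word count."""
    ref_words, hyp_words = ref.split(), hyp.split()
    return edit_distance(ref_words, hyp_words) / max(len(ref_words), 1)
```

A WER of 0.9999 therefore means nearly every reference word requires an edit; in practice libraries such as jiwer or the evaluate package compute these metrics.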

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 30
  • mixed_precision_training: Native AMP
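A quick consistency check on the derived quantities above (values copied from the list; the step count is taken from the results table below):

```python
# Derived quantities from the listed hyperparameters.
train_batch_size = 8
gradient_accumulation_steps = 2

# Effective (total) train batch size = per-device batch * accumulation steps.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 16, matching total_train_batch_size above

# Warmup covers only a small fraction of training:
# 100 of the ~15,900 optimizer steps logged in the results table.
warmup_steps, final_step = 100, 15900
print(round(warmup_steps / final_step, 4))
```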

Training results

Training Loss | Epoch | Step | Validation Loss | CER | WER
6.6563 0.1883 100 3.5095 0.9958 1.0
3.5193 0.3766 200 3.5593 0.9958 1.0
3.9095 0.5650 300 3.3924 0.9958 1.0
3.6929 0.7533 400 3.3523 0.9958 1.0
3.5347 0.9416 500 3.4397 0.9958 1.0
3.48 1.1299 600 3.3551 0.9958 1.0
3.5095 1.3183 700 3.3840 0.9958 1.0
3.4933 1.5066 800 3.3772 0.9958 1.0
3.4675 1.6949 900 3.4572 0.9958 1.0
3.4735 1.8832 1000 3.4976 0.9843 0.9995
3.4659 2.0716 1100 3.2917 0.9922 1.0
3.4585 2.2599 1200 3.2933 0.9915 0.9829
3.4264 2.4482 1300 3.3039 0.9915 0.9830
3.4353 2.6365 1400 3.2864 0.9914 0.9830
3.4617 2.8249 1500 3.2812 0.9916 0.9834
3.4312 3.0132 1600 3.3325 0.9915 0.9830
3.3992 3.2015 1700 3.2775 0.9915 0.9833
3.3732 3.3898 1800 3.4619 0.9857 0.9998
3.3619 3.5782 1900 3.4054 0.9916 0.9900
3.3602 3.7665 2000 3.3870 0.9911 0.9870
3.3476 3.9548 2100 3.3257 0.9914 0.9838
3.3112 4.1431 2200 3.4524 0.9901 0.9891
3.3166 4.3315 2300 3.3271 0.9913 0.9859
3.3059 4.5198 2400 3.4313 0.9914 0.9835
3.2864 4.7081 2500 3.2850 0.9852 0.9944
3.2992 4.8964 2600 3.3862 0.9895 0.9885
3.2491 5.0847 2700 3.3486 0.9914 0.9831
3.2507 5.2731 2800 3.2751 0.9911 0.9841
3.1689 5.4614 2900 3.3519 0.9852 0.9994
3.1484 5.6497 3000 3.3401 0.9841 0.9944
3.0872 5.8380 3100 3.1924 0.9804 0.9976
2.9885 6.0264 3200 3.0345 0.9180 1.0
2.9312 6.2147 3300 2.9692 0.8770 1.0
2.8406 6.4030 3400 2.9859 0.8145 1.0
2.7876 6.5913 3500 2.8495 0.8633 0.9999
2.7776 6.7797 3600 2.8264 0.8823 1.0
2.6837 6.9680 3700 2.8850 0.7760 0.9999
2.6025 7.1563 3800 2.7891 0.7636 0.9982
2.5765 7.3446 3900 2.7292 0.8369 1.0
2.5398 7.5330 4000 2.8252 0.7108 0.9896
2.5163 7.7213 4100 2.7143 0.7544 0.9988
2.4676 7.9096 4200 2.6224 0.7611 0.9976
2.4589 8.0979 4300 2.7647 0.7194 0.9957
2.4198 8.2863 4400 2.6359 0.7458 0.9981
2.3952 8.4746 4500 2.6756 0.7132 0.9939
2.3696 8.6629 4600 2.6740 0.7052 0.9911
2.3592 8.8512 4700 2.6290 0.7753 1.0
2.2945 9.0395 4800 2.5619 0.7157 0.9953
2.3084 9.2279 4900 2.6623 0.7370 0.9991
2.3028 9.4162 5000 2.6039 0.6887 0.9909
2.2883 9.6045 5100 2.6003 0.7232 0.9965
2.3452 9.7928 5200 2.5988 0.7459 0.9964
2.305 9.9812 5300 2.6939 0.7314 0.9973
2.3447 10.1695 5400 2.6968 0.7463 0.9994
2.3686 10.3578 5500 2.6992 0.7833 0.9993
2.4429 10.5461 5600 2.6674 0.7284 0.9991
2.4139 10.7345 5700 2.6574 0.7682 0.9988
2.405 10.9228 5800 2.6806 0.7395 0.9999
2.4375 11.1111 5900 2.6505 0.7145 0.9993
2.4065 11.2994 6000 2.8557 0.7743 0.9999
2.5238 11.4878 6100 2.8419 0.7982 0.9999
2.6059 11.6761 6200 2.7876 0.8675 1.0
2.5852 11.8644 6300 2.8252 0.8875 1.0
2.5776 12.0527 6400 2.7428 0.9291 1.0
2.5932 12.2411 6500 2.7696 0.8976 0.9999
2.6125 12.4294 6600 2.7940 0.9234 1.0
2.6714 12.6177 6700 2.7963 0.9418 1.0
2.6844 12.8060 6800 2.9210 0.9862 0.9995
2.7054 12.9944 6900 2.8712 0.9838 0.9990
2.6509 13.1827 7000 2.9206 0.9718 0.9998
2.7351 13.3710 7100 2.9365 0.9731 0.9995
2.7265 13.5593 7200 2.9177 0.9943 0.9984
2.817 13.7476 7300 2.9292 0.9881 0.9972
2.7506 13.9360 7400 2.9558 0.9909 0.9979
2.799 14.1243 7500 2.9085 0.9486 0.9998
2.7322 14.3126 7600 2.8605 0.9283 1.0
2.6716 14.5009 7700 2.8886 0.8909 1.0
2.6592 14.6893 7800 2.8992 0.8688 1.0
2.6646 14.8776 7900 2.9344 0.8686 1.0
2.724 15.0659 8000 2.9167 0.9103 1.0
2.8567 15.2542 8100 2.9886 0.9291 1.0
3.3181 15.4426 8200 3.2414 0.9603 0.9999
3.6228 15.6309 8300 4.0603 0.9935 0.9999
3.2903 15.8192 8400 3.1761 0.9610 0.9999
3.3319 16.0075 8500 3.1989 0.9701 0.9996
3.1339 16.1959 8600 3.4163 0.9837 0.9997
3.1833 16.3842 8700 3.2594 0.9772 0.9996
3.1457 16.5725 8800 3.2544 0.9769 0.9995
3.1402 16.7608 8900 3.2880 0.9776 0.9993
3.1477 16.9492 9000 3.2459 0.9755 0.9994
3.0863 17.1375 9100 3.2432 0.9612 0.9997
3.1396 17.3258 9200 3.2105 0.9637 0.9997
3.0992 17.5141 9300 3.2458 0.9566 0.9994
3.1286 17.7024 9400 3.2120 0.9605 0.9995
3.1312 17.8908 9500 3.2466 0.9570 0.9994
3.1014 18.0791 9600 3.2039 0.9668 0.9995
3.1079 18.2674 9700 3.2022 0.9632 0.9997
3.1021 18.4557 9800 3.2071 0.9608 0.9996
3.0741 18.6441 9900 3.2062 0.9593 0.9993
3.0982 18.8324 10000 3.2004 0.9591 0.9995
3.1016 19.0207 10100 3.2120 0.9552 0.9995
3.0939 19.2090 10200 3.2339 0.9659 0.9997
3.0702 19.3974 10300 3.2106 0.9597 0.9998
3.0876 19.5857 10400 3.1958 0.9546 0.9996
3.091 19.7740 10500 3.2048 0.9605 0.9996
3.0678 19.9623 10600 3.2073 0.9614 0.9997
3.0627 20.1507 10700 3.2349 0.9421 0.9988
3.115 20.3390 10800 3.2771 0.9384 0.9984
3.0859 20.5273 10900 3.2839 0.9350 0.9979
3.0783 20.7156 11000 3.2618 0.9380 0.9984
3.063 20.9040 11100 3.2845 0.9318 0.9968
3.1253 21.0923 11200 3.2100 0.9575 0.9996
3.1022 21.2806 11300 3.2077 0.9570 0.9994
3.0577 21.4689 11400 3.2107 0.9566 0.9991
3.1657 21.6573 11500 3.2189 0.9536 0.9994
3.104 21.8456 11600 3.2216 0.9526 0.9995
3.1261 22.0339 11700 3.2874 0.9331 0.9979
3.1522 22.2222 11800 3.2801 0.9351 0.9982
3.1181 22.4105 11900 3.2882 0.9340 0.9980
3.1137 22.5989 12000 3.3047 0.9278 0.9970
3.1867 22.7872 12100 3.3426 0.9236 0.9974
3.2373 22.9755 12200 3.3194 0.9272 0.9976
3.1979 23.1638 12300 3.3497 0.9263 0.9977
3.2752 23.3522 12400 3.3670 0.9256 0.9978
3.2831 23.5405 12500 3.3733 0.9209 0.9973
3.1949 23.7288 12600 3.4054 0.9129 0.9963
3.2813 23.9171 12700 3.4314 0.9127 0.9968
3.2636 24.1055 12800 3.3522 0.9395 0.9992
3.3405 24.2938 12900 3.3866 0.9271 0.9981
3.3643 24.4821 13000 3.3755 0.9349 0.9988
3.3405 24.6704 13100 3.4095 0.9257 0.9982
3.3849 24.8588 13200 3.4442 0.9180 0.9976
3.4133 25.0471 13300 3.5845 0.8960 0.9952
3.4747 25.2354 13400 3.5972 0.8955 0.9950
3.3754 25.4237 13500 3.5783 0.9000 0.9954
3.6186 25.6121 13600 3.6207 0.8970 0.9952
3.5257 25.8004 13700 3.5450 0.9204 0.9982
3.5552 25.9887 13800 3.5970 0.9126 0.9978
3.5488 26.1770 13900 3.5955 0.9252 0.9987
3.7153 26.3653 14000 3.6388 0.9249 0.9990
3.5743 26.5537 14100 3.6605 0.9255 0.9993
3.5399 26.7420 14200 3.6604 0.9284 0.9993
3.5299 26.9303 14300 3.6808 0.9279 0.9994
3.5682 27.1186 14400 3.6842 0.9275 0.9993
3.6293 27.3070 14500 3.6912 0.9284 0.9994
3.7195 27.4953 14600 3.6877 0.9300 0.9993
3.7357 27.6836 14700 3.6994 0.9340 0.9994
3.6613 27.8719 14800 3.6993 0.9319 0.9995
3.6398 28.0603 14900 3.7152 0.9317 0.9994
3.7112 28.2486 15000 3.7308 0.9372 0.9999
3.6608 28.4369 15100 3.7434 0.9404 0.9999
3.7395 28.6252 15200 3.7598 0.9421 0.9999
3.7235 28.8136 15300 3.7651 0.9412 0.9999
3.6748 29.0019 15400 3.7793 0.9431 0.9999
3.7072 29.1902 15500 3.7839 0.9440 0.9999
3.7432 29.3785 15600 3.7883 0.9437 0.9999
3.692 29.5669 15700 3.7917 0.9438 0.9999
3.8549 29.7552 15800 3.7958 0.9445 1.0
3.6699 29.9435 15900 3.7970 0.9445 0.9999
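Note that the final checkpoint (step 15900) is not the best one in this log: validation loss and CER bottom out around epochs 9-10 and degrade afterwards. A small sketch of selecting the best checkpoint, with a few rows copied from the table as (step, val_loss, cer, wer):

```python
# Sample rows copied from the training log above.
rows = [
    (100,   3.5095, 0.9958, 1.0),
    (4800,  2.5619, 0.7157, 0.9953),
    (5000,  2.6039, 0.6887, 0.9909),
    (15900, 3.7970, 0.9445, 0.9999),
]

best_by_loss = min(rows, key=lambda r: r[1])  # lowest validation loss
best_by_cer = min(rows, key=lambda r: r[2])   # lowest CER
print(best_by_loss[0])  # 4800
print(best_by_cer[0])   # 5000
```

When training with the Hugging Face Trainer, setting load_best_model_at_end=True with metric_for_best_model would retain such a checkpoint automatically rather than the final one.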

Framework versions

  • Transformers 4.57.2
  • PyTorch 2.9.1+cu128
  • Datasets 3.6.0
  • Tokenizers 0.22.0
Model tree for ctaguchi/ssc-el-CY-model

  • Fine-tuned from facebook/wav2vec2-xls-r-300m