ssc-cgg-model

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.2753
  • CER: 0.2625
  • WER: 0.8266

Model description

More information needed

Intended uses & limitations

More information needed
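
Since the card reports CER and WER and the base model is wav2vec2-xls-r-300m, the checkpoint is presumably a CTC model for automatic speech recognition. Below is a minimal inference sketch, assuming the repository id ctaguchi/ssc-cgg-model and 16 kHz mono input audio (the sampling rate expected by XLS-R models); the audio file name is hypothetical.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Load the fine-tuned checkpoint and its processor (feature extractor + tokenizer).
model_id = "ctaguchi/ssc-cgg-model"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load an audio file and resample to the 16 kHz rate expected by XLS-R models.
waveform, sample_rate = torchaudio.load("example.wav")  # hypothetical input file
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

# Run CTC inference and greedy-decode the predicted token ids.
inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```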

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training; an illustrative TrainingArguments reconstruction follows the list:

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 30
  • mixed_precision_training: Native AMP

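The hyperparameters above correspond roughly to the following transformers Trainer configuration. This is a sketch reconstructed from the listed values, not the actual training script; dataset loading, the processor, the CTC data collator, and the compute_metrics function are not documented here and are omitted.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the training arguments from the values listed above.
training_args = TrainingArguments(
    output_dir="ssc-cgg-model",
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # effective batch size 8 * 2 = 16
    seed=42,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=30,
    fp16=True,                       # native AMP mixed-precision training
    eval_strategy="steps",
    eval_steps=100,                  # matches the 100-step evaluation interval in the results table
    logging_steps=100,
)
```
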
Training results

Training Loss  Epoch  Step  Validation Loss  CER  WER
4.774 0.1864 100 2.9668 0.9919 1.0
3.2708 0.3728 200 2.9400 0.9919 1.0
2.8413 0.5592 300 2.8321 0.9919 1.0
2.674 0.7456 400 2.6069 0.9055 1.0
2.4214 0.9320 500 2.3585 0.7714 1.0
2.0978 1.1174 600 1.8840 0.6053 1.0
1.9201 1.3038 700 1.7315 0.5596 0.9999
1.8358 1.4902 800 1.7703 0.5090 0.9960
1.7211 1.6766 900 1.5671 0.4943 0.9927
1.6456 1.8630 1000 1.5215 0.4965 0.9921
1.6238 2.0485 1100 1.5559 0.5125 0.9971
1.4901 2.2349 1200 1.4645 0.4661 0.9902
1.5001 2.4212 1300 1.4230 0.4627 0.9884
1.4129 2.6076 1400 1.4149 0.4542 0.9870
1.4087 2.7940 1500 1.3809 0.4403 0.9839
1.3885 2.9804 1600 1.4125 0.4351 0.9868
1.3267 3.1659 1700 1.3841 0.4168 1.0238
1.2669 3.3523 1800 1.4178 0.4564 0.9826
1.2492 3.5387 1900 1.3101 0.4087 0.9711
1.242 3.7251 2000 1.2938 0.4311 0.9801
1.2144 3.9115 2100 1.3929 0.4172 0.9994
1.2334 4.0969 2200 1.2291 0.3985 0.9631
1.1318 4.2833 2300 1.2567 0.4069 0.9707
1.1588 4.4697 2400 1.2027 0.3812 0.9539
1.1203 4.6561 2500 1.3124 0.4197 0.9724
1.117 4.8425 2600 1.1737 0.3909 0.9588
1.0582 5.0280 2700 1.1263 0.3584 0.9352
0.9921 5.2144 2800 1.1103 0.3723 0.9463
1.0199 5.4007 2900 1.0575 0.3422 0.9376
1.0181 5.5871 3000 1.1219 0.3554 0.9393
1.0148 5.7735 3100 1.0687 0.3575 0.9324
1.0273 5.9599 3200 1.1047 0.3482 0.9379
0.9599 6.1454 3300 1.0959 0.3559 1.0169
0.9171 6.3318 3400 1.0382 0.3395 0.9403
0.8826 6.5182 3500 1.0086 0.3233 0.9215
0.9008 6.7046 3600 1.0598 0.3788 0.9343
0.9485 6.8910 3700 1.0619 0.3312 0.9480
0.9102 7.0764 3800 1.0945 0.3456 0.9804
0.8499 7.2628 3900 1.1076 0.3549 0.9228
0.8106 7.4492 4000 1.0951 0.3419 0.9167
0.8579 7.6356 4100 1.0706 0.3327 0.9749
0.8635 7.8220 4200 1.1565 0.4120 0.9346
0.8444 8.0075 4300 1.0254 0.3385 0.9159
0.7355 8.1938 4400 1.0250 0.3325 0.9072
0.7618 8.3802 4500 1.0682 0.3577 0.9127
0.7531 8.5666 4600 1.0006 0.3206 0.8979
0.7563 8.7530 4700 0.9997 0.3212 0.8957
0.7501 8.9394 4800 1.0075 0.3317 0.9032
0.7056 9.1249 4900 1.0037 0.2974 0.9229
0.6826 9.3113 5000 1.0271 0.2989 0.8882
0.6654 9.4977 5100 1.0321 0.3169 0.8862
0.6754 9.6841 5200 0.9978 0.2944 0.8738
0.6553 9.8705 5300 1.0132 0.3110 0.8893
0.6753 10.0559 5400 0.9893 0.2966 0.8859
0.6155 10.2423 5500 1.0846 0.3168 0.8874
0.6251 10.4287 5600 0.9949 0.3005 0.8828
0.6091 10.6151 5700 1.0041 0.2948 0.8805
0.6384 10.8015 5800 0.9808 0.3000 0.8831
0.626 10.9879 5900 1.0686 0.3335 0.8965
0.5875 11.1733 6000 1.0360 0.3006 0.8880
0.5614 11.3597 6100 1.0932 0.3177 0.8948
0.5736 11.5461 6200 1.0129 0.2963 0.8779
0.5448 11.7325 6300 1.0100 0.2968 0.8727
0.5637 11.9189 6400 1.0889 0.3037 0.9863
0.54 12.1044 6500 0.9968 0.3057 0.8769
0.5235 12.2908 6600 0.9887 0.2921 0.8667
0.5097 12.4772 6700 0.9957 0.3077 0.8841
0.4934 12.6636 6800 1.0222 0.2951 0.8740
0.5302 12.8500 6900 0.9966 0.3023 0.8727
0.4885 13.0354 7000 1.0340 0.2882 0.8657
0.467 13.2218 7100 1.0304 0.2894 0.8620
0.4771 13.4082 7200 0.9793 0.2794 0.8788
0.4577 13.5946 7300 1.0244 0.2904 0.8622
0.4654 13.7810 7400 1.0196 0.2927 0.8703
0.4663 13.9674 7500 1.0139 0.2795 0.8585
0.4335 14.1528 7600 1.0385 0.2799 0.8613
0.4012 14.3392 7700 1.0115 0.2849 0.8758
0.4507 14.5256 7800 0.9969 0.2835 0.8626
0.4558 14.7120 7900 1.0308 0.2897 0.8659
0.4293 14.8984 8000 1.0390 0.2858 0.8608
0.4586 15.0839 8100 1.0493 0.2890 0.9158
0.4009 15.2703 8200 1.0469 0.2914 0.8598
0.3904 15.4567 8300 1.0786 0.2937 0.8703
0.396 15.6431 8400 1.1096 0.2961 0.8705
0.3706 15.8295 8500 1.0184 0.2945 0.875
0.4348 16.0149 8600 1.0295 0.2826 0.8649
0.3488 16.2013 8700 1.0228 0.2874 0.8738
0.3503 16.3877 8800 1.0451 0.2790 0.8547
0.3749 16.5741 8900 1.0531 0.2838 0.8579
0.3409 16.7605 9000 1.0078 0.2846 0.8627
0.3743 16.9469 9100 1.0457 0.2826 0.8726
0.3218 17.1323 9200 1.1266 0.2860 0.8592
0.3168 17.3187 9300 1.0841 0.2868 0.8760
0.3328 17.5051 9400 1.0564 0.2846 0.8762
0.3293 17.6915 9500 1.0349 0.2811 0.8397
0.3189 17.8779 9600 1.0325 0.2773 0.8554
0.3382 18.0634 9700 1.1433 0.2761 0.8425
0.2802 18.2498 9800 1.1109 0.2821 0.8526
0.3114 18.4362 9900 1.0888 0.2846 0.8488
0.2919 18.6226 10000 1.0904 0.2840 0.8451
0.3049 18.8089 10100 1.1035 0.2848 0.8546
0.3129 18.9953 10200 1.1737 0.2825 0.8553
0.2868 19.1808 10300 1.0775 0.2818 0.8528
0.2777 19.3672 10400 1.1083 0.2904 0.8593
0.29 19.5536 10500 1.1036 0.2853 0.8638
0.2948 19.7400 10600 1.1181 0.2834 0.8493
0.3025 19.9264 10700 1.1364 0.2874 0.8691
0.2742 20.1118 10800 1.1488 0.2792 0.8434
0.264 20.2982 10900 1.1320 0.2729 0.8396
0.2652 20.4846 11000 1.1178 0.2773 0.8376
0.2625 20.6710 11100 1.1140 0.2764 0.8345
0.2555 20.8574 11200 1.0976 0.2782 0.8407
0.2827 21.0429 11300 1.1076 0.2818 0.8487
0.2372 21.2293 11400 1.1379 0.2763 0.8480
0.2506 21.4157 11500 1.1521 0.2703 0.8361
0.244 21.6021 11600 1.1719 0.2790 0.8459
0.2477 21.7884 11700 1.1591 0.2749 0.8526
0.2322 21.9748 11800 1.1166 0.2706 0.8373
0.2441 22.1603 11900 1.1119 0.2711 0.8497
0.2151 22.3467 12000 1.1887 0.2791 0.8441
0.2297 22.5331 12100 1.1586 0.2791 0.8493
0.207 22.7195 12200 1.1546 0.2833 0.8496
0.2133 22.9059 12300 1.1111 0.2849 0.8508
0.2245 23.0913 12400 1.1927 0.2806 0.8579
0.2048 23.2777 12500 1.1808 0.2769 0.8400
0.2061 23.4641 12600 1.1464 0.2747 0.8599
0.1879 23.6505 12700 1.1850 0.2737 0.8369
0.2131 23.8369 12800 1.2191 0.2707 0.8405
0.1956 24.0224 12900 1.2319 0.2704 0.8430
0.1903 24.2088 13000 1.2206 0.2691 0.8437
0.1716 24.3952 13100 1.2839 0.2757 0.8461
0.1973 24.5815 13200 1.2227 0.2703 0.8451
0.1933 24.7679 13300 1.2229 0.2746 0.8430
0.1845 24.9543 13400 1.2350 0.2712 0.8390
0.1697 25.1398 13500 1.2309 0.2684 0.8387
0.1763 25.3262 13600 1.1853 0.2668 0.8322
0.1809 25.5126 13700 1.2074 0.2666 0.8379
0.1696 25.6990 13800 1.2031 0.2657 0.8368
0.154 25.8854 13900 1.2128 0.2656 0.8333
0.1946 26.0708 14000 1.2620 0.2685 0.8361
0.1546 26.2572 14100 1.2554 0.2706 0.8356
0.15 26.4436 14200 1.2318 0.2653 0.8374
0.1549 26.6300 14300 1.2386 0.2645 0.8285
0.1679 26.8164 14400 1.2317 0.2665 0.8314
0.1921 27.0019 14500 1.2301 0.2650 0.8369
0.1552 27.1883 14600 1.2628 0.2679 0.8383
0.1492 27.3747 14700 1.2523 0.2666 0.8301
0.146 27.5610 14800 1.2497 0.2669 0.8263
0.1444 27.7474 14900 1.2734 0.2645 0.8416
0.1463 27.9338 15000 1.2638 0.2659 0.8317
0.148 28.1193 15100 1.2823 0.2643 0.8340
0.1473 28.3057 15200 1.2645 0.2649 0.8285
0.1328 28.4921 15300 1.2629 0.2629 0.8298
0.132 28.6785 15400 1.2607 0.2634 0.8253
0.1434 28.8649 15500 1.2769 0.2644 0.8280
0.1392 29.0503 15600 1.2883 0.2649 0.8272
0.1318 29.2367 15700 1.2793 0.2649 0.8264
0.1301 29.4231 15800 1.2833 0.2639 0.8258
0.1297 29.6095 15900 1.2851 0.2638 0.8269
0.1315 29.7959 16000 1.2768 0.2626 0.8260
0.1279 29.9823 16100 1.2753 0.2625 0.8266

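The CER and WER columns above are standard character and word error rates. A minimal sketch of how such metrics can be computed with the Hugging Face evaluate library follows; the predictions and references here are hypothetical examples, not the actual evaluation data.

```python
import evaluate

# Load the WER and CER metrics from the Hugging Face Hub.
wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Hypothetical predictions and references for illustration only.
predictions = ["hello wrold", "good morning"]
references = ["hello world", "good morning"]

wer = wer_metric.compute(predictions=predictions, references=references)
cer = cer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}, CER: {cer:.4f}")
```
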
Framework versions

  • Transformers 4.57.2
  • PyTorch 2.9.1+cu128
  • Datasets 3.6.0
  • Tokenizers 0.22.0