ssc-led-model

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.4926
  • CER: 0.3156
  • WER: 0.7297
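For reference, CER and WER are edit-distance error rates computed at the character and word level, respectively: the Levenshtein distance between reference and hypothesis, divided by the reference length. A minimal, dependency-free sketch (the example strings are illustrative, not drawn from the evaluation set):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance via a single-row dynamic-programming table."""
    m, n = len(ref), len(hyp)
    dp = list(range(n + 1))
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(
                dp[j] + 1,                        # deletion
                dp[j - 1] + 1,                    # insertion
                prev + (ref[i - 1] != hyp[j - 1]) # substitution (or match)
            )
            prev = cur
    return dp[n]

def cer(ref, hyp):
    """Character error rate: edit distance over characters / reference length."""
    return edit_distance(list(ref), list(hyp)) / len(ref)

def wer(ref, hyp):
    """Word error rate: edit distance over whitespace-split tokens."""
    return edit_distance(ref.split(), hyp.split()) / len(ref.split())
```

In practice, libraries such as jiwer or the `evaluate` package compute the same quantities, with additional text normalization options.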

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 30
  • mixed_precision_training: Native AMP
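The hyperparameters above map onto a transformers `TrainingArguments` configuration roughly as follows. This is a sketch, not the exact training script; the `output_dir` is an assumption, and dataset/model setup is omitted:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ssc-led-model",      # assumed output path
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 8 * 2 = 16
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=30,
    fp16=True,                       # Native AMP mixed precision
)
```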

Training results

Training Loss  Epoch  Step  Validation Loss  CER  WER
6.557 0.1390 100 3.2742 0.9964 0.9999
3.4632 0.2780 200 3.2694 0.9912 0.9994
3.4078 0.4170 300 3.2839 0.9842 0.9999
3.3831 0.5559 400 3.1970 0.9833 1.0
3.2858 0.6949 500 3.2411 0.9913 0.9998
3.1984 0.8339 600 2.8873 0.9778 0.9999
3.058 0.9729 700 3.1117 0.9838 1.0
2.9646 1.1112 800 2.7581 0.6400 1.0181
2.9609 1.2502 900 2.7426 0.6333 1.0380
2.9051 1.3892 1000 2.7028 0.6283 1.1000
2.7531 1.5281 1100 2.4181 0.6132 1.0336
2.6876 1.6671 1200 2.4824 0.6167 1.0435
2.5923 1.8061 1300 2.2197 0.5982 0.9760
2.4445 1.9451 1400 2.2790 0.5738 1.0593
2.4327 2.0834 1500 2.0705 0.6024 0.9433
2.2965 2.2224 1600 1.9625 0.5488 1.0015
2.251 2.3614 1700 1.9497 0.5375 0.9985
2.2063 2.5003 1800 1.8847 0.5316 1.0227
2.1696 2.6393 1900 1.8468 0.5320 0.9897
2.0999 2.7783 2000 1.8188 0.5418 0.9231
2.106 2.9173 2100 1.7915 0.5226 0.9906
2.0443 3.0556 2200 1.7832 0.5175 0.9703
2.0008 3.1946 2300 1.9252 0.6242 0.9561
1.9581 3.3336 2400 1.7066 0.5050 0.9349
1.9088 3.4726 2500 1.6904 0.5084 0.9193
1.9802 3.6115 2600 1.7291 0.5031 0.9391
1.9073 3.7505 2700 1.6498 0.4955 0.9059
1.8964 3.8895 2800 1.6772 0.4835 0.9511
1.8309 4.0278 2900 1.5232 0.4640 0.9101
1.8294 4.1668 3000 1.5878 0.4740 0.9068
1.7896 4.3058 3100 1.5236 0.4635 0.8732
1.7279 4.4448 3200 1.5316 0.4659 0.8761
1.8116 4.5837 3300 1.5484 0.4494 0.9161
1.7621 4.7227 3400 1.4950 0.4332 0.8972
1.7334 4.8617 3500 1.4884 0.4439 0.8759
1.8286 5.0 3600 1.5486 0.4640 0.8888
1.68 5.1390 3700 1.5825 0.4565 0.9353
1.6566 5.2780 3800 1.5937 0.4680 0.8777
1.6644 5.4170 3900 1.4593 0.4405 0.8580
1.6664 5.5559 4000 1.5155 0.4474 0.8805
1.6503 5.6949 4100 1.6064 0.4754 0.8777
1.6167 5.8339 4200 1.6223 0.4630 0.9703
1.643 5.9729 4300 1.4182 0.4370 0.8597
1.477 6.1112 4400 1.4183 0.4144 0.8682
1.5618 6.2502 4500 1.3755 0.4162 0.8499
1.5647 6.3892 4600 1.4729 0.4105 0.9161
1.5552 6.5281 4700 1.3810 0.4108 0.8432
1.5455 6.6671 4800 1.3562 0.4252 0.8416
1.5155 6.8061 4900 1.3853 0.4014 0.8696
1.5147 6.9451 5000 1.4001 0.3984 0.8850
1.5683 7.0834 5100 1.2923 0.3984 0.8197
1.5013 7.2224 5200 1.3053 0.3908 0.8516
1.4608 7.3614 5300 1.3336 0.3946 0.8734
1.4684 7.5003 5400 1.2944 0.3925 0.8324
1.4936 7.6393 5500 1.3272 0.3852 0.8324
1.4579 7.7783 5600 1.3462 0.4061 0.8576
1.4586 7.9173 5700 1.3090 0.3859 0.8459
1.3563 8.0556 5800 1.3592 0.4145 0.8451
1.4024 8.1946 5900 1.3865 0.3983 0.9042
1.3459 8.3336 6000 1.4191 0.3923 0.8889
1.3772 8.4726 6100 1.3182 0.3893 0.8290
1.3714 8.6115 6200 1.3219 0.4035 0.8205
1.4096 8.7505 6300 1.3486 0.3844 0.8369
1.3768 8.8895 6400 1.2831 0.3810 0.8239
1.3808 9.0278 6500 1.2448 0.3705 0.7927
1.3053 9.1668 6600 1.3261 0.3886 0.8214
1.3379 9.3058 6700 1.2955 0.3741 0.8301
1.2732 9.4448 6800 1.2895 0.3771 0.8382
1.3168 9.5837 6900 1.2404 0.3669 0.8116
1.311 9.7227 7000 1.2535 0.3774 0.8159
1.3142 9.8617 7100 1.2507 0.3743 0.8225
1.4334 10.0 7200 1.2819 0.3881 0.8157
1.2647 10.1390 7300 1.3880 0.3891 0.8348
1.2282 10.2780 7400 1.2940 0.3763 0.8100
1.2965 10.4170 7500 1.3343 0.3774 0.8548
1.2611 10.5559 7600 1.3572 0.3801 0.8199
1.2283 10.6949 7700 1.3654 0.3708 0.8183
1.2699 10.8339 7800 1.2622 0.3679 0.7898
1.2802 10.9729 7900 1.3185 0.3849 0.8088
1.1242 11.1112 8000 1.2298 0.3617 0.7915
1.1502 11.2502 8100 1.2286 0.3525 0.7668
1.1642 11.3892 8200 1.2514 0.3581 0.8106
1.2153 11.5281 8300 1.2211 0.3544 0.7931
1.1378 11.6671 8400 1.2571 0.3652 0.7997
1.1795 11.8061 8500 1.2248 0.3527 0.7937
1.2308 11.9451 8600 1.2390 0.3582 0.7932
1.238 12.0834 8700 1.2046 0.3513 0.7879
1.0873 12.2224 8800 1.1623 0.3424 0.7721
1.1585 12.3614 8900 1.1803 0.3414 0.7795
1.1275 12.5003 9000 1.2063 0.3443 0.7714
1.1469 12.6393 9100 1.1775 0.3444 0.7778
1.1079 12.7783 9200 1.1507 0.3379 0.7576
1.1546 12.9173 9300 1.1499 0.3429 0.7713
1.064 13.0556 9400 1.2232 0.3445 0.7816
1.0603 13.1946 9500 1.2726 0.3697 0.7919
1.0666 13.3336 9600 1.2574 0.3545 0.8084
1.0512 13.4726 9700 1.2035 0.3532 0.7834
1.0698 13.6115 9800 1.2578 0.3527 0.7963
1.0912 13.7505 9900 1.2416 0.3466 0.7682
1.0627 13.8895 10000 1.2337 0.3453 0.7595
1.0987 14.0278 10100 1.1615 0.3369 0.7522
0.9958 14.1668 10200 1.1879 0.3412 0.7602
0.9942 14.3058 10300 1.1657 0.3421 0.7499
0.9759 14.4448 10400 1.1370 0.3482 0.7603
0.999 14.5837 10500 1.2919 0.3459 0.7803
1.0084 14.7227 10600 1.1895 0.3436 0.7621
1.017 14.8617 10700 1.1151 0.3301 0.7453
1.1085 15.0 10800 1.1994 0.3421 0.7856
0.9106 15.1390 10900 1.2353 0.3411 0.7535
0.9494 15.2780 11000 1.2391 0.3534 0.7680
0.9865 15.4170 11100 1.2126 0.3436 0.7789
0.9331 15.5559 11200 1.2147 0.3458 0.7521
0.9954 15.6949 11300 1.3586 0.3612 0.8058
0.9439 15.8339 11400 1.3371 0.3745 0.7877
0.9967 15.9729 11500 1.3378 0.3844 0.7991
0.8549 16.1112 11600 1.2175 0.3354 0.7517
0.8896 16.2502 11700 1.2278 0.3397 0.7657
0.8664 16.3892 11800 1.2651 0.3352 0.7603
0.8753 16.5281 11900 1.1716 0.3290 0.7444
0.8977 16.6671 12000 1.2236 0.3365 0.7479
0.9496 16.8061 12100 1.1973 0.3331 0.7412
0.91 16.9451 12200 1.1738 0.3343 0.7534
0.9119 17.0834 12300 1.1986 0.3325 0.7440
0.8525 17.2224 12400 1.1816 0.3435 0.7520
0.8101 17.3614 12500 1.1608 0.3279 0.7349
0.8687 17.5003 12600 1.1822 0.3338 0.7440
0.8453 17.6393 12700 1.1843 0.3258 0.7545
0.8405 17.7783 12800 1.1765 0.3392 0.7460
0.8665 17.9173 12900 1.1781 0.3304 0.7403
0.7923 18.0556 13000 1.2630 0.3339 0.7420
0.8042 18.1946 13100 1.2656 0.3343 0.7425
0.7833 18.3336 13200 1.2872 0.3359 0.7512
0.8003 18.4726 13300 1.2731 0.3374 0.7513
0.8029 18.6115 13400 1.2229 0.3304 0.7499
0.8214 18.7505 13500 1.2203 0.3259 0.7501
0.7935 18.8895 13600 1.2336 0.3246 0.7390
0.7993 19.0278 13700 1.2858 0.3251 0.7368
0.716 19.1668 13800 1.2969 0.3221 0.7418
0.7621 19.3058 13900 1.2338 0.3268 0.7489
0.7576 19.4448 14000 1.2330 0.3272 0.7390
0.7413 19.5837 14100 1.1901 0.3265 0.7326
0.7763 19.7227 14200 1.2638 0.3293 0.7475
0.7648 19.8617 14300 1.1992 0.3358 0.7456
0.8236 20.0 14400 1.1934 0.3420 0.7650
0.6941 20.1390 14500 1.2392 0.3316 0.7339
0.6681 20.2780 14600 1.3000 0.3249 0.7423
0.698 20.4170 14700 1.1913 0.3200 0.7260
0.7228 20.5559 14800 1.3172 0.3292 0.7378
0.7112 20.6949 14900 1.2621 0.3281 0.7527
0.7178 20.8339 15000 1.3336 0.3263 0.7634
0.7221 20.9729 15100 1.3298 0.3360 0.7485
0.601 21.1112 15200 1.3518 0.3298 0.7473
0.6322 21.2502 15300 1.3088 0.3367 0.7504
0.6384 21.3892 15400 1.2725 0.3278 0.7371
0.679 21.5281 15500 1.3340 0.3287 0.7537
0.6686 21.6671 15600 1.2990 0.3241 0.7401
0.666 21.8061 15700 1.3288 0.3265 0.7418
0.6466 21.9451 15800 1.2629 0.3161 0.7303
0.7039 22.0834 15900 1.2848 0.3208 0.7347
0.6094 22.2224 16000 1.3526 0.3224 0.7461
0.6243 22.3614 16100 1.3610 0.3253 0.7391
0.6053 22.5003 16200 1.3213 0.3196 0.7347
0.6238 22.6393 16300 1.3298 0.3202 0.7341
0.6083 22.7783 16400 1.3180 0.3180 0.7365
0.5848 22.9173 16500 1.3468 0.3208 0.7457
0.5828 23.0556 16600 1.3385 0.3257 0.7404
0.5904 23.1946 16700 1.3450 0.3191 0.7400
0.6162 23.3336 16800 1.3983 0.3278 0.7377
0.5515 23.4726 16900 1.3701 0.3212 0.7412
0.5598 23.6115 17000 1.3807 0.3215 0.7438
0.5535 23.7505 17100 1.4148 0.3301 0.7419
0.6105 23.8895 17200 1.3604 0.3197 0.7393
0.5633 24.0278 17300 1.3609 0.3193 0.7379
0.5327 24.1668 17400 1.3563 0.3185 0.7348
0.5824 24.3058 17500 1.4036 0.3231 0.7402
0.5035 24.4448 17600 1.3861 0.3218 0.7316
0.5411 24.5837 17700 1.3858 0.3218 0.7436
0.5418 24.7227 17800 1.3647 0.3193 0.7374
0.5533 24.8617 17900 1.4045 0.3211 0.7354
0.5557 25.0 18000 1.3967 0.3248 0.7414
0.493 25.1390 18100 1.4083 0.3184 0.7389
0.4945 25.2780 18200 1.3569 0.3174 0.7350
0.5113 25.4170 18300 1.4117 0.3242 0.7432
0.5373 25.5559 18400 1.4016 0.3206 0.7307
0.5291 25.6949 18500 1.4614 0.3221 0.7430
0.5006 25.8339 18600 1.3930 0.3189 0.7280
0.5286 25.9729 18700 1.4034 0.3225 0.7407
0.4113 26.1112 18800 1.4622 0.3169 0.7278
0.4876 26.2502 18900 1.4270 0.3192 0.7333
0.4888 26.3892 19000 1.4628 0.3157 0.7301
0.4572 26.5281 19100 1.4566 0.3175 0.7319
0.4721 26.6671 19200 1.4436 0.3208 0.7409
0.474 26.8061 19300 1.4377 0.3218 0.7391
0.4796 26.9451 19400 1.4474 0.3208 0.7378
0.5193 27.0834 19500 1.4400 0.3168 0.7309
0.4976 27.2224 19600 1.4349 0.3159 0.7302
0.4634 27.3614 19700 1.4753 0.3174 0.7339
0.4465 27.5003 19800 1.4669 0.3168 0.7345
0.4511 27.6393 19900 1.4651 0.3209 0.7390
0.4475 27.7783 20000 1.4643 0.3182 0.7328
0.4309 27.9173 20100 1.4598 0.3161 0.7303
0.4391 28.0556 20200 1.4789 0.3152 0.7373
0.452 28.1946 20300 1.4677 0.3145 0.7280
0.4376 28.3336 20400 1.4662 0.3179 0.7307
0.446 28.4726 20500 1.4773 0.3156 0.7318
0.4332 28.6115 20600 1.4617 0.3166 0.7297
0.4197 28.7505 20700 1.4659 0.3175 0.7336
0.4086 28.8895 20800 1.4816 0.3177 0.7359
0.4456 29.0278 20900 1.4862 0.3176 0.7349
0.4198 29.1668 21000 1.5056 0.3153 0.7317
0.4435 29.3058 21100 1.4887 0.3155 0.7289
0.3888 29.4448 21200 1.4954 0.3151 0.7261
0.4202 29.5837 21300 1.4947 0.3154 0.7289
0.4353 29.7227 21400 1.4926 0.3161 0.7300
0.4154 29.8617 21500 1.4921 0.3161 0.7295
0.4489 30.0 21600 1.4926 0.3156 0.7297
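The step and epoch columns above are internally consistent: 21,600 optimizer steps over 30 epochs gives 720 steps per epoch, which at an effective batch size of 16 implies roughly 11,520 training examples. A back-of-the-envelope check (assuming no dropped final batch):

```python
total_steps = 21600      # final Step in the table above
num_epochs = 30
effective_batch = 8 * 2  # train_batch_size * gradient_accumulation_steps

steps_per_epoch = total_steps // num_epochs
approx_train_examples = steps_per_epoch * effective_batch
print(steps_per_epoch, approx_train_examples)  # 720 11520
```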

Framework versions

  • Transformers 4.57.2
  • PyTorch 2.9.1+cu128
  • Datasets 3.6.0
  • Tokenizers 0.22.0

Model size

  • 0.3B params (Safetensors, F32)