ssc-meh-model

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.8542
  • CER (character error rate): 0.3340
  • WER (word error rate): 0.8345

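For quick use, here is a minimal transcription sketch. It is untested and rests on assumptions: that this checkpoint exposes the standard Wav2Vec2 CTC head and processor under the repo id ctaguchi/ssc-meh-model, and that `sample.wav` is a hypothetical 16 kHz mono recording.

```python
# Minimal inference sketch (assumptions noted above; not the author's code).
import torch
import torchaudio
from transformers import AutoModelForCTC, AutoProcessor

repo_id = "ctaguchi/ssc-meh-model"
processor = AutoProcessor.from_pretrained(repo_id)
model = AutoModelForCTC.from_pretrained(repo_id).eval()

waveform, sr = torchaudio.load("sample.wav")  # hypothetical input file
if sr != 16_000:  # XLS-R checkpoints expect 16 kHz audio
    waveform = torchaudio.functional.resample(waveform, sr, 16_000)

inputs = processor(waveform.squeeze(0).numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits        # shape: (batch, time, vocab)
pred_ids = logits.argmax(dim=-1)
print(processor.batch_decode(pred_ids)[0])  # greedy CTC decode
```
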
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto TrainingArguments follows the list):

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: AdamW (adamw_torch_fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 30
  • mixed_precision_training: Native AMP

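The sketch below shows how these values might map onto transformers.TrainingArguments. This is not the author's training script; `output_dir` is a placeholder, and `fp16=True` is an assumption standing in for "Native AMP".

```python
# Hedged reconstruction of the hyperparameters above as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ssc-meh-model",      # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # effective batch size: 8 * 2 = 16
    seed=42,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=30,
    fp16=True,                       # assumed equivalent of "Native AMP"
)
```
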
Training results

| Training Loss | Epoch | Step | Validation Loss | CER | WER |
|---|---|---|---|---|---|
| 5.7202 | 0.1905 | 100 | 2.9202 | 1.0 | 1.0 |
| 2.9539 | 0.3810 | 200 | 2.7751 | 1.0 | 1.0 |
| 2.9096 | 0.5714 | 300 | 2.7931 | 1.0 | 1.0 |
| 2.9036 | 0.7619 | 400 | 2.7883 | 1.0 | 1.0 |
| 2.8563 | 0.9524 | 500 | 2.9839 | 1.0 | 1.0 |
| 2.7445 | 1.1429 | 600 | 2.5705 | 1.0 | 1.0 |
| 2.6738 | 1.3333 | 700 | 2.4914 | 0.9969 | 0.9992 |
| 2.5821 | 1.5238 | 800 | 2.2302 | 0.9231 | 0.9753 |
| 2.4371 | 1.7143 | 900 | 2.0914 | 0.8529 | 1.0 |
| 2.2815 | 1.9048 | 1000 | 1.8929 | 0.7330 | 0.9997 |
| 2.1294 | 2.0952 | 1100 | 1.8964 | 0.6461 | 0.9673 |
| 1.9156 | 2.2857 | 1200 | 1.8122 | 0.7197 | 0.9931 |
| 1.8408 | 2.4762 | 1300 | 1.7276 | 0.5993 | 0.9602 |
| 1.7506 | 2.6667 | 1400 | 1.6561 | 0.6534 | 0.9657 |
| 1.7086 | 2.8571 | 1500 | 1.7104 | 0.4768 | 0.9349 |
| 1.5834 | 3.0476 | 1600 | 1.4368 | 0.4460 | 0.9084 |
| 1.5731 | 3.2381 | 1700 | 1.4594 | 0.4292 | 0.9379 |
| 1.5743 | 3.4286 | 1800 | 1.4713 | 0.4183 | 0.9731 |
| 1.4766 | 3.6190 | 1900 | 1.3967 | 0.4334 | 0.9114 |
| 1.4799 | 3.8095 | 2000 | 1.5286 | 0.4260 | 0.9550 |
| 1.4973 | 4.0 | 2100 | 1.7988 | 0.4115 | 1.1627 |
| 1.3498 | 4.1905 | 2200 | 1.4739 | 0.4768 | 0.9209 |
| 1.3642 | 4.3810 | 2300 | 1.3685 | 0.4241 | 0.9290 |
| 1.361 | 4.5714 | 2400 | 1.3578 | 0.4705 | 0.9097 |
| 1.3747 | 4.7619 | 2500 | 1.4424 | 0.4560 | 0.9060 |
| 1.3728 | 4.9524 | 2600 | 1.5908 | 0.3978 | 0.9865 |
| 1.2524 | 5.1429 | 2700 | 1.3019 | 0.3814 | 0.9119 |
| 1.2532 | 5.3333 | 2800 | 1.3969 | 0.3960 | 0.9056 |
| 1.2527 | 5.5238 | 2900 | 1.2484 | 0.3659 | 0.9150 |
| 1.2347 | 5.7143 | 3000 | 1.5303 | 0.4045 | 1.0358 |
| 1.2531 | 5.9048 | 3100 | 1.3037 | 0.3827 | 0.8702 |
| 1.2506 | 6.0952 | 3200 | 1.4484 | 0.3815 | 0.9621 |
| 1.1295 | 6.2857 | 3300 | 1.4467 | 0.3843 | 0.9590 |
| 1.1507 | 6.4762 | 3400 | 1.3074 | 0.4033 | 0.8888 |
| 1.146 | 6.6667 | 3500 | 1.6379 | 0.4109 | 1.0024 |
| 1.1812 | 6.8571 | 3600 | 1.3674 | 0.4137 | 0.9033 |
| 1.063 | 7.0476 | 3700 | 1.2695 | 0.3719 | 0.9200 |
| 1.0615 | 7.2381 | 3800 | 1.2816 | 0.3645 | 0.9137 |
| 1.0779 | 7.4286 | 3900 | 1.3308 | 0.3772 | 0.9202 |
| 1.059 | 7.6190 | 4000 | 1.1988 | 0.3983 | 0.8730 |
| 1.0706 | 7.8095 | 4100 | 1.2735 | 0.3771 | 0.8901 |
| 1.1306 | 8.0 | 4200 | 1.3397 | 0.4057 | 0.8777 |
| 0.965 | 8.1905 | 4300 | 1.4966 | 0.3934 | 1.0157 |
| 1.0081 | 8.3810 | 4400 | 1.2140 | 0.3899 | 0.8674 |
| 1.0052 | 8.5714 | 4500 | 1.2490 | 0.3873 | 0.8774 |
| 1.0262 | 8.7619 | 4600 | 1.2793 | 0.3705 | 0.8633 |
| 0.9791 | 8.9524 | 4700 | 1.1578 | 0.3542 | 0.8501 |
| 0.8975 | 9.1429 | 4800 | 1.1994 | 0.3646 | 0.8731 |
| 0.9043 | 9.3333 | 4900 | 1.1890 | 0.3426 | 0.8523 |
| 0.9641 | 9.5238 | 5000 | 1.2537 | 0.3604 | 0.9406 |
| 0.9526 | 9.7143 | 5100 | 1.2081 | 0.3549 | 0.8456 |
| 0.905 | 9.9048 | 5200 | 1.2440 | 0.3614 | 0.9021 |
| 0.9099 | 10.0952 | 5300 | 1.3497 | 0.3703 | 0.9114 |
| 0.8435 | 10.2857 | 5400 | 1.1731 | 0.3861 | 0.8658 |
| 0.8914 | 10.4762 | 5500 | 1.3744 | 0.3745 | 0.9399 |
| 0.879 | 10.6667 | 5600 | 1.2298 | 0.3559 | 0.8850 |
| 0.8754 | 10.8571 | 5700 | 1.3239 | 0.3508 | 0.9467 |
| 0.8348 | 11.0476 | 5800 | 1.2166 | 0.3551 | 0.8572 |
| 0.781 | 11.2381 | 5900 | 1.2808 | 0.3511 | 0.9238 |
| 0.8081 | 11.4286 | 6000 | 1.2105 | 0.3496 | 0.8970 |
| 0.8023 | 11.6190 | 6100 | 1.1924 | 0.3434 | 0.8323 |
| 0.8236 | 11.8095 | 6200 | 1.2141 | 0.3453 | 0.9049 |
| 0.892 | 12.0 | 6300 | 1.2798 | 0.3720 | 0.9378 |
| 0.7549 | 12.1905 | 6400 | 1.1700 | 0.3318 | 0.8809 |
| 0.7428 | 12.3810 | 6500 | 1.3049 | 0.3628 | 0.8782 |
| 0.765 | 12.5714 | 6600 | 1.2451 | 0.3449 | 0.9348 |
| 0.7747 | 12.7619 | 6700 | 1.3216 | 0.3714 | 0.8654 |
| 0.7448 | 12.9524 | 6800 | 1.3266 | 0.3633 | 0.8783 |
| 0.6694 | 13.1429 | 6900 | 1.2095 | 0.3411 | 0.8247 |
| 0.6949 | 13.3333 | 7000 | 1.3466 | 0.3589 | 0.8935 |
| 0.721 | 13.5238 | 7100 | 1.2278 | 0.3478 | 0.8737 |
| 0.7298 | 13.7143 | 7200 | 1.2689 | 0.3533 | 0.9153 |
| 0.6928 | 13.9048 | 7300 | 1.2013 | 0.3433 | 0.8368 |
| 0.7061 | 14.0952 | 7400 | 1.3039 | 0.3781 | 0.8541 |
| 0.6477 | 14.2857 | 7500 | 1.2427 | 0.3407 | 0.8845 |
| 0.6752 | 14.4762 | 7600 | 1.3106 | 0.3536 | 0.8776 |
| 0.6498 | 14.6667 | 7700 | 1.2447 | 0.3412 | 0.8615 |
| 0.6626 | 14.8571 | 7800 | 1.3016 | 0.3621 | 0.8733 |
| 0.5813 | 15.0476 | 7900 | 1.2265 | 0.3362 | 0.8301 |
| 0.6195 | 15.2381 | 8000 | 1.2933 | 0.3445 | 0.8307 |
| 0.5909 | 15.4286 | 8100 | 1.2376 | 0.3351 | 0.8288 |
| 0.6048 | 15.6190 | 8200 | 1.2648 | 0.3374 | 0.8197 |
| 0.6235 | 15.8095 | 8300 | 1.2515 | 0.3410 | 0.8602 |
| 0.6721 | 16.0 | 8400 | 1.2712 | 0.3422 | 0.8459 |
| 0.5156 | 16.1905 | 8500 | 1.3692 | 0.3527 | 0.8760 |
| 0.5312 | 16.3810 | 8600 | 1.3574 | 0.3438 | 0.8598 |
| 0.5859 | 16.5714 | 8700 | 1.2556 | 0.3398 | 0.8362 |
| 0.5454 | 16.7619 | 8800 | 1.3780 | 0.3574 | 0.8749 |
| 0.5552 | 16.9524 | 8900 | 1.2868 | 0.3461 | 0.8953 |
| 0.484 | 17.1429 | 9000 | 1.2784 | 0.3434 | 0.8335 |
| 0.5364 | 17.3333 | 9100 | 1.2629 | 0.3339 | 0.8294 |
| 0.5188 | 17.5238 | 9200 | 1.3447 | 0.3329 | 0.8361 |
| 0.4979 | 17.7143 | 9300 | 1.3755 | 0.3332 | 0.8532 |
| 0.5313 | 17.9048 | 9400 | 1.2748 | 0.3397 | 0.8486 |
| 0.5102 | 18.0952 | 9500 | 1.3303 | 0.3363 | 0.8214 |
| 0.4997 | 18.2857 | 9600 | 1.3119 | 0.3453 | 0.8571 |
| 0.475 | 18.4762 | 9700 | 1.4079 | 0.3446 | 0.8573 |
| 0.4989 | 18.6667 | 9800 | 1.3263 | 0.3335 | 0.8518 |
| 0.4915 | 18.8571 | 9900 | 1.3138 | 0.3410 | 0.8241 |
| 0.4469 | 19.0476 | 10000 | 1.3820 | 0.3309 | 0.8491 |
| 0.4391 | 19.2381 | 10100 | 1.3812 | 0.3344 | 0.8371 |
| 0.4481 | 19.4286 | 10200 | 1.3321 | 0.3434 | 0.8303 |
| 0.4482 | 19.6190 | 10300 | 1.3548 | 0.3356 | 0.8298 |
| 0.4592 | 19.8095 | 10400 | 1.3296 | 0.3405 | 0.8610 |
| 0.4886 | 20.0 | 10500 | 1.3636 | 0.3515 | 0.8321 |
| 0.4023 | 20.1905 | 10600 | 1.4382 | 0.3514 | 0.8467 |
| 0.4332 | 20.3810 | 10700 | 1.2829 | 0.3319 | 0.8508 |
| 0.4154 | 20.5714 | 10800 | 1.3922 | 0.3423 | 0.8313 |
| 0.4276 | 20.7619 | 10900 | 1.4043 | 0.3459 | 0.8460 |
| 0.4316 | 20.9524 | 11000 | 1.4012 | 0.3346 | 0.8272 |
| 0.351 | 21.1429 | 11100 | 1.4923 | 0.3404 | 0.8368 |
| 0.3941 | 21.3333 | 11200 | 1.4509 | 0.3438 | 0.8382 |
| 0.3883 | 21.5238 | 11300 | 1.4189 | 0.3359 | 0.8258 |
| 0.4208 | 21.7143 | 11400 | 1.4527 | 0.3411 | 0.8344 |
| 0.3843 | 21.9048 | 11500 | 1.5000 | 0.3474 | 0.8349 |
| 0.42 | 22.0952 | 11600 | 1.6168 | 0.3509 | 0.8666 |
| 0.3638 | 22.2857 | 11700 | 1.5645 | 0.3518 | 0.8581 |
| 0.3763 | 22.4762 | 11800 | 1.4347 | 0.3441 | 0.8419 |
| 0.3637 | 22.6667 | 11900 | 1.6041 | 0.3466 | 0.8635 |
| 0.3717 | 22.8571 | 12000 | 1.5876 | 0.3400 | 0.8466 |
| 0.3398 | 23.0476 | 12100 | 1.5634 | 0.3378 | 0.8281 |
| 0.347 | 23.2381 | 12200 | 1.4949 | 0.3316 | 0.8112 |
| 0.3493 | 23.4286 | 12300 | 1.5127 | 0.3427 | 0.8347 |
| 0.338 | 23.6190 | 12400 | 1.5340 | 0.3458 | 0.8423 |
| 0.3456 | 23.8095 | 12500 | 1.5608 | 0.3492 | 0.8532 |
| 0.3829 | 24.0 | 12600 | 1.5481 | 0.3391 | 0.8569 |
| 0.331 | 24.1905 | 12700 | 1.5683 | 0.3323 | 0.8539 |
| 0.305 | 24.3810 | 12800 | 1.6475 | 0.3298 | 0.8240 |
| 0.3244 | 24.5714 | 12900 | 1.5566 | 0.3375 | 0.8227 |
| 0.3397 | 24.7619 | 13000 | 1.5676 | 0.3368 | 0.8393 |
| 0.3221 | 24.9524 | 13100 | 1.5839 | 0.3376 | 0.8391 |
| 0.2943 | 25.1429 | 13200 | 1.7326 | 0.3316 | 0.8413 |
| 0.3197 | 25.3333 | 13300 | 1.6815 | 0.3329 | 0.8492 |
| 0.3011 | 25.5238 | 13400 | 1.6816 | 0.3359 | 0.8492 |
| 0.3152 | 25.7143 | 13500 | 1.6790 | 0.3391 | 0.8683 |
| 0.3088 | 25.9048 | 13600 | 1.6628 | 0.3428 | 0.8819 |
| 0.336 | 26.0952 | 13700 | 1.7037 | 0.3340 | 0.8444 |
| 0.2778 | 26.2857 | 13800 | 1.7835 | 0.3479 | 0.8627 |
| 0.2775 | 26.4762 | 13900 | 1.7454 | 0.3359 | 0.8630 |
| 0.286 | 26.6667 | 14000 | 1.7595 | 0.3359 | 0.8395 |
| 0.2724 | 26.8571 | 14100 | 1.7468 | 0.3387 | 0.8408 |
| 0.2663 | 27.0476 | 14200 | 1.6998 | 0.3352 | 0.8452 |
| 0.2941 | 27.2381 | 14300 | 1.8233 | 0.3367 | 0.8526 |
| 0.2888 | 27.4286 | 14400 | 1.7692 | 0.3353 | 0.8241 |
| 0.2511 | 27.6190 | 14500 | 1.8221 | 0.3371 | 0.8425 |
| 0.2623 | 27.8095 | 14600 | 1.8068 | 0.3400 | 0.8403 |
| 0.2851 | 28.0 | 14700 | 1.8250 | 0.3398 | 0.8417 |
| 0.2557 | 28.1905 | 14800 | 1.8385 | 0.3405 | 0.8464 |
| 0.2844 | 28.3810 | 14900 | 1.8351 | 0.3359 | 0.8456 |
| 0.25 | 28.5714 | 15000 | 1.8201 | 0.3326 | 0.8261 |
| 0.267 | 28.7619 | 15100 | 1.8391 | 0.3376 | 0.8375 |
| 0.2409 | 28.9524 | 15200 | 1.8450 | 0.3371 | 0.8430 |
| 0.2263 | 29.1429 | 15300 | 1.8316 | 0.3356 | 0.8336 |
| 0.2579 | 29.3333 | 15400 | 1.8472 | 0.3338 | 0.8391 |
| 0.2424 | 29.5238 | 15500 | 1.8536 | 0.3328 | 0.8357 |
| 0.2612 | 29.7143 | 15600 | 1.8561 | 0.3341 | 0.8339 |
| 0.2644 | 29.9048 | 15700 | 1.8542 | 0.3340 | 0.8345 |
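
The CER and WER columns are the standard character and word error rates. The sketch below computes them with the `evaluate` library (jiwer-backed); the card does not state which implementation was actually used, and the strings shown are hypothetical:

```python
# Hedged sketch: computing CER/WER as in the table above with `evaluate`
# (requires `pip install evaluate jiwer`).
import evaluate

cer = evaluate.load("cer")
wer = evaluate.load("wer")

predictions = ["helo wrld"]    # hypothetical model output
references = ["hello world"]   # hypothetical ground-truth transcript

print(cer.compute(predictions=predictions, references=references))
print(wer.compute(predictions=predictions, references=references))
```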

Framework versions

  • Transformers 4.57.2
  • PyTorch 2.9.1+cu128
  • Datasets 3.6.0
  • Tokenizers 0.22.0

Model size

0.3B parameters, stored as Safetensors with F32 tensors.