ssc-hch-model

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 2.4915
  • CER (character error rate): 0.7560
  • WER (word error rate): 1.0
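As a rough illustration of how the CER and WER above could be reproduced, the sketch below loads the checkpoint with `transformers` and scores a single transcription with the `evaluate` CER/WER metrics. This is a minimal sketch, not the original evaluation script: the repository id `ctaguchi/ssc-hch-model`, the audio file, and the reference transcript are assumptions, since the evaluation dataset is not documented.

```python
# Minimal sketch (not the official evaluation script): transcribe one utterance
# and compute CER/WER against a reference. Repo id, audio file, and reference
# text are placeholders / assumptions.
import torch
import librosa
import evaluate
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "ctaguchi/ssc-hch-model"  # assumed repository id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Hypothetical 16 kHz evaluation utterance and its reference transcript.
speech, _ = librosa.load("example.wav", sr=16_000)
reference = "expected transcription"

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits
pred_ids = torch.argmax(logits, dim=-1)
prediction = processor.batch_decode(pred_ids)[0]

cer = evaluate.load("cer").compute(predictions=[prediction], references=[reference])
wer = evaluate.load("wer").compute(predictions=[prediction], references=[reference])
print(f"CER: {cer:.4f}  WER: {wer:.4f}")
```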

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 30
  • mixed_precision_training: Native AMP
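These settings correspond roughly to the `TrainingArguments` sketch below. This is an assumption-laden reconstruction, not the original training script: the `output_dir` is a placeholder, and the dataset, preprocessing, data collator, and `Trainer` wiring are omitted.

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
# output_dir is a placeholder; data and Trainer setup are not shown.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ssc-hch-model",      # placeholder, not from the card
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # effective total train batch size 16
    seed=42,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=30,
    fp16=True,                       # native AMP mixed-precision training
)
```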

Training results

| Training Loss | Epoch | Step | Validation Loss | CER | WER |
|---|---|---|---|---|---|
| 5.8531 | 0.2506 | 100 | 3.0776 | 0.9917 | 1.0 |
| 3.1404 | 0.5013 | 200 | 2.9427 | 0.9917 | 1.0 |
| 3.5462 | 0.7519 | 300 | 3.0055 | 0.9917 | 1.0 |
| 3.3231 | 1.0025 | 400 | 2.9823 | 0.9917 | 1.0 |
| 3.2095 | 1.2531 | 500 | 2.9268 | 0.9917 | 1.0 |
| 3.1251 | 1.5038 | 600 | 2.9311 | 0.9917 | 1.0 |
| 3.1566 | 1.7544 | 700 | 2.9446 | 0.9917 | 1.0 |
| 3.1212 | 2.0050 | 800 | 2.9112 | 0.9917 | 1.0 |
| 3.0708 | 2.2556 | 900 | 2.9260 | 0.9829 | 1.0 |
| 3.0319 | 2.5063 | 1000 | 3.8026 | 0.9829 | 1.0 |
| 3.0086 | 2.7569 | 1100 | 3.3550 | 0.9829 | 1.0 |
| 2.9547 | 3.0075 | 1200 | 2.9463 | 0.9832 | 1.0 |
| 2.8971 | 3.2581 | 1300 | 3.1363 | 0.9661 | 1.0 |
| 2.8063 | 3.5088 | 1400 | 2.9367 | 0.9473 | 1.0 |
| 2.7296 | 3.7594 | 1500 | 3.0035 | 0.8580 | 1.0 |
| 2.5223 | 4.0100 | 1600 | 2.4125 | 0.7232 | 1.0 |
| 2.3708 | 4.2607 | 1700 | 2.4144 | 0.6792 | 1.0 |
| 2.3276 | 4.5113 | 1800 | 2.3535 | 0.6109 | 1.0 |
| 2.2372 | 4.7619 | 1900 | 2.1567 | 0.5972 | 1.0 |
| 2.136 | 5.0125 | 2000 | 2.2649 | 0.5880 | 0.9996 |
| 2.0764 | 5.2632 | 2100 | 2.0068 | 0.5701 | 1.0 |
| 2.0301 | 5.5138 | 2200 | 1.9794 | 0.5818 | 1.0 |
| 1.9486 | 5.7644 | 2300 | 2.0473 | 0.5687 | 0.9995 |
| 1.9485 | 6.0150 | 2400 | 1.9562 | 0.5288 | 1.0 |
| 1.8988 | 6.2657 | 2500 | 1.8791 | 0.5423 | 1.0 |
| 1.9147 | 6.5163 | 2600 | 1.8219 | 0.5453 | 1.0 |
| 1.8067 | 6.7669 | 2700 | 1.8768 | 0.5321 | 0.9998 |
| 1.7998 | 7.0175 | 2800 | 1.9343 | 0.5243 | 0.9996 |
| 1.7891 | 7.2682 | 2900 | 1.8222 | 0.5198 | 0.9998 |
| 1.7434 | 7.5188 | 3000 | 1.8008 | 0.5444 | 1.0 |
| 1.7674 | 7.7694 | 3100 | 1.9248 | 0.5005 | 0.9989 |
| 1.708 | 8.0201 | 3200 | 1.7520 | 0.5447 | 0.9995 |
| 1.7056 | 8.2707 | 3300 | 1.8185 | 0.5321 | 0.9995 |
| 1.6462 | 8.5213 | 3400 | 1.7989 | 0.5222 | 0.9998 |
| 1.6381 | 8.7719 | 3500 | 1.8162 | 0.5100 | 0.9986 |
| 1.6466 | 9.0226 | 3600 | 1.7556 | 0.5289 | 1.0 |
| 1.6018 | 9.2732 | 3700 | 1.6908 | 0.5010 | 0.9991 |
| 1.606 | 9.5238 | 3800 | 1.6967 | 0.4951 | 0.9977 |
| 1.6045 | 9.7744 | 3900 | 1.7846 | 0.5240 | 0.9982 |
| 1.5963 | 10.0251 | 4000 | 1.7657 | 0.5304 | 0.9995 |
| 1.5649 | 10.2757 | 4100 | 1.7817 | 0.5407 | 0.9995 |
| 1.5553 | 10.5263 | 4200 | 1.7213 | 0.5265 | 0.9993 |
| 1.5201 | 10.7769 | 4300 | 1.7401 | 0.5195 | 1.0 |
| 1.5254 | 11.0276 | 4400 | 1.8063 | 0.4984 | 0.9998 |
| 1.5068 | 11.2782 | 4500 | 1.8452 | 0.5403 | 1.0 |
| 1.5342 | 11.5288 | 4600 | 2.0170 | 0.5464 | 0.9998 |
| 1.7503 | 11.7794 | 4700 | 2.2264 | 0.5262 | 1.0 |
| 1.8808 | 12.0301 | 4800 | 1.9860 | 0.5120 | 1.0 |
| 1.9139 | 12.2807 | 4900 | 2.2726 | 0.5269 | 1.0 |
| 1.9509 | 12.5313 | 5000 | 1.9673 | 0.5404 | 1.0 |
| 2.0051 | 12.7820 | 5100 | 2.2250 | 0.5604 | 1.0 |
| 1.9579 | 13.0326 | 5200 | 2.1640 | 0.5959 | 1.0 |
| 1.8771 | 13.2832 | 5300 | 2.1336 | 0.5563 | 1.0 |
| 1.9823 | 13.5338 | 5400 | 2.2691 | 0.7077 | 1.0 |
| 2.1507 | 13.7845 | 5500 | 2.1997 | 0.6465 | 1.0 |
| 2.0604 | 14.0351 | 5600 | 2.3030 | 0.8980 | 1.0 |
| 2.1699 | 14.2857 | 5700 | 2.4337 | 0.8201 | 1.0 |
| 2.2419 | 14.5363 | 5800 | 2.4167 | 0.9188 | 1.0 |
| 2.2654 | 14.7870 | 5900 | 2.3791 | 0.9183 | 1.0 |
| 2.2122 | 15.0376 | 6000 | 2.3923 | 0.8131 | 1.0 |
| 2.1877 | 15.2882 | 6100 | 2.4024 | 0.7830 | 1.0 |
| 2.1916 | 15.5388 | 6200 | 2.4181 | 0.8381 | 1.0 |
| 2.1353 | 15.7895 | 6300 | 2.4418 | 0.8358 | 1.0 |
| 2.1799 | 16.0401 | 6400 | 2.4687 | 0.8174 | 1.0 |
| 2.2087 | 16.2907 | 6500 | 2.5634 | 0.8334 | 1.0 |
| 2.2825 | 16.5414 | 6600 | 2.5488 | 0.8351 | 1.0 |
| 2.3422 | 16.7920 | 6700 | 2.5349 | 0.8768 | 1.0 |
| 2.3002 | 17.0426 | 6800 | 2.5182 | 0.8676 | 1.0 |
| 2.3231 | 17.2932 | 6900 | 2.5707 | 0.8705 | 1.0 |
| 2.3212 | 17.5439 | 7000 | 2.5670 | 0.8576 | 1.0 |
| 2.294 | 17.7945 | 7100 | 2.5520 | 0.8529 | 1.0 |
| 2.3093 | 18.0451 | 7200 | 2.5077 | 0.8755 | 1.0 |
| 2.2777 | 18.2957 | 7300 | 2.5170 | 0.8419 | 1.0 |
| 2.2669 | 18.5464 | 7400 | 2.5190 | 0.8236 | 1.0 |
| 2.2838 | 18.7970 | 7500 | 2.4830 | 0.8454 | 1.0 |
| 2.2629 | 19.0476 | 7600 | 2.4891 | 0.8255 | 1.0 |
| 2.2429 | 19.2982 | 7700 | 2.4772 | 0.8228 | 1.0 |
| 2.2664 | 19.5489 | 7800 | 2.4824 | 0.8192 | 1.0 |
| 2.2216 | 19.7995 | 7900 | 2.4672 | 0.8239 | 1.0 |
| 2.2353 | 20.0501 | 8000 | 2.4799 | 0.7995 | 1.0 |
| 2.2275 | 20.3008 | 8100 | 2.4589 | 0.8008 | 1.0 |
| 2.2194 | 20.5514 | 8200 | 2.4732 | 0.7896 | 1.0 |
| 2.2165 | 20.8020 | 8300 | 2.4684 | 0.7895 | 1.0 |
| 2.1849 | 21.0526 | 8400 | 2.4801 | 0.7780 | 1.0 |
| 2.1829 | 21.3033 | 8500 | 2.4796 | 0.7700 | 1.0 |
| 2.1967 | 21.5539 | 8600 | 2.4524 | 0.7785 | 1.0 |
| 2.1703 | 21.8045 | 8700 | 2.4873 | 0.7599 | 1.0 |
| 2.195 | 22.0551 | 8800 | 2.4935 | 0.7567 | 1.0 |
| 2.2086 | 22.3058 | 8900 | 2.4447 | 0.7694 | 1.0 |
| 2.1866 | 22.5564 | 9000 | 2.4866 | 0.7501 | 1.0 |
| 2.1733 | 22.8070 | 9100 | 2.4694 | 0.7529 | 1.0 |
| 2.1637 | 23.0576 | 9200 | 2.4896 | 0.7430 | 1.0 |
| 2.1756 | 23.3083 | 9300 | 2.4785 | 0.7428 | 1.0 |
| 2.1634 | 23.5589 | 9400 | 2.5084 | 0.7336 | 1.0 |
| 2.1894 | 23.8095 | 9500 | 2.4891 | 0.7340 | 1.0 |
| 2.178 | 24.0602 | 9600 | 2.4982 | 0.7314 | 1.0 |
| 2.1725 | 24.3108 | 9700 | 2.4581 | 0.7392 | 1.0 |
| 2.1795 | 24.5614 | 9800 | 2.4721 | 0.7328 | 1.0 |
| 2.1679 | 24.8120 | 9900 | 2.4810 | 0.7300 | 1.0 |
| 2.2202 | 25.0627 | 10000 | 2.4784 | 0.7240 | 1.0 |
| 2.2071 | 25.3133 | 10100 | 2.5258 | 0.7131 | 1.0 |
| 2.2473 | 25.5639 | 10200 | 2.5072 | 0.7158 | 1.0 |
| 2.2474 | 25.8145 | 10300 | 2.5417 | 0.7117 | 1.0 |
| 2.2283 | 26.0652 | 10400 | 2.5465 | 0.7092 | 1.0 |
| 2.2543 | 26.3158 | 10500 | 2.5444 | 0.7102 | 1.0 |
| 2.2358 | 26.5664 | 10600 | 2.5406 | 0.7119 | 1.0 |
| 2.2889 | 26.8170 | 10700 | 2.4930 | 0.7231 | 1.0 |
| 2.2624 | 27.0677 | 10800 | 2.5067 | 0.7179 | 1.0 |
| 2.2619 | 27.3183 | 10900 | 2.4803 | 0.7338 | 1.0 |
| 2.2208 | 27.5689 | 11000 | 2.4662 | 0.7391 | 1.0 |
| 2.254 | 27.8195 | 11100 | 2.4554 | 0.7454 | 1.0 |
| 2.2567 | 28.0702 | 11200 | 2.4682 | 0.7418 | 1.0 |
| 2.2286 | 28.3208 | 11300 | 2.4771 | 0.7381 | 1.0 |
| 2.2648 | 28.5714 | 11400 | 2.4754 | 0.7405 | 1.0 |
| 2.2666 | 28.8221 | 11500 | 2.4745 | 0.7449 | 1.0 |
| 2.2456 | 29.0727 | 11600 | 2.4793 | 0.7509 | 1.0 |
| 2.2692 | 29.3233 | 11700 | 2.4868 | 0.7522 | 1.0 |
| 2.2602 | 29.5739 | 11800 | 2.4893 | 0.7541 | 1.0 |
| 2.2932 | 29.8246 | 11900 | 2.4915 | 0.7560 | 1.0 |

Framework versions

  • Transformers 4.57.2
  • Pytorch 2.9.1+cu128
  • Datasets 3.6.0
  • Tokenizers 0.22.0