ssc-kcn-model

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: nan
  • Cer: 0.9940
  • Wer: 1.0
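
A WER of 1.0 with CER near 1.0 means essentially nothing was transcribed correctly, consistent with the NaN loss. For reference, WER and CER are Levenshtein edit distances normalized by reference length (over words and characters, respectively); a minimal, self-contained sketch in plain Python, not the evaluation code used for this run:

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (strings or lists)."""
    m, n = len(ref), len(hyp)
    dp = list(range(n + 1))  # dp[j] = distance between ref[:i] and hyp[:j]
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,        # deletion
                        dp[j - 1] + 1,    # insertion
                        prev + (ref[i - 1] != hyp[j - 1]))  # substitution
            prev = cur
    return dp[n]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: char-level edit distance / reference length."""
    return edit_distance(reference, hypothesis) / len(reference)
```

In practice the `evaluate`/`jiwer` implementations are normally used; the sketch above just makes the normalization explicit.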

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 30
  • mixed_precision_training: Native AMP
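
Assuming the standard Hugging Face `Trainer` setup that produces this kind of card, the hyperparameters above roughly correspond to a `TrainingArguments` configuration like the following (a sketch; `output_dir` and anything not listed above are illustrative, not taken from the original run):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ssc-kcn-model",      # illustrative; not reported in the run
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 8 * 2 = 16
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=30,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```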

Training results

| Training Loss | Epoch | Step | Validation Loss | Cer | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|
| 5.3972 | 0.1736 | 100 | 2.9985 | 0.9942 | 1.0 |
| 3.1473 | 0.3472 | 200 | 2.9632 | 0.9942 | 1.0 |
| 3.2868 | 0.5208 | 300 | 2.9640 | 0.9942 | 1.0 |
| 3.1277 | 0.6944 | 400 | 2.8740 | 0.9942 | 1.0 |
| 3.1015 | 0.8681 | 500 | 2.8538 | 0.9942 | 1.0 |
| 3.0851 | 1.0417 | 600 | 2.9601 | 0.9942 | 1.0 |
| 3.0431 | 1.2153 | 700 | 2.8703 | 0.9942 | 1.0 |
| 3.0662 | 1.3889 | 800 | 3.0285 | 0.9942 | 1.0 |
| 3.0358 | 1.5625 | 900 | 2.9641 | 0.9942 | 1.0 |
| 2.9999 | 1.7361 | 1000 | 3.0095 | 0.9942 | 1.0 |
| 3.0629 | 1.9097 | 1100 | 2.8662 | 0.9942 | 1.0 |
| 3.033 | 2.0833 | 1200 | 2.8406 | 0.9942 | 1.0 |
| 2.9814 | 2.2569 | 1300 | 2.8291 | 0.9942 | 1.0 |
| 3.0079 | 2.4306 | 1400 | 2.8602 | 0.9942 | 1.0 |
| 3.0814 | 2.6042 | 1500 | 2.8299 | 0.9942 | 1.0 |
| 3.0949 | 2.7778 | 1600 | 2.8245 | 0.9942 | 1.0 |
| 3.1026 | 2.9514 | 1700 | 2.8080 | 0.9942 | 1.0 |
| 3.0229 | 3.125 | 1800 | 2.9045 | 0.9942 | 1.0 |
| 3.0744 | 3.2986 | 1900 | 2.8780 | 0.9942 | 1.0 |
| 3.0559 | 3.4722 | 2000 | 2.9353 | 0.9942 | 1.0 |
| 3.0577 | 3.6458 | 2100 | 2.9453 | 0.9942 | 1.0 |
| 3.0844 | 3.8194 | 2200 | 3.0902 | 0.9942 | 1.0 |
| 3.2003 | 3.9931 | 2300 | 2.9683 | 0.9942 | 1.0 |
| 3.233 | 4.1667 | 2400 | 3.2298 | 0.9942 | 1.0 |
| 3.4188 | 4.3403 | 2500 | 3.3186 | 0.9942 | 1.0 |
| 3.5046 | 4.5139 | 2600 | 3.3847 | 0.9942 | 1.0 |
| 3.5387 | 4.6875 | 2700 | 3.4096 | 0.9942 | 1.0 |
| 3.517 | 4.8611 | 2800 | 3.4097 | 0.9942 | 1.0 |
| 3.5483 | 5.0347 | 2900 | 3.4097 | 0.9942 | 1.0 |
| 3.5612 | 5.2083 | 3000 | 3.4097 | 0.9942 | 1.0 |
| 3.5469 | 5.3819 | 3100 | 3.4097 | 0.9942 | 1.0 |
| 3.5421 | 5.5556 | 3200 | 3.4097 | 0.9942 | 1.0 |
| 3.5395 | 5.7292 | 3300 | 3.4097 | 0.9942 | 1.0 |
| 3.5186 | 5.9028 | 3400 | 3.4097 | 0.9942 | 1.0 |
| 3.5325 | 6.0764 | 3500 | 3.4097 | 0.9942 | 1.0 |
| 3.5384 | 6.25 | 3600 | 3.4097 | 0.9942 | 1.0 |
| 3.5326 | 6.4236 | 3700 | 3.4097 | 0.9942 | 1.0 |
| 3.513 | 6.5972 | 3800 | 3.4097 | 0.9942 | 1.0 |
| 3.5751 | 6.7708 | 3900 | 3.4097 | 0.9942 | 1.0 |
| 3.5294 | 6.9444 | 4000 | 3.4097 | 0.9942 | 1.0 |
| 3.545 | 7.1181 | 4100 | 3.4097 | 0.9942 | 1.0 |
| 3.5764 | 7.2917 | 4200 | 3.4097 | 0.9942 | 1.0 |
| 3.5258 | 7.4653 | 4300 | 3.4097 | 0.9942 | 1.0 |
| 3.514 | 7.6389 | 4400 | 3.4097 | 0.9942 | 1.0 |
| 3.5538 | 7.8125 | 4500 | 3.4097 | 0.9942 | 1.0 |
| 3.5231 | 7.9861 | 4600 | 3.4097 | 0.9942 | 1.0 |
| 3.5428 | 8.1597 | 4700 | 3.4097 | 0.9942 | 1.0 |
| 3.5614 | 8.3333 | 4800 | 3.4097 | 0.9942 | 1.0 |
| 3.5369 | 8.5069 | 4900 | 3.4097 | 0.9942 | 1.0 |
| 3.5457 | 8.6806 | 5000 | 3.4097 | 0.9942 | 1.0 |
| 3.5007 | 8.8542 | 5100 | 3.4097 | 0.9942 | 1.0 |
| 3.5424 | 9.0278 | 5200 | 3.4097 | 0.9942 | 1.0 |
| 3.5172 | 9.2014 | 5300 | 3.4097 | 0.9942 | 1.0 |
| 3.5408 | 9.375 | 5400 | 3.4097 | 0.9942 | 1.0 |
| 3.5382 | 9.5486 | 5500 | 3.4097 | 0.9942 | 1.0 |
| 3.5658 | 9.7222 | 5600 | 3.4097 | 0.9942 | 1.0 |
| 3.5203 | 9.8958 | 5700 | 3.4097 | 0.9942 | 1.0 |
| 3.5484 | 10.0694 | 5800 | 3.4097 | 0.9942 | 1.0 |
| 3.5278 | 10.2431 | 5900 | 3.4097 | 0.9942 | 1.0 |
| 3.5498 | 10.4167 | 6000 | 3.4097 | 0.9941 | 1.0 |
| 3.5557 | 10.5903 | 6100 | 3.4097 | 0.9942 | 1.0 |
| 3.5199 | 10.7639 | 6200 | 3.4097 | 0.9942 | 1.0 |
| 3.5358 | 10.9375 | 6300 | 3.4097 | 0.9942 | 1.0 |
| 3.5481 | 11.1111 | 6400 | 3.4097 | 0.9942 | 1.0 |
| 3.5652 | 11.2847 | 6500 | 3.4097 | 0.9942 | 1.0 |
| 3.5113 | 11.4583 | 6600 | 3.4097 | 0.9942 | 1.0 |
| 3.5728 | 11.6319 | 6700 | 3.4097 | 0.9942 | 1.0 |
| 3.512 | 11.8056 | 6800 | 3.4097 | 0.9942 | 1.0 |
| 3.5222 | 11.9792 | 6900 | 3.4097 | 0.9942 | 1.0 |
| 3.5793 | 12.1528 | 7000 | 3.4097 | 0.9942 | 1.0 |
| 3.5378 | 12.3264 | 7100 | 3.4097 | 0.9942 | 1.0 |
| 3.5463 | 12.5 | 7200 | 3.4097 | 0.9942 | 1.0 |
| 3.5221 | 12.6736 | 7300 | 3.4097 | 0.9942 | 1.0 |
| 3.5232 | 12.8472 | 7400 | 3.4097 | 0.9942 | 1.0 |
| 3.5345 | 13.0208 | 7500 | 3.4097 | 0.9942 | 1.0 |
| 3.545 | 13.1944 | 7600 | 3.4097 | 0.9942 | 1.0 |
| 3.5559 | 13.3681 | 7700 | 3.4097 | 0.9942 | 1.0 |
| 3.5081 | 13.5417 | 7800 | 3.4097 | 0.9942 | 1.0 |
| 3.5349 | 13.7153 | 7900 | 3.4097 | 0.9942 | 1.0 |
| 3.5193 | 13.8889 | 8000 | 3.4097 | 0.9942 | 1.0 |
| 3.5653 | 14.0625 | 8100 | 3.4097 | 0.9942 | 1.0 |
| 3.5493 | 14.2361 | 8200 | 3.4097 | 0.9942 | 1.0 |
| 3.525 | 14.4097 | 8300 | 3.4097 | 0.9942 | 1.0 |
| 3.5106 | 14.5833 | 8400 | 3.4097 | 0.9942 | 1.0 |
| 3.5627 | 14.7569 | 8500 | 3.4097 | 0.9942 | 1.0 |
| 3.5593 | 14.9306 | 8600 | 3.4097 | 0.9942 | 1.0 |
| 3.5159 | 15.1042 | 8700 | 3.4097 | 0.9941 | 1.0 |
| 3.5424 | 15.2778 | 8800 | 3.4097 | 0.9942 | 1.0 |
| 3.5829 | 15.4514 | 8900 | 3.4097 | 0.9942 | 1.0 |
| 3.5305 | 15.625 | 9000 | 3.4097 | 0.9942 | 1.0 |
| 3.5183 | 15.7986 | 9100 | 3.4097 | 0.9942 | 1.0 |
| 3.541 | 15.9722 | 9200 | 3.4097 | 0.9942 | 1.0 |
| 3.559 | 16.1458 | 9300 | 3.4097 | 0.9942 | 1.0 |
| 3.5281 | 16.3194 | 9400 | 3.4097 | 0.9942 | 1.0 |
| 3.5371 | 16.4931 | 9500 | 3.4097 | 0.9942 | 1.0 |
| 3.5843 | 16.6667 | 9600 | 3.4097 | 0.9942 | 1.0 |
| 3.5031 | 16.8403 | 9700 | 3.4097 | 0.9942 | 1.0 |
| 3.5339 | 17.0139 | 9800 | 3.4097 | 0.9942 | 1.0 |
| 3.5397 | 17.1875 | 9900 | 3.4097 | 0.9942 | 1.0 |
| 3.5408 | 17.3611 | 10000 | 3.4097 | 0.9942 | 1.0 |
| 3.5664 | 17.5347 | 10100 | 3.4097 | 0.9942 | 1.0 |
| 3.5145 | 17.7083 | 10200 | 3.4097 | 0.9942 | 1.0 |
| 3.5141 | 17.8819 | 10300 | 3.4097 | 0.9942 | 1.0 |
| 3.5562 | 18.0556 | 10400 | 3.4097 | 0.9942 | 1.0 |
| 3.5196 | 18.2292 | 10500 | 3.4097 | 0.9942 | 1.0 |
| 3.5335 | 18.4028 | 10600 | 3.4097 | 0.9942 | 1.0 |
| 3.5769 | 18.5764 | 10700 | 3.4097 | 0.9942 | 1.0 |
| 3.525 | 18.75 | 10800 | 3.4097 | 0.9942 | 1.0 |
| 3.5211 | 18.9236 | 10900 | 3.4097 | 0.9942 | 1.0 |
| 3.5441 | 19.0972 | 11000 | 3.4097 | 0.9942 | 1.0 |
| 3.5331 | 19.2708 | 11100 | 3.4097 | 0.9942 | 1.0 |
| 3.5274 | 19.4444 | 11200 | 3.4097 | 0.9942 | 1.0 |
| 3.5837 | 19.6181 | 11300 | 3.4097 | 0.9942 | 1.0 |
| 3.5116 | 19.7917 | 11400 | 3.4097 | 0.9942 | 1.0 |
| 3.5495 | 19.9653 | 11500 | 3.4097 | 0.9942 | 1.0 |
| 6.9849 | 20.1389 | 11600 | nan | 0.9940 | 1.0 |
| 0.0 | 20.3125 | 11700 | nan | 0.9940 | 1.0 |
| 0.0 | 20.4861 | 11800 | nan | 0.9940 | 1.0 |
| 0.0 | 20.6597 | 11900 | nan | 0.9940 | 1.0 |
| 0.0 | 20.8333 | 12000 | nan | 0.9940 | 1.0 |
| 0.0 | 21.0069 | 12100 | nan | 0.9940 | 1.0 |
| 0.0 | 21.1806 | 12200 | nan | 0.9940 | 1.0 |
| 0.0 | 21.3542 | 12300 | nan | 0.9940 | 1.0 |
| 0.0 | 21.5278 | 12400 | nan | 0.9940 | 1.0 |
| 0.0 | 21.7014 | 12500 | nan | 0.9940 | 1.0 |
| 0.0 | 21.875 | 12600 | nan | 0.9940 | 1.0 |
| 0.0 | 22.0486 | 12700 | nan | 0.9940 | 1.0 |
| 0.0 | 22.2222 | 12800 | nan | 0.9940 | 1.0 |
| 0.0 | 22.3958 | 12900 | nan | 0.9940 | 1.0 |
| 0.0 | 22.5694 | 13000 | nan | 0.9940 | 1.0 |
| 0.0 | 22.7431 | 13100 | nan | 0.9940 | 1.0 |
| 0.0 | 22.9167 | 13200 | nan | 0.9940 | 1.0 |
| 0.0 | 23.0903 | 13300 | nan | 0.9940 | 1.0 |
| 0.0 | 23.2639 | 13400 | nan | 0.9940 | 1.0 |
| 0.0 | 23.4375 | 13500 | nan | 0.9940 | 1.0 |
| 0.0 | 23.6111 | 13600 | nan | 0.9940 | 1.0 |
| 0.0 | 23.7847 | 13700 | nan | 0.9940 | 1.0 |
| 0.0 | 23.9583 | 13800 | nan | 0.9940 | 1.0 |
| 0.0 | 24.1319 | 13900 | nan | 0.9940 | 1.0 |
| 0.0 | 24.3056 | 14000 | nan | 0.9940 | 1.0 |
| 0.0 | 24.4792 | 14100 | nan | 0.9940 | 1.0 |
| 0.0 | 24.6528 | 14200 | nan | 0.9940 | 1.0 |
| 0.0 | 24.8264 | 14300 | nan | 0.9940 | 1.0 |
| 0.0 | 25.0 | 14400 | nan | 0.9940 | 1.0 |
| 0.0 | 25.1736 | 14500 | nan | 0.9940 | 1.0 |
| 0.0 | 25.3472 | 14600 | nan | 0.9940 | 1.0 |
| 0.0 | 25.5208 | 14700 | nan | 0.9940 | 1.0 |
| 0.0 | 25.6944 | 14800 | nan | 0.9940 | 1.0 |
| 0.0 | 25.8681 | 14900 | nan | 0.9940 | 1.0 |
| 0.0 | 26.0417 | 15000 | nan | 0.9940 | 1.0 |
| 0.0 | 26.2153 | 15100 | nan | 0.9940 | 1.0 |
| 0.0 | 26.3889 | 15200 | nan | 0.9940 | 1.0 |
| 0.0 | 26.5625 | 15300 | nan | 0.9940 | 1.0 |
| 0.0 | 26.7361 | 15400 | nan | 0.9940 | 1.0 |
| 0.0 | 26.9097 | 15500 | nan | 0.9940 | 1.0 |
| 0.0 | 27.0833 | 15600 | nan | 0.9940 | 1.0 |
| 0.0 | 27.2569 | 15700 | nan | 0.9940 | 1.0 |
| 0.0 | 27.4306 | 15800 | nan | 0.9940 | 1.0 |
| 0.0 | 27.6042 | 15900 | nan | 0.9940 | 1.0 |
| 0.0 | 27.7778 | 16000 | nan | 0.9940 | 1.0 |
| 0.0 | 27.9514 | 16100 | nan | 0.9940 | 1.0 |
| 0.0 | 28.125 | 16200 | nan | 0.9940 | 1.0 |
| 0.0 | 28.2986 | 16300 | nan | 0.9940 | 1.0 |
| 0.0 | 28.4722 | 16400 | nan | 0.9940 | 1.0 |
| 0.0 | 28.6458 | 16500 | nan | 0.9940 | 1.0 |
| 0.0 | 28.8194 | 16600 | nan | 0.9940 | 1.0 |
| 0.0 | 28.9931 | 16700 | nan | 0.9940 | 1.0 |
| 0.0 | 29.1667 | 16800 | nan | 0.9940 | 1.0 |
| 0.0 | 29.3403 | 16900 | nan | 0.9940 | 1.0 |
| 0.0 | 29.5139 | 17000 | nan | 0.9940 | 1.0 |
| 0.0 | 29.6875 | 17100 | nan | 0.9940 | 1.0 |
| 0.0 | 29.8611 | 17200 | nan | 0.9940 | 1.0 |

Framework versions

  • Transformers 4.57.2
  • Pytorch 2.9.1+cu128
  • Datasets 3.6.0
  • Tokenizers 0.22.0
Model tree for ctaguchi/ssc-kcn-model: fine-tuned from facebook/wav2vec2-xls-r-300m.