---
license: apache-2.0
base_model: facebook/wav2vec2-base-960h
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: Helldivers2ASR_V3
  results: []
---
# Helldivers2ASR_V3

This model is a fine-tuned version of [facebook/wav2vec2-base-960h](https://huggingface.co/facebook/wav2vec2-base-960h) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 121.7825
- WER: 0.0701

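The WER figure above is the word error rate: the word-level edit distance between the model transcript and the reference, divided by the reference length. A minimal pure-Python sketch of the computation (the example strings in the comments are illustrative only):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between the first i reference words and first j hypothesis words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # deleting i words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # inserting j words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            substitution = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            deletion = d[i - 1][j] + 1
            insertion = d[i][j - 1] + 1
            d[i][j] = min(substitution, deletion, insertion)
    return d[len(ref)][len(hyp)] / len(ref)

# e.g. one substituted word out of three reference words -> WER = 1/3
```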
## Model description

More information needed

## Intended uses & limitations

More information needed

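While no intended-use details were recorded, a wav2vec2 CTC fine-tune of this kind is typically used to transcribe 16 kHz mono English speech. A hedged inference sketch with greedy CTC decoding — `MODEL_ID` is a placeholder, not a confirmed hub repo id, so substitute the actual repo id or a local checkpoint path:

```python
import numpy as np
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder: replace with the actual hub repo id or local checkpoint directory.
MODEL_ID = "Helldivers2ASR_V3"

def transcribe(audio: np.ndarray, sampling_rate: int = 16_000) -> str:
    """Greedily decode a mono float32 waveform (wav2vec2 expects 16 kHz audio)."""
    processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
    model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
    inputs = processor(audio, sampling_rate=sampling_rate, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    pred_ids = torch.argmax(logits, dim=-1)  # greedy CTC decoding
    return processor.batch_decode(pred_ids)[0]
```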
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 7e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
- mixed_precision_training: Native AMP

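With `lr_scheduler_type: linear` and `lr_scheduler_warmup_ratio: 0.1`, the learning rate ramps linearly from 0 to 7e-05 over the first 10% of optimizer steps, then decays linearly back to 0. A sketch of that schedule, assuming the 14640 total steps shown in the training results (this mirrors, but is not, the Transformers implementation):

```python
def linear_schedule_lr(step: int,
                       total_steps: int = 14_640,
                       base_lr: float = 7e-5,
                       warmup_ratio: float = 0.1) -> float:
    """Learning rate at a given optimizer step: linear warmup, then linear decay to 0."""
    warmup_steps = int(total_steps * warmup_ratio)  # 1464 steps here
    if step < warmup_steps:
        return base_lr * step / warmup_steps            # warmup phase
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)  # decay phase
```

The peak (7e-05) is hit exactly at step 1464, i.e. shortly into epoch 3, which matches the steep early loss drop in the table below.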
### Training results

| Training Loss | Epoch | Step  | Validation Loss | WER    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 989.1626      | 1.0   | 488   | 609.3144        | 0.5319 |
| 659.4745      | 2.0   | 976   | 441.8672        | 0.3931 |
| 514.4119      | 3.0   | 1464  | 382.5584        | 0.3219 |
| 419.7142      | 4.0   | 1952  | 288.8653        | 0.2560 |
| 341.4243      | 5.0   | 2440  | 281.8234        | 0.2205 |
| 288.763       | 6.0   | 2928  | 226.5483        | 0.1841 |
| 253.7379      | 7.0   | 3416  | 226.4153        | 0.1735 |
| 215.8471      | 8.0   | 3904  | 225.1913        | 0.1611 |
| 193.4694      | 9.0   | 4392  | 261.5921        | 0.1593 |
| 171.5698      | 10.0  | 4880  | 162.4843        | 0.1218 |
| 153.7561      | 11.0  | 5368  | 168.0577        | 0.1202 |
| 143.2963      | 12.0  | 5856  | 188.0740        | 0.1173 |
| 128.2614      | 13.0  | 6344  | 146.3703        | 0.0960 |
| 125.1001      | 14.0  | 6832  | 163.2122        | 0.0936 |
| 112.3151      | 15.0  | 7320  | 121.7357        | 0.0763 |
| 105.5073      | 16.0  | 7808  | 133.4259        | 0.0808 |
| 91.294        | 17.0  | 8296  | 134.9446        | 0.0827 |
| 94.5096       | 18.0  | 8784  | 109.5260        | 0.0794 |
| 83.6972       | 19.0  | 9272  | 163.8145        | 0.0909 |
| 78.8643       | 20.0  | 9760  | 102.4994        | 0.0665 |
| 72.3876       | 21.0  | 10248 | 102.5806        | 0.0676 |
| 71.9532       | 22.0  | 10736 | 109.1589        | 0.0666 |
| 73.799        | 23.0  | 11224 | 114.1257        | 0.0703 |
| 71.033        | 24.0  | 11712 | 121.2289        | 0.0781 |
| 66.3356       | 25.0  | 12200 | 109.2525        | 0.0666 |
| 64.8495       | 26.0  | 12688 | 108.1266        | 0.0657 |
| 60.262        | 27.0  | 13176 | 120.6045        | 0.0699 |
| 55.4432       | 28.0  | 13664 | 124.6207        | 0.0716 |
| 62.6622       | 29.0  | 14152 | 122.5060        | 0.0683 |
| 58.4588       | 30.0  | 14640 | 121.7825        | 0.0701 |

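Note that the final epoch is not the best by validation WER: epoch 26 reached 0.0657 versus 0.0701 at epoch 30, so a best-checkpoint selection (e.g. `load_best_model_at_end` keyed on WER) would pick an earlier checkpoint. A small sketch, with pairs excerpted from the table above:

```python
# (epoch, validation WER) pairs excerpted from the training results table
history = [
    (20, 0.0665), (22, 0.0666), (25, 0.0666), (26, 0.0657),
    (27, 0.0699), (29, 0.0683), (30, 0.0701),
]

# Lowest validation WER wins (ties resolved by list order).
best_epoch, best_wer = min(history, key=lambda pair: pair[1])
print(f"best checkpoint: epoch {best_epoch} (WER {best_wer:.4f})")
```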
### Framework versions

- Transformers 4.44.0
- PyTorch 2.5.1+cu121
- Datasets 3.6.0
- Tokenizers 0.19.1