update model card README.md
# train_model

This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5418
- Wer: 0.3477
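Wer is the word error rate: the word-level edit distance between the reference transcript and the model's output, divided by the number of reference words. As a minimal pure-Python sketch of the metric (the card does not say which implementation was used; evaluation is typically done with a library such as `jiwer` or `evaluate`):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between ref[:i] and hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + sub)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)
```

A Wer of 0.3477 therefore means roughly one word in three of the reference transcripts is substituted, inserted, or deleted in the model's output.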
## Model description
The following hyperparameters were used during training:
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
- mixed_precision_training: Native AMP
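The `linear` scheduler with 1000 warmup steps ramps the learning rate up linearly from zero, then decays it linearly back to zero over the rest of training. A sketch of that shape — note the base learning rate is not shown in this diff, so `1e-4` below is a placeholder, and `total_steps=14500` is read off the results table:

```python
def linear_lr(step: int, base_lr: float = 1e-4,
              warmup_steps: int = 1000, total_steps: int = 14500) -> float:
    """Linear warmup to base_lr over warmup_steps, then linear decay to 0."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

With these settings the schedule peaks at step 1000 (around epoch 2, per the table) and reaches zero at the end of training.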
### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 3.4778        | 1.0   | 500   | 1.7193          | 0.9881 |
| 0.8342        | 2.01  | 1000  | 0.5422          | 0.5306 |
| 0.4304        | 3.01  | 1500  | 0.4456          | 0.4634 |
| 0.2998        | 4.02  | 2000  | 0.4095          | 0.4283 |
| 0.2348        | 5.02  | 2500  | 0.4446          | 0.4216 |
| 0.1921        | 6.02  | 3000  | 0.5314          | 0.3949 |
| 0.1576        | 7.03  | 3500  | 0.4374          | 0.4033 |
| 0.1435        | 8.03  | 4000  | 0.6605          | 0.4036 |
| 0.1261        | 9.04  | 4500  | 0.4944          | 0.3887 |
| 0.1107        | 10.04 | 5000  | 0.4507          | 0.3806 |
| 0.0994        | 11.04 | 5500  | 0.4927          | 0.3733 |
| 0.0891        | 12.05 | 6000  | 0.5067          | 0.3754 |
| 0.0862        | 13.05 | 6500  | 0.4767          | 0.3691 |
| 0.0702        | 14.06 | 7000  | 0.4982          | 0.3739 |
| 0.0648        | 15.06 | 7500  | 0.5233          | 0.3736 |
| 0.0599        | 16.06 | 8000  | 0.5338          | 0.3694 |
| 0.0588        | 17.07 | 8500  | 0.5675          | 0.3568 |
| 0.0587        | 18.07 | 9000  | 0.5689          | 0.3657 |
| 0.0461        | 19.08 | 9500  | 0.5803          | 0.3639 |
| 0.0443        | 20.08 | 10000 | 0.5427          | 0.3654 |
| 0.0436        | 21.08 | 10500 | 0.5441          | 0.3662 |
| 0.035         | 22.09 | 11000 | 0.5511          | 0.3601 |
| 0.0338        | 23.09 | 11500 | 0.4968          | 0.3581 |
| 0.0327        | 24.1  | 12000 | 0.5254          | 0.3553 |
| 0.0274        | 25.1  | 12500 | 0.5212          | 0.3524 |
| 0.0246        | 26.1  | 13000 | 0.5445          | 0.3495 |
| 0.0263        | 27.11 | 13500 | 0.5291          | 0.3500 |
| 0.0228        | 28.11 | 14000 | 0.5378          | 0.3458 |
| 0.0223        | 29.12 | 14500 | 0.5418          | 0.3477 |
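Note that the evaluation figures reported at the top (Loss 0.5418, Wer 0.3477) match the final row at epoch 29.12, not the best checkpoint: the lowest validation Wer in the table is 0.3458, at epoch 28.11. A quick scan of a few rows (values copied from the table above) illustrates:

```python
# (epoch, validation loss, WER) for selected checkpoints from the table
results = [
    (1.0, 1.7193, 0.9881),
    (17.07, 0.5675, 0.3568),
    (28.11, 0.5378, 0.3458),
    (29.12, 0.5418, 0.3477),
]
best = min(results, key=lambda row: row[2])  # pick the row with the lowest WER
print(f"best WER {best[2]} at epoch {best[0]}")  # best WER 0.3458 at epoch 28.11
```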
### Framework versions