End of training
Browse files
- README.md +23 -23
- model.safetensors +1 -1
- preprocessor_config.json +0 -1
- training_args.bin +1 -1
README.md
CHANGED
@@ -17,9 +17,9 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [facebook/hubert-large-ls960-ft](https://huggingface.co/facebook/hubert-large-ls960-ft) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
-- Wer: 0.
-- Per: 0.
+- Loss: 0.5363
+- Wer: 0.0490
+- Per: 0.0376
 
 ## Model description
 
@@ -51,26 +51,26 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step  | Validation Loss | Wer    | Per    |
 |:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
+| 2.371         | 1.0   | 818   | 1.3106          | 0.6828 | 0.6863 |
+| 0.9013        | 2.0   | 1636  | 0.4890          | 0.1144 | 0.0985 |
+| 0.3369        | 3.0   | 2454  | 0.4295          | 0.0766 | 0.0611 |
+| 0.2219        | 4.0   | 3272  | 0.4386          | 0.0649 | 0.0508 |
+| 0.1565        | 5.0   | 4090  | 0.4124          | 0.0660 | 0.0506 |
+| 0.1343        | 6.0   | 4908  | 0.4422          | 0.0630 | 0.0493 |
+| 0.106         | 7.0   | 5726  | 0.4762          | 0.0600 | 0.0469 |
+| 0.091         | 8.0   | 6544  | 0.4487          | 0.0580 | 0.0460 |
+| 0.0745        | 9.0   | 7362  | 0.4284          | 0.0577 | 0.0461 |
+| 0.0708        | 10.0  | 8180  | 0.4161          | 0.0578 | 0.0451 |
+| 0.0621        | 11.0  | 8998  | 0.4659          | 0.0535 | 0.0414 |
+| 0.0492        | 12.0  | 9816  | 0.5249          | 0.0557 | 0.0433 |
+| 0.0479        | 13.0  | 10634 | 0.5411          | 0.0550 | 0.0426 |
+| 0.0452        | 14.0  | 11452 | 0.5161          | 0.0536 | 0.0410 |
+| 0.0385        | 15.0  | 12270 | 0.5002          | 0.0521 | 0.0404 |
+| 0.0354        | 16.0  | 13088 | 0.4800          | 0.0499 | 0.0389 |
+| 0.0342        | 17.0  | 13906 | 0.5079          | 0.0506 | 0.0394 |
+| 0.0269        | 18.0  | 14724 | 0.5144          | 0.0499 | 0.0386 |
+| 0.0247        | 19.0  | 15542 | 0.5334          | 0.0496 | 0.0380 |
+| 0.0208        | 20.0  | 16360 | 0.5363          | 0.0490 | 0.0376 |
 
 
 ### Framework versions
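The Wer and Per columns added to the README above are word- and phoneme-error rates. As a rough sketch (not code from this repo; the function name is illustrative), both reduce to a Levenshtein edit distance over tokens, normalized by the reference length — word tokens give WER, phoneme tokens give PER:

```python
def error_rate(reference: str, hypothesis: str) -> float:
    """Levenshtein distance between token sequences / reference length.

    With space-separated word tokens this is WER; with phoneme
    tokens it is PER. Illustrative only, not this repo's metric code.
    """
    ref, hyp = reference.split(), hypothesis.split()
    # prev[j] holds the edit distance between ref[:i-1] and hyp[:j];
    # the DP table is rolled one row at a time to save memory.
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        curr = [i]
        for j, h in enumerate(hyp, 1):
            cost = 0 if r == h else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1] / len(ref)
```

For example, `error_rate("the cat sat", "the cat sit")` is one substitution over three reference words, i.e. about 0.333 — the same scale as the 0.0490 WER reported in the final row.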
model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:2b4ee4d3ca9b243c6128b6a6c9f7523ed0cf56de4ea07373696d9055a09398d6
 size 1261970648
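Both binary files in this commit are stored as Git LFS pointer files: the repo tracks only a `version`/`oid sha256:…`/`size` stub, and the hash identifies the real blob. A minimal sketch of building and checking such a pointer with only the standard library (the blob content here is made up):

```python
import hashlib


def lfs_pointer(blob: bytes) -> str:
    """Build a Git LFS spec/v1 pointer file for a blob."""
    oid = hashlib.sha256(blob).hexdigest()
    return (
        "version https://git-lfs.github.com/spec/v1\n"
        f"oid sha256:{oid}\n"
        f"size {len(blob)}\n"
    )


def pointer_matches(pointer: str, blob: bytes) -> bool:
    """Check a blob against the oid and size recorded in a pointer."""
    # Each pointer line is "key value"; split once per line.
    fields = dict(line.split(" ", 1) for line in pointer.splitlines())
    return (fields["oid"] == "sha256:" + hashlib.sha256(blob).hexdigest()
            and int(fields["size"]) == len(blob))
```

This is why the diff for each weights file is just one changed `oid` line: new training artifacts mean a new blob hash, while the pointer layout stays fixed.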
preprocessor_config.json
CHANGED
@@ -4,7 +4,6 @@
   "feature_size": 1,
   "padding_side": "right",
   "padding_value": 0,
-  "processor_class": "Wav2Vec2Processor",
   "return_attention_mask": true,
   "sampling_rate": 16000
 }
training_args.bin
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:fdcbf3563aa16d993b17cc9be35f2c9bdcd5010b0dd9cbbb8f117910c99e21dd
 size 4600