Model save
README.md CHANGED
@@ -1,27 +1,25 @@
 ---
 library_name: transformers
-language:
-- ar
 license: apache-2.0
-base_model:
+base_model: Baselhany/Distilation_Whisper_base_CKP
 tags:
 - generated_from_trainer
 metrics:
 - wer
 model-index:
-- name:
+- name: Distilation_Whisper_base_CKP
   results: []
 ---

 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->

-#
+# Distilation_Whisper_base_CKP

-This model is a fine-tuned version of [
+This model is a fine-tuned version of [Baselhany/Distilation_Whisper_base_CKP](https://huggingface.co/Baselhany/Distilation_Whisper_base_CKP) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
-- Wer: 0.
+- Loss: 0.0903
+- Wer: 0.1972

 ## Model description

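The card above reports an evaluation loss and WER for this checkpoint but includes no usage snippet. Below is a minimal, hypothetical sketch of loading the model for transcription with the Hugging Face Transformers `pipeline` and scoring a prediction with the `wer` metric listed in the metadata. The audio file and reference transcript are placeholders, and the `language="ar"` hint is an assumption based on the `language: ar` tag present in the previous revision of the card; this is not code from the repository.

```python
# Hypothetical usage sketch (not part of the original card).
# Assumes: transformers, torch, and evaluate are installed, and ffmpeg is
# available so the pipeline can decode the (placeholder) audio file path.
from transformers import pipeline
import evaluate

asr = pipeline(
    "automatic-speech-recognition",
    model="Baselhany/Distilation_Whisper_base_CKP",
)

# Transcribe a local audio file; the pipeline resamples it to the sampling
# rate expected by Whisper-family feature extractors.
prediction = asr(
    "sample.wav",  # placeholder audio file
    generate_kwargs={"language": "ar", "task": "transcribe"},  # assumption: Arabic ASR
)["text"]

# Word error rate, i.e. the "Wer" values reported in the card (lower is better).
wer_metric = evaluate.load("wer")
reference = "النص المرجعي للمقطع الصوتي"  # placeholder reference transcript
print(prediction)
print("WER:", wer_metric.compute(predictions=[prediction], references=[reference]))
```

Under this reading, the reported Wer of 0.1972 corresponds to roughly 19.7% word-level errors (substitutions, deletions, insertions) relative to the reference transcripts of the evaluation set.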
@@ -49,55 +47,76 @@ The following hyperparameters were used during training:
 - optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_steps: 500
-- num_epochs:
+- num_epochs: 15
 - mixed_precision_training: Native AMP

 ### Training results

-| Training Loss | Epoch
-|
-| 2.9866 | 0.2371
-| 3.6342 | 0.4742
-| 3.2101 | 0.7113
-| 3.3175 | 0.9484
-| 2.5835 | 1.1855
-| 2.6437 | 1.4226
-| 2.4359 | 1.6598
-| 2.5958 | 1.8969
-| 2.0623 | 2.1340
-| 1.8728 | 2.3711
-| 2.1926 | 2.6082
-| 1.9814 | 2.8453
-| 1.7723 | 3.0824
-| 1.7258 | 3.3195
-| 1.6528 | 3.5566
-| 1.6309 | 3.7937
-| 1.4672 | 4.0308
-| 1.4974 | 4.2679
-| 1.3805 | 4.5050
-| 1.5116 | 4.7421
-| 1.4897 | 4.9793
-| 1.3287 | 5.2164
-| 1.3186 | 5.4535
-| 1.3753 | 5.6906
-| 1.3055 | 5.9277
-| 1.2468 | 6.1648
-| 1.183 | 6.4019
-| 1.2316 | 6.6390
-| 1.2264 | 6.8761
-| 1.2133 | 7.1132
-| 1.1362 | 7.3503
-| 1.162 | 7.5874
-| 1.1639 | 7.8245
-| 1.0533 | 8.0616
-| 1.1351 | 8.2988
-| 1.1125 | 8.5359
-| 1.1177 | 8.7730
-| 1.0468 | 9.0101
-| 1.0404 | 9.2472
-| 1.073 | 9.4843
-| 1.0436 | 9.7214
-| 1.0786 | 9.9585
+| Training Loss | Epoch | Step | Validation Loss | Wer |
+|:-------------:|:-------:|:-----:|:---------------:|:------:|
+| 2.9866 | 0.2371 | 400 | 0.1061 | 0.2197 |
+| 3.6342 | 0.4742 | 800 | 0.1022 | 0.2051 |
+| 3.2101 | 0.7113 | 1200 | 0.1011 | 0.2044 |
+| 3.3175 | 0.9484 | 1600 | 0.1018 | 0.2042 |
+| 2.5835 | 1.1855 | 2000 | 0.0999 | 0.1982 |
+| 2.6437 | 1.4226 | 2400 | 0.0986 | 0.1979 |
+| 2.4359 | 1.6598 | 2800 | 0.0976 | 0.1976 |
+| 2.5958 | 1.8969 | 3200 | 0.0967 | 0.2090 |
+| 2.0623 | 2.1340 | 3600 | 0.0954 | 0.1942 |
+| 1.8728 | 2.3711 | 4000 | 0.0946 | 0.1895 |
+| 2.1926 | 2.6082 | 4400 | 0.0979 | 0.1958 |
+| 1.9814 | 2.8453 | 4800 | 0.0968 | 0.1973 |
+| 1.7723 | 3.0824 | 5200 | 0.0940 | 0.2020 |
+| 1.7258 | 3.3195 | 5600 | 0.0939 | 0.1954 |
+| 1.6528 | 3.5566 | 6000 | 0.0940 | 0.1951 |
+| 1.6309 | 3.7937 | 6400 | 0.0920 | 0.1985 |
+| 1.4672 | 4.0308 | 6800 | 0.0927 | 0.1961 |
+| 1.4974 | 4.2679 | 7200 | 0.0920 | 0.1954 |
+| 1.3805 | 4.5050 | 7600 | 0.0933 | 0.1929 |
+| 1.5116 | 4.7421 | 8000 | 0.0918 | 0.1955 |
+| 1.4897 | 4.9793 | 8400 | 0.0909 | 0.1917 |
+| 1.3287 | 5.2164 | 8800 | 0.0901 | 0.1942 |
+| 1.3186 | 5.4535 | 9200 | 0.0904 | 0.1944 |
+| 1.3753 | 5.6906 | 9600 | 0.0911 | 0.1941 |
+| 1.3055 | 5.9277 | 10000 | 0.0907 | 0.1920 |
+| 1.2468 | 6.1648 | 10400 | 0.0904 | 0.1982 |
+| 1.183 | 6.4019 | 10800 | 0.0894 | 0.1944 |
+| 1.2316 | 6.6390 | 11200 | 0.0911 | 0.1986 |
+| 1.2264 | 6.8761 | 11600 | 0.0901 | 0.1950 |
+| 1.2133 | 7.1132 | 12000 | 0.0904 | 0.1988 |
+| 1.1362 | 7.3503 | 12400 | 0.0899 | 0.2034 |
+| 1.162 | 7.5874 | 12800 | 0.0899 | 0.1985 |
+| 1.1639 | 7.8245 | 13200 | 0.0904 | 0.1948 |
+| 1.0533 | 8.0616 | 13600 | 0.0891 | 0.1948 |
+| 1.1351 | 8.2988 | 14000 | 0.0895 | 0.1967 |
+| 1.1125 | 8.5359 | 14400 | 0.0893 | 0.2047 |
+| 1.1177 | 8.7730 | 14800 | 0.0890 | 0.2020 |
+| 1.0468 | 9.0101 | 15200 | 0.0891 | 0.2020 |
+| 1.0404 | 9.2472 | 15600 | 0.0890 | 0.2069 |
+| 1.073 | 9.4843 | 16000 | 0.0892 | 0.1975 |
+| 1.0436 | 9.7214 | 16400 | 0.0888 | 0.1981 |
+| 1.0786 | 9.9585 | 16800 | 0.0889 | 0.1972 |
+| 1.1565 | 10.1956 | 17200 | 0.0886 | 0.2019 |
+| 1.0682 | 10.4327 | 17600 | 0.0898 | 0.1966 |
+| 1.1408 | 10.6698 | 18000 | 0.0885 | 0.2010 |
+| 1.0517 | 10.9069 | 18400 | 0.0891 | 0.2019 |
+| 1.0097 | 11.1440 | 18800 | 0.0885 | 0.1950 |
+| 1.0962 | 11.3811 | 19200 | 0.0890 | 0.2004 |
+| 1.0616 | 11.6183 | 19600 | 0.0890 | 0.1975 |
+| 1.1527 | 11.8554 | 20000 | 0.0896 | 0.1963 |
+| 1.0396 | 12.0925 | 20400 | 0.0885 | 0.2029 |
+| 0.9842 | 12.3296 | 20800 | 0.0885 | 0.1917 |
+| 1.0012 | 12.5667 | 21200 | 0.0885 | 0.1973 |
+| 0.941 | 12.8038 | 21600 | 0.0881 | 0.2010 |
+| 0.9645 | 13.0409 | 22000 | 0.0879 | 0.2025 |
+| 0.9253 | 13.2780 | 22400 | 0.0884 | 0.1986 |
+| 0.946 | 13.5151 | 22800 | 0.0882 | 0.1954 |
+| 0.9513 | 13.7522 | 23200 | 0.0884 | 0.1957 |
+| 0.9912 | 13.9893 | 23600 | 0.0876 | 0.2010 |
+| 0.9332 | 14.2264 | 24000 | 0.0878 | 0.1992 |
+| 0.9282 | 14.4635 | 24400 | 0.0880 | 0.2025 |
+| 0.922 | 14.7007 | 24800 | 0.0880 | 0.2009 |
+| 0.8976 | 14.9378 | 25200 | 0.0878 | 0.2016 |


 ### Framework versions
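The optimizer, scheduler, warmup, epoch count and AMP settings listed in this hunk map directly onto `Seq2SeqTrainingArguments`. The sketch below is an illustration under that assumption, not the repository's actual training script; values marked as placeholders (output directory, batch size, learning rate) are not shown in this excerpt of the card and are guesses.

```python
# Illustrative sketch only: maps the hyperparameters visible in the card onto
# Seq2SeqTrainingArguments. Placeholder values are guesses, not repo settings.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="Distilation_Whisper_base_CKP",  # placeholder output directory
    per_device_train_batch_size=8,              # placeholder: not listed in this excerpt
    learning_rate=1e-5,                         # placeholder: not listed in this excerpt
    num_train_epochs=15,                        # num_epochs: 15
    warmup_steps=500,                           # lr_scheduler_warmup_steps: 500
    lr_scheduler_type="linear",                 # lr_scheduler_type: linear
    optim="adamw_torch",                        # OptimizerNames.ADAMW_TORCH
    adam_beta1=0.9,                             # betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                          # epsilon=1e-08
    fp16=True,                                  # mixed_precision_training: Native AMP
    eval_strategy="steps",                      # evaluation every 400 steps, per the table
    eval_steps=400,
    logging_steps=400,
    predict_with_generate=True,                 # typical for Whisper fine-tuning, assumed here
    report_to=["tensorboard"],                  # this commit also adds a TensorBoard event file
)
```

For scale, 400 optimizer steps advance the epoch counter by about 0.237 in the table above, so one epoch corresponds to roughly 1,687 steps on the (unspecified) training set.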
runs/May25_06-50-43_a0a9289979bb/events.out.tfevents.1748189444.a0a9289979bb.19.1 ADDED

@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:6653e19df71a3f740947b2d7fae759b30ad9fab533e051c42d52d0b6212cb9f7
+size 412
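The three added lines above are a Git LFS pointer rather than the TensorBoard log itself; the actual 412-byte `events.out.tfevents` file is stored in LFS. A small sketch, assuming the repository lives at `Baselhany/Distilation_Whisper_base_CKP` on the Hugging Face Hub (the name given in the card), of fetching the real file with `huggingface_hub`:

```python
# Hypothetical sketch: resolve the LFS pointer above to the actual TensorBoard
# event file via the Hugging Face Hub client (repo id assumed from the card).
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="Baselhany/Distilation_Whisper_base_CKP",
    filename="runs/May25_06-50-43_a0a9289979bb/events.out.tfevents.1748189444.a0a9289979bb.19.1",
)
print(local_path)  # point TensorBoard at the containing directory to view the logged scalars
```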