---
library_name: transformers
language:
- am
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: 'tiny continued from checkpoint 8e-6 - Biniyam Daniel'
  results: []
---


# tiny continued from checkpoint 8e-6 - Biniyam Daniel

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an Amharic speech dataset (the dataset is not specified in the training metadata).
It achieves the following results on the evaluation set:
- Loss: 0.0534
- Wer: 18.3821
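
The WER figure above is the word error rate: the word-level edit distance between the reference transcript and the model's hypothesis, divided by the number of reference words, expressed as a percentage. As a plain-Python sketch of the metric (the card itself likely used a library such as `evaluate`; this is illustrative only):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word Error Rate: word-level Levenshtein distance / reference length, as a percentage."""
    ref, hyp = reference.split(), hypothesis.split()
    # DP table for edit distance over word sequences
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
    return 100.0 * d[len(ref)][len(hyp)] / max(1, len(ref))
```

A WER of 18.38 therefore means roughly one word-level error (substitution, insertion, or deletion) per five to six reference words.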

## Model description

This model is [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) fine-tuned for Amharic (`am`) automatic speech recognition, with training continued from an earlier checkpoint at a learning rate of 8e-6. It reaches a WER of 18.38 on the evaluation set.

## Intended uses & limitations

The model is intended for transcribing Amharic speech. Because the training and evaluation data are not documented, performance on other domains, accents, or recording conditions is unknown and should be verified before deployment.

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 8e-06
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 5088
- mixed_precision_training: Native AMP
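
The `cosine` scheduler with `lr_scheduler_warmup_ratio: 0.1` means the learning rate ramps linearly from 0 to the peak 8e-6 over the first ~10% of the 5088 steps, then decays to 0 along a cosine curve. A minimal sketch of that schedule (the exact warmup-step rounding in `transformers` may differ slightly; this is an assumption for illustration):

```python
import math

TOTAL_STEPS = 5088    # training_steps from the card
WARMUP_RATIO = 0.1    # lr_scheduler_warmup_ratio from the card
PEAK_LR = 8e-6        # learning_rate from the card
WARMUP_STEPS = int(TOTAL_STEPS * WARMUP_RATIO)  # ~508 steps (assumed rounding)

def lr_at(step: int) -> float:
    """Learning rate at a given step: linear warmup, then cosine decay to 0."""
    if step < WARMUP_STEPS:
        return PEAK_LR * step / max(1, WARMUP_STEPS)
    progress = (step - WARMUP_STEPS) / max(1, TOTAL_STEPS - WARMUP_STEPS)
    return PEAK_LR * 0.5 * (1.0 + math.cos(math.pi * progress))
```

Note the table below stops at step 5000, shortly before the scheduled 5088 steps, so the final checkpoints were trained at a learning rate very close to zero.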

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer     |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 0.0608        | 0.0337 | 100  | 0.0588          | 20.9205 |
| 0.0617        | 0.0675 | 200  | 0.0568          | 20.5858 |
| 0.0477        | 0.1012 | 300  | 0.0563          | 20.6137 |
| 0.0526        | 0.1350 | 400  | 0.0558          | 20.1953 |
| 0.0555        | 0.1687 | 500  | 0.0560          | 19.9721 |
| 0.0542        | 0.2025 | 600  | 0.0564          | 20.5858 |
| 0.0458        | 0.2362 | 700  | 0.0557          | 20.8368 |
| 0.0439        | 0.2700 | 800  | 0.0559          | 20.0837 |
| 0.0419        | 0.3037 | 900  | 0.0562          | 20.5021 |
| 0.0469        | 0.3375 | 1000 | 0.0556          | 19.6653 |
| 0.0457        | 0.3712 | 1100 | 0.0550          | 20.0000 |
| 0.0465        | 0.4050 | 1200 | 0.0550          | 19.7768 |
| 0.0453        | 0.4387 | 1300 | 0.0552          | 19.3863 |
| 0.0425        | 0.4725 | 1400 | 0.0558          | 19.6095 |
| 0.0464        | 0.5062 | 1500 | 0.0547          | 19.6653 |
| 0.0396        | 0.5400 | 1600 | 0.0545          | 19.3863 |
| 0.0430        | 0.5737 | 1700 | 0.0551          | 19.6653 |
| 0.0415        | 0.6075 | 1800 | 0.0550          | 19.3305 |
| 0.0396        | 0.6412 | 1900 | 0.0546          | 18.4937 |
| 0.0409        | 0.6750 | 2000 | 0.0542          | 18.7448 |
| 0.0418        | 0.7087 | 2100 | 0.0534          | 19.0237 |
| 0.0446        | 0.7425 | 2200 | 0.0538          | 19.1074 |
| 0.0364        | 0.7762 | 2300 | 0.0537          | 18.6053 |
| 0.0343        | 0.8100 | 2400 | 0.0537          | 18.4658 |
| 0.0437        | 0.8437 | 2500 | 0.0532          | 18.4100 |
| 0.0386        | 0.8775 | 2600 | 0.0530          | 18.9121 |
| 0.0426        | 0.9112 | 2700 | 0.0534          | 18.2706 |
| 0.0372        | 0.9450 | 2800 | 0.0536          | 18.6890 |
| 0.0325        | 0.9787 | 2900 | 0.0533          | 18.5495 |
| 0.0300        | 1.0125 | 3000 | 0.0537          | 18.4100 |
| 0.0253        | 1.0462 | 3100 | 0.0545          | 18.5774 |
| 0.0316        | 1.0800 | 3200 | 0.0550          | 18.4658 |
| 0.0251        | 1.1137 | 3300 | 0.0556          | 18.7727 |
| 0.0261        | 1.1475 | 3400 | 0.0554          | 18.2427 |
| 0.0285        | 1.1812 | 3500 | 0.0551          | 18.4658 |
| 0.0234        | 1.2150 | 3600 | 0.0553          | 18.6890 |
| 0.0369        | 1.2487 | 3700 | 0.0549          | 18.3543 |
| 0.0248        | 1.2825 | 3800 | 0.0553          | 18.2985 |
| 0.0238        | 1.3162 | 3900 | 0.0551          | 18.2985 |
| 0.0278        | 1.3500 | 4000 | 0.0551          | 18.1311 |
| 0.0351        | 1.3837 | 4100 | 0.0544          | 18.4379 |
| 0.0459        | 1.4175 | 4200 | 0.0539          | 17.9916 |
| 0.0469        | 1.4512 | 4300 | 0.0537          | 18.3543 |
| 0.0384        | 1.4850 | 4400 | 0.0536          | 18.4658 |
| 0.0503        | 1.5187 | 4500 | 0.0536          | 18.4100 |
| 0.0369        | 1.5525 | 4600 | 0.0536          | 18.2985 |
| 0.0373        | 1.5862 | 4700 | 0.0535          | 18.4100 |
| 0.0376        | 1.6200 | 4800 | 0.0534          | 18.3821 |
| 0.0385        | 1.6537 | 4900 | 0.0534          | 18.3821 |
| 0.0362        | 1.6875 | 5000 | 0.0534          | 18.3821 |


### Framework versions

- Transformers 4.57.1
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.22.1