---
library_name: transformers
language:
- am
license: apache-2.0
base_model: openai/whisper-medium
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: 'Medium Amharic - Biniyam Daniel'
  results: []
---

# Medium Amharic - Biniyam Daniel

This model is a fine-tuned version of [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) for Amharic automatic speech recognition. The training dataset is not specified in this card.
It achieves the following results on the evaluation set:
- Loss: 0.0532
- Wer: 19.1091
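
Below is a minimal, illustrative inference sketch using the `transformers` ASR pipeline. The repository id and audio file name are placeholders (this card does not state the published model id); the `language`/`task` hint is passed through `generate_kwargs` as supported by Whisper's multilingual checkpoints.

```python
# Illustrative inference sketch; the repo id and audio path are placeholders.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/whisper-medium-amharic",  # placeholder: replace with the actual Hub id
)

# Whisper operates on 16 kHz mono audio; the pipeline decodes and resamples
# common file formats via ffmpeg.
result = asr(
    "sample_amharic.wav",  # placeholder audio file
    generate_kwargs={"language": "amharic", "task": "transcribe"},
)
print(result["text"])
```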

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: AdamW (PyTorch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 2

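As a rough illustration, the hyperparameters above correspond to a `Seq2SeqTrainingArguments` setup along the following lines. This is a sketch under assumptions: the output directory, evaluation cadence (1000 steps, matching the results table below), and `predict_with_generate` flag are inferred or assumed, not taken from the card.

```python
# Sketch only: maps the listed hyperparameters onto Seq2SeqTrainingArguments.
# Model, dataset, and data-collator setup are intentionally omitted.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-medium-amharic",   # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,           # 16 * 4 = 64 effective train batch size
    seed=42,
    optim="adamw_torch",
    lr_scheduler_type="cosine",
    warmup_ratio=0.05,
    num_train_epochs=2,
    eval_strategy="steps",                   # assumed; evaluations appear every 1000 steps
    eval_steps=1000,
    predict_with_generate=True,              # assumed; needed to compute WER during eval
)
```
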
### Training results

| Training Loss | Epoch  | Step  | Validation Loss | Wer     |
|:-------------:|:------:|:-----:|:---------------:|:-------:|
| 0.1306        | 0.0741 | 1000  | 0.1286          | 35.3980 |
| 0.0923        | 0.1482 | 2000  | 0.0923          | 26.9373 |
| 0.0742        | 0.2224 | 3000  | 0.0797          | 23.8271 |
| 0.0727        | 0.2965 | 4000  | 0.0748          | 23.6953 |
| 0.0706        | 0.3706 | 5000  | 0.0708          | 22.7201 |
| 0.0614        | 0.4447 | 6000  | 0.0689          | 22.0611 |
| 0.065         | 0.5189 | 7000  | 0.0671          | 21.5340 |
| 0.0594        | 0.5930 | 8000  | 0.0640          | 21.1650 |
| 0.0563        | 0.6671 | 9000  | 0.0619          | 21.1386 |
| 0.06          | 0.7412 | 10000 | 0.0618          | 21.1650 |
| 0.055         | 0.8153 | 11000 | 0.0597          | 20.6378 |
| 0.0558        | 0.8895 | 12000 | 0.0595          | 20.6115 |
| 0.0513        | 0.9636 | 13000 | 0.0584          | 19.8208 |
| 0.0521        | 1.0377 | 14000 | 0.0567          | 20.2161 |
| 0.0482        | 1.1118 | 15000 | 0.0567          | 20.0316 |
| 0.0458        | 1.1859 | 16000 | 0.0563          | 19.9789 |
| 0.0487        | 1.2600 | 17000 | 0.0559          | 19.7153 |
| 0.0448        | 1.3341 | 18000 | 0.0554          | 19.7417 |
| 0.0411        | 1.4083 | 19000 | 0.0553          | 19.4254 |
| 0.0407        | 1.4824 | 20000 | 0.0542          | 19.2145 |
| 0.0455        | 1.5565 | 21000 | 0.0539          | 19.1355 |
| 0.0439        | 1.6306 | 22000 | 0.0537          | 18.8719 |
| 0.0427        | 1.7048 | 23000 | 0.0538          | 19.2936 |
| 0.0389        | 1.7789 | 24000 | 0.0534          | 18.9773 |
| 0.0385        | 1.8530 | 25000 | 0.0533          | 19.0828 |
| 0.0366        | 1.9271 | 26000 | 0.0532          | 19.1091 |

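The WER values above can be reproduced with the `evaluate` library's `wer` metric (backed by jiwer). Note that `evaluate` returns a fraction, so the table's numbers are assumed to be that value multiplied by 100; the strings below are placeholders, not data from this card.

```python
# Sketch of the WER computation behind the numbers above (scale assumed: percent).
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["..."]  # decoded model transcriptions (placeholders)
references = ["..."]   # ground-truth transcripts (placeholders)

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```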

### Framework versions

- Transformers 4.57.3
- Pytorch 2.7.1+cu128
- Datasets 3.6.0
- Tokenizers 0.22.1