---
language:
- tr
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- hf-asr-leaderboard
- generated_from_trainer
metrics:
- wer
model-index:
- name: Whisper Tiny TR
results: []
datasets:
- mozilla-foundation/common_voice_13_0
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Whisper Tiny TR
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny), trained on a 70% subset of the Turkish split of Common Voice 13.
It achieves the following results on the evaluation set:
- Loss: 0.5730
- Wer: 55.4805
## Model description
More information needed
## Todo
Train on the full `mozilla-foundation/common_voice_13_0` Turkish split after this initial 70% run.
## Intended uses & limitations
More information needed
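The checkpoint can be loaded for Turkish transcription with the `transformers` ASR pipeline. A minimal sketch follows; the model id below is a placeholder (this card does not state the repo id), and the silent audio array is only a stand-in for a real recording.

```python
import numpy as np
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="openai/whisper-tiny",  # placeholder: substitute this fine-tuned Whisper Tiny TR repo id
    generate_kwargs={"language": "turkish", "task": "transcribe"},
)

# One second of 16 kHz silence as a stand-in input; in practice pass a
# path to an audio file, e.g. asr("speech.wav").
audio = {"array": np.zeros(16000, dtype=np.float32), "sampling_rate": 16000}
result = asr(audio)
print(result["text"])
```

Whisper resamples inputs to 16 kHz internally, so the pipeline also accepts audio at other sampling rates when given as a file path.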
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.3106 | 0.5 | 97 | 0.5626 | 57.0558 |
| 0.3361 | 1.0 | 194 | 0.5635 | 56.9995 |
| 0.3089 | 1.5 | 291 | 0.5639 | 57.6184 |
| 0.2665 | 1.99 | 388 | 0.5746 | 56.4088 |
| 0.2794 | 2.49 | 485 | 0.5799 | 56.2213 |
| 0.2364 | 2.99 | 582 | 0.5730 | 55.4805 |
### Framework versions
- Transformers 4.35.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1