---
language:
- zh
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_trainer
datasets:
- formospeech/hat_asr_aligned
metrics:
- cer
model-index:
- name: Whisper Tiny Hakka Condenser
  results: []
---

# Whisper Tiny Hakka Condenser

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on the HAT ASR Aligned dataset.
It achieves the following results on the evaluation set:

- Loss: 0.1729
- CER: 10.2307

## Model description

Whisper Tiny Hakka Condenser is a Whisper Tiny checkpoint fine-tuned for Hakka automatic speech recognition on the HAT ASR Aligned dataset. Beyond the training setup documented below, no further details about the model have been provided.

## Intended uses & limitations

The model is intended for transcribing Hakka speech. It has only been evaluated on the HAT ASR Aligned evaluation set, so accuracy on other domains, recording conditions, or languages has not been characterized, and the limitations of the small Whisper Tiny base model apply.
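
For transcription, the checkpoint can be loaded with the Transformers ASR pipeline. The sketch below is illustrative rather than taken from the training repository: the repository id is a placeholder for wherever this checkpoint is published, and the audio file name is hypothetical.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint with the ASR pipeline.
# "your-username/whisper-tiny-hakka-condenser" is a placeholder repo id.
asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/whisper-tiny-hakka-condenser",
)

# Transcribe a local recording; "sample.wav" is a hypothetical file.
# Whisper expects 16 kHz audio, which the pipeline resamples to automatically.
print(asr("sample.wav")["text"])
```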

## Training and evaluation data

The model was trained and evaluated on the HAT ASR Aligned dataset (`formospeech/hat_asr_aligned`); the loss and CER reported above are measured on its evaluation split.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 64
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1521
- training_steps: 15210
- mixed_precision_training: Native AMP
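
As a rough illustration, the hyperparameters above map onto a `Seq2SeqTrainingArguments` configuration along these lines. This is a sketch under the assumption that a standard `Seq2SeqTrainer` setup was used; `output_dir` is a placeholder, not a value from the card.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the hyperparameters listed above. The Adam betas and epsilon
# match the Transformers defaults (0.9, 0.999, 1e-08), so they are not set here.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-tiny-hakka-condenser",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1521,
    max_steps=15210,
    fp16=True,  # "Native AMP" mixed-precision training
)
```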

### Training results

| Training Loss | Epoch  | Step  | Validation Loss | CER     |
|:-------------:|:------:|:-----:|:---------------:|:-------:|
| 0.2476        | 0.9993 | 1521  | 0.4437          | 23.6551 |
| 0.0892        | 1.9987 | 3042  | 0.2482          | 14.6693 |
| 0.0543        | 2.9980 | 4563  | 0.2007          | 11.1774 |
| 0.0361        | 3.9974 | 6084  | 0.1847          | 12.4939 |
| 0.0235        | 4.9967 | 7605  | 0.1791          | 10.5405 |
| 0.0157        | 5.9961 | 9126  | 0.1727          | 10.9000 |
| 0.0121        | 6.9954 | 10647 | 0.1724          | 11.1554 |
| 0.0082        | 7.9947 | 12168 | 0.1720          | 10.3694 |
| 0.0059        | 8.9941 | 13689 | 0.1732          | 10.4053 |
| 0.0049        | 9.9934 | 15210 | 0.1729          | 10.2307 |
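
The CER column can be reproduced with the `evaluate` library's character-error-rate metric, scaled to a percentage. A minimal sketch, assuming reference and predicted transcripts are available as lists of strings (the lists below are placeholders):

```python
import evaluate

# Character error rate over the evaluation set, reported as a percentage
# to match the CER column above.
cer_metric = evaluate.load("cer")

references = ["..."]   # ground-truth transcripts (placeholder)
predictions = ["..."]  # model outputs (placeholder)

cer = 100 * cer_metric.compute(predictions=predictions, references=references)
print(f"CER: {cer:.4f}")
```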

### Framework versions

- Transformers 4.42.3
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1