---
library_name: transformers
language:
- zh
base_model: whucedar/amoros_spec_01_train_20-medium_1000_8
tags:
- hf-asr-leaderboard
- generated_from_trainer
datasets:
- whucedar/amoros_spec_02-medium
metrics:
- wer
model-index:
- name: amoros_spec_02-medium
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: amoros_spec_02
      type: whucedar/amoros_spec_02-medium
      args: 'config: zh, split: test'
    metrics:
    - name: Wer
      type: wer
      value: 438.75
---
|

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# amoros_spec_02-medium

This model is a fine-tuned version of [whucedar/amoros_spec_01_train_20-medium_1000_8](https://huggingface.co/whucedar/amoros_spec_01_train_20-medium_1000_8) on the amoros_spec_02 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5581
- Wer: 438.75
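
A quick way to try the checkpoint is the `transformers` automatic-speech-recognition pipeline. The snippet below is a minimal sketch rather than code from this card: the repo id `whucedar/amoros_spec_02-medium` is assumed from the card title, and `sample.wav` is a placeholder audio file.

```python
# Minimal inference sketch (assumptions: repo id inferred from the card title,
# checkpoint compatible with the standard ASR pipeline).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="whucedar/amoros_spec_02-medium",  # assumed repo id
)

result = asr("sample.wav")  # placeholder path to a local audio file
print(result["text"])
```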
|

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
|

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 1000
- mixed_precision_training: Native AMP
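
The hyperparameters above map onto `Seq2SeqTrainingArguments` roughly as in the sketch below. This is an illustration, not the actual training script (which the card does not include); the `output_dir` is a placeholder.

```python
# Sketch only: one way the listed hyperparameters could be expressed with
# transformers' Seq2SeqTrainingArguments; not the card author's actual script.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./amoros_spec_02-medium",  # placeholder output path
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",                   # AdamW with the betas/epsilon listed above
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=1000,
    fp16=True,                             # "Native AMP" mixed precision
)
```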
|

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.0001        | 100.0 | 1000 | 0.5581          | 438.75 |
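
The Wer above is reported as a percentage. A minimal sketch of how such a value is commonly computed with the `evaluate` library follows; the card does not include the actual evaluation code, so the snippet is illustrative only.

```python
# Illustrative only: percentage WER via the evaluate library.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["this is a prediction"]   # placeholder model transcriptions
references = ["this is the reference"]   # placeholder ground-truth transcripts

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}")
```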
|

### Framework versions

- Transformers 4.52.3
- Pytorch 2.7.0+cu126
- Datasets 3.6.0
- Tokenizers 0.21.1