# hamsa-con-arabic-ctc-streamed-v2
This model is a fine-tuned version of [Ahmed107/hamsa-conv-v2-base](https://huggingface.co/Ahmed107/hamsa-conv-v2-base); the fine-tuning dataset is not documented. It achieves the following results on the evaluation set (a sketch of how such metrics are computed follows the list):
- Loss: 0.8863
- Model Preparation Time: 0.0058
- WER: 0.7867
- CER: 0.4324
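
For reference, a minimal sketch of how WER and CER metrics like these are typically computed with the `evaluate` library; the transcripts below are placeholders, not samples from the actual evaluation set:

```python
import evaluate

# Placeholder reference/prediction pairs; substitute real eval-set
# transcripts and model outputs.
references = ["مرحبا بالعالم"]
predictions = ["مرحبا بالعالمي"]

# Word error rate and character error rate, as reported above.
wer = evaluate.load("wer")
cer = evaluate.load("cer")

print("WER:", wer.compute(predictions=predictions, references=references))
print("CER:", cer.compute(predictions=predictions, references=references))
```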
## Model description
More information needed
## Intended uses & limitations
More information needed
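
A minimal inference sketch, assuming the checkpoint loads through the standard `transformers` ASR pipeline (the audio path and sampling rate are assumptions; check the processor config for the expected input format):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint; pipeline() infers the
# architecture from the model's config.
asr = pipeline(
    "automatic-speech-recognition",
    model="Ahmed107/hamsa-con-arabic-ctc-streamed-v2",
)

# "audio.wav" is a placeholder path to an Arabic speech recording
# (speech models commonly expect 16 kHz mono audio).
result = asr("audio.wav")
print(result["text"])
```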
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (restated as a `TrainingArguments` sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 20000
- mixed_precision_training: Native AMP
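
A hedged sketch of these settings as `transformers.TrainingArguments`; the Trainer subclass, data collator, and output directory used for the actual run are not documented, so this only restates the listed values:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hamsa-con-arabic-ctc-streamed-v2",  # assumed output path
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",          # AdamW; betas=(0.9, 0.999), eps=1e-8 are the defaults
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=20000,
    fp16=True,                    # Native AMP mixed precision
)
```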
### Training results
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | WER | CER |
|---|---|---|---|---|---|---|
| 1.3385 | 0.05 | 1000 | 1.4764 | 0.0058 | 0.9184 | 0.4682 |
| 1.1819 | 0.1 | 2000 | 1.3150 | 0.0058 | 0.8721 | 0.4428 |
| 0.9857 | 0.15 | 3000 | 1.1637 | 0.0058 | 0.9038 | 0.5026 |
| 1.068 | 0.2 | 4000 | 1.1078 | 0.0058 | 0.8801 | 0.4606 |
| 0.8655 | 0.25 | 5000 | 1.0976 | 0.0058 | 0.8519 | 0.4370 |
| 0.824 | 0.3 | 6000 | 1.0824 | 0.0058 | 0.8648 | 0.4682 |
| 0.7615 | 0.35 | 7000 | 1.0664 | 0.0058 | 0.8787 | 0.4802 |
| 0.6891 | 0.4 | 8000 | 1.0283 | 0.0058 | 0.8449 | 0.4709 |
| 0.7734 | 0.45 | 9000 | 0.9946 | 0.0058 | 0.8285 | 0.4376 |
| 0.6688 | 0.5 | 10000 | 1.0166 | 0.0058 | 0.8623 | 0.4881 |
| 0.6023 | 0.55 | 11000 | 0.9605 | 0.0058 | 0.8815 | 0.4986 |
| 0.5838 | 0.6 | 12000 | 0.9216 | 0.0058 | 0.8362 | 0.4500 |
| 0.5383 | 0.65 | 13000 | 0.9378 | 0.0058 | 0.8181 | 0.4411 |
| 0.5507 | 0.7 | 14000 | 0.9053 | 0.0058 | 0.8623 | 0.4695 |
| 0.5381 | 0.75 | 15000 | 0.8918 | 0.0058 | 0.8236 | 0.4314 |
| 0.4964 | 0.8 | 16000 | 0.8826 | 0.0058 | 0.8020 | 0.4221 |
| 0.4834 | 0.85 | 17000 | 0.8796 | 0.0058 | 0.7996 | 0.4339 |
| 0.4454 | 0.9 | 18000 | 0.8689 | 0.0058 | 0.8010 | 0.4297 |
| 0.3959 | 0.95 | 19000 | 0.8872 | 0.0058 | 0.7863 | 0.4304 |
| 0.4036 | 1.0 | 20000 | 0.8863 | 0.0058 | 0.7867 | 0.4324 |
## Framework versions
- Transformers 4.52.0
- Pytorch 2.8.0+cu126
- Datasets 3.6.0
- Tokenizers 0.21.4