---
library_name: transformers
license: apache-2.0
base_model: google/t5-efficient-tiny
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: sunflower_language_classification_v1
  results: []
---

# sunflower_language_classification_v1

This model is a fine-tuned version of [google/t5-efficient-tiny](https://huggingface.co/google/t5-efficient-tiny) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7212
- Accuracy: 0.8297
- Precision: 0.8471
- Recall: 0.8297
- F1: 0.8191
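
Recall equals accuracy at every evaluation step, which is exactly what weighted-average metrics produce (weighted recall reduces algebraically to overall accuracy). The metric code is not included in the card; below is a minimal sketch of a `compute_metrics` function that would yield this set of numbers, assuming weighted averaging:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Accuracy plus weighted precision/recall/F1, matching the metrics reported above (assumed)."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```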

## Model description

A compact text classifier obtained by fine-tuning the ~16M-parameter t5-efficient-tiny backbone. The model name and metric set indicate a language-identification task (predicting the language of an input text), but the label set and number of classes are not documented.

## Intended uses & limitations

Intended for classifying text into the language labels it was trained on; a usage sketch follows below. As a very small backbone, it favors speed and memory footprint over accuracy: at ~0.83 evaluation accuracy, roughly one in six examples is misclassified. Because the training data and label set are undocumented, validate the model on your own data before relying on it.
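
A minimal inference sketch. It assumes the checkpoint was trained with a sequence-classification head (`T5ForSequenceClassification`) rather than as a seq2seq label generator, and the model id below is a placeholder:

```python
import torch
from transformers import AutoTokenizer, T5ForSequenceClassification

model_id = "sunflower_language_classification_v1"  # placeholder: local dir or Hub repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer("Bonjour tout le monde", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
pred_id = int(logits.argmax(dim=-1))
print(model.config.id2label[pred_id])  # label names depend on the training config
```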

## Training and evaluation data

Not documented. The granularity of the evaluation metrics in the table below (accuracy moving in steps of roughly 0.0036) suggests the evaluation set contains only a few hundred examples, so small per-checkpoint differences should be read with caution.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10
- training_steps: 30000
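
These settings map one-to-one onto `TrainingArguments`; the sketch below shows a plausible reconstruction. The output directory and the 500-step evaluation/logging cadence (visible in the table that follows) are assumptions:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="sunflower_language_classification_v1",  # assumed
    learning_rate=1e-3,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=10,
    max_steps=30000,
    eval_strategy="steps",  # evaluated every 500 steps, per the results table
    eval_steps=500,
    logging_steps=500,
)
```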

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|:-------------:|:------:|:-----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 2.3995 | 0.0167 | 500 | 2.0015 | 0.5145 | 0.4412 | 0.5145 | 0.4517 |
| 1.3282 | 0.0334 | 1000 | 1.6467 | 0.5688 | 0.4908 | 0.5688 | 0.5080 |
| 1.1086 | 0.0502 | 1500 | 1.5051 | 0.6304 | 0.5784 | 0.6304 | 0.5766 |
| 0.9882 | 0.0669 | 2000 | 1.4518 | 0.6268 | 0.6374 | 0.6268 | 0.5891 |
| 0.9187 | 0.0836 | 2500 | 1.3470 | 0.6522 | 0.6245 | 0.6522 | 0.6093 |
| 0.8546 | 0.1003 | 3000 | 1.3747 | 0.6159 | 0.5871 | 0.6159 | 0.5760 |
| 0.8214 | 0.1170 | 3500 | 1.2708 | 0.6703 | 0.6316 | 0.6703 | 0.6323 |
| 0.7843 | 0.1338 | 4000 | 1.1659 | 0.6848 | 0.6639 | 0.6848 | 0.6461 |
| 0.7470 | 0.1505 | 4500 | 1.1969 | 0.6848 | 0.6534 | 0.6848 | 0.6491 |
| 0.7299 | 0.1672 | 5000 | 1.0592 | 0.7101 | 0.7030 | 0.7101 | 0.6748 |
| 0.7041 | 0.1839 | 5500 | 1.0536 | 0.6848 | 0.6728 | 0.6848 | 0.6534 |
| 0.6755 | 0.2006 | 6000 | 1.0265 | 0.7138 | 0.7298 | 0.7138 | 0.6852 |
| 0.6683 | 0.2174 | 6500 | 1.0049 | 0.7428 | 0.7403 | 0.7428 | 0.7089 |
| 0.6573 | 0.2341 | 7000 | 1.0702 | 0.7029 | 0.7052 | 0.7029 | 0.6764 |
| 0.6372 | 0.2508 | 7500 | 1.0260 | 0.7210 | 0.7143 | 0.7210 | 0.6998 |
| 0.6173 | 0.2675 | 8000 | 0.9654 | 0.7428 | 0.7492 | 0.7428 | 0.7141 |
| 0.6009 | 0.2842 | 8500 | 1.0185 | 0.7464 | 0.7504 | 0.7464 | 0.7167 |
| 0.5924 | 0.3010 | 9000 | 1.0028 | 0.7283 | 0.7652 | 0.7283 | 0.7052 |
| 0.5916 | 0.3177 | 9500 | 0.9581 | 0.7174 | 0.7217 | 0.7174 | 0.6893 |
| 0.5806 | 0.3344 | 10000 | 1.0011 | 0.7355 | 0.7618 | 0.7355 | 0.7149 |
| 0.5672 | 0.3511 | 10500 | 0.8978 | 0.7572 | 0.7429 | 0.7572 | 0.7307 |
| 0.5580 | 0.3678 | 11000 | 0.9525 | 0.7210 | 0.7308 | 0.7210 | 0.7013 |
| 0.5520 | 0.3846 | 11500 | 0.8647 | 0.7645 | 0.7695 | 0.7645 | 0.7391 |
| 0.5552 | 0.4013 | 12000 | 0.8977 | 0.7536 | 0.7698 | 0.7536 | 0.7358 |
| 0.5341 | 0.4180 | 12500 | 0.8526 | 0.7536 | 0.7625 | 0.7536 | 0.7305 |
| 0.5284 | 0.4347 | 13000 | 0.8496 | 0.7464 | 0.7310 | 0.7464 | 0.7166 |
| 0.5322 | 0.4514 | 13500 | 0.7672 | 0.8007 | 0.8006 | 0.8007 | 0.7827 |
| 0.5229 | 0.4681 | 14000 | 0.8253 | 0.7754 | 0.7698 | 0.7754 | 0.7515 |
| 0.5007 | 0.4849 | 14500 | 0.8496 | 0.7826 | 0.7649 | 0.7826 | 0.7547 |
| 0.5109 | 0.5016 | 15000 | 0.7700 | 0.7754 | 0.7767 | 0.7754 | 0.7518 |
| 0.4989 | 0.5183 | 15500 | 0.8338 | 0.7645 | 0.7741 | 0.7645 | 0.7419 |
| 0.4991 | 0.5350 | 16000 | 0.7927 | 0.7754 | 0.7928 | 0.7754 | 0.7625 |
| 0.4977 | 0.5517 | 16500 | 0.7859 | 0.7790 | 0.7670 | 0.7790 | 0.7551 |
| 0.4854 | 0.5685 | 17000 | 0.7915 | 0.7862 | 0.7907 | 0.7862 | 0.7630 |
| 0.4826 | 0.5852 | 17500 | 0.7628 | 0.8043 | 0.7964 | 0.8043 | 0.7846 |
| 0.4765 | 0.6019 | 18000 | 0.7632 | 0.7971 | 0.8008 | 0.7971 | 0.7791 |
| 0.4641 | 0.6186 | 18500 | 0.7722 | 0.7935 | 0.7660 | 0.7935 | 0.7670 |
| 0.4783 | 0.6353 | 19000 | 0.7046 | 0.7899 | 0.8111 | 0.7899 | 0.7773 |
| 0.4745 | 0.6521 | 19500 | 0.7342 | 0.7899 | 0.8044 | 0.7899 | 0.7726 |
| 0.4555 | 0.6688 | 20000 | 0.7116 | 0.7862 | 0.7853 | 0.7862 | 0.7662 |
| 0.4530 | 0.6855 | 20500 | 0.7385 | 0.7754 | 0.7658 | 0.7754 | 0.7557 |
| 0.4565 | 0.7022 | 21000 | 0.7651 | 0.7899 | 0.8132 | 0.7899 | 0.7770 |
| 0.4555 | 0.7189 | 21500 | 0.7902 | 0.7681 | 0.7812 | 0.7681 | 0.7569 |
| 0.4485 | 0.7357 | 22000 | 0.7613 | 0.7862 | 0.7962 | 0.7862 | 0.7686 |
| 0.4518 | 0.7524 | 22500 | 0.7544 | 0.7862 | 0.7944 | 0.7862 | 0.7676 |
| 0.4508 | 0.7691 | 23000 | 0.7296 | 0.8043 | 0.8110 | 0.8043 | 0.7907 |
| 0.4418 | 0.7858 | 23500 | 0.7293 | 0.8261 | 0.8527 | 0.8261 | 0.8137 |
| 0.4365 | 0.8025 | 24000 | 0.7370 | 0.8043 | 0.8217 | 0.8043 | 0.7928 |
| 0.4353 | 0.8193 | 24500 | 0.7100 | 0.8188 | 0.8274 | 0.8188 | 0.8049 |
| 0.4240 | 0.8360 | 25000 | 0.7273 | 0.7862 | 0.7857 | 0.7862 | 0.7697 |
| 0.4205 | 0.8527 | 25500 | 0.7297 | 0.8225 | 0.8351 | 0.8225 | 0.8059 |
| 0.4316 | 0.8694 | 26000 | 0.7204 | 0.8116 | 0.8066 | 0.8116 | 0.7911 |
| 0.4176 | 0.8861 | 26500 | 0.7340 | 0.8080 | 0.8184 | 0.8080 | 0.7922 |
| 0.4240 | 0.9029 | 27000 | 0.7298 | 0.8116 | 0.8223 | 0.8116 | 0.7964 |
| 0.4149 | 0.9196 | 27500 | 0.7410 | 0.8188 | 0.8185 | 0.8188 | 0.8023 |
| 0.4159 | 0.9363 | 28000 | 0.7303 | 0.8152 | 0.8388 | 0.8152 | 0.8069 |
| 0.4068 | 0.9530 | 28500 | 0.7220 | 0.8043 | 0.8209 | 0.8043 | 0.7955 |
| 0.4135 | 0.9697 | 29000 | 0.7313 | 0.8188 | 0.8238 | 0.8188 | 0.8055 |
| 0.4130 | 0.9865 | 29500 | 0.7221 | 0.8225 | 0.8320 | 0.8225 | 0.8095 |
| 0.4213 | 1.0032 | 30000 | 0.7212 | 0.8297 | 0.8471 | 0.8297 | 0.8191 |

### Framework versions

- Transformers 5.8.0
- Pytorch 2.11.0+cu130
- Datasets 4.8.5
- Tokenizers 0.22.2