# CeLLaTe3.0_no_vague_adapted_tok
This model is a fine-tuned version of [Mardiyyah/cellate1.0-tapt_freeze_llrd_ww_mask-LR_2e-05](https://huggingface.co/Mardiyyah/cellate1.0-tapt_freeze_llrd_ww_mask-LR_2e-05) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.2213
- Precision: 0.9177
- Recall: 0.9296
- F1: 0.9236
- Accuracy: 0.9649
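The per-entity Precision/Recall/F1 alongside token-level Accuracy suggest a token-classification (NER-style) head, though the card does not state the task. A minimal inference sketch under that assumption, using the standard `transformers` pipeline; the repo id is inferred from the title and base-model owner and may differ:

```python
from transformers import pipeline

# Assumption: this checkpoint exposes a token-classification head, and the
# repo id below (inferred, not stated in the card) points at it on the Hub.
ner = pipeline(
    "token-classification",
    model="Mardiyyah/CeLLaTe3.0_no_vague_adapted_tok",
    aggregation_strategy="simple",  # merge sub-word pieces into entity spans
)

print(ner("Example sentence to tag."))
```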
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 3407
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 20
- mixed_precision_training: Native AMP
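A sketch of how the hyperparameters above map onto `transformers.TrainingArguments`; the output path and the per-epoch evaluation/logging cadence are assumptions (the per-epoch rows in the results table below are consistent with them), not taken from this card:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a placeholder.
args = TrainingArguments(
    output_dir="cellate3.0-no-vague-adapted-tok",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=3407,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=20,
    fp16=True,                  # "Native AMP" mixed precision
    eval_strategy="epoch",      # assumed; the table reports metrics per epoch
    logging_strategy="epoch",
)
```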
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| 1.0902 | 1.0 | 201 | 0.3240 | 0.7557 | 0.7769 | 0.7662 | 0.9090 |
| 0.2236 | 2.0 | 402 | 0.1587 | 0.8654 | 0.9202 | 0.8919 | 0.9594 |
| 0.1178 | 3.0 | 603 | 0.1405 | 0.9020 | 0.9375 | 0.9194 | 0.9639 |
| 0.0779 | 4.0 | 804 | 0.1647 | 0.8984 | 0.9267 | 0.9124 | 0.9608 |
| 0.0543 | 5.0 | 1005 | 0.1751 | 0.9070 | 0.9261 | 0.9165 | 0.9633 |
| 0.04 | 6.0 | 1206 | 0.2010 | 0.8966 | 0.9239 | 0.9101 | 0.9600 |
| 0.0306 | 7.0 | 1407 | 0.2047 | 0.9005 | 0.9200 | 0.9101 | 0.9595 |
| 0.0242 | 8.0 | 1608 | 0.2117 | 0.9028 | 0.9172 | 0.9099 | 0.9621 |
| 0.019 | 9.0 | 1809 | 0.2070 | 0.9126 | 0.9316 | 0.9220 | 0.9652 |
| 0.0145 | 10.0 | 2010 | 0.2219 | 0.9177 | 0.9296 | 0.9236 | 0.9649 |
| 0.0113 | 11.0 | 2211 | 0.2304 | 0.9045 | 0.9299 | 0.9170 | 0.9627 |
| 0.0089 | 12.0 | 2412 | 0.2482 | 0.9026 | 0.9216 | 0.9120 | 0.9618 |
| 0.0079 | 13.0 | 2613 | 0.2594 | 0.9057 | 0.9214 | 0.9135 | 0.9613 |
| 0.0068 | 14.0 | 2814 | 0.3033 | 0.8992 | 0.9096 | 0.9044 | 0.9596 |
| 0.0053 | 15.0 | 3015 | 0.2722 | 0.9062 | 0.9200 | 0.9130 | 0.9624 |
| 0.0051 | 16.0 | 3216 | 0.2651 | 0.9080 | 0.9238 | 0.9158 | 0.9637 |
| 0.0041 | 17.0 | 3417 | 0.2697 | 0.9121 | 0.9211 | 0.9166 | 0.9635 |
| 0.0038 | 18.0 | 3618 | 0.2755 | 0.9078 | 0.9246 | 0.9161 | 0.9634 |
| 0.0032 | 19.0 | 3819 | 0.2719 | 0.9078 | 0.9261 | 0.9168 | 0.9633 |
| 0.0032 | 20.0 | 4020 | 0.2706 | 0.9103 | 0.9252 | 0.9177 | 0.9636 |
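The Precision/Recall/F1/Accuracy columns above are the overall scores that `Trainer` typically reports via `seqeval` for token classification. A sketch of such a `compute_metrics` function, assuming a `label_list` of BIO tags (the actual tag set is not given in this card):

```python
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
label_list = ["O", "B-ENT", "I-ENT"]  # placeholder; real tag set not stated in the card

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # Drop special/padded tokens, which the Trainer labels with -100.
    true_predictions = [
        [label_list[p] for (p, l) in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for (p, l) in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```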
### Framework versions
- Transformers 4.48.2
- Pytorch 2.4.1+cu121
- Datasets 3.0.2
- Tokenizers 0.21.0