# Biobert_combo_1-6lakh-8-8-2
This model is a fine-tuned version of [dmis-lab/biobert-base-cased-v1.2](https://huggingface.co/dmis-lab/biobert-base-cased-v1.2) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.5221
- Accuracy: 0.62
- AUC: 0.602
- Precision: 0.586
- Recall: 0.773
- F1: 0.666
- F1-macro: 0.612
- F1-micro: 0.62
- F1-weighted: 0.611
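The headline scores above are internally consistent: the reported F1 follows from the reported precision and recall. A quick check (pure arithmetic, no dependencies):

```python
# Reported evaluation metrics from this card
precision = 0.586
recall = 0.773

# F1 is the harmonic mean of precision and recall
f1 = 2 * precision * recall / (precision + recall)

print(f1)  # ~0.6666, consistent with the reported F1 of 0.666 up to rounding
```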
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: AdamW (`adamw_torch_fused`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 2
- mixed_precision_training: Native AMP
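Assuming the standard Hugging Face `Trainer` API, the hyperparameters above correspond roughly to the following `TrainingArguments`. This is a sketch, not the author's exact configuration: the per-device batch sizes, `fp16` flag (for "Native AMP"), and `output_dir` are inferred from the values listed.

```python
from transformers import TrainingArguments

# Sketch of the training configuration listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="Biobert_combo_1-6lakh-8-8-2",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    seed=42,
    optim="adamw_torch_fused",
    lr_scheduler_type="linear",
    num_train_epochs=2,
    fp16=True,  # "Native AMP" mixed-precision training
)
```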
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | AUC | Precision | Recall | F1 | F1-macro | F1-micro | F1-weighted |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.3173 | 0.1935 | 2000 | 1.4050 | 0.584 | 0.572 | 0.562 | 0.703 | 0.624 | 0.58 | 0.584 | 0.579 |
| 0.3027 | 0.3870 | 4000 | 1.6222 | 0.603 | 0.585 | 0.574 | 0.744 | 0.648 | 0.596 | 0.603 | 0.595 |
| 0.4245 | 0.5805 | 6000 | 1.3241 | 0.605 | 0.571 | 0.569 | 0.81 | 0.668 | 0.59 | 0.605 | 0.589 |
| 0.4055 | 0.7740 | 8000 | 1.2442 | 0.607 | 0.593 | 0.572 | 0.787 | 0.663 | 0.595 | 0.607 | 0.594 |
| 0.3941 | 0.9675 | 10000 | 1.2942 | 0.61 | 0.595 | 0.573 | 0.813 | 0.672 | 0.596 | 0.61 | 0.595 |
| 0.3418 | 1.1610 | 12000 | 1.4656 | 0.614 | 0.601 | 0.581 | 0.77 | 0.662 | 0.606 | 0.614 | 0.606 |
| 0.3277 | 1.3545 | 14000 | 1.4494 | 0.615 | 0.609 | 0.589 | 0.717 | 0.647 | 0.612 | 0.615 | 0.611 |
| 0.3228 | 1.5480 | 16000 | 1.4825 | 0.616 | 0.598 | 0.585 | 0.756 | 0.659 | 0.61 | 0.616 | 0.609 |
| 0.3185 | 1.7415 | 18000 | 1.5090 | 0.618 | 0.602 | 0.588 | 0.742 | 0.656 | 0.614 | 0.618 | 0.613 |
| 0.3098 | 1.9350 | 20000 | 1.5221 | 0.62 | 0.602 | 0.586 | 0.773 | 0.666 | 0.612 | 0.62 | 0.611 |
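The step/epoch cadence in the table lets us back out an approximate training-set size. This is a rough estimate, assuming each logged step is one optimizer update at the effective batch size of 16:

```python
# At step 2000 the log reports epoch 0.1935, so one epoch is roughly
steps_per_epoch = 2000 / 0.1935  # ~10,336 optimizer steps

# Effective batch size = train_batch_size * gradient_accumulation_steps
effective_batch = 8 * 2

# Estimated number of training examples
n_examples = steps_per_epoch * effective_batch
print(int(n_examples))  # ~165,000
```

The estimate (~165k examples) is consistent with the "1-6lakh" in the model name (1.6 lakh = 160,000).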
### Framework versions

- Transformers 4.55.2
- PyTorch 2.8.0+cu126
- Datasets 4.0.0
- Tokenizers 0.21.4