# biotech-sentiment-modernbert-large

This model is a fine-tuned version of [thomas-sounack/BioClinical-ModernBERT-large](https://huggingface.co/thomas-sounack/BioClinical-ModernBERT-large) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 4.3531
- F1 Weighted: 0.5360
- F1 Macro: 0.4102
- F1 Micro: 0.5577
- Accuracy: 0.5577
- F1 Class 0: 0.2881
- F1 Class 1: 0.7143
- F1 Class 2: 0.2281
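Note that micro-averaged F1 and accuracy are identical here (both 0.5577). That is not a coincidence: in single-label multiclass classification, every false positive for one class is a false negative for another, so micro precision, micro recall, micro F1, and accuracy all coincide. A minimal pure-Python sketch on toy labels (the label values are illustrative, not from this model's dataset):

```python
def micro_f1(y_true, y_pred):
    # Pool true/false positives and false negatives across ALL classes,
    # then compute a single F1 from the pooled counts.
    classes = set(y_true) | set(y_pred)
    tp = fp = fn = 0
    for c in classes:
        for t, p in zip(y_true, y_pred):
            tp += (t == c and p == c)
            fp += (t != c and p == c)
            fn += (t == c and p != c)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

# Toy 3-class predictions (hypothetical data for illustration)
y_true = [0, 1, 2, 1, 1, 0, 2, 1]
y_pred = [0, 1, 1, 1, 2, 0, 2, 0]
```

With these toy lists both functions return 0.625, matching the equality seen in the metrics above.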
## Model description
More information needed
## Intended uses & limitations
More information needed
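In the absence of documented usage, a hedged inference sketch follows. The class-name mapping is an assumption: the card reports three classes (0, 1, 2) but does not say which sentiment each id denotes, so the names below are hypothetical placeholders.

```python
def label_name(label_id):
    # ASSUMED mapping -- the model card does not document class names.
    names = {0: "negative", 1: "neutral", 2: "positive"}
    return names[label_id]

def predict(texts):
    # Lazy import so label_name() stays usable without transformers installed.
    from transformers import pipeline
    clf = pipeline(
        "text-classification",
        model="akseljoonas/biotech-sentiment-modernbert-large",
    )
    return clf(texts)
```

Calling `predict(["The phase 3 trial met its primary endpoint."])` would return a list of `{"label": ..., "score": ...}` dicts in the standard pipeline format; map the numeric label ids through `label_name` only after confirming the true class semantics.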
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 250
- num_epochs: 20
- mixed_precision_training: Native AMP
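The hyperparameters above can be collected into a Trainer-style configuration; this is a sketch assuming a single training device (the card does not state the device count), which is consistent with the reported effective batch size of 32:

```python
# Reported hyperparameters as a plain dict (keys follow the
# transformers TrainingArguments naming convention).
config = {
    "learning_rate": 2e-5,
    "per_device_train_batch_size": 16,
    "per_device_eval_batch_size": 32,
    "seed": 42,
    "gradient_accumulation_steps": 2,
    "lr_scheduler_type": "cosine",
    "warmup_steps": 250,
    "num_train_epochs": 20,
    "fp16": True,  # "Native AMP" mixed precision
}

# Effective batch size = per-device batch * accumulation steps
# (assuming one device), matching the reported total of 32.
effective_batch = (
    config["per_device_train_batch_size"]
    * config["gradient_accumulation_steps"]
)
```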
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 Weighted | F1 Macro | F1 Micro | Accuracy | F1 Class 0 | F1 Class 1 | F1 Class 2 |
|---|---|---|---|---|---|---|---|---|---|---|
| 2.1818 | 1.0 | 127 | 1.0826 | 0.3588 | 0.3214 | 0.3504 | 0.3504 | 0.3487 | 0.4120 | 0.2034 |
| 2.2104 | 2.0 | 254 | 1.1365 | 0.2625 | 0.2229 | 0.2991 | 0.2991 | 0.3404 | 0.3185 | 0.0100 |
| 2.0356 | 3.0 | 381 | 1.2092 | 0.3658 | 0.2829 | 0.3722 | 0.3722 | 0.0293 | 0.4832 | 0.3363 |
| 1.6188 | 4.0 | 508 | 1.1989 | 0.3326 | 0.3157 | 0.3198 | 0.3198 | 0.2950 | 0.3567 | 0.2952 |
| 1.1139 | 5.0 | 635 | 1.5419 | 0.5282 | 0.4009 | 0.5479 | 0.5479 | 0.2277 | 0.7086 | 0.2663 |
| 0.7829 | 6.0 | 762 | 2.0069 | 0.5263 | 0.4022 | 0.5410 | 0.5410 | 0.3171 | 0.7022 | 0.1871 |
| 0.5745 | 7.0 | 889 | 2.6626 | 0.5037 | 0.3840 | 0.5035 | 0.5035 | 0.1843 | 0.6733 | 0.2943 |
| 0.4521 | 8.0 | 1016 | 2.7618 | 0.5304 | 0.4041 | 0.5508 | 0.5508 | 0.2569 | 0.7094 | 0.2460 |
| 0.3251 | 9.0 | 1143 | 2.9332 | 0.5332 | 0.3932 | 0.5706 | 0.5706 | 0.2360 | 0.7317 | 0.2118 |
| 0.2154 | 10.0 | 1270 | 2.9890 | 0.5285 | 0.4127 | 0.5350 | 0.5350 | 0.3134 | 0.6927 | 0.2321 |
| 0.1807 | 11.0 | 1397 | 3.2419 | 0.5279 | 0.4132 | 0.5321 | 0.5321 | 0.2995 | 0.6906 | 0.2493 |
| 0.0934 | 12.0 | 1524 | 3.5012 | 0.5121 | 0.4031 | 0.5104 | 0.5104 | 0.3230 | 0.6667 | 0.2196 |
| 0.0738 | 13.0 | 1651 | 4.1345 | 0.5332 | 0.4017 | 0.5607 | 0.5607 | 0.2943 | 0.7197 | 0.1911 |
| 0.0107 | 14.0 | 1778 | 4.0537 | 0.5337 | 0.4117 | 0.5499 | 0.5499 | 0.3128 | 0.7066 | 0.2156 |
| 0.0141 | 15.0 | 1905 | 4.0290 | 0.5199 | 0.3972 | 0.5301 | 0.5301 | 0.2948 | 0.6939 | 0.2029 |
| 0.0067 | 16.0 | 2032 | 4.3166 | 0.5352 | 0.4112 | 0.5558 | 0.5558 | 0.2992 | 0.7109 | 0.2235 |
| 0.0024 | 17.0 | 2159 | 4.3491 | 0.5359 | 0.4101 | 0.5577 | 0.5577 | 0.2865 | 0.7143 | 0.2294 |
| 0.0101 | 18.0 | 2286 | 4.3545 | 0.5352 | 0.4096 | 0.5568 | 0.5568 | 0.2881 | 0.7133 | 0.2274 |
| 0.0025 | 19.0 | 2413 | 4.3476 | 0.5354 | 0.4095 | 0.5568 | 0.5568 | 0.2873 | 0.7139 | 0.2274 |
| 0.0091 | 20.0 | 2540 | 4.3531 | 0.5360 | 0.4102 | 0.5577 | 0.5577 | 0.2881 | 0.7143 | 0.2281 |
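The learning-rate schedule implied above (cosine decay with 250 warmup steps over the 2,540 total steps shown in the table) can be sketched as follows. This mirrors the shape of `transformers.get_cosine_schedule_with_warmup` but is an illustrative reimplementation, not the exact code used:

```python
import math

def cosine_warmup_lr(step, base_lr=2e-5, warmup_steps=250, total_steps=2540):
    # Linear warmup from 0 to base_lr over the first `warmup_steps` steps...
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    # ...then cosine decay from base_lr down to 0 at `total_steps`.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

The peak rate of 2e-05 is reached at step 250 and decays to zero by the final step, so later epochs make increasingly small weight updates; note also that validation loss rises steadily after epoch 5 while training loss approaches zero, the usual signature of overfitting.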
### Framework versions
- Transformers 5.3.0
- PyTorch 2.10.0+cu128
- Datasets 4.6.1
- Tokenizers 0.22.2
## Model tree for akseljoonas/biotech-sentiment-modernbert-large

- Base model: [answerdotai/ModernBERT-large](https://huggingface.co/answerdotai/ModernBERT-large)