BioLinkBERT-mimic-phi-ner
This model is a fine-tuned version of michiyasunaga/BioLinkBERT-base for named-entity recognition of protected health information (PHI). It achieves the following results on the evaluation set:
- Loss: 0.0153
- F1 Macro: 0.9100
- F1 Weighted: 0.9100
- Precision: 0.8749
- Recall: 0.9481
- F1 Name: 0.92
- F1 Location: 0.88
- F1 Phone: 0.91
- F1 Date: 0.9
- F1 Mrn: 0.65
- F1 Account: 0.66
- F1 Age Over 89: 0.0
- F1 Device Id: 0.5
- F1 Ssn: 0.39
- F1 Url: 0.15
- F1 Email: 0.12
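A minimal inference sketch using the transformers token-classification pipeline is shown below. The aggregation strategy, the example text, and the exact label scheme (e.g. BIO-tagged variants of the entity types listed above) are illustrative assumptions, not taken from this card.

```python
from transformers import pipeline

# Token-classification pipeline for PHI NER; aggregation_strategy="simple"
# merges word pieces into whole entity spans (an assumed, common setting).
ner = pipeline(
    "token-classification",
    model="racheltong/BioLinkBERT-mimic-phi-ner",
    aggregation_strategy="simple",
)

# Toy clinical-style sentence for illustration only.
text = "Patient John Smith was seen on 03/14/2024 at Boston General, MRN 1234567."
for entity in ner(text):
    # Each entity dict carries the predicted PHI type, the matched span, and a score.
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```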
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 0.1
- num_epochs: 2
- mixed_precision_training: Native AMP
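As a hedged illustration, the hyperparameters above roughly correspond to the following transformers.TrainingArguments; the output directory and any settings not listed above are placeholders, and the 0.1 warmup value is interpreted here as a warmup ratio.

```python
from transformers import TrainingArguments

# Sketch of the listed hyperparameters mapped onto TrainingArguments.
training_args = TrainingArguments(
    output_dir="biolinkbert-mimic-phi-ner",  # placeholder, not from the card
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=2,   # gives the effective train batch size of 32
    num_train_epochs=2,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,                # the card lists 0.1 warmup "steps"; a ratio is assumed
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=42,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```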
Training results
| Training Loss | Epoch | Step | Validation Loss | F1 Macro | F1 Weighted | Precision | Recall | F1 Name | F1 Location | F1 Phone | F1 Date | F1 Mrn | F1 Account | F1 Age Over 89 | F1 Device Id | F1 Ssn | F1 Url | F1 Email |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.8832 | 0.2020 | 300 | 0.2738 | 0.5399 | 0.5399 | 0.4427 | 0.6917 | 0.64 | 0.31 | 0.22 | 0.16 | 0.05 | 0.79 | 0.14 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.2048 | 0.4039 | 600 | 0.0948 | 0.7961 | 0.7961 | 0.7392 | 0.8624 | 0.85 | 0.58 | 0.7 | 0.64 | 0.47 | 0.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1982 | 0.6059 | 900 | 0.0432 | 0.7818 | 0.7818 | 0.7165 | 0.8602 | 0.83 | 0.61 | 0.82 | 0.66 | 0.35 | 0.6 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0654 | 0.8078 | 1200 | 0.0325 | 0.8718 | 0.8718 | 0.8240 | 0.9255 | 0.9 | 0.79 | 0.85 | 0.85 | 0.46 | 0.75 | 0.0 | 0.19 | 0.19 | 0.12 | 0.0 |
| 0.0755 | 1.0094 | 1500 | 0.0219 | 0.8969 | 0.8969 | 0.8590 | 0.9383 | 0.92 | 0.86 | 0.89 | 0.88 | 0.67 | 0.67 | 0.0 | 0.23 | 0.15 | 0.0 | 0.0 |
| 0.0614 | 1.2114 | 1800 | 0.0174 | 0.8885 | 0.8885 | 0.8412 | 0.9415 | 0.91 | 0.87 | 0.88 | 0.83 | 0.48 | 0.66 | 0.0 | 0.15 | 0.18 | 0.0 | 0.0 |
| 0.0241 | 1.4133 | 2100 | 0.0165 | 0.9065 | 0.9065 | 0.8701 | 0.9460 | 0.92 | 0.88 | 0.91 | 0.9 | 0.66 | 0.66 | 0.0 | 0.35 | 0.29 | 0.0 | 0.0 |
| 0.0414 | 1.6153 | 2400 | 0.0175 | 0.9193 | 0.9193 | 0.8895 | 0.9512 | 0.93 | 0.89 | 0.91 | 0.89 | 0.65 | 0.66 | 0.0 | 0.33 | 0.36 | 0.15 | 0.12 |
| 0.0303 | 1.8172 | 2700 | 0.0153 | 0.9100 | 0.9100 | 0.8749 | 0.9481 | 0.92 | 0.88 | 0.91 | 0.9 | 0.65 | 0.66 | 0.0 | 0.5 | 0.39 | 0.15 | 0.12 |
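The per-entity scores above are consistent with entity-level evaluation of the kind produced by libraries such as seqeval; the sketch below shows how such a report could be generated. The BIO label set and the toy sequences are assumptions for illustration, not the card's actual evaluation code.

```python
from seqeval.metrics import classification_report

# Toy gold and predicted tag sequences using an assumed BIO scheme.
references = [["O", "B-NAME", "I-NAME", "O", "B-DATE", "O"]]
predictions = [["O", "B-NAME", "I-NAME", "O", "B-DATE", "B-MRN"]]

# Prints precision, recall, and F1 per entity type, plus macro and weighted averages.
print(classification_report(references, predictions, digits=2))
```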
Framework versions
- Transformers 5.0.0
- Pytorch 2.10.0+cu128
- Datasets 4.0.0
- Tokenizers 0.22.2
Model tree for racheltong/BioLinkBERT-mimic-phi-ner
- Base model: michiyasunaga/BioLinkBERT-base