---
license: mit
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: Bio_ClinicalBERT_fold_6_binary_v1
  results: []
---

# Bio_ClinicalBERT_fold_6_binary_v1

This model is a fine-tuned version of [emilyalsentzer/Bio_ClinicalBERT](https://huggingface.co/emilyalsentzer/Bio_ClinicalBERT) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7858
- F1: 0.8079

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch appears at the end of this card):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 25

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1     |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| No log        | 1.0   | 290  | 0.4223          | 0.7938 |
| 0.4052        | 2.0   | 580  | 0.4262          | 0.7991 |
| 0.4052        | 3.0   | 870  | 0.5859          | 0.8201 |
| 0.1894        | 4.0   | 1160 | 0.9158          | 0.7859 |
| 0.1894        | 5.0   | 1450 | 1.0524          | 0.8018 |
| 0.0845        | 6.0   | 1740 | 1.0179          | 0.8041 |
| 0.038         | 7.0   | 2030 | 1.2477          | 0.8047 |
| 0.038         | 8.0   | 2320 | 1.2635          | 0.8111 |
| 0.014         | 9.0   | 2610 | 1.4297          | 0.8018 |
| 0.014         | 10.0  | 2900 | 1.4499          | 0.8034 |
| 0.0119        | 11.0  | 3190 | 1.4388          | 0.8194 |
| 0.0119        | 12.0  | 3480 | 1.4813          | 0.8082 |
| 0.0145        | 13.0  | 3770 | 1.5423          | 0.8063 |
| 0.006         | 14.0  | 4060 | 1.5658          | 0.8104 |
| 0.006         | 15.0  | 4350 | 1.6268          | 0.8052 |
| 0.0021        | 16.0  | 4640 | 1.6671          | 0.8148 |
| 0.0021        | 17.0  | 4930 | 1.7222          | 0.8132 |
| 0.005         | 18.0  | 5220 | 1.7973          | 0.8014 |
| 0.0031        | 19.0  | 5510 | 1.7613          | 0.8054 |
| 0.0031        | 20.0  | 5800 | 1.7653          | 0.8071 |
| 0.0099        | 21.0  | 6090 | 1.7343          | 0.7996 |
| 0.0099        | 22.0  | 6380 | 1.7679          | 0.8104 |
| 0.0015        | 23.0  | 6670 | 1.7916          | 0.8095 |
| 0.0015        | 24.0  | 6960 | 1.7815          | 0.8062 |
| 0.0028        | 25.0  | 7250 | 1.7858          | 0.8079 |

### Framework versions

- Transformers 4.21.0
- PyTorch 1.12.0+cu113
- Datasets 2.4.0
- Tokenizers 0.12.1
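
## Reproducing the training setup

The card does not document the training script, so the following is only a minimal sketch of how the hyperparameters listed above map onto `transformers.TrainingArguments` (per-epoch evaluation is inferred from the results table). The dataset, metric wiring, and `output_dir` are not documented and appear here as placeholders.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Hyperparameters copied from the list above; everything else is a placeholder.
training_args = TrainingArguments(
    output_dir="Bio_ClinicalBERT_fold_6_binary_v1",  # placeholder output path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=25,
    evaluation_strategy="epoch",  # inferred from the per-epoch results table
)

tokenizer = AutoTokenizer.from_pretrained("emilyalsentzer/Bio_ClinicalBERT")
model = AutoModelForSequenceClassification.from_pretrained(
    "emilyalsentzer/Bio_ClinicalBERT",
    num_labels=2,  # binary task, inferred from the model name
)

# The train/eval datasets and the F1 compute_metrics function are not
# documented on this card, so the Trainer wiring is left as a stub:
# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=..., eval_dataset=..., compute_metrics=...)
# trainer.train()
```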
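
## How to use

Since usage is not yet documented above, here is a minimal inference sketch for a binary sequence classifier built on this checkpoint. The repo id and the example clinical sentence are placeholders, and the meaning of the two labels is not documented on this card.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder: substitute the actual Hub repo id or a local checkpoint directory.
model_id = "Bio_ClinicalBERT_fold_6_binary_v1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Illustrative clinical note snippet only.
text = "Patient admitted with shortness of breath and pleuritic chest pain."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(dim=-1).item()  # 0 or 1; label semantics are undocumented
print(pred)
```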