roscazo committed · Commit c8dfd92 · 1 Parent(s): 307d8b0

update model card README.md

Files changed (1): README.md (+24 -20)
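The entity-level scores in the card below are standard precision/recall pairs with F1 as their harmonic mean. As a quick sanity check on the reported numbers, the relation can be sketched in plain Python (the helper name is illustrative, not part of the training code):

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall (illustrative helper)."""
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Final-epoch figures from the updated card:
# ANAT: precision 0.7522, recall 0.7147  -> F1 ~ 0.733
# DISO: precision 0.8915, recall 0.8919  -> F1 ~ 0.8917
anat_f1 = f1_score(0.7522, 0.7147)
diso_f1 = f1_score(0.8915, 0.8919)
print(round(anat_f1, 4), round(diso_f1, 4))
```

Both values reproduce the Anat F1 (0.7330) and Diso F1 (0.8917) listed in the card to rounding precision.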
README.md CHANGED

```diff
@@ -12,21 +12,21 @@ should probably proofread and complete it, then remove this comment. -->
 
 # CTEBMSP_ANAT_DISO
 
-This model is a fine-tuned version of [PlanTL-GOB-ES/bsc-bio-ehr-es](https://huggingface.co/PlanTL-GOB-ES/bsc-bio-ehr-es) on an unknown dataset.
+This model is a fine-tuned version of [PlanTL-GOB-ES/bsc-bio-ehr-es](https://huggingface.co/PlanTL-GOB-ES/bsc-bio-ehr-es) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.0625
-- Anat Precision: 0.6977
-- Anat Recall: 0.6648
-- Anat F1: 0.6809
+- Loss: 0.0909
+- Anat Precision: 0.7522
+- Anat Recall: 0.7147
+- Anat F1: 0.7330
 - Anat Number: 361
-- Diso Precision: 0.8914
-- Diso Recall: 0.8972
-- Diso F1: 0.8943
+- Diso Precision: 0.8915
+- Diso Recall: 0.8919
+- Diso F1: 0.8917
 - Diso Number: 2645
-- Overall Precision: 0.8693
-- Overall Recall: 0.8693
-- Overall F1: 0.8693
-- Overall Accuracy: 0.9869
+- Overall Precision: 0.8755
+- Overall Recall: 0.8706
+- Overall F1: 0.8731
+- Overall Accuracy: 0.9873
 
 ## Model description
 
@@ -45,22 +45,26 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- learning_rate: 2e-05
+- learning_rate: 8e-05
 - train_batch_size: 8
 - eval_batch_size: 8
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs: 4
+- num_epochs: 8
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Anat Precision | Anat Recall | Anat F1 | Anat Number | Diso Precision | Diso Recall | Diso F1 | Diso Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
-|:-------------:|:-----:|:----:|:---------------:|:--------------:|:-----------:|:-------:|:-----------:|:--------------:|:-----------:|:-------:|:-----------:|:-----------------:|:--------------:|:----------:|:----------------:|
-| 0.0541 | 1.0 | 1570 | 0.0541 | 0.6821 | 0.5291 | 0.5959 | 361 | 0.8545 | 0.8949 | 0.8742 | 2645 | 0.8387 | 0.8510 | 0.8448 | 0.9850 |
-| 0.0281 | 2.0 | 3140 | 0.0520 | 0.6339 | 0.7147 | 0.6719 | 361 | 0.8683 | 0.8998 | 0.8838 | 2645 | 0.8380 | 0.8776 | 0.8573 | 0.9860 |
-| 0.0148 | 3.0 | 4710 | 0.0573 | 0.7188 | 0.6371 | 0.6755 | 361 | 0.8882 | 0.8983 | 0.8932 | 2645 | 0.8701 | 0.8669 | 0.8685 | 0.9868 |
-| 0.0081 | 4.0 | 6280 | 0.0625 | 0.6977 | 0.6648 | 0.6809 | 361 | 0.8914 | 0.8972 | 0.8943 | 2645 | 0.8693 | 0.8693 | 0.8693 | 0.9869 |
+| Training Loss | Epoch | Step  | Validation Loss | Anat Precision | Anat Recall | Anat F1 | Anat Number | Diso Precision | Diso Recall | Diso F1 | Diso Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+|:-------------:|:-----:|:-----:|:---------------:|:--------------:|:-----------:|:-------:|:-----------:|:--------------:|:-----------:|:-------:|:-----------:|:-----------------:|:--------------:|:----------:|:----------------:|
+| 0.0592 | 1.0 | 2133  | 0.0506 | 0.6950 | 0.4986 | 0.5806 | 361 | 0.8635 | 0.8609 | 0.8622 | 2645 | 0.8484 | 0.8174 | 0.8326 | 0.9843 |
+| 0.0323 | 2.0 | 4266  | 0.0583 | 0.7899 | 0.6039 | 0.6845 | 361 | 0.8780 | 0.8817 | 0.8798 | 2645 | 0.8697 | 0.8483 | 0.8589 | 0.9858 |
+| 0.0201 | 3.0 | 6399  | 0.0580 | 0.6565 | 0.7147 | 0.6844 | 361 | 0.8598 | 0.8764 | 0.8680 | 2645 | 0.8339 | 0.8570 | 0.8453 | 0.9851 |
+| 0.0121 | 4.0 | 8532  | 0.0758 | 0.7240 | 0.6759 | 0.6991 | 361 | 0.8976 | 0.8752 | 0.8863 | 2645 | 0.8776 | 0.8513 | 0.8642 | 0.9863 |
+| 0.0078 | 5.0 | 10665 | 0.0814 | 0.7219 | 0.7119 | 0.7169 | 361 | 0.8776 | 0.8975 | 0.8875 | 2645 | 0.8595 | 0.8752 | 0.8673 | 0.9862 |
+| 0.0031 | 6.0 | 12798 | 0.0974 | 0.7599 | 0.6399 | 0.6947 | 361 | 0.8895 | 0.8915 | 0.8905 | 2645 | 0.8761 | 0.8613 | 0.8686 | 0.9867 |
+| 0.002  | 7.0 | 14931 | 0.0980 | 0.7143 | 0.6787 | 0.6960 | 361 | 0.8813 | 0.8957 | 0.8884 | 2645 | 0.8624 | 0.8696 | 0.8660 | 0.9860 |
+| 0.0005 | 8.0 | 17064 | 0.0909 | 0.7522 | 0.7147 | 0.7330 | 361 | 0.8915 | 0.8919 | 0.8917 | 2645 | 0.8755 | 0.8706 | 0.8731 | 0.9873 |
 
 
 ### Framework versions
```
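A minimal sketch of how the updated hyperparameters fit together: with a linear scheduler and no warmup (the card lists none, so zero warmup is an assumption here), the learning rate decays from 8e-05 to zero over the full run of 17064 steps (8 epochs at 2133 steps per epoch, per the results table). The dict and helper below are illustrative, not the card's actual training script:

```python
# Hyperparameters as listed in the updated card (illustrative dict form).
config = {
    "learning_rate": 8e-05,
    "train_batch_size": 8,
    "eval_batch_size": 8,
    "seed": 42,
    "lr_scheduler_type": "linear",
    "num_epochs": 8,
}

STEPS_PER_EPOCH = 2133  # from the results table: epoch 1.0 ends at step 2133
TOTAL_STEPS = STEPS_PER_EPOCH * config["num_epochs"]  # 17064, the final row's step

def linear_lr(step: int, base_lr: float = config["learning_rate"],
              total_steps: int = TOTAL_STEPS) -> float:
    """Linearly decay the learning rate to zero (assumes zero warmup steps)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))      # base rate 8e-05 at the start of training
print(linear_lr(8532))   # half the base rate at epoch 4 (step 8532)
print(linear_lr(17064))  # 0.0 at the end of epoch 8
```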
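The optimizer line ("Adam with betas=(0.9,0.999) and epsilon=1e-08") can be unpacked as the standard Adam update, where beta1/beta2 control the moving averages of the gradient and its square and epsilon guards the division. A self-contained sketch of one scalar parameter update, using the card's settings as defaults:

```python
import math

def adam_step(param: float, grad: float, m: float, v: float, t: int,
              lr: float = 8e-05, beta1: float = 0.9, beta2: float = 0.999,
              eps: float = 1e-08) -> tuple[float, float, float]:
    """One Adam update for a scalar parameter (t is the 1-based step count)."""
    m = beta1 * m + (1 - beta1) * grad        # first-moment moving average
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment moving average
    m_hat = m / (1 - beta1 ** t)              # bias correction
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (math.sqrt(v_hat) + eps)
    return param, m, v

# On the very first step the bias-corrected moments cancel, so the update
# size is close to lr regardless of the gradient's magnitude.
p, m, v = adam_step(param=1.0, grad=0.5, m=0.0, v=0.0, t=1)
print(p)  # roughly 1.0 - 8e-05
```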