---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: modelBsc
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# modelBsc

This model is a fine-tuned version of [PlanTL-GOB-ES/bsc-bio-ehr-es](https://huggingface.co/PlanTL-GOB-ES/bsc-bio-ehr-es) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1830
- Precision: 0.6400
- Recall: 0.6207
- F1: 0.6302
- Accuracy: 0.9685
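As a quick sanity check (not part of the original card), the reported F1 is the harmonic mean of the listed precision and recall:

```python
# F1 is the harmonic mean of precision and recall.
precision, recall = 0.64, 0.6207
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.6302, matching the reported value
```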

## Model description

More information needed

## Intended uses & limitations

More information needed
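The card does not include a usage snippet. Assuming the metrics (precision/recall/F1) indicate a token-classification (NER) head, a minimal loading sketch with the standard `transformers` API might look like the following; the Hub repo id `modelBsc` and the example sentence are placeholders:

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

# Repo id is assumed; replace with the actual Hub path of this model.
model_id = "modelBsc"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# aggregation_strategy="simple" merges sub-word tokens into whole entities.
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)
print(ner("El paciente presenta dolor abdominal agudo."))
```

Since the base model is a Spanish biomedical/clinical encoder, inputs should be Spanish clinical text.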

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 32
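The hyperparameters above correspond roughly to the following `TrainingArguments` configuration (a sketch; `output_dir` is an assumption, everything else maps directly to the listed values):

```python
from transformers import TrainingArguments

# Sketch reproducing the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="modelBsc",          # assumed output path
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=32,
)
```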

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 1.0   | 29   | 0.3033          | 0.0213    | 0.0043 | 0.0072 | 0.9174   |
| No log        | 2.0   | 58   | 0.2425          | 0.1522    | 0.0302 | 0.0504 | 0.9325   |
| No log        | 3.0   | 87   | 0.1906          | 0.3056    | 0.1422 | 0.1941 | 0.9435   |
| No log        | 4.0   | 116  | 0.1825          | 0.3841    | 0.2716 | 0.3182 | 0.9439   |
| No log        | 5.0   | 145  | 0.1594          | 0.5517    | 0.3448 | 0.4244 | 0.9554   |
| No log        | 6.0   | 174  | 0.1383          | 0.5408    | 0.5431 | 0.5419 | 0.9595   |
| No log        | 7.0   | 203  | 0.1460          | 0.5897    | 0.4957 | 0.5386 | 0.9624   |
| No log        | 8.0   | 232  | 0.1534          | 0.6105    | 0.5    | 0.5498 | 0.9642   |
| No log        | 9.0   | 261  | 0.1587          | 0.5869    | 0.5388 | 0.5618 | 0.9631   |
| No log        | 10.0  | 290  | 0.1684          | 0.5921    | 0.5819 | 0.5870 | 0.9637   |
| No log        | 11.0  | 319  | 0.1713          | 0.5270    | 0.6724 | 0.5909 | 0.9606   |
| No log        | 12.0  | 348  | 0.1895          | 0.4984    | 0.6897 | 0.5787 | 0.9572   |
| No log        | 13.0  | 377  | 0.1759          | 0.4969    | 0.6897 | 0.5776 | 0.9568   |
| No log        | 14.0  | 406  | 0.1798          | 0.6468    | 0.5603 | 0.6005 | 0.9664   |
| No log        | 15.0  | 435  | 0.1909          | 0.5118    | 0.6552 | 0.5747 | 0.9597   |
| No log        | 16.0  | 464  | 0.1745          | 0.6184    | 0.6078 | 0.6130 | 0.9658   |
| No log        | 17.0  | 493  | 0.1670          | 0.6261    | 0.6207 | 0.6234 | 0.9674   |
| 0.062         | 18.0  | 522  | 0.1719          | 0.5992    | 0.6509 | 0.6240 | 0.9667   |
| 0.062         | 19.0  | 551  | 0.1759          | 0.6224    | 0.6466 | 0.6342 | 0.9674   |
| 0.062         | 20.0  | 580  | 0.1780          | 0.6327    | 0.6164 | 0.6245 | 0.9669   |
| 0.062         | 21.0  | 609  | 0.1777          | 0.5632    | 0.6336 | 0.5963 | 0.9637   |
| 0.062         | 22.0  | 638  | 0.1784          | 0.6137    | 0.6164 | 0.6151 | 0.9665   |
| 0.062         | 23.0  | 667  | 0.1730          | 0.6276    | 0.6466 | 0.6369 | 0.9678   |
| 0.062         | 24.0  | 696  | 0.1822          | 0.6076    | 0.6207 | 0.6141 | 0.9660   |
| 0.062         | 25.0  | 725  | 0.1820          | 0.6306    | 0.6034 | 0.6167 | 0.9678   |
| 0.062         | 26.0  | 754  | 0.1792          | 0.6083    | 0.6293 | 0.6186 | 0.9671   |
| 0.062         | 27.0  | 783  | 0.1810          | 0.6416    | 0.625  | 0.6332 | 0.9691   |
| 0.062         | 28.0  | 812  | 0.1800          | 0.6360    | 0.625  | 0.6304 | 0.9687   |
| 0.062         | 29.0  | 841  | 0.1811          | 0.6025    | 0.6336 | 0.6176 | 0.9660   |
| 0.062         | 30.0  | 870  | 0.1821          | 0.6074    | 0.6336 | 0.6203 | 0.9664   |
| 0.062         | 31.0  | 899  | 0.1825          | 0.6388    | 0.625  | 0.6318 | 0.9685   |
| 0.062         | 32.0  | 928  | 0.1830          | 0.64      | 0.6207 | 0.6302 | 0.9685   |


### Framework versions

- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.11.0
- Tokenizers 0.13.3
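To reproduce this environment, the pinned versions above can be installed as follows (the `+cu118` PyTorch wheel comes from the official CUDA 11.8 index, assumed here):

```shell
pip install transformers==4.28.1 datasets==2.11.0 tokenizers==0.13.3
pip install torch==2.0.0+cu118 --index-url https://download.pytorch.org/whl/cu118
```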