---
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: modelBeto
  results: []
---
|
|
|
|
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
|
|
|
|
|
# modelBeto

This model is a fine-tuned version of [dccuchile/bert-base-spanish-wwm-cased](https://huggingface.co/dccuchile/bert-base-spanish-wwm-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1719
- Precision: 0.5388
- Recall: 0.5781
- F1: 0.5578
- Accuracy: 0.9685
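
The seqeval-style metric set above (precision/recall/F1 alongside accuracy) suggests a token-classification head, though the exact task and label set are not documented in this card. A minimal usage sketch under that assumption; the repo id `your-username/modelBeto` is a placeholder, not the checkpoint's confirmed location:

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

# Placeholder repo id -- replace with wherever this checkpoint is actually published.
model_id = "your-username/modelBeto"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# aggregation_strategy="simple" merges word-piece predictions back into whole words.
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

print(ner("La señora García vive en Santiago de Chile."))
```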
|
|
|
|
|
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure
|
|
|
|
|
### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 32
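
As a rough guide, these settings map onto `transformers` `TrainingArguments` as sketched below; `output_dir` and the per-epoch evaluation setting are assumptions, not values recorded in this card:

```python
from transformers import TrainingArguments

# Sketch only: values mirror the hyperparameter list above.
args = TrainingArguments(
    output_dir="modelBeto",            # placeholder output path
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=32,
    evaluation_strategy="epoch",       # assumption: the results table reports per-epoch eval
)
```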
|
|
|
|
|
### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 1.0   | 29   | 0.2382          | 0.0       | 0.0    | 0.0    | 0.9473   |
| No log        | 2.0   | 58   | 0.2253          | 0.0       | 0.0    | 0.0    | 0.9473   |
| No log        | 3.0   | 87   | 0.1591          | 0.3922    | 0.1042 | 0.1646 | 0.9512   |
| No log        | 4.0   | 116  | 0.1398          | 0.3529    | 0.2188 | 0.2701 | 0.9590   |
| No log        | 5.0   | 145  | 0.1157          | 0.4468    | 0.3281 | 0.3784 | 0.9571   |
| No log        | 6.0   | 174  | 0.1181          | 0.5407    | 0.3802 | 0.4465 | 0.9604   |
| No log        | 7.0   | 203  | 0.1144          | 0.4384    | 0.5    | 0.4672 | 0.9597   |
| No log        | 8.0   | 232  | 0.1350          | 0.5887    | 0.4323 | 0.4985 | 0.9682   |
| No log        | 9.0   | 261  | 0.1193          | 0.5117    | 0.5677 | 0.5383 | 0.9649   |
| No log        | 10.0  | 290  | 0.1365          | 0.5962    | 0.4844 | 0.5345 | 0.9708   |
| No log        | 11.0  | 319  | 0.1352          | 0.5       | 0.5781 | 0.5362 | 0.9652   |
| No log        | 12.0  | 348  | 0.1534          | 0.5593    | 0.5156 | 0.5366 | 0.9692   |
| No log        | 13.0  | 377  | 0.1475          | 0.5838    | 0.5260 | 0.5534 | 0.9699   |
| No log        | 14.0  | 406  | 0.1395          | 0.5144    | 0.6510 | 0.5747 | 0.9670   |
| No log        | 15.0  | 435  | 0.1487          | 0.5550    | 0.6042 | 0.5786 | 0.9696   |
| No log        | 16.0  | 464  | 0.1576          | 0.5637    | 0.5990 | 0.5808 | 0.9697   |
| No log        | 17.0  | 493  | 0.1557          | 0.5699    | 0.5521 | 0.5608 | 0.9697   |
| 0.0779        | 18.0  | 522  | 0.1581          | 0.5062    | 0.6354 | 0.5635 | 0.9665   |
| 0.0779        | 19.0  | 551  | 0.1545          | 0.5312    | 0.6198 | 0.5721 | 0.9671   |
| 0.0779        | 20.0  | 580  | 0.1580          | 0.5870    | 0.5625 | 0.5745 | 0.9711   |
| 0.0779        | 21.0  | 609  | 0.1615          | 0.5498    | 0.6042 | 0.5757 | 0.9692   |
| 0.0779        | 22.0  | 638  | 0.1607          | 0.5289    | 0.6198 | 0.5707 | 0.9678   |
| 0.0779        | 23.0  | 667  | 0.1648          | 0.5619    | 0.5677 | 0.5648 | 0.9687   |
| 0.0779        | 24.0  | 696  | 0.1686          | 0.5459    | 0.5885 | 0.5664 | 0.9677   |
| 0.0779        | 25.0  | 725  | 0.1659          | 0.5463    | 0.5833 | 0.5642 | 0.9680   |
| 0.0779        | 26.0  | 754  | 0.1668          | 0.5567    | 0.5885 | 0.5722 | 0.9694   |
| 0.0779        | 27.0  | 783  | 0.1681          | 0.5392    | 0.6094 | 0.5721 | 0.9684   |
| 0.0779        | 28.0  | 812  | 0.1693          | 0.5534    | 0.5938 | 0.5729 | 0.9690   |
| 0.0779        | 29.0  | 841  | 0.1723          | 0.5441    | 0.5781 | 0.5606 | 0.9684   |
| 0.0779        | 30.0  | 870  | 0.1710          | 0.5308    | 0.5833 | 0.5558 | 0.9680   |
| 0.0779        | 31.0  | 899  | 0.1718          | 0.5388    | 0.5781 | 0.5578 | 0.9684   |
| 0.0779        | 32.0  | 928  | 0.1719          | 0.5388    | 0.5781 | 0.5578 | 0.9685   |
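
The per-epoch precision/recall/F1/accuracy columns above follow the usual pattern of a token-classification `Trainer` run with a seqeval-based `compute_metrics`; a sketch under that assumption, where `label_names` is hypothetical since the real label set is not documented:

```python
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
label_names = ["O", "B-ENT", "I-ENT"]  # hypothetical: the card does not list the labels

def compute_metrics(eval_preds):
    logits, labels = eval_preds
    predictions = np.argmax(logits, axis=-1)
    # Drop special/padded positions, which carry the ignore index -100.
    true_labels = [[label_names[l] for l in row if l != -100] for row in labels]
    true_preds = [
        [label_names[p] for p, l in zip(p_row, l_row) if l != -100]
        for p_row, l_row in zip(predictions, labels)
    ]
    results = seqeval.compute(predictions=true_preds, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```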
|
|
|
|
|
|
|
|
### Framework versions

- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.11.0
- Tokenizers 0.13.3
|
|
|