---
license: apache-2.0
base_model: bert-base-multilingual-uncased
tags:
- generated_from_trainer
metrics:
- recall
- accuracy
model-index:
- name: multibert_dataaugmentation
  results: []
---
# multibert_dataaugmentation

This model is a fine-tuned version of [bert-base-multilingual-uncased](https://huggingface.co/bert-base-multilingual-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6354
- Precision: 0.8725
- Recall: 0.8025
- F-measure: 0.8295
- Accuracy: 0.8996
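The precision/recall/F-measure/accuracy metrics suggest a token classification (e.g. NER) fine-tune, although the task is not stated in this card. A minimal loading sketch under that assumption; the repo id `username/multibert_dataaugmentation` is a placeholder:

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

# Placeholder repo id -- substitute the actual checkpoint location.
model_id = "username/multibert_dataaugmentation"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# Group sub-word pieces back into word-level predictions.
tagger = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)
print(tagger("Example sentence to tag."))
```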
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):
- learning_rate: 7.5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 14
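A sketch of how the hyperparameters above map onto `TrainingArguments` (the Adam betas and epsilon are already the `Trainer` defaults); the model, datasets, and metric function are placeholders, since the training data is not described in this card:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; everything else keeps Trainer defaults.
training_args = TrainingArguments(
    output_dir="multibert_dataaugmentation",
    learning_rate=7.5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=14,
    evaluation_strategy="epoch",  # assumed: the results table reports one validation row per epoch
)

# trainer = Trainer(model=model, args=training_args, train_dataset=train_ds,
#                   eval_dataset=eval_ds, compute_metrics=compute_metrics)
# trainer.train()
```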
### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F-measure | Accuracy |
|---|---|---|---|---|---|---|---|
| 0.5999 | 1.0 | 284 | 0.4542 | 0.8507 | 0.7167 | 0.7422 | 0.8708 |
| 0.2808 | 2.0 | 568 | 0.3886 | 0.8552 | 0.7964 | 0.8192 | 0.8864 |
| 0.1596 | 3.0 | 852 | 0.4749 | 0.8786 | 0.7893 | 0.8208 | 0.8905 |
| 0.1035 | 4.0 | 1136 | 0.5060 | 0.8568 | 0.8034 | 0.8208 | 0.8954 |
| 0.0719 | 5.0 | 1420 | 0.5716 | 0.8498 | 0.8102 | 0.8255 | 0.8919 |
| 0.0456 | 6.0 | 1704 | 0.6255 | 0.8701 | 0.8060 | 0.8279 | 0.8960 |
| 0.0284 | 7.0 | 1988 | 0.6354 | 0.8725 | 0.8025 | 0.8295 | 0.8996 |
| 0.0205 | 8.0 | 2272 | 0.7146 | 0.8518 | 0.8105 | 0.8266 | 0.8988 |
| 0.011 | 9.0 | 2556 | 0.7307 | 0.8614 | 0.8045 | 0.8279 | 0.9004 |
| 0.0082 | 10.0 | 2840 | 0.7403 | 0.8785 | 0.7988 | 0.8255 | 0.9009 |
| 0.0064 | 11.0 | 3124 | 0.7756 | 0.8809 | 0.7913 | 0.8217 | 0.8989 |
| 0.0039 | 12.0 | 3408 | 0.8036 | 0.8650 | 0.7877 | 0.8130 | 0.8966 |
| 0.0038 | 13.0 | 3692 | 0.7660 | 0.8781 | 0.7950 | 0.8222 | 0.8997 |
| 0.0025 | 14.0 | 3976 | 0.7640 | 0.8829 | 0.7961 | 0.8249 | 0.9012 |
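The metric names in this card ("Precision", "F-measure") are not standard `seqeval` keys, so the exact evaluation code is unknown; below is a common `compute_metrics` pattern for BIO-tagged token classification that would produce this set of columns, offered only as a sketch with a hypothetical label set:

```python
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
label_list = ["O", "B-ENT", "I-ENT"]  # hypothetical label set -- replace with the real tags

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)

    # Drop positions labelled -100 (padding / special tokens) before scoring.
    true_labels = [
        [label_list[l] for l in row if l != -100]
        for row in labels
    ]
    true_preds = [
        [label_list[p] for p, l in zip(pred_row, label_row) if l != -100]
        for pred_row, label_row in zip(predictions, labels)
    ]

    results = seqeval.compute(predictions=true_preds, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f-measure": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```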
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1