# DISTILBERT_SARA_BIRADS_ECO_MAMO_2
This model is a fine-tuned version of distilbert/distilbert-base-uncased on an unspecified dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the metrics):
- Loss: 0.7336
- Accuracy: 0.9361
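The snippet below is a minimal inference sketch showing how a DistilBERT sequence-classification checkpoint such as this one can be loaded with the transformers pipeline API. The repository ID and the example input are assumptions: substitute the actual Hub ID of this model and a real report text. Label names and their meaning depend on how the fine-tuning labels were encoded.

```python
# Minimal inference sketch (assumed repository ID -- replace with the actual Hub ID).
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="sara-m98/DISTILBERT_SARA_BIRADS_ECO_MAMO_2",  # hypothetical repo ID
)

# The input below is a placeholder; pass the breast-imaging report text to classify.
report = "Well-circumscribed oval hypoechoic nodule in the upper outer quadrant."
print(classifier(report))
# -> e.g. [{'label': 'LABEL_2', 'score': 0.97}]; labels depend on the training label mapping
```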
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 32
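As a point of reference, the hyperparameters above map roughly onto the following Hugging Face `TrainingArguments` object. This is an assumed reconstruction, not the original training script; the output directory and the per-epoch evaluation setting are assumptions (the results table below reports one evaluation per epoch).

```python
# Assumed reconstruction of the training configuration (not the original script).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert_birads_eco_mamo",  # hypothetical output directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=32,
    eval_strategy="epoch",  # assumed: the results table shows one evaluation per epoch
)
# The optimizer entry above corresponds to the Trainer default
# (AdamW with betas=(0.9, 0.999) and epsilon=1e-8), so no explicit optimizer setup is needed.
```

Passed to a `Trainer` together with the tokenized training and evaluation splits, this configuration reproduces the schedule described above.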
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| 0.8441 | 1.0 | 776 | 0.5188 | 0.8516 |
| 0.3666 | 2.0 | 1552 | 0.4621 | 0.8819 |
| 0.411 | 3.0 | 2328 | 0.4225 | 0.8877 |
| 0.2854 | 4.0 | 3104 | 0.4847 | 0.8832 |
| 0.4759 | 5.0 | 3880 | 0.4549 | 0.8948 |
| 0.1157 | 6.0 | 4656 | 0.4045 | 0.9052 |
| 0.0939 | 7.0 | 5432 | 0.4246 | 0.9110 |
| 0.086 | 8.0 | 6208 | 0.4076 | 0.9194 |
| 0.011 | 9.0 | 6984 | 0.4956 | 0.9181 |
| 0.1622 | 10.0 | 7760 | 0.4949 | 0.9155 |
| 0.0621 | 11.0 | 8536 | 0.4278 | 0.9342 |
| 0.015 | 12.0 | 9312 | 0.5024 | 0.9290 |
| 0.0308 | 13.0 | 10088 | 0.4937 | 0.9374 |
| 0.1366 | 14.0 | 10864 | 0.5544 | 0.9335 |
| 0.1376 | 15.0 | 11640 | 0.5920 | 0.9284 |
| 0.0003 | 16.0 | 12416 | 0.6138 | 0.9297 |
| 0.0008 | 17.0 | 13192 | 0.5690 | 0.9394 |
| 0.001 | 18.0 | 13968 | 0.5802 | 0.9342 |
| 0.0002 | 19.0 | 14744 | 0.5803 | 0.9368 |
| 0.0001 | 20.0 | 15520 | 0.6447 | 0.9348 |
| 0.0003 | 21.0 | 16296 | 0.6403 | 0.9374 |
| 0.0076 | 22.0 | 17072 | 0.6709 | 0.9355 |
| 0.0001 | 23.0 | 17848 | 0.6706 | 0.9368 |
| 0.0001 | 24.0 | 18624 | 0.6858 | 0.9361 |
| 0.0 | 25.0 | 19400 | 0.6970 | 0.9348 |
| 0.0 | 26.0 | 20176 | 0.7349 | 0.9316 |
| 0.0 | 27.0 | 20952 | 0.7660 | 0.9329 |
| 0.0 | 28.0 | 21728 | 0.7639 | 0.9342 |
| 0.0 | 29.0 | 22504 | 0.7218 | 0.9355 |
| 0.0001 | 30.0 | 23280 | 0.7302 | 0.9335 |
| 0.0 | 31.0 | 24056 | 0.7344 | 0.9355 |
| 0.0 | 32.0 | 24832 | 0.7336 | 0.9361 |
### Framework versions
- Transformers 4.42.4
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1