# 96529d1973d145b0ce126c03233b50f6

This model is a fine-tuned version of distilbert/distilroberta-base on the nyu-mll/glue dataset (mnli config). It achieves the following results on the evaluation set (an inference sketch follows the metrics):
- Loss: 0.6210
- Data Size: 1.0
- Epoch Runtime: 490.5435
- Accuracy: 0.8114
- F1 Macro: 0.8107
- Rouge1: 0.8115
- Rouge2: 0.0
- Rougel: 0.8113
- Rougelsum: 0.8115
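
The snippet below is a minimal inference sketch, assuming the checkpoint is published under the repo id shown in this card and that its `id2label` mapping carries MNLI's three classes (entailment, neutral, contradiction); check the model's `config.json` before relying on the label names.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Repo id taken from this card; adjust if the model lives elsewhere.
model_id = "contemmcm/96529d1973d145b0ce126c03233b50f6"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

# MNLI models take a (premise, hypothesis) pair as a single input sequence.
inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

pred_id = logits.argmax(dim=-1).item()
# The exact label mapping is an assumption here; it is read from the model config.
print(model.config.id2label[pred_id])
```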
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hypothetical reconstruction of the setup follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 32
- total_eval_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 50
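
The following is a hedged reconstruction of a training run with these settings using the Hugging Face `Trainer`. The actual script and preprocessing were not published with this card, so the evaluation split, `output_dir`, and tokenization details are assumptions.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base_model = "distilbert/distilroberta-base"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=3)

# MNLI pairs a premise with a hypothesis; both columns exist in nyu-mll/glue "mnli".
raw = load_dataset("nyu-mll/glue", "mnli")

def tokenize(batch):
    return tokenizer(batch["premise"], batch["hypothesis"], truncation=True)

tokenized = raw.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="mnli-distilroberta",  # assumed; not stated in the card
    learning_rate=5e-5,
    per_device_train_batch_size=8,    # 4 devices -> total train batch size 32
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="constant",
    num_train_epochs=50,
    optim="adamw_torch",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation_matched"],  # assumed evaluation split
    processing_class=tokenizer,
)
trainer.train()
```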
### Training results
| Training Loss | Epoch | Step | Validation Loss | Data Size | Epoch Runtime | Accuracy | F1 Macro | Rouge1 | Rouge2 | Rougel | Rougelsum |
|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.1023 | 0 | 4.6905 | 0.3545 | 0.1747 | 0.3544 | 0.0 | 0.3545 | 0.3543 |
| 1.0933 | 1 | 12271 | 0.9008 | 0.0078 | 8.6237 | 0.6021 | 0.5969 | 0.6019 | 0.0 | 0.6023 | 0.6023 |
| 0.8248 | 2 | 24542 | 0.7477 | 0.0156 | 12.2017 | 0.6857 | 0.6797 | 0.6858 | 0.0 | 0.6860 | 0.6857 |
| 0.7031 | 3 | 36813 | 0.6315 | 0.0312 | 19.8785 | 0.7371 | 0.7345 | 0.7369 | 0.0 | 0.7373 | 0.7372 |
| 0.649 | 4 | 49084 | 0.5877 | 0.0625 | 34.7162 | 0.7629 | 0.7623 | 0.7625 | 0.0 | 0.7628 | 0.7627 |
| 0.5638 | 5 | 61355 | 0.5520 | 0.125 | 65.1321 | 0.7786 | 0.7767 | 0.7784 | 0.0 | 0.7789 | 0.7786 |
| 0.5506 | 6 | 73626 | 0.6088 | 0.25 | 125.5367 | 0.7786 | 0.7792 | 0.7786 | 0.0 | 0.7786 | 0.7786 |
| 0.4764 | 7 | 85897 | 0.5373 | 0.5 | 246.3156 | 0.7978 | 0.7971 | 0.7978 | 0.0 | 0.7981 | 0.7980 |
| 0.446 | 8.0 | 98168 | 0.5068 | 1.0 | 490.8175 | 0.8110 | 0.8100 | 0.8109 | 0.0 | 0.8110 | 0.8110 |
| 0.3883 | 9.0 | 110439 | 0.5302 | 1.0 | 486.9774 | 0.8090 | 0.8086 | 0.8091 | 0.0 | 0.8089 | 0.8089 |
| 0.3427 | 10.0 | 122710 | 0.5458 | 1.0 | 497.9757 | 0.8088 | 0.8077 | 0.8087 | 0.0 | 0.8088 | 0.8090 |
| 0.3202 | 11.0 | 134981 | 0.5834 | 1.0 | 487.6945 | 0.8106 | 0.8105 | 0.8104 | 0.0 | 0.8105 | 0.8108 |
| 0.254 | 12.0 | 147252 | 0.6210 | 1.0 | 490.5435 | 0.8114 | 0.8107 | 0.8115 | 0.0 | 0.8113 | 0.8115 |
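
The Accuracy and F1 Macro columns can be reproduced with a `compute_metrics` callback along these lines; this is a sketch of one plausible implementation, not the exact function used for this run, and the ROUGE columns are omitted.

```python
import numpy as np
import evaluate

# Standard classification metrics from the evaluate library.
accuracy_metric = evaluate.load("accuracy")
f1_metric = evaluate.load("f1")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_metric.compute(predictions=preds, references=labels)["accuracy"],
        "f1_macro": f1_metric.compute(predictions=preds, references=labels, average="macro")["f1"],
    }
```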
### Framework versions
- Transformers 4.57.0
- Pytorch 2.8.0+cu128
- Datasets 4.3.0
- Tokenizers 0.22.1