# 97c207f35ab05c746821da1a5d50d549
This model is a fine-tuned version of [distilbert/distilroberta-base](https://huggingface.co/distilbert/distilroberta-base) on the sst2 config of the nyu-mll/glue dataset. It achieves the following results on the evaluation set:
- Loss: 0.2787
- Data Size: 1.0
- Epoch Runtime: 83.7100
- Accuracy: 0.9132
- F1 Macro: 0.9131
- Rouge1: 0.9120
- Rouge2: 0.0
- Rougel: 0.9132
- Rougelsum: 0.9132
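
The fine-tuned checkpoint can be loaded with the standard `transformers` pipeline API. A minimal inference sketch, assuming the checkpoint is hosted on the Hub as `contemmcm/97c207f35ab05c746821da1a5d50d549` with a standard sequence-classification head:

```python
from transformers import pipeline

# Repo id taken from this card; adjust if the checkpoint lives elsewhere.
classifier = pipeline(
    "text-classification",
    model="contemmcm/97c207f35ab05c746821da1a5d50d549",
)

# SST-2 is binary sentiment classification (positive/negative).
print(classifier("a gripping, beautifully shot film"))
# e.g. [{'label': 'LABEL_1', 'score': 0.99}] -- label names depend on the model config
```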
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a matching `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 32
- total_eval_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 50
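
These values map directly onto `transformers.TrainingArguments`. A minimal sketch reconstructing them (the original training script is not part of this card, so the output path and the surrounding Trainer setup are assumptions); note that a per-device batch size of 8 across 4 GPUs gives the total train batch size of 32 listed above:

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
# With num_devices=4, per_device_train_batch_size=8 yields the
# reported total train batch size of 8 * 4 = 32.
training_args = TrainingArguments(
    output_dir="distilroberta-base-sst2",  # hypothetical output path
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",           # AdamW defaults: betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="constant",
    num_train_epochs=50,
)
```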
### Training results
| Training Loss | Epoch | Step | Validation Loss | Data Size | Epoch Runtime | Accuracy | F1 Macro | Rouge1 | Rouge2 | Rougel | Rougelsum |
|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 0.7070 | 0 | 0.9530 | 0.4907 | 0.3292 | 0.4907 | 0.0 | 0.4907 | 0.4919 |
| No log | 1 | 2104 | 0.5541 | 0.0078 | 1.7837 | 0.7141 | 0.6994 | 0.7153 | 0.0 | 0.7141 | 0.7141 |
| No log | 2 | 4208 | 0.7395 | 0.0156 | 2.3333 | 0.7060 | 0.6767 | 0.7060 | 0.0 | 0.7060 | 0.7072 |
| 0.0103 | 3 | 6312 | 0.3356 | 0.0312 | 3.6882 | 0.8843 | 0.8841 | 0.8843 | 0.0 | 0.8843 | 0.8843 |
| 0.3309 | 4 | 8416 | 0.2778 | 0.0625 | 6.2207 | 0.8831 | 0.8831 | 0.8831 | 0.0 | 0.8831 | 0.8831 |
| 0.2941 | 5 | 10520 | 0.2931 | 0.125 | 11.3041 | 0.8866 | 0.8863 | 0.8860 | 0.0 | 0.8866 | 0.8866 |
| 0.2242 | 6 | 12624 | 0.2647 | 0.25 | 21.4089 | 0.9005 | 0.9005 | 0.9005 | 0.0 | 0.9016 | 0.9005 |
| 0.2194 | 7 | 14728 | 0.2676 | 0.5 | 41.5693 | 0.9155 | 0.9155 | 0.9155 | 0.0 | 0.9155 | 0.9155 |
| 0.1613 | 8 | 16832 | 0.2746 | 1.0 | 81.4052 | 0.9086 | 0.9086 | 0.9086 | 0.0 | 0.9086 | 0.9086 |
| 0.1383 | 9 | 18936 | 0.3334 | 1.0 | 81.9483 | 0.9074 | 0.9072 | 0.9074 | 0.0 | 0.9074 | 0.9074 |
| 0.1272 | 10 | 21040 | 0.2787 | 1.0 | 83.7100 | 0.9132 | 0.9131 | 0.9120 | 0.0 | 0.9132 | 0.9132 |
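
The Data Size column roughly doubles each epoch, from about 1/128 of the training set at epoch 1 up to the full set from epoch 8 onward (the epoch 0 row is the untrained baseline evaluation). The exact sampling code is not published with this card; the sketch below is a doubling schedule, offered as an assumption, that reproduces the column:

```python
def data_fraction(epoch: int, start: float = 1 / 128) -> float:
    """Fraction of the training set used at a given epoch.

    Assumption: the card's Data Size column follows a doubling
    schedule starting near 1/128 and capping at the full dataset.
    """
    if epoch == 0:
        return 0.0  # epoch 0 row is the untrained baseline
    return min(1.0, start * 2 ** (epoch - 1))

# Reproduces the column: 0.0, 0.0078, 0.0156, 0.0312, ..., 0.5, 1.0, 1.0, 1.0
print([round(data_fraction(e), 4) for e in range(11)])
```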
### Framework versions
- Transformers 4.57.0
- Pytorch 2.8.0+cu128
- Datasets 4.3.0
- Tokenizers 0.22.1