# 28a8e53904780944e2889c99f7ae6b85
This model is a fine-tuned version of distilbert/distilgpt2 on the contemmcm/hate-speech-and-offensive-language dataset. It achieves the following results on the evaluation set:
- Loss: 0.5368
- Data Size: 1.0 (fraction of the training set used)
- Epoch Runtime: 27.0331
- Accuracy: 0.8908
- F1 Macro: 0.7376
## Model description
More information needed
## Intended uses & limitations
More information needed
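A minimal usage sketch, assuming the checkpoint exposes a sequence-classification head (the reported accuracy and macro F1 indicate a classification fine-tune rather than generation; the label names come from the checkpoint's config and are not documented here):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint; this assumes a sequence-classification
# head was attached to distilgpt2 during fine-tuning.
classifier = pipeline(
    "text-classification",
    model="contemmcm/28a8e53904780944e2889c99f7ae6b85",
)

# Returns a list of {"label": ..., "score": ...} dicts. The underlying
# dataset distinguishes hate speech, offensive language, and neither,
# but the exact label mapping is an assumption, not verified here.
print(classifier("example input text"))
```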
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch mapping them onto `TrainingArguments` follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 32
- total_eval_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 50
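A hedged sketch of how the settings above map onto `transformers.TrainingArguments`; the actual training script is not published, so this only mirrors the listed values (the `output_dir` is a placeholder):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above. With 4 GPUs, the per-device
# batch size of 8 yields the effective batch size of 8 x 4 = 32 reported
# as total_train_batch_size / total_eval_batch_size.
training_args = TrainingArguments(
    output_dir="out",                 # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="constant",
    num_train_epochs=50,
)
```

Although `num_epochs` is 50, the results table below stops at epoch 12, which suggests training was halted early once validation loss stopped improving (an inference from the table, not a documented setting).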
### Training results
| Training Loss | Epoch | Step | Validation Loss | Data Size | Epoch Runtime | Accuracy | F1 Macro |
|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 6.1676 | 0 | 3.2324 | 0.0633 | 0.0408 |
| No log | 1 | 619 | 1.7804 | 0.0078 | 3.5676 | 0.3488 | 0.2527 |
| No log | 2 | 1238 | 0.7792 | 0.0156 | 3.6281 | 0.7240 | 0.3474 |
| 0.0323 | 3 | 1857 | 0.5612 | 0.0312 | 4.1120 | 0.7991 | 0.4307 |
| 0.0323 | 4 | 2476 | 0.3725 | 0.0625 | 4.8251 | 0.8762 | 0.5798 |
| 0.343 | 5 | 3095 | 0.2935 | 0.125 | 6.5005 | 0.9012 | 0.7077 |
| 0.0258 | 6 | 3714 | 0.2748 | 0.25 | 9.3493 | 0.9083 | 0.7111 |
| 0.2595 | 7 | 4333 | 0.2718 | 0.5 | 15.6207 | 0.9014 | 0.7517 |
| 0.2256 | 8 | 4952 | 0.2458 | 1.0 | 28.5792 | 0.9152 | 0.7335 |
| 0.1807 | 9 | 5571 | 0.2761 | 1.0 | 29.0246 | 0.9044 | 0.7471 |
| 0.1603 | 10 | 6190 | 0.3406 | 1.0 | 27.0634 | 0.9060 | 0.7552 |
| 0.1266 | 11 | 6809 | 0.3590 | 1.0 | 27.1792 | 0.9014 | 0.7412 |
| 0.067 | 12 | 7428 | 0.5368 | 1.0 | 27.0331 | 0.8908 | 0.7376 |
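The Data Size column appears to follow a doubling curriculum: epoch 1 uses roughly 1/128 of the training data, and the fraction doubles each epoch until the full set is reached at epoch 8. A small sketch of that apparent schedule, reverse-engineered from the table rather than taken from a published script:

```python
# Apparent data-size curriculum, inferred from the table above:
# epoch 1 uses ~1/128 of the data, doubling each epoch, capped at 1.0.
def data_fraction(epoch: int) -> float:
    if epoch == 0:
        return 0.0  # epoch 0 row is the untrained baseline evaluation
    return min(1.0, 2.0 ** (epoch - 8))

for epoch in range(13):
    print(epoch, data_fraction(epoch))
# epoch 1 -> 0.0078125 (table: 0.0078), epoch 4 -> 0.0625,
# epoch 8 and later -> 1.0, matching the Data Size column.
```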
### Framework versions
- Transformers 4.57.0
- Pytorch 2.8.0+cu128
- Datasets 4.3.0
- Tokenizers 0.22.1