eab7dd851004424c8e214ee5d955eeaf
This model is a fine-tuned version of distilbert/distilbert-base-german-cased on the fancyzhx/dbpedia_14 dataset. It achieves the following results on the evaluation set:
- Loss: 0.0639
- Data Size: 1.0 (fraction of the training set used)
- Epoch Runtime: 467.0432 s
- Accuracy: 0.9887
- F1 Macro: 0.9887
- Rouge1: 0.9887
- Rouge2: 0.0
- Rougel: 0.9887
- Rougelsum: 0.9887
Model description
More information needed
Intended uses & limitations
More information needed
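In the meantime, here is a minimal inference sketch, assuming the fine-tuned checkpoint is published under the repo id contemmcm/eab7dd851004424c8e214ee5d955eeaf and used for single-label topic classification over the 14 DBpedia classes (the example text is hypothetical):

```python
from transformers import pipeline

# Minimal sketch; assumes the checkpoint is hosted under this repo id.
classifier = pipeline(
    "text-classification",
    model="contemmcm/eab7dd851004424c8e214ee5d955eeaf",
)

# Hypothetical DBpedia-style abstract (fancyzhx/dbpedia_14 texts are English).
print(classifier(
    "The River Thames is a river that flows through southern England, "
    "including London."
))
```

Note that the base model is German-cased while fancyzhx/dbpedia_14 is an English dataset; inputs should match the language of the fine-tuning data.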
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a hypothetical TrainingArguments sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 32
- total_eval_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 50
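A sketch of a TrainingArguments configuration mirroring these values; the argument names are the standard transformers API, but the actual training script is not part of this card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="eab7dd851004424c8e214ee5d955eeaf",  # hypothetical output path
    learning_rate=5e-5,
    per_device_train_batch_size=8,  # 4 GPUs -> total train batch size 32
    per_device_eval_batch_size=8,   # 4 GPUs -> total eval batch size 32
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="constant",
    num_train_epochs=50,
)
```

Multi-GPU distribution (distributed_type: multi-GPU, num_devices: 4) is handled by the launcher (e.g. accelerate or torchrun) rather than by TrainingArguments itself.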
Training results
| Training Loss | Epoch | Step | Validation Loss | Data Size (fraction) | Epoch Runtime (s) | Accuracy | F1 Macro | Rouge1 | Rouge2 | Rougel | Rougelsum |
|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 2.6507 | 0 | 18.4531 | 0.0714 | 0.0095 | 0.0714 | 0.0 | 0.0715 | 0.0715 |
| 0.4801 | 1 | 17500 | 0.1784 | 0.0078 | 22.1323 | 0.9571 | 0.9573 | 0.9572 | 0.0 | 0.9571 | 0.9571 |
| 0.1012 | 2 | 35000 | 0.1106 | 0.0156 | 25.0959 | 0.9703 | 0.9703 | 0.9703 | 0.0 | 0.9702 | 0.9702 |
| 0.0568 | 3 | 52500 | 0.0923 | 0.0312 | 32.5786 | 0.9776 | 0.9776 | 0.9776 | 0.0 | 0.9776 | 0.9776 |
| 0.0795 | 4 | 70000 | 0.0678 | 0.0625 | 46.8216 | 0.9829 | 0.9829 | 0.9830 | 0.0 | 0.9829 | 0.9829 |
| 0.0605 | 5 | 87500 | 0.0623 | 0.125 | 74.1349 | 0.9848 | 0.9848 | 0.9848 | 0.0 | 0.9848 | 0.9848 |
| 0.0618 | 6 | 105000 | 0.0569 | 0.25 | 132.2679 | 0.9871 | 0.9871 | 0.9872 | 0.0 | 0.9871 | 0.9871 |
| 0.0004 | 7 | 122500 | 0.0484 | 0.5 | 245.5870 | 0.9885 | 0.9885 | 0.9885 | 0.0 | 0.9885 | 0.9885 |
| 0.0311 | 8 | 140000 | 0.0468 | 1.0 | 463.2903 | 0.9895 | 0.9895 | 0.9895 | 0.0 | 0.9895 | 0.9895 |
| 0.0198 | 9 | 157500 | 0.0557 | 1.0 | 462.1015 | 0.9888 | 0.9888 | 0.9888 | 0.0 | 0.9888 | 0.9888 |
| 0.0238 | 10 | 175000 | 0.0602 | 1.0 | 472.2314 | 0.9892 | 0.9892 | 0.9892 | 0.0 | 0.9892 | 0.9892 |
| 0.0215 | 11 | 192500 | 0.0648 | 1.0 | 468.7174 | 0.9894 | 0.9894 | 0.9894 | 0.0 | 0.9894 | 0.9894 |
| 0.0221 | 12 | 210000 | 0.0639 | 1.0 | 467.0432 | 0.9887 | 0.9887 | 0.9887 | 0.0 | 0.9887 | 0.9887 |
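The metric columns are consistent with a compute_metrics hook along the lines of the following sketch (not the card's actual evaluation code; label_names stands for the 14 dbpedia_14 class names). ROUGE is presumably computed over label names as strings, which would explain Rouge2 = 0.0: single-token labels contain no bigrams.

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")
f1 = evaluate.load("f1")
rouge = evaluate.load("rouge")

def compute_metrics(eval_pred, label_names):
    # In a real Trainer setup, bind label_names with functools.partial,
    # since Trainer calls compute_metrics with a single argument.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    pred_names = [label_names[i] for i in preds]
    ref_names = [label_names[i] for i in labels]
    return {
        "accuracy": accuracy.compute(
            predictions=preds, references=labels
        )["accuracy"],
        "f1_macro": f1.compute(
            predictions=preds, references=labels, average="macro"
        )["f1"],
        # Single-word labels yield rouge2 = 0.0 (no bigrams), as in the table.
        **rouge.compute(predictions=pred_names, references=ref_names),
    }
```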
Framework versions
- Transformers 4.57.0
- Pytorch 2.8.0+cu128
- Datasets 4.3.0
- Tokenizers 0.22.1