# zealous-rook-163
This model is a fine-tuned version of google-bert/bert-base-cased on an unspecified dataset. It achieves the following results on the evaluation set (each "Optimised" value is the best the metric reaches when the decision threshold is tuned, and "Threshold" is that tuned value; an inference sketch follows the list):
- Loss: 0.2277
- Hamming Loss: 0.066
- Zero One Loss: 0.3912
- Jaccard Score: 0.3207
- Hamming Loss Optimised: 0.0617
- Hamming Loss Threshold: 0.7833
- Zero One Loss Optimised: 0.3812
- Zero One Loss Threshold: 0.7112
- Jaccard Score Optimised: 0.3125
- Jaccard Score Threshold: 0.3672
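A minimal inference sketch showing how one of the optimised thresholds would be applied, assuming the checkpoint is published under the repo id `ElMad/zealous-rook-163` and was trained as a multi-label classifier (`problem_type="multi_label_classification"`, consistent with the Hamming/Jaccard metrics above); the Hamming-loss-optimised threshold is used:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumption: the fine-tuned checkpoint is published under this repo id.
REPO_ID = "ElMad/zealous-rook-163"
THRESHOLD = 0.7833  # Hamming-loss-optimised threshold from the results above

tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
model = AutoModelForSequenceClassification.from_pretrained(REPO_ID)
model.eval()

inputs = tokenizer("Example text to classify", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Multi-label heads are scored label-by-label with a sigmoid, not a softmax.
probs = torch.sigmoid(logits).squeeze(0)
predicted_ids = (probs >= THRESHOLD).nonzero(as_tuple=True)[0].tolist()
print([model.config.id2label[i] for i in predicted_ids])
```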
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 2.5173536513892423e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 2024
- optimizer: AdamW (torch implementation, `OptimizerNames.ADAMW_TORCH`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 9
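These values map one-to-one onto `transformers.TrainingArguments`. A hedged configuration sketch, not the author's actual script; `NUM_LABELS`, `train_ds`, and `eval_ds` are placeholders for the unspecified dataset:

```python
from transformers import AutoModelForSequenceClassification, Trainer, TrainingArguments

NUM_LABELS = 10  # placeholder: set to the label count of the (unspecified) dataset

model = AutoModelForSequenceClassification.from_pretrained(
    "google-bert/bert-base-cased",
    num_labels=NUM_LABELS,
    problem_type="multi_label_classification",  # assumption, consistent with the metrics above
)

args = TrainingArguments(
    output_dir="zealous-rook-163",
    learning_rate=2.5173536513892423e-05,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=2024,
    optim="adamw_torch",    # OptimizerNames.ADAMW_TORCH
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=9,
    eval_strategy="epoch",  # matches the per-epoch rows in the results table
)

# train_ds / eval_ds: tokenized datasets with float multi-hot "labels" (placeholders)
trainer = Trainer(model=model, args=args, train_dataset=train_ds, eval_dataset=eval_ds)
trainer.train()
```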
### Training results

For each metric, the "Optimised" column reports the best value reachable on the evaluation set when the decision threshold is tuned, and the "Threshold" column gives that tuned value (a search sketch follows the table).
| Training Loss | Epoch | Step | Validation Loss | Hamming Loss | Zero One Loss | Jaccard Score | Hamming Loss Optimised | Hamming Loss Threshold | Zero One Loss Optimised | Zero One Loss Threshold | Jaccard Score Optimised | Jaccard Score Threshold |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.281 | 1.0 | 800 | 0.1811 | 0.063 | 0.5212 | 0.4829 | 0.0628 | 0.5238 | 0.4525 | 0.3337 | 0.3415 | 0.2658 |
| 0.1571 | 2.0 | 1600 | 0.1641 | 0.0576 | 0.3862 | 0.3378 | 0.0574 | 0.6270 | 0.3725 | 0.4496 | 0.3089 | 0.3271 |
| 0.1218 | 3.0 | 2400 | 0.1656 | 0.0597 | 0.3888 | 0.3386 | 0.0559 | 0.6787 | 0.38 | 0.4645 | 0.3043 | 0.2319 |
| 0.0957 | 4.0 | 3200 | 0.1831 | 0.0629 | 0.39 | 0.3399 | 0.0594 | 0.7655 | 0.3838 | 0.4056 | 0.3134 | 0.2468 |
| 0.0721 | 5.0 | 4000 | 0.1948 | 0.0648 | 0.3775 | 0.3233 | 0.06 | 0.7962 | 0.3762 | 0.4691 | 0.3081 | 0.2256 |
| 0.0525 | 6.0 | 4800 | 0.2050 | 0.0644 | 0.3738 | 0.3154 | 0.06 | 0.7723 | 0.3688 | 0.6390 | 0.3063 | 0.3355 |
| 0.0412 | 7.0 | 5600 | 0.2171 | 0.0638 | 0.375 | 0.3241 | 0.0602 | 0.7833 | 0.3738 | 0.4919 | 0.3169 | 0.2889 |
| 0.0322 | 8.0 | 6400 | 0.2257 | 0.0655 | 0.385 | 0.3174 | 0.0609 | 0.8619 | 0.3788 | 0.5721 | 0.3186 | 0.4056 |
| 0.026 | 9.0 | 7200 | 0.2277 | 0.066 | 0.3912 | 0.3207 | 0.0617 | 0.7833 | 0.3812 | 0.7112 | 0.3125 | 0.3672 |
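The "Optimised" and "Threshold" columns are consistent with a per-metric sweep over candidate decision thresholds on the evaluation-set probabilities. A sketch of such a search using scikit-learn's metric implementations; synthetic `probs`/`y_true` stand in for the real evaluation outputs, and the `average="samples"` choice for the Jaccard score is an assumption:

```python
import numpy as np
from sklearn.metrics import hamming_loss, jaccard_score, zero_one_loss

def best_threshold(y_true, probs, metric, higher_is_better):
    """Sweep thresholds over [0.01, 0.99]; return (best score, threshold achieving it)."""
    best_score, best_t = None, None
    for t in np.linspace(0.01, 0.99, 99):
        y_pred = (probs >= t).astype(int)
        score = metric(y_true, y_pred)
        if (best_score is None
                or (higher_is_better and score > best_score)
                or (not higher_is_better and score < best_score)):
            best_score, best_t = score, t
    return best_score, best_t

# Synthetic stand-ins for the eval-set sigmoid probabilities and binary label matrix.
rng = np.random.default_rng(2024)
probs = rng.random((200, 10))
y_true = (rng.random((200, 10)) < 0.1).astype(int)

hl, hl_t = best_threshold(y_true, probs, hamming_loss, higher_is_better=False)
zo, zo_t = best_threshold(y_true, probs, zero_one_loss, higher_is_better=False)
js, js_t = best_threshold(
    y_true, probs,
    lambda yt, yp: jaccard_score(yt, yp, average="samples", zero_division=0),
    higher_is_better=True,
)
print(f"Hamming: {hl:.4f} @ {hl_t:.2f} | Zero-one: {zo:.4f} @ {zo_t:.2f} | Jaccard: {js:.4f} @ {js_t:.2f}")
```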
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.21.0