# roberta-base-Frank-Lampard

This model is a fine-tuned version of [FacebookAI/roberta-large](https://huggingface.co/FacebookAI/roberta-large) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.1977
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (PyTorch implementation, `adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 2
- mixed_precision_training: Native AMP
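The hyperparameters above can be collected as keyword arguments for `transformers.TrainingArguments` (a sketch of the presumed setup; the field names follow the Transformers API, and the output directory is a placeholder, since the card does not state one):

```python
# Hyperparameters from this card, mapped to transformers.TrainingArguments fields.
training_kwargs = {
    "learning_rate": 2e-4,
    "per_device_train_batch_size": 32,
    "per_device_eval_batch_size": 32,
    "seed": 42,
    "optim": "adamw_torch",     # OptimizerNames.ADAMW_TORCH
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 2,
    "fp16": True,               # "Native AMP" mixed-precision training
}

# Usage (requires the transformers package; not executed here):
# from transformers import TrainingArguments
# args = TrainingArguments(output_dir="out", **training_kwargs)
```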
### Training results

| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 1.4754 | 0.0394 | 10 | 2.2326 |
| 1.3717 | 0.0787 | 20 | 1.2141 |
| 1.1650 | 0.1181 | 30 | 1.2507 |
| 1.2429 | 0.1575 | 40 | 1.2046 |
| 1.2038 | 0.1969 | 50 | 1.2035 |
| 1.2053 | 0.2362 | 60 | 1.2013 |
| 1.1951 | 0.2756 | 70 | 1.1985 |
| 1.1226 | 0.3150 | 80 | 1.2143 |
| 1.2395 | 0.3543 | 90 | 1.1932 |
| 1.2700 | 0.3937 | 100 | 1.2547 |
| 1.1992 | 0.4331 | 110 | 1.2369 |
| 1.3318 | 0.4724 | 120 | 1.2183 |
| 1.2277 | 0.5118 | 130 | 1.2115 |
| 1.1872 | 0.5512 | 140 | 1.1910 |
| 1.1649 | 0.5906 | 150 | 1.2177 |
| 1.2312 | 0.6299 | 160 | 1.2130 |
| 1.1901 | 0.6693 | 170 | 1.2004 |
| 1.1354 | 0.7087 | 180 | 1.2158 |
| 1.3210 | 0.7480 | 190 | 1.2036 |
| 1.1644 | 0.7874 | 200 | 1.2144 |
| 1.2748 | 0.8268 | 210 | 1.2105 |
| 1.2324 | 0.8661 | 220 | 1.2071 |
| 1.1694 | 0.9055 | 230 | 1.2149 |
| 1.1755 | 0.9449 | 240 | 1.2259 |
| 1.2640 | 0.9843 | 250 | 1.1894 |
| 1.2252 | 1.0236 | 260 | 1.2221 |
| 1.1791 | 1.0630 | 270 | 1.2122 |
| 1.2084 | 1.1024 | 280 | 1.1915 |
| 1.2449 | 1.1417 | 290 | 1.2095 |
| 1.2102 | 1.1811 | 300 | 1.1933 |
| 1.1875 | 1.2205 | 310 | 1.1996 |
| 1.1796 | 1.2598 | 320 | 1.1974 |
| 1.1918 | 1.2992 | 330 | 1.1925 |
| 1.1615 | 1.3386 | 340 | 1.1971 |
| 1.2005 | 1.3780 | 350 | 1.2031 |
| 1.1766 | 1.4173 | 360 | 1.1941 |
| 1.1854 | 1.4567 | 370 | 1.1920 |
| 1.1796 | 1.4961 | 380 | 1.1915 |
| 1.1327 | 1.5354 | 390 | 1.2100 |
| 1.1781 | 1.5748 | 400 | 1.2250 |
| 1.1905 | 1.6142 | 410 | 1.2067 |
| 1.1437 | 1.6535 | 420 | 1.1986 |
| 1.2034 | 1.6929 | 430 | 1.1942 |
| 1.1986 | 1.7323 | 440 | 1.1933 |
| 1.1330 | 1.7717 | 450 | 1.1971 |
| 1.2857 | 1.8110 | 460 | 1.1951 |
| 1.2301 | 1.8504 | 470 | 1.1929 |
| 1.1872 | 1.8898 | 480 | 1.1941 |
| 1.2122 | 1.9291 | 490 | 1.1958 |
| 1.1711 | 1.9685 | 500 | 1.1973 |
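A back-of-envelope check on the table (assuming the Epoch column is step divided by steps per epoch): at step 10 the logged epoch is 0.0394, so one epoch is roughly 10 / 0.0394 ≈ 254 steps, and with a train batch size of 32 that suggests a training set of roughly 8,000 examples:

```python
# Estimate the training-set size from the first logged row of the results table.
steps, epoch_fraction, batch_size = 10, 0.0394, 32
steps_per_epoch = round(steps / epoch_fraction)   # steps in one full epoch
approx_examples = steps_per_epoch * batch_size    # rough training-set size
print(steps_per_epoch, approx_examples)
```

This is only an estimate derived from the logged values; the card itself does not state the dataset size.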
### Framework versions

- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0