# train_wsc_1745950304
This model is a fine-tuned version of mistralai/Mistral-7B-Instruct-v0.3 on the wsc dataset. It achieves the following results on the evaluation set:
- Loss: 0.3588
- Num Input Tokens Seen: 13676608
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 123
- gradient_accumulation_steps: 2
- total_train_batch_size: 4
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- training_steps: 40000
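The effective batch size follows from the per-device batch size and gradient accumulation, and the cosine schedule decays the learning rate from 5e-05 toward zero over the 40,000 steps. A minimal sketch of both, assuming a standard cosine decay without warmup (`cosine_lr` is an illustrative helper, not the exact scheduler implementation used in training):

```python
import math

# Effective batch size: per-device batch size * gradient accumulation steps.
train_batch_size = 2
gradient_accumulation_steps = 2
total_train_batch_size = train_batch_size * gradient_accumulation_steps

def cosine_lr(step: int, base_lr: float = 5e-05, total_steps: int = 40000) -> float:
    """Cosine decay from base_lr to 0 over total_steps (no warmup assumed)."""
    progress = min(step, total_steps) / total_steps
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * progress))

print(total_train_batch_size)  # 4
print(cosine_lr(0))            # 5e-05 (full learning rate at the start)
print(cosine_lr(40000))        # 0.0 (fully decayed at the final step)
```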
### Training results
| Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
|---|---|---|---|---|
| 0.4182 | 1.6024 | 200 | 0.4221 | 68480 |
| 0.2857 | 3.2008 | 400 | 0.3917 | 137040 |
| 0.3364 | 4.8032 | 600 | 0.3656 | 205344 |
| 0.3653 | 6.4016 | 800 | 0.3616 | 273648 |
| 0.3707 | 8.0 | 1000 | 0.3592 | 342192 |
| 0.3388 | 9.6024 | 1200 | 0.3666 | 410624 |
| 0.3419 | 11.2008 | 1400 | 0.3629 | 479392 |
| 0.3303 | 12.8032 | 1600 | 0.3650 | 547360 |
| 0.3762 | 14.4016 | 1800 | 0.3597 | 616128 |
| 0.3327 | 16.0 | 2000 | 0.3634 | 683616 |
| 0.3621 | 17.6024 | 2200 | 0.3753 | 751520 |
| 0.3296 | 19.2008 | 2400 | 0.3617 | 820000 |
| 0.3546 | 20.8032 | 2600 | 0.3634 | 888576 |
| 0.3424 | 22.4016 | 2800 | 0.3588 | 956480 |
| 0.3467 | 24.0 | 3000 | 0.3636 | 1024784 |
| 0.3619 | 25.6024 | 3200 | 0.3665 | 1093536 |
| 0.3395 | 27.2008 | 3400 | 0.3763 | 1161248 |
| 0.3474 | 28.8032 | 3600 | 0.3631 | 1229760 |
| 0.3492 | 30.4016 | 3800 | 0.3618 | 1298112 |
| 0.3452 | 32.0 | 4000 | 0.3660 | 1366864 |
| 0.3324 | 33.6024 | 4200 | 0.3657 | 1435664 |
| 0.3106 | 35.2008 | 4400 | 0.3714 | 1503408 |
| 0.3492 | 36.8032 | 4600 | 0.3681 | 1572288 |
| 0.3271 | 38.4016 | 4800 | 0.3727 | 1640848 |
| 0.3214 | 40.0 | 5000 | 0.3703 | 1708416 |
| 0.3355 | 41.6024 | 5200 | 0.3794 | 1776416 |
| 0.3505 | 43.2008 | 5400 | 0.3785 | 1845088 |
| 0.3557 | 44.8032 | 5600 | 0.3830 | 1913360 |
| 0.334 | 46.4016 | 5800 | 0.3774 | 1981136 |
| 0.3258 | 48.0 | 6000 | 0.3869 | 2050304 |
| 0.3373 | 49.6024 | 6200 | 0.3870 | 2118640 |
| 0.3009 | 51.2008 | 6400 | 0.3954 | 2186992 |
| 0.3416 | 52.8032 | 6600 | 0.3898 | 2255392 |
| 0.3298 | 54.4016 | 6800 | 0.3953 | 2324240 |
| 0.3259 | 56.0 | 7000 | 0.4013 | 2391840 |
| 0.308 | 57.6024 | 7200 | 0.4036 | 2460464 |
| 0.3438 | 59.2008 | 7400 | 0.4278 | 2528416 |
| 0.3394 | 60.8032 | 7600 | 0.4242 | 2597008 |
| 0.3444 | 62.4016 | 7800 | 0.4261 | 2664720 |
| 0.3241 | 64.0 | 8000 | 0.4233 | 2733360 |
| 0.3375 | 65.6024 | 8200 | 0.4432 | 2801792 |
| 0.3664 | 67.2008 | 8400 | 0.4459 | 2870768 |
| 0.3255 | 68.8032 | 8600 | 0.4392 | 2939344 |
| 0.3203 | 70.4016 | 8800 | 0.4550 | 3007936 |
| 0.3516 | 72.0 | 9000 | 0.4599 | 3076384 |
| 0.3042 | 73.6024 | 9200 | 0.4724 | 3144624 |
| 0.363 | 75.2008 | 9400 | 0.4866 | 3212896 |
| 0.2798 | 76.8032 | 9600 | 0.5124 | 3281408 |
| 0.301 | 78.4016 | 9800 | 0.5196 | 3349872 |
| 0.3132 | 80.0 | 10000 | 0.5165 | 3418368 |
| 0.271 | 81.6024 | 10200 | 0.5482 | 3486640 |
| 0.331 | 83.2008 | 10400 | 0.5430 | 3555456 |
| 0.243 | 84.8032 | 10600 | 0.5753 | 3623440 |
| 0.3408 | 86.4016 | 10800 | 0.5683 | 3691760 |
| 0.3489 | 88.0 | 11000 | 0.5920 | 3760416 |
| 0.2819 | 89.6024 | 11200 | 0.5884 | 3829184 |
| 0.3114 | 91.2008 | 11400 | 0.5958 | 3897520 |
| 0.2958 | 92.8032 | 11600 | 0.6064 | 3965568 |
| 0.3538 | 94.4016 | 11800 | 0.6156 | 4033904 |
| 0.2928 | 96.0 | 12000 | 0.6495 | 4102480 |
| 0.3155 | 97.6024 | 12200 | 0.6617 | 4170912 |
| 0.2244 | 99.2008 | 12400 | 0.6689 | 4238208 |
| 0.262 | 100.8032 | 12600 | 0.6944 | 4307408 |
| 0.2774 | 102.4016 | 12800 | 0.7281 | 4375136 |
| 0.2086 | 104.0 | 13000 | 0.7427 | 4443232 |
| 0.2878 | 105.6024 | 13200 | 0.7361 | 4511824 |
| 0.2031 | 107.2008 | 13400 | 0.7424 | 4580464 |
| 0.3016 | 108.8032 | 13600 | 0.8025 | 4648752 |
| 0.1863 | 110.4016 | 13800 | 0.7966 | 4717136 |
| 0.2902 | 112.0 | 14000 | 0.7886 | 4785328 |
| 0.3493 | 113.6024 | 14200 | 0.8174 | 4853616 |
| 0.1902 | 115.2008 | 14400 | 0.8588 | 4922160 |
| 0.2676 | 116.8032 | 14600 | 0.8796 | 4990880 |
| 0.2767 | 118.4016 | 14800 | 0.8943 | 5059200 |
| 0.2356 | 120.0 | 15000 | 0.9085 | 5127856 |
| 0.1787 | 121.6024 | 15200 | 0.9235 | 5196320 |
| 0.3697 | 123.2008 | 15400 | 0.9581 | 5264752 |
| 0.2209 | 124.8032 | 15600 | 0.9399 | 5333360 |
| 0.2495 | 126.4016 | 15800 | 0.9608 | 5401648 |
| 0.2882 | 128.0 | 16000 | 1.0044 | 5470144 |
| 0.2936 | 129.6024 | 16200 | 1.0493 | 5539584 |
| 0.2582 | 131.2008 | 16400 | 1.0551 | 5606896 |
| 0.2213 | 132.8032 | 16600 | 1.1055 | 5675392 |
| 0.254 | 134.4016 | 16800 | 1.1432 | 5743824 |
| 0.2402 | 136.0 | 17000 | 1.1314 | 5812000 |
| 0.182 | 137.6024 | 17200 | 1.1666 | 5880400 |
| 0.1432 | 139.2008 | 17400 | 1.1957 | 5949456 |
| 0.1634 | 140.8032 | 17600 | 1.2086 | 6017584 |
| 0.2767 | 142.4016 | 17800 | 1.2400 | 6086352 |
| 0.2369 | 144.0 | 18000 | 1.3023 | 6153776 |
| 0.2673 | 145.6024 | 18200 | 1.3155 | 6222672 |
| 0.1579 | 147.2008 | 18400 | 1.3474 | 6291168 |
| 0.1645 | 148.8032 | 18600 | 1.3551 | 6359136 |
| 0.1455 | 150.4016 | 18800 | 1.4188 | 6426976 |
| 0.1607 | 152.0 | 19000 | 1.4721 | 6495568 |
| 0.1333 | 153.6024 | 19200 | 1.5219 | 6564224 |
| 0.1668 | 155.2008 | 19400 | 1.5602 | 6632768 |
| 0.189 | 156.8032 | 19600 | 1.5473 | 6701376 |
| 0.2116 | 158.4016 | 19800 | 1.6037 | 6769520 |
| 0.2086 | 160.0 | 20000 | 1.6331 | 6837904 |
| 0.1643 | 161.6024 | 20200 | 1.6802 | 6905904 |
| 0.071 | 163.2008 | 20400 | 1.6977 | 6974368 |
| 0.1312 | 164.8032 | 20600 | 1.7327 | 7043152 |
| 0.0771 | 166.4016 | 20800 | 1.7637 | 7112192 |
| 0.1598 | 168.0 | 21000 | 1.8099 | 7179920 |
| 0.1882 | 169.6024 | 21200 | 1.8526 | 7248608 |
| 0.2049 | 171.2008 | 21400 | 1.8983 | 7316928 |
| 0.1992 | 172.8032 | 21600 | 1.9234 | 7385216 |
| 0.1081 | 174.4016 | 21800 | 1.9247 | 7453728 |
| 0.096 | 176.0 | 22000 | 2.0103 | 7521888 |
| 0.0913 | 177.6024 | 22200 | 2.0265 | 7590256 |
| 0.2304 | 179.2008 | 22400 | 2.0417 | 7658736 |
| 0.0711 | 180.8032 | 22600 | 2.1007 | 7727488 |
| 0.0523 | 182.4016 | 22800 | 2.1240 | 7796416 |
| 0.0728 | 184.0 | 23000 | 2.1555 | 7864592 |
| 0.1835 | 185.6024 | 23200 | 2.1951 | 7933232 |
| 0.1825 | 187.2008 | 23400 | 2.2397 | 8001808 |
| 0.0929 | 188.8032 | 23600 | 2.2338 | 8070240 |
| 0.1246 | 190.4016 | 23800 | 2.2963 | 8138688 |
| 0.0723 | 192.0 | 24000 | 2.3235 | 8206576 |
| 0.1148 | 193.6024 | 24200 | 2.3402 | 8274800 |
| 0.1673 | 195.2008 | 24400 | 2.3746 | 8342976 |
| 0.1203 | 196.8032 | 24600 | 2.4219 | 8411584 |
| 0.0929 | 198.4016 | 24800 | 2.4452 | 8479856 |
| 0.0821 | 200.0 | 25000 | 2.4297 | 8548304 |
| 0.114 | 201.6024 | 25200 | 2.4874 | 8617520 |
| 0.0966 | 203.2008 | 25400 | 2.5493 | 8685328 |
| 0.0794 | 204.8032 | 25600 | 2.5507 | 8753696 |
| 0.0852 | 206.4016 | 25800 | 2.5829 | 8821840 |
| 0.1576 | 208.0 | 26000 | 2.5570 | 8889904 |
| 0.0564 | 209.6024 | 26200 | 2.6558 | 8958528 |
| 0.0226 | 211.2008 | 26400 | 2.6378 | 9026416 |
| 0.0695 | 212.8032 | 26600 | 2.6579 | 9094992 |
| 0.1734 | 214.4016 | 26800 | 2.7379 | 9162896 |
| 0.0354 | 216.0 | 27000 | 2.7089 | 9231632 |
| 0.0735 | 217.6024 | 27200 | 2.7189 | 9299920 |
| 0.0574 | 219.2008 | 27400 | 2.7909 | 9368176 |
| 0.12 | 220.8032 | 27600 | 2.7917 | 9437280 |
| 0.0318 | 222.4016 | 27800 | 2.8985 | 9505712 |
| 0.0365 | 224.0 | 28000 | 2.8735 | 9573776 |
| 0.018 | 225.6024 | 28200 | 2.9023 | 9641744 |
| 0.0965 | 227.2008 | 28400 | 2.9100 | 9710672 |
| 0.1704 | 228.8032 | 28600 | 2.9287 | 9778976 |
| 0.1279 | 230.4016 | 28800 | 2.9438 | 9846768 |
| 0.0132 | 232.0 | 29000 | 2.9723 | 9915328 |
| 0.0354 | 233.6024 | 29200 | 2.9732 | 9984304 |
| 0.0461 | 235.2008 | 29400 | 3.0188 | 10052656 |
| 0.0905 | 236.8032 | 29600 | 3.0351 | 10121152 |
| 0.0751 | 238.4016 | 29800 | 3.0966 | 10188944 |
| 0.1289 | 240.0 | 30000 | 3.0455 | 10257280 |
| 0.1371 | 241.6024 | 30200 | 3.1055 | 10326160 |
| 0.0224 | 243.2008 | 30400 | 3.0539 | 10393920 |
| 0.1728 | 244.8032 | 30600 | 3.0722 | 10462528 |
| 0.1518 | 246.4016 | 30800 | 3.1652 | 10530528 |
| 0.1182 | 248.0 | 31000 | 3.1398 | 10599104 |
| 0.016 | 249.6024 | 31200 | 3.1207 | 10667920 |
| 0.2305 | 251.2008 | 31400 | 3.1523 | 10736624 |
| 0.0872 | 252.8032 | 31600 | 3.2164 | 10804624 |
| 0.0401 | 254.4016 | 31800 | 3.1512 | 10873200 |
| 0.0819 | 256.0 | 32000 | 3.1937 | 10941264 |
| 0.0769 | 257.6024 | 32200 | 3.2092 | 11010000 |
| 0.1113 | 259.2008 | 32400 | 3.1851 | 11077280 |
| 0.1352 | 260.8032 | 32600 | 3.2525 | 11145744 |
| 0.0615 | 262.4016 | 32800 | 3.2053 | 11214112 |
| 0.0672 | 264.0 | 33000 | 3.2336 | 11282096 |
| 0.0147 | 265.6024 | 33200 | 3.2353 | 11350608 |
| 0.0511 | 267.2008 | 33400 | 3.2861 | 11418608 |
| 0.162 | 268.8032 | 33600 | 3.1972 | 11487936 |
| 0.0567 | 270.4016 | 33800 | 3.2179 | 11556272 |
| 0.0296 | 272.0 | 34000 | 3.2427 | 11624208 |
| 0.0437 | 273.6024 | 34200 | 3.2535 | 11693424 |
| 0.0816 | 275.2008 | 34400 | 3.3090 | 11761200 |
| 0.009 | 276.8032 | 34600 | 3.3013 | 11830208 |
| 0.1148 | 278.4016 | 34800 | 3.2790 | 11898240 |
| 0.142 | 280.0 | 35000 | 3.2912 | 11966432 |
| 0.1121 | 281.6024 | 35200 | 3.2996 | 12035232 |
| 0.0622 | 283.2008 | 35400 | 3.3033 | 12103232 |
| 0.0424 | 284.8032 | 35600 | 3.3449 | 12171376 |
| 0.0516 | 286.4016 | 35800 | 3.2590 | 12240128 |
| 0.1405 | 288.0 | 36000 | 3.3177 | 12308016 |
| 0.0967 | 289.6024 | 36200 | 3.3428 | 12375936 |
| 0.0924 | 291.2008 | 36400 | 3.3152 | 12444880 |
| 0.174 | 292.8032 | 36600 | 3.3220 | 12513664 |
| 0.0326 | 294.4016 | 36800 | 3.3569 | 12581616 |
| 0.0316 | 296.0 | 37000 | 3.3005 | 12650688 |
| 0.0983 | 297.6024 | 37200 | 3.3322 | 12718976 |
| 0.138 | 299.2008 | 37400 | 3.3354 | 12787680 |
| 0.2226 | 300.8032 | 37600 | 3.3302 | 12856448 |
| 0.1144 | 302.4016 | 37800 | 3.2580 | 12924128 |
| 0.0685 | 304.0 | 38000 | 3.3315 | 12992944 |
| 0.0946 | 305.6024 | 38200 | 3.3205 | 13060928 |
| 0.0338 | 307.2008 | 38400 | 3.3585 | 13129472 |
| 0.1372 | 308.8032 | 38600 | 3.3024 | 13198064 |
| 0.0476 | 310.4016 | 38800 | 3.3682 | 13266304 |
| 0.3015 | 312.0 | 39000 | 3.3573 | 13334832 |
| 0.1388 | 313.6024 | 39200 | 3.3516 | 13402912 |
| 0.0485 | 315.2008 | 39400 | 3.3273 | 13470656 |
| 0.0757 | 316.8032 | 39600 | 3.3273 | 13539984 |
| 0.0308 | 318.4016 | 39800 | 3.3273 | 13608768 |
| 0.0862 | 320.0 | 40000 | 3.3273 | 13676608 |
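Validation loss reaches its minimum of 0.3588 at step 2800 (epoch ~22) and climbs steadily for the remaining steps, a typical overfitting pattern on a small dataset like WSC. The headline evaluation loss matches that minimum, which suggests (though the card does not state it) that the best checkpoint was kept. A minimal sketch of selecting the best checkpoint from an eval history, using a few (step, validation loss) pairs sampled from the table above:

```python
# A few (step, validation loss) pairs sampled from the training results table.
eval_history = [
    (200, 0.4221),
    (2800, 0.3588),
    (6000, 0.3869),
    (20000, 1.6331),
    (40000, 3.3273),
]

# Pick the checkpoint with the lowest validation loss.
best_step, best_loss = min(eval_history, key=lambda pair: pair[1])
print(best_step, best_loss)  # 2800 0.3588
```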
### Framework versions
- PEFT 0.15.2.dev0
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
## Model tree for rbelanec/train_wsc_1745950304

- Base model: mistralai/Mistral-7B-v0.3
- Finetuned from: mistralai/Mistral-7B-Instruct-v0.3