# train_wsc_1745950300
This model is a fine-tuned version of meta-llama/Meta-Llama-3-8B-Instruct on the wsc dataset. It achieves the following results on the evaluation set:
- Loss: 0.3607
- Num Input Tokens Seen: 14002704
## Model description
More information needed
## Intended uses & limitations
More information needed
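Until fuller documentation is available, the adapter can presumably be used as a standard PEFT checkpoint on top of the base model. The sketch below is an assumption based on the framework versions listed in this card (PEFT, Transformers): it loads the gated meta-llama/Meta-Llama-3-8B-Instruct weights and attaches this repository's adapter. Access to the gated base model and sufficient GPU memory are required.

```python
# Hedged sketch: load the base model, then attach the fine-tuned adapter.
# Assumes this repo ("rbelanec/train_wsc_1745950300") hosts a PEFT adapter
# and that you have been granted access to the gated Llama 3 weights.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Meta-Llama-3-8B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype="auto", device_map="auto"
)
# Wrap the base model with the LoRA/PEFT adapter from this repository.
model = PeftModel.from_pretrained(base, "rbelanec/train_wsc_1745950300")
model.eval()
```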
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 123
- gradient_accumulation_steps: 2
- total_train_batch_size: 4
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- training_steps: 40000
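The effective batch size and implied dataset size follow from these settings with a little arithmetic (a sketch; the ~500-example figure is inferred from the step-to-epoch ratio in the results table, not stated anywhere in the card):

```python
# Derive quantities implied by the hyperparameters above.
train_batch_size = 2
gradient_accumulation_steps = 2

# Effective (total) train batch size per optimizer step.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 4, matching total_train_batch_size above

# The log ends at step 40000 / epoch 320.0, so:
training_steps = 40_000
final_epoch = 320.0
steps_per_epoch = training_steps / final_epoch
print(steps_per_epoch)  # 125.0

# ...which implies roughly 125 * 4 = 500 training examples per epoch.
print(int(steps_per_epoch * total_train_batch_size))  # 500
```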
### Training results

Validation loss reaches its minimum (0.3607, the evaluation loss reported above) at step 1000 and climbs steadily thereafter, suggesting the model overfits this small dataset well before the full 40000 steps; the step-1000 checkpoint is the one of interest.
| Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
|---|---|---|---|---|
| 0.3352 | 1.6024 | 200 | 0.4151 | 70144 |
| 0.3608 | 3.2008 | 400 | 0.3919 | 140304 |
| 0.3306 | 4.8032 | 600 | 0.3797 | 210240 |
| 0.3416 | 6.4016 | 800 | 0.3721 | 279952 |
| 0.3557 | 8.0 | 1000 | 0.3607 | 350224 |
| 0.3524 | 9.6024 | 1200 | 0.3756 | 420256 |
| 0.3408 | 11.2008 | 1400 | 0.3687 | 490496 |
| 0.3174 | 12.8032 | 1600 | 0.3723 | 560224 |
| 0.346 | 14.4016 | 1800 | 0.3654 | 630560 |
| 0.3569 | 16.0 | 2000 | 0.3655 | 699648 |
| 0.3536 | 17.6024 | 2200 | 0.3772 | 769232 |
| 0.3404 | 19.2008 | 2400 | 0.3663 | 839344 |
| 0.3574 | 20.8032 | 2600 | 0.3722 | 909744 |
| 0.3486 | 22.4016 | 2800 | 0.3658 | 979312 |
| 0.3382 | 24.0 | 3000 | 0.3724 | 1049184 |
| 0.32 | 25.6024 | 3200 | 0.3829 | 1119552 |
| 0.3357 | 27.2008 | 3400 | 0.3977 | 1189008 |
| 0.347 | 28.8032 | 3600 | 0.3832 | 1259168 |
| 0.3144 | 30.4016 | 3800 | 0.3864 | 1329056 |
| 0.3331 | 32.0 | 4000 | 0.3924 | 1399280 |
| 0.3138 | 33.6024 | 4200 | 0.4047 | 1469920 |
| 0.329 | 35.2008 | 4400 | 0.4080 | 1539184 |
| 0.323 | 36.8032 | 4600 | 0.4114 | 1609648 |
| 0.3334 | 38.4016 | 4800 | 0.4293 | 1679792 |
| 0.302 | 40.0 | 5000 | 0.4371 | 1749008 |
| 0.3296 | 41.6024 | 5200 | 0.4562 | 1818832 |
| 0.3571 | 43.2008 | 5400 | 0.4634 | 1889136 |
| 0.3626 | 44.8032 | 5600 | 0.4595 | 1959008 |
| 0.3013 | 46.4016 | 5800 | 0.4858 | 2028320 |
| 0.2692 | 48.0 | 6000 | 0.5035 | 2098928 |
| 0.3288 | 49.6024 | 6200 | 0.5108 | 2168688 |
| 0.2918 | 51.2008 | 6400 | 0.5337 | 2238752 |
| 0.3118 | 52.8032 | 6600 | 0.5325 | 2308816 |
| 0.2576 | 54.4016 | 6800 | 0.5671 | 2379328 |
| 0.2914 | 56.0 | 7000 | 0.5735 | 2448704 |
| 0.2553 | 57.6024 | 7200 | 0.5979 | 2519008 |
| 0.3082 | 59.2008 | 7400 | 0.6323 | 2588608 |
| 0.316 | 60.8032 | 7600 | 0.6291 | 2659072 |
| 0.3207 | 62.4016 | 7800 | 0.6799 | 2728480 |
| 0.356 | 64.0 | 8000 | 0.6687 | 2798720 |
| 0.2448 | 65.6024 | 8200 | 0.7052 | 2868672 |
| 0.2705 | 67.2008 | 8400 | 0.7099 | 2939312 |
| 0.2714 | 68.8032 | 8600 | 0.7145 | 3009568 |
| 0.2756 | 70.4016 | 8800 | 0.7587 | 3079584 |
| 0.2359 | 72.0 | 9000 | 0.7910 | 3149680 |
| 0.3139 | 73.6024 | 9200 | 0.8530 | 3219680 |
| 0.234 | 75.2008 | 9400 | 0.8904 | 3289472 |
| 0.2126 | 76.8032 | 9600 | 0.9579 | 3359520 |
| 0.241 | 78.4016 | 9800 | 0.9675 | 3429568 |
| 0.2583 | 80.0 | 10000 | 0.9924 | 3499648 |
| 0.2245 | 81.6024 | 10200 | 1.0858 | 3569504 |
| 0.3603 | 83.2008 | 10400 | 1.0709 | 3639920 |
| 0.2158 | 84.8032 | 10600 | 1.1456 | 3709520 |
| 0.1792 | 86.4016 | 10800 | 1.1610 | 3779456 |
| 0.3313 | 88.0 | 11000 | 1.2253 | 3849744 |
| 0.2754 | 89.6024 | 11200 | 1.2258 | 3919984 |
| 0.2289 | 91.2008 | 11400 | 1.2811 | 3989872 |
| 0.2321 | 92.8032 | 11600 | 1.3618 | 4059568 |
| 0.2502 | 94.4016 | 11800 | 1.4161 | 4129664 |
| 0.1047 | 96.0 | 12000 | 1.5128 | 4199936 |
| 0.2849 | 97.6024 | 12200 | 1.6091 | 4269952 |
| 0.2445 | 99.2008 | 12400 | 1.6657 | 4339040 |
| 0.1908 | 100.8032 | 12600 | 1.7710 | 4409680 |
| 0.1291 | 102.4016 | 12800 | 1.8082 | 4479120 |
| 0.2423 | 104.0 | 13000 | 1.9097 | 4548896 |
| 0.2913 | 105.6024 | 13200 | 2.0164 | 4619216 |
| 0.1558 | 107.2008 | 13400 | 2.0158 | 4689424 |
| 0.3142 | 108.8032 | 13600 | 2.2503 | 4759232 |
| 0.1757 | 110.4016 | 13800 | 2.3324 | 4829120 |
| 0.3289 | 112.0 | 14000 | 2.3124 | 4899024 |
| 0.1823 | 113.6024 | 14200 | 2.5127 | 4968944 |
| 0.1507 | 115.2008 | 14400 | 2.5735 | 5039152 |
| 0.2789 | 116.8032 | 14600 | 2.7307 | 5109312 |
| 0.2676 | 118.4016 | 14800 | 2.8011 | 5179296 |
| 0.37 | 120.0 | 15000 | 2.9478 | 5249504 |
| 0.0622 | 121.6024 | 15200 | 3.0192 | 5319424 |
| 0.1856 | 123.2008 | 15400 | 3.2184 | 5389488 |
| 0.2693 | 124.8032 | 15600 | 3.2791 | 5459776 |
| 0.1659 | 126.4016 | 15800 | 3.4324 | 5529760 |
| 0.2245 | 128.0 | 16000 | 3.6397 | 5599968 |
| 0.1535 | 129.6024 | 16200 | 3.7918 | 5671056 |
| 0.2849 | 131.2008 | 16400 | 3.9842 | 5740000 |
| 0.094 | 132.8032 | 16600 | 4.0727 | 5810288 |
| 0.0609 | 134.4016 | 16800 | 4.2751 | 5880176 |
| 0.0629 | 136.0 | 17000 | 4.3689 | 5950048 |
| 0.0703 | 137.6024 | 17200 | 4.5393 | 6020016 |
| 0.0419 | 139.2008 | 17400 | 4.5218 | 6090672 |
| 0.0552 | 140.8032 | 17600 | 4.7823 | 6160288 |
| 0.1961 | 142.4016 | 17800 | 4.9638 | 6230656 |
| 0.3005 | 144.0 | 18000 | 5.0710 | 6299968 |
| 0.0058 | 145.6024 | 18200 | 5.0842 | 6370512 |
| 0.0346 | 147.2008 | 18400 | 5.1136 | 6440784 |
| 0.1538 | 148.8032 | 18600 | 5.3746 | 6510560 |
| 0.26 | 150.4016 | 18800 | 5.3878 | 6579872 |
| 0.1486 | 152.0 | 19000 | 5.4975 | 6650112 |
| 0.1275 | 153.6024 | 19200 | 5.5547 | 6720368 |
| 0.3176 | 155.2008 | 19400 | 5.6005 | 6790512 |
| 0.0102 | 156.8032 | 19600 | 5.7658 | 6860880 |
| 0.2132 | 158.4016 | 19800 | 5.8400 | 6930576 |
| 0.3428 | 160.0 | 20000 | 5.7560 | 7000640 |
| 0.1867 | 161.6024 | 20200 | 6.0317 | 7070272 |
| 0.0236 | 163.2008 | 20400 | 5.9397 | 7140336 |
| 0.0018 | 164.8032 | 20600 | 6.1296 | 7210816 |
| 0.0625 | 166.4016 | 20800 | 6.1769 | 7281392 |
| 0.1682 | 168.0 | 21000 | 6.1838 | 7350960 |
| 0.0961 | 169.6024 | 21200 | 6.2791 | 7421312 |
| 0.1996 | 171.2008 | 21400 | 6.2697 | 7491200 |
| 0.0041 | 172.8032 | 21600 | 6.3722 | 7560976 |
| 0.0035 | 174.4016 | 21800 | 6.4474 | 7631024 |
| 0.0125 | 176.0 | 22000 | 6.4397 | 7700784 |
| 0.0832 | 177.6024 | 22200 | 6.5712 | 7770752 |
| 0.0084 | 179.2008 | 22400 | 6.6811 | 7840832 |
| 0.0039 | 180.8032 | 22600 | 6.6361 | 7911072 |
| 0.0008 | 182.4016 | 22800 | 6.6250 | 7981312 |
| 0.1602 | 184.0 | 23000 | 6.7170 | 8050976 |
| 0.1004 | 185.6024 | 23200 | 6.7897 | 8121312 |
| 0.0007 | 187.2008 | 23400 | 6.8662 | 8191520 |
| 0.0957 | 188.8032 | 23600 | 6.8907 | 8261456 |
| 0.0064 | 190.4016 | 23800 | 6.8531 | 8331664 |
| 0.0002 | 192.0 | 24000 | 6.9775 | 8401328 |
| 0.0502 | 193.6024 | 24200 | 6.9963 | 8471232 |
| 0.1058 | 195.2008 | 24400 | 6.9280 | 8540976 |
| 0.1785 | 196.8032 | 24600 | 7.1445 | 8611296 |
| 0.001 | 198.4016 | 24800 | 7.1940 | 8681264 |
| 0.0027 | 200.0 | 25000 | 7.2032 | 8751280 |
| 0.0884 | 201.6024 | 25200 | 7.1883 | 8822192 |
| 0.0018 | 203.2008 | 25400 | 7.2794 | 8891648 |
| 0.0011 | 204.8032 | 25600 | 7.3464 | 8961760 |
| 0.2617 | 206.4016 | 25800 | 7.2884 | 9031568 |
| 0.0003 | 208.0 | 26000 | 7.3494 | 9101088 |
| 0.0001 | 209.6024 | 26200 | 7.3600 | 9171168 |
| 0.0037 | 211.2008 | 26400 | 7.3523 | 9240752 |
| 0.0251 | 212.8032 | 26600 | 7.4558 | 9310960 |
| 0.0148 | 214.4016 | 26800 | 7.4483 | 9380560 |
| 0.3346 | 216.0 | 27000 | 7.5636 | 9450912 |
| 0.0008 | 217.6024 | 27200 | 7.5489 | 9520832 |
| 0.0944 | 219.2008 | 27400 | 7.5324 | 9590800 |
| 0.0929 | 220.8032 | 27600 | 7.5856 | 9661456 |
| 0.0002 | 222.4016 | 27800 | 7.5580 | 9731376 |
| 0.0022 | 224.0 | 28000 | 7.6195 | 9801040 |
| 0.1286 | 225.6024 | 28200 | 7.6605 | 9870784 |
| 0.0044 | 227.2008 | 28400 | 7.7439 | 9941408 |
| 0.0057 | 228.8032 | 28600 | 7.8067 | 10011264 |
| 0.0003 | 230.4016 | 28800 | 7.7763 | 10080704 |
| 0.3176 | 232.0 | 29000 | 7.7647 | 10150880 |
| 0.0704 | 233.6024 | 29200 | 7.8208 | 10221616 |
| 0.0532 | 235.2008 | 29400 | 7.8422 | 10291664 |
| 0.0061 | 236.8032 | 29600 | 7.8229 | 10361728 |
| 0.0 | 238.4016 | 29800 | 7.7975 | 10431088 |
| 0.0003 | 240.0 | 30000 | 7.9020 | 10501088 |
| 0.0176 | 241.6024 | 30200 | 7.9049 | 10571488 |
| 0.0001 | 243.2008 | 30400 | 7.8923 | 10640848 |
| 0.0762 | 244.8032 | 30600 | 7.9427 | 10711136 |
| 0.0 | 246.4016 | 30800 | 7.9366 | 10781136 |
| 0.0238 | 248.0 | 31000 | 7.9185 | 10851312 |
| 0.0645 | 249.6024 | 31200 | 8.0396 | 10921664 |
| 0.0016 | 251.2008 | 31400 | 8.0351 | 10991936 |
| 0.0001 | 252.8032 | 31600 | 7.9530 | 11061680 |
| 0.0 | 254.4016 | 31800 | 8.0267 | 11131872 |
| 0.0624 | 256.0 | 32000 | 8.0679 | 11201520 |
| 0.2152 | 257.6024 | 32200 | 8.0272 | 11271952 |
| 0.1678 | 259.2008 | 32400 | 8.0434 | 11340976 |
| 0.0481 | 260.8032 | 32600 | 8.0906 | 11411056 |
| 0.0002 | 262.4016 | 32800 | 8.1119 | 11481152 |
| 0.0006 | 264.0 | 33000 | 8.0791 | 11550752 |
| 0.0033 | 265.6024 | 33200 | 8.1098 | 11620752 |
| 0.0001 | 267.2008 | 33400 | 8.1449 | 11690464 |
| 0.1819 | 268.8032 | 33600 | 8.1190 | 11761360 |
| 0.0001 | 270.4016 | 33800 | 8.0936 | 11831152 |
| 0.0 | 272.0 | 34000 | 8.1583 | 11900768 |
| 0.0001 | 273.6024 | 34200 | 8.1760 | 11971616 |
| 0.0005 | 275.2008 | 34400 | 8.1717 | 12041104 |
| 0.0001 | 276.8032 | 34600 | 8.1783 | 12111712 |
| 0.0 | 278.4016 | 34800 | 8.2085 | 12181328 |
| 0.0001 | 280.0 | 35000 | 8.1241 | 12251088 |
| 0.0079 | 281.6024 | 35200 | 8.1174 | 12321616 |
| 0.0001 | 283.2008 | 35400 | 8.1422 | 12391184 |
| 0.0004 | 284.8032 | 35600 | 8.1090 | 12461088 |
| 0.0051 | 286.4016 | 35800 | 8.1838 | 12531520 |
| 0.0004 | 288.0 | 36000 | 8.2114 | 12600944 |
| 0.0 | 289.6024 | 36200 | 8.1985 | 12670544 |
| 0.0002 | 291.2008 | 36400 | 8.2227 | 12741216 |
| 0.0002 | 292.8032 | 36600 | 8.1926 | 12811584 |
| 0.0019 | 294.4016 | 36800 | 8.2078 | 12881104 |
| 0.0063 | 296.0 | 37000 | 8.1592 | 12951648 |
| 0.0009 | 297.6024 | 37200 | 8.2039 | 13021600 |
| 0.0056 | 299.2008 | 37400 | 8.2096 | 13091888 |
| 0.0029 | 300.8032 | 37600 | 8.2207 | 13162128 |
| 0.0002 | 302.4016 | 37800 | 8.1734 | 13231552 |
| 0.001 | 304.0 | 38000 | 8.2232 | 13302080 |
| 0.2469 | 305.6024 | 38200 | 8.2582 | 13371808 |
| 0.0001 | 307.2008 | 38400 | 8.1514 | 13441936 |
| 0.028 | 308.8032 | 38600 | 8.1905 | 13512304 |
| 0.0174 | 310.4016 | 38800 | 8.2173 | 13582192 |
| 0.0001 | 312.0 | 39000 | 8.2045 | 13652384 |
| 0.0001 | 313.6024 | 39200 | 8.2022 | 13722224 |
| 0.0478 | 315.2008 | 39400 | 8.2213 | 13791728 |
| 0.0361 | 316.8032 | 39600 | 8.2213 | 13862560 |
| 0.0032 | 318.4016 | 39800 | 8.2213 | 13933264 |
| 0.0127 | 320.0 | 40000 | 8.2213 | 14002704 |
### Framework versions

- PEFT 0.15.2.dev0
- Transformers 4.51.3
- PyTorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1