# train_boolq_1745950276
This model is a fine-tuned version of [meta-llama/Meta-Llama-3-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct) on the [boolq](https://huggingface.co/datasets/google/boolq) dataset. It achieves the following results on the evaluation set:
- Loss: 0.3266
- Num Input Tokens Seen: 34078960
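This checkpoint is a PEFT adapter on top of the base model (see the framework versions below). As a minimal usage sketch, assuming the adapter is published as `rbelanec/train_boolq_1745950276`; the prompt wording is an assumption, since the training template is not documented in this card:

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "meta-llama/Meta-Llama-3-8B-Instruct"
adapter_id = "rbelanec/train_boolq_1745950276"  # assumed adapter repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base, adapter_id)
model.eval()

# A BoolQ-style example: a passage plus a yes/no question.
# The exact prompt format used in training is not recorded here.
messages = [{
    "role": "user",
    "content": (
        "Passage: BoolQ is a reading-comprehension dataset of yes/no questions.\n"
        "Question: is boolq a yes/no question dataset?\n"
        "Answer with yes or no."
    ),
}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

with torch.no_grad():
    output = model.generate(input_ids, max_new_tokens=4)
print(tokenizer.decode(output[0, input_ids.shape[-1]:], skip_special_tokens=True))
```

Note that the base model is gated on the Hub, so you may need to request access to it first.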
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):
- learning_rate: 0.3
- train_batch_size: 2
- eval_batch_size: 2
- seed: 123
- gradient_accumulation_steps: 2
- total_train_batch_size: 4
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- training_steps: 40000
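For reference, a hedged sketch of how these values map onto `transformers.TrainingArguments`; the original training script is not part of this card, so this mapping is an assumption, and the PEFT configuration itself (which the unusually high 0.3 learning rate suggests is a prompt-tuning-style method) is not recorded here:

```python
from transformers import TrainingArguments

# Assumed mapping of the hyperparameters listed above; the original
# training script is not included in this model card.
args = TrainingArguments(
    output_dir="train_boolq_1745950276",
    learning_rate=0.3,              # high LR, typical for prompt-tuning-style PEFT
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=2,  # effective train batch size 2 * 2 = 4
    seed=123,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    max_steps=40000,
    logging_steps=200,              # matches the 200-step cadence of the results table
    eval_strategy="steps",
    eval_steps=200,
)
```

Training ran the full 40,000 steps (about 18.9 epochs), matching the last row of the results table below.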
### Training results
| Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
|---|---|---|---|---|
| 0.4503 | 0.0943 | 200 | 0.3977 | 171472 |
| 0.4037 | 0.1886 | 400 | 0.3312 | 339520 |
| 0.3821 | 0.2829 | 600 | 0.5777 | 509632 |
| 0.3074 | 0.3772 | 800 | 0.3313 | 685120 |
| 0.3776 | 0.4715 | 1000 | 0.3313 | 856144 |
| 0.3859 | 0.5658 | 1200 | 0.3509 | 1024448 |
| 0.366 | 0.6601 | 1400 | 0.3591 | 1192560 |
| 0.2966 | 0.7544 | 1600 | 0.3289 | 1360304 |
| 0.3697 | 0.8487 | 1800 | 0.3312 | 1535088 |
| 0.4101 | 0.9430 | 2000 | 0.3274 | 1708224 |
| 0.3602 | 1.0372 | 2200 | 0.3301 | 1880336 |
| 0.2841 | 1.1315 | 2400 | 0.3367 | 2048144 |
| 0.3223 | 1.2258 | 2600 | 0.3288 | 2220048 |
| 0.352 | 1.3201 | 2800 | 0.3275 | 2388128 |
| 0.2815 | 1.4144 | 3000 | 0.3372 | 2559696 |
| 0.3051 | 1.5087 | 3200 | 0.3275 | 2730224 |
| 0.3235 | 1.6030 | 3400 | 0.3279 | 2897744 |
| 0.3348 | 1.6973 | 3600 | 0.3646 | 3067728 |
| 0.4181 | 1.7916 | 3800 | 0.3291 | 3235856 |
| 0.3254 | 1.8859 | 4000 | 0.3284 | 3409536 |
| 0.3088 | 1.9802 | 4200 | 0.3327 | 3581600 |
| 0.3823 | 2.0745 | 4400 | 0.3503 | 3753552 |
| 0.4218 | 2.1688 | 4600 | 0.3308 | 3924224 |
| 0.2808 | 2.2631 | 4800 | 0.3303 | 4093360 |
| 0.326 | 2.3574 | 5000 | 0.3273 | 4261312 |
| 0.3026 | 2.4517 | 5200 | 0.3289 | 4438224 |
| 0.3044 | 2.5460 | 5400 | 0.3299 | 4608992 |
| 0.3489 | 2.6403 | 5600 | 0.3294 | 4780976 |
| 0.3564 | 2.7346 | 5800 | 0.3275 | 4946848 |
| 0.4021 | 2.8289 | 6000 | 0.3297 | 5120704 |
| 0.3417 | 2.9231 | 6200 | 0.3596 | 5292816 |
| 0.3519 | 3.0174 | 6400 | 0.3315 | 5463728 |
| 0.3563 | 3.1117 | 6600 | 0.3312 | 5634528 |
| 0.3607 | 3.2060 | 6800 | 0.3293 | 5805344 |
| 0.3434 | 3.3003 | 7000 | 0.3279 | 5976160 |
| 0.3496 | 3.3946 | 7200 | 0.3439 | 6147200 |
| 0.3595 | 3.4889 | 7400 | 0.3373 | 6315984 |
| 0.3667 | 3.5832 | 7600 | 0.3374 | 6484512 |
| 0.2914 | 3.6775 | 7800 | 0.3325 | 6653536 |
| 0.3376 | 3.7718 | 8000 | 0.3276 | 6823184 |
| 0.349 | 3.8661 | 8200 | 0.3294 | 6991232 |
| 0.4064 | 3.9604 | 8400 | 0.3406 | 7161488 |
| 0.3525 | 4.0547 | 8600 | 0.3321 | 7330416 |
| 0.34 | 4.1490 | 8800 | 0.3303 | 7502960 |
| 0.2912 | 4.2433 | 9000 | 0.3291 | 7675776 |
| 0.3682 | 4.3376 | 9200 | 0.3505 | 7847200 |
| 0.3892 | 4.4319 | 9400 | 0.3331 | 8016080 |
| 0.3653 | 4.5262 | 9600 | 0.3292 | 8189040 |
| 0.3264 | 4.6205 | 9800 | 0.3283 | 8355360 |
| 0.3354 | 4.7148 | 10000 | 0.3293 | 8527824 |
| 0.332 | 4.8091 | 10200 | 0.3292 | 8697376 |
| 0.3546 | 4.9033 | 10400 | 0.3470 | 8867872 |
| 0.3372 | 4.9976 | 10600 | 0.3278 | 9039888 |
| 0.3172 | 5.0919 | 10800 | 0.3304 | 9209232 |
| 0.3546 | 5.1862 | 11000 | 0.3321 | 9384064 |
| 0.2926 | 5.2805 | 11200 | 0.3301 | 9555168 |
| 0.2945 | 5.3748 | 11400 | 0.3324 | 9724832 |
| 0.3107 | 5.4691 | 11600 | 0.3294 | 9894608 |
| 0.3976 | 5.5634 | 11800 | 0.3285 | 10067376 |
| 0.3501 | 5.6577 | 12000 | 0.3352 | 10239648 |
| 0.3752 | 5.7520 | 12200 | 0.3293 | 10406400 |
| 0.34 | 5.8463 | 12400 | 0.3396 | 10578176 |
| 0.2835 | 5.9406 | 12600 | 0.3275 | 10745200 |
| 0.3701 | 6.0349 | 12800 | 0.3364 | 10917056 |
| 0.3419 | 6.1292 | 13000 | 0.3293 | 11091376 |
| 0.3755 | 6.2235 | 13200 | 0.3343 | 11260160 |
| 0.3517 | 6.3178 | 13400 | 0.3279 | 11430512 |
| 0.2505 | 6.4121 | 13600 | 0.3336 | 11598992 |
| 0.2929 | 6.5064 | 13800 | 0.3282 | 11771312 |
| 0.3418 | 6.6007 | 14000 | 0.3373 | 11940256 |
| 0.3937 | 6.6950 | 14200 | 0.3284 | 12108896 |
| 0.3437 | 6.7893 | 14400 | 0.3324 | 12277824 |
| 0.372 | 6.8835 | 14600 | 0.3310 | 12450224 |
| 0.3371 | 6.9778 | 14800 | 0.3314 | 12618544 |
| 0.3423 | 7.0721 | 15000 | 0.3382 | 12791104 |
| 0.3464 | 7.1664 | 15200 | 0.3382 | 12964976 |
| 0.3554 | 7.2607 | 15400 | 0.3503 | 13132848 |
| 0.3801 | 7.3550 | 15600 | 0.3269 | 13302528 |
| 0.4169 | 7.4493 | 15800 | 0.3346 | 13471696 |
| 0.3081 | 7.5436 | 16000 | 0.3301 | 13643264 |
| 0.336 | 7.6379 | 16200 | 0.3273 | 13810336 |
| 0.336 | 7.7322 | 16400 | 0.3270 | 13980096 |
| 0.3257 | 7.8265 | 16600 | 0.3267 | 14150688 |
| 0.351 | 7.9208 | 16800 | 0.3279 | 14320928 |
| 0.2939 | 8.0151 | 17000 | 0.3295 | 14497120 |
| 0.3513 | 8.1094 | 17200 | 0.3274 | 14667440 |
| 0.2586 | 8.2037 | 17400 | 0.3292 | 14839920 |
| 0.3236 | 8.2980 | 17600 | 0.3277 | 15013152 |
| 0.3524 | 8.3923 | 17800 | 0.3300 | 15178480 |
| 0.3424 | 8.4866 | 18000 | 0.3431 | 15349216 |
| 0.3419 | 8.5809 | 18200 | 0.3292 | 15518736 |
| 0.3545 | 8.6752 | 18400 | 0.3274 | 15689568 |
| 0.3255 | 8.7694 | 18600 | 0.3277 | 15859632 |
| 0.339 | 8.8637 | 18800 | 0.3278 | 16025920 |
| 0.3306 | 8.9580 | 19000 | 0.3298 | 16196560 |
| 0.3426 | 9.0523 | 19200 | 0.3271 | 16368144 |
| 0.3301 | 9.1466 | 19400 | 0.3283 | 16539952 |
| 0.3963 | 9.2409 | 19600 | 0.3275 | 16710384 |
| 0.3358 | 9.3352 | 19800 | 0.3278 | 16878624 |
| 0.4122 | 9.4295 | 20000 | 0.3298 | 17046992 |
| 0.342 | 9.5238 | 20200 | 0.3286 | 17218176 |
| 0.3337 | 9.6181 | 20400 | 0.3275 | 17390384 |
| 0.3114 | 9.7124 | 20600 | 0.3278 | 17560512 |
| 0.3436 | 9.8067 | 20800 | 0.3522 | 17726048 |
| 0.306 | 9.9010 | 21000 | 0.3309 | 17897376 |
| 0.3466 | 9.9953 | 21200 | 0.3290 | 18068096 |
| 0.3196 | 10.0896 | 21400 | 0.3282 | 18244704 |
| 0.3051 | 10.1839 | 21600 | 0.3285 | 18420464 |
| 0.334 | 10.2782 | 21800 | 0.3298 | 18588480 |
| 0.3217 | 10.3725 | 22000 | 0.3286 | 18758992 |
| 0.3177 | 10.4668 | 22200 | 0.3268 | 18930688 |
| 0.3123 | 10.5611 | 22400 | 0.3306 | 19096400 |
| 0.3375 | 10.6554 | 22600 | 0.3299 | 19263456 |
| 0.3509 | 10.7496 | 22800 | 0.3266 | 19430576 |
| 0.3292 | 10.8439 | 23000 | 0.3269 | 19599200 |
| 0.371 | 10.9382 | 23200 | 0.3284 | 19770608 |
| 0.3363 | 11.0325 | 23400 | 0.3294 | 19942144 |
| 0.3182 | 11.1268 | 23600 | 0.3294 | 20112704 |
| 0.3269 | 11.2211 | 23800 | 0.3301 | 20282048 |
| 0.3506 | 11.3154 | 24000 | 0.3288 | 20456384 |
| 0.316 | 11.4097 | 24200 | 0.3287 | 20624672 |
| 0.4168 | 11.5040 | 24400 | 0.3272 | 20796640 |
| 0.3405 | 11.5983 | 24600 | 0.3291 | 20964288 |
| 0.3649 | 11.6926 | 24800 | 0.3286 | 21132896 |
| 0.2896 | 11.7869 | 25000 | 0.3285 | 21304080 |
| 0.3026 | 11.8812 | 25200 | 0.3278 | 21471264 |
| 0.3258 | 11.9755 | 25400 | 0.3281 | 21642432 |
| 0.3405 | 12.0698 | 25600 | 0.3343 | 21811200 |
| 0.3028 | 12.1641 | 25800 | 0.3285 | 21983552 |
| 0.3171 | 12.2584 | 26000 | 0.3335 | 22155824 |
| 0.3351 | 12.3527 | 26200 | 0.3267 | 22329984 |
| 0.3003 | 12.4470 | 26400 | 0.3283 | 22499472 |
| 0.3651 | 12.5413 | 26600 | 0.3273 | 22669904 |
| 0.3337 | 12.6355 | 26800 | 0.3279 | 22837440 |
| 0.3217 | 12.7298 | 27000 | 0.3276 | 23008384 |
| 0.3209 | 12.8241 | 27200 | 0.3268 | 23177264 |
| 0.3092 | 12.9184 | 27400 | 0.3282 | 23344128 |
| 0.3275 | 13.0127 | 27600 | 0.3303 | 23512272 |
| 0.3312 | 13.1070 | 27800 | 0.3367 | 23680080 |
| 0.3374 | 13.2013 | 28000 | 0.3287 | 23850944 |
| 0.3433 | 13.2956 | 28200 | 0.3286 | 24022624 |
| 0.3572 | 13.3899 | 28400 | 0.3284 | 24192144 |
| 0.3193 | 13.4842 | 28600 | 0.3319 | 24364768 |
| 0.3669 | 13.5785 | 28800 | 0.3310 | 24538352 |
| 0.3224 | 13.6728 | 29000 | 0.3308 | 24710416 |
| 0.3478 | 13.7671 | 29200 | 0.3316 | 24881936 |
| 0.3424 | 13.8614 | 29400 | 0.3295 | 25050656 |
| 0.3438 | 13.9557 | 29600 | 0.3312 | 25222752 |
| 0.3118 | 14.0500 | 29800 | 0.3318 | 25389344 |
| 0.3421 | 14.1443 | 30000 | 0.3286 | 25563712 |
| 0.3621 | 14.2386 | 30200 | 0.3287 | 25738992 |
| 0.3214 | 14.3329 | 30400 | 0.3282 | 25909744 |
| 0.3228 | 14.4272 | 30600 | 0.3282 | 26079120 |
| 0.3444 | 14.5215 | 30800 | 0.3285 | 26245936 |
| 0.3549 | 14.6157 | 31000 | 0.3274 | 26416944 |
| 0.3688 | 14.7100 | 31200 | 0.3296 | 26586032 |
| 0.3469 | 14.8043 | 31400 | 0.3288 | 26756496 |
| 0.3148 | 14.8986 | 31600 | 0.3292 | 26924320 |
| 0.3076 | 14.9929 | 31800 | 0.3287 | 27096688 |
| 0.3478 | 15.0872 | 32000 | 0.3306 | 27264640 |
| 0.3205 | 15.1815 | 32200 | 0.3292 | 27440592 |
| 0.2857 | 15.2758 | 32400 | 0.3280 | 27613248 |
| 0.354 | 15.3701 | 32600 | 0.3292 | 27781552 |
| 0.3337 | 15.4644 | 32800 | 0.3291 | 27956496 |
| 0.3268 | 15.5587 | 33000 | 0.3290 | 28125456 |
| 0.3406 | 15.6530 | 33200 | 0.3286 | 28295136 |
| 0.339 | 15.7473 | 33400 | 0.3295 | 28462640 |
| 0.3179 | 15.8416 | 33600 | 0.3298 | 28631360 |
| 0.3163 | 15.9359 | 33800 | 0.3304 | 28799488 |
| 0.3236 | 16.0302 | 34000 | 0.3288 | 28964832 |
| 0.3448 | 16.1245 | 34200 | 0.3291 | 29137792 |
| 0.3339 | 16.2188 | 34400 | 0.3300 | 29306192 |
| 0.3299 | 16.3131 | 34600 | 0.3299 | 29481760 |
| 0.2756 | 16.4074 | 34800 | 0.3291 | 29654256 |
| 0.323 | 16.5017 | 35000 | 0.3286 | 29821840 |
| 0.3274 | 16.5959 | 35200 | 0.3296 | 29992016 |
| 0.3009 | 16.6902 | 35400 | 0.3290 | 30157952 |
| 0.3676 | 16.7845 | 35600 | 0.3315 | 30329792 |
| 0.3473 | 16.8788 | 35800 | 0.3310 | 30500240 |
| 0.3273 | 16.9731 | 36000 | 0.3306 | 30668944 |
| 0.3209 | 17.0674 | 36200 | 0.3282 | 30840688 |
| 0.3293 | 17.1617 | 36400 | 0.3296 | 31012176 |
| 0.3471 | 17.2560 | 36600 | 0.3294 | 31184160 |
| 0.3463 | 17.3503 | 36800 | 0.3299 | 31359648 |
| 0.2986 | 17.4446 | 37000 | 0.3302 | 31529872 |
| 0.3342 | 17.5389 | 37200 | 0.3295 | 31699104 |
| 0.2928 | 17.6332 | 37400 | 0.3298 | 31870016 |
| 0.3226 | 17.7275 | 37600 | 0.3301 | 32036672 |
| 0.2944 | 17.8218 | 37800 | 0.3292 | 32206048 |
| 0.3026 | 17.9161 | 38000 | 0.3289 | 32377104 |
| 0.3164 | 18.0104 | 38200 | 0.3294 | 32548208 |
| 0.2956 | 18.1047 | 38400 | 0.3286 | 32716560 |
| 0.2683 | 18.1990 | 38600 | 0.3299 | 32885504 |
| 0.3365 | 18.2933 | 38800 | 0.3286 | 33056256 |
| 0.3395 | 18.3876 | 39000 | 0.3287 | 33225648 |
| 0.2881 | 18.4818 | 39200 | 0.3289 | 33393952 |
| 0.2915 | 18.5761 | 39400 | 0.3286 | 33564304 |
| 0.3988 | 18.6704 | 39600 | 0.3287 | 33735024 |
| 0.2974 | 18.7647 | 39800 | 0.3301 | 33907088 |
| 0.3258 | 18.8590 | 40000 | 0.3281 | 34078960 |
### Framework versions
- PEFT 0.15.2.dev0
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
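As a quick sanity check that a local environment matches the versions above (PEFT 0.15.2.dev0 is a development build, so an installed release version may differ):

```python
# Compare installed package versions against those listed in this card.
import importlib.metadata as md

card_versions = {
    "peft": "0.15.2.dev0",
    "transformers": "4.51.3",
    "torch": "2.6.0+cu124",
    "datasets": "3.5.0",
    "tokenizers": "0.21.1",
}
for pkg, want in card_versions.items():
    have = md.version(pkg)
    print(f"{pkg}: installed {have}, card used {want}")
```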