ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k17_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 1.4761
  • Qwk: 0.4275
  • Mse: 1.4761
  • Rmse: 1.2149
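
Here "Qwk" denotes quadratic weighted kappa, the usual agreement metric for ordinal essay scores. As a minimal sketch of how these three metrics can be computed (scikit-learn is an assumption; this card does not say how they were produced):

```python
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold scores and rounded model predictions on an ordinal scale.
y_true = [0, 1, 2, 2, 3, 4]
y_pred = [0, 1, 1, 2, 3, 3]

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # quadratic weighted kappa
mse = mean_squared_error(y_true, y_pred)                      # mean squared error
rmse = mse ** 0.5                                             # root mean squared error
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```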

Model description

A ~0.1B-parameter BERT-base (AraBERTv02) model, distributed as a Safetensors checkpoint with F32 tensors; no further description is provided.

Intended uses & limitations

More information needed
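
No intended uses are documented. Purely as a loading sketch (assuming the checkpoint exposes a standard sequence-classification head; how to map the output to an organization score is not specified in this card):

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = ("MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_"
            "FineTuningAraBERT_run2_AugV5_k17_task1_organization")
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Hypothetical Arabic essay text; the expected input format is undocumented.
inputs = tokenizer("نص المقال هنا", return_tensors="pt")
logits = model(**inputs).logits
print(logits)  # interpreting the score(s) is left undocumented by the card
```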

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
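
A minimal sketch of a Trainer setup reproducing these hyperparameters. The regression head (num_labels=1) and the omitted datasets are assumptions; neither is documented in this card:

```python
from transformers import (AutoModelForSequenceClassification, Trainer,
                          TrainingArguments)

# num_labels=1 is assumed because MSE/RMSE are the reported metrics.
model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02", num_labels=1)

args = TrainingArguments(
    output_dir="out",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are already the defaults.
)

# train_dataset/eval_dataset are omitted because the data is undocumented;
# supply them before calling trainer.train().
trainer = Trainer(model=model, args=args)
```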

Training results

Validation metrics were logged every 2 steps. The training loss column reads "No log" until step 500 because that is the Trainer's default logging interval. Although num_epochs was set to 100, the log ends at epoch 6.375 (step 510), and the evaluation results reported above match that final row.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.025 2 6.8946 0.0116 6.8946 2.6258
No log 0.05 4 4.4969 0.0591 4.4969 2.1206
No log 0.075 6 3.1263 0.0833 3.1263 1.7681
No log 0.1 8 2.4853 0.0526 2.4853 1.5765
No log 0.125 10 2.2251 0.0141 2.2251 1.4917
No log 0.15 12 1.9939 0.1138 1.9939 1.4121
No log 0.175 14 1.8957 0.0702 1.8957 1.3769
No log 0.2 16 1.7114 0.1143 1.7114 1.3082
No log 0.225 18 1.7230 0.0917 1.7230 1.3126
No log 0.25 20 2.2349 0.1667 2.2349 1.4950
No log 0.275 22 2.1058 0.3088 2.1058 1.4511
No log 0.3 24 2.0060 0.2595 2.0060 1.4163
No log 0.325 26 2.1552 0.1408 2.1552 1.4681
No log 0.35 28 2.0855 0.1702 2.0855 1.4441
No log 0.375 30 1.9111 0.2812 1.9111 1.3824
No log 0.4 32 1.7574 0.2069 1.7574 1.3257
No log 0.425 34 1.8407 0.1636 1.8407 1.3567
No log 0.45 36 1.9746 0.1709 1.9746 1.4052
No log 0.475 38 1.9800 0.1849 1.9800 1.4071
No log 0.5 40 1.7305 0.2241 1.7305 1.3155
No log 0.525 42 1.4620 0.2909 1.4620 1.2091
No log 0.55 44 1.4670 0.3063 1.4670 1.2112
No log 0.575 46 1.6646 0.2034 1.6646 1.2902
No log 0.6 48 1.7107 0.3360 1.7107 1.3079
No log 0.625 50 1.6601 0.3465 1.6601 1.2885
No log 0.65 52 1.5052 0.4603 1.5052 1.2268
No log 0.675 54 1.3801 0.4167 1.3801 1.1748
No log 0.7 56 1.3839 0.4098 1.3839 1.1764
No log 0.725 58 1.3361 0.3826 1.3361 1.1559
No log 0.75 60 1.5769 0.2727 1.5769 1.2557
No log 0.775 62 1.6200 0.2545 1.6200 1.2728
No log 0.8 64 1.3562 0.3273 1.3562 1.1646
No log 0.825 66 1.2913 0.3860 1.2913 1.1364
No log 0.85 68 1.2721 0.4522 1.2721 1.1279
No log 0.875 70 1.2615 0.3571 1.2615 1.1231
No log 0.9 72 1.2761 0.3423 1.2761 1.1297
No log 0.925 74 1.2997 0.3273 1.2997 1.1401
No log 0.95 76 1.1775 0.4211 1.1775 1.0851
No log 0.975 78 1.1758 0.5210 1.1758 1.0843
No log 1.0 80 1.1852 0.5000 1.1852 1.0887
No log 1.025 82 1.1444 0.4561 1.1444 1.0698
No log 1.05 84 1.3019 0.3393 1.3019 1.1410
No log 1.075 86 1.6628 0.2500 1.6628 1.2895
No log 1.1 88 1.6909 0.2832 1.6909 1.3003
No log 1.125 90 1.4031 0.3393 1.4031 1.1845
No log 1.15 92 1.3468 0.4035 1.3468 1.1605
No log 1.175 94 1.4256 0.3894 1.4256 1.1940
No log 1.2 96 1.8347 0.1754 1.8347 1.3545
No log 1.225 98 1.6904 0.3036 1.6904 1.3001
No log 1.25 100 1.2670 0.4348 1.2670 1.1256
No log 1.275 102 1.1140 0.4957 1.1140 1.0554
No log 1.3 104 1.3129 0.5246 1.3129 1.1458
No log 1.325 106 1.2838 0.5124 1.2838 1.1330
No log 1.35 108 1.0595 0.4348 1.0595 1.0293
No log 1.375 110 1.4117 0.4062 1.4117 1.1881
No log 1.4 112 1.8233 0.1290 1.8233 1.3503
No log 1.425 114 1.8407 0.1197 1.8407 1.3567
No log 1.45 116 1.4490 0.3360 1.4490 1.2037
No log 1.475 118 1.1065 0.5484 1.1065 1.0519
No log 1.5 120 0.9817 0.5968 0.9817 0.9908
No log 1.525 122 1.0322 0.5366 1.0322 1.0160
No log 1.55 124 1.0789 0.5124 1.0789 1.0387
No log 1.575 126 1.1578 0.4538 1.1578 1.0760
No log 1.6 128 1.2341 0.4426 1.2341 1.1109
No log 1.625 130 1.3138 0.4409 1.3138 1.1462
No log 1.65 132 1.3423 0.4545 1.3423 1.1586
No log 1.675 134 1.2634 0.4426 1.2634 1.1240
No log 1.7 136 1.2442 0.4310 1.2442 1.1154
No log 1.725 138 1.2415 0.4874 1.2415 1.1142
No log 1.75 140 1.2365 0.4667 1.2365 1.1120
No log 1.775 142 1.3829 0.4848 1.3829 1.1759
No log 1.8 144 1.6137 0.3597 1.6137 1.2703
No log 1.825 146 1.6555 0.3286 1.6555 1.2867
No log 1.85 148 1.5273 0.3852 1.5273 1.2358
No log 1.875 150 1.3507 0.4355 1.3507 1.1622
No log 1.9 152 1.2613 0.4202 1.2613 1.1231
No log 1.925 154 1.2658 0.4034 1.2658 1.1251
No log 1.95 156 1.3653 0.4677 1.3653 1.1684
No log 1.975 158 1.3493 0.4882 1.3493 1.1616
No log 2.0 160 1.2738 0.4806 1.2738 1.1286
No log 2.025 162 1.4033 0.4148 1.4033 1.1846
No log 2.05 164 1.6157 0.3857 1.6157 1.2711
No log 2.075 166 1.4870 0.4058 1.4870 1.2194
No log 2.1 168 1.2101 0.5116 1.2101 1.1000
No log 2.125 170 1.1480 0.5354 1.1480 1.0715
No log 2.15 172 1.2062 0.5426 1.2062 1.0983
No log 2.175 174 1.2991 0.4806 1.2991 1.1398
No log 2.2 176 1.2999 0.4844 1.2999 1.1401
No log 2.225 178 1.2271 0.5039 1.2271 1.1077
No log 2.25 180 1.1872 0.3860 1.1872 1.0896
No log 2.275 182 1.1974 0.4202 1.1974 1.0942
No log 2.3 184 1.2330 0.5039 1.2330 1.1104
No log 2.325 186 1.3067 0.4615 1.3067 1.1431
No log 2.35 188 1.5785 0.3714 1.5785 1.2564
No log 2.375 190 1.7655 0.3380 1.7655 1.3287
No log 2.4 192 1.6283 0.3597 1.6283 1.2760
No log 2.425 194 1.3955 0.3852 1.3955 1.1813
No log 2.45 196 1.2357 0.4603 1.2357 1.1116
No log 2.475 198 1.2323 0.4603 1.2323 1.1101
No log 2.5 200 1.4106 0.4296 1.4106 1.1877
No log 2.525 202 1.7394 0.3497 1.7394 1.3189
No log 2.55 204 1.6529 0.3521 1.6529 1.2856
No log 2.575 206 1.3577 0.4361 1.3577 1.1652
No log 2.6 208 1.0786 0.5484 1.0786 1.0385
No log 2.625 210 1.0470 0.5938 1.0470 1.0232
No log 2.65 212 1.0547 0.6308 1.0547 1.0270
No log 2.675 214 1.2644 0.4394 1.2644 1.1244
No log 2.7 216 1.4849 0.3504 1.4849 1.2185
No log 2.725 218 1.5542 0.3504 1.5542 1.2467
No log 2.75 220 1.4352 0.4615 1.4352 1.1980
No log 2.775 222 1.3328 0.4754 1.3328 1.1545
No log 2.8 224 1.2484 0.3860 1.2484 1.1173
No log 2.825 226 1.2583 0.3214 1.2583 1.1218
No log 2.85 228 1.2893 0.4483 1.2893 1.1355
No log 2.875 230 1.4055 0.4640 1.4055 1.1855
No log 2.9 232 1.5630 0.3676 1.5630 1.2502
No log 2.925 234 1.6011 0.3309 1.6011 1.2653
No log 2.95 236 1.4764 0.4030 1.4764 1.2151
No log 2.975 238 1.3155 0.5366 1.3155 1.1470
No log 3.0 240 1.2459 0.5366 1.2459 1.1162
No log 3.025 242 1.3064 0.5366 1.3064 1.1430
No log 3.05 244 1.4749 0.4308 1.4749 1.2145
No log 3.075 246 1.5650 0.3852 1.5650 1.2510
No log 3.1 248 1.4984 0.4060 1.4984 1.2241
No log 3.125 250 1.3399 0.5120 1.3399 1.1576
No log 3.15 252 1.2543 0.5203 1.2543 1.1200
No log 3.175 254 1.2900 0.4921 1.2900 1.1358
No log 3.2 256 1.4427 0.4462 1.4427 1.2011
No log 3.225 258 1.5756 0.3824 1.5756 1.2552
No log 3.25 260 1.6242 0.3824 1.6242 1.2744
No log 3.275 262 1.5372 0.3609 1.5372 1.2399
No log 3.3 264 1.3995 0.4462 1.3995 1.1830
No log 3.325 266 1.4174 0.4122 1.4174 1.1906
No log 3.35 268 1.6244 0.3453 1.6244 1.2745
No log 3.375 270 1.8935 0.2817 1.8935 1.3760
No log 3.4 272 1.9807 0.2639 1.9807 1.4074
No log 3.425 274 1.8010 0.3121 1.8010 1.3420
No log 3.45 276 1.4647 0.3582 1.4647 1.2102
No log 3.475 278 1.2699 0.5156 1.2699 1.1269
No log 3.5 280 1.2353 0.5528 1.2353 1.1115
No log 3.525 282 1.2734 0.5041 1.2734 1.1284
No log 3.55 284 1.3894 0.4960 1.3894 1.1787
No log 3.575 286 1.5241 0.4211 1.5241 1.2346
No log 3.6 288 1.6011 0.3676 1.6011 1.2653
No log 3.625 290 1.5420 0.4000 1.5420 1.2418
No log 3.65 292 1.4518 0.4060 1.4518 1.2049
No log 3.675 294 1.2592 0.4769 1.2592 1.1221
No log 3.7 296 1.1363 0.5625 1.1363 1.0660
No log 3.725 298 1.1463 0.5625 1.1463 1.0707
No log 3.75 300 1.3052 0.4651 1.3052 1.1425
No log 3.775 302 1.4654 0.4328 1.4654 1.2105
No log 3.8 304 1.4796 0.4361 1.4796 1.2164
No log 3.825 306 1.3658 0.4651 1.3658 1.1687
No log 3.85 308 1.3051 0.4286 1.3051 1.1424
No log 3.875 310 1.3236 0.4531 1.3236 1.1505
No log 3.9 312 1.4594 0.4545 1.4594 1.2080
No log 3.925 314 1.6651 0.3623 1.6651 1.2904
No log 3.95 316 1.6988 0.3286 1.6988 1.3034
No log 3.975 318 1.5471 0.3704 1.5471 1.2438
No log 4.0 320 1.3765 0.4615 1.3765 1.1732
No log 4.025 322 1.3536 0.4651 1.3536 1.1634
No log 4.05 324 1.3368 0.4651 1.3368 1.1562
No log 4.075 326 1.3318 0.4320 1.3318 1.1540
No log 4.1 328 1.3187 0.4918 1.3187 1.1483
No log 4.125 330 1.3531 0.4918 1.3531 1.1632
No log 4.15 332 1.4439 0.4409 1.4439 1.2016
No log 4.175 334 1.4988 0.4769 1.4988 1.2243
No log 4.2 336 1.4228 0.4651 1.4228 1.1928
No log 4.225 338 1.2937 0.4444 1.2937 1.1374
No log 4.25 340 1.2749 0.4531 1.2749 1.1291
No log 4.275 342 1.2651 0.4651 1.2651 1.1247
No log 4.3 344 1.3030 0.4615 1.3030 1.1415
No log 4.325 346 1.3268 0.4427 1.3268 1.1518
No log 4.35 348 1.4751 0.4380 1.4751 1.2145
No log 4.375 350 1.4571 0.4255 1.4571 1.2071
No log 4.4 352 1.3554 0.4060 1.3554 1.1642
No log 4.425 354 1.1977 0.4769 1.1977 1.0944
No log 4.45 356 1.1889 0.5038 1.1889 1.0903
No log 4.475 358 1.2974 0.4662 1.2974 1.1390
No log 4.5 360 1.4203 0.4118 1.4203 1.1918
No log 4.525 362 1.6167 0.3521 1.6167 1.2715
No log 4.55 364 1.6603 0.3521 1.6603 1.2885
No log 4.575 366 1.6309 0.3478 1.6309 1.2771
No log 4.6 368 1.5431 0.3824 1.5431 1.2422
No log 4.625 370 1.4981 0.3759 1.4981 1.2240
No log 4.65 372 1.3730 0.4651 1.3730 1.1717
No log 4.675 374 1.2168 0.5238 1.2168 1.1031
No log 4.7 376 1.2083 0.4960 1.2083 1.0992
No log 4.725 378 1.3364 0.4615 1.3364 1.1560
No log 4.75 380 1.5423 0.4148 1.5423 1.2419
No log 4.775 382 1.6300 0.3796 1.6300 1.2767
No log 4.8 384 1.5222 0.4478 1.5222 1.2338
No log 4.825 386 1.3186 0.4651 1.3186 1.1483
No log 4.85 388 1.1377 0.5161 1.1377 1.0666
No log 4.875 390 1.0646 0.5280 1.0646 1.0318
No log 4.9 392 1.1237 0.4844 1.1237 1.0600
No log 4.925 394 1.3544 0.4511 1.3543 1.1638
No log 4.95 396 1.5425 0.3824 1.5425 1.2420
No log 4.975 398 1.4347 0.4478 1.4347 1.1978
No log 5.0 400 1.2707 0.4697 1.2707 1.1272
No log 5.025 402 1.1791 0.5116 1.1791 1.0859
No log 5.05 404 1.0859 0.5161 1.0859 1.0421
No log 5.075 406 1.0961 0.528 1.0961 1.0469
No log 5.1 408 1.1958 0.4697 1.1958 1.0935
No log 5.125 410 1.4718 0.3971 1.4718 1.2132
No log 5.15 412 1.8354 0.2857 1.8354 1.3548
No log 5.175 414 1.9227 0.2378 1.9227 1.3866
No log 5.2 416 1.7947 0.2837 1.7947 1.3397
No log 5.225 418 1.4317 0.4478 1.4317 1.1966
No log 5.25 420 1.1171 0.4800 1.1171 1.0569
No log 5.275 422 1.0773 0.4959 1.0773 1.0379
No log 5.3 424 1.1246 0.4918 1.1246 1.0605
No log 5.325 426 1.2554 0.4882 1.2554 1.1205
No log 5.35 428 1.4243 0.4545 1.4243 1.1934
No log 5.375 430 1.4085 0.4545 1.4085 1.1868
No log 5.4 432 1.2768 0.4806 1.2768 1.1300
No log 5.425 434 1.2005 0.5120 1.2005 1.0957
No log 5.45 436 1.2507 0.5039 1.2507 1.1184
No log 5.475 438 1.3992 0.4462 1.3992 1.1829
No log 5.5 440 1.5459 0.4179 1.5459 1.2434
No log 5.525 442 1.7602 0.3022 1.7602 1.3267
No log 5.55 444 1.8992 0.2553 1.8992 1.3781
No log 5.575 446 1.8395 0.2857 1.8395 1.3563
No log 5.6 448 1.6809 0.3066 1.6809 1.2965
No log 5.625 450 1.6171 0.3504 1.6171 1.2716
No log 5.65 452 1.6833 0.3066 1.6833 1.2974
No log 5.675 454 1.5986 0.3731 1.5986 1.2644
No log 5.7 456 1.3916 0.4154 1.3916 1.1797
No log 5.725 458 1.2947 0.4882 1.2947 1.1378
No log 5.75 460 1.2272 0.4878 1.2272 1.1078
No log 5.775 462 1.2329 0.5203 1.2329 1.1103
No log 5.8 464 1.3008 0.4677 1.3008 1.1405
No log 5.825 466 1.4504 0.4154 1.4504 1.2043
No log 5.85 468 1.6627 0.3433 1.6627 1.2895
No log 5.875 470 1.7431 0.3066 1.7431 1.3203
No log 5.9 472 1.7117 0.3066 1.7117 1.3083
No log 5.925 474 1.5806 0.3609 1.5806 1.2572
No log 5.95 476 1.4433 0.3910 1.4433 1.2014
No log 5.975 478 1.3391 0.4375 1.3391 1.1572
No log 6.0 480 1.3259 0.4186 1.3259 1.1515
No log 6.025 482 1.3445 0.4000 1.3445 1.1595
No log 6.05 484 1.4461 0.3910 1.4461 1.2025
No log 6.075 486 1.5468 0.3609 1.5468 1.2437
No log 6.1 488 1.5084 0.3511 1.5084 1.2282
No log 6.125 490 1.3753 0.4762 1.3753 1.1727
No log 6.15 492 1.3631 0.4960 1.3631 1.1675
No log 6.175 494 1.3674 0.4960 1.3674 1.1694
No log 6.2 496 1.3920 0.4960 1.3920 1.1798
No log 6.225 498 1.4096 0.4762 1.4096 1.1873
0.4085 6.25 500 1.4311 0.4724 1.4311 1.1963
0.4085 6.275 502 1.4991 0.4091 1.4991 1.2244
0.4085 6.3 504 1.6641 0.3134 1.6641 1.2900
0.4085 6.325 506 1.7147 0.2609 1.7147 1.3095
0.4085 6.35 508 1.6181 0.3582 1.6181 1.2720
0.4085 6.375 510 1.4761 0.4275 1.4761 1.2149
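
The run stops well short of the configured 100 epochs, which suggests early stopping was in play (the card does not confirm this). For reference, a typical transformers wiring looks like the following sketch; every parameter here is an assumption:

```python
from transformers import EarlyStoppingCallback

# Hypothetical: stop once the monitored metric fails to improve for 5 evaluations.
# This also requires TrainingArguments(load_best_model_at_end=True,
# metric_for_best_model=..., eval_strategy="steps"), none of which is documented.
callbacks = [EarlyStoppingCallback(early_stopping_patience=5)]
```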

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
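
A quick sanity check that a local environment matches these versions (a sketch; the exact CUDA build of PyTorch may legitimately differ):

```python
import datasets
import tokenizers
import torch
import transformers

# Expected: 4.44.2 / 2.4.0+cu118 / 2.21.0 / 0.19.1, as listed above.
for mod in (transformers, torch, datasets, tokenizers):
    print(mod.__name__, mod.__version__)
```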