ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k4_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.4462
  • Qwk: 0.0401
  • Mse: 1.4462
  • Rmse: 1.2026
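
The metric names above follow the usual conventions for essay-scoring models: Qwk is Cohen's kappa with quadratic weights, and Rmse is the square root of the Mse. As a minimal sketch (assuming scikit-learn and that scores are rounded to integer labels before computing kappa, which this card does not state), the reported values could be reproduced from predictions and gold labels like this:

```python
# Hedged sketch of the evaluation metrics reported above (Qwk, Mse, Rmse).
# The rounding of predictions to integer scores is an assumption.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(predictions: np.ndarray, labels: np.ndarray) -> dict:
    # Quadratic weighted kappa on integer-valued scores.
    qwk = cohen_kappa_score(
        np.rint(predictions).astype(int),
        np.rint(labels).astype(int),
        weights="quadratic",
    )
    mse = mean_squared_error(labels, predictions)
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```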

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
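
For reference, a sketch of an equivalent Hugging Face TrainingArguments configuration for the hyperparameters listed above; the output_dir and any argument not listed in this card are assumptions, not taken from the training run:

```python
# Approximate TrainingArguments matching the hyperparameters above
# (argument names follow Transformers 4.44, the version listed below).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert-task5-organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    # Adam betas and epsilon as reported in this card.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```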

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.2 2 4.0498 -0.0019 4.0498 2.0124
No log 0.4 4 2.4761 -0.0040 2.4761 1.5736
No log 0.6 6 1.5345 0.0294 1.5345 1.2388
No log 0.8 8 1.4773 0.0232 1.4773 1.2154
No log 1.0 10 1.2927 0.0642 1.2927 1.1370
No log 1.2 12 1.1116 0.1398 1.1116 1.0543
No log 1.4 14 1.0997 0.1398 1.0997 1.0487
No log 1.6 16 1.1149 0.0944 1.1149 1.0559
No log 1.8 18 1.2037 0.0312 1.2037 1.0971
No log 2.0 20 1.1541 0.1576 1.1541 1.0743
No log 2.2 22 1.1068 0.1011 1.1068 1.0520
No log 2.4 24 1.1367 0.1046 1.1367 1.0662
No log 2.6 26 1.1566 0.2161 1.1566 1.0754
No log 2.8 28 1.2817 0.0520 1.2817 1.1321
No log 3.0 30 1.2800 0.0520 1.2800 1.1314
No log 3.2 32 1.1833 0.1196 1.1833 1.0878
No log 3.4 34 1.2791 0.0987 1.2791 1.1310
No log 3.6 36 1.3068 0.1202 1.3068 1.1431
No log 3.8 38 1.1916 -0.0022 1.1916 1.0916
No log 4.0 40 1.2001 0.1989 1.2001 1.0955
No log 4.2 42 1.1969 0.1195 1.1969 1.0940
No log 4.4 44 1.5277 0.1168 1.5277 1.2360
No log 4.6 46 1.6736 0.0529 1.6736 1.2937
No log 4.8 48 1.7383 -0.1128 1.7383 1.3184
No log 5.0 50 1.6067 -0.1750 1.6067 1.2676
No log 5.2 52 1.3162 -0.0428 1.3162 1.1473
No log 5.4 54 1.1247 0.1137 1.1247 1.0605
No log 5.6 56 1.1473 0.1625 1.1473 1.0711
No log 5.8 58 1.3745 0.0841 1.3745 1.1724
No log 6.0 60 1.6298 -0.0449 1.6298 1.2766
No log 6.2 62 1.7150 -0.0219 1.7150 1.3096
No log 6.4 64 1.7049 0.0 1.7049 1.3057
No log 6.6 66 1.6382 0.1531 1.6382 1.2799
No log 6.8 68 1.6400 0.1166 1.6400 1.2806
No log 7.0 70 1.5793 0.0623 1.5793 1.2567
No log 7.2 72 1.4967 0.0513 1.4967 1.2234
No log 7.4 74 1.4552 0.1053 1.4552 1.2063
No log 7.6 76 1.3815 0.1814 1.3815 1.1754
No log 7.8 78 1.2365 0.1351 1.2365 1.1120
No log 8.0 80 1.2490 0.1351 1.2490 1.1176
No log 8.2 82 1.3663 0.1628 1.3663 1.1689
No log 8.4 84 1.5632 0.1703 1.5632 1.2503
No log 8.6 86 1.5422 0.2126 1.5422 1.2418
No log 8.8 88 1.5550 0.2424 1.5550 1.2470
No log 9.0 90 1.7373 0.1847 1.7373 1.3181
No log 9.2 92 1.7288 0.1141 1.7288 1.3148
No log 9.4 94 1.6154 0.1058 1.6154 1.2710
No log 9.6 96 1.5806 0.1058 1.5806 1.2572
No log 9.8 98 1.4349 0.1142 1.4349 1.1979
No log 10.0 100 1.3656 0.1142 1.3656 1.1686
No log 10.2 102 1.4528 0.1486 1.4528 1.2053
No log 10.4 104 1.6158 0.0946 1.6158 1.2711
No log 10.6 106 1.6305 0.0226 1.6305 1.2769
No log 10.8 108 1.6307 0.1142 1.6307 1.2770
No log 11.0 110 1.5920 0.1486 1.5920 1.2617
No log 11.2 112 1.6885 0.1601 1.6885 1.2994
No log 11.4 114 1.8334 0.2252 1.8334 1.3540
No log 11.6 116 1.8874 0.1559 1.8874 1.3738
No log 11.8 118 1.7784 0.2252 1.7784 1.3335
No log 12.0 120 1.6659 0.1667 1.6659 1.2907
No log 12.2 122 1.6168 0.0786 1.6168 1.2716
No log 12.4 124 1.6373 0.1114 1.6373 1.2796
No log 12.6 126 1.5642 0.0806 1.5642 1.2507
No log 12.8 128 1.3999 0.0931 1.3999 1.1832
No log 13.0 130 1.3652 0.0931 1.3652 1.1684
No log 13.2 132 1.4930 0.0806 1.4930 1.2219
No log 13.4 134 1.5379 0.0806 1.5379 1.2401
No log 13.6 136 1.5133 0.0806 1.5133 1.2302
No log 13.8 138 1.4793 0.1769 1.4793 1.2163
No log 14.0 140 1.5454 0.1498 1.5454 1.2431
No log 14.2 142 1.7736 0.1467 1.7736 1.3318
No log 14.4 144 1.9747 0.1313 1.9747 1.4053
No log 14.6 146 1.9080 0.1323 1.9080 1.3813
No log 14.8 148 1.6673 0.0911 1.6673 1.2912
No log 15.0 150 1.4134 0.0931 1.4134 1.1889
No log 15.2 152 1.3319 0.0833 1.3319 1.1541
No log 15.4 154 1.3475 0.0310 1.3475 1.1608
No log 15.6 156 1.4127 -0.0355 1.4127 1.1886
No log 15.8 158 1.4732 0.0263 1.4732 1.2138
No log 16.0 160 1.6277 0.1462 1.6277 1.2758
No log 16.2 162 1.7241 0.2006 1.7241 1.3130
No log 16.4 164 1.4981 0.2417 1.4981 1.2240
No log 16.6 166 1.4528 0.2690 1.4528 1.2053
No log 16.8 168 1.5998 0.2386 1.5998 1.2648
No log 17.0 170 1.9088 0.1835 1.9088 1.3816
No log 17.2 172 2.0002 0.0981 2.0002 1.4143
No log 17.4 174 1.8278 0.2053 1.8278 1.3520
No log 17.6 176 1.6832 0.2252 1.6832 1.2974
No log 17.8 178 1.6086 0.2566 1.6086 1.2683
No log 18.0 180 1.5513 0.2058 1.5513 1.2455
No log 18.2 182 1.4801 0.1573 1.4801 1.2166
No log 18.4 184 1.5061 0.1288 1.5061 1.2272
No log 18.6 186 1.4586 0.0970 1.4586 1.2077
No log 18.8 188 1.4722 0.0878 1.4722 1.2134
No log 19.0 190 1.4603 0.0510 1.4603 1.2084
No log 19.2 192 1.4311 0.0122 1.4311 1.1963
No log 19.4 194 1.4413 0.0122 1.4413 1.2005
No log 19.6 196 1.4655 0.0510 1.4655 1.2106
No log 19.8 198 1.4547 0.0510 1.4547 1.2061
No log 20.0 200 1.4935 0.0878 1.4935 1.2221
No log 20.2 202 1.5505 0.1462 1.5505 1.2452
No log 20.4 204 1.5726 0.1371 1.5726 1.2540
No log 20.6 206 1.5185 0.0896 1.5185 1.2323
No log 20.8 208 1.4661 0.0806 1.4661 1.2108
No log 21.0 210 1.4548 0.0896 1.4548 1.2062
No log 21.2 212 1.5658 0.1441 1.5658 1.2513
No log 21.4 214 1.7100 0.1902 1.7100 1.3077
No log 21.6 216 1.8691 0.2296 1.8691 1.3671
No log 21.8 218 1.7924 0.2406 1.7924 1.3388
No log 22.0 220 1.5665 0.1789 1.5665 1.2516
No log 22.2 222 1.4092 0.0510 1.4092 1.1871
No log 22.4 224 1.3652 0.0401 1.3652 1.1684
No log 22.6 226 1.5112 0.0401 1.5112 1.2293
No log 22.8 228 1.7181 0.1703 1.7181 1.3108
No log 23.0 230 1.8661 0.2448 1.8661 1.3661
No log 23.2 232 1.8641 0.2193 1.8641 1.3653
No log 23.4 234 1.7209 0.2568 1.7209 1.3118
No log 23.6 236 1.5960 0.0878 1.5960 1.2633
No log 23.8 238 1.5400 0.0510 1.5400 1.2410
No log 24.0 240 1.5538 0.0510 1.5538 1.2465
No log 24.2 242 1.5902 0.0878 1.5902 1.2610
No log 24.4 244 1.6390 0.1703 1.6390 1.2802
No log 24.6 246 1.6748 0.2062 1.6748 1.2941
No log 24.8 248 1.7073 0.2342 1.7073 1.3067
No log 25.0 250 1.7496 0.2292 1.7496 1.3227
No log 25.2 252 1.7912 0.2005 1.7912 1.3384
No log 25.4 254 1.7652 0.1789 1.7652 1.3286
No log 25.6 256 1.7670 0.1789 1.7670 1.3293
No log 25.8 258 1.7278 0.2270 1.7278 1.3145
No log 26.0 260 1.7009 0.1847 1.7009 1.3042
No log 26.2 262 1.7401 0.1752 1.7401 1.3191
No log 26.4 264 1.7010 0.1441 1.7010 1.3042
No log 26.6 266 1.6070 0.2004 1.6070 1.2677
No log 26.8 268 1.6370 0.2292 1.6370 1.2795
No log 27.0 270 1.6306 0.2424 1.6306 1.2769
No log 27.2 272 1.5440 0.0878 1.5440 1.2426
No log 27.4 274 1.4623 0.0781 1.4623 1.2093
No log 27.6 276 1.4805 0.0781 1.4805 1.2168
No log 27.8 278 1.4920 0.0781 1.4920 1.2215
No log 28.0 280 1.5798 0.0781 1.5798 1.2569
No log 28.2 282 1.6165 0.2065 1.6165 1.2714
No log 28.4 284 1.5882 0.0781 1.5882 1.2602
No log 28.6 286 1.5786 0.0781 1.5786 1.2564
No log 28.8 288 1.5830 0.0781 1.5830 1.2582
No log 29.0 290 1.5948 0.0781 1.5948 1.2629
No log 29.2 292 1.6194 0.0781 1.6194 1.2725
No log 29.4 294 1.7034 0.1222 1.7034 1.3051
No log 29.6 296 1.7279 0.0828 1.7279 1.3145
No log 29.8 298 1.6587 0.1058 1.6587 1.2879
No log 30.0 300 1.5497 0.0781 1.5497 1.2449
No log 30.2 302 1.4584 0.0401 1.4584 1.2076
No log 30.4 304 1.3899 0.0 1.3899 1.1789
No log 30.6 306 1.3631 0.0 1.3631 1.1675
No log 30.8 308 1.4253 0.0401 1.4253 1.1939
No log 31.0 310 1.4945 0.0781 1.4945 1.2225
No log 31.2 312 1.6107 0.2065 1.6107 1.2691
No log 31.4 314 1.6392 0.2474 1.6392 1.2803
No log 31.6 316 1.5508 0.1814 1.5508 1.2453
No log 31.8 318 1.4517 0.0781 1.4517 1.2049
No log 32.0 320 1.4042 0.0781 1.4042 1.1850
No log 32.2 322 1.4471 0.0781 1.4471 1.2030
No log 32.4 324 1.5416 0.2372 1.5416 1.2416
No log 32.6 326 1.6671 0.2522 1.6671 1.2912
No log 32.8 328 1.6915 0.3018 1.6915 1.3006
No log 33.0 330 1.5829 0.2126 1.5829 1.2581
No log 33.2 332 1.4537 0.0781 1.4537 1.2057
No log 33.4 334 1.4130 0.0781 1.4130 1.1887
No log 33.6 336 1.3698 0.0401 1.3698 1.1704
No log 33.8 338 1.3847 0.0781 1.3847 1.1768
No log 34.0 340 1.3631 0.0401 1.3631 1.1675
No log 34.2 342 1.3556 0.0833 1.3556 1.1643
No log 34.4 344 1.3696 0.0833 1.3696 1.1703
No log 34.6 346 1.3620 0.1202 1.3620 1.1670
No log 34.8 348 1.4089 0.1202 1.4089 1.1870
No log 35.0 350 1.5695 0.2126 1.5695 1.2528
No log 35.2 352 1.7908 0.2566 1.7908 1.3382
No log 35.4 354 1.9126 0.2494 1.9126 1.3830
No log 35.6 356 1.8971 0.2193 1.8971 1.3774
No log 35.8 358 1.7862 0.3086 1.7862 1.3365
No log 36.0 360 1.5879 0.2126 1.5879 1.2601
No log 36.2 362 1.4472 0.0878 1.4472 1.2030
No log 36.4 364 1.4007 0.0833 1.4007 1.1835
No log 36.6 366 1.3922 0.0401 1.3922 1.1799
No log 36.8 368 1.4093 0.0401 1.4093 1.1872
No log 37.0 370 1.4026 0.0781 1.4026 1.1843
No log 37.2 372 1.3237 0.1202 1.3236 1.1505
No log 37.4 374 1.2927 0.1202 1.2927 1.1370
No log 37.6 376 1.3537 0.1202 1.3537 1.1635
No log 37.8 378 1.4789 0.2424 1.4789 1.2161
No log 38.0 380 1.5911 0.2709 1.5911 1.2614
No log 38.2 382 1.6357 0.2424 1.6357 1.2789
No log 38.4 384 1.5717 0.1562 1.5717 1.2537
No log 38.6 386 1.4754 0.0878 1.4754 1.2147
No log 38.8 388 1.4037 0.0781 1.4037 1.1848
No log 39.0 390 1.3638 0.0 1.3638 1.1678
No log 39.2 392 1.3816 0.0401 1.3816 1.1754
No log 39.4 394 1.4570 0.0878 1.4570 1.2071
No log 39.6 396 1.5480 0.2424 1.5480 1.2442
No log 39.8 398 1.6550 0.2832 1.6550 1.2865
No log 40.0 400 1.7715 0.3123 1.7715 1.3310
No log 40.2 402 1.7897 0.3149 1.7897 1.3378
No log 40.4 404 1.7411 0.2940 1.7411 1.3195
No log 40.6 406 1.6016 0.2709 1.6016 1.2655
No log 40.8 408 1.4668 0.0878 1.4668 1.2111
No log 41.0 410 1.4198 0.0878 1.4198 1.1916
No log 41.2 412 1.3981 0.0401 1.3981 1.1824
No log 41.4 414 1.3964 0.0401 1.3964 1.1817
No log 41.6 416 1.4410 0.0781 1.4410 1.2004
No log 41.8 418 1.4899 0.1228 1.4899 1.2206
No log 42.0 420 1.5067 0.1486 1.5067 1.2275
No log 42.2 422 1.5787 0.2424 1.5787 1.2565
No log 42.4 424 1.5990 0.2709 1.5990 1.2645
No log 42.6 426 1.5759 0.1814 1.5759 1.2554
No log 42.8 428 1.5479 0.1486 1.5479 1.2442
No log 43.0 430 1.4957 0.0781 1.4957 1.2230
No log 43.2 432 1.4805 0.1486 1.4805 1.2168
No log 43.4 434 1.3960 0.0781 1.3960 1.1815
No log 43.6 436 1.3209 0.0833 1.3209 1.1493
No log 43.8 438 1.3247 0.1202 1.3247 1.1510
No log 44.0 440 1.3443 0.0781 1.3443 1.1594
No log 44.2 442 1.4135 0.1486 1.4135 1.1889
No log 44.4 444 1.5310 0.1814 1.5310 1.2373
No log 44.6 446 1.6573 0.2170 1.6573 1.2874
No log 44.8 448 1.6620 0.1769 1.6620 1.2892
No log 45.0 450 1.5740 0.1562 1.5740 1.2546
No log 45.2 452 1.4633 0.0781 1.4633 1.2097
No log 45.4 454 1.3619 0.0 1.3619 1.1670
No log 45.6 456 1.3283 0.0 1.3283 1.1525
No log 45.8 458 1.3419 0.0 1.3419 1.1584
No log 46.0 460 1.3921 0.0 1.3921 1.1799
No log 46.2 462 1.4439 0.0781 1.4439 1.2016
No log 46.4 464 1.5172 0.0781 1.5172 1.2318
No log 46.6 466 1.5544 0.1228 1.5544 1.2468
No log 46.8 468 1.5812 0.1562 1.5812 1.2575
No log 47.0 470 1.6196 0.2424 1.6196 1.2726
No log 47.2 472 1.5925 0.2709 1.5925 1.2619
No log 47.4 474 1.5611 0.2424 1.5611 1.2494
No log 47.6 476 1.5309 0.2709 1.5309 1.2373
No log 47.8 478 1.4882 0.1880 1.4882 1.2199
No log 48.0 480 1.4832 0.1814 1.4832 1.2179
No log 48.2 482 1.5299 0.1814 1.5299 1.2369
No log 48.4 484 1.5790 0.2424 1.5790 1.2566
No log 48.6 486 1.6311 0.2709 1.6311 1.2771
No log 48.8 488 1.6146 0.2424 1.6146 1.2707
No log 49.0 490 1.5326 0.1142 1.5326 1.2380
No log 49.2 492 1.4981 0.0781 1.4981 1.2240
No log 49.4 494 1.4723 0.1202 1.4723 1.2134
No log 49.6 496 1.4679 0.1552 1.4679 1.2116
No log 49.8 498 1.4768 0.1886 1.4768 1.2153
0.2147 50.0 500 1.5082 0.1880 1.5082 1.2281
0.2147 50.2 502 1.5042 0.1486 1.5042 1.2264
0.2147 50.4 504 1.4614 0.1142 1.4614 1.2089
0.2147 50.6 506 1.4141 0.0 1.4141 1.1891
0.2147 50.8 508 1.4143 0.0 1.4143 1.1892
0.2147 51.0 510 1.4462 0.0401 1.4462 1.2026

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
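
As a hedged usage sketch, the checkpoint can be loaded with the library versions listed above; whether the task head is a regressor (num_labels=1) or an ordinal classifier is not stated in this card and is assumed here:

```python
# Loading the fine-tuned checkpoint for inference (illustrative only).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k4_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Example Arabic essay text; the model scores the "organization" trait.
inputs = tokenizer("نص مقال تجريبي للتقييم", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits  # a single score if the head is a regressor
print(logits)
```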