MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k6_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.3763
  • Qwk: 0.1697
  • Mse: 1.3763
  • Rmse: 1.1732
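
Note that Loss equals Mse and Rmse is its square root (√1.3763 ≈ 1.1732), which suggests the scoring head is trained as a regressor with an MSE objective and then evaluated with quadratic weighted kappa (QWK) on the discretized predictions. A minimal plain-Python sketch of these three metrics (the helper names and toy labels below are illustrative, not taken from the training code):

```python
import math

def confusion(y_true, y_pred, n_classes):
    """Confusion matrix counts: rows = true label, cols = predicted label."""
    m = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        m[t][p] += 1
    return m

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (QWK), standard for ordinal essay scores."""
    n = len(y_true)
    obs = confusion(y_true, y_pred, n_classes)
    hist_true = [sum(row) for row in obs]
    hist_pred = [sum(col) for col in zip(*obs)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2      # quadratic disagreement weight
            num += w * obs[i][j]                          # observed disagreement
            den += w * hist_true[i] * hist_pred[j] / n    # chance-expected disagreement
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Toy 4-class example (illustrative labels, not the real evaluation data)
y_true = [0, 1, 2, 3, 2, 1]
y_pred = [0, 2, 2, 3, 1, 1]
print("QWK :", quadratic_weighted_kappa(y_true, y_pred, 4))
print("MSE :", mse(y_true, y_pred))
print("RMSE:", math.sqrt(mse(y_true, y_pred)))  # Rmse is just the square root of Mse
```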

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
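
With lr_scheduler_type: linear and no warmup reported, the learning rate decays linearly from 2e-05 toward zero over the scheduled training steps. A rough sketch of that schedule (the function name is made up, and the 1500-step horizon is an assumption: 100 epochs at the roughly 15 optimizer steps per epoch implied by the results table; the exact Hugging Face scheduler may differ in off-by-one details):

```python
def linear_lr(step, base_lr=2e-05, total_steps=1500, warmup_steps=0):
    """Linear warmup (optional) followed by linear decay to zero,
    approximating the `linear` lr_scheduler_type."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# Start, midpoint, and end of training
print(linear_lr(0), linear_lr(750), linear_lr(1500))
```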

Training results

Note: "No log" in the Training Loss column means the running training loss had not yet been reported (it is logged every 500 steps; the first value, 0.217, appears at step 500).

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1333 2 4.0648 0.0070 4.0648 2.0161
No log 0.2667 4 2.1943 0.1047 2.1943 1.4813
No log 0.4 6 1.4606 0.0143 1.4606 1.2086
No log 0.5333 8 1.2335 -0.0032 1.2335 1.1106
No log 0.6667 10 1.1203 0.1304 1.1203 1.0584
No log 0.8 12 1.1447 0.1944 1.1447 1.0699
No log 0.9333 14 1.1465 0.0761 1.1465 1.0707
No log 1.0667 16 1.1281 0.1062 1.1281 1.0621
No log 1.2 18 1.1256 0.2145 1.1256 1.0610
No log 1.3333 20 1.0531 0.2015 1.0531 1.0262
No log 1.4667 22 1.1027 0.1292 1.1027 1.0501
No log 1.6 24 1.0525 0.1629 1.0525 1.0259
No log 1.7333 26 1.0091 0.1908 1.0091 1.0046
No log 1.8667 28 1.2192 0.0950 1.2192 1.1042
No log 2.0 30 1.5284 -0.0305 1.5284 1.2363
No log 2.1333 32 1.4065 0.0102 1.4065 1.1860
No log 2.2667 34 1.0920 0.0917 1.0920 1.0450
No log 2.4 36 1.2433 0.1199 1.2433 1.1150
No log 2.5333 38 1.6740 -0.0939 1.6740 1.2938
No log 2.6667 40 1.7687 -0.1658 1.7687 1.3299
No log 2.8 42 1.5873 -0.0749 1.5873 1.2599
No log 2.9333 44 1.5556 -0.1285 1.5556 1.2472
No log 3.0667 46 1.3672 0.1702 1.3672 1.1693
No log 3.2 48 1.2001 0.2632 1.2001 1.0955
No log 3.3333 50 1.3977 0.0818 1.3977 1.1823
No log 3.4667 52 1.5670 -0.0756 1.5670 1.2518
No log 3.6 54 1.4665 0.1452 1.4665 1.2110
No log 3.7333 56 1.2472 0.2773 1.2472 1.1168
No log 3.8667 58 1.1001 0.1170 1.1001 1.0489
No log 4.0 60 1.0654 0.0886 1.0654 1.0322
No log 4.1333 62 1.1325 0.1233 1.1325 1.0642
No log 4.2667 64 1.3356 0.2728 1.3356 1.1557
No log 4.4 66 1.5044 0.2313 1.5044 1.2265
No log 4.5333 68 1.6028 0.1935 1.6028 1.2660
No log 4.6667 70 1.7481 0.0275 1.7481 1.3222
No log 4.8 72 1.6289 0.1242 1.6289 1.2763
No log 4.9333 74 1.6525 0.0876 1.6525 1.2855
No log 5.0667 76 1.8225 0.0359 1.8225 1.3500
No log 5.2 78 1.7749 0.0542 1.7749 1.3323
No log 5.3333 80 1.4799 0.2194 1.4799 1.2165
No log 5.4667 82 1.4947 0.2311 1.4947 1.2226
No log 5.6 84 1.8309 -0.0145 1.8309 1.3531
No log 5.7333 86 1.8321 -0.0653 1.8321 1.3535
No log 5.8667 88 1.4184 0.1054 1.4184 1.1910
No log 6.0 90 1.1325 0.2864 1.1325 1.0642
No log 6.1333 92 1.0947 0.2512 1.0947 1.0463
No log 6.2667 94 1.2938 0.1649 1.2938 1.1374
No log 6.4 96 1.5198 -0.0052 1.5198 1.2328
No log 6.5333 98 1.4648 -0.0171 1.4648 1.2103
No log 6.6667 100 1.2772 0.1649 1.2772 1.1302
No log 6.8 102 1.1927 0.2636 1.1927 1.0921
No log 6.9333 104 1.2843 0.1868 1.2843 1.1333
No log 7.0667 106 1.3793 0.0870 1.3793 1.1744
No log 7.2 108 1.4438 0.0483 1.4438 1.2016
No log 7.3333 110 1.2752 0.1986 1.2752 1.1293
No log 7.4667 112 1.1645 0.2105 1.1645 1.0791
No log 7.6 114 1.0062 0.2512 1.0062 1.0031
No log 7.7333 116 0.9886 0.2773 0.9886 0.9943
No log 7.8667 118 1.1602 0.2105 1.1602 1.0771
No log 8.0 120 1.4090 0.1026 1.4090 1.1870
No log 8.1333 122 1.6474 -0.0137 1.6474 1.2835
No log 8.2667 124 1.5955 -0.0313 1.5955 1.2631
No log 8.4 126 1.3919 0.1858 1.3919 1.1798
No log 8.5333 128 1.3567 0.2438 1.3567 1.1648
No log 8.6667 130 1.5119 0.2091 1.5119 1.2296
No log 8.8 132 1.5289 0.1971 1.5289 1.2365
No log 8.9333 134 1.3237 0.2528 1.3237 1.1505
No log 9.0667 136 1.1219 0.2167 1.1219 1.0592
No log 9.2 138 1.1877 0.2528 1.1877 1.0898
No log 9.3333 140 1.4345 0.1854 1.4345 1.1977
No log 9.4667 142 1.5098 0.1663 1.5098 1.2288
No log 9.6 144 1.4956 0.1354 1.4956 1.2229
No log 9.7333 146 1.2952 0.2105 1.2952 1.1381
No log 9.8667 148 1.1930 0.1986 1.1930 1.0923
No log 10.0 150 1.1884 0.2105 1.1884 1.0901
No log 10.1333 152 1.2602 0.2105 1.2602 1.1226
No log 10.2667 154 1.2742 0.2424 1.2742 1.1288
No log 10.4 156 1.3051 0.2857 1.3051 1.1424
No log 10.5333 158 1.3209 0.2857 1.3209 1.1493
No log 10.6667 160 1.3848 0.2730 1.3848 1.1768
No log 10.8 162 1.3725 0.2640 1.3725 1.1715
No log 10.9333 164 1.3730 0.2341 1.3730 1.1717
No log 11.0667 166 1.4788 0.1058 1.4788 1.2161
No log 11.2 168 1.3882 0.1370 1.3882 1.1782
No log 11.3333 170 1.3640 0.1961 1.3640 1.1679
No log 11.4667 172 1.3961 0.1512 1.3961 1.1816
No log 11.6 174 1.5546 0.0694 1.5546 1.2468
No log 11.7333 176 1.6420 0.1428 1.6420 1.2814
No log 11.8667 178 1.6865 0.1428 1.6865 1.2986
No log 12.0 180 1.5908 0.1222 1.5908 1.2613
No log 12.1333 182 1.4351 0.1886 1.4351 1.1980
No log 12.2667 184 1.3314 0.1202 1.3314 1.1539
No log 12.4 186 1.3558 0.1052 1.3558 1.1644
No log 12.5333 188 1.4045 0.2709 1.4045 1.1851
No log 12.6667 190 1.4973 0.2522 1.4973 1.2236
No log 12.8 192 1.3463 0.2239 1.3463 1.1603
No log 12.9333 194 1.2102 0.2495 1.2102 1.1001
No log 13.0667 196 1.0868 0.3126 1.0868 1.0425
No log 13.2 198 1.2547 0.2851 1.2547 1.1201
No log 13.3333 200 1.4925 0.2525 1.4925 1.2217
No log 13.4667 202 1.4470 0.1769 1.4470 1.2029
No log 13.6 204 1.3304 0.2203 1.3304 1.1534
No log 13.7333 206 1.2351 0.1622 1.2351 1.1114
No log 13.8667 208 1.2858 0.1622 1.2858 1.1339
No log 14.0 210 1.4297 0.1816 1.4297 1.1957
No log 14.1333 212 1.5151 0.2542 1.5151 1.2309
No log 14.2667 214 1.4593 0.2542 1.4593 1.2080
No log 14.4 216 1.4779 0.2542 1.4779 1.2157
No log 14.5333 218 1.5134 0.2730 1.5134 1.2302
No log 14.6667 220 1.3961 0.2730 1.3961 1.1815
No log 14.8 222 1.1793 0.1842 1.1793 1.0860
No log 14.9333 224 1.1452 0.1842 1.1452 1.0701
No log 15.0667 226 1.2414 0.1697 1.2414 1.1142
No log 15.2 228 1.3604 0.2592 1.3604 1.1664
No log 15.3333 230 1.5068 0.2568 1.5068 1.2275
No log 15.4667 232 1.5124 0.2568 1.5124 1.2298
No log 15.6 234 1.3868 0.2015 1.3868 1.1776
No log 15.7333 236 1.2702 0.1081 1.2702 1.1270
No log 15.8667 238 1.2320 0.0710 1.2320 1.1100
No log 16.0 240 1.3086 0.0781 1.3086 1.1439
No log 16.1333 242 1.4981 0.2292 1.4981 1.2240
No log 16.2667 244 1.6420 0.1955 1.6420 1.2814
No log 16.4 246 1.5780 0.1142 1.5780 1.2562
No log 16.5333 248 1.5528 -0.0624 1.5528 1.2461
No log 16.6667 250 1.4696 -0.0833 1.4696 1.2123
No log 16.8 252 1.3395 -0.0464 1.3395 1.1574
No log 16.9333 254 1.2256 0.1500 1.2256 1.1071
No log 17.0667 256 1.1810 0.1141 1.1810 1.0868
No log 17.2 258 1.2804 0.2284 1.2804 1.1316
No log 17.3333 260 1.4958 0.2647 1.4958 1.2230
No log 17.4667 262 1.5978 0.2880 1.5978 1.2640
No log 17.6 264 1.4738 0.3222 1.4738 1.2140
No log 17.7333 266 1.2603 0.2896 1.2603 1.1226
No log 17.8667 268 1.2759 0.2577 1.2759 1.1296
No log 18.0 270 1.3755 0.2075 1.3755 1.1728
No log 18.1333 272 1.3897 0.1943 1.3897 1.1789
No log 18.2667 274 1.3028 0.1697 1.3028 1.1414
No log 18.4 276 1.2290 0.1842 1.2290 1.1086
No log 18.5333 278 1.2117 0.1842 1.2117 1.1008
No log 18.6667 280 1.2930 0.2577 1.2930 1.1371
No log 18.8 282 1.4411 0.3001 1.4411 1.2004
No log 18.9333 284 1.5555 0.2511 1.5555 1.2472
No log 19.0667 286 1.5347 0.2391 1.5347 1.2388
No log 19.2 288 1.4610 0.1943 1.4610 1.2087
No log 19.3333 290 1.3496 0.2126 1.3496 1.1617
No log 19.4667 292 1.2268 0.0833 1.2268 1.1076
No log 19.6 294 1.1725 0.1141 1.1725 1.0828
No log 19.7333 296 1.2190 0.1842 1.2190 1.1041
No log 19.8667 298 1.3255 0.2592 1.3255 1.1513
No log 20.0 300 1.3713 0.2143 1.3713 1.1710
No log 20.1333 302 1.3914 0.2143 1.3914 1.1796
No log 20.2667 304 1.3368 0.2284 1.3368 1.1562
No log 20.4 306 1.3238 0.2728 1.3238 1.1506
No log 20.5333 308 1.3463 0.2528 1.3463 1.1603
No log 20.6667 310 1.3914 0.2149 1.3914 1.1796
No log 20.8 312 1.4163 0.1880 1.4163 1.1901
No log 20.9333 314 1.3326 0.1486 1.3326 1.1544
No log 21.0667 316 1.2865 0.2341 1.2865 1.1342
No log 21.2 318 1.2202 0.2424 1.2202 1.1046
No log 21.3333 320 1.2479 0.2528 1.2479 1.1171
No log 21.4667 322 1.2315 0.2227 1.2315 1.1098
No log 21.6 324 1.2726 0.1838 1.2726 1.1281
No log 21.7333 326 1.3571 0.1880 1.3571 1.1649
No log 21.8667 328 1.4243 0.2184 1.4243 1.1934
No log 22.0 330 1.4661 0.2522 1.4661 1.2108
No log 22.1333 332 1.4538 0.2522 1.4538 1.2057
No log 22.2667 334 1.4015 0.2730 1.4015 1.1838
No log 22.4 336 1.2614 0.1579 1.2614 1.1231
No log 22.5333 338 1.1733 0.1141 1.1733 1.0832
No log 22.6667 340 1.2034 0.1141 1.2034 1.0970
No log 22.8 342 1.3313 0.1770 1.3313 1.1538
No log 22.9333 344 1.3665 0.1628 1.3665 1.1690
No log 23.0667 346 1.3795 0.1628 1.3795 1.1745
No log 23.2 348 1.3037 0.1473 1.3037 1.1418
No log 23.3333 350 1.2351 0.0987 1.2351 1.1113
No log 23.4667 352 1.1722 0.2528 1.1722 1.0827
No log 23.6 354 1.2106 0.2857 1.2106 1.1003
No log 23.7333 356 1.4219 0.2789 1.4219 1.1924
No log 23.8667 358 1.6274 0.2823 1.6274 1.2757
No log 24.0 360 1.6631 0.2525 1.6631 1.2896
No log 24.1333 362 1.5330 0.1703 1.5330 1.2382
No log 24.2667 364 1.3791 0.0401 1.3791 1.1743
No log 24.4 366 1.2782 0.0401 1.2782 1.1306
No log 24.5333 368 1.2679 0.0556 1.2679 1.1260
No log 24.6667 370 1.3148 0.0556 1.3148 1.1466
No log 24.8 372 1.3240 0.1697 1.3240 1.1507
No log 24.9333 374 1.3359 0.2089 1.3359 1.1558
No log 25.0667 376 1.4408 0.2772 1.4408 1.2004
No log 25.2 378 1.4495 0.2730 1.4495 1.2040
No log 25.3333 380 1.3672 0.2089 1.3672 1.1693
No log 25.4667 382 1.2859 0.1351 1.2859 1.1340
No log 25.6 384 1.2862 0.1351 1.2862 1.1341
No log 25.7333 386 1.3463 0.1697 1.3463 1.1603
No log 25.8667 388 1.4578 0.1880 1.4578 1.2074
No log 26.0 390 1.4584 0.2075 1.4584 1.2077
No log 26.1333 392 1.3716 0.2089 1.3716 1.1712
No log 26.2667 394 1.3532 0.1288 1.3532 1.1633
No log 26.4 396 1.2896 0.1351 1.2896 1.1356
No log 26.5333 398 1.2081 0.0987 1.2081 1.0991
No log 26.6667 400 1.1992 0.0556 1.1992 1.0951
No log 26.8 402 1.2599 0.0401 1.2599 1.1224
No log 26.9333 404 1.2739 0.0401 1.2739 1.1287
No log 27.0667 406 1.2152 0.0556 1.2152 1.1024
No log 27.2 408 1.1661 0.1351 1.1661 1.0799
No log 27.3333 410 1.1822 0.1697 1.1822 1.0873
No log 27.4667 412 1.2208 0.1288 1.2208 1.1049
No log 27.6 414 1.2513 0.1370 1.2513 1.1186
No log 27.7333 416 1.2580 0.1142 1.2580 1.1216
No log 27.8667 418 1.2731 0.1142 1.2731 1.1283
No log 28.0 420 1.2416 0.0931 1.2416 1.1143
No log 28.1333 422 1.2267 0.1351 1.2267 1.1076
No log 28.2667 424 1.2174 0.1351 1.2174 1.1034
No log 28.4 426 1.1777 0.1351 1.1777 1.0852
No log 28.5333 428 1.2411 0.1351 1.2411 1.1141
No log 28.6667 430 1.3046 0.1697 1.3046 1.1422
No log 28.8 432 1.3247 0.1697 1.3247 1.1510
No log 28.9333 434 1.2714 0.1351 1.2714 1.1276
No log 29.0667 436 1.2575 0.1351 1.2575 1.1214
No log 29.2 438 1.1992 0.1351 1.1992 1.0951
No log 29.3333 440 1.1995 0.1230 1.1995 1.0952
No log 29.4667 442 1.2937 0.1697 1.2937 1.1374
No log 29.6 444 1.4537 0.1904 1.4537 1.2057
No log 29.7333 446 1.5864 0.1943 1.5864 1.2595
No log 29.8667 448 1.6076 0.1943 1.6076 1.2679
No log 30.0 450 1.5706 0.1562 1.5706 1.2532
No log 30.1333 452 1.4875 0.2089 1.4875 1.2196
No log 30.2667 454 1.3632 0.2089 1.3632 1.1675
No log 30.4 456 1.3266 0.2089 1.3266 1.1518
No log 30.5333 458 1.3886 0.2446 1.3886 1.1784
No log 30.6667 460 1.4946 0.2542 1.4946 1.2225
No log 30.8 462 1.4519 0.1697 1.4519 1.2050
No log 30.9333 464 1.4045 0.1697 1.4045 1.1851
No log 31.0667 466 1.2958 0.1697 1.2958 1.1383
No log 31.2 468 1.2453 0.1697 1.2453 1.1159
No log 31.3333 470 1.2517 0.1697 1.2517 1.1188
No log 31.4667 472 1.3279 0.2149 1.3279 1.1524
No log 31.6 474 1.3864 0.2446 1.3864 1.1775
No log 31.7333 476 1.4422 0.2315 1.4422 1.2009
No log 31.8667 478 1.4064 0.2455 1.4064 1.1859
No log 32.0 480 1.3066 0.1351 1.3066 1.1431
No log 32.1333 482 1.2696 0.1351 1.2696 1.1268
No log 32.2667 484 1.2523 0.1351 1.2523 1.1191
No log 32.4 486 1.2272 0.1351 1.2272 1.1078
No log 32.5333 488 1.2633 0.1351 1.2633 1.1240
No log 32.6667 490 1.2860 0.1697 1.2860 1.1340
No log 32.8 492 1.3192 0.1770 1.3192 1.1486
No log 32.9333 494 1.3940 0.2640 1.3940 1.1807
No log 33.0667 496 1.4463 0.2417 1.4463 1.2026
No log 33.2 498 1.3623 0.2640 1.3623 1.1672
0.217 33.3333 500 1.2232 0.1697 1.2232 1.1060
0.217 33.4667 502 1.1059 0.1500 1.1059 1.0516
0.217 33.6 504 1.0915 0.1649 1.0915 1.0447
0.217 33.7333 506 1.1657 0.1500 1.1657 1.0797
0.217 33.8667 508 1.2836 0.1697 1.2836 1.1330
0.217 34.0 510 1.3763 0.1697 1.3763 1.1732
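
The logged epoch/step pairs above (e.g. epoch 0.1333 at step 2) imply 15 optimizer steps per epoch, which with train_batch_size 8 points to roughly 120 training examples. A quick back-of-the-envelope check (approximate, since the last batch of each epoch may be partial):

```python
# Infer steps per epoch and approximate dataset size from the first logged row.
step, epoch = 2, 0.1333                 # first row of the results table
steps_per_epoch = round(step / epoch)   # 2 / 0.1333 ≈ 15
train_batch_size = 8
approx_train_examples = steps_per_epoch * train_batch_size
print(steps_per_epoch, approx_train_examples)
```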

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1