ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k9_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.2265
  • QWK (quadratic weighted kappa): 0.0160
  • MSE: 1.2265
  • RMSE: 1.1075
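The RMSE above is simply the square root of the MSE, and QWK (quadratic weighted kappa) measures ordinal agreement between predicted and reference scores, penalizing large disagreements more heavily. As an illustrative sketch (pure Python, with hypothetical integer label lists), these metrics can be computed as:

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """QWK: 1 - (weighted observed disagreement / weighted expected disagreement)."""
    # Observed confusion matrix.
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Expected counts from the marginal label histograms.
    hist_true = Counter(y_true)
    hist_pred = Counter(y_pred)
    n = len(y_true)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            weight = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic weights
            expected = hist_true[i] * hist_pred[j] / n
            num += weight * observed[i][j]
            den += weight * expected
    return 1.0 - num / den  # den > 0 unless all labels fall in one class

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return math.sqrt(mse(y_true, y_pred))
```

A QWK near 0, as reported here, indicates agreement no better than chance on the ordinal scale.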

Model description

More information needed

Intended uses & limitations

More information needed
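Pending a fuller description, the checkpoint can be loaded like any AraBERT-based model from the Hub. A minimal sketch, assuming a sequence-classification head (which the task-style naming suggests but the card does not confirm); the example sentence is illustrative:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k9_task5_organization"

# Downloads the tokenizer and weights from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Score a short (hypothetical) Arabic text.
inputs = tokenizer("هذا نص تجريبي للمقال", return_tensors="pt", truncation=True)
outputs = model(**inputs)
print(outputs.logits)
```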

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
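The hyperparameters above map directly onto a standard `transformers.TrainingArguments` configuration; a sketch under that assumption (the `output_dir` path is illustrative, and the actual training script is not published with this card):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",            # illustrative path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,                    # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```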

Training results

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0909 2 4.2022 0.0054 4.2022 2.0499
No log 0.1818 4 2.0665 0.0271 2.0665 1.4375
No log 0.2727 6 1.3293 0.0938 1.3293 1.1529
No log 0.3636 8 1.2055 0.1333 1.2055 1.0979
No log 0.4545 10 1.1096 0.1476 1.1096 1.0534
No log 0.5455 12 1.1230 0.2615 1.1230 1.0597
No log 0.6364 14 1.9505 0.0633 1.9505 1.3966
No log 0.7273 16 1.9299 0.0916 1.9299 1.3892
No log 0.8182 18 1.4919 0.0 1.4919 1.2214
No log 0.9091 20 1.1093 0.0999 1.1093 1.0532
No log 1.0 22 1.0201 0.3902 1.0201 1.0100
No log 1.0909 24 1.0723 0.0762 1.0723 1.0355
No log 1.1818 26 1.0957 0.0888 1.0957 1.0468
No log 1.2727 28 1.0425 0.1233 1.0425 1.0210
No log 1.3636 30 0.9925 0.2467 0.9925 0.9962
No log 1.4545 32 0.9567 0.2566 0.9567 0.9781
No log 1.5455 34 0.9841 0.2416 0.9841 0.9920
No log 1.6364 36 1.1296 0.1071 1.1296 1.0628
No log 1.7273 38 1.1629 0.0917 1.1629 1.0784
No log 1.8182 40 1.0763 0.1755 1.0763 1.0374
No log 1.9091 42 1.1371 0.1447 1.1371 1.0663
No log 2.0 44 1.2074 0.2298 1.2074 1.0988
No log 2.0909 46 1.0949 0.2770 1.0949 1.0464
No log 2.1818 48 1.1886 0.2263 1.1886 1.0902
No log 2.2727 50 1.4425 0.1568 1.4425 1.2010
No log 2.3636 52 1.4336 0.1442 1.4336 1.1973
No log 2.4545 54 1.4913 0.1452 1.4913 1.2212
No log 2.5455 56 1.2295 0.1649 1.2295 1.1088
No log 2.6364 58 1.0343 0.2594 1.0343 1.0170
No log 2.7273 60 0.9892 0.3048 0.9892 0.9946
No log 2.8182 62 1.1373 0.2748 1.1373 1.0664
No log 2.9091 64 1.3179 0.2359 1.3179 1.1480
No log 3.0 66 1.4430 0.1686 1.4430 1.2013
No log 3.0909 68 1.3145 0.1303 1.3145 1.1465
No log 3.1818 70 1.0966 0.1927 1.0966 1.0472
No log 3.2727 72 1.0909 0.1927 1.0909 1.0444
No log 3.3636 74 1.2845 0.2424 1.2845 1.1334
No log 3.4545 76 1.3880 0.1595 1.3880 1.1781
No log 3.5455 78 1.2390 0.1417 1.2390 1.1131
No log 3.6364 80 1.0832 0.1873 1.0832 1.0408
No log 3.7273 82 1.0287 0.1530 1.0287 1.0143
No log 3.8182 84 1.0234 0.1530 1.0234 1.0116
No log 3.9091 86 1.1309 0.1351 1.1309 1.0634
No log 4.0 88 1.3056 0.1697 1.3056 1.1426
No log 4.0909 90 1.4454 0.1703 1.4454 1.2023
No log 4.1818 92 1.3303 0.1838 1.3303 1.1534
No log 4.2727 94 1.0885 0.1139 1.0885 1.0433
No log 4.3636 96 1.0062 0.2287 1.0062 1.0031
No log 4.4545 98 1.0261 0.1653 1.0261 1.0130
No log 4.5455 100 1.1551 0.0833 1.1551 1.0748
No log 4.6364 102 1.3525 0.1486 1.3525 1.1630
No log 4.7273 104 1.4430 0.1814 1.4430 1.2013
No log 4.8182 106 1.3542 0.0878 1.3542 1.1637
No log 4.9091 108 1.1570 0.0781 1.1570 1.0756
No log 5.0 110 1.0347 0.1570 1.0347 1.0172
No log 5.0909 112 1.0199 0.1918 1.0199 1.0099
No log 5.1818 114 1.0804 0.2105 1.0804 1.0394
No log 5.2727 116 1.2582 0.2455 1.2582 1.1217
No log 5.3636 118 1.4537 0.2004 1.4537 1.2057
No log 5.4545 120 1.4671 0.1832 1.4671 1.2112
No log 5.5455 122 1.4709 0.1892 1.4709 1.2128
No log 5.6364 124 1.4482 0.1832 1.4482 1.2034
No log 5.7273 126 1.2916 0.2065 1.2916 1.1365
No log 5.8182 128 1.1314 0.1351 1.1314 1.0637
No log 5.9091 130 1.0760 0.1770 1.0760 1.0373
No log 6.0 132 1.0686 0.1351 1.0686 1.0337
No log 6.0909 134 1.1189 0.1697 1.1189 1.0578
No log 6.1818 136 1.3057 0.1278 1.3057 1.1427
No log 6.2727 138 1.4283 0.2191 1.4283 1.1951
No log 6.3636 140 1.5241 0.2339 1.5241 1.2345
No log 6.4545 142 1.5369 0.1140 1.5369 1.2397
No log 6.5455 144 1.5338 0.1058 1.5338 1.2385
No log 6.6364 146 1.4418 0.1362 1.4418 1.2008
No log 6.7273 148 1.3373 0.0841 1.3373 1.1564
No log 6.8182 150 1.2823 0.0961 1.2823 1.1324
No log 6.9091 152 1.1461 0.1110 1.1461 1.0706
No log 7.0 154 1.0071 0.2559 1.0071 1.0036
No log 7.0909 156 0.9915 0.2287 0.9915 0.9957
No log 7.1818 158 1.0688 0.2062 1.0688 1.0338
No log 7.2727 160 1.2108 0.1880 1.2108 1.1004
No log 7.3636 162 1.2636 0.1880 1.2636 1.1241
No log 7.4545 164 1.3161 0.2187 1.3161 1.1472
No log 7.5455 166 1.3722 0.2062 1.3722 1.1714
No log 7.6364 168 1.3524 0.2187 1.3524 1.1629
No log 7.7273 170 1.4121 0.2342 1.4121 1.1883
No log 7.8182 172 1.5019 0.2239 1.5019 1.2255
No log 7.9091 174 1.4605 0.2372 1.4605 1.2085
No log 8.0 176 1.3250 0.1744 1.3250 1.1511
No log 8.0909 178 1.2179 0.0931 1.2179 1.1036
No log 8.1818 180 1.1655 0.1351 1.1655 1.0796
No log 8.2727 182 1.1612 0.0987 1.1612 1.0776
No log 8.3636 184 1.1961 0.0 1.1961 1.0937
No log 8.4545 186 1.2071 0.0 1.2071 1.0987
No log 8.5455 188 1.2408 0.0781 1.2408 1.1139
No log 8.6364 190 1.3286 0.1744 1.3286 1.1527
No log 8.7273 192 1.3824 0.2260 1.3824 1.1758
No log 8.8182 194 1.4647 0.2431 1.4647 1.2103
No log 8.9091 196 1.5333 0.2717 1.5333 1.2383
No log 9.0 198 1.5734 0.2974 1.5734 1.2544
No log 9.0909 200 1.5084 0.1950 1.5084 1.2282
No log 9.1818 202 1.3673 0.1703 1.3673 1.1693
No log 9.2727 204 1.2438 0.1288 1.2438 1.1152
No log 9.3636 206 1.2619 0.1288 1.2619 1.1234
No log 9.4545 208 1.3735 0.1744 1.3735 1.1720
No log 9.5455 210 1.4397 0.1814 1.4397 1.1999
No log 9.6364 212 1.4793 0.2004 1.4793 1.2163
No log 9.7273 214 1.5000 0.2291 1.5000 1.2247
No log 9.8182 216 1.5650 0.2291 1.5650 1.2510
No log 9.9091 218 1.5039 0.1952 1.5039 1.2263
No log 10.0 220 1.4530 0.1288 1.4530 1.2054
No log 10.0909 222 1.4578 0.1288 1.4578 1.2074
No log 10.1818 224 1.4752 0.1288 1.4752 1.2146
No log 10.2727 226 1.4386 0.1288 1.4386 1.1994
No log 10.3636 228 1.3779 0.1288 1.3779 1.1738
No log 10.4545 230 1.2338 0.1351 1.2338 1.1108
No log 10.5455 232 1.0748 0.1046 1.0748 1.0367
No log 10.6364 234 1.0558 0.1417 1.0558 1.0275
No log 10.7273 236 1.1200 0.2105 1.1200 1.0583
No log 10.8182 238 1.1277 0.2105 1.1277 1.0619
No log 10.9091 240 1.1988 0.1288 1.1988 1.0949
No log 11.0 242 1.2822 0.1886 1.2822 1.1323
No log 11.0909 244 1.3224 0.1886 1.3224 1.1500
No log 11.1818 246 1.3012 0.0931 1.3012 1.1407
No log 11.2727 248 1.3157 0.0781 1.3157 1.1470
No log 11.3636 250 1.2958 0.0931 1.2958 1.1383
No log 11.4545 252 1.3152 0.1288 1.3152 1.1468
No log 11.5455 254 1.3662 0.1288 1.3662 1.1688
No log 11.6364 256 1.4224 0.2417 1.4224 1.1927
No log 11.7273 258 1.4333 0.2555 1.4333 1.1972
No log 11.8182 260 1.3728 0.1835 1.3728 1.1717
No log 11.9091 262 1.4142 0.1634 1.4142 1.1892
No log 12.0 264 1.5633 0.2342 1.5633 1.2503
No log 12.0909 266 1.5963 0.2004 1.5963 1.2634
No log 12.1818 268 1.4799 0.1228 1.4799 1.2165
No log 12.2727 270 1.2865 0.0781 1.2865 1.1342
No log 12.3636 272 1.1117 0.0762 1.1117 1.0544
No log 12.4545 274 1.0784 0.0510 1.0784 1.0385
No log 12.5455 276 1.1102 0.0762 1.1102 1.0536
No log 12.6364 278 1.1980 0.0781 1.1980 1.0945
No log 12.7273 280 1.2909 0.1142 1.2909 1.1362
No log 12.8182 282 1.3786 0.1142 1.3786 1.1741
No log 12.9091 284 1.3923 0.1288 1.3923 1.1800
No log 13.0 286 1.4262 0.1288 1.4262 1.1942
No log 13.0909 288 1.4412 0.0931 1.4412 1.2005
No log 13.1818 290 1.4757 0.0401 1.4757 1.2148
No log 13.2727 292 1.5103 0.1744 1.5103 1.2290
No log 13.3636 294 1.5063 0.1744 1.5063 1.2273
No log 13.4545 296 1.4669 0.1142 1.4669 1.2112
No log 13.5455 298 1.4286 0.1142 1.4286 1.1952
No log 13.6364 300 1.4179 0.1228 1.4179 1.1908
No log 13.7273 302 1.3432 0.1228 1.3432 1.1590
No log 13.8182 304 1.2967 0.1142 1.2967 1.1387
No log 13.9091 306 1.2463 0.1142 1.2463 1.1164
No log 14.0 308 1.2507 0.1142 1.2507 1.1183
No log 14.0909 310 1.2852 0.0781 1.2852 1.1337
No log 14.1818 312 1.3065 0.0781 1.3065 1.1430
No log 14.2727 314 1.3812 0.1142 1.3812 1.1752
No log 14.3636 316 1.3962 0.1142 1.3962 1.1816
No log 14.4545 318 1.4031 0.1142 1.4031 1.1845
No log 14.5455 320 1.3441 0.1142 1.3441 1.1593
No log 14.6364 322 1.2618 0.1434 1.2618 1.1233
No log 14.7273 324 1.2689 0.1434 1.2689 1.1265
No log 14.8182 326 1.3503 0.1628 1.3503 1.1620
No log 14.9091 328 1.4025 0.2126 1.4025 1.1843
No log 15.0 330 1.3794 0.1486 1.3794 1.1745
No log 15.0909 332 1.3727 0.1142 1.3727 1.1716
No log 15.1818 334 1.3821 0.2065 1.3821 1.1756
No log 15.2727 336 1.3869 0.2126 1.3869 1.1777
No log 15.3636 338 1.3196 0.1142 1.3196 1.1488
No log 15.4545 340 1.2683 0.1370 1.2683 1.1262
No log 15.5455 342 1.2557 0.1370 1.2557 1.1206
No log 15.6364 344 1.3246 0.1562 1.3246 1.1509
No log 15.7273 346 1.4144 0.2752 1.4144 1.1893
No log 15.8182 348 1.3733 0.1562 1.3733 1.1719
No log 15.9091 350 1.3031 0.1370 1.3031 1.1415
No log 16.0 352 1.2529 0.1434 1.2529 1.1193
No log 16.0909 354 1.1884 0.1081 1.1884 1.0902
No log 16.1818 356 1.1020 0.1500 1.1020 1.0498
No log 16.2727 358 1.0988 0.1141 1.0988 1.0483
No log 16.3636 360 1.1981 0.1434 1.1981 1.0946
No log 16.4545 362 1.2920 0.1628 1.2920 1.1366
No log 16.5455 364 1.4044 0.2752 1.4044 1.1851
No log 16.6364 366 1.4592 0.2793 1.4592 1.2080
No log 16.7273 368 1.3408 0.3071 1.3408 1.1579
No log 16.8182 370 1.1374 0.2250 1.1374 1.0665
No log 16.9091 372 1.0478 0.2238 1.0478 1.0236
No log 17.0 374 1.0862 0.1500 1.0862 1.0422
No log 17.0909 376 1.2102 0.1370 1.2102 1.1001
No log 17.1818 378 1.2940 0.1700 1.2940 1.1375
No log 17.2727 380 1.2698 0.1700 1.2698 1.1268
No log 17.3636 382 1.2524 0.1700 1.2524 1.1191
No log 17.4545 384 1.1944 0.1904 1.1944 1.0929
No log 17.5455 386 1.0035 0.3330 1.0035 1.0018
No log 17.6364 388 0.9324 0.3434 0.9324 0.9656
No log 17.7273 390 0.9599 0.3434 0.9599 0.9797
No log 17.8182 392 1.0952 0.1474 1.0952 1.0465
No log 17.9091 394 1.3176 0.2602 1.3176 1.1479
No log 18.0 396 1.3901 0.2474 1.3901 1.1790
No log 18.0909 398 1.3315 0.1562 1.3315 1.1539
No log 18.1818 400 1.2666 0.1434 1.2666 1.1254
No log 18.2727 402 1.2244 0.1081 1.2244 1.1065
No log 18.3636 404 1.2402 0.1081 1.2402 1.1136
No log 18.4545 406 1.3194 0.1434 1.3194 1.1486
No log 18.5455 408 1.3808 0.1486 1.3808 1.1751
No log 18.6364 410 1.3489 0.1486 1.3489 1.1614
No log 18.7273 412 1.3164 0.1486 1.3164 1.1474
No log 18.8182 414 1.2595 0.1142 1.2595 1.1223
No log 18.9091 416 1.1916 0.0401 1.1916 1.0916
No log 19.0 418 1.1393 0.0710 1.1393 1.0674
No log 19.0909 420 1.0742 0.0310 1.0742 1.0364
No log 19.1818 422 1.0604 0.0310 1.0604 1.0298
No log 19.2727 424 1.1148 0.1434 1.1148 1.0559
No log 19.3636 426 1.1724 0.1770 1.1724 1.0828
No log 19.4545 428 1.2076 0.1770 1.2076 1.0989
No log 19.5455 430 1.2838 0.1628 1.2838 1.1330
No log 19.6364 432 1.3413 0.1486 1.3413 1.1582
No log 19.7273 434 1.3423 0.1628 1.3423 1.1586
No log 19.8182 436 1.3199 0.1700 1.3199 1.1489
No log 19.9091 438 1.3626 0.1700 1.3626 1.1673
No log 20.0 440 1.4374 0.1562 1.4374 1.1989
No log 20.0909 442 1.4946 0.1562 1.4946 1.2225
No log 20.1818 444 1.4635 0.1562 1.4635 1.2098
No log 20.2727 446 1.3705 0.0781 1.3705 1.1707
No log 20.3636 448 1.2332 0.0556 1.2332 1.1105
No log 20.4545 450 1.1586 0.0710 1.1586 1.0764
No log 20.5455 452 1.0928 0.0310 1.0928 1.0454
No log 20.6364 454 1.1155 0.0310 1.1155 1.0562
No log 20.7273 456 1.1970 0.1838 1.1970 1.0941
No log 20.8182 458 1.3181 0.1634 1.3181 1.1481
No log 20.9091 460 1.3799 0.1634 1.3799 1.1747
No log 21.0 462 1.3780 0.1562 1.3780 1.1739
No log 21.0909 464 1.3672 0.0401 1.3672 1.1693
No log 21.1818 466 1.3780 0.0401 1.3780 1.1739
No log 21.2727 468 1.3411 0.0401 1.3411 1.1581
No log 21.3636 470 1.2844 0.0710 1.2844 1.1333
No log 21.4545 472 1.2349 0.0710 1.2349 1.1113
No log 21.5455 474 1.2766 0.0556 1.2766 1.1299
No log 21.6364 476 1.3290 0.0401 1.3290 1.1528
No log 21.7273 478 1.3868 0.0781 1.3868 1.1776
No log 21.8182 480 1.3982 0.1142 1.3982 1.1824
No log 21.9091 482 1.3751 0.1288 1.3751 1.1726
No log 22.0 484 1.3367 0.0931 1.3367 1.1562
No log 22.0909 486 1.2657 0.1288 1.2657 1.1250
No log 22.1818 488 1.1627 0.0710 1.1627 1.0783
No log 22.2727 490 1.1242 0.0691 1.1242 1.0603
No log 22.3636 492 1.1286 0.1288 1.1286 1.0624
No log 22.4545 494 1.0862 0.1081 1.0862 1.0422
No log 22.5455 496 1.0564 0.1053 1.0564 1.0278
No log 22.6364 498 1.0570 0.1053 1.0570 1.0281
0.2565 22.7273 500 1.1156 0.1434 1.1156 1.0562
0.2565 22.8182 502 1.1621 0.2027 1.1621 1.0780
0.2565 22.9091 504 1.2185 0.2341 1.2185 1.1038
0.2565 23.0 506 1.2716 0.2395 1.2716 1.1276
0.2565 23.0909 508 1.2926 0.2260 1.2926 1.1369
0.2565 23.1818 510 1.3479 0.2065 1.3479 1.1610
0.2565 23.2727 512 1.3426 0.1142 1.3426 1.1587
0.2565 23.3636 514 1.3514 0.1142 1.3514 1.1625
0.2565 23.4545 516 1.3244 0.1142 1.3244 1.1508
0.2565 23.5455 518 1.2670 0.0 1.2670 1.1256
0.2565 23.6364 520 1.2265 0.0160 1.2265 1.1075

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size: 0.1B params · Tensor type: F32 (Safetensors)
