ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k20_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1775
  • QWK: 0.2294
  • MSE: 1.1775
  • RMSE: 1.0851
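Loss and MSE coincide (1.1775), which suggests the model was trained as a regressor with an MSE objective, though the card does not say so. A self-contained sketch of how the reported metrics relate, with a plain-Python quadratic weighted kappa (QWK) for reference (this is a generic QWK implementation, not the card's evaluation code):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """QWK: agreement between integer ratings, with disagreements
    penalized by the squared distance between classes."""
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    row_totals = [sum(row) for row in observed]
    col_totals = [sum(observed[r][c] for r in range(n_classes))
                  for c in range(n_classes)]
    total = len(y_true)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            weight = (i - j) ** 2 / (n_classes - 1) ** 2
            expected = row_totals[i] * col_totals[j] / total
            num += weight * observed[i][j]
            den += weight * expected
    return 1.0 - num / den

# RMSE is the square root of MSE, which is why the two values above agree.
assert math.isclose(math.sqrt(1.1775), 1.0851, abs_tol=5e-4)
```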

Model description

More information needed

Intended uses & limitations

More information needed
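The card leaves usage unspecified; the model name and metrics (QWK, MSE) suggest automated scoring of the organization trait of Arabic essays. A minimal, hypothetical loading sketch (the single-logit regression head is an assumption inferred from the MSE loss, and `score_essay` is an illustrative helper, not part of the release):

```python
MODEL_ID = (
    "MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_"
    "FineTuningAraBERT_run3_AugV5_k20_task7_organization"
)

def score_essay(text: str) -> float:
    """Return the model's raw score for one essay.

    Imports are deferred so the sketch can be read without the
    transformers dependency; calling this downloads the checkpoint.
    """
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
    inputs = tokenizer(text, return_tensors="pt",
                       truncation=True, max_length=512)
    logits = model(**inputs).logits
    # A one-logit regression head is assumed here; for a classification
    # head, take logits.argmax(-1) instead.
    return logits.squeeze().item()
```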

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
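The linear scheduler listed above decays the learning rate from 2e-05 toward 0 over the course of training. A stdlib sketch of that schedule (zero warmup is assumed, since none is listed):

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-05) -> float:
    """Linear decay from base_lr at step 0 to 0 at total_steps
    (lr_scheduler_type: linear, zero warmup assumed)."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)

# Halfway through training the learning rate has halved.
half = linear_lr(500, 1000)  # 1e-05
```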

Training results

"No log" in the Training Loss column means the training loss had not yet been logged; the first logged value appears at step 500.

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0426 2 2.6637 -0.0924 2.6637 1.6321
No log 0.0851 4 1.4013 0.0141 1.4013 1.1837
No log 0.1277 6 1.3553 -0.2440 1.3553 1.1642
No log 0.1702 8 1.2365 -0.0464 1.2365 1.1120
No log 0.2128 10 1.2210 -0.0344 1.2210 1.1050
No log 0.2553 12 1.2286 -0.0690 1.2286 1.1084
No log 0.2979 14 1.1498 -0.0358 1.1498 1.0723
No log 0.3404 16 1.0321 0.0584 1.0321 1.0159
No log 0.3830 18 0.8367 0.0393 0.8367 0.9147
No log 0.4255 20 0.7600 0.0481 0.7600 0.8718
No log 0.4681 22 0.8112 0.1754 0.8112 0.9006
No log 0.5106 24 0.7766 0.1365 0.7766 0.8812
No log 0.5532 26 0.7312 0.0481 0.7312 0.8551
No log 0.5957 28 0.7361 0.0 0.7361 0.8580
No log 0.6383 30 0.7731 0.0393 0.7731 0.8792
No log 0.6809 32 0.7855 0.0757 0.7855 0.8863
No log 0.7234 34 0.8020 0.1972 0.8020 0.8955
No log 0.7660 36 0.8369 0.2883 0.8369 0.9148
No log 0.8085 38 0.8729 0.3777 0.8729 0.9343
No log 0.8511 40 0.8536 0.3538 0.8536 0.9239
No log 0.8936 42 0.7889 0.3712 0.7889 0.8882
No log 0.9362 44 0.7599 0.2345 0.7599 0.8717
No log 0.9787 46 0.7531 0.1094 0.7531 0.8678
No log 1.0213 48 0.7545 0.0798 0.7545 0.8686
No log 1.0638 50 0.7628 0.0798 0.7628 0.8734
No log 1.1064 52 0.7359 0.0840 0.7359 0.8578
No log 1.1489 54 0.6856 0.0717 0.6856 0.8280
No log 1.1915 56 0.6886 0.2950 0.6886 0.8298
No log 1.2340 58 0.7264 0.3894 0.7264 0.8523
No log 1.2766 60 0.7790 0.3606 0.7790 0.8826
No log 1.3191 62 0.7259 0.3675 0.7259 0.8520
No log 1.3617 64 0.6399 0.3839 0.6399 0.7999
No log 1.4043 66 0.6327 0.1508 0.6327 0.7954
No log 1.4468 68 0.6409 0.1508 0.6409 0.8006
No log 1.4894 70 0.6632 0.1400 0.6632 0.8144
No log 1.5319 72 0.7024 0.3471 0.7024 0.8381
No log 1.5745 74 0.7651 0.4144 0.7651 0.8747
No log 1.6170 76 0.7902 0.4067 0.7902 0.8889
No log 1.6596 78 0.7896 0.3637 0.7896 0.8886
No log 1.7021 80 0.8315 0.2576 0.8315 0.9119
No log 1.7447 82 0.8869 0.2812 0.8869 0.9418
No log 1.7872 84 0.9990 0.3019 0.9990 0.9995
No log 1.8298 86 1.0664 0.2166 1.0664 1.0327
No log 1.8723 88 1.1086 0.2348 1.1086 1.0529
No log 1.9149 90 1.0265 0.2529 1.0265 1.0132
No log 1.9574 92 0.9319 0.3675 0.9319 0.9653
No log 2.0 94 0.8718 0.3675 0.8718 0.9337
No log 2.0426 96 0.8632 0.3746 0.8632 0.9291
No log 2.0851 98 0.8763 0.3494 0.8763 0.9361
No log 2.1277 100 0.9157 0.2843 0.9157 0.9569
No log 2.1702 102 0.8750 0.2574 0.8750 0.9354
No log 2.2128 104 0.8447 0.2692 0.8447 0.9191
No log 2.2553 106 0.8336 0.2574 0.8336 0.9130
No log 2.2979 108 0.8581 0.3234 0.8581 0.9263
No log 2.3404 110 0.8803 0.4175 0.8803 0.9382
No log 2.3830 112 0.9894 0.3022 0.9894 0.9947
No log 2.4255 114 1.0035 0.3160 1.0035 1.0017
No log 2.4681 116 1.0751 0.2730 1.0751 1.0369
No log 2.5106 118 1.0456 0.2779 1.0456 1.0226
No log 2.5532 120 1.1146 0.2824 1.1146 1.0558
No log 2.5957 122 1.0825 0.2659 1.0825 1.0404
No log 2.6383 124 0.9248 0.3461 0.9248 0.9616
No log 2.6809 126 0.8835 0.2871 0.8835 0.9399
No log 2.7234 128 0.9152 0.2754 0.9152 0.9567
No log 2.7660 130 0.8771 0.2812 0.8771 0.9365
No log 2.8085 132 0.8217 0.2871 0.8217 0.9065
No log 2.8511 134 0.8260 0.2604 0.8260 0.9088
No log 2.8936 136 0.8946 0.3008 0.8946 0.9458
No log 2.9362 138 0.9891 0.2988 0.9891 0.9945
No log 2.9787 140 1.1406 0.2094 1.1406 1.0680
No log 3.0213 142 1.2392 0.2271 1.2392 1.1132
No log 3.0638 144 1.1747 0.1689 1.1747 1.0838
No log 3.1064 146 0.9979 0.2012 0.9979 0.9990
No log 3.1489 148 0.9331 0.1706 0.9331 0.9660
No log 3.1915 150 0.9441 0.1661 0.9441 0.9716
No log 3.2340 152 1.0614 0.2124 1.0614 1.0302
No log 3.2766 154 1.0918 0.1712 1.0918 1.0449
No log 3.3191 156 1.0633 0.1912 1.0633 1.0312
No log 3.3617 158 0.9803 0.1176 0.9803 0.9901
No log 3.4043 160 0.9442 0.1368 0.9442 0.9717
No log 3.4468 162 0.9281 0.1550 0.9281 0.9634
No log 3.4894 164 0.9669 0.1103 0.9669 0.9833
No log 3.5319 166 0.9895 0.1718 0.9895 0.9948
No log 3.5745 168 0.9883 0.1628 0.9883 0.9941
No log 3.6170 170 0.9528 0.1332 0.9528 0.9761
No log 3.6596 172 0.9486 0.1332 0.9486 0.9739
No log 3.7021 174 1.0316 0.1765 1.0316 1.0157
No log 3.7447 176 1.1436 0.2017 1.1436 1.0694
No log 3.7872 178 1.1897 0.2590 1.1897 1.0908
No log 3.8298 180 1.1311 0.3052 1.1311 1.0635
No log 3.8723 182 1.0895 0.2702 1.0895 1.0438
No log 3.9149 184 1.0723 0.2702 1.0723 1.0355
No log 3.9574 186 1.1352 0.2234 1.1352 1.0654
No log 4.0 188 1.2639 0.1849 1.2639 1.1242
No log 4.0426 190 1.2899 0.1067 1.2899 1.1357
No log 4.0851 192 1.2242 0.0948 1.2242 1.1064
No log 4.1277 194 1.1887 0.1821 1.1887 1.0903
No log 4.1702 196 1.2394 0.1805 1.2394 1.1133
No log 4.2128 198 1.2129 0.2153 1.2129 1.1013
No log 4.2553 200 1.1904 0.2153 1.1904 1.0910
No log 4.2979 202 1.1590 0.1908 1.1590 1.0766
No log 4.3404 204 1.2032 0.2043 1.2032 1.0969
No log 4.3830 206 1.3023 0.1648 1.3023 1.1412
No log 4.4255 208 1.3391 0.1364 1.3391 1.1572
No log 4.4681 210 1.1667 0.1428 1.1667 1.0801
No log 4.5106 212 0.9364 0.2967 0.9364 0.9677
No log 4.5532 214 0.8737 0.1183 0.8737 0.9347
No log 4.5957 216 0.9011 0.1459 0.9011 0.9492
No log 4.6383 218 1.0129 0.3499 1.0129 1.0064
No log 4.6809 220 1.1979 0.2008 1.1979 1.0945
No log 4.7234 222 1.2342 0.1641 1.2342 1.1110
No log 4.7660 224 1.0801 0.2709 1.0801 1.0393
No log 4.8085 226 0.8695 0.3494 0.8695 0.9325
No log 4.8511 228 0.8180 0.3372 0.8180 0.9044
No log 4.8936 230 0.8523 0.3425 0.8523 0.9232
No log 4.9362 232 1.0303 0.3417 1.0303 1.0150
No log 4.9787 234 1.2725 0.1338 1.2725 1.1280
No log 5.0213 236 1.4029 0.1031 1.4029 1.1844
No log 5.0638 238 1.3424 0.0903 1.3424 1.1586
No log 5.1064 240 1.2547 0.2069 1.2547 1.1201
No log 5.1489 242 1.2110 0.1732 1.2110 1.1004
No log 5.1915 244 1.1334 0.2380 1.1334 1.0646
No log 5.2340 246 1.0611 0.2921 1.0611 1.0301
No log 5.2766 248 1.0023 0.2988 1.0023 1.0012
No log 5.3191 250 1.0058 0.2578 1.0058 1.0029
No log 5.3617 252 1.0955 0.2396 1.0955 1.0466
No log 5.4043 254 1.1702 0.1484 1.1702 1.0818
No log 5.4468 256 1.1160 0.2017 1.1160 1.0564
No log 5.4894 258 1.0675 0.2437 1.0675 1.0332
No log 5.5319 260 1.0828 0.2529 1.0828 1.0406
No log 5.5745 262 1.0001 0.3169 1.0001 1.0001
No log 5.6170 264 0.9774 0.3042 0.9774 0.9886
No log 5.6596 266 0.9762 0.3169 0.9762 0.9880
No log 5.7021 268 0.9795 0.3231 0.9795 0.9897
No log 5.7447 270 1.0808 0.3359 1.0808 1.0396
No log 5.7872 272 1.0856 0.3425 1.0856 1.0419
No log 5.8298 274 0.9787 0.3657 0.9787 0.9893
No log 5.8723 276 0.9342 0.3217 0.9342 0.9665
No log 5.9149 278 0.9802 0.3597 0.9802 0.9900
No log 5.9574 280 0.9934 0.3597 0.9934 0.9967
No log 6.0 282 0.9682 0.3933 0.9682 0.9840
No log 6.0426 284 0.9367 0.3998 0.9367 0.9678
No log 6.0851 286 0.9373 0.3998 0.9373 0.9682
No log 6.1277 288 0.9939 0.3747 0.9939 0.9969
No log 6.1702 290 1.1404 0.3370 1.1404 1.0679
No log 6.2128 292 1.2163 0.2280 1.2163 1.1029
No log 6.2553 294 1.2289 0.1931 1.2289 1.1085
No log 6.2979 296 1.1970 0.1713 1.1970 1.0941
No log 6.3404 298 1.1197 0.2271 1.1197 1.0581
No log 6.3830 300 1.0117 0.3538 1.0117 1.0058
No log 6.4255 302 0.9724 0.4067 0.9724 0.9861
No log 6.4681 304 1.0669 0.2881 1.0669 1.0329
No log 6.5106 306 1.2525 0.1473 1.2525 1.1191
No log 6.5532 308 1.3576 0.1195 1.3576 1.1652
No log 6.5957 310 1.2974 0.1513 1.2974 1.1391
No log 6.6383 312 1.1278 0.1909 1.1278 1.0620
No log 6.6809 314 0.9785 0.3169 0.9785 0.9892
No log 6.7234 316 0.9340 0.3359 0.9340 0.9665
No log 6.7660 318 0.9811 0.3110 0.9811 0.9905
No log 6.8085 320 1.1138 0.2412 1.1138 1.0554
No log 6.8511 322 1.2122 0.1654 1.2122 1.1010
No log 6.8936 324 1.2160 0.0712 1.2160 1.1027
No log 6.9362 326 1.1641 0.0761 1.1641 1.0790
No log 6.9787 328 1.0434 0.1651 1.0434 1.0215
No log 7.0213 330 0.9305 0.2843 0.9305 0.9646
No log 7.0638 332 0.9182 0.2904 0.9182 0.9582
No log 7.1064 334 0.9458 0.2518 0.9458 0.9725
No log 7.1489 336 1.0011 0.2287 1.0011 1.0006
No log 7.1915 338 1.1242 0.1389 1.1242 1.0603
No log 7.2340 340 1.2233 0.1115 1.2233 1.1061
No log 7.2766 342 1.2112 0.0872 1.2112 1.1006
No log 7.3191 344 1.1316 0.1454 1.1316 1.0638
No log 7.3617 346 1.0847 0.1029 1.0847 1.0415
No log 7.4043 348 1.0912 0.1541 1.0912 1.0446
No log 7.4468 350 1.1150 0.1495 1.1150 1.0560
No log 7.4894 352 1.1402 0.0960 1.1402 1.0678
No log 7.5319 354 1.2154 0.0553 1.2154 1.1025
No log 7.5745 356 1.2100 0.1961 1.2100 1.1000
No log 7.6170 358 1.1628 0.1961 1.1628 1.0783
No log 7.6596 360 1.1553 0.1796 1.1553 1.0748
No log 7.7021 362 1.1427 0.2341 1.1427 1.0690
No log 7.7447 364 1.0909 0.2294 1.0909 1.0445
No log 7.7872 366 1.0304 0.3160 1.0304 1.0151
No log 7.8298 368 0.9487 0.3169 0.9487 0.9740
No log 7.8723 370 0.8925 0.3564 0.8925 0.9447
No log 7.9149 372 0.9500 0.3105 0.9500 0.9747
No log 7.9574 374 1.0349 0.3310 1.0349 1.0173
No log 8.0 376 1.2319 0.1641 1.2319 1.1099
No log 8.0426 378 1.4602 0.1075 1.4602 1.2084
No log 8.0851 380 1.5221 0.1031 1.5221 1.2337
No log 8.1277 382 1.4349 0.1189 1.4349 1.1979
No log 8.1702 384 1.2156 0.1772 1.2156 1.1025
No log 8.2128 386 1.0055 0.4208 1.0055 1.0027
No log 8.2553 388 0.8847 0.3940 0.8847 0.9406
No log 8.2979 390 0.8625 0.3699 0.8625 0.9287
No log 8.3404 392 0.9308 0.4419 0.9308 0.9648
No log 8.3830 394 0.9718 0.4419 0.9718 0.9858
No log 8.4255 396 1.0481 0.3080 1.0481 1.0237
No log 8.4681 398 1.1347 0.2376 1.1347 1.0652
No log 8.5106 400 1.0853 0.2006 1.0853 1.0418
No log 8.5532 402 1.0293 0.3082 1.0293 1.0145
No log 8.5957 404 0.9469 0.3777 0.9469 0.9731
No log 8.6383 406 0.9288 0.4030 0.9288 0.9638
No log 8.6809 408 0.9488 0.4044 0.9488 0.9740
No log 8.7234 410 0.9582 0.4044 0.9582 0.9789
No log 8.7660 412 1.0531 0.3247 1.0531 1.0262
No log 8.8085 414 1.2067 0.2280 1.2067 1.0985
No log 8.8511 416 1.2634 0.1709 1.2634 1.1240
No log 8.8936 418 1.1919 0.2280 1.1919 1.0917
No log 8.9362 420 1.1056 0.3264 1.1056 1.0515
No log 8.9787 422 0.9964 0.3933 0.9964 0.9982
No log 9.0213 424 0.9579 0.4064 0.9579 0.9787
No log 9.0638 426 0.9917 0.3346 0.9917 0.9958
No log 9.1064 428 1.0026 0.3022 1.0026 1.0013
No log 9.1489 430 1.0385 0.3102 1.0385 1.0191
No log 9.1915 432 1.0615 0.3337 1.0615 1.0303
No log 9.2340 434 1.1246 0.2732 1.1246 1.0605
No log 9.2766 436 1.2480 0.1248 1.2480 1.1171
No log 9.3191 438 1.2607 0.1450 1.2607 1.1228
No log 9.3617 440 1.2139 0.1973 1.2139 1.1018
No log 9.4043 442 1.1017 0.2935 1.1017 1.0496
No log 9.4468 444 1.0263 0.4076 1.0263 1.0130
No log 9.4894 446 1.0438 0.4279 1.0438 1.0217
No log 9.5319 448 1.1576 0.2358 1.1576 1.0759
No log 9.5745 450 1.2186 0.1740 1.2186 1.1039
No log 9.6170 452 1.1668 0.2665 1.1668 1.0802
No log 9.6596 454 1.0739 0.3264 1.0739 1.0363
No log 9.7021 456 1.0091 0.2886 1.0091 1.0045
No log 9.7447 458 0.9813 0.3110 0.9813 0.9906
No log 9.7872 460 0.9259 0.4080 0.9259 0.9623
No log 9.8298 462 0.8468 0.3847 0.8468 0.9202
No log 9.8723 464 0.8105 0.3981 0.8105 0.9003
No log 9.9149 466 0.8086 0.4203 0.8086 0.8992
No log 9.9574 468 0.8024 0.4123 0.8024 0.8958
No log 10.0 470 0.8118 0.4123 0.8118 0.9010
No log 10.0426 472 0.8185 0.4123 0.8185 0.9047
No log 10.0851 474 0.9168 0.4208 0.9168 0.9575
No log 10.1277 476 0.9510 0.4208 0.9510 0.9752
No log 10.1702 478 0.9112 0.4208 0.9112 0.9545
No log 10.2128 480 0.8328 0.3425 0.8328 0.9126
No log 10.2553 482 0.7718 0.4219 0.7718 0.8785
No log 10.2979 484 0.7653 0.3789 0.7653 0.8748
No log 10.3404 486 0.7807 0.3819 0.7807 0.8836
No log 10.3830 488 0.8104 0.3675 0.8104 0.9002
No log 10.4255 490 0.8710 0.4154 0.8710 0.9333
No log 10.4681 492 0.9339 0.4328 0.9339 0.9664
No log 10.5106 494 0.9453 0.4414 0.9453 0.9723
No log 10.5532 496 0.9646 0.4076 0.9646 0.9822
No log 10.5957 498 0.9088 0.4414 0.9088 0.9533
0.3042 10.6383 500 0.8416 0.3960 0.8416 0.9174
0.3042 10.6809 502 0.8247 0.3960 0.8247 0.9081
0.3042 10.7234 504 0.8651 0.4400 0.8651 0.9301
0.3042 10.7660 506 0.9048 0.4414 0.9048 0.9512
0.3042 10.8085 508 0.9238 0.4414 0.9238 0.9612
0.3042 10.8511 510 0.9334 0.4414 0.9334 0.9661
0.3042 10.8936 512 0.8811 0.4328 0.8811 0.9387
0.3042 10.9362 514 0.7900 0.3675 0.7900 0.8888
0.3042 10.9787 516 0.7412 0.3972 0.7412 0.8609
0.3042 11.0213 518 0.7473 0.4470 0.7473 0.8645
0.3042 11.0638 520 0.8218 0.3918 0.8218 0.9065
0.3042 11.1064 522 0.9478 0.3870 0.9478 0.9735
0.3042 11.1489 524 1.1090 0.2459 1.1090 1.0531
0.3042 11.1915 526 1.2162 0.2121 1.2162 1.1028
0.3042 11.2340 528 1.1775 0.2294 1.1775 1.0851
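Although num_epochs is set to 100, the log ends at epoch 11.23 (step 528), and that final row matches the reported evaluation results, so training appears to have stopped there. The logged epoch/step pairs also let one estimate the training-set size (an inference; the card does not state it):

```python
# Step 2 corresponds to epoch 0.0426, so one epoch is about
# 2 / 0.0426 ≈ 47 optimizer steps; with train_batch_size = 8 that
# implies roughly 47 * 8 ≈ 376 training examples.
steps_per_epoch = round(2 / 0.0426)
approx_train_size = steps_per_epoch * 8
```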

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1