ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k3_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the training data is not documented in this card). It achieves the following results on the evaluation set (a usage sketch follows the list):

  • Loss: 1.0531
  • Qwk: 0.1579
  • Mse: 1.0531
  • Rmse: 1.0262
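
The card does not document the expected input or the prediction head, but the MSE/RMSE/QWK metrics above suggest a single-output regression head that scores the organization trait of an Arabic essay. The snippet below is a minimal inference sketch under that assumption; the rounding of the raw score to an integer band is purely illustrative and not part of the released model.

```python
# Minimal inference sketch.
# Assumptions (not documented in the card): the checkpoint is a single-output
# regression head, and scores are meaningful when rounded to integer bands.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k3_task5_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # an Arabic essay to score for organization

inputs = tokenizer(essay, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # expected shape (1, 1) for a regression head

score = logits.squeeze().item()
print(f"predicted organization score: {score:.3f} (rounded: {round(score)})")
```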

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent Trainer configuration is sketched after the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
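
These hyperparameters correspond to a standard Hugging Face Trainer run. The sketch below reconstructs an equivalent TrainingArguments configuration under that assumption; the output directory, the num_labels=1 regression head, and the evaluation/logging step values (inferred from the results table, which evaluates every 2 steps and reports the first training loss at step 500) are assumptions, and the datasets are left as undocumented placeholders.

```python
# Sketch of an equivalent training setup (assumptions noted inline).
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

base_model = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base_model)
# num_labels=1 (regression) is an assumption based on the MSE/RMSE/QWK metrics.
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=1)

args = TrainingArguments(
    output_dir="arabert_task5_organization",  # assumed name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",  # the results table evaluates every 2 steps (0.25 epoch)
    eval_steps=2,
    logging_steps=500,      # training loss first appears at step 500 ("No log" before)
)
# The optimizer in the card (Adam, betas=(0.9, 0.999), epsilon=1e-08) matches
# the Trainer's default settings, so it needs no explicit argument here.

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=None,  # placeholder: the fine-tuning dataset is not documented
    eval_dataset=None,   # placeholder: the evaluation split is not documented
    tokenizer=tokenizer,
)
# trainer.train()
```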

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.25 2 4.1514 0.0182 4.1514 2.0375
No log 0.5 4 2.2956 0.0542 2.2956 1.5151
No log 0.75 6 1.7754 0.0318 1.7754 1.3324
No log 1.0 8 1.7704 0.0639 1.7704 1.3306
No log 1.25 10 1.3770 0.1033 1.3770 1.1735
No log 1.5 12 1.0566 0.3082 1.0566 1.0279
No log 1.75 14 1.1772 0.0938 1.1772 1.0850
No log 2.0 16 1.2745 0.0201 1.2745 1.1290
No log 2.25 18 1.2741 -0.0064 1.2741 1.1288
No log 2.5 20 1.3257 -0.0212 1.3257 1.1514
No log 2.75 22 1.3142 0.0075 1.3142 1.1464
No log 3.0 24 1.2565 0.0489 1.2565 1.1209
No log 3.25 26 1.2373 0.0065 1.2373 1.1124
No log 3.5 28 1.2798 0.1091 1.2798 1.1313
No log 3.75 30 1.2768 0.0462 1.2768 1.1299
No log 4.0 32 1.1643 0.1576 1.1643 1.0790
No log 4.25 34 1.1125 0.1997 1.1125 1.0548
No log 4.5 36 1.1408 0.1498 1.1408 1.0681
No log 4.75 38 1.1909 0.0584 1.1909 1.0913
No log 5.0 40 1.1800 0.1805 1.1800 1.0863
No log 5.25 42 1.1552 0.1725 1.1552 1.0748
No log 5.5 44 1.1724 0.0823 1.1724 1.0828
No log 5.75 46 1.2622 -0.0112 1.2622 1.1235
No log 6.0 48 1.2964 -0.0833 1.2964 1.1386
No log 6.25 50 1.3280 -0.0255 1.3280 1.1524
No log 6.5 52 1.3575 -0.0112 1.3575 1.1651
No log 6.75 54 1.4735 -0.1798 1.4735 1.2139
No log 7.0 56 1.5852 -0.1798 1.5852 1.2590
No log 7.25 58 1.7251 -0.0541 1.7251 1.3134
No log 7.5 60 1.6423 0.0667 1.6423 1.2815
No log 7.75 62 1.4021 0.1663 1.4021 1.1841
No log 8.0 64 1.3872 0.0978 1.3872 1.1778
No log 8.25 66 1.5745 0.1442 1.5745 1.2548
No log 8.5 68 1.5529 0.1545 1.5529 1.2461
No log 8.75 70 1.3818 0.0920 1.3818 1.1755
No log 9.0 72 1.3118 0.0909 1.3118 1.1453
No log 9.25 74 1.2994 0.0817 1.2994 1.1399
No log 9.5 76 1.3029 0.1275 1.3029 1.1415
No log 9.75 78 1.2322 0.2125 1.2322 1.1100
No log 10.0 80 1.2489 0.1135 1.2489 1.1175
No log 10.25 82 1.2458 0.2340 1.2458 1.1162
No log 10.5 84 1.3390 0.1962 1.3390 1.1571
No log 10.75 86 1.5495 0.2391 1.5495 1.2448
No log 11.0 88 1.4669 0.2126 1.4669 1.2112
No log 11.25 90 1.2925 0.1053 1.2925 1.1369
No log 11.5 92 1.1472 0.0954 1.1472 1.0711
No log 11.75 94 1.0923 0.2492 1.0923 1.0451
No log 12.0 96 1.0790 0.3129 1.0790 1.0388
No log 12.25 98 1.1706 0.2250 1.1706 1.0819
No log 12.5 100 1.3731 0.2126 1.3731 1.1718
No log 12.75 102 1.3626 0.2424 1.3626 1.1673
No log 13.0 104 1.2653 0.2687 1.2653 1.1249
No log 13.25 106 1.1905 0.1598 1.1905 1.0911
No log 13.5 108 1.0744 0.2887 1.0744 1.0365
No log 13.75 110 1.0515 0.2887 1.0515 1.0254
No log 14.0 112 1.0941 0.2815 1.0941 1.0460
No log 14.25 114 1.0987 0.3099 1.0987 1.0482
No log 14.5 116 1.1546 0.1573 1.1546 1.0745
No log 14.75 118 1.2114 0.2206 1.2114 1.1006
No log 15.0 120 1.2237 0.1898 1.2237 1.1062
No log 15.25 122 1.1477 0.1500 1.1477 1.0713
No log 15.5 124 1.1833 0.1581 1.1833 1.0878
No log 15.75 126 1.2984 0.1652 1.2984 1.1395
No log 16.0 128 1.3075 0.1713 1.3075 1.1435
No log 16.25 130 1.3349 0.1713 1.3349 1.1554
No log 16.5 132 1.2755 0.2049 1.2755 1.1294
No log 16.75 134 1.2436 0.1928 1.2436 1.1152
No log 17.0 136 1.2142 0.1793 1.2142 1.1019
No log 17.25 138 1.1312 0.1863 1.1312 1.0636
No log 17.5 140 1.1367 0.1863 1.1367 1.0662
No log 17.75 142 1.1191 0.1863 1.1191 1.0579
No log 18.0 144 1.0993 0.1282 1.0993 1.0485
No log 18.25 146 1.0578 0.1823 1.0578 1.0285
No log 18.5 148 1.0274 0.2263 1.0274 1.0136
No log 18.75 150 1.1360 0.2195 1.1360 1.0658
No log 19.0 152 1.2417 0.2520 1.2417 1.1143
No log 19.25 154 1.2228 0.3266 1.2228 1.1058
No log 19.5 156 1.1214 0.2768 1.1214 1.0589
No log 19.75 158 1.0422 0.3577 1.0422 1.0209
No log 20.0 160 1.0161 0.2963 1.0161 1.0080
No log 20.25 162 0.9875 0.2647 0.9875 0.9937
No log 20.5 164 1.0387 0.0220 1.0387 1.0191
No log 20.75 166 1.1579 0.2149 1.1579 1.0761
No log 21.0 168 1.2362 0.2315 1.2362 1.1119
No log 21.25 170 1.1757 0.1770 1.1757 1.0843
No log 21.5 172 1.0802 0.0602 1.0802 1.0393
No log 21.75 174 1.0827 0.0602 1.0827 1.0405
No log 22.0 176 1.1773 0.1793 1.1773 1.0850
No log 22.25 178 1.1951 0.2177 1.1951 1.0932
No log 22.5 180 1.1607 0.2410 1.1607 1.0773
No log 22.75 182 1.1498 0.2026 1.1498 1.0723
No log 23.0 184 1.0364 0.2704 1.0364 1.0181
No log 23.25 186 0.9567 0.2359 0.9567 0.9781
No log 23.5 188 0.9647 0.2066 0.9647 0.9822
No log 23.75 190 1.0156 0.1434 1.0156 1.0078
No log 24.0 192 1.1838 0.2542 1.1838 1.0880
No log 24.25 194 1.2274 0.2313 1.2274 1.1079
No log 24.5 196 1.1391 0.2667 1.1391 1.0673
No log 24.75 198 1.0426 0.3590 1.0426 1.0211
No log 25.0 200 1.0644 0.3480 1.0644 1.0317
No log 25.25 202 1.1148 0.2837 1.1148 1.0558
No log 25.5 204 1.1469 0.2089 1.1469 1.0709
No log 25.75 206 1.0644 0.1202 1.0644 1.0317
No log 26.0 208 0.9957 0.0864 0.9957 0.9979
No log 26.25 210 0.9776 0.0864 0.9776 0.9887
No log 26.5 212 0.9758 0.1873 0.9758 0.9878
No log 26.75 214 0.9758 0.2553 0.9758 0.9878
No log 27.0 216 0.9892 0.3107 0.9892 0.9946
No log 27.25 218 1.0067 0.3103 1.0067 1.0033
No log 27.5 220 1.0488 0.3692 1.0488 1.0241
No log 27.75 222 1.1165 0.3396 1.1165 1.0566
No log 28.0 224 1.1136 0.3040 1.1136 1.0553
No log 28.25 226 1.0775 0.2359 1.0775 1.0380
No log 28.5 228 0.9833 0.2623 0.9833 0.9916
No log 28.75 230 0.9273 0.3214 0.9273 0.9630
No log 29.0 232 0.9276 0.3236 0.9276 0.9631
No log 29.25 234 0.9830 0.1582 0.9830 0.9915
No log 29.5 236 1.0899 0.1587 1.0899 1.0440
No log 29.75 238 1.1393 0.2772 1.1393 1.0674
No log 30.0 240 1.1492 0.2953 1.1492 1.0720
No log 30.25 242 1.1307 0.2479 1.1307 1.0633
No log 30.5 244 1.0693 0.2593 1.0693 1.0341
No log 30.75 246 1.0412 0.2432 1.0412 1.0204
No log 31.0 248 1.0546 0.2296 1.0546 1.0269
No log 31.25 250 1.0902 0.2489 1.0902 1.0441
No log 31.5 252 1.0731 0.2857 1.0731 1.0359
No log 31.75 254 1.0686 0.2897 1.0686 1.0337
No log 32.0 256 1.0212 0.2298 1.0212 1.0105
No log 32.25 258 0.9843 0.2674 0.9843 0.9921
No log 32.5 260 0.9747 0.2263 0.9747 0.9873
No log 32.75 262 1.0068 0.1474 1.0068 1.0034
No log 33.0 264 1.0514 0.1379 1.0514 1.0254
No log 33.25 266 1.1033 0.1986 1.1033 1.0504
No log 33.5 268 1.0862 0.3000 1.0862 1.0422
No log 33.75 270 1.0507 0.3619 1.0507 1.0251
No log 34.0 272 0.9602 0.2770 0.9602 0.9799
No log 34.25 274 0.9529 0.3161 0.9529 0.9762
No log 34.5 276 0.9764 0.2179 0.9764 0.9881
No log 34.75 278 1.0874 0.2750 1.0874 1.0428
No log 35.0 280 1.2338 0.2424 1.2338 1.1107
No log 35.25 282 1.2571 0.2065 1.2571 1.1212
No log 35.5 284 1.1986 0.1407 1.1986 1.0948
No log 35.75 286 1.0788 0.0931 1.0788 1.0387
No log 36.0 288 0.9637 0.1446 0.9637 0.9817
No log 36.25 290 0.9087 0.3753 0.9087 0.9532
No log 36.5 292 0.9103 0.3198 0.9103 0.9541
No log 36.75 294 0.9398 0.2919 0.9398 0.9694
No log 37.0 296 0.9681 0.2529 0.9681 0.9839
No log 37.25 298 1.0548 0.1379 1.0548 1.0270
No log 37.5 300 1.1001 0.1986 1.1001 1.0488
No log 37.75 302 1.1435 0.1770 1.1435 1.0693
No log 38.0 304 1.1207 0.0811 1.1207 1.0586
No log 38.25 306 1.0746 0.0811 1.0746 1.0366
No log 38.5 308 1.0524 0.0587 1.0524 1.0259
No log 38.75 310 1.0433 0.0587 1.0433 1.0214
No log 39.0 312 1.0467 0.0587 1.0467 1.0231
No log 39.25 314 1.1214 0.0811 1.1214 1.0590
No log 39.5 316 1.1960 0.2089 1.1960 1.0936
No log 39.75 318 1.2147 0.2495 1.2147 1.1021
No log 40.0 320 1.2441 0.2772 1.2441 1.1154
No log 40.25 322 1.2157 0.3071 1.2157 1.1026
No log 40.5 324 1.2026 0.3071 1.2026 1.0966
No log 40.75 326 1.1172 0.1793 1.1172 1.0570
No log 41.0 328 1.0453 0.1564 1.0453 1.0224
No log 41.25 330 1.0530 0.0990 1.0530 1.0261
No log 41.5 332 1.1006 0.0961 1.1006 1.0491
No log 41.75 334 1.1459 0.0811 1.1459 1.0705
No log 42.0 336 1.1445 0.0811 1.1445 1.0698
No log 42.25 338 1.0994 0.0961 1.0994 1.0485
No log 42.5 340 1.0730 0.0618 1.0730 1.0359
No log 42.75 342 1.0313 0.1823 1.0313 1.0155
No log 43.0 344 1.0069 0.2529 1.0069 1.0035
No log 43.25 346 0.9820 0.2553 0.9820 0.9909
No log 43.5 348 0.9787 0.2553 0.9787 0.9893
No log 43.75 350 0.9960 0.2674 0.9960 0.9980
No log 44.0 352 1.0412 0.1823 1.0412 1.0204
No log 44.25 354 1.0481 0.1823 1.0481 1.0238
No log 44.5 356 1.0288 0.2238 1.0288 1.0143
No log 44.75 358 1.0558 0.1823 1.0558 1.0275
No log 45.0 360 1.0514 0.1823 1.0514 1.0254
No log 45.25 362 1.0321 0.2238 1.0321 1.0159
No log 45.5 364 0.9793 0.2553 0.9793 0.9896
No log 45.75 366 0.9219 0.3372 0.9219 0.9602
No log 46.0 368 0.9072 0.4327 0.9072 0.9525
No log 46.25 370 0.8699 0.3902 0.8699 0.9327
No log 46.5 372 0.8622 0.3902 0.8622 0.9285
No log 46.75 374 0.8722 0.4468 0.8722 0.9339
No log 47.0 376 0.9303 0.3372 0.9303 0.9645
No log 47.25 378 1.0165 0.3059 1.0165 1.0082
No log 47.5 380 1.0268 0.3098 1.0268 1.0133
No log 47.75 382 0.9678 0.2117 0.9678 0.9838
No log 48.0 384 0.9004 0.3476 0.9004 0.9489
No log 48.25 386 0.8793 0.3902 0.8793 0.9377
No log 48.5 388 0.8800 0.3902 0.8800 0.9381
No log 48.75 390 0.9047 0.3777 0.9047 0.9511
No log 49.0 392 0.9519 0.2263 0.9519 0.9756
No log 49.25 394 1.0032 0.2627 1.0032 1.0016
No log 49.5 396 0.9973 0.2627 0.9973 0.9986
No log 49.75 398 0.9449 0.1927 0.9449 0.9721
No log 50.0 400 0.9255 0.3192 0.9255 0.9620
No log 50.25 402 0.9027 0.3326 0.9027 0.9501
No log 50.5 404 0.8927 0.3476 0.8927 0.9448
No log 50.75 406 0.8942 0.4051 0.8942 0.9456
No log 51.0 408 0.9048 0.3326 0.9048 0.9512
No log 51.25 410 0.9107 0.3067 0.9107 0.9543
No log 51.5 412 0.9396 0.3067 0.9396 0.9693
No log 51.75 414 0.9757 0.3140 0.9757 0.9878
No log 52.0 416 0.9725 0.3806 0.9725 0.9862
No log 52.25 418 0.9573 0.3782 0.9573 0.9784
No log 52.5 420 0.9435 0.3192 0.9435 0.9713
No log 52.75 422 0.9434 0.3317 0.9434 0.9713
No log 53.0 424 0.9642 0.3295 0.9642 0.9819
No log 53.25 426 0.9859 0.2724 0.9859 0.9929
No log 53.5 428 1.0308 0.1446 1.0308 1.0153
No log 53.75 430 1.0457 0.1017 1.0457 1.0226
No log 54.0 432 1.0826 0.1579 1.0826 1.0405
No log 54.25 434 1.1024 0.1911 1.1024 1.0500
No log 54.5 436 1.1653 0.2149 1.1653 1.0795
No log 54.75 438 1.1779 0.2015 1.1779 1.0853
No log 55.0 440 1.1561 0.2149 1.1561 1.0752
No log 55.25 442 1.0898 0.2227 1.0898 1.0439
No log 55.5 444 1.0507 0.1724 1.0507 1.0250
No log 55.75 446 1.0275 0.1017 1.0275 1.0136
No log 56.0 448 1.0299 0.1017 1.0299 1.0149
No log 56.25 450 1.0183 0.0496 1.0183 1.0091
No log 56.5 452 1.0434 0.1379 1.0434 1.0215
No log 56.75 454 1.0815 0.2308 1.0815 1.0399
No log 57.0 456 1.0995 0.2227 1.0995 1.0486
No log 57.25 458 1.1325 0.2089 1.1325 1.0642
No log 57.5 460 1.1429 0.2089 1.1429 1.0691
No log 57.75 462 1.1198 0.2089 1.1198 1.0582
No log 58.0 464 1.0849 0.0811 1.0849 1.0416
No log 58.25 466 1.0289 0.0587 1.0289 1.0144
No log 58.5 468 0.9952 0.1873 0.9952 0.9976
No log 58.75 470 0.9746 0.2651 0.9746 0.9872
No log 59.0 472 0.9760 0.3672 0.9760 0.9879
No log 59.25 474 0.9910 0.2986 0.9910 0.9955
No log 59.5 476 1.0294 0.3373 1.0294 1.0146
No log 59.75 478 1.0421 0.2667 1.0421 1.0208
No log 60.0 480 1.0412 0.1838 1.0412 1.0204
No log 60.25 482 1.0079 0.1017 1.0079 1.0039
No log 60.5 484 0.9749 0.1873 0.9749 0.9874
No log 60.75 486 0.9446 0.3147 0.9446 0.9719
No log 61.0 488 0.9384 0.3147 0.9384 0.9687
No log 61.25 490 0.9476 0.3147 0.9476 0.9734
No log 61.5 492 0.9389 0.3569 0.9389 0.9690
No log 61.75 494 0.9660 0.2299 0.9660 0.9829
No log 62.0 496 1.0009 0.1797 1.0009 1.0004
No log 62.25 498 1.0491 0.2227 1.0491 1.0243
0.2169 62.5 500 1.0618 0.1579 1.0618 1.0304
0.2169 62.75 502 1.0489 0.0587 1.0489 1.0242
0.2169 63.0 504 1.0481 0.0961 1.0481 1.0238
0.2169 63.25 506 1.0513 0.0961 1.0513 1.0253
0.2169 63.5 508 1.0428 0.1579 1.0428 1.0212
0.2169 63.75 510 1.0531 0.1579 1.0531 1.0262
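
For reference, Qwk in the table is quadratic weighted kappa, and Rmse is the square root of Mse (the Validation Loss and Mse columns coincide, consistent with an MSE-trained regression head). The card does not include the metric code; the sketch below shows one common way to compute these three numbers with scikit-learn, assuming integer gold score bands and real-valued predictions that are rounded before computing kappa.

```python
# Sketch of the reported evaluation metrics (QWK, MSE, RMSE).
# Assumption: gold labels are integer score bands; predictions are real-valued
# and rounded to the nearest integer before quadratic weighted kappa.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(predictions: np.ndarray, labels: np.ndarray) -> dict:
    mse = mean_squared_error(labels, predictions)
    rounded = np.rint(predictions).astype(int)
    qwk = cohen_kappa_score(labels.astype(int), rounded, weights="quadratic")
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}

# Dummy example (values are not from the actual evaluation set):
preds = np.array([1.8, 2.2, 3.1, 0.9])
gold = np.array([2, 2, 4, 1])
print(compute_metrics(preds, gold))
```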

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1