ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k10_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9812
  • Qwk (Quadratic Weighted Kappa): 0.2871
  • Mse (Mean Squared Error): 0.9812
  • Rmse (Root Mean Squared Error): 0.9906
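For reference, metrics of this form can be computed from model predictions with scikit-learn. The labels below are made-up placeholders for illustration, not this model's actual outputs. (That Loss and Mse coincide in this card suggests the model was trained with an MSE objective, i.e. as a regressor over ordinal scores.)

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Made-up gold and predicted essay-organization scores (placeholders only).
y_true = np.array([0, 1, 2, 3, 4, 2, 1, 3])
y_pred = np.array([0, 2, 2, 3, 3, 1, 1, 4])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Quadratic Weighted Kappa
mse = mean_squared_error(y_true, y_pred)                      # Mean Squared Error
rmse = float(np.sqrt(mse))                                    # Root MSE: sqrt(mse)
print(f"QWK={qwk:.4f} MSE={mse:.4f} RMSE={rmse:.4f}")
```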

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0690 2 4.6454 -0.0020 4.6454 2.1553
No log 0.1379 4 2.6055 0.0025 2.6055 1.6141
No log 0.2069 6 2.1625 0.0143 2.1625 1.4706
No log 0.2759 8 2.0562 0.0324 2.0562 1.4340
No log 0.3448 10 1.5247 0.0182 1.5247 1.2348
No log 0.4138 12 1.2363 0.1532 1.2363 1.1119
No log 0.4828 14 1.1886 0.1379 1.1886 1.0902
No log 0.5517 16 1.1939 0.1711 1.1939 1.0927
No log 0.6207 18 1.3784 0.0254 1.3784 1.1741
No log 0.6897 20 1.5929 0.0 1.5929 1.2621
No log 0.7586 22 1.4892 0.0 1.4892 1.2203
No log 0.8276 24 1.2276 0.1542 1.2276 1.1080
No log 0.8966 26 1.1762 0.2142 1.1762 1.0846
No log 0.9655 28 1.1921 0.1979 1.1921 1.0919
No log 1.0345 30 1.2465 0.1814 1.2465 1.1165
No log 1.1034 32 1.5569 -0.0535 1.5570 1.2478
No log 1.1724 34 1.7020 0.0793 1.7020 1.3046
No log 1.2414 36 1.7523 0.0033 1.7523 1.3238
No log 1.3103 38 1.5028 0.1542 1.5028 1.2259
No log 1.3793 40 1.3188 0.0654 1.3188 1.1484
No log 1.4483 42 1.2829 0.1335 1.2829 1.1327
No log 1.5172 44 1.2487 0.1556 1.2487 1.1175
No log 1.5862 46 1.2460 0.2342 1.2460 1.1162
No log 1.6552 48 1.4588 0.1772 1.4588 1.2078
No log 1.7241 50 1.5076 0.1196 1.5076 1.2279
No log 1.7931 52 1.2344 0.1865 1.2344 1.1110
No log 1.8621 54 1.0707 0.3603 1.0707 1.0347
No log 1.9310 56 1.0508 0.3299 1.0508 1.0251
No log 2.0 58 1.0824 0.1752 1.0824 1.0404
No log 2.0690 60 1.0970 0.1752 1.0970 1.0474
No log 2.1379 62 1.1368 0.2835 1.1368 1.0662
No log 2.2069 64 1.3277 0.0838 1.3277 1.1523
No log 2.2759 66 1.4978 0.0723 1.4978 1.2238
No log 2.3448 68 1.4253 0.0811 1.4253 1.1939
No log 2.4138 70 1.2429 0.0688 1.2429 1.1149
No log 2.4828 72 1.1292 0.2935 1.1292 1.0627
No log 2.5517 74 1.0813 0.4072 1.0813 1.0399
No log 2.6207 76 1.0692 0.3921 1.0692 1.0340
No log 2.6897 78 1.0735 0.3902 1.0735 1.0361
No log 2.7586 80 1.1463 0.2690 1.1463 1.0707
No log 2.8276 82 1.2222 0.2675 1.2222 1.1055
No log 2.8966 84 1.2231 0.2346 1.2231 1.1059
No log 2.9655 86 1.2901 0.2019 1.2901 1.1358
No log 3.0345 88 1.2286 0.2346 1.2286 1.1084
No log 3.1034 90 1.0990 0.3031 1.0990 1.0483
No log 3.1724 92 1.0551 0.3855 1.0551 1.0272
No log 3.2414 94 1.1352 0.3018 1.1352 1.0655
No log 3.3103 96 1.1980 0.3530 1.1980 1.0945
No log 3.3793 98 1.0882 0.3633 1.0882 1.0432
No log 3.4483 100 1.0805 0.4050 1.0805 1.0395
No log 3.5172 102 1.1053 0.3897 1.1053 1.0513
No log 3.5862 104 1.0485 0.3985 1.0485 1.0239
No log 3.6552 106 1.0682 0.3985 1.0682 1.0335
No log 3.7241 108 1.0651 0.4215 1.0651 1.0320
No log 3.7931 110 1.0847 0.4215 1.0847 1.0415
No log 3.8621 112 1.0786 0.3923 1.0786 1.0386
No log 3.9310 114 1.0963 0.3460 1.0963 1.0470
No log 4.0 116 1.1557 0.3469 1.1557 1.0750
No log 4.0690 118 1.0703 0.3590 1.0703 1.0346
No log 4.1379 120 1.0679 0.3328 1.0679 1.0334
No log 4.2069 122 1.0714 0.4036 1.0714 1.0351
No log 4.2759 124 0.9803 0.4290 0.9803 0.9901
No log 4.3448 126 0.9624 0.4417 0.9624 0.9810
No log 4.4138 128 1.0460 0.4677 1.0460 1.0228
No log 4.4828 130 1.0331 0.4431 1.0331 1.0164
No log 4.5517 132 1.1246 0.3560 1.1246 1.0605
No log 4.6207 134 1.1052 0.3941 1.1052 1.0513
No log 4.6897 136 0.9622 0.4527 0.9622 0.9809
No log 4.7586 138 0.9951 0.4297 0.9951 0.9975
No log 4.8276 140 0.9954 0.4297 0.9954 0.9977
No log 4.8966 142 0.9700 0.4349 0.9700 0.9849
No log 4.9655 144 1.1268 0.3484 1.1268 1.0615
No log 5.0345 146 1.4789 0.3025 1.4789 1.2161
No log 5.1034 148 1.5972 0.2796 1.5972 1.2638
No log 5.1724 150 1.2763 0.3792 1.2763 1.1297
No log 5.2414 152 0.9555 0.5121 0.9555 0.9775
No log 5.3103 154 0.9434 0.4814 0.9434 0.9713
No log 5.3793 156 0.9733 0.4382 0.9733 0.9866
No log 5.4483 158 1.0754 0.4974 1.0754 1.0370
No log 5.5172 160 1.2535 0.4261 1.2535 1.1196
No log 5.5862 162 1.3731 0.3876 1.3731 1.1718
No log 5.6552 164 1.1898 0.3684 1.1898 1.0908
No log 5.7241 166 1.1178 0.3884 1.1178 1.0573
No log 5.7931 168 1.0082 0.4291 1.0082 1.0041
No log 5.8621 170 0.9742 0.4074 0.9742 0.9870
No log 5.9310 172 1.0386 0.3667 1.0386 1.0191
No log 6.0 174 1.0539 0.3413 1.0539 1.0266
No log 6.0690 176 0.9975 0.4007 0.9975 0.9987
No log 6.1379 178 0.8887 0.4459 0.8887 0.9427
No log 6.2069 180 0.9239 0.5060 0.9239 0.9612
No log 6.2759 182 1.1101 0.4583 1.1101 1.0536
No log 6.3448 184 1.1484 0.4473 1.1484 1.0716
No log 6.4138 186 0.9975 0.3980 0.9975 0.9988
No log 6.4828 188 0.9222 0.5176 0.9222 0.9603
No log 6.5517 190 0.9297 0.4697 0.9297 0.9642
No log 6.6207 192 0.9419 0.4685 0.9419 0.9705
No log 6.6897 194 0.8753 0.5264 0.8753 0.9356
No log 6.7586 196 0.8870 0.5147 0.8870 0.9418
No log 6.8276 198 1.0631 0.3921 1.0631 1.0310
No log 6.8966 200 1.3543 0.3209 1.3543 1.1637
No log 6.9655 202 1.3896 0.3099 1.3896 1.1788
No log 7.0345 204 1.1842 0.3411 1.1842 1.0882
No log 7.1034 206 0.9025 0.5029 0.9025 0.9500
No log 7.1724 208 0.8516 0.5175 0.8516 0.9228
No log 7.2414 210 0.8578 0.4498 0.8578 0.9262
No log 7.3103 212 0.8708 0.4927 0.8708 0.9332
No log 7.3793 214 1.0024 0.4289 1.0024 1.0012
No log 7.4483 216 1.1952 0.3024 1.1952 1.0932
No log 7.5172 218 1.2578 0.3134 1.2578 1.1215
No log 7.5862 220 1.1634 0.2798 1.1634 1.0786
No log 7.6552 222 1.0218 0.4127 1.0218 1.0108
No log 7.7241 224 0.9439 0.4300 0.9439 0.9715
No log 7.7931 226 0.9506 0.3931 0.9506 0.9750
No log 7.8621 228 0.9542 0.3434 0.9542 0.9768
No log 7.9310 230 1.0130 0.3996 1.0130 1.0065
No log 8.0 232 1.1342 0.2634 1.1342 1.0650
No log 8.0690 234 1.1370 0.3024 1.1370 1.0663
No log 8.1379 236 1.0385 0.4281 1.0385 1.0191
No log 8.2069 238 0.9434 0.4998 0.9434 0.9713
No log 8.2759 240 0.9229 0.4780 0.9229 0.9607
No log 8.3448 242 0.9567 0.4611 0.9567 0.9781
No log 8.4138 244 1.0112 0.3697 1.0112 1.0056
No log 8.4828 246 1.0053 0.3786 1.0053 1.0026
No log 8.5517 248 1.0079 0.3645 1.0079 1.0040
No log 8.6207 250 0.9722 0.4611 0.9722 0.9860
No log 8.6897 252 0.9469 0.4611 0.9469 0.9731
No log 8.7586 254 0.9247 0.4611 0.9247 0.9616
No log 8.8276 256 0.9339 0.4145 0.9339 0.9664
No log 8.8966 258 0.9603 0.4572 0.9603 0.9799
No log 8.9655 260 1.0555 0.4780 1.0555 1.0274
No log 9.0345 262 1.0121 0.4734 1.0121 1.0061
No log 9.1034 264 0.9867 0.4454 0.9867 0.9933
No log 9.1724 266 0.9310 0.3919 0.9310 0.9649
No log 9.2414 268 0.9304 0.4381 0.9304 0.9646
No log 9.3103 270 0.9363 0.4242 0.9363 0.9676
No log 9.3793 272 0.9350 0.4381 0.9350 0.9670
No log 9.4483 274 0.9856 0.3787 0.9856 0.9928
No log 9.5172 276 1.0653 0.3579 1.0653 1.0321
No log 9.5862 278 1.0875 0.3109 1.0875 1.0428
No log 9.6552 280 1.0255 0.2976 1.0255 1.0127
No log 9.7241 282 1.0275 0.2716 1.0275 1.0137
No log 9.7931 284 1.0504 0.2930 1.0504 1.0249
No log 9.8621 286 1.0406 0.2431 1.0406 1.0201
No log 9.9310 288 1.0355 0.3068 1.0355 1.0176
No log 10.0 290 1.0945 0.2844 1.0945 1.0462
No log 10.0690 292 1.0825 0.3686 1.0825 1.0404
No log 10.1379 294 0.9989 0.3979 0.9989 0.9994
No log 10.2069 296 0.9566 0.4238 0.9566 0.9780
No log 10.2759 298 0.8971 0.4142 0.8971 0.9472
No log 10.3448 300 0.8899 0.3921 0.8899 0.9433
No log 10.4138 302 0.9028 0.4244 0.9028 0.9502
No log 10.4828 304 1.0108 0.3186 1.0108 1.0054
No log 10.5517 306 1.1430 0.2781 1.1430 1.0691
No log 10.6207 308 1.1396 0.3166 1.1396 1.0675
No log 10.6897 310 1.0084 0.3860 1.0084 1.0042
No log 10.7586 312 0.8862 0.4328 0.8862 0.9414
No log 10.8276 314 0.8430 0.5374 0.8430 0.9182
No log 10.8966 316 0.8269 0.5403 0.8269 0.9093
No log 10.9655 318 0.8424 0.5250 0.8424 0.9179
No log 11.0345 320 0.9126 0.4201 0.9126 0.9553
No log 11.1034 322 0.9174 0.4201 0.9174 0.9578
No log 11.1724 324 0.8515 0.5142 0.8515 0.9228
No log 11.2414 326 0.8042 0.5244 0.8042 0.8968
No log 11.3103 328 0.8014 0.5244 0.8014 0.8952
No log 11.3793 330 0.8239 0.5292 0.8239 0.9077
No log 11.4483 332 0.8306 0.4916 0.8306 0.9114
No log 11.5172 334 0.8217 0.4832 0.8217 0.9065
No log 11.5862 336 0.8201 0.4893 0.8201 0.9056
No log 11.6552 338 0.8348 0.4893 0.8348 0.9137
No log 11.7241 340 0.9056 0.4526 0.9056 0.9516
No log 11.7931 342 0.9120 0.4165 0.9120 0.9550
No log 11.8621 344 0.8709 0.4533 0.8709 0.9332
No log 11.9310 346 0.8856 0.4433 0.8856 0.9411
No log 12.0 348 0.9352 0.3418 0.9352 0.9671
No log 12.0690 350 0.9196 0.3564 0.9196 0.9589
No log 12.1379 352 0.8745 0.3708 0.8745 0.9352
No log 12.2069 354 0.8780 0.3708 0.8780 0.9370
No log 12.2759 356 0.9285 0.3564 0.9285 0.9636
No log 12.3448 358 1.0168 0.3993 1.0168 1.0084
No log 12.4138 360 1.1617 0.2851 1.1617 1.0778
No log 12.4828 362 1.1763 0.3317 1.1763 1.0846
No log 12.5517 364 1.0957 0.3596 1.0957 1.0467
No log 12.6207 366 1.0295 0.2975 1.0295 1.0147
No log 12.6897 368 1.1136 0.2195 1.1136 1.0553
No log 12.7586 370 1.1259 0.2108 1.1259 1.0611
No log 12.8276 372 1.0650 0.2922 1.0650 1.0320
No log 12.8966 374 0.9683 0.3202 0.9683 0.9840
No log 12.9655 376 0.9631 0.3489 0.9631 0.9814
No log 13.0345 378 1.0577 0.3864 1.0577 1.0285
No log 13.1034 380 1.2319 0.2742 1.2319 1.1099
No log 13.1724 382 1.2790 0.2742 1.2790 1.1309
No log 13.2414 384 1.1259 0.3046 1.1259 1.0611
No log 13.3103 386 1.0569 0.3502 1.0569 1.0281
No log 13.3793 388 0.9491 0.4657 0.9491 0.9742
No log 13.4483 390 0.8787 0.4278 0.8787 0.9374
No log 13.5172 392 0.8904 0.4039 0.8904 0.9436
No log 13.5862 394 0.9532 0.4337 0.9532 0.9763
No log 13.6552 396 1.0400 0.2681 1.0400 1.0198
No log 13.7241 398 1.0195 0.2606 1.0195 1.0097
No log 13.7931 400 0.9747 0.4067 0.9747 0.9873
No log 13.8621 402 0.9526 0.4337 0.9526 0.9760
No log 13.9310 404 0.9167 0.4142 0.9167 0.9574
No log 14.0 406 0.9272 0.4373 0.9272 0.9629
No log 14.0690 408 0.9582 0.4394 0.9582 0.9789
No log 14.1379 410 1.0906 0.3567 1.0906 1.0443
No log 14.2069 412 1.2278 0.3631 1.2278 1.1081
No log 14.2759 414 1.2282 0.2918 1.2282 1.1082
No log 14.3448 416 1.1210 0.2195 1.1210 1.0588
No log 14.4138 418 1.0509 0.1541 1.0509 1.0251
No log 14.4828 420 0.9799 0.2834 0.9799 0.9899
No log 14.5517 422 0.9641 0.2993 0.9641 0.9819
No log 14.6207 424 0.9793 0.3372 0.9793 0.9896
No log 14.6897 426 0.9730 0.3931 0.9730 0.9864
No log 14.7586 428 0.9929 0.3954 0.9929 0.9965
No log 14.8276 430 1.0831 0.4325 1.0831 1.0407
No log 14.8966 432 1.0915 0.3880 1.0915 1.0447
No log 14.9655 434 0.9623 0.4822 0.9623 0.9810
No log 15.0345 436 0.8803 0.4916 0.8803 0.9382
No log 15.1034 438 0.8672 0.5161 0.8672 0.9312
No log 15.1724 440 0.8546 0.5161 0.8546 0.9245
No log 15.2414 442 0.9053 0.4787 0.9053 0.9515
No log 15.3103 444 1.0094 0.4257 1.0094 1.0047
No log 15.3793 446 1.0016 0.4129 1.0016 1.0008
No log 15.4483 448 0.9198 0.4787 0.9198 0.9591
No log 15.5172 450 0.8535 0.4603 0.8535 0.9238
No log 15.5862 452 0.8457 0.4282 0.8457 0.9196
No log 15.6552 454 0.8737 0.4377 0.8737 0.9347
No log 15.7241 456 0.9529 0.4568 0.9529 0.9762
No log 15.7931 458 1.0016 0.3762 1.0016 1.0008
No log 15.8621 460 1.0164 0.3119 1.0164 1.0081
No log 15.9310 462 0.9893 0.3223 0.9893 0.9947
No log 16.0 464 1.0105 0.2821 1.0105 1.0052
No log 16.0690 466 1.0545 0.2516 1.0545 1.0269
No log 16.1379 468 1.0871 0.1903 1.0871 1.0426
No log 16.2069 470 1.1136 0.2714 1.1136 1.0553
No log 16.2759 472 1.0696 0.3095 1.0696 1.0342
No log 16.3448 474 0.9927 0.3687 0.9927 0.9963
No log 16.4138 476 0.9250 0.3762 0.9250 0.9618
No log 16.4828 478 0.9107 0.4006 0.9107 0.9543
No log 16.5517 480 0.9733 0.4328 0.9733 0.9866
No log 16.6207 482 1.0989 0.4325 1.0989 1.0483
No log 16.6897 484 1.1293 0.4327 1.1293 1.0627
No log 16.7586 486 1.0416 0.4253 1.0416 1.0206
No log 16.8276 488 0.9443 0.3762 0.9443 0.9717
No log 16.8966 490 0.9402 0.3998 0.9402 0.9697
No log 16.9655 492 0.9557 0.3201 0.9557 0.9776
No log 17.0345 494 0.9672 0.3201 0.9672 0.9834
No log 17.1034 496 1.0209 0.2972 1.0209 1.0104
No log 17.1724 498 1.1681 0.2978 1.1681 1.0808
0.4117 17.2414 500 1.2738 0.3500 1.2738 1.1286
0.4117 17.3103 502 1.2142 0.3025 1.2142 1.1019
0.4117 17.3793 504 1.0379 0.3023 1.0379 1.0188
0.4117 17.4483 506 0.9558 0.4553 0.9558 0.9777
0.4117 17.5172 508 0.9767 0.4754 0.9767 0.9883
0.4117 17.5862 510 1.0588 0.3690 1.0588 1.0290
0.4117 17.6552 512 1.0982 0.3854 1.0982 1.0480
0.4117 17.7241 514 1.1581 0.3823 1.1581 1.0761
0.4117 17.7931 516 1.1586 0.4007 1.1586 1.0764
0.4117 17.8621 518 1.1074 0.3166 1.1074 1.0523
0.4117 17.9310 520 1.0762 0.3367 1.0762 1.0374
0.4117 18.0 522 1.0086 0.3097 1.0086 1.0043
0.4117 18.0690 524 0.9722 0.3270 0.9722 0.9860
0.4117 18.1379 526 0.9812 0.2871 0.9812 0.9906

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
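Since the card omits a usage snippet, one possible way to load the checkpoint is shown below. Whether the head is a regressor or a classifier is not documented here, so the use of `AutoModelForSequenceClassification` and the interpretation of its output are assumptions.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumption: the checkpoint exposes a sequence-classification/regression head.
model_id = "MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k10_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "نص مقال تجريبي"  # placeholder Arabic essay text
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits  # score(s) for the organization trait
```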
Model size: 0.1B params (F32, Safetensors)

Model tree for MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k10_task2_organization

Finetuned from aubmindlab/bert-base-arabertv02