ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k11_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the dataset name was not recorded in the training metadata). It achieves the following results on the evaluation set:

  • Loss: 0.9529
  • Qwk: 0.2723
  • Mse: 0.9529
  • Rmse: 0.9762
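Rmse is simply the square root of Mse, and the fact that Loss and Mse coincide suggests the model was trained with an MSE regression objective. The relationship can be checked directly:

```python
import math

mse = 0.9529
rmse = math.sqrt(mse)
print(round(rmse, 4))  # 0.9762, matching the reported Rmse
```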

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
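The linear scheduler decays the learning rate from its initial value to zero over the full training run. A minimal sketch of that schedule, assuming no warmup (no warmup setting is listed above) and using the 26 optimizer steps per epoch visible in the results table below:

```python
def linear_lr(step, total_steps, base_lr=2e-05):
    """Linearly decay the learning rate from base_lr to 0 over total_steps."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# 100 epochs x 26 steps/epoch (epoch 1.0 falls on step 26 in the table below)
total_steps = 100 * 26

print(linear_lr(0, total_steps))            # 2e-05 at the start
print(linear_lr(total_steps, total_steps))  # 0.0 at the end
```

Hugging Face's linear scheduler additionally supports warmup steps; this sketch omits them.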

Training results

Columns: training loss ("No log" indicates the training loss had not yet been reported; the first logged value appears at step 500), epoch, step, validation loss, quadratic weighted kappa (Qwk), mean squared error (Mse), and root mean squared error (Rmse).

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0769 2 2.5880 -0.0262 2.5880 1.6087
No log 0.1538 4 1.3290 0.0511 1.3290 1.1528
No log 0.2308 6 1.0293 -0.1255 1.0293 1.0145
No log 0.3077 8 0.9372 0.0 0.9372 0.9681
No log 0.3846 10 0.9429 -0.0425 0.9429 0.9710
No log 0.4615 12 0.9954 -0.0392 0.9954 0.9977
No log 0.5385 14 0.9280 -0.0070 0.9280 0.9633
No log 0.6154 16 0.9033 0.1504 0.9033 0.9504
No log 0.6923 18 0.9861 0.1962 0.9861 0.9930
No log 0.7692 20 1.0962 -0.0128 1.0962 1.0470
No log 0.8462 22 1.1394 -0.0281 1.1394 1.0674
No log 0.9231 24 1.1856 -0.2126 1.1856 1.0889
No log 1.0 26 1.0757 -0.0472 1.0757 1.0372
No log 1.0769 28 1.0227 0.1217 1.0227 1.0113
No log 1.1538 30 1.0181 0.0982 1.0181 1.0090
No log 1.2308 32 0.9611 0.1542 0.9611 0.9804
No log 1.3077 34 0.9876 0.1918 0.9876 0.9938
No log 1.3846 36 1.0594 -0.0634 1.0594 1.0293
No log 1.4615 38 1.0132 -0.0479 1.0132 1.0066
No log 1.5385 40 0.9048 0.0966 0.9048 0.9512
No log 1.6154 42 0.8778 0.2171 0.8778 0.9369
No log 1.6923 44 0.8321 0.1972 0.8321 0.9122
No log 1.7692 46 0.7792 0.1407 0.7792 0.8827
No log 1.8462 48 0.7523 0.1508 0.7523 0.8673
No log 1.9231 50 0.7500 0.1236 0.7500 0.8661
No log 2.0 52 0.7803 0.0 0.7803 0.8833
No log 2.0769 54 0.8039 0.0 0.8039 0.8966
No log 2.1538 56 0.8265 0.0359 0.8265 0.9091
No log 2.2308 58 0.8628 0.1094 0.8628 0.9288
No log 2.3077 60 0.9631 0.1504 0.9631 0.9814
No log 2.3846 62 1.1027 0.0487 1.1027 1.0501
No log 2.4615 64 1.1506 -0.0103 1.1506 1.0727
No log 2.5385 66 1.1129 0.1183 1.1129 1.0549
No log 2.6154 68 1.0894 0.0573 1.0894 1.0438
No log 2.6923 70 1.1575 -0.1665 1.1575 1.0759
No log 2.7692 72 1.1793 -0.0345 1.1793 1.0860
No log 2.8462 74 1.0467 -0.0960 1.0467 1.0231
No log 2.9231 76 0.9014 0.1737 0.9014 0.9494
No log 3.0 78 0.8860 0.1313 0.8860 0.9413
No log 3.0769 80 0.9552 0.1268 0.9552 0.9773
No log 3.1538 82 1.0245 0.0933 1.0245 1.0122
No log 3.2308 84 1.0981 -0.0197 1.0981 1.0479
No log 3.3077 86 1.1820 0.0236 1.1820 1.0872
No log 3.3846 88 1.1979 0.0563 1.1979 1.0945
No log 3.4615 90 1.1964 -0.0160 1.1964 1.0938
No log 3.5385 92 1.1862 -0.0358 1.1862 1.0891
No log 3.6154 94 1.1994 0.1348 1.1994 1.0951
No log 3.6923 96 1.2000 0.1384 1.2000 1.0954
No log 3.7692 98 1.1145 0.0205 1.1145 1.0557
No log 3.8462 100 1.0680 0.0880 1.0680 1.0334
No log 3.9231 102 1.0351 0.0839 1.0351 1.0174
No log 4.0 104 1.0266 0.0134 1.0266 1.0132
No log 4.0769 106 1.0909 0.1781 1.0909 1.0444
No log 4.1538 108 1.0983 0.1142 1.0983 1.0480
No log 4.2308 110 0.9763 0.0112 0.9763 0.9881
No log 4.3077 112 0.9867 0.1672 0.9867 0.9933
No log 4.3846 114 1.0429 0.1209 1.0429 1.0212
No log 4.4615 116 1.0364 0.1201 1.0364 1.0180
No log 4.5385 118 0.8978 0.1733 0.8978 0.9475
No log 4.6154 120 0.8910 0.1835 0.8910 0.9439
No log 4.6923 122 1.0714 0.1550 1.0714 1.0351
No log 4.7692 124 1.1165 0.1058 1.1165 1.0566
No log 4.8462 126 1.0089 0.1065 1.0089 1.0044
No log 4.9231 128 0.9944 0.1091 0.9944 0.9972
No log 5.0 130 1.1081 0.1814 1.1081 1.0527
No log 5.0769 132 1.0844 0.0753 1.0844 1.0414
No log 5.1538 134 1.0648 0.0682 1.0648 1.0319
No log 5.2308 136 1.1568 0.0904 1.1568 1.0755
No log 5.3077 138 1.1138 0.0784 1.1138 1.0554
No log 5.3846 140 1.0314 0.0379 1.0314 1.0156
No log 5.4615 142 1.0182 0.1500 1.0182 1.0091
No log 5.5385 144 1.0499 0.1528 1.0499 1.0247
No log 5.6154 146 0.9965 0.0748 0.9965 0.9983
No log 5.6923 148 1.0236 0.0031 1.0236 1.0117
No log 5.7692 150 1.0565 0.0600 1.0565 1.0279
No log 5.8462 152 1.0685 0.0934 1.0685 1.0337
No log 5.9231 154 1.0576 0.0934 1.0576 1.0284
No log 6.0 156 0.9971 0.0442 0.9971 0.9986
No log 6.0769 158 1.0281 0.0934 1.0281 1.0140
No log 6.1538 160 1.0284 0.1205 1.0284 1.0141
No log 6.2308 162 1.1140 0.1467 1.1140 1.0555
No log 6.3077 164 1.2090 0.0981 1.2090 1.0995
No log 6.3846 166 1.1754 0.0421 1.1754 1.0842
No log 6.4615 168 1.1047 0.1267 1.1047 1.0511
No log 6.5385 170 1.0491 0.0787 1.0491 1.0242
No log 6.6154 172 1.0361 0.1033 1.0361 1.0179
No log 6.6923 174 1.0323 0.0696 1.0323 1.0160
No log 6.7692 176 1.0029 0.0724 1.0029 1.0015
No log 6.8462 178 0.9836 0.0696 0.9836 0.9918
No log 6.9231 180 0.9391 0.0839 0.9391 0.9691
No log 7.0 182 0.9196 0.1216 0.9196 0.9590
No log 7.0769 184 0.9497 0.1031 0.9497 0.9745
No log 7.1538 186 0.9801 0.0654 0.9801 0.9900
No log 7.2308 188 1.0106 0.0968 1.0106 1.0053
No log 7.3077 190 1.0963 0.1859 1.0963 1.0471
No log 7.3846 192 1.0793 0.1652 1.0793 1.0389
No log 7.4615 194 1.0132 0.2026 1.0132 1.0066
No log 7.5385 196 1.0194 0.2274 1.0194 1.0097
No log 7.6154 198 1.0608 0.2274 1.0608 1.0300
No log 7.6923 200 1.1895 0.1310 1.1895 1.0907
No log 7.7692 202 1.2444 0.1031 1.2444 1.1155
No log 7.8462 204 1.2396 0.1874 1.2396 1.1134
No log 7.9231 206 1.2253 0.1733 1.2253 1.1069
No log 8.0 208 1.1544 0.0816 1.1544 1.0744
No log 8.0769 210 1.2112 0.1307 1.2112 1.1005
No log 8.1538 212 1.2219 0.1143 1.2219 1.1054
No log 8.2308 214 1.0714 0.2285 1.0714 1.0351
No log 8.3077 216 0.9236 0.2669 0.9236 0.9611
No log 8.3846 218 0.9031 0.3023 0.9031 0.9503
No log 8.4615 220 0.9272 0.3131 0.9272 0.9629
No log 8.5385 222 0.9933 0.2781 0.9933 0.9967
No log 8.6154 224 1.0723 0.2220 1.0723 1.0355
No log 8.6923 226 1.0289 0.2364 1.0289 1.0143
No log 8.7692 228 1.0291 0.2097 1.0291 1.0144
No log 8.8462 230 1.0406 0.2055 1.0406 1.0201
No log 8.9231 232 1.0501 0.2143 1.0501 1.0248
No log 9.0 234 0.9867 0.2055 0.9867 0.9933
No log 9.0769 236 0.9275 0.2348 0.9275 0.9631
No log 9.1538 238 0.9661 0.2139 0.9661 0.9829
No log 9.2308 240 1.0521 0.1935 1.0521 1.0257
No log 9.3077 242 1.0692 0.1935 1.0692 1.0340
No log 9.3846 244 0.9571 0.2926 0.9571 0.9783
No log 9.4615 246 0.8930 0.2843 0.8930 0.9450
No log 9.5385 248 0.8727 0.2270 0.8727 0.9342
No log 9.6154 250 0.9013 0.2780 0.9013 0.9494
No log 9.6923 252 1.0846 0.2377 1.0846 1.0414
No log 9.7692 254 1.2031 0.1117 1.2031 1.0969
No log 9.8462 256 1.0575 0.2097 1.0575 1.0284
No log 9.9231 258 0.9277 0.2806 0.9277 0.9632
No log 10.0 260 0.9336 0.2806 0.9336 0.9662
No log 10.0769 262 1.0060 0.1360 1.0060 1.0030
No log 10.1538 264 1.0534 0.1397 1.0534 1.0264
No log 10.2308 266 1.0537 0.1743 1.0537 1.0265
No log 10.3077 268 1.0242 0.2228 1.0242 1.0120
No log 10.3846 270 0.9840 0.1983 0.9840 0.9920
No log 10.4615 272 1.0950 0.2028 1.0950 1.0464
No log 10.5385 274 1.0764 0.1370 1.0764 1.0375
No log 10.6154 276 1.0166 0.1701 1.0166 1.0083
No log 10.6923 278 1.0933 0.1202 1.0933 1.0456
No log 10.7692 280 1.1986 0.1143 1.1986 1.0948
No log 10.8462 282 1.1242 0.0852 1.1242 1.0603
No log 10.9231 284 0.9429 0.1727 0.9429 0.9710
No log 11.0 286 0.9364 0.1748 0.9364 0.9677
No log 11.0769 288 0.9962 0.2116 0.9962 0.9981
No log 11.1538 290 1.0030 0.2010 1.0030 1.0015
No log 11.2308 292 0.9416 0.2262 0.9416 0.9704
No log 11.3077 294 0.8985 0.2720 0.8985 0.9479
No log 11.3846 296 0.9508 0.1826 0.9508 0.9751
No log 11.4615 298 0.9544 0.1870 0.9544 0.9770
No log 11.5385 300 0.9971 0.1518 0.9971 0.9985
No log 11.6154 302 1.0482 0.1405 1.0482 1.0238
No log 11.6923 304 1.1835 0.0980 1.1835 1.0879
No log 11.7692 306 1.1796 0.0980 1.1796 1.0861
No log 11.8462 308 1.1210 0.1281 1.1210 1.0588
No log 11.9231 310 0.9854 0.1742 0.9854 0.9927
No log 12.0 312 1.0256 0.2394 1.0256 1.0127
No log 12.0769 314 1.0078 0.2442 1.0078 1.0039
No log 12.1538 316 0.9917 0.2247 0.9917 0.9958
No log 12.2308 318 0.9675 0.3095 0.9675 0.9836
No log 12.3077 320 0.9732 0.3095 0.9732 0.9865
No log 12.3846 322 1.0351 0.2369 1.0351 1.0174
No log 12.4615 324 1.0466 0.1831 1.0466 1.0230
No log 12.5385 326 1.2718 0.1491 1.2718 1.1278
No log 12.6154 328 1.4493 0.1867 1.4493 1.2039
No log 12.6923 330 1.4315 0.1867 1.4315 1.1965
No log 12.7692 332 1.1560 0.0955 1.1560 1.0752
No log 12.8462 334 0.9218 0.2471 0.9218 0.9601
No log 12.9231 336 0.8515 0.2872 0.8515 0.9228
No log 13.0 338 0.8468 0.2953 0.8468 0.9202
No log 13.0769 340 0.9297 0.2812 0.9297 0.9642
No log 13.1538 342 1.0286 0.3538 1.0286 1.0142
No log 13.2308 344 1.0259 0.3665 1.0259 1.0129
No log 13.3077 346 0.9993 0.3043 0.9993 0.9997
No log 13.3846 348 0.9415 0.3034 0.9415 0.9703
No log 13.4615 350 0.9102 0.2424 0.9102 0.9541
No log 13.5385 352 0.9450 0.2241 0.9450 0.9721
No log 13.6154 354 1.0573 0.2533 1.0573 1.0283
No log 13.6923 356 1.1225 0.2128 1.1225 1.0595
No log 13.7692 358 1.0815 0.2533 1.0815 1.0399
No log 13.8462 360 0.9853 0.2696 0.9853 0.9926
No log 13.9231 362 0.9374 0.3060 0.9374 0.9682
No log 14.0 364 0.9434 0.3034 0.9434 0.9713
No log 14.0769 366 0.9625 0.3280 0.9625 0.9811
No log 14.1538 368 1.0455 0.2285 1.0455 1.0225
No log 14.2308 370 1.1348 0.1283 1.1348 1.0653
No log 14.3077 372 1.0748 0.1974 1.0748 1.0367
No log 14.3846 374 0.9954 0.2310 0.9954 0.9977
No log 14.4615 376 0.9574 0.2566 0.9574 0.9785
No log 14.5385 378 0.9803 0.2040 0.9803 0.9901
No log 14.6154 380 1.0460 0.1671 1.0460 1.0227
No log 14.6923 382 0.9910 0.2442 0.9910 0.9955
No log 14.7692 384 0.9841 0.2836 0.9841 0.9920
No log 14.8462 386 0.9659 0.2513 0.9659 0.9828
No log 14.9231 388 0.8872 0.3183 0.8872 0.9419
No log 15.0 390 0.8374 0.3209 0.8374 0.9151
No log 15.0769 392 0.8566 0.2835 0.8566 0.9255
No log 15.1538 394 0.9674 0.3280 0.9674 0.9836
No log 15.2308 396 1.0140 0.3183 1.0140 1.0070
No log 15.3077 398 0.9607 0.3125 0.9607 0.9802
No log 15.3846 400 0.8986 0.3305 0.8986 0.9480
No log 15.4615 402 0.8700 0.3183 0.8700 0.9328
No log 15.5385 404 0.8864 0.3183 0.8864 0.9415
No log 15.6154 406 0.9544 0.2670 0.9544 0.9770
No log 15.6923 408 1.1048 0.2380 1.1048 1.0511
No log 15.7692 410 1.0606 0.2999 1.0606 1.0298
No log 15.8462 412 0.9401 0.2670 0.9401 0.9696
No log 15.9231 414 0.8314 0.2109 0.8314 0.9118
No log 16.0 416 0.8279 0.2224 0.8279 0.9099
No log 16.0769 418 0.8358 0.1472 0.8358 0.9142
No log 16.1538 420 0.8759 0.2389 0.8759 0.9359
No log 16.2308 422 0.9640 0.2804 0.9640 0.9818
No log 16.3077 424 0.9913 0.2354 0.9913 0.9956
No log 16.3846 426 0.9590 0.2804 0.9590 0.9793
No log 16.4615 428 0.9074 0.1806 0.9074 0.9526
No log 16.5385 430 0.9263 0.2414 0.9263 0.9624
No log 16.6154 432 0.9743 0.1783 0.9743 0.9871
No log 16.6923 434 0.9645 0.2932 0.9645 0.9821
No log 16.7692 436 0.9753 0.3770 0.9753 0.9876
No log 16.8462 438 1.0312 0.2627 1.0312 1.0155
No log 16.9231 440 1.0462 0.2329 1.0462 1.0229
No log 17.0 442 0.9794 0.2359 0.9794 0.9896
No log 17.0769 444 0.9389 0.1760 0.9389 0.9690
No log 17.1538 446 0.9839 0.2633 0.9839 0.9919
No log 17.2308 448 1.0537 0.2513 1.0537 1.0265
No log 17.3077 450 1.1423 0.1647 1.1423 1.0688
No log 17.3846 452 1.1434 0.1943 1.1434 1.0693
No log 17.4615 454 1.0747 0.3477 1.0747 1.0367
No log 17.5385 456 0.9251 0.3095 0.9251 0.9618
No log 17.6154 458 0.8525 0.2237 0.8525 0.9233
No log 17.6923 460 0.8456 0.1256 0.8456 0.9195
No log 17.7692 462 0.8800 0.2129 0.8800 0.9381
No log 17.8462 464 0.9809 0.3219 0.9809 0.9904
No log 17.9231 466 1.0915 0.2200 1.0915 1.0448
No log 18.0 468 1.0608 0.2555 1.0608 1.0300
No log 18.0769 470 0.9307 0.2724 0.9307 0.9647
No log 18.1538 472 0.8505 0.2389 0.8505 0.9222
No log 18.2308 474 0.8148 0.2193 0.8148 0.9027
No log 18.3077 476 0.8059 0.1603 0.8059 0.8977
No log 18.3846 478 0.8347 0.2109 0.8347 0.9136
No log 18.4615 480 0.9186 0.3723 0.9186 0.9584
No log 18.5385 482 0.9646 0.3280 0.9646 0.9821
No log 18.6154 484 0.9852 0.3125 0.9852 0.9926
No log 18.6923 486 0.9332 0.3344 0.9332 0.9660
No log 18.7692 488 0.8929 0.2899 0.8929 0.9449
No log 18.8462 490 0.8692 0.2090 0.8692 0.9323
No log 18.9231 492 0.8951 0.2926 0.8951 0.9461
No log 19.0 494 0.9290 0.3157 0.9290 0.9638
No log 19.0769 496 0.9759 0.3280 0.9759 0.9879
No log 19.1538 498 0.9971 0.3159 0.9971 0.9986
0.313 19.2308 500 0.9922 0.3159 0.9922 0.9961
0.313 19.3077 502 0.9802 0.3159 0.9802 0.9900
0.313 19.3846 504 0.9856 0.3159 0.9856 0.9928
0.313 19.4615 506 0.9990 0.3100 0.9990 0.9995
0.313 19.5385 508 1.0173 0.2917 1.0173 1.0086
0.313 19.6154 510 0.9529 0.2723 0.9529 0.9762
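Qwk (quadratic weighted kappa) measures agreement between predicted and reference scores, penalizing each disagreement by the squared distance between the two classes, normalized against chance agreement. A minimal pure-Python sketch (the class count and labels in the usage comment are illustrative, not taken from this model's data):

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa between two integer label sequences."""
    n = len(y_true)
    # Observed confusion matrix
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Marginal histograms (used for the chance-agreement matrix)
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n_classes))
                 for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic penalty
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

# Perfect agreement scores 1.0; e.g. with 3 classes:
# quadratic_weighted_kappa([0, 1, 2, 2], [0, 1, 2, 2], 3) -> 1.0
```

Perfect agreement yields 1.0, chance-level agreement 0.0, and systematic disagreement goes negative, so the best value of 0.3770 in the table reflects only modest agreement with the reference scores.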

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32)
Model tree for MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k11_task7_organization: finetuned from aubmindlab/bert-base-arabertv02.