ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k2_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9813
  • QWK (quadratic weighted kappa): 0.2635
  • MSE: 0.9813
  • RMSE: 0.9906
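For reference, these metrics can be recomputed from integer score predictions and gold labels. The sketch below is a stdlib-only illustration (not the actual evaluation code of this run); it implements quadratic weighted kappa from the confusion matrix, plus MSE/RMSE.

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the QWK metric above)."""
    n = len(y_true)
    # Observed confusion matrix and marginal histograms
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true = [sum(observed[i]) for i in range(n_classes)]
    hist_pred = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            weight = (i - j) ** 2 / (n_classes - 1) ** 2
            expected = hist_true[i] * hist_pred[j] / n  # chance agreement
            num += weight * observed[i][j]
            den += weight * expected
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root."""
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)
```

The same numbers can be obtained with sklearn's `cohen_kappa_score(..., weights="quadratic")` and `mean_squared_error`.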

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
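With a linear scheduler and no warmup listed, the learning rate decays from 2e-05 at step 0 to 0 at the final step (500 in this run). A stdlib sketch of that schedule, mirroring the semantics of transformers' `get_linear_schedule_with_warmup` (assumed here rather than taken from the training script):

```python
def linear_lr(step, total_steps=500, base_lr=2e-05, warmup_steps=0):
    """Learning rate at a given optimizer step under a linear schedule."""
    if step < warmup_steps:
        # Linear warmup from 0 to base_lr (unused here: warmup_steps=0)
        return base_lr * step / max(1, warmup_steps)
    # Linear decay from base_lr after warmup down to 0 at total_steps
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

For example, halfway through training (step 250) the learning rate is 1e-05.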

Training results

Training loss was only recorded at the final logging step (500), so earlier rows show "No log".

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.4 2 2.5043 -0.0788 2.5043 1.5825
No log 0.8 4 1.1496 0.1284 1.1496 1.0722
No log 1.2 6 0.8398 0.0535 0.8398 0.9164
No log 1.6 8 0.8665 0.0313 0.8665 0.9309
No log 2.0 10 0.9418 0.1181 0.9418 0.9705
No log 2.4 12 0.8754 0.1268 0.8754 0.9356
No log 2.8 14 0.7780 0.0804 0.7780 0.8821
No log 3.2 16 0.7806 0.0444 0.7806 0.8835
No log 3.6 18 0.7911 0.0444 0.7911 0.8895
No log 4.0 20 0.8116 0.0481 0.8116 0.9009
No log 4.4 22 0.7852 0.0444 0.7852 0.8861
No log 4.8 24 0.7556 0.1187 0.7556 0.8693
No log 5.2 26 0.7550 0.1094 0.7550 0.8689
No log 5.6 28 0.7873 0.2285 0.7873 0.8873
No log 6.0 30 0.8195 0.1867 0.8195 0.9053
No log 6.4 32 0.7905 0.1584 0.7905 0.8891
No log 6.8 34 0.8006 0.1542 0.8006 0.8947
No log 7.2 36 0.8280 0.1946 0.8280 0.9100
No log 7.6 38 1.0096 0.1501 1.0096 1.0048
No log 8.0 40 1.0556 0.1867 1.0556 1.0274
No log 8.4 42 0.9515 0.0241 0.9515 0.9754
No log 8.8 44 0.8984 0.1289 0.8984 0.9478
No log 9.2 46 0.9670 0.1385 0.9670 0.9833
No log 9.6 48 1.0412 0.2119 1.0412 1.0204
No log 10.0 50 1.1556 0.1115 1.1556 1.0750
No log 10.4 52 1.1838 0.0845 1.1838 1.0880
No log 10.8 54 1.2096 0.0686 1.2096 1.0998
No log 11.2 56 1.2015 0.0686 1.2015 1.0961
No log 11.6 58 1.1246 0.0713 1.1246 1.0605
No log 12.0 60 1.0039 0.1775 1.0039 1.0019
No log 12.4 62 0.9072 0.2239 0.9072 0.9525
No log 12.8 64 1.0172 0.1014 1.0172 1.0086
No log 13.2 66 1.2605 0.1176 1.2605 1.1227
No log 13.6 68 1.3323 0.1479 1.3323 1.1543
No log 14.0 70 1.3230 0.0704 1.3230 1.1502
No log 14.4 72 1.0369 0.1843 1.0369 1.0183
No log 14.8 74 0.9368 0.2124 0.9368 0.9679
No log 15.2 76 0.9609 0.2076 0.9609 0.9803
No log 15.6 78 1.1684 0.1086 1.1684 1.0809
No log 16.0 80 1.2914 0.1417 1.2914 1.1364
No log 16.4 82 1.2720 0.1145 1.2720 1.1278
No log 16.8 84 1.1177 0.2070 1.1177 1.0572
No log 17.2 86 0.9467 0.3134 0.9467 0.9730
No log 17.6 88 0.9169 0.3194 0.9169 0.9575
No log 18.0 90 1.0413 0.2209 1.0413 1.0205
No log 18.4 92 1.2213 0.2070 1.2213 1.1051
No log 18.8 94 1.2776 0.0727 1.2776 1.1303
No log 19.2 96 1.2925 0.1200 1.2925 1.1369
No log 19.6 98 1.2695 0.2138 1.2695 1.1267
No log 20.0 100 1.0685 0.2659 1.0685 1.0337
No log 20.4 102 0.9646 0.3194 0.9646 0.9822
No log 20.8 104 1.0011 0.2537 1.0011 1.0005
No log 21.2 106 1.0633 0.1549 1.0633 1.0312
No log 21.6 108 1.1412 0.1206 1.1412 1.0683
No log 22.0 110 1.2095 0.0623 1.2095 1.0998
No log 22.4 112 1.2801 0.1254 1.2801 1.1314
No log 22.8 114 1.2387 0.1473 1.2387 1.1130
No log 23.2 116 1.1045 0.1144 1.1045 1.0510
No log 23.6 118 1.0220 0.2437 1.0220 1.0109
No log 24.0 120 0.9643 0.2651 0.9643 0.9820
No log 24.4 122 0.9763 0.2806 0.9763 0.9881
No log 24.8 124 1.0316 0.2545 1.0316 1.0157
No log 25.2 126 1.0772 0.2191 1.0772 1.0379
No log 25.6 128 1.0496 0.2017 1.0496 1.0245
No log 26.0 130 1.0165 0.1922 1.0165 1.0082
No log 26.4 132 1.0143 0.2217 1.0143 1.0071
No log 26.8 134 1.0071 0.2659 1.0071 1.0035
No log 27.2 136 0.9128 0.2702 0.9128 0.9554
No log 27.6 138 0.8561 0.2616 0.8561 0.9253
No log 28.0 140 0.8551 0.2467 0.8551 0.9247
No log 28.4 142 0.8836 0.3329 0.8836 0.9400
No log 28.8 144 0.9271 0.2116 0.9271 0.9629
No log 29.2 146 0.9750 0.2063 0.9750 0.9874
No log 29.6 148 1.0384 0.2029 1.0384 1.0190
No log 30.0 150 1.1437 0.1653 1.1437 1.0694
No log 30.4 152 1.2277 0.2376 1.2277 1.1080
No log 30.8 154 1.1811 0.2030 1.1811 1.0868
No log 31.2 156 1.0947 0.1655 1.0947 1.0463
No log 31.6 158 1.0593 0.2150 1.0593 1.0292
No log 32.0 160 1.0426 0.2150 1.0426 1.0211
No log 32.4 162 1.0575 0.2330 1.0575 1.0284
No log 32.8 164 1.0647 0.2330 1.0647 1.0319
No log 33.2 166 1.0672 0.2755 1.0672 1.0330
No log 33.6 168 1.0461 0.2961 1.0461 1.0228
No log 34.0 170 1.1157 0.2961 1.1157 1.0563
No log 34.4 172 1.2613 0.2948 1.2613 1.1231
No log 34.8 174 1.2750 0.3065 1.2750 1.1291
No log 35.2 176 1.2028 0.2914 1.2028 1.0967
No log 35.6 178 1.1502 0.2417 1.1502 1.0725
No log 36.0 180 1.1302 0.2207 1.1302 1.0631
No log 36.4 182 1.1597 0.1729 1.1597 1.0769
No log 36.8 184 1.2322 0.1663 1.2322 1.1100
No log 37.2 186 1.2817 0.2459 1.2817 1.1321
No log 37.6 188 1.2833 0.2601 1.2833 1.1328
No log 38.0 190 1.3190 0.2499 1.3190 1.1485
No log 38.4 192 1.2863 0.2207 1.2863 1.1341
No log 38.8 194 1.1725 0.2545 1.1725 1.0828
No log 39.2 196 1.0959 0.2372 1.0959 1.0469
No log 39.6 198 1.0191 0.2602 1.0191 1.0095
No log 40.0 200 1.0014 0.2602 1.0014 1.0007
No log 40.4 202 1.0747 0.2372 1.0747 1.0367
No log 40.8 204 1.2319 0.2459 1.2319 1.1099
No log 41.2 206 1.3128 0.2243 1.3128 1.1458
No log 41.6 208 1.2900 0.2559 1.2900 1.1358
No log 42.0 210 1.1675 0.2247 1.1675 1.0805
No log 42.4 212 1.0744 0.2372 1.0744 1.0365
No log 42.8 214 1.0556 0.2372 1.0556 1.0274
No log 43.2 216 1.0870 0.2545 1.0870 1.0426
No log 43.6 218 1.1258 0.2501 1.1258 1.0610
No log 44.0 220 1.1501 0.2501 1.1501 1.0724
No log 44.4 222 1.1116 0.2330 1.1116 1.0543
No log 44.8 224 1.0824 0.2150 1.0824 1.0404
No log 45.2 226 1.0525 0.2372 1.0525 1.0259
No log 45.6 228 1.0645 0.2635 1.0645 1.0318
No log 46.0 230 1.0526 0.2850 1.0526 1.0260
No log 46.4 232 0.9906 0.3538 0.9906 0.9953
No log 46.8 234 0.9796 0.3538 0.9796 0.9898
No log 47.2 236 1.0047 0.3481 1.0047 1.0023
No log 47.6 238 1.0683 0.2802 1.0683 1.0336
No log 48.0 240 1.1119 0.2914 1.1119 1.0545
No log 48.4 242 1.1179 0.2961 1.1179 1.0573
No log 48.8 244 1.0432 0.2802 1.0432 1.0214
No log 49.2 246 0.9511 0.3214 0.9511 0.9752
No log 49.6 248 0.9029 0.3477 0.9029 0.9502
No log 50.0 250 0.8922 0.2728 0.8922 0.9445
No log 50.4 252 0.8658 0.3294 0.8658 0.9305
No log 50.8 254 0.8949 0.2923 0.8949 0.9460
No log 51.2 256 0.9776 0.3082 0.9776 0.9887
No log 51.6 258 1.0779 0.2961 1.0779 1.0382
No log 52.0 260 1.1661 0.2537 1.1661 1.0799
No log 52.4 262 1.1571 0.2537 1.1571 1.0757
No log 52.8 264 1.1359 0.2579 1.1359 1.0658
No log 53.2 266 1.1033 0.2665 1.1033 1.0504
No log 53.6 268 1.0413 0.2755 1.0413 1.0204
No log 54.0 270 0.9681 0.2898 0.9681 0.9839
No log 54.4 272 0.9501 0.3269 0.9501 0.9747
No log 54.8 274 0.9856 0.2898 0.9856 0.9928
No log 55.2 276 1.0138 0.2590 1.0138 1.0069
No log 55.6 278 1.0544 0.2372 1.0544 1.0268
No log 56.0 280 1.0935 0.2961 1.0935 1.0457
No log 56.4 282 1.0973 0.2961 1.0973 1.0475
No log 56.8 284 1.1026 0.2961 1.1026 1.0500
No log 57.2 286 1.1034 0.2961 1.1034 1.0504
No log 57.6 288 1.1017 0.2961 1.1017 1.0496
No log 58.0 290 1.0296 0.3010 1.0296 1.0147
No log 58.4 292 0.9434 0.2939 0.9434 0.9713
No log 58.8 294 0.8964 0.3347 0.8964 0.9468
No log 59.2 296 0.8527 0.3347 0.8527 0.9234
No log 59.6 298 0.8477 0.3347 0.8477 0.9207
No log 60.0 300 0.8474 0.3347 0.8474 0.9205
No log 60.4 302 0.8871 0.3287 0.8871 0.9419
No log 60.8 304 0.9746 0.2926 0.9746 0.9872
No log 61.2 306 1.0717 0.3010 1.0717 1.0352
No log 61.6 308 1.1086 0.2914 1.1086 1.0529
No log 62.0 310 1.0847 0.3010 1.0847 1.0415
No log 62.4 312 1.0463 0.3010 1.0463 1.0229
No log 62.8 314 1.0050 0.2926 1.0050 1.0025
No log 63.2 316 0.9897 0.2707 0.9897 0.9949
No log 63.6 318 1.0251 0.2437 1.0251 1.0125
No log 64.0 320 1.0832 0.2330 1.0832 1.0408
No log 64.4 322 1.1397 0.2330 1.1397 1.0676
No log 64.8 324 1.1405 0.2545 1.1405 1.0679
No log 65.2 326 1.0950 0.2755 1.0950 1.0464
No log 65.6 328 1.0310 0.2707 1.0310 1.0154
No log 66.0 330 0.9736 0.2926 0.9736 0.9867
No log 66.4 332 0.9501 0.2926 0.9501 0.9747
No log 66.8 334 0.9496 0.2926 0.9496 0.9745
No log 67.2 336 0.9554 0.3059 0.9554 0.9774
No log 67.6 338 0.9708 0.3059 0.9708 0.9853
No log 68.0 340 1.0204 0.2926 1.0204 1.0101
No log 68.4 342 1.0272 0.2926 1.0272 1.0135
No log 68.8 344 1.0252 0.2926 1.0252 1.0125
No log 69.2 346 1.0348 0.3059 1.0348 1.0173
No log 69.6 348 1.0126 0.2926 1.0126 1.0063
No log 70.0 350 0.9720 0.2926 0.9720 0.9859
No log 70.4 352 0.9502 0.2707 0.9502 0.9748
No log 70.8 354 0.9376 0.2707 0.9376 0.9683
No log 71.2 356 0.9398 0.3029 0.9398 0.9694
No log 71.6 358 0.9631 0.2850 0.9631 0.9814
No log 72.0 360 0.9738 0.2635 0.9738 0.9868
No log 72.4 362 0.9975 0.2416 0.9975 0.9988
No log 72.8 364 1.0338 0.2416 1.0338 1.0167
No log 73.2 366 1.0433 0.2150 1.0433 1.0214
No log 73.6 368 1.0242 0.1976 1.0242 1.0120
No log 74.0 370 1.0076 0.1976 1.0076 1.0038
No log 74.4 372 1.0033 0.2372 1.0033 1.0017
No log 74.8 374 0.9852 0.2730 0.9852 0.9926
No log 75.2 376 0.9805 0.2730 0.9805 0.9902
No log 75.6 378 0.9917 0.2635 0.9917 0.9959
No log 76.0 380 0.9862 0.2635 0.9862 0.9931
No log 76.4 382 0.9830 0.2635 0.9830 0.9915
No log 76.8 384 0.9761 0.3161 0.9761 0.9880
No log 77.2 386 0.9838 0.2850 0.9838 0.9918
No log 77.6 388 0.9930 0.2850 0.9930 0.9965
No log 78.0 390 0.9946 0.2850 0.9946 0.9973
No log 78.4 392 0.9893 0.2850 0.9893 0.9946
No log 78.8 394 0.9845 0.2635 0.9845 0.9922
No log 79.2 396 0.9782 0.2482 0.9782 0.9890
No log 79.6 398 0.9780 0.2529 0.9780 0.9889
No log 80.0 400 0.9858 0.2297 0.9858 0.9929
No log 80.4 402 0.9929 0.2363 0.9929 0.9965
No log 80.8 404 0.9865 0.2601 0.9865 0.9932
No log 81.2 406 1.0001 0.2601 1.0001 1.0001
No log 81.6 408 1.0296 0.2437 1.0296 1.0147
No log 82.0 410 1.0678 0.2481 1.0678 1.0334
No log 82.4 412 1.0911 0.2481 1.0911 1.0445
No log 82.8 414 1.1003 0.2481 1.1003 1.0489
No log 83.2 416 1.0846 0.2481 1.0846 1.0414
No log 83.6 418 1.0514 0.2524 1.0514 1.0254
No log 84.0 420 1.0021 0.2850 1.0021 1.0010
No log 84.4 422 0.9639 0.2707 0.9639 0.9818
No log 84.8 424 0.9436 0.2857 0.9436 0.9714
No log 85.2 426 0.9430 0.2482 0.9430 0.9711
No log 85.6 428 0.9583 0.2482 0.9583 0.9789
No log 86.0 430 0.9654 0.2482 0.9654 0.9825
No log 86.4 432 0.9649 0.2482 0.9649 0.9823
No log 86.8 434 0.9690 0.2707 0.9690 0.9844
No log 87.2 436 0.9698 0.2707 0.9698 0.9848
No log 87.6 438 0.9606 0.2482 0.9606 0.9801
No log 88.0 440 0.9545 0.2482 0.9545 0.9770
No log 88.4 442 0.9451 0.2707 0.9451 0.9721
No log 88.8 444 0.9417 0.2707 0.9417 0.9704
No log 89.2 446 0.9455 0.2707 0.9455 0.9724
No log 89.6 448 0.9613 0.2850 0.9613 0.9804
No log 90.0 450 0.9727 0.2850 0.9727 0.9862
No log 90.4 452 0.9800 0.2850 0.9800 0.9899
No log 90.8 454 0.9899 0.2850 0.9899 0.9949
No log 91.2 456 0.9923 0.2850 0.9923 0.9961
No log 91.6 458 0.9873 0.2635 0.9873 0.9936
No log 92.0 460 0.9873 0.2635 0.9873 0.9936
No log 92.4 462 0.9872 0.2635 0.9872 0.9936
No log 92.8 464 0.9834 0.2635 0.9834 0.9917
No log 93.2 466 0.9856 0.2635 0.9856 0.9928
No log 93.6 468 0.9885 0.2635 0.9885 0.9942
No log 94.0 470 0.9860 0.2635 0.9860 0.9930
No log 94.4 472 0.9814 0.2635 0.9814 0.9906
No log 94.8 474 0.9823 0.2635 0.9823 0.9911
No log 95.2 476 0.9850 0.2635 0.9850 0.9925
No log 95.6 478 0.9838 0.2635 0.9838 0.9919
No log 96.0 480 0.9826 0.2635 0.9826 0.9913
No log 96.4 482 0.9823 0.2635 0.9823 0.9911
No log 96.8 484 0.9783 0.2635 0.9783 0.9891
No log 97.2 486 0.9760 0.2635 0.9760 0.9879
No log 97.6 488 0.9769 0.2635 0.9769 0.9884
No log 98.0 490 0.9777 0.2635 0.9777 0.9888
No log 98.4 492 0.9783 0.2635 0.9783 0.9891
No log 98.8 494 0.9787 0.2635 0.9787 0.9893
No log 99.2 496 0.9798 0.2635 0.9798 0.9898
No log 99.6 498 0.9808 0.2635 0.9808 0.9903
0.1612 100.0 500 0.9813 0.2635 0.9813 0.9906

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (F32, Safetensors)

Model tree for MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k2_task7_organization: fine-tuned from aubmindlab/bert-base-arabertv02 (one of 4019 fine-tunes listed for the base model).