ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k14_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9415
  • Qwk: 0.6
  • Mse: 0.9415
  • Rmse: 0.9703
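For context, Qwk (quadratic weighted kappa) measures agreement between predicted and gold ordinal scores, penalizing larger disagreements quadratically, while MSE/RMSE treat the labels as numbers. A minimal sketch of how these metrics are computed (the label values below are hypothetical, not taken from this model's evaluation set):

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa for ordinal labels in [0, n_classes)."""
    # observed confusion matrix
    O = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        O[t, p] += 1
    # quadratic disagreement weights: (i - j)^2 scaled to [0, 1]
    idx = np.arange(n_classes)
    W = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    # expected matrix under chance agreement, scaled to the same total
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / O.sum()
    return 1.0 - (W * O).sum() / (W * E).sum()

# hypothetical gold and predicted organization scores
y_true = [0, 1, 2, 2, 3, 1, 0, 2]
y_pred = [0, 1, 2, 3, 3, 1, 1, 2]

qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=4)
mse = float(np.mean((np.array(y_true) - np.array(y_pred)) ** 2))
rmse = float(np.sqrt(mse))
```

Note that RMSE is simply the square root of MSE, which is why the Loss and Mse values above coincide (the model is trained with an MSE regression objective on the scores).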

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
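With lr_scheduler_type set to linear and no warmup listed, the learning rate decays linearly from 2e-05 to zero over the total number of training steps. A minimal sketch of that schedule (warmup_steps and total_steps are illustrative; the actual step count depends on the dataset size and batch size):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Learning rate under a linear schedule: optional linear warmup,
    then linear decay from base_lr down to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# illustrative values for a hypothetical 1000-step run
lr_start = linear_lr(0, total_steps=1000)    # full base_lr at step 0
lr_mid = linear_lr(500, total_steps=1000)    # halfway through the decay
lr_end = linear_lr(1000, total_steps=1000)   # fully decayed
```

Judging from the log below, one epoch corresponds to roughly 105 optimizer steps, and training stops before reaching the configured 100 epochs (likely via early stopping, which the card does not document).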

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0190 2 6.9276 0.0116 6.9276 2.6320
No log 0.0381 4 4.7656 0.0545 4.7656 2.1830
No log 0.0571 6 3.5346 0.0212 3.5346 1.8800
No log 0.0762 8 2.4274 0.1333 2.4274 1.5580
No log 0.0952 10 1.9106 0.2645 1.9106 1.3822
No log 0.1143 12 1.5750 0.1905 1.5750 1.2550
No log 0.1333 14 1.4359 0.1495 1.4359 1.1983
No log 0.1524 16 1.4312 0.2783 1.4312 1.1963
No log 0.1714 18 1.3533 0.3009 1.3533 1.1633
No log 0.1905 20 1.5519 0.1905 1.5519 1.2458
No log 0.2095 22 1.6654 0.1622 1.6654 1.2905
No log 0.2286 24 1.7327 0.25 1.7327 1.3163
No log 0.2476 26 1.5649 0.2951 1.5649 1.2510
No log 0.2667 28 1.2763 0.3717 1.2763 1.1297
No log 0.2857 30 1.7902 0.25 1.7902 1.3380
No log 0.3048 32 1.9905 0.1875 1.9905 1.4108
No log 0.3238 34 1.3900 0.2385 1.3900 1.1790
No log 0.3429 36 1.1253 0.4071 1.1253 1.0608
No log 0.3619 38 1.0888 0.4870 1.0888 1.0434
No log 0.3810 40 1.5378 0.3881 1.5378 1.2401
No log 0.4 42 1.5740 0.3768 1.5740 1.2546
No log 0.4190 44 1.2285 0.4783 1.2285 1.1084
No log 0.4381 46 0.9748 0.6471 0.9748 0.9873
No log 0.4571 48 0.9813 0.6212 0.9813 0.9906
No log 0.4762 50 1.0622 0.5802 1.0622 1.0306
No log 0.4952 52 1.1423 0.5857 1.1423 1.0688
No log 0.5143 54 1.1446 0.5373 1.1446 1.0698
No log 0.5333 56 1.1881 0.5507 1.1881 1.0900
No log 0.5524 58 1.2013 0.5255 1.2013 1.0960
No log 0.5714 60 1.1758 0.5507 1.1758 1.0843
No log 0.5905 62 1.1995 0.5070 1.1995 1.0952
No log 0.6095 64 1.2154 0.4930 1.2154 1.1025
No log 0.6286 66 1.2228 0.5143 1.2228 1.1058
No log 0.6476 68 1.2648 0.5655 1.2648 1.1246
No log 0.6667 70 1.3367 0.5290 1.3367 1.1562
No log 0.6857 72 1.3831 0.5125 1.3831 1.1760
No log 0.7048 74 1.3098 0.5663 1.3098 1.1445
No log 0.7238 76 1.0619 0.6056 1.0619 1.0305
No log 0.7429 78 0.8583 0.6269 0.8583 0.9264
No log 0.7619 80 0.8460 0.6475 0.8460 0.9198
No log 0.7810 82 0.9401 0.5915 0.9401 0.9696
No log 0.8 84 1.0403 0.6259 1.0403 1.0199
No log 0.8190 86 1.0513 0.64 1.0513 1.0253
No log 0.8381 88 0.9576 0.5793 0.9576 0.9786
No log 0.8571 90 0.8899 0.6800 0.8899 0.9433
No log 0.8762 92 0.8002 0.6383 0.8002 0.8945
No log 0.8952 94 0.7336 0.6963 0.7336 0.8565
No log 0.9143 96 0.7889 0.6901 0.7889 0.8882
No log 0.9333 98 0.8813 0.6887 0.8813 0.9388
No log 0.9524 100 1.0068 0.6486 1.0068 1.0034
No log 0.9714 102 1.0193 0.6438 1.0193 1.0096
No log 0.9905 104 1.0252 0.6259 1.0252 1.0125
No log 1.0095 106 1.0172 0.6338 1.0172 1.0085
No log 1.0286 108 0.9630 0.6575 0.9630 0.9813
No log 1.0476 110 0.7965 0.6806 0.7965 0.8925
No log 1.0667 112 0.7461 0.7153 0.7461 0.8638
No log 1.0857 114 0.7243 0.7050 0.7243 0.8511
No log 1.1048 116 0.9890 0.7 0.9890 0.9945
No log 1.1238 118 1.4175 0.5789 1.4175 1.1906
No log 1.1429 120 1.4610 0.5612 1.4610 1.2087
No log 1.1619 122 1.1572 0.6857 1.1572 1.0757
No log 1.1810 124 0.8256 0.7170 0.8256 0.9086
No log 1.2 126 0.7839 0.6846 0.7839 0.8854
No log 1.2190 128 0.8247 0.6897 0.8247 0.9081
No log 1.2381 130 0.9266 0.6483 0.9266 0.9626
No log 1.2571 132 0.9848 0.5926 0.9848 0.9923
No log 1.2762 134 1.0753 0.5821 1.0753 1.0370
No log 1.2952 136 1.2665 0.5655 1.2665 1.1254
No log 1.3143 138 1.1714 0.6 1.1714 1.0823
No log 1.3333 140 1.1058 0.6531 1.1058 1.0516
No log 1.3524 142 1.0866 0.6667 1.0866 1.0424
No log 1.3714 144 1.1161 0.6705 1.1161 1.0565
No log 1.3905 146 1.2254 0.6448 1.2254 1.1070
No log 1.4095 148 1.1206 0.6742 1.1206 1.0586
No log 1.4286 150 0.8646 0.7294 0.8646 0.9298
No log 1.4476 152 0.7859 0.7619 0.7859 0.8865
No log 1.4667 154 0.8066 0.7349 0.8066 0.8981
No log 1.4857 156 0.8386 0.7381 0.8386 0.9157
No log 1.5048 158 0.9444 0.7143 0.9444 0.9718
No log 1.5238 160 0.9465 0.7135 0.9465 0.9729
No log 1.5429 162 1.1787 0.7021 1.1787 1.0857
No log 1.5619 164 1.3246 0.6492 1.3246 1.1509
No log 1.5810 166 1.0349 0.7066 1.0349 1.0173
No log 1.6 168 0.7550 0.6933 0.7550 0.8689
No log 1.6190 170 0.6570 0.6763 0.6570 0.8105
No log 1.6381 172 0.7324 0.7226 0.7324 0.8558
No log 1.6571 174 1.0027 0.7143 1.0027 1.0014
No log 1.6762 176 0.9650 0.7473 0.9650 0.9823
No log 1.6952 178 0.9670 0.7333 0.9670 0.9834
No log 1.7143 180 0.9839 0.7086 0.9839 0.9919
No log 1.7333 182 0.8550 0.7273 0.8550 0.9247
No log 1.7524 184 0.7549 0.7362 0.7549 0.8689
No log 1.7714 186 0.5831 0.7815 0.5831 0.7636
No log 1.7905 188 0.5562 0.7792 0.5562 0.7458
No log 1.8095 190 0.5702 0.7974 0.5702 0.7551
No log 1.8286 192 0.5744 0.8176 0.5744 0.7579
No log 1.8476 194 0.7114 0.75 0.7114 0.8435
No log 1.8667 196 1.0511 0.6509 1.0511 1.0252
No log 1.8857 198 1.3085 0.6522 1.3085 1.1439
No log 1.9048 200 1.2143 0.6347 1.2143 1.1020
No log 1.9238 202 0.9178 0.6575 0.9178 0.9580
No log 1.9429 204 0.7264 0.7059 0.7264 0.8523
No log 1.9619 206 0.6319 0.7101 0.6319 0.7949
No log 1.9810 208 0.5785 0.7552 0.5785 0.7606
No log 2.0 210 0.6666 0.7692 0.6666 0.8165
No log 2.0190 212 0.7822 0.7590 0.7822 0.8844
No log 2.0381 214 0.9594 0.6927 0.9594 0.9795
No log 2.0571 216 1.0295 0.6927 1.0295 1.0146
No log 2.0762 218 1.0234 0.6936 1.0234 1.0117
No log 2.0952 220 0.8325 0.6800 0.8325 0.9124
No log 2.1143 222 0.6633 0.7376 0.6633 0.8144
No log 2.1333 224 0.6144 0.7518 0.6144 0.7838
No log 2.1524 226 0.5781 0.7808 0.5781 0.7603
No log 2.1714 228 0.5720 0.75 0.5720 0.7563
No log 2.1905 230 0.7841 0.7006 0.7841 0.8855
No log 2.2095 232 1.1698 0.6522 1.1698 1.0816
No log 2.2286 234 1.2711 0.6528 1.2711 1.1274
No log 2.2476 236 1.1425 0.6630 1.1425 1.0689
No log 2.2667 238 0.9720 0.6338 0.9720 0.9859
No log 2.2857 240 0.9178 0.6383 0.9178 0.9580
No log 2.3048 242 0.8224 0.6667 0.8224 0.9069
No log 2.3238 244 0.6724 0.7483 0.6724 0.8200
No log 2.3429 246 0.6042 0.7815 0.6042 0.7773
No log 2.3619 248 0.5984 0.7821 0.5984 0.7736
No log 2.3810 250 0.6357 0.7811 0.6357 0.7973
No log 2.4 252 0.8069 0.7079 0.8069 0.8983
No log 2.4190 254 1.0167 0.6630 1.0167 1.0083
No log 2.4381 256 1.1817 0.6067 1.1817 1.0871
No log 2.4571 258 1.1942 0.6071 1.1942 1.0928
No log 2.4762 260 1.0612 0.6667 1.0612 1.0302
No log 2.4952 262 1.0103 0.6418 1.0103 1.0051
No log 2.5143 264 0.9821 0.6897 0.9821 0.9910
No log 2.5333 266 1.0528 0.625 1.0528 1.0261
No log 2.5524 268 1.1404 0.6222 1.1404 1.0679
No log 2.5714 270 1.1478 0.6047 1.1478 1.0713
No log 2.5905 272 1.0288 0.6503 1.0288 1.0143
No log 2.6095 274 1.0580 0.6289 1.0580 1.0286
No log 2.6286 276 0.9598 0.7018 0.9598 0.9797
No log 2.6476 278 0.7757 0.7561 0.7757 0.8807
No log 2.6667 280 0.6897 0.7436 0.6897 0.8305
No log 2.6857 282 0.7379 0.7368 0.7379 0.8590
No log 2.7048 284 0.8943 0.6755 0.8943 0.9457
No log 2.7238 286 1.0691 0.6380 1.0691 1.0340
No log 2.7429 288 1.1026 0.6380 1.1026 1.0500
No log 2.7619 290 1.0762 0.6380 1.0762 1.0374
No log 2.7810 292 0.9005 0.6939 0.9005 0.9490
No log 2.8 294 0.8889 0.6883 0.8889 0.9428
No log 2.8190 296 0.8913 0.6795 0.8913 0.9441
No log 2.8381 298 0.7952 0.7162 0.7952 0.8917
No log 2.8571 300 0.7681 0.7333 0.7681 0.8764
No log 2.8762 302 0.8102 0.7237 0.8102 0.9001
No log 2.8952 304 0.9584 0.6832 0.9584 0.9790
No log 2.9143 306 1.2489 0.5989 1.2489 1.1176
No log 2.9333 308 1.4654 0.5683 1.4654 1.2106
No log 2.9524 310 1.3460 0.5629 1.3460 1.1602
No log 2.9714 312 1.1198 0.5946 1.1198 1.0582
No log 2.9905 314 0.9346 0.6483 0.9346 0.9668
No log 3.0095 316 0.8010 0.6986 0.8010 0.8950
No log 3.0286 318 0.7957 0.7075 0.7957 0.8920
No log 3.0476 320 0.8323 0.6797 0.8323 0.9123
No log 3.0667 322 0.7511 0.7273 0.7511 0.8667
No log 3.0857 324 0.7228 0.7532 0.7228 0.8502
No log 3.1048 326 0.6662 0.7712 0.6662 0.8162
No log 3.1238 328 0.6598 0.7682 0.6598 0.8123
No log 3.1429 330 0.7309 0.7273 0.7309 0.8549
No log 3.1619 332 1.0310 0.7128 1.0310 1.0154
No log 3.1810 334 1.3323 0.6502 1.3323 1.1542
No log 3.2 336 1.3242 0.6473 1.3242 1.1508
No log 3.2190 338 1.2287 0.6601 1.2287 1.1085
No log 3.2381 340 0.9364 0.7429 0.9364 0.9677
No log 3.2571 342 0.7346 0.7417 0.7346 0.8571
No log 3.2762 344 0.7309 0.7361 0.7309 0.8549
No log 3.2952 346 0.7612 0.7413 0.7612 0.8725
No log 3.3143 348 0.8334 0.6618 0.8334 0.9129
No log 3.3333 350 0.9475 0.625 0.9475 0.9734
No log 3.3524 352 1.1033 0.5987 1.1033 1.0504
No log 3.3714 354 1.0992 0.6441 1.0992 1.0484
No log 3.3905 356 0.8923 0.7195 0.8923 0.9446
No log 3.4095 358 0.7275 0.7484 0.7275 0.8529
No log 3.4286 360 0.7031 0.7763 0.7031 0.8385
No log 3.4476 362 0.7423 0.7403 0.7423 0.8616
No log 3.4667 364 0.9351 0.7273 0.9351 0.9670
No log 3.4857 366 1.2541 0.6866 1.2541 1.1199
No log 3.5048 368 1.2725 0.6601 1.2725 1.1280
No log 3.5238 370 1.0359 0.6984 1.0359 1.0178
No log 3.5429 372 0.6981 0.7882 0.6981 0.8355
No log 3.5619 374 0.5405 0.7755 0.5405 0.7352
No log 3.5810 376 0.5690 0.7671 0.5690 0.7543
No log 3.6 378 0.6651 0.7568 0.6651 0.8155
No log 3.6190 380 0.8859 0.6846 0.8859 0.9412
No log 3.6381 382 1.0890 0.6061 1.0890 1.0435
No log 3.6571 384 1.1592 0.5904 1.1592 1.0766
No log 3.6762 386 1.0659 0.5974 1.0659 1.0324
No log 3.6952 388 0.9105 0.6573 0.9105 0.9542
No log 3.7143 390 0.8507 0.6857 0.8507 0.9223
No log 3.7333 392 0.8678 0.6759 0.8678 0.9316
No log 3.7524 394 0.7877 0.7211 0.7877 0.8875
No log 3.7714 396 0.6832 0.7568 0.6832 0.8266
No log 3.7905 398 0.6811 0.7517 0.6811 0.8253
No log 3.8095 400 0.8109 0.7262 0.8109 0.9005
No log 3.8286 402 0.9535 0.6893 0.9535 0.9765
No log 3.8476 404 1.0439 0.6629 1.0439 1.0217
No log 3.8667 406 0.9609 0.6380 0.9609 0.9802
No log 3.8857 408 0.8449 0.6806 0.8449 0.9192
No log 3.9048 410 0.8144 0.6853 0.8144 0.9024
No log 3.9238 412 0.8255 0.6714 0.8255 0.9086
No log 3.9429 414 0.8990 0.6528 0.8990 0.9482
No log 3.9619 416 0.9680 0.6447 0.9680 0.9839
No log 3.9810 418 0.9759 0.6447 0.9759 0.9879
No log 4.0 420 0.9182 0.6622 0.9182 0.9582
No log 4.0190 422 0.8673 0.6622 0.8673 0.9313
No log 4.0381 424 0.7924 0.6667 0.7924 0.8902
No log 4.0571 426 0.7588 0.6950 0.7588 0.8711
No log 4.0762 428 0.7577 0.6950 0.7577 0.8704
No log 4.0952 430 0.7311 0.7083 0.7311 0.8550
No log 4.1143 432 0.7298 0.7211 0.7298 0.8543
No log 4.1333 434 0.7760 0.6986 0.7760 0.8809
No log 4.1524 436 0.8683 0.6803 0.8683 0.9318
No log 4.1714 438 0.9499 0.6531 0.9499 0.9746
No log 4.1905 440 1.0331 0.6410 1.0331 1.0164
No log 4.2095 442 0.9723 0.6667 0.9723 0.9860
No log 4.2286 444 0.9087 0.6892 0.9087 0.9533
No log 4.2476 446 0.7878 0.7 0.7878 0.8876
No log 4.2667 448 0.7416 0.7429 0.7416 0.8612
No log 4.2857 450 0.7007 0.7429 0.7007 0.8371
No log 4.3048 452 0.7716 0.7297 0.7716 0.8784
No log 4.3238 454 0.8073 0.7226 0.8073 0.8985
No log 4.3429 456 0.8476 0.7362 0.8476 0.9207
No log 4.3619 458 0.8605 0.7337 0.8605 0.9276
No log 4.3810 460 0.7855 0.7394 0.7855 0.8863
No log 4.4 462 0.7125 0.7517 0.7125 0.8441
No log 4.4190 464 0.6522 0.7550 0.6522 0.8076
No log 4.4381 466 0.6744 0.7517 0.6744 0.8212
No log 4.4571 468 0.7751 0.7296 0.7751 0.8804
No log 4.4762 470 0.7674 0.7407 0.7674 0.8760
No log 4.4952 472 0.7191 0.7673 0.7191 0.8480
No log 4.5143 474 0.6187 0.7755 0.6187 0.7866
No log 4.5333 476 0.5954 0.7703 0.5954 0.7716
No log 4.5524 478 0.6129 0.7843 0.6129 0.7829
No log 4.5714 480 0.7035 0.775 0.7035 0.8387
No log 4.5905 482 0.7294 0.7547 0.7294 0.8540
No log 4.6095 484 0.7819 0.7375 0.7819 0.8842
No log 4.6286 486 0.8245 0.7134 0.8245 0.9080
No log 4.6476 488 0.8018 0.7179 0.8018 0.8954
No log 4.6667 490 0.7614 0.7261 0.7614 0.8726
No log 4.6857 492 0.7688 0.7134 0.7688 0.8768
No log 4.7048 494 0.7713 0.6909 0.7713 0.8782
No log 4.7238 496 0.6533 0.7792 0.6533 0.8082
No log 4.7429 498 0.6313 0.7792 0.6313 0.7945
0.4155 4.7619 500 0.7728 0.6623 0.7728 0.8791
0.4155 4.7810 502 0.9621 0.6497 0.9621 0.9809
0.4155 4.8 504 1.0962 0.6026 1.0962 1.0470
0.4155 4.8190 506 1.0686 0.5833 1.0686 1.0337
0.4155 4.8381 508 1.0001 0.6014 1.0001 1.0001
0.4155 4.8571 510 0.9415 0.6 0.9415 0.9703

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32)
