ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k4_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set, matching the final row of the training table below (a sketch of how such metrics are computed follows the list):

  • Loss: 0.5757
  • Qwk: 0.6196
  • Mse: 0.5757
  • Rmse: 0.7588
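
Qwk is quadratic weighted kappa, the standard agreement metric for ordinal essay scores; Mse and Rmse are the (root) mean squared error, so Rmse is just the square root of Mse. The card does not say how these numbers were produced; a typical recipe uses scikit-learn, as in this minimal sketch (the arrays are illustrative placeholders, not outputs of this run):

```python
# Illustrative metric computation for an ordinal essay-scoring task.
# y_true and y_pred are placeholder values, not predictions from this model.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 1, 4, 3])  # gold organization scores (example values)
y_pred = np.array([2, 2, 1, 4, 4])  # model predictions rounded to integer labels

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = float(np.sqrt(mse))                                    # Rmse
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```

Note that the validation loss and Mse columns in the training table are identical throughout, which suggests the model was trained with an MSE objective.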

Model description

More information needed

Intended uses & limitations

More information needed
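
Pending fuller documentation, the sketch below shows one plausible way to query the model. It assumes the checkpoint loads as a sequence-classification (scoring) head for essay organization, which the card itself does not confirm:

```python
# Minimal inference sketch. Assumptions: the repo id below, and that the
# checkpoint exposes a sequence-classification head; neither is documented
# on this card, so treat the output as illustrative.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k4_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

essay = "..."  # an Arabic essay to be scored for organization
inputs = tokenizer(essay, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)  # interpret as an organization score, depending on the head
```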

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
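
As a rough reconstruction (the actual training script is not included in this card, and output_dir below is a placeholder), the settings above correspond to Transformers TrainingArguments as follows:

```python
# Sketch of the hyperparameters above as Hugging Face TrainingArguments.
# "outputs" is a placeholder path; the real script is not part of this card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="outputs",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the default
    # AdamW configuration in Transformers, so no optimizer overrides needed.
)
```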

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|:----:|
| No log | 0.1 | 2 | 3.7768 | -0.0172 | 3.7768 | 1.9434 |
| No log | 0.2 | 4 | 1.9460 | 0.0733 | 1.9460 | 1.3950 |
| No log | 0.3 | 6 | 1.2706 | 0.0496 | 1.2706 | 1.1272 |
| No log | 0.4 | 8 | 1.1331 | 0.1268 | 1.1331 | 1.0645 |
| No log | 0.5 | 10 | 1.2394 | 0.0568 | 1.2394 | 1.1133 |
| No log | 0.6 | 12 | 1.1364 | 0.2471 | 1.1364 | 1.0660 |
| No log | 0.7 | 14 | 1.0100 | 0.3229 | 1.0100 | 1.0050 |
| No log | 0.8 | 16 | 0.8525 | 0.4285 | 0.8525 | 0.9233 |
| No log | 0.9 | 18 | 0.8221 | 0.4409 | 0.8221 | 0.9067 |
| No log | 1.0 | 20 | 0.8068 | 0.4285 | 0.8068 | 0.8982 |
| No log | 1.1 | 22 | 0.9064 | 0.4823 | 0.9064 | 0.9520 |
| No log | 1.2 | 24 | 1.0743 | 0.3750 | 1.0743 | 1.0365 |
| No log | 1.3 | 26 | 0.8899 | 0.4962 | 0.8899 | 0.9433 |
| No log | 1.4 | 28 | 0.7757 | 0.5245 | 0.7757 | 0.8807 |
| No log | 1.5 | 30 | 0.6834 | 0.5886 | 0.6834 | 0.8267 |
| No log | 1.6 | 32 | 0.7002 | 0.6054 | 0.7002 | 0.8368 |
| No log | 1.7 | 34 | 0.6583 | 0.6076 | 0.6583 | 0.8113 |
| No log | 1.8 | 36 | 0.7253 | 0.5794 | 0.7253 | 0.8516 |
| No log | 1.9 | 38 | 0.7678 | 0.5598 | 0.7678 | 0.8763 |
| No log | 2.0 | 40 | 0.6295 | 0.5995 | 0.6295 | 0.7934 |
| No log | 2.1 | 42 | 0.6647 | 0.6981 | 0.6647 | 0.8153 |
| No log | 2.2 | 44 | 0.6285 | 0.5921 | 0.6285 | 0.7928 |
| No log | 2.3 | 46 | 0.6655 | 0.6638 | 0.6655 | 0.8158 |
| No log | 2.4 | 48 | 0.6462 | 0.6077 | 0.6462 | 0.8039 |
| No log | 2.5 | 50 | 0.6157 | 0.5898 | 0.6157 | 0.7847 |
| No log | 2.6 | 52 | 0.6695 | 0.6756 | 0.6695 | 0.8182 |
| No log | 2.7 | 54 | 0.7169 | 0.6277 | 0.7169 | 0.8467 |
| No log | 2.8 | 56 | 0.8049 | 0.6123 | 0.8049 | 0.8972 |
| No log | 2.9 | 58 | 0.6657 | 0.6328 | 0.6657 | 0.8159 |
| No log | 3.0 | 60 | 0.6644 | 0.6638 | 0.6644 | 0.8151 |
| No log | 3.1 | 62 | 0.6673 | 0.6479 | 0.6673 | 0.8169 |
| No log | 3.2 | 64 | 0.6407 | 0.7010 | 0.6407 | 0.8004 |
| No log | 3.3 | 66 | 0.6314 | 0.6526 | 0.6314 | 0.7946 |
| No log | 3.4 | 68 | 0.6191 | 0.6729 | 0.6191 | 0.7868 |
| No log | 3.5 | 70 | 0.6857 | 0.6009 | 0.6857 | 0.8281 |
| No log | 3.6 | 72 | 0.8081 | 0.5782 | 0.8081 | 0.8990 |
| No log | 3.7 | 74 | 0.6908 | 0.6416 | 0.6908 | 0.8312 |
| No log | 3.8 | 76 | 0.7019 | 0.6393 | 0.7019 | 0.8378 |
| No log | 3.9 | 78 | 0.7018 | 0.6393 | 0.7018 | 0.8377 |
| No log | 4.0 | 80 | 0.6546 | 0.6251 | 0.6546 | 0.8091 |
| No log | 4.1 | 82 | 0.6427 | 0.6341 | 0.6427 | 0.8017 |
| No log | 4.2 | 84 | 0.6462 | 0.5995 | 0.6462 | 0.8039 |
| No log | 4.3 | 86 | 0.8142 | 0.5640 | 0.8142 | 0.9024 |
| No log | 4.4 | 88 | 1.2096 | 0.4017 | 1.2096 | 1.0998 |
| No log | 4.5 | 90 | 1.1036 | 0.4240 | 1.1036 | 1.0505 |
| No log | 4.6 | 92 | 0.7090 | 0.5726 | 0.7090 | 0.8420 |
| No log | 4.7 | 94 | 0.6893 | 0.6240 | 0.6893 | 0.8302 |
| No log | 4.8 | 96 | 0.7206 | 0.5470 | 0.7206 | 0.8489 |
| No log | 4.9 | 98 | 0.6316 | 0.6360 | 0.6316 | 0.7947 |
| No log | 5.0 | 100 | 0.6852 | 0.5573 | 0.6852 | 0.8277 |
| No log | 5.1 | 102 | 0.7547 | 0.4797 | 0.7547 | 0.8687 |
| No log | 5.2 | 104 | 0.6564 | 0.5955 | 0.6564 | 0.8102 |
| No log | 5.3 | 106 | 0.6177 | 0.7118 | 0.6177 | 0.7859 |
| No log | 5.4 | 108 | 0.7138 | 0.6402 | 0.7138 | 0.8449 |
| No log | 5.5 | 110 | 0.7061 | 0.6842 | 0.7061 | 0.8403 |
| No log | 5.6 | 112 | 0.6043 | 0.6260 | 0.6043 | 0.7774 |
| No log | 5.7 | 114 | 0.6683 | 0.6098 | 0.6683 | 0.8175 |
| No log | 5.8 | 116 | 0.6875 | 0.5782 | 0.6875 | 0.8292 |
| No log | 5.9 | 118 | 0.6068 | 0.6154 | 0.6068 | 0.7790 |
| No log | 6.0 | 120 | 0.5976 | 0.7088 | 0.5976 | 0.7731 |
| No log | 6.1 | 122 | 0.6061 | 0.6712 | 0.6061 | 0.7785 |
| No log | 6.2 | 124 | 0.5861 | 0.6039 | 0.5861 | 0.7656 |
| No log | 6.3 | 126 | 0.6696 | 0.5665 | 0.6696 | 0.8183 |
| No log | 6.4 | 128 | 0.7884 | 0.5896 | 0.7884 | 0.8879 |
| No log | 6.5 | 130 | 0.6889 | 0.6744 | 0.6889 | 0.8300 |
| No log | 6.6 | 132 | 0.6178 | 0.6775 | 0.6178 | 0.7860 |
| No log | 6.7 | 134 | 0.6200 | 0.6740 | 0.6200 | 0.7874 |
| No log | 6.8 | 136 | 0.6355 | 0.6586 | 0.6355 | 0.7972 |
| No log | 6.9 | 138 | 0.5863 | 0.6606 | 0.5863 | 0.7657 |
| No log | 7.0 | 140 | 0.5871 | 0.6167 | 0.5871 | 0.7662 |
| No log | 7.1 | 142 | 0.6112 | 0.5585 | 0.6112 | 0.7818 |
| No log | 7.2 | 144 | 0.6092 | 0.6133 | 0.6092 | 0.7805 |
| No log | 7.3 | 146 | 0.6084 | 0.5869 | 0.6084 | 0.7800 |
| No log | 7.4 | 148 | 0.5677 | 0.6720 | 0.5677 | 0.7535 |
| No log | 7.5 | 150 | 0.6463 | 0.6062 | 0.6463 | 0.8039 |
| No log | 7.6 | 152 | 0.7580 | 0.6251 | 0.7580 | 0.8706 |
| No log | 7.7 | 154 | 0.6779 | 0.6253 | 0.6779 | 0.8234 |
| No log | 7.8 | 156 | 0.6186 | 0.7153 | 0.6186 | 0.7865 |
| No log | 7.9 | 158 | 0.6317 | 0.7244 | 0.6317 | 0.7948 |
| No log | 8.0 | 160 | 0.7559 | 0.6070 | 0.7559 | 0.8694 |
| No log | 8.1 | 162 | 0.7791 | 0.5754 | 0.7791 | 0.8826 |
| No log | 8.2 | 164 | 0.6601 | 0.6559 | 0.6601 | 0.8124 |
| No log | 8.3 | 166 | 0.6307 | 0.5711 | 0.6307 | 0.7942 |
| No log | 8.4 | 168 | 0.7003 | 0.6371 | 0.7003 | 0.8368 |
| No log | 8.5 | 170 | 0.6767 | 0.6018 | 0.6767 | 0.8226 |
| No log | 8.6 | 172 | 0.6183 | 0.5975 | 0.6183 | 0.7863 |
| No log | 8.7 | 174 | 0.6308 | 0.6246 | 0.6308 | 0.7942 |
| No log | 8.8 | 176 | 0.6440 | 0.6479 | 0.6440 | 0.8025 |
| No log | 8.9 | 178 | 0.6183 | 0.6263 | 0.6183 | 0.7863 |
| No log | 9.0 | 180 | 0.6426 | 0.6928 | 0.6426 | 0.8017 |
| No log | 9.1 | 182 | 0.6818 | 0.6777 | 0.6818 | 0.8257 |
| No log | 9.2 | 184 | 0.6527 | 0.6992 | 0.6527 | 0.8079 |
| No log | 9.3 | 186 | 0.6549 | 0.6054 | 0.6549 | 0.8092 |
| No log | 9.4 | 188 | 0.7199 | 0.5895 | 0.7199 | 0.8484 |
| No log | 9.5 | 190 | 0.6972 | 0.6157 | 0.6972 | 0.8350 |
| No log | 9.6 | 192 | 0.6436 | 0.6186 | 0.6436 | 0.8023 |
| No log | 9.7 | 194 | 0.6562 | 0.6699 | 0.6562 | 0.8101 |
| No log | 9.8 | 196 | 0.6764 | 0.7001 | 0.6764 | 0.8224 |
| No log | 9.9 | 198 | 0.7066 | 0.6442 | 0.7066 | 0.8406 |
| No log | 10.0 | 200 | 0.7372 | 0.6547 | 0.7372 | 0.8586 |
| No log | 10.1 | 202 | 0.7621 | 0.5782 | 0.7621 | 0.8730 |
| No log | 10.2 | 204 | 0.7093 | 0.5825 | 0.7093 | 0.8422 |
| No log | 10.3 | 206 | 0.6614 | 0.5688 | 0.6614 | 0.8133 |
| No log | 10.4 | 208 | 0.6712 | 0.5845 | 0.6712 | 0.8193 |
| No log | 10.5 | 210 | 0.6841 | 0.6185 | 0.6841 | 0.8271 |
| No log | 10.6 | 212 | 0.7300 | 0.6330 | 0.7300 | 0.8544 |
| No log | 10.7 | 214 | 0.7948 | 0.5850 | 0.7948 | 0.8915 |
| No log | 10.8 | 216 | 0.7530 | 0.6366 | 0.7530 | 0.8678 |
| No log | 10.9 | 218 | 0.7110 | 0.6202 | 0.7110 | 0.8432 |
| No log | 11.0 | 220 | 0.7066 | 0.6202 | 0.7066 | 0.8406 |
| No log | 11.1 | 222 | 0.6379 | 0.6659 | 0.6379 | 0.7987 |
| No log | 11.2 | 224 | 0.6246 | 0.6853 | 0.6246 | 0.7903 |
| No log | 11.3 | 226 | 0.6252 | 0.5817 | 0.6252 | 0.7907 |
| No log | 11.4 | 228 | 0.6517 | 0.5633 | 0.6517 | 0.8073 |
| No log | 11.5 | 230 | 0.7075 | 0.5727 | 0.7075 | 0.8411 |
| No log | 11.6 | 232 | 0.7002 | 0.5922 | 0.7002 | 0.8368 |
| No log | 11.7 | 234 | 0.6560 | 0.6630 | 0.6560 | 0.8099 |
| No log | 11.8 | 236 | 0.6999 | 0.6691 | 0.6999 | 0.8366 |
| No log | 11.9 | 238 | 0.7096 | 0.6615 | 0.7096 | 0.8424 |
| No log | 12.0 | 240 | 0.6917 | 0.6453 | 0.6917 | 0.8317 |
| No log | 12.1 | 242 | 0.6465 | 0.5979 | 0.6465 | 0.8041 |
| No log | 12.2 | 244 | 0.6188 | 0.6049 | 0.6188 | 0.7866 |
| No log | 12.3 | 246 | 0.6102 | 0.6498 | 0.6102 | 0.7811 |
| No log | 12.4 | 248 | 0.6212 | 0.6584 | 0.6212 | 0.7882 |
| No log | 12.5 | 250 | 0.6019 | 0.6695 | 0.6019 | 0.7758 |
| No log | 12.6 | 252 | 0.5730 | 0.7404 | 0.5730 | 0.7569 |
| No log | 12.7 | 254 | 0.5782 | 0.6828 | 0.5782 | 0.7604 |
| No log | 12.8 | 256 | 0.6193 | 0.6541 | 0.6193 | 0.7869 |
| No log | 12.9 | 258 | 0.6738 | 0.6357 | 0.6738 | 0.8208 |
| No log | 13.0 | 260 | 0.6375 | 0.6418 | 0.6375 | 0.7985 |
| No log | 13.1 | 262 | 0.6194 | 0.6699 | 0.6194 | 0.7870 |
| No log | 13.2 | 264 | 0.6716 | 0.6612 | 0.6716 | 0.8195 |
| No log | 13.3 | 266 | 0.7212 | 0.6920 | 0.7212 | 0.8492 |
| No log | 13.4 | 268 | 0.8571 | 0.5720 | 0.8571 | 0.9258 |
| No log | 13.5 | 270 | 0.8787 | 0.5728 | 0.8787 | 0.9374 |
| No log | 13.6 | 272 | 0.7560 | 0.6765 | 0.7560 | 0.8695 |
| No log | 13.7 | 274 | 0.6584 | 0.5984 | 0.6584 | 0.8114 |
| No log | 13.8 | 276 | 0.6483 | 0.6105 | 0.6483 | 0.8051 |
| No log | 13.9 | 278 | 0.6255 | 0.6001 | 0.6255 | 0.7909 |
| No log | 14.0 | 280 | 0.6307 | 0.6001 | 0.6307 | 0.7942 |
| No log | 14.1 | 282 | 0.6355 | 0.6001 | 0.6355 | 0.7972 |
| No log | 14.2 | 284 | 0.6348 | 0.5819 | 0.6348 | 0.7968 |
| No log | 14.3 | 286 | 0.6352 | 0.5909 | 0.6352 | 0.7970 |
| No log | 14.4 | 288 | 0.6593 | 0.6377 | 0.6593 | 0.8120 |
| No log | 14.5 | 290 | 0.6998 | 0.6214 | 0.6998 | 0.8365 |
| No log | 14.6 | 292 | 0.6355 | 0.6683 | 0.6355 | 0.7972 |
| No log | 14.7 | 294 | 0.5922 | 0.6057 | 0.5922 | 0.7695 |
| No log | 14.8 | 296 | 0.5867 | 0.6164 | 0.5867 | 0.7660 |
| No log | 14.9 | 298 | 0.5824 | 0.6330 | 0.5824 | 0.7632 |
| No log | 15.0 | 300 | 0.5799 | 0.6306 | 0.5799 | 0.7615 |
| No log | 15.1 | 302 | 0.5877 | 0.6237 | 0.5877 | 0.7666 |
| No log | 15.2 | 304 | 0.5797 | 0.6392 | 0.5797 | 0.7614 |
| No log | 15.3 | 306 | 0.5813 | 0.6406 | 0.5813 | 0.7624 |
| No log | 15.4 | 308 | 0.5986 | 0.7013 | 0.5986 | 0.7737 |
| No log | 15.5 | 310 | 0.5493 | 0.6779 | 0.5493 | 0.7411 |
| No log | 15.6 | 312 | 0.5402 | 0.6364 | 0.5402 | 0.7350 |
| No log | 15.7 | 314 | 0.5468 | 0.6237 | 0.5468 | 0.7395 |
| No log | 15.8 | 316 | 0.5415 | 0.6339 | 0.5415 | 0.7359 |
| No log | 15.9 | 318 | 0.5457 | 0.6339 | 0.5457 | 0.7387 |
| No log | 16.0 | 320 | 0.5542 | 0.6339 | 0.5542 | 0.7444 |
| No log | 16.1 | 322 | 0.5605 | 0.6237 | 0.5605 | 0.7486 |
| No log | 16.2 | 324 | 0.5553 | 0.6144 | 0.5553 | 0.7452 |
| No log | 16.3 | 326 | 0.5883 | 0.6025 | 0.5883 | 0.7670 |
| No log | 16.4 | 328 | 0.6113 | 0.5917 | 0.6113 | 0.7819 |
| No log | 16.5 | 330 | 0.6091 | 0.6094 | 0.6091 | 0.7804 |
| No log | 16.6 | 332 | 0.5964 | 0.6589 | 0.5964 | 0.7723 |
| No log | 16.7 | 334 | 0.6475 | 0.6509 | 0.6475 | 0.8047 |
| No log | 16.8 | 336 | 0.7010 | 0.5832 | 0.7010 | 0.8373 |
| No log | 16.9 | 338 | 0.6883 | 0.5763 | 0.6883 | 0.8297 |
| No log | 17.0 | 340 | 0.6298 | 0.6195 | 0.6298 | 0.7936 |
| No log | 17.1 | 342 | 0.5997 | 0.6237 | 0.5997 | 0.7744 |
| No log | 17.2 | 344 | 0.6275 | 0.6167 | 0.6275 | 0.7921 |
| No log | 17.3 | 346 | 0.6839 | 0.6218 | 0.6839 | 0.8270 |
| No log | 17.4 | 348 | 0.6812 | 0.6249 | 0.6812 | 0.8253 |
| No log | 17.5 | 350 | 0.6266 | 0.6241 | 0.6266 | 0.7916 |
| No log | 17.6 | 352 | 0.5815 | 0.6272 | 0.5815 | 0.7626 |
| No log | 17.7 | 354 | 0.6179 | 0.6164 | 0.6179 | 0.7861 |
| No log | 17.8 | 356 | 0.7264 | 0.5908 | 0.7264 | 0.8523 |
| No log | 17.9 | 358 | 0.8143 | 0.5828 | 0.8143 | 0.9024 |
| No log | 18.0 | 360 | 0.7928 | 0.5828 | 0.7928 | 0.8904 |
| No log | 18.1 | 362 | 0.6918 | 0.6620 | 0.6918 | 0.8318 |
| No log | 18.2 | 364 | 0.6383 | 0.6509 | 0.6383 | 0.7989 |
| No log | 18.3 | 366 | 0.6408 | 0.6412 | 0.6408 | 0.8005 |
| No log | 18.4 | 368 | 0.6619 | 0.6705 | 0.6619 | 0.8136 |
| No log | 18.5 | 370 | 0.6628 | 0.6109 | 0.6628 | 0.8141 |
| No log | 18.6 | 372 | 0.6598 | 0.6352 | 0.6598 | 0.8123 |
| No log | 18.7 | 374 | 0.6276 | 0.6584 | 0.6276 | 0.7922 |
| No log | 18.8 | 376 | 0.5913 | 0.6740 | 0.5913 | 0.7689 |
| No log | 18.9 | 378 | 0.6069 | 0.6341 | 0.6069 | 0.7790 |
| No log | 19.0 | 380 | 0.6482 | 0.6820 | 0.6482 | 0.8051 |
| No log | 19.1 | 382 | 0.6744 | 0.6613 | 0.6744 | 0.8212 |
| No log | 19.2 | 384 | 0.6844 | 0.6613 | 0.6844 | 0.8273 |
| No log | 19.3 | 386 | 0.6649 | 0.6453 | 0.6649 | 0.8154 |
| No log | 19.4 | 388 | 0.6433 | 0.6667 | 0.6433 | 0.8021 |
| No log | 19.5 | 390 | 0.5996 | 0.6553 | 0.5996 | 0.7743 |
| No log | 19.6 | 392 | 0.5851 | 0.6650 | 0.5851 | 0.7649 |
| No log | 19.7 | 394 | 0.5896 | 0.6916 | 0.5896 | 0.7679 |
| No log | 19.8 | 396 | 0.5963 | 0.6814 | 0.5963 | 0.7722 |
| No log | 19.9 | 398 | 0.5967 | 0.6814 | 0.5967 | 0.7725 |
| No log | 20.0 | 400 | 0.5970 | 0.6424 | 0.5970 | 0.7727 |
| No log | 20.1 | 402 | 0.6133 | 0.6446 | 0.6133 | 0.7831 |
| No log | 20.2 | 404 | 0.6061 | 0.6278 | 0.6061 | 0.7785 |
| No log | 20.3 | 406 | 0.6025 | 0.6207 | 0.6025 | 0.7762 |
| No log | 20.4 | 408 | 0.5969 | 0.6157 | 0.5969 | 0.7726 |
| No log | 20.5 | 410 | 0.5934 | 0.6341 | 0.5934 | 0.7703 |
| No log | 20.6 | 412 | 0.5991 | 0.6319 | 0.5991 | 0.7740 |
| No log | 20.7 | 414 | 0.6019 | 0.6553 | 0.6019 | 0.7758 |
| No log | 20.8 | 416 | 0.5967 | 0.6553 | 0.5967 | 0.7725 |
| No log | 20.9 | 418 | 0.5886 | 0.6553 | 0.5886 | 0.7672 |
| No log | 21.0 | 420 | 0.5898 | 0.6237 | 0.5898 | 0.7680 |
| No log | 21.1 | 422 | 0.5909 | 0.6049 | 0.5909 | 0.7687 |
| No log | 21.2 | 424 | 0.5889 | 0.6237 | 0.5889 | 0.7674 |
| No log | 21.3 | 426 | 0.5917 | 0.6545 | 0.5917 | 0.7692 |
| No log | 21.4 | 428 | 0.6054 | 0.6226 | 0.6054 | 0.7781 |
| No log | 21.5 | 430 | 0.6204 | 0.6822 | 0.6204 | 0.7877 |
| No log | 21.6 | 432 | 0.6054 | 0.6857 | 0.6054 | 0.7780 |
| No log | 21.7 | 434 | 0.5864 | 0.7001 | 0.5864 | 0.7657 |
| No log | 21.8 | 436 | 0.5784 | 0.6796 | 0.5784 | 0.7605 |
| No log | 21.9 | 438 | 0.5601 | 0.6753 | 0.5601 | 0.7484 |
| No log | 22.0 | 440 | 0.5470 | 0.6306 | 0.5470 | 0.7396 |
| No log | 22.1 | 442 | 0.5428 | 0.6650 | 0.5428 | 0.7368 |
| No log | 22.2 | 444 | 0.5415 | 0.6176 | 0.5415 | 0.7359 |
| No log | 22.3 | 446 | 0.5493 | 0.6349 | 0.5493 | 0.7412 |
| No log | 22.4 | 448 | 0.5652 | 0.6288 | 0.5652 | 0.7518 |
| No log | 22.5 | 450 | 0.5478 | 0.6349 | 0.5478 | 0.7401 |
| No log | 22.6 | 452 | 0.5502 | 0.6774 | 0.5502 | 0.7417 |
| No log | 22.7 | 454 | 0.5812 | 0.6468 | 0.5812 | 0.7624 |
| No log | 22.8 | 456 | 0.5785 | 0.6797 | 0.5785 | 0.7606 |
| No log | 22.9 | 458 | 0.5902 | 0.6838 | 0.5902 | 0.7682 |
| No log | 23.0 | 460 | 0.5759 | 0.7385 | 0.5759 | 0.7589 |
| No log | 23.1 | 462 | 0.5499 | 0.6880 | 0.5499 | 0.7416 |
| No log | 23.2 | 464 | 0.5399 | 0.6581 | 0.5399 | 0.7348 |
| No log | 23.3 | 466 | 0.5429 | 0.6553 | 0.5429 | 0.7368 |
| No log | 23.4 | 468 | 0.5512 | 0.6717 | 0.5512 | 0.7424 |
| No log | 23.5 | 470 | 0.5648 | 0.6452 | 0.5648 | 0.7515 |
| No log | 23.6 | 472 | 0.5702 | 0.6452 | 0.5702 | 0.7551 |
| No log | 23.7 | 474 | 0.5616 | 0.6680 | 0.5616 | 0.7494 |
| No log | 23.8 | 476 | 0.5467 | 0.6813 | 0.5467 | 0.7394 |
| No log | 23.9 | 478 | 0.5430 | 0.6680 | 0.5430 | 0.7369 |
| No log | 24.0 | 480 | 0.5515 | 0.6880 | 0.5515 | 0.7427 |
| No log | 24.1 | 482 | 0.5837 | 0.7059 | 0.5837 | 0.7640 |
| No log | 24.2 | 484 | 0.6485 | 0.6581 | 0.6485 | 0.8053 |
| No log | 24.3 | 486 | 0.6687 | 0.6746 | 0.6687 | 0.8177 |
| No log | 24.4 | 488 | 0.6177 | 0.6951 | 0.6177 | 0.7859 |
| No log | 24.5 | 490 | 0.5589 | 0.6846 | 0.5589 | 0.7476 |
| No log | 24.6 | 492 | 0.5402 | 0.6011 | 0.5402 | 0.7350 |
| No log | 24.7 | 494 | 0.5766 | 0.6725 | 0.5766 | 0.7593 |
| No log | 24.8 | 496 | 0.6656 | 0.6507 | 0.6656 | 0.8158 |
| No log | 24.9 | 498 | 0.6955 | 0.6434 | 0.6955 | 0.8340 |
| 0.2037 | 25.0 | 500 | 0.6629 | 0.6602 | 0.6629 | 0.8142 |
| 0.2037 | 25.1 | 502 | 0.6197 | 0.6884 | 0.6197 | 0.7872 |
| 0.2037 | 25.2 | 504 | 0.6079 | 0.6644 | 0.6079 | 0.7797 |
| 0.2037 | 25.3 | 506 | 0.6118 | 0.6875 | 0.6118 | 0.7822 |
| 0.2037 | 25.4 | 508 | 0.6320 | 0.6736 | 0.6320 | 0.7950 |
| 0.2037 | 25.5 | 510 | 0.6594 | 0.6533 | 0.6594 | 0.8120 |
| 0.2037 | 25.6 | 512 | 0.6483 | 0.6533 | 0.6483 | 0.8052 |
| 0.2037 | 25.7 | 514 | 0.6302 | 0.6736 | 0.6302 | 0.7938 |
| 0.2037 | 25.8 | 516 | 0.6153 | 0.6328 | 0.6153 | 0.7844 |
| 0.2037 | 25.9 | 518 | 0.6067 | 0.6717 | 0.6067 | 0.7789 |
| 0.2037 | 26.0 | 520 | 0.6198 | 0.6269 | 0.6198 | 0.7873 |
| 0.2037 | 26.1 | 522 | 0.6180 | 0.6717 | 0.6180 | 0.7861 |
| 0.2037 | 26.2 | 524 | 0.6077 | 0.6813 | 0.6077 | 0.7795 |
| 0.2037 | 26.3 | 526 | 0.5866 | 0.6553 | 0.5866 | 0.7659 |
| 0.2037 | 26.4 | 528 | 0.5778 | 0.5995 | 0.5778 | 0.7601 |
| 0.2037 | 26.5 | 530 | 0.5716 | 0.6105 | 0.5716 | 0.7561 |
| 0.2037 | 26.6 | 532 | 0.5711 | 0.6011 | 0.5711 | 0.7557 |
| 0.2037 | 26.7 | 534 | 0.5757 | 0.6196 | 0.5757 | 0.7588 |

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
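
For reproducibility, these pins can be sanity-checked at load time with a short script such as this sketch:

```python
# Quick check that the local environment matches the versions listed above.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.44.2",
    "torch": "2.4.0+cu118",
    "datasets": "2.21.0",
    "tokenizers": "0.19.1",
}
for name, module in [("transformers", transformers), ("torch", torch),
                     ("datasets", datasets), ("tokenizers", tokenizers)]:
    status = "OK" if module.__version__ == expected[name] else "MISMATCH"
    print(f"{name}: installed {module.__version__}, card lists {expected[name]} -> {status}")
```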

Model size: 0.1B params · Tensor type: F32 · Format: Safetensors
