ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k13_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5961
  • Qwk (quadratic weighted kappa): 0.6890
  • Mse (mean squared error): 0.5961
  • Rmse (root mean squared error): 0.7721
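The three metrics above are related: Qwk is Cohen's kappa with quadratic weights (standard for ordinal essay-scoring labels), and Rmse is simply the square root of Mse. A minimal pure-Python sketch on a hypothetical set of gold and predicted organization scores (not the actual evaluation code, whose details the card does not give):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic disagreement weights (the Qwk column)."""
    # Observed confusion matrix: rows = gold label, columns = predicted label
    obs = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    hist_true = [sum(row) for row in obs]
    hist_pred = [sum(obs[i][j] for i in range(n_classes)) for j in range(n_classes)]
    n = len(y_true)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2      # quadratic weight
            num += w * obs[i][j]                          # observed disagreement
            den += w * hist_true[i] * hist_pred[j] / n    # chance disagreement
    return 1.0 - num / den

# Hypothetical gold and predicted scores on a 0-3 organization scale
y_true = [0, 1, 2, 2, 3, 1]
y_pred = [0, 1, 1, 2, 3, 2]

mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
rmse = math.sqrt(mse)
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=4)
```

Note that Loss and Mse are identical (0.5961) in the evaluation results, which suggests the model was trained as a regressor with an MSE objective whose outputs are rounded to score bins before computing Qwk; that is an inference from the numbers, not something the card states.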

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
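The linear schedule above decays the learning rate from 2e-05 toward zero over the scheduled run. A small sketch of that decay; the value steps_per_epoch=65 is inferred from the results table (step 500 falls at epoch ~7.69), not stated in the card, and no warmup is assumed:

```python
def linear_lr(step, base_lr=2e-5, num_epochs=100, steps_per_epoch=65):
    """Learning rate at a given optimizer step under lr_scheduler_type: linear.

    Decays linearly from base_lr at step 0 to 0 at the final scheduled step
    (no warmup phase assumed).
    """
    total_steps = num_epochs * steps_per_epoch
    return base_lr * max(0.0, 1.0 - step / total_steps)

# Learning rate at the start, midpoint, and end of the schedule
lrs = [linear_lr(s) for s in (0, 3250, 6500)]
```

Although num_epochs is 100, the results table ends at epoch 7.8462 (step 510), so training evidently stopped early; with a linear schedule sized for 100 epochs, the learning rate therefore only decayed a few percent from its initial value.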

Training results

Training loss is logged every 500 steps, so rows before step 500 show "No log" in the first column. Columns: Training Loss, Epoch, Step, Validation Loss, Qwk, Mse, Rmse.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0308 2 3.8900 -0.0092 3.8900 1.9723
No log 0.0615 4 1.9911 0.0435 1.9911 1.4111
No log 0.0923 6 1.2371 0.0820 1.2371 1.1122
No log 0.1231 8 1.2357 0.1033 1.2357 1.1116
No log 0.1538 10 1.1360 0.1935 1.1360 1.0658
No log 0.1846 12 1.1292 0.3044 1.1292 1.0627
No log 0.2154 14 1.0405 0.1713 1.0405 1.0200
No log 0.2462 16 1.2661 0.0370 1.2661 1.1252
No log 0.2769 18 1.4421 0.1280 1.4421 1.2009
No log 0.3077 20 1.2910 0.1765 1.2910 1.1362
No log 0.3385 22 0.8942 0.3813 0.8942 0.9456
No log 0.3692 24 0.7581 0.3998 0.7581 0.8707
No log 0.4 26 0.7298 0.4537 0.7298 0.8543
No log 0.4308 28 0.7517 0.4312 0.7517 0.8670
No log 0.4615 30 0.8630 0.5145 0.8630 0.9290
No log 0.4923 32 0.9627 0.5068 0.9627 0.9812
No log 0.5231 34 0.7937 0.6008 0.7937 0.8909
No log 0.5538 36 0.7433 0.5865 0.7433 0.8622
No log 0.5846 38 0.7622 0.5958 0.7622 0.8730
No log 0.6154 40 0.8189 0.5952 0.8189 0.9050
No log 0.6462 42 0.9143 0.5316 0.9143 0.9562
No log 0.6769 44 0.7392 0.6128 0.7392 0.8598
No log 0.7077 46 0.6719 0.6139 0.6719 0.8197
No log 0.7385 48 0.7953 0.5749 0.7953 0.8918
No log 0.7692 50 0.6784 0.6644 0.6784 0.8236
No log 0.8 52 0.6769 0.6229 0.6769 0.8227
No log 0.8308 54 0.6854 0.6229 0.6854 0.8279
No log 0.8615 56 0.7221 0.5852 0.7221 0.8498
No log 0.8923 58 0.8343 0.5835 0.8343 0.9134
No log 0.9231 60 0.7540 0.6231 0.7540 0.8683
No log 0.9538 62 0.7612 0.6362 0.7612 0.8725
No log 0.9846 64 0.9237 0.5838 0.9237 0.9611
No log 1.0154 66 0.9356 0.5739 0.9356 0.9673
No log 1.0462 68 0.9265 0.6024 0.9265 0.9626
No log 1.0769 70 0.9042 0.5870 0.9042 0.9509
No log 1.1077 72 0.8629 0.5746 0.8629 0.9289
No log 1.1385 74 0.8113 0.5575 0.8113 0.9007
No log 1.1692 76 0.7205 0.6446 0.7205 0.8488
No log 1.2 78 0.7656 0.5530 0.7656 0.8750
No log 1.2308 80 0.8709 0.5443 0.8709 0.9332
No log 1.2615 82 0.9695 0.4681 0.9695 0.9846
No log 1.2923 84 0.8320 0.5942 0.8320 0.9121
No log 1.3231 86 0.7270 0.6138 0.7270 0.8526
No log 1.3538 88 0.8217 0.5482 0.8217 0.9065
No log 1.3846 90 0.7695 0.6124 0.7695 0.8772
No log 1.4154 92 0.7367 0.6789 0.7367 0.8583
No log 1.4462 94 0.8312 0.5491 0.8312 0.9117
No log 1.4769 96 0.8334 0.5693 0.8334 0.9129
No log 1.5077 98 0.7257 0.5926 0.7257 0.8519
No log 1.5385 100 0.6474 0.6664 0.6474 0.8046
No log 1.5692 102 0.6390 0.6860 0.6390 0.7994
No log 1.6 104 0.6383 0.6636 0.6383 0.7989
No log 1.6308 106 0.6422 0.6334 0.6422 0.8014
No log 1.6615 108 0.6426 0.6195 0.6426 0.8016
No log 1.6923 110 0.6495 0.6651 0.6495 0.8059
No log 1.7231 112 0.6516 0.6651 0.6516 0.8072
No log 1.7538 114 0.6674 0.6345 0.6674 0.8169
No log 1.7846 116 0.6625 0.6490 0.6625 0.8140
No log 1.8154 118 0.6362 0.6882 0.6362 0.7976
No log 1.8462 120 0.6479 0.6728 0.6479 0.8049
No log 1.8769 122 0.6902 0.6345 0.6902 0.8308
No log 1.9077 124 0.6852 0.7180 0.6852 0.8278
No log 1.9385 126 0.8291 0.6062 0.8291 0.9106
No log 1.9692 128 1.0382 0.4776 1.0382 1.0189
No log 2.0 130 0.9739 0.4938 0.9739 0.9869
No log 2.0308 132 0.7924 0.5415 0.7924 0.8902
No log 2.0615 134 0.6396 0.6882 0.6396 0.7998
No log 2.0923 136 0.6949 0.5631 0.6949 0.8336
No log 2.1231 138 0.7928 0.5573 0.7928 0.8904
No log 2.1538 140 0.6859 0.5978 0.6859 0.8282
No log 2.1846 142 0.5643 0.6387 0.5643 0.7512
No log 2.2154 144 0.5697 0.6078 0.5697 0.7548
No log 2.2462 146 0.5535 0.6667 0.5535 0.7440
No log 2.2769 148 0.5745 0.7161 0.5745 0.7580
No log 2.3077 150 0.6408 0.6988 0.6408 0.8005
No log 2.3385 152 0.6688 0.6988 0.6688 0.8178
No log 2.3692 154 0.6292 0.6995 0.6292 0.7932
No log 2.4 156 0.5847 0.7071 0.5847 0.7647
No log 2.4308 158 0.6308 0.6019 0.6308 0.7942
No log 2.4615 160 0.6997 0.6059 0.6997 0.8365
No log 2.4923 162 0.6596 0.6810 0.6596 0.8121
No log 2.5231 164 0.6394 0.7027 0.6394 0.7996
No log 2.5538 166 0.6598 0.6889 0.6598 0.8123
No log 2.5846 168 0.6602 0.6920 0.6602 0.8125
No log 2.6154 170 0.6857 0.6886 0.6857 0.8281
No log 2.6462 172 0.8890 0.5264 0.8890 0.9429
No log 2.6769 174 0.8267 0.5899 0.8267 0.9093
No log 2.7077 176 0.6899 0.6374 0.6899 0.8306
No log 2.7385 178 0.6031 0.6805 0.6031 0.7766
No log 2.7692 180 0.5976 0.6664 0.5976 0.7731
No log 2.8 182 0.5834 0.6473 0.5834 0.7638
No log 2.8308 184 0.6324 0.6820 0.6324 0.7952
No log 2.8615 186 0.7307 0.5934 0.7307 0.8548
No log 2.8923 188 0.7142 0.6299 0.7142 0.8451
No log 2.9231 190 0.6357 0.6470 0.6357 0.7973
No log 2.9538 192 0.5878 0.6584 0.5878 0.7667
No log 2.9846 194 0.6320 0.6249 0.6320 0.7950
No log 3.0154 196 0.6950 0.5938 0.6950 0.8337
No log 3.0462 198 0.6520 0.6157 0.6520 0.8075
No log 3.0769 200 0.6088 0.6588 0.6088 0.7802
No log 3.1077 202 0.6059 0.6097 0.6059 0.7784
No log 3.1385 204 0.6214 0.6417 0.6214 0.7883
No log 3.1692 206 0.6535 0.6876 0.6535 0.8084
No log 3.2 208 0.6768 0.6740 0.6768 0.8227
No log 3.2308 210 0.6772 0.6567 0.6772 0.8229
No log 3.2615 212 0.7266 0.6241 0.7266 0.8524
No log 3.2923 214 0.6534 0.6452 0.6534 0.8083
No log 3.3231 216 0.6147 0.6555 0.6147 0.7840
No log 3.3538 218 0.6259 0.6397 0.6259 0.7912
No log 3.3846 220 0.6158 0.6397 0.6158 0.7848
No log 3.4154 222 0.6215 0.6269 0.6215 0.7884
No log 3.4462 224 0.6427 0.6405 0.6427 0.8017
No log 3.4769 226 0.6633 0.5666 0.6633 0.8144
No log 3.5077 228 0.6696 0.5880 0.6696 0.8183
No log 3.5385 230 0.6659 0.6473 0.6659 0.8160
No log 3.5692 232 0.6857 0.6468 0.6857 0.8281
No log 3.6 234 0.7029 0.6396 0.7029 0.8384
No log 3.6308 236 0.6921 0.6628 0.6921 0.8319
No log 3.6615 238 0.7044 0.6586 0.7044 0.8393
No log 3.6923 240 0.6769 0.6476 0.6769 0.8227
No log 3.7231 242 0.6702 0.5983 0.6702 0.8187
No log 3.7538 244 0.6144 0.6405 0.6144 0.7838
No log 3.7846 246 0.5989 0.6593 0.5989 0.7739
No log 3.8154 248 0.6098 0.6796 0.6098 0.7809
No log 3.8462 250 0.6726 0.7070 0.6726 0.8201
No log 3.8769 252 0.7883 0.5949 0.7883 0.8879
No log 3.9077 254 0.7529 0.5724 0.7529 0.8677
No log 3.9385 256 0.6610 0.6586 0.6610 0.8130
No log 3.9692 258 0.6206 0.7154 0.6206 0.7878
No log 4.0 260 0.6343 0.7103 0.6343 0.7964
No log 4.0308 262 0.6293 0.6537 0.6293 0.7933
No log 4.0615 264 0.6438 0.6007 0.6438 0.8023
No log 4.0923 266 0.6708 0.5969 0.6708 0.8190
No log 4.1231 268 0.6918 0.6284 0.6918 0.8317
No log 4.1538 270 0.7363 0.6275 0.7363 0.8581
No log 4.1846 272 0.7514 0.6160 0.7514 0.8668
No log 4.2154 274 0.7010 0.5844 0.7010 0.8372
No log 4.2462 276 0.6556 0.5688 0.6556 0.8097
No log 4.2769 278 0.6241 0.5359 0.6241 0.7900
No log 4.3077 280 0.5946 0.6720 0.5946 0.7711
No log 4.3385 282 0.6068 0.7071 0.6068 0.7790
No log 4.3692 284 0.6304 0.6729 0.6304 0.7940
No log 4.4 286 0.6433 0.6736 0.6433 0.8021
No log 4.4308 288 0.6215 0.6431 0.6215 0.7883
No log 4.4615 290 0.6016 0.6581 0.6016 0.7756
No log 4.4923 292 0.6036 0.6175 0.6036 0.7769
No log 4.5231 294 0.6076 0.6307 0.6076 0.7795
No log 4.5538 296 0.6247 0.5917 0.6247 0.7904
No log 4.5846 298 0.6316 0.6113 0.6316 0.7947
No log 4.6154 300 0.6152 0.5917 0.6152 0.7844
No log 4.6462 302 0.6026 0.6500 0.6026 0.7762
No log 4.6769 304 0.6116 0.6278 0.6116 0.7820
No log 4.7077 306 0.6382 0.6018 0.6382 0.7989
No log 4.7385 308 0.6496 0.6073 0.6496 0.8060
No log 4.7692 310 0.6671 0.5395 0.6671 0.8168
No log 4.8 312 0.6288 0.5929 0.6288 0.7930
No log 4.8308 314 0.5974 0.7103 0.5974 0.7729
No log 4.8615 316 0.6061 0.7129 0.6061 0.7785
No log 4.8923 318 0.6143 0.7034 0.6143 0.7838
No log 4.9231 320 0.6311 0.6766 0.6311 0.7944
No log 4.9538 322 0.6580 0.6293 0.6580 0.8112
No log 4.9846 324 0.7240 0.5819 0.7240 0.8509
No log 5.0154 326 0.7534 0.5819 0.7534 0.8680
No log 5.0462 328 0.7061 0.5473 0.7061 0.8403
No log 5.0769 330 0.6495 0.6616 0.6495 0.8059
No log 5.1077 332 0.6742 0.5865 0.6742 0.8211
No log 5.1385 334 0.6693 0.6070 0.6693 0.8181
No log 5.1692 336 0.6402 0.6553 0.6402 0.8001
No log 5.2 338 0.6430 0.6979 0.6430 0.8019
No log 5.2308 340 0.6684 0.6979 0.6684 0.8175
No log 5.2615 342 0.6638 0.6813 0.6638 0.8148
No log 5.2923 344 0.6104 0.6672 0.6104 0.7813
No log 5.3231 346 0.6188 0.6349 0.6188 0.7866
No log 5.3538 348 0.6155 0.6349 0.6155 0.7845
No log 5.3846 350 0.5978 0.6433 0.5978 0.7732
No log 5.4154 352 0.5925 0.6433 0.5925 0.7697
No log 5.4462 354 0.6101 0.6430 0.6101 0.7811
No log 5.4769 356 0.5849 0.6572 0.5849 0.7648
No log 5.5077 358 0.5638 0.6896 0.5638 0.7508
No log 5.5385 360 0.5748 0.6535 0.5748 0.7582
No log 5.5692 362 0.5695 0.7110 0.5695 0.7546
No log 5.6 364 0.5759 0.6680 0.5759 0.7589
No log 5.6308 366 0.6239 0.6455 0.6239 0.7899
No log 5.6615 368 0.6645 0.6337 0.6645 0.8152
No log 5.6923 370 0.6819 0.6337 0.6819 0.8258
No log 5.7231 372 0.6705 0.6269 0.6705 0.8188
No log 5.7538 374 0.6735 0.6421 0.6735 0.8206
No log 5.7846 376 0.6745 0.6237 0.6745 0.8213
No log 5.8154 378 0.6563 0.5866 0.6563 0.8101
No log 5.8462 380 0.6616 0.5891 0.6616 0.8134
No log 5.8769 382 0.6715 0.5640 0.6715 0.8194
No log 5.9077 384 0.6799 0.4819 0.6799 0.8246
No log 5.9385 386 0.6815 0.4707 0.6815 0.8255
No log 5.9692 388 0.6631 0.4659 0.6631 0.8143
No log 6.0 390 0.6495 0.5823 0.6495 0.8059
No log 6.0308 392 0.6387 0.6217 0.6387 0.7992
No log 6.0615 394 0.6029 0.6217 0.6029 0.7765
No log 6.0923 396 0.6026 0.6667 0.6026 0.7763
No log 6.1231 398 0.6421 0.5993 0.6421 0.8013
No log 6.1538 400 0.6324 0.6278 0.6324 0.7953
No log 6.1846 402 0.6211 0.6572 0.6211 0.7881
No log 6.2154 404 0.6326 0.6642 0.6326 0.7953
No log 6.2462 406 0.6432 0.6561 0.6432 0.8020
No log 6.2769 408 0.6378 0.5819 0.6378 0.7986
No log 6.3077 410 0.6694 0.5880 0.6694 0.8181
No log 6.3385 412 0.7459 0.4711 0.7459 0.8637
No log 6.3692 414 0.7433 0.3958 0.7433 0.8621
No log 6.4 416 0.7060 0.4888 0.7060 0.8403
No log 6.4308 418 0.6846 0.5156 0.6846 0.8274
No log 6.4615 420 0.6854 0.4576 0.6854 0.8279
No log 6.4923 422 0.6765 0.5635 0.6765 0.8225
No log 6.5231 424 0.7069 0.5855 0.7069 0.8407
No log 6.5538 426 0.8377 0.5316 0.8377 0.9153
No log 6.5846 428 0.9616 0.4805 0.9616 0.9806
No log 6.6154 430 0.9118 0.5105 0.9118 0.9549
No log 6.6462 432 0.7716 0.5896 0.7716 0.8784
No log 6.6769 434 0.6944 0.6012 0.6944 0.8333
No log 6.7077 436 0.6854 0.5010 0.6854 0.8279
No log 6.7385 438 0.7013 0.4511 0.7013 0.8374
No log 6.7692 440 0.7226 0.5663 0.7226 0.8501
No log 6.8 442 0.7232 0.5697 0.7232 0.8504
No log 6.8308 444 0.6811 0.5751 0.6811 0.8253
No log 6.8615 446 0.6319 0.5590 0.6319 0.7949
No log 6.8923 448 0.6135 0.6452 0.6135 0.7832
No log 6.9231 450 0.6145 0.6705 0.6145 0.7839
No log 6.9538 452 0.6343 0.6377 0.6343 0.7964
No log 6.9846 454 0.6483 0.6774 0.6483 0.8052
No log 7.0154 456 0.6909 0.6301 0.6909 0.8312
No log 7.0462 458 0.7706 0.5153 0.7706 0.8778
No log 7.0769 460 0.8462 0.5013 0.8462 0.9199
No log 7.1077 462 0.9011 0.5188 0.9011 0.9492
No log 7.1385 464 0.8788 0.5188 0.8788 0.9375
No log 7.1692 466 0.7876 0.5128 0.7876 0.8875
No log 7.2 468 0.7054 0.5921 0.7054 0.8399
No log 7.2308 470 0.6858 0.5618 0.6858 0.8281
No log 7.2615 472 0.6857 0.5993 0.6857 0.8281
No log 7.2923 474 0.7201 0.6064 0.7201 0.8486
No log 7.3231 476 0.7262 0.5736 0.7262 0.8522
No log 7.3538 478 0.7121 0.6151 0.7121 0.8439
No log 7.3846 480 0.6968 0.6247 0.6968 0.8347
No log 7.4154 482 0.6579 0.5923 0.6579 0.8111
No log 7.4462 484 0.6276 0.5986 0.6276 0.7922
No log 7.4769 486 0.5997 0.6272 0.5997 0.7744
No log 7.5077 488 0.5931 0.6439 0.5931 0.7702
No log 7.5385 490 0.6235 0.6821 0.6235 0.7896
No log 7.5692 492 0.6626 0.6932 0.6626 0.8140
No log 7.6 494 0.6255 0.6962 0.6255 0.7909
No log 7.6308 496 0.5857 0.6680 0.5857 0.7653
No log 7.6615 498 0.5911 0.6788 0.5911 0.7688
0.2495 7.6923 500 0.6027 0.6978 0.6027 0.7763
0.2495 7.7231 502 0.6051 0.6649 0.6051 0.7779
0.2495 7.7538 504 0.5944 0.6830 0.5944 0.7709
0.2495 7.7846 506 0.5827 0.6830 0.5827 0.7633
0.2495 7.8154 508 0.5848 0.6553 0.5848 0.7647
0.2495 7.8462 510 0.5961 0.6890 0.5961 0.7721

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (Safetensors, F32 tensors)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k13_task5_organization

Fine-tuned from aubmindlab/bert-base-arabertv02 (one of 4,019 fine-tuned descendants of that base model).