ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k9_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7147
  • Qwk (quadratic weighted kappa): 0.5210
  • MSE (mean squared error): 0.7147
  • RMSE (root mean squared error): 0.8454
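The card's actual evaluation code is not included, but the three reported metrics are standard. A minimal pure-Python sketch of how MSE, RMSE, and quadratic weighted kappa could be computed over integer scores (labels assumed to be 0..num_classes-1; the toy data below is illustrative, not from the model's eval set):

```python
import math
from collections import Counter

def mse(y_true, y_pred):
    """Mean squared error between true and predicted scores."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..num_classes-1."""
    n = len(y_true)
    observed = Counter(zip(y_true, y_pred))   # observed confusion counts
    hist_t = Counter(y_true)                  # marginal histogram of true labels
    hist_p = Counter(y_pred)                  # marginal histogram of predictions
    scale = (num_classes - 1) ** 2
    obs_disagreement = 0.0
    exp_disagreement = 0.0
    for i in range(num_classes):
        for j in range(num_classes):
            w = (i - j) ** 2 / scale          # quadratic penalty for distance
            obs_disagreement += w * observed.get((i, j), 0)
            exp_disagreement += w * hist_t.get(i, 0) * hist_p.get(j, 0) / n
    return 1.0 - obs_disagreement / exp_disagreement

y_true = [0, 1, 2, 2]
y_pred = [0, 2, 2, 1]
print(mse(y_true, y_pred))                          # 0.5
print(math.sqrt(mse(y_true, y_pred)))               # RMSE, ~0.7071
print(quadratic_weighted_kappa(y_true, y_pred, 3))  # ~0.6364
```

Note that the reported Loss and MSE are identical (0.7147), which is consistent with the model being trained as a regressor with an MSE objective.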

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
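The lr_scheduler_type: linear setting means the learning rate decays linearly from learning_rate toward zero over the training run (no warmup steps are listed, so zero warmup is assumed here). A minimal sketch of that schedule, mirroring the behavior of Transformers' linear scheduler:

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Linear warmup (if any), then linear decay to zero by the final step."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
    return base_lr * remaining

print(linear_lr(0, 1000))     # 2e-05 at the start
print(linear_lr(500, 1000))   # 1e-05 halfway through
print(linear_lr(1000, 1000))  # 0.0 at the end
```

Note that although num_epochs is 100, the results table below stops at epoch ~11.3 (step 510), which suggests training was halted early; the final row matches the evaluation results reported at the top of the card.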

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0444 2 3.8995 -0.0282 3.8995 1.9747
No log 0.0889 4 2.8514 -0.0172 2.8514 1.6886
No log 0.1333 6 1.3161 0.0380 1.3161 1.1472
No log 0.1778 8 1.0445 0.4080 1.0445 1.0220
No log 0.2222 10 1.0181 0.4402 1.0181 1.0090
No log 0.2667 12 1.1159 0.2293 1.1159 1.0564
No log 0.3111 14 1.1164 0.1671 1.1164 1.0566
No log 0.3556 16 1.1496 0.2316 1.1496 1.0722
No log 0.4 18 1.1524 0.1549 1.1524 1.0735
No log 0.4444 20 1.0570 0.2811 1.0570 1.0281
No log 0.4889 22 0.9728 0.3117 0.9728 0.9863
No log 0.5333 24 0.9243 0.3243 0.9243 0.9614
No log 0.5778 26 0.8912 0.3709 0.8912 0.9440
No log 0.6222 28 0.8538 0.2841 0.8538 0.9240
No log 0.6667 30 0.8232 0.4327 0.8232 0.9073
No log 0.7111 32 0.7534 0.4433 0.7534 0.8680
No log 0.7556 34 0.7353 0.4960 0.7353 0.8575
No log 0.8 36 0.7333 0.4929 0.7333 0.8563
No log 0.8444 38 0.7001 0.5073 0.7001 0.8367
No log 0.8889 40 0.6778 0.5329 0.6778 0.8233
No log 0.9333 42 0.7355 0.5257 0.7355 0.8576
No log 0.9778 44 0.7066 0.4883 0.7066 0.8406
No log 1.0222 46 0.6993 0.5894 0.6993 0.8362
No log 1.0667 48 0.7848 0.5372 0.7848 0.8859
No log 1.1111 50 0.9681 0.4548 0.9681 0.9839
No log 1.1556 52 0.9307 0.4226 0.9307 0.9648
No log 1.2 54 0.7061 0.5002 0.7061 0.8403
No log 1.2444 56 0.7740 0.5127 0.7740 0.8798
No log 1.2889 58 1.0233 0.3796 1.0233 1.0116
No log 1.3333 60 0.9702 0.4681 0.9702 0.9850
No log 1.3778 62 0.6610 0.5806 0.6610 0.8130
No log 1.4222 64 0.7130 0.5705 0.7130 0.8444
No log 1.4667 66 0.8564 0.5295 0.8564 0.9254
No log 1.5111 68 0.6937 0.6099 0.6937 0.8329
No log 1.5556 70 0.6093 0.5399 0.6093 0.7806
No log 1.6 72 0.7487 0.5368 0.7487 0.8653
No log 1.6444 74 0.8290 0.4280 0.8290 0.9105
No log 1.6889 76 0.7247 0.5598 0.7247 0.8513
No log 1.7333 78 0.6485 0.4955 0.6485 0.8053
No log 1.7778 80 0.6452 0.5701 0.6452 0.8032
No log 1.8222 82 0.6162 0.5887 0.6162 0.7850
No log 1.8667 84 0.6196 0.6147 0.6196 0.7871
No log 1.9111 86 0.6602 0.6348 0.6602 0.8125
No log 1.9556 88 0.6302 0.6119 0.6302 0.7939
No log 2.0 90 0.6290 0.5995 0.6290 0.7931
No log 2.0444 92 0.6164 0.6185 0.6164 0.7851
No log 2.0889 94 0.6231 0.6207 0.6231 0.7894
No log 2.1333 96 0.6055 0.6119 0.6055 0.7781
No log 2.1778 98 0.6626 0.6090 0.6626 0.8140
No log 2.2222 100 0.7086 0.6092 0.7086 0.8418
No log 2.2667 102 0.5801 0.5935 0.5801 0.7616
No log 2.3111 104 0.6300 0.6266 0.6300 0.7937
No log 2.3556 106 0.5834 0.6914 0.5834 0.7638
No log 2.4 108 0.5551 0.6374 0.5551 0.7451
No log 2.4444 110 0.5503 0.6473 0.5503 0.7418
No log 2.4889 112 0.5802 0.6685 0.5802 0.7617
No log 2.5333 114 0.6043 0.6189 0.6043 0.7774
No log 2.5778 116 0.6016 0.5987 0.6016 0.7756
No log 2.6222 118 0.6402 0.6404 0.6402 0.8001
No log 2.6667 120 0.7405 0.5686 0.7405 0.8605
No log 2.7111 122 0.7637 0.5695 0.7637 0.8739
No log 2.7556 124 0.6156 0.6218 0.6156 0.7846
No log 2.8 126 0.5837 0.6593 0.5837 0.7640
No log 2.8444 128 0.5944 0.6517 0.5944 0.7709
No log 2.8889 130 0.6006 0.6028 0.6006 0.7750
No log 2.9333 132 0.6018 0.6473 0.6018 0.7757
No log 2.9778 134 0.6303 0.6328 0.6303 0.7939
No log 3.0222 136 0.6173 0.6380 0.6173 0.7857
No log 3.0667 138 0.6190 0.6154 0.6190 0.7868
No log 3.1111 140 0.6616 0.6275 0.6616 0.8134
No log 3.1556 142 0.5902 0.6636 0.5902 0.7682
No log 3.2 144 0.6113 0.6948 0.6113 0.7818
No log 3.2444 146 0.7313 0.6278 0.7313 0.8551
No log 3.2889 148 0.6946 0.5810 0.6946 0.8334
No log 3.3333 150 0.6486 0.6219 0.6486 0.8053
No log 3.3778 152 0.6622 0.5434 0.6622 0.8137
No log 3.4222 154 0.6145 0.6589 0.6145 0.7839
No log 3.4667 156 0.6422 0.6197 0.6422 0.8014
No log 3.5111 158 0.7120 0.6064 0.7120 0.8438
No log 3.5556 160 0.7449 0.6045 0.7449 0.8631
No log 3.6 162 0.6672 0.5969 0.6672 0.8168
No log 3.6444 164 0.6598 0.5902 0.6598 0.8123
No log 3.6889 166 0.6667 0.5680 0.6667 0.8165
No log 3.7333 168 0.6746 0.5877 0.6746 0.8214
No log 3.7778 170 0.6593 0.6619 0.6593 0.8120
No log 3.8222 172 0.6647 0.6297 0.6647 0.8153
No log 3.8667 174 0.6869 0.5763 0.6869 0.8288
No log 3.9111 176 0.6962 0.5528 0.6962 0.8344
No log 3.9556 178 0.7138 0.5720 0.7138 0.8449
No log 4.0 180 0.7060 0.5720 0.7060 0.8403
No log 4.0444 182 0.7064 0.5528 0.7064 0.8405
No log 4.0889 184 0.7813 0.5756 0.7813 0.8839
No log 4.1333 186 0.7667 0.5602 0.7667 0.8756
No log 4.1778 188 0.6926 0.5463 0.6926 0.8322
No log 4.2222 190 0.7144 0.4884 0.7144 0.8452
No log 4.2667 192 0.7013 0.4991 0.7013 0.8374
No log 4.3111 194 0.6777 0.5582 0.6777 0.8232
No log 4.3556 196 0.8306 0.5705 0.8306 0.9114
No log 4.4 198 0.8290 0.5705 0.8290 0.9105
No log 4.4444 200 0.6743 0.6118 0.6743 0.8211
No log 4.4889 202 0.6326 0.6658 0.6326 0.7954
No log 4.5333 204 0.6281 0.6349 0.6281 0.7925
No log 4.5778 206 0.6105 0.6066 0.6105 0.7813
No log 4.6222 208 0.6743 0.6015 0.6743 0.8211
No log 4.6667 210 0.6872 0.5902 0.6872 0.8290
No log 4.7111 212 0.7022 0.6089 0.7022 0.8380
No log 4.7556 214 0.6340 0.6228 0.6340 0.7962
No log 4.8 216 0.6108 0.6990 0.6108 0.7816
No log 4.8444 218 0.6093 0.6751 0.6093 0.7806
No log 4.8889 220 0.6054 0.6845 0.6054 0.7781
No log 4.9333 222 0.5879 0.7050 0.5879 0.7668
No log 4.9778 224 0.5555 0.6988 0.5555 0.7453
No log 5.0222 226 0.5728 0.6584 0.5728 0.7568
No log 5.0667 228 0.5659 0.6857 0.5659 0.7522
No log 5.1111 230 0.6165 0.6766 0.6165 0.7852
No log 5.1556 232 0.7853 0.5480 0.7853 0.8862
No log 5.2 234 0.7129 0.5789 0.7129 0.8443
No log 5.2444 236 0.5960 0.6196 0.5960 0.7720
No log 5.2889 238 0.6054 0.6452 0.6054 0.7781
No log 5.3333 240 0.5949 0.6306 0.5949 0.7713
No log 5.3778 242 0.5870 0.6167 0.5870 0.7662
No log 5.4222 244 0.6050 0.6488 0.6050 0.7778
No log 5.4667 246 0.6120 0.6619 0.6120 0.7823
No log 5.5111 248 0.7122 0.5340 0.7122 0.8439
No log 5.5556 250 0.8202 0.5389 0.8202 0.9056
No log 5.6 252 0.7037 0.5156 0.7037 0.8388
No log 5.6444 254 0.6437 0.6470 0.6437 0.8023
No log 5.6889 256 0.6464 0.5939 0.6464 0.8040
No log 5.7333 258 0.6740 0.5419 0.6740 0.8210
No log 5.7778 260 0.6763 0.5511 0.6763 0.8224
No log 5.8222 262 0.6624 0.6147 0.6624 0.8139
No log 5.8667 264 0.6425 0.6297 0.6425 0.8016
No log 5.9111 266 0.6591 0.5975 0.6591 0.8118
No log 5.9556 268 0.6940 0.5500 0.6940 0.8331
No log 6.0 270 0.6952 0.5710 0.6952 0.8338
No log 6.0444 272 0.7104 0.6132 0.7104 0.8429
No log 6.0889 274 0.6827 0.6578 0.6827 0.8263
No log 6.1333 276 0.6250 0.6788 0.6250 0.7906
No log 6.1778 278 0.6242 0.6788 0.6242 0.7901
No log 6.2222 280 0.6300 0.6114 0.6300 0.7937
No log 6.2667 282 0.6507 0.6327 0.6507 0.8067
No log 6.3111 284 0.6614 0.6335 0.6614 0.8132
No log 6.3556 286 0.6650 0.5645 0.6650 0.8154
No log 6.4 288 0.6469 0.6076 0.6469 0.8043
No log 6.4444 290 0.6500 0.6297 0.6500 0.8062
No log 6.4889 292 0.6920 0.6357 0.6920 0.8319
No log 6.5333 294 0.6871 0.6215 0.6871 0.8289
No log 6.5778 296 0.6921 0.6225 0.6921 0.8319
No log 6.6222 298 0.6756 0.5830 0.6756 0.8219
No log 6.6667 300 0.6827 0.5637 0.6827 0.8263
No log 6.7111 302 0.7140 0.5927 0.7140 0.8450
No log 6.7556 304 0.6922 0.5937 0.6922 0.8320
No log 6.8 306 0.7161 0.5521 0.7161 0.8462
No log 6.8444 308 0.7017 0.5076 0.7017 0.8377
No log 6.8889 310 0.6720 0.5774 0.6720 0.8197
No log 6.9333 312 0.6651 0.5774 0.6651 0.8156
No log 6.9778 314 0.6709 0.5455 0.6709 0.8191
No log 7.0222 316 0.7048 0.5213 0.7048 0.8395
No log 7.0667 318 0.7211 0.5094 0.7211 0.8492
No log 7.1111 320 0.7048 0.5431 0.7048 0.8395
No log 7.1556 322 0.7187 0.5710 0.7187 0.8477
No log 7.2 324 0.6769 0.5557 0.6769 0.8227
No log 7.2444 326 0.6710 0.5660 0.6710 0.8192
No log 7.2889 328 0.6745 0.5660 0.6745 0.8213
No log 7.3333 330 0.6699 0.5785 0.6699 0.8185
No log 7.3778 332 0.6825 0.5785 0.6825 0.8261
No log 7.4222 334 0.7326 0.5586 0.7326 0.8559
No log 7.4667 336 0.7386 0.5387 0.7386 0.8594
No log 7.5111 338 0.7065 0.5731 0.7065 0.8405
No log 7.5556 340 0.6912 0.5364 0.6912 0.8314
No log 7.6 342 0.6993 0.5146 0.6993 0.8362
No log 7.6444 344 0.6827 0.5478 0.6828 0.8263
No log 7.6889 346 0.6642 0.6134 0.6642 0.8150
No log 7.7333 348 0.6419 0.5785 0.6419 0.8012
No log 7.7778 350 0.6861 0.5582 0.6861 0.8283
No log 7.8222 352 0.7102 0.5554 0.7102 0.8427
No log 7.8667 354 0.6469 0.5972 0.6469 0.8043
No log 7.9111 356 0.6343 0.6750 0.6343 0.7965
No log 7.9556 358 0.7084 0.6661 0.7084 0.8417
No log 8.0 360 0.7286 0.6476 0.7286 0.8536
No log 8.0444 362 0.6443 0.6337 0.6443 0.8027
No log 8.0889 364 0.6654 0.4996 0.6654 0.8157
No log 8.1333 366 0.7956 0.5636 0.7956 0.8920
No log 8.1778 368 0.7866 0.5572 0.7866 0.8869
No log 8.2222 370 0.6937 0.5785 0.6937 0.8329
No log 8.2667 372 0.7279 0.5217 0.7279 0.8531
No log 8.3111 374 0.7491 0.4916 0.7491 0.8655
No log 8.3556 376 0.6731 0.6032 0.6731 0.8204
No log 8.4 378 0.7113 0.6263 0.7113 0.8434
No log 8.4444 380 0.9403 0.4871 0.9403 0.9697
No log 8.4889 382 0.9770 0.4494 0.9770 0.9884
No log 8.5333 384 0.8312 0.5800 0.8312 0.9117
No log 8.5778 386 0.7184 0.5330 0.7184 0.8476
No log 8.6222 388 0.7117 0.5552 0.7117 0.8436
No log 8.6667 390 0.7256 0.5522 0.7256 0.8518
No log 8.7111 392 0.7023 0.5522 0.7023 0.8380
No log 8.7556 394 0.7117 0.5958 0.7117 0.8436
No log 8.8 396 0.6833 0.5983 0.6833 0.8266
No log 8.8444 398 0.6359 0.5877 0.6359 0.7974
No log 8.8889 400 0.6299 0.5887 0.6299 0.7936
No log 8.9333 402 0.6567 0.6380 0.6567 0.8104
No log 8.9778 404 0.6700 0.6380 0.6700 0.8186
No log 9.0222 406 0.6380 0.6066 0.6380 0.7987
No log 9.0667 408 0.6299 0.5990 0.6299 0.7937
No log 9.1111 410 0.6137 0.6374 0.6137 0.7834
No log 9.1556 412 0.6145 0.5879 0.6145 0.7839
No log 9.2 414 0.5939 0.6292 0.5939 0.7706
No log 9.2444 416 0.5897 0.6704 0.5897 0.7680
No log 9.2889 418 0.6067 0.6815 0.6067 0.7789
No log 9.3333 420 0.5805 0.7443 0.5805 0.7619
No log 9.3778 422 0.5567 0.6555 0.5567 0.7461
No log 9.4222 424 0.5807 0.6078 0.5807 0.7620
No log 9.4667 426 0.6069 0.6753 0.6069 0.7791
No log 9.5111 428 0.6665 0.5642 0.6665 0.8164
No log 9.5556 430 0.6618 0.6249 0.6618 0.8135
No log 9.6 432 0.6079 0.6094 0.6079 0.7797
No log 9.6444 434 0.6148 0.6442 0.6148 0.7841
No log 9.6889 436 0.6149 0.6114 0.6149 0.7841
No log 9.7333 438 0.6295 0.6066 0.6295 0.7934
No log 9.7778 440 0.6640 0.6443 0.6640 0.8149
No log 9.8222 442 0.6657 0.6340 0.6657 0.8159
No log 9.8667 444 0.5937 0.6430 0.5937 0.7705
No log 9.9111 446 0.5835 0.6488 0.5835 0.7638
No log 9.9556 448 0.5968 0.5982 0.5968 0.7725
No log 10.0 450 0.5868 0.6593 0.5868 0.7660
No log 10.0444 452 0.6191 0.5933 0.6191 0.7868
No log 10.0889 454 0.6536 0.5998 0.6536 0.8085
No log 10.1333 456 0.6105 0.6237 0.6105 0.7813
No log 10.1778 458 0.5917 0.6667 0.5917 0.7692
No log 10.2222 460 0.6024 0.6658 0.6024 0.7761
No log 10.2667 462 0.6276 0.6076 0.6276 0.7922
No log 10.3111 464 0.6570 0.5855 0.6570 0.8105
No log 10.3556 466 0.6588 0.5645 0.6588 0.8117
No log 10.4 468 0.6436 0.6007 0.6436 0.8023
No log 10.4444 470 0.6411 0.5866 0.6411 0.8007
No log 10.4889 472 0.6426 0.5522 0.6426 0.8016
No log 10.5333 474 0.6469 0.5304 0.6469 0.8043
No log 10.5778 476 0.7097 0.5788 0.7097 0.8424
No log 10.6222 478 0.7348 0.6099 0.7348 0.8572
No log 10.6667 480 0.6620 0.5688 0.6620 0.8136
No log 10.7111 482 0.6018 0.6249 0.6018 0.7758
No log 10.7556 484 0.6023 0.6517 0.6023 0.7761
No log 10.8 486 0.6104 0.6185 0.6104 0.7813
No log 10.8444 488 0.6186 0.5370 0.6186 0.7865
No log 10.8889 490 0.6499 0.5098 0.6499 0.8062
No log 10.9333 492 0.6402 0.5570 0.6402 0.8001
No log 10.9778 494 0.6165 0.6104 0.6165 0.7852
No log 11.0222 496 0.6119 0.6442 0.6119 0.7822
No log 11.0667 498 0.6368 0.6347 0.6368 0.7980
0.2707 11.1111 500 0.6878 0.5933 0.6878 0.8293
0.2707 11.1556 502 0.7113 0.5528 0.7113 0.8434
0.2707 11.2 504 0.7125 0.5098 0.7125 0.8441
0.2707 11.2444 506 0.7096 0.5098 0.7096 0.8424
0.2707 11.2889 508 0.7258 0.4974 0.7258 0.8519
0.2707 11.3333 510 0.7147 0.5210 0.7147 0.8454

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B parameters (F32, Safetensors)