ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k15_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5694
  • Qwk: 0.6433
  • Mse: 0.5694
  • Rmse: 0.7546
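The metrics above are internally consistent: Mse equals the (MSE) loss, Rmse is its square root, and Qwk is quadratic weighted kappa, the standard agreement metric for ordinal essay scores. The following pure-Python sketch (an illustration, not the evaluation code used for this card; it assumes integer rating labels) shows how both metrics are computed:

```python
import math

def rmse(y_true, y_pred):
    """Root mean squared error: sqrt of the MSE loss reported above."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def quadratic_weighted_kappa(y_true, y_pred):
    """Cohen's kappa with quadratic disagreement weights (the `Qwk` column)."""
    lo = min(min(y_true), min(y_pred))
    hi = max(max(y_true), max(y_pred))
    n = hi - lo + 1
    # Observed co-occurrence matrix of (true rating, predicted rating).
    O = [[0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        O[t - lo][p - lo] += 1
    num_items = len(y_true)
    hist_true = [sum(row) for row in O]
    hist_pred = [sum(O[i][j] for i in range(n)) for j in range(n)]
    numerator = 0.0
    denominator = 0.0
    for i in range(n):
        for j in range(n):
            w = ((i - j) ** 2) / ((n - 1) ** 2)  # quadratic penalty for disagreement
            expected = hist_true[i] * hist_pred[j] / num_items  # chance agreement
            numerator += w * O[i][j]
            denominator += w * expected
    return 1.0 - numerator / denominator
```

A Qwk of 0.6433 indicates substantial (though imperfect) agreement between predicted and gold organization scores; 1.0 is perfect agreement and 0.0 is chance level.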

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
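With lr_scheduler_type `linear` and no warmup listed, the learning rate decays linearly from 2e-05 to zero over the full training run. A minimal sketch of that schedule (a reimplementation for illustration, assuming zero warmup steps; the 7500-step total is hypothetical, from 100 epochs at the 75 optimizer steps per epoch implied by the log below):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear warmup followed by linear decay to zero, mirroring the
    behaviour of a `linear` LR schedule."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# Hypothetical totals for this run: 75 steps/epoch x 100 epochs.
total_steps = 7500
```

At step 510, where the log below ends, the schedule would still be near its peak (~93% of 2e-05), so the run appears to have stopped well before the 100 configured epochs.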

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0267 2 4.0374 -0.0092 4.0374 2.0093
No log 0.0533 4 2.1209 0.1187 2.1209 1.4563
No log 0.08 6 1.2386 0.1344 1.2386 1.1129
No log 0.1067 8 1.0345 0.1864 1.0345 1.0171
No log 0.1333 10 1.0541 0.2539 1.0541 1.0267
No log 0.16 12 1.1810 0.2294 1.1810 1.0867
No log 0.1867 14 0.9427 0.3671 0.9427 0.9709
No log 0.2133 16 0.7940 0.4533 0.7940 0.8910
No log 0.24 18 0.7850 0.3236 0.7850 0.8860
No log 0.2667 20 0.7653 0.4951 0.7653 0.8748
No log 0.2933 22 1.1747 0.3579 1.1747 1.0838
No log 0.32 24 1.3692 0.2092 1.3692 1.1701
No log 0.3467 26 0.9392 0.4729 0.9392 0.9691
No log 0.3733 28 0.7601 0.5888 0.7601 0.8718
No log 0.4 30 0.9429 0.4371 0.9429 0.9710
No log 0.4267 32 0.7656 0.5527 0.7656 0.8750
No log 0.4533 34 0.7889 0.5565 0.7889 0.8882
No log 0.48 36 0.9520 0.4962 0.9520 0.9757
No log 0.5067 38 0.7873 0.5738 0.7873 0.8873
No log 0.5333 40 0.6928 0.6111 0.6928 0.8324
No log 0.56 42 0.7090 0.6133 0.7090 0.8420
No log 0.5867 44 0.7182 0.6151 0.7182 0.8474
No log 0.6133 46 0.7353 0.5667 0.7353 0.8575
No log 0.64 48 0.7438 0.5667 0.7438 0.8625
No log 0.6667 50 0.7254 0.6148 0.7254 0.8517
No log 0.6933 52 0.7348 0.6224 0.7348 0.8572
No log 0.72 54 0.7545 0.5845 0.7545 0.8686
No log 0.7467 56 1.1725 0.4528 1.1725 1.0828
No log 0.7733 58 1.4629 0.3287 1.4629 1.2095
No log 0.8 60 1.1603 0.4553 1.1603 1.0772
No log 0.8267 62 0.7179 0.5463 0.7179 0.8473
No log 0.8533 64 0.7728 0.5327 0.7728 0.8791
No log 0.88 66 0.9174 0.5713 0.9174 0.9578
No log 0.9067 68 0.8363 0.6163 0.8363 0.9145
No log 0.9333 70 0.6865 0.5929 0.6865 0.8285
No log 0.96 72 0.7043 0.5300 0.7043 0.8392
No log 0.9867 74 0.7781 0.5837 0.7781 0.8821
No log 1.0133 76 0.7335 0.6517 0.7335 0.8564
No log 1.04 78 0.7016 0.6142 0.7016 0.8376
No log 1.0667 80 0.6776 0.6169 0.6776 0.8232
No log 1.0933 82 0.6514 0.6350 0.6514 0.8071
No log 1.12 84 0.6706 0.6089 0.6706 0.8189
No log 1.1467 86 0.6972 0.5212 0.6972 0.8350
No log 1.1733 88 0.7245 0.5223 0.7245 0.8512
No log 1.2 90 0.6361 0.6729 0.6361 0.7976
No log 1.2267 92 0.7177 0.6180 0.7177 0.8472
No log 1.2533 94 0.7768 0.6041 0.7768 0.8814
No log 1.28 96 0.7613 0.5919 0.7613 0.8725
No log 1.3067 98 0.7341 0.5854 0.7341 0.8568
No log 1.3333 100 0.7022 0.6528 0.7022 0.8380
No log 1.3600 102 0.6655 0.6341 0.6655 0.8158
No log 1.3867 104 0.6595 0.6124 0.6595 0.8121
No log 1.4133 106 0.6639 0.6007 0.6639 0.8148
No log 1.44 108 0.6504 0.6262 0.6504 0.8064
No log 1.4667 110 0.6405 0.6407 0.6405 0.8003
No log 1.4933 112 0.6324 0.6233 0.6324 0.7952
No log 1.52 114 0.6572 0.6194 0.6572 0.8107
No log 1.5467 116 0.6049 0.5773 0.6049 0.7778
No log 1.5733 118 0.5896 0.7013 0.5896 0.7679
No log 1.6 120 0.5986 0.6055 0.5986 0.7737
No log 1.6267 122 0.7431 0.6476 0.7431 0.8620
No log 1.6533 124 0.8097 0.5887 0.8097 0.8998
No log 1.6800 126 0.6712 0.6524 0.6712 0.8193
No log 1.7067 128 0.6087 0.6659 0.6087 0.7802
No log 1.7333 130 0.6000 0.6634 0.6000 0.7746
No log 1.76 132 0.5858 0.6690 0.5858 0.7654
No log 1.7867 134 0.5954 0.5988 0.5954 0.7716
No log 1.8133 136 0.5743 0.6097 0.5743 0.7578
No log 1.8400 138 0.5631 0.5971 0.5631 0.7504
No log 1.8667 140 0.5497 0.6439 0.5497 0.7414
No log 1.8933 142 0.5776 0.6983 0.5776 0.7600
No log 1.92 144 0.6111 0.6803 0.6111 0.7817
No log 1.9467 146 0.6454 0.6838 0.6454 0.8034
No log 1.9733 148 0.6496 0.6975 0.6496 0.8060
No log 2.0 150 0.6080 0.6560 0.6080 0.7798
No log 2.0267 152 0.6120 0.6459 0.6120 0.7823
No log 2.0533 154 0.6616 0.6322 0.6616 0.8134
No log 2.08 156 0.6180 0.5937 0.6180 0.7861
No log 2.1067 158 0.6153 0.5426 0.6153 0.7844
No log 2.1333 160 0.7006 0.5118 0.7006 0.8370
No log 2.16 162 0.7793 0.5283 0.7793 0.8828
No log 2.1867 164 0.6639 0.6050 0.6639 0.8148
No log 2.2133 166 0.5512 0.6712 0.5512 0.7424
No log 2.24 168 0.5842 0.7013 0.5842 0.7643
No log 2.2667 170 0.5785 0.6921 0.5785 0.7606
No log 2.2933 172 0.5800 0.6976 0.5800 0.7616
No log 2.32 174 0.6261 0.5966 0.6261 0.7913
No log 2.3467 176 0.6402 0.5340 0.6402 0.8001
No log 2.3733 178 0.6239 0.5594 0.6239 0.7899
No log 2.4 180 0.7019 0.6249 0.7019 0.8378
No log 2.4267 182 0.7822 0.6019 0.7822 0.8844
No log 2.4533 184 0.6879 0.6537 0.6879 0.8294
No log 2.48 186 0.6430 0.6104 0.6430 0.8019
No log 2.5067 188 0.6629 0.5560 0.6629 0.8142
No log 2.5333 190 0.7976 0.5791 0.7976 0.8931
No log 2.56 192 0.8677 0.5391 0.8677 0.9315
No log 2.5867 194 0.7173 0.5247 0.7173 0.8469
No log 2.6133 196 0.6512 0.6560 0.6512 0.8069
No log 2.64 198 0.8159 0.6117 0.8159 0.9033
No log 2.6667 200 0.8226 0.6117 0.8226 0.9070
No log 2.6933 202 0.6830 0.6552 0.6830 0.8265
No log 2.7200 204 0.6590 0.6543 0.6590 0.8118
No log 2.7467 206 0.7187 0.5443 0.7187 0.8478
No log 2.7733 208 0.6798 0.6215 0.6798 0.8245
No log 2.8 210 0.6261 0.6266 0.6261 0.7913
No log 2.8267 212 0.6666 0.6254 0.6666 0.8165
No log 2.8533 214 0.7377 0.6199 0.7377 0.8589
No log 2.88 216 0.6857 0.6254 0.6857 0.8281
No log 2.9067 218 0.6248 0.6978 0.6248 0.7904
No log 2.9333 220 0.7284 0.5759 0.7284 0.8535
No log 2.96 222 0.7910 0.5536 0.7910 0.8894
No log 2.9867 224 0.7014 0.5860 0.7014 0.8375
No log 3.0133 226 0.6167 0.6185 0.6167 0.7853
No log 3.04 228 0.6550 0.6188 0.6550 0.8093
No log 3.0667 230 0.7164 0.6089 0.7164 0.8464
No log 3.0933 232 0.7046 0.6552 0.7046 0.8394
No log 3.12 234 0.6338 0.6568 0.6338 0.7961
No log 3.1467 236 0.6328 0.6766 0.6328 0.7955
No log 3.1733 238 0.7813 0.5627 0.7813 0.8839
No log 3.2 240 0.8294 0.5273 0.8294 0.9107
No log 3.2267 242 0.7281 0.5928 0.7281 0.8533
No log 3.2533 244 0.6030 0.6995 0.6030 0.7765
No log 3.2800 246 0.6626 0.6427 0.6626 0.8140
No log 3.3067 248 0.7191 0.6198 0.7191 0.8480
No log 3.3333 250 0.7482 0.6073 0.7482 0.8650
No log 3.36 252 0.6913 0.6586 0.6913 0.8314
No log 3.3867 254 0.6915 0.6427 0.6915 0.8316
No log 3.4133 256 0.6802 0.6495 0.6802 0.8247
No log 3.44 258 0.6778 0.5981 0.6778 0.8233
No log 3.4667 260 0.7001 0.6284 0.7001 0.8367
No log 3.4933 262 0.6683 0.5695 0.6683 0.8175
No log 3.52 264 0.6589 0.6890 0.6589 0.8117
No log 3.5467 266 0.6474 0.6501 0.6474 0.8046
No log 3.5733 268 0.6448 0.6664 0.6448 0.8030
No log 3.6 270 0.6161 0.6983 0.6161 0.7849
No log 3.6267 272 0.6646 0.6617 0.6646 0.8152
No log 3.6533 274 0.6831 0.5812 0.6831 0.8265
No log 3.68 276 0.6914 0.6411 0.6914 0.8315
No log 3.7067 278 0.6599 0.6403 0.6599 0.8124
No log 3.7333 280 0.6023 0.6578 0.6023 0.7761
No log 3.76 282 0.5970 0.6499 0.5970 0.7727
No log 3.7867 284 0.6090 0.6756 0.6090 0.7804
No log 3.8133 286 0.6210 0.6112 0.6210 0.7880
No log 3.84 288 0.6304 0.7083 0.6304 0.7940
No log 3.8667 290 0.6632 0.7118 0.6632 0.8144
No log 3.8933 292 0.6716 0.6981 0.6716 0.8195
No log 3.92 294 0.6800 0.6586 0.6800 0.8246
No log 3.9467 296 0.6801 0.6138 0.6801 0.8247
No log 3.9733 298 0.6271 0.6217 0.6271 0.7919
No log 4.0 300 0.6304 0.6164 0.6304 0.7939
No log 4.0267 302 0.6389 0.6087 0.6389 0.7993
No log 4.0533 304 0.6310 0.6659 0.6310 0.7944
No log 4.08 306 0.6327 0.6659 0.6327 0.7954
No log 4.1067 308 0.6557 0.6195 0.6557 0.8098
No log 4.1333 310 0.7002 0.5412 0.7002 0.8368
No log 4.16 312 0.7000 0.5513 0.7000 0.8366
No log 4.1867 314 0.7297 0.5831 0.7297 0.8542
No log 4.2133 316 0.7073 0.5879 0.7073 0.8410
No log 4.24 318 0.6834 0.5638 0.6834 0.8267
No log 4.2667 320 0.6843 0.6049 0.6843 0.8272
No log 4.2933 322 0.6890 0.6365 0.6890 0.8301
No log 4.32 324 0.7102 0.6075 0.7102 0.8428
No log 4.3467 326 0.7034 0.6085 0.7034 0.8387
No log 4.3733 328 0.6910 0.5931 0.6910 0.8313
No log 4.4 330 0.7034 0.5513 0.7034 0.8387
No log 4.4267 332 0.6814 0.5081 0.6814 0.8255
No log 4.4533 334 0.6657 0.5882 0.6657 0.8159
No log 4.48 336 0.6422 0.6196 0.6422 0.8014
No log 4.5067 338 0.6310 0.6788 0.6310 0.7944
No log 4.5333 340 0.6340 0.6468 0.6340 0.7962
No log 4.5600 342 0.6319 0.6014 0.6319 0.7949
No log 4.5867 344 0.6499 0.5936 0.6499 0.8062
No log 4.6133 346 0.6276 0.6185 0.6276 0.7922
No log 4.64 348 0.6212 0.6681 0.6212 0.7881
No log 4.6667 350 0.6375 0.6359 0.6375 0.7985
No log 4.6933 352 0.6548 0.6198 0.6548 0.8092
No log 4.72 354 0.6757 0.6123 0.6757 0.8220
No log 4.7467 356 0.6995 0.5773 0.6995 0.8364
No log 4.7733 358 0.7412 0.6023 0.7412 0.8609
No log 4.8 360 0.7458 0.5845 0.7458 0.8636
No log 4.8267 362 0.7392 0.5264 0.7392 0.8598
No log 4.8533 364 0.7243 0.5580 0.7243 0.8511
No log 4.88 366 0.7099 0.6239 0.7099 0.8426
No log 4.9067 368 0.7318 0.5933 0.7318 0.8554
No log 4.9333 370 0.7638 0.6091 0.7638 0.8740
No log 4.96 372 0.7308 0.5909 0.7308 0.8549
No log 4.9867 374 0.6676 0.6262 0.6676 0.8170
No log 5.0133 376 0.6623 0.6028 0.6623 0.8138
No log 5.04 378 0.6659 0.6028 0.6659 0.8160
No log 5.0667 380 0.6862 0.5496 0.6862 0.8284
No log 5.0933 382 0.7121 0.5311 0.7121 0.8439
No log 5.12 384 0.7092 0.5679 0.7092 0.8421
No log 5.1467 386 0.7170 0.5753 0.7170 0.8467
No log 5.1733 388 0.7233 0.6008 0.7233 0.8505
No log 5.2 390 0.7106 0.5369 0.7106 0.8430
No log 5.2267 392 0.6904 0.5372 0.6904 0.8309
No log 5.2533 394 0.6720 0.5693 0.6720 0.8198
No log 5.28 396 0.6681 0.5820 0.6681 0.8174
No log 5.3067 398 0.6688 0.5943 0.6688 0.8178
No log 5.3333 400 0.6901 0.5996 0.6901 0.8307
No log 5.36 402 0.6723 0.6113 0.6723 0.8200
No log 5.3867 404 0.6444 0.6239 0.6444 0.8028
No log 5.4133 406 0.6344 0.6114 0.6344 0.7965
No log 5.44 408 0.6343 0.6076 0.6343 0.7964
No log 5.4667 410 0.6328 0.6144 0.6328 0.7955
No log 5.4933 412 0.6536 0.5633 0.6536 0.8085
No log 5.52 414 0.6684 0.6246 0.6684 0.8176
No log 5.5467 416 0.6662 0.6006 0.6662 0.8162
No log 5.5733 418 0.6987 0.6628 0.6987 0.8359
No log 5.6 420 0.6921 0.6301 0.6921 0.8319
No log 5.6267 422 0.6849 0.5287 0.6849 0.8276
No log 5.6533 424 0.6824 0.4826 0.6824 0.8261
No log 5.68 426 0.6676 0.5536 0.6676 0.8171
No log 5.7067 428 0.6756 0.6133 0.6756 0.8219
No log 5.7333 430 0.7010 0.6275 0.7010 0.8373
No log 5.76 432 0.7498 0.6380 0.7498 0.8659
No log 5.7867 434 0.7328 0.6366 0.7328 0.8561
No log 5.8133 436 0.7075 0.6263 0.7075 0.8411
No log 5.84 438 0.6774 0.5990 0.6774 0.8231
No log 5.8667 440 0.6649 0.6143 0.6649 0.8154
No log 5.8933 442 0.6538 0.6154 0.6538 0.8086
No log 5.92 444 0.6546 0.6415 0.6546 0.8090
No log 5.9467 446 0.6458 0.6415 0.6458 0.8036
No log 5.9733 448 0.6219 0.6272 0.6219 0.7886
No log 6.0 450 0.6120 0.5939 0.6120 0.7823
No log 6.0267 452 0.6241 0.6239 0.6241 0.7900
No log 6.0533 454 0.6228 0.6239 0.6228 0.7892
No log 6.08 456 0.6393 0.5192 0.6393 0.7995
No log 6.1067 458 0.6594 0.5093 0.6594 0.8120
No log 6.1333 460 0.6536 0.4963 0.6536 0.8084
No log 6.16 462 0.6243 0.6239 0.6243 0.7901
No log 6.1867 464 0.6275 0.6123 0.6275 0.7921
No log 6.2133 466 0.6275 0.5415 0.6275 0.7922
No log 6.24 468 0.6637 0.4724 0.6637 0.8147
No log 6.2667 470 0.6792 0.4739 0.6792 0.8241
No log 6.2933 472 0.6312 0.4826 0.6312 0.7945
No log 6.32 474 0.6100 0.6123 0.6100 0.7810
No log 6.3467 476 0.6074 0.6311 0.6074 0.7793
No log 6.3733 478 0.6024 0.5884 0.6024 0.7762
No log 6.4 480 0.6061 0.5884 0.6061 0.7785
No log 6.4267 482 0.5885 0.6114 0.5885 0.7671
No log 6.4533 484 0.5596 0.6537 0.5596 0.7481
No log 6.48 486 0.5632 0.6272 0.5632 0.7505
No log 6.5067 488 0.5862 0.5644 0.5862 0.7657
No log 6.5333 490 0.6082 0.4944 0.6082 0.7799
No log 6.5600 492 0.5921 0.5329 0.5921 0.7695
No log 6.5867 494 0.5539 0.5868 0.5539 0.7443
No log 6.6133 496 0.5356 0.6712 0.5356 0.7319
No log 6.64 498 0.5376 0.6838 0.5376 0.7332
0.2617 6.6667 500 0.5460 0.6491 0.5460 0.7389
0.2617 6.6933 502 0.5592 0.6461 0.5592 0.7478
0.2617 6.72 504 0.5486 0.6269 0.5486 0.7407
0.2617 6.7467 506 0.5469 0.6144 0.5469 0.7395
0.2617 6.7733 508 0.5594 0.6433 0.5594 0.7480
0.2617 6.8 510 0.5694 0.6433 0.5694 0.7546
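A few properties of the run can be read off the log above (inferences from the table, not stated elsewhere on the card): evaluation ran every 2 optimizer steps, training loss was logged every 500 steps (hence "No log" before step 500), and the Epoch column implies 75 steps per epoch, which at train_batch_size 8 suggests roughly 600 training examples.

```python
# Consistency check on the log: step 300 corresponds to epoch 4.0.
steps_per_epoch = 300 / 4.0
train_batch_size = 8
approx_train_examples = steps_per_epoch * train_batch_size  # rough, ignores a partial last batch
print(steps_per_epoch, approx_train_examples)  # 75.0 600.0
```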

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params · Tensor type: F32 · Format: Safetensors
