ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k10_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5887
  • Qwk: 0.5316
  • Mse: 0.5887
  • Rmse: 0.7673
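
The evaluation metrics above are quadratic weighted kappa (Qwk), mean squared error (Mse), and its square root (Rmse). As a reference for how they relate, here is a minimal, dependency-free sketch of all three; the function names are illustrative, not the ones used in the training script (which likely relied on a library such as scikit-learn):

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, min_rating=None, max_rating=None):
    """Cohen's kappa with quadratic weights (the Qwk metric above)."""
    if min_rating is None:
        min_rating = min(min(y_true), min(y_pred))
    if max_rating is None:
        max_rating = max(max(y_true), max(y_pred))
    n = max_rating - min_rating + 1
    # Observed rating co-occurrence counts.
    obs = [[0.0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        obs[t - min_rating][p - min_rating] += 1
    # Expected counts from the marginal rating histograms.
    hist_t = Counter(t - min_rating for t in y_true)
    hist_p = Counter(p - min_rating for p in y_pred)
    total = len(y_true)
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            w = ((i - j) ** 2) / ((n - 1) ** 2) if n > 1 else 0.0
            num += w * obs[i][j]
            den += w * hist_t[i] * hist_p[j] / total
    return 1.0 - num / den if den else 1.0

def mse(y_true, y_pred):
    """Mean squared error of predicted vs. true scores."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean squared error; the Rmse above is sqrt(Mse)."""
    return math.sqrt(mse(y_true, y_pred))
```

Perfect agreement yields a kappa of 1.0; disagreements are penalized by the squared distance between the rating levels, which is why Qwk is a common choice for ordinal essay-scoring tasks like this one.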

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
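
With lr_scheduler_type: linear, the learning rate decays linearly from 2e-05 toward 0 over the course of training. The actual run used the scheduler built into Transformers' Trainer; the sketch below (function name linear_lr is illustrative, and warmup defaults to 0 since no warmup is listed above) shows the schedule it implements:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Learning rate at a given optimizer step under a linear schedule.

    Ramps up linearly over warmup_steps (0 here, since none is listed),
    then decays linearly from base_lr to 0 at total_steps.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

For example, halfway through training the rate is half the initial value, and it reaches 0 at the final step.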

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0385 2 4.0537 0.0130 4.0537 2.0134
No log 0.0769 4 2.3877 -0.0111 2.3877 1.5452
No log 0.1154 6 1.2885 0.1460 1.2885 1.1351
No log 0.1538 8 1.3346 0.0135 1.3346 1.1553
No log 0.1923 10 2.0678 0.0139 2.0678 1.4380
No log 0.2308 12 1.8474 0.0522 1.8474 1.3592
No log 0.2692 14 1.2733 0.0053 1.2733 1.1284
No log 0.3077 16 1.0323 0.2265 1.0323 1.0160
No log 0.3462 18 1.0194 0.2944 1.0194 1.0096
No log 0.3846 20 1.1002 0.2125 1.1002 1.0489
No log 0.4231 22 1.2291 0.0232 1.2291 1.1087
No log 0.4615 24 1.2703 0.0232 1.2703 1.1271
No log 0.5 26 1.2238 0.0232 1.2238 1.1063
No log 0.5385 28 1.0984 0.1233 1.0984 1.0480
No log 0.5769 30 1.0005 0.2490 1.0005 1.0002
No log 0.6154 32 0.9920 0.2591 0.9920 0.9960
No log 0.6538 34 1.0123 0.2268 1.0123 1.0061
No log 0.6923 36 1.2443 0.0496 1.2443 1.1155
No log 0.7308 38 1.3327 0.0380 1.3327 1.1544
No log 0.7692 40 1.1492 0.1379 1.1492 1.0720
No log 0.8077 42 0.9482 0.3059 0.9482 0.9738
No log 0.8462 44 0.9194 0.3393 0.9194 0.9589
No log 0.8846 46 0.9404 0.3298 0.9404 0.9698
No log 0.9231 48 0.8851 0.4163 0.8851 0.9408
No log 0.9615 50 0.9052 0.4461 0.9052 0.9514
No log 1.0 52 0.9106 0.4489 0.9106 0.9543
No log 1.0385 54 0.9045 0.4484 0.9045 0.9511
No log 1.0769 56 1.4619 0.2994 1.4619 1.2091
No log 1.1154 58 1.3335 0.3482 1.3335 1.1548
No log 1.1538 60 0.7887 0.4865 0.7887 0.8881
No log 1.1923 62 0.8413 0.4087 0.8413 0.9172
No log 1.2308 64 1.0257 0.1952 1.0257 1.0128
No log 1.2692 66 0.8984 0.2685 0.8984 0.9479
No log 1.3077 68 0.7454 0.5063 0.7454 0.8634
No log 1.3462 70 0.7860 0.5927 0.7860 0.8866
No log 1.3846 72 0.8648 0.5207 0.8648 0.9299
No log 1.4231 74 0.7958 0.5429 0.7958 0.8921
No log 1.4615 76 0.7871 0.5787 0.7871 0.8872
No log 1.5 78 0.7651 0.5317 0.7651 0.8747
No log 1.5385 80 0.7527 0.6053 0.7527 0.8676
No log 1.5769 82 0.7335 0.5912 0.7335 0.8564
No log 1.6154 84 0.7813 0.5725 0.7813 0.8839
No log 1.6538 86 0.6774 0.5810 0.6774 0.8231
No log 1.6923 88 0.8833 0.5154 0.8833 0.9398
No log 1.7308 90 0.7577 0.5215 0.7577 0.8705
No log 1.7692 92 0.6969 0.5614 0.6969 0.8348
No log 1.8077 94 1.0190 0.4482 1.0190 1.0094
No log 1.8462 96 0.9333 0.4970 0.9333 0.9661
No log 1.8846 98 0.6752 0.5232 0.6752 0.8217
No log 1.9231 100 0.8366 0.4939 0.8366 0.9147
No log 1.9615 102 0.8665 0.5382 0.8665 0.9309
No log 2.0 104 0.6672 0.5841 0.6672 0.8168
No log 2.0385 106 0.6381 0.6259 0.6381 0.7988
No log 2.0769 108 0.7260 0.5532 0.7260 0.8521
No log 2.1154 110 0.6004 0.6335 0.6004 0.7749
No log 2.1538 112 0.6964 0.5553 0.6964 0.8345
No log 2.1923 114 0.6680 0.5337 0.6680 0.8173
No log 2.2308 116 0.5889 0.6469 0.5889 0.7674
No log 2.2692 118 0.7557 0.5219 0.7557 0.8693
No log 2.3077 120 0.7716 0.5778 0.7716 0.8784
No log 2.3462 122 0.7679 0.5393 0.7679 0.8763
No log 2.3846 124 0.7426 0.5307 0.7426 0.8618
No log 2.4231 126 0.7150 0.4839 0.7150 0.8456
No log 2.4615 128 0.6862 0.4995 0.6862 0.8283
No log 2.5 130 0.6636 0.5659 0.6636 0.8146
No log 2.5385 132 0.8727 0.4381 0.8727 0.9342
No log 2.5769 134 1.1502 0.4283 1.1502 1.0725
No log 2.6154 136 0.9494 0.4820 0.9494 0.9744
No log 2.6538 138 0.6347 0.6508 0.6347 0.7967
No log 2.6923 140 0.7722 0.5041 0.7722 0.8788
No log 2.7308 142 0.9012 0.5307 0.9012 0.9493
No log 2.7692 144 0.7676 0.4804 0.7676 0.8761
No log 2.8077 146 0.6486 0.6498 0.6486 0.8054
No log 2.8462 148 0.7479 0.5128 0.7479 0.8648
No log 2.8846 150 0.8564 0.3897 0.8564 0.9254
No log 2.9231 152 0.7358 0.5476 0.7358 0.8578
No log 2.9615 154 0.6169 0.5419 0.6169 0.7854
No log 3.0 156 0.6309 0.6121 0.6309 0.7943
No log 3.0385 158 0.5781 0.6916 0.5781 0.7604
No log 3.0769 160 0.6304 0.5471 0.6304 0.7940
No log 3.1154 162 0.7321 0.5051 0.7321 0.8556
No log 3.1538 164 0.5832 0.6906 0.5832 0.7637
No log 3.1923 166 0.5250 0.6970 0.5250 0.7246
No log 3.2308 168 0.5455 0.6548 0.5455 0.7386
No log 3.2692 170 0.5467 0.6639 0.5467 0.7394
No log 3.3077 172 0.5471 0.6537 0.5471 0.7397
No log 3.3462 174 0.5558 0.6639 0.5558 0.7455
No log 3.3846 176 0.5775 0.6950 0.5775 0.7599
No log 3.4231 178 0.6014 0.7180 0.6014 0.7755
No log 3.4615 180 0.5414 0.6907 0.5414 0.7358
No log 3.5 182 0.5203 0.6996 0.5203 0.7213
No log 3.5385 184 0.5719 0.7212 0.5719 0.7562
No log 3.5769 186 0.6019 0.7329 0.6019 0.7758
No log 3.6154 188 0.5619 0.7166 0.5619 0.7496
No log 3.6538 190 0.5326 0.7061 0.5326 0.7298
No log 3.6923 192 0.5419 0.6705 0.5419 0.7362
No log 3.7308 194 0.6288 0.6173 0.6288 0.7930
No log 3.7692 196 0.6664 0.6498 0.6664 0.8163
No log 3.8077 198 0.7048 0.6706 0.7048 0.8395
No log 3.8462 200 0.7893 0.6224 0.7893 0.8884
No log 3.8846 202 0.6114 0.7326 0.6114 0.7819
No log 3.9231 204 0.6148 0.7326 0.6148 0.7841
No log 3.9615 206 0.6661 0.7700 0.6661 0.8162
No log 4.0 208 0.8098 0.6029 0.8098 0.8999
No log 4.0385 210 0.7736 0.5888 0.7736 0.8796
No log 4.0769 212 0.6838 0.5699 0.6838 0.8269
No log 4.1154 214 0.7127 0.5699 0.7127 0.8442
No log 4.1538 216 0.6857 0.6303 0.6857 0.8281
No log 4.1923 218 0.6523 0.7356 0.6523 0.8076
No log 4.2308 220 0.5652 0.7131 0.5652 0.7518
No log 4.2692 222 0.5450 0.6518 0.5450 0.7382
No log 4.3077 224 0.5553 0.7124 0.5553 0.7452
No log 4.3462 226 0.6772 0.6693 0.6772 0.8229
No log 4.3846 228 0.6211 0.6833 0.6211 0.7881
No log 4.4231 230 0.5374 0.6896 0.5374 0.7331
No log 4.4615 232 0.5928 0.6347 0.5928 0.7699
No log 4.5 234 0.6024 0.5986 0.6024 0.7762
No log 4.5385 236 0.5751 0.6345 0.5751 0.7584
No log 4.5769 238 0.5541 0.6616 0.5541 0.7444
No log 4.6154 240 0.6040 0.6236 0.6040 0.7772
No log 4.6538 242 0.5956 0.6332 0.5956 0.7718
No log 4.6923 244 0.5568 0.6778 0.5568 0.7462
No log 4.7308 246 0.5587 0.6452 0.5587 0.7475
No log 4.7692 248 0.5750 0.6119 0.5750 0.7583
No log 4.8077 250 0.5934 0.6215 0.5934 0.7703
No log 4.8462 252 0.6088 0.6386 0.6088 0.7803
No log 4.8846 254 0.5766 0.6891 0.5766 0.7593
No log 4.9231 256 0.6428 0.7131 0.6428 0.8018
No log 4.9615 258 0.6811 0.6889 0.6811 0.8253
No log 5.0 260 0.6146 0.7275 0.6146 0.7839
No log 5.0385 262 0.5062 0.6598 0.5062 0.7115
No log 5.0769 264 0.5513 0.6821 0.5513 0.7425
No log 5.1154 266 0.5640 0.6914 0.5640 0.7510
No log 5.1538 268 0.5107 0.6720 0.5107 0.7146
No log 5.1923 270 0.5915 0.6906 0.5915 0.7691
No log 5.2308 272 0.5994 0.7068 0.5994 0.7742
No log 5.2692 274 0.5356 0.6636 0.5356 0.7318
No log 5.3077 276 0.5304 0.6680 0.5304 0.7283
No log 5.3462 278 0.5216 0.6680 0.5216 0.7222
No log 5.3846 280 0.5196 0.6729 0.5196 0.7208
No log 5.4231 282 0.5818 0.6843 0.5818 0.7628
No log 5.4615 284 0.6958 0.6391 0.6958 0.8342
No log 5.5 286 0.6454 0.6988 0.6454 0.8034
No log 5.5385 288 0.5320 0.6555 0.5320 0.7294
No log 5.5769 290 0.5361 0.6641 0.5361 0.7322
No log 5.6154 292 0.5694 0.6269 0.5694 0.7546
No log 5.6538 294 0.5543 0.6415 0.5543 0.7445
No log 5.6923 296 0.5647 0.6185 0.5647 0.7515
No log 5.7308 298 0.6091 0.6275 0.6091 0.7804
No log 5.7692 300 0.5916 0.6803 0.5916 0.7691
No log 5.8077 302 0.5420 0.6764 0.5420 0.7362
No log 5.8462 304 0.5618 0.6263 0.5618 0.7495
No log 5.8846 306 0.6068 0.6479 0.6068 0.7790
No log 5.9231 308 0.5680 0.6188 0.5680 0.7536
No log 5.9615 310 0.5314 0.6689 0.5314 0.7290
No log 6.0 312 0.5729 0.7244 0.5729 0.7569
No log 6.0385 314 0.5536 0.7292 0.5536 0.7441
No log 6.0769 316 0.5226 0.6729 0.5226 0.7229
No log 6.1154 318 0.5473 0.6779 0.5473 0.7398
No log 6.1538 320 0.5414 0.6649 0.5414 0.7358
No log 6.1923 322 0.5694 0.6308 0.5694 0.7546
No log 6.2308 324 0.7036 0.5986 0.7036 0.8388
No log 6.2692 326 0.7095 0.6112 0.7095 0.8423
No log 6.3077 328 0.6199 0.6616 0.6199 0.7874
No log 6.3462 330 0.5452 0.6926 0.5452 0.7384
No log 6.3846 332 0.5190 0.6903 0.5190 0.7204
No log 6.4231 334 0.5086 0.6861 0.5086 0.7132
No log 6.4615 336 0.5075 0.6861 0.5075 0.7124
No log 6.5 338 0.5154 0.7279 0.5154 0.7179
No log 6.5385 340 0.5456 0.7390 0.5456 0.7387
No log 6.5769 342 0.5333 0.7390 0.5333 0.7303
No log 6.6154 344 0.5224 0.7279 0.5224 0.7228
No log 6.6538 346 0.5065 0.7377 0.5065 0.7117
No log 6.6923 348 0.5344 0.7493 0.5344 0.7311
No log 6.7308 350 0.6696 0.6914 0.6696 0.8183
No log 6.7692 352 0.6737 0.6914 0.6737 0.8208
No log 6.8077 354 0.5344 0.7390 0.5344 0.7310
No log 6.8462 356 0.4940 0.7003 0.4940 0.7029
No log 6.8846 358 0.4996 0.7049 0.4996 0.7068
No log 6.9231 360 0.5206 0.6890 0.5206 0.7215
No log 6.9615 362 0.5802 0.6607 0.5802 0.7617
No log 7.0 364 0.5493 0.6870 0.5493 0.7412
No log 7.0385 366 0.5194 0.7151 0.5194 0.7207
No log 7.0769 368 0.5470 0.6288 0.5470 0.7396
No log 7.1154 370 0.5901 0.6118 0.5901 0.7682
No log 7.1538 372 0.5685 0.6500 0.5685 0.7540
No log 7.1923 374 0.5502 0.6509 0.5502 0.7418
No log 7.2308 376 0.5708 0.6316 0.5708 0.7555
No log 7.2692 378 0.5907 0.5630 0.5907 0.7686
No log 7.3077 380 0.6027 0.5396 0.6027 0.7763
No log 7.3462 382 0.6516 0.5490 0.6516 0.8072
No log 7.3846 384 0.7382 0.5860 0.7382 0.8592
No log 7.4231 386 0.7577 0.5717 0.7577 0.8705
No log 7.4615 388 0.6847 0.6491 0.6847 0.8275
No log 7.5 390 0.5873 0.6301 0.5873 0.7664
No log 7.5385 392 0.5671 0.6725 0.5671 0.7531
No log 7.5769 394 0.5675 0.6764 0.5675 0.7533
No log 7.6154 396 0.5602 0.6725 0.5602 0.7485
No log 7.6538 398 0.5674 0.6430 0.5674 0.7533
No log 7.6923 400 0.5685 0.6278 0.5685 0.7540
No log 7.7308 402 0.5848 0.6269 0.5848 0.7647
No log 7.7692 404 0.6015 0.5975 0.6015 0.7756
No log 7.8077 406 0.6192 0.6053 0.6192 0.7869
No log 7.8462 408 0.6053 0.5821 0.6053 0.7780
No log 7.8846 410 0.5761 0.6259 0.5761 0.7590
No log 7.9231 412 0.5578 0.6301 0.5578 0.7469
No log 7.9615 414 0.5509 0.6259 0.5509 0.7422
No log 8.0 416 0.5448 0.6644 0.5448 0.7381
No log 8.0385 418 0.5258 0.6764 0.5258 0.7251
No log 8.0769 420 0.5402 0.6536 0.5402 0.7350
No log 8.1154 422 0.6347 0.6529 0.6347 0.7967
No log 8.1538 424 0.6515 0.6426 0.6515 0.8071
No log 8.1923 426 0.5750 0.6237 0.5750 0.7583
No log 8.2308 428 0.5280 0.6452 0.5280 0.7266
No log 8.2692 430 0.5858 0.6534 0.5858 0.7654
No log 8.3077 432 0.6007 0.6993 0.6007 0.7751
No log 8.3462 434 0.5636 0.6970 0.5636 0.7508
No log 8.3846 436 0.5283 0.6729 0.5283 0.7268
No log 8.4231 438 0.5318 0.6447 0.5318 0.7292
No log 8.4615 440 0.5224 0.6526 0.5224 0.7228
No log 8.5 442 0.5232 0.7279 0.5232 0.7233
No log 8.5385 444 0.5202 0.7279 0.5202 0.7212
No log 8.5769 446 0.5246 0.6804 0.5246 0.7243
No log 8.6154 448 0.5380 0.6452 0.5380 0.7335
No log 8.6538 450 0.5453 0.6417 0.5453 0.7385
No log 8.6923 452 0.5494 0.6715 0.5494 0.7412
No log 8.7308 454 0.5397 0.6673 0.5397 0.7347
No log 8.7692 456 0.5325 0.6796 0.5325 0.7297
No log 8.8077 458 0.5283 0.6564 0.5283 0.7269
No log 8.8462 460 0.5257 0.6874 0.5257 0.7250
No log 8.8846 462 0.5531 0.6347 0.5531 0.7437
No log 8.9231 464 0.5853 0.6035 0.5853 0.7651
No log 8.9615 466 0.5635 0.6301 0.5635 0.7507
No log 9.0 468 0.5456 0.6641 0.5456 0.7387
No log 9.0385 470 0.5463 0.6876 0.5463 0.7392
No log 9.0769 472 0.5481 0.6705 0.5481 0.7403
No log 9.1154 474 0.5326 0.6796 0.5326 0.7298
No log 9.1538 476 0.5315 0.6641 0.5315 0.7291
No log 9.1923 478 0.5253 0.6584 0.5253 0.7248
No log 9.2308 480 0.5375 0.6602 0.5375 0.7332
No log 9.2692 482 0.5609 0.6967 0.5609 0.7489
No log 9.3077 484 0.5679 0.6543 0.5679 0.7536
No log 9.3462 486 0.5222 0.6822 0.5222 0.7226
No log 9.3846 488 0.5105 0.6874 0.5105 0.7145
No log 9.4231 490 0.5223 0.6390 0.5223 0.7227
No log 9.4615 492 0.5431 0.6547 0.5431 0.7370
No log 9.5 494 0.5619 0.6139 0.5619 0.7496
No log 9.5385 496 0.5708 0.5914 0.5708 0.7555
No log 9.5769 498 0.5618 0.6001 0.5618 0.7495
0.3117 9.6154 500 0.5525 0.6301 0.5525 0.7433
0.3117 9.6538 502 0.5388 0.6128 0.5388 0.7341
0.3117 9.6923 504 0.5161 0.6584 0.5161 0.7184
0.3117 9.7308 506 0.5690 0.6883 0.5690 0.7543
0.3117 9.7692 508 0.5750 0.6883 0.5750 0.7583
0.3117 9.8077 510 0.5420 0.6118 0.5420 0.7362
0.3117 9.8462 512 0.5800 0.5734 0.5800 0.7616
0.3117 9.8846 514 0.6099 0.5627 0.6099 0.7809
0.3117 9.9231 516 0.6116 0.4764 0.6116 0.7820
0.3117 9.9615 518 0.5887 0.5316 0.5887 0.7673
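
In the table above, Validation Loss and Mse are identical at every step, which is consistent with a mean-squared-error regression objective, and Rmse is simply the square root of Mse. For the final row:

```python
import math

# Final evaluation row: Validation Loss 0.5887, Mse 0.5887.
final_mse = 0.5887
print(round(math.sqrt(final_mse), 4))  # 0.7673, the reported Rmse
```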

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k10_task5_organization

  • Finetuned from: aubmindlab/bert-base-arabertv02