ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k12_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5646
  • Qwk: 0.4997
  • Mse: 0.5646
  • Rmse: 0.7514
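These metrics can be recomputed from raw predictions. Below is a minimal sketch using only NumPy; the 3-class label range in the toy example is a hypothetical assumption, not taken from this model's actual scoring rubric. Note that the reported Rmse is simply the square root of the Mse (0.7514 ≈ √0.5646).

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa (QWK) between two integer label sequences."""
    # Observed confusion matrix
    O = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        O[t, p] += 1
    # Quadratic disagreement weights, normalized to [0, 1]
    idx = np.arange(n_classes)
    W = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    # Expected confusion matrix under chance agreement (outer product of marginals)
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / O.sum()
    return 1.0 - (W * O).sum() / (W * E).sum()

# Toy labels (hypothetical, for illustration only)
y_true = np.array([0, 1, 2, 1, 0])
y_pred = np.array([0, 1, 2, 2, 0])

mse = np.mean((y_true - y_pred) ** 2)
rmse = np.sqrt(mse)  # RMSE is always sqrt(MSE), as in the card's metrics
qwk = quadratic_weighted_kappa(y_true, y_pred, 3)
```

Perfect agreement yields QWK = 1.0, chance-level agreement yields 0, and larger ordinal disagreements are penalized quadratically, which is why QWK is the usual headline metric for essay-scoring tasks like this one.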

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
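With lr_scheduler_type "linear" and no warmup reported, the learning rate decays linearly from 2e-05 toward 0 over the planned training steps. The following is a sketch of that schedule; the total of 6000 steps is an inference (the log shows 60 steps per epoch × 100 planned epochs) and the training log above actually stops at step 512, so treat the constant as an assumption.

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Learning rate under a 'linear' schedule: optional warmup, then linear decay to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# Assumption: 60 optimizer steps/epoch (from the log) x 100 epochs
TOTAL_STEPS = 6000
```

At step 512 (where the log ends), this schedule would still be near the initial rate, at roughly 2e-05 × (6000 − 512) / 6000.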

Training results

| Training Loss | Epoch  | Step | Validation Loss | Qwk     | Mse    | Rmse   |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log        | 0.0333 | 2    | 2.5503          | -0.0788 | 2.5503 | 1.5970 |
| No log        | 0.0667 | 4    | 1.2762          | 0.1243  | 1.2762 | 1.1297 |
| No log        | 0.1    | 6    | 0.7536          | 0.0937  | 0.7536 | 0.8681 |
| No log        | 0.1333 | 8    | 0.9009          | 0.2046  | 0.9009 | 0.9491 |
| No log        | 0.1667 | 10   | 0.9997          | 0.2635  | 0.9997 | 0.9999 |
| No log        | 0.2    | 12   | 0.6965          | 0.3918  | 0.6965 | 0.8346 |
| No log        | 0.2333 | 14   | 0.5420          | 0.3580  | 0.5420 | 0.7362 |
| No log        | 0.2667 | 16   | 0.5417          | 0.4198  | 0.5417 | 0.7360 |
| No log        | 0.3    | 18   | 0.5241          | 0.4740  | 0.5241 | 0.7240 |
| No log        | 0.3333 | 20   | 0.5078          | 0.6017  | 0.5078 | 0.7126 |
| No log        | 0.3667 | 22   | 0.5016          | 0.6330  | 0.5016 | 0.7082 |
| No log        | 0.4    | 24   | 0.5183          | 0.5703  | 0.5183 | 0.7200 |
| No log        | 0.4333 | 26   | 0.5177          | 0.5306  | 0.5177 | 0.7195 |
| No log        | 0.4667 | 28   | 0.5306          | 0.5071  | 0.5306 | 0.7284 |
| No log        | 0.5    | 30   | 0.5613          | 0.4137  | 0.5613 | 0.7492 |
| No log        | 0.5333 | 32   | 0.6684          | 0.3308  | 0.6684 | 0.8176 |
| No log        | 0.5667 | 34   | 0.6620          | 0.4161  | 0.6620 | 0.8136 |
| No log        | 0.6    | 36   | 0.5411          | 0.4596  | 0.5411 | 0.7356 |
| No log        | 0.6333 | 38   | 0.5095          | 0.5741  | 0.5095 | 0.7138 |
| No log        | 0.6667 | 40   | 0.5156          | 0.5528  | 0.5156 | 0.7181 |
| No log        | 0.7    | 42   | 0.5785          | 0.5275  | 0.5785 | 0.7606 |
| No log        | 0.7333 | 44   | 0.6262          | 0.4404  | 0.6262 | 0.7913 |
| No log        | 0.7667 | 46   | 0.6043          | 0.5275  | 0.6043 | 0.7774 |
| No log        | 0.8    | 48   | 0.5366          | 0.5587  | 0.5366 | 0.7325 |
| No log        | 0.8333 | 50   | 0.5272          | 0.5290  | 0.5272 | 0.7261 |
| No log        | 0.8667 | 52   | 0.6244          | 0.4904  | 0.6244 | 0.7902 |
| No log        | 0.9    | 54   | 0.7043          | 0.4812  | 0.7043 | 0.8392 |
| No log        | 0.9333 | 56   | 0.6991          | 0.5264  | 0.6991 | 0.8361 |
| No log        | 0.9667 | 58   | 0.6271          | 0.5533  | 0.6271 | 0.7919 |
| No log        | 1.0    | 60   | 0.6096          | 0.5620  | 0.6096 | 0.7808 |
| No log        | 1.0333 | 62   | 0.5738          | 0.6141  | 0.5738 | 0.7575 |
| No log        | 1.0667 | 64   | 0.5639          | 0.6141  | 0.5639 | 0.7509 |
| No log        | 1.1    | 66   | 0.5354          | 0.5827  | 0.5354 | 0.7317 |
| No log        | 1.1333 | 68   | 0.5191          | 0.5660  | 0.5191 | 0.7205 |
| No log        | 1.1667 | 70   | 0.5215          | 0.6046  | 0.5215 | 0.7222 |
| No log        | 1.2    | 72   | 0.5560          | 0.6064  | 0.5560 | 0.7457 |
| No log        | 1.2333 | 74   | 0.5715          | 0.6270  | 0.5715 | 0.7560 |
| No log        | 1.2667 | 76   | 0.6316          | 0.6670  | 0.6316 | 0.7947 |
| No log        | 1.3    | 78   | 0.7349          | 0.6054  | 0.7349 | 0.8573 |
| No log        | 1.3333 | 80   | 0.7634          | 0.5574  | 0.7634 | 0.8737 |
| No log        | 1.3667 | 82   | 0.8140          | 0.5472  | 0.8140 | 0.9022 |
| No log        | 1.4    | 84   | 0.7484          | 0.5360  | 0.7484 | 0.8651 |
| No log        | 1.4333 | 86   | 0.7248          | 0.5021  | 0.7248 | 0.8513 |
| No log        | 1.4667 | 88   | 0.6668          | 0.5676  | 0.6668 | 0.8166 |
| No log        | 1.5    | 90   | 0.6643          | 0.6152  | 0.6643 | 0.8150 |
| No log        | 1.5333 | 92   | 0.6885          | 0.6030  | 0.6885 | 0.8298 |
| No log        | 1.5667 | 94   | 0.6537          | 0.5787  | 0.6537 | 0.8085 |
| No log        | 1.6    | 96   | 0.6306          | 0.5966  | 0.6306 | 0.7941 |
| No log        | 1.6333 | 98   | 0.6776          | 0.5789  | 0.6776 | 0.8232 |
| No log        | 1.6667 | 100  | 0.6966          | 0.5915  | 0.6966 | 0.8346 |
| No log        | 1.7    | 102  | 0.7086          | 0.5569  | 0.7086 | 0.8418 |
| No log        | 1.7333 | 104  | 0.7496          | 0.5620  | 0.7496 | 0.8658 |
| No log        | 1.7667 | 106  | 0.6850          | 0.5264  | 0.6850 | 0.8276 |
| No log        | 1.8    | 108  | 0.6597          | 0.5215  | 0.6597 | 0.8122 |
| No log        | 1.8333 | 110  | 0.6558          | 0.4893  | 0.6558 | 0.8098 |
| No log        | 1.8667 | 112  | 0.5895          | 0.5388  | 0.5895 | 0.7678 |
| No log        | 1.9    | 114  | 0.4990          | 0.5909  | 0.4990 | 0.7064 |
| No log        | 1.9333 | 116  | 0.4979          | 0.5501  | 0.4979 | 0.7056 |
| No log        | 1.9667 | 118  | 0.5472          | 0.5920  | 0.5472 | 0.7397 |
| No log        | 2.0    | 120  | 0.6466          | 0.5439  | 0.6466 | 0.8041 |
| No log        | 2.0333 | 122  | 0.7615          | 0.5360  | 0.7615 | 0.8727 |
| No log        | 2.0667 | 124  | 0.8018          | 0.5585  | 0.8018 | 0.8954 |
| No log        | 2.1    | 126  | 0.7190          | 0.6119  | 0.7190 | 0.8479 |
| No log        | 2.1333 | 128  | 0.5625          | 0.5343  | 0.5625 | 0.7500 |
| No log        | 2.1667 | 130  | 0.5715          | 0.5710  | 0.5715 | 0.7560 |
| No log        | 2.2    | 132  | 0.5391          | 0.5656  | 0.5391 | 0.7342 |
| No log        | 2.2333 | 134  | 0.5359          | 0.5897  | 0.5359 | 0.7321 |
| No log        | 2.2667 | 136  | 0.6610          | 0.5219  | 0.6610 | 0.8130 |
| No log        | 2.3    | 138  | 0.8501          | 0.4608  | 0.8501 | 0.9220 |
| No log        | 2.3333 | 140  | 0.9102          | 0.3489  | 0.9102 | 0.9540 |
| No log        | 2.3667 | 142  | 1.0190          | 0.3776  | 1.0190 | 1.0094 |
| No log        | 2.4    | 144  | 1.2078          | 0.4360  | 1.2078 | 1.0990 |
| No log        | 2.4333 | 146  | 1.2363          | 0.3948  | 1.2363 | 1.1119 |
| No log        | 2.4667 | 148  | 1.0972          | 0.4348  | 1.0972 | 1.0475 |
| No log        | 2.5    | 150  | 0.8773          | 0.4589  | 0.8773 | 0.9366 |
| No log        | 2.5333 | 152  | 0.7224          | 0.4877  | 0.7224 | 0.8499 |
| No log        | 2.5667 | 154  | 0.6206          | 0.5206  | 0.6206 | 0.7878 |
| No log        | 2.6    | 156  | 0.5799          | 0.5237  | 0.5799 | 0.7615 |
| No log        | 2.6333 | 158  | 0.6217          | 0.5470  | 0.6217 | 0.7885 |
| No log        | 2.6667 | 160  | 0.8064          | 0.5638  | 0.8064 | 0.8980 |
| No log        | 2.7    | 162  | 0.9614          | 0.4504  | 0.9614 | 0.9805 |
| No log        | 2.7333 | 164  | 0.9252          | 0.4870  | 0.9252 | 0.9619 |
| No log        | 2.7667 | 166  | 0.7714          | 0.4288  | 0.7714 | 0.8783 |
| No log        | 2.8    | 168  | 0.7729          | 0.4021  | 0.7729 | 0.8791 |
| No log        | 2.8333 | 170  | 0.7550          | 0.4270  | 0.7550 | 0.8689 |
| No log        | 2.8667 | 172  | 0.7223          | 0.6106  | 0.7223 | 0.8499 |
| No log        | 2.9    | 174  | 0.6917          | 0.6421  | 0.6917 | 0.8317 |
| No log        | 2.9333 | 176  | 0.7242          | 0.5810  | 0.7242 | 0.8510 |
| No log        | 2.9667 | 178  | 0.6373          | 0.6025  | 0.6373 | 0.7983 |
| No log        | 3.0    | 180  | 0.5621          | 0.5333  | 0.5621 | 0.7498 |
| No log        | 3.0333 | 182  | 0.5369          | 0.4958  | 0.5369 | 0.7327 |
| No log        | 3.0667 | 184  | 0.5136          | 0.5528  | 0.5136 | 0.7167 |
| No log        | 3.1    | 186  | 0.5482          | 0.6509  | 0.5482 | 0.7404 |
| No log        | 3.1333 | 188  | 0.5700          | 0.6152  | 0.5700 | 0.7550 |
| No log        | 3.1667 | 190  | 0.6810          | 0.5973  | 0.6810 | 0.8252 |
| No log        | 3.2    | 192  | 0.6855          | 0.5702  | 0.6855 | 0.8279 |
| No log        | 3.2333 | 194  | 0.6690          | 0.5883  | 0.6690 | 0.8179 |
| No log        | 3.2667 | 196  | 0.6188          | 0.5750  | 0.6188 | 0.7866 |
| No log        | 3.3    | 198  | 0.6342          | 0.6338  | 0.6342 | 0.7964 |
| No log        | 3.3333 | 200  | 0.6167          | 0.6281  | 0.6167 | 0.7853 |
| No log        | 3.3667 | 202  | 0.4979          | 0.5801  | 0.4979 | 0.7056 |
| No log        | 3.4    | 204  | 0.4931          | 0.5970  | 0.4931 | 0.7022 |
| No log        | 3.4333 | 206  | 0.5293          | 0.6080  | 0.5293 | 0.7276 |
| No log        | 3.4667 | 208  | 0.5091          | 0.6344  | 0.5091 | 0.7135 |
| No log        | 3.5    | 210  | 0.4618          | 0.5988  | 0.4618 | 0.6795 |
| No log        | 3.5333 | 212  | 0.4515          | 0.5902  | 0.4515 | 0.6719 |
| No log        | 3.5667 | 214  | 0.4624          | 0.5902  | 0.4624 | 0.6800 |
| No log        | 3.6    | 216  | 0.5040          | 0.6221  | 0.5040 | 0.7099 |
| No log        | 3.6333 | 218  | 0.6084          | 0.6453  | 0.6084 | 0.7800 |
| No log        | 3.6667 | 220  | 0.6096          | 0.6453  | 0.6096 | 0.7807 |
| No log        | 3.7    | 222  | 0.5543          | 0.6157  | 0.5543 | 0.7445 |
| No log        | 3.7333 | 224  | 0.4963          | 0.6199  | 0.4963 | 0.7045 |
| No log        | 3.7667 | 226  | 0.4570          | 0.6174  | 0.4570 | 0.6760 |
| No log        | 3.8    | 228  | 0.4437          | 0.5961  | 0.4437 | 0.6661 |
| No log        | 3.8333 | 230  | 0.4661          | 0.6239  | 0.4661 | 0.6827 |
| No log        | 3.8667 | 232  | 0.4960          | 0.6249  | 0.4960 | 0.7042 |
| No log        | 3.9    | 234  | 0.5459          | 0.5824  | 0.5459 | 0.7389 |
| No log        | 3.9333 | 236  | 0.5289          | 0.5579  | 0.5289 | 0.7272 |
| No log        | 3.9667 | 238  | 0.4916          | 0.6015  | 0.4916 | 0.7011 |
| No log        | 4.0    | 240  | 0.4877          | 0.5383  | 0.4877 | 0.6984 |
| No log        | 4.0333 | 242  | 0.4992          | 0.5548  | 0.4992 | 0.7065 |
| No log        | 4.0667 | 244  | 0.5069          | 0.5465  | 0.5069 | 0.7120 |
| No log        | 4.1    | 246  | 0.5400          | 0.5503  | 0.5400 | 0.7348 |
| No log        | 4.1333 | 248  | 0.5489          | 0.5809  | 0.5489 | 0.7409 |
| No log        | 4.1667 | 250  | 0.5389          | 0.5831  | 0.5389 | 0.7341 |
| No log        | 4.2    | 252  | 0.5227          | 0.5831  | 0.5227 | 0.7230 |
| No log        | 4.2333 | 254  | 0.5355          | 0.6155  | 0.5355 | 0.7318 |
| No log        | 4.2667 | 256  | 0.5466          | 0.6374  | 0.5466 | 0.7393 |
| No log        | 4.3    | 258  | 0.5409          | 0.6207  | 0.5409 | 0.7355 |
| No log        | 4.3333 | 260  | 0.5278          | 0.5881  | 0.5278 | 0.7265 |
| No log        | 4.3667 | 262  | 0.5544          | 0.6455  | 0.5544 | 0.7446 |
| No log        | 4.4    | 264  | 0.5923          | 0.6275  | 0.5923 | 0.7696 |
| No log        | 4.4333 | 266  | 0.5410          | 0.5719  | 0.5410 | 0.7355 |
| No log        | 4.4667 | 268  | 0.5122          | 0.6004  | 0.5122 | 0.7157 |
| No log        | 4.5    | 270  | 0.5246          | 0.5934  | 0.5246 | 0.7243 |
| No log        | 4.5333 | 272  | 0.6105          | 0.5748  | 0.6105 | 0.7813 |
| No log        | 4.5667 | 274  | 0.6232          | 0.6016  | 0.6232 | 0.7894 |
| No log        | 4.6    | 276  | 0.5948          | 0.5639  | 0.5948 | 0.7713 |
| No log        | 4.6333 | 278  | 0.6029          | 0.5368  | 0.6029 | 0.7765 |
| No log        | 4.6667 | 280  | 0.6985          | 0.4633  | 0.6985 | 0.8358 |
| No log        | 4.7    | 282  | 0.8243          | 0.5121  | 0.8243 | 0.9079 |
| No log        | 4.7333 | 284  | 0.8348          | 0.4669  | 0.8348 | 0.9137 |
| No log        | 4.7667 | 286  | 0.7183          | 0.3688  | 0.7183 | 0.8475 |
| No log        | 4.8    | 288  | 0.5843          | 0.5166  | 0.5843 | 0.7644 |
| No log        | 4.8333 | 290  | 0.5553          | 0.5692  | 0.5553 | 0.7452 |
| No log        | 4.8667 | 292  | 0.5835          | 0.5357  | 0.5835 | 0.7639 |
| No log        | 4.9    | 294  | 0.6071          | 0.5323  | 0.6071 | 0.7792 |
| No log        | 4.9333 | 296  | 0.5991          | 0.5705  | 0.5991 | 0.7740 |
| No log        | 4.9667 | 298  | 0.5881          | 0.5814  | 0.5881 | 0.7669 |
| No log        | 5.0    | 300  | 0.5445          | 0.5772  | 0.5445 | 0.7379 |
| No log        | 5.0333 | 302  | 0.5542          | 0.5308  | 0.5542 | 0.7444 |
| No log        | 5.0667 | 304  | 0.6441          | 0.5688  | 0.6441 | 0.8026 |
| No log        | 5.1    | 306  | 0.7478          | 0.5526  | 0.7478 | 0.8648 |
| No log        | 5.1333 | 308  | 0.7154          | 0.5281  | 0.7154 | 0.8458 |
| No log        | 5.1667 | 310  | 0.5622          | 0.6269  | 0.5622 | 0.7498 |
| No log        | 5.2    | 312  | 0.4808          | 0.6530  | 0.4808 | 0.6934 |
| No log        | 5.2333 | 314  | 0.4733          | 0.6530  | 0.4733 | 0.6880 |
| No log        | 5.2667 | 316  | 0.5020          | 0.6606  | 0.5020 | 0.7085 |
| No log        | 5.3    | 318  | 0.5931          | 0.6367  | 0.5931 | 0.7702 |
| No log        | 5.3333 | 320  | 0.6035          | 0.6367  | 0.6035 | 0.7768 |
| No log        | 5.3667 | 322  | 0.5428          | 0.6362  | 0.5428 | 0.7367 |
| No log        | 5.4    | 324  | 0.5095          | 0.5485  | 0.5095 | 0.7138 |
| No log        | 5.4333 | 326  | 0.5422          | 0.5327  | 0.5422 | 0.7363 |
| No log        | 5.4667 | 328  | 0.6289          | 0.4665  | 0.6289 | 0.7930 |
| No log        | 5.5    | 330  | 0.7863          | 0.5157  | 0.7863 | 0.8867 |
| No log        | 5.5333 | 332  | 0.8240          | 0.5688  | 0.8240 | 0.9078 |
| No log        | 5.5667 | 334  | 0.7105          | 0.6073  | 0.7105 | 0.8429 |
| No log        | 5.6    | 336  | 0.5691          | 0.5758  | 0.5691 | 0.7544 |
| No log        | 5.6333 | 338  | 0.4726          | 0.5923  | 0.4726 | 0.6875 |
| No log        | 5.6667 | 340  | 0.4551          | 0.6228  | 0.4551 | 0.6746 |
| No log        | 5.7    | 342  | 0.4909          | 0.6507  | 0.4909 | 0.7006 |
| No log        | 5.7333 | 344  | 0.5232          | 0.6277  | 0.5232 | 0.7233 |
| No log        | 5.7667 | 346  | 0.5496          | 0.6285  | 0.5496 | 0.7414 |
| No log        | 5.8    | 348  | 0.5062          | 0.6156  | 0.5062 | 0.7115 |
| No log        | 5.8333 | 350  | 0.4778          | 0.5683  | 0.4778 | 0.6913 |
| No log        | 5.8667 | 352  | 0.4887          | 0.5966  | 0.4887 | 0.6990 |
| No log        | 5.9    | 354  | 0.5058          | 0.5614  | 0.5058 | 0.7112 |
| No log        | 5.9333 | 356  | 0.5189          | 0.5904  | 0.5189 | 0.7204 |
| No log        | 5.9667 | 358  | 0.5226          | 0.5994  | 0.5226 | 0.7229 |
| No log        | 6.0    | 360  | 0.4836          | 0.6150  | 0.4836 | 0.6954 |
| No log        | 6.0333 | 362  | 0.4335          | 0.6643  | 0.4335 | 0.6584 |
| No log        | 6.0667 | 364  | 0.4398          | 0.6125  | 0.4398 | 0.6632 |
| No log        | 6.1    | 366  | 0.4605          | 0.6543  | 0.4605 | 0.6786 |
| No log        | 6.1333 | 368  | 0.4660          | 0.6129  | 0.4660 | 0.6826 |
| No log        | 6.1667 | 370  | 0.5484          | 0.6591  | 0.5484 | 0.7405 |
| No log        | 6.2    | 372  | 0.7015          | 0.6220  | 0.7015 | 0.8375 |
| No log        | 6.2333 | 374  | 0.7167          | 0.6220  | 0.7167 | 0.8466 |
| No log        | 6.2667 | 376  | 0.6279          | 0.6421  | 0.6279 | 0.7924 |
| No log        | 6.3    | 378  | 0.4914          | 0.6427  | 0.4914 | 0.7010 |
| No log        | 6.3333 | 380  | 0.4553          | 0.6024  | 0.4553 | 0.6748 |
| No log        | 6.3667 | 382  | 0.4537          | 0.5956  | 0.4537 | 0.6736 |
| No log        | 6.4    | 384  | 0.4559          | 0.6643  | 0.4559 | 0.6752 |
| No log        | 6.4333 | 386  | 0.4819          | 0.6540  | 0.4819 | 0.6942 |
| No log        | 6.4667 | 388  | 0.5286          | 0.6295  | 0.5286 | 0.7270 |
| No log        | 6.5    | 390  | 0.5466          | 0.6239  | 0.5466 | 0.7393 |
| No log        | 6.5333 | 392  | 0.5023          | 0.6529  | 0.5023 | 0.7087 |
| No log        | 6.5667 | 394  | 0.4542          | 0.6112  | 0.4542 | 0.6739 |
| No log        | 6.6    | 396  | 0.4509          | 0.6377  | 0.4509 | 0.6715 |
| No log        | 6.6333 | 398  | 0.4689          | 0.6452  | 0.4689 | 0.6848 |
| No log        | 6.6667 | 400  | 0.5134          | 0.6437  | 0.5134 | 0.7165 |
| No log        | 6.7    | 402  | 0.5013          | 0.6259  | 0.5013 | 0.7080 |
| No log        | 6.7333 | 404  | 0.4606          | 0.6507  | 0.4606 | 0.6787 |
| No log        | 6.7667 | 406  | 0.4602          | 0.6698  | 0.4602 | 0.6784 |
| No log        | 6.8    | 408  | 0.4888          | 0.6507  | 0.4888 | 0.6992 |
| No log        | 6.8333 | 410  | 0.5231          | 0.5957  | 0.5231 | 0.7232 |
| No log        | 6.8667 | 412  | 0.5683          | 0.5998  | 0.5683 | 0.7538 |
| No log        | 6.9    | 414  | 0.5698          | 0.5998  | 0.5698 | 0.7549 |
| No log        | 6.9333 | 416  | 0.5560          | 0.5978  | 0.5560 | 0.7457 |
| No log        | 6.9667 | 418  | 0.5274          | 0.6239  | 0.5274 | 0.7263 |
| No log        | 7.0    | 420  | 0.5726          | 0.6582  | 0.5726 | 0.7567 |
| No log        | 7.0333 | 422  | 0.5856          | 0.6424  | 0.5856 | 0.7652 |
| No log        | 7.0667 | 424  | 0.5322          | 0.5827  | 0.5322 | 0.7296 |
| No log        | 7.1    | 426  | 0.4764          | 0.5725  | 0.4764 | 0.6902 |
| No log        | 7.1333 | 428  | 0.4859          | 0.6407  | 0.4859 | 0.6971 |
| No log        | 7.1667 | 430  | 0.4838          | 0.6407  | 0.4838 | 0.6956 |
| No log        | 7.2    | 432  | 0.4566          | 0.6105  | 0.4566 | 0.6757 |
| No log        | 7.2333 | 434  | 0.4687          | 0.6353  | 0.4687 | 0.6846 |
| No log        | 7.2667 | 436  | 0.4859          | 0.6239  | 0.4859 | 0.6971 |
| No log        | 7.3    | 438  | 0.5305          | 0.6165  | 0.5305 | 0.7283 |
| No log        | 7.3333 | 440  | 0.5169          | 0.6226  | 0.5169 | 0.7190 |
| No log        | 7.3667 | 442  | 0.4896          | 0.6135  | 0.4896 | 0.6997 |
| No log        | 7.4    | 444  | 0.4929          | 0.5934  | 0.4929 | 0.7020 |
| No log        | 7.4333 | 446  | 0.5378          | 0.5845  | 0.5378 | 0.7333 |
| No log        | 7.4667 | 448  | 0.5575          | 0.5845  | 0.5575 | 0.7466 |
| No log        | 7.5    | 450  | 0.5141          | 0.5845  | 0.5141 | 0.7170 |
| No log        | 7.5333 | 452  | 0.5037          | 0.5934  | 0.5037 | 0.7097 |
| No log        | 7.5667 | 454  | 0.5288          | 0.5758  | 0.5288 | 0.7272 |
| No log        | 7.6    | 456  | 0.5353          | 0.5957  | 0.5353 | 0.7317 |
| No log        | 7.6333 | 458  | 0.4967          | 0.5934  | 0.4967 | 0.7048 |
| No log        | 7.6667 | 460  | 0.4688          | 0.6414  | 0.4688 | 0.6847 |
| No log        | 7.7    | 462  | 0.4682          | 0.6608  | 0.4682 | 0.6842 |
| No log        | 7.7333 | 464  | 0.4677          | 0.6608  | 0.4677 | 0.6839 |
| No log        | 7.7667 | 466  | 0.5098          | 0.6406  | 0.5098 | 0.7140 |
| No log        | 7.8    | 468  | 0.5629          | 0.6588  | 0.5629 | 0.7502 |
| No log        | 7.8333 | 470  | 0.5912          | 0.6258  | 0.5912 | 0.7689 |
| No log        | 7.8667 | 472  | 0.5451          | 0.6333  | 0.5451 | 0.7383 |
| No log        | 7.9    | 474  | 0.4969          | 0.6604  | 0.4969 | 0.7049 |
| No log        | 7.9333 | 476  | 0.5170          | 0.6318  | 0.5170 | 0.7190 |
| No log        | 7.9667 | 478  | 0.6200          | 0.5947  | 0.6200 | 0.7874 |
| No log        | 8.0    | 480  | 0.7318          | 0.6035  | 0.7318 | 0.8554 |
| No log        | 8.0333 | 482  | 0.7278          | 0.5894  | 0.7278 | 0.8531 |
| No log        | 8.0667 | 484  | 0.6215          | 0.5510  | 0.6215 | 0.7883 |
| No log        | 8.1    | 486  | 0.5226          | 0.5909  | 0.5226 | 0.7229 |
| No log        | 8.1333 | 488  | 0.4816          | 0.6018  | 0.4816 | 0.6940 |
| No log        | 8.1667 | 490  | 0.4629          | 0.6414  | 0.4629 | 0.6804 |
| No log        | 8.2    | 492  | 0.4885          | 0.6227  | 0.4885 | 0.6989 |
| No log        | 8.2333 | 494  | 0.5487          | 0.6582  | 0.5487 | 0.7408 |
| No log        | 8.2667 | 496  | 0.5476          | 0.6891  | 0.5476 | 0.7400 |
| No log        | 8.3    | 498  | 0.5193          | 0.6594  | 0.5193 | 0.7207 |
| 0.3012        | 8.3333 | 500  | 0.4791          | 0.6330  | 0.4791 | 0.6922 |
| 0.3012        | 8.3667 | 502  | 0.4669          | 0.6418  | 0.4669 | 0.6833 |
| 0.3012        | 8.4    | 504  | 0.4745          | 0.5831  | 0.4745 | 0.6888 |
| 0.3012        | 8.4333 | 506  | 0.5147          | 0.5831  | 0.5147 | 0.7175 |
| 0.3012        | 8.4667 | 508  | 0.5998          | 0.5510  | 0.5998 | 0.7744 |
| 0.3012        | 8.5    | 510  | 0.6209          | 0.5533  | 0.6209 | 0.7879 |
| 0.3012        | 8.5333 | 512  | 0.5646          | 0.4997  | 0.5646 | 0.7514 |
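Note that the evaluation metrics reported at the top of this card (Qwk 0.4997, step 512) come from the final logged step, not from the best checkpoint in the table: validation QWK peaks at 0.6891 at step 496. When choosing a checkpoint from a log like this, selecting by validation QWK (higher is better) is more task-relevant for ordinal scoring than selecting by loss. A minimal sketch over a small excerpt of the rows above (a full selection would read every row):

```python
# (step, validation QWK) pairs excerpted from the training log above
rows = [
    (22, 0.6330),
    (216, 0.6221),
    (362, 0.6643),
    (496, 0.6891),
    (512, 0.4997),
]

# Pick the checkpoint with the highest validation QWK
best_step, best_qwk = max(rows, key=lambda r: r[1])
```

This mirrors what `load_best_model_at_end` with `metric_for_best_model="qwk"` would do in the Hugging Face Trainer, though the card does not state whether that option was enabled.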

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors

  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k12_task7_organization

Finetuned from aubmindlab/bert-base-arabertv02