ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k1_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the auto-generated card lists it as "None"). It achieves the following results on the evaluation set:

  • Loss: 0.5452
  • Qwk: 0.5884
  • Mse: 0.5452
  • Rmse: 0.7384
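These metrics can be recomputed from raw predictions. Qwk is Cohen's kappa with quadratic weights, and Rmse is simply the square root of Mse. A minimal NumPy sketch (the helper function and the toy labels below are illustrative only, not taken from this training run):

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the Qwk metric above)."""
    # Observed confusion matrix
    O = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        O[t, p] += 1
    # Quadratic disagreement weights, normalized to [0, 1]
    w = np.array([[(i - j) ** 2 for j in range(n_classes)]
                  for i in range(n_classes)], dtype=float)
    w /= (n_classes - 1) ** 2
    # Expected confusion matrix under independence, same total count as O
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / O.sum()
    return 1.0 - (w * O).sum() / (w * E).sum()

# Toy example with 3 ordinal classes (illustrative values only)
y_true = np.array([0, 1, 2, 2, 1, 0])
y_pred = np.array([0, 2, 2, 1, 1, 0])

mse = float(np.mean((y_true - y_pred) ** 2))
rmse = float(np.sqrt(mse))  # Rmse = sqrt(Mse)
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=3)
```

Note that Loss and Mse coincide in every row of the results below, which is consistent with the model being trained as a regressor under an MSE objective.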

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
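With lr_scheduler_type: linear and no warmup listed, the learning rate decays linearly from learning_rate to zero over the total number of optimizer steps (500 here, per the final row of the results table). A pure-Python sketch of that schedule (assuming zero warmup steps, which this card does not specify):

```python
def linear_schedule_lr(step, base_lr=2e-5, warmup_steps=0, total_steps=500):
    """Learning rate at a given optimizer step under a linear decay
    schedule with optional linear warmup (warmup_steps=0 assumed here)."""
    if step < warmup_steps:
        # Linear ramp-up from 0 to base_lr during warmup
        return base_lr * step / max(1, warmup_steps)
    # Linear decay from base_lr down to 0 at total_steps
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

print(linear_schedule_lr(0))    # 2e-05 at the first step
print(linear_schedule_lr(250))  # 1e-05 halfway through
print(linear_schedule_lr(500))  # 0.0 at the final step
```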

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.4 2 2.6258 0.0070 2.6258 1.6204
No log 0.8 4 1.4387 0.0789 1.4387 1.1995
No log 1.2 6 0.8003 0.1372 0.8003 0.8946
No log 1.6 8 0.9093 0.0679 0.9093 0.9536
No log 2.0 10 1.0176 0.1385 1.0176 1.0088
No log 2.4 12 1.1333 0.2121 1.1333 1.0646
No log 2.8 14 0.8989 0.3892 0.8989 0.9481
No log 3.2 16 0.6486 0.1050 0.6486 0.8054
No log 3.6 18 0.6125 0.1407 0.6125 0.7826
No log 4.0 20 0.5978 0.1561 0.5978 0.7732
No log 4.4 22 0.6237 0.0027 0.6237 0.7898
No log 4.8 24 0.6333 0.0495 0.6333 0.7958
No log 5.2 26 0.5837 0.1358 0.5837 0.7640
No log 5.6 28 0.5443 0.4538 0.5443 0.7377
No log 6.0 30 0.5344 0.4660 0.5344 0.7311
No log 6.4 32 0.5799 0.4218 0.5799 0.7615
No log 6.8 34 0.5301 0.5396 0.5301 0.7281
No log 7.2 36 0.5151 0.5195 0.5151 0.7177
No log 7.6 38 0.5242 0.6018 0.5242 0.7240
No log 8.0 40 0.5482 0.4933 0.5482 0.7404
No log 8.4 42 0.5972 0.4315 0.5972 0.7728
No log 8.8 44 0.5296 0.5308 0.5296 0.7278
No log 9.2 46 0.4759 0.6223 0.4759 0.6898
No log 9.6 48 0.7538 0.3870 0.7538 0.8682
No log 10.0 50 0.7015 0.4024 0.7015 0.8376
No log 10.4 52 0.4714 0.5943 0.4714 0.6866
No log 10.8 54 0.5989 0.4964 0.5989 0.7739
No log 11.2 56 0.7422 0.5295 0.7422 0.8615
No log 11.6 58 0.6853 0.5203 0.6853 0.8278
No log 12.0 60 0.5426 0.5597 0.5426 0.7366
No log 12.4 62 0.5337 0.6150 0.5337 0.7306
No log 12.8 64 0.5488 0.6516 0.5488 0.7408
No log 13.2 66 0.5493 0.6526 0.5493 0.7411
No log 13.6 68 0.5324 0.6582 0.5324 0.7297
No log 14.0 70 0.5459 0.6343 0.5459 0.7388
No log 14.4 72 0.5557 0.5935 0.5557 0.7455
No log 14.8 74 0.5890 0.5434 0.5890 0.7675
No log 15.2 76 0.6464 0.4665 0.6464 0.8040
No log 15.6 78 0.6626 0.5155 0.6626 0.8140
No log 16.0 80 0.6101 0.4801 0.6101 0.7811
No log 16.4 82 0.5922 0.4801 0.5922 0.7695
No log 16.8 84 0.6099 0.5015 0.6099 0.7809
No log 17.2 86 0.6071 0.5470 0.6071 0.7791
No log 17.6 88 0.5695 0.5012 0.5695 0.7547
No log 18.0 90 0.5923 0.5771 0.5923 0.7696
No log 18.4 92 0.6492 0.5410 0.6492 0.8057
No log 18.8 94 0.7542 0.5307 0.7542 0.8684
No log 19.2 96 0.7243 0.5281 0.7243 0.8511
No log 19.6 98 0.6139 0.5409 0.6139 0.7835
No log 20.0 100 0.5772 0.5529 0.5772 0.7598
No log 20.4 102 0.5907 0.5672 0.5907 0.7685
No log 20.8 104 0.6065 0.6373 0.6065 0.7788
No log 21.2 106 0.5845 0.5958 0.5845 0.7645
No log 21.6 108 0.5797 0.5914 0.5797 0.7614
No log 22.0 110 0.5922 0.5627 0.5922 0.7695
No log 22.4 112 0.6271 0.5141 0.6271 0.7919
No log 22.8 114 0.5927 0.5552 0.5927 0.7699
No log 23.2 116 0.5444 0.6014 0.5444 0.7379
No log 23.6 118 0.5494 0.5440 0.5494 0.7412
No log 24.0 120 0.5607 0.6139 0.5607 0.7488
No log 24.4 122 0.6103 0.5703 0.6103 0.7812
No log 24.8 124 0.6491 0.5981 0.6491 0.8057
No log 25.2 126 0.6218 0.5827 0.6218 0.7885
No log 25.6 128 0.5562 0.5626 0.5562 0.7458
No log 26.0 130 0.5366 0.5853 0.5366 0.7325
No log 26.4 132 0.5293 0.5926 0.5293 0.7275
No log 26.8 134 0.5377 0.5528 0.5377 0.7333
No log 27.2 136 0.5559 0.6058 0.5559 0.7456
No log 27.6 138 0.5485 0.5784 0.5485 0.7406
No log 28.0 140 0.5448 0.5647 0.5448 0.7381
No log 28.4 142 0.5624 0.5606 0.5624 0.7499
No log 28.8 144 0.5781 0.5823 0.5781 0.7603
No log 29.2 146 0.6042 0.5403 0.6042 0.7773
No log 29.6 148 0.5971 0.5059 0.5971 0.7727
No log 30.0 150 0.5876 0.5473 0.5876 0.7665
No log 30.4 152 0.5695 0.5582 0.5695 0.7547
No log 30.8 154 0.5635 0.5913 0.5635 0.7507
No log 31.2 156 0.5376 0.5970 0.5376 0.7332
No log 31.6 158 0.5329 0.6367 0.5329 0.7300
No log 32.0 160 0.5159 0.6066 0.5159 0.7182
No log 32.4 162 0.4995 0.5853 0.4995 0.7067
No log 32.8 164 0.5305 0.6011 0.5305 0.7283
No log 33.2 166 0.6323 0.5675 0.6323 0.7952
No log 33.6 168 0.7440 0.4905 0.7440 0.8626
No log 34.0 170 0.7993 0.4900 0.7993 0.8940
No log 34.4 172 0.7470 0.4731 0.7470 0.8643
No log 34.8 174 0.6247 0.4539 0.6247 0.7904
No log 35.2 176 0.5876 0.4979 0.5876 0.7665
No log 35.6 178 0.6037 0.4684 0.6037 0.7770
No log 36.0 180 0.6089 0.4980 0.6089 0.7803
No log 36.4 182 0.6181 0.5323 0.6181 0.7862
No log 36.8 184 0.5673 0.5758 0.5673 0.7532
No log 37.2 186 0.5393 0.5412 0.5393 0.7343
No log 37.6 188 0.5354 0.5035 0.5354 0.7317
No log 38.0 190 0.5473 0.5201 0.5473 0.7398
No log 38.4 192 0.5750 0.5032 0.5750 0.7583
No log 38.8 194 0.6212 0.5403 0.6212 0.7882
No log 39.2 196 0.6579 0.5409 0.6579 0.8111
No log 39.6 198 0.6626 0.5409 0.6626 0.8140
No log 40.0 200 0.6124 0.5233 0.6124 0.7826
No log 40.4 202 0.5608 0.5679 0.5608 0.7489
No log 40.8 204 0.5477 0.5910 0.5477 0.7401
No log 41.2 206 0.5576 0.5840 0.5576 0.7468
No log 41.6 208 0.5711 0.5823 0.5711 0.7557
No log 42.0 210 0.5673 0.6329 0.5673 0.7532
No log 42.4 212 0.5568 0.6349 0.5568 0.7462
No log 42.8 214 0.5469 0.6343 0.5469 0.7395
No log 43.2 216 0.5645 0.5538 0.5645 0.7513
No log 43.6 218 0.6312 0.4779 0.6312 0.7945
No log 44.0 220 0.6957 0.4573 0.6957 0.8341
No log 44.4 222 0.7634 0.4683 0.7634 0.8737
No log 44.8 224 0.7608 0.4620 0.7608 0.8723
No log 45.2 226 0.6901 0.4351 0.6901 0.8308
No log 45.6 228 0.6178 0.4476 0.6178 0.7860
No log 46.0 230 0.5897 0.4997 0.5897 0.7679
No log 46.4 232 0.5858 0.4937 0.5858 0.7654
No log 46.8 234 0.6003 0.4891 0.6003 0.7748
No log 47.2 236 0.6135 0.5715 0.6135 0.7833
No log 47.6 238 0.6044 0.5715 0.6044 0.7774
No log 48.0 240 0.5958 0.5263 0.5958 0.7719
No log 48.4 242 0.5642 0.5368 0.5642 0.7511
No log 48.8 244 0.5445 0.4986 0.5445 0.7379
No log 49.2 246 0.5283 0.5288 0.5283 0.7268
No log 49.6 248 0.5237 0.5434 0.5237 0.7236
No log 50.0 250 0.5357 0.6028 0.5357 0.7319
No log 50.4 252 0.5566 0.6174 0.5566 0.7461
No log 50.8 254 0.5633 0.6174 0.5633 0.7505
No log 51.2 256 0.5683 0.6343 0.5683 0.7539
No log 51.6 258 0.5641 0.6265 0.5641 0.7511
No log 52.0 260 0.5575 0.6265 0.5575 0.7466
No log 52.4 262 0.5613 0.6175 0.5613 0.7492
No log 52.8 264 0.5757 0.6088 0.5757 0.7588
No log 53.2 266 0.5887 0.5715 0.5887 0.7673
No log 53.6 268 0.5833 0.6088 0.5833 0.7637
No log 54.0 270 0.5857 0.6280 0.5857 0.7653
No log 54.4 272 0.5972 0.5657 0.5972 0.7728
No log 54.8 274 0.6056 0.5835 0.6056 0.7782
No log 55.2 276 0.6045 0.5835 0.6045 0.7775
No log 55.6 278 0.6057 0.5703 0.6057 0.7783
No log 56.0 280 0.5870 0.5761 0.5870 0.7662
No log 56.4 282 0.5722 0.6057 0.5722 0.7564
No log 56.8 284 0.5649 0.6041 0.5649 0.7516
No log 57.2 286 0.5638 0.6092 0.5638 0.7509
No log 57.6 288 0.5627 0.5938 0.5627 0.7501
No log 58.0 290 0.5670 0.5904 0.5670 0.7530
No log 58.4 292 0.5779 0.6035 0.5779 0.7602
No log 58.8 294 0.5827 0.5404 0.5827 0.7633
No log 59.2 296 0.5744 0.5665 0.5744 0.7579
No log 59.6 298 0.5559 0.5322 0.5559 0.7456
No log 60.0 300 0.5387 0.5288 0.5387 0.7340
No log 60.4 302 0.5328 0.5084 0.5328 0.7299
No log 60.8 304 0.5329 0.5084 0.5329 0.7300
No log 61.2 306 0.5393 0.5488 0.5393 0.7344
No log 61.6 308 0.5572 0.5715 0.5572 0.7465
No log 62.0 310 0.5917 0.5347 0.5917 0.7692
No log 62.4 312 0.6016 0.5347 0.6016 0.7756
No log 62.8 314 0.5816 0.5487 0.5816 0.7626
No log 63.2 316 0.5649 0.5411 0.5649 0.7516
No log 63.6 318 0.5451 0.5633 0.5451 0.7383
No log 64.0 320 0.5320 0.5473 0.5320 0.7294
No log 64.4 322 0.5325 0.5473 0.5325 0.7297
No log 64.8 324 0.5420 0.5543 0.5420 0.7362
No log 65.2 326 0.5655 0.5338 0.5655 0.7520
No log 65.6 328 0.5938 0.4925 0.5938 0.7706
No log 66.0 330 0.6315 0.5091 0.6315 0.7947
No log 66.4 332 0.6208 0.5222 0.6208 0.7879
No log 66.8 334 0.5874 0.5306 0.5874 0.7664
No log 67.2 336 0.5547 0.5888 0.5547 0.7448
No log 67.6 338 0.5379 0.5866 0.5379 0.7334
No log 68.0 340 0.5307 0.5866 0.5307 0.7285
No log 68.4 342 0.5278 0.6210 0.5278 0.7265
No log 68.8 344 0.5297 0.5866 0.5297 0.7278
No log 69.2 346 0.5394 0.5528 0.5394 0.7345
No log 69.6 348 0.5553 0.5619 0.5553 0.7452
No log 70.0 350 0.5781 0.5827 0.5781 0.7604
No log 70.4 352 0.5935 0.5222 0.5935 0.7704
No log 70.8 354 0.5828 0.5683 0.5828 0.7634
No log 71.2 356 0.5705 0.5683 0.5705 0.7553
No log 71.6 358 0.5587 0.5773 0.5587 0.7474
No log 72.0 360 0.5526 0.5709 0.5526 0.7433
No log 72.4 362 0.5462 0.5888 0.5462 0.7391
No log 72.8 364 0.5417 0.5911 0.5417 0.7360
No log 73.2 366 0.5446 0.6084 0.5446 0.7380
No log 73.6 368 0.5542 0.6232 0.5542 0.7445
No log 74.0 370 0.5741 0.5429 0.5741 0.7577
No log 74.4 372 0.5982 0.5167 0.5982 0.7734
No log 74.8 374 0.6173 0.5497 0.6173 0.7857
No log 75.2 376 0.6320 0.5377 0.6320 0.7950
No log 75.6 378 0.6260 0.5377 0.6260 0.7912
No log 76.0 380 0.6035 0.5293 0.6035 0.7769
No log 76.4 382 0.5872 0.5096 0.5872 0.7663
No log 76.8 384 0.5665 0.5323 0.5665 0.7527
No log 77.2 386 0.5584 0.5871 0.5584 0.7472
No log 77.6 388 0.5606 0.5652 0.5606 0.7488
No log 78.0 390 0.5706 0.5748 0.5706 0.7554
No log 78.4 392 0.5767 0.5474 0.5767 0.7594
No log 78.8 394 0.5725 0.5306 0.5725 0.7566
No log 79.2 396 0.5636 0.5487 0.5636 0.7507
No log 79.6 398 0.5516 0.5633 0.5516 0.7427
No log 80.0 400 0.5462 0.5715 0.5462 0.7391
No log 80.4 402 0.5404 0.5715 0.5404 0.7351
No log 80.8 404 0.5317 0.5910 0.5317 0.7292
No log 81.2 406 0.5269 0.5910 0.5269 0.7259
No log 81.6 408 0.5255 0.5729 0.5255 0.7249
No log 82.0 410 0.5272 0.5543 0.5272 0.7261
No log 82.4 412 0.5311 0.5910 0.5311 0.7288
No log 82.8 414 0.5364 0.5910 0.5364 0.7324
No log 83.2 416 0.5481 0.5827 0.5481 0.7404
No log 83.6 418 0.5603 0.5683 0.5603 0.7486
No log 84.0 420 0.5725 0.5748 0.5725 0.7567
No log 84.4 422 0.5761 0.5814 0.5761 0.7590
No log 84.8 424 0.5777 0.5484 0.5777 0.7601
No log 85.2 426 0.5780 0.5484 0.5780 0.7603
No log 85.6 428 0.5727 0.5484 0.5727 0.7568
No log 86.0 430 0.5703 0.5484 0.5703 0.7552
No log 86.4 432 0.5680 0.5484 0.5680 0.7537
No log 86.8 434 0.5697 0.5814 0.5697 0.7548
No log 87.2 436 0.5735 0.5484 0.5735 0.7573
No log 87.6 438 0.5794 0.5484 0.5794 0.7612
No log 88.0 440 0.5838 0.5484 0.5838 0.7641
No log 88.4 442 0.5833 0.5484 0.5833 0.7638
No log 88.8 444 0.5752 0.5484 0.5752 0.7584
No log 89.2 446 0.5672 0.5445 0.5672 0.7531
No log 89.6 448 0.5616 0.5652 0.5616 0.7494
No log 90.0 450 0.5544 0.5619 0.5544 0.7446
No log 90.4 452 0.5456 0.5814 0.5456 0.7387
No log 90.8 454 0.5396 0.5814 0.5396 0.7346
No log 91.2 456 0.5371 0.5814 0.5371 0.7329
No log 91.6 458 0.5373 0.5814 0.5373 0.7330
No log 92.0 460 0.5387 0.5814 0.5387 0.7340
No log 92.4 462 0.5403 0.5814 0.5403 0.7351
No log 92.8 464 0.5415 0.5814 0.5415 0.7359
No log 93.2 466 0.5442 0.5814 0.5442 0.7377
No log 93.6 468 0.5458 0.5814 0.5458 0.7388
No log 94.0 470 0.5463 0.5814 0.5463 0.7391
No log 94.4 472 0.5437 0.5814 0.5437 0.7374
No log 94.8 474 0.5411 0.5814 0.5411 0.7356
No log 95.2 476 0.5402 0.5814 0.5402 0.7350
No log 95.6 478 0.5394 0.5814 0.5394 0.7344
No log 96.0 480 0.5382 0.5814 0.5382 0.7336
No log 96.4 482 0.5387 0.5814 0.5387 0.7340
No log 96.8 484 0.5398 0.5814 0.5398 0.7347
No log 97.2 486 0.5400 0.5814 0.5400 0.7348
No log 97.6 488 0.5407 0.5814 0.5407 0.7353
No log 98.0 490 0.5422 0.5814 0.5422 0.7363
No log 98.4 492 0.5434 0.5884 0.5434 0.7371
No log 98.8 494 0.5441 0.5884 0.5441 0.7376
No log 99.2 496 0.5445 0.5884 0.5445 0.7379
No log 99.6 498 0.5450 0.5884 0.5450 0.7383
0.224 100.0 500 0.5452 0.5884 0.5452 0.7384

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32 tensors)
