ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k19_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5584
  • Qwk: 0.6433
  • Mse: 0.5584
  • Rmse: 0.7473
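
Note that the reported Loss and Mse are identical, which is consistent with a regression head trained with an MSE objective whose continuous predictions are rounded to discrete scores for the QWK computation. The three metrics can be reproduced with scikit-learn; this is a minimal sketch on hypothetical gold/predicted organization scores, not the model's actual evaluation data:

```python
# Sketch of how Qwk, Mse, and Rmse relate, assuming discrete essay-organization
# scores. The score values below are hypothetical, for illustration only.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 3, 1, 4])  # hypothetical gold scores
y_pred = np.array([3, 2, 3, 4, 1, 4])  # hypothetical (rounded) model scores

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = np.sqrt(mse)                                           # Rmse

print(qwk, mse, rmse)
```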

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
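
The hyperparameters above can be sketched as an optimizer/scheduler setup; this uses a stand-in linear layer rather than the actual AraBERT model, and the total step count (roughly 95 steps per epoch × 100 epochs, inferred from the step/epoch columns below) is an assumption:

```python
# Minimal sketch of the Adam + linear-decay setup implied by the
# hyperparameters above. The model and step counts are stand-ins.
import torch
from transformers import get_linear_schedule_with_warmup

model = torch.nn.Linear(768, 1)  # stand-in for the fine-tuned AraBERT head
optimizer = torch.optim.Adam(
    model.parameters(), lr=2e-5, betas=(0.9, 0.999), eps=1e-8)

# lr_scheduler_type "linear": decay from 2e-5 to 0 over all training steps
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=0, num_training_steps=9500)  # assumed total

optimizer.step()
scheduler.step()
print(scheduler.get_last_lr()[0])  # slightly below 2e-5 after one step
```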

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0211 2 3.8292 -0.0239 3.8292 1.9568
No log 0.0421 4 1.9801 0.1020 1.9801 1.4072
No log 0.0632 6 1.2937 0.0513 1.2937 1.1374
No log 0.0842 8 0.9974 0.3207 0.9974 0.9987
No log 0.1053 10 0.9806 0.2192 0.9806 0.9903
No log 0.1263 12 0.9564 0.2042 0.9564 0.9780
No log 0.1474 14 1.0494 0.1944 1.0494 1.0244
No log 0.1684 16 1.3207 0.3129 1.3207 1.1492
No log 0.1895 18 1.1478 0.3968 1.1478 1.0713
No log 0.2105 20 0.9367 0.3505 0.9367 0.9678
No log 0.2316 22 0.9490 0.3868 0.9490 0.9741
No log 0.2526 24 1.4003 0.3380 1.4003 1.1833
No log 0.2737 26 1.4632 0.3112 1.4632 1.2096
No log 0.2947 28 1.1563 0.3731 1.1563 1.0753
No log 0.3158 30 0.7974 0.4024 0.7974 0.8930
No log 0.3368 32 0.6785 0.6219 0.6785 0.8237
No log 0.3579 34 0.6825 0.6187 0.6825 0.8261
No log 0.3789 36 0.6797 0.6333 0.6797 0.8244
No log 0.4 38 0.6953 0.6484 0.6953 0.8338
No log 0.4211 40 1.0214 0.4887 1.0214 1.0106
No log 0.4421 42 1.1717 0.4621 1.1717 1.0824
No log 0.4632 44 0.8531 0.5433 0.8531 0.9236
No log 0.4842 46 0.6692 0.6102 0.6692 0.8181
No log 0.5053 48 0.6916 0.6289 0.6916 0.8316
No log 0.5263 50 0.6646 0.5562 0.6646 0.8152
No log 0.5474 52 0.8247 0.5270 0.8247 0.9081
No log 0.5684 54 0.9294 0.5175 0.9294 0.9641
No log 0.5895 56 0.9171 0.5656 0.9171 0.9577
No log 0.6105 58 0.8383 0.5961 0.8383 0.9156
No log 0.6316 60 0.8131 0.6144 0.8131 0.9017
No log 0.6526 62 0.6863 0.6414 0.6863 0.8284
No log 0.6737 64 0.6716 0.6274 0.6716 0.8195
No log 0.6947 66 0.6631 0.5905 0.6631 0.8143
No log 0.7158 68 0.6893 0.6185 0.6893 0.8302
No log 0.7368 70 0.9698 0.5873 0.9698 0.9848
No log 0.7579 72 1.4620 0.3780 1.4620 1.2091
No log 0.7789 74 1.3255 0.4062 1.3255 1.1513
No log 0.8 76 0.7981 0.6118 0.7981 0.8934
No log 0.8211 78 0.6570 0.6162 0.6570 0.8105
No log 0.8421 80 0.7835 0.6297 0.7835 0.8852
No log 0.8632 82 0.8145 0.5663 0.8145 0.9025
No log 0.8842 84 0.7439 0.6604 0.7439 0.8625
No log 0.9053 86 0.5953 0.5949 0.5953 0.7715
No log 0.9263 88 0.6762 0.6362 0.6762 0.8223
No log 0.9474 90 0.7254 0.6041 0.7254 0.8517
No log 0.9684 92 0.6007 0.6448 0.6007 0.7751
No log 0.9895 94 0.5604 0.6352 0.5604 0.7486
No log 1.0105 96 0.5948 0.7122 0.5948 0.7713
No log 1.0316 98 0.5706 0.7022 0.5706 0.7554
No log 1.0526 100 0.5419 0.6748 0.5419 0.7362
No log 1.0737 102 0.5476 0.6680 0.5476 0.7400
No log 1.0947 104 0.5580 0.6880 0.5580 0.7470
No log 1.1158 106 0.5638 0.6572 0.5638 0.7509
No log 1.1368 108 0.6624 0.6202 0.6624 0.8139
No log 1.1579 110 0.8993 0.6034 0.8993 0.9483
No log 1.1789 112 0.8892 0.6034 0.8892 0.9430
No log 1.2 114 0.7230 0.6334 0.7230 0.8503
No log 1.2211 116 0.6213 0.6617 0.6213 0.7882
No log 1.2421 118 0.6623 0.6601 0.6623 0.8138
No log 1.2632 120 0.6247 0.7035 0.6247 0.7904
No log 1.2842 122 0.6285 0.6445 0.6285 0.7928
No log 1.3053 124 0.6351 0.5753 0.6351 0.7969
No log 1.3263 126 0.6103 0.6622 0.6103 0.7812
No log 1.3474 128 0.6156 0.6452 0.6156 0.7846
No log 1.3684 130 0.6106 0.6289 0.6106 0.7814
No log 1.3895 132 0.6073 0.6743 0.6073 0.7793
No log 1.4105 134 0.6352 0.6453 0.6352 0.7970
No log 1.4316 136 0.6245 0.7008 0.6245 0.7902
No log 1.4526 138 0.6306 0.7028 0.6306 0.7941
No log 1.4737 140 0.6428 0.7226 0.6428 0.8018
No log 1.4947 142 0.6149 0.7168 0.6149 0.7842
No log 1.5158 144 0.5975 0.7020 0.5975 0.7730
No log 1.5368 146 0.5982 0.7020 0.5982 0.7734
No log 1.5579 148 0.6115 0.7316 0.6115 0.7820
No log 1.5789 150 0.6316 0.6525 0.6316 0.7947
No log 1.6 152 0.6663 0.6602 0.6663 0.8163
No log 1.6211 154 0.6457 0.6739 0.6457 0.8036
No log 1.6421 156 0.7263 0.6575 0.7263 0.8522
No log 1.6632 158 0.7458 0.6350 0.7458 0.8636
No log 1.6842 160 0.6333 0.6797 0.6333 0.7958
No log 1.7053 162 0.7215 0.6157 0.7215 0.8494
No log 1.7263 164 0.8870 0.5790 0.8870 0.9418
No log 1.7474 166 0.8082 0.5800 0.8082 0.8990
No log 1.7684 168 0.6702 0.6418 0.6702 0.8187
No log 1.7895 170 0.6440 0.6536 0.6440 0.8025
No log 1.8105 172 0.6713 0.6118 0.6713 0.8193
No log 1.8316 174 0.6275 0.6177 0.6275 0.7921
No log 1.8526 176 0.6001 0.6699 0.6001 0.7747
No log 1.8737 178 0.5804 0.7259 0.5804 0.7618
No log 1.8947 180 0.5771 0.7519 0.5771 0.7597
No log 1.9158 182 0.5627 0.7379 0.5627 0.7501
No log 1.9368 184 0.5449 0.7423 0.5449 0.7382
No log 1.9579 186 0.5267 0.7371 0.5267 0.7257
No log 1.9789 188 0.5250 0.6938 0.5250 0.7245
No log 2.0 190 0.5270 0.6973 0.5270 0.7259
No log 2.0211 192 0.5293 0.6675 0.5293 0.7275
No log 2.0421 194 0.5346 0.6993 0.5346 0.7311
No log 2.0632 196 0.5024 0.7041 0.5024 0.7088
No log 2.0842 198 0.4954 0.7259 0.4954 0.7038
No log 2.1053 200 0.4917 0.6988 0.4917 0.7012
No log 2.1263 202 0.4967 0.6872 0.4967 0.7047
No log 2.1474 204 0.4910 0.7042 0.4910 0.7007
No log 2.1684 206 0.5026 0.6903 0.5026 0.7090
No log 2.1895 208 0.5551 0.7343 0.5551 0.7450
No log 2.2105 210 0.5555 0.7343 0.5555 0.7453
No log 2.2316 212 0.5307 0.7085 0.5307 0.7285
No log 2.2526 214 0.5145 0.6753 0.5145 0.7173
No log 2.2737 216 0.5352 0.6736 0.5352 0.7316
No log 2.2947 218 0.5823 0.6646 0.5823 0.7631
No log 2.3158 220 0.5579 0.5758 0.5579 0.7469
No log 2.3368 222 0.5962 0.6150 0.5962 0.7721
No log 2.3579 224 0.7243 0.5387 0.7243 0.8511
No log 2.3789 226 0.7193 0.5486 0.7193 0.8481
No log 2.4 228 0.5590 0.6497 0.5590 0.7477
No log 2.4211 230 0.4705 0.7656 0.4705 0.6860
No log 2.4421 232 0.4903 0.7271 0.4903 0.7002
No log 2.4632 234 0.4949 0.7265 0.4949 0.7035
No log 2.4842 236 0.4741 0.7567 0.4741 0.6885
No log 2.5053 238 0.4908 0.7216 0.4908 0.7006
No log 2.5263 240 0.5489 0.6248 0.5489 0.7409
No log 2.5474 242 0.5472 0.6248 0.5472 0.7397
No log 2.5684 244 0.5021 0.7470 0.5021 0.7086
No log 2.5895 246 0.5361 0.6677 0.5361 0.7322
No log 2.6105 248 0.6142 0.6593 0.6142 0.7837
No log 2.6316 250 0.5485 0.7124 0.5485 0.7406
No log 2.6526 252 0.4994 0.7754 0.4994 0.7067
No log 2.6737 254 0.5677 0.7210 0.5677 0.7535
No log 2.6947 256 0.5457 0.7280 0.5457 0.7387
No log 2.7158 258 0.5137 0.7316 0.5137 0.7167
No log 2.7368 260 0.5863 0.6575 0.5863 0.7657
No log 2.7579 262 0.6658 0.6301 0.6658 0.8160
No log 2.7789 264 0.6361 0.6562 0.6361 0.7976
No log 2.8 266 0.5586 0.6512 0.5586 0.7474
No log 2.8211 268 0.5255 0.7223 0.5255 0.7249
No log 2.8421 270 0.5467 0.7189 0.5467 0.7394
No log 2.8632 272 0.5589 0.7078 0.5589 0.7476
No log 2.8842 274 0.5317 0.7071 0.5317 0.7292
No log 2.9053 276 0.5372 0.6345 0.5372 0.7329
No log 2.9263 278 0.5622 0.6441 0.5622 0.7498
No log 2.9474 280 0.5840 0.6360 0.5840 0.7642
No log 2.9684 282 0.5625 0.6296 0.5625 0.7500
No log 2.9895 284 0.5541 0.6006 0.5541 0.7444
No log 3.0105 286 0.5643 0.6951 0.5643 0.7512
No log 3.0316 288 0.6389 0.6165 0.6389 0.7993
No log 3.0526 290 0.6214 0.6650 0.6214 0.7883
No log 3.0737 292 0.5325 0.7706 0.5325 0.7297
No log 3.0947 294 0.5642 0.6886 0.5642 0.7511
No log 3.1158 296 0.6179 0.6609 0.6179 0.7861
No log 3.1368 298 0.5953 0.6325 0.5953 0.7715
No log 3.1579 300 0.5537 0.6065 0.5537 0.7441
No log 3.1789 302 0.5366 0.5771 0.5366 0.7325
No log 3.2 304 0.5151 0.6616 0.5151 0.7177
No log 3.2211 306 0.5090 0.7635 0.5090 0.7134
No log 3.2421 308 0.5075 0.7667 0.5075 0.7124
No log 3.2632 310 0.5055 0.7556 0.5055 0.7110
No log 3.2842 312 0.5050 0.7556 0.5050 0.7106
No log 3.3053 314 0.5125 0.7168 0.5125 0.7159
No log 3.3263 316 0.5209 0.5894 0.5209 0.7217
No log 3.3474 318 0.5209 0.6017 0.5209 0.7217
No log 3.3684 320 0.5176 0.6602 0.5176 0.7194
No log 3.3895 322 0.5991 0.6503 0.5991 0.7740
No log 3.4105 324 0.6115 0.6208 0.6115 0.7820
No log 3.4316 326 0.6023 0.6236 0.6023 0.7761
No log 3.4526 328 0.5114 0.7049 0.5114 0.7151
No log 3.4737 330 0.5032 0.7224 0.5032 0.7094
No log 3.4947 332 0.5019 0.7224 0.5019 0.7085
No log 3.5158 334 0.5052 0.7175 0.5052 0.7108
No log 3.5368 336 0.5286 0.6838 0.5286 0.7271
No log 3.5579 338 0.5148 0.6584 0.5148 0.7175
No log 3.5789 340 0.5202 0.6537 0.5202 0.7212
No log 3.6 342 0.5215 0.6537 0.5215 0.7221
No log 3.6211 344 0.5228 0.6537 0.5228 0.7231
No log 3.6421 346 0.5449 0.6721 0.5449 0.7382
No log 3.6632 348 0.5402 0.6835 0.5402 0.7350
No log 3.6842 350 0.5434 0.7168 0.5434 0.7372
No log 3.7053 352 0.5430 0.6911 0.5430 0.7369
No log 3.7263 354 0.5377 0.6745 0.5377 0.7332
No log 3.7474 356 0.5366 0.6144 0.5366 0.7325
No log 3.7684 358 0.5392 0.6610 0.5392 0.7343
No log 3.7895 360 0.5390 0.6924 0.5390 0.7341
No log 3.8105 362 0.5489 0.5939 0.5489 0.7409
No log 3.8316 364 0.5622 0.5879 0.5622 0.7498
No log 3.8526 366 0.5997 0.5455 0.5997 0.7744
No log 3.8737 368 0.6100 0.5681 0.6100 0.7810
No log 3.8947 370 0.6340 0.5529 0.6340 0.7963
No log 3.9158 372 0.7103 0.6198 0.7103 0.8428
No log 3.9368 374 0.7819 0.6151 0.7819 0.8843
No log 3.9579 376 0.7661 0.5150 0.7661 0.8753
No log 3.9789 378 0.7063 0.4762 0.7063 0.8404
No log 4.0 380 0.6759 0.5069 0.6759 0.8221
No log 4.0211 382 0.6391 0.5441 0.6391 0.7994
No log 4.0421 384 0.6000 0.6078 0.6000 0.7746
No log 4.0632 386 0.5753 0.6812 0.5753 0.7585
No log 4.0842 388 0.5761 0.6617 0.5761 0.7590
No log 4.1053 390 0.5911 0.6819 0.5911 0.7689
No log 4.1263 392 0.6151 0.7128 0.6151 0.7843
No log 4.1474 394 0.5962 0.6357 0.5962 0.7722
No log 4.1684 396 0.5677 0.6339 0.5677 0.7535
No log 4.1895 398 0.5900 0.6692 0.5900 0.7681
No log 4.2105 400 0.6085 0.6448 0.6085 0.7800
No log 4.2316 402 0.5744 0.6644 0.5744 0.7579
No log 4.2526 404 0.5790 0.6835 0.5790 0.7609
No log 4.2737 406 0.6196 0.7022 0.6196 0.7871
No log 4.2947 408 0.6489 0.7111 0.6489 0.8055
No log 4.3158 410 0.6285 0.6813 0.6285 0.7928
No log 4.3368 412 0.6101 0.6732 0.6101 0.7811
No log 4.3579 414 0.5653 0.6598 0.5653 0.7518
No log 4.3789 416 0.5546 0.6911 0.5546 0.7447
No log 4.4 418 0.5498 0.6328 0.5498 0.7415
No log 4.4211 420 0.5535 0.6244 0.5535 0.7440
No log 4.4421 422 0.5817 0.6664 0.5817 0.7627
No log 4.4632 424 0.6030 0.6482 0.6030 0.7765
No log 4.4842 426 0.5709 0.6330 0.5709 0.7556
No log 4.5053 428 0.5574 0.6535 0.5574 0.7466
No log 4.5263 430 0.5721 0.6581 0.5721 0.7564
No log 4.5474 432 0.5628 0.6358 0.5628 0.7502
No log 4.5684 434 0.5628 0.6060 0.5628 0.7502
No log 4.5895 436 0.5618 0.5505 0.5618 0.7495
No log 4.6105 438 0.5600 0.5606 0.5600 0.7483
No log 4.6316 440 0.5569 0.5725 0.5569 0.7463
No log 4.6526 442 0.5614 0.6067 0.5614 0.7493
No log 4.6737 444 0.5654 0.5536 0.5654 0.7519
No log 4.6947 446 0.5704 0.5523 0.5704 0.7553
No log 4.7158 448 0.5892 0.5536 0.5892 0.7676
No log 4.7368 450 0.6609 0.5255 0.6609 0.8130
No log 4.7579 452 0.7021 0.5622 0.7021 0.8379
No log 4.7789 454 0.6685 0.5935 0.6685 0.8176
No log 4.8 456 0.5739 0.6516 0.5739 0.7576
No log 4.8211 458 0.5861 0.6555 0.5861 0.7656
No log 4.8421 460 0.6989 0.6489 0.6989 0.8360
No log 4.8632 462 0.7186 0.6409 0.7186 0.8477
No log 4.8842 464 0.6575 0.6813 0.6575 0.8108
No log 4.9053 466 0.6118 0.6728 0.6118 0.7822
No log 4.9263 468 0.5944 0.6667 0.5944 0.7710
No log 4.9474 470 0.5944 0.6144 0.5944 0.7710
No log 4.9684 472 0.5815 0.6555 0.5815 0.7626
No log 4.9895 474 0.5642 0.6940 0.5642 0.7511
No log 5.0105 476 0.5559 0.7019 0.5559 0.7456
No log 5.0316 478 0.5581 0.6969 0.5581 0.7471
No log 5.0526 480 0.5638 0.6664 0.5638 0.7509
No log 5.0737 482 0.5464 0.7012 0.5464 0.7392
No log 5.0947 484 0.5378 0.6966 0.5378 0.7334
No log 5.1158 486 0.5506 0.6636 0.5506 0.7421
No log 5.1368 488 0.5536 0.6636 0.5536 0.7440
No log 5.1579 490 0.5528 0.6636 0.5528 0.7435
No log 5.1789 492 0.5562 0.6518 0.5562 0.7458
No log 5.2 494 0.5488 0.6555 0.5488 0.7408
No log 5.2211 496 0.5446 0.6452 0.5446 0.7380
No log 5.2421 498 0.5348 0.6417 0.5348 0.7313
0.2617 5.2632 500 0.5430 0.6590 0.5430 0.7369
0.2617 5.2842 502 0.5998 0.6709 0.5998 0.7745
0.2617 5.3053 504 0.6006 0.6638 0.6006 0.7750
0.2617 5.3263 506 0.5544 0.6205 0.5544 0.7446
0.2617 5.3474 508 0.5529 0.5820 0.5529 0.7436
0.2617 5.3684 510 0.5664 0.6177 0.5664 0.7526
0.2617 5.3895 512 0.5673 0.6207 0.5673 0.7532
0.2617 5.4105 514 0.5584 0.6433 0.5584 0.7473

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
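
To approximate the training environment, the versions above can be pinned at install time; the CUDA 11.8 wheel index for PyTorch is an assumption about how the `+cu118` build was obtained:

```shell
# Pin the framework versions listed above (CUDA 11.8 index is an assumption)
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
pip install "torch==2.4.0+cu118" --index-url https://download.pytorch.org/whl/cu118
```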
Model size: 0.1B params (F32, Safetensors)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k19_task5_organization

  • Base model: aubmindlab/bert-base-arabertv02