ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k17_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 0.8632
  • Qwk (Quadratic Weighted Kappa): 0.5287
  • Mse (Mean Squared Error): 0.8632
  • Rmse (Root Mean Squared Error): 0.9291
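
Note that the reported Loss equals the Mse, which suggests the model was trained with a mean-squared-error objective on a regression-style scoring task. The following is a minimal sketch of how these metrics can be computed with scikit-learn; rounding continuous predictions to integer score levels for Qwk is an assumption, since the exact evaluation code is not documented here:

```python
# Sketch of the reported metrics (Qwk, Mse, Rmse) with scikit-learn.
# Assumption: Qwk is computed on integer score levels, so continuous
# predictions are rounded first; the card does not document the actual
# evaluation code.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    mse = mean_squared_error(y_true, y_pred)
    qwk = cohen_kappa_score(
        np.rint(y_true).astype(int),
        np.rint(y_pred).astype(int),
        weights="quadratic",  # quadratic weighting -> Qwk
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}

# Toy example with hypothetical essay scores:
print(compute_metrics(np.array([1, 2, 3, 2]), np.array([1.2, 2.1, 2.4, 2.0])))
```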

Model description

More information needed

Intended uses & limitations

More information needed
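
Pending fuller documentation, the following is a minimal inference sketch. Two assumptions are baked in: that the checkpoint is published under the repo id shown below, and that it carries a single-output sequence-classification (regression) head, inferred from the Mse/Rmse metrics reported above.

```python
# Minimal loading sketch. Assumptions: the repo id is publicly available and
# the checkpoint has a single-output regression head (inferred from the
# Mse/Rmse metrics above); neither is documented in this card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k17_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id).eval()

# Score a sample Arabic input ("the essay text goes here"):
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)  # under the assumed head: one organization score per input
```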

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
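
As referenced above, here is a hedged sketch of how these settings map onto transformers.TrainingArguments. The output directory, evaluation/logging cadence, and the dataset objects are assumptions: the results table below is consistent with evaluation every 2 steps and training-loss logging every 500 steps, but the actual training script is not documented here.

```python
# Sketch of the listed hyperparameters as transformers.TrainingArguments.
# output_dir, eval/logging cadence, and the dataset objects are hypothetical;
# the actual training script is not documented in this card.
from transformers import Trainer, TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task2_organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,         # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",  # results table evaluates every 2 steps
    eval_steps=2,
    logging_steps=500,      # first training loss appears at step 500
)
# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset,
#                   compute_metrics=compute_metrics)
# trainer.train()
```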

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0208 | 2 | 4.4489 | 0.0163 | 4.4489 | 2.1093 |
| No log | 0.0417 | 4 | 3.0817 | -0.0443 | 3.0817 | 1.7555 |
| No log | 0.0625 | 6 | 1.6009 | 0.0504 | 1.6009 | 1.2653 |
| No log | 0.0833 | 8 | 1.3574 | 0.0500 | 1.3574 | 1.1651 |
| No log | 0.1042 | 10 | 1.2691 | 0.1289 | 1.2691 | 1.1265 |
| No log | 0.125 | 12 | 1.2006 | 0.2678 | 1.2006 | 1.0957 |
| No log | 0.1458 | 14 | 1.2305 | 0.1556 | 1.2305 | 1.1093 |
| No log | 0.1667 | 16 | 1.2917 | 0.0982 | 1.2917 | 1.1365 |
| No log | 0.1875 | 18 | 1.2681 | 0.1354 | 1.2681 | 1.1261 |
| No log | 0.2083 | 20 | 1.1578 | 0.1870 | 1.1578 | 1.0760 |
| No log | 0.2292 | 22 | 1.1346 | 0.1701 | 1.1346 | 1.0652 |
| No log | 0.25 | 24 | 1.2330 | 0.2438 | 1.2330 | 1.1104 |
| No log | 0.2708 | 26 | 1.2211 | 0.2417 | 1.2211 | 1.1050 |
| No log | 0.2917 | 28 | 1.2286 | 0.1108 | 1.2286 | 1.1084 |
| No log | 0.3125 | 30 | 1.7812 | 0.0901 | 1.7812 | 1.3346 |
| No log | 0.3333 | 32 | 1.8955 | 0.1176 | 1.8955 | 1.3768 |
| No log | 0.3542 | 34 | 1.6738 | 0.1901 | 1.6738 | 1.2938 |
| No log | 0.375 | 36 | 1.2504 | 0.2346 | 1.2504 | 1.1182 |
| No log | 0.3958 | 38 | 1.0082 | 0.3756 | 1.0082 | 1.0041 |
| No log | 0.4167 | 40 | 0.9514 | 0.4466 | 0.9514 | 0.9754 |
| No log | 0.4375 | 42 | 0.9350 | 0.4275 | 0.9350 | 0.9670 |
| No log | 0.4583 | 44 | 0.9307 | 0.4369 | 0.9307 | 0.9647 |
| No log | 0.4792 | 46 | 0.9417 | 0.5479 | 0.9417 | 0.9704 |
| No log | 0.5 | 48 | 0.9576 | 0.5891 | 0.9576 | 0.9786 |
| No log | 0.5208 | 50 | 1.0109 | 0.3249 | 1.0109 | 1.0054 |
| No log | 0.5417 | 52 | 1.0336 | 0.3290 | 1.0336 | 1.0167 |
| No log | 0.5625 | 54 | 0.9816 | 0.4994 | 0.9816 | 0.9907 |
| No log | 0.5833 | 56 | 0.9196 | 0.6269 | 0.9196 | 0.9590 |
| No log | 0.6042 | 58 | 0.9185 | 0.4923 | 0.9185 | 0.9584 |
| No log | 0.625 | 60 | 0.9941 | 0.4279 | 0.9941 | 0.9970 |
| No log | 0.6458 | 62 | 0.9986 | 0.4281 | 0.9986 | 0.9993 |
| No log | 0.6667 | 64 | 0.9887 | 0.5447 | 0.9887 | 0.9943 |
| No log | 0.6875 | 66 | 1.0030 | 0.3490 | 1.0030 | 1.0015 |
| No log | 0.7083 | 68 | 1.0815 | 0.3568 | 1.0815 | 1.0400 |
| No log | 0.7292 | 70 | 0.9502 | 0.4003 | 0.9502 | 0.9748 |
| No log | 0.75 | 72 | 0.8468 | 0.4926 | 0.8468 | 0.9202 |
| No log | 0.7708 | 74 | 0.8392 | 0.5076 | 0.8392 | 0.9161 |
| No log | 0.7917 | 76 | 0.8642 | 0.4864 | 0.8642 | 0.9296 |
| No log | 0.8125 | 78 | 0.8823 | 0.5587 | 0.8823 | 0.9393 |
| No log | 0.8333 | 80 | 0.8722 | 0.5726 | 0.8722 | 0.9339 |
| No log | 0.8542 | 82 | 0.8622 | 0.5461 | 0.8622 | 0.9285 |
| No log | 0.875 | 84 | 0.9194 | 0.5571 | 0.9194 | 0.9589 |
| No log | 0.8958 | 86 | 0.9214 | 0.5344 | 0.9214 | 0.9599 |
| No log | 0.9167 | 88 | 0.7303 | 0.5759 | 0.7303 | 0.8546 |
| No log | 0.9375 | 90 | 0.8930 | 0.5801 | 0.8930 | 0.9450 |
| No log | 0.9583 | 92 | 1.1064 | 0.5485 | 1.1064 | 1.0518 |
| No log | 0.9792 | 94 | 1.1771 | 0.4455 | 1.1771 | 1.0850 |
| No log | 1.0 | 96 | 0.8926 | 0.5384 | 0.8926 | 0.9448 |
| No log | 1.0208 | 98 | 0.7005 | 0.5872 | 0.7005 | 0.8370 |
| No log | 1.0417 | 100 | 0.7408 | 0.6196 | 0.7408 | 0.8607 |
| No log | 1.0625 | 102 | 0.7521 | 0.6288 | 0.7521 | 0.8672 |
| No log | 1.0833 | 104 | 0.7281 | 0.6131 | 0.7281 | 0.8533 |
| No log | 1.1042 | 106 | 0.7426 | 0.6339 | 0.7426 | 0.8617 |
| No log | 1.125 | 108 | 0.8562 | 0.5996 | 0.8562 | 0.9253 |
| No log | 1.1458 | 110 | 1.0673 | 0.5469 | 1.0673 | 1.0331 |
| No log | 1.1667 | 112 | 0.9795 | 0.5757 | 0.9795 | 0.9897 |
| No log | 1.1875 | 114 | 0.8271 | 0.6524 | 0.8271 | 0.9095 |
| No log | 1.2083 | 116 | 0.7696 | 0.5932 | 0.7696 | 0.8772 |
| No log | 1.2292 | 118 | 0.9294 | 0.5717 | 0.9294 | 0.9640 |
| No log | 1.25 | 120 | 0.8604 | 0.5796 | 0.8604 | 0.9276 |
| No log | 1.2708 | 122 | 0.7021 | 0.6650 | 0.7021 | 0.8379 |
| No log | 1.2917 | 124 | 0.6798 | 0.6355 | 0.6798 | 0.8245 |
| No log | 1.3125 | 126 | 0.7760 | 0.6467 | 0.7760 | 0.8809 |
| No log | 1.3333 | 128 | 0.7086 | 0.6200 | 0.7086 | 0.8418 |
| No log | 1.3542 | 130 | 0.6733 | 0.6684 | 0.6733 | 0.8206 |
| No log | 1.375 | 132 | 0.6822 | 0.6684 | 0.6822 | 0.8260 |
| No log | 1.3958 | 134 | 0.8529 | 0.5649 | 0.8529 | 0.9235 |
| No log | 1.4167 | 136 | 0.9120 | 0.5448 | 0.9120 | 0.9550 |
| No log | 1.4375 | 138 | 1.0080 | 0.5040 | 1.0080 | 1.0040 |
| No log | 1.4583 | 140 | 0.8406 | 0.5471 | 0.8406 | 0.9168 |
| No log | 1.4792 | 142 | 0.9316 | 0.5253 | 0.9316 | 0.9652 |
| No log | 1.5 | 144 | 0.8629 | 0.5893 | 0.8629 | 0.9289 |
| No log | 1.5208 | 146 | 0.7445 | 0.6441 | 0.7445 | 0.8628 |
| No log | 1.5417 | 148 | 0.6626 | 0.7082 | 0.6626 | 0.8140 |
| No log | 1.5625 | 150 | 0.8330 | 0.6408 | 0.8330 | 0.9127 |
| No log | 1.5833 | 152 | 0.9132 | 0.6316 | 0.9132 | 0.9556 |
| No log | 1.6042 | 154 | 0.8629 | 0.6804 | 0.8629 | 0.9289 |
| No log | 1.625 | 156 | 0.7289 | 0.6181 | 0.7289 | 0.8538 |
| No log | 1.6458 | 158 | 0.7435 | 0.6358 | 0.7435 | 0.8623 |
| No log | 1.6667 | 160 | 0.8244 | 0.6604 | 0.8244 | 0.9080 |
| No log | 1.6875 | 162 | 0.9053 | 0.5358 | 0.9053 | 0.9514 |
| No log | 1.7083 | 164 | 0.9094 | 0.6064 | 0.9094 | 0.9536 |
| No log | 1.7292 | 166 | 0.8311 | 0.5788 | 0.8311 | 0.9117 |
| No log | 1.75 | 168 | 0.9077 | 0.5370 | 0.9077 | 0.9527 |
| No log | 1.7708 | 170 | 0.9799 | 0.5570 | 0.9799 | 0.9899 |
| No log | 1.7917 | 172 | 0.9255 | 0.5389 | 0.9255 | 0.9620 |
| No log | 1.8125 | 174 | 0.7740 | 0.6563 | 0.7740 | 0.8798 |
| No log | 1.8333 | 176 | 0.8032 | 0.5756 | 0.8032 | 0.8962 |
| No log | 1.8542 | 178 | 0.9077 | 0.5304 | 0.9077 | 0.9528 |
| No log | 1.875 | 180 | 1.1924 | 0.4941 | 1.1924 | 1.0920 |
| No log | 1.8958 | 182 | 1.2079 | 0.4666 | 1.2079 | 1.0991 |
| No log | 1.9167 | 184 | 0.9772 | 0.4724 | 0.9772 | 0.9885 |
| No log | 1.9375 | 186 | 0.8360 | 0.4623 | 0.8360 | 0.9143 |
| No log | 1.9583 | 188 | 0.7993 | 0.5810 | 0.7993 | 0.8940 |
| No log | 1.9792 | 190 | 0.9513 | 0.5014 | 0.9513 | 0.9754 |
| No log | 2.0 | 192 | 1.1136 | 0.5101 | 1.1136 | 1.0553 |
| No log | 2.0208 | 194 | 1.0675 | 0.4831 | 1.0675 | 1.0332 |
| No log | 2.0417 | 196 | 0.8683 | 0.4396 | 0.8683 | 0.9319 |
| No log | 2.0625 | 198 | 0.8140 | 0.5815 | 0.8140 | 0.9022 |
| No log | 2.0833 | 200 | 0.8542 | 0.6068 | 0.8542 | 0.9243 |
| No log | 2.1042 | 202 | 1.0160 | 0.5422 | 1.0160 | 1.0080 |
| No log | 2.125 | 204 | 0.9644 | 0.5883 | 0.9644 | 0.9820 |
| No log | 2.1458 | 206 | 0.8519 | 0.4932 | 0.8519 | 0.9230 |
| No log | 2.1667 | 208 | 0.8404 | 0.4858 | 0.8404 | 0.9167 |
| No log | 2.1875 | 210 | 0.8505 | 0.5422 | 0.8505 | 0.9222 |
| No log | 2.2083 | 212 | 0.8127 | 0.5541 | 0.8127 | 0.9015 |
| No log | 2.2292 | 214 | 0.7878 | 0.5230 | 0.7878 | 0.8876 |
| No log | 2.25 | 216 | 0.8989 | 0.5071 | 0.8989 | 0.9481 |
| No log | 2.2708 | 218 | 1.0341 | 0.4883 | 1.0341 | 1.0169 |
| No log | 2.2917 | 220 | 1.0176 | 0.5083 | 1.0176 | 1.0087 |
| No log | 2.3125 | 222 | 1.0347 | 0.4963 | 1.0347 | 1.0172 |
| No log | 2.3333 | 224 | 0.8954 | 0.4463 | 0.8954 | 0.9463 |
| No log | 2.3542 | 226 | 0.8460 | 0.4378 | 0.8460 | 0.9198 |
| No log | 2.375 | 228 | 0.8281 | 0.5029 | 0.8281 | 0.9100 |
| No log | 2.3958 | 230 | 0.8901 | 0.4933 | 0.8901 | 0.9435 |
| No log | 2.4167 | 232 | 0.8378 | 0.5276 | 0.8378 | 0.9153 |
| No log | 2.4375 | 234 | 0.7921 | 0.4864 | 0.7921 | 0.8900 |
| No log | 2.4583 | 236 | 0.8393 | 0.5130 | 0.8393 | 0.9162 |
| No log | 2.4792 | 238 | 0.8282 | 0.4968 | 0.8282 | 0.9101 |
| No log | 2.5 | 240 | 0.8245 | 0.4968 | 0.8245 | 0.9080 |
| No log | 2.5208 | 242 | 0.7849 | 0.5690 | 0.7849 | 0.8859 |
| No log | 2.5417 | 244 | 0.7687 | 0.5242 | 0.7687 | 0.8768 |
| No log | 2.5625 | 246 | 0.7799 | 0.5504 | 0.7799 | 0.8831 |
| No log | 2.5833 | 248 | 0.8333 | 0.5130 | 0.8333 | 0.9129 |
| No log | 2.6042 | 250 | 0.8964 | 0.5911 | 0.8964 | 0.9468 |
| No log | 2.625 | 252 | 1.0545 | 0.5562 | 1.0545 | 1.0269 |
| No log | 2.6458 | 254 | 1.0599 | 0.5388 | 1.0599 | 1.0295 |
| No log | 2.6667 | 256 | 0.9020 | 0.5710 | 0.9020 | 0.9498 |
| No log | 2.6875 | 258 | 0.8043 | 0.5534 | 0.8043 | 0.8968 |
| No log | 2.7083 | 260 | 0.7989 | 0.4690 | 0.7989 | 0.8938 |
| No log | 2.7292 | 262 | 0.7912 | 0.4994 | 0.7912 | 0.8895 |
| No log | 2.75 | 264 | 0.7977 | 0.5885 | 0.7977 | 0.8931 |
| No log | 2.7708 | 266 | 0.8344 | 0.4956 | 0.8344 | 0.9135 |
| No log | 2.7917 | 268 | 0.8042 | 0.5298 | 0.8042 | 0.8968 |
| No log | 2.8125 | 270 | 0.7905 | 0.5769 | 0.7905 | 0.8891 |
| No log | 2.8333 | 272 | 0.8095 | 0.4932 | 0.8095 | 0.8997 |
| No log | 2.8542 | 274 | 0.8297 | 0.5115 | 0.8297 | 0.9109 |
| No log | 2.875 | 276 | 0.8550 | 0.5194 | 0.8550 | 0.9246 |
| No log | 2.8958 | 278 | 0.9027 | 0.4416 | 0.9027 | 0.9501 |
| No log | 2.9167 | 280 | 0.9847 | 0.4219 | 0.9847 | 0.9923 |
| No log | 2.9375 | 282 | 0.9486 | 0.4010 | 0.9486 | 0.9740 |
| No log | 2.9583 | 284 | 0.8490 | 0.4575 | 0.8490 | 0.9214 |
| No log | 2.9792 | 286 | 0.8077 | 0.5311 | 0.8077 | 0.8987 |
| No log | 3.0 | 288 | 0.7858 | 0.5311 | 0.7858 | 0.8865 |
| No log | 3.0208 | 290 | 0.8051 | 0.5493 | 0.8051 | 0.8973 |
| No log | 3.0417 | 292 | 0.9826 | 0.4403 | 0.9826 | 0.9913 |
| No log | 3.0625 | 294 | 1.1156 | 0.4758 | 1.1156 | 1.0562 |
| No log | 3.0833 | 296 | 1.0154 | 0.4861 | 1.0154 | 1.0077 |
| No log | 3.1042 | 298 | 0.8202 | 0.5792 | 0.8202 | 0.9056 |
| No log | 3.125 | 300 | 0.7717 | 0.5847 | 0.7717 | 0.8785 |
| No log | 3.1458 | 302 | 0.8104 | 0.5685 | 0.8104 | 0.9002 |
| No log | 3.1667 | 304 | 0.9016 | 0.5423 | 0.9016 | 0.9495 |
| No log | 3.1875 | 306 | 0.9314 | 0.4618 | 0.9314 | 0.9651 |
| No log | 3.2083 | 308 | 0.8485 | 0.5637 | 0.8485 | 0.9212 |
| No log | 3.2292 | 310 | 0.7720 | 0.5490 | 0.7720 | 0.8786 |
| No log | 3.25 | 312 | 0.7764 | 0.5313 | 0.7764 | 0.8812 |
| No log | 3.2708 | 314 | 0.8431 | 0.5591 | 0.8431 | 0.9182 |
| No log | 3.2917 | 316 | 0.8905 | 0.5507 | 0.8905 | 0.9437 |
| No log | 3.3125 | 318 | 0.8608 | 0.5529 | 0.8608 | 0.9278 |
| No log | 3.3333 | 320 | 0.8087 | 0.5625 | 0.8087 | 0.8993 |
| No log | 3.3542 | 322 | 0.7663 | 0.5736 | 0.7663 | 0.8754 |
| No log | 3.375 | 324 | 0.8031 | 0.5625 | 0.8031 | 0.8962 |
| No log | 3.3958 | 326 | 0.7981 | 0.5710 | 0.7981 | 0.8934 |
| No log | 3.4167 | 328 | 0.7658 | 0.5611 | 0.7658 | 0.8751 |
| No log | 3.4375 | 330 | 0.7703 | 0.5635 | 0.7703 | 0.8777 |
| No log | 3.4583 | 332 | 0.8141 | 0.5473 | 0.8141 | 0.9023 |
| No log | 3.4792 | 334 | 0.8619 | 0.5294 | 0.8619 | 0.9284 |
| No log | 3.5 | 336 | 0.8535 | 0.5448 | 0.8535 | 0.9238 |
| No log | 3.5208 | 338 | 0.8094 | 0.5562 | 0.8094 | 0.8997 |
| No log | 3.5417 | 340 | 0.8123 | 0.5517 | 0.8123 | 0.9013 |
| No log | 3.5625 | 342 | 0.8012 | 0.5537 | 0.8012 | 0.8951 |
| No log | 3.5833 | 344 | 0.8151 | 0.5537 | 0.8151 | 0.9028 |
| No log | 3.6042 | 346 | 0.8486 | 0.4677 | 0.8486 | 0.9212 |
| No log | 3.625 | 348 | 0.8916 | 0.4862 | 0.8916 | 0.9443 |
| No log | 3.6458 | 350 | 0.9743 | 0.4786 | 0.9743 | 0.9871 |
| No log | 3.6667 | 352 | 1.0041 | 0.5043 | 1.0041 | 1.0020 |
| No log | 3.6875 | 354 | 0.8885 | 0.5044 | 0.8885 | 0.9426 |
| No log | 3.7083 | 356 | 0.8556 | 0.5156 | 0.8556 | 0.9250 |
| No log | 3.7292 | 358 | 0.8545 | 0.5374 | 0.8545 | 0.9244 |
| No log | 3.75 | 360 | 0.7929 | 0.5625 | 0.7929 | 0.8905 |
| No log | 3.7708 | 362 | 0.7499 | 0.5466 | 0.7499 | 0.8660 |
| No log | 3.7917 | 364 | 0.7334 | 0.5201 | 0.7334 | 0.8564 |
| No log | 3.8125 | 366 | 0.7353 | 0.4916 | 0.7353 | 0.8575 |
| No log | 3.8333 | 368 | 0.6970 | 0.5739 | 0.6970 | 0.8349 |
| No log | 3.8542 | 370 | 0.7030 | 0.5692 | 0.7030 | 0.8384 |
| No log | 3.875 | 372 | 0.6952 | 0.5837 | 0.6952 | 0.8338 |
| No log | 3.8958 | 374 | 0.7701 | 0.5624 | 0.7701 | 0.8776 |
| No log | 3.9167 | 376 | 0.8573 | 0.5591 | 0.8573 | 0.9259 |
| No log | 3.9375 | 378 | 0.9937 | 0.4965 | 0.9937 | 0.9968 |
| No log | 3.9583 | 380 | 0.9563 | 0.4989 | 0.9563 | 0.9779 |
| No log | 3.9792 | 382 | 0.8016 | 0.5470 | 0.8016 | 0.8953 |
| No log | 4.0 | 384 | 0.7345 | 0.5426 | 0.7345 | 0.8570 |
| No log | 4.0208 | 386 | 0.7211 | 0.5259 | 0.7211 | 0.8492 |
| No log | 4.0417 | 388 | 0.7067 | 0.5331 | 0.7067 | 0.8407 |
| No log | 4.0625 | 390 | 0.7650 | 0.5576 | 0.7650 | 0.8747 |
| No log | 4.0833 | 392 | 0.7929 | 0.5576 | 0.7929 | 0.8904 |
| No log | 4.1042 | 394 | 0.7745 | 0.5400 | 0.7745 | 0.8801 |
| No log | 4.125 | 396 | 0.7563 | 0.5094 | 0.7563 | 0.8697 |
| No log | 4.1458 | 398 | 0.7816 | 0.5077 | 0.7816 | 0.8841 |
| No log | 4.1667 | 400 | 0.7643 | 0.5111 | 0.7643 | 0.8742 |
| No log | 4.1875 | 402 | 0.7858 | 0.5077 | 0.7858 | 0.8864 |
| No log | 4.2083 | 404 | 0.8083 | 0.5044 | 0.8083 | 0.8991 |
| No log | 4.2292 | 406 | 0.8791 | 0.5419 | 0.8791 | 0.9376 |
| No log | 4.25 | 408 | 0.9266 | 0.4714 | 0.9266 | 0.9626 |
| No log | 4.2708 | 410 | 0.8794 | 0.5529 | 0.8794 | 0.9378 |
| No log | 4.2917 | 412 | 0.8521 | 0.5638 | 0.8521 | 0.9231 |
| No log | 4.3125 | 414 | 0.8951 | 0.5015 | 0.8951 | 0.9461 |
| No log | 4.3333 | 416 | 0.9152 | 0.5002 | 0.9152 | 0.9567 |
| No log | 4.3542 | 418 | 0.8597 | 0.5854 | 0.8597 | 0.9272 |
| No log | 4.375 | 420 | 0.7625 | 0.5736 | 0.7625 | 0.8732 |
| No log | 4.3958 | 422 | 0.7523 | 0.6044 | 0.7523 | 0.8673 |
| No log | 4.4167 | 424 | 0.8413 | 0.5490 | 0.8413 | 0.9172 |
| No log | 4.4375 | 426 | 0.9122 | 0.5315 | 0.9122 | 0.9551 |
| No log | 4.4583 | 428 | 0.8325 | 0.5650 | 0.8325 | 0.9124 |
| No log | 4.4792 | 430 | 0.7559 | 0.5968 | 0.7559 | 0.8694 |
| No log | 4.5 | 432 | 0.7501 | 0.5936 | 0.7501 | 0.8661 |
| No log | 4.5208 | 434 | 0.7949 | 0.5650 | 0.7949 | 0.8916 |
| No log | 4.5417 | 436 | 0.8603 | 0.5959 | 0.8603 | 0.9275 |
| No log | 4.5625 | 438 | 0.8082 | 0.6330 | 0.8082 | 0.8990 |
| No log | 4.5833 | 440 | 0.7297 | 0.5437 | 0.7297 | 0.8542 |
| No log | 4.6042 | 442 | 0.7387 | 0.5545 | 0.7387 | 0.8594 |
| No log | 4.625 | 444 | 0.7799 | 0.5875 | 0.7799 | 0.8831 |
| No log | 4.6458 | 446 | 0.7662 | 0.5451 | 0.7662 | 0.8753 |
| No log | 4.6667 | 448 | 0.7477 | 0.5279 | 0.7477 | 0.8647 |
| No log | 4.6875 | 450 | 0.7580 | 0.4832 | 0.7580 | 0.8706 |
| No log | 4.7083 | 452 | 0.7544 | 0.4629 | 0.7544 | 0.8686 |
| No log | 4.7292 | 454 | 0.7505 | 0.4620 | 0.7505 | 0.8663 |
| No log | 4.75 | 456 | 0.7888 | 0.5610 | 0.7888 | 0.8881 |
| No log | 4.7708 | 458 | 0.8849 | 0.5746 | 0.8849 | 0.9407 |
| No log | 4.7917 | 460 | 0.9329 | 0.5297 | 0.9329 | 0.9659 |
| No log | 4.8125 | 462 | 0.9326 | 0.4705 | 0.9326 | 0.9657 |
| No log | 4.8333 | 464 | 0.8239 | 0.5746 | 0.8239 | 0.9077 |
| No log | 4.8542 | 466 | 0.7981 | 0.5443 | 0.7981 | 0.8934 |
| No log | 4.875 | 468 | 0.7807 | 0.5560 | 0.7807 | 0.8836 |
| No log | 4.8958 | 470 | 0.7913 | 0.5077 | 0.7913 | 0.8895 |
| No log | 4.9167 | 472 | 0.7744 | 0.4584 | 0.7744 | 0.8800 |
| No log | 4.9375 | 474 | 0.7454 | 0.4962 | 0.7454 | 0.8634 |
| No log | 4.9583 | 476 | 0.7277 | 0.4873 | 0.7277 | 0.8530 |
| No log | 4.9792 | 478 | 0.7309 | 0.4963 | 0.7309 | 0.8549 |
| No log | 5.0 | 480 | 0.8056 | 0.5421 | 0.8056 | 0.8975 |
| No log | 5.0208 | 482 | 1.0122 | 0.4796 | 1.0122 | 1.0061 |
| No log | 5.0417 | 484 | 1.1228 | 0.3925 | 1.1228 | 1.0596 |
| No log | 5.0625 | 486 | 1.0988 | 0.4106 | 1.0988 | 1.0482 |
| No log | 5.0833 | 488 | 0.9677 | 0.4306 | 0.9677 | 0.9837 |
| No log | 5.1042 | 490 | 0.8737 | 0.4390 | 0.8737 | 0.9347 |
| No log | 5.125 | 492 | 0.8632 | 0.5118 | 0.8632 | 0.9291 |
| No log | 5.1458 | 494 | 0.8601 | 0.5268 | 0.8601 | 0.9274 |
| No log | 5.1667 | 496 | 0.8055 | 0.5600 | 0.8055 | 0.8975 |
| No log | 5.1875 | 498 | 0.7339 | 0.5702 | 0.7339 | 0.8567 |
| 0.316 | 5.2083 | 500 | 0.7136 | 0.5727 | 0.7136 | 0.8448 |
| 0.316 | 5.2292 | 502 | 0.7129 | 0.5727 | 0.7129 | 0.8443 |
| 0.316 | 5.25 | 504 | 0.7483 | 0.5255 | 0.7483 | 0.8650 |
| 0.316 | 5.2708 | 506 | 0.8003 | 0.4911 | 0.8003 | 0.8946 |
| 0.316 | 5.2917 | 508 | 0.8530 | 0.5287 | 0.8530 | 0.9236 |
| 0.316 | 5.3125 | 510 | 0.8632 | 0.5287 | 0.8632 | 0.9291 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1