ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k2_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7693
  • Qwk (quadratic weighted kappa): 0.5073
  • Mse (mean squared error): 0.7693
  • Rmse (root mean squared error): 0.8771
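These metrics can be recomputed from predictions with a short NumPy sketch. The labels below are illustrative assumptions, since the evaluation data and score range are not documented; note also that Loss equaling Mse suggests a regression (MSE) objective with predictions rounded to score levels before computing Qwk, which is an inference from the numbers, not a documented fact.

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the Qwk metric above)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    # Observed confusion matrix.
    obs = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        obs[t, p] += 1
    # Quadratic disagreement weights: 0 on the diagonal, growing with distance.
    idx = np.arange(n_classes)
    weights = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    # Expected matrix under chance agreement, scaled to the same total count.
    expected = np.outer(obs.sum(axis=1), obs.sum(axis=0)) / obs.sum()
    return 1.0 - (weights * obs).sum() / (weights * expected).sum()

# Illustrative 4-class organization scores (hypothetical, not the real eval set).
y_true = [0, 1, 2, 3, 2, 1]
y_pred = [0, 1, 2, 2, 3, 1]
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=4)
mse = np.mean((np.array(y_true) - np.array(y_pred)) ** 2)
rmse = np.sqrt(mse)
```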

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
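These hyperparameters map onto a transformers TrainingArguments configuration roughly as follows. This is a sketch, not the actual training script (which is not published); the output directory is a placeholder.

```python
from transformers import TrainingArguments

# Configuration implied by the hyperparameters above (output_dir is a placeholder).
training_args = TrainingArguments(
    output_dir="./results",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",  # linear decay; no warmup is reported
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```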

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1667 2 4.4483 0.0018 4.4483 2.1091
No log 0.3333 4 4.0243 0.0074 4.0243 2.0061
No log 0.5 6 1.9797 0.0879 1.9797 1.4070
No log 0.6667 8 1.3163 0.1110 1.3163 1.1473
No log 0.8333 10 1.2804 0.1239 1.2804 1.1315
No log 1.0 12 1.2242 0.1247 1.2242 1.1065
No log 1.1667 14 1.2186 0.1076 1.2186 1.1039
No log 1.3333 16 1.1440 0.1343 1.1440 1.0696
No log 1.5 18 1.4962 0.0331 1.4962 1.2232
No log 1.6667 20 1.6528 0.0504 1.6528 1.2856
No log 1.8333 22 1.6670 0.1141 1.6670 1.2911
No log 2.0 24 1.4362 0.0455 1.4362 1.1984
No log 2.1667 26 1.2059 0.1920 1.2059 1.0981
No log 2.3333 28 1.1794 0.0865 1.1794 1.0860
No log 2.5 30 1.1109 0.3385 1.1109 1.0540
No log 2.6667 32 1.0448 0.2551 1.0448 1.0222
No log 2.8333 34 1.0499 0.2263 1.0499 1.0247
No log 3.0 36 1.0048 0.2689 1.0048 1.0024
No log 3.1667 38 0.9595 0.3301 0.9595 0.9795
No log 3.3333 40 1.1486 0.2371 1.1486 1.0717
No log 3.5 42 1.4436 0.2680 1.4436 1.2015
No log 3.6667 44 1.4246 0.3002 1.4246 1.1936
No log 3.8333 46 1.1108 0.3424 1.1108 1.0539
No log 4.0 48 0.9581 0.4936 0.9581 0.9788
No log 4.1667 50 1.0857 0.4106 1.0857 1.0420
No log 4.3333 52 1.1998 0.4634 1.1998 1.0954
No log 4.5 54 1.0351 0.3504 1.0351 1.0174
No log 4.6667 56 0.9409 0.4879 0.9409 0.9700
No log 4.8333 58 1.1339 0.2807 1.1339 1.0649
No log 5.0 60 1.4345 0.2800 1.4345 1.1977
No log 5.1667 62 1.6505 0.2669 1.6505 1.2847
No log 5.3333 64 1.4058 0.3326 1.4058 1.1857
No log 5.5 66 1.0137 0.5932 1.0137 1.0068
No log 5.6667 68 0.9716 0.4614 0.9716 0.9857
No log 5.8333 70 1.1281 0.4888 1.1281 1.0621
No log 6.0 72 1.2578 0.4292 1.2578 1.1215
No log 6.1667 74 1.2510 0.4594 1.2510 1.1185
No log 6.3333 76 1.0493 0.4847 1.0493 1.0243
No log 6.5 78 0.8811 0.5232 0.8811 0.9387
No log 6.6667 80 0.9682 0.4622 0.9682 0.9840
No log 6.8333 82 0.9528 0.4949 0.9528 0.9761
No log 7.0 84 0.8273 0.5181 0.8273 0.9096
No log 7.1667 86 0.8540 0.4920 0.8540 0.9241
No log 7.3333 88 0.9950 0.5329 0.9950 0.9975
No log 7.5 90 1.1190 0.5470 1.1190 1.0578
No log 7.6667 92 0.9713 0.5658 0.9713 0.9855
No log 7.8333 94 0.8312 0.4941 0.8312 0.9117
No log 8.0 96 0.9154 0.5892 0.9154 0.9568
No log 8.1667 98 1.0173 0.5398 1.0173 1.0086
No log 8.3333 100 1.0420 0.5582 1.0420 1.0208
No log 8.5 102 0.9706 0.5392 0.9706 0.9852
No log 8.6667 104 0.8671 0.5248 0.8671 0.9312
No log 8.8333 106 0.8548 0.5790 0.8548 0.9245
No log 9.0 108 0.8904 0.4772 0.8904 0.9436
No log 9.1667 110 0.8954 0.5472 0.8954 0.9463
No log 9.3333 112 0.8437 0.4858 0.8437 0.9185
No log 9.5 114 0.8301 0.4517 0.8301 0.9111
No log 9.6667 116 0.8549 0.4742 0.8549 0.9246
No log 9.8333 118 0.9769 0.4977 0.9769 0.9884
No log 10.0 120 0.9679 0.4922 0.9679 0.9838
No log 10.1667 122 0.8928 0.5164 0.8928 0.9449
No log 10.3333 124 0.9365 0.4825 0.9365 0.9677
No log 10.5 126 1.1871 0.5015 1.1871 1.0895
No log 10.6667 128 1.4487 0.3768 1.4487 1.2036
No log 10.8333 130 1.4559 0.3715 1.4559 1.2066
No log 11.0 132 1.2811 0.4426 1.2811 1.1318
No log 11.1667 134 1.1231 0.4590 1.1231 1.0597
No log 11.3333 136 0.9477 0.4253 0.9477 0.9735
No log 11.5 138 0.8749 0.4906 0.8749 0.9354
No log 11.6667 140 1.0034 0.4278 1.0034 1.0017
No log 11.8333 142 1.0407 0.4258 1.0407 1.0202
No log 12.0 144 0.9586 0.4454 0.9586 0.9791
No log 12.1667 146 0.9246 0.5094 0.9246 0.9616
No log 12.3333 148 0.8955 0.4332 0.8955 0.9463
No log 12.5 150 0.9268 0.4327 0.9268 0.9627
No log 12.6667 152 0.9093 0.4656 0.9093 0.9536
No log 12.8333 154 0.8885 0.5401 0.8885 0.9426
No log 13.0 156 0.8988 0.5353 0.8988 0.9481
No log 13.1667 158 0.9154 0.5143 0.9154 0.9568
No log 13.3333 160 0.8769 0.4779 0.8769 0.9364
No log 13.5 162 0.8492 0.4894 0.8492 0.9215
No log 13.6667 164 0.8441 0.4364 0.8441 0.9187
No log 13.8333 166 0.8399 0.4572 0.8399 0.9164
No log 14.0 168 0.8693 0.4800 0.8693 0.9323
No log 14.1667 170 0.8544 0.5420 0.8544 0.9243
No log 14.3333 172 0.8279 0.4718 0.8279 0.9099
No log 14.5 174 0.8464 0.4389 0.8464 0.9200
No log 14.6667 176 0.9027 0.5227 0.9027 0.9501
No log 14.8333 178 0.9977 0.4751 0.9977 0.9988
No log 15.0 180 1.1249 0.5206 1.1249 1.0606
No log 15.1667 182 1.1030 0.5230 1.1030 1.0503
No log 15.3333 184 0.9896 0.5109 0.9896 0.9948
No log 15.5 186 0.8906 0.4646 0.8906 0.9437
No log 15.6667 188 0.8451 0.4590 0.8451 0.9193
No log 15.8333 190 0.8321 0.4565 0.8321 0.9122
No log 16.0 192 0.8184 0.4309 0.8184 0.9047
No log 16.1667 194 0.8351 0.5011 0.8351 0.9138
No log 16.3333 196 0.8295 0.5362 0.8295 0.9108
No log 16.5 198 0.8040 0.4847 0.8040 0.8967
No log 16.6667 200 0.7898 0.4249 0.7898 0.8887
No log 16.8333 202 0.7994 0.4828 0.7994 0.8941
No log 17.0 204 0.8376 0.5042 0.8376 0.9152
No log 17.1667 206 0.8766 0.5227 0.8766 0.9363
No log 17.3333 208 0.8589 0.5362 0.8589 0.9268
No log 17.5 210 0.8618 0.5318 0.8618 0.9284
No log 17.6667 212 0.8786 0.5076 0.8786 0.9373
No log 17.8333 214 0.9204 0.4906 0.9204 0.9594
No log 18.0 216 1.0362 0.5310 1.0362 1.0179
No log 18.1667 218 1.0877 0.5347 1.0877 1.0429
No log 18.3333 220 1.0131 0.5015 1.0131 1.0065
No log 18.5 222 0.9012 0.5042 0.9012 0.9493
No log 18.6667 224 0.8408 0.5083 0.8408 0.9170
No log 18.8333 226 0.8339 0.5503 0.8339 0.9132
No log 19.0 228 0.8600 0.5094 0.8600 0.9274
No log 19.1667 230 0.9665 0.5357 0.9665 0.9831
No log 19.3333 232 1.0395 0.5678 1.0395 1.0196
No log 19.5 234 0.9966 0.5678 0.9966 0.9983
No log 19.6667 236 0.8818 0.5145 0.8818 0.9390
No log 19.8333 238 0.8246 0.5320 0.8246 0.9081
No log 20.0 240 0.8136 0.5272 0.8136 0.9020
No log 20.1667 242 0.8169 0.5098 0.8169 0.9038
No log 20.3333 244 0.8568 0.5303 0.8568 0.9257
No log 20.5 246 0.9129 0.5951 0.9129 0.9555
No log 20.6667 248 0.9446 0.5861 0.9446 0.9719
No log 20.8333 250 0.9284 0.5611 0.9284 0.9635
No log 21.0 252 0.9428 0.5386 0.9428 0.9710
No log 21.1667 254 0.9516 0.5057 0.9516 0.9755
No log 21.3333 256 0.9317 0.5207 0.9317 0.9652
No log 21.5 258 0.9129 0.4944 0.9129 0.9555
No log 21.6667 260 0.9085 0.4663 0.9085 0.9531
No log 21.8333 262 0.9167 0.4553 0.9167 0.9574
No log 22.0 264 0.9514 0.3945 0.9514 0.9754
No log 22.1667 266 0.9697 0.3958 0.9697 0.9847
No log 22.3333 268 0.9603 0.4490 0.9603 0.9799
No log 22.5 270 0.9155 0.5406 0.9155 0.9568
No log 22.6667 272 0.8812 0.5793 0.8812 0.9387
No log 22.8333 274 0.8905 0.5753 0.8905 0.9437
No log 23.0 276 0.8804 0.5753 0.8804 0.9383
No log 23.1667 278 0.8423 0.5283 0.8423 0.9178
No log 23.3333 280 0.8776 0.6162 0.8776 0.9368
No log 23.5 282 0.9309 0.5448 0.9309 0.9648
No log 23.6667 284 0.9130 0.5448 0.9130 0.9555
No log 23.8333 286 0.9197 0.5448 0.9197 0.9590
No log 24.0 288 0.9164 0.5448 0.9164 0.9573
No log 24.1667 290 0.8868 0.5276 0.8868 0.9417
No log 24.3333 292 0.8482 0.5564 0.8482 0.9210
No log 24.5 294 0.8531 0.5467 0.8531 0.9236
No log 24.6667 296 0.8863 0.5674 0.8863 0.9414
No log 24.8333 298 0.9181 0.5654 0.9181 0.9582
No log 25.0 300 0.9543 0.5655 0.9543 0.9769
No log 25.1667 302 1.0223 0.5675 1.0223 1.0111
No log 25.3333 304 1.0014 0.5605 1.0014 1.0007
No log 25.5 306 0.9078 0.5258 0.9078 0.9528
No log 25.6667 308 0.8583 0.5098 0.8583 0.9264
No log 25.8333 310 0.8213 0.4775 0.8213 0.9063
No log 26.0 312 0.8069 0.4476 0.8069 0.8983
No log 26.1667 314 0.8077 0.4724 0.8077 0.8987
No log 26.3333 316 0.8332 0.4639 0.8332 0.9128
No log 26.5 318 0.9033 0.5258 0.9033 0.9504
No log 26.6667 320 0.9668 0.5258 0.9668 0.9833
No log 26.8333 322 0.9481 0.5258 0.9481 0.9737
No log 27.0 324 0.8958 0.5151 0.8958 0.9464
No log 27.1667 326 0.8845 0.5544 0.8845 0.9405
No log 27.3333 328 0.8685 0.5237 0.8685 0.9319
No log 27.5 330 0.8778 0.5094 0.8778 0.9369
No log 27.6667 332 0.9072 0.5121 0.9072 0.9525
No log 27.8333 334 0.9744 0.4988 0.9744 0.9871
No log 28.0 336 1.0283 0.5158 1.0283 1.0141
No log 28.1667 338 1.0585 0.5574 1.0585 1.0289
No log 28.3333 340 1.0133 0.5354 1.0133 1.0066
No log 28.5 342 0.9629 0.4715 0.9629 0.9813
No log 28.6667 344 0.9455 0.5034 0.9455 0.9724
No log 28.8333 346 0.9327 0.4864 0.9327 0.9657
No log 29.0 348 0.9078 0.4968 0.9078 0.9528
No log 29.1667 350 0.8929 0.4832 0.8929 0.9449
No log 29.3333 352 0.8889 0.5304 0.8889 0.9428
No log 29.5 354 0.8882 0.5009 0.8882 0.9424
No log 29.6667 356 0.8877 0.5009 0.8877 0.9422
No log 29.8333 358 0.8810 0.5009 0.8810 0.9386
No log 30.0 360 0.8751 0.5142 0.8751 0.9355
No log 30.1667 362 0.8711 0.5142 0.8711 0.9333
No log 30.3333 364 0.8823 0.5276 0.8823 0.9393
No log 30.5 366 0.9002 0.5245 0.9002 0.9488
No log 30.6667 368 0.9143 0.5589 0.9143 0.9562
No log 30.8333 370 0.9045 0.5204 0.9045 0.9510
No log 31.0 372 0.9100 0.4792 0.9100 0.9539
No log 31.1667 374 0.9289 0.4763 0.9289 0.9638
No log 31.3333 376 0.9265 0.4820 0.9265 0.9626
No log 31.5 378 0.9205 0.4944 0.9205 0.9594
No log 31.6667 380 0.9169 0.5468 0.9169 0.9575
No log 31.8333 382 0.9108 0.5376 0.9108 0.9544
No log 32.0 384 0.9127 0.5205 0.9127 0.9554
No log 32.1667 386 0.9177 0.4726 0.9177 0.9579
No log 32.3333 388 0.9188 0.4685 0.9188 0.9586
No log 32.5 390 0.9292 0.4552 0.9292 0.9639
No log 32.6667 392 0.9614 0.4715 0.9614 0.9805
No log 32.8333 394 1.0007 0.4780 1.0007 1.0004
No log 33.0 396 0.9973 0.4593 0.9973 0.9987
No log 33.1667 398 0.9647 0.4715 0.9647 0.9822
No log 33.3333 400 0.9495 0.4829 0.9495 0.9744
No log 33.5 402 0.9467 0.4773 0.9467 0.9730
No log 33.6667 404 0.9592 0.4773 0.9592 0.9794
No log 33.8333 406 0.9697 0.4942 0.9697 0.9847
No log 34.0 408 0.9879 0.4829 0.9879 0.9940
No log 34.1667 410 1.0072 0.4400 1.0072 1.0036
No log 34.3333 412 1.0106 0.4400 1.0106 1.0053
No log 34.5 414 0.9910 0.4829 0.9910 0.9955
No log 34.6667 416 0.9484 0.4382 0.9484 0.9739
No log 34.8333 418 0.9174 0.4944 0.9174 0.9578
No log 35.0 420 0.9088 0.4958 0.9088 0.9533
No log 35.1667 422 0.9112 0.5204 0.9112 0.9546
No log 35.3333 424 0.9230 0.4957 0.9230 0.9607
No log 35.5 426 0.9305 0.5028 0.9305 0.9646
No log 35.6667 428 0.9450 0.5028 0.9450 0.9721
No log 35.8333 430 0.9411 0.5028 0.9411 0.9701
No log 36.0 432 0.9018 0.5073 0.9018 0.9497
No log 36.1667 434 0.8664 0.4143 0.8664 0.9308
No log 36.3333 436 0.8500 0.4540 0.8500 0.9220
No log 36.5 438 0.8493 0.4374 0.8493 0.9216
No log 36.6667 440 0.8668 0.4956 0.8668 0.9310
No log 36.8333 442 0.9027 0.4906 0.9027 0.9501
No log 37.0 444 0.9205 0.5242 0.9205 0.9594
No log 37.1667 446 0.9137 0.5242 0.9137 0.9559
No log 37.3333 448 0.9066 0.5038 0.9066 0.9522
No log 37.5 450 0.8870 0.5218 0.8870 0.9418
No log 37.6667 452 0.8686 0.4852 0.8686 0.9320
No log 37.8333 454 0.8673 0.4663 0.8673 0.9313
No log 38.0 456 0.8823 0.5029 0.8823 0.9393
No log 38.1667 458 0.9014 0.5200 0.9014 0.9494
No log 38.3333 460 0.9022 0.5166 0.9022 0.9499
No log 38.5 462 0.8814 0.5183 0.8814 0.9389
No log 38.6667 464 0.8457 0.5495 0.8457 0.9196
No log 38.8333 466 0.8289 0.5119 0.8289 0.9104
No log 39.0 468 0.8222 0.4923 0.8222 0.9067
No log 39.1667 470 0.8231 0.4779 0.8231 0.9073
No log 39.3333 472 0.8253 0.5121 0.8253 0.9085
No log 39.5 474 0.8248 0.4938 0.8248 0.9082
No log 39.6667 476 0.8154 0.4894 0.8154 0.9030
No log 39.8333 478 0.8132 0.4902 0.8132 0.9018
No log 40.0 480 0.8192 0.5479 0.8192 0.9051
No log 40.1667 482 0.8365 0.5447 0.8365 0.9146
No log 40.3333 484 0.8398 0.4968 0.8398 0.9164
No log 40.5 486 0.8250 0.4239 0.8250 0.9083
No log 40.6667 488 0.8141 0.4369 0.8141 0.9023
No log 40.8333 490 0.8065 0.4369 0.8065 0.8980
No log 41.0 492 0.7904 0.5875 0.7904 0.8890
No log 41.1667 494 0.7732 0.6038 0.7732 0.8793
No log 41.3333 496 0.7665 0.6374 0.7665 0.8755
No log 41.5 498 0.7605 0.6026 0.7605 0.8721
0.2936 41.6667 500 0.7581 0.5560 0.7581 0.8707
0.2936 41.8333 502 0.7590 0.5560 0.7590 0.8712
0.2936 42.0 504 0.7579 0.5426 0.7579 0.8706
0.2936 42.1667 506 0.7615 0.5110 0.7615 0.8726
0.2936 42.3333 508 0.7649 0.5295 0.7649 0.8746
0.2936 42.5 510 0.7693 0.5073 0.7693 0.8771

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B parameters (Safetensors, F32)
