ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k15_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8126
  • Qwk: 0.4730
  • Mse: 0.8126
  • Rmse: 0.9014
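Qwk above is the quadratic weighted kappa, the standard agreement metric for ordinal essay scores. The sketch below is a minimal pure-Python implementation of it; the 4-class label range in the example is hypothetical, since the card does not state the actual score scale of the organization task.

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights, as used for ordinal scoring."""
    # Observed confusion matrix.
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1

    n = len(y_true)
    # Expected matrix from the outer product of the marginal histograms.
    hist_true = [sum(row) for row in O]
    hist_pred = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    E = [[hist_true[i] * hist_pred[j] / n for j in range(n_classes)]
         for i in range(n_classes)]

    # Quadratic disagreement weights: distant scores are penalized more.
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]

    num = sum(w[i][j] * O[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * E[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

# Perfect agreement gives 1.0; systematic disagreement goes negative.
print(quadratic_weighted_kappa([0, 1, 2, 3], [0, 1, 2, 3], n_classes=4))  # 1.0
```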

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
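With lr_scheduler_type: linear, the Trainer decays the learning rate linearly from 2e-05 to 0 over all optimizer steps. A minimal sketch of that schedule (no warmup steps are listed in the card, so warmup is assumed to be 0; the step count below is derived from the table, where epoch 6.0 falls at step 510, i.e. 85 steps per epoch):

```python
def linear_lr(step, total_steps, base_lr=2e-5):
    """Linearly decayed learning rate for a given optimizer step."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)

# 100 epochs * 85 steps/epoch = 8500 total steps (illustrative).
total = 8500
print(linear_lr(0, total))      # 2e-05 at the start of training
print(linear_lr(total, total))  # 0.0 at the final step
```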

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0235 2 5.0212 -0.0214 5.0212 2.2408
No log 0.0471 4 3.5604 -0.0231 3.5604 1.8869
No log 0.0706 6 2.2299 -0.0533 2.2299 1.4933
No log 0.0941 8 1.2555 0.1262 1.2555 1.1205
No log 0.1176 10 1.3052 0.0184 1.3052 1.1424
No log 0.1412 12 1.2992 0.0918 1.2992 1.1398
No log 0.1647 14 1.1670 0.1711 1.1670 1.0803
No log 0.1882 16 1.2231 0.0872 1.2231 1.1059
No log 0.2118 18 1.2347 0.1379 1.2347 1.1112
No log 0.2353 20 1.2209 0.1546 1.2209 1.1049
No log 0.2588 22 1.2060 0.1379 1.2060 1.0982
No log 0.2824 24 1.2244 0.1247 1.2244 1.1065
No log 0.3059 26 1.2519 0.0802 1.2519 1.1189
No log 0.3294 28 1.1875 0.1247 1.1875 1.0897
No log 0.3529 30 1.1803 0.2782 1.1803 1.0864
No log 0.3765 32 1.2107 0.2137 1.2107 1.1003
No log 0.4 34 1.2642 0.0449 1.2642 1.1244
No log 0.4235 36 1.2958 0.0776 1.2958 1.1383
No log 0.4471 38 1.1984 0.1196 1.1984 1.0947
No log 0.4706 40 1.0799 0.1388 1.0799 1.0392
No log 0.4941 42 0.9980 0.3994 0.9980 0.9990
No log 0.5176 44 1.0014 0.2988 1.0014 1.0007
No log 0.5412 46 1.0743 0.3590 1.0743 1.0365
No log 0.5647 48 0.9625 0.3919 0.9625 0.9811
No log 0.5882 50 0.8764 0.4465 0.8764 0.9361
No log 0.6118 52 1.0746 0.4080 1.0746 1.0366
No log 0.6353 54 1.5051 0.3989 1.5051 1.2268
No log 0.6588 56 1.4186 0.4328 1.4186 1.1911
No log 0.6824 58 0.9293 0.4219 0.9293 0.9640
No log 0.7059 60 0.9231 0.3891 0.9231 0.9608
No log 0.7294 62 1.1627 0.3232 1.1627 1.0783
No log 0.7529 64 1.0276 0.3995 1.0276 1.0137
No log 0.7765 66 0.8200 0.5510 0.8200 0.9055
No log 0.8 68 0.8978 0.4576 0.8978 0.9475
No log 0.8235 70 1.0164 0.4410 1.0164 1.0082
No log 0.8471 72 0.9302 0.4930 0.9302 0.9645
No log 0.8706 74 0.8513 0.4924 0.8513 0.9227
No log 0.8941 76 0.7841 0.5504 0.7841 0.8855
No log 0.9176 78 0.8693 0.5451 0.8693 0.9324
No log 0.9412 80 1.0259 0.4538 1.0259 1.0129
No log 0.9647 82 0.9841 0.5319 0.9841 0.9920
No log 0.9882 84 1.0250 0.4614 1.0250 1.0124
No log 1.0118 86 0.9946 0.4493 0.9946 0.9973
No log 1.0353 88 0.9927 0.4996 0.9927 0.9964
No log 1.0588 90 1.0228 0.4435 1.0228 1.0114
No log 1.0824 92 1.0042 0.4168 1.0042 1.0021
No log 1.1059 94 0.9031 0.4654 0.9031 0.9503
No log 1.1294 96 0.8891 0.5198 0.8891 0.9429
No log 1.1529 98 1.0494 0.4309 1.0494 1.0244
No log 1.1765 100 1.0079 0.4309 1.0079 1.0040
No log 1.2 102 0.8990 0.5226 0.8990 0.9481
No log 1.2235 104 0.8198 0.5416 0.8198 0.9054
No log 1.2471 106 0.8854 0.5495 0.8854 0.9410
No log 1.2706 108 0.9974 0.5125 0.9974 0.9987
No log 1.2941 110 1.1177 0.5384 1.1177 1.0572
No log 1.3176 112 0.9898 0.4795 0.9898 0.9949
No log 1.3412 114 0.8447 0.6171 0.8447 0.9191
No log 1.3647 116 0.8362 0.6346 0.8362 0.9144
No log 1.3882 118 0.8526 0.6346 0.8526 0.9234
No log 1.4118 120 0.8392 0.5886 0.8392 0.9161
No log 1.4353 122 0.8991 0.4831 0.8991 0.9482
No log 1.4588 124 0.8448 0.5519 0.8448 0.9191
No log 1.4824 126 0.8180 0.5541 0.8180 0.9044
No log 1.5059 128 0.7718 0.5966 0.7718 0.8785
No log 1.5294 130 0.7872 0.5671 0.7872 0.8872
No log 1.5529 132 0.9024 0.5052 0.9024 0.9500
No log 1.5765 134 0.9121 0.5040 0.9121 0.9550
No log 1.6 136 0.8242 0.5313 0.8242 0.9079
No log 1.6235 138 0.8407 0.5333 0.8407 0.9169
No log 1.6471 140 0.8146 0.5220 0.8146 0.9025
No log 1.6706 142 0.7952 0.5683 0.7952 0.8917
No log 1.6941 144 0.7924 0.5980 0.7924 0.8902
No log 1.7176 146 0.8815 0.5039 0.8815 0.9389
No log 1.7412 148 1.0299 0.4990 1.0299 1.0148
No log 1.7647 150 0.9589 0.4769 0.9589 0.9793
No log 1.7882 152 0.8471 0.5695 0.8471 0.9204
No log 1.8118 154 0.8290 0.5632 0.8290 0.9105
No log 1.8353 156 0.8216 0.5632 0.8216 0.9064
No log 1.8588 158 0.8412 0.5098 0.8412 0.9172
No log 1.8824 160 0.9956 0.4946 0.9956 0.9978
No log 1.9059 162 1.0036 0.4946 1.0036 1.0018
No log 1.9294 164 0.8598 0.5098 0.8598 0.9272
No log 1.9529 166 0.8267 0.5466 0.8267 0.9092
No log 1.9765 168 0.8352 0.5849 0.8352 0.9139
No log 2.0 170 0.8883 0.5775 0.8883 0.9425
No log 2.0235 172 0.9277 0.4959 0.9277 0.9631
No log 2.0471 174 1.0638 0.4518 1.0638 1.0314
No log 2.0706 176 1.0593 0.4518 1.0593 1.0292
No log 2.0941 178 0.9183 0.5501 0.9183 0.9583
No log 2.1176 180 0.9280 0.4781 0.9280 0.9634
No log 2.1412 182 0.9301 0.4902 0.9301 0.9644
No log 2.1647 184 0.9093 0.4601 0.9093 0.9536
No log 2.1882 186 0.9419 0.4321 0.9419 0.9705
No log 2.2118 188 1.0649 0.4219 1.0649 1.0320
No log 2.2353 190 1.1006 0.4219 1.1006 1.0491
No log 2.2588 192 0.9647 0.4600 0.9647 0.9822
No log 2.2824 194 0.8952 0.4321 0.8952 0.9462
No log 2.3059 196 0.8658 0.4444 0.8658 0.9305
No log 2.3294 198 0.9096 0.4289 0.9096 0.9537
No log 2.3529 200 0.9085 0.4382 0.9085 0.9531
No log 2.3765 202 0.8565 0.3970 0.8565 0.9254
No log 2.4 204 0.8387 0.4356 0.8387 0.9158
No log 2.4235 206 0.8486 0.4661 0.8486 0.9212
No log 2.4471 208 0.8440 0.5283 0.8440 0.9187
No log 2.4706 210 0.8532 0.4369 0.8532 0.9237
No log 2.4941 212 0.9436 0.4220 0.9436 0.9714
No log 2.5176 214 0.9765 0.4422 0.9765 0.9882
No log 2.5412 216 0.8964 0.4131 0.8964 0.9468
No log 2.5647 218 0.8359 0.4980 0.8359 0.9143
No log 2.5882 220 0.8286 0.5137 0.8286 0.9103
No log 2.6118 222 0.8401 0.5057 0.8401 0.9166
No log 2.6353 224 0.8492 0.5344 0.8492 0.9215
No log 2.6588 226 0.8874 0.5135 0.8874 0.9420
No log 2.6824 228 0.9603 0.4310 0.9603 0.9800
No log 2.7059 230 0.8969 0.4644 0.8969 0.9471
No log 2.7294 232 0.8463 0.5149 0.8463 0.9199
No log 2.7529 234 0.8672 0.5563 0.8672 0.9312
No log 2.7765 236 0.8478 0.4945 0.8478 0.9208
No log 2.8 238 0.8441 0.4726 0.8441 0.9187
No log 2.8235 240 0.8694 0.4772 0.8694 0.9324
No log 2.8471 242 0.9431 0.4512 0.9431 0.9711
No log 2.8706 244 0.8939 0.5623 0.8939 0.9454
No log 2.8941 246 0.8312 0.6560 0.8312 0.9117
No log 2.9176 248 1.0673 0.4280 1.0673 1.0331
No log 2.9412 250 1.2127 0.4357 1.2127 1.1012
No log 2.9647 252 0.8736 0.5402 0.8736 0.9347
No log 2.9882 254 0.9102 0.5126 0.9102 0.9540
No log 3.0118 256 1.2549 0.4633 1.2549 1.1202
No log 3.0353 258 1.3595 0.4285 1.3595 1.1660
No log 3.0588 260 1.1684 0.3916 1.1684 1.0809
No log 3.0824 262 0.9402 0.4074 0.9402 0.9697
No log 3.1059 264 0.8617 0.4429 0.8617 0.9283
No log 3.1294 266 0.9103 0.4444 0.9103 0.9541
No log 3.1529 268 0.8786 0.4632 0.8786 0.9373
No log 3.1765 270 0.8286 0.4977 0.8286 0.9103
No log 3.2 272 0.8763 0.5012 0.8763 0.9361
No log 3.2235 274 1.0195 0.4830 1.0195 1.0097
No log 3.2471 276 1.2057 0.4312 1.2057 1.0980
No log 3.2706 278 1.2901 0.4240 1.2901 1.1358
No log 3.2941 280 1.2209 0.4240 1.2209 1.1049
No log 3.3176 282 1.0495 0.4836 1.0495 1.0244
No log 3.3412 284 0.9284 0.4940 0.9284 0.9636
No log 3.3647 286 0.8762 0.4579 0.8762 0.9361
No log 3.3882 288 0.8369 0.4316 0.8369 0.9148
No log 3.4118 290 0.8318 0.4352 0.8318 0.9120
No log 3.4353 292 0.8545 0.4540 0.8545 0.9244
No log 3.4588 294 0.9469 0.5029 0.9469 0.9731
No log 3.4824 296 1.0587 0.4625 1.0587 1.0289
No log 3.5059 298 1.0020 0.4552 1.0020 1.0010
No log 3.5294 300 1.0024 0.4672 1.0024 1.0012
No log 3.5529 302 0.9792 0.4920 0.9792 0.9896
No log 3.5765 304 0.9307 0.4390 0.9307 0.9647
No log 3.6 306 0.8951 0.4884 0.8951 0.9461
No log 3.6235 308 0.8908 0.4884 0.8908 0.9438
No log 3.6471 310 0.8744 0.4982 0.8744 0.9351
No log 3.6706 312 0.8557 0.5230 0.8557 0.9251
No log 3.6941 314 0.8291 0.4951 0.8291 0.9105
No log 3.7176 316 0.8280 0.4951 0.8280 0.9099
No log 3.7412 318 0.8737 0.5473 0.8737 0.9347
No log 3.7647 320 0.8627 0.5385 0.8627 0.9288
No log 3.7882 322 0.8422 0.5220 0.8422 0.9177
No log 3.8118 324 0.8701 0.5311 0.8701 0.9328
No log 3.8353 326 0.9532 0.4583 0.9532 0.9763
No log 3.8588 328 0.9692 0.4658 0.9692 0.9845
No log 3.8824 330 0.8934 0.4223 0.8934 0.9452
No log 3.9059 332 0.8567 0.3573 0.8567 0.9256
No log 3.9294 334 0.8540 0.3703 0.8540 0.9241
No log 3.9529 336 0.8916 0.4235 0.8916 0.9442
No log 3.9765 338 0.9536 0.4368 0.9536 0.9765
No log 4.0 340 0.9421 0.4368 0.9421 0.9706
No log 4.0235 342 0.8788 0.5298 0.8788 0.9374
No log 4.0471 344 0.8549 0.4981 0.8549 0.9246
No log 4.0706 346 0.8647 0.5137 0.8647 0.9299
No log 4.0941 348 0.8866 0.4107 0.8866 0.9416
No log 4.1176 350 0.9082 0.4069 0.9082 0.9530
No log 4.1412 352 0.9800 0.3816 0.9800 0.9899
No log 4.1647 354 1.1070 0.4186 1.1070 1.0522
No log 4.1882 356 1.1100 0.3978 1.1100 1.0536
No log 4.2118 358 1.0380 0.3540 1.0380 1.0188
No log 4.2353 360 1.0037 0.4449 1.0037 1.0019
No log 4.2588 362 1.0192 0.3657 1.0192 1.0095
No log 4.2824 364 1.1080 0.3648 1.1080 1.0526
No log 4.3059 366 1.1894 0.3676 1.1894 1.0906
No log 4.3294 368 1.1438 0.3667 1.1438 1.0695
No log 4.3529 370 0.9805 0.3908 0.9805 0.9902
No log 4.3765 372 0.8899 0.4485 0.8899 0.9434
No log 4.4 374 0.8980 0.4098 0.8980 0.9476
No log 4.4235 376 0.8852 0.4098 0.8852 0.9408
No log 4.4471 378 0.8780 0.4413 0.8780 0.9370
No log 4.4706 380 0.9828 0.4220 0.9828 0.9914
No log 4.4941 382 1.1376 0.3874 1.1376 1.0666
No log 4.5176 384 1.1175 0.3874 1.1175 1.0571
No log 4.5412 386 0.9760 0.4649 0.9760 0.9879
No log 4.5647 388 0.8920 0.4164 0.8920 0.9445
No log 4.5882 390 0.8656 0.4313 0.8656 0.9304
No log 4.6118 392 0.8849 0.4560 0.8849 0.9407
No log 4.6353 394 0.9346 0.5124 0.9346 0.9668
No log 4.6588 396 1.0685 0.4219 1.0685 1.0337
No log 4.6824 398 1.1442 0.3874 1.1442 1.0697
No log 4.7059 400 1.0549 0.4219 1.0549 1.0271
No log 4.7294 402 0.9438 0.4614 0.9438 0.9715
No log 4.7529 404 0.9012 0.4242 0.9012 0.9493
No log 4.7765 406 0.8985 0.3838 0.8985 0.9479
No log 4.8 408 0.8875 0.3608 0.8875 0.9421
No log 4.8235 410 0.8859 0.5045 0.8859 0.9412
No log 4.8471 412 0.9089 0.5201 0.9089 0.9534
No log 4.8706 414 1.0140 0.4655 1.0140 1.0070
No log 4.8941 416 1.0633 0.4845 1.0633 1.0312
No log 4.9176 418 1.0016 0.4655 1.0016 1.0008
No log 4.9412 420 0.9091 0.5266 0.9091 0.9534
No log 4.9647 422 0.8700 0.4241 0.8700 0.9328
No log 4.9882 424 0.8675 0.4482 0.8675 0.9314
No log 5.0118 426 0.8511 0.4519 0.8511 0.9225
No log 5.0353 428 0.8554 0.4584 0.8554 0.9249
No log 5.0588 430 0.9204 0.4851 0.9204 0.9594
No log 5.0824 432 1.0665 0.3785 1.0665 1.0327
No log 5.1059 434 1.1011 0.3684 1.1011 1.0494
No log 5.1294 436 0.9856 0.4458 0.9856 0.9928
No log 5.1529 438 0.8907 0.5030 0.8907 0.9438
No log 5.1765 440 0.8264 0.5014 0.8264 0.9091
No log 5.2 442 0.8297 0.5014 0.8297 0.9109
No log 5.2235 444 0.8509 0.5044 0.8509 0.9224
No log 5.2471 446 0.9003 0.4600 0.9003 0.9489
No log 5.2706 448 0.8923 0.4973 0.8923 0.9446
No log 5.2941 450 0.8424 0.5245 0.8424 0.9178
No log 5.3176 452 0.7977 0.5131 0.7977 0.8931
No log 5.3412 454 0.8196 0.5700 0.8196 0.9053
No log 5.3647 456 0.8891 0.5208 0.8891 0.9429
No log 5.3882 458 0.9145 0.5395 0.9145 0.9563
No log 5.4118 460 0.8861 0.5208 0.8861 0.9413
No log 5.4353 462 0.8680 0.5208 0.8680 0.9316
No log 5.4588 464 0.8648 0.5421 0.8648 0.9300
No log 5.4824 466 0.8571 0.5426 0.8571 0.9258
No log 5.5059 468 0.8885 0.5554 0.8885 0.9426
No log 5.5294 470 0.9792 0.5310 0.9792 0.9896
No log 5.5529 472 0.9805 0.5208 0.9805 0.9902
No log 5.5765 474 0.8410 0.5766 0.8410 0.9171
No log 5.6 476 0.7389 0.5523 0.7389 0.8596
No log 5.6235 478 0.7208 0.5701 0.7208 0.8490
No log 5.6471 480 0.7301 0.5483 0.7301 0.8545
No log 5.6706 482 0.7644 0.5718 0.7644 0.8743
No log 5.6941 484 0.7778 0.6052 0.7778 0.8819
No log 5.7176 486 0.7661 0.6052 0.7661 0.8753
No log 5.7412 488 0.7896 0.6154 0.7896 0.8886
No log 5.7647 490 0.7758 0.6052 0.7758 0.8808
No log 5.7882 492 0.7722 0.5874 0.7722 0.8788
No log 5.8118 494 0.7671 0.4444 0.7671 0.8759
No log 5.8353 496 0.7876 0.4540 0.7876 0.8875
No log 5.8588 498 0.8315 0.5727 0.8315 0.9118
0.3564 5.8824 500 0.9715 0.5743 0.9715 0.9857
0.3564 5.9059 502 1.0767 0.4894 1.0767 1.0376
0.3564 5.9294 504 1.0265 0.5334 1.0265 1.0132
0.3564 5.9529 506 0.9049 0.5649 0.9049 0.9513
0.3564 5.9765 508 0.8105 0.4256 0.8105 0.9003
0.3564 6.0 510 0.8126 0.4730 0.8126 0.9014
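In every row the validation loss and Mse columns are identical, which is consistent with an MSE regression objective; the Rmse column is simply its square root. A quick sanity check against the final row (epoch 6.0), using only the numbers reported above:

```python
import math

# Final-row eval metrics from the table above.
mse = 0.8126
rmse = math.sqrt(mse)
print(round(rmse, 4))  # 0.9014, matching the reported Rmse
```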

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k15_task2_organization

Fine-tuned from aubmindlab/bert-base-arabertv02 (one of 4019 fine-tunes of the base model).