ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k10_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9596
  • Qwk (quadratic weighted kappa): 0.4561
  • Mse (mean squared error): 0.9596
  • Rmse (root mean squared error): 0.9796
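The Qwk, Mse, and Rmse values reported above can be reproduced from raw model outputs; a minimal sketch using NumPy and scikit-learn (the function and variable names are illustrative, not taken from the training code):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

def eval_metrics(y_true, y_pred):
    """Compute the three metrics reported on this card.

    QWK (quadratic weighted kappa) is computed on rounded integer
    scores; MSE and RMSE on the raw regression outputs.
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mse = float(np.mean((y_true - y_pred) ** 2))
    qwk = cohen_kappa_score(
        y_true.round().astype(int),
        y_pred.round().astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": mse ** 0.5}
```

Note that Mse equals Loss here, which suggests the model was trained with an MSE regression objective, and Rmse is simply its square root.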

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
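A sketch of how the list above maps onto Hugging Face `TrainingArguments` (the `output_dir` path is a placeholder, and the dataset/model wiring is omitted; the Adam betas and epsilon listed are the optimizer defaults in transformers, so they need no explicit arguments):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameter list above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="arabert_task2_organization",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```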

Training results

Training loss is logged every 500 steps (the transformers Trainer default), so rows before step 500 show "No log".

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0351 2 4.7104 -0.0103 4.7104 2.1704
No log 0.0702 4 2.9829 -0.0411 2.9829 1.7271
No log 0.1053 6 1.7354 0.0629 1.7354 1.3174
No log 0.1404 8 1.3510 0.0101 1.3510 1.1623
No log 0.1754 10 1.2191 0.1043 1.2191 1.1041
No log 0.2105 12 1.2185 0.1693 1.2185 1.1039
No log 0.2456 14 1.1925 0.1875 1.1925 1.0920
No log 0.2807 16 1.1687 0.1649 1.1687 1.0811
No log 0.3158 18 1.1280 0.1711 1.1280 1.0621
No log 0.3509 20 1.1002 0.2028 1.1002 1.0489
No log 0.3860 22 1.0800 0.2276 1.0800 1.0392
No log 0.4211 24 1.0060 0.2692 1.0060 1.0030
No log 0.4561 26 1.0134 0.3017 1.0134 1.0067
No log 0.4912 28 0.9914 0.3309 0.9914 0.9957
No log 0.5263 30 1.1158 0.4439 1.1158 1.0563
No log 0.5614 32 1.3107 0.1807 1.3107 1.1448
No log 0.5965 34 1.2968 0.1700 1.2968 1.1388
No log 0.6316 36 1.0700 0.3820 1.0700 1.0344
No log 0.6667 38 0.9748 0.3707 0.9748 0.9873
No log 0.7018 40 0.9999 0.2431 0.9999 0.9999
No log 0.7368 42 0.9922 0.2944 0.9922 0.9961
No log 0.7719 44 0.9780 0.3952 0.9780 0.9889
No log 0.8070 46 0.9651 0.5115 0.9651 0.9824
No log 0.8421 48 1.1099 0.2875 1.1099 1.0535
No log 0.8772 50 1.1099 0.3202 1.1099 1.0535
No log 0.9123 52 0.9922 0.3902 0.9922 0.9961
No log 0.9474 54 0.9055 0.4902 0.9055 0.9516
No log 0.9825 56 0.9290 0.4181 0.9290 0.9639
No log 1.0175 58 0.9367 0.5110 0.9367 0.9678
No log 1.0526 60 0.9824 0.4909 0.9824 0.9911
No log 1.0877 62 0.9516 0.5220 0.9516 0.9755
No log 1.1228 64 1.0468 0.3877 1.0468 1.0231
No log 1.1579 66 1.1323 0.3090 1.1323 1.0641
No log 1.1930 68 0.9604 0.5489 0.9604 0.9800
No log 1.2281 70 0.9413 0.4772 0.9413 0.9702
No log 1.2632 72 1.1296 0.4560 1.1296 1.0628
No log 1.2982 74 1.0651 0.4560 1.0651 1.0320
No log 1.3333 76 0.9232 0.5352 0.9232 0.9609
No log 1.3684 78 0.8312 0.6350 0.8312 0.9117
No log 1.4035 80 1.0369 0.3534 1.0369 1.0183
No log 1.4386 82 1.0807 0.3570 1.0807 1.0395
No log 1.4737 84 0.9025 0.5592 0.9025 0.9500
No log 1.5088 86 0.7972 0.5484 0.7972 0.8929
No log 1.5439 88 0.9644 0.5451 0.9644 0.9820
No log 1.5789 90 1.0389 0.4941 1.0389 1.0192
No log 1.6140 92 0.9276 0.5389 0.9276 0.9631
No log 1.6491 94 0.8359 0.5601 0.8359 0.9143
No log 1.6842 96 0.7871 0.6118 0.7871 0.8872
No log 1.7193 98 0.7915 0.5633 0.7915 0.8897
No log 1.7544 100 0.7784 0.6107 0.7784 0.8823
No log 1.7895 102 0.7543 0.5706 0.7543 0.8685
No log 1.8246 104 0.7705 0.6241 0.7705 0.8778
No log 1.8596 106 0.7743 0.6160 0.7743 0.8799
No log 1.8947 108 0.8225 0.6199 0.8225 0.9069
No log 1.9298 110 0.9895 0.4966 0.9895 0.9947
No log 1.9649 112 1.0306 0.4966 1.0306 1.0152
No log 2.0 114 0.8753 0.6316 0.8753 0.9356
No log 2.0351 116 0.8386 0.5735 0.8386 0.9157
No log 2.0702 118 0.9368 0.5791 0.9368 0.9679
No log 2.1053 120 1.1347 0.5993 1.1347 1.0652
No log 2.1404 122 1.0916 0.5993 1.0916 1.0448
No log 2.1754 124 0.9979 0.5993 0.9979 0.9990
No log 2.2105 126 0.7759 0.5961 0.7759 0.8808
No log 2.2456 128 0.7323 0.6249 0.7323 0.8557
No log 2.2807 130 0.7542 0.5370 0.7542 0.8684
No log 2.3158 132 0.7286 0.5563 0.7286 0.8536
No log 2.3509 134 0.7115 0.6131 0.7115 0.8435
No log 2.3860 136 0.7295 0.6064 0.7295 0.8541
No log 2.4211 138 0.7354 0.6097 0.7354 0.8576
No log 2.4561 140 0.7549 0.5698 0.7549 0.8689
No log 2.4912 142 0.7974 0.5455 0.7974 0.8930
No log 2.5263 144 0.7847 0.5714 0.7847 0.8858
No log 2.5614 146 0.7617 0.6652 0.7617 0.8727
No log 2.5965 148 0.7616 0.6076 0.7616 0.8727
No log 2.6316 150 0.7831 0.6453 0.7831 0.8849
No log 2.6667 152 0.8240 0.5517 0.8240 0.9078
No log 2.7018 154 0.7686 0.5376 0.7686 0.8767
No log 2.7368 156 0.8188 0.5618 0.8188 0.9049
No log 2.7719 158 0.8200 0.5291 0.8200 0.9056
No log 2.8070 160 0.7846 0.5150 0.7846 0.8858
No log 2.8421 162 0.7803 0.4893 0.7803 0.8833
No log 2.8772 164 0.7223 0.4865 0.7223 0.8499
No log 2.9123 166 0.7316 0.5495 0.7316 0.8553
No log 2.9474 168 0.6927 0.5570 0.6927 0.8323
No log 2.9825 170 0.7072 0.5741 0.7072 0.8409
No log 3.0175 172 0.7125 0.5766 0.7125 0.8441
No log 3.0526 174 0.8012 0.5741 0.8012 0.8951
No log 3.0877 176 0.9017 0.5978 0.9017 0.9496
No log 3.1228 178 1.0675 0.4992 1.0675 1.0332
No log 3.1579 180 1.0942 0.4992 1.0942 1.0460
No log 3.1930 182 0.8267 0.5769 0.8267 0.9092
No log 3.2281 184 0.7126 0.6167 0.7126 0.8441
No log 3.2632 186 0.7224 0.5785 0.7224 0.8499
No log 3.2982 188 0.7212 0.5648 0.7212 0.8492
No log 3.3333 190 0.8370 0.5370 0.8370 0.9149
No log 3.3684 192 0.9362 0.5066 0.9362 0.9676
No log 3.4035 194 1.0738 0.4894 1.0738 1.0362
No log 3.4386 196 1.1871 0.4355 1.1871 1.0895
No log 3.4737 198 1.0335 0.5129 1.0335 1.0166
No log 3.5088 200 0.7980 0.5498 0.7980 0.8933
No log 3.5439 202 0.7557 0.5295 0.7557 0.8693
No log 3.5789 204 0.7788 0.5635 0.7788 0.8825
No log 3.6140 206 0.7556 0.5458 0.7556 0.8693
No log 3.6491 208 0.7486 0.5724 0.7486 0.8652
No log 3.6842 210 0.7911 0.5503 0.7911 0.8894
No log 3.7193 212 0.7896 0.5618 0.7896 0.8886
No log 3.7544 214 0.7251 0.6129 0.7251 0.8515
No log 3.7895 216 0.7506 0.5720 0.7506 0.8664
No log 3.8246 218 0.7918 0.5864 0.7918 0.8899
No log 3.8596 220 0.8050 0.5810 0.8050 0.8972
No log 3.8947 222 0.7533 0.6044 0.7533 0.8679
No log 3.9298 224 0.7533 0.5988 0.7533 0.8679
No log 3.9649 226 0.7661 0.5397 0.7661 0.8753
No log 4.0 228 0.7560 0.5443 0.7560 0.8695
No log 4.0351 230 0.7619 0.5396 0.7619 0.8729
No log 4.0702 232 0.7802 0.5481 0.7802 0.8833
No log 4.1053 234 0.8995 0.5451 0.8995 0.9484
No log 4.1404 236 0.9422 0.4840 0.9422 0.9707
No log 4.1754 238 0.8497 0.5386 0.8497 0.9218
No log 4.2105 240 0.7976 0.3977 0.7976 0.8931
No log 4.2456 242 0.8621 0.5219 0.8621 0.9285
No log 4.2807 244 0.8164 0.4421 0.8164 0.9035
No log 4.3158 246 0.7655 0.5621 0.7655 0.8749
No log 4.3509 248 0.7835 0.5391 0.7835 0.8852
No log 4.3860 250 0.8498 0.5168 0.8498 0.9219
No log 4.4211 252 0.8577 0.5168 0.8577 0.9261
No log 4.4561 254 0.8430 0.5660 0.8430 0.9181
No log 4.4912 256 0.8613 0.5324 0.8613 0.9281
No log 4.5263 258 0.8513 0.4757 0.8513 0.9227
No log 4.5614 260 0.9214 0.4646 0.9214 0.9599
No log 4.5965 262 0.9961 0.5056 0.9961 0.9980
No log 4.6316 264 0.9442 0.4323 0.9442 0.9717
No log 4.6667 266 0.8830 0.4902 0.8830 0.9397
No log 4.7018 268 0.8738 0.5042 0.8738 0.9348
No log 4.7368 270 0.8730 0.4871 0.8730 0.9344
No log 4.7719 272 0.8845 0.4366 0.8845 0.9405
No log 4.8070 274 0.9622 0.4468 0.9622 0.9809
No log 4.8421 276 0.9995 0.4521 0.9995 0.9997
No log 4.8772 278 0.9545 0.4857 0.9545 0.9770
No log 4.9123 280 0.8638 0.3809 0.8638 0.9294
No log 4.9474 282 0.7729 0.5026 0.7729 0.8791
No log 4.9825 284 0.7632 0.5291 0.7632 0.8736
No log 5.0175 286 0.7601 0.5581 0.7601 0.8718
No log 5.0526 288 0.7780 0.4966 0.7780 0.8820
No log 5.0877 290 0.8292 0.5521 0.8292 0.9106
No log 5.1228 292 0.8048 0.5194 0.8048 0.8971
No log 5.1579 294 0.7817 0.5025 0.7817 0.8841
No log 5.1930 296 0.7816 0.4739 0.7816 0.8841
No log 5.2281 298 0.7948 0.5194 0.7948 0.8915
No log 5.2632 300 0.8309 0.5239 0.8309 0.9115
No log 5.2982 302 0.8595 0.4998 0.8595 0.9271
No log 5.3333 304 0.8487 0.5201 0.8487 0.9212
No log 5.3684 306 0.8034 0.5570 0.8034 0.8963
No log 5.4035 308 0.7882 0.5828 0.7882 0.8878
No log 5.4386 310 0.7576 0.5706 0.7576 0.8704
No log 5.4737 312 0.7590 0.5743 0.7590 0.8712
No log 5.5088 314 0.8441 0.5287 0.8441 0.9187
No log 5.5439 316 0.8050 0.5228 0.8050 0.8972
No log 5.5789 318 0.7487 0.6137 0.7487 0.8653
No log 5.6140 320 0.8058 0.5706 0.8058 0.8977
No log 5.6491 322 0.8552 0.5614 0.8552 0.9248
No log 5.6842 324 0.8047 0.5610 0.8047 0.8971
No log 5.7193 326 0.7827 0.5380 0.7827 0.8847
No log 5.7544 328 0.7628 0.5455 0.7628 0.8734
No log 5.7895 330 0.7562 0.5455 0.7562 0.8696
No log 5.8246 332 0.7784 0.5447 0.7784 0.8823
No log 5.8596 334 0.8127 0.5306 0.8127 0.9015
No log 5.8947 336 0.8030 0.5721 0.8030 0.8961
No log 5.9298 338 0.7831 0.6111 0.7831 0.8849
No log 5.9649 340 0.7882 0.5972 0.7882 0.8878
No log 6.0 342 0.7985 0.5965 0.7985 0.8936
No log 6.0351 344 0.8115 0.5867 0.8115 0.9008
No log 6.0702 346 0.8357 0.5532 0.8357 0.9142
No log 6.1053 348 0.7865 0.5674 0.7865 0.8868
No log 6.1404 350 0.7898 0.5303 0.7898 0.8887
No log 6.1754 352 0.7920 0.5107 0.7920 0.8900
No log 6.2105 354 0.7568 0.4695 0.7568 0.8700
No log 6.2456 356 0.7621 0.5167 0.7621 0.8730
No log 6.2807 358 0.7776 0.5438 0.7776 0.8818
No log 6.3158 360 0.8236 0.5837 0.8236 0.9075
No log 6.3509 362 0.7857 0.5205 0.7857 0.8864
No log 6.3860 364 0.7869 0.4181 0.7869 0.8871
No log 6.4211 366 0.7960 0.4470 0.7960 0.8922
No log 6.4561 368 0.8076 0.5346 0.8076 0.8987
No log 6.4912 370 0.8458 0.5513 0.8458 0.9197
No log 6.5263 372 0.8476 0.5490 0.8476 0.9207
No log 6.5614 374 0.8248 0.5447 0.8248 0.9082
No log 6.5965 376 0.8171 0.5675 0.8171 0.9039
No log 6.6316 378 0.8892 0.5649 0.8892 0.9430
No log 6.6667 380 0.8954 0.5528 0.8954 0.9462
No log 6.7018 382 0.8826 0.5724 0.8826 0.9395
No log 6.7368 384 0.8485 0.5672 0.8485 0.9212
No log 6.7719 386 0.9248 0.5360 0.9248 0.9616
No log 6.8070 388 0.9509 0.5360 0.9509 0.9751
No log 6.8421 390 0.9931 0.4767 0.9931 0.9965
No log 6.8772 392 0.8733 0.5465 0.8733 0.9345
No log 6.9123 394 0.7695 0.5766 0.7695 0.8772
No log 6.9474 396 0.7686 0.4977 0.7686 0.8767
No log 6.9825 398 0.8048 0.5041 0.8048 0.8971
No log 7.0175 400 0.8666 0.5347 0.8666 0.9309
No log 7.0526 402 0.8306 0.4197 0.8306 0.9114
No log 7.0877 404 0.8411 0.3974 0.8411 0.9171
No log 7.1228 406 0.8409 0.3552 0.8409 0.9170
No log 7.1579 408 0.8684 0.4429 0.8684 0.9319
No log 7.1930 410 0.9146 0.4435 0.9146 0.9564
No log 7.2281 412 0.8444 0.4398 0.8444 0.9189
No log 7.2632 414 0.7496 0.4912 0.7496 0.8658
No log 7.2982 416 0.7075 0.5236 0.7075 0.8411
No log 7.3333 418 0.6859 0.6002 0.6859 0.8282
No log 7.3684 420 0.7057 0.6526 0.7057 0.8401
No log 7.4035 422 0.7231 0.5961 0.7231 0.8504
No log 7.4386 424 0.7616 0.6012 0.7616 0.8727
No log 7.4737 426 0.7617 0.6151 0.7617 0.8727
No log 7.5088 428 0.7567 0.5928 0.7567 0.8699
No log 7.5439 430 0.7373 0.5526 0.7373 0.8587
No log 7.5789 432 0.7453 0.5526 0.7453 0.8633
No log 7.6140 434 0.7651 0.5283 0.7651 0.8747
No log 7.6491 436 0.8343 0.5427 0.8343 0.9134
No log 7.6842 438 0.8792 0.5370 0.8792 0.9376
No log 7.7193 440 0.8485 0.5370 0.8485 0.9212
No log 7.7544 442 0.7610 0.5675 0.7610 0.8724
No log 7.7895 444 0.7539 0.5324 0.7539 0.8683
No log 7.8246 446 0.8171 0.5374 0.8171 0.9040
No log 7.8596 448 0.9063 0.5029 0.9063 0.9520
No log 7.8947 450 0.9324 0.4806 0.9324 0.9656
No log 7.9298 452 0.8833 0.5029 0.8833 0.9398
No log 7.9649 454 0.8059 0.4968 0.8059 0.8977
No log 8.0 456 0.7409 0.5922 0.7409 0.8607
No log 8.0351 458 0.7362 0.5271 0.7362 0.8580
No log 8.0702 460 0.7379 0.5343 0.7379 0.8590
No log 8.1053 462 0.7775 0.5563 0.7775 0.8818
No log 8.1404 464 0.8781 0.5448 0.8781 0.9371
No log 8.1754 466 0.9135 0.5570 0.9135 0.9558
No log 8.2105 468 0.8739 0.5276 0.8739 0.9348
No log 8.2456 470 0.7925 0.5727 0.7925 0.8902
No log 8.2807 472 0.7588 0.5184 0.7588 0.8711
No log 8.3158 474 0.7518 0.5387 0.7518 0.8671
No log 8.3509 476 0.7546 0.5763 0.7546 0.8687
No log 8.3860 478 0.7903 0.5810 0.7903 0.8890
No log 8.4211 480 0.9387 0.5414 0.9387 0.9689
No log 8.4561 482 1.0112 0.5148 1.0112 1.0056
No log 8.4912 484 0.9119 0.5255 0.9119 0.9549
No log 8.5263 486 0.8167 0.5462 0.8167 0.9037
No log 8.5614 488 0.7914 0.5438 0.7914 0.8896
No log 8.5965 490 0.7531 0.5495 0.7531 0.8678
No log 8.6316 492 0.7330 0.5368 0.7330 0.8561
No log 8.6667 494 0.7155 0.6052 0.7155 0.8459
No log 8.7018 496 0.7222 0.6154 0.7222 0.8498
No log 8.7368 498 0.7403 0.6020 0.7403 0.8604
0.3447 8.7719 500 0.7193 0.6233 0.7193 0.8481
0.3447 8.8070 502 0.6817 0.6441 0.6817 0.8256
0.3447 8.8421 504 0.6644 0.6362 0.6644 0.8151
0.3447 8.8772 506 0.6712 0.6362 0.6712 0.8193
0.3447 8.9123 508 0.7034 0.6356 0.7034 0.8387
0.3447 8.9474 510 0.7803 0.5712 0.7803 0.8834
0.3447 8.9825 512 0.8689 0.5297 0.8689 0.9321
0.3447 9.0175 514 0.9994 0.4740 0.9994 0.9997
0.3447 9.0526 516 1.0228 0.4819 1.0228 1.0113
0.3447 9.0877 518 0.9596 0.4561 0.9596 0.9796

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B parameters (safetensors, F32)

Full model ID: MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k10_task2_organization (fine-tuned from aubmindlab/bert-base-arabertv02)