ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k18_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7781
  • Qwk: 0.4746
  • Mse: 0.7781
  • Rmse: 0.8821

Model description

More information needed

Intended uses & limitations

More information needed
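
Although usage is not documented, the checkpoint can typically be loaded through the standard transformers API. A minimal sketch, assuming this repository id and a single-output regression head (suggested by the MSE-based evaluation); both assumptions should be checked against the uploaded config:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k18_task2_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "..."  # an Arabic essay to score for organization
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
# Under the regression-head assumption, the single logit is the score.
print(logits.squeeze().item())
```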

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
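
A sketch of how these settings map onto transformers TrainingArguments; dataset preparation and metric wiring are omitted, and the output directory name is hypothetical:

```python
from transformers import TrainingArguments

# Hypothetical output directory; the remaining values mirror the list above.
training_args = TrainingArguments(
    output_dir="arabert_task2_organization",
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```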

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0196 2 4.8267 0.0010 4.8267 2.1970
No log 0.0392 4 2.9287 0.0169 2.9287 1.7114
No log 0.0588 6 1.9113 0.1273 1.9113 1.3825
No log 0.0784 8 1.3324 -0.0284 1.3324 1.1543
No log 0.0980 10 1.2287 0.2386 1.2287 1.1085
No log 0.1176 12 1.0895 0.3965 1.0895 1.0438
No log 0.1373 14 1.0169 0.3570 1.0169 1.0084
No log 0.1569 16 1.0387 0.3291 1.0387 1.0191
No log 0.1765 18 1.0551 0.3256 1.0551 1.0272
No log 0.1961 20 1.0956 0.2750 1.0956 1.0467
No log 0.2157 22 1.1400 0.1832 1.1400 1.0677
No log 0.2353 24 0.9812 0.3756 0.9812 0.9906
No log 0.2549 26 1.1499 0.2704 1.1499 1.0723
No log 0.2745 28 1.0707 0.3747 1.0707 1.0348
No log 0.2941 30 0.9352 0.5094 0.9352 0.9670
No log 0.3137 32 0.9502 0.5498 0.9502 0.9748
No log 0.3333 34 0.9076 0.5523 0.9076 0.9527
No log 0.3529 36 0.9396 0.5585 0.9396 0.9693
No log 0.3725 38 1.0534 0.4337 1.0534 1.0263
No log 0.3922 40 0.9858 0.5383 0.9858 0.9929
No log 0.4118 42 0.8990 0.5896 0.8990 0.9482
No log 0.4314 44 0.8994 0.6041 0.8994 0.9484
No log 0.4510 46 0.8786 0.5902 0.8786 0.9373
No log 0.4706 48 0.8886 0.6016 0.8886 0.9426
No log 0.4902 50 1.1952 0.3317 1.1952 1.0933
No log 0.5098 52 1.7262 0.3347 1.7262 1.3138
No log 0.5294 54 1.6498 0.3355 1.6498 1.2844
No log 0.5490 56 1.1623 0.2940 1.1623 1.0781
No log 0.5686 58 0.9326 0.5168 0.9326 0.9657
No log 0.5882 60 1.1874 0.5040 1.1874 1.0897
No log 0.6078 62 1.0922 0.5065 1.0922 1.0451
No log 0.6275 64 0.8749 0.4676 0.8749 0.9353
No log 0.6471 66 1.0034 0.3339 1.0034 1.0017
No log 0.6667 68 1.1652 0.2341 1.1652 1.0795
No log 0.6863 70 1.0107 0.4191 1.0107 1.0053
No log 0.7059 72 0.8280 0.4852 0.8280 0.9100
No log 0.7255 74 0.9107 0.5276 0.9107 0.9543
No log 0.7451 76 1.0730 0.4851 1.0730 1.0359
No log 0.7647 78 0.9580 0.5398 0.9580 0.9788
No log 0.7843 80 0.7987 0.6216 0.7987 0.8937
No log 0.8039 82 0.9054 0.5164 0.9054 0.9515
No log 0.8235 84 0.9964 0.4567 0.9964 0.9982
No log 0.8431 86 0.8574 0.4964 0.8574 0.9260
No log 0.8627 88 0.8107 0.5997 0.8107 0.9004
No log 0.8824 90 1.0086 0.5298 1.0086 1.0043
No log 0.9020 92 1.0027 0.5491 1.0027 1.0013
No log 0.9216 94 0.8505 0.5498 0.8505 0.9222
No log 0.9412 96 0.8687 0.4582 0.8687 0.9321
No log 0.9608 98 0.8999 0.4794 0.8999 0.9486
No log 0.9804 100 0.8252 0.5958 0.8252 0.9084
No log 1.0 102 0.9826 0.5617 0.9826 0.9913
No log 1.0196 104 1.0973 0.5264 1.0973 1.0475
No log 1.0392 106 1.0025 0.5802 1.0025 1.0013
No log 1.0588 108 0.8115 0.6060 0.8115 0.9008
No log 1.0784 110 0.8085 0.6327 0.8085 0.8992
No log 1.0980 112 0.8104 0.6029 0.8104 0.9002
No log 1.1176 114 0.8717 0.5977 0.8717 0.9336
No log 1.1373 116 0.8501 0.5977 0.8501 0.9220
No log 1.1569 118 0.8095 0.5855 0.8095 0.8997
No log 1.1765 120 0.7794 0.6192 0.7794 0.8829
No log 1.1961 122 0.7840 0.5838 0.7840 0.8855
No log 1.2157 124 0.7986 0.5671 0.7986 0.8937
No log 1.2353 126 0.8243 0.5418 0.8243 0.9079
No log 1.2549 128 0.8383 0.5414 0.8383 0.9156
No log 1.2745 130 0.8373 0.5261 0.8373 0.9151
No log 1.2941 132 0.7802 0.4966 0.7802 0.8833
No log 1.3137 134 0.9531 0.5293 0.9531 0.9763
No log 1.3333 136 1.1174 0.4037 1.1174 1.0571
No log 1.3529 138 0.9110 0.5475 0.9110 0.9545
No log 1.3725 140 0.8010 0.5501 0.8010 0.8950
No log 1.3922 142 1.0149 0.5397 1.0149 1.0074
No log 1.4118 144 0.9865 0.5194 0.9865 0.9932
No log 1.4314 146 0.8711 0.5211 0.8711 0.9333
No log 1.4510 148 0.9015 0.4561 0.9015 0.9495
No log 1.4706 150 0.9738 0.4341 0.9738 0.9868
No log 1.4902 152 0.9180 0.4783 0.9180 0.9581
No log 1.5098 154 0.8255 0.5082 0.8255 0.9086
No log 1.5294 156 1.0309 0.5638 1.0309 1.0153
No log 1.5490 158 1.0569 0.5597 1.0569 1.0281
No log 1.5686 160 0.9671 0.5530 0.9671 0.9834
No log 1.5882 162 0.8206 0.5025 0.8206 0.9059
No log 1.6078 164 0.8133 0.6139 0.8133 0.9018
No log 1.6275 166 0.8269 0.6310 0.8269 0.9094
No log 1.6471 168 0.7818 0.5680 0.7818 0.8842
No log 1.6667 170 0.8723 0.5780 0.8723 0.9340
No log 1.6863 172 0.9701 0.5439 0.9701 0.9849
No log 1.7059 174 0.9340 0.5551 0.9340 0.9664
No log 1.7255 176 0.7960 0.5580 0.7960 0.8922
No log 1.7451 178 0.7494 0.6012 0.7494 0.8657
No log 1.7647 180 1.0327 0.4974 1.0327 1.0162
No log 1.7843 182 1.0872 0.4890 1.0872 1.0427
No log 1.8039 184 0.9185 0.5316 0.9185 0.9584
No log 1.8235 186 0.7426 0.5025 0.7426 0.8618
No log 1.8431 188 0.8253 0.5322 0.8253 0.9084
No log 1.8627 190 0.8906 0.5286 0.8906 0.9437
No log 1.8824 192 0.7752 0.5098 0.7752 0.8804
No log 1.9020 194 0.8939 0.4957 0.8939 0.9455
No log 1.9216 196 1.1311 0.3648 1.1311 1.0635
No log 1.9412 198 1.0809 0.4171 1.0809 1.0397
No log 1.9608 200 0.8397 0.4542 0.8397 0.9164
No log 1.9804 202 0.9337 0.5310 0.9337 0.9663
No log 2.0 204 1.4950 0.4120 1.4950 1.2227
No log 2.0196 206 1.7966 0.3363 1.7966 1.3404
No log 2.0392 208 1.5242 0.4092 1.5242 1.2346
No log 2.0588 210 1.1110 0.4371 1.1110 1.0540
No log 2.0784 212 0.8307 0.3674 0.8307 0.9114
No log 2.0980 214 0.7801 0.4401 0.7801 0.8832
No log 2.1176 216 0.7527 0.4874 0.7527 0.8676
No log 2.1373 218 0.7844 0.5426 0.7844 0.8856
No log 2.1569 220 0.9282 0.5283 0.9282 0.9635
No log 2.1765 222 0.9988 0.5514 0.9988 0.9994
No log 2.1961 224 0.8481 0.5707 0.8481 0.9209
No log 2.2157 226 0.7802 0.6562 0.7802 0.8833
No log 2.2353 228 0.8033 0.6389 0.8033 0.8963
No log 2.2549 230 0.7574 0.6651 0.7574 0.8703
No log 2.2745 232 0.7596 0.5940 0.7596 0.8716
No log 2.2941 234 0.8575 0.5431 0.8575 0.9260
No log 2.3137 236 0.9444 0.5707 0.9444 0.9718
No log 2.3333 238 0.9334 0.5399 0.9334 0.9661
No log 2.3529 240 0.8443 0.5255 0.8443 0.9189
No log 2.3725 242 0.7564 0.5211 0.7564 0.8697
No log 2.3922 244 0.7112 0.5711 0.7112 0.8433
No log 2.4118 246 0.7108 0.6487 0.7108 0.8431
No log 2.4314 248 0.7387 0.5713 0.7387 0.8595
No log 2.4510 250 0.8276 0.5164 0.8276 0.9097
No log 2.4706 252 0.8287 0.5000 0.8287 0.9104
No log 2.4902 254 0.7847 0.5040 0.7847 0.8858
No log 2.5098 256 0.7994 0.4465 0.7994 0.8941
No log 2.5294 258 0.8156 0.3948 0.8156 0.9031
No log 2.5490 260 0.8783 0.4685 0.8783 0.9372
No log 2.5686 262 0.9166 0.5130 0.9166 0.9574
No log 2.5882 264 0.8697 0.4676 0.8697 0.9326
No log 2.6078 266 0.8195 0.5086 0.8195 0.9052
No log 2.6275 268 0.8240 0.4902 0.8240 0.9077
No log 2.6471 270 0.8226 0.5307 0.8226 0.9070
No log 2.6667 272 0.8216 0.5238 0.8216 0.9064
No log 2.6863 274 0.8475 0.5161 0.8475 0.9206
No log 2.7059 276 0.8836 0.4945 0.8836 0.9400
No log 2.7255 278 0.8841 0.5109 0.8841 0.9403
No log 2.7451 280 0.8645 0.4886 0.8645 0.9298
No log 2.7647 282 0.8714 0.4869 0.8714 0.9335
No log 2.7843 284 0.8970 0.5042 0.8970 0.9471
No log 2.8039 286 0.8600 0.4583 0.8600 0.9274
No log 2.8235 288 0.8278 0.4698 0.8278 0.9098
No log 2.8431 290 0.8148 0.5211 0.8148 0.9027
No log 2.8627 292 0.7919 0.4352 0.7919 0.8899
No log 2.8824 294 0.7925 0.4086 0.7925 0.8902
No log 2.9020 296 0.8008 0.4534 0.8008 0.8949
No log 2.9216 298 0.8185 0.5131 0.8185 0.9047
No log 2.9412 300 0.8024 0.4726 0.8024 0.8958
No log 2.9608 302 0.7922 0.3925 0.7922 0.8901
No log 2.9804 304 0.7922 0.4449 0.7922 0.8901
No log 3.0 306 0.7925 0.5241 0.7925 0.8902
No log 3.0196 308 0.7986 0.5386 0.7986 0.8936
No log 3.0392 310 0.8180 0.5098 0.8180 0.9044
No log 3.0588 312 0.8268 0.5261 0.8268 0.9093
No log 3.0784 314 0.8097 0.5411 0.8097 0.8999
No log 3.0980 316 0.7878 0.5660 0.7878 0.8876
No log 3.1176 318 0.7577 0.5167 0.7577 0.8705
No log 3.1373 320 0.7439 0.5463 0.7439 0.8625
No log 3.1569 322 0.7325 0.5633 0.7325 0.8559
No log 3.1765 324 0.7698 0.5914 0.7698 0.8774
No log 3.1961 326 0.8138 0.6043 0.8138 0.9021
No log 3.2157 328 0.8415 0.6111 0.8415 0.9173
No log 3.2353 330 0.9020 0.6064 0.9020 0.9497
No log 3.2549 332 0.9449 0.6011 0.9449 0.9721
No log 3.2745 334 0.8077 0.6382 0.8077 0.8987
No log 3.2941 336 0.7352 0.5914 0.7352 0.8574
No log 3.3137 338 0.6972 0.6151 0.6972 0.8350
No log 3.3333 340 0.7086 0.5912 0.7086 0.8418
No log 3.3529 342 0.7059 0.5712 0.7059 0.8402
No log 3.3725 344 0.7031 0.5467 0.7031 0.8385
No log 3.3922 346 0.7113 0.5800 0.7113 0.8434
No log 3.4118 348 0.7667 0.5777 0.7667 0.8756
No log 3.4314 350 0.9355 0.5297 0.9355 0.9672
No log 3.4510 352 0.9825 0.5247 0.9825 0.9912
No log 3.4706 354 0.8900 0.4641 0.8900 0.9434
No log 3.4902 356 0.7961 0.4920 0.7961 0.8922
No log 3.5098 358 0.7459 0.4737 0.7459 0.8637
No log 3.5294 360 0.7261 0.4397 0.7261 0.8521
No log 3.5490 362 0.7266 0.4643 0.7266 0.8524
No log 3.5686 364 0.7242 0.4626 0.7242 0.8510
No log 3.5882 366 0.7711 0.6142 0.7711 0.8781
No log 3.6078 368 0.8476 0.6043 0.8476 0.9207
No log 3.6275 370 0.8592 0.5670 0.8592 0.9269
No log 3.6471 372 0.8139 0.5586 0.8139 0.9022
No log 3.6667 374 0.7766 0.3738 0.7766 0.8812
No log 3.6863 376 0.8165 0.5060 0.8165 0.9036
No log 3.7059 378 0.9360 0.4733 0.9360 0.9675
No log 3.7255 380 0.9232 0.4444 0.9232 0.9608
No log 3.7451 382 0.8326 0.3956 0.8326 0.9125
No log 3.7647 384 0.8691 0.4704 0.8691 0.9322
No log 3.7843 386 0.9583 0.4946 0.9583 0.9789
No log 3.8039 388 0.9934 0.4610 0.9934 0.9967
No log 3.8235 390 1.0024 0.4733 1.0024 1.0012
No log 3.8431 392 0.9064 0.4527 0.9064 0.9521
No log 3.8627 394 0.8511 0.4413 0.8511 0.9225
No log 3.8824 396 0.8331 0.4498 0.8331 0.9127
No log 3.9020 398 0.8278 0.4498 0.8278 0.9098
No log 3.9216 400 0.8448 0.4620 0.8448 0.9191
No log 3.9412 402 0.9217 0.5083 0.9217 0.9600
No log 3.9608 404 0.9355 0.5340 0.9355 0.9672
No log 3.9804 406 0.8630 0.5042 0.8630 0.9290
No log 4.0 408 0.7869 0.4540 0.7869 0.8871
No log 4.0196 410 0.7582 0.4401 0.7582 0.8707
No log 4.0392 412 0.7744 0.5468 0.7744 0.8800
No log 4.0588 414 0.7428 0.4957 0.7428 0.8619
No log 4.0784 416 0.7178 0.5922 0.7178 0.8472
No log 4.0980 418 0.7583 0.5411 0.7583 0.8708
No log 4.1176 420 0.7877 0.5869 0.7877 0.8875
No log 4.1373 422 0.8287 0.5759 0.8287 0.9104
No log 4.1569 424 0.8326 0.5331 0.8326 0.9125
No log 4.1765 426 0.8373 0.5136 0.8373 0.9150
No log 4.1961 428 0.8236 0.5474 0.8236 0.9075
No log 4.2157 430 0.8091 0.5287 0.8091 0.8995
No log 4.2353 432 0.8249 0.5635 0.8249 0.9082
No log 4.2549 434 0.8598 0.5253 0.8598 0.9273
No log 4.2745 436 0.8449 0.5318 0.8449 0.9192
No log 4.2941 438 0.8422 0.4479 0.8422 0.9177
No log 4.3137 440 0.8613 0.3608 0.8613 0.9281
No log 4.3333 442 0.8865 0.3437 0.8865 0.9415
No log 4.3529 444 0.9008 0.3652 0.9008 0.9491
No log 4.3725 446 0.9479 0.4165 0.9479 0.9736
No log 4.3922 448 0.9773 0.4576 0.9773 0.9886
No log 4.4118 450 0.9462 0.5218 0.9462 0.9727
No log 4.4314 452 0.8800 0.4907 0.8800 0.9381
No log 4.4510 454 0.8494 0.5028 0.8494 0.9217
No log 4.4706 456 0.8286 0.5311 0.8286 0.9103
No log 4.4902 458 0.8214 0.5495 0.8214 0.9063
No log 4.5098 460 0.8389 0.5625 0.8389 0.9159
No log 4.5294 462 0.8101 0.5625 0.8101 0.9001
No log 4.5490 464 0.7676 0.5011 0.7676 0.8762
No log 4.5686 466 0.7632 0.4671 0.7632 0.8736
No log 4.5882 468 0.7751 0.4841 0.7751 0.8804
No log 4.6078 470 0.7593 0.4941 0.7593 0.8714
No log 4.6275 472 0.7309 0.4908 0.7309 0.8549
No log 4.6471 474 0.7510 0.5211 0.7510 0.8666
No log 4.6667 476 0.7660 0.5585 0.7660 0.8752
No log 4.6863 478 0.7367 0.4662 0.7367 0.8583
No log 4.7059 480 0.7356 0.4834 0.7356 0.8577
No log 4.7255 482 0.7473 0.5749 0.7473 0.8645
No log 4.7451 484 0.7887 0.5979 0.7887 0.8881
No log 4.7647 486 0.8572 0.5365 0.8572 0.9259
No log 4.7843 488 0.9434 0.5415 0.9434 0.9713
No log 4.8039 490 0.9710 0.5032 0.9710 0.9854
No log 4.8235 492 0.9354 0.4889 0.9354 0.9672
No log 4.8431 494 0.8744 0.5303 0.8744 0.9351
No log 4.8627 496 0.8132 0.4637 0.8132 0.9018
No log 4.8824 498 0.7832 0.4634 0.7832 0.8850
0.3541 4.9020 500 0.7805 0.5098 0.7805 0.8835
0.3541 4.9216 502 0.7797 0.5815 0.7797 0.8830
0.3541 4.9412 504 0.8091 0.5588 0.8091 0.8995
0.3541 4.9608 506 0.8435 0.5624 0.8435 0.9184
0.3541 4.9804 508 0.8850 0.5660 0.8850 0.9407
0.3541 5.0 510 0.8802 0.5710 0.8802 0.9382
0.3541 5.0196 512 0.8441 0.5881 0.8441 0.9188
0.3541 5.0392 514 0.8122 0.3596 0.8122 0.9012
0.3541 5.0588 516 0.7916 0.3738 0.7916 0.8897
0.3541 5.0784 518 0.7847 0.4019 0.7847 0.8858
0.3541 5.0980 520 0.7732 0.3879 0.7732 0.8793
0.3541 5.1176 522 0.7781 0.4746 0.7781 0.8821

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1