ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k9_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02. The training dataset is not specified. It achieves the following results on the evaluation set:

  • Loss: 0.7876
  • Qwk (Quadratic Weighted Kappa): 0.3983
  • Mse (Mean Squared Error): 0.7876
  • Rmse (Root Mean Squared Error): 0.8875
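Qwk is Quadratic Weighted Kappa, a chance-corrected agreement measure for ordinal labels such as essay-organization scores; MSE and RMSE treat the same scores as numbers. A minimal sketch of how these metrics can be computed with scikit-learn (the label values below are illustrative, not taken from the actual evaluation set):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Illustrative gold and predicted organization scores (not real data).
y_true = np.array([0, 1, 2, 3, 2, 1])
y_pred = np.array([0, 2, 2, 3, 1, 1])

# Quadratic Weighted Kappa: agreement on ordinal labels, with
# disagreements penalized by the squared distance between scores.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

# MSE / RMSE treat the ordinal scores as plain numbers.
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)

print(qwk, mse, rmse)
```

Note that Loss and Mse coincide in the results above, which is consistent with an MSE training objective (i.e. the score treated as a regression target).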

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
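These settings map directly onto `transformers.TrainingArguments`. A hedged sketch of how they could be expressed (`output_dir` is a hypothetical name, not stated in the card; the Adam betas/epsilon listed above are the library defaults):

```python
from transformers import TrainingArguments

# Hyperparameters from the card, expressed as TrainingArguments.
# output_dir is hypothetical; evaluation/logging cadence is not
# specified in the card and is left at library defaults.
training_args = TrainingArguments(
    output_dir="arabert-organization",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,       # Adam defaults, as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```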

Training results

The table below reports evaluation metrics every two training steps. "No log" in the first column means the running training loss had not yet been logged at that step (it is reported every 500 steps).

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0392 2 4.5916 0.0010 4.5916 2.1428
No log 0.0784 4 3.0257 -0.0314 3.0257 1.7395
No log 0.1176 6 1.6877 0.1273 1.6877 1.2991
No log 0.1569 8 1.2342 0.1870 1.2342 1.1109
No log 0.1961 10 1.1419 0.2293 1.1419 1.0686
No log 0.2353 12 1.0947 0.3307 1.0947 1.0463
No log 0.2745 14 1.3562 0.0936 1.3562 1.1646
No log 0.3137 16 1.2980 0.1364 1.2980 1.1393
No log 0.3529 18 1.2264 0.1857 1.2264 1.1074
No log 0.3922 20 0.9109 0.3948 0.9109 0.9544
No log 0.4314 22 0.8963 0.4241 0.8963 0.9467
No log 0.4706 24 0.9749 0.3595 0.9749 0.9874
No log 0.5098 26 1.1454 0.3523 1.1454 1.0702
No log 0.5490 28 1.3171 0.3346 1.3171 1.1477
No log 0.5882 30 1.1810 0.3967 1.1810 1.0867
No log 0.6275 32 1.0645 0.4718 1.0645 1.0317
No log 0.6667 34 0.9765 0.5776 0.9765 0.9882
No log 0.7059 36 0.8766 0.5550 0.8766 0.9363
No log 0.7451 38 0.8678 0.6402 0.8678 0.9316
No log 0.7843 40 1.0333 0.3730 1.0333 1.0165
No log 0.8235 42 1.1000 0.3757 1.1000 1.0488
No log 0.8627 44 1.0740 0.3874 1.0740 1.0363
No log 0.9020 46 0.8526 0.6128 0.8526 0.9234
No log 0.9412 48 0.7434 0.5909 0.7434 0.8622
No log 0.9804 50 0.7459 0.5223 0.7459 0.8637
No log 1.0196 52 0.7855 0.6199 0.7855 0.8863
No log 1.0588 54 1.0658 0.3585 1.0658 1.0324
No log 1.0980 56 1.5943 0.3794 1.5943 1.2626
No log 1.1373 58 1.5483 0.4004 1.5483 1.2443
No log 1.1765 60 1.1020 0.5194 1.1020 1.0498
No log 1.2157 62 0.8276 0.5663 0.8276 0.9097
No log 1.2549 64 0.7627 0.6427 0.7627 0.8733
No log 1.2941 66 0.7804 0.6966 0.7804 0.8834
No log 1.3333 68 0.7541 0.6366 0.7541 0.8684
No log 1.3725 70 0.9162 0.5869 0.9162 0.9572
No log 1.4118 72 1.0259 0.4642 1.0259 1.0129
No log 1.4510 74 0.9424 0.5808 0.9424 0.9708
No log 1.4902 76 0.7405 0.6419 0.7405 0.8605
No log 1.5294 78 1.0319 0.5529 1.0319 1.0158
No log 1.5686 80 1.1093 0.5356 1.1093 1.0532
No log 1.6078 82 0.8255 0.5842 0.8255 0.9086
No log 1.6471 84 0.7377 0.5879 0.7377 0.8589
No log 1.6863 86 0.7827 0.6012 0.7827 0.8847
No log 1.7255 88 0.7771 0.6089 0.7771 0.8815
No log 1.7647 90 0.7605 0.5420 0.7605 0.8721
No log 1.8039 92 0.7858 0.5588 0.7858 0.8864
No log 1.8431 94 0.8925 0.5842 0.8925 0.9447
No log 1.8824 96 0.8959 0.5636 0.8959 0.9465
No log 1.9216 98 0.8013 0.4515 0.8013 0.8952
No log 1.9608 100 0.7856 0.4502 0.7856 0.8863
No log 2.0 102 0.7906 0.4690 0.7906 0.8892
No log 2.0392 104 0.8062 0.5440 0.8062 0.8979
No log 2.0784 106 0.7967 0.5309 0.7967 0.8926
No log 2.1176 108 0.8100 0.5190 0.8100 0.9000
No log 2.1569 110 0.8279 0.5392 0.8279 0.9099
No log 2.1961 112 0.8638 0.5362 0.8638 0.9294
No log 2.2353 114 1.2473 0.5311 1.2473 1.1168
No log 2.2745 116 1.4214 0.4498 1.4214 1.1922
No log 2.3137 118 1.2133 0.5015 1.2133 1.1015
No log 2.3529 120 0.9195 0.4985 0.9195 0.9589
No log 2.3922 122 0.9035 0.3733 0.9035 0.9505
No log 2.4314 124 0.8931 0.3733 0.8931 0.9451
No log 2.4706 126 0.9041 0.4910 0.9041 0.9508
No log 2.5098 128 1.0378 0.5066 1.0378 1.0187
No log 2.5490 130 0.9072 0.4945 0.9072 0.9525
No log 2.5882 132 0.8477 0.4730 0.8477 0.9207
No log 2.6275 134 1.0400 0.4944 1.0400 1.0198
No log 2.6667 136 0.9064 0.5399 0.9064 0.9521
No log 2.7059 138 0.8138 0.5876 0.8138 0.9021
No log 2.7451 140 0.9577 0.5514 0.9577 0.9786
No log 2.7843 142 0.9442 0.5550 0.9442 0.9717
No log 2.8235 144 0.8628 0.5185 0.8628 0.9288
No log 2.8627 146 0.7969 0.5305 0.7969 0.8927
No log 2.9020 148 0.7743 0.5918 0.7743 0.8799
No log 2.9412 150 0.7725 0.6193 0.7725 0.8789
No log 2.9804 152 0.7918 0.6325 0.7918 0.8898
No log 3.0196 154 0.7922 0.5821 0.7922 0.8901
No log 3.0588 156 0.7634 0.5011 0.7634 0.8737
No log 3.0980 158 0.7886 0.5833 0.7886 0.8880
No log 3.1373 160 0.7581 0.4889 0.7581 0.8707
No log 3.1765 162 0.7506 0.4994 0.7506 0.8664
No log 3.2157 164 0.7479 0.5327 0.7479 0.8648
No log 3.2549 166 0.7583 0.5391 0.7583 0.8708
No log 3.2941 168 0.8871 0.5693 0.8871 0.9419
No log 3.3333 170 0.9623 0.5605 0.9623 0.9810
No log 3.3725 172 0.8651 0.5455 0.8651 0.9301
No log 3.4118 174 0.7764 0.5232 0.7764 0.8811
No log 3.4510 176 0.7924 0.5235 0.7924 0.8901
No log 3.4902 178 0.7970 0.5204 0.7970 0.8928
No log 3.5294 180 0.8550 0.5455 0.8550 0.9246
No log 3.5686 182 0.9321 0.5814 0.9321 0.9654
No log 3.6078 184 0.8874 0.5614 0.8874 0.9420
No log 3.6471 186 0.8110 0.5815 0.8110 0.9006
No log 3.6863 188 0.8273 0.5398 0.8273 0.9096
No log 3.7255 190 0.7898 0.5159 0.7898 0.8887
No log 3.7647 192 0.7828 0.5794 0.7828 0.8848
No log 3.8039 194 0.8351 0.5422 0.8351 0.9138
No log 3.8431 196 0.7819 0.5040 0.7819 0.8842
No log 3.8824 198 0.8010 0.4823 0.8010 0.8950
No log 3.9216 200 0.7833 0.4760 0.7833 0.8850
No log 3.9608 202 0.7834 0.5042 0.7834 0.8851
No log 4.0 204 0.8172 0.4812 0.8172 0.9040
No log 4.0392 206 0.8171 0.5177 0.8171 0.9039
No log 4.0784 208 0.7749 0.5763 0.7749 0.8803
No log 4.1176 210 0.7947 0.5470 0.7947 0.8915
No log 4.1569 212 0.8575 0.5332 0.8575 0.9260
No log 4.1961 214 0.7956 0.5355 0.7956 0.8920
No log 4.2353 216 0.7982 0.5196 0.7982 0.8934
No log 4.2745 218 0.9283 0.5435 0.9283 0.9635
No log 4.3137 220 1.0911 0.5604 1.0911 1.0446
No log 4.3529 222 1.0305 0.5262 1.0305 1.0151
No log 4.3922 224 0.8734 0.5040 0.8734 0.9345
No log 4.4314 226 0.8536 0.4385 0.8536 0.9239
No log 4.4706 228 0.8656 0.4102 0.8656 0.9304
No log 4.5098 230 0.8745 0.4102 0.8745 0.9351
No log 4.5490 232 0.8891 0.4482 0.8891 0.9429
No log 4.5882 234 0.9727 0.5258 0.9727 0.9862
No log 4.6275 236 0.9316 0.4527 0.9316 0.9652
No log 4.6667 238 0.8516 0.4934 0.8516 0.9228
No log 4.7059 240 0.8185 0.3925 0.8185 0.9047
No log 4.7451 242 0.8088 0.4661 0.8088 0.8993
No log 4.7843 244 0.7804 0.4218 0.7804 0.8834
No log 4.8235 246 0.8130 0.4998 0.8130 0.9017
No log 4.8627 248 0.8130 0.5451 0.8130 0.9017
No log 4.9020 250 0.7632 0.4877 0.7632 0.8736
No log 4.9412 252 0.7370 0.5513 0.7370 0.8585
No log 4.9804 254 0.7536 0.4853 0.7536 0.8681
No log 5.0196 256 0.8148 0.4898 0.8148 0.9027
No log 5.0588 258 0.8507 0.4583 0.8507 0.9223
No log 5.0980 260 0.7873 0.4937 0.7873 0.8873
No log 5.1373 262 0.7360 0.5392 0.7360 0.8579
No log 5.1765 264 0.7390 0.5355 0.7390 0.8597
No log 5.2157 266 0.7181 0.5645 0.7181 0.8474
No log 5.2549 268 0.7158 0.6308 0.7158 0.8461
No log 5.2941 270 0.7493 0.5832 0.7493 0.8656
No log 5.3333 272 0.7692 0.5315 0.7692 0.8771
No log 5.3725 274 0.8287 0.4521 0.8287 0.9104
No log 5.4118 276 0.9054 0.4570 0.9054 0.9515
No log 5.4510 278 0.9421 0.4606 0.9421 0.9706
No log 5.4902 280 0.8577 0.4485 0.8577 0.9261
No log 5.5294 282 0.8425 0.4086 0.8425 0.9179
No log 5.5686 284 0.8460 0.4405 0.8460 0.9198
No log 5.6078 286 0.8280 0.4218 0.8280 0.9100
No log 5.6471 288 0.8284 0.4123 0.8284 0.9102
No log 5.6863 290 0.8540 0.4736 0.8540 0.9241
No log 5.7255 292 0.8884 0.4930 0.8884 0.9426
No log 5.7647 294 0.9626 0.5569 0.9626 0.9811
No log 5.8039 296 0.9202 0.5340 0.9202 0.9593
No log 5.8431 298 0.8170 0.4554 0.8170 0.9039
No log 5.8824 300 0.8233 0.5712 0.8233 0.9074
No log 5.9216 302 0.8710 0.5458 0.8710 0.9333
No log 5.9608 304 0.8616 0.5458 0.8616 0.9282
No log 6.0 306 0.8038 0.4667 0.8038 0.8966
No log 6.0392 308 0.8593 0.4681 0.8593 0.9270
No log 6.0784 310 0.8663 0.4681 0.8663 0.9307
No log 6.1176 312 0.8502 0.4599 0.8502 0.9220
No log 6.1569 314 0.8166 0.4496 0.8166 0.9037
No log 6.1961 316 0.8295 0.5085 0.8295 0.9108
No log 6.2353 318 0.8320 0.4848 0.8320 0.9121
No log 6.2745 320 0.8270 0.4686 0.8270 0.9094
No log 6.3137 322 0.8548 0.4493 0.8548 0.9246
No log 6.3529 324 0.9242 0.4890 0.9242 0.9613
No log 6.3922 326 0.9423 0.4458 0.9423 0.9707
No log 6.4314 328 0.9869 0.4594 0.9869 0.9934
No log 6.4706 330 0.9669 0.3459 0.9669 0.9833
No log 6.5098 332 0.9077 0.3577 0.9077 0.9527
No log 6.5490 334 0.8359 0.3619 0.8359 0.9143
No log 6.5882 336 0.7948 0.3733 0.7948 0.8915
No log 6.6275 338 0.7916 0.4196 0.7916 0.8897
No log 6.6667 340 0.7895 0.4502 0.7895 0.8885
No log 6.7059 342 0.7914 0.5370 0.7914 0.8896
No log 6.7451 344 0.8254 0.5120 0.8254 0.9085
No log 6.7843 346 0.7827 0.5370 0.7827 0.8847
No log 6.8235 348 0.7770 0.4820 0.7770 0.8815
No log 6.8627 350 0.7729 0.5357 0.7729 0.8791
No log 6.9020 352 0.7583 0.5565 0.7583 0.8708
No log 6.9412 354 0.7633 0.6038 0.7633 0.8737
No log 6.9804 356 0.7618 0.6324 0.7618 0.8728
No log 7.0196 358 0.8137 0.5882 0.8137 0.9021
No log 7.0588 360 0.8075 0.5658 0.8075 0.8986
No log 7.0980 362 0.8214 0.5537 0.8214 0.9063
No log 7.1373 364 0.8486 0.5333 0.8486 0.9212
No log 7.1765 366 0.8872 0.5333 0.8872 0.9419
No log 7.2157 368 0.8658 0.4540 0.8658 0.9305
No log 7.2549 370 0.8198 0.4143 0.8198 0.9054
No log 7.2941 372 0.7925 0.4778 0.7925 0.8902
No log 7.3333 374 0.8130 0.4604 0.8130 0.9017
No log 7.3725 376 0.8387 0.4672 0.8387 0.9158
No log 7.4118 378 0.8277 0.3914 0.8277 0.9098
No log 7.4510 380 0.7840 0.4738 0.7840 0.8855
No log 7.4902 382 0.7541 0.4993 0.7541 0.8684
No log 7.5294 384 0.7399 0.5607 0.7399 0.8602
No log 7.5686 386 0.7569 0.6006 0.7569 0.8700
No log 7.6078 388 0.8853 0.5334 0.8853 0.9409
No log 7.6471 390 0.9451 0.5192 0.9451 0.9721
No log 7.6863 392 0.8548 0.5738 0.8548 0.9245
No log 7.7255 394 0.7427 0.6086 0.7427 0.8618
No log 7.7647 396 0.7281 0.5098 0.7281 0.8533
No log 7.8039 398 0.7394 0.4885 0.7394 0.8599
No log 7.8431 400 0.7641 0.4643 0.7641 0.8741
No log 7.8824 402 0.8174 0.3733 0.8174 0.9041
No log 7.9216 404 0.9114 0.4690 0.9114 0.9547
No log 7.9608 406 0.9172 0.4681 0.9172 0.9577
No log 8.0 408 0.8268 0.3880 0.8268 0.9093
No log 8.0392 410 0.7845 0.5040 0.7845 0.8857
No log 8.0784 412 0.7874 0.4996 0.7874 0.8873
No log 8.1176 414 0.7899 0.4877 0.7899 0.8888
No log 8.1569 416 0.8733 0.4490 0.8733 0.9345
No log 8.1961 418 0.9815 0.5480 0.9815 0.9907
No log 8.2353 420 1.0711 0.5437 1.0711 1.0350
No log 8.2745 422 1.0328 0.5437 1.0328 1.0163
No log 8.3137 424 0.8709 0.5462 0.8709 0.9332
No log 8.3529 426 0.7584 0.4086 0.7584 0.8708
No log 8.3922 428 0.7429 0.5120 0.7429 0.8619
No log 8.4314 430 0.7420 0.5495 0.7420 0.8614
No log 8.4706 432 0.7271 0.5467 0.7271 0.8527
No log 8.5098 434 0.7276 0.4508 0.7276 0.8530
No log 8.5490 436 0.7734 0.5291 0.7734 0.8794
No log 8.5882 438 0.8484 0.5509 0.8484 0.9211
No log 8.6275 440 0.9301 0.5627 0.9301 0.9644
No log 8.6667 442 0.9473 0.5506 0.9473 0.9733
No log 8.7059 444 0.8530 0.5472 0.8530 0.9236
No log 8.7451 446 0.7784 0.4337 0.7784 0.8823
No log 8.7843 448 0.7674 0.4920 0.7674 0.8760
No log 8.8235 450 0.7776 0.4508 0.7776 0.8818
No log 8.8627 452 0.8102 0.3998 0.8102 0.9001
No log 8.9020 454 0.8146 0.4215 0.8146 0.9025
No log 8.9412 456 0.8091 0.4215 0.8091 0.8995
No log 8.9804 458 0.8570 0.5876 0.8570 0.9257
No log 9.0196 460 1.0187 0.5506 1.0187 1.0093
No log 9.0588 462 1.1116 0.5111 1.1116 1.0543
No log 9.0980 464 1.0551 0.5397 1.0551 1.0272
No log 9.1373 466 0.9263 0.5670 0.9263 0.9625
No log 9.1765 468 0.7870 0.5963 0.7870 0.8871
No log 9.2157 470 0.7376 0.6095 0.7376 0.8589
No log 9.2549 472 0.7716 0.5707 0.7716 0.8784
No log 9.2941 474 0.8065 0.5264 0.8065 0.8981
No log 9.3333 476 0.7642 0.5707 0.7642 0.8742
No log 9.3725 478 0.7246 0.5811 0.7246 0.8513
No log 9.4118 480 0.7880 0.5935 0.7880 0.8877
No log 9.4510 482 0.9018 0.5592 0.9018 0.9496
No log 9.4902 484 0.9668 0.5649 0.9668 0.9832
No log 9.5294 486 0.9329 0.5696 0.9329 0.9659
No log 9.5686 488 0.9064 0.5553 0.9064 0.9520
No log 9.6078 490 0.9252 0.5614 0.9252 0.9619
No log 9.6471 492 0.9671 0.5567 0.9671 0.9834
No log 9.6863 494 0.9925 0.5355 0.9925 0.9962
No log 9.7255 496 0.9137 0.5805 0.9137 0.9559
No log 9.7647 498 0.8046 0.5102 0.8046 0.8970
0.2924 9.8039 500 0.7615 0.4548 0.7615 0.8727
0.2924 9.8431 502 0.7603 0.4521 0.7603 0.8720
0.2924 9.8824 504 0.7512 0.4521 0.7512 0.8667
0.2924 9.9216 506 0.7835 0.5801 0.7835 0.8852
0.2924 9.9608 508 0.8698 0.5218 0.8698 0.9326
0.2924 10.0 510 0.8980 0.5365 0.8980 0.9476
0.2924 10.0392 512 0.8425 0.5610 0.8425 0.9179
0.2924 10.0784 514 0.7773 0.5426 0.7773 0.8816
0.2924 10.1176 516 0.7639 0.5455 0.7639 0.8740
0.2924 10.1569 518 0.7616 0.5455 0.7616 0.8727
0.2924 10.1961 520 0.7611 0.5287 0.7611 0.8724
0.2924 10.2353 522 0.7643 0.5295 0.7643 0.8743
0.2924 10.2745 524 0.8035 0.5202 0.8035 0.8964
0.2924 10.3137 526 0.8379 0.5562 0.8379 0.9153
0.2924 10.3529 528 0.8199 0.5385 0.8199 0.9055
0.2924 10.3922 530 0.7940 0.5202 0.7940 0.8911
0.2924 10.4314 532 0.7680 0.4534 0.7680 0.8763
0.2924 10.4706 534 0.7719 0.4120 0.7719 0.8786
0.2924 10.5098 536 0.7783 0.4491 0.7783 0.8822
0.2924 10.5490 538 0.7852 0.4120 0.7852 0.8861
0.2924 10.5882 540 0.7876 0.3983 0.7876 0.8875

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
