ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k13_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8717
  • Qwk: 0.4764
  • Mse: 0.8717
  • Rmse: 0.9337
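Note that Loss and Mse coincide (0.8717), consistent with an MSE training objective, and Rmse is simply the square root of Mse (0.9337 ≈ √0.8717). A minimal pure-Python sketch of how Qwk (quadratically weighted kappa), Mse, and Rmse can be computed; the helper names and toy labels below are illustrative, not taken from the training script:

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_labels):
    """Cohen's kappa with quadratic weights (the 'Qwk' metric above)."""
    n = len(y_true)
    # Observed agreement counts
    observed = [[0.0] * n_labels for _ in range(n_labels)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true, hist_pred = Counter(y_true), Counter(y_pred)
    num = den = 0.0
    for i in range(n_labels):
        for j in range(n_labels):
            w = (i - j) ** 2 / (n_labels - 1) ** 2      # quadratic penalty
            expected = hist_true[i] * hist_pred[j] / n  # chance agreement
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Toy labels, only to exercise the helpers
toy_true = [0, 1, 2, 2, 3]
toy_pred = [0, 1, 1, 2, 3]
qwk = quadratic_weighted_kappa(toy_true, toy_pred, n_labels=4)
rmse = math.sqrt(mse(toy_true, toy_pred))  # Rmse is just sqrt(Mse)
```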

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
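A hypothetical reconstruction of these settings as a transformers `TrainingArguments` call (the actual training script is not included in this card; `output_dir` is a placeholder):

```python
from transformers import TrainingArguments

# Sketch only: parameter names follow the Trainer API, values come
# from the hyperparameter list above.
args = TrainingArguments(
    output_dir="arabert_task2_organization",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```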

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0270 2 4.3883 -0.0191 4.3883 2.0948
No log 0.0541 4 3.3302 0.0110 3.3302 1.8249
No log 0.0811 6 1.6220 0.0 1.6220 1.2736
No log 0.1081 8 1.2717 0.1257 1.2717 1.1277
No log 0.1351 10 1.1670 0.2188 1.1670 1.0803
No log 0.1622 12 1.1688 0.1458 1.1688 1.0811
No log 0.1892 14 1.6274 0.1429 1.6274 1.2757
No log 0.2162 16 1.3843 0.0691 1.3843 1.1765
No log 0.2432 18 1.1329 0.2293 1.1329 1.0644
No log 0.2703 20 1.4026 0.2424 1.4026 1.1843
No log 0.2973 22 1.4212 0.0865 1.4212 1.1922
No log 0.3243 24 1.4757 -0.0935 1.4757 1.2148
No log 0.3514 26 1.4414 -0.0478 1.4414 1.2006
No log 0.3784 28 1.3758 0.0362 1.3758 1.1729
No log 0.4054 30 1.4206 0.0723 1.4206 1.1919
No log 0.4324 32 1.3198 0.0723 1.3198 1.1488
No log 0.4595 34 1.1955 0.2593 1.1955 1.0934
No log 0.4865 36 1.1721 0.1830 1.1721 1.0826
No log 0.5135 38 1.2524 0.1045 1.2524 1.1191
No log 0.5405 40 1.5962 0.1470 1.5962 1.2634
No log 0.5676 42 1.7857 0.0591 1.7857 1.3363
No log 0.5946 44 1.6294 0.1949 1.6294 1.2765
No log 0.6216 46 1.3123 0.2522 1.3123 1.1455
No log 0.6486 48 1.1710 0.2721 1.1710 1.0821
No log 0.6757 50 1.1001 0.2730 1.1001 1.0489
No log 0.7027 52 1.0579 0.3115 1.0579 1.0286
No log 0.7297 54 1.0226 0.3243 1.0226 1.0112
No log 0.7568 56 0.9984 0.3991 0.9984 0.9992
No log 0.7838 58 0.9995 0.4676 0.9995 0.9997
No log 0.8108 60 0.9625 0.5057 0.9625 0.9811
No log 0.8378 62 0.9475 0.4521 0.9475 0.9734
No log 0.8649 64 0.9436 0.4028 0.9436 0.9714
No log 0.8919 66 0.9340 0.4435 0.9340 0.9664
No log 0.9189 68 0.8951 0.4367 0.8951 0.9461
No log 0.9459 70 0.9711 0.4812 0.9711 0.9854
No log 0.9730 72 0.9290 0.5667 0.9290 0.9639
No log 1.0 74 0.9583 0.5321 0.9583 0.9789
No log 1.0270 76 0.9378 0.5382 0.9378 0.9684
No log 1.0541 78 0.8861 0.5703 0.8861 0.9414
No log 1.0811 80 0.8473 0.5338 0.8473 0.9205
No log 1.1081 82 0.9228 0.5164 0.9228 0.9606
No log 1.1351 84 1.0115 0.5312 1.0115 1.0057
No log 1.1622 86 0.9150 0.4902 0.9150 0.9565
No log 1.1892 88 0.8570 0.4653 0.8570 0.9257
No log 1.2162 90 0.9577 0.3973 0.9577 0.9786
No log 1.2432 92 0.9715 0.3452 0.9715 0.9856
No log 1.2703 94 0.8456 0.5591 0.8456 0.9196
No log 1.2973 96 0.7646 0.4898 0.7646 0.8744
No log 1.3243 98 0.7953 0.5913 0.7953 0.8918
No log 1.3514 100 0.8031 0.5763 0.8031 0.8962
No log 1.3784 102 0.7572 0.6831 0.7572 0.8702
No log 1.4054 104 0.7483 0.6640 0.7483 0.8651
No log 1.4324 106 0.7380 0.6221 0.7380 0.8591
No log 1.4595 108 0.7508 0.6523 0.7508 0.8665
No log 1.4865 110 0.7403 0.6622 0.7403 0.8604
No log 1.5135 112 0.7813 0.6128 0.7813 0.8839
No log 1.5405 114 0.7809 0.5911 0.7809 0.8837
No log 1.5676 116 0.7764 0.5406 0.7764 0.8811
No log 1.5946 118 0.8021 0.5438 0.8021 0.8956
No log 1.6216 120 0.9091 0.5493 0.9091 0.9535
No log 1.6486 122 0.9036 0.5154 0.9036 0.9506
No log 1.6757 124 0.8426 0.5039 0.8426 0.9179
No log 1.7027 126 0.8031 0.5648 0.8031 0.8961
No log 1.7297 128 0.8229 0.4620 0.8229 0.9072
No log 1.7568 130 0.9371 0.5028 0.9371 0.9680
No log 1.7838 132 0.9068 0.4874 0.9068 0.9523
No log 1.8108 134 0.8067 0.5483 0.8067 0.8982
No log 1.8378 136 0.8983 0.4248 0.8983 0.9478
No log 1.8649 138 0.8919 0.4464 0.8919 0.9444
No log 1.8919 140 0.8358 0.5878 0.8358 0.9142
No log 1.9189 142 0.8442 0.5634 0.8442 0.9188
No log 1.9459 144 0.9362 0.5218 0.9362 0.9676
No log 1.9730 146 1.2044 0.4934 1.2044 1.0974
No log 2.0 148 1.1970 0.4921 1.1970 1.0941
No log 2.0270 150 0.9412 0.5346 0.9412 0.9701
No log 2.0541 152 0.8049 0.5729 0.8049 0.8972
No log 2.0811 154 0.8255 0.5275 0.8255 0.9086
No log 2.1081 156 0.7912 0.5884 0.7912 0.8895
No log 2.1351 158 0.8796 0.5556 0.8796 0.9379
No log 2.1622 160 0.9862 0.5451 0.9862 0.9931
No log 2.1892 162 0.9524 0.5572 0.9524 0.9759
No log 2.2162 164 0.8631 0.5934 0.8631 0.9290
No log 2.2432 166 0.8365 0.5529 0.8365 0.9146
No log 2.2703 168 0.8504 0.5024 0.8504 0.9222
No log 2.2973 170 0.8883 0.5194 0.8883 0.9425
No log 2.3243 172 0.9694 0.4902 0.9694 0.9846
No log 2.3514 174 0.9976 0.4219 0.9976 0.9988
No log 2.3784 176 0.9798 0.4007 0.9798 0.9898
No log 2.4054 178 0.9285 0.4302 0.9285 0.9636
No log 2.4324 180 0.9292 0.4311 0.9292 0.9640
No log 2.4595 182 0.9480 0.4275 0.9480 0.9736
No log 2.4865 184 0.9801 0.4911 0.9801 0.9900
No log 2.5135 186 1.0003 0.4570 1.0003 1.0001
No log 2.5405 188 0.9533 0.4846 0.9533 0.9764
No log 2.5676 190 0.9421 0.4927 0.9421 0.9706
No log 2.5946 192 0.9633 0.4790 0.9633 0.9815
No log 2.6216 194 0.8952 0.4970 0.8952 0.9461
No log 2.6486 196 0.8562 0.4953 0.8562 0.9253
No log 2.6757 198 0.8416 0.5821 0.8416 0.9174
No log 2.7027 200 0.8454 0.5415 0.8454 0.9194
No log 2.7297 202 0.8983 0.5000 0.8983 0.9478
No log 2.7568 204 0.9391 0.5334 0.9391 0.9691
No log 2.7838 206 0.9110 0.5435 0.9110 0.9545
No log 2.8108 208 0.8190 0.5542 0.8190 0.9050
No log 2.8378 210 0.7979 0.6163 0.7979 0.8932
No log 2.8649 212 0.8070 0.6051 0.8070 0.8983
No log 2.8919 214 0.9095 0.5434 0.9095 0.9537
No log 2.9189 216 1.0710 0.3989 1.0710 1.0349
No log 2.9459 218 1.1851 0.4004 1.1851 1.0886
No log 2.9730 220 1.1593 0.4110 1.1593 1.0767
No log 3.0 222 1.1431 0.4065 1.1431 1.0692
No log 3.0270 224 1.0916 0.4830 1.0916 1.0448
No log 3.0541 226 1.0345 0.4883 1.0345 1.0171
No log 3.0811 228 0.9617 0.5070 0.9617 0.9807
No log 3.1081 230 0.9235 0.4639 0.9235 0.9610
No log 3.1351 232 0.8690 0.4736 0.8690 0.9322
No log 3.1622 234 0.8438 0.4937 0.8438 0.9186
No log 3.1892 236 0.9115 0.5766 0.9115 0.9547
No log 3.2162 238 1.0391 0.5094 1.0391 1.0194
No log 3.2432 240 1.0882 0.4758 1.0882 1.0432
No log 3.2703 242 1.0015 0.5328 1.0015 1.0008
No log 3.2973 244 0.8958 0.4838 0.8958 0.9464
No log 3.3243 246 0.8535 0.4337 0.8535 0.9239
No log 3.3514 248 0.8335 0.4308 0.8335 0.9130
No log 3.3784 250 0.8587 0.5012 0.8587 0.9266
No log 3.4054 252 0.8399 0.5114 0.8399 0.9164
No log 3.4324 254 0.8085 0.4708 0.8085 0.8991
No log 3.4595 256 0.8272 0.5040 0.8272 0.9095
No log 3.4865 258 0.8754 0.5564 0.8754 0.9356
No log 3.5135 260 0.9691 0.5818 0.9691 0.9844
No log 3.5405 262 1.0582 0.5002 1.0582 1.0287
No log 3.5676 264 1.0680 0.4364 1.0680 1.0334
No log 3.5946 266 1.0278 0.3934 1.0278 1.0138
No log 3.6216 268 0.9251 0.4845 0.9251 0.9618
No log 3.6486 270 0.8823 0.4864 0.8823 0.9393
No log 3.6757 272 0.8905 0.4864 0.8905 0.9437
No log 3.7027 274 0.9172 0.4754 0.9172 0.9577
No log 3.7297 276 0.9155 0.4774 0.9155 0.9568
No log 3.7568 278 0.9158 0.4858 0.9158 0.9570
No log 3.7838 280 0.9233 0.4858 0.9233 0.9609
No log 3.8108 282 0.9578 0.4712 0.9578 0.9787
No log 3.8378 284 0.9924 0.4522 0.9924 0.9962
No log 3.8649 286 1.0334 0.4175 1.0334 1.0166
No log 3.8919 288 1.0129 0.3625 1.0129 1.0064
No log 3.9189 290 0.9336 0.3796 0.9336 0.9662
No log 3.9459 292 0.9324 0.4328 0.9324 0.9656
No log 3.9730 294 0.9167 0.4382 0.9167 0.9574
No log 4.0 296 0.9241 0.4040 0.9241 0.9613
No log 4.0270 298 0.9735 0.4681 0.9735 0.9866
No log 4.0541 300 1.0214 0.4355 1.0214 1.0107
No log 4.0811 302 0.9551 0.5241 0.9551 0.9773
No log 4.1081 304 0.9094 0.5614 0.9094 0.9536
No log 4.1351 306 0.9117 0.5455 0.9117 0.9548
No log 4.1622 308 0.9593 0.5416 0.9593 0.9794
No log 4.1892 310 0.9836 0.5115 0.9836 0.9918
No log 4.2162 312 0.9505 0.4788 0.9505 0.9749
No log 4.2432 314 0.8508 0.5267 0.8508 0.9224
No log 4.2703 316 0.8068 0.4884 0.8068 0.8982
No log 4.2973 318 0.8132 0.5242 0.8132 0.9018
No log 4.3243 320 0.8198 0.5267 0.8198 0.9054
No log 4.3514 322 0.8324 0.4778 0.8324 0.9123
No log 4.3784 324 0.8495 0.4966 0.8495 0.9217
No log 4.4054 326 0.8529 0.4822 0.8529 0.9235
No log 4.4324 328 0.8549 0.4822 0.8549 0.9246
No log 4.4595 330 0.8699 0.5287 0.8699 0.9327
No log 4.4865 332 0.9678 0.4652 0.9678 0.9838
No log 4.5135 334 0.9631 0.4858 0.9631 0.9814
No log 4.5405 336 0.9115 0.5267 0.9115 0.9547
No log 4.5676 338 0.8814 0.5073 0.8814 0.9388
No log 4.5946 340 0.8933 0.5073 0.8933 0.9451
No log 4.6216 342 0.8910 0.5073 0.8910 0.9439
No log 4.6486 344 0.8695 0.5089 0.8695 0.9325
No log 4.6757 346 0.8829 0.5089 0.8829 0.9396
No log 4.7027 348 0.9589 0.4521 0.9589 0.9792
No log 4.7297 350 1.1570 0.4155 1.1570 1.0756
No log 4.7568 352 1.2869 0.3872 1.2869 1.1344
No log 4.7838 354 1.2441 0.4233 1.2441 1.1154
No log 4.8108 356 1.0764 0.4436 1.0764 1.0375
No log 4.8378 358 0.9427 0.4969 0.9427 0.9709
No log 4.8649 360 0.8895 0.4714 0.8895 0.9431
No log 4.8919 362 0.9038 0.4808 0.9038 0.9507
No log 4.9189 364 0.9566 0.4412 0.9566 0.9781
No log 4.9459 366 0.9480 0.4416 0.9480 0.9737
No log 4.9730 368 0.9226 0.4416 0.9226 0.9605
No log 5.0 370 0.8674 0.4672 0.8674 0.9314
No log 5.0270 372 0.8811 0.5175 0.8811 0.9387
No log 5.0541 374 0.9936 0.5286 0.9936 0.9968
No log 5.0811 376 1.0219 0.5098 1.0219 1.0109
No log 5.1081 378 0.9931 0.4339 0.9931 0.9966
No log 5.1351 380 0.9124 0.4845 0.9124 0.9552
No log 5.1622 382 0.9078 0.4871 0.9078 0.9528
No log 5.1892 384 0.9234 0.4871 0.9234 0.9609
No log 5.2162 386 0.9020 0.4998 0.9020 0.9497
No log 5.2432 388 0.9064 0.5130 0.9064 0.9521
No log 5.2703 390 0.8599 0.5365 0.8599 0.9273
No log 5.2973 392 0.7922 0.5408 0.7922 0.8900
No log 5.3243 394 0.7693 0.5223 0.7693 0.8771
No log 5.3514 396 0.7657 0.5528 0.7657 0.8751
No log 5.3784 398 0.7750 0.5528 0.7750 0.8804
No log 5.4054 400 0.8053 0.5408 0.8053 0.8974
No log 5.4324 402 0.8689 0.5352 0.8689 0.9321
No log 5.4595 404 0.9725 0.4943 0.9725 0.9862
No log 5.4865 406 1.0053 0.4760 1.0053 1.0026
No log 5.5135 408 0.9751 0.5185 0.9751 0.9875
No log 5.5405 410 0.9312 0.4945 0.9312 0.9650
No log 5.5676 412 0.8792 0.5477 0.8792 0.9377
No log 5.5946 414 0.8757 0.5287 0.8757 0.9358
No log 5.6216 416 0.9215 0.4295 0.9215 0.9600
No log 5.6486 418 1.0130 0.4186 1.0130 1.0065
No log 5.6757 420 1.1833 0.3631 1.1833 1.0878
No log 5.7027 422 1.2565 0.3531 1.2565 1.1210
No log 5.7297 424 1.1525 0.3590 1.1525 1.0735
No log 5.7568 426 0.9945 0.4316 0.9945 0.9973
No log 5.7838 428 0.9560 0.4386 0.9560 0.9778
No log 5.8108 430 0.9694 0.3996 0.9694 0.9846
No log 5.8378 432 1.0094 0.3824 1.0094 1.0047
No log 5.8649 434 0.9863 0.4091 0.9863 0.9931
No log 5.8919 436 0.9332 0.4845 0.9332 0.9660
No log 5.9189 438 0.8565 0.4746 0.8565 0.9254
No log 5.9459 440 0.8321 0.5089 0.8321 0.9122
No log 5.9730 442 0.8358 0.4951 0.8358 0.9142
No log 6.0 444 0.8969 0.5245 0.8969 0.9470
No log 6.0270 446 0.9463 0.4552 0.9463 0.9728
No log 6.0541 448 0.9247 0.4351 0.9247 0.9616
No log 6.0811 450 0.8664 0.4657 0.8664 0.9308
No log 6.1081 452 0.8398 0.4321 0.8398 0.9164
No log 6.1351 454 0.8471 0.3914 0.8471 0.9204
No log 6.1622 456 0.8373 0.4023 0.8373 0.9150
No log 6.1892 458 0.8245 0.4671 0.8245 0.9080
No log 6.2162 460 0.8674 0.4774 0.8674 0.9313
No log 6.2432 462 0.9151 0.5109 0.9151 0.9566
No log 6.2703 464 0.8884 0.4959 0.8884 0.9426
No log 6.2973 466 0.8482 0.5437 0.8482 0.9210
No log 6.3243 468 0.8491 0.5458 0.8491 0.9215
No log 6.3514 470 0.8884 0.5098 0.8884 0.9426
No log 6.3784 472 0.9124 0.4956 0.9124 0.9552
No log 6.4054 474 0.8928 0.4774 0.8928 0.9449
No log 6.4324 476 0.8660 0.4785 0.8660 0.9306
No log 6.4595 478 0.8658 0.4796 0.8658 0.9305
No log 6.4865 480 0.8851 0.4796 0.8851 0.9408
No log 6.5135 482 0.9014 0.4796 0.9014 0.9494
No log 6.5405 484 0.8804 0.4796 0.8804 0.9383
No log 6.5676 486 0.8431 0.4828 0.8431 0.9182
No log 6.5946 488 0.8313 0.4026 0.8313 0.9117
No log 6.6216 490 0.8206 0.4159 0.8206 0.9059
No log 6.6486 492 0.8136 0.4292 0.8136 0.9020
No log 6.6757 494 0.8179 0.5491 0.8179 0.9044
No log 6.7027 496 0.7958 0.4933 0.7958 0.8921
No log 6.7297 498 0.8301 0.4718 0.8301 0.9111
0.3684 6.7568 500 0.9465 0.5240 0.9465 0.9729
0.3684 6.7838 502 0.9752 0.5435 0.9752 0.9875
0.3684 6.8108 504 0.8979 0.4970 0.8979 0.9476
0.3684 6.8378 506 0.8216 0.4644 0.8216 0.9064
0.3684 6.8649 508 0.8023 0.4685 0.8023 0.8957
0.3684 6.8919 510 0.8208 0.4716 0.8208 0.9060
0.3684 6.9189 512 0.8834 0.5475 0.8834 0.9399
0.3684 6.9459 514 0.8646 0.5495 0.8646 0.9298
0.3684 6.9730 516 0.8045 0.5136 0.8045 0.8969
0.3684 7.0 518 0.7957 0.5152 0.7957 0.8920
0.3684 7.0270 520 0.8098 0.5416 0.8098 0.8999
0.3684 7.0541 522 0.8846 0.5273 0.8846 0.9405
0.3684 7.0811 524 0.8968 0.5273 0.8968 0.9470
0.3684 7.1081 526 0.8475 0.5114 0.8475 0.9206
0.3684 7.1351 528 0.8159 0.5315 0.8159 0.9032
0.3684 7.1622 530 0.8202 0.5315 0.8202 0.9057
0.3684 7.1892 532 0.8482 0.4587 0.8482 0.9210
0.3684 7.2162 534 0.9043 0.4587 0.9043 0.9509
0.3684 7.2432 536 0.9467 0.5058 0.9467 0.9730
0.3684 7.2703 538 0.9266 0.5339 0.9266 0.9626
0.3684 7.2973 540 0.8660 0.5722 0.8660 0.9306
0.3684 7.3243 542 0.8650 0.5722 0.8650 0.9301
0.3684 7.3514 544 0.8841 0.5086 0.8841 0.9402
0.3684 7.3784 546 0.8921 0.4744 0.8921 0.9445
0.3684 7.4054 548 0.8773 0.4764 0.8773 0.9366
0.3684 7.4324 550 0.8526 0.4983 0.8526 0.9234
0.3684 7.4595 552 0.8717 0.4764 0.8717 0.9337
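The log above ends at epoch 7.46 (step 552) even though num_epochs was set to 100, so the linear scheduler was still early in its decay when training stopped. Assuming no warmup (the card does not state warmup settings) and 74 optimizer steps per epoch (epoch 1.0 lands on step 74 in the log), the learning rate at a given step can be sketched as:

```python
BASE_LR = 2e-5
STEPS_PER_EPOCH = 74                  # epoch 1.0 corresponds to step 74 above
TOTAL_STEPS = STEPS_PER_EPOCH * 100   # 100 planned epochs

def linear_lr(step, total_steps=TOTAL_STEPS, base_lr=BASE_LR):
    # Linear decay from base_lr to 0 over the full planned schedule
    # (no warmup assumed).
    return base_lr * max(0.0, 1.0 - step / total_steps)

lr_at_552 = linear_lr(552)  # LR at the last logged step, still near base_lr
```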

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1