ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k19_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8082
  • Qwk: -0.1398
  • Mse: 0.8082
  • Rmse: 0.8990
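The reported Rmse is simply the square root of the Mse; a quick stdlib-only sanity check on the values above:

```python
import math

# Final validation metrics from this card: Mse = 0.8082, Rmse = 0.8990.
mse = 0.8082
rmse = math.sqrt(mse)
print(round(rmse, 4))  # agrees with the reported Rmse of 0.8990
```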

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
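The hyperparameters above correspond roughly to a transformers TrainingArguments configuration like the following sketch (output_dir is a placeholder and the dataset wiring is omitted; neither is specified in the card):

```python
from transformers import TrainingArguments

# Sketch only: output_dir is a placeholder, not from the card.
args = TrainingArguments(
    output_dir="out",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the
    # transformers optimizer defaults, so no override is needed here.
)
```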

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0204 2 3.8284 0.0023 3.8284 1.9566
No log 0.0408 4 1.9331 0.0119 1.9331 1.3903
No log 0.0612 6 1.1702 -0.0445 1.1702 1.0818
No log 0.0816 8 1.0615 0.0378 1.0615 1.0303
No log 0.1020 10 1.0769 0.0723 1.0769 1.0377
No log 0.1224 12 0.7414 -0.0101 0.7414 0.8611
No log 0.1429 14 0.7449 0.0506 0.7449 0.8631
No log 0.1633 16 0.9511 0.0627 0.9511 0.9753
No log 0.1837 18 1.9542 -0.0405 1.9542 1.3979
No log 0.2041 20 1.2383 0.1118 1.2383 1.1128
No log 0.2245 22 0.6690 -0.0626 0.6690 0.8179
No log 0.2449 24 0.6931 0.0555 0.6931 0.8325
No log 0.2653 26 0.7362 0.0 0.7362 0.8580
No log 0.2857 28 0.7323 0.0 0.7323 0.8557
No log 0.3061 30 0.7224 0.0555 0.7224 0.8499
No log 0.3265 32 0.7453 -0.0069 0.7453 0.8633
No log 0.3469 34 0.9563 0.0067 0.9563 0.9779
No log 0.3673 36 1.2800 0.0016 1.2800 1.1314
No log 0.3878 38 1.3556 0.0 1.3556 1.1643
No log 0.4082 40 1.2018 0.0 1.2018 1.0963
No log 0.4286 42 0.9620 0.0543 0.9620 0.9808
No log 0.4490 44 0.8233 -0.0056 0.8233 0.9074
No log 0.4694 46 0.7787 0.0099 0.7787 0.8824
No log 0.4898 48 0.8885 0.1191 0.8885 0.9426
No log 0.5102 50 0.8231 -0.0812 0.8231 0.9072
No log 0.5306 52 0.7392 -0.0679 0.7392 0.8598
No log 0.5510 54 0.7632 0.0555 0.7632 0.8736
No log 0.5714 56 0.7723 0.0555 0.7723 0.8788
No log 0.5918 58 0.8086 -0.0739 0.8086 0.8992
No log 0.6122 60 0.8993 -0.0425 0.8993 0.9483
No log 0.6327 62 0.7805 0.0260 0.7805 0.8835
No log 0.6531 64 0.7391 -0.0131 0.7391 0.8597
No log 0.6735 66 0.8373 -0.0214 0.8373 0.9150
No log 0.6939 68 0.9268 0.0052 0.9268 0.9627
No log 0.7143 70 0.8790 -0.0477 0.8790 0.9375
No log 0.7347 72 0.7876 0.1660 0.7876 0.8875
No log 0.7551 74 0.8913 0.1147 0.8913 0.9441
No log 0.7755 76 0.8651 0.1633 0.8651 0.9301
No log 0.7959 78 0.8247 0.0172 0.8247 0.9081
No log 0.8163 80 0.9847 -0.0228 0.9847 0.9923
No log 0.8367 82 1.0141 -0.0533 1.0141 1.0070
No log 0.8571 84 0.8752 0.1591 0.8752 0.9355
No log 0.8776 86 0.8733 0.0119 0.8733 0.9345
No log 0.8980 88 0.8476 -0.0268 0.8476 0.9206
No log 0.9184 90 0.8309 0.0056 0.8309 0.9115
No log 0.9388 92 0.9553 -0.1152 0.9553 0.9774
No log 0.9592 94 0.8898 0.2031 0.8898 0.9433
No log 0.9796 96 0.8641 -0.0287 0.8641 0.9296
No log 1.0 98 0.8091 0.0816 0.8091 0.8995
No log 1.0204 100 0.8516 0.1400 0.8516 0.9228
No log 1.0408 102 1.1248 0.0541 1.1248 1.0606
No log 1.0612 104 0.9803 0.0416 0.9803 0.9901
No log 1.0816 106 0.8334 0.1529 0.8334 0.9129
No log 1.1020 108 0.8413 0.0688 0.8413 0.9172
No log 1.1224 110 0.8573 0.0833 0.8573 0.9259
No log 1.1429 112 0.9098 0.1221 0.9098 0.9539
No log 1.1633 114 0.9435 0.0866 0.9435 0.9713
No log 1.1837 116 1.0304 0.0007 1.0304 1.0151
No log 1.2041 118 1.3937 -0.0249 1.3937 1.1805
No log 1.2245 120 1.5959 0.0380 1.5959 1.2633
No log 1.2449 122 1.3763 0.0802 1.3763 1.1732
No log 1.2653 124 1.1502 0.1308 1.1502 1.0725
No log 1.2857 126 0.9375 0.1123 0.9375 0.9683
No log 1.3061 128 0.8915 0.1519 0.8915 0.9442
No log 1.3265 130 0.9443 -0.0237 0.9443 0.9717
No log 1.3469 132 1.0621 -0.0373 1.0621 1.0306
No log 1.3673 134 1.0022 0.0487 1.0022 1.0011
No log 1.3878 136 0.8774 0.1925 0.8774 0.9367
No log 1.4082 138 0.8235 0.1529 0.8235 0.9075
No log 1.4286 140 0.8571 0.1941 0.8571 0.9258
No log 1.4490 142 0.9744 0.0729 0.9744 0.9871
No log 1.4694 144 0.8663 0.2316 0.8663 0.9308
No log 1.4898 146 0.8334 0.2545 0.8334 0.9129
No log 1.5102 148 0.8223 0.1608 0.8223 0.9068
No log 1.5306 150 0.9312 0.0643 0.9312 0.9650
No log 1.5510 152 1.3826 0.0596 1.3826 1.1759
No log 1.5714 154 1.4690 0.0603 1.4690 1.2120
No log 1.5918 156 1.0035 0.0730 1.0035 1.0018
No log 1.6122 158 0.9029 0.0517 0.9029 0.9502
No log 1.6327 160 1.0766 -0.0133 1.0766 1.0376
No log 1.6531 162 0.8961 0.0424 0.8961 0.9466
No log 1.6735 164 0.8162 0.0488 0.8162 0.9034
No log 1.6939 166 1.1911 0.0542 1.1911 1.0914
No log 1.7143 168 1.5028 0.1131 1.5028 1.2259
No log 1.7347 170 1.2878 0.0780 1.2878 1.1348
No log 1.7551 172 0.9469 0.1196 0.9469 0.9731
No log 1.7755 174 0.9132 0.1020 0.9132 0.9556
No log 1.7959 176 1.0073 0.1014 1.0073 1.0036
No log 1.8163 178 1.2283 0.0277 1.2283 1.1083
No log 1.8367 180 1.2539 0.0296 1.2539 1.1198
No log 1.8571 182 1.0306 0.1334 1.0306 1.0152
No log 1.8776 184 0.8685 0.1696 0.8685 0.9319
No log 1.8980 186 0.8069 0.1961 0.8069 0.8983
No log 1.9184 188 0.8220 0.2892 0.8220 0.9066
No log 1.9388 190 0.9342 0.1321 0.9342 0.9665
No log 1.9592 192 0.9385 0.1581 0.9385 0.9688
No log 1.9796 194 0.8027 0.1739 0.8027 0.8960
No log 2.0 196 0.8832 0.0435 0.8832 0.9398
No log 2.0204 198 0.9433 0.0762 0.9433 0.9712
No log 2.0408 200 0.7918 0.0586 0.7918 0.8898
No log 2.0612 202 0.8423 0.1039 0.8423 0.9178
No log 2.0816 204 1.0490 0.0522 1.0490 1.0242
No log 2.1020 206 1.0552 0.0497 1.0552 1.0272
No log 2.1224 208 0.8795 0.1672 0.8795 0.9378
No log 2.1429 210 0.7837 0.0488 0.7837 0.8853
No log 2.1633 212 0.8068 0.0165 0.8068 0.8982
No log 2.1837 214 0.9800 0.1111 0.9800 0.9899
No log 2.2041 216 1.0120 0.0482 1.0120 1.0060
No log 2.2245 218 0.8922 0.1228 0.8922 0.9446
No log 2.2449 220 0.8927 0.1208 0.8927 0.9449
No log 2.2653 222 0.8693 0.1907 0.8693 0.9324
No log 2.2857 224 1.0518 0.1017 1.0518 1.0256
No log 2.3061 226 1.1423 0.1294 1.1423 1.0688
No log 2.3265 228 1.0390 0.0707 1.0390 1.0193
No log 2.3469 230 0.8955 0.1304 0.8955 0.9463
No log 2.3673 232 0.9811 0.1014 0.9811 0.9905
No log 2.3878 234 1.0109 0.1017 1.0109 1.0054
No log 2.4082 236 0.9715 0.0516 0.9715 0.9857
No log 2.4286 238 0.8075 0.0592 0.8075 0.8986
No log 2.4490 240 0.7681 0.1835 0.7681 0.8764
No log 2.4694 242 0.7892 0.2118 0.7892 0.8883
No log 2.4898 244 0.8750 0.0311 0.8750 0.9354
No log 2.5102 246 0.9347 0.0391 0.9347 0.9668
No log 2.5306 248 0.8586 0.0311 0.8586 0.9266
No log 2.5510 250 0.7791 0.0987 0.7791 0.8827
No log 2.5714 252 0.7331 0.1856 0.7331 0.8562
No log 2.5918 254 0.7446 0.1928 0.7446 0.8629
No log 2.6122 256 0.8312 0.0350 0.8312 0.9117
No log 2.6327 258 0.8766 0.0643 0.8766 0.9362
No log 2.6531 260 0.9635 0.1146 0.9635 0.9816
No log 2.6735 262 1.0298 0.0767 1.0298 1.0148
No log 2.6939 264 0.9685 0.0789 0.9685 0.9841
No log 2.7143 266 0.9279 0.0154 0.9279 0.9633
No log 2.7347 268 0.8431 0.1079 0.8431 0.9182
No log 2.7551 270 0.7329 0.1081 0.7329 0.8561
No log 2.7755 272 0.7393 0.0814 0.7393 0.8598
No log 2.7959 274 0.7685 0.0814 0.7685 0.8766
No log 2.8163 276 0.7879 0.0432 0.7879 0.8877
No log 2.8367 278 0.9287 -0.0036 0.9287 0.9637
No log 2.8571 280 0.9941 0.0701 0.9941 0.9970
No log 2.8776 282 0.9918 0.0326 0.9918 0.9959
No log 2.8980 284 0.9321 -0.0142 0.9321 0.9655
No log 2.9184 286 1.0294 0.0083 1.0294 1.0146
No log 2.9388 288 1.4105 0.0058 1.4105 1.1876
No log 2.9592 290 1.3782 -0.0191 1.3782 1.1740
No log 2.9796 292 1.1383 -0.0504 1.1383 1.0669
No log 3.0 294 0.8843 -0.0226 0.8843 0.9404
No log 3.0204 296 0.8786 -0.0226 0.8786 0.9373
No log 3.0408 298 0.9931 0.0028 0.9931 0.9965
No log 3.0612 300 1.2189 -0.0077 1.2189 1.1040
No log 3.0816 302 1.2128 -0.0077 1.2128 1.1013
No log 3.1020 304 1.0070 0.0065 1.0070 1.0035
No log 3.1224 306 0.9335 0.0185 0.9335 0.9662
No log 3.1429 308 1.0715 -0.0181 1.0715 1.0351
No log 3.1633 310 1.3956 0.0260 1.3956 1.1813
No log 3.1837 312 1.3196 0.0260 1.3196 1.1487
No log 3.2041 314 0.9987 -0.0595 0.9987 0.9994
No log 3.2245 316 0.8371 -0.0280 0.8371 0.9149
No log 3.2449 318 0.8001 0.0503 0.8001 0.8945
No log 3.2653 320 0.8780 -0.1051 0.8780 0.9370
No log 3.2857 322 1.1622 0.0175 1.1622 1.0781
No log 3.3061 324 1.4210 0.0308 1.4210 1.1920
No log 3.3265 326 1.3406 0.0328 1.3406 1.1578
No log 3.3469 328 1.0510 0.0164 1.0510 1.0252
No log 3.3673 330 0.8093 -0.0786 0.8093 0.8996
No log 3.3878 332 0.7727 0.1259 0.7727 0.8790
No log 3.4082 334 0.9681 -0.0518 0.9681 0.9839
No log 3.4286 336 0.9802 -0.0518 0.9802 0.9901
No log 3.4490 338 0.8280 0.0549 0.8280 0.9099
No log 3.4694 340 0.7879 0.0926 0.7879 0.8876
No log 3.4898 342 0.9878 -0.1152 0.9878 0.9939
No log 3.5102 344 1.1507 0.0472 1.1507 1.0727
No log 3.5306 346 1.1147 0.0659 1.1147 1.0558
No log 3.5510 348 0.9627 0.0093 0.9627 0.9812
No log 3.5714 350 0.9199 0.0392 0.9199 0.9591
No log 3.5918 352 0.9114 0.0145 0.9114 0.9547
No log 3.6122 354 1.0939 0.0149 1.0939 1.0459
No log 3.6327 356 1.2140 -0.0614 1.2140 1.1018
No log 3.6531 358 1.1000 0.0138 1.1000 1.0488
No log 3.6735 360 0.9167 0.0296 0.9167 0.9574
No log 3.6939 362 0.8394 0.0119 0.8394 0.9162
No log 3.7143 364 0.8324 -0.0647 0.8324 0.9124
No log 3.7347 366 0.9147 0.0680 0.9147 0.9564
No log 3.7551 368 1.0408 0.0159 1.0408 1.0202
No log 3.7755 370 1.0326 0.0175 1.0326 1.0162
No log 3.7959 372 0.9615 -0.0661 0.9615 0.9806
No log 3.8163 374 0.8910 -0.1045 0.8910 0.9439
No log 3.8367 376 0.8673 -0.0314 0.8673 0.9313
No log 3.8571 378 0.8957 -0.1412 0.8957 0.9464
No log 3.8776 380 0.9112 -0.2107 0.9112 0.9546
No log 3.8980 382 0.9022 -0.2107 0.9022 0.9498
No log 3.9184 384 0.8165 -0.1268 0.8165 0.9036
No log 3.9388 386 0.7738 -0.0578 0.7738 0.8796
No log 3.9592 388 0.7682 -0.0578 0.7682 0.8765
No log 3.9796 390 0.7854 -0.0449 0.7854 0.8862
No log 4.0 392 0.8152 -0.0406 0.8152 0.9029
No log 4.0204 394 0.8485 -0.0226 0.8485 0.9211
No log 4.0408 396 0.8568 -0.0620 0.8568 0.9257
No log 4.0612 398 0.8925 -0.1103 0.8925 0.9447
No log 4.0816 400 0.8440 -0.0572 0.8440 0.9187
No log 4.1020 402 0.8532 -0.0517 0.8532 0.9237
No log 4.1224 404 0.9543 -0.0638 0.9543 0.9769
No log 4.1429 406 1.1699 -0.0023 1.1699 1.0816
No log 4.1633 408 1.1696 0.0247 1.1696 1.0815
No log 4.1837 410 1.0133 -0.0606 1.0133 1.0066
No log 4.2041 412 0.9228 -0.0103 0.9228 0.9606
No log 4.2245 414 0.8584 0.0123 0.8584 0.9265
No log 4.2449 416 0.9174 -0.0059 0.9174 0.9578
No log 4.2653 418 1.1017 -0.0096 1.1017 1.0496
No log 4.2857 420 1.1275 0.0247 1.1275 1.0618
No log 4.3061 422 0.9379 -0.1103 0.9379 0.9684
No log 4.3265 424 0.7870 0.0518 0.7870 0.8871
No log 4.3469 426 0.7912 0.0983 0.7912 0.8895
No log 4.3673 428 0.8301 -0.0248 0.8301 0.9111
No log 4.3878 430 0.9783 -0.0245 0.9783 0.9891
No log 4.4082 432 0.9750 -0.0545 0.9750 0.9874
No log 4.4286 434 0.9842 -0.0245 0.9842 0.9921
No log 4.4490 436 1.0177 -0.0181 1.0177 1.0088
No log 4.4694 438 0.9446 -0.0672 0.9446 0.9719
No log 4.4898 440 0.8824 -0.0949 0.8824 0.9393
No log 4.5102 442 0.8962 -0.1249 0.8962 0.9467
No log 4.5306 444 0.9017 -0.1249 0.9017 0.9496
No log 4.5510 446 0.8941 -0.0473 0.8941 0.9456
No log 4.5714 448 0.8337 0.0449 0.8337 0.9131
No log 4.5918 450 0.8596 0.0393 0.8596 0.9272
No log 4.6122 452 0.9129 0.0189 0.9129 0.9555
No log 4.6327 454 0.9853 -0.0353 0.9853 0.9926
No log 4.6531 456 0.9686 -0.0672 0.9686 0.9842
No log 4.6735 458 0.9914 -0.0563 0.9914 0.9957
No log 4.6939 460 0.8870 -0.0458 0.8870 0.9418
No log 4.7143 462 0.7987 0.1304 0.7987 0.8937
No log 4.7347 464 0.8613 0.0438 0.8613 0.9280
No log 4.7551 466 0.8439 0.0956 0.8439 0.9186
No log 4.7755 468 0.7946 0.0869 0.7946 0.8914
No log 4.7959 470 0.8750 -0.0852 0.8750 0.9354
No log 4.8163 472 0.9084 -0.0811 0.9084 0.9531
No log 4.8367 474 0.8329 -0.1979 0.8329 0.9126
No log 4.8571 476 0.7900 0.0828 0.7900 0.8888
No log 4.8776 478 0.8133 -0.0444 0.8133 0.9019
No log 4.8980 480 0.9362 -0.0036 0.9362 0.9676
No log 4.9184 482 1.1817 -0.0041 1.1817 1.0871
No log 4.9388 484 1.2091 -0.0041 1.2091 1.0996
No log 4.9592 486 1.0103 -0.0164 1.0103 1.0051
No log 4.9796 488 0.7908 -0.1331 0.7908 0.8893
No log 5.0 490 0.7490 0.1740 0.7490 0.8654
No log 5.0204 492 0.7737 0.1148 0.7737 0.8796
No log 5.0408 494 0.7563 0.1259 0.7563 0.8697
No log 5.0612 496 0.7590 0.0338 0.7590 0.8712
No log 5.0816 498 0.7699 0.0414 0.7699 0.8774
0.3636 5.1020 500 0.7814 0.0414 0.7814 0.8839
0.3636 5.1224 502 0.8000 -0.0030 0.8000 0.8944
0.3636 5.1429 504 0.7694 0.0375 0.7694 0.8772
0.3636 5.1633 506 0.7793 0.0723 0.7793 0.8828
0.3636 5.1837 508 0.7816 0.0714 0.7816 0.8841
0.3636 5.2041 510 0.7720 0.0414 0.7720 0.8787
0.3636 5.2245 512 0.8359 -0.1266 0.8359 0.9143
0.3636 5.2449 514 0.8905 -0.1099 0.8905 0.9436
0.3636 5.2653 516 0.8801 -0.1527 0.8801 0.9381
0.3636 5.2857 518 0.8082 -0.1398 0.8082 0.8990
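The Qwk column above is quadratic weighted kappa (Cohen's kappa with quadratic weights); values near zero or below, as in the final rows, indicate agreement with the gold labels no better than chance. A minimal pure-Python sketch of the metric (an illustrative helper, not the project's evaluation code; equivalent to scikit-learn's cohen_kappa_score with weights="quadratic"):

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa for integer labels in [0, n_classes)."""
    # Observed confusion matrix.
    obs = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    n = len(y_true)
    # Marginal histograms for the expected (chance) matrix.
    hist_t = [sum(row) for row in obs]
    hist_p = [sum(obs[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic penalty
            expected = hist_t[i] * hist_p[j] / n
            num += w * obs[i][j]
            den += w * expected
    return 1.0 - num / den
```

Perfect agreement yields 1.0, chance-level agreement yields 0.0, and systematic disagreement yields negative values, like several of the Qwk entries in the table.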

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (safetensors, F32)
