ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k19_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the training dataset is not specified in this card. It achieves the following results on the evaluation set:

  • Loss: 0.8564
  • Qwk: 0.3506
  • Mse: 0.8564
  • Rmse: 0.9254
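For reference, the reported metrics can be recomputed from model predictions and gold labels. Below is a minimal pure-Python sketch of quadratic weighted kappa (Qwk, the standard agreement metric for ordinal essay scores), MSE, and RMSE; the labels and predictions are made-up toy values, not the actual evaluation data:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, min_rating, max_rating):
    """Cohen's kappa with quadratic weights over an ordinal rating scale."""
    n = max_rating - min_rating + 1
    # Observed rating co-occurrence matrix.
    observed = [[0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        observed[t - min_rating][p - min_rating] += 1
    # Marginal histograms give the expected (chance) matrix.
    hist_t = [sum(row) for row in observed]
    hist_p = [sum(observed[i][j] for i in range(n)) for j in range(n)]
    total = len(y_true)
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            w = (i - j) ** 2 / (n - 1) ** 2  # quadratic disagreement weight
            num += w * observed[i][j]
            den += w * hist_t[i] * hist_p[j] / total
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Toy example on a 0-3 scale (not the real evaluation data).
labels = [0, 1, 2, 2, 3, 1]
preds  = [0, 2, 2, 1, 3, 1]
m = mse(labels, preds)
print(quadratic_weighted_kappa(labels, preds, 0, 3))  # ≈ 0.8182
print(m, math.sqrt(m))                                # MSE and RMSE
```

Note that because the task uses squared-error metrics, MSE equals the validation loss here, and RMSE is simply its square root (0.8564 → 0.9254).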

Model description

More information needed

Intended uses & limitations

More information needed
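Since the card gives no usage details, the following is only a typical loading sketch. It assumes the checkpoint exposes a sequence-classification head (consistent with an essay-scoring task) and uses the repo id this card belongs to; the Arabic input string is a placeholder:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k19_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

# Placeholder essay text; real inputs should follow AraBERT's preprocessing.
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)
```

Running this requires downloading the checkpoint from the Hugging Face Hub.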

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
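The hyperparameters above can be expressed as a `TrainingArguments` configuration for the Transformers `Trainer`; this is a sketch, and the `output_dir` name is hypothetical (Adam betas/epsilon and the linear schedule listed above are the `Trainer` defaults):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="arabert-task5-organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```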

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0222 2 4.0614 0.0069 4.0614 2.0153
No log 0.0444 4 2.0585 0.0159 2.0585 1.4347
No log 0.0667 6 1.4231 0.0294 1.4231 1.1929
No log 0.0889 8 1.1084 0.2441 1.1084 1.0528
No log 0.1111 10 0.9968 0.2391 0.9968 0.9984
No log 0.1333 12 0.9902 0.2865 0.9902 0.9951
No log 0.1556 14 0.9814 0.3139 0.9814 0.9906
No log 0.1778 16 1.2828 0.2817 1.2828 1.1326
No log 0.2 18 1.4709 0.1280 1.4709 1.2128
No log 0.2222 20 1.3887 0.1700 1.3887 1.1784
No log 0.2444 22 1.1315 0.2662 1.1315 1.0637
No log 0.2667 24 1.1312 0.2757 1.1312 1.0636
No log 0.2889 26 1.4600 0.1670 1.4600 1.2083
No log 0.3111 28 1.6531 0.0797 1.6531 1.2857
No log 0.3333 30 1.3565 0.2062 1.3565 1.1647
No log 0.3556 32 1.1138 0.2629 1.1138 1.0554
No log 0.3778 34 1.0577 0.2761 1.0577 1.0285
No log 0.4 36 1.1143 0.2779 1.1143 1.0556
No log 0.4222 38 1.3052 0.2034 1.3052 1.1424
No log 0.4444 40 1.3969 0.2340 1.3969 1.1819
No log 0.4667 42 1.3581 0.2133 1.3581 1.1654
No log 0.4889 44 1.3338 0.2239 1.3338 1.1549
No log 0.5111 46 1.4024 0.2230 1.4024 1.1842
No log 0.5333 48 1.2380 0.2380 1.2380 1.1127
No log 0.5556 50 1.0813 0.3196 1.0813 1.0399
No log 0.5778 52 1.0151 0.2290 1.0151 1.0075
No log 0.6 54 1.0468 0.2231 1.0468 1.0231
No log 0.6222 56 1.1604 0.2291 1.1604 1.0772
No log 0.6444 58 1.5620 0.1343 1.5620 1.2498
No log 0.6667 60 1.8084 0.1212 1.8084 1.3448
No log 0.6889 62 1.4198 0.1441 1.4198 1.1915
No log 0.7111 64 1.2981 0.2227 1.2981 1.1394
No log 0.7333 66 1.2613 0.2712 1.2613 1.1231
No log 0.7556 68 1.1664 0.2884 1.1664 1.0800
No log 0.7778 70 1.1023 0.2711 1.1023 1.0499
No log 0.8 72 1.1076 0.2203 1.1076 1.0524
No log 0.8222 74 1.1270 0.2371 1.1270 1.0616
No log 0.8444 76 1.0750 0.2479 1.0750 1.0368
No log 0.8667 78 1.0744 0.1189 1.0744 1.0365
No log 0.8889 80 1.1204 0.1558 1.1204 1.0585
No log 0.9111 82 1.1909 0.1797 1.1909 1.0913
No log 0.9333 84 1.2982 0.1782 1.2982 1.1394
No log 0.9556 86 1.4515 0.1045 1.4515 1.2048
No log 0.9778 88 1.3879 0.2050 1.3879 1.1781
No log 1.0 90 1.2873 0.2600 1.2873 1.1346
No log 1.0222 92 1.3403 0.2191 1.3403 1.1577
No log 1.0444 94 1.3486 0.2288 1.3486 1.1613
No log 1.0667 96 1.3323 0.1884 1.3323 1.1542
No log 1.0889 98 1.2460 0.2100 1.2460 1.1162
No log 1.1111 100 1.1347 0.3205 1.1347 1.0652
No log 1.1333 102 1.1269 0.3162 1.1269 1.0616
No log 1.1556 104 1.2643 0.2744 1.2643 1.1244
No log 1.1778 106 1.2184 0.2703 1.2184 1.1038
No log 1.2 108 1.0733 0.3268 1.0733 1.0360
No log 1.2222 110 1.0157 0.3958 1.0157 1.0078
No log 1.2444 112 1.0811 0.2271 1.0811 1.0398
No log 1.2667 114 1.0475 0.3141 1.0475 1.0235
No log 1.2889 116 1.0032 0.3633 1.0032 1.0016
No log 1.3111 118 1.0975 0.3304 1.0975 1.0476
No log 1.3333 120 1.0778 0.3634 1.0778 1.0382
No log 1.3556 122 1.0104 0.3675 1.0104 1.0052
No log 1.3778 124 0.9548 0.4239 0.9548 0.9772
No log 1.4 126 0.9819 0.4011 0.9819 0.9909
No log 1.4222 128 1.0405 0.3875 1.0405 1.0201
No log 1.4444 130 1.0550 0.4129 1.0550 1.0271
No log 1.4667 132 1.0918 0.3151 1.0918 1.0449
No log 1.4889 134 1.0629 0.3027 1.0629 1.0310
No log 1.5111 136 1.0105 0.3361 1.0105 1.0052
No log 1.5333 138 0.9853 0.4202 0.9853 0.9926
No log 1.5556 140 1.0979 0.4045 1.0979 1.0478
No log 1.5778 142 1.2221 0.2496 1.2221 1.1055
No log 1.6 144 1.3550 0.2352 1.3550 1.1640
No log 1.6222 146 1.2787 0.2539 1.2787 1.1308
No log 1.6444 148 1.1257 0.4450 1.1257 1.0610
No log 1.6667 150 1.0128 0.4186 1.0128 1.0064
No log 1.6889 152 1.0000 0.3602 1.0000 1.0000
No log 1.7111 154 1.0032 0.3500 1.0032 1.0016
No log 1.7333 156 1.0204 0.3858 1.0204 1.0102
No log 1.7556 158 1.0157 0.4151 1.0157 1.0078
No log 1.7778 160 0.9940 0.3699 0.9940 0.9970
No log 1.8 162 0.9844 0.3986 0.9844 0.9922
No log 1.8222 164 0.9580 0.3306 0.9580 0.9788
No log 1.8444 166 0.9998 0.4398 0.9998 0.9999
No log 1.8667 168 1.0162 0.4503 1.0162 1.0081
No log 1.8889 170 0.9746 0.3625 0.9746 0.9872
No log 1.9111 172 0.9749 0.3817 0.9749 0.9874
No log 1.9333 174 1.0153 0.3578 1.0153 1.0076
No log 1.9556 176 1.0446 0.3449 1.0446 1.0221
No log 1.9778 178 1.0640 0.2982 1.0640 1.0315
No log 2.0 180 1.1481 0.2302 1.1481 1.0715
No log 2.0222 182 1.0994 0.2469 1.0994 1.0485
No log 2.0444 184 0.9809 0.3354 0.9809 0.9904
No log 2.0667 186 0.8965 0.3363 0.8965 0.9468
No log 2.0889 188 0.8441 0.3908 0.8441 0.9188
No log 2.1111 190 0.7963 0.5248 0.7963 0.8923
No log 2.1333 192 0.8675 0.4836 0.8675 0.9314
No log 2.1556 194 1.1489 0.4580 1.1489 1.0719
No log 2.1778 196 1.3081 0.3126 1.3081 1.1437
No log 2.2 198 1.1740 0.4032 1.1740 1.0835
No log 2.2222 200 1.0028 0.5220 1.0028 1.0014
No log 2.2444 202 0.9526 0.5019 0.9526 0.9760
No log 2.2667 204 0.9223 0.5048 0.9223 0.9604
No log 2.2889 206 0.9330 0.5041 0.9330 0.9659
No log 2.3111 208 0.8903 0.4601 0.8903 0.9436
No log 2.3333 210 0.8722 0.4401 0.8722 0.9339
No log 2.3556 212 0.9183 0.3152 0.9183 0.9583
No log 2.3778 214 0.9490 0.2871 0.9490 0.9741
No log 2.4 216 0.9526 0.2577 0.9526 0.9760
No log 2.4222 218 0.9055 0.3779 0.9055 0.9516
No log 2.4444 220 0.9333 0.3725 0.9333 0.9660
No log 2.4667 222 0.9431 0.3725 0.9431 0.9711
No log 2.4889 224 0.9424 0.3457 0.9424 0.9708
No log 2.5111 226 1.1327 0.3318 1.1327 1.0643
No log 2.5333 228 1.2947 0.2178 1.2947 1.1378
No log 2.5556 230 1.1428 0.3365 1.1428 1.0690
No log 2.5778 232 0.9687 0.4000 0.9687 0.9842
No log 2.6 234 1.0213 0.4621 1.0213 1.0106
No log 2.6222 236 1.0807 0.4307 1.0807 1.0396
No log 2.6444 238 1.1269 0.4036 1.1269 1.0616
No log 2.6667 240 1.0730 0.4436 1.0730 1.0359
No log 2.6889 242 1.0197 0.4638 1.0197 1.0098
No log 2.7111 244 0.9506 0.4328 0.9506 0.9750
No log 2.7333 246 0.9016 0.4731 0.9016 0.9495
No log 2.7556 248 0.9216 0.4410 0.9216 0.9600
No log 2.7778 250 0.9034 0.4601 0.9034 0.9505
No log 2.8 252 0.8873 0.4207 0.8873 0.9420
No log 2.8222 254 0.8796 0.4867 0.8796 0.9379
No log 2.8444 256 0.8673 0.4857 0.8673 0.9313
No log 2.8667 258 0.8352 0.4843 0.8352 0.9139
No log 2.8889 260 0.7834 0.5098 0.7834 0.8851
No log 2.9111 262 0.7652 0.5234 0.7652 0.8748
No log 2.9333 264 0.7946 0.4712 0.7946 0.8914
No log 2.9556 266 0.9131 0.4804 0.9131 0.9556
No log 2.9778 268 1.0180 0.4783 1.0180 1.0090
No log 3.0 270 0.9700 0.4929 0.9700 0.9849
No log 3.0222 272 0.8624 0.4648 0.8624 0.9286
No log 3.0444 274 0.8505 0.5030 0.8505 0.9222
No log 3.0667 276 0.8558 0.5030 0.8558 0.9251
No log 3.0889 278 0.9218 0.4840 0.9218 0.9601
No log 3.1111 280 0.9777 0.4823 0.9777 0.9888
No log 3.1333 282 0.9289 0.4634 0.9289 0.9638
No log 3.1556 284 0.8906 0.4676 0.8906 0.9437
No log 3.1778 286 0.9015 0.4413 0.9015 0.9495
No log 3.2 288 1.0178 0.4695 1.0178 1.0089
No log 3.2222 290 1.0827 0.4560 1.0827 1.0406
No log 3.2444 292 0.9765 0.4428 0.9765 0.9882
No log 3.2667 294 0.9054 0.4375 0.9054 0.9515
No log 3.2889 296 0.9383 0.4460 0.9383 0.9687
No log 3.3111 298 0.9858 0.4783 0.9858 0.9929
No log 3.3333 300 0.8848 0.4812 0.8848 0.9406
No log 3.3556 302 0.8620 0.4801 0.8620 0.9285
No log 3.3778 304 0.8966 0.4157 0.8966 0.9469
No log 3.4 306 0.9571 0.4144 0.9571 0.9783
No log 3.4222 308 0.9819 0.4020 0.9819 0.9909
No log 3.4444 310 0.8972 0.4144 0.8972 0.9472
No log 3.4667 312 0.8745 0.4411 0.8745 0.9352
No log 3.4889 314 0.8106 0.4503 0.8106 0.9004
No log 3.5111 316 0.8257 0.4401 0.8257 0.9087
No log 3.5333 318 0.9361 0.5102 0.9361 0.9675
No log 3.5556 320 1.0186 0.5184 1.0186 1.0092
No log 3.5778 322 0.9889 0.4987 0.9889 0.9944
No log 3.6 324 0.9519 0.4212 0.9519 0.9757
No log 3.6222 326 0.8376 0.4466 0.8376 0.9152
No log 3.6444 328 0.7951 0.3673 0.7951 0.8917
No log 3.6667 330 0.8126 0.3652 0.8126 0.9014
No log 3.6889 332 0.8865 0.3623 0.8865 0.9415
No log 3.7111 334 0.9299 0.3766 0.9299 0.9643
No log 3.7333 336 0.8706 0.3785 0.8706 0.9330
No log 3.7556 338 0.7776 0.4661 0.7776 0.8818
No log 3.7778 340 0.7859 0.5381 0.7859 0.8865
No log 3.8 342 0.7959 0.5146 0.7959 0.8922
No log 3.8222 344 0.8309 0.5002 0.8309 0.9115
No log 3.8444 346 0.9121 0.4577 0.9121 0.9551
No log 3.8667 348 0.9035 0.4575 0.9035 0.9505
No log 3.8889 350 0.7991 0.4873 0.7991 0.8939
No log 3.9111 352 0.7790 0.4115 0.7790 0.8826
No log 3.9333 354 0.7752 0.4145 0.7752 0.8804
No log 3.9556 356 0.7858 0.4988 0.7858 0.8864
No log 3.9778 358 0.8123 0.4728 0.8123 0.9013
No log 4.0 360 0.8174 0.4728 0.8174 0.9041
No log 4.0222 362 0.7946 0.4873 0.7946 0.8914
No log 4.0444 364 0.7740 0.5050 0.7740 0.8798
No log 4.0667 366 0.7846 0.4614 0.7846 0.8858
No log 4.0889 368 0.8293 0.4435 0.8293 0.9107
No log 4.1111 370 0.8150 0.4824 0.8150 0.9028
No log 4.1333 372 0.8415 0.4429 0.8415 0.9173
No log 4.1556 374 0.8472 0.4197 0.8472 0.9204
No log 4.1778 376 0.8524 0.4000 0.8524 0.9233
No log 4.2 378 0.8665 0.3522 0.8665 0.9309
No log 4.2222 380 0.8533 0.3914 0.8533 0.9237
No log 4.2444 382 0.9631 0.4681 0.9631 0.9814
No log 4.2667 384 1.0498 0.4779 1.0498 1.0246
No log 4.2889 386 1.0367 0.3794 1.0367 1.0182
No log 4.3111 388 0.9819 0.3794 0.9819 0.9909
No log 4.3333 390 0.9317 0.4098 0.9317 0.9652
No log 4.3556 392 0.8965 0.3879 0.8965 0.9468
No log 4.3778 394 0.8816 0.3652 0.8816 0.9389
No log 4.4 396 0.8968 0.4010 0.8968 0.9470
No log 4.4222 398 0.9456 0.4388 0.9456 0.9724
No log 4.4444 400 0.9408 0.4792 0.9408 0.9700
No log 4.4667 402 0.9500 0.4676 0.9500 0.9747
No log 4.4889 404 0.9376 0.4145 0.9376 0.9683
No log 4.5111 406 0.9330 0.4498 0.9330 0.9659
No log 4.5333 408 0.9398 0.3631 0.9398 0.9694
No log 4.5556 410 1.0204 0.4595 1.0204 1.0101
No log 4.5778 412 1.0148 0.4371 1.0148 1.0074
No log 4.6 414 0.9842 0.4379 0.9842 0.9920
No log 4.6222 416 0.9998 0.4379 0.9998 0.9999
No log 4.6444 418 0.9679 0.4247 0.9679 0.9838
No log 4.6667 420 1.0078 0.4707 1.0078 1.0039
No log 4.6889 422 1.0068 0.4915 1.0068 1.0034
No log 4.7111 424 0.9507 0.4237 0.9507 0.9750
No log 4.7333 426 0.9619 0.4102 0.9619 0.9808
No log 4.7556 428 0.9743 0.4417 0.9743 0.9870
No log 4.7778 430 1.0065 0.3902 1.0065 1.0033
No log 4.8 432 0.9289 0.2623 0.9289 0.9638
No log 4.8222 434 0.8939 0.3372 0.8939 0.9455
No log 4.8444 436 0.9111 0.3957 0.9111 0.9545
No log 4.8667 438 0.9554 0.4111 0.9554 0.9775
No log 4.8889 440 1.0094 0.4792 1.0094 1.0047
No log 4.9111 442 1.0394 0.4881 1.0394 1.0195
No log 4.9333 444 0.9524 0.4681 0.9524 0.9759
No log 4.9556 446 0.8771 0.4466 0.8771 0.9365
No log 4.9778 448 0.8531 0.2338 0.8531 0.9236
No log 5.0 450 0.8536 0.2314 0.8536 0.9239
No log 5.0222 452 0.9121 0.4546 0.9121 0.9550
No log 5.0444 454 0.9415 0.4275 0.9415 0.9703
No log 5.0667 456 0.8621 0.4562 0.8621 0.9285
No log 5.0889 458 0.8243 0.3271 0.8243 0.9079
No log 5.1111 460 0.8309 0.4015 0.8309 0.9115
No log 5.1333 462 0.8618 0.4995 0.8618 0.9283
No log 5.1556 464 0.9657 0.5033 0.9657 0.9827
No log 5.1778 466 1.0406 0.5318 1.0406 1.0201
No log 5.2 468 1.0055 0.4250 1.0055 1.0027
No log 5.2222 470 0.9561 0.4662 0.9561 0.9778
No log 5.2444 472 0.9667 0.4300 0.9667 0.9832
No log 5.2667 474 0.9814 0.4527 0.9814 0.9907
No log 5.2889 476 0.9808 0.3427 0.9808 0.9904
No log 5.3111 478 1.0888 0.3792 1.0888 1.0435
No log 5.3333 480 1.1221 0.3747 1.1221 1.0593
No log 5.3556 482 1.0069 0.3597 1.0069 1.0034
No log 5.3778 484 0.9005 0.2812 0.9005 0.9489
No log 5.4 486 0.9023 0.3250 0.9023 0.9499
No log 5.4222 488 0.9046 0.3250 0.9046 0.9511
No log 5.4444 490 0.9071 0.3656 0.9071 0.9524
No log 5.4667 492 0.9371 0.4013 0.9371 0.9681
No log 5.4889 494 0.9618 0.4042 0.9618 0.9807
No log 5.5111 496 0.9662 0.3625 0.9662 0.9829
No log 5.5333 498 0.9513 0.3128 0.9513 0.9754
0.3034 5.5556 500 0.9760 0.3706 0.9760 0.9879
0.3034 5.5778 502 0.9766 0.3383 0.9766 0.9882
0.3034 5.6 504 0.8922 0.3285 0.8922 0.9446
0.3034 5.6222 506 0.8553 0.3840 0.8553 0.9248
0.3034 5.6444 508 0.8491 0.3840 0.8491 0.9215
0.3034 5.6667 510 0.8868 0.4192 0.8868 0.9417
0.3034 5.6889 512 0.9462 0.4039 0.9462 0.9727
0.3034 5.7111 514 0.9355 0.4174 0.9355 0.9672
0.3034 5.7333 516 0.9086 0.3404 0.9086 0.9532
0.3034 5.7556 518 0.8657 0.2944 0.8657 0.9304
0.3034 5.7778 520 0.8770 0.2944 0.8770 0.9365
0.3034 5.8 522 0.8684 0.2944 0.8684 0.9319
0.3034 5.8222 524 0.8671 0.3799 0.8671 0.9312
0.3034 5.8444 526 0.9115 0.3921 0.9115 0.9547
0.3034 5.8667 528 0.8679 0.4327 0.8679 0.9316
0.3034 5.8889 530 0.8179 0.3697 0.8179 0.9044
0.3034 5.9111 532 0.8486 0.3365 0.8486 0.9212
0.3034 5.9333 534 0.8635 0.3676 0.8635 0.9292
0.3034 5.9556 536 0.8417 0.3323 0.8417 0.9175
0.3034 5.9778 538 0.8564 0.3506 0.8564 0.9254

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors model size: 0.1B parameters (F32 tensors)

Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k19_task5_organization

This model is one of 4019 fine-tunes of aubmindlab/bert-base-arabertv02.