ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k7_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9009
  • Qwk (quadratic weighted kappa): 0.3021
  • Mse (mean squared error): 0.9009
  • Rmse (root mean squared error): 0.9491
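
These figures can be reproduced from model predictions along the following lines (a minimal sketch using scikit-learn; rounding predictions to integer scores before computing QWK is an assumption, as the label scheme is not documented in this card):

```python
# Minimal sketch for reproducing the reported metrics with scikit-learn.
# Assumption (not documented in this card): gold labels are integer
# organization scores, and continuous predictions are rounded before QWK.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    mse = mean_squared_error(y_true, y_pred)
    qwk = cohen_kappa_score(
        np.rint(y_true).astype(int),
        np.rint(y_pred).astype(int),
        weights="quadratic",  # penalizes larger score disagreements more heavily
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```

Loss and Mse are essentially identical at every checkpoint in the results below, which suggests the model was trained with an MSE objective, i.e., as a single-output regressor.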

Model description

More information needed

Intended uses & limitations

More information needed
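
Until this section is documented, the following is a minimal loading sketch. It assumes a single-logit regression head (suggested by Loss ≈ Mse in the training results); the valid score range and any post-hoc rounding are not documented:

```python
# Minimal inference sketch. Assumption: the checkpoint carries a single-logit
# regression head; the valid score range is not documented in this card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = ("MayBashendy/ArabicNewSplits7_B_usingALLEssays_"
            "FineTuningAraBERT_run1_AugV5_k7_task5_organization")
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id).eval()

essay = "..."  # an Arabic essay to be scored for organization
inputs = tokenizer(essay, truncation=True, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(f"predicted organization score: {score:.2f}")
```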

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
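
For orientation, these settings map onto Hugging Face TrainingArguments roughly as follows (a sketch, not the published training script; the output path is hypothetical, and the evaluation/logging intervals are inferred from the results table below):

```python
# Approximate reconstruction of the listed hyperparameters as TrainingArguments.
# The optimizer line above matches the Trainer default (AdamW with these betas
# and epsilon). output_dir is hypothetical; eval/logging steps are inferred
# from the results table (evaluation every 2 steps, loss logged every 500).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task5_organization",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",
    eval_steps=2,
    logging_steps=500,
)
```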

Training results

Training loss was first reported at step 500; evaluation rows before that show "No log" in the first column. Note also that although 100 epochs were configured, the log ends at epoch 15.82 (step 522).

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|:----:|
| No log | 0.0606 | 2 | 4.2870 | 0.0094 | 4.2870 | 2.0705 |
| No log | 0.1212 | 4 | 2.7798 | -0.0458 | 2.7798 | 1.6673 |
| No log | 0.1818 | 6 | 1.6862 | 0.0203 | 1.6862 | 1.2985 |
| No log | 0.2424 | 8 | 1.1307 | 0.1848 | 1.1307 | 1.0633 |
| No log | 0.3030 | 10 | 1.0419 | 0.2291 | 1.0419 | 1.0207 |
| No log | 0.3636 | 12 | 1.1026 | 0.2465 | 1.1026 | 1.0500 |
| No log | 0.4242 | 14 | 1.2418 | 0.0878 | 1.2418 | 1.1143 |
| No log | 0.4848 | 16 | 1.2841 | 0.0496 | 1.2841 | 1.1332 |
| No log | 0.5455 | 18 | 1.3494 | 0.0 | 1.3494 | 1.1616 |
| No log | 0.6061 | 20 | 1.1058 | 0.1832 | 1.1058 | 1.0516 |
| No log | 0.6667 | 22 | 1.1739 | 0.1658 | 1.1739 | 1.0834 |
| No log | 0.7273 | 24 | 1.4008 | 0.1257 | 1.4008 | 1.1836 |
| No log | 0.7879 | 26 | 1.0857 | 0.1521 | 1.0857 | 1.0420 |
| No log | 0.8485 | 28 | 1.0133 | 0.1379 | 1.0133 | 1.0066 |
| No log | 0.9091 | 30 | 1.0085 | 0.2865 | 1.0085 | 1.0042 |
| No log | 0.9697 | 32 | 1.1053 | 0.1361 | 1.1053 | 1.0513 |
| No log | 1.0303 | 34 | 1.2677 | 0.1602 | 1.2677 | 1.1259 |
| No log | 1.0909 | 36 | 1.2538 | 0.1602 | 1.2538 | 1.1197 |
| No log | 1.1515 | 38 | 1.1166 | 0.1876 | 1.1166 | 1.0567 |
| No log | 1.2121 | 40 | 1.1068 | 0.1482 | 1.1068 | 1.0521 |
| No log | 1.2727 | 42 | 1.2954 | 0.1576 | 1.2954 | 1.1382 |
| No log | 1.3333 | 44 | 1.6082 | 0.0518 | 1.6082 | 1.2682 |
| No log | 1.3939 | 46 | 1.5599 | 0.0161 | 1.5599 | 1.2490 |
| No log | 1.4545 | 48 | 1.3079 | 0.1310 | 1.3079 | 1.1436 |
| No log | 1.5152 | 50 | 1.0975 | 0.1851 | 1.0975 | 1.0476 |
| No log | 1.5758 | 52 | 1.0719 | 0.1851 | 1.0719 | 1.0353 |
| No log | 1.6364 | 54 | 1.1650 | 0.1902 | 1.1650 | 1.0794 |
| No log | 1.6970 | 56 | 1.1328 | 0.2138 | 1.1328 | 1.0643 |
| No log | 1.7576 | 58 | 1.0934 | 0.2138 | 1.0934 | 1.0456 |
| No log | 1.8182 | 60 | 1.0501 | 0.2759 | 1.0501 | 1.0247 |
| No log | 1.8788 | 62 | 1.1783 | 0.2644 | 1.1783 | 1.0855 |
| No log | 1.9394 | 64 | 1.1838 | 0.2271 | 1.1838 | 1.0880 |
| No log | 2.0 | 66 | 1.1893 | 0.2687 | 1.1893 | 1.0905 |
| No log | 2.0606 | 68 | 1.1125 | 0.2455 | 1.1125 | 1.0547 |
| No log | 2.1212 | 70 | 0.9949 | 0.2716 | 0.9949 | 0.9974 |
| No log | 2.1818 | 72 | 0.9503 | 0.2812 | 0.9503 | 0.9748 |
| No log | 2.2424 | 74 | 0.9240 | 0.2812 | 0.9240 | 0.9612 |
| No log | 2.3030 | 76 | 1.0643 | 0.2512 | 1.0643 | 1.0316 |
| No log | 2.3636 | 78 | 1.2039 | 0.2191 | 1.2039 | 1.0972 |
| No log | 2.4242 | 80 | 1.2247 | 0.1968 | 1.2248 | 1.1067 |
| No log | 2.4848 | 82 | 1.0690 | 0.2385 | 1.0690 | 1.0339 |
| No log | 2.5455 | 84 | 1.1943 | 0.3032 | 1.1943 | 1.0928 |
| No log | 2.6061 | 86 | 1.3161 | 0.2294 | 1.3161 | 1.1472 |
| No log | 2.6667 | 88 | 1.2818 | 0.2541 | 1.2818 | 1.1322 |
| No log | 2.7273 | 90 | 1.2321 | 0.2221 | 1.2321 | 1.1100 |
| No log | 2.7879 | 92 | 1.0405 | 0.3437 | 1.0405 | 1.0200 |
| No log | 2.8485 | 94 | 0.9459 | 0.3045 | 0.9459 | 0.9726 |
| No log | 2.9091 | 96 | 1.2431 | 0.1958 | 1.2431 | 1.1149 |
| No log | 2.9697 | 98 | 1.8205 | -0.0359 | 1.8205 | 1.3493 |
| No log | 3.0303 | 100 | 2.0836 | -0.1307 | 2.0836 | 1.4435 |
| No log | 3.0909 | 102 | 1.5573 | 0.0964 | 1.5573 | 1.2479 |
| No log | 3.1515 | 104 | 1.2236 | 0.1176 | 1.2236 | 1.1062 |
| No log | 3.2121 | 106 | 1.3942 | 0.0535 | 1.3942 | 1.1807 |
| No log | 3.2727 | 108 | 1.5821 | -0.0557 | 1.5821 | 1.2578 |
| No log | 3.3333 | 110 | 1.4244 | 0.0088 | 1.4244 | 1.1935 |
| No log | 3.3939 | 112 | 1.4317 | -0.0049 | 1.4317 | 1.1965 |
| No log | 3.4545 | 114 | 1.6379 | -0.0466 | 1.6379 | 1.2798 |
| No log | 3.5152 | 116 | 1.5734 | 0.0333 | 1.5734 | 1.2544 |
| No log | 3.5758 | 118 | 1.3308 | 0.1155 | 1.3308 | 1.1536 |
| No log | 3.6364 | 120 | 1.2885 | 0.1071 | 1.2885 | 1.1351 |
| No log | 3.6970 | 122 | 1.5026 | -0.0108 | 1.5026 | 1.2258 |
| No log | 3.7576 | 124 | 1.5534 | -0.0050 | 1.5534 | 1.2463 |
| No log | 3.8182 | 126 | 1.2940 | 0.1254 | 1.2940 | 1.1376 |
| No log | 3.8788 | 128 | 1.1945 | 0.1873 | 1.1945 | 1.0929 |
| No log | 3.9394 | 130 | 1.2477 | 0.1291 | 1.2477 | 1.1170 |
| No log | 4.0 | 132 | 1.1271 | 0.1701 | 1.1271 | 1.0617 |
| No log | 4.0606 | 134 | 1.1511 | 0.1788 | 1.1511 | 1.0729 |
| No log | 4.1212 | 136 | 1.2282 | 0.1685 | 1.2282 | 1.1082 |
| No log | 4.1818 | 138 | 1.0653 | 0.2114 | 1.0653 | 1.0321 |
| No log | 4.2424 | 140 | 1.0366 | 0.2151 | 1.0366 | 1.0181 |
| No log | 4.3030 | 142 | 1.1750 | 0.1406 | 1.1750 | 1.0840 |
| No log | 4.3636 | 144 | 1.1839 | 0.1005 | 1.1839 | 1.0881 |
| No log | 4.4242 | 146 | 1.1421 | 0.1713 | 1.1421 | 1.0687 |
| No log | 4.4848 | 148 | 1.0106 | 0.3347 | 1.0106 | 1.0053 |
| No log | 4.5455 | 150 | 1.0377 | 0.3914 | 1.0377 | 1.0187 |
| No log | 4.6061 | 152 | 1.0162 | 0.3788 | 1.0162 | 1.0081 |
| No log | 4.6667 | 154 | 0.9758 | 0.3725 | 0.9758 | 0.9878 |
| No log | 4.7273 | 156 | 1.0043 | 0.3569 | 1.0043 | 1.0021 |
| No log | 4.7879 | 158 | 0.9665 | 0.3467 | 0.9665 | 0.9831 |
| No log | 4.8485 | 160 | 1.0068 | 0.2986 | 1.0068 | 1.0034 |
| No log | 4.9091 | 162 | 0.9635 | 0.3257 | 0.9635 | 0.9816 |
| No log | 4.9697 | 164 | 0.8690 | 0.3933 | 0.8690 | 0.9322 |
| No log | 5.0303 | 166 | 0.8487 | 0.4192 | 0.8487 | 0.9213 |
| No log | 5.0909 | 168 | 0.9161 | 0.4826 | 0.9161 | 0.9571 |
| No log | 5.1515 | 170 | 0.9928 | 0.3913 | 0.9928 | 0.9964 |
| No log | 5.2121 | 172 | 0.9974 | 0.3897 | 0.9974 | 0.9987 |
| No log | 5.2727 | 174 | 0.9612 | 0.4349 | 0.9612 | 0.9804 |
| No log | 5.3333 | 176 | 0.8962 | 0.4522 | 0.8962 | 0.9467 |
| No log | 5.3939 | 178 | 0.9532 | 0.3347 | 0.9532 | 0.9763 |
| No log | 5.4545 | 180 | 1.1453 | 0.1546 | 1.1453 | 1.0702 |
| No log | 5.5152 | 182 | 1.1946 | 0.1987 | 1.1946 | 1.0930 |
| No log | 5.5758 | 184 | 1.1532 | 0.1398 | 1.1532 | 1.0739 |
| No log | 5.6364 | 186 | 1.0943 | 0.2263 | 1.0943 | 1.0461 |
| No log | 5.6970 | 188 | 1.1926 | 0.2526 | 1.1926 | 1.0920 |
| No log | 5.7576 | 190 | 1.1820 | 0.2959 | 1.1820 | 1.0872 |
| No log | 5.8182 | 192 | 1.0511 | 0.2704 | 1.0511 | 1.0253 |
| No log | 5.8788 | 194 | 1.0137 | 0.2610 | 1.0137 | 1.0068 |
| No log | 5.9394 | 196 | 0.9218 | 0.3740 | 0.9218 | 0.9601 |
| No log | 6.0 | 198 | 0.9021 | 0.3858 | 0.9021 | 0.9498 |
| No log | 6.0606 | 200 | 0.9567 | 0.3192 | 0.9567 | 0.9781 |
| No log | 6.1212 | 202 | 1.1185 | 0.2299 | 1.1185 | 1.0576 |
| No log | 6.1818 | 204 | 1.1929 | 0.1564 | 1.1929 | 1.0922 |
| No log | 6.2424 | 206 | 1.0620 | 0.2200 | 1.0620 | 1.0305 |
| No log | 6.3030 | 208 | 0.9444 | 0.3528 | 0.9444 | 0.9718 |
| No log | 6.3636 | 210 | 0.9258 | 0.3392 | 0.9258 | 0.9622 |
| No log | 6.4242 | 212 | 0.9865 | 0.3678 | 0.9865 | 0.9932 |
| No log | 6.4848 | 214 | 1.1648 | 0.1308 | 1.1648 | 1.0793 |
| No log | 6.5455 | 216 | 1.2129 | 0.1710 | 1.2129 | 1.1013 |
| No log | 6.6061 | 218 | 1.2144 | 0.1164 | 1.2144 | 1.1020 |
| No log | 6.6667 | 220 | 1.1325 | 0.2340 | 1.1325 | 1.0642 |
| No log | 6.7273 | 222 | 0.9913 | 0.2466 | 0.9913 | 0.9956 |
| No log | 6.7879 | 224 | 0.9324 | 0.2888 | 0.9324 | 0.9656 |
| No log | 6.8485 | 226 | 0.8834 | 0.3713 | 0.8834 | 0.9399 |
| No log | 6.9091 | 228 | 0.8856 | 0.4318 | 0.8856 | 0.9411 |
| No log | 6.9697 | 230 | 0.9596 | 0.4036 | 0.9596 | 0.9796 |
| No log | 7.0303 | 232 | 0.9153 | 0.4157 | 0.9153 | 0.9567 |
| No log | 7.0909 | 234 | 0.8132 | 0.5605 | 0.8132 | 0.9018 |
| No log | 7.1515 | 236 | 0.7840 | 0.4051 | 0.7840 | 0.8855 |
| No log | 7.2121 | 238 | 0.7819 | 0.4033 | 0.7819 | 0.8843 |
| No log | 7.2727 | 240 | 0.8247 | 0.5348 | 0.8247 | 0.9081 |
| No log | 7.3333 | 242 | 0.8281 | 0.5348 | 0.8281 | 0.9100 |
| No log | 7.3939 | 244 | 0.8872 | 0.5206 | 0.8872 | 0.9419 |
| No log | 7.4545 | 246 | 0.9297 | 0.4943 | 0.9297 | 0.9642 |
| No log | 7.5152 | 248 | 0.9586 | 0.4318 | 0.9586 | 0.9791 |
| No log | 7.5758 | 250 | 1.0318 | 0.3382 | 1.0318 | 1.0158 |
| No log | 7.6364 | 252 | 1.1960 | 0.0852 | 1.1960 | 1.0936 |
| No log | 7.6970 | 254 | 1.2166 | 0.1005 | 1.2166 | 1.1030 |
| No log | 7.7576 | 256 | 1.0587 | 0.1306 | 1.0587 | 1.0289 |
| No log | 7.8182 | 258 | 0.9333 | 0.4012 | 0.9333 | 0.9661 |
| No log | 7.8788 | 260 | 0.8650 | 0.4280 | 0.8650 | 0.9300 |
| No log | 7.9394 | 262 | 0.8729 | 0.4391 | 0.8729 | 0.9343 |
| No log | 8.0 | 264 | 0.8916 | 0.4742 | 0.8916 | 0.9442 |
| No log | 8.0606 | 266 | 0.8948 | 0.4499 | 0.8948 | 0.9459 |
| No log | 8.1212 | 268 | 0.8953 | 0.4499 | 0.8953 | 0.9462 |
| No log | 8.1818 | 270 | 0.8827 | 0.4033 | 0.8827 | 0.9395 |
| No log | 8.2424 | 272 | 0.9043 | 0.3172 | 0.9043 | 0.9510 |
| No log | 8.3030 | 274 | 0.9991 | 0.4041 | 0.9991 | 0.9996 |
| No log | 8.3636 | 276 | 1.0468 | 0.3222 | 1.0468 | 1.0231 |
| No log | 8.4242 | 278 | 0.9949 | 0.4180 | 0.9949 | 0.9975 |
| No log | 8.4848 | 280 | 0.9548 | 0.3821 | 0.9548 | 0.9771 |
| No log | 8.5455 | 282 | 0.9762 | 0.3285 | 0.9762 | 0.9880 |
| No log | 8.6061 | 284 | 1.0683 | 0.3103 | 1.0683 | 1.0336 |
| No log | 8.6667 | 286 | 1.1200 | 0.2378 | 1.1200 | 1.0583 |
| No log | 8.7273 | 288 | 1.0679 | 0.2698 | 1.0679 | 1.0334 |
| No log | 8.7879 | 290 | 1.0120 | 0.1918 | 1.0120 | 1.0060 |
| No log | 8.8485 | 292 | 1.0041 | 0.1874 | 1.0041 | 1.0020 |
| No log | 8.9091 | 294 | 0.9910 | 0.2114 | 0.9910 | 0.9955 |
| No log | 8.9697 | 296 | 0.9254 | 0.2857 | 0.9254 | 0.9620 |
| No log | 9.0303 | 298 | 0.9122 | 0.3323 | 0.9122 | 0.9551 |
| No log | 9.0909 | 300 | 1.0311 | 0.3278 | 1.0311 | 1.0154 |
| No log | 9.1515 | 302 | 1.0645 | 0.2494 | 1.0645 | 1.0318 |
| No log | 9.2121 | 304 | 0.9770 | 0.4039 | 0.9770 | 0.9885 |
| No log | 9.2727 | 306 | 0.8999 | 0.4461 | 0.8999 | 0.9486 |
| No log | 9.3333 | 308 | 0.8822 | 0.4478 | 0.8822 | 0.9392 |
| No log | 9.3939 | 310 | 0.8601 | 0.4133 | 0.8601 | 0.9274 |
| No log | 9.4545 | 312 | 0.8720 | 0.4395 | 0.8720 | 0.9338 |
| No log | 9.5152 | 314 | 0.9462 | 0.3822 | 0.9462 | 0.9727 |
| No log | 9.5758 | 316 | 1.0386 | 0.3103 | 1.0386 | 1.0191 |
| No log | 9.6364 | 318 | 1.0077 | 0.3668 | 1.0077 | 1.0039 |
| No log | 9.6970 | 320 | 0.9354 | 0.4203 | 0.9354 | 0.9672 |
| No log | 9.7576 | 322 | 0.8925 | 0.4119 | 0.8925 | 0.9447 |
| No log | 9.8182 | 324 | 0.9207 | 0.4681 | 0.9207 | 0.9595 |
| No log | 9.8788 | 326 | 0.9203 | 0.4681 | 0.9203 | 0.9593 |
| No log | 9.9394 | 328 | 0.8664 | 0.4595 | 0.8664 | 0.9308 |
| No log | 10.0 | 330 | 0.8350 | 0.3757 | 0.8350 | 0.9138 |
| No log | 10.0606 | 332 | 0.8701 | 0.4174 | 0.8701 | 0.9328 |
| No log | 10.1212 | 334 | 0.8529 | 0.3896 | 0.8529 | 0.9235 |
| No log | 10.1818 | 336 | 0.8403 | 0.3817 | 0.8403 | 0.9167 |
| No log | 10.2424 | 338 | 0.8585 | 0.4336 | 0.8585 | 0.9266 |
| No log | 10.3030 | 340 | 0.8348 | 0.3797 | 0.8348 | 0.9137 |
| No log | 10.3636 | 342 | 0.8398 | 0.3915 | 0.8398 | 0.9164 |
| No log | 10.4242 | 344 | 0.8942 | 0.3552 | 0.8942 | 0.9456 |
| No log | 10.4848 | 346 | 0.8727 | 0.3236 | 0.8727 | 0.9342 |
| No log | 10.5455 | 348 | 0.8294 | 0.4156 | 0.8294 | 0.9107 |
| No log | 10.6061 | 350 | 0.8770 | 0.3940 | 0.8770 | 0.9365 |
| No log | 10.6667 | 352 | 1.0407 | 0.3721 | 1.0407 | 1.0201 |
| No log | 10.7273 | 354 | 1.1174 | 0.3144 | 1.1174 | 1.0571 |
| No log | 10.7879 | 356 | 1.0425 | 0.3677 | 1.0425 | 1.0210 |
| No log | 10.8485 | 358 | 0.9380 | 0.4310 | 0.9380 | 0.9685 |
| No log | 10.9091 | 360 | 0.8699 | 0.3998 | 0.8699 | 0.9327 |
| No log | 10.9697 | 362 | 0.8870 | 0.2292 | 0.8870 | 0.9418 |
| No log | 11.0303 | 364 | 0.9103 | 0.2316 | 0.9103 | 0.9541 |
| No log | 11.0909 | 366 | 0.8907 | 0.2316 | 0.8907 | 0.9437 |
| No log | 11.1515 | 368 | 0.8532 | 0.3448 | 0.8532 | 0.9237 |
| No log | 11.2121 | 370 | 0.8333 | 0.4778 | 0.8333 | 0.9129 |
| No log | 11.2727 | 372 | 0.8626 | 0.4093 | 0.8626 | 0.9287 |
| No log | 11.3333 | 374 | 0.8867 | 0.4180 | 0.8867 | 0.9417 |
| No log | 11.3939 | 376 | 0.8693 | 0.4697 | 0.8693 | 0.9324 |
| No log | 11.4545 | 378 | 0.8770 | 0.4697 | 0.8770 | 0.9365 |
| No log | 11.5152 | 380 | 0.8781 | 0.4697 | 0.8781 | 0.9371 |
| No log | 11.5758 | 382 | 0.8884 | 0.4180 | 0.8884 | 0.9425 |
| No log | 11.6364 | 384 | 0.8613 | 0.3382 | 0.8613 | 0.9280 |
| No log | 11.6970 | 386 | 0.8430 | 0.3425 | 0.8430 | 0.9182 |
| No log | 11.7576 | 388 | 0.8382 | 0.3737 | 0.8382 | 0.9155 |
| No log | 11.8182 | 390 | 0.8448 | 0.3485 | 0.8448 | 0.9191 |
| No log | 11.8788 | 392 | 0.8626 | 0.3611 | 0.8626 | 0.9288 |
| No log | 11.9394 | 394 | 0.8549 | 0.3485 | 0.8549 | 0.9246 |
| No log | 12.0 | 396 | 0.8530 | 0.3360 | 0.8530 | 0.9236 |
| No log | 12.0606 | 398 | 0.8501 | 0.4097 | 0.8501 | 0.9220 |
| No log | 12.1212 | 400 | 0.8568 | 0.4361 | 0.8568 | 0.9256 |
| No log | 12.1818 | 402 | 0.8520 | 0.4401 | 0.8520 | 0.9231 |
| No log | 12.2424 | 404 | 0.8588 | 0.4531 | 0.8588 | 0.9267 |
| No log | 12.3030 | 406 | 0.8662 | 0.4494 | 0.8662 | 0.9307 |
| No log | 12.3636 | 408 | 0.8618 | 0.4101 | 0.8618 | 0.9283 |
| No log | 12.4242 | 410 | 0.8608 | 0.3979 | 0.8608 | 0.9278 |
| No log | 12.4848 | 412 | 0.8740 | 0.3979 | 0.8740 | 0.9349 |
| No log | 12.5455 | 414 | 0.8887 | 0.3838 | 0.8887 | 0.9427 |
| No log | 12.6061 | 416 | 0.8669 | 0.3838 | 0.8669 | 0.9311 |
| No log | 12.6667 | 418 | 0.8395 | 0.3979 | 0.8395 | 0.9162 |
| No log | 12.7273 | 420 | 0.8132 | 0.4259 | 0.8132 | 0.9018 |
| No log | 12.7879 | 422 | 0.8110 | 0.3757 | 0.8110 | 0.9005 |
| No log | 12.8485 | 424 | 0.8194 | 0.4381 | 0.8194 | 0.9052 |
| No log | 12.9091 | 426 | 0.8489 | 0.3631 | 0.8489 | 0.9213 |
| No log | 12.9697 | 428 | 0.8740 | 0.3339 | 0.8740 | 0.9349 |
| No log | 13.0303 | 430 | 0.8984 | 0.3044 | 0.8984 | 0.9479 |
| No log | 13.0909 | 432 | 0.9477 | 0.3295 | 0.9477 | 0.9735 |
| No log | 13.1515 | 434 | 0.9853 | 0.3041 | 0.9853 | 0.9926 |
| No log | 13.2121 | 436 | 0.9679 | 0.3063 | 0.9679 | 0.9838 |
| No log | 13.2727 | 438 | 0.9334 | 0.2239 | 0.9334 | 0.9661 |
| No log | 13.3333 | 440 | 0.9070 | 0.2692 | 0.9070 | 0.9524 |
| No log | 13.3939 | 442 | 0.8866 | 0.4063 | 0.8866 | 0.9416 |
| No log | 13.4545 | 444 | 0.9044 | 0.4461 | 0.9044 | 0.9510 |
| No log | 13.5152 | 446 | 0.8793 | 0.4839 | 0.8793 | 0.9377 |
| No log | 13.5758 | 448 | 0.8587 | 0.4757 | 0.8587 | 0.9267 |
| No log | 13.6364 | 450 | 0.8583 | 0.4499 | 0.8583 | 0.9264 |
| No log | 13.6970 | 452 | 0.8558 | 0.4499 | 0.8558 | 0.9251 |
| No log | 13.7576 | 454 | 0.8408 | 0.4251 | 0.8408 | 0.9169 |
| No log | 13.8182 | 456 | 0.8386 | 0.3977 | 0.8386 | 0.9158 |
| No log | 13.8788 | 458 | 0.8564 | 0.4728 | 0.8564 | 0.9254 |
| No log | 13.9394 | 460 | 0.8607 | 0.4713 | 0.8607 | 0.9278 |
| No log | 14.0 | 462 | 0.8512 | 0.4078 | 0.8512 | 0.9226 |
| No log | 14.0606 | 464 | 0.8455 | 0.4067 | 0.8455 | 0.9195 |
| No log | 14.1212 | 466 | 0.8562 | 0.3288 | 0.8562 | 0.9253 |
| No log | 14.1818 | 468 | 0.8679 | 0.3162 | 0.8679 | 0.9316 |
| No log | 14.2424 | 470 | 0.8797 | 0.3162 | 0.8797 | 0.9379 |
| No log | 14.3030 | 472 | 0.8752 | 0.3414 | 0.8752 | 0.9355 |
| No log | 14.3636 | 474 | 0.8774 | 0.3498 | 0.8774 | 0.9367 |
| No log | 14.4242 | 476 | 0.8692 | 0.3348 | 0.8692 | 0.9323 |
| No log | 14.4848 | 478 | 0.8622 | 0.3198 | 0.8622 | 0.9286 |
| No log | 14.5455 | 480 | 0.8563 | 0.4296 | 0.8563 | 0.9254 |
| No log | 14.6061 | 482 | 0.8396 | 0.4807 | 0.8396 | 0.9163 |
| No log | 14.6667 | 484 | 0.8311 | 0.5142 | 0.8311 | 0.9116 |
| No log | 14.7273 | 486 | 0.8656 | 0.4843 | 0.8656 | 0.9304 |
| No log | 14.7879 | 488 | 0.9075 | 0.5181 | 0.9075 | 0.9527 |
| No log | 14.8485 | 490 | 0.8987 | 0.5048 | 0.8987 | 0.9480 |
| No log | 14.9091 | 492 | 0.8668 | 0.4952 | 0.8669 | 0.9310 |
| No log | 14.9697 | 494 | 0.8236 | 0.4251 | 0.8236 | 0.9075 |
| No log | 15.0303 | 496 | 0.8291 | 0.4030 | 0.8291 | 0.9105 |
| No log | 15.0909 | 498 | 0.8501 | 0.4729 | 0.8501 | 0.9220 |
| 0.3231 | 15.1515 | 500 | 0.8694 | 0.4597 | 0.8694 | 0.9324 |
| 0.3231 | 15.2121 | 502 | 0.9001 | 0.4565 | 0.9001 | 0.9488 |
| 0.3231 | 15.2727 | 504 | 0.9023 | 0.4697 | 0.9023 | 0.9499 |
| 0.3231 | 15.3333 | 506 | 0.8543 | 0.4478 | 0.8543 | 0.9243 |
| 0.3231 | 15.3939 | 508 | 0.8502 | 0.4353 | 0.8502 | 0.9221 |
| 0.3231 | 15.4545 | 510 | 0.8354 | 0.4353 | 0.8354 | 0.9140 |
| 0.3231 | 15.5152 | 512 | 0.8499 | 0.4851 | 0.8499 | 0.9219 |
| 0.3231 | 15.5758 | 514 | 0.8978 | 0.4562 | 0.8978 | 0.9475 |
| 0.3231 | 15.6364 | 516 | 0.8979 | 0.4025 | 0.8979 | 0.9476 |
| 0.3231 | 15.6970 | 518 | 0.8798 | 0.3071 | 0.8798 | 0.9380 |
| 0.3231 | 15.7576 | 520 | 0.8746 | 0.3175 | 0.8746 | 0.9352 |
| 0.3231 | 15.8182 | 522 | 0.9009 | 0.3021 | 0.9009 | 0.9491 |

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
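
A quick way to check a local environment against these pins (a convenience sketch; the exact CUDA build may differ on your machine):

```python
# Print installed versions next to the ones this model was trained with.
import datasets, tokenizers, torch, transformers

expected = {"transformers": "4.44.2", "torch": "2.4.0+cu118",
            "datasets": "2.21.0", "tokenizers": "0.19.1"}
for module in (transformers, torch, datasets, tokenizers):
    name = module.__name__
    print(f"{name}: installed {module.__version__}, card lists {expected[name]}")
```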