ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k19_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8332
  • Qwk (quadratic weighted kappa): 0.2633
  • Mse (mean squared error): 0.8332
  • Rmse (root mean squared error): 0.9128
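
As a rough illustration of how these metrics relate, here is a small pure-Python sketch. The label values below are hypothetical, not taken from the actual evaluation set; the task appears to score essay organization on an ordinal scale, which is why quadratic weighted kappa is reported.

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    """Qwk: agreement between ordinal ratings, penalizing larger disagreements more."""
    n = len(y_true)
    observed = [[0.0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true, hist_pred = Counter(y_true), Counter(y_pred)
    num = den = 0.0
    for i in range(num_classes):
        for j in range(num_classes):
            weight = (i - j) ** 2 / (num_classes - 1) ** 2
            expected = hist_true[i] * hist_pred[j] / n  # counts expected by chance
            num += weight * observed[i][j]
            den += weight * expected
    return 1.0 - num / den

# Hypothetical gold and predicted organization scores
y_true = [0, 1, 2, 2, 3, 1]
y_pred = [0, 1, 1, 2, 3, 2]

qwk = quadratic_weighted_kappa(y_true, y_pred, num_classes=4)
mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
rmse = math.sqrt(mse)  # the Rmse above is simply sqrt(Mse): 0.9128 ≈ sqrt(0.8332)
```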

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
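
The linear scheduler decays the learning rate from its initial value to zero over the full run. A sketch of that schedule, under the assumption (inferred from the log below, where step 500 corresponds to epoch 5.2632 = 500/95) that there are about 95 optimizer steps per epoch, i.e. roughly 760 training examples at batch size 8:

```python
# Linear schedule in the style of Transformers' get_linear_schedule_with_warmup:
# optional warmup, then linear decay to zero at the final step.
# total_steps = 95 * 100 is an assumption: ~95 steps/epoch inferred from the log.
def linear_lr(step, base_lr=2e-05, total_steps=95 * 100, warmup_steps=0):
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1.0, total_steps - warmup_steps))

lr_at_start = linear_lr(0)     # 2e-05
lr_halfway = linear_lr(4750)   # 1e-05
lr_at_end = linear_lr(9500)    # 0.0
```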

Training results

"No log" in the first column means the training loss had not yet reached its first logging step: it is logged every 500 steps, so the first logged value (0.3851) appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0211 2 2.6250 -0.0084 2.6250 1.6202
No log 0.0421 4 1.2199 0.0991 1.2199 1.1045
No log 0.0632 6 0.9193 0.0101 0.9193 0.9588
No log 0.0842 8 0.9905 0.1293 0.9905 0.9952
No log 0.1053 10 1.2113 -0.0120 1.2113 1.1006
No log 0.1263 12 1.2295 -0.0449 1.2295 1.1088
No log 0.1474 14 1.2562 -0.0462 1.2562 1.1208
No log 0.1684 16 1.3810 0.0493 1.3810 1.1752
No log 0.1895 18 1.3066 0.0244 1.3066 1.1431
No log 0.2105 20 1.1870 -0.1389 1.1870 1.0895
No log 0.2316 22 1.1516 -0.1072 1.1516 1.0731
No log 0.2526 24 1.0743 -0.0835 1.0743 1.0365
No log 0.2737 26 1.1015 -0.1732 1.1015 1.0495
No log 0.2947 28 1.0426 -0.0232 1.0426 1.0211
No log 0.3158 30 1.0041 -0.0373 1.0041 1.0020
No log 0.3368 32 0.9860 0.0441 0.9860 0.9930
No log 0.3579 34 1.0406 0.0802 1.0406 1.0201
No log 0.3789 36 1.3665 -0.0741 1.3665 1.1690
No log 0.4 38 1.4435 -0.1129 1.4435 1.2015
No log 0.4211 40 1.1560 -0.0168 1.1560 1.0752
No log 0.4421 42 1.0508 0.1367 1.0508 1.0251
No log 0.4632 44 1.0102 0.0727 1.0102 1.0051
No log 0.4842 46 1.1000 0.0239 1.1000 1.0488
No log 0.5053 48 1.0769 0.0957 1.0769 1.0377
No log 0.5263 50 0.9698 0.0816 0.9698 0.9848
No log 0.5474 52 1.0012 -0.0352 1.0012 1.0006
No log 0.5684 54 1.0412 -0.0423 1.0412 1.0204
No log 0.5895 56 1.0102 -0.0654 1.0102 1.0051
No log 0.6105 58 0.9787 -0.0569 0.9787 0.9893
No log 0.6316 60 0.9841 0.0310 0.9841 0.9920
No log 0.6526 62 0.9601 0.0573 0.9601 0.9799
No log 0.6737 64 1.0237 0.0978 1.0237 1.0118
No log 0.6947 66 1.1443 -0.0141 1.1443 1.0697
No log 0.7158 68 1.1932 -0.0084 1.1932 1.0923
No log 0.7368 70 1.2485 0.0356 1.2485 1.1174
No log 0.7579 72 1.1676 0.0238 1.1676 1.0806
No log 0.7789 74 1.1511 -0.1694 1.1511 1.0729
No log 0.8 76 1.3387 0.0283 1.3387 1.1570
No log 0.8211 78 1.3497 0.1002 1.3497 1.1617
No log 0.8421 80 1.1263 0.0149 1.1263 1.0613
No log 0.8632 82 0.9874 0.0318 0.9874 0.9937
No log 0.8842 84 1.1463 0.0666 1.1463 1.0706
No log 0.9053 86 1.1995 0.0178 1.1995 1.0952
No log 0.9263 88 1.0187 0.1577 1.0187 1.0093
No log 0.9474 90 0.8878 0.0725 0.8878 0.9422
No log 0.9684 92 0.8898 0.0227 0.8898 0.9433
No log 0.9895 94 0.8625 0.0227 0.8625 0.9287
No log 1.0105 96 0.8729 0.2708 0.8729 0.9343
No log 1.0316 98 0.9664 0.1234 0.9664 0.9830
No log 1.0526 100 0.9410 0.1581 0.9410 0.9701
No log 1.0737 102 0.9133 0.2341 0.9133 0.9557
No log 1.0947 104 0.8615 0.1700 0.8615 0.9282
No log 1.1158 106 0.7979 0.0846 0.7979 0.8932
No log 1.1368 108 0.8701 0.2204 0.8701 0.9328
No log 1.1579 110 1.0166 0.1712 1.0166 1.0083
No log 1.1789 112 0.9378 0.1796 0.9378 0.9684
No log 1.2 114 0.8001 0.0643 0.8001 0.8945
No log 1.2211 116 0.8585 0.2673 0.8585 0.9265
No log 1.2421 118 0.8824 0.2948 0.8824 0.9394
No log 1.2632 120 0.8194 0.2036 0.8194 0.9052
No log 1.2842 122 0.8036 0.0643 0.8036 0.8965
No log 1.3053 124 0.8661 0.1740 0.8661 0.9306
No log 1.3263 126 0.8721 0.0792 0.8721 0.9339
No log 1.3474 128 0.8594 0.2053 0.8594 0.9270
No log 1.3684 130 0.9415 0.3051 0.9415 0.9703
No log 1.3895 132 0.9342 0.1908 0.9342 0.9665
No log 1.4105 134 0.8924 0.2469 0.8924 0.9447
No log 1.4316 136 0.8155 0.1661 0.8155 0.9031
No log 1.4526 138 0.9119 0.2549 0.9119 0.9549
No log 1.4737 140 1.0172 0.2887 1.0172 1.0086
No log 1.4947 142 0.9690 0.2342 0.9690 0.9844
No log 1.5158 144 0.8726 0.0143 0.8726 0.9341
No log 1.5368 146 0.9258 0.2193 0.9258 0.9622
No log 1.5579 148 1.3018 0.1210 1.3018 1.1410
No log 1.5789 150 1.6956 0.0388 1.6956 1.3022
No log 1.6 152 1.6513 0.0182 1.6513 1.2850
No log 1.6211 154 1.3302 0.1210 1.3302 1.1533
No log 1.6421 156 1.0629 0.1528 1.0629 1.0310
No log 1.6632 158 0.9033 0.1174 0.9033 0.9504
No log 1.6842 160 0.8415 0.1094 0.8415 0.9173
No log 1.7053 162 0.8361 0.1400 0.8361 0.9144
No log 1.7263 164 0.8297 0.1400 0.8297 0.9109
No log 1.7474 166 0.8257 0.1094 0.8257 0.9087
No log 1.7684 168 0.8533 0.1558 0.8533 0.9237
No log 1.7895 170 0.8695 0.1455 0.8695 0.9325
No log 1.8105 172 0.8598 0.2366 0.8598 0.9273
No log 1.8316 174 0.8656 0.2429 0.8656 0.9304
No log 1.8526 176 0.9202 0.3252 0.9202 0.9593
No log 1.8737 178 1.0102 0.3013 1.0102 1.0051
No log 1.8947 180 0.9992 0.3044 0.9992 0.9996
No log 1.9158 182 0.9266 0.2542 0.9266 0.9626
No log 1.9368 184 0.8932 0.2593 0.8932 0.9451
No log 1.9579 186 0.9103 0.2862 0.9103 0.9541
No log 1.9789 188 0.9066 0.1558 0.9066 0.9521
No log 2.0 190 0.9271 0.1995 0.9271 0.9628
No log 2.0211 192 0.9992 0.1946 0.9992 0.9996
No log 2.0421 194 1.0237 0.2253 1.0237 1.0118
No log 2.0632 196 1.0181 0.2370 1.0181 1.0090
No log 2.0842 198 0.9690 0.2420 0.9690 0.9844
No log 2.1053 200 0.8780 0.1558 0.8780 0.9370
No log 2.1263 202 0.8432 0.2652 0.8432 0.9182
No log 2.1474 204 0.8098 0.2712 0.8098 0.8999
No log 2.1684 206 0.7325 0.2058 0.7325 0.8559
No log 2.1895 208 0.6946 0.2476 0.6946 0.8334
No log 2.2105 210 0.7236 0.2254 0.7236 0.8506
No log 2.2316 212 0.8381 0.2552 0.8381 0.9155
No log 2.2526 214 0.9388 0.2964 0.9388 0.9689
No log 2.2737 216 0.9136 0.3068 0.9136 0.9558
No log 2.2947 218 0.8527 0.3252 0.8527 0.9234
No log 2.3158 220 0.8281 0.2830 0.8281 0.9100
No log 2.3368 222 0.8336 0.2424 0.8336 0.9130
No log 2.3579 224 0.9388 0.2342 0.9388 0.9689
No log 2.3789 226 1.0259 0.2958 1.0259 1.0129
No log 2.4 228 0.9554 0.2812 0.9554 0.9774
No log 2.4211 230 0.9525 0.2352 0.9525 0.9759
No log 2.4421 232 0.9355 0.1718 0.9355 0.9672
No log 2.4632 234 0.9705 0.2843 0.9705 0.9851
No log 2.4842 236 1.0895 0.1827 1.0895 1.0438
No log 2.5053 238 1.1425 0.1787 1.1425 1.0689
No log 2.5263 240 1.1621 0.1747 1.1621 1.0780
No log 2.5474 242 1.0802 0.1827 1.0802 1.0393
No log 2.5684 244 0.9287 0.2253 0.9287 0.9637
No log 2.5895 246 0.8965 0.2027 0.8965 0.9468
No log 2.6105 248 0.8458 0.2414 0.8458 0.9197
No log 2.6316 250 0.8321 0.2414 0.8321 0.9122
No log 2.6526 252 0.8093 0.2414 0.8093 0.8996
No log 2.6737 254 0.7808 0.2414 0.7808 0.8836
No log 2.6947 256 0.7827 0.1786 0.7827 0.8847
No log 2.7158 258 0.8258 0.2440 0.8258 0.9087
No log 2.7368 260 0.8340 0.2440 0.8340 0.9132
No log 2.7579 262 0.9570 0.2244 0.9570 0.9783
No log 2.7789 264 0.9574 0.2244 0.9574 0.9784
No log 2.8 266 0.8499 0.2751 0.8499 0.9219
No log 2.8211 268 0.7930 0.2419 0.7930 0.8905
No log 2.8421 270 0.7862 0.2109 0.7862 0.8867
No log 2.8632 272 0.7972 0.2817 0.7972 0.8929
No log 2.8842 274 0.7656 0.2527 0.7656 0.8750
No log 2.9053 276 0.7338 0.1918 0.7338 0.8566
No log 2.9263 278 0.7360 0.1313 0.7360 0.8579
No log 2.9474 280 0.7503 0.1313 0.7503 0.8662
No log 2.9684 282 0.7678 0.0930 0.7678 0.8763
No log 2.9895 284 0.7773 0.1264 0.7773 0.8817
No log 3.0105 286 0.8316 0.2440 0.8316 0.9119
No log 3.0316 288 0.8890 0.2253 0.8890 0.9429
No log 3.0526 290 0.8756 0.1430 0.8756 0.9358
No log 3.0737 292 0.8909 0.2053 0.8909 0.9439
No log 3.0947 294 0.8956 0.2152 0.8956 0.9463
No log 3.1158 296 0.8849 0.2152 0.8849 0.9407
No log 3.1368 298 0.9230 0.2134 0.9230 0.9607
No log 3.1579 300 1.0220 0.2939 1.0220 1.0109
No log 3.1789 302 1.0317 0.3170 1.0317 1.0157
No log 3.2 304 0.8917 0.2754 0.8917 0.9443
No log 3.2211 306 0.8166 0.1176 0.8166 0.9037
No log 3.2421 308 0.8167 0.1839 0.8167 0.9037
No log 3.2632 310 0.8274 0.1773 0.8274 0.9096
No log 3.2842 312 0.8691 0.2392 0.8691 0.9322
No log 3.3053 314 0.9267 0.2589 0.9267 0.9626
No log 3.3263 316 0.9583 0.2562 0.9583 0.9789
No log 3.3474 318 0.9222 0.2615 0.9222 0.9603
No log 3.3684 320 0.8160 0.3099 0.8160 0.9033
No log 3.3895 322 0.8198 0.2754 0.8198 0.9054
No log 3.4105 324 0.8914 0.3105 0.8914 0.9441
No log 3.4316 326 1.0932 0.3302 1.0932 1.0456
No log 3.4526 328 1.0665 0.3477 1.0665 1.0327
No log 3.4737 330 0.8409 0.3169 0.8409 0.9170
No log 3.4947 332 0.7589 0.1216 0.7589 0.8711
No log 3.5158 334 0.7629 0.2342 0.7629 0.8734
No log 3.5368 336 0.7319 0.1138 0.7319 0.8555
No log 3.5579 338 0.8721 0.3709 0.8721 0.9338
No log 3.5789 340 1.0069 0.2886 1.0069 1.0035
No log 3.6 342 0.9239 0.3643 0.9239 0.9612
No log 3.6211 344 0.8065 0.3746 0.8065 0.8980
No log 3.6421 346 0.7241 0.1479 0.7241 0.8509
No log 3.6632 348 0.7256 0.1176 0.7256 0.8518
No log 3.6842 350 0.7427 0.1294 0.7427 0.8618
No log 3.7053 352 0.7764 0.1015 0.7764 0.8812
No log 3.7263 354 0.8319 0.1447 0.8319 0.9121
No log 3.7474 356 0.8437 0.1447 0.8437 0.9185
No log 3.7684 358 0.8204 0.1592 0.8204 0.9058
No log 3.7895 360 0.8521 0.1740 0.8521 0.9231
No log 3.8105 362 0.8622 0.1800 0.8622 0.9286
No log 3.8316 364 0.8001 0.1587 0.8001 0.8945
No log 3.8526 366 0.7635 0.1400 0.7635 0.8738
No log 3.8737 368 0.7568 0.1448 0.7568 0.8700
No log 3.8947 370 0.7651 0.1901 0.7651 0.8747
No log 3.9158 372 0.8534 0.3302 0.8534 0.9238
No log 3.9368 374 0.9175 0.3675 0.9175 0.9579
No log 3.9579 376 0.8424 0.3712 0.8424 0.9178
No log 3.9789 378 0.7642 0.2261 0.7642 0.8742
No log 4.0 380 0.7973 0.1130 0.7973 0.8929
No log 4.0211 382 0.8175 0.1130 0.8175 0.9042
No log 4.0421 384 0.7829 0.1325 0.7829 0.8848
No log 4.0632 386 0.7852 0.1331 0.7852 0.8861
No log 4.0842 388 0.7849 0.1331 0.7849 0.8860
No log 4.1053 390 0.7834 0.1253 0.7834 0.8851
No log 4.1263 392 0.7672 0.0973 0.7672 0.8759
No log 4.1474 394 0.7558 0.0973 0.7558 0.8693
No log 4.1684 396 0.7405 0.0968 0.7405 0.8605
No log 4.1895 398 0.7375 0.1400 0.7375 0.8588
No log 4.2105 400 0.7316 0.1400 0.7316 0.8554
No log 4.2316 402 0.7273 0.1051 0.7273 0.8528
No log 4.2526 404 0.7199 0.1697 0.7199 0.8485
No log 4.2737 406 0.7266 0.2502 0.7266 0.8524
No log 4.2947 408 0.7348 0.3763 0.7348 0.8572
No log 4.3158 410 0.7155 0.2502 0.7155 0.8459
No log 4.3368 412 0.7090 0.1697 0.7090 0.8420
No log 4.3579 414 0.7446 0.1565 0.7446 0.8629
No log 4.3789 416 0.7340 0.1410 0.7340 0.8567
No log 4.4 418 0.7304 0.2591 0.7304 0.8547
No log 4.4211 420 0.7365 0.2809 0.7365 0.8582
No log 4.4421 422 0.7553 0.3196 0.7553 0.8691
No log 4.4632 424 0.7685 0.3471 0.7685 0.8767
No log 4.4842 426 0.7608 0.3088 0.7608 0.8722
No log 4.5053 428 0.7670 0.2780 0.7670 0.8758
No log 4.5263 430 0.7660 0.2780 0.7660 0.8752
No log 4.5474 432 0.7627 0.2479 0.7627 0.8733
No log 4.5684 434 0.7893 0.2780 0.7893 0.8884
No log 4.5895 436 0.8092 0.2691 0.8092 0.8996
No log 4.6105 438 0.8387 0.3157 0.8387 0.9158
No log 4.6316 440 0.7987 0.2077 0.7987 0.8937
No log 4.6526 442 0.7728 0.2261 0.7728 0.8791
No log 4.6737 444 0.7596 0.2379 0.7596 0.8716
No log 4.6947 446 0.7512 0.2261 0.7512 0.8667
No log 4.7158 448 0.7346 0.1846 0.7346 0.8571
No log 4.7368 450 0.7249 0.1737 0.7249 0.8514
No log 4.7579 452 0.7154 0.2884 0.7154 0.8458
No log 4.7789 454 0.7152 0.3485 0.7152 0.8457
No log 4.8 456 0.7402 0.2973 0.7402 0.8604
No log 4.8211 458 0.7775 0.3593 0.7775 0.8818
No log 4.8421 460 0.7865 0.2853 0.7865 0.8868
No log 4.8632 462 0.8169 0.3440 0.8169 0.9038
No log 4.8842 464 0.8227 0.2845 0.8227 0.9070
No log 4.9053 466 0.8367 0.2921 0.8367 0.9147
No log 4.9263 468 0.8378 0.2980 0.8378 0.9153
No log 4.9474 470 0.8890 0.3844 0.8890 0.9429
No log 4.9684 472 0.8309 0.3121 0.8309 0.9115
No log 4.9895 474 0.7432 0.2749 0.7432 0.8621
No log 5.0105 476 0.7271 0.2718 0.7271 0.8527
No log 5.0316 478 0.7474 0.3482 0.7474 0.8645
No log 5.0526 480 0.7712 0.4186 0.7712 0.8782
No log 5.0737 482 0.7502 0.3126 0.7502 0.8662
No log 5.0947 484 0.7444 0.2681 0.7444 0.8628
No log 5.1158 486 0.7638 0.2652 0.7638 0.8739
No log 5.1368 488 0.7587 0.2295 0.7587 0.8710
No log 5.1579 490 0.7839 0.3226 0.7839 0.8854
No log 5.1789 492 0.8241 0.2926 0.8241 0.9078
No log 5.2 494 0.8215 0.2122 0.8215 0.9064
No log 5.2211 496 0.7681 0.2884 0.7681 0.8764
No log 5.2421 498 0.7922 0.2751 0.7922 0.8900
0.3851 5.2632 500 0.8422 0.2995 0.8422 0.9177
0.3851 5.2842 502 0.8344 0.2691 0.8344 0.9134
0.3851 5.3053 504 0.8197 0.2720 0.8197 0.9054
0.3851 5.3263 506 0.8164 0.2690 0.8164 0.9036
0.3851 5.3474 508 0.8164 0.2095 0.8164 0.9036
0.3851 5.3684 510 0.8332 0.2633 0.8332 0.9128
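
Note that the final checkpoint (validation loss 0.8332, Qwk 0.2633) is not the best point in the log: the lowest validation loss is 0.6946 at step 208 (epoch ~2.19), and the highest Qwk is 0.4186 at step 480 (epoch ~5.05). A minimal sketch of picking a checkpoint from a few (step, val_loss, qwk) rows transcribed from the table above:

```python
# A handful of (step, validation_loss, qwk) rows from the log above
rows = [
    (2, 2.6250, -0.0084),
    (208, 0.6946, 0.2476),
    (480, 0.7712, 0.4186),
    (510, 0.8332, 0.2633),  # final checkpoint reported at the top of this card
]

best_by_loss = min(rows, key=lambda r: r[1])  # step 208: lowest validation loss
best_by_qwk = max(rows, key=lambda r: r[2])   # step 480: highest agreement
```

Which row is "best" depends on whether the ordinal agreement (Qwk) or the regression error (loss/Mse) matters more for the downstream use.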

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32)
Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k19_task7_organization

Finetuned from aubmindlab/bert-base-arabertv02