ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k18_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0468
  • Qwk (quadratic weighted kappa): 0.3186
  • Mse (mean squared error): 1.0468
  • Rmse (root mean squared error): 1.0231
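The evaluation metrics are related in a checkable way: Rmse is simply the square root of Mse (here sqrt(1.0468) ≈ 1.0231), and Qwk penalizes disagreements by the squared distance between predicted and true score levels. A minimal pure-Python sketch of quadratic weighted kappa (not the exact implementation used for this card, which is unspecified):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic disagreement weights."""
    # Observed confusion matrix
    O = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[r][c] for r in range(n_classes)) for c in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2   # quadratic disagreement weight
            e = hist_t[i] * hist_p[j] / n             # expected count under chance
            num += w * O[i][j]
            den += w * e
    return 1.0 - num / den

# Perfect agreement gives kappa = 1
print(quadratic_weighted_kappa([0, 1, 2, 3], [0, 1, 2, 3], 4))  # → 1.0

# Rmse is the square root of Mse, matching the card's 1.0468 → 1.0231
print(round(math.sqrt(1.0468), 4))  # → 1.0231
```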

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
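With the linear scheduler, the learning rate decays from 2e-05 to 0 over the full run. A minimal sketch of that schedule in plain Python (the ~84 optimizer steps per epoch is an assumption read off the results table, where epoch 1.0 corresponds to step 84):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    # Linear schedule: optional warmup to base_lr, then linear decay to 0.
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# Assumed: ~84 optimizer steps per epoch × 100 epochs
total = 84 * 100
print(linear_lr(0, total))      # → 2e-05 at the start of training
print(linear_lr(total, total))  # → 0.0 at the end
```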

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0238 2 4.1124 -0.0048 4.1124 2.0279
No log 0.0476 4 2.2130 0.0357 2.2130 1.4876
No log 0.0714 6 1.5804 0.1379 1.5804 1.2572
No log 0.0952 8 1.4601 0.0553 1.4601 1.2084
No log 0.1190 10 1.0772 0.1657 1.0772 1.0379
No log 0.1429 12 1.0577 0.2061 1.0577 1.0284
No log 0.1667 14 1.0635 0.1546 1.0635 1.0313
No log 0.1905 16 1.3661 0.2608 1.3661 1.1688
No log 0.2143 18 1.9926 0.0172 1.9926 1.4116
No log 0.2381 20 2.0241 0.0626 2.0241 1.4227
No log 0.2619 22 1.6817 0.0329 1.6817 1.2968
No log 0.2857 24 1.4920 0.1600 1.4920 1.2215
No log 0.3095 26 1.3093 0.2627 1.3093 1.1443
No log 0.3333 28 1.3188 0.2770 1.3188 1.1484
No log 0.3571 30 1.2995 0.2593 1.2995 1.1399
No log 0.3810 32 1.6370 0.2517 1.6370 1.2794
No log 0.4048 34 2.6823 0.0845 2.6823 1.6378
No log 0.4286 36 2.5857 0.1191 2.5857 1.6080
No log 0.4524 38 1.5934 0.2548 1.5934 1.2623
No log 0.4762 40 1.1836 0.2827 1.1836 1.0879
No log 0.5 42 1.2004 0.2863 1.2004 1.0956
No log 0.5238 44 1.3523 0.2746 1.3523 1.1629
No log 0.5476 46 1.4472 0.1929 1.4472 1.2030
No log 0.5714 48 1.3235 0.2437 1.3235 1.1504
No log 0.5952 50 1.3020 0.2631 1.3020 1.1411
No log 0.6190 52 1.0529 0.2577 1.0529 1.0261
No log 0.6429 54 1.0224 0.2326 1.0224 1.0111
No log 0.6667 56 1.0416 0.2326 1.0416 1.0206
No log 0.6905 58 1.0483 0.1841 1.0483 1.0239
No log 0.7143 60 1.1857 0.1935 1.1857 1.0889
No log 0.7381 62 1.3633 0.1740 1.3633 1.1676
No log 0.7619 64 1.5677 0.2430 1.5677 1.2521
No log 0.7857 66 1.5817 0.2430 1.5817 1.2577
No log 0.8095 68 1.3409 0.1903 1.3409 1.1580
No log 0.8333 70 1.2043 0.3506 1.2043 1.0974
No log 0.8571 72 1.1998 0.3390 1.1998 1.0954
No log 0.8810 74 1.2575 0.2028 1.2575 1.1214
No log 0.9048 76 1.2984 0.2191 1.2984 1.1395
No log 0.9286 78 1.1420 0.2737 1.1420 1.0687
No log 0.9524 80 1.0994 0.3000 1.0994 1.0485
No log 0.9762 82 1.0559 0.2222 1.0559 1.0276
No log 1.0 84 1.0796 0.2409 1.0796 1.0391
No log 1.0238 86 1.0897 0.2092 1.0897 1.0439
No log 1.0476 88 1.0689 0.2562 1.0689 1.0339
No log 1.0714 90 1.0821 0.3672 1.0821 1.0403
No log 1.0952 92 1.2406 0.3088 1.2406 1.1138
No log 1.1190 94 1.8576 0.2488 1.8576 1.3629
No log 1.1429 96 2.1790 0.1209 2.1790 1.4761
No log 1.1667 98 1.9027 0.1318 1.9027 1.3794
No log 1.1905 100 1.3523 0.3384 1.3523 1.1629
No log 1.2143 102 1.1378 0.3268 1.1378 1.0667
No log 1.2381 104 1.3136 0.2263 1.3136 1.1461
No log 1.2619 106 1.3355 0.0449 1.3355 1.1557
No log 1.2857 108 1.2097 0.0513 1.2097 1.0999
No log 1.3095 110 1.1595 0.1189 1.1595 1.0768
No log 1.3333 112 1.3449 0.2217 1.3449 1.1597
No log 1.3571 114 1.4700 0.1620 1.4700 1.2124
No log 1.3810 116 1.3846 0.1674 1.3846 1.1767
No log 1.4048 118 1.2858 0.1427 1.2858 1.1339
No log 1.4286 120 1.2661 0.1171 1.2661 1.1252
No log 1.4524 122 1.2440 0.1412 1.2440 1.1154
No log 1.4762 124 1.2810 0.1618 1.2810 1.1318
No log 1.5 126 1.3325 0.0741 1.3325 1.1543
No log 1.5238 128 1.3145 0.0974 1.3145 1.1465
No log 1.5476 130 1.2670 0.1470 1.2670 1.1256
No log 1.5714 132 1.2384 0.3218 1.2384 1.1128
No log 1.5952 134 1.2870 0.3124 1.2870 1.1345
No log 1.6190 136 1.3284 0.2783 1.3284 1.1526
No log 1.6429 138 1.3467 0.2875 1.3467 1.1605
No log 1.6667 140 1.3973 0.2783 1.3973 1.1821
No log 1.6905 142 1.3331 0.2845 1.3331 1.1546
No log 1.7143 144 1.2281 0.2997 1.2281 1.1082
No log 1.7381 146 1.2407 0.2417 1.2407 1.1139
No log 1.7619 148 1.2009 0.2901 1.2009 1.0958
No log 1.7857 150 1.1663 0.3806 1.1663 1.0799
No log 1.8095 152 1.1548 0.3697 1.1548 1.0746
No log 1.8333 154 1.1591 0.3858 1.1591 1.0766
No log 1.8571 156 1.2403 0.2570 1.2403 1.1137
No log 1.8810 158 1.7089 0.2451 1.7089 1.3072
No log 1.9048 160 2.4304 0.0665 2.4304 1.5590
No log 1.9286 162 2.4514 0.0665 2.4514 1.5657
No log 1.9524 164 1.9208 0.2282 1.9208 1.3859
No log 1.9762 166 1.3503 0.1547 1.3503 1.1620
No log 2.0 168 1.1971 0.3733 1.1971 1.0941
No log 2.0238 170 1.2195 0.2997 1.2195 1.1043
No log 2.0476 172 1.2010 0.4136 1.2010 1.0959
No log 2.0714 174 1.2259 0.1717 1.2259 1.1072
No log 2.0952 176 1.2998 0.1713 1.2998 1.1401
No log 2.1190 178 1.2816 0.1829 1.2816 1.1321
No log 2.1429 180 1.2005 0.2134 1.2005 1.0957
No log 2.1667 182 1.1498 0.2334 1.1498 1.0723
No log 2.1905 184 1.2192 0.2336 1.2192 1.1042
No log 2.2143 186 1.2592 0.2292 1.2592 1.1221
No log 2.2381 188 1.2318 0.2601 1.2318 1.1099
No log 2.2619 190 1.1956 0.2280 1.1956 1.0934
No log 2.2857 192 1.1962 0.2605 1.1962 1.0937
No log 2.3095 194 1.1936 0.3059 1.1936 1.0925
No log 2.3333 196 1.1937 0.3380 1.1937 1.0925
No log 2.3571 198 1.1639 0.2519 1.1639 1.0788
No log 2.3810 200 1.1653 0.2222 1.1653 1.0795
No log 2.4048 202 1.1903 0.2892 1.1903 1.0910
No log 2.4286 204 1.1633 0.2604 1.1633 1.0786
No log 2.4524 206 1.0946 0.1788 1.0946 1.0463
No log 2.4762 208 1.0804 0.3103 1.0804 1.0394
No log 2.5 210 1.0629 0.2836 1.0629 1.0310
No log 2.5238 212 1.0609 0.2122 1.0609 1.0300
No log 2.5476 214 1.0312 0.2122 1.0312 1.0155
No log 2.5714 216 1.0274 0.3124 1.0274 1.0136
No log 2.5952 218 1.0179 0.3124 1.0179 1.0089
No log 2.6190 220 1.0279 0.2991 1.0279 1.0138
No log 2.6429 222 1.0379 0.2850 1.0379 1.0188
No log 2.6667 224 1.0784 0.3590 1.0784 1.0385
No log 2.6905 226 1.0970 0.3590 1.0970 1.0474
No log 2.7143 228 1.0679 0.4278 1.0679 1.0334
No log 2.7381 230 1.0586 0.3697 1.0586 1.0289
No log 2.7619 232 1.0507 0.3308 1.0507 1.0250
No log 2.7857 234 1.0829 0.3491 1.0829 1.0406
No log 2.8095 236 1.2106 0.3191 1.2106 1.1003
No log 2.8333 238 1.2627 0.3021 1.2627 1.1237
No log 2.8571 240 1.1909 0.2907 1.1909 1.0913
No log 2.8810 242 1.0720 0.2819 1.0720 1.0354
No log 2.9048 244 1.0166 0.2432 1.0166 1.0083
No log 2.9286 246 1.0449 0.2738 1.0449 1.0222
No log 2.9524 248 1.0485 0.2857 1.0485 1.0240
No log 2.9762 250 1.1108 0.3596 1.1108 1.0539
No log 3.0 252 1.1491 0.3771 1.1491 1.0720
No log 3.0238 254 1.1616 0.3662 1.1616 1.0778
No log 3.0476 256 1.1737 0.3732 1.1737 1.0834
No log 3.0714 258 1.1596 0.3691 1.1596 1.0769
No log 3.0952 260 1.0598 0.3237 1.0598 1.0295
No log 3.1190 262 1.0458 0.2562 1.0458 1.0226
No log 3.1429 264 1.2022 0.3863 1.2022 1.0965
No log 3.1667 266 1.1501 0.4243 1.1501 1.0724
No log 3.1905 268 1.0197 0.3449 1.0197 1.0098
No log 3.2143 270 0.9619 0.2901 0.9619 0.9808
No log 3.2381 272 0.9494 0.4175 0.9494 0.9744
No log 3.2619 274 0.9600 0.4175 0.9600 0.9798
No log 3.2857 276 0.9618 0.3642 0.9618 0.9807
No log 3.3095 278 0.9831 0.3271 0.9831 0.9915
No log 3.3333 280 0.9917 0.2976 0.9917 0.9958
No log 3.3571 282 0.9949 0.3784 0.9949 0.9974
No log 3.3810 284 1.0430 0.3350 1.0430 1.0213
No log 3.4048 286 1.0558 0.2819 1.0558 1.0275
No log 3.4286 288 1.0264 0.2577 1.0264 1.0131
No log 3.4524 290 0.9958 0.2276 0.9958 0.9979
No log 3.4762 292 0.9808 0.3112 0.9808 0.9903
No log 3.5 294 0.9769 0.3091 0.9769 0.9884
No log 3.5238 296 0.9758 0.3556 0.9758 0.9878
No log 3.5476 298 0.9728 0.3628 0.9728 0.9863
No log 3.5714 300 1.0045 0.4034 1.0045 1.0023
No log 3.5952 302 1.0300 0.4048 1.0300 1.0149
No log 3.6190 304 0.9637 0.3747 0.9637 0.9817
No log 3.6429 306 0.9829 0.2976 0.9829 0.9914
No log 3.6667 308 1.0388 0.3321 1.0388 1.0192
No log 3.6905 310 0.9895 0.2879 0.9895 0.9947
No log 3.7143 312 0.9501 0.3133 0.9501 0.9747
No log 3.7381 314 0.9969 0.3510 0.9969 0.9985
No log 3.7619 316 1.0598 0.4006 1.0598 1.0295
No log 3.7857 318 1.0934 0.4006 1.0934 1.0457
No log 3.8095 320 1.0436 0.4148 1.0436 1.0216
No log 3.8333 322 1.0171 0.3056 1.0171 1.0085
No log 3.8571 324 1.0014 0.3466 1.0014 1.0007
No log 3.8810 326 0.9976 0.3784 0.9976 0.9988
No log 3.9048 328 0.9972 0.4045 0.9972 0.9986
No log 3.9286 330 1.0124 0.3855 1.0124 1.0062
No log 3.9524 332 1.0546 0.3627 1.0546 1.0269
No log 3.9762 334 1.0081 0.3236 1.0081 1.0040
No log 4.0 336 0.9944 0.3011 0.9944 0.9972
No log 4.0238 338 0.9987 0.2692 0.9987 0.9993
No log 4.0476 340 1.0131 0.2788 1.0131 1.0065
No log 4.0714 342 1.0028 0.2521 1.0028 1.0014
No log 4.0952 344 0.9854 0.2911 0.9854 0.9927
No log 4.1190 346 1.0228 0.3313 1.0228 1.0113
No log 4.1429 348 1.0296 0.3430 1.0296 1.0147
No log 4.1667 350 1.0260 0.2633 1.0260 1.0129
No log 4.1905 352 1.0068 0.2972 1.0068 1.0034
No log 4.2143 354 0.9897 0.3011 0.9897 0.9949
No log 4.2381 356 0.9621 0.3622 0.9621 0.9809
No log 4.2619 358 1.0047 0.3836 1.0047 1.0024
No log 4.2857 360 1.0676 0.3664 1.0676 1.0333
No log 4.3095 362 1.0052 0.4077 1.0052 1.0026
No log 4.3333 364 0.9768 0.4894 0.9768 0.9884
No log 4.3571 366 0.9835 0.4413 0.9835 0.9917
No log 4.3810 368 0.9901 0.3931 0.9901 0.9950
No log 4.4048 370 1.0564 0.3510 1.0564 1.0278
No log 4.4286 372 1.1229 0.3103 1.1229 1.0597
No log 4.4524 374 1.0790 0.3840 1.0790 1.0387
No log 4.4762 376 1.0736 0.3243 1.0736 1.0362
No log 4.5 378 1.0496 0.3511 1.0496 1.0245
No log 4.5238 380 1.0535 0.2829 1.0535 1.0264
No log 4.5476 382 1.1214 0.2172 1.1214 1.0589
No log 4.5714 384 1.1256 0.2949 1.1256 1.0610
No log 4.5952 386 1.0393 0.3639 1.0393 1.0195
No log 4.6190 388 0.9881 0.4755 0.9881 0.9940
No log 4.6429 390 1.0448 0.3409 1.0448 1.0221
No log 4.6667 392 1.1479 0.2811 1.1479 1.0714
No log 4.6905 394 1.0959 0.3062 1.0959 1.0469
No log 4.7143 396 0.9829 0.3915 0.9829 0.9914
No log 4.7381 398 1.0028 0.3495 1.0028 1.0014
No log 4.7619 400 1.0434 0.3025 1.0434 1.0215
No log 4.7857 402 1.0451 0.3495 1.0451 1.0223
No log 4.8095 404 1.0239 0.3230 1.0239 1.0119
No log 4.8333 406 1.0189 0.3614 1.0189 1.0094
No log 4.8571 408 1.0249 0.3268 1.0249 1.0124
No log 4.8810 410 1.0008 0.3394 1.0008 1.0004
No log 4.9048 412 0.9940 0.3856 0.9940 0.9970
No log 4.9286 414 0.9990 0.4261 0.9990 0.9995
No log 4.9524 416 0.9987 0.4374 0.9987 0.9994
No log 4.9762 418 1.0169 0.3631 1.0169 1.0084
No log 5.0 420 1.1052 0.4710 1.1052 1.0513
No log 5.0238 422 1.1003 0.3990 1.1003 1.0490
No log 5.0476 424 1.0148 0.4264 1.0148 1.0074
No log 5.0714 426 0.9923 0.3343 0.9923 0.9962
No log 5.0952 428 1.0250 0.2954 1.0250 1.0124
No log 5.1190 430 0.9882 0.2757 0.9882 0.9941
No log 5.1429 432 0.9416 0.3356 0.9416 0.9704
No log 5.1667 434 0.9545 0.3860 0.9545 0.9770
No log 5.1905 436 0.9452 0.3666 0.9452 0.9722
No log 5.2143 438 0.9544 0.3583 0.9544 0.9770
No log 5.2381 440 0.9939 0.3202 0.9939 0.9969
No log 5.2619 442 0.9494 0.3448 0.9494 0.9744
No log 5.2857 444 0.9475 0.3802 0.9475 0.9734
No log 5.3095 446 1.0091 0.4023 1.0091 1.0045
No log 5.3333 448 0.9919 0.4382 0.9919 0.9959
No log 5.3571 450 0.9617 0.3802 0.9617 0.9807
No log 5.3810 452 0.9834 0.2901 0.9834 0.9917
No log 5.4048 454 1.0296 0.2934 1.0296 1.0147
No log 5.4286 456 1.0411 0.2934 1.0411 1.0203
No log 5.4524 458 1.0231 0.3045 1.0231 1.0115
No log 5.4762 460 1.0088 0.2901 1.0088 1.0044
No log 5.5 462 1.0733 0.3256 1.0733 1.0360
No log 5.5238 464 1.1323 0.3182 1.1323 1.0641
No log 5.5476 466 1.1290 0.3615 1.1290 1.0626
No log 5.5714 468 1.0815 0.3268 1.0815 1.0400
No log 5.5952 470 1.0723 0.3590 1.0723 1.0355
No log 5.6190 472 1.0900 0.3442 1.0900 1.0441
No log 5.6429 474 1.0968 0.3442 1.0968 1.0473
No log 5.6667 476 1.0589 0.3442 1.0589 1.0290
No log 5.6905 478 0.9908 0.3457 0.9908 0.9954
No log 5.7143 480 0.9960 0.5197 0.9960 0.9980
No log 5.7381 482 1.0339 0.3237 1.0339 1.0168
No log 5.7619 484 0.9843 0.2909 0.9843 0.9921
No log 5.7857 486 0.9446 0.3548 0.9446 0.9719
No log 5.8095 488 1.0298 0.2462 1.0298 1.0148
No log 5.8333 490 1.1314 0.2336 1.1314 1.0637
No log 5.8571 492 1.1490 0.2480 1.1490 1.0719
No log 5.8810 494 1.0918 0.2694 1.0918 1.0449
No log 5.9048 496 1.0857 0.4114 1.0857 1.0420
No log 5.9286 498 1.1082 0.4196 1.1082 1.0527
0.3655 5.9524 500 1.1101 0.3962 1.1101 1.0536
0.3655 5.9762 502 1.0918 0.4131 1.0918 1.0449
0.3655 6.0 504 1.0323 0.4110 1.0323 1.0160
0.3655 6.0238 506 1.0249 0.2912 1.0249 1.0124
0.3655 6.0476 508 1.1295 0.2336 1.1295 1.0628
0.3655 6.0714 510 1.1851 0.2396 1.1851 1.0886
0.3655 6.0952 512 1.1300 0.2691 1.1300 1.0630
0.3655 6.1190 514 1.0468 0.3186 1.0468 1.0231

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1