ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k14_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7925
  • Qwk (quadratic weighted kappa): 0.2072
  • Mse (mean squared error): 0.7925
  • Rmse (root mean squared error): 0.8902
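For reference, these metrics can be computed from gold and predicted scores as below. This is a minimal pure-Python sketch; the example labels are illustrative and not drawn from this model's evaluation set, and Qwk is assumed to be Cohen's kappa with quadratic weights (as is standard for essay-scoring tasks like this one).

```python
import math

def mse(y_true, y_pred):
    """Mean squared error between gold and predicted scores."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the Qwk column)."""
    # Observed confusion matrix.
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    hist_t = [sum(row) for row in O]                                   # gold label counts
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]  # predicted counts
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2   # quadratic disagreement weight
            num += w * O[i][j]                         # observed weighted disagreement
            den += w * hist_t[i] * hist_p[j] / n       # expected disagreement by chance
    return 1.0 - num / den

# Illustrative labels only (hypothetical, not from this model's eval set).
gold = [0, 1, 2, 2, 1]
pred = [0, 2, 2, 1, 1]
print(mse(gold, pred))                              # MSE
print(math.sqrt(mse(gold, pred)))                   # RMSE
print(quadratic_weighted_kappa(gold, pred, 3))      # Qwk
```

Note that Rmse is simply the square root of Mse, which is why the Loss and Mse columns below coincide (the model is trained with an MSE-style regression objective).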

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
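The linear scheduler decays the learning rate from 2e-05 toward zero over the course of training (in Transformers this is implemented by `get_linear_schedule_with_warmup`). A minimal sketch of that shape, assuming zero warmup steps since none are listed above:

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Linear LR schedule: ramp up over warmup_steps, then decay linearly to 0.

    Mirrors the shape of lr_scheduler_type: linear; warmup_steps=0 is an
    assumption, as the card lists no warmup hyperparameter.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# At the start of training the LR is the full 2e-05, halfway it is 1e-05,
# and at the final step it has decayed to 0.
for step in (0, 50, 100):
    print(step, linear_lr(step, total_steps=100))
```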

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
("No log" in the Training Loss column means the training loss had not yet been logged; it is reported every 500 steps.)
No log 0.0286 2 2.6449 -0.0729 2.6449 1.6263
No log 0.0571 4 1.3553 0.0183 1.3553 1.1642
No log 0.0857 6 1.1217 -0.1787 1.1217 1.0591
No log 0.1143 8 1.2938 -0.1356 1.2938 1.1375
No log 0.1429 10 1.3745 -0.1005 1.3745 1.1724
No log 0.1714 12 1.4500 -0.1559 1.4500 1.2042
No log 0.2 14 1.6186 -0.2185 1.6186 1.2722
No log 0.2286 16 1.8460 -0.0673 1.8460 1.3587
No log 0.2571 18 1.4800 -0.1078 1.4800 1.2165
No log 0.2857 20 1.4701 -0.1183 1.4701 1.2125
No log 0.3143 22 1.3811 -0.1788 1.3811 1.1752
No log 0.3429 24 1.2809 -0.1101 1.2809 1.1318
No log 0.3714 26 1.2461 -0.1823 1.2461 1.1163
No log 0.4 28 1.2604 -0.2080 1.2604 1.1227
No log 0.4286 30 1.3035 -0.1726 1.3035 1.1417
No log 0.4571 32 1.2663 -0.2080 1.2663 1.1253
No log 0.4857 34 1.1786 -0.0712 1.1786 1.0856
No log 0.5143 36 1.1819 0.0560 1.1819 1.0871
No log 0.5429 38 1.1664 0.0560 1.1664 1.0800
No log 0.5714 40 1.3699 -0.1305 1.3699 1.1704
No log 0.6 42 1.4526 -0.0599 1.4526 1.2052
No log 0.6286 44 1.1550 0.0331 1.1550 1.0747
No log 0.6571 46 1.0476 0.0702 1.0476 1.0235
No log 0.6857 48 1.0382 0.0980 1.0382 1.0189
No log 0.7143 50 1.0483 -0.0041 1.0483 1.0239
No log 0.7429 52 1.1908 0.0305 1.1908 1.0912
No log 0.7714 54 1.2813 -0.1023 1.2813 1.1319
No log 0.8 56 1.5453 -0.1470 1.5453 1.2431
No log 0.8286 58 1.3789 -0.1065 1.3789 1.1743
No log 0.8571 60 1.0955 0.0685 1.0955 1.0467
No log 0.8857 62 1.0260 0.0835 1.0260 1.0129
No log 0.9143 64 0.9556 0.0200 0.9556 0.9775
No log 0.9429 66 0.9263 -0.0192 0.9263 0.9624
No log 0.9714 68 0.9157 0.1646 0.9157 0.9569
No log 1.0 70 0.9525 0.0748 0.9525 0.9760
No log 1.0286 72 0.9687 0.1174 0.9687 0.9842
No log 1.0571 74 0.9931 0.1538 0.9931 0.9965
No log 1.0857 76 1.0138 0.1247 1.0138 1.0069
No log 1.1143 78 1.0940 0.0941 1.0940 1.0459
No log 1.1429 80 1.0835 0.0277 1.0835 1.0409
No log 1.1714 82 1.0510 0.1247 1.0510 1.0252
No log 1.2 84 1.0498 0.0839 1.0498 1.0246
No log 1.2286 86 1.0894 0.0627 1.0894 1.0437
No log 1.2571 88 1.0121 0.0485 1.0121 1.0061
No log 1.2857 90 0.9790 0.1906 0.9790 0.9894
No log 1.3143 92 1.0026 0.2007 1.0026 1.0013
No log 1.3429 94 0.9677 0.2352 0.9677 0.9837
No log 1.3714 96 0.9206 0.2373 0.9206 0.9595
No log 1.4 98 0.8571 0.2110 0.8571 0.9258
No log 1.4286 100 0.8873 0.1693 0.8873 0.9420
No log 1.4571 102 0.9514 0.2419 0.9514 0.9754
No log 1.4857 104 0.9748 0.2074 0.9748 0.9873
No log 1.5143 106 0.9827 0.1130 0.9827 0.9913
No log 1.5429 108 1.0052 0.1337 1.0052 1.0026
No log 1.5714 110 1.0669 0.1720 1.0669 1.0329
No log 1.6 112 1.0598 0.2218 1.0598 1.0295
No log 1.6286 114 1.1875 0.1803 1.1875 1.0897
No log 1.6571 116 1.4992 0.1193 1.4992 1.2244
No log 1.6857 118 1.4485 0.0816 1.4485 1.2035
No log 1.7143 120 1.1269 0.1301 1.1269 1.0615
No log 1.7429 122 1.0240 0.1541 1.0240 1.0119
No log 1.7714 124 1.0472 0.0893 1.0472 1.0233
No log 1.8 126 1.0703 0.0538 1.0703 1.0345
No log 1.8286 128 1.0810 0.1489 1.0810 1.0397
No log 1.8571 130 1.0434 0.1611 1.0434 1.0215
No log 1.8857 132 0.9868 0.2310 0.9868 0.9934
No log 1.9143 134 0.9534 0.2513 0.9534 0.9764
No log 1.9429 136 0.9494 0.0459 0.9494 0.9744
No log 1.9714 138 0.9440 0.1170 0.9440 0.9716
No log 2.0 140 0.9435 0.2484 0.9435 0.9714
No log 2.0286 142 0.9551 0.0973 0.9551 0.9773
No log 2.0571 144 1.0035 0.1174 1.0035 1.0018
No log 2.0857 146 1.0983 0.0284 1.0983 1.0480
No log 2.1143 148 1.0765 0.1239 1.0765 1.0375
No log 2.1429 150 1.1349 0.0918 1.1349 1.0653
No log 2.1714 152 1.2566 0.1003 1.2566 1.1210
No log 2.2 154 1.2395 0.0529 1.2395 1.1133
No log 2.2286 156 1.2399 0.0797 1.2399 1.1135
No log 2.2571 158 1.1999 0.0591 1.1999 1.0954
No log 2.2857 160 1.2527 0.0542 1.2527 1.1192
No log 2.3143 162 1.1737 0.1693 1.1737 1.0834
No log 2.3429 164 1.1423 -0.0232 1.1423 1.0688
No log 2.3714 166 1.1337 0.0893 1.1337 1.0648
No log 2.4 168 1.1652 0.2132 1.1652 1.0794
No log 2.4286 170 1.1098 0.2276 1.1098 1.0535
No log 2.4571 172 1.2162 0.1233 1.2162 1.1028
No log 2.4857 174 1.2745 0.0790 1.2745 1.1289
No log 2.5143 176 1.0790 0.0960 1.0790 1.0388
No log 2.5429 178 1.0091 0.2606 1.0091 1.0045
No log 2.5714 180 1.0163 0.2606 1.0163 1.0081
No log 2.6 182 1.0004 0.1531 1.0004 1.0002
No log 2.6286 184 1.0238 0.2801 1.0238 1.0118
No log 2.6571 186 1.0246 0.2747 1.0246 1.0122
No log 2.6857 188 1.0216 0.2429 1.0216 1.0107
No log 2.7143 190 1.0121 0.1162 1.0121 1.0060
No log 2.7429 192 1.0107 0.0960 1.0107 1.0054
No log 2.7714 194 0.9992 0.1959 0.9992 0.9996
No log 2.8 196 1.0426 0.2369 1.0426 1.0211
No log 2.8286 198 1.0097 0.2241 1.0097 1.0048
No log 2.8571 200 1.0126 0.2501 1.0126 1.0063
No log 2.8857 202 1.0432 0.2699 1.0432 1.0214
No log 2.9143 204 0.9393 0.2455 0.9393 0.9692
No log 2.9429 206 0.9074 0.2465 0.9074 0.9526
No log 2.9714 208 0.8959 0.2465 0.8959 0.9465
No log 3.0 210 0.8997 0.1950 0.8997 0.9485
No log 3.0286 212 0.8895 0.1846 0.8895 0.9431
No log 3.0571 214 0.8929 0.2688 0.8929 0.9449
No log 3.0857 216 0.9161 0.1833 0.9161 0.9571
No log 3.1143 218 0.9583 0.1840 0.9583 0.9789
No log 3.1429 220 0.9324 0.2739 0.9324 0.9656
No log 3.1714 222 0.9090 0.2504 0.9090 0.9534
No log 3.2 224 0.9332 0.2455 0.9332 0.9660
No log 3.2286 226 0.9485 0.2455 0.9485 0.9739
No log 3.2571 228 0.9472 0.2433 0.9472 0.9733
No log 3.2857 230 0.9693 0.2360 0.9693 0.9845
No log 3.3143 232 0.9776 0.2360 0.9776 0.9888
No log 3.3429 234 0.9704 0.2478 0.9704 0.9851
No log 3.3714 236 0.9814 0.1935 0.9814 0.9907
No log 3.4 238 1.0198 0.1619 1.0198 1.0098
No log 3.4286 240 0.9580 0.2400 0.9580 0.9788
No log 3.4571 242 0.9488 0.2036 0.9488 0.9740
No log 3.4857 244 0.9623 0.2400 0.9623 0.9809
No log 3.5143 246 1.0047 0.1597 1.0047 1.0023
No log 3.5429 248 1.0050 0.1597 1.0050 1.0025
No log 3.5714 250 1.0000 0.2114 1.0000 1.0000
No log 3.6 252 1.0298 0.1803 1.0298 1.0148
No log 3.6286 254 1.0196 0.0703 1.0196 1.0097
No log 3.6571 256 0.9329 0.2377 0.9329 0.9658
No log 3.6857 258 0.8942 0.2377 0.8942 0.9456
No log 3.7143 260 0.8847 0.3068 0.8847 0.9406
No log 3.7429 262 0.8316 0.3285 0.8316 0.9119
No log 3.7714 264 0.8246 0.3386 0.8246 0.9081
No log 3.8 266 0.8205 0.3409 0.8205 0.9058
No log 3.8286 268 0.8418 0.3470 0.8418 0.9175
No log 3.8571 270 0.8661 0.3531 0.8661 0.9307
No log 3.8857 272 0.8411 0.3470 0.8411 0.9171
No log 3.9143 274 0.8592 0.3643 0.8592 0.9269
No log 3.9429 276 0.9059 0.2131 0.9059 0.9518
No log 3.9714 278 0.8590 0.3003 0.8590 0.9268
No log 4.0 280 0.8117 0.3363 0.8117 0.9009
No log 4.0286 282 0.7950 0.3724 0.7950 0.8916
No log 4.0571 284 0.7904 0.4137 0.7904 0.8891
No log 4.0857 286 0.8505 0.2635 0.8505 0.9222
No log 4.1143 288 0.9203 0.2732 0.9203 0.9593
No log 4.1429 290 0.8859 0.3096 0.8859 0.9412
No log 4.1714 292 0.8140 0.3277 0.8140 0.9022
No log 4.2 294 0.8053 0.3277 0.8053 0.8974
No log 4.2286 296 0.8313 0.3562 0.8313 0.9117
No log 4.2571 298 0.8447 0.3501 0.8447 0.9191
No log 4.2857 300 0.8635 0.3027 0.8635 0.9292
No log 4.3143 302 0.7984 0.2744 0.7984 0.8935
No log 4.3429 304 0.7951 0.2827 0.7951 0.8917
No log 4.3714 306 0.8190 0.3161 0.8190 0.9050
No log 4.4 308 0.8224 0.2827 0.8224 0.9068
No log 4.4286 310 0.8512 0.2926 0.8512 0.9226
No log 4.4571 312 0.8561 0.3232 0.8561 0.9253
No log 4.4857 314 0.9262 0.2759 0.9262 0.9624
No log 4.5143 316 0.9233 0.2759 0.9233 0.9609
No log 4.5429 318 0.8667 0.3175 0.8667 0.9310
No log 4.5714 320 0.8839 0.3021 0.8839 0.9402
No log 4.6 322 0.8559 0.3243 0.8559 0.9252
No log 4.6286 324 0.8331 0.3674 0.8331 0.9127
No log 4.6571 326 0.8798 0.3916 0.8798 0.9380
No log 4.6857 328 0.8216 0.3705 0.8216 0.9064
No log 4.7143 330 0.7837 0.3467 0.7837 0.8853
No log 4.7429 332 0.7924 0.3768 0.7924 0.8902
No log 4.7714 334 0.8502 0.3851 0.8502 0.9221
No log 4.8 336 0.8747 0.3006 0.8747 0.9352
No log 4.8286 338 0.7955 0.3768 0.7955 0.8919
No log 4.8571 340 0.7860 0.3369 0.7860 0.8866
No log 4.8857 342 0.8097 0.3166 0.8097 0.8998
No log 4.9143 344 0.8282 0.3290 0.8282 0.9100
No log 4.9429 346 0.8344 0.2992 0.8344 0.9134
No log 4.9714 348 0.8887 0.2739 0.8887 0.9427
No log 5.0 350 0.9213 0.2068 0.9213 0.9599
No log 5.0286 352 0.8997 0.3141 0.8997 0.9485
No log 5.0571 354 0.9568 0.3067 0.9568 0.9781
No log 5.0857 356 0.9891 0.3067 0.9891 0.9945
No log 5.1143 358 0.9530 0.3112 0.9530 0.9762
No log 5.1429 360 0.9171 0.2507 0.9171 0.9576
No log 5.1714 362 0.8847 0.3153 0.8847 0.9406
No log 5.2 364 0.8354 0.3401 0.8354 0.9140
No log 5.2286 366 0.7689 0.3622 0.7689 0.8769
No log 5.2571 368 0.7708 0.2652 0.7708 0.8780
No log 5.2857 370 0.7631 0.3149 0.7631 0.8736
No log 5.3143 372 0.7660 0.3311 0.7660 0.8752
No log 5.3429 374 0.7778 0.3272 0.7778 0.8820
No log 5.3714 376 0.7915 0.3455 0.7915 0.8896
No log 5.4 378 0.8198 0.3143 0.8198 0.9054
No log 5.4286 380 0.8030 0.3291 0.8030 0.8961
No log 5.4571 382 0.7741 0.2883 0.7741 0.8798
No log 5.4857 384 0.7824 0.3574 0.7824 0.8846
No log 5.5143 386 0.7661 0.3577 0.7661 0.8753
No log 5.5429 388 0.7853 0.2774 0.7853 0.8862
No log 5.5714 390 0.8140 0.3391 0.8140 0.9022
No log 5.6 392 0.8632 0.3243 0.8632 0.9291
No log 5.6286 394 0.8632 0.3266 0.8632 0.9291
No log 5.6571 396 0.8661 0.2966 0.8661 0.9306
No log 5.6857 398 0.9254 0.2660 0.9254 0.9620
No log 5.7143 400 1.1076 0.1863 1.1076 1.0524
No log 5.7429 402 1.1651 0.1831 1.1651 1.0794
No log 5.7714 404 1.0464 0.2143 1.0464 1.0229
No log 5.8 406 0.9135 0.2936 0.9135 0.9558
No log 5.8286 408 0.8534 0.3563 0.8534 0.9238
No log 5.8571 410 0.8038 0.2999 0.8038 0.8965
No log 5.8857 412 0.7817 0.3106 0.7817 0.8841
No log 5.9143 414 0.7791 0.2034 0.7791 0.8827
No log 5.9429 416 0.7722 0.2160 0.7722 0.8787
No log 5.9714 418 0.7758 0.3034 0.7758 0.8808
No log 6.0 420 0.8208 0.3937 0.8208 0.9060
No log 6.0286 422 0.8291 0.3186 0.8291 0.9106
No log 6.0571 424 0.8636 0.2940 0.8636 0.9293
No log 6.0857 426 0.8946 0.2940 0.8946 0.9458
No log 6.1143 428 0.9280 0.2773 0.9280 0.9633
No log 6.1429 430 0.9421 0.2471 0.9421 0.9706
No log 6.1714 432 0.9081 0.2371 0.9081 0.9530
No log 6.2 434 0.9187 0.2020 0.9187 0.9585
No log 6.2286 436 0.9589 0.1975 0.9589 0.9792
No log 6.2571 438 0.9869 0.1406 0.9869 0.9934
No log 6.2857 440 1.0199 0.1623 1.0199 1.0099
No log 6.3143 442 1.0881 0.2388 1.0881 1.0431
No log 6.3429 444 1.0886 0.2408 1.0886 1.0433
No log 6.3714 446 0.9768 0.2693 0.9768 0.9883
No log 6.4 448 0.8979 0.2377 0.8979 0.9476
No log 6.4286 450 0.8706 0.2429 0.8706 0.9330
No log 6.4571 452 0.8632 0.2973 0.8632 0.9291
No log 6.4857 454 0.8689 0.2920 0.8689 0.9321
No log 6.5143 456 0.8285 0.2747 0.8285 0.9102
No log 6.5429 458 0.8303 0.1528 0.8303 0.9112
No log 6.5714 460 0.9169 0.1819 0.9169 0.9575
No log 6.6 462 0.9144 0.1819 0.9144 0.9562
No log 6.6286 464 0.8329 0.2611 0.8329 0.9126
No log 6.6571 466 0.8054 0.2249 0.8054 0.8975
No log 6.6857 468 0.7945 0.2771 0.7945 0.8914
No log 6.7143 470 0.7898 0.2917 0.7898 0.8887
No log 6.7429 472 0.7992 0.3060 0.7992 0.8940
No log 6.7714 474 0.8293 0.3339 0.8293 0.9107
No log 6.8 476 0.8003 0.3339 0.8003 0.8946
No log 6.8286 478 0.7564 0.3645 0.7564 0.8697
No log 6.8571 480 0.7481 0.3831 0.7481 0.8649
No log 6.8857 482 0.7343 0.3051 0.7343 0.8569
No log 6.9143 484 0.7458 0.3391 0.7458 0.8636
No log 6.9429 486 0.8339 0.3231 0.8339 0.9132
No log 6.9714 488 0.8671 0.3028 0.8671 0.9312
No log 7.0 490 0.8035 0.2973 0.8035 0.8964
No log 7.0286 492 0.8097 0.3112 0.8097 0.8998
No log 7.0571 494 0.8491 0.3357 0.8491 0.9215
No log 7.0857 496 0.8318 0.2540 0.8318 0.9120
No log 7.1143 498 0.8281 0.3051 0.8281 0.9100
0.3569 7.1429 500 0.8378 0.2540 0.8378 0.9153
0.3569 7.1714 502 0.8348 0.3285 0.8348 0.9137
0.3569 7.2 504 0.8432 0.2773 0.8432 0.9183
0.3569 7.2286 506 0.8523 0.2386 0.8523 0.9232
0.3569 7.2571 508 0.8752 0.2547 0.8752 0.9355
0.3569 7.2857 510 0.9503 0.3341 0.9503 0.9748
0.3569 7.3143 512 0.9734 0.3631 0.9734 0.9866
0.3569 7.3429 514 0.8843 0.3247 0.8843 0.9404
0.3569 7.3714 516 0.8065 0.3052 0.8065 0.8981
0.3569 7.4 518 0.7766 0.3518 0.7766 0.8813
0.3569 7.4286 520 0.7589 0.3141 0.7589 0.8711
0.3569 7.4571 522 0.7564 0.3386 0.7564 0.8697
0.3569 7.4857 524 0.7233 0.3051 0.7233 0.8505
0.3569 7.5143 526 0.7323 0.3450 0.7323 0.8558
0.3569 7.5429 528 0.7390 0.3492 0.7390 0.8596
0.3569 7.5714 530 0.7597 0.3106 0.7597 0.8716
0.3569 7.6 532 0.8632 0.2937 0.8632 0.9291
0.3569 7.6286 534 0.8625 0.2937 0.8625 0.9287
0.3569 7.6571 536 0.8012 0.2661 0.8012 0.8951
0.3569 7.6857 538 0.7972 0.2652 0.7972 0.8929
0.3569 7.7143 540 0.8505 0.1782 0.8505 0.9222
0.3569 7.7429 542 0.8432 0.1809 0.8432 0.9183
0.3569 7.7714 544 0.8260 0.2652 0.8260 0.9089
0.3569 7.8 546 0.8537 0.2320 0.8537 0.9239
0.3569 7.8286 548 0.8571 0.2102 0.8571 0.9258
0.3569 7.8571 550 0.8409 0.2102 0.8409 0.9170
0.3569 7.8857 552 0.8118 0.1684 0.8118 0.9010
0.3569 7.9143 554 0.7925 0.2072 0.7925 0.8902
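The Epoch and Step columns above imply about 70 optimizer steps per epoch (epoch 1.0 falls at step 70), which together with the batch size suggests the approximate size of the training split. This is back-of-the-envelope arithmetic assuming no gradient accumulation, which the card does not state:

```python
# Epoch 1.0 is reached at step 70 in the log above.
steps_per_epoch = 70          # from the Epoch/Step columns
train_batch_size = 8          # from the hyperparameter list
approx_train_examples = steps_per_epoch * train_batch_size
print(approx_train_examples)  # 560
```

Note also that training was stopped at step 554 (epoch ~7.9) of the configured 100 epochs, presumably by early stopping, with the final row matching the reported evaluation results (Loss 0.7925, Qwk 0.2072).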

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors

  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k14_task7_organization

Finetuned from aubmindlab/bert-base-arabertv02 (one of 4019 fine-tunes of that base model).