ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k19_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6715
  • Qwk: 0.2780
  • Mse: 0.6715
  • Rmse: 0.8195
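The headline metric, Qwk, is quadratic weighted kappa: an agreement score for ordinal labels (such as essay scores) that penalizes large disagreements more heavily than small ones, with 0 meaning chance-level agreement and 1 perfect agreement. A minimal pure-Python sketch of the metric for illustration (scikit-learn's `cohen_kappa_score(..., weights="quadratic")` computes the same quantity):

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights for ordinal class labels."""
    n = len(y_true)
    # Observed confusion matrix between true and predicted labels.
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Marginal label histograms, used for the chance-agreement baseline.
    hist_true = [y_true.count(c) for c in range(n_classes)]
    hist_pred = [y_pred.count(c) for c in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            # Quadratic disagreement weight: 0 on the diagonal,
            # 1 for the maximal disagreement.
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

# Perfect agreement scores 1.0.
print(quadratic_weighted_kappa([0, 1, 2, 2], [0, 1, 2, 2], 3))  # → 1.0
```

Against this scale, the final Qwk of 0.2780 indicates only modest agreement between predicted and reference scores.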

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
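With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate ramps (trivially) to 2e-05 and then decays linearly to 0 over the total number of optimizer steps. A small illustrative sketch of that schedule (the Trainer derives the real step count from the dataset size; the 6300-step figure below is extrapolated from the ~63 steps per epoch visible in the log):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: ramp up over warmup_steps, then decay to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# The results table advances ~63 optimizer steps per epoch (epoch 8.0 at
# step 504), so the configured 100 epochs would be roughly 6300 steps.
total_steps = 6300
print(linear_lr(0, total_steps))     # → 2e-05 (start of training)
print(linear_lr(3150, total_steps))  # → 1e-05 (halfway)
```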

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0317 2 2.4779 -0.0449 2.4779 1.5741
No log 0.0635 4 1.2437 0.0985 1.2437 1.1152
No log 0.0952 6 1.1731 -0.2161 1.1731 1.0831
No log 0.1270 8 1.0351 -0.0851 1.0351 1.0174
No log 0.1587 10 1.0346 -0.0907 1.0346 1.0172
No log 0.1905 12 1.0079 -0.0462 1.0079 1.0039
No log 0.2222 14 0.9935 -0.0426 0.9935 0.9967
No log 0.2540 16 0.9813 -0.0462 0.9813 0.9906
No log 0.2857 18 0.9776 0.1268 0.9776 0.9887
No log 0.3175 20 0.9543 0.1313 0.9543 0.9769
No log 0.3492 22 0.9723 0.0501 0.9723 0.9861
No log 0.3810 24 1.0093 0.0916 1.0093 1.0046
No log 0.4127 26 0.8836 0.0460 0.8836 0.9400
No log 0.4444 28 0.8351 -0.0079 0.8351 0.9139
No log 0.4762 30 0.8714 0.0460 0.8714 0.9335
No log 0.5079 32 1.1692 0.0726 1.1692 1.0813
No log 0.5397 34 1.3208 0.0065 1.3208 1.1493
No log 0.5714 36 1.1235 0.1256 1.1235 1.0599
No log 0.6032 38 0.8361 0.0051 0.8361 0.9144
No log 0.6349 40 0.7904 0.1807 0.7904 0.8891
No log 0.6667 42 0.8042 0.2027 0.8042 0.8967
No log 0.6984 44 0.7574 0.1007 0.7574 0.8703
No log 0.7302 46 0.7565 0.0 0.7565 0.8698
No log 0.7619 48 0.8183 0.0522 0.8183 0.9046
No log 0.7937 50 0.8593 0.0078 0.8593 0.9270
No log 0.8254 52 0.8648 -0.0426 0.8648 0.9299
No log 0.8571 54 0.8399 0.0 0.8399 0.9165
No log 0.8889 56 0.8280 0.0 0.8280 0.9099
No log 0.9206 58 0.8717 0.1007 0.8717 0.9336
No log 0.9524 60 0.9144 0.2171 0.9144 0.9562
No log 0.9841 62 0.8701 0.1983 0.8701 0.9328
No log 1.0159 64 0.9350 0.0165 0.9350 0.9670
No log 1.0476 66 1.0584 -0.0579 1.0584 1.0288
No log 1.0794 68 0.8746 0.1212 0.8746 0.9352
No log 1.1111 70 0.8466 0.1942 0.8466 0.9201
No log 1.1429 72 0.8385 0.1131 0.8385 0.9157
No log 1.1746 74 0.8478 0.1131 0.8478 0.9208
No log 1.2063 76 0.8689 0.1092 0.8689 0.9322
No log 1.2381 78 0.8683 0.0860 0.8683 0.9318
No log 1.2698 80 0.8622 0.1222 0.8622 0.9285
No log 1.3016 82 0.8425 0.0851 0.8425 0.9179
No log 1.3333 84 0.8165 0.0295 0.8165 0.9036
No log 1.3651 86 0.8266 0.0460 0.8266 0.9092
No log 1.3968 88 0.8710 0.0024 0.8710 0.9333
No log 1.4286 90 0.8666 0.0973 0.8666 0.9309
No log 1.4603 92 0.8685 0.0935 0.8685 0.9319
No log 1.4921 94 0.8624 0.1133 0.8624 0.9286
No log 1.5238 96 0.8532 0.0437 0.8532 0.9237
No log 1.5556 98 0.8305 0.1136 0.8305 0.9113
No log 1.5873 100 0.8632 0.0821 0.8632 0.9291
No log 1.6190 102 0.8848 0.0821 0.8848 0.9406
No log 1.6508 104 0.8645 0.1624 0.8645 0.9298
No log 1.6825 106 0.8862 0.1775 0.8862 0.9414
No log 1.7143 108 0.8619 0.1440 0.8619 0.9284
No log 1.7460 110 0.8263 0.1417 0.8263 0.9090
No log 1.7778 112 0.8412 0.1264 0.8412 0.9172
No log 1.8095 114 0.9637 0.3169 0.9637 0.9817
No log 1.8413 116 0.9463 0.3425 0.9463 0.9728
No log 1.8730 118 0.8437 0.2589 0.8437 0.9185
No log 1.9048 120 0.7833 -0.0459 0.7833 0.8851
No log 1.9365 122 0.8231 0.0514 0.8231 0.9073
No log 1.9683 124 0.8198 0.0913 0.8198 0.9054
No log 2.0 126 0.8161 0.1294 0.8161 0.9034
No log 2.0317 128 0.7807 0.1386 0.7807 0.8836
No log 2.0635 130 0.8239 0.1452 0.8239 0.9077
No log 2.0952 132 0.8595 0.1672 0.8595 0.9271
No log 2.1270 134 0.8775 0.2140 0.8775 0.9368
No log 2.1587 136 0.9177 0.2400 0.9177 0.9580
No log 2.1905 138 0.9649 0.1549 0.9649 0.9823
No log 2.2222 140 1.1232 0.2055 1.1232 1.0598
No log 2.2540 142 1.0808 0.2354 1.0808 1.0396
No log 2.2857 144 0.9157 0.2397 0.9157 0.9569
No log 2.3175 146 0.9510 0.2397 0.9510 0.9752
No log 2.3492 148 0.9995 0.2471 0.9995 0.9997
No log 2.3810 150 0.9960 0.2471 0.9960 0.9980
No log 2.4127 152 0.9551 0.1931 0.9551 0.9773
No log 2.4444 154 0.9566 0.1803 0.9566 0.9780
No log 2.4762 156 0.9517 0.1587 0.9517 0.9756
No log 2.5079 158 0.9063 0.0948 0.9063 0.9520
No log 2.5397 160 0.8914 0.1589 0.8914 0.9441
No log 2.5714 162 0.8830 0.2467 0.8830 0.9397
No log 2.6032 164 0.8877 0.2244 0.8877 0.9422
No log 2.6349 166 0.9038 0.2463 0.9038 0.9507
No log 2.6667 168 0.7769 0.2227 0.7769 0.8814
No log 2.6984 170 0.7549 0.2652 0.7549 0.8688
No log 2.7302 172 0.7855 0.2754 0.7855 0.8863
No log 2.7619 174 0.7658 0.2172 0.7658 0.8751
No log 2.7937 176 0.7715 0.3442 0.7715 0.8783
No log 2.8254 178 0.7863 0.2623 0.7863 0.8867
No log 2.8571 180 0.7728 0.2001 0.7728 0.8791
No log 2.8889 182 0.8079 0.3172 0.8079 0.8988
No log 2.9206 184 1.0550 0.2348 1.0550 1.0271
No log 2.9524 186 1.0418 0.2348 1.0418 1.0207
No log 2.9841 188 0.8226 0.3092 0.8226 0.9070
No log 3.0159 190 0.7061 0.3158 0.7061 0.8403
No log 3.0476 192 0.7566 0.2402 0.7566 0.8698
No log 3.0794 194 0.7080 0.1597 0.7080 0.8414
No log 3.1111 196 0.7477 0.3966 0.7477 0.8647
No log 3.1429 198 0.8514 0.3359 0.8514 0.9227
No log 3.1746 200 0.8294 0.3008 0.8294 0.9107
No log 3.2063 202 0.7630 0.3314 0.7630 0.8735
No log 3.2381 204 0.7278 0.2994 0.7278 0.8531
No log 3.2698 206 0.7610 0.3714 0.7610 0.8723
No log 3.3016 208 0.8416 0.3723 0.8416 0.9174
No log 3.3333 210 0.8806 0.3001 0.8806 0.9384
No log 3.3651 212 0.8747 0.3001 0.8747 0.9352
No log 3.3968 214 0.7865 0.2988 0.7865 0.8869
No log 3.4286 216 0.7658 0.2988 0.7658 0.8751
No log 3.4603 218 0.7481 0.2530 0.7481 0.8649
No log 3.4921 220 0.7300 0.1661 0.7300 0.8544
No log 3.5238 222 0.6976 0.2746 0.6976 0.8352
No log 3.5556 224 0.6976 0.2563 0.6976 0.8352
No log 3.5873 226 0.6874 0.2747 0.6874 0.8291
No log 3.6190 228 0.6814 0.2449 0.6814 0.8255
No log 3.6508 230 0.6808 0.3144 0.6808 0.8251
No log 3.6825 232 0.6824 0.2838 0.6824 0.8261
No log 3.7143 234 0.6957 0.2532 0.6957 0.8341
No log 3.7460 236 0.6950 0.2334 0.6950 0.8337
No log 3.7778 238 0.7055 0.2342 0.7055 0.8399
No log 3.8095 240 0.6962 0.2895 0.6962 0.8344
No log 3.8413 242 0.7874 0.3656 0.7874 0.8873
No log 3.8730 244 0.9050 0.3543 0.9050 0.9513
No log 3.9048 246 0.8563 0.3207 0.8563 0.9254
No log 3.9365 248 0.8105 0.3329 0.8105 0.9003
No log 3.9683 250 0.7374 0.1935 0.7374 0.8587
No log 4.0 252 0.7371 0.1697 0.7371 0.8585
No log 4.0317 254 0.7380 0.1012 0.7380 0.8591
No log 4.0635 256 0.7629 0.1010 0.7629 0.8735
No log 4.0952 258 0.7899 0.2913 0.7898 0.8887
No log 4.1270 260 0.8149 0.2913 0.8149 0.9027
No log 4.1587 262 0.8120 0.2784 0.8120 0.9011
No log 4.1905 264 0.7744 0.3088 0.7744 0.8800
No log 4.2222 266 0.7136 0.2530 0.7136 0.8448
No log 4.2540 268 0.7041 0.2621 0.7041 0.8391
No log 4.2857 270 0.7182 0.3248 0.7182 0.8474
No log 4.3175 272 0.7570 0.3305 0.7570 0.8700
No log 4.3492 274 0.7482 0.2471 0.7482 0.8650
No log 4.3810 276 0.7026 0.3887 0.7026 0.8382
No log 4.4127 278 0.7044 0.4244 0.7044 0.8393
No log 4.4444 280 0.7338 0.3235 0.7338 0.8566
No log 4.4762 282 0.8617 0.3183 0.8617 0.9283
No log 4.5079 284 0.8448 0.3305 0.8448 0.9191
No log 4.5397 286 0.7811 0.2161 0.7811 0.8838
No log 4.5714 288 0.7900 0.1970 0.7900 0.8888
No log 4.6032 290 0.8036 0.1437 0.8036 0.8964
No log 4.6349 292 0.9036 0.3034 0.9036 0.9506
No log 4.6667 294 0.9496 0.3799 0.9496 0.9745
No log 4.6984 296 0.9076 0.3125 0.9076 0.9527
No log 4.7302 298 0.8650 0.3125 0.8650 0.9300
No log 4.7619 300 0.7267 0.3209 0.7267 0.8525
No log 4.7937 302 0.6731 0.3933 0.6731 0.8204
No log 4.8254 304 0.6715 0.4029 0.6715 0.8195
No log 4.8571 306 0.6571 0.3762 0.6571 0.8106
No log 4.8889 308 0.6658 0.3689 0.6658 0.8160
No log 4.9206 310 0.6905 0.3665 0.6905 0.8309
No log 4.9524 312 0.7469 0.2696 0.7469 0.8642
No log 4.9841 314 0.7153 0.2283 0.7153 0.8457
No log 5.0159 316 0.6770 0.2247 0.6770 0.8228
No log 5.0476 318 0.6994 0.2652 0.6994 0.8363
No log 5.0794 320 0.7492 0.1748 0.7492 0.8655
No log 5.1111 322 0.8832 0.3503 0.8832 0.9398
No log 5.1429 324 1.1482 0.2708 1.1482 1.0715
No log 5.1746 326 1.1239 0.2821 1.1239 1.0601
No log 5.2063 328 0.9077 0.3440 0.9077 0.9528
No log 5.2381 330 0.7787 0.1715 0.7787 0.8824
No log 5.2698 332 0.7543 0.1818 0.7543 0.8685
No log 5.3016 334 0.7460 0.1818 0.7460 0.8637
No log 5.3333 336 0.7369 0.1887 0.7369 0.8584
No log 5.3651 338 0.7333 0.1638 0.7333 0.8563
No log 5.3968 340 0.7774 0.2866 0.7774 0.8817
No log 5.4286 342 0.7596 0.3746 0.7596 0.8715
No log 5.4603 344 0.7468 0.3866 0.7468 0.8642
No log 5.4921 346 0.6692 0.3618 0.6692 0.8180
No log 5.5238 348 0.6696 0.2958 0.6696 0.8183
No log 5.5556 350 0.7403 0.2316 0.7403 0.8604
No log 5.5873 352 0.7290 0.1447 0.7290 0.8538
No log 5.6190 354 0.7366 0.3209 0.7366 0.8583
No log 5.6508 356 0.7749 0.3085 0.7749 0.8803
No log 5.6825 358 0.7505 0.1256 0.7505 0.8663
No log 5.7143 360 0.7738 0.1174 0.7738 0.8797
No log 5.7460 362 0.7540 0.1424 0.7540 0.8683
No log 5.7778 364 0.7340 0.2058 0.7340 0.8567
No log 5.8095 366 0.7145 0.2148 0.7145 0.8453
No log 5.8413 368 0.7238 0.1174 0.7238 0.8508
No log 5.8730 370 0.7776 0.1660 0.7776 0.8818
No log 5.9048 372 0.7860 0.1660 0.7860 0.8865
No log 5.9365 374 0.7445 0.1263 0.7445 0.8629
No log 5.9683 376 0.7293 0.2148 0.7293 0.8540
No log 6.0 378 0.7666 0.3116 0.7666 0.8755
No log 6.0317 380 0.7687 0.3116 0.7687 0.8767
No log 6.0635 382 0.7378 0.1797 0.7378 0.8590
No log 6.0952 384 0.7610 0.1263 0.7610 0.8724
No log 6.1270 386 0.7902 0.1339 0.7902 0.8889
No log 6.1587 388 0.7618 0.0905 0.7618 0.8728
No log 6.1905 390 0.7620 0.1179 0.7620 0.8729
No log 6.2222 392 0.8087 0.2345 0.8087 0.8993
No log 6.2540 394 0.8351 0.2467 0.8351 0.9138
No log 6.2857 396 0.8461 0.2904 0.8461 0.9198
No log 6.3175 398 0.9059 0.3394 0.9059 0.9518
No log 6.3492 400 0.8624 0.3219 0.8624 0.9286
No log 6.3810 402 0.7562 0.3260 0.7562 0.8696
No log 6.4127 404 0.7136 0.2713 0.7136 0.8448
No log 6.4444 406 0.7083 0.2713 0.7083 0.8416
No log 6.4762 408 0.7067 0.2392 0.7067 0.8406
No log 6.5079 410 0.7120 0.1636 0.7120 0.8438
No log 6.5397 412 0.7284 0.2751 0.7284 0.8535
No log 6.5714 414 0.7245 0.3023 0.7245 0.8512
No log 6.6032 416 0.6889 0.2561 0.6889 0.8300
No log 6.6349 418 0.7043 0.0488 0.7043 0.8392
No log 6.6667 420 0.7400 0.1673 0.7400 0.8602
No log 6.6984 422 0.7252 0.1673 0.7252 0.8516
No log 6.7302 424 0.6876 0.2608 0.6876 0.8292
No log 6.7619 426 0.7280 0.3615 0.7280 0.8532
No log 6.7937 428 0.8233 0.3280 0.8233 0.9074
No log 6.8254 430 0.8392 0.3280 0.8392 0.9161
No log 6.8571 432 0.7285 0.3393 0.7285 0.8535
No log 6.8889 434 0.6570 0.3426 0.6570 0.8106
No log 6.9206 436 0.6672 0.2674 0.6672 0.8168
No log 6.9524 438 0.6571 0.2674 0.6571 0.8106
No log 6.9841 440 0.6531 0.3728 0.6531 0.8082
No log 7.0159 442 0.7030 0.3408 0.7030 0.8384
No log 7.0476 444 0.7566 0.3393 0.7566 0.8699
No log 7.0794 446 0.7611 0.3463 0.7611 0.8724
No log 7.1111 448 0.6903 0.4504 0.6903 0.8308
No log 7.1429 450 0.6493 0.4094 0.6493 0.8058
No log 7.1746 452 0.6417 0.3839 0.6417 0.8011
No log 7.2063 454 0.6825 0.3417 0.6825 0.8261
No log 7.2381 456 0.7696 0.3280 0.7696 0.8773
No log 7.2698 458 0.8077 0.3280 0.8077 0.8987
No log 7.3016 460 0.7639 0.3280 0.7639 0.8740
No log 7.3333 462 0.6872 0.3060 0.6872 0.8290
No log 7.3651 464 0.6485 0.3253 0.6485 0.8053
No log 7.3968 466 0.6414 0.2958 0.6414 0.8009
No log 7.4286 468 0.6404 0.2923 0.6404 0.8003
No log 7.4603 470 0.6696 0.4051 0.6696 0.8183
No log 7.4921 472 0.8021 0.3149 0.8021 0.8956
No log 7.5238 474 0.9237 0.3483 0.9237 0.9611
No log 7.5556 476 0.9132 0.3036 0.9132 0.9556
No log 7.5873 478 0.7624 0.2968 0.7624 0.8732
No log 7.6190 480 0.6811 0.3375 0.6811 0.8253
No log 7.6508 482 0.6961 0.3470 0.6961 0.8343
No log 7.6825 484 0.6957 0.2872 0.6957 0.8341
No log 7.7143 486 0.6953 0.3100 0.6953 0.8338
No log 7.7460 488 0.7090 0.3002 0.7090 0.8420
No log 7.7778 490 0.7218 0.3478 0.7218 0.8496
No log 7.8095 492 0.7464 0.3088 0.7464 0.8639
No log 7.8413 494 0.7130 0.3478 0.7130 0.8444
No log 7.8730 496 0.6906 0.4377 0.6906 0.8310
No log 7.9048 498 0.7125 0.3723 0.7125 0.8441
0.3345 7.9365 500 0.7074 0.3656 0.7074 0.8411
0.3345 7.9683 502 0.6678 0.3910 0.6678 0.8172
0.3345 8.0 504 0.6131 0.3862 0.6131 0.7830
0.3345 8.0317 506 0.6083 0.4276 0.6083 0.7799
0.3345 8.0635 508 0.6095 0.4384 0.6095 0.7807
0.3345 8.0952 510 0.6193 0.4161 0.6193 0.7870
0.3345 8.1270 512 0.6388 0.3675 0.6388 0.7993
0.3345 8.1587 514 0.7046 0.3867 0.7046 0.8394
0.3345 8.1905 516 0.7344 0.3095 0.7344 0.8570
0.3345 8.2222 518 0.7236 0.2751 0.7236 0.8506
0.3345 8.2540 520 0.6682 0.3574 0.6682 0.8175
0.3345 8.2857 522 0.6577 0.3530 0.6577 0.8110
0.3345 8.3175 524 0.6715 0.2780 0.6715 0.8195
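Two reading notes on the table: "No log" means the training loss had not yet been logged (the Trainer's default `logging_steps` is 500, and the first logged value, 0.3345, appears at step 500), and the Validation Loss and Mse columns are identical, consistent with an MSE training objective, with Rmse its square root. A quick check on the final row:

```python
import math

final_mse = 0.6715   # final-row Mse / Validation Loss from the table
final_rmse = 0.8195  # final-row Rmse

# Rmse should equal sqrt(Mse) to the table's reporting precision.
assert math.isclose(math.sqrt(final_mse), final_rmse, abs_tol=5e-4)
print(round(math.sqrt(final_mse), 4))  # → 0.8195
```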

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model weights

  • Safetensors model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k19_task7_organization

  • Finetuned from aubmindlab/bert-base-arabertv02