ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k18_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7344
  • Qwk (quadratic weighted kappa): 0.2878
  • Mse (mean squared error): 0.7344
  • Rmse (root mean squared error): 0.8570
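The Qwk, Mse, and Rmse figures above can be reproduced from model predictions with scikit-learn. A minimal sketch on hypothetical integer organization scores (the actual evaluation data is not published):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold and predicted scores, for illustration only.
y_true = [0, 1, 2, 2, 1]
y_pred = [0, 1, 1, 2, 0]

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))  # Rmse is simply the square root of Mse
```

Note that the reported Loss equals the reported Mse, which is consistent with training under an MSE regression objective.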

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
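A minimal sketch of how the hyperparameters above map onto Hugging Face `TrainingArguments` (the `output_dir` is illustrative; the actual training script is not published):

```python
from transformers import TrainingArguments

# Illustrative reconstruction of the reported hyperparameters; the Adam
# betas=(0.9, 0.999) and epsilon=1e-08 listed above are the defaults.
training_args = TrainingArguments(
    output_dir="./results",          # illustrative path
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```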

Training results

Training loss is only logged every 500 steps, so rows before step 500 show "No log" in the first column.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0222 2 2.5379 -0.0788 2.5379 1.5931
No log 0.0444 4 1.3554 0.0704 1.3554 1.1642
No log 0.0667 6 1.3594 -0.2360 1.3594 1.1659
No log 0.0889 8 1.2376 -0.0211 1.2376 1.1125
No log 0.1111 10 1.1883 0.0412 1.1883 1.0901
No log 0.1333 12 1.1483 0.1277 1.1483 1.0716
No log 0.1556 14 1.1005 0.0033 1.1005 1.0491
No log 0.1778 16 1.1099 -0.1507 1.1099 1.0535
No log 0.2 18 0.9785 -0.0887 0.9785 0.9892
No log 0.2222 20 0.8976 0.0840 0.8976 0.9474
No log 0.2444 22 0.8314 0.1236 0.8314 0.9118
No log 0.2667 24 0.8327 0.1863 0.8327 0.9125
No log 0.2889 26 0.8736 0.2270 0.8736 0.9347
No log 0.3111 28 0.9214 0.1617 0.9214 0.9599
No log 0.3333 30 0.9924 0.0822 0.9924 0.9962
No log 0.3556 32 1.0618 0.0879 1.0618 1.0305
No log 0.3778 34 1.0775 0.0451 1.0775 1.0380
No log 0.4 36 1.1759 0.0832 1.1759 1.0844
No log 0.4222 38 1.3980 -0.0245 1.3980 1.1824
No log 0.4444 40 1.5674 -0.1507 1.5674 1.2520
No log 0.4667 42 1.8341 -0.0922 1.8341 1.3543
No log 0.4889 44 1.7428 -0.0614 1.7428 1.3202
No log 0.5111 46 1.3814 -0.1507 1.3814 1.1753
No log 0.5333 48 1.0893 0.0952 1.0893 1.0437
No log 0.5556 50 0.9535 0.1918 0.9535 0.9765
No log 0.5778 52 0.8953 0.1504 0.8953 0.9462
No log 0.6 54 0.8175 0.1752 0.8175 0.9041
No log 0.6222 56 0.8026 -0.0054 0.8026 0.8959
No log 0.6444 58 0.8051 -0.0054 0.8051 0.8973
No log 0.6667 60 0.8144 0.2206 0.8144 0.9024
No log 0.6889 62 0.8450 0.1225 0.8450 0.9192
No log 0.7111 64 0.8527 0.1359 0.8527 0.9234
No log 0.7333 66 0.8520 0.1359 0.8520 0.9230
No log 0.7556 68 0.8294 0.0327 0.8294 0.9107
No log 0.7778 70 0.8759 0.0051 0.8759 0.9359
No log 0.8 72 0.9219 0.0145 0.9219 0.9601
No log 0.8222 74 1.0302 0.1618 1.0302 1.0150
No log 0.8444 76 1.1794 0.1248 1.1794 1.0860
No log 0.8667 78 1.0798 0.2183 1.0798 1.0391
No log 0.8889 80 0.9641 0.0609 0.9641 0.9819
No log 0.9111 82 1.0029 0.0609 1.0029 1.0014
No log 0.9333 84 0.9285 0.1299 0.9285 0.9636
No log 0.9556 86 0.8856 0.0902 0.8856 0.9411
No log 0.9778 88 0.8623 0.0733 0.8623 0.9286
No log 1.0 90 0.8602 0.1138 0.8602 0.9275
No log 1.0222 92 0.8888 0.0376 0.8888 0.9428
No log 1.0444 94 0.9291 0.0119 0.9291 0.9639
No log 1.0667 96 0.8793 0.1218 0.8793 0.9377
No log 1.0889 98 0.7852 0.0725 0.7852 0.8861
No log 1.1111 100 0.7767 0.0688 0.7767 0.8813
No log 1.1333 102 0.7953 0.1136 0.7953 0.8918
No log 1.1556 104 0.8636 0.1842 0.8636 0.9293
No log 1.1778 106 0.8732 0.2172 0.8732 0.9345
No log 1.2 108 0.9934 0.1551 0.9934 0.9967
No log 1.2222 110 1.3639 -0.0005 1.3639 1.1679
No log 1.2444 112 1.3457 -0.0255 1.3457 1.1601
No log 1.2667 114 1.0949 0.0401 1.0949 1.0464
No log 1.2889 116 0.8965 0.1660 0.8965 0.9468
No log 1.3111 118 0.8234 0.1633 0.8234 0.9074
No log 1.3333 120 0.8360 0.0902 0.8360 0.9143
No log 1.3556 122 0.9238 0.1626 0.9238 0.9612
No log 1.3778 124 0.8833 0.1993 0.8833 0.9398
No log 1.4 126 0.7730 0.1884 0.7730 0.8792
No log 1.4222 128 0.7723 0.1508 0.7723 0.8788
No log 1.4444 130 0.7910 0.2803 0.7910 0.8894
No log 1.4667 132 0.9851 0.1178 0.9851 0.9925
No log 1.4889 134 1.1038 0.1226 1.1038 1.0506
No log 1.5111 136 1.1775 0.0994 1.1775 1.0851
No log 1.5333 138 1.1581 0.1246 1.1581 1.0762
No log 1.5556 140 1.0695 0.1983 1.0695 1.0342
No log 1.5778 142 0.8501 0.1957 0.8501 0.9220
No log 1.6 144 0.6970 0.2057 0.6970 0.8349
No log 1.6222 146 0.6784 0.2412 0.6784 0.8236
No log 1.6444 148 0.6863 0.1617 0.6863 0.8284
No log 1.6667 150 0.6887 0.1617 0.6887 0.8299
No log 1.6889 152 0.6930 0.2405 0.6930 0.8325
No log 1.7111 154 0.7280 0.1228 0.7280 0.8533
No log 1.7333 156 0.8817 0.2476 0.8817 0.9390
No log 1.7556 158 1.0163 0.1803 1.0163 1.0081
No log 1.7778 160 0.9947 0.1866 0.9947 0.9974
No log 1.8 162 0.8921 0.0890 0.8921 0.9445
No log 1.8222 164 0.8222 0.2424 0.8222 0.9068
No log 1.8444 166 0.7888 0.0798 0.7888 0.8881
No log 1.8667 168 0.7911 0.1400 0.7911 0.8895
No log 1.8889 170 0.7987 0.1737 0.7987 0.8937
No log 1.9111 172 0.8056 0.1737 0.8056 0.8976
No log 1.9333 174 0.8193 0.1264 0.8193 0.9052
No log 1.9556 176 0.8607 0.0393 0.8607 0.9277
No log 1.9778 178 0.9088 0.0025 0.9088 0.9533
No log 2.0 180 0.9326 0.2316 0.9326 0.9657
No log 2.0222 182 0.8731 0.3561 0.8731 0.9344
No log 2.0444 184 0.8438 0.2747 0.8438 0.9186
No log 2.0667 186 0.8624 0.2434 0.8624 0.9287
No log 2.0889 188 0.9370 0.1979 0.9370 0.9680
No log 2.1111 190 1.2093 0.1198 1.2093 1.0997
No log 2.1333 192 1.5946 0.0292 1.5946 1.2628
No log 2.1556 194 1.5582 0.0077 1.5582 1.2483
No log 2.1778 196 1.2133 0.1488 1.2133 1.1015
No log 2.2 198 0.8480 0.2171 0.8480 0.9209
No log 2.2222 200 0.8302 0.2817 0.8302 0.9111
No log 2.2444 202 0.9108 0.2094 0.9108 0.9543
No log 2.2667 204 0.8580 0.3302 0.8580 0.9263
No log 2.2889 206 0.7710 0.2817 0.7710 0.8781
No log 2.3111 208 0.7728 0.2317 0.7728 0.8791
No log 2.3333 210 1.0043 0.1803 1.0043 1.0022
No log 2.3556 212 1.2543 0.1479 1.2543 1.1199
No log 2.3778 214 1.3175 0.1001 1.3175 1.1478
No log 2.4 216 1.2169 0.1211 1.2169 1.1031
No log 2.4222 218 1.1078 0.1671 1.1078 1.0525
No log 2.4444 220 1.1278 0.1185 1.1278 1.0620
No log 2.4667 222 1.1625 0.1185 1.1625 1.0782
No log 2.4889 224 1.1162 0.1185 1.1162 1.0565
No log 2.5111 226 1.0130 0.0933 1.0130 1.0065
No log 2.5333 228 0.9231 0.3177 0.9231 0.9608
No log 2.5556 230 0.9148 0.2224 0.9148 0.9565
No log 2.5778 232 0.8926 0.1581 0.8926 0.9448
No log 2.6 234 0.8417 0.0816 0.8417 0.9175
No log 2.6222 236 0.8361 0.1228 0.8361 0.9144
No log 2.6444 238 0.8315 0.0283 0.8315 0.9119
No log 2.6667 240 0.8507 0.0679 0.8507 0.9223
No log 2.6889 242 0.8603 0.0679 0.8603 0.9275
No log 2.7111 244 0.8586 0.0283 0.8586 0.9266
No log 2.7333 246 0.9429 0.0883 0.9429 0.9710
No log 2.7556 248 1.0708 0.0619 1.0708 1.0348
No log 2.7778 250 1.1584 0.0666 1.1584 1.0763
No log 2.8 252 1.2327 0.0458 1.2327 1.1103
No log 2.8222 254 1.2208 0.0199 1.2208 1.1049
No log 2.8444 256 1.0413 0.2945 1.0413 1.0204
No log 2.8667 258 0.8933 0.1756 0.8933 0.9452
No log 2.8889 260 0.8364 0.1017 0.8364 0.9146
No log 2.9111 262 0.8313 0.1277 0.8313 0.9117
No log 2.9333 264 0.8551 0.0404 0.8551 0.9247
No log 2.9556 266 0.9154 0.1453 0.9154 0.9568
No log 2.9778 268 0.9025 0.0753 0.9025 0.9500
No log 3.0 270 0.8482 0.0720 0.8482 0.9210
No log 3.0222 272 0.8319 0.1268 0.8319 0.9121
No log 3.0444 274 0.8493 0.1850 0.8493 0.9216
No log 3.0667 276 0.8421 0.0552 0.8421 0.9177
No log 3.0889 278 0.8600 -0.0455 0.8600 0.9273
No log 3.1111 280 0.9578 0.1842 0.9578 0.9787
No log 3.1333 282 1.2045 0.0661 1.2045 1.0975
No log 3.1556 284 1.3052 0.1228 1.3052 1.1425
No log 3.1778 286 1.2193 0.0661 1.2193 1.1042
No log 3.2 288 1.0804 0.0569 1.0804 1.0394
No log 3.2222 290 0.9585 0.0810 0.9585 0.9790
No log 3.2444 292 0.9110 0.1123 0.9110 0.9545
No log 3.2667 294 0.8545 0.0694 0.8545 0.9244
No log 3.2889 296 0.8334 0.0306 0.8334 0.9129
No log 3.3111 298 0.7997 0.2004 0.7997 0.8943
No log 3.3333 300 0.7787 0.2652 0.7787 0.8825
No log 3.3556 302 0.7778 0.2558 0.7778 0.8819
No log 3.3778 304 0.7929 0.1884 0.7929 0.8905
No log 3.4 306 0.8355 0.2595 0.8355 0.9141
No log 3.4222 308 0.8739 0.2252 0.8739 0.9348
No log 3.4444 310 0.9836 0.1354 0.9836 0.9918
No log 3.4667 312 1.0157 0.1880 1.0157 1.0078
No log 3.4889 314 0.9640 0.1228 0.9640 0.9818
No log 3.5111 316 0.9137 0.2849 0.9137 0.9559
No log 3.5333 318 0.9363 0.2388 0.9363 0.9676
No log 3.5556 320 1.0270 0.1344 1.0270 1.0134
No log 3.5778 322 1.0637 0.0529 1.0637 1.0313
No log 3.6 324 1.0024 0.1087 1.0024 1.0012
No log 3.6222 326 0.8759 0.2429 0.8759 0.9359
No log 3.6444 328 0.8639 0.1607 0.8639 0.9294
No log 3.6667 330 0.9056 0.2387 0.9056 0.9516
No log 3.6889 332 0.8742 0.2439 0.8742 0.9350
No log 3.7111 334 0.8580 0.1362 0.8580 0.9263
No log 3.7333 336 0.8493 0.1786 0.8493 0.9216
No log 3.7556 338 0.8514 0.1522 0.8514 0.9227
No log 3.7778 340 0.8569 0.2234 0.8569 0.9257
No log 3.8 342 0.8435 0.2234 0.8435 0.9184
No log 3.8222 344 0.8439 0.3175 0.8439 0.9186
No log 3.8444 346 0.8781 0.2964 0.8781 0.9371
No log 3.8667 348 0.9060 0.3099 0.9060 0.9518
No log 3.8889 350 0.9018 0.2303 0.9018 0.9496
No log 3.9111 352 0.8060 0.2295 0.8060 0.8978
No log 3.9333 354 0.7788 0.2479 0.7788 0.8825
No log 3.9556 356 0.7922 0.2508 0.7922 0.8900
No log 3.9778 358 0.8008 0.2777 0.8008 0.8949
No log 4.0 360 0.8099 0.2720 0.8099 0.8999
No log 4.0222 362 0.8401 0.2784 0.8401 0.9165
No log 4.0444 364 0.8969 0.2124 0.8969 0.9470
No log 4.0667 366 0.9173 0.2807 0.9173 0.9578
No log 4.0889 368 0.8366 0.2784 0.8366 0.9147
No log 4.1111 370 0.7966 0.2838 0.7966 0.8925
No log 4.1333 372 0.8028 0.3252 0.8028 0.8960
No log 4.1556 374 0.7702 0.2895 0.7702 0.8776
No log 4.1778 376 0.7696 0.2923 0.7696 0.8772
No log 4.2 378 0.7552 0.2895 0.7552 0.8690
No log 4.2222 380 0.7590 0.3260 0.7590 0.8712
No log 4.2444 382 0.7378 0.3280 0.7378 0.8589
No log 4.2667 384 0.7214 0.4384 0.7214 0.8494
No log 4.2889 386 0.7566 0.3069 0.7566 0.8698
No log 4.3111 388 0.8043 0.3551 0.8043 0.8968
No log 4.3333 390 0.8126 0.4282 0.8126 0.9014
No log 4.3556 392 0.7837 0.3271 0.7837 0.8852
No log 4.3778 394 0.7530 0.3974 0.7530 0.8678
No log 4.4 396 0.7579 0.3746 0.7579 0.8706
No log 4.4222 398 0.7930 0.2968 0.7930 0.8905
No log 4.4444 400 0.7912 0.3085 0.7912 0.8895
No log 4.4667 402 0.7851 0.3460 0.7851 0.8861
No log 4.4889 404 0.8174 0.1672 0.8174 0.9041
No log 4.5111 406 0.7931 0.1683 0.7931 0.8905
No log 4.5333 408 0.7636 0.3198 0.7636 0.8738
No log 4.5556 410 0.7596 0.3408 0.7596 0.8715
No log 4.5778 412 0.7498 0.2929 0.7498 0.8659
No log 4.6 414 0.7629 0.2777 0.7629 0.8735
No log 4.6222 416 0.7968 0.2461 0.7968 0.8926
No log 4.6444 418 0.8122 0.2171 0.8122 0.9012
No log 4.6667 420 0.7896 0.2777 0.7896 0.8886
No log 4.6889 422 0.7936 0.1970 0.7936 0.8909
No log 4.7111 424 0.8222 0.2561 0.8222 0.9068
No log 4.7333 426 0.8659 0.3001 0.8659 0.9305
No log 4.7556 428 0.8773 0.3001 0.8773 0.9367
No log 4.7778 430 0.8346 0.1522 0.8346 0.9135
No log 4.8 432 0.8259 0.2072 0.8259 0.9088
No log 4.8222 434 0.8184 0.1775 0.8184 0.9046
No log 4.8444 436 0.7824 0.2034 0.7824 0.8845
No log 4.8667 438 0.7769 0.2772 0.7769 0.8814
No log 4.8889 440 0.8185 0.1980 0.8185 0.9047
No log 4.9111 442 0.9169 0.2779 0.9169 0.9575
No log 4.9333 444 0.9238 0.2779 0.9238 0.9612
No log 4.9556 446 0.8216 0.2911 0.8216 0.9064
No log 4.9778 448 0.7933 0.3622 0.7933 0.8907
No log 5.0 450 0.7971 0.3622 0.7971 0.8928
No log 5.0222 452 0.8034 0.2986 0.8034 0.8963
No log 5.0444 454 0.8212 0.2944 0.8212 0.9062
No log 5.0667 456 0.8421 0.2355 0.8421 0.9176
No log 5.0889 458 0.8179 0.2944 0.8179 0.9044
No log 5.1111 460 0.8027 0.3320 0.8027 0.8959
No log 5.1333 462 0.7864 0.2857 0.7864 0.8868
No log 5.1556 464 0.7794 0.3222 0.7794 0.8829
No log 5.1778 466 0.7823 0.2712 0.7823 0.8845
No log 5.2 468 0.8237 0.2739 0.8237 0.9076
No log 5.2222 470 0.7942 0.2085 0.7942 0.8912
No log 5.2444 472 0.7757 0.3890 0.7757 0.8808
No log 5.2667 474 0.7870 0.3723 0.7870 0.8871
No log 5.2889 476 0.7705 0.3980 0.7705 0.8778
No log 5.3111 478 0.7542 0.3551 0.7542 0.8685
No log 5.3333 480 0.7516 0.3813 0.7516 0.8669
No log 5.3556 482 0.7745 0.2576 0.7745 0.8801
No log 5.3778 484 0.8178 0.3159 0.8178 0.9043
No log 5.4 486 0.7823 0.2839 0.7823 0.8845
No log 5.4222 488 0.7256 0.3788 0.7256 0.8518
No log 5.4444 490 0.7326 0.3060 0.7326 0.8559
No log 5.4667 492 0.8468 0.2705 0.8468 0.9202
No log 5.4889 494 0.9058 0.3204 0.9058 0.9518
No log 5.5111 496 0.8281 0.3239 0.8281 0.9100
No log 5.5333 498 0.7384 0.3161 0.7384 0.8593
0.4372 5.5556 500 0.7643 0.3910 0.7643 0.8742
0.4372 5.5778 502 0.7727 0.3910 0.7727 0.8790
0.4372 5.6 504 0.7389 0.3910 0.7389 0.8596
0.4372 5.6222 506 0.7188 0.3100 0.7188 0.8478
0.4372 5.6444 508 0.7496 0.3417 0.7496 0.8658
0.4372 5.6667 510 0.7344 0.2878 0.7344 0.8570

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
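With the framework versions above, the checkpoint can be loaded through the standard transformers API. This sketch assumes the model was saved with a single-logit regression head (consistent with the Mse/Rmse metrics); `score_essay` is an illustrative helper, not part of the repository:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_ID = "MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k18_task7_organization"

def score_essay(text: str) -> float:
    """Download the checkpoint and return its raw organization score.

    Assumes a single regression logit; adjust if the head differs.
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
    model.eval()
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        logits = model(**inputs).logits
    return logits.squeeze().item()
```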
Safetensors model size: 0.1B params (tensor type: F32)

Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k18_task7_organization: fine-tuned from aubmindlab/bert-base-arabertv02