ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k20_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch for reproducing these metrics follows the list):

  • Loss: 0.7938
  • Qwk: 0.2747
  • Mse: 0.7938
  • Rmse: 0.8910
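
For reference, these metrics can be reproduced with scikit-learn; the sketch below is an assumption-laden illustration, not the original evaluation code. It treats Qwk as quadratic weighted kappa, and `y_true` / `y_pred` are placeholder integer score arrays.

```python
# Minimal metric sketch (placeholder labels; assumes integer-valued scores).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 1, 4])  # placeholder gold scores
y_pred = np.array([2, 2, 1, 3])  # placeholder predicted scores

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = np.sqrt(mse)                                           # Rmse
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```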

Model description

More information needed

Intended uses & limitations

More information needed
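
No usage guidance is provided by the author. The snippet below is only a minimal inference sketch under assumptions: it assumes the checkpoint exposes a standard sequence-classification head (the MSE/RMSE metrics suggest a regression-style score output), and the essay text is a placeholder.

```python
# Inference sketch (assumes a sequence-classification head; not an official example).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k20_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "نص المقال هنا"  # placeholder Arabic essay text
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)  # interpretation depends on how the head was configured during fine-tuning
```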

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an illustrative mapping to TrainingArguments follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
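
As a rough illustration only (not the original training script), these hyperparameters map onto transformers.TrainingArguments as sketched below; the output directory is a placeholder, and `model`, `train_dataset`, and `eval_dataset` would need to be supplied.

```python
# Sketch of the listed hyperparameters as TrainingArguments (placeholders for model/data).
from transformers import TrainingArguments, Trainer

training_args = TrainingArguments(
    output_dir="arabert_task7_organization",  # placeholder output directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset)
# trainer.train()
```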

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0294 2 2.6429 -0.0262 2.6429 1.6257
No log 0.0588 4 1.3942 0.0759 1.3942 1.1807
No log 0.0882 6 0.9615 -0.0970 0.9615 0.9806
No log 0.1176 8 1.0164 -0.2346 1.0164 1.0081
No log 0.1471 10 0.8443 -0.0103 0.8443 0.9189
No log 0.1765 12 0.7791 -0.0079 0.7791 0.8826
No log 0.2059 14 0.8929 0.0327 0.8929 0.9449
No log 0.2353 16 1.1948 -0.1072 1.1948 1.0931
No log 0.2647 18 1.0232 0.0053 1.0232 1.0115
No log 0.2941 20 0.8711 0.0 0.8711 0.9333
No log 0.3235 22 0.8390 0.0 0.8390 0.9159
No log 0.3529 24 0.9246 0.1345 0.9246 0.9616
No log 0.3824 26 0.9326 0.0955 0.9326 0.9657
No log 0.4118 28 0.8469 0.0889 0.8469 0.9203
No log 0.4412 30 0.8451 0.1007 0.8451 0.9193
No log 0.4706 32 0.9009 0.0927 0.9009 0.9491
No log 0.5 34 0.9700 0.0295 0.9700 0.9849
No log 0.5294 36 1.2107 -0.0053 1.2107 1.1003
No log 0.5588 38 1.4374 -0.1431 1.4374 1.1989
No log 0.5882 40 1.2127 0.0277 1.2127 1.1012
No log 0.6176 42 1.0001 0.0421 1.0001 1.0000
No log 0.6471 44 0.9531 0.0393 0.9531 0.9763
No log 0.6765 46 0.9884 0.0165 0.9884 0.9942
No log 0.7059 48 1.0727 0.1304 1.0727 1.0357
No log 0.7353 50 0.9930 0.0949 0.9930 0.9965
No log 0.7647 52 0.8879 0.0428 0.8879 0.9423
No log 0.7941 54 0.8810 0.0327 0.8810 0.9386
No log 0.8235 56 0.9089 0.0 0.9089 0.9534
No log 0.8529 58 0.9704 0.0509 0.9704 0.9851
No log 0.8824 60 1.0261 0.0522 1.0261 1.0130
No log 0.9118 62 0.9055 -0.0426 0.9055 0.9516
No log 0.9412 64 0.8436 0.0 0.8436 0.9185
No log 0.9706 66 0.8189 0.1094 0.8189 0.9049
No log 1.0 68 0.8590 0.1972 0.8590 0.9268
No log 1.0294 70 0.9209 0.0652 0.9209 0.9597
No log 1.0588 72 1.0756 0.0406 1.0756 1.0371
No log 1.0882 74 1.3904 -0.1241 1.3904 1.1792
No log 1.1176 76 1.4350 -0.1831 1.4350 1.1979
No log 1.1471 78 1.1534 0.0559 1.1534 1.0740
No log 1.1765 80 0.9033 0.1699 0.9033 0.9504
No log 1.2059 82 0.8433 0.2171 0.8433 0.9183
No log 1.2353 84 0.8152 0.1508 0.8152 0.9029
No log 1.2647 86 0.8939 0.0460 0.8939 0.9455
No log 1.2941 88 1.0469 0.1328 1.0469 1.0232
No log 1.3235 90 1.0223 0.1660 1.0223 1.0111
No log 1.3529 92 0.8916 -0.0121 0.8916 0.9442
No log 1.3824 94 0.8564 0.0618 0.8564 0.9254
No log 1.4118 96 0.8888 0.0618 0.8888 0.9428
No log 1.4412 98 0.9621 0.0501 0.9621 0.9809
No log 1.4706 100 1.0364 0.1352 1.0364 1.0180
No log 1.5 102 1.0206 0.1724 1.0206 1.0102
No log 1.5294 104 0.9345 0.1358 0.9345 0.9667
No log 1.5588 106 0.9104 -0.0047 0.9104 0.9541
No log 1.5882 108 0.9501 -0.0000 0.9501 0.9747
No log 1.6176 110 0.9718 0.0798 0.9718 0.9858
No log 1.6471 112 1.0136 0.0108 1.0136 1.0068
No log 1.6765 114 1.0354 0.0403 1.0354 1.0175
No log 1.7059 116 1.0153 0.1641 1.0153 1.0076
No log 1.7353 118 1.0622 0.1198 1.0622 1.0306
No log 1.7647 120 1.2498 0.0735 1.2498 1.1180
No log 1.7941 122 1.2738 0.0493 1.2738 1.1286
No log 1.8235 124 1.0952 0.1022 1.0952 1.0465
No log 1.8529 126 0.9936 0.1016 0.9936 0.9968
No log 1.8824 128 0.9023 0.0490 0.9023 0.9499
No log 1.9118 130 0.9282 0.3131 0.9282 0.9635
No log 1.9412 132 0.8717 0.2722 0.8717 0.9337
No log 1.9706 134 0.8283 0.1010 0.8283 0.9101
No log 2.0 136 0.8335 0.1010 0.8335 0.9130
No log 2.0294 138 0.8390 0.0933 0.8390 0.9159
No log 2.0588 140 0.8852 0.2077 0.8852 0.9409
No log 2.0882 142 0.9118 0.1638 0.9118 0.9549
No log 2.1176 144 0.9562 0.0941 0.9562 0.9779
No log 2.1471 146 0.9645 0.1839 0.9645 0.9821
No log 2.1765 148 0.9519 0.1256 0.9519 0.9756
No log 2.2059 150 0.9095 0.1850 0.9095 0.9537
No log 2.2353 152 0.8751 0.0573 0.8751 0.9355
No log 2.2647 154 0.9001 0.0347 0.9001 0.9487
No log 2.2941 156 0.9050 -0.0091 0.9050 0.9513
No log 2.3235 158 0.9296 0.0899 0.9296 0.9642
No log 2.3529 160 0.9831 0.2392 0.9831 0.9915
No log 2.3824 162 1.0140 0.1701 1.0140 1.0070
No log 2.4118 164 1.0133 0.1564 1.0133 1.0066
No log 2.4412 166 1.1578 0.0130 1.1578 1.0760
No log 2.4706 168 1.1449 0.0130 1.1449 1.0700
No log 2.5 170 1.0928 0.0096 1.0928 1.0454
No log 2.5294 172 0.9589 0.3126 0.9589 0.9792
No log 2.5588 174 0.9452 0.3173 0.9452 0.9722
No log 2.5882 176 0.8691 0.2749 0.8691 0.9323
No log 2.6176 178 0.8251 0.1410 0.8251 0.9083
No log 2.6471 180 0.9012 0.1502 0.9012 0.9493
No log 2.6765 182 0.9503 0.1198 0.9503 0.9748
No log 2.7059 184 0.8989 0.2110 0.8989 0.9481
No log 2.7353 186 0.8763 0.1346 0.8763 0.9361
No log 2.7647 188 0.8667 0.2424 0.8667 0.9310
No log 2.7941 190 0.9182 0.2689 0.9182 0.9582
No log 2.8235 192 0.9639 0.2355 0.9639 0.9818
No log 2.8529 194 1.0101 0.1469 1.0101 1.0050
No log 2.8824 196 1.0541 0.1194 1.0541 1.0267
No log 2.9118 198 1.0421 0.1583 1.0421 1.0208
No log 2.9412 200 1.1305 0.1908 1.1305 1.0632
No log 2.9706 202 1.1376 0.1340 1.1376 1.0666
No log 3.0 204 1.0317 0.0889 1.0317 1.0157
No log 3.0294 206 1.0178 0.2270 1.0178 1.0089
No log 3.0588 208 1.0052 0.2298 1.0052 1.0026
No log 3.0882 210 1.0211 0.0954 1.0211 1.0105
No log 3.1176 212 1.0221 0.0378 1.0221 1.0110
No log 3.1471 214 0.9237 -0.0065 0.9237 0.9611
No log 3.1765 216 0.8550 0.1487 0.8550 0.9247
No log 3.2059 218 0.8287 0.0860 0.8287 0.9103
No log 3.2353 220 0.8019 0.2004 0.8019 0.8955
No log 3.2647 222 0.8241 0.1361 0.8241 0.9078
No log 3.2941 224 0.8124 0.2594 0.8124 0.9014
No log 3.3235 226 0.8210 0.3034 0.8210 0.9061
No log 3.3529 228 0.8686 0.2377 0.8686 0.9320
No log 3.3824 230 0.8765 0.2772 0.8765 0.9362
No log 3.4118 232 0.9517 0.1833 0.9517 0.9755
No log 3.4412 234 0.9097 0.1578 0.9097 0.9538
No log 3.4706 236 0.8520 0.1789 0.8520 0.9231
No log 3.5 238 0.7956 0.1603 0.7956 0.8920
No log 3.5294 240 0.8417 0.2419 0.8417 0.9174
No log 3.5588 242 0.8793 0.3001 0.8793 0.9377
No log 3.5882 244 0.8553 0.2342 0.8553 0.9248
No log 3.6176 246 0.8211 0.1373 0.8211 0.9062
No log 3.6471 248 0.8304 0.1052 0.8304 0.9113
No log 3.6765 250 0.8255 0.1684 0.8255 0.9086
No log 3.7059 252 0.8254 0.2689 0.8254 0.9085
No log 3.7353 254 0.8263 0.1935 0.8263 0.9090
No log 3.7647 256 0.8384 0.1017 0.8384 0.9157
No log 3.7941 258 0.8245 0.1935 0.8245 0.9080
No log 3.8235 260 0.7992 0.2661 0.7992 0.8940
No log 3.8529 262 0.8259 0.2335 0.8259 0.9088
No log 3.8824 264 0.8259 0.2023 0.8259 0.9088
No log 3.9118 266 0.8426 0.1888 0.8426 0.9179
No log 3.9412 268 0.8283 0.1811 0.8283 0.9101
No log 3.9706 270 0.7979 0.2424 0.7979 0.8933
No log 4.0 272 0.7755 0.1592 0.7755 0.8806
No log 4.0294 274 0.7732 0.2596 0.7732 0.8793
No log 4.0588 276 0.8524 0.1941 0.8524 0.9233
No log 4.0882 278 1.0615 0.3425 1.0615 1.0303
No log 4.1176 280 1.0549 0.3119 1.0549 1.0271
No log 4.1471 282 1.0272 0.2934 1.0272 1.0135
No log 4.1765 284 0.8769 0.2104 0.8769 0.9364
No log 4.2059 286 0.7822 0.2058 0.7822 0.8844
No log 4.2353 288 0.7929 0.2652 0.7929 0.8904
No log 4.2647 290 0.8050 0.2715 0.8050 0.8972
No log 4.2941 292 0.8710 0.1950 0.8710 0.9333
No log 4.3235 294 1.0635 0.1653 1.0635 1.0313
No log 4.3529 296 1.1836 0.1141 1.1836 1.0879
No log 4.3824 298 1.0996 0.1356 1.0996 1.0486
No log 4.4118 300 0.8961 0.2517 0.8961 0.9466
No log 4.4412 302 0.8439 0.3028 0.8439 0.9187
No log 4.4706 304 0.8467 0.3569 0.8467 0.9202
No log 4.5 306 0.8321 0.4085 0.8321 0.9122
No log 4.5294 308 0.8083 0.3724 0.8083 0.8991
No log 4.5588 310 0.8634 0.2996 0.8634 0.9292
No log 4.5882 312 0.9693 0.1977 0.9693 0.9845
No log 4.6176 314 0.9892 0.1798 0.9892 0.9946
No log 4.6471 316 0.8960 0.2678 0.8960 0.9466
No log 4.6765 318 0.8018 0.2161 0.8018 0.8954
No log 4.7059 320 0.7436 0.3118 0.7436 0.8623
No log 4.7353 322 0.7190 0.2480 0.7190 0.8480
No log 4.7647 324 0.7368 0.2480 0.7368 0.8583
No log 4.7941 326 0.7962 0.1130 0.7962 0.8923
No log 4.8235 328 0.9368 0.1993 0.9368 0.9679
No log 4.8529 330 0.9502 0.2268 0.9502 0.9748
No log 4.8824 332 0.8483 0.0727 0.8483 0.9210
No log 4.9118 334 0.7735 0.1011 0.7735 0.8795
No log 4.9412 336 0.7562 0.1577 0.7562 0.8696
No log 4.9706 338 0.7700 0.1901 0.7700 0.8775
No log 5.0 340 0.7683 0.2621 0.7683 0.8765
No log 5.0294 342 0.7614 0.1970 0.7614 0.8726
No log 5.0588 344 0.7917 0.2513 0.7917 0.8898
No log 5.0882 346 0.8572 0.2317 0.8572 0.9258
No log 5.1176 348 0.8499 0.2310 0.8499 0.9219
No log 5.1471 350 0.8470 0.2260 0.8470 0.9203
No log 5.1765 352 0.8039 0.2777 0.8039 0.8966
No log 5.2059 354 0.7837 0.1541 0.7837 0.8853
No log 5.2353 356 0.7644 0.1953 0.7644 0.8743
No log 5.2647 358 0.7427 0.2170 0.7427 0.8618
No log 5.2941 360 0.7643 0.2609 0.7643 0.8743
No log 5.3235 362 0.7500 0.3273 0.7500 0.8660
No log 5.3529 364 0.7336 0.2513 0.7336 0.8565
No log 5.3824 366 0.7337 0.1935 0.7337 0.8565
No log 5.4118 368 0.7231 0.2746 0.7231 0.8504
No log 5.4412 370 0.7225 0.2741 0.7225 0.8500
No log 5.4706 372 0.7215 0.3347 0.7215 0.8494
No log 5.5 374 0.7733 0.2709 0.7733 0.8794
No log 5.5294 376 0.8130 0.2074 0.8130 0.9017
No log 5.5588 378 0.8708 0.2793 0.8708 0.9332
No log 5.5882 380 0.8358 0.2122 0.8358 0.9142
No log 5.6176 382 0.7760 0.2150 0.7760 0.8809
No log 5.6471 384 0.8436 0.1092 0.8436 0.9185
No log 5.6765 386 0.7943 0.2077 0.7943 0.8913
No log 5.7059 388 0.7734 0.2335 0.7734 0.8794
No log 5.7353 390 0.7649 0.2193 0.7649 0.8746
No log 5.7647 392 0.7583 0.2247 0.7583 0.8708
No log 5.7941 394 0.7548 0.2113 0.7548 0.8688
No log 5.8235 396 0.7500 0.2113 0.7500 0.8660
No log 5.8529 398 0.7535 0.3111 0.7535 0.8680
No log 5.8824 400 0.7482 0.3111 0.7482 0.8650
No log 5.9118 402 0.7445 0.3070 0.7445 0.8628
No log 5.9412 404 0.7458 0.3293 0.7458 0.8636
No log 5.9706 406 0.7548 0.2973 0.7548 0.8688
No log 6.0 408 0.8136 0.2808 0.8136 0.9020
No log 6.0294 410 0.8782 0.3219 0.8782 0.9371
No log 6.0588 412 0.8365 0.3820 0.8365 0.9146
No log 6.0882 414 0.7675 0.2775 0.7675 0.8761
No log 6.1176 416 0.7440 0.3146 0.7440 0.8625
No log 6.1471 418 0.7675 0.2829 0.7675 0.8761
No log 6.1765 420 0.7492 0.2933 0.7492 0.8656
No log 6.2059 422 0.7276 0.2780 0.7276 0.8530
No log 6.2353 424 0.7460 0.1901 0.7460 0.8637
No log 6.2647 426 0.7695 0.1850 0.7695 0.8772
No log 6.2941 428 0.7522 0.1901 0.7522 0.8673
No log 6.3235 430 0.7427 0.1902 0.7427 0.8618
No log 6.3529 432 0.7646 0.2295 0.7646 0.8744
No log 6.3824 434 0.8017 0.2434 0.8017 0.8954
No log 6.4118 436 0.7964 0.1905 0.7964 0.8924
No log 6.4412 438 0.7923 0.2838 0.7923 0.8901
No log 6.4706 440 0.8019 0.2335 0.8019 0.8955
No log 6.5 442 0.7610 0.2591 0.7610 0.8724
No log 6.5294 444 0.7422 0.2264 0.7422 0.8615
No log 6.5588 446 0.7533 0.2709 0.7533 0.8679
No log 6.5882 448 0.7564 0.2709 0.7564 0.8697
No log 6.6176 450 0.7547 0.2424 0.7547 0.8687
No log 6.6471 452 0.7631 0.2566 0.7631 0.8736
No log 6.6765 454 0.7956 0.2874 0.7956 0.8920
No log 6.7059 456 0.8572 0.3243 0.8572 0.9259
No log 6.7353 458 0.8764 0.2817 0.8765 0.9362
No log 6.7647 460 0.9185 0.3906 0.9185 0.9584
No log 6.7941 462 0.9477 0.4664 0.9477 0.9735
No log 6.8235 464 0.8883 0.3971 0.8883 0.9425
No log 6.8529 466 0.8644 0.3044 0.8644 0.9297
No log 6.8824 468 0.8841 0.2280 0.8841 0.9403
No log 6.9118 470 0.8783 0.2609 0.8783 0.9372
No log 6.9412 472 0.8671 0.2240 0.8671 0.9312
No log 6.9706 474 0.7855 0.2652 0.7855 0.8863
No log 7.0 476 0.7587 0.3574 0.7587 0.8710
No log 7.0294 478 0.7385 0.2419 0.7385 0.8594
No log 7.0588 480 0.7239 0.2424 0.7239 0.8508
No log 7.0882 482 0.7228 0.2947 0.7228 0.8502
No log 7.1176 484 0.7186 0.2218 0.7186 0.8477
No log 7.1471 486 0.7243 0.2218 0.7243 0.8510
No log 7.1765 488 0.7361 0.2947 0.7361 0.8580
No log 7.2059 490 0.7505 0.3435 0.7505 0.8663
No log 7.2353 492 0.7499 0.2532 0.7499 0.8660
No log 7.2647 494 0.7437 0.2365 0.7437 0.8624
No log 7.2941 496 0.7189 0.2830 0.7189 0.8479
No log 7.3235 498 0.7356 0.2771 0.7356 0.8577
0.3542 7.3529 500 0.7239 0.2933 0.7239 0.8508
0.3542 7.3824 502 0.6897 0.2940 0.6897 0.8305
0.3542 7.4118 504 0.6806 0.2574 0.6806 0.8250
0.3542 7.4412 506 0.6778 0.2561 0.6778 0.8233
0.3542 7.4706 508 0.6779 0.2916 0.6779 0.8234
0.3542 7.5 510 0.7156 0.3659 0.7156 0.8459
0.3542 7.5294 512 0.7447 0.3614 0.7447 0.8629
0.3542 7.5588 514 0.7149 0.3635 0.7149 0.8455
0.3542 7.5882 516 0.7102 0.3141 0.7102 0.8427
0.3542 7.6176 518 0.7404 0.4360 0.7404 0.8605
0.3542 7.6471 520 0.7809 0.3519 0.7809 0.8837
0.3542 7.6765 522 0.8615 0.2585 0.8615 0.9282
0.3542 7.7059 524 0.9025 0.2562 0.9025 0.9500
0.3542 7.7353 526 0.8814 0.2560 0.8814 0.9388
0.3542 7.7647 528 0.8374 0.2366 0.8374 0.9151
0.3542 7.7941 530 0.7938 0.2747 0.7938 0.8910

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1