ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k18_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8680
  • Qwk: 0.2365
  • Mse: 0.8680
  • Rmse: 0.9317
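
Loss equals Mse here, which is consistent with a regression head trained under a mean-squared-error objective; Qwk (Quadratic Weighted Kappa) measures ordinal agreement between predicted and gold scores. As a minimal sketch of how metrics of this kind are typically computed (the label arrays below are hypothetical placeholders, not this model's evaluation data):

```python
# Hypothetical labels for illustration only; not this model's eval data.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 1, 3, 0, 2])  # gold organization scores
y_pred = np.array([2, 2, 3, 1, 2])  # model predictions, rounded to integers

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Quadratic Weighted Kappa
mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))

print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```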

Model description

More information needed

Intended uses & limitations

More information needed
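
Pending fuller documentation, the checkpoint can be loaded like any Transformers sequence-classification model. The sketch below assumes a single-output regression head (the identical Loss and Mse values above suggest an MSE objective); if the checkpoint instead carries a multi-class head, take an argmax over the logits.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k18_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "نص المقال هنا"  # placeholder: the Arabic essay to score
inputs = tokenizer(essay, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
score = logits.squeeze().item()  # assumes a single regression logit
print(score)
```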

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
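
A minimal sketch of the Trainer setup these hyperparameters imply is given below; output_dir, model, and the dataset objects are placeholders. The eval_steps and logging_steps values are inferred from the results table (an evaluation every 2 steps, training loss first logged at step 500), and the listed Adam settings are the Trainer defaults, so they need no explicit flags.

```python
from transformers import Trainer, TrainingArguments

training_args = TrainingArguments(
    output_dir="outputs",           # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",          # the results table shows an eval every 2 steps
    eval_steps=2,
    logging_steps=500,              # training loss first appears at step 500
)

trainer = Trainer(
    model=model,                    # placeholder: the AraBERT model being fine-tuned
    args=training_args,
    train_dataset=train_dataset,    # placeholder dataset objects
    eval_dataset=eval_dataset,
)
# trainer.train()
```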

Training results

In the table below, the training loss is logged every 500 steps (rows before step 500 show "No log"); each row reports the epoch, step, and the validation loss, Qwk, Mse, and Rmse at that evaluation.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0328 2 2.5840 -0.0262 2.5840 1.6075
No log 0.0656 4 1.3201 0.0715 1.3201 1.1490
No log 0.0984 6 1.1051 -0.0550 1.1051 1.0512
No log 0.1311 8 0.9591 -0.1217 0.9591 0.9793
No log 0.1639 10 0.9505 -0.0831 0.9505 0.9749
No log 0.1967 12 0.9154 -0.0831 0.9154 0.9568
No log 0.2295 14 0.8866 -0.0831 0.8866 0.9416
No log 0.2623 16 0.8812 -0.0426 0.8812 0.9387
No log 0.2951 18 0.8881 0.0 0.8881 0.9424
No log 0.3279 20 0.9565 -0.0375 0.9565 0.9780
No log 0.3607 22 1.0030 0.0506 1.0030 1.0015
No log 0.3934 24 1.0074 0.0518 1.0074 1.0037
No log 0.4262 26 0.9072 -0.0375 0.9072 0.9525
No log 0.4590 28 0.8628 0.0679 0.8628 0.9289
No log 0.4918 30 0.8475 0.1050 0.8475 0.9206
No log 0.5246 32 0.8056 0.1790 0.8056 0.8976
No log 0.5574 34 0.8680 0.0790 0.8680 0.9317
No log 0.5902 36 0.9636 0.0864 0.9636 0.9816
No log 0.6230 38 1.0828 0.0985 1.0828 1.0406
No log 0.6557 40 1.0006 0.0980 1.0006 1.0003
No log 0.6885 42 0.8256 0.1724 0.8256 0.9086
No log 0.7213 44 0.7557 0.1754 0.7557 0.8693
No log 0.7541 46 0.7608 0.0937 0.7608 0.8722
No log 0.7869 48 0.7681 0.0937 0.7681 0.8764
No log 0.8197 50 0.8681 0.0522 0.8681 0.9317
No log 0.8525 52 0.7729 0.0 0.7729 0.8791
No log 0.8852 54 0.7380 -0.0054 0.7380 0.8591
No log 0.9180 56 0.8137 0.0 0.8137 0.9021
No log 0.9508 58 0.9215 -0.0320 0.9215 0.9600
No log 0.9836 60 0.8777 0.0 0.8777 0.9369
No log 1.0164 62 0.8359 0.0327 0.8359 0.9143
No log 1.0492 64 0.8493 0.0757 0.8493 0.9216
No log 1.0820 66 0.8543 -0.0054 0.8543 0.9243
No log 1.1148 68 0.8676 -0.0054 0.8676 0.9315
No log 1.1475 70 0.8776 0.0359 0.8776 0.9368
No log 1.1803 72 0.9848 -0.1217 0.9848 0.9923
No log 1.2131 74 1.0568 -0.0173 1.0568 1.0280
No log 1.2459 76 1.0406 0.0890 1.0406 1.0201
No log 1.2787 78 1.1094 0.0875 1.1094 1.0533
No log 1.3115 80 1.1289 -0.0012 1.1289 1.0625
No log 1.3443 82 1.0312 0.0524 1.0312 1.0155
No log 1.3770 84 1.0261 0.0587 1.0261 1.0130
No log 1.4098 86 1.1465 -0.0471 1.1465 1.0708
No log 1.4426 88 1.2240 -0.0334 1.2240 1.1063
No log 1.4754 90 1.2586 0.0221 1.2586 1.1219
No log 1.5082 92 1.2648 -0.0069 1.2648 1.1246
No log 1.5410 94 1.1711 -0.0497 1.1711 1.0822
No log 1.5738 96 1.1057 0.0059 1.1057 1.0515
No log 1.6066 98 0.8697 0.3187 0.8697 0.9326
No log 1.6393 100 0.8209 0.2590 0.8209 0.9060
No log 1.6721 102 0.8406 0.0816 0.8406 0.9168
No log 1.7049 104 0.8551 0.1699 0.8551 0.9247
No log 1.7377 106 0.8956 0.0687 0.8956 0.9464
No log 1.7705 108 0.9083 0.0469 0.9083 0.9530
No log 1.8033 110 0.8541 0.1135 0.8541 0.9242
No log 1.8361 112 0.8267 0.1775 0.8267 0.9092
No log 1.8689 114 0.8430 0.1386 0.8430 0.9181
No log 1.9016 116 0.9092 0.1391 0.9092 0.9535
No log 1.9344 118 1.0026 0.1793 1.0026 1.0013
No log 1.9672 120 1.1144 0.1521 1.1144 1.0557
No log 2.0 122 1.0565 0.1850 1.0565 1.0278
No log 2.0328 124 0.9942 0.0476 0.9942 0.9971
No log 2.0656 126 1.0528 0.1248 1.0528 1.0261
No log 2.0984 128 1.1987 0.1412 1.1987 1.0948
No log 2.1311 130 1.0208 0.0932 1.0208 1.0103
No log 2.1639 132 0.9437 0.2085 0.9437 0.9715
No log 2.1967 134 1.0624 0.1479 1.0624 1.0307
No log 2.2295 136 0.9777 0.2303 0.9777 0.9888
No log 2.2623 138 0.8651 0.1984 0.8651 0.9301
No log 2.2951 140 0.8806 0.1102 0.8806 0.9384
No log 2.3279 142 0.8505 0.1285 0.8505 0.9222
No log 2.3607 144 0.8964 0.1161 0.8964 0.9468
No log 2.3934 146 0.9087 0.1487 0.9087 0.9533
No log 2.4262 148 0.8471 0.1356 0.8471 0.9204
No log 2.4590 150 0.8398 0.1321 0.8398 0.9164
No log 2.4918 152 0.8431 0.1716 0.8431 0.9182
No log 2.5246 154 0.9311 0.2600 0.9310 0.9649
No log 2.5574 156 1.0932 0.1482 1.0932 1.0456
No log 2.5902 158 1.0204 0.2389 1.0204 1.0101
No log 2.6230 160 0.8014 0.1828 0.8014 0.8952
No log 2.6557 162 0.7385 0.2787 0.7385 0.8594
No log 2.6885 164 0.8680 0.2784 0.8680 0.9316
No log 2.7213 166 0.8571 0.2784 0.8571 0.9258
No log 2.7541 168 0.7140 0.3238 0.7140 0.8450
No log 2.7869 170 0.6879 0.3863 0.6879 0.8294
No log 2.8197 172 0.7423 0.3729 0.7423 0.8616
No log 2.8525 174 0.7798 0.3051 0.7798 0.8830
No log 2.8852 176 0.7785 0.2505 0.7785 0.8824
No log 2.9180 178 0.9296 0.2784 0.9296 0.9642
No log 2.9508 180 0.9132 0.2724 0.9132 0.9556
No log 2.9836 182 0.8297 0.3625 0.8297 0.9109
No log 3.0164 184 0.9835 0.1974 0.9835 0.9917
No log 3.0492 186 1.0231 0.2070 1.0231 1.0115
No log 3.0820 188 0.9031 0.1259 0.9031 0.9503
No log 3.1148 190 0.7570 0.2170 0.7570 0.8701
No log 3.1475 192 0.7634 0.2440 0.7634 0.8737
No log 3.1803 194 0.8019 0.1995 0.8019 0.8955
No log 3.2131 196 0.7569 0.3341 0.7569 0.8700
No log 3.2459 198 0.7703 0.1091 0.7703 0.8777
No log 3.2787 200 0.8144 0.0807 0.8144 0.9024
No log 3.3115 202 0.7990 0.1091 0.7990 0.8939
No log 3.3443 204 0.8039 0.1782 0.8039 0.8966
No log 3.3770 206 0.8175 0.1986 0.8175 0.9042
No log 3.4098 208 0.8726 0.1915 0.8726 0.9341
No log 3.4426 210 0.8942 0.1915 0.8942 0.9456
No log 3.4754 212 0.8991 0.1992 0.8991 0.9482
No log 3.5082 214 0.9658 0.2291 0.9658 0.9827
No log 3.5410 216 0.9747 0.2451 0.9747 0.9873
No log 3.5738 218 0.9601 0.1801 0.9601 0.9798
No log 3.6066 220 0.9264 0.1801 0.9264 0.9625
No log 3.6393 222 0.9170 0.2301 0.9170 0.9576
No log 3.6721 224 0.8407 0.2553 0.8407 0.9169
No log 3.7049 226 0.8449 0.1860 0.8449 0.9192
No log 3.7377 228 0.8343 0.1860 0.8343 0.9134
No log 3.7705 230 0.8187 0.2291 0.8187 0.9048
No log 3.8033 232 0.8455 0.2262 0.8455 0.9195
No log 3.8361 234 0.8233 0.2291 0.8233 0.9073
No log 3.8689 236 0.8371 0.2163 0.8371 0.9149
No log 3.9016 238 0.9022 0.1162 0.9022 0.9498
No log 3.9344 240 0.8593 0.2016 0.8593 0.9270
No log 3.9672 242 0.8953 0.1826 0.8953 0.9462
No log 4.0 244 0.9868 0.2917 0.9868 0.9934
No log 4.0328 246 0.8752 0.2183 0.8752 0.9355
No log 4.0656 248 0.8436 0.2302 0.8436 0.9185
No log 4.0984 250 0.8670 0.2980 0.8670 0.9311
No log 4.1311 252 0.9055 0.1826 0.9055 0.9516
No log 4.1639 254 0.8773 0.2262 0.8773 0.9366
No log 4.1967 256 0.8691 0.1695 0.8691 0.9323
No log 4.2295 258 1.0187 0.0903 1.0187 1.0093
No log 4.2623 260 1.0219 0.0606 1.0219 1.0109
No log 4.2951 262 0.8871 0.0494 0.8871 0.9418
No log 4.3279 264 0.8072 0.1577 0.8072 0.8985
No log 4.3607 266 0.9731 0.3356 0.9731 0.9864
No log 4.3934 268 1.0028 0.3699 1.0028 1.0014
No log 4.4262 270 0.8600 0.3409 0.8600 0.9273
No log 4.4590 272 0.8295 0.1489 0.8295 0.9108
No log 4.4918 274 0.8530 0.0378 0.8530 0.9236
No log 4.5246 276 0.8536 0.0909 0.8536 0.9239
No log 4.5574 278 0.8872 0.2993 0.8872 0.9419
No log 4.5902 280 0.8859 0.2993 0.8859 0.9412
No log 4.6230 282 0.8652 0.2693 0.8652 0.9302
No log 4.6557 284 0.8884 0.2544 0.8884 0.9426
No log 4.6885 286 1.0233 0.3089 1.0233 1.0116
No log 4.7213 288 1.0205 0.2651 1.0205 1.0102
No log 4.7541 290 0.8718 0.2542 0.8718 0.9337
No log 4.7869 292 0.8089 0.2224 0.8089 0.8994
No log 4.8197 294 0.8151 0.2953 0.8151 0.9028
No log 4.8525 296 0.9727 0.2754 0.9727 0.9863
No log 4.8852 298 1.0847 0.2591 1.0847 1.0415
No log 4.9180 300 0.9542 0.2807 0.9542 0.9768
No log 4.9508 302 0.7732 0.3340 0.7732 0.8793
No log 4.9836 304 0.7417 0.2715 0.7417 0.8612
No log 5.0164 306 0.7445 0.1797 0.7445 0.8629
No log 5.0492 308 0.7372 0.2419 0.7372 0.8586
No log 5.0820 310 0.7673 0.3452 0.7673 0.8760
No log 5.1148 312 0.8839 0.3305 0.8839 0.9401
No log 5.1475 314 1.0093 0.3022 1.0093 1.0046
No log 5.1803 316 0.9158 0.2804 0.9158 0.9570
No log 5.2131 318 0.7931 0.3172 0.7931 0.8906
No log 5.2459 320 0.7766 0.2777 0.7766 0.8813
No log 5.2787 322 0.7792 0.2038 0.7792 0.8827
No log 5.3115 324 0.7876 0.2302 0.7876 0.8875
No log 5.3443 326 0.8424 0.3274 0.8424 0.9178
No log 5.3770 328 0.9350 0.2539 0.9350 0.9669
No log 5.4098 330 0.9539 0.2861 0.9539 0.9767
No log 5.4426 332 0.8488 0.3221 0.8488 0.9213
No log 5.4754 334 0.7879 0.2749 0.7879 0.8876
No log 5.5082 336 0.7690 0.2302 0.7690 0.8769
No log 5.5410 338 0.7772 0.2749 0.7772 0.8816
No log 5.5738 340 0.9201 0.2861 0.9201 0.9592
No log 5.6066 342 1.0247 0.3128 1.0247 1.0123
No log 5.6393 344 0.9294 0.2861 0.9294 0.9641
No log 5.6721 346 0.7597 0.3340 0.7597 0.8716
No log 5.7049 348 0.7361 0.3239 0.7361 0.8580
No log 5.7377 350 0.7547 0.3222 0.7547 0.8687
No log 5.7705 352 0.7791 0.3613 0.7791 0.8827
No log 5.8033 354 0.8798 0.3059 0.8798 0.9380
No log 5.8361 356 0.8801 0.2725 0.8801 0.9381
No log 5.8689 358 0.7771 0.3458 0.7771 0.8815
No log 5.9016 360 0.7129 0.3070 0.7129 0.8443
No log 5.9344 362 0.7100 0.3502 0.7100 0.8426
No log 5.9672 364 0.7122 0.3552 0.7122 0.8439
No log 6.0 366 0.7175 0.3398 0.7175 0.8471
No log 6.0328 368 0.7146 0.3552 0.7146 0.8453
No log 6.0656 370 0.7253 0.3129 0.7253 0.8516
No log 6.0984 372 0.7402 0.3738 0.7402 0.8603
No log 6.1311 374 0.8221 0.3365 0.8221 0.9067
No log 6.1639 376 0.8762 0.3432 0.8762 0.9360
No log 6.1967 378 0.8563 0.3760 0.8563 0.9254
No log 6.2295 380 0.7438 0.3861 0.7438 0.8624
No log 6.2623 382 0.7255 0.3022 0.7255 0.8518
No log 6.2951 384 0.7256 0.3970 0.7256 0.8518
No log 6.3279 386 0.7615 0.3769 0.7615 0.8727
No log 6.3607 388 0.8499 0.3034 0.8499 0.9219
No log 6.3934 390 0.8467 0.3034 0.8467 0.9201
No log 6.4262 392 0.7895 0.2467 0.7895 0.8885
No log 6.4590 394 0.7446 0.2652 0.7446 0.8629
No log 6.4918 396 0.7442 0.3296 0.7442 0.8626
No log 6.5246 398 0.7537 0.3308 0.7537 0.8681
No log 6.5574 400 0.7897 0.2827 0.7897 0.8887
No log 6.5902 402 0.7798 0.3349 0.7798 0.8831
No log 6.6230 404 0.7967 0.3028 0.7967 0.8926
No log 6.6557 406 0.8237 0.2873 0.8237 0.9076
No log 6.6885 408 0.8520 0.3447 0.8520 0.9230
No log 6.7213 410 0.9423 0.3059 0.9423 0.9707
No log 6.7541 412 0.8974 0.2968 0.8974 0.9473
No log 6.7869 414 0.7767 0.2988 0.7767 0.8813
No log 6.8197 416 0.7396 0.3502 0.7396 0.8600
No log 6.8525 418 0.7384 0.2608 0.7384 0.8593
No log 6.8852 420 0.7350 0.3228 0.7350 0.8573
No log 6.9180 422 0.7515 0.2237 0.7515 0.8669
No log 6.9508 424 0.7893 0.2847 0.7893 0.8884
No log 6.9836 426 0.8004 0.2847 0.8004 0.8946
No log 7.0164 428 0.7770 0.2684 0.7770 0.8815
No log 7.0492 430 0.8161 0.2434 0.8161 0.9034
No log 7.0820 432 0.8509 0.2624 0.8509 0.9224
No log 7.1148 434 0.8280 0.2001 0.8280 0.9099
No log 7.1475 436 0.9152 0.2443 0.9152 0.9567
No log 7.1803 438 1.1289 0.2312 1.1289 1.0625
No log 7.2131 440 1.1538 0.2271 1.1538 1.0742
No log 7.2459 442 0.9906 0.2147 0.9906 0.9953
No log 7.2787 444 0.8088 0.2813 0.8088 0.8993
No log 7.3115 446 0.8096 0.3042 0.8096 0.8998
No log 7.3443 448 0.8480 0.2652 0.8480 0.9209
No log 7.3770 450 0.8131 0.2911 0.8131 0.9017
No log 7.4098 452 0.7943 0.3316 0.7943 0.8912
No log 7.4426 454 0.8461 0.2592 0.8461 0.9198
No log 7.4754 456 0.7819 0.3196 0.7819 0.8843
No log 7.5082 458 0.7442 0.3738 0.7442 0.8627
No log 7.5410 460 0.7281 0.3675 0.7281 0.8533
No log 7.5738 462 0.7208 0.3675 0.7208 0.8490
No log 7.6066 464 0.7231 0.3081 0.7231 0.8504
No log 7.6393 466 0.7227 0.3081 0.7227 0.8501
No log 7.6721 468 0.7422 0.3050 0.7422 0.8615
No log 7.7049 470 0.7431 0.3615 0.7431 0.8621
No log 7.7377 472 0.6955 0.3116 0.6955 0.8340
No log 7.7705 474 0.6715 0.3625 0.6715 0.8194
No log 7.8033 476 0.6912 0.3689 0.6912 0.8314
No log 7.8361 478 0.7381 0.3388 0.7381 0.8591
No log 7.8689 480 0.7584 0.3410 0.7584 0.8709
No log 7.9016 482 0.7669 0.4308 0.7669 0.8758
No log 7.9344 484 0.7990 0.3939 0.7990 0.8939
No log 7.9672 486 0.7830 0.3662 0.7830 0.8849
No log 8.0 488 0.7877 0.3283 0.7877 0.8875
No log 8.0328 490 0.9475 0.3747 0.9475 0.9734
No log 8.0656 492 1.0206 0.3365 1.0206 1.0102
No log 8.0984 494 0.8740 0.3747 0.8740 0.9349
No log 8.1311 496 0.7717 0.3330 0.7717 0.8784
No log 8.1639 498 0.7116 0.3408 0.7116 0.8436
0.3593 8.1967 500 0.7023 0.3243 0.7023 0.8380
0.3593 8.2295 502 0.7087 0.3558 0.7087 0.8418
0.3593 8.2623 504 0.7208 0.3585 0.7208 0.8490
0.3593 8.2951 506 0.6944 0.3141 0.6944 0.8333
0.3593 8.3279 508 0.6951 0.3280 0.6951 0.8337
0.3593 8.3607 510 0.7548 0.3139 0.7548 0.8688
0.3593 8.3934 512 0.8135 0.3586 0.8135 0.9019
0.3593 8.4262 514 0.7861 0.3240 0.7861 0.8866
0.3593 8.4590 516 0.7311 0.2834 0.7311 0.8550
0.3593 8.4918 518 0.7284 0.3118 0.7284 0.8534
0.3593 8.5246 520 0.7125 0.3336 0.7125 0.8441
0.3593 8.5574 522 0.7683 0.4044 0.7683 0.8765
0.3593 8.5902 524 0.9454 0.4007 0.9454 0.9723
0.3593 8.6230 526 1.0591 0.3418 1.0591 1.0291
0.3593 8.6557 528 0.9746 0.3377 0.9746 0.9872
0.3593 8.6885 530 0.7914 0.3036 0.7914 0.8896
0.3593 8.7213 532 0.7055 0.3504 0.7055 0.8400
0.3593 8.7541 534 0.7166 0.3556 0.7166 0.8465
0.3593 8.7869 536 0.7218 0.2986 0.7218 0.8496
0.3593 8.8197 538 0.7151 0.2878 0.7151 0.8456
0.3593 8.8525 540 0.7147 0.4171 0.7147 0.8454
0.3593 8.8852 542 0.7781 0.4114 0.7781 0.8821
0.3593 8.9180 544 0.9098 0.3425 0.9098 0.9539
0.3593 8.9508 546 0.8985 0.3425 0.8985 0.9479
0.3593 8.9836 548 0.8006 0.3677 0.8006 0.8947
0.3593 9.0164 550 0.7732 0.3836 0.7732 0.8793
0.3593 9.0492 552 0.7757 0.3604 0.7757 0.8807
0.3593 9.0820 554 0.7788 0.3604 0.7788 0.8825
0.3593 9.1148 556 0.7874 0.3494 0.7874 0.8873
0.3593 9.1475 558 0.8270 0.3409 0.8270 0.9094
0.3593 9.1803 560 0.9093 0.3036 0.9093 0.9536
0.3593 9.2131 562 0.9153 0.3036 0.9153 0.9567
0.3593 9.2459 564 0.8739 0.2725 0.8739 0.9348
0.3593 9.2787 566 0.8069 0.3937 0.8069 0.8983
0.3593 9.3115 568 0.7860 0.2229 0.7860 0.8866
0.3593 9.3443 570 0.7931 0.2239 0.7931 0.8906
0.3593 9.3770 572 0.7991 0.3502 0.7991 0.8939
0.3593 9.4098 574 0.8585 0.3463 0.8585 0.9266
0.3593 9.4426 576 1.0032 0.28 1.0032 1.0016
0.3593 9.4754 578 1.1069 0.2998 1.1069 1.0521
0.3593 9.5082 580 1.0671 0.2797 1.0671 1.0330
0.3593 9.5410 582 0.9570 0.2493 0.9570 0.9783
0.3593 9.5738 584 0.8680 0.2365 0.8680 0.9317

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1