ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k10_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7627
  • QWK (quadratic weighted kappa): 0.3662
  • MSE: 0.7627
  • RMSE: 0.8733
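
The card does not show how the model is meant to be called. Below is a minimal inference sketch; it assumes the checkpoint is published under the repo id used in the code and that it carries a single-output sequence-classification head used as a regressor (the identical Loss and MSE values above point to an MSE objective). Treat it as illustrative, not as the author's pipeline.

```python
# Minimal inference sketch (assumptions: the repo id below is where the
# checkpoint lives, and the model has a single-output classification head
# used for regression-style scoring).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k10_task7_organization"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

essay = "..."  # an Arabic essay whose organization is to be scored
inputs = tokenizer(essay, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # continuous organization score
print(score)
```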

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
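
A sketch of how these settings might map onto the 🤗 Trainer API is given below. The evaluation and logging cadence (evaluation every 2 steps, training loss logged every 500 steps) is inferred from the results table rather than stated, and the output directory is a placeholder.

```python
# Hedged TrainingArguments sketch; the model, dataset, and metric function
# that would accompany it are not part of this card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",          # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",           # the results table reports metrics every 2 steps
    eval_steps=2,
    logging_steps=500,               # training loss first appears at step 500 ("No log" before that)
)
```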

Training results

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0833 2 2.5562 -0.0449 2.5562 1.5988
No log 0.1667 4 1.2422 0.0726 1.2422 1.1145
No log 0.25 6 0.9587 -0.0970 0.9587 0.9791
No log 0.3333 8 0.8678 0.1010 0.8678 0.9316
No log 0.4167 10 0.7701 0.0688 0.7701 0.8776
No log 0.5 12 0.8958 0.2387 0.8958 0.9465
No log 0.5833 14 0.7668 0.2261 0.7668 0.8757
No log 0.6667 16 0.8024 0.2740 0.8024 0.8957
No log 0.75 18 1.1118 0.0518 1.1118 1.0544
No log 0.8333 20 1.0401 0.2037 1.0401 1.0198
No log 0.9167 22 0.8404 0.1786 0.8404 0.9167
No log 1.0 24 0.8079 0.0937 0.8079 0.8988
No log 1.0833 26 0.8021 0.0 0.8021 0.8956
No log 1.1667 28 0.7711 0.0 0.7711 0.8781
No log 1.25 30 0.7354 0.0 0.7354 0.8576
No log 1.3333 32 0.7161 0.0840 0.7161 0.8462
No log 1.4167 34 0.7131 0.0840 0.7131 0.8444
No log 1.5 36 0.6958 0.1236 0.6958 0.8342
No log 1.5833 38 0.6690 0.3323 0.6690 0.8179
No log 1.6667 40 0.8661 0.3231 0.8661 0.9306
No log 1.75 42 1.0736 0.2510 1.0736 1.0362
No log 1.8333 44 1.1826 -0.0960 1.1826 1.0875
No log 1.9167 46 1.0100 -0.1823 1.0100 1.0050
No log 2.0 48 0.7369 -0.0027 0.7369 0.8584
No log 2.0833 50 0.7957 0.1372 0.7957 0.8920
No log 2.1667 52 0.8417 0.2526 0.8417 0.9175
No log 2.25 54 0.7856 0.2181 0.7856 0.8864
No log 2.3333 56 0.7029 0.0937 0.7029 0.8384
No log 2.4167 58 0.6646 0.0393 0.6646 0.8153
No log 2.5 60 0.7157 0.2817 0.7157 0.8460
No log 2.5833 62 0.6541 0.3789 0.6541 0.8087
No log 2.6667 64 0.5738 0.3416 0.5738 0.7575
No log 2.75 66 0.5736 0.3745 0.5736 0.7574
No log 2.8333 68 0.6010 0.4243 0.6010 0.7753
No log 2.9167 70 0.6401 0.4728 0.6401 0.8001
No log 3.0 72 0.6577 0.4644 0.6577 0.8110
No log 3.0833 74 0.6102 0.4020 0.6102 0.7811
No log 3.1667 76 0.5533 0.4929 0.5533 0.7438
No log 3.25 78 0.5857 0.4259 0.5857 0.7653
No log 3.3333 80 0.5909 0.4259 0.5909 0.7687
No log 3.4167 82 0.5542 0.4161 0.5542 0.7444
No log 3.5 84 0.6258 0.4618 0.6258 0.7911
No log 3.5833 86 0.6813 0.4270 0.6813 0.8254
No log 3.6667 88 0.7034 0.3399 0.7034 0.8387
No log 3.75 90 0.7212 0.3099 0.7212 0.8492
No log 3.8333 92 0.6896 0.2171 0.6896 0.8304
No log 3.9167 94 0.6396 0.2852 0.6396 0.7998
No log 4.0 96 0.6207 0.2783 0.6207 0.7878
No log 4.0833 98 0.7373 0.3712 0.7373 0.8587
No log 4.1667 100 0.8021 0.3782 0.8021 0.8956
No log 4.25 102 0.8846 0.3560 0.8846 0.9405
No log 4.3333 104 0.7925 0.4597 0.7925 0.8902
No log 4.4167 106 0.7744 0.4265 0.7744 0.8800
No log 4.5 108 0.7954 0.4057 0.7954 0.8919
No log 4.5833 110 0.8551 0.4199 0.8551 0.9247
No log 4.6667 112 0.9361 0.3274 0.9361 0.9675
No log 4.75 114 0.9522 0.3274 0.9522 0.9758
No log 4.8333 116 1.0659 0.3007 1.0659 1.0324
No log 4.9167 118 0.9919 0.3174 0.9919 0.9959
No log 5.0 120 0.8303 0.2564 0.8303 0.9112
No log 5.0833 122 0.8277 0.2589 0.8277 0.9098
No log 5.1667 124 1.0493 0.2348 1.0493 1.0244
No log 5.25 126 1.3397 0.2441 1.3397 1.1575
No log 5.3333 128 1.4628 0.2178 1.4628 1.2095
No log 5.4167 130 1.1680 0.2421 1.1680 1.0807
No log 5.5 132 0.9496 0.2706 0.9496 0.9745
No log 5.5833 134 1.1682 0.3211 1.1682 1.0808
No log 5.6667 136 1.6563 0.2552 1.6563 1.2870
No log 5.75 138 1.6479 0.2382 1.6479 1.2837
No log 5.8333 140 1.1280 0.3129 1.1280 1.0621
No log 5.9167 142 0.6349 0.4875 0.6349 0.7968
No log 6.0 144 0.6510 0.3879 0.6510 0.8068
No log 6.0833 146 0.6567 0.3060 0.6567 0.8104
No log 6.1667 148 0.6239 0.5056 0.6239 0.7899
No log 6.25 150 0.7613 0.3869 0.7613 0.8725
No log 6.3333 152 0.9027 0.3029 0.9027 0.9501
No log 6.4167 154 0.8123 0.3251 0.8123 0.9013
No log 6.5 156 0.7053 0.4684 0.7053 0.8398
No log 6.5833 158 0.6695 0.4036 0.6695 0.8182
No log 6.6667 160 0.6963 0.4067 0.6963 0.8345
No log 6.75 162 0.8393 0.2982 0.8393 0.9161
No log 6.8333 164 1.0567 0.2398 1.0567 1.0280
No log 6.9167 166 1.0777 0.2247 1.0777 1.0381
No log 7.0 168 0.8720 0.2892 0.8720 0.9338
No log 7.0833 170 0.8126 0.3409 0.8126 0.9015
No log 7.1667 172 0.7636 0.3545 0.7636 0.8738
No log 7.25 174 0.8048 0.3662 0.8048 0.8971
No log 7.3333 176 0.9414 0.2754 0.9414 0.9702
No log 7.4167 178 0.9651 0.3183 0.9651 0.9824
No log 7.5 180 0.8248 0.2861 0.8248 0.9082
No log 7.5833 182 0.7206 0.4100 0.7206 0.8489
No log 7.6667 184 0.7349 0.4100 0.7349 0.8573
No log 7.75 186 0.8291 0.3456 0.8291 0.9106
No log 7.8333 188 0.8555 0.3394 0.8555 0.9249
No log 7.9167 190 0.9574 0.2278 0.9574 0.9784
No log 8.0 192 0.9443 0.2602 0.9443 0.9717
No log 8.0833 194 0.8824 0.3074 0.8824 0.9394
No log 8.1667 196 0.8193 0.3586 0.8193 0.9051
No log 8.25 198 0.7998 0.2871 0.7998 0.8943
No log 8.3333 200 0.8965 0.2949 0.8965 0.9468
No log 8.4167 202 1.0561 0.2412 1.0561 1.0277
No log 8.5 204 1.1634 0.2170 1.1634 1.0786
No log 8.5833 206 1.0671 0.3154 1.0671 1.0330
No log 8.6667 208 0.8529 0.3909 0.8529 0.9235
No log 8.75 210 0.7164 0.4937 0.7164 0.8464
No log 8.8333 212 0.6550 0.4830 0.6550 0.8093
No log 8.9167 214 0.6198 0.4986 0.6198 0.7873
No log 9.0 216 0.6555 0.5259 0.6555 0.8096
No log 9.0833 218 0.6581 0.4665 0.6581 0.8112
No log 9.1667 220 0.6404 0.4815 0.6404 0.8002
No log 9.25 222 0.6163 0.3127 0.6163 0.7851
No log 9.3333 224 0.6145 0.3545 0.6145 0.7839
No log 9.4167 226 0.6187 0.3545 0.6187 0.7866
No log 9.5 228 0.6732 0.2960 0.6732 0.8205
No log 9.5833 230 0.7318 0.4093 0.7318 0.8554
No log 9.6667 232 0.8932 0.3928 0.8932 0.9451
No log 9.75 234 0.9419 0.3827 0.9419 0.9705
No log 9.8333 236 0.8454 0.2754 0.8454 0.9194
No log 9.9167 238 0.7451 0.2817 0.7451 0.8632
No log 10.0 240 0.7490 0.3099 0.7490 0.8654
No log 10.0833 242 0.8085 0.3008 0.8085 0.8992
No log 10.1667 244 0.8044 0.3562 0.8044 0.8969
No log 10.25 246 0.7882 0.2904 0.7882 0.8878
No log 10.3333 248 0.7918 0.3494 0.7918 0.8898
No log 10.4167 250 0.8867 0.3892 0.8867 0.9416
No log 10.5 252 0.9170 0.3933 0.9170 0.9576
No log 10.5833 254 0.8691 0.4064 0.8691 0.9323
No log 10.6667 256 0.8018 0.3891 0.8018 0.8954
No log 10.75 258 0.6860 0.3060 0.6860 0.8282
No log 10.8333 260 0.6733 0.3471 0.6733 0.8205
No log 10.9167 262 0.7366 0.3157 0.7366 0.8583
No log 11.0 264 0.8962 0.4161 0.8962 0.9467
No log 11.0833 266 0.9073 0.4064 0.9073 0.9525
No log 11.1667 268 0.8661 0.4133 0.8661 0.9307
No log 11.25 270 0.7614 0.2817 0.7614 0.8726
No log 11.3333 272 0.6677 0.3092 0.6677 0.8171
No log 11.4167 274 0.6045 0.3323 0.6045 0.7775
No log 11.5 276 0.5940 0.3659 0.5940 0.7707
No log 11.5833 278 0.6330 0.3092 0.6330 0.7956
No log 11.6667 280 0.7833 0.3637 0.7833 0.8850
No log 11.75 282 0.8953 0.4203 0.8953 0.9462
No log 11.8333 284 0.9277 0.4133 0.9277 0.9632
No log 11.9167 286 0.7807 0.3450 0.7807 0.8836
No log 12.0 288 0.6716 0.3688 0.6716 0.8195
No log 12.0833 290 0.6633 0.3688 0.6633 0.8144
No log 12.1667 292 0.6580 0.3545 0.6580 0.8112
No log 12.25 294 0.7456 0.2754 0.7456 0.8635
No log 12.3333 296 0.9725 0.3251 0.9725 0.9861
No log 12.4167 298 1.0687 0.2977 1.0687 1.0338
No log 12.5 300 0.9913 0.3193 0.9913 0.9956
No log 12.5833 302 0.8602 0.3256 0.8602 0.9275
No log 12.6667 304 0.7454 0.3032 0.7454 0.8634
No log 12.75 306 0.7341 0.3032 0.7341 0.8568
No log 12.8333 308 0.8280 0.3869 0.8280 0.9100
No log 12.9167 310 0.8483 0.3731 0.8483 0.9210
No log 13.0 312 0.8985 0.3719 0.8985 0.9479
No log 13.0833 314 0.9033 0.3719 0.9033 0.9504
No log 13.1667 316 0.8776 0.2892 0.8776 0.9368
No log 13.25 318 0.8394 0.2336 0.8394 0.9162
No log 13.3333 320 0.8789 0.2670 0.8789 0.9375
No log 13.4167 322 0.9515 0.2866 0.9515 0.9755
No log 13.5 324 1.0946 0.2552 1.0946 1.0462
No log 13.5833 326 1.1015 0.2504 1.1015 1.0495
No log 13.6667 328 1.0069 0.2343 1.0069 1.0034
No log 13.75 330 0.8380 0.2463 0.8380 0.9154
No log 13.8333 332 0.8181 0.2193 0.8181 0.9045
No log 13.9167 334 0.8788 0.2807 0.8788 0.9375
No log 14.0 336 1.1155 0.2799 1.1155 1.0562
No log 14.0833 338 1.2682 0.2712 1.2682 1.1261
No log 14.1667 340 1.2672 0.2520 1.2672 1.1257
No log 14.25 342 1.1361 0.2799 1.1361 1.0659
No log 14.3333 344 0.9569 0.2643 0.9569 0.9782
No log 14.4167 346 0.7964 0.2692 0.7964 0.8924
No log 14.5 348 0.7735 0.2817 0.7735 0.8795
No log 14.5833 350 0.8285 0.2692 0.8285 0.9102
No log 14.6667 352 0.9157 0.2463 0.9157 0.9569
No log 14.75 354 1.0785 0.3052 1.0785 1.0385
No log 14.8333 356 1.2063 0.2380 1.2063 1.0983
No log 14.9167 358 1.2116 0.2191 1.2116 1.1007
No log 15.0 360 1.1081 0.2635 1.1081 1.0527
No log 15.0833 362 0.9671 0.2562 0.9671 0.9834
No log 15.1667 364 0.8953 0.2463 0.8953 0.9462
No log 15.25 366 0.8881 0.2463 0.8881 0.9424
No log 15.3333 368 0.9497 0.2670 0.9497 0.9745
No log 15.4167 370 1.0326 0.2730 1.0326 1.0162
No log 15.5 372 1.1242 0.2635 1.1242 1.0603
No log 15.5833 374 1.1591 0.2271 1.1591 1.0766
No log 15.6667 376 1.0373 0.2971 1.0373 1.0185
No log 15.75 378 0.9083 0.2779 0.9083 0.9530
No log 15.8333 380 0.8640 0.2754 0.8640 0.9295
No log 15.9167 382 0.8871 0.2193 0.8871 0.9418
No log 16.0 384 0.9185 0.2267 0.9185 0.9584
No log 16.0833 386 0.9991 0.2779 0.9991 0.9995
No log 16.1667 388 1.1274 0.2948 1.1274 1.0618
No log 16.25 390 1.2009 0.2590 1.2009 1.0958
No log 16.3333 392 1.1984 0.2306 1.1984 1.0947
No log 16.4167 394 1.1086 0.2529 1.1086 1.0529
No log 16.5 396 1.0643 0.2626 1.0643 1.0317
No log 16.5833 398 1.0674 0.2857 1.0674 1.0331
No log 16.6667 400 1.0435 0.2779 1.0435 1.0215
No log 16.75 402 0.9780 0.2485 0.9780 0.9889
No log 16.8333 404 0.8632 0.3131 0.8632 0.9291
No log 16.9167 406 0.7643 0.3196 0.7643 0.8742
No log 17.0 408 0.7279 0.3020 0.7279 0.8532
No log 17.0833 410 0.7016 0.3020 0.7016 0.8376
No log 17.1667 412 0.7188 0.3020 0.7188 0.8478
No log 17.25 414 0.7579 0.2784 0.7579 0.8706
No log 17.3333 416 0.7862 0.2784 0.7862 0.8867
No log 17.4167 418 0.7313 0.2589 0.7313 0.8552
No log 17.5 420 0.7388 0.2589 0.7388 0.8595
No log 17.5833 422 0.8038 0.2754 0.8038 0.8965
No log 17.6667 424 0.7987 0.2754 0.7987 0.8937
No log 17.75 426 0.8226 0.2692 0.8226 0.9069
No log 17.8333 428 0.9095 0.2784 0.9095 0.9537
No log 17.9167 430 1.0116 0.2363 1.0116 1.0058
No log 18.0 432 1.0489 0.2437 1.0489 1.0242
No log 18.0833 434 0.9718 0.3310 0.9718 0.9858
No log 18.1667 436 0.8121 0.2967 0.8121 0.9012
No log 18.25 438 0.7264 0.2685 0.7264 0.8523
No log 18.3333 440 0.7128 0.2353 0.7128 0.8443
No log 18.4167 442 0.7221 0.2353 0.7221 0.8498
No log 18.5 444 0.7777 0.3127 0.7777 0.8818
No log 18.5833 446 0.8966 0.2574 0.8966 0.9469
No log 18.6667 448 1.0538 0.2577 1.0538 1.0265
No log 18.75 450 1.1570 0.2207 1.1570 1.0756
No log 18.8333 452 1.1085 0.2501 1.1085 1.0528
No log 18.9167 454 0.9629 0.3294 0.9629 0.9813
No log 19.0 456 0.7891 0.3032 0.7891 0.8883
No log 19.0833 458 0.7083 0.3092 0.7083 0.8416
No log 19.1667 460 0.6994 0.2787 0.6994 0.8363
No log 19.25 462 0.7563 0.2883 0.7563 0.8696
No log 19.3333 464 0.8679 0.3131 0.8679 0.9316
No log 19.4167 466 0.9182 0.2807 0.9182 0.9582
No log 19.5 468 0.9017 0.2807 0.9017 0.9496
No log 19.5833 470 0.8615 0.3169 0.8615 0.9281
No log 19.6667 472 0.8080 0.3712 0.8080 0.8989
No log 19.75 474 0.7962 0.3712 0.7962 0.8923
No log 19.8333 476 0.8209 0.3564 0.8209 0.9060
No log 19.9167 478 0.8854 0.3433 0.8854 0.9410
No log 20.0 480 0.9216 0.3433 0.9216 0.9600
No log 20.0833 482 0.8763 0.3799 0.8763 0.9361
No log 20.1667 484 0.8528 0.3799 0.8528 0.9235
No log 20.25 486 0.8329 0.3450 0.8329 0.9126
No log 20.3333 488 0.7922 0.3637 0.7922 0.8900
No log 20.4167 490 0.7742 0.3712 0.7742 0.8799
No log 20.5 492 0.7994 0.3712 0.7994 0.8941
No log 20.5833 494 0.8670 0.3799 0.8670 0.9311
No log 20.6667 496 0.8819 0.3913 0.8819 0.9391
No log 20.75 498 0.9124 0.3913 0.9124 0.9552
0.3132 20.8333 500 0.8621 0.3981 0.8621 0.9285
0.3132 20.9167 502 0.8001 0.3662 0.8001 0.8945
0.3132 21.0 504 0.7310 0.3471 0.7310 0.8550
0.3132 21.0833 506 0.7264 0.3471 0.7264 0.8523
0.3132 21.1667 508 0.7265 0.3471 0.7265 0.8523
0.3132 21.25 510 0.7627 0.3662 0.7627 0.8733

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1