ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k1_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7404
  • Qwk: 0.3238
  • Mse: 0.7404
  • Rmse: 0.8605
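
For reference, the metrics above can be reproduced from model predictions with standard tooling. The snippet below is a minimal sketch only: `y_true` and `y_pred` are placeholder arrays, and rounding continuous predictions to integer labels before computing QWK is an assumption (quadratic weighted kappa requires discrete classes).

```python
# Minimal sketch of the reported metrics (Qwk, Mse, Rmse).
# y_true / y_pred are placeholders; rounding predictions for QWK is an assumption.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 1, 4])          # placeholder gold organization scores
y_pred = np.array([2.2, 2.8, 1.4, 3.6])  # placeholder model predictions

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")

print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```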

Model description

More information needed

Intended uses & limitations

More information needed
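
No intended-use statement is provided by the author. As an illustration only, the sketch below shows how the checkpoint could be loaded for inference with 🤗 Transformers, assuming a single-logit regression head (consistent with the MSE/RMSE/QWK metrics reported above); the score scale and any AraBERT-specific preprocessing (e.g. Farasa segmentation) are not documented here and would need to be verified.

```python
# Hedged inference sketch; assumes the saved head outputs a single regression logit.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k1_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "..."  # an Arabic essay to score for organization (placeholder)
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```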

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
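
A minimal sketch of how these settings map onto 🤗 `TrainingArguments` is shown below. The output directory, datasets, and `compute_metrics` function are placeholders rather than part of this card, and the evaluation cadence (every 2 steps) is inferred from the results table that follows.

```python
# Sketch: the hyperparameters above expressed as TrainingArguments.
# Adam betas/epsilon match the Transformers defaults, so they are not set explicitly.
from transformers import TrainingArguments, Trainer

training_args = TrainingArguments(
    output_dir="arabert-task7-organization",  # placeholder name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",
    eval_steps=2,  # inferred from the evaluation rows in the table below
)
# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset,
#                   compute_metrics=compute_metrics)
# trainer.train()
```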

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.6667 2 2.6632 -0.1213 2.6632 1.6319
No log 1.3333 4 1.3882 0.1265 1.3882 1.1782
No log 2.0 6 1.1319 -0.0970 1.1319 1.0639
No log 2.6667 8 1.0927 -0.0810 1.0927 1.0453
No log 3.3333 10 0.9247 0.0058 0.9247 0.9616
No log 4.0 12 0.8601 0.0236 0.8601 0.9274
No log 4.6667 14 0.8247 0.0393 0.8247 0.9081
No log 5.3333 16 0.7599 0.0 0.7599 0.8717
No log 6.0 18 0.7561 0.0481 0.7561 0.8696
No log 6.6667 20 0.7398 0.0937 0.7398 0.8601
No log 7.3333 22 0.6974 0.0889 0.6974 0.8351
No log 8.0 24 0.7067 0.0840 0.7067 0.8407
No log 8.6667 26 0.7288 0.1139 0.7288 0.8537
No log 9.3333 28 0.7213 0.0327 0.7213 0.8493
No log 10.0 30 0.6967 0.0393 0.6967 0.8347
No log 10.6667 32 0.6970 0.0846 0.6970 0.8349
No log 11.3333 34 0.6807 0.1327 0.6807 0.8251
No log 12.0 36 0.6864 0.2783 0.6864 0.8285
No log 12.6667 38 0.7758 0.1341 0.7758 0.8808
No log 13.3333 40 1.0549 0.1650 1.0549 1.0271
No log 14.0 42 1.0926 0.1753 1.0926 1.0453
No log 14.6667 44 0.9507 0.1323 0.9507 0.9750
No log 15.3333 46 0.9933 0.2153 0.9933 0.9966
No log 16.0 48 1.0836 0.1926 1.0836 1.0410
No log 16.6667 50 1.1782 0.1077 1.1782 1.0854
No log 17.3333 52 1.0397 0.1339 1.0397 1.0196
No log 18.0 54 0.8137 0.2715 0.8137 0.9021
No log 18.6667 56 0.7808 0.3541 0.7808 0.8836
No log 19.3333 58 0.7654 0.3622 0.7654 0.8749
No log 20.0 60 0.9714 0.2239 0.9714 0.9856
No log 20.6667 62 1.0619 0.2507 1.0619 1.0305
No log 21.3333 64 1.0113 0.1856 1.0113 1.0056
No log 22.0 66 0.8359 0.2662 0.8359 0.9143
No log 22.6667 68 0.7538 0.2414 0.7538 0.8682
No log 23.3333 70 0.7411 0.2237 0.7411 0.8609
No log 24.0 72 0.8121 0.2096 0.8121 0.9012
No log 24.6667 74 0.9212 0.2253 0.9212 0.9598
No log 25.3333 76 0.9368 0.3586 0.9368 0.9679
No log 26.0 78 0.8852 0.3344 0.8852 0.9409
No log 26.6667 80 0.8121 0.3261 0.8121 0.9012
No log 27.3333 82 0.8265 0.3918 0.8265 0.9091
No log 28.0 84 0.8877 0.3234 0.8877 0.9422
No log 28.6667 86 0.8820 0.3099 0.8820 0.9392
No log 29.3333 88 0.8031 0.3196 0.8031 0.8962
No log 30.0 90 0.8070 0.3127 0.8070 0.8983
No log 30.6667 92 0.9023 0.2967 0.9023 0.9499
No log 31.3333 94 0.9381 0.2692 0.9381 0.9686
No log 32.0 96 0.8218 0.2967 0.8218 0.9066
No log 32.6667 98 0.6896 0.2099 0.6896 0.8304
No log 33.3333 100 0.6649 0.2317 0.6649 0.8154
No log 34.0 102 0.6654 0.2476 0.6654 0.8157
No log 34.6667 104 0.7198 0.3840 0.7198 0.8484
No log 35.3333 106 0.7595 0.4020 0.7595 0.8715
No log 36.0 108 0.7477 0.3498 0.7477 0.8647
No log 36.6667 110 0.7214 0.3572 0.7214 0.8494
No log 37.3333 112 0.7180 0.2379 0.7180 0.8474
No log 38.0 114 0.7629 0.1972 0.7629 0.8735
No log 38.6667 116 0.7824 0.1972 0.7824 0.8845
No log 39.3333 118 0.7750 0.1972 0.7750 0.8804
No log 40.0 120 0.7531 0.2652 0.7531 0.8678
No log 40.6667 122 0.7773 0.2171 0.7773 0.8817
No log 41.3333 124 0.8507 0.2692 0.8507 0.9223
No log 42.0 126 0.9124 0.3234 0.9124 0.9552
No log 42.6667 128 0.8637 0.2967 0.8637 0.9293
No log 43.3333 130 0.8389 0.2967 0.8389 0.9159
No log 44.0 132 0.7617 0.2498 0.7617 0.8727
No log 44.6667 134 0.7038 0.3341 0.7038 0.8390
No log 45.3333 136 0.6932 0.2981 0.6932 0.8326
No log 46.0 138 0.6988 0.3894 0.6988 0.8359
No log 46.6667 140 0.7067 0.3894 0.7067 0.8406
No log 47.3333 142 0.7387 0.3399 0.7387 0.8595
No log 48.0 144 0.7523 0.3590 0.7523 0.8673
No log 48.6667 146 0.7570 0.3590 0.7570 0.8701
No log 49.3333 148 0.8021 0.3940 0.8021 0.8956
No log 50.0 150 0.7996 0.4329 0.7996 0.8942
No log 50.6667 152 0.7466 0.3590 0.7466 0.8640
No log 51.3333 154 0.6695 0.3788 0.6695 0.8182
No log 52.0 156 0.6423 0.3070 0.6423 0.8015
No log 52.6667 158 0.6455 0.3336 0.6455 0.8034
No log 53.3333 160 0.6411 0.3070 0.6411 0.8007
No log 54.0 162 0.6462 0.3599 0.6462 0.8039
No log 54.6667 164 0.6690 0.3524 0.6690 0.8179
No log 55.3333 166 0.6858 0.3267 0.6858 0.8281
No log 56.0 168 0.7004 0.3545 0.7004 0.8369
No log 56.6667 170 0.7119 0.3545 0.7119 0.8437
No log 57.3333 172 0.7268 0.3060 0.7268 0.8525
No log 58.0 174 0.7169 0.3127 0.7169 0.8467
No log 58.6667 176 0.6937 0.3498 0.6937 0.8329
No log 59.3333 178 0.6644 0.3524 0.6644 0.8151
No log 60.0 180 0.6653 0.3170 0.6653 0.8157
No log 60.6667 182 0.6667 0.3481 0.6667 0.8165
No log 61.3333 184 0.6775 0.3860 0.6775 0.8231
No log 62.0 186 0.6774 0.3806 0.6774 0.8230
No log 62.6667 188 0.6643 0.2652 0.6643 0.8150
No log 63.3333 190 0.6663 0.3280 0.6663 0.8163
No log 64.0 192 0.6705 0.3280 0.6705 0.8188
No log 64.6667 194 0.6854 0.2973 0.6854 0.8279
No log 65.3333 196 0.6973 0.3524 0.6973 0.8350
No log 66.0 198 0.7181 0.3840 0.7181 0.8474
No log 66.6667 200 0.7369 0.3425 0.7369 0.8584
No log 67.3333 202 0.7561 0.3060 0.7561 0.8695
No log 68.0 204 0.7591 0.2754 0.7591 0.8712
No log 68.6667 206 0.7362 0.3060 0.7362 0.8580
No log 69.3333 208 0.7107 0.2558 0.7107 0.8430
No log 70.0 210 0.7025 0.3267 0.7025 0.8382
No log 70.6667 212 0.6982 0.3267 0.6982 0.8356
No log 71.3333 214 0.7089 0.3267 0.7089 0.8420
No log 72.0 216 0.7317 0.3545 0.7317 0.8554
No log 72.6667 218 0.7634 0.2754 0.7634 0.8737
No log 73.3333 220 0.8069 0.3169 0.8069 0.8983
No log 74.0 222 0.8195 0.3675 0.8195 0.9053
No log 74.6667 224 0.8127 0.3564 0.8127 0.9015
No log 75.3333 226 0.8149 0.3564 0.8149 0.9027
No log 76.0 228 0.7888 0.3032 0.7888 0.8881
No log 76.6667 230 0.7572 0.3518 0.7572 0.8702
No log 77.3333 232 0.7508 0.3518 0.7508 0.8665
No log 78.0 234 0.7350 0.3312 0.7350 0.8573
No log 78.6667 236 0.7308 0.3312 0.7308 0.8549
No log 79.3333 238 0.7419 0.3594 0.7419 0.8613
No log 80.0 240 0.7610 0.3518 0.7610 0.8724
No log 80.6667 242 0.7882 0.3302 0.7882 0.8878
No log 81.3333 244 0.8140 0.3564 0.8140 0.9022
No log 82.0 246 0.8316 0.3819 0.8316 0.9119
No log 82.6667 248 0.8344 0.3819 0.8344 0.9135
No log 83.3333 250 0.8211 0.3819 0.8211 0.9062
No log 84.0 252 0.7963 0.3564 0.7963 0.8924
No log 84.6667 254 0.7777 0.3302 0.7777 0.8819
No log 85.3333 256 0.7631 0.3444 0.7631 0.8736
No log 86.0 258 0.7530 0.3444 0.7530 0.8678
No log 86.6667 260 0.7475 0.3444 0.7475 0.8646
No log 87.3333 262 0.7487 0.3444 0.7487 0.8652
No log 88.0 264 0.7545 0.3032 0.7545 0.8686
No log 88.6667 266 0.7650 0.3032 0.7650 0.8747
No log 89.3333 268 0.7701 0.3032 0.7701 0.8776
No log 90.0 270 0.7715 0.3032 0.7715 0.8783
No log 90.6667 272 0.7700 0.3032 0.7700 0.8775
No log 91.3333 274 0.7710 0.3032 0.7710 0.8781
No log 92.0 276 0.7701 0.3032 0.7701 0.8775
No log 92.6667 278 0.7653 0.3032 0.7653 0.8748
No log 93.3333 280 0.7617 0.3032 0.7617 0.8727
No log 94.0 282 0.7548 0.3444 0.7548 0.8688
No log 94.6667 284 0.7498 0.3167 0.7498 0.8659
No log 95.3333 286 0.7451 0.3238 0.7451 0.8632
No log 96.0 288 0.7421 0.3238 0.7421 0.8615
No log 96.6667 290 0.7399 0.3238 0.7399 0.8602
No log 97.3333 292 0.7388 0.3238 0.7388 0.8595
No log 98.0 294 0.7384 0.3238 0.7384 0.8593
No log 98.6667 296 0.7393 0.3238 0.7393 0.8598
No log 99.3333 298 0.7400 0.3238 0.7400 0.8603
No log 100.0 300 0.7404 0.3238 0.7404 0.8605

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1