MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k2_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6824
  • Qwk: 0.7034
  • Mse: 0.6824
  • Rmse: 0.8261
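For reference, Qwk is quadratically weighted Cohen's kappa, the standard agreement metric for ordinal essay scores, and Rmse is simply the square root of Mse. A minimal pure-Python sketch of both metrics (an illustration, not the evaluation code actually used for this model):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..n_classes-1."""
    n = len(y_true)
    # Observed confusion matrix
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Quadratic disagreement weights: 0 on the diagonal, growing with distance
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    # Expected counts come from the row/column marginals
    row = [sum(r) for r in observed]
    col = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = sum(w[i][j] * observed[i][j]
              for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * row[i] * col[j] / n
              for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def mse(y_true, y_pred):
    """Mean squared error between predicted and gold scores."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
```

Note that the reported Loss and Mse are identical, which is consistent with a mean-squared-error objective, and `math.sqrt(0.6824)` ≈ 0.8261 reproduces the reported Rmse.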

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
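For clarity, with a linear scheduler the learning rate ramps up to the base value over any warmup steps and then decays linearly to zero by the last training step. A small sketch of that schedule (an assumption about the exact Trainer behavior; no warmup steps are listed for this run, so `warmup_steps` defaults to 0 here):

```python
def linear_schedule_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Learning rate at a given optimizer step under a linear warmup/decay schedule."""
    if step < warmup_steps:
        # Linear warmup from 0 up to base_lr
        return base_lr * step / max(1, warmup_steps)
    # Linear decay from base_lr down to 0 at total_steps
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

For example, halfway through training the rate has fallen to half the base value, 1e-05.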

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-------:|:----:|:---------------:|:------:|:------:|:------:|
| No log | 0.1333 | 2 | 6.6552 | 0.0308 | 6.6552 | 2.5798 |
| No log | 0.2667 | 4 | 4.6034 | 0.0803 | 4.6034 | 2.1455 |
| No log | 0.4 | 6 | 2.6960 | 0.0732 | 2.6960 | 1.6420 |
| No log | 0.5333 | 8 | 2.0273 | 0.1833 | 2.0273 | 1.4238 |
| No log | 0.6667 | 10 | 1.7963 | 0.1321 | 1.7963 | 1.3403 |
| No log | 0.8 | 12 | 1.5854 | 0.1165 | 1.5854 | 1.2591 |
| No log | 0.9333 | 14 | 1.4642 | 0.2243 | 1.4642 | 1.2100 |
| No log | 1.0667 | 16 | 1.3731 | 0.2778 | 1.3731 | 1.1718 |
| No log | 1.2 | 18 | 1.2869 | 0.3036 | 1.2869 | 1.1344 |
| No log | 1.3333 | 20 | 1.3582 | 0.4094 | 1.3582 | 1.1654 |
| No log | 1.4667 | 22 | 1.3312 | 0.3902 | 1.3312 | 1.1538 |
| No log | 1.6 | 24 | 1.3412 | 0.3036 | 1.3412 | 1.1581 |
| No log | 1.7333 | 26 | 1.5091 | 0.3036 | 1.5091 | 1.2284 |
| No log | 1.8667 | 28 | 1.7085 | 0.2712 | 1.7085 | 1.3071 |
| No log | 2.0 | 30 | 1.7306 | 0.3125 | 1.7306 | 1.3155 |
| No log | 2.1333 | 32 | 1.3532 | 0.4545 | 1.3532 | 1.1633 |
| No log | 2.2667 | 34 | 1.5285 | 0.5063 | 1.5285 | 1.2363 |
| No log | 2.4 | 36 | 1.9520 | 0.5258 | 1.9520 | 1.3971 |
| No log | 2.5333 | 38 | 1.1561 | 0.6415 | 1.1561 | 1.0752 |
| No log | 2.6667 | 40 | 1.3704 | 0.4103 | 1.3704 | 1.1706 |
| No log | 2.8 | 42 | 1.9264 | 0.0893 | 1.9264 | 1.3879 |
| No log | 2.9333 | 44 | 2.0321 | 0.0517 | 2.0321 | 1.4255 |
| No log | 3.0667 | 46 | 1.8805 | 0.1197 | 1.8805 | 1.3713 |
| No log | 3.2 | 48 | 1.4839 | 0.2385 | 1.4839 | 1.2181 |
| No log | 3.3333 | 50 | 1.1807 | 0.3826 | 1.1807 | 1.0866 |
| No log | 3.4667 | 52 | 1.1241 | 0.4715 | 1.1241 | 1.0603 |
| No log | 3.6 | 54 | 1.0146 | 0.6308 | 1.0146 | 1.0073 |
| No log | 3.7333 | 56 | 0.9321 | 0.6308 | 0.9321 | 0.9654 |
| No log | 3.8667 | 58 | 0.9673 | 0.6047 | 0.9673 | 0.9835 |
| No log | 4.0 | 60 | 1.0037 | 0.5891 | 1.0037 | 1.0019 |
| No log | 4.1333 | 62 | 0.9706 | 0.5736 | 0.9706 | 0.9852 |
| No log | 4.2667 | 64 | 0.8562 | 0.6716 | 0.8562 | 0.9253 |
| No log | 4.4 | 66 | 0.8553 | 0.6418 | 0.8553 | 0.9248 |
| No log | 4.5333 | 68 | 0.9223 | 0.6370 | 0.9223 | 0.9604 |
| No log | 4.6667 | 70 | 1.0178 | 0.5606 | 1.0178 | 1.0089 |
| No log | 4.8 | 72 | 1.2634 | 0.4651 | 1.2634 | 1.1240 |
| No log | 4.9333 | 74 | 1.2355 | 0.5306 | 1.2355 | 1.1115 |
| No log | 5.0667 | 76 | 1.0545 | 0.6335 | 1.0545 | 1.0269 |
| No log | 5.2 | 78 | 0.7138 | 0.7838 | 0.7138 | 0.8448 |
| No log | 5.3333 | 80 | 0.7255 | 0.7383 | 0.7255 | 0.8517 |
| No log | 5.4667 | 82 | 0.7469 | 0.7552 | 0.7469 | 0.8643 |
| No log | 5.6 | 84 | 1.3484 | 0.5856 | 1.3484 | 1.1612 |
| No log | 5.7333 | 86 | 1.5694 | 0.5561 | 1.5694 | 1.2528 |
| No log | 5.8667 | 88 | 1.2492 | 0.6145 | 1.2492 | 1.1177 |
| No log | 6.0 | 90 | 0.8420 | 0.6667 | 0.8420 | 0.9176 |
| No log | 6.1333 | 92 | 0.7549 | 0.7083 | 0.7549 | 0.8689 |
| No log | 6.2667 | 94 | 0.6560 | 0.7376 | 0.6560 | 0.8100 |
| No log | 6.4 | 96 | 0.7739 | 0.7143 | 0.7739 | 0.8797 |
| No log | 6.5333 | 98 | 1.1548 | 0.6069 | 1.1548 | 1.0746 |
| No log | 6.6667 | 100 | 1.3905 | 0.52 | 1.3905 | 1.1792 |
| No log | 6.8 | 102 | 1.3579 | 0.5478 | 1.3579 | 1.1653 |
| No log | 6.9333 | 104 | 0.9802 | 0.6187 | 0.9802 | 0.9900 |
| No log | 7.0667 | 106 | 0.6626 | 0.7482 | 0.6626 | 0.8140 |
| No log | 7.2 | 108 | 0.6332 | 0.7518 | 0.6332 | 0.7957 |
| No log | 7.3333 | 110 | 0.7148 | 0.7376 | 0.7148 | 0.8455 |
| No log | 7.4667 | 112 | 0.8129 | 0.6571 | 0.8129 | 0.9016 |
| No log | 7.6 | 114 | 0.8026 | 0.6479 | 0.8026 | 0.8959 |
| No log | 7.7333 | 116 | 0.6198 | 0.7733 | 0.6198 | 0.7873 |
| No log | 7.8667 | 118 | 0.7266 | 0.6944 | 0.7266 | 0.8524 |
| No log | 8.0 | 120 | 0.8623 | 0.6711 | 0.8623 | 0.9286 |
| No log | 8.1333 | 122 | 0.6415 | 0.7662 | 0.6415 | 0.8010 |
| No log | 8.2667 | 124 | 0.6604 | 0.7662 | 0.6604 | 0.8126 |
| No log | 8.4 | 126 | 0.9292 | 0.6667 | 0.9292 | 0.9639 |
| No log | 8.5333 | 128 | 0.8896 | 0.6806 | 0.8896 | 0.9432 |
| No log | 8.6667 | 130 | 0.9029 | 0.6571 | 0.9029 | 0.9502 |
| No log | 8.8 | 132 | 0.9941 | 0.6351 | 0.9941 | 0.9970 |
| No log | 8.9333 | 134 | 0.8637 | 0.6573 | 0.8637 | 0.9294 |
| No log | 9.0667 | 136 | 0.7713 | 0.6809 | 0.7713 | 0.8783 |
| No log | 9.2 | 138 | 0.7920 | 0.6571 | 0.7920 | 0.8900 |
| No log | 9.3333 | 140 | 0.8501 | 0.6056 | 0.8501 | 0.9220 |
| No log | 9.4667 | 142 | 0.9855 | 0.6 | 0.9855 | 0.9927 |
| No log | 9.6 | 144 | 1.0494 | 0.5957 | 1.0494 | 1.0244 |
| No log | 9.7333 | 146 | 0.8674 | 0.6331 | 0.8674 | 0.9313 |
| No log | 9.8667 | 148 | 0.7458 | 0.6912 | 0.7458 | 0.8636 |
| No log | 10.0 | 150 | 0.7776 | 0.6906 | 0.7776 | 0.8818 |
| No log | 10.1333 | 152 | 0.7961 | 0.6667 | 0.7961 | 0.8923 |
| No log | 10.2667 | 154 | 0.9492 | 0.6622 | 0.9492 | 0.9743 |
| No log | 10.4 | 156 | 1.0172 | 0.6711 | 1.0172 | 1.0085 |
| No log | 10.5333 | 158 | 0.8203 | 0.6622 | 0.8203 | 0.9057 |
| No log | 10.6667 | 160 | 0.7372 | 0.6714 | 0.7372 | 0.8586 |
| No log | 10.8 | 162 | 0.7255 | 0.7042 | 0.7255 | 0.8518 |
| No log | 10.9333 | 164 | 0.6680 | 0.7194 | 0.6680 | 0.8173 |
| No log | 11.0667 | 166 | 0.6963 | 0.7246 | 0.6963 | 0.8345 |
| No log | 11.2 | 168 | 0.8056 | 0.6475 | 0.8056 | 0.8975 |
| No log | 11.3333 | 170 | 0.7660 | 0.6812 | 0.7660 | 0.8752 |
| No log | 11.4667 | 172 | 0.6853 | 0.7338 | 0.6853 | 0.8278 |
| No log | 11.6 | 174 | 0.6304 | 0.7626 | 0.6304 | 0.7940 |
| No log | 11.7333 | 176 | 0.6075 | 0.7891 | 0.6075 | 0.7794 |
| No log | 11.8667 | 178 | 0.7396 | 0.7172 | 0.7396 | 0.8600 |
| No log | 12.0 | 180 | 0.9294 | 0.6585 | 0.9294 | 0.9641 |
| No log | 12.1333 | 182 | 0.9396 | 0.6316 | 0.9396 | 0.9693 |
| No log | 12.2667 | 184 | 0.8041 | 0.6711 | 0.8041 | 0.8967 |
| No log | 12.4 | 186 | 0.6804 | 0.7432 | 0.6804 | 0.8249 |
| No log | 12.5333 | 188 | 0.7702 | 0.6571 | 0.7702 | 0.8776 |
| No log | 12.6667 | 190 | 0.9321 | 0.6479 | 0.9321 | 0.9654 |
| No log | 12.8 | 192 | 1.0040 | 0.6301 | 1.0040 | 1.0020 |
| No log | 12.9333 | 194 | 0.8510 | 0.6853 | 0.8510 | 0.9225 |
| No log | 13.0667 | 196 | 0.6875 | 0.7376 | 0.6875 | 0.8291 |
| No log | 13.2 | 198 | 0.6072 | 0.7703 | 0.6072 | 0.7792 |
| No log | 13.3333 | 200 | 0.7336 | 0.7215 | 0.7336 | 0.8565 |
| No log | 13.4667 | 202 | 1.0460 | 0.6552 | 1.0460 | 1.0227 |
| No log | 13.6 | 204 | 1.2817 | 0.5697 | 1.2817 | 1.1321 |
| No log | 13.7333 | 206 | 1.1876 | 0.6203 | 1.1876 | 1.0898 |
| No log | 13.8667 | 208 | 1.0188 | 0.6164 | 1.0188 | 1.0094 |
| No log | 14.0 | 210 | 0.7513 | 0.6812 | 0.7513 | 0.8668 |
| No log | 14.1333 | 212 | 0.6711 | 0.7737 | 0.6711 | 0.8192 |
| No log | 14.2667 | 214 | 0.6848 | 0.7770 | 0.6848 | 0.8275 |
| No log | 14.4 | 216 | 0.7011 | 0.75 | 0.7011 | 0.8373 |
| No log | 14.5333 | 218 | 0.8758 | 0.6471 | 0.8758 | 0.9359 |
| No log | 14.6667 | 220 | 0.9997 | 0.6143 | 0.9997 | 0.9998 |
| No log | 14.8 | 222 | 0.9076 | 0.6429 | 0.9076 | 0.9527 |
| No log | 14.9333 | 224 | 0.7261 | 0.7234 | 0.7261 | 0.8521 |
| No log | 15.0667 | 226 | 0.6580 | 0.7222 | 0.6580 | 0.8112 |
| No log | 15.2 | 228 | 0.7015 | 0.72 | 0.7015 | 0.8376 |
| No log | 15.3333 | 230 | 0.7810 | 0.6846 | 0.7810 | 0.8837 |
| No log | 15.4667 | 232 | 0.7408 | 0.6974 | 0.7408 | 0.8607 |
| No log | 15.6 | 234 | 0.6574 | 0.7308 | 0.6574 | 0.8108 |
| No log | 15.7333 | 236 | 0.6791 | 0.7308 | 0.6791 | 0.8241 |
| No log | 15.8667 | 238 | 0.7096 | 0.7097 | 0.7096 | 0.8424 |
| No log | 16.0 | 240 | 0.8848 | 0.6797 | 0.8848 | 0.9406 |
| No log | 16.1333 | 242 | 0.9792 | 0.6099 | 0.9792 | 0.9896 |
| No log | 16.2667 | 244 | 0.9858 | 0.6232 | 0.9858 | 0.9929 |
| No log | 16.4 | 246 | 0.8506 | 0.6471 | 0.8506 | 0.9223 |
| No log | 16.5333 | 248 | 0.7175 | 0.6912 | 0.7175 | 0.8471 |
| No log | 16.6667 | 250 | 0.6549 | 0.7746 | 0.6549 | 0.8092 |
| No log | 16.8 | 252 | 0.6353 | 0.7552 | 0.6353 | 0.7971 |
| No log | 16.9333 | 254 | 0.6639 | 0.7105 | 0.6639 | 0.8148 |
| No log | 17.0667 | 256 | 0.7135 | 0.7394 | 0.7135 | 0.8447 |
| No log | 17.2 | 258 | 0.6550 | 0.7320 | 0.6550 | 0.8093 |
| No log | 17.3333 | 260 | 0.6777 | 0.7222 | 0.6777 | 0.8232 |
| No log | 17.4667 | 262 | 0.7891 | 0.7 | 0.7891 | 0.8883 |
| No log | 17.6 | 264 | 0.9163 | 0.6434 | 0.9163 | 0.9572 |
| No log | 17.7333 | 266 | 0.8956 | 0.6471 | 0.8956 | 0.9463 |
| No log | 17.8667 | 268 | 0.8029 | 0.6912 | 0.8029 | 0.8961 |
| No log | 18.0 | 270 | 0.7422 | 0.7007 | 0.7422 | 0.8615 |
| No log | 18.1333 | 272 | 0.6987 | 0.7536 | 0.6987 | 0.8359 |
| No log | 18.2667 | 274 | 0.6950 | 0.7143 | 0.6950 | 0.8336 |
| No log | 18.4 | 276 | 0.7942 | 0.6901 | 0.7942 | 0.8912 |
| No log | 18.5333 | 278 | 0.8991 | 0.6667 | 0.8991 | 0.9482 |
| No log | 18.6667 | 280 | 0.8650 | 0.6571 | 0.8650 | 0.9301 |
| No log | 18.8 | 282 | 0.8234 | 0.6763 | 0.8234 | 0.9074 |
| No log | 18.9333 | 284 | 0.8478 | 0.6331 | 0.8478 | 0.9207 |
| No log | 19.0667 | 286 | 0.8134 | 0.6324 | 0.8134 | 0.9019 |
| No log | 19.2 | 288 | 0.8460 | 0.6197 | 0.8460 | 0.9198 |
| No log | 19.3333 | 290 | 0.7717 | 0.6479 | 0.7717 | 0.8785 |
| No log | 19.4667 | 292 | 0.5921 | 0.7755 | 0.5921 | 0.7695 |
| No log | 19.6 | 294 | 0.5302 | 0.8235 | 0.5302 | 0.7282 |
| No log | 19.7333 | 296 | 0.5761 | 0.8025 | 0.5761 | 0.7590 |
| No log | 19.8667 | 298 | 0.6661 | 0.7619 | 0.6661 | 0.8161 |
| No log | 20.0 | 300 | 0.6860 | 0.7456 | 0.6860 | 0.8282 |
| No log | 20.1333 | 302 | 0.5951 | 0.7613 | 0.5951 | 0.7714 |
| No log | 20.2667 | 304 | 0.5438 | 0.8378 | 0.5438 | 0.7374 |
| No log | 20.4 | 306 | 0.5584 | 0.8276 | 0.5584 | 0.7472 |
| No log | 20.5333 | 308 | 0.6468 | 0.7143 | 0.6468 | 0.8043 |
| No log | 20.6667 | 310 | 0.9132 | 0.6971 | 0.9132 | 0.9556 |
| No log | 20.8 | 312 | 0.9334 | 0.6864 | 0.9334 | 0.9661 |
| No log | 20.9333 | 314 | 0.7376 | 0.6753 | 0.7376 | 0.8588 |
| No log | 21.0667 | 316 | 0.5835 | 0.7785 | 0.5835 | 0.7639 |
| No log | 21.2 | 318 | 0.5969 | 0.8082 | 0.5969 | 0.7726 |
| No log | 21.3333 | 320 | 0.6354 | 0.7234 | 0.6354 | 0.7971 |
| No log | 21.4667 | 322 | 0.6994 | 0.7123 | 0.6994 | 0.8363 |
| No log | 21.6 | 324 | 0.7261 | 0.7143 | 0.7261 | 0.8521 |
| No log | 21.7333 | 326 | 0.7030 | 0.7020 | 0.7030 | 0.8384 |
| No log | 21.8667 | 328 | 0.6452 | 0.7429 | 0.6452 | 0.8032 |
| No log | 22.0 | 330 | 0.6688 | 0.7429 | 0.6688 | 0.8178 |
| No log | 22.1333 | 332 | 0.6757 | 0.7338 | 0.6757 | 0.8220 |
| No log | 22.2667 | 334 | 0.6984 | 0.7338 | 0.6984 | 0.8357 |
| No log | 22.4 | 336 | 0.7253 | 0.6906 | 0.7253 | 0.8517 |
| No log | 22.5333 | 338 | 0.7382 | 0.7042 | 0.7382 | 0.8592 |
| No log | 22.6667 | 340 | 0.6912 | 0.7105 | 0.6912 | 0.8314 |
| No log | 22.8 | 342 | 0.5747 | 0.8054 | 0.5747 | 0.7581 |
| No log | 22.9333 | 344 | 0.5317 | 0.8364 | 0.5317 | 0.7292 |
| No log | 23.0667 | 346 | 0.5487 | 0.8555 | 0.5487 | 0.7408 |
| No log | 23.2 | 348 | 0.6644 | 0.7515 | 0.6644 | 0.8151 |
| No log | 23.3333 | 350 | 0.7200 | 0.7329 | 0.7200 | 0.8485 |
| No log | 23.4667 | 352 | 0.8199 | 0.6928 | 0.8199 | 0.9055 |
| No log | 23.6 | 354 | 0.8883 | 0.6571 | 0.8883 | 0.9425 |
| No log | 23.7333 | 356 | 0.9538 | 0.6571 | 0.9538 | 0.9766 |
| No log | 23.8667 | 358 | 0.9195 | 0.6619 | 0.9195 | 0.9589 |
| No log | 24.0 | 360 | 0.8105 | 0.6618 | 0.8105 | 0.9003 |
| No log | 24.1333 | 362 | 0.7666 | 0.6906 | 0.7666 | 0.8755 |
| No log | 24.2667 | 364 | 0.6852 | 0.6986 | 0.6852 | 0.8278 |
| No log | 24.4 | 366 | 0.6131 | 0.7310 | 0.6131 | 0.7830 |
| No log | 24.5333 | 368 | 0.5770 | 0.7733 | 0.5770 | 0.7596 |
| No log | 24.6667 | 370 | 0.6031 | 0.7532 | 0.6031 | 0.7766 |
| No log | 24.8 | 372 | 0.7723 | 0.7143 | 0.7723 | 0.8788 |
| No log | 24.9333 | 374 | 0.9207 | 0.6883 | 0.9207 | 0.9595 |
| No log | 25.0667 | 376 | 0.8928 | 0.6887 | 0.8928 | 0.9449 |
| No log | 25.2 | 378 | 0.8102 | 0.6846 | 0.8102 | 0.9001 |
| No log | 25.3333 | 380 | 0.7689 | 0.6667 | 0.7689 | 0.8768 |
| No log | 25.4667 | 382 | 0.6978 | 0.6963 | 0.6978 | 0.8354 |
| No log | 25.6 | 384 | 0.6609 | 0.6963 | 0.6609 | 0.8130 |
| No log | 25.7333 | 386 | 0.6592 | 0.7206 | 0.6592 | 0.8119 |
| No log | 25.8667 | 388 | 0.6476 | 0.7429 | 0.6476 | 0.8047 |
| No log | 26.0 | 390 | 0.6837 | 0.7133 | 0.6837 | 0.8268 |
| No log | 26.1333 | 392 | 0.7288 | 0.7042 | 0.7288 | 0.8537 |
| No log | 26.2667 | 394 | 0.7566 | 0.7042 | 0.7566 | 0.8698 |
| No log | 26.4 | 396 | 0.7761 | 0.6950 | 0.7761 | 0.8810 |
| No log | 26.5333 | 398 | 0.7765 | 0.7042 | 0.7765 | 0.8812 |
| No log | 26.6667 | 400 | 0.7890 | 0.7050 | 0.7890 | 0.8882 |
| No log | 26.8 | 402 | 0.8254 | 0.6714 | 0.8254 | 0.9085 |
| No log | 26.9333 | 404 | 0.8076 | 0.6809 | 0.8076 | 0.8986 |
| No log | 27.0667 | 406 | 0.6899 | 0.7143 | 0.6899 | 0.8306 |
| No log | 27.2 | 408 | 0.6358 | 0.7324 | 0.6358 | 0.7974 |
| No log | 27.3333 | 410 | 0.6498 | 0.7222 | 0.6498 | 0.8061 |
| No log | 27.4667 | 412 | 0.6913 | 0.6800 | 0.6913 | 0.8314 |
| No log | 27.6 | 414 | 0.6634 | 0.7383 | 0.6634 | 0.8145 |
| No log | 27.7333 | 416 | 0.6275 | 0.7432 | 0.6275 | 0.7922 |
| No log | 27.8667 | 418 | 0.6145 | 0.7534 | 0.6145 | 0.7839 |
| No log | 28.0 | 420 | 0.6642 | 0.7383 | 0.6642 | 0.8150 |
| No log | 28.1333 | 422 | 0.6993 | 0.7260 | 0.6993 | 0.8362 |
| No log | 28.2667 | 424 | 0.6568 | 0.7413 | 0.6568 | 0.8104 |
| No log | 28.4 | 426 | 0.6369 | 0.75 | 0.6369 | 0.7981 |
| No log | 28.5333 | 428 | 0.6730 | 0.7260 | 0.6730 | 0.8203 |
| No log | 28.6667 | 430 | 0.7909 | 0.6711 | 0.7909 | 0.8893 |
| No log | 28.8 | 432 | 0.8407 | 0.6577 | 0.8407 | 0.9169 |
| No log | 28.9333 | 434 | 0.8250 | 0.6525 | 0.8250 | 0.9083 |
| No log | 29.0667 | 436 | 0.8575 | 0.6525 | 0.8575 | 0.9260 |
| No log | 29.2 | 438 | 0.8007 | 0.6812 | 0.8007 | 0.8948 |
| No log | 29.3333 | 440 | 0.7438 | 0.7059 | 0.7438 | 0.8624 |
| No log | 29.4667 | 442 | 0.7197 | 0.7194 | 0.7197 | 0.8483 |
| No log | 29.6 | 444 | 0.7524 | 0.6993 | 0.7524 | 0.8674 |
| No log | 29.7333 | 446 | 0.8125 | 0.6525 | 0.8125 | 0.9014 |
| No log | 29.8667 | 448 | 0.8455 | 0.6622 | 0.8455 | 0.9195 |
| No log | 30.0 | 450 | 0.7938 | 0.6849 | 0.7938 | 0.8910 |
| No log | 30.1333 | 452 | 0.7200 | 0.6901 | 0.7200 | 0.8485 |
| No log | 30.2667 | 454 | 0.6937 | 0.75 | 0.6937 | 0.8329 |
| No log | 30.4 | 456 | 0.7220 | 0.7222 | 0.7220 | 0.8497 |
| No log | 30.5333 | 458 | 0.7394 | 0.6901 | 0.7394 | 0.8599 |
| No log | 30.6667 | 460 | 0.7507 | 0.6901 | 0.7507 | 0.8664 |
| No log | 30.8 | 462 | 0.7858 | 0.6809 | 0.7858 | 0.8865 |
| No log | 30.9333 | 464 | 0.8016 | 0.6809 | 0.8016 | 0.8953 |
| No log | 31.0667 | 466 | 0.7656 | 0.6901 | 0.7656 | 0.8750 |
| No log | 31.2 | 468 | 0.6984 | 0.7518 | 0.6984 | 0.8357 |
| No log | 31.3333 | 470 | 0.6582 | 0.7606 | 0.6582 | 0.8113 |
| No log | 31.4667 | 472 | 0.6609 | 0.7534 | 0.6609 | 0.8130 |
| No log | 31.6 | 474 | 0.7180 | 0.7034 | 0.7180 | 0.8473 |
| No log | 31.7333 | 476 | 0.7447 | 0.6667 | 0.7447 | 0.8629 |
| No log | 31.8667 | 478 | 0.7231 | 0.6667 | 0.7231 | 0.8503 |
| No log | 32.0 | 480 | 0.6826 | 0.7482 | 0.6826 | 0.8262 |
| No log | 32.1333 | 482 | 0.6514 | 0.7482 | 0.6514 | 0.8071 |
| No log | 32.2667 | 484 | 0.6193 | 0.7746 | 0.6193 | 0.7870 |
| No log | 32.4 | 486 | 0.6658 | 0.7368 | 0.6658 | 0.8159 |
| No log | 32.5333 | 488 | 0.7012 | 0.7342 | 0.7012 | 0.8374 |
| No log | 32.6667 | 490 | 0.6646 | 0.7226 | 0.6646 | 0.8152 |
| No log | 32.8 | 492 | 0.6364 | 0.7448 | 0.6364 | 0.7977 |
| No log | 32.9333 | 494 | 0.6689 | 0.7518 | 0.6689 | 0.8179 |
| No log | 33.0667 | 496 | 0.6780 | 0.7482 | 0.6780 | 0.8234 |
| No log | 33.2 | 498 | 0.6935 | 0.7391 | 0.6935 | 0.8328 |
| 0.376 | 33.3333 | 500 | 0.7117 | 0.7391 | 0.7117 | 0.8436 |
| 0.376 | 33.4667 | 502 | 0.7760 | 0.6569 | 0.7760 | 0.8809 |
| 0.376 | 33.6 | 504 | 0.8113 | 0.6525 | 0.8113 | 0.9007 |
| 0.376 | 33.7333 | 506 | 0.7835 | 0.6809 | 0.7835 | 0.8852 |
| 0.376 | 33.8667 | 508 | 0.7022 | 0.7133 | 0.7022 | 0.8379 |
| 0.376 | 34.0 | 510 | 0.6824 | 0.7034 | 0.6824 | 0.8261 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1