ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run999_AugV5_k1_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 0.8221
  • Qwk: 0.6479
  • Mse: 0.8221
  • Rmse: 0.9067
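
Here, Qwk is the quadratic weighted kappa, Mse the mean squared error, and Rmse its square root; note that the reported loss equals the Mse, which suggests a regression objective. A minimal sketch of how these metrics can be computed with scikit-learn; the score values below are hypothetical, not from this model's evaluation set:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold organization scores and (rounded) model predictions.
y_true = [0, 1, 2, 3, 2, 1]
y_pred = [0, 2, 2, 3, 1, 1]

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = np.sqrt(mse)                                           # Rmse = sqrt(Mse)
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```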

Model description

More information needed

Intended uses & limitations

More information needed
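
Pending fuller documentation, a minimal inference sketch. It assumes the checkpoint loads as a single-logit regression model (the evaluation loss above equals the Mse, which points to a regression head with num_labels=1); the essay text is a placeholder:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run999_AugV5_k1_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

essay = "..."  # an Arabic essay to score for organization
inputs = tokenizer(essay, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits
score = logits.squeeze().item()  # assumed single regression output
print(score)
```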

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
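
As a sketch, these settings map onto transformers.TrainingArguments roughly as follows; output_dir is a placeholder, and anything not listed above (e.g. evaluation or checkpointing strategy) is an assumption:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="arabert_run999_task1_organization",  # placeholder, not from the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,       # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```

Since validation Qwk peaks well before the final epoch in the table below, pairing this with load_best_model_at_end=True and metric_for_best_model would be a natural extension, though the card does not say whether that was done.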

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-:|:-:|:-:|:-:|:-:|:-:|:-:|
| No log | 0.25 | 2 | 8.4015 | -0.0227 | 8.4015 | 2.8985 |
| No log | 0.5 | 4 | 6.0779 | 0.0000 | 6.0779 | 2.4653 |
| No log | 0.75 | 6 | 4.2145 | 0.0185 | 4.2145 | 2.0529 |
| No log | 1.0 | 8 | 3.2580 | 0.0585 | 3.2580 | 1.8050 |
| No log | 1.25 | 10 | 2.6967 | 0.0261 | 2.6967 | 1.6422 |
| No log | 1.5 | 12 | 2.2909 | 0.1135 | 2.2909 | 1.5136 |
| No log | 1.75 | 14 | 2.1738 | 0.2406 | 2.1738 | 1.4744 |
| No log | 2.0 | 16 | 1.8376 | 0.1930 | 1.8376 | 1.3556 |
| No log | 2.25 | 18 | 1.6283 | 0.1165 | 1.6283 | 1.2761 |
| No log | 2.5 | 20 | 1.5942 | 0.1538 | 1.5942 | 1.2626 |
| No log | 2.75 | 22 | 1.5268 | 0.1538 | 1.5268 | 1.2356 |
| No log | 3.0 | 24 | 1.6219 | 0.3761 | 1.6219 | 1.2735 |
| No log | 3.25 | 26 | 1.7328 | 0.3750 | 1.7328 | 1.3163 |
| No log | 3.5 | 28 | 1.5396 | 0.4390 | 1.5396 | 1.2408 |
| No log | 3.75 | 30 | 1.3455 | 0.3063 | 1.3455 | 1.1600 |
| No log | 4.0 | 32 | 1.4300 | 0.3214 | 1.4300 | 1.1958 |
| No log | 4.25 | 34 | 1.5067 | 0.3333 | 1.5067 | 1.2275 |
| No log | 4.5 | 36 | 1.4349 | 0.3276 | 1.4349 | 1.1979 |
| No log | 4.75 | 38 | 1.4388 | 0.3770 | 1.4388 | 1.1995 |
| No log | 5.0 | 40 | 1.3805 | 0.3902 | 1.3805 | 1.1750 |
| No log | 5.25 | 42 | 1.2901 | 0.4444 | 1.2901 | 1.1358 |
| No log | 5.5 | 44 | 1.1932 | 0.3833 | 1.1932 | 1.0923 |
| No log | 5.75 | 46 | 1.1241 | 0.5312 | 1.1241 | 1.0602 |
| No log | 6.0 | 48 | 1.1396 | 0.5581 | 1.1396 | 1.0675 |
| No log | 6.25 | 50 | 1.1444 | 0.5736 | 1.1444 | 1.0698 |
| No log | 6.5 | 52 | 1.0692 | 0.5455 | 1.0692 | 1.0340 |
| No log | 6.75 | 54 | 1.0340 | 0.5714 | 1.0340 | 1.0168 |
| No log | 7.0 | 56 | 1.0228 | 0.5775 | 1.0228 | 1.0113 |
| No log | 7.25 | 58 | 1.1170 | 0.5857 | 1.1170 | 1.0569 |
| No log | 7.5 | 60 | 1.0616 | 0.6277 | 1.0616 | 1.0303 |
| No log | 7.75 | 62 | 1.0069 | 0.5414 | 1.0069 | 1.0034 |
| No log | 8.0 | 64 | 0.9424 | 0.6131 | 0.9424 | 0.9708 |
| No log | 8.25 | 66 | 0.8892 | 0.6286 | 0.8892 | 0.9430 |
| No log | 8.5 | 68 | 0.9638 | 0.6429 | 0.9638 | 0.9818 |
| No log | 8.75 | 70 | 0.9483 | 0.6000 | 0.9483 | 0.9738 |
| No log | 9.0 | 72 | 0.9040 | 0.6412 | 0.9040 | 0.9508 |
| No log | 9.25 | 74 | 0.8926 | 0.6667 | 0.8926 | 0.9448 |
| No log | 9.5 | 76 | 0.8033 | 0.6619 | 0.8033 | 0.8963 |
| No log | 9.75 | 78 | 0.9307 | 0.5942 | 0.9307 | 0.9647 |
| No log | 10.0 | 80 | 0.9813 | 0.6015 | 0.9813 | 0.9906 |
| No log | 10.25 | 82 | 0.9614 | 0.6047 | 0.9614 | 0.9805 |
| No log | 10.5 | 84 | 0.8625 | 0.6667 | 0.8625 | 0.9287 |
| No log | 10.75 | 86 | 0.7352 | 0.6957 | 0.7352 | 0.8574 |
| No log | 11.0 | 88 | 0.7344 | 0.6957 | 0.7344 | 0.8569 |
| No log | 11.25 | 90 | 0.7717 | 0.7133 | 0.7717 | 0.8784 |
| No log | 11.5 | 92 | 0.7450 | 0.6815 | 0.7450 | 0.8631 |
| No log | 11.75 | 94 | 0.8799 | 0.7050 | 0.8799 | 0.9380 |
| No log | 12.0 | 96 | 0.9895 | 0.6207 | 0.9895 | 0.9948 |
| No log | 12.25 | 98 | 0.8847 | 0.6667 | 0.8847 | 0.9406 |
| No log | 12.5 | 100 | 0.8559 | 0.6619 | 0.8559 | 0.9251 |
| No log | 12.75 | 102 | 0.9125 | 0.6423 | 0.9125 | 0.9553 |
| No log | 13.0 | 104 | 0.8910 | 0.6277 | 0.8910 | 0.9439 |
| No log | 13.25 | 106 | 0.9034 | 0.6571 | 0.9034 | 0.9505 |
| No log | 13.5 | 108 | 1.0215 | 0.6131 | 1.0215 | 1.0107 |
| No log | 13.75 | 110 | 0.9906 | 0.6316 | 0.9906 | 0.9953 |
| No log | 14.0 | 112 | 0.9342 | 0.6316 | 0.9342 | 0.9665 |
| No log | 14.25 | 114 | 0.8580 | 0.6471 | 0.8580 | 0.9263 |
| No log | 14.5 | 116 | 0.8055 | 0.6714 | 0.8055 | 0.8975 |
| No log | 14.75 | 118 | 0.8014 | 0.6714 | 0.8014 | 0.8952 |
| No log | 15.0 | 120 | 0.8003 | 0.6571 | 0.8003 | 0.8946 |
| No log | 15.25 | 122 | 0.8637 | 0.6525 | 0.8637 | 0.9294 |
| No log | 15.5 | 124 | 0.8291 | 0.6525 | 0.8291 | 0.9105 |
| No log | 15.75 | 126 | 0.7882 | 0.7211 | 0.7882 | 0.8878 |
| No log | 16.0 | 128 | 0.8044 | 0.6993 | 0.8044 | 0.8969 |
| No log | 16.25 | 130 | 0.8094 | 0.7042 | 0.8094 | 0.8996 |
| No log | 16.5 | 132 | 0.8166 | 0.6475 | 0.8166 | 0.9037 |
| No log | 16.75 | 134 | 0.8957 | 0.6569 | 0.8957 | 0.9464 |
| No log | 17.0 | 136 | 0.9082 | 0.6471 | 0.9082 | 0.9530 |
| No log | 17.25 | 138 | 0.9031 | 0.6471 | 0.9031 | 0.9503 |
| No log | 17.5 | 140 | 0.8356 | 0.6569 | 0.8356 | 0.9141 |
| No log | 17.75 | 142 | 0.8495 | 0.6316 | 0.8495 | 0.9217 |
| No log | 18.0 | 144 | 0.8449 | 0.6759 | 0.8449 | 0.9192 |
| No log | 18.25 | 146 | 0.8068 | 0.6857 | 0.8068 | 0.8982 |
| No log | 18.5 | 148 | 0.8712 | 0.6620 | 0.8712 | 0.9334 |
| No log | 18.75 | 150 | 0.9683 | 0.6533 | 0.9683 | 0.9840 |
| No log | 19.0 | 152 | 0.9026 | 0.6525 | 0.9026 | 0.9500 |
| No log | 19.25 | 154 | 0.8443 | 0.6857 | 0.8443 | 0.9189 |
| No log | 19.5 | 156 | 0.9082 | 0.6377 | 0.9082 | 0.9530 |
| No log | 19.75 | 158 | 0.9250 | 0.6423 | 0.9250 | 0.9618 |
| No log | 20.0 | 160 | 0.9205 | 0.6015 | 0.9205 | 0.9594 |
| No log | 20.25 | 162 | 1.0278 | 0.5674 | 1.0278 | 1.0138 |
| No log | 20.5 | 164 | 1.1169 | 0.5634 | 1.1169 | 1.0568 |
| No log | 20.75 | 166 | 1.0358 | 0.5942 | 1.0358 | 1.0178 |
| No log | 21.0 | 168 | 0.9633 | 0.6074 | 0.9633 | 0.9815 |
| No log | 21.25 | 170 | 0.9961 | 0.6475 | 0.9961 | 0.9981 |
| No log | 21.5 | 172 | 0.9387 | 0.6571 | 0.9387 | 0.9689 |
| No log | 21.75 | 174 | 0.8516 | 0.6522 | 0.8516 | 0.9228 |
| No log | 22.0 | 176 | 0.9413 | 0.6486 | 0.9413 | 0.9702 |
| No log | 22.25 | 178 | 1.0079 | 0.6400 | 1.0079 | 1.0039 |
| No log | 22.5 | 180 | 0.9115 | 0.6853 | 0.9115 | 0.9547 |
| No log | 22.75 | 182 | 0.8071 | 0.6866 | 0.8071 | 0.8984 |
| No log | 23.0 | 184 | 0.8037 | 0.6716 | 0.8037 | 0.8965 |
| No log | 23.25 | 186 | 0.8030 | 0.6565 | 0.8030 | 0.8961 |
| No log | 23.5 | 188 | 0.7986 | 0.6718 | 0.7986 | 0.8937 |
| No log | 23.75 | 190 | 0.8289 | 0.6815 | 0.8289 | 0.9104 |
| No log | 24.0 | 192 | 0.8615 | 0.6957 | 0.8615 | 0.9282 |
| No log | 24.25 | 194 | 0.8470 | 0.6765 | 0.8470 | 0.9203 |
| No log | 24.5 | 196 | 0.8552 | 0.6713 | 0.8552 | 0.9248 |
| No log | 24.75 | 198 | 0.8551 | 0.6713 | 0.8551 | 0.9247 |
| No log | 25.0 | 200 | 0.9062 | 0.6577 | 0.9062 | 0.9519 |
| No log | 25.25 | 202 | 0.9288 | 0.6040 | 0.9288 | 0.9637 |
| No log | 25.5 | 204 | 0.9146 | 0.6531 | 0.9146 | 0.9563 |
| No log | 25.75 | 206 | 0.8998 | 0.6757 | 0.8998 | 0.9486 |
| No log | 26.0 | 208 | 0.8990 | 0.6846 | 0.8990 | 0.9482 |
| No log | 26.25 | 210 | 0.8652 | 0.6800 | 0.8652 | 0.9302 |
| No log | 26.5 | 212 | 0.8451 | 0.6887 | 0.8451 | 0.9193 |
| No log | 26.75 | 214 | 0.8546 | 0.6842 | 0.8546 | 0.9244 |
| No log | 27.0 | 216 | 0.8321 | 0.6939 | 0.8321 | 0.9122 |
| No log | 27.25 | 218 | 0.8130 | 0.7007 | 0.8130 | 0.9017 |
| No log | 27.5 | 220 | 0.8149 | 0.6912 | 0.8149 | 0.9027 |
| No log | 27.75 | 222 | 0.8107 | 0.6912 | 0.8107 | 0.9004 |
| No log | 28.0 | 224 | 0.8081 | 0.6812 | 0.8081 | 0.8989 |
| No log | 28.25 | 226 | 0.8082 | 0.6763 | 0.8082 | 0.8990 |
| No log | 28.5 | 228 | 0.8091 | 0.6619 | 0.8091 | 0.8995 |
| No log | 28.75 | 230 | 0.8305 | 0.6906 | 0.8305 | 0.9113 |
| No log | 29.0 | 232 | 0.8839 | 0.6711 | 0.8839 | 0.9401 |
| No log | 29.25 | 234 | 0.8924 | 0.6667 | 0.8924 | 0.9447 |
| No log | 29.5 | 236 | 0.8792 | 0.7200 | 0.8792 | 0.9377 |
| No log | 29.75 | 238 | 0.9161 | 0.6573 | 0.9161 | 0.9571 |
| No log | 30.0 | 240 | 0.9565 | 0.6338 | 0.9565 | 0.9780 |
| No log | 30.25 | 242 | 0.9384 | 0.6620 | 0.9384 | 0.9687 |
| No log | 30.5 | 244 | 1.0006 | 0.6309 | 1.0006 | 1.0003 |
| No log | 30.75 | 246 | 1.1285 | 0.5882 | 1.1285 | 1.0623 |
| No log | 31.0 | 248 | 1.1021 | 0.6483 | 1.1021 | 1.0498 |
| No log | 31.25 | 250 | 1.0230 | 0.6316 | 1.0230 | 1.0114 |
| No log | 31.5 | 252 | 0.9600 | 0.6515 | 0.9600 | 0.9798 |
| No log | 31.75 | 254 | 0.8991 | 0.6815 | 0.8991 | 0.9482 |
| No log | 32.0 | 256 | 0.8515 | 0.6618 | 0.8515 | 0.9228 |
| No log | 32.25 | 258 | 0.8376 | 0.6423 | 0.8376 | 0.9152 |
| No log | 32.5 | 260 | 0.8025 | 0.6906 | 0.8025 | 0.8958 |
| No log | 32.75 | 262 | 0.7954 | 0.6906 | 0.7954 | 0.8918 |
| No log | 33.0 | 264 | 0.8295 | 0.6667 | 0.8295 | 0.9108 |
| No log | 33.25 | 266 | 0.8124 | 0.6906 | 0.8124 | 0.9013 |
| No log | 33.5 | 268 | 0.7775 | 0.7101 | 0.7775 | 0.8817 |
| No log | 33.75 | 270 | 0.7782 | 0.6861 | 0.7782 | 0.8821 |
| No log | 34.0 | 272 | 0.7994 | 0.6763 | 0.7994 | 0.8941 |
| No log | 34.25 | 274 | 0.7895 | 0.6763 | 0.7895 | 0.8886 |
| No log | 34.5 | 276 | 0.7922 | 0.7000 | 0.7922 | 0.8900 |
| No log | 34.75 | 278 | 0.8814 | 0.6667 | 0.8814 | 0.9388 |
| No log | 35.0 | 280 | 0.9255 | 0.6709 | 0.9255 | 0.9620 |
| No log | 35.25 | 282 | 0.8943 | 0.6579 | 0.8943 | 0.9457 |
| No log | 35.5 | 284 | 0.8204 | 0.6815 | 0.8204 | 0.9058 |
| No log | 35.75 | 286 | 0.7941 | 0.6963 | 0.7941 | 0.8911 |
| No log | 36.0 | 288 | 0.7871 | 0.6963 | 0.7871 | 0.8872 |
| No log | 36.25 | 290 | 0.8084 | 0.6815 | 0.8084 | 0.8991 |
| No log | 36.5 | 292 | 0.8878 | 0.6846 | 0.8878 | 0.9422 |
| No log | 36.75 | 294 | 0.8893 | 0.7067 | 0.8893 | 0.9430 |
| No log | 37.0 | 296 | 0.8098 | 0.7083 | 0.8098 | 0.8999 |
| No log | 37.25 | 298 | 0.7644 | 0.6861 | 0.7644 | 0.8743 |
| No log | 37.5 | 300 | 0.7743 | 0.6765 | 0.7743 | 0.8800 |
| No log | 37.75 | 302 | 0.7814 | 0.6861 | 0.7814 | 0.8840 |
| No log | 38.0 | 304 | 0.8011 | 0.6912 | 0.8011 | 0.8950 |
| No log | 38.25 | 306 | 0.8478 | 0.6812 | 0.8478 | 0.9207 |
| No log | 38.5 | 308 | 0.9024 | 0.6475 | 0.9024 | 0.9499 |
| No log | 38.75 | 310 | 0.9025 | 0.6667 | 0.9025 | 0.9500 |
| No log | 39.0 | 312 | 0.8917 | 0.6714 | 0.8917 | 0.9443 |
| No log | 39.25 | 314 | 0.9079 | 0.6531 | 0.9079 | 0.9528 |
| No log | 39.5 | 316 | 0.8944 | 0.6667 | 0.8944 | 0.9457 |
| No log | 39.75 | 318 | 0.8692 | 0.6901 | 0.8692 | 0.9323 |
| No log | 40.0 | 320 | 0.8531 | 0.6912 | 0.8531 | 0.9237 |
| No log | 40.25 | 322 | 0.8540 | 0.6912 | 0.8540 | 0.9241 |
| No log | 40.5 | 324 | 0.8601 | 0.6912 | 0.8601 | 0.9274 |
| No log | 40.75 | 326 | 0.8993 | 0.7042 | 0.8993 | 0.9483 |
| No log | 41.0 | 328 | 0.9632 | 0.6711 | 0.9632 | 0.9814 |
| No log | 41.25 | 330 | 0.9402 | 0.6800 | 0.9402 | 0.9696 |
| No log | 41.5 | 332 | 0.8785 | 0.6812 | 0.8785 | 0.9373 |
| No log | 41.75 | 334 | 0.8495 | 0.6715 | 0.8495 | 0.9217 |
| No log | 42.0 | 336 | 0.8450 | 0.6765 | 0.8450 | 0.9193 |
| No log | 42.25 | 338 | 0.8416 | 0.6569 | 0.8416 | 0.9174 |
| No log | 42.5 | 340 | 0.8395 | 0.6522 | 0.8395 | 0.9163 |
| No log | 42.75 | 342 | 0.8550 | 0.6812 | 0.8550 | 0.9247 |
| No log | 43.0 | 344 | 0.8482 | 0.6806 | 0.8482 | 0.9210 |
| No log | 43.25 | 346 | 0.8422 | 0.6812 | 0.8422 | 0.9177 |
| No log | 43.5 | 348 | 0.8610 | 0.6861 | 0.8610 | 0.9279 |
| No log | 43.75 | 350 | 0.9155 | 0.6316 | 0.9155 | 0.9568 |
| No log | 44.0 | 352 | 0.9693 | 0.6471 | 0.9693 | 0.9845 |
| No log | 44.25 | 354 | 0.9751 | 0.6475 | 0.9751 | 0.9875 |
| No log | 44.5 | 356 | 0.9037 | 0.6316 | 0.9037 | 0.9507 |
| No log | 44.75 | 358 | 0.8283 | 0.6667 | 0.8283 | 0.9101 |
| No log | 45.0 | 360 | 0.7955 | 0.6466 | 0.7955 | 0.8919 |
| No log | 45.25 | 362 | 0.8170 | 0.6418 | 0.8170 | 0.9039 |
| No log | 45.5 | 364 | 0.8169 | 0.6176 | 0.8169 | 0.9038 |
| No log | 45.75 | 366 | 0.7803 | 0.6765 | 0.7803 | 0.8833 |
| No log | 46.0 | 368 | 0.7711 | 0.6906 | 0.7711 | 0.8781 |
| No log | 46.25 | 370 | 0.8085 | 0.6573 | 0.8085 | 0.8992 |
| No log | 46.5 | 372 | 0.8154 | 0.6429 | 0.8154 | 0.9030 |
| No log | 46.75 | 374 | 0.7906 | 0.6667 | 0.7906 | 0.8892 |
| No log | 47.0 | 376 | 0.7908 | 0.6912 | 0.7908 | 0.8893 |
| No log | 47.25 | 378 | 0.7923 | 0.6912 | 0.7923 | 0.8901 |
| No log | 47.5 | 380 | 0.7983 | 0.6861 | 0.7983 | 0.8935 |
| No log | 47.75 | 382 | 0.8121 | 0.6853 | 0.8121 | 0.9012 |
| No log | 48.0 | 384 | 0.8306 | 0.6795 | 0.8306 | 0.9114 |
| No log | 48.25 | 386 | 0.8504 | 0.6790 | 0.8504 | 0.9222 |
| No log | 48.5 | 388 | 0.8054 | 0.7205 | 0.8054 | 0.8974 |
| No log | 48.75 | 390 | 0.7628 | 0.7248 | 0.7628 | 0.8734 |
| No log | 49.0 | 392 | 0.7536 | 0.6815 | 0.7536 | 0.8681 |
| No log | 49.25 | 394 | 0.7692 | 0.6815 | 0.7692 | 0.8770 |
| No log | 49.5 | 396 | 0.7919 | 0.6815 | 0.7919 | 0.8899 |
| No log | 49.75 | 398 | 0.8288 | 0.6617 | 0.8288 | 0.9104 |
| No log | 50.0 | 400 | 0.8675 | 0.6475 | 0.8675 | 0.9314 |
| No log | 50.25 | 402 | 0.8982 | 0.6759 | 0.8982 | 0.9478 |
| No log | 50.5 | 404 | 0.8867 | 0.6712 | 0.8867 | 0.9417 |
| No log | 50.75 | 406 | 0.8422 | 0.6573 | 0.8422 | 0.9177 |
| No log | 51.0 | 408 | 0.8027 | 0.6618 | 0.8027 | 0.8960 |
| No log | 51.25 | 410 | 0.7910 | 0.6849 | 0.7910 | 0.8894 |
| No log | 51.5 | 412 | 0.7784 | 0.7075 | 0.7784 | 0.8823 |
| No log | 51.75 | 414 | 0.7626 | 0.7162 | 0.7626 | 0.8732 |
| No log | 52.0 | 416 | 0.7661 | 0.7451 | 0.7661 | 0.8753 |
| No log | 52.25 | 418 | 0.7850 | 0.7273 | 0.7850 | 0.8860 |
| No log | 52.5 | 420 | 0.8173 | 0.6753 | 0.8173 | 0.9041 |
| No log | 52.75 | 422 | 0.8199 | 0.6620 | 0.8199 | 0.9055 |
| No log | 53.0 | 424 | 0.8112 | 0.6667 | 0.8112 | 0.9007 |
| No log | 53.25 | 426 | 0.8038 | 0.6765 | 0.8038 | 0.8966 |
| No log | 53.5 | 428 | 0.7847 | 0.6861 | 0.7847 | 0.8858 |
| No log | 53.75 | 430 | 0.7724 | 0.7042 | 0.7724 | 0.8789 |
| No log | 54.0 | 432 | 0.7696 | 0.7034 | 0.7696 | 0.8773 |
| No log | 54.25 | 434 | 0.7805 | 0.7152 | 0.7805 | 0.8835 |
| No log | 54.5 | 436 | 0.7928 | 0.7067 | 0.7928 | 0.8904 |
| No log | 54.75 | 438 | 0.8000 | 0.6993 | 0.8000 | 0.8944 |
| No log | 55.0 | 440 | 0.7988 | 0.6901 | 0.7988 | 0.8937 |
| No log | 55.25 | 442 | 0.8042 | 0.6901 | 0.8042 | 0.8968 |
| No log | 55.5 | 444 | 0.7971 | 0.6812 | 0.7971 | 0.8928 |
| No log | 55.75 | 446 | 0.7862 | 0.6812 | 0.7862 | 0.8867 |
| No log | 56.0 | 448 | 0.7890 | 0.7101 | 0.7890 | 0.8882 |
| No log | 56.25 | 450 | 0.7860 | 0.7042 | 0.7860 | 0.8865 |
| No log | 56.5 | 452 | 0.7644 | 0.6901 | 0.7644 | 0.8743 |
| No log | 56.75 | 454 | 0.7627 | 0.7172 | 0.7627 | 0.8733 |
| No log | 57.0 | 456 | 0.7815 | 0.7034 | 0.7815 | 0.8840 |
| No log | 57.25 | 458 | 0.8063 | 0.6944 | 0.8063 | 0.8980 |
| No log | 57.5 | 460 | 0.8137 | 0.6812 | 0.8137 | 0.9020 |
| No log | 57.75 | 462 | 0.8071 | 0.6812 | 0.8071 | 0.8984 |
| No log | 58.0 | 464 | 0.8032 | 0.6812 | 0.8032 | 0.8962 |
| No log | 58.25 | 466 | 0.7977 | 0.6812 | 0.7977 | 0.8931 |
| No log | 58.5 | 468 | 0.8032 | 0.6812 | 0.8032 | 0.8962 |
| No log | 58.75 | 470 | 0.8188 | 0.6957 | 0.8188 | 0.9049 |
| No log | 59.0 | 472 | 0.8434 | 0.6912 | 0.8434 | 0.9184 |
| No log | 59.25 | 474 | 0.8588 | 0.7050 | 0.8588 | 0.9267 |
| No log | 59.5 | 476 | 0.8394 | 0.6912 | 0.8394 | 0.9162 |
| No log | 59.75 | 478 | 0.8176 | 0.6912 | 0.8176 | 0.9042 |
| No log | 60.0 | 480 | 0.7989 | 0.6957 | 0.7989 | 0.8938 |
| No log | 60.25 | 482 | 0.7873 | 0.6861 | 0.7873 | 0.8873 |
| No log | 60.5 | 484 | 0.7818 | 0.6957 | 0.7818 | 0.8842 |
| No log | 60.75 | 486 | 0.7880 | 0.6957 | 0.7880 | 0.8877 |
| No log | 61.0 | 488 | 0.8108 | 0.7000 | 0.8108 | 0.9005 |
| No log | 61.25 | 490 | 0.8220 | 0.6950 | 0.8220 | 0.9066 |
| No log | 61.5 | 492 | 0.8198 | 0.6957 | 0.8198 | 0.9055 |
| No log | 61.75 | 494 | 0.8154 | 0.6763 | 0.8154 | 0.9030 |
| No log | 62.0 | 496 | 0.8163 | 0.6763 | 0.8163 | 0.9035 |
| No log | 62.25 | 498 | 0.8176 | 0.6763 | 0.8176 | 0.9042 |
| 0.2952 | 62.5 | 500 | 0.8150 | 0.6763 | 0.8150 | 0.9028 |
| 0.2952 | 62.75 | 502 | 0.8122 | 0.6906 | 0.8122 | 0.9012 |
| 0.2952 | 63.0 | 504 | 0.8279 | 0.6763 | 0.8279 | 0.9099 |
| 0.2952 | 63.25 | 506 | 0.8400 | 0.6812 | 0.8400 | 0.9165 |
| 0.2952 | 63.5 | 508 | 0.8349 | 0.6765 | 0.8349 | 0.9137 |
| 0.2952 | 63.75 | 510 | 0.8267 | 0.6957 | 0.8267 | 0.9092 |
| 0.2952 | 64.0 | 512 | 0.8220 | 0.6957 | 0.8220 | 0.9067 |
| 0.2952 | 64.25 | 514 | 0.8166 | 0.6763 | 0.8166 | 0.9037 |
| 0.2952 | 64.5 | 516 | 0.8071 | 0.6857 | 0.8071 | 0.8984 |
| 0.2952 | 64.75 | 518 | 0.8025 | 0.6906 | 0.8025 | 0.8958 |
| 0.2952 | 65.0 | 520 | 0.7983 | 0.6906 | 0.7983 | 0.8935 |
| 0.2952 | 65.25 | 522 | 0.7980 | 0.6763 | 0.7980 | 0.8933 |
| 0.2952 | 65.5 | 524 | 0.8152 | 0.6713 | 0.8152 | 0.9029 |
| 0.2952 | 65.75 | 526 | 0.8301 | 0.6479 | 0.8301 | 0.9111 |
| 0.2952 | 66.0 | 528 | 0.8295 | 0.6479 | 0.8295 | 0.9108 |
| 0.2952 | 66.25 | 530 | 0.8221 | 0.6479 | 0.8221 | 0.9067 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32
