ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k19_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6365
  • QWK: 0.4990
  • MSE: 0.6365
  • RMSE: 0.7978
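The reported RMSE is simply the square root of the MSE, and QWK (quadratic weighted kappa) measures agreement between predicted and gold ordinal scores. A minimal, dependency-free sketch of both metrics (the 0–4 score range in the sanity checks is an illustrative assumption, not taken from this dataset):

```python
import math

def rmse(y_true, y_pred):
    """Root mean squared error between gold and predicted scores."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    """Quadratic weighted kappa over integer labels 0..num_classes-1."""
    n = len(y_true)
    # Observed confusion matrix
    obs = [[0.0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    # Expected matrix from the marginal histograms
    hist_t = [sum(row) for row in obs]
    hist_p = [sum(obs[i][j] for i in range(num_classes)) for j in range(num_classes)]
    exp = [[hist_t[i] * hist_p[j] / n for j in range(num_classes)]
           for i in range(num_classes)]
    # Quadratic disagreement weights: (i - j)^2 scaled to [0, 1]
    w = [[(i - j) ** 2 / (num_classes - 1) ** 2 for j in range(num_classes)]
         for i in range(num_classes)]
    num = sum(w[i][j] * obs[i][j] for i in range(num_classes) for j in range(num_classes))
    den = sum(w[i][j] * exp[i][j] for i in range(num_classes) for j in range(num_classes))
    return 1.0 - num / den

# Sanity check: the reported RMSE is the square root of the reported MSE
print(round(math.sqrt(0.6365), 4))  # → 0.7978
```

Perfect agreement yields a QWK of 1.0, and chance-level agreement yields 0.0, so the final QWK of 0.4990 indicates moderate agreement with the gold scores.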

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
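With lr_scheduler_type set to linear and no warmup steps listed, the learning rate presumably decays linearly from 2e-05 toward zero over the planned training steps. A small sketch of that schedule, assuming 96 optimizer steps per epoch (consistent with epoch 1.0 falling at step 96 in the log below) and the full 100-epoch budget:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: ramp up over warmup_steps, then decay to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

total = 100 * 96  # num_epochs * steps per epoch (assumed from the log)
print(linear_lr(0, total))     # 2e-05 at the start
print(linear_lr(4800, total))  # 1e-05 halfway through
print(linear_lr(total, total)) # 0.0 at the end
```

Note that the results table below stops at step 510 (epoch ~5.3), so only the early, nearly flat portion of this schedule was actually traversed; the run appears to have ended well before the 100-epoch budget.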

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0208 2 4.0235 0.0086 4.0235 2.0059
No log 0.0417 4 2.7746 -0.0252 2.7746 1.6657
No log 0.0625 6 1.8054 0.0203 1.8054 1.3437
No log 0.0833 8 1.1326 0.2441 1.1326 1.0642
No log 0.1042 10 1.0757 0.1504 1.0757 1.0372
No log 0.125 12 1.0771 0.1561 1.0771 1.0378
No log 0.1458 14 1.0975 0.2015 1.0975 1.0476
No log 0.1667 16 1.0980 0.1389 1.0980 1.0478
No log 0.1875 18 1.1952 0.1265 1.1952 1.0932
No log 0.2083 20 1.0954 0.1776 1.0954 1.0466
No log 0.2292 22 1.1212 0.2764 1.1212 1.0589
No log 0.25 24 1.2134 0.1361 1.2134 1.1015
No log 0.2708 26 1.0979 0.2120 1.0979 1.0478
No log 0.2917 28 1.0212 0.2865 1.0212 1.0106
No log 0.3125 30 1.0185 0.2865 1.0185 1.0092
No log 0.3333 32 1.0212 0.2599 1.0212 1.0105
No log 0.3542 34 1.7296 -0.0305 1.7296 1.3151
No log 0.375 36 1.6515 0.0233 1.6515 1.2851
No log 0.3958 38 1.0757 0.3434 1.0757 1.0372
No log 0.4167 40 0.9107 0.2470 0.9107 0.9543
No log 0.4375 42 0.8939 0.3217 0.8939 0.9455
No log 0.4583 44 0.9926 0.4152 0.9926 0.9963
No log 0.4792 46 0.9727 0.4255 0.9727 0.9862
No log 0.5 48 0.9498 0.3756 0.9498 0.9746
No log 0.5208 50 0.8962 0.3596 0.8962 0.9467
No log 0.5417 52 0.9159 0.3083 0.9159 0.9570
No log 0.5625 54 1.1986 0.2987 1.1986 1.0948
No log 0.5833 56 1.4162 0.2110 1.4162 1.1900
No log 0.6042 58 1.2101 0.2647 1.2101 1.1000
No log 0.625 60 0.9455 0.3769 0.9455 0.9724
No log 0.6458 62 0.8906 0.4565 0.8906 0.9437
No log 0.6667 64 0.8150 0.4851 0.8150 0.9028
No log 0.6875 66 0.7678 0.4644 0.7678 0.8762
No log 0.7083 68 0.8593 0.4940 0.8593 0.9270
No log 0.7292 70 1.2412 0.2926 1.2412 1.1141
No log 0.75 72 1.1524 0.3062 1.1524 1.0735
No log 0.7708 74 0.8038 0.5317 0.8038 0.8965
No log 0.7917 76 0.8847 0.4906 0.8847 0.9406
No log 0.8125 78 0.8852 0.4898 0.8852 0.9409
No log 0.8333 80 0.7870 0.4295 0.7870 0.8871
No log 0.8542 82 0.9766 0.4254 0.9766 0.9883
No log 0.875 84 1.2100 0.3250 1.2100 1.1000
No log 0.8958 86 1.1177 0.2447 1.1177 1.0572
No log 0.9167 88 0.8617 0.4480 0.8617 0.9283
No log 0.9375 90 0.8280 0.4533 0.8280 0.9099
No log 0.9583 92 0.8157 0.4067 0.8157 0.9031
No log 0.9792 94 0.9959 0.3717 0.9959 0.9980
No log 1.0 96 1.1760 0.3744 1.1760 1.0844
No log 1.0208 98 0.9045 0.4474 0.9045 0.9510
No log 1.0417 100 0.7807 0.5348 0.7807 0.8835
No log 1.0625 102 0.7569 0.5232 0.7569 0.8700
No log 1.0833 104 0.7776 0.5206 0.7776 0.8818
No log 1.1042 106 1.0616 0.4276 1.0616 1.0303
No log 1.125 108 1.1137 0.4092 1.1137 1.0553
No log 1.1458 110 0.8023 0.5487 0.8023 0.8957
No log 1.1667 112 0.6892 0.5722 0.6892 0.8302
No log 1.1875 114 0.6757 0.5722 0.6757 0.8220
No log 1.2083 116 0.7116 0.5883 0.7116 0.8436
No log 1.2292 118 0.6545 0.5932 0.6545 0.8090
No log 1.25 120 0.6372 0.5847 0.6372 0.7982
No log 1.2708 122 0.6591 0.6122 0.6591 0.8118
No log 1.2917 124 0.6779 0.5975 0.6779 0.8234
No log 1.3125 126 0.6542 0.5503 0.6542 0.8088
No log 1.3333 128 0.6416 0.5644 0.6416 0.8010
No log 1.3542 130 0.6401 0.5171 0.6401 0.8000
No log 1.375 132 0.6291 0.5847 0.6291 0.7932
No log 1.3958 134 0.7311 0.5600 0.7311 0.8550
No log 1.4167 136 0.6597 0.6218 0.6597 0.8122
No log 1.4375 138 0.6353 0.6548 0.6353 0.7971
No log 1.4583 140 0.6788 0.6240 0.6788 0.8239
No log 1.4792 142 0.6772 0.5740 0.6772 0.8229
No log 1.5 144 0.7701 0.5242 0.7701 0.8776
No log 1.5208 146 0.7927 0.5479 0.7927 0.8903
No log 1.5417 148 0.7305 0.4106 0.7305 0.8547
No log 1.5625 150 0.7081 0.4082 0.7081 0.8415
No log 1.5833 152 0.8072 0.5134 0.8072 0.8984
No log 1.6042 154 0.7827 0.5128 0.7827 0.8847
No log 1.625 156 0.6861 0.5786 0.6861 0.8283
No log 1.6458 158 0.6752 0.5884 0.6752 0.8217
No log 1.6667 160 0.7057 0.5677 0.7057 0.8401
No log 1.6875 162 0.8213 0.4898 0.8213 0.9063
No log 1.7083 164 0.7886 0.4912 0.7886 0.8881
No log 1.7292 166 0.6528 0.5074 0.6528 0.8079
No log 1.75 168 0.6309 0.4554 0.6309 0.7943
No log 1.7708 170 0.6577 0.4841 0.6577 0.8110
No log 1.7917 172 0.6347 0.5845 0.6347 0.7967
No log 1.8125 174 0.7668 0.5119 0.7668 0.8757
No log 1.8333 176 0.8825 0.4987 0.8825 0.9394
No log 1.8542 178 0.7703 0.5497 0.7703 0.8777
No log 1.875 180 0.6384 0.5627 0.6384 0.7990
No log 1.8958 182 0.6552 0.5301 0.6552 0.8094
No log 1.9167 184 0.6604 0.5972 0.6604 0.8127
No log 1.9375 186 0.7042 0.5279 0.7042 0.8392
No log 1.9583 188 0.8138 0.5119 0.8138 0.9021
No log 1.9792 190 0.7853 0.5137 0.7853 0.8862
No log 2.0 192 0.6990 0.5103 0.6990 0.8360
No log 2.0208 194 0.6725 0.5288 0.6725 0.8201
No log 2.0417 196 0.7505 0.5342 0.7505 0.8663
No log 2.0625 198 0.7107 0.5188 0.7107 0.8430
No log 2.0833 200 0.6951 0.5635 0.6951 0.8337
No log 2.1042 202 0.7482 0.5566 0.7482 0.8650
No log 2.125 204 0.8778 0.4612 0.8778 0.9369
No log 2.1458 206 0.7761 0.4467 0.7761 0.8809
No log 2.1667 208 0.7285 0.4204 0.7285 0.8535
No log 2.1875 210 0.7869 0.3760 0.7869 0.8871
No log 2.2083 212 0.7455 0.4168 0.7455 0.8635
No log 2.2292 214 0.7102 0.4675 0.7102 0.8427
No log 2.25 216 0.7142 0.4829 0.7142 0.8451
No log 2.2708 218 0.6509 0.5142 0.6509 0.8068
No log 2.2917 220 0.6599 0.4597 0.6599 0.8123
No log 2.3125 222 0.6551 0.5759 0.6551 0.8094
No log 2.3333 224 0.8584 0.5389 0.8584 0.9265
No log 2.3542 226 0.8834 0.5058 0.8834 0.9399
No log 2.375 228 0.6907 0.5438 0.6907 0.8311
No log 2.3958 230 0.6464 0.5288 0.6464 0.8040
No log 2.4167 232 0.6601 0.5359 0.6601 0.8124
No log 2.4375 234 0.6660 0.5388 0.6660 0.8161
No log 2.4583 236 0.7230 0.4990 0.7230 0.8503
No log 2.4792 238 0.7432 0.4879 0.7432 0.8621
No log 2.5 240 0.6933 0.5085 0.6933 0.8326
No log 2.5208 242 0.6696 0.5142 0.6696 0.8183
No log 2.5417 244 0.6654 0.5969 0.6654 0.8157
No log 2.5625 246 0.7053 0.5279 0.7053 0.8398
No log 2.5833 248 0.6807 0.5546 0.6807 0.8251
No log 2.6042 250 0.6953 0.5542 0.6953 0.8338
No log 2.625 252 0.7436 0.5012 0.7436 0.8623
No log 2.6458 254 0.6919 0.4938 0.6919 0.8318
No log 2.6667 256 0.7068 0.5217 0.7068 0.8407
No log 2.6875 258 0.7878 0.4809 0.7878 0.8876
No log 2.7083 260 0.7551 0.4696 0.7551 0.8689
No log 2.7292 262 0.6855 0.5018 0.6855 0.8280
No log 2.75 264 0.7287 0.5229 0.7287 0.8536
No log 2.7708 266 0.7062 0.5852 0.7062 0.8404
No log 2.7917 268 0.6880 0.5422 0.6880 0.8295
No log 2.8125 270 0.7354 0.5153 0.7354 0.8576
No log 2.8333 272 0.7230 0.5626 0.7230 0.8503
No log 2.8542 274 0.7242 0.5329 0.7242 0.8510
No log 2.875 276 0.7365 0.5076 0.7365 0.8582
No log 2.8958 278 0.7686 0.5222 0.7686 0.8767
No log 2.9167 280 0.7124 0.5500 0.7124 0.8441
No log 2.9375 282 0.6984 0.5837 0.6984 0.8357
No log 2.9583 284 0.7466 0.5433 0.7466 0.8641
No log 2.9792 286 0.7415 0.5527 0.7415 0.8611
No log 3.0 288 0.6791 0.5580 0.6791 0.8241
No log 3.0208 290 0.6594 0.4462 0.6594 0.8120
No log 3.0417 292 0.6562 0.4493 0.6562 0.8101
No log 3.0625 294 0.6484 0.4493 0.6484 0.8052
No log 3.0833 296 0.6641 0.4987 0.6641 0.8149
No log 3.1042 298 0.7190 0.5787 0.7190 0.8479
No log 3.125 300 0.6801 0.5450 0.6801 0.8247
No log 3.1458 302 0.6099 0.5408 0.6099 0.7809
No log 3.1667 304 0.6172 0.5165 0.6172 0.7856
No log 3.1875 306 0.6095 0.5394 0.6095 0.7807
No log 3.2083 308 0.6611 0.5663 0.6611 0.8131
No log 3.2292 310 0.6882 0.5942 0.6882 0.8296
No log 3.25 312 0.6548 0.5601 0.6548 0.8092
No log 3.2708 314 0.6030 0.5259 0.6030 0.7765
No log 3.2917 316 0.5846 0.5871 0.5846 0.7646
No log 3.3125 318 0.5797 0.6087 0.5797 0.7614
No log 3.3333 320 0.5852 0.6175 0.5852 0.7650
No log 3.3542 322 0.6216 0.5858 0.6216 0.7884
No log 3.375 324 0.6117 0.6219 0.6117 0.7821
No log 3.3958 326 0.5933 0.6067 0.5933 0.7703
No log 3.4167 328 0.6774 0.5121 0.6774 0.8230
No log 3.4375 330 0.7547 0.5583 0.7547 0.8687
No log 3.4583 332 0.6623 0.5446 0.6623 0.8138
No log 3.4792 334 0.6248 0.5928 0.6248 0.7904
No log 3.5 336 0.6464 0.5909 0.6464 0.8040
No log 3.5208 338 0.6307 0.5949 0.6307 0.7942
No log 3.5417 340 0.6328 0.6073 0.6328 0.7955
No log 3.5625 342 0.6432 0.6122 0.6432 0.8020
No log 3.5833 344 0.6314 0.6054 0.6314 0.7946
No log 3.6042 346 0.6522 0.5174 0.6522 0.8076
No log 3.625 348 0.6605 0.4944 0.6605 0.8127
No log 3.6458 350 0.6557 0.4810 0.6557 0.8098
No log 3.6667 352 0.6423 0.5631 0.6423 0.8014
No log 3.6875 354 0.6415 0.5610 0.6415 0.8010
No log 3.7083 356 0.6466 0.5485 0.6466 0.8041
No log 3.7292 358 0.6486 0.5622 0.6486 0.8054
No log 3.75 360 0.6562 0.5932 0.6562 0.8100
No log 3.7708 362 0.6611 0.5585 0.6611 0.8131
No log 3.7917 364 0.6443 0.4764 0.6443 0.8027
No log 3.8125 366 0.6502 0.4750 0.6502 0.8064
No log 3.8333 368 0.6981 0.5204 0.6981 0.8355
No log 3.8542 370 0.8094 0.5497 0.8094 0.8997
No log 3.875 372 0.8709 0.5549 0.8709 0.9332
No log 3.8958 374 0.7875 0.5598 0.7875 0.8874
No log 3.9167 376 0.7241 0.5433 0.7241 0.8510
No log 3.9375 378 0.7735 0.5387 0.7735 0.8795
No log 3.9583 380 0.7687 0.5403 0.7687 0.8768
No log 3.9792 382 0.6892 0.5463 0.6892 0.8302
No log 4.0 384 0.6543 0.5302 0.6543 0.8089
No log 4.0208 386 0.6551 0.5432 0.6551 0.8094
No log 4.0417 388 0.6395 0.5498 0.6395 0.7997
No log 4.0625 390 0.6405 0.5220 0.6405 0.8003
No log 4.0833 392 0.6317 0.4745 0.6317 0.7948
No log 4.1042 394 0.6221 0.5274 0.6221 0.7887
No log 4.125 396 0.6214 0.5835 0.6214 0.7883
No log 4.1458 398 0.6184 0.6252 0.6184 0.7864
No log 4.1667 400 0.6183 0.6753 0.6183 0.7863
No log 4.1875 402 0.6558 0.6464 0.6558 0.8098
No log 4.2083 404 0.7397 0.5799 0.7397 0.8601
No log 4.2292 406 0.7082 0.6042 0.7082 0.8415
No log 4.25 408 0.6585 0.5978 0.6585 0.8115
No log 4.2708 410 0.6596 0.5978 0.6596 0.8121
No log 4.2917 412 0.6579 0.5978 0.6579 0.8111
No log 4.3125 414 0.6557 0.5142 0.6557 0.8098
No log 4.3333 416 0.6576 0.5302 0.6576 0.8109
No log 4.3542 418 0.6675 0.4903 0.6675 0.8170
No log 4.375 420 0.6835 0.4490 0.6835 0.8268
No log 4.3958 422 0.6813 0.4490 0.6813 0.8254
No log 4.4167 424 0.6817 0.4353 0.6817 0.8257
No log 4.4375 426 0.6920 0.4966 0.6920 0.8319
No log 4.4583 428 0.6755 0.5093 0.6755 0.8219
No log 4.4792 430 0.6586 0.5093 0.6586 0.8115
No log 4.5 432 0.6185 0.5375 0.6185 0.7865
No log 4.5208 434 0.6021 0.5391 0.6021 0.7759
No log 4.5417 436 0.6060 0.5835 0.6060 0.7785
No log 4.5625 438 0.5859 0.5835 0.5859 0.7654
No log 4.5833 440 0.6270 0.5862 0.6270 0.7918
No log 4.6042 442 0.6844 0.5792 0.6844 0.8273
No log 4.625 444 0.6338 0.5860 0.6338 0.7961
No log 4.6458 446 0.5652 0.6461 0.5652 0.7518
No log 4.6667 448 0.5596 0.6046 0.5596 0.7481
No log 4.6875 450 0.5656 0.6196 0.5656 0.7520
No log 4.7083 452 0.6015 0.6174 0.6015 0.7756
No log 4.7292 454 0.6436 0.5770 0.6436 0.8022
No log 4.75 456 0.6508 0.5997 0.6508 0.8068
No log 4.7708 458 0.5790 0.6125 0.5790 0.7609
No log 4.7917 460 0.5673 0.6796 0.5673 0.7532
No log 4.8125 462 0.6134 0.6451 0.6134 0.7832
No log 4.8333 464 0.6051 0.6197 0.6051 0.7779
No log 4.8542 466 0.5642 0.6507 0.5642 0.7512
No log 4.875 468 0.5558 0.6705 0.5558 0.7455
No log 4.8958 470 0.5582 0.6705 0.5582 0.7471
No log 4.9167 472 0.5801 0.6207 0.5801 0.7616
No log 4.9375 474 0.6099 0.6830 0.6099 0.7810
No log 4.9583 476 0.6362 0.6511 0.6362 0.7976
No log 4.9792 478 0.6439 0.6259 0.6439 0.8024
No log 5.0 480 0.6173 0.5810 0.6173 0.7857
No log 5.0208 482 0.5926 0.5510 0.5926 0.7698
No log 5.0417 484 0.5838 0.5510 0.5838 0.7641
No log 5.0625 486 0.5890 0.6311 0.5890 0.7675
No log 5.0833 488 0.5919 0.6421 0.5919 0.7694
No log 5.1042 490 0.6201 0.6485 0.6201 0.7875
No log 5.125 492 0.6828 0.5898 0.6828 0.8263
No log 5.1458 494 0.6767 0.5729 0.6767 0.8226
No log 5.1667 496 0.6584 0.5844 0.6584 0.8114
No log 5.1875 498 0.6227 0.6537 0.6227 0.7891
0.2911 5.2083 500 0.5703 0.7064 0.5703 0.7552
0.2911 5.2292 502 0.5909 0.6444 0.5909 0.7687
0.2911 5.25 504 0.6262 0.5782 0.6262 0.7913
0.2911 5.2708 506 0.6146 0.5554 0.6146 0.7839
0.2911 5.2917 508 0.6077 0.5327 0.6077 0.7795
0.2911 5.3125 510 0.6365 0.4990 0.6365 0.7978
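The final checkpoint (QWK 0.4990 at step 510) is not the best one in the log: validation QWK peaks at 0.7064 around step 500. A quick sketch of scanning such a log for the best checkpoint, using a few rows copied from the table above:

```python
# (epoch, step, validation_loss, qwk) rows copied from the table above
rows = [
    (4.7917, 460, 0.5673, 0.6796),
    (5.0208, 482, 0.5926, 0.5510),
    (5.2083, 500, 0.5703, 0.7064),
    (5.3125, 510, 0.6365, 0.4990),
]

# Pick the checkpoint with the highest validation QWK
best = max(rows, key=lambda r: r[3])
print(f"best QWK {best[3]} at epoch {best[0]} (step {best[1]})")
```

If selection by validation QWK matters for this task, loading the step-500 checkpoint (when available) would presumably serve better than the final weights reported at the top of this card.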

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model weights

  • Model size: 0.1B params
  • Tensor type: F32
  • Format: Safetensors
Model tree

  • Finetuned from: aubmindlab/bert-base-arabertv02