ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k2_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5986
  • Qwk: 0.4306
  • Mse: 0.5986
  • Rmse: 0.7737
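
For reference, the reported metrics can be reproduced from a set of predictions with scikit-learn. The snippet below is a minimal sketch, not the original evaluation code; `labels` and `preds` are hypothetical placeholder arrays.

```python
# Minimal sketch of the reported metrics (QWK, MSE, RMSE) using scikit-learn.
# `labels` and `preds` are hypothetical integer score arrays for illustration.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

labels = np.array([0, 1, 2, 2, 3])   # gold organization scores (example values)
preds = np.array([0, 1, 1, 2, 3])    # model predictions rounded to the same scale

qwk = cohen_kappa_score(labels, preds, weights="quadratic")  # Quadratic Weighted Kappa
mse = mean_squared_error(labels, preds)                      # Mean Squared Error
rmse = np.sqrt(mse)                                          # Root Mean Squared Error

print(f"QWK: {qwk:.4f}  MSE: {mse:.4f}  RMSE: {rmse:.4f}")
```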

Model description

More information needed

Intended uses & limitations

More information needed
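
No usage details are documented, but the checkpoint can presumably be loaded with the standard Transformers API. The sketch below assumes a sequence-classification/regression head; the head type, label count, and input formatting are assumptions, not documented facts.

```python
# Hedged loading sketch; the head type and expected input format are assumptions,
# since the task details are not documented in this card.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k2_task7_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)  # "essay text here"
outputs = model(**inputs)
print(outputs.logits)
```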

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
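
The values above map directly onto `transformers.TrainingArguments`; the snippet below is a minimal sketch of that mapping, not the original training script (the output path is a placeholder, and the Adam betas/epsilon are spelled out even though they match the Trainer defaults).

```python
# Sketch of the listed hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",            # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                    # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```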

Training results

"No log" in the training-loss column means no training loss had been reported yet at that evaluation step; the first (and only) logged training loss, 0.2325, appears at step 500, consistent with the Trainer's default logging interval of 500 steps.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.2 2 2.6091 -0.0109 2.6091 1.6153
No log 0.4 4 1.2345 0.0994 1.2345 1.1111
No log 0.6 6 0.7217 0.1372 0.7217 0.8495
No log 0.8 8 0.7345 0.2440 0.7345 0.8570
No log 1.0 10 0.9827 0.2805 0.9827 0.9913
No log 1.2 12 1.1373 0.1907 1.1373 1.0665
No log 1.4 14 0.9936 0.2020 0.9936 0.9968
No log 1.6 16 0.8009 0.2817 0.8009 0.8949
No log 1.8 18 0.7224 0.0 0.7224 0.8500
No log 2.0 20 0.7384 0.0481 0.7384 0.8593
No log 2.2 22 0.7430 0.0481 0.7430 0.8620
No log 2.4 24 0.7459 0.0481 0.7459 0.8636
No log 2.6 26 0.7185 0.0846 0.7185 0.8476
No log 2.8 28 0.6933 0.1604 0.6933 0.8326
No log 3.0 30 0.7475 0.2913 0.7475 0.8646
No log 3.2 32 0.8263 0.4482 0.8263 0.9090
No log 3.4 34 0.7636 0.3329 0.7636 0.8738
No log 3.6 36 0.7785 0.2812 0.7785 0.8823
No log 3.8 38 0.7626 0.2812 0.7626 0.8733
No log 4.0 40 0.8349 0.3134 0.8349 0.9137
No log 4.2 42 0.9364 0.4076 0.9364 0.9677
No log 4.4 44 0.7842 0.4627 0.7842 0.8856
No log 4.6 46 0.6052 0.4292 0.6052 0.7779
No log 4.8 48 0.7330 0.3180 0.7330 0.8561
No log 5.0 50 0.7052 0.3594 0.7052 0.8398
No log 5.2 52 0.6032 0.4534 0.6032 0.7767
No log 5.4 54 0.8537 0.3618 0.8537 0.9240
No log 5.6 56 0.9693 0.2824 0.9693 0.9845
No log 5.8 58 0.7703 0.3847 0.7703 0.8777
No log 6.0 60 0.5521 0.5819 0.5521 0.7430
No log 6.2 62 0.5446 0.5587 0.5446 0.7380
No log 6.4 64 0.6932 0.3868 0.6932 0.8326
No log 6.6 66 1.0042 0.4410 1.0042 1.0021
No log 6.8 68 1.0091 0.4856 1.0091 1.0046
No log 7.0 70 0.7205 0.3847 0.7205 0.8488
No log 7.2 72 0.5506 0.5467 0.5506 0.7420
No log 7.4 74 0.5133 0.5044 0.5133 0.7165
No log 7.6 76 0.4922 0.5533 0.4922 0.7016
No log 7.8 78 0.6238 0.5295 0.6238 0.7898
No log 8.0 80 0.5903 0.5802 0.5903 0.7683
No log 8.2 82 0.4999 0.5939 0.4999 0.7071
No log 8.4 84 0.5295 0.5836 0.5295 0.7276
No log 8.6 86 0.6513 0.5937 0.6513 0.8071
No log 8.8 88 0.6226 0.5733 0.6226 0.7891
No log 9.0 90 0.5138 0.6144 0.5138 0.7168
No log 9.2 92 0.4849 0.5950 0.4849 0.6963
No log 9.4 94 0.4856 0.5820 0.4856 0.6968
No log 9.6 96 0.4517 0.5003 0.4517 0.6721
No log 9.8 98 0.4465 0.5003 0.4465 0.6682
No log 10.0 100 0.4739 0.5560 0.4739 0.6884
No log 10.2 102 0.5620 0.6464 0.5620 0.7496
No log 10.4 104 0.5222 0.6392 0.5222 0.7226
No log 10.6 106 0.4633 0.5003 0.4633 0.6807
No log 10.8 108 0.4921 0.5189 0.4921 0.7015
No log 11.0 110 0.4793 0.4817 0.4793 0.6923
No log 11.2 112 0.4972 0.5104 0.4972 0.7052
No log 11.4 114 0.7215 0.4472 0.7215 0.8494
No log 11.6 116 0.8301 0.4716 0.8301 0.9111
No log 11.8 118 0.7135 0.4852 0.7135 0.8447
No log 12.0 120 0.5208 0.5395 0.5208 0.7217
No log 12.2 122 0.5255 0.5745 0.5255 0.7249
No log 12.4 124 0.5179 0.5745 0.5179 0.7196
No log 12.6 126 0.4921 0.5703 0.4921 0.7015
No log 12.8 128 0.6609 0.5032 0.6609 0.8130
No log 13.0 130 0.7910 0.4886 0.7910 0.8894
No log 13.2 132 0.6269 0.4721 0.6269 0.7918
No log 13.4 134 0.5453 0.5597 0.5453 0.7385
No log 13.6 136 0.5443 0.5091 0.5443 0.7377
No log 13.8 138 0.5654 0.4414 0.5654 0.7519
No log 14.0 140 0.5448 0.3274 0.5448 0.7381
No log 14.2 142 0.5501 0.3354 0.5501 0.7417
No log 14.4 144 0.6004 0.4997 0.6004 0.7748
No log 14.6 146 0.7489 0.4297 0.7489 0.8654
No log 14.8 148 0.7915 0.4881 0.7915 0.8897
No log 15.0 150 0.6437 0.5747 0.6437 0.8023
No log 15.2 152 0.5420 0.4997 0.5420 0.7362
No log 15.4 154 0.5815 0.5410 0.5815 0.7625
No log 15.6 156 0.5502 0.4864 0.5502 0.7418
No log 15.8 158 0.5101 0.4657 0.5101 0.7142
No log 16.0 160 0.5172 0.4289 0.5172 0.7192
No log 16.2 162 0.5633 0.5533 0.5633 0.7505
No log 16.4 164 0.6701 0.4522 0.6701 0.8186
No log 16.6 166 0.7403 0.4419 0.7403 0.8604
No log 16.8 168 0.6134 0.5292 0.6134 0.7832
No log 17.0 170 0.5178 0.5271 0.5178 0.7196
No log 17.2 172 0.4885 0.5587 0.4885 0.6989
No log 17.4 174 0.5284 0.5349 0.5284 0.7269
No log 17.6 176 0.5548 0.5349 0.5548 0.7449
No log 17.8 178 0.5219 0.5060 0.5219 0.7224
No log 18.0 180 0.5237 0.4289 0.5237 0.7236
No log 18.2 182 0.5268 0.3980 0.5268 0.7258
No log 18.4 184 0.5837 0.5091 0.5837 0.7640
No log 18.6 186 0.6896 0.5052 0.6896 0.8304
No log 18.8 188 0.7030 0.5263 0.7030 0.8385
No log 19.0 190 0.5778 0.5275 0.5778 0.7601
No log 19.2 192 0.4728 0.4949 0.4728 0.6876
No log 19.4 194 0.4704 0.4891 0.4704 0.6858
No log 19.6 196 0.4653 0.5521 0.4653 0.6821
No log 19.8 198 0.6220 0.5408 0.6220 0.7886
No log 20.0 200 0.9248 0.4553 0.9248 0.9617
No log 20.2 202 0.9758 0.3949 0.9758 0.9878
No log 20.4 204 0.8236 0.5281 0.8236 0.9075
No log 20.6 206 0.6344 0.5090 0.6344 0.7965
No log 20.8 208 0.5174 0.4983 0.5174 0.7193
No log 21.0 210 0.4936 0.4289 0.4936 0.7026
No log 21.2 212 0.5028 0.5182 0.5028 0.7091
No log 21.4 214 0.6061 0.4880 0.6061 0.7785
No log 21.6 216 0.7866 0.5088 0.7866 0.8869
No log 21.8 218 0.8583 0.4106 0.8583 0.9264
No log 22.0 220 0.7599 0.4400 0.7599 0.8717
No log 22.2 222 0.6913 0.4624 0.6913 0.8315
No log 22.4 224 0.6913 0.4387 0.6913 0.8315
No log 22.6 226 0.6329 0.4409 0.6329 0.7956
No log 22.8 228 0.5669 0.4925 0.5669 0.7529
No log 23.0 230 0.5407 0.5209 0.5407 0.7353
No log 23.2 232 0.5813 0.5045 0.5813 0.7624
No log 23.4 234 0.6825 0.5068 0.6825 0.8262
No log 23.6 236 0.7641 0.4704 0.7641 0.8741
No log 23.8 238 0.8085 0.4760 0.8085 0.8992
No log 24.0 240 0.7355 0.4687 0.7355 0.8576
No log 24.2 242 0.6075 0.5067 0.6075 0.7794
No log 24.4 244 0.5364 0.5413 0.5364 0.7324
No log 24.6 246 0.5223 0.5161 0.5223 0.7227
No log 24.8 248 0.5296 0.5349 0.5296 0.7277
No log 25.0 250 0.5672 0.5067 0.5672 0.7531
No log 25.2 252 0.5281 0.5310 0.5281 0.7267
No log 25.4 254 0.4924 0.5209 0.4924 0.7017
No log 25.6 256 0.4800 0.4086 0.4800 0.6928
No log 25.8 258 0.4847 0.4264 0.4847 0.6962
No log 26.0 260 0.5198 0.5060 0.5198 0.7210
No log 26.2 262 0.5954 0.5367 0.5954 0.7716
No log 26.4 264 0.6789 0.4769 0.6789 0.8239
No log 26.6 266 0.7351 0.4608 0.7351 0.8574
No log 26.8 268 0.7138 0.4687 0.7138 0.8449
No log 27.0 270 0.6603 0.4853 0.6603 0.8126
No log 27.2 272 0.6896 0.4925 0.6896 0.8304
No log 27.4 274 0.6586 0.4925 0.6586 0.8115
No log 27.6 276 0.6530 0.5211 0.6530 0.8081
No log 27.8 278 0.6631 0.5124 0.6631 0.8143
No log 28.0 280 0.6164 0.5278 0.6164 0.7851
No log 28.2 282 0.5664 0.5773 0.5664 0.7526
No log 28.4 284 0.5377 0.5580 0.5377 0.7332
No log 28.6 286 0.4981 0.5393 0.4981 0.7058
No log 28.8 288 0.4783 0.5182 0.4783 0.6916
No log 29.0 290 0.5040 0.5140 0.5040 0.7100
No log 29.2 292 0.5557 0.5290 0.5557 0.7455
No log 29.4 294 0.5975 0.5904 0.5975 0.7730
No log 29.6 296 0.6222 0.5072 0.6222 0.7888
No log 29.8 298 0.5879 0.5836 0.5879 0.7667
No log 30.0 300 0.5138 0.5551 0.5138 0.7168
No log 30.2 302 0.4720 0.5182 0.4720 0.6870
No log 30.4 304 0.4760 0.4007 0.4760 0.6900
No log 30.6 306 0.4932 0.4634 0.4932 0.7023
No log 30.8 308 0.5418 0.5290 0.5418 0.7360
No log 31.0 310 0.6196 0.4795 0.6196 0.7872
No log 31.2 312 0.6793 0.4549 0.6793 0.8242
No log 31.4 314 0.6387 0.4795 0.6387 0.7992
No log 31.6 316 0.6115 0.4479 0.6115 0.7820
No log 31.8 318 0.5765 0.5393 0.5765 0.7593
No log 32.0 320 0.5671 0.5245 0.5671 0.7531
No log 32.2 322 0.6030 0.4479 0.6030 0.7765
No log 32.4 324 0.6462 0.5095 0.6462 0.8038
No log 32.6 326 0.6914 0.4627 0.6914 0.8315
No log 32.8 328 0.6979 0.4630 0.6979 0.8354
No log 33.0 330 0.6701 0.5131 0.6701 0.8186
No log 33.2 332 0.6879 0.4777 0.6879 0.8294
No log 33.4 334 0.7075 0.4549 0.7075 0.8411
No log 33.6 336 0.6712 0.5095 0.6712 0.8192
No log 33.8 338 0.6231 0.4728 0.6231 0.7893
No log 34.0 340 0.6083 0.4728 0.6083 0.7800
No log 34.2 342 0.6138 0.4644 0.6138 0.7834
No log 34.4 344 0.5704 0.5189 0.5704 0.7552
No log 34.6 346 0.5258 0.5245 0.5258 0.7251
No log 34.8 348 0.5051 0.4610 0.5051 0.7107
No log 35.0 350 0.4811 0.4264 0.4811 0.6936
No log 35.2 352 0.4751 0.4264 0.4751 0.6893
No log 35.4 354 0.4802 0.4264 0.4802 0.6930
No log 35.6 356 0.4856 0.4264 0.4856 0.6969
No log 35.8 358 0.5014 0.3980 0.5014 0.7081
No log 36.0 360 0.5238 0.3953 0.5238 0.7237
No log 36.2 362 0.5569 0.5140 0.5569 0.7463
No log 36.4 364 0.6206 0.4977 0.6206 0.7878
No log 36.6 366 0.6586 0.5120 0.6586 0.8115
No log 36.8 368 0.6576 0.5183 0.6576 0.8109
No log 37.0 370 0.6213 0.5241 0.6213 0.7882
No log 37.2 372 0.5477 0.4962 0.5477 0.7401
No log 37.4 374 0.5141 0.4482 0.5141 0.7170
No log 37.6 376 0.5230 0.4875 0.5230 0.7232
No log 37.8 378 0.5742 0.6078 0.5742 0.7578
No log 38.0 380 0.6054 0.5917 0.6054 0.7781
No log 38.2 382 0.6112 0.5851 0.6112 0.7818
No log 38.4 384 0.5960 0.5664 0.5960 0.7720
No log 38.6 386 0.5986 0.5664 0.5986 0.7737
No log 38.8 388 0.6817 0.4624 0.6817 0.8256
No log 39.0 390 0.7588 0.4687 0.7588 0.8711
No log 39.2 392 0.8110 0.4183 0.8110 0.9005
No log 39.4 394 0.7772 0.4549 0.7772 0.8816
No log 39.6 396 0.7088 0.4144 0.7088 0.8419
No log 39.8 398 0.6191 0.4662 0.6191 0.7868
No log 40.0 400 0.5648 0.4879 0.5648 0.7515
No log 40.2 402 0.5454 0.4238 0.5454 0.7385
No log 40.4 404 0.5500 0.4238 0.5500 0.7417
No log 40.6 406 0.5881 0.5385 0.5881 0.7669
No log 40.8 408 0.6592 0.4862 0.6592 0.8119
No log 41.0 410 0.6792 0.4707 0.6792 0.8241
No log 41.2 412 0.6866 0.4624 0.6866 0.8286
No log 41.4 414 0.6961 0.4624 0.6961 0.8343
No log 41.6 416 0.7346 0.4384 0.7346 0.8571
No log 41.8 418 0.7791 0.4328 0.7791 0.8827
No log 42.0 420 0.7693 0.4484 0.7693 0.8771
No log 42.2 422 0.7345 0.4836 0.7345 0.8570
No log 42.4 424 0.6880 0.4651 0.6880 0.8294
No log 42.6 426 0.6113 0.5008 0.6113 0.7819
No log 42.8 428 0.5742 0.5740 0.5742 0.7578
No log 43.0 430 0.5390 0.5039 0.5390 0.7341
No log 43.2 432 0.5323 0.3953 0.5323 0.7296
No log 43.4 434 0.5360 0.3953 0.5360 0.7321
No log 43.6 436 0.5529 0.4060 0.5529 0.7436
No log 43.8 438 0.6005 0.5411 0.6005 0.7749
No log 44.0 440 0.6641 0.5095 0.6641 0.8149
No log 44.2 442 0.7466 0.4393 0.7466 0.8640
No log 44.4 444 0.7678 0.4091 0.7678 0.8762
No log 44.6 446 0.7371 0.4149 0.7371 0.8585
No log 44.8 448 0.6403 0.5733 0.6403 0.8002
No log 45.0 450 0.5718 0.5622 0.5718 0.7562
No log 45.2 452 0.5495 0.4292 0.5495 0.7413
No log 45.4 454 0.5226 0.4444 0.5226 0.7229
No log 45.6 456 0.5217 0.4267 0.5217 0.7223
No log 45.8 458 0.5513 0.4292 0.5513 0.7425
No log 46.0 460 0.6032 0.5149 0.6032 0.7767
No log 46.2 462 0.6441 0.5554 0.6441 0.8026
No log 46.4 464 0.6471 0.5045 0.6471 0.8044
No log 46.6 466 0.6201 0.5131 0.6201 0.7874
No log 46.8 468 0.5691 0.4354 0.5691 0.7544
No log 47.0 470 0.5273 0.3953 0.5273 0.7261
No log 47.2 472 0.5150 0.3688 0.5150 0.7176
No log 47.4 474 0.5155 0.3688 0.5155 0.7180
No log 47.6 476 0.5204 0.3980 0.5204 0.7214
No log 47.8 478 0.5455 0.3953 0.5455 0.7386
No log 48.0 480 0.5839 0.5189 0.5839 0.7641
No log 48.2 482 0.6091 0.4905 0.6091 0.7805
No log 48.4 484 0.6240 0.4815 0.6240 0.7900
No log 48.6 486 0.6232 0.4824 0.6232 0.7894
No log 48.8 488 0.6091 0.4306 0.6091 0.7805
No log 49.0 490 0.5863 0.4592 0.5863 0.7657
No log 49.2 492 0.5749 0.4592 0.5749 0.7582
No log 49.4 494 0.5771 0.4592 0.5771 0.7597
No log 49.6 496 0.5943 0.4592 0.5943 0.7709
No log 49.8 498 0.5961 0.4592 0.5961 0.7721
0.2325 50.0 500 0.6115 0.4502 0.6115 0.7820
0.2325 50.2 502 0.6696 0.5027 0.6696 0.8183
0.2325 50.4 504 0.7128 0.4912 0.7128 0.8443
0.2325 50.6 506 0.7295 0.4835 0.7295 0.8541
0.2325 50.8 508 0.6924 0.4275 0.6924 0.8321
0.2325 51.0 510 0.6549 0.4728 0.6549 0.8093
0.2325 51.2 512 0.6355 0.4731 0.6355 0.7972
0.2325 51.4 514 0.6085 0.4569 0.6085 0.7800
0.2325 51.6 516 0.5981 0.4035 0.5981 0.7734
0.2325 51.8 518 0.6013 0.4306 0.6013 0.7755
0.2325 52.0 520 0.5962 0.4306 0.5962 0.7721
0.2325 52.2 522 0.5969 0.4306 0.5969 0.7726
0.2325 52.4 524 0.5986 0.4306 0.5986 0.7737

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1