ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k5_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the auto-generated card lists it as None). It achieves the following results on the evaluation set:

  • Loss: 0.6188
  • QWK: 0.5620
  • MSE: 0.6188
  • RMSE: 0.7866
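
For context, RMSE is just the square root of MSE (√0.6188 ≈ 0.7866), and QWK is quadratic weighted kappa. Below is a minimal sketch of how these metrics are typically computed; the example labels are made up, and scikit-learn is an assumed dependency rather than something this card specifies:

```python
# Minimal metric sketch (hypothetical data; scikit-learn assumed, not taken
# from this repo). RMSE is the square root of MSE; QWK is Cohen's kappa
# with quadratic weights.
import math

from sklearn.metrics import cohen_kappa_score, mean_squared_error

print(round(math.sqrt(0.6188), 4))  # 0.7866, matches the reported RMSE

# QWK over integer-valued gold/predicted scores (made-up example):
y_true = [0, 1, 2, 2, 3, 4]
y_pred = [0, 1, 1, 2, 3, 3]
print(mean_squared_error(y_true, y_pred))                      # MSE
print(cohen_kappa_score(y_true, y_pred, weights="quadratic"))  # QWK
```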

Model description

More information needed

Intended uses & limitations

More information needed
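
Since usage is not documented, here is a hypothetical inference sketch. It assumes the checkpoint exposes a standard sequence-classification head and that, going by the model name, the task is scoring the organization trait of Arabic essays; both are assumptions, not facts from this card.

```python
# Hypothetical usage sketch: assumes a standard sequence-classification
# head. How to interpret the logits (ordinal score vs. class) depends on
# the undocumented training setup.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k5_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

# "Essay text goes here" (Arabic placeholder input)
inputs = tokenizer("ضع نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)
```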

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
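
These map directly onto Hugging Face TrainingArguments. A sketch under the assumption that the standard Trainer API was used (the actual training script is not part of this card, and output_dir is a placeholder):

```python
# Sketch of TrainingArguments matching the hyperparameters above
# (assumes the standard Hugging Face Trainer; the actual script is not
# included in this card).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="out",        # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,          # the Adam betas/epsilon listed above are
    adam_beta2=0.999,        # also the Trainer defaults
    adam_epsilon=1e-8,
)
```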

Training results

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.08 2 2.7179 -0.0729 2.7179 1.6486
No log 0.16 4 1.4797 0.1005 1.4797 1.2164
No log 0.24 6 0.7875 0.2703 0.7875 0.8874
No log 0.32 8 0.7507 0.2810 0.7507 0.8664
No log 0.4 10 1.0987 0.2589 1.0987 1.0482
No log 0.48 12 1.1800 0.2589 1.1800 1.0863
No log 0.56 14 0.8969 0.2411 0.8969 0.9471
No log 0.64 16 0.5773 0.4034 0.5773 0.7598
No log 0.72 18 0.7289 0.3597 0.7289 0.8538
No log 0.8 20 0.8464 0.3909 0.8464 0.9200
No log 0.88 22 0.6030 0.4671 0.6030 0.7765
No log 0.96 24 0.5298 0.5404 0.5298 0.7279
No log 1.04 26 0.6746 0.3630 0.6746 0.8214
No log 1.12 28 0.8613 0.3601 0.8613 0.9281
No log 1.2 30 0.6811 0.4369 0.6811 0.8253
No log 1.28 32 0.5724 0.5498 0.5724 0.7566
No log 1.36 34 0.6814 0.5700 0.6814 0.8255
No log 1.44 36 0.7273 0.5461 0.7273 0.8528
No log 1.52 38 0.6491 0.6008 0.6491 0.8057
No log 1.6 40 0.7007 0.5272 0.7007 0.8371
No log 1.68 42 0.7567 0.4817 0.7567 0.8699
No log 1.76 44 0.7494 0.5207 0.7494 0.8657
No log 1.84 46 0.5755 0.5505 0.5755 0.7586
No log 1.92 48 0.7779 0.4165 0.7779 0.8820
No log 2.0 50 1.0493 0.0898 1.0493 1.0244
No log 2.08 52 0.9548 0.1672 0.9548 0.9771
No log 2.16 54 0.6955 0.3991 0.6955 0.8340
No log 2.24 56 0.5341 0.4007 0.5341 0.7308
No log 2.32 58 0.5390 0.4953 0.5390 0.7342
No log 2.4 60 0.5372 0.5067 0.5372 0.7329
No log 2.48 62 0.5704 0.4951 0.5704 0.7553
No log 2.56 64 0.8843 0.3776 0.8843 0.9404
No log 2.64 66 1.2716 0.1928 1.2716 1.1277
No log 2.72 68 1.3871 -0.0272 1.3871 1.1778
No log 2.8 70 1.1922 -0.0703 1.1922 1.0919
No log 2.88 72 0.8123 0.3069 0.8123 0.9013
No log 2.96 74 0.4944 0.5123 0.4944 0.7032
No log 3.04 76 0.5895 0.5298 0.5895 0.7678
No log 3.12 78 0.6980 0.4332 0.6980 0.8355
No log 3.2 80 0.6072 0.5283 0.6072 0.7792
No log 3.28 82 0.5202 0.5388 0.5202 0.7212
No log 3.36 84 0.6425 0.4805 0.6425 0.8015
No log 3.44 86 0.9322 0.3316 0.9322 0.9655
No log 3.52 88 1.1122 0.2799 1.1122 1.0546
No log 3.6 90 1.1688 0.1688 1.1688 1.0811
No log 3.68 92 1.0176 0.2412 1.0176 1.0087
No log 3.76 94 0.7101 0.4777 0.7101 0.8427
No log 3.84 96 0.5495 0.4880 0.5495 0.7413
No log 3.92 98 0.4955 0.5840 0.4955 0.7039
No log 4.0 100 0.4943 0.6124 0.4943 0.7030
No log 4.08 102 0.5187 0.6127 0.5187 0.7202
No log 4.16 104 0.5462 0.6475 0.5462 0.7391
No log 4.24 106 0.6126 0.6354 0.6126 0.7827
No log 4.32 108 0.6593 0.5761 0.6593 0.8120
No log 4.4 110 0.6512 0.6035 0.6512 0.8070
No log 4.48 112 0.5371 0.5337 0.5371 0.7329
No log 4.56 114 0.5224 0.5184 0.5224 0.7228
No log 4.64 116 0.5671 0.4948 0.5671 0.7531
No log 4.72 118 0.6108 0.5673 0.6108 0.7815
No log 4.8 120 0.6256 0.5232 0.6256 0.7910
No log 4.88 122 0.6670 0.5905 0.6670 0.8167
No log 4.96 124 0.7268 0.4511 0.7268 0.8525
No log 5.04 126 0.7344 0.4183 0.7344 0.8570
No log 5.12 128 0.6181 0.4875 0.6181 0.7862
No log 5.2 130 0.5491 0.4914 0.5491 0.7410
No log 5.28 132 0.6589 0.5293 0.6589 0.8117
No log 5.36 134 0.7125 0.5181 0.7125 0.8441
No log 5.44 136 0.6873 0.5763 0.6873 0.8290
No log 5.52 138 0.6804 0.5802 0.6804 0.8249
No log 5.6 140 0.6789 0.5155 0.6789 0.8240
No log 5.68 142 0.6412 0.5326 0.6412 0.8007
No log 5.76 144 0.6329 0.5326 0.6329 0.7956
No log 5.84 146 0.6316 0.6063 0.6316 0.7948
No log 5.92 148 0.6606 0.6073 0.6606 0.8128
No log 6.0 150 0.6901 0.6411 0.6901 0.8307
No log 6.08 152 0.6717 0.5385 0.6717 0.8196
No log 6.16 154 0.6578 0.5572 0.6578 0.8111
No log 6.24 156 0.6922 0.5215 0.6922 0.8320
No log 6.32 158 0.6512 0.5222 0.6512 0.8070
No log 6.4 160 0.6514 0.5147 0.6514 0.8071
No log 6.48 162 0.7965 0.4853 0.7965 0.8925
No log 6.56 164 0.9330 0.4886 0.9330 0.9659
No log 6.64 166 0.9034 0.5060 0.9034 0.9505
No log 6.72 168 0.7366 0.5505 0.7366 0.8582
No log 6.8 170 0.6333 0.5671 0.6333 0.7958
No log 6.88 172 0.6199 0.5783 0.6199 0.7874
No log 6.96 174 0.5647 0.5853 0.5647 0.7514
No log 7.04 176 0.5483 0.5306 0.5483 0.7405
No log 7.12 178 0.5838 0.5373 0.5838 0.7641
No log 7.2 180 0.6524 0.4777 0.6524 0.8077
No log 7.28 182 0.6801 0.4328 0.6801 0.8247
No log 7.36 184 0.6294 0.4400 0.6294 0.7934
No log 7.44 186 0.5516 0.5045 0.5516 0.7427
No log 7.52 188 0.5450 0.4703 0.5450 0.7382
No log 7.6 190 0.5971 0.6051 0.5971 0.7727
No log 7.68 192 0.6671 0.5763 0.6671 0.8167
No log 7.76 194 0.7363 0.5372 0.7363 0.8581
No log 7.84 196 0.8414 0.5170 0.8414 0.9173
No log 7.92 198 0.8459 0.4775 0.8459 0.9198
No log 8.0 200 0.7465 0.5034 0.7465 0.8640
No log 8.08 202 0.6385 0.4541 0.6385 0.7991
No log 8.16 204 0.6008 0.3372 0.6008 0.7751
No log 8.24 206 0.6015 0.2652 0.6015 0.7755
No log 8.32 208 0.6024 0.3167 0.6024 0.7762
No log 8.4 210 0.6087 0.3894 0.6087 0.7802
No log 8.48 212 0.6112 0.5697 0.6112 0.7818
No log 8.56 214 0.6373 0.5824 0.6373 0.7983
No log 8.64 216 0.6501 0.5913 0.6501 0.8063
No log 8.72 218 0.6729 0.5843 0.6729 0.8203
No log 8.8 220 0.6859 0.5463 0.6859 0.8282
No log 8.88 222 0.5816 0.5285 0.5816 0.7626
No log 8.96 224 0.5372 0.5345 0.5372 0.7329
No log 9.04 226 0.5643 0.4749 0.5643 0.7512
No log 9.12 228 0.5924 0.4602 0.5924 0.7697
No log 9.2 230 0.6682 0.4705 0.6682 0.8175
No log 9.28 232 0.8666 0.4639 0.8666 0.9309
No log 9.36 234 0.9870 0.4178 0.9870 0.9935
No log 9.44 236 0.8951 0.4465 0.8951 0.9461
No log 9.52 238 0.7269 0.4910 0.7269 0.8526
No log 9.6 240 0.6265 0.5865 0.6265 0.7915
No log 9.68 242 0.5858 0.4704 0.5858 0.7654
No log 9.76 244 0.5924 0.4684 0.5924 0.7697
No log 9.84 246 0.6551 0.5072 0.6551 0.8094
No log 9.92 248 0.7675 0.5310 0.7675 0.8761
No log 10.0 250 0.8163 0.5164 0.8163 0.9035
No log 10.08 252 0.7807 0.6006 0.7807 0.8836
No log 10.16 254 0.7285 0.6225 0.7285 0.8535
No log 10.24 256 0.6499 0.5584 0.6499 0.8062
No log 10.32 258 0.6590 0.5009 0.6590 0.8118
No log 10.4 260 0.7488 0.3579 0.7488 0.8653
No log 10.48 262 0.7253 0.4308 0.7253 0.8516
No log 10.56 264 0.6588 0.5429 0.6588 0.8117
No log 10.64 266 0.6422 0.5663 0.6422 0.8014
No log 10.72 268 0.6184 0.5915 0.6184 0.7864
No log 10.8 270 0.6235 0.6088 0.6235 0.7896
No log 10.88 272 0.6412 0.5675 0.6412 0.8008
No log 10.96 274 0.6397 0.5675 0.6397 0.7998
No log 11.04 276 0.6764 0.5663 0.6764 0.8224
No log 11.12 278 0.6620 0.5663 0.6620 0.8136
No log 11.2 280 0.5982 0.4777 0.5982 0.7734
No log 11.28 282 0.5582 0.4576 0.5582 0.7471
No log 11.36 284 0.5600 0.4812 0.5600 0.7484
No log 11.44 286 0.5951 0.5423 0.5951 0.7714
No log 11.52 288 0.6402 0.5981 0.6402 0.8001
No log 11.6 290 0.6735 0.5629 0.6735 0.8207
No log 11.68 292 0.6678 0.5640 0.6678 0.8172
No log 11.76 294 0.6398 0.5775 0.6398 0.7999
No log 11.84 296 0.6961 0.6000 0.6961 0.8343
No log 11.92 298 0.8290 0.5335 0.8290 0.9105
No log 12.0 300 0.8748 0.4106 0.8748 0.9353
No log 12.08 302 0.8237 0.3804 0.8237 0.9076
No log 12.16 304 0.6987 0.3938 0.6987 0.8359
No log 12.24 306 0.5932 0.4167 0.5932 0.7702
No log 12.32 308 0.5374 0.4855 0.5374 0.7331
No log 12.4 310 0.5384 0.5341 0.5384 0.7337
No log 12.48 312 0.6042 0.5724 0.6042 0.7773
No log 12.56 314 0.6417 0.5724 0.6417 0.8011
No log 12.64 316 0.6670 0.5787 0.6670 0.8167
No log 12.72 318 0.7175 0.5846 0.7175 0.8470
No log 12.8 320 0.7503 0.5755 0.7503 0.8662
No log 12.88 322 0.7294 0.6066 0.7294 0.8540
No log 12.96 324 0.6273 0.5738 0.6273 0.7920
No log 13.04 326 0.5784 0.4597 0.5784 0.7605
No log 13.12 328 0.5904 0.4052 0.5904 0.7684
No log 13.2 330 0.6363 0.4562 0.6363 0.7977
No log 13.28 332 0.6821 0.5294 0.6821 0.8259
No log 13.36 334 0.7243 0.5776 0.7243 0.8510
No log 13.44 336 0.6716 0.6246 0.6716 0.8195
No log 13.52 338 0.6012 0.5712 0.6012 0.7754
No log 13.6 340 0.5870 0.5559 0.5870 0.7662
No log 13.68 342 0.5702 0.5242 0.5702 0.7551
No log 13.76 344 0.5709 0.5242 0.5709 0.7556
No log 13.84 346 0.6008 0.5774 0.6008 0.7751
No log 13.92 348 0.6081 0.5692 0.6081 0.7798
No log 14.0 350 0.6095 0.5661 0.6095 0.7807
No log 14.08 352 0.5929 0.5661 0.5929 0.7700
No log 14.16 354 0.5463 0.4576 0.5463 0.7391
No log 14.24 356 0.5264 0.5071 0.5264 0.7255
No log 14.32 358 0.5358 0.4782 0.5358 0.7320
No log 14.4 360 0.5518 0.4782 0.5518 0.7428
No log 14.48 362 0.5646 0.5071 0.5646 0.7514
No log 14.56 364 0.5727 0.5133 0.5727 0.7568
No log 14.64 366 0.5653 0.5133 0.5653 0.7518
No log 14.72 368 0.5764 0.4901 0.5764 0.7592
No log 14.8 370 0.5839 0.5632 0.5839 0.7641
No log 14.88 372 0.5592 0.5559 0.5592 0.7478
No log 14.96 374 0.5283 0.5517 0.5283 0.7268
No log 15.04 376 0.5306 0.5517 0.5306 0.7284
No log 15.12 378 0.5602 0.5859 0.5602 0.7485
No log 15.2 380 0.6224 0.5845 0.6224 0.7889
No log 15.28 382 0.6671 0.6044 0.6671 0.8168
No log 15.36 384 0.6259 0.6056 0.6259 0.7911
No log 15.44 386 0.5716 0.6063 0.5716 0.7561
No log 15.52 388 0.5489 0.5597 0.5489 0.7409
No log 15.6 390 0.5204 0.5167 0.5204 0.7214
No log 15.68 392 0.5059 0.5057 0.5059 0.7112
No log 15.76 394 0.5135 0.5057 0.5135 0.7166
No log 15.84 396 0.5143 0.5039 0.5143 0.7172
No log 15.92 398 0.5490 0.5201 0.5490 0.7409
No log 16.0 400 0.5865 0.5726 0.5865 0.7658
No log 16.08 402 0.5992 0.6201 0.5992 0.7741
No log 16.16 404 0.5907 0.6214 0.5907 0.7686
No log 16.24 406 0.5881 0.5426 0.5881 0.7669
No log 16.32 408 0.5593 0.5373 0.5593 0.7479
No log 16.4 410 0.5295 0.5289 0.5295 0.7277
No log 16.48 412 0.5155 0.5289 0.5155 0.7180
No log 16.56 414 0.5123 0.5071 0.5123 0.7158
No log 16.64 416 0.5244 0.5272 0.5244 0.7242
No log 16.72 418 0.5575 0.5368 0.5575 0.7466
No log 16.8 420 0.5794 0.5696 0.5794 0.7612
No log 16.88 422 0.5833 0.5884 0.5833 0.7638
No log 16.96 424 0.5722 0.5433 0.5722 0.7564
No log 17.04 426 0.5558 0.4795 0.5558 0.7455
No log 17.12 428 0.5497 0.4848 0.5497 0.7414
No log 17.2 430 0.5550 0.4795 0.5550 0.7450
No log 17.28 432 0.5486 0.4848 0.5486 0.7407
No log 17.36 434 0.5466 0.5222 0.5466 0.7393
No log 17.44 436 0.5521 0.5222 0.5521 0.7430
No log 17.52 438 0.5622 0.5884 0.5622 0.7498
No log 17.6 440 0.5605 0.5884 0.5605 0.7487
No log 17.68 442 0.5658 0.5895 0.5658 0.7522
No log 17.76 444 0.5630 0.5871 0.5630 0.7504
No log 17.84 446 0.5408 0.5772 0.5408 0.7354
No log 17.92 448 0.5292 0.5306 0.5292 0.7274
No log 18.0 450 0.5364 0.5949 0.5364 0.7324
No log 18.08 452 0.5726 0.5904 0.5726 0.7567
No log 18.16 454 0.5900 0.5854 0.5900 0.7681
No log 18.24 456 0.5765 0.5526 0.5765 0.7592
No log 18.32 458 0.5829 0.5592 0.5829 0.7635
No log 18.4 460 0.5866 0.6267 0.5866 0.7659
No log 18.48 462 0.5835 0.6178 0.5835 0.7639
No log 18.56 464 0.5702 0.5801 0.5702 0.7551
No log 18.64 466 0.5525 0.5388 0.5525 0.7433
No log 18.72 468 0.5554 0.4795 0.5554 0.7453
No log 18.8 470 0.5773 0.5008 0.5773 0.7598
No log 18.88 472 0.5755 0.5111 0.5755 0.7586
No log 18.96 474 0.5681 0.5190 0.5681 0.7537
No log 19.04 476 0.5873 0.5148 0.5873 0.7663
No log 19.12 478 0.6427 0.5598 0.6427 0.8017
No log 19.2 480 0.6870 0.5520 0.6870 0.8288
No log 19.28 482 0.6993 0.5481 0.6993 0.8362
No log 19.36 484 0.6554 0.5439 0.6554 0.8096
No log 19.44 486 0.5937 0.4534 0.5937 0.7705
No log 19.52 488 0.5585 0.4534 0.5585 0.7473
No log 19.6 490 0.5427 0.4768 0.5427 0.7367
No log 19.68 492 0.5374 0.4701 0.5374 0.7331
No log 19.76 494 0.5414 0.4300 0.5414 0.7358
No log 19.84 496 0.5581 0.5201 0.5581 0.7470
No log 19.92 498 0.5745 0.5324 0.5745 0.7580
0.3106 20.0 500 0.5864 0.5652 0.5864 0.7658
0.3106 20.08 502 0.5898 0.5333 0.5898 0.7680
0.3106 20.16 504 0.6028 0.5190 0.6028 0.7764
0.3106 20.24 506 0.6018 0.5162 0.6018 0.7758
0.3106 20.32 508 0.6192 0.5333 0.6192 0.7869
0.3106 20.4 510 0.6188 0.5620 0.6188 0.7866

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
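
A quick way to check that a local environment matches these versions (a convenience sketch, not part of the original card):

```python
# Print installed versions to compare against the list above.
import datasets
import tokenizers
import torch
import transformers

print(transformers.__version__)  # expected: 4.44.2
print(torch.__version__)         # expected: 2.4.0+cu118
print(datasets.__version__)      # expected: 2.21.0
print(tokenizers.__version__)    # expected: 0.19.1
```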
Model size: 0.1B params (Safetensors, tensor type F32)

Model tree: MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k5_task7_organization is fine-tuned from aubmindlab/bert-base-arabertv02.