ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k2_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7687
  • Qwk: 0.5259
  • Mse: 0.7687
  • Rmse: 0.8768
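
Qwk here is Cohen's kappa with quadratic weights, the standard agreement metric for ordinal essay scores, and Rmse is simply the square root of Mse. As a sketch of how such numbers are computed (the labels below are hypothetical, on an assumed 0–3 organization scale; they are not from this model's evaluation set):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold scores and model predictions on a 0-3 scale.
y_true = np.array([0, 1, 2, 3, 2, 1])
y_pred = np.array([0, 2, 2, 3, 1, 1])

# Qwk: Cohen's kappa with quadratic weights.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

# Mse and Rmse as reported above.
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
```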

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
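
As a rough sketch, the list above maps onto Hugging Face `TrainingArguments` as follows (`output_dir` is a placeholder; the Adam betas and epsilon listed are the library defaults, so they need no explicit arguments):

```python
from transformers import TrainingArguments

# Config sketch mirroring the hyperparameters above.
args = TrainingArguments(
    output_dir="out",                 # placeholder, not from the original run
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 is the default optimizer.
)
```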

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.1667 2 4.5530 0.0018 4.5530 2.1338
No log 0.3333 4 2.4686 0.0672 2.4686 1.5712
No log 0.5 6 1.7900 0.0844 1.7900 1.3379
No log 0.6667 8 1.1826 0.1370 1.1826 1.0875
No log 0.8333 10 1.1721 0.1585 1.1721 1.0826
No log 1.0 12 1.1640 0.2448 1.1640 1.0789
No log 1.1667 14 1.1071 0.2843 1.1071 1.0522
No log 1.3333 16 1.0810 0.2694 1.0810 1.0397
No log 1.5 18 1.5072 0.1603 1.5072 1.2277
No log 1.6667 20 1.6234 0.2243 1.6234 1.2741
No log 1.8333 22 1.2415 0.2304 1.2415 1.1142
No log 2.0 24 1.1132 0.1811 1.1132 1.0551
No log 2.1667 26 1.1144 0.1992 1.1144 1.0557
No log 2.3333 28 1.1988 0.2247 1.1988 1.0949
No log 2.5 30 1.2442 0.2821 1.2442 1.1154
No log 2.6667 32 1.2466 0.4140 1.2466 1.1165
No log 2.8333 34 1.0866 0.4094 1.0866 1.0424
No log 3.0 36 1.2535 0.3734 1.2535 1.1196
No log 3.1667 38 1.2222 0.3912 1.2222 1.1055
No log 3.3333 40 0.9568 0.5200 0.9568 0.9781
No log 3.5 42 0.9871 0.5034 0.9871 0.9935
No log 3.6667 44 1.4433 0.3271 1.4433 1.2014
No log 3.8333 46 1.1213 0.3992 1.1213 1.0589
No log 4.0 48 0.8256 0.5397 0.8256 0.9086
No log 4.1667 50 0.8937 0.5357 0.8937 0.9453
No log 4.3333 52 0.8723 0.5634 0.8723 0.9340
No log 4.5 54 1.0182 0.3763 1.0182 1.0091
No log 4.6667 56 1.1373 0.3449 1.1373 1.0665
No log 4.8333 58 0.9808 0.3714 0.9808 0.9904
No log 5.0 60 0.8560 0.5320 0.8560 0.9252
No log 5.1667 62 0.8289 0.5895 0.8289 0.9104
No log 5.3333 64 0.8378 0.6204 0.8378 0.9153
No log 5.5 66 0.8303 0.6168 0.8303 0.9112
No log 5.6667 68 0.9138 0.5471 0.9138 0.9559
No log 5.8333 70 0.9608 0.5296 0.9608 0.9802
No log 6.0 72 0.9785 0.5471 0.9785 0.9892
No log 6.1667 74 0.8401 0.5794 0.8401 0.9166
No log 6.3333 76 0.8317 0.6001 0.8317 0.9120
No log 6.5 78 0.8318 0.5827 0.8318 0.9120
No log 6.6667 80 0.8195 0.5546 0.8195 0.9053
No log 6.8333 82 0.9137 0.5486 0.9137 0.9559
No log 7.0 84 0.7836 0.5497 0.7836 0.8852
No log 7.1667 86 0.7776 0.6097 0.7776 0.8818
No log 7.3333 88 0.7555 0.6239 0.7555 0.8692
No log 7.5 90 0.7408 0.6239 0.7408 0.8607
No log 7.6667 92 0.7284 0.6239 0.7284 0.8535
No log 7.8333 94 0.7638 0.5825 0.7638 0.8740
No log 8.0 96 0.7645 0.5546 0.7645 0.8744
No log 8.1667 98 0.7178 0.6525 0.7178 0.8472
No log 8.3333 100 0.7438 0.6141 0.7438 0.8625
No log 8.5 102 0.7696 0.6151 0.7696 0.8773
No log 8.6667 104 0.8044 0.5287 0.8044 0.8969
No log 8.8333 106 0.7386 0.6172 0.7386 0.8594
No log 9.0 108 0.7624 0.5766 0.7624 0.8732
No log 9.1667 110 0.9735 0.5777 0.9735 0.9867
No log 9.3333 112 0.8539 0.5860 0.8539 0.9241
No log 9.5 114 0.7642 0.6605 0.7642 0.8742
No log 9.6667 116 0.9461 0.5258 0.9461 0.9727
No log 9.8333 118 0.8830 0.5977 0.8830 0.9397
No log 10.0 120 0.7944 0.6026 0.7944 0.8913
No log 10.1667 122 0.7684 0.6029 0.7684 0.8766
No log 10.3333 124 0.7961 0.5339 0.7961 0.8922
No log 10.5 126 0.7620 0.6177 0.7620 0.8729
No log 10.6667 128 0.7413 0.5102 0.7413 0.8610
No log 10.8333 130 0.7858 0.5248 0.7858 0.8864
No log 11.0 132 0.7757 0.5886 0.7757 0.8807
No log 11.1667 134 0.7246 0.5530 0.7246 0.8512
No log 11.3333 136 0.7330 0.5832 0.7330 0.8561
No log 11.5 138 0.8812 0.5139 0.8812 0.9387
No log 11.6667 140 0.7552 0.5788 0.7552 0.8690
No log 11.8333 142 0.6629 0.5821 0.6629 0.8142
No log 12.0 144 0.6833 0.6029 0.6833 0.8266
No log 12.1667 146 0.6789 0.5961 0.6789 0.8239
No log 12.3333 148 0.6851 0.5582 0.6851 0.8277
No log 12.5 150 0.7015 0.6517 0.7015 0.8375
No log 12.6667 152 0.6944 0.6417 0.6944 0.8333
No log 12.8333 154 0.8199 0.4962 0.8199 0.9055
No log 13.0 156 0.9404 0.4894 0.9404 0.9698
No log 13.1667 158 0.8200 0.4949 0.8200 0.9055
No log 13.3333 160 0.7067 0.6237 0.7067 0.8406
No log 13.5 162 0.7963 0.6101 0.7963 0.8923
No log 13.6667 164 0.8762 0.5297 0.8762 0.9360
No log 13.8333 166 0.7821 0.6049 0.7821 0.8844
No log 14.0 168 0.7056 0.6151 0.7056 0.8400
No log 14.1667 170 0.7170 0.5546 0.7170 0.8468
No log 14.3333 172 0.7225 0.5358 0.7225 0.8500
No log 14.5 174 0.6911 0.6117 0.6911 0.8313
No log 14.6667 176 0.6990 0.6117 0.6990 0.8361
No log 14.8333 178 0.7018 0.6487 0.7018 0.8377
No log 15.0 180 0.7465 0.6109 0.7465 0.8640
No log 15.1667 182 0.7944 0.5736 0.7944 0.8913
No log 15.3333 184 0.7707 0.5833 0.7707 0.8779
No log 15.5 186 0.7427 0.5988 0.7427 0.8618
No log 15.6667 188 0.7504 0.5750 0.7504 0.8663
No log 15.8333 190 0.7613 0.6211 0.7613 0.8725
No log 16.0 192 0.7926 0.6047 0.7926 0.8903
No log 16.1667 194 0.8033 0.6069 0.8033 0.8963
No log 16.3333 196 0.7559 0.6131 0.7559 0.8694
No log 16.5 198 0.7296 0.6051 0.7296 0.8542
No log 16.6667 200 0.7601 0.5712 0.7601 0.8718
No log 16.8333 202 0.7463 0.5327 0.7463 0.8639
No log 17.0 204 0.7327 0.5607 0.7327 0.8560
No log 17.1667 206 0.7258 0.5856 0.7258 0.8520
No log 17.3333 208 0.7106 0.6183 0.7106 0.8430
No log 17.5 210 0.7071 0.5776 0.7071 0.8409
No log 17.6667 212 0.7517 0.5117 0.7517 0.8670
No log 17.8333 214 0.7589 0.5228 0.7589 0.8712
No log 18.0 216 0.7202 0.6139 0.7202 0.8486
No log 18.1667 218 0.7395 0.5862 0.7395 0.8599
No log 18.3333 220 0.7525 0.6196 0.7525 0.8674
No log 18.5 222 0.7450 0.5443 0.7450 0.8632
No log 18.6667 224 0.7489 0.5443 0.7489 0.8654
No log 18.8333 226 0.7503 0.6554 0.7503 0.8662
No log 19.0 228 0.7678 0.6489 0.7678 0.8763
No log 19.1667 230 0.8037 0.6091 0.8037 0.8965
No log 19.3333 232 0.7613 0.6334 0.7613 0.8725
No log 19.5 234 0.7209 0.6404 0.7209 0.8490
No log 19.6667 236 0.6930 0.6380 0.6930 0.8325
No log 19.8333 238 0.7697 0.5919 0.7697 0.8773
No log 20.0 240 0.7874 0.5912 0.7874 0.8874
No log 20.1667 242 0.7183 0.6239 0.7183 0.8475
No log 20.3333 244 0.7092 0.6664 0.7092 0.8421
No log 20.5 246 0.7052 0.6773 0.7052 0.8398
No log 20.6667 248 0.7073 0.6664 0.7073 0.8410
No log 20.8333 250 0.7265 0.6141 0.7265 0.8524
No log 21.0 252 0.7390 0.5513 0.7390 0.8596
No log 21.1667 254 0.7567 0.4736 0.7567 0.8699
No log 21.3333 256 0.7614 0.4736 0.7614 0.8726
No log 21.5 258 0.7373 0.5884 0.7373 0.8586
No log 21.6667 260 0.7354 0.5063 0.7354 0.8576
No log 21.8333 262 0.8066 0.4838 0.8066 0.8981
No log 22.0 264 0.8051 0.5117 0.8051 0.8972
No log 22.1667 266 0.7209 0.5437 0.7209 0.8490
No log 22.3333 268 0.7394 0.6758 0.7394 0.8599
No log 22.5 270 0.7855 0.6382 0.7855 0.8863
No log 22.6667 272 0.7716 0.6417 0.7716 0.8784
No log 22.8333 274 0.7212 0.5937 0.7212 0.8493
No log 23.0 276 0.7165 0.6010 0.7165 0.8465
No log 23.1667 278 0.7282 0.5439 0.7282 0.8533
No log 23.3333 280 0.7683 0.5622 0.7683 0.8765
No log 23.5 282 0.7652 0.5336 0.7652 0.8747
No log 23.6667 284 0.7557 0.5307 0.7557 0.8693
No log 23.8333 286 0.7943 0.5393 0.7943 0.8912
No log 24.0 288 0.8430 0.5352 0.8430 0.9182
No log 24.1667 290 0.7886 0.5477 0.7886 0.8881
No log 24.3333 292 0.7604 0.5645 0.7604 0.8720
No log 24.5 294 0.8809 0.5534 0.8809 0.9386
No log 24.6667 296 0.9962 0.5094 0.9962 0.9981
No log 24.8333 298 0.9803 0.4894 0.9803 0.9901
No log 25.0 300 0.8844 0.4345 0.8844 0.9404
No log 25.1667 302 0.8108 0.4937 0.8108 0.9004
No log 25.3333 304 0.7632 0.5528 0.7632 0.8736
No log 25.5 306 0.7490 0.5895 0.7490 0.8654
No log 25.6667 308 0.7729 0.6001 0.7729 0.8792
No log 25.8333 310 0.8051 0.5934 0.8051 0.8973
No log 26.0 312 0.7764 0.6100 0.7764 0.8812
No log 26.1667 314 0.7587 0.5971 0.7587 0.8710
No log 26.3333 316 0.7665 0.5131 0.7665 0.8755
No log 26.5 318 0.7895 0.5841 0.7895 0.8886
No log 26.6667 320 0.7959 0.6005 0.7959 0.8921
No log 26.8333 322 0.7504 0.6269 0.7504 0.8663
No log 27.0 324 0.7251 0.6461 0.7251 0.8516
No log 27.1667 326 0.7105 0.6461 0.7105 0.8429
No log 27.3333 328 0.6992 0.6393 0.6992 0.8362
No log 27.5 330 0.6966 0.5946 0.6966 0.8347
No log 27.6667 332 0.6952 0.6086 0.6952 0.8338
No log 27.8333 334 0.6999 0.6664 0.6999 0.8366
No log 28.0 336 0.7080 0.6377 0.7080 0.8414
No log 28.1667 338 0.7058 0.6340 0.7058 0.8401
No log 28.3333 340 0.7045 0.6448 0.7045 0.8394
No log 28.5 342 0.7101 0.6806 0.7101 0.8427
No log 28.6667 344 0.6902 0.6526 0.6902 0.8308
No log 28.8333 346 0.6895 0.6657 0.6895 0.8304
No log 29.0 348 0.6936 0.6657 0.6936 0.8328
No log 29.1667 350 0.6701 0.6860 0.6701 0.8186
No log 29.3333 352 0.7473 0.5763 0.7473 0.8645
No log 29.5 354 0.8170 0.5048 0.8170 0.9039
No log 29.6667 356 0.7749 0.5618 0.7749 0.8803
No log 29.8333 358 0.6829 0.6232 0.6829 0.8264
No log 30.0 360 0.6769 0.6902 0.6769 0.8228
No log 30.1667 362 0.7518 0.6446 0.7518 0.8670
No log 30.3333 364 0.7965 0.6140 0.7965 0.8925
No log 30.5 366 0.7824 0.6162 0.7824 0.8845
No log 30.6667 368 0.7443 0.6120 0.7443 0.8627
No log 30.8333 370 0.7392 0.5466 0.7392 0.8598
No log 31.0 372 0.7538 0.5361 0.7538 0.8682
No log 31.1667 374 0.7470 0.5451 0.7470 0.8643
No log 31.3333 376 0.7462 0.6044 0.7462 0.8638
No log 31.5 378 0.8341 0.5658 0.8340 0.9133
No log 31.6667 380 0.9309 0.5750 0.9309 0.9648
No log 31.8333 382 0.9555 0.5649 0.9555 0.9775
No log 32.0 384 0.9000 0.5532 0.9000 0.9487
No log 32.1667 386 0.8215 0.5539 0.8215 0.9063
No log 32.3333 388 0.7628 0.5530 0.7628 0.8734
No log 32.5 390 0.7672 0.6012 0.7672 0.8759
No log 32.6667 392 0.7882 0.5734 0.7882 0.8878
No log 32.8333 394 0.7611 0.6035 0.7611 0.8724
No log 33.0 396 0.7267 0.5299 0.7267 0.8525
No log 33.1667 398 0.7276 0.5483 0.7276 0.8530
No log 33.3333 400 0.7484 0.4948 0.7484 0.8651
No log 33.5 402 0.7725 0.5029 0.7725 0.8789
No log 33.6667 404 0.7787 0.5107 0.7787 0.8824
No log 33.8333 406 0.7857 0.5869 0.7857 0.8864
No log 34.0 408 0.7719 0.5779 0.7719 0.8786
No log 34.1667 410 0.7398 0.6230 0.7398 0.8601
No log 34.3333 412 0.7388 0.6067 0.7388 0.8595
No log 34.5 414 0.7347 0.6148 0.7347 0.8572
No log 34.6667 416 0.7273 0.6211 0.7273 0.8528
No log 34.8333 418 0.7342 0.5886 0.7342 0.8569
No log 35.0 420 0.7443 0.5720 0.7443 0.8627
No log 35.1667 422 0.7729 0.5927 0.7729 0.8791
No log 35.3333 424 0.7785 0.5380 0.7785 0.8823
No log 35.5 426 0.7534 0.5057 0.7534 0.8680
No log 35.6667 428 0.7428 0.5749 0.7428 0.8619
No log 35.8333 430 0.7289 0.5886 0.7289 0.8538
No log 36.0 432 0.7396 0.6088 0.7396 0.8600
No log 36.1667 434 0.7383 0.6098 0.7383 0.8593
No log 36.3333 436 0.7283 0.5543 0.7283 0.8534
No log 36.5 438 0.7212 0.5543 0.7212 0.8492
No log 36.6667 440 0.7201 0.6189 0.7201 0.8486
No log 36.8333 442 0.7256 0.6189 0.7256 0.8518
No log 37.0 444 0.7304 0.6189 0.7304 0.8546
No log 37.1667 446 0.7377 0.6098 0.7377 0.8589
No log 37.3333 448 0.7406 0.6066 0.7406 0.8606
No log 37.5 450 0.7318 0.6202 0.7318 0.8555
No log 37.6667 452 0.7204 0.6590 0.7204 0.8488
No log 37.8333 454 0.7034 0.6078 0.7034 0.8387
No log 38.0 456 0.7102 0.6279 0.7102 0.8427
No log 38.1667 458 0.7290 0.6088 0.7290 0.8538
No log 38.3333 460 0.7444 0.5886 0.7444 0.8628
No log 38.5 462 0.7489 0.5720 0.7489 0.8654
No log 38.6667 464 0.7435 0.5659 0.7435 0.8622
No log 38.8333 466 0.7370 0.5797 0.7370 0.8585
No log 39.0 468 0.7270 0.5621 0.7270 0.8526
No log 39.1667 470 0.7200 0.5621 0.7200 0.8485
No log 39.3333 472 0.7088 0.5966 0.7088 0.8419
No log 39.5 474 0.7042 0.6130 0.7042 0.8392
No log 39.6667 476 0.7067 0.5797 0.7067 0.8406
No log 39.8333 478 0.7105 0.6189 0.7105 0.8429
No log 40.0 480 0.7101 0.6015 0.7101 0.8427
No log 40.1667 482 0.7216 0.5835 0.7216 0.8495
No log 40.3333 484 0.7336 0.6015 0.7336 0.8565
No log 40.5 486 0.7558 0.5424 0.7558 0.8693
No log 40.6667 488 0.7691 0.5401 0.7691 0.8770
No log 40.8333 490 0.7712 0.5725 0.7712 0.8782
No log 41.0 492 0.7719 0.5860 0.7719 0.8786
No log 41.1667 494 0.7800 0.5646 0.7800 0.8832
No log 41.3333 496 0.7790 0.6184 0.7790 0.8826
No log 41.5 498 0.7717 0.6184 0.7717 0.8785
0.2373 41.6667 500 0.7587 0.5660 0.7587 0.8710
0.2373 41.8333 502 0.7509 0.5660 0.7509 0.8665
0.2373 42.0 504 0.7512 0.5547 0.7512 0.8667
0.2373 42.1667 506 0.7518 0.5451 0.7518 0.8671
0.2373 42.3333 508 0.7527 0.5380 0.7527 0.8676
0.2373 42.5 510 0.7687 0.5259 0.7687 0.8768

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B parameters (Safetensors, F32)
