ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k2_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4952
  • Qwk (quadratic weighted kappa): 0.6602
  • Mse (mean squared error): 0.4952
  • Rmse (root mean squared error): 0.7037
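Qwk here is Cohen's kappa with quadratic weights, the usual agreement metric for ordinal essay scores, and Rmse is simply the square root of Mse (note Loss equals Mse, consistent with an MSE training objective). The card does not show the evaluation code, so the following pure-Python sketch of the metric is an illustration, not the exact implementation used:

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic disagreement weights."""
    n = len(y_true)
    # Observed confusion matrix of true vs. predicted scores
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Expected matrix under chance agreement (outer product of marginals)
    row, col = Counter(y_true), Counter(y_pred)
    expected = [[row[i] * col[j] / n for j in range(n_classes)]
                for i in range(n_classes)]
    # Quadratic penalty grows with the squared distance between scores
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            num += w * observed[i][j]
            den += w * expected[i][j]
    return 1.0 - num / den

# RMSE is the square root of MSE, matching the numbers above:
assert abs(math.sqrt(0.4952) - 0.7037) < 1e-3
```

Perfect agreement yields a kappa of 1.0, chance-level agreement yields 0, and systematic disagreement goes negative.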

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.2 2 3.9981 0.0024 3.9981 1.9995
No log 0.4 4 2.0113 0.0827 2.0113 1.4182
No log 0.6 6 1.0878 0.2537 1.0878 1.0430
No log 0.8 8 0.9545 0.3348 0.9545 0.9770
No log 1.0 10 1.0926 0.2391 1.0926 1.0453
No log 1.2 12 1.0848 0.2465 1.0848 1.0415
No log 1.4 14 1.2718 0.0340 1.2718 1.1278
No log 1.6 16 1.1213 0.1460 1.1213 1.0589
No log 1.8 18 0.7633 0.3856 0.7633 0.8737
No log 2.0 20 0.8444 0.3862 0.8444 0.9189
No log 2.2 22 0.8637 0.4439 0.8637 0.9294
No log 2.4 24 0.6453 0.5700 0.6453 0.8033
No log 2.6 26 0.8974 0.5259 0.8974 0.9473
No log 2.8 28 0.9102 0.5261 0.9102 0.9540
No log 3.0 30 0.6242 0.6058 0.6242 0.7901
No log 3.2 32 0.6226 0.6468 0.6226 0.7890
No log 3.4 34 0.6300 0.6038 0.6300 0.7937
No log 3.6 36 0.8775 0.6005 0.8775 0.9367
No log 3.8 38 0.7548 0.6246 0.7548 0.8688
No log 4.0 40 0.6476 0.6606 0.6476 0.8047
No log 4.2 42 0.5787 0.6044 0.5787 0.7607
No log 4.4 44 0.6637 0.6925 0.6637 0.8147
No log 4.6 46 0.8940 0.6190 0.8940 0.9455
No log 4.8 48 0.9154 0.6271 0.9154 0.9568
No log 5.0 50 0.6208 0.6900 0.6208 0.7879
No log 5.2 52 0.5513 0.7237 0.5513 0.7425
No log 5.4 54 0.6791 0.6369 0.6791 0.8241
No log 5.6 56 1.0109 0.5032 1.0109 1.0054
No log 5.8 58 0.9554 0.5027 0.9554 0.9774
No log 6.0 60 0.5591 0.6080 0.5591 0.7477
No log 6.2 62 0.5082 0.6796 0.5082 0.7129
No log 6.4 64 0.5891 0.7178 0.5891 0.7675
No log 6.6 66 0.7421 0.6620 0.7421 0.8615
No log 6.8 68 0.6783 0.6895 0.6783 0.8236
No log 7.0 70 0.5783 0.7115 0.5783 0.7604
No log 7.2 72 0.5543 0.7219 0.5543 0.7445
No log 7.4 74 0.7083 0.6561 0.7083 0.8416
No log 7.6 76 0.8842 0.5699 0.8842 0.9403
No log 7.8 78 0.7557 0.6458 0.7557 0.8693
No log 8.0 80 0.7231 0.6610 0.7231 0.8503
No log 8.2 82 0.7719 0.6319 0.7719 0.8786
No log 8.4 84 0.5924 0.6892 0.5924 0.7697
No log 8.6 86 0.5610 0.6979 0.5610 0.7490
No log 8.8 88 0.6274 0.6762 0.6274 0.7921
No log 9.0 90 0.6204 0.6851 0.6204 0.7877
No log 9.2 92 0.6473 0.6683 0.6473 0.8045
No log 9.4 94 0.5259 0.6812 0.5259 0.7252
No log 9.6 96 0.5069 0.7246 0.5069 0.7120
No log 9.8 98 0.4964 0.7246 0.4964 0.7046
No log 10.0 100 0.5116 0.6804 0.5116 0.7153
No log 10.2 102 0.6532 0.6532 0.6532 0.8082
No log 10.4 104 0.7345 0.6055 0.7345 0.8570
No log 10.6 106 0.6203 0.7094 0.6203 0.7876
No log 10.8 108 0.5051 0.7178 0.5051 0.7107
No log 11.0 110 0.5004 0.7411 0.5004 0.7074
No log 11.2 112 0.5216 0.6826 0.5216 0.7222
No log 11.4 114 0.6147 0.7128 0.6147 0.7841
No log 11.6 116 0.5775 0.7193 0.5775 0.7600
No log 11.8 118 0.5383 0.6858 0.5383 0.7337
No log 12.0 120 0.5248 0.6993 0.5248 0.7244
No log 12.2 122 0.5604 0.6914 0.5604 0.7486
No log 12.4 124 0.5083 0.6442 0.5083 0.7129
No log 12.6 126 0.5187 0.6606 0.5187 0.7202
No log 12.8 128 0.5155 0.6924 0.5155 0.7180
No log 13.0 130 0.5346 0.6927 0.5346 0.7311
No log 13.2 132 0.5481 0.6970 0.5481 0.7403
No log 13.4 134 0.5035 0.7224 0.5035 0.7096
No log 13.6 136 0.5155 0.7084 0.5155 0.7180
No log 13.8 138 0.6383 0.7172 0.6383 0.7989
No log 14.0 140 0.8549 0.6081 0.8549 0.9246
No log 14.2 142 0.7829 0.7101 0.7829 0.8848
No log 14.4 144 0.5324 0.7142 0.5324 0.7297
No log 14.6 146 0.4761 0.7303 0.4761 0.6900
No log 14.8 148 0.4705 0.7458 0.4705 0.6859
No log 15.0 150 0.4802 0.6804 0.4802 0.6930
No log 15.2 152 0.5885 0.6707 0.5885 0.7671
No log 15.4 154 0.6406 0.6794 0.6406 0.8004
No log 15.6 156 0.5284 0.7210 0.5284 0.7269
No log 15.8 158 0.4728 0.6796 0.4728 0.6876
No log 16.0 160 0.4766 0.7259 0.4766 0.6904
No log 16.2 162 0.4767 0.6720 0.4767 0.6904
No log 16.4 164 0.5271 0.7149 0.5271 0.7260
No log 16.6 166 0.6085 0.6627 0.6085 0.7801
No log 16.8 168 0.5646 0.6599 0.5646 0.7514
No log 17.0 170 0.4946 0.6720 0.4946 0.7033
No log 17.2 172 0.5073 0.6980 0.5073 0.7123
No log 17.4 174 0.5022 0.6407 0.5022 0.7087
No log 17.6 176 0.5558 0.6926 0.5558 0.7455
No log 17.8 178 0.6027 0.6729 0.6027 0.7763
No log 18.0 180 0.5525 0.6820 0.5525 0.7433
No log 18.2 182 0.5180 0.6068 0.5180 0.7197
No log 18.4 184 0.5299 0.6780 0.5299 0.7279
No log 18.6 186 0.5294 0.6536 0.5294 0.7276
No log 18.8 188 0.5378 0.6570 0.5378 0.7333
No log 19.0 190 0.6652 0.6734 0.6652 0.8156
No log 19.2 192 0.7517 0.6348 0.7517 0.8670
No log 19.4 194 0.6586 0.6437 0.6586 0.8115
No log 19.6 196 0.5568 0.6721 0.5568 0.7462
No log 19.8 198 0.5501 0.6578 0.5501 0.7417
No log 20.0 200 0.5572 0.6578 0.5572 0.7465
No log 20.2 202 0.6088 0.6925 0.6088 0.7803
No log 20.4 204 0.6607 0.6771 0.6607 0.8128
No log 20.6 206 0.7177 0.6629 0.7177 0.8472
No log 20.8 208 0.7078 0.6748 0.7078 0.8413
No log 21.0 210 0.6480 0.6925 0.6480 0.8050
No log 21.2 212 0.5618 0.6862 0.5618 0.7495
No log 21.4 214 0.5177 0.6644 0.5177 0.7195
No log 21.6 216 0.5048 0.6680 0.5048 0.7105
No log 21.8 218 0.5066 0.6359 0.5066 0.7118
No log 22.0 220 0.5541 0.6818 0.5541 0.7444
No log 22.2 222 0.6313 0.7297 0.6313 0.7945
No log 22.4 224 0.6492 0.7297 0.6492 0.8057
No log 22.6 226 0.5955 0.7230 0.5955 0.7717
No log 22.8 228 0.5393 0.7032 0.5393 0.7344
No log 23.0 230 0.4912 0.6838 0.4912 0.7009
No log 23.2 232 0.4864 0.6940 0.4864 0.6975
No log 23.4 234 0.4776 0.7041 0.4776 0.6911
No log 23.6 236 0.4949 0.6644 0.4949 0.7035
No log 23.8 238 0.5299 0.6862 0.5299 0.7280
No log 24.0 240 0.6168 0.6678 0.6168 0.7853
No log 24.2 242 0.8266 0.6455 0.8266 0.9092
No log 24.4 244 0.9117 0.6043 0.9117 0.9548
No log 24.6 246 0.7590 0.6620 0.7590 0.8712
No log 24.8 248 0.6124 0.6886 0.6124 0.7825
No log 25.0 250 0.5370 0.6359 0.5370 0.7328
No log 25.2 252 0.5370 0.6001 0.5370 0.7328
No log 25.4 254 0.5501 0.6103 0.5501 0.7417
No log 25.6 256 0.5679 0.6850 0.5679 0.7536
No log 25.8 258 0.6403 0.6648 0.6403 0.8002
No log 26.0 260 0.6495 0.6678 0.6495 0.8059
No log 26.2 262 0.5783 0.6332 0.5783 0.7605
No log 26.4 264 0.5161 0.6488 0.5161 0.7184
No log 26.6 266 0.5143 0.7136 0.5143 0.7172
No log 26.8 268 0.5120 0.7196 0.5120 0.7156
No log 27.0 270 0.5140 0.6426 0.5140 0.7170
No log 27.2 272 0.5760 0.6525 0.5760 0.7590
No log 27.4 274 0.6046 0.6669 0.6046 0.7775
No log 27.6 276 0.5848 0.6332 0.5848 0.7647
No log 27.8 278 0.5460 0.6288 0.5460 0.7389
No log 28.0 280 0.5244 0.6555 0.5244 0.7242
No log 28.2 282 0.5181 0.6689 0.5181 0.7198
No log 28.4 284 0.5276 0.6689 0.5276 0.7264
No log 28.6 286 0.5676 0.6551 0.5676 0.7534
No log 28.8 288 0.6830 0.6727 0.6830 0.8264
No log 29.0 290 0.8317 0.6345 0.8317 0.9120
No log 29.2 292 0.8341 0.6345 0.8341 0.9133
No log 29.4 294 0.7013 0.6537 0.7013 0.8374
No log 29.6 296 0.5980 0.6394 0.5980 0.7733
No log 29.8 298 0.5505 0.6144 0.5505 0.7420
No log 30.0 300 0.5237 0.6667 0.5237 0.7237
No log 30.2 302 0.5237 0.6564 0.5237 0.7237
No log 30.4 304 0.5499 0.6256 0.5499 0.7416
No log 30.6 306 0.5704 0.6479 0.5704 0.7553
No log 30.8 308 0.5626 0.6479 0.5626 0.7501
No log 31.0 310 0.5409 0.6401 0.5409 0.7354
No log 31.2 312 0.5380 0.6293 0.5380 0.7335
No log 31.4 314 0.5495 0.6293 0.5495 0.7413
No log 31.6 316 0.5644 0.5905 0.5644 0.7512
No log 31.8 318 0.5909 0.6669 0.5909 0.7687
No log 32.0 320 0.6591 0.6664 0.6591 0.8119
No log 32.2 322 0.6921 0.6585 0.6921 0.8319
No log 32.4 324 0.6717 0.6578 0.6717 0.8196
No log 32.6 326 0.6032 0.6683 0.6032 0.7766
No log 32.8 328 0.5757 0.6401 0.5757 0.7588
No log 33.0 330 0.5869 0.6508 0.5869 0.7661
No log 33.2 332 0.6419 0.6421 0.6419 0.8012
No log 33.4 334 0.6517 0.6236 0.6517 0.8073
No log 33.6 336 0.5948 0.6332 0.5948 0.7712
No log 33.8 338 0.5547 0.5737 0.5547 0.7448
No log 34.0 340 0.5401 0.6636 0.5401 0.7349
No log 34.2 342 0.5315 0.6697 0.5315 0.7291
No log 34.4 344 0.5338 0.6359 0.5338 0.7306
No log 34.6 346 0.5394 0.6392 0.5394 0.7344
No log 34.8 348 0.5476 0.6401 0.5476 0.7400
No log 35.0 350 0.5527 0.6401 0.5527 0.7434
No log 35.2 352 0.5595 0.6358 0.5595 0.7480
No log 35.4 354 0.5589 0.6358 0.5589 0.7476
No log 35.6 356 0.5571 0.6246 0.5571 0.7464
No log 35.8 358 0.5477 0.6105 0.5477 0.7400
No log 36.0 360 0.5526 0.6005 0.5526 0.7434
No log 36.2 362 0.5648 0.6358 0.5648 0.7515
No log 36.4 364 0.5756 0.6358 0.5756 0.7587
No log 36.6 366 0.5784 0.6358 0.5784 0.7605
No log 36.8 368 0.5841 0.6256 0.5841 0.7643
No log 37.0 370 0.5858 0.6256 0.5858 0.7654
No log 37.2 372 0.5743 0.6015 0.5743 0.7578
No log 37.4 374 0.5628 0.6115 0.5628 0.7502
No log 37.6 376 0.5579 0.6392 0.5579 0.7469
No log 37.8 378 0.5682 0.6015 0.5682 0.7538
No log 38.0 380 0.5963 0.6099 0.5963 0.7722
No log 38.2 382 0.6191 0.6215 0.6191 0.7868
No log 38.4 384 0.6659 0.6607 0.6659 0.8161
No log 38.6 386 0.6704 0.6786 0.6704 0.8188
No log 38.8 388 0.6755 0.6591 0.6755 0.8219
No log 39.0 390 0.6462 0.7010 0.6462 0.8039
No log 39.2 392 0.5997 0.6490 0.5997 0.7744
No log 39.4 394 0.5547 0.6578 0.5547 0.7448
No log 39.6 396 0.5333 0.6328 0.5333 0.7303
No log 39.8 398 0.5291 0.6328 0.5291 0.7274
No log 40.0 400 0.5315 0.6392 0.5315 0.7291
No log 40.2 402 0.5357 0.6392 0.5357 0.7319
No log 40.4 404 0.5569 0.6234 0.5569 0.7463
No log 40.6 406 0.5597 0.6204 0.5597 0.7481
No log 40.8 408 0.5705 0.6204 0.5705 0.7553
No log 41.0 410 0.5614 0.6578 0.5614 0.7493
No log 41.2 412 0.5364 0.6659 0.5364 0.7324
No log 41.4 414 0.5227 0.6969 0.5227 0.7230
No log 41.6 416 0.5139 0.6868 0.5139 0.7168
No log 41.8 418 0.5107 0.6909 0.5107 0.7146
No log 42.0 420 0.5074 0.6896 0.5074 0.7123
No log 42.2 422 0.5226 0.6144 0.5226 0.7229
No log 42.4 424 0.5481 0.6368 0.5481 0.7404
No log 42.6 426 0.5589 0.6661 0.5589 0.7476
No log 42.8 428 0.5557 0.6418 0.5557 0.7455
No log 43.0 430 0.5342 0.6543 0.5342 0.7309
No log 43.2 432 0.5157 0.6796 0.5157 0.7181
No log 43.4 434 0.5066 0.6392 0.5066 0.7118
No log 43.6 436 0.5132 0.6015 0.5132 0.7164
No log 43.8 438 0.5159 0.6256 0.5159 0.7183
No log 44.0 440 0.5141 0.6675 0.5141 0.7170
No log 44.2 442 0.5088 0.6316 0.5088 0.7133
No log 44.4 444 0.5119 0.6316 0.5119 0.7155
No log 44.6 446 0.5229 0.6015 0.5229 0.7231
No log 44.8 448 0.5625 0.6311 0.5625 0.7500
No log 45.0 450 0.6126 0.6418 0.6126 0.7827
No log 45.2 452 0.6124 0.6787 0.6124 0.7825
No log 45.4 454 0.6209 0.6465 0.6209 0.7880
No log 45.6 456 0.6515 0.6374 0.6515 0.8071
No log 45.8 458 0.6446 0.6354 0.6446 0.8029
No log 46.0 460 0.6715 0.6354 0.6715 0.8194
No log 46.2 462 0.7218 0.6377 0.7218 0.8496
No log 46.4 464 0.7843 0.6148 0.7843 0.8856
No log 46.6 466 0.7893 0.5674 0.7893 0.8884
No log 46.8 468 0.6956 0.6043 0.6956 0.8340
No log 47.0 470 0.5920 0.6386 0.5920 0.7694
No log 47.2 472 0.5267 0.6578 0.5267 0.7258
No log 47.4 474 0.5142 0.6401 0.5142 0.7170
No log 47.6 476 0.5114 0.6368 0.5114 0.7151
No log 47.8 478 0.5159 0.6578 0.5159 0.7182
No log 48.0 480 0.5281 0.6311 0.5281 0.7267
No log 48.2 482 0.5418 0.6311 0.5418 0.7361
No log 48.4 484 0.5341 0.6311 0.5341 0.7308
No log 48.6 486 0.5309 0.6311 0.5309 0.7286
No log 48.8 488 0.5204 0.6204 0.5204 0.7214
No log 49.0 490 0.5108 0.6015 0.5108 0.7147
No log 49.2 492 0.5038 0.6144 0.5038 0.7098
No log 49.4 494 0.5163 0.6174 0.5163 0.7186
No log 49.6 496 0.5207 0.6174 0.5207 0.7216
No log 49.8 498 0.5134 0.6174 0.5134 0.7165
0.1918 50.0 500 0.5015 0.6602 0.5015 0.7082
0.1918 50.2 502 0.4939 0.7033 0.4939 0.7028
0.1918 50.4 504 0.4811 0.6874 0.4811 0.6936
0.1918 50.6 506 0.4736 0.7154 0.4736 0.6882
0.1918 50.8 508 0.4747 0.7165 0.4747 0.6890
0.1918 51.0 510 0.4996 0.6509 0.4996 0.7068
0.1918 51.2 512 0.5249 0.6610 0.5249 0.7245
0.1918 51.4 514 0.5563 0.6691 0.5563 0.7458
0.1918 51.6 516 0.5514 0.6321 0.5514 0.7425
0.1918 51.8 518 0.5149 0.6311 0.5149 0.7176
0.1918 52.0 520 0.4961 0.6174 0.4961 0.7043
0.1918 52.2 522 0.4877 0.6602 0.4877 0.6983
0.1918 52.4 524 0.4952 0.6602 0.4952 0.7037

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors

  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k2_task5_organization

  • Finetuned from: aubmindlab/bert-base-arabertv02