ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run3_AugV5_k19_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics are conventionally computed follows the list):

  • Loss: 0.6230
  • Qwk: 0.5666
  • Mse: 0.6230
  • Rmse: 0.7893
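
Since the reported Loss and Mse are identical, the model was most likely trained as a regressor with an MSE objective. The card does not include the actual evaluation code, so the following is a hedged sketch of how these metrics are conventionally computed for essay scoring; the integer label scale and the rounding scheme for Qwk are assumptions.

```python
# Hedged sketch of the reported metrics. Assumptions: gold organization scores
# are integers, predictions are raw regression outputs, and Qwk is computed on
# predictions rounded to the nearest integer score.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def evaluate(y_true, y_pred):
    mse = mean_squared_error(y_true, y_pred)  # equals the reported Loss/Mse
    rmse = float(np.sqrt(mse))                # reported Rmse
    qwk = cohen_kappa_score(                  # reported Qwk
        np.asarray(y_true).astype(int),
        np.rint(y_pred).astype(int),
        weights="quadratic",
    )
    return {"loss": mse, "qwk": qwk, "mse": mse, "rmse": rmse}
```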

Model description

More information needed

Intended uses & limitations

More information needed
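
In the absence of documented usage, here is a minimal inference sketch. It assumes the checkpoint loads as a sequence-classification model with a single-output regression head (suggested by Loss equaling Mse above) and that inputs are Arabic essay texts; check the repo's config.json for the actual head configuration.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run3_AugV5_k19_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

essay = "..."  # an Arabic essay; AraBERT models generally benefit from the
               # arabert preprocessing utilities, which this sketch omits
inputs = tokenizer(essay, truncation=True, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # assumed regression output
print(score)
```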

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
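
Restated as a Hugging Face TrainingArguments sketch; the Adam betas and epsilon above are the Trainer defaults, and output_dir plus anything not listed in the card are placeholders:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="outputs",            # placeholder, not documented in the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,                  # matches betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```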

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0333 2 3.7919 -0.0033 3.7919 1.9473
No log 0.0667 4 2.0670 -0.0409 2.0670 1.4377
No log 0.1 6 1.6513 -0.0277 1.6513 1.2850
No log 0.1333 8 1.4082 0.1310 1.4082 1.1867
No log 0.1667 10 1.1533 0.1379 1.1533 1.0739
No log 0.2 12 1.0848 0.2635 1.0848 1.0415
No log 0.2333 14 1.1148 0.2758 1.1148 1.0558
No log 0.2667 16 1.0915 0.2515 1.0915 1.0448
No log 0.3 18 1.1044 0.2441 1.1044 1.0509
No log 0.3333 20 1.0910 0.2042 1.0910 1.0445
No log 0.3667 22 1.1892 0.0882 1.1892 1.0905
No log 0.4 24 1.4115 -0.0212 1.4115 1.1881
No log 0.4333 26 1.5695 -0.0560 1.5695 1.2528
No log 0.4667 28 1.4680 0.0310 1.4680 1.2116
No log 0.5 30 1.0819 0.2166 1.0819 1.0401
No log 0.5333 32 0.9856 0.2061 0.9856 0.9928
No log 0.5667 34 0.9730 0.4090 0.9730 0.9864
No log 0.6 36 1.0038 0.3127 1.0038 1.0019
No log 0.6333 38 1.0381 0.2880 1.0381 1.0189
No log 0.6667 40 1.0574 0.2489 1.0574 1.0283
No log 0.7 42 1.1230 0.2465 1.1230 1.0597
No log 0.7333 44 1.2364 0.1433 1.2364 1.1119
No log 0.7667 46 1.1841 0.1711 1.1841 1.0881
No log 0.8 48 1.0673 0.2023 1.0673 1.0331
No log 0.8333 50 0.9895 0.2591 0.9895 0.9947
No log 0.8667 52 0.9604 0.2615 0.9604 0.9800
No log 0.9 54 0.9787 0.1549 0.9787 0.9893
No log 0.9333 56 0.9749 0.1848 0.9749 0.9874
No log 0.9667 58 0.9522 0.2455 0.9522 0.9758
No log 1.0 60 0.9767 0.2995 0.9767 0.9883
No log 1.0333 62 0.9590 0.3086 0.9590 0.9793
No log 1.0667 64 0.8644 0.3280 0.8644 0.9298
No log 1.1 66 0.8670 0.2865 0.8670 0.9311
No log 1.1333 68 0.8947 0.3506 0.8947 0.9459
No log 1.1667 70 1.0715 0.3952 1.0715 1.0351
No log 1.2 72 1.1366 0.4014 1.1366 1.0661
No log 1.2333 74 0.9384 0.4056 0.9384 0.9687
No log 1.2667 76 0.8463 0.2988 0.8463 0.9199
No log 1.3 78 0.8983 0.3351 0.8983 0.9478
No log 1.3333 80 0.8848 0.3351 0.8848 0.9406
No log 1.3667 82 0.8100 0.4715 0.8100 0.9000
No log 1.4 84 0.7741 0.4606 0.7741 0.8798
No log 1.4333 86 0.7303 0.5412 0.7303 0.8546
No log 1.4667 88 0.6913 0.5317 0.6913 0.8315
No log 1.5 90 0.6829 0.6198 0.6829 0.8264
No log 1.5333 92 0.6890 0.6518 0.6890 0.8301
No log 1.5667 94 0.7284 0.5921 0.7284 0.8534
No log 1.6 96 0.7091 0.5722 0.7091 0.8421
No log 1.6333 98 0.7005 0.5774 0.7005 0.8369
No log 1.6667 100 0.6798 0.6071 0.6798 0.8245
No log 1.7 102 0.7050 0.5739 0.7050 0.8396
No log 1.7333 104 0.6592 0.6519 0.6592 0.8119
No log 1.7667 106 0.6443 0.5926 0.6443 0.8027
No log 1.8 108 0.7873 0.5068 0.7873 0.8873
No log 1.8333 110 0.7025 0.5642 0.7025 0.8381
No log 1.8667 112 0.6146 0.6008 0.6146 0.7839
No log 1.9 114 0.8386 0.3902 0.8386 0.9157
No log 1.9333 116 0.8391 0.4318 0.8391 0.9160
No log 1.9667 118 0.6979 0.5710 0.6979 0.8354
No log 2.0 120 0.8026 0.3943 0.8026 0.8959
No log 2.0333 122 0.9182 0.4303 0.9182 0.9582
No log 2.0667 124 0.7746 0.4316 0.7746 0.8801
No log 2.1 126 0.7282 0.5231 0.7282 0.8533
No log 2.1333 128 0.7768 0.5811 0.7768 0.8814
No log 2.1667 130 0.7235 0.5642 0.7235 0.8506
No log 2.2 132 0.7166 0.5714 0.7166 0.8465
No log 2.2333 134 0.7123 0.6227 0.7123 0.8440
No log 2.2667 136 0.7724 0.6071 0.7724 0.8789
No log 2.3 138 1.0862 0.4127 1.0862 1.0422
No log 2.3333 140 1.4769 0.2921 1.4769 1.2153
No log 2.3667 142 1.4073 0.2355 1.4073 1.1863
No log 2.4 144 1.1888 0.1609 1.1888 1.0903
No log 2.4333 146 1.0607 0.2820 1.0607 1.0299
No log 2.4667 148 0.9870 0.4180 0.9870 0.9935
No log 2.5 150 0.9398 0.5230 0.9398 0.9695
No log 2.5333 152 0.9291 0.4162 0.9291 0.9639
No log 2.5667 154 0.8141 0.4824 0.8141 0.9023
No log 2.6 156 0.7966 0.4843 0.7966 0.8925
No log 2.6333 158 0.7980 0.4227 0.7980 0.8933
No log 2.6667 160 0.8586 0.4696 0.8586 0.9266
No log 2.7 162 0.9059 0.4459 0.9059 0.9518
No log 2.7333 164 0.8234 0.4843 0.8234 0.9074
No log 2.7667 166 0.7857 0.4273 0.7857 0.8864
No log 2.8 168 0.7609 0.4903 0.7609 0.8723
No log 2.8333 170 0.7586 0.5085 0.7586 0.8710
No log 2.8667 172 0.7721 0.5595 0.7721 0.8787
No log 2.9 174 0.8135 0.4629 0.8135 0.9020
No log 2.9333 176 0.8773 0.3838 0.8773 0.9366
No log 2.9667 178 0.8125 0.5010 0.8125 0.9014
No log 3.0 180 0.7514 0.4576 0.7514 0.8668
No log 3.0333 182 0.7123 0.6123 0.7123 0.8440
No log 3.0667 184 0.6956 0.5796 0.6956 0.8340
No log 3.1 186 0.6710 0.6278 0.6710 0.8191
No log 3.1333 188 0.6562 0.6584 0.6562 0.8101
No log 3.1667 190 0.6393 0.6470 0.6393 0.7996
No log 3.2 192 0.6375 0.6389 0.6375 0.7984
No log 3.2333 194 0.7252 0.5998 0.7252 0.8516
No log 3.2667 196 0.7164 0.5291 0.7164 0.8464
No log 3.3 198 0.7214 0.5528 0.7214 0.8494
No log 3.3333 200 0.6872 0.6094 0.6872 0.8290
No log 3.3667 202 0.6884 0.5516 0.6884 0.8297
No log 3.4 204 0.7142 0.5438 0.7142 0.8451
No log 3.4333 206 0.7918 0.5253 0.7918 0.8898
No log 3.4667 208 0.8788 0.5240 0.8788 0.9374
No log 3.5 210 1.1242 0.3619 1.1242 1.0603
No log 3.5333 212 1.2531 0.3480 1.2531 1.1194
No log 3.5667 214 1.0872 0.3580 1.0872 1.0427
No log 3.6 216 0.7889 0.5811 0.7889 0.8882
No log 3.6333 218 0.6686 0.5786 0.6686 0.8177
No log 3.6667 220 0.6587 0.5786 0.6587 0.8116
No log 3.7 222 0.6727 0.5565 0.6727 0.8202
No log 3.7333 224 0.6654 0.5858 0.6654 0.8157
No log 3.7667 226 0.7353 0.5717 0.7353 0.8575
No log 3.8 228 0.8181 0.5479 0.8181 0.9045
No log 3.8333 230 0.7429 0.5717 0.7429 0.8619
No log 3.8667 232 0.6428 0.6073 0.6428 0.8018
No log 3.9 234 0.7015 0.4857 0.7015 0.8375
No log 3.9333 236 0.7099 0.4857 0.7099 0.8426
No log 3.9667 238 0.6545 0.5316 0.6545 0.8090
No log 4.0 240 0.6834 0.5909 0.6834 0.8267
No log 4.0333 242 0.6983 0.6157 0.6983 0.8357
No log 4.0667 244 0.6421 0.6272 0.6421 0.8013
No log 4.1 246 0.5972 0.6753 0.5972 0.7728
No log 4.1333 248 0.5994 0.6470 0.5994 0.7742
No log 4.1667 250 0.6030 0.6164 0.6030 0.7765
No log 4.2 252 0.6098 0.6177 0.6098 0.7809
No log 4.2333 254 0.5979 0.6500 0.5979 0.7732
No log 4.2667 256 0.6130 0.6095 0.6130 0.7829
No log 4.3 258 0.6307 0.5999 0.6307 0.7941
No log 4.3333 260 0.6201 0.6087 0.6201 0.7875
No log 4.3667 262 0.6390 0.6493 0.6390 0.7994
No log 4.4 264 0.6488 0.6581 0.6488 0.8055
No log 4.4333 266 0.6586 0.6677 0.6586 0.8115
No log 4.4667 268 0.6351 0.6001 0.6351 0.7970
No log 4.5 270 0.6363 0.6144 0.6363 0.7977
No log 4.5333 272 0.6431 0.6167 0.6431 0.8019
No log 4.5667 274 0.6356 0.6057 0.6356 0.7972
No log 4.6 276 0.6410 0.5735 0.6410 0.8006
No log 4.6333 278 0.6453 0.6175 0.6453 0.8033
No log 4.6667 280 0.6969 0.5413 0.6969 0.8348
No log 4.7 282 0.7731 0.5655 0.7731 0.8793
No log 4.7333 284 0.7258 0.5368 0.7258 0.8519
No log 4.7667 286 0.7205 0.5577 0.7205 0.8488
No log 4.8 288 0.7195 0.5963 0.7195 0.8482
No log 4.8333 290 0.7385 0.5370 0.7385 0.8594
No log 4.8667 292 0.7494 0.5147 0.7494 0.8657
No log 4.9 294 0.8248 0.4912 0.8248 0.9082
No log 4.9333 296 0.8112 0.4681 0.8112 0.9007
No log 4.9667 298 0.7211 0.4468 0.7211 0.8492
No log 5.0 300 0.7036 0.5042 0.7036 0.8388
No log 5.0333 302 0.6686 0.5666 0.6686 0.8177
No log 5.0667 304 0.6607 0.6007 0.6607 0.8129
No log 5.1 306 0.7744 0.5938 0.7744 0.8800
No log 5.1333 308 0.7361 0.6459 0.7361 0.8580
No log 5.1667 310 0.6798 0.5510 0.6798 0.8245
No log 5.2 312 0.7337 0.5329 0.7337 0.8566
No log 5.2333 314 0.6850 0.5562 0.6850 0.8276
No log 5.2667 316 0.6778 0.5712 0.6778 0.8233
No log 5.3 318 0.6961 0.5909 0.6961 0.8343
No log 5.3333 320 0.6500 0.5933 0.6500 0.8063
No log 5.3667 322 0.6239 0.6249 0.6239 0.7899
No log 5.4 324 0.6462 0.5647 0.6462 0.8039
No log 5.4333 326 0.6882 0.5536 0.6882 0.8296
No log 5.4667 328 0.7411 0.5018 0.7411 0.8609
No log 5.5 330 0.7072 0.5259 0.7072 0.8410
No log 5.5333 332 0.6634 0.5921 0.6634 0.8145
No log 5.5667 334 0.6373 0.6259 0.6373 0.7983
No log 5.6 336 0.6259 0.6239 0.6259 0.7912
No log 5.6333 338 0.6361 0.5622 0.6361 0.7975
No log 5.6667 340 0.6436 0.5274 0.6436 0.8023
No log 5.7 342 0.6467 0.5710 0.6467 0.8042
No log 5.7333 344 0.6385 0.5503 0.6385 0.7991
No log 5.7667 346 0.6429 0.5577 0.6429 0.8018
No log 5.8 348 0.6763 0.5614 0.6763 0.8224
No log 5.8333 350 0.7532 0.5639 0.7532 0.8679
No log 5.8667 352 0.7115 0.5875 0.7115 0.8435
No log 5.9 354 0.6413 0.6085 0.6413 0.8008
No log 5.9333 356 0.6425 0.5512 0.6425 0.8016
No log 5.9667 358 0.6515 0.6389 0.6515 0.8071
No log 6.0 360 0.6759 0.6073 0.6759 0.8222
No log 6.0333 362 0.7397 0.5809 0.7397 0.8601
No log 6.0667 364 0.6819 0.6142 0.6819 0.8258
No log 6.1 366 0.6418 0.6057 0.6418 0.8011
No log 6.1333 368 0.6237 0.6701 0.6237 0.7897
No log 6.1667 370 0.6243 0.6701 0.6243 0.7901
No log 6.2 372 0.6265 0.6626 0.6265 0.7915
No log 6.2333 374 0.6271 0.6114 0.6271 0.7919
No log 6.2667 376 0.6301 0.5966 0.6301 0.7938
No log 6.3 378 0.6310 0.6356 0.6310 0.7944
No log 6.3333 380 0.6804 0.6045 0.6804 0.8249
No log 6.3667 382 0.6667 0.6151 0.6667 0.8165
No log 6.4 384 0.6356 0.6324 0.6356 0.7972
No log 6.4333 386 0.6520 0.5766 0.6520 0.8074
No log 6.4667 388 0.6713 0.6394 0.6713 0.8193
No log 6.5 390 0.6235 0.6865 0.6235 0.7896
No log 6.5333 392 0.6100 0.6916 0.6100 0.7810
No log 6.5667 394 0.6049 0.6641 0.6049 0.7778
No log 6.6 396 0.6144 0.6573 0.6144 0.7838
No log 6.6333 398 0.6233 0.6573 0.6233 0.7895
No log 6.6667 400 0.6216 0.6395 0.6216 0.7884
No log 6.7 402 0.6080 0.6187 0.6080 0.7798
No log 6.7333 404 0.6177 0.5996 0.6177 0.7859
No log 6.7667 406 0.6137 0.5928 0.6137 0.7834
No log 6.8 408 0.6160 0.5871 0.6160 0.7848
No log 6.8333 410 0.6239 0.5735 0.6239 0.7899
No log 6.8667 412 0.6321 0.5210 0.6321 0.7951
No log 6.9 414 0.6216 0.5210 0.6216 0.7884
No log 6.9333 416 0.6137 0.5648 0.6137 0.7834
No log 6.9667 418 0.6797 0.5973 0.6797 0.8244
No log 7.0 420 0.7381 0.5735 0.7381 0.8591
No log 7.0333 422 0.7143 0.5549 0.7143 0.8451
No log 7.0667 424 0.6429 0.6473 0.6429 0.8018
No log 7.1 426 0.6465 0.6404 0.6465 0.8041
No log 7.1333 428 0.6884 0.6340 0.6884 0.8297
No log 7.1667 430 0.7263 0.6199 0.7263 0.8522
No log 7.2 432 0.7671 0.6019 0.7671 0.8759
No log 7.2333 434 0.6994 0.6081 0.6994 0.8363
No log 7.2667 436 0.6308 0.5955 0.6308 0.7942
No log 7.3 438 0.6454 0.6006 0.6454 0.8034
No log 7.3333 440 0.6436 0.5747 0.6436 0.8023
No log 7.3667 442 0.6911 0.5084 0.6911 0.8313
No log 7.4 444 0.7306 0.4821 0.7306 0.8548
No log 7.4333 446 0.7031 0.5062 0.7031 0.8385
No log 7.4667 448 0.6381 0.5710 0.6381 0.7988
No log 7.5 450 0.6209 0.6097 0.6209 0.7880
No log 7.5333 452 0.6057 0.6057 0.6057 0.7783
No log 7.5667 454 0.6355 0.6296 0.6355 0.7972
No log 7.6 456 0.6697 0.5756 0.6697 0.8184
No log 7.6333 458 0.6440 0.5751 0.6440 0.8025
No log 7.6667 460 0.6374 0.5700 0.6374 0.7984
No log 7.7 462 0.6246 0.5700 0.6246 0.7903
No log 7.7333 464 0.6221 0.6519 0.6221 0.7887
No log 7.7667 466 0.6321 0.6160 0.6321 0.7950
No log 7.8 468 0.6088 0.6815 0.6088 0.7802
No log 7.8333 470 0.5857 0.6288 0.5857 0.7653
No log 7.8667 472 0.5901 0.6297 0.5901 0.7682
No log 7.9 474 0.5958 0.6424 0.5958 0.7719
No log 7.9333 476 0.6009 0.6519 0.6009 0.7751
No log 7.9667 478 0.6188 0.6429 0.6188 0.7867
No log 8.0 480 0.6041 0.6687 0.6041 0.7772
No log 8.0333 482 0.6019 0.5828 0.6019 0.7758
No log 8.0667 484 0.6096 0.5828 0.6096 0.7808
No log 8.1 486 0.5750 0.6501 0.5750 0.7583
No log 8.1333 488 0.5830 0.7200 0.5830 0.7635
No log 8.1667 490 0.5770 0.7246 0.5770 0.7596
No log 8.2 492 0.5923 0.5716 0.5923 0.7696
No log 8.2333 494 0.6485 0.5997 0.6485 0.8053
No log 8.2667 496 0.6053 0.5854 0.6053 0.7780
No log 8.3 498 0.6051 0.6656 0.6051 0.7779
0.3127 8.3333 500 0.6941 0.5978 0.6941 0.8332
0.3127 8.3667 502 0.6869 0.5978 0.6869 0.8288
0.3127 8.4 504 0.6080 0.6371 0.6080 0.7797
0.3127 8.4333 506 0.6173 0.6207 0.6173 0.7857
0.3127 8.4667 508 0.6405 0.5794 0.6405 0.8003
0.3127 8.5 510 0.6157 0.5994 0.6157 0.7846
0.3127 8.5333 512 0.6400 0.4960 0.6400 0.8000
0.3127 8.5667 514 0.6916 0.5400 0.6916 0.8316
0.3127 8.6 516 0.6695 0.5400 0.6695 0.8182
0.3127 8.6333 518 0.6226 0.5845 0.6226 0.7890
0.3127 8.6667 520 0.6276 0.5994 0.6276 0.7922
0.3127 8.7 522 0.6242 0.5994 0.6242 0.7901
0.3127 8.7333 524 0.6337 0.5442 0.6337 0.7961
0.3127 8.7667 526 0.7190 0.5439 0.7190 0.8479
0.3127 8.8 528 0.7119 0.5439 0.7119 0.8438
0.3127 8.8333 530 0.6324 0.5565 0.6324 0.7953
0.3127 8.8667 532 0.6230 0.5666 0.6230 0.7893

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1