ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k11_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7216
  • Qwk: 0.3324
  • Mse: 0.7216
  • Rmse: 0.8495
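Qwk here is Quadratic Weighted Kappa, the standard agreement metric for ordinal essay scores; Mse/Rmse are the (root) mean squared error of the predicted scores. As an illustration only, a minimal pure-Python sketch of how these metrics are typically computed (the labels below are made-up ordinal scores, not drawn from this model's evaluation set):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights for ordinal labels 0..n_classes-1."""
    n = len(y_true)
    # Observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Expected counts come from the outer product of the marginal histograms
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement weight
            num += w * O[i][j]
            den += w * hist_t[i] * hist_p[j] / n
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical scores on a 0-4 scale, for illustration only
y_true = [0, 1, 2, 3, 4, 2]
y_pred = [0, 1, 1, 3, 3, 2]
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=5)
rmse = math.sqrt(mse(y_true, y_pred))
```

Note that, as in the results above, Mse and the validation loss coincide when the model is trained with an MSE regression head.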

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
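These hyperparameters map directly onto the Hugging Face Trainer configuration. A minimal sketch, assuming the standard TrainingArguments API; the output_dir and the logging/evaluation cadence are assumptions (the cadence is inferred from the results table below), not values stated on the card:

```python
from transformers import TrainingArguments

# Sketch of the configuration implied by the hyperparameters above.
training_args = TrainingArguments(
    output_dir="./arabert-task7-organization",  # placeholder path, not from the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",  # assumed: the table logs validation metrics every 2 steps
    eval_steps=2,
    logging_steps=500,      # assumed: training loss shows "No log" before step 500
)
```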

Training results

Validation metrics were computed every 2 steps; the training loss was logged only every 500 steps, so rows before step 500 show "No log".

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0364 2 2.5668 -0.0262 2.5668 1.6021
No log 0.0727 4 1.6186 0.0789 1.6186 1.2723
No log 0.1091 6 0.8064 0.1770 0.8064 0.8980
No log 0.1455 8 0.9287 -0.0963 0.9287 0.9637
No log 0.1818 10 0.9986 0.2142 0.9986 0.9993
No log 0.2182 12 1.0884 0.2121 1.0884 1.0433
No log 0.2545 14 1.1141 0.2121 1.1141 1.0555
No log 0.2909 16 1.1174 0.1723 1.1174 1.0571
No log 0.3273 18 0.9435 0.1183 0.9435 0.9714
No log 0.3636 20 0.8515 -0.0054 0.8515 0.9228
No log 0.4 22 0.7603 0.1863 0.7603 0.8720
No log 0.4364 24 0.7092 0.2606 0.7092 0.8422
No log 0.4727 26 0.8626 0.0295 0.8626 0.9288
No log 0.5091 28 0.9564 0.0344 0.9564 0.9779
No log 0.5455 30 0.9297 0.0653 0.9297 0.9642
No log 0.5818 32 1.0498 0.2635 1.0498 1.0246
No log 0.6182 34 0.9536 0.3225 0.9536 0.9765
No log 0.6545 36 0.7721 0.3675 0.7721 0.8787
No log 0.6909 38 0.6464 0.2930 0.6464 0.8040
No log 0.7273 40 0.6907 0.1904 0.6907 0.8311
No log 0.7636 42 0.6339 0.2884 0.6339 0.7962
No log 0.8 44 0.6163 0.3789 0.6163 0.7851
No log 0.8364 46 0.7464 0.3359 0.7464 0.8639
No log 0.8727 48 0.9303 0.3945 0.9303 0.9645
No log 0.9091 50 1.0772 0.2280 1.0772 1.0379
No log 0.9455 52 1.0092 0.2271 1.0092 1.0046
No log 0.9818 54 0.7104 0.3155 0.7104 0.8429
No log 1.0182 56 0.6345 0.4289 0.6345 0.7965
No log 1.0545 58 0.6185 0.5131 0.6185 0.7864
No log 1.0909 60 0.8405 0.3503 0.8405 0.9168
No log 1.1273 62 1.0543 0.3572 1.0543 1.0268
No log 1.1636 64 0.9688 0.4051 0.9688 0.9843
No log 1.2 66 0.7046 0.4451 0.7046 0.8394
No log 1.2364 68 0.6039 0.4595 0.6039 0.7771
No log 1.2727 70 0.7212 0.4003 0.7212 0.8493
No log 1.3091 72 0.6175 0.3383 0.6175 0.7858
No log 1.3455 74 0.5764 0.4419 0.5764 0.7592
No log 1.3818 76 0.7278 0.3819 0.7278 0.8531
No log 1.4182 78 0.8167 0.3473 0.8167 0.9037
No log 1.4545 80 0.8011 0.3825 0.8011 0.8950
No log 1.4909 82 0.6381 0.4662 0.6381 0.7988
No log 1.5273 84 0.6224 0.2476 0.6224 0.7889
No log 1.5636 86 0.6353 0.2476 0.6353 0.7970
No log 1.6 88 0.7101 0.3819 0.7101 0.8427
No log 1.6364 90 1.0804 0.3292 1.0804 1.0394
No log 1.6727 92 1.1114 0.3175 1.1114 1.0542
No log 1.7091 94 0.7990 0.3810 0.7990 0.8939
No log 1.7455 96 0.6547 0.5508 0.6547 0.8091
No log 1.7818 98 0.5964 0.5201 0.5964 0.7723
No log 1.8182 100 0.7306 0.4511 0.7306 0.8547
No log 1.8545 102 0.8064 0.3849 0.8064 0.8980
No log 1.8909 104 0.6583 0.4862 0.6583 0.8113
No log 1.9273 106 0.5405 0.5876 0.5405 0.7352
No log 1.9636 108 0.5616 0.4731 0.5616 0.7494
No log 2.0 110 0.6952 0.4404 0.6952 0.8338
No log 2.0364 112 0.6749 0.4404 0.6749 0.8215
No log 2.0727 114 0.5775 0.5139 0.5775 0.7599
No log 2.1091 116 0.5123 0.5836 0.5123 0.7157
No log 2.1455 118 0.5329 0.5327 0.5329 0.7300
No log 2.1818 120 0.5067 0.6201 0.5067 0.7119
No log 2.2182 122 0.4733 0.5899 0.4733 0.6879
No log 2.2545 124 0.4633 0.5899 0.4633 0.6806
No log 2.2909 126 0.4874 0.5980 0.4874 0.6981
No log 2.3273 128 0.6341 0.5803 0.6341 0.7963
No log 2.3636 130 0.9349 0.3939 0.9349 0.9669
No log 2.4 132 0.9988 0.3886 0.9988 0.9994
No log 2.4364 134 0.7376 0.4852 0.7376 0.8589
No log 2.4727 136 0.6218 0.4295 0.6218 0.7885
No log 2.5091 138 0.6910 0.4457 0.6910 0.8312
No log 2.5455 140 0.7062 0.4385 0.7062 0.8403
No log 2.5818 142 0.6127 0.4819 0.6127 0.7827
No log 2.6182 144 0.6693 0.4512 0.6693 0.8181
No log 2.6545 146 0.6879 0.4275 0.6879 0.8294
No log 2.6909 148 0.6747 0.4349 0.6747 0.8214
No log 2.7273 150 0.5633 0.5140 0.5633 0.7505
No log 2.7636 152 0.4731 0.6443 0.4731 0.6878
No log 2.8 154 0.4769 0.6241 0.4769 0.6906
No log 2.8364 156 0.5431 0.5357 0.5431 0.7369
No log 2.8727 158 0.7679 0.5190 0.7679 0.8763
No log 2.9091 160 0.9716 0.4297 0.9716 0.9857
No log 2.9455 162 0.8349 0.3753 0.8349 0.9137
No log 2.9818 164 0.6645 0.5292 0.6645 0.8152
No log 3.0182 166 0.6017 0.4997 0.6017 0.7757
No log 3.0545 168 0.6169 0.4664 0.6169 0.7854
No log 3.0909 170 0.6719 0.4438 0.6719 0.8197
No log 3.1273 172 0.5408 0.5158 0.5408 0.7354
No log 3.1636 174 0.4687 0.5104 0.4687 0.6846
No log 3.2 176 0.4838 0.5403 0.4838 0.6955
No log 3.2364 178 0.6183 0.4930 0.6183 0.7863
No log 3.2727 180 0.6507 0.4930 0.6507 0.8066
No log 3.3091 182 0.7969 0.3593 0.7969 0.8927
No log 3.3455 184 0.7316 0.4277 0.7316 0.8553
No log 3.3818 186 0.6744 0.4404 0.6744 0.8212
No log 3.4182 188 0.6077 0.4329 0.6077 0.7795
No log 3.4545 190 0.6733 0.4175 0.6733 0.8206
No log 3.4909 192 0.6654 0.4521 0.6654 0.8157
No log 3.5273 194 0.5251 0.5516 0.5251 0.7246
No log 3.5636 196 0.4870 0.5904 0.4870 0.6979
No log 3.6 198 0.5362 0.5627 0.5362 0.7323
No log 3.6364 200 0.4710 0.6053 0.4710 0.6863
No log 3.6727 202 0.6025 0.5101 0.6025 0.7762
No log 3.7091 204 0.7373 0.4652 0.7373 0.8587
No log 3.7455 206 0.6025 0.5045 0.6025 0.7762
No log 3.7818 208 0.5025 0.5289 0.5025 0.7089
No log 3.8182 210 0.4745 0.6037 0.4745 0.6888
No log 3.8545 212 0.4815 0.5955 0.4815 0.6939
No log 3.8909 214 0.5284 0.5498 0.5284 0.7269
No log 3.9273 216 0.6638 0.4493 0.6638 0.8148
No log 3.9636 218 0.9092 0.3632 0.9092 0.9535
No log 4.0 220 0.8585 0.3431 0.8585 0.9265
No log 4.0364 222 0.6861 0.3538 0.6861 0.8283
No log 4.0727 224 0.6867 0.3538 0.6867 0.8287
No log 4.1091 226 0.8179 0.2876 0.8179 0.9044
No log 4.1455 228 0.8508 0.3059 0.8508 0.9224
No log 4.1818 230 0.8845 0.3010 0.8845 0.9405
No log 4.2182 232 0.8435 0.3650 0.8435 0.9184
No log 4.2545 234 0.7567 0.4096 0.7567 0.8699
No log 4.2909 236 0.6673 0.4064 0.6673 0.8169
No log 4.3273 238 0.5692 0.4582 0.5692 0.7545
No log 4.3636 240 0.5634 0.4167 0.5634 0.7506
No log 4.4 242 0.5669 0.4664 0.5669 0.7529
No log 4.4364 244 0.5830 0.4947 0.5830 0.7635
No log 4.4727 246 0.6223 0.4992 0.6223 0.7889
No log 4.5091 248 0.7348 0.4526 0.7348 0.8572
No log 4.5455 250 0.7955 0.4639 0.7955 0.8919
No log 4.5818 252 0.7059 0.4124 0.7059 0.8402
No log 4.6182 254 0.6313 0.4133 0.6313 0.7945
No log 4.6545 256 0.5783 0.4175 0.5783 0.7605
No log 4.6909 258 0.6250 0.3782 0.6250 0.7906
No log 4.7273 260 0.6551 0.4186 0.6551 0.8094
No log 4.7636 262 0.7316 0.4465 0.7316 0.8554
No log 4.8 264 0.7313 0.4409 0.7313 0.8552
No log 4.8364 266 0.6554 0.4085 0.6554 0.8095
No log 4.8727 268 0.6357 0.4153 0.6357 0.7973
No log 4.9091 270 0.5635 0.4502 0.5635 0.7507
No log 4.9455 272 0.5644 0.4502 0.5644 0.7512
No log 4.9818 274 0.6429 0.4114 0.6429 0.8018
No log 5.0182 276 0.5799 0.4887 0.5799 0.7615
No log 5.0545 278 0.4983 0.5934 0.4983 0.7059
No log 5.0909 280 0.4301 0.6843 0.4301 0.6558
No log 5.1273 282 0.4387 0.6964 0.4387 0.6623
No log 5.1636 284 0.4644 0.5980 0.4644 0.6815
No log 5.2 286 0.6387 0.3913 0.6387 0.7992
No log 5.2364 288 0.7878 0.4156 0.7878 0.8876
No log 5.2727 290 0.6647 0.4462 0.6647 0.8153
No log 5.3091 292 0.4729 0.6201 0.4729 0.6877
No log 5.3455 294 0.4454 0.6620 0.4454 0.6674
No log 5.3818 296 0.5446 0.5040 0.5446 0.7380
No log 5.4182 298 0.8751 0.4060 0.8751 0.9354
No log 5.4545 300 1.0284 0.4105 1.0284 1.0141
No log 5.4909 302 0.9559 0.4083 0.9559 0.9777
No log 5.5273 304 0.6580 0.4992 0.6580 0.8112
No log 5.5636 306 0.4731 0.6408 0.4731 0.6878
No log 5.6 308 0.4777 0.5995 0.4777 0.6912
No log 5.6364 310 0.6012 0.5760 0.6012 0.7754
No log 5.6727 312 0.8549 0.4491 0.8549 0.9246
No log 5.7091 314 0.8647 0.4491 0.8647 0.9299
No log 5.7455 316 0.6772 0.4867 0.6772 0.8229
No log 5.7818 318 0.5447 0.5591 0.5447 0.7380
No log 5.8182 320 0.5739 0.5308 0.5739 0.7576
No log 5.8545 322 0.6214 0.4755 0.6214 0.7883
No log 5.8909 324 0.6335 0.4223 0.6335 0.7960
No log 5.9273 326 0.6209 0.4275 0.6209 0.7879
No log 5.9636 328 0.6749 0.4502 0.6749 0.8215
No log 6.0 330 0.6873 0.4502 0.6873 0.8290
No log 6.0364 332 0.7380 0.4161 0.7380 0.8590
No log 6.0727 334 0.6566 0.4438 0.6566 0.8103
No log 6.1091 336 0.5265 0.5736 0.5265 0.7256
No log 6.1455 338 0.4922 0.6067 0.4922 0.7016
No log 6.1818 340 0.5095 0.5867 0.5095 0.7138
No log 6.2182 342 0.5544 0.5063 0.5544 0.7446
No log 6.2545 344 0.5864 0.4502 0.5864 0.7658
No log 6.2909 346 0.6216 0.4502 0.6216 0.7884
No log 6.3273 348 0.5656 0.4582 0.5656 0.7521
No log 6.3636 350 0.5583 0.4582 0.5583 0.7472
No log 6.4 352 0.5887 0.4587 0.5887 0.7673
No log 6.4364 354 0.6389 0.4870 0.6389 0.7993
No log 6.4727 356 0.7503 0.4640 0.7503 0.8662
No log 6.5091 358 0.6832 0.5504 0.6832 0.8266
No log 6.5455 360 0.5235 0.5259 0.5235 0.7235
No log 6.5818 362 0.4206 0.6705 0.4206 0.6486
No log 6.6182 364 0.4173 0.6852 0.4173 0.6460
No log 6.6545 366 0.4190 0.7066 0.4190 0.6473
No log 6.6909 368 0.4278 0.6082 0.4278 0.6541
No log 6.7273 370 0.5136 0.5379 0.5136 0.7166
No log 6.7636 372 0.5574 0.4827 0.5574 0.7466
No log 6.8 374 0.5194 0.5252 0.5194 0.7207
No log 6.8364 376 0.5117 0.5349 0.5117 0.7154
No log 6.8727 378 0.5059 0.5290 0.5059 0.7112
No log 6.9091 380 0.5271 0.5252 0.5271 0.7260
No log 6.9455 382 0.6224 0.3847 0.6224 0.7890
No log 6.9818 384 0.7053 0.3613 0.7053 0.8398
No log 7.0182 386 0.6516 0.3973 0.6516 0.8072
No log 7.0545 388 0.4918 0.5965 0.4918 0.7013
No log 7.0909 390 0.4306 0.6530 0.4306 0.6562
No log 7.1273 392 0.4309 0.6530 0.4309 0.6564
No log 7.1636 394 0.4415 0.6488 0.4415 0.6644
No log 7.2 396 0.4942 0.5965 0.4942 0.7030
No log 7.2364 398 0.5148 0.5639 0.5148 0.7175
No log 7.2727 400 0.5343 0.5544 0.5343 0.7309
No log 7.3091 402 0.4889 0.6381 0.4889 0.6992
No log 7.3455 404 0.5215 0.5544 0.5215 0.7221
No log 7.3818 406 0.5400 0.5310 0.5400 0.7349
No log 7.4182 408 0.5882 0.4664 0.5882 0.7669
No log 7.4545 410 0.5511 0.4663 0.5511 0.7423
No log 7.4909 412 0.5680 0.4663 0.5680 0.7536
No log 7.5273 414 0.6335 0.4197 0.6335 0.7959
No log 7.5636 416 0.6344 0.4199 0.6344 0.7965
No log 7.6 418 0.5794 0.4911 0.5794 0.7612
No log 7.6364 420 0.5151 0.5736 0.5151 0.7177
No log 7.6727 422 0.5526 0.5291 0.5526 0.7434
No log 7.7091 424 0.6979 0.4952 0.6979 0.8354
No log 7.7455 426 0.7130 0.4821 0.7130 0.8444
No log 7.7818 428 0.6275 0.5206 0.6275 0.7921
No log 7.8182 430 0.6034 0.5237 0.6034 0.7768
No log 7.8545 432 0.5546 0.5104 0.5546 0.7447
No log 7.8909 434 0.5048 0.5801 0.5048 0.7105
No log 7.9273 436 0.5099 0.5614 0.5099 0.7141
No log 7.9636 438 0.5492 0.5677 0.5492 0.7411
No log 8.0 440 0.6436 0.5155 0.6436 0.8022
No log 8.0364 442 0.7266 0.4287 0.7266 0.8524
No log 8.0727 444 0.7239 0.4287 0.7239 0.8508
No log 8.1091 446 0.6601 0.4400 0.6601 0.8124
No log 8.1455 448 0.5609 0.4898 0.5609 0.7489
No log 8.1818 450 0.5157 0.5237 0.5157 0.7181
No log 8.2182 452 0.5393 0.4835 0.5393 0.7343
No log 8.2545 454 0.5881 0.4197 0.5881 0.7669
No log 8.2909 456 0.6213 0.3981 0.6213 0.7882
No log 8.3273 458 0.6695 0.4203 0.6695 0.8182
No log 8.3636 460 0.6572 0.3803 0.6572 0.8107
No log 8.4 462 0.6935 0.3803 0.6935 0.8327
No log 8.4364 464 0.6351 0.4197 0.6351 0.7969
No log 8.4727 466 0.5970 0.4749 0.5970 0.7726
No log 8.5091 468 0.5612 0.5158 0.5612 0.7492
No log 8.5455 470 0.5630 0.5158 0.5630 0.7504
No log 8.5818 472 0.6129 0.5219 0.6129 0.7829
No log 8.6182 474 0.7454 0.4114 0.7454 0.8634
No log 8.6545 476 0.7433 0.4287 0.7433 0.8621
No log 8.6909 478 0.5910 0.5131 0.5910 0.7687
No log 8.7273 480 0.4848 0.5195 0.4848 0.6963
No log 8.7636 482 0.4674 0.6096 0.4674 0.6837
No log 8.8 484 0.4707 0.6082 0.4707 0.6861
No log 8.8364 486 0.5143 0.5403 0.5143 0.7172
No log 8.8727 488 0.5962 0.4745 0.5962 0.7721
No log 8.9091 490 0.6067 0.4272 0.6067 0.7789
No log 8.9455 492 0.5353 0.5310 0.5353 0.7316
No log 8.9818 494 0.5247 0.5252 0.5247 0.7244
No log 9.0182 496 0.5983 0.4812 0.5983 0.7735
No log 9.0545 498 0.7388 0.4409 0.7388 0.8595
0.3455 9.0909 500 0.8149 0.4116 0.8149 0.9027
0.3455 9.1273 502 0.7963 0.4116 0.7963 0.8923
0.3455 9.1636 504 0.6542 0.4799 0.6542 0.8088
0.3455 9.2 506 0.5202 0.5989 0.5202 0.7213
0.3455 9.2364 508 0.4756 0.6293 0.4756 0.6896
0.3455 9.2727 510 0.4745 0.6516 0.4745 0.6888
0.3455 9.3091 512 0.4909 0.6060 0.4909 0.7007
0.3455 9.3455 514 0.5913 0.4764 0.5913 0.7690
0.3455 9.3818 516 0.7396 0.4472 0.7396 0.8600
0.3455 9.4182 518 0.7620 0.3963 0.7620 0.8730
0.3455 9.4545 520 0.6329 0.4606 0.6329 0.7955
0.3455 9.4909 522 0.4946 0.6060 0.4946 0.7033
0.3455 9.5273 524 0.4637 0.6296 0.4637 0.6810
0.3455 9.5636 526 0.4603 0.6054 0.4603 0.6785
0.3455 9.6 528 0.5111 0.5560 0.5111 0.7149
0.3455 9.6364 530 0.7104 0.3887 0.7104 0.8429
0.3455 9.6727 532 0.8059 0.3643 0.8059 0.8977
0.3455 9.7091 534 0.8649 0.3590 0.8649 0.9300
0.3455 9.7455 536 0.7245 0.3183 0.7245 0.8512
0.3455 9.7818 538 0.6918 0.4133 0.6918 0.8317
0.3455 9.8182 540 0.7140 0.4098 0.7140 0.8450
0.3455 9.8545 542 0.6959 0.4222 0.6959 0.8342
0.3455 9.8909 544 0.6289 0.4404 0.6289 0.7931
0.3455 9.9273 546 0.5962 0.4916 0.5962 0.7721
0.3455 9.9636 548 0.6374 0.4104 0.6374 0.7984
0.3455 10.0 550 0.7817 0.3593 0.7817 0.8841
0.3455 10.0364 552 0.8402 0.3484 0.8402 0.9166
0.3455 10.0727 554 0.8402 0.3484 0.8402 0.9166
0.3455 10.1091 556 0.7216 0.3324 0.7216 0.8495
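The long log above is easier to digest by selecting the row with the best validation Qwk. A minimal sketch over a handful of (epoch, step, Qwk) rows transcribed from the table (the full log has many more); it shows that the best reported Qwk, 0.7066 at step 366, is well above the 0.3324 of the final row, so the last checkpoint is not the best one by this metric:

```python
# A few (epoch, step, qwk) rows transcribed from the training log above
rows = [
    (1.9273, 106, 0.5876),
    (2.7636, 152, 0.6443),
    (5.1273, 282, 0.6964),
    (6.6545, 366, 0.7066),
    (9.2727, 510, 0.6516),
    (10.1091, 556, 0.3324),
]

# Pick the checkpoint with the highest validation Qwk
best = max(rows, key=lambda r: r[2])
print(f"Best Qwk {best[2]} at epoch {best[0]} (step {best[1]})")
```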

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors

  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k11_task7_organization

Finetuned from aubmindlab/bert-base-arabertv02