ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k7_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8169
  • Qwk: 0.4455
  • Mse: 0.8169
  • Rmse: 0.9038

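Qwk here is the quadratic weighted kappa, an agreement measure commonly used for ordinal scoring tasks such as essay rating; Mse and Rmse are the (root) mean squared error between predicted and gold scores. The snippet below is a minimal sketch of how such metrics can be computed with scikit-learn; the `y_true` and `y_pred` arrays are hypothetical placeholders rather than outputs of this model.

```python
# Minimal sketch: computing QWK, MSE and RMSE for ordinal score predictions.
# The arrays below are hypothetical placeholders, not results from this run.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 1, 3])            # gold organization scores
y_pred = np.array([2.8, 2.1, 3.6, 1.4, 3.2])  # raw model outputs

qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```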
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
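
For reference, a minimal sketch of how these settings map onto Hugging Face TrainingArguments is given below. The output directory, regression-style head and dataset objects are assumptions for illustration; the actual training script and data are not part of this card.

```python
# Sketch only: mirrors the hyperparameters listed above.
# Output dir, num_labels and the dataset variables are hypothetical placeholders.
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

base = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=1)  # assumed scoring head

args = TrainingArguments(
    output_dir="arabert-task7-organization",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults.
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_ds, eval_dataset=eval_ds)  # datasets not released
# trainer.train()
```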

Training results

(In the table below, the training loss is shown as "No log" until the first logging point; the first logged value, 0.2881, appears at step 500.)

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0571 2 2.5548 -0.0297 2.5548 1.5984
No log 0.1143 4 1.2387 0.2230 1.2387 1.1130
No log 0.1714 6 0.7899 0.1372 0.7899 0.8888
No log 0.2286 8 0.9237 0.0 0.9237 0.9611
No log 0.2857 10 0.8569 0.0966 0.8569 0.9257
No log 0.3429 12 0.8974 0.0851 0.8974 0.9473
No log 0.4 14 0.8163 0.1550 0.8163 0.9035
No log 0.4571 16 0.8243 0.2967 0.8243 0.9079
No log 0.5143 18 0.9007 0.3287 0.9007 0.9491
No log 0.5714 20 0.9818 0.3225 0.9818 0.9909
No log 0.6286 22 0.9918 0.3615 0.9918 0.9959
No log 0.6857 24 0.9653 0.3231 0.9653 0.9825
No log 0.7429 26 0.9643 0.2510 0.9643 0.9820
No log 0.8 28 0.8864 0.2285 0.8864 0.9415
No log 0.8571 30 0.8195 0.2717 0.8195 0.9053
No log 0.9143 32 0.8507 0.3606 0.8507 0.9223
No log 0.9714 34 0.8570 0.3473 0.8570 0.9258
No log 1.0286 36 0.8272 0.3606 0.8272 0.9095
No log 1.0857 38 0.7191 0.3099 0.7191 0.8480
No log 1.1429 40 0.6354 0.2537 0.6354 0.7971
No log 1.2 42 0.6607 0.2345 0.6607 0.8128
No log 1.2571 44 0.6360 0.2852 0.6360 0.7975
No log 1.3143 46 0.6752 0.4219 0.6752 0.8217
No log 1.3714 48 0.8257 0.3425 0.8257 0.9087
No log 1.4286 50 1.0136 0.4192 1.0136 1.0068
No log 1.4857 52 0.9957 0.3945 0.9957 0.9978
No log 1.5429 54 0.8437 0.4030 0.8437 0.9185
No log 1.6 56 0.7024 0.2787 0.7024 0.8381
No log 1.6571 58 0.7183 0.2930 0.7183 0.8475
No log 1.7143 60 0.7079 0.2857 0.7079 0.8414
No log 1.7714 62 0.7591 0.4382 0.7591 0.8713
No log 1.8286 64 0.8214 0.4936 0.8214 0.9063
No log 1.8857 66 0.6614 0.4702 0.6614 0.8133
No log 1.9429 68 0.5349 0.5310 0.5349 0.7314
No log 2.0 70 0.5724 0.4738 0.5724 0.7566
No log 2.0571 72 0.5961 0.3976 0.5961 0.7721
No log 2.1143 74 0.5920 0.4093 0.5920 0.7694
No log 2.1714 76 0.5652 0.3942 0.5652 0.7518
No log 2.2286 78 0.5505 0.3738 0.5505 0.7420
No log 2.2857 80 0.5457 0.3675 0.5457 0.7387
No log 2.3429 82 0.5516 0.4314 0.5516 0.7427
No log 2.4 84 0.7041 0.5175 0.7041 0.8391
No log 2.4571 86 0.9235 0.4655 0.9235 0.9610
No log 2.5143 88 0.7900 0.4906 0.7900 0.8888
No log 2.5714 90 0.5900 0.4190 0.5900 0.7681
No log 2.6286 92 0.5715 0.4076 0.5715 0.7560
No log 2.6857 94 0.6817 0.4493 0.6817 0.8256
No log 2.7429 96 1.0216 0.4267 1.0216 1.0107
No log 2.8 98 1.0026 0.4483 1.0026 1.0013
No log 2.8571 100 0.6790 0.4493 0.6790 0.8240
No log 2.9143 102 0.5204 0.3649 0.5204 0.7214
No log 2.9714 104 0.5194 0.3728 0.5194 0.7207
No log 3.0286 106 0.5811 0.4502 0.5811 0.7623
No log 3.0857 108 0.9004 0.4823 0.9004 0.9489
No log 3.1429 110 0.9724 0.4712 0.9724 0.9861
No log 3.2 112 0.7350 0.4784 0.7350 0.8573
No log 3.2571 114 0.6168 0.5026 0.6168 0.7854
No log 3.3143 116 0.5512 0.4883 0.5512 0.7424
No log 3.3714 118 0.5382 0.5485 0.5382 0.7336
No log 3.4286 120 0.5765 0.5107 0.5765 0.7593
No log 3.4857 122 0.7160 0.4536 0.7160 0.8462
No log 3.5429 124 0.7831 0.4942 0.7831 0.8849
No log 3.6 126 0.6570 0.5080 0.6570 0.8106
No log 3.6571 128 0.5533 0.4234 0.5533 0.7439
No log 3.7143 130 0.5251 0.4013 0.5251 0.7246
No log 3.7714 132 0.5162 0.4044 0.5162 0.7184
No log 3.8286 134 0.5360 0.4905 0.5360 0.7321
No log 3.8857 136 0.5632 0.4905 0.5632 0.7505
No log 3.9429 138 0.5382 0.4905 0.5382 0.7336
No log 4.0 140 0.5295 0.4753 0.5295 0.7277
No log 4.0571 142 0.5547 0.4824 0.5547 0.7448
No log 4.1143 144 0.6181 0.4726 0.6181 0.7862
No log 4.1714 146 0.6767 0.4789 0.6767 0.8226
No log 4.2286 148 0.6175 0.4801 0.6175 0.7858
No log 4.2857 150 0.5216 0.4267 0.5216 0.7222
No log 4.3429 152 0.5117 0.4637 0.5117 0.7153
No log 4.4 154 0.5401 0.5528 0.5401 0.7349
No log 4.4571 156 0.6171 0.5402 0.6171 0.7856
No log 4.5143 158 0.6736 0.4978 0.6736 0.8207
No log 4.5714 160 0.5691 0.5609 0.5691 0.7544
No log 4.6286 162 0.4985 0.4569 0.4985 0.7060
No log 4.6857 164 0.5757 0.4083 0.5757 0.7588
No log 4.7429 166 0.5685 0.4539 0.5685 0.7540
No log 4.8 168 0.4892 0.4795 0.4892 0.6994
No log 4.8571 170 0.5859 0.5308 0.5859 0.7654
No log 4.9143 172 0.7751 0.4837 0.7751 0.8804
No log 4.9714 174 0.7485 0.5029 0.7485 0.8652
No log 5.0286 176 0.5671 0.5336 0.5671 0.7531
No log 5.0857 178 0.5032 0.6431 0.5032 0.7094
No log 5.1429 180 0.5105 0.5979 0.5105 0.7145
No log 5.2 182 0.5200 0.5799 0.5200 0.7211
No log 5.2571 184 0.5446 0.4951 0.5446 0.7380
No log 5.3143 186 0.5951 0.4014 0.5951 0.7714
No log 5.3714 188 0.7400 0.3988 0.7400 0.8602
No log 5.4286 190 0.7263 0.3782 0.7263 0.8523
No log 5.4857 192 0.6098 0.4123 0.6098 0.7809
No log 5.5429 194 0.5423 0.5498 0.5423 0.7364
No log 5.6 196 0.4963 0.5836 0.4963 0.7045
No log 5.6571 198 0.4830 0.5836 0.4830 0.6950
No log 5.7143 200 0.4999 0.5237 0.4999 0.7071
No log 5.7714 202 0.5802 0.4601 0.5802 0.7617
No log 5.8286 204 0.6518 0.4604 0.6518 0.8074
No log 5.8857 206 0.6160 0.4536 0.6160 0.7848
No log 5.9429 208 0.5778 0.4799 0.5778 0.7601
No log 6.0 210 0.5384 0.5645 0.5384 0.7337
No log 6.0571 212 0.4802 0.5254 0.4802 0.6930
No log 6.1143 214 0.4610 0.5648 0.4610 0.6790
No log 6.1714 216 0.4880 0.5577 0.4880 0.6986
No log 6.2286 218 0.5107 0.5016 0.5107 0.7146
No log 6.2857 220 0.5135 0.4375 0.5135 0.7166
No log 6.3429 222 0.5353 0.3763 0.5353 0.7316
No log 6.4 224 0.6467 0.4502 0.6467 0.8042
No log 6.4571 226 0.6952 0.4277 0.6952 0.8338
No log 6.5143 228 0.5861 0.3819 0.5861 0.7656
No log 6.5714 230 0.5188 0.4479 0.5188 0.7203
No log 6.6286 232 0.4886 0.5955 0.4886 0.6990
No log 6.6857 234 0.5296 0.4606 0.5296 0.7277
No log 6.7429 236 0.5603 0.4369 0.5603 0.7485
No log 6.8 238 0.5333 0.4845 0.5333 0.7303
No log 6.8571 240 0.5813 0.4545 0.5813 0.7625
No log 6.9143 242 0.5560 0.5293 0.5560 0.7457
No log 6.9714 244 0.5265 0.4997 0.5265 0.7256
No log 7.0286 246 0.5518 0.4457 0.5518 0.7428
No log 7.0857 248 0.5390 0.4457 0.5390 0.7342
No log 7.1429 250 0.5636 0.4223 0.5636 0.7507
No log 7.2 252 0.6381 0.4085 0.6381 0.7988
No log 7.2571 254 0.6099 0.3868 0.6099 0.7810
No log 7.3143 256 0.5196 0.4491 0.5196 0.7209
No log 7.3714 258 0.4869 0.4774 0.4869 0.6978
No log 7.4286 260 0.5197 0.4838 0.5197 0.7209
No log 7.4857 262 0.5086 0.4289 0.5086 0.7132
No log 7.5429 264 0.5358 0.4392 0.5358 0.7320
No log 7.6 266 0.6197 0.4667 0.6197 0.7872
No log 7.6571 268 0.6660 0.4784 0.6660 0.8161
No log 7.7143 270 0.5870 0.4933 0.5870 0.7662
No log 7.7714 272 0.5192 0.5190 0.5192 0.7206
No log 7.8286 274 0.5069 0.5131 0.5069 0.7120
No log 7.8857 276 0.4888 0.6052 0.4888 0.6991
No log 7.9429 278 0.4968 0.4717 0.4968 0.7049
No log 8.0 280 0.5325 0.5046 0.5325 0.7297
No log 8.0571 282 0.5371 0.5826 0.5371 0.7329
No log 8.1143 284 0.5393 0.5899 0.5393 0.7344
No log 8.1714 286 0.6248 0.4418 0.6248 0.7905
No log 8.2286 288 0.7142 0.4552 0.7142 0.8451
No log 8.2857 290 0.6893 0.4479 0.6893 0.8303
No log 8.3429 292 0.6423 0.4738 0.6423 0.8014
No log 8.4 294 0.6248 0.4295 0.6248 0.7905
No log 8.4571 296 0.6049 0.4247 0.6049 0.7778
No log 8.5143 298 0.6292 0.4409 0.6292 0.7932
No log 8.5714 300 0.5694 0.3996 0.5694 0.7546
No log 8.6286 302 0.5238 0.4637 0.5238 0.7237
No log 8.6857 304 0.5253 0.4547 0.5253 0.7248
No log 8.7429 306 0.5958 0.4502 0.5958 0.7719
No log 8.8 308 0.7430 0.4703 0.7430 0.8620
No log 8.8571 310 0.7454 0.4703 0.7454 0.8634
No log 8.9143 312 0.6740 0.4085 0.6740 0.8210
No log 8.9714 314 0.5487 0.4020 0.5487 0.7407
No log 9.0286 316 0.5568 0.5321 0.5568 0.7462
No log 9.0857 318 0.5551 0.4722 0.5551 0.7451
No log 9.1429 320 0.5608 0.4020 0.5608 0.7488
No log 9.2 322 0.6614 0.4030 0.6614 0.8133
No log 9.2571 324 0.7971 0.4562 0.7971 0.8928
No log 9.3143 326 0.8162 0.4635 0.8162 0.9034
No log 9.3714 328 0.6794 0.4650 0.6794 0.8243
No log 9.4286 330 0.5785 0.4845 0.5785 0.7606
No log 9.4857 332 0.5245 0.4704 0.5245 0.7242
No log 9.5429 334 0.5172 0.5104 0.5172 0.7192
No log 9.6 336 0.5142 0.5053 0.5142 0.7171
No log 9.6571 338 0.5365 0.5379 0.5365 0.7324
No log 9.7143 340 0.5663 0.4835 0.5663 0.7526
No log 9.7714 342 0.6635 0.4018 0.6635 0.8146
No log 9.8286 344 0.7432 0.4315 0.7432 0.8621
No log 9.8857 346 0.7858 0.4396 0.7858 0.8865
No log 9.9429 348 0.7211 0.3918 0.7211 0.8491
No log 10.0 350 0.6108 0.4815 0.6108 0.7815
No log 10.0571 352 0.5746 0.4905 0.5746 0.7580
No log 10.1143 354 0.6086 0.4409 0.6086 0.7801
No log 10.1714 356 0.6823 0.3918 0.6823 0.8260
No log 10.2286 358 0.6655 0.3991 0.6655 0.8158
No log 10.2857 360 0.6241 0.4409 0.6241 0.7900
No log 10.3429 362 0.5843 0.4911 0.5843 0.7644
No log 10.4 364 0.5719 0.4979 0.5719 0.7562
No log 10.4571 366 0.5712 0.4979 0.5712 0.7558
No log 10.5143 368 0.5395 0.5063 0.5395 0.7345
No log 10.5714 370 0.5426 0.5086 0.5426 0.7366
No log 10.6286 372 0.5738 0.4827 0.5738 0.7575
No log 10.6857 374 0.5960 0.4329 0.5960 0.7720
No log 10.7429 376 0.6179 0.4175 0.6179 0.7861
No log 10.8 378 0.6082 0.4329 0.6082 0.7799
No log 10.8571 380 0.5589 0.4167 0.5589 0.7476
No log 10.9143 382 0.5615 0.4167 0.5615 0.7493
No log 10.9714 384 0.5566 0.4522 0.5566 0.7460
No log 11.0286 386 0.5289 0.4845 0.5289 0.7272
No log 11.0857 388 0.4943 0.5768 0.4943 0.7030
No log 11.1429 390 0.4929 0.5396 0.4929 0.7020
No log 11.2 392 0.4948 0.5703 0.4948 0.7034
No log 11.2571 394 0.5605 0.4812 0.5605 0.7486
No log 11.3143 396 0.6919 0.4536 0.6919 0.8318
No log 11.3714 398 0.7522 0.4472 0.7522 0.8673
No log 11.4286 400 0.7019 0.4650 0.7019 0.8378
No log 11.4857 402 0.5882 0.4424 0.5882 0.7669
No log 11.5429 404 0.5661 0.4167 0.5661 0.7524
No log 11.6 406 0.5795 0.4089 0.5795 0.7613
No log 11.6571 408 0.6495 0.4438 0.6495 0.8059
No log 11.7143 410 0.7601 0.4396 0.7601 0.8718
No log 11.7714 412 0.7672 0.4542 0.7672 0.8759
No log 11.8286 414 0.6314 0.4545 0.6314 0.7946
No log 11.8857 416 0.5463 0.4430 0.5463 0.7391
No log 11.9429 418 0.5246 0.5177 0.5246 0.7243
No log 12.0 420 0.5168 0.5271 0.5168 0.7189
No log 12.0571 422 0.5599 0.4167 0.5599 0.7482
No log 12.1143 424 0.7095 0.4347 0.7095 0.8423
No log 12.1714 426 0.8346 0.3905 0.8346 0.9136
No log 12.2286 428 0.8484 0.4003 0.8484 0.9211
No log 12.2857 430 0.7568 0.4255 0.7568 0.8699
No log 12.3429 432 0.5978 0.3894 0.5978 0.7732
No log 12.4 434 0.5053 0.4925 0.5053 0.7108
No log 12.4571 436 0.4808 0.6636 0.4808 0.6934
No log 12.5143 438 0.4823 0.6228 0.4823 0.6945
No log 12.5714 440 0.5600 0.5024 0.5600 0.7484
No log 12.6286 442 0.6970 0.4794 0.6970 0.8349
No log 12.6857 444 0.7033 0.4794 0.7033 0.8386
No log 12.7429 446 0.5694 0.5132 0.5694 0.7546
No log 12.8 448 0.4619 0.6065 0.4619 0.6796
No log 12.8571 450 0.4633 0.6555 0.4633 0.6806
No log 12.9143 452 0.4650 0.6747 0.4650 0.6819
No log 12.9714 454 0.4801 0.5723 0.4801 0.6929
No log 13.0286 456 0.5762 0.4539 0.5762 0.7591
No log 13.0857 458 0.6676 0.4315 0.6676 0.8171
No log 13.1429 460 0.6882 0.4315 0.6882 0.8296
No log 13.2 462 0.6382 0.4592 0.6382 0.7989
No log 13.2571 464 0.6431 0.4592 0.6431 0.8019
No log 13.3143 466 0.6426 0.4947 0.6426 0.8016
No log 13.3714 468 0.6917 0.4597 0.6917 0.8317
No log 13.4286 470 0.7433 0.4287 0.7433 0.8621
No log 13.4857 472 0.8455 0.4050 0.8455 0.9195
No log 13.5429 474 0.8447 0.4444 0.8447 0.9191
No log 13.6 476 0.7847 0.4777 0.7847 0.8858
No log 13.6571 478 0.6986 0.5068 0.6986 0.8358
No log 13.7143 480 0.6322 0.4059 0.6322 0.7951
No log 13.7714 482 0.5823 0.3919 0.5823 0.7631
No log 13.8286 484 0.5898 0.3738 0.5898 0.7680
No log 13.8857 486 0.6340 0.5081 0.6340 0.7962
No log 13.9429 488 0.7260 0.5065 0.7260 0.8521
No log 14.0 490 0.7531 0.4462 0.7531 0.8678
No log 14.0571 492 0.7394 0.4315 0.7394 0.8599
No log 14.1143 494 0.7373 0.4175 0.7373 0.8587
No log 14.1714 496 0.6912 0.4307 0.6912 0.8314
No log 14.2286 498 0.6419 0.4824 0.6419 0.8012
0.2881 14.2857 500 0.6334 0.4597 0.6334 0.7959
0.2881 14.3429 502 0.6381 0.5149 0.6381 0.7988
0.2881 14.4 504 0.6477 0.5362 0.6477 0.8048
0.2881 14.4571 506 0.6679 0.4741 0.6679 0.8172
0.2881 14.5143 508 0.6176 0.4602 0.6176 0.7859
0.2881 14.5714 510 0.6026 0.4602 0.6026 0.7762
0.2881 14.6286 512 0.5857 0.4782 0.5857 0.7653
0.2881 14.6857 514 0.5689 0.4782 0.5689 0.7542
0.2881 14.7429 516 0.6045 0.5042 0.6045 0.7775
0.2881 14.8 518 0.6724 0.5147 0.6724 0.8200
0.2881 14.8571 520 0.7133 0.4721 0.7133 0.8446
0.2881 14.9143 522 0.6347 0.4930 0.6347 0.7967
0.2881 14.9714 524 0.5713 0.4663 0.5713 0.7558
0.2881 15.0286 526 0.5521 0.4158 0.5521 0.7430
0.2881 15.0857 528 0.5788 0.4247 0.5788 0.7608
0.2881 15.1429 530 0.6769 0.3991 0.6769 0.8228
0.2881 15.2 532 0.7825 0.4379 0.7825 0.8846
0.2881 15.2571 534 0.8169 0.4455 0.8169 0.9038

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
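
With the versions above installed, the checkpoint can be loaded as a standard sequence-classification model. This is a hedged sketch: the single-score (regression-style) reading of the logits is an assumption, since the card does not document the task head or label scale.

```python
# Sketch: loading the published checkpoint for inference.
# The sequence-classification head is assumed; check the training code for
# the actual task head and how its output maps to organization scores.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k7_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

essay = "نص المقال هنا"  # placeholder Arabic essay text
inputs = tokenizer(essay, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)  # interpretation (single score vs. class logits) depends on the training setup
```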