ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k10_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5333
  • QWK: 0.4929
  • MSE: 0.5333
  • RMSE: 0.7302
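QWK (quadratic weighted kappa) rewards predictions that land close to the true ordinal label, while MSE and RMSE measure squared error. A minimal sketch of how these metrics can be computed with scikit-learn; the label values below are illustrative, not drawn from this model's evaluation set:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical ordinal labels (e.g. essay-organization scores 0-3)
y_true = np.array([0, 1, 2, 2, 1, 0, 3, 2])
y_pred = np.array([0, 1, 1, 2, 1, 0, 2, 2])

# Quadratic weighting penalizes large ordinal disagreements more heavily
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
```

With two off-by-one errors over eight examples, `mse` is 0.25 and `rmse` is 0.5; `qwk` falls between 0 (chance agreement) and 1 (perfect agreement).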

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
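The linear scheduler decays the learning rate from its initial value to zero over the total number of optimizer steps. A minimal sketch, assuming no warmup; the step count of ~53 per epoch is inferred from the training log (epoch 8.0 falls at step 424), so treat it as an estimate:

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-05) -> float:
    """Linearly decay the learning rate from base_lr down to 0."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# ~53 optimizer steps per epoch x 100 epochs
total_steps = 53 * 100

lr_start = linear_lr(0, total_steps)                 # 2e-05 at the start
lr_mid = linear_lr(total_steps // 2, total_steps)    # 1e-05 halfway through
lr_end = linear_lr(total_steps, total_steps)         # 0.0 at the end
```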

Training results

The training loss first appears at step 500; rows before that show "No log".

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0377 2 2.7319 -0.0262 2.7319 1.6528
No log 0.0755 4 1.4361 0.0518 1.4361 1.1984
No log 0.1132 6 1.1330 -0.0927 1.1330 1.0644
No log 0.1509 8 1.0110 -0.0112 1.0110 1.0055
No log 0.1887 10 1.0696 0.1238 1.0696 1.0342
No log 0.2264 12 0.9285 0.2142 0.9285 0.9636
No log 0.2642 14 0.8097 0.2440 0.8097 0.8998
No log 0.3019 16 0.7247 0.1953 0.7247 0.8513
No log 0.3396 18 0.6629 0.3010 0.6629 0.8142
No log 0.3774 20 0.7469 0.3693 0.7469 0.8642
No log 0.4151 22 0.9698 0.2335 0.9698 0.9848
No log 0.4528 24 1.0911 0.1935 1.0911 1.0446
No log 0.4906 26 0.8748 0.2613 0.8748 0.9353
No log 0.5283 28 0.6597 0.1282 0.6597 0.8122
No log 0.5660 30 0.8258 0.1183 0.8258 0.9087
No log 0.6038 32 1.1105 0.1869 1.1105 1.0538
No log 0.6415 34 1.0976 0.1573 1.0976 1.0477
No log 0.6792 36 0.9268 0.2982 0.9268 0.9627
No log 0.7170 38 0.7780 0.2206 0.7780 0.8820
No log 0.7547 40 0.7339 0.0840 0.7339 0.8567
No log 0.7925 42 0.7237 0.0840 0.7237 0.8507
No log 0.8302 44 0.7188 0.1660 0.7188 0.8478
No log 0.8679 46 0.7107 0.2786 0.7107 0.8431
No log 0.9057 48 0.7222 0.1699 0.7222 0.8498
No log 0.9434 50 0.7375 0.2589 0.7375 0.8588
No log 0.9811 52 0.7048 0.3305 0.7048 0.8395
No log 1.0189 54 0.7216 0.3289 0.7216 0.8495
No log 1.0566 56 0.7228 0.2621 0.7228 0.8502
No log 1.0943 58 0.8643 0.2410 0.8643 0.9297
No log 1.1321 60 0.9576 0.2703 0.9576 0.9786
No log 1.1698 62 1.1511 0.2543 1.1511 1.0729
No log 1.2075 64 1.0542 0.3225 1.0542 1.0268
No log 1.2453 66 1.0349 0.3006 1.0349 1.0173
No log 1.2830 68 0.8792 0.3294 0.8792 0.9376
No log 1.3208 70 0.7559 0.2950 0.7559 0.8694
No log 1.3585 72 0.7597 0.2652 0.7597 0.8716
No log 1.3962 74 0.7954 0.2754 0.7954 0.8918
No log 1.4340 76 0.9312 0.2836 0.9312 0.9650
No log 1.4717 78 1.0766 0.2437 1.0766 1.0376
No log 1.5094 80 1.2486 0.2880 1.2486 1.1174
No log 1.5472 82 1.2123 0.2691 1.2123 1.1011
No log 1.5849 84 0.8983 0.1808 0.8983 0.9478
No log 1.6226 86 0.7764 0.3183 0.7764 0.8811
No log 1.6604 88 0.7470 0.2576 0.7470 0.8643
No log 1.6981 90 0.7358 0.4072 0.7358 0.8578
No log 1.7358 92 0.6712 0.4158 0.6712 0.8192
No log 1.7736 94 0.6385 0.3336 0.6385 0.7991
No log 1.8113 96 0.6300 0.3407 0.6300 0.7937
No log 1.8491 98 0.6050 0.4459 0.6050 0.7778
No log 1.8868 100 0.6998 0.4788 0.6998 0.8365
No log 1.9245 102 0.9243 0.3538 0.9243 0.9614
No log 1.9623 104 0.8295 0.4384 0.8295 0.9108
No log 2.0 106 0.6225 0.3399 0.6225 0.7890
No log 2.0377 108 0.6421 0.3703 0.6421 0.8013
No log 2.0755 110 0.7471 0.2652 0.7471 0.8643
No log 2.1132 112 0.6740 0.3833 0.6740 0.8210
No log 2.1509 114 0.6292 0.4212 0.6292 0.7932
No log 2.1887 116 0.9467 0.2782 0.9467 0.9730
No log 2.2264 118 1.2354 0.2778 1.2354 1.1115
No log 2.2642 120 1.0988 0.2478 1.0988 1.0483
No log 2.3019 122 0.7472 0.4788 0.7472 0.8644
No log 2.3396 124 0.6448 0.3691 0.6448 0.8030
No log 2.3774 126 0.6701 0.3928 0.6701 0.8186
No log 2.4151 128 0.6386 0.3601 0.6386 0.7991
No log 2.4528 130 0.6052 0.3375 0.6052 0.7779
No log 2.4906 132 0.6124 0.4602 0.6124 0.7826
No log 2.5283 134 0.7190 0.4862 0.7190 0.8479
No log 2.5660 136 0.6719 0.4862 0.6719 0.8197
No log 2.6038 138 0.5602 0.4270 0.5602 0.7485
No log 2.6415 140 0.5497 0.4292 0.5497 0.7414
No log 2.6792 142 0.5652 0.3996 0.5652 0.7518
No log 2.7170 144 0.6177 0.4329 0.6177 0.7860
No log 2.7547 146 0.6589 0.4788 0.6589 0.8117
No log 2.7925 148 0.6194 0.4089 0.6194 0.7870
No log 2.8302 150 0.5481 0.4182 0.5481 0.7403
No log 2.8679 152 0.5528 0.2947 0.5528 0.7435
No log 2.9057 154 0.5717 0.3417 0.5717 0.7561
No log 2.9434 156 0.5539 0.3556 0.5539 0.7442
No log 2.9811 158 0.5698 0.4587 0.5698 0.7548
No log 3.0189 160 0.6083 0.4777 0.6083 0.7799
No log 3.0566 162 0.6183 0.3754 0.6183 0.7863
No log 3.0943 164 0.6286 0.3931 0.6286 0.7928
No log 3.1321 166 0.6261 0.3701 0.6261 0.7912
No log 3.1698 168 0.6149 0.4662 0.6149 0.7841
No log 3.2075 170 0.5961 0.4059 0.5961 0.7721
No log 3.2453 172 0.6446 0.4892 0.6446 0.8029
No log 3.2830 174 0.7446 0.5343 0.7446 0.8629
No log 3.3208 176 0.6728 0.5048 0.6728 0.8202
No log 3.3585 178 0.5562 0.3426 0.5562 0.7458
No log 3.3962 180 0.5647 0.3471 0.5647 0.7514
No log 3.4340 182 0.6458 0.4728 0.6458 0.8036
No log 3.4717 184 0.6695 0.4728 0.6695 0.8182
No log 3.5094 186 0.6016 0.3996 0.6016 0.7757
No log 3.5472 188 0.6026 0.3551 0.6026 0.7763
No log 3.5849 190 0.6146 0.3551 0.6146 0.7839
No log 3.6226 192 0.6273 0.2965 0.6273 0.7920
No log 3.6604 194 0.6265 0.3355 0.6265 0.7915
No log 3.6981 196 0.6574 0.3867 0.6574 0.8108
No log 3.7358 198 0.7285 0.4424 0.7285 0.8535
No log 3.7736 200 0.6694 0.4491 0.6694 0.8182
No log 3.8113 202 0.6128 0.3867 0.6128 0.7828
No log 3.8491 204 0.5945 0.3689 0.5945 0.7710
No log 3.8868 206 0.6365 0.3833 0.6365 0.7978
No log 3.9245 208 0.6323 0.3958 0.6323 0.7952
No log 3.9623 210 0.5822 0.4091 0.5822 0.7630
No log 4.0 212 0.5988 0.3615 0.5988 0.7738
No log 4.0377 214 0.6155 0.4190 0.6155 0.7845
No log 4.0755 216 0.6024 0.3891 0.6024 0.7762
No log 4.1132 218 0.6147 0.3408 0.6147 0.7840
No log 4.1509 220 0.6241 0.3891 0.6241 0.7900
No log 4.1887 222 0.6201 0.3355 0.6201 0.7875
No log 4.2264 224 0.5994 0.3253 0.5994 0.7742
No log 4.2642 226 0.6056 0.3320 0.6056 0.7782
No log 4.3019 228 0.6328 0.5083 0.6328 0.7955
No log 4.3396 230 0.5950 0.4569 0.5950 0.7713
No log 4.3774 232 0.5597 0.3865 0.5597 0.7481
No log 4.4151 234 0.5876 0.3996 0.5876 0.7666
No log 4.4528 236 0.6905 0.4788 0.6905 0.8309
No log 4.4906 238 0.6436 0.4644 0.6436 0.8022
No log 4.5283 240 0.5408 0.4419 0.5408 0.7354
No log 4.5660 242 0.6458 0.4783 0.6458 0.8036
No log 4.6038 244 0.6972 0.4466 0.6972 0.8350
No log 4.6415 246 0.5912 0.4886 0.5912 0.7689
No log 4.6792 248 0.5268 0.5286 0.5268 0.7258
No log 4.7170 250 0.5373 0.4505 0.5373 0.7330
No log 4.7547 252 0.5724 0.4391 0.5724 0.7565
No log 4.7925 254 0.5664 0.4611 0.5664 0.7526
No log 4.8302 256 0.5644 0.4407 0.5644 0.7512
No log 4.8679 258 0.5759 0.4802 0.5759 0.7589
No log 4.9057 260 0.5782 0.5028 0.5782 0.7604
No log 4.9434 262 0.5597 0.3939 0.5597 0.7482
No log 4.9811 264 0.5651 0.3886 0.5651 0.7517
No log 5.0189 266 0.5804 0.5009 0.5804 0.7618
No log 5.0566 268 0.5712 0.4829 0.5712 0.7558
No log 5.0943 270 0.5838 0.3572 0.5838 0.7641
No log 5.1321 272 0.6428 0.4167 0.6428 0.8018
No log 5.1698 274 0.6011 0.3737 0.6011 0.7753
No log 5.2075 276 0.5500 0.3782 0.5500 0.7416
No log 5.2453 278 0.5448 0.4224 0.5448 0.7381
No log 5.2830 280 0.5436 0.4061 0.5436 0.7373
No log 5.3208 282 0.5308 0.4515 0.5308 0.7285
No log 5.3585 284 0.5228 0.4264 0.5228 0.7230
No log 5.3962 286 0.5336 0.4685 0.5336 0.7305
No log 5.4340 288 0.5453 0.4243 0.5453 0.7385
No log 5.4717 290 0.5400 0.4685 0.5400 0.7349
No log 5.5094 292 0.5308 0.4685 0.5308 0.7286
No log 5.5472 294 0.5246 0.4888 0.5246 0.7243
No log 5.5849 296 0.5194 0.5250 0.5194 0.7207
No log 5.6226 298 0.5264 0.4329 0.5264 0.7255
No log 5.6604 300 0.5345 0.4575 0.5345 0.7311
No log 5.6981 302 0.5317 0.4386 0.5317 0.7292
No log 5.7358 304 0.5332 0.4420 0.5332 0.7302
No log 5.7736 306 0.5950 0.4247 0.5950 0.7714
No log 5.8113 308 0.5697 0.4076 0.5697 0.7548
No log 5.8491 310 0.5389 0.3809 0.5389 0.7341
No log 5.8868 312 0.5722 0.5083 0.5722 0.7564
No log 5.9245 314 0.5817 0.5544 0.5817 0.7627
No log 5.9623 316 0.5555 0.5078 0.5555 0.7453
No log 6.0 318 0.5215 0.4904 0.5215 0.7222
No log 6.0377 320 0.5262 0.4724 0.5262 0.7254
No log 6.0755 322 0.5146 0.4743 0.5146 0.7174
No log 6.1132 324 0.5277 0.5078 0.5277 0.7264
No log 6.1509 326 0.5660 0.5095 0.5660 0.7523
No log 6.1887 328 0.5508 0.5455 0.5508 0.7422
No log 6.2264 330 0.5025 0.5252 0.5025 0.7088
No log 6.2642 332 0.5164 0.4788 0.5164 0.7186
No log 6.3019 334 0.4958 0.5071 0.4958 0.7041
No log 6.3396 336 0.5045 0.5811 0.5045 0.7103
No log 6.3774 338 0.5873 0.5249 0.5873 0.7664
No log 6.4151 340 0.6010 0.5331 0.6010 0.7752
No log 6.4528 342 0.5263 0.5373 0.5263 0.7255
No log 6.4906 344 0.4960 0.5170 0.4960 0.7043
No log 6.5283 346 0.5074 0.4983 0.5074 0.7124
No log 6.5660 348 0.5054 0.4077 0.5054 0.7109
No log 6.6038 350 0.5210 0.4958 0.5210 0.7218
No log 6.6415 352 0.5613 0.5470 0.5613 0.7492
No log 6.6792 354 0.5634 0.5470 0.5634 0.7506
No log 6.7170 356 0.5339 0.5083 0.5339 0.7307
No log 6.7547 358 0.5323 0.4821 0.5323 0.7296
No log 6.7925 360 0.5366 0.4632 0.5366 0.7325
No log 6.8302 362 0.5204 0.3964 0.5204 0.7214
No log 6.8679 364 0.5179 0.4253 0.5179 0.7196
No log 6.9057 366 0.5178 0.3939 0.5178 0.7196
No log 6.9434 368 0.5303 0.4527 0.5303 0.7282
No log 6.9811 370 0.5399 0.4527 0.5399 0.7348
No log 7.0189 372 0.5512 0.4697 0.5512 0.7424
No log 7.0566 374 0.5575 0.4697 0.5575 0.7467
No log 7.0943 376 0.5577 0.4105 0.5577 0.7468
No log 7.1321 378 0.5793 0.3598 0.5793 0.7611
No log 7.1698 380 0.6110 0.3615 0.6110 0.7816
No log 7.2075 382 0.5929 0.3325 0.5929 0.7700
No log 7.2453 384 0.5800 0.4555 0.5800 0.7615
No log 7.2830 386 0.5786 0.4635 0.5786 0.7607
No log 7.3208 388 0.5568 0.3862 0.5568 0.7462
No log 7.3585 390 0.5934 0.3545 0.5934 0.7703
No log 7.3962 392 0.6012 0.3267 0.6012 0.7754
No log 7.4340 394 0.5958 0.2943 0.5958 0.7719
No log 7.4717 396 0.5745 0.3625 0.5745 0.7579
No log 7.5094 398 0.5798 0.4717 0.5798 0.7615
No log 7.5472 400 0.6060 0.4913 0.6060 0.7785
No log 7.5849 402 0.5955 0.5250 0.5955 0.7717
No log 7.6226 404 0.6088 0.3643 0.6088 0.7803
No log 7.6604 406 0.6459 0.3640 0.6459 0.8037
No log 7.6981 408 0.6996 0.3287 0.6996 0.8364
No log 7.7358 410 0.7215 0.3985 0.7215 0.8494
No log 7.7736 412 0.6381 0.3545 0.6381 0.7988
No log 7.8113 414 0.5911 0.3324 0.5911 0.7688
No log 7.8491 416 0.5752 0.3625 0.5752 0.7584
No log 7.8868 418 0.5815 0.3762 0.5815 0.7625
No log 7.9245 420 0.5722 0.3667 0.5722 0.7564
No log 7.9623 422 0.5518 0.4536 0.5518 0.7428
No log 8.0 424 0.5445 0.4536 0.5445 0.7379
No log 8.0377 426 0.5391 0.4082 0.5391 0.7342
No log 8.0755 428 0.5362 0.4082 0.5362 0.7322
No log 8.1132 430 0.5444 0.4114 0.5444 0.7378
No log 8.1509 432 0.6008 0.3918 0.6008 0.7751
No log 8.1887 434 0.6349 0.4167 0.6349 0.7968
No log 8.2264 436 0.6172 0.4167 0.6172 0.7856
No log 8.2642 438 0.6639 0.4167 0.6639 0.8148
No log 8.3019 440 0.6484 0.4167 0.6484 0.8052
No log 8.3396 442 0.6168 0.3918 0.6168 0.7854
No log 8.3774 444 0.5562 0.3211 0.5562 0.7458
No log 8.4151 446 0.5479 0.3308 0.5479 0.7402
No log 8.4528 448 0.5522 0.3701 0.5522 0.7431
No log 8.4906 450 0.5513 0.3894 0.5513 0.7425
No log 8.5283 452 0.5549 0.4158 0.5549 0.7449
No log 8.5660 454 0.5980 0.4409 0.5980 0.7733
No log 8.6038 456 0.5915 0.4502 0.5915 0.7691
No log 8.6415 458 0.5335 0.4864 0.5335 0.7304
No log 8.6792 460 0.5102 0.5167 0.5102 0.7143
No log 8.7170 462 0.5055 0.4820 0.5055 0.7110
No log 8.7547 464 0.5287 0.4466 0.5287 0.7271
No log 8.7925 466 0.5269 0.4205 0.5269 0.7259
No log 8.8302 468 0.5168 0.4229 0.5168 0.7189
No log 8.8679 470 0.5772 0.4167 0.5772 0.7597
No log 8.9057 472 0.6921 0.4783 0.6921 0.8319
No log 8.9434 474 0.6679 0.4783 0.6679 0.8173
No log 8.9811 476 0.5516 0.4664 0.5516 0.7427
No log 9.0189 478 0.4971 0.5195 0.4971 0.7051
No log 9.0566 480 0.5020 0.5195 0.5020 0.7085
No log 9.0943 482 0.4924 0.4742 0.4924 0.7017
No log 9.1321 484 0.4950 0.5003 0.4950 0.7036
No log 9.1698 486 0.5276 0.4835 0.5276 0.7264
No log 9.2075 488 0.5324 0.4835 0.5324 0.7297
No log 9.2453 490 0.5320 0.5017 0.5320 0.7294
No log 9.2830 492 0.5558 0.4835 0.5558 0.7455
No log 9.3208 494 0.5243 0.4437 0.5243 0.7241
No log 9.3585 496 0.4888 0.5003 0.4888 0.6991
No log 9.3962 498 0.4825 0.4402 0.4825 0.6946
0.3426 9.4340 500 0.4982 0.5131 0.4982 0.7058
0.3426 9.4717 502 0.5118 0.4925 0.5118 0.7154
0.3426 9.5094 504 0.5032 0.4875 0.5032 0.7093
0.3426 9.5472 506 0.4815 0.5057 0.4815 0.6939
0.3426 9.5849 508 0.4885 0.5683 0.4885 0.6990
0.3426 9.6226 510 0.4963 0.5353 0.4963 0.7045
0.3426 9.6604 512 0.5263 0.5324 0.5263 0.7255
0.3426 9.6981 514 0.5354 0.4513 0.5354 0.7317
0.3426 9.7358 516 0.5037 0.4855 0.5037 0.7097
0.3426 9.7736 518 0.4846 0.5912 0.4846 0.6961
0.3426 9.8113 520 0.4918 0.5335 0.4918 0.7013
0.3426 9.8491 522 0.5099 0.5352 0.5099 0.7141
0.3426 9.8868 524 0.4930 0.5335 0.4930 0.7021
0.3426 9.9245 526 0.4949 0.5306 0.4949 0.7035
0.3426 9.9623 528 0.5303 0.4929 0.5303 0.7282
0.3426 10.0 530 0.5219 0.4968 0.5219 0.7225
0.3426 10.0377 532 0.5380 0.4801 0.5380 0.7335
0.3426 10.0755 534 0.5379 0.4864 0.5379 0.7334
0.3426 10.1132 536 0.5333 0.4929 0.5333 0.7302
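In the table above the Validation Loss and MSE columns coincide, which suggests the model was trained as a regressor with an MSE objective; RMSE is then just the square root of MSE. Checking the final checkpoint's numbers:

```python
import math

val_mse = 0.5333            # final validation MSE from the table above
rmse = math.sqrt(val_mse)   # ~0.7303, matching the reported 0.7302
                            # up to rounding of the MSE
```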

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size: 0.1B params (Safetensors, F32 tensors)

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k10_task7_organization

Finetuned from aubmindlab/bert-base-arabertv02