ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k10_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8486
  • QWK: 0.3965
  • MSE: 0.8486
  • RMSE: 0.9212
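For reference, these metrics can be reproduced from gold labels and model predictions with scikit-learn. The snippet below is a minimal sketch; the label arrays are illustrative placeholders, since the card does not publish the actual evaluation data:

```python
# Sketch of the evaluation metrics reported above (QWK, MSE, RMSE),
# using scikit-learn. The arrays are illustrative, not the card's data.
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = [0, 1, 2, 1, 3]   # illustrative gold scores
y_pred = [0, 1, 1, 1, 3]   # illustrative model predictions

# Quadratic Weighted Kappa: agreement corrected for chance, with
# disagreements penalized by the squared distance between scores.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = mse ** 0.5
```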

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
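As a rough illustration of the `linear` scheduler above, the learning rate decays linearly from its initial value to zero over training. A minimal sketch, assuming zero warmup steps (the card does not list a warmup setting):

```python
# Sketch of a linear learning-rate decay matching
# lr_scheduler_type: linear with learning_rate: 2e-05.
# Assumes no warmup steps, which the card does not state.
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-05) -> float:
    """Learning rate after `step` optimizer steps, decayed linearly to 0."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)
```

At step 0 the rate is 2e-05, at the midpoint 1e-05, and it reaches 0 at the final step.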

Training results

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0571 2 2.4979 -0.0109 2.4979 1.5805
No log 0.1143 4 1.3022 0.0511 1.3022 1.1411
No log 0.1714 6 0.9077 0.0101 0.9077 0.9527
No log 0.2286 8 1.1536 -0.1405 1.1536 1.0741
No log 0.2857 10 1.2024 -0.1356 1.2024 1.0966
No log 0.3429 12 1.0764 0.1029 1.0764 1.0375
No log 0.4 14 1.0514 0.0821 1.0514 1.0254
No log 0.4571 16 0.8727 0.0888 0.8727 0.9342
No log 0.5143 18 0.7553 0.1282 0.7553 0.8691
No log 0.5714 20 0.8678 0.1770 0.8678 0.9316
No log 0.6286 22 0.8648 0.2156 0.8648 0.9300
No log 0.6857 24 0.8587 0.1754 0.8587 0.9267
No log 0.7429 26 0.8834 0.2841 0.8834 0.9399
No log 0.8 28 0.8212 0.1770 0.8212 0.9062
No log 0.8571 30 0.7522 0.0481 0.7522 0.8673
No log 0.9143 32 0.7207 0.0481 0.7207 0.8489
No log 0.9714 34 0.7141 0.0937 0.7141 0.8451
No log 1.0286 36 0.7079 0.1786 0.7079 0.8414
No log 1.0857 38 0.7643 0.1770 0.7643 0.8743
No log 1.1429 40 0.8471 0.2841 0.8471 0.9204
No log 1.2 42 0.9324 0.3192 0.9324 0.9656
No log 1.2571 44 1.1124 0.2246 1.1124 1.0547
No log 1.3143 46 0.9861 0.2928 0.9861 0.9930
No log 1.3714 48 0.8829 0.3230 0.8829 0.9396
No log 1.4286 50 0.8692 0.1685 0.8692 0.9323
No log 1.4857 52 0.8258 0.2494 0.8258 0.9087
No log 1.5429 54 0.7041 0.1365 0.7041 0.8391
No log 1.6 56 0.6351 0.2095 0.6351 0.7969
No log 1.6571 58 0.6390 0.2459 0.6390 0.7994
No log 1.7143 60 0.7629 0.3695 0.7629 0.8735
No log 1.7714 62 0.7884 0.3645 0.7884 0.8879
No log 1.8286 64 0.6952 0.2879 0.6952 0.8338
No log 1.8857 66 0.7105 0.0481 0.7105 0.8429
No log 1.9429 68 0.7584 0.0 0.7584 0.8709
No log 2.0 70 0.8254 0.0 0.8254 0.9085
No log 2.0571 72 0.8290 0.0481 0.8290 0.9105
No log 2.1143 74 0.7830 0.0937 0.7830 0.8849
No log 2.1714 76 0.8088 0.0937 0.8088 0.8993
No log 2.2286 78 0.7795 0.0937 0.7795 0.8829
No log 2.2857 80 0.7374 0.0947 0.7374 0.8587
No log 2.3429 82 0.6482 0.0944 0.6482 0.8051
No log 2.4 84 0.5724 0.3661 0.5724 0.7566
No log 2.4571 86 0.5765 0.3141 0.5765 0.7593
No log 2.5143 88 0.5863 0.4938 0.5863 0.7657
No log 2.5714 90 0.7108 0.3849 0.7108 0.8431
No log 2.6286 92 0.9139 0.2820 0.9139 0.9560
No log 2.6857 94 0.9292 0.2263 0.9292 0.9640
No log 2.7429 96 0.8301 0.2958 0.8301 0.9111
No log 2.8 98 0.7596 0.3825 0.7596 0.8716
No log 2.8571 100 0.7407 0.3553 0.7407 0.8606
No log 2.9143 102 0.7036 0.3772 0.7036 0.8388
No log 2.9714 104 0.6291 0.3215 0.6291 0.7932
No log 3.0286 106 0.5921 0.3377 0.5921 0.7695
No log 3.0857 108 0.6346 0.3746 0.6346 0.7966
No log 3.1429 110 0.6681 0.4268 0.6681 0.8174
No log 3.2 112 0.6841 0.4268 0.6841 0.8271
No log 3.2571 114 0.6546 0.3719 0.6546 0.8090
No log 3.3143 116 0.5706 0.3446 0.5706 0.7554
No log 3.3714 118 0.6362 0.3444 0.6362 0.7976
No log 3.4286 120 0.8069 0.4255 0.8069 0.8983
No log 3.4857 122 0.8419 0.3761 0.8419 0.9176
No log 3.5429 124 0.6560 0.3444 0.6560 0.8099
No log 3.6 126 0.6423 0.2308 0.6423 0.8015
No log 3.6571 128 0.6872 0.3318 0.6872 0.8289
No log 3.7143 130 0.6615 0.4105 0.6615 0.8134
No log 3.7714 132 0.6194 0.4722 0.6194 0.7870
No log 3.8286 134 0.6236 0.4514 0.6236 0.7897
No log 3.8857 136 0.6200 0.4514 0.6200 0.7874
No log 3.9429 138 0.6047 0.4722 0.6047 0.7776
No log 4.0 140 0.7304 0.4429 0.7304 0.8547
No log 4.0571 142 0.7761 0.3913 0.7761 0.8810
No log 4.1143 144 0.6578 0.4517 0.6578 0.8111
No log 4.1714 146 0.6499 0.4259 0.6499 0.8061
No log 4.2286 148 0.5753 0.3754 0.5753 0.7585
No log 4.2857 150 0.5560 0.5250 0.5560 0.7456
No log 4.3429 152 0.5508 0.5289 0.5508 0.7422
No log 4.4 154 0.5909 0.5030 0.5909 0.7687
No log 4.4571 156 0.9213 0.3679 0.9213 0.9598
No log 4.5143 158 1.1699 0.1903 1.1699 1.0816
No log 4.5714 160 1.1197 0.2871 1.1197 1.0582
No log 4.6286 162 0.8644 0.3521 0.8644 0.9297
No log 4.6857 164 0.5859 0.4584 0.5859 0.7654
No log 4.7429 166 0.5512 0.5268 0.5512 0.7425
No log 4.8 168 0.5483 0.5424 0.5483 0.7405
No log 4.8571 170 0.5975 0.3640 0.5975 0.7730
No log 4.9143 172 0.6408 0.4186 0.6408 0.8005
No log 4.9714 174 0.6634 0.4455 0.6634 0.8145
No log 5.0286 176 0.5955 0.4562 0.5955 0.7717
No log 5.0857 178 0.5296 0.4898 0.5296 0.7277
No log 5.1429 180 0.5462 0.5647 0.5462 0.7391
No log 5.2 182 0.5296 0.6032 0.5296 0.7278
No log 5.2571 184 0.5435 0.4674 0.5435 0.7372
No log 5.3143 186 0.6740 0.4092 0.6740 0.8210
No log 5.3714 188 0.6529 0.4491 0.6529 0.8080
No log 5.4286 190 0.5687 0.4569 0.5687 0.7541
No log 5.4857 192 0.5587 0.5022 0.5587 0.7475
No log 5.5429 194 0.5798 0.3990 0.5798 0.7614
No log 5.6 196 0.5578 0.5488 0.5578 0.7469
No log 5.6571 198 0.5741 0.4876 0.5741 0.7577
No log 5.7143 200 0.6450 0.5283 0.6450 0.8031
No log 5.7714 202 0.7370 0.4670 0.7370 0.8585
No log 5.8286 204 0.6512 0.5348 0.6512 0.8070
No log 5.8857 206 0.5737 0.5421 0.5737 0.7574
No log 5.9429 208 0.5737 0.5738 0.5737 0.7574
No log 6.0 210 0.5633 0.5075 0.5633 0.7505
No log 6.0571 212 0.6037 0.4895 0.6037 0.7770
No log 6.1143 214 0.6161 0.4841 0.6161 0.7849
No log 6.1714 216 0.5599 0.4849 0.5599 0.7482
No log 6.2286 218 0.5349 0.4590 0.5349 0.7314
No log 6.2857 220 0.5498 0.5356 0.5498 0.7415
No log 6.3429 222 0.5590 0.5356 0.5590 0.7477
No log 6.4 224 0.5542 0.5587 0.5542 0.7444
No log 6.4571 226 0.6302 0.5445 0.6302 0.7939
No log 6.5143 228 0.6270 0.5445 0.6270 0.7918
No log 6.5714 230 0.6656 0.5328 0.6656 0.8158
No log 6.6286 232 0.6082 0.5513 0.6082 0.7799
No log 6.6857 234 0.5744 0.5748 0.5744 0.7579
No log 6.7429 236 0.5271 0.5649 0.5271 0.7260
No log 6.8 238 0.5153 0.5556 0.5153 0.7179
No log 6.8571 240 0.5275 0.5556 0.5275 0.7263
No log 6.9143 242 0.5352 0.5405 0.5352 0.7316
No log 6.9714 244 0.6096 0.5301 0.6096 0.7808
No log 7.0286 246 0.6346 0.4997 0.6346 0.7966
No log 7.0857 248 0.6353 0.5317 0.6353 0.7971
No log 7.1429 250 0.5650 0.5960 0.5650 0.7516
No log 7.2 252 0.5888 0.6013 0.5888 0.7673
No log 7.2571 254 0.6350 0.4725 0.6350 0.7969
No log 7.3143 256 0.6318 0.4851 0.6318 0.7949
No log 7.3714 258 0.5848 0.5587 0.5848 0.7647
No log 7.4286 260 0.6082 0.5603 0.6082 0.7799
No log 7.4857 262 0.6358 0.5390 0.6358 0.7973
No log 7.5429 264 0.6417 0.4315 0.6417 0.8010
No log 7.6 266 0.5798 0.4653 0.5798 0.7615
No log 7.6571 268 0.5296 0.4698 0.5296 0.7277
No log 7.7143 270 0.5158 0.5117 0.5158 0.7182
No log 7.7714 272 0.4915 0.6255 0.4915 0.7011
No log 7.8286 274 0.5001 0.5980 0.5001 0.7072
No log 7.8857 276 0.5253 0.5017 0.5253 0.7247
No log 7.9429 278 0.5213 0.4502 0.5213 0.7220
No log 8.0 280 0.5179 0.4007 0.5179 0.7196
No log 8.0571 282 0.5179 0.3980 0.5179 0.7197
No log 8.1143 284 0.5041 0.4614 0.5041 0.7100
No log 8.1714 286 0.5174 0.5017 0.5174 0.7193
No log 8.2286 288 0.5094 0.5647 0.5094 0.7137
No log 8.2857 290 0.5098 0.5533 0.5098 0.7140
No log 8.3429 292 0.5346 0.5572 0.5346 0.7311
No log 8.4 294 0.5425 0.5398 0.5425 0.7366
No log 8.4571 296 0.5428 0.6001 0.5428 0.7368
No log 8.5143 298 0.5843 0.6341 0.5843 0.7644
No log 8.5714 300 0.5964 0.5668 0.5964 0.7723
No log 8.6286 302 0.5547 0.5913 0.5547 0.7448
No log 8.6857 304 0.5490 0.5201 0.5490 0.7409
No log 8.7429 306 0.5542 0.4973 0.5542 0.7444
No log 8.8 308 0.5942 0.4749 0.5942 0.7708
No log 8.8571 310 0.6779 0.3869 0.6779 0.8234
No log 8.9143 312 0.6637 0.3869 0.6637 0.8147
No log 8.9714 314 0.5391 0.4964 0.5391 0.7342
No log 9.0286 316 0.5121 0.5305 0.5121 0.7156
No log 9.0857 318 0.5078 0.5714 0.5078 0.7126
No log 9.1429 320 0.4967 0.5945 0.4967 0.7048
No log 9.2 322 0.4956 0.6577 0.4956 0.7040
No log 9.2571 324 0.4986 0.6688 0.4986 0.7061
No log 9.3143 326 0.5274 0.6361 0.5274 0.7262
No log 9.3714 328 0.5962 0.5200 0.5962 0.7722
No log 9.4286 330 0.5534 0.5614 0.5534 0.7439
No log 9.4857 332 0.5266 0.5584 0.5266 0.7257
No log 9.5429 334 0.5107 0.5368 0.5107 0.7147
No log 9.6 336 0.5575 0.5672 0.5575 0.7467
No log 9.6571 338 0.5926 0.5581 0.5926 0.7698
No log 9.7143 340 0.5645 0.5332 0.5645 0.7513
No log 9.7714 342 0.5491 0.5949 0.5491 0.7410
No log 9.8286 344 0.5573 0.5945 0.5573 0.7466
No log 9.8857 346 0.5348 0.5770 0.5348 0.7313
No log 9.9429 348 0.5244 0.4678 0.5244 0.7242
No log 10.0 350 0.6079 0.5247 0.6079 0.7797
No log 10.0571 352 0.6365 0.5263 0.6365 0.7978
No log 10.1143 354 0.5592 0.5226 0.5592 0.7478
No log 10.1714 356 0.5184 0.4402 0.5184 0.7200
No log 10.2286 358 0.5449 0.4618 0.5449 0.7382
No log 10.2857 360 0.5402 0.4618 0.5402 0.7350
No log 10.3429 362 0.5260 0.4147 0.5260 0.7253
No log 10.4 364 0.5315 0.5201 0.5315 0.7290
No log 10.4571 366 0.5693 0.5082 0.5693 0.7545
No log 10.5143 368 0.6055 0.5024 0.6055 0.7782
No log 10.5714 370 0.6274 0.4841 0.6274 0.7921
No log 10.6286 372 0.6035 0.5687 0.6035 0.7769
No log 10.6857 374 0.5980 0.6371 0.5980 0.7733
No log 10.7429 376 0.6027 0.5687 0.6027 0.7763
No log 10.8 378 0.6409 0.4757 0.6409 0.8006
No log 10.8571 380 0.7020 0.4884 0.7020 0.8378
No log 10.9143 382 0.6478 0.4580 0.6478 0.8049
No log 10.9714 384 0.5894 0.4595 0.5894 0.7677
No log 11.0286 386 0.5662 0.4475 0.5662 0.7524
No log 11.0857 388 0.5701 0.5015 0.5701 0.7551
No log 11.1429 390 0.5579 0.5016 0.5579 0.7469
No log 11.2 392 0.5434 0.3703 0.5434 0.7372
No log 11.2571 394 0.5952 0.5483 0.5952 0.7715
No log 11.3143 396 0.6323 0.4842 0.6323 0.7952
No log 11.3714 398 0.5735 0.5320 0.5735 0.7573
No log 11.4286 400 0.5180 0.5517 0.5180 0.7197
No log 11.4857 402 0.5603 0.4769 0.5603 0.7485
No log 11.5429 404 0.6100 0.4664 0.6100 0.7810
No log 11.6 406 0.5615 0.5016 0.5615 0.7493
No log 11.6571 408 0.5292 0.4217 0.5292 0.7274
No log 11.7143 410 0.6307 0.4904 0.6307 0.7942
No log 11.7714 412 0.7328 0.4096 0.7328 0.8561
No log 11.8286 414 0.7205 0.4562 0.7205 0.8488
No log 11.8857 416 0.6282 0.4018 0.6282 0.7926
No log 11.9429 418 0.5565 0.5518 0.5565 0.7460
No log 12.0 420 0.5759 0.5831 0.5759 0.7589
No log 12.0571 422 0.5757 0.5723 0.5757 0.7587
No log 12.1143 424 0.5705 0.5640 0.5705 0.7553
No log 12.1714 426 0.6274 0.4018 0.6274 0.7921
No log 12.2286 428 0.6855 0.4279 0.6855 0.8279
No log 12.2857 430 0.6728 0.4362 0.6728 0.8202
No log 12.3429 432 0.6351 0.4113 0.6351 0.7970
No log 12.4 434 0.6011 0.4997 0.6011 0.7753
No log 12.4571 436 0.5866 0.5442 0.5866 0.7659
No log 12.5143 438 0.5706 0.5563 0.5706 0.7554
No log 12.5714 440 0.5567 0.5853 0.5567 0.7461
No log 12.6286 442 0.5485 0.5662 0.5485 0.7406
No log 12.6857 444 0.5529 0.5167 0.5529 0.7436
No log 12.7429 446 0.5600 0.5565 0.5600 0.7483
No log 12.8 448 0.5677 0.5092 0.5677 0.7535
No log 12.8571 450 0.5733 0.4992 0.5733 0.7572
No log 12.9143 452 0.5722 0.4681 0.5722 0.7564
No log 12.9714 454 0.5793 0.4743 0.5793 0.7611
No log 13.0286 456 0.5760 0.4681 0.5760 0.7590
No log 13.0857 458 0.5820 0.4006 0.5820 0.7629
No log 13.1429 460 0.5875 0.4342 0.5875 0.7665
No log 13.2 462 0.5878 0.4753 0.5878 0.7667
No log 13.2571 464 0.6000 0.4604 0.6000 0.7746
No log 13.3143 466 0.6424 0.5665 0.6424 0.8015
No log 13.3714 468 0.6419 0.5552 0.6419 0.8012
No log 13.4286 470 0.5979 0.4881 0.5979 0.7733
No log 13.4857 472 0.5901 0.5026 0.5901 0.7682
No log 13.5429 474 0.5865 0.4942 0.5865 0.7658
No log 13.6 476 0.5803 0.4904 0.5803 0.7618
No log 13.6571 478 0.5928 0.4036 0.5928 0.7699
No log 13.7143 480 0.6095 0.3572 0.6095 0.7807
No log 13.7714 482 0.6271 0.3894 0.6271 0.7919
No log 13.8286 484 0.6094 0.3267 0.6094 0.7806
No log 13.8857 486 0.5921 0.3677 0.5921 0.7695
No log 13.9429 488 0.5905 0.4126 0.5905 0.7684
No log 14.0 490 0.5824 0.4322 0.5824 0.7631
No log 14.0571 492 0.5812 0.4362 0.5812 0.7624
No log 14.1143 494 0.5876 0.4678 0.5876 0.7666
No log 14.1714 496 0.5898 0.5572 0.5898 0.7680
No log 14.2286 498 0.5743 0.5368 0.5743 0.7578
0.3644 14.2857 500 0.5536 0.5533 0.5536 0.7441
0.3644 14.3429 502 0.5655 0.4979 0.5655 0.7520
0.3644 14.4 504 0.5706 0.5135 0.5706 0.7554
0.3644 14.4571 506 0.5753 0.5373 0.5753 0.7585
0.3644 14.5143 508 0.5822 0.5544 0.5822 0.7630
0.3644 14.5714 510 0.6014 0.5544 0.6014 0.7755
0.3644 14.6286 512 0.5529 0.5603 0.5529 0.7436
0.3644 14.6857 514 0.5290 0.5266 0.5290 0.7273
0.3644 14.7429 516 0.5373 0.5449 0.5373 0.7330
0.3644 14.8 518 0.5980 0.5763 0.5980 0.7733
0.3644 14.8571 520 0.6432 0.4842 0.6432 0.8020
0.3644 14.9143 522 0.6007 0.5470 0.6007 0.7750
0.3644 14.9714 524 0.5708 0.5779 0.5708 0.7555
0.3644 15.0286 526 0.6068 0.5470 0.6068 0.7790
0.3644 15.0857 528 0.7639 0.4159 0.7639 0.8740
0.3644 15.1429 530 0.9716 0.2725 0.9716 0.9857
0.3644 15.2 532 0.9853 0.2702 0.9853 0.9926
0.3644 15.2571 534 0.8486 0.3965 0.8486 0.9212
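Note that in every row the validation loss equals the MSE, which indicates the model is trained with a mean-squared-error (regression) objective; the RMSE column is simply the square root of the MSE, e.g. for the final checkpoint:

```python
# The reported RMSE is the square root of the reported MSE,
# shown here for the final checkpoint (MSE 0.8486, RMSE 0.9212).
import math

mse = 0.8486
rmse = math.sqrt(mse)
print(round(rmse, 4))  # 0.9212
```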

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors

  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k10_task7_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.