ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k10_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5131
  • QWK (quadratic weighted kappa): 0.4869
  • MSE: 0.5131
  • RMSE: 0.7163
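The metrics above can be reproduced from integer score predictions. A minimal pure-Python sketch follows; the helper names are illustrative and the actual training code may instead use library implementations such as scikit-learn's cohen_kappa_score with weights="quadratic":

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted Cohen's kappa between integer ratings in [0, n_classes)."""
    # Observed co-occurrence matrix of (true, predicted) ratings.
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    n = len(y_true)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement weight
            num += w * observed[i][j]                 # weighted observed disagreement
            den += w * hist_true[i] * hist_pred[j] / n  # expected under independence
    return 1.0 - num / den

def rmse(y_true, y_pred):
    """Root mean squared error between two equal-length rating sequences."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

Note that when predictions are integer labels, MSE equals the validation loss here (0.5131) and RMSE is its square root (0.7163), which matches the table below.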

Model description

More information needed

Intended uses & limitations

More information needed
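Since usage is otherwise undocumented, here is a hedged loading sketch. It assumes the checkpoint exposes a standard sequence-classification head (consistent with the regression-style QWK/MSE metrics reported above); the repo id is taken from the card title:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k10_task7_organization"

# Downloads tokenizer and weights from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

# Score an Arabic essay text (placeholder input).
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits  # interpret per the task7 "organization" rubric
```

How the logits map to an organization score depends on the (undocumented) label setup; treat this as a starting point, not a specification.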

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
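With lr_scheduler_type: linear and no warmup reported, the learning rate decays linearly from 2e-05 to 0 over training. A minimal sketch of that schedule (the function name is illustrative; it mirrors the shape of transformers' get_linear_schedule_with_warmup):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Learning rate at `step` for linear warmup followed by linear decay to zero."""
    if step < warmup_steps:
        # Ramp up from 0 to base_lr over the warmup phase.
        return base_lr * step / max(1, warmup_steps)
    # Decay linearly from base_lr at the end of warmup to 0 at total_steps.
    remaining = max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
    return base_lr * remaining
```

From the table below, one epoch is roughly 52 optimizer steps, so 100 epochs would give total_steps of about 5200.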

Training results

Training loss is logged every 500 steps, so rows before step 500 show "No log" in that column.

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0385 2 2.6387 -0.0262 2.6387 1.6244
No log 0.0769 4 1.3440 0.0511 1.3440 1.1593
No log 0.1154 6 0.9498 -0.1408 0.9498 0.9746
No log 0.1538 8 0.8620 0.0968 0.8620 0.9285
No log 0.1923 10 0.7151 0.2884 0.7151 0.8456
No log 0.2308 12 0.7307 0.2386 0.7307 0.8548
No log 0.2692 14 0.7464 0.3608 0.7464 0.8640
No log 0.3077 16 0.9220 0.1628 0.9220 0.9602
No log 0.3462 18 1.3208 0.1001 1.3208 1.1492
No log 0.3846 20 1.0455 0.1797 1.0455 1.0225
No log 0.4231 22 0.7612 0.3031 0.7612 0.8724
No log 0.4615 24 0.8616 0.2480 0.8616 0.9282
No log 0.5 26 0.7893 0.3343 0.7893 0.8884
No log 0.5385 28 0.6487 0.3183 0.6487 0.8054
No log 0.5769 30 0.6559 0.2441 0.6559 0.8099
No log 0.6154 32 0.7358 0.2490 0.7358 0.8578
No log 0.6538 34 0.9130 0.3926 0.9130 0.9555
No log 0.6923 36 1.0306 0.2516 1.0306 1.0152
No log 0.7308 38 0.9922 0.3938 0.9922 0.9961
No log 0.7692 40 0.8020 0.2879 0.8020 0.8955
No log 0.8077 42 0.7296 0.1786 0.7296 0.8541
No log 0.8462 44 0.7067 0.1372 0.7067 0.8406
No log 0.8846 46 0.6755 0.0937 0.6755 0.8219
No log 0.9231 48 0.6359 0.1718 0.6359 0.7975
No log 0.9615 50 0.6473 0.1718 0.6473 0.8045
No log 1.0 52 0.7667 0.2919 0.7667 0.8756
No log 1.0385 54 0.8275 0.2879 0.8275 0.9097
No log 1.0769 56 0.7989 0.2464 0.7989 0.8938
No log 1.1154 58 0.6644 0.3169 0.6644 0.8151
No log 1.1538 60 0.6371 0.2449 0.6371 0.7982
No log 1.1923 62 0.6537 0.3169 0.6537 0.8085
No log 1.2308 64 0.7829 0.3257 0.7829 0.8848
No log 1.2692 66 0.9135 0.3461 0.9135 0.9558
No log 1.3077 68 0.8006 0.3993 0.8006 0.8948
No log 1.3462 70 0.6854 0.3925 0.6854 0.8279
No log 1.3846 72 0.5869 0.4752 0.5869 0.7661
No log 1.4231 74 0.6013 0.4726 0.6013 0.7754
No log 1.4615 76 0.5900 0.3808 0.5900 0.7681
No log 1.5 78 0.7142 0.4028 0.7142 0.8451
No log 1.5385 80 0.8808 0.4088 0.8808 0.9385
No log 1.5769 82 1.0731 0.1473 1.0731 1.0359
No log 1.6154 84 1.0003 0.2975 1.0003 1.0001
No log 1.6538 86 0.7633 0.4021 0.7633 0.8737
No log 1.6923 88 0.6772 0.3882 0.6772 0.8229
No log 1.7308 90 0.6072 0.2916 0.6072 0.7792
No log 1.7692 92 0.6335 0.3993 0.6335 0.7959
No log 1.8077 94 0.7751 0.3730 0.7751 0.8804
No log 1.8462 96 0.9804 0.2702 0.9804 0.9902
No log 1.8846 98 0.9356 0.3961 0.9356 0.9673
No log 1.9231 100 0.8782 0.3961 0.8782 0.9371
No log 1.9615 102 0.7922 0.3719 0.7922 0.8900
No log 2.0 104 0.7698 0.3889 0.7698 0.8774
No log 2.0385 106 0.6770 0.4151 0.6770 0.8228
No log 2.0769 108 0.5911 0.4116 0.5911 0.7688
No log 2.1154 110 0.6189 0.3729 0.6189 0.7867
No log 2.1538 112 0.6393 0.3953 0.6393 0.7996
No log 2.1923 114 0.7014 0.4537 0.7014 0.8375
No log 2.2308 116 0.6179 0.4468 0.6179 0.7860
No log 2.2692 118 0.6116 0.4997 0.6116 0.7820
No log 2.3077 120 0.7054 0.4665 0.7054 0.8399
No log 2.3462 122 0.5728 0.4945 0.5728 0.7568
No log 2.3846 124 0.5938 0.3563 0.5938 0.7706
No log 2.4231 126 0.6433 0.3803 0.6433 0.8021
No log 2.4615 128 0.6082 0.4139 0.6082 0.7799
No log 2.5 130 0.6301 0.4051 0.6301 0.7938
No log 2.5385 132 0.5439 0.4611 0.5439 0.7375
No log 2.5769 134 0.5236 0.4908 0.5236 0.7236
No log 2.6154 136 0.5509 0.4821 0.5509 0.7422
No log 2.6538 138 0.6123 0.4130 0.6123 0.7825
No log 2.6923 140 0.5628 0.4468 0.5628 0.7502
No log 2.7308 142 0.5918 0.4355 0.5918 0.7693
No log 2.7692 144 0.5866 0.4355 0.5866 0.7659
No log 2.8077 146 0.5528 0.4427 0.5528 0.7435
No log 2.8462 148 0.5432 0.4715 0.5432 0.7370
No log 2.8846 150 0.5383 0.5030 0.5383 0.7337
No log 2.9231 152 0.6436 0.4836 0.6436 0.8022
No log 2.9615 154 0.6012 0.5149 0.6012 0.7753
No log 3.0 156 0.5219 0.4715 0.5219 0.7225
No log 3.0385 158 0.7465 0.5220 0.7465 0.8640
No log 3.0769 160 0.8257 0.4867 0.8257 0.9087
No log 3.1154 162 0.6550 0.4811 0.6550 0.8093
No log 3.1538 164 0.5370 0.4517 0.5370 0.7328
No log 3.1923 166 0.5325 0.5107 0.5325 0.7297
No log 3.2308 168 0.5538 0.5283 0.5538 0.7442
No log 3.2692 170 0.6137 0.4708 0.6137 0.7834
No log 3.3077 172 0.5789 0.5015 0.5789 0.7608
No log 3.3462 174 0.5513 0.4044 0.5513 0.7425
No log 3.3846 176 0.5566 0.4875 0.5566 0.7460
No log 3.4231 178 0.5990 0.5237 0.5990 0.7739
No log 3.4615 180 0.6477 0.5354 0.6477 0.8048
No log 3.5 182 0.6283 0.4782 0.6283 0.7927
No log 3.5385 184 0.5993 0.5021 0.5993 0.7742
No log 3.5769 186 0.6230 0.3811 0.6230 0.7893
No log 3.6154 188 0.6237 0.4113 0.6237 0.7898
No log 3.6538 190 0.5865 0.4866 0.5865 0.7659
No log 3.6923 192 0.6634 0.5073 0.6634 0.8145
No log 3.7308 194 0.7719 0.4735 0.7719 0.8786
No log 3.7692 196 0.7272 0.4906 0.7271 0.8527
No log 3.8077 198 0.5542 0.6257 0.5542 0.7445
No log 3.8462 200 0.5282 0.5383 0.5282 0.7268
No log 3.8846 202 0.5220 0.5586 0.5220 0.7225
No log 3.9231 204 0.6109 0.4789 0.6109 0.7816
No log 3.9615 206 0.6419 0.4987 0.6419 0.8012
No log 4.0 208 0.5766 0.4527 0.5766 0.7593
No log 4.0385 210 0.5281 0.5071 0.5281 0.7267
No log 4.0769 212 0.5353 0.5272 0.5353 0.7317
No log 4.1154 214 0.5431 0.5272 0.5431 0.7369
No log 4.1538 216 0.5615 0.5250 0.5615 0.7493
No log 4.1923 218 0.7992 0.4177 0.7992 0.8940
No log 4.2308 220 0.9158 0.3864 0.9158 0.9570
No log 4.2692 222 0.7933 0.4413 0.7933 0.8907
No log 4.3077 224 0.5780 0.5498 0.5780 0.7603
No log 4.3462 226 0.5411 0.4171 0.5411 0.7356
No log 4.3846 228 0.5472 0.5104 0.5472 0.7398
No log 4.4231 230 0.5253 0.4816 0.5253 0.7248
No log 4.4615 232 0.5293 0.4677 0.5293 0.7275
No log 4.5 234 0.6200 0.5112 0.6200 0.7874
No log 4.5385 236 0.7450 0.4597 0.7450 0.8631
No log 4.5769 238 0.7735 0.4597 0.7735 0.8795
No log 4.6154 240 0.6488 0.5408 0.6488 0.8055
No log 4.6538 242 0.5817 0.5315 0.5817 0.7627
No log 4.6923 244 0.5614 0.5513 0.5614 0.7493
No log 4.7308 246 0.5494 0.5706 0.5494 0.7412
No log 4.7692 248 0.5289 0.5569 0.5289 0.7273
No log 4.8077 250 0.5057 0.6295 0.5057 0.7111
No log 4.8462 252 0.5260 0.5498 0.5260 0.7252
No log 4.8846 254 0.5546 0.5513 0.5546 0.7447
No log 4.9231 256 0.5768 0.5313 0.5768 0.7595
No log 4.9615 258 0.5468 0.5706 0.5468 0.7394
No log 5.0 260 0.5758 0.5251 0.5758 0.7588
No log 5.0385 262 0.6957 0.5160 0.6957 0.8341
No log 5.0769 264 0.7064 0.5160 0.7064 0.8405
No log 5.1154 266 0.6046 0.4851 0.6046 0.7776
No log 5.1538 268 0.5618 0.4969 0.5618 0.7495
No log 5.1923 270 0.5707 0.4825 0.5707 0.7554
No log 5.2308 272 0.6012 0.4986 0.6012 0.7754
No log 5.2692 274 0.5590 0.4776 0.5590 0.7477
No log 5.3077 276 0.5034 0.4101 0.5034 0.7095
No log 5.3462 278 0.5151 0.4867 0.5151 0.7177
No log 5.3846 280 0.5170 0.5448 0.5170 0.7190
No log 5.4231 282 0.5024 0.5286 0.5024 0.7088
No log 5.4615 284 0.6217 0.5138 0.6217 0.7885
No log 5.5 286 0.7097 0.4789 0.7097 0.8424
No log 5.5385 288 0.6498 0.4884 0.6498 0.8061
No log 5.5769 290 0.5326 0.5044 0.5326 0.7298
No log 5.6154 292 0.5132 0.5302 0.5132 0.7163
No log 5.6538 294 0.5125 0.5446 0.5125 0.7159
No log 5.6923 296 0.5305 0.5463 0.5305 0.7283
No log 5.7308 298 0.6082 0.4690 0.6082 0.7799
No log 5.7692 300 0.7315 0.4580 0.7315 0.8553
No log 5.8077 302 0.7144 0.4580 0.7144 0.8452
No log 5.8462 304 0.6312 0.4652 0.6312 0.7945
No log 5.8846 306 0.6061 0.4272 0.6061 0.7785
No log 5.9231 308 0.5456 0.4795 0.5456 0.7387
No log 5.9615 310 0.5466 0.4963 0.5466 0.7393
No log 6.0 312 0.5467 0.5120 0.5467 0.7394
No log 6.0385 314 0.5977 0.5220 0.5977 0.7731
No log 6.0769 316 0.6330 0.5312 0.6330 0.7956
No log 6.1154 318 0.5834 0.5471 0.5834 0.7638
No log 6.1538 320 0.5087 0.5373 0.5087 0.7132
No log 6.1923 322 0.5202 0.5533 0.5202 0.7212
No log 6.2308 324 0.5356 0.5894 0.5356 0.7318
No log 6.2692 326 0.5179 0.5357 0.5179 0.7197
No log 6.3077 328 0.5585 0.4901 0.5585 0.7473
No log 6.3462 330 0.6003 0.5014 0.6003 0.7748
No log 6.3846 332 0.5755 0.4997 0.5755 0.7586
No log 6.4231 334 0.5130 0.5446 0.5130 0.7162
No log 6.4615 336 0.5580 0.5362 0.5580 0.7470
No log 6.5 338 0.5603 0.5452 0.5603 0.7485
No log 6.5385 340 0.5176 0.5195 0.5176 0.7194
No log 6.5769 342 0.5734 0.4556 0.5734 0.7572
No log 6.6154 344 0.6726 0.4868 0.6726 0.8201
No log 6.6538 346 0.6775 0.4868 0.6775 0.8231
No log 6.6923 348 0.5858 0.4614 0.5858 0.7653
No log 6.7308 350 0.5540 0.4742 0.5540 0.7443
No log 6.7692 352 0.5786 0.4684 0.5786 0.7607
No log 6.8077 354 0.5695 0.5056 0.5695 0.7547
No log 6.8462 356 0.5870 0.4118 0.5870 0.7662
No log 6.8846 358 0.6888 0.4269 0.6888 0.8299
No log 6.9231 360 0.6670 0.4269 0.6670 0.8167
No log 6.9615 362 0.6062 0.3988 0.6062 0.7786
No log 7.0 364 0.5713 0.4991 0.5713 0.7559
No log 7.0385 366 0.5693 0.4875 0.5693 0.7545
No log 7.0769 368 0.5599 0.4726 0.5599 0.7483
No log 7.1154 370 0.5584 0.4991 0.5584 0.7473
No log 7.1538 372 0.6397 0.3761 0.6397 0.7998
No log 7.1923 374 0.7008 0.4219 0.7008 0.8371
No log 7.2308 376 0.6465 0.3941 0.6465 0.8040
No log 7.2692 378 0.5945 0.4752 0.5945 0.7710
No log 7.3077 380 0.5808 0.4678 0.5808 0.7621
No log 7.3462 382 0.5707 0.4547 0.5707 0.7555
No log 7.3846 384 0.5710 0.3809 0.5710 0.7556
No log 7.4231 386 0.5683 0.4111 0.5683 0.7538
No log 7.4615 388 0.5748 0.4505 0.5748 0.7582
No log 7.5 390 0.5779 0.4227 0.5779 0.7602
No log 7.5385 392 0.5527 0.4262 0.5527 0.7434
No log 7.5769 394 0.5436 0.3754 0.5436 0.7373
No log 7.6154 396 0.5443 0.4548 0.5443 0.7378
No log 7.6538 398 0.5553 0.4821 0.5553 0.7452
No log 7.6923 400 0.5367 0.4674 0.5367 0.7326
No log 7.7308 402 0.5073 0.5446 0.5073 0.7122
No log 7.7692 404 0.5007 0.6024 0.5007 0.7076
No log 7.8077 406 0.4976 0.6007 0.4976 0.7054
No log 7.8462 408 0.5375 0.5136 0.5375 0.7332
No log 7.8846 410 0.5547 0.5283 0.5547 0.7448
No log 7.9231 412 0.5585 0.5429 0.5585 0.7473
No log 7.9615 414 0.5151 0.5678 0.5151 0.7177
No log 8.0 416 0.4983 0.5672 0.4983 0.7059
No log 8.0385 418 0.5128 0.5656 0.5128 0.7161
No log 8.0769 420 0.5247 0.5452 0.5247 0.7244
No log 8.1154 422 0.5687 0.5217 0.5687 0.7542
No log 8.1538 424 0.5631 0.5569 0.5631 0.7504
No log 8.1923 426 0.5264 0.5143 0.5264 0.7256
No log 8.2308 428 0.5296 0.4253 0.5296 0.7278
No log 8.2692 430 0.5356 0.4314 0.5356 0.7319
No log 8.3077 432 0.5310 0.5042 0.5310 0.7287
No log 8.3462 434 0.5878 0.4789 0.5878 0.7667
No log 8.3846 436 0.6289 0.4537 0.6289 0.7930
No log 8.4231 438 0.5930 0.4391 0.5930 0.7701
No log 8.4615 440 0.5447 0.5075 0.5447 0.7381
No log 8.5 442 0.5758 0.5485 0.5758 0.7588
No log 8.5385 444 0.6078 0.5403 0.6078 0.7796
No log 8.5769 446 0.5684 0.5327 0.5684 0.7539
No log 8.6154 448 0.5311 0.5022 0.5311 0.7287
No log 8.6538 450 0.5328 0.5143 0.5328 0.7299
No log 8.6923 452 0.5263 0.4904 0.5263 0.7255
No log 8.7308 454 0.5188 0.4788 0.5188 0.7203
No log 8.7692 456 0.5152 0.4788 0.5152 0.7177
No log 8.8077 458 0.5100 0.4526 0.5100 0.7141
No log 8.8462 460 0.5067 0.5143 0.5067 0.7118
No log 8.8846 462 0.5265 0.5300 0.5265 0.7256
No log 8.9231 464 0.5236 0.5300 0.5236 0.7236
No log 8.9615 466 0.4931 0.4942 0.4931 0.7022
No log 9.0 468 0.4852 0.5596 0.4852 0.6966
No log 9.0385 470 0.4828 0.5672 0.4828 0.6949
No log 9.0769 472 0.4882 0.6282 0.4882 0.6987
No log 9.1154 474 0.5008 0.6308 0.5008 0.7077
No log 9.1538 476 0.5579 0.5249 0.5579 0.7469
No log 9.1923 478 0.6165 0.4669 0.6165 0.7852
No log 9.2308 480 0.5750 0.4783 0.5750 0.7583
No log 9.2692 482 0.5129 0.5373 0.5129 0.7162
No log 9.3077 484 0.4786 0.5782 0.4786 0.6918
No log 9.3462 486 0.4669 0.6321 0.4669 0.6833
No log 9.3846 488 0.4667 0.6567 0.4667 0.6831
No log 9.4231 490 0.5013 0.5794 0.5013 0.7080
No log 9.4615 492 0.6375 0.4884 0.6375 0.7984
No log 9.5 494 0.7311 0.4382 0.7311 0.8550
No log 9.5385 496 0.6737 0.4773 0.6737 0.8208
No log 9.5769 498 0.5612 0.5266 0.5612 0.7491
0.3282 9.6154 500 0.4825 0.5779 0.4825 0.6946
0.3282 9.6538 502 0.4601 0.6770 0.4601 0.6783
0.3282 9.6923 504 0.4754 0.5493 0.4754 0.6895
0.3282 9.7308 506 0.5143 0.5030 0.5143 0.7171
0.3282 9.7692 508 0.5250 0.4859 0.5250 0.7245
0.3282 9.8077 510 0.5184 0.4838 0.5184 0.7200
0.3282 9.8462 512 0.5131 0.4869 0.5131 0.7163
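Validation QWK peaks before the final step (0.6770 at step 502 versus 0.4869 at step 512), so selecting a checkpoint by best QWK rather than by last step may be preferable. A minimal sketch over the tail of the table above:

```python
# (step, validation QWK) pairs copied from the last rows of the results table.
history = [(500, 0.5779), (502, 0.6770), (504, 0.5493),
           (506, 0.5030), (508, 0.4859), (510, 0.4838), (512, 0.4869)]

# Pick the evaluation step with the highest QWK.
best_step, best_qwk = max(history, key=lambda row: row[1])
```

In transformers, the equivalent behavior comes from load_best_model_at_end with metric_for_best_model set to the QWK metric.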

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32