ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k13_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4616
  • QWK (Quadratic Weighted Kappa): 0.5065
  • MSE: 0.4616
  • RMSE: 0.6794

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

"No log" in the Training Loss column means the training loss had not yet been recorded: with loss logged every 500 steps, the first recorded value (0.3246) appears at step 500.

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0312 2 2.5491 -0.1213 2.5491 1.5966
No log 0.0625 4 1.3669 0.0987 1.3669 1.1691
No log 0.0938 6 1.0779 -0.0611 1.0779 1.0382
No log 0.125 8 0.7287 0.2590 0.7287 0.8536
No log 0.1562 10 0.6722 0.3738 0.6722 0.8199
No log 0.1875 12 0.6613 0.3857 0.6613 0.8132
No log 0.2188 14 0.7790 0.4204 0.7790 0.8826
No log 0.25 16 0.7044 0.4315 0.7044 0.8393
No log 0.2812 18 0.8153 0.3761 0.8153 0.9029
No log 0.3125 20 0.9969 0.1985 0.9969 0.9984
No log 0.3438 22 1.1255 0.2169 1.1255 1.0609
No log 0.375 24 0.9662 0.2900 0.9662 0.9830
No log 0.4062 26 0.6829 0.2118 0.6829 0.8264
No log 0.4375 28 0.6170 0.2751 0.6170 0.7855
No log 0.4688 30 0.6173 0.3006 0.6173 0.7857
No log 0.5 32 0.6207 0.3006 0.6207 0.7879
No log 0.5312 34 0.7067 0.2132 0.7067 0.8407
No log 0.5625 36 0.9541 0.3119 0.9541 0.9768
No log 0.5938 38 0.9760 0.3274 0.9760 0.9879
No log 0.625 40 0.8630 0.2875 0.8630 0.9290
No log 0.6562 42 0.8018 0.2817 0.8018 0.8954
No log 0.6875 44 0.9950 0.1746 0.9950 0.9975
No log 0.7188 46 1.0255 0.1746 1.0255 1.0127
No log 0.75 48 0.8355 0.2508 0.8355 0.9141
No log 0.7812 50 0.5810 0.4585 0.5810 0.7622
No log 0.8125 52 0.5844 0.3809 0.5844 0.7645
No log 0.8438 54 0.6184 0.3396 0.6184 0.7864
No log 0.875 56 0.8933 0.2508 0.8933 0.9452
No log 0.9062 58 1.0533 0.1983 1.0533 1.0263
No log 0.9375 60 0.9060 0.3019 0.9060 0.9519
No log 0.9688 62 0.6345 0.3536 0.6345 0.7966
No log 1.0 64 0.5306 0.4538 0.5306 0.7284
No log 1.0312 66 0.5258 0.5869 0.5258 0.7251
No log 1.0625 68 0.6985 0.4024 0.6985 0.8357
No log 1.0938 70 1.0846 0.3085 1.0846 1.0414
No log 1.125 72 1.1431 0.2081 1.1431 1.0692
No log 1.1562 74 1.1548 0.2081 1.1548 1.0746
No log 1.1875 76 1.0543 0.3059 1.0543 1.0268
No log 1.2188 78 0.7673 0.4354 0.7673 0.8759
No log 1.25 80 0.6142 0.4351 0.6142 0.7837
No log 1.2812 82 0.6359 0.4083 0.6359 0.7974
No log 1.3125 84 0.6013 0.4322 0.6013 0.7754
No log 1.3438 86 0.4853 0.3354 0.4853 0.6967
No log 1.375 88 0.4644 0.4331 0.4644 0.6815
No log 1.4062 90 0.4790 0.4034 0.4790 0.6921
No log 1.4375 92 0.5227 0.4767 0.5227 0.7230
No log 1.4688 94 0.5948 0.4904 0.5948 0.7712
No log 1.5 96 0.5260 0.5748 0.5260 0.7253
No log 1.5312 98 0.4850 0.6295 0.4850 0.6964
No log 1.5625 100 0.4597 0.6106 0.4597 0.6780
No log 1.5938 102 0.4672 0.6053 0.4672 0.6835
No log 1.625 104 0.4666 0.6351 0.4666 0.6831
No log 1.6562 106 0.4781 0.5768 0.4781 0.6915
No log 1.6875 108 0.5686 0.5116 0.5686 0.7540
No log 1.7188 110 0.6825 0.4978 0.6825 0.8261
No log 1.75 112 0.6887 0.4978 0.6887 0.8299
No log 1.7812 114 0.6197 0.4930 0.6197 0.7872
No log 1.8125 116 0.5535 0.5252 0.5535 0.7440
No log 1.8438 118 0.5384 0.5333 0.5384 0.7337
No log 1.875 120 0.5384 0.5333 0.5384 0.7338
No log 1.9062 122 0.5233 0.5797 0.5233 0.7234
No log 1.9375 124 0.5263 0.5991 0.5263 0.7255
No log 1.9688 126 0.5237 0.6267 0.5237 0.7236
No log 2.0 128 0.6934 0.4667 0.6934 0.8327
No log 2.0312 130 0.6982 0.4852 0.6982 0.8356
No log 2.0625 132 0.7723 0.4764 0.7723 0.8788
No log 2.0938 134 0.6480 0.5337 0.6480 0.8050
No log 2.125 136 0.5999 0.5310 0.5999 0.7746
No log 2.1562 138 0.5203 0.6123 0.5203 0.7213
No log 2.1875 140 0.5103 0.5629 0.5103 0.7144
No log 2.2188 142 0.5260 0.5908 0.5260 0.7253
No log 2.25 144 0.5713 0.5368 0.5713 0.7559
No log 2.2812 146 0.6327 0.5032 0.6327 0.7954
No log 2.3125 148 0.6237 0.5267 0.6237 0.7898
No log 2.3438 150 0.5261 0.5726 0.5261 0.7253
No log 2.375 152 0.5998 0.5267 0.5998 0.7744
No log 2.4062 154 0.5721 0.5387 0.5721 0.7563
No log 2.4375 156 0.4567 0.6960 0.4567 0.6758
No log 2.4688 158 0.4588 0.6873 0.4588 0.6773
No log 2.5 160 0.6056 0.4811 0.6056 0.7782
No log 2.5312 162 0.8771 0.4600 0.8771 0.9365
No log 2.5625 164 0.8165 0.4584 0.8165 0.9036
No log 2.5938 166 0.5592 0.5175 0.5592 0.7478
No log 2.625 168 0.4962 0.6647 0.4962 0.7044
No log 2.6562 170 0.4738 0.6341 0.4738 0.6884
No log 2.6875 172 0.4921 0.5621 0.4921 0.7015
No log 2.7188 174 0.5786 0.4868 0.5786 0.7606
No log 2.75 176 0.6622 0.4580 0.6622 0.8138
No log 2.7812 178 0.5557 0.4868 0.5557 0.7455
No log 2.8125 180 0.4948 0.5663 0.4948 0.7034
No log 2.8438 182 0.5010 0.5524 0.5010 0.7078
No log 2.875 184 0.5593 0.5358 0.5593 0.7479
No log 2.9062 186 0.5472 0.5160 0.5472 0.7397
No log 2.9375 188 0.5230 0.5315 0.5230 0.7232
No log 2.9688 190 0.5226 0.5315 0.5226 0.7229
No log 3.0 192 0.6478 0.5131 0.6478 0.8048
No log 3.0312 194 0.7377 0.4748 0.7377 0.8589
No log 3.0625 196 0.6978 0.4925 0.6978 0.8354
No log 3.0938 198 0.5266 0.5603 0.5266 0.7257
No log 3.125 200 0.5240 0.5455 0.5240 0.7239
No log 3.1562 202 0.6609 0.4860 0.6609 0.8130
No log 3.1875 204 0.5701 0.4961 0.5701 0.7551
No log 3.2188 206 0.4720 0.5569 0.4720 0.6870
No log 3.25 208 0.5382 0.4952 0.5382 0.7336
No log 3.2812 210 0.5682 0.5313 0.5682 0.7538
No log 3.3125 212 0.5086 0.5751 0.5086 0.7132
No log 3.3438 214 0.4840 0.6359 0.4840 0.6957
No log 3.375 216 0.5185 0.6382 0.5185 0.7200
No log 3.4062 218 0.5042 0.6120 0.5042 0.7100
No log 3.4375 220 0.5094 0.5896 0.5094 0.7137
No log 3.4688 222 0.5337 0.5780 0.5337 0.7305
No log 3.5 224 0.5541 0.5421 0.5541 0.7444
No log 3.5312 226 0.5566 0.4429 0.5566 0.7460
No log 3.5625 228 0.5184 0.4875 0.5184 0.7200
No log 3.5938 230 0.5162 0.5881 0.5162 0.7185
No log 3.625 232 0.5442 0.5603 0.5442 0.7377
No log 3.6562 234 0.5565 0.5793 0.5565 0.7460
No log 3.6875 236 0.5101 0.6100 0.5101 0.7142
No log 3.7188 238 0.4885 0.6173 0.4885 0.6989
No log 3.75 240 0.5954 0.4755 0.5954 0.7716
No log 3.7812 242 0.5865 0.4755 0.5865 0.7659
No log 3.8125 244 0.4854 0.5289 0.4854 0.6967
No log 3.8438 246 0.4826 0.6183 0.4826 0.6947
No log 3.875 248 0.6036 0.4575 0.6036 0.7769
No log 3.9062 250 0.6486 0.4466 0.6486 0.8054
No log 3.9375 252 0.5560 0.5765 0.5560 0.7456
No log 3.9688 254 0.4696 0.6434 0.4696 0.6852
No log 4.0 256 0.4742 0.6484 0.4742 0.6887
No log 4.0312 258 0.4889 0.5848 0.4889 0.6992
No log 4.0625 260 0.5613 0.5616 0.5613 0.7492
No log 4.0938 262 0.5827 0.5436 0.5827 0.7634
No log 4.125 264 0.4987 0.5348 0.4987 0.7062
No log 4.1562 266 0.4636 0.6518 0.4636 0.6809
No log 4.1875 268 0.4640 0.6518 0.4640 0.6812
No log 4.2188 270 0.4570 0.6479 0.4570 0.6760
No log 4.25 272 0.5061 0.5220 0.5061 0.7114
No log 4.2812 274 0.5410 0.4987 0.5410 0.7356
No log 4.3125 276 0.4999 0.5861 0.4999 0.7071
No log 4.3438 278 0.4561 0.6195 0.4561 0.6754
No log 4.375 280 0.4476 0.6526 0.4476 0.6690
No log 4.4062 282 0.4763 0.6459 0.4763 0.6902
No log 4.4375 284 0.5419 0.5058 0.5419 0.7362
No log 4.4688 286 0.5149 0.5552 0.5149 0.7176
No log 4.5 288 0.5167 0.5373 0.5167 0.7188
No log 4.5312 290 0.5081 0.5378 0.5081 0.7128
No log 4.5625 292 0.5252 0.5184 0.5252 0.7247
No log 4.5938 294 0.4511 0.6601 0.4511 0.6717
No log 4.625 296 0.4446 0.5723 0.4446 0.6668
No log 4.6562 298 0.5222 0.5510 0.5222 0.7226
No log 4.6875 300 0.4864 0.5687 0.4864 0.6974
No log 4.7188 302 0.4274 0.7123 0.4274 0.6537
No log 4.75 304 0.5546 0.4811 0.5546 0.7447
No log 4.7812 306 0.7478 0.4088 0.7478 0.8648
No log 4.8125 308 0.7400 0.4088 0.7400 0.8602
No log 4.8438 310 0.6184 0.4528 0.6184 0.7864
No log 4.875 312 0.4542 0.6431 0.4542 0.6739
No log 4.9062 314 0.4338 0.6174 0.4338 0.6586
No log 4.9375 316 0.4431 0.5983 0.4431 0.6657
No log 4.9688 318 0.4234 0.6904 0.4234 0.6507
No log 5.0 320 0.4315 0.7042 0.4315 0.6569
No log 5.0312 322 0.5162 0.5046 0.5162 0.7185
No log 5.0625 324 0.5617 0.5251 0.5617 0.7495
No log 5.0938 326 0.5060 0.5436 0.5060 0.7113
No log 5.125 328 0.4267 0.6860 0.4267 0.6532
No log 5.1562 330 0.4222 0.6530 0.4222 0.6498
No log 5.1875 332 0.4528 0.5995 0.4528 0.6729
No log 5.2188 334 0.4751 0.5859 0.4751 0.6893
No log 5.25 336 0.4440 0.6365 0.4440 0.6664
No log 5.2812 338 0.4618 0.7031 0.4618 0.6796
No log 5.3125 340 0.6249 0.5199 0.6249 0.7905
No log 5.3438 342 0.7497 0.4979 0.7497 0.8659
No log 5.375 344 0.7112 0.4794 0.7112 0.8433
No log 5.4062 346 0.5614 0.5874 0.5614 0.7492
No log 5.4375 348 0.4492 0.6877 0.4492 0.6702
No log 5.4688 350 0.4543 0.6589 0.4543 0.6740
No log 5.5 352 0.5026 0.5498 0.5026 0.7090
No log 5.5312 354 0.5785 0.5046 0.5785 0.7606
No log 5.5625 356 0.6382 0.4821 0.6382 0.7989
No log 5.5938 358 0.6566 0.4584 0.6566 0.8103
No log 5.625 360 0.6153 0.4737 0.6153 0.7844
No log 5.6562 362 0.5940 0.5096 0.5940 0.7707
No log 5.6875 364 0.5577 0.5560 0.5577 0.7468
No log 5.7188 366 0.5453 0.5861 0.5453 0.7385
No log 5.75 368 0.5333 0.6042 0.5333 0.7302
No log 5.7812 370 0.5536 0.5767 0.5536 0.7440
No log 5.8125 372 0.5943 0.5046 0.5943 0.7709
No log 5.8438 374 0.5724 0.5581 0.5724 0.7566
No log 5.875 376 0.4759 0.6349 0.4759 0.6898
No log 5.9062 378 0.4626 0.6156 0.4626 0.6801
No log 5.9375 380 0.4745 0.5970 0.4745 0.6888
No log 5.9688 382 0.5036 0.5721 0.5036 0.7096
No log 6.0 384 0.5182 0.5721 0.5182 0.7199
No log 6.0312 386 0.5187 0.5721 0.5187 0.7202
No log 6.0625 388 0.5323 0.5200 0.5323 0.7296
No log 6.0938 390 0.5913 0.4382 0.5913 0.7689
No log 6.125 392 0.5651 0.4471 0.5651 0.7518
No log 6.1562 394 0.5384 0.4129 0.5384 0.7338
No log 6.1875 396 0.4559 0.5105 0.4559 0.6752
No log 6.2188 398 0.4115 0.6403 0.4115 0.6415
No log 6.25 400 0.4124 0.6953 0.4124 0.6422
No log 6.2812 402 0.4535 0.5779 0.4535 0.6734
No log 6.3125 404 0.5080 0.5500 0.5080 0.7128
No log 6.3438 406 0.5416 0.5819 0.5416 0.7359
No log 6.375 408 0.5267 0.5887 0.5267 0.7257
No log 6.4062 410 0.4513 0.6613 0.4513 0.6718
No log 6.4375 412 0.4326 0.6815 0.4326 0.6577
No log 6.4688 414 0.4395 0.7147 0.4395 0.6630
No log 6.5 416 0.4367 0.6652 0.4367 0.6608
No log 6.5312 418 0.4292 0.6492 0.4292 0.6551
No log 6.5625 420 0.4286 0.6492 0.4286 0.6547
No log 6.5938 422 0.4359 0.7041 0.4359 0.6602
No log 6.625 424 0.4830 0.6864 0.4830 0.6950
No log 6.6562 426 0.5154 0.6389 0.5154 0.7179
No log 6.6875 428 0.5353 0.6556 0.5353 0.7316
No log 6.7188 430 0.5235 0.6556 0.5235 0.7236
No log 6.75 432 0.5273 0.6712 0.5273 0.7261
No log 6.7812 434 0.5383 0.6414 0.5383 0.7337
No log 6.8125 436 0.5320 0.6414 0.5320 0.7294
No log 6.8438 438 0.5100 0.6482 0.5100 0.7142
No log 6.875 440 0.4883 0.6709 0.4883 0.6988
No log 6.9062 442 0.4959 0.6537 0.4959 0.7042
No log 6.9375 444 0.5250 0.6060 0.5250 0.7246
No log 6.9688 446 0.4955 0.6379 0.4955 0.7039
No log 7.0 448 0.4638 0.7218 0.4638 0.6810
No log 7.0312 450 0.4565 0.7041 0.4565 0.6756
No log 7.0625 452 0.4592 0.6960 0.4592 0.6776
No log 7.0938 454 0.4712 0.6771 0.4712 0.6864
No log 7.125 456 0.4701 0.6577 0.4701 0.6856
No log 7.1562 458 0.4888 0.5918 0.4888 0.6992
No log 7.1875 460 0.4988 0.6017 0.4988 0.7063
No log 7.2188 462 0.4782 0.7004 0.4782 0.6915
No log 7.25 464 0.4664 0.6409 0.4664 0.6829
No log 7.2812 466 0.4708 0.6677 0.4708 0.6861
No log 7.3125 468 0.4974 0.6048 0.4974 0.7053
No log 7.3438 470 0.6033 0.4890 0.6033 0.7767
No log 7.375 472 0.7119 0.4033 0.7119 0.8437
No log 7.4062 474 0.7616 0.4051 0.7616 0.8727
No log 7.4375 476 0.6832 0.4413 0.6832 0.8266
No log 7.4688 478 0.5488 0.4926 0.5488 0.7408
No log 7.5 480 0.4699 0.5956 0.4699 0.6855
No log 7.5312 482 0.4330 0.6661 0.4330 0.6580
No log 7.5625 484 0.4678 0.5943 0.4678 0.6839
No log 7.5938 486 0.5348 0.4969 0.5348 0.7313
No log 7.625 488 0.5774 0.4926 0.5774 0.7598
No log 7.6562 490 0.5939 0.4868 0.5939 0.7706
No log 7.6875 492 0.5775 0.5436 0.5775 0.7599
No log 7.7188 494 0.5236 0.5298 0.5236 0.7236
No log 7.75 496 0.4647 0.5826 0.4647 0.6817
No log 7.7812 498 0.4504 0.6739 0.4504 0.6711
0.3246 7.8125 500 0.4406 0.6819 0.4406 0.6637
0.3246 7.8438 502 0.4616 0.5600 0.4616 0.6794
0.3246 7.875 504 0.4823 0.5171 0.4823 0.6944
0.3246 7.9062 506 0.4714 0.4561 0.4714 0.6866
0.3246 7.9375 508 0.4559 0.4869 0.4559 0.6752
0.3246 7.9688 510 0.4616 0.5065 0.4616 0.6794

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k13_task7_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.