ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k14_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5152
  • Qwk (quadratic weighted kappa): 0.4632
  • Mse (mean squared error): 0.5152
  • Rmse (root mean squared error): 0.7178

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
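Although num_epochs is set to 100, the log below ends around epoch 7.4, which suggests early stopping (an assumption; the card does not state the stopping criterion). The log also records the first evaluation at step 2 / epoch 0.0286, which implies roughly 70 optimizer steps per epoch and, at a train batch size of 8, a training set of roughly 560 examples. This is an inference from the log, not a documented figure; as a quick sanity check:

```python
# Back-of-the-envelope check on the training log (inferred, not documented):
# the first logged row is step 2 at epoch 0.0286.
steps_per_epoch = round(2 / 0.0286)            # about 70 steps per epoch
train_batch_size = 8
approx_train_examples = steps_per_epoch * train_batch_size
print(steps_per_epoch, approx_train_examples)  # 70 560
```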

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0286 2 2.4803 -0.0449 2.4803 1.5749
No log 0.0571 4 1.4581 -0.1167 1.4581 1.2075
No log 0.0857 6 1.1388 -0.1866 1.1388 1.0672
No log 0.1143 8 0.9332 0.1323 0.9332 0.9660
No log 0.1429 10 0.7761 0.2045 0.7761 0.8810
No log 0.1714 12 0.7510 0.3993 0.7510 0.8666
No log 0.2 14 1.2649 0.0759 1.2649 1.1247
No log 0.2286 16 1.0737 0.2037 1.0737 1.0362
No log 0.2571 18 0.6704 0.3156 0.6704 0.8188
No log 0.2857 20 0.6079 0.3608 0.6079 0.7797
No log 0.3143 22 0.6141 0.2950 0.6141 0.7837
No log 0.3429 24 0.6346 0.3082 0.6346 0.7966
No log 0.3714 26 0.7796 0.3851 0.7796 0.8829
No log 0.4 28 0.9007 0.2875 0.9007 0.9491
No log 0.4286 30 0.8877 0.2942 0.8877 0.9422
No log 0.4571 32 0.8201 0.3051 0.8201 0.9056
No log 0.4857 34 0.7166 0.2454 0.7166 0.8465
No log 0.5143 36 0.6929 0.1633 0.6929 0.8324
No log 0.5429 38 0.7331 0.1687 0.7331 0.8562
No log 0.5714 40 0.8320 0.1724 0.8320 0.9121
No log 0.6 42 0.9355 0.2008 0.9355 0.9672
No log 0.6286 44 0.9734 0.1973 0.9734 0.9866
No log 0.6571 46 1.1767 0.1983 1.1767 1.0848
No log 0.6857 48 1.1399 0.1496 1.1399 1.0676
No log 0.7143 50 0.7927 0.2581 0.7927 0.8903
No log 0.7429 52 0.6547 0.4257 0.6547 0.8092
No log 0.7714 54 0.7895 0.3892 0.7895 0.8886
No log 0.8 56 0.6309 0.4835 0.6309 0.7943
No log 0.8286 58 0.6168 0.4174 0.6168 0.7854
No log 0.8571 60 0.7161 0.3899 0.7161 0.8462
No log 0.8857 62 0.6284 0.4452 0.6284 0.7927
No log 0.9143 64 0.5959 0.4482 0.5959 0.7719
No log 0.9429 66 0.7782 0.4008 0.7782 0.8822
No log 0.9714 68 0.6777 0.4270 0.6777 0.8232
No log 1.0 70 0.6138 0.4367 0.6138 0.7834
No log 1.0286 72 0.7380 0.4058 0.7380 0.8591
No log 1.0571 74 0.8454 0.3933 0.8454 0.9195
No log 1.0857 76 0.8657 0.3274 0.8657 0.9304
No log 1.1143 78 0.7291 0.3681 0.7291 0.8539
No log 1.1429 80 0.6669 0.4298 0.6669 0.8166
No log 1.1714 82 0.6564 0.3793 0.6564 0.8102
No log 1.2 84 0.7363 0.3586 0.7363 0.8581
No log 1.2286 86 0.7058 0.4036 0.7058 0.8401
No log 1.2571 88 0.6346 0.4094 0.6346 0.7966
No log 1.2857 90 0.7763 0.4013 0.7763 0.8811
No log 1.3143 92 0.9709 0.2807 0.9709 0.9853
No log 1.3429 94 1.0178 0.2577 1.0178 1.0088
No log 1.3714 96 0.9704 0.3014 0.9704 0.9851
No log 1.4 98 0.8517 0.4051 0.8517 0.9229
No log 1.4286 100 0.7363 0.5106 0.7363 0.8581
No log 1.4571 102 0.6560 0.5421 0.6560 0.8099
No log 1.4857 104 0.6367 0.5336 0.6367 0.7979
No log 1.5143 106 0.6310 0.5683 0.6310 0.7944
No log 1.5429 108 0.5979 0.5503 0.5979 0.7733
No log 1.5714 110 0.5826 0.4681 0.5826 0.7633
No log 1.6 112 0.5848 0.4703 0.5848 0.7647
No log 1.6286 114 0.5778 0.4382 0.5778 0.7601
No log 1.6571 116 0.5548 0.3716 0.5548 0.7449
No log 1.6857 118 0.5505 0.4015 0.5505 0.7420
No log 1.7143 120 0.5461 0.4015 0.5461 0.7390
No log 1.7429 122 0.5516 0.4803 0.5516 0.7427
No log 1.7714 124 0.5576 0.4966 0.5576 0.7467
No log 1.8 126 0.5733 0.5723 0.5733 0.7571
No log 1.8286 128 0.5800 0.5632 0.5800 0.7616
No log 1.8571 130 0.6199 0.4562 0.6199 0.7873
No log 1.8857 132 0.6474 0.3544 0.6474 0.8046
No log 1.9143 134 0.7294 0.3456 0.7294 0.8540
No log 1.9429 136 0.6674 0.3344 0.6674 0.8169
No log 1.9714 138 0.5794 0.4830 0.5794 0.7612
No log 2.0 140 0.6029 0.5283 0.6029 0.7764
No log 2.0286 142 0.7140 0.4573 0.7140 0.8450
No log 2.0571 144 0.6581 0.5326 0.6581 0.8112
No log 2.0857 146 0.5320 0.5446 0.5320 0.7294
No log 2.1143 148 0.5519 0.5254 0.5519 0.7429
No log 2.1429 150 0.6548 0.4067 0.6548 0.8092
No log 2.1714 152 0.6515 0.4067 0.6515 0.8071
No log 2.2 154 0.5567 0.4397 0.5567 0.7461
No log 2.2286 156 0.6116 0.4051 0.6116 0.7820
No log 2.2571 158 0.6408 0.4814 0.6408 0.8005
No log 2.2857 160 0.5884 0.5135 0.5884 0.7671
No log 2.3143 162 0.5684 0.4719 0.5684 0.7539
No log 2.3429 164 0.5726 0.4819 0.5726 0.7567
No log 2.3714 166 0.6005 0.4294 0.6005 0.7749
No log 2.4 168 0.6009 0.3770 0.6009 0.7752
No log 2.4286 170 0.5651 0.4444 0.5651 0.7517
No log 2.4571 172 0.5969 0.4211 0.5969 0.7726
No log 2.4857 174 0.5913 0.5246 0.5913 0.7690
No log 2.5143 176 0.5832 0.5079 0.5832 0.7636
No log 2.5429 178 0.5757 0.4973 0.5757 0.7588
No log 2.5714 180 0.5796 0.4413 0.5796 0.7613
No log 2.6 182 0.5808 0.4697 0.5808 0.7621
No log 2.6286 184 0.6762 0.5190 0.6762 0.8223
No log 2.6571 186 0.7189 0.4668 0.7189 0.8479
No log 2.6857 188 0.6542 0.5373 0.6542 0.8088
No log 2.7143 190 0.6315 0.5297 0.6315 0.7947
No log 2.7429 192 0.6826 0.5387 0.6826 0.8262
No log 2.7714 194 0.6258 0.5373 0.6258 0.7911
No log 2.8 196 0.5670 0.5335 0.5670 0.7530
No log 2.8286 198 0.5452 0.4547 0.5452 0.7384
No log 2.8571 200 0.5784 0.4731 0.5784 0.7605
No log 2.8857 202 0.5709 0.4414 0.5709 0.7556
No log 2.9143 204 0.5545 0.4482 0.5545 0.7446
No log 2.9429 206 0.5803 0.5231 0.5803 0.7618
No log 2.9714 208 0.6440 0.5765 0.6440 0.8025
No log 3.0 210 0.6659 0.5696 0.6659 0.8160
No log 3.0286 212 0.5780 0.5841 0.5780 0.7603
No log 3.0571 214 0.5492 0.4984 0.5492 0.7411
No log 3.0857 216 0.5497 0.4743 0.5497 0.7414
No log 3.1143 218 0.5454 0.5549 0.5454 0.7385
No log 3.1429 220 0.5902 0.5706 0.5902 0.7683
No log 3.1714 222 0.6577 0.4859 0.6577 0.8110
No log 3.2 224 0.6711 0.4859 0.6711 0.8192
No log 3.2286 226 0.5784 0.5706 0.5784 0.7605
No log 3.2571 228 0.5203 0.5768 0.5203 0.7213
No log 3.2857 230 0.5094 0.5768 0.5094 0.7137
No log 3.3143 232 0.5153 0.6389 0.5153 0.7178
No log 3.3429 234 0.5353 0.5682 0.5353 0.7316
No log 3.3714 236 0.5363 0.5030 0.5363 0.7323
No log 3.4 238 0.5270 0.5493 0.5270 0.7259
No log 3.4286 240 0.5196 0.5734 0.5196 0.7208
No log 3.4571 242 0.5230 0.5574 0.5230 0.7232
No log 3.4857 244 0.5299 0.5565 0.5299 0.7280
No log 3.5143 246 0.5799 0.5553 0.5799 0.7615
No log 3.5429 248 0.5813 0.5471 0.5813 0.7624
No log 3.5714 250 0.5859 0.5471 0.5859 0.7654
No log 3.6 252 0.6010 0.5553 0.6010 0.7753
No log 3.6286 254 0.6307 0.4765 0.6307 0.7942
No log 3.6571 256 0.6870 0.4901 0.6870 0.8289
No log 3.6857 258 0.8011 0.4479 0.8011 0.8950
No log 3.7143 260 0.8751 0.3127 0.8751 0.9355
No log 3.7429 262 0.8018 0.4026 0.8018 0.8955
No log 3.7714 264 0.6827 0.4844 0.6827 0.8262
No log 3.8 266 0.6113 0.4321 0.6113 0.7818
No log 3.8286 268 0.6275 0.3239 0.6275 0.7922
No log 3.8571 270 0.6678 0.3493 0.6678 0.8172
No log 3.8857 272 0.7233 0.4650 0.7233 0.8505
No log 3.9143 274 0.6512 0.4003 0.6512 0.8070
No log 3.9429 276 0.5712 0.4355 0.5712 0.7558
No log 3.9714 278 0.6351 0.4072 0.6351 0.7969
No log 4.0 280 0.6896 0.3933 0.6896 0.8304
No log 4.0286 282 0.6665 0.4615 0.6665 0.8164
No log 4.0571 284 0.5898 0.5106 0.5898 0.7680
No log 4.0857 286 0.5741 0.5106 0.5741 0.7577
No log 4.1143 288 0.5989 0.4909 0.5989 0.7739
No log 4.1429 290 0.5765 0.4690 0.5765 0.7592
No log 4.1714 292 0.5234 0.5904 0.5234 0.7235
No log 4.2 294 0.5052 0.6282 0.5052 0.7107
No log 4.2286 296 0.5325 0.5607 0.5325 0.7297
No log 4.2571 298 0.6049 0.4851 0.6049 0.7777
No log 4.2857 300 0.6524 0.4703 0.6524 0.8077
No log 4.3143 302 0.6740 0.4703 0.6740 0.8210
No log 4.3429 304 0.6232 0.5131 0.6232 0.7894
No log 4.3714 306 0.5258 0.6034 0.5258 0.7251
No log 4.4 308 0.5193 0.5826 0.5193 0.7206
No log 4.4286 310 0.5919 0.5131 0.5919 0.7694
No log 4.4571 312 0.7019 0.4007 0.7019 0.8378
No log 4.4857 314 0.7746 0.3924 0.7746 0.8801
No log 4.5143 316 0.6792 0.4096 0.6792 0.8241
No log 4.5429 318 0.5531 0.5212 0.5531 0.7437
No log 4.5714 320 0.5053 0.6082 0.5053 0.7108
No log 4.6 322 0.5198 0.5081 0.5198 0.7210
No log 4.6286 324 0.4921 0.5846 0.4921 0.7015
No log 4.6571 326 0.5408 0.5874 0.5408 0.7354
No log 4.6857 328 0.6568 0.5153 0.6568 0.8104
No log 4.7143 330 0.6800 0.5098 0.6800 0.8246
No log 4.7429 332 0.6777 0.5098 0.6777 0.8232
No log 4.7714 334 0.6029 0.5441 0.6029 0.7765
No log 4.8 336 0.5474 0.5806 0.5474 0.7399
No log 4.8286 338 0.5258 0.5991 0.5258 0.7251
No log 4.8571 340 0.5290 0.5471 0.5290 0.7273
No log 4.8857 342 0.6154 0.5387 0.6154 0.7845
No log 4.9143 344 0.7023 0.4961 0.7023 0.8380
No log 4.9429 346 0.6863 0.5146 0.6863 0.8285
No log 4.9714 348 0.5562 0.5136 0.5562 0.7458
No log 5.0 350 0.5009 0.6243 0.5009 0.7077
No log 5.0286 352 0.5073 0.5556 0.5073 0.7122
No log 5.0571 354 0.5054 0.5899 0.5054 0.7109
No log 5.0857 356 0.5172 0.5479 0.5172 0.7192
No log 5.1143 358 0.5269 0.5479 0.5269 0.7259
No log 5.1429 360 0.5721 0.5013 0.5721 0.7564
No log 5.1714 362 0.5955 0.4616 0.5955 0.7717
No log 5.2 364 0.6293 0.5464 0.6293 0.7933
No log 5.2286 366 0.5890 0.5630 0.5890 0.7675
No log 5.2571 368 0.5166 0.5678 0.5166 0.7187
No log 5.2857 370 0.5002 0.5783 0.5002 0.7072
No log 5.3143 372 0.4996 0.5783 0.4996 0.7069
No log 5.3429 374 0.5122 0.5868 0.5122 0.7157
No log 5.3714 376 0.5033 0.5767 0.5033 0.7095
No log 5.4 378 0.5007 0.5707 0.5007 0.7076
No log 5.4286 380 0.5112 0.5767 0.5112 0.7150
No log 5.4571 382 0.5095 0.5753 0.5095 0.7138
No log 5.4857 384 0.5151 0.5966 0.5151 0.7177
No log 5.5143 386 0.5179 0.5979 0.5179 0.7197
No log 5.5429 388 0.5740 0.5373 0.5740 0.7576
No log 5.5714 390 0.5684 0.5898 0.5684 0.7540
No log 5.6 392 0.5284 0.6445 0.5284 0.7269
No log 5.6286 394 0.5056 0.6503 0.5056 0.7110
No log 5.6571 396 0.4902 0.5846 0.4902 0.7001
No log 5.6857 398 0.5106 0.5826 0.5106 0.7146
No log 5.7143 400 0.5625 0.5331 0.5625 0.7500
No log 5.7429 402 0.6154 0.5543 0.6154 0.7845
No log 5.7714 404 0.6104 0.5138 0.6104 0.7813
No log 5.8 406 0.6078 0.5077 0.6078 0.7796
No log 5.8286 408 0.5948 0.5595 0.5948 0.7712
No log 5.8571 410 0.6194 0.4946 0.6194 0.7870
No log 5.8857 412 0.5517 0.5249 0.5517 0.7427
No log 5.9143 414 0.5099 0.5687 0.5099 0.7141
No log 5.9429 416 0.5133 0.5479 0.5133 0.7164
No log 5.9714 418 0.5450 0.5299 0.5450 0.7383
No log 6.0 420 0.5490 0.5299 0.5490 0.7410
No log 6.0286 422 0.5077 0.6183 0.5077 0.7125
No log 6.0571 424 0.5238 0.5956 0.5238 0.7238
No log 6.0857 426 0.6259 0.5131 0.6259 0.7912
No log 6.1143 428 0.6706 0.4784 0.6706 0.8189
No log 6.1429 430 0.6271 0.5195 0.6271 0.7919
No log 6.1714 432 0.5950 0.5002 0.5950 0.7714
No log 6.2 434 0.6682 0.5139 0.6682 0.8174
No log 6.2286 436 0.6671 0.4396 0.6671 0.8168
No log 6.2571 438 0.6333 0.4396 0.6333 0.7958
No log 6.2857 440 0.5683 0.5299 0.5683 0.7539
No log 6.3143 442 0.5162 0.5475 0.5162 0.7184
No log 6.3429 444 0.5130 0.4970 0.5130 0.7163
No log 6.3714 446 0.5162 0.4970 0.5162 0.7185
No log 6.4 448 0.5394 0.5010 0.5394 0.7345
No log 6.4286 450 0.6058 0.4451 0.6058 0.7783
No log 6.4571 452 0.6311 0.4238 0.6311 0.7944
No log 6.4857 454 0.5864 0.5313 0.5864 0.7658
No log 6.5143 456 0.5366 0.5779 0.5366 0.7326
No log 6.5429 458 0.5154 0.5956 0.5154 0.7179
No log 6.5714 460 0.5096 0.5868 0.5096 0.7139
No log 6.6 462 0.5204 0.5809 0.5204 0.7214
No log 6.6286 464 0.5262 0.5822 0.5262 0.7254
No log 6.6571 466 0.5699 0.4909 0.5699 0.7549
No log 6.6857 468 0.6593 0.4811 0.6593 0.8120
No log 6.7143 470 0.6421 0.4377 0.6421 0.8013
No log 6.7429 472 0.5393 0.4633 0.5393 0.7344
No log 6.7714 474 0.4733 0.5556 0.4733 0.6879
No log 6.8 476 0.4906 0.5015 0.4906 0.7004
No log 6.8286 478 0.5191 0.5015 0.5191 0.7205
No log 6.8571 480 0.5075 0.5015 0.5075 0.7124
No log 6.8857 482 0.4885 0.5682 0.4885 0.6990
No log 6.9143 484 0.5452 0.5738 0.5452 0.7384
No log 6.9429 486 0.6509 0.5324 0.6509 0.8068
No log 6.9714 488 0.7169 0.5031 0.7169 0.8467
No log 7.0 490 0.7449 0.4442 0.7449 0.8631
No log 7.0286 492 0.6299 0.5491 0.6299 0.7937
No log 7.0571 494 0.5806 0.5160 0.5806 0.7620
No log 7.0857 496 0.6133 0.5252 0.6133 0.7832
No log 7.1143 498 0.6072 0.4859 0.6072 0.7792
0.3349 7.1429 500 0.5359 0.5554 0.5359 0.7320
0.3349 7.1714 502 0.5319 0.4832 0.5319 0.7293
0.3349 7.2 504 0.5213 0.5046 0.5213 0.7220
0.3349 7.2286 506 0.5139 0.4869 0.5139 0.7169
0.3349 7.2571 508 0.4972 0.5798 0.4972 0.7051
0.3349 7.2857 510 0.4916 0.5344 0.4916 0.7011
0.3349 7.3143 512 0.4923 0.5344 0.4923 0.7016
0.3349 7.3429 514 0.5126 0.4869 0.5126 0.7160
0.3349 7.3714 516 0.5171 0.4632 0.5171 0.7191
0.3349 7.4 518 0.5152 0.4632 0.5152 0.7178
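Reading the table, the lowest validation loss (0.4733) occurs at epoch 6.7714 / step 474, while the best Qwk (0.6503) comes earlier, at epoch 5.6286 / step 394; the final checkpoint at step 518 is best on neither metric. A small sketch of picking a checkpoint from such a log, using a few rows excerpted from the table above:

```python
# A few (step, eval_loss, qwk) rows excerpted from the training log above.
rows = [
    (232, 0.5153, 0.6389),
    (394, 0.5056, 0.6503),
    (474, 0.4733, 0.5556),
    (518, 0.5152, 0.4632),  # final checkpoint
]

best_by_loss = min(rows, key=lambda r: r[1])  # lowest validation loss
best_by_qwk = max(rows, key=lambda r: r[2])   # highest quadratic weighted kappa
print(best_by_loss[0], best_by_qwk[0])        # 474 394
```

In transformers, `TrainingArguments(load_best_model_at_end=True, metric_for_best_model=...)` automates this selection; the card does not say whether it was used for this run.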

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size: 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k14_task7_organization

  • Finetuned from: aubmindlab/bert-base-arabertv02