ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k10_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the auto-generated card lists it as "None"). It achieves the following results on the evaluation set:

  • Loss: 0.5467
  • QWK (quadratic weighted kappa): 0.4674
  • MSE: 0.5467
  • RMSE: 0.7394

Model description

More information needed

Intended uses & limitations

More information needed
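Although the card gives no usage details, a minimal inference sketch follows. It assumes the model exposes a single-logit regression head (plausible given that MSE/RMSE are reported) and that scores fall on a small integer rubric scale; both the band set and the batch handling are assumptions, and the heavy imports are done lazily so the sketch can be read without transformers installed:

```python
def predict_score(
    texts,
    model_id="MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k10_task7_organization",
):
    """Score a batch of Arabic essays with the fine-tuned model.

    Assumes a single-logit regression head; returns one float per text.
    """
    import torch  # imported lazily: only needed when actually scoring
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    model.eval()
    enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        logits = model(**enc).logits
    return logits.squeeze(-1).tolist()

def nearest_band(score, bands=(0, 1, 2, 3, 4, 5)):
    """Snap a continuous prediction to the closest discrete rubric band.

    The band set here is a placeholder; adjust it to the task's real scale.
    """
    return min(bands, key=lambda b: abs(b - score))
```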

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
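The `linear` scheduler decays the learning rate from its initial value to zero over the total number of training steps, optionally after a warmup ramp. A pure-Python sketch of that schedule (the zero warmup count is an assumption, since the card does not report one):

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Linear warmup then linear decay to zero, mirroring the
    `linear` lr_scheduler_type above."""
    if step < warmup_steps:
        # Ramp up from 0 to base_lr during warmup.
        return base_lr * step / max(1, warmup_steps)
    # Decay linearly from base_lr at the end of warmup to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```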

Training results

(In the table below, "No log" means the training loss had not yet been logged at that step; the first logged training loss, 0.3491, appears at step 500.)

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0385 2 2.7009 -0.0702 2.7009 1.6434
No log 0.0769 4 1.4510 0.0503 1.4510 1.2046
No log 0.1154 6 1.1620 -0.1208 1.1620 1.0780
No log 0.1538 8 0.9621 0.0790 0.9621 0.9809
No log 0.1923 10 0.8845 0.1531 0.8845 0.9405
No log 0.2308 12 0.8014 0.3267 0.8014 0.8952
No log 0.2692 14 0.6733 0.1660 0.6733 0.8206
No log 0.3077 16 0.6490 0.1660 0.6490 0.8056
No log 0.3462 18 0.7639 0.2841 0.7639 0.8740
No log 0.3846 20 0.8313 0.2977 0.8313 0.9118
No log 0.4231 22 0.6697 0.2494 0.6697 0.8183
No log 0.4615 24 0.5844 0.2576 0.5844 0.7644
No log 0.5 26 0.6438 0.2804 0.6438 0.8024
No log 0.5385 28 1.0690 0.2746 1.0690 1.0339
No log 0.5769 30 1.5957 0.0389 1.5957 1.2632
No log 0.6154 32 1.5340 0.0796 1.5340 1.2386
No log 0.6538 34 1.1000 0.1983 1.1000 1.0488
No log 0.6923 36 0.7304 0.2804 0.7304 0.8546
No log 0.7308 38 0.6212 0.1903 0.6212 0.7882
No log 0.7692 40 0.6179 0.3274 0.6179 0.7861
No log 0.8077 42 0.6159 0.2606 0.6159 0.7848
No log 0.8462 44 0.6288 0.0889 0.6288 0.7930
No log 0.8846 46 0.6749 0.0937 0.6749 0.8215
No log 0.9231 48 0.6652 0.1744 0.6652 0.8156
No log 0.9615 50 0.5914 0.2786 0.5914 0.7690
No log 1.0 52 0.5849 0.3416 0.5849 0.7648
No log 1.0385 54 0.6199 0.2783 0.6199 0.7874
No log 1.0769 56 0.7846 0.3371 0.7846 0.8858
No log 1.1154 58 1.0099 0.2651 1.0099 1.0049
No log 1.1538 60 1.0494 0.2298 1.0494 1.0244
No log 1.1923 62 0.9354 0.2813 0.9354 0.9671
No log 1.2308 64 0.7551 0.3547 0.7551 0.8689
No log 1.2692 66 0.7901 0.3417 0.7901 0.8889
No log 1.3077 68 0.7959 0.3243 0.7959 0.8921
No log 1.3462 70 0.7406 0.3216 0.7406 0.8606
No log 1.3846 72 0.6356 0.4198 0.6356 0.7972
No log 1.4231 74 0.7391 0.3730 0.7391 0.8597
No log 1.4615 76 0.8733 0.3782 0.8733 0.9345
No log 1.5 78 0.7721 0.3943 0.7721 0.8787
No log 1.5385 80 0.6629 0.4474 0.6629 0.8142
No log 1.5769 82 0.6590 0.3936 0.6590 0.8118
No log 1.6154 84 0.6275 0.4198 0.6275 0.7922
No log 1.6538 86 0.6144 0.4198 0.6144 0.7838
No log 1.6923 88 0.5984 0.4407 0.5984 0.7736
No log 1.7308 90 0.7125 0.4671 0.7125 0.8441
No log 1.7692 92 0.9040 0.3600 0.9040 0.9508
No log 1.8077 94 0.8472 0.3477 0.8472 0.9204
No log 1.8462 96 0.7117 0.3683 0.7117 0.8436
No log 1.8846 98 0.6035 0.3580 0.6035 0.7768
No log 1.9231 100 0.6162 0.3945 0.6162 0.7850
No log 1.9615 102 0.6178 0.3945 0.6178 0.7860
No log 2.0 104 0.6037 0.4338 0.6037 0.7770
No log 2.0385 106 0.6701 0.3925 0.6701 0.8186
No log 2.0769 108 0.9055 0.3657 0.9055 0.9516
No log 2.1154 110 0.8722 0.3697 0.8722 0.9339
No log 2.1538 112 0.6395 0.3804 0.6395 0.7997
No log 2.1923 114 0.5985 0.3988 0.5985 0.7736
No log 2.2308 116 0.5884 0.3703 0.5884 0.7671
No log 2.2692 118 0.7120 0.3661 0.7120 0.8438
No log 2.3077 120 0.7939 0.3827 0.7939 0.8910
No log 2.3462 122 0.7102 0.3707 0.7102 0.8428
No log 2.3846 124 0.5685 0.4136 0.5685 0.7540
No log 2.4231 126 0.6323 0.4270 0.6323 0.7952
No log 2.4615 128 0.6725 0.4112 0.6725 0.8200
No log 2.5 130 0.5814 0.5131 0.5814 0.7625
No log 2.5385 132 0.5903 0.3446 0.5903 0.7683
No log 2.5769 134 0.6035 0.3446 0.6035 0.7768
No log 2.6154 136 0.5806 0.2996 0.5806 0.7619
No log 2.6538 138 0.5838 0.4267 0.5838 0.7641
No log 2.6923 140 0.6123 0.5017 0.6123 0.7825
No log 2.7308 142 0.6023 0.5017 0.6023 0.7761
No log 2.7692 144 0.5671 0.3995 0.5671 0.7530
No log 2.8077 146 0.6375 0.4335 0.6375 0.7984
No log 2.8462 148 0.6745 0.4396 0.6745 0.8213
No log 2.8846 150 0.6545 0.4242 0.6545 0.8090
No log 2.9231 152 0.5447 0.6197 0.5447 0.7380
No log 2.9615 154 0.5298 0.5565 0.5298 0.7279
No log 3.0 156 0.5321 0.5640 0.5321 0.7294
No log 3.0385 158 0.5447 0.5592 0.5447 0.7381
No log 3.0769 160 0.7166 0.4511 0.7166 0.8465
No log 3.1154 162 0.7814 0.4462 0.7814 0.8840
No log 3.1538 164 0.6497 0.4219 0.6497 0.8060
No log 3.1923 166 0.5315 0.4444 0.5315 0.7290
No log 3.2308 168 0.5608 0.5597 0.5608 0.7488
No log 3.2692 170 0.6121 0.5149 0.6121 0.7824
No log 3.3077 172 0.5763 0.5327 0.5763 0.7591
No log 3.3462 174 0.5670 0.4044 0.5670 0.7530
No log 3.3846 176 0.7460 0.3180 0.7460 0.8637
No log 3.4231 178 0.8805 0.3389 0.8805 0.9384
No log 3.4615 180 0.7916 0.3137 0.7916 0.8897
No log 3.5 182 0.6888 0.4410 0.6888 0.8299
No log 3.5385 184 0.5592 0.5915 0.5592 0.7478
No log 3.5769 186 0.5516 0.6255 0.5516 0.7427
No log 3.6154 188 0.5676 0.5044 0.5676 0.7534
No log 3.6538 190 0.5442 0.5722 0.5442 0.7377
No log 3.6923 192 0.5802 0.5200 0.5802 0.7617
No log 3.7308 194 0.6070 0.4544 0.6070 0.7791
No log 3.7692 196 0.6068 0.5136 0.6068 0.7790
No log 3.8077 198 0.5552 0.6073 0.5552 0.7451
No log 3.8462 200 0.5494 0.5362 0.5494 0.7412
No log 3.8846 202 0.5892 0.5345 0.5892 0.7676
No log 3.9231 204 0.5639 0.6032 0.5639 0.7509
No log 3.9615 206 0.5830 0.6606 0.5830 0.7635
No log 4.0 208 0.7546 0.4278 0.7546 0.8687
No log 4.0385 210 1.0267 0.3861 1.0267 1.0133
No log 4.0769 212 1.1684 0.2723 1.1684 1.0809
No log 4.1154 214 1.0867 0.2723 1.0867 1.0425
No log 4.1538 216 0.8016 0.3884 0.8016 0.8953
No log 4.1923 218 0.5710 0.5701 0.5710 0.7557
No log 4.2308 220 0.5533 0.5965 0.5533 0.7438
No log 4.2692 222 0.5557 0.5965 0.5557 0.7455
No log 4.3077 224 0.5543 0.5993 0.5543 0.7445
No log 4.3462 226 0.6156 0.4955 0.6156 0.7846
No log 4.3846 228 0.6647 0.3909 0.6647 0.8153
No log 4.4231 230 0.6642 0.4535 0.6642 0.8150
No log 4.4615 232 0.6051 0.4883 0.6051 0.7779
No log 4.5 234 0.6170 0.5428 0.6170 0.7855
No log 4.5385 236 0.6769 0.5560 0.6769 0.8227
No log 4.5769 238 0.6884 0.5543 0.6884 0.8297
No log 4.6154 240 0.6501 0.5518 0.6501 0.8063
No log 4.6538 242 0.6646 0.4057 0.6646 0.8152
No log 4.6923 244 0.7088 0.4228 0.7088 0.8419
No log 4.7308 246 0.7173 0.3832 0.7173 0.8469
No log 4.7692 248 0.6710 0.3645 0.6710 0.8191
No log 4.8077 250 0.6290 0.4422 0.6290 0.7931
No log 4.8462 252 0.6064 0.4738 0.6064 0.7787
No log 4.8846 254 0.6105 0.4472 0.6105 0.7813
No log 4.9231 256 0.6614 0.4619 0.6614 0.8132
No log 4.9615 258 0.6348 0.4619 0.6348 0.7967
No log 5.0 260 0.6047 0.4602 0.6047 0.7776
No log 5.0385 262 0.6043 0.4811 0.6043 0.7774
No log 5.0769 264 0.5746 0.5714 0.5746 0.7580
No log 5.1154 266 0.5729 0.5398 0.5729 0.7569
No log 5.1538 268 0.6194 0.4794 0.6194 0.7870
No log 5.1923 270 0.6138 0.4849 0.6138 0.7834
No log 5.2308 272 0.5736 0.6078 0.5736 0.7573
No log 5.2692 274 0.5757 0.5559 0.5757 0.7588
No log 5.3077 276 0.5581 0.5357 0.5581 0.7471
No log 5.3462 278 0.5496 0.5826 0.5496 0.7414
No log 5.3846 280 0.5996 0.4272 0.5996 0.7743
No log 5.4231 282 0.6156 0.4510 0.6156 0.7846
No log 5.4615 284 0.5702 0.4391 0.5702 0.7551
No log 5.5 286 0.5450 0.4875 0.5450 0.7382
No log 5.5385 288 0.5625 0.4616 0.5625 0.7500
No log 5.5769 290 0.6133 0.4327 0.6133 0.7831
No log 5.6154 292 0.6811 0.4297 0.6811 0.8253
No log 5.6538 294 0.6338 0.4168 0.6338 0.7961
No log 5.6923 296 0.5513 0.6280 0.5513 0.7425
No log 5.7308 298 0.4979 0.6133 0.4979 0.7056
No log 5.7692 300 0.4972 0.6133 0.4972 0.7051
No log 5.8077 302 0.4907 0.6133 0.4907 0.7005
No log 5.8462 304 0.5279 0.6195 0.5279 0.7266
No log 5.8846 306 0.5796 0.4948 0.5796 0.7613
No log 5.9231 308 0.5216 0.5724 0.5216 0.7222
No log 5.9615 310 0.4673 0.6650 0.4673 0.6836
No log 6.0 312 0.4889 0.5816 0.4889 0.6992
No log 6.0385 314 0.4685 0.6254 0.4685 0.6845
No log 6.0769 316 0.5305 0.5678 0.5305 0.7284
No log 6.1154 318 0.7211 0.4338 0.7211 0.8492
No log 6.1538 320 0.8131 0.3864 0.8131 0.9017
No log 6.1923 322 0.6725 0.4222 0.6725 0.8201
No log 6.2308 324 0.5104 0.5250 0.5104 0.7144
No log 6.2692 326 0.5335 0.5577 0.5335 0.7304
No log 6.3077 328 0.6563 0.4153 0.6563 0.8101
No log 6.3462 330 0.6458 0.4385 0.6458 0.8036
No log 6.3846 332 0.5523 0.5655 0.5523 0.7432
No log 6.4231 334 0.5082 0.6014 0.5082 0.7129
No log 6.4615 336 0.5497 0.5751 0.5497 0.7414
No log 6.5 338 0.5489 0.5751 0.5489 0.7409
No log 6.5385 340 0.5286 0.5809 0.5286 0.7271
No log 6.5769 342 0.5358 0.6298 0.5358 0.7320
No log 6.6154 344 0.5573 0.6065 0.5573 0.7465
No log 6.6538 346 0.5819 0.6083 0.5819 0.7628
No log 6.6923 348 0.5493 0.6307 0.5493 0.7411
No log 6.7308 350 0.5673 0.6451 0.5673 0.7532
No log 6.7692 352 0.6010 0.5166 0.6010 0.7753
No log 6.8077 354 0.5590 0.6451 0.5590 0.7477
No log 6.8462 356 0.5401 0.6340 0.5401 0.7349
No log 6.8846 358 0.5485 0.6855 0.5485 0.7406
No log 6.9231 360 0.5415 0.6400 0.5415 0.7359
No log 6.9615 362 0.5467 0.6198 0.5467 0.7394
No log 7.0 364 0.5545 0.6078 0.5545 0.7446
No log 7.0385 366 0.5755 0.5190 0.5755 0.7586
No log 7.0769 368 0.5614 0.5697 0.5614 0.7493
No log 7.1154 370 0.5721 0.6400 0.5721 0.7564
No log 7.1538 372 0.5969 0.5822 0.5969 0.7726
No log 7.1923 374 0.5833 0.6743 0.5833 0.7637
No log 7.2308 376 0.5870 0.5528 0.5870 0.7661
No log 7.2692 378 0.5907 0.5322 0.5907 0.7685
No log 7.3077 380 0.5660 0.5574 0.5660 0.7524
No log 7.3462 382 0.5583 0.5075 0.5583 0.7472
No log 7.3846 384 0.5740 0.5250 0.5740 0.7576
No log 7.4231 386 0.5589 0.6007 0.5589 0.7476
No log 7.4615 388 0.5552 0.6265 0.5552 0.7451
No log 7.5 390 0.5644 0.6265 0.5644 0.7513
No log 7.5385 392 0.5760 0.5974 0.5760 0.7590
No log 7.5769 394 0.5757 0.6129 0.5757 0.7588
No log 7.6154 396 0.5726 0.5782 0.5726 0.7567
No log 7.6538 398 0.5879 0.4599 0.5879 0.7667
No log 7.6923 400 0.5863 0.4422 0.5863 0.7657
No log 7.7308 402 0.5864 0.5159 0.5864 0.7658
No log 7.7692 404 0.6881 0.4350 0.6881 0.8295
No log 7.8077 406 0.8135 0.4277 0.8135 0.9020
No log 7.8462 408 0.7598 0.4568 0.7598 0.8717
No log 7.8846 410 0.6271 0.4052 0.6271 0.7919
No log 7.9231 412 0.5440 0.5095 0.5440 0.7376
No log 7.9615 414 0.6013 0.4654 0.6013 0.7754
No log 8.0 416 0.7034 0.4511 0.7034 0.8387
No log 8.0385 418 0.7544 0.4367 0.7544 0.8686
No log 8.0769 420 0.7103 0.4910 0.7103 0.8428
No log 8.1154 422 0.5891 0.5701 0.5891 0.7675
No log 8.1538 424 0.5299 0.6076 0.5299 0.7280
No log 8.1923 426 0.5243 0.5912 0.5243 0.7241
No log 8.2308 428 0.5034 0.6650 0.5034 0.7095
No log 8.2692 430 0.5243 0.5575 0.5243 0.7241
No log 8.3077 432 0.6043 0.5175 0.6043 0.7773
No log 8.3462 434 0.6199 0.5175 0.6199 0.7873
No log 8.3846 436 0.5920 0.4971 0.5920 0.7694
No log 8.4231 438 0.5479 0.5607 0.5479 0.7402
No log 8.4615 440 0.5348 0.5607 0.5348 0.7313
No log 8.5 442 0.5258 0.5881 0.5258 0.7251
No log 8.5385 444 0.5048 0.5678 0.5048 0.7105
No log 8.5769 446 0.5358 0.5152 0.5358 0.7320
No log 8.6154 448 0.5964 0.4987 0.5964 0.7723
No log 8.6538 450 0.5732 0.5030 0.5732 0.7571
No log 8.6923 452 0.5485 0.5457 0.5485 0.7406
No log 8.7308 454 0.5255 0.5509 0.5255 0.7249
No log 8.7692 456 0.5082 0.5563 0.5082 0.7129
No log 8.8077 458 0.5193 0.5509 0.5193 0.7206
No log 8.8462 460 0.5390 0.5509 0.5390 0.7342
No log 8.8846 462 0.5588 0.5267 0.5588 0.7475
No log 8.9231 464 0.5223 0.5736 0.5223 0.7227
No log 8.9615 466 0.5147 0.5352 0.5147 0.7174
No log 9.0 468 0.5029 0.6092 0.5029 0.7092
No log 9.0385 470 0.5011 0.5523 0.5011 0.7079
No log 9.0769 472 0.5331 0.5127 0.5331 0.7302
No log 9.1154 474 0.5376 0.4614 0.5376 0.7332
No log 9.1538 476 0.5085 0.5335 0.5085 0.7131
No log 9.1923 478 0.5102 0.5335 0.5102 0.7143
No log 9.2308 480 0.4880 0.5611 0.4880 0.6985
No log 9.2692 482 0.4763 0.5024 0.4763 0.6902
No log 9.3077 484 0.4754 0.5267 0.4754 0.6895
No log 9.3462 486 0.4717 0.5815 0.4717 0.6868
No log 9.3846 488 0.4867 0.5390 0.4867 0.6976
No log 9.4231 490 0.5124 0.5467 0.5124 0.7158
No log 9.4615 492 0.5282 0.4980 0.5282 0.7268
No log 9.5 494 0.5059 0.5390 0.5059 0.7113
No log 9.5385 496 0.4941 0.5379 0.4941 0.7029
No log 9.5769 498 0.4945 0.5521 0.4945 0.7032
0.3491 9.6154 500 0.4979 0.5361 0.4979 0.7056
0.3491 9.6538 502 0.5276 0.4655 0.5276 0.7263
0.3491 9.6923 504 0.5434 0.4315 0.5434 0.7371
0.3491 9.7308 506 0.5438 0.4832 0.5438 0.7374
0.3491 9.7692 508 0.5382 0.4895 0.5382 0.7336
0.3491 9.8077 510 0.5467 0.4674 0.5467 0.7394
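The epoch column above implies 52 optimizer steps per epoch (epoch 1.0 lands on step 52). Assuming no gradient accumulation, this bounds the training-set size, as the small arithmetic check below shows:

```python
# Read off the log above: epoch 1.0 corresponds to step 52.
steps_per_epoch = 52
train_batch_size = 8  # from the hyperparameters section

# Upper bound on the number of training examples
# (the final batch of an epoch may be partial).
approx_train_examples = steps_per_epoch * train_batch_size
```

So the training split likely contains at most about 416 essays, consistent with a small essay-scoring dataset.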

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params
  • Tensor type: F32 (Safetensors)
Model tree

  • Full model ID: MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k10_task7_organization
  • Finetuned from: aubmindlab/bert-base-arabertv02