ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k6_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5001
  • Qwk: 0.5153
  • Mse: 0.5001
  • Rmse: 0.7072
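For reference, Rmse here is simply the square root of the reported Mse, and Qwk is Cohen's kappa with quadratic weights, a standard agreement metric for ordinal labels such as essay scores. A minimal self-contained sketch of both (the toy label lists are illustrative, not from this model's data):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the 'Qwk' metric above)."""
    n = len(y_true)
    # Observed confusion counts.
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic penalty
            num += w * O[i][j]
            den += w * hist_t[i] * hist_p[j] / n     # expected under independence
    return 1.0 - num / den

# RMSE is the square root of the reported MSE:
print(round(math.sqrt(0.5001), 4))  # 0.7072
print(quadratic_weighted_kappa([0, 1, 2, 2], [0, 1, 2, 1], 3))  # 0.8
```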

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
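With lr_scheduler_type linear and no warmup listed, the learning rate decays linearly from 2e-05 to 0 over the planned training run. A small sketch of that schedule; the 22 steps per epoch are read off the table below (epoch 1.0 at step 22), so 2200 total steps is an assumption derived from num_epochs 100:

```python
initial_lr = 2e-05
total_steps = 2200  # assumed: 22 steps/epoch (from the log below) x 100 epochs

def linear_lr(step, warmup_steps=0):
    """Sketch of a transformers-style linear schedule: optional linear
    warmup, then linear decay of the learning rate to zero."""
    if step < warmup_steps:
        return initial_lr * step / max(1, warmup_steps)
    return initial_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

print(linear_lr(0))     # 2e-05
print(linear_lr(1100))  # 1e-05 (halfway through training)
print(linear_lr(2200))  # 0.0
```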

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0909 2 2.5780 -0.0262 2.5780 1.6056
No log 0.1818 4 1.3852 0.0763 1.3852 1.1770
No log 0.2727 6 0.8594 0.0101 0.8594 0.9270
No log 0.3636 8 0.9364 0.1050 0.9364 0.9677
No log 0.4545 10 0.8739 0.1734 0.8739 0.9349
No log 0.5455 12 0.6938 0.2407 0.6938 0.8329
No log 0.6364 14 0.6548 0.1863 0.6548 0.8092
No log 0.7273 16 0.6618 -0.0027 0.6618 0.8135
No log 0.8182 18 0.6885 0.0481 0.6885 0.8298
No log 0.9091 20 0.8038 0.3218 0.8038 0.8965
No log 1.0 22 0.8400 0.3854 0.8400 0.9165
No log 1.0909 24 0.7830 0.3542 0.7830 0.8849
No log 1.1818 26 0.6820 0.1372 0.6820 0.8258
No log 1.2727 28 0.6331 0.0444 0.6331 0.7957
No log 1.3636 30 0.6117 0.0804 0.6117 0.7821
No log 1.4545 32 0.6227 0.0846 0.6227 0.7891
No log 1.5455 34 0.6601 0.2522 0.6601 0.8125
No log 1.6364 36 0.8503 0.3746 0.8503 0.9221
No log 1.7273 38 0.9242 0.4037 0.9242 0.9614
No log 1.8182 40 0.7543 0.3131 0.7543 0.8685
No log 1.9091 42 0.7095 0.3101 0.7095 0.8423
No log 2.0 44 0.8197 0.2977 0.8197 0.9054
No log 2.0909 46 0.8373 0.2977 0.8373 0.9150
No log 2.1818 48 0.7341 0.2804 0.7341 0.8568
No log 2.2727 50 0.6871 0.2804 0.6871 0.8289
No log 2.3636 52 0.6150 0.2118 0.6150 0.7842
No log 2.4545 54 0.6285 0.2808 0.6285 0.7928
No log 2.5455 56 0.8297 0.3270 0.8297 0.9109
No log 2.6364 58 0.9776 0.3719 0.9776 0.9887
No log 2.7273 60 0.9635 0.3641 0.9635 0.9816
No log 2.8182 62 0.7728 0.4162 0.7728 0.8791
No log 2.9091 64 0.6837 0.4568 0.6837 0.8269
No log 3.0 66 0.8513 0.4321 0.8513 0.9227
No log 3.0909 68 0.7847 0.4450 0.7847 0.8859
No log 3.1818 70 0.6214 0.5069 0.6214 0.7883
No log 3.2727 72 0.4979 0.5853 0.4979 0.7057
No log 3.3636 74 0.5530 0.5131 0.5530 0.7437
No log 3.4545 76 0.5163 0.4891 0.5163 0.7186
No log 3.5455 78 0.5375 0.4322 0.5375 0.7332
No log 3.6364 80 0.5321 0.4653 0.5321 0.7294
No log 3.7273 82 0.5115 0.5356 0.5115 0.7152
No log 3.8182 84 0.4992 0.5899 0.4992 0.7065
No log 3.9091 86 0.4927 0.5736 0.4927 0.7019
No log 4.0 88 0.6890 0.4462 0.6890 0.8301
No log 4.0909 90 0.8029 0.4277 0.8029 0.8961
No log 4.1818 92 0.6231 0.5543 0.6231 0.7894
No log 4.2727 94 0.5313 0.4507 0.5313 0.7289
No log 4.3636 96 0.4970 0.5123 0.4970 0.7049
No log 4.4545 98 0.5311 0.4371 0.5311 0.7288
No log 4.5455 100 0.5042 0.4086 0.5042 0.7101
No log 4.6364 102 0.5720 0.4677 0.5720 0.7563
No log 4.7273 104 0.6743 0.4814 0.6743 0.8212
No log 4.8182 106 0.5946 0.4756 0.5946 0.7711
No log 4.9091 108 0.5086 0.6132 0.5086 0.7132
No log 5.0 110 0.5560 0.4664 0.5560 0.7457
No log 5.0909 112 0.5253 0.5845 0.5253 0.7248
No log 5.1818 114 0.5572 0.5250 0.5572 0.7464
No log 5.2727 116 0.7416 0.3559 0.7416 0.8612
No log 5.3636 118 0.7901 0.3312 0.7901 0.8889
No log 5.4545 120 0.6801 0.3829 0.6801 0.8247
No log 5.5455 122 0.5394 0.5361 0.5394 0.7345
No log 5.6364 124 0.6272 0.4502 0.6272 0.7920
No log 5.7273 126 0.6528 0.4502 0.6528 0.8079
No log 5.8182 128 0.5813 0.4780 0.5813 0.7624
No log 5.9091 130 0.5683 0.5065 0.5683 0.7539
No log 6.0 132 0.5915 0.4517 0.5915 0.7691
No log 6.0909 134 0.6883 0.4431 0.6883 0.8297
No log 6.1818 136 0.6352 0.4610 0.6352 0.7970
No log 6.2727 138 0.5514 0.4937 0.5514 0.7426
No log 6.3636 140 0.6096 0.4933 0.6096 0.7808
No log 6.4545 142 0.5879 0.4642 0.5879 0.7667
No log 6.5455 144 0.5458 0.5079 0.5458 0.7388
No log 6.6364 146 0.6083 0.4652 0.6083 0.7799
No log 6.7273 148 0.6327 0.4708 0.6327 0.7954
No log 6.8182 150 0.5605 0.5189 0.5605 0.7487
No log 6.9091 152 0.5137 0.5617 0.5137 0.7167
No log 7.0 154 0.5475 0.4808 0.5475 0.7399
No log 7.0909 156 0.5721 0.4964 0.5721 0.7564
No log 7.1818 158 0.5130 0.5422 0.5130 0.7163
No log 7.2727 160 0.5059 0.5831 0.5059 0.7113
No log 7.3636 162 0.5353 0.5587 0.5353 0.7316
No log 7.4545 164 0.5465 0.5733 0.5465 0.7392
No log 7.5455 166 0.4923 0.6007 0.4923 0.7016
No log 7.6364 168 0.4963 0.5899 0.4963 0.7045
No log 7.7273 170 0.4824 0.6171 0.4824 0.6945
No log 7.8182 172 0.4994 0.5718 0.4994 0.7067
No log 7.9091 174 0.5163 0.5718 0.5163 0.7186
No log 8.0 176 0.4904 0.5945 0.4904 0.7003
No log 8.0909 178 0.4860 0.6339 0.4860 0.6971
No log 8.1818 180 0.5142 0.6240 0.5142 0.7171
No log 8.2727 182 0.4714 0.6542 0.4714 0.6866
No log 8.3636 184 0.5270 0.5841 0.5270 0.7259
No log 8.4545 186 0.6024 0.5445 0.6024 0.7761
No log 8.5455 188 0.5460 0.5611 0.5460 0.7389
No log 8.6364 190 0.5112 0.6060 0.5112 0.7150
No log 8.7273 192 0.5265 0.6065 0.5265 0.7256
No log 8.8182 194 0.5399 0.5988 0.5399 0.7348
No log 8.9091 196 0.5355 0.6133 0.5355 0.7318
No log 9.0 198 0.6144 0.4959 0.6144 0.7838
No log 9.0909 200 0.6249 0.5312 0.6249 0.7905
No log 9.1818 202 0.5453 0.5848 0.5453 0.7384
No log 9.2727 204 0.5262 0.6699 0.5262 0.7254
No log 9.3636 206 0.5164 0.6221 0.5164 0.7186
No log 9.4545 208 0.5092 0.5970 0.5092 0.7136
No log 9.5455 210 0.5591 0.5599 0.5591 0.7477
No log 9.6364 212 0.7073 0.4703 0.7073 0.8410
No log 9.7273 214 0.6602 0.5343 0.6602 0.8125
No log 9.8182 216 0.5125 0.5356 0.5125 0.7159
No log 9.9091 218 0.4876 0.6046 0.4876 0.6983
No log 10.0 220 0.4958 0.6046 0.4958 0.7041
No log 10.0909 222 0.4908 0.5941 0.4908 0.7006
No log 10.1818 224 0.5212 0.5356 0.5212 0.7220
No log 10.2727 226 0.5311 0.5144 0.5311 0.7288
No log 10.3636 228 0.5215 0.5144 0.5215 0.7222
No log 10.4545 230 0.4983 0.6073 0.4983 0.7059
No log 10.5455 232 0.5286 0.6434 0.5286 0.7270
No log 10.6364 234 0.6124 0.5246 0.6124 0.7826
No log 10.7273 236 0.6121 0.5445 0.6121 0.7824
No log 10.8182 238 0.5233 0.6198 0.5233 0.7234
No log 10.9091 240 0.5556 0.6526 0.5556 0.7454
No log 11.0 242 0.6240 0.5966 0.6240 0.7899
No log 11.0909 244 0.6249 0.5966 0.6249 0.7905
No log 11.1818 246 0.5334 0.6606 0.5334 0.7303
No log 11.2727 248 0.4969 0.6046 0.4969 0.7049
No log 11.3636 250 0.4963 0.6034 0.4963 0.7045
No log 11.4545 252 0.4824 0.6059 0.4824 0.6945
No log 11.5455 254 0.5391 0.6900 0.5391 0.7342
No log 11.6364 256 0.6501 0.4890 0.6501 0.8063
No log 11.7273 258 0.6746 0.4350 0.6746 0.8213
No log 11.8182 260 0.5688 0.6643 0.5688 0.7542
No log 11.9091 262 0.4939 0.6059 0.4939 0.7028
No log 12.0 264 0.4899 0.5878 0.4899 0.7000
No log 12.0909 266 0.4962 0.6160 0.4962 0.7044
No log 12.1818 268 0.5155 0.5569 0.5155 0.7180
No log 12.2727 270 0.5230 0.5399 0.5230 0.7232
No log 12.3636 272 0.5292 0.4945 0.5292 0.7274
No log 12.4545 274 0.5144 0.5543 0.5144 0.7172
No log 12.5455 276 0.5200 0.6549 0.5200 0.7211
No log 12.6364 278 0.5299 0.6455 0.5299 0.7279
No log 12.7273 280 0.5018 0.6550 0.5018 0.7083
No log 12.8182 282 0.5237 0.5008 0.5237 0.7237
No log 12.9091 284 0.5539 0.5706 0.5539 0.7442
No log 13.0 286 0.5012 0.4928 0.5012 0.7080
No log 13.0909 288 0.4737 0.6171 0.4737 0.6882
No log 13.1818 290 0.4672 0.6185 0.4672 0.6835
No log 13.2727 292 0.4639 0.5988 0.4639 0.6811
No log 13.3636 294 0.4678 0.5846 0.4678 0.6839
No log 13.4545 296 0.5315 0.5498 0.5315 0.7290
No log 13.5455 298 0.5494 0.5567 0.5494 0.7412
No log 13.6364 300 0.5004 0.5678 0.5004 0.7074
No log 13.7273 302 0.4863 0.6173 0.4863 0.6973
No log 13.8182 304 0.6284 0.5461 0.6284 0.7927
No log 13.9091 306 0.6731 0.4903 0.6731 0.8204
No log 14.0 308 0.6281 0.5368 0.6281 0.7925
No log 14.0909 310 0.5337 0.6004 0.5337 0.7305
No log 14.1818 312 0.5091 0.6004 0.5091 0.7135
No log 14.2727 314 0.4965 0.4885 0.4965 0.7046
No log 14.3636 316 0.4912 0.6171 0.4912 0.7009
No log 14.4545 318 0.5072 0.6452 0.5072 0.7122
No log 14.5455 320 0.5444 0.6620 0.5444 0.7378
No log 14.6364 322 0.5511 0.6871 0.5511 0.7424
No log 14.7273 324 0.5258 0.6721 0.5258 0.7251
No log 14.8182 326 0.4942 0.6570 0.4942 0.7030
No log 14.9091 328 0.4794 0.6244 0.4794 0.6924
No log 15.0 330 0.4669 0.6735 0.4669 0.6833
No log 15.0909 332 0.4615 0.6474 0.4615 0.6793
No log 15.1818 334 0.4626 0.6550 0.4626 0.6801
No log 15.2727 336 0.4594 0.6716 0.4594 0.6778
No log 15.3636 338 0.4616 0.6716 0.4616 0.6794
No log 15.4545 340 0.4880 0.5237 0.4880 0.6986
No log 15.5455 342 0.5010 0.5524 0.5010 0.7078
No log 15.6364 344 0.4794 0.5781 0.4794 0.6924
No log 15.7273 346 0.4729 0.5555 0.4729 0.6877
No log 15.8182 348 0.4833 0.5596 0.4833 0.6952
No log 15.9091 350 0.4771 0.5135 0.4771 0.6907
No log 16.0 352 0.4799 0.6040 0.4799 0.6928
No log 16.0909 354 0.4874 0.6438 0.4874 0.6981
No log 16.1818 356 0.4780 0.6060 0.4780 0.6914
No log 16.2727 358 0.4807 0.6156 0.4807 0.6933
No log 16.3636 360 0.4881 0.5656 0.4881 0.6987
No log 16.4545 362 0.5173 0.5538 0.5173 0.7193
No log 16.5455 364 0.5313 0.5065 0.5313 0.7289
No log 16.6364 366 0.5510 0.5334 0.5510 0.7423
No log 16.7273 368 0.5451 0.5883 0.5451 0.7383
No log 16.8182 370 0.4957 0.5966 0.4957 0.7040
No log 16.9091 372 0.4869 0.5655 0.4869 0.6978
No log 17.0 374 0.5089 0.6353 0.5089 0.7134
No log 17.0909 376 0.5641 0.5741 0.5641 0.7510
No log 17.1818 378 0.5659 0.5741 0.5659 0.7522
No log 17.2727 380 0.5360 0.5741 0.5360 0.7321
No log 17.3636 382 0.5116 0.5647 0.5116 0.7152
No log 17.4545 384 0.4966 0.5640 0.4966 0.7047
No log 17.5455 386 0.5307 0.5751 0.5307 0.7285
No log 17.6364 388 0.5501 0.5483 0.5501 0.7417
No log 17.7273 390 0.5168 0.6170 0.5168 0.7189
No log 17.8182 392 0.5214 0.5697 0.5214 0.7221
No log 17.9091 394 0.5792 0.5574 0.5792 0.7610
No log 18.0 396 0.5772 0.5538 0.5772 0.7597
No log 18.0909 398 0.5220 0.5701 0.5220 0.7225
No log 18.1818 400 0.5071 0.5460 0.5071 0.7121
No log 18.2727 402 0.5303 0.6577 0.5303 0.7282
No log 18.3636 404 0.5324 0.6305 0.5324 0.7297
No log 18.4545 406 0.5118 0.6389 0.5118 0.7154
No log 18.5455 408 0.5016 0.5389 0.5016 0.7082
No log 18.6364 410 0.5359 0.5897 0.5359 0.7320
No log 18.7273 412 0.5680 0.5761 0.5680 0.7537
No log 18.8182 414 0.5529 0.5732 0.5529 0.7435
No log 18.9091 416 0.4978 0.6263 0.4978 0.7056
No log 19.0 418 0.4815 0.5929 0.4815 0.6939
No log 19.0909 420 0.5234 0.6292 0.5234 0.7234
No log 19.1818 422 0.6224 0.5592 0.6224 0.7889
No log 19.2727 424 0.6722 0.5415 0.6722 0.8199
No log 19.3636 426 0.6291 0.5625 0.6291 0.7932
No log 19.4545 428 0.5543 0.6399 0.5543 0.7445
No log 19.5455 430 0.5165 0.5406 0.5165 0.7187
No log 19.6364 432 0.5266 0.5865 0.5266 0.7257
No log 19.7273 434 0.5378 0.6529 0.5378 0.7334
No log 19.8182 436 0.5188 0.6015 0.5188 0.7203
No log 19.9091 438 0.5004 0.5853 0.5004 0.7074
No log 20.0 440 0.5079 0.5697 0.5079 0.7126
No log 20.0909 442 0.5265 0.6247 0.5265 0.7256
No log 20.1818 444 0.5138 0.6053 0.5138 0.7168
No log 20.2727 446 0.4892 0.5797 0.4892 0.6994
No log 20.3636 448 0.4817 0.5405 0.4817 0.6941
No log 20.4545 450 0.4962 0.6341 0.4962 0.7044
No log 20.5455 452 0.5130 0.6088 0.5130 0.7162
No log 20.6364 454 0.4994 0.6272 0.4994 0.7067
No log 20.7273 456 0.4799 0.6052 0.4799 0.6927
No log 20.8182 458 0.4818 0.5614 0.4818 0.6941
No log 20.9091 460 0.5035 0.5979 0.5035 0.7096
No log 21.0 462 0.4967 0.5979 0.4967 0.7048
No log 21.0909 464 0.4949 0.5966 0.4949 0.7035
No log 21.1818 466 0.4968 0.5966 0.4968 0.7048
No log 21.2727 468 0.4949 0.5966 0.4949 0.7035
No log 21.3636 470 0.4840 0.5823 0.4840 0.6957
No log 21.4545 472 0.4808 0.5810 0.4808 0.6934
No log 21.5455 474 0.4842 0.6245 0.4842 0.6959
No log 21.6364 476 0.4901 0.6257 0.4901 0.7001
No log 21.7273 478 0.4844 0.6160 0.4844 0.6960
No log 21.8182 480 0.4865 0.5687 0.4865 0.6975
No log 21.9091 482 0.4940 0.5827 0.4940 0.7028
No log 22.0 484 0.4886 0.5434 0.4886 0.6990
No log 22.0909 486 0.4886 0.5523 0.4886 0.6990
No log 22.1818 488 0.4961 0.5538 0.4961 0.7043
No log 22.2727 490 0.5106 0.5283 0.5106 0.7145
No log 22.3636 492 0.5003 0.5267 0.5003 0.7073
No log 22.4545 494 0.4884 0.5467 0.4884 0.6989
No log 22.5455 496 0.4819 0.5812 0.4819 0.6942
No log 22.6364 498 0.4747 0.5557 0.4747 0.6890
0.2537 22.7273 500 0.4798 0.6083 0.4798 0.6927
0.2537 22.8182 502 0.4727 0.5949 0.4727 0.6875
0.2537 22.9091 504 0.4891 0.5144 0.4891 0.6994
0.2537 23.0 506 0.5173 0.5317 0.5173 0.7193
0.2537 23.0909 508 0.5352 0.5317 0.5352 0.7316
0.2537 23.1818 510 0.5001 0.5153 0.5001 0.7072
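Note that the reported evaluation Qwk (0.5153) matches the final logged row rather than the table's best checkpoint (Qwk 0.6900 at epoch 11.5455, step 254). A small helper, assuming log rows in the format above, for picking the checkpoint with the highest validation Qwk:

```python
def best_by_qwk(rows):
    """Return the (epoch, step, val_loss, qwk) row with the highest Qwk."""
    parsed = [tuple(float(x) for x in line.split()) for line in rows]
    return max(parsed, key=lambda r: r[3])

# A few sample rows copied from the table above (epoch, step, val loss, Qwk):
sample = [
    "0.0909 2 2.5780 -0.0262",
    "3.2727 72 0.4979 0.5853",
    "11.5455 254 0.5391 0.6900",
    "23.1818 510 0.5001 0.5153",
]
print(best_by_qwk(sample))  # (11.5455, 254.0, 0.5391, 0.69)
```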

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params · Tensor type: F32 · Format: Safetensors
Model tree for MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k6_task7_organization

Fine-tuned from aubmindlab/bert-base-arabertv02 (one of 4019 fine-tunes of that base model).