ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k2_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5197
  • Qwk: 0.4747
  • Mse: 0.5197
  • Rmse: 0.7209
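The reported Loss and Mse are identical, which is consistent with a regression head trained on an MSE objective; Rmse is its square root, and Qwk is quadratic weighted kappa on the discretized scores. A minimal sketch of how these metrics can be computed with scikit-learn and NumPy (the label values below are illustrative, not from the actual evaluation set):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Illustrative gold and predicted organization scores (not real data).
y_true = np.array([0, 1, 2, 2, 3, 1])
y_pred = np.array([0, 1, 1, 2, 3, 2])

# Quadratic weighted kappa: agreement corrected for chance, with
# disagreements penalized by the squared distance between scores.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

# MSE and its square root, matching the Mse/Rmse columns above.
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
```

For continuous regression outputs, predictions would typically be rounded to the nearest valid score before computing Qwk.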

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
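The hyperparameters above can be expressed as a Transformers `TrainingArguments` configuration. This is a sketch only: the output directory is a placeholder, and dataset loading, the model head, and the metric callback are omitted.

```python
from transformers import TrainingArguments

# Configuration mirroring the hyperparameter list above.
# "output_dir" is a hypothetical placeholder path.
args = TrainingArguments(
    output_dir="arabert-task7-organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    # Adam settings matching betas=(0.9, 0.999), epsilon=1e-08.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```

These arguments would then be passed to a `Trainer` together with the model, datasets, and a `compute_metrics` function producing the Qwk/Mse/Rmse columns reported below.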

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1818 2 2.7163 -0.0230 2.7163 1.6481
No log 0.3636 4 1.2413 0.0479 1.2413 1.1141
No log 0.5455 6 0.7815 0.1372 0.7815 0.8840
No log 0.7273 8 0.7555 0.0618 0.7555 0.8692
No log 0.9091 10 1.0030 0.1671 1.0030 1.0015
No log 1.0909 12 1.2137 0.2160 1.2137 1.1017
No log 1.2727 14 1.1959 0.2160 1.1959 1.0936
No log 1.4545 16 0.9816 0.2589 0.9816 0.9907
No log 1.6364 18 0.7210 0.4731 0.7210 0.8491
No log 1.8182 20 0.6604 0.1922 0.6604 0.8127
No log 2.0 22 0.6511 0.0444 0.6511 0.8069
No log 2.1818 24 0.6377 0.1962 0.6377 0.7986
No log 2.3636 26 0.6856 0.1815 0.6856 0.8280
No log 2.5455 28 0.7305 0.2904 0.7305 0.8547
No log 2.7273 30 0.6723 0.2932 0.6723 0.8199
No log 2.9091 32 0.5823 0.2819 0.5823 0.7631
No log 3.0909 34 0.6209 0.3407 0.6209 0.7879
No log 3.2727 36 0.6785 0.3398 0.6785 0.8237
No log 3.4545 38 0.6681 0.3398 0.6681 0.8174
No log 3.6364 40 0.6657 0.3398 0.6657 0.8159
No log 3.8182 42 0.5855 0.5030 0.5855 0.7652
No log 4.0 44 0.5704 0.4724 0.5704 0.7553
No log 4.1818 46 0.5702 0.4918 0.5702 0.7551
No log 4.3636 48 0.6287 0.3693 0.6287 0.7929
No log 4.5455 50 0.5746 0.5142 0.5746 0.7580
No log 4.7273 52 0.6737 0.3918 0.6737 0.8208
No log 4.9091 54 0.6944 0.3918 0.6944 0.8333
No log 5.0909 56 0.5955 0.4257 0.5955 0.7717
No log 5.2727 58 0.6439 0.4139 0.6439 0.8024
No log 5.4545 60 0.7221 0.3999 0.7221 0.8497
No log 5.6364 62 0.5508 0.4904 0.5508 0.7421
No log 5.8182 64 0.5721 0.4743 0.5721 0.7564
No log 6.0 66 0.5771 0.4493 0.5771 0.7597
No log 6.1818 68 0.5506 0.5539 0.5506 0.7420
No log 6.3636 70 0.6296 0.3693 0.6296 0.7935
No log 6.5455 72 0.8038 0.3155 0.8038 0.8965
No log 6.7273 74 0.7013 0.3902 0.7013 0.8374
No log 6.9091 76 0.5401 0.3860 0.5401 0.7349
No log 7.0909 78 0.5437 0.4158 0.5437 0.7374
No log 7.2727 80 0.5223 0.5587 0.5223 0.7227
No log 7.4545 82 0.5190 0.4407 0.5190 0.7204
No log 7.6364 84 0.6059 0.4695 0.6059 0.7784
No log 7.8182 86 0.7664 0.3870 0.7664 0.8754
No log 8.0 88 0.6768 0.4879 0.6768 0.8227
No log 8.1818 90 0.5480 0.4752 0.5480 0.7403
No log 8.3636 92 0.6895 0.3519 0.6895 0.8304
No log 8.5455 94 0.7449 0.4275 0.7449 0.8630
No log 8.7273 96 0.6332 0.4393 0.6332 0.7958
No log 8.9091 98 0.6297 0.4841 0.6297 0.7935
No log 9.0909 100 0.8660 0.4651 0.8660 0.9306
No log 9.2727 102 0.8206 0.4635 0.8206 0.9059
No log 9.4545 104 0.6777 0.5085 0.6777 0.8232
No log 9.6364 106 0.6161 0.4432 0.6161 0.7849
No log 9.8182 108 0.6239 0.4737 0.6239 0.7899
No log 10.0 110 0.6090 0.4754 0.6090 0.7804
No log 10.1818 112 0.5998 0.4754 0.5998 0.7745
No log 10.3636 114 0.5845 0.4016 0.5845 0.7645
No log 10.5455 116 0.6341 0.4694 0.6341 0.7963
No log 10.7273 118 0.6072 0.4635 0.6072 0.7792
No log 10.9091 120 0.6189 0.4569 0.6189 0.7867
No log 11.0909 122 0.6732 0.4215 0.6732 0.8205
No log 11.2727 124 0.6081 0.4615 0.6081 0.7798
No log 11.4545 126 0.7098 0.5163 0.7098 0.8425
No log 11.6364 128 0.8365 0.4632 0.8365 0.9146
No log 11.8182 130 0.7477 0.3991 0.7477 0.8647
No log 12.0 132 0.5892 0.4413 0.5892 0.7676
No log 12.1818 134 0.5707 0.5079 0.5707 0.7555
No log 12.3636 136 0.5671 0.4858 0.5671 0.7531
No log 12.5455 138 0.5453 0.4923 0.5453 0.7385
No log 12.7273 140 0.5452 0.4885 0.5452 0.7384
No log 12.9091 142 0.5333 0.5143 0.5333 0.7303
No log 13.0909 144 0.5857 0.4633 0.5857 0.7653
No log 13.2727 146 0.5812 0.4776 0.5812 0.7623
No log 13.4545 148 0.5220 0.4697 0.5220 0.7225
No log 13.6364 150 0.5594 0.4260 0.5594 0.7479
No log 13.8182 152 0.6360 0.4887 0.6360 0.7975
No log 14.0 154 0.6902 0.5325 0.6902 0.8308
No log 14.1818 156 0.5808 0.4980 0.5808 0.7621
No log 14.3636 158 0.5445 0.4526 0.5445 0.7379
No log 14.5455 160 0.6110 0.4918 0.6110 0.7816
No log 14.7273 162 0.5874 0.4766 0.5874 0.7664
No log 14.9091 164 0.5589 0.5412 0.5589 0.7476
No log 15.0909 166 0.5847 0.5236 0.5847 0.7646
No log 15.2727 168 0.5652 0.5190 0.5652 0.7518
No log 15.4545 170 0.4956 0.5476 0.4956 0.7040
No log 15.6364 172 0.4880 0.6269 0.4880 0.6986
No log 15.8182 174 0.5127 0.5065 0.5127 0.7160
No log 16.0 176 0.4992 0.6168 0.4992 0.7065
No log 16.1818 178 0.4903 0.5460 0.4903 0.7002
No log 16.3636 180 0.5128 0.4968 0.5128 0.7161
No log 16.5455 182 0.5310 0.4968 0.5310 0.7287
No log 16.7273 184 0.4930 0.5419 0.4930 0.7021
No log 16.9091 186 0.4849 0.6383 0.4849 0.6964
No log 17.0909 188 0.4806 0.6184 0.4806 0.6933
No log 17.2727 190 0.5047 0.4642 0.5047 0.7104
No log 17.4545 192 0.5531 0.4827 0.5531 0.7437
No log 17.6364 194 0.5736 0.4827 0.5736 0.7574
No log 17.8182 196 0.6114 0.4949 0.6114 0.7820
No log 18.0 198 0.5685 0.4777 0.5685 0.7540
No log 18.1818 200 0.5063 0.5003 0.5063 0.7115
No log 18.3636 202 0.5075 0.4190 0.5075 0.7124
No log 18.5455 204 0.5170 0.5289 0.5170 0.7190
No log 18.7273 206 0.5118 0.4555 0.5118 0.7154
No log 18.9091 208 0.5118 0.4973 0.5118 0.7154
No log 19.0909 210 0.5062 0.4973 0.5062 0.7115
No log 19.2727 212 0.5299 0.5256 0.5299 0.7279
No log 19.4545 214 0.5175 0.5190 0.5175 0.7194
No log 19.6364 216 0.4966 0.5846 0.4966 0.7047
No log 19.8182 218 0.5001 0.6292 0.5001 0.7072
No log 20.0 220 0.5049 0.6491 0.5049 0.7106
No log 20.1818 222 0.5350 0.5283 0.5350 0.7314
No log 20.3636 224 0.5124 0.5812 0.5124 0.7158
No log 20.5455 226 0.5274 0.5411 0.5274 0.7262
No log 20.7273 228 0.5369 0.5552 0.5369 0.7327
No log 20.9091 230 0.6125 0.5307 0.6125 0.7826
No log 21.0909 232 0.6707 0.4893 0.6707 0.8190
No log 21.2727 234 0.6188 0.4898 0.6188 0.7866
No log 21.4545 236 0.5778 0.4451 0.5778 0.7601
No log 21.6364 238 0.5327 0.4393 0.5327 0.7299
No log 21.8182 240 0.5690 0.4451 0.5690 0.7543
No log 22.0 242 0.6174 0.4350 0.6174 0.7857
No log 22.1818 244 0.5620 0.4513 0.5620 0.7497
No log 22.3636 246 0.5386 0.4234 0.5386 0.7339
No log 22.5455 248 0.5282 0.4493 0.5282 0.7268
No log 22.7273 250 0.5269 0.4543 0.5269 0.7259
No log 22.9091 252 0.5239 0.5150 0.5239 0.7238
No log 23.0909 254 0.5412 0.4854 0.5412 0.7357
No log 23.2727 256 0.5938 0.4966 0.5938 0.7706
No log 23.4545 258 0.6369 0.4521 0.6369 0.7981
No log 23.6364 260 0.6090 0.4666 0.6090 0.7804
No log 23.8182 262 0.5495 0.4845 0.5495 0.7413
No log 24.0 264 0.5046 0.5345 0.5046 0.7103
No log 24.1818 266 0.4963 0.4425 0.4963 0.7045
No log 24.3636 268 0.5034 0.4738 0.5034 0.7095
No log 24.5455 270 0.5289 0.5652 0.5289 0.7273
No log 24.7273 272 0.6190 0.5494 0.6190 0.7868
No log 24.9091 274 0.6226 0.5323 0.6226 0.7890
No log 25.0909 276 0.5492 0.5111 0.5492 0.7410
No log 25.2727 278 0.5185 0.5103 0.5185 0.7201
No log 25.4545 280 0.5153 0.5167 0.5153 0.7178
No log 25.6364 282 0.5245 0.5421 0.5245 0.7242
No log 25.8182 284 0.5543 0.5467 0.5543 0.7445
No log 26.0 286 0.5400 0.4858 0.5400 0.7348
No log 26.1818 288 0.5198 0.5419 0.5198 0.7210
No log 26.3636 290 0.5728 0.5278 0.5728 0.7568
No log 26.5455 292 0.6370 0.5422 0.6370 0.7981
No log 26.7273 294 0.6125 0.5382 0.6125 0.7826
No log 26.9091 296 0.5355 0.5388 0.5355 0.7318
No log 27.0909 298 0.4891 0.5872 0.4891 0.6994
No log 27.2727 300 0.4675 0.5831 0.4675 0.6838
No log 27.4545 302 0.4689 0.6286 0.4689 0.6848
No log 27.6364 304 0.4727 0.5784 0.4727 0.6875
No log 27.8182 306 0.4920 0.5813 0.4920 0.7014
No log 28.0 308 0.4882 0.5742 0.4882 0.6987
No log 28.1818 310 0.4817 0.6078 0.4817 0.6940
No log 28.3636 312 0.4856 0.5574 0.4856 0.6969
No log 28.5455 314 0.5069 0.5324 0.5069 0.7120
No log 28.7273 316 0.5126 0.5256 0.5126 0.7160
No log 28.9091 318 0.5211 0.5256 0.5211 0.7219
No log 29.0909 320 0.5069 0.5256 0.5069 0.7120
No log 29.2727 322 0.5067 0.5098 0.5067 0.7118
No log 29.4545 324 0.5232 0.4920 0.5232 0.7233
No log 29.6364 326 0.5219 0.4869 0.5219 0.7224
No log 29.8182 328 0.4911 0.5933 0.4911 0.7008
No log 30.0 330 0.4940 0.5868 0.4940 0.7028
No log 30.1818 332 0.5127 0.5217 0.5127 0.7160
No log 30.3636 334 0.4896 0.5592 0.4896 0.6997
No log 30.5455 336 0.4867 0.5996 0.4867 0.6976
No log 30.7273 338 0.5095 0.5321 0.5095 0.7138
No log 30.9091 340 0.5123 0.5081 0.5123 0.7158
No log 31.0909 342 0.5032 0.5228 0.5032 0.7093
No log 31.2727 344 0.5077 0.4795 0.5077 0.7125
No log 31.4545 346 0.4936 0.5228 0.4936 0.7026
No log 31.6364 348 0.4852 0.5472 0.4852 0.6966
No log 31.8182 350 0.4918 0.5567 0.4918 0.7013
No log 32.0 352 0.5015 0.5071 0.5015 0.7082
No log 32.1818 354 0.5089 0.5050 0.5089 0.7134
No log 32.3636 356 0.4957 0.5357 0.4957 0.7040
No log 32.5455 358 0.4916 0.5357 0.4916 0.7012
No log 32.7273 360 0.4929 0.5446 0.4929 0.7021
No log 32.9091 362 0.4898 0.5565 0.4898 0.6999
No log 33.0909 364 0.4919 0.6060 0.4919 0.7014
No log 33.2727 366 0.5091 0.5324 0.5091 0.7135
No log 33.4545 368 0.5054 0.5123 0.5054 0.7109
No log 33.6364 370 0.5140 0.5034 0.5140 0.7169
No log 33.8182 372 0.5210 0.5104 0.5210 0.7218
No log 34.0 374 0.5256 0.5237 0.5256 0.7250
No log 34.1818 376 0.5175 0.4945 0.5175 0.7194
No log 34.3636 378 0.5180 0.4945 0.5180 0.7198
No log 34.5455 380 0.4970 0.5214 0.4970 0.7050
No log 34.7273 382 0.4860 0.4788 0.4860 0.6972
No log 34.9091 384 0.4809 0.5446 0.4809 0.6935
No log 35.0909 386 0.4868 0.6158 0.4868 0.6977
No log 35.2727 388 0.4978 0.6173 0.4978 0.7056
No log 35.4545 390 0.5047 0.6051 0.5047 0.7104
No log 35.6364 392 0.5211 0.5724 0.5211 0.7219
No log 35.8182 394 0.5140 0.5898 0.5140 0.7169
No log 36.0 396 0.4917 0.5594 0.4917 0.7012
No log 36.1818 398 0.5019 0.6114 0.5019 0.7084
No log 36.3636 400 0.5284 0.6154 0.5284 0.7269
No log 36.5455 402 0.5166 0.5733 0.5166 0.7188
No log 36.7273 404 0.5008 0.5718 0.5008 0.7077
No log 36.9091 406 0.4825 0.6279 0.4825 0.6946
No log 37.0909 408 0.4847 0.6242 0.4847 0.6962
No log 37.2727 410 0.4828 0.6024 0.4828 0.6948
No log 37.4545 412 0.4781 0.5633 0.4781 0.6915
No log 37.6364 414 0.4910 0.5356 0.4910 0.7007
No log 37.8182 416 0.4862 0.5356 0.4862 0.6973
No log 38.0 418 0.4826 0.5600 0.4826 0.6947
No log 38.1818 420 0.4965 0.5612 0.4965 0.7046
No log 38.3636 422 0.5139 0.6271 0.5139 0.7169
No log 38.5455 424 0.5000 0.6210 0.5000 0.7071
No log 38.7273 426 0.5035 0.5110 0.5035 0.7096
No log 38.9091 428 0.5266 0.4832 0.5266 0.7257
No log 39.0909 430 0.5638 0.6074 0.5638 0.7509
No log 39.2727 432 0.5640 0.5706 0.5640 0.7510
No log 39.4545 434 0.5268 0.5250 0.5268 0.7258
No log 39.6364 436 0.5089 0.6295 0.5089 0.7134
No log 39.8182 438 0.5596 0.5222 0.5596 0.7480
No log 40.0 440 0.5889 0.5147 0.5889 0.7674
No log 40.1818 442 0.5907 0.4893 0.5907 0.7686
No log 40.3636 444 0.5751 0.4777 0.5751 0.7584
No log 40.5455 446 0.5258 0.4504 0.5258 0.7252
No log 40.7273 448 0.4931 0.6142 0.4931 0.7022
No log 40.9091 450 0.4911 0.5672 0.4911 0.7008
No log 41.0909 452 0.4938 0.6242 0.4938 0.7027
No log 41.2727 454 0.5129 0.6271 0.5129 0.7162
No log 41.4545 456 0.5397 0.5608 0.5397 0.7347
No log 41.6364 458 0.5288 0.5484 0.5288 0.7272
No log 41.8182 460 0.5096 0.5957 0.5096 0.7138
No log 42.0 462 0.5007 0.5852 0.5007 0.7076
No log 42.1818 464 0.5084 0.5271 0.5084 0.7130
No log 42.3636 466 0.5334 0.5291 0.5334 0.7303
No log 42.5455 468 0.5568 0.4916 0.5568 0.7462
No log 42.7273 470 0.5743 0.5278 0.5743 0.7578
No log 42.9091 472 0.5345 0.5293 0.5345 0.7311
No log 43.0909 474 0.4945 0.5517 0.4945 0.7032
No log 43.2727 476 0.4930 0.5703 0.4930 0.7021
No log 43.4545 478 0.5047 0.5517 0.5047 0.7104
No log 43.6364 480 0.5487 0.5858 0.5487 0.7408
No log 43.8182 482 0.6403 0.5080 0.6403 0.8002
No log 44.0 484 0.6659 0.4805 0.6659 0.8160
No log 44.1818 486 0.6203 0.5024 0.6203 0.7876
No log 44.3636 488 0.5926 0.4684 0.5926 0.7698
No log 44.5455 490 0.5524 0.4437 0.5524 0.7433
No log 44.7273 492 0.5258 0.4547 0.5258 0.7251
No log 44.9091 494 0.5097 0.4726 0.5097 0.7140
No log 45.0909 496 0.5056 0.5379 0.5056 0.7111
No log 45.2727 498 0.5136 0.5840 0.5136 0.7167
0.2666 45.4545 500 0.5357 0.5423 0.5357 0.7319
0.2666 45.6364 502 0.5383 0.5293 0.5383 0.7337
0.2666 45.8182 504 0.5259 0.5323 0.5259 0.7252
0.2666 46.0 506 0.5117 0.4747 0.5117 0.7153
0.2666 46.1818 508 0.5119 0.5170 0.5119 0.7155
0.2666 46.3636 510 0.5197 0.4747 0.5197 0.7209

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model weights are stored in Safetensors format (~0.1B parameters, F32 tensors).