ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k2_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the training dataset is not specified). It achieves the following results on the evaluation set:

  • Loss: 0.5018
  • Qwk (quadratic weighted kappa): 0.4429
  • Mse (mean squared error): 0.5018
  • Rmse (root mean squared error): 0.7083
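Note that Loss equals Mse here (the model is presumably trained as a regressor with an MSE objective), and Rmse is its square root (√0.5018 ≈ 0.7083). A minimal pure-Python sketch of the three evaluation metrics; the 0–4 score range and the gold/pred values are hypothetical examples, not from this model's data:

```python
import math

def mse(y_true, y_pred):
    """Mean squared error."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def qwk(y_true, y_pred, min_rating, max_rating):
    """Quadratic weighted kappa for integer ratings in [min_rating, max_rating]."""
    n = max_rating - min_rating + 1
    N = len(y_true)
    # Observed rating-pair counts.
    O = [[0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        O[t - min_rating][p - min_rating] += 1
    # Expected counts under chance agreement, from the marginal histograms.
    hist_t = [y_true.count(r + min_rating) for r in range(n)]
    hist_p = [y_pred.count(r + min_rating) for r in range(n)]
    E = [[hist_t[i] * hist_p[j] / N for j in range(n)] for i in range(n)]
    # Quadratic disagreement weights.
    w = [[(i - j) ** 2 / (n - 1) ** 2 for j in range(n)] for i in range(n)]
    num = sum(w[i][j] * O[i][j] for i in range(n) for j in range(n))
    den = sum(w[i][j] * E[i][j] for i in range(n) for j in range(n))
    return 1.0 - num / den

# Hypothetical integer organization scores on a 0-4 scale.
gold = [3, 2, 4, 1, 3, 2]
pred = [3, 3, 4, 2, 2, 2]
print(round(mse(gold, pred), 4), round(math.sqrt(mse(gold, pred)), 4))  # -> 0.5 0.7071
print(round(qwk(gold, pred, 0, 4), 4))  # -> 0.6667
```

QWK rewards predictions close to the gold score (errors are penalized quadratically with distance), which is why it is the standard headline metric for essay-scoring tasks alongside MSE/RMSE.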

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
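With lr_scheduler_type: linear and no warmup reported, the learning rate presumably decays linearly from 2e-05 toward 0 over the total number of training steps (in transformers this is implemented by get_linear_schedule_with_warmup). A small illustrative sketch of that schedule; the step counts are assumptions for illustration:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: ramp up over warmup_steps, then decay linearly to 0."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

# Illustrative: 10 optimizer steps per epoch (as in the results table) x 100 epochs.
total = 1000
print(linear_lr(0, total))     # -> 2e-05
print(linear_lr(500, total))   # -> 1e-05
print(linear_lr(1000, total))  # -> 0.0
```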

Training results

("No log" in the Training Loss column means the training loss had not yet been reported at that step; it is logged every 500 steps.)

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.2 2 2.5873 -0.0788 2.5873 1.6085
No log 0.4 4 1.1662 0.1765 1.1662 1.0799
No log 0.6 6 0.6673 0.0893 0.6673 0.8169
No log 0.8 8 0.6290 0.2783 0.6290 0.7931
No log 1.0 10 0.6110 0.3341 0.6110 0.7816
No log 1.2 12 0.5956 0.3123 0.5956 0.7718
No log 1.4 14 0.6398 0.4235 0.6398 0.7999
No log 1.6 16 1.2488 0.1453 1.2488 1.1175
No log 1.8 18 1.3716 0.0796 1.3716 1.1711
No log 2.0 20 1.1302 0.2421 1.1302 1.0631
No log 2.2 22 1.1222 0.2184 1.1222 1.0594
No log 2.4 24 0.6646 0.3931 0.6646 0.8152
No log 2.6 26 0.5079 0.4504 0.5079 0.7127
No log 2.8 28 0.5220 0.4808 0.5220 0.7225
No log 3.0 30 0.5514 0.4151 0.5514 0.7426
No log 3.2 32 0.8094 0.3556 0.8094 0.8997
No log 3.4 34 0.9014 0.3355 0.9014 0.9494
No log 3.6 36 0.6893 0.4204 0.6893 0.8303
No log 3.8 38 0.5642 0.4001 0.5642 0.7511
No log 4.0 40 0.8373 0.4308 0.8373 0.9150
No log 4.2 42 0.9923 0.2552 0.9923 0.9961
No log 4.4 44 0.7889 0.4940 0.7889 0.8882
No log 4.6 46 0.5835 0.2852 0.5835 0.7638
No log 4.8 48 0.7004 0.3563 0.7004 0.8369
No log 5.0 50 0.7952 0.3805 0.7952 0.8918
No log 5.2 52 0.7010 0.4548 0.7010 0.8373
No log 5.4 54 0.5453 0.3029 0.5453 0.7384
No log 5.6 56 0.5406 0.4639 0.5406 0.7353
No log 5.8 58 0.5303 0.5362 0.5303 0.7282
No log 6.0 60 0.5579 0.4581 0.5579 0.7469
No log 6.2 62 0.7107 0.4396 0.7107 0.8430
No log 6.4 64 0.6250 0.4972 0.6250 0.7905
No log 6.6 66 0.5099 0.5756 0.5099 0.7141
No log 6.8 68 0.4913 0.4910 0.4913 0.7010
No log 7.0 70 0.4759 0.4516 0.4759 0.6898
No log 7.2 72 0.4515 0.5235 0.4515 0.6719
No log 7.4 74 0.4941 0.4205 0.4941 0.7029
No log 7.6 76 0.5075 0.3831 0.5075 0.7124
No log 7.8 78 0.6096 0.4842 0.6096 0.7807
No log 8.0 80 0.7088 0.4538 0.7088 0.8419
No log 8.2 82 0.6835 0.4538 0.6835 0.8267
No log 8.4 84 0.4910 0.3654 0.4910 0.7007
No log 8.6 86 0.5421 0.5570 0.5421 0.7363
No log 8.8 88 0.5556 0.5190 0.5556 0.7454
No log 9.0 90 0.4777 0.5965 0.4777 0.6912
No log 9.2 92 0.5367 0.3590 0.5367 0.7326
No log 9.4 94 0.6702 0.4593 0.6702 0.8186
No log 9.6 96 0.6521 0.4593 0.6521 0.8075
No log 9.8 98 0.5422 0.4412 0.5422 0.7363
No log 10.0 100 0.4715 0.4847 0.4715 0.6866
No log 10.2 102 0.4738 0.5428 0.4738 0.6883
No log 10.4 104 0.5192 0.4980 0.5192 0.7206
No log 10.6 106 0.7217 0.4413 0.7217 0.8495
No log 10.8 108 0.7016 0.4204 0.7016 0.8376
No log 11.0 110 0.5037 0.5634 0.5037 0.7097
No log 11.2 112 0.5062 0.5617 0.5062 0.7115
No log 11.4 114 0.5505 0.5774 0.5505 0.7419
No log 11.6 116 0.4919 0.5923 0.4919 0.7013
No log 11.8 118 0.6045 0.5184 0.6045 0.7775
No log 12.0 120 0.7725 0.3579 0.7725 0.8789
No log 12.2 122 0.6611 0.4024 0.6611 0.8131
No log 12.4 124 0.5000 0.4681 0.5000 0.7071
No log 12.6 126 0.5128 0.5485 0.5128 0.7161
No log 12.8 128 0.5007 0.5817 0.5007 0.7076
No log 13.0 130 0.4857 0.5003 0.4857 0.6969
No log 13.2 132 0.5593 0.5033 0.5593 0.7479
No log 13.4 134 0.5392 0.4412 0.5392 0.7343
No log 13.6 136 0.4766 0.4082 0.4766 0.6904
No log 13.8 138 0.4741 0.4866 0.4741 0.6885
No log 14.0 140 0.4820 0.5533 0.4820 0.6942
No log 14.2 142 0.5170 0.4715 0.5170 0.7190
No log 14.4 144 0.5364 0.4656 0.5364 0.7324
No log 14.6 146 0.5180 0.6341 0.5180 0.7197
No log 14.8 148 0.5588 0.5307 0.5588 0.7475
No log 15.0 150 0.5288 0.5786 0.5288 0.7272
No log 15.2 152 0.5494 0.4771 0.5494 0.7412
No log 15.4 154 0.6201 0.4602 0.6201 0.7875
No log 15.6 156 0.6106 0.4283 0.6106 0.7814
No log 15.8 158 0.5374 0.4581 0.5374 0.7331
No log 16.0 160 0.5642 0.5601 0.5642 0.7511
No log 16.2 162 0.5769 0.5497 0.5769 0.7595
No log 16.4 164 0.5371 0.4901 0.5371 0.7329
No log 16.6 166 0.5413 0.5037 0.5413 0.7357
No log 16.8 168 0.5664 0.5303 0.5664 0.7526
No log 17.0 170 0.5663 0.5728 0.5663 0.7525
No log 17.2 172 0.5228 0.5135 0.5228 0.7231
No log 17.4 174 0.4857 0.5222 0.4857 0.6969
No log 17.6 176 0.4834 0.6105 0.4834 0.6953
No log 17.8 178 0.4788 0.4681 0.4788 0.6920
No log 18.0 180 0.4922 0.4448 0.4922 0.7016
No log 18.2 182 0.5764 0.4761 0.5764 0.7592
No log 18.4 184 0.5999 0.4151 0.5999 0.7746
No log 18.6 186 0.4980 0.4092 0.4980 0.7057
No log 18.8 188 0.5101 0.5698 0.5101 0.7142
No log 19.0 190 0.5810 0.4979 0.5810 0.7622
No log 19.2 192 0.5292 0.5017 0.5292 0.7275
No log 19.4 194 0.4917 0.4211 0.4917 0.7012
No log 19.6 196 0.5168 0.3911 0.5168 0.7189
No log 19.8 198 0.4895 0.4092 0.4895 0.6996
No log 20.0 200 0.4960 0.5817 0.4960 0.7043
No log 20.2 202 0.5381 0.5291 0.5381 0.7336
No log 20.4 204 0.5164 0.5677 0.5164 0.7186
No log 20.6 206 0.5211 0.3661 0.5211 0.7219
No log 20.8 208 0.5349 0.4211 0.5349 0.7314
No log 21.0 210 0.5275 0.3608 0.5275 0.7263
No log 21.2 212 0.5225 0.3970 0.5225 0.7229
No log 21.4 214 0.5189 0.3608 0.5189 0.7204
No log 21.6 216 0.5456 0.4092 0.5456 0.7387
No log 21.8 218 0.5411 0.3860 0.5411 0.7356
No log 22.0 220 0.5191 0.3916 0.5191 0.7205
No log 22.2 222 0.5034 0.3834 0.5034 0.7095
No log 22.4 224 0.4992 0.3860 0.4992 0.7065
No log 22.6 226 0.5338 0.4589 0.5338 0.7306
No log 22.8 228 0.5290 0.5438 0.5290 0.7274
No log 23.0 230 0.4973 0.5118 0.4973 0.7052
No log 23.2 232 0.4950 0.5288 0.4950 0.7035
No log 23.4 234 0.4928 0.5084 0.4928 0.7020
No log 23.6 236 0.4915 0.4973 0.4915 0.7011
No log 23.8 238 0.4851 0.4150 0.4851 0.6965
No log 24.0 240 0.4821 0.3974 0.4821 0.6943
No log 24.2 242 0.4800 0.4314 0.4800 0.6928
No log 24.4 244 0.4756 0.4361 0.4756 0.6896
No log 24.6 246 0.4721 0.4345 0.4721 0.6871
No log 24.8 248 0.4876 0.4888 0.4876 0.6983
No log 25.0 250 0.5234 0.5263 0.5234 0.7235
No log 25.2 252 0.5000 0.5881 0.5000 0.7071
No log 25.4 254 0.5071 0.5881 0.5071 0.7121
No log 25.6 256 0.5290 0.5334 0.5290 0.7273
No log 25.8 258 0.5033 0.4589 0.5033 0.7094
No log 26.0 260 0.4874 0.4590 0.4874 0.6981
No log 26.2 262 0.5104 0.4828 0.5104 0.7144
No log 26.4 264 0.5237 0.5373 0.5237 0.7237
No log 26.6 266 0.5112 0.5242 0.5112 0.7150
No log 26.8 268 0.4978 0.4776 0.4978 0.7055
No log 27.0 270 0.5445 0.5031 0.5445 0.7379
No log 27.2 272 0.5377 0.5233 0.5377 0.7333
No log 27.4 274 0.5005 0.4802 0.5005 0.7075
No log 27.6 276 0.4868 0.4972 0.4868 0.6977
No log 27.8 278 0.5018 0.4661 0.5018 0.7084
No log 28.0 280 0.5098 0.3865 0.5098 0.7140
No log 28.2 282 0.4969 0.4224 0.4969 0.7049
No log 28.4 284 0.4993 0.4339 0.4993 0.7066
No log 28.6 286 0.5274 0.4740 0.5274 0.7262
No log 28.8 288 0.5217 0.4653 0.5217 0.7223
No log 29.0 290 0.4985 0.4958 0.4985 0.7060
No log 29.2 292 0.4965 0.5753 0.4965 0.7046
No log 29.4 294 0.5079 0.5682 0.5079 0.7127
No log 29.6 296 0.4954 0.5548 0.4954 0.7038
No log 29.8 298 0.5206 0.5237 0.5206 0.7215
No log 30.0 300 0.5999 0.5347 0.5999 0.7746
No log 30.2 302 0.6315 0.4818 0.6315 0.7946
No log 30.4 304 0.5810 0.4770 0.5810 0.7623
No log 30.6 306 0.5028 0.4740 0.5028 0.7091
No log 30.8 308 0.4663 0.4817 0.4663 0.6828
No log 31.0 310 0.4839 0.6135 0.4839 0.6956
No log 31.2 312 0.4920 0.6135 0.4920 0.7014
No log 31.4 314 0.4832 0.5872 0.4832 0.6951
No log 31.6 316 0.4697 0.5030 0.4697 0.6854
No log 31.8 318 0.5162 0.4411 0.5162 0.7184
No log 32.0 320 0.5429 0.4489 0.5429 0.7368
No log 32.2 322 0.5280 0.4489 0.5280 0.7266
No log 32.4 324 0.5162 0.4589 0.5162 0.7184
No log 32.6 326 0.4917 0.4888 0.4917 0.7012
No log 32.8 328 0.4954 0.4561 0.4954 0.7039
No log 33.0 330 0.4972 0.4561 0.4972 0.7051
No log 33.2 332 0.4959 0.4367 0.4959 0.7042
No log 33.4 334 0.4942 0.4888 0.4942 0.7030
No log 33.6 336 0.4935 0.4829 0.4935 0.7025
No log 33.8 338 0.4951 0.4878 0.4951 0.7036
No log 34.0 340 0.5025 0.4147 0.5025 0.7088
No log 34.2 342 0.4994 0.4729 0.4994 0.7067
No log 34.4 344 0.4970 0.4367 0.4970 0.7050
No log 34.6 346 0.5087 0.4389 0.5087 0.7132
No log 34.8 348 0.5145 0.4653 0.5145 0.7173
No log 35.0 350 0.5022 0.4653 0.5022 0.7087
No log 35.2 352 0.4883 0.4505 0.4883 0.6988
No log 35.4 354 0.4872 0.4505 0.4872 0.6980
No log 35.6 356 0.4949 0.4632 0.4949 0.7035
No log 35.8 358 0.4965 0.4367 0.4965 0.7046
No log 36.0 360 0.4939 0.4186 0.4939 0.7028
No log 36.2 362 0.5116 0.5329 0.5116 0.7153
No log 36.4 364 0.5507 0.4774 0.5507 0.7421
No log 36.6 366 0.5420 0.4774 0.5420 0.7362
No log 36.8 368 0.5107 0.4825 0.5107 0.7146
No log 37.0 370 0.5084 0.4150 0.5084 0.7130
No log 37.2 372 0.5869 0.3948 0.5869 0.7661
No log 37.4 374 0.6626 0.4308 0.6626 0.8140
No log 37.6 376 0.6486 0.4308 0.6486 0.8054
No log 37.8 378 0.5496 0.5161 0.5496 0.7413
No log 38.0 380 0.5008 0.4821 0.5008 0.7076
No log 38.2 382 0.4954 0.5118 0.4954 0.7039
No log 38.4 384 0.5030 0.5020 0.5030 0.7092
No log 38.6 386 0.5075 0.5079 0.5075 0.7124
No log 38.8 388 0.5229 0.4821 0.5229 0.7231
No log 39.0 390 0.5358 0.4997 0.5358 0.7320
No log 39.2 392 0.5291 0.4858 0.5291 0.7274
No log 39.4 394 0.5270 0.4380 0.5270 0.7259
No log 39.6 396 0.5411 0.4147 0.5411 0.7356
No log 39.8 398 0.5489 0.4356 0.5489 0.7409
No log 40.0 400 0.5647 0.5239 0.5647 0.7515
No log 40.2 402 0.5737 0.5726 0.5737 0.7574
No log 40.4 404 0.5564 0.4740 0.5564 0.7459
No log 40.6 406 0.5247 0.5304 0.5247 0.7243
No log 40.8 408 0.5259 0.4821 0.5259 0.7252
No log 41.0 410 0.5388 0.4931 0.5388 0.7340
No log 41.2 412 0.5468 0.4931 0.5468 0.7395
No log 41.4 414 0.5350 0.4931 0.5350 0.7314
No log 41.6 416 0.4997 0.4737 0.4997 0.7069
No log 41.8 418 0.4953 0.5246 0.4953 0.7038
No log 42.0 420 0.4975 0.5085 0.4975 0.7053
No log 42.2 422 0.5022 0.4888 0.5022 0.7087
No log 42.4 424 0.5167 0.4821 0.5167 0.7188
No log 42.6 426 0.5289 0.4694 0.5289 0.7273
No log 42.8 428 0.5066 0.4737 0.5066 0.7118
No log 43.0 430 0.5027 0.5413 0.5027 0.7090
No log 43.2 432 0.5155 0.4601 0.5155 0.7180
No log 43.4 434 0.5127 0.4837 0.5127 0.7160
No log 43.6 436 0.5095 0.4795 0.5095 0.7138
No log 43.8 438 0.5256 0.4958 0.5256 0.7250
No log 44.0 440 0.5374 0.4528 0.5374 0.7331
No log 44.2 442 0.5265 0.4821 0.5265 0.7256
No log 44.4 444 0.5096 0.4869 0.5096 0.7138
No log 44.6 446 0.5029 0.5022 0.5029 0.7092
No log 44.8 448 0.5099 0.4763 0.5099 0.7141
No log 45.0 450 0.5313 0.5617 0.5313 0.7289
No log 45.2 452 0.5331 0.5617 0.5331 0.7301
No log 45.4 454 0.5262 0.4514 0.5262 0.7254
No log 45.6 456 0.5182 0.4423 0.5182 0.7199
No log 45.8 458 0.5145 0.4386 0.5145 0.7173
No log 46.0 460 0.5095 0.4386 0.5095 0.7138
No log 46.2 462 0.5060 0.4478 0.5060 0.7113
No log 46.4 464 0.5029 0.4858 0.5029 0.7091
No log 46.6 466 0.5019 0.4809 0.5019 0.7085
No log 46.8 468 0.4986 0.4991 0.4986 0.7061
No log 47.0 470 0.4980 0.4858 0.4980 0.7057
No log 47.2 472 0.4997 0.4928 0.4997 0.7069
No log 47.4 474 0.5028 0.4928 0.5028 0.7091
No log 47.6 476 0.5098 0.5178 0.5098 0.7140
No log 47.8 478 0.5060 0.5092 0.5060 0.7113
No log 48.0 480 0.4964 0.4973 0.4964 0.7046
No log 48.2 482 0.4959 0.5252 0.4959 0.7042
No log 48.4 484 0.4932 0.4972 0.4932 0.7023
No log 48.6 486 0.4924 0.4742 0.4924 0.7017
No log 48.8 488 0.4929 0.4742 0.4929 0.7021
No log 49.0 490 0.4955 0.4547 0.4955 0.7039
No log 49.2 492 0.5020 0.4914 0.5020 0.7085
No log 49.4 494 0.5029 0.4514 0.5029 0.7092
No log 49.6 496 0.5021 0.4701 0.5021 0.7086
No log 49.8 498 0.5050 0.5010 0.5050 0.7106
0.2327 50.0 500 0.5148 0.4632 0.5148 0.7175
0.2327 50.2 502 0.5174 0.4632 0.5174 0.7193
0.2327 50.4 504 0.5087 0.4632 0.5087 0.7133
0.2327 50.6 506 0.4983 0.4942 0.4983 0.7059
0.2327 50.8 508 0.4970 0.4801 0.4970 0.7050
0.2327 51.0 510 0.4983 0.4801 0.4983 0.7059
0.2327 51.2 512 0.4956 0.4801 0.4956 0.7040
0.2327 51.4 514 0.4961 0.5079 0.4961 0.7043
0.2327 51.6 516 0.5090 0.4632 0.5090 0.7134
0.2327 51.8 518 0.5242 0.4389 0.5242 0.7240
0.2327 52.0 520 0.5346 0.4329 0.5346 0.7312
0.2327 52.2 522 0.5204 0.4389 0.5204 0.7214
0.2327 52.4 524 0.5018 0.4429 0.5018 0.7083

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

0.1B params (Safetensors, F32 tensors)

Model tree for MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k2_task7_organization

Finetuned from aubmindlab/bert-base-arabertv02