ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k3_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch for reproducing these metrics is shown after the list):

  • Loss: 0.4478
  • Qwk: 0.5475
  • Mse: 0.4478
  • Rmse: 0.6692
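
The loss here is plain MSE on the predicted score, and Qwk is quadratic weighted kappa computed on discretized predictions. Below is a minimal evaluation sketch, assuming the checkpoint is a single-output regression head and the gold labels are integer organization scores (neither is documented in this card); the essay texts and labels are placeholders to be replaced with the real evaluation split.

```python
# Minimal evaluation sketch (assumptions: regression head with one output,
# integer gold labels; the actual preprocessing used in training is undocumented).
import numpy as np
import torch
from sklearn.metrics import cohen_kappa_score, mean_squared_error
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k3_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

def predict(texts):
    """Return raw regression scores for a list of essays."""
    enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        logits = model(**enc).logits
    return logits.squeeze(-1).numpy()

# Placeholder evaluation data -- substitute the real evaluation split.
texts = ["...essay one...", "...essay two..."]
gold = np.array([2, 3])

preds = predict(texts)
mse = mean_squared_error(gold, preds)
rmse = np.sqrt(mse)
# QWK needs discrete labels, so predictions are rounded to the nearest score.
qwk = cohen_kappa_score(gold, np.rint(preds).astype(int), weights="quadratic")
print(f"QWK={qwk:.4f}  MSE={mse:.4f}  RMSE={rmse:.4f}")
```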

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of an equivalent TrainingArguments configuration follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
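
For reference, the list above maps onto a Trainer setup roughly like the sketch below. The dataset objects, preprocessing, and the Qwk/Mse/Rmse metric callback are not documented in this card, so they appear only as placeholders; num_labels=1 (regression) is an assumption inferred from the MSE-based loss, and the evaluation/logging intervals are inferred from the results table.

```python
# Sketch of a Trainer configuration matching the listed hyperparameters.
# Placeholders: output_dir, train_dataset, eval_dataset, compute_metrics.
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

base_model = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base_model)
# num_labels=1 (regression) is an assumption inferred from the MSE loss.
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=1)

args = TrainingArguments(
    output_dir="arabert-task7-organization",  # placeholder name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,          # the log above stops near epoch 44, reason undocumented
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",         # the results table evaluates every 2 steps
    eval_steps=2,
    logging_steps=500,             # matches the "No log" entries before step 500
)

# Placeholders: the actual splits and metric function are not documented here.
train_dataset = eval_dataset = None

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    tokenizer=tokenizer,
    # compute_metrics would report Qwk/Mse/Rmse; omitted in this sketch.
)
# trainer.train()
```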

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1667 2 2.5062 -0.0109 2.5062 1.5831
No log 0.3333 4 1.2706 0.1248 1.2706 1.1272
No log 0.5 6 0.8532 -0.0288 0.8532 0.9237
No log 0.6667 8 0.7330 -0.0051 0.7330 0.8561
No log 0.8333 10 0.6922 0.1752 0.6922 0.8320
No log 1.0 12 0.6197 0.2641 0.6197 0.7872
No log 1.1667 14 0.6442 0.2095 0.6442 0.8026
No log 1.3333 16 0.7805 0.2736 0.7805 0.8835
No log 1.5 18 0.7643 0.2435 0.7643 0.8742
No log 1.6667 20 0.8158 0.2736 0.8158 0.9032
No log 1.8333 22 0.8400 0.2736 0.8400 0.9165
No log 2.0 24 0.6630 0.0937 0.6630 0.8142
No log 2.1667 26 0.6831 0.1674 0.6831 0.8265
No log 2.3333 28 0.6781 0.1718 0.6781 0.8235
No log 2.5 30 0.6838 0.2128 0.6838 0.8269
No log 2.6667 32 0.8624 0.2381 0.8624 0.9287
No log 2.8333 34 0.9702 0.3019 0.9702 0.9850
No log 3.0 36 1.0128 0.2651 1.0128 1.0064
No log 3.1667 38 0.8561 0.2844 0.8561 0.9253
No log 3.3333 40 0.6658 0.2916 0.6658 0.8160
No log 3.5 42 0.6646 0.3552 0.6646 0.8152
No log 3.6667 44 0.6946 0.3166 0.6946 0.8335
No log 3.8333 46 0.7152 0.2608 0.7152 0.8457
No log 4.0 48 0.7627 0.3286 0.7627 0.8733
No log 4.1667 50 0.7185 0.3563 0.7185 0.8476
No log 4.3333 52 0.7132 0.4761 0.7132 0.8445
No log 4.5 54 0.6285 0.3640 0.6285 0.7928
No log 4.6667 56 0.5976 0.3916 0.5976 0.7730
No log 4.8333 58 0.5648 0.4356 0.5648 0.7515
No log 5.0 60 0.6746 0.5200 0.6746 0.8213
No log 5.1667 62 0.6422 0.5008 0.6422 0.8014
No log 5.3333 64 0.4932 0.5501 0.4932 0.7023
No log 5.5 66 0.7066 0.4709 0.7066 0.8406
No log 5.6667 68 1.1071 0.2637 1.1071 1.0522
No log 5.8333 70 1.1383 0.2637 1.1383 1.0669
No log 6.0 72 0.8367 0.3638 0.8367 0.9147
No log 6.1667 74 0.5242 0.5544 0.5242 0.7240
No log 6.3333 76 0.4953 0.5517 0.4953 0.7037
No log 6.5 78 0.5019 0.5021 0.5019 0.7084
No log 6.6667 80 0.5960 0.5016 0.5960 0.7720
No log 6.8333 82 0.8000 0.4364 0.8000 0.8944
No log 7.0 84 0.8855 0.3353 0.8855 0.9410
No log 7.1667 86 0.7650 0.4426 0.7650 0.8747
No log 7.3333 88 0.5583 0.5234 0.5583 0.7472
No log 7.5 90 0.5546 0.5770 0.5546 0.7447
No log 7.6667 92 0.5320 0.6065 0.5320 0.7294
No log 7.8333 94 0.5793 0.5429 0.5793 0.7611
No log 8.0 96 0.6374 0.4859 0.6374 0.7983
No log 8.1667 98 0.7032 0.4686 0.7032 0.8386
No log 8.3333 100 0.6072 0.4410 0.6072 0.7793
No log 8.5 102 0.5838 0.5150 0.5838 0.7641
No log 8.6667 104 0.7113 0.4836 0.7113 0.8434
No log 8.8333 106 0.5837 0.5418 0.5837 0.7640
No log 9.0 108 0.5317 0.5110 0.5317 0.7292
No log 9.1667 110 0.5472 0.4935 0.5472 0.7397
No log 9.3333 112 0.5427 0.5288 0.5427 0.7367
No log 9.5 114 0.5305 0.5288 0.5305 0.7284
No log 9.6667 116 0.5127 0.5533 0.5127 0.7160
No log 9.8333 118 0.4959 0.5252 0.4959 0.7042
No log 10.0 120 0.4787 0.5222 0.4787 0.6919
No log 10.1667 122 0.4725 0.6326 0.4725 0.6874
No log 10.3333 124 0.5720 0.6129 0.5720 0.7563
No log 10.5 126 0.5420 0.6144 0.5420 0.7362
No log 10.6667 128 0.4772 0.6018 0.4772 0.6908
No log 10.8333 130 0.4753 0.5796 0.4753 0.6894
No log 11.0 132 0.4776 0.5979 0.4776 0.6911
No log 11.1667 134 0.6017 0.5990 0.6017 0.7757
No log 11.3333 136 0.7413 0.5223 0.7413 0.8610
No log 11.5 138 0.7207 0.5281 0.7207 0.8489
No log 11.6667 140 0.5410 0.5953 0.5410 0.7356
No log 11.8333 142 0.5243 0.6100 0.5243 0.7241
No log 12.0 144 0.6501 0.5238 0.6501 0.8063
No log 12.1667 146 0.7181 0.4462 0.7181 0.8474
No log 12.3333 148 0.7740 0.4134 0.7740 0.8798
No log 12.5 150 0.5907 0.5048 0.5907 0.7686
No log 12.6667 152 0.5157 0.5356 0.5157 0.7181
No log 12.8333 154 0.4732 0.5640 0.4732 0.6879
No log 13.0 156 0.4748 0.6073 0.4748 0.6890
No log 13.1667 158 0.4715 0.6073 0.4715 0.6866
No log 13.3333 160 0.5619 0.5635 0.5619 0.7496
No log 13.5 162 0.6669 0.4959 0.6669 0.8166
No log 13.6667 164 0.5570 0.5635 0.5570 0.7463
No log 13.8333 166 0.4719 0.5970 0.4719 0.6869
No log 14.0 168 0.5066 0.6141 0.5066 0.7118
No log 14.1667 170 0.5737 0.5629 0.5737 0.7574
No log 14.3333 172 0.5403 0.5778 0.5403 0.7350
No log 14.5 174 0.5036 0.6342 0.5036 0.7096
No log 14.6667 176 0.4662 0.6694 0.4662 0.6828
No log 14.8333 178 0.4980 0.6141 0.4980 0.7057
No log 15.0 180 0.6020 0.5232 0.6020 0.7759
No log 15.1667 182 0.5539 0.5778 0.5539 0.7443
No log 15.3333 184 0.5149 0.5619 0.5149 0.7176
No log 15.5 186 0.4672 0.5915 0.4672 0.6835
No log 15.6667 188 0.4976 0.6130 0.4976 0.7054
No log 15.8333 190 0.5316 0.5919 0.5316 0.7291
No log 16.0 192 0.5238 0.6351 0.5238 0.7237
No log 16.1667 194 0.5460 0.6135 0.5460 0.7389
No log 16.3333 196 0.5171 0.6292 0.5171 0.7191
No log 16.5 198 0.5536 0.5558 0.5536 0.7441
No log 16.6667 200 0.5147 0.5415 0.5147 0.7174
No log 16.8333 202 0.4609 0.6414 0.4609 0.6789
No log 17.0 204 0.5286 0.6325 0.5286 0.7271
No log 17.1667 206 0.5134 0.6252 0.5134 0.7165
No log 17.3333 208 0.5481 0.6427 0.5481 0.7403
No log 17.5 210 0.6277 0.5373 0.6277 0.7923
No log 17.6667 212 0.5618 0.5437 0.5618 0.7496
No log 17.8333 214 0.4550 0.5763 0.4550 0.6746
No log 18.0 216 0.4653 0.5455 0.4653 0.6821
No log 18.1667 218 0.5272 0.4836 0.5272 0.7261
No log 18.3333 220 0.5071 0.4354 0.5071 0.7121
No log 18.5 222 0.4742 0.5306 0.4742 0.6886
No log 18.6667 224 0.5216 0.5528 0.5216 0.7222
No log 18.8333 226 0.5340 0.5528 0.5340 0.7307
No log 19.0 228 0.5064 0.5854 0.5064 0.7116
No log 19.1667 230 0.4839 0.5812 0.4839 0.6956
No log 19.3333 232 0.4843 0.5769 0.4843 0.6959
No log 19.5 234 0.4962 0.6160 0.4962 0.7044
No log 19.6667 236 0.5581 0.6303 0.5581 0.7471
No log 19.8333 238 0.5903 0.5450 0.5903 0.7683
No log 20.0 240 0.5734 0.5146 0.5734 0.7572
No log 20.1667 242 0.4840 0.5748 0.4840 0.6957
No log 20.3333 244 0.4602 0.5763 0.4602 0.6783
No log 20.5 246 0.4582 0.5748 0.4582 0.6769
No log 20.6667 248 0.4633 0.5619 0.4633 0.6807
No log 20.8333 250 0.5420 0.6100 0.5420 0.7362
No log 21.0 252 0.5786 0.5558 0.5786 0.7606
No log 21.1667 254 0.5596 0.5678 0.5596 0.7480
No log 21.3333 256 0.5954 0.5609 0.5954 0.7716
No log 21.5 258 0.5168 0.5808 0.5168 0.7189
No log 21.6667 260 0.4362 0.6672 0.4362 0.6605
No log 21.8333 262 0.4440 0.5846 0.4440 0.6664
No log 22.0 264 0.5177 0.5561 0.5177 0.7195
No log 22.1667 266 0.6693 0.4770 0.6693 0.8181
No log 22.3333 268 0.6401 0.4770 0.6401 0.8000
No log 22.5 270 0.5187 0.4761 0.5187 0.7202
No log 22.6667 272 0.4477 0.5267 0.4477 0.6691
No log 22.8333 274 0.4575 0.6289 0.4575 0.6764
No log 23.0 276 0.4239 0.6818 0.4239 0.6511
No log 23.1667 278 0.4446 0.6609 0.4446 0.6668
No log 23.3333 280 0.4706 0.6349 0.4706 0.6860
No log 23.5 282 0.4377 0.7031 0.4377 0.6616
No log 23.6667 284 0.4535 0.6235 0.4535 0.6734
No log 23.8333 286 0.4879 0.6138 0.4879 0.6985
No log 24.0 288 0.5383 0.5212 0.5383 0.7337
No log 24.1667 290 0.5550 0.5212 0.5550 0.7450
No log 24.3333 292 0.4820 0.5779 0.4820 0.6942
No log 24.5 294 0.4358 0.6142 0.4358 0.6602
No log 24.6667 296 0.4357 0.6346 0.4357 0.6601
No log 24.8333 298 0.4641 0.6127 0.4641 0.6813
No log 25.0 300 0.5872 0.5216 0.5872 0.7663
No log 25.1667 302 0.6752 0.5108 0.6752 0.8217
No log 25.3333 304 0.6219 0.5169 0.6219 0.7886
No log 25.5 306 0.5340 0.5702 0.5340 0.7308
No log 25.6667 308 0.4635 0.6235 0.4635 0.6808
No log 25.8333 310 0.4557 0.6187 0.4557 0.6751
No log 26.0 312 0.4619 0.6003 0.4619 0.6797
No log 26.1667 314 0.4469 0.6645 0.4469 0.6685
No log 26.3333 316 0.4902 0.6249 0.4902 0.7001
No log 26.5 318 0.6611 0.5295 0.6611 0.8131
No log 26.6667 320 0.7328 0.5108 0.7328 0.8560
No log 26.8333 322 0.6776 0.5295 0.6776 0.8232
No log 27.0 324 0.5601 0.5822 0.5601 0.7484
No log 27.1667 326 0.5138 0.6087 0.5138 0.7168
No log 27.3333 328 0.5003 0.5961 0.5003 0.7073
No log 27.5 330 0.4499 0.5747 0.4499 0.6708
No log 27.6667 332 0.4455 0.5493 0.4455 0.6675
No log 27.8333 334 0.4582 0.5831 0.4582 0.6769
No log 28.0 336 0.4839 0.6110 0.4839 0.6957
No log 28.1667 338 0.4772 0.5892 0.4772 0.6908
No log 28.3333 340 0.4644 0.5831 0.4644 0.6815
No log 28.5 342 0.4800 0.6110 0.4800 0.6928
No log 28.6667 344 0.5106 0.6249 0.5106 0.7146
No log 28.8333 346 0.5083 0.6167 0.5083 0.7130
No log 29.0 348 0.5034 0.6249 0.5034 0.7095
No log 29.1667 350 0.5318 0.6010 0.5318 0.7293
No log 29.3333 352 0.5651 0.5280 0.5651 0.7518
No log 29.5 354 0.6105 0.4943 0.6105 0.7813
No log 29.6667 356 0.6577 0.4921 0.6577 0.8110
No log 29.8333 358 0.6529 0.4921 0.6529 0.8081
No log 30.0 360 0.5587 0.6009 0.5587 0.7475
No log 30.1667 362 0.4455 0.6505 0.4455 0.6674
No log 30.3333 364 0.4216 0.7060 0.4216 0.6493
No log 30.5 366 0.4233 0.6495 0.4233 0.6506
No log 30.6667 368 0.4537 0.6223 0.4537 0.6736
No log 30.8333 370 0.5072 0.6347 0.5072 0.7122
No log 31.0 372 0.5781 0.5763 0.5781 0.7603
No log 31.1667 374 0.6120 0.5093 0.6120 0.7823
No log 31.3333 376 0.5589 0.5702 0.5589 0.7476
No log 31.5 378 0.5074 0.6354 0.5074 0.7123
No log 31.6667 380 0.4796 0.6349 0.4796 0.6926
No log 31.8333 382 0.4864 0.6431 0.4864 0.6974
No log 32.0 384 0.4986 0.5748 0.4986 0.7061
No log 32.1667 386 0.5029 0.5635 0.5029 0.7092
No log 32.3333 388 0.5375 0.5778 0.5375 0.7331
No log 32.5 390 0.5694 0.5792 0.5694 0.7546
No log 32.6667 392 0.5232 0.5560 0.5232 0.7233
No log 32.8333 394 0.4894 0.5635 0.4894 0.6996
No log 33.0 396 0.4849 0.5635 0.4849 0.6964
No log 33.1667 398 0.4982 0.5635 0.4982 0.7058
No log 33.3333 400 0.5172 0.5560 0.5172 0.7192
No log 33.5 402 0.5063 0.5635 0.5063 0.7116
No log 33.6667 404 0.4825 0.5544 0.4825 0.6946
No log 33.8333 406 0.4809 0.5868 0.4809 0.6935
No log 34.0 408 0.5030 0.6536 0.5030 0.7092
No log 34.1667 410 0.5284 0.6311 0.5284 0.7269
No log 34.3333 412 0.5199 0.6148 0.5199 0.7210
No log 34.5 414 0.4523 0.6438 0.4523 0.6725
No log 34.6667 416 0.4286 0.7062 0.4286 0.6547
No log 34.8333 418 0.4481 0.6519 0.4481 0.6694
No log 35.0 420 0.4447 0.6438 0.4447 0.6668
No log 35.1667 422 0.4372 0.6154 0.4372 0.6612
No log 35.3333 424 0.4392 0.6282 0.4392 0.6627
No log 35.5 426 0.4413 0.6114 0.4413 0.6643
No log 35.6667 428 0.4421 0.6223 0.4421 0.6649
No log 35.8333 430 0.4566 0.5933 0.4566 0.6757
No log 36.0 432 0.4565 0.5933 0.4565 0.6757
No log 36.1667 434 0.4513 0.5718 0.4513 0.6718
No log 36.3333 436 0.4398 0.5731 0.4398 0.6631
No log 36.5 438 0.4373 0.5246 0.4373 0.6613
No log 36.6667 440 0.4416 0.4990 0.4416 0.6645
No log 36.8333 442 0.4475 0.5227 0.4475 0.6690
No log 37.0 444 0.4611 0.5414 0.4611 0.6790
No log 37.1667 446 0.4580 0.5246 0.4580 0.6768
No log 37.3333 448 0.4549 0.5246 0.4549 0.6745
No log 37.5 450 0.4672 0.5591 0.4672 0.6835
No log 37.6667 452 0.4967 0.5990 0.4967 0.7048
No log 37.8333 454 0.5346 0.5687 0.5346 0.7312
No log 38.0 456 0.5284 0.5687 0.5284 0.7269
No log 38.1667 458 0.4816 0.6077 0.4816 0.6939
No log 38.3333 460 0.4451 0.5493 0.4451 0.6672
No log 38.5 462 0.4317 0.5493 0.4317 0.6571
No log 38.6667 464 0.4216 0.5714 0.4216 0.6493
No log 38.8333 466 0.4289 0.5861 0.4289 0.6549
No log 39.0 468 0.4426 0.6223 0.4426 0.6652
No log 39.1667 470 0.4671 0.6194 0.4671 0.6835
No log 39.3333 472 0.5430 0.5474 0.5430 0.7369
No log 39.5 474 0.6012 0.4844 0.6012 0.7753
No log 39.6667 476 0.5733 0.5082 0.5733 0.7572
No log 39.8333 478 0.4862 0.6167 0.4862 0.6973
No log 40.0 480 0.4254 0.5714 0.4254 0.6522
No log 40.1667 482 0.4137 0.6243 0.4137 0.6432
No log 40.3333 484 0.4189 0.5714 0.4189 0.6472
No log 40.5 486 0.4374 0.6062 0.4374 0.6613
No log 40.6667 488 0.4909 0.5897 0.4909 0.7007
No log 40.8333 490 0.5831 0.5082 0.5831 0.7636
No log 41.0 492 0.6602 0.5050 0.6602 0.8125
No log 41.1667 494 0.6269 0.4632 0.6269 0.7918
No log 41.3333 496 0.5455 0.5629 0.5455 0.7386
No log 41.5 498 0.5210 0.5629 0.5210 0.7218
0.2437 41.6667 500 0.4811 0.5698 0.4811 0.6936
0.2437 41.8333 502 0.4659 0.5625 0.4659 0.6826
0.2437 42.0 504 0.4459 0.5591 0.4459 0.6677
0.2437 42.1667 506 0.4457 0.5379 0.4457 0.6676
0.2437 42.3333 508 0.4450 0.5640 0.4450 0.6671
0.2437 42.5 510 0.4369 0.5831 0.4369 0.6609
0.2437 42.6667 512 0.4454 0.5414 0.4454 0.6674
0.2437 42.8333 514 0.4710 0.6210 0.4710 0.6863
0.2437 43.0 516 0.4975 0.5841 0.4975 0.7053
0.2437 43.1667 518 0.4935 0.5698 0.4935 0.7025
0.2437 43.3333 520 0.4645 0.5591 0.4645 0.6816
0.2437 43.5 522 0.4479 0.5343 0.4479 0.6693
0.2437 43.6667 524 0.4440 0.5475 0.4440 0.6663
0.2437 43.8333 526 0.4478 0.5475 0.4478 0.6692

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1