ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k16_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the dataset name was not recorded in the card). It achieves the following results on the evaluation set:

  • Loss: 0.5371
  • Qwk: 0.4672
  • Mse: 0.5371
  • Rmse: 0.7329
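For reference, these metrics can be reproduced with a short pure-Python sketch (the card does not include the actual evaluation code, so this is an illustrative implementation): Qwk is Cohen's kappa with quadratic weights over integer score labels, Mse is the mean squared error, and Rmse is its square root.

```python
def mse(y_true, y_pred):
    """Mean squared error between true and predicted scores."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean squared error: Rmse = sqrt(Mse), as in the results table."""
    return mse(y_true, y_pred) ** 0.5

def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..num_classes-1."""
    n = len(y_true)
    # Observed label-pair counts (confusion matrix)
    observed = [[0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Expected counts under independence of the two label sources
    hist_t = [sum(row) for row in observed]
    hist_p = [sum(observed[i][j] for i in range(num_classes))
              for j in range(num_classes)]
    expected = [[hist_t[i] * hist_p[j] / n for j in range(num_classes)]
                for i in range(num_classes)]
    # Quadratic disagreement weights: penalty grows with squared distance
    w = [[(i - j) ** 2 / (num_classes - 1) ** 2 for j in range(num_classes)]
         for i in range(num_classes)]
    num = sum(w[i][j] * observed[i][j]
              for i in range(num_classes) for j in range(num_classes))
    den = sum(w[i][j] * expected[i][j]
              for i in range(num_classes) for j in range(num_classes))
    return 1.0 - num / den
```

With scikit-learn available, `cohen_kappa_score(y_true, y_pred, weights="quadratic")` and `mean_squared_error(y_true, y_pred)` produce the same values.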

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
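Since the results table logs step 80 at epoch 1.0, there are 80 optimizer steps per epoch, so num_epochs=100 implies 8000 total steps. A minimal sketch of the resulting linear schedule (assuming zero warmup steps, the Trainer default when none is listed):

```python
BASE_LR = 2e-05          # learning_rate from the hyperparameters above
STEPS_PER_EPOCH = 80     # inferred from the table: step 80 at epoch 1.0
TOTAL_STEPS = 100 * STEPS_PER_EPOCH  # num_epochs * steps per epoch = 8000

def linear_lr(step, base_lr=BASE_LR, total_steps=TOTAL_STEPS, warmup_steps=0):
    """Linear decay with optional warmup, matching lr_scheduler_type: linear.

    The rate ramps up over warmup_steps (if any), then decays linearly
    from base_lr to 0 at total_steps.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

This mirrors what `transformers.get_linear_schedule_with_warmup` computes as a multiplier on the base learning rate.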

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.025 2 2.4870 -0.0471 2.4870 1.5770
No log 0.05 4 1.3568 0.0367 1.3568 1.1648
No log 0.075 6 0.9370 0.0185 0.9370 0.9680
No log 0.1 8 0.8263 0.0289 0.8263 0.9090
No log 0.125 10 0.7302 0.0618 0.7302 0.8545
No log 0.15 12 0.9307 0.2080 0.9307 0.9647
No log 0.175 14 1.2158 0.0268 1.2158 1.1026
No log 0.2 16 1.0772 0.1482 1.0772 1.0379
No log 0.225 18 1.4158 0.1002 1.4158 1.1899
No log 0.25 20 1.2743 0.1737 1.2743 1.1288
No log 0.275 22 0.8854 0.2837 0.8854 0.9410
No log 0.3 24 0.8120 0.3761 0.8120 0.9011
No log 0.325 26 1.0445 0.0990 1.0445 1.0220
No log 0.35 28 1.2961 0.1468 1.2961 1.1385
No log 0.375 30 0.9340 0.1235 0.9340 0.9665
No log 0.4 32 0.7081 0.3302 0.7081 0.8415
No log 0.425 34 0.6632 0.2877 0.6632 0.8144
No log 0.45 36 0.6827 0.2227 0.6827 0.8263
No log 0.475 38 0.6459 0.2374 0.6459 0.8036
No log 0.5 40 0.6842 0.1770 0.6842 0.8271
No log 0.525 42 0.8817 0.2008 0.8817 0.9390
No log 0.55 44 1.1133 0.1259 1.1133 1.0551
No log 0.575 46 1.0251 0.1819 1.0251 1.0125
No log 0.6 48 0.8501 0.2476 0.8501 0.9220
No log 0.625 50 0.9690 0.2040 0.9690 0.9844
No log 0.65 52 1.1008 0.1737 1.1008 1.0492
No log 0.675 54 1.4057 0.1005 1.4057 1.1856
No log 0.7 56 1.4225 0.0381 1.4225 1.1927
No log 0.725 58 1.1515 0.2439 1.1515 1.0731
No log 0.75 60 0.8868 0.1973 0.8868 0.9417
No log 0.775 62 0.7760 0.2095 0.7760 0.8809
No log 0.8 64 0.7373 0.2057 0.7373 0.8587
No log 0.825 66 0.7740 0.1700 0.7740 0.8798
No log 0.85 68 0.9935 0.2784 0.9935 0.9967
No log 0.875 70 1.2348 0.1239 1.2348 1.1112
No log 0.9 72 1.2301 0.1000 1.2301 1.1091
No log 0.925 74 1.2422 0.0547 1.2422 1.1145
No log 0.95 76 0.9609 0.2958 0.9609 0.9803
No log 0.975 78 0.8415 0.3009 0.8415 0.9173
No log 1.0 80 0.8322 0.3313 0.8322 0.9122
No log 1.025 82 0.7153 0.2325 0.7153 0.8458
No log 1.05 84 0.6539 0.2336 0.6539 0.8087
No log 1.075 86 0.6655 0.2336 0.6655 0.8158
No log 1.1 88 0.6827 0.1983 0.6827 0.8263
No log 1.125 90 0.7968 0.1724 0.7968 0.8926
No log 1.15 92 0.8823 0.2642 0.8823 0.9393
No log 1.175 94 0.8083 0.2769 0.8083 0.8991
No log 1.2 96 0.8265 0.2672 0.8265 0.9091
No log 1.225 98 0.9772 0.2066 0.9772 0.9885
No log 1.25 100 1.2261 0.1226 1.2261 1.1073
No log 1.275 102 1.3920 0.0372 1.3920 1.1798
No log 1.3 104 1.3363 0.0789 1.3363 1.1560
No log 1.325 106 1.0456 0.1217 1.0456 1.0226
No log 1.35 108 0.9120 0.2752 0.9120 0.9550
No log 1.375 110 0.9231 0.2727 0.9231 0.9608
No log 1.4 112 0.7682 0.3077 0.7682 0.8765
No log 1.425 114 0.6918 0.3271 0.6918 0.8317
No log 1.45 116 0.6537 0.4087 0.6537 0.8085
No log 1.475 118 0.6916 0.3616 0.6916 0.8316
No log 1.5 120 0.7131 0.3931 0.7131 0.8444
No log 1.525 122 0.6658 0.4106 0.6658 0.8160
No log 1.55 124 0.5854 0.4677 0.5854 0.7651
No log 1.575 126 0.5464 0.4414 0.5464 0.7392
No log 1.6 128 0.5919 0.3770 0.5919 0.7693
No log 1.625 130 0.5263 0.5329 0.5263 0.7255
No log 1.65 132 0.5475 0.5339 0.5475 0.7399
No log 1.675 134 0.7699 0.4096 0.7699 0.8774
No log 1.7 136 0.9686 0.2770 0.9686 0.9842
No log 1.725 138 0.9170 0.2820 0.9170 0.9576
No log 1.75 140 0.6474 0.4230 0.6474 0.8046
No log 1.775 142 0.5754 0.5246 0.5754 0.7586
No log 1.8 144 0.6220 0.4307 0.6220 0.7887
No log 1.825 146 0.7119 0.3434 0.7119 0.8438
No log 1.85 148 0.6749 0.4610 0.6749 0.8215
No log 1.875 150 0.6263 0.5010 0.6263 0.7914
No log 1.9 152 0.6222 0.5267 0.6222 0.7888
No log 1.925 154 0.6371 0.3937 0.6371 0.7982
No log 1.95 156 0.6626 0.3355 0.6626 0.8140
No log 1.975 158 0.6528 0.3524 0.6528 0.8080
No log 2.0 160 0.6489 0.4829 0.6489 0.8055
No log 2.025 162 0.6484 0.4762 0.6484 0.8052
No log 2.05 164 0.6491 0.4762 0.6491 0.8057
No log 2.075 166 0.6255 0.5538 0.6255 0.7909
No log 2.1 168 0.5921 0.6053 0.5921 0.7695
No log 2.125 170 0.5671 0.6140 0.5671 0.7530
No log 2.15 172 0.5631 0.5555 0.5631 0.7504
No log 2.175 174 0.5435 0.5555 0.5435 0.7373
No log 2.2 176 0.5294 0.5633 0.5294 0.7276
No log 2.225 178 0.5914 0.4448 0.5914 0.7690
No log 2.25 180 0.7139 0.4444 0.7139 0.8449
No log 2.275 182 0.6496 0.4427 0.6496 0.8059
No log 2.3 184 0.6278 0.4618 0.6278 0.7924
No log 2.325 186 0.7681 0.4260 0.7681 0.8764
No log 2.35 188 0.9839 0.3844 0.9839 0.9919
No log 2.375 190 1.0026 0.3809 1.0026 1.0013
No log 2.4 192 0.7354 0.4978 0.7354 0.8575
No log 2.425 194 0.6532 0.4784 0.6532 0.8082
No log 2.45 196 0.6793 0.4784 0.6793 0.8242
No log 2.475 198 0.5696 0.5489 0.5696 0.7548
No log 2.5 200 0.4934 0.5678 0.4934 0.7024
No log 2.525 202 0.4711 0.6197 0.4711 0.6864
No log 2.55 204 0.4653 0.6282 0.4653 0.6822
No log 2.575 206 0.4880 0.5554 0.4880 0.6985
No log 2.6 208 0.5014 0.5895 0.5014 0.7081
No log 2.625 210 0.5078 0.5895 0.5078 0.7126
No log 2.65 212 0.5195 0.5895 0.5195 0.7208
No log 2.675 214 0.5496 0.5751 0.5496 0.7413
No log 2.7 216 0.6031 0.5874 0.6031 0.7766
No log 2.725 218 0.6778 0.5630 0.6778 0.8233
No log 2.75 220 0.5828 0.5553 0.5828 0.7634
No log 2.775 222 0.4954 0.5627 0.4954 0.7038
No log 2.8 224 0.4939 0.5753 0.4939 0.7027
No log 2.825 226 0.4855 0.5941 0.4855 0.6968
No log 2.85 228 0.5225 0.5554 0.5225 0.7228
No log 2.875 230 0.6221 0.5184 0.6221 0.7887
No log 2.9 232 0.5790 0.5779 0.5790 0.7609
No log 2.925 234 0.5170 0.5945 0.5170 0.7190
No log 2.95 236 0.5193 0.5344 0.5193 0.7206
No log 2.975 238 0.5348 0.5539 0.5348 0.7313
No log 3.0 240 0.5669 0.5861 0.5669 0.7530
No log 3.025 242 0.5688 0.5797 0.5688 0.7542
No log 3.05 244 0.5539 0.5993 0.5539 0.7442
No log 3.075 246 0.5434 0.6007 0.5434 0.7372
No log 3.1 248 0.5932 0.5212 0.5932 0.7702
No log 3.125 250 0.6776 0.3761 0.6776 0.8232
No log 3.15 252 0.6485 0.4204 0.6485 0.8053
No log 3.175 254 0.6295 0.4373 0.6295 0.7934
No log 3.2 256 0.5652 0.5208 0.5652 0.7518
No log 3.225 258 0.5322 0.6257 0.5322 0.7295
No log 3.25 260 0.5376 0.6168 0.5376 0.7332
No log 3.275 262 0.5584 0.5283 0.5584 0.7472
No log 3.3 264 0.5856 0.5300 0.5856 0.7653
No log 3.325 266 0.6352 0.4302 0.6352 0.7970
No log 3.35 268 0.6074 0.5642 0.6074 0.7794
No log 3.375 270 0.5285 0.5722 0.5285 0.7270
No log 3.4 272 0.5298 0.6068 0.5298 0.7279
No log 3.425 274 0.5429 0.5781 0.5429 0.7368
No log 3.45 276 0.5249 0.5522 0.5249 0.7245
No log 3.475 278 0.5256 0.4788 0.5256 0.7250
No log 3.5 280 0.5298 0.4991 0.5298 0.7279
No log 3.525 282 0.5564 0.5495 0.5564 0.7459
No log 3.55 284 0.5584 0.5266 0.5584 0.7473
No log 3.575 286 0.5178 0.5152 0.5178 0.7196
No log 3.6 288 0.4986 0.4569 0.4986 0.7061
No log 3.625 290 0.4952 0.5539 0.4952 0.7037
No log 3.65 292 0.5712 0.5438 0.5712 0.7557
No log 3.675 294 0.7139 0.3913 0.7139 0.8449
No log 3.7 296 0.6880 0.4122 0.6880 0.8295
No log 3.725 298 0.5463 0.5970 0.5463 0.7391
No log 3.75 300 0.4978 0.6307 0.4978 0.7055
No log 3.775 302 0.5067 0.6388 0.5067 0.7119
No log 3.8 304 0.4935 0.6584 0.4935 0.7025
No log 3.825 306 0.5548 0.5220 0.5548 0.7449
No log 3.85 308 0.6131 0.4410 0.6131 0.7830
No log 3.875 310 0.5996 0.4459 0.5996 0.7744
No log 3.9 312 0.5257 0.5970 0.5257 0.7251
No log 3.925 314 0.4751 0.6183 0.4751 0.6893
No log 3.95 316 0.4685 0.5617 0.4685 0.6845
No log 3.975 318 0.5159 0.5422 0.5159 0.7183
No log 4.0 320 0.6424 0.4575 0.6424 0.8015
No log 4.025 322 0.6544 0.4238 0.6544 0.8089
No log 4.05 324 0.5589 0.5178 0.5589 0.7476
No log 4.075 326 0.5055 0.6047 0.5055 0.7110
No log 4.1 328 0.5282 0.5970 0.5282 0.7268
No log 4.125 330 0.5923 0.4971 0.5923 0.7696
No log 4.15 332 0.5440 0.5486 0.5440 0.7375
No log 4.175 334 0.4676 0.6402 0.4676 0.6838
No log 4.2 336 0.4651 0.6210 0.4651 0.6820
No log 4.225 338 0.5173 0.5933 0.5173 0.7192
No log 4.25 340 0.7137 0.3699 0.7137 0.8448
No log 4.275 342 0.8965 0.3497 0.8965 0.9468
No log 4.3 344 0.9455 0.3497 0.9455 0.9724
No log 4.325 346 0.7799 0.3567 0.7799 0.8831
No log 4.35 348 0.5867 0.4708 0.5867 0.7660
No log 4.375 350 0.4680 0.6127 0.4680 0.6841
No log 4.4 352 0.4433 0.6197 0.4433 0.6658
No log 4.425 354 0.4421 0.6197 0.4421 0.6649
No log 4.45 356 0.4536 0.6197 0.4536 0.6735
No log 4.475 358 0.4616 0.6197 0.4616 0.6794
No log 4.5 360 0.4770 0.6402 0.4770 0.6907
No log 4.525 362 0.4701 0.6389 0.4701 0.6857
No log 4.55 364 0.4739 0.5640 0.4739 0.6884
No log 4.575 366 0.4736 0.5985 0.4736 0.6882
No log 4.6 368 0.5022 0.5796 0.5022 0.7086
No log 4.625 370 0.5080 0.5796 0.5080 0.7127
No log 4.65 372 0.5331 0.5283 0.5331 0.7302
No log 4.675 374 0.6355 0.4429 0.6355 0.7972
No log 4.7 376 0.6643 0.4448 0.6643 0.8151
No log 4.725 378 0.5682 0.4575 0.5682 0.7538
No log 4.75 380 0.4859 0.5446 0.4859 0.6970
No log 4.775 382 0.5215 0.5237 0.5215 0.7221
No log 4.8 384 0.5232 0.5473 0.5232 0.7233
No log 4.825 386 0.5103 0.6108 0.5103 0.7144
No log 4.85 388 0.5399 0.6034 0.5399 0.7348
No log 4.875 390 0.6386 0.4817 0.6386 0.7991
No log 4.9 392 0.6095 0.4817 0.6095 0.7807
No log 4.925 394 0.5272 0.5881 0.5272 0.7261
No log 4.95 396 0.5198 0.5692 0.5198 0.7210
No log 4.975 398 0.5229 0.5692 0.5229 0.7231
No log 5.0 400 0.5089 0.5918 0.5089 0.7133
No log 5.025 402 0.4827 0.5999 0.4827 0.6947
No log 5.05 404 0.5005 0.5999 0.5005 0.7075
No log 5.075 406 0.5580 0.4375 0.5580 0.7470
No log 5.1 408 0.6482 0.4466 0.6482 0.8051
No log 5.125 410 0.6073 0.4051 0.6073 0.7793
No log 5.15 412 0.5203 0.5083 0.5203 0.7213
No log 5.175 414 0.4791 0.5555 0.4791 0.6922
No log 5.2 416 0.4904 0.5479 0.4904 0.7003
No log 5.225 418 0.5371 0.4713 0.5371 0.7329
No log 5.25 420 0.5606 0.4713 0.5606 0.7487
No log 5.275 422 0.5255 0.4841 0.5255 0.7249
No log 5.3 424 0.5052 0.4802 0.5052 0.7108
No log 5.325 426 0.4921 0.5463 0.4921 0.7015
No log 5.35 428 0.5131 0.5841 0.5131 0.7163
No log 5.375 430 0.5518 0.5283 0.5518 0.7428
No log 5.4 432 0.5911 0.4652 0.5911 0.7689
No log 5.425 434 0.6678 0.4103 0.6678 0.8172
No log 5.45 436 0.6336 0.4250 0.6336 0.7960
No log 5.475 438 0.5210 0.5267 0.5210 0.7218
No log 5.5 440 0.4863 0.4984 0.4863 0.6974
No log 5.525 442 0.4895 0.4984 0.4895 0.6997
No log 5.55 444 0.4922 0.6183 0.4922 0.7016
No log 5.575 446 0.5096 0.6305 0.5096 0.7139
No log 5.6 448 0.5349 0.5678 0.5349 0.7314
No log 5.625 450 0.5886 0.5013 0.5886 0.7672
No log 5.65 452 0.5695 0.5471 0.5695 0.7547
No log 5.675 454 0.5720 0.5013 0.5720 0.7563
No log 5.7 456 0.5971 0.4634 0.5971 0.7727
No log 5.725 458 0.5660 0.4690 0.5660 0.7524
No log 5.75 460 0.5095 0.5200 0.5095 0.7138
No log 5.775 462 0.4765 0.6183 0.4765 0.6903
No log 5.8 464 0.4781 0.6305 0.4781 0.6915
No log 5.825 466 0.5062 0.5352 0.5062 0.7115
No log 5.85 468 0.5064 0.5144 0.5064 0.7116
No log 5.875 470 0.4657 0.6491 0.4657 0.6824
No log 5.9 472 0.4586 0.6452 0.4586 0.6772
No log 5.925 474 0.4531 0.6655 0.4531 0.6731
No log 5.95 476 0.4507 0.6672 0.4507 0.6713
No log 5.975 478 0.4802 0.5283 0.4802 0.6930
No log 6.0 480 0.6336 0.5034 0.6336 0.7960
No log 6.025 482 0.7561 0.4430 0.7561 0.8695
No log 6.05 484 0.6781 0.5034 0.6781 0.8234
No log 6.075 486 0.5332 0.4834 0.5332 0.7302
No log 6.1 488 0.4987 0.5621 0.4987 0.7062
No log 6.125 490 0.5392 0.4834 0.5392 0.7343
No log 6.15 492 0.5764 0.5061 0.5764 0.7592
No log 6.175 494 0.5977 0.5131 0.5977 0.7731
No log 6.2 496 0.6265 0.5401 0.6265 0.7915
No log 6.225 498 0.6104 0.5401 0.6104 0.7813
0.3151 6.25 500 0.5738 0.4930 0.5738 0.7575
0.3151 6.275 502 0.5132 0.5090 0.5132 0.7164
0.3151 6.3 504 0.4982 0.6305 0.4982 0.7059
0.3151 6.325 506 0.5090 0.5678 0.5090 0.7134
0.3151 6.35 508 0.5575 0.4969 0.5575 0.7467
0.3151 6.375 510 0.6027 0.4708 0.6027 0.7763
0.3151 6.4 512 0.6260 0.4598 0.6260 0.7912
0.3151 6.425 514 0.6537 0.5018 0.6537 0.8085
0.3151 6.45 516 0.6835 0.5160 0.6835 0.8267
0.3151 6.475 518 0.6426 0.4794 0.6426 0.8016
0.3151 6.5 520 0.5527 0.5407 0.5527 0.7434
0.3151 6.525 522 0.5244 0.5565 0.5244 0.7242
0.3151 6.55 524 0.5727 0.4219 0.5727 0.7568
0.3151 6.575 526 0.5751 0.4052 0.5751 0.7584
0.3151 6.6 528 0.5462 0.4243 0.5462 0.7390
0.3151 6.625 530 0.5258 0.4972 0.5258 0.7251
0.3151 6.65 532 0.5703 0.5553 0.5703 0.7552
0.3151 6.675 534 0.6098 0.4510 0.6098 0.7809
0.3151 6.7 536 0.6283 0.4723 0.6283 0.7926
0.3151 6.725 538 0.6011 0.4510 0.6011 0.7753
0.3151 6.75 540 0.5529 0.6066 0.5529 0.7435
0.3151 6.775 542 0.5422 0.5571 0.5422 0.7363
0.3151 6.8 544 0.5507 0.4979 0.5507 0.7421
0.3151 6.825 546 0.5837 0.4616 0.5837 0.7640
0.3151 6.85 548 0.6491 0.4545 0.6491 0.8057
0.3151 6.875 550 0.6249 0.4740 0.6249 0.7905
0.3151 6.9 552 0.5494 0.5152 0.5494 0.7412
0.3151 6.925 554 0.5251 0.6130 0.5251 0.7246
0.3151 6.95 556 0.5291 0.5999 0.5291 0.7274
0.3151 6.975 558 0.5712 0.5524 0.5712 0.7558
0.3151 7.0 560 0.5852 0.5220 0.5852 0.7650
0.3151 7.025 562 0.5337 0.5348 0.5337 0.7306
0.3151 7.05 564 0.5081 0.6377 0.5081 0.7128
0.3151 7.075 566 0.5258 0.5323 0.5258 0.7251
0.3151 7.1 568 0.5188 0.5323 0.5188 0.7203
0.3151 7.125 570 0.5125 0.5339 0.5125 0.7159
0.3151 7.15 572 0.6102 0.4708 0.6102 0.7812
0.3151 7.175 574 0.7116 0.4051 0.7116 0.8435
0.3151 7.2 576 0.7214 0.4462 0.7214 0.8494
0.3151 7.225 578 0.6353 0.4851 0.6353 0.7970
0.3151 7.25 580 0.5371 0.4672 0.5371 0.7329
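As a sanity check on the table, the Rmse column is the square root of the Mse column, and the validation loss equals the Mse, which suggests the model was trained with a mean-squared-error objective (a regression-style scoring head):

```python
import math

# Final evaluation row from the table above (epoch 7.25, step 580).
final_row = {"loss": 0.5371, "qwk": 0.4672, "mse": 0.5371, "rmse": 0.7329}

# Rmse is sqrt(Mse), rounded to 4 decimals as in the table.
assert round(math.sqrt(final_row["mse"]), 4) == final_row["rmse"]

# Validation loss equals Mse in every row, consistent with an MSE loss.
assert final_row["loss"] == final_row["mse"]
```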

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • Parameters: 0.1B
  • Tensor type: F32 (safetensors)

Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k16_task7_organization

Finetuned from aubmindlab/bert-base-arabertv02