ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run3_AugV5_k6_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6596
  • Qwk (quadratic weighted kappa): 0.3127
  • Mse (mean squared error): 0.6596
  • Rmse (root mean squared error): 0.8121
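Loss and Mse are identical in every logged row, which indicates the model is trained with an MSE regression objective, and Rmse is simply the square root of Mse. Qwk is quadratic weighted Cohen's kappa, the usual agreement metric for ordinal essay scores. A minimal sketch of both relationships (the pure-Python kappa below is an illustrative reimplementation, not the card's actual evaluation code):

```python
import math

# The reported Loss equals Mse, and Rmse = sqrt(Mse): sqrt(0.6596) ≈ 0.8121.
mse = 0.6596
rmse = math.sqrt(mse)

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted Cohen's kappa (the `Qwk` column).

    Illustrative pure-Python version; sklearn's
    cohen_kappa_score(y_true, y_pred, weights="quadratic") computes the same.
    """
    # Observed confusion matrix between true and predicted labels.
    obs = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    n = len(y_true)
    hist_true = [sum(row) for row in obs]
    hist_pred = [sum(obs[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement penalty
            num += w * obs[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den
```

Perfect agreement yields a kappa of 1.0 and systematic disagreement drives it negative, so the 0.31 reported above indicates modest agreement beyond chance.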

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
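With lr_scheduler_type set to linear and no warmup steps listed, the learning rate decays linearly from 2e-05 toward zero over the scheduled training steps. A minimal sketch of that decay (the helper name and the no-warmup assumption are mine, not stated in the card):

```python
def linear_lr(step, total_steps, base_lr=2e-05):
    """Learning rate under a linear scheduler with no warmup:
    starts at base_lr at step 0 and reaches 0 at total_steps."""
    return base_lr * max(0.0, 1.0 - step / total_steps)
```

In the Transformers library, get_linear_schedule_with_warmup produces the same shape, with an optional warmup ramp before the decay.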

Training results

The table below logs validation metrics every two optimization steps. Although num_epochs was set to 100, the logged results end at epoch 28.64 (step 630), suggesting training was stopped early.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0909 2 2.5477 -0.0788 2.5477 1.5962
No log 0.1818 4 1.2307 0.1248 1.2307 1.1094
No log 0.2727 6 0.9706 -0.1015 0.9706 0.9852
No log 0.3636 8 1.0685 0.0826 1.0685 1.0337
No log 0.4545 10 1.2310 0.1001 1.2310 1.1095
No log 0.5455 12 1.1653 0.1557 1.1653 1.0795
No log 0.6364 14 0.9111 0.2119 0.9111 0.9545
No log 0.7273 16 0.8574 0.1259 0.8574 0.9260
No log 0.8182 18 1.1939 0.0169 1.1939 1.0926
No log 0.9091 20 1.3093 0.1000 1.3093 1.1442
No log 1.0 22 1.0600 0.1882 1.0600 1.0296
No log 1.0909 24 0.8411 0.1365 0.8411 0.9171
No log 1.1818 26 0.7808 0.0 0.7808 0.8836
No log 1.2727 28 0.7803 0.0 0.7803 0.8833
No log 1.3636 30 0.7837 0.0428 0.7837 0.8853
No log 1.4545 32 0.7934 0.2046 0.7934 0.8907
No log 1.5455 34 0.7879 0.0428 0.7879 0.8876
No log 1.6364 36 0.7854 0.0 0.7854 0.8863
No log 1.7273 38 0.7687 0.0 0.7687 0.8768
No log 1.8182 40 0.7604 0.0 0.7604 0.8720
No log 1.9091 42 0.7777 0.0 0.7777 0.8819
No log 2.0 44 0.8207 0.0 0.8207 0.9059
No log 2.0909 46 0.7981 0.0937 0.7981 0.8934
No log 2.1818 48 0.8087 0.2087 0.8087 0.8993
No log 2.2727 50 0.7550 0.0053 0.7550 0.8689
No log 2.3636 52 0.7027 0.0481 0.7027 0.8383
No log 2.4545 54 0.7090 0.0509 0.7090 0.8420
No log 2.5455 56 0.7323 0.1358 0.7323 0.8558
No log 2.6364 58 0.7117 0.1998 0.7117 0.8436
No log 2.7273 60 0.6985 0.1228 0.6985 0.8358
No log 2.8182 62 0.7047 0.1228 0.7047 0.8395
No log 2.9091 64 0.7210 0.1673 0.7210 0.8491
No log 3.0 66 0.7773 0.1334 0.7773 0.8817
No log 3.0909 68 0.7235 0.1310 0.7235 0.8506
No log 3.1818 70 0.6706 0.0697 0.6706 0.8189
No log 3.2727 72 0.7097 0.2243 0.7097 0.8424
No log 3.3636 74 0.7717 0.0 0.7717 0.8785
No log 3.4545 76 0.7455 0.0816 0.7455 0.8634
No log 3.5455 78 0.7552 0.1739 0.7552 0.8690
No log 3.6364 80 0.7641 0.1739 0.7641 0.8741
No log 3.7273 82 0.7854 0.2408 0.7854 0.8862
No log 3.8182 84 0.8067 0.2046 0.8067 0.8982
No log 3.9091 86 0.7719 0.2066 0.7719 0.8786
No log 4.0 88 0.7516 0.2341 0.7516 0.8669
No log 4.0909 90 0.7322 0.0822 0.7322 0.8557
No log 4.1818 92 0.7186 0.1181 0.7186 0.8477
No log 4.2727 94 0.7376 -0.0026 0.7376 0.8588
No log 4.3636 96 0.6482 0.3105 0.6482 0.8051
No log 4.4545 98 0.6139 0.2748 0.6139 0.7835
No log 4.5455 100 0.6304 0.1277 0.6304 0.7940
No log 4.6364 102 0.7277 0.1328 0.7277 0.8531
No log 4.7273 104 0.8915 0.0977 0.8915 0.9442
No log 4.8182 106 0.9051 0.0342 0.9051 0.9514
No log 4.9091 108 0.7825 0.1873 0.7825 0.8846
No log 5.0 110 0.7287 0.2857 0.7287 0.8537
No log 5.0909 112 0.8117 0.3234 0.8117 0.9010
No log 5.1818 114 0.8627 0.2410 0.8627 0.9288
No log 5.2727 116 0.8393 0.2784 0.8393 0.9162
No log 5.3636 118 0.7437 0.4219 0.7437 0.8624
No log 5.4545 120 0.6885 0.4015 0.6885 0.8298
No log 5.5455 122 0.6802 0.3072 0.6802 0.8248
No log 5.6364 124 0.7325 0.2274 0.7325 0.8559
No log 5.7273 126 0.7572 0.1906 0.7572 0.8702
No log 5.8182 128 0.6988 0.2909 0.6988 0.8360
No log 5.9091 130 0.6595 0.3893 0.6595 0.8121
No log 6.0 132 0.6597 0.3673 0.6597 0.8122
No log 6.0909 134 0.6726 0.3789 0.6726 0.8201
No log 6.1818 136 0.7031 0.4479 0.7031 0.8385
No log 6.2727 138 0.7024 0.4479 0.7024 0.8381
No log 6.3636 140 0.6672 0.4569 0.6672 0.8168
No log 6.4545 142 0.6481 0.4502 0.6481 0.8051
No log 6.5455 144 0.6337 0.3398 0.6337 0.7960
No log 6.6364 146 0.6341 0.4077 0.6341 0.7963
No log 6.7273 148 0.6420 0.3964 0.6420 0.8013
No log 6.8182 150 0.6241 0.3964 0.6241 0.7900
No log 6.9091 152 0.6198 0.4077 0.6198 0.7872
No log 7.0 154 0.6408 0.4538 0.6408 0.8005
No log 7.0909 156 0.7232 0.4134 0.7232 0.8504
No log 7.1818 158 0.7030 0.3712 0.7030 0.8384
No log 7.2727 160 0.6376 0.4908 0.6376 0.7985
No log 7.3636 162 0.6282 0.4027 0.6282 0.7926
No log 7.4545 164 0.6284 0.3919 0.6284 0.7927
No log 7.5455 166 0.6250 0.4076 0.6250 0.7905
No log 7.6364 168 0.6637 0.3564 0.6637 0.8147
No log 7.7273 170 0.6585 0.3387 0.6585 0.8115
No log 7.8182 172 0.6498 0.2019 0.6498 0.8061
No log 7.9091 174 0.6799 0.3659 0.6799 0.8246
No log 8.0 176 0.6706 0.4182 0.6706 0.8189
No log 8.0909 178 0.5999 0.4299 0.5999 0.7745
No log 8.1818 180 0.6138 0.4437 0.6138 0.7834
No log 8.2727 182 0.6785 0.3450 0.6785 0.8237
No log 8.3636 184 0.6653 0.4167 0.6653 0.8157
No log 8.4545 186 0.6113 0.4845 0.6113 0.7818
No log 8.5455 188 0.6002 0.4867 0.6002 0.7747
No log 8.6364 190 0.5695 0.4492 0.5695 0.7546
No log 8.7273 192 0.5969 0.4249 0.5969 0.7726
No log 8.8182 194 0.6906 0.3434 0.6906 0.8310
No log 8.9091 196 0.6714 0.3730 0.6714 0.8194
No log 9.0 198 0.5915 0.4515 0.5915 0.7691
No log 9.0909 200 0.6360 0.4663 0.6360 0.7975
No log 9.1818 202 0.7901 0.4204 0.7901 0.8889
No log 9.2727 204 0.7538 0.4400 0.7538 0.8682
No log 9.3636 206 0.6391 0.4020 0.6391 0.7994
No log 9.4545 208 0.6124 0.4466 0.6124 0.7825
No log 9.5455 210 0.6610 0.3659 0.6610 0.8130
No log 9.6364 212 0.6185 0.3704 0.6185 0.7864
No log 9.7273 214 0.5705 0.5286 0.5705 0.7553
No log 9.8182 216 0.6457 0.4052 0.6457 0.8036
No log 9.9091 218 0.7279 0.4296 0.7279 0.8531
No log 10.0 220 0.7345 0.3675 0.7345 0.8570
No log 10.0909 222 0.6753 0.3312 0.6753 0.8218
No log 10.1818 224 0.6230 0.3839 0.6230 0.7893
No log 10.2727 226 0.6042 0.4437 0.6042 0.7773
No log 10.3636 228 0.5859 0.5361 0.5859 0.7654
No log 10.4545 230 0.6752 0.3994 0.6752 0.8217
No log 10.5455 232 0.7255 0.3191 0.7255 0.8517
No log 10.6364 234 0.6562 0.3648 0.6562 0.8101
No log 10.7273 236 0.6410 0.5593 0.6410 0.8006
No log 10.8182 238 0.7104 0.4315 0.7104 0.8429
No log 10.9091 240 0.8424 0.3557 0.8424 0.9178
No log 11.0 242 0.8066 0.3597 0.8066 0.8981
No log 11.0909 244 0.6748 0.4703 0.6748 0.8215
No log 11.1818 246 0.6338 0.5632 0.6338 0.7961
No log 11.2727 248 0.6294 0.5501 0.6294 0.7933
No log 11.3636 250 0.6496 0.4801 0.6496 0.8059
No log 11.4545 252 0.6851 0.4247 0.6851 0.8277
No log 11.5455 254 0.6584 0.4352 0.6584 0.8114
No log 11.6364 256 0.6234 0.4722 0.6234 0.7895
No log 11.7273 258 0.6353 0.3859 0.6353 0.7971
No log 11.8182 260 0.6465 0.4240 0.6465 0.8040
No log 11.9091 262 0.6552 0.2965 0.6552 0.8094
No log 12.0 264 0.6690 0.3551 0.6690 0.8180
No log 12.0909 266 0.6709 0.3467 0.6709 0.8191
No log 12.1818 268 0.6829 0.3385 0.6829 0.8264
No log 12.2727 270 0.6802 0.3716 0.6802 0.8247
No log 12.3636 272 0.6950 0.3716 0.6950 0.8336
No log 12.4545 274 0.6723 0.3450 0.6723 0.8199
No log 12.5455 276 0.6303 0.4722 0.6303 0.7939
No log 12.6364 278 0.6416 0.4895 0.6416 0.8010
No log 12.7273 280 0.6374 0.4547 0.6374 0.7984
No log 12.8182 282 0.6439 0.4291 0.6439 0.8024
No log 12.9091 284 0.6736 0.3868 0.6736 0.8207
No log 13.0 286 0.6831 0.3868 0.6831 0.8265
No log 13.0909 288 0.6772 0.3985 0.6772 0.8229
No log 13.1818 290 0.6663 0.4769 0.6663 0.8163
No log 13.2727 292 0.6796 0.4044 0.6796 0.8244
No log 13.3636 294 0.7095 0.3909 0.7095 0.8423
No log 13.4545 296 0.6952 0.4592 0.6952 0.8338
No log 13.5455 298 0.6529 0.5373 0.6529 0.8080
No log 13.6364 300 0.6425 0.4348 0.6425 0.8016
No log 13.7273 302 0.6427 0.4783 0.6427 0.8017
No log 13.8182 304 0.6356 0.5238 0.6356 0.7972
No log 13.9091 306 0.6267 0.5021 0.6267 0.7917
No log 14.0 308 0.6599 0.5673 0.6599 0.8123
No log 14.0909 310 0.6778 0.5259 0.6778 0.8233
No log 14.1818 312 0.6280 0.5308 0.6280 0.7924
No log 14.2727 314 0.5915 0.4762 0.5915 0.7691
No log 14.3636 316 0.6193 0.4918 0.6193 0.7870
No log 14.4545 318 0.6199 0.4461 0.6199 0.7874
No log 14.5455 320 0.6110 0.4221 0.6110 0.7817
No log 14.6364 322 0.6051 0.4722 0.6051 0.7779
No log 14.7273 324 0.6188 0.5095 0.6188 0.7866
No log 14.8182 326 0.6178 0.5095 0.6178 0.7860
No log 14.9091 328 0.6200 0.4837 0.6200 0.7874
No log 15.0 330 0.6193 0.5024 0.6193 0.7869
No log 15.0909 332 0.6127 0.4949 0.6127 0.7827
No log 15.1818 334 0.6056 0.4681 0.6056 0.7782
No log 15.2727 336 0.5990 0.4276 0.5990 0.7740
No log 15.3636 338 0.5987 0.4504 0.5987 0.7738
No log 15.4545 340 0.6010 0.4681 0.6010 0.7753
No log 15.5455 342 0.6064 0.5344 0.6064 0.7787
No log 15.6364 344 0.6174 0.5003 0.6174 0.7857
No log 15.7273 346 0.6688 0.4916 0.6688 0.8178
No log 15.8182 348 0.6787 0.4625 0.6787 0.8239
No log 15.9091 350 0.6511 0.4514 0.6511 0.8069
No log 16.0 352 0.6547 0.4968 0.6547 0.8091
No log 16.0909 354 0.7250 0.4600 0.7250 0.8515
No log 16.1818 356 0.8851 0.4577 0.8851 0.9408
No log 16.2727 358 0.8555 0.4527 0.8555 0.9249
No log 16.3636 360 0.7255 0.4666 0.7255 0.8518
No log 16.4545 362 0.6513 0.4871 0.6513 0.8070
No log 16.5455 364 0.6407 0.3859 0.6407 0.8005
No log 16.6364 366 0.6473 0.3714 0.6473 0.8046
No log 16.7273 368 0.6703 0.5254 0.6703 0.8187
No log 16.8182 370 0.7144 0.4023 0.7144 0.8452
No log 16.9091 372 0.7024 0.4023 0.7024 0.8381
No log 17.0 374 0.6438 0.4964 0.6438 0.8024
No log 17.0909 376 0.6158 0.4308 0.6158 0.7848
No log 17.1818 378 0.6258 0.4427 0.6258 0.7911
No log 17.2727 380 0.6079 0.3978 0.6079 0.7797
No log 17.3636 382 0.6380 0.4864 0.6380 0.7987
No log 17.4545 384 0.6805 0.4484 0.6805 0.8249
No log 17.5455 386 0.6536 0.4484 0.6536 0.8085
No log 17.6364 388 0.6323 0.5195 0.6323 0.7952
No log 17.7273 390 0.6439 0.4639 0.6439 0.8024
No log 17.8182 392 0.7006 0.5601 0.7006 0.8370
No log 17.9091 394 0.7707 0.4877 0.7707 0.8779
No log 18.0 396 0.7565 0.4741 0.7565 0.8698
No log 18.0909 398 0.7110 0.4916 0.7110 0.8432
No log 18.1818 400 0.6524 0.4278 0.6524 0.8077
No log 18.2727 402 0.6410 0.4006 0.6410 0.8007
No log 18.3636 404 0.6500 0.4605 0.6500 0.8062
No log 18.4545 406 0.6587 0.4872 0.6587 0.8116
No log 18.5455 408 0.6220 0.3927 0.6220 0.7887
No log 18.6364 410 0.6165 0.4386 0.6165 0.7852
No log 18.7273 412 0.6096 0.4386 0.6096 0.7808
No log 18.8182 414 0.6142 0.4747 0.6142 0.7837
No log 18.9091 416 0.6689 0.4218 0.6689 0.8179
No log 19.0 418 0.6912 0.4144 0.6912 0.8314
No log 19.0909 420 0.6439 0.3355 0.6439 0.8025
No log 19.1818 422 0.6118 0.4194 0.6118 0.7822
No log 19.2727 424 0.6211 0.3810 0.6211 0.7881
No log 19.3636 426 0.6192 0.4029 0.6192 0.7869
No log 19.4545 428 0.6353 0.3990 0.6353 0.7971
No log 19.5455 430 0.6634 0.4845 0.6634 0.8145
No log 19.6364 432 0.6570 0.5166 0.6570 0.8105
No log 19.7273 434 0.6222 0.5395 0.6222 0.7888
No log 19.8182 436 0.6092 0.5437 0.6092 0.7805
No log 19.9091 438 0.6026 0.5631 0.6026 0.7763
No log 20.0 440 0.6032 0.5631 0.6032 0.7767
No log 20.0909 442 0.6275 0.5395 0.6275 0.7922
No log 20.1818 444 0.6194 0.5324 0.6194 0.7870
No log 20.2727 446 0.6238 0.5485 0.6238 0.7898
No log 20.3636 448 0.6196 0.4763 0.6196 0.7871
No log 20.4545 450 0.6393 0.5133 0.6393 0.7996
No log 20.5455 452 0.6853 0.4782 0.6853 0.8278
No log 20.6364 454 0.6932 0.4782 0.6932 0.8326
No log 20.7273 456 0.6787 0.4951 0.6787 0.8239
No log 20.8182 458 0.6484 0.3816 0.6484 0.8053
No log 20.9091 460 0.6486 0.4201 0.6486 0.8054
No log 21.0 462 0.6442 0.3738 0.6442 0.8026
No log 21.0909 464 0.6549 0.3928 0.6549 0.8093
No log 21.1818 466 0.6504 0.3906 0.6504 0.8065
No log 21.2727 468 0.6439 0.3906 0.6439 0.8024
No log 21.3636 470 0.6610 0.4812 0.6610 0.8130
No log 21.4545 472 0.6499 0.4448 0.6499 0.8062
No log 21.5455 474 0.6152 0.5340 0.6152 0.7844
No log 21.6364 476 0.6325 0.4127 0.6325 0.7953
No log 21.7273 478 0.6941 0.4518 0.6941 0.8332
No log 21.8182 480 0.6740 0.4261 0.6740 0.8210
No log 21.9091 482 0.6083 0.4505 0.6083 0.7800
No log 22.0 484 0.6190 0.5632 0.6190 0.7868
No log 22.0909 486 0.6472 0.4782 0.6472 0.8045
No log 22.1818 488 0.6391 0.5053 0.6391 0.7995
No log 22.2727 490 0.6197 0.4091 0.6197 0.7872
No log 22.3636 492 0.6088 0.4091 0.6088 0.7802
No log 22.4545 494 0.5983 0.4358 0.5983 0.7735
No log 22.5455 496 0.5925 0.4768 0.5925 0.7697
No log 22.6364 498 0.5902 0.4768 0.5902 0.7682
0.3443 22.7273 500 0.5924 0.3813 0.5924 0.7697
0.3443 22.8182 502 0.5920 0.4526 0.5920 0.7694
0.3443 22.9091 504 0.5943 0.4914 0.5943 0.7709
0.3443 23.0 506 0.6052 0.4124 0.6052 0.7780
0.3443 23.0909 508 0.6077 0.4660 0.6077 0.7796
0.3443 23.1818 510 0.5948 0.4547 0.5948 0.7712
0.3443 23.2727 512 0.6023 0.4160 0.6023 0.7761
0.3443 23.3636 514 0.6111 0.4262 0.6111 0.7818
0.3443 23.4545 516 0.6103 0.4262 0.6103 0.7812
0.3443 23.5455 518 0.5974 0.4547 0.5974 0.7729
0.3443 23.6364 520 0.5922 0.4547 0.5922 0.7695
0.3443 23.7273 522 0.5869 0.4857 0.5869 0.7661
0.3443 23.8182 524 0.5802 0.4857 0.5802 0.7617
0.3443 23.9091 526 0.5779 0.5095 0.5779 0.7602
0.3443 24.0 528 0.5817 0.5307 0.5817 0.7627
0.3443 24.0909 530 0.5678 0.4857 0.5678 0.7535
0.3443 24.1818 532 0.5680 0.4160 0.5680 0.7536
0.3443 24.2727 534 0.6016 0.4307 0.6016 0.7756
0.3443 24.3636 536 0.6291 0.4005 0.6291 0.7932
0.3443 24.4545 538 0.5932 0.4012 0.5932 0.7702
0.3443 24.5455 540 0.5809 0.4160 0.5809 0.7622
0.3443 24.6364 542 0.5874 0.4160 0.5874 0.7665
0.3443 24.7273 544 0.5919 0.4160 0.5919 0.7693
0.3443 24.8182 546 0.6044 0.4463 0.6044 0.7774
0.3443 24.9091 548 0.6346 0.4257 0.6346 0.7966
0.3443 25.0 550 0.7021 0.4628 0.7021 0.8379
0.3443 25.0909 552 0.7491 0.4628 0.7491 0.8655
0.3443 25.1818 554 0.7606 0.4217 0.7606 0.8721
0.3443 25.2727 556 0.7010 0.3936 0.7010 0.8373
0.3443 25.3636 558 0.6443 0.4013 0.6443 0.8027
0.3443 25.4545 560 0.6307 0.4535 0.6307 0.7942
0.3443 25.5455 562 0.6216 0.4535 0.6216 0.7884
0.3443 25.6364 564 0.6346 0.5173 0.6346 0.7966
0.3443 25.7273 566 0.6413 0.5442 0.6413 0.8008
0.3443 25.8182 568 0.6119 0.4743 0.6119 0.7822
0.3443 25.9091 570 0.5988 0.4547 0.5988 0.7738
0.3443 26.0 572 0.6045 0.4423 0.6045 0.7775
0.3443 26.0909 574 0.6070 0.4217 0.6070 0.7791
0.3443 26.1818 576 0.6294 0.4397 0.6294 0.7933
0.3443 26.2727 578 0.6681 0.4020 0.6681 0.8174
0.3443 26.3636 580 0.6766 0.4451 0.6766 0.8226
0.3443 26.4545 582 0.6441 0.3524 0.6441 0.8025
0.3443 26.5455 584 0.6193 0.3160 0.6193 0.7870
0.3443 26.6364 586 0.6190 0.3715 0.6190 0.7868
0.3443 26.7273 588 0.6279 0.3961 0.6279 0.7924
0.3443 26.8182 590 0.6836 0.4929 0.6836 0.8268
0.3443 26.9091 592 0.7925 0.4305 0.7925 0.8902
0.3443 27.0 594 0.7985 0.4481 0.7985 0.8936
0.3443 27.0909 596 0.7071 0.5023 0.7071 0.8409
0.3443 27.1818 598 0.6108 0.4881 0.6108 0.7815
0.3443 27.2727 600 0.5965 0.4 0.5965 0.7723
0.3443 27.3636 602 0.5999 0.4249 0.5999 0.7746
0.3443 27.4545 604 0.6022 0.4249 0.6022 0.7760
0.3443 27.5455 606 0.5842 0.4249 0.5842 0.7643
0.3443 27.6364 608 0.5719 0.4548 0.5719 0.7562
0.3443 27.7273 610 0.5683 0.4205 0.5683 0.7538
0.3443 27.8182 612 0.5701 0.4205 0.5701 0.7551
0.3443 27.9091 614 0.5746 0.4158 0.5746 0.7580
0.3443 28.0 616 0.5858 0.4937 0.5858 0.7654
0.3443 28.0909 618 0.5823 0.3786 0.5823 0.7631
0.3443 28.1818 620 0.5884 0.3859 0.5884 0.7671
0.3443 28.2727 622 0.5984 0.3781 0.5984 0.7736
0.3443 28.3636 624 0.6234 0.3267 0.6234 0.7896
0.3443 28.4545 626 0.6489 0.3737 0.6489 0.8056
0.3443 28.5455 628 0.6723 0.3737 0.6723 0.8199
0.3443 28.6364 630 0.6596 0.3127 0.6596 0.8121

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
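To reproduce results against the versions listed above, the environment could be pinned as follows (a sketch; the +cu118 build of torch must come from the matching CUDA wheel index rather than plain PyPI):

```text
# requirements.txt pinned to the framework versions above
transformers==4.44.2
torch==2.4.0
datasets==2.21.0
tokenizers==0.19.1
```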
Model size

  • 0.1B params (F32 tensors, Safetensors format)

Model tree for MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run3_AugV5_k6_task7_organization

  • Finetuned from aubmindlab/bert-base-arabertv02