Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask4_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the training dataset is not documented in this card). It achieves the following results on the evaluation set:

  • Loss: 0.8605
  • QWK: 0.6299
  • MSE: 0.8605
  • RMSE: 0.9276
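QWK here is, presumably, Cohen's quadratically weighted kappa, the standard agreement metric for ordinal scoring tasks such as essay assessment, alongside mean squared error and its square root. A self-contained pure-Python sketch of how these three metrics are typically computed (the labels below are toy values, not from this model's evaluation set):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    """Cohen's kappa with quadratic weights for ordinal labels 0..num_classes-1."""
    n = len(y_true)
    # observed confusion matrix
    observed = [[0.0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # marginal histograms of true and predicted labels
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(num_classes)) for j in range(num_classes)]
    num = den = 0.0
    for i in range(num_classes):
        for j in range(num_classes):
            w = (i - j) ** 2 / (num_classes - 1) ** 2  # quadratic disagreement weight
            num += w * observed[i][j]                   # weighted observed disagreement
            den += w * hist_true[i] * hist_pred[j] / n  # weighted chance disagreement
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Toy example with four score levels (0-3)
y_true = [0, 1, 2, 3, 2]
y_pred = [0, 1, 1, 3, 2]
print(quadratic_weighted_kappa(y_true, y_pred, num_classes=4))  # ≈ 0.906
print(mse(y_true, y_pred))                                      # 0.2
print(math.sqrt(mse(y_true, y_pred)))                           # ≈ 0.447
```

Note that the reported Loss and MSE are identical (0.8605), which suggests the model was trained with an MSE regression objective.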

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
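These hyperparameters map directly onto transformers.TrainingArguments; a minimal sketch of the equivalent configuration (output_dir is a hypothetical placeholder, and the dataset/Trainer wiring is omitted since the card does not document it):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert-task4-organization",  # hypothetical path, not from the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,          # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```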

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0187 2 4.1512 0.0204 4.1512 2.0374
No log 0.0374 4 2.7415 0.1026 2.7415 1.6557
No log 0.0561 6 1.7383 0.1586 1.7383 1.3184
No log 0.0748 8 1.0072 0.2544 1.0072 1.0036
No log 0.0935 10 1.2004 0.1056 1.2004 1.0956
No log 0.1121 12 1.5846 -0.1779 1.5846 1.2588
No log 0.1308 14 1.4274 -0.0300 1.4274 1.1947
No log 0.1495 16 1.2516 0.0450 1.2516 1.1187
No log 0.1682 18 1.3111 0.1512 1.3111 1.1451
No log 0.1869 20 1.1973 0.1270 1.1973 1.0942
No log 0.2056 22 0.9866 0.1811 0.9866 0.9933
No log 0.2243 24 0.8823 0.3492 0.8823 0.9393
No log 0.2430 26 1.1442 0.3337 1.1442 1.0697
No log 0.2617 28 1.0645 0.3763 1.0645 1.0317
No log 0.2804 30 0.8436 0.3521 0.8436 0.9185
No log 0.2991 32 0.9211 0.2925 0.9211 0.9597
No log 0.3178 34 1.0492 0.2193 1.0492 1.0243
No log 0.3364 36 1.0401 0.1966 1.0401 1.0198
No log 0.3551 38 0.9345 0.2658 0.9345 0.9667
No log 0.3738 40 0.7994 0.3210 0.7994 0.8941
No log 0.3925 42 0.7436 0.4083 0.7436 0.8623
No log 0.4112 44 0.8024 0.4110 0.8024 0.8957
No log 0.4299 46 0.7496 0.4252 0.7496 0.8658
No log 0.4486 48 0.7613 0.3729 0.7613 0.8725
No log 0.4673 50 0.9850 0.2389 0.9850 0.9925
No log 0.4860 52 1.0449 0.1985 1.0449 1.0222
No log 0.5047 54 0.9138 0.2617 0.9138 0.9559
No log 0.5234 56 0.7373 0.3942 0.7373 0.8587
No log 0.5421 58 0.6902 0.4912 0.6902 0.8308
No log 0.5607 60 0.6784 0.5205 0.6784 0.8237
No log 0.5794 62 0.6504 0.5030 0.6504 0.8065
No log 0.5981 64 0.6835 0.4278 0.6835 0.8267
No log 0.6168 66 0.8242 0.4585 0.8242 0.9079
No log 0.6355 68 0.8274 0.4728 0.8274 0.9096
No log 0.6542 70 0.7348 0.5384 0.7348 0.8572
No log 0.6729 72 0.6150 0.6150 0.6150 0.7842
No log 0.6916 74 0.6306 0.6209 0.6306 0.7941
No log 0.7103 76 0.6165 0.6355 0.6165 0.7852
No log 0.7290 78 0.6091 0.6439 0.6091 0.7804
No log 0.7477 80 0.6370 0.5982 0.6370 0.7981
No log 0.7664 82 0.6908 0.4930 0.6908 0.8311
No log 0.7850 84 0.7874 0.5079 0.7874 0.8874
No log 0.8037 86 0.8471 0.4342 0.8471 0.9204
No log 0.8224 88 0.7772 0.4994 0.7772 0.8816
No log 0.8411 90 0.7175 0.4699 0.7175 0.8471
No log 0.8598 92 0.6817 0.4579 0.6817 0.8257
No log 0.8785 94 0.6611 0.4606 0.6611 0.8131
No log 0.8972 96 0.6536 0.4471 0.6536 0.8084
No log 0.9159 98 0.6371 0.4465 0.6371 0.7982
No log 0.9346 100 0.6251 0.4640 0.6251 0.7907
No log 0.9533 102 0.6222 0.4949 0.6222 0.7888
No log 0.9720 104 0.6430 0.5249 0.6430 0.8019
No log 0.9907 106 0.7397 0.5294 0.7397 0.8601
No log 1.0093 108 0.8386 0.4899 0.8386 0.9157
No log 1.0280 110 0.7692 0.5441 0.7692 0.8771
No log 1.0467 112 0.6749 0.5931 0.6749 0.8215
No log 1.0654 114 0.6820 0.6068 0.6820 0.8258
No log 1.0841 116 0.7510 0.5562 0.7510 0.8666
No log 1.1028 118 0.7368 0.5659 0.7368 0.8584
No log 1.1215 120 0.6435 0.5915 0.6435 0.8022
No log 1.1402 122 0.6140 0.6072 0.6140 0.7836
No log 1.1589 124 0.6646 0.5826 0.6646 0.8152
No log 1.1776 126 0.6892 0.5438 0.6892 0.8302
No log 1.1963 128 0.6873 0.4937 0.6873 0.8290
No log 1.2150 130 0.7477 0.4530 0.7477 0.8647
No log 1.2336 132 0.7320 0.4505 0.7320 0.8556
No log 1.2523 134 0.6654 0.5147 0.6654 0.8157
No log 1.2710 136 0.5774 0.5986 0.5774 0.7599
No log 1.2897 138 0.5777 0.6351 0.5777 0.7601
No log 1.3084 140 0.5718 0.6751 0.5718 0.7562
No log 1.3271 142 0.5408 0.6649 0.5408 0.7354
No log 1.3458 144 0.5579 0.6395 0.5579 0.7469
No log 1.3645 146 0.5984 0.6337 0.5984 0.7736
No log 1.3832 148 0.5985 0.6058 0.5985 0.7737
No log 1.4019 150 0.5921 0.5728 0.5921 0.7695
No log 1.4206 152 0.5832 0.6091 0.5832 0.7637
No log 1.4393 154 0.5952 0.5947 0.5952 0.7715
No log 1.4579 156 0.6563 0.5726 0.6563 0.8101
No log 1.4766 158 0.9074 0.5181 0.9074 0.9526
No log 1.4953 160 1.0554 0.4443 1.0554 1.0274
No log 1.5140 162 0.9148 0.5049 0.9148 0.9565
No log 1.5327 164 0.7863 0.5285 0.7863 0.8868
No log 1.5514 166 0.7000 0.5515 0.7000 0.8366
No log 1.5701 168 0.6121 0.5881 0.6121 0.7824
No log 1.5888 170 0.5812 0.6263 0.5812 0.7624
No log 1.6075 172 0.5953 0.6218 0.5953 0.7715
No log 1.6262 174 0.6050 0.6376 0.6050 0.7778
No log 1.6449 176 0.5710 0.6437 0.5710 0.7556
No log 1.6636 178 0.6117 0.5954 0.6117 0.7821
No log 1.6822 180 0.7070 0.5410 0.7070 0.8408
No log 1.7009 182 0.7082 0.5650 0.7082 0.8415
No log 1.7196 184 0.7070 0.5529 0.7070 0.8408
No log 1.7383 186 0.6388 0.6263 0.6388 0.7992
No log 1.7570 188 0.5777 0.6550 0.5777 0.7601
No log 1.7757 190 0.5813 0.6684 0.5813 0.7624
No log 1.7944 192 0.5962 0.6580 0.5962 0.7722
No log 1.8131 194 0.5893 0.6545 0.5893 0.7676
No log 1.8318 196 0.5882 0.6819 0.5882 0.7670
No log 1.8505 198 0.6728 0.6336 0.6728 0.8202
No log 1.8692 200 0.7996 0.5602 0.7996 0.8942
No log 1.8879 202 0.7845 0.5578 0.7845 0.8857
No log 1.9065 204 0.6816 0.6133 0.6816 0.8256
No log 1.9252 206 0.6615 0.5877 0.6615 0.8133
No log 1.9439 208 0.6647 0.5862 0.6647 0.8153
No log 1.9626 210 0.6940 0.5870 0.6940 0.8331
No log 1.9813 212 0.6683 0.6313 0.6683 0.8175
No log 2.0 214 0.6509 0.6078 0.6509 0.8068
No log 2.0187 216 0.6518 0.6058 0.6518 0.8074
No log 2.0374 218 0.6855 0.5747 0.6855 0.8279
No log 2.0561 220 0.7135 0.4728 0.7135 0.8447
No log 2.0748 222 0.6786 0.5292 0.6786 0.8238
No log 2.0935 224 0.6634 0.5416 0.6634 0.8145
No log 2.1121 226 0.6797 0.5429 0.6797 0.8244
No log 2.1308 228 0.6489 0.5825 0.6489 0.8056
No log 2.1495 230 0.5903 0.6384 0.5903 0.7683
No log 2.1682 232 0.5637 0.6620 0.5637 0.7508
No log 2.1869 234 0.5728 0.6534 0.5728 0.7568
No log 2.2056 236 0.5494 0.6737 0.5494 0.7412
No log 2.2243 238 0.5402 0.6748 0.5402 0.7350
No log 2.2430 240 0.5496 0.6682 0.5496 0.7413
No log 2.2617 242 0.6096 0.6528 0.6096 0.7808
No log 2.2804 244 0.5963 0.6451 0.5963 0.7722
No log 2.2991 246 0.6179 0.6440 0.6179 0.7860
No log 2.3178 248 0.6334 0.6498 0.6334 0.7959
No log 2.3364 250 0.5781 0.6595 0.5781 0.7604
No log 2.3551 252 0.5603 0.6678 0.5603 0.7486
No log 2.3738 254 0.5936 0.6826 0.5936 0.7705
No log 2.3925 256 0.6564 0.6490 0.6564 0.8102
No log 2.4112 258 0.6203 0.6759 0.6203 0.7876
No log 2.4299 260 0.5515 0.6702 0.5515 0.7426
No log 2.4486 262 0.5436 0.6593 0.5436 0.7373
No log 2.4673 264 0.5423 0.6548 0.5423 0.7364
No log 2.4860 266 0.5716 0.6675 0.5716 0.7560
No log 2.5047 268 0.6823 0.6530 0.6823 0.8260
No log 2.5234 270 0.7107 0.6519 0.7107 0.8431
No log 2.5421 272 0.5959 0.6558 0.5959 0.7719
No log 2.5607 274 0.5796 0.6698 0.5796 0.7613
No log 2.5794 276 0.6370 0.6523 0.6370 0.7981
No log 2.5981 278 0.7433 0.6025 0.7433 0.8621
No log 2.6168 280 0.7558 0.6108 0.7558 0.8694
No log 2.6355 282 0.7214 0.6108 0.7214 0.8494
No log 2.6542 284 0.7315 0.6048 0.7315 0.8553
No log 2.6729 286 0.6519 0.6052 0.6519 0.8074
No log 2.6916 288 0.6372 0.6161 0.6372 0.7983
No log 2.7103 290 0.5989 0.6203 0.5989 0.7739
No log 2.7290 292 0.5735 0.6688 0.5735 0.7573
No log 2.7477 294 0.5960 0.6358 0.5960 0.7720
No log 2.7664 296 0.5946 0.6724 0.5946 0.7711
No log 2.7850 298 0.5735 0.6691 0.5735 0.7573
No log 2.8037 300 0.5592 0.6552 0.5592 0.7478
No log 2.8224 302 0.5555 0.6607 0.5555 0.7453
No log 2.8411 304 0.5566 0.6782 0.5566 0.7461
No log 2.8598 306 0.5792 0.6756 0.5792 0.7611
No log 2.8785 308 0.5419 0.6908 0.5419 0.7362
No log 2.8972 310 0.5292 0.6782 0.5292 0.7275
No log 2.9159 312 0.5270 0.6780 0.5270 0.7259
No log 2.9346 314 0.5312 0.6754 0.5312 0.7289
No log 2.9533 316 0.5304 0.6739 0.5304 0.7283
No log 2.9720 318 0.5550 0.6516 0.5550 0.7450
No log 2.9907 320 0.5491 0.6771 0.5491 0.7410
No log 3.0093 322 0.5870 0.6973 0.5870 0.7661
No log 3.0280 324 0.7666 0.6591 0.7666 0.8756
No log 3.0467 326 0.8554 0.6114 0.8554 0.9249
No log 3.0654 328 0.7249 0.6787 0.7249 0.8514
No log 3.0841 330 0.6079 0.6918 0.6079 0.7797
No log 3.1028 332 0.5624 0.7101 0.5624 0.7500
No log 3.1215 334 0.5671 0.7236 0.5671 0.7530
No log 3.1402 336 0.5648 0.7048 0.5648 0.7516
No log 3.1589 338 0.6356 0.6726 0.6356 0.7972
No log 3.1776 340 0.6706 0.6708 0.6706 0.8189
No log 3.1963 342 0.5656 0.6760 0.5656 0.7520
No log 3.2150 344 0.5417 0.6890 0.5417 0.7360
No log 3.2336 346 0.5333 0.6845 0.5333 0.7302
No log 3.2523 348 0.5260 0.6806 0.5260 0.7253
No log 3.2710 350 0.5301 0.6929 0.5301 0.7281
No log 3.2897 352 0.5189 0.6823 0.5189 0.7204
No log 3.3084 354 0.5236 0.6759 0.5236 0.7236
No log 3.3271 356 0.5187 0.7040 0.5187 0.7202
No log 3.3458 358 0.5426 0.6881 0.5426 0.7366
No log 3.3645 360 0.5326 0.7044 0.5326 0.7298
No log 3.3832 362 0.5378 0.6882 0.5378 0.7334
No log 3.4019 364 0.6033 0.6687 0.6033 0.7767
No log 3.4206 366 0.6317 0.6429 0.6317 0.7948
No log 3.4393 368 0.7011 0.6028 0.7011 0.8373
No log 3.4579 370 0.7362 0.5822 0.7362 0.8580
No log 3.4766 372 0.6642 0.6238 0.6642 0.8150
No log 3.4953 374 0.5618 0.6708 0.5618 0.7495
No log 3.5140 376 0.5598 0.6805 0.5598 0.7482
No log 3.5327 378 0.6239 0.6613 0.6239 0.7899
No log 3.5514 380 0.6843 0.6230 0.6843 0.8272
No log 3.5701 382 0.6110 0.6382 0.6110 0.7817
No log 3.5888 384 0.5550 0.6741 0.5550 0.7450
No log 3.6075 386 0.5483 0.6713 0.5483 0.7405
No log 3.6262 388 0.5898 0.6342 0.5898 0.7680
No log 3.6449 390 0.7250 0.5979 0.7250 0.8514
No log 3.6636 392 0.7286 0.6138 0.7286 0.8536
No log 3.6822 394 0.6173 0.6593 0.6173 0.7857
No log 3.7009 396 0.5660 0.6565 0.5660 0.7523
No log 3.7196 398 0.5415 0.6697 0.5415 0.7359
No log 3.7383 400 0.5400 0.6625 0.5400 0.7348
No log 3.7570 402 0.6175 0.6685 0.6175 0.7858
No log 3.7757 404 0.6847 0.6376 0.6847 0.8275
No log 3.7944 406 0.6731 0.6374 0.6731 0.8204
No log 3.8131 408 0.6506 0.6466 0.6506 0.8066
No log 3.8318 410 0.5749 0.6617 0.5749 0.7582
No log 3.8505 412 0.5310 0.6818 0.5310 0.7287
No log 3.8692 414 0.5405 0.6732 0.5405 0.7352
No log 3.8879 416 0.5963 0.6710 0.5963 0.7722
No log 3.9065 418 0.6389 0.6558 0.6389 0.7993
No log 3.9252 420 0.5552 0.6788 0.5552 0.7451
No log 3.9439 422 0.5279 0.6853 0.5279 0.7266
No log 3.9626 424 0.5866 0.6651 0.5866 0.7659
No log 3.9813 426 0.6537 0.6389 0.6537 0.8085
No log 4.0 428 0.6126 0.6489 0.6126 0.7827
No log 4.0187 430 0.5334 0.7000 0.5334 0.7303
No log 4.0374 432 0.6202 0.6410 0.6202 0.7875
No log 4.0561 434 0.8063 0.6155 0.8063 0.8979
No log 4.0748 436 0.8806 0.5920 0.8806 0.9384
No log 4.0935 438 0.7559 0.6236 0.7559 0.8695
No log 4.1121 440 0.6550 0.6586 0.6550 0.8093
No log 4.1308 442 0.6897 0.6693 0.6897 0.8305
No log 4.1495 444 0.6787 0.6682 0.6787 0.8238
No log 4.1682 446 0.6326 0.6971 0.6326 0.7954
No log 4.1869 448 0.6310 0.6866 0.6310 0.7944
No log 4.2056 450 0.6551 0.6812 0.6551 0.8094
No log 4.2243 452 0.6345 0.6661 0.6345 0.7965
No log 4.2430 454 0.6768 0.6372 0.6768 0.8227
No log 4.2617 456 0.6700 0.6233 0.6700 0.8185
No log 4.2804 458 0.6310 0.6494 0.6310 0.7944
No log 4.2991 460 0.5936 0.6593 0.5936 0.7705
No log 4.3178 462 0.6004 0.6686 0.6004 0.7748
No log 4.3364 464 0.6189 0.6683 0.6189 0.7867
No log 4.3551 466 0.6444 0.6800 0.6444 0.8027
No log 4.3738 468 0.6662 0.6700 0.6662 0.8162
No log 4.3925 470 0.6395 0.6770 0.6395 0.7997
No log 4.4112 472 0.7194 0.6909 0.7194 0.8482
No log 4.4299 474 0.9093 0.6407 0.9093 0.9536
No log 4.4486 476 0.9676 0.6270 0.9676 0.9837
No log 4.4673 478 0.8043 0.6763 0.8043 0.8968
No log 4.4860 480 0.6564 0.6648 0.6564 0.8102
No log 4.5047 482 0.6184 0.6809 0.6184 0.7864
No log 4.5234 484 0.5990 0.6315 0.5990 0.7740
No log 4.5421 486 0.5745 0.6519 0.5745 0.7579
No log 4.5607 488 0.6115 0.6688 0.6115 0.7820
No log 4.5794 490 0.6926 0.6472 0.6926 0.8322
No log 4.5981 492 0.6485 0.6695 0.6485 0.8053
No log 4.6168 494 0.5874 0.7138 0.5874 0.7664
No log 4.6355 496 0.5765 0.7000 0.5765 0.7593
No log 4.6542 498 0.5817 0.6808 0.5817 0.7627
0.5458 4.6729 500 0.6435 0.6837 0.6435 0.8022
0.5458 4.6916 502 0.7844 0.6395 0.7844 0.8856
0.5458 4.7103 504 0.7559 0.6373 0.7559 0.8694
0.5458 4.7290 506 0.6251 0.7013 0.6251 0.7906
0.5458 4.7477 508 0.5980 0.6673 0.5980 0.7733
0.5458 4.7664 510 0.6180 0.6700 0.6180 0.7861
0.5458 4.7850 512 0.5997 0.6611 0.5997 0.7744
0.5458 4.8037 514 0.6298 0.6834 0.6298 0.7936
0.5458 4.8224 516 0.8605 0.6299 0.8605 0.9276

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
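The card does not document an inference recipe; assuming the checkpoint carries a standard sequence-classification head (which the single MSE-style loss suggests), it can presumably be loaded like any Transformers classification/regression checkpoint. A hedged sketch (requires downloading the model):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Head type and label count are assumptions; the card does not document them.
repo = "MayBashendy/Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask4_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

inputs = tokenizer("نص تجريبي للتقييم", return_tensors="pt")  # "a sample text to score"
with torch.no_grad():
    logits = model(**inputs).logits  # predicted score(s) for the essay trait
print(logits)
```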
