Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6077
  • QWK (quadratic weighted kappa): 0.6282
  • MSE (mean squared error): 0.6077
  • RMSE (root mean squared error): 0.7796

Model description

More information needed

Intended uses & limitations

More information needed
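
Until more details are provided, a minimal loading sketch follows. It assumes the model exposes a single-output regression head (consistent with the reported evaluation loss equaling the MSE); the input sentence is a placeholder.

```python
# Minimal loading/inference sketch; the single-logit regression head is
# an assumption inferred from the card's metrics, not a documented fact.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask3_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

inputs = tokenizer("هذا نص تجريبي.", return_tensors="pt")  # placeholder Arabic text
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```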

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
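
As a rough illustration, these settings map onto a `transformers` `TrainingArguments` configuration as sketched below. The base-model setup and regression head are assumptions (the training script is not part of this card), and dataset loading is omitted.

```python
# Sketch of the training configuration implied by the hyperparameters
# above; model/dataset specifics are assumptions, not the actual script.
from transformers import (AutoModelForSequenceClassification,
                          TrainingArguments)

model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02",
    num_labels=1,                 # single regression output (MSE loss)
    problem_type="regression",
)

args = TrainingArguments(
    output_dir="out",
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",   # linear decay of the learning rate
    num_train_epochs=100,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
)
```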

Training results

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
No log 0.0194 2 4.1192 -0.0034 4.1192 2.0296
No log 0.0388 4 2.3037 0.1370 2.3037 1.5178
No log 0.0583 6 1.3084 0.1867 1.3084 1.1439
No log 0.0777 8 0.9093 0.3833 0.9093 0.9536
No log 0.0971 10 0.8730 0.3406 0.8730 0.9343
No log 0.1165 12 0.7960 0.3792 0.7960 0.8922
No log 0.1359 14 1.1182 0.3827 1.1182 1.0575
No log 0.1553 16 1.9350 0.1967 1.9350 1.3910
No log 0.1748 18 1.9856 0.2355 1.9856 1.4091
No log 0.1942 20 1.0820 0.4215 1.0820 1.0402
No log 0.2136 22 0.7313 0.5785 0.7313 0.8552
No log 0.2330 24 0.7043 0.5675 0.7043 0.8392
No log 0.2524 26 0.7690 0.5463 0.7690 0.8769
No log 0.2718 28 0.8365 0.5084 0.8365 0.9146
No log 0.2913 30 0.6734 0.5741 0.6734 0.8206
No log 0.3107 32 0.7220 0.5324 0.7220 0.8497
No log 0.3301 34 0.9394 0.5097 0.9394 0.9692
No log 0.3495 36 0.8643 0.4708 0.8643 0.9297
No log 0.3689 38 0.6632 0.5285 0.6632 0.8143
No log 0.3883 40 0.6363 0.5769 0.6363 0.7977
No log 0.4078 42 0.6449 0.5805 0.6449 0.8031
No log 0.4272 44 0.6235 0.5667 0.6235 0.7896
No log 0.4466 46 0.6327 0.5964 0.6327 0.7954
No log 0.4660 48 0.7242 0.6180 0.7242 0.8510
No log 0.4854 50 0.7348 0.6320 0.7348 0.8572
No log 0.5049 52 0.6803 0.6304 0.6803 0.8248
No log 0.5243 54 0.6661 0.6119 0.6661 0.8162
No log 0.5437 56 0.8287 0.5725 0.8287 0.9103
No log 0.5631 58 1.0345 0.4632 1.0345 1.0171
No log 0.5825 60 0.8967 0.4124 0.8967 0.9469
No log 0.6019 62 0.7047 0.5191 0.7047 0.8395
No log 0.6214 64 0.7130 0.5349 0.7130 0.8444
No log 0.6408 66 0.7121 0.5364 0.7121 0.8438
No log 0.6602 68 0.6977 0.5443 0.6977 0.8353
No log 0.6796 70 0.7002 0.5647 0.7002 0.8368
No log 0.6990 72 0.7333 0.5581 0.7333 0.8563
No log 0.7184 74 0.7797 0.5915 0.7797 0.8830
No log 0.7379 76 0.8048 0.5847 0.8048 0.8971
No log 0.7573 78 0.8657 0.5364 0.8657 0.9304
No log 0.7767 80 0.8643 0.5315 0.8643 0.9297
No log 0.7961 82 0.6885 0.5876 0.6885 0.8298
No log 0.8155 84 0.6523 0.6152 0.6523 0.8077
No log 0.8350 86 0.6308 0.6460 0.6308 0.7942
No log 0.8544 88 0.6144 0.6513 0.6144 0.7838
No log 0.8738 90 0.6318 0.6509 0.6318 0.7949
No log 0.8932 92 0.8111 0.6186 0.8111 0.9006
No log 0.9126 94 0.7917 0.6015 0.7917 0.8898
No log 0.9320 96 0.7864 0.5500 0.7864 0.8868
No log 0.9515 98 0.6391 0.6017 0.6391 0.7994
No log 0.9709 100 0.5968 0.6000 0.5968 0.7725
No log 0.9903 102 0.6184 0.5480 0.6184 0.7864
No log 1.0097 104 0.7621 0.5329 0.7621 0.8730
No log 1.0291 106 0.8598 0.4947 0.8598 0.9272
No log 1.0485 108 0.7507 0.5518 0.7507 0.8664
No log 1.0680 110 0.6615 0.5878 0.6615 0.8133
No log 1.0874 112 0.6187 0.5977 0.6187 0.7866
No log 1.1068 114 0.6026 0.6384 0.6026 0.7763
No log 1.1262 116 0.6180 0.6558 0.6180 0.7861
No log 1.1456 118 0.6208 0.6770 0.6208 0.7879
No log 1.1650 120 0.6107 0.6814 0.6107 0.7815
No log 1.1845 122 0.6028 0.6737 0.6028 0.7764
No log 1.2039 124 0.6662 0.6372 0.6662 0.8162
No log 1.2233 126 0.6803 0.6080 0.6803 0.8248
No log 1.2427 128 0.5943 0.6312 0.5943 0.7709
No log 1.2621 130 0.6678 0.6417 0.6678 0.8172
No log 1.2816 132 0.7741 0.6073 0.7741 0.8798
No log 1.3010 134 0.7595 0.5946 0.7595 0.8715
No log 1.3204 136 0.7045 0.6073 0.7045 0.8394
No log 1.3398 138 0.6380 0.6303 0.6380 0.7988
No log 1.3592 140 0.6106 0.6237 0.6106 0.7814
No log 1.3786 142 0.6297 0.6088 0.6297 0.7935
No log 1.3981 144 0.5903 0.6458 0.5903 0.7683
No log 1.4175 146 0.5976 0.6314 0.5976 0.7730
No log 1.4369 148 0.5871 0.6146 0.5871 0.7662
No log 1.4563 150 0.5869 0.6234 0.5869 0.7661
No log 1.4757 152 0.5886 0.6523 0.5886 0.7672
No log 1.4951 154 0.6086 0.6210 0.6086 0.7801
No log 1.5146 156 0.5772 0.6600 0.5772 0.7597
No log 1.5340 158 0.5808 0.6518 0.5808 0.7621
No log 1.5534 160 0.6076 0.6392 0.6076 0.7795
No log 1.5728 162 0.5860 0.6593 0.5860 0.7655
No log 1.5922 164 0.5878 0.6495 0.5878 0.7667
No log 1.6117 166 0.6099 0.6565 0.6099 0.7809
No log 1.6311 168 0.6081 0.6525 0.6081 0.7798
No log 1.6505 170 0.6463 0.6461 0.6463 0.8039
No log 1.6699 172 0.6290 0.6533 0.6290 0.7931
No log 1.6893 174 0.6641 0.6189 0.6641 0.8149
No log 1.7087 176 0.6506 0.6343 0.6506 0.8066
No log 1.7282 178 0.6968 0.6340 0.6968 0.8348
No log 1.7476 180 0.7027 0.6418 0.7027 0.8382
No log 1.7670 182 0.6492 0.6580 0.6492 0.8057
No log 1.7864 184 0.6350 0.6395 0.6350 0.7969
No log 1.8058 186 0.6238 0.6494 0.6238 0.7898
No log 1.8252 188 0.5996 0.6582 0.5996 0.7744
No log 1.8447 190 0.8196 0.6008 0.8196 0.9053
No log 1.8641 192 0.8920 0.6132 0.8920 0.9445
No log 1.8835 194 0.7247 0.6543 0.7247 0.8513
No log 1.9029 196 0.7352 0.6322 0.7352 0.8574
No log 1.9223 198 0.8917 0.6046 0.8917 0.9443
No log 1.9417 200 0.8855 0.5731 0.8855 0.9410
No log 1.9612 202 0.7278 0.6319 0.7278 0.8531
No log 1.9806 204 0.6247 0.5971 0.6247 0.7904
No log 2.0 206 0.6529 0.5627 0.6529 0.8080
No log 2.0194 208 0.6546 0.5752 0.6546 0.8090
No log 2.0388 210 0.6442 0.5686 0.6442 0.8026
No log 2.0583 212 0.6354 0.5906 0.6354 0.7971
No log 2.0777 214 0.6182 0.6062 0.6182 0.7863
No log 2.0971 216 0.6621 0.5930 0.6621 0.8137
No log 2.1165 218 0.7624 0.5322 0.7624 0.8732
No log 2.1359 220 0.7732 0.5263 0.7732 0.8793
No log 2.1553 222 0.6488 0.5926 0.6488 0.8055
No log 2.1748 224 0.5912 0.5988 0.5912 0.7689
No log 2.1942 226 0.6279 0.5876 0.6279 0.7924
No log 2.2136 228 0.7088 0.5496 0.7088 0.8419
No log 2.2330 230 0.6892 0.5711 0.6892 0.8302
No log 2.2524 232 0.5852 0.6732 0.5852 0.7650
No log 2.2718 234 0.5945 0.6633 0.5945 0.7710
No log 2.2913 236 0.7302 0.6369 0.7302 0.8545
No log 2.3107 238 0.7532 0.6317 0.7532 0.8679
No log 2.3301 240 0.6397 0.6787 0.6397 0.7998
No log 2.3495 242 0.6567 0.7030 0.6567 0.8104
No log 2.3689 244 0.7973 0.6286 0.7973 0.8929
No log 2.3883 246 0.7526 0.6539 0.7526 0.8675
No log 2.4078 248 0.5971 0.7002 0.5971 0.7727
No log 2.4272 250 0.6379 0.6600 0.6379 0.7987
No log 2.4466 252 0.7062 0.5864 0.7062 0.8404
No log 2.4660 254 0.6234 0.6287 0.6234 0.7895
No log 2.4854 256 0.5590 0.6189 0.5590 0.7477
No log 2.5049 258 0.5755 0.6378 0.5755 0.7586
No log 2.5243 260 0.5800 0.6596 0.5800 0.7616
No log 2.5437 262 0.6255 0.6854 0.6255 0.7909
No log 2.5631 264 0.6573 0.6572 0.6573 0.8107
No log 2.5825 266 0.6400 0.6905 0.6400 0.8000
No log 2.6019 268 0.6949 0.6502 0.6949 0.8336
No log 2.6214 270 0.7816 0.6083 0.7816 0.8841
No log 2.6408 272 0.7298 0.6404 0.7298 0.8543
No log 2.6602 274 0.6396 0.6893 0.6396 0.7997
No log 2.6796 276 0.6203 0.6736 0.6203 0.7876
No log 2.6990 278 0.6220 0.6732 0.6220 0.7887
No log 2.7184 280 0.6147 0.6693 0.6147 0.7840
No log 2.7379 282 0.6606 0.6485 0.6606 0.8128
No log 2.7573 284 0.7364 0.6339 0.7364 0.8581
No log 2.7767 286 0.7088 0.6443 0.7088 0.8419
No log 2.7961 288 0.6141 0.6709 0.6141 0.7836
No log 2.8155 290 0.6677 0.6406 0.6677 0.8171
No log 2.8350 292 0.6685 0.6396 0.6685 0.8176
No log 2.8544 294 0.7362 0.6199 0.7362 0.8580
No log 2.8738 296 0.7408 0.6142 0.7408 0.8607
No log 2.8932 298 0.6079 0.6536 0.6079 0.7797
No log 2.9126 300 0.6260 0.6533 0.6260 0.7912
No log 2.9320 302 0.7423 0.6333 0.7423 0.8615
No log 2.9515 304 0.7040 0.6455 0.7040 0.8390
No log 2.9709 306 0.5888 0.6874 0.5888 0.7673
No log 2.9903 308 0.6375 0.6466 0.6375 0.7984
No log 3.0097 310 0.6928 0.6304 0.6928 0.8324
No log 3.0291 312 0.6109 0.6571 0.6109 0.7816
No log 3.0485 314 0.5876 0.6794 0.5876 0.7665
No log 3.0680 316 0.6459 0.6505 0.6459 0.8037
No log 3.0874 318 0.6284 0.6611 0.6284 0.7927
No log 3.1068 320 0.6058 0.6806 0.6058 0.7783
No log 3.1262 322 0.6426 0.6777 0.6426 0.8016
No log 3.1456 324 0.6861 0.6472 0.6861 0.8283
No log 3.1650 326 0.7556 0.6701 0.7556 0.8692
No log 3.1845 328 0.7964 0.6635 0.7964 0.8924
No log 3.2039 330 0.7827 0.6715 0.7827 0.8847
No log 3.2233 332 0.7455 0.6585 0.7455 0.8634
No log 3.2427 334 0.6878 0.6602 0.6878 0.8294
No log 3.2621 336 0.6511 0.6784 0.6511 0.8069
No log 3.2816 338 0.6140 0.6637 0.6140 0.7836
No log 3.3010 340 0.6083 0.6474 0.6083 0.7799
No log 3.3204 342 0.6128 0.6388 0.6128 0.7828
No log 3.3398 344 0.6126 0.6434 0.6126 0.7827
No log 3.3592 346 0.6155 0.6474 0.6155 0.7845
No log 3.3786 348 0.6104 0.6729 0.6104 0.7813
No log 3.3981 350 0.6615 0.6457 0.6615 0.8133
No log 3.4175 352 0.6548 0.6492 0.6548 0.8092
No log 3.4369 354 0.6301 0.6436 0.6301 0.7938
No log 3.4563 356 0.6053 0.6598 0.6053 0.7780
No log 3.4757 358 0.6407 0.6393 0.6407 0.8004
No log 3.4951 360 0.6794 0.6077 0.6794 0.8243
No log 3.5146 362 0.6257 0.6248 0.6257 0.7910
No log 3.5340 364 0.6281 0.5876 0.6281 0.7925
No log 3.5534 366 0.6449 0.5968 0.6449 0.8031
No log 3.5728 368 0.6236 0.5805 0.6236 0.7897
No log 3.5922 370 0.6126 0.6095 0.6126 0.7827
No log 3.6117 372 0.6129 0.6611 0.6129 0.7829
No log 3.6311 374 0.6357 0.6427 0.6357 0.7973
No log 3.6505 376 0.6656 0.6566 0.6656 0.8159
No log 3.6699 378 0.6637 0.6720 0.6637 0.8147
No log 3.6893 380 0.7364 0.6608 0.7364 0.8581
No log 3.7087 382 0.7397 0.6523 0.7397 0.8600
No log 3.7282 384 0.7520 0.6365 0.7520 0.8672
No log 3.7476 386 0.6274 0.6333 0.6274 0.7921
No log 3.7670 388 0.5981 0.6504 0.5981 0.7734
No log 3.7864 390 0.5982 0.6408 0.5982 0.7734
No log 3.8058 392 0.6090 0.6217 0.6090 0.7804
No log 3.8252 394 0.6135 0.6349 0.6135 0.7832
No log 3.8447 396 0.6318 0.6305 0.6318 0.7948
No log 3.8641 398 0.6861 0.6293 0.6861 0.8283
No log 3.8835 400 0.6065 0.6426 0.6065 0.7788
No log 3.9029 402 0.6058 0.6667 0.6058 0.7783
No log 3.9223 404 0.6327 0.6627 0.6327 0.7955
No log 3.9417 406 0.6362 0.6493 0.6362 0.7976
No log 3.9612 408 0.6426 0.6401 0.6426 0.8016
No log 3.9806 410 0.7914 0.5957 0.7914 0.8896
No log 4.0 412 0.8371 0.5976 0.8371 0.9149
No log 4.0194 414 0.7091 0.6442 0.7091 0.8421
No log 4.0388 416 0.6988 0.6417 0.6988 0.8359
No log 4.0583 418 0.7976 0.5853 0.7976 0.8931
No log 4.0777 420 0.8029 0.6028 0.8029 0.8960
No log 4.0971 422 0.6993 0.6338 0.6993 0.8362
No log 4.1165 424 0.6162 0.6370 0.6162 0.7850
No log 4.1359 426 0.6479 0.6448 0.6479 0.8049
No log 4.1553 428 0.6687 0.6387 0.6687 0.8177
No log 4.1748 430 0.6281 0.6289 0.6281 0.7925
No log 4.1942 432 0.6371 0.6224 0.6371 0.7982
No log 4.2136 434 0.6360 0.6238 0.6360 0.7975
No log 4.2330 436 0.6360 0.6382 0.6360 0.7975
No log 4.2524 438 0.6343 0.6245 0.6343 0.7964
No log 4.2718 440 0.6394 0.6390 0.6394 0.7996
No log 4.2913 442 0.6507 0.6545 0.6507 0.8067
No log 4.3107 444 0.6692 0.6230 0.6692 0.8180
No log 4.3301 446 0.7074 0.6234 0.7074 0.8411
No log 4.3495 448 0.8249 0.6121 0.8249 0.9082
No log 4.3689 450 0.8294 0.5814 0.8294 0.9107
No log 4.3883 452 0.7326 0.6019 0.7326 0.8559
No log 4.4078 454 0.6883 0.6030 0.6883 0.8296
No log 4.4272 456 0.7121 0.6124 0.7121 0.8439
No log 4.4466 458 0.7412 0.6187 0.7412 0.8609
No log 4.4660 460 0.7029 0.6242 0.7029 0.8384
No log 4.4854 462 0.6921 0.6421 0.6921 0.8319
No log 4.5049 464 0.7015 0.6238 0.7015 0.8375
No log 4.5243 466 0.6942 0.6287 0.6942 0.8332
No log 4.5437 468 0.6965 0.6657 0.6965 0.8346
No log 4.5631 470 0.7020 0.6713 0.7020 0.8379
No log 4.5825 472 0.7073 0.6688 0.7073 0.8410
No log 4.6019 474 0.6873 0.6637 0.6873 0.8290
No log 4.6214 476 0.6779 0.6089 0.6779 0.8233
No log 4.6408 478 0.6721 0.6202 0.6721 0.8198
No log 4.6602 480 0.6417 0.6071 0.6417 0.8011
No log 4.6796 482 0.6739 0.6351 0.6739 0.8209
No log 4.6990 484 0.6549 0.6408 0.6549 0.8093
No log 4.7184 486 0.6912 0.6700 0.6912 0.8314
No log 4.7379 488 0.7080 0.6832 0.7080 0.8414
No log 4.7573 490 0.7041 0.6430 0.7041 0.8391
No log 4.7767 492 0.8231 0.5904 0.8231 0.9073
No log 4.7961 494 0.7945 0.6070 0.7945 0.8913
No log 4.8155 496 0.6772 0.6476 0.6772 0.8229
No log 4.8350 498 0.7101 0.6656 0.7101 0.8427
0.5062 4.8544 500 0.7249 0.6592 0.7249 0.8514
0.5062 4.8738 502 0.6342 0.6699 0.6342 0.7964
0.5062 4.8932 504 0.6648 0.6069 0.6648 0.8153
0.5062 4.9126 506 0.7607 0.5772 0.7607 0.8722
0.5062 4.9320 508 0.7314 0.5749 0.7314 0.8552
0.5062 4.9515 510 0.6299 0.6098 0.6299 0.7936
0.5062 4.9709 512 0.6077 0.6282 0.6077 0.7796

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1