Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask6_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the training dataset is not specified in the card). It achieves the following results on the evaluation set:

  • Loss: 0.7787
  • Qwk: 0.5654 (quadratic weighted kappa)
  • Mse: 0.7787
  • Rmse: 0.8824
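
Two details worth noting: Loss and Mse are identical on every logged row, which suggests an MSE regression objective, and Qwk is presumably computed after mapping predictions back to discrete scores. A minimal sketch of how the three metrics relate, using hypothetical scores (not the model's actual outputs):

```python
# Sketch of the reported metrics, using hypothetical integer scores;
# these are NOT the model's actual predictions.
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = [3, 1, 2, 4, 2]  # hypothetical gold organization scores
y_pred = [2, 1, 2, 3, 3]  # hypothetical (rounded) model predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = mse ** 0.5                                             # Rmse = sqrt(Mse)
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```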

Model description

More information needed

Intended uses & limitations

More information needed
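
In the absence of documented usage, here is a minimal loading sketch. It assumes the checkpoint exposes a standard sequence-classification head with a single regression output; the MSE-based training loss suggests this, but the card does not confirm it:

```python
# Minimal loading/inference sketch; the single-output regression head is
# an assumption, not documented by the card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask6_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

text = "..."  # placeholder: an Arabic essay/response to score
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)  # predicted (continuous) organization score
```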

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (sketched as a TrainingArguments configuration after the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
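
These settings map onto transformers.TrainingArguments roughly as follows; output_dir is hypothetical, and the eval/logging cadence is inferred from the results table (evaluation every 2 steps, training loss logged every 500 steps), not stated in the card:

```python
# Rough reconstruction of the listed hyperparameters; output_dir and the
# eval/logging cadence are assumptions inferred from the log table.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="arabert-task6-organization",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",  # "evaluation_strategy" on older transformers
    eval_steps=2,
    logging_steps=500,
)
```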

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0194 2 4.1398 -0.0389 4.1398 2.0346
No log 0.0388 4 2.4118 0.1134 2.4118 1.5530
No log 0.0583 6 1.3055 0.0436 1.3055 1.1426
No log 0.0777 8 1.0944 0.1108 1.0944 1.0461
No log 0.0971 10 1.1481 0.0392 1.1481 1.0715
No log 0.1165 12 1.1043 0.0830 1.1043 1.0508
No log 0.1359 14 1.0030 0.1708 1.0030 1.0015
No log 0.1553 16 1.0923 0.1790 1.0923 1.0452
No log 0.1748 18 1.3665 0.0950 1.3665 1.1690
No log 0.1942 20 1.8233 0.1272 1.8233 1.3503
No log 0.2136 22 1.5082 0.1460 1.5082 1.2281
No log 0.2330 24 0.9831 0.1995 0.9831 0.9915
No log 0.2524 26 0.8261 0.4099 0.8261 0.9089
No log 0.2718 28 0.8261 0.3575 0.8261 0.9089
No log 0.2913 30 0.8932 0.2857 0.8932 0.9451
No log 0.3107 32 0.8670 0.3369 0.8670 0.9312
No log 0.3301 34 0.8270 0.3417 0.8270 0.9094
No log 0.3495 36 0.9090 0.2602 0.9090 0.9534
No log 0.3689 38 0.9472 0.2421 0.9472 0.9733
No log 0.3883 40 0.8775 0.2942 0.8775 0.9368
No log 0.4078 42 0.8436 0.3807 0.8436 0.9185
No log 0.4272 44 0.8100 0.4609 0.8100 0.9000
No log 0.4466 46 0.7906 0.4523 0.7906 0.8892
No log 0.4660 48 0.7712 0.4621 0.7712 0.8782
No log 0.4854 50 0.7721 0.5092 0.7721 0.8787
No log 0.5049 52 0.8385 0.4944 0.8385 0.9157
No log 0.5243 54 0.7542 0.4831 0.7542 0.8685
No log 0.5437 56 0.7422 0.4860 0.7422 0.8615
No log 0.5631 58 0.7727 0.5050 0.7727 0.8790
No log 0.5825 60 0.7520 0.5199 0.7520 0.8672
No log 0.6019 62 0.7061 0.4864 0.7061 0.8403
No log 0.6214 64 0.6897 0.5162 0.6897 0.8305
No log 0.6408 66 0.6958 0.5665 0.6958 0.8342
No log 0.6602 68 0.7106 0.5923 0.7106 0.8430
No log 0.6796 70 0.7961 0.5559 0.7961 0.8922
No log 0.6990 72 0.7921 0.5379 0.7921 0.8900
No log 0.7184 74 0.6771 0.6051 0.6771 0.8229
No log 0.7379 76 0.6842 0.6139 0.6842 0.8272
No log 0.7573 78 0.8096 0.5321 0.8096 0.8998
No log 0.7767 80 0.7182 0.5784 0.7182 0.8475
No log 0.7961 82 0.6674 0.5473 0.6674 0.8169
No log 0.8155 84 0.6881 0.4986 0.6881 0.8295
No log 0.8350 86 0.7020 0.4711 0.7020 0.8379
No log 0.8544 88 0.6864 0.4736 0.6864 0.8285
No log 0.8738 90 0.6741 0.4946 0.6741 0.8210
No log 0.8932 92 0.7005 0.5042 0.7005 0.8370
No log 0.9126 94 0.6847 0.5370 0.6847 0.8275
No log 0.9320 96 0.6566 0.5813 0.6566 0.8103
No log 0.9515 98 0.7424 0.5699 0.7424 0.8616
No log 0.9709 100 0.7828 0.5583 0.7828 0.8848
No log 0.9903 102 0.7411 0.6108 0.7411 0.8609
No log 1.0097 104 0.6652 0.5963 0.6652 0.8156
No log 1.0291 106 0.7123 0.5957 0.7123 0.8440
No log 1.0485 108 0.6905 0.5717 0.6905 0.8310
No log 1.0680 110 0.7199 0.5599 0.7199 0.8485
No log 1.0874 112 0.7659 0.5150 0.7659 0.8751
No log 1.1068 114 0.8183 0.4780 0.8183 0.9046
No log 1.1262 116 0.8503 0.5263 0.8503 0.9221
No log 1.1456 118 0.7778 0.5757 0.7778 0.8819
No log 1.1650 120 0.7302 0.5717 0.7302 0.8545
No log 1.1845 122 0.7102 0.5536 0.7102 0.8427
No log 1.2039 124 0.7605 0.5712 0.7605 0.8721
No log 1.2233 126 0.7759 0.5801 0.7759 0.8808
No log 1.2427 128 0.7304 0.5494 0.7304 0.8546
No log 1.2621 130 0.7349 0.5590 0.7349 0.8572
No log 1.2816 132 0.8223 0.5703 0.8223 0.9068
No log 1.3010 134 0.8822 0.5561 0.8822 0.9393
No log 1.3204 136 0.8066 0.5533 0.8066 0.8981
No log 1.3398 138 0.7047 0.5289 0.7047 0.8395
No log 1.3592 140 0.7007 0.5320 0.7007 0.8371
No log 1.3786 142 0.7120 0.5448 0.7120 0.8438
No log 1.3981 144 0.7210 0.5613 0.7210 0.8491
No log 1.4175 146 0.7232 0.5832 0.7232 0.8504
No log 1.4369 148 0.6947 0.5592 0.6947 0.8335
No log 1.4563 150 0.7419 0.5628 0.7419 0.8613
No log 1.4757 152 0.7313 0.5632 0.7313 0.8551
No log 1.4951 154 0.6618 0.5887 0.6618 0.8135
No log 1.5146 156 0.6953 0.6116 0.6953 0.8339
No log 1.5340 158 0.9191 0.5247 0.9191 0.9587
No log 1.5534 160 0.9546 0.4989 0.9546 0.9771
No log 1.5728 162 0.8426 0.5286 0.8426 0.9179
No log 1.5922 164 0.6930 0.5334 0.6930 0.8325
No log 1.6117 166 0.6666 0.5606 0.6666 0.8165
No log 1.6311 168 0.6300 0.5818 0.6300 0.7938
No log 1.6505 170 0.6508 0.5780 0.6508 0.8067
No log 1.6699 172 0.6518 0.5490 0.6518 0.8074
No log 1.6893 174 0.6323 0.5367 0.6323 0.7952
No log 1.7087 176 0.6477 0.5802 0.6477 0.8048
No log 1.7282 178 0.6462 0.5920 0.6462 0.8038
No log 1.7476 180 0.6561 0.5741 0.6561 0.8100
No log 1.7670 182 0.7697 0.6046 0.7697 0.8773
No log 1.7864 184 0.8001 0.6005 0.8001 0.8945
No log 1.8058 186 0.7196 0.6110 0.7196 0.8483
No log 1.8252 188 0.6393 0.5929 0.6393 0.7995
No log 1.8447 190 0.6533 0.6123 0.6533 0.8083
No log 1.8641 192 0.6579 0.6086 0.6579 0.8111
No log 1.8835 194 0.6164 0.5921 0.6164 0.7851
No log 1.9029 196 0.6413 0.5682 0.6413 0.8008
No log 1.9223 198 0.6834 0.5697 0.6834 0.8267
No log 1.9417 200 0.7164 0.5635 0.7164 0.8464
No log 1.9612 202 0.6886 0.5666 0.6886 0.8298
No log 1.9806 204 0.6064 0.6280 0.6064 0.7787
No log 2.0000 206 0.5926 0.6548 0.5926 0.7698
No log 2.0194 208 0.5932 0.6393 0.5932 0.7702
No log 2.0388 210 0.5971 0.6548 0.5971 0.7727
No log 2.0583 212 0.5999 0.6486 0.5999 0.7746
No log 2.0777 214 0.6027 0.6254 0.6027 0.7763
No log 2.0971 216 0.6076 0.5975 0.6076 0.7795
No log 2.1165 218 0.6212 0.5822 0.6212 0.7881
No log 2.1359 220 0.6305 0.5698 0.6305 0.7940
No log 2.1553 222 0.6855 0.5703 0.6855 0.8280
No log 2.1748 224 0.8625 0.5766 0.8625 0.9287
No log 2.1942 226 0.9249 0.5539 0.9249 0.9617
No log 2.2136 228 0.8289 0.5884 0.8289 0.9104
No log 2.2330 230 0.8414 0.5682 0.8414 0.9173
No log 2.2524 232 0.7675 0.5338 0.7675 0.8761
No log 2.2718 234 0.6717 0.5645 0.6717 0.8196
No log 2.2913 236 0.6863 0.5746 0.6863 0.8284
No log 2.3107 238 0.6852 0.5485 0.6852 0.8278
No log 2.3301 240 0.6705 0.5415 0.6705 0.8189
No log 2.3495 242 0.6500 0.5683 0.6500 0.8062
No log 2.3689 244 0.6817 0.5696 0.6817 0.8257
No log 2.3883 246 0.7063 0.6080 0.7063 0.8404
No log 2.4078 248 0.6812 0.5949 0.6812 0.8253
No log 2.4272 250 0.6831 0.6034 0.6831 0.8265
No log 2.4466 252 0.6818 0.6023 0.6818 0.8257
No log 2.4660 254 0.7203 0.6038 0.7203 0.8487
No log 2.4854 256 0.9110 0.5799 0.9110 0.9545
No log 2.5049 258 0.8855 0.5905 0.8855 0.9410
No log 2.5243 260 0.7245 0.5627 0.7245 0.8511
No log 2.5437 262 0.6622 0.5661 0.6622 0.8137
No log 2.5631 264 0.6680 0.5595 0.6680 0.8173
No log 2.5825 266 0.6614 0.5831 0.6614 0.8133
No log 2.6019 268 0.6591 0.5829 0.6591 0.8118
No log 2.6214 270 0.7108 0.6056 0.7108 0.8431
No log 2.6408 272 0.8231 0.6011 0.8231 0.9072
No log 2.6602 274 0.8070 0.6082 0.8070 0.8983
No log 2.6796 276 0.7720 0.6280 0.7720 0.8786
No log 2.6990 278 0.7685 0.6311 0.7685 0.8767
No log 2.7184 280 0.7567 0.6190 0.7567 0.8699
No log 2.7379 282 0.7035 0.6159 0.7035 0.8388
No log 2.7573 284 0.6554 0.6266 0.6554 0.8095
No log 2.7767 286 0.7966 0.6157 0.7966 0.8925
No log 2.7961 288 0.7744 0.5999 0.7744 0.8800
No log 2.8155 290 0.6516 0.5759 0.6516 0.8072
No log 2.8350 292 0.5881 0.6267 0.5881 0.7669
No log 2.8544 294 0.5767 0.6298 0.5767 0.7594
No log 2.8738 296 0.6168 0.6396 0.6168 0.7854
No log 2.8932 298 0.7194 0.6499 0.7194 0.8482
No log 2.9126 300 0.7972 0.6364 0.7972 0.8928
No log 2.9320 302 0.8105 0.6366 0.8105 0.9003
No log 2.9515 304 0.8309 0.6272 0.8309 0.9116
No log 2.9709 306 0.7306 0.6388 0.7306 0.8548
No log 2.9903 308 0.6859 0.6526 0.6859 0.8282
No log 3.0097 310 0.6775 0.6338 0.6775 0.8231
No log 3.0291 312 0.6616 0.6199 0.6616 0.8134
No log 3.0485 314 0.6532 0.6228 0.6532 0.8082
No log 3.0680 316 0.6908 0.5683 0.6908 0.8311
No log 3.0874 318 0.7051 0.5844 0.7051 0.8397
No log 3.1068 320 0.7269 0.5630 0.7269 0.8526
No log 3.1262 322 0.8145 0.5442 0.8145 0.9025
No log 3.1456 324 0.8807 0.5367 0.8807 0.9385
No log 3.1650 326 0.8901 0.5369 0.8901 0.9435
No log 3.1845 328 0.8131 0.5455 0.8131 0.9017
No log 3.2039 330 0.7905 0.5905 0.7905 0.8891
No log 3.2233 332 0.7808 0.6055 0.7808 0.8836
No log 3.2427 334 0.7820 0.5866 0.7820 0.8843
No log 3.2621 336 0.8155 0.5723 0.8155 0.9030
No log 3.2816 338 0.7594 0.5958 0.7594 0.8715
No log 3.3010 340 0.6753 0.6375 0.6753 0.8218
No log 3.3204 342 0.6548 0.6422 0.6548 0.8092
No log 3.3398 344 0.6845 0.6152 0.6845 0.8274
No log 3.3592 346 0.7110 0.5939 0.7110 0.8432
No log 3.3786 348 0.7942 0.5778 0.7942 0.8912
No log 3.3981 350 0.7871 0.5823 0.7871 0.8872
No log 3.4175 352 0.7338 0.6090 0.7338 0.8566
No log 3.4369 354 0.6802 0.6577 0.6802 0.8247
No log 3.4563 356 0.6701 0.6750 0.6701 0.8186
No log 3.4757 358 0.6648 0.6733 0.6648 0.8154
No log 3.4951 360 0.6440 0.6708 0.6440 0.8025
No log 3.5146 362 0.6387 0.6569 0.6387 0.7992
No log 3.5340 364 0.6350 0.6682 0.6350 0.7968
No log 3.5534 366 0.6917 0.6355 0.6917 0.8317
No log 3.5728 368 0.7335 0.6342 0.7335 0.8564
No log 3.5922 370 0.7739 0.6319 0.7739 0.8797
No log 3.6117 372 0.7196 0.6641 0.7196 0.8483
No log 3.6311 374 0.6600 0.6677 0.6600 0.8124
No log 3.6505 376 0.5976 0.6651 0.5976 0.7730
No log 3.6699 378 0.5746 0.6395 0.5746 0.7580
No log 3.6893 380 0.5753 0.6471 0.5753 0.7585
No log 3.7087 382 0.5884 0.6565 0.5884 0.7671
No log 3.7282 384 0.6219 0.6651 0.6219 0.7886
No log 3.7476 386 0.6774 0.6289 0.6774 0.8230
No log 3.7670 388 0.6927 0.6269 0.6927 0.8323
No log 3.7864 390 0.6879 0.6210 0.6879 0.8294
No log 3.8058 392 0.6519 0.6422 0.6519 0.8074
No log 3.8252 394 0.6192 0.6033 0.6192 0.7869
No log 3.8447 396 0.7012 0.6165 0.7012 0.8374
No log 3.8641 398 0.7811 0.5943 0.7811 0.8838
No log 3.8835 400 0.6942 0.5935 0.6942 0.8332
No log 3.9029 402 0.6393 0.6152 0.6393 0.7996
No log 3.9223 404 0.6707 0.5983 0.6707 0.8190
No log 3.9417 406 0.9479 0.5306 0.9479 0.9736
No log 3.9612 408 1.0198 0.4948 1.0198 1.0098
No log 3.9806 410 0.8909 0.5473 0.8909 0.9439
No log 4.0000 412 0.7391 0.5968 0.7391 0.8597
No log 4.0194 414 0.6898 0.6045 0.6898 0.8306
No log 4.0388 416 0.6870 0.6135 0.6870 0.8288
No log 4.0583 418 0.7140 0.6225 0.7140 0.8450
No log 4.0777 420 0.8903 0.5826 0.8903 0.9435
No log 4.0971 422 1.1906 0.5440 1.1906 1.0911
No log 4.1165 424 1.1847 0.5411 1.1847 1.0884
No log 4.1359 426 0.9379 0.5827 0.9379 0.9684
No log 4.1553 428 0.7066 0.6117 0.7066 0.8406
No log 4.1748 430 0.6695 0.6161 0.6695 0.8183
No log 4.1942 432 0.6562 0.6034 0.6562 0.8101
No log 4.2136 434 0.6616 0.5959 0.6616 0.8134
No log 4.2330 436 0.7036 0.6309 0.7036 0.8388
No log 4.2524 438 0.8055 0.6105 0.8055 0.8975
No log 4.2718 440 0.8559 0.6046 0.8559 0.9251
No log 4.2913 442 0.8990 0.5978 0.8990 0.9482
No log 4.3107 444 0.8883 0.6225 0.8883 0.9425
No log 4.3301 446 0.8513 0.6378 0.8513 0.9226
No log 4.3495 448 0.8348 0.6239 0.8348 0.9137
No log 4.3689 450 0.7942 0.6227 0.7942 0.8912
No log 4.3883 452 0.7935 0.6121 0.7935 0.8908
No log 4.4078 454 0.9040 0.5852 0.9040 0.9508
No log 4.4272 456 0.9710 0.5418 0.9710 0.9854
No log 4.4466 458 0.9356 0.5561 0.9356 0.9673
No log 4.4660 460 0.8196 0.5830 0.8196 0.9053
No log 4.4854 462 0.7319 0.6371 0.7319 0.8555
No log 4.5049 464 0.8443 0.5866 0.8443 0.9189
No log 4.5243 466 0.8744 0.5557 0.8744 0.9351
No log 4.5437 468 0.7711 0.6320 0.7711 0.8781
No log 4.5631 470 0.6972 0.6350 0.6972 0.8350
No log 4.5825 472 0.7152 0.6248 0.7152 0.8457
No log 4.6019 474 0.6699 0.6300 0.6699 0.8184
No log 4.6214 476 0.6461 0.5984 0.6461 0.8038
No log 4.6408 478 0.6826 0.5832 0.6826 0.8262
No log 4.6602 480 0.7369 0.5792 0.7369 0.8584
No log 4.6796 482 0.7684 0.5576 0.7684 0.8766
No log 4.6990 484 0.6972 0.6121 0.6972 0.8350
No log 4.7184 486 0.6495 0.6159 0.6495 0.8059
No log 4.7379 488 0.6442 0.6206 0.6441 0.8026
No log 4.7573 490 0.6788 0.6465 0.6788 0.8239
No log 4.7767 492 0.8319 0.6238 0.8319 0.9121
No log 4.7961 494 0.8090 0.6214 0.8090 0.8995
No log 4.8155 496 0.6423 0.5985 0.6423 0.8015
No log 4.8350 498 0.5933 0.6102 0.5933 0.7702
0.584 4.8544 500 0.5927 0.6127 0.5927 0.7699
0.584 4.8738 502 0.5966 0.6110 0.5966 0.7724
0.584 4.8932 504 0.6422 0.6170 0.6422 0.8014
0.584 4.9126 506 0.7617 0.6358 0.7617 0.8727
0.584 4.9320 508 0.7472 0.6322 0.7472 0.8644
0.584 4.9515 510 0.6571 0.6497 0.6571 0.8106
0.584 4.9709 512 0.6486 0.6605 0.6486 0.8054
0.584 4.9903 514 0.7200 0.6482 0.7200 0.8486
0.584 5.0097 516 0.8576 0.6545 0.8576 0.9261
0.584 5.0291 518 0.8891 0.6451 0.8891 0.9429
0.584 5.0485 520 0.7986 0.6285 0.7986 0.8936
0.584 5.0680 522 0.6797 0.6411 0.6797 0.8244
0.584 5.0874 524 0.6481 0.6241 0.6481 0.8051
0.584 5.1068 526 0.6672 0.6325 0.6672 0.8168
0.584 5.1262 528 0.7338 0.6395 0.7338 0.8566
0.584 5.1456 530 0.9417 0.6043 0.9417 0.9704
0.584 5.1650 532 1.0412 0.5860 1.0412 1.0204
0.584 5.1845 534 0.8635 0.6181 0.8635 0.9292
0.584 5.2039 536 0.6903 0.6621 0.6903 0.8308
0.584 5.2233 538 0.7635 0.6105 0.7635 0.8738
0.584 5.2427 540 0.7809 0.5514 0.7809 0.8837
0.584 5.2621 542 0.6619 0.6091 0.6619 0.8136
0.584 5.2816 544 0.6308 0.6112 0.6308 0.7942
0.584 5.3010 546 0.7787 0.5654 0.7787 0.8824
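
Note that the log stops at epoch 5.30 even though num_epochs was set to 100, so training appears to have been cut short, and the final row (Qwk 0.5654) is not the best one: the peak validation Qwk in the table is 0.6750 at step 356. A small sketch for pulling the best row out of a log like the one above, assuming it is saved to a hypothetical train_log.txt:

```python
# Sketch: parse the whitespace-separated log above and locate the row with
# the highest validation Qwk ("train_log.txt" is a hypothetical file name).
import io
import pandas as pd

raw = open("train_log.txt").read().replace("No log", "NaN")
cols = ["train_loss", "epoch", "step", "val_loss", "qwk", "mse", "rmse"]
df = pd.read_csv(io.StringIO(raw), sep=r"\s+", names=cols)

best = df.loc[df["qwk"].idxmax()]
print(best)  # for the table above: epoch 3.4563, step 356, Qwk 0.6750
```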

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
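
To reproduce the training environment, pin the versions above (the listed PyTorch is a CUDA 11.8 build). A quick sanity check:

```python
# Verify the installed versions match those listed in the card.
import transformers, torch, datasets, tokenizers

assert transformers.__version__ == "4.44.2"
assert torch.__version__.startswith("2.4.0")  # "+cu118" build
assert datasets.__version__ == "2.21.0"
assert tokenizers.__version__ == "0.19.1"
print("environment matches the model card")
```
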
Safetensors

  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask6_organization

  • Finetuned from: aubmindlab/bert-base-arabertv02