Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask6_vocabulary

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5043
  • Qwk: 0.5939
  • Mse: 0.5043
  • Rmse: 0.7101
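The reported metrics can be reproduced from model predictions and gold scores. A minimal sketch (the labels below are hypothetical; QWK assumes discrete ordinal ratings, and Rmse is simply the square root of Mse, which is why the two loss columns match when the training objective is MSE):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical predictions and gold labels on an ordinal scoring scale.
y_true = np.array([1, 2, 3, 2, 4, 3, 1, 2])
y_pred = np.array([1, 2, 2, 2, 4, 3, 2, 2])

# Quadratic Weighted Kappa (Qwk): chance-corrected agreement on ordinal
# labels, penalizing large disagreements quadratically.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

mse = mean_squared_error(y_true, y_pred)  # Mse
rmse = np.sqrt(mse)                       # Rmse = sqrt(Mse)

print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```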

Model description

More information needed

Intended uses & limitations

More information needed
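Although usage details are not documented, the checkpoint can presumably be loaded with the standard transformers API. The single-output regression head below is an assumption inferred from the MSE/RMSE evaluation metrics, and the example input is purely illustrative:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask6_vocabulary"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Hypothetical Arabic input ("Arabic text for assessment").
inputs = tokenizer("نص عربي للتقييم", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(f"predicted score: {score:.3f}")
```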

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results
("No log" in the Training Loss column means the training loss had not yet been logged: the Trainer logs it every 500 steps by default, so the first value, 0.527, appears at step 500.)

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0194 2 5.0185 -0.0129 5.0185 2.2402
No log 0.0388 4 2.9396 0.0340 2.9396 1.7145
No log 0.0583 6 1.5723 0.0107 1.5723 1.2539
No log 0.0777 8 1.0046 0.1951 1.0046 1.0023
No log 0.0971 10 0.9021 0.2164 0.9021 0.9498
No log 0.1165 12 0.7757 0.2351 0.7757 0.8808
No log 0.1359 14 0.8886 0.2783 0.8886 0.9427
No log 0.1553 16 0.8244 0.3383 0.8244 0.9080
No log 0.1748 18 0.6587 0.4086 0.6587 0.8116
No log 0.1942 20 1.4258 0.2606 1.4258 1.1941
No log 0.2136 22 1.6812 0.2299 1.6812 1.2966
No log 0.2330 24 1.1817 0.2976 1.1817 1.0870
No log 0.2524 26 0.6237 0.4497 0.6237 0.7898
No log 0.2718 28 0.7439 0.3684 0.7439 0.8625
No log 0.2913 30 0.8713 0.1200 0.8713 0.9334
No log 0.3107 32 0.8233 0.1393 0.8233 0.9073
No log 0.3301 34 0.6951 0.2541 0.6951 0.8337
No log 0.3495 36 0.6489 0.3517 0.6489 0.8055
No log 0.3689 38 0.6482 0.3727 0.6482 0.8051
No log 0.3883 40 0.6171 0.4117 0.6171 0.7855
No log 0.4078 42 0.6288 0.5058 0.6288 0.7929
No log 0.4272 44 0.6472 0.5511 0.6472 0.8045
No log 0.4466 46 0.6464 0.5574 0.6464 0.8040
No log 0.4660 48 0.6805 0.5896 0.6805 0.8249
No log 0.4854 50 0.6914 0.6018 0.6914 0.8315
No log 0.5049 52 0.6490 0.5776 0.6490 0.8056
No log 0.5243 54 0.6433 0.5199 0.6433 0.8021
No log 0.5437 56 0.8356 0.4986 0.8356 0.9141
No log 0.5631 58 0.9066 0.4738 0.9066 0.9522
No log 0.5825 60 0.7273 0.5117 0.7273 0.8528
No log 0.6019 62 0.5856 0.5143 0.5856 0.7653
No log 0.6214 64 0.5471 0.4574 0.5471 0.7397
No log 0.6408 66 0.5823 0.4981 0.5823 0.7631
No log 0.6602 68 0.6212 0.4906 0.6212 0.7882
No log 0.6796 70 0.5922 0.5154 0.5922 0.7695
No log 0.6990 72 0.5612 0.5341 0.5612 0.7491
No log 0.7184 74 0.5135 0.4796 0.5135 0.7166
No log 0.7379 76 0.5082 0.4922 0.5082 0.7129
No log 0.7573 78 0.5285 0.4901 0.5285 0.7270
No log 0.7767 80 0.6103 0.5209 0.6103 0.7812
No log 0.7961 82 0.6812 0.4771 0.6812 0.8254
No log 0.8155 84 0.6926 0.4647 0.6926 0.8322
No log 0.8350 86 0.5713 0.4938 0.5713 0.7558
No log 0.8544 88 0.5054 0.5667 0.5054 0.7109
No log 0.8738 90 0.5707 0.5959 0.5707 0.7554
No log 0.8932 92 0.6008 0.6130 0.6008 0.7751
No log 0.9126 94 0.5892 0.6104 0.5892 0.7676
No log 0.9320 96 0.6716 0.5943 0.6716 0.8195
No log 0.9515 98 0.7515 0.5650 0.7515 0.8669
No log 0.9709 100 0.6382 0.6054 0.6382 0.7989
No log 0.9903 102 0.5340 0.6517 0.5340 0.7307
No log 1.0097 104 0.5589 0.6064 0.5589 0.7476
No log 1.0291 106 0.5618 0.6024 0.5618 0.7495
No log 1.0485 108 0.5105 0.6228 0.5105 0.7145
No log 1.0680 110 0.4902 0.6028 0.4902 0.7001
No log 1.0874 112 0.4895 0.5383 0.4895 0.6996
No log 1.1068 114 0.4925 0.5240 0.4925 0.7018
No log 1.1262 116 0.4858 0.5276 0.4858 0.6970
No log 1.1456 118 0.4933 0.5846 0.4933 0.7024
No log 1.1650 120 0.5781 0.5478 0.5781 0.7603
No log 1.1845 122 0.6169 0.5625 0.6169 0.7854
No log 1.2039 124 0.5393 0.5795 0.5393 0.7344
No log 1.2233 126 0.4967 0.5583 0.4967 0.7048
No log 1.2427 128 0.4764 0.5533 0.4764 0.6902
No log 1.2621 130 0.4815 0.4923 0.4815 0.6939
No log 1.2816 132 0.4704 0.5279 0.4704 0.6858
No log 1.3010 134 0.4655 0.5905 0.4655 0.6823
No log 1.3204 136 0.5077 0.6196 0.5077 0.7126
No log 1.3398 138 0.5011 0.6468 0.5011 0.7079
No log 1.3592 140 0.5052 0.6685 0.5052 0.7108
No log 1.3786 142 0.6131 0.5736 0.6131 0.7830
No log 1.3981 144 0.6537 0.5500 0.6537 0.8085
No log 1.4175 146 0.5958 0.5884 0.5958 0.7719
No log 1.4369 148 0.5857 0.5966 0.5857 0.7653
No log 1.4563 150 0.5901 0.5407 0.5901 0.7682
No log 1.4757 152 0.5959 0.5280 0.5959 0.7720
No log 1.4951 154 0.5597 0.5043 0.5597 0.7481
No log 1.5146 156 0.5805 0.5259 0.5805 0.7619
No log 1.5340 158 0.6225 0.5301 0.6225 0.7890
No log 1.5534 160 0.6014 0.5343 0.6014 0.7755
No log 1.5728 162 0.5468 0.5466 0.5468 0.7394
No log 1.5922 164 0.5146 0.5381 0.5146 0.7174
No log 1.6117 166 0.5428 0.5822 0.5428 0.7367
No log 1.6311 168 0.5028 0.5799 0.5028 0.7091
No log 1.6505 170 0.4712 0.6016 0.4712 0.6864
No log 1.6699 172 0.5099 0.5266 0.5099 0.7141
No log 1.6893 174 0.4636 0.5760 0.4636 0.6809
No log 1.7087 176 0.4573 0.6305 0.4573 0.6763
No log 1.7282 178 0.5367 0.5659 0.5367 0.7326
No log 1.7476 180 0.5682 0.5761 0.5682 0.7538
No log 1.7670 182 0.5302 0.5743 0.5302 0.7282
No log 1.7864 184 0.4951 0.5980 0.4951 0.7036
No log 1.8058 186 0.4841 0.5819 0.4841 0.6958
No log 1.8252 188 0.5124 0.6044 0.5124 0.7158
No log 1.8447 190 0.5255 0.6160 0.5255 0.7249
No log 1.8641 192 0.4934 0.6026 0.4934 0.7024
No log 1.8835 194 0.4855 0.6246 0.4855 0.6968
No log 1.9029 196 0.4680 0.6270 0.4680 0.6841
No log 1.9223 198 0.4763 0.6507 0.4763 0.6901
No log 1.9417 200 0.4737 0.6590 0.4737 0.6882
No log 1.9612 202 0.5355 0.6248 0.5355 0.7318
No log 1.9806 204 0.5802 0.6199 0.5802 0.7617
No log 2.0 206 0.5328 0.6052 0.5328 0.7300
No log 2.0194 208 0.5278 0.6075 0.5278 0.7265
No log 2.0388 210 0.5362 0.6045 0.5362 0.7323
No log 2.0583 212 0.5709 0.6156 0.5709 0.7556
No log 2.0777 214 0.5900 0.6204 0.5900 0.7681
No log 2.0971 216 0.5552 0.6250 0.5552 0.7451
No log 2.1165 218 0.5440 0.5994 0.5440 0.7376
No log 2.1359 220 0.5055 0.6177 0.5055 0.7110
No log 2.1553 222 0.5128 0.6063 0.5128 0.7161
No log 2.1748 224 0.5069 0.6026 0.5069 0.7120
No log 2.1942 226 0.4857 0.6086 0.4857 0.6969
No log 2.2136 228 0.5261 0.6406 0.5261 0.7253
No log 2.2330 230 0.6282 0.5701 0.6282 0.7926
No log 2.2524 232 0.6307 0.5805 0.6307 0.7941
No log 2.2718 234 0.5401 0.6312 0.5401 0.7349
No log 2.2913 236 0.6096 0.6168 0.6096 0.7808
No log 2.3107 238 0.6558 0.6171 0.6558 0.8098
No log 2.3301 240 0.6020 0.6242 0.6020 0.7759
No log 2.3495 242 0.5460 0.6191 0.5460 0.7389
No log 2.3689 244 0.5831 0.5926 0.5831 0.7636
No log 2.3883 246 0.5659 0.5819 0.5659 0.7522
No log 2.4078 248 0.4981 0.5535 0.4981 0.7058
No log 2.4272 250 0.4630 0.5618 0.4630 0.6804
No log 2.4466 252 0.4554 0.5702 0.4554 0.6748
No log 2.4660 254 0.4838 0.5997 0.4838 0.6956
No log 2.4854 256 0.5475 0.6177 0.5475 0.7399
No log 2.5049 258 0.5324 0.6314 0.5324 0.7297
No log 2.5243 260 0.5035 0.6247 0.5035 0.7096
No log 2.5437 262 0.4603 0.6489 0.4603 0.6785
No log 2.5631 264 0.5295 0.6174 0.5295 0.7277
No log 2.5825 266 0.6091 0.6155 0.6091 0.7804
No log 2.6019 268 0.5501 0.6144 0.5501 0.7417
No log 2.6214 270 0.4845 0.6304 0.4845 0.6960
No log 2.6408 272 0.5389 0.6153 0.5389 0.7341
No log 2.6602 274 0.5236 0.6159 0.5236 0.7236
No log 2.6796 276 0.4803 0.6255 0.4803 0.6931
No log 2.6990 278 0.4717 0.6234 0.4717 0.6868
No log 2.7184 280 0.4631 0.6131 0.4631 0.6805
No log 2.7379 282 0.4621 0.5859 0.4621 0.6798
No log 2.7573 284 0.4714 0.6284 0.4714 0.6866
No log 2.7767 286 0.5291 0.6261 0.5291 0.7274
No log 2.7961 288 0.5714 0.6298 0.5714 0.7559
No log 2.8155 290 0.7010 0.6097 0.7010 0.8373
No log 2.8350 292 0.7396 0.6030 0.7396 0.8600
No log 2.8544 294 0.6527 0.6545 0.6527 0.8079
No log 2.8738 296 0.5766 0.6537 0.5766 0.7593
No log 2.8932 298 0.5636 0.6490 0.5636 0.7507
No log 2.9126 300 0.5956 0.6145 0.5956 0.7718
No log 2.9320 302 0.5988 0.5997 0.5988 0.7738
No log 2.9515 304 0.5256 0.6304 0.5256 0.7250
No log 2.9709 306 0.4523 0.6053 0.4523 0.6725
No log 2.9903 308 0.5132 0.5035 0.5132 0.7164
No log 3.0097 310 0.5535 0.4775 0.5535 0.7440
No log 3.0291 312 0.5141 0.5125 0.5141 0.7170
No log 3.0485 314 0.4796 0.5615 0.4796 0.6926
No log 3.0680 316 0.4711 0.6260 0.4711 0.6864
No log 3.0874 318 0.4764 0.6556 0.4764 0.6902
No log 3.1068 320 0.5471 0.6321 0.5471 0.7397
No log 3.1262 322 0.6458 0.6169 0.6458 0.8036
No log 3.1456 324 0.6997 0.5904 0.6997 0.8365
No log 3.1650 326 0.6804 0.5835 0.6804 0.8249
No log 3.1845 328 0.6154 0.6293 0.6154 0.7845
No log 3.2039 330 0.5867 0.6604 0.5867 0.7660
No log 3.2233 332 0.7011 0.5611 0.7011 0.8373
No log 3.2427 334 0.6714 0.6000 0.6714 0.8194
No log 3.2621 336 0.4959 0.6034 0.4959 0.7042
No log 3.2816 338 0.4689 0.5524 0.4689 0.6847
No log 3.3010 340 0.4901 0.5342 0.4901 0.7001
No log 3.3204 342 0.5012 0.5062 0.5012 0.7079
No log 3.3398 344 0.4821 0.5312 0.4821 0.6943
No log 3.3592 346 0.4882 0.5679 0.4882 0.6987
No log 3.3786 348 0.5437 0.6194 0.5437 0.7373
No log 3.3981 350 0.5610 0.5985 0.5610 0.7490
No log 3.4175 352 0.5199 0.6184 0.5199 0.7210
No log 3.4369 354 0.5316 0.6360 0.5316 0.7291
No log 3.4563 356 0.5564 0.6345 0.5564 0.7459
No log 3.4757 358 0.5545 0.6239 0.5545 0.7446
No log 3.4951 360 0.5846 0.5769 0.5846 0.7646
No log 3.5146 362 0.5746 0.5745 0.5746 0.7581
No log 3.5340 364 0.5257 0.6184 0.5257 0.7250
No log 3.5534 366 0.5324 0.6134 0.5324 0.7296
No log 3.5728 368 0.5801 0.5772 0.5801 0.7617
No log 3.5922 370 0.5960 0.5785 0.5960 0.7720
No log 3.6117 372 0.6188 0.5673 0.6188 0.7867
No log 3.6311 374 0.5733 0.6112 0.5733 0.7572
No log 3.6505 376 0.5183 0.6426 0.5183 0.7199
No log 3.6699 378 0.5238 0.6428 0.5238 0.7237
No log 3.6893 380 0.6038 0.6206 0.6038 0.7770
No log 3.7087 382 0.7103 0.6014 0.7103 0.8428
No log 3.7282 384 0.7364 0.5703 0.7364 0.8581
No log 3.7476 386 0.6206 0.6061 0.6206 0.7878
No log 3.7670 388 0.4774 0.6403 0.4774 0.6910
No log 3.7864 390 0.4610 0.6240 0.4610 0.6789
No log 3.8058 392 0.5178 0.6357 0.5178 0.7196
No log 3.8252 394 0.4679 0.6367 0.4679 0.6841
No log 3.8447 396 0.4468 0.6088 0.4468 0.6685
No log 3.8641 398 0.4443 0.6414 0.4443 0.6665
No log 3.8835 400 0.4607 0.6664 0.4607 0.6788
No log 3.9029 402 0.4886 0.6733 0.4886 0.6990
No log 3.9223 404 0.4933 0.6665 0.4933 0.7023
No log 3.9417 406 0.6363 0.6004 0.6363 0.7977
No log 3.9612 408 0.5993 0.5946 0.5993 0.7741
No log 3.9806 410 0.4883 0.6469 0.4883 0.6988
No log 4.0 412 0.4980 0.6455 0.4980 0.7057
No log 4.0194 414 0.4722 0.6336 0.4722 0.6872
No log 4.0388 416 0.4709 0.6563 0.4709 0.6862
No log 4.0583 418 0.6482 0.5841 0.6482 0.8051
No log 4.0777 420 0.6921 0.5807 0.6921 0.8319
No log 4.0971 422 0.5601 0.6277 0.5601 0.7484
No log 4.1165 424 0.5036 0.6443 0.5036 0.7096
No log 4.1359 426 0.5252 0.6293 0.5252 0.7247
No log 4.1553 428 0.4969 0.6802 0.4969 0.7049
No log 4.1748 430 0.6433 0.6113 0.6433 0.8021
No log 4.1942 432 0.8000 0.5576 0.8000 0.8945
No log 4.2136 434 0.7291 0.5501 0.7291 0.8539
No log 4.2330 436 0.5664 0.6039 0.5664 0.7526
No log 4.2524 438 0.5335 0.6424 0.5335 0.7304
No log 4.2718 440 0.5453 0.6374 0.5453 0.7384
No log 4.2913 442 0.6212 0.5766 0.6212 0.7882
No log 4.3107 444 0.6525 0.5626 0.6525 0.8077
No log 4.3301 446 0.5421 0.6317 0.5421 0.7363
No log 4.3495 448 0.4732 0.6086 0.4732 0.6879
No log 4.3689 450 0.4728 0.6159 0.4728 0.6876
No log 4.3883 452 0.5124 0.6399 0.5124 0.7158
No log 4.4078 454 0.6048 0.6269 0.6048 0.7777
No log 4.4272 456 0.6392 0.6105 0.6392 0.7995
No log 4.4466 458 0.5977 0.6313 0.5977 0.7731
No log 4.4660 460 0.5276 0.6374 0.5276 0.7263
No log 4.4854 462 0.5003 0.6576 0.5003 0.7073
No log 4.5049 464 0.5031 0.6160 0.5031 0.7093
No log 4.5243 466 0.5266 0.6264 0.5266 0.7257
No log 4.5437 468 0.5167 0.6555 0.5167 0.7188
No log 4.5631 470 0.6057 0.6372 0.6057 0.7782
No log 4.5825 472 0.6771 0.6296 0.6771 0.8229
No log 4.6019 474 0.6840 0.6126 0.6840 0.8270
No log 4.6214 476 0.6895 0.6180 0.6895 0.8304
No log 4.6408 478 0.7669 0.5934 0.7669 0.8757
No log 4.6602 480 0.6391 0.6565 0.6391 0.7995
No log 4.6796 482 0.5674 0.6751 0.5674 0.7533
No log 4.6990 484 0.5409 0.6634 0.5409 0.7354
No log 4.7184 486 0.4929 0.6493 0.4929 0.7021
No log 4.7379 488 0.4682 0.6264 0.4682 0.6842
No log 4.7573 490 0.4998 0.6629 0.4998 0.7070
No log 4.7767 492 0.6387 0.5938 0.6387 0.7992
No log 4.7961 494 0.5536 0.6225 0.5536 0.7441
No log 4.8155 496 0.4551 0.6336 0.4551 0.6746
No log 4.8350 498 0.4391 0.6238 0.4391 0.6627
0.527 4.8544 500 0.4298 0.6166 0.4298 0.6556
0.527 4.8738 502 0.4278 0.6371 0.4278 0.6541
0.527 4.8932 504 0.4767 0.6567 0.4767 0.6904
0.527 4.9126 506 0.5171 0.6740 0.5171 0.7191
0.527 4.9320 508 0.4616 0.6911 0.4616 0.6794
0.527 4.9515 510 0.4704 0.6924 0.4704 0.6858
0.527 4.9709 512 0.5043 0.6835 0.5043 0.7101
0.527 4.9903 514 0.6404 0.6730 0.6404 0.8002
0.527 5.0097 516 0.7511 0.6016 0.7511 0.8667
0.527 5.0291 518 0.6328 0.6483 0.6328 0.7955
0.527 5.0485 520 0.4811 0.6617 0.4811 0.6936
0.527 5.0680 522 0.4744 0.6629 0.4744 0.6888
0.527 5.0874 524 0.5357 0.6442 0.5357 0.7319
0.527 5.1068 526 0.5974 0.6478 0.5974 0.7729
0.527 5.1262 528 0.6125 0.6468 0.6125 0.7826
0.527 5.1456 530 0.6401 0.6256 0.6401 0.8001
0.527 5.1650 532 0.6535 0.6226 0.6535 0.8084
0.527 5.1845 534 0.5774 0.6423 0.5774 0.7599
0.527 5.2039 536 0.5787 0.6433 0.5787 0.7607
0.527 5.2233 538 0.5225 0.6435 0.5225 0.7229
0.527 5.2427 540 0.4607 0.6228 0.4607 0.6787
0.527 5.2621 542 0.4653 0.5769 0.4653 0.6821
0.527 5.2816 544 0.4786 0.6189 0.4786 0.6918
0.527 5.3010 546 0.6743 0.5974 0.6743 0.8212
0.527 5.3204 548 0.9360 0.4811 0.9360 0.9675
0.527 5.3398 550 0.9418 0.4952 0.9418 0.9705
0.527 5.3592 552 0.6920 0.5479 0.6920 0.8319
0.527 5.3786 554 0.5043 0.5939 0.5043 0.7101

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (tensor type F32, stored as Safetensors).

Model tree: MayBashendy/Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask6_vocabulary, fine-tuned from aubmindlab/bert-base-arabertv02.