Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask3_development

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (Qwk = quadratic weighted kappa, Mse = mean squared error, Rmse = root mean squared error); a minimal usage sketch follows the metrics below:

  • Loss: 0.3815
  • Qwk: 0.6200
  • Mse: 0.3815
  • Rmse: 0.6177

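Since the card reports regression-style metrics (MSE/RMSE) alongside QWK, the checkpoint is presumably a single-logit scoring head. The snippet below is a minimal usage sketch under that assumption; the example input text is illustrative only.

```python
# Minimal inference sketch. The single-logit regression head is an assumption
# based on the MSE/RMSE metrics reported above.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask3_development"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "نص تجريبي للتقييم"  # illustrative Arabic input
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(f"predicted score: {score:.3f}")
```
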
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

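The training script itself is not part of this card. Below is a minimal sketch of how the listed hyperparameters map onto the Hugging Face `TrainingArguments`, assuming a `Trainer`-based setup with a single-output head; the dataset objects and metric function are hypothetical placeholders, and `num_labels=1` is an assumption.

```python
# Sketch of the training setup implied by the hyperparameters above (not the
# author's actual script). num_labels=1 (regression head) is an assumption.
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

base_model = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=1)

args = TrainingArguments(
    output_dir="Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask3_development",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",  # Adam betas=(0.9, 0.999), eps=1e-8 are the defaults
    eval_strategy="steps",
    eval_steps=2,                # the results table logs an evaluation every 2 steps
)

# The train/eval datasets and the QWK/MSE/RMSE metric function are not included
# in this card; plug in your own before training:
# trainer = Trainer(model=model, args=args, train_dataset=train_ds,
#                   eval_dataset=eval_ds, compute_metrics=compute_metrics)
# trainer.train()
```
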
Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0194 2 3.8407 -0.0045 3.8407 1.9598
No log 0.0388 4 2.2355 0.1101 2.2355 1.4951
No log 0.0583 6 1.2637 0.0549 1.2637 1.1242
No log 0.0777 8 0.7238 0.2311 0.7238 0.8508
No log 0.0971 10 0.6851 0.1360 0.6851 0.8277
No log 0.1165 12 0.6795 0.1154 0.6795 0.8243
No log 0.1359 14 0.6190 0.2805 0.6190 0.7868
No log 0.1553 16 0.6968 0.2922 0.6968 0.8347
No log 0.1748 18 0.8617 0.1219 0.8617 0.9283
No log 0.1942 20 0.7391 0.2016 0.7391 0.8597
No log 0.2136 22 0.5878 0.3982 0.5878 0.7667
No log 0.2330 24 0.5293 0.4446 0.5293 0.7276
No log 0.2524 26 0.4939 0.4794 0.4939 0.7028
No log 0.2718 28 0.5129 0.4367 0.5129 0.7162
No log 0.2913 30 0.4829 0.4710 0.4829 0.6949
No log 0.3107 32 0.4259 0.5474 0.4259 0.6526
No log 0.3301 34 0.5059 0.5296 0.5059 0.7113
No log 0.3495 36 0.5640 0.5075 0.5640 0.7510
No log 0.3689 38 0.4977 0.5398 0.4977 0.7055
No log 0.3883 40 0.4045 0.5579 0.4045 0.6360
No log 0.4078 42 0.4049 0.6042 0.4049 0.6363
No log 0.4272 44 0.4112 0.6143 0.4112 0.6412
No log 0.4466 46 0.4324 0.6429 0.4324 0.6576
No log 0.4660 48 0.4675 0.6633 0.4675 0.6837
No log 0.4854 50 0.5175 0.6353 0.5175 0.7194
No log 0.5049 52 0.6148 0.5696 0.6148 0.7841
No log 0.5243 54 0.5774 0.6051 0.5774 0.7599
No log 0.5437 56 0.4151 0.5929 0.4151 0.6443
No log 0.5631 58 0.4314 0.5788 0.4314 0.6568
No log 0.5825 60 0.4377 0.5507 0.4377 0.6616
No log 0.6019 62 0.4266 0.5607 0.4266 0.6532
No log 0.6214 64 0.4358 0.5009 0.4358 0.6602
No log 0.6408 66 0.4387 0.4467 0.4387 0.6624
No log 0.6602 68 0.4315 0.4720 0.4315 0.6569
No log 0.6796 70 0.4182 0.5211 0.4182 0.6467
No log 0.6990 72 0.4497 0.5685 0.4497 0.6706
No log 0.7184 74 0.4220 0.5922 0.4220 0.6496
No log 0.7379 76 0.3873 0.6108 0.3873 0.6224
No log 0.7573 78 0.4027 0.6339 0.4027 0.6346
No log 0.7767 80 0.3942 0.6295 0.3942 0.6278
No log 0.7961 82 0.4388 0.6212 0.4388 0.6624
No log 0.8155 84 0.4690 0.6325 0.4690 0.6848
No log 0.8350 86 0.3817 0.6319 0.3817 0.6178
No log 0.8544 88 0.3693 0.5984 0.3693 0.6077
No log 0.8738 90 0.4263 0.5908 0.4263 0.6529
No log 0.8932 92 0.4282 0.6030 0.4282 0.6544
No log 0.9126 94 0.3701 0.6112 0.3701 0.6084
No log 0.9320 96 0.3679 0.5988 0.3679 0.6066
No log 0.9515 98 0.4327 0.5951 0.4327 0.6578
No log 0.9709 100 0.4437 0.5936 0.4437 0.6661
No log 0.9903 102 0.3780 0.6178 0.3780 0.6148
No log 1.0097 104 0.4347 0.6134 0.4347 0.6593
No log 1.0291 106 0.4412 0.6103 0.4412 0.6642
No log 1.0485 108 0.3786 0.6143 0.3786 0.6153
No log 1.0680 110 0.3703 0.6069 0.3703 0.6085
No log 1.0874 112 0.3715 0.6309 0.3715 0.6095
No log 1.1068 114 0.3803 0.6583 0.3803 0.6167
No log 1.1262 116 0.3804 0.6550 0.3804 0.6168
No log 1.1456 118 0.4030 0.6702 0.4030 0.6348
No log 1.1650 120 0.4088 0.6602 0.4088 0.6394
No log 1.1845 122 0.4273 0.6448 0.4273 0.6537
No log 1.2039 124 0.3945 0.6187 0.3945 0.6281
No log 1.2233 126 0.3762 0.6046 0.3762 0.6134
No log 1.2427 128 0.3544 0.6294 0.3544 0.5953
No log 1.2621 130 0.3503 0.6519 0.3503 0.5919
No log 1.2816 132 0.3703 0.6427 0.3703 0.6085
No log 1.3010 134 0.3943 0.6307 0.3943 0.6279
No log 1.3204 136 0.4105 0.6199 0.4105 0.6407
No log 1.3398 138 0.4646 0.6120 0.4646 0.6816
No log 1.3592 140 0.3965 0.5647 0.3965 0.6297
No log 1.3786 142 0.3822 0.5781 0.3822 0.6182
No log 1.3981 144 0.3842 0.5737 0.3842 0.6198
No log 1.4175 146 0.3808 0.5979 0.3808 0.6171
No log 1.4369 148 0.3781 0.6059 0.3781 0.6149
No log 1.4563 150 0.4288 0.5697 0.4288 0.6548
No log 1.4757 152 0.4318 0.5760 0.4318 0.6571
No log 1.4951 154 0.4122 0.6435 0.4122 0.6421
No log 1.5146 156 0.4308 0.6215 0.4308 0.6564
No log 1.5340 158 0.3755 0.6525 0.3755 0.6128
No log 1.5534 160 0.3755 0.6330 0.3755 0.6128
No log 1.5728 162 0.3923 0.6521 0.3923 0.6263
No log 1.5922 164 0.4072 0.6649 0.4072 0.6382
No log 1.6117 166 0.4683 0.6456 0.4683 0.6843
No log 1.6311 168 0.4517 0.6508 0.4517 0.6721
No log 1.6505 170 0.4004 0.6634 0.4004 0.6328
No log 1.6699 172 0.4210 0.6428 0.4210 0.6488
No log 1.6893 174 0.4103 0.6262 0.4103 0.6406
No log 1.7087 176 0.3723 0.5842 0.3723 0.6102
No log 1.7282 178 0.3854 0.5396 0.3854 0.6208
No log 1.7476 180 0.3911 0.5623 0.3911 0.6254
No log 1.7670 182 0.3765 0.5578 0.3765 0.6136
No log 1.7864 184 0.3699 0.5705 0.3699 0.6082
No log 1.8058 186 0.3606 0.5916 0.3606 0.6005
No log 1.8252 188 0.3528 0.5976 0.3528 0.5939
No log 1.8447 190 0.3681 0.6522 0.3681 0.6067
No log 1.8641 192 0.3805 0.6451 0.3805 0.6168
No log 1.8835 194 0.3568 0.6964 0.3568 0.5973
No log 1.9029 196 0.4559 0.6727 0.4559 0.6752
No log 1.9223 198 0.5688 0.6192 0.5688 0.7542
No log 1.9417 200 0.5431 0.6177 0.5431 0.7369
No log 1.9612 202 0.4325 0.6125 0.4325 0.6577
No log 1.9806 204 0.4298 0.6115 0.4298 0.6556
No log 2.0 206 0.3934 0.6047 0.3934 0.6272
No log 2.0194 208 0.3640 0.5604 0.3640 0.6033
No log 2.0388 210 0.4034 0.5772 0.4034 0.6351
No log 2.0583 212 0.4611 0.5532 0.4611 0.6790
No log 2.0777 214 0.4260 0.5850 0.4260 0.6527
No log 2.0971 216 0.3609 0.5648 0.3609 0.6007
No log 2.1165 218 0.3751 0.5867 0.3751 0.6124
No log 2.1359 220 0.4200 0.5789 0.4200 0.6481
No log 2.1553 222 0.4028 0.6162 0.4028 0.6347
No log 2.1748 224 0.3534 0.6443 0.3534 0.5945
No log 2.1942 226 0.3576 0.6301 0.3576 0.5980
No log 2.2136 228 0.3818 0.6519 0.3818 0.6179
No log 2.2330 230 0.3722 0.6526 0.3722 0.6101
No log 2.2524 232 0.3871 0.6645 0.3871 0.6222
No log 2.2718 234 0.3648 0.6613 0.3648 0.6040
No log 2.2913 236 0.3767 0.6906 0.3767 0.6138
No log 2.3107 238 0.3674 0.6840 0.3674 0.6061
No log 2.3301 240 0.3572 0.6673 0.3572 0.5977
No log 2.3495 242 0.4368 0.6157 0.4368 0.6609
No log 2.3689 244 0.4939 0.5793 0.4939 0.7028
No log 2.3883 246 0.4406 0.5522 0.4406 0.6638
No log 2.4078 248 0.4015 0.5786 0.4015 0.6336
No log 2.4272 250 0.3601 0.6188 0.3601 0.6001
No log 2.4466 252 0.3617 0.6402 0.3617 0.6014
No log 2.4660 254 0.3496 0.6587 0.3496 0.5913
No log 2.4854 256 0.3556 0.6373 0.3556 0.5963
No log 2.5049 258 0.3819 0.6295 0.3819 0.6180
No log 2.5243 260 0.4668 0.6397 0.4668 0.6832
No log 2.5437 262 0.4294 0.6580 0.4294 0.6553
No log 2.5631 264 0.3673 0.6675 0.3673 0.6060
No log 2.5825 266 0.3892 0.6343 0.3892 0.6239
No log 2.6019 268 0.3635 0.6246 0.3635 0.6029
No log 2.6214 270 0.3788 0.6532 0.3788 0.6154
No log 2.6408 272 0.4269 0.6730 0.4269 0.6534
No log 2.6602 274 0.5374 0.6215 0.5374 0.7331
No log 2.6796 276 0.4475 0.6409 0.4475 0.6689
No log 2.6990 278 0.3538 0.6683 0.3538 0.5948
No log 2.7184 280 0.3635 0.6218 0.3635 0.6029
No log 2.7379 282 0.3664 0.6023 0.3664 0.6053
No log 2.7573 284 0.3472 0.6323 0.3472 0.5893
No log 2.7767 286 0.3641 0.6628 0.3641 0.6034
No log 2.7961 288 0.3673 0.6589 0.3673 0.6061
No log 2.8155 290 0.3573 0.6317 0.3573 0.5977
No log 2.8350 292 0.3626 0.6310 0.3626 0.6022
No log 2.8544 294 0.4157 0.6323 0.4157 0.6448
No log 2.8738 296 0.4005 0.6354 0.4005 0.6328
No log 2.8932 298 0.3699 0.6389 0.3699 0.6082
No log 2.9126 300 0.4151 0.6542 0.4151 0.6443
No log 2.9320 302 0.4757 0.6355 0.4757 0.6897
No log 2.9515 304 0.4341 0.6331 0.4341 0.6588
No log 2.9709 306 0.3805 0.6459 0.3805 0.6168
No log 2.9903 308 0.3677 0.6470 0.3677 0.6064
No log 3.0097 310 0.3910 0.6555 0.3910 0.6253
No log 3.0291 312 0.3957 0.6504 0.3957 0.6290
No log 3.0485 314 0.4229 0.6259 0.4229 0.6503
No log 3.0680 316 0.5687 0.5572 0.5687 0.7541
No log 3.0874 318 0.5071 0.5722 0.5071 0.7121
No log 3.1068 320 0.4433 0.6349 0.4433 0.6658
No log 3.1262 322 0.6868 0.5611 0.6868 0.8288
No log 3.1456 324 0.7744 0.5286 0.7744 0.8800
No log 3.1650 326 0.5943 0.5793 0.5943 0.7709
No log 3.1845 328 0.3925 0.6647 0.3925 0.6265
No log 3.2039 330 0.3932 0.6131 0.3932 0.6271
No log 3.2233 332 0.4354 0.6015 0.4354 0.6598
No log 3.2427 334 0.3841 0.6606 0.3841 0.6198
No log 3.2621 336 0.3220 0.6824 0.3220 0.5675
No log 3.2816 338 0.3938 0.6763 0.3938 0.6275
No log 3.3010 340 0.5318 0.6198 0.5318 0.7293
No log 3.3204 342 0.5251 0.6613 0.5251 0.7246
No log 3.3398 344 0.4105 0.6842 0.4105 0.6407
No log 3.3592 346 0.3599 0.7047 0.3599 0.5999
No log 3.3786 348 0.4009 0.6954 0.4009 0.6332
No log 3.3981 350 0.4287 0.6591 0.4287 0.6547
No log 3.4175 352 0.3778 0.6767 0.3778 0.6147
No log 3.4369 354 0.3229 0.6653 0.3229 0.5683
No log 3.4563 356 0.3364 0.6661 0.3364 0.5800
No log 3.4757 358 0.3391 0.6615 0.3391 0.5824
No log 3.4951 360 0.3305 0.6724 0.3305 0.5749
No log 3.5146 362 0.3235 0.6546 0.3235 0.5688
No log 3.5340 364 0.3246 0.6616 0.3246 0.5698
No log 3.5534 366 0.3228 0.6634 0.3228 0.5681
No log 3.5728 368 0.3226 0.6575 0.3226 0.5680
No log 3.5922 370 0.3419 0.6811 0.3419 0.5847
No log 3.6117 372 0.3946 0.6622 0.3946 0.6282
No log 3.6311 374 0.4066 0.6712 0.4066 0.6377
No log 3.6505 376 0.3832 0.6740 0.3832 0.6191
No log 3.6699 378 0.3589 0.6796 0.3589 0.5991
No log 3.6893 380 0.3617 0.6514 0.3617 0.6014
No log 3.7087 382 0.3634 0.6433 0.3634 0.6028
No log 3.7282 384 0.3624 0.6497 0.3624 0.6020
No log 3.7476 386 0.3470 0.6513 0.3470 0.5891
No log 3.7670 388 0.3464 0.6605 0.3464 0.5885
No log 3.7864 390 0.3585 0.6684 0.3585 0.5988
No log 3.8058 392 0.3369 0.6639 0.3369 0.5804
No log 3.8252 394 0.3319 0.6950 0.3319 0.5761
No log 3.8447 396 0.3568 0.7030 0.3568 0.5973
No log 3.8641 398 0.4250 0.7000 0.4250 0.6519
No log 3.8835 400 0.3942 0.7230 0.3942 0.6278
No log 3.9029 402 0.3616 0.6913 0.3616 0.6013
No log 3.9223 404 0.4204 0.6753 0.4204 0.6484
No log 3.9417 406 0.3953 0.6564 0.3953 0.6287
No log 3.9612 408 0.3512 0.6692 0.3512 0.5927
No log 3.9806 410 0.3393 0.6748 0.3393 0.5825
No log 4.0 412 0.3415 0.6592 0.3415 0.5844
No log 4.0194 414 0.3432 0.6802 0.3432 0.5859
No log 4.0388 416 0.3500 0.6824 0.3500 0.5916
No log 4.0583 418 0.3562 0.7105 0.3562 0.5968
No log 4.0777 420 0.3494 0.6837 0.3494 0.5911
No log 4.0971 422 0.3467 0.6866 0.3467 0.5888
No log 4.1165 424 0.3490 0.6715 0.3490 0.5907
No log 4.1359 426 0.3906 0.6398 0.3906 0.6250
No log 4.1553 428 0.4090 0.6463 0.4090 0.6395
No log 4.1748 430 0.3725 0.6823 0.3725 0.6103
No log 4.1942 432 0.3386 0.6879 0.3386 0.5819
No log 4.2136 434 0.3365 0.7036 0.3365 0.5801
No log 4.2330 436 0.3372 0.6926 0.3372 0.5807
No log 4.2524 438 0.3557 0.6866 0.3557 0.5964
No log 4.2718 440 0.3878 0.6846 0.3878 0.6228
No log 4.2913 442 0.4039 0.6796 0.4039 0.6355
No log 4.3107 444 0.3622 0.7067 0.3622 0.6018
No log 4.3301 446 0.3524 0.6960 0.3524 0.5937
No log 4.3495 448 0.3351 0.6871 0.3351 0.5789
No log 4.3689 450 0.3353 0.6418 0.3353 0.5791
No log 4.3883 452 0.3396 0.6346 0.3396 0.5827
No log 4.4078 454 0.3465 0.6172 0.3465 0.5887
No log 4.4272 456 0.3421 0.5953 0.3421 0.5849
No log 4.4466 458 0.3414 0.5958 0.3414 0.5843
No log 4.4660 460 0.3695 0.6254 0.3695 0.6079
No log 4.4854 462 0.3585 0.6064 0.3585 0.5987
No log 4.5049 464 0.3516 0.6519 0.3516 0.5929
No log 4.5243 466 0.4086 0.6352 0.4086 0.6392
No log 4.5437 468 0.4825 0.5759 0.4825 0.6947
No log 4.5631 470 0.4313 0.6226 0.4313 0.6568
No log 4.5825 472 0.3709 0.6638 0.3709 0.6090
No log 4.6019 474 0.3780 0.6318 0.3780 0.6149
No log 4.6214 476 0.4587 0.6398 0.4587 0.6772
No log 4.6408 478 0.4685 0.6524 0.4685 0.6845
No log 4.6602 480 0.4222 0.6462 0.4222 0.6498
No log 4.6796 482 0.3884 0.6471 0.3884 0.6232
No log 4.6990 484 0.3795 0.6524 0.3795 0.6160
No log 4.7184 486 0.3608 0.6355 0.3608 0.6006
No log 4.7379 488 0.3979 0.6416 0.3979 0.6308
No log 4.7573 490 0.3797 0.6248 0.3797 0.6162
No log 4.7767 492 0.3461 0.6419 0.3461 0.5883
No log 4.7961 494 0.3369 0.6495 0.3369 0.5804
No log 4.8155 496 0.3688 0.6309 0.3688 0.6073
No log 4.8350 498 0.3713 0.6390 0.3713 0.6094
0.449 4.8544 500 0.3542 0.6511 0.3542 0.5951
0.449 4.8738 502 0.3527 0.6654 0.3527 0.5939
0.449 4.8932 504 0.3587 0.6856 0.3587 0.5989
0.449 4.9126 506 0.3640 0.6905 0.3640 0.6033
0.449 4.9320 508 0.3681 0.6746 0.3681 0.6067
0.449 4.9515 510 0.3511 0.6723 0.3511 0.5926
0.449 4.9709 512 0.3724 0.6580 0.3724 0.6103
0.449 4.9903 514 0.4176 0.6083 0.4176 0.6462
0.449 5.0097 516 0.3652 0.6710 0.3652 0.6043
0.449 5.0291 518 0.3431 0.6866 0.3431 0.5857
0.449 5.0485 520 0.3384 0.6930 0.3384 0.5817
0.449 5.0680 522 0.3340 0.6784 0.3340 0.5779
0.449 5.0874 524 0.3742 0.6595 0.3742 0.6117
0.449 5.1068 526 0.3992 0.6418 0.3992 0.6318
0.449 5.1262 528 0.3842 0.6455 0.3842 0.6199
0.449 5.1456 530 0.3628 0.6727 0.3628 0.6023
0.449 5.1650 532 0.3566 0.6847 0.3566 0.5972
0.449 5.1845 534 0.4049 0.6748 0.4049 0.6363
0.449 5.2039 536 0.3966 0.6748 0.3966 0.6298
0.449 5.2233 538 0.3370 0.6957 0.3370 0.5805
0.449 5.2427 540 0.3328 0.6848 0.3328 0.5769
0.449 5.2621 542 0.3288 0.6928 0.3288 0.5734
0.449 5.2816 544 0.3705 0.6679 0.3705 0.6087
0.449 5.3010 546 0.4246 0.6398 0.4246 0.6516
0.449 5.3204 548 0.4351 0.6514 0.4351 0.6596
0.449 5.3398 550 0.3864 0.6790 0.3864 0.6216
0.449 5.3592 552 0.3561 0.7001 0.3561 0.5967
0.449 5.3786 554 0.3928 0.6749 0.3928 0.6267
0.449 5.3981 556 0.3873 0.6833 0.3873 0.6223
0.449 5.4175 558 0.3584 0.6934 0.3584 0.5986
0.449 5.4369 560 0.3478 0.6798 0.3478 0.5897
0.449 5.4563 562 0.3599 0.6916 0.3599 0.5999
0.449 5.4757 564 0.4073 0.6693 0.4073 0.6382
0.449 5.4951 566 0.4892 0.6696 0.4892 0.6994
0.449 5.5146 568 0.4670 0.6717 0.4670 0.6834
0.449 5.5340 570 0.4353 0.6686 0.4353 0.6598
0.449 5.5534 572 0.4118 0.6569 0.4118 0.6417
0.449 5.5728 574 0.4022 0.6363 0.4022 0.6342
0.449 5.5922 576 0.4104 0.6292 0.4104 0.6406
0.449 5.6117 578 0.3811 0.6382 0.3811 0.6173
0.449 5.6311 580 0.3745 0.6169 0.3745 0.6120
0.449 5.6505 582 0.3815 0.6200 0.3815 0.6177

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
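
A quick sanity check that a local environment matches the versions listed above:

```python
# Compare installed versions against the ones this model was trained with.
import transformers, torch, datasets, tokenizers

print(transformers.__version__)  # expected: 4.44.2
print(torch.__version__)         # expected: 2.4.0+cu118
print(datasets.__version__)      # expected: 2.21.0
print(tokenizers.__version__)    # expected: 0.19.1
```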