Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5944
  • Qwk: 0.6687
  • Mse: 0.5944
  • Rmse: 0.7710
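For reference, the reported Rmse is the square root of the reported Mse (√0.5944 ≈ 0.7710), and the evaluation Loss equals the Mse, consistent with a mean-squared-error objective. Qwk is quadratic weighted kappa, a chance-corrected agreement measure for ordinal scores. A minimal, dependency-free sketch of both metrics (the function names and the integer-label encoding are illustrative, not taken from this repository):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    """Quadratic weighted kappa: chance-corrected agreement between two
    integer rating sequences, penalizing disagreements by squared distance."""
    n = len(y_true)
    # observed rating confusion matrix
    observed = [[0.0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1.0
    # marginal histograms, used for the expected-by-chance matrix
    hist_true = [y_true.count(i) for i in range(num_classes)]
    hist_pred = [y_pred.count(i) for i in range(num_classes)]
    num = den = 0.0
    for i in range(num_classes):
        for j in range(num_classes):
            w = (i - j) ** 2 / (num_classes - 1) ** 2  # quadratic penalty
            expected = hist_true[i] * hist_pred[j] / n
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

def rmse(y_true, y_pred):
    """Root mean squared error: the square root of the reported Mse."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

Perfect agreement yields a QWK of 1.0, chance-level agreement 0.0, so the reported 0.6687 indicates substantial agreement between predicted and gold scores.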

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
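These settings map directly onto transformers' `TrainingArguments`; a minimal sketch (the `output_dir` value is a placeholder, and warmup and weight-decay settings are left at their defaults because none are reported):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="arabert-finetune",   # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```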

Training results

| Training Loss | Epoch  | Step | Validation Loss | Qwk    | Mse    | Rmse   |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:------:|
| No log        | 0.0213 | 2    | 2.9795          | 0.0430 | 2.9795 | 1.7261 |
| No log        | 0.0426 | 4    | 1.4708          | 0.2522 | 1.4708 | 1.2128 |
| No log        | 0.0638 | 6    | 1.0064          | 0.2618 | 1.0064 | 1.0032 |
| No log        | 0.0851 | 8    | 1.2904          | 0.1202 | 1.2904 | 1.1360 |
| No log        | 0.1064 | 10   | 1.2330          | 0.0637 | 1.2330 | 1.1104 |
| No log        | 0.1277 | 12   | 1.5751          | 0.1702 | 1.5751 | 1.2550 |
| No log        | 0.1489 | 14   | 1.1905          | 0.2331 | 1.1905 | 1.0911 |
| No log        | 0.1702 | 16   | 0.9164          | 0.3593 | 0.9164 | 0.9573 |
| No log        | 0.1915 | 18   | 0.9250          | 0.3366 | 0.9250 | 0.9618 |
| No log        | 0.2128 | 20   | 0.9043          | 0.3628 | 0.9043 | 0.9509 |
| No log        | 0.2340 | 22   | 0.9070          | 0.3449 | 0.9070 | 0.9524 |
| No log        | 0.2553 | 24   | 0.8852          | 0.3664 | 0.8852 | 0.9408 |
| No log        | 0.2766 | 26   | 0.9674          | 0.3085 | 0.9674 | 0.9835 |
| No log        | 0.2979 | 28   | 0.8407          | 0.4256 | 0.8407 | 0.9169 |
| No log        | 0.3191 | 30   | 0.8122          | 0.4854 | 0.8122 | 0.9012 |
| No log        | 0.3404 | 32   | 0.7604          | 0.4958 | 0.7604 | 0.8720 |
| No log        | 0.3617 | 34   | 0.9329          | 0.4156 | 0.9329 | 0.9658 |
| No log        | 0.3830 | 36   | 1.2650          | 0.3276 | 1.2650 | 1.1247 |
| No log        | 0.4043 | 38   | 1.1839          | 0.3290 | 1.1839 | 1.0881 |
| No log        | 0.4255 | 40   | 0.8910          | 0.3968 | 0.8910 | 0.9439 |
| No log        | 0.4468 | 42   | 0.7932          | 0.4888 | 0.7932 | 0.8906 |
| No log        | 0.4681 | 44   | 0.8132          | 0.4130 | 0.8132 | 0.9018 |
| No log        | 0.4894 | 46   | 0.8004          | 0.4072 | 0.8004 | 0.8947 |
| No log        | 0.5106 | 48   | 0.7953          | 0.4318 | 0.7953 | 0.8918 |
| No log        | 0.5319 | 50   | 0.7835          | 0.4540 | 0.7835 | 0.8852 |
| No log        | 0.5532 | 52   | 0.7898          | 0.5115 | 0.7898 | 0.8887 |
| No log        | 0.5745 | 54   | 0.7927          | 0.5722 | 0.7927 | 0.8904 |
| No log        | 0.5957 | 56   | 0.8246          | 0.5874 | 0.8246 | 0.9081 |
| No log        | 0.6170 | 58   | 0.7815          | 0.6092 | 0.7815 | 0.8840 |
| No log        | 0.6383 | 60   | 0.7768          | 0.5905 | 0.7768 | 0.8814 |
| No log        | 0.6596 | 62   | 0.8028          | 0.5827 | 0.8028 | 0.8960 |
| No log        | 0.6809 | 64   | 0.7882          | 0.5883 | 0.7882 | 0.8878 |
| No log        | 0.7021 | 66   | 0.7194          | 0.5740 | 0.7194 | 0.8481 |
| No log        | 0.7234 | 68   | 0.6851          | 0.6425 | 0.6851 | 0.8277 |
| No log        | 0.7447 | 70   | 0.6723          | 0.6335 | 0.6723 | 0.8200 |
| No log        | 0.7660 | 72   | 0.7085          | 0.6214 | 0.7085 | 0.8417 |
| No log        | 0.7872 | 74   | 0.7907          | 0.5905 | 0.7907 | 0.8892 |
| No log        | 0.8085 | 76   | 0.9169          | 0.5446 | 0.9169 | 0.9576 |
| No log        | 0.8298 | 78   | 1.1575          | 0.4144 | 1.1575 | 1.0759 |
| No log        | 0.8511 | 80   | 1.3266          | 0.3259 | 1.3266 | 1.1518 |
| No log        | 0.8723 | 82   | 1.2883          | 0.4161 | 1.2883 | 1.1351 |
| No log        | 0.8936 | 84   | 0.8524          | 0.6100 | 0.8524 | 0.9233 |
| No log        | 0.9149 | 86   | 0.6729          | 0.6021 | 0.6729 | 0.8203 |
| No log        | 0.9362 | 88   | 0.7719          | 0.5333 | 0.7719 | 0.8786 |
| No log        | 0.9574 | 90   | 0.6814          | 0.5844 | 0.6814 | 0.8254 |
| No log        | 0.9787 | 92   | 0.7899          | 0.5289 | 0.7899 | 0.8887 |
| No log        | 1.0    | 94   | 0.9546          | 0.5193 | 0.9546 | 0.9770 |
| No log        | 1.0213 | 96   | 0.8140          | 0.5663 | 0.8140 | 0.9022 |
| No log        | 1.0426 | 98   | 0.6093          | 0.5942 | 0.6093 | 0.7806 |
| No log        | 1.0638 | 100  | 0.7174          | 0.5856 | 0.7174 | 0.8470 |
| No log        | 1.0851 | 102  | 0.7791          | 0.5663 | 0.7791 | 0.8827 |
| No log        | 1.1064 | 104  | 0.6486          | 0.6128 | 0.6486 | 0.8053 |
| No log        | 1.1277 | 106  | 0.6603          | 0.6388 | 0.6603 | 0.8126 |
| No log        | 1.1489 | 108  | 0.8010          | 0.5604 | 0.8010 | 0.8950 |
| No log        | 1.1702 | 110  | 0.7512          | 0.5718 | 0.7512 | 0.8667 |
| No log        | 1.1915 | 112  | 0.6331          | 0.5998 | 0.6331 | 0.7957 |
| No log        | 1.2128 | 114  | 0.6389          | 0.5847 | 0.6389 | 0.7993 |
| No log        | 1.2340 | 116  | 0.6505          | 0.5757 | 0.6505 | 0.8066 |
| No log        | 1.2553 | 118  | 0.6267          | 0.5984 | 0.6267 | 0.7916 |
| No log        | 1.2766 | 120  | 0.6744          | 0.6024 | 0.6744 | 0.8212 |
| No log        | 1.2979 | 122  | 0.7847          | 0.4918 | 0.7847 | 0.8858 |
| No log        | 1.3191 | 124  | 0.9194          | 0.4738 | 0.9194 | 0.9589 |
| No log        | 1.3404 | 126  | 0.8439          | 0.4772 | 0.8439 | 0.9186 |
| No log        | 1.3617 | 128  | 0.6659          | 0.5993 | 0.6659 | 0.8160 |
| No log        | 1.3830 | 130  | 0.6322          | 0.6355 | 0.6322 | 0.7951 |
| No log        | 1.4043 | 132  | 0.6250          | 0.6399 | 0.6250 | 0.7906 |
| No log        | 1.4255 | 134  | 0.6122          | 0.6662 | 0.6122 | 0.7824 |
| No log        | 1.4468 | 136  | 0.6426          | 0.6675 | 0.6426 | 0.8016 |
| No log        | 1.4681 | 138  | 0.6597          | 0.6588 | 0.6597 | 0.8122 |
| No log        | 1.4894 | 140  | 0.6621          | 0.6217 | 0.6621 | 0.8137 |
| No log        | 1.5106 | 142  | 0.6481          | 0.6325 | 0.6481 | 0.8051 |
| No log        | 1.5319 | 144  | 0.5860          | 0.6642 | 0.5860 | 0.7655 |
| No log        | 1.5532 | 146  | 0.5880          | 0.6660 | 0.5880 | 0.7668 |
| No log        | 1.5745 | 148  | 0.6085          | 0.6572 | 0.6085 | 0.7800 |
| No log        | 1.5957 | 150  | 0.7136          | 0.6585 | 0.7136 | 0.8448 |
| No log        | 1.6170 | 152  | 0.9794          | 0.5218 | 0.9794 | 0.9896 |
| No log        | 1.6383 | 154  | 1.1605          | 0.4894 | 1.1605 | 1.0773 |
| No log        | 1.6596 | 156  | 0.9892          | 0.5423 | 0.9892 | 0.9946 |
| No log        | 1.6809 | 158  | 0.7404          | 0.6536 | 0.7404 | 0.8605 |
| No log        | 1.7021 | 160  | 0.6037          | 0.7030 | 0.6037 | 0.7770 |
| No log        | 1.7234 | 162  | 0.5954          | 0.6706 | 0.5954 | 0.7716 |
| No log        | 1.7447 | 164  | 0.6004          | 0.6773 | 0.6004 | 0.7748 |
| No log        | 1.7660 | 166  | 0.6691          | 0.6691 | 0.6691 | 0.8180 |
| No log        | 1.7872 | 168  | 0.6828          | 0.6714 | 0.6828 | 0.8263 |
| No log        | 1.8085 | 170  | 0.6503          | 0.6577 | 0.6503 | 0.8064 |
| No log        | 1.8298 | 172  | 0.5978          | 0.6594 | 0.5978 | 0.7731 |
| No log        | 1.8511 | 174  | 0.5696          | 0.6797 | 0.5696 | 0.7547 |
| No log        | 1.8723 | 176  | 0.6067          | 0.6412 | 0.6067 | 0.7789 |
| No log        | 1.8936 | 178  | 0.8842          | 0.5464 | 0.8842 | 0.9403 |
| No log        | 1.9149 | 180  | 1.2835          | 0.4245 | 1.2835 | 1.1329 |
| No log        | 1.9362 | 182  | 1.2030          | 0.4692 | 1.2030 | 1.0968 |
| No log        | 1.9574 | 184  | 0.7730          | 0.5898 | 0.7730 | 0.8792 |
| No log        | 1.9787 | 186  | 0.7789          | 0.5316 | 0.7789 | 0.8826 |
| No log        | 2.0    | 188  | 0.7883          | 0.5404 | 0.7883 | 0.8879 |
| No log        | 2.0213 | 190  | 0.6419          | 0.6292 | 0.6419 | 0.8012 |
| No log        | 2.0426 | 192  | 0.8288          | 0.5825 | 0.8288 | 0.9104 |
| No log        | 2.0638 | 194  | 0.8930          | 0.5628 | 0.8930 | 0.9450 |
| No log        | 2.0851 | 196  | 0.7060          | 0.6324 | 0.7060 | 0.8402 |
| No log        | 2.1064 | 198  | 0.5885          | 0.7172 | 0.5885 | 0.7672 |
| No log        | 2.1277 | 200  | 0.5811          | 0.6853 | 0.5811 | 0.7623 |
| No log        | 2.1489 | 202  | 0.5833          | 0.6538 | 0.5833 | 0.7638 |
| No log        | 2.1702 | 204  | 0.5530          | 0.6438 | 0.5530 | 0.7436 |
| No log        | 2.1915 | 206  | 0.5884          | 0.6686 | 0.5884 | 0.7671 |
| No log        | 2.2128 | 208  | 0.8139          | 0.5603 | 0.8139 | 0.9021 |
| No log        | 2.2340 | 210  | 1.1550          | 0.4702 | 1.1550 | 1.0747 |
| No log        | 2.2553 | 212  | 1.1025          | 0.4974 | 1.1025 | 1.0500 |
| No log        | 2.2766 | 214  | 0.8676          | 0.5769 | 0.8676 | 0.9314 |
| No log        | 2.2979 | 216  | 0.7146          | 0.6269 | 0.7146 | 0.8454 |
| No log        | 2.3191 | 218  | 0.6834          | 0.6275 | 0.6834 | 0.8267 |
| No log        | 2.3404 | 220  | 0.6561          | 0.6507 | 0.6561 | 0.8100 |
| No log        | 2.3617 | 222  | 0.6387          | 0.6452 | 0.6387 | 0.7992 |
| No log        | 2.3830 | 224  | 0.6286          | 0.6156 | 0.6286 | 0.7928 |
| No log        | 2.4043 | 226  | 0.6418          | 0.5817 | 0.6418 | 0.8011 |
| No log        | 2.4255 | 228  | 0.6450          | 0.6097 | 0.6450 | 0.8031 |
| No log        | 2.4468 | 230  | 0.6501          | 0.5834 | 0.6501 | 0.8063 |
| No log        | 2.4681 | 232  | 0.5985          | 0.6423 | 0.5985 | 0.7736 |
| No log        | 2.4894 | 234  | 0.6036          | 0.6629 | 0.6036 | 0.7769 |
| No log        | 2.5106 | 236  | 0.5139          | 0.6670 | 0.5139 | 0.7169 |
| No log        | 2.5319 | 238  | 0.5408          | 0.6767 | 0.5408 | 0.7354 |
| No log        | 2.5532 | 240  | 0.5353          | 0.6456 | 0.5353 | 0.7316 |
| No log        | 2.5745 | 242  | 0.5542          | 0.6457 | 0.5542 | 0.7444 |
| No log        | 2.5957 | 244  | 0.9278          | 0.5618 | 0.9278 | 0.9632 |
| No log        | 2.6170 | 246  | 1.3161          | 0.4325 | 1.3161 | 1.1472 |
| No log        | 2.6383 | 248  | 1.2822          | 0.4691 | 1.2822 | 1.1323 |
| No log        | 2.6596 | 250  | 0.8725          | 0.5714 | 0.8725 | 0.9341 |
| No log        | 2.6809 | 252  | 0.5267          | 0.6903 | 0.5267 | 0.7257 |
| No log        | 2.7021 | 254  | 0.6968          | 0.6389 | 0.6968 | 0.8348 |
| No log        | 2.7234 | 256  | 0.8919          | 0.5452 | 0.8919 | 0.9444 |
| No log        | 2.7447 | 258  | 0.7998          | 0.5484 | 0.7998 | 0.8943 |
| No log        | 2.7660 | 260  | 0.6040          | 0.6506 | 0.6040 | 0.7772 |
| No log        | 2.7872 | 262  | 0.5420          | 0.6471 | 0.5420 | 0.7362 |
| No log        | 2.8085 | 264  | 0.8467          | 0.5569 | 0.8467 | 0.9202 |
| No log        | 2.8298 | 266  | 1.1593          | 0.4704 | 1.1593 | 1.0767 |
| No log        | 2.8511 | 268  | 1.1460          | 0.4630 | 1.1460 | 1.0705 |
| No log        | 2.8723 | 270  | 0.9057          | 0.5600 | 0.9057 | 0.9517 |
| No log        | 2.8936 | 272  | 0.6712          | 0.5678 | 0.6712 | 0.8193 |
| No log        | 2.9149 | 274  | 0.6617          | 0.5032 | 0.6617 | 0.8135 |
| No log        | 2.9362 | 276  | 0.7325          | 0.5007 | 0.7325 | 0.8559 |
| No log        | 2.9574 | 278  | 0.7542          | 0.4746 | 0.7542 | 0.8684 |
| No log        | 2.9787 | 280  | 0.6737          | 0.5665 | 0.6737 | 0.8208 |
| No log        | 3.0    | 282  | 0.6070          | 0.6110 | 0.6070 | 0.7791 |
| No log        | 3.0213 | 284  | 0.5623          | 0.6465 | 0.5623 | 0.7499 |
| No log        | 3.0426 | 286  | 0.5542          | 0.6766 | 0.5542 | 0.7445 |
| No log        | 3.0638 | 288  | 0.5836          | 0.7004 | 0.5836 | 0.7639 |
| No log        | 3.0851 | 290  | 0.6133          | 0.7125 | 0.6133 | 0.7831 |
| No log        | 3.1064 | 292  | 0.5968          | 0.7134 | 0.5968 | 0.7725 |
| No log        | 3.1277 | 294  | 0.6292          | 0.7011 | 0.6292 | 0.7932 |
| No log        | 3.1489 | 296  | 0.7177          | 0.6880 | 0.7177 | 0.8471 |
| No log        | 3.1702 | 298  | 0.6694          | 0.6973 | 0.6694 | 0.8182 |
| No log        | 3.1915 | 300  | 0.5992          | 0.6910 | 0.5992 | 0.7741 |
| No log        | 3.2128 | 302  | 0.6497          | 0.6648 | 0.6497 | 0.8060 |
| No log        | 3.2340 | 304  | 0.9107          | 0.5353 | 0.9107 | 0.9543 |
| No log        | 3.2553 | 306  | 1.0925          | 0.4318 | 1.0925 | 1.0452 |
| No log        | 3.2766 | 308  | 1.0429          | 0.4003 | 1.0429 | 1.0212 |
| No log        | 3.2979 | 310  | 0.8412          | 0.4095 | 0.8412 | 0.9172 |
| No log        | 3.3191 | 312  | 0.6646          | 0.5059 | 0.6646 | 0.8152 |
| No log        | 3.3404 | 314  | 0.6233          | 0.5205 | 0.6233 | 0.7895 |
| No log        | 3.3617 | 316  | 0.6060          | 0.5900 | 0.6060 | 0.7784 |
| No log        | 3.3830 | 318  | 0.6024          | 0.6104 | 0.6024 | 0.7762 |
| No log        | 3.4043 | 320  | 0.5967          | 0.6653 | 0.5967 | 0.7725 |
| No log        | 3.4255 | 322  | 0.6930          | 0.6530 | 0.6930 | 0.8325 |
| No log        | 3.4468 | 324  | 0.8456          | 0.6280 | 0.8456 | 0.9196 |
| No log        | 3.4681 | 326  | 0.9347          | 0.6103 | 0.9347 | 0.9668 |
| No log        | 3.4894 | 328  | 0.8202          | 0.6594 | 0.8202 | 0.9057 |
| No log        | 3.5106 | 330  | 0.6678          | 0.7150 | 0.6678 | 0.8172 |
| No log        | 3.5319 | 332  | 0.6610          | 0.7139 | 0.6610 | 0.8130 |
| No log        | 3.5532 | 334  | 0.6443          | 0.7113 | 0.6443 | 0.8027 |
| No log        | 3.5745 | 336  | 0.5915          | 0.7133 | 0.5915 | 0.7691 |
| No log        | 3.5957 | 338  | 0.6299          | 0.6809 | 0.6299 | 0.7937 |
| No log        | 3.6170 | 340  | 0.8117          | 0.5781 | 0.8117 | 0.9010 |
| No log        | 3.6383 | 342  | 1.0869          | 0.5108 | 1.0869 | 1.0426 |
| No log        | 3.6596 | 344  | 1.1670          | 0.4624 | 1.1670 | 1.0803 |
| No log        | 3.6809 | 346  | 1.1315          | 0.4334 | 1.1315 | 1.0637 |
| No log        | 3.7021 | 348  | 0.9681          | 0.3557 | 0.9681 | 0.9839 |
| No log        | 3.7234 | 350  | 0.8355          | 0.3726 | 0.8355 | 0.9141 |
| No log        | 3.7447 | 352  | 0.7805          | 0.4425 | 0.7805 | 0.8835 |
| No log        | 3.7660 | 354  | 0.7453          | 0.5191 | 0.7453 | 0.8633 |
| No log        | 3.7872 | 356  | 0.6977          | 0.5372 | 0.6977 | 0.8353 |
| No log        | 3.8085 | 358  | 0.6570          | 0.5306 | 0.6570 | 0.8105 |
| No log        | 3.8298 | 360  | 0.6821          | 0.5453 | 0.6821 | 0.8259 |
| No log        | 3.8511 | 362  | 0.7734          | 0.5928 | 0.7734 | 0.8794 |
| No log        | 3.8723 | 364  | 0.8651          | 0.5940 | 0.8651 | 0.9301 |
| No log        | 3.8936 | 366  | 0.8502          | 0.5908 | 0.8502 | 0.9221 |
| No log        | 3.9149 | 368  | 0.7957          | 0.6071 | 0.7957 | 0.8920 |
| No log        | 3.9362 | 370  | 0.6293          | 0.6741 | 0.6293 | 0.7933 |
| No log        | 3.9574 | 372  | 0.5445          | 0.7025 | 0.5445 | 0.7379 |
| No log        | 3.9787 | 374  | 0.5380          | 0.7025 | 0.5380 | 0.7335 |
| No log        | 4.0    | 376  | 0.5410          | 0.7094 | 0.5410 | 0.7355 |
| No log        | 4.0213 | 378  | 0.5642          | 0.6702 | 0.5642 | 0.7511 |
| No log        | 4.0426 | 380  | 0.5881          | 0.6287 | 0.5881 | 0.7669 |
| No log        | 4.0638 | 382  | 0.6616          | 0.5875 | 0.6616 | 0.8134 |
| No log        | 4.0851 | 384  | 0.6835          | 0.5993 | 0.6835 | 0.8267 |
| No log        | 4.1064 | 386  | 0.6574          | 0.6350 | 0.6574 | 0.8108 |
| No log        | 4.1277 | 388  | 0.5790          | 0.6859 | 0.5790 | 0.7609 |
| No log        | 4.1489 | 390  | 0.5634          | 0.7182 | 0.5634 | 0.7506 |
| No log        | 4.1702 | 392  | 0.6070          | 0.7065 | 0.6070 | 0.7791 |
| No log        | 4.1915 | 394  | 0.6264          | 0.6784 | 0.6264 | 0.7915 |
| No log        | 4.2128 | 396  | 0.7449          | 0.6500 | 0.7449 | 0.8630 |
| No log        | 4.2340 | 398  | 0.7664          | 0.6479 | 0.7664 | 0.8755 |
| No log        | 4.2553 | 400  | 0.6247          | 0.6952 | 0.6247 | 0.7904 |
| No log        | 4.2766 | 402  | 0.5720          | 0.7190 | 0.5720 | 0.7563 |
| No log        | 4.2979 | 404  | 0.6046          | 0.7120 | 0.6046 | 0.7776 |
| No log        | 4.3191 | 406  | 0.6890          | 0.6531 | 0.6890 | 0.8301 |
| No log        | 4.3404 | 408  | 0.7081          | 0.6258 | 0.7081 | 0.8415 |
| No log        | 4.3617 | 410  | 0.7887          | 0.6071 | 0.7887 | 0.8881 |
| No log        | 4.3830 | 412  | 0.7614          | 0.6029 | 0.7614 | 0.8726 |
| No log        | 4.4043 | 414  | 0.5887          | 0.6544 | 0.5887 | 0.7672 |
| No log        | 4.4255 | 416  | 0.5534          | 0.6704 | 0.5534 | 0.7439 |
| No log        | 4.4468 | 418  | 0.5740          | 0.6582 | 0.5740 | 0.7576 |
| No log        | 4.4681 | 420  | 0.5688          | 0.6769 | 0.5688 | 0.7542 |
| No log        | 4.4894 | 422  | 0.7319          | 0.6316 | 0.7319 | 0.8555 |
| No log        | 4.5106 | 424  | 0.8256          | 0.6290 | 0.8256 | 0.9086 |
| No log        | 4.5319 | 426  | 0.8411          | 0.6352 | 0.8411 | 0.9171 |
| No log        | 4.5532 | 428  | 0.7590          | 0.6421 | 0.7590 | 0.8712 |
| No log        | 4.5745 | 430  | 0.7167          | 0.6771 | 0.7167 | 0.8466 |
| No log        | 4.5957 | 432  | 0.6752          | 0.6589 | 0.6752 | 0.8217 |
| No log        | 4.6170 | 434  | 0.6656          | 0.7030 | 0.6656 | 0.8159 |
| No log        | 4.6383 | 436  | 0.6326          | 0.7083 | 0.6326 | 0.7954 |
| No log        | 4.6596 | 438  | 0.6287          | 0.6792 | 0.6287 | 0.7929 |
| No log        | 4.6809 | 440  | 0.5928          | 0.6710 | 0.5928 | 0.7699 |
| No log        | 4.7021 | 442  | 0.5707          | 0.6713 | 0.5707 | 0.7554 |
| No log        | 4.7234 | 444  | 0.5786          | 0.6789 | 0.5786 | 0.7607 |
| No log        | 4.7447 | 446  | 0.6081          | 0.6747 | 0.6081 | 0.7798 |
| No log        | 4.7660 | 448  | 0.6419          | 0.6769 | 0.6419 | 0.8012 |
| No log        | 4.7872 | 450  | 0.7496          | 0.6119 | 0.7496 | 0.8658 |
| No log        | 4.8085 | 452  | 0.7215          | 0.6359 | 0.7215 | 0.8494 |
| No log        | 4.8298 | 454  | 0.6190          | 0.6934 | 0.6190 | 0.7868 |
| No log        | 4.8511 | 456  | 0.5394          | 0.6777 | 0.5394 | 0.7345 |
| No log        | 4.8723 | 458  | 0.6038          | 0.6589 | 0.6038 | 0.7770 |
| No log        | 4.8936 | 460  | 0.6352          | 0.6352 | 0.6352 | 0.7970 |
| No log        | 4.9149 | 462  | 0.5894          | 0.6615 | 0.5894 | 0.7677 |
| No log        | 4.9362 | 464  | 0.5531          | 0.6921 | 0.5531 | 0.7437 |
| No log        | 4.9574 | 466  | 0.6797          | 0.6574 | 0.6797 | 0.8244 |
| No log        | 4.9787 | 468  | 0.8562          | 0.5981 | 0.8562 | 0.9253 |
| No log        | 5.0    | 470  | 0.9287          | 0.5849 | 0.9287 | 0.9637 |
| No log        | 5.0213 | 472  | 0.8604          | 0.6067 | 0.8604 | 0.9276 |
| No log        | 5.0426 | 474  | 0.6654          | 0.6790 | 0.6654 | 0.8157 |
| No log        | 5.0638 | 476  | 0.5868          | 0.6839 | 0.5868 | 0.7660 |
| No log        | 5.0851 | 478  | 0.5594          | 0.6701 | 0.5594 | 0.7480 |
| No log        | 5.1064 | 480  | 0.5695          | 0.6213 | 0.5695 | 0.7546 |
| No log        | 5.1277 | 482  | 0.5667          | 0.6463 | 0.5667 | 0.7528 |
| No log        | 5.1489 | 484  | 0.5769          | 0.6640 | 0.5769 | 0.7595 |
| No log        | 5.1702 | 486  | 0.6746          | 0.6307 | 0.6746 | 0.8214 |
| No log        | 5.1915 | 488  | 0.8034          | 0.5952 | 0.8034 | 0.8963 |
| No log        | 5.2128 | 490  | 0.7357          | 0.6277 | 0.7357 | 0.8577 |
| No log        | 5.2340 | 492  | 0.6018          | 0.6784 | 0.6018 | 0.7758 |
| No log        | 5.2553 | 494  | 0.6016          | 0.6803 | 0.6016 | 0.7756 |
| No log        | 5.2766 | 496  | 0.6520          | 0.6395 | 0.6520 | 0.8075 |
| No log        | 5.2979 | 498  | 0.5782          | 0.6995 | 0.5782 | 0.7604 |
| 0.609         | 5.3191 | 500  | 0.6230          | 0.6706 | 0.6230 | 0.7893 |
| 0.609         | 5.3404 | 502  | 0.8195          | 0.5868 | 0.8195 | 0.9053 |
| 0.609         | 5.3617 | 504  | 0.8613          | 0.5847 | 0.8613 | 0.9281 |
| 0.609         | 5.3830 | 506  | 0.7107          | 0.6360 | 0.7107 | 0.8431 |
| 0.609         | 5.4043 | 508  | 0.6229          | 0.6491 | 0.6229 | 0.7892 |
| 0.609         | 5.4255 | 510  | 0.5944          | 0.6687 | 0.5944 | 0.7710 |
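The Step and Epoch columns imply roughly 94 optimizer steps per epoch (step 500 falls at epoch 5.3191), so the scheduled horizon for 100 epochs is roughly 9,400 steps. Assuming zero warmup (none is reported), the `linear` scheduler decays the learning rate from 2e-05 toward zero across that horizon; a small illustrative sketch with the inferred step count:

```python
def linear_lr(step, base_lr=2e-5, total_steps=9400, warmup_steps=0):
    """Learning rate under transformers-style linear warmup + linear decay."""
    if step < warmup_steps:
        # ramp up from 0 to base_lr during warmup
        return base_lr * step / max(1, warmup_steps)
    # then decay linearly from base_lr down to 0 at total_steps
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

With these (inferred) numbers, the rate at step 500 is still about 1.9e-05, so the logged results above all come from the early, near-full-rate portion of the schedule.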

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask2_organization

  • Base model: aubmindlab/bert-base-arabertv02