ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run1_AugV5_k10_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6270
  • Qwk: 0.3427
  • Mse: 0.6270
  • Rmse: 0.7919
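
Qwk here is quadratic weighted kappa and Rmse is the square root of Mse. As a sanity check, all three metrics can be recomputed with scikit-learn; the label arrays below are illustrative stand-ins, not the model's actual evaluation outputs.

```python
# Recompute Qwk / Mse / Rmse from predictions and gold labels.
# The arrays are toy data for demonstration only.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 2, 3, 1, 0, 2])  # hypothetical gold scores
y_pred = np.array([0, 1, 2, 3, 3, 0, 1, 2])  # hypothetical model scores

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)     # 0.3750 for these toy arrays
rmse = float(np.sqrt(mse))

print(f"Qwk:  {qwk:.4f}")
print(f"Mse:  {mse:.4f}")
print(f"Rmse: {rmse:.4f}")
```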

Model description

More information needed

Intended uses & limitations

More information needed
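
Pending more detail from the authors, the checkpoint can presumably be loaded like any AraBERT-based classifier. This is a hedged sketch only: the sequence-classification head and the task (scoring the organization of Arabic text, inferred from the repository name) are assumptions, not documented facts.

```python
# Minimal inference sketch. Assumptions: the checkpoint loads with a
# sequence-classification head; the model id is the repo name from this card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run1_AugV5_k10_task3_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "نص عربي للتقييم"  # an Arabic passage to score
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)
```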

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
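
The Adam and linear-schedule settings above can be mirrored in plain PyTorch. This is a sketch, not the actual training script: the linear layer is a placeholder model, and the 520 total steps are inferred from the training log (step 520 at epoch 10.0, i.e. 52 steps per epoch).

```python
# Reproduce the optimizer/scheduler configuration from the hyperparameter list.
# The tiny Linear module is a stand-in for the fine-tuned AraBERT model.
import torch
from torch.optim import Adam
from torch.optim.lr_scheduler import LambdaLR

torch.manual_seed(42)                       # seed: 42
model = torch.nn.Linear(4, 2)               # placeholder for AraBERT

optimizer = Adam(model.parameters(), lr=2e-5, betas=(0.9, 0.999), eps=1e-8)

num_epochs, steps_per_epoch = 10, 52        # 520 total steps, per the log
total_steps = num_epochs * steps_per_epoch
scheduler = LambdaLR(optimizer, lambda step: max(0.0, 1 - step / total_steps))

for _ in range(3):                          # a few dummy optimizer steps
    optimizer.step()
    scheduler.step()
print(optimizer.param_groups[0]["lr"])      # slightly below 2e-5 after decay
```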

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0385 2 3.0940 0.0 3.0940 1.7590
No log 0.0769 4 1.5951 -0.0070 1.5951 1.2630
No log 0.1154 6 1.4346 0.0294 1.4346 1.1978
No log 0.1538 8 0.7851 0.1193 0.7851 0.8861
No log 0.1923 10 0.5697 0.0569 0.5697 0.7548
No log 0.2308 12 0.5884 0.0569 0.5884 0.7671
No log 0.2692 14 0.6858 0.0409 0.6858 0.8282
No log 0.3077 16 0.9507 0.0333 0.9507 0.9750
No log 0.3462 18 0.5940 0.0815 0.5940 0.7707
No log 0.3846 20 0.5813 0.0 0.5813 0.7624
No log 0.4231 22 0.5883 0.0 0.5883 0.7670
No log 0.4615 24 0.6226 -0.0233 0.6226 0.7890
No log 0.5 26 0.6363 -0.0159 0.6363 0.7977
No log 0.5385 28 0.6652 0.0 0.6652 0.8156
No log 0.5769 30 0.6079 0.0222 0.6079 0.7797
No log 0.6154 32 0.7196 -0.0115 0.7196 0.8483
No log 0.6538 34 0.6693 0.1079 0.6693 0.8181
No log 0.6923 36 0.7809 0.2000 0.7809 0.8837
No log 0.7308 38 0.6303 0.1884 0.6303 0.7939
No log 0.7692 40 0.7227 0.1828 0.7227 0.8501
No log 0.8077 42 1.1450 0.0722 1.1450 1.0701
No log 0.8462 44 0.9036 0.0667 0.9036 0.9506
No log 0.8846 46 0.6278 0.0327 0.6278 0.7923
No log 0.9231 48 0.5851 0.2222 0.5851 0.7649
No log 0.9615 50 0.7074 0.2000 0.7074 0.8410
No log 1.0 52 0.7136 0.3292 0.7136 0.8447
No log 1.0385 54 0.6361 0.2308 0.6361 0.7976
No log 1.0769 56 0.7147 0.2217 0.7147 0.8454
No log 1.1154 58 0.6265 0.2688 0.6265 0.7915
No log 1.1538 60 0.6984 0.2653 0.6984 0.8357
No log 1.1923 62 0.8597 0.1588 0.8597 0.9272
No log 1.2308 64 0.6757 0.2157 0.6757 0.8220
No log 1.2692 66 0.7182 0.2511 0.7182 0.8475
No log 1.3077 68 1.1504 0.1818 1.1504 1.0726
No log 1.3462 70 0.9133 0.2126 0.9133 0.9557
No log 1.3846 72 0.5896 0.2513 0.5896 0.7678
No log 1.4231 74 1.1163 0.1367 1.1163 1.0566
No log 1.4615 76 1.0878 0.1941 1.0878 1.0430
No log 1.5 78 0.7334 0.2453 0.7334 0.8564
No log 1.5385 80 0.6297 0.3365 0.6297 0.7935
No log 1.5769 82 0.7532 0.2554 0.7532 0.8679
No log 1.6154 84 0.6024 0.3684 0.6024 0.7762
No log 1.6538 86 0.7146 0.2536 0.7146 0.8453
No log 1.6923 88 0.6201 0.3200 0.6201 0.7874
No log 1.7308 90 0.5750 0.3814 0.5750 0.7583
No log 1.7692 92 0.5499 0.2174 0.5499 0.7415
No log 1.8077 94 0.5320 0.2626 0.5320 0.7294
No log 1.8462 96 0.5444 0.3402 0.5444 0.7378
No log 1.8846 98 0.5194 0.2201 0.5194 0.7207
No log 1.9231 100 0.7644 0.3427 0.7644 0.8743
No log 1.9615 102 0.9706 0.2000 0.9706 0.9852
No log 2.0 104 0.7395 0.3398 0.7395 0.8599
No log 2.0385 106 0.5820 0.3600 0.5820 0.7629
No log 2.0769 108 0.6158 0.2941 0.6158 0.7847
No log 2.1154 110 0.8891 0.2566 0.8891 0.9429
No log 2.1538 112 1.3960 0.1096 1.3960 1.1815
No log 2.1923 114 1.2119 0.1304 1.2119 1.1009
No log 2.2308 116 0.7600 0.2850 0.7600 0.8718
No log 2.2692 118 0.6581 0.3333 0.6581 0.8112
No log 2.3077 120 0.6692 0.2821 0.6692 0.8180
No log 2.3462 122 1.2495 0.0831 1.2495 1.1178
No log 2.3846 124 1.4629 0.0976 1.4629 1.2095
No log 2.4231 126 0.8126 0.2233 0.8126 0.9015
No log 2.4615 128 0.6210 0.3035 0.6210 0.7880
No log 2.5 130 0.7142 0.2364 0.7142 0.8451
No log 2.5385 132 0.6008 0.3725 0.6008 0.7751
No log 2.5769 134 1.0506 0.1822 1.0506 1.0250
No log 2.6154 136 1.3995 0.1324 1.3995 1.1830
No log 2.6538 138 0.9210 0.2134 0.9210 0.9597
No log 2.6923 140 0.6155 0.2917 0.6155 0.7845
No log 2.7308 142 0.7058 0.2696 0.7058 0.8401
No log 2.7692 144 0.7629 0.2068 0.7629 0.8734
No log 2.8077 146 0.7024 0.2646 0.7024 0.8381
No log 2.8462 148 0.6957 0.2842 0.6957 0.8341
No log 2.8846 150 0.9127 0.2000 0.9127 0.9553
No log 2.9231 152 0.8347 0.1790 0.8347 0.9136
No log 2.9615 154 0.7550 0.3663 0.7550 0.8689
No log 3.0 156 0.9224 0.1628 0.9224 0.9604
No log 3.0385 158 0.9170 0.1880 0.9170 0.9576
No log 3.0769 160 0.7063 0.3469 0.7063 0.8404
No log 3.1154 162 1.1872 0.0727 1.1872 1.0896
No log 3.1538 164 1.4203 0.1125 1.4203 1.1918
No log 3.1923 166 0.9159 0.2134 0.9159 0.9570
No log 3.2308 168 0.5797 0.3446 0.5797 0.7614
No log 3.2692 170 0.5712 0.2821 0.5712 0.7558
No log 3.3077 172 0.5674 0.3535 0.5674 0.7533
No log 3.3462 174 0.5757 0.4545 0.5757 0.7587
No log 3.3846 176 0.5788 0.4475 0.5788 0.7608
No log 3.4231 178 0.7936 0.2397 0.7936 0.8908
No log 3.4615 180 1.2288 0.1842 1.2288 1.1085
No log 3.5 182 1.3175 0.1553 1.3175 1.1478
No log 3.5385 184 1.0904 0.2055 1.0904 1.0442
No log 3.5769 186 0.7008 0.3498 0.7008 0.8371
No log 3.6154 188 0.9640 0.3038 0.9640 0.9818
No log 3.6538 190 1.0213 0.2848 1.0213 1.0106
No log 3.6923 192 0.6850 0.3833 0.6850 0.8276
No log 3.7308 194 1.1357 0.2053 1.1357 1.0657
No log 3.7692 196 1.9651 0.0612 1.9651 1.4018
No log 3.8077 198 1.9540 0.0607 1.9540 1.3979
No log 3.8462 200 1.3492 0.1455 1.3492 1.1615
No log 3.8846 202 0.6733 0.2670 0.6733 0.8205
No log 3.9231 204 0.5496 0.3469 0.5496 0.7413
No log 3.9615 206 0.6129 0.3052 0.6129 0.7829
No log 4.0 208 0.5139 0.4157 0.5139 0.7169
No log 4.0385 210 0.6173 0.3478 0.6173 0.7857
No log 4.0769 212 0.9629 0.2000 0.9629 0.9813
No log 4.1154 214 1.1517 0.1438 1.1517 1.0732
No log 4.1538 216 0.9021 0.2569 0.9021 0.9498
No log 4.1923 218 0.6088 0.3118 0.6088 0.7803
No log 4.2308 220 0.5840 0.3617 0.5840 0.7642
No log 4.2692 222 0.5892 0.3297 0.5892 0.7676
No log 4.3077 224 0.7273 0.3561 0.7273 0.8528
No log 4.3462 226 1.0001 0.1828 1.0001 1.0001
No log 4.3846 228 1.0055 0.1828 1.0055 1.0027
No log 4.4231 230 0.7231 0.2965 0.7231 0.8503
No log 4.4615 232 0.5861 0.3927 0.5861 0.7656
No log 4.5 234 0.5824 0.3927 0.5824 0.7631
No log 4.5385 236 0.6784 0.2670 0.6784 0.8237
No log 4.5769 238 0.9041 0.2199 0.9041 0.9508
No log 4.6154 240 0.8591 0.2566 0.8591 0.9269
No log 4.6538 242 0.7404 0.2245 0.7404 0.8605
No log 4.6923 244 0.6586 0.3786 0.6586 0.8115
No log 4.7308 246 0.6579 0.3786 0.6579 0.8111
No log 4.7692 248 0.7164 0.3561 0.7164 0.8464
No log 4.8077 250 0.8173 0.2134 0.8173 0.9041
No log 4.8462 252 0.8221 0.2140 0.8221 0.9067
No log 4.8846 254 0.8193 0.1855 0.8193 0.9052
No log 4.9231 256 0.6782 0.3803 0.6782 0.8235
No log 4.9615 258 0.5889 0.2513 0.5889 0.7674
No log 5.0 260 0.6019 0.3498 0.6019 0.7758
No log 5.0385 262 0.6078 0.3878 0.6078 0.7796
No log 5.0769 264 0.6803 0.3761 0.6803 0.8248
No log 5.1154 266 0.9100 0.1506 0.9100 0.9539
No log 5.1538 268 0.9846 0.1278 0.9846 0.9923
No log 5.1923 270 0.7996 0.2442 0.7996 0.8942
No log 5.2308 272 0.6333 0.2965 0.6333 0.7958
No log 5.2692 274 0.6300 0.3561 0.6300 0.7938
No log 5.3077 276 0.6184 0.4462 0.6184 0.7864
No log 5.3462 278 0.7011 0.3761 0.7011 0.8373
No log 5.3846 280 0.8976 0.2713 0.8976 0.9474
No log 5.4231 282 0.8876 0.2713 0.8876 0.9421
No log 5.4615 284 0.7211 0.2830 0.7211 0.8492
No log 5.5 286 0.6427 0.3892 0.6427 0.8017
No log 5.5385 288 0.6774 0.3427 0.6774 0.8230
No log 5.5769 290 0.7779 0.2523 0.7779 0.8820
No log 5.6154 292 0.7615 0.2811 0.7615 0.8726
No log 5.6538 294 0.7302 0.2563 0.7302 0.8545
No log 5.6923 296 0.7010 0.3035 0.7010 0.8372
No log 5.7308 298 0.5841 0.3200 0.5841 0.7642
No log 5.7692 300 0.5338 0.4098 0.5338 0.7306
No log 5.8077 302 0.5564 0.4098 0.5564 0.7459
No log 5.8462 304 0.5326 0.4157 0.5326 0.7298
No log 5.8846 306 0.5381 0.4220 0.5381 0.7335
No log 5.9231 308 0.5754 0.4098 0.5754 0.7585
No log 5.9615 310 0.6942 0.2621 0.6942 0.8332
No log 6.0 312 0.8788 0.1597 0.8788 0.9375
No log 6.0385 314 0.9218 0.1880 0.9218 0.9601
No log 6.0769 316 0.7467 0.2222 0.7467 0.8641
No log 6.1154 318 0.6001 0.3439 0.6001 0.7747
No log 6.1538 320 0.5980 0.2990 0.5980 0.7733
No log 6.1923 322 0.6535 0.3498 0.6535 0.8084
No log 6.2308 324 0.7394 0.3391 0.7394 0.8599
No log 6.2692 326 0.6931 0.3333 0.6931 0.8325
No log 6.3077 328 0.5871 0.3990 0.5871 0.7662
No log 6.3462 330 0.5334 0.3708 0.5334 0.7303
No log 6.3846 332 0.5294 0.3708 0.5294 0.7276
No log 6.4231 334 0.5400 0.3478 0.5400 0.7349
No log 6.4615 336 0.6680 0.2653 0.6680 0.8173
No log 6.5 338 0.9453 0.1892 0.9453 0.9723
No log 6.5385 340 1.0224 0.1892 1.0224 1.0111
No log 6.5769 342 0.8763 0.2432 0.8763 0.9361
No log 6.6154 344 0.6449 0.3200 0.6449 0.8030
No log 6.6538 346 0.5564 0.3797 0.5564 0.7459
No log 6.6923 348 0.5596 0.3548 0.5596 0.7481
No log 6.7308 350 0.5632 0.3263 0.5632 0.7505
No log 6.7692 352 0.6115 0.3641 0.6115 0.7820
No log 6.8077 354 0.6532 0.3200 0.6532 0.8082
No log 6.8462 356 0.6240 0.3641 0.6240 0.7900
No log 6.8846 358 0.6090 0.3263 0.6090 0.7804
No log 6.9231 360 0.5830 0.2990 0.5830 0.7635
No log 6.9615 362 0.5804 0.3016 0.5804 0.7618
No log 7.0 364 0.5995 0.2990 0.5995 0.7743
No log 7.0385 366 0.7015 0.2965 0.7015 0.8376
No log 7.0769 368 0.8892 0.2554 0.8892 0.9430
No log 7.1154 370 0.9962 0.1875 0.9962 0.9981
No log 7.1538 372 0.9509 0.2195 0.9509 0.9751
No log 7.1923 374 0.7925 0.2850 0.7925 0.8902
No log 7.2308 376 0.6700 0.2941 0.6700 0.8186
No log 7.2692 378 0.6679 0.3623 0.6679 0.8172
No log 7.3077 380 0.6668 0.4175 0.6668 0.8166
No log 7.3462 382 0.6772 0.2523 0.6772 0.8229
No log 7.3846 384 0.8048 0.3091 0.8048 0.8971
No log 7.4231 386 1.0111 0.1873 1.0111 1.0055
No log 7.4615 388 1.0952 0.1594 1.0952 1.0465
No log 7.5 390 1.0061 0.1880 1.0061 1.0030
No log 7.5385 392 0.8588 0.2294 0.8588 0.9267
No log 7.5769 394 0.6960 0.2941 0.6960 0.8343
No log 7.6154 396 0.5798 0.4043 0.5798 0.7614
No log 7.6538 398 0.5589 0.2000 0.5589 0.7476
No log 7.6923 400 0.5690 0.2644 0.5690 0.7543
No log 7.7308 402 0.5571 0.2644 0.5571 0.7464
No log 7.7692 404 0.5492 0.3563 0.5492 0.7411
No log 7.8077 406 0.5819 0.4043 0.5819 0.7628
No log 7.8462 408 0.7082 0.2941 0.7082 0.8415
No log 7.8846 410 0.8460 0.2275 0.8460 0.9198
No log 7.9231 412 0.8912 0.2269 0.8912 0.9440
No log 7.9615 414 0.8306 0.2275 0.8306 0.9114
No log 8.0 416 0.7227 0.2941 0.7227 0.8501
No log 8.0385 418 0.6280 0.3462 0.6280 0.7925
No log 8.0769 420 0.5828 0.4627 0.5828 0.7634
No log 8.1154 422 0.5713 0.4764 0.5713 0.7558
No log 8.1538 424 0.5753 0.4286 0.5753 0.7585
No log 8.1923 426 0.5779 0.4627 0.5779 0.7602
No log 8.2308 428 0.5849 0.4627 0.5849 0.7648
No log 8.2692 430 0.6093 0.3462 0.6093 0.7806
No log 8.3077 432 0.6332 0.3171 0.6332 0.7957
No log 8.3462 434 0.6384 0.3171 0.6384 0.7990
No log 8.3846 436 0.6406 0.3171 0.6406 0.8004
No log 8.4231 438 0.6504 0.3171 0.6504 0.8064
No log 8.4615 440 0.6179 0.3462 0.6179 0.7861
No log 8.5 442 0.5983 0.4175 0.5983 0.7735
No log 8.5385 444 0.5937 0.4627 0.5937 0.7705
No log 8.5769 446 0.5787 0.4627 0.5787 0.7607
No log 8.6154 448 0.5655 0.4627 0.5655 0.7520
No log 8.6538 450 0.5692 0.4286 0.5692 0.7545
No log 8.6923 452 0.5894 0.3939 0.5894 0.7677
No log 8.7308 454 0.6164 0.3171 0.6164 0.7851
No log 8.7692 456 0.6259 0.3143 0.6259 0.7911
No log 8.8077 458 0.6437 0.3143 0.6437 0.8023
No log 8.8462 460 0.6709 0.3143 0.6709 0.8191
No log 8.8846 462 0.6831 0.3143 0.6831 0.8265
No log 8.9231 464 0.6629 0.3143 0.6629 0.8142
No log 8.9615 466 0.6273 0.3077 0.6273 0.7920
No log 9.0 468 0.5880 0.4043 0.5880 0.7668
No log 9.0385 470 0.5694 0.4764 0.5694 0.7546
No log 9.0769 472 0.5582 0.5052 0.5582 0.7471
No log 9.1154 474 0.5563 0.5052 0.5563 0.7458
No log 9.1538 476 0.5594 0.5052 0.5594 0.7480
No log 9.1923 478 0.5730 0.4043 0.5730 0.7569
No log 9.2308 480 0.5980 0.3535 0.5980 0.7733
No log 9.2692 482 0.6290 0.3427 0.6290 0.7931
No log 9.3077 484 0.6582 0.3143 0.6582 0.8113
No log 9.3462 486 0.6770 0.3143 0.6770 0.8228
No log 9.3846 488 0.6782 0.3143 0.6782 0.8236
No log 9.4231 490 0.6775 0.3143 0.6775 0.8231
No log 9.4615 492 0.6672 0.3143 0.6672 0.8168
No log 9.5 494 0.6566 0.3143 0.6566 0.8103
No log 9.5385 496 0.6573 0.3143 0.6573 0.8107
No log 9.5769 498 0.6681 0.3143 0.6681 0.8174
0.4167 9.6154 500 0.6682 0.3143 0.6682 0.8174
0.4167 9.6538 502 0.6577 0.3143 0.6577 0.8110
0.4167 9.6923 504 0.6503 0.3143 0.6503 0.8064
0.4167 9.7308 506 0.6414 0.3427 0.6414 0.8009
0.4167 9.7692 508 0.6381 0.3427 0.6381 0.7988
0.4167 9.8077 510 0.6375 0.3427 0.6375 0.7984
0.4167 9.8462 512 0.6356 0.3427 0.6356 0.7973
0.4167 9.8846 514 0.6330 0.3427 0.6330 0.7956
0.4167 9.9231 516 0.6294 0.3427 0.6294 0.7934
0.4167 9.9615 518 0.6279 0.3427 0.6279 0.7924
0.4167 10.0 520 0.6270 0.3427 0.6270 0.7919

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size: 0.1B params (F32, safetensors)
