ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k17_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8830
  • Qwk: 0.7030
  • Mse: 0.8830
  • Rmse: 0.9397
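
The card does not state how these metrics are computed, but for an ordinal scoring task like this one they typically come from scikit-learn: quadratic weighted kappa (Qwk) via `cohen_kappa_score`, plus MSE/RMSE. A minimal sketch — the `y_true`/`y_pred` values below are illustrative, not outputs of this model:

```python
# Sketch of the usual metric computation for ordinal score prediction.
# The labels here are hypothetical examples, not data from this model.
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = [0, 1, 2, 3, 4, 4, 2, 1]   # hypothetical gold scores
y_pred = [0, 1, 2, 2, 4, 3, 2, 1]   # hypothetical predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = mse ** 0.5
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```

Note that, as in the table below, Mse is the squared Rmse, and the reported Loss equals Mse, which suggests a regression-style MSE objective.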

Model description

More information needed

Intended uses & limitations

More information needed
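
Since no usage snippet is provided, here is a minimal inference sketch. The repository id is an assumption taken from this card's title, and the sequence-classification head is an assumption from the task setup; neither is confirmed by the card:

```python
# Sketch: scoring a text with this checkpoint via Transformers.
# REPO_ID is an assumption based on the model name in this card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

REPO_ID = ("MayBashendy/ArabicNewSplits4_WithDuplicationsForScore5_"
           "FineTuningAraBERT_run2_AugV5_k17_task5_organization")

def score_text(text: str, repo_id: str = REPO_ID) -> int:
    """Return the argmax class index (predicted score) for `text`."""
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForSequenceClassification.from_pretrained(repo_id)
    model.eval()
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return int(logits.argmax(dim=-1).item())

# Example call (downloads the checkpoint, so it is commented out here):
# print(score_text("نص عربي للتقييم"))
```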

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10

Training results

Training loss is logged every 500 steps, so rows before step 500 show "No log".

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0357 2 2.2479 0.0250 2.2479 1.4993
No log 0.0714 4 1.4431 0.2480 1.4431 1.2013
No log 0.1071 6 1.5100 0.1209 1.5100 1.2288
No log 0.1429 8 1.8074 0.2235 1.8074 1.3444
No log 0.1786 10 2.2957 0.1632 2.2957 1.5152
No log 0.2143 12 2.2682 0.1988 2.2682 1.5060
No log 0.25 14 2.2008 0.1950 2.2008 1.4835
No log 0.2857 16 1.8779 0.1786 1.8779 1.3704
No log 0.3214 18 1.8393 0.1130 1.8393 1.3562
No log 0.3571 20 1.8411 0.1573 1.8411 1.3569
No log 0.3929 22 1.9334 0.2315 1.9334 1.3905
No log 0.4286 24 1.9163 0.2824 1.9163 1.3843
No log 0.4643 26 1.9489 0.3026 1.9489 1.3960
No log 0.5 28 2.0296 0.2547 2.0296 1.4246
No log 0.5357 30 2.1725 0.2421 2.1725 1.4739
No log 0.5714 32 2.0421 0.2367 2.0421 1.4290
No log 0.6071 34 1.8270 0.2868 1.8270 1.3517
No log 0.6429 36 1.6186 0.3673 1.6186 1.2722
No log 0.6786 38 1.4785 0.3815 1.4785 1.2160
No log 0.7143 40 1.4320 0.3725 1.4320 1.1966
No log 0.75 42 1.4751 0.3932 1.4751 1.2145
No log 0.7857 44 1.5879 0.3903 1.5879 1.2601
No log 0.8214 46 1.4643 0.4125 1.4643 1.2101
No log 0.8571 48 1.4015 0.4125 1.4015 1.1839
No log 0.8929 50 1.3806 0.4269 1.3806 1.1750
No log 0.9286 52 1.5213 0.4025 1.5213 1.2334
No log 0.9643 54 1.2609 0.4430 1.2609 1.1229
No log 1.0 56 0.9927 0.4778 0.9927 0.9963
No log 1.0357 58 1.1180 0.4758 1.1180 1.0573
No log 1.0714 60 1.6603 0.5005 1.6603 1.2885
No log 1.1071 62 1.8937 0.4802 1.8937 1.3761
No log 1.1429 64 1.5883 0.5154 1.5883 1.2603
No log 1.1786 66 1.0759 0.5697 1.0759 1.0373
No log 1.2143 68 0.7704 0.6253 0.7704 0.8777
No log 1.25 70 0.7475 0.6766 0.7475 0.8646
No log 1.2857 72 0.9916 0.5964 0.9916 0.9958
No log 1.3214 74 1.8116 0.5168 1.8116 1.3459
No log 1.3571 76 2.3836 0.4154 2.3836 1.5439
No log 1.3929 78 2.2022 0.4737 2.2022 1.4840
No log 1.4286 80 1.3973 0.5580 1.3973 1.1821
No log 1.4643 82 0.9181 0.5772 0.9181 0.9582
No log 1.5 84 0.8708 0.5524 0.8708 0.9332
No log 1.5357 86 0.9721 0.5680 0.9721 0.9859
No log 1.5714 88 1.3659 0.5452 1.3659 1.1687
No log 1.6071 90 1.7115 0.5123 1.7115 1.3083
No log 1.6429 92 1.6654 0.5128 1.6654 1.2905
No log 1.6786 94 1.2007 0.5596 1.2007 1.0958
No log 1.7143 96 1.0399 0.6176 1.0399 1.0198
No log 1.75 98 1.2004 0.5538 1.2004 1.0956
No log 1.7857 100 1.6004 0.5138 1.6004 1.2651
No log 1.8214 102 1.5689 0.5043 1.5689 1.2526
No log 1.8571 104 1.1675 0.5595 1.1675 1.0805
No log 1.8929 106 0.9127 0.6271 0.9127 0.9553
No log 1.9286 108 0.9660 0.6198 0.9660 0.9828
No log 1.9643 110 1.2819 0.5720 1.2819 1.1322
No log 2.0 112 1.5373 0.5905 1.5373 1.2399
No log 2.0357 114 1.3853 0.5859 1.3853 1.1770
No log 2.0714 116 1.0144 0.6342 1.0144 1.0072
No log 2.1071 118 0.9580 0.6466 0.9580 0.9788
No log 2.1429 120 0.9727 0.6622 0.9727 0.9863
No log 2.1786 122 0.9161 0.6403 0.9161 0.9572
No log 2.2143 124 0.7332 0.7325 0.7332 0.8563
No log 2.25 126 0.7146 0.7355 0.7146 0.8454
No log 2.2857 128 0.7194 0.7355 0.7194 0.8482
No log 2.3214 130 0.7954 0.6813 0.7954 0.8919
No log 2.3571 132 1.0765 0.5917 1.0765 1.0375
No log 2.3929 134 1.1948 0.5765 1.1948 1.0931
No log 2.4286 136 1.0338 0.6160 1.0338 1.0167
No log 2.4643 138 0.8361 0.6654 0.8361 0.9144
No log 2.5 140 0.7871 0.6798 0.7871 0.8872
No log 2.5357 142 0.8649 0.6558 0.8649 0.9300
No log 2.5714 144 1.0452 0.6310 1.0452 1.0224
No log 2.6071 146 1.4949 0.5875 1.4949 1.2227
No log 2.6429 148 1.8900 0.5287 1.8900 1.3748
No log 2.6786 150 1.8142 0.5396 1.8142 1.3469
No log 2.7143 152 1.4617 0.5519 1.4617 1.2090
No log 2.75 154 1.0323 0.5789 1.0323 1.0160
No log 2.7857 156 0.7616 0.6629 0.7616 0.8727
No log 2.8214 158 0.7100 0.6586 0.7100 0.8426
No log 2.8571 160 0.7166 0.6727 0.7166 0.8465
No log 2.8929 162 0.7791 0.6431 0.7791 0.8826
No log 2.9286 164 0.8653 0.6065 0.8653 0.9302
No log 2.9643 166 1.0937 0.5658 1.0937 1.0458
No log 3.0 168 1.4897 0.5358 1.4897 1.2205
No log 3.0357 170 1.7035 0.5276 1.7035 1.3052
No log 3.0714 172 1.5947 0.5491 1.5947 1.2628
No log 3.1071 174 1.2534 0.5672 1.2534 1.1196
No log 3.1429 176 0.8909 0.6504 0.8909 0.9439
No log 3.1786 178 0.7639 0.6680 0.7639 0.8740
No log 3.2143 180 0.7084 0.6508 0.7084 0.8416
No log 3.25 182 0.7325 0.6508 0.7325 0.8559
No log 3.2857 184 0.8266 0.6509 0.8266 0.9092
No log 3.3214 186 1.0305 0.6010 1.0305 1.0152
No log 3.3571 188 1.2061 0.6059 1.2061 1.0982
No log 3.3929 190 1.1813 0.6096 1.1813 1.0869
No log 3.4286 192 1.0913 0.6045 1.0913 1.0446
No log 3.4643 194 0.9145 0.6253 0.9145 0.9563
No log 3.5 196 0.8390 0.6295 0.8390 0.9160
No log 3.5357 198 0.8423 0.6379 0.8423 0.9178
No log 3.5714 200 0.9201 0.6346 0.9201 0.9592
No log 3.6071 202 1.0619 0.6247 1.0619 1.0305
No log 3.6429 204 1.3130 0.6145 1.3130 1.1459
No log 3.6786 206 1.4413 0.6075 1.4413 1.2005
No log 3.7143 208 1.3873 0.6170 1.3873 1.1778
No log 3.75 210 1.1332 0.6272 1.1332 1.0645
No log 3.7857 212 0.9009 0.6514 0.9009 0.9492
No log 3.8214 214 0.7168 0.7392 0.7168 0.8466
No log 3.8571 216 0.6753 0.7685 0.6753 0.8218
No log 3.8929 218 0.6980 0.7507 0.6980 0.8355
No log 3.9286 220 0.8059 0.7041 0.8059 0.8977
No log 3.9643 222 1.0091 0.6360 1.0091 1.0045
No log 4.0 224 1.1226 0.6321 1.1226 1.0595
No log 4.0357 226 1.1260 0.6264 1.1260 1.0611
No log 4.0714 228 1.0933 0.6381 1.0933 1.0456
No log 4.1071 230 1.0439 0.6364 1.0439 1.0217
No log 4.1429 232 1.0708 0.6166 1.0708 1.0348
No log 4.1786 234 1.1820 0.6107 1.1820 1.0872
No log 4.2143 236 1.2371 0.6136 1.2371 1.1123
No log 4.25 238 1.1794 0.6122 1.1794 1.0860
No log 4.2857 240 1.0063 0.6323 1.0063 1.0031
No log 4.3214 242 0.8067 0.7022 0.8067 0.8982
No log 4.3571 244 0.7069 0.7698 0.7069 0.8408
No log 4.3929 246 0.7151 0.7544 0.7151 0.8456
No log 4.4286 248 0.7917 0.7476 0.7917 0.8897
No log 4.4643 250 0.8275 0.7307 0.8275 0.9097
No log 4.5 252 0.8469 0.7116 0.8469 0.9203
No log 4.5357 254 0.8537 0.7032 0.8537 0.9239
No log 4.5714 256 0.8041 0.6960 0.8041 0.8967
No log 4.6071 258 0.7329 0.7074 0.7329 0.8561
No log 4.6429 260 0.7210 0.7253 0.7210 0.8491
No log 4.6786 262 0.7042 0.7383 0.7042 0.8392
No log 4.7143 264 0.6708 0.7530 0.6708 0.8190
No log 4.75 266 0.6547 0.7555 0.6547 0.8091
No log 4.7857 268 0.6604 0.7627 0.6604 0.8127
No log 4.8214 270 0.7357 0.7447 0.7357 0.8578
No log 4.8571 272 0.8177 0.7011 0.8177 0.9043
No log 4.8929 274 0.7882 0.7152 0.7882 0.8878
No log 4.9286 276 0.6971 0.7686 0.6971 0.8349
No log 4.9643 278 0.6625 0.7617 0.6625 0.8139
No log 5.0 280 0.6657 0.7425 0.6657 0.8159
No log 5.0357 282 0.6959 0.7590 0.6959 0.8342
No log 5.0714 284 0.7830 0.7120 0.7830 0.8849
No log 5.1071 286 0.9209 0.6804 0.9209 0.9596
No log 5.1429 288 0.9449 0.6804 0.9449 0.9721
No log 5.1786 290 0.8858 0.6930 0.8858 0.9412
No log 5.2143 292 0.9065 0.6804 0.9065 0.9521
No log 5.25 294 0.9023 0.6804 0.9023 0.9499
No log 5.2857 296 0.8090 0.6991 0.8090 0.8994
No log 5.3214 298 0.7822 0.7144 0.7822 0.8844
No log 5.3571 300 0.8422 0.6859 0.8422 0.9177
No log 5.3929 302 0.9412 0.6768 0.9412 0.9702
No log 5.4286 304 0.9769 0.6732 0.9769 0.9884
No log 5.4643 306 0.9442 0.6768 0.9442 0.9717
No log 5.5 308 0.9237 0.6717 0.9237 0.9611
No log 5.5357 310 0.8887 0.6703 0.8887 0.9427
No log 5.5714 312 0.8553 0.6832 0.8553 0.9248
No log 5.6071 314 0.8278 0.6759 0.8278 0.9098
No log 5.6429 316 0.8081 0.6964 0.8081 0.8989
No log 5.6786 318 0.8459 0.6815 0.8459 0.9197
No log 5.7143 320 0.9257 0.6537 0.9257 0.9621
No log 5.75 322 1.0597 0.6027 1.0597 1.0294
No log 5.7857 324 1.2570 0.6010 1.2570 1.1212
No log 5.8214 326 1.3307 0.6051 1.3307 1.1535
No log 5.8571 328 1.2934 0.6037 1.2934 1.1373
No log 5.8929 330 1.1696 0.6142 1.1696 1.0815
No log 5.9286 332 1.0142 0.6607 1.0142 1.0071
No log 5.9643 334 0.9325 0.6666 0.9325 0.9657
No log 6.0 336 0.9110 0.6719 0.9110 0.9544
No log 6.0357 338 0.9841 0.6586 0.9841 0.9920
No log 6.0714 340 1.0809 0.6404 1.0809 1.0397
No log 6.1071 342 1.1418 0.6204 1.1418 1.0685
No log 6.1429 344 1.1082 0.6256 1.1082 1.0527
No log 6.1786 346 1.0235 0.6493 1.0235 1.0117
No log 6.2143 348 0.9202 0.6549 0.9202 0.9593
No log 6.25 350 0.8382 0.6936 0.8382 0.9155
No log 6.2857 352 0.8146 0.7041 0.8146 0.9026
No log 6.3214 354 0.8349 0.6949 0.8349 0.9137
No log 6.3571 356 0.8289 0.6962 0.8289 0.9105
No log 6.3929 358 0.8324 0.6916 0.8324 0.9124
No log 6.4286 360 0.8701 0.6630 0.8701 0.9328
No log 6.4643 362 0.9064 0.6550 0.9064 0.9520
No log 6.5 364 0.9290 0.6520 0.9290 0.9638
No log 6.5357 366 0.9080 0.6680 0.9080 0.9529
No log 6.5714 368 0.8291 0.6752 0.8291 0.9105
No log 6.6071 370 0.7613 0.7248 0.7613 0.8725
No log 6.6429 372 0.7452 0.7289 0.7452 0.8633
No log 6.6786 374 0.7353 0.7289 0.7353 0.8575
No log 6.7143 376 0.7669 0.7343 0.7669 0.8757
No log 6.75 378 0.8639 0.6951 0.8639 0.9295
No log 6.7857 380 0.9241 0.6743 0.9241 0.9613
No log 6.8214 382 0.9277 0.6666 0.9277 0.9632
No log 6.8571 384 0.9086 0.6697 0.9086 0.9532
No log 6.8929 386 0.8747 0.6857 0.8747 0.9353
No log 6.9286 388 0.8268 0.7190 0.8268 0.9093
No log 6.9643 390 0.8355 0.7077 0.8355 0.9141
No log 7.0 392 0.8642 0.6951 0.8642 0.9296
No log 7.0357 394 0.8657 0.6885 0.8657 0.9304
No log 7.0714 396 0.8491 0.7012 0.8491 0.9215
No log 7.1071 398 0.8427 0.7122 0.8427 0.9180
No log 7.1429 400 0.8725 0.6923 0.8725 0.9341
No log 7.1786 402 0.8984 0.6938 0.8984 0.9478
No log 7.2143 404 0.9334 0.6688 0.9334 0.9661
No log 7.25 406 0.9581 0.6726 0.9581 0.9788
No log 7.2857 408 0.9454 0.6651 0.9454 0.9723
No log 7.3214 410 0.9100 0.6743 0.9100 0.9539
No log 7.3571 412 0.8930 0.6868 0.8930 0.9450
No log 7.3929 414 0.8502 0.7103 0.8502 0.9221
No log 7.4286 416 0.8326 0.7263 0.8326 0.9125
No log 7.4643 418 0.8488 0.7152 0.8488 0.9213
No log 7.5 420 0.8776 0.7062 0.8776 0.9368
No log 7.5357 422 0.9158 0.6727 0.9158 0.9570
No log 7.5714 424 0.9298 0.6628 0.9298 0.9643
No log 7.6071 426 0.9800 0.6644 0.9800 0.9900
No log 7.6429 428 1.0019 0.6652 1.0019 1.0009
No log 7.6786 430 1.0068 0.6652 1.0068 1.0034
No log 7.7143 432 0.9920 0.6652 0.9920 0.9960
No log 7.75 434 0.9387 0.6809 0.9387 0.9689
No log 7.7857 436 0.8696 0.6960 0.8696 0.9325
No log 7.8214 438 0.8147 0.7249 0.8147 0.9026
No log 7.8571 440 0.7813 0.7323 0.7813 0.8839
No log 7.8929 442 0.7828 0.7248 0.7828 0.8848
No log 7.9286 444 0.8127 0.7019 0.8127 0.9015
No log 7.9643 446 0.8191 0.7019 0.8191 0.9051
No log 8.0 448 0.8417 0.7000 0.8417 0.9175
No log 8.0357 450 0.8734 0.6800 0.8734 0.9346
No log 8.0714 452 0.9213 0.6797 0.9213 0.9599
No log 8.1071 454 0.9509 0.6613 0.9509 0.9751
No log 8.1429 456 0.9606 0.6604 0.9606 0.9801
No log 8.1786 458 0.9765 0.6521 0.9765 0.9882
No log 8.2143 460 0.9722 0.6521 0.9722 0.9860
No log 8.25 462 0.9558 0.6604 0.9558 0.9777
No log 8.2857 464 0.9193 0.6784 0.9193 0.9588
No log 8.3214 466 0.8781 0.6817 0.8781 0.9371
No log 8.3571 468 0.8559 0.6834 0.8559 0.9251
No log 8.3929 470 0.8520 0.6834 0.8520 0.9230
No log 8.4286 472 0.8635 0.6834 0.8635 0.9293
No log 8.4643 474 0.8677 0.6912 0.8677 0.9315
No log 8.5 476 0.8955 0.6912 0.8955 0.9463
No log 8.5357 478 0.9381 0.6712 0.9381 0.9686
No log 8.5714 480 0.9744 0.6530 0.9744 0.9871
No log 8.6071 482 1.0148 0.6607 1.0148 1.0074
No log 8.6429 484 1.0451 0.6465 1.0451 1.0223
No log 8.6786 486 1.0535 0.6465 1.0535 1.0264
No log 8.7143 488 1.0540 0.6465 1.0540 1.0266
No log 8.75 490 1.0373 0.6465 1.0373 1.0185
No log 8.7857 492 1.0085 0.6503 1.0085 1.0042
No log 8.8214 494 0.9710 0.6613 0.9710 0.9854
No log 8.8571 496 0.9375 0.6797 0.9375 0.9682
No log 8.8929 498 0.9330 0.6797 0.9330 0.9659
0.3325 8.9286 500 0.9469 0.6712 0.9469 0.9731
0.3325 8.9643 502 0.9552 0.6674 0.9552 0.9774
0.3325 9.0 504 0.9593 0.6590 0.9593 0.9794
0.3325 9.0357 506 0.9601 0.6696 0.9601 0.9799
0.3325 9.0714 508 0.9632 0.6696 0.9632 0.9814
0.3325 9.1071 510 0.9643 0.6636 0.9643 0.9820
0.3325 9.1429 512 0.9522 0.6673 0.9522 0.9758
0.3325 9.1786 514 0.9386 0.6819 0.9386 0.9688
0.3325 9.2143 516 0.9273 0.6943 0.9273 0.9630
0.3325 9.25 518 0.9208 0.6943 0.9208 0.9596
0.3325 9.2857 520 0.9186 0.6943 0.9186 0.9585
0.3325 9.3214 522 0.9184 0.6943 0.9184 0.9583
0.3325 9.3571 524 0.9144 0.6943 0.9144 0.9562
0.3325 9.3929 526 0.9080 0.6943 0.9080 0.9529
0.3325 9.4286 528 0.8981 0.7030 0.8981 0.9477
0.3325 9.4643 530 0.8862 0.7030 0.8862 0.9414
0.3325 9.5 532 0.8759 0.7138 0.8759 0.9359
0.3325 9.5357 534 0.8629 0.7085 0.8629 0.9289
0.3325 9.5714 536 0.8602 0.7083 0.8602 0.9275
0.3325 9.6071 538 0.8647 0.7085 0.8647 0.9299
0.3325 9.6429 540 0.8668 0.7030 0.8668 0.9310
0.3325 9.6786 542 0.8665 0.7026 0.8665 0.9309
0.3325 9.7143 544 0.8679 0.7030 0.8679 0.9316
0.3325 9.75 546 0.8705 0.7030 0.8705 0.9330
0.3325 9.7857 548 0.8735 0.7030 0.8735 0.9346
0.3325 9.8214 550 0.8752 0.7030 0.8752 0.9355
0.3325 9.8571 552 0.8775 0.7030 0.8775 0.9368
0.3325 9.8929 554 0.8805 0.7030 0.8805 0.9383
0.3325 9.9286 556 0.8822 0.7030 0.8822 0.9392
0.3325 9.9643 558 0.8828 0.7030 0.8828 0.9396
0.3325 10.0 560 0.8830 0.7030 0.8830 0.9397

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
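
To reproduce this environment, the pinned versions above can be installed; the cu118 extra index URL is an assumption inferred from the `+cu118` PyTorch build tag:

```shell
# Pin the exact framework versions listed above.
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
# The cu118 index is an assumption for the +cu118 build of torch 2.4.0.
pip install "torch==2.4.0+cu118" --index-url https://download.pytorch.org/whl/cu118
```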

Format: Safetensors · 0.1B params · F32 tensors

Model tree for MayBashendy/ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k17_task5_organization

Fine-tuned from aubmindlab/bert-base-arabertv02 (4,023 fine-tunes, including this model).