ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k13_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 0.9348
  • Qwk: 0.6922
  • Mse: 0.9348
  • Rmse: 0.9668
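Here Qwk is the quadratic weighted Cohen's kappa, Mse the mean squared error, and Rmse its square root. The evaluation code itself is not included in this card; the following is a minimal sketch, assuming integer organization scores as labels, of how these three numbers can be reproduced with scikit-learn:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold scores and model predictions (integer rubric scores).
y_true = np.array([3, 4, 2, 5, 3, 4])
y_pred = np.array([3, 3, 2, 5, 4, 4])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # quadratic weighted kappa
mse = mean_squared_error(y_true, y_pred)                      # mean squared error
rmse = np.sqrt(mse)                                           # root mean squared error

print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```

Note that the reported Loss equals Mse, which is consistent with a regression-style (MSE) training objective over the scores.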

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
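These values map directly onto transformers.TrainingArguments. Below is a minimal sketch, assuming the standard Trainer API from the Transformers version listed under "Framework versions"; the output directory is a placeholder, and the datasets and preprocessing used for this run are not documented in this card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="outputs",           # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    adam_beta1=0.9,                 # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
# These arguments would be passed to transformers.Trainer together with the
# aubmindlab/bert-base-arabertv02 model and the (unspecified) fine-tuning data.
```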

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0455 2 2.1680 0.0156 2.1680 1.4724
No log 0.0909 4 1.6328 0.0759 1.6328 1.2778
No log 0.1364 6 1.7104 0.1495 1.7104 1.3078
No log 0.1818 8 1.8170 0.1722 1.8170 1.3480
No log 0.2273 10 1.8184 0.2171 1.8184 1.3485
No log 0.2727 12 1.5745 0.2915 1.5745 1.2548
No log 0.3182 14 1.3895 0.2868 1.3895 1.1788
No log 0.3636 16 1.4464 0.3895 1.4464 1.2027
No log 0.4091 18 1.7245 0.2592 1.7245 1.3132
No log 0.4545 20 1.7516 0.2823 1.7516 1.3235
No log 0.5 22 1.7216 0.2741 1.7216 1.3121
No log 0.5455 24 1.7452 0.3527 1.7452 1.3211
No log 0.5909 26 1.8904 0.2585 1.8904 1.3749
No log 0.6364 28 1.9483 0.2440 1.9483 1.3958
No log 0.6818 30 1.8016 0.3158 1.8016 1.3423
No log 0.7273 32 1.6147 0.3312 1.6147 1.2707
No log 0.7727 34 1.6195 0.3318 1.6195 1.2726
No log 0.8182 36 1.7093 0.3179 1.7093 1.3074
No log 0.8636 38 2.0448 0.3231 2.0448 1.4300
No log 0.9091 40 2.4748 0.3008 2.4748 1.5731
No log 0.9545 42 2.5003 0.2930 2.5003 1.5812
No log 1.0 44 2.6207 0.3084 2.6207 1.6188
No log 1.0455 46 2.3959 0.3450 2.3959 1.5479
No log 1.0909 48 1.9169 0.3512 1.9169 1.3845
No log 1.1364 50 1.6162 0.4136 1.6162 1.2713
No log 1.1818 52 1.4347 0.4604 1.4347 1.1978
No log 1.2273 54 1.3761 0.4604 1.3761 1.1731
No log 1.2727 56 1.4589 0.4397 1.4589 1.2078
No log 1.3182 58 1.6848 0.3992 1.6848 1.2980
No log 1.3636 60 1.8731 0.3906 1.8731 1.3686
No log 1.4091 62 2.0583 0.3542 2.0583 1.4347
No log 1.4545 64 2.2413 0.3484 2.2413 1.4971
No log 1.5 66 2.1829 0.3439 2.1829 1.4775
No log 1.5455 68 2.0682 0.3909 2.0682 1.4381
No log 1.5909 70 2.0189 0.4207 2.0189 1.4209
No log 1.6364 72 1.9723 0.4055 1.9723 1.4044
No log 1.6818 74 1.8348 0.4221 1.8348 1.3545
No log 1.7273 76 1.7452 0.4499 1.7452 1.3211
No log 1.7727 78 1.6438 0.4535 1.6438 1.2821
No log 1.8182 80 1.5829 0.4680 1.5829 1.2581
No log 1.8636 82 1.6914 0.5012 1.6914 1.3005
No log 1.9091 84 1.5737 0.5111 1.5737 1.2545
No log 1.9545 86 1.3352 0.6001 1.3352 1.1555
No log 2.0 88 1.1905 0.6172 1.1905 1.0911
No log 2.0455 90 1.0371 0.6507 1.0371 1.0184
No log 2.0909 92 0.8948 0.6661 0.8948 0.9459
No log 2.1364 94 0.9398 0.6516 0.9398 0.9694
No log 2.1818 96 1.0247 0.6694 1.0247 1.0123
No log 2.2273 98 1.2178 0.6540 1.2178 1.1035
No log 2.2727 100 1.2186 0.6599 1.2186 1.1039
No log 2.3182 102 1.0244 0.6610 1.0244 1.0121
No log 2.3636 104 0.7632 0.6957 0.7632 0.8736
No log 2.4091 106 0.6927 0.6924 0.6927 0.8323
No log 2.4545 108 0.7726 0.6893 0.7726 0.8790
No log 2.5 110 0.9003 0.6839 0.9003 0.9489
No log 2.5455 112 1.1230 0.6547 1.1230 1.0597
No log 2.5909 114 1.2304 0.6269 1.2304 1.1092
No log 2.6364 116 1.0800 0.6601 1.0800 1.0392
No log 2.6818 118 0.8570 0.6638 0.8570 0.9257
No log 2.7273 120 0.7264 0.6875 0.7264 0.8523
No log 2.7727 122 0.7480 0.6707 0.7480 0.8649
No log 2.8182 124 0.8036 0.6582 0.8036 0.8964
No log 2.8636 126 0.9755 0.6382 0.9755 0.9877
No log 2.9091 128 1.1850 0.5768 1.1850 1.0886
No log 2.9545 130 1.3301 0.5298 1.3301 1.1533
No log 3.0 132 1.4826 0.5221 1.4826 1.2176
No log 3.0455 134 1.7295 0.5208 1.7295 1.3151
No log 3.0909 136 1.6953 0.5265 1.6953 1.3020
No log 3.1364 138 1.3039 0.5930 1.3039 1.1419
No log 3.1818 140 0.9401 0.6564 0.9401 0.9696
No log 3.2273 142 0.8240 0.6293 0.8240 0.9077
No log 3.2727 144 0.8376 0.6188 0.8376 0.9152
No log 3.3182 146 0.9447 0.6547 0.9447 0.9719
No log 3.3636 148 1.1681 0.5956 1.1681 1.0808
No log 3.4091 150 1.1462 0.6054 1.1462 1.0706
No log 3.4545 152 0.9793 0.6248 0.9793 0.9896
No log 3.5 154 1.0076 0.6277 1.0076 1.0038
No log 3.5455 156 1.0594 0.6249 1.0594 1.0293
No log 3.5909 158 1.1173 0.6418 1.1173 1.0570
No log 3.6364 160 1.1570 0.6577 1.1570 1.0757
No log 3.6818 162 1.2352 0.6212 1.2352 1.1114
No log 3.7273 164 1.1864 0.6484 1.1864 1.0892
No log 3.7727 166 1.0808 0.6584 1.0808 1.0396
No log 3.8182 168 0.8829 0.6944 0.8829 0.9397
No log 3.8636 170 0.7716 0.6890 0.7716 0.8784
No log 3.9091 172 0.8033 0.7072 0.8033 0.8963
No log 3.9545 174 0.8920 0.6914 0.8920 0.9444
No log 4.0 176 0.9966 0.6597 0.9966 0.9983
No log 4.0455 178 1.1360 0.6426 1.1360 1.0658
No log 4.0909 180 1.0744 0.6207 1.0744 1.0366
No log 4.1364 182 1.0083 0.6176 1.0083 1.0042
No log 4.1818 184 0.9834 0.6387 0.9834 0.9917
No log 4.2273 186 0.9251 0.6511 0.9251 0.9618
No log 4.2727 188 0.9644 0.6268 0.9644 0.9820
No log 4.3182 190 1.0571 0.5935 1.0571 1.0281
No log 4.3636 192 1.0903 0.5923 1.0903 1.0442
No log 4.4091 194 1.1260 0.5923 1.1260 1.0611
No log 4.4545 196 1.1701 0.6180 1.1701 1.0817
No log 4.5 198 1.1850 0.6252 1.1850 1.0886
No log 4.5455 200 1.1898 0.6252 1.1898 1.0908
No log 4.5909 202 1.1688 0.6290 1.1688 1.0811
No log 4.6364 204 1.1923 0.6290 1.1923 1.0919
No log 4.6818 206 1.0664 0.6189 1.0664 1.0326
No log 4.7273 208 0.9397 0.6425 0.9397 0.9694
No log 4.7727 210 0.8336 0.6502 0.8336 0.9130
No log 4.8182 212 0.8407 0.6582 0.8407 0.9169
No log 4.8636 214 0.9293 0.6525 0.9293 0.9640
No log 4.9091 216 0.9983 0.6673 0.9983 0.9991
No log 4.9545 218 1.1453 0.6465 1.1453 1.0702
No log 5.0 220 1.2508 0.6187 1.2508 1.1184
No log 5.0455 222 1.3117 0.5845 1.3117 1.1453
No log 5.0909 224 1.3098 0.5550 1.3098 1.1445
No log 5.1364 226 1.2640 0.5479 1.2640 1.1243
No log 5.1818 228 1.3530 0.5427 1.3530 1.1632
No log 5.2273 230 1.4168 0.5385 1.4168 1.1903
No log 5.2727 232 1.4206 0.5328 1.4206 1.1919
No log 5.3182 234 1.3220 0.5485 1.3220 1.1498
No log 5.3636 236 1.3153 0.5509 1.3153 1.1469
No log 5.4091 238 1.3569 0.5485 1.3569 1.1649
No log 5.4545 240 1.4514 0.5377 1.4514 1.2048
No log 5.5 242 1.4963 0.5519 1.4963 1.2232
No log 5.5455 244 1.4482 0.5526 1.4482 1.2034
No log 5.5909 246 1.2731 0.5655 1.2731 1.1283
No log 5.6364 248 1.0790 0.6282 1.0790 1.0387
No log 5.6818 250 0.9366 0.6512 0.9366 0.9678
No log 5.7273 252 0.9308 0.6351 0.9308 0.9648
No log 5.7727 254 0.9845 0.6479 0.9845 0.9922
No log 5.8182 256 1.0534 0.6426 1.0534 1.0263
No log 5.8636 258 1.0612 0.6441 1.0612 1.0301
No log 5.9091 260 0.9605 0.6456 0.9605 0.9800
No log 5.9545 262 0.8629 0.6651 0.8629 0.9289
No log 6.0 264 0.8376 0.6322 0.8376 0.9152
No log 6.0455 266 0.9106 0.6256 0.9106 0.9542
No log 6.0909 268 1.0069 0.6353 1.0069 1.0035
No log 6.1364 270 1.0423 0.6391 1.0423 1.0209
No log 6.1818 272 1.0152 0.6560 1.0152 1.0076
No log 6.2273 274 0.9728 0.6789 0.9728 0.9863
No log 6.2727 276 0.9504 0.6755 0.9504 0.9749
No log 6.3182 278 0.9836 0.6816 0.9836 0.9918
No log 6.3636 280 1.1087 0.6488 1.1087 1.0529
No log 6.4091 282 1.2708 0.5922 1.2708 1.1273
No log 6.4545 284 1.4613 0.5370 1.4613 1.2089
No log 6.5 286 1.5798 0.5335 1.5798 1.2569
No log 6.5455 288 1.5181 0.5270 1.5181 1.2321
No log 6.5909 290 1.3714 0.5621 1.3714 1.1711
No log 6.6364 292 1.2285 0.5879 1.2285 1.1084
No log 6.6818 294 1.1118 0.6467 1.1118 1.0544
No log 6.7273 296 1.0283 0.6650 1.0283 1.0140
No log 6.7727 298 1.0000 0.6714 1.0000 1.0000
No log 6.8182 300 0.9810 0.6864 0.9810 0.9904
No log 6.8636 302 0.9951 0.6633 0.9951 0.9976
No log 6.9091 304 0.9607 0.6912 0.9607 0.9802
No log 6.9545 306 0.9703 0.6844 0.9703 0.9850
No log 7.0 308 0.9991 0.6712 0.9991 0.9995
No log 7.0455 310 0.9882 0.6861 0.9882 0.9941
No log 7.0909 312 1.0004 0.6861 1.0004 1.0002
No log 7.1364 314 1.0352 0.6444 1.0352 1.0174
No log 7.1818 316 1.0081 0.6826 1.0081 1.0040
No log 7.2273 318 0.9418 0.6928 0.9418 0.9705
No log 7.2727 320 0.8704 0.6797 0.8704 0.9330
No log 7.3182 322 0.8219 0.7019 0.8219 0.9066
No log 7.3636 324 0.8386 0.7209 0.8386 0.9157
No log 7.4091 326 0.9234 0.7076 0.9234 0.9609
No log 7.4545 328 1.0385 0.6636 1.0385 1.0191
No log 7.5 330 1.0820 0.6621 1.0820 1.0402
No log 7.5455 332 1.1311 0.6492 1.1311 1.0635
No log 7.5909 334 1.1112 0.6483 1.1112 1.0541
No log 7.6364 336 1.0311 0.6800 1.0311 1.0154
No log 7.6818 338 0.9723 0.7033 0.9723 0.9861
No log 7.7273 340 0.9598 0.7169 0.9598 0.9797
No log 7.7727 342 0.9825 0.6947 0.9825 0.9912
No log 7.8182 344 0.9962 0.6947 0.9962 0.9981
No log 7.8636 346 0.9611 0.7169 0.9611 0.9804
No log 7.9091 348 0.9370 0.7176 0.9370 0.9680
No log 7.9545 350 0.9181 0.7182 0.9181 0.9582
No log 8.0 352 0.8935 0.7182 0.8935 0.9453
No log 8.0455 354 0.9114 0.7182 0.9114 0.9547
No log 8.0909 356 0.9576 0.7169 0.9576 0.9786
No log 8.1364 358 0.9577 0.7169 0.9577 0.9786
No log 8.1818 360 0.9285 0.7169 0.9285 0.9636
No log 8.2273 362 0.9116 0.7168 0.9116 0.9548
No log 8.2727 364 0.8762 0.7136 0.8762 0.9361
No log 8.3182 366 0.8460 0.7098 0.8460 0.9198
No log 8.3636 368 0.8410 0.7098 0.8410 0.9171
No log 8.4091 370 0.8608 0.7131 0.8608 0.9278
No log 8.4545 372 0.9000 0.7136 0.9000 0.9487
No log 8.5 374 0.9549 0.7058 0.9549 0.9772
No log 8.5455 376 1.0017 0.6774 1.0017 1.0008
No log 8.5909 378 1.0070 0.6774 1.0070 1.0035
No log 8.6364 380 0.9815 0.6998 0.9815 0.9907
No log 8.6818 382 0.9338 0.6998 0.9338 0.9664
No log 8.7273 384 0.8806 0.6811 0.8806 0.9384
No log 8.7727 386 0.8417 0.7059 0.8417 0.9174
No log 8.8182 388 0.8354 0.7059 0.8354 0.9140
No log 8.8636 390 0.8400 0.7059 0.8400 0.9165
No log 8.9091 392 0.8641 0.6908 0.8641 0.9296
No log 8.9545 394 0.8877 0.6811 0.8877 0.9422
No log 9.0 396 0.8985 0.6811 0.8985 0.9479
No log 9.0455 398 0.9036 0.6889 0.9036 0.9506
No log 9.0909 400 0.9251 0.6998 0.9251 0.9618
No log 9.1364 402 0.9542 0.6998 0.9542 0.9768
No log 9.1818 404 0.9855 0.6998 0.9855 0.9927
No log 9.2273 406 1.0125 0.6774 1.0125 1.0062
No log 9.2727 408 1.0331 0.6774 1.0331 1.0164
No log 9.3182 410 1.0313 0.6774 1.0313 1.0155
No log 9.3636 412 1.0213 0.6774 1.0213 1.0106
No log 9.4091 414 1.0044 0.6910 1.0044 1.0022
No log 9.4545 416 0.9925 0.6910 0.9925 0.9962
No log 9.5 418 0.9849 0.6833 0.9849 0.9924
No log 9.5455 420 0.9721 0.6922 0.9721 0.9860
No log 9.5909 422 0.9616 0.6922 0.9616 0.9806
No log 9.6364 424 0.9480 0.6922 0.9480 0.9736
No log 9.6818 426 0.9406 0.6922 0.9406 0.9698
No log 9.7273 428 0.9368 0.6922 0.9368 0.9679
No log 9.7727 430 0.9323 0.6922 0.9323 0.9656
No log 9.8182 432 0.9272 0.6922 0.9272 0.9629
No log 9.8636 434 0.9269 0.6922 0.9269 0.9628
No log 9.9091 436 0.9299 0.6922 0.9299 0.9643
No log 9.9545 438 0.9334 0.6922 0.9334 0.9661
No log 10.0 440 0.9348 0.6922 0.9348 0.9668

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
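
A minimal loading sketch with the Transformers version above. The head configuration (regression vs. classification over organization scores) is an assumption, as it is not documented in this card, and the input text is a placeholder:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k13_task5_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "..."  # placeholder: an Arabic essay/response to be scored for organization
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.logits)  # interpreted as the predicted organization score(s)
```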