ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run3_AugV5_k10_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9310
  • Qwk: 0.2119
  • Mse: 0.9310
  • Rmse: 0.9649
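Qwk here is Cohen's quadratically weighted kappa, the usual agreement metric for ordinal essay/score labels, and Rmse is simply the square root of Mse (which is why 0.9310 and 0.9649 are consistent above). A minimal sketch of how these three metrics can be computed with scikit-learn; the toy labels below are illustrative only, not drawn from this model's evaluation set:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Toy ordinal labels for illustration only -- not this model's data.
y_true = np.array([0, 1, 2, 2, 3])
y_pred = np.array([0, 1, 1, 2, 3])

# QWK: Cohen's kappa with quadratic weights, suited to ordinal scores
# because larger disagreements are penalized more heavily.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

# MSE, and RMSE as its square root.
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
```

Note that when the model is trained with an MSE-style regression loss, the reported Loss and Mse columns coincide, as they do throughout the results table below.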

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
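For readers who want to reproduce this setup, the hyperparameters above map onto a Transformers `TrainingArguments` configuration roughly as follows. This is a hedged sketch, not the exact training script: `output_dir` is an assumption (the card does not state one), and everything else is taken directly from the list above.

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="./results",          # assumed; not stated in the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```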

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0385 2 3.0560 0.0130 3.0560 1.7481
No log 0.0769 4 1.6247 0.0255 1.6247 1.2746
No log 0.1154 6 1.0353 0.0075 1.0353 1.0175
No log 0.1538 8 0.5984 0.0909 0.5984 0.7736
No log 0.1923 10 0.5697 0.1111 0.5697 0.7548
No log 0.2308 12 0.5497 0.0 0.5497 0.7414
No log 0.2692 14 0.5829 0.1746 0.5829 0.7635
No log 0.3077 16 0.7132 0.2077 0.7132 0.8445
No log 0.3462 18 0.8453 0.0201 0.8453 0.9194
No log 0.3846 20 0.7104 0.2000 0.7104 0.8428
No log 0.4231 22 0.5757 0.0 0.5757 0.7588
No log 0.4615 24 0.6502 0.0 0.6502 0.8063
No log 0.5 26 0.6637 0.0 0.6637 0.8147
No log 0.5385 28 0.6645 0.0 0.6645 0.8152
No log 0.5769 30 0.6336 0.0 0.6336 0.7960
No log 0.6154 32 0.5531 0.0388 0.5531 0.7437
No log 0.6538 34 0.6080 -0.0370 0.6080 0.7797
No log 0.6923 36 0.7242 -0.0864 0.7242 0.8510
No log 0.7308 38 0.9839 -0.0233 0.9839 0.9919
No log 0.7692 40 1.0290 0.0 1.0290 1.0144
No log 0.8077 42 0.9044 0.0 0.9044 0.9510
No log 0.8462 44 0.7221 0.1746 0.7221 0.8497
No log 0.8846 46 0.7117 0.1590 0.7117 0.8437
No log 0.9231 48 0.6400 0.2090 0.6400 0.8000
No log 0.9615 50 0.5824 0.0 0.5824 0.7631
No log 1.0 52 0.5956 0.0 0.5956 0.7717
No log 1.0385 54 0.6134 0.0 0.6134 0.7832
No log 1.0769 56 0.5499 0.0 0.5499 0.7416
No log 1.1154 58 0.5597 0.1795 0.5597 0.7481
No log 1.1538 60 0.6473 0.1919 0.6473 0.8046
No log 1.1923 62 0.5932 0.0952 0.5932 0.7702
No log 1.2308 64 0.5629 0.2222 0.5629 0.7503
No log 1.2692 66 0.6239 -0.0159 0.6239 0.7899
No log 1.3077 68 0.6180 -0.0159 0.6180 0.7861
No log 1.3462 70 0.6105 0.0638 0.6105 0.7814
No log 1.3846 72 0.6405 0.2298 0.6405 0.8003
No log 1.4231 74 0.6859 0.0986 0.6859 0.8282
No log 1.4615 76 0.7512 0.1828 0.7512 0.8667
No log 1.5 78 0.7025 0.2081 0.7025 0.8382
No log 1.5385 80 0.7082 0.2086 0.7082 0.8416
No log 1.5769 82 0.7320 0.0685 0.7320 0.8556
No log 1.6154 84 0.5786 0.2448 0.5786 0.7607
No log 1.6538 86 0.5618 0.2448 0.5618 0.7495
No log 1.6923 88 0.6751 0.1716 0.6751 0.8217
No log 1.7308 90 1.0002 0.1389 1.0002 1.0001
No log 1.7692 92 0.9292 0.1942 0.9292 0.9640
No log 1.8077 94 0.5347 0.3789 0.5347 0.7312
No log 1.8462 96 0.8447 0.1417 0.8447 0.9191
No log 1.8846 98 0.7145 0.2153 0.7145 0.8453
No log 1.9231 100 0.5288 0.4083 0.5288 0.7272
No log 1.9615 102 0.8838 0.0929 0.8838 0.9401
No log 2.0 104 1.1660 0.0530 1.1660 1.0798
No log 2.0385 106 1.2320 0.0853 1.2320 1.1099
No log 2.0769 108 1.1443 0.0662 1.1443 1.0697
No log 2.1154 110 0.9945 0.1161 0.9945 0.9972
No log 2.1538 112 0.9527 0.1751 0.9527 0.9761
No log 2.1923 114 1.1957 0.1268 1.1957 1.0935
No log 2.2308 116 1.4075 0.1897 1.4075 1.1864
No log 2.2692 118 1.6507 0.1420 1.6507 1.2848
No log 2.3077 120 1.2082 0.2250 1.2082 1.0992
No log 2.3462 122 0.6735 0.3498 0.6735 0.8206
No log 2.3846 124 0.6761 0.3488 0.6761 0.8222
No log 2.4231 126 0.6046 0.3103 0.6046 0.7776
No log 2.4615 128 0.7679 0.2922 0.7679 0.8763
No log 2.5 130 1.5539 0.1628 1.5539 1.2466
No log 2.5385 132 1.9535 0.1504 1.9535 1.3977
No log 2.5769 134 1.4500 0.1856 1.4500 1.2042
No log 2.6154 136 0.8918 0.2374 0.8918 0.9444
No log 2.6538 138 0.8787 0.2068 0.8787 0.9374
No log 2.6923 140 1.4836 0.1707 1.4836 1.2180
No log 2.7308 142 2.2462 0.1212 2.2462 1.4987
No log 2.7692 144 1.8315 0.1489 1.8315 1.3533
No log 2.8077 146 0.9123 0.2740 0.9123 0.9551
No log 2.8462 148 0.7199 0.2922 0.7199 0.8485
No log 2.8846 150 0.9186 0.3594 0.9186 0.9585
No log 2.9231 152 1.2953 0.1950 1.2953 1.1381
No log 2.9615 154 1.4669 0.1905 1.4669 1.2112
No log 3.0 156 1.0609 0.3197 1.0609 1.0300
No log 3.0385 158 0.8534 0.3791 0.8534 0.9238
No log 3.0769 160 1.0355 0.2703 1.0355 1.0176
No log 3.1154 162 1.2558 0.1903 1.2558 1.1206
No log 3.1538 164 1.3235 0.1906 1.3235 1.1504
No log 3.1923 166 1.0941 0.1693 1.0941 1.0460
No log 3.2308 168 1.3475 0.1681 1.3475 1.1608
No log 3.2692 170 1.4798 0.1549 1.4798 1.2165
No log 3.3077 172 1.1313 0.1738 1.1313 1.0636
No log 3.3462 174 0.9897 0.2000 0.9897 0.9948
No log 3.3846 176 0.9373 0.2806 0.9373 0.9681
No log 3.4231 178 0.8658 0.3125 0.8658 0.9305
No log 3.4615 180 1.1428 0.2204 1.1428 1.0690
No log 3.5 182 1.0593 0.2792 1.0593 1.0292
No log 3.5385 184 0.8596 0.3282 0.8596 0.9272
No log 3.5769 186 0.7356 0.3538 0.7356 0.8577
No log 3.6154 188 0.7633 0.3388 0.7633 0.8737
No log 3.6538 190 1.1629 0.1531 1.1629 1.0784
No log 3.6923 192 1.2574 0.1083 1.2574 1.1213
No log 3.7308 194 1.1564 0.1788 1.1564 1.0753
No log 3.7692 196 0.8122 0.3016 0.8122 0.9012
No log 3.8077 198 0.9000 0.3379 0.9000 0.9487
No log 3.8462 200 0.8896 0.3514 0.8896 0.9432
No log 3.8846 202 1.1991 0.2000 1.1991 1.0951
No log 3.9231 204 1.3384 0.1262 1.3384 1.1569
No log 3.9615 206 1.2420 0.2201 1.2420 1.1144
No log 4.0 208 1.0690 0.3421 1.0690 1.0339
No log 4.0385 210 1.1423 0.2147 1.1423 1.0688
No log 4.0769 212 0.9577 0.2792 0.9577 0.9786
No log 4.1154 214 1.0466 0.2475 1.0466 1.0230
No log 4.1538 216 1.3219 0.1821 1.3219 1.1497
No log 4.1923 218 1.1627 0.1399 1.1627 1.0783
No log 4.2308 220 0.7795 0.2803 0.7795 0.8829
No log 4.2692 222 0.6500 0.2811 0.6500 0.8062
No log 4.3077 224 0.6648 0.2523 0.6648 0.8153
No log 4.3462 226 0.7072 0.2511 0.7072 0.8410
No log 4.3846 228 0.8300 0.2479 0.8300 0.9110
No log 4.4231 230 1.2954 0.0831 1.2954 1.1382
No log 4.4615 232 1.5171 0.1543 1.5171 1.2317
No log 4.5 234 1.3487 0.1273 1.3487 1.1613
No log 4.5385 236 0.9387 0.3213 0.9387 0.9689
No log 4.5769 238 0.8117 0.3251 0.8117 0.9009
No log 4.6154 240 0.8444 0.3619 0.8444 0.9189
No log 4.6538 242 1.0318 0.1944 1.0318 1.0158
No log 4.6923 244 1.2319 0.0611 1.2319 1.1099
No log 4.7308 246 1.3091 0.0943 1.3091 1.1442
No log 4.7692 248 1.0638 0.1448 1.0638 1.0314
No log 4.8077 250 0.7437 0.4286 0.7437 0.8624
No log 4.8462 252 0.6701 0.4027 0.6701 0.8186
No log 4.8846 254 0.8534 0.3869 0.8534 0.9238
No log 4.9231 256 1.2323 0.1676 1.2323 1.1101
No log 4.9615 258 1.6445 0.1821 1.6445 1.2824
No log 5.0 260 1.5508 0.1821 1.5508 1.2453
No log 5.0385 262 1.1030 0.2584 1.1030 1.0503
No log 5.0769 264 0.7307 0.4378 0.7307 0.8548
No log 5.1154 266 0.7357 0.4378 0.7357 0.8577
No log 5.1538 268 1.0089 0.2973 1.0089 1.0045
No log 5.1923 270 1.3088 0.1771 1.3088 1.1440
No log 5.2308 272 1.3182 0.1617 1.3182 1.1481
No log 5.2692 274 1.0427 0.2211 1.0427 1.0211
No log 5.3077 276 0.7903 0.3898 0.7903 0.8890
No log 5.3462 278 0.8131 0.2941 0.8131 0.9017
No log 5.3846 280 1.0611 0.1111 1.0611 1.0301
No log 5.4231 282 1.4063 0.1622 1.4063 1.1859
No log 5.4615 284 1.4053 0.1622 1.4053 1.1855
No log 5.5 286 1.1022 0.1640 1.1022 1.0499
No log 5.5385 288 0.7930 0.3488 0.7930 0.8905
No log 5.5769 290 0.7503 0.3580 0.7503 0.8662
No log 5.6154 292 0.8840 0.2500 0.8840 0.9402
No log 5.6538 294 1.0642 0.1409 1.0642 1.0316
No log 5.6923 296 1.1079 0.1640 1.1079 1.0526
No log 5.7308 298 1.0592 0.1634 1.0592 1.0292
No log 5.7692 300 0.8228 0.3333 0.8228 0.9071
No log 5.8077 302 0.7705 0.3202 0.7705 0.8778
No log 5.8462 304 0.8680 0.2771 0.8680 0.9316
No log 5.8846 306 1.2034 0.1634 1.2034 1.0970
No log 5.9231 308 1.4379 0.1153 1.4379 1.1991
No log 5.9615 310 1.3236 0.1153 1.3236 1.1505
No log 6.0 312 0.9714 0.2180 0.9714 0.9856
No log 6.0385 314 0.7995 0.3306 0.7995 0.8942
No log 6.0769 316 0.8545 0.2756 0.8545 0.9244
No log 6.1154 318 1.0456 0.2239 1.0456 1.0226
No log 6.1538 320 1.2535 0.0903 1.2535 1.1196
No log 6.1923 322 1.2845 0.1377 1.2845 1.1334
No log 6.2308 324 1.2006 0.1850 1.2006 1.0957
No log 6.2692 326 0.9819 0.2552 0.9819 0.9909
No log 6.3077 328 0.8051 0.3750 0.8051 0.8973
No log 6.3462 330 0.8727 0.3623 0.8727 0.9342
No log 6.3846 332 1.1558 0.1801 1.1558 1.0751
No log 6.4231 334 1.5543 0.1818 1.5543 1.2467
No log 6.4615 336 1.6101 0.1818 1.6101 1.2689
No log 6.5 338 1.3995 0.1854 1.3995 1.1830
No log 6.5385 340 1.0911 0.1894 1.0911 1.0446
No log 6.5769 342 0.8607 0.2941 0.8607 0.9277
No log 6.6154 344 0.8187 0.3633 0.8187 0.9048
No log 6.6538 346 0.9039 0.2698 0.9039 0.9507
No log 6.6923 348 1.1489 0.1383 1.1489 1.0719
No log 6.7308 350 1.3353 0.1617 1.3353 1.1555
No log 6.7692 352 1.2930 0.1611 1.2930 1.1371
No log 6.8077 354 1.1603 0.1141 1.1603 1.0772
No log 6.8462 356 0.9854 0.1940 0.9854 0.9927
No log 6.8846 358 0.8315 0.2941 0.8315 0.9118
No log 6.9231 360 0.8226 0.2941 0.8226 0.9070
No log 6.9615 362 0.9465 0.2184 0.9465 0.9729
No log 7.0 364 1.0166 0.2109 1.0166 1.0083
No log 7.0385 366 1.1767 0.1850 1.1767 1.0848
No log 7.0769 368 1.2038 0.1850 1.2038 1.0972
No log 7.1154 370 1.0670 0.1579 1.0670 1.0329
No log 7.1538 372 0.8878 0.2374 0.8878 0.9423
No log 7.1923 374 0.8048 0.3202 0.8048 0.8971
No log 7.2308 376 0.8340 0.3202 0.8340 0.9133
No log 7.2692 378 0.9812 0.2401 0.9812 0.9905
No log 7.3077 380 1.0477 0.1572 1.0477 1.0236
No log 7.3462 382 0.9477 0.1875 0.9477 0.9735
No log 7.3846 384 0.8314 0.3202 0.8314 0.9118
No log 7.4231 386 0.8525 0.2713 0.8525 0.9233
No log 7.4615 388 0.8761 0.2713 0.8761 0.9360
No log 7.5 390 0.9384 0.2058 0.9384 0.9687
No log 7.5385 392 1.0053 0.1831 1.0053 1.0027
No log 7.5769 394 0.9612 0.2057 0.9612 0.9804
No log 7.6154 396 0.8961 0.2698 0.8961 0.9466
No log 7.6538 398 0.8675 0.2941 0.8675 0.9314
No log 7.6923 400 0.9002 0.2941 0.9002 0.9488
No log 7.7308 402 0.8865 0.2941 0.8865 0.9415
No log 7.7692 404 0.8858 0.2941 0.8858 0.9412
No log 7.8077 406 0.8821 0.2941 0.8821 0.9392
No log 7.8462 408 0.8551 0.2941 0.8551 0.9247
No log 7.8846 410 0.8639 0.2698 0.8639 0.9295
No log 7.9231 412 0.8441 0.2698 0.8441 0.9187
No log 7.9615 414 0.8172 0.2698 0.8172 0.9040
No log 8.0 416 0.8712 0.2520 0.8712 0.9334
No log 8.0385 418 0.8913 0.2188 0.8913 0.9441
No log 8.0769 420 0.8897 0.2188 0.8897 0.9433
No log 8.1154 422 0.8600 0.2191 0.8600 0.9274
No log 8.1538 424 0.8800 0.2188 0.8800 0.9381
No log 8.1923 426 0.9433 0.1882 0.9433 0.9712
No log 8.2308 428 0.9895 0.1367 0.9895 0.9947
No log 8.2692 430 0.9547 0.1317 0.9547 0.9771
No log 8.3077 432 0.8601 0.2191 0.8601 0.9274
No log 8.3462 434 0.7847 0.2980 0.7847 0.8859
No log 8.3846 436 0.7905 0.2960 0.7905 0.8891
No log 8.4231 438 0.7943 0.2960 0.7943 0.8912
No log 8.4615 440 0.8164 0.2713 0.8164 0.9035
No log 8.5 442 0.8289 0.2698 0.8289 0.9104
No log 8.5385 444 0.8606 0.2126 0.8606 0.9277
No log 8.5769 446 0.8835 0.1875 0.8835 0.9400
No log 8.6154 448 0.8499 0.1811 0.8499 0.9219
No log 8.6538 450 0.7957 0.3202 0.7957 0.8920
No log 8.6923 452 0.7532 0.3548 0.7532 0.8679
No log 8.7308 454 0.7217 0.3909 0.7217 0.8495
No log 8.7692 456 0.7290 0.3548 0.7290 0.8538
No log 8.8077 458 0.7880 0.3202 0.7880 0.8877
No log 8.8462 460 0.8990 0.2291 0.8990 0.9481
No log 8.8846 462 1.0196 0.1280 1.0196 1.0098
No log 8.9231 464 1.0846 0.1565 1.0846 1.0414
No log 8.9615 466 1.0942 0.1565 1.0942 1.0461
No log 9.0 468 1.0432 0.1280 1.0432 1.0214
No log 9.0385 470 0.9568 0.2119 0.9568 0.9782
No log 9.0769 472 0.8725 0.2374 0.8725 0.9341
No log 9.1154 474 0.8383 0.2960 0.8383 0.9156
No log 9.1538 476 0.8475 0.2713 0.8475 0.9206
No log 9.1923 478 0.8944 0.2374 0.8944 0.9457
No log 9.2308 480 0.9301 0.2119 0.9301 0.9644
No log 9.2692 482 0.9741 0.2119 0.9741 0.9869
No log 9.3077 484 0.9675 0.2119 0.9675 0.9836
No log 9.3462 486 0.9373 0.2119 0.9373 0.9682
No log 9.3846 488 0.8947 0.2374 0.8947 0.9459
No log 9.4231 490 0.8692 0.2381 0.8692 0.9323
No log 9.4615 492 0.8640 0.2381 0.8640 0.9295
No log 9.5 494 0.8685 0.2381 0.8685 0.9319
No log 9.5385 496 0.8605 0.2381 0.8605 0.9276
No log 9.5769 498 0.8724 0.2381 0.8724 0.9340
0.4035 9.6154 500 0.8764 0.2381 0.8764 0.9361
0.4035 9.6538 502 0.8922 0.2061 0.8922 0.9446
0.4035 9.6923 504 0.9142 0.2059 0.9142 0.9562
0.4035 9.7308 506 0.9289 0.2119 0.9289 0.9638
0.4035 9.7692 508 0.9390 0.2119 0.9390 0.9690
0.4035 9.8077 510 0.9429 0.2119 0.9429 0.9710
0.4035 9.8462 512 0.9409 0.2119 0.9409 0.9700
0.4035 9.8846 514 0.9408 0.2119 0.9408 0.9699
0.4035 9.9231 516 0.9371 0.2119 0.9371 0.9680
0.4035 9.9615 518 0.9329 0.2119 0.9329 0.9659
0.4035 10.0 520 0.9310 0.2119 0.9310 0.9649

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32
