ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k14_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the training dataset is not specified in this card. It achieves the following results on the evaluation set:

  • Loss: 1.0650
  • QWK (quadratic weighted kappa): 0.1611
  • MSE: 1.0650
  • RMSE: 1.0320
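The reported RMSE is simply the square root of the reported MSE, which can be verified directly from the numbers above:

```python
import math

mse = 1.0650  # final evaluation MSE reported above
rmse = math.sqrt(mse)
print(round(rmse, 4))  # 1.032
```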

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
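With lr_scheduler_type: linear and no warmup, the learning rate decays linearly from 2e-05 at step 0 to 0 at the final step. A minimal sketch of that schedule (the step counts below are illustrative; the actual total is derived from the dataset size, batch size, and num_epochs):

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-05) -> float:
    """Linearly decay the learning rate from base_lr down to 0 over total_steps."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)

# Full learning rate at the start of training...
print(linear_lr(0, 1000))    # 2e-05
# ...half the learning rate halfway through.
print(linear_lr(500, 1000))  # 1e-05
```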

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0417 2 2.5943 -0.0788 2.5943 1.6107
No log 0.0833 4 1.4325 0.1002 1.4325 1.1969
No log 0.125 6 0.8247 0.1786 0.8247 0.9081
No log 0.1667 8 0.9930 -0.0831 0.9930 0.9965
No log 0.2083 10 1.4592 -0.1097 1.4592 1.2080
No log 0.25 12 1.1959 -0.2100 1.1959 1.0936
No log 0.2917 14 0.9111 0.0359 0.9111 0.9545
No log 0.3333 16 0.8444 0.0757 0.8444 0.9189
No log 0.375 18 0.8455 0.0 0.8455 0.9195
No log 0.4167 20 0.8282 0.0 0.8282 0.9101
No log 0.4583 22 0.7893 0.1508 0.7893 0.8884
No log 0.5 24 0.8306 0.2589 0.8306 0.9114
No log 0.5417 26 0.8866 0.2574 0.8866 0.9416
No log 0.5833 28 0.8403 0.1648 0.8403 0.9167
No log 0.625 30 0.8295 0.0757 0.8295 0.9108
No log 0.6667 32 0.8537 0.0 0.8537 0.9239
No log 0.7083 34 0.8197 0.0359 0.8197 0.9054
No log 0.75 36 0.7974 0.0717 0.7974 0.8930
No log 0.7917 38 0.7886 0.0265 0.7886 0.8880
No log 0.8333 40 0.7907 0.1648 0.7907 0.8892
No log 0.875 42 0.7807 0.1699 0.7807 0.8836
No log 0.9167 44 0.7934 0.0410 0.7934 0.8907
No log 0.9583 46 0.8174 -0.0339 0.8174 0.9041
No log 1.0 48 0.7641 -0.0054 0.7641 0.8741
No log 1.0417 50 0.7275 -0.0054 0.7275 0.8529
No log 1.0833 52 0.7240 0.1807 0.7240 0.8509
No log 1.125 54 0.7434 0.1007 0.7434 0.8622
No log 1.1667 56 0.8451 0.1924 0.8451 0.9193
No log 1.2083 58 0.9690 0.1571 0.9690 0.9844
No log 1.25 60 0.9378 0.2495 0.9378 0.9684
No log 1.2917 62 0.8724 0.1483 0.8724 0.9340
No log 1.3333 64 0.9158 0.1492 0.9158 0.9570
No log 1.375 66 0.9361 0.1379 0.9361 0.9675
No log 1.4167 68 0.9737 0.0262 0.9737 0.9867
No log 1.4583 70 0.9870 -0.0138 0.9870 0.9935
No log 1.5 72 1.0128 -0.0204 1.0128 1.0064
No log 1.5417 74 1.0311 -0.0204 1.0311 1.0154
No log 1.5833 76 1.0573 0.0054 1.0573 1.0282
No log 1.625 78 1.0874 0.0248 1.0874 1.0428
No log 1.6667 80 1.0864 -0.0600 1.0864 1.0423
No log 1.7083 82 1.0706 -0.0561 1.0706 1.0347
No log 1.75 84 1.1082 -0.0951 1.1082 1.0527
No log 1.7917 86 1.1601 -0.0652 1.1601 1.0771
No log 1.8333 88 1.1967 0.1045 1.1967 1.0939
No log 1.875 90 1.2142 0.0673 1.2142 1.1019
No log 1.9167 92 1.2047 0.0390 1.2047 1.0976
No log 1.9583 94 1.1698 0.0768 1.1698 1.0816
No log 2.0 96 1.1769 0.0896 1.1769 1.0849
No log 2.0417 98 1.0100 0.2493 1.0100 1.0050
No log 2.0833 100 1.0267 0.1884 1.0267 1.0132
No log 2.125 102 0.9833 0.2054 0.9833 0.9916
No log 2.1667 104 1.0987 0.0459 1.0987 1.0482
No log 2.2083 106 1.0759 0.0488 1.0759 1.0372
No log 2.25 108 0.9478 -0.0907 0.9478 0.9735
No log 2.2917 110 0.8499 0.1007 0.8499 0.9219
No log 2.3333 112 1.0063 0.2923 1.0063 1.0031
No log 2.375 114 1.0507 0.2119 1.0507 1.0250
No log 2.4167 116 0.9050 0.2784 0.9050 0.9513
No log 2.4583 118 0.7912 0.2285 0.7912 0.8895
No log 2.5 120 0.8238 0.1379 0.8238 0.9076
No log 2.5417 122 0.8768 0.2458 0.8768 0.9364
No log 2.5833 124 1.0033 0.2544 1.0033 1.0017
No log 2.625 126 1.0679 0.2544 1.0679 1.0334
No log 2.6667 128 1.1224 0.2219 1.1224 1.0594
No log 2.7083 130 1.1958 0.1158 1.1958 1.0935
No log 2.75 132 1.1428 0.1518 1.1428 1.0690
No log 2.7917 134 1.1221 0.1407 1.1221 1.0593
No log 2.8333 136 1.0556 0.1143 1.0556 1.0274
No log 2.875 138 0.9914 0.2071 0.9914 0.9957
No log 2.9167 140 1.0117 -0.0409 1.0117 1.0058
No log 2.9583 142 1.0643 -0.0623 1.0643 1.0317
No log 3.0 144 0.9945 -0.0361 0.9945 0.9973
No log 3.0417 146 0.9411 0.1753 0.9411 0.9701
No log 3.0833 148 1.0225 0.1853 1.0225 1.0112
No log 3.125 150 1.1215 -0.0317 1.1215 1.0590
No log 3.1667 152 1.1503 0.0703 1.1503 1.0725
No log 3.2083 154 1.0794 0.1425 1.0794 1.0389
No log 3.25 156 1.0676 0.2877 1.0676 1.0333
No log 3.2917 158 1.0175 0.1192 1.0175 1.0087
No log 3.3333 160 0.9711 0.0712 0.9711 0.9854
No log 3.375 162 0.9879 0.1260 0.9879 0.9939
No log 3.4167 164 1.0860 0.1553 1.0860 1.0421
No log 3.4583 166 1.1371 0.1864 1.1371 1.0664
No log 3.5 168 1.1878 0.2199 1.1878 1.0899
No log 3.5417 170 1.2134 0.1752 1.2134 1.1015
No log 3.5833 172 1.2017 0.1986 1.2017 1.0962
No log 3.625 174 1.2212 0.0640 1.2212 1.1051
No log 3.6667 176 1.1528 0.1449 1.1528 1.0737
No log 3.7083 178 1.0883 0.0094 1.0883 1.0432
No log 3.75 180 1.0721 0.1926 1.0721 1.0354
No log 3.7917 182 0.9954 0.1102 0.9954 0.9977
No log 3.8333 184 1.0143 -0.0105 1.0143 1.0071
No log 3.875 186 1.0614 -0.0396 1.0614 1.0302
No log 3.9167 188 1.0767 0.1954 1.0767 1.0377
No log 3.9583 190 1.1741 0.0576 1.1741 1.0836
No log 4.0 192 1.2242 0.0741 1.2242 1.1064
No log 4.0417 194 1.3640 -0.0031 1.3640 1.1679
No log 4.0833 196 1.2572 -0.0017 1.2572 1.1212
No log 4.125 198 1.1004 0.1482 1.1004 1.0490
No log 4.1667 200 1.0314 0.2147 1.0314 1.0156
No log 4.2083 202 1.0105 0.0285 1.0105 1.0052
No log 4.25 204 1.0413 0.1331 1.0413 1.0205
No log 4.2917 206 0.9361 0.1500 0.9361 0.9675
No log 4.3333 208 0.9394 0.1578 0.9394 0.9692
No log 4.375 210 0.9371 0.1905 0.9371 0.9680
No log 4.4167 212 0.9797 0.1587 0.9797 0.9898
No log 4.4583 214 1.0389 0.0826 1.0389 1.0193
No log 4.5 216 1.1273 0.1852 1.1273 1.0617
No log 4.5417 218 1.1397 0.1313 1.1397 1.0676
No log 4.5833 220 1.1278 0.1225 1.1278 1.0620
No log 4.625 222 1.1394 0.1223 1.1394 1.0674
No log 4.6667 224 1.1170 0.1257 1.1170 1.0569
No log 4.7083 226 1.0491 0.2692 1.0491 1.0243
No log 4.75 228 0.9907 0.2692 0.9907 0.9953
No log 4.7917 230 0.9244 0.2583 0.9244 0.9615
No log 4.8333 232 0.9121 0.2583 0.9121 0.9550
No log 4.875 234 0.9607 0.2608 0.9607 0.9802
No log 4.9167 236 1.0236 0.2667 1.0236 1.0117
No log 4.9583 238 1.0928 0.2258 1.0928 1.0454
No log 5.0 240 1.1230 0.1816 1.1230 1.0597
No log 5.0417 242 1.0139 0.1903 1.0139 1.0069
No log 5.0833 244 1.0047 0.1130 1.0047 1.0024
No log 5.125 246 1.1102 0.1680 1.1102 1.0536
No log 5.1667 248 1.1199 0.0708 1.1199 1.0583
No log 5.2083 250 1.1037 0.0605 1.1037 1.0506
No log 5.25 252 1.0777 0.1002 1.0777 1.0381
No log 5.2917 254 1.1403 0.1849 1.1403 1.0679
No log 5.3333 256 1.0289 0.1691 1.0289 1.0143
No log 5.375 258 0.9515 0.1661 0.9515 0.9755
No log 5.4167 260 0.9296 0.1724 0.9296 0.9642
No log 5.4583 262 0.9728 0.1791 0.9728 0.9863
No log 5.5 264 1.0296 0.1768 1.0296 1.0147
No log 5.5417 266 1.0785 0.1728 1.0785 1.0385
No log 5.5833 268 1.0998 0.1728 1.0998 1.0487
No log 5.625 270 1.1868 0.1268 1.1868 1.0894
No log 5.6667 272 1.1785 0.0616 1.1785 1.0856
No log 5.7083 274 1.0479 0.1122 1.0479 1.0237
No log 5.75 276 0.9899 -0.0379 0.9899 0.9949
No log 5.7917 278 0.9879 0.0839 0.9879 0.9939
No log 5.8333 280 1.0282 0.0852 1.0282 1.0140
No log 5.875 282 1.0769 0.1449 1.0769 1.0377
No log 5.9167 284 1.1334 0.0730 1.1334 1.0646
No log 5.9583 286 1.0393 0.1054 1.0393 1.0194
No log 6.0 288 0.9031 0.0876 0.9031 0.9503
No log 6.0417 290 0.9160 0.2163 0.9160 0.9571
No log 6.0833 292 0.9873 0.1859 0.9873 0.9937
No log 6.125 294 0.9579 0.2071 0.9579 0.9787
No log 6.1667 296 0.9441 0.1662 0.9441 0.9717
No log 6.2083 298 0.9675 0.1945 0.9675 0.9836
No log 6.25 300 0.9813 0.2277 0.9813 0.9906
No log 6.2917 302 0.9652 0.3173 0.9652 0.9824
No log 6.3333 304 0.9651 0.2643 0.9651 0.9824
No log 6.375 306 0.9670 0.2643 0.9670 0.9834
No log 6.4167 308 1.0449 0.2055 1.0449 1.0222
No log 6.4583 310 1.0011 0.2420 1.0011 1.0005
No log 6.5 312 0.9454 0.2501 0.9454 0.9723
No log 6.5417 314 0.9259 0.1134 0.9259 0.9622
No log 6.5833 316 0.9372 0.2026 0.9372 0.9681
No log 6.625 318 0.9196 0.2542 0.9196 0.9590
No log 6.6667 320 0.8716 0.3076 0.8716 0.9336
No log 6.7083 322 0.8728 0.3571 0.8728 0.9343
No log 6.75 324 0.9062 0.3775 0.9062 0.9520
No log 6.7917 326 0.9472 0.3775 0.9472 0.9733
No log 6.8333 328 0.9170 0.4037 0.9170 0.9576
No log 6.875 330 0.9103 0.1846 0.9103 0.9541
No log 6.9167 332 0.9431 0.3616 0.9431 0.9711
No log 6.9583 334 0.9643 0.3571 0.9643 0.9820
No log 7.0 336 0.9313 0.2634 0.9313 0.9650
No log 7.0417 338 0.9259 0.2349 0.9259 0.9623
No log 7.0833 340 0.9304 0.1672 0.9304 0.9646
No log 7.125 342 0.9700 0.1662 0.9700 0.9849
No log 7.1667 344 0.9410 0.1860 0.9410 0.9701
No log 7.2083 346 0.9628 0.3339 0.9628 0.9812
No log 7.25 348 1.0228 0.3005 1.0228 1.0113
No log 7.2917 350 1.0041 0.2828 1.0041 1.0020
No log 7.3333 352 0.9709 0.2801 0.9709 0.9854
No log 7.375 354 0.9526 0.1816 0.9526 0.9760
No log 7.4167 356 1.0007 0.1054 1.0007 1.0003
No log 7.4583 358 1.0109 0.1803 1.0109 1.0054
No log 7.5 360 1.0559 0.2328 1.0559 1.0276
No log 7.5417 362 1.1606 0.1707 1.1606 1.0773
No log 7.5833 364 1.0590 0.1803 1.0590 1.0291
No log 7.625 366 0.9162 0.1140 0.9162 0.9572
No log 7.6667 368 0.9437 -0.0085 0.9437 0.9714
No log 7.7083 370 1.0082 0.0300 1.0082 1.0041
No log 7.75 372 1.0121 0.0220 1.0121 1.0060
No log 7.7917 374 1.0782 0.1846 1.0782 1.0384
No log 7.8333 376 1.1237 0.1860 1.1237 1.0601
No log 7.875 378 1.0403 0.1954 1.0403 1.0199
No log 7.9167 380 1.0221 0.0220 1.0221 1.0110
No log 7.9583 382 1.0578 -0.0281 1.0578 1.0285
No log 8.0 384 1.0581 0.0832 1.0581 1.0286
No log 8.0417 386 0.9807 -0.0291 0.9807 0.9903
No log 8.0833 388 0.9412 0.0592 0.9412 0.9701
No log 8.125 390 0.9788 0.1174 0.9788 0.9893
No log 8.1667 392 1.0368 0.2590 1.0368 1.0182
No log 8.2083 394 1.0667 0.2310 1.0667 1.0328
No log 8.25 396 1.0118 0.1506 1.0118 1.0059
No log 8.2917 398 0.9785 0.0627 0.9785 0.9892
No log 8.3333 400 0.9659 0.0365 0.9659 0.9828
No log 8.375 402 0.9719 0.0813 0.9719 0.9858
No log 8.4167 404 0.9668 0.0325 0.9668 0.9832
No log 8.4583 406 0.9933 0.1506 0.9933 0.9967
No log 8.5 408 1.0177 0.1334 1.0177 1.0088
No log 8.5417 410 0.9704 0.1566 0.9704 0.9851
No log 8.5833 412 0.9090 0.1869 0.9090 0.9534
No log 8.625 414 0.9364 -0.0108 0.9364 0.9677
No log 8.6667 416 0.9694 0.0648 0.9694 0.9846
No log 8.7083 418 0.9984 0.0950 0.9984 0.9992
No log 8.75 420 1.0147 0.1198 1.0147 1.0073
No log 8.7917 422 0.9921 0.2291 0.9921 0.9961
No log 8.8333 424 0.9920 0.2145 0.9920 0.9960
No log 8.875 426 0.9524 0.2993 0.9524 0.9759
No log 8.9167 428 0.8707 0.2365 0.8707 0.9331
No log 8.9583 430 0.8691 0.1806 0.8691 0.9322
No log 9.0 432 0.9087 0.0868 0.9087 0.9532
No log 9.0417 434 0.9568 0.0868 0.9568 0.9781
No log 9.0833 436 1.0016 0.1274 1.0016 1.0008
No log 9.125 438 1.0760 0.2849 1.0760 1.0373
No log 9.1667 440 1.1340 0.1923 1.1340 1.0649
No log 9.2083 442 1.0726 0.2301 1.0726 1.0357
No log 9.25 444 1.0023 0.1531 1.0023 1.0011
No log 9.2917 446 0.9813 0.1906 0.9813 0.9906
No log 9.3333 448 0.9551 0.1906 0.9551 0.9773
No log 9.375 450 0.9322 0.2049 0.9322 0.9655
No log 9.4167 452 0.9364 0.3092 0.9364 0.9677
No log 9.4583 454 0.9485 0.3092 0.9485 0.9739
No log 9.5 456 0.9425 0.2029 0.9425 0.9708
No log 9.5417 458 0.9416 0.2345 0.9416 0.9704
No log 9.5833 460 0.9553 0.2146 0.9553 0.9774
No log 9.625 462 0.9006 0.2462 0.9006 0.9490
No log 9.6667 464 0.8811 0.2132 0.8811 0.9387
No log 9.7083 466 0.9057 0.3092 0.9057 0.9517
No log 9.75 468 0.9124 0.3060 0.9124 0.9552
No log 9.7917 470 0.9350 0.2950 0.9350 0.9669
No log 9.8333 472 0.9505 0.2619 0.9505 0.9749
No log 9.875 474 0.9554 0.2572 0.9554 0.9775
No log 9.9167 476 0.9468 0.2572 0.9468 0.9730
No log 9.9583 478 0.9273 0.2845 0.9273 0.9630
No log 10.0 480 0.9174 0.3121 0.9174 0.9578
No log 10.0417 482 0.9223 0.3885 0.9223 0.9604
No log 10.0833 484 0.8625 0.2839 0.8625 0.9287
No log 10.125 486 0.7794 0.2359 0.7794 0.8828
No log 10.1667 488 0.7534 0.2914 0.7534 0.8680
No log 10.2083 490 0.7956 0.3593 0.7956 0.8919
No log 10.25 492 0.8617 0.3754 0.8617 0.9283
No log 10.2917 494 0.9119 0.3447 0.9119 0.9549
No log 10.3333 496 0.9454 0.3196 0.9454 0.9723
No log 10.375 498 0.9747 0.3028 0.9747 0.9873
0.311 10.4167 500 1.0156 0.2801 1.0156 1.0077
0.311 10.4583 502 1.0363 0.1968 1.0363 1.0180
0.311 10.5 504 1.0838 0.1968 1.0838 1.0410
0.311 10.5417 506 1.1220 0.1107 1.1220 1.0593
0.311 10.5833 508 1.1088 0.1533 1.1088 1.0530
0.311 10.625 510 1.0650 0.1611 1.0650 1.0320
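The QWK column above is quadratic weighted kappa, which measures ordinal agreement between predicted and gold labels: 1.0 is perfect agreement, 0.0 is chance level, and negative values are worse than chance. The evaluation code itself is not included in this card, so the following is an illustrative pure-Python sketch of the metric, not the exact implementation used:

```python
def quadratic_weighted_kappa(rater_a, rater_b, n_classes):
    """Quadratic weighted kappa for integer labels in [0, n_classes)."""
    n = len(rater_a)
    # Observed confusion matrix between the two label sequences.
    observed = [[0] * n_classes for _ in range(n_classes)]
    for a, b in zip(rater_a, rater_b):
        observed[a][b] += 1
    # Marginal histograms, used to build the expected (chance) matrix.
    hist_a = [0] * n_classes
    hist_b = [0] * n_classes
    for a in rater_a:
        hist_a[a] += 1
    for b in rater_b:
        hist_b[b] += 1
    numerator = denominator = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            # Quadratic penalty grows with the squared distance between labels.
            weight = (i - j) ** 2 / (n_classes - 1) ** 2
            expected = hist_a[i] * hist_b[j] / n
            numerator += weight * observed[i][j]
            denominator += weight * expected
    return 1.0 - numerator / denominator

print(quadratic_weighted_kappa([0, 1, 2, 1], [0, 1, 2, 1], 3))  # 1.0
```

This matches scikit-learn's `cohen_kappa_score(..., weights="quadratic")` on integer labels.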

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (Safetensors, F32)
