ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k2_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1064
  • Qwk: 0.0556
  • Mse: 1.1064
  • Rmse: 1.0518
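The evaluation metrics above (QWK, MSE, RMSE) can be computed with scikit-learn; below is a minimal sketch, assuming integer organization scores and regression-style predictions. The `eval_metrics` helper is illustrative and not part of the original training code.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_metrics(y_true, y_pred):
    """Compute QWK, MSE, and RMSE for ordinal score prediction."""
    # Quadratic weighted kappa operates on discrete labels, so
    # continuous predictions are rounded to the nearest integer.
    qwk = cohen_kappa_score(
        np.rint(y_true).astype(int),
        np.rint(y_pred).astype(int),
        weights="quadratic",
    )
    # MSE/RMSE are computed on the raw (unrounded) values.
    mse = mean_squared_error(y_true, y_pred)
    rmse = float(np.sqrt(mse))
    return {"qwk": qwk, "mse": mse, "rmse": rmse}
```

Note that RMSE is simply the square root of MSE, which is why the table's Validation Loss and Mse columns are identical when training with an MSE loss.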

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
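The hyperparameters above map directly onto a Hugging Face `TrainingArguments` configuration; the following is a minimal sketch, assuming the standard transformers `Trainer` API (`output_dir` is a placeholder, and the Adam betas/epsilon shown above are the transformers defaults).

```python
from transformers import TrainingArguments

# Hypothetical configuration mirroring the listed hyperparameters.
training_args = TrainingArguments(
    output_dir="./results",          # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    # optimizer: Adam with betas=(0.9, 0.999), epsilon=1e-8
    # are the default AdamW settings in transformers.
)
```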

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.4 2 3.9424 -0.0109 3.9424 1.9856
No log 0.8 4 2.1749 0.0357 2.1749 1.4748
No log 1.2 6 1.7505 -0.0288 1.7505 1.3231
No log 1.6 8 1.3908 0.1463 1.3908 1.1793
No log 2.0 10 1.0239 0.3082 1.0239 1.0119
No log 2.4 12 0.9474 0.3688 0.9474 0.9734
No log 2.8 14 1.0368 0.2268 1.0368 1.0183
No log 3.2 16 0.9209 0.4365 0.9209 0.9596
No log 3.6 18 0.9337 0.3396 0.9337 0.9663
No log 4.0 20 0.9609 0.3510 0.9609 0.9802
No log 4.4 22 0.9379 0.2981 0.9379 0.9685
No log 4.8 24 0.9718 0.2865 0.9718 0.9858
No log 5.2 26 1.0579 0.2416 1.0579 1.0285
No log 5.6 28 1.0804 0.1962 1.0804 1.0394
No log 6.0 30 1.0598 0.2672 1.0598 1.0295
No log 6.4 32 1.1437 0.0823 1.1437 1.0694
No log 6.8 34 1.2664 0.0454 1.2664 1.1254
No log 7.2 36 1.1920 0.0947 1.1920 1.0918
No log 7.6 38 1.0807 0.1810 1.0807 1.0396
No log 8.0 40 1.0469 0.3139 1.0469 1.0232
No log 8.4 42 1.1135 0.1783 1.1135 1.0552
No log 8.8 44 1.3309 -0.0366 1.3309 1.1536
No log 9.2 46 1.4643 -0.0422 1.4643 1.2101
No log 9.6 48 1.5086 0.0876 1.5086 1.2282
No log 10.0 50 1.5770 0.0976 1.5770 1.2558
No log 10.4 52 1.6056 -0.0212 1.6056 1.2671
No log 10.8 54 1.5472 -0.0778 1.5472 1.2439
No log 11.2 56 1.4887 0.0579 1.4887 1.2201
No log 11.6 58 1.2836 0.0236 1.2836 1.1330
No log 12.0 60 1.1135 0.1101 1.1135 1.0552
No log 12.4 62 1.1493 0.0449 1.1493 1.0720
No log 12.8 64 1.2099 0.0395 1.2099 1.0999
No log 13.2 66 1.3105 0.0760 1.3105 1.1448
No log 13.6 68 1.3061 0.0817 1.3061 1.1429
No log 14.0 70 1.3762 0.0246 1.3762 1.1731
No log 14.4 72 1.4676 0.2038 1.4676 1.2114
No log 14.8 74 1.4929 0.1835 1.4929 1.2219
No log 15.2 76 1.3929 0.2614 1.3929 1.1802
No log 15.6 78 1.2371 0.1397 1.2371 1.1122
No log 16.0 80 1.1892 0.1992 1.1892 1.0905
No log 16.4 82 1.3037 0.1845 1.3037 1.1418
No log 16.8 84 1.5980 0.1728 1.5980 1.2641
No log 17.2 86 1.6561 0.2015 1.6561 1.2869
No log 17.6 88 1.5196 0.2206 1.5196 1.2327
No log 18.0 90 1.2789 0.1221 1.2789 1.1309
No log 18.4 92 1.2209 0.0780 1.2209 1.1050
No log 18.8 94 1.2552 0.1170 1.2552 1.1203
No log 19.2 96 1.3097 0.0931 1.3097 1.1444
No log 19.6 98 1.2791 0.1288 1.2791 1.1310
No log 20.0 100 1.1892 0.1081 1.1892 1.0905
No log 20.4 102 1.0306 0.1418 1.0306 1.0152
No log 20.8 104 0.9615 0.3178 0.9615 0.9806
No log 21.2 106 0.9456 0.3301 0.9456 0.9724
No log 21.6 108 0.9952 0.2322 0.9952 0.9976
No log 22.0 110 1.2839 0.2203 1.2839 1.1331
No log 22.4 112 1.5316 0.2424 1.5316 1.2376
No log 22.8 114 1.5018 0.2424 1.5018 1.2255
No log 23.2 116 1.4153 0.2126 1.4153 1.1897
No log 23.6 118 1.2261 0.2015 1.2261 1.1073
No log 24.0 120 0.9445 0.2678 0.9445 0.9719
No log 24.4 122 0.8544 0.4328 0.8544 0.9243
No log 24.8 124 0.8815 0.2288 0.8815 0.9389
No log 25.2 126 1.0880 0.1770 1.0880 1.0431
No log 25.6 128 1.2534 0.0510 1.2534 1.1195
No log 26.0 130 1.4321 0.0878 1.4321 1.1967
No log 26.4 132 1.4799 0.1486 1.4799 1.2165
No log 26.8 134 1.4010 0.0510 1.4010 1.1836
No log 27.2 136 1.2501 0.0 1.2501 1.1181
No log 27.6 138 1.1370 0.0888 1.1370 1.0663
No log 28.0 140 1.0703 0.0919 1.0703 1.0346
No log 28.4 142 1.0693 0.0919 1.0693 1.0341
No log 28.8 144 1.1350 0.0888 1.1350 1.0654
No log 29.2 146 1.2105 0.0 1.2105 1.1002
No log 29.6 148 1.1915 0.0 1.1915 1.0915
No log 30.0 150 1.1713 0.1228 1.1713 1.0823
No log 30.4 152 1.0851 0.1697 1.0851 1.0417
No log 30.8 154 0.9604 0.2108 0.9604 0.9800
No log 31.2 156 0.9625 0.1292 0.9625 0.9811
No log 31.6 158 1.0193 0.1351 1.0193 1.0096
No log 32.0 160 1.1203 0.0964 1.1203 1.0584
No log 32.4 162 1.2000 0.0964 1.2000 1.0954
No log 32.8 164 1.2431 0.0673 1.2431 1.1149
No log 33.2 166 1.1628 0.0931 1.1628 1.0784
No log 33.6 168 1.0501 0.0193 1.0501 1.0247
No log 34.0 170 0.9992 0.1418 0.9992 0.9996
No log 34.4 172 1.0266 0.1292 1.0266 1.0132
No log 34.8 174 1.1236 0.0510 1.1236 1.0600
No log 35.2 176 1.1933 0.0618 1.1933 1.0924
No log 35.6 178 1.2192 0.0618 1.2192 1.1042
No log 36.0 180 1.2722 -0.0244 1.2722 1.1279
No log 36.4 182 1.2791 0.0390 1.2791 1.1310
No log 36.8 184 1.3288 0.0390 1.3288 1.1527
No log 37.2 186 1.4322 0.0390 1.4322 1.1968
No log 37.6 188 1.5605 0.0390 1.5605 1.2492
No log 38.0 190 1.5490 -0.0399 1.5490 1.2446
No log 38.4 192 1.4536 -0.0244 1.4536 1.2057
No log 38.8 194 1.3298 0.0188 1.3298 1.1532
No log 39.2 196 1.2091 0.0068 1.2091 1.0996
No log 39.6 198 1.1489 0.0068 1.1489 1.0719
No log 40.0 200 1.1695 0.0122 1.1695 1.0814
No log 40.4 202 1.1914 0.0122 1.1914 1.0915
No log 40.8 204 1.1899 0.1024 1.1899 1.0908
No log 41.2 206 1.1184 0.1434 1.1184 1.0576
No log 41.6 208 1.0758 0.1528 1.0758 1.0372
No log 42.0 210 0.9947 0.2161 0.9947 0.9973
No log 42.4 212 0.9846 0.1854 0.9846 0.9923
No log 42.8 214 1.0445 0.1330 1.0445 1.0220
No log 43.2 216 1.1403 0.1370 1.1403 1.0679
No log 43.6 218 1.1923 0.0878 1.1923 1.0919
No log 44.0 220 1.1933 0.0878 1.1933 1.0924
No log 44.4 222 1.1584 0.0401 1.1584 1.0763
No log 44.8 224 1.1104 0.0 1.1104 1.0537
No log 45.2 226 1.1382 0.0 1.1382 1.0668
No log 45.6 228 1.2183 0.0 1.2183 1.1038
No log 46.0 230 1.3207 0.0878 1.3207 1.1492
No log 46.4 232 1.3374 0.0878 1.3374 1.1565
No log 46.8 234 1.2903 0.0122 1.2903 1.1359
No log 47.2 236 1.2272 0.0 1.2272 1.1078
No log 47.6 238 1.1793 0.0445 1.1793 1.0859
No log 48.0 240 1.1527 0.0445 1.1527 1.0737
No log 48.4 242 1.1861 0.0556 1.1861 1.0891
No log 48.8 244 1.1973 0.0931 1.1973 1.0942
No log 49.2 246 1.1837 0.0556 1.1837 1.0880
No log 49.6 248 1.1531 0.0556 1.1531 1.0738
No log 50.0 250 1.1641 0.0931 1.1641 1.0789
No log 50.4 252 1.1346 0.0931 1.1346 1.0652
No log 50.8 254 1.0984 0.0445 1.0984 1.0480
No log 51.2 256 1.1340 0.0445 1.1340 1.0649
No log 51.6 258 1.2230 0.0122 1.2230 1.1059
No log 52.0 260 1.2764 0.0122 1.2764 1.1298
No log 52.4 262 1.3001 0.0811 1.3001 1.1402
No log 52.8 264 1.3401 0.0811 1.3401 1.1576
No log 53.2 266 1.3394 0.0811 1.3394 1.1573
No log 53.6 268 1.2816 0.0811 1.2816 1.1321
No log 54.0 270 1.2196 0.0811 1.2196 1.1044
No log 54.4 272 1.2205 0.0811 1.2205 1.1048
No log 54.8 274 1.1950 0.0931 1.1950 1.0932
No log 55.2 276 1.1635 0.0931 1.1635 1.0787
No log 55.6 278 1.1232 0.0931 1.1232 1.0598
No log 56.0 280 1.1317 0.0931 1.1317 1.0638
No log 56.4 282 1.1191 0.0931 1.1191 1.0579
No log 56.8 284 1.0889 0.0833 1.0889 1.0435
No log 57.2 286 1.0442 0.0445 1.0442 1.0218
No log 57.6 288 1.0137 0.0888 1.0137 1.0068
No log 58.0 290 1.0033 0.1046 1.0033 1.0017
No log 58.4 292 1.0111 0.1046 1.0111 1.0055
No log 58.8 294 1.0195 0.0710 1.0195 1.0097
No log 59.2 296 1.0436 0.0931 1.0436 1.0216
No log 59.6 298 1.0616 0.1288 1.0616 1.0304
No log 60.0 300 1.0214 0.1351 1.0214 1.0107
No log 60.4 302 0.9948 0.0888 0.9948 0.9974
No log 60.8 304 1.0179 0.0888 1.0179 1.0089
No log 61.2 306 1.0721 0.0556 1.0721 1.0354
No log 61.6 308 1.1268 0.0556 1.1268 1.0615
No log 62.0 310 1.1521 0.0556 1.1521 1.0733
No log 62.4 312 1.1944 0.0556 1.1944 1.0929
No log 62.8 314 1.2083 0.0556 1.2083 1.0992
No log 63.2 316 1.2057 0.0931 1.2057 1.0980
No log 63.6 318 1.1931 0.0510 1.1931 1.0923
No log 64.0 320 1.1900 0.0931 1.1900 1.0909
No log 64.4 322 1.2049 0.0931 1.2049 1.0977
No log 64.8 324 1.2232 0.0510 1.2232 1.1060
No log 65.2 326 1.1947 0.0931 1.1947 1.0930
No log 65.6 328 1.1442 0.0556 1.1442 1.0697
No log 66.0 330 1.0926 0.0445 1.0926 1.0453
No log 66.4 332 1.0761 0.0445 1.0761 1.0374
No log 66.8 334 1.0936 0.0445 1.0936 1.0458
No log 67.2 336 1.1373 0.0445 1.1373 1.0664
No log 67.6 338 1.1608 0.0445 1.1608 1.0774
No log 68.0 340 1.1729 0.0556 1.1729 1.0830
No log 68.4 342 1.1744 0.0931 1.1744 1.0837
No log 68.8 344 1.1426 0.0931 1.1426 1.0689
No log 69.2 346 1.0846 0.0833 1.0846 1.0414
No log 69.6 348 1.0491 0.0833 1.0491 1.0242
No log 70.0 350 1.0223 0.0833 1.0223 1.0111
No log 70.4 352 1.0147 0.0833 1.0147 1.0073
No log 70.8 354 1.0373 0.0833 1.0373 1.0185
No log 71.2 356 1.0605 0.0931 1.0605 1.0298
No log 71.6 358 1.0740 0.0931 1.0740 1.0363
No log 72.0 360 1.0601 0.0931 1.0601 1.0296
No log 72.4 362 1.0257 0.0931 1.0257 1.0128
No log 72.8 364 0.9896 0.1265 0.9896 0.9948
No log 73.2 366 0.9677 0.1487 0.9677 0.9837
No log 73.6 368 0.9811 0.1330 0.9811 0.9905
No log 74.0 370 1.0102 0.0833 1.0102 1.0051
No log 74.4 372 1.0585 0.0931 1.0585 1.0288
No log 74.8 374 1.0738 0.0931 1.0738 1.0362
No log 75.2 376 1.0810 0.0931 1.0810 1.0397
No log 75.6 378 1.0993 0.0931 1.0993 1.0485
No log 76.0 380 1.1021 0.0931 1.1021 1.0498
No log 76.4 382 1.1034 0.0556 1.1034 1.0505
No log 76.8 384 1.1084 0.0556 1.1084 1.0528
No log 77.2 386 1.0929 0.0556 1.0929 1.0454
No log 77.6 388 1.0839 0.0556 1.0839 1.0411
No log 78.0 390 1.0766 0.0556 1.0766 1.0376
No log 78.4 392 1.0763 0.0556 1.0763 1.0374
No log 78.8 394 1.0586 0.0556 1.0586 1.0289
No log 79.2 396 1.0398 0.0445 1.0398 1.0197
No log 79.6 398 1.0346 0.0445 1.0346 1.0172
No log 80.0 400 1.0346 0.0445 1.0346 1.0172
No log 80.4 402 1.0205 0.0445 1.0205 1.0102
No log 80.8 404 1.0152 0.0445 1.0152 1.0076
No log 81.2 406 1.0213 0.0445 1.0213 1.0106
No log 81.6 408 1.0227 0.0445 1.0227 1.0113
No log 82.0 410 1.0192 0.0445 1.0192 1.0095
No log 82.4 412 1.0193 0.0445 1.0193 1.0096
No log 82.8 414 1.0353 0.0445 1.0353 1.0175
No log 83.2 416 1.0401 0.0445 1.0401 1.0199
No log 83.6 418 1.0539 0.0445 1.0539 1.0266
No log 84.0 420 1.0640 0.0445 1.0640 1.0315
No log 84.4 422 1.0757 0.0445 1.0757 1.0371
No log 84.8 424 1.0873 0.0445 1.0873 1.0428
No log 85.2 426 1.1099 0.0556 1.1099 1.0535
No log 85.6 428 1.1429 0.0556 1.1429 1.0691
No log 86.0 430 1.1627 0.0556 1.1627 1.0783
No log 86.4 432 1.1703 0.0556 1.1703 1.0818
No log 86.8 434 1.1686 0.0556 1.1686 1.0810
No log 87.2 436 1.1525 0.0556 1.1525 1.0735
No log 87.6 438 1.1277 0.0556 1.1277 1.0619
No log 88.0 440 1.1113 0.0445 1.1113 1.0542
No log 88.4 442 1.1050 0.0445 1.1050 1.0512
No log 88.8 444 1.1037 0.0445 1.1037 1.0506
No log 89.2 446 1.1081 0.0556 1.1081 1.0527
No log 89.6 448 1.1172 0.0556 1.1172 1.0570
No log 90.0 450 1.1271 0.0556 1.1271 1.0617
No log 90.4 452 1.1264 0.0556 1.1264 1.0613
No log 90.8 454 1.1318 0.0556 1.1318 1.0639
No log 91.2 456 1.1333 0.0556 1.1333 1.0646
No log 91.6 458 1.1248 0.0556 1.1248 1.0605
No log 92.0 460 1.1166 0.0556 1.1166 1.0567
No log 92.4 462 1.1111 0.0556 1.1111 1.0541
No log 92.8 464 1.1146 0.0556 1.1146 1.0557
No log 93.2 466 1.1127 0.0556 1.1127 1.0549
No log 93.6 468 1.1122 0.0556 1.1122 1.0546
No log 94.0 470 1.1090 0.0556 1.1090 1.0531
No log 94.4 472 1.1059 0.0556 1.1059 1.0516
No log 94.8 474 1.1064 0.0556 1.1064 1.0519
No log 95.2 476 1.1096 0.0556 1.1096 1.0534
No log 95.6 478 1.1129 0.0556 1.1129 1.0549
No log 96.0 480 1.1130 0.0556 1.1130 1.0550
No log 96.4 482 1.1152 0.0556 1.1152 1.0560
No log 96.8 484 1.1153 0.0556 1.1153 1.0561
No log 97.2 486 1.1134 0.0556 1.1134 1.0552
No log 97.6 488 1.1122 0.0556 1.1122 1.0546
No log 98.0 490 1.1112 0.0556 1.1112 1.0541
No log 98.4 492 1.1085 0.0556 1.1085 1.0528
No log 98.8 494 1.1076 0.0556 1.1076 1.0524
No log 99.2 496 1.1071 0.0556 1.1071 1.0522
No log 99.6 498 1.1067 0.0556 1.1067 1.0520
0.1626 100.0 500 1.1064 0.0556 1.1064 1.0518

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1