ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k2_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1292
  • Qwk (quadratic weighted kappa): 0.0556
  • Mse: 1.1292
  • Rmse: 1.0626
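For reference, the reported metrics can be reproduced from scratch. The sketch below computes quadratic weighted kappa (QWK), MSE, and RMSE; the label arrays are illustrative placeholders, not the model's actual predictions.

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    # Squared-distance disagreement weights, normalized to [0, 1].
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    # Observed confusion matrix.
    obs = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    n = len(y_true)
    hist_t, hist_p = Counter(y_true), Counter(y_pred)
    # Weighted observed vs. chance-expected disagreement.
    num = sum(w[i][j] * obs[i][j]
              for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * hist_t[i] * hist_p[j] / n
              for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

y_true = [0, 1, 2, 2, 1]  # placeholder gold scores
y_pred = [0, 2, 2, 1, 1]  # placeholder predictions
print(quadratic_weighted_kappa(y_true, y_pred, 3))
print(mse(y_true, y_pred), math.sqrt(mse(y_true, y_pred)))
```

Note that the reported Rmse is simply the square root of the reported Mse (sqrt(1.1292) ≈ 1.0626), which is why the Loss and Mse columns coincide when training with an MSE objective.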

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
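The "linear" scheduler above decays the learning rate linearly from 2e-05 to zero over the full run. A minimal sketch of that schedule, assuming no warmup (the card does not report one) and using the 500 total optimizer steps visible in the results table below (100 epochs × 5 steps per epoch):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    # Linear warmup (if any) followed by linear decay to zero,
    # mirroring the "linear" lr_scheduler_type named above.
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = total_steps - step
    return base_lr * max(0.0, remaining / max(1, total_steps - warmup_steps))

total = 500  # 100 epochs x 5 optimizer steps per epoch, per the table below
print(linear_lr(0, total))    # start of training: full base_lr
print(linear_lr(250, total))  # halfway: half of base_lr
print(linear_lr(500, total))  # end of training: 0.0
```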

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.4 2 3.9424 -0.0109 3.9424 1.9856
No log 0.8 4 2.1749 0.0357 2.1749 1.4748
No log 1.2 6 1.7505 -0.0288 1.7505 1.3231
No log 1.6 8 1.3908 0.1463 1.3908 1.1793
No log 2.0 10 1.0239 0.3082 1.0239 1.0119
No log 2.4 12 0.9474 0.3688 0.9474 0.9734
No log 2.8 14 1.0368 0.2268 1.0368 1.0183
No log 3.2 16 0.9209 0.4365 0.9209 0.9596
No log 3.6 18 0.9337 0.3396 0.9337 0.9663
No log 4.0 20 0.9609 0.3510 0.9609 0.9802
No log 4.4 22 0.9379 0.2981 0.9379 0.9685
No log 4.8 24 0.9718 0.2865 0.9718 0.9858
No log 5.2 26 1.0579 0.2416 1.0579 1.0285
No log 5.6 28 1.0804 0.1962 1.0804 1.0394
No log 6.0 30 1.0598 0.2672 1.0598 1.0295
No log 6.4 32 1.1437 0.0823 1.1437 1.0694
No log 6.8 34 1.2664 0.0454 1.2664 1.1254
No log 7.2 36 1.1920 0.0947 1.1920 1.0918
No log 7.6 38 1.0807 0.1810 1.0807 1.0396
No log 8.0 40 1.0469 0.3139 1.0469 1.0232
No log 8.4 42 1.1135 0.1783 1.1135 1.0552
No log 8.8 44 1.3308 -0.0366 1.3308 1.1536
No log 9.2 46 1.4643 -0.0422 1.4643 1.2101
No log 9.6 48 1.5086 0.0876 1.5086 1.2282
No log 10.0 50 1.5770 0.0976 1.5770 1.2558
No log 10.4 52 1.6056 -0.0212 1.6056 1.2671
No log 10.8 54 1.5472 -0.0778 1.5472 1.2439
No log 11.2 56 1.4887 0.0579 1.4887 1.2201
No log 11.6 58 1.2836 0.0236 1.2836 1.1329
No log 12.0 60 1.1135 0.1101 1.1135 1.0552
No log 12.4 62 1.1493 0.0449 1.1493 1.0720
No log 12.8 64 1.2099 0.0395 1.2099 1.0999
No log 13.2 66 1.3105 0.0760 1.3105 1.1448
No log 13.6 68 1.3061 0.0817 1.3061 1.1429
No log 14.0 70 1.3761 0.0246 1.3761 1.1731
No log 14.4 72 1.4675 0.2038 1.4675 1.2114
No log 14.8 74 1.4929 0.1835 1.4929 1.2218
No log 15.2 76 1.3928 0.2614 1.3928 1.1802
No log 15.6 78 1.2371 0.1397 1.2371 1.1122
No log 16.0 80 1.1892 0.1992 1.1892 1.0905
No log 16.4 82 1.3038 0.1845 1.3038 1.1418
No log 16.8 84 1.5981 0.1728 1.5981 1.2642
No log 17.2 86 1.6562 0.2015 1.6562 1.2869
No log 17.6 88 1.5198 0.2206 1.5198 1.2328
No log 18.0 90 1.2789 0.1221 1.2789 1.1309
No log 18.4 92 1.2210 0.0780 1.2210 1.1050
No log 18.8 94 1.2552 0.1170 1.2552 1.1204
No log 19.2 96 1.3097 0.0931 1.3097 1.1444
No log 19.6 98 1.2791 0.1288 1.2791 1.1310
No log 20.0 100 1.1891 0.1081 1.1891 1.0904
No log 20.4 102 1.0304 0.1418 1.0304 1.0151
No log 20.8 104 0.9614 0.3178 0.9614 0.9805
No log 21.2 106 0.9455 0.3301 0.9455 0.9724
No log 21.6 108 0.9953 0.2322 0.9953 0.9976
No log 22.0 110 1.2841 0.2203 1.2841 1.1332
No log 22.4 112 1.5317 0.2424 1.5317 1.2376
No log 22.8 114 1.5020 0.2424 1.5020 1.2255
No log 23.2 116 1.4155 0.2126 1.4155 1.1897
No log 23.6 118 1.2262 0.2015 1.2262 1.1073
No log 24.0 120 0.9446 0.2678 0.9446 0.9719
No log 24.4 122 0.8544 0.4328 0.8544 0.9243
No log 24.8 124 0.8815 0.2288 0.8815 0.9389
No log 25.2 126 1.0881 0.1770 1.0881 1.0431
No log 25.6 128 1.2535 0.0510 1.2535 1.1196
No log 26.0 130 1.4322 0.0878 1.4322 1.1968
No log 26.4 132 1.4800 0.1486 1.4800 1.2166
No log 26.8 134 1.4012 0.0510 1.4012 1.1837
No log 27.2 136 1.2505 0.0 1.2505 1.1183
No log 27.6 138 1.1375 0.0888 1.1375 1.0665
No log 28.0 140 1.0707 0.0919 1.0707 1.0348
No log 28.4 142 1.0696 0.0919 1.0696 1.0342
No log 28.8 144 1.1351 0.0888 1.1351 1.0654
No log 29.2 146 1.2103 0.0 1.2103 1.1001
No log 29.6 148 1.1912 0.0 1.1912 1.0914
No log 30.0 150 1.1709 0.1228 1.1709 1.0821
No log 30.4 152 1.0848 0.1697 1.0848 1.0415
No log 30.8 154 0.9605 0.2108 0.9605 0.9800
No log 31.2 156 0.9628 0.1292 0.9628 0.9812
No log 31.6 158 1.0202 0.1351 1.0202 1.0100
No log 32.0 160 1.1217 0.0964 1.1217 1.0591
No log 32.4 162 1.2011 0.0964 1.2011 1.0959
No log 32.8 164 1.2436 0.0673 1.2436 1.1152
No log 33.2 166 1.1631 0.0931 1.1631 1.0785
No log 33.6 168 1.0505 0.0604 1.0505 1.0249
No log 34.0 170 0.9994 0.1418 0.9994 0.9997
No log 34.4 172 1.0265 0.1292 1.0265 1.0132
No log 34.8 174 1.1232 0.0510 1.1232 1.0598
No log 35.2 176 1.1924 0.0618 1.1924 1.0920
No log 35.6 178 1.2183 0.0618 1.2183 1.1038
No log 36.0 180 1.2718 -0.0244 1.2718 1.1277
No log 36.4 182 1.2793 0.0390 1.2793 1.1310
No log 36.8 184 1.3291 0.0390 1.3291 1.1529
No log 37.2 186 1.4328 0.0390 1.4328 1.1970
No log 37.6 188 1.5614 0.0390 1.5614 1.2496
No log 38.0 190 1.5507 -0.0399 1.5507 1.2453
No log 38.4 192 1.4562 -0.0244 1.4562 1.2067
No log 38.8 194 1.3329 0.0188 1.3329 1.1545
No log 39.2 196 1.2118 0.0068 1.2118 1.1008
No log 39.6 198 1.1506 0.0068 1.1506 1.0726
No log 40.0 200 1.1703 0.0122 1.1703 1.0818
No log 40.4 202 1.1917 0.0122 1.1917 1.0917
No log 40.8 204 1.1905 0.1024 1.1905 1.0911
No log 41.2 206 1.1197 0.1434 1.1197 1.0582
No log 41.6 208 1.0771 0.1528 1.0771 1.0378
No log 42.0 210 0.9954 0.2161 0.9954 0.9977
No log 42.4 212 0.9853 0.1854 0.9853 0.9926
No log 42.8 214 1.0448 0.1330 1.0448 1.0222
No log 43.2 216 1.1399 0.1370 1.1399 1.0677
No log 43.6 218 1.1914 0.0878 1.1914 1.0915
No log 44.0 220 1.1921 0.0878 1.1921 1.0919
No log 44.4 222 1.1572 0.0401 1.1572 1.0757
No log 44.8 224 1.1093 0.0 1.1093 1.0532
No log 45.2 226 1.1373 0.0 1.1373 1.0665
No log 45.6 228 1.2180 0.0401 1.2180 1.1036
No log 46.0 230 1.3216 0.0878 1.3216 1.1496
No log 46.4 232 1.3397 0.0878 1.3397 1.1575
No log 46.8 234 1.2930 0.0122 1.2930 1.1371
No log 47.2 236 1.2295 0.0 1.2295 1.1088
No log 47.6 238 1.1810 0.0445 1.1810 1.0868
No log 48.0 240 1.1542 0.0445 1.1542 1.0743
No log 48.4 242 1.1874 0.0556 1.1874 1.0897
No log 48.8 244 1.1983 0.0556 1.1983 1.0947
No log 49.2 246 1.1848 0.0556 1.1848 1.0885
No log 49.6 248 1.1556 0.0556 1.1556 1.0750
No log 50.0 250 1.1691 0.0931 1.1691 1.0812
No log 50.4 252 1.1413 0.0931 1.1413 1.0683
No log 50.8 254 1.1057 0.0445 1.1057 1.0515
No log 51.2 256 1.1410 0.0556 1.1410 1.0682
No log 51.6 258 1.2289 0.0122 1.2289 1.1086
No log 52.0 260 1.2801 0.0122 1.2801 1.1314
No log 52.4 262 1.3029 0.0811 1.3029 1.1414
No log 52.8 264 1.3491 0.0811 1.3491 1.1615
No log 53.2 266 1.3536 0.0811 1.3536 1.1634
No log 53.6 268 1.2990 0.0811 1.2990 1.1398
No log 54.0 270 1.2408 0.0811 1.2408 1.1139
No log 54.4 272 1.2386 0.0811 1.2386 1.1129
No log 54.8 274 1.2069 0.0931 1.2069 1.0986
No log 55.2 276 1.1696 0.0931 1.1696 1.0815
No log 55.6 278 1.1294 0.0931 1.1294 1.0627
No log 56.0 280 1.1405 0.0931 1.1405 1.0679
No log 56.4 282 1.1320 0.0931 1.1320 1.0639
No log 56.8 284 1.1063 0.0931 1.1063 1.0518
No log 57.2 286 1.0649 0.0445 1.0649 1.0319
No log 57.6 288 1.0339 0.0888 1.0339 1.0168
No log 58.0 290 1.0198 0.1046 1.0198 1.0099
No log 58.4 292 1.0223 0.1046 1.0223 1.0111
No log 58.8 294 1.0275 0.0710 1.0275 1.0136
No log 59.2 296 1.0535 0.0931 1.0535 1.0264
No log 59.6 298 1.0727 0.1288 1.0727 1.0357
No log 60.0 300 1.0328 0.1351 1.0328 1.0162
No log 60.4 302 1.0025 0.0888 1.0025 1.0012
No log 60.8 304 1.0231 0.0888 1.0231 1.0115
No log 61.2 306 1.0765 0.0556 1.0765 1.0375
No log 61.6 308 1.1393 0.0556 1.1393 1.0674
No log 62.0 310 1.1762 0.0556 1.1762 1.0845
No log 62.4 312 1.2020 0.0556 1.2020 1.0964
No log 62.8 314 1.2044 0.0556 1.2044 1.0974
No log 63.2 316 1.1963 0.0931 1.1963 1.0937
No log 63.6 318 1.1889 0.0931 1.1889 1.0904
No log 64.0 320 1.2090 0.0931 1.2090 1.0996
No log 64.4 322 1.2442 0.0931 1.2442 1.1154
No log 64.8 324 1.2611 0.0878 1.2611 1.1230
No log 65.2 326 1.2285 0.1288 1.2285 1.1084
No log 65.6 328 1.1730 0.0931 1.1730 1.0831
No log 66.0 330 1.1199 0.0556 1.1199 1.0583
No log 66.4 332 1.1054 0.0445 1.1054 1.0514
No log 66.8 334 1.1261 0.0445 1.1261 1.0612
No log 67.2 336 1.1655 0.0 1.1655 1.0796
No log 67.6 338 1.1790 0.0122 1.1790 1.0858
No log 68.0 340 1.1828 0.0931 1.1828 1.0876
No log 68.4 342 1.1758 0.0931 1.1758 1.0843
No log 68.8 344 1.1396 0.0931 1.1396 1.0675
No log 69.2 346 1.0812 0.0833 1.0812 1.0398
No log 69.6 348 1.0476 0.0833 1.0476 1.0235
No log 70.0 350 1.0361 0.0833 1.0361 1.0179
No log 70.4 352 1.0478 0.0833 1.0478 1.0236
No log 70.8 354 1.0853 0.0833 1.0853 1.0418
No log 71.2 356 1.1127 0.0931 1.1127 1.0549
No log 71.6 358 1.1250 0.0931 1.1250 1.0606
No log 72.0 360 1.1105 0.0931 1.1105 1.0538
No log 72.4 362 1.0757 0.0931 1.0757 1.0371
No log 72.8 364 1.0358 0.0833 1.0358 1.0177
No log 73.2 366 1.0036 0.0445 1.0036 1.0018
No log 73.6 368 1.0095 0.0445 1.0095 1.0048
No log 74.0 370 1.0337 0.0445 1.0337 1.0167
No log 74.4 372 1.0786 0.0931 1.0786 1.0386
No log 74.8 374 1.0946 0.0931 1.0946 1.0462
No log 75.2 376 1.1065 0.0931 1.1065 1.0519
No log 75.6 378 1.1268 0.0931 1.1268 1.0615
No log 76.0 380 1.1321 0.0556 1.1321 1.0640
No log 76.4 382 1.1345 0.0556 1.1345 1.0652
No log 76.8 384 1.1376 0.0556 1.1376 1.0666
No log 77.2 386 1.1213 0.0556 1.1213 1.0589
No log 77.6 388 1.1086 0.0556 1.1086 1.0529
No log 78.0 390 1.0978 0.0556 1.0978 1.0478
No log 78.4 392 1.0914 0.0556 1.0914 1.0447
No log 78.8 394 1.0729 0.0445 1.0729 1.0358
No log 79.2 396 1.0550 0.0445 1.0550 1.0271
No log 79.6 398 1.0512 0.0445 1.0512 1.0253
No log 80.0 400 1.0498 0.0445 1.0498 1.0246
No log 80.4 402 1.0368 0.0445 1.0368 1.0183
No log 80.8 404 1.0336 0.0445 1.0336 1.0167
No log 81.2 406 1.0415 0.0445 1.0415 1.0205
No log 81.6 408 1.0423 0.0445 1.0423 1.0209
No log 82.0 410 1.0377 0.0445 1.0377 1.0187
No log 82.4 412 1.0369 0.0445 1.0369 1.0183
No log 82.8 414 1.0522 0.0445 1.0522 1.0257
No log 83.2 416 1.0577 0.0445 1.0577 1.0285
No log 83.6 418 1.0727 0.0445 1.0727 1.0357
No log 84.0 420 1.0836 0.0445 1.0836 1.0410
No log 84.4 422 1.0951 0.0445 1.0951 1.0465
No log 84.8 424 1.1122 0.0445 1.1122 1.0546
No log 85.2 426 1.1410 0.0556 1.1410 1.0682
No log 85.6 428 1.1794 0.0556 1.1794 1.0860
No log 86.0 430 1.2073 0.0556 1.2073 1.0988
No log 86.4 432 1.2212 0.0556 1.2212 1.1051
No log 86.8 434 1.2245 0.0556 1.2245 1.1066
No log 87.2 436 1.2114 0.0556 1.2114 1.1007
No log 87.6 438 1.1872 0.0556 1.1872 1.0896
No log 88.0 440 1.1698 0.0556 1.1698 1.0816
No log 88.4 442 1.1617 0.0556 1.1617 1.0778
No log 88.8 444 1.1585 0.0556 1.1585 1.0764
No log 89.2 446 1.1588 0.0556 1.1588 1.0765
No log 89.6 448 1.1646 0.0556 1.1646 1.0792
No log 90.0 450 1.1704 0.0556 1.1704 1.0819
No log 90.4 452 1.1655 0.0556 1.1655 1.0796
No log 90.8 454 1.1668 0.0556 1.1668 1.0802
No log 91.2 456 1.1642 0.0556 1.1642 1.0790
No log 91.6 458 1.1518 0.0556 1.1518 1.0732
No log 92.0 460 1.1398 0.0556 1.1398 1.0676
No log 92.4 462 1.1317 0.0556 1.1317 1.0638
No log 92.8 464 1.1331 0.0556 1.1331 1.0645
No log 93.2 466 1.1296 0.0556 1.1296 1.0628
No log 93.6 468 1.1299 0.0556 1.1299 1.0630
No log 94.0 470 1.1277 0.0556 1.1277 1.0619
No log 94.4 472 1.1256 0.0556 1.1256 1.0610
No log 94.8 474 1.1269 0.0556 1.1269 1.0616
No log 95.2 476 1.1304 0.0556 1.1304 1.0632
No log 95.6 478 1.1342 0.0556 1.1342 1.0650
No log 96.0 480 1.1344 0.0556 1.1344 1.0651
No log 96.4 482 1.1367 0.0556 1.1367 1.0662
No log 96.8 484 1.1371 0.0556 1.1371 1.0663
No log 97.2 486 1.1353 0.0556 1.1353 1.0655
No log 97.6 488 1.1346 0.0556 1.1346 1.0652
No log 98.0 490 1.1337 0.0556 1.1337 1.0648
No log 98.4 492 1.1310 0.0556 1.1310 1.0635
No log 98.8 494 1.1301 0.0556 1.1301 1.0630
No log 99.2 496 1.1298 0.0556 1.1298 1.0629
No log 99.6 498 1.1294 0.0556 1.1294 1.0628
0.1626 100.0 500 1.1292 0.0556 1.1292 1.0626

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (F32, Safetensors)
