ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k7_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.4248
  • QWK (quadratic weighted kappa): 0.0781
  • MSE: 1.4248
  • RMSE: 1.1937
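The metrics above can be reproduced from model predictions with scikit-learn; the sketch below uses hypothetical gold labels and predictions, since the card does not describe the evaluation data.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical ordinal scores (gold) and model predictions for illustration.
y_true = [0, 1, 2, 2, 3, 1]
y_pred = [0, 2, 2, 1, 3, 1]

# QWK: agreement between two ordinal ratings, penalizing large disagreements more.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

# MSE and RMSE over the same predictions.
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
```

Note that in the table below the validation loss equals the MSE, which indicates the model was trained with a regression-style MSE objective.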

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
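The hyperparameters above can be restated as a configuration dict whose keys mirror the Hugging Face TrainingArguments parameter names (a sketch; the mapping of the Adam betas/epsilon to individual keys is the standard one, not something the card spells out):

```python
# Training hyperparameters from the card, keyed by TrainingArguments names.
training_config = {
    "learning_rate": 2e-5,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 100,
}
```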

Training results

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.1176 2 4.0140 -0.0174 4.0140 2.0035
No log 0.2353 4 2.6063 -0.0926 2.6063 1.6144
No log 0.3529 6 1.5650 -0.0684 1.5650 1.2510
No log 0.4706 8 1.1358 0.1160 1.1358 1.0658
No log 0.5882 10 1.0366 0.2541 1.0366 1.0181
No log 0.7059 12 1.1396 0.2515 1.1396 1.0675
No log 0.8235 14 1.1764 0.1240 1.1764 1.0846
No log 0.9412 16 1.1849 0.1148 1.1849 1.0885
No log 1.0588 18 1.0913 0.2243 1.0913 1.0447
No log 1.1765 20 1.0292 0.1799 1.0292 1.0145
No log 1.2941 22 1.1401 0.1755 1.1401 1.0678
No log 1.4118 24 1.3986 -0.1078 1.3986 1.1826
No log 1.5294 26 1.4496 0.0 1.4496 1.2040
No log 1.6471 28 1.2535 0.0380 1.2535 1.1196
No log 1.7647 30 1.2141 0.0996 1.2141 1.1019
No log 1.8824 32 1.1102 0.2221 1.1102 1.0536
No log 2.0 34 1.0261 0.2692 1.0261 1.0130
No log 2.1176 36 1.0550 0.2569 1.0550 1.0271
No log 2.2353 38 1.0972 0.1545 1.0972 1.0475
No log 2.3529 40 1.0478 0.2314 1.0478 1.0236
No log 2.4706 42 1.0741 0.1981 1.0741 1.0364
No log 2.5882 44 1.0318 0.2492 1.0318 1.0158
No log 2.7059 46 1.0765 0.1601 1.0765 1.0376
No log 2.8235 48 1.5068 0.1081 1.5068 1.2275
No log 2.9412 50 1.6387 0.0673 1.6387 1.2801
No log 3.0588 52 1.4641 0.1230 1.4641 1.2100
No log 3.1765 54 1.1123 0.1830 1.1123 1.0546
No log 3.2941 56 1.0416 0.3134 1.0416 1.0206
No log 3.4118 58 1.0282 0.3011 1.0282 1.0140
No log 3.5294 60 1.1434 0.1530 1.1434 1.0693
No log 3.6471 62 1.7029 -0.1001 1.7029 1.3049
No log 3.7647 64 1.8308 -0.1806 1.8308 1.3531
No log 3.8824 66 1.6226 -0.2240 1.6226 1.2738
No log 4.0 68 1.2734 0.0169 1.2734 1.1285
No log 4.1176 70 0.9954 0.2547 0.9954 0.9977
No log 4.2353 72 0.9919 0.1734 0.9919 0.9959
No log 4.3529 74 1.1816 0.1707 1.1816 1.0870
No log 4.4706 76 1.6375 -0.1434 1.6375 1.2797
No log 4.5882 78 1.6639 -0.0513 1.6639 1.2899
No log 4.7059 80 1.5073 0.1292 1.5073 1.2277
No log 4.8235 82 1.5057 0.1477 1.5057 1.2271
No log 4.9412 84 1.6460 0.1537 1.6460 1.2829
No log 5.0588 86 1.8272 0.1379 1.8272 1.3517
No log 5.1765 88 1.8576 0.1367 1.8576 1.3629
No log 5.2941 90 1.6237 0.2163 1.6237 1.2743
No log 5.4118 92 1.5985 0.2026 1.5985 1.2643
No log 5.5294 94 1.7969 0.1771 1.7969 1.3405
No log 5.6471 96 1.9635 -0.0200 1.9635 1.4013
No log 5.7647 98 1.9162 -0.0766 1.9162 1.3843
No log 5.8824 100 1.7822 0.0357 1.7822 1.3350
No log 6.0 102 1.6227 0.1656 1.6227 1.2739
No log 6.1176 104 1.6403 0.0969 1.6403 1.2807
No log 6.2353 106 1.7658 -0.0009 1.7658 1.3288
No log 6.3529 108 1.7952 0.0181 1.7952 1.3398
No log 6.4706 110 1.7833 0.0749 1.7833 1.3354
No log 6.5882 112 1.7027 0.0749 1.7027 1.3049
No log 6.7059 114 1.6791 0.0749 1.6791 1.2958
No log 6.8235 116 1.5692 0.1193 1.5692 1.2527
No log 6.9412 118 1.5476 0.0760 1.5476 1.2440
No log 7.0588 120 1.4455 0.1255 1.4455 1.2023
No log 7.1765 122 1.4469 0.0907 1.4469 1.2029
No log 7.2941 124 1.6220 0.1323 1.6220 1.2736
No log 7.4118 126 1.6811 0.0829 1.6811 1.2966
No log 7.5294 128 1.5359 0.2016 1.5359 1.2393
No log 7.6471 130 1.1755 0.2707 1.1755 1.0842
No log 7.7647 132 1.1193 0.2906 1.1193 1.0580
No log 7.8824 134 1.3122 0.2367 1.3122 1.1455
No log 8.0 136 1.5355 0.2006 1.5355 1.2392
No log 8.1176 138 1.4861 0.2317 1.4861 1.2190
No log 8.2353 140 1.3650 0.2424 1.3650 1.1683
No log 8.3529 142 1.5131 0.2058 1.5131 1.2301
No log 8.4706 144 1.5121 0.2270 1.5121 1.2297
No log 8.5882 146 1.3173 0.1562 1.3173 1.1477
No log 8.7059 148 1.3333 0.1486 1.3333 1.1547
No log 8.8235 150 1.4492 0.1814 1.4492 1.2038
No log 8.9412 152 1.5059 0.2292 1.5059 1.2272
No log 9.0588 154 1.3440 0.1486 1.3440 1.1593
No log 9.1765 156 1.1986 0.0931 1.1986 1.0948
No log 9.2941 158 1.1850 0.0931 1.1850 1.0886
No log 9.4118 160 1.2937 0.1486 1.2937 1.1374
No log 9.5294 162 1.5248 0.1955 1.5248 1.2348
No log 9.6471 164 1.4194 0.2342 1.4194 1.1914
No log 9.7647 166 1.1947 0.1628 1.1947 1.0930
No log 9.8824 168 1.1255 0.1649 1.1255 1.0609
No log 10.0 170 1.1592 0.1202 1.1592 1.0767
No log 10.1176 172 1.1977 0.1202 1.1977 1.0944
No log 10.2353 174 1.5281 0.2110 1.5281 1.2362
No log 10.3529 176 1.6108 0.1688 1.6108 1.2692
No log 10.4706 178 1.4100 0.1630 1.4100 1.1874
No log 10.5882 180 1.2376 0.1579 1.2376 1.1125
No log 10.7059 182 1.2836 0.1228 1.2836 1.1330
No log 10.8235 184 1.4999 0.2391 1.4999 1.2247
No log 10.9412 186 1.5152 0.2694 1.5152 1.2309
No log 11.0588 188 1.4173 0.2062 1.4173 1.1905
No log 11.1765 190 1.2275 0.0781 1.2275 1.1079
No log 11.2941 192 1.1713 0.0401 1.1713 1.0823
No log 11.4118 194 1.2240 0.0401 1.2240 1.1063
No log 11.5294 196 1.3159 0.1052 1.3159 1.1471
No log 11.6471 198 1.4596 0.1832 1.4596 1.2081
No log 11.7647 200 1.6607 0.1635 1.6607 1.2887
No log 11.8824 202 1.6727 0.1271 1.6727 1.2933
No log 12.0 204 1.6933 0.0807 1.6933 1.3013
No log 12.1176 206 1.7818 0.1075 1.7818 1.3348
No log 12.2353 208 1.6647 0.1729 1.6647 1.2902
No log 12.3529 210 1.4417 0.1310 1.4417 1.2007
No log 12.4706 212 1.2965 0.1052 1.2965 1.1386
No log 12.5882 214 1.2207 0.0556 1.2207 1.1048
No log 12.7059 216 1.3050 0.1407 1.3050 1.1424
No log 12.8235 218 1.4775 0.1769 1.4775 1.2155
No log 12.9412 220 1.6678 0.1607 1.6678 1.2914
No log 13.0588 222 1.7194 0.2391 1.7194 1.3113
No log 13.1765 224 1.6048 0.1769 1.6048 1.2668
No log 13.2941 226 1.4119 0.1562 1.4119 1.1882
No log 13.4118 228 1.3233 0.1202 1.3233 1.1504
No log 13.5294 230 1.3914 0.2015 1.3914 1.1796
No log 13.6471 232 1.4250 0.1703 1.4250 1.1937
No log 13.7647 234 1.4740 0.1832 1.4740 1.2141
No log 13.8824 236 1.3734 0.1703 1.3734 1.1719
No log 14.0 238 1.3462 0.2062 1.3462 1.1603
No log 14.1176 240 1.2927 0.1835 1.2927 1.1370
No log 14.2353 242 1.3602 0.2004 1.3602 1.1663
No log 14.3529 244 1.5079 0.1832 1.5079 1.2280
No log 14.4706 246 1.5987 0.2270 1.5987 1.2644
No log 14.5882 248 1.5353 0.1462 1.5353 1.2391
No log 14.7059 250 1.4060 0.1228 1.4060 1.1858
No log 14.8235 252 1.2680 0.0556 1.2680 1.1260
No log 14.9412 254 1.2128 0.0710 1.2128 1.1013
No log 15.0588 256 1.2539 0.0556 1.2539 1.1198
No log 15.1765 258 1.3794 0.1052 1.3794 1.1745
No log 15.2941 260 1.4057 0.0510 1.4057 1.1856
No log 15.4118 262 1.4154 0.1142 1.4154 1.1897
No log 15.5294 264 1.5339 0.1943 1.5339 1.2385
No log 15.6471 266 1.5234 0.2482 1.5234 1.2342
No log 15.7647 268 1.5916 0.1955 1.5916 1.2616
No log 15.8824 270 1.7899 0.1014 1.7899 1.3379
No log 16.0 272 1.8018 0.0879 1.8018 1.3423
No log 16.1176 274 1.5904 0.1388 1.5904 1.2611
No log 16.2353 276 1.4107 0.0510 1.4107 1.1877
No log 16.3529 278 1.3484 0.0278 1.3484 1.1612
No log 16.4706 280 1.2604 0.0401 1.2604 1.1227
No log 16.5882 282 1.2634 0.0401 1.2634 1.1240
No log 16.7059 284 1.3770 0.1052 1.3770 1.1735
No log 16.8235 286 1.6176 0.2437 1.6176 1.2718
No log 16.9412 288 1.7386 0.1911 1.7386 1.3186
No log 17.0588 290 1.6975 0.2006 1.6975 1.3029
No log 17.1765 292 1.6276 0.2006 1.6276 1.2758
No log 17.2941 294 1.5311 0.2170 1.5311 1.2374
No log 17.4118 296 1.3317 0.0401 1.3317 1.1540
No log 17.5294 298 1.2893 0.0 1.2893 1.1355
No log 17.6471 300 1.2873 0.0 1.2873 1.1346
No log 17.7647 302 1.2557 0.0 1.2557 1.1206
No log 17.8824 304 1.2806 0.0160 1.2806 1.1316
No log 18.0 306 1.3892 0.0401 1.3892 1.1786
No log 18.1176 308 1.5591 0.2611 1.5591 1.2487
No log 18.2353 310 1.6394 0.2221 1.6394 1.2804
No log 18.3529 312 1.7915 0.1264 1.7915 1.3385
No log 18.4706 314 1.7241 0.1264 1.7241 1.3130
No log 18.5882 316 1.5038 0.2611 1.5038 1.2263
No log 18.7059 318 1.3890 0.1370 1.3890 1.1786
No log 18.8235 320 1.4029 0.0781 1.4029 1.1844
No log 18.9412 322 1.3889 0.0401 1.3889 1.1785
No log 19.0588 324 1.3851 0.0 1.3851 1.1769
No log 19.1765 326 1.3945 0.0 1.3945 1.1809
No log 19.2941 328 1.4840 0.1052 1.4840 1.2182
No log 19.4118 330 1.6438 0.1703 1.6438 1.2821
No log 19.5294 332 1.6433 0.1298 1.6433 1.2819
No log 19.6471 334 1.6042 0.2126 1.6042 1.2666
No log 19.7647 336 1.5861 0.1562 1.5861 1.2594
No log 19.8824 338 1.5061 0.1486 1.5061 1.2272
No log 20.0 340 1.4540 0.1142 1.4540 1.2058
No log 20.1176 342 1.4318 0.1052 1.4318 1.1966
No log 20.2353 344 1.5343 0.2184 1.5343 1.2387
No log 20.3529 346 1.6128 0.2317 1.6128 1.2699
No log 20.4706 348 1.6064 0.2406 1.6064 1.2674
No log 20.5882 350 1.5060 0.3006 1.5060 1.2272
No log 20.7059 352 1.4985 0.3006 1.4985 1.2241
No log 20.8235 354 1.4809 0.2974 1.4809 1.2169
No log 20.9412 356 1.5577 0.2363 1.5577 1.2481
No log 21.0588 358 1.6146 0.2206 1.6146 1.2707
No log 21.1765 360 1.6486 0.2206 1.6486 1.2840
No log 21.2941 362 1.5752 0.2437 1.5752 1.2551
No log 21.4118 364 1.4478 0.2062 1.4478 1.2033
No log 21.5294 366 1.4855 0.2062 1.4855 1.2188
No log 21.6471 368 1.5100 0.2372 1.5100 1.2288
No log 21.7647 370 1.5795 0.1486 1.5795 1.2568
No log 21.8824 372 1.5596 0.1142 1.5596 1.2489
No log 22.0 374 1.4800 0.0401 1.4800 1.2165
No log 22.1176 376 1.5058 0.0401 1.5058 1.2271
No log 22.2353 378 1.5401 0.0878 1.5401 1.2410
No log 22.3529 380 1.5856 0.0878 1.5856 1.2592
No log 22.4706 382 1.6066 0.1562 1.6066 1.2675
No log 22.5882 384 1.5539 0.1814 1.5539 1.2465
No log 22.7059 386 1.4323 0.0401 1.4323 1.1968
No log 22.8235 388 1.4664 0.0781 1.4664 1.2109
No log 22.9412 390 1.6399 0.2004 1.6399 1.2806
No log 23.0588 392 1.7552 0.1573 1.7552 1.3248
No log 23.1765 394 1.7306 0.2342 1.7306 1.3155
No log 23.2941 396 1.5592 0.1634 1.5592 1.2487
No log 23.4118 398 1.4100 0.1288 1.4100 1.1874
No log 23.5294 400 1.4331 0.1113 1.4331 1.1971
No log 23.6471 402 1.5573 0.1880 1.5573 1.2479
No log 23.7647 404 1.7161 0.2005 1.7161 1.3100
No log 23.8824 406 1.8383 0.1279 1.8383 1.3558
No log 24.0 408 1.7874 0.1892 1.7874 1.3369
No log 24.1176 410 1.6410 0.1310 1.6410 1.2810
No log 24.2353 412 1.4747 0.0401 1.4747 1.2144
No log 24.3529 414 1.3673 0.0401 1.3673 1.1693
No log 24.4706 416 1.3678 0.0401 1.3678 1.1695
No log 24.5882 418 1.4691 0.0781 1.4691 1.2121
No log 24.7059 420 1.6057 0.1310 1.6057 1.2672
No log 24.8235 422 1.6334 0.0970 1.6334 1.2780
No log 24.9412 424 1.6869 0.0554 1.6869 1.2988
No log 25.0588 426 1.6993 0.1462 1.6993 1.3036
No log 25.1765 428 1.6813 0.2004 1.6813 1.2967
No log 25.2941 430 1.6156 0.2062 1.6156 1.2710
No log 25.4118 432 1.5345 0.1880 1.5345 1.2387
No log 25.5294 434 1.5427 0.2184 1.5427 1.2421
No log 25.6471 436 1.5656 0.1142 1.5656 1.2513
No log 25.7647 438 1.5863 0.0970 1.5863 1.2595
No log 25.8824 440 1.5905 0.0970 1.5905 1.2611
No log 26.0 442 1.5386 0.0878 1.5386 1.2404
No log 26.1176 444 1.4024 0.0401 1.4024 1.1842
No log 26.2353 446 1.2834 0.0401 1.2834 1.1329
No log 26.3529 448 1.3089 0.0401 1.3089 1.1441
No log 26.4706 450 1.3984 0.0401 1.3984 1.1825
No log 26.5882 452 1.5329 0.0781 1.5329 1.2381
No log 26.7059 454 1.7029 0.1601 1.7029 1.3050
No log 26.8235 456 1.8135 0.1718 1.8135 1.3467
No log 26.9412 458 1.8347 0.1582 1.8347 1.3545
No log 27.0588 460 1.7496 0.2206 1.7496 1.3227
No log 27.1765 462 1.7127 0.2525 1.7127 1.3087
No log 27.2941 464 1.6038 0.2568 1.6038 1.2664
No log 27.4118 466 1.5118 0.1814 1.5118 1.2295
No log 27.5294 468 1.4948 0.0401 1.4948 1.2226
No log 27.6471 470 1.5303 0.0401 1.5303 1.2371
No log 27.7647 472 1.5651 0.0401 1.5651 1.2510
No log 27.8824 474 1.6417 0.1228 1.6417 1.2813
No log 28.0 476 1.7256 0.2292 1.7256 1.3136
No log 28.1176 478 1.7427 0.2611 1.7427 1.3201
No log 28.2353 480 1.6944 0.2611 1.6944 1.3017
No log 28.3529 482 1.6585 0.2391 1.6585 1.2878
No log 28.4706 484 1.6667 0.2653 1.6667 1.2910
No log 28.5882 486 1.6771 0.1943 1.6771 1.2950
No log 28.7059 488 1.7011 0.1298 1.7011 1.3043
No log 28.8235 490 1.7033 0.0896 1.7033 1.3051
No log 28.9412 492 1.6742 0.1310 1.6742 1.2939
No log 29.0588 494 1.5713 0.0781 1.5713 1.2535
No log 29.1765 496 1.4940 0.0401 1.4940 1.2223
No log 29.2941 498 1.4486 0.1142 1.4486 1.2036
0.2647 29.4118 500 1.4531 0.1634 1.4531 1.2054
0.2647 29.5294 502 1.5616 0.2342 1.5616 1.2496
0.2647 29.6471 504 1.6245 0.2566 1.6245 1.2746
0.2647 29.7647 506 1.6412 0.2566 1.6412 1.2811
0.2647 29.8824 508 1.6653 0.2566 1.6653 1.2905
0.2647 30.0 510 1.6087 0.1729 1.6087 1.2683
0.2647 30.1176 512 1.5731 0.1634 1.5731 1.2542
0.2647 30.2353 514 1.5173 0.0401 1.5173 1.2318
0.2647 30.3529 516 1.4582 0.0401 1.4582 1.2076
0.2647 30.4706 518 1.4132 0.0401 1.4132 1.1888
0.2647 30.5882 520 1.4248 0.0781 1.4248 1.1937

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (F32, Safetensors)
