ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k4_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.5206
  • Qwk (quadratic weighted kappa): 0.0672
  • Mse (mean squared error): 1.5206
  • Rmse (root mean squared error): 1.2331
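The evaluation metrics above can be reproduced from integer score predictions. A minimal sketch (pure Python, no external dependencies; the function name and the assumption of labels on a 0..n_classes-1 scale are illustrative, not taken from this model's actual evaluation code):

```python
import math

def regression_metrics(y_true, y_pred, n_classes):
    """Compute MSE, RMSE, and quadratic weighted kappa (QWK)
    for integer score predictions on a 0..n_classes-1 scale."""
    n = len(y_true)
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n
    rmse = math.sqrt(mse)

    # Observed confusion matrix O.
    O = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Marginal histograms of true and predicted labels.
    hist_t = [sum(1 for t in y_true if t == i) for i in range(n_classes)]
    hist_p = [sum(1 for p in y_pred if p == i) for i in range(n_classes)]

    # QWK = 1 - sum(W*O) / sum(W*E), with quadratic weights W and
    # expected counts E under independence of the two marginals.
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            e = hist_t[i] * hist_p[j] / n
            num += w * O[i][j]
            den += w * e
    qwk = 1.0 - num / den
    return mse, rmse, qwk
```

Note that RMSE is simply the square root of MSE, which is why the reported Loss and Mse coincide when the model is trained with an MSE objective.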

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
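With `lr_scheduler_type: linear` and no warmup specified, the learning rate decays linearly from 2e-05 to zero over training. A minimal sketch of that schedule (mirroring the behavior of transformers' `get_linear_schedule_with_warmup`; `total_steps` here is illustrative, not this run's actual step count):

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Linear schedule: ramp from 0 to base_lr over warmup_steps,
    then decay linearly to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = (total_steps - step) / max(1, total_steps - warmup_steps)
    return base_lr * max(0.0, remaining)
```

For example, halfway through training the learning rate is half of 2e-05.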

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0952 2 4.8857 -0.0075 4.8857 2.2104
No log 0.1905 4 3.0851 -0.0481 3.0851 1.7565
No log 0.2857 6 2.4287 -0.0893 2.4287 1.5584
No log 0.3810 8 2.3034 -0.0893 2.3034 1.5177
No log 0.4762 10 2.6507 -0.0950 2.6507 1.6281
No log 0.5714 12 1.7613 -0.2379 1.7613 1.3271
No log 0.6667 14 1.5487 -0.0729 1.5487 1.2445
No log 0.7619 16 1.4407 -0.0460 1.4407 1.2003
No log 0.8571 18 1.6938 -0.0550 1.6938 1.3015
No log 0.9524 20 2.1369 -0.0283 2.1369 1.4618
No log 1.0476 22 2.4499 -0.0073 2.4499 1.5652
No log 1.1429 24 1.9979 0.0000 1.9979 1.4135
No log 1.2381 26 1.4608 -0.0661 1.4608 1.2086
No log 1.3333 28 1.2876 0.0711 1.2876 1.1347
No log 1.4286 30 1.2581 0.0527 1.2581 1.1216
No log 1.5238 32 1.2484 0.0599 1.2484 1.1173
No log 1.6190 34 1.3589 -0.0078 1.3589 1.1657
No log 1.7143 36 1.3742 -0.0267 1.3742 1.1723
No log 1.8095 38 1.3469 0.0319 1.3469 1.1605
No log 1.9048 40 1.3042 -0.0609 1.3042 1.1420
No log 2.0 42 1.2460 0.1009 1.2460 1.1162
No log 2.0952 44 1.2207 0.0872 1.2207 1.1049
No log 2.1905 46 1.2234 0.1110 1.2234 1.1061
No log 2.2857 48 1.1873 0.0974 1.1873 1.0896
No log 2.3810 50 1.2320 0.1834 1.2320 1.1100
No log 2.4762 52 1.2685 0.1495 1.2685 1.1263
No log 2.5714 54 1.2739 0.1397 1.2739 1.1287
No log 2.6667 56 1.3342 0.1171 1.3342 1.1551
No log 2.7619 58 1.4566 -0.0721 1.4566 1.2069
No log 2.8571 60 1.3641 0.0883 1.3641 1.1679
No log 2.9524 62 1.3283 0.0780 1.3283 1.1525
No log 3.0476 64 1.3518 0.1043 1.3518 1.1627
No log 3.1429 66 1.4027 0.0573 1.4027 1.1843
No log 3.2381 68 1.3132 0.0878 1.3132 1.1460
No log 3.3333 70 1.3261 0.0318 1.3261 1.1516
No log 3.4286 72 1.2887 0.1032 1.2887 1.1352
No log 3.5238 74 1.4054 0.1870 1.4054 1.1855
No log 3.6190 76 1.4568 0.1388 1.4568 1.2070
No log 3.7143 78 1.3153 0.1176 1.3153 1.1468
No log 3.8095 80 1.3183 0.0566 1.3183 1.1482
No log 3.9048 82 1.3337 0.0914 1.3337 1.1549
No log 4.0 84 1.3059 0.1000 1.3059 1.1428
No log 4.0952 86 1.2430 0.1278 1.2430 1.1149
No log 4.1905 88 1.3041 0.0834 1.3041 1.1420
No log 4.2857 90 1.2701 0.1053 1.2701 1.1270
No log 4.3810 92 1.2757 0.0954 1.2757 1.1295
No log 4.4762 94 1.2917 0.0954 1.2917 1.1365
No log 4.5714 96 1.2816 0.0954 1.2816 1.1321
No log 4.6667 98 1.3291 0.0082 1.3291 1.1529
No log 4.7619 100 1.3979 0.0571 1.3979 1.1823
No log 4.8571 102 1.2616 0.1081 1.2616 1.1232
No log 4.9524 104 1.2797 0.0948 1.2797 1.1312
No log 5.0476 106 1.3500 0.1479 1.3500 1.1619
No log 5.1429 108 1.4020 0.1188 1.4020 1.1841
No log 5.2381 110 1.4368 0.1097 1.4368 1.1987
No log 5.3333 112 1.4960 0.1448 1.4960 1.2231
No log 5.4286 114 1.5439 0.2239 1.5439 1.2425
No log 5.5238 116 1.4961 0.1698 1.4961 1.2232
No log 5.6190 118 1.4548 0.0631 1.4548 1.2061
No log 5.7143 120 1.5714 0.0809 1.5714 1.2536
No log 5.8095 122 1.5352 0.1185 1.5352 1.2390
No log 5.9048 124 1.7145 0.1629 1.7145 1.3094
No log 6.0 126 1.7985 0.1569 1.7985 1.3411
No log 6.0952 128 1.7979 0.1634 1.7979 1.3409
No log 6.1905 130 1.7943 0.2019 1.7943 1.3395
No log 6.2857 132 1.5925 0.1902 1.5925 1.2619
No log 6.3810 134 1.5351 0.1602 1.5351 1.2390
No log 6.4762 136 1.5788 0.3127 1.5788 1.2565
No log 6.5714 138 1.6434 0.1283 1.6434 1.2820
No log 6.6667 140 1.6027 0.2163 1.6027 1.2660
No log 6.7619 142 1.5571 0.1135 1.5571 1.2479
No log 6.8571 144 1.4944 0.1098 1.4944 1.2224
No log 6.9524 146 1.3711 0.0958 1.3711 1.1710
No log 7.0476 148 1.6961 0.0578 1.6961 1.3023
No log 7.1429 150 1.7971 0.0244 1.7971 1.3405
No log 7.2381 152 1.4759 0.0840 1.4759 1.2149
No log 7.3333 154 1.4325 0.0976 1.4325 1.1969
No log 7.4286 156 1.6349 0.0818 1.6349 1.2786
No log 7.5238 158 1.6866 0.1109 1.6866 1.2987
No log 7.6190 160 1.6670 0.0422 1.6670 1.2911
No log 7.7143 162 1.5026 0.1249 1.5026 1.2258
No log 7.8095 164 1.4117 0.1130 1.4117 1.1881
No log 7.9048 166 1.4319 0.1509 1.4319 1.1966
No log 8.0 168 1.4565 0.1420 1.4565 1.2069
No log 8.0952 170 1.4533 0.1648 1.4533 1.2055
No log 8.1905 172 1.5452 0.1253 1.5452 1.2431
No log 8.2857 174 1.6496 0.1765 1.6496 1.2844
No log 8.3810 176 1.6247 0.2099 1.6247 1.2746
No log 8.4762 178 1.5977 0.1583 1.5977 1.2640
No log 8.5714 180 1.6176 0.1171 1.6176 1.2719
No log 8.6667 182 1.5842 0.1051 1.5842 1.2586
No log 8.7619 184 1.5647 0.0981 1.5647 1.2509
No log 8.8571 186 1.5689 0.0920 1.5689 1.2526
No log 8.9524 188 1.5880 0.1371 1.5880 1.2601
No log 9.0476 190 1.6412 0.1027 1.6412 1.2811
No log 9.1429 192 1.6586 0.1310 1.6586 1.2879
No log 9.2381 194 1.6245 0.1943 1.6245 1.2746
No log 9.3333 196 1.6343 -0.1406 1.6343 1.2784
No log 9.4286 198 1.6662 -0.0678 1.6662 1.2908
No log 9.5238 200 1.5838 0.0367 1.5838 1.2585
No log 9.6190 202 1.7033 0.1845 1.7033 1.3051
No log 9.7143 204 1.7867 0.1806 1.7867 1.3367
No log 9.8095 206 1.7578 0.1758 1.7578 1.3258
No log 9.9048 208 1.6294 0.1254 1.6294 1.2765
No log 10.0 210 1.4628 -0.0502 1.4628 1.2095
No log 10.0952 212 1.4349 0.0423 1.4349 1.1979
No log 10.1905 214 1.4257 0.0217 1.4257 1.1940
No log 10.2857 216 1.5404 0.1371 1.5404 1.2411
No log 10.3810 218 1.6557 0.1970 1.6557 1.2867
No log 10.4762 220 1.6681 0.1639 1.6681 1.2915
No log 10.5714 222 1.5718 0.2289 1.5718 1.2537
No log 10.6667 224 1.4710 0.1162 1.4710 1.2129
No log 10.7619 226 1.3709 0.1250 1.3709 1.1708
No log 10.8571 228 1.3656 0.0596 1.3656 1.1686
No log 10.9524 230 1.4289 0.0702 1.4289 1.1953
No log 11.0476 232 1.4977 0.1592 1.4977 1.2238
No log 11.1429 234 1.4926 0.1071 1.4926 1.2217
No log 11.2381 236 1.4761 0.1131 1.4761 1.2149
No log 11.3333 238 1.5294 0.0958 1.5294 1.2367
No log 11.4286 240 1.6406 0.1634 1.6406 1.2809
No log 11.5238 242 1.6545 0.0765 1.6545 1.2863
No log 11.6190 244 1.6737 0.0338 1.6737 1.2937
No log 11.7143 246 1.6506 0.0406 1.6506 1.2848
No log 11.8095 248 1.7370 0.1308 1.7370 1.3179
No log 11.9048 250 1.6953 0.1416 1.6953 1.3020
No log 12.0 252 1.6572 0.0865 1.6572 1.2873
No log 12.0952 254 1.6104 0.0707 1.6104 1.2690
No log 12.1905 256 1.6025 0.0707 1.6025 1.2659
No log 12.2857 258 1.6283 0.1756 1.6283 1.2761
No log 12.3810 260 1.7617 0.1726 1.7617 1.3273
No log 12.4762 262 1.8017 0.2032 1.8017 1.3423
No log 12.5714 264 1.7072 0.2181 1.7072 1.3066
No log 12.6667 266 1.6096 0.2065 1.6096 1.2687
No log 12.7619 268 1.5792 0.2020 1.5792 1.2567
No log 12.8571 270 1.6518 0.2839 1.6518 1.2852
No log 12.9524 272 1.8034 0.2435 1.8034 1.3429
No log 13.0476 274 1.9095 0.0784 1.9095 1.3819
No log 13.1429 276 1.8651 0.0784 1.8651 1.3657
No log 13.2381 278 1.6842 0.0907 1.6842 1.2978
No log 13.3333 280 1.5062 0.2678 1.5062 1.2273
No log 13.4286 282 1.3941 0.1745 1.3941 1.1807
No log 13.5238 284 1.3768 0.1026 1.3768 1.1734
No log 13.6190 286 1.4066 0.1122 1.4066 1.1860
No log 13.7143 288 1.4903 0.1935 1.4903 1.2208
No log 13.8095 290 1.5760 0.1796 1.5760 1.2554
No log 13.9048 292 1.5962 0.1618 1.5962 1.2634
No log 14.0 294 1.5528 0.0578 1.5528 1.2461
No log 14.0952 296 1.5545 -0.0130 1.5545 1.2468
No log 14.1905 298 1.5604 -0.0529 1.5604 1.2492
No log 14.2857 300 1.5492 -0.0515 1.5492 1.2447
No log 14.3810 302 1.5881 0.0173 1.5881 1.2602
No log 14.4762 304 1.6718 0.0782 1.6718 1.2930
No log 14.5714 306 1.6595 0.0716 1.6595 1.2882
No log 14.6667 308 1.6631 0.1064 1.6631 1.2896
No log 14.7619 310 1.6478 0.1335 1.6478 1.2837
No log 14.8571 312 1.6142 0.0416 1.6142 1.2705
No log 14.9524 314 1.5759 0.0562 1.5759 1.2553
No log 15.0476 316 1.5527 0.1080 1.5527 1.2461
No log 15.1429 318 1.5456 0.0152 1.5456 1.2432
No log 15.2381 320 1.6055 0.1004 1.6055 1.2671
No log 15.3333 322 1.6210 0.1708 1.6210 1.2732
No log 15.4286 324 1.5916 0.1427 1.5916 1.2616
No log 15.5238 326 1.5305 0.0589 1.5305 1.2371
No log 15.6190 328 1.4800 -0.0513 1.4800 1.2166
No log 15.7143 330 1.5122 -0.0069 1.5122 1.2297
No log 15.8095 332 1.6274 0.1756 1.6274 1.2757
No log 15.9048 334 1.6938 0.1801 1.6938 1.3015
No log 16.0 336 1.6592 0.1708 1.6592 1.2881
No log 16.0952 338 1.6134 0.1046 1.6134 1.2702
No log 16.1905 340 1.5769 0.0741 1.5769 1.2558
No log 16.2857 342 1.5349 -0.0161 1.5349 1.2389
No log 16.3810 344 1.5394 -0.0161 1.5394 1.2407
No log 16.4762 346 1.5364 -0.0161 1.5364 1.2395
No log 16.5714 348 1.5587 0.0422 1.5587 1.2485
No log 16.6667 350 1.6233 0.1516 1.6233 1.2741
No log 16.7619 352 1.6544 0.0818 1.6544 1.2862
No log 16.8571 354 1.6308 0.1224 1.6308 1.2770
No log 16.9524 356 1.5572 0.1224 1.5572 1.2479
No log 17.0476 358 1.4733 0.1090 1.4733 1.2138
No log 17.1429 360 1.4140 0.0500 1.4140 1.1891
No log 17.2381 362 1.3968 0.0682 1.3968 1.1819
No log 17.3333 364 1.4385 0.0596 1.4385 1.1994
No log 17.4286 366 1.5098 0.0898 1.5098 1.2288
No log 17.5238 368 1.5564 0.1046 1.5564 1.2476
No log 17.6190 370 1.5394 0.0399 1.5394 1.2407
No log 17.7143 372 1.5110 -0.0602 1.5110 1.2292
No log 17.8095 374 1.5066 0.0303 1.5066 1.2274
No log 17.9048 376 1.4992 0.0399 1.4992 1.2244
No log 18.0 378 1.4999 0.0871 1.4999 1.2247
No log 18.0952 380 1.5163 0.1046 1.5163 1.2314
No log 18.1905 382 1.5951 0.1310 1.5951 1.2630
No log 18.2857 384 1.7005 0.2285 1.7005 1.3040
No log 18.3810 386 1.6530 0.2343 1.6530 1.2857
No log 18.4762 388 1.5047 0.0662 1.5047 1.2267
No log 18.5714 390 1.4316 0.1046 1.4316 1.1965
No log 18.6667 392 1.4627 0.0759 1.4627 1.2094
No log 18.7619 394 1.5194 0.1461 1.5194 1.2326
No log 18.8571 396 1.4975 0.1071 1.4975 1.2237
No log 18.9524 398 1.4690 0.0662 1.4690 1.2120
No log 19.0476 400 1.4249 0.1017 1.4249 1.1937
No log 19.1429 402 1.4690 0.1166 1.4690 1.2120
No log 19.2381 404 1.5766 0.1583 1.5766 1.2556
No log 19.3333 406 1.7231 0.2886 1.7231 1.3127
No log 19.4286 408 1.7577 0.1845 1.7577 1.3258
No log 19.5238 410 1.6631 0.2375 1.6631 1.2896
No log 19.6190 412 1.5181 0.0792 1.5181 1.2321
No log 19.7143 414 1.4848 0.1166 1.4848 1.2185
No log 19.8095 416 1.5501 0.1935 1.5501 1.2450
No log 19.9048 418 1.6528 0.2633 1.6528 1.2856
No log 20.0 420 1.7934 0.1634 1.7934 1.3392
No log 20.0952 422 1.8253 0.1745 1.8253 1.3511
No log 20.1905 424 1.7606 0.1095 1.7606 1.3269
No log 20.2857 426 1.6037 0.1756 1.6037 1.2664
No log 20.3810 428 1.4036 0.1312 1.4036 1.1847
No log 20.4762 430 1.3283 0.1279 1.3283 1.1525
No log 20.5714 432 1.3206 0.1114 1.3206 1.1492
No log 20.6667 434 1.3618 0.0442 1.3618 1.1670
No log 20.7619 436 1.4361 0.0702 1.4361 1.1984
No log 20.8571 438 1.5152 0.0920 1.5152 1.2309
No log 20.9524 440 1.6103 0.1568 1.6103 1.2690
No log 21.0476 442 1.5874 0.1171 1.5874 1.2599
No log 21.1429 444 1.4712 0.1395 1.4712 1.2129
No log 21.2381 446 1.4379 0.1674 1.4379 1.1991
No log 21.3333 448 1.4908 0.1984 1.4908 1.2210
No log 21.4286 450 1.6294 0.2726 1.6294 1.2765
No log 21.5238 452 1.6849 0.2806 1.6849 1.2980
No log 21.6190 454 1.6698 0.2578 1.6698 1.2922
No log 21.7143 456 1.6338 0.1929 1.6338 1.2782
No log 21.8095 458 1.5606 0.1051 1.5606 1.2492
No log 21.9048 460 1.4687 0.1162 1.4687 1.2119
No log 22.0 462 1.3923 0.1067 1.3923 1.1800
No log 22.0952 464 1.3428 0.0751 1.3428 1.1588
No log 22.1905 466 1.3491 0.0751 1.3491 1.1615
No log 22.2857 468 1.3734 0.1217 1.3734 1.1719
No log 22.3810 470 1.3938 0.1051 1.3938 1.1806
No log 22.4762 472 1.4239 0.1222 1.4239 1.1933
No log 22.5714 474 1.4848 0.0406 1.4848 1.2185
No log 22.6667 476 1.4844 0.0839 1.4844 1.2183
No log 22.7619 478 1.5034 0.1121 1.5034 1.2261
No log 22.8571 480 1.5386 0.1121 1.5386 1.2404
No log 22.9524 482 1.5008 0.1364 1.5008 1.2251
No log 23.0476 484 1.4008 0.0626 1.4008 1.1836
No log 23.1429 486 1.3640 0.0541 1.3640 1.1679
No log 23.2381 488 1.3536 0.0233 1.3536 1.1634
No log 23.3333 490 1.3935 0.1462 1.3935 1.1805
No log 23.4286 492 1.3981 -0.0072 1.3981 1.1824
No log 23.5238 494 1.3454 0.1048 1.3454 1.1599
No log 23.6190 496 1.2968 0.1279 1.2968 1.1388
No log 23.7143 498 1.2992 0.0916 1.2992 1.1398
0.3569 23.8095 500 1.3254 0.1279 1.3254 1.1512
0.3569 23.9048 502 1.3885 0.1193 1.3885 1.1783
0.3569 24.0 504 1.5233 0.1447 1.5233 1.2342
0.3569 24.0952 506 1.7075 0.2424 1.7075 1.3067
0.3569 24.1905 508 1.7861 0.2138 1.7861 1.3365
0.3569 24.2857 510 1.7439 0.2071 1.7439 1.3206
0.3569 24.3810 512 1.6777 0.1999 1.6777 1.2953
0.3569 24.4762 514 1.6446 0.1760 1.6446 1.2824
0.3569 24.5714 516 1.5583 0.1336 1.5583 1.2483
0.3569 24.6667 518 1.4830 0.0732 1.4830 1.2178
0.3569 24.7619 520 1.4330 0.1118 1.4330 1.1971
0.3569 24.8571 522 1.4193 0.1118 1.4193 1.1914
0.3569 24.9524 524 1.4647 0.0444 1.4647 1.2102
0.3569 25.0476 526 1.5206 0.0672 1.5206 1.2331

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors: 0.1B params, F32 tensors

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k4_task2_organization

Base model: aubmindlab/bert-base-arabertv02 → this model