ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k1_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.3110
  • Qwk: 0.4923
  • Mse: 1.3110
  • Rmse: 1.1450
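Note that Loss and Mse coincide throughout the card, which indicates the model is trained as a regressor with an MSE objective. The evaluation script is not included, but these metrics can be reproduced from integer essay scores with standard libraries. A minimal sketch (the score values below are hypothetical, not from the dataset):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold scores and rounded model outputs, for illustration only.
y_true = np.array([0, 1, 2, 3, 2, 1])
y_pred = np.array([0, 1, 2, 2, 2, 0])

# Quadratic weighted kappa ("Qwk" in the table above).
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))

print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```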

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
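These settings map one-to-one onto Hugging Face `TrainingArguments` fields. A sketch assuming the run used the standard Trainer API (the output directory name is a placeholder; dataset loading and the metric function are omitted):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; "out" is a placeholder path.
args = TrainingArguments(
    output_dir="out",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```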

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.4 2 5.8937 0.0381 5.8937 2.4277
No log 0.8 4 4.7279 0.0142 4.7279 2.1744
No log 1.2 6 3.4648 0.0485 3.4648 1.8614
No log 1.6 8 2.4939 0.0658 2.4939 1.5792
No log 2.0 10 2.1040 0.2239 2.1040 1.4505
No log 2.4 12 2.0142 0.1017 2.0142 1.4192
No log 2.8 14 1.9889 0.1176 1.9889 1.4103
No log 3.2 16 1.8914 0.2393 1.8914 1.3753
No log 3.6 18 1.7768 0.3448 1.7768 1.3330
No log 4.0 20 1.6923 0.3761 1.6923 1.3009
No log 4.4 22 1.6664 0.3761 1.6664 1.2909
No log 4.8 24 1.6339 0.3478 1.6339 1.2783
No log 5.2 26 1.6472 0.2364 1.6472 1.2834
No log 5.6 28 1.6910 0.2364 1.6910 1.3004
No log 6.0 30 1.6965 0.2364 1.6965 1.3025
No log 6.4 32 1.6525 0.2364 1.6525 1.2855
No log 6.8 34 1.5751 0.2832 1.5751 1.2550
No log 7.2 36 1.5752 0.3471 1.5752 1.2551
No log 7.6 38 1.5412 0.3471 1.5412 1.2415
No log 8.0 40 1.5814 0.4341 1.5814 1.2575
No log 8.4 42 1.5687 0.4341 1.5687 1.2525
No log 8.8 44 1.5243 0.3077 1.5243 1.2346
No log 9.2 46 1.4318 0.2679 1.4318 1.1966
No log 9.6 48 1.3027 0.3717 1.3027 1.1414
No log 10.0 50 1.2976 0.4754 1.2976 1.1391
No log 10.4 52 1.3195 0.4878 1.3195 1.1487
No log 10.8 54 1.2398 0.4538 1.2398 1.1135
No log 11.2 56 1.2476 0.4370 1.2476 1.1169
No log 11.6 58 1.3344 0.35 1.3344 1.1552
No log 12.0 60 1.4404 0.4127 1.4404 1.2002
No log 12.4 62 1.4396 0.3902 1.4396 1.1998
No log 12.8 64 1.4332 0.3902 1.4332 1.1972
No log 13.2 66 1.3807 0.3158 1.3807 1.1750
No log 13.6 68 1.3366 0.3478 1.3366 1.1561
No log 14.0 70 1.2521 0.4370 1.2521 1.1190
No log 14.4 72 1.2828 0.4651 1.2828 1.1326
No log 14.8 74 1.3602 0.4545 1.3602 1.1663
No log 15.2 76 1.4627 0.4030 1.4627 1.2094
No log 15.6 78 1.5119 0.3796 1.5119 1.2296
No log 16.0 80 1.3741 0.4531 1.3741 1.1722
No log 16.4 82 1.2626 0.5197 1.2626 1.1237
No log 16.8 84 1.2926 0.5312 1.2926 1.1369
No log 17.2 86 1.3735 0.4328 1.3735 1.1720
No log 17.6 88 1.5302 0.3333 1.5302 1.2370
No log 18.0 90 1.5567 0.3358 1.5567 1.2477
No log 18.4 92 1.5281 0.3333 1.5281 1.2362
No log 18.8 94 1.3301 0.4511 1.3301 1.1533
No log 19.2 96 1.2001 0.5426 1.2001 1.0955
No log 19.6 98 1.1884 0.5469 1.1884 1.0901
No log 20.0 100 1.2874 0.4427 1.2874 1.1347
No log 20.4 102 1.3365 0.4394 1.3365 1.1561
No log 20.8 104 1.3974 0.4697 1.3974 1.1821
No log 21.2 106 1.4154 0.4733 1.4154 1.1897
No log 21.6 108 1.3358 0.4961 1.3358 1.1557
No log 22.0 110 1.3172 0.496 1.3172 1.1477
No log 22.4 112 1.3680 0.4769 1.3680 1.1696
No log 22.8 114 1.3561 0.4615 1.3561 1.1645
No log 23.2 116 1.3840 0.4427 1.3840 1.1765
No log 23.6 118 1.3457 0.4615 1.3457 1.1600
No log 24.0 120 1.3151 0.4921 1.3151 1.1468
No log 24.4 122 1.2870 0.4034 1.2870 1.1345
No log 24.8 124 1.2989 0.3898 1.2989 1.1397
No log 25.2 126 1.3483 0.5197 1.3483 1.1612
No log 25.6 128 1.4044 0.4923 1.4044 1.1851
No log 26.0 130 1.3581 0.4923 1.3581 1.1654
No log 26.4 132 1.3428 0.4923 1.3428 1.1588
No log 26.8 134 1.3470 0.5271 1.3470 1.1606
No log 27.2 136 1.3158 0.5271 1.3158 1.1471
No log 27.6 138 1.3498 0.5231 1.3498 1.1618
No log 28.0 140 1.3897 0.4580 1.3897 1.1788
No log 28.4 142 1.4299 0.4211 1.4299 1.1958
No log 28.8 144 1.4481 0.4211 1.4481 1.2034
No log 29.2 146 1.4412 0.4211 1.4412 1.2005
No log 29.6 148 1.4114 0.4885 1.4114 1.1880
No log 30.0 150 1.4087 0.4885 1.4087 1.1869
No log 30.4 152 1.4168 0.4179 1.4168 1.1903
No log 30.8 154 1.3474 0.4812 1.3474 1.1608
No log 31.2 156 1.3095 0.4812 1.3095 1.1443
No log 31.6 158 1.3602 0.4361 1.3602 1.1663
No log 32.0 160 1.4391 0.4179 1.4391 1.1996
No log 32.4 162 1.4424 0.4179 1.4424 1.2010
No log 32.8 164 1.4208 0.4179 1.4208 1.1920
No log 33.2 166 1.3248 0.4961 1.3248 1.1510
No log 33.6 168 1.2593 0.4882 1.2593 1.1222
No log 34.0 170 1.3079 0.4961 1.3079 1.1437
No log 34.4 172 1.4280 0.4179 1.4280 1.1950
No log 34.8 174 1.4251 0.4179 1.4251 1.1938
No log 35.2 176 1.4058 0.4179 1.4058 1.1857
No log 35.6 178 1.3448 0.4615 1.3448 1.1597
No log 36.0 180 1.3777 0.4427 1.3777 1.1738
No log 36.4 182 1.5027 0.3852 1.5027 1.2258
No log 36.8 184 1.5399 0.3852 1.5399 1.2409
No log 37.2 186 1.5314 0.3852 1.5314 1.2375
No log 37.6 188 1.4993 0.4179 1.4993 1.2245
No log 38.0 190 1.3744 0.4341 1.3744 1.1723
No log 38.4 192 1.2591 0.48 1.2591 1.1221
No log 38.8 194 1.2535 0.4715 1.2535 1.1196
No log 39.2 196 1.2965 0.496 1.2965 1.1387
No log 39.6 198 1.4266 0.4394 1.4266 1.1944
No log 40.0 200 1.7019 0.3309 1.7019 1.3046
No log 40.4 202 1.8739 0.2778 1.8739 1.3689
No log 40.8 204 1.8614 0.3034 1.8614 1.3643
No log 41.2 206 1.6909 0.3262 1.6909 1.3003
No log 41.6 208 1.4100 0.4361 1.4100 1.1874
No log 42.0 210 1.2677 0.5152 1.2677 1.1259
No log 42.4 212 1.2087 0.5312 1.2087 1.0994
No log 42.8 214 1.2152 0.5039 1.2152 1.1024
No log 43.2 216 1.2933 0.4806 1.2933 1.1372
No log 43.6 218 1.4655 0.4361 1.4655 1.2106
No log 44.0 220 1.6232 0.3309 1.6232 1.2740
No log 44.4 222 1.6456 0.3043 1.6456 1.2828
No log 44.8 224 1.5553 0.3910 1.5553 1.2471
No log 45.2 226 1.4325 0.4615 1.4325 1.1969
No log 45.6 228 1.3235 0.496 1.3235 1.1504
No log 46.0 230 1.2954 0.4628 1.2954 1.1382
No log 46.4 232 1.3092 0.4839 1.3092 1.1442
No log 46.8 234 1.3480 0.512 1.3480 1.1610
No log 47.2 236 1.4155 0.4615 1.4155 1.1897
No log 47.6 238 1.4362 0.4615 1.4362 1.1984
No log 48.0 240 1.3925 0.5354 1.3925 1.1800
No log 48.4 242 1.3467 0.528 1.3467 1.1605
No log 48.8 244 1.3208 0.5 1.3208 1.1493
No log 49.2 246 1.3365 0.5238 1.3365 1.1561
No log 49.6 248 1.3646 0.4580 1.3646 1.1682
No log 50.0 250 1.3277 0.4961 1.3277 1.1523
No log 50.4 252 1.3031 0.4923 1.3031 1.1415
No log 50.8 254 1.2898 0.4806 1.2898 1.1357
No log 51.2 256 1.3025 0.4806 1.3025 1.1413
No log 51.6 258 1.3293 0.4806 1.3293 1.1530
No log 52.0 260 1.3877 0.4688 1.3877 1.1780
No log 52.4 262 1.3858 0.4688 1.3858 1.1772
No log 52.8 264 1.3349 0.4806 1.3349 1.1554
No log 53.2 266 1.2839 0.5 1.2839 1.1331
No log 53.6 268 1.2804 0.5 1.2804 1.1315
No log 54.0 270 1.3260 0.4806 1.3260 1.1515
No log 54.4 272 1.4125 0.4961 1.4125 1.1885
No log 54.8 274 1.4314 0.4733 1.4314 1.1964
No log 55.2 276 1.4092 0.4923 1.4092 1.1871
No log 55.6 278 1.3714 0.4885 1.3714 1.1711
No log 56.0 280 1.3050 0.48 1.3050 1.1424
No log 56.4 282 1.2831 0.5161 1.2831 1.1327
No log 56.8 284 1.2970 0.544 1.2970 1.1389
No log 57.2 286 1.2939 0.544 1.2939 1.1375
No log 57.6 288 1.2905 0.544 1.2905 1.1360
No log 58.0 290 1.2788 0.544 1.2788 1.1309
No log 58.4 292 1.3011 0.5116 1.3011 1.1407
No log 58.8 294 1.3434 0.5116 1.3434 1.1590
No log 59.2 296 1.3117 0.5512 1.3117 1.1453
No log 59.6 298 1.3064 0.5512 1.3064 1.1430
No log 60.0 300 1.3172 0.5156 1.3172 1.1477
No log 60.4 302 1.3170 0.5116 1.3170 1.1476
No log 60.8 304 1.3200 0.5116 1.3200 1.1489
No log 61.2 306 1.3386 0.4923 1.3386 1.1570
No log 61.6 308 1.3565 0.4885 1.3565 1.1647
No log 62.0 310 1.3418 0.4923 1.3418 1.1584
No log 62.4 312 1.3461 0.4923 1.3461 1.1602
No log 62.8 314 1.3169 0.4531 1.3169 1.1475
No log 63.2 316 1.2901 0.4531 1.2901 1.1358
No log 63.6 318 1.2921 0.4531 1.2921 1.1367
No log 64.0 320 1.3105 0.4531 1.3105 1.1448
No log 64.4 322 1.3429 0.4531 1.3429 1.1589
No log 64.8 324 1.3668 0.4923 1.3668 1.1691
No log 65.2 326 1.4264 0.4697 1.4264 1.1943
No log 65.6 328 1.4704 0.4179 1.4704 1.2126
No log 66.0 330 1.4421 0.4179 1.4421 1.2009
No log 66.4 332 1.3756 0.4923 1.3756 1.1729
No log 66.8 334 1.3072 0.4603 1.3072 1.1433
No log 67.2 336 1.2447 0.48 1.2447 1.1157
No log 67.6 338 1.2221 0.528 1.2221 1.1055
No log 68.0 340 1.2174 0.528 1.2174 1.1034
No log 68.4 342 1.2261 0.5161 1.2261 1.1073
No log 68.8 344 1.2654 0.48 1.2654 1.1249
No log 69.2 346 1.3233 0.4724 1.3233 1.1504
No log 69.6 348 1.3683 0.4923 1.3683 1.1697
No log 70.0 350 1.3832 0.4923 1.3832 1.1761
No log 70.4 352 1.3997 0.4697 1.3997 1.1831
No log 70.8 354 1.3885 0.4923 1.3885 1.1783
No log 71.2 356 1.3800 0.4651 1.3800 1.1747
No log 71.6 358 1.3688 0.4651 1.3688 1.1699
No log 72.0 360 1.3689 0.4651 1.3689 1.1700
No log 72.4 362 1.3601 0.4651 1.3601 1.1662
No log 72.8 364 1.3403 0.4531 1.3403 1.1577
No log 73.2 366 1.3338 0.4531 1.3338 1.1549
No log 73.6 368 1.3343 0.4651 1.3343 1.1551
No log 74.0 370 1.3303 0.4651 1.3303 1.1534
No log 74.4 372 1.3364 0.4651 1.3364 1.1560
No log 74.8 374 1.3334 0.4687 1.3334 1.1547
No log 75.2 376 1.3428 0.4687 1.3428 1.1588
No log 75.6 378 1.3359 0.4724 1.3359 1.1558
No log 76.0 380 1.3376 0.4724 1.3376 1.1565
No log 76.4 382 1.3324 0.4603 1.3324 1.1543
No log 76.8 384 1.3244 0.4603 1.3244 1.1508
No log 77.2 386 1.3020 0.4839 1.3020 1.1411
No log 77.6 388 1.2852 0.4839 1.2852 1.1337
No log 78.0 390 1.2812 0.4839 1.2812 1.1319
No log 78.4 392 1.2989 0.4839 1.2989 1.1397
No log 78.8 394 1.3049 0.4839 1.3049 1.1423
No log 79.2 396 1.2967 0.4839 1.2967 1.1387
No log 79.6 398 1.2853 0.4839 1.2853 1.1337
No log 80.0 400 1.2861 0.5079 1.2861 1.1341
No log 80.4 402 1.2723 0.5079 1.2723 1.1280
No log 80.8 404 1.2567 0.5079 1.2567 1.1210
No log 81.2 406 1.2417 0.48 1.2417 1.1143
No log 81.6 408 1.2306 0.48 1.2306 1.1093
No log 82.0 410 1.2206 0.48 1.2206 1.1048
No log 82.4 412 1.2350 0.48 1.2350 1.1113
No log 82.8 414 1.2639 0.5000 1.2639 1.1242
No log 83.2 416 1.2803 0.4961 1.2803 1.1315
No log 83.6 418 1.2963 0.4923 1.2963 1.1386
No log 84.0 420 1.3212 0.4885 1.3212 1.1494
No log 84.4 422 1.3285 0.4885 1.3285 1.1526
No log 84.8 424 1.3253 0.4885 1.3253 1.1512
No log 85.2 426 1.3132 0.4961 1.3132 1.1460
No log 85.6 428 1.2950 0.5000 1.2950 1.1380
No log 86.0 430 1.2792 0.5079 1.2792 1.1310
No log 86.4 432 1.2795 0.5079 1.2795 1.1311
No log 86.8 434 1.2868 0.5197 1.2868 1.1344
No log 87.2 436 1.2868 0.5197 1.2868 1.1344
No log 87.6 438 1.2928 0.5197 1.2928 1.1370
No log 88.0 440 1.3072 0.4961 1.3072 1.1433
No log 88.4 442 1.3145 0.4961 1.3145 1.1465
No log 88.8 444 1.3157 0.4961 1.3157 1.1471
No log 89.2 446 1.3108 0.4961 1.3108 1.1449
No log 89.6 448 1.3101 0.4961 1.3101 1.1446
No log 90.0 450 1.3093 0.4923 1.3093 1.1442
No log 90.4 452 1.3105 0.4923 1.3105 1.1448
No log 90.8 454 1.3150 0.4885 1.3150 1.1467
No log 91.2 456 1.3182 0.4885 1.3182 1.1481
No log 91.6 458 1.3170 0.4885 1.3170 1.1476
No log 92.0 460 1.3092 0.4885 1.3092 1.1442
No log 92.4 462 1.2994 0.4923 1.2994 1.1399
No log 92.8 464 1.2949 0.4923 1.2949 1.1379
No log 93.2 466 1.2936 0.4961 1.2936 1.1374
No log 93.6 468 1.2918 0.4961 1.2918 1.1366
No log 94.0 470 1.2900 0.4961 1.2900 1.1358
No log 94.4 472 1.2896 0.4961 1.2896 1.1356
No log 94.8 474 1.2858 0.4961 1.2858 1.1339
No log 95.2 476 1.2860 0.4961 1.2860 1.1340
No log 95.6 478 1.2850 0.4961 1.2850 1.1336
No log 96.0 480 1.2856 0.4961 1.2856 1.1339
No log 96.4 482 1.2891 0.4961 1.2891 1.1354
No log 96.8 484 1.2897 0.4961 1.2897 1.1356
No log 97.2 486 1.2925 0.4961 1.2925 1.1369
No log 97.6 488 1.2975 0.4961 1.2975 1.1391
No log 98.0 490 1.3030 0.4961 1.3030 1.1415
No log 98.4 492 1.3072 0.4923 1.3072 1.1433
No log 98.8 494 1.3092 0.4923 1.3092 1.1442
No log 99.2 496 1.3100 0.4923 1.3100 1.1445
No log 99.6 498 1.3109 0.4923 1.3109 1.1449
0.2181 100.0 500 1.3110 0.4923 1.3110 1.1450

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
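For completeness, a hedged sketch of loading this checkpoint for scoring. It assumes a single-logit regression head (suggested by the MSE-based evaluation) and requires network access to the Hub; the input string is a placeholder for an Arabic essay.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k1_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

# Placeholder input; pass an Arabic essay string here.
inputs = tokenizer("...", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```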
Model size: 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k1_task1_organization: fine-tuned from aubmindlab/bert-base-arabertv02.