ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k2_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9542
  • Qwk: 0.2728
  • Mse: 0.9542
  • Rmse: 0.9768
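The card does not define its metric abbreviations. Qwk is quadratic weighted kappa (agreement between predicted and reference scores, penalizing larger disagreements more), and Rmse is simply the square root of Mse (√0.9542 ≈ 0.9768). A minimal pure-Python sketch of both, with hypothetical helper names:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa: 1 - (weighted observed disagreement / weighted expected disagreement)."""
    n = len(y_true)
    # Observed confusion matrix
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Marginal label histograms (used for the chance-agreement expectation)
    hist_true = [y_true.count(c) for c in range(n_classes)]
    hist_pred = [y_pred.count(c) for c in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement penalty
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

def rmse(y_true, y_pred):
    """Root mean squared error."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

Perfect agreement gives a kappa of 1.0, and maximally distant predictions give -1.0.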

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
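With lr_scheduler_type: linear, the learning rate decays linearly from 2e-05 toward 0 over the run's 500 optimizer steps (100 epochs × 5 steps per epoch, per the training results table). A small sketch of that schedule, assuming zero warmup steps (the function name is illustrative, not from the training code):

```python
def linear_lr(step, total_steps=500, base_lr=2e-5, warmup_steps=0):
    """Linear warmup (if any) followed by linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

Midway through training (step 250) the rate is half the base value; by the final step it has decayed to zero.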

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.4 2 2.5043 -0.0788 2.5043 1.5825
No log 0.8 4 1.1496 0.1284 1.1496 1.0722
No log 1.2 6 0.8398 0.0535 0.8398 0.9164
No log 1.6 8 0.8665 0.0313 0.8665 0.9309
No log 2.0 10 0.9418 0.1181 0.9418 0.9705
No log 2.4 12 0.8754 0.1268 0.8754 0.9356
No log 2.8 14 0.7780 0.0804 0.7780 0.8821
No log 3.2 16 0.7806 0.0444 0.7806 0.8835
No log 3.6 18 0.7911 0.0444 0.7911 0.8895
No log 4.0 20 0.8116 0.0481 0.8116 0.9009
No log 4.4 22 0.7852 0.0444 0.7852 0.8861
No log 4.8 24 0.7556 0.1187 0.7556 0.8693
No log 5.2 26 0.7550 0.1094 0.7550 0.8689
No log 5.6 28 0.7873 0.2285 0.7873 0.8873
No log 6.0 30 0.8195 0.1867 0.8195 0.9053
No log 6.4 32 0.7905 0.1584 0.7905 0.8891
No log 6.8 34 0.8006 0.1542 0.8006 0.8948
No log 7.2 36 0.8280 0.1946 0.8280 0.9100
No log 7.6 38 1.0096 0.1501 1.0096 1.0048
No log 8.0 40 1.0556 0.1867 1.0556 1.0274
No log 8.4 42 0.9515 0.0241 0.9515 0.9754
No log 8.8 44 0.8984 0.1289 0.8984 0.9478
No log 9.2 46 0.9670 0.1385 0.9670 0.9834
No log 9.6 48 1.0413 0.2119 1.0413 1.0204
No log 10.0 50 1.1556 0.1115 1.1556 1.0750
No log 10.4 52 1.1839 0.0845 1.1839 1.0881
No log 10.8 54 1.2097 0.0686 1.2097 1.0999
No log 11.2 56 1.2016 0.0686 1.2016 1.0962
No log 11.6 58 1.1247 0.0713 1.1247 1.0605
No log 12.0 60 1.0039 0.1775 1.0039 1.0020
No log 12.4 62 0.9073 0.2239 0.9073 0.9525
No log 12.8 64 1.0174 0.1014 1.0174 1.0087
No log 13.2 66 1.2605 0.1176 1.2605 1.1227
No log 13.6 68 1.3322 0.1479 1.3322 1.1542
No log 14.0 70 1.3229 0.0704 1.3229 1.1502
No log 14.4 72 1.0367 0.1843 1.0367 1.0182
No log 14.8 74 0.9368 0.2124 0.9368 0.9679
No log 15.2 76 0.9610 0.2076 0.9610 0.9803
No log 15.6 78 1.1687 0.1086 1.1687 1.0811
No log 16.0 80 1.2916 0.1417 1.2916 1.1365
No log 16.4 82 1.2721 0.1145 1.2721 1.1279
No log 16.8 84 1.1186 0.2031 1.1186 1.0577
No log 17.2 86 0.9483 0.2836 0.9483 0.9738
No log 17.6 88 0.9183 0.3194 0.9183 0.9583
No log 18.0 90 1.0436 0.2209 1.0436 1.0216
No log 18.4 92 1.2240 0.1696 1.2240 1.1063
No log 18.8 94 1.2794 0.0727 1.2794 1.1311
No log 19.2 96 1.2941 0.1200 1.2941 1.1376
No log 19.6 98 1.2731 0.2138 1.2731 1.1283
No log 20.0 100 1.0707 0.2354 1.0707 1.0348
No log 20.4 102 0.9664 0.3134 0.9664 0.9831
No log 20.8 104 1.0031 0.2416 1.0031 1.0015
No log 21.2 106 1.0604 0.1549 1.0604 1.0298
No log 21.6 108 1.1393 0.1206 1.1393 1.0674
No log 22.0 110 1.2163 0.1114 1.2163 1.1028
No log 22.4 112 1.2884 0.1198 1.2884 1.1351
No log 22.8 114 1.2350 0.1473 1.2350 1.1113
No log 23.2 116 1.0956 0.1417 1.0956 1.0467
No log 23.6 118 1.0371 0.2545 1.0371 1.0184
No log 24.0 120 1.0082 0.2590 1.0082 1.0041
No log 24.4 122 1.0430 0.2501 1.0430 1.0213
No log 24.8 124 1.0602 0.2398 1.0602 1.0297
No log 25.2 126 1.0591 0.1980 1.0591 1.0291
No log 25.6 128 1.0164 0.2094 1.0164 1.0082
No log 26.0 130 0.9772 0.2343 0.9772 0.9885
No log 26.4 132 0.9775 0.2554 0.9775 0.9887
No log 26.8 134 0.9668 0.2971 0.9668 0.9833
No log 27.2 136 0.8557 0.2415 0.8557 0.9250
No log 27.6 138 0.8140 0.1978 0.8140 0.9022
No log 28.0 140 0.8324 0.2471 0.8324 0.9123
No log 28.4 142 0.8592 0.1800 0.8592 0.9269
No log 28.8 144 0.9046 0.1800 0.9046 0.9511
No log 29.2 146 0.9863 0.2012 0.9863 0.9931
No log 29.6 148 1.0669 0.1343 1.0669 1.0329
No log 30.0 150 1.1964 0.1576 1.1964 1.0938
No log 30.4 152 1.2572 0.1663 1.2572 1.1212
No log 30.8 154 1.1742 0.2601 1.1742 1.0836
No log 31.2 156 1.0509 0.2150 1.0509 1.0251
No log 31.6 158 0.9483 0.2287 0.9483 0.9738
No log 32.0 160 0.9430 0.1808 0.9430 0.9711
No log 32.4 162 0.9993 0.1843 0.9993 0.9997
No log 32.8 164 1.0404 0.1962 1.0404 1.0200
No log 33.2 166 1.0315 0.2416 1.0315 1.0156
No log 33.6 168 0.9515 0.2234 0.9515 0.9755
No log 34.0 170 0.9024 0.2437 0.9024 0.9499
No log 34.4 172 0.9961 0.2850 0.9961 0.9980
No log 34.8 174 1.0240 0.3010 1.0240 1.0119
No log 35.2 176 1.0145 0.2961 1.0145 1.0072
No log 35.6 178 1.0418 0.2288 1.0418 1.0207
No log 36.0 180 1.0914 0.2288 1.0914 1.0447
No log 36.4 182 1.1725 0.1873 1.1725 1.0828
No log 36.8 184 1.1836 0.2153 1.1836 1.0879
No log 37.2 186 1.1222 0.2590 1.1222 1.0593
No log 37.6 188 1.0458 0.2635 1.0458 1.0227
No log 38.0 190 1.0439 0.2635 1.0439 1.0217
No log 38.4 192 1.0284 0.2554 1.0284 1.0141
No log 38.8 194 1.0121 0.2934 1.0121 1.0061
No log 39.2 196 1.0429 0.2086 1.0429 1.0212
No log 39.6 198 1.0289 0.2464 1.0289 1.0144
No log 40.0 200 1.0263 0.2464 1.0263 1.0131
No log 40.4 202 1.0675 0.1962 1.0675 1.0332
No log 40.8 204 1.1541 0.2070 1.1541 1.0743
No log 41.2 206 1.1747 0.1739 1.1747 1.0838
No log 41.6 208 1.1810 0.1448 1.1810 1.0867
No log 42.0 210 1.0795 0.1737 1.0795 1.0390
No log 42.4 212 1.0292 0.1565 1.0292 1.0145
No log 42.8 214 1.0236 0.2017 1.0236 1.0117
No log 43.2 216 1.0169 0.2192 1.0169 1.0084
No log 43.6 218 1.0176 0.2635 1.0176 1.0087
No log 44.0 220 1.0146 0.2999 1.0146 1.0073
No log 44.4 222 0.9931 0.3214 0.9931 0.9966
No log 44.8 224 1.0013 0.2999 1.0013 1.0007
No log 45.2 226 1.0119 0.2999 1.0119 1.0059
No log 45.6 228 1.0254 0.3161 1.0254 1.0126
No log 46.0 230 1.0386 0.3161 1.0386 1.0191
No log 46.4 232 0.9926 0.3538 0.9926 0.9963
No log 46.8 234 0.9719 0.3601 0.9719 0.9858
No log 47.2 236 1.0057 0.3269 1.0057 1.0028
No log 47.6 238 1.0725 0.2982 1.0725 1.0356
No log 48.0 240 1.0926 0.2889 1.0926 1.0453
No log 48.4 242 1.1106 0.2889 1.1106 1.0539
No log 48.8 244 1.0326 0.3269 1.0326 1.0162
No log 49.2 246 0.9331 0.3439 0.9331 0.9660
No log 49.6 248 0.8869 0.3499 0.8869 0.9418
No log 50.0 250 0.8849 0.3333 0.8849 0.9407
No log 50.4 252 0.8912 0.3497 0.8912 0.9441
No log 50.8 254 0.9026 0.3497 0.9026 0.9501
No log 51.2 256 0.9595 0.3371 0.9595 0.9795
No log 51.6 258 1.0318 0.3052 1.0318 1.0158
No log 52.0 260 1.0802 0.2850 1.0802 1.0393
No log 52.4 262 1.0655 0.2850 1.0655 1.0322
No log 52.8 264 1.0021 0.3161 1.0021 1.0011
No log 53.2 266 0.9307 0.3274 0.9307 0.9647
No log 53.6 268 0.8881 0.3333 0.8881 0.9424
No log 54.0 270 0.8610 0.3159 0.8610 0.9279
No log 54.4 272 0.8450 0.2917 0.8450 0.9192
No log 54.8 274 0.8844 0.3159 0.8844 0.9404
No log 55.2 276 0.9526 0.2464 0.9526 0.9760
No log 55.6 278 1.0060 0.2192 1.0060 1.0030
No log 56.0 280 1.0026 0.2239 1.0026 1.0013
No log 56.4 282 0.9948 0.1723 0.9948 0.9974
No log 56.8 284 0.9996 0.1682 0.9996 0.9998
No log 57.2 286 1.0082 0.2287 1.0082 1.0041
No log 57.6 288 1.0309 0.2369 1.0309 1.0154
No log 58.0 290 1.0313 0.2147 1.0313 1.0155
No log 58.4 292 0.9920 0.2287 0.9920 0.9960
No log 58.8 294 0.9488 0.1584 0.9488 0.9741
No log 59.2 296 0.9060 0.2244 0.9060 0.9518
No log 59.6 298 0.8869 0.2632 0.8869 0.9418
No log 60.0 300 0.8742 0.2904 0.8742 0.9350
No log 60.4 302 0.8852 0.2784 0.8852 0.9409
No log 60.8 304 0.9418 0.2670 0.9418 0.9705
No log 61.2 306 1.0133 0.1940 1.0133 1.0066
No log 61.6 308 1.0391 0.1856 1.0391 1.0194
No log 62.0 310 1.0224 0.1897 1.0224 1.0112
No log 62.4 312 0.9879 0.1940 0.9879 0.9939
No log 62.8 314 0.9543 0.2537 0.9543 0.9769
No log 63.2 316 0.9292 0.2537 0.9292 0.9639
No log 63.6 318 0.9289 0.2537 0.9289 0.9638
No log 64.0 320 0.9620 0.2934 0.9620 0.9808
No log 64.4 322 1.0081 0.2554 1.0081 1.0040
No log 64.8 324 1.0130 0.2554 1.0130 1.0065
No log 65.2 326 0.9944 0.2934 0.9944 0.9972
No log 65.6 328 0.9710 0.2934 0.9710 0.9854
No log 66.0 330 0.9369 0.2702 0.9369 0.9680
No log 66.4 332 0.9159 0.2537 0.9159 0.9571
No log 66.8 334 0.8952 0.2537 0.8952 0.9461
No log 67.2 336 0.8738 0.2537 0.8738 0.9348
No log 67.6 338 0.8708 0.2781 0.8708 0.9332
No log 68.0 340 0.8928 0.2781 0.8928 0.9449
No log 68.4 342 0.8858 0.2781 0.8858 0.9412
No log 68.8 344 0.8727 0.3043 0.8727 0.9342
No log 69.2 346 0.8841 0.3043 0.8841 0.9403
No log 69.6 348 0.8765 0.3043 0.8765 0.9362
No log 70.0 350 0.8586 0.3754 0.8586 0.9266
No log 70.4 352 0.8396 0.3520 0.8396 0.9163
No log 70.8 354 0.8223 0.3384 0.8223 0.9068
No log 71.2 356 0.8199 0.3384 0.8199 0.9055
No log 71.6 358 0.8373 0.3280 0.8373 0.9150
No log 72.0 360 0.8480 0.3319 0.8480 0.9209
No log 72.4 362 0.8732 0.2949 0.8732 0.9344
No log 72.8 364 0.9200 0.2934 0.9200 0.9592
No log 73.2 366 0.9699 0.2728 0.9699 0.9848
No log 73.6 368 0.9991 0.2728 0.9991 0.9995
No log 74.0 370 1.0294 0.2554 1.0294 1.0146
No log 74.4 372 1.0455 0.2507 1.0455 1.0225
No log 74.8 374 1.0259 0.2779 1.0259 1.0129
No log 75.2 376 1.0167 0.2728 1.0167 1.0083
No log 75.6 378 1.0120 0.2881 1.0120 1.0060
No log 76.0 380 0.9975 0.2881 0.9975 0.9987
No log 76.4 382 0.9776 0.2728 0.9776 0.9887
No log 76.8 384 0.9403 0.2728 0.9403 0.9697
No log 77.2 386 0.9275 0.2728 0.9275 0.9631
No log 77.6 388 0.9223 0.2728 0.9223 0.9604
No log 78.0 390 0.9190 0.2728 0.9190 0.9586
No log 78.4 392 0.9347 0.2728 0.9347 0.9668
No log 78.8 394 0.9454 0.2728 0.9454 0.9723
No log 79.2 396 0.9545 0.2781 0.9545 0.9770
No log 79.6 398 0.9670 0.2487 0.9670 0.9834
No log 80.0 400 0.9939 0.2437 0.9939 0.9970
No log 80.4 402 1.0113 0.2437 1.0113 1.0056
No log 80.8 404 1.0081 0.2437 1.0081 1.0040
No log 81.2 406 1.0143 0.2437 1.0143 1.0071
No log 81.6 408 1.0310 0.2437 1.0310 1.0154
No log 82.0 410 1.0542 0.2297 1.0542 1.0267
No log 82.4 412 1.0668 0.2297 1.0668 1.0329
No log 82.8 414 1.0687 0.2756 1.0687 1.0338
No log 83.2 416 1.0559 0.2806 1.0559 1.0276
No log 83.6 418 1.0415 0.2806 1.0415 1.0205
No log 84.0 420 1.0105 0.2806 1.0105 1.0052
No log 84.4 422 0.9806 0.3417 0.9806 0.9903
No log 84.8 424 0.9556 0.3417 0.9556 0.9775
No log 85.2 426 0.9504 0.3417 0.9504 0.9749
No log 85.6 428 0.9579 0.3417 0.9579 0.9787
No log 86.0 430 0.9612 0.3417 0.9612 0.9804
No log 86.4 432 0.9530 0.3636 0.9530 0.9762
No log 86.8 434 0.9429 0.3636 0.9429 0.9710
No log 87.2 436 0.9375 0.3636 0.9375 0.9682
No log 87.6 438 0.9235 0.3477 0.9235 0.9610
No log 88.0 440 0.9043 0.3371 0.9043 0.9510
No log 88.4 442 0.8827 0.3538 0.8827 0.9395
No log 88.8 444 0.8738 0.3538 0.8738 0.9348
No log 89.2 446 0.8746 0.3538 0.8746 0.9352
No log 89.6 448 0.8873 0.3371 0.8873 0.9420
No log 90.0 450 0.9091 0.3310 0.9091 0.9534
No log 90.4 452 0.9244 0.2781 0.9244 0.9614
No log 90.8 454 0.9414 0.2728 0.9414 0.9703
No log 91.2 456 0.9486 0.2728 0.9486 0.9740
No log 91.6 458 0.9554 0.2728 0.9554 0.9774
No log 92.0 460 0.9705 0.2626 0.9705 0.9851
No log 92.4 462 0.9847 0.2577 0.9847 0.9923
No log 92.8 464 0.9916 0.2577 0.9916 0.9958
No log 93.2 466 0.9971 0.2577 0.9971 0.9986
No log 93.6 468 0.9953 0.2577 0.9953 0.9976
No log 94.0 470 0.9905 0.2577 0.9905 0.9952
No log 94.4 472 0.9838 0.2577 0.9838 0.9919
No log 94.8 474 0.9811 0.2577 0.9811 0.9905
No log 95.2 476 0.9801 0.2577 0.9801 0.9900
No log 95.6 478 0.9784 0.2577 0.9784 0.9891
No log 96.0 480 0.9735 0.2343 0.9735 0.9866
No log 96.4 482 0.9696 0.2677 0.9696 0.9847
No log 96.8 484 0.9626 0.2677 0.9626 0.9811
No log 97.2 486 0.9583 0.2677 0.9583 0.9789
No log 97.6 488 0.9570 0.2677 0.9570 0.9783
No log 98.0 490 0.9561 0.2677 0.9561 0.9778
No log 98.4 492 0.9555 0.2677 0.9555 0.9775
No log 98.8 494 0.9543 0.2728 0.9543 0.9769
No log 99.2 496 0.9542 0.2728 0.9542 0.9768
No log 99.6 498 0.9542 0.2728 0.9542 0.9768
0.1607 100.0 500 0.9542 0.2728 0.9542 0.9768

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k2_task7_organization

Finetuned from aubmindlab/bert-base-arabertv02