ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k6_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1012
  • Qwk: 0.1896
  • Mse: 1.1012
  • Rmse: 1.0494
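Qwk is quadratic weighted kappa, a chance-corrected agreement measure for ordinal labels, and Rmse is simply the square root of Mse (1.0494 ≈ √1.1012; Loss and Mse coincide throughout, consistent with a mean-squared-error objective). A minimal pure-Python sketch of both metrics (function name and example labels are illustrative, not taken from the training code):

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Chance-corrected agreement for ordinal labels with quadratic penalties."""
    # observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    hist_t = [sum(row) for row in O]          # true-label marginal
    hist_p = [sum(col) for col in zip(*O)]    # predicted-label marginal
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic weight in [0, 1]
            num += w * O[i][j]                        # observed disagreement
            den += w * hist_t[i] * hist_p[j] / n      # expected disagreement
    return 1.0 - num / den

# illustrative ordinal scores, not the task's real data
y_true = [0, 1, 2, 2, 1]
y_pred = [0, 2, 2, 1, 1]
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=3)
mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
rmse = mse ** 0.5  # Rmse is just sqrt(Mse)
```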

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
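With lr_scheduler_type: linear and no warmup steps reported, the learning rate decays linearly from 2e-05 to 0 over training. Judging from the results table (epoch 0.1333 at step 2), each epoch is about 15 optimizer steps, so 100 epochs is roughly 1500 steps; that step count is inferred from the log, not stated in the card. A minimal sketch of the schedule under those assumptions:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear decay to zero after an optional linear warmup (HF-style 'linear' schedule)."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)   # ramp up during warmup
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

total_steps = 15 * 100  # ~15 steps/epoch inferred from the table, 100 epochs
lr_at_start = linear_lr(0, total_steps)          # 2e-05
lr_halfway = linear_lr(750, total_steps)         # 1e-05
lr_at_end = linear_lr(total_steps, total_steps)  # 0.0
```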

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1333 2 2.5143 -0.0262 2.5143 1.5857
No log 0.2667 4 1.2683 0.0994 1.2683 1.1262
No log 0.4 6 1.0406 -0.0970 1.0406 1.0201
No log 0.5333 8 1.0903 -0.1680 1.0903 1.0442
No log 0.6667 10 1.1030 -0.2125 1.1030 1.0502
No log 0.8 12 0.8646 -0.0426 0.8646 0.9298
No log 0.9333 14 0.6932 0.0481 0.6932 0.8326
No log 1.0667 16 0.7194 0.0937 0.7194 0.8482
No log 1.2 18 0.6690 0.1604 0.6690 0.8179
No log 1.3333 20 0.6683 0.1604 0.6683 0.8175
No log 1.4667 22 0.6842 0.0481 0.6842 0.8272
No log 1.6 24 0.6898 0.0889 0.6898 0.8306
No log 1.7333 26 0.6979 0.0889 0.6979 0.8354
No log 1.8667 28 0.7063 0.1508 0.7063 0.8404
No log 2.0 30 0.7367 0.1648 0.7367 0.8583
No log 2.1333 32 0.7342 0.1648 0.7342 0.8569
No log 2.2667 34 0.7758 0.2012 0.7758 0.8808
No log 2.4 36 0.7360 0.1504 0.7360 0.8579
No log 2.5333 38 0.6928 0.3336 0.6928 0.8323
No log 2.6667 40 0.6866 0.3213 0.6866 0.8286
No log 2.8 42 0.7266 0.2899 0.7266 0.8524
No log 2.9333 44 0.7599 0.3157 0.7599 0.8717
No log 3.0667 46 0.8999 0.3310 0.8999 0.9486
No log 3.2 48 1.2339 0.2988 1.2339 1.1108
No log 3.3333 50 1.1708 0.2589 1.1708 1.0820
No log 3.4667 52 0.8973 0.3239 0.8973 0.9472
No log 3.6 54 0.7864 0.3121 0.7864 0.8868
No log 3.7333 56 0.9177 0.3559 0.9177 0.9580
No log 3.8667 58 0.9673 0.4032 0.9673 0.9835
No log 4.0 60 0.8860 0.3579 0.8860 0.9413
No log 4.1333 62 1.0967 0.4106 1.0967 1.0472
No log 4.2667 64 1.1033 0.4106 1.1033 1.0504
No log 4.4 66 1.2485 0.2838 1.2485 1.1174
No log 4.5333 68 1.3110 0.2571 1.3110 1.1450
No log 4.6667 70 1.1756 0.4106 1.1756 1.0843
No log 4.8 72 0.8521 0.2627 0.8521 0.9231
No log 4.9333 74 0.7996 0.2798 0.7996 0.8942
No log 5.0667 76 0.8690 0.2849 0.8690 0.9322
No log 5.2 78 1.2367 0.3401 1.2367 1.1121
No log 5.3333 80 1.3324 0.2444 1.3324 1.1543
No log 5.4667 82 1.1571 0.3650 1.1571 1.0757
No log 5.6 84 0.9563 0.3151 0.9563 0.9779
No log 5.7333 86 0.9529 0.2928 0.9529 0.9762
No log 5.8667 88 1.0193 0.3403 1.0193 1.0096
No log 6.0 90 0.9731 0.3425 0.9731 0.9864
No log 6.1333 92 0.7890 0.3330 0.7890 0.8883
No log 6.2667 94 0.7055 0.3769 0.7055 0.8399
No log 6.4 96 0.7592 0.3885 0.7592 0.8713
No log 6.5333 98 0.9936 0.3470 0.9936 0.9968
No log 6.6667 100 1.0689 0.3473 1.0689 1.0339
No log 6.8 102 0.7708 0.4173 0.7708 0.8779
No log 6.9333 104 0.6329 0.4013 0.6329 0.7956
No log 7.0667 106 0.6214 0.3581 0.6214 0.7883
No log 7.2 108 0.6627 0.3700 0.6627 0.8140
No log 7.3333 110 0.9513 0.3650 0.9513 0.9753
No log 7.4667 112 0.9826 0.3557 0.9826 0.9913
No log 7.6 114 0.7964 0.4180 0.7964 0.8924
No log 7.7333 116 0.7806 0.4723 0.7806 0.8835
No log 7.8667 118 0.9056 0.3847 0.9056 0.9516
No log 8.0 120 1.0075 0.3183 1.0075 1.0037
No log 8.1333 122 1.2078 0.2008 1.2078 1.0990
No log 8.2667 124 1.1553 0.2168 1.1553 1.0748
No log 8.4 126 0.8875 0.2670 0.8875 0.9421
No log 8.5333 128 0.7765 0.2907 0.7765 0.8812
No log 8.6667 130 0.7952 0.3314 0.7952 0.8917
No log 8.8 132 0.8748 0.2784 0.8748 0.9353
No log 8.9333 134 0.9687 0.1897 0.9687 0.9842
No log 9.0667 136 1.0976 0.2601 1.0976 1.0477
No log 9.2 138 1.1330 0.2130 1.1330 1.0644
No log 9.3333 140 1.0311 0.2833 1.0311 1.0154
No log 9.4667 142 1.0290 0.2833 1.0290 1.0144
No log 9.6 144 1.0884 0.2903 1.0884 1.0433
No log 9.7333 146 1.1798 0.1815 1.1798 1.0862
No log 9.8667 148 1.0714 0.3006 1.0714 1.0351
No log 10.0 150 0.8262 0.2904 0.8262 0.9090
No log 10.1333 152 0.7611 0.2319 0.7611 0.8724
No log 10.2667 154 0.7848 0.2063 0.7848 0.8859
No log 10.4 156 0.8144 0.2574 0.8144 0.9025
No log 10.5333 158 1.0474 0.2833 1.0474 1.0234
No log 10.6667 160 1.1247 0.2223 1.1247 1.0605
No log 10.8 162 1.1296 0.2223 1.1296 1.0628
No log 10.9333 164 0.9948 0.2626 0.9948 0.9974
No log 11.0667 166 0.8627 0.2670 0.8627 0.9288
No log 11.2 168 0.9250 0.3019 0.9250 0.9618
No log 11.3333 170 1.2392 0.2280 1.2392 1.1132
No log 11.4667 172 1.3906 0.1793 1.3906 1.1792
No log 11.6 174 1.1832 0.2280 1.1832 1.0877
No log 11.7333 176 0.8797 0.3294 0.8797 0.9379
No log 11.8667 178 0.7950 0.2328 0.7950 0.8916
No log 12.0 180 0.7666 0.2720 0.7666 0.8756
No log 12.1333 182 0.8177 0.2492 0.8177 0.9043
No log 12.2667 184 0.9247 0.3473 0.9247 0.9616
No log 12.4 186 1.0912 0.2567 1.0912 1.0446
No log 12.5333 188 1.1013 0.2459 1.1013 1.0494
No log 12.6667 190 1.0380 0.1990 1.0380 1.0188
No log 12.8 192 0.9104 0.2410 0.9104 0.9541
No log 12.9333 194 0.8534 0.1914 0.8534 0.9238
No log 13.0667 196 0.8042 0.2440 0.8042 0.8968
No log 13.2 198 0.9272 0.2410 0.9272 0.9629
No log 13.3333 200 1.1790 0.1981 1.1790 1.0858
No log 13.4667 202 1.1868 0.1860 1.1868 1.0894
No log 13.6 204 1.0050 0.3169 1.0050 1.0025
No log 13.7333 206 0.9496 0.2923 0.9496 0.9745
No log 13.8667 208 0.9644 0.2923 0.9644 0.9820
No log 14.0 210 0.9333 0.2923 0.9333 0.9661
No log 14.1333 212 0.9390 0.2923 0.9390 0.9690
No log 14.2667 214 0.9675 0.2756 0.9675 0.9836
No log 14.4 216 1.0309 0.2651 1.0309 1.0153
No log 14.5333 218 1.0387 0.2316 1.0387 1.0192
No log 14.6667 220 1.0297 0.2316 1.0297 1.0147
No log 14.8 222 1.0102 0.2703 1.0102 1.0051
No log 14.9333 224 0.9153 0.2094 0.9153 0.9567
No log 15.0667 226 0.8361 0.2063 0.8361 0.9144
No log 15.2 228 0.9274 0.1777 0.9274 0.9630
No log 15.3333 230 1.1837 0.2141 1.1837 1.0880
No log 15.4667 232 1.2539 0.1458 1.2539 1.1198
No log 15.6 234 1.0888 0.2411 1.0888 1.0435
No log 15.7333 236 0.8615 0.2297 0.8615 0.9282
No log 15.8667 238 0.7699 0.1972 0.7699 0.8774
No log 16.0 240 0.7503 0.2085 0.7503 0.8662
No log 16.1333 242 0.7830 0.1918 0.7830 0.8849
No log 16.2667 244 0.8626 0.2632 0.8626 0.9287
No log 16.4 246 0.9167 0.2142 0.9167 0.9574
No log 16.5333 248 0.9135 0.2193 0.9135 0.9558
No log 16.6667 250 0.8482 0.2297 0.8482 0.9210
No log 16.8 252 0.7559 0.1866 0.7559 0.8694
No log 16.9333 254 0.7464 0.1815 0.7464 0.8639
No log 17.0667 256 0.8110 0.2063 0.8110 0.9005
No log 17.2 258 0.8697 0.2297 0.8697 0.9326
No log 17.3333 260 0.9346 0.2574 0.9346 0.9668
No log 17.4667 262 0.9232 0.2193 0.9232 0.9608
No log 17.6 264 0.9084 0.1914 0.9084 0.9531
No log 17.7333 266 0.9326 0.2463 0.9326 0.9657
No log 17.8667 268 1.0170 0.2562 1.0170 1.0084
No log 18.0 270 1.0174 0.2046 1.0174 1.0087
No log 18.1333 272 0.9209 0.2843 0.9209 0.9596
No log 18.2667 274 0.7779 0.2467 0.7779 0.8820
No log 18.4 276 0.7408 0.3092 0.7408 0.8607
No log 18.5333 278 0.7911 0.2692 0.7911 0.8894
No log 18.6667 280 0.9561 0.2726 0.9561 0.9778
No log 18.8 282 1.0231 0.2615 1.0231 1.0115
No log 18.9333 284 1.0252 0.3169 1.0252 1.0125
No log 19.0667 286 0.8949 0.3105 0.8949 0.9460
No log 19.2 288 0.8390 0.3105 0.8390 0.9160
No log 19.3333 290 0.8296 0.2904 0.8296 0.9108
No log 19.4667 292 0.9285 0.2726 0.9285 0.9636
No log 19.6 294 1.0138 0.2259 1.0138 1.0069
No log 19.7333 296 1.0111 0.2259 1.0111 1.0056
No log 19.8667 298 1.0143 0.2259 1.0143 1.0071
No log 20.0 300 1.0319 0.2411 1.0319 1.0158
No log 20.1333 302 1.0285 0.3051 1.0285 1.0141
No log 20.2667 304 1.0562 0.2833 1.0562 1.0277
No log 20.4 306 1.1269 0.2282 1.1269 1.0615
No log 20.5333 308 1.0876 0.2478 1.0876 1.0429
No log 20.6667 310 0.9021 0.3359 0.9021 0.9498
No log 20.8 312 0.7515 0.3127 0.7515 0.8669
No log 20.9333 314 0.7276 0.3196 0.7276 0.8530
No log 21.0667 316 0.7940 0.2352 0.7940 0.8911
No log 21.2 318 0.9387 0.2982 0.9387 0.9689
No log 21.3333 320 1.0417 0.2363 1.0417 1.0206
No log 21.4667 322 1.1318 0.2412 1.1318 1.0638
No log 21.6 324 1.1439 0.2100 1.1439 1.0695
No log 21.7333 326 1.1553 0.2282 1.1553 1.0748
No log 21.8667 328 1.1402 0.2282 1.1402 1.0678
No log 22.0 330 1.0591 0.2903 1.0591 1.0291
No log 22.1333 332 0.9488 0.3347 0.9488 0.9741
No log 22.2667 334 0.8823 0.3294 0.8823 0.9393
No log 22.4 336 0.8681 0.3294 0.8681 0.9317
No log 22.5333 338 0.9363 0.2866 0.9363 0.9676
No log 22.6667 340 1.0542 0.3280 1.0542 1.0267
No log 22.8 342 1.2676 0.2008 1.2676 1.1259
No log 22.9333 344 1.4713 0.1146 1.4713 1.2130
No log 23.0667 346 1.5210 0.1310 1.5210 1.2333
No log 23.2 348 1.2985 0.1973 1.2985 1.1395
No log 23.3333 350 1.0130 0.3579 1.0130 1.0065
No log 23.4667 352 0.8963 0.2923 0.8963 0.9467
No log 23.6 354 0.8551 0.3042 0.8551 0.9247
No log 23.7333 356 0.8875 0.2923 0.8875 0.9421
No log 23.8667 358 0.9396 0.2923 0.9396 0.9693
No log 24.0 360 1.0005 0.3409 1.0005 1.0002
No log 24.1333 362 1.1293 0.2782 1.1293 1.0627
No log 24.2667 364 1.1648 0.2006 1.1648 1.0793
No log 24.4 366 1.0473 0.3516 1.0473 1.0234
No log 24.5333 368 0.9690 0.3409 0.9690 0.9844
No log 24.6667 370 0.9139 0.3294 0.9139 0.9560
No log 24.8 372 0.9148 0.3294 0.9148 0.9564
No log 24.9333 374 0.9299 0.2923 0.9299 0.9643
No log 25.0667 376 0.8919 0.3105 0.8919 0.9444
No log 25.2 378 0.8937 0.3105 0.8937 0.9454
No log 25.3333 380 0.9207 0.3105 0.9207 0.9595
No log 25.4667 382 0.9034 0.3105 0.9034 0.9505
No log 25.6 384 0.8665 0.3105 0.8665 0.9309
No log 25.7333 386 0.9170 0.3042 0.9170 0.9576
No log 25.8667 388 0.9855 0.3938 0.9855 0.9927
No log 26.0 390 0.9846 0.3409 0.9846 0.9923
No log 26.1333 392 1.0240 0.2316 1.0240 1.0120
No log 26.2667 394 0.9940 0.2866 0.9940 0.9970
No log 26.4 396 0.9686 0.2923 0.9686 0.9842
No log 26.5333 398 0.9641 0.2670 0.9641 0.9819
No log 26.6667 400 0.9904 0.2866 0.9904 0.9952
No log 26.8 402 1.0118 0.2119 1.0118 1.0059
No log 26.9333 404 1.1002 0.2141 1.1002 1.0489
No log 27.0667 406 1.1289 0.1832 1.1289 1.0625
No log 27.2 408 1.0619 0.2183 1.0619 1.0305
No log 27.3333 410 1.0216 0.2183 1.0216 1.0107
No log 27.4667 412 0.9879 0.2504 0.9879 0.9940
No log 27.6 414 0.8796 0.3231 0.8796 0.9379
No log 27.7333 416 0.8357 0.3169 0.8357 0.9142
No log 27.8667 418 0.8613 0.3425 0.8613 0.9281
No log 28.0 420 0.9502 0.2923 0.9502 0.9748
No log 28.1333 422 1.0057 0.3538 1.0057 1.0029
No log 28.2667 424 0.9700 0.2923 0.9700 0.9849
No log 28.4 426 0.9343 0.3042 0.9343 0.9666
No log 28.5333 428 0.8388 0.2904 0.8388 0.9159
No log 28.6667 430 0.7964 0.2904 0.7964 0.8924
No log 28.8 432 0.8337 0.3494 0.8337 0.9131
No log 28.9333 434 0.9644 0.3538 0.9644 0.9821
No log 29.0667 436 1.1567 0.2145 1.1567 1.0755
No log 29.2 438 1.2244 0.1815 1.2244 1.1065
No log 29.3333 440 1.0817 0.2504 1.0817 1.0400
No log 29.4667 442 0.8586 0.3169 0.8586 0.9266
No log 29.6 444 0.7459 0.2652 0.7459 0.8637
No log 29.7333 446 0.7329 0.2537 0.7329 0.8561
No log 29.8667 448 0.7827 0.3099 0.7827 0.8847
No log 30.0 450 0.9064 0.2784 0.9064 0.9520
No log 30.1333 452 1.0254 0.3347 1.0254 1.0126
No log 30.2667 454 1.0206 0.3347 1.0206 1.0103
No log 30.4 456 0.9582 0.2726 0.9582 0.9789
No log 30.5333 458 0.9337 0.3294 0.9337 0.9663
No log 30.6667 460 0.9592 0.3473 0.9592 0.9794
No log 30.8 462 0.9181 0.3675 0.9181 0.9582
No log 30.9333 464 0.8812 0.3425 0.8812 0.9387
No log 31.0667 466 0.8321 0.2904 0.8321 0.9122
No log 31.2 468 0.8159 0.2967 0.8159 0.9032
No log 31.3333 470 0.8329 0.2904 0.8329 0.9126
No log 31.4667 472 0.8382 0.2904 0.8382 0.9155
No log 31.6 474 0.9061 0.3606 0.9061 0.9519
No log 31.7333 476 0.9785 0.3110 0.9785 0.9892
No log 31.8667 478 0.9753 0.3110 0.9753 0.9876
No log 32.0 480 1.0103 0.2756 1.0103 1.0051
No log 32.1333 482 1.0811 0.1949 1.0811 1.0398
No log 32.2667 484 1.0854 0.2032 1.0854 1.0418
No log 32.4 486 1.1356 0.1949 1.1356 1.0656
No log 32.5333 488 1.0904 0.2032 1.0904 1.0442
No log 32.6667 490 0.9782 0.2562 0.9782 0.9890
No log 32.8 492 0.9364 0.2615 0.9364 0.9677
No log 32.9333 494 0.8984 0.3042 0.8984 0.9479
No log 33.0667 496 0.8917 0.2904 0.8917 0.9443
No log 33.2 498 0.8910 0.2904 0.8910 0.9439
0.2256 33.3333 500 0.9349 0.2726 0.9349 0.9669
0.2256 33.4667 502 1.0311 0.2059 1.0311 1.0154
0.2256 33.6 504 1.0863 0.2059 1.0863 1.0422
0.2256 33.7333 506 1.1450 0.2125 1.1450 1.0700
0.2256 33.8667 508 1.1552 0.1821 1.1552 1.0748
0.2256 34.0 510 1.1012 0.1896 1.1012 1.0494

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1