ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k17_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1178
  • Qwk (quadratic weighted kappa): 0.1594
  • Mse (mean squared error): 1.1178
  • Rmse (root mean squared error): 1.0572
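
The reported Qwk is Cohen's quadratic weighted kappa, and Rmse is the square root of Mse (the training objective is MSE, which is why Loss and Mse coincide). A minimal sketch of how these metrics can be computed with scikit-learn, using hypothetical gold labels and predictions (not from this run):

```python
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold scores and model predictions on an ordinal 0-3 scale
y_true = [0, 1, 2, 2, 3, 1]
y_pred = [0, 2, 2, 1, 3, 1]

# Quadratic weighted kappa penalizes disagreements by squared distance
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = mse ** 0.5
```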

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
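
These hyperparameters map directly onto fields of transformers.TrainingArguments. A sketch collecting them as keyword arguments (the output directory would be a placeholder of your choosing, not part of the original run):

```python
# Keyword arguments mirroring the hyperparameters above; they could be
# passed as TrainingArguments(output_dir="...", **training_kwargs).
training_kwargs = dict(
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```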

Training results

("No log" in the Training Loss column means the training loss had not yet been logged at that evaluation step; the first logged value, 0.4087, appears at step 500.)

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0233 2 2.6482 -0.0369 2.6482 1.6273
No log 0.0465 4 1.3997 0.0245 1.3997 1.1831
No log 0.0698 6 1.1077 -0.1255 1.1077 1.0525
No log 0.0930 8 1.1314 -0.0727 1.1314 1.0637
No log 0.1163 10 1.1528 -0.0672 1.1528 1.0737
No log 0.1395 12 1.2188 0.0390 1.2188 1.1040
No log 0.1628 14 1.1173 0.0609 1.1173 1.0570
No log 0.1860 16 1.0261 0.0741 1.0261 1.0130
No log 0.2093 18 0.9609 0.1181 0.9609 0.9803
No log 0.2326 20 1.0499 0.0392 1.0499 1.0246
No log 0.2558 22 1.0026 0.0451 1.0026 1.0013
No log 0.2791 24 0.9002 -0.0860 0.9002 0.9488
No log 0.3023 26 0.8227 0.1139 0.8227 0.9070
No log 0.3256 28 0.8335 0.2145 0.8335 0.9130
No log 0.3488 30 0.8763 0.1142 0.8763 0.9361
No log 0.3721 32 0.9544 0.1348 0.9544 0.9769
No log 0.3953 34 0.9703 0.1348 0.9703 0.9850
No log 0.4186 36 0.9561 0.1293 0.9561 0.9778
No log 0.4419 38 0.8839 0.1268 0.8839 0.9402
No log 0.4651 40 0.8785 0.1699 0.8785 0.9373
No log 0.4884 42 0.8770 0.0643 0.8770 0.9365
No log 0.5116 44 0.9647 -0.0047 0.9647 0.9822
No log 0.5349 46 1.1166 -0.2811 1.1166 1.0567
No log 0.5581 48 1.0130 -0.0753 1.0130 1.0065
No log 0.5814 50 0.9195 0.0741 0.9195 0.9589
No log 0.6047 52 0.8951 0.0741 0.8951 0.9461
No log 0.6279 54 0.8183 0.0725 0.8183 0.9046
No log 0.6512 56 0.7607 0.1139 0.7607 0.8722
No log 0.6744 58 0.7563 0.1094 0.7563 0.8697
No log 0.6977 60 0.8016 0.1972 0.8016 0.8953
No log 0.7209 62 0.9082 0.1254 0.9082 0.9530
No log 0.7442 64 1.0833 0.2460 1.0833 1.0408
No log 0.7674 66 1.1259 0.1521 1.1259 1.0611
No log 0.7907 68 1.0700 0.1584 1.0700 1.0344
No log 0.8140 70 0.9243 0.2285 0.9243 0.9614
No log 0.8372 72 0.8514 0.2345 0.8514 0.9227
No log 0.8605 74 0.8277 0.1456 0.8277 0.9098
No log 0.8837 76 0.8187 0.1094 0.8187 0.9048
No log 0.9070 78 0.8377 0.0393 0.8377 0.9153
No log 0.9302 80 0.8340 0.0833 0.8340 0.9133
No log 0.9535 82 0.7737 0.2041 0.7737 0.8796
No log 0.9767 84 0.7169 0.1903 0.7169 0.8467
No log 1.0 86 0.7406 0.1598 0.7406 0.8606
No log 1.0233 88 0.8485 0.2632 0.8485 0.9211
No log 1.0465 90 0.8504 0.2843 0.8504 0.9222
No log 1.0698 92 0.8386 0.2982 0.8386 0.9158
No log 1.0930 94 0.8069 0.2784 0.8069 0.8983
No log 1.1163 96 0.8123 0.2632 0.8123 0.9013
No log 1.1395 98 0.9411 0.2193 0.9411 0.9701
No log 1.1628 100 1.0036 0.1328 1.0036 1.0018
No log 1.1860 102 0.9645 0.1264 0.9645 0.9821
No log 1.2093 104 0.9785 0.1264 0.9785 0.9892
No log 1.2326 106 0.9359 0.1264 0.9359 0.9674
No log 1.2558 108 0.9006 0.1648 0.9006 0.9490
No log 1.2791 110 0.9918 0.0715 0.9918 0.9959
No log 1.3023 112 1.0667 0.1214 1.0667 1.0328
No log 1.3256 114 1.0329 0.1584 1.0329 1.0163
No log 1.3488 116 0.9678 0.2410 0.9678 0.9838
No log 1.3721 118 0.8986 0.2843 0.8986 0.9479
No log 1.3953 120 0.8446 0.2574 0.8446 0.9190
No log 1.4186 122 0.8169 0.2467 0.8169 0.9039
No log 1.4419 124 0.8333 0.1866 0.8333 0.9129
No log 1.4651 126 0.8326 0.2319 0.8326 0.9125
No log 1.4884 128 0.8375 0.2319 0.8375 0.9152
No log 1.5116 130 0.8359 0.2319 0.8359 0.9143
No log 1.5349 132 0.8209 0.2748 0.8209 0.9060
No log 1.5581 134 0.8180 0.1050 0.8180 0.9044
No log 1.5814 136 0.8386 0.0781 0.8386 0.9158
No log 1.6047 138 0.9401 0.2223 0.9401 0.9696
No log 1.6279 140 1.0603 0.2303 1.0603 1.0297
No log 1.6512 142 1.1180 0.2134 1.1180 1.0573
No log 1.6744 144 1.1252 0.1835 1.1252 1.0607
No log 1.6977 146 1.1204 0.1799 1.1204 1.0585
No log 1.7209 148 1.0204 0.2830 1.0204 1.0101
No log 1.7442 150 0.8525 0.2352 0.8525 0.9233
No log 1.7674 152 0.7969 0.1550 0.7969 0.8927
No log 1.7907 154 0.7908 0.2204 0.7908 0.8893
No log 1.8140 156 0.8310 0.2063 0.8310 0.9116
No log 1.8372 158 0.8716 0.2275 0.8716 0.9336
No log 1.8605 160 0.8570 0.1766 0.8570 0.9257
No log 1.8837 162 0.8316 0.1766 0.8316 0.9119
No log 1.9070 164 0.8143 0.1866 0.8143 0.9024
No log 1.9302 166 0.7840 0.2621 0.7840 0.8854
No log 1.9535 168 0.7590 0.2621 0.7590 0.8712
No log 1.9767 170 0.7603 0.1866 0.7603 0.8720
No log 2.0 172 0.7856 0.2063 0.7856 0.8863
No log 2.0233 174 0.8487 0.3234 0.8487 0.9213
No log 2.0465 176 0.9103 0.3159 0.9103 0.9541
No log 2.0698 178 0.9118 0.2727 0.9118 0.9549
No log 2.0930 180 0.8092 0.2751 0.8092 0.8996
No log 2.1163 182 0.7294 0.2256 0.7294 0.8540
No log 2.1395 184 0.7269 0.2342 0.7269 0.8526
No log 2.1628 186 0.7310 0.2398 0.7310 0.8550
No log 2.1860 188 0.7362 0.2058 0.7362 0.8580
No log 2.2093 190 0.7545 0.1587 0.7545 0.8686
No log 2.2326 192 0.7607 0.1786 0.7607 0.8722
No log 2.2558 194 0.7504 0.1587 0.7504 0.8663
No log 2.2791 196 0.7534 0.1613 0.7534 0.8680
No log 2.3023 198 0.7677 0.0896 0.7677 0.8762
No log 2.3256 200 0.8272 0.3723 0.8272 0.9095
No log 2.3488 202 0.9575 0.2779 0.9575 0.9785
No log 2.3721 204 1.0783 0.2821 1.0783 1.0384
No log 2.3953 206 1.0299 0.2868 1.0299 1.0149
No log 2.4186 208 0.9001 0.2975 0.9001 0.9487
No log 2.4419 210 0.8134 0.2577 0.8134 0.9019
No log 2.4651 212 0.8059 0.2749 0.8059 0.8977
No log 2.4884 214 0.8151 0.2605 0.8151 0.9028
No log 2.5116 216 0.8611 0.2643 0.8611 0.9279
No log 2.5349 218 0.9281 0.3402 0.9281 0.9634
No log 2.5581 220 0.9315 0.3343 0.9315 0.9651
No log 2.5814 222 0.9670 0.3543 0.9670 0.9834
No log 2.6047 224 0.9172 0.2725 0.9172 0.9577
No log 2.6279 226 0.8717 0.2723 0.8717 0.9336
No log 2.6512 228 0.8070 0.2445 0.8070 0.8984
No log 2.6744 230 0.7919 0.1986 0.7919 0.8899
No log 2.6977 232 0.8004 0.1952 0.8004 0.8947
No log 2.7209 234 0.8316 0.2129 0.8316 0.9119
No log 2.7442 236 0.9373 0.2513 0.9373 0.9682
No log 2.7674 238 1.0237 0.2578 1.0237 1.0118
No log 2.7907 240 0.9620 0.1623 0.9620 0.9808
No log 2.8140 242 0.8730 0.2749 0.8730 0.9344
No log 2.8372 244 0.8119 0.3391 0.8119 0.9011
No log 2.8605 246 0.7908 0.3391 0.7908 0.8893
No log 2.8837 248 0.8213 0.3221 0.8213 0.9063
No log 2.9070 250 0.8904 0.2949 0.8904 0.9436
No log 2.9302 252 1.0133 0.3160 1.0133 1.0066
No log 2.9535 254 1.1344 0.2330 1.1344 1.0651
No log 2.9767 256 1.1303 0.2330 1.1303 1.0631
No log 3.0 258 0.9901 0.3052 0.9901 0.9950
No log 3.0233 260 0.8698 0.3256 0.8698 0.9326
No log 3.0465 262 0.8074 0.3450 0.8074 0.8985
No log 3.0698 264 0.7794 0.3209 0.7794 0.8828
No log 3.0930 266 0.7632 0.2606 0.7632 0.8736
No log 3.1163 268 0.7650 0.3247 0.7650 0.8746
No log 3.1395 270 0.7956 0.3492 0.7956 0.8920
No log 3.1628 272 0.8757 0.3397 0.8757 0.9358
No log 3.1860 274 0.9372 0.3086 0.9372 0.9681
No log 3.2093 276 0.9240 0.3086 0.9240 0.9613
No log 3.2326 278 0.9642 0.2571 0.9642 0.9819
No log 3.2558 280 0.9841 0.2636 0.9841 0.9920
No log 3.2791 282 0.9692 0.2603 0.9692 0.9845
No log 3.3023 284 0.9764 0.2603 0.9764 0.9881
No log 3.3256 286 1.0517 0.1935 1.0517 1.0255
No log 3.3488 288 1.0290 0.1935 1.0290 1.0144
No log 3.3721 290 0.9124 0.2518 0.9124 0.9552
No log 3.3953 292 0.8410 0.2967 0.8410 0.9171
No log 3.4186 294 0.8287 0.2967 0.8287 0.9104
No log 3.4419 296 0.8549 0.2967 0.8549 0.9246
No log 3.4651 298 0.8723 0.3095 0.8723 0.9340
No log 3.4884 300 0.8370 0.2696 0.8370 0.9149
No log 3.5116 302 0.8122 0.2298 0.8122 0.9012
No log 3.5349 304 0.8245 0.2580 0.8245 0.9080
No log 3.5581 306 0.8680 0.3425 0.8680 0.9316
No log 3.5814 308 0.9406 0.2374 0.9406 0.9698
No log 3.6047 310 1.0496 0.2729 1.0496 1.0245
No log 3.6279 312 1.0532 0.2217 1.0532 1.0262
No log 3.6512 314 0.9601 0.2905 0.9601 0.9799
No log 3.6744 316 0.9332 0.2651 0.9332 0.9660
No log 3.6977 318 0.9229 0.2590 0.9229 0.9607
No log 3.7209 320 0.9459 0.2917 0.9459 0.9726
No log 3.7442 322 0.9740 0.2589 0.9740 0.9869
No log 3.7674 324 1.0901 0.2247 1.0901 1.0441
No log 3.7907 326 1.2420 0.1839 1.2420 1.1145
No log 3.8140 328 1.2484 0.1696 1.2484 1.1173
No log 3.8372 330 1.1322 0.1729 1.1322 1.0641
No log 3.8605 332 1.0318 0.2129 1.0318 1.0158
No log 3.8837 334 0.9272 0.2616 0.9272 0.9629
No log 3.9070 336 0.8598 0.3700 0.8598 0.9273
No log 3.9302 338 0.8427 0.3146 0.8427 0.9180
No log 3.9535 340 0.8453 0.2937 0.8453 0.9194
No log 3.9767 342 0.8341 0.2695 0.8341 0.9133
No log 4.0 344 0.8736 0.3463 0.8736 0.9347
No log 4.0233 346 0.9137 0.3463 0.9137 0.9559
No log 4.0465 348 0.9673 0.3740 0.9673 0.9835
No log 4.0698 350 1.0170 0.3165 1.0170 1.0085
No log 4.0930 352 1.0726 0.3138 1.0726 1.0356
No log 4.1163 354 1.0459 0.3374 1.0459 1.0227
No log 4.1395 356 0.9384 0.3397 0.9384 0.9687
No log 4.1628 358 0.8448 0.3658 0.8448 0.9191
No log 4.1860 360 0.8148 0.3839 0.8148 0.9027
No log 4.2093 362 0.8144 0.2445 0.8144 0.9025
No log 4.2326 364 0.9102 0.2724 0.9102 0.9540
No log 4.2558 366 1.0922 0.2215 1.0922 1.0451
No log 4.2791 368 1.1520 0.2439 1.1520 1.0733
No log 4.3023 370 1.0782 0.2568 1.0782 1.0384
No log 4.3256 372 0.9524 0.2892 0.9524 0.9759
No log 4.3488 374 0.9105 0.2223 0.9105 0.9542
No log 4.3721 376 0.8816 0.2839 0.8816 0.9390
No log 4.3953 378 0.8885 0.3121 0.8885 0.9426
No log 4.4186 380 0.9530 0.3115 0.9530 0.9762
No log 4.4419 382 0.9478 0.3365 0.9478 0.9735
No log 4.4651 384 0.9209 0.3486 0.9209 0.9596
No log 4.4884 386 0.8790 0.3980 0.8790 0.9376
No log 4.5116 388 0.8978 0.3183 0.8978 0.9475
No log 4.5349 390 0.9640 0.2201 0.9640 0.9818
No log 4.5581 392 1.0767 0.3290 1.0767 1.0376
No log 4.5814 394 1.1566 0.2524 1.1566 1.0755
No log 4.6047 396 1.1523 0.2635 1.1523 1.0734
No log 4.6279 398 1.1634 0.2330 1.1634 1.0786
No log 4.6512 400 1.1940 0.2191 1.1940 1.0927
No log 4.6744 402 1.1613 0.2481 1.1613 1.0776
No log 4.6977 404 1.0121 0.2627 1.0121 1.0060
No log 4.7209 406 0.8709 0.3613 0.8709 0.9332
No log 4.7442 408 0.8665 0.3575 0.8665 0.9309
No log 4.7674 410 0.8931 0.3335 0.8931 0.9451
No log 4.7907 412 0.8940 0.3382 0.8940 0.9455
No log 4.8140 414 0.9261 0.4108 0.9261 0.9623
No log 4.8372 416 0.9656 0.3621 0.9656 0.9826
No log 4.8605 418 0.9538 0.3621 0.9538 0.9766
No log 4.8837 420 0.9030 0.4108 0.9030 0.9503
No log 4.9070 422 0.8484 0.3321 0.8484 0.9211
No log 4.9302 424 0.8288 0.3321 0.8288 0.9104
No log 4.9535 426 0.8222 0.3360 0.8222 0.9068
No log 4.9767 428 0.8213 0.4023 0.8213 0.9063
No log 5.0 430 0.8587 0.3629 0.8587 0.9267
No log 5.0233 432 0.9305 0.3285 0.9305 0.9646
No log 5.0465 434 0.9586 0.3036 0.9586 0.9791
No log 5.0698 436 0.9370 0.3034 0.9370 0.9680
No log 5.0930 438 0.9111 0.2202 0.9111 0.9545
No log 5.1163 440 0.9622 0.2105 0.9622 0.9809
No log 5.1395 442 1.0422 0.2706 1.0422 1.0209
No log 5.1628 444 1.1129 0.2686 1.1129 1.0549
No log 5.1860 446 1.1171 0.2582 1.1171 1.0569
No log 5.2093 448 1.0214 0.3084 1.0214 1.0106
No log 5.2326 450 0.9611 0.3845 0.9611 0.9804
No log 5.2558 452 0.8748 0.4118 0.8748 0.9353
No log 5.2791 454 0.8292 0.4089 0.8292 0.9106
No log 5.3023 456 0.8397 0.4183 0.8397 0.9164
No log 5.3256 458 0.9006 0.3432 0.9006 0.9490
No log 5.3488 460 0.9664 0.3268 0.9664 0.9830
No log 5.3721 462 0.9598 0.2329 0.9598 0.9797
No log 5.3953 464 0.9039 0.3251 0.9039 0.9507
No log 5.4186 466 0.8421 0.3486 0.8421 0.9177
No log 5.4419 468 0.7849 0.3635 0.7849 0.8859
No log 5.4651 470 0.7575 0.4001 0.7575 0.8703
No log 5.4884 472 0.7643 0.4001 0.7643 0.8742
No log 5.5116 474 0.7735 0.3885 0.7735 0.8795
No log 5.5349 476 0.7756 0.3840 0.7756 0.8807
No log 5.5581 478 0.7790 0.3840 0.7790 0.8826
No log 5.5814 480 0.7598 0.4341 0.7598 0.8717
No log 5.6047 482 0.7654 0.3906 0.7654 0.8749
No log 5.6279 484 0.7978 0.3754 0.7978 0.8932
No log 5.6512 486 0.8484 0.3732 0.8484 0.9211
No log 5.6744 488 0.9319 0.3586 0.9319 0.9653
No log 5.6977 490 0.9514 0.3321 0.9514 0.9754
No log 5.7209 492 0.9691 0.3321 0.9691 0.9844
No log 5.7442 494 0.8687 0.4663 0.8687 0.9320
No log 5.7674 496 0.7790 0.4058 0.7790 0.8826
No log 5.7907 498 0.7580 0.3902 0.7580 0.8706
0.4087 5.8140 500 0.7576 0.3178 0.7576 0.8704
0.4087 5.8372 502 0.7561 0.3284 0.7561 0.8696
0.4087 5.8605 504 0.7570 0.4137 0.7570 0.8701
0.4087 5.8837 506 0.7712 0.4360 0.7712 0.8782
0.4087 5.9070 508 0.7854 0.4360 0.7854 0.8862
0.4087 5.9302 510 0.7977 0.3417 0.7977 0.8931
0.4087 5.9535 512 0.8060 0.3145 0.8060 0.8978
0.4087 5.9767 514 0.8177 0.2841 0.8177 0.9042
0.4087 6.0 516 0.8243 0.4240 0.8243 0.9079
0.4087 6.0233 518 0.8772 0.3883 0.8772 0.9366
0.4087 6.0465 520 0.9470 0.3474 0.9470 0.9731
0.4087 6.0698 522 0.9742 0.3052 0.9742 0.9870
0.4087 6.0930 524 1.0244 0.1434 1.0244 1.0121
0.4087 6.1163 526 1.0639 0.1434 1.0639 1.0315
0.4087 6.1395 528 1.1227 0.1628 1.1227 1.0596
0.4087 6.1628 530 1.1178 0.1594 1.1178 1.0572

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
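
A usage sketch for loading the checkpoint from the Hugging Face Hub. The "text-classification" task is an assumption, since the card does not document whether the head is a classifier or a regressor; the download happens lazily on first call and requires network access:

```python
from transformers import pipeline

MODEL_ID = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k17_task7_organization"

def load_scorer():
    # Downloads the checkpoint from the Hugging Face Hub on first call.
    # "text-classification" is an assumption about the task head.
    return pipeline("text-classification", model=MODEL_ID)

if __name__ == "__main__":
    scorer = load_scorer()
    print(scorer("هذا مثال لمقال عربي."))
```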
Model size

  • 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k17_task7_organization

  • Fine-tuned from aubmindlab/bert-base-arabertv02