ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k17_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9638
  • QWK (quadratic weighted kappa): 0.3842
  • MSE: 0.9638
  • RMSE: 0.9817
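For reference, the reported metrics can be recomputed from model predictions. Below is a minimal NumPy sketch; the label arrays and class count are illustrative examples, not values from the actual evaluation set:

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the QWK column in the table)."""
    # Observed agreement matrix: O[i, j] counts true label i predicted as j.
    O = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        O[t, p] += 1
    # Expected agreement under independence of the two label distributions.
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / len(y_true)
    # Quadratic penalty grows with the squared distance between labels.
    idx = np.arange(n_classes)
    W = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    return 1.0 - (W * O).sum() / (W * E).sum()

# Illustrative labels on a 3-point organization scale (assumed, not real data).
y_true = np.array([0, 1, 2, 2, 1])
y_pred = np.array([0, 1, 1, 2, 1])

qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=3)   # → 0.8
mse = float(np.mean((y_true - y_pred) ** 2))                  # → 0.2
rmse = float(np.sqrt(mse))
```

Note that the reported Loss equals the reported MSE, consistent with the model being trained as a regressor with an MSE objective.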

Model description

More information needed

Intended uses & limitations

More information needed
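Pending fuller documentation, the checkpoint can presumably be loaded for scoring like any `transformers` sequence-classification model. This is a hedged sketch: it assumes network access to the Hugging Face Hub, and the head type and label mapping are not documented in this card:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k17_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

# Score one essay; the Arabic text below is a placeholder ("essay text here").
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits
```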

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
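The hyperparameters above map onto a `transformers.TrainingArguments` object roughly as follows. This is a sketch only: the output directory, and any argument not listed above, are assumptions:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",            # assumed; not stated in the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam settings matching the reported betas=(0.9, 0.999), epsilon=1e-08:
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```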

Training results

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
No log 0.0233 2 4.0718 0.0078 4.0718 2.0179
No log 0.0465 4 2.3721 0.0364 2.3721 1.5402
No log 0.0698 6 2.1812 -0.0264 2.1812 1.4769
No log 0.0930 8 1.8530 0.0169 1.8530 1.3612
No log 0.1163 10 1.1344 0.1810 1.1344 1.0651
No log 0.1395 12 1.1030 0.0824 1.1030 1.0502
No log 0.1628 14 1.0936 0.1292 1.0936 1.0458
No log 0.1860 16 1.0885 0.2265 1.0885 1.0433
No log 0.2093 18 1.0595 0.2643 1.0595 1.0293
No log 0.2326 20 1.0418 0.2340 1.0418 1.0207
No log 0.2558 22 1.0366 0.2341 1.0366 1.0182
No log 0.2791 24 0.9843 0.2591 0.9843 0.9921
No log 0.3023 26 0.9590 0.2897 0.9590 0.9793
No log 0.3256 28 1.0263 0.2897 1.0263 1.0131
No log 0.3488 30 1.0029 0.2061 1.0029 1.0015
No log 0.3721 32 0.9408 0.2517 0.9408 0.9700
No log 0.3953 34 0.8954 0.3562 0.8954 0.9463
No log 0.4186 36 0.9328 0.3251 0.9328 0.9658
No log 0.4419 38 0.9135 0.3688 0.9135 0.9558
No log 0.4651 40 0.9366 0.2594 0.9366 0.9678
No log 0.4884 42 1.0128 0.1294 1.0128 1.0064
No log 0.5116 44 1.0515 0.2308 1.0515 1.0254
No log 0.5349 46 1.0618 0.2614 1.0618 1.0304
No log 0.5581 48 0.9984 0.1446 0.9984 0.9992
No log 0.5814 50 0.9678 0.1351 0.9678 0.9838
No log 0.6047 52 0.9169 0.2988 0.9169 0.9575
No log 0.6279 54 0.9111 0.2865 0.9111 0.9545
No log 0.6512 56 0.9330 0.3112 0.9330 0.9659
No log 0.6744 58 0.9274 0.2965 0.9274 0.9630
No log 0.6977 60 0.8976 0.3129 0.8976 0.9474
No log 0.7209 62 0.9183 0.2842 0.9183 0.9583
No log 0.7442 64 0.9795 0.2471 0.9795 0.9897
No log 0.7674 66 0.9277 0.2464 0.9277 0.9632
No log 0.7907 68 0.8683 0.3960 0.8683 0.9318
No log 0.8140 70 0.8759 0.3702 0.8759 0.9359
No log 0.8372 72 0.8684 0.3454 0.8684 0.9319
No log 0.8605 74 0.9098 0.2610 0.9098 0.9538
No log 0.8837 76 0.9629 0.3135 0.9629 0.9813
No log 0.9070 78 0.9849 0.3322 0.9849 0.9924
No log 0.9302 80 1.0327 0.2572 1.0327 1.0162
No log 0.9535 82 1.0599 0.2927 1.0599 1.0295
No log 0.9767 84 1.1585 0.2062 1.1585 1.0763
No log 1.0 86 1.4086 0.1428 1.4086 1.1868
No log 1.0233 88 1.5834 0.1807 1.5834 1.2583
No log 1.0465 90 1.5046 0.2170 1.5046 1.2266
No log 1.0698 92 1.2335 0.1202 1.2335 1.1106
No log 1.0930 94 1.0240 0.1167 1.0240 1.0119
No log 1.1163 96 0.9339 0.3382 0.9339 0.9664
No log 1.1395 98 0.9083 0.3044 0.9083 0.9530
No log 1.1628 100 0.9985 0.1952 0.9985 0.9993
No log 1.1860 102 1.1281 0.2206 1.1281 1.0621
No log 1.2093 104 1.1760 0.2598 1.1760 1.0844
No log 1.2326 106 1.1219 0.2793 1.1219 1.0592
No log 1.2558 108 1.2213 0.2120 1.2213 1.1051
No log 1.2791 110 1.3553 0.1792 1.3553 1.1642
No log 1.3023 112 1.3231 0.0813 1.3231 1.1503
No log 1.3256 114 1.3236 0.0370 1.3236 1.1505
No log 1.3488 116 1.3320 0.1168 1.3320 1.1541
No log 1.3721 118 1.3199 0.1268 1.3199 1.1489
No log 1.3953 120 1.1660 0.1727 1.1660 1.0798
No log 1.4186 122 1.0757 0.1725 1.0757 1.0372
No log 1.4419 124 1.0148 0.2636 1.0148 1.0074
No log 1.4651 126 0.9555 0.3024 0.9555 0.9775
No log 1.4884 128 0.9189 0.3454 0.9189 0.9586
No log 1.5116 130 0.9522 0.3278 0.9522 0.9758
No log 1.5349 132 1.2404 0.2958 1.2404 1.1137
No log 1.5581 134 1.4058 0.2296 1.4058 1.1857
No log 1.5814 136 1.2700 0.2958 1.2700 1.1269
No log 1.6047 138 0.9554 0.3558 0.9554 0.9775
No log 1.6279 140 0.8713 0.3488 0.8713 0.9334
No log 1.6512 142 0.8261 0.3776 0.8261 0.9089
No log 1.6744 144 0.7988 0.5698 0.7988 0.8938
No log 1.6977 146 1.0439 0.3310 1.0439 1.0217
No log 1.7209 148 1.1611 0.3008 1.1611 1.0775
No log 1.7442 150 1.0298 0.3469 1.0298 1.0148
No log 1.7674 152 0.8266 0.4586 0.8266 0.9092
No log 1.7907 154 0.7710 0.5070 0.7710 0.8781
No log 1.8140 156 0.8507 0.5160 0.8507 0.9223
No log 1.8372 158 1.1247 0.3163 1.1247 1.0605
No log 1.8605 160 1.6032 0.2986 1.6032 1.2662
No log 1.8837 162 1.7421 0.3025 1.7421 1.3199
No log 1.9070 164 1.6031 0.3215 1.6031 1.2661
No log 1.9302 166 1.2108 0.2906 1.2108 1.1004
No log 1.9535 168 0.9824 0.3614 0.9824 0.9911
No log 1.9767 170 0.8941 0.3806 0.8941 0.9455
No log 2.0 172 0.9259 0.3717 0.9259 0.9622
No log 2.0233 174 0.9353 0.3985 0.9353 0.9671
No log 2.0465 176 1.0685 0.3139 1.0685 1.0337
No log 2.0698 178 1.2417 0.2223 1.2417 1.1143
No log 2.0930 180 1.1134 0.3139 1.1134 1.0552
No log 2.1163 182 0.9233 0.4926 0.9233 0.9609
No log 2.1395 184 0.8763 0.5292 0.8763 0.9361
No log 2.1628 186 0.8169 0.5676 0.8169 0.9038
No log 2.1860 188 0.8141 0.5107 0.8141 0.9023
No log 2.2093 190 0.8972 0.5279 0.8972 0.9472
No log 2.2326 192 1.0618 0.3370 1.0618 1.0305
No log 2.2558 194 1.1959 0.3238 1.1959 1.0936
No log 2.2791 196 1.1303 0.2827 1.1303 1.0632
No log 2.3023 198 0.9187 0.4169 0.9187 0.9585
No log 2.3256 200 0.8434 0.3979 0.8434 0.9184
No log 2.3488 202 0.8716 0.4180 0.8716 0.9336
No log 2.3721 204 0.8494 0.5073 0.8494 0.9216
No log 2.3953 206 0.8342 0.4610 0.8342 0.9133
No log 2.4186 208 0.8660 0.3939 0.8660 0.9306
No log 2.4419 210 0.8692 0.4068 0.8692 0.9323
No log 2.4651 212 0.8262 0.4843 0.8263 0.9090
No log 2.4884 214 0.8135 0.4843 0.8135 0.9019
No log 2.5116 216 0.9033 0.3778 0.9033 0.9504
No log 2.5349 218 0.9604 0.3913 0.9604 0.9800
No log 2.5581 220 0.9571 0.3913 0.9571 0.9783
No log 2.5814 222 0.8219 0.5599 0.8219 0.9066
No log 2.6047 224 0.8264 0.5517 0.8264 0.9091
No log 2.6279 226 0.9300 0.4613 0.9300 0.9644
No log 2.6512 228 0.9306 0.4815 0.9306 0.9647
No log 2.6744 230 0.8514 0.4321 0.8514 0.9227
No log 2.6977 232 0.9322 0.4341 0.9322 0.9655
No log 2.7209 234 0.9275 0.4091 0.9275 0.9631
No log 2.7442 236 0.9735 0.3619 0.9735 0.9867
No log 2.7674 238 1.0201 0.3928 1.0201 1.0100
No log 2.7907 240 1.0203 0.4252 1.0203 1.0101
No log 2.8140 242 0.9148 0.4820 0.9148 0.9565
No log 2.8372 244 0.9203 0.5224 0.9203 0.9593
No log 2.8605 246 0.9300 0.5012 0.9300 0.9644
No log 2.8837 248 0.8538 0.5305 0.8538 0.9240
No log 2.9070 250 0.9117 0.4787 0.9117 0.9548
No log 2.9302 252 0.9650 0.4787 0.9650 0.9823
No log 2.9535 254 0.8305 0.4709 0.8305 0.9113
No log 2.9767 256 0.7423 0.4995 0.7423 0.8616
No log 3.0 258 0.7403 0.5010 0.7403 0.8604
No log 3.0233 260 0.7647 0.5318 0.7647 0.8745
No log 3.0465 262 0.8210 0.4832 0.8210 0.9061
No log 3.0698 264 0.7865 0.4857 0.7865 0.8869
No log 3.0930 266 0.7486 0.5557 0.7486 0.8652
No log 3.1163 268 0.7375 0.5692 0.7375 0.8588
No log 3.1395 270 0.7678 0.5069 0.7678 0.8762
No log 3.1628 272 0.7692 0.5317 0.7692 0.8770
No log 3.1860 274 0.7783 0.4974 0.7783 0.8822
No log 3.2093 276 0.8197 0.4601 0.8197 0.9054
No log 3.2326 278 0.7643 0.4995 0.7643 0.8743
No log 3.2558 280 0.7587 0.4645 0.7587 0.8710
No log 3.2791 282 0.7479 0.4660 0.7479 0.8648
No log 3.3023 284 0.7470 0.5112 0.7470 0.8643
No log 3.3256 286 0.8182 0.4836 0.8182 0.9046
No log 3.3488 288 0.8478 0.4186 0.8478 0.9208
No log 3.3721 290 0.7830 0.5175 0.7830 0.8849
No log 3.3953 292 0.7238 0.5135 0.7238 0.8508
No log 3.4186 294 0.7675 0.4608 0.7675 0.8760
No log 3.4419 296 0.9341 0.3702 0.9341 0.9665
No log 3.4651 298 0.8875 0.4056 0.8875 0.9421
No log 3.4884 300 0.7834 0.5174 0.7834 0.8851
No log 3.5116 302 0.8637 0.4491 0.8637 0.9293
No log 3.5349 304 0.8962 0.4143 0.8962 0.9467
No log 3.5581 306 0.8495 0.4784 0.8495 0.9217
No log 3.5814 308 0.8563 0.5152 0.8563 0.9253
No log 3.6047 310 0.8593 0.5152 0.8593 0.9270
No log 3.6279 312 0.8642 0.5116 0.8642 0.9296
No log 3.6512 314 0.9078 0.3990 0.9078 0.9528
No log 3.6744 316 0.9806 0.2373 0.9806 0.9902
No log 3.6977 318 0.9358 0.3250 0.9358 0.9674
No log 3.7209 320 0.8059 0.4368 0.8059 0.8977
No log 3.7442 322 0.7647 0.5783 0.7647 0.8745
No log 3.7674 324 0.8035 0.4879 0.8035 0.8964
No log 3.7907 326 0.7633 0.5771 0.7633 0.8736
No log 3.8140 328 0.7261 0.5247 0.7261 0.8521
No log 3.8372 330 0.7559 0.4968 0.7559 0.8694
No log 3.8605 332 0.7487 0.5654 0.7487 0.8653
No log 3.8837 334 0.7445 0.5032 0.7445 0.8628
No log 3.9070 336 0.7712 0.5032 0.7712 0.8782
No log 3.9302 338 0.8103 0.5199 0.8103 0.9002
No log 3.9535 340 0.9462 0.4171 0.9462 0.9727
No log 3.9767 342 0.9809 0.3076 0.9809 0.9904
No log 4.0 344 0.9078 0.3139 0.9078 0.9528
No log 4.0233 346 0.8431 0.3998 0.8431 0.9182
No log 4.0465 348 0.8172 0.4405 0.8172 0.9040
No log 4.0698 350 0.7998 0.4645 0.7998 0.8943
No log 4.0930 352 0.8445 0.4840 0.8445 0.9190
No log 4.1163 354 0.9984 0.3958 0.9984 0.9992
No log 4.1395 356 1.0380 0.3798 1.0380 1.0188
No log 4.1628 358 0.9321 0.4455 0.9321 0.9654
No log 4.1860 360 0.8042 0.4995 0.8042 0.8968
No log 4.2093 362 0.7819 0.5121 0.7819 0.8843
No log 4.2326 364 0.7949 0.4988 0.7949 0.8916
No log 4.2558 366 0.8609 0.4275 0.8609 0.9279
No log 4.2791 368 1.0434 0.4014 1.0434 1.0215
No log 4.3023 370 1.1088 0.2485 1.1088 1.0530
No log 4.3256 372 1.0202 0.2614 1.0202 1.0100
No log 4.3488 374 0.9084 0.2983 0.9084 0.9531
No log 4.3721 376 0.8354 0.4211 0.8354 0.9140
No log 4.3953 378 0.8236 0.3998 0.8236 0.9075
No log 4.4186 380 0.8192 0.4388 0.8192 0.9051
No log 4.4419 382 0.8203 0.4251 0.8203 0.9057
No log 4.4651 384 0.8911 0.3992 0.8911 0.9440
No log 4.4884 386 1.0038 0.3863 1.0038 1.0019
No log 4.5116 388 0.9740 0.3863 0.9740 0.9869
No log 4.5349 390 0.8734 0.3992 0.8734 0.9346
No log 4.5581 392 0.8103 0.4774 0.8103 0.9002
No log 4.5814 394 0.8232 0.4292 0.8232 0.9073
No log 4.6047 396 0.8042 0.5032 0.8042 0.8967
No log 4.6279 398 0.8018 0.4888 0.8018 0.8954
No log 4.6512 400 0.8816 0.4584 0.8816 0.9389
No log 4.6744 402 0.9554 0.4197 0.9554 0.9775
No log 4.6977 404 1.0339 0.2815 1.0339 1.0168
No log 4.7209 406 0.9846 0.3310 0.9846 0.9922
No log 4.7442 408 0.8691 0.4244 0.8691 0.9322
No log 4.7674 410 0.8386 0.4115 0.8386 0.9157
No log 4.7907 412 0.8379 0.4244 0.8379 0.9154
No log 4.8140 414 0.9412 0.3577 0.9412 0.9702
No log 4.8372 416 1.0314 0.4454 1.0314 1.0156
No log 4.8605 418 0.9830 0.4579 0.9830 0.9915
No log 4.8837 420 0.9394 0.4510 0.9394 0.9692
No log 4.9070 422 0.9186 0.5039 0.9186 0.9584
No log 4.9302 424 0.8613 0.4960 0.8613 0.9281
No log 4.9535 426 0.8254 0.5333 0.8254 0.9085
No log 4.9767 428 0.8327 0.5333 0.8327 0.9125
No log 5.0 430 0.8780 0.4471 0.8780 0.9370
No log 5.0233 432 0.8832 0.4078 0.8832 0.9398
No log 5.0465 434 0.8669 0.4748 0.8669 0.9311
No log 5.0698 436 0.8848 0.4133 0.8848 0.9406
No log 5.0930 438 0.9152 0.3609 0.9152 0.9567
No log 5.1163 440 1.0502 0.2968 1.0502 1.0248
No log 5.1395 442 1.2312 0.3568 1.2312 1.1096
No log 5.1628 444 1.1993 0.3568 1.1993 1.0951
No log 5.1860 446 1.0657 0.3046 1.0657 1.0323
No log 5.2093 448 0.9478 0.2879 0.9478 0.9735
No log 5.2326 450 0.9456 0.4048 0.9456 0.9724
No log 5.2558 452 0.9589 0.3263 0.9589 0.9792
No log 5.2791 454 0.9922 0.2533 0.9922 0.9961
No log 5.3023 456 1.0286 0.2857 1.0286 1.0142
No log 5.3256 458 0.9959 0.2633 0.9959 0.9979
No log 5.3488 460 0.9467 0.3521 0.9467 0.9730
No log 5.3721 462 0.9158 0.4180 0.9158 0.9570
No log 5.3953 464 0.8940 0.4163 0.8940 0.9455
No log 5.4186 466 0.9091 0.3861 0.9091 0.9535
No log 5.4419 468 0.8938 0.4499 0.8938 0.9454
No log 5.4651 470 0.9023 0.4282 0.9023 0.9499
No log 5.4884 472 0.8980 0.4305 0.8980 0.9477
No log 5.5116 474 0.9227 0.4971 0.9227 0.9606
No log 5.5349 476 0.9280 0.4971 0.9280 0.9633
No log 5.5581 478 0.9340 0.4859 0.9340 0.9664
No log 5.5814 480 1.0342 0.3959 1.0342 1.0170
No log 5.6047 482 1.1467 0.3300 1.1467 1.0708
No log 5.6279 484 1.1417 0.2907 1.1417 1.0685
No log 5.6512 486 1.0405 0.2898 1.0405 1.0200
No log 5.6744 488 0.9077 0.4006 0.9077 0.9528
No log 5.6977 490 0.8603 0.4724 0.8603 0.9275
No log 5.7209 492 0.9268 0.4383 0.9268 0.9627
No log 5.7442 494 1.1355 0.4186 1.1355 1.0656
No log 5.7674 496 1.2350 0.4269 1.2350 1.1113
No log 5.7907 498 1.1309 0.3817 1.1309 1.0634
0.3401 5.8140 500 0.9254 0.4032 0.9254 0.9620
0.3401 5.8372 502 0.8477 0.4237 0.8477 0.9207
0.3401 5.8605 504 0.8572 0.4106 0.8572 0.9259
0.3401 5.8837 506 0.9075 0.3732 0.9075 0.9526
0.3401 5.9070 508 0.9790 0.3322 0.9790 0.9895
0.3401 5.9302 510 0.9423 0.3418 0.9423 0.9707
0.3401 5.9535 512 0.8968 0.4359 0.8968 0.9470
0.3401 5.9767 514 0.8682 0.4398 0.8682 0.9318
0.3401 6.0 516 0.8489 0.4824 0.8489 0.9213
0.3401 6.0233 518 0.8804 0.4719 0.8804 0.9383
0.3401 6.0465 520 0.8652 0.4928 0.8652 0.9302
0.3401 6.0698 522 0.8921 0.5074 0.8921 0.9445
0.3401 6.0930 524 0.9055 0.4945 0.9055 0.9516
0.3401 6.1163 526 0.8626 0.4634 0.8626 0.9287
0.3401 6.1395 528 0.8272 0.5247 0.8272 0.9095
0.3401 6.1628 530 0.8208 0.4659 0.8208 0.9060
0.3401 6.1860 532 0.8171 0.5490 0.8171 0.9039
0.3401 6.2093 534 0.8355 0.5546 0.8355 0.9141
0.3401 6.2326 536 0.8682 0.5498 0.8682 0.9318
0.3401 6.2558 538 0.8978 0.5163 0.8978 0.9475
0.3401 6.2791 540 0.8822 0.3739 0.8822 0.9393
0.3401 6.3023 542 0.9008 0.3892 0.9008 0.9491
0.3401 6.3256 544 0.9291 0.3993 0.9291 0.9639
0.3401 6.3488 546 0.9632 0.3860 0.9632 0.9814
0.3401 6.3721 548 0.9638 0.3842 0.9638 0.9817

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
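To reproduce this environment, the listed versions can be pinned with pip (assuming Python ≥ 3.8; the `+cu118` tag indicates a CUDA 11.8 build of PyTorch):

```shell
# Pin the framework versions listed above.
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
# The +cu118 PyTorch build comes from the PyTorch CUDA 11.8 wheel index.
pip install "torch==2.4.0" --index-url https://download.pytorch.org/whl/cu118
```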
Full model ID: MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k17_task5_organization (fine-tuned from aubmindlab/bert-base-arabertv02).