ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k7_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the automated card did not record a dataset name). It achieves the following results on the evaluation set:

  • Loss: 0.8966
  • QWK (quadratic weighted kappa): 0.4450
  • MSE: 0.8966
  • RMSE: 0.9469
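The reported loss equals the MSE, and the RMSE is its square root. A minimal sketch of how these metrics can be computed (this is illustrative plain Python, not the card's actual evaluation code):

```python
# Quadratic weighted kappa (QWK): agreement between true and predicted
# ordinal labels, penalizing disagreements by squared distance.
def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    # Observed confusion matrix
    observed = [[0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    n = len(y_true)
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(num_classes))
                 for j in range(num_classes)]
    num = 0.0  # weighted observed disagreement
    den = 0.0  # weighted expected disagreement (chance agreement)
    for i in range(num_classes):
        for j in range(num_classes):
            w = (i - j) ** 2 / (num_classes - 1) ** 2  # quadratic weight
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return mse(y_true, y_pred) ** 0.5

# Perfect agreement gives kappa = 1.0
print(quadratic_weighted_kappa([0, 1, 2, 3], [0, 1, 2, 3], 4))  # 1.0
```

QWK ranges from -1 to 1, with 0 meaning chance-level agreement, so the 0.4450 above indicates moderate agreement between predicted and gold organization scores.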

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
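The listed values map directly onto keyword arguments of the Hugging Face Trainer API. A sketch of that mapping (the original training script is not part of this card, so the argument names follow the standard `TrainingArguments` interface, not a known script):

```python
# Hyperparameters from the card, expressed as TrainingArguments-style kwargs.
training_kwargs = dict(
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,      # Adam betas and epsilon as listed above;
    adam_beta2=0.999,    # Adam is the Trainer's default optimizer
    adam_epsilon=1e-8,
)

# Usage (requires transformers and torch installed):
# from transformers import TrainingArguments, Trainer
# args = TrainingArguments(output_dir="out", **training_kwargs)
```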

Training results

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
No log 0.05 2 4.4529 0.0010 4.4529 2.1102
No log 0.1 4 3.2904 -0.0119 3.2904 1.8140
No log 0.15 6 2.1225 0.0538 2.1225 1.4569
No log 0.2 8 1.3211 0.0750 1.3211 1.1494
No log 0.25 10 1.2455 0.0277 1.2455 1.1160
No log 0.3 12 1.2683 0.0654 1.2683 1.1262
No log 0.35 14 1.3012 0.1016 1.3012 1.1407
No log 0.4 16 1.3325 0.0688 1.3325 1.1544
No log 0.45 18 1.3563 0.0688 1.3563 1.1646
No log 0.5 20 1.1378 0.2936 1.1378 1.0667
No log 0.55 22 1.1792 0.1541 1.1792 1.0859
No log 0.6 24 1.1186 0.2056 1.1186 1.0577
No log 0.65 26 1.1239 0.2986 1.1239 1.0602
No log 0.7 28 1.4634 0.1722 1.4634 1.2097
No log 0.75 30 1.5295 0.2164 1.5295 1.2367
No log 0.8 32 1.2002 0.2191 1.2002 1.0955
No log 0.85 34 1.0665 0.3371 1.0665 1.0327
No log 0.9 36 1.0964 0.3674 1.0964 1.0471
No log 0.95 38 1.1044 0.4050 1.1044 1.0509
No log 1.0 40 1.0772 0.3781 1.0772 1.0379
No log 1.05 42 1.1233 0.3644 1.1233 1.0599
No log 1.1 44 1.0593 0.4948 1.0593 1.0292
No log 1.15 46 1.0160 0.4503 1.0160 1.0080
No log 1.2 48 1.0377 0.3922 1.0377 1.0187
No log 1.25 50 0.9437 0.4981 0.9437 0.9714
No log 1.3 52 0.9113 0.5513 0.9113 0.9546
No log 1.35 54 0.9264 0.5334 0.9264 0.9625
No log 1.4 56 0.8518 0.5671 0.8518 0.9229
No log 1.45 58 0.8337 0.6067 0.8337 0.9131
No log 1.5 60 0.7965 0.5697 0.7965 0.8925
No log 1.55 62 1.0013 0.4668 1.0013 1.0006
No log 1.6 64 1.0930 0.3098 1.0930 1.0455
No log 1.65 66 0.9459 0.5262 0.9459 0.9726
No log 1.7 68 0.9212 0.5749 0.9212 0.9598
No log 1.75 70 1.0490 0.5667 1.0490 1.0242
No log 1.8 72 1.0615 0.5666 1.0615 1.0303
No log 1.85 74 0.9574 0.6069 0.9574 0.9785
No log 1.9 76 0.9299 0.5434 0.9299 0.9643
No log 1.95 78 1.0029 0.5421 1.0029 1.0014
No log 2.0 80 0.9507 0.5361 0.9507 0.9751
No log 2.05 82 0.9186 0.5587 0.9186 0.9584
No log 2.1 84 1.1057 0.4586 1.1057 1.0515
No log 2.15 86 1.0343 0.5280 1.0343 1.0170
No log 2.2 88 0.8806 0.4910 0.8806 0.9384
No log 2.25 90 0.8834 0.5515 0.8834 0.9399
No log 2.3 92 0.8925 0.5429 0.8925 0.9447
No log 2.35 94 0.8638 0.5621 0.8638 0.9294
No log 2.4 96 0.8505 0.5917 0.8505 0.9222
No log 2.45 98 0.8501 0.5208 0.8501 0.9220
No log 2.5 100 0.9401 0.5187 0.9401 0.9696
No log 2.55 102 0.8950 0.4494 0.8950 0.9461
No log 2.6 104 0.9661 0.4879 0.9661 0.9829
No log 2.65 106 1.0707 0.4402 1.0707 1.0347
No log 2.7 108 0.9260 0.5708 0.9260 0.9623
No log 2.75 110 0.8534 0.4719 0.8534 0.9238
No log 2.8 112 0.8713 0.4730 0.8713 0.9334
No log 2.85 114 0.8024 0.5673 0.8024 0.8958
No log 2.9 116 0.8174 0.6023 0.8174 0.9041
No log 2.95 118 0.8590 0.5743 0.8590 0.9268
No log 3.0 120 0.9613 0.4658 0.9613 0.9805
No log 3.05 122 1.1090 0.4203 1.1090 1.0531
No log 3.1 124 1.1075 0.4125 1.1075 1.0524
No log 3.15 126 0.9193 0.5621 0.9193 0.9588
No log 3.2 128 0.9269 0.5390 0.9269 0.9627
No log 3.25 130 0.9281 0.5394 0.9281 0.9634
No log 3.3 132 0.8354 0.5411 0.8354 0.9140
No log 3.35 134 0.7948 0.5271 0.7948 0.8915
No log 3.4 136 0.8920 0.5455 0.8920 0.9445
No log 3.45 138 0.8332 0.5355 0.8332 0.9128
No log 3.5 140 0.8482 0.5434 0.8482 0.9210
No log 3.55 142 0.9592 0.5052 0.9592 0.9794
No log 3.6 144 0.8443 0.5414 0.8443 0.9188
No log 3.65 146 0.8156 0.5486 0.8156 0.9031
No log 3.7 148 0.8234 0.5392 0.8234 0.9074
No log 3.75 150 0.8524 0.4301 0.8524 0.9233
No log 3.8 152 0.8311 0.5686 0.8311 0.9116
No log 3.85 154 0.9332 0.5201 0.9332 0.9660
No log 3.9 156 0.9206 0.4788 0.9206 0.9595
No log 3.95 158 0.8636 0.4628 0.8636 0.9293
No log 4.0 160 0.8484 0.5216 0.8484 0.9211
No log 4.05 162 0.8416 0.5320 0.8416 0.9174
No log 4.1 164 1.0038 0.4531 1.0038 1.0019
No log 4.15 166 0.9648 0.4772 0.9648 0.9822
No log 4.2 168 0.8774 0.4994 0.8774 0.9367
No log 4.25 170 0.8872 0.5736 0.8872 0.9419
No log 4.3 172 0.9546 0.4933 0.9546 0.9770
No log 4.35 174 0.8838 0.5255 0.8838 0.9401
No log 4.4 176 0.8145 0.4898 0.8145 0.9025
No log 4.45 178 0.9841 0.3509 0.9841 0.9920
No log 4.5 180 0.9570 0.4005 0.9570 0.9782
No log 4.55 182 0.8006 0.5673 0.8006 0.8948
No log 4.6 184 0.9477 0.4436 0.9477 0.9735
No log 4.65 186 0.9554 0.4440 0.9554 0.9774
No log 4.7 188 0.8485 0.5473 0.8485 0.9211
No log 4.75 190 0.7852 0.5251 0.7852 0.8861
No log 4.8 192 0.8043 0.5848 0.8043 0.8968
No log 4.85 194 0.7973 0.5645 0.7973 0.8929
No log 4.9 196 0.8365 0.5921 0.8365 0.9146
No log 4.95 198 0.9785 0.4857 0.9785 0.9892
No log 5.0 200 1.0803 0.4968 1.0803 1.0394
No log 5.05 202 0.9604 0.5614 0.9604 0.9800
No log 5.1 204 0.9099 0.5871 0.9099 0.9539
No log 5.15 206 0.9043 0.5183 0.9043 0.9509
No log 5.2 208 0.8574 0.5473 0.8574 0.9259
No log 5.25 210 0.8059 0.5184 0.8059 0.8977
No log 5.3 212 0.8111 0.5752 0.8111 0.9006
No log 5.35 214 0.8708 0.5660 0.8708 0.9331
No log 5.4 216 0.8324 0.6233 0.8324 0.9124
No log 5.45 218 0.7510 0.6107 0.7510 0.8666
No log 5.5 220 0.7974 0.6689 0.7974 0.8930
No log 5.55 222 0.7870 0.6371 0.7870 0.8871
No log 5.6 224 0.7467 0.5866 0.7467 0.8641
No log 5.65 226 0.7482 0.5532 0.7482 0.8650
No log 5.7 228 0.7935 0.5428 0.7935 0.8908
No log 5.75 230 0.8052 0.5471 0.8052 0.8973
No log 5.8 232 0.7919 0.4979 0.7919 0.8899
No log 5.85 234 0.7951 0.3821 0.7951 0.8917
No log 5.9 236 0.8268 0.4371 0.8268 0.9093
No log 5.95 238 0.8001 0.4098 0.8001 0.8945
No log 6.0 240 0.7854 0.5688 0.7854 0.8862
No log 6.05 242 0.8329 0.5815 0.8329 0.9126
No log 6.1 244 0.9618 0.4989 0.9618 0.9807
No log 6.15 246 1.0756 0.4592 1.0756 1.0371
No log 6.2 248 0.9544 0.4909 0.9544 0.9769
No log 6.25 250 0.7816 0.5892 0.7816 0.8841
No log 6.3 252 0.7768 0.5711 0.7768 0.8814
No log 6.35 254 0.7874 0.6142 0.7874 0.8874
No log 6.4 256 0.8488 0.5578 0.8488 0.9213
No log 6.45 258 0.9157 0.4921 0.9157 0.9569
No log 6.5 260 0.8988 0.5124 0.8988 0.9480
No log 6.55 262 0.7997 0.5253 0.7997 0.8943
No log 6.6 264 0.7591 0.5242 0.7591 0.8712
No log 6.65 266 0.7931 0.5403 0.7931 0.8905
No log 6.7 268 0.7868 0.5447 0.7868 0.8870
No log 6.75 270 0.8048 0.6196 0.8048 0.8971
No log 6.8 272 0.9963 0.5310 0.9963 0.9981
No log 6.85 274 1.0654 0.4708 1.0654 1.0322
No log 6.9 276 1.0002 0.4820 1.0002 1.0001
No log 6.95 278 0.8249 0.5625 0.8249 0.9083
No log 7.0 280 0.7598 0.5408 0.7598 0.8717
No log 7.05 282 0.7665 0.5467 0.7665 0.8755
No log 7.1 284 0.7594 0.4893 0.7594 0.8714
No log 7.15 286 0.7373 0.5526 0.7373 0.8587
No log 7.2 288 0.7594 0.6336 0.7594 0.8715
No log 7.25 290 0.8257 0.5685 0.8257 0.9087
No log 7.3 292 0.9022 0.5310 0.9022 0.9498
No log 7.35 294 0.8328 0.5685 0.8328 0.9126
No log 7.4 296 0.7588 0.5806 0.7588 0.8711
No log 7.45 298 0.7577 0.5688 0.7577 0.8705
No log 7.5 300 0.8046 0.5537 0.8046 0.8970
No log 7.55 302 0.8239 0.5118 0.8239 0.9077
No log 7.6 304 0.8682 0.4781 0.8682 0.9318
No log 7.65 306 0.8986 0.4527 0.8986 0.9479
No log 7.7 308 0.8427 0.4911 0.8427 0.9180
No log 7.75 310 0.7457 0.6098 0.7457 0.8635
No log 7.8 312 0.7291 0.6292 0.7291 0.8539
No log 7.85 314 0.7374 0.6449 0.7374 0.8587
No log 7.9 316 0.7929 0.6190 0.7929 0.8904
No log 7.95 318 0.9051 0.5471 0.9051 0.9513
No log 8.0 320 0.9678 0.4891 0.9678 0.9838
No log 8.05 322 0.8873 0.5208 0.8873 0.9420
No log 8.1 324 0.7701 0.5836 0.7701 0.8776
No log 8.15 326 0.7159 0.6449 0.7159 0.8461
No log 8.2 328 0.7049 0.6468 0.7049 0.8396
No log 8.25 330 0.6991 0.6487 0.6991 0.8361
No log 8.3 332 0.7235 0.6395 0.7235 0.8506
No log 8.35 334 0.8387 0.4986 0.8387 0.9158
No log 8.4 336 0.9114 0.4813 0.9114 0.9547
No log 8.45 338 0.8216 0.4898 0.8216 0.9064
No log 8.5 340 0.7173 0.5815 0.7173 0.8469
No log 8.55 342 0.7438 0.5747 0.7438 0.8624
No log 8.6 344 0.8070 0.5482 0.8070 0.8983
No log 8.65 346 0.7713 0.5524 0.7713 0.8782
No log 8.7 348 0.6970 0.6280 0.6970 0.8348
No log 8.75 350 0.7189 0.5992 0.7189 0.8479
No log 8.8 352 0.7528 0.6154 0.7528 0.8677
No log 8.85 354 0.7138 0.6259 0.7138 0.8449
No log 8.9 356 0.6942 0.6613 0.6942 0.8332
No log 8.95 358 0.7161 0.6017 0.7161 0.8462
No log 9.0 360 0.7402 0.5720 0.7402 0.8604
No log 9.05 362 0.7320 0.6017 0.7320 0.8556
No log 9.1 364 0.7325 0.5858 0.7325 0.8559
No log 9.15 366 0.7359 0.5931 0.7359 0.8578
No log 9.2 368 0.7336 0.6057 0.7336 0.8565
No log 9.25 370 0.7165 0.5374 0.7165 0.8465
No log 9.3 372 0.7236 0.5759 0.7236 0.8506
No log 9.35 374 0.7031 0.5971 0.7031 0.8385
No log 9.4 376 0.7045 0.5841 0.7045 0.8394
No log 9.45 378 0.6992 0.5869 0.6992 0.8362
No log 9.5 380 0.7043 0.5847 0.7043 0.8392
No log 9.55 382 0.6873 0.5722 0.6873 0.8290
No log 9.6 384 0.6937 0.5869 0.6937 0.8329
No log 9.65 386 0.7188 0.5899 0.7188 0.8478
No log 9.7 388 0.6930 0.6249 0.6930 0.8325
No log 9.75 390 0.6725 0.6010 0.6725 0.8201
No log 9.8 392 0.6725 0.6468 0.6725 0.8201
No log 9.85 394 0.6725 0.6450 0.6725 0.8200
No log 9.9 396 0.6989 0.6290 0.6989 0.8360
No log 9.95 398 0.6986 0.6266 0.6986 0.8358
No log 10.0 400 0.7120 0.5947 0.7120 0.8438
No log 10.05 402 0.7311 0.5975 0.7311 0.8550
No log 10.1 404 0.7328 0.5975 0.7328 0.8561
No log 10.15 406 0.6978 0.6110 0.6978 0.8354
No log 10.2 408 0.6907 0.6120 0.6907 0.8311
No log 10.25 410 0.7076 0.6142 0.7076 0.8412
No log 10.3 412 0.6968 0.6142 0.6968 0.8348
No log 10.35 414 0.6795 0.6950 0.6795 0.8243
No log 10.4 416 0.6814 0.6721 0.6814 0.8255
No log 10.45 418 0.6946 0.6285 0.6946 0.8334
No log 10.5 420 0.7257 0.5880 0.7257 0.8519
No log 10.55 422 0.8147 0.5183 0.8147 0.9026
No log 10.6 424 0.9272 0.5384 0.9272 0.9629
No log 10.65 426 0.9796 0.4794 0.9796 0.9898
No log 10.7 428 0.9917 0.4528 0.9917 0.9958
No log 10.75 430 0.9822 0.4810 0.9822 0.9911
No log 10.8 432 0.8381 0.5250 0.8381 0.9155
No log 10.85 434 0.7392 0.5493 0.7392 0.8597
No log 10.9 436 0.6975 0.5524 0.6975 0.8351
No log 10.95 438 0.6871 0.6646 0.6871 0.8289
No log 11.0 440 0.6763 0.7066 0.6763 0.8224
No log 11.05 442 0.7003 0.6918 0.7003 0.8368
No log 11.1 444 0.7234 0.6285 0.7234 0.8505
No log 11.15 446 0.6845 0.6449 0.6845 0.8274
No log 11.2 448 0.6753 0.6506 0.6753 0.8218
No log 11.25 450 0.6762 0.6292 0.6762 0.8223
No log 11.3 452 0.7158 0.6441 0.7158 0.8460
No log 11.35 454 0.8909 0.5626 0.8909 0.9439
No log 11.4 456 0.9723 0.5508 0.9723 0.9861
No log 11.45 458 0.8714 0.5670 0.8714 0.9335
No log 11.5 460 0.7304 0.6441 0.7304 0.8546
No log 11.55 462 0.6960 0.6107 0.6960 0.8342
No log 11.6 464 0.7262 0.5618 0.7262 0.8522
No log 11.65 466 0.7085 0.5909 0.7085 0.8417
No log 11.7 468 0.7015 0.5892 0.7015 0.8375
No log 11.75 470 0.7581 0.5954 0.7581 0.8707
No log 11.8 472 0.8974 0.5570 0.8974 0.9473
No log 11.85 474 0.9267 0.5702 0.9267 0.9627
No log 11.9 476 0.8369 0.6052 0.8369 0.9148
No log 11.95 478 0.7216 0.6590 0.7216 0.8495
No log 12.0 480 0.7192 0.5835 0.7192 0.8481
No log 12.05 482 0.7782 0.5393 0.7782 0.8822
No log 12.1 484 0.7806 0.5710 0.7806 0.8835
No log 12.15 486 0.7251 0.6059 0.7251 0.8515
No log 12.2 488 0.7201 0.6319 0.7201 0.8486
No log 12.25 490 0.8410 0.5827 0.8410 0.9170
No log 12.3 492 1.0343 0.4939 1.0343 1.0170
No log 12.35 494 1.1018 0.4731 1.1018 1.0497
No log 12.4 496 1.0696 0.4749 1.0696 1.0342
No log 12.45 498 0.9603 0.5181 0.9603 0.9799
0.3117 12.5 500 0.8153 0.5000 0.8153 0.9029
0.3117 12.55 502 0.7131 0.5213 0.7131 0.8444
0.3117 12.6 504 0.7198 0.5714 0.7198 0.8484
0.3117 12.65 506 0.7434 0.5618 0.7434 0.8622
0.3117 12.7 508 0.7388 0.5911 0.7388 0.8596
0.3117 12.75 510 0.7198 0.6563 0.7198 0.8484
0.3117 12.8 512 0.7918 0.6384 0.7918 0.8898
0.3117 12.85 514 0.8970 0.5416 0.8970 0.9471
0.3117 12.9 516 0.9053 0.5412 0.9053 0.9515
0.3117 12.95 518 0.8296 0.5849 0.8296 0.9108
0.3117 13.0 520 0.7902 0.5401 0.7902 0.8890
0.3117 13.05 522 0.7684 0.5239 0.7684 0.8766
0.3117 13.1 524 0.7369 0.5203 0.7369 0.8584
0.3117 13.15 526 0.7265 0.5528 0.7265 0.8523
0.3117 13.2 528 0.7322 0.5214 0.7322 0.8557
0.3117 13.25 530 0.7588 0.5550 0.7588 0.8711
0.3117 13.3 532 0.8289 0.5871 0.8289 0.9105
0.3117 13.35 534 0.8814 0.5592 0.8814 0.9389
0.3117 13.4 536 0.8732 0.5871 0.8732 0.9344
0.3117 13.45 538 0.8387 0.5673 0.8387 0.9158
0.3117 13.5 540 0.8231 0.5722 0.8231 0.9072
0.3117 13.55 542 0.8178 0.5856 0.8178 0.9043
0.3117 13.6 544 0.8159 0.5509 0.8159 0.9033
0.3117 13.65 546 0.8095 0.5153 0.8095 0.8997
0.3117 13.7 548 0.7945 0.5559 0.7945 0.8914
0.3117 13.75 550 0.7796 0.5807 0.7796 0.8830
0.3117 13.8 552 0.7565 0.5821 0.7565 0.8698
0.3117 13.85 554 0.7623 0.5633 0.7623 0.8731
0.3117 13.9 556 0.7709 0.5534 0.7709 0.8780
0.3117 13.95 558 0.8064 0.4472 0.8064 0.8980
0.3117 14.0 560 0.7994 0.4817 0.7994 0.8941
0.3117 14.05 562 0.8167 0.4115 0.8167 0.9037
0.3117 14.1 564 0.8706 0.4051 0.8706 0.9331
0.3117 14.15 566 0.8966 0.4450 0.8966 0.9469

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B parameters (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k7_task2_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.