ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k11_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a brief metric sketch follows the list):

  • Loss: 0.6314
  • Qwk: 0.6446
  • Mse: 0.6314
  • Rmse: 0.7946
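
The metrics above can be reproduced along the following lines. This is a minimal sketch, not the original evaluation code: it assumes raw regression outputs for MSE/RMSE and integer-rounded scores for Qwk (Quadratic Weighted Kappa).

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds, labels):
    """Compute Qwk, MSE and RMSE for a batch of predicted essay scores."""
    preds = np.asarray(preds, dtype=float)
    labels = np.asarray(labels, dtype=float)

    mse = mean_squared_error(labels, preds)
    rmse = float(np.sqrt(mse))
    # Quadratic Weighted Kappa is defined on discrete ratings, so the
    # continuous predictions are rounded first (an assumption of this sketch).
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": rmse}
```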

Model description

More information needed

Intended uses & limitations

More information needed
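
Pending fuller documentation, the snippet below is a minimal usage sketch. It assumes the checkpoint exposes a single-label sequence-classification (regression) head on top of AraBERT, which is consistent with the Loss and MSE values reported above; verify the head configuration before relying on the scores.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k11_task5_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # an Arabic essay to be scored for organization (placeholder)
inputs = tokenizer(essay, truncation=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# With a single-label regression head, the lone logit is the predicted score.
score = outputs.logits.squeeze().item()
print(score)
```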

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
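
A configuration sketch matching these values is shown below. It uses the standard Hugging Face Trainer API and is an illustration rather than the original training script; the dataset objects, the compute_metrics function, and the single-label regression head (num_labels=1) are assumptions.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base_model = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=1)

training_args = TrainingArguments(
    output_dir="./results",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",  # the Adam betas/epsilon listed above are the defaults
)

trainer = Trainer(
    model=model,
    args=training_args,
    # train_dataset=..., eval_dataset=..., and compute_metrics=... must be
    # supplied before calling trainer.train(); they are omitted here.
)
```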

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0364 2 4.0474 -0.0047 4.0474 2.0118
No log 0.0727 4 2.9943 -0.0242 2.9943 1.7304
No log 0.1091 6 1.4571 0.0380 1.4571 1.2071
No log 0.1455 8 1.1920 0.0761 1.1920 1.0918
No log 0.1818 10 1.1679 0.1408 1.1679 1.0807
No log 0.2182 12 1.1304 0.1685 1.1304 1.0632
No log 0.2545 14 1.1907 0.1711 1.1907 1.0912
No log 0.2909 16 1.5238 0.1347 1.5238 1.2344
No log 0.3273 18 1.5221 0.1803 1.5221 1.2337
No log 0.3636 20 0.9406 0.4157 0.9406 0.9698
No log 0.4 22 0.7539 0.4705 0.7539 0.8683
No log 0.4364 24 0.8486 0.4763 0.8486 0.9212
No log 0.4727 26 1.4172 0.3141 1.4172 1.1904
No log 0.5091 28 1.5808 0.2802 1.5808 1.2573
No log 0.5455 30 1.1555 0.4072 1.1555 1.0749
No log 0.5818 32 0.7472 0.5407 0.7472 0.8644
No log 0.6182 34 0.7870 0.5779 0.7870 0.8871
No log 0.6545 36 0.8817 0.5350 0.8817 0.9390
No log 0.6909 38 0.9488 0.4855 0.9488 0.9741
No log 0.7273 40 0.7792 0.6019 0.7792 0.8827
No log 0.7636 42 0.6724 0.5735 0.6724 0.8200
No log 0.8 44 0.8566 0.3849 0.8566 0.9255
No log 0.8364 46 0.8351 0.4931 0.8351 0.9138
No log 0.8727 48 0.7384 0.5025 0.7384 0.8593
No log 0.9091 50 0.6442 0.6311 0.6442 0.8026
No log 0.9455 52 0.6738 0.6100 0.6738 0.8208
No log 0.9818 54 0.7075 0.6123 0.7075 0.8411
No log 1.0182 56 0.6527 0.6110 0.6527 0.8079
No log 1.0545 58 0.6602 0.6254 0.6602 0.8125
No log 1.0909 60 0.7043 0.6222 0.7043 0.8392
No log 1.1273 62 0.7042 0.6195 0.7042 0.8391
No log 1.1636 64 0.7074 0.6138 0.7074 0.8411
No log 1.2 66 0.7500 0.6286 0.7500 0.8661
No log 1.2364 68 0.7906 0.5924 0.7906 0.8891
No log 1.2727 70 0.6532 0.6536 0.6532 0.8082
No log 1.3091 72 0.6557 0.6278 0.6557 0.8097
No log 1.3455 74 0.6470 0.6278 0.6470 0.8044
No log 1.3818 76 0.6591 0.6179 0.6591 0.8119
No log 1.4182 78 0.7035 0.5560 0.7035 0.8387
No log 1.4545 80 0.6466 0.6788 0.6466 0.8041
No log 1.4909 82 0.6691 0.6089 0.6691 0.8180
No log 1.5273 84 0.6697 0.5879 0.6697 0.8184
No log 1.5636 86 0.6398 0.6330 0.6398 0.7999
No log 1.6 88 0.6550 0.6409 0.6550 0.8093
No log 1.6364 90 0.6984 0.5340 0.6984 0.8357
No log 1.6727 92 0.6712 0.5879 0.6712 0.8193
No log 1.7091 94 0.6531 0.6227 0.6531 0.8081
No log 1.7455 96 0.6610 0.5530 0.6610 0.8130
No log 1.7818 98 0.6472 0.5274 0.6472 0.8045
No log 1.8182 100 0.6431 0.5902 0.6431 0.8019
No log 1.8545 102 0.6459 0.5638 0.6459 0.8037
No log 1.8909 104 0.7153 0.5803 0.7153 0.8457
No log 1.9273 106 0.6768 0.5984 0.6768 0.8227
No log 1.9636 108 0.6613 0.6602 0.6613 0.8132
No log 2.0 110 0.7316 0.6219 0.7316 0.8553
No log 2.0364 112 0.6803 0.6269 0.6803 0.8248
No log 2.0727 114 0.6030 0.6254 0.6030 0.7765
No log 2.1091 116 0.6256 0.6377 0.6256 0.7910
No log 2.1455 118 0.6323 0.6448 0.6323 0.7952
No log 2.1818 120 0.6212 0.6214 0.6212 0.7881
No log 2.2182 122 0.6263 0.6893 0.6263 0.7914
No log 2.2545 124 0.6319 0.6864 0.6319 0.7949
No log 2.2909 126 0.6738 0.6003 0.6738 0.8209
No log 2.3273 128 0.6259 0.6930 0.6259 0.7911
No log 2.3636 130 0.6615 0.6893 0.6615 0.8133
No log 2.4 132 0.7014 0.6437 0.7014 0.8375
No log 2.4364 134 0.6702 0.6818 0.6702 0.8186
No log 2.4727 136 0.7076 0.6184 0.7076 0.8412
No log 2.5091 138 0.7279 0.5835 0.7279 0.8532
No log 2.5455 140 0.6591 0.5815 0.6591 0.8118
No log 2.5818 142 0.6509 0.6826 0.6509 0.8068
No log 2.6182 144 0.6579 0.6679 0.6579 0.8111
No log 2.6545 146 0.6563 0.6519 0.6563 0.8101
No log 2.6909 148 0.6974 0.6341 0.6974 0.8351
No log 2.7273 150 0.8758 0.5729 0.8758 0.9359
No log 2.7636 152 0.8394 0.5931 0.8394 0.9162
No log 2.8 154 0.6771 0.6139 0.6771 0.8229
No log 2.8364 156 0.6516 0.6519 0.6516 0.8072
No log 2.8727 158 0.6870 0.5810 0.6870 0.8288
No log 2.9091 160 0.6220 0.6092 0.6220 0.7887
No log 2.9455 162 0.5856 0.6389 0.5856 0.7652
No log 2.9818 164 0.6053 0.6025 0.6053 0.7780
No log 3.0182 166 0.6653 0.6657 0.6653 0.8156
No log 3.0545 168 0.6545 0.6510 0.6545 0.8090
No log 3.0909 170 0.6521 0.6580 0.6521 0.8075
No log 3.1273 172 0.6380 0.6842 0.6380 0.7987
No log 3.1636 174 0.6129 0.5706 0.6129 0.7829
No log 3.2 176 0.7064 0.5729 0.7064 0.8405
No log 3.2364 178 0.6575 0.5815 0.6575 0.8108
No log 3.2727 180 0.5939 0.6796 0.5939 0.7706
No log 3.3091 182 0.6428 0.6275 0.6428 0.8018
No log 3.3455 184 0.6329 0.6157 0.6329 0.7955
No log 3.3818 186 0.6137 0.6249 0.6137 0.7834
No log 3.4182 188 0.6360 0.5854 0.6360 0.7975
No log 3.4545 190 0.6411 0.6499 0.6411 0.8007
No log 3.4909 192 0.6130 0.6007 0.6130 0.7830
No log 3.5273 194 0.6414 0.6464 0.6414 0.8009
No log 3.5636 196 0.6998 0.6004 0.6998 0.8365
No log 3.6 198 0.6490 0.6879 0.6490 0.8056
No log 3.6364 200 0.5932 0.6704 0.5932 0.7702
No log 3.6727 202 0.5813 0.6659 0.5813 0.7624
No log 3.7091 204 0.6004 0.7291 0.6004 0.7749
No log 3.7455 206 0.5974 0.7122 0.5974 0.7729
No log 3.7818 208 0.6180 0.6677 0.6180 0.7861
No log 3.8182 210 0.6258 0.6588 0.6258 0.7911
No log 3.8545 212 0.5662 0.7385 0.5662 0.7525
No log 3.8909 214 0.5475 0.6888 0.5475 0.7399
No log 3.9273 216 0.5617 0.6998 0.5617 0.7495
No log 3.9636 218 0.5504 0.6627 0.5504 0.7419
No log 4.0 220 0.6015 0.6334 0.6015 0.7755
No log 4.0364 222 0.5744 0.6334 0.5744 0.7579
No log 4.0727 224 0.5688 0.6650 0.5688 0.7542
No log 4.1091 226 0.6075 0.6932 0.6075 0.7794
No log 4.1455 228 0.6229 0.7189 0.6229 0.7892
No log 4.1818 230 0.6544 0.6408 0.6544 0.8089
No log 4.2182 232 0.6709 0.6563 0.6709 0.8191
No log 4.2545 234 0.6742 0.6982 0.6742 0.8211
No log 4.2909 236 0.7234 0.6180 0.7234 0.8505
No log 4.3273 238 0.7079 0.6066 0.7079 0.8413
No log 4.3636 240 0.6546 0.6356 0.6546 0.8091
No log 4.4 242 0.6281 0.6659 0.6281 0.7925
No log 4.4364 244 0.6698 0.6098 0.6698 0.8184
No log 4.4727 246 0.7610 0.6182 0.7610 0.8723
No log 4.5091 248 0.7090 0.6144 0.7090 0.8420
No log 4.5455 250 0.6104 0.6302 0.6104 0.7813
No log 4.5818 252 0.6153 0.5808 0.6153 0.7844
No log 4.6182 254 0.6449 0.6284 0.6449 0.8030
No log 4.6545 256 0.6242 0.5997 0.6242 0.7900
No log 4.6909 258 0.5947 0.6509 0.5947 0.7711
No log 4.7273 260 0.6084 0.6215 0.6084 0.7800
No log 4.7636 262 0.6086 0.6284 0.6086 0.7801
No log 4.8 264 0.6359 0.6435 0.6359 0.7975
No log 4.8364 266 0.6316 0.6452 0.6316 0.7947
No log 4.8727 268 0.6234 0.6433 0.6234 0.7895
No log 4.9091 270 0.6777 0.6063 0.6777 0.8232
No log 4.9455 272 0.7403 0.6071 0.7403 0.8604
No log 4.9818 274 0.7047 0.6310 0.7047 0.8395
No log 5.0182 276 0.6416 0.6455 0.6416 0.8010
No log 5.0545 278 0.6391 0.6626 0.6391 0.7994
No log 5.0909 280 0.6627 0.6573 0.6627 0.8141
No log 5.1273 282 0.6649 0.6815 0.6649 0.8154
No log 5.1636 284 0.6193 0.6617 0.6193 0.7870
No log 5.2 286 0.5905 0.6561 0.5905 0.7685
No log 5.2364 288 0.5978 0.6869 0.5978 0.7732
No log 5.2727 290 0.5742 0.6581 0.5742 0.7578
No log 5.3091 292 0.5860 0.6936 0.5860 0.7655
No log 5.3455 294 0.6936 0.6236 0.6936 0.8328
No log 5.3818 296 0.7493 0.6120 0.7493 0.8656
No log 5.4182 298 0.7019 0.6313 0.7019 0.8378
No log 5.4545 300 0.5809 0.7094 0.5809 0.7622
No log 5.4909 302 0.5739 0.6840 0.5739 0.7576
No log 5.5273 304 0.6609 0.6565 0.6609 0.8130
No log 5.5636 306 0.6595 0.6627 0.6595 0.8121
No log 5.6 308 0.5777 0.6721 0.5777 0.7600
No log 5.6364 310 0.5640 0.6247 0.5640 0.7510
No log 5.6727 312 0.7236 0.5532 0.7236 0.8507
No log 5.7091 314 0.8362 0.6018 0.8362 0.9144
No log 5.7455 316 0.7921 0.6352 0.7921 0.8900
No log 5.7818 318 0.6776 0.6241 0.6775 0.8231
No log 5.8182 320 0.6413 0.6828 0.6413 0.8008
No log 5.8545 322 0.6380 0.6969 0.6380 0.7988
No log 5.8909 324 0.6598 0.6511 0.6598 0.8123
No log 5.9273 326 0.7417 0.6503 0.7417 0.8612
No log 5.9636 328 0.7339 0.6439 0.7339 0.8567
No log 6.0 330 0.6356 0.6215 0.6356 0.7972
No log 6.0364 332 0.6019 0.6949 0.6019 0.7758
No log 6.0727 334 0.6136 0.6565 0.6136 0.7834
No log 6.1091 336 0.5868 0.7033 0.5868 0.7660
No log 6.1455 338 0.5678 0.6619 0.5678 0.7535
No log 6.1818 340 0.5571 0.6822 0.5571 0.7464
No log 6.2182 342 0.5544 0.6518 0.5544 0.7445
No log 6.2545 344 0.5637 0.6828 0.5637 0.7508
No log 6.2909 346 0.5694 0.6733 0.5694 0.7546
No log 6.3273 348 0.5710 0.6341 0.5710 0.7557
No log 6.3636 350 0.5692 0.6365 0.5692 0.7545
No log 6.4 352 0.5650 0.6528 0.5650 0.7517
No log 6.4364 354 0.5742 0.6623 0.5742 0.7577
No log 6.4727 356 0.5825 0.6728 0.5825 0.7632
No log 6.5091 358 0.5671 0.6229 0.5671 0.7531
No log 6.5455 360 0.5799 0.6398 0.5799 0.7615
No log 6.5818 362 0.6047 0.6687 0.6047 0.7776
No log 6.6182 364 0.6053 0.6687 0.6053 0.7780
No log 6.6545 366 0.5895 0.6350 0.5895 0.7678
No log 6.6909 368 0.5760 0.6546 0.5760 0.7589
No log 6.7273 370 0.5636 0.6970 0.5636 0.7507
No log 6.7636 372 0.5650 0.7019 0.5650 0.7517
No log 6.8 374 0.5709 0.7133 0.5709 0.7556
No log 6.8364 376 0.5697 0.6944 0.5697 0.7548
No log 6.8727 378 0.6218 0.6867 0.6218 0.7885
No log 6.9091 380 0.6745 0.6858 0.6745 0.8213
No log 6.9455 382 0.6640 0.7005 0.6640 0.8148
No log 6.9818 384 0.6177 0.6610 0.6177 0.7859
No log 7.0182 386 0.6167 0.6383 0.6167 0.7853
No log 7.0545 388 0.6336 0.6509 0.6336 0.7960
No log 7.0909 390 0.6412 0.6430 0.6412 0.8007
No log 7.1273 392 0.6497 0.6157 0.6497 0.8060
No log 7.1636 394 0.6541 0.6291 0.6541 0.8088
No log 7.2 396 0.6227 0.6025 0.6227 0.7891
No log 7.2364 398 0.6244 0.5274 0.6244 0.7902
No log 7.2727 400 0.6220 0.5759 0.6220 0.7887
No log 7.3091 402 0.5762 0.5820 0.5762 0.7591
No log 7.3455 404 0.5736 0.6337 0.5736 0.7574
No log 7.3818 406 0.5633 0.6581 0.5633 0.7505
No log 7.4182 408 0.5335 0.7225 0.5335 0.7304
No log 7.4545 410 0.5778 0.6817 0.5778 0.7602
No log 7.4909 412 0.6117 0.6966 0.6117 0.7821
No log 7.5273 414 0.5974 0.7232 0.5974 0.7729
No log 7.5636 416 0.5934 0.7116 0.5934 0.7703
No log 7.6 418 0.5753 0.6667 0.5753 0.7585
No log 7.6364 420 0.5704 0.6562 0.5704 0.7552
No log 7.6727 422 0.5956 0.6371 0.5956 0.7717
No log 7.7091 424 0.6480 0.6521 0.6480 0.8050
No log 7.7455 426 0.6287 0.6394 0.6287 0.7929
No log 7.7818 428 0.5721 0.6708 0.5721 0.7563
No log 7.8182 430 0.5322 0.6345 0.5322 0.7295
No log 7.8545 432 0.5332 0.7097 0.5332 0.7302
No log 7.8909 434 0.5388 0.7097 0.5388 0.7340
No log 7.9273 436 0.5386 0.6697 0.5386 0.7339
No log 7.9636 438 0.5498 0.6642 0.5498 0.7415
No log 8.0 440 0.5451 0.6680 0.5451 0.7383
No log 8.0364 442 0.5380 0.6830 0.5380 0.7335
No log 8.0727 444 0.5403 0.6985 0.5403 0.7351
No log 8.1091 446 0.5473 0.6985 0.5473 0.7398
No log 8.1455 448 0.5564 0.6978 0.5564 0.7459
No log 8.1818 450 0.5668 0.6866 0.5668 0.7529
No log 8.2182 452 0.6156 0.6350 0.6156 0.7846
No log 8.2545 454 0.6386 0.5787 0.6386 0.7991
No log 8.2909 456 0.6162 0.5909 0.6162 0.7850
No log 8.3273 458 0.5749 0.6154 0.5749 0.7582
No log 8.3636 460 0.5618 0.6762 0.5618 0.7495
No log 8.4 462 0.5736 0.6186 0.5736 0.7574
No log 8.4364 464 0.5859 0.6157 0.5859 0.7654
No log 8.4727 466 0.5818 0.6622 0.5818 0.7628
No log 8.5091 468 0.5706 0.6383 0.5706 0.7554
No log 8.5455 470 0.5657 0.6424 0.5657 0.7521
No log 8.5818 472 0.5739 0.6398 0.5739 0.7576
No log 8.6182 474 0.5999 0.6186 0.5999 0.7746
No log 8.6545 476 0.6280 0.6293 0.6280 0.7925
No log 8.6909 478 0.6626 0.6698 0.6626 0.8140
No log 8.7273 480 0.6499 0.6081 0.6499 0.8062
No log 8.7636 482 0.6354 0.6218 0.6354 0.7971
No log 8.8 484 0.6307 0.6634 0.6307 0.7942
No log 8.8364 486 0.5928 0.6701 0.5928 0.7699
No log 8.8727 488 0.5742 0.6598 0.5742 0.7577
No log 8.9091 490 0.5930 0.6886 0.5930 0.7700
No log 8.9455 492 0.5814 0.6945 0.5814 0.7625
No log 8.9818 494 0.5605 0.6788 0.5605 0.7487
No log 9.0182 496 0.5686 0.6439 0.5686 0.7541
No log 9.0545 498 0.5794 0.6301 0.5794 0.7612
0.2749 9.0909 500 0.5887 0.6628 0.5887 0.7673
0.2749 9.1273 502 0.5984 0.6628 0.5984 0.7736
0.2749 9.1636 504 0.6078 0.6628 0.6078 0.7796
0.2749 9.2 506 0.6340 0.6459 0.6340 0.7962
0.2749 9.2364 508 0.6480 0.6142 0.6480 0.8050
0.2749 9.2727 510 0.6314 0.6446 0.6314 0.7946

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1