ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k6_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8186
  • Qwk (quadratic weighted kappa): 0.4575
  • Mse (mean squared error): 0.8186
  • Rmse (root mean squared error): 0.9047
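The card does not ship evaluation code; as a reference, the two headline metrics above (quadratic weighted kappa for ordinal agreement, and RMSE) can be sketched in pure Python. Function names here are illustrative, not from the original training script:

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Agreement between two ordinal ratings, with quadratic distance weights.
    1.0 = perfect agreement, 0.0 = chance-level, negative = worse than chance."""
    # Observed confusion matrix.
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    n = len(y_true)
    hist_true = Counter(y_true)
    hist_pred = Counter(y_pred)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic penalty
            expected = hist_true[i] * hist_pred[j] / n  # counts under independence
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

def rmse(y_true, y_pred):
    """Root mean squared error between two score sequences."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

Note that the reported Rmse is simply the square root of the reported Mse (√0.8186 ≈ 0.9047), which suggests both are computed from the same squared-error statistic.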

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
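With lr_scheduler_type "linear", the learning rate decays linearly from its peak to zero over the planned number of training steps. Judging by the table below, one epoch is 34 optimizer steps, so 100 epochs would plan for 3,400 steps (training appears to have stopped early, around step 560). A rough sketch of that schedule, assuming no warmup (warmup steps are not listed in the card):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear-decay learning-rate schedule: ramp up over warmup_steps,
    then decay linearly from base_lr to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# With the card's settings: peak 2e-05 at the start, half that midway, zero at the end.
```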

Training results

Training loss was logged every 500 steps, so "No log" marks evaluation points before the first logging step.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0588 2 4.5977 0.0010 4.5977 2.1442
No log 0.1176 4 2.7399 -0.0559 2.7399 1.6553
No log 0.1765 6 1.5923 0.0504 1.5923 1.2619
No log 0.2353 8 1.4511 -0.0109 1.4511 1.2046
No log 0.2941 10 1.3805 -0.1024 1.3805 1.1750
No log 0.3529 12 1.1662 0.2023 1.1662 1.0799
No log 0.4118 14 1.2503 0.1247 1.2503 1.1182
No log 0.4706 16 1.3097 0.1271 1.3097 1.1444
No log 0.5294 18 1.7575 0.0 1.7575 1.3257
No log 0.5882 20 1.8796 0.0 1.8796 1.3710
No log 0.6471 22 1.4791 -0.0300 1.4791 1.2162
No log 0.7059 24 1.2430 0.1696 1.2430 1.1149
No log 0.7647 26 1.1427 0.1649 1.1427 1.0690
No log 0.8235 28 1.1456 0.1645 1.1456 1.0703
No log 0.8824 30 1.0858 0.3151 1.0858 1.0420
No log 0.9412 32 1.3339 0.1224 1.3339 1.1549
No log 1.0 34 1.3803 0.1166 1.3803 1.1749
No log 1.0588 36 1.2385 0.1427 1.2385 1.1129
No log 1.1176 38 1.0337 0.3946 1.0337 1.0167
No log 1.1765 40 0.9466 0.3547 0.9466 0.9729
No log 1.2353 42 1.0196 0.3325 1.0196 1.0098
No log 1.2941 44 0.9883 0.3892 0.9883 0.9941
No log 1.3529 46 0.9495 0.4120 0.9495 0.9744
No log 1.4118 48 1.2205 0.2362 1.2205 1.1048
No log 1.4706 50 1.3552 0.2902 1.3552 1.1641
No log 1.5294 52 1.1508 0.3536 1.1508 1.0728
No log 1.5882 54 1.0143 0.5345 1.0143 1.0071
No log 1.6471 56 0.8881 0.5740 0.8881 0.9424
No log 1.7059 58 0.8894 0.5961 0.8894 0.9431
No log 1.7647 60 0.8777 0.6089 0.8777 0.9368
No log 1.8235 62 0.9449 0.5850 0.9449 0.9721
No log 1.8824 64 1.0086 0.5504 1.0086 1.0043
No log 1.9412 66 1.1363 0.4093 1.1363 1.0660
No log 2.0 68 1.2531 0.3998 1.2531 1.1194
No log 2.0588 70 1.1421 0.3687 1.1421 1.0687
No log 2.1176 72 0.9113 0.5566 0.9113 0.9546
No log 2.1765 74 0.8197 0.4966 0.8197 0.9054
No log 2.2353 76 0.7965 0.5432 0.7965 0.8925
No log 2.2941 78 0.7911 0.5940 0.7911 0.8894
No log 2.3529 80 0.7664 0.6070 0.7664 0.8754
No log 2.4118 82 0.7399 0.6154 0.7399 0.8602
No log 2.4706 84 0.7354 0.6340 0.7354 0.8575
No log 2.5294 86 0.7404 0.5940 0.7404 0.8604
No log 2.5882 88 0.7363 0.6019 0.7363 0.8581
No log 2.6471 90 0.7360 0.6292 0.7360 0.8579
No log 2.7059 92 0.7369 0.6460 0.7369 0.8584
No log 2.7647 94 0.7336 0.6417 0.7336 0.8565
No log 2.8235 96 0.8693 0.5444 0.8693 0.9324
No log 2.8824 98 0.9672 0.4812 0.9672 0.9835
No log 2.9412 100 0.8762 0.5210 0.8762 0.9361
No log 3.0 102 0.8549 0.4976 0.8549 0.9246
No log 3.0588 104 0.8604 0.5632 0.8604 0.9276
No log 3.1176 106 0.8730 0.5937 0.8730 0.9343
No log 3.1765 108 1.0537 0.5194 1.0537 1.0265
No log 3.2353 110 1.0372 0.5367 1.0372 1.0184
No log 3.2941 112 1.0089 0.5393 1.0089 1.0045
No log 3.3529 114 0.9072 0.5915 0.9072 0.9524
No log 3.4118 116 0.8705 0.6310 0.8705 0.9330
No log 3.4706 118 0.8600 0.6227 0.8600 0.9274
No log 3.5294 120 0.8352 0.5574 0.8352 0.9139
No log 3.5882 122 0.8468 0.4981 0.8468 0.9202
No log 3.6471 124 0.8673 0.4521 0.8673 0.9313
No log 3.7059 126 0.8906 0.5279 0.8906 0.9437
No log 3.7647 128 0.8491 0.5646 0.8491 0.9215
No log 3.8235 130 0.8738 0.5551 0.8738 0.9348
No log 3.8824 132 0.8891 0.5551 0.8891 0.9429
No log 3.9412 134 0.9544 0.5399 0.9544 0.9770
No log 4.0 136 0.9053 0.5680 0.9053 0.9514
No log 4.0588 138 0.8320 0.5198 0.8320 0.9121
No log 4.1176 140 0.8813 0.4802 0.8813 0.9388
No log 4.1765 142 0.8456 0.4802 0.8456 0.9196
No log 4.2353 144 0.8031 0.5115 0.8031 0.8962
No log 4.2941 146 0.8331 0.5523 0.8331 0.9127
No log 4.3529 148 0.8454 0.5411 0.8454 0.9194
No log 4.4118 150 0.8273 0.4982 0.8273 0.9096
No log 4.4706 152 0.8135 0.4633 0.8135 0.9020
No log 4.5294 154 0.8140 0.4750 0.8140 0.9022
No log 4.5882 156 0.8826 0.5542 0.8826 0.9394
No log 4.6471 158 0.9511 0.5375 0.9511 0.9753
No log 4.7059 160 0.9482 0.5375 0.9482 0.9738
No log 4.7647 162 0.9446 0.5414 0.9446 0.9719
No log 4.8235 164 0.8737 0.5370 0.8737 0.9347
No log 4.8824 166 0.8336 0.4885 0.8336 0.9130
No log 4.9412 168 0.8393 0.5137 0.8393 0.9161
No log 5.0 170 0.8776 0.5267 0.8776 0.9368
No log 5.0588 172 0.8389 0.4933 0.8389 0.9159
No log 5.1176 174 0.8622 0.4464 0.8622 0.9286
No log 5.1765 176 1.0171 0.5129 1.0171 1.0085
No log 5.2353 178 0.9353 0.4710 0.9353 0.9671
No log 5.2941 180 0.8281 0.4038 0.8281 0.9100
No log 5.3529 182 0.8150 0.5343 0.8150 0.9028
No log 5.4118 184 0.8051 0.5557 0.8051 0.8973
No log 5.4706 186 0.7979 0.5008 0.7979 0.8932
No log 5.5294 188 0.8033 0.5322 0.8033 0.8963
No log 5.5882 190 0.8044 0.4933 0.8044 0.8969
No log 5.6471 192 0.7987 0.4933 0.7987 0.8937
No log 5.7059 194 0.8307 0.4920 0.8307 0.9114
No log 5.7647 196 0.9068 0.5624 0.9068 0.9522
No log 5.8235 198 1.0256 0.5700 1.0256 1.0127
No log 5.8824 200 1.0514 0.5700 1.0514 1.0254
No log 5.9412 202 0.9106 0.5706 0.9106 0.9543
No log 6.0 204 0.7600 0.5178 0.7600 0.8718
No log 6.0588 206 0.7839 0.5275 0.7839 0.8854
No log 6.1176 208 0.7666 0.5785 0.7666 0.8755
No log 6.1765 210 0.8037 0.5057 0.8037 0.8965
No log 6.2353 212 0.8320 0.5411 0.8320 0.9121
No log 6.2941 214 0.8071 0.6032 0.8071 0.8984
No log 6.3529 216 0.8083 0.4966 0.8083 0.8991
No log 6.4118 218 0.8309 0.5008 0.8309 0.9115
No log 6.4706 220 0.8195 0.5214 0.8195 0.9052
No log 6.5294 222 0.8505 0.5083 0.8505 0.9222
No log 6.5882 224 0.8306 0.4509 0.8306 0.9114
No log 6.6471 226 0.7981 0.4430 0.7981 0.8934
No log 6.7059 228 0.7845 0.5510 0.7845 0.8857
No log 6.7647 230 0.7840 0.5556 0.7840 0.8855
No log 6.8235 232 0.7878 0.5534 0.7878 0.8876
No log 6.8824 234 0.7829 0.5645 0.7829 0.8848
No log 6.9412 236 0.8084 0.5110 0.8084 0.8991
No log 7.0 238 0.8143 0.5110 0.8143 0.9024
No log 7.0588 240 0.7887 0.5582 0.7887 0.8881
No log 7.1176 242 0.8636 0.5298 0.8636 0.9293
No log 7.1765 244 0.9068 0.4932 0.9068 0.9522
No log 7.2353 246 0.8413 0.4741 0.8413 0.9172
No log 7.2941 248 0.8158 0.4930 0.8158 0.9032
No log 7.3529 250 0.8301 0.4086 0.8301 0.9111
No log 7.4118 252 0.8124 0.4217 0.8124 0.9013
No log 7.4706 254 0.7994 0.4568 0.7994 0.8941
No log 7.5294 256 0.7875 0.4916 0.7875 0.8874
No log 7.5882 258 0.7872 0.4949 0.7872 0.8873
No log 7.6471 260 0.7857 0.4818 0.7857 0.8864
No log 7.7059 262 0.7971 0.4870 0.7971 0.8928
No log 7.7647 264 0.8108 0.5420 0.8108 0.9004
No log 7.8235 266 0.8278 0.5565 0.8278 0.9099
No log 7.8824 268 0.8152 0.5693 0.8152 0.9029
No log 7.9412 270 0.7955 0.5509 0.7955 0.8919
No log 8.0 272 0.7873 0.4949 0.7873 0.8873
No log 8.0588 274 0.7821 0.4949 0.7821 0.8844
No log 8.1176 276 0.7939 0.5547 0.7939 0.8910
No log 8.1765 278 0.8245 0.5892 0.8245 0.9080
No log 8.2353 280 0.7977 0.5696 0.7977 0.8931
No log 8.2941 282 0.7943 0.5148 0.7943 0.8912
No log 8.3529 284 0.7772 0.5213 0.7772 0.8816
No log 8.4118 286 0.7691 0.5349 0.7691 0.8770
No log 8.4706 288 0.7733 0.5349 0.7733 0.8794
No log 8.5294 290 0.7676 0.5349 0.7676 0.8761
No log 8.5882 292 0.7715 0.5011 0.7715 0.8784
No log 8.6471 294 0.7840 0.5086 0.7840 0.8854
No log 8.7059 296 0.8131 0.5287 0.8131 0.9017
No log 8.7647 298 0.8030 0.4859 0.8030 0.8961
No log 8.8235 300 0.8117 0.5722 0.8117 0.9009
No log 8.8824 302 0.8431 0.5587 0.8431 0.9182
No log 8.9412 304 0.8761 0.5865 0.8761 0.9360
No log 9.0 306 0.8428 0.5968 0.8428 0.9180
No log 9.0588 308 0.7984 0.5720 0.7984 0.8935
No log 9.1176 310 0.7876 0.5361 0.7876 0.8875
No log 9.1765 312 0.7832 0.5518 0.7832 0.8850
No log 9.2353 314 0.7554 0.5931 0.7554 0.8691
No log 9.2941 316 0.8010 0.5563 0.8010 0.8950
No log 9.3529 318 0.8834 0.5290 0.8834 0.9399
No log 9.4118 320 0.8557 0.5539 0.8557 0.9250
No log 9.4706 322 0.8216 0.5949 0.8216 0.9064
No log 9.5294 324 0.8487 0.5648 0.8487 0.9212
No log 9.5882 326 0.9011 0.5475 0.9011 0.9493
No log 9.6471 328 0.8918 0.5578 0.8918 0.9444
No log 9.7059 330 0.8125 0.5806 0.8125 0.9014
No log 9.7647 332 0.7927 0.5143 0.7927 0.8903
No log 9.8235 334 0.7952 0.5495 0.7952 0.8918
No log 9.8824 336 0.7974 0.5110 0.7974 0.8930
No log 9.9412 338 0.8003 0.5143 0.8003 0.8946
No log 10.0 340 0.8680 0.5338 0.8680 0.9317
No log 10.0588 342 0.9711 0.5550 0.9711 0.9854
No log 10.1176 344 1.0146 0.5679 1.0146 1.0073
No log 10.1765 346 0.9454 0.5316 0.9454 0.9723
No log 10.2353 348 0.8304 0.5731 0.8304 0.9113
No log 10.2941 350 0.7858 0.4718 0.7858 0.8865
No log 10.3529 352 0.7866 0.5056 0.7866 0.8869
No log 10.4118 354 0.7911 0.4798 0.7911 0.8894
No log 10.4706 356 0.8410 0.5083 0.8410 0.9171
No log 10.5294 358 0.9333 0.5578 0.9333 0.9661
No log 10.5882 360 0.9371 0.5578 0.9371 0.9680
No log 10.6471 362 0.8755 0.5168 0.8755 0.9357
No log 10.7059 364 0.8371 0.5073 0.8371 0.9149
No log 10.7647 366 0.8350 0.4871 0.8350 0.9138
No log 10.8235 368 0.8443 0.4871 0.8443 0.9188
No log 10.8824 370 0.8328 0.5089 0.8328 0.9126
No log 10.9412 372 0.7989 0.4789 0.7989 0.8938
No log 11.0 374 0.7869 0.4232 0.7869 0.8871
No log 11.0588 376 0.7961 0.4854 0.7961 0.8922
No log 11.1176 378 0.7909 0.4950 0.7909 0.8893
No log 11.1765 380 0.7750 0.5681 0.7750 0.8803
No log 11.2353 382 0.7901 0.5858 0.7901 0.8889
No log 11.2941 384 0.7822 0.5670 0.7822 0.8844
No log 11.3529 386 0.7759 0.5547 0.7759 0.8808
No log 11.4118 388 0.7698 0.5213 0.7698 0.8774
No log 11.4706 390 0.7688 0.4789 0.7688 0.8768
No log 11.5294 392 0.7746 0.5213 0.7746 0.8801
No log 11.5882 394 0.8096 0.6142 0.8096 0.8998
No log 11.6471 396 0.8434 0.6266 0.8434 0.9184
No log 11.7059 398 0.8282 0.6320 0.8282 0.9100
No log 11.7647 400 0.8348 0.6256 0.8348 0.9137
No log 11.8235 402 0.7932 0.5832 0.7932 0.8906
No log 11.8824 404 0.7824 0.6026 0.7824 0.8846
No log 11.9412 406 0.7951 0.5968 0.7951 0.8917
No log 12.0 408 0.8294 0.6201 0.8294 0.9107
No log 12.0588 410 0.8227 0.6201 0.8227 0.9070
No log 12.1176 412 0.7858 0.5611 0.7858 0.8865
No log 12.1765 414 0.7663 0.4966 0.7663 0.8754
No log 12.2353 416 0.7755 0.4996 0.7755 0.8806
No log 12.2941 418 0.7828 0.4789 0.7828 0.8848
No log 12.3529 420 0.7810 0.4840 0.7810 0.8837
No log 12.4118 422 0.7970 0.5428 0.7970 0.8927
No log 12.4706 424 0.8490 0.6071 0.8490 0.9214
No log 12.5294 426 0.9367 0.5867 0.9367 0.9678
No log 12.5882 428 0.9518 0.5794 0.9518 0.9756
No log 12.6471 430 0.8659 0.5958 0.8659 0.9305
No log 12.7059 432 0.7878 0.6287 0.7878 0.8876
No log 12.7647 434 0.7568 0.5992 0.7568 0.8699
No log 12.8235 436 0.7653 0.5462 0.7653 0.8748
No log 12.8824 438 0.7823 0.5350 0.7823 0.8845
No log 12.9412 440 0.8053 0.5695 0.8053 0.8974
No log 13.0 442 0.8131 0.5476 0.8131 0.9017
No log 13.0588 444 0.8244 0.5814 0.8244 0.9079
No log 13.1176 446 0.8482 0.5788 0.8482 0.9210
No log 13.1765 448 0.8772 0.5738 0.8772 0.9366
No log 13.2353 450 0.8859 0.5738 0.8859 0.9412
No log 13.2941 452 0.8543 0.5788 0.8543 0.9243
No log 13.3529 454 0.8465 0.5738 0.8465 0.9200
No log 13.4118 456 0.7977 0.6056 0.7977 0.8931
No log 13.4706 458 0.7909 0.6369 0.7909 0.8893
No log 13.5294 460 0.8236 0.6101 0.8236 0.9075
No log 13.5882 462 0.8790 0.5867 0.8790 0.9376
No log 13.6471 464 0.8538 0.5867 0.8538 0.9240
No log 13.7059 466 0.7982 0.6220 0.7982 0.8934
No log 13.7647 468 0.7753 0.5169 0.7753 0.8805
No log 13.8235 470 0.7786 0.4906 0.7786 0.8824
No log 13.8824 472 0.7855 0.4664 0.7855 0.8863
No log 13.9412 474 0.7903 0.4633 0.7903 0.8890
No log 14.0 476 0.7881 0.4906 0.7881 0.8878
No log 14.0588 478 0.8056 0.5169 0.8056 0.8976
No log 14.1176 480 0.8082 0.5248 0.8082 0.8990
No log 14.1765 482 0.7935 0.5386 0.7935 0.8908
No log 14.2353 484 0.7923 0.5408 0.7923 0.8901
No log 14.2941 486 0.7916 0.4485 0.7916 0.8897
No log 14.3529 488 0.7914 0.4782 0.7914 0.8896
No log 14.4118 490 0.7832 0.4789 0.7832 0.8850
No log 14.4706 492 0.8304 0.5455 0.8304 0.9113
No log 14.5294 494 0.8945 0.5648 0.8945 0.9458
No log 14.5882 496 0.9203 0.5818 0.9203 0.9593
No log 14.6471 498 0.8680 0.5418 0.8680 0.9317
0.3178 14.7059 500 0.8035 0.5186 0.8035 0.8964
0.3178 14.7647 502 0.7804 0.5251 0.7804 0.8834
0.3178 14.8235 504 0.7723 0.5647 0.7723 0.8788
0.3178 14.8824 506 0.7717 0.5251 0.7717 0.8785
0.3178 14.9412 508 0.7789 0.4906 0.7789 0.8826
0.3178 15.0 510 0.7823 0.4787 0.7823 0.8845
0.3178 15.0588 512 0.7723 0.5102 0.7723 0.8788
0.3178 15.1176 514 0.7684 0.5167 0.7684 0.8766
0.3178 15.1765 516 0.7616 0.5364 0.7616 0.8727
0.3178 15.2353 518 0.7583 0.5343 0.7583 0.8708
0.3178 15.2941 520 0.7736 0.5833 0.7736 0.8796
0.3178 15.3529 522 0.8468 0.5968 0.8468 0.9202
0.3178 15.4118 524 0.9083 0.5179 0.9083 0.9530
0.3178 15.4706 526 0.9285 0.4988 0.9285 0.9636
0.3178 15.5294 528 0.8555 0.5810 0.8555 0.9249
0.3178 15.5882 530 0.8075 0.5142 0.8075 0.8986
0.3178 15.6471 532 0.7998 0.5176 0.7998 0.8943
0.3178 15.7059 534 0.8156 0.4859 0.8156 0.9031
0.3178 15.7647 536 0.8247 0.4859 0.8247 0.9081
0.3178 15.8235 538 0.8402 0.5611 0.8402 0.9166
0.3178 15.8824 540 0.8515 0.5775 0.8515 0.9228
0.3178 15.9412 542 0.8474 0.5246 0.8474 0.9206
0.3178 16.0 544 0.8488 0.5611 0.8488 0.9213
0.3178 16.0588 546 0.8722 0.5731 0.8722 0.9339
0.3178 16.1176 548 0.8629 0.5783 0.8629 0.9289
0.3178 16.1765 550 0.8355 0.5455 0.8355 0.9141
0.3178 16.2353 552 0.8110 0.4801 0.8110 0.9005
0.3178 16.2941 554 0.8152 0.4847 0.8152 0.9029
0.3178 16.3529 556 0.8129 0.4977 0.8129 0.9016
0.3178 16.4118 558 0.8110 0.4652 0.8110 0.9005
0.3178 16.4706 560 0.8186 0.4575 0.8186 0.9047

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (F32, safetensors)
