ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k9_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5626
  • QWK (quadratic weighted kappa): 0.6731
  • MSE: 0.5626
  • RMSE: 0.7501
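The QWK and RMSE figures above follow standard definitions; a minimal numpy sketch of both metrics (the toy labels and three-class setup below are illustrative assumptions, not data from this model):

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights, commonly used for ordinal essay scores."""
    # Observed confusion matrix.
    observed = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        observed[t, p] += 1
    # Quadratic disagreement weights, normalized by the maximum squared distance.
    weights = np.array([[(i - j) ** 2 for j in range(n_classes)]
                        for i in range(n_classes)], dtype=float) / (n_classes - 1) ** 2
    # Expected matrix under independence of the marginals.
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0)) / observed.sum()
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

def rmse(y_true, y_pred):
    diff = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean(diff ** 2)))

# Toy example with three ordinal score levels.
y_true = [0, 1, 2, 2]
y_pred = [0, 1, 1, 2]
print(round(quadratic_weighted_kappa(y_true, y_pred, 3), 4))  # 0.8
print(round(rmse(y_true, y_pred), 2))  # 0.5
```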

Model description

More information needed

Intended uses & limitations

More information needed
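Although the card gives no usage details, a plausible loading sketch with the transformers library is shown below. The single-logit regression head is an assumption inferred from the MSE/RMSE metrics reported above, not something the card confirms:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

model_id = ("MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_"
            "FineTuningAraBERT_run1_AugV5_k9_task5_organization")
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Score an Arabic essay for the "organization" trait (placeholder text below).
inputs = tokenizer("ضع نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze()
print(score)
```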

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
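The hyperparameters above map directly onto a `TrainingArguments` configuration; a sketch of that config fragment follows (the `output_dir` value is an assumed placeholder):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="arabert-task5-organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```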

Training results

Training loss was logged every 500 steps, so rows before step 500 show "No log" in the first column.

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0444 2 3.9277 -0.0257 3.9277 1.9818
No log 0.0889 4 2.4862 0.0385 2.4862 1.5768
No log 0.1333 6 1.3874 -0.0281 1.3874 1.1779
No log 0.1778 8 1.0906 0.1685 1.0906 1.0443
No log 0.2222 10 1.1414 0.2140 1.1414 1.0683
No log 0.2667 12 1.1125 0.2208 1.1125 1.0547
No log 0.3111 14 1.3506 0.1453 1.3506 1.1622
No log 0.3556 16 1.0793 0.2709 1.0793 1.0389
No log 0.4 18 0.8329 0.4230 0.8329 0.9126
No log 0.4444 20 0.8036 0.4230 0.8036 0.8964
No log 0.4889 22 0.7704 0.4624 0.7704 0.8777
No log 0.5333 24 0.7050 0.4919 0.7050 0.8397
No log 0.5778 26 0.8111 0.4009 0.8111 0.9006
No log 0.6222 28 0.8323 0.4831 0.8323 0.9123
No log 0.6667 30 0.6517 0.6281 0.6517 0.8073
No log 0.7111 32 0.6210 0.6101 0.6210 0.7880
No log 0.7556 34 0.6454 0.6151 0.6454 0.8034
No log 0.8 36 0.8085 0.6814 0.8085 0.8992
No log 0.8444 38 0.7226 0.6889 0.7226 0.8501
No log 0.8889 40 0.6857 0.6370 0.6857 0.8281
No log 0.9333 42 0.6400 0.6459 0.6400 0.8000
No log 0.9778 44 0.6785 0.7042 0.6785 0.8237
No log 1.0222 46 0.6328 0.6481 0.6328 0.7955
No log 1.0667 48 0.6292 0.6354 0.6292 0.7932
No log 1.1111 50 0.6135 0.6377 0.6135 0.7833
No log 1.1556 52 0.6623 0.6330 0.6623 0.8138
No log 1.2 54 0.6150 0.6718 0.6150 0.7842
No log 1.2444 56 0.6233 0.6492 0.6233 0.7895
No log 1.2889 58 0.5892 0.6707 0.5892 0.7676
No log 1.3333 60 0.5947 0.7082 0.5947 0.7712
No log 1.3778 62 0.6442 0.6763 0.6442 0.8026
No log 1.4222 64 0.6284 0.6798 0.6284 0.7927
No log 1.4667 66 0.5862 0.7082 0.5862 0.7656
No log 1.5111 68 0.5817 0.6826 0.5817 0.7627
No log 1.5556 70 0.5971 0.6525 0.5971 0.7727
No log 1.6 72 0.5697 0.6699 0.5697 0.7548
No log 1.6444 74 0.5715 0.6460 0.5715 0.7560
No log 1.6889 76 0.6484 0.6802 0.6484 0.8052
No log 1.7333 78 0.6655 0.6347 0.6655 0.8158
No log 1.7778 80 0.6075 0.6014 0.6075 0.7794
No log 1.8222 82 0.7069 0.7076 0.7069 0.8408
No log 1.8667 84 0.8140 0.6215 0.8140 0.9022
No log 1.9111 86 0.7170 0.6996 0.7170 0.8468
No log 1.9556 88 0.6777 0.6479 0.6777 0.8232
No log 2.0 90 0.7375 0.6325 0.7375 0.8588
No log 2.0444 92 0.6512 0.6072 0.6512 0.8069
No log 2.0889 94 0.6342 0.6104 0.6342 0.7963
No log 2.1333 96 0.6581 0.5963 0.6581 0.8112
No log 2.1778 98 0.6751 0.6812 0.6751 0.8216
No log 2.2222 100 0.8088 0.5733 0.8088 0.8993
No log 2.2667 102 0.7886 0.6067 0.7886 0.8880
No log 2.3111 104 0.7161 0.6590 0.7161 0.8462
No log 2.3556 106 0.6769 0.6760 0.6769 0.8227
No log 2.4 108 0.9053 0.5902 0.9053 0.9515
No log 2.4444 110 0.8785 0.5902 0.8785 0.9373
No log 2.4889 112 0.6617 0.6939 0.6617 0.8135
No log 2.5333 114 0.6130 0.7055 0.6130 0.7829
No log 2.5778 116 0.6014 0.7112 0.6014 0.7755
No log 2.6222 118 0.6020 0.7278 0.6020 0.7759
No log 2.6667 120 0.6318 0.7126 0.6318 0.7949
No log 2.7111 122 0.6107 0.7205 0.6107 0.7815
No log 2.7556 124 0.5948 0.7308 0.5948 0.7713
No log 2.8 126 0.5774 0.7308 0.5774 0.7599
No log 2.8444 128 0.5472 0.7480 0.5472 0.7397
No log 2.8889 130 0.5259 0.7336 0.5259 0.7252
No log 2.9333 132 0.5425 0.7366 0.5425 0.7365
No log 2.9778 134 0.5714 0.6992 0.5714 0.7559
No log 3.0222 136 0.6327 0.6154 0.6327 0.7954
No log 3.0667 138 0.6623 0.6154 0.6623 0.8138
No log 3.1111 140 0.6634 0.6241 0.6634 0.8145
No log 3.1556 142 0.6114 0.7112 0.6114 0.7819
No log 3.2 144 0.6384 0.6857 0.6384 0.7990
No log 3.2444 146 0.6360 0.6857 0.6360 0.7975
No log 3.2889 148 0.5991 0.7001 0.5991 0.7740
No log 3.3333 150 0.6954 0.6121 0.6954 0.8339
No log 3.3778 152 0.7171 0.6017 0.7171 0.8468
No log 3.4222 154 0.6084 0.6929 0.6084 0.7800
No log 3.4667 156 0.5376 0.7088 0.5376 0.7332
No log 3.5111 158 0.5692 0.7133 0.5692 0.7544
No log 3.5556 160 0.6154 0.7021 0.6154 0.7845
No log 3.6 162 0.5770 0.6941 0.5770 0.7596
No log 3.6444 164 0.5458 0.7151 0.5458 0.7388
No log 3.6889 166 0.5072 0.7088 0.5072 0.7122
No log 3.7333 168 0.5506 0.6312 0.5506 0.7420
No log 3.7778 170 0.5585 0.6288 0.5585 0.7473
No log 3.8222 172 0.5352 0.6869 0.5352 0.7316
No log 3.8667 174 0.5596 0.7146 0.5596 0.7481
No log 3.9111 176 0.5725 0.6826 0.5725 0.7567
No log 3.9556 178 0.5817 0.6857 0.5817 0.7627
No log 4.0 180 0.6090 0.6729 0.6090 0.7804
No log 4.0444 182 0.6702 0.6355 0.6702 0.8186
No log 4.0889 184 0.7960 0.6191 0.7960 0.8922
No log 4.1333 186 0.7236 0.6132 0.7236 0.8507
No log 4.1778 188 0.6772 0.6091 0.6772 0.8229
No log 4.2222 190 0.6773 0.6091 0.6773 0.8230
No log 4.2667 192 0.6397 0.6575 0.6397 0.7998
No log 4.3111 194 0.6200 0.6586 0.6200 0.7874
No log 4.3556 196 0.6310 0.6249 0.6310 0.7944
No log 4.4 198 0.7748 0.5719 0.7748 0.8802
No log 4.4444 200 0.8247 0.5679 0.8247 0.9081
No log 4.4889 202 0.6611 0.6647 0.6611 0.8131
No log 4.5333 204 0.6392 0.6260 0.6392 0.7995
No log 4.5778 206 0.6485 0.6026 0.6485 0.8053
No log 4.6222 208 0.6789 0.6455 0.6789 0.8239
No log 4.6667 210 0.6545 0.6605 0.6545 0.8090
No log 4.7111 212 0.6016 0.6830 0.6016 0.7756
No log 4.7556 214 0.5936 0.6830 0.5936 0.7704
No log 4.8 216 0.6044 0.6528 0.6044 0.7775
No log 4.8444 218 0.5946 0.7077 0.5946 0.7711
No log 4.8889 220 0.5865 0.6764 0.5865 0.7658
No log 4.9333 222 0.5801 0.5871 0.5801 0.7617
No log 4.9778 224 0.5776 0.6186 0.5776 0.7600
No log 5.0222 226 0.5803 0.6195 0.5803 0.7618
No log 5.0667 228 0.5934 0.6164 0.5934 0.7703
No log 5.1111 230 0.6163 0.6461 0.6163 0.7850
No log 5.1556 232 0.5756 0.7285 0.5756 0.7587
No log 5.2 234 0.6019 0.7128 0.6019 0.7759
No log 5.2444 236 0.7192 0.6198 0.7192 0.8480
No log 5.2889 238 0.7128 0.6394 0.7128 0.8443
No log 5.3333 240 0.6078 0.6677 0.6078 0.7796
No log 5.3778 242 0.5699 0.6627 0.5699 0.7549
No log 5.4222 244 0.5892 0.6653 0.5892 0.7676
No log 5.4667 246 0.5732 0.6249 0.5732 0.7571
No log 5.5111 248 0.5805 0.6293 0.5805 0.7619
No log 5.5556 250 0.6333 0.6761 0.6333 0.7958
No log 5.6 252 0.6235 0.6696 0.6235 0.7896
No log 5.6444 254 0.5631 0.6468 0.5631 0.7504
No log 5.6889 256 0.5602 0.6990 0.5602 0.7485
No log 5.7333 258 0.5555 0.6545 0.5555 0.7453
No log 5.7778 260 0.5874 0.6388 0.5874 0.7664
No log 5.8222 262 0.6532 0.7086 0.6532 0.8082
No log 5.8667 264 0.6749 0.6696 0.6749 0.8215
No log 5.9111 266 0.6246 0.6620 0.6246 0.7903
No log 5.9556 268 0.6003 0.7006 0.6003 0.7748
No log 6.0 270 0.6151 0.6904 0.6151 0.7843
No log 6.0444 272 0.6083 0.6904 0.6083 0.7800
No log 6.0889 274 0.5972 0.6624 0.5972 0.7728
No log 6.1333 276 0.6653 0.6329 0.6653 0.8156
No log 6.1778 278 0.8367 0.5893 0.8367 0.9147
No log 6.2222 280 0.8593 0.5624 0.8593 0.9270
No log 6.2667 282 0.7516 0.6554 0.7516 0.8670
No log 6.3111 284 0.6203 0.6388 0.6203 0.7876
No log 6.3556 286 0.5806 0.6782 0.5806 0.7620
No log 6.4 288 0.6109 0.6935 0.6109 0.7816
No log 6.4444 290 0.5974 0.7098 0.5974 0.7729
No log 6.4889 292 0.5767 0.6598 0.5767 0.7594
No log 6.5333 294 0.5896 0.6821 0.5896 0.7678
No log 6.5778 296 0.6096 0.6685 0.6096 0.7808
No log 6.6222 298 0.6403 0.6570 0.6403 0.8002
No log 6.6667 300 0.6600 0.6329 0.6600 0.8124
No log 6.7111 302 0.6887 0.6301 0.6887 0.8299
No log 6.7556 304 0.6983 0.6064 0.6983 0.8356
No log 6.8 306 0.6458 0.5756 0.6458 0.8036
No log 6.8444 308 0.6252 0.5756 0.6252 0.7907
No log 6.8889 310 0.5832 0.6872 0.5832 0.7637
No log 6.9333 312 0.5657 0.6995 0.5657 0.7521
No log 6.9778 314 0.5663 0.6916 0.5663 0.7525
No log 7.0222 316 0.5872 0.7271 0.5872 0.7663
No log 7.0667 318 0.6042 0.7417 0.6042 0.7773
No log 7.1111 320 0.6071 0.7189 0.6071 0.7792
No log 7.1556 322 0.6027 0.7035 0.6027 0.7764
No log 7.2 324 0.6148 0.7118 0.6148 0.7841
No log 7.2444 326 0.6176 0.6838 0.6176 0.7858
No log 7.2889 328 0.6023 0.6606 0.6023 0.7761
No log 7.3333 330 0.5942 0.5428 0.5942 0.7708
No log 7.3778 332 0.5960 0.5314 0.5960 0.7720
No log 7.4222 334 0.5855 0.5909 0.5855 0.7652
No log 7.4667 336 0.5828 0.6275 0.5828 0.7634
No log 7.5111 338 0.6019 0.6589 0.6019 0.7758
No log 7.5556 340 0.5990 0.6519 0.5990 0.7740
No log 7.6 342 0.5931 0.6632 0.5931 0.7701
No log 7.6444 344 0.6139 0.6588 0.6139 0.7835
No log 7.6889 346 0.6432 0.6669 0.6432 0.8020
No log 7.7333 348 0.6579 0.6570 0.6579 0.8111
No log 7.7778 350 0.6072 0.6813 0.6072 0.7792
No log 7.8222 352 0.5908 0.6134 0.5908 0.7687
No log 7.8667 354 0.6183 0.5653 0.6183 0.7863
No log 7.9111 356 0.6061 0.5314 0.6061 0.7785
No log 7.9556 358 0.5838 0.6460 0.5838 0.7641
No log 8.0 360 0.6296 0.6764 0.6296 0.7935
No log 8.0444 362 0.6741 0.5828 0.6741 0.8210
No log 8.0889 364 0.6983 0.6160 0.6983 0.8357
No log 8.1333 366 0.6518 0.6581 0.6518 0.8074
No log 8.1778 368 0.6235 0.6886 0.6235 0.7896
No log 8.2222 370 0.6077 0.7043 0.6077 0.7796
No log 8.2667 372 0.5825 0.6732 0.5825 0.7632
No log 8.3111 374 0.5873 0.7050 0.5873 0.7663
No log 8.3556 376 0.6173 0.6965 0.6173 0.7857
No log 8.4 378 0.6144 0.7000 0.6144 0.7838
No log 8.4444 380 0.6104 0.7093 0.6104 0.7813
No log 8.4889 382 0.5821 0.7027 0.5821 0.7630
No log 8.5333 384 0.5630 0.6980 0.5630 0.7503
No log 8.5778 386 0.5658 0.6909 0.5658 0.7522
No log 8.6222 388 0.5912 0.7540 0.5912 0.7689
No log 8.6667 390 0.6797 0.6154 0.6797 0.8244
No log 8.7111 392 0.7333 0.5935 0.7333 0.8564
No log 8.7556 394 0.6902 0.6097 0.6902 0.8308
No log 8.8 396 0.6142 0.7042 0.6142 0.7837
No log 8.8444 398 0.6293 0.6933 0.6293 0.7933
No log 8.8889 400 0.6836 0.6615 0.6836 0.8268
No log 8.9333 402 0.6750 0.6692 0.6750 0.8216
No log 8.9778 404 0.6319 0.7287 0.6319 0.7949
No log 9.0222 406 0.6409 0.6869 0.6409 0.8006
No log 9.0667 408 0.6390 0.6657 0.6390 0.7994
No log 9.1111 410 0.6020 0.7687 0.6020 0.7759
No log 9.1556 412 0.5873 0.7251 0.5873 0.7663
No log 9.2 414 0.5862 0.7196 0.5862 0.7656
No log 9.2444 416 0.5826 0.6941 0.5826 0.7633
No log 9.2889 418 0.5518 0.7286 0.5518 0.7428
No log 9.3333 420 0.5617 0.7545 0.5617 0.7494
No log 9.3778 422 0.6328 0.6820 0.6328 0.7955
No log 9.4222 424 0.6533 0.6580 0.6533 0.8082
No log 9.4667 426 0.6146 0.7100 0.6146 0.7839
No log 9.5111 428 0.5919 0.7278 0.5919 0.7693
No log 9.5556 430 0.5775 0.7103 0.5775 0.7599
No log 9.6 432 0.5670 0.6903 0.5670 0.7530
No log 9.6444 434 0.5607 0.7191 0.5607 0.7488
No log 9.6889 436 0.5579 0.6826 0.5579 0.7469
No log 9.7333 438 0.5629 0.6826 0.5629 0.7503
No log 9.7778 440 0.6266 0.6599 0.6266 0.7916
No log 9.8222 442 0.6243 0.6539 0.6243 0.7901
No log 9.8667 444 0.5745 0.7118 0.5745 0.7580
No log 9.9111 446 0.5199 0.7482 0.5199 0.7210
No log 9.9556 448 0.4987 0.7187 0.4987 0.7062
No log 10.0 450 0.4919 0.7090 0.4919 0.7013
No log 10.0444 452 0.4883 0.7184 0.4883 0.6988
No log 10.0889 454 0.4882 0.7191 0.4882 0.6987
No log 10.1333 456 0.5027 0.7445 0.5027 0.7090
No log 10.1778 458 0.4822 0.7428 0.4822 0.6944
No log 10.2222 460 0.4804 0.7278 0.4804 0.6931
No log 10.2667 462 0.5021 0.7151 0.5021 0.7086
No log 10.3111 464 0.5056 0.7198 0.5056 0.7111
No log 10.3556 466 0.5113 0.6872 0.5113 0.7151
No log 10.4 468 0.4760 0.6951 0.4760 0.6899
No log 10.4444 470 0.4782 0.7272 0.4782 0.6915
No log 10.4889 472 0.5293 0.6990 0.5293 0.7275
No log 10.5333 474 0.6157 0.6872 0.6157 0.7847
No log 10.5778 476 0.7023 0.7002 0.7023 0.8381
No log 10.6222 478 0.6270 0.6928 0.6270 0.7918
No log 10.6667 480 0.5503 0.6556 0.5503 0.7418
No log 10.7111 482 0.5983 0.7155 0.5983 0.7735
No log 10.7556 484 0.6257 0.6538 0.6257 0.7910
No log 10.8 486 0.5972 0.5627 0.5972 0.7728
No log 10.8444 488 0.5624 0.5983 0.5624 0.7500
No log 10.8889 490 0.5324 0.6196 0.5324 0.7296
No log 10.9333 492 0.5128 0.6509 0.5128 0.7161
No log 10.9778 494 0.5403 0.7063 0.5403 0.7350
No log 11.0222 496 0.5457 0.7227 0.5457 0.7387
No log 11.0667 498 0.5400 0.7365 0.5400 0.7348
0.2408 11.1111 500 0.5299 0.7365 0.5299 0.7280
0.2408 11.1556 502 0.5002 0.7225 0.5002 0.7072
0.2408 11.2 504 0.4884 0.6959 0.4884 0.6989
0.2408 11.2444 506 0.4948 0.6838 0.4948 0.7034
0.2408 11.2889 508 0.5365 0.7043 0.5365 0.7325
0.2408 11.3333 510 0.5626 0.6731 0.5626 0.7501

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (F32, Safetensors)