ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k9_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (recorded as "None" in the training metadata). It achieves the following results on the evaluation set:

  • Loss: 0.6483
  • QWK (quadratic weighted kappa): 0.5845
  • MSE (mean squared error): 0.6483
  • RMSE (root mean squared error): 0.8052
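The reported metrics can be computed from gold scores and model predictions. Below is a pure-Python sketch; the labels used are hypothetical examples, not this model's outputs:

```python
# Minimal sketch of the evaluation metrics reported above.
# The score lists below are hypothetical, not from this model.

def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    """Agreement between two integer ratings, with quadratic penalties."""
    n = len(y_true)
    # Observed confusion matrix
    obs = [[0.0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    # Marginal histograms give the chance-expected matrix
    hist_t = [y_true.count(c) for c in range(num_classes)]
    hist_p = [y_pred.count(c) for c in range(num_classes)]
    num = den = 0.0
    for i in range(num_classes):
        for j in range(num_classes):
            w = (i - j) ** 2 / (num_classes - 1) ** 2  # quadratic weight
            num += w * obs[i][j]
            den += w * hist_t[i] * hist_p[j] / n
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

y_true = [0, 1, 2, 3, 2, 1]  # hypothetical gold organization scores
y_pred = [0, 1, 2, 2, 2, 1]  # hypothetical model predictions
print(quadratic_weighted_kappa(y_true, y_pred, num_classes=4))
print(mse(y_true, y_pred), mse(y_true, y_pred) ** 0.5)
```

Note that when predictions and gold labels are identical integers, QWK is 1.0 and MSE equals 0; here MSE and the reported "Mse" column match the loss column because the model is evaluated with a squared-error objective.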

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
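From the training-results table, the run logs roughly 45 optimizer steps per epoch (step 450 at epoch 10.0), so 100 scheduled epochs correspond to about 4,500 steps. A minimal sketch of the linear learning-rate schedule these settings imply, assuming zero warmup steps (the card does not state a warmup value):

```python
# Linear LR schedule implied by learning_rate=2e-05 and
# lr_scheduler_type=linear, assuming no warmup (an assumption).
def linear_lr(step, total_steps, base_lr=2e-05):
    """Learning rate after `step` optimizer updates, decaying linearly to 0."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)

# ~45 steps/epoch (step 450 at epoch 10.0) * 100 epochs ~= 4500 total steps
TOTAL = 4500
print(linear_lr(0, TOTAL))      # start of training
print(linear_lr(2250, TOTAL))   # halfway through the schedule
print(linear_lr(TOTAL, TOTAL))  # end of the schedule
```

Since the table stops at step 514 (epoch ~11.4), the run appears to have ended well before the schedule's end, so the learning rate never decayed far below its initial value.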

Training results

("No log" in the Training Loss column means the running training loss had not yet been reported; the first logged value appears at step 500.)

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0444 2 4.3123 0.0045 4.3123 2.0766
No log 0.0889 4 2.0687 0.1342 2.0687 1.4383
No log 0.1333 6 1.7422 0.1128 1.7422 1.3199
No log 0.1778 8 1.9567 0.1240 1.9567 1.3988
No log 0.2222 10 1.8315 0.1266 1.8315 1.3533
No log 0.2667 12 0.9228 0.3627 0.9228 0.9606
No log 0.3111 14 0.8345 0.3464 0.8345 0.9135
No log 0.3556 16 0.9507 0.2724 0.9507 0.9750
No log 0.4 18 0.8842 0.2599 0.8842 0.9403
No log 0.4444 20 0.8325 0.2921 0.8325 0.9124
No log 0.4889 22 0.8960 0.3734 0.8960 0.9465
No log 0.5333 24 0.9824 0.4394 0.9824 0.9912
No log 0.5778 26 0.9146 0.4186 0.9146 0.9563
No log 0.6222 28 0.7729 0.4596 0.7729 0.8791
No log 0.6667 30 0.7110 0.6421 0.7110 0.8432
No log 0.7111 32 0.7247 0.5683 0.7247 0.8513
No log 0.7556 34 0.9069 0.5260 0.9069 0.9523
No log 0.8 36 1.2492 0.4740 1.2492 1.1177
No log 0.8444 38 1.2287 0.4541 1.2287 1.1084
No log 0.8889 40 1.0033 0.4777 1.0033 1.0017
No log 0.9333 42 0.7775 0.4934 0.7775 0.8818
No log 0.9778 44 0.6460 0.6244 0.6460 0.8037
No log 1.0222 46 0.6378 0.6341 0.6378 0.7986
No log 1.0667 48 0.6382 0.6740 0.6382 0.7989
No log 1.1111 50 0.6373 0.6278 0.6373 0.7983
No log 1.1556 52 0.7294 0.6434 0.7294 0.8540
No log 1.2 54 0.8851 0.5516 0.8851 0.9408
No log 1.2444 56 0.8282 0.6070 0.8282 0.9100
No log 1.2889 58 0.8269 0.5922 0.8269 0.9093
No log 1.3333 60 0.8032 0.5621 0.8032 0.8962
No log 1.3778 62 0.7726 0.5617 0.7726 0.8790
No log 1.4222 64 0.6743 0.5969 0.6743 0.8212
No log 1.4667 66 0.6907 0.6056 0.6907 0.8311
No log 1.5111 68 0.6768 0.6402 0.6768 0.8227
No log 1.5556 70 0.7333 0.5772 0.7333 0.8563
No log 1.6 72 0.8352 0.5685 0.8352 0.9139
No log 1.6444 74 0.7533 0.6217 0.7533 0.8679
No log 1.6889 76 0.6726 0.6194 0.6726 0.8201
No log 1.7333 78 0.8159 0.5444 0.8159 0.9033
No log 1.7778 80 0.9110 0.5363 0.9110 0.9545
No log 1.8222 82 0.8795 0.5455 0.8795 0.9378
No log 1.8667 84 0.7850 0.6266 0.7850 0.8860
No log 1.9111 86 0.7449 0.6255 0.7449 0.8631
No log 1.9556 88 0.8271 0.6258 0.8271 0.9094
No log 2.0 90 0.8442 0.5888 0.8442 0.9188
No log 2.0444 92 0.7095 0.6502 0.7095 0.8423
No log 2.0889 94 0.6898 0.6427 0.6898 0.8305
No log 2.1333 96 0.6507 0.6597 0.6507 0.8067
No log 2.1778 98 0.6478 0.5945 0.6478 0.8049
No log 2.2222 100 0.6348 0.6350 0.6348 0.7968
No log 2.2667 102 0.6380 0.6350 0.6380 0.7987
No log 2.3111 104 0.6859 0.6099 0.6859 0.8282
No log 2.3556 106 0.6524 0.6636 0.6524 0.8077
No log 2.4 108 0.6303 0.6693 0.6303 0.7939
No log 2.4444 110 0.6384 0.6701 0.6384 0.7990
No log 2.4889 112 0.6756 0.6284 0.6756 0.8220
No log 2.5333 114 0.7000 0.6406 0.7000 0.8367
No log 2.5778 116 0.7673 0.6002 0.7673 0.8759
No log 2.6222 118 0.8423 0.5789 0.8423 0.9178
No log 2.6667 120 0.8007 0.5807 0.8007 0.8948
No log 2.7111 122 0.7074 0.5849 0.7074 0.8411
No log 2.7556 124 0.7799 0.5245 0.7799 0.8831
No log 2.8 126 0.7450 0.4998 0.7450 0.8631
No log 2.8444 128 0.6752 0.6087 0.6752 0.8217
No log 2.8889 130 0.7176 0.6013 0.7176 0.8471
No log 2.9333 132 0.7414 0.5537 0.7414 0.8611
No log 2.9778 134 0.7748 0.6975 0.7748 0.8802
No log 3.0222 136 0.9055 0.5562 0.9055 0.9516
No log 3.0667 138 0.9198 0.5733 0.9198 0.9590
No log 3.1111 140 0.8119 0.6226 0.8119 0.9010
No log 3.1556 142 0.7130 0.6026 0.7130 0.8444
No log 3.2 144 0.7200 0.6227 0.7200 0.8485
No log 3.2444 146 0.7633 0.5735 0.7633 0.8737
No log 3.2889 148 0.6663 0.6257 0.6663 0.8163
No log 3.3333 150 0.6513 0.5158 0.6513 0.8070
No log 3.3778 152 0.7966 0.5552 0.7966 0.8925
No log 3.4222 154 0.8317 0.5547 0.8317 0.9120
No log 3.4667 156 0.8186 0.5510 0.8186 0.9048
No log 3.5111 158 0.7708 0.6689 0.7708 0.8780
No log 3.5556 160 0.7528 0.6456 0.7528 0.8676
No log 3.6 162 0.7185 0.5824 0.7185 0.8477
No log 3.6444 164 0.6587 0.6134 0.6587 0.8116
No log 3.6889 166 0.6558 0.6039 0.6558 0.8098
No log 3.7333 168 0.6515 0.5990 0.6515 0.8071
No log 3.7778 170 0.7011 0.6536 0.7011 0.8373
No log 3.8222 172 0.7423 0.6294 0.7423 0.8615
No log 3.8667 174 0.7435 0.6094 0.7435 0.8623
No log 3.9111 176 0.7170 0.5713 0.7170 0.8467
No log 3.9556 178 0.7469 0.5209 0.7469 0.8642
No log 4.0 180 0.7316 0.4657 0.7316 0.8553
No log 4.0444 182 0.6840 0.6016 0.6840 0.8270
No log 4.0889 184 0.6816 0.6197 0.6816 0.8256
No log 4.1333 186 0.6746 0.5818 0.6746 0.8213
No log 4.1778 188 0.6885 0.6272 0.6885 0.8297
No log 4.2222 190 0.7473 0.5759 0.7473 0.8645
No log 4.2667 192 0.7126 0.6418 0.7126 0.8442
No log 4.3111 194 0.6834 0.6266 0.6834 0.8267
No log 4.3556 196 0.7127 0.5652 0.7127 0.8442
No log 4.4 198 0.6786 0.5958 0.6786 0.8238
No log 4.4444 200 0.6648 0.6318 0.6648 0.8153
No log 4.4889 202 0.7741 0.6131 0.7741 0.8798
No log 4.5333 204 0.8463 0.6315 0.8463 0.9199
No log 4.5778 206 0.7567 0.6694 0.7567 0.8699
No log 4.6222 208 0.7253 0.6694 0.7253 0.8516
No log 4.6667 210 0.6684 0.6202 0.6684 0.8175
No log 4.7111 212 0.6393 0.5865 0.6393 0.7996
No log 4.7556 214 0.6263 0.6259 0.6263 0.7914
No log 4.8 216 0.6282 0.6464 0.6282 0.7926
No log 4.8444 218 0.6383 0.6092 0.6383 0.7989
No log 4.8889 220 0.6692 0.5508 0.6692 0.8180
No log 4.9333 222 0.6887 0.6010 0.6887 0.8299
No log 4.9778 224 0.6813 0.5931 0.6813 0.8254
No log 5.0222 226 0.6536 0.6138 0.6536 0.8085
No log 5.0667 228 0.6511 0.6446 0.6511 0.8069
No log 5.1111 230 0.6587 0.6609 0.6587 0.8116
No log 5.1556 232 0.6680 0.6609 0.6680 0.8173
No log 5.2 234 0.6462 0.6500 0.6462 0.8039
No log 5.2444 236 0.6689 0.5945 0.6689 0.8178
No log 5.2889 238 0.6984 0.5636 0.6984 0.8357
No log 5.3333 240 0.7145 0.5837 0.7145 0.8453
No log 5.3778 242 0.6977 0.6166 0.6977 0.8353
No log 5.4222 244 0.6843 0.6260 0.6843 0.8272
No log 5.4667 246 0.6602 0.6695 0.6602 0.8125
No log 5.5111 248 0.6494 0.5909 0.6494 0.8058
No log 5.5556 250 0.6480 0.5964 0.6480 0.8050
No log 5.6 252 0.6563 0.5376 0.6563 0.8101
No log 5.6444 254 0.6646 0.5596 0.6646 0.8152
No log 5.6889 256 0.6815 0.5936 0.6815 0.8255
No log 5.7333 258 0.6727 0.5804 0.6727 0.8202
No log 5.7778 260 0.6815 0.6092 0.6815 0.8256
No log 5.8222 262 0.8002 0.6112 0.8002 0.8946
No log 5.8667 264 0.8330 0.5910 0.8330 0.9127
No log 5.9111 266 0.7175 0.5569 0.7175 0.8471
No log 5.9556 268 0.7462 0.5922 0.7462 0.8638
No log 6.0 270 0.9487 0.5526 0.9487 0.9740
No log 6.0444 272 0.9134 0.5617 0.9134 0.9557
No log 6.0889 274 0.7302 0.5835 0.7302 0.8545
No log 6.1333 276 0.6766 0.6187 0.6766 0.8226
No log 6.1778 278 0.7501 0.6249 0.7501 0.8661
No log 6.2222 280 0.7404 0.6199 0.7404 0.8605
No log 6.2667 282 0.6876 0.6014 0.6876 0.8292
No log 6.3111 284 0.6469 0.6252 0.6469 0.8043
No log 6.3556 286 0.6306 0.6094 0.6306 0.7941
No log 6.4 288 0.6186 0.6011 0.6186 0.7865
No log 6.4444 290 0.6723 0.6062 0.6723 0.8199
No log 6.4889 292 0.7292 0.6154 0.7292 0.8540
No log 6.5333 294 0.7270 0.6154 0.7270 0.8527
No log 6.5778 296 0.7075 0.5961 0.7075 0.8411
No log 6.6222 298 0.6709 0.5607 0.6709 0.8191
No log 6.6667 300 0.6777 0.6059 0.6777 0.8232
No log 6.7111 302 0.7046 0.6263 0.7046 0.8394
No log 6.7556 304 0.6937 0.6059 0.6937 0.8329
No log 6.8 306 0.6871 0.5752 0.6871 0.8289
No log 6.8444 308 0.6800 0.5483 0.6800 0.8246
No log 6.8889 310 0.6869 0.5409 0.6869 0.8288
No log 6.9333 312 0.6842 0.5483 0.6842 0.8272
No log 6.9778 314 0.6932 0.5466 0.6932 0.8326
No log 7.0222 316 0.7056 0.6406 0.7056 0.8400
No log 7.0667 318 0.7041 0.6399 0.7041 0.8391
No log 7.1111 320 0.6820 0.5474 0.6820 0.8259
No log 7.1556 322 0.6716 0.5773 0.6716 0.8195
No log 7.2 324 0.6682 0.5381 0.6682 0.8174
No log 7.2444 326 0.6720 0.5731 0.6720 0.8198
No log 7.2889 328 0.6800 0.5055 0.6800 0.8247
No log 7.3333 330 0.6870 0.5483 0.6870 0.8288
No log 7.3778 332 0.6873 0.5680 0.6873 0.8290
No log 7.4222 334 0.6931 0.5602 0.6931 0.8325
No log 7.4667 336 0.6905 0.5402 0.6905 0.8310
No log 7.5111 338 0.6904 0.5602 0.6904 0.8309
No log 7.5556 340 0.6884 0.6237 0.6884 0.8297
No log 7.6 342 0.7033 0.5756 0.7033 0.8386
No log 7.6444 344 0.6748 0.6284 0.6748 0.8214
No log 7.6889 346 0.6367 0.6537 0.6367 0.7980
No log 7.7333 348 0.6415 0.5879 0.6415 0.8010
No log 7.7778 350 0.6480 0.5759 0.6480 0.8050
No log 7.8222 352 0.6202 0.5817 0.6202 0.7875
No log 7.8667 354 0.6179 0.6562 0.6179 0.7860
No log 7.9111 356 0.6424 0.6554 0.6424 0.8015
No log 7.9556 358 0.6162 0.6628 0.6162 0.7850
No log 8.0 360 0.6047 0.6610 0.6047 0.7777
No log 8.0444 362 0.6150 0.6207 0.6150 0.7842
No log 8.0889 364 0.6359 0.5856 0.6359 0.7975
No log 8.1333 366 0.6474 0.5856 0.6474 0.8046
No log 8.1778 368 0.6423 0.5856 0.6423 0.8014
No log 8.2222 370 0.6445 0.5735 0.6445 0.8028
No log 8.2667 372 0.6665 0.5820 0.6665 0.8164
No log 8.3111 374 0.7207 0.6128 0.7207 0.8489
No log 8.3556 376 0.8277 0.5581 0.8277 0.9098
No log 8.4 378 0.8821 0.5283 0.8821 0.9392
No log 8.4444 380 0.8005 0.5113 0.8005 0.8947
No log 8.4889 382 0.6761 0.5877 0.6761 0.8223
No log 8.5333 384 0.6532 0.5647 0.6532 0.8082
No log 8.5778 386 0.6727 0.6217 0.6727 0.8202
No log 8.6222 388 0.6497 0.6217 0.6497 0.8060
No log 8.6667 390 0.6205 0.5856 0.6205 0.7877
No log 8.7111 392 0.6345 0.6138 0.6345 0.7966
No log 8.7556 394 0.7053 0.5948 0.7053 0.8398
No log 8.8 396 0.7007 0.5380 0.7007 0.8371
No log 8.8444 398 0.6478 0.5610 0.6478 0.8049
No log 8.8889 400 0.6167 0.5735 0.6167 0.7853
No log 8.9333 402 0.6155 0.6429 0.6155 0.7846
No log 8.9778 404 0.6160 0.6456 0.6160 0.7849
No log 9.0222 406 0.5904 0.5977 0.5904 0.7684
No log 9.0667 408 0.5981 0.6356 0.5981 0.7734
No log 9.1111 410 0.6305 0.6589 0.6305 0.7941
No log 9.1556 412 0.6335 0.6421 0.6335 0.7959
No log 9.2 414 0.6206 0.5614 0.6206 0.7878
No log 9.2444 416 0.6182 0.6564 0.6182 0.7863
No log 9.2889 418 0.6371 0.6412 0.6371 0.7982
No log 9.3333 420 0.6401 0.5809 0.6401 0.8000
No log 9.3778 422 0.6111 0.5820 0.6111 0.7817
No log 9.4222 424 0.6038 0.6634 0.6038 0.7770
No log 9.4667 426 0.6004 0.6581 0.6004 0.7748
No log 9.5111 428 0.6274 0.5966 0.6274 0.7921
No log 9.5556 430 0.6439 0.6253 0.6439 0.8024
No log 9.6 432 0.6252 0.6244 0.6252 0.7907
No log 9.6444 434 0.6049 0.6493 0.6049 0.7778
No log 9.6889 436 0.6452 0.6728 0.6452 0.8032
No log 9.7333 438 0.6660 0.6555 0.6660 0.8161
No log 9.7778 440 0.6593 0.6464 0.6593 0.8120
No log 9.8222 442 0.6748 0.6464 0.6748 0.8214
No log 9.8667 444 0.6729 0.6235 0.6729 0.8203
No log 9.9111 446 0.6701 0.5662 0.6701 0.8186
No log 9.9556 448 0.6761 0.5888 0.6761 0.8223
No log 10.0 450 0.6767 0.5679 0.6767 0.8226
No log 10.0444 452 0.6663 0.5975 0.6663 0.8163
No log 10.0889 454 0.6799 0.6455 0.6799 0.8246
No log 10.1333 456 0.6761 0.6066 0.6761 0.8222
No log 10.1778 458 0.6898 0.6055 0.6898 0.8305
No log 10.2222 460 0.6835 0.5954 0.6835 0.8267
No log 10.2667 462 0.6757 0.5954 0.6757 0.8220
No log 10.3111 464 0.6656 0.5796 0.6656 0.8158
No log 10.3556 466 0.6762 0.5594 0.6762 0.8223
No log 10.4 468 0.6937 0.5774 0.6937 0.8329
No log 10.4444 470 0.7288 0.5853 0.7288 0.8537
No log 10.4889 472 0.7221 0.5701 0.7221 0.8498
No log 10.5333 474 0.6906 0.5614 0.6906 0.8310
No log 10.5778 476 0.6846 0.5735 0.6846 0.8274
No log 10.6222 478 0.6788 0.5735 0.6788 0.8239
No log 10.6667 480 0.6913 0.5120 0.6913 0.8314
No log 10.7111 482 0.7242 0.6318 0.7242 0.8510
No log 10.7556 484 0.7210 0.6479 0.7210 0.8491
No log 10.8 486 0.6693 0.6473 0.6693 0.8181
No log 10.8444 488 0.6741 0.5726 0.6741 0.8210
No log 10.8889 490 0.7193 0.5835 0.7193 0.8481
No log 10.9333 492 0.6954 0.5759 0.6954 0.8339
No log 10.9778 494 0.6540 0.5523 0.6540 0.8087
No log 11.0222 496 0.6833 0.5446 0.6833 0.8266
No log 11.0667 498 0.8207 0.6097 0.8207 0.9059
0.254 11.1111 500 0.8770 0.6047 0.8770 0.9365
0.254 11.1556 502 0.7990 0.5944 0.7990 0.8939
0.254 11.2 504 0.6961 0.6237 0.6961 0.8343
0.254 11.2444 506 0.6564 0.5735 0.6564 0.8102
0.254 11.2889 508 0.6752 0.5071 0.6752 0.8217
0.254 11.3333 510 0.6834 0.4963 0.6834 0.8267
0.254 11.3778 512 0.6623 0.5735 0.6623 0.8138
0.254 11.4222 514 0.6483 0.5845 0.6483 0.8052

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
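To reproduce this environment, the versions above can be pinned; the exact install commands below are an assumption (the `+cu118` PyTorch build comes from the CUDA 11.8 wheel index):

```shell
# Hypothetical environment pin matching the listed framework versions.
pip install transformers==4.44.2 datasets==2.21.0 tokenizers==0.19.1
# PyTorch 2.4.0 with CUDA 11.8 support:
pip install torch==2.4.0 --index-url https://download.pytorch.org/whl/cu118
```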
Model size: 0.1B params (F32, Safetensors format)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k9_task5_organization

Finetuned from aubmindlab/bert-base-arabertv02 (one of 4,019 fine-tunes listed under the base model)