ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k11_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics relate is given after the list):

  • Loss: 0.6296
  • Qwk: 0.5405
  • Mse: 0.6296
  • Rmse: 0.7934
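
Qwk here is quadratic weighted kappa, and Rmse is the square root of Mse; Loss and Mse coincide at every logged step, which suggests the model was trained with an MSE (regression) objective on ordinal organization scores. The sketch below shows how such metrics can be reproduced with scikit-learn; it is an illustration under those assumptions, not the card author's actual evaluation code.

```python
# Minimal metric sketch, assuming integer gold scores and continuous
# model outputs (an illustration, not this card's evaluation code).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def evaluate(y_true, y_pred):
    """y_true: integer scores; y_pred: continuous model outputs."""
    mse = mean_squared_error(y_true, y_pred)
    rmse = np.sqrt(mse)
    # Quadratic weighted kappa compares discrete labels, so the
    # continuous predictions are rounded and clipped to the label range.
    y_round = np.clip(np.rint(y_pred), min(y_true), max(y_true)).astype(int)
    qwk = cohen_kappa_score(y_true, y_round, weights="quadratic")
    return {"mse": mse, "rmse": rmse, "qwk": qwk}

print(evaluate([0, 1, 2, 3, 2], [0.2, 1.1, 1.8, 2.6, 2.4]))
```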

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
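
These settings map directly onto transformers.TrainingArguments. A minimal sketch follows; only the values listed above come from this card, while output_dir is a placeholder assumption.

```python
# Sketch of the listed hyperparameters as transformers.TrainingArguments.
# Only the values from the list above come from the card; output_dir is
# a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./arabert-task5-organization",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08, as listed above
    # (these are also the library defaults):
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```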

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0357 2 4.1044 -0.0154 4.1044 2.0259
No log 0.0714 4 2.6135 -0.0177 2.6135 1.6166
No log 0.1071 6 1.6519 0.0534 1.6519 1.2853
No log 0.1429 8 1.1427 0.3143 1.1427 1.0690
No log 0.1786 10 1.1747 0.1037 1.1747 1.0838
No log 0.2143 12 1.6281 -0.0091 1.6281 1.2760
No log 0.25 14 1.7154 -0.0391 1.7154 1.3097
No log 0.2857 16 1.7568 -0.0204 1.7568 1.3255
No log 0.3214 18 1.3672 -0.0627 1.3672 1.1693
No log 0.3571 20 1.1306 0.1989 1.1306 1.0633
No log 0.3929 22 1.0937 0.1962 1.0937 1.0458
No log 0.4286 24 1.0989 0.2094 1.0989 1.0483
No log 0.4643 26 1.2837 0.0349 1.2837 1.1330
No log 0.5 28 1.3338 0.0 1.3338 1.1549
No log 0.5357 30 1.1948 0.0996 1.1948 1.0931
No log 0.5714 32 1.0251 0.2740 1.0251 1.0124
No log 0.6071 34 0.9978 0.3476 0.9978 0.9989
No log 0.6429 36 1.0043 0.3117 1.0043 1.0021
No log 0.6786 38 0.9938 0.3258 0.9938 0.9969
No log 0.7143 40 1.1403 0.2062 1.1403 1.0678
No log 0.75 42 1.2485 0.2240 1.2485 1.1174
No log 0.7857 44 1.0950 0.1817 1.0950 1.0464
No log 0.8214 46 0.9955 0.3083 0.9955 0.9977
No log 0.8571 48 0.9120 0.2972 0.9120 0.9550
No log 0.8929 50 0.9886 0.3243 0.9886 0.9943
No log 0.9286 52 1.0168 0.3006 1.0168 1.0084
No log 0.9643 54 0.8570 0.3897 0.8570 0.9257
No log 1.0 56 0.7918 0.4676 0.7918 0.8898
No log 1.0357 58 0.7786 0.4544 0.7786 0.8824
No log 1.0714 60 0.8144 0.4988 0.8144 0.9024
No log 1.1071 62 0.8268 0.4988 0.8268 0.9093
No log 1.1429 64 0.7988 0.4461 0.7988 0.8937
No log 1.1786 66 0.8571 0.4656 0.8571 0.9258
No log 1.2143 68 0.7483 0.5492 0.7483 0.8650
No log 1.25 70 0.8056 0.5054 0.8056 0.8975
No log 1.2857 72 0.7470 0.5245 0.7470 0.8643
No log 1.3214 74 0.8303 0.4057 0.8303 0.9112
No log 1.3571 76 0.8251 0.4143 0.8251 0.9083
No log 1.3929 78 0.9196 0.4286 0.9196 0.9590
No log 1.4286 80 1.0258 0.3897 1.0258 1.0128
No log 1.4643 82 0.9294 0.3808 0.9294 0.9641
No log 1.5 84 0.8783 0.3425 0.8783 0.9372
No log 1.5357 86 0.9550 0.3375 0.9550 0.9772
No log 1.5714 88 0.9654 0.3428 0.9654 0.9826
No log 1.6071 90 0.8584 0.3713 0.8584 0.9265
No log 1.6429 92 0.8817 0.4812 0.8817 0.9390
No log 1.6786 94 0.8844 0.4169 0.8844 0.9404
No log 1.7143 96 0.8346 0.3838 0.8346 0.9136
No log 1.75 98 0.9215 0.3499 0.9215 0.9600
No log 1.7857 100 0.8690 0.4019 0.8690 0.9322
No log 1.8214 102 0.8197 0.4234 0.8197 0.9053
No log 1.8571 104 0.8041 0.4388 0.8041 0.8967
No log 1.8929 106 0.8131 0.4416 0.8131 0.9017
No log 1.9286 108 0.8007 0.4794 0.8007 0.8948
No log 1.9643 110 0.8114 0.4675 0.8114 0.9008
No log 2.0 112 0.7988 0.4133 0.7988 0.8938
No log 2.0357 114 0.9030 0.4797 0.9030 0.9503
No log 2.0714 116 0.9319 0.4284 0.9319 0.9654
No log 2.1071 118 0.8334 0.4831 0.8334 0.9129
No log 2.1429 120 0.8093 0.3200 0.8093 0.8996
No log 2.1786 122 0.9886 0.3627 0.9886 0.9943
No log 2.2143 124 0.9245 0.3849 0.9245 0.9615
No log 2.25 126 0.7493 0.4643 0.7493 0.8656
No log 2.2857 128 0.7344 0.5232 0.7344 0.8570
No log 2.3214 130 0.7151 0.5232 0.7151 0.8457
No log 2.3571 132 0.7327 0.4192 0.7327 0.8560
No log 2.3929 134 1.1019 0.3633 1.1019 1.0497
No log 2.4286 136 1.1918 0.2763 1.1918 1.0917
No log 2.4643 138 0.8437 0.4444 0.8437 0.9185
No log 2.5 140 0.7342 0.4593 0.7342 0.8568
No log 2.5357 142 1.2448 0.3237 1.2448 1.1157
No log 2.5714 144 1.5373 0.2314 1.5373 1.2399
No log 2.6071 146 1.3593 0.3114 1.3593 1.1659
No log 2.6429 148 0.9056 0.4444 0.9056 0.9516
No log 2.6786 150 0.7076 0.4987 0.7076 0.8412
No log 2.7143 152 0.6886 0.5380 0.6886 0.8298
No log 2.75 154 0.7114 0.5217 0.7114 0.8434
No log 2.7857 156 0.6753 0.5593 0.6753 0.8218
No log 2.8214 158 0.6695 0.4692 0.6695 0.8182
No log 2.8571 160 0.7720 0.5256 0.7720 0.8786
No log 2.8929 162 0.7065 0.4588 0.7065 0.8406
No log 2.9286 164 0.6574 0.5195 0.6574 0.8108
No log 2.9643 166 0.6456 0.5436 0.6456 0.8035
No log 3.0 168 0.6928 0.5450 0.6928 0.8323
No log 3.0357 170 0.6570 0.5795 0.6570 0.8106
No log 3.0714 172 0.6670 0.5568 0.6670 0.8167
No log 3.1071 174 0.6672 0.5568 0.6672 0.8168
No log 3.1429 176 0.7678 0.5256 0.7678 0.8762
No log 3.1786 178 0.7972 0.5012 0.7972 0.8928
No log 3.2143 180 0.6989 0.4590 0.6989 0.8360
No log 3.25 182 0.6895 0.4554 0.6895 0.8303
No log 3.2857 184 0.6994 0.4590 0.6994 0.8363
No log 3.3214 186 0.7325 0.4606 0.7325 0.8559
No log 3.3571 188 0.6845 0.4707 0.6845 0.8274
No log 3.3929 190 0.6971 0.4968 0.6971 0.8349
No log 3.4286 192 0.6558 0.4947 0.6558 0.8098
No log 3.4643 194 0.7000 0.5011 0.7000 0.8366
No log 3.5 196 0.8860 0.4537 0.8860 0.9413
No log 3.5357 198 0.8746 0.4519 0.8746 0.9352
No log 3.5714 200 0.7062 0.4872 0.7062 0.8404
No log 3.6071 202 0.6949 0.4966 0.6949 0.8336
No log 3.6429 204 0.7863 0.4008 0.7863 0.8867
No log 3.6786 206 0.9283 0.4348 0.9283 0.9635
No log 3.7143 208 0.9178 0.3936 0.9178 0.9580
No log 3.75 210 0.8214 0.4593 0.8214 0.9063
No log 3.7857 212 0.7615 0.4808 0.7615 0.8727
No log 3.8214 214 0.6852 0.5419 0.6852 0.8278
No log 3.8571 216 0.7001 0.4524 0.7001 0.8367
No log 3.8929 218 0.7339 0.4935 0.7339 0.8567
No log 3.9286 220 0.7487 0.4864 0.7487 0.8653
No log 3.9643 222 0.7306 0.5107 0.7306 0.8547
No log 4.0 224 0.6849 0.4955 0.6849 0.8276
No log 4.0357 226 0.6829 0.5418 0.6829 0.8264
No log 4.0714 228 0.7201 0.5245 0.7201 0.8486
No log 4.1071 230 0.6762 0.5542 0.6762 0.8223
No log 4.1429 232 0.6777 0.5405 0.6777 0.8232
No log 4.1786 234 0.7155 0.4867 0.7155 0.8459
No log 4.2143 236 0.7350 0.4516 0.7350 0.8573
No log 4.25 238 0.7465 0.4427 0.7465 0.8640
No log 4.2857 240 0.7255 0.4485 0.7255 0.8518
No log 4.3214 242 0.7199 0.4730 0.7199 0.8485
No log 4.3571 244 0.7084 0.4626 0.7084 0.8416
No log 4.3929 246 0.7048 0.5419 0.7048 0.8395
No log 4.4286 248 0.7066 0.5197 0.7066 0.8406
No log 4.4643 250 0.7345 0.5446 0.7345 0.8571
No log 4.5 252 0.6981 0.5419 0.6981 0.8355
No log 4.5357 254 0.7091 0.4847 0.7091 0.8421
No log 4.5714 256 0.7128 0.5088 0.7128 0.8443
No log 4.6071 258 0.6814 0.5361 0.6814 0.8255
No log 4.6429 260 0.7861 0.5033 0.7861 0.8866
No log 4.6786 262 0.8813 0.4936 0.8813 0.9388
No log 4.7143 264 0.8280 0.4913 0.8280 0.9099
No log 4.75 266 0.7751 0.4708 0.7751 0.8804
No log 4.7857 268 0.7962 0.4604 0.7962 0.8923
No log 4.8214 270 0.8999 0.3560 0.8999 0.9486
No log 4.8571 272 0.9559 0.2991 0.9559 0.9777
No log 4.8929 274 0.9094 0.3744 0.9094 0.9536
No log 4.9286 276 0.7363 0.6119 0.7363 0.8581
No log 4.9643 278 0.6799 0.5795 0.6799 0.8246
No log 5.0 280 0.6793 0.5168 0.6793 0.8242
No log 5.0357 282 0.7269 0.5082 0.7269 0.8526
No log 5.0714 284 0.7699 0.4709 0.7699 0.8774
No log 5.1071 286 0.7707 0.4833 0.7707 0.8779
No log 5.1429 288 0.7419 0.4941 0.7419 0.8613
No log 5.1786 290 0.7604 0.4709 0.7604 0.8720
No log 5.2143 292 0.7539 0.5098 0.7539 0.8683
No log 5.25 294 0.6998 0.5033 0.6998 0.8366
No log 5.2857 296 0.7598 0.4681 0.7598 0.8717
No log 5.3214 298 0.8473 0.4903 0.8473 0.9205
No log 5.3571 300 0.8009 0.5028 0.8009 0.8949
No log 5.3929 302 0.7219 0.4456 0.7219 0.8496
No log 5.4286 304 0.7296 0.4230 0.7296 0.8542
No log 5.4643 306 0.7420 0.4373 0.7420 0.8614
No log 5.5 308 0.7442 0.4373 0.7442 0.8627
No log 5.5357 310 0.7387 0.4230 0.7387 0.8595
No log 5.5714 312 0.7288 0.4391 0.7288 0.8537
No log 5.6071 314 0.7416 0.4019 0.7416 0.8611
No log 5.6429 316 0.9008 0.4108 0.9008 0.9491
No log 5.6786 318 0.9906 0.3778 0.9906 0.9953
No log 5.7143 320 0.9259 0.3984 0.9259 0.9623
No log 5.75 322 0.8191 0.5051 0.8191 0.9050
No log 5.7857 324 0.7410 0.5686 0.7410 0.8608
No log 5.8214 326 0.6950 0.5684 0.6950 0.8337
No log 5.8571 328 0.7642 0.5245 0.7642 0.8742
No log 5.8929 330 0.7449 0.5230 0.7449 0.8631
No log 5.9286 332 0.7019 0.5568 0.7019 0.8378
No log 5.9643 334 0.6749 0.5329 0.6749 0.8215
No log 6.0 336 0.7506 0.4656 0.7506 0.8664
No log 6.0357 338 0.7728 0.5270 0.7728 0.8791
No log 6.0714 340 0.8761 0.4743 0.8761 0.9360
No log 6.1071 342 0.9762 0.4181 0.9762 0.9880
No log 6.1429 344 0.7942 0.5382 0.7942 0.8912
No log 6.1786 346 0.5928 0.6667 0.5928 0.7699
No log 6.2143 348 0.6074 0.6593 0.6074 0.7793
No log 6.25 350 0.6600 0.5872 0.6600 0.8124
No log 6.2857 352 0.6349 0.5605 0.6349 0.7968
No log 6.3214 354 0.6405 0.5432 0.6405 0.8003
No log 6.3571 356 0.6773 0.48 0.6773 0.8230
No log 6.3929 358 0.6793 0.4909 0.6793 0.8242
No log 6.4286 360 0.6858 0.4980 0.6858 0.8281
No log 6.4643 362 0.6905 0.5686 0.6905 0.8309
No log 6.5 364 0.6384 0.5509 0.6384 0.7990
No log 6.5357 366 0.6479 0.5316 0.6479 0.8049
No log 6.5714 368 0.7793 0.5058 0.7793 0.8828
No log 6.6071 370 0.7772 0.5058 0.7772 0.8816
No log 6.6429 372 0.6682 0.5561 0.6682 0.8174
No log 6.6786 374 0.6356 0.5886 0.6356 0.7972
No log 6.7143 376 0.6486 0.5536 0.6486 0.8053
No log 6.75 378 0.6799 0.5197 0.6799 0.8245
No log 6.7857 380 0.6662 0.5316 0.6662 0.8162
No log 6.8214 382 0.6463 0.5746 0.6463 0.8039
No log 6.8571 384 0.6523 0.5377 0.6523 0.8076
No log 6.8929 386 0.6429 0.5522 0.6429 0.8018
No log 6.9286 388 0.6403 0.5405 0.6403 0.8002
No log 6.9643 390 0.6397 0.5405 0.6397 0.7998
No log 7.0 392 0.6441 0.5288 0.6441 0.8026
No log 7.0357 394 0.6691 0.5168 0.6691 0.8180
No log 7.0714 396 0.6761 0.5259 0.6761 0.8223
No log 7.1071 398 0.6913 0.5098 0.6913 0.8314
No log 7.1429 400 0.6659 0.5259 0.6659 0.8160
No log 7.1786 402 0.6487 0.5153 0.6487 0.8054
No log 7.2143 404 0.6461 0.5153 0.6461 0.8038
No log 7.25 406 0.6358 0.5391 0.6358 0.7974
No log 7.2857 408 0.6359 0.5966 0.6359 0.7974
No log 7.3214 410 0.6687 0.6262 0.6687 0.8177
No log 7.3571 412 0.6507 0.6529 0.6507 0.8067
No log 7.3929 414 0.6217 0.5989 0.6217 0.7885
No log 7.4286 416 0.6379 0.5171 0.6379 0.7987
No log 7.4643 418 0.6553 0.4929 0.6553 0.8095
No log 7.5 420 0.6419 0.5631 0.6419 0.8012
No log 7.5357 422 0.6401 0.5156 0.6401 0.8001
No log 7.5714 424 0.6385 0.5156 0.6385 0.7991
No log 7.6071 426 0.6435 0.5419 0.6435 0.8022
No log 7.6429 428 0.6505 0.5432 0.6505 0.8065
No log 7.6786 430 0.6235 0.5288 0.6235 0.7896
No log 7.7143 432 0.5993 0.5522 0.5993 0.7742
No log 7.75 434 0.5877 0.5631 0.5877 0.7666
No log 7.7857 436 0.5897 0.6107 0.5897 0.7679
No log 7.8214 438 0.5868 0.6107 0.5868 0.7661
No log 7.8571 440 0.6340 0.5905 0.6340 0.7962
No log 7.8929 442 0.6199 0.5988 0.6199 0.7873
No log 7.9286 444 0.6350 0.5568 0.6350 0.7969
No log 7.9643 446 0.6623 0.5568 0.6623 0.8138
No log 8.0 448 0.6503 0.5568 0.6503 0.8064
No log 8.0357 450 0.6215 0.5679 0.6215 0.7883
No log 8.0714 452 0.6251 0.5302 0.6251 0.7907
No log 8.1071 454 0.6346 0.5436 0.6346 0.7966
No log 8.1429 456 0.7228 0.5242 0.7228 0.8502
No log 8.1786 458 0.8247 0.4826 0.8247 0.9081
No log 8.2143 460 0.7625 0.4799 0.7625 0.8732
No log 8.25 462 0.6303 0.5666 0.6303 0.7939
No log 8.2857 464 0.7044 0.6413 0.7044 0.8393
No log 8.3214 466 0.7539 0.5655 0.7539 0.8683
No log 8.3571 468 0.6886 0.6132 0.6886 0.8298
No log 8.3929 470 0.6355 0.5640 0.6355 0.7972
No log 8.4286 472 0.6402 0.5640 0.6402 0.8001
No log 8.4643 474 0.6744 0.5377 0.6744 0.8212
No log 8.5 476 0.7049 0.5229 0.7049 0.8396
No log 8.5357 478 0.7066 0.5334 0.7066 0.8406
No log 8.5714 480 0.6917 0.5459 0.6917 0.8317
No log 8.6071 482 0.7182 0.5206 0.7182 0.8474
No log 8.6429 484 0.7696 0.5387 0.7696 0.8772
No log 8.6786 486 0.8819 0.4885 0.8819 0.9391
No log 8.7143 488 0.8370 0.5332 0.8370 0.9149
No log 8.75 490 0.7315 0.5416 0.7315 0.8553
No log 8.7857 492 0.6577 0.5734 0.6577 0.8110
No log 8.8214 494 0.6573 0.5605 0.6573 0.8107
No log 8.8571 496 0.6674 0.5377 0.6674 0.8170
No log 8.8929 498 0.6492 0.5391 0.6492 0.8058
0.2944 8.9286 500 0.6472 0.5882 0.6472 0.8045
0.2944 8.9643 502 0.6768 0.5783 0.6768 0.8226
0.2944 9.0 504 0.6666 0.5659 0.6666 0.8165
0.2944 9.0357 506 0.6470 0.5415 0.6470 0.8044
0.2944 9.0714 508 0.6366 0.5771 0.6366 0.7979
0.2944 9.1071 510 0.6296 0.5405 0.6296 0.7934

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
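
For completeness, a minimal inference sketch compatible with the versions above. It assumes the checkpoint exposes a single-logit regression head (consistent with the MSE-based evaluation) and uses the full repository id from this card; verify both against the actual model config before relying on it.

```python
# Minimal inference sketch. Assumes a single-logit regression head,
# which the MSE-based evaluation suggests; check the model config to
# confirm before relying on this.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k11_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

essay = "..."  # an Arabic essay to score (placeholder)
inputs = tokenizer(essay, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(f"predicted organization score: {score:.2f}")
```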