ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k20_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the metrics):

  • Loss: 0.7722
  • Qwk: 0.4638
  • Mse: 0.7722
  • Rmse: 0.8788

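The snippet below is a minimal inference sketch, assuming the checkpoint exposes a standard sequence-classification head with a single regression-style output; the card does not document the label setup, so the post-processing of the score is an assumption.

```python
# Minimal inference sketch. Assumption: the checkpoint was saved with a
# sequence-classification head producing a single regression score; the
# card does not document the label/score setup.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k20_task5_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # an Arabic essay to score for organization
inputs = tokenizer(essay, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# With a single-output regression head, logits has shape (1, 1).
print(logits.squeeze().item())
```
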
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch using these values follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

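As a rough sketch, the hyperparameters above map onto transformers.TrainingArguments as follows. Dataset preparation, the model head, and the compute_metrics function are omitted because the card does not document them; the eval_steps and logging_steps values are inferred from the results table below (evaluation every 2 steps, training loss first logged at step 500).

```python
# Sketch only: maps the listed hyperparameters onto TrainingArguments.
# Data loading and preprocessing are not documented in this card.
from transformers import Trainer, TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task5_organization",  # hypothetical output directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",  # the results table evaluates every 2 steps
    eval_steps=2,
    logging_steps=500,      # training loss first appears at step 500 ("No log" before that)
)

# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_ds, eval_dataset=eval_ds,
#                   compute_metrics=compute_metrics)
# trainer.train()
```
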
Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0198 2 4.0148 0.0024 4.0148 2.0037
No log 0.0396 4 2.2972 0.0110 2.2972 1.5156
No log 0.0594 6 1.8925 0.0032 1.8925 1.3757
No log 0.0792 8 1.0941 0.2639 1.0941 1.0460
No log 0.0990 10 0.9889 0.4199 0.9889 0.9944
No log 0.1188 12 0.9793 0.4002 0.9793 0.9896
No log 0.1386 14 1.0428 0.3439 1.0428 1.0212
No log 0.1584 16 1.1139 0.2392 1.1139 1.0554
No log 0.1782 18 1.3627 0.0380 1.3627 1.1673
No log 0.1980 20 1.2816 0.0380 1.2816 1.1321
No log 0.2178 22 1.0972 0.2857 1.0972 1.0475
No log 0.2376 24 0.9862 0.2288 0.9862 0.9931
No log 0.2574 26 1.0403 0.1799 1.0403 1.0200
No log 0.2772 28 1.1268 0.2094 1.1268 1.0615
No log 0.2970 30 1.5233 0.1084 1.5233 1.2342
No log 0.3168 32 1.7665 0.1038 1.7665 1.3291
No log 0.3366 34 1.2181 0.0731 1.2181 1.1037
No log 0.3564 36 1.0282 0.2424 1.0282 1.0140
No log 0.3762 38 0.9947 0.2788 0.9947 0.9973
No log 0.3960 40 0.9990 0.2744 0.9990 0.9995
No log 0.4158 42 0.9861 0.3094 0.9861 0.9930
No log 0.4356 44 0.9777 0.3306 0.9777 0.9888
No log 0.4554 46 0.9927 0.3124 0.9927 0.9964
No log 0.4752 48 1.2263 0.2309 1.2263 1.1074
No log 0.4950 50 1.1961 0.3041 1.1961 1.0937
No log 0.5149 52 0.9984 0.3714 0.9984 0.9992
No log 0.5347 54 0.8651 0.3878 0.8651 0.9301
No log 0.5545 56 0.8172 0.4027 0.8172 0.9040
No log 0.5743 58 0.8708 0.4382 0.8708 0.9332
No log 0.5941 60 1.0450 0.3549 1.0450 1.0223
No log 0.6139 62 1.2670 0.2101 1.2670 1.1256
No log 0.6337 64 1.3174 0.1611 1.3174 1.1478
No log 0.6535 66 1.3012 0.1263 1.3012 1.1407
No log 0.6733 68 1.0807 0.2827 1.0807 1.0396
No log 0.6931 70 0.9335 0.3794 0.9335 0.9662
No log 0.7129 72 0.9311 0.4349 0.9311 0.9649
No log 0.7327 74 0.9411 0.4238 0.9411 0.9701
No log 0.7525 76 0.9730 0.3622 0.9730 0.9864
No log 0.7723 78 1.1708 0.3627 1.1708 1.0820
No log 0.7921 80 1.1539 0.3848 1.1539 1.0742
No log 0.8119 82 0.9019 0.4459 0.9019 0.9497
No log 0.8317 84 0.9176 0.4468 0.9176 0.9579
No log 0.8515 86 0.9395 0.4468 0.9395 0.9693
No log 0.8713 88 0.8054 0.4916 0.8054 0.8974
No log 0.8911 90 0.9427 0.4458 0.9427 0.9709
No log 0.9109 92 1.2495 0.2431 1.2495 1.1178
No log 0.9307 94 1.1038 0.3075 1.1038 1.0506
No log 0.9505 96 0.7698 0.4867 0.7698 0.8774
No log 0.9703 98 0.7266 0.4872 0.7266 0.8524
No log 0.9901 100 0.7164 0.4998 0.7164 0.8464
No log 1.0099 102 0.6786 0.5302 0.6786 0.8238
No log 1.0297 104 0.7213 0.6132 0.7213 0.8493
No log 1.0495 106 0.7248 0.5953 0.7248 0.8514
No log 1.0693 108 0.7907 0.5458 0.7907 0.8892
No log 1.0891 110 0.7458 0.5835 0.7458 0.8636
No log 1.1089 112 0.6406 0.5487 0.6406 0.8004
No log 1.1287 114 0.6966 0.4789 0.6966 0.8346
No log 1.1485 116 0.6818 0.4856 0.6818 0.8257
No log 1.1683 118 0.7305 0.4715 0.7305 0.8547
No log 1.1881 120 0.9033 0.2773 0.9033 0.9504
No log 1.2079 122 0.8895 0.3185 0.8895 0.9431
No log 1.2277 124 0.7447 0.5567 0.7447 0.8630
No log 1.2475 126 0.6630 0.5301 0.6630 0.8142
No log 1.2673 128 0.7231 0.4191 0.7231 0.8504
No log 1.2871 130 0.6710 0.5287 0.6710 0.8191
No log 1.3069 132 0.8890 0.4563 0.8890 0.9429
No log 1.3267 134 1.1147 0.4267 1.1147 1.0558
No log 1.3465 136 0.9834 0.4668 0.9834 0.9917
No log 1.3663 138 0.7090 0.5833 0.7090 0.8420
No log 1.3861 140 0.6726 0.6001 0.6726 0.8201
No log 1.4059 142 0.6967 0.5954 0.6967 0.8347
No log 1.4257 144 0.9021 0.4241 0.9021 0.9498
No log 1.4455 146 1.0739 0.4260 1.0739 1.0363
No log 1.4653 148 1.0259 0.3606 1.0259 1.0129
No log 1.4851 150 0.8896 0.4783 0.8896 0.9432
No log 1.5050 152 0.8075 0.5253 0.8075 0.8986
No log 1.5248 154 0.7782 0.5370 0.7782 0.8822
No log 1.5446 156 0.7971 0.5356 0.7971 0.8928
No log 1.5644 158 0.8711 0.4667 0.8711 0.9334
No log 1.5842 160 0.8633 0.4293 0.8633 0.9291
No log 1.6040 162 0.6772 0.4972 0.6772 0.8229
No log 1.6238 164 0.6550 0.5708 0.6550 0.8093
No log 1.6436 166 0.7059 0.5381 0.7059 0.8402
No log 1.6634 168 0.6078 0.6006 0.6078 0.7796
No log 1.6832 170 0.7648 0.5279 0.7648 0.8745
No log 1.7030 172 0.8977 0.5307 0.8977 0.9475
No log 1.7228 174 0.7865 0.5241 0.7865 0.8868
No log 1.7426 176 0.6014 0.5949 0.6014 0.7755
No log 1.7624 178 0.6108 0.6052 0.6108 0.7816
No log 1.7822 180 0.6110 0.5636 0.6110 0.7817
No log 1.8020 182 0.6049 0.5409 0.6049 0.7778
No log 1.8218 184 0.6080 0.5631 0.6080 0.7797
No log 1.8416 186 0.6453 0.5811 0.6453 0.8033
No log 1.8614 188 0.7283 0.5035 0.7283 0.8534
No log 1.8812 190 0.7131 0.5546 0.7131 0.8444
No log 1.9010 192 0.6585 0.5071 0.6585 0.8115
No log 1.9208 194 0.6989 0.5005 0.6989 0.8360
No log 1.9406 196 0.6693 0.5434 0.6693 0.8181
No log 1.9604 198 0.6807 0.4988 0.6807 0.8250
No log 1.9802 200 0.8169 0.4917 0.8169 0.9038
No log 2.0 202 0.7847 0.5042 0.7847 0.8858
No log 2.0198 204 0.6730 0.4676 0.6730 0.8204
No log 2.0396 206 0.6847 0.4834 0.6847 0.8275
No log 2.0594 208 0.6882 0.5273 0.6882 0.8296
No log 2.0792 210 0.7172 0.5450 0.7172 0.8469
No log 2.0990 212 0.7392 0.5567 0.7392 0.8598
No log 2.1188 214 0.6937 0.5123 0.6937 0.8329
No log 2.1386 216 0.7064 0.4765 0.7064 0.8405
No log 2.1584 218 0.7247 0.4475 0.7247 0.8513
No log 2.1782 220 0.6723 0.5560 0.6723 0.8199
No log 2.1980 222 0.6173 0.5594 0.6173 0.7857
No log 2.2178 224 0.7081 0.5333 0.7081 0.8415
No log 2.2376 226 0.7841 0.5527 0.7841 0.8855
No log 2.2574 228 0.7405 0.5195 0.7405 0.8605
No log 2.2772 230 0.6679 0.5698 0.6679 0.8172
No log 2.2970 232 0.6548 0.5071 0.6548 0.8092
No log 2.3168 234 0.6608 0.4949 0.6608 0.8129
No log 2.3366 236 0.6583 0.5071 0.6583 0.8114
No log 2.3564 238 0.6873 0.5477 0.6873 0.8290
No log 2.3762 240 0.7839 0.5450 0.7839 0.8854
No log 2.3960 242 0.7672 0.5450 0.7672 0.8759
No log 2.4158 244 0.6690 0.5862 0.6690 0.8179
No log 2.4356 246 0.6986 0.4249 0.6986 0.8358
No log 2.4554 248 0.6913 0.4363 0.6913 0.8315
No log 2.4752 250 0.6426 0.5288 0.6426 0.8016
No log 2.4950 252 0.6845 0.5686 0.6845 0.8274
No log 2.5149 254 0.7911 0.5241 0.7911 0.8895
No log 2.5347 256 0.7735 0.5241 0.7735 0.8795
No log 2.5545 258 0.6556 0.5858 0.6556 0.8097
No log 2.5743 260 0.6205 0.5856 0.6205 0.7877
No log 2.5941 262 0.6711 0.5353 0.6711 0.8192
No log 2.6139 264 0.6544 0.4960 0.6544 0.8089
No log 2.6337 266 0.6418 0.5966 0.6418 0.8011
No log 2.6535 268 0.6732 0.5925 0.6732 0.8205
No log 2.6733 270 0.6434 0.5966 0.6434 0.8022
No log 2.6931 272 0.6346 0.5902 0.6346 0.7966
No log 2.7129 274 0.6310 0.5902 0.6310 0.7944
No log 2.7327 276 0.6573 0.5799 0.6573 0.8108
No log 2.7525 278 0.8039 0.5124 0.8039 0.8966
No log 2.7723 280 0.8674 0.4667 0.8674 0.9314
No log 2.7921 282 0.7751 0.5124 0.7751 0.8804
No log 2.8119 284 0.6524 0.5825 0.6524 0.8077
No log 2.8317 286 0.6473 0.5331 0.6473 0.8046
No log 2.8515 288 0.6504 0.5345 0.6504 0.8064
No log 2.8713 290 0.6372 0.5610 0.6372 0.7983
No log 2.8911 292 0.6482 0.5597 0.6482 0.8051
No log 2.9109 294 0.6696 0.5622 0.6696 0.8183
No log 2.9307 296 0.6934 0.5413 0.6934 0.8327
No log 2.9505 298 0.6954 0.5709 0.6954 0.8339
No log 2.9703 300 0.6796 0.5746 0.6796 0.8244
No log 2.9901 302 0.6829 0.5161 0.6829 0.8264
No log 3.0099 304 0.6791 0.5197 0.6791 0.8241
No log 3.0297 306 0.6433 0.5880 0.6433 0.8020
No log 3.0495 308 0.6314 0.5629 0.6314 0.7946
No log 3.0693 310 0.6586 0.5822 0.6586 0.8116
No log 3.0891 312 0.6424 0.6021 0.6424 0.8015
No log 3.1089 314 0.6192 0.6032 0.6192 0.7869
No log 3.1287 316 0.6075 0.6296 0.6075 0.7794
No log 3.1485 318 0.6166 0.6286 0.6166 0.7852
No log 3.1683 320 0.6308 0.5847 0.6308 0.7942
No log 3.1881 322 0.6443 0.5989 0.6443 0.8027
No log 3.2079 324 0.6625 0.6255 0.6625 0.8139
No log 3.2277 326 0.6690 0.6143 0.6690 0.8179
No log 3.2475 328 0.6630 0.6132 0.6630 0.8143
No log 3.2673 330 0.6459 0.6105 0.6459 0.8037
No log 3.2871 332 0.6451 0.6073 0.6451 0.8032
No log 3.3069 334 0.7130 0.5787 0.7130 0.8444
No log 3.3267 336 0.7495 0.5534 0.7495 0.8657
No log 3.3465 338 0.6635 0.6051 0.6635 0.8146
No log 3.3663 340 0.6370 0.5889 0.6370 0.7981
No log 3.3861 342 0.6337 0.5978 0.6337 0.7960
No log 3.4059 344 0.6576 0.6143 0.6576 0.8109
No log 3.4257 346 0.7163 0.5763 0.7163 0.8464
No log 3.4455 348 0.8065 0.4444 0.8065 0.8981
No log 3.4653 350 0.7893 0.4444 0.7893 0.8884
No log 3.4851 352 0.7074 0.5651 0.7074 0.8411
No log 3.5050 354 0.6563 0.5850 0.6563 0.8101
No log 3.5248 356 0.6450 0.6176 0.6450 0.8031
No log 3.5446 358 0.6372 0.5774 0.6372 0.7982
No log 3.5644 360 0.6977 0.6071 0.6977 0.8353
No log 3.5842 362 0.8457 0.5391 0.8457 0.9196
No log 3.6040 364 1.0290 0.4462 1.0290 1.0144
No log 3.6238 366 1.0583 0.4050 1.0583 1.0287
No log 3.6436 368 0.9183 0.3721 0.9183 0.9583
No log 3.6634 370 0.7430 0.5190 0.7430 0.8619
No log 3.6832 372 0.6575 0.6127 0.6575 0.8109
No log 3.7030 374 0.6396 0.6221 0.6396 0.7997
No log 3.7228 376 0.6394 0.6221 0.6394 0.7996
No log 3.7426 378 0.6472 0.5811 0.6472 0.8045
No log 3.7624 380 0.6760 0.6015 0.6760 0.8222
No log 3.7822 382 0.8445 0.4554 0.8445 0.9190
No log 3.8020 384 0.9823 0.4655 0.9823 0.9911
No log 3.8218 386 0.9243 0.4444 0.9243 0.9614
No log 3.8416 388 0.7543 0.5266 0.7543 0.8685
No log 3.8614 390 0.6233 0.6084 0.6233 0.7895
No log 3.8812 392 0.5796 0.6104 0.5796 0.7613
No log 3.9010 394 0.5936 0.5993 0.5936 0.7705
No log 3.9208 396 0.6132 0.5993 0.6132 0.7831
No log 3.9406 398 0.6002 0.5770 0.6002 0.7747
No log 3.9604 400 0.5908 0.5961 0.5908 0.7686
No log 3.9802 402 0.5933 0.5614 0.5933 0.7703
No log 4.0 404 0.5872 0.5724 0.5872 0.7663
No log 4.0198 406 0.5817 0.6006 0.5817 0.7627
No log 4.0396 408 0.6334 0.5648 0.6334 0.7958
No log 4.0594 410 0.7054 0.4831 0.7054 0.8399
No log 4.0792 412 0.8279 0.4962 0.8279 0.9099
No log 4.0990 414 0.7776 0.4740 0.7776 0.8818
No log 4.1188 416 0.6106 0.5748 0.6106 0.7814
No log 4.1386 418 0.5833 0.6464 0.5833 0.7638
No log 4.1584 420 0.6146 0.6914 0.6146 0.7840
No log 4.1782 422 0.6081 0.6914 0.6081 0.7798
No log 4.1980 424 0.5918 0.6642 0.5918 0.7693
No log 4.2178 426 0.5858 0.6737 0.5858 0.7653
No log 4.2376 428 0.6092 0.6725 0.6092 0.7805
No log 4.2574 430 0.6334 0.5945 0.6334 0.7959
No log 4.2772 432 0.6493 0.5464 0.6493 0.8058
No log 4.2970 434 0.6577 0.4915 0.6577 0.8110
No log 4.3168 436 0.6607 0.5108 0.6607 0.8129
No log 4.3366 438 0.6620 0.5349 0.6620 0.8136
No log 4.3564 440 0.7130 0.5167 0.7130 0.8444
No log 4.3762 442 0.8546 0.5219 0.8546 0.9244
No log 4.3960 444 0.8969 0.5013 0.8969 0.9471
No log 4.4158 446 0.8029 0.5042 0.8029 0.8961
No log 4.4356 448 0.6882 0.5686 0.6882 0.8296
No log 4.4554 450 0.6898 0.4968 0.6898 0.8305
No log 4.4752 452 0.7330 0.4330 0.7330 0.8561
No log 4.4950 454 0.7219 0.4516 0.7219 0.8496
No log 4.5149 456 0.7417 0.4591 0.7417 0.8612
No log 4.5347 458 0.7658 0.4851 0.7658 0.8751
No log 4.5545 460 0.7496 0.4480 0.7496 0.8658
No log 4.5743 462 0.7432 0.4892 0.7432 0.8621
No log 4.5941 464 0.7424 0.4248 0.7424 0.8616
No log 4.6139 466 0.7810 0.3876 0.7810 0.8837
No log 4.6337 468 0.8207 0.3935 0.8207 0.9059
No log 4.6535 470 0.8556 0.4655 0.8556 0.9250
No log 4.6733 472 0.8542 0.4655 0.8542 0.9243
No log 4.6931 474 0.7842 0.4090 0.7842 0.8856
No log 4.7129 476 0.7328 0.4 0.7328 0.8560
No log 4.7327 478 0.7024 0.4516 0.7024 0.8381
No log 4.7525 480 0.7049 0.4708 0.7049 0.8396
No log 4.7723 482 0.7755 0.5033 0.7755 0.8806
No log 4.7921 484 0.8207 0.5161 0.8207 0.9059
No log 4.8119 486 0.8153 0.5161 0.8153 0.9029
No log 4.8317 488 0.7447 0.5245 0.7447 0.8629
No log 4.8515 490 0.7232 0.5245 0.7232 0.8504
No log 4.8713 492 0.6802 0.5684 0.6802 0.8248
No log 4.8911 494 0.6799 0.5794 0.6799 0.8246
No log 4.9109 496 0.6862 0.5581 0.6862 0.8284
No log 4.9307 498 0.7328 0.5244 0.7328 0.8560
0.2997 4.9505 500 0.7693 0.5257 0.7693 0.8771
0.2997 4.9703 502 0.7626 0.5364 0.7626 0.8733
0.2997 4.9901 504 0.9232 0.4425 0.9232 0.9608
0.2997 5.0099 506 1.0583 0.4440 1.0583 1.0288
0.2997 5.0297 508 0.8798 0.4700 0.8798 0.9380
0.2997 5.0495 510 0.6954 0.5342 0.6954 0.8339
0.2997 5.0693 512 0.6680 0.5246 0.6680 0.8173
0.2997 5.0891 514 0.6877 0.4398 0.6877 0.8293
0.2997 5.1089 516 0.7874 0.4076 0.7874 0.8874
0.2997 5.1287 518 0.7989 0.4076 0.7989 0.8938
0.2997 5.1485 520 0.7722 0.4638 0.7722 0.8788

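For reference, the Qwk, Mse, and Rmse columns above are quadratic weighted kappa and (root) mean squared error. Below is a small sketch of how such metrics can be computed with scikit-learn; the exact compute_metrics implementation used for this run is not documented, and rounding continuous predictions to integers before computing kappa is an assumption.

```python
# Metric sketch: QWK, MSE, RMSE as reported in the table above.
# Assumption: gold scores are integer-valued and predictions are rounded
# before computing the kappa statistic.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mse = mean_squared_error(y_true, y_pred)
    rmse = float(np.sqrt(mse))
    # Quadratic weighted kappa is defined on discrete labels, so round first.
    qwk = cohen_kappa_score(y_true.round().astype(int),
                            y_pred.round().astype(int),
                            weights="quadratic")
    return {"qwk": qwk, "mse": mse, "rmse": rmse}

print(compute_metrics([1, 2, 3, 4], [1.2, 1.8, 3.4, 3.9]))
```
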
Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1