ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k15_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6781
  • QWK: 0.5472
  • MSE: 0.6781
  • RMSE: 0.8234
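These metrics can be reproduced from gold and predicted ordinal scores. A minimal pure-Python sketch (the 0–4 score range and the toy data below are illustrative assumptions, not taken from the run):

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa (QWK) for ordinal labels 0..n_classes-1."""
    # Observed co-occurrence matrix of (gold, predicted) pairs.
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1.0
    # Expected matrix under independence, from the marginal histograms.
    n = len(y_true)
    hist_t, hist_p = Counter(y_true), Counter(y_pred)
    expected = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)]
                for i in range(n_classes)]
    # Quadratic disagreement weights: zero on the diagonal, growing with distance.
    w = [[((i - j) ** 2) / ((n_classes - 1) ** 2) for j in range(n_classes)]
         for i in range(n_classes)]
    num = sum(w[i][j] * observed[i][j]
              for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * expected[i][j]
              for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

gold = [0, 1, 2, 3, 4, 2, 1]
pred = [0, 1, 2, 3, 4, 2, 1]  # perfect agreement -> QWK 1.0, MSE/RMSE 0.0
print(quadratic_weighted_kappa(gold, pred, n_classes=5))  # 1.0
print(mse(gold, pred), math.sqrt(mse(gold, pred)))        # 0.0 0.0
```

In practice `sklearn.metrics.cohen_kappa_score(..., weights="quadratic")` computes the same quantity; the hand-rolled version above just makes the definition explicit.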

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0260 2 4.0588 -0.0092 4.0588 2.0146
No log 0.0519 4 2.2525 0.0715 2.2525 1.5008
No log 0.0779 6 1.3654 0.0760 1.3654 1.1685
No log 0.1039 8 1.1094 0.2711 1.1094 1.0533
No log 0.1299 10 1.1395 0.1603 1.1395 1.0675
No log 0.1558 12 1.4168 -0.0812 1.4168 1.1903
No log 0.1818 14 1.3400 -0.0226 1.3400 1.1576
No log 0.2078 16 1.1691 0.0950 1.1691 1.0812
No log 0.2338 18 1.2507 0.0760 1.2507 1.1183
No log 0.2597 20 1.2259 0.1202 1.2259 1.1072
No log 0.2857 22 1.1400 0.0730 1.1400 1.0677
No log 0.3117 24 1.1497 0.1096 1.1497 1.0722
No log 0.3377 26 1.2595 0.0349 1.2595 1.1223
No log 0.3636 28 1.2919 0.0232 1.2919 1.1366
No log 0.3896 30 1.1944 0.0584 1.1944 1.0929
No log 0.4156 32 1.0525 0.1997 1.0525 1.0259
No log 0.4416 34 1.0613 0.1398 1.0613 1.0302
No log 0.4675 36 1.0265 0.1671 1.0265 1.0131
No log 0.4935 38 1.0651 0.2023 1.0651 1.0320
No log 0.5195 40 1.0973 0.2074 1.0973 1.0475
No log 0.5455 42 1.0857 0.2074 1.0857 1.0420
No log 0.5714 44 1.1044 0.1645 1.1044 1.0509
No log 0.5974 46 1.0599 0.2125 1.0599 1.0295
No log 0.6234 48 0.9708 0.3014 0.9708 0.9853
No log 0.6494 50 0.9993 0.3130 0.9993 0.9997
No log 0.6753 52 1.0167 0.2236 1.0167 1.0083
No log 0.7013 54 0.9909 0.3604 0.9909 0.9955
No log 0.7273 56 0.9999 0.3094 0.9999 1.0000
No log 0.7532 58 1.0579 0.2125 1.0579 1.0285
No log 0.7792 60 1.1513 0.0613 1.1513 1.0730
No log 0.8052 62 1.1425 0.0613 1.1425 1.0689
No log 0.8312 64 1.0356 0.2513 1.0356 1.0176
No log 0.8571 66 0.9348 0.3435 0.9348 0.9668
No log 0.8831 68 0.9541 0.4067 0.9541 0.9768
No log 0.9091 70 0.9467 0.4498 0.9467 0.9730
No log 0.9351 72 1.0207 0.2857 1.0207 1.0103
No log 0.9610 74 1.0635 0.1738 1.0635 1.0312
No log 0.9870 76 1.0409 0.3026 1.0409 1.0202
No log 1.0130 78 0.9752 0.2611 0.9752 0.9875
No log 1.0390 80 0.8388 0.4 0.8388 0.9159
No log 1.0649 82 0.8658 0.2842 0.8658 0.9305
No log 1.0909 84 0.9593 0.2416 0.9593 0.9795
No log 1.1169 86 0.9935 0.3506 0.9935 0.9967
No log 1.1429 88 0.9579 0.3968 0.9579 0.9787
No log 1.1688 90 1.0195 0.3641 1.0195 1.0097
No log 1.1948 92 0.9871 0.3462 0.9871 0.9936
No log 1.2208 94 0.9884 0.2033 0.9884 0.9942
No log 1.2468 96 0.9056 0.3217 0.9056 0.9516
No log 1.2727 98 0.9027 0.3981 0.9027 0.9501
No log 1.2987 100 0.9706 0.4161 0.9706 0.9852
No log 1.3247 102 0.9512 0.4161 0.9512 0.9753
No log 1.3506 104 0.9200 0.3517 0.9200 0.9592
No log 1.3766 106 1.0755 0.4096 1.0755 1.0371
No log 1.4026 108 1.0698 0.4096 1.0698 1.0343
No log 1.4286 110 0.9353 0.3418 0.9353 0.9671
No log 1.4545 112 0.8877 0.3418 0.8877 0.9422
No log 1.4805 114 0.9012 0.375 0.9012 0.9493
No log 1.5065 116 0.8783 0.3897 0.8783 0.9372
No log 1.5325 118 0.8795 0.4455 0.8795 0.9378
No log 1.5584 120 1.0498 0.4332 1.0498 1.0246
No log 1.5844 122 1.3050 0.2805 1.3050 1.1424
No log 1.6104 124 1.3117 0.2898 1.3117 1.1453
No log 1.6364 126 1.3251 0.2713 1.3251 1.1511
No log 1.6623 128 1.1562 0.3631 1.1562 1.0753
No log 1.6883 130 1.0222 0.3715 1.0222 1.0110
No log 1.7143 132 1.0563 0.3696 1.0563 1.0278
No log 1.7403 134 1.2448 0.2891 1.2448 1.1157
No log 1.7662 136 1.3214 0.3616 1.3214 1.1495
No log 1.7922 138 1.1397 0.3255 1.1397 1.0676
No log 1.8182 140 0.9348 0.4648 0.9348 0.9668
No log 1.8442 142 0.8476 0.4465 0.8476 0.9206
No log 1.8701 144 0.8509 0.4681 0.8509 0.9224
No log 1.8961 146 0.8912 0.3984 0.8912 0.9440
No log 1.9221 148 0.7999 0.4787 0.7999 0.8944
No log 1.9481 150 0.7401 0.5197 0.7401 0.8603
No log 1.9740 152 0.7890 0.5253 0.7890 0.8883
No log 2.0 154 0.9301 0.4877 0.9301 0.9644
No log 2.0260 156 0.9060 0.4667 0.9060 0.9518
No log 2.0519 158 0.8133 0.5019 0.8133 0.9018
No log 2.0779 160 0.7723 0.5067 0.7723 0.8788
No log 2.1039 162 0.7676 0.5089 0.7676 0.8761
No log 2.1299 164 0.7782 0.4495 0.7782 0.8822
No log 2.1558 166 0.9667 0.4440 0.9667 0.9832
No log 2.1818 168 1.1383 0.3476 1.1383 1.0669
No log 2.2078 170 1.0398 0.3316 1.0398 1.0197
No log 2.2338 172 0.9258 0.3671 0.9258 0.9622
No log 2.2597 174 0.8838 0.3454 0.8838 0.9401
No log 2.2857 176 0.9478 0.4902 0.9478 0.9736
No log 2.3117 178 1.0996 0.4267 1.0996 1.0486
No log 2.3377 180 1.0833 0.4384 1.0833 1.0408
No log 2.3636 182 0.8817 0.5128 0.8817 0.9390
No log 2.3896 184 0.7431 0.4676 0.7431 0.8621
No log 2.4156 186 0.7456 0.4461 0.7456 0.8635
No log 2.4416 188 0.7288 0.4834 0.7288 0.8537
No log 2.4675 190 0.7222 0.5152 0.7222 0.8498
No log 2.4935 192 0.8163 0.5344 0.8163 0.9035
No log 2.5195 194 0.9536 0.4786 0.9536 0.9765
No log 2.5455 196 0.9370 0.4787 0.9370 0.9680
No log 2.5714 198 0.7805 0.5610 0.7805 0.8835
No log 2.5974 200 0.7442 0.4527 0.7442 0.8627
No log 2.6234 202 0.7857 0.4760 0.7857 0.8864
No log 2.6494 204 0.8518 0.3921 0.8518 0.9229
No log 2.6753 206 0.9794 0.3790 0.9794 0.9897
No log 2.7013 208 1.0861 0.3747 1.0861 1.0421
No log 2.7273 210 1.0399 0.4191 1.0399 1.0198
No log 2.7532 212 0.9231 0.4163 0.9231 0.9608
No log 2.7792 214 0.9107 0.4293 0.9107 0.9543
No log 2.8052 216 0.8706 0.4284 0.8706 0.9330
No log 2.8312 218 0.8788 0.4151 0.8788 0.9375
No log 2.8571 220 0.9637 0.4191 0.9637 0.9817
No log 2.8831 222 1.0415 0.3967 1.0415 1.0206
No log 2.9091 224 1.0176 0.4072 1.0176 1.0088
No log 2.9351 226 0.9529 0.4072 0.9529 0.9762
No log 2.9610 228 0.8044 0.4461 0.8044 0.8969
No log 2.9870 230 0.7747 0.4101 0.7747 0.8802
No log 3.0130 232 0.7886 0.4951 0.7886 0.8880
No log 3.0390 234 0.7555 0.5302 0.7555 0.8692
No log 3.0649 236 0.7845 0.4818 0.7845 0.8857
No log 3.0909 238 0.9274 0.5081 0.9274 0.9630
No log 3.1169 240 0.9154 0.5073 0.9154 0.9568
No log 3.1429 242 0.8176 0.5499 0.8176 0.9042
No log 3.1688 244 0.7430 0.5331 0.7430 0.8620
No log 3.1948 246 0.6402 0.6415 0.6402 0.8001
No log 3.2208 248 0.6378 0.5978 0.6378 0.7986
No log 3.2468 250 0.6749 0.6165 0.6749 0.8215
No log 3.2727 252 0.6607 0.5247 0.6607 0.8129
No log 3.2987 254 0.6674 0.5597 0.6674 0.8169
No log 3.3247 256 0.6753 0.5472 0.6753 0.8218
No log 3.3506 258 0.6964 0.5305 0.6964 0.8345
No log 3.3766 260 0.6681 0.5331 0.6681 0.8174
No log 3.4026 262 0.6306 0.5712 0.6306 0.7941
No log 3.4286 264 0.6343 0.5774 0.6343 0.7965
No log 3.4545 266 0.6500 0.6147 0.6500 0.8062
No log 3.4805 268 0.7205 0.5658 0.7205 0.8488
No log 3.5065 270 0.7062 0.5668 0.7062 0.8403
No log 3.5325 272 0.6646 0.5944 0.6646 0.8152
No log 3.5584 274 0.6842 0.5774 0.6842 0.8272
No log 3.5844 276 0.7160 0.5654 0.7160 0.8462
No log 3.6104 278 0.7057 0.5676 0.7057 0.8401
No log 3.6364 280 0.6959 0.4161 0.6959 0.8342
No log 3.6623 282 0.7568 0.4639 0.7568 0.8699
No log 3.6883 284 0.7278 0.4489 0.7278 0.8531
No log 3.7143 286 0.7065 0.5734 0.7065 0.8405
No log 3.7403 288 0.7916 0.5433 0.7916 0.8897
No log 3.7662 290 0.8494 0.4681 0.8494 0.9216
No log 3.7922 292 0.8370 0.4562 0.8370 0.9149
No log 3.8182 294 0.8229 0.3515 0.8229 0.9071
No log 3.8442 296 0.8378 0.4180 0.8378 0.9153
No log 3.8701 298 0.9172 0.3865 0.9172 0.9577
No log 3.8961 300 0.9364 0.4277 0.9364 0.9677
No log 3.9221 302 0.8402 0.5045 0.8402 0.9166
No log 3.9481 304 0.7596 0.4943 0.7596 0.8716
No log 3.9740 306 0.7151 0.5722 0.7151 0.8456
No log 4.0 308 0.6830 0.5577 0.6830 0.8265
No log 4.0260 310 0.6795 0.5975 0.6795 0.8243
No log 4.0519 312 0.7534 0.5735 0.7534 0.8680
No log 4.0779 314 0.9151 0.54 0.9151 0.9566
No log 4.1039 316 1.1029 0.4950 1.1029 1.0502
No log 4.1299 318 1.2022 0.4006 1.2022 1.0964
No log 4.1558 320 0.9625 0.4994 0.9625 0.9811
No log 4.1818 322 0.7432 0.5584 0.7432 0.8621
No log 4.2078 324 0.7060 0.6092 0.7060 0.8403
No log 4.2338 326 0.7029 0.5894 0.7029 0.8384
No log 4.2597 328 0.7283 0.5642 0.7283 0.8534
No log 4.2857 330 0.7716 0.5254 0.7716 0.8784
No log 4.3117 332 0.7695 0.5610 0.7695 0.8772
No log 4.3377 334 0.7097 0.5192 0.7097 0.8424
No log 4.3636 336 0.6748 0.5018 0.6748 0.8215
No log 4.3896 338 0.6960 0.5585 0.6960 0.8343
No log 4.4156 340 0.7680 0.5192 0.7680 0.8763
No log 4.4416 342 0.7978 0.4952 0.7978 0.8932
No log 4.4675 344 0.7748 0.4227 0.7748 0.8802
No log 4.4935 346 0.7849 0.4612 0.7849 0.8860
No log 4.5195 348 0.8222 0.4335 0.8222 0.9067
No log 4.5455 350 0.9065 0.4318 0.9065 0.9521
No log 4.5714 352 0.9559 0.4864 0.9559 0.9777
No log 4.5974 354 0.8831 0.5106 0.8831 0.9397
No log 4.6234 356 0.7646 0.4937 0.7646 0.8744
No log 4.6494 358 0.6760 0.4995 0.6760 0.8222
No log 4.6753 360 0.6594 0.4520 0.6594 0.8120
No log 4.7013 362 0.6437 0.4778 0.6437 0.8023
No log 4.7273 364 0.6539 0.5686 0.6539 0.8086
No log 4.7532 366 0.7001 0.5416 0.7001 0.8367
No log 4.7792 368 0.6901 0.5416 0.6901 0.8307
No log 4.8052 370 0.6247 0.5686 0.6247 0.7904
No log 4.8312 372 0.5878 0.5722 0.5878 0.7667
No log 4.8571 374 0.5970 0.5978 0.5970 0.7727
No log 4.8831 376 0.6138 0.5746 0.6138 0.7834
No log 4.9091 378 0.6643 0.5192 0.6643 0.8151
No log 4.9351 380 0.7964 0.5390 0.7964 0.8924
No log 4.9610 382 0.8339 0.5265 0.8339 0.9132
No log 4.9870 384 0.7742 0.5164 0.7742 0.8799
No log 5.0130 386 0.7270 0.5192 0.7270 0.8527
No log 5.0390 388 0.7361 0.5192 0.7361 0.8580
No log 5.0649 390 0.7698 0.4937 0.7698 0.8774
No log 5.0909 392 0.7635 0.5065 0.7635 0.8738
No log 5.1169 394 0.7324 0.5192 0.7324 0.8558
No log 5.1429 396 0.7095 0.4204 0.7095 0.8423
No log 5.1688 398 0.7248 0.4216 0.7248 0.8514
No log 5.1948 400 0.7917 0.4697 0.7917 0.8897
No log 5.2208 402 0.8774 0.4444 0.8774 0.9367
No log 5.2468 404 1.0022 0.4765 1.0022 1.0011
No log 5.2727 406 1.0233 0.4123 1.0233 1.0116
No log 5.2987 408 1.0049 0.4540 1.0049 1.0025
No log 5.3247 410 0.9363 0.4318 0.9363 0.9676
No log 5.3506 412 0.8202 0.4917 0.8202 0.9057
No log 5.3766 414 0.7881 0.4812 0.7881 0.8878
No log 5.4026 416 0.7890 0.3317 0.7890 0.8882
No log 5.4286 418 0.8322 0.3421 0.8322 0.9123
No log 5.4545 420 0.9142 0.3059 0.9142 0.9561
No log 5.4805 422 0.9590 0.3635 0.9590 0.9793
No log 5.5065 424 0.9091 0.3635 0.9091 0.9535
No log 5.5325 426 0.8304 0.3844 0.8304 0.9113
No log 5.5584 428 0.7971 0.3403 0.7971 0.8928
No log 5.5844 430 0.7651 0.4363 0.7651 0.8747
No log 5.6104 432 0.7324 0.4748 0.7324 0.8558
No log 5.6364 434 0.7088 0.5477 0.7088 0.8419
No log 5.6623 436 0.6723 0.5131 0.6723 0.8199
No log 5.6883 438 0.6608 0.5986 0.6608 0.8129
No log 5.7143 440 0.6366 0.5570 0.6366 0.7979
No log 5.7403 442 0.6256 0.5570 0.6256 0.7909
No log 5.7662 444 0.6549 0.6090 0.6549 0.8093
No log 5.7922 446 0.7215 0.5675 0.7215 0.8494
No log 5.8182 448 0.7241 0.5618 0.7241 0.8509
No log 5.8442 450 0.6838 0.5651 0.6838 0.8269
No log 5.8701 452 0.6472 0.5698 0.6472 0.8045
No log 5.8961 454 0.6488 0.5585 0.6488 0.8055
No log 5.9221 456 0.6737 0.5079 0.6737 0.8208
No log 5.9481 458 0.7353 0.4460 0.7353 0.8575
No log 5.9740 460 0.8912 0.5073 0.8912 0.9440
No log 6.0 462 1.0009 0.4548 1.0009 1.0005
No log 6.0260 464 0.9556 0.5070 0.9556 0.9776
No log 6.0519 466 0.7988 0.5676 0.7988 0.8937
No log 6.0779 468 0.6967 0.5963 0.6967 0.8347
No log 6.1039 470 0.6538 0.5833 0.6538 0.8086
No log 6.1299 472 0.6312 0.5455 0.6312 0.7945
No log 6.1558 474 0.6352 0.5455 0.6352 0.7970
No log 6.1818 476 0.6592 0.5459 0.6592 0.8119
No log 6.2078 478 0.7178 0.4943 0.7178 0.8472
No log 6.2338 480 0.7926 0.5164 0.7926 0.8903
No log 6.2597 482 0.8325 0.5102 0.8325 0.9124
No log 6.2857 484 0.8100 0.5532 0.8100 0.9000
No log 6.3117 486 0.7044 0.5429 0.7044 0.8393
No log 6.3377 488 0.6462 0.5455 0.6462 0.8039
No log 6.3636 490 0.6428 0.5455 0.6428 0.8018
No log 6.3896 492 0.6893 0.5751 0.6893 0.8303
No log 6.4156 494 0.7022 0.6004 0.7022 0.8379
No log 6.4416 496 0.6938 0.6117 0.6938 0.8330
No log 6.4675 498 0.6856 0.6117 0.6856 0.8280
0.3543 6.4935 500 0.7030 0.6004 0.7030 0.8384
0.3543 6.5195 502 0.7260 0.5867 0.7260 0.8521
0.3543 6.5455 504 0.6974 0.5540 0.6974 0.8351
0.3543 6.5714 506 0.7316 0.5406 0.7316 0.8553
0.3543 6.5974 508 0.7306 0.5292 0.7306 0.8547
0.3543 6.6234 510 0.6781 0.5472 0.6781 0.8234

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B parameters (F32, safetensors)