ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k18_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7085
  • Qwk: 0.6664
  • Mse: 0.7085
  • Rmse: 0.8417
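For reference, these metrics can be computed from gold labels and predictions with scikit-learn and NumPy; Qwk here is Cohen's kappa with quadratic weights. A minimal sketch with made-up scores (the real evaluation data is not described in this card):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold scores and model predictions (integer organization scores)
y_true = np.array([3, 2, 4, 1, 3, 2])
y_pred = np.array([3, 2, 3, 1, 4, 2])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # quadratic-weighted kappa
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(f"QWK={qwk:.4f}  MSE={mse:.4f}  RMSE={rmse:.4f}")
```

Note that MSE and RMSE treat the scores as a regression target, while QWK rewards ordinal agreement, which is why the card reports both.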

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
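The linear scheduler decays the learning rate from its initial value to zero over the total number of optimizer steps. A sketch of that schedule, assuming no warmup (warmup settings are not listed in this card) and roughly 87 steps per epoch, as implied by the epoch/step columns in the table below:

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-05) -> float:
    """Linearly decay the learning rate from base_lr to 0 (no warmup assumed)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# The log shows epoch 0.0230 at step 2, i.e. ~87 steps per epoch,
# so 100 epochs corresponds to ~8700 optimizer steps.
total = 8700
print(linear_lr(0, total))      # 2e-05 at the start of training
print(linear_lr(total, total))  # 0.0 at the final step
```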

Training results

The training loss is only logged every 500 steps, so earlier rows show "No log"; validation metrics are reported every 2 steps.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0230 2 5.3702 -0.0401 5.3702 2.3174
No log 0.0460 4 3.4302 0.0261 3.4302 1.8521
No log 0.0690 6 2.1725 0.0049 2.1725 1.4739
No log 0.0920 8 1.8678 0.0784 1.8678 1.3667
No log 0.1149 10 1.6273 -0.0251 1.6273 1.2757
No log 0.1379 12 1.1985 0.2327 1.1985 1.0948
No log 0.1609 14 1.1434 0.3727 1.1434 1.0693
No log 0.1839 16 1.1459 0.3179 1.1459 1.0705
No log 0.2069 18 1.2405 0.2992 1.2405 1.1138
No log 0.2299 20 1.3623 0.2470 1.3623 1.1672
No log 0.2529 22 1.1717 0.2998 1.1717 1.0825
No log 0.2759 24 1.1047 0.1757 1.1047 1.0510
No log 0.2989 26 1.1584 0.1539 1.1584 1.0763
No log 0.3218 28 1.0985 0.2650 1.0985 1.0481
No log 0.3448 30 1.1268 0.2047 1.1268 1.0615
No log 0.3678 32 1.1207 0.2047 1.1207 1.0586
No log 0.3908 34 1.0809 0.3181 1.0809 1.0397
No log 0.4138 36 1.0762 0.2696 1.0762 1.0374
No log 0.4368 38 1.1357 0.3305 1.1357 1.0657
No log 0.4598 40 1.0940 0.3612 1.0940 1.0460
No log 0.4828 42 1.0363 0.3603 1.0363 1.0180
No log 0.5057 44 1.1341 0.4330 1.1341 1.0650
No log 0.5287 46 1.1927 0.4247 1.1927 1.0921
No log 0.5517 48 1.0360 0.4627 1.0360 1.0178
No log 0.5747 50 1.0036 0.4726 1.0036 1.0018
No log 0.5977 52 0.9922 0.4726 0.9922 0.9961
No log 0.6207 54 1.1687 0.4738 1.1687 1.0810
No log 0.6437 56 1.5600 0.1196 1.5600 1.2490
No log 0.6667 58 1.4582 0.2100 1.4582 1.2075
No log 0.6897 60 1.2577 0.3969 1.2577 1.1215
No log 0.7126 62 1.0252 0.4955 1.0252 1.0125
No log 0.7356 64 1.1683 0.3769 1.1683 1.0809
No log 0.7586 66 1.1014 0.3795 1.1014 1.0495
No log 0.7816 68 1.0872 0.4304 1.0872 1.0427
No log 0.8046 70 1.3937 0.3597 1.3937 1.1805
No log 0.8276 72 1.6325 0.2927 1.6325 1.2777
No log 0.8506 74 1.5081 0.3229 1.5081 1.2280
No log 0.8736 76 1.0738 0.4582 1.0738 1.0362
No log 0.8966 78 0.8757 0.4376 0.8757 0.9358
No log 0.9195 80 0.8522 0.4564 0.8522 0.9231
No log 0.9425 82 0.8897 0.5229 0.8897 0.9433
No log 0.9655 84 1.0765 0.4237 1.0765 1.0375
No log 0.9885 86 1.2416 0.3585 1.2416 1.1143
No log 1.0115 88 1.1344 0.3720 1.1344 1.0651
No log 1.0345 90 0.9868 0.4699 0.9868 0.9934
No log 1.0575 92 0.9015 0.5427 0.9015 0.9495
No log 1.0805 94 1.0027 0.4851 1.0027 1.0013
No log 1.1034 96 1.0824 0.4399 1.0824 1.0404
No log 1.1264 98 1.2383 0.4511 1.2383 1.1128
No log 1.1494 100 1.0836 0.4399 1.0836 1.0410
No log 1.1724 102 0.8852 0.5177 0.8852 0.9408
No log 1.1954 104 0.8919 0.5270 0.8919 0.9444
No log 1.2184 106 0.9072 0.5151 0.9072 0.9524
No log 1.2414 108 1.0596 0.4844 1.0596 1.0294
No log 1.2644 110 1.0207 0.5171 1.0207 1.0103
No log 1.2874 112 1.0437 0.5 1.0437 1.0216
No log 1.3103 114 1.0338 0.5055 1.0338 1.0168
No log 1.3333 116 1.0143 0.4817 1.0143 1.0071
No log 1.3563 118 1.0152 0.4845 1.0152 1.0075
No log 1.3793 120 0.9681 0.4968 0.9681 0.9839
No log 1.4023 122 1.1657 0.4500 1.1657 1.0797
No log 1.4253 124 1.2303 0.4426 1.2303 1.1092
No log 1.4483 126 1.0212 0.5024 1.0212 1.0105
No log 1.4713 128 0.9374 0.5394 0.9374 0.9682
No log 1.4943 130 0.9852 0.4761 0.9852 0.9926
No log 1.5172 132 1.1806 0.4469 1.1806 1.0866
No log 1.5402 134 1.2092 0.4246 1.2092 1.0996
No log 1.5632 136 1.0848 0.4614 1.0848 1.0415
No log 1.5862 138 0.8864 0.4393 0.8864 0.9415
No log 1.6092 140 0.8274 0.4717 0.8274 0.9096
No log 1.6322 142 0.8007 0.5134 0.8007 0.8948
No log 1.6552 144 0.8681 0.5859 0.8681 0.9317
No log 1.6782 146 1.1384 0.4715 1.1384 1.0670
No log 1.7011 148 1.1061 0.5047 1.1061 1.0517
No log 1.7241 150 0.8868 0.5756 0.8868 0.9417
No log 1.7471 152 0.7817 0.5767 0.7817 0.8841
No log 1.7701 154 0.7808 0.5456 0.7808 0.8836
No log 1.7931 156 0.7662 0.6184 0.7662 0.8754
No log 1.8161 158 0.8739 0.5660 0.8739 0.9348
No log 1.8391 160 0.9408 0.5491 0.9408 0.9699
No log 1.8621 162 0.8653 0.5824 0.8653 0.9302
No log 1.8851 164 0.7556 0.6353 0.7556 0.8693
No log 1.9080 166 0.7629 0.6517 0.7629 0.8734
No log 1.9310 168 0.7921 0.6346 0.7921 0.8900
No log 1.9540 170 0.8397 0.6335 0.8397 0.9163
No log 1.9770 172 0.9096 0.5867 0.9096 0.9537
No log 2.0 174 0.8923 0.5867 0.8923 0.9446
No log 2.0230 176 0.7856 0.6271 0.7856 0.8863
No log 2.0460 178 0.7019 0.6723 0.7019 0.8378
No log 2.0690 180 0.7667 0.5808 0.7667 0.8756
No log 2.0920 182 0.7523 0.5809 0.7523 0.8674
No log 2.1149 184 0.6971 0.6018 0.6971 0.8349
No log 2.1379 186 0.7201 0.6459 0.7201 0.8486
No log 2.1609 188 0.6999 0.6333 0.6999 0.8366
No log 2.1839 190 0.7096 0.6521 0.7096 0.8424
No log 2.2069 192 0.7096 0.6544 0.7096 0.8424
No log 2.2299 194 0.6890 0.6738 0.6890 0.8301
No log 2.2529 196 0.7508 0.6559 0.7508 0.8665
No log 2.2759 198 0.7660 0.6421 0.7660 0.8752
No log 2.2989 200 0.6592 0.6592 0.6592 0.8119
No log 2.3218 202 0.6556 0.7173 0.6556 0.8097
No log 2.3448 204 0.6953 0.6716 0.6953 0.8339
No log 2.3678 206 0.6500 0.6750 0.6500 0.8062
No log 2.3908 208 0.7058 0.6359 0.7058 0.8401
No log 2.4138 210 0.7848 0.6108 0.7848 0.8859
No log 2.4368 212 0.7424 0.6072 0.7424 0.8616
No log 2.4598 214 0.6828 0.6481 0.6828 0.8263
No log 2.4828 216 0.6860 0.6326 0.6860 0.8282
No log 2.5057 218 0.6939 0.6724 0.6939 0.8330
No log 2.5287 220 0.6713 0.6843 0.6713 0.8194
No log 2.5517 222 0.6629 0.6562 0.6629 0.8142
No log 2.5747 224 0.6879 0.6066 0.6879 0.8294
No log 2.5977 226 0.6528 0.6398 0.6528 0.8080
No log 2.6207 228 0.6469 0.6305 0.6469 0.8043
No log 2.6437 230 0.6343 0.6772 0.6343 0.7965
No log 2.6667 232 0.6377 0.7295 0.6377 0.7986
No log 2.6897 234 0.6421 0.7254 0.6421 0.8013
No log 2.7126 236 0.6938 0.6917 0.6938 0.8330
No log 2.7356 238 0.6634 0.7114 0.6634 0.8145
No log 2.7586 240 0.7423 0.7004 0.7423 0.8616
No log 2.7816 242 0.7020 0.6857 0.7020 0.8379
No log 2.8046 244 0.6375 0.7326 0.6375 0.7984
No log 2.8276 246 0.6253 0.7404 0.6253 0.7908
No log 2.8506 248 0.6175 0.7187 0.6175 0.7858
No log 2.8736 250 0.6322 0.6657 0.6322 0.7951
No log 2.8966 252 0.7148 0.6330 0.7148 0.8455
No log 2.9195 254 0.6951 0.6534 0.6951 0.8337
No log 2.9425 256 0.6388 0.6358 0.6388 0.7993
No log 2.9655 258 0.6740 0.6839 0.6740 0.8210
No log 2.9885 260 0.7121 0.6713 0.7121 0.8438
No log 3.0115 262 0.7058 0.6924 0.7058 0.8401
No log 3.0345 264 0.7042 0.6936 0.7042 0.8392
No log 3.0575 266 0.7398 0.6614 0.7398 0.8601
No log 3.0805 268 0.8504 0.5921 0.8504 0.9222
No log 3.1034 270 0.9267 0.5484 0.9267 0.9626
No log 3.1264 272 0.8385 0.5906 0.8385 0.9157
No log 3.1494 274 0.7150 0.6298 0.7150 0.8456
No log 3.1724 276 0.6606 0.6029 0.6606 0.8128
No log 3.1954 278 0.6583 0.6145 0.6583 0.8113
No log 3.2184 280 0.6533 0.6241 0.6533 0.8083
No log 3.2414 282 0.6786 0.6495 0.6786 0.8238
No log 3.2644 284 0.6947 0.6495 0.6947 0.8335
No log 3.2874 286 0.6728 0.6365 0.6728 0.8203
No log 3.3103 288 0.6872 0.6189 0.6872 0.8290
No log 3.3333 290 0.7264 0.6012 0.7264 0.8523
No log 3.3563 292 0.7113 0.5915 0.7113 0.8434
No log 3.3793 294 0.6984 0.6303 0.6984 0.8357
No log 3.4023 296 0.7376 0.6540 0.7376 0.8588
No log 3.4253 298 0.7689 0.6306 0.7689 0.8769
No log 3.4483 300 0.7219 0.6460 0.7219 0.8497
No log 3.4713 302 0.6840 0.6241 0.6840 0.8271
No log 3.4943 304 0.6812 0.6226 0.6812 0.8253
No log 3.5172 306 0.6639 0.6017 0.6639 0.8148
No log 3.5402 308 0.6738 0.6382 0.6738 0.8208
No log 3.5632 310 0.7470 0.6255 0.7470 0.8643
No log 3.5862 312 0.7960 0.6171 0.7960 0.8922
No log 3.6092 314 0.7231 0.6363 0.7231 0.8503
No log 3.6322 316 0.7351 0.6452 0.7351 0.8574
No log 3.6552 318 0.7694 0.6475 0.7694 0.8772
No log 3.6782 320 0.7416 0.6398 0.7416 0.8612
No log 3.7011 322 0.7222 0.6347 0.7222 0.8498
No log 3.7241 324 0.7728 0.6077 0.7728 0.8791
No log 3.7471 326 0.9275 0.5800 0.9275 0.9630
No log 3.7701 328 0.9687 0.5561 0.9687 0.9842
No log 3.7931 330 0.8608 0.5730 0.8608 0.9278
No log 3.8161 332 0.7843 0.5989 0.7843 0.8856
No log 3.8391 334 0.7477 0.6055 0.7477 0.8647
No log 3.8621 336 0.7411 0.6443 0.7411 0.8609
No log 3.8851 338 0.7386 0.6259 0.7386 0.8594
No log 3.9080 340 0.7923 0.6462 0.7923 0.8901
No log 3.9310 342 0.7990 0.6493 0.7990 0.8939
No log 3.9540 344 0.7706 0.6627 0.7706 0.8778
No log 3.9770 346 0.7162 0.6767 0.7162 0.8463
No log 4.0 348 0.7020 0.6850 0.7020 0.8379
No log 4.0230 350 0.6851 0.6779 0.6851 0.8277
No log 4.0460 352 0.6646 0.6600 0.6646 0.8153
No log 4.0690 354 0.6937 0.6644 0.6937 0.8329
No log 4.0920 356 0.6977 0.6603 0.6977 0.8353
No log 4.1149 358 0.8158 0.6385 0.8158 0.9032
No log 4.1379 360 1.0177 0.5462 1.0177 1.0088
No log 4.1609 362 1.1025 0.4811 1.1025 1.0500
No log 4.1839 364 0.9976 0.5537 0.9976 0.9988
No log 4.2069 366 0.8639 0.6223 0.8639 0.9295
No log 4.2299 368 0.8722 0.6145 0.8722 0.9339
No log 4.2529 370 0.9324 0.6015 0.9324 0.9656
No log 4.2759 372 0.9022 0.5797 0.9022 0.9499
No log 4.2989 374 0.8824 0.5873 0.8824 0.9394
No log 4.3218 376 0.7423 0.6284 0.7423 0.8615
No log 4.3448 378 0.6851 0.6045 0.6851 0.8277
No log 4.3678 380 0.6766 0.6134 0.6766 0.8226
No log 4.3908 382 0.6872 0.5987 0.6872 0.8290
No log 4.4138 384 0.6738 0.6148 0.6738 0.8209
No log 4.4368 386 0.6759 0.6343 0.6759 0.8222
No log 4.4598 388 0.8108 0.6124 0.8108 0.9005
No log 4.4828 390 1.0258 0.5745 1.0258 1.0128
No log 4.5057 392 0.9725 0.6076 0.9725 0.9862
No log 4.5287 394 0.8276 0.6297 0.8276 0.9097
No log 4.5517 396 0.6942 0.6457 0.6942 0.8332
No log 4.5747 398 0.6536 0.7069 0.6536 0.8085
No log 4.5977 400 0.6429 0.7137 0.6429 0.8018
No log 4.6207 402 0.6441 0.6803 0.6441 0.8026
No log 4.6437 404 0.6281 0.7016 0.6281 0.7925
No log 4.6667 406 0.6230 0.7043 0.6230 0.7893
No log 4.6897 408 0.6606 0.7186 0.6606 0.8128
No log 4.7126 410 0.6119 0.7485 0.6119 0.7822
No log 4.7356 412 0.5864 0.7321 0.5864 0.7658
No log 4.7586 414 0.6019 0.7114 0.6019 0.7758
No log 4.7816 416 0.5914 0.7222 0.5914 0.7690
No log 4.8046 418 0.5772 0.7267 0.5772 0.7598
No log 4.8276 420 0.5899 0.7417 0.5899 0.7680
No log 4.8506 422 0.6053 0.6938 0.6053 0.7780
No log 4.8736 424 0.6084 0.6983 0.6084 0.7800
No log 4.8966 426 0.6116 0.6801 0.6116 0.7821
No log 4.9195 428 0.6031 0.7033 0.6031 0.7766
No log 4.9425 430 0.5982 0.7305 0.5982 0.7734
No log 4.9655 432 0.5956 0.7409 0.5956 0.7717
No log 4.9885 434 0.5934 0.7549 0.5934 0.7703
No log 5.0115 436 0.6052 0.7358 0.6052 0.7780
No log 5.0345 438 0.6418 0.6957 0.6418 0.8011
No log 5.0575 440 0.6267 0.6888 0.6267 0.7916
No log 5.0805 442 0.6370 0.6948 0.6370 0.7981
No log 5.1034 444 0.7185 0.6457 0.7185 0.8477
No log 5.1264 446 0.6846 0.6228 0.6846 0.8274
No log 5.1494 448 0.6408 0.6529 0.6408 0.8005
No log 5.1724 450 0.6454 0.6490 0.6454 0.8034
No log 5.1954 452 0.6359 0.6762 0.6359 0.7974
No log 5.2184 454 0.6224 0.6830 0.6224 0.7889
No log 5.2414 456 0.6129 0.6958 0.6129 0.7829
No log 5.2644 458 0.6196 0.6948 0.6196 0.7872
No log 5.2874 460 0.6950 0.6841 0.6950 0.8336
No log 5.3103 462 0.6852 0.6946 0.6852 0.8278
No log 5.3333 464 0.6198 0.7034 0.6198 0.7873
No log 5.3563 466 0.6440 0.6726 0.6440 0.8025
No log 5.3793 468 0.6600 0.6917 0.6600 0.8124
No log 5.4023 470 0.6265 0.6749 0.6265 0.7915
No log 5.4253 472 0.6838 0.6760 0.6838 0.8269
No log 5.4483 474 0.8712 0.6203 0.8712 0.9334
No log 5.4713 476 0.9915 0.5805 0.9915 0.9957
No log 5.4943 478 0.9026 0.5854 0.9026 0.9500
No log 5.5172 480 0.7036 0.6762 0.7036 0.8388
No log 5.5402 482 0.6469 0.6955 0.6469 0.8043
No log 5.5632 484 0.6703 0.6672 0.6703 0.8187
No log 5.5862 486 0.6782 0.6683 0.6782 0.8235
No log 5.6092 488 0.7027 0.6703 0.7027 0.8383
No log 5.6322 490 0.6715 0.6662 0.6715 0.8194
No log 5.6552 492 0.6668 0.6772 0.6668 0.8166
No log 5.6782 494 0.6725 0.6804 0.6725 0.8201
No log 5.7011 496 0.6489 0.6772 0.6489 0.8055
No log 5.7241 498 0.6387 0.6795 0.6387 0.7992
0.4081 5.7471 500 0.6346 0.6762 0.6346 0.7966
0.4081 5.7701 502 0.6520 0.6800 0.6520 0.8075
0.4081 5.7931 504 0.7352 0.6635 0.7352 0.8574
0.4081 5.8161 506 0.7569 0.6655 0.7569 0.8700
0.4081 5.8391 508 0.6615 0.6944 0.6615 0.8133
0.4081 5.8621 510 0.6372 0.7021 0.6372 0.7983
0.4081 5.8851 512 0.6640 0.6877 0.6640 0.8149
0.4081 5.9080 514 0.6575 0.6868 0.6575 0.8109
0.4081 5.9310 516 0.6508 0.6791 0.6508 0.8068
0.4081 5.9540 518 0.6229 0.6720 0.6229 0.7892
0.4081 5.9770 520 0.6214 0.7131 0.6214 0.7883
0.4081 6.0 522 0.6240 0.7236 0.6240 0.7899
0.4081 6.0230 524 0.6404 0.7402 0.6404 0.8003
0.4081 6.0460 526 0.6869 0.6972 0.6869 0.8288
0.4081 6.0690 528 0.7247 0.6549 0.7247 0.8513
0.4081 6.0920 530 0.7338 0.6282 0.7338 0.8566
0.4081 6.1149 532 0.6910 0.6722 0.6910 0.8313
0.4081 6.1379 534 0.6947 0.6644 0.6947 0.8335
0.4081 6.1609 536 0.7085 0.6664 0.7085 0.8417
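The reported evaluation numbers correspond to the final step (epoch 6.1609, step 536), not the best one: Qwk peaked at 0.7549 around epoch 4.99. A small sketch for selecting the best checkpoint from such a log, using a few rows copied from the table above:

```python
# (epoch, step, validation_loss, qwk) for selected rows of the log above
rows = [
    (4.9655, 432, 0.5956, 0.7409),
    (4.9885, 434, 0.5934, 0.7549),
    (6.1609, 536, 0.7085, 0.6664),  # final step: the reported result
]

best = max(rows, key=lambda r: r[3])  # pick the row with the highest QWK
print(f"best epoch={best[0]} step={best[1]} qwk={best[3]}")
```

In the Trainer API this corresponds to `load_best_model_at_end` with `metric_for_best_model` set to the QWK metric, which this run does not appear to have used.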

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32