ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k19_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8219
  • Qwk: 0.5982
  • Mse: 0.8219
  • Rmse: 0.9066

Model description

More information needed

Intended uses & limitations

More information needed
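
Pending fuller documentation, the sketch below shows one plausible way to run inference with this checkpoint. The single-output regression head is an assumption (suggested by the MSE-style loss in the training log), and the essay text is a placeholder.

```python
# Sketch: scoring an essay's organization with this checkpoint.
# The regression-style head is assumed, not confirmed by the card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = ("MayBashendy/ArabicNewSplits8_usingWellWrittenEssays_"
            "FineTuningAraBERT_run2_AugV5_k19_task1_organization")
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # placeholder: an Arabic essay to score
inputs = tokenizer(essay, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
# With a 1-dim regression head this is a single score;
# a multi-class head would need an argmax instead.
print(f"predicted organization score: {logits.squeeze().item():.2f}")
```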

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
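
For illustration, these hyperparameters map onto transformers TrainingArguments roughly as sketched below. This is not the original training script: the regression head (num_labels=1), output directory, and evaluation cadence (the results table logs every 2 steps) are assumptions or inferences, and the dataset wiring is left as placeholders. The listed Adam betas and epsilon are the Trainer defaults, so they need no explicit optimizer arguments.

```python
# Sketch: the reported hyperparameters expressed as TrainingArguments.
# Only the numeric values are taken from the card; the rest is assumed.
from transformers import (AutoModelForSequenceClassification, Trainer,
                          TrainingArguments)

model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02", num_labels=1)  # regression head assumed

args = TrainingArguments(
    output_dir="out",                # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",           # the table evaluates every 2 steps
    eval_steps=2,
)
# trainer = Trainer(model=model, args=args,
#                   train_dataset=..., eval_dataset=...)  # placeholders
# trainer.train()
```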

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0217 | 2 | 5.4061 | -0.0020 | 5.4061 | 2.3251 |
| No log | 0.0435 | 4 | 3.2790 | 0.0668 | 3.2790 | 1.8108 |
| No log | 0.0652 | 6 | 2.3606 | -0.0685 | 2.3606 | 1.5364 |
| No log | 0.0870 | 8 | 2.5477 | -0.1747 | 2.5477 | 1.5961 |
| No log | 0.1087 | 10 | 1.6410 | 0.0610 | 1.6410 | 1.2810 |
| No log | 0.1304 | 12 | 1.1641 | 0.2329 | 1.1641 | 1.0789 |
| No log | 0.1522 | 14 | 1.7210 | 0.0894 | 1.7210 | 1.3119 |
| No log | 0.1739 | 16 | 2.4707 | 0.0573 | 2.4707 | 1.5718 |
| No log | 0.1957 | 18 | 2.0162 | 0.0777 | 2.0162 | 1.4199 |
| No log | 0.2174 | 20 | 1.2285 | 0.3243 | 1.2285 | 1.1084 |
| No log | 0.2391 | 22 | 1.1208 | 0.2149 | 1.1208 | 1.0587 |
| No log | 0.2609 | 24 | 1.0630 | 0.2619 | 1.0630 | 1.0310 |
| No log | 0.2826 | 26 | 1.1005 | 0.4093 | 1.1005 | 1.0491 |
| No log | 0.3043 | 28 | 1.1938 | 0.3033 | 1.1938 | 1.0926 |
| No log | 0.3261 | 30 | 1.4462 | 0.1522 | 1.4462 | 1.2026 |
| No log | 0.3478 | 32 | 1.4524 | 0.1875 | 1.4524 | 1.2052 |
| No log | 0.3696 | 34 | 1.2254 | 0.3340 | 1.2254 | 1.1070 |
| No log | 0.3913 | 36 | 1.0888 | 0.3964 | 1.0888 | 1.0434 |
| No log | 0.4130 | 38 | 0.8688 | 0.4761 | 0.8688 | 0.9321 |
| No log | 0.4348 | 40 | 0.7479 | 0.5203 | 0.7479 | 0.8648 |
| No log | 0.4565 | 42 | 0.7724 | 0.4862 | 0.7724 | 0.8789 |
| No log | 0.4783 | 44 | 0.7543 | 0.5130 | 0.7543 | 0.8685 |
| No log | 0.5 | 46 | 0.8712 | 0.4875 | 0.8712 | 0.9334 |
| No log | 0.5217 | 48 | 1.1806 | 0.4409 | 1.1806 | 1.0866 |
| No log | 0.5435 | 50 | 1.2900 | 0.4324 | 1.2900 | 1.1358 |
| No log | 0.5652 | 52 | 1.0692 | 0.4487 | 1.0692 | 1.0340 |
| No log | 0.5870 | 54 | 0.8503 | 0.4456 | 0.8503 | 0.9221 |
| No log | 0.6087 | 56 | 0.8977 | 0.3498 | 0.8977 | 0.9475 |
| No log | 0.6304 | 58 | 1.0080 | 0.2643 | 1.0080 | 1.0040 |
| No log | 0.6522 | 60 | 1.0685 | 0.2708 | 1.0685 | 1.0337 |
| No log | 0.6739 | 62 | 0.9910 | 0.3829 | 0.9910 | 0.9955 |
| No log | 0.6957 | 64 | 0.8817 | 0.4187 | 0.8817 | 0.9390 |
| No log | 0.7174 | 66 | 0.8313 | 0.4993 | 0.8313 | 0.9117 |
| No log | 0.7391 | 68 | 0.8090 | 0.5615 | 0.8090 | 0.8994 |
| No log | 0.7609 | 70 | 0.8693 | 0.5644 | 0.8693 | 0.9323 |
| No log | 0.7826 | 72 | 1.1872 | 0.4347 | 1.1872 | 1.0896 |
| No log | 0.8043 | 74 | 1.2843 | 0.4484 | 1.2843 | 1.1333 |
| No log | 0.8261 | 76 | 0.9447 | 0.5772 | 0.9447 | 0.9720 |
| No log | 0.8478 | 78 | 0.9076 | 0.5756 | 0.9076 | 0.9527 |
| No log | 0.8696 | 80 | 0.7954 | 0.5929 | 0.7954 | 0.8918 |
| No log | 0.8913 | 82 | 0.8251 | 0.4720 | 0.8251 | 0.9083 |
| No log | 0.9130 | 84 | 0.8132 | 0.5056 | 0.8132 | 0.9018 |
| No log | 0.9348 | 86 | 0.8465 | 0.6042 | 0.8465 | 0.9200 |
| No log | 0.9565 | 88 | 1.1263 | 0.4877 | 1.1263 | 1.0613 |
| No log | 0.9783 | 90 | 1.2194 | 0.4583 | 1.2194 | 1.1043 |
| No log | 1.0 | 92 | 0.9542 | 0.5453 | 0.9542 | 0.9768 |
| No log | 1.0217 | 94 | 0.8058 | 0.4882 | 0.8058 | 0.8976 |
| No log | 1.0435 | 96 | 0.8899 | 0.4583 | 0.8899 | 0.9433 |
| No log | 1.0652 | 98 | 0.8445 | 0.4245 | 0.8445 | 0.9190 |
| No log | 1.0870 | 100 | 0.9034 | 0.5730 | 0.9034 | 0.9505 |
| No log | 1.1087 | 102 | 1.3387 | 0.4265 | 1.3387 | 1.1570 |
| No log | 1.1304 | 104 | 1.3577 | 0.4360 | 1.3577 | 1.1652 |
| No log | 1.1522 | 106 | 1.0371 | 0.5035 | 1.0371 | 1.0184 |
| No log | 1.1739 | 108 | 0.8511 | 0.6177 | 0.8511 | 0.9225 |
| No log | 1.1957 | 110 | 0.8674 | 0.6166 | 0.8674 | 0.9313 |
| No log | 1.2174 | 112 | 1.0513 | 0.5164 | 1.0513 | 1.0253 |
| No log | 1.2391 | 114 | 1.0601 | 0.5273 | 1.0601 | 1.0296 |
| No log | 1.2609 | 116 | 0.8478 | 0.6167 | 0.8478 | 0.9207 |
| No log | 1.2826 | 118 | 0.7395 | 0.5718 | 0.7395 | 0.8599 |
| No log | 1.3043 | 120 | 0.7637 | 0.5262 | 0.7637 | 0.8739 |
| No log | 1.3261 | 122 | 0.7692 | 0.4900 | 0.7692 | 0.8770 |
| No log | 1.3478 | 124 | 0.8006 | 0.5047 | 0.8006 | 0.8948 |
| No log | 1.3696 | 126 | 0.8766 | 0.5106 | 0.8766 | 0.9363 |
| No log | 1.3913 | 128 | 0.9192 | 0.5452 | 0.9192 | 0.9587 |
| No log | 1.4130 | 130 | 0.8916 | 0.4838 | 0.8916 | 0.9443 |
| No log | 1.4348 | 132 | 0.8867 | 0.5050 | 0.8867 | 0.9417 |
| No log | 1.4565 | 134 | 0.8157 | 0.4909 | 0.8157 | 0.9032 |
| No log | 1.4783 | 136 | 0.7657 | 0.6069 | 0.7657 | 0.8750 |
| No log | 1.5 | 138 | 0.7812 | 0.6569 | 0.7812 | 0.8839 |
| No log | 1.5217 | 140 | 0.8637 | 0.6383 | 0.8637 | 0.9294 |
| No log | 1.5435 | 142 | 0.8764 | 0.6366 | 0.8764 | 0.9362 |
| No log | 1.5652 | 144 | 0.8796 | 0.6185 | 0.8796 | 0.9378 |
| No log | 1.5870 | 146 | 0.8346 | 0.6171 | 0.8346 | 0.9136 |
| No log | 1.6087 | 148 | 0.7321 | 0.6426 | 0.7321 | 0.8556 |
| No log | 1.6304 | 150 | 0.7378 | 0.6478 | 0.7378 | 0.8590 |
| No log | 1.6522 | 152 | 0.7575 | 0.6301 | 0.7575 | 0.8704 |
| No log | 1.6739 | 154 | 0.8567 | 0.5886 | 0.8567 | 0.9256 |
| No log | 1.6957 | 156 | 1.0869 | 0.5302 | 1.0869 | 1.0426 |
| No log | 1.7174 | 158 | 1.0514 | 0.5107 | 1.0514 | 1.0254 |
| No log | 1.7391 | 160 | 0.8571 | 0.5666 | 0.8571 | 0.9258 |
| No log | 1.7609 | 162 | 0.8405 | 0.5432 | 0.8405 | 0.9168 |
| No log | 1.7826 | 164 | 0.9812 | 0.5193 | 0.9812 | 0.9905 |
| No log | 1.8043 | 166 | 0.9123 | 0.5241 | 0.9123 | 0.9552 |
| No log | 1.8261 | 168 | 0.7839 | 0.5358 | 0.7839 | 0.8854 |
| No log | 1.8478 | 170 | 0.8573 | 0.5922 | 0.8573 | 0.9259 |
| No log | 1.8696 | 172 | 0.9111 | 0.5937 | 0.9111 | 0.9545 |
| No log | 1.8913 | 174 | 0.8570 | 0.6040 | 0.8570 | 0.9258 |
| No log | 1.9130 | 176 | 0.7981 | 0.5995 | 0.7981 | 0.8934 |
| No log | 1.9348 | 178 | 0.7635 | 0.5165 | 0.7635 | 0.8738 |
| No log | 1.9565 | 180 | 0.7705 | 0.4722 | 0.7705 | 0.8778 |
| No log | 1.9783 | 182 | 0.7718 | 0.5085 | 0.7718 | 0.8785 |
| No log | 2.0 | 184 | 0.7674 | 0.5490 | 0.7674 | 0.8760 |
| No log | 2.0217 | 186 | 0.7502 | 0.5484 | 0.7502 | 0.8661 |
| No log | 2.0435 | 188 | 0.7398 | 0.6173 | 0.7398 | 0.8601 |
| No log | 2.0652 | 190 | 0.7960 | 0.6105 | 0.7960 | 0.8922 |
| No log | 2.0870 | 192 | 0.8512 | 0.5988 | 0.8512 | 0.9226 |
| No log | 2.1087 | 194 | 0.9712 | 0.5623 | 0.9712 | 0.9855 |
| No log | 2.1304 | 196 | 1.0108 | 0.5451 | 1.0108 | 1.0054 |
| No log | 2.1522 | 198 | 0.8846 | 0.5700 | 0.8846 | 0.9405 |
| No log | 2.1739 | 200 | 0.7916 | 0.4805 | 0.7916 | 0.8897 |
| No log | 2.1957 | 202 | 0.8201 | 0.4379 | 0.8201 | 0.9056 |
| No log | 2.2174 | 204 | 0.7962 | 0.4890 | 0.7962 | 0.8923 |
| No log | 2.2391 | 206 | 0.8138 | 0.5372 | 0.8138 | 0.9021 |
| No log | 2.2609 | 208 | 0.8946 | 0.5867 | 0.8946 | 0.9458 |
| No log | 2.2826 | 210 | 0.8559 | 0.5931 | 0.8559 | 0.9252 |
| No log | 2.3043 | 212 | 0.7660 | 0.6508 | 0.7660 | 0.8752 |
| No log | 2.3261 | 214 | 0.7252 | 0.6721 | 0.7252 | 0.8516 |
| No log | 2.3478 | 216 | 0.7243 | 0.6835 | 0.7243 | 0.8511 |
| No log | 2.3696 | 218 | 0.7206 | 0.6784 | 0.7206 | 0.8489 |
| No log | 2.3913 | 220 | 0.7303 | 0.6484 | 0.7303 | 0.8546 |
| No log | 2.4130 | 222 | 0.7616 | 0.6172 | 0.7616 | 0.8727 |
| No log | 2.4348 | 224 | 0.7802 | 0.6083 | 0.7802 | 0.8833 |
| No log | 2.4565 | 226 | 0.7142 | 0.5799 | 0.7142 | 0.8451 |
| No log | 2.4783 | 228 | 0.7100 | 0.5493 | 0.7100 | 0.8426 |
| No log | 2.5 | 230 | 0.7708 | 0.5985 | 0.7708 | 0.8780 |
| No log | 2.5217 | 232 | 0.7263 | 0.5772 | 0.7263 | 0.8522 |
| No log | 2.5435 | 234 | 0.7314 | 0.6419 | 0.7314 | 0.8552 |
| No log | 2.5652 | 236 | 0.8626 | 0.5880 | 0.8626 | 0.9288 |
| No log | 2.5870 | 238 | 0.7734 | 0.6121 | 0.7734 | 0.8794 |
| No log | 2.6087 | 240 | 0.7385 | 0.6457 | 0.7385 | 0.8594 |
| No log | 2.6304 | 242 | 0.7477 | 0.5872 | 0.7477 | 0.8647 |
| No log | 2.6522 | 244 | 0.7500 | 0.5923 | 0.7500 | 0.8660 |
| No log | 2.6739 | 246 | 0.8040 | 0.6197 | 0.8040 | 0.8967 |
| No log | 2.6957 | 248 | 0.8327 | 0.6186 | 0.8327 | 0.9125 |
| No log | 2.7174 | 250 | 0.8759 | 0.5855 | 0.8759 | 0.9359 |
| No log | 2.7391 | 252 | 0.8495 | 0.6040 | 0.8495 | 0.9217 |
| No log | 2.7609 | 254 | 0.8128 | 0.5111 | 0.8128 | 0.9016 |
| No log | 2.7826 | 256 | 0.8065 | 0.5095 | 0.8065 | 0.8981 |
| No log | 2.8043 | 258 | 0.7991 | 0.5073 | 0.7991 | 0.8939 |
| No log | 2.8261 | 260 | 0.8096 | 0.5519 | 0.8096 | 0.8998 |
| No log | 2.8478 | 262 | 0.7898 | 0.5570 | 0.7898 | 0.8887 |
| No log | 2.8696 | 264 | 0.8143 | 0.6226 | 0.8143 | 0.9024 |
| No log | 2.8913 | 266 | 0.8523 | 0.6288 | 0.8523 | 0.9232 |
| No log | 2.9130 | 268 | 0.8845 | 0.6163 | 0.8845 | 0.9405 |
| No log | 2.9348 | 270 | 0.8339 | 0.5501 | 0.8339 | 0.9132 |
| No log | 2.9565 | 272 | 0.8055 | 0.5158 | 0.8055 | 0.8975 |
| No log | 2.9783 | 274 | 0.8068 | 0.4955 | 0.8068 | 0.8982 |
| No log | 3.0 | 276 | 0.8148 | 0.5097 | 0.8148 | 0.9027 |
| No log | 3.0217 | 278 | 0.7815 | 0.5029 | 0.7815 | 0.8840 |
| No log | 3.0435 | 280 | 0.7490 | 0.5618 | 0.7490 | 0.8654 |
| No log | 3.0652 | 282 | 0.7061 | 0.5815 | 0.7061 | 0.8403 |
| No log | 3.0870 | 284 | 0.7366 | 0.6199 | 0.7366 | 0.8582 |
| No log | 3.1087 | 286 | 0.7217 | 0.6495 | 0.7217 | 0.8495 |
| No log | 3.1304 | 288 | 0.6870 | 0.6538 | 0.6870 | 0.8289 |
| No log | 3.1522 | 290 | 0.7230 | 0.6212 | 0.7230 | 0.8503 |
| No log | 3.1739 | 292 | 0.6990 | 0.6354 | 0.6990 | 0.8361 |
| No log | 3.1957 | 294 | 0.6876 | 0.6436 | 0.6876 | 0.8292 |
| No log | 3.2174 | 296 | 0.6852 | 0.6688 | 0.6852 | 0.8278 |
| No log | 3.2391 | 298 | 0.6855 | 0.6862 | 0.6855 | 0.8280 |
| No log | 3.2609 | 300 | 0.6676 | 0.6383 | 0.6676 | 0.8171 |
| No log | 3.2826 | 302 | 0.7448 | 0.5990 | 0.7448 | 0.8630 |
| No log | 3.3043 | 304 | 0.8491 | 0.5979 | 0.8491 | 0.9214 |
| No log | 3.3261 | 306 | 0.7808 | 0.6062 | 0.7808 | 0.8836 |
| No log | 3.3478 | 308 | 0.6725 | 0.6548 | 0.6725 | 0.8200 |
| No log | 3.3696 | 310 | 0.6794 | 0.6535 | 0.6794 | 0.8243 |
| No log | 3.3913 | 312 | 0.6739 | 0.6367 | 0.6739 | 0.8209 |
| No log | 3.4130 | 314 | 0.7241 | 0.6331 | 0.7241 | 0.8510 |
| No log | 3.4348 | 316 | 0.8837 | 0.6111 | 0.8837 | 0.9401 |
| No log | 3.4565 | 318 | 0.9325 | 0.6094 | 0.9325 | 0.9656 |
| No log | 3.4783 | 320 | 0.8330 | 0.6601 | 0.8330 | 0.9127 |
| No log | 3.5 | 322 | 0.7838 | 0.6928 | 0.7838 | 0.8853 |
| No log | 3.5217 | 324 | 0.7791 | 0.6731 | 0.7791 | 0.8827 |
| No log | 3.5435 | 326 | 0.7900 | 0.6449 | 0.7900 | 0.8888 |
| No log | 3.5652 | 328 | 0.7977 | 0.6288 | 0.7977 | 0.8931 |
| No log | 3.5870 | 330 | 0.8851 | 0.6017 | 0.8851 | 0.9408 |
| No log | 3.6087 | 332 | 0.9912 | 0.5760 | 0.9912 | 0.9956 |
| No log | 3.6304 | 334 | 1.0135 | 0.5389 | 1.0135 | 1.0067 |
| No log | 3.6522 | 336 | 0.9310 | 0.5176 | 0.9310 | 0.9649 |
| No log | 3.6739 | 338 | 0.8286 | 0.4186 | 0.8286 | 0.9103 |
| No log | 3.6957 | 340 | 0.7726 | 0.4609 | 0.7726 | 0.8790 |
| No log | 3.7174 | 342 | 0.7354 | 0.5192 | 0.7354 | 0.8576 |
| No log | 3.7391 | 344 | 0.7185 | 0.6203 | 0.7185 | 0.8476 |
| No log | 3.7609 | 346 | 0.7368 | 0.6314 | 0.7368 | 0.8583 |
| No log | 3.7826 | 348 | 0.9013 | 0.6036 | 0.9013 | 0.9494 |
| No log | 3.8043 | 350 | 1.0388 | 0.5600 | 1.0388 | 1.0192 |
| No log | 3.8261 | 352 | 0.9856 | 0.5600 | 0.9856 | 0.9928 |
| No log | 3.8478 | 354 | 0.8293 | 0.6377 | 0.8293 | 0.9107 |
| No log | 3.8696 | 356 | 0.7130 | 0.6400 | 0.7130 | 0.8444 |
| No log | 3.8913 | 358 | 0.7445 | 0.5724 | 0.7445 | 0.8629 |
| No log | 3.9130 | 360 | 0.8154 | 0.5693 | 0.8154 | 0.9030 |
| No log | 3.9348 | 362 | 0.8083 | 0.5620 | 0.8083 | 0.8991 |
| No log | 3.9565 | 364 | 0.7499 | 0.5384 | 0.7499 | 0.8660 |
| No log | 3.9783 | 366 | 0.7079 | 0.5840 | 0.7079 | 0.8413 |
| No log | 4.0 | 368 | 0.7285 | 0.5997 | 0.7285 | 0.8535 |
| No log | 4.0217 | 370 | 0.7136 | 0.6173 | 0.7136 | 0.8448 |
| No log | 4.0435 | 372 | 0.6717 | 0.6651 | 0.6717 | 0.8196 |
| No log | 4.0652 | 374 | 0.6621 | 0.7065 | 0.6621 | 0.8137 |
| No log | 4.0870 | 376 | 0.6871 | 0.7086 | 0.6871 | 0.8289 |
| No log | 4.1087 | 378 | 0.6735 | 0.7000 | 0.6735 | 0.8207 |
| No log | 4.1304 | 380 | 0.6817 | 0.6469 | 0.6817 | 0.8257 |
| No log | 4.1522 | 382 | 0.7612 | 0.6237 | 0.7612 | 0.8725 |
| No log | 4.1739 | 384 | 0.7717 | 0.6146 | 0.7717 | 0.8785 |
| No log | 4.1957 | 386 | 0.7526 | 0.6081 | 0.7526 | 0.8675 |
| No log | 4.2174 | 388 | 0.7356 | 0.5815 | 0.7356 | 0.8577 |
| No log | 4.2391 | 390 | 0.7466 | 0.5318 | 0.7466 | 0.8641 |
| No log | 4.2609 | 392 | 0.7395 | 0.5361 | 0.7395 | 0.8600 |
| No log | 4.2826 | 394 | 0.7358 | 0.5737 | 0.7358 | 0.8578 |
| No log | 4.3043 | 396 | 0.7841 | 0.5923 | 0.7841 | 0.8855 |
| No log | 4.3261 | 398 | 0.8971 | 0.5701 | 0.8971 | 0.9471 |
| No log | 4.3478 | 400 | 0.9999 | 0.5527 | 0.9999 | 1.0000 |
| No log | 4.3696 | 402 | 0.9878 | 0.5551 | 0.9878 | 0.9939 |
| No log | 4.3913 | 404 | 0.9200 | 0.5701 | 0.9200 | 0.9591 |
| No log | 4.4130 | 406 | 0.9099 | 0.6019 | 0.9099 | 0.9539 |
| No log | 4.4348 | 408 | 0.8788 | 0.5922 | 0.8788 | 0.9374 |
| No log | 4.4565 | 410 | 0.7961 | 0.5853 | 0.7961 | 0.8922 |
| No log | 4.4783 | 412 | 0.7689 | 0.6265 | 0.7689 | 0.8769 |
| No log | 4.5 | 414 | 0.7435 | 0.5752 | 0.7435 | 0.8623 |
| No log | 4.5217 | 416 | 0.7253 | 0.5798 | 0.7253 | 0.8516 |
| No log | 4.5435 | 418 | 0.7120 | 0.6246 | 0.7120 | 0.8438 |
| No log | 4.5652 | 420 | 0.6946 | 0.6395 | 0.6946 | 0.8334 |
| No log | 4.5870 | 422 | 0.7082 | 0.6673 | 0.7082 | 0.8416 |
| No log | 4.6087 | 424 | 0.7317 | 0.6613 | 0.7317 | 0.8554 |
| No log | 4.6304 | 426 | 0.7500 | 0.6512 | 0.7500 | 0.8660 |
| No log | 4.6522 | 428 | 0.7801 | 0.6613 | 0.7801 | 0.8832 |
| No log | 4.6739 | 430 | 0.8247 | 0.6048 | 0.8247 | 0.9081 |
| No log | 4.6957 | 432 | 0.8895 | 0.5960 | 0.8895 | 0.9431 |
| No log | 4.7174 | 434 | 0.9923 | 0.5556 | 0.9923 | 0.9962 |
| No log | 4.7391 | 436 | 0.9588 | 0.5556 | 0.9588 | 0.9792 |
| No log | 4.7609 | 438 | 0.9566 | 0.5502 | 0.9566 | 0.9781 |
| No log | 4.7826 | 440 | 0.9367 | 0.5704 | 0.9367 | 0.9678 |
| No log | 4.8043 | 442 | 0.8288 | 0.6149 | 0.8288 | 0.9104 |
| No log | 4.8261 | 444 | 0.7300 | 0.6684 | 0.7300 | 0.8544 |
| No log | 4.8478 | 446 | 0.6935 | 0.6848 | 0.6935 | 0.8328 |
| No log | 4.8696 | 448 | 0.6957 | 0.6689 | 0.6957 | 0.8341 |
| No log | 4.8913 | 450 | 0.6972 | 0.6690 | 0.6972 | 0.8350 |
| No log | 4.9130 | 452 | 0.6938 | 0.6899 | 0.6938 | 0.8330 |
| No log | 4.9348 | 454 | 0.7023 | 0.6748 | 0.7023 | 0.8380 |
| No log | 4.9565 | 456 | 0.7259 | 0.6756 | 0.7259 | 0.8520 |
| No log | 4.9783 | 458 | 0.7025 | 0.6988 | 0.7025 | 0.8382 |
| No log | 5.0 | 460 | 0.6911 | 0.7002 | 0.6911 | 0.8313 |
| No log | 5.0217 | 462 | 0.6933 | 0.6772 | 0.6933 | 0.8327 |
| No log | 5.0435 | 464 | 0.6978 | 0.6732 | 0.6978 | 0.8353 |
| No log | 5.0652 | 466 | 0.7098 | 0.7067 | 0.7098 | 0.8425 |
| No log | 5.0870 | 468 | 0.7487 | 0.6567 | 0.7487 | 0.8653 |
| No log | 5.1087 | 470 | 0.7428 | 0.6840 | 0.7428 | 0.8619 |
| No log | 5.1304 | 472 | 0.7066 | 0.6849 | 0.7066 | 0.8406 |
| No log | 5.1522 | 474 | 0.7104 | 0.6780 | 0.7104 | 0.8429 |
| No log | 5.1739 | 476 | 0.6913 | 0.7023 | 0.6913 | 0.8314 |
| No log | 5.1957 | 478 | 0.6907 | 0.7023 | 0.6907 | 0.8311 |
| No log | 5.2174 | 480 | 0.6849 | 0.6913 | 0.6849 | 0.8276 |
| No log | 5.2391 | 482 | 0.6950 | 0.6852 | 0.6950 | 0.8337 |
| No log | 5.2609 | 484 | 0.7355 | 0.6589 | 0.7355 | 0.8576 |
| No log | 5.2826 | 486 | 0.7559 | 0.6551 | 0.7559 | 0.8694 |
| No log | 5.3043 | 488 | 0.7127 | 0.7012 | 0.7127 | 0.8442 |
| No log | 5.3261 | 490 | 0.7290 | 0.6707 | 0.7290 | 0.8538 |
| No log | 5.3478 | 492 | 0.7613 | 0.6366 | 0.7613 | 0.8726 |
| No log | 5.3696 | 494 | 0.7536 | 0.6499 | 0.7536 | 0.8681 |
| No log | 5.3913 | 496 | 0.7134 | 0.6934 | 0.7134 | 0.8446 |
| No log | 5.4130 | 498 | 0.7104 | 0.6805 | 0.7104 | 0.8429 |
| 0.4586 | 5.4348 | 500 | 0.7484 | 0.6533 | 0.7484 | 0.8651 |
| 0.4586 | 5.4565 | 502 | 0.7560 | 0.6075 | 0.7560 | 0.8695 |
| 0.4586 | 5.4783 | 504 | 0.7252 | 0.6118 | 0.7252 | 0.8516 |
| 0.4586 | 5.5 | 506 | 0.7003 | 0.6256 | 0.7003 | 0.8369 |
| 0.4586 | 5.5217 | 508 | 0.6939 | 0.6514 | 0.6939 | 0.8330 |
| 0.4586 | 5.5435 | 510 | 0.7162 | 0.7107 | 0.7162 | 0.8463 |
| 0.4586 | 5.5652 | 512 | 0.7326 | 0.6857 | 0.7326 | 0.8559 |
| 0.4586 | 5.5870 | 514 | 0.7135 | 0.6847 | 0.7135 | 0.8447 |
| 0.4586 | 5.6087 | 516 | 0.6945 | 0.6972 | 0.6945 | 0.8334 |
| 0.4586 | 5.6304 | 518 | 0.6948 | 0.6547 | 0.6948 | 0.8335 |
| 0.4586 | 5.6522 | 520 | 0.6939 | 0.6567 | 0.6939 | 0.8330 |
| 0.4586 | 5.6739 | 522 | 0.7240 | 0.6738 | 0.7240 | 0.8509 |
| 0.4586 | 5.6957 | 524 | 0.7813 | 0.6339 | 0.7813 | 0.8839 |
| 0.4586 | 5.7174 | 526 | 0.7887 | 0.6321 | 0.7887 | 0.8881 |
| 0.4586 | 5.7391 | 528 | 0.8040 | 0.6157 | 0.8040 | 0.8967 |
| 0.4586 | 5.7609 | 530 | 0.7985 | 0.6157 | 0.7985 | 0.8936 |
| 0.4586 | 5.7826 | 532 | 0.8219 | 0.5982 | 0.8219 | 0.9066 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1