ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k14_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch for reproducing these metrics follows the list):

  • Loss: 0.6064
  • Qwk: 0.7792
  • Mse: 0.6064
  • Rmse: 0.7787
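
Loss and Mse coincide above, which is consistent with a mean-squared-error training objective. As a point of reference, the sketch below shows one way to compute Qwk (quadratic weighted kappa), Mse, and Rmse with scikit-learn and NumPy; the arrays and the rounding of continuous predictions to integer score bands are illustrative assumptions, since the evaluation script is not included in this card.

import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold organization scores and continuous model predictions
y_true = np.array([3, 4, 2, 5, 4, 3])
y_pred = np.array([2.7, 4.1, 2.3, 4.4, 3.8, 3.2])

mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))

# Kappa compares discrete ratings, so round the regression outputs first
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")

print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")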

Model description

More information needed

Intended uses & limitations

More information needed
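
Until the intended uses are documented, the sketch below shows one plausible way to query the checkpoint. It assumes the model loads as a sequence-classification model with a single regression logit (suggested by the MSE-based evaluation above); the input text is an illustrative placeholder.

import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k14_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Illustrative placeholder for an Arabic essay to be scored for organization
text = "ضع نص المقال هنا"
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    logits = model(**inputs).logits

# Assumption: the single logit is an organization score on the training scale
print(f"Predicted organization score: {logits.squeeze().item():.2f}")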

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
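
For reproducibility, the hyperparameters above map onto a transformers TrainingArguments configuration roughly as sketched below; the output directory, evaluation cadence, and logging cadence are assumptions inferred from the results table rather than recorded settings.

from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",        # hypothetical path, not recorded in this card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",         # assumed: the table shows an eval every 2 steps
    eval_steps=2,
    logging_steps=500,             # assumed: training loss first appears at step 500
)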

Training results

The columns below are: Training Loss, Epoch, Step, Validation Loss, Qwk (quadratic weighted kappa), Mse, and Rmse. "No log" means the training loss had not yet been reported; it is logged every 500 steps, so the first logged value (0.3725) appears at step 500. The figures quoted at the top of this card are those of the final row (epoch 7.6119, step 510), where the log ends, well short of the configured 100 epochs.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0299 2 7.1677 -0.0108 7.1677 2.6773
No log 0.0597 4 4.6782 0.0488 4.6782 2.1629
No log 0.0896 6 3.9247 -0.0597 3.9247 1.9811
No log 0.1194 8 3.0008 0.0120 3.0008 1.7323
No log 0.1493 10 2.7044 0.1135 2.7044 1.6445
No log 0.1791 12 2.8702 0.0544 2.8702 1.6942
No log 0.2090 14 2.7292 0.0805 2.7292 1.6520
No log 0.2388 16 2.0770 0.1805 2.0770 1.4412
No log 0.2687 18 1.9424 0.2406 1.9424 1.3937
No log 0.2985 20 1.9175 0.2857 1.9175 1.3847
No log 0.3284 22 1.8340 0.2901 1.8340 1.3542
No log 0.3582 24 1.8812 0.3134 1.8812 1.3716
No log 0.3881 26 1.7829 0.3134 1.7829 1.3353
No log 0.4179 28 1.8676 0.2603 1.8676 1.3666
No log 0.4478 30 1.5942 0.3053 1.5942 1.2626
No log 0.4776 32 1.5598 0.3231 1.5598 1.2489
No log 0.5075 34 1.8562 0.3567 1.8562 1.3624
No log 0.5373 36 2.6417 0.3431 2.6417 1.6253
No log 0.5672 38 2.9889 0.2922 2.9889 1.7288
No log 0.5970 40 2.4817 0.3516 2.4817 1.5753
No log 0.6269 42 2.2706 0.2994 2.2706 1.5068
No log 0.6567 44 1.7011 0.3611 1.7011 1.3043
No log 0.6866 46 1.3744 0.4 1.3744 1.1723
No log 0.7164 48 1.3180 0.4567 1.3180 1.1481
No log 0.7463 50 1.3748 0.4308 1.3748 1.1725
No log 0.7761 52 1.7369 0.3688 1.7369 1.3179
No log 0.8060 54 1.9701 0.3743 1.9701 1.4036
No log 0.8358 56 2.0393 0.3678 2.0393 1.4280
No log 0.8657 58 1.6854 0.3976 1.6854 1.2982
No log 0.8955 60 1.3197 0.5324 1.3197 1.1488
No log 0.9254 62 1.2006 0.5 1.2006 1.0957
No log 0.9552 64 1.2407 0.5571 1.2407 1.1139
No log 0.9851 66 1.1997 0.5734 1.1997 1.0953
No log 1.0149 68 1.1736 0.5369 1.1736 1.0833
No log 1.0448 70 1.1780 0.5405 1.1780 1.0854
No log 1.0746 72 1.6025 0.5029 1.6025 1.2659
No log 1.1045 74 1.8037 0.4444 1.8037 1.3430
No log 1.1343 76 1.4701 0.5405 1.4701 1.2125
No log 1.1642 78 1.0943 0.6353 1.0943 1.0461
No log 1.1940 80 1.0185 0.5578 1.0185 1.0092
No log 1.2239 82 1.0997 0.5 1.0997 1.0487
No log 1.2537 84 1.0614 0.5692 1.0614 1.0302
No log 1.2836 86 1.0181 0.5865 1.0181 1.0090
No log 1.3134 88 0.9108 0.6197 0.9108 0.9543
No log 1.3433 90 0.8244 0.7190 0.8244 0.9080
No log 1.3731 92 0.9550 0.6623 0.9550 0.9772
No log 1.4030 94 0.9585 0.6623 0.9585 0.9790
No log 1.4328 96 0.9499 0.6835 0.9499 0.9746
No log 1.4627 98 1.0703 0.6258 1.0703 1.0345
No log 1.4925 100 1.0960 0.6258 1.0960 1.0469
No log 1.5224 102 0.9662 0.6790 0.9662 0.9829
No log 1.5522 104 0.7204 0.7484 0.7204 0.8488
No log 1.5821 106 0.6801 0.7712 0.6801 0.8247
No log 1.6119 108 0.6880 0.7712 0.6880 0.8294
No log 1.6418 110 0.6799 0.7742 0.6799 0.8245
No log 1.6716 112 0.6882 0.7532 0.6882 0.8295
No log 1.7015 114 0.7000 0.7451 0.7000 0.8366
No log 1.7313 116 0.6709 0.7662 0.6709 0.8191
No log 1.7612 118 0.7217 0.7771 0.7217 0.8495
No log 1.7910 120 0.8019 0.7381 0.8019 0.8955
No log 1.8209 122 1.0578 0.6776 1.0578 1.0285
No log 1.8507 124 1.1645 0.6383 1.1645 1.0791
No log 1.8806 126 1.0361 0.6776 1.0361 1.0179
No log 1.9104 128 0.8754 0.7079 0.8754 0.9357
No log 1.9403 130 0.7089 0.7532 0.7089 0.8420
No log 1.9701 132 0.6638 0.7333 0.6638 0.8148
No log 2.0 134 0.7001 0.7483 0.7001 0.8367
No log 2.0299 136 0.8074 0.7042 0.8074 0.8986
No log 2.0597 138 0.8462 0.7092 0.8462 0.9199
No log 2.0896 140 0.8714 0.6763 0.8714 0.9335
No log 2.1194 142 0.8092 0.6667 0.8092 0.8995
No log 2.1493 144 0.8112 0.7034 0.8112 0.9007
No log 2.1791 146 0.9515 0.6410 0.9515 0.9754
No log 2.2090 148 1.0171 0.6503 1.0171 1.0085
No log 2.2388 150 0.9543 0.6875 0.9543 0.9769
No log 2.2687 152 0.8291 0.7308 0.8291 0.9106
No log 2.2985 154 0.7875 0.7162 0.7875 0.8874
No log 2.3284 156 0.8054 0.6944 0.8054 0.8975
No log 2.3582 158 0.8006 0.6901 0.8006 0.8947
No log 2.3881 160 0.8048 0.6763 0.8048 0.8971
No log 2.4179 162 0.8033 0.6763 0.8033 0.8963
No log 2.4478 164 0.8654 0.6933 0.8654 0.9303
No log 2.4776 166 0.9368 0.6928 0.9368 0.9679
No log 2.5075 168 0.8698 0.7105 0.8698 0.9326
No log 2.5373 170 0.6995 0.7273 0.6995 0.8364
No log 2.5672 172 0.7872 0.6331 0.7872 0.8872
No log 2.5970 174 0.7565 0.7234 0.7565 0.8698
No log 2.6269 176 0.7205 0.7448 0.7205 0.8488
No log 2.6567 178 0.8822 0.6625 0.8822 0.9393
No log 2.6866 180 1.1724 0.6702 1.1724 1.0828
No log 2.7164 182 1.0321 0.6882 1.0321 1.0159
No log 2.7463 184 0.8537 0.7176 0.8537 0.9240
No log 2.7761 186 0.7274 0.7285 0.7274 0.8529
No log 2.8060 188 0.7671 0.6619 0.7671 0.8759
No log 2.8358 190 0.8588 0.6519 0.8588 0.9267
No log 2.8657 192 0.9060 0.6165 0.9060 0.9519
No log 2.8955 194 0.9327 0.6569 0.9327 0.9658
No log 2.9254 196 1.0117 0.5942 1.0117 1.0058
No log 2.9552 198 1.0872 0.5075 1.0872 1.0427
No log 2.9851 200 1.1134 0.4733 1.1134 1.0552
No log 3.0149 202 1.1201 0.5113 1.1201 1.0583
No log 3.0448 204 1.0681 0.5778 1.0681 1.0335
No log 3.0746 206 0.9882 0.6525 0.9882 0.9941
No log 3.1045 208 0.8704 0.6761 0.8704 0.9330
No log 3.1343 210 0.7689 0.7172 0.7689 0.8769
No log 3.1642 212 0.7440 0.7123 0.7440 0.8626
No log 3.1940 214 0.7307 0.7451 0.7307 0.8548
No log 3.2239 216 0.6624 0.7792 0.6624 0.8139
No log 3.2537 218 0.6494 0.7273 0.6494 0.8059
No log 3.2836 220 0.6702 0.7368 0.6702 0.8186
No log 3.3134 222 0.6783 0.7674 0.6783 0.8236
No log 3.3433 224 0.6358 0.7692 0.6358 0.7973
No log 3.3731 226 0.6131 0.7582 0.6131 0.7830
No log 3.4030 228 0.6556 0.7413 0.6556 0.8097
No log 3.4328 230 0.6560 0.7376 0.6560 0.8100
No log 3.4627 232 0.6498 0.7273 0.6498 0.8061
No log 3.4925 234 0.7310 0.7468 0.7310 0.8550
No log 3.5224 236 0.8951 0.6826 0.8951 0.9461
No log 3.5522 238 0.8871 0.7101 0.8871 0.9419
No log 3.5821 240 0.7249 0.725 0.7249 0.8514
No log 3.6119 242 0.6501 0.7310 0.6501 0.8063
No log 3.6418 244 0.6997 0.7286 0.6997 0.8365
No log 3.6716 246 0.7203 0.7324 0.7203 0.8487
No log 3.7015 248 0.7207 0.7397 0.7207 0.8489
No log 3.7313 250 0.7547 0.7211 0.7547 0.8687
No log 3.7612 252 0.7944 0.7273 0.7944 0.8913
No log 3.7910 254 0.7990 0.6806 0.7990 0.8939
No log 3.8209 256 0.8006 0.6763 0.8006 0.8948
No log 3.8507 258 0.8048 0.6763 0.8048 0.8971
No log 3.8806 260 0.7959 0.6763 0.7959 0.8922
No log 3.9104 262 0.7816 0.6906 0.7816 0.8841
No log 3.9403 264 0.7819 0.7183 0.7819 0.8843
No log 3.9701 266 0.7750 0.7310 0.7750 0.8803
No log 4.0 268 0.8247 0.7296 0.8247 0.9081
No log 4.0299 270 0.8894 0.7294 0.8894 0.9431
No log 4.0597 272 0.8597 0.7195 0.8597 0.9272
No log 4.0896 274 0.7892 0.72 0.7892 0.8884
No log 4.1194 276 0.8097 0.72 0.8097 0.8998
No log 4.1493 278 0.8806 0.6892 0.8806 0.9384
No log 4.1791 280 1.1268 0.6824 1.1268 1.0615
No log 4.2090 282 1.5078 0.5169 1.5078 1.2279
No log 4.2388 284 1.6022 0.4916 1.6022 1.2658
No log 4.2687 286 1.3599 0.5465 1.3599 1.1662
No log 4.2985 288 0.9888 0.6918 0.9888 0.9944
No log 4.3284 290 0.7933 0.6993 0.7933 0.8907
No log 4.3582 292 0.7962 0.7153 0.7962 0.8923
No log 4.3881 294 0.8098 0.6963 0.8098 0.8999
No log 4.4179 296 0.7805 0.7015 0.7805 0.8834
No log 4.4478 298 0.7501 0.7015 0.7501 0.8661
No log 4.4776 300 0.7524 0.7101 0.7524 0.8674
No log 4.5075 302 0.7667 0.7027 0.7667 0.8756
No log 4.5373 304 0.7813 0.6980 0.7813 0.8839
No log 4.5672 306 0.7630 0.6980 0.7630 0.8735
No log 4.5970 308 0.7270 0.7285 0.7270 0.8526
No log 4.6269 310 0.7114 0.7368 0.7114 0.8435
No log 4.6567 312 0.7481 0.7651 0.7481 0.8649
No log 4.6866 314 0.7572 0.7347 0.7572 0.8702
No log 4.7164 316 0.7895 0.6933 0.7895 0.8886
No log 4.7463 318 0.8005 0.7134 0.8005 0.8947
No log 4.7761 320 0.8151 0.7329 0.8151 0.9029
No log 4.8060 322 0.7766 0.7342 0.7766 0.8812
No log 4.8358 324 0.8044 0.7362 0.8044 0.8969
No log 4.8657 326 0.8489 0.7239 0.8489 0.9214
No log 4.8955 328 0.9237 0.72 0.9237 0.9611
No log 4.9254 330 0.8756 0.7386 0.8756 0.9358
No log 4.9552 332 0.7838 0.7273 0.7838 0.8853
No log 4.9851 334 0.7004 0.7625 0.7004 0.8369
No log 5.0149 336 0.7245 0.7550 0.7245 0.8512
No log 5.0448 338 0.7410 0.7027 0.7410 0.8608
No log 5.0746 340 0.7638 0.6849 0.7638 0.8739
No log 5.1045 342 0.7927 0.6853 0.7927 0.8904
No log 5.1343 344 0.8529 0.6986 0.8529 0.9235
No log 5.1642 346 1.0154 0.64 1.0154 1.0077
No log 5.1940 348 1.0996 0.6708 1.0996 1.0486
No log 5.2239 350 0.9003 0.6389 0.9003 0.9488
No log 5.2537 352 0.7362 0.7050 0.7362 0.8580
No log 5.2836 354 0.7509 0.7571 0.7509 0.8666
No log 5.3134 356 0.7405 0.7778 0.7405 0.8605
No log 5.3433 358 0.7278 0.7692 0.7278 0.8531
No log 5.3731 360 0.7655 0.7353 0.7655 0.8750
No log 5.4030 362 0.8641 0.6815 0.8641 0.9296
No log 5.4328 364 1.1046 0.5634 1.1046 1.0510
No log 5.4627 366 1.1902 0.4823 1.1902 1.0910
No log 5.4925 368 1.0395 0.6043 1.0395 1.0196
No log 5.5224 370 0.8660 0.6767 0.8660 0.9306
No log 5.5522 372 0.8132 0.7015 0.8132 0.9018
No log 5.5821 374 0.7759 0.7338 0.7759 0.8809
No log 5.6119 376 0.7462 0.7429 0.7462 0.8639
No log 5.6418 378 0.7307 0.7432 0.7307 0.8548
No log 5.6716 380 0.6727 0.7871 0.6727 0.8202
No log 5.7015 382 0.6667 0.8047 0.6667 0.8165
No log 5.7313 384 0.6420 0.7778 0.6420 0.8012
No log 5.7612 386 0.6268 0.7949 0.6268 0.7917
No log 5.7910 388 0.6280 0.7703 0.6280 0.7925
No log 5.8209 390 0.6484 0.7397 0.6484 0.8052
No log 5.8507 392 0.6708 0.7397 0.6708 0.8190
No log 5.8806 394 0.6714 0.75 0.6714 0.8194
No log 5.9104 396 0.6674 0.7778 0.6674 0.8170
No log 5.9403 398 0.7145 0.7665 0.7145 0.8453
No log 5.9701 400 0.7251 0.7784 0.7251 0.8515
No log 6.0 402 0.6859 0.7778 0.6859 0.8282
No log 6.0299 404 0.6757 0.75 0.6757 0.8220
No log 6.0597 406 0.6672 0.7843 0.6672 0.8168
No log 6.0896 408 0.6698 0.7821 0.6698 0.8184
No log 6.1194 410 0.6928 0.7607 0.6928 0.8323
No log 6.1493 412 0.6593 0.7778 0.6593 0.8120
No log 6.1791 414 0.6056 0.7821 0.6056 0.7782
No log 6.2090 416 0.6424 0.7582 0.6424 0.8015
No log 6.2388 418 0.7265 0.6986 0.7265 0.8524
No log 6.2687 420 0.7303 0.6897 0.7303 0.8546
No log 6.2985 422 0.7124 0.7550 0.7124 0.8441
No log 6.3284 424 0.7130 0.7172 0.7130 0.8444
No log 6.3582 426 0.6807 0.7568 0.6807 0.8250
No log 6.3881 428 0.6352 0.7763 0.6352 0.7970
No log 6.4179 430 0.6280 0.7763 0.6280 0.7925
No log 6.4478 432 0.6327 0.7651 0.6327 0.7954
No log 6.4776 434 0.6399 0.7703 0.6399 0.7999
No log 6.5075 436 0.6624 0.7333 0.6624 0.8139
No log 6.5373 438 0.6522 0.7333 0.6522 0.8076
No log 6.5672 440 0.6251 0.7815 0.6251 0.7907
No log 6.5970 442 0.6085 0.7763 0.6085 0.7801
No log 6.6269 444 0.6123 0.7763 0.6123 0.7825
No log 6.6567 446 0.6162 0.7922 0.6162 0.7850
No log 6.6866 448 0.6263 0.7949 0.6263 0.7914
No log 6.7164 450 0.6407 0.7949 0.6407 0.8005
No log 6.7463 452 0.6552 0.7799 0.6552 0.8095
No log 6.7761 454 0.6647 0.8118 0.6647 0.8153
No log 6.8060 456 0.6815 0.8046 0.6815 0.8255
No log 6.8358 458 0.6844 0.8046 0.6844 0.8273
No log 6.8657 460 0.6735 0.7953 0.6735 0.8207
No log 6.8955 462 0.6623 0.7922 0.6623 0.8138
No log 6.9254 464 0.6574 0.7763 0.6574 0.8108
No log 6.9552 466 0.6471 0.7763 0.6471 0.8044
No log 6.9851 468 0.6372 0.7671 0.6372 0.7982
No log 7.0149 470 0.6385 0.7755 0.6385 0.7991
No log 7.0448 472 0.6474 0.7639 0.6474 0.8046
No log 7.0746 474 0.6514 0.7639 0.6514 0.8071
No log 7.1045 476 0.6644 0.7518 0.6644 0.8151
No log 7.1343 478 0.6626 0.7518 0.6626 0.8140
No log 7.1642 480 0.6381 0.7606 0.6381 0.7988
No log 7.1940 482 0.6366 0.7692 0.6366 0.7979
No log 7.2239 484 0.6253 0.7763 0.6253 0.7908
No log 7.2537 486 0.6104 0.7763 0.6104 0.7813
No log 7.2836 488 0.5932 0.7974 0.5932 0.7702
No log 7.3134 490 0.5951 0.7785 0.5951 0.7714
No log 7.3433 492 0.6505 0.7347 0.6505 0.8065
No log 7.3731 494 0.7064 0.7260 0.7064 0.8405
No log 7.4030 496 0.6793 0.7260 0.6793 0.8242
No log 7.4328 498 0.6261 0.7651 0.6261 0.7913
0.3725 7.4627 500 0.6151 0.8125 0.6151 0.7843
0.3725 7.4925 502 0.6431 0.8098 0.6431 0.8020
0.3725 7.5224 504 0.6866 0.7738 0.6866 0.8286
0.3725 7.5522 506 0.6752 0.7545 0.6752 0.8217
0.3725 7.5821 508 0.6285 0.8049 0.6285 0.7928
0.3725 7.6119 510 0.6064 0.7792 0.6064 0.7787

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
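
To sanity-check a local environment against these versions, a small sketch:

import datasets
import tokenizers
import torch
import transformers

# Versions used for this fine-tune, taken from the list above
expected = {
    "transformers": "4.44.2",
    "torch": "2.4.0+cu118",
    "datasets": "2.21.0",
    "tokenizers": "0.19.1",
}
for module in (transformers, torch, datasets, tokenizers):
    name = module.__name__
    status = "OK" if module.__version__ == expected[name] else "MISMATCH"
    print(f"{name}: {module.__version__} (trained with {expected[name]}) {status}")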