ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k11_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a short sketch of how these metrics can be computed appears after the list):

  • Loss: 0.7541
  • QWK: 0.3888
  • MSE: 0.7541
  • RMSE: 0.8684
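
The reported QWK (quadratic weighted kappa), MSE, and RMSE can be reproduced from gold scores and model predictions roughly as follows. This is a minimal sketch using scikit-learn, not the card's original evaluation code; rounding continuous predictions to integers before computing kappa is an assumption.

```python
# Sketch of the evaluation metrics reported above (QWK, MSE, RMSE).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mse = mean_squared_error(y_true, y_pred)
    rmse = float(np.sqrt(mse))
    # QWK is defined on discrete labels, so continuous predictions are rounded first (assumption).
    qwk = cohen_kappa_score(
        np.rint(y_true).astype(int),
        np.rint(y_pred).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": rmse}

# Example: perfect agreement gives QWK = 1.0 and MSE = RMSE = 0.0.
print(compute_metrics([0, 1, 2, 3], [0, 1, 2, 3]))
```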

Model description

More information needed

Intended uses & limitations

More information needed
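
The card does not document intended uses, but the model name and the MSE/RMSE metrics suggest a regression-style scorer for the organization trait of Arabic essays. A hedged usage sketch follows; the single-output regression head and the meaning of the output scale are assumptions, not documented facts.

```python
# Hedged usage sketch: loading the checkpoint and scoring one essay.
# The regression head (single logit) is assumed from the MSE/RMSE metrics, not documented.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k11_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

essay = "نص المقال هنا"  # placeholder Arabic essay text
inputs = tokenizer(essay, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # predicted organization score (scale undocumented)
print(score)
```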

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a Trainer configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
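
These settings map directly onto Hugging Face TrainingArguments. The sketch below is an assumed reconstruction, not the original training script; the output directory, dataset variables, and compute_metrics hook are placeholders.

```python
# Sketch of the listed hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments, Trainer

args = TrainingArguments(
    output_dir="arabert_task7_organization",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

# trainer = Trainer(model=model, args=args, train_dataset=train_ds,
#                   eval_dataset=eval_ds, compute_metrics=compute_metrics)
# trainer.train()
```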

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0364 2 2.6746 -0.0729 2.6746 1.6354
No log 0.0727 4 1.3376 0.0990 1.3376 1.1565
No log 0.1091 6 1.1904 -0.1993 1.1904 1.0911
No log 0.1455 8 1.2056 -0.2249 1.2056 1.0980
No log 0.1818 10 1.1218 0.1044 1.1218 1.0591
No log 0.2182 12 0.9945 0.2782 0.9945 0.9973
No log 0.2545 14 0.9112 0.3228 0.9112 0.9545
No log 0.2909 16 0.9131 0.3228 0.9131 0.9556
No log 0.3273 18 0.8381 0.3294 0.8381 0.9155
No log 0.3636 20 0.9821 0.3059 0.9821 0.9910
No log 0.4 22 0.9394 0.3228 0.9394 0.9692
No log 0.4364 24 0.7855 0.2527 0.7855 0.8863
No log 0.4727 26 0.7621 0.0295 0.7621 0.8730
No log 0.5091 28 0.8161 0.0 0.8161 0.9034
No log 0.5455 30 0.8209 -0.0307 0.8209 0.9061
No log 0.5818 32 0.7449 0.0643 0.7449 0.8631
No log 0.6182 34 0.8023 0.3494 0.8023 0.8957
No log 0.6545 36 0.8146 0.3494 0.8146 0.9026
No log 0.6909 38 0.7495 0.3088 0.7495 0.8658
No log 0.7273 40 0.7440 0.2182 0.7440 0.8626
No log 0.7636 42 0.8707 0.1763 0.8707 0.9331
No log 0.8 44 1.1919 0.2066 1.1919 1.0917
No log 0.8364 46 1.3315 0.1000 1.3315 1.1539
No log 0.8727 48 1.2886 0.0754 1.2886 1.1352
No log 0.9091 50 1.0425 0.2394 1.0425 1.0210
No log 0.9455 52 0.8590 0.2115 0.8590 0.9268
No log 0.9818 54 0.7812 0.2550 0.7812 0.8838
No log 1.0182 56 0.7247 0.1923 0.7247 0.8513
No log 1.0545 58 0.7012 0.1923 0.7012 0.8374
No log 1.0909 60 0.7085 0.0898 0.7085 0.8417
No log 1.1273 62 0.7218 0.0444 0.7218 0.8496
No log 1.1636 64 0.7097 0.0444 0.7097 0.8424
No log 1.2 66 0.7055 0.0474 0.7055 0.8399
No log 1.2364 68 0.6688 0.1846 0.6688 0.8178
No log 1.2727 70 0.6615 0.2145 0.6615 0.8133
No log 1.3091 72 0.6633 0.1813 0.6633 0.8145
No log 1.3455 74 0.9019 0.2008 0.9019 0.9497
No log 1.3818 76 1.0576 0.2676 1.0576 1.0284
No log 1.4182 78 0.9421 0.2356 0.9421 0.9706
No log 1.4545 80 0.8868 0.2046 0.8868 0.9417
No log 1.4909 82 0.8194 0.2308 0.8194 0.9052
No log 1.5273 84 0.7903 0.1519 0.7903 0.8890
No log 1.5636 86 0.8177 0.2274 0.8177 0.9043
No log 1.6 88 0.8153 0.2015 0.8153 0.9029
No log 1.6364 90 0.7960 0.2325 0.7960 0.8922
No log 1.6727 92 0.7720 0.1961 0.7720 0.8786
No log 1.7091 94 0.7583 0.2290 0.7583 0.8708
No log 1.7455 96 0.7599 0.1850 0.7599 0.8717
No log 1.7818 98 0.8128 0.0870 0.8128 0.9016
No log 1.8182 100 0.8995 0.1318 0.8995 0.9484
No log 1.8545 102 0.9371 0.1304 0.9371 0.9680
No log 1.8909 104 0.8842 0.1860 0.8842 0.9403
No log 1.9273 106 0.7758 0.2126 0.7758 0.8808
No log 1.9636 108 0.7602 0.3089 0.7602 0.8719
No log 2.0 110 0.7270 0.3628 0.7270 0.8526
No log 2.0364 112 0.7114 0.3122 0.7114 0.8435
No log 2.0727 114 0.7096 0.3252 0.7096 0.8424
No log 2.1091 116 0.7381 0.4205 0.7381 0.8591
No log 2.1455 118 0.8035 0.4193 0.8035 0.8964
No log 2.1818 120 0.9282 0.2730 0.9282 0.9634
No log 2.2182 122 0.9294 0.2363 0.9294 0.9641
No log 2.2545 124 0.8597 0.3551 0.8597 0.9272
No log 2.2909 126 0.7305 0.3656 0.7305 0.8547
No log 2.3273 128 0.7188 0.3906 0.7188 0.8478
No log 2.3636 130 0.7145 0.3467 0.7145 0.8453
No log 2.4 132 0.7253 0.2980 0.7253 0.8517
No log 2.4364 134 0.7251 0.3618 0.7251 0.8515
No log 2.4727 136 0.7339 0.3536 0.7339 0.8567
No log 2.5091 138 0.9096 0.2707 0.9096 0.9537
No log 2.5455 140 1.0725 0.1384 1.0725 1.0356
No log 2.5818 142 1.0495 0.1887 1.0495 1.0244
No log 2.6182 144 0.9901 0.1899 0.9901 0.9950
No log 2.6545 146 0.8666 0.2890 0.8666 0.9309
No log 2.6909 148 0.8283 0.3392 0.8283 0.9101
No log 2.7273 150 0.7863 0.3189 0.7863 0.8867
No log 2.7636 152 0.7840 0.2992 0.7840 0.8854
No log 2.8 154 0.7331 0.3271 0.7331 0.8562
No log 2.8364 156 0.6527 0.2642 0.6527 0.8079
No log 2.8727 158 0.6405 0.3947 0.6405 0.8003
No log 2.9091 160 0.6420 0.4402 0.6420 0.8013
No log 2.9455 162 0.6741 0.3174 0.6741 0.8210
No log 2.9818 164 0.8745 0.3119 0.8745 0.9351
No log 3.0182 166 0.9992 0.2900 0.9992 0.9996
No log 3.0545 168 0.8729 0.3119 0.8729 0.9343
No log 3.0909 170 0.7649 0.2984 0.7649 0.8746
No log 3.1273 172 0.6950 0.1539 0.6950 0.8337
No log 3.1636 174 0.7248 0.1646 0.7248 0.8514
No log 3.2 176 0.7529 0.1702 0.7529 0.8677
No log 3.2364 178 0.7396 0.1282 0.7396 0.8600
No log 3.2727 180 0.6989 0.2963 0.6989 0.8360
No log 3.3091 182 0.7043 0.2379 0.7043 0.8392
No log 3.3455 184 0.7756 0.3287 0.7756 0.8807
No log 3.3818 186 0.7608 0.3569 0.7608 0.8722
No log 3.4182 188 0.7563 0.2451 0.7563 0.8697
No log 3.4545 190 0.8543 0.2424 0.8543 0.9243
No log 3.4909 192 0.8174 0.2368 0.8174 0.9041
No log 3.5273 194 0.7023 0.3153 0.7023 0.8380
No log 3.5636 196 0.7083 0.3023 0.7083 0.8416
No log 3.6 198 0.6768 0.3754 0.6768 0.8227
No log 3.6364 200 0.7048 0.3229 0.7048 0.8395
No log 3.6727 202 0.7273 0.3857 0.7273 0.8528
No log 3.7091 204 0.7033 0.4555 0.7033 0.8386
No log 3.7455 206 0.7107 0.3247 0.7107 0.8430
No log 3.7818 208 0.7186 0.4232 0.7186 0.8477
No log 3.8182 210 0.7244 0.4137 0.7244 0.8511
No log 3.8545 212 0.7532 0.3902 0.7532 0.8679
No log 3.8909 214 0.7612 0.3834 0.7612 0.8725
No log 3.9273 216 0.7586 0.3597 0.7586 0.8710
No log 3.9636 218 0.7622 0.3395 0.7622 0.8730
No log 4.0 220 0.7958 0.3282 0.7958 0.8921
No log 4.0364 222 0.7822 0.3856 0.7822 0.8844
No log 4.0727 224 0.7496 0.3638 0.7496 0.8658
No log 4.1091 226 0.7736 0.4135 0.7736 0.8796
No log 4.1455 228 0.7144 0.3995 0.7144 0.8452
No log 4.1818 230 0.7064 0.4308 0.7064 0.8405
No log 4.2182 232 0.7308 0.3671 0.7308 0.8549
No log 4.2545 234 0.7283 0.3474 0.7283 0.8534
No log 4.2909 236 0.6872 0.3738 0.6872 0.8290
No log 4.3273 238 0.7252 0.4218 0.7252 0.8516
No log 4.3636 240 0.7348 0.3963 0.7348 0.8572
No log 4.4 242 0.7130 0.4072 0.7130 0.8444
No log 4.4364 244 0.7121 0.4457 0.7121 0.8439
No log 4.4727 246 0.6479 0.4413 0.6479 0.8049
No log 4.5091 248 0.6510 0.4182 0.6510 0.8068
No log 4.5455 250 0.6862 0.4895 0.6862 0.8284
No log 4.5818 252 0.7125 0.4480 0.7125 0.8441
No log 4.6182 254 0.6888 0.4149 0.6888 0.8299
No log 4.6545 256 0.6884 0.3785 0.6884 0.8297
No log 4.6909 258 0.6665 0.3536 0.6665 0.8164
No log 4.7273 260 0.6748 0.3669 0.6748 0.8215
No log 4.7636 262 0.6784 0.4382 0.6784 0.8236
No log 4.8 264 0.6839 0.4128 0.6839 0.8270
No log 4.8364 266 0.6818 0.4735 0.6818 0.8257
No log 4.8727 268 0.6723 0.4659 0.6723 0.8199
No log 4.9091 270 0.6781 0.4288 0.6781 0.8235
No log 4.9455 272 0.7183 0.4010 0.7183 0.8475
No log 4.9818 274 0.8044 0.3441 0.8044 0.8969
No log 5.0182 276 0.9174 0.3174 0.9174 0.9578
No log 5.0545 278 0.8733 0.3387 0.8733 0.9345
No log 5.0909 280 0.7896 0.3562 0.7896 0.8886
No log 5.1273 282 0.7973 0.3411 0.7973 0.8929
No log 5.1636 284 0.8324 0.2774 0.8324 0.9124
No log 5.2 286 0.8044 0.3068 0.8044 0.8969
No log 5.2364 288 0.8215 0.2819 0.8215 0.9064
No log 5.2727 290 0.8452 0.2790 0.8452 0.9194
No log 5.3091 292 0.8215 0.3060 0.8215 0.9063
No log 5.3455 294 0.8050 0.3188 0.8050 0.8972
No log 5.3818 296 0.8164 0.3372 0.8164 0.9035
No log 5.4182 298 0.8486 0.2962 0.8486 0.9212
No log 5.4545 300 0.8685 0.3524 0.8685 0.9319
No log 5.4909 302 0.8551 0.3574 0.8551 0.9247
No log 5.5273 304 0.8918 0.4490 0.8918 0.9444
No log 5.5636 306 0.8712 0.4234 0.8712 0.9334
No log 5.6 308 0.8029 0.3966 0.8029 0.8960
No log 5.6364 310 0.7907 0.3255 0.7907 0.8892
No log 5.6727 312 0.7936 0.4354 0.7936 0.8908
No log 5.7091 314 0.8482 0.3520 0.8482 0.9210
No log 5.7455 316 0.8033 0.3256 0.8033 0.8963
No log 5.7818 318 0.8089 0.3433 0.8089 0.8994
No log 5.8182 320 0.7781 0.3997 0.7781 0.8821
No log 5.8545 322 0.7389 0.4342 0.7389 0.8596
No log 5.8909 324 0.7722 0.2985 0.7722 0.8787
No log 5.9273 326 0.7871 0.3102 0.7871 0.8872
No log 5.9636 328 0.7701 0.3648 0.7701 0.8776
No log 6.0 330 0.7508 0.4037 0.7508 0.8665
No log 6.0364 332 0.7380 0.4328 0.7380 0.8591
No log 6.0727 334 0.7703 0.4516 0.7703 0.8776
No log 6.1091 336 0.8399 0.4338 0.8399 0.9164
No log 6.1455 338 0.8647 0.3045 0.8647 0.9299
No log 6.1818 340 0.8703 0.3788 0.8703 0.9329
No log 6.2182 342 0.8560 0.4131 0.8560 0.9252
No log 6.2545 344 0.7596 0.3671 0.7596 0.8715
No log 6.2909 346 0.7104 0.3715 0.7104 0.8429
No log 6.3273 348 0.7042 0.4555 0.7042 0.8392
No log 6.3636 350 0.7641 0.2993 0.7641 0.8741
No log 6.4 352 0.7548 0.3388 0.7548 0.8688
No log 6.4364 354 0.7084 0.4505 0.7084 0.8417
No log 6.4727 356 0.8057 0.4817 0.8057 0.8976
No log 6.5091 358 0.8627 0.4087 0.8627 0.9288
No log 6.5455 360 0.8007 0.4456 0.8007 0.8948
No log 6.5818 362 0.7763 0.4499 0.7763 0.8811
No log 6.6182 364 0.7880 0.4568 0.7880 0.8877
No log 6.6545 366 0.7823 0.4497 0.7823 0.8845
No log 6.6909 368 0.7495 0.4681 0.7495 0.8658
No log 6.7273 370 0.7182 0.4480 0.7182 0.8475
No log 6.7636 372 0.7198 0.4622 0.7198 0.8484
No log 6.8 374 0.7388 0.4772 0.7388 0.8595
No log 6.8364 376 0.8588 0.4260 0.8588 0.9267
No log 6.8727 378 0.8484 0.4496 0.8484 0.9211
No log 6.9091 380 0.7332 0.3503 0.7332 0.8563
No log 6.9455 382 0.6734 0.5133 0.6734 0.8206
No log 6.9818 384 0.6753 0.4451 0.6753 0.8218
No log 7.0182 386 0.6716 0.4840 0.6716 0.8195
No log 7.0545 388 0.6777 0.4713 0.6777 0.8232
No log 7.0909 390 0.6789 0.5037 0.6789 0.8240
No log 7.1273 392 0.6892 0.4248 0.6892 0.8302
No log 7.1636 394 0.6949 0.5067 0.6949 0.8336
No log 7.2 396 0.7017 0.5158 0.7017 0.8377
No log 7.2364 398 0.6962 0.4992 0.6962 0.8344
No log 7.2727 400 0.6834 0.5067 0.6834 0.8267
No log 7.3091 402 0.6647 0.5020 0.6647 0.8153
No log 7.3455 404 0.6690 0.4948 0.6690 0.8179
No log 7.3818 406 0.6967 0.4684 0.6967 0.8347
No log 7.4182 408 0.7123 0.4606 0.7123 0.8439
No log 7.4545 410 0.7233 0.4476 0.7233 0.8505
No log 7.4909 412 0.6853 0.5119 0.6853 0.8278
No log 7.5273 414 0.6868 0.5300 0.6868 0.8288
No log 7.5636 416 0.7316 0.4518 0.7316 0.8553
No log 7.6 418 0.7466 0.4518 0.7466 0.8640
No log 7.6364 420 0.7146 0.4619 0.7146 0.8453
No log 7.6727 422 0.6699 0.5428 0.6699 0.8185
No log 7.7091 424 0.6562 0.5378 0.6562 0.8101
No log 7.7455 426 0.6547 0.5970 0.6547 0.8091
No log 7.7818 428 0.6662 0.5397 0.6662 0.8162
No log 7.8182 430 0.6571 0.5970 0.6571 0.8106
No log 7.8545 432 0.6430 0.4551 0.6430 0.8019
No log 7.8909 434 0.6337 0.4621 0.6337 0.7961
No log 7.9273 436 0.6315 0.4603 0.6315 0.7947
No log 7.9636 438 0.6439 0.4710 0.6439 0.8024
No log 8.0 440 0.6937 0.4689 0.6937 0.8329
No log 8.0364 442 0.7044 0.4492 0.7044 0.8393
No log 8.0727 444 0.6533 0.4655 0.6533 0.8083
No log 8.1091 446 0.6348 0.6286 0.6348 0.7968
No log 8.1455 448 0.6667 0.5014 0.6667 0.8165
No log 8.1818 450 0.6603 0.5368 0.6603 0.8126
No log 8.2182 452 0.6446 0.5840 0.6446 0.8029
No log 8.2545 454 0.6984 0.4492 0.6984 0.8357
No log 8.2909 456 0.7934 0.4076 0.7934 0.8908
No log 8.3273 458 0.7997 0.4076 0.7997 0.8943
No log 8.3636 460 0.7201 0.4359 0.7201 0.8486
No log 8.4 462 0.6333 0.5301 0.6333 0.7958
No log 8.4364 464 0.6271 0.5167 0.6271 0.7919
No log 8.4727 466 0.6517 0.5920 0.6517 0.8073
No log 8.5091 468 0.6937 0.5051 0.6937 0.8329
No log 8.5455 470 0.7134 0.4732 0.7134 0.8446
No log 8.5818 472 0.6713 0.5335 0.6713 0.8193
No log 8.6182 474 0.6407 0.5111 0.6407 0.8004
No log 8.6545 476 0.6485 0.5104 0.6485 0.8053
No log 8.6909 478 0.6734 0.4866 0.6734 0.8206
No log 8.7273 480 0.6955 0.4827 0.6955 0.8340
No log 8.7636 482 0.6715 0.4997 0.6715 0.8195
No log 8.8 484 0.6388 0.5120 0.6388 0.7992
No log 8.8364 486 0.6378 0.5899 0.6378 0.7986
No log 8.8727 488 0.6524 0.5306 0.6524 0.8077
No log 8.9091 490 0.6459 0.4624 0.6459 0.8037
No log 8.9455 492 0.6384 0.4678 0.6384 0.7990
No log 8.9818 494 0.6559 0.4789 0.6559 0.8099
No log 9.0182 496 0.6731 0.4731 0.6731 0.8204
No log 9.0545 498 0.6811 0.4157 0.6811 0.8253
0.3816 9.0909 500 0.6949 0.4524 0.6949 0.8336
0.3816 9.1273 502 0.7220 0.4970 0.7220 0.8497
0.3816 9.1636 504 0.7124 0.5142 0.7124 0.8441
0.3816 9.2 506 0.6948 0.4256 0.6948 0.8336
0.3816 9.2364 508 0.7004 0.3664 0.7004 0.8369
0.3816 9.2727 510 0.7526 0.3871 0.7526 0.8675
0.3816 9.3091 512 0.7965 0.3716 0.7965 0.8924
0.3816 9.3455 514 0.7541 0.3888 0.7541 0.8684

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
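
To confirm that a local environment matches the versions above before loading the checkpoint, a quick check (a sketch, not part of the original card) can be run:

```python
# Print installed library versions; expected: 4.44.2, 2.4.0+cu118, 2.21.0, 0.19.1.
import datasets, tokenizers, torch, transformers

for name, module in [("Transformers", transformers), ("PyTorch", torch),
                     ("Datasets", datasets), ("Tokenizers", tokenizers)]:
    print(f"{name}: {module.__version__}")
```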