ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k19_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9681
  • Qwk: 0.5426
  • Mse: 0.9681
  • Rmse: 0.9839
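These metrics can be reproduced from gold labels and predictions with standard scikit-learn calls. A minimal sketch, assuming the task is ordinal scoring as the Qwk metric suggests; the toy labels below are illustrative, not the model's actual outputs:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Toy gold labels and predictions on an ordinal scale (illustrative only).
y_true = np.array([0, 1, 2, 2])
y_pred = np.array([0, 1, 1, 2])

# Quadratic Weighted Kappa (Qwk): chance-corrected agreement on ordinal
# labels, penalizing larger disagreements quadratically.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

# MSE is the mean squared difference between predictions and labels;
# RMSE is its square root, in the same units as the labels.
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)

print(round(qwk, 4), mse, rmse)  # → 0.8 0.25 0.5
```

Note that the reported Loss and Mse coincide (0.9681), consistent with MSE being used as the training objective.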

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
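The hyperparameters above map directly onto Hugging Face `TrainingArguments`. A minimal sketch of the equivalent configuration; the `output_dir` name is a placeholder, and the Adam betas/epsilon listed above are the Transformers defaults, so they need not be set explicitly:

```python
from transformers import TrainingArguments

# Mirror of the hyperparameters listed above; "results" is a placeholder path.
args = TrainingArguments(
    output_dir="results",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults,
    # matching the optimizer settings reported in this card.
)
```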

Training results

The training loss was first logged at step 500; earlier rows show "No log".

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0225 2 6.7955 0.0057 6.7955 2.6068
No log 0.0449 4 4.1470 0.0631 4.1470 2.0364
No log 0.0674 6 3.3956 -0.0219 3.3956 1.8427
No log 0.0899 8 2.6864 0.0432 2.6864 1.6390
No log 0.1124 10 1.8866 0.1322 1.8866 1.3735
No log 0.1348 12 2.3269 0.0704 2.3269 1.5254
No log 0.1573 14 3.9433 0.0808 3.9433 1.9858
No log 0.1798 16 4.0577 0.0657 4.0577 2.0144
No log 0.2022 18 2.8818 0.0476 2.8818 1.6976
No log 0.2247 20 1.8388 0.1695 1.8388 1.3560
No log 0.2472 22 1.7131 0.1165 1.7131 1.3089
No log 0.2697 24 3.1128 -0.1127 3.1128 1.7643
No log 0.2921 26 3.7027 -0.0339 3.7027 1.9242
No log 0.3146 28 2.7960 0.0552 2.7960 1.6721
No log 0.3371 30 1.9023 0.1818 1.9023 1.3792
No log 0.3596 32 1.8843 0.2301 1.8843 1.3727
No log 0.3820 34 1.9573 0.2281 1.9573 1.3990
No log 0.4045 36 2.0238 0.25 2.0238 1.4226
No log 0.4270 38 2.2783 0.1151 2.2783 1.5094
No log 0.4494 40 2.4022 0.0822 2.4022 1.5499
No log 0.4719 42 2.1965 0.1408 2.1965 1.4820
No log 0.4944 44 1.9957 0.2667 1.9957 1.4127
No log 0.5169 46 1.8991 0.3008 1.8991 1.3781
No log 0.5393 48 1.9315 0.2647 1.9315 1.3898
No log 0.5618 50 1.8444 0.2815 1.8444 1.3581
No log 0.5843 52 1.8903 0.2174 1.8903 1.3749
No log 0.6067 54 2.0801 0.1571 2.0801 1.4423
No log 0.6292 56 2.4122 0.1605 2.4122 1.5531
No log 0.6517 58 2.4921 0.1893 2.4921 1.5786
No log 0.6742 60 2.1579 0.2484 2.1579 1.4690
No log 0.6966 62 1.6520 0.4060 1.6520 1.2853
No log 0.7191 64 1.5969 0.3846 1.5969 1.2637
No log 0.7416 66 1.6630 0.3846 1.6630 1.2896
No log 0.7640 68 1.6710 0.3622 1.6710 1.2927
No log 0.7865 70 1.7367 0.3622 1.7367 1.3178
No log 0.8090 72 1.8833 0.3310 1.8833 1.3723
No log 0.8315 74 2.1353 0.275 2.1353 1.4612
No log 0.8539 76 2.4611 0.2712 2.4611 1.5688
No log 0.8764 78 2.3402 0.2644 2.3402 1.5298
No log 0.8989 80 2.1068 0.3152 2.1068 1.4515
No log 0.9213 82 1.9139 0.3951 1.9139 1.3834
No log 0.9438 84 1.8281 0.3951 1.8281 1.3521
No log 0.9663 86 1.7008 0.4051 1.7008 1.3042
No log 0.9888 88 1.9687 0.4 1.9687 1.4031
No log 1.0112 90 2.2865 0.3295 2.2865 1.5121
No log 1.0337 92 2.4177 0.3043 2.4177 1.5549
No log 1.0562 94 2.1429 0.3432 2.1429 1.4638
No log 1.0787 96 1.6919 0.3830 1.6919 1.3007
No log 1.1011 98 1.4314 0.4219 1.4314 1.1964
No log 1.1236 100 1.3171 0.4308 1.3171 1.1476
No log 1.1461 102 1.2138 0.5116 1.2138 1.1017
No log 1.1685 104 1.1704 0.4961 1.1704 1.0819
No log 1.1910 106 1.3086 0.4724 1.3086 1.1439
No log 1.2135 108 1.5123 0.4328 1.5123 1.2298
No log 1.2360 110 1.4830 0.4296 1.4830 1.2178
No log 1.2584 112 1.4832 0.3796 1.4832 1.2179
No log 1.2809 114 1.5360 0.3212 1.5360 1.2393
No log 1.3034 116 1.5351 0.3916 1.5351 1.2390
No log 1.3258 118 1.5955 0.3836 1.5955 1.2631
No log 1.3483 120 1.7219 0.4189 1.7219 1.3122
No log 1.3708 122 1.8388 0.4026 1.8388 1.3560
No log 1.3933 124 2.2436 0.3218 2.2436 1.4979
No log 1.4157 126 2.3477 0.3548 2.3477 1.5322
No log 1.4382 128 2.2312 0.3934 2.2312 1.4937
No log 1.4607 130 2.3904 0.3731 2.3904 1.5461
No log 1.4831 132 2.6136 0.3776 2.6136 1.6167
No log 1.5056 134 2.0546 0.4000 2.0546 1.4334
No log 1.5281 136 1.6677 0.4417 1.6677 1.2914
No log 1.5506 138 1.5011 0.4837 1.5011 1.2252
No log 1.5730 140 1.4544 0.48 1.4544 1.2060
No log 1.5955 142 1.6158 0.4533 1.6158 1.2711
No log 1.6180 144 1.6811 0.4564 1.6811 1.2966
No log 1.6404 146 1.7201 0.4161 1.7201 1.3115
No log 1.6629 148 1.6698 0.4306 1.6698 1.2922
No log 1.6854 150 1.4897 0.4818 1.4897 1.2205
No log 1.7079 152 1.1962 0.4923 1.1962 1.0937
No log 1.7303 154 0.9963 0.5865 0.9963 0.9982
No log 1.7528 156 1.0478 0.5970 1.0478 1.0236
No log 1.7753 158 1.1195 0.5778 1.1195 1.0580
No log 1.7978 160 1.3831 0.5359 1.3831 1.1760
No log 1.8202 162 1.6177 0.4444 1.6177 1.2719
No log 1.8427 164 1.5683 0.5311 1.5683 1.2523
No log 1.8652 166 1.6114 0.5393 1.6114 1.2694
No log 1.8876 168 1.4462 0.5731 1.4462 1.2026
No log 1.9101 170 1.2162 0.6053 1.2162 1.1028
No log 1.9326 172 0.9949 0.6074 0.9949 0.9975
No log 1.9551 174 0.9273 0.5649 0.9273 0.9630
No log 1.9775 176 0.8975 0.5538 0.8975 0.9474
No log 2.0 178 0.8830 0.5802 0.8831 0.9397
No log 2.0225 180 0.9728 0.6324 0.9728 0.9863
No log 2.0449 182 1.1962 0.6316 1.1962 1.0937
No log 2.0674 184 1.4073 0.5614 1.4073 1.1863
No log 2.0899 186 1.4411 0.5591 1.4411 1.2005
No log 2.1124 188 1.3287 0.6082 1.3287 1.1527
No log 2.1348 190 1.1771 0.6587 1.1771 1.0850
No log 2.1573 192 1.1262 0.6752 1.1262 1.0612
No log 2.1798 194 1.0195 0.6577 1.0195 1.0097
No log 2.2022 196 0.9340 0.6711 0.9340 0.9664
No log 2.2247 198 1.0108 0.6027 1.0108 1.0054
No log 2.2472 200 0.9780 0.6164 0.9780 0.9890
No log 2.2697 202 0.8469 0.6573 0.8469 0.9203
No log 2.2921 204 0.8205 0.6471 0.8205 0.9058
No log 2.3146 206 0.8398 0.6165 0.8398 0.9164
No log 2.3371 208 0.8431 0.6222 0.8431 0.9182
No log 2.3596 210 0.8819 0.5985 0.8819 0.9391
No log 2.3820 212 0.9660 0.6528 0.9660 0.9829
No log 2.4045 214 0.9548 0.6577 0.9548 0.9771
No log 2.4270 216 0.8622 0.6483 0.8622 0.9285
No log 2.4494 218 0.7405 0.6901 0.7405 0.8605
No log 2.4719 220 0.7166 0.7391 0.7166 0.8465
No log 2.4944 222 0.8669 0.6569 0.8669 0.9311
No log 2.5169 224 1.0128 0.5571 1.0128 1.0064
No log 2.5393 226 0.9511 0.5839 0.9511 0.9753
No log 2.5618 228 0.8070 0.7164 0.8070 0.8983
No log 2.5843 230 0.7582 0.6912 0.7582 0.8707
No log 2.6067 232 0.7879 0.6475 0.7879 0.8876
No log 2.6292 234 0.8277 0.6331 0.8277 0.9098
No log 2.6517 236 0.8559 0.6667 0.8559 0.9252
No log 2.6742 238 0.8362 0.6806 0.8362 0.9145
No log 2.6966 240 0.7840 0.7042 0.7840 0.8854
No log 2.7191 242 0.7815 0.6901 0.7815 0.8840
No log 2.7416 244 0.8347 0.6620 0.8347 0.9136
No log 2.7640 246 0.8434 0.6621 0.8434 0.9183
No log 2.7865 248 0.8787 0.6294 0.8787 0.9374
No log 2.8090 250 0.8256 0.6667 0.8256 0.9086
No log 2.8315 252 0.7864 0.6569 0.7864 0.8868
No log 2.8539 254 0.8063 0.6519 0.8063 0.8979
No log 2.8764 256 0.7342 0.6950 0.7342 0.8569
No log 2.8989 258 0.8000 0.7226 0.8000 0.8944
No log 2.9213 260 1.0115 0.6708 1.0115 1.0057
No log 2.9438 262 1.1789 0.6705 1.1789 1.0858
No log 2.9663 264 1.0520 0.6667 1.0520 1.0257
No log 2.9888 266 0.8399 0.6486 0.8399 0.9165
No log 3.0112 268 0.7612 0.6901 0.7612 0.8724
No log 3.0337 270 0.7614 0.6519 0.7614 0.8726
No log 3.0562 272 0.7816 0.6667 0.7816 0.8841
No log 3.0787 274 0.7856 0.6364 0.7856 0.8863
No log 3.1011 276 0.7588 0.6569 0.7588 0.8711
No log 3.1236 278 0.7419 0.6944 0.7419 0.8613
No log 3.1461 280 0.7461 0.6928 0.7461 0.8638
No log 3.1685 282 0.7584 0.7170 0.7584 0.8709
No log 3.1910 284 0.7625 0.7239 0.7625 0.8732
No log 3.2135 286 0.7935 0.7590 0.7935 0.8908
No log 3.2360 288 0.8759 0.7273 0.8759 0.9359
No log 3.2584 290 0.9344 0.7073 0.9344 0.9666
No log 3.2809 292 0.8705 0.6792 0.8705 0.9330
No log 3.3034 294 0.7762 0.6923 0.7762 0.8810
No log 3.3258 296 0.7273 0.7083 0.7273 0.8528
No log 3.3483 298 0.7553 0.6892 0.7553 0.8691
No log 3.3708 300 0.6911 0.7517 0.6911 0.8314
No log 3.3933 302 0.6153 0.7785 0.6153 0.7844
No log 3.4157 304 0.6059 0.7785 0.6059 0.7784
No log 3.4382 306 0.6832 0.7248 0.6832 0.8266
No log 3.4607 308 0.7891 0.7143 0.7891 0.8883
No log 3.4831 310 0.7477 0.7114 0.7477 0.8647
No log 3.5056 312 0.6821 0.7550 0.6821 0.8259
No log 3.5281 314 0.6899 0.7483 0.6899 0.8306
No log 3.5506 316 0.7383 0.7320 0.7383 0.8592
No log 3.5730 318 0.8836 0.7 0.8836 0.9400
No log 3.5955 320 0.9764 0.6982 0.9764 0.9882
No log 3.6180 322 0.8781 0.7152 0.8781 0.9371
No log 3.6404 324 0.6992 0.76 0.6992 0.8362
No log 3.6629 326 0.6579 0.7413 0.6579 0.8111
No log 3.6854 328 0.6919 0.7338 0.6919 0.8318
No log 3.7079 330 0.7280 0.6963 0.7280 0.8532
No log 3.7303 332 0.7399 0.6818 0.7399 0.8602
No log 3.7528 334 0.7588 0.6515 0.7588 0.8711
No log 3.7753 336 0.7421 0.6970 0.7421 0.8614
No log 3.7978 338 0.7455 0.6917 0.7455 0.8634
No log 3.8202 340 0.7940 0.6522 0.7940 0.8911
No log 3.8427 342 0.9236 0.6014 0.9236 0.9610
No log 3.8652 344 0.9526 0.6111 0.9526 0.9760
No log 3.8876 346 0.8434 0.6621 0.8434 0.9184
No log 3.9101 348 0.7378 0.6857 0.7378 0.8590
No log 3.9326 350 0.6915 0.7552 0.6915 0.8316
No log 3.9551 352 0.6944 0.7517 0.6944 0.8333
No log 3.9775 354 0.7080 0.6980 0.7080 0.8414
No log 4.0 356 0.6839 0.7792 0.6839 0.8270
No log 4.0225 358 0.6913 0.75 0.6913 0.8315
No log 4.0449 360 0.7185 0.7582 0.7185 0.8477
No log 4.0674 362 0.7892 0.7114 0.7892 0.8884
No log 4.0899 364 0.8257 0.7152 0.8257 0.9087
No log 4.1124 366 0.8148 0.7152 0.8148 0.9027
No log 4.1348 368 0.7852 0.6993 0.7852 0.8861
No log 4.1573 370 0.7131 0.7361 0.7131 0.8444
No log 4.1798 372 0.6676 0.7724 0.6676 0.8171
No log 4.2022 374 0.6912 0.75 0.6912 0.8314
No log 4.2247 376 0.6787 0.75 0.6787 0.8239
No log 4.2472 378 0.6319 0.8027 0.6319 0.7949
No log 4.2697 380 0.6266 0.8108 0.6266 0.7916
No log 4.2921 382 0.6364 0.7871 0.6364 0.7978
No log 4.3146 384 0.7395 0.7261 0.7395 0.8599
No log 4.3371 386 0.8464 0.6994 0.8464 0.9200
No log 4.3596 388 0.8880 0.7030 0.8880 0.9423
No log 4.3820 390 0.8406 0.6709 0.8406 0.9169
No log 4.4045 392 0.8224 0.6835 0.8224 0.9069
No log 4.4270 394 0.8346 0.6624 0.8346 0.9135
No log 4.4494 396 0.8075 0.6624 0.8075 0.8986
No log 4.4719 398 0.7911 0.6918 0.7911 0.8895
No log 4.4944 400 0.7803 0.6842 0.7803 0.8833
No log 4.5169 402 0.8381 0.64 0.8381 0.9155
No log 4.5393 404 0.8563 0.6351 0.8563 0.9253
No log 4.5618 406 0.8442 0.6577 0.8442 0.9188
No log 4.5843 408 0.7692 0.6887 0.7692 0.8770
No log 4.6067 410 0.7428 0.6887 0.7428 0.8619
No log 4.6292 412 0.7888 0.7044 0.7888 0.8882
No log 4.6517 414 0.8407 0.6584 0.8407 0.9169
No log 4.6742 416 0.8023 0.7195 0.8023 0.8957
No log 4.6966 418 0.7331 0.7349 0.7331 0.8562
No log 4.7191 420 0.7326 0.7349 0.7326 0.8559
No log 4.7416 422 0.8060 0.7412 0.8060 0.8978
No log 4.7640 424 0.8374 0.7066 0.8374 0.9151
No log 4.7865 426 0.9310 0.6829 0.9310 0.9649
No log 4.8090 428 0.9251 0.6242 0.9251 0.9618
No log 4.8315 430 0.8350 0.5942 0.8350 0.9138
No log 4.8539 432 0.7828 0.6331 0.7828 0.8847
No log 4.8764 434 0.7662 0.6667 0.7662 0.8753
No log 4.8989 436 0.7864 0.6471 0.7864 0.8868
No log 4.9213 438 0.8249 0.6222 0.8249 0.9082
No log 4.9438 440 0.8790 0.5672 0.8790 0.9376
No log 4.9663 442 0.9146 0.5778 0.9146 0.9564
No log 4.9888 444 0.9879 0.6043 0.9879 0.9939
No log 5.0112 446 1.0325 0.6710 1.0325 1.0161
No log 5.0337 448 0.9561 0.6497 0.9561 0.9778
No log 5.0562 450 0.7723 0.6842 0.7723 0.8788
No log 5.0787 452 0.6611 0.7682 0.6611 0.8131
No log 5.1011 454 0.6251 0.7724 0.6251 0.7906
No log 5.1236 456 0.6298 0.7724 0.6298 0.7936
No log 5.1461 458 0.6363 0.75 0.6363 0.7977
No log 5.1685 460 0.6660 0.7586 0.6660 0.8161
No log 5.1910 462 0.7138 0.7413 0.7138 0.8448
No log 5.2135 464 0.7581 0.7042 0.7581 0.8707
No log 5.2360 466 0.8358 0.6345 0.8358 0.9142
No log 5.2584 468 0.9308 0.6709 0.9308 0.9648
No log 5.2809 470 0.9014 0.6909 0.9014 0.9494
No log 5.3034 472 0.7843 0.7073 0.7843 0.8856
No log 5.3258 474 0.7092 0.7439 0.7092 0.8422
No log 5.3483 476 0.7013 0.7654 0.7013 0.8374
No log 5.3708 478 0.7998 0.7229 0.7998 0.8943
No log 5.3933 480 0.9201 0.6905 0.9201 0.9592
No log 5.4157 482 0.9031 0.6982 0.9031 0.9503
No log 5.4382 484 0.7938 0.7152 0.7938 0.8909
No log 5.4607 486 0.6686 0.8118 0.6686 0.8177
No log 5.4831 488 0.6192 0.8 0.6192 0.7869
No log 5.5056 490 0.6107 0.8 0.6107 0.7815
No log 5.5281 492 0.6376 0.8118 0.6376 0.7985
No log 5.5506 494 0.7146 0.7765 0.7146 0.8453
No log 5.5730 496 0.8316 0.6982 0.8316 0.9119
No log 5.5955 498 0.9700 0.6550 0.9700 0.9849
0.4994 5.6180 500 0.9680 0.6550 0.9680 0.9839
0.4994 5.6404 502 0.8604 0.7176 0.8604 0.9276
0.4994 5.6629 504 0.7929 0.7251 0.7929 0.8905
0.4994 5.6854 506 0.7218 0.7456 0.7218 0.8496
0.4994 5.7079 508 0.6613 0.7882 0.6613 0.8132
0.4994 5.7303 510 0.6646 0.7882 0.6646 0.8152
0.4994 5.7528 512 0.6920 0.75 0.6920 0.8319
0.4994 5.7753 514 0.7288 0.7381 0.7288 0.8537
0.4994 5.7978 516 0.6914 0.75 0.6914 0.8315
0.4994 5.8202 518 0.6642 0.7836 0.6642 0.8150
0.4994 5.8427 520 0.6728 0.7647 0.6728 0.8202
0.4994 5.8652 522 0.7551 0.7684 0.7551 0.8690
0.4994 5.8876 524 0.9411 0.6857 0.9411 0.9701
0.4994 5.9101 526 1.1648 0.6444 1.1648 1.0793
0.4994 5.9326 528 1.1459 0.6630 1.1459 1.0705
0.4994 5.9551 530 1.0207 0.6818 1.0207 1.0103
0.4994 5.9775 532 0.9713 0.7333 0.9713 0.9855
0.4994 6.0 534 0.9645 0.7333 0.9645 0.9821
0.4994 6.0225 536 1.0437 0.7072 1.0437 1.0216
0.4994 6.0449 538 1.1221 0.6667 1.1221 1.0593
0.4994 6.0674 540 1.0506 0.6927 1.0506 1.0250
0.4994 6.0899 542 0.9560 0.7293 0.9560 0.9777
0.4994 6.1124 544 0.8625 0.7293 0.8625 0.9287
0.4994 6.1348 546 0.7679 0.7571 0.7679 0.8763
0.4994 6.1573 548 0.7081 0.7647 0.7081 0.8415
0.4994 6.1798 550 0.7270 0.7647 0.7270 0.8526
0.4994 6.2022 552 0.8201 0.7543 0.8201 0.9056
0.4994 6.2247 554 0.8835 0.7159 0.8835 0.9399
0.4994 6.2472 556 0.8491 0.7345 0.8491 0.9215
0.4994 6.2697 558 0.8601 0.7356 0.8601 0.9274
0.4994 6.2921 560 0.8580 0.7356 0.8580 0.9263
0.4994 6.3146 562 0.8515 0.7356 0.8515 0.9227
0.4994 6.3371 564 0.7768 0.7305 0.7768 0.8814
0.4994 6.3596 566 0.7377 0.7273 0.7377 0.8589
0.4994 6.3820 568 0.6813 0.7342 0.6813 0.8254
0.4994 6.4045 570 0.6491 0.7673 0.6491 0.8057
0.4994 6.4270 572 0.6671 0.7531 0.6671 0.8168
0.4994 6.4494 574 0.7507 0.7425 0.7507 0.8664
0.4994 6.4719 576 0.8786 0.7143 0.8786 0.9374
0.4994 6.4944 578 0.9357 0.7093 0.9357 0.9673
0.4994 6.5169 580 0.9443 0.6941 0.9443 0.9717
0.4994 6.5393 582 0.9547 0.6941 0.9547 0.9771
0.4994 6.5618 584 0.8809 0.6941 0.8809 0.9386
0.4994 6.5843 586 0.7144 0.7362 0.7144 0.8452
0.4994 6.6067 588 0.6414 0.7468 0.6414 0.8009
0.4994 6.6292 590 0.6192 0.7742 0.6192 0.7869
0.4994 6.6517 592 0.6427 0.7484 0.6427 0.8017
0.4994 6.6742 594 0.6347 0.7703 0.6347 0.7967
0.4994 6.6966 596 0.6900 0.7152 0.6900 0.8306
0.4994 6.7191 598 0.7420 0.6803 0.7420 0.8614
0.4994 6.7416 600 0.7480 0.6620 0.7480 0.8649
0.4994 6.7640 602 0.7514 0.6620 0.7514 0.8669
0.4994 6.7865 604 0.7644 0.6620 0.7644 0.8743
0.4994 6.8090 606 0.7529 0.6569 0.7529 0.8677
0.4994 6.8315 608 0.7690 0.6324 0.7690 0.8769
0.4994 6.8539 610 0.8791 0.6014 0.8791 0.9376
0.4994 6.8764 612 0.9896 0.6197 0.9896 0.9948
0.4994 6.8989 614 1.0399 0.6056 1.0399 1.0198
0.4994 6.9213 616 0.9681 0.5426 0.9681 0.9839

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k19_task1_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.