ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k17_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7193
  • Qwk: 0.3272
  • Mse: 0.7193
  • Rmse: 0.8481
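Qwk is the quadratic weighted kappa, a standard agreement metric for ordinal scoring tasks such as essay-trait rating, and Mse/Rmse are the (root) mean squared error between predicted and gold scores. A minimal, self-contained sketch of how these three metrics are computed (the labels and 4-point score range below are illustrative only, not taken from the actual evaluation set):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the 'Qwk' column above)."""
    # Observed confusion matrix.
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    hist_t = [sum(row) for row in O]                                  # gold label counts
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]  # predicted counts
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2   # quadratic disagreement weight
            num += w * O[i][j]                         # weighted observed disagreement
            den += w * hist_t[i] * hist_p[j] / n       # weighted expected disagreement
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical scores on a 0-3 scale, for illustration only.
y_true = [0, 1, 2, 1, 3]
y_pred = [0, 2, 2, 1, 2]
m = mse(y_true, y_pred)
print(round(m, 4), round(math.sqrt(m), 4))                 # → 0.4 0.6325
print(round(quadratic_weighted_kappa(y_true, y_pred, 4), 4))  # → 0.7619
```

Note that Mse equals Loss here, which suggests the model was trained with an MSE (regression) objective on the ordinal scores.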

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
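With lr_scheduler_type: linear and no reported warmup, the learning rate decays linearly from 2e-05 to zero over the run. A small sketch of that schedule (the steps-per-epoch value is inferred from the results table below, where epoch 1.0 falls at step 86; treating total steps as epochs × steps-per-epoch is an assumption):

```python
INITIAL_LR = 2e-05
STEPS_PER_EPOCH = 86                         # inferred: epoch 1.0 logged at step 86
NUM_EPOCHS = 100
TOTAL_STEPS = STEPS_PER_EPOCH * NUM_EPOCHS   # 8600 optimizer steps in total

def linear_lr(step):
    """LR after `step` optimizer steps under a warmup-free linear decay."""
    return INITIAL_LR * max(0.0, (TOTAL_STEPS - step) / TOTAL_STEPS)

print(linear_lr(0))      # → 2e-05 at the start
print(linear_lr(4300))   # → 1e-05 halfway through
print(linear_lr(8600))   # → 0.0 at the end
```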

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0233 2 2.5265 -0.0758 2.5265 1.5895
No log 0.0465 4 1.3903 0.0985 1.3903 1.1791
No log 0.0698 6 1.2934 -0.1569 1.2934 1.1373
No log 0.0930 8 1.1515 -0.0155 1.1515 1.0731
No log 0.1163 10 1.3255 -0.1173 1.3255 1.1513
No log 0.1395 12 1.4889 -0.1185 1.4889 1.2202
No log 0.1628 14 1.3791 -0.0284 1.3791 1.1744
No log 0.1860 16 1.2204 -0.0581 1.2204 1.1047
No log 0.2093 18 1.1847 -0.0437 1.1847 1.0884
No log 0.2326 20 1.1585 -0.0978 1.1585 1.0763
No log 0.2558 22 1.0652 -0.0918 1.0652 1.0321
No log 0.2791 24 1.0273 0.0364 1.0273 1.0135
No log 0.3023 26 1.0202 -0.0925 1.0202 1.0101
No log 0.3256 28 1.0827 0.0119 1.0827 1.0405
No log 0.3488 30 1.3291 -0.1928 1.3291 1.1529
No log 0.3721 32 1.1675 -0.0468 1.1675 1.0805
No log 0.3953 34 1.1725 -0.0313 1.1725 1.0828
No log 0.4186 36 1.2159 -0.0510 1.2159 1.1027
No log 0.4419 38 1.2217 -0.0632 1.2217 1.1053
No log 0.4651 40 1.1616 -0.0128 1.1616 1.0778
No log 0.4884 42 1.4276 -0.1430 1.4276 1.1948
No log 0.5116 44 1.9407 -0.0734 1.9407 1.3931
No log 0.5349 46 1.7161 -0.0104 1.7161 1.3100
No log 0.5581 48 1.1717 -0.0446 1.1717 1.0824
No log 0.5814 50 0.8423 0.0444 0.8423 0.9178
No log 0.6047 52 0.8176 0.1550 0.8176 0.9042
No log 0.6279 54 0.8212 0.1550 0.8212 0.9062
No log 0.6512 56 0.8360 0.0804 0.8360 0.9143
No log 0.6744 58 0.8403 0.0804 0.8403 0.9167
No log 0.6977 60 0.8478 0.0444 0.8478 0.9208
No log 0.7209 62 0.8580 0.0804 0.8580 0.9263
No log 0.7442 64 0.8531 0.1007 0.8531 0.9237
No log 0.7674 66 0.9068 0.2045 0.9068 0.9523
No log 0.7907 68 0.8725 0.2319 0.8725 0.9341
No log 0.8140 70 0.9382 0.0765 0.9382 0.9686
No log 0.8372 72 1.1434 0.0679 1.1434 1.0693
No log 0.8605 74 1.1300 0.0411 1.1300 1.0630
No log 0.8837 76 0.9346 0.1893 0.9346 0.9668
No log 0.9070 78 0.8012 0.2002 0.8012 0.8951
No log 0.9302 80 0.8008 0.2041 0.8008 0.8949
No log 0.9535 82 0.8718 0.1786 0.8718 0.9337
No log 0.9767 84 0.8910 0.1786 0.8910 0.9439
No log 1.0 86 0.7943 0.1372 0.7943 0.8912
No log 1.0233 88 0.7656 0.1138 0.7656 0.8750
No log 1.0465 90 0.8059 0.1268 0.8059 0.8977
No log 1.0698 92 0.8510 0.0971 0.8510 0.9225
No log 1.0930 94 0.9028 0.1624 0.9028 0.9502
No log 1.1163 96 0.9639 0.1566 0.9639 0.9818
No log 1.1395 98 0.9415 0.0933 0.9415 0.9703
No log 1.1628 100 0.9503 0.0364 0.9503 0.9748
No log 1.1860 102 0.9604 -0.0112 0.9604 0.9800
No log 1.2093 104 0.9971 0.0405 0.9971 0.9985
No log 1.2326 106 1.0278 0.2616 1.0278 1.0138
No log 1.2558 108 0.9546 0.0405 0.9546 0.9771
No log 1.2791 110 0.8703 0.0856 0.8703 0.9329
No log 1.3023 112 0.9091 0.1440 0.9091 0.9535
No log 1.3256 114 1.0046 0.0827 1.0046 1.0023
No log 1.3488 116 0.9461 0.1089 0.9461 0.9727
No log 1.3721 118 0.9364 0.0946 0.9364 0.9677
No log 1.3953 120 0.9914 0.0918 0.9914 0.9957
No log 1.4186 122 1.0982 0.2141 1.0982 1.0480
No log 1.4419 124 1.0818 0.1963 1.0818 1.0401
No log 1.4651 126 0.9538 0.1446 0.9538 0.9766
No log 1.4884 128 0.8802 0.2229 0.8802 0.9382
No log 1.5116 130 0.8901 0.0892 0.8901 0.9435
No log 1.5349 132 0.9137 0.0892 0.9137 0.9559
No log 1.5581 134 0.9536 0.1967 0.9536 0.9765
No log 1.5814 136 0.9827 0.0991 0.9827 0.9913
No log 1.6047 138 0.9628 0.1876 0.9628 0.9812
No log 1.6279 140 0.9497 0.1966 0.9497 0.9745
No log 1.6512 142 0.9813 0.1631 0.9813 0.9906
No log 1.6744 144 1.0358 0.1296 1.0358 1.0178
No log 1.6977 146 1.0114 0.1314 1.0114 1.0057
No log 1.7209 148 0.9152 0.1243 0.9152 0.9567
No log 1.7442 150 0.9101 0.3095 0.9101 0.9540
No log 1.7674 152 0.9481 0.2975 0.9481 0.9737
No log 1.7907 154 0.9218 0.2352 0.9218 0.9601
No log 1.8140 156 0.8865 0.0186 0.8865 0.9415
No log 1.8372 158 0.9852 0.1407 0.9852 0.9926
No log 1.8605 160 1.0396 0.0915 1.0396 1.0196
No log 1.8837 162 0.8891 0.2349 0.8891 0.9429
No log 1.9070 164 0.7856 0.1624 0.7856 0.8863
No log 1.9302 166 0.7761 0.2227 0.7761 0.8810
No log 1.9535 168 0.7619 0.2007 0.7619 0.8729
No log 1.9767 170 0.7735 0.1850 0.7735 0.8795
No log 2.0 172 0.7880 0.2290 0.7880 0.8877
No log 2.0233 174 0.8080 0.2249 0.8080 0.8989
No log 2.0465 176 0.7960 0.2509 0.7960 0.8922
No log 2.0698 178 0.8678 0.2202 0.8678 0.9316
No log 2.0930 180 0.8956 0.2247 0.8956 0.9463
No log 2.1163 182 0.8598 0.2523 0.8598 0.9272
No log 2.1395 184 0.8239 0.2143 0.8239 0.9077
No log 2.1628 186 0.8707 0.1741 0.8707 0.9331
No log 2.1860 188 0.8403 0.0941 0.8403 0.9167
No log 2.2093 190 0.9490 0.2308 0.9490 0.9742
No log 2.2326 192 0.9858 0.1573 0.9858 0.9929
No log 2.2558 194 0.8871 0.2328 0.8871 0.9419
No log 2.2791 196 0.8553 0.1935 0.8553 0.9248
No log 2.3023 198 0.8687 0.2687 0.8687 0.9320
No log 2.3256 200 0.8788 0.2634 0.8788 0.9374
No log 2.3488 202 0.9280 0.2617 0.9280 0.9633
No log 2.3721 204 0.9422 0.2247 0.9422 0.9707
No log 2.3953 206 0.9028 0.1960 0.9028 0.9502
No log 2.4186 208 0.9113 0.2277 0.9113 0.9546
No log 2.4419 210 0.9194 0.2049 0.9194 0.9588
No log 2.4651 212 0.9297 0.2260 0.9297 0.9642
No log 2.4884 214 1.0227 0.2097 1.0227 1.0113
No log 2.5116 216 0.9607 0.2825 0.9607 0.9802
No log 2.5349 218 0.8897 0.2409 0.8897 0.9432
No log 2.5581 220 0.8671 0.1672 0.8671 0.9312
No log 2.5814 222 0.9159 0.1542 0.9159 0.9570
No log 2.6047 224 0.8985 0.1914 0.8985 0.9479
No log 2.6279 226 0.8614 0.1597 0.8614 0.9281
No log 2.6512 228 0.8972 0.2224 0.8972 0.9472
No log 2.6744 230 0.9053 0.2224 0.9053 0.9515
No log 2.6977 232 0.8259 0.2182 0.8259 0.9088
No log 2.7209 234 0.8537 0.2751 0.8537 0.9240
No log 2.7442 236 0.9531 0.2627 0.9531 0.9763
No log 2.7674 238 0.8736 0.3207 0.8736 0.9346
No log 2.7907 240 0.7996 0.3201 0.7996 0.8942
No log 2.8140 242 0.8529 0.2612 0.8529 0.9235
No log 2.8372 244 0.8211 0.2709 0.8211 0.9062
No log 2.8605 246 0.7999 0.2796 0.7999 0.8944
No log 2.8837 248 0.8226 0.3701 0.8226 0.9070
No log 2.9070 250 0.8801 0.3183 0.8801 0.9381
No log 2.9302 252 0.8443 0.2943 0.8443 0.9189
No log 2.9535 254 0.7834 0.3839 0.7834 0.8851
No log 2.9767 256 0.7667 0.2318 0.7667 0.8756
No log 3.0 258 0.7825 0.2623 0.7825 0.8846
No log 3.0233 260 0.7893 0.3018 0.7893 0.8885
No log 3.0465 262 0.8342 0.3671 0.8342 0.9133
No log 3.0698 264 0.9311 0.3457 0.9311 0.9649
No log 3.0930 266 0.9426 0.3400 0.9426 0.9709
No log 3.1163 268 0.8624 0.4001 0.8624 0.9286
No log 3.1395 270 0.8278 0.3986 0.8278 0.9098
No log 3.1628 272 0.8187 0.3796 0.8187 0.9048
No log 3.1860 274 0.8834 0.3218 0.8834 0.9399
No log 3.2093 276 0.8310 0.2862 0.8310 0.9116
No log 3.2326 278 0.7925 0.3096 0.7925 0.8902
No log 3.2558 280 0.8621 0.2465 0.8621 0.9285
No log 3.2791 282 0.9333 0.2056 0.9333 0.9661
No log 3.3023 284 0.8914 0.2252 0.8914 0.9441
No log 3.3256 286 0.9250 0.2980 0.9250 0.9618
No log 3.3488 288 1.0406 0.2797 1.0406 1.0201
No log 3.3721 290 1.0749 0.1886 1.0749 1.0368
No log 3.3953 292 0.9596 0.1742 0.9596 0.9796
No log 3.4186 294 0.8618 0.3069 0.8618 0.9283
No log 3.4419 296 0.8719 0.3096 0.8719 0.9338
No log 3.4651 298 0.9028 0.3060 0.9028 0.9502
No log 3.4884 300 0.9427 0.3011 0.9427 0.9709
No log 3.5116 302 0.9765 0.2798 0.9765 0.9882
No log 3.5349 304 0.9721 0.3229 0.9721 0.9859
No log 3.5581 306 0.9284 0.3326 0.9284 0.9635
No log 3.5814 308 0.8998 0.3386 0.8998 0.9486
No log 3.6047 310 0.8825 0.2999 0.8825 0.9394
No log 3.6279 312 0.9075 0.2668 0.9075 0.9526
No log 3.6512 314 1.0025 0.1926 1.0025 1.0012
No log 3.6744 316 1.0289 0.2557 1.0289 1.0143
No log 3.6977 318 1.0505 0.2636 1.0505 1.0250
No log 3.7209 320 0.9662 0.3067 0.9662 0.9830
No log 3.7442 322 0.9327 0.3540 0.9327 0.9657
No log 3.7674 324 0.9252 0.3335 0.9252 0.9619
No log 3.7907 326 0.9305 0.2845 0.9305 0.9646
No log 3.8140 328 0.9141 0.2555 0.9141 0.9561
No log 3.8372 330 0.9178 0.2775 0.9178 0.9580
No log 3.8605 332 0.9372 0.2962 0.9372 0.9681
No log 3.8837 334 0.9931 0.1701 0.9931 0.9965
No log 3.9070 336 0.9531 0.2643 0.9531 0.9763
No log 3.9302 338 0.8994 0.2857 0.8994 0.9484
No log 3.9535 340 0.8849 0.3101 0.8849 0.9407
No log 3.9767 342 0.8968 0.2364 0.8968 0.9470
No log 4.0 344 0.9093 0.2336 0.9093 0.9536
No log 4.0233 346 0.8749 0.3034 0.8749 0.9354
No log 4.0465 348 0.8400 0.2947 0.8400 0.9165
No log 4.0698 350 0.8550 0.3495 0.8550 0.9247
No log 4.0930 352 0.9932 0.2567 0.9932 0.9966
No log 4.1163 354 1.0244 0.2200 1.0244 1.0121
No log 4.1395 356 0.9180 0.2827 0.9180 0.9581
No log 4.1628 358 0.7961 0.3077 0.7961 0.8923
No log 4.1860 360 0.7950 0.3136 0.7950 0.8916
No log 4.2093 362 0.8250 0.3700 0.8250 0.9083
No log 4.2326 364 0.7690 0.3417 0.7690 0.8769
No log 4.2558 366 0.7116 0.3198 0.7116 0.8436
No log 4.2791 368 0.7079 0.2622 0.7079 0.8414
No log 4.3023 370 0.7124 0.2622 0.7124 0.8440
No log 4.3256 372 0.7380 0.3171 0.7380 0.8591
No log 4.3488 374 0.8413 0.2778 0.8413 0.9172
No log 4.3721 376 0.9001 0.3206 0.9001 0.9487
No log 4.3953 378 0.9027 0.2887 0.9027 0.9501
No log 4.4186 380 0.8188 0.2445 0.8188 0.9049
No log 4.4419 382 0.7719 0.2867 0.7719 0.8786
No log 4.4651 384 0.7815 0.3715 0.7815 0.8840
No log 4.4884 386 0.7949 0.2867 0.7949 0.8916
No log 4.5116 388 0.8275 0.2606 0.8275 0.9097
No log 4.5349 390 0.8926 0.2127 0.8926 0.9448
No log 4.5581 392 0.9841 0.2533 0.9841 0.9920
No log 4.5814 394 1.0781 0.2872 1.0781 1.0383
No log 4.6047 396 1.0917 0.2777 1.0917 1.0449
No log 4.6279 398 0.9683 0.2756 0.9683 0.9840
No log 4.6512 400 0.8947 0.2142 0.8947 0.9459
No log 4.6744 402 0.9011 0.2410 0.9011 0.9493
No log 4.6977 404 0.9026 0.2443 0.9026 0.9500
No log 4.7209 406 0.9277 0.2857 0.9277 0.9632
No log 4.7442 408 0.9311 0.2590 0.9311 0.9649
No log 4.7674 410 0.9125 0.2776 0.9125 0.9553
No log 4.7907 412 0.9125 0.2676 0.9125 0.9553
No log 4.8140 414 0.9843 0.2555 0.9843 0.9921
No log 4.8372 416 0.9480 0.2958 0.9480 0.9736
No log 4.8605 418 0.8772 0.3068 0.8772 0.9366
No log 4.8837 420 0.7679 0.2893 0.7679 0.8763
No log 4.9070 422 0.7179 0.4137 0.7179 0.8473
No log 4.9302 424 0.7143 0.3835 0.7143 0.8452
No log 4.9535 426 0.7470 0.3299 0.7470 0.8643
No log 4.9767 428 0.8084 0.3746 0.8084 0.8991
No log 5.0 430 0.8214 0.3092 0.8214 0.9063
No log 5.0233 432 0.7322 0.3434 0.7322 0.8557
No log 5.0465 434 0.7246 0.3305 0.7246 0.8513
No log 5.0698 436 0.8188 0.2706 0.8188 0.9049
No log 5.0930 438 0.8205 0.2958 0.8205 0.9058
No log 5.1163 440 0.7176 0.3950 0.7176 0.8471
No log 5.1395 442 0.7327 0.3842 0.7327 0.8560
No log 5.1628 444 0.7417 0.3486 0.7417 0.8612
No log 5.1860 446 0.6930 0.4504 0.6930 0.8324
No log 5.2093 448 0.6980 0.3835 0.6980 0.8355
No log 5.2326 450 0.7228 0.3163 0.7228 0.8502
No log 5.2558 452 0.7283 0.3128 0.7283 0.8534
No log 5.2791 454 0.6989 0.3266 0.6989 0.8360
No log 5.3023 456 0.7001 0.3198 0.7001 0.8367
No log 5.3256 458 0.7022 0.3122 0.7022 0.8380
No log 5.3488 460 0.7042 0.3400 0.7042 0.8392
No log 5.3721 462 0.7031 0.3939 0.7031 0.8385
No log 5.3953 464 0.7343 0.3248 0.7343 0.8569
No log 5.4186 466 0.7368 0.3146 0.7368 0.8584
No log 5.4419 468 0.7313 0.3955 0.7313 0.8552
No log 5.4651 470 0.8045 0.3586 0.8045 0.8969
No log 5.4884 472 0.8356 0.3224 0.8356 0.9141
No log 5.5116 474 0.7815 0.3586 0.7815 0.8840
No log 5.5349 476 0.7308 0.3881 0.7308 0.8549
No log 5.5581 478 0.7248 0.3955 0.7248 0.8513
No log 5.5814 480 0.7339 0.2862 0.7339 0.8567
No log 5.6047 482 0.7856 0.3305 0.7856 0.8864
No log 5.6279 484 0.8340 0.3280 0.8340 0.9132
No log 5.6512 486 0.8075 0.3798 0.8075 0.8986
No log 5.6744 488 0.7170 0.2914 0.7170 0.8468
No log 5.6977 490 0.6797 0.3859 0.6797 0.8244
No log 5.7209 492 0.6875 0.4051 0.6875 0.8291
No log 5.7442 494 0.7087 0.4456 0.7087 0.8418
No log 5.7674 496 0.7021 0.3899 0.7021 0.8379
No log 5.7907 498 0.8701 0.4400 0.8701 0.9328
0.3903 5.8140 500 1.0430 0.3478 1.0430 1.0213
0.3903 5.8372 502 1.0239 0.4125 1.0239 1.0119
0.3903 5.8605 504 0.8467 0.3641 0.8467 0.9202
0.3903 5.8837 506 0.7082 0.3791 0.7082 0.8416
0.3903 5.9070 508 0.7587 0.4926 0.7587 0.8710
0.3903 5.9302 510 0.8854 0.4114 0.8854 0.9410
0.3903 5.9535 512 0.8313 0.3657 0.8313 0.9117
0.3903 5.9767 514 0.7010 0.5078 0.7010 0.8373
0.3903 6.0 516 0.6831 0.3559 0.6831 0.8265
0.3903 6.0233 518 0.6963 0.3211 0.6963 0.8345
0.3903 6.0465 520 0.7331 0.3377 0.7331 0.8562
0.3903 6.0698 522 0.7626 0.4263 0.7626 0.8733
0.3903 6.0930 524 0.8072 0.4531 0.8072 0.8984
0.3903 6.1163 526 0.7842 0.3822 0.7842 0.8856
0.3903 6.1395 528 0.7269 0.3377 0.7269 0.8526
0.3903 6.1628 530 0.6822 0.3235 0.6822 0.8259
0.3903 6.1860 532 0.6931 0.4022 0.6931 0.8325
0.3903 6.2093 534 0.6858 0.3578 0.6858 0.8282
0.3903 6.2326 536 0.7276 0.3701 0.7276 0.8530
0.3903 6.2558 538 0.8059 0.3183 0.8059 0.8977
0.3903 6.2791 540 0.7897 0.3544 0.7897 0.8886
0.3903 6.3023 542 0.7115 0.4147 0.7115 0.8435
0.3903 6.3256 544 0.6910 0.3086 0.6910 0.8313
0.3903 6.3488 546 0.6934 0.3285 0.6934 0.8327
0.3903 6.3721 548 0.6999 0.3643 0.6999 0.8366
0.3903 6.3953 550 0.7042 0.2999 0.7042 0.8392
0.3903 6.4186 552 0.7127 0.2940 0.7127 0.8442
0.3903 6.4419 554 0.7361 0.2940 0.7361 0.8580
0.3903 6.4651 556 0.7405 0.2940 0.7405 0.8605
0.3903 6.4884 558 0.7193 0.3272 0.7193 0.8481

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1