ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k4_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7818
  • QWK: 0.1952
  • MSE: 0.7818
  • RMSE: 0.8842
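
Here QWK denotes quadratic weighted kappa, an agreement score for ordinal labels such as essay scores, while MSE and RMSE measure squared error on the same labels. A minimal pure-Python sketch of these metrics (a didactic reconstruction, not the evaluation code used for this run):

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..n_classes-1."""
    total = len(y_true)
    observed = Counter(zip(y_true, y_pred))
    hist_true = Counter(y_true)
    hist_pred = Counter(y_pred)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement weight
            num += w * observed[(i, j)]              # observed weighted disagreement
            den += w * hist_true[i] * hist_pred[j] / total  # expected under chance
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return math.sqrt(mse(y_true, y_pred))
```

Perfect agreement gives QWK of 1.0 and chance-level agreement roughly 0.0, so the 0.1952 above indicates weak agreement with the gold scores.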

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
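
The optimizer and scheduler bullets can be made concrete with a small pure-Python sketch using the hyperparameters above (a didactic reconstruction; the actual run used the Hugging Face Trainer, and since no warmup is reported, none is assumed):

```python
import math

BASE_LR, BETA1, BETA2, EPS = 2e-5, 0.9, 0.999, 1e-8

def linear_lr(step, total_steps, base_lr=BASE_LR):
    # lr_scheduler_type: linear -> LR decays linearly from base_lr to 0.
    return base_lr * max(0.0, 1.0 - step / total_steps)

def adam_step(param, grad, m, v, t, lr):
    # One bias-corrected Adam update for a single scalar parameter.
    m = BETA1 * m + (1 - BETA1) * grad
    v = BETA2 * v + (1 - BETA2) * grad * grad
    m_hat = m / (1 - BETA1 ** t)   # bias correction, t is the 1-based step
    v_hat = v / (1 - BETA2 ** t)
    param -= lr * m_hat / (math.sqrt(v_hat) + EPS)
    return param, m, v
```

The step column of the log below advances 2 steps per 0.0952 epochs, i.e. 21 optimizer steps per epoch, so 100 epochs would correspond to total_steps = 2100.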

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0952 2 2.4869 -0.0593 2.4869 1.5770
No log 0.1905 4 1.1877 0.1581 1.1877 1.0898
No log 0.2857 6 1.1377 -0.1680 1.1377 1.0666
No log 0.3810 8 1.1024 0.0821 1.1024 1.0500
No log 0.4762 10 1.1094 0.0585 1.1094 1.0533
No log 0.5714 12 1.1393 0.1635 1.1393 1.0674
No log 0.6667 14 1.1153 0.0575 1.1153 1.0561
No log 0.7619 16 1.1076 0.0403 1.1076 1.0524
No log 0.8571 18 1.4820 0.0239 1.4820 1.2174
No log 0.9524 20 1.1180 0.1976 1.1180 1.0573
No log 1.0476 22 0.8182 0.2749 0.8182 0.9046
No log 1.1429 24 0.8168 0.2883 0.8168 0.9038
No log 1.2381 26 0.7903 0.2285 0.7903 0.8890
No log 1.3333 28 0.7665 0.1775 0.7665 0.8755
No log 1.4286 30 0.7505 0.1529 0.7505 0.8663
No log 1.5238 32 0.7492 0.2852 0.7492 0.8656
No log 1.6190 34 0.7672 0.2783 0.7672 0.8759
No log 1.7143 36 0.8015 0.2096 0.8015 0.8953
No log 1.8095 38 0.8461 0.2467 0.8461 0.9198
No log 1.9048 40 0.8085 0.2530 0.8085 0.8991
No log 2.0 42 0.8386 0.2385 0.8386 0.9158
No log 2.0952 44 0.8276 0.2013 0.8276 0.9097
No log 2.1905 46 0.8185 0.2335 0.8185 0.9047
No log 2.2857 48 0.8016 0.2593 0.8016 0.8953
No log 2.3810 50 0.8480 0.3481 0.8480 0.9209
No log 2.4762 52 0.8852 0.2815 0.8852 0.9409
No log 2.5714 54 0.8411 0.2911 0.8411 0.9171
No log 2.6667 56 0.8594 0.2831 0.8594 0.9270
No log 2.7619 58 0.8903 0.3085 0.8903 0.9436
No log 2.8571 60 0.8362 0.2867 0.8362 0.9145
No log 2.9524 62 0.8302 0.2019 0.8302 0.9112
No log 3.0476 64 0.8388 0.2379 0.8388 0.9159
No log 3.1429 66 0.8160 0.2279 0.8160 0.9033
No log 3.2381 68 0.9054 0.2724 0.9054 0.9515
No log 3.3333 70 0.8891 0.2751 0.8891 0.9429
No log 3.4286 72 0.8820 0.2835 0.8820 0.9391
No log 3.5238 74 0.9150 0.2221 0.9150 0.9566
No log 3.6190 76 0.9658 0.1755 0.9658 0.9828
No log 3.7143 78 1.0566 0.0956 1.0566 1.0279
No log 3.8095 80 1.1198 0.2251 1.1198 1.0582
No log 3.9048 82 1.4419 0.1574 1.4419 1.2008
No log 4.0 84 1.6727 0.1445 1.6727 1.2933
No log 4.0952 86 1.4719 0.1704 1.4719 1.2132
No log 4.1905 88 1.0848 0.1302 1.0848 1.0416
No log 4.2857 90 1.0450 0.1341 1.0450 1.0222
No log 4.3810 92 0.9717 0.0563 0.9717 0.9857
No log 4.4762 94 1.1013 0.2779 1.1013 1.0494
No log 4.5714 96 1.2572 0.1875 1.2572 1.1213
No log 4.6667 98 1.1461 0.2613 1.1461 1.0706
No log 4.7619 100 0.8364 0.3314 0.8364 0.9146
No log 4.8571 102 0.8550 0.2796 0.8550 0.9247
No log 4.9524 104 0.8449 0.3051 0.8449 0.9192
No log 5.0476 106 0.7775 0.3375 0.7775 0.8818
No log 5.1429 108 1.0032 0.3022 1.0032 1.0016
No log 5.2381 110 1.0715 0.2948 1.0715 1.0351
No log 5.3333 112 0.9043 0.3333 0.9043 0.9509
No log 5.4286 114 0.8798 0.3648 0.8798 0.9380
No log 5.5238 116 0.8967 0.3159 0.8967 0.9469
No log 5.6190 118 0.8604 0.2508 0.8604 0.9276
No log 5.7143 120 0.8928 0.2114 0.8928 0.9449
No log 5.8095 122 0.9773 0.3005 0.9773 0.9886
No log 5.9048 124 1.1582 0.1884 1.1582 1.0762
No log 6.0 126 1.0948 0.2329 1.0948 1.0463
No log 6.0952 128 1.0315 0.1902 1.0315 1.0156
No log 6.1905 130 1.0567 0.1453 1.0567 1.0280
No log 6.2857 132 1.0328 0.1682 1.0328 1.0163
No log 6.3810 134 0.9671 0.1672 0.9671 0.9834
No log 6.4762 136 0.9753 0.3028 0.9753 0.9876
No log 6.5714 138 0.9536 0.3028 0.9536 0.9765
No log 6.6667 140 0.8584 0.1888 0.8584 0.9265
No log 6.7619 142 0.8993 0.1725 0.8993 0.9483
No log 6.8571 144 0.9300 0.3051 0.9300 0.9644
No log 6.9524 146 0.8469 0.2944 0.8469 0.9203
No log 7.0476 148 0.9265 0.2149 0.9265 0.9625
No log 7.1429 150 1.0437 0.2686 1.0437 1.0216
No log 7.2381 152 0.9093 0.2651 0.9093 0.9536
No log 7.3333 154 0.8416 0.2213 0.8416 0.9174
No log 7.4286 156 0.9599 0.2932 0.9599 0.9798
No log 7.5238 158 0.9791 0.2571 0.9791 0.9895
No log 7.6190 160 0.9480 0.1843 0.9480 0.9736
No log 7.7143 162 1.1183 0.2665 1.1183 1.0575
No log 7.8095 164 1.1255 0.2686 1.1255 1.0609
No log 7.9048 166 0.9692 0.1835 0.9692 0.9845
No log 8.0 168 0.9523 0.1968 0.9523 0.9758
No log 8.0952 170 0.8893 0.1391 0.8893 0.9430
No log 8.1905 172 0.8643 0.2114 0.8643 0.9297
No log 8.2857 174 0.8362 0.2114 0.8362 0.9145
No log 8.3810 176 0.8503 0.3183 0.8503 0.9221
No log 8.4762 178 0.8820 0.3256 0.8820 0.9392
No log 8.5714 180 0.8135 0.3183 0.8135 0.9020
No log 8.6667 182 0.7584 0.2072 0.7584 0.8708
No log 8.7619 184 0.8603 0.3459 0.8603 0.9275
No log 8.8571 186 0.8508 0.2886 0.8508 0.9224
No log 8.9524 188 0.7614 0.2398 0.7614 0.8726
No log 9.0476 190 0.7850 0.2502 0.7850 0.8860
No log 9.1429 192 0.8419 0.2988 0.8419 0.9175
No log 9.2381 194 0.8754 0.2320 0.8754 0.9356
No log 9.3333 196 0.9470 0.1472 0.9470 0.9731
No log 9.4286 198 1.0084 0.1940 1.0084 1.0042
No log 9.5238 200 0.9959 0.1734 0.9959 0.9980
No log 9.6190 202 1.0408 0.2779 1.0408 1.0202
No log 9.7143 204 0.9481 0.1969 0.9481 0.9737
No log 9.8095 206 0.8561 0.0909 0.8561 0.9253
No log 9.9048 208 0.8541 0.2349 0.8541 0.9242
No log 10.0 210 0.8401 0.1208 0.8401 0.9166
No log 10.0952 212 0.9573 0.2578 0.9573 0.9784
No log 10.1905 214 1.1281 0.2439 1.1281 1.0621
No log 10.2857 216 1.0280 0.2706 1.0280 1.0139
No log 10.3810 218 0.8390 0.2993 0.8390 0.9160
No log 10.4762 220 0.8161 0.2295 0.8161 0.9034
No log 10.5714 222 0.8574 0.2260 0.8574 0.9260
No log 10.6667 224 0.8228 0.2537 0.8228 0.9071
No log 10.7619 226 0.8226 0.2780 0.8226 0.9070
No log 10.8571 228 0.8460 0.3359 0.8460 0.9198
No log 10.9524 230 0.8065 0.3042 0.8065 0.8980
No log 11.0476 232 0.7364 0.2589 0.7364 0.8581
No log 11.1429 234 0.7039 0.2407 0.7039 0.8390
No log 11.2381 236 0.7076 0.2407 0.7076 0.8412
No log 11.3333 238 0.7244 0.2683 0.7244 0.8511
No log 11.4286 240 0.7665 0.3102 0.7665 0.8755
No log 11.5238 242 0.7977 0.3331 0.7977 0.8931
No log 11.6190 244 0.7836 0.3102 0.7836 0.8852
No log 11.7143 246 0.7565 0.2772 0.7565 0.8698
No log 11.8095 248 0.7413 0.2502 0.7413 0.8610
No log 11.9048 250 0.7746 0.3590 0.7746 0.8801
No log 12.0 252 0.7577 0.2913 0.7577 0.8704
No log 12.0952 254 0.7939 0.3234 0.7939 0.8910
No log 12.1905 256 0.8954 0.3623 0.8954 0.9463
No log 12.2857 258 0.9441 0.3499 0.9441 0.9717
No log 12.3810 260 0.7890 0.2817 0.7890 0.8882
No log 12.4762 262 0.7310 0.3341 0.7310 0.8550
No log 12.5714 264 0.7353 0.3498 0.7353 0.8575
No log 12.6667 266 0.7067 0.3041 0.7067 0.8406
No log 12.7619 268 0.7524 0.3498 0.7524 0.8674
No log 12.8571 270 0.8187 0.3409 0.8187 0.9048
No log 12.9524 272 0.8825 0.3394 0.8825 0.9394
No log 13.0476 274 0.7962 0.3500 0.7962 0.8923
No log 13.1429 276 0.7132 0.3308 0.7132 0.8445
No log 13.2381 278 0.7181 0.3425 0.7181 0.8474
No log 13.3333 280 0.7292 0.3485 0.7292 0.8539
No log 13.4286 282 0.7014 0.3485 0.7014 0.8375
No log 13.5238 284 0.6811 0.3762 0.6811 0.8253
No log 13.6190 286 0.6714 0.3425 0.6714 0.8194
No log 13.7143 288 0.6578 0.3425 0.6578 0.8110
No log 13.8095 290 0.6748 0.4397 0.6748 0.8215
No log 13.9048 292 0.7176 0.3662 0.7176 0.8471
No log 14.0 294 0.7153 0.3545 0.7153 0.8458
No log 14.0952 296 0.7062 0.3545 0.7062 0.8404
No log 14.1905 298 0.6887 0.2916 0.6887 0.8299
No log 14.2857 300 0.7361 0.2259 0.7361 0.8579
No log 14.3810 302 0.7025 0.2916 0.7025 0.8382
No log 14.4762 304 0.8151 0.3590 0.8151 0.9028
No log 14.5714 306 1.2944 0.2398 1.2944 1.1377
No log 14.6667 308 1.4718 0.2056 1.4718 1.2132
No log 14.7619 310 1.2234 0.3016 1.2234 1.1061
No log 14.8571 312 0.8903 0.3439 0.8903 0.9435
No log 14.9524 314 0.7788 0.2593 0.7788 0.8825
No log 15.0476 316 0.8964 0.2332 0.8964 0.9468
No log 15.1429 318 0.9526 0.2345 0.9526 0.9760
No log 15.2381 320 0.9089 0.1999 0.9089 0.9534
No log 15.3333 322 0.8940 0.2449 0.8940 0.9455
No log 15.4286 324 0.9577 0.2305 0.9577 0.9786
No log 15.5238 326 1.0153 0.2442 1.0153 1.0076
No log 15.6190 328 0.9611 0.2315 0.9611 0.9804
No log 15.7143 330 0.8726 0.2285 0.8726 0.9341
No log 15.8095 332 0.8503 0.2345 0.8503 0.9221
No log 15.9048 334 0.8693 0.1953 0.8693 0.9324
No log 16.0 336 0.9152 0.1531 0.9152 0.9567
No log 16.0952 338 0.9839 0.0960 0.9839 0.9919
No log 16.1905 340 1.0535 0.1701 1.0535 1.0264
No log 16.2857 342 1.0470 0.1399 1.0470 1.0232
No log 16.3810 344 1.0007 0.1209 1.0007 1.0003
No log 16.4762 346 1.0014 0.1441 1.0014 1.0007
No log 16.5714 348 0.9588 0.1870 0.9588 0.9792
No log 16.6667 350 0.8859 0.1357 0.8859 0.9412
No log 16.7619 352 0.8409 0.2129 0.8409 0.9170
No log 16.8571 354 0.7985 0.1699 0.7985 0.8936
No log 16.9524 356 0.7700 0.1790 0.7700 0.8775
No log 17.0476 358 0.7707 0.1011 0.7707 0.8779
No log 17.1429 360 0.7770 0.0604 0.7770 0.8815
No log 17.2381 362 0.8160 0.1577 0.8160 0.9033
No log 17.3333 364 0.9831 0.1884 0.9831 0.9915
No log 17.4286 366 1.1012 0.2424 1.1012 1.0494
No log 17.5238 368 1.0580 0.1486 1.0580 1.0286
No log 17.6190 370 0.9665 0.1357 0.9665 0.9831
No log 17.7143 372 0.9632 0.1357 0.9632 0.9814
No log 17.8095 374 0.9798 0.1694 0.9798 0.9898
No log 17.9048 376 0.9413 0.2253 0.9413 0.9702
No log 18.0 378 0.8676 0.1357 0.8676 0.9314
No log 18.0952 380 0.8245 0.2661 0.8245 0.9080
No log 18.1905 382 0.7958 0.2224 0.7958 0.8921
No log 18.2857 384 0.7877 0.2593 0.7877 0.8875
No log 18.3810 386 0.7896 0.2593 0.7896 0.8886
No log 18.4762 388 0.7794 0.2593 0.7794 0.8828
No log 18.5714 390 0.7758 0.2563 0.7758 0.8808
No log 18.6667 392 0.8117 0.2027 0.8117 0.9010
No log 18.7619 394 0.8359 0.2662 0.8359 0.9143
No log 18.8571 396 0.7973 0.2718 0.7973 0.8929
No log 18.9524 398 0.7536 0.2593 0.7536 0.8681
No log 19.0476 400 0.7650 0.2287 0.7650 0.8747
No log 19.1429 402 0.7625 0.2342 0.7625 0.8732
No log 19.2381 404 0.7355 0.1986 0.7355 0.8576
No log 19.3333 406 0.7619 0.2027 0.7619 0.8729
No log 19.4286 408 0.8370 0.2817 0.8370 0.9149
No log 19.5238 410 0.9290 0.3100 0.9290 0.9639
No log 19.6190 412 0.9158 0.3456 0.9158 0.9570
No log 19.7143 414 0.8948 0.2943 0.8948 0.9459
No log 19.8095 416 0.9123 0.3100 0.9123 0.9551
No log 19.9048 418 0.8879 0.2415 0.8879 0.9423
No log 20.0 420 0.8832 0.2975 0.8832 0.9398
No log 20.0952 422 0.8381 0.3121 0.8381 0.9155
No log 20.1905 424 0.8077 0.2058 0.8077 0.8987
No log 20.2857 426 0.7890 0.2058 0.7890 0.8883
No log 20.3810 428 0.7671 0.2161 0.7671 0.8758
No log 20.4762 430 0.7304 0.1986 0.7304 0.8546
No log 20.5714 432 0.7289 0.2360 0.7289 0.8537
No log 20.6667 434 0.7616 0.1260 0.7616 0.8727
No log 20.7619 436 0.8512 0.3280 0.8512 0.9226
No log 20.8571 438 0.9132 0.3909 0.9132 0.9556
No log 20.9524 440 0.8785 0.3520 0.8785 0.9373
No log 21.0476 442 0.8111 0.2058 0.8111 0.9006
No log 21.1429 444 0.8274 0.2009 0.8274 0.9096
No log 21.2381 446 0.8213 0.1786 0.8213 0.9062
No log 21.3333 448 0.8248 0.1976 0.8248 0.9082
No log 21.4286 450 0.8192 0.1976 0.8192 0.9051
No log 21.5238 452 0.8010 0.2140 0.8010 0.8950
No log 21.6190 454 0.7897 0.2193 0.7897 0.8887
No log 21.7143 456 0.7790 0.2247 0.7790 0.8826
No log 21.8095 458 0.7638 0.2621 0.7638 0.8739
No log 21.9048 460 0.7618 0.2809 0.7618 0.8728
No log 22.0 462 0.7718 0.2784 0.7718 0.8785
No log 22.0952 464 0.7694 0.2932 0.7694 0.8771
No log 22.1905 466 0.7300 0.2718 0.7300 0.8544
No log 22.2857 468 0.7059 0.2023 0.7059 0.8402
No log 22.3810 470 0.7044 0.2744 0.7044 0.8393
No log 22.4762 472 0.7157 0.2712 0.7157 0.8460
No log 22.5714 474 0.7423 0.2744 0.7423 0.8616
No log 22.6667 476 0.8831 0.3864 0.8831 0.9397
No log 22.7619 478 1.0489 0.2857 1.0489 1.0242
No log 22.8571 480 1.0540 0.3046 1.0540 1.0266
No log 22.9524 482 0.9307 0.2846 0.9307 0.9647
No log 23.0476 484 0.8341 0.1986 0.8341 0.9133
No log 23.1429 486 0.8126 0.1999 0.8126 0.9014
No log 23.2381 488 0.7899 0.1361 0.7899 0.8888
No log 23.3333 490 0.7701 0.1012 0.7701 0.8776
No log 23.4286 492 0.8007 0.1308 0.8007 0.8948
No log 23.5238 494 0.8598 0.2839 0.8598 0.9272
No log 23.6190 496 0.9148 0.3173 0.9148 0.9565
No log 23.7143 498 0.9187 0.3432 0.9187 0.9585
0.2913 23.8095 500 0.8874 0.3021 0.8874 0.9420
0.2913 23.9048 502 0.8530 0.2501 0.8530 0.9236
0.2913 24.0 504 0.8176 0.2453 0.8176 0.9042
0.2913 24.0952 506 0.8090 0.2152 0.8090 0.8995
0.2913 24.1905 508 0.8322 0.2747 0.8322 0.9123
0.2913 24.2857 510 0.8576 0.4097 0.8576 0.9261
0.2913 24.3810 512 0.8382 0.2831 0.8382 0.9155
0.2913 24.4762 514 0.8043 0.2152 0.8043 0.8968
0.2913 24.5714 516 0.8040 0.1951 0.8040 0.8966
0.2913 24.6667 518 0.8044 0.1999 0.8044 0.8969
0.2913 24.7619 520 0.7818 0.1952 0.7818 0.8842

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, tensor type F32)

Model tree: MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k4_task7_organization, fine-tuned from aubmindlab/bert-base-arabertv02.