ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k16_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9695
  • Qwk: 0.2727
  • Mse: 0.9695
  • Rmse: 0.9847
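The card does not show the evaluation code, but the reported metrics are standard: Qwk is quadratic weighted kappa (agreement between predicted and true ordinal scores), and Rmse is the square root of Mse. A minimal pure-Python sketch of how these are commonly computed (function names are illustrative, not from this repository):

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa for integer class labels in [0, n_classes)."""
    n = len(y_true)
    # Observed agreement (confusion) matrix.
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Marginal histograms of true and predicted labels.
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            weight = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic penalty
            expected = hist_true[i] * hist_pred[j] / n    # counts under independence
            num += weight * observed[i][j]
            den += weight * expected
    return 1.0 - num / den


def mse(y_true, y_pred):
    """Mean squared error; RMSE is mse(...) ** 0.5."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
```

For example, perfect agreement gives `quadratic_weighted_kappa([0, 1, 2, 3], [0, 1, 2, 3], 4)` → `1.0`, and any disagreement pulls the score below 1 (more so the farther apart the ordinal labels are).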

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
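With `lr_scheduler_type: linear`, the learning rate decays linearly from 2e-05 to zero over training. A small sketch of that schedule, assuming no warmup (none is listed above) and 5500 total optimizer steps, inferred from the log below (step 500 lands at epoch ≈ 9.09, i.e. 55 steps per epoch × 100 epochs):

```python
def linear_lr(step, base_lr=2e-05, warmup_steps=0, total_steps=5500):
    """Linear schedule: optional linear warmup, then linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

So halfway through training (`linear_lr(2750)`) the rate has dropped to 1e-05, and it reaches zero at the final step.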

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0364 2 2.6335 -0.0262 2.6335 1.6228
No log 0.0727 4 1.4630 0.0299 1.4630 1.2096
No log 0.1091 6 1.0958 -0.0970 1.0958 1.0468
No log 0.1455 8 1.2744 -0.2046 1.2744 1.1289
No log 0.1818 10 1.1919 -0.1356 1.1919 1.0917
No log 0.2182 12 1.2446 -0.0920 1.2446 1.1156
No log 0.2545 14 1.2829 -0.1585 1.2829 1.1327
No log 0.2909 16 1.2954 -0.2119 1.2954 1.1381
No log 0.3273 18 1.1961 -0.0085 1.1961 1.0936
No log 0.3636 20 1.1673 -0.0691 1.1673 1.0804
No log 0.4 22 1.1392 -0.0281 1.1392 1.0673
No log 0.4364 24 1.1239 0.0310 1.1239 1.0601
No log 0.4727 26 1.1590 0.0 1.1590 1.0766
No log 0.5091 28 1.1876 -0.1371 1.1876 1.0898
No log 0.5455 30 1.1643 -0.0217 1.1643 1.0790
No log 0.5818 32 1.1025 -0.0077 1.1025 1.0500
No log 0.6182 34 1.1276 -0.0185 1.1276 1.0619
No log 0.6545 36 1.2380 -0.1307 1.2380 1.1127
No log 0.6909 38 1.1612 -0.0550 1.1612 1.0776
No log 0.7273 40 1.0391 0.1299 1.0391 1.0193
No log 0.7636 42 1.0398 0.1290 1.0398 1.0197
No log 0.8 44 0.9650 0.1310 0.9650 0.9823
No log 0.8364 46 0.8974 -0.0054 0.8974 0.9473
No log 0.8727 48 0.8910 0.1139 0.8910 0.9439
No log 0.9091 50 0.8855 0.1539 0.8855 0.9410
No log 0.9455 52 0.9103 0.1206 0.9103 0.9541
No log 0.9818 54 0.9785 0.0875 0.9785 0.9892
No log 1.0182 56 0.9344 0.2345 0.9344 0.9667
No log 1.0545 58 0.8734 0.2038 0.8734 0.9346
No log 1.0909 60 0.8721 0.2092 0.8721 0.9339
No log 1.1273 62 0.9092 0.1426 0.9092 0.9535
No log 1.1636 64 1.0191 0.1476 1.0191 1.0095
No log 1.2 66 1.0245 0.0569 1.0245 1.0122
No log 1.2364 68 0.9731 0.1203 0.9731 0.9865
No log 1.2727 70 0.9259 0.1224 0.9259 0.9622
No log 1.3091 72 0.9060 0.0376 0.9060 0.9518
No log 1.3455 74 0.8963 0.0327 0.8963 0.9467
No log 1.3818 76 0.8868 -0.0054 0.8868 0.9417
No log 1.4182 78 0.9465 0.0 0.9465 0.9729
No log 1.4545 80 0.9522 0.0428 0.9522 0.9758
No log 1.4909 82 0.9431 0.1561 0.9431 0.9711
No log 1.5273 84 0.9610 0.0747 0.9610 0.9803
No log 1.5636 86 0.9447 0.0354 0.9447 0.9720
No log 1.6 88 0.9420 0.1424 0.9420 0.9706
No log 1.6364 90 1.0225 0.0498 1.0225 1.0112
No log 1.6727 92 1.2455 0.0419 1.2455 1.1160
No log 1.7091 94 1.2177 0.0133 1.2177 1.1035
No log 1.7455 96 1.1583 0.0920 1.1583 1.0762
No log 1.7818 98 0.9383 0.0827 0.9383 0.9686
No log 1.8182 100 0.9087 0.1181 0.9087 0.9533
No log 1.8545 102 0.9491 0.0715 0.9491 0.9742
No log 1.8909 104 0.9398 0.2447 0.9398 0.9694
No log 1.9273 106 0.9467 0.1854 0.9467 0.9730
No log 1.9636 108 0.9856 0.0521 0.9856 0.9927
No log 2.0 110 0.9448 0.0838 0.9448 0.9720
No log 2.0364 112 0.9291 0.0838 0.9291 0.9639
No log 2.0727 114 0.9059 0.1854 0.9059 0.9518
No log 2.1091 116 0.8969 0.1577 0.8969 0.9471
No log 2.1455 118 0.9126 0.2045 0.9126 0.9553
No log 2.1818 120 0.9159 0.2045 0.9159 0.9570
No log 2.2182 122 0.9138 0.2564 0.9138 0.9560
No log 2.2545 124 0.9703 0.1088 0.9703 0.9851
No log 2.2909 126 0.9530 0.1661 0.9530 0.9762
No log 2.3273 128 0.9754 0.1995 0.9754 0.9876
No log 2.3636 130 0.9885 0.1368 0.9885 0.9943
No log 2.4 132 0.9844 0.0950 0.9844 0.9921
No log 2.4364 134 1.0203 0.0834 1.0203 1.0101
No log 2.4727 136 1.0360 0.0896 1.0360 1.0178
No log 2.5091 138 1.0222 0.1652 1.0222 1.0110
No log 2.5455 140 0.9575 0.1325 0.9575 0.9785
No log 2.5818 142 0.9160 0.0646 0.9160 0.9571
No log 2.6182 144 0.8953 0.0208 0.8953 0.9462
No log 2.6545 146 0.8851 0.0573 0.8851 0.9408
No log 2.6909 148 0.8989 0.1313 0.8989 0.9481
No log 2.7273 150 0.9451 0.0747 0.9451 0.9721
No log 2.7636 152 1.1014 0.1689 1.1014 1.0495
No log 2.8 154 0.9981 0.2394 0.9981 0.9991
No log 2.8364 156 0.9258 0.1888 0.9258 0.9622
No log 2.8727 158 0.9579 0.1893 0.9579 0.9787
No log 2.9091 160 0.9049 0.1903 0.9049 0.9513
No log 2.9455 162 0.8944 0.2050 0.8944 0.9457
No log 2.9818 164 0.9127 0.2375 0.9127 0.9553
No log 3.0182 166 0.9344 0.2301 0.9344 0.9666
No log 3.0545 168 0.9777 0.3068 0.9777 0.9888
No log 3.0909 170 0.9984 0.3044 0.9984 0.9992
No log 3.1273 172 0.8131 0.3121 0.8131 0.9017
No log 3.1636 174 0.7654 0.2203 0.7654 0.8749
No log 3.2 176 0.8003 0.2661 0.8003 0.8946
No log 3.2364 178 0.8812 0.3044 0.8812 0.9387
No log 3.2727 180 0.9536 0.3044 0.9536 0.9765
No log 3.3091 182 0.9885 0.3339 0.9885 0.9942
No log 3.3455 184 0.9790 0.0928 0.9790 0.9894
No log 3.3818 186 1.0214 0.1150 1.0214 1.0107
No log 3.4182 188 0.9318 0.0391 0.9318 0.9653
No log 3.4545 190 0.9398 0.2801 0.9398 0.9694
No log 3.4909 192 0.9506 0.3043 0.9506 0.9750
No log 3.5273 194 0.9621 0.2694 0.9621 0.9809
No log 3.5636 196 1.0286 0.2518 1.0286 1.0142
No log 3.6 198 1.1821 0.2512 1.1821 1.0872
No log 3.6364 200 1.1879 0.2122 1.1879 1.0899
No log 3.6727 202 1.2675 0.2122 1.2675 1.1258
No log 3.7091 204 1.1935 0.0846 1.1935 1.0925
No log 3.7455 206 1.1533 0.0846 1.1533 1.0739
No log 3.7818 208 1.1351 0.1861 1.1351 1.0654
No log 3.8182 210 1.2223 0.2428 1.2223 1.1056
No log 3.8545 212 1.2033 0.2389 1.2033 1.0969
No log 3.8909 214 0.9448 0.2594 0.9448 0.9720
No log 3.9273 216 0.8934 0.2441 0.8934 0.9452
No log 3.9636 218 0.8937 0.2770 0.8937 0.9454
No log 4.0 220 1.0295 0.3015 1.0295 1.0147
No log 4.0364 222 1.0278 0.2464 1.0278 1.0138
No log 4.0727 224 0.8565 0.2980 0.8565 0.9255
No log 4.1091 226 0.8375 0.2652 0.8375 0.9151
No log 4.1455 228 0.8334 0.2611 0.8334 0.9129
No log 4.1818 230 0.8146 0.1979 0.8146 0.9026
No log 4.2182 232 0.8308 0.2652 0.8308 0.9115
No log 4.2545 234 0.8824 0.1740 0.8824 0.9393
No log 4.2909 236 0.9181 0.2747 0.9181 0.9582
No log 4.3273 238 0.9630 0.3011 0.9630 0.9813
No log 4.3636 240 1.0617 0.3059 1.0617 1.0304
No log 4.4 242 1.0974 0.2535 1.0974 1.0476
No log 4.4364 244 1.1878 0.2283 1.1878 1.0899
No log 4.4727 246 1.1435 0.2167 1.1435 1.0694
No log 4.5091 248 0.9381 0.3068 0.9381 0.9686
No log 4.5455 250 0.8982 0.2014 0.8982 0.9477
No log 4.5818 252 0.8731 0.2057 0.8731 0.9344
No log 4.6182 254 0.8417 0.2744 0.8417 0.9175
No log 4.6545 256 0.7895 0.2310 0.7895 0.8885
No log 4.6909 258 0.7635 0.2170 0.7635 0.8738
No log 4.7273 260 0.7814 0.1538 0.7814 0.8840
No log 4.7636 262 0.7892 0.1569 0.7892 0.8884
No log 4.8 264 0.8278 0.2395 0.8278 0.9098
No log 4.8364 266 1.0443 0.2988 1.0443 1.0219
No log 4.8727 268 1.2029 0.2277 1.2029 1.0968
No log 4.9091 270 1.1408 0.2604 1.1408 1.0681
No log 4.9455 272 1.0936 0.0338 1.0936 1.0457
No log 4.9818 274 1.1569 0.1028 1.1569 1.0756
No log 5.0182 276 1.1003 0.0695 1.1003 1.0489
No log 5.0545 278 1.0173 0.0893 1.0173 1.0086
No log 5.0909 280 1.0079 0.2723 1.0079 1.0039
No log 5.1273 282 1.0038 0.2643 1.0038 1.0019
No log 5.1636 284 0.9704 0.2926 0.9704 0.9851
No log 5.2 286 0.9575 0.2036 0.9575 0.9785
No log 5.2364 288 0.9870 0.1500 0.9870 0.9935
No log 5.2727 290 1.0020 0.1634 1.0020 1.0010
No log 5.3091 292 1.0565 0.3075 1.0565 1.0279
No log 5.3455 294 1.0584 0.3220 1.0584 1.0288
No log 5.3818 296 0.9896 0.2104 0.9896 0.9948
No log 5.4182 298 0.9644 0.2747 0.9644 0.9821
No log 5.4545 300 0.9654 0.2808 0.9654 0.9825
No log 5.4909 302 0.9674 0.2751 0.9674 0.9836
No log 5.5273 304 0.8683 0.1522 0.8683 0.9318
No log 5.5636 306 0.8402 0.2746 0.8402 0.9166
No log 5.6 308 0.8296 0.2513 0.8296 0.9108
No log 5.6364 310 0.8296 0.2746 0.8296 0.9108
No log 5.6727 312 0.8846 0.2633 0.8846 0.9405
No log 5.7091 314 0.9875 0.2516 0.9875 0.9937
No log 5.7455 316 1.0081 0.2701 1.0081 1.0041
No log 5.7818 318 0.9286 0.2835 0.9286 0.9637
No log 5.8182 320 0.8523 0.2479 0.8523 0.9232
No log 5.8545 322 0.8220 0.2398 0.8220 0.9066
No log 5.8909 324 0.7848 0.2451 0.7848 0.8859
No log 5.9273 326 0.7800 0.2389 0.7800 0.8832
No log 5.9636 328 0.7907 0.2389 0.7907 0.8892
No log 6.0 330 0.8035 0.3136 0.8035 0.8964
No log 6.0364 332 0.8124 0.3377 0.8124 0.9013
No log 6.0727 334 0.7962 0.3101 0.7962 0.8923
No log 6.1091 336 0.8028 0.2318 0.8028 0.8960
No log 6.1455 338 0.8256 0.1795 0.8256 0.9086
No log 6.1818 340 0.7862 0.2019 0.7862 0.8867
No log 6.2182 342 0.7938 0.2535 0.7938 0.8909
No log 6.2545 344 0.8173 0.3007 0.8173 0.9041
No log 6.2909 346 0.8719 0.3482 0.8719 0.9337
No log 6.3273 348 0.8633 0.3007 0.8633 0.9292
No log 6.3636 350 0.8385 0.2802 0.8385 0.9157
No log 6.4 352 0.9086 0.2600 0.9086 0.9532
No log 6.4364 354 1.0280 0.1467 1.0280 1.0139
No log 6.4727 356 0.9595 0.1485 0.9595 0.9795
No log 6.5091 358 0.8424 0.3451 0.8424 0.9178
No log 6.5455 360 0.7921 0.3460 0.7921 0.8900
No log 6.5818 362 0.9453 0.3228 0.9453 0.9723
No log 6.6182 364 0.9777 0.3368 0.9777 0.9888
No log 6.6545 366 0.8919 0.3001 0.8919 0.9444
No log 6.6909 368 0.7797 0.2684 0.7797 0.8830
No log 6.7273 370 0.7317 0.2780 0.7317 0.8554
No log 6.7636 372 0.7163 0.3100 0.7163 0.8463
No log 6.8 374 0.7274 0.3100 0.7274 0.8529
No log 6.8364 376 0.7897 0.3007 0.7897 0.8886
No log 6.8727 378 0.9181 0.3720 0.9181 0.9582
No log 6.9091 380 0.9919 0.3547 0.9919 0.9959
No log 6.9455 382 0.9810 0.3547 0.9810 0.9905
No log 6.9818 384 0.9605 0.3547 0.9605 0.9801
No log 7.0182 386 0.9763 0.3486 0.9763 0.9881
No log 7.0545 388 0.8464 0.3379 0.8464 0.9200
No log 7.0909 390 0.7230 0.3340 0.7230 0.8503
No log 7.1273 392 0.7067 0.3106 0.7067 0.8406
No log 7.1636 394 0.7058 0.3352 0.7058 0.8401
No log 7.2 396 0.6979 0.3141 0.6979 0.8354
No log 7.2364 398 0.7141 0.3106 0.7141 0.8451
No log 7.2727 400 0.7129 0.3787 0.7129 0.8443
No log 7.3091 402 0.7133 0.3939 0.7133 0.8446
No log 7.3455 404 0.7342 0.3566 0.7342 0.8568
No log 7.3818 406 0.7240 0.3226 0.7240 0.8509
No log 7.4182 408 0.7476 0.4036 0.7476 0.8646
No log 7.4545 410 0.8440 0.3051 0.8440 0.9187
No log 7.4909 412 0.9287 0.2394 0.9287 0.9637
No log 7.5273 414 0.8932 0.2315 0.8932 0.9451
No log 7.5636 416 0.8043 0.3171 0.8043 0.8968
No log 7.6 418 0.7665 0.2480 0.7665 0.8755
No log 7.6364 420 0.7603 0.1838 0.7603 0.8719
No log 7.6727 422 0.7751 0.2171 0.7751 0.8804
No log 7.7091 424 0.7873 0.3022 0.7873 0.8873
No log 7.7455 426 0.8722 0.3235 0.8722 0.9339
No log 7.7818 428 0.9804 0.2752 0.9804 0.9902
No log 7.8182 430 1.0591 0.2800 1.0591 1.0291
No log 7.8545 432 0.9546 0.2542 0.9546 0.9770
No log 7.8909 434 0.9299 0.2474 0.9299 0.9643
No log 7.9273 436 0.9204 0.2241 0.9204 0.9594
No log 7.9636 438 0.9314 0.2301 0.9314 0.9651
No log 8.0 440 0.9578 0.2516 0.9578 0.9787
No log 8.0364 442 0.8746 0.2617 0.8746 0.9352
No log 8.0727 444 0.7767 0.2532 0.7767 0.8813
No log 8.1091 446 0.7445 0.2532 0.7445 0.8629
No log 8.1455 448 0.7442 0.2980 0.7442 0.8627
No log 8.1818 450 0.7570 0.3613 0.7570 0.8700
No log 8.2182 452 0.7478 0.3613 0.7478 0.8647
No log 8.2545 454 0.7333 0.3316 0.7333 0.8563
No log 8.2909 456 0.7773 0.3365 0.7773 0.8816
No log 8.3273 458 0.8820 0.3561 0.8820 0.9391
No log 8.3636 460 0.9427 0.4011 0.9427 0.9709
No log 8.4 462 0.8805 0.3735 0.8805 0.9384
No log 8.4364 464 0.7852 0.3864 0.7852 0.8861
No log 8.4727 466 0.6936 0.3974 0.6936 0.8328
No log 8.5091 468 0.6855 0.4839 0.6855 0.8280
No log 8.5455 470 0.6924 0.4776 0.6924 0.8321
No log 8.5818 472 0.6958 0.4505 0.6958 0.8342
No log 8.6182 474 0.7380 0.2718 0.7380 0.8591
No log 8.6545 476 0.8393 0.3139 0.8393 0.9161
No log 8.6909 478 0.8257 0.2498 0.8257 0.9087
No log 8.7273 480 0.8057 0.2920 0.8057 0.8976
No log 8.7636 482 0.7755 0.2661 0.7755 0.8806
No log 8.8 484 0.7917 0.2365 0.7917 0.8898
No log 8.8364 486 0.7690 0.2365 0.7690 0.8769
No log 8.8727 488 0.7952 0.2926 0.7952 0.8918
No log 8.9091 490 0.7846 0.3287 0.7846 0.8858
No log 8.9455 492 0.7131 0.2749 0.7131 0.8444
No log 8.9818 494 0.6757 0.3111 0.6757 0.8220
No log 9.0182 496 0.6759 0.3808 0.6759 0.8221
No log 9.0545 498 0.6933 0.4345 0.6933 0.8327
0.3438 9.0909 500 0.6907 0.4345 0.6907 0.8311
0.3438 9.1273 502 0.6907 0.3964 0.6907 0.8311
0.3438 9.1636 504 0.8118 0.2695 0.8118 0.9010
0.3438 9.2 506 0.9707 0.2864 0.9707 0.9852
0.3438 9.2364 508 1.0372 0.3326 1.0372 1.0184
0.3438 9.2727 510 0.9695 0.2727 0.9695 0.9847
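The summary metrics at the top reflect the final logged step (epoch 9.2727), not the best one; the best validation Qwk in the log above is 0.4839 at epoch 8.5091. A minimal sketch of picking the best-Qwk checkpoint, with a few (epoch, validation loss, Qwk) rows transcribed by hand from the table:

```python
# (epoch, validation_loss, qwk) rows transcribed from the training log above.
rows = [
    (8.5091, 0.6855, 0.4839),  # best Qwk in the log
    (9.0909, 0.6907, 0.4345),
    (9.2727, 0.9695, 0.2727),  # final step, matching the summary metrics
]

# Select the checkpoint with the highest Qwk rather than the last one.
best = max(rows, key=lambda row: row[2])
```

In practice this is what `load_best_model_at_end` with `metric_for_best_model` set to Qwk would do in the Trainer; the hyperparameters above do not indicate it was enabled here.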

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors

  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k16_task7_organization

  • Finetuned from aubmindlab/bert-base-arabertv02