ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k13_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9855 (identical to the Mse, consistent with an MSE training objective)
  • Qwk (quadratic weighted kappa): 0.6667
  • Mse (mean squared error): 0.9855
  • Rmse (root mean squared error): 0.9927
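Quadratic weighted kappa measures agreement between predicted and reference ordinal scores, penalizing large disagreements quadratically. A minimal pure-Python sketch of how Qwk, Mse, and Rmse can be computed; the score arrays below are made-up illustrative data, not this model's actual predictions:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, num_labels):
    """Cohen's kappa with quadratic weights over ordinal labels 0..num_labels-1."""
    # Observed confusion matrix
    observed = [[0] * num_labels for _ in range(num_labels)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    n = len(y_true)
    # Marginal histograms for the expected (chance) matrix
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(num_labels)) for j in range(num_labels)]
    num = den = 0.0
    for i in range(num_labels):
        for j in range(num_labels):
            w = ((i - j) ** 2) / ((num_labels - 1) ** 2)  # quadratic disagreement weight
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical ordinal essay scores on a 0-4 scale (illustrative only)
ref = [0, 1, 2, 3, 4, 2, 3, 1]
hyp = [0, 1, 2, 2, 4, 3, 3, 0]
print(round(quadratic_weighted_kappa(ref, hyp, 5), 4))
print(round(mse(ref, hyp), 4), round(math.sqrt(mse(ref, hyp)), 4))
```

Identical score vectors give a kappa of 1.0, chance-level agreement gives 0, and systematic disagreement goes negative, which matches the sign pattern in the training log below.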

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
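With lr_scheduler_type: linear and no warmup listed, the learning rate presumably ramps (if warmup were used) and then decays linearly from 2e-05 to zero over the total number of training steps. A small sketch of that schedule; the total-step count is an assumption for illustration, since the card does not state it explicitly:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: optional linear warmup, then linear decay to zero."""
    if warmup_steps and step < warmup_steps:
        return base_lr * step / warmup_steps
    # Fraction of post-warmup steps remaining
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

total = 6100  # assumed: ~61 steps/epoch x 100 epochs
print(linear_lr(0, total))      # full base LR at the first step
print(linear_lr(total, total))  # decayed to zero at the end
```

Halfway through training this schedule yields exactly half the base learning rate, i.e. 1e-05.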

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0328 2 7.1156 0.0 7.1156 2.6675
No log 0.0656 4 4.7824 0.0887 4.7824 2.1869
No log 0.0984 6 4.1103 -0.0784 4.1103 2.0274
No log 0.1311 8 4.1820 -0.0980 4.1820 2.0450
No log 0.1639 10 2.8584 -0.0276 2.8584 1.6907
No log 0.1967 12 2.4498 0.0 2.4498 1.5652
No log 0.2295 14 2.3954 0.0411 2.3954 1.5477
No log 0.2623 16 2.1137 0.1493 2.1137 1.4539
No log 0.2951 18 1.9508 0.0847 1.9508 1.3967
No log 0.3279 20 1.9990 0.0354 1.9990 1.4139
No log 0.3607 22 2.1235 0.0351 2.1235 1.4572
No log 0.3934 24 2.1855 0.0331 2.1855 1.4783
No log 0.4262 26 2.3079 0.1231 2.3079 1.5192
No log 0.4590 28 2.3672 0.1159 2.3672 1.5386
No log 0.4918 30 2.7246 0.0263 2.7246 1.6506
No log 0.5246 32 2.7042 0.0658 2.7042 1.6444
No log 0.5574 34 2.1524 0.2098 2.1524 1.4671
No log 0.5902 36 1.7681 0.3636 1.7681 1.3297
No log 0.6230 38 1.6734 0.4094 1.6734 1.2936
No log 0.6557 40 1.5322 0.4065 1.5322 1.2378
No log 0.6885 42 1.5248 0.4065 1.5248 1.2348
No log 0.7213 44 1.5438 0.3594 1.5438 1.2425
No log 0.7541 46 1.8305 0.2794 1.8305 1.3530
No log 0.7869 48 2.4645 0.2564 2.4645 1.5699
No log 0.8197 50 2.3242 0.2710 2.3242 1.5245
No log 0.8525 52 2.1048 0.2533 2.1048 1.4508
No log 0.8852 54 1.7979 0.3704 1.7979 1.3409
No log 0.9180 56 1.7067 0.4091 1.7067 1.3064
No log 0.9508 58 1.4796 0.4094 1.4796 1.2164
No log 0.9836 60 1.2998 0.3780 1.2998 1.1401
No log 1.0164 62 1.5780 0.2857 1.5780 1.2562
No log 1.0492 64 1.8017 0.3333 1.8017 1.3423
No log 1.0820 66 1.3851 0.375 1.3851 1.1769
No log 1.1148 68 1.1376 0.4640 1.1376 1.0666
No log 1.1475 70 1.1853 0.4262 1.1853 1.0887
No log 1.1803 72 1.2480 0.4516 1.2480 1.1171
No log 1.2131 74 1.3160 0.4160 1.3160 1.1472
No log 1.2459 76 1.2957 0.4094 1.2957 1.1383
No log 1.2787 78 1.1503 0.4262 1.1503 1.0725
No log 1.3115 80 1.0860 0.4833 1.0860 1.0421
No log 1.3443 82 1.0846 0.5366 1.0846 1.0415
No log 1.3770 84 1.0805 0.5484 1.0805 1.0395
No log 1.4098 86 1.0793 0.4677 1.0793 1.0389
No log 1.4426 88 1.1407 0.5039 1.1407 1.0680
No log 1.4754 90 1.2164 0.5113 1.2164 1.1029
No log 1.5082 92 1.1553 0.4882 1.1553 1.0748
No log 1.5410 94 1.0998 0.48 1.0998 1.0487
No log 1.5738 96 1.1795 0.5079 1.1795 1.0861
No log 1.6066 98 1.3057 0.4762 1.3057 1.1427
No log 1.6393 100 1.3685 0.4724 1.3685 1.1698
No log 1.6721 102 1.3303 0.4545 1.3303 1.1534
No log 1.7049 104 1.2162 0.5758 1.2162 1.1028
No log 1.7377 106 1.1761 0.5481 1.1761 1.0845
No log 1.7705 108 1.2088 0.4593 1.2088 1.0995
No log 1.8033 110 1.2973 0.4580 1.2973 1.1390
No log 1.8361 112 1.4166 0.3810 1.4166 1.1902
No log 1.8689 114 1.5694 0.4094 1.5694 1.2528
No log 1.9016 116 1.5076 0.3968 1.5076 1.2278
No log 1.9344 118 1.4516 0.3810 1.4516 1.2048
No log 1.9672 120 1.2421 0.5496 1.2421 1.1145
No log 2.0 122 1.0658 0.6074 1.0658 1.0324
No log 2.0328 124 1.0080 0.6618 1.0080 1.0040
No log 2.0656 126 1.0152 0.6269 1.0152 1.0076
No log 2.0984 128 1.0542 0.5630 1.0542 1.0267
No log 2.1311 130 1.0304 0.5547 1.0304 1.0151
No log 2.1639 132 1.0579 0.5942 1.0579 1.0285
No log 2.1967 134 1.1709 0.5075 1.1709 1.0821
No log 2.2295 136 1.2758 0.5578 1.2758 1.1295
No log 2.2623 138 1.2131 0.5674 1.2131 1.1014
No log 2.2951 140 1.2628 0.5143 1.2628 1.1237
No log 2.3279 142 1.0872 0.5 1.0872 1.0427
No log 2.3607 144 0.9578 0.6277 0.9578 0.9786
No log 2.3934 146 1.0756 0.5692 1.0756 1.0371
No log 2.4262 148 1.3033 0.4889 1.3033 1.1416
No log 2.4590 150 1.3895 0.4493 1.3895 1.1788
No log 2.4918 152 1.3507 0.5224 1.3507 1.1622
No log 2.5246 154 1.3053 0.5038 1.3053 1.1425
No log 2.5574 156 1.2537 0.5231 1.2537 1.1197
No log 2.5902 158 1.2529 0.5481 1.2529 1.1193
No log 2.6230 160 1.2850 0.5315 1.2850 1.1336
No log 2.6557 162 1.3754 0.5 1.3754 1.1728
No log 2.6885 164 1.3217 0.4903 1.3217 1.1496
No log 2.7213 166 1.2813 0.5223 1.2813 1.1320
No log 2.7541 168 1.4480 0.5357 1.4480 1.2033
No log 2.7869 170 1.4725 0.5389 1.4725 1.2135
No log 2.8197 172 1.1933 0.5429 1.1933 1.0924
No log 2.8525 174 1.1610 0.5429 1.1610 1.0775
No log 2.8852 176 1.2062 0.5185 1.2062 1.0983
No log 2.9180 178 1.2142 0.4885 1.2142 1.1019
No log 2.9508 180 1.1593 0.5344 1.1593 1.0767
No log 2.9836 182 1.0804 0.5714 1.0804 1.0394
No log 3.0164 184 1.0493 0.6154 1.0493 1.0244
No log 3.0492 186 1.0960 0.5414 1.0960 1.0469
No log 3.0820 188 1.1236 0.5 1.1236 1.0600
No log 3.1148 190 1.0594 0.5778 1.0594 1.0293
No log 3.1475 192 0.9999 0.6165 0.9999 1.0000
No log 3.1803 194 1.0755 0.5263 1.0755 1.0371
No log 3.2131 196 1.1948 0.5735 1.1948 1.0931
No log 3.2459 198 1.3025 0.4662 1.3025 1.1413
No log 3.2787 200 1.3368 0.4361 1.3368 1.1562
No log 3.3115 202 1.2551 0.4651 1.2551 1.1203
No log 3.3443 204 1.1633 0.4844 1.1633 1.0786
No log 3.3770 206 1.1066 0.4882 1.1066 1.0519
No log 3.4098 208 1.0989 0.4839 1.0989 1.0483
No log 3.4426 210 1.1232 0.4553 1.1232 1.0598
No log 3.4754 212 1.1876 0.4553 1.1876 1.0898
No log 3.5082 214 1.2249 0.4553 1.2249 1.1068
No log 3.5410 216 1.2342 0.4553 1.2342 1.1110
No log 3.5738 218 1.2026 0.4677 1.2026 1.0966
No log 3.6066 220 1.1937 0.4496 1.1937 1.0926
No log 3.6393 222 1.1917 0.5191 1.1917 1.0917
No log 3.6721 224 1.1635 0.4885 1.1635 1.0786
No log 3.7049 226 1.1094 0.5152 1.1094 1.0533
No log 3.7377 228 1.0455 0.5891 1.0455 1.0225
No log 3.7705 230 1.0978 0.5714 1.0978 1.0478
No log 3.8033 232 1.2360 0.4848 1.2360 1.1118
No log 3.8361 234 1.3654 0.4714 1.3654 1.1685
No log 3.8689 236 1.5578 0.4903 1.5578 1.2481
No log 3.9016 238 1.5392 0.5316 1.5392 1.2407
No log 3.9344 240 1.3063 0.4662 1.3063 1.1429
No log 3.9672 242 1.1126 0.5038 1.1126 1.0548
No log 4.0 244 1.0813 0.5758 1.0813 1.0398
No log 4.0328 246 1.1045 0.5303 1.1045 1.0509
No log 4.0656 248 1.0857 0.5714 1.0857 1.0420
No log 4.0984 250 1.0482 0.6515 1.0482 1.0238
No log 4.1311 252 1.0755 0.5581 1.0755 1.0371
No log 4.1639 254 1.0888 0.5846 1.0888 1.0435
No log 4.1967 256 1.0831 0.6364 1.0831 1.0407
No log 4.2295 258 1.0922 0.6364 1.0922 1.0451
No log 4.2623 260 1.1107 0.6154 1.1107 1.0539
No log 4.2951 262 1.1364 0.5469 1.1364 1.0660
No log 4.3279 264 1.1393 0.5271 1.1393 1.0674
No log 4.3607 266 1.1596 0.5191 1.1596 1.0768
No log 4.3934 268 1.1404 0.5455 1.1404 1.0679
No log 4.4262 270 1.1011 0.5649 1.1011 1.0493
No log 4.4590 272 1.0632 0.6165 1.0632 1.0311
No log 4.4918 274 1.0536 0.5802 1.0536 1.0265
No log 4.5246 276 1.0576 0.5802 1.0576 1.0284
No log 4.5574 278 1.0636 0.5846 1.0636 1.0313
No log 4.5902 280 1.0954 0.5532 1.0954 1.0466
No log 4.6230 282 1.2070 0.5806 1.2070 1.0986
No log 4.6557 284 1.3333 0.6 1.3333 1.1547
No log 4.6885 286 1.3854 0.5909 1.3854 1.1770
No log 4.7213 288 1.3158 0.5765 1.3158 1.1471
No log 4.7541 290 1.1506 0.5175 1.1506 1.0727
No log 4.7869 292 1.0507 0.5385 1.0507 1.0250
No log 4.8197 294 1.1042 0.5802 1.1042 1.0508
No log 4.8525 296 1.2752 0.4219 1.2752 1.1292
No log 4.8852 298 1.3486 0.4094 1.3486 1.1613
No log 4.9180 300 1.3409 0.4426 1.3409 1.1580
No log 4.9508 302 1.3296 0.4370 1.3296 1.1531
No log 4.9836 304 1.3351 0.4370 1.3351 1.1554
No log 5.0164 306 1.3283 0.4370 1.3283 1.1525
No log 5.0492 308 1.2764 0.4500 1.2764 1.1298
No log 5.0820 310 1.2238 0.5323 1.2238 1.1063
No log 5.1148 312 1.3377 0.5191 1.3377 1.1566
No log 5.1475 314 1.7034 0.3231 1.7034 1.3051
No log 5.1803 316 1.5197 0.3676 1.5197 1.2328
No log 5.2131 318 1.1275 0.5303 1.1275 1.0618
No log 5.2459 320 0.9995 0.6316 0.9995 0.9997
No log 5.2787 322 0.9775 0.6715 0.9775 0.9887
No log 5.3115 324 0.9770 0.6715 0.9770 0.9884
No log 5.3443 326 0.9812 0.6131 0.9812 0.9905
No log 5.3770 328 0.9798 0.6087 0.9798 0.9899
No log 5.4098 330 0.9718 0.6099 0.9718 0.9858
No log 5.4426 332 1.0034 0.6667 1.0034 1.0017
No log 5.4754 334 1.0763 0.6452 1.0763 1.0375
No log 5.5082 336 1.0164 0.64 1.0164 1.0082
No log 5.5410 338 0.9806 0.6338 0.9806 0.9902
No log 5.5738 340 1.0122 0.6131 1.0122 1.0061
No log 5.6066 342 1.0532 0.6222 1.0532 1.0263
No log 5.6393 344 1.1425 0.5606 1.1425 1.0689
No log 5.6721 346 1.2790 0.5152 1.2790 1.1309
No log 5.7049 348 1.3886 0.4394 1.3886 1.1784
No log 5.7377 350 1.4502 0.4361 1.4502 1.2043
No log 5.7705 352 1.3793 0.4580 1.3793 1.1744
No log 5.8033 354 1.2894 0.5038 1.2894 1.1355
No log 5.8361 356 1.1519 0.5512 1.1519 1.0732
No log 5.8689 358 1.0833 0.496 1.0833 1.0408
No log 5.9016 360 1.1299 0.4812 1.1299 1.0630
No log 5.9344 362 1.2112 0.5541 1.2112 1.1005
No log 5.9672 364 1.2358 0.6098 1.2358 1.1117
No log 6.0 366 1.1850 0.6061 1.1850 1.0886
No log 6.0328 368 1.1885 0.5952 1.1885 1.0902
No log 6.0656 370 1.3589 0.5506 1.3589 1.1657
No log 6.0984 372 1.6516 0.5484 1.6516 1.2852
No log 6.1311 374 1.4984 0.5341 1.4984 1.2241
No log 6.1639 376 1.2363 0.5976 1.2363 1.1119
No log 6.1967 378 1.0238 0.6579 1.0238 1.0118
No log 6.2295 380 1.0091 0.6383 1.0091 1.0046
No log 6.2623 382 1.0217 0.6269 1.0217 1.0108
No log 6.2951 384 1.0197 0.6462 1.0197 1.0098
No log 6.3279 386 1.0295 0.6565 1.0295 1.0146
No log 6.3607 388 1.0587 0.6107 1.0587 1.0290
No log 6.3934 390 1.1135 0.5075 1.1135 1.0552
No log 6.4262 392 1.1551 0.4848 1.1551 1.0747
No log 6.4590 394 1.1317 0.4848 1.1317 1.0638
No log 6.4918 396 1.0834 0.6316 1.0834 1.0409
No log 6.5246 398 1.0612 0.6357 1.0612 1.0301
No log 6.5574 400 1.0465 0.6165 1.0465 1.0230
No log 6.5902 402 1.0376 0.6466 1.0376 1.0186
No log 6.6230 404 0.9985 0.6466 0.9985 0.9992
No log 6.6557 406 0.9731 0.5426 0.9731 0.9864
No log 6.6885 408 0.9565 0.5469 0.9565 0.9780
No log 6.7213 410 0.9450 0.5469 0.9450 0.9721
No log 6.7541 412 0.9184 0.6222 0.9184 0.9583
No log 6.7869 414 0.8982 0.6812 0.8982 0.9477
No log 6.8197 416 0.9257 0.6861 0.9257 0.9621
No log 6.8525 418 0.9515 0.6815 0.9515 0.9755
No log 6.8852 420 0.9813 0.6466 0.9813 0.9906
No log 6.9180 422 1.0207 0.6269 1.0207 1.0103
No log 6.9508 424 1.0582 0.5970 1.0582 1.0287
No log 6.9836 426 1.0867 0.5672 1.0867 1.0425
No log 7.0164 428 1.0298 0.5606 1.0298 1.0148
No log 7.0492 430 0.9673 0.6617 0.9673 0.9835
No log 7.0820 432 0.9487 0.6963 0.9487 0.9740
No log 7.1148 434 0.9715 0.6176 0.9715 0.9856
No log 7.1475 436 0.9939 0.6286 0.9939 0.9969
No log 7.1803 438 0.9141 0.6812 0.9141 0.9561
No log 7.2131 440 0.8379 0.7050 0.8379 0.9154
No log 7.2459 442 0.8340 0.7050 0.8340 0.9132
No log 7.2787 444 0.8717 0.6812 0.8717 0.9337
No log 7.3115 446 0.8699 0.6957 0.8699 0.9327
No log 7.3443 448 0.8619 0.6667 0.8619 0.9284
No log 7.3770 450 0.9039 0.6165 0.9039 0.9507
No log 7.4098 452 0.9071 0.6015 0.9071 0.9524
No log 7.4426 454 0.9171 0.6767 0.9171 0.9576
No log 7.4754 456 0.9374 0.7007 0.9374 0.9682
No log 7.5082 458 0.8961 0.7246 0.8961 0.9466
No log 7.5410 460 0.8457 0.7246 0.8457 0.9196
No log 7.5738 462 0.7914 0.7143 0.7914 0.8896
No log 7.6066 464 0.7617 0.7042 0.7617 0.8728
No log 7.6393 466 0.7452 0.7042 0.7452 0.8632
No log 7.6721 468 0.7341 0.7222 0.7341 0.8568
No log 7.7049 470 0.7483 0.7222 0.7483 0.8651
No log 7.7377 472 0.7630 0.7172 0.7630 0.8735
No log 7.7705 474 0.7913 0.7092 0.7913 0.8895
No log 7.8033 476 0.8327 0.7101 0.8327 0.9125
No log 7.8361 478 0.8333 0.7101 0.8333 0.9129
No log 7.8689 480 0.8041 0.7007 0.8041 0.8967
No log 7.9016 482 0.7978 0.6912 0.7978 0.8932
No log 7.9344 484 0.8095 0.6906 0.8095 0.8997
No log 7.9672 486 0.8225 0.6861 0.8225 0.9069
No log 8.0 488 0.8124 0.7007 0.8124 0.9013
No log 8.0328 490 0.8172 0.6861 0.8172 0.9040
No log 8.0656 492 0.8369 0.6569 0.8369 0.9148
No log 8.0984 494 0.8722 0.5821 0.8722 0.9339
No log 8.1311 496 0.8748 0.5865 0.8748 0.9353
No log 8.1639 498 0.8690 0.5909 0.8690 0.9322
0.4067 8.1967 500 0.8586 0.6861 0.8586 0.9266
0.4067 8.2295 502 0.8774 0.6815 0.8774 0.9367
0.4067 8.2623 504 0.9072 0.6861 0.9072 0.9525
0.4067 8.2951 506 0.9273 0.6866 0.9273 0.9630
0.4067 8.3279 508 0.9340 0.6917 0.9340 0.9664
0.4067 8.3607 510 0.9239 0.6718 0.9239 0.9612
0.4067 8.3934 512 0.9185 0.6466 0.9185 0.9584
0.4067 8.4262 514 0.9207 0.6466 0.9207 0.9595
0.4067 8.4590 516 0.9361 0.6202 0.9361 0.9675
0.4067 8.4918 518 0.9855 0.6667 0.9855 0.9927
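The log above also implies the training-set size: epoch 0.0328 corresponds to step 2, i.e. roughly 61 optimizer steps per epoch, which at train_batch_size: 8 suggests on the order of 488 training examples. This is an inference from the log, not a figure the card states. A quick check of the arithmetic:

```python
# Steps per epoch inferred from two (epoch, step) pairs in the log
steps_per_epoch_a = 2 / 0.0328    # first logged row
steps_per_epoch_b = 500 / 8.1967  # row where training loss first appears
steps_per_epoch = round(steps_per_epoch_a)

train_batch_size = 8
approx_train_examples = steps_per_epoch * train_batch_size
print(steps_per_epoch, approx_train_examples)  # 61 488
```

Both pairs agree on ~61 steps per epoch; the true example count may differ slightly if the final batch of each epoch is partial.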

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
