ArabicNewSplits8_usingALLEssays_FineTuningAraBERT_run2_AugV5_k14_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the fine-tuning dataset is not specified). It achieves the following results on the evaluation set:

  • Loss: 0.8425
  • QWK: 0.6002
  • MSE: 0.8425
  • RMSE: 0.9179
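The card does not include the evaluation script, but these metrics are standard: QWK is Cohen's kappa with quadratic weights (suited to ordinal essay scores), and RMSE is the square root of MSE (note 0.9179² ≈ 0.8425 above). A minimal sketch of computing them, assuming scikit-learn and illustrative label arrays:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold and predicted ordinal scores, for illustration only
y_true = np.array([0, 1, 2, 3, 2, 1])
y_pred = np.array([0, 1, 2, 2, 2, 1])

# Quadratically weighted kappa penalizes larger ordinal disagreements more
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)  # RMSE is always the square root of MSE
```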

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
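The linear scheduler decays the learning rate from 2e-05 to zero over the total number of training steps. A minimal sketch of that schedule (warmup handling and the step count are assumptions; the actual run used Hugging Face's built-in linear scheduler, and the log below shows ~69 optimizer steps per epoch):

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Linear warmup followed by linear decay to zero,
    mirroring the behavior of the 'linear' lr_scheduler_type."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# With num_epochs=100 and ~69 steps/epoch, total steps would be ~6900.
```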

Training results

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0290 2 5.1876 -0.0062 5.1876 2.2776
No log 0.0580 4 3.1441 0.0753 3.1441 1.7732
No log 0.0870 6 2.0024 0.1160 2.0024 1.4151
No log 0.1159 8 1.3607 0.2820 1.3607 1.1665
No log 0.1449 10 1.5965 0.0136 1.5965 1.2635
No log 0.1739 12 2.0668 -0.1240 2.0668 1.4376
No log 0.2029 14 2.4559 -0.1564 2.4559 1.5671
No log 0.2319 16 2.8043 -0.0355 2.8043 1.6746
No log 0.2609 18 1.8815 0.1172 1.8815 1.3717
No log 0.2899 20 1.3284 0.3541 1.3284 1.1526
No log 0.3188 22 1.3216 0.2824 1.3216 1.1496
No log 0.3478 24 1.3387 0.3104 1.3387 1.1570
No log 0.3768 26 1.1633 0.3679 1.1633 1.0785
No log 0.4058 28 1.0734 0.3410 1.0734 1.0361
No log 0.4348 30 1.3666 0.1888 1.3666 1.1690
No log 0.4638 32 1.6806 0.1803 1.6806 1.2964
No log 0.4928 34 1.9343 0.2173 1.9343 1.3908
No log 0.5217 36 2.1166 0.2226 2.1166 1.4549
No log 0.5507 38 1.7001 0.1682 1.7001 1.3039
No log 0.5797 40 1.1631 0.2913 1.1631 1.0785
No log 0.6087 42 1.0964 0.3273 1.0964 1.0471
No log 0.6377 44 1.1231 0.3051 1.1231 1.0597
No log 0.6667 46 1.1297 0.2987 1.1297 1.0629
No log 0.6957 48 1.1433 0.3220 1.1433 1.0693
No log 0.7246 50 1.1766 0.2274 1.1766 1.0847
No log 0.7536 52 1.2498 0.2684 1.2498 1.1179
No log 0.7826 54 1.3106 0.2417 1.3106 1.1448
No log 0.8116 56 1.2667 0.2609 1.2667 1.1255
No log 0.8406 58 1.2465 0.3109 1.2465 1.1165
No log 0.8696 60 1.1890 0.2774 1.1890 1.0904
No log 0.8986 62 1.1093 0.3377 1.1093 1.0532
No log 0.9275 64 1.0749 0.3795 1.0749 1.0368
No log 0.9565 66 1.0692 0.3679 1.0692 1.0340
No log 0.9855 68 1.0671 0.3937 1.0671 1.0330
No log 1.0145 70 1.0997 0.4863 1.0997 1.0487
No log 1.0435 72 1.1937 0.4229 1.1937 1.0926
No log 1.0725 74 1.1133 0.4439 1.1133 1.0551
No log 1.1014 76 0.9823 0.4512 0.9823 0.9911
No log 1.1304 78 0.9426 0.4440 0.9426 0.9709
No log 1.1594 80 0.9352 0.4527 0.9352 0.9671
No log 1.1884 82 1.1217 0.4078 1.1217 1.0591
No log 1.2174 84 1.4105 0.3219 1.4105 1.1876
No log 1.2464 86 1.6092 0.2522 1.6092 1.2685
No log 1.2754 88 1.1285 0.4142 1.1285 1.0623
No log 1.3043 90 1.1762 0.4338 1.1762 1.0845
No log 1.3333 92 1.2418 0.4126 1.2418 1.1144
No log 1.3623 94 0.9958 0.5531 0.9958 0.9979
No log 1.3913 96 0.9569 0.5384 0.9569 0.9782
No log 1.4203 98 0.9508 0.5216 0.9508 0.9751
No log 1.4493 100 1.0300 0.4823 1.0300 1.0149
No log 1.4783 102 1.0169 0.4764 1.0169 1.0084
No log 1.5072 104 0.9011 0.5055 0.9011 0.9493
No log 1.5362 106 0.8942 0.5073 0.8942 0.9456
No log 1.5652 108 0.9602 0.5055 0.9602 0.9799
No log 1.5942 110 0.9075 0.4872 0.9075 0.9526
No log 1.6232 112 0.8951 0.4877 0.8951 0.9461
No log 1.6522 114 1.0088 0.4583 1.0088 1.0044
No log 1.6812 116 1.0996 0.4044 1.0996 1.0486
No log 1.7101 118 1.5281 0.3157 1.5281 1.2362
No log 1.7391 120 2.1474 0.2262 2.1474 1.4654
No log 1.7681 122 1.7786 0.2673 1.7786 1.3336
No log 1.7971 124 1.0890 0.5084 1.0890 1.0435
No log 1.8261 126 0.9427 0.5036 0.9427 0.9709
No log 1.8551 128 1.0066 0.5078 1.0066 1.0033
No log 1.8841 130 1.2026 0.4495 1.2026 1.0966
No log 1.9130 132 1.5268 0.3590 1.5268 1.2356
No log 1.9420 134 1.5888 0.3232 1.5888 1.2605
No log 1.9710 136 1.4575 0.3475 1.4575 1.2073
No log 2.0 138 1.2135 0.3612 1.2135 1.1016
No log 2.0290 140 1.0398 0.4018 1.0398 1.0197
No log 2.0580 142 0.9650 0.4168 0.9650 0.9823
No log 2.0870 144 0.9662 0.4828 0.9662 0.9830
No log 2.1159 146 1.0806 0.4540 1.0806 1.0395
No log 2.1449 148 1.1404 0.4308 1.1404 1.0679
No log 2.1739 150 1.1193 0.4723 1.1193 1.0580
No log 2.2029 152 1.0800 0.5028 1.0800 1.0392
No log 2.2319 154 0.9760 0.5560 0.9760 0.9880
No log 2.2609 156 0.8752 0.5568 0.8752 0.9355
No log 2.2899 158 0.8851 0.5699 0.8851 0.9408
No log 2.3188 160 0.8215 0.5595 0.8215 0.9064
No log 2.3478 162 0.8234 0.5607 0.8234 0.9074
No log 2.3768 164 0.9033 0.5713 0.9033 0.9504
No log 2.4058 166 1.0675 0.5128 1.0675 1.0332
No log 2.4348 168 0.9987 0.5665 0.9987 0.9994
No log 2.4638 170 0.8270 0.5869 0.8270 0.9094
No log 2.4928 172 0.8222 0.5658 0.8222 0.9067
No log 2.5217 174 0.8775 0.5831 0.8775 0.9367
No log 2.5507 176 0.7985 0.6016 0.7985 0.8936
No log 2.5797 178 0.8296 0.5628 0.8296 0.9108
No log 2.6087 180 0.9128 0.6047 0.9128 0.9554
No log 2.6377 182 0.8649 0.5868 0.8649 0.9300
No log 2.6667 184 0.8202 0.4953 0.8202 0.9057
No log 2.6957 186 0.8883 0.5360 0.8883 0.9425
No log 2.7246 188 0.8962 0.4666 0.8962 0.9467
No log 2.7536 190 0.9843 0.5267 0.9843 0.9921
No log 2.7826 192 1.0635 0.5137 1.0635 1.0313
No log 2.8116 194 1.0003 0.5207 1.0003 1.0001
No log 2.8406 196 0.9177 0.4941 0.9177 0.9580
No log 2.8696 198 0.8945 0.4957 0.8945 0.9458
No log 2.8986 200 0.8959 0.5319 0.8959 0.9465
No log 2.9275 202 0.9675 0.5070 0.9675 0.9836
No log 2.9565 204 1.1080 0.4329 1.1080 1.0526
No log 2.9855 206 1.0717 0.4811 1.0717 1.0352
No log 3.0145 208 1.0247 0.5238 1.0247 1.0123
No log 3.0435 210 0.9807 0.5239 0.9807 0.9903
No log 3.0725 212 0.9263 0.5681 0.9263 0.9624
No log 3.1014 214 0.9400 0.5396 0.9400 0.9695
No log 3.1304 216 1.0157 0.4978 1.0157 1.0078
No log 3.1594 218 1.1658 0.4682 1.1658 1.0797
No log 3.1884 220 1.0887 0.4763 1.0887 1.0434
No log 3.2174 222 0.8896 0.5495 0.8896 0.9432
No log 3.2464 224 0.8825 0.6010 0.8825 0.9394
No log 3.2754 226 0.8924 0.6096 0.8924 0.9447
No log 3.3043 228 0.9612 0.5246 0.9612 0.9804
No log 3.3333 230 1.1460 0.4911 1.1460 1.0705
No log 3.3623 232 1.4684 0.4483 1.4684 1.2118
No log 3.3913 234 1.7799 0.3982 1.7799 1.3341
No log 3.4203 236 1.8232 0.3780 1.8232 1.3502
No log 3.4493 238 1.6960 0.4128 1.6960 1.3023
No log 3.4783 240 1.5730 0.4526 1.5730 1.2542
No log 3.5072 242 1.2819 0.5114 1.2819 1.1322
No log 3.5362 244 1.0328 0.4967 1.0328 1.0162
No log 3.5652 246 0.8746 0.6483 0.8746 0.9352
No log 3.5942 248 0.8705 0.6746 0.8705 0.9330
No log 3.6232 250 0.9085 0.6100 0.9085 0.9531
No log 3.6522 252 0.9852 0.5723 0.9852 0.9926
No log 3.6812 254 1.1818 0.5093 1.1818 1.0871
No log 3.7101 256 1.3026 0.4636 1.3026 1.1413
No log 3.7391 258 1.1469 0.4683 1.1469 1.0710
No log 3.7681 260 0.8894 0.6227 0.8894 0.9431
No log 3.7971 262 0.7845 0.6576 0.7845 0.8857
No log 3.8261 264 0.7754 0.6755 0.7754 0.8805
No log 3.8551 266 0.8287 0.6436 0.8287 0.9103
No log 3.8841 268 0.9376 0.5550 0.9376 0.9683
No log 3.9130 270 1.0411 0.5249 1.0411 1.0203
No log 3.9420 272 1.0950 0.5272 1.0950 1.0464
No log 3.9710 274 1.0331 0.5210 1.0331 1.0164
No log 4.0 276 0.8592 0.6370 0.8592 0.9270
No log 4.0290 278 0.7772 0.6151 0.7772 0.8816
No log 4.0580 280 0.7751 0.6180 0.7751 0.8804
No log 4.0870 282 0.8376 0.6369 0.8376 0.9152
No log 4.1159 284 1.0179 0.4919 1.0179 1.0089
No log 4.1449 286 1.1284 0.4488 1.1284 1.0623
No log 4.1739 288 1.1351 0.4523 1.1351 1.0654
No log 4.2029 290 0.9645 0.5326 0.9645 0.9821
No log 4.2319 292 0.9079 0.5665 0.9079 0.9528
No log 4.2609 294 0.8592 0.5930 0.8592 0.9269
No log 4.2899 296 0.8807 0.5900 0.8807 0.9384
No log 4.3188 298 0.9032 0.5767 0.9032 0.9504
No log 4.3478 300 0.9756 0.5230 0.9756 0.9877
No log 4.3768 302 1.1211 0.4979 1.1211 1.0588
No log 4.4058 304 1.1873 0.5028 1.1873 1.0896
No log 4.4348 306 1.0640 0.5047 1.0640 1.0315
No log 4.4638 308 0.9174 0.5873 0.9174 0.9578
No log 4.4928 310 0.9105 0.5725 0.9105 0.9542
No log 4.5217 312 1.0249 0.5444 1.0249 1.0124
No log 4.5507 314 1.1465 0.4979 1.1465 1.0707
No log 4.5797 316 1.3488 0.4593 1.3488 1.1614
No log 4.6087 318 1.3611 0.4498 1.3611 1.1666
No log 4.6377 320 1.1984 0.4797 1.1984 1.0947
No log 4.6667 322 0.9238 0.5958 0.9238 0.9612
No log 4.6957 324 0.7694 0.6480 0.7694 0.8772
No log 4.7246 326 0.7553 0.6380 0.7553 0.8691
No log 4.7536 328 0.8299 0.6502 0.8299 0.9110
No log 4.7826 330 1.1012 0.4916 1.1012 1.0494
No log 4.8116 332 1.4337 0.4526 1.4337 1.1974
No log 4.8406 334 1.5246 0.4179 1.5246 1.2348
No log 4.8696 336 1.5023 0.3880 1.5023 1.2257
No log 4.8986 338 1.3949 0.4097 1.3949 1.1810
No log 4.9275 340 1.1132 0.4817 1.1132 1.0551
No log 4.9565 342 0.9040 0.5825 0.9040 0.9508
No log 4.9855 344 0.7760 0.6711 0.7760 0.8809
No log 5.0145 346 0.7542 0.6152 0.7542 0.8684
No log 5.0435 348 0.7797 0.6179 0.7797 0.8830
No log 5.0725 350 0.8920 0.6177 0.8920 0.9444
No log 5.1014 352 0.9080 0.6160 0.9080 0.9529
No log 5.1304 354 0.8247 0.6517 0.8247 0.9081
No log 5.1594 356 0.7804 0.6160 0.7804 0.8834
No log 5.1884 358 0.7692 0.6457 0.7692 0.8770
No log 5.2174 360 0.8014 0.6564 0.8014 0.8952
No log 5.2464 362 0.8138 0.6636 0.8138 0.9021
No log 5.2754 364 0.8788 0.6163 0.8788 0.9374
No log 5.3043 366 0.8996 0.5852 0.8996 0.9485
No log 5.3333 368 1.0233 0.5290 1.0233 1.0116
No log 5.3623 370 1.1763 0.4781 1.1763 1.0846
No log 5.3913 372 1.2630 0.4664 1.2630 1.1239
No log 5.4203 374 1.1366 0.4720 1.1366 1.0661
No log 5.4493 376 0.9923 0.5498 0.9923 0.9962
No log 5.4783 378 0.8546 0.6245 0.8546 0.9245
No log 5.5072 380 0.7339 0.6859 0.7339 0.8567
No log 5.5362 382 0.7092 0.6806 0.7092 0.8422
No log 5.5652 384 0.7277 0.6881 0.7277 0.8530
No log 5.5942 386 0.8205 0.6130 0.8205 0.9058
No log 5.6232 388 0.9332 0.5419 0.9332 0.9660
No log 5.6522 390 1.0205 0.5198 1.0205 1.0102
No log 5.6812 392 0.9899 0.5350 0.9899 0.9949
No log 5.7101 394 0.8680 0.6126 0.8680 0.9316
No log 5.7391 396 0.7403 0.6145 0.7403 0.8604
No log 5.7681 398 0.7225 0.6574 0.7225 0.8500
No log 5.7971 400 0.7688 0.6832 0.7688 0.8768
No log 5.8261 402 0.7983 0.6667 0.7983 0.8935
No log 5.8551 404 0.8602 0.6145 0.8602 0.9275
No log 5.8841 406 0.8389 0.6175 0.8389 0.9159
No log 5.9130 408 0.7862 0.6499 0.7863 0.8867
No log 5.9420 410 0.7041 0.6805 0.7041 0.8391
No log 5.9710 412 0.6811 0.6704 0.6811 0.8253
No log 6.0 414 0.6977 0.6797 0.6977 0.8353
No log 6.0290 416 0.7566 0.6586 0.7566 0.8698
No log 6.0580 418 0.7377 0.6753 0.7378 0.8589
No log 6.0870 420 0.7547 0.6723 0.7547 0.8687
No log 6.1159 422 0.8049 0.6433 0.8049 0.8972
No log 6.1449 424 0.8941 0.6178 0.8941 0.9456
No log 6.1739 426 1.0113 0.5823 1.0113 1.0056
No log 6.2029 428 1.0494 0.5902 1.0494 1.0244
No log 6.2319 430 0.9825 0.5799 0.9825 0.9912
No log 6.2609 432 0.9658 0.5449 0.9658 0.9828
No log 6.2899 434 1.0624 0.4733 1.0624 1.0307
No log 6.3188 436 1.1198 0.4755 1.1198 1.0582
No log 6.3478 438 1.1286 0.5000 1.1286 1.0623
No log 6.3768 440 1.0812 0.5242 1.0812 1.0398
No log 6.4058 442 0.8723 0.6468 0.8723 0.9340
No log 6.4348 444 0.7199 0.6760 0.7199 0.8485
No log 6.4638 446 0.7340 0.6561 0.7340 0.8567
No log 6.4928 448 0.7082 0.6676 0.7082 0.8415
No log 6.5217 450 0.7348 0.6851 0.7348 0.8572
No log 6.5507 452 0.9064 0.6245 0.9064 0.9520
No log 6.5797 454 1.0740 0.5041 1.0740 1.0364
No log 6.6087 456 1.0989 0.4864 1.0989 1.0483
No log 6.6377 458 0.9242 0.5480 0.9242 0.9614
No log 6.6667 460 0.7664 0.6664 0.7664 0.8755
No log 6.6957 462 0.6928 0.6566 0.6928 0.8324
No log 6.7246 464 0.6898 0.6827 0.6898 0.8305
No log 6.7536 466 0.7350 0.6714 0.7350 0.8573
No log 6.7826 468 0.8859 0.6170 0.8859 0.9412
No log 6.8116 470 1.1300 0.5041 1.1300 1.0630
No log 6.8406 472 1.1618 0.4994 1.1618 1.0779
No log 6.8696 474 1.0245 0.5255 1.0245 1.0122
No log 6.8986 476 0.8190 0.6133 0.8190 0.9050
No log 6.9275 478 0.7064 0.6192 0.7064 0.8405
No log 6.9565 480 0.7044 0.5819 0.7044 0.8393
No log 6.9855 482 0.7109 0.6260 0.7109 0.8431
No log 7.0145 484 0.7498 0.6491 0.7498 0.8659
No log 7.0435 486 0.8715 0.6181 0.8715 0.9336
No log 7.0725 488 0.9032 0.5947 0.9032 0.9504
No log 7.1014 490 0.9378 0.5776 0.9378 0.9684
No log 7.1304 492 0.8528 0.6245 0.8528 0.9235
No log 7.1594 494 0.7856 0.6760 0.7856 0.8863
No log 7.1884 496 0.7557 0.6828 0.7557 0.8693
No log 7.2174 498 0.7706 0.6701 0.7706 0.8778
0.4955 7.2464 500 0.8273 0.6580 0.8273 0.9096
0.4955 7.2754 502 0.8964 0.6116 0.8964 0.9468
0.4955 7.3043 504 0.8663 0.6175 0.8663 0.9308
0.4955 7.3333 506 0.7802 0.6273 0.7802 0.8833
0.4955 7.3623 508 0.7298 0.6976 0.7298 0.8543
0.4955 7.3913 510 0.7101 0.6920 0.7101 0.8427
0.4955 7.4203 512 0.7039 0.6398 0.7039 0.8390
0.4955 7.4493 514 0.8090 0.6221 0.8090 0.8994
0.4955 7.4783 516 0.9346 0.5487 0.9346 0.9667
0.4955 7.5072 518 0.9195 0.5631 0.9195 0.9589
0.4955 7.5362 520 0.8425 0.6002 0.8425 0.9179
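The headline metrics come from the last logged evaluation (epoch 7.5362), not the best one: QWK peaks at 0.6976 around epoch 7.36. If checkpoint selection by QWK were desired, a simple scan over the eval history would suffice; the tuples below are an excerpt of the table above:

```python
# (epoch, qwk) pairs excerpted from the evaluation log above
eval_history = [
    (7.2464, 0.6580),
    (7.2754, 0.6116),
    (7.3623, 0.6976),
    (7.4203, 0.6398),
    (7.5362, 0.6002),
]

# Pick the checkpoint with the highest quadratic weighted kappa
best_epoch, best_qwk = max(eval_history, key=lambda pair: pair[1])
```

In the Hugging Face Trainer this corresponds to setting `load_best_model_at_end=True` with `metric_for_best_model` pointed at the QWK metric, which the hyperparameters above do not appear to use.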

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B parameters (F32 tensors, Safetensors format)
