ArabicNewSplits8_usingALLEssays_FineTuningAraBERT_run2_AugV5_k13_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the list):

  • Loss: 0.8119
  • Qwk: 0.6570
  • Mse: 0.8119
  • Rmse: 0.9011
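
The card does not document the input format or task setup; the MSE/RMSE metrics alongside QWK suggest the classification head outputs a continuous organization score, but that is an assumption. The snippet below is only a minimal sketch of loading the checkpoint under that assumption, not a confirmed usage recipe.

```python
# Minimal inference sketch (assumption: single-output regression head,
# consistent with the MSE/RMSE metrics reported above).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits8_usingALLEssays_FineTuningAraBERT_run2_AugV5_k13_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # an Arabic essay to be scored for organization
inputs = tokenizer(essay, truncation=True, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```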

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
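
The same hyperparameters can be expressed as Hugging Face TrainingArguments; the sketch below is illustrative only, and the output directory name and evaluation strategy are assumptions not taken from the card.

```python
# Hedged sketch of the reported hyperparameters as TrainingArguments;
# output_dir and evaluation_strategy are illustrative assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task1_organization",  # assumed name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",  # assumed; the table reports eval every 2 steps
)
```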

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0317 2 5.0210 -0.0109 5.0210 2.2408
No log 0.0635 4 2.8805 0.0881 2.8805 1.6972
No log 0.0952 6 1.8444 0.0836 1.8444 1.3581
No log 0.1270 8 1.3067 0.2017 1.3067 1.1431
No log 0.1587 10 1.5433 -0.0336 1.5433 1.2423
No log 0.1905 12 1.9641 -0.0771 1.9641 1.4015
No log 0.2222 14 1.5301 0.0046 1.5301 1.2370
No log 0.2540 16 1.3381 0.1531 1.3381 1.1568
No log 0.2857 18 1.2459 0.1223 1.2459 1.1162
No log 0.3175 20 1.2420 0.1584 1.2420 1.1144
No log 0.3492 22 1.2472 0.1423 1.2472 1.1168
No log 0.3810 24 1.3577 0.0772 1.3577 1.1652
No log 0.4127 26 1.5560 -0.0304 1.5560 1.2474
No log 0.4444 28 1.5792 -0.0304 1.5792 1.2567
No log 0.4762 30 1.4560 -0.0421 1.4560 1.2066
No log 0.5079 32 1.2436 0.2401 1.2436 1.1152
No log 0.5397 34 1.2188 0.1721 1.2188 1.1040
No log 0.5714 36 1.2130 0.2036 1.2130 1.1014
No log 0.6032 38 1.2188 0.2089 1.2188 1.1040
No log 0.6349 40 1.3419 0.1877 1.3419 1.1584
No log 0.6667 42 1.6227 0.1674 1.6227 1.2738
No log 0.6984 44 1.8874 0.1737 1.8874 1.3738
No log 0.7302 46 1.8157 0.1845 1.8157 1.3475
No log 0.7619 48 1.4306 0.1660 1.4306 1.1961
No log 0.7937 50 1.2603 0.2038 1.2603 1.1226
No log 0.8254 52 1.1167 0.2616 1.1167 1.0567
No log 0.8571 54 1.0953 0.2900 1.0953 1.0466
No log 0.8889 56 1.1076 0.2909 1.1076 1.0524
No log 0.9206 58 1.1806 0.2341 1.1806 1.0866
No log 0.9524 60 1.3008 0.2297 1.3008 1.1405
No log 0.9841 62 1.3353 0.2597 1.3353 1.1555
No log 1.0159 64 1.2791 0.3009 1.2791 1.1310
No log 1.0476 66 1.2979 0.3451 1.2979 1.1392
No log 1.0794 68 1.1598 0.3994 1.1598 1.0769
No log 1.1111 70 1.1535 0.3738 1.1535 1.0740
No log 1.1429 72 1.1650 0.3847 1.1650 1.0794
No log 1.1746 74 1.0901 0.4212 1.0901 1.0441
No log 1.2063 76 1.0187 0.4354 1.0187 1.0093
No log 1.2381 78 1.0356 0.4254 1.0356 1.0177
No log 1.2698 80 1.0121 0.4006 1.0121 1.0060
No log 1.3016 82 1.0276 0.3586 1.0276 1.0137
No log 1.3333 84 1.0438 0.3447 1.0438 1.0216
No log 1.3651 86 1.0254 0.3358 1.0254 1.0126
No log 1.3968 88 1.0582 0.3443 1.0582 1.0287
No log 1.4286 90 1.0578 0.3688 1.0578 1.0285
No log 1.4603 92 1.0840 0.3845 1.0840 1.0412
No log 1.4921 94 0.9784 0.4617 0.9784 0.9891
No log 1.5238 96 1.0232 0.3896 1.0232 1.0115
No log 1.5556 98 1.0983 0.4058 1.0983 1.0480
No log 1.5873 100 1.0273 0.5288 1.0273 1.0136
No log 1.6190 102 0.9466 0.5811 0.9466 0.9729
No log 1.6508 104 0.9370 0.5974 0.9370 0.9680
No log 1.6825 106 0.9398 0.5734 0.9398 0.9694
No log 1.7143 108 1.1058 0.4694 1.1058 1.0516
No log 1.7460 110 1.1228 0.4902 1.1228 1.0596
No log 1.7778 112 0.9785 0.5101 0.9785 0.9892
No log 1.8095 114 0.8798 0.5689 0.8798 0.9380
No log 1.8413 116 0.9642 0.4840 0.9642 0.9819
No log 1.8730 118 0.9965 0.5019 0.9965 0.9982
No log 1.9048 120 0.9139 0.5333 0.9139 0.9560
No log 1.9365 122 0.8804 0.5507 0.8804 0.9383
No log 1.9683 124 0.9254 0.5951 0.9254 0.9620
No log 2.0 126 0.9550 0.5710 0.9550 0.9773
No log 2.0317 128 0.8880 0.5445 0.8880 0.9424
No log 2.0635 130 0.9220 0.5091 0.9220 0.9602
No log 2.0952 132 1.0246 0.5084 1.0246 1.0122
No log 2.1270 134 0.9219 0.5191 0.9219 0.9602
No log 2.1587 136 0.8387 0.6440 0.8387 0.9158
No log 2.1905 138 1.1022 0.5180 1.1022 1.0498
No log 2.2222 140 1.2401 0.4607 1.2401 1.1136
No log 2.2540 142 1.0553 0.5152 1.0553 1.0273
No log 2.2857 144 0.7367 0.6493 0.7367 0.8583
No log 2.3175 146 0.7043 0.6444 0.7043 0.8392
No log 2.3492 148 0.7065 0.6544 0.7065 0.8405
No log 2.3810 150 0.7768 0.6394 0.7768 0.8814
No log 2.4127 152 0.9253 0.5842 0.9253 0.9619
No log 2.4444 154 0.9015 0.6172 0.9015 0.9495
No log 2.4762 156 0.7967 0.6228 0.7967 0.8926
No log 2.5079 158 0.7562 0.6774 0.7562 0.8696
No log 2.5397 160 0.7917 0.6193 0.7917 0.8898
No log 2.5714 162 0.7596 0.6255 0.7596 0.8715
No log 2.6032 164 0.7586 0.6004 0.7586 0.8710
No log 2.6349 166 0.8349 0.6419 0.8349 0.9137
No log 2.6667 168 0.8209 0.6212 0.8209 0.9061
No log 2.6984 170 0.8348 0.6163 0.8348 0.9136
No log 2.7302 172 0.8762 0.6046 0.8762 0.9361
No log 2.7619 174 0.8743 0.6249 0.8743 0.9350
No log 2.7937 176 0.8501 0.6460 0.8501 0.9220
No log 2.8254 178 0.9625 0.5421 0.9625 0.9811
No log 2.8571 180 1.1031 0.5561 1.1031 1.0503
No log 2.8889 182 1.3053 0.4792 1.3053 1.1425
No log 2.9206 184 1.2898 0.4785 1.2898 1.1357
No log 2.9524 186 1.1392 0.5201 1.1392 1.0673
No log 2.9841 188 0.9447 0.5586 0.9447 0.9719
No log 3.0159 190 0.8344 0.5541 0.8344 0.9135
No log 3.0476 192 0.8541 0.4768 0.8541 0.9242
No log 3.0794 194 0.8230 0.5708 0.8230 0.9072
No log 3.1111 196 1.0572 0.4965 1.0572 1.0282
No log 3.1429 198 1.1924 0.4534 1.1924 1.0919
No log 3.1746 200 1.0826 0.4943 1.0826 1.0405
No log 3.2063 202 0.9015 0.5094 0.9015 0.9495
No log 3.2381 204 0.8155 0.5904 0.8155 0.9030
No log 3.2698 206 0.8376 0.5878 0.8376 0.9152
No log 3.3016 208 1.1125 0.5483 1.1125 1.0548
No log 3.3333 210 1.5238 0.4772 1.5238 1.2344
No log 3.3651 212 1.5121 0.4551 1.5121 1.2297
No log 3.3968 214 1.1616 0.5142 1.1616 1.0778
No log 3.4286 216 0.8355 0.6153 0.8355 0.9141
No log 3.4603 218 0.8269 0.6217 0.8269 0.9093
No log 3.4921 220 0.9741 0.5727 0.9741 0.9870
No log 3.5238 222 1.1264 0.5307 1.1264 1.0613
No log 3.5556 224 1.0586 0.5503 1.0586 1.0289
No log 3.5873 226 0.8702 0.6265 0.8702 0.9329
No log 3.6190 228 0.7879 0.5664 0.7879 0.8877
No log 3.6508 230 0.8043 0.6019 0.8043 0.8968
No log 3.6825 232 0.9277 0.5830 0.9277 0.9632
No log 3.7143 234 1.0665 0.5606 1.0665 1.0327
No log 3.7460 236 1.1914 0.5218 1.1914 1.0915
No log 3.7778 238 1.1370 0.5718 1.1370 1.0663
No log 3.8095 240 0.9628 0.5961 0.9628 0.9812
No log 3.8413 242 0.8939 0.5988 0.8939 0.9455
No log 3.8730 244 0.9113 0.6091 0.9113 0.9546
No log 3.9048 246 0.9879 0.5991 0.9879 0.9939
No log 3.9365 248 1.1824 0.5592 1.1824 1.0874
No log 3.9683 250 1.2744 0.5375 1.2744 1.1289
No log 4.0 252 1.1713 0.5326 1.1713 1.0823
No log 4.0317 254 0.9672 0.5666 0.9672 0.9834
No log 4.0635 256 0.8858 0.6040 0.8858 0.9412
No log 4.0952 258 0.9119 0.5915 0.9119 0.9549
No log 4.1270 260 1.0011 0.5584 1.0011 1.0005
No log 4.1587 262 1.0964 0.5556 1.0964 1.0471
No log 4.1905 264 0.9824 0.5873 0.9824 0.9911
No log 4.2222 266 0.8718 0.5994 0.8718 0.9337
No log 4.2540 268 0.8770 0.5968 0.8770 0.9365
No log 4.2857 270 0.9278 0.5833 0.9278 0.9632
No log 4.3175 272 0.9504 0.5942 0.9504 0.9749
No log 4.3492 274 0.9466 0.5992 0.9466 0.9729
No log 4.3810 276 0.8968 0.6116 0.8968 0.9470
No log 4.4127 278 0.9851 0.5901 0.9851 0.9925
No log 4.4444 280 0.9541 0.5573 0.9541 0.9768
No log 4.4762 282 0.8907 0.5808 0.8907 0.9438
No log 4.5079 284 0.8804 0.5943 0.8804 0.9383
No log 4.5397 286 0.8455 0.6074 0.8455 0.9195
No log 4.5714 288 0.9066 0.5674 0.9066 0.9521
No log 4.6032 290 0.8951 0.5917 0.8951 0.9461
No log 4.6349 292 0.9735 0.5788 0.9735 0.9866
No log 4.6667 294 0.9387 0.5857 0.9387 0.9689
No log 4.6984 296 0.8971 0.5875 0.8971 0.9472
No log 4.7302 298 0.8870 0.5895 0.8870 0.9418
No log 4.7619 300 0.8596 0.5561 0.8596 0.9271
No log 4.7937 302 0.8632 0.5645 0.8632 0.9291
No log 4.8254 304 0.8052 0.6083 0.8052 0.8973
No log 4.8571 306 0.8053 0.6293 0.8053 0.8974
No log 4.8889 308 0.8298 0.6264 0.8298 0.9109
No log 4.9206 310 1.0165 0.5762 1.0165 1.0082
No log 4.9524 312 1.3027 0.5375 1.3027 1.1413
No log 4.9841 314 1.3799 0.5331 1.3799 1.1747
No log 5.0159 316 1.1334 0.5430 1.1334 1.0646
No log 5.0476 318 0.8905 0.5984 0.8905 0.9437
No log 5.0794 320 0.7828 0.6345 0.7828 0.8848
No log 5.1111 322 0.8179 0.6511 0.8179 0.9044
No log 5.1429 324 1.0928 0.5589 1.0928 1.0453
No log 5.1746 326 1.3135 0.5496 1.3135 1.1461
No log 5.2063 328 1.1942 0.5561 1.1942 1.0928
No log 5.2381 330 0.9092 0.6315 0.9092 0.9535
No log 5.2698 332 0.8100 0.6382 0.8100 0.9000
No log 5.3016 334 0.8856 0.6619 0.8856 0.9410
No log 5.3333 336 0.9827 0.5888 0.9827 0.9913
No log 5.3651 338 0.9225 0.6219 0.9225 0.9605
No log 5.3968 340 0.7507 0.6432 0.7507 0.8664
No log 5.4286 342 0.7197 0.6642 0.7197 0.8483
No log 5.4603 344 0.7902 0.6405 0.7902 0.8889
No log 5.4921 346 0.8809 0.6333 0.8809 0.9386
No log 5.5238 348 1.0621 0.5574 1.0621 1.0306
No log 5.5556 350 1.0634 0.5621 1.0634 1.0312
No log 5.5873 352 0.8736 0.5895 0.8736 0.9346
No log 5.6190 354 0.7506 0.6643 0.7506 0.8664
No log 5.6508 356 0.7593 0.6498 0.7593 0.8714
No log 5.6825 358 0.8417 0.6047 0.8417 0.9175
No log 5.7143 360 0.9336 0.5813 0.9336 0.9662
No log 5.7460 362 0.9276 0.5813 0.9276 0.9631
No log 5.7778 364 0.8471 0.5690 0.8471 0.9204
No log 5.8095 366 0.8336 0.6193 0.8336 0.9130
No log 5.8413 368 0.9820 0.5767 0.9820 0.9910
No log 5.8730 370 1.0583 0.5416 1.0583 1.0287
No log 5.9048 372 1.0669 0.5567 1.0669 1.0329
No log 5.9365 374 1.1091 0.5369 1.1091 1.0531
No log 5.9683 376 0.9919 0.5779 0.9919 0.9959
No log 6.0 378 0.9214 0.5890 0.9214 0.9599
No log 6.0317 380 0.9663 0.5936 0.9663 0.9830
No log 6.0635 382 0.9547 0.5838 0.9547 0.9771
No log 6.0952 384 1.0026 0.5790 1.0026 1.0013
No log 6.1270 386 0.9516 0.5790 0.9516 0.9755
No log 6.1587 388 0.8225 0.6317 0.8225 0.9069
No log 6.1905 390 0.7619 0.6917 0.7619 0.8729
No log 6.2222 392 0.8126 0.6401 0.8126 0.9015
No log 6.2540 394 0.9806 0.5387 0.9806 0.9903
No log 6.2857 396 1.1634 0.5295 1.1634 1.0786
No log 6.3175 398 1.2804 0.5170 1.2804 1.1315
No log 6.3492 400 1.1015 0.5090 1.1015 1.0495
No log 6.3810 402 0.8952 0.5947 0.8952 0.9461
No log 6.4127 404 0.7676 0.6357 0.7676 0.8761
No log 6.4444 406 0.7595 0.6501 0.7595 0.8715
No log 6.4762 408 0.8996 0.5791 0.8996 0.9485
No log 6.5079 410 1.2118 0.5076 1.2118 1.1008
No log 6.5397 412 1.4291 0.4787 1.4291 1.1955
No log 6.5714 414 1.2885 0.5185 1.2885 1.1351
No log 6.6032 416 1.0289 0.5240 1.0289 1.0144
No log 6.6349 418 0.7979 0.6416 0.7979 0.8932
No log 6.6667 420 0.7416 0.6653 0.7416 0.8612
No log 6.6984 422 0.7849 0.6520 0.7849 0.8859
No log 6.7302 424 0.9826 0.5593 0.9826 0.9913
No log 6.7619 426 1.0858 0.5611 1.0858 1.0420
No log 6.7937 428 1.0034 0.5593 1.0034 1.0017
No log 6.8254 430 0.9216 0.5981 0.9216 0.9600
No log 6.8571 432 0.9386 0.5981 0.9386 0.9688
No log 6.8889 434 0.8880 0.5998 0.8880 0.9423
No log 6.9206 436 0.9202 0.5950 0.9202 0.9593
No log 6.9524 438 0.8951 0.5950 0.8951 0.9461
No log 6.9841 440 0.7836 0.6748 0.7836 0.8852
No log 7.0159 442 0.7367 0.6920 0.7367 0.8583
No log 7.0476 444 0.8217 0.6519 0.8217 0.9065
No log 7.0794 446 0.8903 0.6193 0.8903 0.9436
No log 7.1111 448 0.8000 0.6645 0.8000 0.8944
No log 7.1429 450 0.7155 0.6831 0.7155 0.8459
No log 7.1746 452 0.6995 0.6621 0.6995 0.8364
No log 7.2063 454 0.7020 0.6702 0.7020 0.8379
No log 7.2381 456 0.7797 0.6636 0.7797 0.8830
No log 7.2698 458 0.7903 0.6636 0.7903 0.8890
No log 7.3016 460 0.8071 0.6550 0.8071 0.8984
No log 7.3333 462 0.7404 0.6693 0.7404 0.8604
No log 7.3651 464 0.7257 0.6772 0.7257 0.8519
No log 7.3968 466 0.7615 0.6771 0.7615 0.8726
No log 7.4286 468 0.8473 0.5901 0.8473 0.9205
No log 7.4603 470 0.8692 0.5903 0.8692 0.9323
No log 7.4921 472 0.8156 0.6379 0.8156 0.9031
No log 7.5238 474 0.7829 0.6515 0.7829 0.8848
No log 7.5556 476 0.7451 0.7118 0.7451 0.8632
No log 7.5873 478 0.7589 0.6810 0.7589 0.8711
No log 7.6190 480 0.8328 0.6134 0.8328 0.9126
No log 7.6508 482 0.9349 0.5924 0.9349 0.9669
No log 7.6825 484 0.9775 0.5921 0.9775 0.9887
No log 7.7143 486 0.9191 0.6031 0.9191 0.9587
No log 7.7460 488 0.7773 0.6910 0.7773 0.8816
No log 7.7778 490 0.7110 0.6916 0.7110 0.8432
No log 7.8095 492 0.7269 0.6849 0.7269 0.8526
No log 7.8413 494 0.7611 0.6846 0.7611 0.8724
No log 7.8730 496 0.7236 0.6954 0.7236 0.8507
No log 7.9048 498 0.6763 0.7080 0.6763 0.8224
0.4733 7.9365 500 0.6864 0.7077 0.6864 0.8285
0.4733 7.9683 502 0.7302 0.7051 0.7302 0.8545
0.4733 8.0 504 0.7608 0.6998 0.7608 0.8722
0.4733 8.0317 506 0.8018 0.6896 0.8018 0.8955
0.4733 8.0635 508 0.8051 0.6794 0.8051 0.8973
0.4733 8.0952 510 0.8119 0.6570 0.8119 0.9011
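
For reference, the metrics reported above (QWK, MSE, RMSE) can be computed with scikit-learn and NumPy. Rounding continuous predictions to integer scores before computing QWK is an assumption about the evaluation protocol; the card does not state it, and the values below are placeholder examples.

```python
# Sketch of the evaluation metrics used in this card (QWK, MSE, RMSE).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 1])          # gold organization scores (example values)
y_pred = np.array([2.6, 2.1, 3.7, 1.4])  # model outputs (example values)

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
print(f"MSE={mse:.4f}  RMSE={rmse:.4f}  QWK={qwk:.4f}")
```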

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1