ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k2_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02. The training dataset is not specified in this card. The model achieves the following results on the evaluation set:

  • Loss: 1.1427
  • QWK: 0.3792
  • MSE: 1.1427
  • RMSE: 1.0690
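QWK (quadratic weighted kappa) measures agreement between predicted and reference ordinal scores, penalizing large disagreements quadratically, while RMSE is simply the square root of the MSE. As a reference, here is a minimal pure-Python sketch of both metrics; this is not taken from the training code, and scikit-learn's `cohen_kappa_score(..., weights="quadratic")` computes the same kappa.

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted Cohen's kappa for ordinal labels 0..n_classes-1."""
    n = len(y_true)
    # observed co-occurrence matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # quadratic disagreement weights: 0 on the diagonal, growing with |i - j|
    W = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    # expected matrix under independence of the two marginals
    row = [sum(O[i]) for i in range(n_classes)]
    col = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    E = [[row[i] * col[j] / n for j in range(n_classes)] for i in range(n_classes)]
    num = sum(W[i][j] * O[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(W[i][j] * E[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def rmse(y_true, y_pred):
    """Root mean squared error; the MSE reported above is rmse(...) ** 2."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

A QWK of 1.0 means perfect agreement, 0.0 means chance-level agreement, and negative values mean worse-than-chance agreement; the final QWK of 0.3792 therefore indicates moderate agreement with the reference scores.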

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
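The hyperparameters above map onto a Hugging Face `TrainingArguments` object roughly as follows. This is a hedged reconstruction, not the original training script: `output_dir`, the evaluation interval, and the logging interval are assumptions inferred from the results table below (evaluation every 2 steps, training loss first logged at step 500).

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the configuration listed above.
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the optimizer defaults.
training_args = TrainingArguments(
    output_dir="./results",            # assumption: not stated in the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",             # the table below evaluates every 2 steps
    eval_steps=2,
    logging_steps=500,                 # training loss first appears at step 500
)
```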

Training results

Note: "No log" in the Training Loss column means that no training loss had been logged yet at that step; with the logging interval used, the first logged training loss (0.2315) appears at step 500.

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.25 2 4.5069 0.0010 4.5069 2.1230
No log 0.5 4 2.6521 0.0365 2.6521 1.6285
No log 0.75 6 1.8474 0.0198 1.8474 1.3592
No log 1.0 8 1.7237 0.0062 1.7237 1.3129
No log 1.25 10 1.5334 0.0372 1.5334 1.2383
No log 1.5 12 1.3753 0.0099 1.3753 1.1727
No log 1.75 14 1.3418 0.0462 1.3418 1.1584
No log 2.0 16 1.2889 0.0376 1.2889 1.1353
No log 2.25 18 1.6023 0.1038 1.6023 1.2658
No log 2.5 20 2.5206 0.0681 2.5206 1.5876
No log 2.75 22 2.4297 0.0706 2.4297 1.5588
No log 3.0 24 1.7362 0.1763 1.7362 1.3177
No log 3.25 26 1.2828 0.1252 1.2828 1.1326
No log 3.5 28 1.1866 0.1587 1.1866 1.0893
No log 3.75 30 1.1499 0.1809 1.1499 1.0723
No log 4.0 32 1.1876 0.1865 1.1876 1.0898
No log 4.25 34 1.2476 0.1255 1.2476 1.1170
No log 4.5 36 1.3467 0.1722 1.3467 1.1605
No log 4.75 38 1.7098 0.2516 1.7098 1.3076
No log 5.0 40 1.6905 0.2658 1.6905 1.3002
No log 5.25 42 1.5978 0.2479 1.5978 1.2640
No log 5.5 44 1.2233 0.2961 1.2233 1.1060
No log 5.75 46 1.0913 0.3937 1.0913 1.0446
No log 6.0 48 1.0308 0.3614 1.0308 1.0153
No log 6.25 50 0.9626 0.3942 0.9626 0.9811
No log 6.5 52 1.2094 0.3585 1.2094 1.0997
No log 6.75 54 1.3415 0.3070 1.3415 1.1582
No log 7.0 56 1.1255 0.3665 1.1255 1.0609
No log 7.25 58 0.9002 0.4475 0.9002 0.9488
No log 7.5 60 0.8465 0.4657 0.8465 0.9201
No log 7.75 62 0.8452 0.4249 0.8452 0.9194
No log 8.0 64 0.8956 0.4728 0.8956 0.9463
No log 8.25 66 1.0627 0.4306 1.0627 1.0309
No log 8.5 68 1.2613 0.3788 1.2613 1.1231
No log 8.75 70 0.9132 0.4767 0.9132 0.9556
No log 9.0 72 0.7887 0.5948 0.7887 0.8881
No log 9.25 74 0.8415 0.4978 0.8415 0.9173
No log 9.5 76 0.8643 0.4690 0.8643 0.9297
No log 9.75 78 1.1341 0.4166 1.1341 1.0650
No log 10.0 80 1.4133 0.4137 1.4133 1.1888
No log 10.25 82 1.4724 0.3783 1.4724 1.2134
No log 10.5 84 1.1948 0.3571 1.1948 1.0931
No log 10.75 86 0.9495 0.4893 0.9495 0.9744
No log 11.0 88 0.9856 0.3960 0.9856 0.9928
No log 11.25 90 0.9653 0.2188 0.9653 0.9825
No log 11.5 92 0.8130 0.4726 0.8130 0.9017
No log 11.75 94 0.9936 0.4083 0.9936 0.9968
No log 12.0 96 1.5033 0.3686 1.5033 1.2261
No log 12.25 98 1.4495 0.3788 1.4495 1.2040
No log 12.5 100 1.0045 0.3747 1.0045 1.0023
No log 12.75 102 0.8182 0.4519 0.8182 0.9045
No log 13.0 104 1.0696 0.2801 1.0696 1.0342
No log 13.25 106 1.1912 0.2089 1.1912 1.0914
No log 13.5 108 1.0009 0.3318 1.0009 1.0005
No log 13.75 110 0.9711 0.3431 0.9711 0.9854
No log 14.0 112 1.0526 0.2613 1.0526 1.0260
No log 14.25 114 1.0295 0.2939 1.0295 1.0146
No log 14.5 116 0.8963 0.4752 0.8963 0.9467
No log 14.75 118 0.8280 0.4760 0.8280 0.9100
No log 15.0 120 0.8476 0.4888 0.8476 0.9206
No log 15.25 122 0.8808 0.5089 0.8808 0.9385
No log 15.5 124 0.9235 0.4867 0.9235 0.9610
No log 15.75 126 1.1115 0.3787 1.1115 1.0543
No log 16.0 128 1.3503 0.4481 1.3503 1.1620
No log 16.25 130 1.3902 0.4205 1.3902 1.1791
No log 16.5 132 1.1737 0.4064 1.1737 1.0834
No log 16.75 134 0.9152 0.5603 0.9152 0.9566
No log 17.0 136 0.8803 0.4806 0.8803 0.9382
No log 17.25 138 0.8733 0.5025 0.8733 0.9345
No log 17.5 140 0.8375 0.5463 0.8375 0.9151
No log 17.75 142 0.8819 0.4906 0.8819 0.9391
No log 18.0 144 1.1578 0.4655 1.1578 1.0760
No log 18.25 146 1.0645 0.3946 1.0645 1.0317
No log 18.5 148 0.8831 0.5139 0.8831 0.9397
No log 18.75 150 0.7693 0.5404 0.7693 0.8771
No log 19.0 152 0.7527 0.5565 0.7527 0.8676
No log 19.25 154 0.8126 0.5793 0.8126 0.9015
No log 19.5 156 0.9903 0.4448 0.9903 0.9952
No log 19.75 158 0.9666 0.4428 0.9666 0.9831
No log 20.0 160 0.8034 0.5862 0.8034 0.8963
No log 20.25 162 0.7802 0.5645 0.7802 0.8833
No log 20.5 164 0.7747 0.5837 0.7747 0.8802
No log 20.75 166 0.8315 0.5551 0.8315 0.9118
No log 21.0 168 0.8592 0.5554 0.8592 0.9270
No log 21.25 170 0.9441 0.4861 0.9441 0.9716
No log 21.5 172 0.9251 0.5016 0.9251 0.9618
No log 21.75 174 0.9171 0.4913 0.9171 0.9577
No log 22.0 176 0.8784 0.5641 0.8784 0.9372
No log 22.25 178 0.8564 0.5352 0.8564 0.9254
No log 22.5 180 0.8373 0.5409 0.8373 0.9150
No log 22.75 182 0.8252 0.5200 0.8252 0.9084
No log 23.0 184 0.9026 0.4463 0.9026 0.9501
No log 23.25 186 0.9600 0.4205 0.9600 0.9798
No log 23.5 188 0.9183 0.4435 0.9183 0.9583
No log 23.75 190 0.9126 0.4932 0.9126 0.9553
No log 24.0 192 0.8838 0.5110 0.8838 0.9401
No log 24.25 194 0.9047 0.5237 0.9047 0.9512
No log 24.5 196 1.0024 0.4700 1.0024 1.0012
No log 24.75 198 1.0064 0.4866 1.0064 1.0032
No log 25.0 200 1.0637 0.4990 1.0637 1.0314
No log 25.25 202 0.9955 0.4809 0.9955 0.9977
No log 25.5 204 0.8945 0.4855 0.8945 0.9458
No log 25.75 206 0.7885 0.5634 0.7885 0.8880
No log 26.0 208 0.8054 0.4859 0.8054 0.8975
No log 26.25 210 0.8312 0.4385 0.8312 0.9117
No log 26.5 212 0.8282 0.4433 0.8282 0.9101
No log 26.75 214 0.8717 0.4578 0.8717 0.9336
No log 27.0 216 1.2040 0.3824 1.2040 1.0973
No log 27.25 218 1.5052 0.3918 1.5052 1.2269
No log 27.5 220 1.5149 0.3504 1.5149 1.2308
No log 27.75 222 1.2523 0.4182 1.2523 1.1191
No log 28.0 224 1.0269 0.4532 1.0269 1.0134
No log 28.25 226 0.9745 0.4509 0.9745 0.9872
No log 28.5 228 0.9076 0.5006 0.9076 0.9527
No log 28.75 230 0.9358 0.4484 0.9358 0.9674
No log 29.0 232 1.0593 0.3995 1.0593 1.0292
No log 29.25 234 1.0787 0.4246 1.0787 1.0386
No log 29.5 236 1.0186 0.3998 1.0186 1.0093
No log 29.75 238 0.8916 0.4763 0.8916 0.9442
No log 30.0 240 0.8340 0.5102 0.8340 0.9132
No log 30.25 242 0.8142 0.4945 0.8142 0.9023
No log 30.5 244 0.8207 0.5226 0.8207 0.9059
No log 30.75 246 0.8662 0.4567 0.8662 0.9307
No log 31.0 248 0.9320 0.4976 0.9320 0.9654
No log 31.25 250 0.9040 0.4613 0.9040 0.9508
No log 31.5 252 0.8891 0.5188 0.8891 0.9429
No log 31.75 254 0.8791 0.5317 0.8791 0.9376
No log 32.0 256 0.8531 0.5317 0.8531 0.9236
No log 32.25 258 0.8350 0.5517 0.8350 0.9138
No log 32.5 260 0.8539 0.5618 0.8539 0.9241
No log 32.75 262 0.9679 0.3839 0.9679 0.9838
No log 33.0 264 1.1246 0.3835 1.1246 1.0605
No log 33.25 266 1.2216 0.3857 1.2216 1.1053
No log 33.5 268 1.0941 0.4255 1.0941 1.0460
No log 33.75 270 0.9171 0.4401 0.9171 0.9576
No log 34.0 272 0.8228 0.5781 0.8228 0.9071
No log 34.25 274 0.7930 0.5026 0.7930 0.8905
No log 34.5 276 0.8084 0.5622 0.8084 0.8991
No log 34.75 278 0.7958 0.4789 0.7958 0.8921
No log 35.0 280 0.8049 0.5536 0.8049 0.8972
No log 35.25 282 0.8633 0.5376 0.8633 0.9291
No log 35.5 284 1.0549 0.3942 1.0549 1.0271
No log 35.75 286 1.1978 0.3645 1.1978 1.0944
No log 36.0 288 1.2553 0.3571 1.2553 1.1204
No log 36.25 290 1.1810 0.3824 1.1810 1.0867
No log 36.5 292 1.0844 0.3630 1.0844 1.0413
No log 36.75 294 0.9871 0.4012 0.9871 0.9935
No log 37.0 296 0.8997 0.4271 0.8997 0.9485
No log 37.25 298 0.8708 0.4794 0.8708 0.9331
No log 37.5 300 0.8953 0.4271 0.8953 0.9462
No log 37.75 302 0.9568 0.4404 0.9568 0.9782
No log 38.0 304 1.0529 0.4347 1.0529 1.0261
No log 38.25 306 1.2103 0.3646 1.2103 1.1001
No log 38.5 308 1.2671 0.3726 1.2671 1.1257
No log 38.75 310 1.1847 0.3946 1.1847 1.0884
No log 39.0 312 1.0326 0.4538 1.0326 1.0162
No log 39.25 314 0.9513 0.5150 0.9513 0.9753
No log 39.5 316 0.9620 0.4866 0.9620 0.9808
No log 39.75 318 1.0279 0.4598 1.0279 1.0139
No log 40.0 320 1.1523 0.4199 1.1523 1.0734
No log 40.25 322 1.1702 0.3755 1.1702 1.0818
No log 40.5 324 1.1627 0.4015 1.1627 1.0783
No log 40.75 326 1.0712 0.3639 1.0712 1.0350
No log 41.0 328 0.9643 0.4861 0.9643 0.9820
No log 41.25 330 0.8965 0.4632 0.8965 0.9468
No log 41.5 332 0.8868 0.4624 0.8868 0.9417
No log 41.75 334 0.9028 0.5110 0.9028 0.9501
No log 42.0 336 0.9064 0.5325 0.9064 0.9521
No log 42.25 338 0.9021 0.5318 0.9021 0.9498
No log 42.5 340 0.9023 0.5318 0.9023 0.9499
No log 42.75 342 0.8843 0.4657 0.8843 0.9404
No log 43.0 344 0.8901 0.4657 0.8901 0.9435
No log 43.25 346 0.9265 0.5247 0.9265 0.9625
No log 43.5 348 0.9149 0.5127 0.9149 0.9565
No log 43.75 350 0.8778 0.4624 0.8778 0.9369
No log 44.0 352 0.8709 0.4944 0.8709 0.9332
No log 44.25 354 0.8484 0.4815 0.8484 0.9211
No log 44.5 356 0.8323 0.4782 0.8323 0.9123
No log 44.75 358 0.8402 0.4782 0.8402 0.9166
No log 45.0 360 0.8582 0.4815 0.8582 0.9264
No log 45.25 362 0.9190 0.4949 0.9190 0.9587
No log 45.5 364 1.0056 0.4721 1.0056 1.0028
No log 45.75 366 1.0515 0.4758 1.0515 1.0254
No log 46.0 368 1.0822 0.4708 1.0822 1.0403
No log 46.25 370 1.0578 0.4590 1.0578 1.0285
No log 46.5 372 0.9788 0.4915 0.9788 0.9893
No log 46.75 374 0.9405 0.4915 0.9405 0.9698
No log 47.0 376 0.8930 0.4794 0.8930 0.9450
No log 47.25 378 0.8922 0.4774 0.8922 0.9446
No log 47.5 380 0.9413 0.5036 0.9413 0.9702
No log 47.75 382 0.9714 0.4961 0.9714 0.9856
No log 48.0 384 1.0071 0.4804 1.0071 1.0035
No log 48.25 386 1.0797 0.4428 1.0797 1.0391
No log 48.5 388 1.0980 0.4480 1.0980 1.0479
No log 48.75 390 1.0297 0.4428 1.0297 1.0148
No log 49.0 392 0.9716 0.5004 0.9716 0.9857
No log 49.25 394 0.9437 0.5256 0.9437 0.9714
No log 49.5 396 0.9256 0.5036 0.9256 0.9621
No log 49.75 398 0.9372 0.5156 0.9372 0.9681
No log 50.0 400 0.9894 0.4556 0.9894 0.9947
No log 50.25 402 1.0141 0.4259 1.0141 1.0070
No log 50.5 404 0.9812 0.4855 0.9812 0.9905
No log 50.75 406 0.9117 0.5133 0.9117 0.9548
No log 51.0 408 0.8936 0.5007 0.8936 0.9453
No log 51.25 410 0.9016 0.5133 0.9016 0.9495
No log 51.5 412 0.9113 0.4817 0.9113 0.9546
No log 51.75 414 0.8850 0.4880 0.8850 0.9408
No log 52.0 416 0.8672 0.4752 0.8672 0.9313
No log 52.25 418 0.8937 0.4880 0.8937 0.9454
No log 52.5 420 0.8963 0.4898 0.8963 0.9467
No log 52.75 422 0.8743 0.4774 0.8743 0.9350
No log 53.0 424 0.8847 0.4774 0.8847 0.9406
No log 53.25 426 0.9431 0.4961 0.9431 0.9711
No log 53.5 428 0.9779 0.4785 0.9779 0.9889
No log 53.75 430 1.0000 0.4698 1.0000 1.0000
No log 54.0 432 1.0029 0.4468 1.0029 1.0014
No log 54.25 434 0.9708 0.4556 0.9708 0.9853
No log 54.5 436 0.9015 0.4893 0.9015 0.9495
No log 54.75 438 0.8529 0.4500 0.8529 0.9235
No log 55.0 440 0.8375 0.4908 0.8375 0.9151
No log 55.25 442 0.8331 0.4908 0.8331 0.9127
No log 55.5 444 0.8405 0.4719 0.8405 0.9168
No log 55.75 446 0.8847 0.5022 0.8847 0.9406
No log 56.0 448 0.9268 0.4556 0.9268 0.9627
No log 56.25 450 0.9232 0.4435 0.9232 0.9608
No log 56.5 452 0.9328 0.4440 0.9328 0.9658
No log 56.75 454 0.9689 0.4826 0.9689 0.9844
No log 57.0 456 1.0099 0.4556 1.0099 1.0049
No log 57.25 458 1.0508 0.3945 1.0508 1.0251
No log 57.5 460 1.0230 0.4758 1.0230 1.0114
No log 57.75 462 1.0255 0.4758 1.0255 1.0127
No log 58.0 464 1.0478 0.4651 1.0478 1.0236
No log 58.25 466 1.0857 0.4524 1.0857 1.0420
No log 58.5 468 1.1265 0.4111 1.1265 1.0614
No log 58.75 470 1.1181 0.4111 1.1181 1.0574
No log 59.0 472 1.1395 0.4139 1.1395 1.0675
No log 59.25 474 1.1110 0.4182 1.1110 1.0540
No log 59.5 476 1.0397 0.4833 1.0397 1.0196
No log 59.75 478 0.9836 0.4745 0.9836 0.9918
No log 60.0 480 0.9220 0.4817 0.9220 0.9602
No log 60.25 482 0.8869 0.4563 0.8869 0.9417
No log 60.5 484 0.8820 0.4563 0.8820 0.9391
No log 60.75 486 0.8635 0.4308 0.8635 0.9292
No log 61.0 488 0.8525 0.4500 0.8525 0.9233
No log 61.25 490 0.8788 0.4563 0.8788 0.9374
No log 61.5 492 0.9436 0.4723 0.9436 0.9714
No log 61.75 494 1.0486 0.4297 1.0486 1.0240
No log 62.0 496 1.1436 0.4356 1.1436 1.0694
No log 62.25 498 1.2063 0.4000 1.2063 1.0983
0.2315 62.5 500 1.1723 0.4307 1.1723 1.0827
0.2315 62.75 502 1.1222 0.4281 1.1222 1.0593
0.2315 63.0 504 1.0574 0.4297 1.0574 1.0283
0.2315 63.25 506 1.0119 0.4520 1.0119 1.0059
0.2315 63.5 508 0.9820 0.4226 0.9820 0.9910
0.2315 63.75 510 0.9627 0.4475 0.9627 0.9812
0.2315 64.0 512 0.9515 0.4475 0.9515 0.9755
0.2315 64.25 514 0.9622 0.4475 0.9622 0.9809
0.2315 64.5 516 0.9953 0.3972 0.9953 0.9977
0.2315 64.75 518 1.0360 0.3923 1.0360 1.0179
0.2315 65.0 520 1.0707 0.4323 1.0707 1.0348
0.2315 65.25 522 1.1110 0.3832 1.1110 1.0540
0.2315 65.5 524 1.1514 0.3867 1.1514 1.0730
0.2315 65.75 526 1.1720 0.3902 1.1720 1.0826
0.2315 66.0 528 1.1694 0.3646 1.1694 1.0814
0.2315 66.25 530 1.1427 0.3792 1.1427 1.0690

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
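
To reproduce this environment, the pinned versions above can be installed as follows; the CUDA 11.8 wheel index URL is an assumption inferred from the `+cu118` build tag.

```shell
pip install transformers==4.44.2 datasets==2.21.0 tokenizers==0.19.1
# +cu118 builds come from the PyTorch CUDA 11.8 wheel index
pip install torch==2.4.0 --index-url https://download.pytorch.org/whl/cu118
```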

Model details

  • Full model ID: MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k2_task2_organization
  • Model size: 0.1B params (Safetensors, F32 tensors)
  • Fine-tuned from: aubmindlab/bert-base-arabertv02