ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k12_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the auto-generated card reports it as "None"). It achieves the following results on the evaluation set:

  • Loss: 0.7857
  • Qwk (quadratic weighted kappa): 0.5360
  • Mse (mean squared error): 0.7857
  • Rmse (root mean squared error): 0.8864
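Qwk is the quadratic weighted kappa commonly used to score ordinal predictions such as essay ratings, and Rmse is simply the square root of Mse (√0.7857 ≈ 0.8864). A minimal pure-Python sketch of the metric, assuming integer labels (the function name is illustrative, not the card's actual evaluation code):

```python
import math

def quadratic_weighted_kappa(actual, predicted, min_rating, max_rating):
    """Quadratic weighted kappa for ordinal labels in [min_rating, max_rating]."""
    k = max_rating - min_rating + 1
    n = len(actual)
    # Observed co-occurrence matrix O and marginal histograms.
    O = [[0] * k for _ in range(k)]
    hist_a = [0] * k
    hist_p = [0] * k
    for a, p in zip(actual, predicted):
        O[a - min_rating][p - min_rating] += 1
        hist_a[a - min_rating] += 1
        hist_p[p - min_rating] += 1
    num = den = 0.0
    for i in range(k):
        for j in range(k):
            w = (i - j) ** 2 / (k - 1) ** 2  # quadratic disagreement weight
            e = hist_a[i] * hist_p[j] / n    # expected count under independence
            num += w * O[i][j]
            den += w * e
    return 1.0 - num / den

# Perfect agreement gives kappa = 1.0; fully reversed ordinal labels give -1.0.
print(quadratic_weighted_kappa([1, 2, 3, 4], [1, 2, 3, 4], 1, 4))  # -> 1.0
print(math.sqrt(0.7857))  # ≈ 0.8864, matching the reported Rmse
```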

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
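These settings map directly onto Hugging Face Trainer arguments; a sketch of the corresponding configuration (keys follow transformers.TrainingArguments; the output directory is a placeholder, not from the card):

```python
# Hyperparameters from this card, expressed as transformers.TrainingArguments
# keyword arguments. Adam with betas=(0.9, 0.999) and epsilon=1e-08, and the
# linear scheduler, are the Trainer defaults for these fields.
training_config = {
    "output_dir": "./arabert-task2-organization",  # placeholder path
    "learning_rate": 2e-05,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-08,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 100,
}

# With transformers installed, this unpacks directly:
# args = transformers.TrainingArguments(**training_config)
```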

Training results

In the table below, "No log" in the Training Loss column means no training loss had been logged yet at that evaluation step (the first logged value, 0.3122, appears at step 500).

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0455 2 4.5038 -0.0191 4.5038 2.1222
No log 0.0909 4 2.7599 -0.0135 2.7599 1.6613
No log 0.1364 6 2.1242 -0.0370 2.1242 1.4575
No log 0.1818 8 1.5877 0.0062 1.5877 1.2600
No log 0.2273 10 1.4014 0.0549 1.4014 1.1838
No log 0.2727 12 1.2472 0.0811 1.2472 1.1168
No log 0.3182 14 1.4203 0.1064 1.4203 1.1918
No log 0.3636 16 1.5551 0.1409 1.5551 1.2470
No log 0.4091 18 1.5613 0.0833 1.5613 1.2495
No log 0.4545 20 1.7730 0.1006 1.7730 1.3315
No log 0.5 22 2.2146 0.1166 2.2146 1.4882
No log 0.5455 24 2.0192 0.1331 2.0192 1.4210
No log 0.5909 26 1.7222 0.1459 1.7222 1.3123
No log 0.6364 28 1.4136 0.1418 1.4136 1.1890
No log 0.6818 30 1.2506 0.2071 1.2506 1.1183
No log 0.7273 32 1.1978 0.1928 1.1978 1.0944
No log 0.7727 34 1.3059 0.1570 1.3059 1.1428
No log 0.8182 36 1.5098 0.1387 1.5098 1.2287
No log 0.8636 38 1.5091 0.1387 1.5091 1.2284
No log 0.9091 40 1.3932 0.1899 1.3932 1.1803
No log 0.9545 42 1.2057 0.2014 1.2057 1.0980
No log 1.0 44 1.1551 0.2054 1.1551 1.0748
No log 1.0455 46 1.2896 0.1928 1.2896 1.1356
No log 1.0909 48 1.7425 0.2553 1.7425 1.3200
No log 1.1364 50 2.0456 0.2352 2.0456 1.4302
No log 1.1818 52 1.9760 0.1935 1.9760 1.4057
No log 1.2273 54 1.8440 0.1851 1.8440 1.3579
No log 1.2727 56 1.5051 0.1522 1.5051 1.2268
No log 1.3182 58 1.1765 0.1790 1.1765 1.0846
No log 1.3636 60 1.0585 0.3269 1.0585 1.0288
No log 1.4091 62 0.9736 0.3814 0.9736 0.9867
No log 1.4545 64 0.9562 0.3771 0.9562 0.9778
No log 1.5 66 0.9043 0.4220 0.9043 0.9509
No log 1.5455 68 0.9076 0.4197 0.9076 0.9527
No log 1.5909 70 1.0027 0.3612 1.0027 1.0014
No log 1.6364 72 1.4830 0.2863 1.4830 1.2178
No log 1.6818 74 1.8279 0.2871 1.8279 1.3520
No log 1.7273 76 1.6993 0.3113 1.6993 1.3036
No log 1.7727 78 1.2592 0.3135 1.2592 1.1221
No log 1.8182 80 0.9373 0.4020 0.9373 0.9682
No log 1.8636 82 0.9295 0.4076 0.9295 0.9641
No log 1.9091 84 0.9888 0.3447 0.9888 0.9944
No log 1.9545 86 1.1619 0.2532 1.1619 1.0779
No log 2.0 88 1.1797 0.2532 1.1797 1.0861
No log 2.0455 90 1.0921 0.2657 1.0921 1.0451
No log 2.0909 92 0.9485 0.3602 0.9485 0.9739
No log 2.1364 94 0.9659 0.3996 0.9659 0.9828
No log 2.1818 96 0.9731 0.4235 0.9731 0.9865
No log 2.2273 98 1.0914 0.2191 1.0914 1.0447
No log 2.2727 100 1.2093 0.1884 1.2093 1.0997
No log 2.3182 102 1.0753 0.2417 1.0753 1.0370
No log 2.3636 104 0.9421 0.4861 0.9421 0.9706
No log 2.4091 106 0.8001 0.5203 0.8001 0.8945
No log 2.4545 108 0.7724 0.4951 0.7724 0.8788
No log 2.5 110 0.8177 0.5167 0.8177 0.9043
No log 2.5455 112 0.8791 0.5157 0.8791 0.9376
No log 2.5909 114 0.9749 0.3347 0.9749 0.9874
No log 2.6364 116 1.0838 0.3044 1.0838 1.0410
No log 2.6818 118 0.9817 0.4290 0.9817 0.9908
No log 2.7273 120 0.7032 0.5410 0.7032 0.8386
No log 2.7727 122 0.7139 0.5543 0.7139 0.8449
No log 2.8182 124 0.7559 0.5830 0.7559 0.8694
No log 2.8636 126 0.7975 0.5294 0.7975 0.8930
No log 2.9091 128 0.8035 0.5683 0.8035 0.8964
No log 2.9545 130 0.8411 0.4483 0.8411 0.9171
No log 3.0 132 0.7632 0.5684 0.7632 0.8736
No log 3.0455 134 0.8782 0.5305 0.8782 0.9371
No log 3.0909 136 0.8608 0.5667 0.8608 0.9278
No log 3.1364 138 0.7723 0.4947 0.7723 0.8788
No log 3.1818 140 0.7859 0.4780 0.7859 0.8865
No log 3.2273 142 0.8613 0.4563 0.8613 0.9281
No log 3.2727 144 0.9738 0.5081 0.9738 0.9868
No log 3.3182 146 0.8623 0.4503 0.8623 0.9286
No log 3.3636 148 0.7801 0.5711 0.7801 0.8832
No log 3.4091 150 0.7893 0.5974 0.7893 0.8884
No log 3.4545 152 0.7928 0.5082 0.7928 0.8904
No log 3.5 154 0.8111 0.6015 0.8111 0.9006
No log 3.5455 156 0.7959 0.5056 0.7959 0.8921
No log 3.5909 158 0.9206 0.4689 0.9206 0.9595
No log 3.6364 160 1.0842 0.4653 1.0842 1.0412
No log 3.6818 162 0.9546 0.4009 0.9546 0.9770
No log 3.7273 164 0.8350 0.3411 0.8350 0.9138
No log 3.7727 166 0.8252 0.4571 0.8252 0.9084
No log 3.8182 168 0.8119 0.4397 0.8119 0.9010
No log 3.8636 170 0.8154 0.4334 0.8154 0.9030
No log 3.9091 172 0.7770 0.4860 0.7770 0.8815
No log 3.9545 174 0.7817 0.4860 0.7817 0.8842
No log 4.0 176 0.8667 0.4039 0.8667 0.9310
No log 4.0455 178 0.9514 0.3886 0.9514 0.9754
No log 4.0909 180 0.8377 0.4371 0.8377 0.9152
No log 4.1364 182 0.7957 0.5337 0.7957 0.8920
No log 4.1818 184 0.9120 0.4572 0.9120 0.9550
No log 4.2273 186 0.8732 0.4952 0.8732 0.9345
No log 4.2727 188 0.7877 0.5079 0.7877 0.8875
No log 4.3182 190 1.0081 0.5004 1.0081 1.0040
No log 4.3636 192 1.1891 0.4595 1.1891 1.0905
No log 4.4091 194 1.0993 0.5028 1.0993 1.0485
No log 4.4545 196 0.8989 0.4023 0.8989 0.9481
No log 4.5 198 0.8546 0.4000 0.8546 0.9245
No log 4.5455 200 0.8604 0.3814 0.8604 0.9276
No log 4.5909 202 0.8481 0.4220 0.8481 0.9210
No log 4.6364 204 0.8487 0.3787 0.8487 0.9212
No log 4.6818 206 0.8555 0.3493 0.8555 0.9249
No log 4.7273 208 0.8590 0.2942 0.8590 0.9268
No log 4.7727 210 0.8498 0.3804 0.8498 0.9218
No log 4.8182 212 0.8285 0.3744 0.8285 0.9102
No log 4.8636 214 0.8102 0.4498 0.8102 0.9001
No log 4.9091 216 0.8207 0.4158 0.8207 0.9059
No log 4.9545 218 0.8514 0.3806 0.8514 0.9227
No log 5.0 220 0.8317 0.3866 0.8317 0.9120
No log 5.0455 222 0.8075 0.4098 0.8075 0.8986
No log 5.0909 224 0.8235 0.4861 0.8235 0.9075
No log 5.1364 226 0.9207 0.5094 0.9207 0.9595
No log 5.1818 228 0.8300 0.4880 0.8300 0.9110
No log 5.2273 230 0.7377 0.5234 0.7377 0.8589
No log 5.2727 232 0.7710 0.5721 0.7710 0.8781
No log 5.3182 234 0.7379 0.5012 0.7379 0.8590
No log 5.3636 236 0.7780 0.4606 0.7780 0.8820
No log 5.4091 238 0.9417 0.5069 0.9417 0.9704
No log 5.4545 240 0.8936 0.4656 0.8936 0.9453
No log 5.5 242 0.7376 0.5175 0.7376 0.8588
No log 5.5455 244 0.8322 0.5789 0.8322 0.9123
No log 5.5909 246 0.8619 0.5665 0.8619 0.9284
No log 5.6364 248 0.7860 0.5012 0.7860 0.8866
No log 5.6818 250 0.7784 0.4472 0.7784 0.8823
No log 5.7273 252 0.7759 0.4712 0.7759 0.8809
No log 5.7727 254 0.7772 0.5012 0.7772 0.8816
No log 5.8182 256 0.7829 0.6035 0.7829 0.8848
No log 5.8636 258 0.7619 0.5748 0.7619 0.8729
No log 5.9091 260 0.7714 0.4158 0.7714 0.8783
No log 5.9545 262 0.7991 0.4681 0.7991 0.8940
No log 6.0 264 0.7511 0.4817 0.7511 0.8666
No log 6.0455 266 0.7936 0.5442 0.7936 0.8908
No log 6.0909 268 0.8282 0.4700 0.8282 0.9101
No log 6.1364 270 0.7461 0.6035 0.7461 0.8638
No log 6.1818 272 0.7276 0.5735 0.7276 0.8530
No log 6.2273 274 0.7515 0.5195 0.7515 0.8669
No log 6.2727 276 0.7763 0.5079 0.7763 0.8811
No log 6.3182 278 0.8199 0.4401 0.8199 0.9055
No log 6.3636 280 0.8711 0.4347 0.8711 0.9333
No log 6.4091 282 0.8268 0.4797 0.8268 0.9093
No log 6.4545 284 0.7954 0.4286 0.7954 0.8918
No log 6.5 286 0.7950 0.4769 0.7950 0.8916
No log 6.5455 288 0.7735 0.5046 0.7735 0.8795
No log 6.5909 290 0.7769 0.5061 0.7769 0.8814
No log 6.6364 292 0.7725 0.5505 0.7725 0.8789
No log 6.6818 294 0.7665 0.5606 0.7665 0.8755
No log 6.7273 296 0.7946 0.4100 0.7946 0.8914
No log 6.7727 298 0.8530 0.4347 0.8530 0.9236
No log 6.8182 300 0.8267 0.4054 0.8267 0.9093
No log 6.8636 302 0.7797 0.5502 0.7797 0.8830
No log 6.9091 304 0.7737 0.6043 0.7737 0.8796
No log 6.9545 306 0.7690 0.5163 0.7690 0.8769
No log 7.0 308 0.7669 0.5061 0.7669 0.8757
No log 7.0455 310 0.7528 0.5831 0.7528 0.8677
No log 7.0909 312 0.7650 0.5312 0.7650 0.8746
No log 7.1364 314 0.8490 0.4530 0.8490 0.9214
No log 7.1818 316 0.8395 0.4713 0.8395 0.9162
No log 7.2273 318 0.7809 0.4563 0.7809 0.8837
No log 7.2727 320 0.7711 0.5009 0.7711 0.8781
No log 7.3182 322 0.7848 0.4563 0.7848 0.8859
No log 7.3636 324 0.7688 0.5305 0.7688 0.8768
No log 7.4091 326 0.7685 0.5305 0.7685 0.8766
No log 7.4545 328 0.8046 0.4471 0.8046 0.8970
No log 7.5 330 0.8052 0.4137 0.8052 0.8973
No log 7.5455 332 0.7724 0.5061 0.7724 0.8788
No log 7.5909 334 0.7692 0.5163 0.7692 0.8771
No log 7.6364 336 0.7897 0.5205 0.7897 0.8886
No log 7.6818 338 0.9643 0.4020 0.9643 0.9820
No log 7.7273 340 1.0974 0.4027 1.0974 1.0476
No log 7.7727 342 1.0227 0.2937 1.0227 1.0113
No log 7.8182 344 0.9661 0.3790 0.9661 0.9829
No log 7.8636 346 0.9079 0.3441 0.9079 0.9529
No log 7.9091 348 0.8859 0.3615 0.8859 0.9412
No log 7.9545 350 0.8665 0.3513 0.8665 0.9309
No log 8.0 352 0.8658 0.3418 0.8658 0.9305
No log 8.0455 354 0.8383 0.4078 0.8383 0.9156
No log 8.0909 356 0.7854 0.4563 0.7854 0.8862
No log 8.1364 358 0.7412 0.5634 0.7412 0.8609
No log 8.1818 360 0.7554 0.5548 0.7554 0.8691
No log 8.2273 362 0.7683 0.5472 0.7683 0.8765
No log 8.2727 364 0.7739 0.4839 0.7739 0.8797
No log 8.3182 366 0.8716 0.4116 0.8716 0.9336
No log 8.3636 368 0.9322 0.3601 0.9322 0.9655
No log 8.4091 370 0.8444 0.4344 0.8444 0.9189
No log 8.4545 372 0.7465 0.5534 0.7465 0.8640
No log 8.5 374 0.7811 0.5495 0.7811 0.8838
No log 8.5455 376 0.8309 0.5414 0.8309 0.9115
No log 8.5909 378 0.7844 0.6108 0.7844 0.8856
No log 8.6364 380 0.7527 0.5093 0.7527 0.8676
No log 8.6818 382 0.7950 0.4500 0.7950 0.8916
No log 8.7273 384 0.7783 0.4570 0.7783 0.8822
No log 8.7727 386 0.7851 0.4942 0.7851 0.8860
No log 8.8182 388 0.8036 0.5949 0.8036 0.8964
No log 8.8636 390 0.7985 0.5393 0.7985 0.8936
No log 8.9091 392 0.7923 0.4534 0.7923 0.8901
No log 8.9545 394 0.7836 0.4730 0.7836 0.8852
No log 9.0 396 0.7659 0.4529 0.7659 0.8752
No log 9.0455 398 0.7654 0.4785 0.7654 0.8749
No log 9.0909 400 0.7213 0.5470 0.7213 0.8493
No log 9.1364 402 0.7093 0.6059 0.7093 0.8422
No log 9.1818 404 0.7502 0.5521 0.7502 0.8661
No log 9.2273 406 0.7958 0.5473 0.7958 0.8921
No log 9.2727 408 0.7633 0.6121 0.7633 0.8737
No log 9.3182 410 0.7595 0.5505 0.7595 0.8715
No log 9.3636 412 0.7935 0.4054 0.7935 0.8908
No log 9.4091 414 0.8113 0.4051 0.8113 0.9007
No log 9.4545 416 0.8048 0.4157 0.8048 0.8971
No log 9.5 418 0.7978 0.4588 0.7978 0.8932
No log 9.5455 420 0.8047 0.5292 0.8047 0.8970
No log 9.5909 422 0.7741 0.6035 0.7741 0.8799
No log 9.6364 424 0.7394 0.6043 0.7394 0.8599
No log 9.6818 426 0.7363 0.5215 0.7363 0.8581
No log 9.7273 428 0.7593 0.4299 0.7593 0.8714
No log 9.7727 430 0.7444 0.4748 0.7444 0.8628
No log 9.8182 432 0.7598 0.4371 0.7598 0.8716
No log 9.8636 434 0.7721 0.4540 0.7721 0.8787
No log 9.9091 436 0.7827 0.5361 0.7827 0.8847
No log 9.9545 438 0.7461 0.5739 0.7461 0.8638
No log 10.0 440 0.7241 0.4671 0.7241 0.8510
No log 10.0455 442 0.7288 0.4570 0.7288 0.8537
No log 10.0909 444 0.7203 0.6144 0.7203 0.8487
No log 10.1364 446 0.7384 0.5957 0.7384 0.8593
No log 10.1818 448 0.7696 0.5112 0.7696 0.8772
No log 10.2273 450 0.8120 0.4645 0.8120 0.9011
No log 10.2727 452 0.8532 0.4237 0.8532 0.9237
No log 10.3182 454 0.8775 0.4237 0.8775 0.9367
No log 10.3636 456 0.8669 0.4051 0.8669 0.9311
No log 10.4091 458 0.8116 0.4157 0.8116 0.9009
No log 10.4545 460 0.7533 0.5429 0.7533 0.8680
No log 10.5 462 0.7368 0.4993 0.7368 0.8584
No log 10.5455 464 0.7597 0.4465 0.7597 0.8716
No log 10.5909 466 0.7394 0.5236 0.7394 0.8599
No log 10.6364 468 0.7274 0.5966 0.7274 0.8529
No log 10.6818 470 0.7349 0.5940 0.7349 0.8573
No log 10.7273 472 0.7299 0.5871 0.7299 0.8544
No log 10.7727 474 0.7408 0.4772 0.7408 0.8607
No log 10.8182 476 0.7772 0.4741 0.7772 0.8816
No log 10.8636 478 0.7609 0.4841 0.7609 0.8723
No log 10.9091 480 0.7166 0.5357 0.7166 0.8465
No log 10.9545 482 0.7159 0.5898 0.7159 0.8461
No log 11.0 484 0.7309 0.6039 0.7309 0.8549
No log 11.0455 486 0.7318 0.5485 0.7318 0.8555
No log 11.0909 488 0.7304 0.4923 0.7304 0.8546
No log 11.1364 490 0.7384 0.4444 0.7384 0.8593
No log 11.1818 492 0.7386 0.5027 0.7386 0.8594
No log 11.2273 494 0.7435 0.5131 0.7435 0.8623
No log 11.2727 496 0.7616 0.5335 0.7616 0.8727
No log 11.3182 498 0.7493 0.5027 0.7493 0.8656
0.3122 11.3636 500 0.7342 0.5561 0.7342 0.8568
0.3122 11.4091 502 0.7392 0.6196 0.7392 0.8598
0.3122 11.4545 504 0.7476 0.6068 0.7476 0.8647
0.3122 11.5 506 0.7452 0.5532 0.7452 0.8633
0.3122 11.5455 508 0.7746 0.5125 0.7746 0.8801
0.3122 11.5909 510 0.8041 0.4175 0.8041 0.8967
0.3122 11.6364 512 0.8144 0.4383 0.8144 0.9025
0.3122 11.6818 514 0.7857 0.5360 0.7857 0.8864
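Note that the final checkpoint (Qwk 0.5360) is not the best one in the table: validation Qwk peaks at 0.6196 around step 502 (epoch 11.41). Picking the best row programmatically is straightforward; the tuples below are a hand-copied subset of the table, shown for illustration:

```python
# (step, validation_qwk) pairs, a hand-copied subset of the table above.
rows = [
    (380, 0.6108),
    (402, 0.6059),
    (408, 0.6121),
    (444, 0.6144),
    (502, 0.6196),
    (514, 0.5360),  # final evaluation
]

best_step, best_qwk = max(rows, key=lambda r: r[1])
print(best_step, best_qwk)  # -> 502 0.6196
```

In the Trainer, setting load_best_model_at_end=True together with metric_for_best_model pointing at the Qwk metric would retain that checkpoint automatically instead of the last one.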

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (safetensors, F32 tensors)