ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k13_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the fine-tuning dataset is not documented). It achieves the following results on the evaluation set:

  • Loss: 0.8716
  • QWK (quadratic weighted kappa): 0.5029
  • MSE: 0.8716
  • RMSE: 0.9336
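
The loss equals the MSE, which indicates the model was trained as a regressor with a mean-squared-error objective. As a minimal, hypothetical sketch of how these metrics could be recomputed with scikit-learn (the score arrays below are placeholders; a regression model's continuous outputs need rounding before QWK):

```python
# Hypothetical sketch: recomputing the metrics above with scikit-learn.
# y_true / y_pred are placeholder arrays, not the actual evaluation data.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 1, 3])            # gold scores (placeholder)
y_pred = np.array([2.8, 2.1, 3.4, 1.9, 3.6])  # raw model outputs (placeholder)

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
# QWK is defined over discrete labels, so continuous predictions are
# rounded (and in practice clipped to the valid score range) first.
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
print(f"MSE={mse:.4f}  RMSE={rmse:.4f}  QWK={qwk:.4f}")
```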

Model description

More information needed

Intended uses & limitations

More information needed
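
As an illustrative sketch only: judging by the model name ("WellWrittenEssays", "task2_organization") and the regression-style metrics (MSE/RMSE alongside QWK), the checkpoint appears to score the organization of Arabic essays with a single-output sequence-classification head. Under that assumption it could be loaded as follows; the essay text is a placeholder, and the head type should be verified against the checkpoint's config.json:

```python
# Hedged sketch: loading the checkpoint for inference. Assumes a
# single-logit regression head; verify num_labels in config.json.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = (
    "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_"
    "FineTuningAraBERT_run2_AugV5_k13_task2_organization"
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # an Arabic essay (placeholder text)
inputs = tokenizer(essay, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(f"Predicted organization score: {score:.2f}")
```

Note that aubmindlab's AraBERT models generally recommend running text through the `ArabertPreprocessor` from the `arabert` package before tokenization; whether that was done during this fine-tune is not documented.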

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
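
A sketch of the corresponding `TrainingArguments` (the model, data pipeline, and metric computation are omitted; `output_dir` is illustrative):

```python
# Sketch: TrainingArguments mirroring the hyperparameters listed above.
# Model, datasets, and compute_metrics are omitted; output_dir is illustrative.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task2_organization",  # illustrative path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,      # Adam betas from the list above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```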

Training results

Evaluation ran every two training steps. The log ends at epoch 6.9730 (step 516), whose metrics match those reported at the top of this card; why training stopped before the configured 100 epochs (e.g. early stopping) is not documented.

| Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:------:|
| No log | 0.0270 | 2 | 4.5371 | 0.0163 | 4.5371 | 2.1301 |
| No log | 0.0541 | 4 | 2.5560 | 0.0554 | 2.5560 | 1.5988 |
| No log | 0.0811 | 6 | 1.6001 | 0.0504 | 1.6001 | 1.2650 |
| No log | 0.1081 | 8 | 1.2120 | 0.1517 | 1.2120 | 1.1009 |
| No log | 0.1351 | 10 | 1.1295 | 0.2233 | 1.1295 | 1.0628 |
| No log | 0.1622 | 12 | 1.1435 | 0.2023 | 1.1435 | 1.0694 |
| No log | 0.1892 | 14 | 1.1249 | 0.1773 | 1.1249 | 1.0606 |
| No log | 0.2162 | 16 | 1.0802 | 0.2257 | 1.0802 | 1.0393 |
| No log | 0.2432 | 18 | 1.0532 | 0.2731 | 1.0532 | 1.0262 |
| No log | 0.2703 | 20 | 1.0315 | 0.3404 | 1.0315 | 1.0156 |
| No log | 0.2973 | 22 | 1.0817 | 0.3411 | 1.0817 | 1.0400 |
| No log | 0.3243 | 24 | 1.1050 | 0.2515 | 1.1050 | 1.0512 |
| No log | 0.3514 | 26 | 0.9852 | 0.4075 | 0.9852 | 0.9926 |
| No log | 0.3784 | 28 | 1.0752 | 0.3955 | 1.0752 | 1.0369 |
| No log | 0.4054 | 30 | 1.6314 | 0.1512 | 1.6314 | 1.2773 |
| No log | 0.4324 | 32 | 1.5090 | 0.1448 | 1.5090 | 1.2284 |
| No log | 0.4595 | 34 | 1.0883 | 0.3086 | 1.0883 | 1.0432 |
| No log | 0.4865 | 36 | 1.0113 | 0.3151 | 1.0113 | 1.0056 |
| No log | 0.5135 | 38 | 0.9780 | 0.3462 | 0.9780 | 0.9890 |
| No log | 0.5405 | 40 | 1.0241 | 0.3798 | 1.0241 | 1.0120 |
| No log | 0.5676 | 42 | 0.9280 | 0.4945 | 0.9280 | 0.9633 |
| No log | 0.5946 | 44 | 0.9652 | 0.3816 | 0.9652 | 0.9824 |
| No log | 0.6216 | 46 | 1.0881 | 0.4414 | 1.0881 | 1.0431 |
| No log | 0.6486 | 48 | 1.0700 | 0.4722 | 1.0700 | 1.0344 |
| No log | 0.6757 | 50 | 0.8942 | 0.4726 | 0.8942 | 0.9456 |
| No log | 0.7027 | 52 | 0.9880 | 0.4057 | 0.9880 | 0.9940 |
| No log | 0.7297 | 54 | 1.0929 | 0.3620 | 1.0929 | 1.0454 |
| No log | 0.7568 | 56 | 1.0165 | 0.4776 | 1.0165 | 1.0082 |
| No log | 0.7838 | 58 | 0.7943 | 0.6023 | 0.7943 | 0.8912 |
| No log | 0.8108 | 60 | 0.9954 | 0.5192 | 0.9954 | 0.9977 |
| No log | 0.8378 | 62 | 1.1706 | 0.5409 | 1.1706 | 1.0819 |
| No log | 0.8649 | 64 | 0.9111 | 0.5077 | 0.9111 | 0.9545 |
| No log | 0.8919 | 66 | 0.7321 | 0.5797 | 0.7321 | 0.8556 |
| No log | 0.9189 | 68 | 0.8754 | 0.5388 | 0.8754 | 0.9356 |
| No log | 0.9459 | 70 | 1.1108 | 0.3741 | 1.1108 | 1.0540 |
| No log | 0.9730 | 72 | 1.0038 | 0.4329 | 1.0038 | 1.0019 |
| No log | 1.0 | 74 | 0.7634 | 0.6117 | 0.7634 | 0.8737 |
| No log | 1.0270 | 76 | 0.7251 | 0.6175 | 0.7251 | 0.8515 |
| No log | 1.0541 | 78 | 0.8743 | 0.6073 | 0.8743 | 0.9351 |
| No log | 1.0811 | 80 | 0.8879 | 0.5930 | 0.8879 | 0.9423 |
| No log | 1.1081 | 82 | 0.7728 | 0.6073 | 0.7728 | 0.8791 |
| No log | 1.1351 | 84 | 0.6930 | 0.6748 | 0.6930 | 0.8325 |
| No log | 1.1622 | 86 | 0.7908 | 0.5920 | 0.7908 | 0.8892 |
| No log | 1.1892 | 88 | 0.7706 | 0.5655 | 0.7706 | 0.8778 |
| No log | 1.2162 | 90 | 0.6836 | 0.6176 | 0.6836 | 0.8268 |
| No log | 1.2432 | 92 | 0.6811 | 0.6019 | 0.6811 | 0.8253 |
| No log | 1.2703 | 94 | 0.7016 | 0.6025 | 0.7016 | 0.8376 |
| No log | 1.2973 | 96 | 0.8474 | 0.4621 | 0.8474 | 0.9205 |
| No log | 1.3243 | 98 | 0.7566 | 0.5840 | 0.7566 | 0.8698 |
| No log | 1.3514 | 100 | 0.6996 | 0.6272 | 0.6996 | 0.8364 |
| No log | 1.3784 | 102 | 0.9095 | 0.6102 | 0.9095 | 0.9537 |
| No log | 1.4054 | 104 | 1.1709 | 0.5241 | 1.1709 | 1.0821 |
| No log | 1.4324 | 106 | 1.0962 | 0.5579 | 1.0962 | 1.0470 |
| No log | 1.4595 | 108 | 0.8729 | 0.6146 | 0.8729 | 0.9343 |
| No log | 1.4865 | 110 | 0.7288 | 0.6443 | 0.7288 | 0.8537 |
| No log | 1.5135 | 112 | 0.7437 | 0.6413 | 0.7437 | 0.8624 |
| No log | 1.5405 | 114 | 0.8208 | 0.5733 | 0.8208 | 0.9060 |
| No log | 1.5676 | 116 | 0.7586 | 0.5757 | 0.7586 | 0.8710 |
| No log | 1.5946 | 118 | 0.7500 | 0.6004 | 0.7500 | 0.8660 |
| No log | 1.6216 | 120 | 0.6728 | 0.6332 | 0.6728 | 0.8202 |
| No log | 1.6486 | 122 | 0.6779 | 0.6061 | 0.6779 | 0.8233 |
| No log | 1.6757 | 124 | 0.6897 | 0.5971 | 0.6897 | 0.8305 |
| No log | 1.7027 | 126 | 0.7158 | 0.5963 | 0.7158 | 0.8461 |
| No log | 1.7297 | 128 | 0.7076 | 0.6746 | 0.7076 | 0.8412 |
| No log | 1.7568 | 130 | 0.8323 | 0.5766 | 0.8323 | 0.9123 |
| No log | 1.7838 | 132 | 1.0607 | 0.5199 | 1.0607 | 1.0299 |
| No log | 1.8108 | 134 | 1.1491 | 0.5028 | 1.1491 | 1.0720 |
| No log | 1.8378 | 136 | 0.9945 | 0.5504 | 0.9945 | 0.9972 |
| No log | 1.8649 | 138 | 0.8189 | 0.5779 | 0.8189 | 0.9049 |
| No log | 1.8919 | 140 | 0.6831 | 0.6254 | 0.6831 | 0.8265 |
| No log | 1.9189 | 142 | 0.9512 | 0.4218 | 0.9512 | 0.9753 |
| No log | 1.9459 | 144 | 0.9517 | 0.4974 | 0.9517 | 0.9756 |
| No log | 1.9730 | 146 | 0.8734 | 0.5134 | 0.8734 | 0.9346 |
| No log | 2.0 | 148 | 0.7630 | 0.5551 | 0.7630 | 0.8735 |
| No log | 2.0270 | 150 | 0.7278 | 0.6244 | 0.7278 | 0.8531 |
| No log | 2.0541 | 152 | 0.8882 | 0.5959 | 0.8882 | 0.9424 |
| No log | 2.0811 | 154 | 0.9066 | 0.5444 | 0.9066 | 0.9522 |
| No log | 2.1081 | 156 | 0.8712 | 0.5576 | 0.8712 | 0.9334 |
| No log | 2.1351 | 158 | 0.7476 | 0.5959 | 0.7476 | 0.8646 |
| No log | 2.1622 | 160 | 0.7186 | 0.5553 | 0.7186 | 0.8477 |
| No log | 2.1892 | 162 | 0.7320 | 0.6154 | 0.7320 | 0.8555 |
| No log | 2.2162 | 164 | 0.8252 | 0.5766 | 0.8252 | 0.9084 |
| No log | 2.2432 | 166 | 0.9275 | 0.5614 | 0.9275 | 0.9631 |
| No log | 2.2703 | 168 | 0.9940 | 0.4977 | 0.9940 | 0.9970 |
| No log | 2.2973 | 170 | 0.8667 | 0.5601 | 0.8667 | 0.9309 |
| No log | 2.3243 | 172 | 0.7207 | 0.5898 | 0.7207 | 0.8489 |
| No log | 2.3514 | 174 | 0.6771 | 0.6485 | 0.6771 | 0.8229 |
| No log | 2.3784 | 176 | 0.7257 | 0.5440 | 0.7257 | 0.8519 |
| No log | 2.4054 | 178 | 0.6990 | 0.5505 | 0.6990 | 0.8361 |
| No log | 2.4324 | 180 | 0.6516 | 0.6386 | 0.6516 | 0.8072 |
| No log | 2.4595 | 182 | 0.7629 | 0.5947 | 0.7629 | 0.8734 |
| No log | 2.4865 | 184 | 1.2327 | 0.5247 | 1.2327 | 1.1103 |
| No log | 2.5135 | 186 | 1.4917 | 0.4750 | 1.4917 | 1.2214 |
| No log | 2.5405 | 188 | 1.3277 | 0.4852 | 1.3277 | 1.1523 |
| No log | 2.5676 | 190 | 1.0093 | 0.4885 | 1.0093 | 1.0046 |
| No log | 2.5946 | 192 | 0.7737 | 0.5895 | 0.7737 | 0.8796 |
| No log | 2.6216 | 194 | 0.9693 | 0.4153 | 0.9693 | 0.9845 |
| No log | 2.6486 | 196 | 1.3371 | 0.2616 | 1.3371 | 1.1563 |
| No log | 2.6757 | 198 | 1.3048 | 0.2278 | 1.3048 | 1.1423 |
| No log | 2.7027 | 200 | 0.9643 | 0.4225 | 0.9643 | 0.9820 |
| No log | 2.7297 | 202 | 0.8673 | 0.4852 | 0.8673 | 0.9313 |
| No log | 2.7568 | 204 | 1.0356 | 0.4841 | 1.0356 | 1.0176 |
| No log | 2.7838 | 206 | 1.1451 | 0.4505 | 1.1451 | 1.0701 |
| No log | 2.8108 | 208 | 1.0730 | 0.4052 | 1.0730 | 1.0359 |
| No log | 2.8378 | 210 | 0.8878 | 0.5028 | 0.8878 | 0.9422 |
| No log | 2.8649 | 212 | 0.8487 | 0.4976 | 0.8487 | 0.9212 |
| No log | 2.8919 | 214 | 0.8341 | 0.4783 | 0.8341 | 0.9133 |
| No log | 2.9189 | 216 | 0.7860 | 0.5520 | 0.7860 | 0.8866 |
| No log | 2.9459 | 218 | 0.7661 | 0.5779 | 0.7661 | 0.8753 |
| No log | 2.9730 | 220 | 0.7672 | 0.5382 | 0.7672 | 0.8759 |
| No log | 3.0 | 222 | 0.8128 | 0.5022 | 0.8128 | 0.9016 |
| No log | 3.0270 | 224 | 0.7893 | 0.5219 | 0.7893 | 0.8885 |
| No log | 3.0541 | 226 | 0.7408 | 0.5143 | 0.7408 | 0.8607 |
| No log | 3.0811 | 228 | 0.7901 | 0.5649 | 0.7901 | 0.8889 |
| No log | 3.1081 | 230 | 0.8319 | 0.5513 | 0.8319 | 0.9121 |
| No log | 3.1351 | 232 | 0.8227 | 0.5691 | 0.8227 | 0.9070 |
| No log | 3.1622 | 234 | 0.8380 | 0.5691 | 0.8380 | 0.9154 |
| No log | 3.1892 | 236 | 0.7987 | 0.6101 | 0.7987 | 0.8937 |
| No log | 3.2162 | 238 | 0.8035 | 0.5712 | 0.8035 | 0.8964 |
| No log | 3.2432 | 240 | 0.8619 | 0.5690 | 0.8619 | 0.9284 |
| No log | 3.2703 | 242 | 0.8910 | 0.5794 | 0.8910 | 0.9439 |
| No log | 3.2973 | 244 | 0.9367 | 0.5794 | 0.9367 | 0.9678 |
| No log | 3.3243 | 246 | 1.0302 | 0.5626 | 1.0302 | 1.0150 |
| No log | 3.3514 | 248 | 1.0324 | 0.5750 | 1.0324 | 1.0161 |
| No log | 3.3784 | 250 | 1.1077 | 0.4482 | 1.1077 | 1.0525 |
| No log | 3.4054 | 252 | 1.0552 | 0.4300 | 1.0552 | 1.0272 |
| No log | 3.4324 | 254 | 0.9644 | 0.3785 | 0.9644 | 0.9820 |
| No log | 3.4595 | 256 | 0.8626 | 0.3975 | 0.8626 | 0.9288 |
| No log | 3.4865 | 258 | 0.8377 | 0.5202 | 0.8377 | 0.9152 |
| No log | 3.5135 | 260 | 0.7792 | 0.5892 | 0.7792 | 0.8827 |
| No log | 3.5405 | 262 | 0.7642 | 0.5938 | 0.7642 | 0.8742 |
| No log | 3.5676 | 264 | 0.7789 | 0.5216 | 0.7789 | 0.8825 |
| No log | 3.5946 | 266 | 0.8036 | 0.4508 | 0.8036 | 0.8964 |
| No log | 3.6216 | 268 | 0.8728 | 0.5611 | 0.8728 | 0.9342 |
| No log | 3.6486 | 270 | 0.9103 | 0.5920 | 0.9103 | 0.9541 |
| No log | 3.6757 | 272 | 0.9134 | 0.5814 | 0.9134 | 0.9557 |
| No log | 3.7027 | 274 | 0.8663 | 0.5836 | 0.8663 | 0.9308 |
| No log | 3.7297 | 276 | 0.8058 | 0.6498 | 0.8058 | 0.8977 |
| No log | 3.7568 | 278 | 0.8327 | 0.5228 | 0.8327 | 0.9125 |
| No log | 3.7838 | 280 | 0.8827 | 0.4458 | 0.8827 | 0.9395 |
| No log | 3.8108 | 282 | 0.8608 | 0.5291 | 0.8608 | 0.9278 |
| No log | 3.8378 | 284 | 0.8920 | 0.5055 | 0.8920 | 0.9444 |
| No log | 3.8649 | 286 | 0.9640 | 0.4959 | 0.9640 | 0.9818 |
| No log | 3.8919 | 288 | 1.0107 | 0.4615 | 1.0107 | 1.0053 |
| No log | 3.9189 | 290 | 0.9590 | 0.3771 | 0.9590 | 0.9793 |
| No log | 3.9459 | 292 | 0.9251 | 0.3847 | 0.9251 | 0.9618 |
| No log | 3.9730 | 294 | 0.9246 | 0.4065 | 0.9246 | 0.9616 |
| No log | 4.0 | 296 | 0.9350 | 0.4519 | 0.9350 | 0.9669 |
| No log | 4.0270 | 298 | 0.8490 | 0.4785 | 0.8490 | 0.9214 |
| No log | 4.0541 | 300 | 0.7567 | 0.5415 | 0.7567 | 0.8699 |
| No log | 4.0811 | 302 | 0.7759 | 0.5451 | 0.7759 | 0.8809 |
| No log | 4.1081 | 304 | 0.7793 | 0.5575 | 0.7793 | 0.8828 |
| No log | 4.1351 | 306 | 0.7175 | 0.6022 | 0.7175 | 0.8471 |
| No log | 4.1622 | 308 | 0.7160 | 0.5713 | 0.7160 | 0.8461 |
| No log | 4.1892 | 310 | 0.9105 | 0.5750 | 0.9105 | 0.9542 |
| No log | 4.2162 | 312 | 1.0611 | 0.5273 | 1.0611 | 1.0301 |
| No log | 4.2432 | 314 | 0.9984 | 0.5659 | 0.9984 | 0.9992 |
| No log | 4.2703 | 316 | 0.8245 | 0.5669 | 0.8245 | 0.9080 |
| No log | 4.2973 | 318 | 0.6982 | 0.6687 | 0.6982 | 0.8356 |
| No log | 4.3243 | 320 | 0.6927 | 0.6402 | 0.6927 | 0.8323 |
| No log | 4.3514 | 322 | 0.7178 | 0.6585 | 0.7178 | 0.8472 |
| No log | 4.3784 | 324 | 0.8427 | 0.5466 | 0.8427 | 0.9180 |
| No log | 4.4054 | 326 | 0.9153 | 0.5533 | 0.9153 | 0.9567 |
| No log | 4.4324 | 328 | 0.9198 | 0.5893 | 0.9198 | 0.9591 |
| No log | 4.4595 | 330 | 0.8525 | 0.5793 | 0.8525 | 0.9233 |
| No log | 4.4865 | 332 | 0.7889 | 0.5390 | 0.7889 | 0.8882 |
| No log | 4.5135 | 334 | 0.7401 | 0.5302 | 0.7401 | 0.8603 |
| No log | 4.5405 | 336 | 0.7346 | 0.5902 | 0.7346 | 0.8571 |
| No log | 4.5676 | 338 | 0.7384 | 0.5923 | 0.7384 | 0.8593 |
| No log | 4.5946 | 340 | 0.7445 | 0.5848 | 0.7445 | 0.8628 |
| No log | 4.6216 | 342 | 0.8023 | 0.5947 | 0.8023 | 0.8957 |
| No log | 4.6486 | 344 | 0.9334 | 0.5766 | 0.9334 | 0.9661 |
| No log | 4.6757 | 346 | 1.0010 | 0.6170 | 1.0010 | 1.0005 |
| No log | 4.7027 | 348 | 0.9969 | 0.6023 | 0.9969 | 0.9984 |
| No log | 4.7297 | 350 | 0.9298 | 0.5926 | 0.9298 | 0.9643 |
| No log | 4.7568 | 352 | 0.7885 | 0.5562 | 0.7885 | 0.8880 |
| No log | 4.7838 | 354 | 0.7309 | 0.5483 | 0.7309 | 0.8549 |
| No log | 4.8108 | 356 | 0.7337 | 0.5358 | 0.7337 | 0.8566 |
| No log | 4.8378 | 358 | 0.7677 | 0.6319 | 0.7677 | 0.8762 |
| No log | 4.8649 | 360 | 0.8895 | 0.5531 | 0.8895 | 0.9431 |
| No log | 4.8919 | 362 | 0.9440 | 0.5509 | 0.9440 | 0.9716 |
| No log | 4.9189 | 364 | 0.9707 | 0.5649 | 0.9707 | 0.9853 |
| No log | 4.9459 | 366 | 0.8808 | 0.5509 | 0.8808 | 0.9385 |
| No log | 4.9730 | 368 | 0.7760 | 0.5218 | 0.7760 | 0.8809 |
| No log | 5.0 | 370 | 0.7185 | 0.5805 | 0.7185 | 0.8476 |
| No log | 5.0270 | 372 | 0.7155 | 0.5458 | 0.7155 | 0.8459 |
| No log | 5.0541 | 374 | 0.7269 | 0.4964 | 0.7269 | 0.8526 |
| No log | 5.0811 | 376 | 0.7336 | 0.4933 | 0.7336 | 0.8565 |
| No log | 5.1081 | 378 | 0.7413 | 0.5336 | 0.7413 | 0.8610 |
| No log | 5.1351 | 380 | 0.7483 | 0.5131 | 0.7483 | 0.8651 |
| No log | 5.1622 | 382 | 0.7711 | 0.5968 | 0.7711 | 0.8781 |
| No log | 5.1892 | 384 | 0.7645 | 0.6097 | 0.7645 | 0.8744 |
| No log | 5.2162 | 386 | 0.7516 | 0.5559 | 0.7516 | 0.8669 |
| No log | 5.2432 | 388 | 0.7575 | 0.5873 | 0.7575 | 0.8703 |
| No log | 5.2703 | 390 | 0.7799 | 0.5720 | 0.7799 | 0.8831 |
| No log | 5.2973 | 392 | 0.7933 | 0.5806 | 0.7933 | 0.8907 |
| No log | 5.3243 | 394 | 0.8021 | 0.5562 | 0.8021 | 0.8956 |
| No log | 5.3514 | 396 | 0.7699 | 0.5279 | 0.7699 | 0.8774 |
| No log | 5.3784 | 398 | 0.7659 | 0.4385 | 0.7659 | 0.8752 |
| No log | 5.4054 | 400 | 0.7648 | 0.4479 | 0.7648 | 0.8745 |
| No log | 5.4324 | 402 | 0.7623 | 0.4728 | 0.7623 | 0.8731 |
| No log | 5.4595 | 404 | 0.7528 | 0.4620 | 0.7528 | 0.8676 |
| No log | 5.4865 | 406 | 0.7600 | 0.4902 | 0.7600 | 0.8718 |
| No log | 5.5135 | 408 | 0.7744 | 0.5610 | 0.7744 | 0.8800 |
| No log | 5.5405 | 410 | 0.8333 | 0.5448 | 0.8333 | 0.9129 |
| No log | 5.5676 | 412 | 0.8461 | 0.5515 | 0.8461 | 0.9199 |
| No log | 5.5946 | 414 | 0.9113 | 0.5830 | 0.9113 | 0.9546 |
| No log | 5.6216 | 416 | 0.8955 | 0.5830 | 0.8955 | 0.9463 |
| No log | 5.6486 | 418 | 0.8003 | 0.5865 | 0.8003 | 0.8946 |
| No log | 5.6757 | 420 | 0.7484 | 0.5611 | 0.7484 | 0.8651 |
| No log | 5.7027 | 422 | 0.7190 | 0.5722 | 0.7190 | 0.8480 |
| No log | 5.7297 | 424 | 0.7227 | 0.5611 | 0.7227 | 0.8501 |
| No log | 5.7568 | 426 | 0.7155 | 0.5815 | 0.7155 | 0.8459 |
| No log | 5.7838 | 428 | 0.7279 | 0.5585 | 0.7279 | 0.8531 |
| No log | 5.8108 | 430 | 0.7363 | 0.5585 | 0.7363 | 0.8581 |
| No log | 5.8378 | 432 | 0.6867 | 0.5585 | 0.6867 | 0.8287 |
| No log | 5.8649 | 434 | 0.6314 | 0.6015 | 0.6314 | 0.7946 |
| No log | 5.8919 | 436 | 0.6188 | 0.6698 | 0.6188 | 0.7866 |
| No log | 5.9189 | 438 | 0.6240 | 0.6252 | 0.6240 | 0.7899 |
| No log | 5.9459 | 440 | 0.6325 | 0.6242 | 0.6325 | 0.7953 |
| No log | 5.9730 | 442 | 0.6728 | 0.4824 | 0.6728 | 0.8203 |
| No log | 6.0 | 444 | 0.7177 | 0.4879 | 0.7177 | 0.8472 |
| No log | 6.0270 | 446 | 0.7326 | 0.4879 | 0.7326 | 0.8559 |
| No log | 6.0541 | 448 | 0.7244 | 0.5014 | 0.7244 | 0.8511 |
| No log | 6.0811 | 450 | 0.6921 | 0.5524 | 0.6921 | 0.8319 |
| No log | 6.1081 | 452 | 0.6703 | 0.5648 | 0.6703 | 0.8187 |
| No log | 6.1351 | 454 | 0.6643 | 0.5770 | 0.6643 | 0.8151 |
| No log | 6.1622 | 456 | 0.6828 | 0.5716 | 0.6828 | 0.8263 |
| No log | 6.1892 | 458 | 0.7015 | 0.5543 | 0.7015 | 0.8376 |
| No log | 6.2162 | 460 | 0.7485 | 0.5991 | 0.7485 | 0.8652 |
| No log | 6.2432 | 462 | 0.7735 | 0.5991 | 0.7735 | 0.8795 |
| No log | 6.2703 | 464 | 0.7674 | 0.5991 | 0.7674 | 0.8760 |
| No log | 6.2973 | 466 | 0.7636 | 0.5991 | 0.7636 | 0.8739 |
| No log | 6.3243 | 468 | 0.7911 | 0.6154 | 0.7911 | 0.8894 |
| No log | 6.3514 | 470 | 0.7887 | 0.6154 | 0.7887 | 0.8881 |
| No log | 6.3784 | 472 | 0.7865 | 0.6038 | 0.7865 | 0.8868 |
| No log | 6.4054 | 474 | 0.7523 | 0.5905 | 0.7523 | 0.8673 |
| No log | 6.4324 | 476 | 0.7298 | 0.5727 | 0.7298 | 0.8543 |
| No log | 6.4595 | 478 | 0.7459 | 0.5424 | 0.7459 | 0.8636 |
| No log | 6.4865 | 480 | 0.7824 | 0.5246 | 0.7824 | 0.8846 |
| No log | 6.5135 | 482 | 0.7720 | 0.5295 | 0.7720 | 0.8786 |
| No log | 6.5405 | 484 | 0.7371 | 0.4611 | 0.7371 | 0.8585 |
| No log | 6.5676 | 486 | 0.7082 | 0.5849 | 0.7082 | 0.8415 |
| No log | 6.5946 | 488 | 0.7257 | 0.5766 | 0.7257 | 0.8519 |
| No log | 6.6216 | 490 | 0.7340 | 0.5247 | 0.7340 | 0.8568 |
| No log | 6.6486 | 492 | 0.7130 | 0.5318 | 0.7130 | 0.8444 |
| No log | 6.6757 | 494 | 0.6983 | 0.6321 | 0.6983 | 0.8356 |
| No log | 6.7027 | 496 | 0.6878 | 0.6840 | 0.6878 | 0.8294 |
| No log | 6.7297 | 498 | 0.7189 | 0.6691 | 0.7189 | 0.8479 |
| 0.3657 | 6.7568 | 500 | 0.7898 | 0.5818 | 0.7898 | 0.8887 |
| 0.3657 | 6.7838 | 502 | 0.8013 | 0.5637 | 0.8013 | 0.8951 |
| 0.3657 | 6.8108 | 504 | 0.7740 | 0.5934 | 0.7740 | 0.8798 |
| 0.3657 | 6.8378 | 506 | 0.7761 | 0.5881 | 0.7761 | 0.8810 |
| 0.3657 | 6.8649 | 508 | 0.7499 | 0.5569 | 0.7499 | 0.8659 |
| 0.3657 | 6.8919 | 510 | 0.7424 | 0.4571 | 0.7424 | 0.8616 |
| 0.3657 | 6.9189 | 512 | 0.7499 | 0.4571 | 0.7499 | 0.8660 |
| 0.3657 | 6.9459 | 514 | 0.7757 | 0.5663 | 0.7757 | 0.8807 |
| 0.3657 | 6.9730 | 516 | 0.8716 | 0.5029 | 0.8716 | 0.9336 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Safetensors

  • Model size: 0.1B params
  • Tensor type: F32
