ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k18_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the training dataset is not specified in the original card). It achieves the following results on the evaluation set:

  • Loss: 0.7674
  • QWK (quadratic weighted kappa): 0.1440
  • MSE: 0.7674
  • RMSE: 0.8760
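
For reference, a minimal sketch of how these metrics are conventionally computed from regression outputs is shown below; the label and prediction arrays are illustrative placeholders, not the model's actual evaluation data.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Illustrative gold scores and raw regression outputs (placeholder values).
y_true = np.array([0, 1, 2, 2, 3, 1])
y_pred = np.array([0.3, 1.2, 1.8, 2.4, 2.6, 0.9])

mse = mean_squared_error(y_true, y_pred)  # the evaluation loss reported above is also an MSE
rmse = np.sqrt(mse)

# QWK is defined over discrete labels, so continuous outputs are rounded
# (and clipped to the valid score range) before scoring.
y_disc = np.clip(np.rint(y_pred), y_true.min(), y_true.max()).astype(int)
qwk = cohen_kappa_score(y_true, y_disc, weights="quadratic")

print(f"MSE={mse:.4f}  RMSE={rmse:.4f}  QWK={qwk:.4f}")
```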

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
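
As a rough guide, the listed hyperparameters correspond to a TrainingArguments configuration like the sketch below; the output directory, evaluation, and logging settings are assumptions inferred from the results table rather than values taken from the original training script.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./arabert_task3_organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer's
    # default AdamW settings, so no explicit optimizer arguments are needed.
    eval_strategy="steps",
    eval_steps=2,        # the table logs validation metrics every 2 steps
    logging_steps=500,   # the training loss is first reported at step 500
)
```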

Training results

Validation metrics were logged every 2 training steps; "No log" in the training-loss column means the loss had not yet been reported (it first appears at step 500).

Training Loss   Epoch   Step   Validation Loss   QWK   MSE   RMSE
No log 0.0215 2 3.9326 0.0017 3.9326 1.9831
No log 0.0430 4 2.2912 0.0038 2.2912 1.5137
No log 0.0645 6 1.2165 0.0586 1.2165 1.1030
No log 0.0860 8 0.9690 0.0848 0.9690 0.9844
No log 0.1075 10 0.9575 0.0217 0.9575 0.9785
No log 0.1290 12 1.1036 -0.0500 1.1036 1.0505
No log 0.1505 14 0.9916 -0.0164 0.9916 0.9958
No log 0.1720 16 0.7642 -0.0739 0.7642 0.8742
No log 0.1935 18 0.7633 0.0260 0.7633 0.8736
No log 0.2151 20 0.9154 0.0404 0.9154 0.9568
No log 0.2366 22 1.3065 0.0 1.3065 1.1430
No log 0.2581 24 1.3089 0.0016 1.3089 1.1441
No log 0.2796 26 0.9840 -0.0385 0.9840 0.9920
No log 0.3011 28 0.8311 -0.0008 0.8311 0.9116
No log 0.3226 30 0.7090 0.0909 0.7090 0.8421
No log 0.3441 32 0.7108 0.0909 0.7108 0.8431
No log 0.3656 34 0.9566 -0.1263 0.9566 0.9781
No log 0.3871 36 1.6701 -0.0469 1.6701 1.2923
No log 0.4086 38 1.6657 -0.0013 1.6657 1.2906
No log 0.4301 40 1.0087 0.0684 1.0087 1.0043
No log 0.4516 42 0.9353 0.0456 0.9353 0.9671
No log 0.4731 44 0.9706 0.0651 0.9706 0.9852
No log 0.4946 46 1.1227 0.0026 1.1227 1.0596
No log 0.5161 48 1.0889 0.1329 1.0889 1.0435
No log 0.5376 50 0.9318 0.0065 0.9318 0.9653
No log 0.5591 52 1.0222 -0.0297 1.0222 1.0111
No log 0.5806 54 1.0344 -0.0702 1.0344 1.0171
No log 0.6022 56 1.0247 0.0138 1.0247 1.0123
No log 0.6237 58 0.9127 0.1091 0.9127 0.9553
No log 0.6452 60 0.9992 0.1269 0.9992 0.9996
No log 0.6667 62 1.0961 0.0694 1.0961 1.0469
No log 0.6882 64 0.8749 0.0790 0.8749 0.9354
No log 0.7097 66 0.9513 0.0209 0.9513 0.9754
No log 0.7312 68 0.8349 0.1597 0.8349 0.9138
No log 0.7527 70 0.7974 0.1580 0.7974 0.8930
No log 0.7742 72 0.9161 0.0147 0.9161 0.9571
No log 0.7957 74 0.7612 0.1196 0.7612 0.8725
No log 0.8172 76 0.7679 0.1620 0.7679 0.8763
No log 0.8387 78 0.8462 0.1324 0.8462 0.9199
No log 0.8602 80 0.9775 0.0241 0.9775 0.9887
No log 0.8817 82 1.0164 0.0870 1.0164 1.0082
No log 0.9032 84 1.0293 0.1610 1.0293 1.0145
No log 0.9247 86 1.0263 0.0682 1.0263 1.0131
No log 0.9462 88 1.0138 0.0682 1.0138 1.0069
No log 0.9677 90 0.9831 0.0600 0.9831 0.9915
No log 0.9892 92 0.9130 0.1615 0.9130 0.9555
No log 1.0108 94 0.8402 0.1609 0.8402 0.9166
No log 1.0323 96 0.8114 0.1609 0.8114 0.9008
No log 1.0538 98 1.0683 0.0006 1.0683 1.0336
No log 1.0753 100 0.9532 0.0134 0.9532 0.9763
No log 1.0968 102 0.7199 0.2122 0.7199 0.8485
No log 1.1183 104 0.7252 0.3062 0.7252 0.8516
No log 1.1398 106 0.7825 0.1996 0.7825 0.8846
No log 1.1613 108 0.8207 0.1889 0.8207 0.9059
No log 1.1828 110 0.8725 0.1636 0.8725 0.9341
No log 1.2043 112 0.8974 0.1256 0.8974 0.9473
No log 1.2258 114 1.1273 0.0679 1.1273 1.0617
No log 1.2473 116 1.0051 0.0856 1.0051 1.0025
No log 1.2688 118 0.8437 0.1854 0.8437 0.9186
No log 1.2903 120 1.2525 0.1105 1.2525 1.1191
No log 1.3118 122 1.2102 0.0839 1.2102 1.1001
No log 1.3333 124 0.7605 0.2454 0.7605 0.8721
No log 1.3548 126 0.9377 0.1269 0.9377 0.9684
No log 1.3763 128 1.0030 0.0342 1.0030 1.0015
No log 1.3978 130 0.7451 0.1859 0.7451 0.8632
No log 1.4194 132 0.8361 -0.0425 0.8361 0.9144
No log 1.4409 134 0.9789 0.0808 0.9789 0.9894
No log 1.4624 136 0.8367 -0.0425 0.8367 0.9147
No log 1.4839 138 0.7278 0.1475 0.7278 0.8531
No log 1.5054 140 0.8377 0.1687 0.8377 0.9153
No log 1.5269 142 1.0026 0.0847 1.0026 1.0013
No log 1.5484 144 0.8221 0.1727 0.8221 0.9067
No log 1.5699 146 0.7651 0.1508 0.7651 0.8747
No log 1.5914 148 0.7628 0.1767 0.7628 0.8734
No log 1.6129 150 0.7830 0.1621 0.7830 0.8848
No log 1.6344 152 0.9376 0.0537 0.9376 0.9683
No log 1.6559 154 0.8121 0.1522 0.8121 0.9011
No log 1.6774 156 0.7592 0.2892 0.7592 0.8713
No log 1.6989 158 0.7835 0.2776 0.7835 0.8851
No log 1.7204 160 0.7723 0.1752 0.7723 0.8788
No log 1.7419 162 0.9816 0.0701 0.9816 0.9908
No log 1.7634 164 1.0465 0.0291 1.0465 1.0230
No log 1.7849 166 0.8354 0.1188 0.8354 0.9140
No log 1.8065 168 0.7549 0.3184 0.7549 0.8689
No log 1.8280 170 0.7638 0.3184 0.7638 0.8740
No log 1.8495 172 0.7796 0.3072 0.7796 0.8829
No log 1.8710 174 0.8386 0.1924 0.8386 0.9157
No log 1.8925 176 0.7740 0.2486 0.7740 0.8798
No log 1.9140 178 0.7608 0.1431 0.7608 0.8722
No log 1.9355 180 1.4282 0.1175 1.4282 1.1951
No log 1.9570 182 1.8004 0.0578 1.8004 1.3418
No log 1.9785 184 1.3430 0.0502 1.3430 1.1589
No log 2.0 186 0.7561 0.1565 0.7561 0.8696
No log 2.0215 188 0.6955 0.1081 0.6955 0.8340
No log 2.0430 190 0.7445 0.2019 0.7445 0.8628
No log 2.0645 192 0.6812 0.1423 0.6812 0.8253
No log 2.0860 194 0.7701 0.2034 0.7701 0.8775
No log 2.1075 196 0.9942 -0.0521 0.9942 0.9971
No log 2.1290 198 1.0514 0.0142 1.0514 1.0254
No log 2.1505 200 0.8925 0.2401 0.8925 0.9447
No log 2.1720 202 0.8869 0.0868 0.8869 0.9418
No log 2.1935 204 0.9816 0.0953 0.9816 0.9908
No log 2.2151 206 0.9243 0.0578 0.9243 0.9614
No log 2.2366 208 0.9080 0.0578 0.9080 0.9529
No log 2.2581 210 0.8312 0.2145 0.8312 0.9117
No log 2.2796 212 0.8300 0.2194 0.8300 0.9111
No log 2.3011 214 0.7687 0.1921 0.7687 0.8768
No log 2.3226 216 0.7219 0.2046 0.7219 0.8496
No log 2.3441 218 0.7155 0.2053 0.7155 0.8459
No log 2.3656 220 0.8260 0.1235 0.8260 0.9089
No log 2.3871 222 0.7598 0.1943 0.7598 0.8716
No log 2.4086 224 0.6850 0.1952 0.6850 0.8277
No log 2.4301 226 0.7113 0.0556 0.7113 0.8434
No log 2.4516 228 0.6893 0.1026 0.6893 0.8302
No log 2.4731 230 0.7178 0.2180 0.7178 0.8472
No log 2.4946 232 0.8500 0.0392 0.8500 0.9220
No log 2.5161 234 0.7772 0.0959 0.7772 0.8816
No log 2.5376 236 0.7753 0.1942 0.7753 0.8805
No log 2.5591 238 0.8719 0.0920 0.8719 0.9337
No log 2.5806 240 1.1381 0.0336 1.1381 1.0668
No log 2.6022 242 1.1883 0.0315 1.1883 1.0901
No log 2.6237 244 0.9429 0.1539 0.9429 0.9710
No log 2.6452 246 1.0212 0.0943 1.0212 1.0106
No log 2.6667 248 1.1217 0.1297 1.1217 1.0591
No log 2.6882 250 0.9114 0.0665 0.9114 0.9547
No log 2.7097 252 0.7854 0.1146 0.7854 0.8862
No log 2.7312 254 1.0564 -0.0575 1.0564 1.0278
No log 2.7527 256 1.0276 -0.0923 1.0276 1.0137
No log 2.7742 258 0.7663 0.1047 0.7663 0.8754
No log 2.7957 260 0.7690 0.1372 0.7690 0.8769
No log 2.8172 262 0.8193 0.1334 0.8193 0.9052
No log 2.8387 264 0.7727 0.1675 0.7727 0.8790
No log 2.8602 266 0.9185 0.0362 0.9185 0.9584
No log 2.8817 268 0.9334 0.0362 0.9334 0.9661
No log 2.9032 270 0.8599 0.0167 0.8599 0.9273
No log 2.9247 272 0.9263 0.0864 0.9263 0.9624
No log 2.9462 274 0.9908 0.0979 0.9908 0.9954
No log 2.9677 276 0.8903 0.0842 0.8903 0.9435
No log 2.9892 278 0.8581 0.0917 0.8581 0.9263
No log 3.0108 280 0.8367 0.0917 0.8367 0.9147
No log 3.0323 282 0.7736 0.2053 0.7736 0.8795
No log 3.0538 284 0.8168 0.0913 0.8168 0.9038
No log 3.0753 286 0.8609 0.0665 0.8609 0.9279
No log 3.0968 288 0.7672 0.2181 0.7672 0.8759
No log 3.1183 290 0.8102 0.1387 0.8102 0.9001
No log 3.1398 292 0.8593 0.1235 0.8593 0.9270
No log 3.1613 294 0.8463 0.1231 0.8463 0.9199
No log 3.1828 296 0.8190 0.0733 0.8190 0.9050
No log 3.2043 298 0.9300 0.1321 0.9300 0.9643
No log 3.2258 300 1.0019 0.1612 1.0019 1.0010
No log 3.2473 302 0.8603 0.0511 0.8603 0.9275
No log 3.2688 304 0.8272 0.0529 0.8272 0.9095
No log 3.2903 306 0.8106 0.0529 0.8106 0.9003
No log 3.3118 308 0.7920 0.0608 0.7920 0.8900
No log 3.3333 310 0.8016 0.1263 0.8016 0.8953
No log 3.3548 312 0.8003 0.1176 0.8003 0.8946
No log 3.3763 314 0.7959 0.1263 0.7959 0.8921
No log 3.3978 316 0.7724 0.0441 0.7724 0.8788
No log 3.4194 318 0.7649 0.1475 0.7649 0.8746
No log 3.4409 320 0.8256 0.1105 0.8256 0.9086
No log 3.4624 322 0.7747 0.1817 0.7747 0.8802
No log 3.4839 324 0.7223 0.1599 0.7223 0.8499
No log 3.5054 326 0.7088 0.1023 0.7088 0.8419
No log 3.5269 328 0.7164 0.0914 0.7164 0.8464
No log 3.5484 330 0.7489 0.1240 0.7489 0.8654
No log 3.5699 332 0.8020 0.1475 0.8020 0.8956
No log 3.5914 334 0.8709 0.2107 0.8709 0.9332
No log 3.6129 336 0.8541 0.1465 0.8541 0.9242
No log 3.6344 338 0.8379 0.0313 0.8379 0.9154
No log 3.6559 340 0.8036 0.0679 0.8036 0.8964
No log 3.6774 342 0.8068 0.1475 0.8068 0.8982
No log 3.6989 344 0.8997 0.0917 0.8997 0.9485
No log 3.7204 346 0.9255 0.0837 0.9255 0.9620
No log 3.7419 348 0.8011 0.1485 0.8011 0.8951
No log 3.7634 350 0.8308 0.0989 0.8308 0.9115
No log 3.7849 352 0.9090 0.0365 0.9090 0.9534
No log 3.8065 354 0.8725 0.0964 0.8725 0.9341
No log 3.8280 356 0.8241 0.0697 0.8241 0.9078
No log 3.8495 358 0.8564 0.2194 0.8564 0.9254
No log 3.8710 360 0.9160 0.1882 0.9160 0.9571
No log 3.8925 362 0.8553 0.1823 0.8553 0.9248
No log 3.9140 364 0.9047 0.0541 0.9047 0.9512
No log 3.9355 366 0.9577 0.0653 0.9577 0.9786
No log 3.9570 368 0.8849 0.0444 0.8849 0.9407
No log 3.9785 370 0.8579 0.1475 0.8579 0.9262
No log 4.0 372 0.9009 0.0831 0.9009 0.9492
No log 4.0215 374 0.9585 0.0362 0.9585 0.9790
No log 4.0430 376 0.8504 0.0456 0.8504 0.9221
No log 4.0645 378 0.7701 0.1979 0.7701 0.8775
No log 4.0860 380 0.8259 -0.0329 0.8259 0.9088
No log 4.1075 382 0.8418 -0.0956 0.8418 0.9175
No log 4.1290 384 0.7945 0.1240 0.7945 0.8914
No log 4.1505 386 0.8512 0.0917 0.8512 0.9226
No log 4.1720 388 0.9488 0.0793 0.9488 0.9741
No log 4.1935 390 0.8633 0.0470 0.8633 0.9292
No log 4.2151 392 0.7780 0.1659 0.7780 0.8820
No log 4.2366 394 0.7584 0.1413 0.7584 0.8708
No log 4.2581 396 0.7307 0.1878 0.7307 0.8548
No log 4.2796 398 0.7884 0.0999 0.7884 0.8879
No log 4.3011 400 0.9749 0.0224 0.9749 0.9874
No log 4.3226 402 0.9612 0.0250 0.9612 0.9804
No log 4.3441 404 0.8419 0.0917 0.8419 0.9176
No log 4.3656 406 0.7980 0.0697 0.7980 0.8933
No log 4.3871 408 0.8695 0.1586 0.8695 0.9325
No log 4.4086 410 0.8653 0.1536 0.8653 0.9302
No log 4.4301 412 0.7914 0.1674 0.7914 0.8896
No log 4.4516 414 0.7833 0.1659 0.7833 0.8850
No log 4.4731 416 0.8010 0.2155 0.8010 0.8950
No log 4.4946 418 0.7932 0.1879 0.7932 0.8906
No log 4.5161 420 0.7366 0.2078 0.7366 0.8582
No log 4.5376 422 0.7112 0.1371 0.7112 0.8433
No log 4.5591 424 0.7069 0.1371 0.7069 0.8408
No log 4.5806 426 0.7062 0.1371 0.7062 0.8403
No log 4.6022 428 0.7381 0.1612 0.7381 0.8591
No log 4.6237 430 0.7720 0.1943 0.7720 0.8786
No log 4.6452 432 0.7279 0.1659 0.7279 0.8532
No log 4.6667 434 0.7402 0.1362 0.7402 0.8604
No log 4.6882 436 0.7563 0.0412 0.7563 0.8696
No log 4.7097 438 0.7269 0.1371 0.7269 0.8526
No log 4.7312 440 0.7743 0.1943 0.7743 0.8800
No log 4.7527 442 0.8164 0.1758 0.8164 0.9035
No log 4.7742 444 0.7547 0.1691 0.7547 0.8688
No log 4.7957 446 0.7318 0.1371 0.7318 0.8555
No log 4.8172 448 0.8084 0.0547 0.8084 0.8991
No log 4.8387 450 0.8289 0.0230 0.8289 0.9104
No log 4.8602 452 0.7953 0.0545 0.7953 0.8918
No log 4.8817 454 0.7652 0.0978 0.7652 0.8747
No log 4.9032 456 0.7317 0.1740 0.7317 0.8554
No log 4.9247 458 0.7329 0.1541 0.7329 0.8561
No log 4.9462 460 0.7420 0.1807 0.7420 0.8614
No log 4.9677 462 0.7512 0.0741 0.7512 0.8667
No log 4.9892 464 0.7879 0.0660 0.7879 0.8877
No log 5.0108 466 0.8547 0.2131 0.8547 0.9245
No log 5.0323 468 0.8527 0.1783 0.8527 0.9234
No log 5.0538 470 0.8431 0.1093 0.8431 0.9182
No log 5.0753 472 0.9527 0.0885 0.9527 0.9761
No log 5.0968 474 0.9704 0.0312 0.9704 0.9851
No log 5.1183 476 0.8598 0.0362 0.8598 0.9273
No log 5.1398 478 0.8338 0.1094 0.8338 0.9131
No log 5.1613 480 0.8292 0.1921 0.8292 0.9106
No log 5.1828 482 0.7790 0.1144 0.7790 0.8826
No log 5.2043 484 0.7379 0.1254 0.7379 0.8590
No log 5.2258 486 0.7413 0.1259 0.7413 0.8610
No log 5.2473 488 0.7517 0.1254 0.7517 0.8670
No log 5.2688 490 0.7672 0.1144 0.7672 0.8759
No log 5.2903 492 0.7423 0.1143 0.7423 0.8616
No log 5.3118 494 0.7617 0.0978 0.7617 0.8728
No log 5.3333 496 0.7868 0.1885 0.7868 0.8870
No log 5.3548 498 0.7322 0.1751 0.7322 0.8557
0.3562 5.3763 500 0.7438 0.1553 0.7438 0.8624
0.3562 5.3978 502 0.7497 0.1553 0.7497 0.8659
0.3562 5.4194 504 0.7093 0.1612 0.7093 0.8422
0.3562 5.4409 506 0.7241 0.1354 0.7241 0.8510
0.3562 5.4624 508 0.7464 0.0428 0.7464 0.8639
0.3562 5.4839 510 0.7524 0.0393 0.7524 0.8674
0.3562 5.5054 512 0.7361 0.1143 0.7361 0.8580
0.3562 5.5269 514 0.7674 0.1440 0.7674 0.8760

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
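
A minimal usage sketch follows, assuming the checkpoint is published on the Hugging Face Hub under the repository name in the title and exposes a single-output regression head (the reported MSE/RMSE metrics suggest the score is predicted as a continuous value).

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = (
    "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_"
    "FineTuningAraBERT_run3_AugV5_k18_task3_organization"
)

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

essay = "نص المقال العربي المراد تقييم تنظيمه."  # placeholder essay text
inputs = tokenizer(essay, return_tensors="pt", truncation=True)

with torch.no_grad():
    # With a single regression label the logits tensor has shape (1, 1).
    score = model(**inputs).logits.squeeze().item()

print(f"Predicted organization score: {score:.3f}")
```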