ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k13_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8643
  • QWK: 0.0024
  • MSE: 0.8643
  • RMSE: 0.9297
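
These metrics can be recomputed from raw predictions with scikit-learn. The snippet below is a minimal sketch, assuming integer rubric scores; y_true and y_pred are hypothetical stand-ins for the evaluation labels and model outputs:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold scores and rounded model predictions.
y_true = np.array([2, 3, 1, 4, 2])
y_pred = np.array([2, 2, 1, 3, 3])

# QWK is Cohen's kappa with quadratic weights.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

# MSE and its square root match the MSE/RMSE columns reported below.
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)

print(f"QWK={qwk:.4f}  MSE={mse:.4f}  RMSE={rmse:.4f}")
```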

Model description

More information needed

Intended uses & limitations

More information needed
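
Although usage is not documented, a minimal loading sketch might look like the following. The regression-style metrics (MSE/RMSE alongside QWK) suggest a single-output scoring head; that, and the tokenization settings, are assumptions here:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = (
    "MayBashendy/"
    "ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k13_task3_organization"
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

essay = "..."  # the Arabic essay to be scored goes here
inputs = tokenizer(essay, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
score = logits.squeeze().item()  # assumes a 1-dimensional regression head
print(score)
```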

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
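
For reference, these settings map onto Hugging Face TrainingArguments roughly as shown below. This is a sketch rather than the original training script; output_dir is a placeholder and any argument not listed above is left at its default:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task3_organization",  # hypothetical placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,      # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```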

Training results

Training loss is only logged every 500 steps (the Trainer default), hence the "No log" entries in the Training Loss column before step 500.

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
No log 0.0299 2 4.0465 0.0017 4.0465 2.0116
No log 0.0597 4 2.3142 -0.0121 2.3142 1.5212
No log 0.0896 6 1.2283 -0.0479 1.2283 1.1083
No log 0.1194 8 1.0257 -0.0182 1.0257 1.0128
No log 0.1493 10 0.7570 -0.0069 0.7570 0.8701
No log 0.1791 12 0.7473 -0.0101 0.7473 0.8645
No log 0.2090 14 1.0177 0.0941 1.0177 1.0088
No log 0.2388 16 1.0808 -0.0101 1.0808 1.0396
No log 0.2687 18 0.9290 -0.0916 0.9290 0.9638
No log 0.2985 20 0.9126 -0.0101 0.9126 0.9553
No log 0.3284 22 0.7700 0.0225 0.7700 0.8775
No log 0.3582 24 0.7559 -0.1220 0.7559 0.8694
No log 0.3881 26 0.8147 -0.0695 0.8147 0.9026
No log 0.4179 28 0.7532 0.1047 0.7532 0.8678
No log 0.4478 30 1.1090 0.1056 1.1090 1.0531
No log 0.4776 32 1.7201 0.1410 1.7201 1.3115
No log 0.5075 34 1.2705 0.1117 1.2705 1.1272
No log 0.5373 36 0.8310 0.0755 0.8310 0.9116
No log 0.5672 38 0.9020 -0.0408 0.9020 0.9497
No log 0.5970 40 0.7543 0.0909 0.7543 0.8685
No log 0.6269 42 0.7371 -0.0033 0.7371 0.8586
No log 0.6567 44 0.8908 0.0071 0.8908 0.9438
No log 0.6866 46 0.9498 -0.0842 0.9498 0.9746
No log 0.7164 48 0.8543 0.0129 0.8543 0.9243
No log 0.7463 50 0.8355 0.0863 0.8355 0.9141
No log 0.7761 52 0.9181 -0.0266 0.9181 0.9582
No log 0.8060 54 1.0221 0.0095 1.0221 1.0110
No log 0.8358 56 1.1176 0.0710 1.1176 1.0572
No log 0.8657 58 1.0736 0.1730 1.0736 1.0362
No log 0.8955 60 1.5270 0.0576 1.5270 1.2357
No log 0.9254 62 1.8535 0.0338 1.8535 1.3614
No log 0.9552 64 1.0443 0.2010 1.0443 1.0219
No log 0.9851 66 0.9294 0.2212 0.9294 0.9641
No log 1.0149 68 0.9169 0.1880 0.9169 0.9576
No log 1.0448 70 0.8347 0.2239 0.8347 0.9136
No log 1.0746 72 0.9809 0.1573 0.9809 0.9904
No log 1.1045 74 1.2187 0.0011 1.2187 1.1039
No log 1.1343 76 1.2252 0.1017 1.2252 1.1069
No log 1.1642 78 0.9666 0.1684 0.9666 0.9831
No log 1.1940 80 1.3519 0.1119 1.3519 1.1627
No log 1.2239 82 1.2206 0.0287 1.2206 1.1048
No log 1.2537 84 0.9626 0.1426 0.9626 0.9811
No log 1.2836 86 1.4198 0.0339 1.4198 1.1916
No log 1.3134 88 1.3032 0.0328 1.3032 1.1416
No log 1.3433 90 0.9009 0.1555 0.9009 0.9492
No log 1.3731 92 0.8761 0.1624 0.8761 0.9360
No log 1.4030 94 0.9041 0.0709 0.9041 0.9508
No log 1.4328 96 1.0012 0.0612 1.0012 1.0006
No log 1.4627 98 0.9336 0.0215 0.9336 0.9662
No log 1.4925 100 0.9818 -0.0137 0.9818 0.9909
No log 1.5224 102 1.3984 0.1118 1.3984 1.1825
No log 1.5522 104 1.3210 0.1144 1.3210 1.1493
No log 1.5821 106 0.9555 0.0943 0.9555 0.9775
No log 1.6119 108 1.0956 0.0741 1.0956 1.0467
No log 1.6418 110 1.2430 0.0825 1.2430 1.1149
No log 1.6716 112 1.1099 0.0715 1.1099 1.0535
No log 1.7015 114 0.9107 0.2783 0.9107 0.9543
No log 1.7313 116 1.3996 0.0578 1.3996 1.1830
No log 1.7612 118 1.2707 0.0480 1.2707 1.1273
No log 1.7910 120 0.9281 0.2574 0.9281 0.9634
No log 1.8209 122 1.3569 0.1022 1.3569 1.1649
No log 1.8507 124 1.4798 0.0630 1.4798 1.2165
No log 1.8806 126 1.2602 0.0817 1.2602 1.1226
No log 1.9104 128 0.8601 -0.0409 0.8601 0.9274
No log 1.9403 130 0.8865 0.1065 0.8865 0.9415
No log 1.9701 132 0.8663 0.1064 0.8663 0.9307
No log 2.0 134 0.7601 0.0257 0.7601 0.8718
No log 2.0299 136 0.7945 0.0081 0.7945 0.8913
No log 2.0597 138 0.8214 0.0106 0.8214 0.9063
No log 2.0896 140 0.7706 -0.0056 0.7706 0.8778
No log 2.1194 142 0.8932 0.1484 0.8932 0.9451
No log 2.1493 144 1.0475 0.0451 1.0475 1.0235
No log 2.1791 146 0.8699 0.1188 0.8699 0.9327
No log 2.2090 148 0.8657 0.0559 0.8657 0.9304
No log 2.2388 150 1.0413 0.0762 1.0413 1.0204
No log 2.2687 152 1.0210 0.0752 1.0210 1.0104
No log 2.2985 154 0.9522 0.1990 0.9522 0.9758
No log 2.3284 156 0.8289 -0.0200 0.8289 0.9105
No log 2.3582 158 0.9333 0.0793 0.9333 0.9661
No log 2.3881 160 0.8227 0.0588 0.8227 0.9070
No log 2.4179 162 0.8002 0.0474 0.8002 0.8945
No log 2.4478 164 0.7790 0.0376 0.7790 0.8826
No log 2.4776 166 0.7849 -0.0138 0.7849 0.8860
No log 2.5075 168 0.8079 -0.0230 0.8079 0.8988
No log 2.5373 170 0.8615 0.0109 0.8615 0.9282
No log 2.5672 172 0.8189 0.0670 0.8189 0.9050
No log 2.5970 174 0.9469 0.1684 0.9469 0.9731
No log 2.6269 176 0.9970 0.2037 0.9970 0.9985
No log 2.6567 178 0.8178 0.0488 0.8178 0.9043
No log 2.6866 180 0.8853 0.1542 0.8853 0.9409
No log 2.7164 182 1.0176 0.1149 1.0176 1.0088
No log 2.7463 184 0.8477 0.1399 0.8477 0.9207
No log 2.7761 186 0.7445 -0.0163 0.7445 0.8628
No log 2.8060 188 0.7402 0.0884 0.7402 0.8603
No log 2.8358 190 0.7358 0.0884 0.7358 0.8578
No log 2.8657 192 0.7741 0.0205 0.7741 0.8798
No log 2.8955 194 0.7961 0.0840 0.7961 0.8922
No log 2.9254 196 0.8294 0.1168 0.8294 0.9107
No log 2.9552 198 0.8954 0.1559 0.8954 0.9463
No log 2.9851 200 0.9474 0.1246 0.9474 0.9734
No log 3.0149 202 0.9533 0.1551 0.9533 0.9764
No log 3.0448 204 0.9032 0.0972 0.9032 0.9504
No log 3.0746 206 0.8103 0.1179 0.8103 0.9002
No log 3.1045 208 0.7891 0.0840 0.7891 0.8883
No log 3.1343 210 0.7895 0.0804 0.7895 0.8885
No log 3.1642 212 0.8204 0.1047 0.8204 0.9058
No log 3.1940 214 0.7945 0.0810 0.7945 0.8914
No log 3.2239 216 0.8183 -0.0238 0.8183 0.9046
No log 3.2537 218 0.8754 -0.0543 0.8754 0.9356
No log 3.2836 220 0.8909 -0.0558 0.8909 0.9439
No log 3.3134 222 0.8136 -0.0354 0.8136 0.9020
No log 3.3433 224 0.8144 0.1541 0.8144 0.9024
No log 3.3731 226 0.8019 0.0053 0.8019 0.8955
No log 3.4030 228 0.7980 -0.0329 0.7980 0.8933
No log 3.4328 230 0.7684 -0.0488 0.7684 0.8766
No log 3.4627 232 0.8644 0.0805 0.8644 0.9297
No log 3.4925 234 0.9282 0.0392 0.9282 0.9634
No log 3.5224 236 0.9043 0.0702 0.9043 0.9510
No log 3.5522 238 0.9374 0.1209 0.9374 0.9682
No log 3.5821 240 0.9248 0.1281 0.9248 0.9617
No log 3.6119 242 0.9194 0.0281 0.9194 0.9589
No log 3.6418 244 0.8361 0.0501 0.8361 0.9144
No log 3.6716 246 0.8515 0.0043 0.8515 0.9228
No log 3.7015 248 0.9698 0.0537 0.9698 0.9848
No log 3.7313 250 0.8748 0.0071 0.8748 0.9353
No log 3.7612 252 0.8168 -0.0051 0.8168 0.9038
No log 3.7910 254 0.9091 0.0998 0.9091 0.9535
No log 3.8209 256 0.9800 0.1005 0.9800 0.9899
No log 3.8507 258 0.8836 0.1090 0.8836 0.9400
No log 3.8806 260 0.8625 0.0448 0.8625 0.9287
No log 3.9104 262 0.8761 0.0378 0.8761 0.9360
No log 3.9403 264 0.8036 0.0562 0.8036 0.8964
No log 3.9701 266 0.8681 0.0285 0.8681 0.9317
No log 4.0 268 0.9015 0.0303 0.9015 0.9495
No log 4.0299 270 0.7863 -0.0329 0.7863 0.8868
No log 4.0597 272 0.7854 0.1001 0.7854 0.8862
No log 4.0896 274 0.8372 0.1605 0.8372 0.9150
No log 4.1194 276 0.7555 0.1047 0.7555 0.8692
No log 4.1493 278 0.7417 -0.0493 0.7417 0.8612
No log 4.1791 280 0.7889 -0.0849 0.7889 0.8882
No log 4.2090 282 0.7615 -0.0958 0.7615 0.8726
No log 4.2388 284 0.7912 0.1143 0.7912 0.8895
No log 4.2687 286 0.8296 0.1050 0.8296 0.9108
No log 4.2985 288 0.8310 0.1415 0.8310 0.9116
No log 4.3284 290 0.8225 -0.0051 0.8225 0.9069
No log 4.3582 292 0.8549 -0.1009 0.8549 0.9246
No log 4.3881 294 0.8455 -0.0647 0.8455 0.9195
No log 4.4179 296 0.7893 0.0028 0.7893 0.8884
No log 4.4478 298 0.7611 0.0394 0.7611 0.8724
No log 4.4776 300 0.7779 0.1541 0.7779 0.8820
No log 4.5075 302 0.8027 0.0798 0.8027 0.8959
No log 4.5373 304 0.8769 0.0955 0.8769 0.9364
No log 4.5672 306 0.9344 0.0589 0.9344 0.9666
No log 4.5970 308 0.9474 0.0897 0.9474 0.9733
No log 4.6269 310 0.9259 0.1586 0.9259 0.9622
No log 4.6567 312 0.8980 0.1212 0.8980 0.9476
No log 4.6866 314 0.8419 0.0361 0.8419 0.9175
No log 4.7164 316 0.8546 0.0016 0.8546 0.9244
No log 4.7463 318 0.9919 0.1222 0.9919 0.9960
No log 4.7761 320 0.9346 0.0988 0.9346 0.9668
No log 4.8060 322 0.7800 0.0071 0.7800 0.8832
No log 4.8358 324 0.7931 0.0129 0.7931 0.8906
No log 4.8657 326 0.8655 0.1785 0.8655 0.9303
No log 4.8955 328 0.8130 0.1035 0.8130 0.9017
No log 4.9254 330 0.7805 0.0768 0.7805 0.8835
No log 4.9552 332 0.9216 0.0016 0.9216 0.9600
No log 4.9851 334 0.9023 -0.0279 0.9023 0.9499
No log 5.0149 336 0.8649 -0.0076 0.8649 0.9300
No log 5.0448 338 0.8714 -0.0634 0.8714 0.9335
No log 5.0746 340 0.8272 -0.0705 0.8272 0.9095
No log 5.1045 342 0.7502 0.0436 0.7502 0.8661
No log 5.1343 344 0.8075 0.0512 0.8075 0.8986
No log 5.1642 346 0.9045 0.0362 0.9045 0.9510
No log 5.1940 348 0.8805 0.0727 0.8805 0.9383
No log 5.2239 350 0.7711 0.0909 0.7711 0.8781
No log 5.2537 352 0.8250 -0.0300 0.8250 0.9083
No log 5.2836 354 0.8309 -0.0774 0.8309 0.9116
No log 5.3134 356 0.8454 0.0149 0.8454 0.9195
No log 5.3433 358 0.9316 0.1385 0.9316 0.9652
No log 5.3731 360 0.8779 0.0831 0.8779 0.9370
No log 5.4030 362 0.8285 0.0476 0.8285 0.9102
No log 5.4328 364 0.7544 0.0759 0.7544 0.8686
No log 5.4627 366 0.7385 0.0922 0.7385 0.8594
No log 5.4925 368 0.7252 0.0030 0.7252 0.8516
No log 5.5224 370 0.6950 0.0374 0.6950 0.8337
No log 5.5522 372 0.7385 0.0953 0.7385 0.8593
No log 5.5821 374 0.8698 0.1106 0.8698 0.9326
No log 5.6119 376 0.8202 0.1106 0.8202 0.9057
No log 5.6418 378 0.7388 0.0341 0.7388 0.8595
No log 5.6716 380 0.8621 0.0007 0.8621 0.9285
No log 5.7015 382 0.9391 0.0431 0.9391 0.9691
No log 5.7313 384 0.8244 -0.0581 0.8244 0.9079
No log 5.7612 386 0.7823 0.1485 0.7823 0.8845
No log 5.7910 388 0.9251 0.1542 0.9251 0.9618
No log 5.8209 390 0.8594 0.1646 0.8594 0.9270
No log 5.8507 392 0.7233 0.0759 0.7233 0.8505
No log 5.8806 394 0.7821 -0.0750 0.7821 0.8843
No log 5.9104 396 0.9538 0.0007 0.9538 0.9767
No log 5.9403 398 0.9590 -0.0322 0.9590 0.9793
No log 5.9701 400 0.8343 -0.0339 0.8343 0.9134
No log 6.0 402 0.8110 0.1144 0.8110 0.9006
No log 6.0299 404 0.8228 0.2604 0.8228 0.9071
No log 6.0597 406 0.7893 0.0357 0.7893 0.8884
No log 6.0896 408 0.7825 -0.0054 0.7825 0.8846
No log 6.1194 410 0.8331 -0.0334 0.8331 0.9127
No log 6.1493 412 0.8329 -0.0370 0.8329 0.9126
No log 6.1791 414 0.7895 0.0428 0.7895 0.8885
No log 6.2090 416 0.7906 0.0303 0.7906 0.8891
No log 6.2388 418 0.8197 0.0633 0.8197 0.9054
No log 6.2687 420 0.8409 -0.0079 0.8409 0.9170
No log 6.2985 422 0.8352 -0.0054 0.8352 0.9139
No log 6.3284 424 0.8137 0.0428 0.8137 0.9020
No log 6.3582 426 0.7797 -0.0103 0.7797 0.8830
No log 6.3881 428 0.7498 0.1096 0.7498 0.8659
No log 6.4179 430 0.7682 0.1565 0.7682 0.8765
No log 6.4478 432 0.7815 0.1094 0.7815 0.8840
No log 6.4776 434 0.8094 0.0679 0.8094 0.8997
No log 6.5075 436 0.8247 0.1136 0.8247 0.9081
No log 6.5373 438 0.8338 0.1225 0.8338 0.9131
No log 6.5672 440 0.9035 0.1145 0.9035 0.9505
No log 6.5970 442 0.8683 0.0810 0.8683 0.9318
No log 6.6269 444 0.8118 0.0670 0.8118 0.9010
No log 6.6567 446 0.8226 0.0670 0.8226 0.9070
No log 6.6866 448 0.8145 0.0277 0.8145 0.9025
No log 6.7164 450 0.8046 0.0393 0.8046 0.8970
No log 6.7463 452 0.8345 0.0268 0.8345 0.9135
No log 6.7761 454 0.7744 -0.0350 0.7744 0.8800
No log 6.8060 456 0.7241 0.1379 0.7241 0.8509
No log 6.8358 458 0.7282 0.1259 0.7282 0.8534
No log 6.8657 460 0.7411 0.0768 0.7411 0.8609
No log 6.8955 462 0.7661 0.0549 0.7661 0.8753
No log 6.9254 464 0.7989 0.0867 0.7989 0.8938
No log 6.9552 466 0.7790 0.0512 0.7790 0.8826
No log 6.9851 468 0.7483 -0.1121 0.7483 0.8650
No log 7.0149 470 0.7681 -0.0027 0.7681 0.8764
No log 7.0448 472 0.7984 0.0141 0.7984 0.8936
No log 7.0746 474 0.7846 0.0081 0.7846 0.8858
No log 7.1045 476 0.7599 -0.0029 0.7599 0.8717
No log 7.1343 478 0.7515 -0.0264 0.7515 0.8669
No log 7.1642 480 0.7784 0.0512 0.7784 0.8823
No log 7.1940 482 0.7815 -0.0170 0.7815 0.8840
No log 7.2239 484 0.8026 -0.0446 0.8026 0.8959
No log 7.2537 486 0.7949 -0.0532 0.7949 0.8916
No log 7.2836 488 0.7980 -0.0082 0.7980 0.8933
No log 7.3134 490 0.7926 -0.0082 0.7926 0.8903
No log 7.3433 492 0.7913 -0.0870 0.7913 0.8896
No log 7.3731 494 0.8551 0.0303 0.8551 0.9247
No log 7.4030 496 0.8710 0.0350 0.8710 0.9333
No log 7.4328 498 0.8048 -0.1270 0.8048 0.8971
0.3191 7.4627 500 0.8212 0.1049 0.8212 0.9062
0.3191 7.4925 502 0.8776 0.1646 0.8776 0.9368
0.3191 7.5224 504 0.8192 0.0549 0.8192 0.9051
0.3191 7.5522 506 0.8248 -0.0293 0.8248 0.9082
0.3191 7.5821 508 0.9345 -0.0981 0.9345 0.9667
0.3191 7.6119 510 0.9094 -0.0379 0.9094 0.9536
0.3191 7.6418 512 0.8643 0.0024 0.8643 0.9297

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
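
To reproduce this environment, the pinned versions above can be installed directly; the commands below are a sketch, assuming the cu118 PyTorch build is fetched from the matching wheel index:

```bash
pip install transformers==4.44.2 datasets==2.21.0 tokenizers==0.19.1
pip install torch==2.4.0 --index-url https://download.pytorch.org/whl/cu118
```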

Safetensors

  • Model size: 0.1B params
  • Tensor type: F32