ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k14_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8160
  • Qwk: -0.0517
  • Mse: 0.8160
  • Rmse: 0.9033
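
For reference, Qwk is Cohen's quadratic weighted kappa (agreement between predicted and gold ordinal scores, penalizing larger disagreements more) and Rmse is simply the square root of Mse (0.8160 → 0.9033). A minimal stdlib sketch of both metrics; the label values below are illustrative, not taken from this run:

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights w_ij = (i - j)^2 / (n - 1)^2."""
    # Observed confusion counts
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    n = len(y_true)
    hist_true = Counter(y_true)
    hist_pred = Counter(y_pred)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            expected = hist_true[i] * hist_pred[j] / n  # chance agreement
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

# Perfect agreement yields kappa = 1.0; chance-level agreement yields ~0.0
print(quadratic_weighted_kappa([0, 1, 2, 2], [0, 1, 2, 2], n_classes=3))

# RMSE is the square root of MSE, matching the reported pair above
print(round(math.sqrt(0.8160), 4))
```

Note that the near-zero (slightly negative) Qwk reported above indicates roughly chance-level ordinal agreement on the evaluation set.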

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
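
Since no warmup steps are listed, the linear scheduler decays the learning rate from 2e-05 toward 0 over the planned run. A small sketch of that schedule; `total_steps` here is a hypothetical horizon for illustration, not the actual step count of this run:

```python
def linear_lr(step, total_steps, base_lr=2e-05):
    """Linear decay with no warmup: the rate falls from base_lr to 0."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# Hypothetical horizon of 1000 optimizer steps, for illustration only
print(linear_lr(0, 1000))    # full base rate at the first step
print(linear_lr(500, 1000))  # half the base rate midway through
```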

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0278 2 3.8707 0.0029 3.8707 1.9674
No log 0.0556 4 2.3084 0.0229 2.3084 1.5193
No log 0.0833 6 1.4020 0.0 1.4020 1.1840
No log 0.1111 8 1.0010 -0.0067 1.0010 1.0005
No log 0.1389 10 0.8000 0.0159 0.8000 0.8944
No log 0.1667 12 0.8341 -0.0031 0.8341 0.9133
No log 0.1944 14 0.6977 -0.0101 0.6977 0.8353
No log 0.2222 16 0.7322 -0.0662 0.7322 0.8557
No log 0.25 18 0.7625 -0.1765 0.7625 0.8732
No log 0.2778 20 0.8790 -0.0767 0.8790 0.9375
No log 0.3056 22 1.2759 0.0095 1.2759 1.1296
No log 0.3333 24 1.9325 0.0260 1.9325 1.3901
No log 0.3611 26 1.5092 0.0720 1.5092 1.2285
No log 0.3889 28 0.8547 0.0071 0.8548 0.9245
No log 0.4167 30 0.8308 -0.0725 0.8308 0.9115
No log 0.4444 32 0.7780 -0.0188 0.7780 0.8820
No log 0.4722 34 0.7086 -0.0101 0.7086 0.8418
No log 0.5 36 0.7858 0.1148 0.7858 0.8864
No log 0.5278 38 1.0709 0.0367 1.0709 1.0348
No log 0.5556 40 1.2554 0.0998 1.2554 1.1204
No log 0.5833 42 1.0070 0.0129 1.0070 1.0035
No log 0.6111 44 0.9481 0.0043 0.9481 0.9737
No log 0.6389 46 0.9216 -0.0264 0.9216 0.9600
No log 0.6667 48 0.8550 0.0436 0.8550 0.9247
No log 0.6944 50 0.7055 -0.0695 0.7055 0.8400
No log 0.7222 52 0.7029 0.1899 0.7029 0.8384
No log 0.75 54 0.9897 0.1640 0.9897 0.9948
No log 0.7778 56 1.0158 0.2225 1.0158 1.0079
No log 0.8056 58 0.7705 0.0871 0.7705 0.8778
No log 0.8333 60 0.7586 0.0338 0.7586 0.8710
No log 0.8611 62 0.8723 -0.0173 0.8723 0.9339
No log 0.8889 64 0.9806 0.0142 0.9806 0.9902
No log 0.9167 66 0.9968 0.1396 0.9968 0.9984
No log 0.9444 68 0.9423 -0.0424 0.9423 0.9707
No log 0.9722 70 0.8368 0.0633 0.8368 0.9148
No log 1.0 72 0.8048 -0.0271 0.8048 0.8971
No log 1.0278 74 0.7916 -0.0209 0.7916 0.8897
No log 1.0556 76 0.8079 0.1829 0.8079 0.8989
No log 1.0833 78 0.8614 0.1128 0.8614 0.9281
No log 1.1111 80 0.8904 0.0785 0.8904 0.9436
No log 1.1389 82 0.9606 -0.0071 0.9606 0.9801
No log 1.1667 84 1.0729 0.0042 1.0729 1.0358
No log 1.1944 86 1.2823 0.0710 1.2823 1.1324
No log 1.2222 88 1.0021 0.0367 1.0021 1.0010
No log 1.25 90 0.8570 0.1403 0.8570 0.9258
No log 1.2778 92 0.8455 0.1444 0.8455 0.9195
No log 1.3056 94 0.8410 0.1857 0.8410 0.9171
No log 1.3333 96 1.0954 -0.0165 1.0954 1.0466
No log 1.3611 98 1.1238 -0.0094 1.1238 1.0601
No log 1.3889 100 0.8721 -0.0266 0.8721 0.9339
No log 1.4167 102 0.8365 0.0327 0.8365 0.9146
No log 1.4444 104 0.8810 0.0838 0.8810 0.9386
No log 1.4722 106 0.8903 0.1612 0.8903 0.9436
No log 1.5 108 0.9300 0.0995 0.9300 0.9644
No log 1.5278 110 1.0560 0.2153 1.0560 1.0276
No log 1.5556 112 1.0581 0.0967 1.0581 1.0286
No log 1.5833 114 1.2472 0.1467 1.2472 1.1168
No log 1.6111 116 0.9830 -0.0156 0.9830 0.9915
No log 1.6389 118 0.8820 0.0779 0.8820 0.9392
No log 1.6667 120 0.8858 0.0741 0.8858 0.9411
No log 1.6944 122 0.8784 0.1416 0.8784 0.9372
No log 1.7222 124 0.8349 0.0289 0.8349 0.9138
No log 1.75 126 0.8171 0.0289 0.8171 0.9040
No log 1.7778 128 0.8002 0.0255 0.8002 0.8945
No log 1.8056 130 0.8657 0.0922 0.8657 0.9304
No log 1.8333 132 0.8605 0.0 0.8605 0.9276
No log 1.8611 134 0.8794 0.0344 0.8794 0.9378
No log 1.8889 136 0.9123 0.0437 0.9123 0.9551
No log 1.9167 138 0.9349 0.0851 0.9349 0.9669
No log 1.9444 140 0.9106 0.0272 0.9106 0.9543
No log 1.9722 142 0.9647 0.0968 0.9647 0.9822
No log 2.0 144 1.2046 0.0989 1.2046 1.0976
No log 2.0278 146 0.9409 0.1595 0.9409 0.9700
No log 2.0556 148 0.8331 0.0344 0.8331 0.9127
No log 2.0833 150 0.8630 0.0488 0.8630 0.9290
No log 2.1111 152 0.9200 0.0883 0.9200 0.9592
No log 2.1389 154 1.2222 0.0989 1.2222 1.1055
No log 2.1667 156 1.3132 0.1290 1.3132 1.1459
No log 2.1944 158 1.0671 -0.1260 1.0671 1.0330
No log 2.2222 160 1.0217 -0.1310 1.0217 1.0108
No log 2.25 162 1.0049 0.0584 1.0049 1.0025
No log 2.2778 164 1.0361 -0.1722 1.0361 1.0179
No log 2.3056 166 1.3052 0.0959 1.3052 1.1425
No log 2.3333 168 1.2182 0.0959 1.2182 1.1037
No log 2.3611 170 0.9849 0.1324 0.9849 0.9924
No log 2.3889 172 0.9981 0.1429 0.9981 0.9990
No log 2.4167 174 0.8993 0.0946 0.8993 0.9483
No log 2.4444 176 0.9833 0.1284 0.9833 0.9916
No log 2.4722 178 1.0204 0.0095 1.0204 1.0102
No log 2.5 180 0.8603 0.0827 0.8603 0.9275
No log 2.5278 182 0.8343 0.1095 0.8343 0.9134
No log 2.5556 184 0.9064 0.0871 0.9064 0.9521
No log 2.5833 186 0.7885 0.1144 0.7885 0.8880
No log 2.6111 188 0.8007 0.0905 0.8007 0.8948
No log 2.6389 190 0.8054 0.0934 0.8054 0.8974
No log 2.6667 192 0.8621 0.0709 0.8621 0.9285
No log 2.6944 194 0.8947 0.0909 0.8947 0.9459
No log 2.7222 196 0.9107 0.0537 0.9107 0.9543
No log 2.75 198 0.8847 0.0 0.8847 0.9406
No log 2.7778 200 0.9302 0.0959 0.9302 0.9645
No log 2.8056 202 0.8749 0.1048 0.8749 0.9354
No log 2.8333 204 0.8328 -0.0307 0.8328 0.9126
No log 2.8611 206 0.9234 -0.0638 0.9234 0.9609
No log 2.8889 208 0.8691 -0.0079 0.8691 0.9323
No log 2.9167 210 0.7718 0.1722 0.7718 0.8785
No log 2.9444 212 0.8457 0.1395 0.8457 0.9196
No log 2.9722 214 0.9355 0.1193 0.9355 0.9672
No log 3.0 216 0.8341 0.2155 0.8341 0.9133
No log 3.0278 218 0.8990 -0.0137 0.8990 0.9482
No log 3.0556 220 1.0197 0.0379 1.0197 1.0098
No log 3.0833 222 0.9118 0.0529 0.9118 0.9549
No log 3.1111 224 0.8565 -0.0441 0.8565 0.9255
No log 3.1389 226 0.8609 -0.0099 0.8609 0.9278
No log 3.1667 228 0.9742 0.0840 0.9742 0.9870
No log 3.1944 230 1.1430 0.0698 1.1430 1.0691
No log 3.2222 232 1.1399 0.0953 1.1399 1.0676
No log 3.25 234 0.9816 0.1161 0.9816 0.9907
No log 3.2778 236 0.9298 0.0818 0.9298 0.9642
No log 3.3056 238 0.9139 0.1203 0.9139 0.9560
No log 3.3333 240 0.9559 0.0578 0.9559 0.9777
No log 3.3611 242 0.8753 0.0069 0.8753 0.9356
No log 3.3889 244 0.8181 0.1192 0.8181 0.9045
No log 3.4167 246 0.8273 0.1964 0.8273 0.9095
No log 3.4444 248 0.8047 0.0821 0.8047 0.8971
No log 3.4722 250 0.8782 0.0823 0.8782 0.9371
No log 3.5 252 0.8383 0.1136 0.8383 0.9156
No log 3.5278 254 0.8687 0.0541 0.8687 0.9320
No log 3.5556 256 0.8264 0.1138 0.8264 0.9091
No log 3.5833 258 0.8077 0.0481 0.8077 0.8987
No log 3.6111 260 0.7761 0.1249 0.7761 0.8810
No log 3.6389 262 0.7717 0.1249 0.7717 0.8785
No log 3.6667 264 0.7829 -0.0524 0.7829 0.8848
No log 3.6944 266 0.9475 0.0391 0.9475 0.9734
No log 3.7222 268 0.9464 0.0083 0.9464 0.9728
No log 3.75 270 0.8080 0.0357 0.8080 0.8989
No log 3.7778 272 0.8171 0.1964 0.8171 0.9040
No log 3.8056 274 0.8009 -0.0086 0.8009 0.8949
No log 3.8333 276 0.9564 0.0125 0.9564 0.9780
No log 3.8611 278 1.1346 0.1105 1.1346 1.0652
No log 3.8889 280 1.0308 0.1326 1.0308 1.0153
No log 3.9167 282 0.8764 0.0526 0.8764 0.9362
No log 3.9444 284 0.8235 0.0469 0.8235 0.9075
No log 3.9722 286 0.7995 0.0410 0.7995 0.8941
No log 4.0 288 0.7925 0.0049 0.7925 0.8902
No log 4.0278 290 0.7651 -0.0082 0.7651 0.8747
No log 4.0556 292 0.7871 0.0522 0.7871 0.8872
No log 4.0833 294 0.7770 0.0110 0.7770 0.8815
No log 4.1111 296 0.7450 -0.0113 0.7450 0.8631
No log 4.1389 298 0.7623 0.0776 0.7623 0.8731
No log 4.1667 300 0.7765 -0.0426 0.7765 0.8812
No log 4.1944 302 0.8075 0.0509 0.8075 0.8986
No log 4.2222 304 0.8426 0.0522 0.8426 0.9179
No log 4.25 306 0.8924 0.1001 0.8924 0.9447
No log 4.2778 308 0.8161 0.0106 0.8161 0.9034
No log 4.3056 310 0.7851 0.0394 0.7851 0.8861
No log 4.3333 312 0.7782 0.1199 0.7782 0.8822
No log 4.3611 314 0.7825 0.1199 0.7825 0.8846
No log 4.3889 316 0.7983 -0.0408 0.7983 0.8935
No log 4.4167 318 0.9763 0.0451 0.9763 0.9881
No log 4.4444 320 0.9839 0.0451 0.9839 0.9919
No log 4.4722 322 0.8381 0.0482 0.8381 0.9155
No log 4.5 324 0.8172 0.0 0.8172 0.9040
No log 4.5278 326 0.8245 0.0866 0.8245 0.9080
No log 4.5556 328 0.8255 0.0905 0.8255 0.9085
No log 4.5833 330 0.8249 0.0905 0.8249 0.9083
No log 4.6111 332 0.8726 0.0964 0.8726 0.9341
No log 4.6389 334 0.8250 0.0537 0.8250 0.9083
No log 4.6667 336 0.8570 0.0239 0.8570 0.9258
No log 4.6944 338 0.9001 0.0255 0.9001 0.9487
No log 4.7222 340 0.8788 0.0255 0.8788 0.9374
No log 4.75 342 0.8204 -0.0500 0.8204 0.9058
No log 4.7778 344 0.8481 0.0377 0.8481 0.9209
No log 4.8056 346 0.8865 0.1538 0.8865 0.9415
No log 4.8333 348 0.9519 0.0879 0.9519 0.9757
No log 4.8611 350 0.9975 0.1228 0.9975 0.9987
No log 4.8889 352 0.9977 0.0569 0.9977 0.9989
No log 4.9167 354 0.9677 0.0159 0.9677 0.9837
No log 4.9444 356 0.9337 0.0159 0.9337 0.9663
No log 4.9722 358 0.9007 0.0178 0.9007 0.9490
No log 5.0 360 0.8348 -0.0889 0.8348 0.9137
No log 5.0278 362 0.8302 -0.0999 0.8302 0.9111
No log 5.0556 364 0.8438 -0.0103 0.8438 0.9186
No log 5.0833 366 0.8608 0.0301 0.8608 0.9278
No log 5.1111 368 0.9377 0.0221 0.9377 0.9683
No log 5.1389 370 1.0426 0.0979 1.0426 1.0211
No log 5.1667 372 1.0913 0.0698 1.0913 1.0447
No log 5.1944 374 0.9687 -0.0193 0.9687 0.9842
No log 5.2222 376 0.8897 -0.0127 0.8897 0.9433
No log 5.25 378 0.8908 0.0226 0.8908 0.9438
No log 5.2778 380 0.8836 -0.0103 0.8836 0.9400
No log 5.3056 382 0.9239 -0.0672 0.9239 0.9612
No log 5.3333 384 0.9549 0.0196 0.9549 0.9772
No log 5.3611 386 0.9548 0.0196 0.9548 0.9771
No log 5.3889 388 0.8863 -0.0841 0.8863 0.9414
No log 5.4167 390 0.8917 0.0175 0.8917 0.9443
No log 5.4444 392 0.8676 0.0709 0.8676 0.9314
No log 5.4722 394 0.8387 0.0313 0.8387 0.9158
No log 5.5 396 0.8482 -0.0391 0.8482 0.9210
No log 5.5278 398 0.9352 0.0925 0.9352 0.9670
No log 5.5556 400 0.9930 -0.0931 0.9930 0.9965
No log 5.5833 402 0.9170 -0.0307 0.9170 0.9576
No log 5.6111 404 0.8768 0.1553 0.8768 0.9364
No log 5.6389 406 0.8991 0.1943 0.8991 0.9482
No log 5.6667 408 0.8987 0.0660 0.8987 0.9480
No log 5.6944 410 0.9118 -0.0408 0.9118 0.9549
No log 5.7222 412 0.9345 -0.1072 0.9345 0.9667
No log 5.75 414 0.8832 -0.0517 0.8832 0.9398
No log 5.7778 416 0.8792 0.1144 0.8792 0.9377
No log 5.8056 418 0.8834 -0.0533 0.8834 0.9399
No log 5.8333 420 0.8850 0.1144 0.8850 0.9408
No log 5.8611 422 0.8614 0.1144 0.8614 0.9281
No log 5.8889 424 0.8356 0.0732 0.8356 0.9141
No log 5.9167 426 0.8187 0.0776 0.8187 0.9048
No log 5.9444 428 0.8148 0.0732 0.8148 0.9027
No log 5.9722 430 0.8367 0.1001 0.8367 0.9147
No log 6.0 432 0.8572 0.1379 0.8572 0.9259
No log 6.0278 434 0.8286 0.0289 0.8286 0.9103
No log 6.0556 436 0.8378 -0.0898 0.8378 0.9153
No log 6.0833 438 0.8396 -0.0898 0.8396 0.9163
No log 6.1111 440 0.8533 -0.0806 0.8533 0.9237
No log 6.1389 442 0.8359 -0.0541 0.8359 0.9143
No log 6.1667 444 0.8484 0.0700 0.8484 0.9211
No log 6.1944 446 0.8174 0.0741 0.8174 0.9041
No log 6.2222 448 0.8134 -0.0500 0.8134 0.9019
No log 6.25 450 0.9559 0.1011 0.9559 0.9777
No log 6.2778 452 1.0164 0.1570 1.0164 1.0082
No log 6.3056 454 0.8824 0.0641 0.8824 0.9393
No log 6.3333 456 0.7724 0.1189 0.7724 0.8789
No log 6.3611 458 0.8184 0.2011 0.8184 0.9047
No log 6.3889 460 0.7977 0.1660 0.7977 0.8932
No log 6.4167 462 0.8033 0.1264 0.8033 0.8962
No log 6.4444 464 0.8067 0.1187 0.8067 0.8982
No log 6.4722 466 0.8307 -0.0029 0.8307 0.9115
No log 6.5 468 0.8235 0.0791 0.8235 0.9075
No log 6.5278 470 0.8476 0.1181 0.8476 0.9207
No log 6.5556 472 0.8376 0.0670 0.8376 0.9152
No log 6.5833 474 0.8294 0.0709 0.8294 0.9107
No log 6.6111 476 0.8284 0.1095 0.8284 0.9102
No log 6.6389 478 0.8226 0.1049 0.8226 0.9070
No log 6.6667 480 0.8260 -0.0426 0.8260 0.9089
No log 6.6944 482 0.8711 0.0172 0.8711 0.9333
No log 6.7222 484 0.8794 -0.0173 0.8794 0.9377
No log 6.75 486 0.8356 -0.0889 0.8356 0.9141
No log 6.7778 488 0.7961 0.1196 0.7961 0.8922
No log 6.8056 490 0.8704 0.2276 0.8704 0.9330
No log 6.8333 492 0.8454 0.2276 0.8454 0.9195
No log 6.8611 494 0.7677 0.1196 0.7677 0.8762
No log 6.8889 496 0.8265 -0.0753 0.8265 0.9091
No log 6.9167 498 0.8409 -0.0307 0.8409 0.9170
0.3025 6.9444 500 0.8288 -0.0357 0.8288 0.9104
0.3025 6.9722 502 0.8123 -0.0079 0.8123 0.9013
0.3025 7.0 504 0.8088 -0.0082 0.8088 0.8994
0.3025 7.0278 506 0.8740 -0.0723 0.8740 0.9349
0.3025 7.0556 508 0.9268 -0.0745 0.9268 0.9627
0.3025 7.0833 510 0.9215 -0.0816 0.9215 0.9599
0.3025 7.1111 512 0.8580 -0.1116 0.8580 0.9263
0.3025 7.1389 514 0.8160 -0.0517 0.8160 0.9033

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
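
To reproduce this environment, the pinned versions above can be installed with pip. The CUDA 11.8 PyTorch wheel index URL below follows PyTorch's standard wheel hosting and is an assumption, not something stated in this card:

```shell
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
# PyTorch 2.4.0 built against CUDA 11.8
pip install "torch==2.4.0" --index-url https://download.pytorch.org/whl/cu118
```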