ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k19_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the auto-generated card reports it as "None"). It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 0.6505
  • Qwk: 0.7516
  • Mse: 0.6505
  • Rmse: 0.8065

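The evaluation code behind these numbers is not published. As a point of reference, here is a minimal sketch of how Qwk, Mse and Rmse are conventionally computed with scikit-learn; rounding predictions before the kappa computation is an assumption, not a documented detail of this run. (Note that Loss equals Mse here, which suggests the model was trained with an MSE regression loss.)

```python
# Hedged sketch of the reported metrics, not this run's actual evaluation code.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def report_metrics(y_true, y_pred):
    """QWK / MSE / RMSE for essay-score predictions (floats vs. integer labels)."""
    # Quadratic weighted kappa needs discrete labels, so continuous
    # predictions are rounded to the nearest integer score (assumption).
    qwk = cohen_kappa_score(np.asarray(y_true).astype(int),
                            np.rint(y_pred).astype(int),
                            weights="quadratic")
    mse = mean_squared_error(y_true, y_pred)
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```
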
Model description

More information needed

Intended uses & limitations

More information needed
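
Since no usage documentation is given, the snippet below is a hedged inference sketch rather than the authors' pipeline. It assumes the checkpoint carries a single-output regression head (implied by the MSE/RMSE evaluation above); the example essay is a placeholder.

```python
# Hypothetical inference sketch; assumes a 1-output regression head.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k19_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

essay = "..."  # an Arabic essay to be scored for organization
inputs = tokenizer(essay, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(f"predicted organization score: {score:.2f}")
```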

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch reproducing them follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

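For reproducibility, here is how that list maps onto TrainingArguments in Transformers 4.44.2 (per the framework versions below); the output_dir is hypothetical, and the dataset/compute_metrics wiring is undocumented and therefore omitted.

```python
# Sketch mapping the listed hyperparameters onto TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task1_organization",  # hypothetical name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08, as listed above,
    # matches the Trainer's default AdamW settings.
)
```
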
Training results

Each row below is one evaluation pass, run every 2 training steps; the columns are training loss, epoch, step, validation loss, QWK, MSE, and RMSE. "No log" means the running training loss had not yet been logged at that point (it is first reported at step 500).

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0141 2 6.8382 0.0188 6.8382 2.6150
No log 0.0282 4 5.0356 0.0659 5.0356 2.2440
No log 0.0423 6 3.2477 0.1091 3.2477 1.8021
No log 0.0563 8 2.6457 0.0397 2.6457 1.6266
No log 0.0704 10 1.8361 0.2703 1.8361 1.3550
No log 0.0845 12 1.7389 0.1714 1.7389 1.3187
No log 0.0986 14 1.6272 0.1538 1.6272 1.2756
No log 0.1127 16 1.7228 0.1538 1.7228 1.3125
No log 0.1268 18 1.9998 0.1404 1.9998 1.4141
No log 0.1408 20 2.2564 0.1135 2.2564 1.5021
No log 0.1549 22 2.1814 0.1594 2.1814 1.4770
No log 0.1690 24 1.9976 0.2258 1.9976 1.4134
No log 0.1831 26 1.7306 0.2617 1.7306 1.3155
No log 0.1972 28 1.5470 0.1905 1.5470 1.2438
No log 0.2113 30 1.5701 0.3478 1.5701 1.2530
No log 0.2254 32 1.6056 0.2975 1.6056 1.2671
No log 0.2394 34 1.5342 0.3740 1.5342 1.2386
No log 0.2535 36 1.4696 0.3802 1.4696 1.2123
No log 0.2676 38 1.3947 0.3214 1.3947 1.1810
No log 0.2817 40 1.3510 0.2264 1.3510 1.1623
No log 0.2958 42 1.2833 0.2407 1.2833 1.1328
No log 0.3099 44 1.2438 0.3571 1.2438 1.1153
No log 0.3239 46 1.2426 0.3860 1.2426 1.1147
No log 0.3380 48 1.3795 0.3934 1.3795 1.1745
No log 0.3521 50 1.1551 0.5289 1.1551 1.0747
No log 0.3662 52 1.1267 0.5 1.1267 1.0615
No log 0.3803 54 1.3499 0.2881 1.3499 1.1618
No log 0.3944 56 1.6062 0.2258 1.6062 1.2674
No log 0.4085 58 1.3709 0.3740 1.3709 1.1709
No log 0.4225 60 0.9696 0.4918 0.9696 0.9847
No log 0.4366 62 0.8481 0.6462 0.8481 0.9209
No log 0.4507 64 0.8845 0.6512 0.8845 0.9405
No log 0.4648 66 0.9459 0.6512 0.9459 0.9726
No log 0.4789 68 0.9635 0.625 0.9635 0.9816
No log 0.4930 70 0.8723 0.6202 0.8723 0.9340
No log 0.5070 72 0.8870 0.6471 0.8870 0.9418
No log 0.5211 74 0.9311 0.6471 0.9311 0.9649
No log 0.5352 76 0.9818 0.6471 0.9818 0.9909
No log 0.5493 78 0.9889 0.6471 0.9889 0.9944
No log 0.5634 80 0.9778 0.5954 0.9778 0.9888
No log 0.5775 82 0.9601 0.6260 0.9601 0.9798
No log 0.5915 84 1.0607 0.5038 1.0607 1.0299
No log 0.6056 86 1.0129 0.5846 1.0129 1.0064
No log 0.6197 88 0.8201 0.6767 0.8201 0.9056
No log 0.6338 90 0.7880 0.6912 0.7880 0.8877
No log 0.6479 92 0.8577 0.6615 0.8577 0.9261
No log 0.6620 94 0.7934 0.6466 0.7934 0.8907
No log 0.6761 96 0.6950 0.7183 0.6950 0.8336
No log 0.6901 98 0.6738 0.7550 0.6738 0.8209
No log 0.7042 100 0.6962 0.7333 0.6962 0.8344
No log 0.7183 102 0.7859 0.7134 0.7859 0.8865
No log 0.7324 104 1.0632 0.6335 1.0632 1.0311
No log 0.7465 106 1.4177 0.5444 1.4177 1.1907
No log 0.7606 108 1.3982 0.5876 1.3982 1.1824
No log 0.7746 110 1.1767 0.6405 1.1767 1.0848
No log 0.7887 112 0.9553 0.625 0.9553 0.9774
No log 0.8028 114 0.8535 0.6853 0.8535 0.9239
No log 0.8169 116 0.7785 0.6667 0.7785 0.8823
No log 0.8310 118 0.7406 0.6761 0.7406 0.8606
No log 0.8451 120 0.8261 0.6897 0.8261 0.9089
No log 0.8592 122 1.1443 0.6460 1.1443 1.0697
No log 0.8732 124 1.1418 0.6265 1.1418 1.0686
No log 0.8873 126 0.7777 0.6906 0.7777 0.8819
No log 0.9014 128 0.7095 0.6714 0.7095 0.8423
No log 0.9155 130 0.6919 0.6763 0.6919 0.8318
No log 0.9296 132 0.7606 0.6759 0.7606 0.8721
No log 0.9437 134 0.8155 0.6429 0.8155 0.9031
No log 0.9577 136 1.0032 0.6207 1.0032 1.0016
No log 0.9718 138 1.2154 0.6163 1.2154 1.1024
No log 0.9859 140 1.1739 0.6286 1.1739 1.0835
No log 1.0 142 0.9850 0.5850 0.9850 0.9925
No log 1.0141 144 0.8270 0.6471 0.8270 0.9094
No log 1.0282 146 0.8775 0.5827 0.8775 0.9367
No log 1.0423 148 1.0448 0.5 1.0448 1.0222
No log 1.0563 150 0.9313 0.5669 0.9313 0.9650
No log 1.0704 152 0.7772 0.7111 0.7772 0.8816
No log 1.0845 154 0.7969 0.6944 0.7969 0.8927
No log 1.0986 156 0.9625 0.6503 0.9625 0.9810
No log 1.1127 158 1.0623 0.5802 1.0623 1.0307
No log 1.1268 160 1.1238 0.5621 1.1238 1.0601
No log 1.1408 162 1.0758 0.5752 1.0758 1.0372
No log 1.1549 164 0.9453 0.6383 0.9453 0.9723
No log 1.1690 166 0.9043 0.6667 0.9043 0.9509
No log 1.1831 168 0.8852 0.6866 0.8852 0.9409
No log 1.1972 170 0.8945 0.6767 0.8945 0.9458
No log 1.2113 172 0.8840 0.6767 0.8840 0.9402
No log 1.2254 174 0.8416 0.6767 0.8416 0.9174
No log 1.2394 176 0.8233 0.6912 0.8233 0.9074
No log 1.2535 178 0.9326 0.6131 0.9326 0.9657
No log 1.2676 180 0.8633 0.6714 0.8633 0.9292
No log 1.2817 182 0.6636 0.7376 0.6636 0.8146
No log 1.2958 184 0.6366 0.7465 0.6366 0.7978
No log 1.3099 186 0.7369 0.7368 0.7369 0.8584
No log 1.3239 188 0.9651 0.6909 0.9651 0.9824
No log 1.3380 190 0.9963 0.6875 0.9963 0.9981
No log 1.3521 192 0.7861 0.7133 0.7861 0.8866
No log 1.3662 194 0.7103 0.7338 0.7103 0.8428
No log 1.3803 196 0.7664 0.6906 0.7664 0.8754
No log 1.3944 198 0.8363 0.6522 0.8363 0.9145
No log 1.4085 200 0.7786 0.7007 0.7786 0.8824
No log 1.4225 202 0.7184 0.7338 0.7184 0.8476
No log 1.4366 204 0.7322 0.7246 0.7322 0.8557
No log 1.4507 206 0.8465 0.6667 0.8465 0.9200
No log 1.4648 208 1.0289 0.6901 1.0289 1.0143
No log 1.4789 210 1.0655 0.6707 1.0655 1.0322
No log 1.4930 212 0.8407 0.6757 0.8407 0.9169
No log 1.5070 214 0.6871 0.7465 0.6871 0.8289
No log 1.5211 216 0.6527 0.7338 0.6527 0.8079
No log 1.5352 218 0.7111 0.7143 0.7111 0.8433
No log 1.5493 220 0.8232 0.6620 0.8232 0.9073
No log 1.5634 222 0.9748 0.6581 0.9748 0.9873
No log 1.5775 224 1.0200 0.64 1.0200 1.0100
No log 1.5915 226 0.8245 0.6765 0.8245 0.9080
No log 1.6056 228 0.7345 0.6667 0.7345 0.8570
No log 1.6197 230 0.7471 0.6462 0.7471 0.8643
No log 1.6338 232 0.7785 0.6815 0.7785 0.8823
No log 1.6479 234 0.9388 0.6569 0.9388 0.9689
No log 1.6620 236 1.1370 0.5629 1.1370 1.0663
No log 1.6761 238 1.2212 0.5848 1.2212 1.1051
No log 1.6901 240 1.0021 0.6277 1.0021 1.0011
No log 1.7042 242 0.7942 0.6765 0.7942 0.8912
No log 1.7183 244 0.7632 0.7007 0.7632 0.8736
No log 1.7324 246 0.7757 0.7007 0.7757 0.8808
No log 1.7465 248 0.7945 0.7 0.7945 0.8914
No log 1.7606 250 0.9010 0.6939 0.9010 0.9492
No log 1.7746 252 1.0728 0.6108 1.0728 1.0357
No log 1.7887 254 1.0200 0.6108 1.0200 1.0099
No log 1.8028 256 0.8034 0.7368 0.8034 0.8964
No log 1.8169 258 0.6089 0.7534 0.6089 0.7803
No log 1.8310 260 0.6021 0.7550 0.6021 0.7760
No log 1.8451 262 0.6012 0.7703 0.6012 0.7754
No log 1.8592 264 0.7217 0.7368 0.7217 0.8495
No log 1.8732 266 1.0149 0.6946 1.0149 1.0074
No log 1.8873 268 1.0403 0.6463 1.0403 1.0199
No log 1.9014 270 0.8279 0.6892 0.8279 0.9099
No log 1.9155 272 0.6793 0.6861 0.6793 0.8242
No log 1.9296 274 0.6536 0.7299 0.6536 0.8085
No log 1.9437 276 0.6515 0.7101 0.6515 0.8072
No log 1.9577 278 0.7315 0.7273 0.7315 0.8553
No log 1.9718 280 0.7849 0.7421 0.7849 0.8859
No log 1.9859 282 0.8734 0.725 0.8734 0.9346
No log 2.0 284 0.9189 0.7205 0.9189 0.9586
No log 2.0141 286 0.9101 0.7097 0.9101 0.9540
No log 2.0282 288 0.7407 0.7517 0.7407 0.8606
No log 2.0423 290 0.6352 0.7517 0.6352 0.7970
No log 2.0563 292 0.6208 0.7712 0.6208 0.7879
No log 2.0704 294 0.7312 0.7673 0.7312 0.8551
No log 2.0845 296 0.8722 0.7636 0.8722 0.9339
No log 2.0986 298 1.0898 0.6557 1.0898 1.0439
No log 2.1127 300 1.0478 0.6816 1.0478 1.0236
No log 2.1268 302 0.7731 0.7531 0.7731 0.8793
No log 2.1408 304 0.6597 0.7682 0.6597 0.8122
No log 2.1549 306 0.6815 0.7483 0.6815 0.8255
No log 2.1690 308 0.7788 0.7468 0.7788 0.8825
No log 2.1831 310 0.9556 0.7018 0.9556 0.9776
No log 2.1972 312 0.9133 0.7024 0.9133 0.9556
No log 2.2113 314 0.6924 0.7662 0.6924 0.8321
No log 2.2254 316 0.6057 0.7712 0.6057 0.7783
No log 2.2394 318 0.5963 0.7871 0.5963 0.7722
No log 2.2535 320 0.7134 0.7702 0.7134 0.8446
No log 2.2676 322 0.9065 0.7073 0.9065 0.9521
No log 2.2817 324 1.0710 0.6108 1.0710 1.0349
No log 2.2958 326 1.1588 0.5976 1.1588 1.0765
No log 2.3099 328 1.0963 0.5942 1.0963 1.0471
No log 2.3239 330 0.9696 0.6412 0.9696 0.9847
No log 2.3380 332 0.8990 0.5891 0.8990 0.9482
No log 2.3521 334 0.8471 0.6412 0.8471 0.9204
No log 2.3662 336 0.8273 0.6906 0.8273 0.9096
No log 2.3803 338 1.0452 0.6705 1.0452 1.0224
No log 2.3944 340 1.2672 0.6421 1.2672 1.1257
No log 2.4085 342 1.2685 0.6077 1.2685 1.1263
No log 2.4225 344 1.1100 0.6433 1.1100 1.0536
No log 2.4366 346 0.9373 0.6623 0.9373 0.9682
No log 2.4507 348 0.8566 0.7013 0.8566 0.9255
No log 2.4648 350 0.8342 0.7329 0.8342 0.9133
No log 2.4789 352 0.9425 0.7174 0.9425 0.9708
No log 2.4930 354 0.8611 0.7486 0.8611 0.9279
No log 2.5070 356 0.6908 0.7561 0.6908 0.8311
No log 2.5211 358 0.6092 0.7733 0.6092 0.7805
No log 2.5352 360 0.6602 0.7172 0.6602 0.8125
No log 2.5493 362 0.7983 0.7297 0.7983 0.8935
No log 2.5634 364 0.8725 0.7034 0.8725 0.9341
No log 2.5775 366 0.8632 0.6986 0.8632 0.9291
No log 2.5915 368 0.8669 0.6901 0.8669 0.9311
No log 2.6056 370 0.9404 0.6901 0.9404 0.9697
No log 2.6197 372 1.0554 0.6573 1.0554 1.0273
No log 2.6338 374 1.0046 0.6522 1.0046 1.0023
No log 2.6479 376 0.9364 0.6370 0.9364 0.9677
No log 2.6620 378 0.8061 0.6519 0.8061 0.8978
No log 2.6761 380 0.7437 0.6944 0.7437 0.8624
No log 2.6901 382 0.6612 0.7662 0.6612 0.8131
No log 2.7042 384 0.6822 0.7515 0.6822 0.8259
No log 2.7183 386 0.6410 0.7436 0.6410 0.8006
No log 2.7324 388 0.7268 0.7389 0.7268 0.8525
No log 2.7465 390 0.9828 0.6582 0.9828 0.9913
No log 2.7606 392 1.1733 0.6125 1.1733 1.0832
No log 2.7746 394 1.1273 0.5753 1.1273 1.0618
No log 2.7887 396 0.9834 0.6061 0.9834 0.9917
No log 2.8028 398 0.9045 0.6412 0.9045 0.9511
No log 2.8169 400 0.8326 0.6667 0.8326 0.9125
No log 2.8310 402 0.8196 0.6667 0.8196 0.9053
No log 2.8451 404 0.8480 0.6853 0.8480 0.9209
No log 2.8592 406 0.8029 0.6809 0.8029 0.8960
No log 2.8732 408 0.7360 0.6619 0.7360 0.8579
No log 2.8873 410 0.7211 0.6618 0.7211 0.8492
No log 2.9014 412 0.7010 0.6957 0.7010 0.8373
No log 2.9155 414 0.8245 0.6324 0.8245 0.9080
No log 2.9296 416 0.9585 0.5821 0.9585 0.9790
No log 2.9437 418 0.9803 0.6029 0.9803 0.9901
No log 2.9577 420 0.9369 0.6528 0.9369 0.9679
No log 2.9718 422 0.8050 0.6618 0.8050 0.8972
No log 2.9859 424 0.7714 0.6715 0.7714 0.8783
No log 3.0 426 0.8277 0.6667 0.8277 0.9098
No log 3.0141 428 0.9523 0.6755 0.9523 0.9759
No log 3.0282 430 0.9727 0.6621 0.9727 0.9863
No log 3.0423 432 0.8478 0.6567 0.8478 0.9208
No log 3.0563 434 0.8186 0.6515 0.8186 0.9048
No log 3.0704 436 0.8290 0.6567 0.8290 0.9105
No log 3.0845 438 0.8824 0.6383 0.8824 0.9394
No log 3.0986 440 0.9523 0.6755 0.9523 0.9759
No log 3.1127 442 0.8577 0.6846 0.8577 0.9261
No log 3.1268 444 0.7235 0.6715 0.7235 0.8506
No log 3.1408 446 0.6688 0.7111 0.6688 0.8178
No log 3.1549 448 0.6630 0.7286 0.6630 0.8143
No log 3.1690 450 0.6602 0.7448 0.6602 0.8125
No log 3.1831 452 0.6864 0.7222 0.6864 0.8285
No log 3.1972 454 0.6870 0.7222 0.6870 0.8288
No log 3.2113 456 0.6900 0.7092 0.6900 0.8307
No log 3.2254 458 0.7233 0.7092 0.7233 0.8505
No log 3.2394 460 0.7450 0.7 0.7450 0.8631
No log 3.2535 462 0.7205 0.7092 0.7205 0.8488
No log 3.2676 464 0.7241 0.7222 0.7241 0.8509
No log 3.2817 466 0.7497 0.7285 0.7497 0.8659
No log 3.2958 468 0.7321 0.7123 0.7321 0.8556
No log 3.3099 470 0.7818 0.7248 0.7818 0.8842
No log 3.3239 472 0.7074 0.7172 0.7074 0.8411
No log 3.3380 474 0.7045 0.7248 0.7045 0.8394
No log 3.3521 476 0.7711 0.7248 0.7711 0.8781
No log 3.3662 478 0.8419 0.7179 0.8419 0.9176
No log 3.3803 480 0.9553 0.6667 0.9553 0.9774
No log 3.3944 482 0.8928 0.7006 0.8928 0.9449
No log 3.4085 484 0.7417 0.7432 0.7417 0.8612
No log 3.4225 486 0.7014 0.7361 0.7014 0.8375
No log 3.4366 488 0.6406 0.7361 0.6406 0.8004
No log 3.4507 490 0.6290 0.7273 0.6290 0.7931
No log 3.4648 492 0.6741 0.7483 0.6741 0.8210
No log 3.4789 494 0.8367 0.725 0.8367 0.9147
No log 3.4930 496 0.8584 0.7152 0.8584 0.9265
No log 3.5070 498 0.7413 0.7532 0.7413 0.8610
0.4714 3.5211 500 0.6823 0.7586 0.6823 0.8260
0.4714 3.5352 502 0.6845 0.6906 0.6845 0.8273
0.4714 3.5493 504 0.6829 0.7123 0.6829 0.8264
0.4714 3.5634 506 0.6363 0.7763 0.6363 0.7977
0.4714 3.5775 508 0.6829 0.7329 0.6829 0.8264
0.4714 3.5915 510 0.8308 0.7442 0.8308 0.9115
0.4714 3.6056 512 0.9045 0.7416 0.9045 0.9511
0.4714 3.6197 514 0.7674 0.7514 0.7674 0.8760
0.4714 3.6338 516 0.6505 0.7516 0.6505 0.8065

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1