ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k11_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the training dataset is not specified in this card). It achieves the following results on the evaluation set:

  • Loss: 0.7969
  • Qwk: 0.6466
  • Mse: 0.7969
  • Rmse: 0.8927
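
Not part of the original card: a minimal pure-Python sketch of how the three evaluation metrics above (Qwk, Mse, Rmse) are typically computed for ordinal essay-scoring labels. The quadratic-weighted-kappa formula here matches the standard definition used by `sklearn.metrics.cohen_kappa_score(weights="quadratic")`; the label sequences are illustrative only.

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa (QWK) between two integer label sequences."""
    n = len(y_true)
    # Observed co-occurrence matrix
    obs = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    # Marginal histograms -> expected matrix under independence
    hist_t = [sum(row) for row in obs]
    hist_p = [sum(obs[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement weight
            num += w * obs[i][j]
            den += w * hist_t[i] * hist_p[j] / n
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root over label sequences."""
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)
```

Because Mse is computed directly on the predicted class indices, Rmse is always its square root, which is why the card's Loss and Mse columns coincide (the model is evaluated with an MSE-style objective).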

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
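
Not part of the original card: a sketch of how the hyperparameters above map onto the Hugging Face `Trainer` API. Dataset loading, the model head, and metric wiring are omitted since they are not documented here; `output_dir` is an arbitrary choice.

```python
from transformers import TrainingArguments

# Configuration fragment mirroring the listed hyperparameters.
training_args = TrainingArguments(
    output_dir="outputs",            # assumption: any local path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08, as listed above
    # (these are also the Trainer defaults):
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```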

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0241 2 6.5801 0.0308 6.5801 2.5652
No log 0.0482 4 4.1965 0.1000 4.1965 2.0485
No log 0.0723 6 2.6665 0.0633 2.6665 1.6329
No log 0.0964 8 2.3861 0.1111 2.3861 1.5447
No log 0.1205 10 1.7013 0.1835 1.7013 1.3043
No log 0.1446 12 1.8045 0.1538 1.8045 1.3433
No log 0.1687 14 1.8781 0.1538 1.8781 1.3704
No log 0.1928 16 1.8509 0.1165 1.8509 1.3605
No log 0.2169 18 1.7216 0.1165 1.7216 1.3121
No log 0.2410 20 1.5982 0.1165 1.5982 1.2642
No log 0.2651 22 1.4068 0.1538 1.4068 1.1861
No log 0.2892 24 1.3393 0.2243 1.3393 1.1573
No log 0.3133 26 1.4211 0.3478 1.4211 1.1921
No log 0.3373 28 1.6120 0.4355 1.6120 1.2696
No log 0.3614 30 1.6261 0.3906 1.6261 1.2752
No log 0.3855 32 1.5182 0.4065 1.5182 1.2321
No log 0.4096 34 1.4776 0.3934 1.4776 1.2156
No log 0.4337 36 1.1216 0.4615 1.1216 1.0590
No log 0.4578 38 1.0251 0.5085 1.0251 1.0125
No log 0.4819 40 1.0523 0.5085 1.0523 1.0258
No log 0.5060 42 1.1449 0.5041 1.1449 1.0700
No log 0.5301 44 1.1707 0.5238 1.1707 1.0820
No log 0.5542 46 1.1686 0.5354 1.1686 1.0810
No log 0.5783 48 1.1376 0.6202 1.1376 1.0666
No log 0.6024 50 1.0778 0.5649 1.0778 1.0382
No log 0.6265 52 1.0294 0.5397 1.0294 1.0146
No log 0.6506 54 1.0567 0.5781 1.0567 1.0279
No log 0.6747 56 0.9339 0.6406 0.9339 0.9664
No log 0.6988 58 0.8027 0.6718 0.8027 0.8960
No log 0.7229 60 0.7763 0.6615 0.7763 0.8811
No log 0.7470 62 0.9191 0.6617 0.9191 0.9587
No log 0.7711 64 0.9921 0.6176 0.9921 0.9960
No log 0.7952 66 0.9823 0.5909 0.9823 0.9911
No log 0.8193 68 1.1934 0.4463 1.1934 1.0924
No log 0.8434 70 1.5235 0.3025 1.5235 1.2343
No log 0.8675 72 1.1931 0.4298 1.1931 1.0923
No log 0.8916 74 0.9616 0.6338 0.9616 0.9806
No log 0.9157 76 0.9820 0.6225 0.9820 0.9910
No log 0.9398 78 0.8674 0.6765 0.8674 0.9313
No log 0.9639 80 0.8580 0.6857 0.8580 0.9263
No log 0.9880 82 0.9119 0.6623 0.9119 0.9549
No log 1.0120 84 1.0269 0.6389 1.0269 1.0133
No log 1.0361 86 1.0643 0.5778 1.0643 1.0316
No log 1.0602 88 1.0531 0.6131 1.0531 1.0262
No log 1.0843 90 1.1428 0.5906 1.1428 1.0690
No log 1.1084 92 1.0356 0.6 1.0356 1.0176
No log 1.1325 94 0.8132 0.6957 0.8132 0.9018
No log 1.1566 96 0.9047 0.6061 0.9047 0.9512
No log 1.1807 98 1.0205 0.5522 1.0205 1.0102
No log 1.2048 100 0.9127 0.6061 0.9127 0.9554
No log 1.2289 102 0.8974 0.6308 0.8974 0.9473
No log 1.2530 104 0.9530 0.6260 0.9530 0.9762
No log 1.2771 106 0.9711 0.5891 0.9711 0.9855
No log 1.3012 108 0.9507 0.5581 0.9507 0.9750
No log 1.3253 110 0.8897 0.6154 0.8897 0.9432
No log 1.3494 112 0.9780 0.6939 0.9780 0.9889
No log 1.3735 114 1.2749 0.6163 1.2749 1.1291
No log 1.3976 116 1.2572 0.6228 1.2572 1.1212
No log 1.4217 118 1.0069 0.6579 1.0069 1.0035
No log 1.4458 120 0.8452 0.7067 0.8452 0.9194
No log 1.4699 122 0.8199 0.7248 0.8199 0.9055
No log 1.4940 124 0.8397 0.7051 0.8397 0.9164
No log 1.5181 126 0.7904 0.7133 0.7904 0.8891
No log 1.5422 128 0.7689 0.7 0.7689 0.8768
No log 1.5663 130 0.8325 0.6906 0.8325 0.9124
No log 1.5904 132 0.8990 0.6667 0.8990 0.9482
No log 1.6145 134 0.8143 0.6806 0.8143 0.9024
No log 1.6386 136 0.8387 0.6759 0.8387 0.9158
No log 1.6627 138 0.8647 0.6986 0.8647 0.9299
No log 1.6867 140 0.8145 0.6620 0.8145 0.9025
No log 1.7108 142 0.8281 0.7114 0.8281 0.9100
No log 1.7349 144 0.8615 0.6928 0.8615 0.9282
No log 1.7590 146 0.8404 0.6928 0.8404 0.9167
No log 1.7831 148 0.7863 0.6892 0.7863 0.8867
No log 1.8072 150 0.8560 0.7044 0.8560 0.9252
No log 1.8313 152 0.8040 0.7105 0.8040 0.8966
No log 1.8554 154 0.8516 0.6883 0.8516 0.9228
No log 1.8795 156 0.7664 0.7075 0.7664 0.8754
No log 1.9036 158 0.6741 0.7183 0.6741 0.8210
No log 1.9277 160 0.7238 0.6806 0.7238 0.8508
No log 1.9518 162 0.8484 0.6667 0.8484 0.9211
No log 1.9759 164 0.7977 0.6761 0.7977 0.8931
No log 2.0 166 0.7538 0.6986 0.7538 0.8682
No log 2.0241 168 0.9242 0.6871 0.9242 0.9613
No log 2.0482 170 1.2971 0.6327 1.2971 1.1389
No log 2.0723 172 1.2454 0.625 1.2454 1.1160
No log 2.0964 174 0.9588 0.5865 0.9588 0.9792
No log 2.1205 176 0.7777 0.7059 0.7777 0.8819
No log 2.1446 178 0.6579 0.7234 0.6579 0.8111
No log 2.1687 180 0.6555 0.7692 0.6555 0.8096
No log 2.1928 182 0.8059 0.7630 0.8059 0.8977
No log 2.2169 184 1.0330 0.6630 1.0330 1.0164
No log 2.2410 186 0.9442 0.7232 0.9442 0.9717
No log 2.2651 188 0.7608 0.7619 0.7608 0.8722
No log 2.2892 190 0.6648 0.7811 0.6648 0.8153
No log 2.3133 192 0.6931 0.7684 0.6931 0.8325
No log 2.3373 194 0.6647 0.7791 0.6647 0.8153
No log 2.3614 196 0.7274 0.7647 0.7274 0.8529
No log 2.3855 198 0.9951 0.6818 0.9951 0.9975
No log 2.4096 200 1.1119 0.6477 1.1119 1.0545
No log 2.4337 202 0.9380 0.7018 0.9380 0.9685
No log 2.4578 204 0.7738 0.6806 0.7738 0.8797
No log 2.4819 206 0.7586 0.6331 0.7586 0.8710
No log 2.5060 208 0.7958 0.6962 0.7958 0.8920
No log 2.5301 210 0.8970 0.7066 0.8970 0.9471
No log 2.5542 212 0.9003 0.7176 0.9003 0.9489
No log 2.5783 214 0.8100 0.6933 0.8100 0.9000
No log 2.6024 216 0.7746 0.6901 0.7746 0.8801
No log 2.6265 218 0.8221 0.7186 0.8221 0.9067
No log 2.6506 220 0.8251 0.6849 0.8251 0.9083
No log 2.6747 222 0.8748 0.6853 0.8748 0.9353
No log 2.6988 224 0.9498 0.6056 0.9498 0.9746
No log 2.7229 226 0.8972 0.6165 0.8972 0.9472
No log 2.7470 228 0.8906 0.6316 0.8906 0.9437
No log 2.7711 230 0.9559 0.6061 0.9559 0.9777
No log 2.7952 232 0.9681 0.5846 0.9681 0.9839
No log 2.8193 234 0.9625 0.6119 0.9625 0.9811
No log 2.8434 236 0.9473 0.64 0.9473 0.9733
No log 2.8675 238 0.8948 0.6809 0.8948 0.9460
No log 2.8916 240 0.8042 0.6714 0.8042 0.8968
No log 2.9157 242 0.7784 0.6618 0.7784 0.8823
No log 2.9398 244 0.8003 0.6667 0.8003 0.8946
No log 2.9639 246 0.8129 0.6567 0.8129 0.9016
No log 2.9880 248 0.7794 0.6519 0.7794 0.8829
No log 3.0120 250 0.7893 0.6567 0.7893 0.8884
No log 3.0361 252 0.8941 0.6525 0.8941 0.9456
No log 3.0602 254 1.1390 0.6310 1.1390 1.0672
No log 3.0843 256 1.0912 0.6627 1.0912 1.0446
No log 3.1084 258 0.8347 0.6994 0.8347 0.9136
No log 3.1325 260 0.6996 0.7413 0.6996 0.8364
No log 3.1566 262 0.6916 0.7101 0.6916 0.8316
No log 3.1807 264 0.7574 0.6993 0.7574 0.8703
No log 3.2048 266 0.9699 0.6914 0.9699 0.9848
No log 3.2289 268 1.0453 0.6707 1.0453 1.0224
No log 3.2530 270 0.9270 0.6569 0.9270 0.9628
No log 3.2771 272 0.8470 0.6866 0.8470 0.9203
No log 3.3012 274 0.8409 0.6615 0.8409 0.9170
No log 3.3253 276 0.8650 0.6667 0.8650 0.9301
No log 3.3494 278 0.9897 0.6528 0.9897 0.9948
No log 3.3735 280 1.0301 0.6456 1.0301 1.0150
No log 3.3976 282 0.8783 0.6582 0.8783 0.9372
No log 3.4217 284 0.6735 0.75 0.6735 0.8207
No log 3.4458 286 0.6242 0.7755 0.6242 0.7901
No log 3.4699 288 0.6667 0.7034 0.6667 0.8165
No log 3.4940 290 0.6462 0.7586 0.6462 0.8039
No log 3.5181 292 0.6623 0.7763 0.6623 0.8138
No log 3.5422 294 0.8222 0.6951 0.8222 0.9067
No log 3.5663 296 0.9423 0.6747 0.9423 0.9707
No log 3.5904 298 0.8785 0.6667 0.8785 0.9373
No log 3.6145 300 0.7387 0.7391 0.7387 0.8595
No log 3.6386 302 0.7542 0.6815 0.7542 0.8685
No log 3.6627 304 0.7885 0.6471 0.7885 0.8880
No log 3.6867 306 0.8275 0.6963 0.8275 0.9097
No log 3.7108 308 0.9422 0.6462 0.9422 0.9707
No log 3.7349 310 1.1564 0.6303 1.1564 1.0754
No log 3.7590 312 1.3432 0.6047 1.3432 1.1589
No log 3.7831 314 1.2731 0.6190 1.2731 1.1283
No log 3.8072 316 1.0642 0.6667 1.0642 1.0316
No log 3.8313 318 0.9176 0.6047 0.9176 0.9579
No log 3.8554 320 0.9492 0.5806 0.9492 0.9743
No log 3.8795 322 0.9398 0.5854 0.9398 0.9694
No log 3.9036 324 0.9237 0.5781 0.9237 0.9611
No log 3.9277 326 0.9203 0.6565 0.9203 0.9593
No log 3.9518 328 0.9061 0.6812 0.9061 0.9519
No log 3.9759 330 0.8251 0.7183 0.8251 0.9083
No log 4.0 332 0.7238 0.7361 0.7238 0.8508
No log 4.0241 334 0.6535 0.7660 0.6535 0.8084
No log 4.0482 336 0.6447 0.7376 0.6447 0.8029
No log 4.0723 338 0.6582 0.7639 0.6582 0.8113
No log 4.0964 340 0.7106 0.7285 0.7106 0.8430
No log 4.1205 342 0.7064 0.7248 0.7064 0.8405
No log 4.1446 344 0.7547 0.7286 0.7547 0.8688
No log 4.1687 346 0.8093 0.7299 0.8093 0.8996
No log 4.1928 348 0.9469 0.6483 0.9469 0.9731
No log 4.2169 350 1.0172 0.6483 1.0172 1.0085
No log 4.2410 352 0.9652 0.6331 0.9652 0.9825
No log 4.2651 354 0.8429 0.6466 0.8429 0.9181
No log 4.2892 356 0.8133 0.6466 0.8133 0.9018
No log 4.3133 358 0.9019 0.6331 0.9019 0.9497
No log 4.3373 360 0.9548 0.6622 0.9548 0.9771
No log 4.3614 362 1.0182 0.6452 1.0182 1.0091
No log 4.3855 364 0.9808 0.6755 0.9808 0.9904
No log 4.4096 366 0.8618 0.6569 0.8618 0.9283
No log 4.4337 368 0.8284 0.6299 0.8284 0.9102
No log 4.4578 370 0.8419 0.6406 0.8419 0.9175
No log 4.4819 372 0.8136 0.6308 0.8136 0.9020
No log 4.5060 374 0.8051 0.6809 0.8051 0.8973
No log 4.5301 376 0.8269 0.6839 0.8269 0.9093
No log 4.5542 378 0.8460 0.6839 0.8460 0.9198
No log 4.5783 380 0.8361 0.6619 0.8361 0.9144
No log 4.6024 382 0.8387 0.6466 0.8387 0.9158
No log 4.6265 384 0.8524 0.6212 0.8524 0.9233
No log 4.6506 386 0.9238 0.6712 0.9238 0.9612
No log 4.6747 388 1.0823 0.6380 1.0823 1.0403
No log 4.6988 390 1.0694 0.6380 1.0694 1.0341
No log 4.7229 392 0.9307 0.6711 0.9307 0.9647
No log 4.7470 394 0.8233 0.7194 0.8233 0.9074
No log 4.7711 396 0.7902 0.7429 0.7902 0.8889
No log 4.7952 398 0.8170 0.6849 0.8170 0.9039
No log 4.8193 400 0.8924 0.6842 0.8924 0.9447
No log 4.8434 402 0.9401 0.6667 0.9401 0.9696
No log 4.8675 404 0.8889 0.6573 0.8889 0.9428
No log 4.8916 406 0.8321 0.6815 0.8321 0.9122
No log 4.9157 408 0.8087 0.6875 0.8087 0.8993
No log 4.9398 410 0.7743 0.6562 0.7743 0.8800
No log 4.9639 412 0.7307 0.7101 0.7307 0.8548
No log 4.9880 414 0.7239 0.7429 0.7239 0.8508
No log 5.0120 416 0.7676 0.7105 0.7676 0.8761
No log 5.0361 418 0.8315 0.6711 0.8315 0.9118
No log 5.0602 420 0.9004 0.6533 0.9004 0.9489
No log 5.0843 422 0.8672 0.6714 0.8672 0.9312
No log 5.1084 424 0.8211 0.7015 0.8211 0.9062
No log 5.1325 426 0.7867 0.7068 0.7867 0.8870
No log 5.1566 428 0.7592 0.7164 0.7592 0.8713
No log 5.1807 430 0.8316 0.7172 0.8316 0.9119
No log 5.2048 432 0.9738 0.6369 0.9738 0.9868
No log 5.2289 434 0.9812 0.6364 0.9812 0.9906
No log 5.2530 436 0.9353 0.6241 0.9353 0.9671
No log 5.2771 438 0.8648 0.6853 0.8648 0.9299
No log 5.3012 440 0.7657 0.7183 0.7657 0.8750
No log 5.3253 442 0.7047 0.7338 0.7047 0.8395
No log 5.3494 444 0.6946 0.7483 0.6946 0.8334
No log 5.3735 446 0.7226 0.7552 0.7226 0.8500
No log 5.3976 448 0.7161 0.7194 0.7161 0.8462
No log 5.4217 450 0.7442 0.7246 0.7442 0.8627
No log 5.4458 452 0.8063 0.7007 0.8063 0.8979
No log 5.4699 454 0.8563 0.6715 0.8563 0.9254
No log 5.4940 456 0.8680 0.6853 0.8680 0.9317
No log 5.5181 458 0.8537 0.7034 0.8537 0.9240
No log 5.5422 460 0.8086 0.7083 0.8086 0.8992
No log 5.5663 462 0.7907 0.6963 0.7907 0.8892
No log 5.5904 464 0.7925 0.6617 0.7925 0.8902
No log 5.6145 466 0.8401 0.6849 0.8401 0.9166
No log 5.6386 468 0.8943 0.6755 0.8943 0.9457
No log 5.6627 470 0.8818 0.6753 0.8818 0.9390
No log 5.6867 472 0.7974 0.6887 0.7974 0.8930
No log 5.7108 474 0.7087 0.7397 0.7087 0.8419
No log 5.7349 476 0.6816 0.7483 0.6816 0.8256
No log 5.7590 478 0.7387 0.7190 0.7387 0.8595
No log 5.7831 480 0.9314 0.6941 0.9314 0.9651
No log 5.8072 482 0.9990 0.6941 0.9990 0.9995
No log 5.8313 484 0.8444 0.6709 0.8444 0.9189
No log 5.8554 486 0.7033 0.7324 0.7033 0.8386
No log 5.8795 488 0.6849 0.7259 0.6849 0.8276
No log 5.9036 490 0.6926 0.7324 0.6926 0.8322
No log 5.9277 492 0.7351 0.7568 0.7351 0.8574
No log 5.9518 494 0.7940 0.7294 0.7940 0.8911
No log 5.9759 496 0.8147 0.7294 0.8147 0.9026
No log 6.0 498 0.7582 0.75 0.7582 0.8707
0.4211 6.0241 500 0.7116 0.7403 0.7116 0.8436
0.4211 6.0482 502 0.7432 0.7179 0.7432 0.8621
0.4211 6.0723 504 0.7962 0.7089 0.7962 0.8923
0.4211 6.0964 506 0.8054 0.6755 0.8054 0.8975
0.4211 6.1205 508 0.7679 0.6812 0.7679 0.8763
0.4211 6.1446 510 0.7969 0.6466 0.7969 0.8927

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
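
Not part of the original card: a usage sketch for loading this checkpoint with the framework versions above. It assumes the model carries a sequence-classification head (consistent with the discrete Qwk scores reported); the input text is a placeholder.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k11_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "..."  # an Arabic essay to score for organization
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
predicted_score = logits.argmax(dim=-1).item()
```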