ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k20_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the training dataset is not specified in the card). It achieves the following results on the evaluation set:

  • Loss: 0.8474
  • Qwk: 0.6755
  • Mse: 0.8474
  • Rmse: 0.9205
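These four numbers are related: the loss is mean squared error (which is why Loss and Mse match), Rmse is its square root, and Qwk is quadratic weighted kappa, the usual agreement metric for ordinal essay scores. A minimal sketch of the standard definitions (not taken from the card's training code):

```python
import math
from collections import Counter

def regression_metrics(y_true, y_pred):
    """Return (mse, rmse) for two equal-length score sequences."""
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)

def quadratic_weighted_kappa(y_true, y_pred):
    """QWK on integer labels: 1.0 is perfect agreement, 0.0 is chance level."""
    lo = min(min(y_true), min(y_pred))
    hi = max(max(y_true), max(y_pred))
    span = max(hi - lo, 1)
    n = len(y_true)
    observed = Counter(zip(y_true, y_pred))      # joint label counts
    hist_t, hist_p = Counter(y_true), Counter(y_pred)
    num = den = 0.0
    for i in range(lo, hi + 1):
        for j in range(lo, hi + 1):
            w = ((i - j) / span) ** 2            # quadratic disagreement weight
            num += w * observed[(i, j)]
            den += w * hist_t[i] * hist_p[j] / n  # expected under independence
    return 1.0 - num / den if den else 1.0
```

As a sanity check on the reported values, sqrt(0.8474) ≈ 0.9205, matching the Rmse row.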

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
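With lr_scheduler_type: linear, the learning rate decays linearly from 2e-05 to 0 over the whole run. A small sketch of that schedule, assuming the Trainer default of zero warmup steps (the card does not state a warmup value):

```python
# Sketch of the linear schedule above; warmup_steps=0 is an assumption
# (the Hugging Face Trainer default), not stated in the card.
LEARNING_RATE = 2e-5
NUM_EPOCHS = 100

def linear_lr(step, total_steps, base_lr=LEARNING_RATE, warmup_steps=0):
    """Linear warmup (if any) followed by linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(warmup_steps, 1)
    remaining = max(total_steps - step, 0)
    return base_lr * remaining / max(total_steps - warmup_steps, 1)
```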

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0133 2 6.8266 0.0242 6.8266 2.6128
No log 0.0267 4 4.8530 0.0766 4.8530 2.2030
No log 0.04 6 2.9493 0.0988 2.9493 1.7174
No log 0.0533 8 2.2233 0.0993 2.2233 1.4911
No log 0.0667 10 1.8017 0.1682 1.8017 1.3423
No log 0.08 12 1.7983 0.1333 1.7983 1.3410
No log 0.0933 14 1.6554 0.1165 1.6554 1.2866
No log 0.1067 16 1.5070 0.3214 1.5070 1.2276
No log 0.12 18 1.5488 0.2931 1.5488 1.2445
No log 0.1333 20 1.4648 0.3607 1.4648 1.2103
No log 0.1467 22 1.3624 0.4320 1.3624 1.1672
No log 0.16 24 1.3952 0.5116 1.3952 1.1812
No log 0.1733 26 1.5626 0.4426 1.5626 1.2500
No log 0.1867 28 1.5615 0.1524 1.5615 1.2496
No log 0.2 30 1.5652 0.2143 1.5652 1.2511
No log 0.2133 32 1.6385 0.3871 1.6385 1.2800
No log 0.2267 34 1.5251 0.3770 1.5251 1.2349
No log 0.24 36 1.3923 0.1930 1.3923 1.1800
No log 0.2533 38 1.2271 0.4000 1.2271 1.1077
No log 0.2667 40 1.1188 0.4237 1.1188 1.0577
No log 0.28 42 1.1927 0.4959 1.1927 1.0921
No log 0.2933 44 1.2980 0.5203 1.2980 1.1393
No log 0.3067 46 1.4134 0.3968 1.4134 1.1889
No log 0.32 48 1.3696 0.4496 1.3696 1.1703
No log 0.3333 50 1.3825 0.4580 1.3825 1.1758
No log 0.3467 52 1.3618 0.4627 1.3618 1.1670
No log 0.36 54 1.1965 0.5 1.1965 1.0939
No log 0.3733 56 1.0657 0.6471 1.0657 1.0323
No log 0.3867 58 0.8768 0.6569 0.8768 0.9364
No log 0.4 60 0.7663 0.6957 0.7663 0.8754
No log 0.4133 62 0.7998 0.7246 0.7998 0.8943
No log 0.4267 64 1.0132 0.6087 1.0132 1.0066
No log 0.44 66 1.2162 0.5270 1.2162 1.1028
No log 0.4533 68 1.4747 0.4533 1.4747 1.2144
No log 0.4667 70 1.4891 0.4533 1.4891 1.2203
No log 0.48 72 1.3142 0.4823 1.3142 1.1464
No log 0.4933 74 0.9482 0.6212 0.9482 0.9737
No log 0.5067 76 0.7634 0.7101 0.7634 0.8737
No log 0.52 78 0.7793 0.7310 0.7793 0.8828
No log 0.5333 80 0.7686 0.6906 0.7686 0.8767
No log 0.5467 82 0.8474 0.6763 0.8474 0.9205
No log 0.56 84 1.0242 0.6131 1.0242 1.0120
No log 0.5733 86 1.2046 0.5401 1.2046 1.0975
No log 0.5867 88 1.2215 0.5 1.2215 1.1052
No log 0.6 90 1.1165 0.6225 1.1165 1.0566
No log 0.6133 92 0.8561 0.7059 0.8561 0.9253
No log 0.6267 94 0.8312 0.7574 0.8312 0.9117
No log 0.64 96 0.9185 0.7108 0.9185 0.9584
No log 0.6533 98 0.8183 0.6879 0.8183 0.9046
No log 0.6667 100 0.8876 0.6711 0.8876 0.9421
No log 0.68 102 1.0979 0.4928 1.0979 1.0478
No log 0.6933 104 1.2099 0.5732 1.2099 1.0999
No log 0.7067 106 0.9873 0.6497 0.9873 0.9936
No log 0.72 108 0.8860 0.5899 0.8860 0.9413
No log 0.7333 110 1.0238 0.5507 1.0238 1.0118
No log 0.7467 112 1.1634 0.5987 1.1634 1.0786
No log 0.76 114 1.1134 0.5844 1.1134 1.0552
No log 0.7733 116 0.9873 0.6351 0.9873 0.9936
No log 0.7867 118 0.9408 0.6577 0.9408 0.9699
No log 0.8 120 0.9523 0.6667 0.9523 0.9759
No log 0.8133 122 1.1038 0.6429 1.1038 1.0506
No log 0.8267 124 1.2042 0.6243 1.2042 1.0974
No log 0.84 126 1.4057 0.5990 1.4057 1.1856
No log 0.8533 128 1.1806 0.6413 1.1806 1.0866
No log 0.8667 130 0.8946 0.7176 0.8946 0.9458
No log 0.88 132 0.6894 0.7976 0.6894 0.8303
No log 0.8933 134 0.7021 0.7329 0.7021 0.8379
No log 0.9067 136 0.8589 0.6324 0.8589 0.9268
No log 0.92 138 0.9827 0.6165 0.9827 0.9913
No log 0.9333 140 0.9552 0.6277 0.9552 0.9773
No log 0.9467 142 1.0351 0.7059 1.0351 1.0174
No log 0.96 144 0.8556 0.7219 0.8556 0.9250
No log 0.9733 146 0.7263 0.7375 0.7263 0.8522
No log 0.9867 148 0.6047 0.7722 0.6047 0.7776
No log 1.0 150 0.5819 0.7826 0.5819 0.7628
No log 1.0133 152 0.5712 0.7821 0.5712 0.7558
No log 1.0267 154 0.6654 0.7484 0.6654 0.8157
No log 1.04 156 0.7659 0.7296 0.7659 0.8752
No log 1.0533 158 1.0290 0.6358 1.0290 1.0144
No log 1.0667 160 1.1959 0.6222 1.1959 1.0936
No log 1.08 162 1.3508 0.5621 1.3508 1.1622
No log 1.0933 164 1.2088 0.5672 1.2088 1.0995
No log 1.1067 166 1.1937 0.5079 1.1937 1.0926
No log 1.12 168 1.2800 0.3667 1.2800 1.1314
No log 1.1333 170 1.0248 0.5669 1.0248 1.0123
No log 1.1467 172 0.9045 0.6358 0.9045 0.9511
No log 1.16 174 1.2217 0.6597 1.2217 1.1053
No log 1.1733 176 1.5651 0.5849 1.5651 1.2510
No log 1.1867 178 1.5721 0.6132 1.5721 1.2539
No log 1.2 180 1.2614 0.6465 1.2614 1.1231
No log 1.2133 182 1.1654 0.6701 1.1654 1.0795
No log 1.2267 184 0.8216 0.7391 0.8216 0.9064
No log 1.24 186 0.7276 0.7692 0.7276 0.8530
No log 1.2533 188 0.7173 0.7640 0.7173 0.8469
No log 1.2667 190 0.7054 0.7636 0.7054 0.8399
No log 1.28 192 0.7399 0.7784 0.7399 0.8602
No log 1.2933 194 0.8103 0.7160 0.8103 0.9002
No log 1.3067 196 0.9796 0.6705 0.9796 0.9898
No log 1.32 198 0.7813 0.7160 0.7813 0.8839
No log 1.3333 200 0.6344 0.7468 0.6344 0.7965
No log 1.3467 202 0.5867 0.7901 0.5867 0.7660
No log 1.3600 204 0.6376 0.7730 0.6376 0.7985
No log 1.3733 206 0.7659 0.7561 0.7659 0.8752
No log 1.3867 208 1.1168 0.6344 1.1168 1.0568
No log 1.4 210 1.4734 0.6020 1.4734 1.2138
No log 1.4133 212 1.3266 0.6413 1.3266 1.1518
No log 1.4267 214 0.8286 0.6711 0.8286 0.9103
No log 1.44 216 0.6260 0.7660 0.6260 0.7912
No log 1.4533 218 0.6368 0.7143 0.6368 0.7980
No log 1.4667 220 0.8845 0.6667 0.8845 0.9405
No log 1.48 222 1.1699 0.6966 1.1699 1.0816
No log 1.4933 224 1.0510 0.7052 1.0510 1.0252
No log 1.5067 226 0.9292 0.6879 0.9292 0.9640
No log 1.52 228 0.7568 0.6883 0.7568 0.8700
No log 1.5333 230 0.6279 0.7517 0.6279 0.7924
No log 1.5467 232 0.6394 0.7333 0.6394 0.7996
No log 1.56 234 0.7951 0.6571 0.7951 0.8917
No log 1.5733 236 1.1892 0.6405 1.1892 1.0905
No log 1.5867 238 1.4869 0.5562 1.4869 1.2194
No log 1.6 240 1.5331 0.5244 1.5331 1.2382
No log 1.6133 242 1.3542 0.5395 1.3542 1.1637
No log 1.6267 244 1.0663 0.5874 1.0663 1.0326
No log 1.6400 246 0.7957 0.6892 0.7957 0.8920
No log 1.6533 248 0.6099 0.7821 0.6099 0.7810
No log 1.6667 250 0.5756 0.7702 0.5756 0.7587
No log 1.6800 252 0.6688 0.8046 0.6688 0.8178
No log 1.6933 254 0.9720 0.6885 0.9720 0.9859
No log 1.7067 256 1.0827 0.6774 1.0827 1.0405
No log 1.72 258 0.8945 0.6989 0.8945 0.9458
No log 1.7333 260 0.6202 0.8068 0.6202 0.7875
No log 1.7467 262 0.5058 0.8049 0.5058 0.7112
No log 1.76 264 0.5271 0.8075 0.5271 0.7260
No log 1.7733 266 0.6548 0.75 0.6548 0.8092
No log 1.7867 268 0.8819 0.6531 0.8819 0.9391
No log 1.8 270 1.1418 0.6289 1.1418 1.0686
No log 1.8133 272 1.2434 0.6182 1.2434 1.1151
No log 1.8267 274 1.1409 0.5912 1.1409 1.0681
No log 1.8400 276 0.9301 0.6849 0.9301 0.9644
No log 1.8533 278 0.8016 0.6901 0.8016 0.8953
No log 1.8667 280 0.6536 0.7632 0.6536 0.8084
No log 1.88 282 0.5723 0.8199 0.5723 0.7565
No log 1.8933 284 0.5913 0.8121 0.5913 0.7690
No log 1.9067 286 0.6485 0.7673 0.6485 0.8053
No log 1.92 288 0.6150 0.7950 0.6150 0.7842
No log 1.9333 290 0.6061 0.7925 0.6061 0.7785
No log 1.9467 292 0.6506 0.775 0.6506 0.8066
No log 1.96 294 0.6823 0.7692 0.6823 0.8260
No log 1.9733 296 0.7108 0.7470 0.7108 0.8431
No log 1.9867 298 0.8011 0.7176 0.8011 0.8951
No log 2.0 300 0.9948 0.6860 0.9948 0.9974
No log 2.0133 302 0.8861 0.6824 0.8861 0.9413
No log 2.0267 304 0.6749 0.7722 0.6749 0.8215
No log 2.04 306 0.6695 0.7722 0.6695 0.8183
No log 2.0533 308 0.6967 0.7453 0.6967 0.8347
No log 2.0667 310 0.7322 0.7470 0.7322 0.8557
No log 2.08 312 0.8029 0.7296 0.8029 0.8961
No log 2.0933 314 0.7965 0.7059 0.7965 0.8925
No log 2.1067 316 0.8377 0.6842 0.8377 0.9153
No log 2.12 318 0.7670 0.6621 0.7670 0.8758
No log 2.1333 320 0.6366 0.7895 0.6366 0.7978
No log 2.1467 322 0.5927 0.7895 0.5927 0.7699
No log 2.16 324 0.5774 0.8025 0.5774 0.7598
No log 2.1733 326 0.7448 0.7416 0.7448 0.8630
No log 2.1867 328 0.9456 0.6854 0.9456 0.9724
No log 2.2 330 0.8458 0.6901 0.8458 0.9197
No log 2.2133 332 0.6754 0.7027 0.6754 0.8218
No log 2.2267 334 0.6561 0.7297 0.6561 0.8100
No log 2.24 336 0.7688 0.6667 0.7688 0.8768
No log 2.2533 338 0.8810 0.6795 0.8810 0.9386
No log 2.2667 340 0.9075 0.6795 0.9075 0.9526
No log 2.2800 342 0.7885 0.7037 0.7885 0.8880
No log 2.2933 344 0.6429 0.7397 0.6429 0.8018
No log 2.3067 346 0.6045 0.7432 0.6045 0.7775
No log 2.32 348 0.6240 0.7310 0.6240 0.7900
No log 2.3333 350 0.7600 0.6579 0.7600 0.8718
No log 2.3467 352 0.8621 0.6790 0.8621 0.9285
No log 2.36 354 0.7486 0.6968 0.7486 0.8652
No log 2.3733 356 0.6529 0.7222 0.6529 0.8080
No log 2.3867 358 0.6164 0.7397 0.6164 0.7851
No log 2.4 360 0.5702 0.7534 0.5702 0.7551
No log 2.4133 362 0.5714 0.7564 0.5714 0.7559
No log 2.4267 364 0.5835 0.7547 0.5835 0.7638
No log 2.44 366 0.7176 0.7470 0.7176 0.8471
No log 2.4533 368 1.0080 0.6739 1.0080 1.0040
No log 2.4667 370 1.0876 0.6704 1.0876 1.0429
No log 2.48 372 0.9539 0.6857 0.9539 0.9767
No log 2.4933 374 0.6598 0.7484 0.6598 0.8123
No log 2.5067 376 0.5604 0.7742 0.5604 0.7486
No log 2.52 378 0.5590 0.7799 0.5590 0.7476
No log 2.5333 380 0.6043 0.7643 0.6043 0.7774
No log 2.5467 382 0.7124 0.7602 0.7124 0.8441
No log 2.56 384 0.7735 0.7412 0.7735 0.8795
No log 2.5733 386 0.7810 0.7574 0.7810 0.8837
No log 2.5867 388 0.7224 0.7020 0.7224 0.8500
No log 2.6 390 0.7078 0.7324 0.7078 0.8413
No log 2.6133 392 0.7639 0.7234 0.7639 0.8740
No log 2.6267 394 0.8021 0.6615 0.8021 0.8956
No log 2.64 396 0.7938 0.6818 0.7938 0.8910
No log 2.6533 398 0.7952 0.6466 0.7952 0.8917
No log 2.6667 400 0.8300 0.6577 0.8300 0.9110
No log 2.68 402 0.8722 0.6788 0.8722 0.9339
No log 2.6933 404 1.0489 0.6667 1.0489 1.0242
No log 2.7067 406 0.9543 0.7027 0.9543 0.9769
No log 2.7200 408 0.7035 0.7914 0.7035 0.8387
No log 2.7333 410 0.5402 0.8229 0.5402 0.7350
No log 2.7467 412 0.5050 0.8421 0.5050 0.7106
No log 2.76 414 0.5733 0.8 0.5733 0.7572
No log 2.7733 416 0.6249 0.8092 0.6249 0.7905
No log 2.7867 418 0.5942 0.8182 0.5942 0.7708
No log 2.8 420 0.6695 0.8111 0.6695 0.8182
No log 2.8133 422 0.6577 0.8111 0.6577 0.8110
No log 2.8267 424 0.5998 0.8023 0.5998 0.7745
No log 2.84 426 0.5608 0.7931 0.5608 0.7489
No log 2.8533 428 0.5721 0.7952 0.5721 0.7564
No log 2.8667 430 0.6387 0.7516 0.6387 0.7992
No log 2.88 432 0.6376 0.7467 0.6376 0.7985
No log 2.8933 434 0.6199 0.7467 0.6199 0.7873
No log 2.9067 436 0.6396 0.7248 0.6396 0.7998
No log 2.92 438 0.6425 0.7407 0.6425 0.8016
No log 2.9333 440 0.6101 0.7929 0.6101 0.7811
No log 2.9467 442 0.6456 0.7711 0.6456 0.8035
No log 2.96 444 0.7249 0.75 0.7249 0.8514
No log 2.9733 446 0.7325 0.7250 0.7325 0.8559
No log 2.9867 448 0.8328 0.7152 0.8328 0.9126
No log 3.0 450 0.8657 0.7073 0.8657 0.9304
No log 3.0133 452 0.7026 0.7389 0.7026 0.8382
No log 3.0267 454 0.6178 0.7517 0.6178 0.7860
No log 3.04 456 0.6585 0.7152 0.6585 0.8115
No log 3.0533 458 0.7085 0.7134 0.7085 0.8417
No log 3.0667 460 0.6560 0.6974 0.6560 0.8099
No log 3.08 462 0.6345 0.7619 0.6345 0.7965
No log 3.0933 464 0.6062 0.7683 0.6062 0.7786
No log 3.1067 466 0.6480 0.7317 0.6480 0.8050
No log 3.12 468 0.6483 0.7261 0.6483 0.8052
No log 3.1333 470 0.6273 0.7417 0.6273 0.7920
No log 3.1467 472 0.6556 0.7407 0.6556 0.8097
No log 3.16 474 0.6520 0.7407 0.6520 0.8075
No log 3.1733 476 0.6645 0.7425 0.6645 0.8152
No log 3.1867 478 0.7388 0.7425 0.7388 0.8595
No log 3.2 480 0.8569 0.6826 0.8569 0.9257
No log 3.2133 482 0.8507 0.6826 0.8507 0.9223
No log 3.2267 484 0.8642 0.6667 0.8642 0.9296
No log 3.24 486 0.7953 0.7362 0.7953 0.8918
No log 3.2533 488 0.7401 0.7226 0.7401 0.8603
No log 3.2667 490 0.7419 0.7105 0.7419 0.8613
No log 3.2800 492 0.8423 0.7013 0.8423 0.9178
No log 3.2933 494 0.8544 0.6928 0.8544 0.9243
No log 3.3067 496 0.7916 0.6667 0.7916 0.8897
No log 3.32 498 0.6828 0.7383 0.6828 0.8263
0.4522 3.3333 500 0.6589 0.75 0.6589 0.8118
0.4522 3.3467 502 0.6530 0.7625 0.6530 0.8081
0.4522 3.36 504 0.7325 0.7470 0.7325 0.8559
0.4522 3.3733 506 0.7944 0.7362 0.7944 0.8913
0.4522 3.3867 508 0.9245 0.7073 0.9245 0.9615
0.4522 3.4 510 0.9718 0.7073 0.9718 0.9858
0.4522 3.4133 512 0.8474 0.6755 0.8474 0.9205
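Two reading aids for the table, both inferred from it rather than stated in the card: epoch 1.0 is logged at step 150, so there are 150 optimizer steps per epoch, which with train_batch_size 8 implies roughly 1,200 training examples; and "No log" means the running training loss is only reported every 500 steps, so the first value (0.4522) appears at step 500.

```python
# Back-of-the-envelope figures inferred from the table (assumptions,
# not values stated in the card).
TRAIN_BATCH_SIZE = 8
STEPS_PER_EPOCH = 150           # epoch 1.0 is logged at step 150

approx_train_examples = STEPS_PER_EPOCH * TRAIN_BATCH_SIZE  # ~1200 rows

def epoch_at(step, steps_per_epoch=STEPS_PER_EPOCH):
    """Fractional epoch for a given optimizer step, as shown in the table."""
    return step / steps_per_epoch
```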

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params, F32 tensor type (Safetensors)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k20_task1_organization

Finetuned from aubmindlab/bert-base-arabertv02