ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k14_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9199
  • Qwk: 0.3938
  • Mse: 0.9199
  • Rmse: 0.9591
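For reference, these metrics can be reproduced from model predictions with a short script. The sketch below uses hypothetical labels (not the actual evaluation set) and implements quadratic weighted kappa (Qwk), MSE, and RMSE in plain Python:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Qwk: agreement on ordinal labels, penalizing large disagreements quadratically."""
    n = len(y_true)
    # observed confusion matrix
    obs = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    # marginal histograms give the expected (chance-agreement) matrix
    hist_t = [sum(row) for row in obs]
    hist_p = [sum(obs[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement weight
            num += w * obs[i][j]
            den += w * hist_t[i] * hist_p[j] / n
    return 1.0 - num / den

# hypothetical gold scores and predictions, for illustration only
y_true = [0, 1, 2, 3, 2, 1]
y_pred = [0, 1, 1, 3, 2, 2]

mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
rmse = math.sqrt(mse)
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=4)
```

Note that MSE and RMSE treat the ordinal scores as numbers, while Qwk rewards chance-corrected agreement; this is why the two can move in different directions across epochs in the table below.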

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
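The linear scheduler decays the learning rate from 2e-05 toward zero over the course of training. A minimal sketch of that schedule, assuming no warmup steps (no warmup is listed above):

```python
def linear_lr(step, total_steps, base_lr=2e-05):
    """Learning rate under a warmup-free linear decay (lr_scheduler_type: linear)."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)
```

With warmup, the Transformers equivalent would be `get_linear_schedule_with_warmup`, which ramps up before this decay.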

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0282 2 4.2998 -0.0281 4.2998 2.0736
No log 0.0563 4 2.7012 -0.0650 2.7012 1.6435
No log 0.0845 6 3.1398 -0.0578 3.1398 1.7719
No log 0.1127 8 3.0660 -0.0447 3.0660 1.7510
No log 0.1408 10 1.3323 0.1880 1.3323 1.1542
No log 0.1690 12 1.1984 0.1211 1.1984 1.0947
No log 0.1972 14 1.8012 -0.0313 1.8012 1.3421
No log 0.2254 16 1.6899 -0.0321 1.6899 1.3000
No log 0.2535 18 1.3098 -0.0328 1.3098 1.1445
No log 0.2817 20 1.1111 0.1864 1.1111 1.0541
No log 0.3099 22 1.0798 0.2239 1.0798 1.0391
No log 0.3380 24 1.1068 0.2343 1.1068 1.0520
No log 0.3662 26 1.1142 0.1858 1.1142 1.0555
No log 0.3944 28 1.0371 0.2564 1.0371 1.0184
No log 0.4225 30 0.9931 0.1476 0.9931 0.9966
No log 0.4507 32 0.9998 0.1228 0.9998 0.9999
No log 0.4789 34 0.9668 0.1935 0.9668 0.9833
No log 0.5070 36 1.0133 0.2588 1.0133 1.0066
No log 0.5352 38 1.0807 0.2513 1.0807 1.0396
No log 0.5634 40 1.1777 0.1352 1.1777 1.0852
No log 0.5915 42 1.1826 0.1460 1.1826 1.0875
No log 0.6197 44 1.2770 0.1860 1.2770 1.1300
No log 0.6479 46 1.2462 0.1860 1.2462 1.1163
No log 0.6761 48 1.0407 0.2100 1.0407 1.0201
No log 0.7042 50 0.9096 0.3222 0.9096 0.9537
No log 0.7324 52 0.9588 0.1653 0.9588 0.9792
No log 0.7606 54 0.9598 0.1953 0.9598 0.9797
No log 0.7887 56 0.9424 0.3332 0.9424 0.9708
No log 0.8169 58 1.0696 0.2392 1.0696 1.0342
No log 0.8451 60 1.1534 0.1910 1.1534 1.0740
No log 0.8732 62 1.1028 0.2150 1.1028 1.0501
No log 0.9014 64 1.0707 0.2271 1.0707 1.0347
No log 0.9296 66 0.9982 0.3026 0.9982 0.9991
No log 0.9577 68 0.9433 0.3540 0.9433 0.9712
No log 0.9859 70 0.9525 0.2594 0.9525 0.9759
No log 1.0141 72 1.0188 0.3374 1.0188 1.0093
No log 1.0423 74 0.9941 0.3921 0.9941 0.9970
No log 1.0704 76 0.8922 0.2956 0.8922 0.9446
No log 1.0986 78 1.2215 0.3292 1.2215 1.1052
No log 1.1268 80 1.0770 0.4287 1.0770 1.0378
No log 1.1549 82 0.7272 0.5783 0.7272 0.8527
No log 1.1831 84 0.8447 0.4792 0.8447 0.9191
No log 1.2113 86 0.9841 0.4475 0.9841 0.9920
No log 1.2394 88 0.8486 0.4988 0.8486 0.9212
No log 1.2676 90 0.8009 0.5025 0.8009 0.8949
No log 1.2958 92 0.9507 0.4467 0.9507 0.9750
No log 1.3239 94 1.2772 0.3320 1.2772 1.1301
No log 1.3521 96 1.4643 0.2442 1.4643 1.2101
No log 1.3803 98 1.4654 0.2442 1.4654 1.2105
No log 1.4085 100 1.2510 0.3568 1.2510 1.1185
No log 1.4366 102 0.9897 0.4255 0.9897 0.9948
No log 1.4648 104 0.8661 0.4948 0.8661 0.9307
No log 1.4930 106 0.8365 0.5067 0.8365 0.9146
No log 1.5211 108 0.9025 0.5020 0.9025 0.9500
No log 1.5493 110 0.9763 0.3601 0.9763 0.9881
No log 1.5775 112 0.8708 0.3207 0.8708 0.9332
No log 1.6056 114 0.8140 0.3485 0.8140 0.9022
No log 1.6338 116 0.7807 0.3673 0.7807 0.8836
No log 1.6620 118 0.7869 0.4867 0.7869 0.8871
No log 1.6901 120 0.8120 0.4565 0.8120 0.9011
No log 1.7183 122 0.8972 0.4192 0.8972 0.9472
No log 1.7465 124 1.0326 0.3864 1.0326 1.0162
No log 1.7746 126 1.0732 0.3565 1.0732 1.0359
No log 1.8028 128 1.0364 0.3290 1.0364 1.0180
No log 1.8310 130 0.9493 0.2796 0.9493 0.9743
No log 1.8592 132 0.9138 0.3085 0.9138 0.9559
No log 1.8873 134 0.8945 0.4071 0.8945 0.9458
No log 1.9155 136 0.9089 0.4186 0.9089 0.9533
No log 1.9437 138 0.9086 0.4439 0.9086 0.9532
No log 1.9718 140 0.8297 0.4952 0.8297 0.9109
No log 2.0 142 0.7990 0.5434 0.7990 0.8938
No log 2.0282 144 0.7766 0.5345 0.7766 0.8813
No log 2.0563 146 0.7963 0.5065 0.7963 0.8924
No log 2.0845 148 0.7220 0.4980 0.7220 0.8497
No log 2.1127 150 0.6871 0.5377 0.6871 0.8289
No log 2.1408 152 0.6565 0.5703 0.6565 0.8102
No log 2.1690 154 0.6594 0.5459 0.6594 0.8121
No log 2.1972 156 0.6480 0.6673 0.6480 0.8050
No log 2.2254 158 0.7628 0.5138 0.7628 0.8734
No log 2.2535 160 0.7200 0.6133 0.7200 0.8485
No log 2.2817 162 0.7506 0.5387 0.7506 0.8664
No log 2.3099 164 0.8446 0.5052 0.8446 0.9190
No log 2.3380 166 0.8371 0.5331 0.8371 0.9149
No log 2.3662 168 0.8902 0.4723 0.8902 0.9435
No log 2.3944 170 1.0247 0.4371 1.0247 1.0123
No log 2.4225 172 1.2319 0.2881 1.2319 1.1099
No log 2.4507 174 1.3808 0.3005 1.3808 1.1751
No log 2.4789 176 1.1943 0.4031 1.1943 1.0929
No log 2.5070 178 0.8567 0.4533 0.8567 0.9256
No log 2.5352 180 0.7812 0.5202 0.7812 0.8839
No log 2.5634 182 0.9047 0.4413 0.9047 0.9512
No log 2.5915 184 1.2182 0.3601 1.2182 1.1037
No log 2.6197 186 1.3620 0.2572 1.3620 1.1670
No log 2.6479 188 1.1933 0.3475 1.1933 1.0924
No log 2.6761 190 0.8868 0.4375 0.8868 0.9417
No log 2.7042 192 0.8431 0.4694 0.8431 0.9182
No log 2.7324 194 0.8950 0.4468 0.8950 0.9461
No log 2.7606 196 0.7726 0.5470 0.7726 0.8790
No log 2.7887 198 0.7223 0.4966 0.7223 0.8499
No log 2.8169 200 0.6867 0.5627 0.6867 0.8286
No log 2.8451 202 0.6521 0.5042 0.6521 0.8075
No log 2.8732 204 0.6506 0.5889 0.6506 0.8066
No log 2.9014 206 0.6380 0.6276 0.6380 0.7988
No log 2.9296 208 0.6083 0.5822 0.6083 0.7799
No log 2.9577 210 0.5911 0.5845 0.5911 0.7688
No log 2.9859 212 0.5968 0.5759 0.5968 0.7725
No log 3.0141 214 0.6110 0.5542 0.6110 0.7817
No log 3.0423 216 0.6237 0.5809 0.6237 0.7897
No log 3.0704 218 0.7273 0.5231 0.7273 0.8528
No log 3.0986 220 0.8092 0.5119 0.8092 0.8996
No log 3.1268 222 0.7436 0.5048 0.7436 0.8623
No log 3.1549 224 0.6923 0.4363 0.6923 0.8320
No log 3.1831 226 0.6837 0.4626 0.6837 0.8269
No log 3.2113 228 0.7157 0.5305 0.7157 0.8460
No log 3.2394 230 0.8305 0.5007 0.8305 0.9113
No log 3.2676 232 0.9347 0.4867 0.9347 0.9668
No log 3.2958 234 0.8339 0.5216 0.8339 0.9132
No log 3.3239 236 0.6701 0.5220 0.6701 0.8186
No log 3.3521 238 0.6265 0.5503 0.6265 0.7915
No log 3.3803 240 0.6475 0.5774 0.6475 0.8047
No log 3.4085 242 0.7675 0.5475 0.7675 0.8761
No log 3.4366 244 0.8335 0.4889 0.8335 0.9129
No log 3.4648 246 0.8792 0.5 0.8792 0.9377
No log 3.4930 248 0.8388 0.3648 0.8388 0.9159
No log 3.5211 250 0.7404 0.4478 0.7404 0.8605
No log 3.5493 252 0.7038 0.4675 0.7038 0.8389
No log 3.5775 254 0.6964 0.5073 0.6964 0.8345
No log 3.6056 256 0.6732 0.5549 0.6732 0.8205
No log 3.6338 258 0.6824 0.5822 0.6824 0.8261
No log 3.6620 260 0.6909 0.5388 0.6909 0.8312
No log 3.6901 262 0.6659 0.5288 0.6659 0.8160
No log 3.7183 264 0.6868 0.4622 0.6868 0.8287
No log 3.7465 266 0.6673 0.5088 0.6673 0.8169
No log 3.7746 268 0.6391 0.5759 0.6391 0.7994
No log 3.8028 270 0.7213 0.5622 0.7213 0.8493
No log 3.8310 272 0.7436 0.5504 0.7436 0.8623
No log 3.8592 274 0.6587 0.5442 0.6587 0.8116
No log 3.8873 276 0.6022 0.5886 0.6022 0.7760
No log 3.9155 278 0.6100 0.5562 0.6100 0.7810
No log 3.9437 280 0.6059 0.5562 0.6059 0.7784
No log 3.9718 282 0.5902 0.6364 0.5902 0.7682
No log 4.0 284 0.5897 0.6491 0.5897 0.7679
No log 4.0282 286 0.6516 0.6008 0.6516 0.8072
No log 4.0563 288 0.6174 0.6177 0.6174 0.7857
No log 4.0845 290 0.5818 0.7025 0.5818 0.7627
No log 4.1127 292 0.6109 0.6036 0.6109 0.7816
No log 4.1408 294 0.6225 0.6047 0.6225 0.7890
No log 4.1690 296 0.5948 0.5666 0.5948 0.7713
No log 4.1972 298 0.6165 0.6397 0.6165 0.7852
No log 4.2254 300 0.6186 0.6043 0.6186 0.7865
No log 4.2535 302 0.5786 0.6479 0.5786 0.7607
No log 4.2817 304 0.5554 0.6779 0.5554 0.7452
No log 4.3099 306 0.5437 0.6788 0.5437 0.7373
No log 4.3380 308 0.5628 0.7122 0.5628 0.7502
No log 4.3662 310 0.6026 0.6014 0.6026 0.7763
No log 4.3944 312 0.6068 0.6165 0.6068 0.7790
No log 4.4225 314 0.6129 0.5422 0.6129 0.7829
No log 4.4507 316 0.6193 0.5332 0.6193 0.7870
No log 4.4789 318 0.6033 0.5469 0.6033 0.7767
No log 4.5070 320 0.5689 0.5796 0.5689 0.7543
No log 4.5352 322 0.5856 0.6360 0.5856 0.7652
No log 4.5634 324 0.5951 0.6360 0.5951 0.7714
No log 4.5915 326 0.5925 0.6360 0.5925 0.7697
No log 4.6197 328 0.6327 0.5989 0.6327 0.7954
No log 4.6479 330 0.6578 0.5522 0.6578 0.8111
No log 4.6761 332 0.6507 0.5195 0.6507 0.8066
No log 4.7042 334 0.6385 0.5346 0.6385 0.7991
No log 4.7324 336 0.6204 0.5583 0.6204 0.7877
No log 4.7606 338 0.5972 0.5436 0.5972 0.7728
No log 4.7887 340 0.6372 0.5348 0.6372 0.7982
No log 4.8169 342 0.6647 0.5504 0.6647 0.8153
No log 4.8451 344 0.6676 0.5788 0.6676 0.8171
No log 4.8732 346 0.6294 0.5810 0.6294 0.7934
No log 4.9014 348 0.6369 0.5708 0.6369 0.7981
No log 4.9296 350 0.6517 0.5844 0.6517 0.8073
No log 4.9577 352 0.6443 0.5835 0.6443 0.8027
No log 4.9859 354 0.6689 0.5618 0.6689 0.8179
No log 5.0141 356 0.6783 0.5246 0.6783 0.8236
No log 5.0423 358 0.6816 0.5375 0.6816 0.8256
No log 5.0704 360 0.6863 0.5361 0.6863 0.8284
No log 5.0986 362 0.6573 0.5786 0.6573 0.8107
No log 5.1268 364 0.6083 0.5722 0.6083 0.7799
No log 5.1549 366 0.6075 0.6701 0.6075 0.7794
No log 5.1831 368 0.6054 0.6479 0.6054 0.7781
No log 5.2113 370 0.6449 0.5959 0.6449 0.8031
No log 5.2394 372 0.7726 0.5266 0.7726 0.8790
No log 5.2676 374 0.8502 0.5147 0.8502 0.9221
No log 5.2958 376 0.8208 0.5048 0.8208 0.9060
No log 5.3239 378 0.7293 0.3959 0.7293 0.8540
No log 5.3521 380 0.6893 0.4760 0.6893 0.8303
No log 5.3803 382 0.7159 0.5634 0.7159 0.8461
No log 5.4085 384 0.7456 0.5410 0.7456 0.8635
No log 5.4366 386 0.7569 0.4368 0.7569 0.8700
No log 5.4648 388 0.7890 0.3840 0.7890 0.8883
No log 5.4930 390 0.8097 0.3702 0.8097 0.8998
No log 5.5211 392 0.7939 0.4110 0.7939 0.8910
No log 5.5493 394 0.7977 0.4368 0.7977 0.8931
No log 5.5775 396 0.7960 0.4237 0.7960 0.8922
No log 5.6056 398 0.7715 0.3556 0.7715 0.8784
No log 5.6338 400 0.7395 0.4241 0.7395 0.8599
No log 5.6620 402 0.7050 0.4781 0.7050 0.8397
No log 5.6901 404 0.6664 0.4810 0.6664 0.8163
No log 5.7183 406 0.6521 0.5386 0.6521 0.8075
No log 5.7465 408 0.6662 0.6519 0.6662 0.8162
No log 5.7746 410 0.6960 0.5933 0.6960 0.8343
No log 5.8028 412 0.6825 0.5959 0.6825 0.8261
No log 5.8310 414 0.6422 0.5735 0.6422 0.8014
No log 5.8592 416 0.6533 0.5329 0.6533 0.8083
No log 5.8873 418 0.6631 0.5568 0.6631 0.8143
No log 5.9155 420 0.6312 0.5771 0.6312 0.7945
No log 5.9437 422 0.6187 0.6407 0.6187 0.7866
No log 5.9718 424 0.6236 0.6758 0.6236 0.7897
No log 6.0 426 0.6428 0.6445 0.6428 0.8018
No log 6.0282 428 0.6505 0.6441 0.6505 0.8065
No log 6.0563 430 0.6257 0.5659 0.6257 0.7910
No log 6.0845 432 0.6131 0.5692 0.6131 0.7830
No log 6.1127 434 0.6285 0.5464 0.6285 0.7928
No log 6.1408 436 0.6445 0.5464 0.6445 0.8028
No log 6.1690 438 0.6779 0.5343 0.6779 0.8233
No log 6.1972 440 0.8027 0.5367 0.8027 0.8959
No log 6.2254 442 0.7857 0.5367 0.7857 0.8864
No log 6.2535 444 0.6878 0.5993 0.6878 0.8294
No log 6.2817 446 0.6490 0.5701 0.6490 0.8056
No log 6.3099 448 0.6944 0.5472 0.6944 0.8333
No log 6.3380 450 0.6695 0.5156 0.6695 0.8182
No log 6.3662 452 0.6420 0.5185 0.6420 0.8012
No log 6.3944 454 0.6381 0.5329 0.6381 0.7988
No log 6.4225 456 0.6547 0.5581 0.6547 0.8092
No log 6.4507 458 0.6320 0.6108 0.6320 0.7950
No log 6.4789 460 0.6176 0.6249 0.6176 0.7858
No log 6.5070 462 0.6521 0.6197 0.6521 0.8075
No log 6.5352 464 0.7124 0.6325 0.7124 0.8441
No log 6.5634 466 0.7341 0.5565 0.7341 0.8568
No log 6.5915 468 0.7405 0.5565 0.7405 0.8605
No log 6.6197 470 0.6604 0.5833 0.6604 0.8126
No log 6.6479 472 0.6428 0.5845 0.6428 0.8017
No log 6.6761 474 0.6532 0.6133 0.6532 0.8082
No log 6.7042 476 0.6814 0.6325 0.6814 0.8255
No log 6.7324 478 0.6571 0.6325 0.6571 0.8106
No log 6.7606 480 0.6564 0.6133 0.6564 0.8102
No log 6.7887 482 0.6406 0.5932 0.6406 0.8004
No log 6.8169 484 0.6339 0.5048 0.6339 0.7962
No log 6.8451 486 0.6437 0.5180 0.6437 0.8023
No log 6.8732 488 0.6762 0.5477 0.6762 0.8223
No log 6.9014 490 0.7779 0.5048 0.7779 0.8820
No log 6.9296 492 0.8323 0.5128 0.8323 0.9123
No log 6.9577 494 0.8256 0.5119 0.8256 0.9087
No log 6.9859 496 0.7905 0.5242 0.7905 0.8891
No log 7.0141 498 0.7967 0.5455 0.7967 0.8926
0.3478 7.0423 500 0.8458 0.5318 0.8458 0.9197
0.3478 7.0704 502 0.9304 0.5198 0.9304 0.9646
0.3478 7.0986 504 0.9308 0.4787 0.9308 0.9648
0.3478 7.1268 506 0.8182 0.4696 0.8182 0.9045
0.3478 7.1549 508 0.6913 0.5472 0.6913 0.8314
0.3478 7.1831 510 0.6310 0.6479 0.6310 0.7944
0.3478 7.2113 512 0.6220 0.6345 0.6220 0.7886
0.3478 7.2394 514 0.6179 0.6239 0.6179 0.7861
0.3478 7.2676 516 0.6239 0.5954 0.6239 0.7899
0.3478 7.2958 518 0.6357 0.5929 0.6357 0.7973
0.3478 7.3239 520 0.6307 0.6405 0.6307 0.7941
0.3478 7.3521 522 0.6058 0.6664 0.6058 0.7783
0.3478 7.3803 524 0.5924 0.5955 0.5924 0.7697
0.3478 7.4085 526 0.6313 0.5118 0.6313 0.7945
0.3478 7.4366 528 0.6297 0.5361 0.6297 0.7935
0.3478 7.4648 530 0.6155 0.5692 0.6155 0.7845
0.3478 7.4930 532 0.6185 0.6265 0.6185 0.7865
0.3478 7.5211 534 0.6288 0.5771 0.6288 0.7930
0.3478 7.5493 536 0.6358 0.5859 0.6358 0.7974
0.3478 7.5775 538 0.6458 0.6025 0.6458 0.8036
0.3478 7.6056 540 0.6579 0.6438 0.6579 0.8111
0.3478 7.6338 542 0.6711 0.6404 0.6711 0.8192
0.3478 7.6620 544 0.6694 0.6404 0.6694 0.8182
0.3478 7.6901 546 0.6403 0.6128 0.6403 0.8002
0.3478 7.7183 548 0.6309 0.6405 0.6309 0.7943
0.3478 7.7465 550 0.6127 0.6164 0.6127 0.7828
0.3478 7.7746 552 0.6130 0.6087 0.6130 0.7829
0.3478 7.8028 554 0.6046 0.5301 0.6046 0.7776
0.3478 7.8310 556 0.6133 0.6186 0.6133 0.7831
0.3478 7.8592 558 0.6337 0.5735 0.6337 0.7960
0.3478 7.8873 560 0.6333 0.5735 0.6333 0.7958
0.3478 7.9155 562 0.6306 0.5943 0.6306 0.7941
0.3478 7.9437 564 0.6541 0.6165 0.6541 0.8088
0.3478 7.9718 566 0.6702 0.5960 0.6702 0.8186
0.3478 8.0 568 0.7049 0.5259 0.7049 0.8396
0.3478 8.0282 570 0.7561 0.5076 0.7561 0.8695
0.3478 8.0563 572 0.8592 0.5254 0.8592 0.9269
0.3478 8.0845 574 0.9547 0.4681 0.9547 0.9771
0.3478 8.1127 576 0.9199 0.3938 0.9199 0.9591

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
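Assuming the checkpoint is hosted on the Hugging Face Hub under the repo id in the title, it can be loaded like any sequence-classification checkpoint. A minimal sketch (the function is illustrative and requires network access when called):

```python
def load_scorer(
    repo_id="MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k14_task5_organization",
):
    """Sketch: fetch the tokenizer and classification model from the Hub."""
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForSequenceClassification.from_pretrained(repo_id)
    return tokenizer, model
```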
Model size: ~0.1B parameters (safetensors, F32 tensors)

Model: MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k14_task5_organization (fine-tuned from aubmindlab/bert-base-arabertv02)