ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k14_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the dataset name is not recorded in the card metadata). It achieves the following results on the evaluation set:

  • Loss: 0.6146
  • QWK (quadratic weighted kappa): 0.4355
  • MSE: 0.6146
  • RMSE: 0.7839
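The exact metric implementation used during evaluation is not shown in the card; below is a minimal sketch of quadratic weighted kappa (Qwk) and RMSE, assuming the organization scores are integer ratings. Note that RMSE is simply the square root of MSE (0.7839 ≈ √0.6146), and Loss equals MSE here, which suggests the model was trained as a regressor with an MSE objective.

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic disagreement weights."""
    # Observed co-occurrence matrix of true vs. predicted ratings
    O = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        O[t, p] += 1
    # Quadratic penalty: grows with the squared distance between ratings
    idx = np.arange(n_classes)
    W = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    # Expected matrix under chance agreement (outer product of marginals)
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / O.sum()
    return 1.0 - (W * O).sum() / (W * E).sum()

def rmse(y_true, y_pred):
    """Root mean squared error between rating vectors."""
    diff = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean(diff ** 2)))
```

Perfect agreement yields a kappa of 1.0, chance-level agreement 0.0, and systematic disagreement goes negative, which matches the negative Qwk values seen in the early training steps below.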

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
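The hyperparameters above map directly onto a `transformers.TrainingArguments` configuration. This is a hedged sketch for reference only; the actual training script, `output_dir`, and any evaluation/saving strategy are not included in the card and are assumed here.

```python
from transformers import TrainingArguments

# Assumed reconstruction from the listed hyperparameters;
# output_dir is a placeholder, not from the original run.
training_args = TrainingArguments(
    output_dir="out",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```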

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0278 2 2.6145 -0.0788 2.6145 1.6169
No log 0.0556 4 1.2130 0.1262 1.2130 1.1014
No log 0.0833 6 0.9105 -0.0354 0.9105 0.9542
No log 0.1111 8 0.8843 0.0181 0.8843 0.9404
No log 0.1389 10 0.7405 0.2621 0.7405 0.8605
No log 0.1667 12 0.8038 0.3008 0.8038 0.8965
No log 0.1944 14 0.8554 0.2259 0.8554 0.9249
No log 0.2222 16 0.7879 0.1093 0.7879 0.8876
No log 0.25 18 0.9499 0.1649 0.9499 0.9746
No log 0.2778 20 0.8965 0.2368 0.8965 0.9468
No log 0.3056 22 0.8602 0.1617 0.8602 0.9275
No log 0.3333 24 1.1533 0.3296 1.1533 1.0739
No log 0.3611 26 1.1104 0.3225 1.1104 1.0538
No log 0.3889 28 0.9306 0.1777 0.9306 0.9647
No log 0.4167 30 0.7849 0.0851 0.7849 0.8859
No log 0.4444 32 0.7478 0.1660 0.7478 0.8648
No log 0.4722 34 0.7377 0.0236 0.7377 0.8589
No log 0.5 36 0.7915 0.2116 0.7915 0.8897
No log 0.5278 38 0.8317 0.1962 0.8317 0.9120
No log 0.5556 40 0.9884 0.2756 0.9884 0.9942
No log 0.5833 42 0.9128 0.2463 0.9128 0.9554
No log 0.6111 44 0.7826 0.2352 0.7826 0.8846
No log 0.6389 46 0.7176 0.1093 0.7176 0.8471
No log 0.6667 48 0.7144 0.0679 0.7144 0.8452
No log 0.6944 50 0.7351 0.1550 0.7351 0.8574
No log 0.7222 52 0.7271 0.1007 0.7271 0.8527
No log 0.75 54 0.7384 0.1007 0.7384 0.8593
No log 0.7778 56 0.7433 -0.0517 0.7433 0.8621
No log 0.8056 58 0.7509 -0.0517 0.7509 0.8665
No log 0.8333 60 0.7667 0.2407 0.7667 0.8756
No log 0.8611 62 0.8776 0.3169 0.8776 0.9368
No log 0.8889 64 0.9496 0.2995 0.9496 0.9745
No log 0.9167 66 0.9386 0.2810 0.9386 0.9688
No log 0.9444 68 0.7813 0.2527 0.7813 0.8839
No log 0.9722 70 0.7480 0.1550 0.7480 0.8649
No log 1.0 72 0.9255 0.3359 0.9255 0.9620
No log 1.0278 74 1.0628 0.2682 1.0628 1.0309
No log 1.0556 76 0.9153 0.4067 0.9153 0.9567
No log 1.0833 78 0.9083 0.4175 0.9083 0.9530
No log 1.1111 80 0.9691 0.3669 0.9691 0.9845
No log 1.1389 82 0.8939 0.3953 0.8939 0.9455
No log 1.1667 84 0.8229 0.3653 0.8229 0.9072
No log 1.1944 86 0.8064 0.3248 0.8064 0.8980
No log 1.2222 88 0.8676 0.3586 0.8676 0.9314
No log 1.25 90 1.0392 0.3082 1.0392 1.0194
No log 1.2778 92 0.9989 0.3137 0.9989 0.9995
No log 1.3056 94 0.8878 0.3562 0.8878 0.9422
No log 1.3333 96 0.8708 0.3497 0.8708 0.9331
No log 1.3611 98 0.9362 0.3782 0.9362 0.9676
No log 1.3889 100 1.1663 0.2857 1.1663 1.0800
No log 1.4167 102 1.1098 0.2900 1.1098 1.0535
No log 1.4444 104 0.7921 0.3260 0.7921 0.8900
No log 1.4722 106 0.8115 0.3902 0.8115 0.9009
No log 1.5 108 0.8220 0.3808 0.8220 0.9066
No log 1.5278 110 0.8009 0.2690 0.8009 0.8949
No log 1.5556 112 1.1743 0.2552 1.1743 1.0836
No log 1.5833 114 1.3256 0.2514 1.3256 1.1513
No log 1.6111 116 1.1002 0.2799 1.1002 1.0489
No log 1.6389 118 0.7919 0.3940 0.7919 0.8899
No log 1.6667 120 0.6954 0.2488 0.6954 0.8339
No log 1.6944 122 0.8112 0.2444 0.8112 0.9007
No log 1.7222 124 0.8312 0.2203 0.8312 0.9117
No log 1.75 126 0.7270 0.2580 0.7270 0.8526
No log 1.7778 128 0.6836 0.2353 0.6836 0.8268
No log 1.8056 130 0.7598 0.3302 0.7598 0.8717
No log 1.8333 132 0.8032 0.3425 0.8032 0.8962
No log 1.8611 134 0.7702 0.2967 0.7702 0.8776
No log 1.8889 136 0.7201 0.2995 0.7201 0.8486
No log 1.9167 138 0.7394 0.3261 0.7394 0.8599
No log 1.9444 140 0.7546 0.3261 0.7546 0.8687
No log 1.9722 142 0.7272 0.2294 0.7272 0.8528
No log 2.0 144 0.7280 0.2652 0.7280 0.8532
No log 2.0278 146 0.7450 0.2440 0.7450 0.8631
No log 2.0556 148 0.7376 0.2182 0.7376 0.8589
No log 2.0833 150 0.7751 0.3329 0.7751 0.8804
No log 2.1111 152 0.7845 0.3475 0.7845 0.8857
No log 2.1389 154 0.7334 0.2809 0.7334 0.8564
No log 2.1667 156 0.7299 0.3239 0.7299 0.8544
No log 2.1944 158 0.7231 0.3504 0.7231 0.8503
No log 2.2222 160 0.7172 0.3738 0.7172 0.8469
No log 2.25 162 0.8432 0.3520 0.8432 0.9182
No log 2.2778 164 0.7885 0.3891 0.7885 0.8880
No log 2.3056 166 0.6794 0.3787 0.6794 0.8242
No log 2.3333 168 0.6754 0.3677 0.6754 0.8218
No log 2.3611 170 0.6983 0.3287 0.6983 0.8356
No log 2.3889 172 0.7570 0.3940 0.7570 0.8701
No log 2.4167 174 0.6917 0.4014 0.6917 0.8317
No log 2.4444 176 0.6079 0.4124 0.6079 0.7797
No log 2.4722 178 0.5853 0.3702 0.5853 0.7650
No log 2.5 180 0.5981 0.4262 0.5981 0.7733
No log 2.5278 182 0.6081 0.3704 0.6081 0.7798
No log 2.5556 184 0.5753 0.4569 0.5753 0.7585
No log 2.5833 186 0.6712 0.4329 0.6712 0.8193
No log 2.6111 188 0.7220 0.4329 0.7220 0.8497
No log 2.6389 190 0.7115 0.4329 0.7115 0.8435
No log 2.6667 192 0.6197 0.4190 0.6197 0.7872
No log 2.6944 194 0.5986 0.4020 0.5986 0.7737
No log 2.7222 196 0.6236 0.3723 0.6236 0.7897
No log 2.75 198 0.6944 0.4307 0.6944 0.8333
No log 2.7778 200 0.6497 0.4067 0.6497 0.8060
No log 2.8056 202 0.5798 0.4291 0.5798 0.7615
No log 2.8333 204 0.6011 0.4012 0.6011 0.7753
No log 2.8611 206 0.6738 0.3528 0.6738 0.8208
No log 2.8889 208 0.6468 0.4391 0.6468 0.8042
No log 2.9167 210 0.6208 0.4681 0.6208 0.7879
No log 2.9444 212 0.7434 0.4587 0.7434 0.8622
No log 2.9722 214 0.8265 0.4133 0.8265 0.9091
No log 3.0 216 0.8071 0.4064 0.8071 0.8984
No log 3.0278 218 0.7367 0.4808 0.7367 0.8583
No log 3.0556 220 0.6871 0.4827 0.6871 0.8289
No log 3.0833 222 0.7040 0.4808 0.7040 0.8391
No log 3.1111 224 0.6399 0.4602 0.6399 0.7999
No log 3.1389 226 0.5804 0.4914 0.5804 0.7619
No log 3.1667 228 0.6030 0.4149 0.6030 0.7765
No log 3.1944 230 0.6084 0.3069 0.6084 0.7800
No log 3.2222 232 0.5745 0.4657 0.5745 0.7580
No log 3.25 234 0.5629 0.5056 0.5629 0.7503
No log 3.2778 236 0.5655 0.5488 0.5655 0.7520
No log 3.3056 238 0.6044 0.4149 0.6044 0.7774
No log 3.3333 240 0.6272 0.4118 0.6272 0.7920
No log 3.3611 242 0.6058 0.3934 0.6058 0.7784
No log 3.3889 244 0.5866 0.4828 0.5866 0.7659
No log 3.4167 246 0.6392 0.4728 0.6392 0.7995
No log 3.4444 248 0.6180 0.4978 0.6180 0.7862
No log 3.4722 250 0.5874 0.4253 0.5874 0.7664
No log 3.5 252 0.6149 0.3586 0.6149 0.7842
No log 3.5278 254 0.6168 0.3586 0.6168 0.7854
No log 3.5556 256 0.5977 0.4096 0.5977 0.7731
No log 3.5833 258 0.5576 0.4423 0.5576 0.7467
No log 3.6111 260 0.5869 0.5485 0.5869 0.7661
No log 3.6389 262 0.5964 0.5239 0.5964 0.7723
No log 3.6667 264 0.5693 0.4555 0.5693 0.7545
No log 3.6944 266 0.5909 0.4963 0.5909 0.7687
No log 3.7222 268 0.6175 0.5200 0.6175 0.7858
No log 3.75 270 0.6132 0.4980 0.6132 0.7831
No log 3.7778 272 0.6297 0.4935 0.6297 0.7935
No log 3.8056 274 0.6224 0.4261 0.6224 0.7889
No log 3.8333 276 0.5941 0.4029 0.5941 0.7708
No log 3.8611 278 0.6530 0.5184 0.6530 0.8081
No log 3.8889 280 0.6850 0.4911 0.6850 0.8277
No log 3.9167 282 0.6339 0.4124 0.6339 0.7962
No log 3.9444 284 0.6061 0.3862 0.6061 0.7785
No log 3.9722 286 0.6152 0.3577 0.6152 0.7844
No log 4.0 288 0.6350 0.3788 0.6350 0.7968
No log 4.0278 290 0.6810 0.4703 0.6810 0.8252
No log 4.0556 292 0.6689 0.4484 0.6689 0.8178
No log 4.0833 294 0.6038 0.3995 0.6038 0.7770
No log 4.1111 296 0.5919 0.3910 0.5919 0.7694
No log 4.1389 298 0.5872 0.3474 0.5872 0.7663
No log 4.1667 300 0.6099 0.3782 0.6099 0.7810
No log 4.1944 302 0.6495 0.3312 0.6495 0.8059
No log 4.2222 304 0.7144 0.4884 0.7144 0.8452
No log 4.25 306 0.7164 0.4977 0.7164 0.8464
No log 4.2778 308 0.6831 0.3238 0.6831 0.8265
No log 4.3056 310 0.6689 0.3020 0.6689 0.8179
No log 4.3333 312 0.6425 0.3782 0.6425 0.8016
No log 4.3611 314 0.6237 0.3474 0.6237 0.7897
No log 4.3889 316 0.6276 0.2574 0.6276 0.7922
No log 4.4167 318 0.6222 0.3474 0.6222 0.7888
No log 4.4444 320 0.6262 0.5228 0.6262 0.7913
No log 4.4722 322 0.6801 0.4568 0.6801 0.8247
No log 4.5 324 0.6836 0.4665 0.6836 0.8268
No log 4.5278 326 0.6223 0.5104 0.6223 0.7888
No log 4.5556 328 0.6023 0.3352 0.6023 0.7761
No log 4.5833 330 0.5995 0.3677 0.5995 0.7743
No log 4.6111 332 0.5878 0.3675 0.5878 0.7667
No log 4.6389 334 0.5812 0.5036 0.5812 0.7624
No log 4.6667 336 0.5899 0.5036 0.5899 0.7681
No log 4.6944 338 0.6349 0.4329 0.6349 0.7968
No log 4.7222 340 0.6223 0.4664 0.6223 0.7889
No log 4.75 342 0.6037 0.5498 0.6037 0.7770
No log 4.7778 344 0.5703 0.4206 0.5703 0.7552
No log 4.8056 346 0.5659 0.4206 0.5659 0.7522
No log 4.8333 348 0.5832 0.5498 0.5832 0.7636
No log 4.8611 350 0.6810 0.4502 0.6810 0.8253
No log 4.8889 352 0.7504 0.4556 0.7504 0.8662
No log 4.9167 354 0.6891 0.5173 0.6891 0.8301
No log 4.9444 356 0.5990 0.5498 0.5990 0.7740
No log 4.9722 358 0.5662 0.4397 0.5662 0.7525
No log 5.0 360 0.5777 0.4082 0.5777 0.7601
No log 5.0278 362 0.5878 0.3581 0.5878 0.7667
No log 5.0556 364 0.5882 0.3835 0.5882 0.7669
No log 5.0833 366 0.6002 0.4482 0.6002 0.7747
No log 5.1111 368 0.6573 0.5237 0.6573 0.8107
No log 5.1389 370 0.6867 0.4764 0.6867 0.8287
No log 5.1667 372 0.6547 0.4291 0.6547 0.8092
No log 5.1944 374 0.6474 0.2509 0.6474 0.8046
No log 5.2222 376 0.6692 0.2160 0.6692 0.8181
No log 5.25 378 0.6681 0.3174 0.6681 0.8174
No log 5.2778 380 0.6100 0.3556 0.6100 0.7810
No log 5.3056 382 0.6361 0.5149 0.6361 0.7975
No log 5.3333 384 0.6317 0.4997 0.6317 0.7948
No log 5.3611 386 0.5907 0.4397 0.5907 0.7685
No log 5.3889 388 0.5822 0.4482 0.5822 0.7630
No log 5.4167 390 0.6020 0.4705 0.6020 0.7759
No log 5.4444 392 0.6503 0.5139 0.6503 0.8064
No log 5.4722 394 0.6486 0.5139 0.6486 0.8054
No log 5.5 396 0.6019 0.3572 0.6019 0.7758
No log 5.5278 398 0.5829 0.3915 0.5829 0.7634
No log 5.5556 400 0.5877 0.4160 0.5877 0.7666
No log 5.5833 402 0.5848 0.3915 0.5848 0.7647
No log 5.6111 404 0.6133 0.4397 0.6133 0.7831
No log 5.6389 406 0.6136 0.5036 0.6136 0.7833
No log 5.6667 408 0.6340 0.5498 0.6340 0.7963
No log 5.6944 410 0.6782 0.5048 0.6782 0.8235
No log 5.7222 412 0.6521 0.5330 0.6521 0.8075
No log 5.75 414 0.6177 0.3416 0.6177 0.7859
No log 5.7778 416 0.6122 0.3183 0.6122 0.7824
No log 5.8056 418 0.6163 0.3183 0.6163 0.7851
No log 5.8333 420 0.6072 0.3474 0.6072 0.7792
No log 5.8611 422 0.6063 0.3677 0.6063 0.7786
No log 5.8889 424 0.6010 0.3677 0.6010 0.7753
No log 5.9167 426 0.5967 0.3781 0.5967 0.7725
No log 5.9444 428 0.5935 0.3474 0.5935 0.7704
No log 5.9722 430 0.5951 0.3166 0.5951 0.7714
No log 6.0 432 0.6031 0.3092 0.6031 0.7766
No log 6.0278 434 0.6341 0.4306 0.6341 0.7963
No log 6.0556 436 0.6425 0.4919 0.6425 0.8016
No log 6.0833 438 0.6191 0.4306 0.6191 0.7868
No log 6.1111 440 0.5972 0.3166 0.5972 0.7728
No log 6.1389 442 0.5720 0.2857 0.5720 0.7563
No log 6.1667 444 0.5550 0.3523 0.5550 0.7450
No log 6.1944 446 0.5502 0.4224 0.5502 0.7418
No log 6.2222 448 0.5460 0.4044 0.5460 0.7389
No log 6.25 450 0.5349 0.4052 0.5349 0.7313
No log 6.2778 452 0.5653 0.5448 0.5653 0.7519
No log 6.3056 454 0.5523 0.5569 0.5523 0.7432
No log 6.3333 456 0.5399 0.4160 0.5399 0.7348
No log 6.3611 458 0.5860 0.4548 0.5860 0.7655
No log 6.3889 460 0.5839 0.4782 0.5839 0.7641
No log 6.4167 462 0.5539 0.4160 0.5539 0.7442
No log 6.4444 464 0.5750 0.4707 0.5750 0.7583
No log 6.4722 466 0.6711 0.4554 0.6711 0.8192
No log 6.5 468 0.8181 0.4142 0.8181 0.9045
No log 6.5278 470 0.8323 0.3807 0.8323 0.9123
No log 6.5556 472 0.7291 0.5256 0.7291 0.8539
No log 6.5833 474 0.6185 0.4663 0.6185 0.7864
No log 6.6111 476 0.5595 0.5131 0.5595 0.7480
No log 6.6389 478 0.5473 0.4908 0.5473 0.7398
No log 6.6667 480 0.5521 0.4984 0.5521 0.7431
No log 6.6944 482 0.5882 0.5349 0.5882 0.7669
No log 6.7222 484 0.6477 0.5158 0.6477 0.8048
No log 6.75 486 0.6669 0.4815 0.6669 0.8166
No log 6.7778 488 0.6316 0.5580 0.6316 0.7947
No log 6.8056 490 0.5661 0.4774 0.5661 0.7524
No log 6.8333 492 0.5306 0.4569 0.5306 0.7284
No log 6.8611 494 0.5232 0.4314 0.5232 0.7234
No log 6.8889 496 0.5221 0.4314 0.5221 0.7225
No log 6.9167 498 0.5299 0.5056 0.5299 0.7279
0.3477 6.9444 500 0.5240 0.5587 0.5240 0.7239
0.3477 6.9722 502 0.5186 0.4492 0.5186 0.7201
0.3477 7.0 504 0.5445 0.4548 0.5445 0.7379
0.3477 7.0278 506 0.6016 0.4182 0.6016 0.7756
0.3477 7.0556 508 0.6217 0.4130 0.6217 0.7885
0.3477 7.0833 510 0.6146 0.4355 0.6146 0.7839

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (F32 tensors, Safetensors format)

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k14_task7_organization

  • Finetuned from base model aubmindlab/bert-base-arabertv02