ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k11_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8486
  • Qwk (quadratic weighted kappa): 0.3697
  • Mse: 0.8486
  • Rmse: 0.9212
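Qwk is quadratic weighted kappa, an agreement score for ordinal labels (such as essay-organization grades), while Mse/Rmse treat the labels as numeric. All three can be recomputed from predictions and gold labels; below is a self-contained sketch of the standard formulas (the label values and class count are illustrative, not from this model's data):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa for ordinal labels 0..n_classes-1."""
    # Observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    hist_t = [sum(row) for row in O]                                   # gold label counts
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]  # predicted counts
    n = len(y_true)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement weight
            num += w * O[i][j]                       # observed weighted disagreement
            den += w * hist_t[i] * hist_p[j] / n     # expected under chance agreement
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Illustrative 3-class example
y_true = [0, 1, 2, 2, 1]
y_pred = [0, 1, 1, 2, 0]
print(quadratic_weighted_kappa(y_true, y_pred, 3))   # 0.6875
print(math.sqrt(mse(y_true, y_pred)))                # RMSE ≈ 0.6325
```

Note that when predictions equal real-valued labels rounded to the grading scale, Loss and Mse coincide, as they do in the table below (the model is evidently trained with an MSE-style objective).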

Model description

More information needed

Intended uses & limitations

More information needed
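Pending documentation from the authors, the checkpoint can be loaded with the transformers library. A minimal sketch, assuming the checkpoint carries a sequence-classification/regression head for essay-organization scoring (the head type and the interpretation of the output are assumptions, not confirmed by this card):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k11_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

text = "..."  # an Arabic essay to score
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits  # shape depends on the configured head
print(logits)
```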

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0357 2 4.3066 0.0138 4.3066 2.0752
No log 0.0714 4 2.6385 0.0 2.6385 1.6243
No log 0.1071 6 2.3541 0.0203 2.3541 1.5343
No log 0.1429 8 1.4080 0.0775 1.4080 1.1866
No log 0.1786 10 1.1670 0.2108 1.1670 1.0803
No log 0.2143 12 1.1117 0.2545 1.1117 1.0544
No log 0.25 14 1.0575 0.2619 1.0575 1.0284
No log 0.2857 16 1.0411 0.2467 1.0411 1.0204
No log 0.3214 18 1.0519 0.2416 1.0519 1.0256
No log 0.3571 20 1.0542 0.1749 1.0542 1.0267
No log 0.3929 22 1.0735 0.1944 1.0735 1.0361
No log 0.4286 24 1.1616 0.0741 1.1616 1.0778
No log 0.4643 26 1.1844 0.1037 1.1844 1.0883
No log 0.5 28 1.1348 0.2179 1.1348 1.0653
No log 0.5357 30 1.0330 0.2035 1.0330 1.0164
No log 0.5714 32 1.0208 0.2187 1.0208 1.0103
No log 0.6071 34 1.0632 0.2140 1.0632 1.0311
No log 0.6429 36 1.0745 0.2192 1.0745 1.0366
No log 0.6786 38 1.0635 0.2341 1.0635 1.0313
No log 0.7143 40 1.0616 0.2441 1.0616 1.0303
No log 0.75 42 1.0088 0.2166 1.0088 1.0044
No log 0.7857 44 0.9535 0.2865 0.9535 0.9765
No log 0.8214 46 0.9477 0.3392 0.9477 0.9735
No log 0.8571 48 0.9627 0.2314 0.9627 0.9812
No log 0.8929 50 1.0587 0.2440 1.0587 1.0289
No log 0.9286 52 1.0913 0.2820 1.0913 1.0447
No log 0.9643 54 1.0604 0.2535 1.0604 1.0298
No log 1.0 56 1.1284 0.2117 1.1284 1.0622
No log 1.0357 58 1.1146 0.1803 1.1146 1.0558
No log 1.0714 60 1.0598 0.2505 1.0598 1.0295
No log 1.1071 62 1.0845 0.2981 1.0845 1.0414
No log 1.1429 64 1.1518 0.1053 1.1518 1.0732
No log 1.1786 66 1.1307 0.2448 1.1307 1.0633
No log 1.2143 68 1.1767 0.0847 1.1767 1.0847
No log 1.25 70 1.0793 0.2471 1.0793 1.0389
No log 1.2857 72 1.1331 0.1632 1.1331 1.0645
No log 1.3214 74 1.2859 0.1013 1.2859 1.1340
No log 1.3571 76 1.1735 0.1435 1.1735 1.0833
No log 1.3929 78 1.0336 0.3128 1.0336 1.0167
No log 1.4286 80 1.0428 0.2920 1.0428 1.0212
No log 1.4643 82 1.2769 0.0632 1.2769 1.1300
No log 1.5 84 1.3113 0.0324 1.3113 1.1451
No log 1.5357 86 1.2131 0.2328 1.2131 1.1014
No log 1.5714 88 1.1913 0.2439 1.1913 1.0915
No log 1.6071 90 1.2612 0.2038 1.2612 1.1230
No log 1.6429 92 1.2579 0.2038 1.2579 1.1215
No log 1.6786 94 1.0943 0.2448 1.0943 1.0461
No log 1.7143 96 1.0536 0.2587 1.0536 1.0265
No log 1.75 98 1.1362 0.1911 1.1362 1.0659
No log 1.7857 100 1.1400 0.1316 1.1400 1.0677
No log 1.8214 102 0.9634 0.2600 0.9634 0.9815
No log 1.8571 104 0.9363 0.3030 0.9363 0.9676
No log 1.8929 106 1.0518 0.2448 1.0518 1.0256
No log 1.9286 108 1.3182 0.2260 1.3182 1.1481
No log 1.9643 110 1.4219 0.2367 1.4219 1.1924
No log 2.0 112 1.1984 0.2542 1.1984 1.0947
No log 2.0357 114 0.9569 0.2545 0.9569 0.9782
No log 2.0714 116 1.1258 0.1688 1.1258 1.0610
No log 2.1071 118 1.0222 0.2035 1.0222 1.0111
No log 2.1429 120 0.9664 0.3992 0.9664 0.9830
No log 2.1786 122 1.2062 0.2026 1.2062 1.0983
No log 2.2143 124 1.2919 0.2588 1.2919 1.1366
No log 2.25 126 1.1963 0.2670 1.1963 1.0937
No log 2.2857 128 1.0974 0.3219 1.0974 1.0476
No log 2.3214 130 1.0558 0.2300 1.0558 1.0275
No log 2.3571 132 1.1195 0.2410 1.1195 1.0580
No log 2.3929 134 1.2295 0.2650 1.2295 1.1088
No log 2.4286 136 1.1808 0.2750 1.1808 1.0867
No log 2.4643 138 1.0148 0.1883 1.0148 1.0074
No log 2.5 140 0.9919 0.2879 0.9919 0.9959
No log 2.5357 142 1.0035 0.2643 1.0035 1.0018
No log 2.5714 144 0.9408 0.2788 0.9408 0.9699
No log 2.6071 146 0.9356 0.2993 0.9356 0.9673
No log 2.6429 148 0.8940 0.3095 0.8940 0.9455
No log 2.6786 150 0.8686 0.3408 0.8686 0.9320
No log 2.7143 152 0.9345 0.4106 0.9345 0.9667
No log 2.75 154 0.9757 0.4801 0.9757 0.9878
No log 2.7857 156 1.0404 0.4119 1.0404 1.0200
No log 2.8214 158 1.0924 0.3778 1.0924 1.0452
No log 2.8571 160 1.1028 0.3658 1.1028 1.0501
No log 2.8929 162 1.0509 0.4081 1.0509 1.0251
No log 2.9286 164 0.9396 0.4010 0.9396 0.9693
No log 2.9643 166 0.9006 0.3356 0.9006 0.9490
No log 3.0 168 0.8998 0.4014 0.8998 0.9486
No log 3.0357 170 0.9930 0.4388 0.9930 0.9965
No log 3.0714 172 0.9968 0.4511 0.9968 0.9984
No log 3.1071 174 1.0081 0.3914 1.0081 1.0040
No log 3.1429 176 0.9950 0.3378 0.9950 0.9975
No log 3.1786 178 0.9843 0.2667 0.9843 0.9921
No log 3.2143 180 0.9704 0.2879 0.9704 0.9851
No log 3.25 182 0.9722 0.3647 0.9722 0.9860
No log 3.2857 184 1.0758 0.3667 1.0758 1.0372
No log 3.3214 186 1.0645 0.3897 1.0645 1.0317
No log 3.3571 188 1.0160 0.4250 1.0160 1.0080
No log 3.3929 190 0.9467 0.4499 0.9467 0.9730
No log 3.4286 192 0.9424 0.4499 0.9424 0.9708
No log 3.4643 194 1.0072 0.4710 1.0072 1.0036
No log 3.5 196 0.9833 0.4603 0.9833 0.9916
No log 3.5357 198 0.9394 0.4375 0.9394 0.9692
No log 3.5714 200 0.9220 0.3603 0.9220 0.9602
No log 3.6071 202 0.9165 0.3878 0.9165 0.9573
No log 3.6429 204 0.9174 0.3427 0.9174 0.9578
No log 3.6786 206 0.9624 0.3415 0.9624 0.9810
No log 3.7143 208 0.9869 0.4549 0.9869 0.9934
No log 3.75 210 1.0268 0.4406 1.0268 1.0133
No log 3.7857 212 0.9787 0.4662 0.9787 0.9893
No log 3.8214 214 0.9599 0.4305 0.9599 0.9798
No log 3.8571 216 0.9580 0.2785 0.9580 0.9788
No log 3.8929 218 0.9510 0.2785 0.9510 0.9752
No log 3.9286 220 0.9645 0.2103 0.9645 0.9821
No log 3.9643 222 0.9538 0.3398 0.9538 0.9766
No log 4.0 224 0.9791 0.4159 0.9791 0.9895
No log 4.0357 226 0.9547 0.3291 0.9547 0.9771
No log 4.0714 228 0.9508 0.2785 0.9508 0.9751
No log 4.1071 230 0.9331 0.4632 0.9331 0.9660
No log 4.1429 232 0.9524 0.4604 0.9524 0.9759
No log 4.1786 234 1.1956 0.3997 1.1956 1.0934
No log 4.2143 236 1.3007 0.3766 1.3007 1.1405
No log 4.25 238 1.0862 0.3953 1.0862 1.0422
No log 4.2857 240 0.8314 0.3896 0.8314 0.9118
No log 4.3214 242 0.8342 0.3160 0.8342 0.9133
No log 4.3571 244 0.8146 0.3622 0.8146 0.9026
No log 4.3929 246 1.0715 0.4574 1.0715 1.0351
No log 4.4286 248 1.3280 0.3672 1.3280 1.1524
No log 4.4643 250 1.1420 0.4359 1.1420 1.0686
No log 4.5 252 0.9699 0.4593 0.9699 0.9848
No log 4.5357 254 0.8430 0.3358 0.8430 0.9182
No log 4.5714 256 0.8721 0.3223 0.8721 0.9338
No log 4.6071 258 0.9793 0.3863 0.9793 0.9896
No log 4.6429 260 1.1238 0.4100 1.1238 1.0601
No log 4.6786 262 1.0228 0.3721 1.0228 1.0113
No log 4.7143 264 0.8930 0.2179 0.8930 0.9450
No log 4.75 266 0.8885 0.2788 0.8885 0.9426
No log 4.7857 268 0.9017 0.2596 0.9017 0.9496
No log 4.8214 270 0.9443 0.2523 0.9443 0.9718
No log 4.8571 272 1.0735 0.2772 1.0735 1.0361
No log 4.8929 274 1.1188 0.4103 1.1188 1.0577
No log 4.9286 276 1.0395 0.4115 1.0395 1.0196
No log 4.9643 278 0.9896 0.2625 0.9896 0.9948
No log 5.0 280 0.9774 0.2763 0.9774 0.9886
No log 5.0357 282 0.9563 0.2696 0.9563 0.9779
No log 5.0714 284 0.9346 0.3112 0.9346 0.9668
No log 5.1071 286 0.9254 0.2597 0.9254 0.9620
No log 5.1429 288 0.9910 0.2689 0.9910 0.9955
No log 5.1786 290 0.9512 0.2526 0.9512 0.9753
No log 5.2143 292 0.9104 0.2812 0.9104 0.9541
No log 5.25 294 0.9813 0.3861 0.9813 0.9906
No log 5.2857 296 1.0007 0.4139 1.0007 1.0003
No log 5.3214 298 0.9540 0.4619 0.9540 0.9767
No log 5.3571 300 0.9710 0.4499 0.9710 0.9854
No log 5.3929 302 0.9756 0.4250 0.9756 0.9877
No log 5.4286 304 0.9389 0.3590 0.9389 0.9690
No log 5.4643 306 0.9341 0.2532 0.9341 0.9665
No log 5.5 308 0.9202 0.2229 0.9202 0.9593
No log 5.5357 310 0.9470 0.3502 0.9470 0.9732
No log 5.5714 312 0.9719 0.3897 0.9719 0.9859
No log 5.6071 314 0.9871 0.4604 0.9871 0.9935
No log 5.6429 316 0.9505 0.3824 0.9505 0.9750
No log 5.6786 318 0.9572 0.3572 0.9572 0.9784
No log 5.7143 320 1.0339 0.4111 1.0339 1.0168
No log 5.75 322 1.0511 0.3987 1.0511 1.0252
No log 5.7857 324 0.9722 0.4020 0.9722 0.9860
No log 5.8214 326 0.8896 0.3437 0.8896 0.9432
No log 5.8571 328 0.8796 0.3947 0.8796 0.9379
No log 5.8929 330 0.9946 0.4681 0.9946 0.9973
No log 5.9286 332 1.1287 0.4869 1.1287 1.0624
No log 5.9643 334 1.1139 0.4964 1.1139 1.0554
No log 6.0 336 0.9776 0.4681 0.9776 0.9887
No log 6.0357 338 0.9757 0.4681 0.9757 0.9878
No log 6.0714 340 0.9738 0.4334 0.9738 0.9868
No log 6.1071 342 0.9384 0.3987 0.9384 0.9687
No log 6.1429 344 0.9068 0.2456 0.9068 0.9523
No log 6.1786 346 0.9368 0.3380 0.9368 0.9679
No log 6.2143 348 0.9540 0.2929 0.9540 0.9767
No log 6.25 350 0.9550 0.2628 0.9550 0.9773
No log 6.2857 352 0.9554 0.4006 0.9554 0.9774
No log 6.3214 354 0.9714 0.3930 0.9714 0.9856
No log 6.3571 356 0.9911 0.4386 0.9911 0.9955
No log 6.3929 358 0.9627 0.4751 0.9627 0.9812
No log 6.4286 360 1.0179 0.4932 1.0179 1.0089
No log 6.4643 362 1.0610 0.4595 1.0610 1.0301
No log 6.5 364 1.0876 0.4459 1.0876 1.0429
No log 6.5357 366 1.0398 0.4115 1.0398 1.0197
No log 6.5714 368 0.9680 0.3992 0.9680 0.9839
No log 6.6071 370 0.9431 0.3802 0.9431 0.9711
No log 6.6429 372 0.9333 0.3689 0.9333 0.9661
No log 6.6786 374 0.9458 0.3296 0.9458 0.9725
No log 6.7143 376 1.0089 0.4371 1.0089 1.0045
No log 6.75 378 1.0130 0.4474 1.0130 1.0065
No log 6.7857 380 0.9478 0.3358 0.9478 0.9735
No log 6.8214 382 0.9377 0.3418 0.9377 0.9683
No log 6.8571 384 0.9384 0.3256 0.9384 0.9687
No log 6.8929 386 0.9413 0.4159 0.9413 0.9702
No log 6.9286 388 0.9801 0.4826 0.9801 0.9900
No log 6.9643 390 0.9235 0.3914 0.9235 0.9610
No log 7.0 392 0.9134 0.3133 0.9134 0.9557
No log 7.0357 394 0.9279 0.2785 0.9279 0.9633
No log 7.0714 396 0.9202 0.2133 0.9202 0.9593
No log 7.1071 398 0.9185 0.2133 0.9185 0.9584
No log 7.1429 400 0.9198 0.2835 0.9198 0.9591
No log 7.1786 402 0.9103 0.2693 0.9103 0.9541
No log 7.2143 404 0.9021 0.3493 0.9021 0.9498
No log 7.25 406 0.8863 0.3052 0.8863 0.9414
No log 7.2857 408 0.8775 0.2693 0.8775 0.9368
No log 7.3214 410 0.8956 0.2479 0.8956 0.9463
No log 7.3571 412 0.8863 0.3038 0.8863 0.9414
No log 7.3929 414 0.9391 0.4590 0.9391 0.9691
No log 7.4286 416 1.1332 0.4765 1.1332 1.0645
No log 7.4643 418 1.1257 0.4551 1.1257 1.0610
No log 7.5 420 1.0135 0.4244 1.0135 1.0067
No log 7.5357 422 0.9210 0.4056 0.9210 0.9597
No log 7.5714 424 0.9154 0.4056 0.9154 0.9568
No log 7.6071 426 0.9238 0.4853 0.9238 0.9612
No log 7.6429 428 0.9495 0.4489 0.9495 0.9744
No log 7.6786 430 1.0627 0.4563 1.0627 1.0309
No log 7.7143 432 1.0747 0.4326 1.0747 1.0367
No log 7.75 434 0.9902 0.4563 0.9902 0.9951
No log 7.7857 436 0.8847 0.4357 0.8847 0.9406
No log 7.8214 438 0.8309 0.3271 0.8309 0.9116
No log 7.8571 440 0.8413 0.3332 0.8413 0.9172
No log 7.8929 442 0.8292 0.3196 0.8292 0.9106
No log 7.9286 444 0.8521 0.4350 0.8521 0.9231
No log 7.9643 446 0.9187 0.4234 0.9187 0.9585
No log 8.0 448 0.9217 0.4573 0.9217 0.9600
No log 8.0357 450 0.9092 0.3973 0.9092 0.9535
No log 8.0714 452 0.8738 0.4257 0.8738 0.9348
No log 8.1071 454 0.8516 0.4247 0.8516 0.9228
No log 8.1429 456 0.8683 0.4366 0.8683 0.9318
No log 8.1786 458 0.8668 0.4604 0.8668 0.9310
No log 8.2143 460 0.9154 0.4123 0.9154 0.9568
No log 8.25 462 0.9620 0.4681 0.9620 0.9808
No log 8.2857 464 1.0158 0.4783 1.0158 1.0079
No log 8.3214 466 0.9870 0.4356 0.9870 0.9935
No log 8.3571 468 0.9767 0.4356 0.9767 0.9883
No log 8.3929 470 0.9918 0.4356 0.9918 0.9959
No log 8.4286 472 1.0468 0.4551 1.0468 1.0231
No log 8.4643 474 0.9937 0.4667 0.9937 0.9968
No log 8.5 476 0.8959 0.3990 0.8959 0.9465
No log 8.5357 478 0.8697 0.4148 0.8697 0.9326
No log 8.5714 480 0.8849 0.4956 0.8849 0.9407
No log 8.6071 482 0.9166 0.4951 0.9166 0.9574
No log 8.6429 484 0.9837 0.4621 0.9837 0.9918
No log 8.6786 486 0.9858 0.4708 0.9858 0.9929
No log 8.7143 488 0.8756 0.4951 0.8756 0.9357
No log 8.75 490 0.8280 0.4449 0.8280 0.9099
No log 8.7857 492 0.8310 0.4560 0.8310 0.9116
No log 8.8214 494 0.8308 0.3636 0.8308 0.9115
No log 8.8571 496 0.8547 0.3369 0.8547 0.9245
No log 8.8929 498 0.8966 0.4020 0.8966 0.9469
0.3174 8.9286 500 0.8824 0.4020 0.8824 0.9394
0.3174 8.9643 502 0.8449 0.4153 0.8449 0.9192
0.3174 9.0 504 0.8322 0.4297 0.8322 0.9122
0.3174 9.0357 506 0.8367 0.4519 0.8367 0.9147
0.3174 9.0714 508 0.9300 0.4444 0.9300 0.9644
0.3174 9.1071 510 0.9830 0.4318 0.9830 0.9914
0.3174 9.1429 512 0.9313 0.3558 0.9313 0.9650
0.3174 9.1786 514 0.8461 0.2842 0.8461 0.9199
0.3174 9.2143 516 0.8385 0.3078 0.8385 0.9157
0.3174 9.25 518 0.8333 0.3467 0.8333 0.9128
0.3174 9.2857 520 0.8486 0.3697 0.8486 0.9212

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (Safetensors, F32 tensors)

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k11_task5_organization

  • Fine-tuned from aubmindlab/bert-base-arabertv02 (one of 4019 fine-tunes of that base model)