ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k16_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7920
  • Qwk: 0.3974
  • Mse: 0.7920
  • Rmse: 0.8899
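Qwk is presumably the quadratic weighted kappa (agreement between predicted and gold ordinal scores), and Mse/Rmse are the mean squared error and its root. A minimal pure-Python sketch of how these three numbers can be computed from integer labels (the function names are illustrative, not from this repository):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """QWK: 1 - (weighted observed disagreement) / (weighted expected disagreement)."""
    # Observed confusion matrix.
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    n = len(y_true)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            weight = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic penalty
            expected = hist_true[i] * hist_pred[j] / n    # chance agreement
            num += weight * observed[i][j]
            den += weight * expected
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)
```

Note that MSE/RMSE treat the ordinal scores as numbers, while QWK additionally discounts agreement expected by chance, which is why the two can move in different directions across checkpoints in the table below.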

Model description

More information needed
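No description was provided, but the title and metrics suggest a sequence-scoring head fine-tuned from AraBERT for essay organization. A hedged loading sketch with the Transformers API — the repo id is taken from the card title, and whether the head is regression or classification is an assumption based on the MSE/QWK metrics, so check the checkpoint's config before relying on it:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repo id copied from the model-card title; adjust if the Hub path differs.
repo = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k16_task5_organization"

tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

# An Arabic essay would go here in place of the placeholder string.
inputs = tokenizer("...", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits  # a score (regression) or class scores, depending on the head config
```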

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
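The linear scheduler decays the learning rate from 2e-05 to zero over training (no warmup is listed). A small sketch of that schedule; the total step count (~8,100) is only inferred from the log below — roughly 81 optimizer steps per epoch times 100 epochs — not a value from the config:

```python
def linear_lr(step, base_lr=2e-5, total_steps=8100, warmup_steps=0):
    """Linear warmup (none here) followed by linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

With Transformers' Trainer, `lr_scheduler_type: linear` selects the equivalent built-in schedule; the sketch just makes the shape explicit.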

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0247 2 3.9153 -0.0323 3.9153 1.9787
No log 0.0494 4 2.1126 -0.0668 2.1126 1.4535
No log 0.0741 6 1.6808 -0.0144 1.6808 1.2964
No log 0.0988 8 1.2299 0.1706 1.2299 1.1090
No log 0.1235 10 1.2968 0.1036 1.2968 1.1388
No log 0.1481 12 1.1291 0.2171 1.1291 1.0626
No log 0.1728 14 1.1179 0.2787 1.1179 1.0573
No log 0.1975 16 1.1847 0.2042 1.1847 1.0884
No log 0.2222 18 1.1169 0.0824 1.1169 1.0568
No log 0.2469 20 1.0947 0.2135 1.0947 1.0463
No log 0.2716 22 1.1034 0.1981 1.1034 1.0504
No log 0.2963 24 1.1265 0.0762 1.1265 1.0614
No log 0.3210 26 1.1794 0.0 1.1794 1.0860
No log 0.3457 28 1.1157 0.0604 1.1157 1.0563
No log 0.3704 30 1.0054 0.1487 1.0054 1.0027
No log 0.3951 32 0.9650 0.2569 0.9650 0.9823
No log 0.4198 34 0.9557 0.2897 0.9557 0.9776
No log 0.4444 36 0.9639 0.3499 0.9639 0.9818
No log 0.4691 38 0.9874 0.2611 0.9874 0.9937
No log 0.4938 40 0.8923 0.3604 0.8923 0.9446
No log 0.5185 42 0.8806 0.3192 0.8806 0.9384
No log 0.5432 44 0.9241 0.1761 0.9241 0.9613
No log 0.5679 46 0.9040 0.1909 0.9040 0.9508
No log 0.5926 48 0.8736 0.3503 0.8736 0.9347
No log 0.6173 50 0.8814 0.3763 0.8814 0.9388
No log 0.6420 52 0.8645 0.3264 0.8645 0.9298
No log 0.6667 54 0.8676 0.2993 0.8676 0.9315
No log 0.6914 56 0.9758 0.2588 0.9758 0.9878
No log 0.7160 58 1.0728 0.2650 1.0728 1.0358
No log 0.7407 60 0.8733 0.2556 0.8733 0.9345
No log 0.7654 62 0.7878 0.4241 0.7878 0.8876
No log 0.7901 64 0.7848 0.4312 0.7848 0.8859
No log 0.8148 66 0.8038 0.4113 0.8038 0.8965
No log 0.8395 68 0.8190 0.4113 0.8190 0.9050
No log 0.8642 70 0.7648 0.4286 0.7648 0.8745
No log 0.8889 72 0.8720 0.3519 0.8720 0.9338
No log 0.9136 74 1.1168 0.3686 1.1168 1.0568
No log 0.9383 76 1.2305 0.3001 1.2305 1.1093
No log 0.9630 78 1.2278 0.3001 1.2278 1.1081
No log 0.9877 80 1.2037 0.3231 1.2037 1.0971
No log 1.0123 82 1.1788 0.3511 1.1788 1.0857
No log 1.0370 84 1.1318 0.4081 1.1318 1.0639
No log 1.0617 86 1.1686 0.2924 1.1686 1.0810
No log 1.0864 88 1.2631 0.3194 1.2631 1.1239
No log 1.1111 90 1.4262 0.2890 1.4262 1.1942
No log 1.1358 92 1.4679 0.2661 1.4679 1.2116
No log 1.1605 94 1.2613 0.2631 1.2613 1.1231
No log 1.1852 96 1.1543 0.2631 1.1543 1.0744
No log 1.2099 98 1.1289 0.2750 1.1289 1.0625
No log 1.2346 100 1.2585 0.2924 1.2585 1.1218
No log 1.2593 102 1.4270 0.3024 1.4270 1.1946
No log 1.2840 104 1.3661 0.3378 1.3661 1.1688
No log 1.3086 106 1.1614 0.3897 1.1614 1.0777
No log 1.3333 108 1.0165 0.4010 1.0165 1.0082
No log 1.3580 110 1.0077 0.4010 1.0077 1.0038
No log 1.3827 112 0.9857 0.4554 0.9857 0.9928
No log 1.4074 114 0.9500 0.4439 0.9500 0.9747
No log 1.4321 116 1.0066 0.4334 1.0066 1.0033
No log 1.4568 118 0.9810 0.3809 0.9810 0.9905
No log 1.4815 120 0.9173 0.4082 0.9173 0.9578
No log 1.5062 122 0.8928 0.4434 0.8928 0.9449
No log 1.5309 124 0.9403 0.3979 0.9403 0.9697
No log 1.5556 126 1.0431 0.3945 1.0431 1.0213
No log 1.5802 128 0.9893 0.4799 0.9893 0.9946
No log 1.6049 130 1.0243 0.4326 1.0243 1.0121
No log 1.6296 132 1.0326 0.4551 1.0326 1.0162
No log 1.6543 134 1.2821 0.2994 1.2821 1.1323
No log 1.6790 136 1.3057 0.2682 1.3057 1.1427
No log 1.7037 138 1.0604 0.4318 1.0604 1.0297
No log 1.7284 140 0.9354 0.4202 0.9354 0.9671
No log 1.7531 142 0.8736 0.4455 0.8736 0.9347
No log 1.7778 144 0.9219 0.3717 0.9219 0.9602
No log 1.8025 146 1.1198 0.3881 1.1198 1.0582
No log 1.8272 148 1.3264 0.2857 1.3264 1.1517
No log 1.8519 150 1.2759 0.2624 1.2759 1.1296
No log 1.8765 152 1.1571 0.3897 1.1571 1.0757
No log 1.9012 154 0.8432 0.4696 0.8432 0.9183
No log 1.9259 156 0.7708 0.4378 0.7708 0.8780
No log 1.9506 158 0.8016 0.5192 0.8016 0.8953
No log 1.9753 160 1.0054 0.4311 1.0054 1.0027
No log 2.0 162 1.3780 0.3024 1.3780 1.1739
No log 2.0247 164 1.5452 0.2451 1.5452 1.2430
No log 2.0494 166 1.3474 0.2624 1.3474 1.1608
No log 2.0741 168 1.0809 0.2636 1.0809 1.0397
No log 2.0988 170 0.9369 0.2441 0.9369 0.9679
No log 2.1235 172 0.9402 0.2623 0.9402 0.9696
No log 2.1481 174 1.0693 0.3641 1.0693 1.0341
No log 2.1728 176 1.1502 0.3706 1.1502 1.0725
No log 2.1975 178 1.1656 0.4036 1.1656 1.0796
No log 2.2222 180 1.1108 0.4133 1.1108 1.0539
No log 2.2469 182 0.9682 0.4783 0.9682 0.9839
No log 2.2716 184 0.8687 0.4785 0.8687 0.9320
No log 2.2963 186 0.8303 0.4917 0.8303 0.9112
No log 2.3210 188 0.9115 0.4444 0.9115 0.9547
No log 2.3457 190 1.1323 0.3981 1.1323 1.0641
No log 2.3704 192 1.3856 0.3354 1.3856 1.1771
No log 2.3951 194 1.2968 0.3650 1.2968 1.1388
No log 2.4198 196 1.0230 0.3953 1.0230 1.0114
No log 2.4444 198 0.9345 0.3103 0.9345 0.9667
No log 2.4691 200 0.8716 0.2569 0.8716 0.9336
No log 2.4938 202 0.8211 0.2541 0.8211 0.9061
No log 2.5185 204 0.8172 0.2572 0.8172 0.9040
No log 2.5432 206 0.9201 0.4667 0.9201 0.9592
No log 2.5679 208 1.1183 0.3650 1.1183 1.0575
No log 2.5926 210 1.2144 0.3165 1.2144 1.1020
No log 2.6173 212 1.0646 0.4420 1.0646 1.0318
No log 2.6420 214 0.8879 0.3987 0.8879 0.9423
No log 2.6667 216 0.8691 0.3863 0.8691 0.9323
No log 2.6914 218 0.8863 0.3099 0.8863 0.9414
No log 2.7160 220 0.8902 0.3099 0.8902 0.9435
No log 2.7407 222 0.8929 0.4119 0.8929 0.9449
No log 2.7654 224 0.9246 0.4468 0.9246 0.9615
No log 2.7901 226 0.8729 0.5543 0.8729 0.9343
No log 2.8148 228 0.7777 0.4975 0.7777 0.8819
No log 2.8395 230 0.7329 0.4828 0.7329 0.8561
No log 2.8642 232 0.7117 0.5016 0.7117 0.8436
No log 2.8889 234 0.7119 0.4995 0.7119 0.8437
No log 2.9136 236 0.6926 0.5261 0.6926 0.8322
No log 2.9383 238 0.6888 0.4922 0.6888 0.8299
No log 2.9630 240 0.6894 0.5036 0.6894 0.8303
No log 2.9877 242 0.7042 0.4978 0.7042 0.8392
No log 3.0123 244 0.7295 0.4984 0.7295 0.8541
No log 3.0370 246 0.7498 0.5233 0.7498 0.8659
No log 3.0617 248 0.6707 0.6196 0.6707 0.8190
No log 3.0864 250 0.8075 0.5253 0.8075 0.8986
No log 3.1111 252 0.9915 0.4435 0.9915 0.9958
No log 3.1358 254 1.0472 0.4191 1.0472 1.0233
No log 3.1605 256 1.0444 0.4191 1.0444 1.0220
No log 3.1852 258 0.9522 0.4061 0.9522 0.9758
No log 3.2099 260 0.9185 0.4163 0.9185 0.9584
No log 3.2346 262 0.9611 0.4286 0.9611 0.9803
No log 3.2593 264 0.9807 0.4286 0.9807 0.9903
No log 3.2840 266 0.9549 0.4277 0.9549 0.9772
No log 3.3086 268 0.8912 0.3201 0.8912 0.9440
No log 3.3333 270 0.7847 0.3921 0.7847 0.8858
No log 3.3580 272 0.6975 0.5003 0.6975 0.8351
No log 3.3827 274 0.6670 0.5605 0.6670 0.8167
No log 3.4074 276 0.6884 0.5410 0.6884 0.8297
No log 3.4321 278 0.7745 0.5343 0.7745 0.8800
No log 3.4568 280 0.7786 0.5131 0.7786 0.8824
No log 3.4815 282 0.7750 0.5131 0.7750 0.8804
No log 3.5062 284 0.7967 0.4710 0.7967 0.8926
No log 3.5309 286 0.7245 0.5552 0.7245 0.8512
No log 3.5556 288 0.6772 0.5928 0.6772 0.8229
No log 3.5802 290 0.6583 0.5602 0.6583 0.8113
No log 3.6049 292 0.6711 0.6102 0.6711 0.8192
No log 3.6296 294 0.7409 0.5383 0.7409 0.8608
No log 3.6543 296 0.7936 0.5140 0.7936 0.8909
No log 3.6790 298 0.8124 0.5618 0.8124 0.9013
No log 3.7037 300 0.7539 0.4473 0.7539 0.8683
No log 3.7284 302 0.6982 0.4643 0.6982 0.8356
No log 3.7531 304 0.6946 0.5809 0.6946 0.8334
No log 3.7778 306 0.6740 0.6067 0.6740 0.8210
No log 3.8025 308 0.6443 0.5131 0.6443 0.8027
No log 3.8272 310 0.7335 0.5642 0.7335 0.8564
No log 3.8519 312 0.8099 0.5167 0.8099 0.9000
No log 3.8765 314 0.8259 0.4681 0.8259 0.9088
No log 3.9012 316 0.7833 0.3682 0.7833 0.8851
No log 3.9259 318 0.7745 0.3393 0.7745 0.8801
No log 3.9506 320 0.8099 0.3107 0.8099 0.9000
No log 3.9753 322 0.8757 0.4423 0.8757 0.9358
No log 4.0 324 0.9263 0.4650 0.9263 0.9625
No log 4.0247 326 0.8599 0.4917 0.8599 0.9273
No log 4.0494 328 0.7622 0.4843 0.7622 0.8730
No log 4.0741 330 0.6829 0.4729 0.6829 0.8264
No log 4.0988 332 0.6379 0.4729 0.6379 0.7987
No log 4.1235 334 0.6175 0.4878 0.6175 0.7858
No log 4.1481 336 0.6206 0.4745 0.6206 0.7878
No log 4.1728 338 0.6117 0.5373 0.6117 0.7821
No log 4.1975 340 0.6091 0.5590 0.6091 0.7805
No log 4.2222 342 0.6138 0.5712 0.6138 0.7834
No log 4.2469 344 0.6221 0.5859 0.6221 0.7887
No log 4.2716 346 0.6498 0.5287 0.6498 0.8061
No log 4.2963 348 0.6695 0.5326 0.6695 0.8182
No log 4.3210 350 0.6343 0.5399 0.6343 0.7964
No log 4.3457 352 0.6249 0.6196 0.6249 0.7905
No log 4.3704 354 0.6331 0.5871 0.6331 0.7957
No log 4.3951 356 0.6435 0.5983 0.6435 0.8022
No log 4.4198 358 0.7044 0.5751 0.7044 0.8393
No log 4.4444 360 0.8227 0.4593 0.8227 0.9071
No log 4.4691 362 0.8805 0.5124 0.8805 0.9384
No log 4.4938 364 0.8517 0.4681 0.8517 0.9229
No log 4.5185 366 0.8247 0.4568 0.8247 0.9081
No log 4.5432 368 0.7784 0.4343 0.7784 0.8823
No log 4.5679 370 0.7751 0.4343 0.7751 0.8804
No log 4.5926 372 0.7820 0.4588 0.7820 0.8843
No log 4.6173 374 0.8010 0.4474 0.8010 0.8950
No log 4.6420 376 0.7552 0.5425 0.7552 0.8690
No log 4.6667 378 0.7747 0.5498 0.7747 0.8802
No log 4.6914 380 0.8572 0.4902 0.8572 0.9259
No log 4.7160 382 1.0176 0.4484 1.0176 1.0087
No log 4.7407 384 1.0496 0.4484 1.0496 1.0245
No log 4.7654 386 0.8929 0.4681 0.8929 0.9449
No log 4.7901 388 0.7438 0.4612 0.7438 0.8624
No log 4.8148 390 0.7203 0.4507 0.7203 0.8487
No log 4.8395 392 0.7152 0.4778 0.7152 0.8457
No log 4.8642 394 0.7362 0.4483 0.7362 0.8580
No log 4.8889 396 0.7844 0.4592 0.7844 0.8856
No log 4.9136 398 0.8406 0.4794 0.8406 0.9168
No log 4.9383 400 0.8989 0.4792 0.8989 0.9481
No log 4.9630 402 0.8478 0.4579 0.8478 0.9207
No log 4.9877 404 0.7942 0.4366 0.7942 0.8912
No log 5.0123 406 0.7281 0.5142 0.7281 0.8533
No log 5.0370 408 0.7158 0.5902 0.7158 0.8460
No log 5.0617 410 0.7062 0.5503 0.7062 0.8404
No log 5.0864 412 0.7686 0.5232 0.7686 0.8767
No log 5.1111 414 0.9402 0.4790 0.9402 0.9697
No log 5.1358 416 0.9938 0.4885 0.9938 0.9969
No log 5.1605 418 0.8905 0.4792 0.8905 0.9437
No log 5.1852 420 0.7559 0.4836 0.7559 0.8694
No log 5.2099 422 0.7073 0.5142 0.7073 0.8410
No log 5.2346 424 0.7106 0.5898 0.7106 0.8430
No log 5.2593 426 0.7096 0.5677 0.7096 0.8424
No log 5.2840 428 0.7145 0.4882 0.7145 0.8453
No log 5.3086 430 0.7841 0.3687 0.7841 0.8855
No log 5.3333 432 0.8639 0.3668 0.8639 0.9294
No log 5.3580 434 0.8938 0.3531 0.8938 0.9454
No log 5.3827 436 0.8888 0.3124 0.8888 0.9428
No log 5.4074 438 0.9003 0.3124 0.9003 0.9488
No log 5.4321 440 0.9232 0.2723 0.9232 0.9608
No log 5.4568 442 0.9285 0.3921 0.9285 0.9636
No log 5.4815 444 0.9307 0.3897 0.9307 0.9648
No log 5.5062 446 0.9061 0.4016 0.9061 0.9519
No log 5.5309 448 0.8365 0.4004 0.8365 0.9146
No log 5.5556 450 0.7869 0.4343 0.7869 0.8871
No log 5.5802 452 0.7684 0.4599 0.7684 0.8766
No log 5.6049 454 0.7880 0.4711 0.7880 0.8877
No log 5.6296 456 0.8115 0.4364 0.8115 0.9008
No log 5.6543 458 0.7732 0.4966 0.7732 0.8793
No log 5.6790 460 0.7189 0.4862 0.7189 0.8479
No log 5.7037 462 0.7006 0.4995 0.7006 0.8370
No log 5.7284 464 0.6943 0.4914 0.6943 0.8332
No log 5.7531 466 0.7005 0.5315 0.7005 0.8370
No log 5.7778 468 0.7001 0.5315 0.7001 0.8367
No log 5.8025 470 0.7012 0.5315 0.7012 0.8374
No log 5.8272 472 0.7036 0.5763 0.7036 0.8388
No log 5.8519 474 0.7834 0.5032 0.7834 0.8851
No log 5.8765 476 0.8282 0.4807 0.8282 0.9101
No log 5.9012 478 0.8048 0.4592 0.8048 0.8971
No log 5.9259 480 0.8023 0.4815 0.8023 0.8957
No log 5.9506 482 0.7905 0.3570 0.7905 0.8891
No log 5.9753 484 0.8046 0.3842 0.8046 0.8970
No log 6.0 486 0.8355 0.4586 0.8355 0.9141
No log 6.0247 488 0.8120 0.3842 0.8120 0.9011
No log 6.0494 490 0.8034 0.3992 0.8034 0.8963
No log 6.0741 492 0.8256 0.4350 0.8256 0.9087
No log 6.0988 494 0.8816 0.4334 0.8816 0.9389
No log 6.1235 496 0.8872 0.4334 0.8872 0.9419
No log 6.1481 498 0.8625 0.4334 0.8625 0.9287
0.3129 6.1728 500 0.8525 0.4334 0.8525 0.9233
0.3129 6.1975 502 0.8329 0.4460 0.8329 0.9126
0.3129 6.2222 504 0.8287 0.4586 0.8287 0.9103
0.3129 6.2469 506 0.8401 0.4586 0.8401 0.9166
0.3129 6.2716 508 0.8460 0.4586 0.8460 0.9198
0.3129 6.2963 510 0.8516 0.3844 0.8516 0.9228
0.3129 6.3210 512 0.8399 0.3551 0.8399 0.9165
0.3129 6.3457 514 0.8021 0.3250 0.8021 0.8956
0.3129 6.3704 516 0.7839 0.3536 0.7839 0.8854
0.3129 6.3951 518 0.7920 0.3974 0.7920 0.8899

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k16_task5_organization

Finetuned from aubmindlab/bert-base-arabertv02