ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k16_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02. (The training dataset is not specified in this card.) It achieves the following results on the evaluation set:

  • Loss: 1.1859
  • Qwk: 0.4568
  • Mse: 1.1859
  • Rmse: 1.0890
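
Qwk is the Quadratic Weighted Kappa, a chance-corrected agreement score for ordinal labels; note that Mse matches the reported Loss (the model is trained with an MSE objective) and Rmse is simply its square root. A minimal, dependency-free sketch of both metrics (the class count and example labels are illustrative, not taken from the card):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic Weighted Kappa: 1.0 = perfect agreement, 0.0 = chance level."""
    n = len(y_true)
    # Observed label-pair counts (confusion matrix)
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Marginal histograms of true and predicted labels
    hist_true = [sum(observed[i]) for i in range(n_classes)]
    hist_pred = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            weight = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic penalty
            expected = hist_true[i] * hist_pred[j] / n    # counts under independence
            num += weight * observed[i][j]
            den += weight * expected
    return 1.0 - num / den

def rmse(y_true, y_pred):
    """Root mean squared error: square root of the MSE loss above."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

print(quadratic_weighted_kappa([0, 1, 2, 3], [0, 1, 2, 3], 4))  # → 1.0
```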

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
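
With a linear scheduler and no warmup listed, the learning rate decays from 2e-05 toward zero over the course of training. A minimal sketch of the per-step rate (the total step count is an assumption inferred from the table's step/epoch columns, not stated in the card):

```python
def linear_lr(step, total_steps, base_lr=2e-05):
    """Linear decay with no warmup: base_lr at step 0, 0.0 at total_steps."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# The table's epoch/step columns imply roughly 79 optimizer steps per epoch,
# so 100 epochs would give on the order of 7900 total steps.
```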

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0253 2 5.1806 -0.0034 5.1806 2.2761
No log 0.0506 4 3.3370 0.0381 3.3370 1.8267
No log 0.0759 6 2.3106 -0.0134 2.3106 1.5201
No log 0.1013 8 1.9331 0.0127 1.9331 1.3904
No log 0.1266 10 1.5523 0.0957 1.5523 1.2459
No log 0.1519 12 1.4714 0.0426 1.4714 1.2130
No log 0.1772 14 1.3586 0.0868 1.3586 1.1656
No log 0.2025 16 1.2604 0.1080 1.2604 1.1227
No log 0.2278 18 1.2188 0.1496 1.2188 1.1040
No log 0.2532 20 1.1797 0.1508 1.1797 1.0862
No log 0.2785 22 1.1623 0.1913 1.1623 1.0781
No log 0.3038 24 1.1225 0.2220 1.1225 1.0595
No log 0.3291 26 1.0483 0.2369 1.0483 1.0239
No log 0.3544 28 0.9954 0.3480 0.9954 0.9977
No log 0.3797 30 1.1283 0.3173 1.1283 1.0622
No log 0.4051 32 1.0807 0.3719 1.0807 1.0396
No log 0.4304 34 0.9961 0.2742 0.9961 0.9981
No log 0.4557 36 0.9890 0.2742 0.9890 0.9945
No log 0.4810 38 0.9739 0.2908 0.9739 0.9869
No log 0.5063 40 0.9749 0.3803 0.9749 0.9874
No log 0.5316 42 0.9689 0.3956 0.9689 0.9843
No log 0.5570 44 0.9335 0.4111 0.9335 0.9662
No log 0.5823 46 0.9237 0.3560 0.9237 0.9611
No log 0.6076 48 0.9516 0.3596 0.9516 0.9755
No log 0.6329 50 0.9568 0.3326 0.9568 0.9782
No log 0.6582 52 1.0486 0.3382 1.0486 1.0240
No log 0.6835 54 1.1623 0.3092 1.1623 1.0781
No log 0.7089 56 1.0517 0.3762 1.0517 1.0255
No log 0.7342 58 1.0664 0.3420 1.0664 1.0327
No log 0.7595 60 1.1979 0.2283 1.1979 1.0945
No log 0.7848 62 1.2059 0.2264 1.2059 1.0981
No log 0.8101 64 1.1834 0.2438 1.1834 1.0878
No log 0.8354 66 1.1457 0.2832 1.1457 1.0704
No log 0.8608 68 1.0721 0.3312 1.0721 1.0354
No log 0.8861 70 1.0097 0.3618 1.0097 1.0048
No log 0.9114 72 1.0289 0.2654 1.0289 1.0144
No log 0.9367 74 1.0968 0.2606 1.0968 1.0473
No log 0.9620 76 1.1002 0.2419 1.1002 1.0489
No log 0.9873 78 1.0499 0.2162 1.0499 1.0246
No log 1.0127 80 1.2773 0.2810 1.2773 1.1302
No log 1.0380 82 1.4339 0.3273 1.4339 1.1975
No log 1.0633 84 1.2727 0.2881 1.2727 1.1281
No log 1.0886 86 1.2533 0.3390 1.2533 1.1195
No log 1.1139 88 1.0878 0.3042 1.0878 1.0430
No log 1.1392 90 0.9779 0.3342 0.9779 0.9889
No log 1.1646 92 1.0391 0.3983 1.0391 1.0193
No log 1.1899 94 1.1861 0.2638 1.1861 1.0891
No log 1.2152 96 1.2446 0.3033 1.2446 1.1156
No log 1.2405 98 1.2963 0.1593 1.2963 1.1385
No log 1.2658 100 1.3339 0.1523 1.3339 1.1549
No log 1.2911 102 1.3281 0.2616 1.3281 1.1524
No log 1.3165 104 1.3039 0.2698 1.3039 1.1419
No log 1.3418 106 1.1810 0.3542 1.1810 1.0867
No log 1.3671 108 1.1077 0.3852 1.1077 1.0525
No log 1.3924 110 1.1527 0.3474 1.1527 1.0736
No log 1.4177 112 1.3256 0.3435 1.3256 1.1514
No log 1.4430 114 1.4510 0.2625 1.4510 1.2046
No log 1.4684 116 1.3491 0.2244 1.3491 1.1615
No log 1.4937 118 1.1958 0.2335 1.1958 1.0935
No log 1.5190 120 1.0880 0.2976 1.0880 1.0431
No log 1.5443 122 0.9883 0.3444 0.9883 0.9941
No log 1.5696 124 0.9487 0.3649 0.9487 0.9740
No log 1.5949 126 0.9541 0.3091 0.9541 0.9768
No log 1.6203 128 0.9831 0.3272 0.9831 0.9915
No log 1.6456 130 1.0851 0.3411 1.0851 1.0417
No log 1.6709 132 1.1720 0.3463 1.1720 1.0826
No log 1.6962 134 1.2000 0.3395 1.2000 1.0954
No log 1.7215 136 1.3398 0.3694 1.3398 1.1575
No log 1.7468 138 1.3642 0.3224 1.3642 1.1680
No log 1.7722 140 1.2561 0.2892 1.2561 1.1208
No log 1.7975 142 1.1396 0.3033 1.1396 1.0675
No log 1.8228 144 1.0876 0.3227 1.0876 1.0429
No log 1.8481 146 1.0818 0.3449 1.0818 1.0401
No log 1.8734 148 1.1986 0.4398 1.1986 1.0948
No log 1.8987 150 1.3759 0.3568 1.3759 1.1730
No log 1.9241 152 1.3163 0.4618 1.3163 1.1473
No log 1.9494 154 1.0840 0.5039 1.0840 1.0412
No log 1.9747 156 0.9322 0.4307 0.9322 0.9655
No log 2.0 158 0.9347 0.4236 0.9347 0.9668
No log 2.0253 160 0.9822 0.4141 0.9822 0.9911
No log 2.0506 162 0.9579 0.4332 0.9579 0.9787
No log 2.0759 164 1.0044 0.4806 1.0044 1.0022
No log 2.1013 166 1.0512 0.5397 1.0512 1.0253
No log 2.1266 168 1.0318 0.5387 1.0318 1.0158
No log 2.1519 170 0.9934 0.5380 0.9934 0.9967
No log 2.1772 172 0.9961 0.5369 0.9961 0.9981
No log 2.2025 174 1.1161 0.5142 1.1161 1.0564
No log 2.2278 176 1.4356 0.3949 1.4356 1.1982
No log 2.2532 178 1.8437 0.2999 1.8437 1.3578
No log 2.2785 180 1.8288 0.2758 1.8288 1.3523
No log 2.3038 182 1.3750 0.4184 1.3750 1.1726
No log 2.3291 184 0.9712 0.4405 0.9712 0.9855
No log 2.3544 186 0.8847 0.4580 0.8847 0.9406
No log 2.3797 188 0.8744 0.5319 0.8744 0.9351
No log 2.4051 190 0.9669 0.5287 0.9669 0.9833
No log 2.4304 192 1.2754 0.4510 1.2754 1.1293
No log 2.4557 194 1.6252 0.3551 1.6252 1.2748
No log 2.4810 196 1.8023 0.3192 1.8023 1.3425
No log 2.5063 198 1.6269 0.3394 1.6269 1.2755
No log 2.5316 200 1.3045 0.4515 1.3045 1.1422
No log 2.5570 202 1.1902 0.4720 1.1902 1.0910
No log 2.5823 204 1.3081 0.4454 1.3081 1.1437
No log 2.6076 206 1.4229 0.4182 1.4229 1.1929
No log 2.6329 208 1.4408 0.3679 1.4408 1.2003
No log 2.6582 210 1.3095 0.4178 1.3095 1.1443
No log 2.6835 212 1.1999 0.3910 1.1999 1.0954
No log 2.7089 214 1.1394 0.4683 1.1394 1.0674
No log 2.7342 216 0.9732 0.5253 0.9732 0.9865
No log 2.7595 218 0.8634 0.4652 0.8634 0.9292
No log 2.7848 220 0.8703 0.5023 0.8703 0.9329
No log 2.8101 222 0.9802 0.4970 0.9802 0.9901
No log 2.8354 224 1.1737 0.4556 1.1737 1.0834
No log 2.8608 226 1.2552 0.4401 1.2552 1.1204
No log 2.8861 228 1.1243 0.4390 1.1243 1.0603
No log 2.9114 230 0.9237 0.3899 0.9237 0.9611
No log 2.9367 232 0.8726 0.4868 0.8726 0.9341
No log 2.9620 234 0.8729 0.4896 0.8729 0.9343
No log 2.9873 236 0.8482 0.4855 0.8482 0.9210
No log 3.0127 238 0.9027 0.4821 0.9027 0.9501
No log 3.0380 240 1.0798 0.4848 1.0798 1.0391
No log 3.0633 242 1.1209 0.4896 1.1209 1.0587
No log 3.0886 244 1.0669 0.4891 1.0669 1.0329
No log 3.1139 246 0.9685 0.4770 0.9685 0.9841
No log 3.1392 248 0.9188 0.4552 0.9188 0.9586
No log 3.1646 250 0.8832 0.4465 0.8832 0.9398
No log 3.1899 252 0.8868 0.4396 0.8868 0.9417
No log 3.2152 254 0.8930 0.4326 0.8930 0.9450
No log 3.2405 256 0.8890 0.4572 0.8890 0.9429
No log 3.2658 258 0.8903 0.4789 0.8903 0.9436
No log 3.2911 260 0.8615 0.4801 0.8615 0.9282
No log 3.3165 262 0.8621 0.4479 0.8621 0.9285
No log 3.3418 264 0.9134 0.4944 0.9134 0.9557
No log 3.3671 266 1.0057 0.4676 1.0057 1.0028
No log 3.3924 268 1.0712 0.4877 1.0712 1.0350
No log 3.4177 270 1.0908 0.4609 1.0908 1.0444
No log 3.4430 272 1.1440 0.4765 1.1440 1.0696
No log 3.4684 274 1.0805 0.4743 1.0805 1.0394
No log 3.4937 276 1.0336 0.4999 1.0336 1.0167
No log 3.5190 278 1.0458 0.5073 1.0458 1.0226
No log 3.5443 280 1.0062 0.5219 1.0062 1.0031
No log 3.5696 282 1.0429 0.5170 1.0429 1.0212
No log 3.5949 284 1.0126 0.5347 1.0126 1.0063
No log 3.6203 286 1.0683 0.5194 1.0683 1.0336
No log 3.6456 288 1.1564 0.4792 1.1564 1.0753
No log 3.6709 290 1.0746 0.5098 1.0746 1.0366
No log 3.6962 292 0.9353 0.5703 0.9353 0.9671
No log 3.7215 294 0.7970 0.5490 0.7970 0.8928
No log 3.7468 296 0.7924 0.5490 0.7924 0.8902
No log 3.7722 298 0.8939 0.5405 0.8939 0.9455
No log 3.7975 300 1.1346 0.4579 1.1346 1.0652
No log 3.8228 302 1.1928 0.4422 1.1928 1.0922
No log 3.8481 304 1.0694 0.5236 1.0694 1.0341
No log 3.8734 306 0.9004 0.3967 0.9004 0.9489
No log 3.8987 308 0.8448 0.4493 0.8448 0.9191
No log 3.9241 310 0.8664 0.4906 0.8664 0.9308
No log 3.9494 312 0.8370 0.4821 0.8370 0.9149
No log 3.9747 314 0.8104 0.4706 0.8104 0.9002
No log 4.0 316 0.8594 0.4995 0.8594 0.9270
No log 4.0253 318 1.0696 0.4674 1.0696 1.0342
No log 4.0506 320 1.2319 0.4393 1.2319 1.1099
No log 4.0759 322 1.1562 0.4123 1.1562 1.0753
No log 4.1013 324 1.0050 0.4088 1.0050 1.0025
No log 4.1266 326 0.9313 0.4257 0.9313 0.9651
No log 4.1519 328 0.8916 0.4331 0.8916 0.9443
No log 4.1772 330 0.8669 0.4572 0.8669 0.9311
No log 4.2025 332 0.8505 0.4632 0.8505 0.9222
No log 4.2278 334 0.8320 0.4408 0.8320 0.9121
No log 4.2532 336 0.8941 0.4231 0.8941 0.9456
No log 4.2785 338 1.1190 0.4947 1.1190 1.0578
No log 4.3038 340 1.2989 0.4309 1.2989 1.1397
No log 4.3291 342 1.2520 0.4274 1.2520 1.1189
No log 4.3544 344 1.1228 0.4700 1.1228 1.0596
No log 4.3797 346 1.0020 0.5344 1.0020 1.0010
No log 4.4051 348 0.9141 0.5295 0.9141 0.9561
No log 4.4304 350 0.8990 0.5161 0.8990 0.9481
No log 4.4557 352 0.9026 0.5210 0.9026 0.9500
No log 4.4810 354 0.9233 0.5210 0.9233 0.9609
No log 4.5063 356 0.9613 0.5217 0.9613 0.9804
No log 4.5316 358 0.9876 0.5002 0.9876 0.9938
No log 4.5570 360 0.9571 0.4976 0.9571 0.9783
No log 4.5823 362 0.9388 0.4948 0.9388 0.9689
No log 4.6076 364 0.9628 0.4956 0.9628 0.9812
No log 4.6329 366 1.0142 0.4594 1.0142 1.0071
No log 4.6582 368 1.0830 0.4456 1.0830 1.0407
No log 4.6835 370 1.0817 0.4838 1.0817 1.0401
No log 4.7089 372 1.0333 0.4830 1.0333 1.0165
No log 4.7342 374 0.9576 0.4808 0.9576 0.9786
No log 4.7595 376 0.8653 0.4720 0.8653 0.9302
No log 4.7848 378 0.8421 0.4898 0.8421 0.9177
No log 4.8101 380 0.8566 0.5097 0.8566 0.9255
No log 4.8354 382 0.9500 0.5516 0.9500 0.9747
No log 4.8608 384 1.0602 0.5215 1.0602 1.0296
No log 4.8861 386 1.2372 0.4850 1.2372 1.1123
No log 4.9114 388 1.2754 0.4635 1.2754 1.1293
No log 4.9367 390 1.1591 0.5038 1.1591 1.0766
No log 4.9620 392 0.9943 0.5418 0.9943 0.9972
No log 4.9873 394 0.8385 0.5146 0.8385 0.9157
No log 5.0127 396 0.8240 0.5418 0.8240 0.9077
No log 5.0380 398 0.8456 0.5208 0.8456 0.9196
No log 5.0633 400 0.9272 0.5353 0.9272 0.9629
No log 5.0886 402 1.0614 0.5344 1.0614 1.0302
No log 5.1139 404 1.1880 0.4933 1.1880 1.0900
No log 5.1392 406 1.1300 0.4844 1.1300 1.0630
No log 5.1646 408 1.0136 0.5141 1.0136 1.0068
No log 5.1899 410 0.8722 0.5725 0.8722 0.9339
No log 5.2152 412 0.8006 0.5337 0.8006 0.8948
No log 5.2405 414 0.7826 0.4767 0.7826 0.8847
No log 5.2658 416 0.7944 0.4767 0.7944 0.8913
No log 5.2911 418 0.8149 0.4709 0.8149 0.9027
No log 5.3165 420 0.8651 0.4368 0.8651 0.9301
No log 5.3418 422 0.8691 0.4592 0.8691 0.9323
No log 5.3671 424 0.8522 0.4560 0.8522 0.9231
No log 5.3924 426 0.8592 0.4808 0.8592 0.9269
No log 5.4177 428 0.8397 0.5213 0.8397 0.9164
No log 5.4430 430 0.8090 0.5695 0.8090 0.8994
No log 5.4684 432 0.8112 0.5928 0.8112 0.9007
No log 5.4937 434 0.7785 0.6717 0.7785 0.8823
No log 5.5190 436 0.7813 0.6853 0.7813 0.8839
No log 5.5443 438 0.7597 0.6759 0.7597 0.8716
No log 5.5696 440 0.7557 0.6607 0.7557 0.8693
No log 5.5949 442 0.8260 0.6771 0.8260 0.9088
No log 5.6203 444 0.8880 0.6095 0.8880 0.9423
No log 5.6456 446 0.8737 0.6208 0.8737 0.9347
No log 5.6709 448 0.8092 0.6046 0.8092 0.8996
No log 5.6962 450 0.7828 0.5949 0.7828 0.8848
No log 5.7215 452 0.7932 0.6195 0.7932 0.8906
No log 5.7468 454 0.8714 0.6092 0.8714 0.9335
No log 5.7722 456 0.9906 0.5615 0.9906 0.9953
No log 5.7975 458 0.9998 0.5517 0.9998 0.9999
No log 5.8228 460 0.8537 0.6442 0.8537 0.9240
No log 5.8481 462 0.7014 0.6069 0.7014 0.8375
No log 5.8734 464 0.7072 0.5594 0.7072 0.8410
No log 5.8987 466 0.7274 0.5562 0.7274 0.8529
No log 5.9241 468 0.7492 0.5632 0.7492 0.8656
No log 5.9494 470 0.7979 0.6236 0.7979 0.8932
No log 5.9747 472 0.8865 0.6017 0.8865 0.9415
No log 6.0 474 0.9040 0.5840 0.9040 0.9508
No log 6.0253 476 0.8500 0.6061 0.8500 0.9220
No log 6.0506 478 0.7904 0.5772 0.7904 0.8890
No log 6.0759 480 0.7669 0.5132 0.7669 0.8758
No log 6.1013 482 0.7627 0.5190 0.7627 0.8733
No log 6.1266 484 0.8084 0.5380 0.8084 0.8991
No log 6.1519 486 0.9299 0.5275 0.9299 0.9643
No log 6.1772 488 0.9852 0.5054 0.9852 0.9926
No log 6.2025 490 1.0294 0.5239 1.0294 1.0146
No log 6.2278 492 1.0327 0.5143 1.0327 1.0162
No log 6.2532 494 1.0530 0.5039 1.0530 1.0262
No log 6.2785 496 1.0748 0.4921 1.0748 1.0367
No log 6.3038 498 1.0197 0.5325 1.0197 1.0098
0.4658 6.3291 500 0.9384 0.4884 0.9384 0.9687
0.4658 6.3544 502 0.9632 0.5055 0.9632 0.9814
0.4658 6.3797 504 1.1164 0.4868 1.1164 1.0566
0.4658 6.4051 506 1.3488 0.4278 1.3488 1.1614
0.4658 6.4304 508 1.4952 0.4032 1.4952 1.2228
0.4658 6.4557 510 1.3561 0.3998 1.3561 1.1645
0.4658 6.4810 512 1.1859 0.4568 1.1859 1.0890

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B parameters (F32 tensors, Safetensors format)
