ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k10_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0144
  • Qwk (quadratic weighted kappa): 0.3908
  • Mse: 1.0144 (the training objective is MSE, so loss and MSE coincide)
  • Rmse: 1.0072
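These metrics can be reproduced from raw score predictions. A minimal sketch using scikit-learn's cohen_kappa_score (the standard way QWK is computed; the exact evaluation code for this run is not shown here, and the sample labels below are illustrative, not the model's real outputs):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_metrics(y_true, y_pred):
    """Return QWK, MSE, and RMSE for integer-valued essay-score predictions."""
    qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
    mse = mean_squared_error(y_true, y_pred)
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}

# Illustrative labels only; perfect agreement gives qwk = 1.0, mse = 0.0.
print(eval_metrics([0, 1, 2, 3], [0, 1, 2, 3]))
```

Note that the reported Rmse is simply the square root of the reported Mse: sqrt(1.0144) ≈ 1.0072.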

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
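With lr_scheduler_type set to linear and no warmup reported, the learning rate decays from 2e-05 to 0 over the full training run. A small sketch of that schedule (the step count per epoch, 57, is inferred from the results table below, where epoch 2.0 falls at step 114; treat it as an assumption):

```python
def linear_lr(step, total_steps, base_lr=2e-05):
    """Learning rate after `step` optimizer updates under linear decay, no warmup."""
    remaining = max(0.0, (total_steps - step) / total_steps)
    return base_lr * remaining

# ~57 steps/epoch (epoch 2.0 at step 114 in the table) x 100 epochs, an assumption.
total_steps = 57 * 100
print(linear_lr(0, total_steps))            # base_lr at the start
print(linear_lr(total_steps, total_steps))  # decayed to 0.0 at the end
```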

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0351 2 4.4767 0.0144 4.4767 2.1158
No log 0.0702 4 3.2551 0.0110 3.2551 1.8042
No log 0.1053 6 1.8708 0.1273 1.8708 1.3678
No log 0.1404 8 1.3418 -0.0211 1.3418 1.1583
No log 0.1754 10 1.2254 0.1649 1.2254 1.1070
No log 0.2105 12 1.3274 0.0253 1.3274 1.1521
No log 0.2456 14 1.4190 -0.0141 1.4190 1.1912
No log 0.2807 16 1.2841 0.0454 1.2841 1.1332
No log 0.3158 18 1.2382 0.0454 1.2382 1.1127
No log 0.3509 20 1.2234 0.0872 1.2234 1.1061
No log 0.3860 22 1.2104 0.1370 1.2104 1.1002
No log 0.4211 24 1.3089 0.1327 1.3089 1.1441
No log 0.4561 26 1.2664 0.0918 1.2664 1.1253
No log 0.4912 28 1.1500 0.1935 1.1500 1.0724
No log 0.5263 30 1.1491 0.1979 1.1491 1.0720
No log 0.5614 32 1.1268 0.1417 1.1268 1.0615
No log 0.5965 34 1.1160 0.1711 1.1160 1.0564
No log 0.6316 36 1.2215 0.2085 1.2215 1.1052
No log 0.6667 38 1.2766 0.0900 1.2766 1.1298
No log 0.7018 40 1.1143 0.2155 1.1143 1.0556
No log 0.7368 42 1.2225 0.1758 1.2225 1.1057
No log 0.7719 44 1.2452 0.1756 1.2452 1.1159
No log 0.8070 46 1.1075 0.2973 1.1075 1.0524
No log 0.8421 48 1.1585 0.2432 1.1585 1.0763
No log 0.8772 50 1.2360 0.2202 1.2360 1.1118
No log 0.9123 52 1.1657 0.2243 1.1657 1.0797
No log 0.9474 54 1.0446 0.3756 1.0446 1.0221
No log 0.9825 56 1.2181 0.3605 1.2181 1.1037
No log 1.0175 58 1.2348 0.3304 1.2348 1.1112
No log 1.0526 60 1.0478 0.3342 1.0478 1.0236
No log 1.0877 62 1.0128 0.4120 1.0128 1.0064
No log 1.1228 64 1.0143 0.4388 1.0143 1.0071
No log 1.1579 66 0.9770 0.4512 0.9770 0.9884
No log 1.1930 68 0.9637 0.4996 0.9637 0.9817
No log 1.2281 70 0.9898 0.4548 0.9898 0.9949
No log 1.2632 72 1.0512 0.4381 1.0512 1.0253
No log 1.2982 74 1.0497 0.4275 1.0497 1.0245
No log 1.3333 76 1.0904 0.4102 1.0904 1.0442
No log 1.3684 78 1.1588 0.3971 1.1588 1.0765
No log 1.4035 80 1.1290 0.3950 1.1290 1.0626
No log 1.4386 82 1.1486 0.4107 1.1486 1.0717
No log 1.4737 84 1.1178 0.3965 1.1178 1.0572
No log 1.5088 86 1.1326 0.3907 1.1326 1.0642
No log 1.5439 88 1.1471 0.3780 1.1471 1.0710
No log 1.5789 90 1.3805 0.3375 1.3805 1.1749
No log 1.6140 92 1.3626 0.3979 1.3626 1.1673
No log 1.6491 94 1.0766 0.4273 1.0766 1.0376
No log 1.6842 96 1.2514 0.3824 1.2514 1.1187
No log 1.7193 98 1.1884 0.4555 1.1884 1.0901
No log 1.7544 100 1.0193 0.4079 1.0193 1.0096
No log 1.7895 102 0.9570 0.5051 0.9570 0.9783
No log 1.8246 104 1.0378 0.4974 1.0378 1.0187
No log 1.8596 106 1.1262 0.4924 1.1262 1.0612
No log 1.8947 108 0.9948 0.5301 0.9948 0.9974
No log 1.9298 110 0.9734 0.5135 0.9734 0.9866
No log 1.9649 112 0.9530 0.4773 0.9530 0.9762
No log 2.0 114 0.9546 0.3873 0.9546 0.9770
No log 2.0351 116 0.9707 0.4008 0.9707 0.9852
No log 2.0702 118 1.0474 0.4552 1.0474 1.0234
No log 2.1053 120 1.2169 0.4168 1.2169 1.1031
No log 2.1404 122 1.1648 0.4118 1.1648 1.0793
No log 2.1754 124 1.0453 0.4008 1.0453 1.0224
No log 2.2105 126 1.0330 0.4503 1.0330 1.0164
No log 2.2456 128 1.0378 0.4794 1.0378 1.0187
No log 2.2807 130 1.0116 0.3733 1.0116 1.0058
No log 2.3158 132 1.0219 0.3733 1.0219 1.0109
No log 2.3509 134 1.0285 0.3482 1.0285 1.0141
No log 2.3860 136 1.0331 0.3243 1.0331 1.0164
No log 2.4211 138 1.0660 0.3522 1.0660 1.0325
No log 2.4561 140 1.1099 0.3384 1.1099 1.0535
No log 2.4912 142 1.1530 0.3902 1.1530 1.0738
No log 2.5263 144 1.1820 0.3902 1.1820 1.0872
No log 2.5614 146 1.3410 0.4045 1.3410 1.1580
No log 2.5965 148 1.4672 0.3124 1.4672 1.2113
No log 2.6316 150 1.3690 0.3208 1.3690 1.1700
No log 2.6667 152 1.1331 0.4093 1.1331 1.0645
No log 2.7018 154 1.0911 0.2712 1.0911 1.0446
No log 2.7368 156 1.0410 0.3619 1.0410 1.0203
No log 2.7719 158 1.0261 0.3987 1.0261 1.0130
No log 2.8070 160 1.0748 0.3616 1.0748 1.0367
No log 2.8421 162 1.0740 0.4031 1.0740 1.0363
No log 2.8772 164 1.0140 0.3991 1.0140 1.0070
No log 2.9123 166 1.0093 0.3829 1.0093 1.0046
No log 2.9474 168 1.0087 0.4308 1.0087 1.0043
No log 2.9825 170 1.1381 0.3800 1.1381 1.0668
No log 3.0175 172 1.2276 0.3539 1.2276 1.1080
No log 3.0526 174 1.1135 0.4082 1.1135 1.0552
No log 3.0877 176 1.0055 0.4796 1.0055 1.0027
No log 3.1228 178 1.0341 0.4565 1.0341 1.0169
No log 3.1579 180 1.0600 0.4559 1.0600 1.0296
No log 3.1930 182 1.0335 0.4091 1.0335 1.0166
No log 3.2281 184 1.0693 0.3034 1.0693 1.0340
No log 3.2632 186 1.1025 0.3056 1.1025 1.0500
No log 3.2982 188 1.1030 0.3725 1.1030 1.0503
No log 3.3333 190 1.1432 0.3907 1.1432 1.0692
No log 3.3684 192 1.1994 0.4182 1.1994 1.0952
No log 3.4035 194 1.3792 0.3547 1.3792 1.1744
No log 3.4386 196 1.3932 0.3614 1.3932 1.1804
No log 3.4737 198 1.1695 0.3227 1.1695 1.0814
No log 3.5088 200 1.0916 0.3728 1.0916 1.0448
No log 3.5439 202 1.0823 0.3128 1.0823 1.0403
No log 3.5789 204 1.1191 0.3518 1.1191 1.0579
No log 3.6140 206 1.2635 0.3466 1.2635 1.1241
No log 3.6491 208 1.1859 0.3503 1.1859 1.0890
No log 3.6842 210 1.0283 0.3809 1.0283 1.0140
No log 3.7193 212 0.9978 0.3694 0.9978 0.9989
No log 3.7544 214 1.0010 0.4726 1.0010 1.0005
No log 3.7895 216 1.0369 0.4415 1.0369 1.0183
No log 3.8246 218 1.0514 0.3928 1.0514 1.0254
No log 3.8596 220 1.0212 0.5279 1.0212 1.0106
No log 3.8947 222 1.1064 0.4337 1.1064 1.0519
No log 3.9298 224 1.1428 0.3792 1.1428 1.0690
No log 3.9649 226 1.0432 0.3959 1.0432 1.0214
No log 4.0 228 0.9772 0.4894 0.9772 0.9885
No log 4.0351 230 1.0042 0.4572 1.0042 1.0021
No log 4.0702 232 1.0347 0.4031 1.0347 1.0172
No log 4.1053 234 1.1060 0.4282 1.1060 1.0517
No log 4.1404 236 1.0786 0.4028 1.0786 1.0385
No log 4.1754 238 1.0345 0.3729 1.0345 1.0171
No log 4.2105 240 0.9995 0.4681 0.9995 0.9997
No log 4.2456 242 1.0187 0.4521 1.0187 1.0093
No log 4.2807 244 1.0420 0.4699 1.0420 1.0208
No log 4.3158 246 1.0604 0.4589 1.0604 1.0297
No log 4.3509 248 1.0559 0.4969 1.0559 1.0276
No log 4.3860 250 1.0383 0.5012 1.0383 1.0190
No log 4.4211 252 1.0055 0.4664 1.0055 1.0027
No log 4.4561 254 0.9828 0.4246 0.9828 0.9913
No log 4.4912 256 0.9868 0.4559 0.9868 0.9934
No log 4.5263 258 0.9853 0.4146 0.9853 0.9926
No log 4.5614 260 1.0118 0.3990 1.0118 1.0059
No log 4.5965 262 0.9837 0.4366 0.9837 0.9918
No log 4.6316 264 0.9518 0.4278 0.9518 0.9756
No log 4.6667 266 0.9685 0.4420 0.9685 0.9841
No log 4.7018 268 0.9872 0.4738 0.9872 0.9936
No log 4.7368 270 0.9949 0.4738 0.9949 0.9974
No log 4.7719 272 0.9974 0.4420 0.9974 0.9987
No log 4.8070 274 0.9794 0.4787 0.9794 0.9896
No log 4.8421 276 0.9804 0.4757 0.9804 0.9902
No log 4.8772 278 0.9508 0.4787 0.9508 0.9751
No log 4.9123 280 0.9391 0.5058 0.9391 0.9691
No log 4.9474 282 0.9489 0.4626 0.9489 0.9741
No log 4.9825 284 0.9268 0.5155 0.9268 0.9627
No log 5.0175 286 0.9310 0.5012 0.9310 0.9649
No log 5.0526 288 1.0109 0.4028 1.0109 1.0055
No log 5.0877 290 1.0584 0.4200 1.0584 1.0288
No log 5.1228 292 0.9984 0.4266 0.9984 0.9992
No log 5.1579 294 0.9461 0.5012 0.9461 0.9727
No log 5.1930 296 0.9620 0.5058 0.9620 0.9808
No log 5.2281 298 0.9731 0.5072 0.9731 0.9865
No log 5.2632 300 0.9648 0.5089 0.9648 0.9822
No log 5.2982 302 0.9868 0.4181 0.9868 0.9934
No log 5.3333 304 0.9814 0.4181 0.9814 0.9906
No log 5.3684 306 0.9901 0.4356 0.9901 0.9950
No log 5.4035 308 1.0074 0.4356 1.0074 1.0037
No log 5.4386 310 0.9918 0.4356 0.9918 0.9959
No log 5.4737 312 0.9529 0.3852 0.9529 0.9762
No log 5.5088 314 0.9663 0.4059 0.9663 0.9830
No log 5.5439 316 0.9675 0.3346 0.9675 0.9836
No log 5.5789 318 0.9407 0.3948 0.9407 0.9699
No log 5.6140 320 1.0483 0.4607 1.0483 1.0239
No log 5.6491 322 1.1850 0.3693 1.1850 1.0886
No log 5.6842 324 1.1452 0.3599 1.1452 1.0701
No log 5.7193 326 1.0302 0.4761 1.0302 1.0150
No log 5.7544 328 0.9610 0.4546 0.9610 0.9803
No log 5.7895 330 0.9399 0.4934 0.9399 0.9695
No log 5.8246 332 0.9565 0.4587 0.9565 0.9780
No log 5.8596 334 0.9710 0.4143 0.9710 0.9854
No log 5.8947 336 0.9990 0.4297 0.9990 0.9995
No log 5.9298 338 1.0491 0.3809 1.0491 1.0243
No log 5.9649 340 1.0942 0.3657 1.0942 1.0461
No log 6.0 342 1.1294 0.3726 1.1294 1.0627
No log 6.0351 344 1.0731 0.3657 1.0731 1.0359
No log 6.0702 346 0.9682 0.4369 0.9682 0.9840
No log 6.1053 348 0.9497 0.4646 0.9497 0.9745
No log 6.1404 350 0.9579 0.4834 0.9579 0.9787
No log 6.1754 352 0.9435 0.4801 0.9435 0.9713
No log 6.2105 354 0.9507 0.4965 0.9507 0.9750
No log 6.2456 356 1.0234 0.3997 1.0234 1.0117
No log 6.2807 358 1.0465 0.3997 1.0465 1.0230
No log 6.3158 360 0.9802 0.4202 0.9802 0.9901
No log 6.3509 362 0.9357 0.4181 0.9357 0.9673
No log 6.3860 364 0.9464 0.3987 0.9464 0.9728
No log 6.4211 366 0.9542 0.3619 0.9542 0.9768
No log 6.4561 368 0.9657 0.4409 0.9657 0.9827
No log 6.4912 370 1.0038 0.4359 1.0038 1.0019
No log 6.5263 372 1.0733 0.3645 1.0733 1.0360
No log 6.5614 374 1.0733 0.3955 1.0733 1.0360
No log 6.5965 376 1.0007 0.5256 1.0007 1.0004
No log 6.6316 378 1.0401 0.4476 1.0401 1.0199
No log 6.6667 380 1.1285 0.3392 1.1285 1.0623
No log 6.7018 382 1.1312 0.3424 1.1312 1.0636
No log 6.7368 384 1.0910 0.3892 1.0910 1.0445
No log 6.7719 386 0.9752 0.4792 0.9752 0.9875
No log 6.8070 388 0.9442 0.5165 0.9442 0.9717
No log 6.8421 390 0.9961 0.4431 0.9961 0.9981
No log 6.8772 392 1.0165 0.4036 1.0165 1.0082
No log 6.9123 394 0.9838 0.4763 0.9838 0.9919
No log 6.9474 396 0.9729 0.4559 0.9729 0.9863
No log 6.9825 398 0.9493 0.4639 0.9493 0.9743
No log 7.0175 400 0.9422 0.4202 0.9422 0.9707
No log 7.0526 402 0.9573 0.3697 0.9573 0.9784
No log 7.0877 404 0.9483 0.4065 0.9483 0.9738
No log 7.1228 406 0.9358 0.4241 0.9358 0.9674
No log 7.1579 408 0.9415 0.3771 0.9415 0.9703
No log 7.1930 410 0.9671 0.4816 0.9671 0.9834
No log 7.2281 412 0.9850 0.4454 0.9850 0.9925
No log 7.2632 414 0.9824 0.4340 0.9824 0.9911
No log 7.2982 416 0.9815 0.4597 0.9815 0.9907
No log 7.3333 418 0.9765 0.4546 0.9765 0.9882
No log 7.3684 420 0.9536 0.4065 0.9536 0.9765
No log 7.4035 422 0.9397 0.3868 0.9397 0.9694
No log 7.4386 424 0.9436 0.4009 0.9436 0.9714
No log 7.4737 426 0.9699 0.4496 0.9699 0.9848
No log 7.5088 428 0.9515 0.4239 0.9515 0.9754
No log 7.5439 430 0.9580 0.4398 0.9580 0.9788
No log 7.5789 432 0.9928 0.4473 0.9928 0.9964
No log 7.6140 434 1.0136 0.4254 1.0136 1.0068
No log 7.6491 436 1.0643 0.4252 1.0643 1.0317
No log 7.6842 438 1.0437 0.4396 1.0437 1.0216
No log 7.7193 440 1.0092 0.4623 1.0092 1.0046
No log 7.7544 442 0.9657 0.4808 0.9657 0.9827
No log 7.7895 444 0.9512 0.4337 0.9512 0.9753
No log 7.8246 446 0.9905 0.4648 0.9905 0.9953
No log 7.8596 448 1.0542 0.3954 1.0542 1.0267
No log 7.8947 450 1.0071 0.4201 1.0071 1.0035
No log 7.9298 452 0.9347 0.4595 0.9347 0.9668
No log 7.9649 454 0.9326 0.4218 0.9326 0.9657
No log 8.0 456 0.9656 0.4160 0.9656 0.9827
No log 8.0351 458 0.9846 0.4311 0.9846 0.9923
No log 8.0702 460 1.0338 0.4540 1.0338 1.0167
No log 8.1053 462 1.1379 0.4199 1.1379 1.0667
No log 8.1404 464 1.1739 0.4199 1.1739 1.0835
No log 8.1754 466 1.0995 0.4404 1.0995 1.0486
No log 8.2105 468 1.0341 0.4146 1.0341 1.0169
No log 8.2456 470 1.0248 0.3039 1.0248 1.0123
No log 8.2807 472 1.0136 0.3371 1.0136 1.0068
No log 8.3158 474 0.9969 0.3424 0.9969 0.9984
No log 8.3509 476 1.0449 0.3560 1.0449 1.0222
No log 8.3860 478 1.1146 0.3567 1.1146 1.0557
No log 8.4211 480 1.1229 0.3579 1.1229 1.0597
No log 8.4561 482 1.0612 0.3595 1.0612 1.0302
No log 8.4912 484 0.9972 0.3779 0.9972 0.9986
No log 8.5263 486 0.9960 0.3224 0.9960 0.9980
No log 8.5614 488 0.9999 0.3205 0.9999 1.0000
No log 8.5965 490 0.9997 0.3224 0.9997 0.9998
No log 8.6316 492 1.0040 0.3472 1.0040 1.0020
No log 8.6667 494 1.0161 0.2803 1.0161 1.0080
No log 8.7018 496 1.0227 0.3811 1.0227 1.0113
No log 8.7368 498 1.0318 0.4273 1.0318 1.0158
0.3349 8.7719 500 1.0360 0.4053 1.0360 1.0178
0.3349 8.8070 502 1.0153 0.4048 1.0153 1.0076
0.3349 8.8421 504 1.0008 0.3430 1.0008 1.0004
0.3349 8.8772 506 1.0103 0.3243 1.0103 1.0051
0.3349 8.9123 508 1.0089 0.3430 1.0089 1.0044
0.3349 8.9474 510 1.0144 0.3908 1.0144 1.0072

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors: 0.1B params (F32)