MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k15_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8983
  • QWK (quadratic weighted kappa): 0.3196
  • MSE: 0.8983
  • RMSE: 0.9478
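The reported Loss and MSE are identical, which suggests the model was trained as a regressor with an MSE objective and then scored with QWK, the standard agreement metric for ordinal essay scores. As a minimal sketch (not the card's own evaluation code), QWK, MSE, and RMSE can be computed from integer scores like this:

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """QWK: agreement between ordinal ratings, penalized by squared distance."""
    # Observed confusion matrix
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Expected matrix comes from the marginal label histograms
    hist_true = Counter(y_true)
    hist_pred = Counter(y_pred)
    n = len(y_true)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            weight = (i - j) ** 2 / (n_classes - 1) ** 2
            expected = hist_true[i] * hist_pred[j] / n
            num += weight * observed[i][j]
            den += weight * expected
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    # RMSE is simply the square root of MSE, as in the table below
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)
```

Note that QWK is 1.0 for perfect agreement and 0.0 when agreement is no better than chance, so the final 0.3196 indicates modest agreement with the human scores.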

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
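With lr_scheduler_type set to linear and no warmup reported, the learning rate presumably decays linearly from 2e-05 to 0 over the training run. A minimal sketch of that schedule (the function name and warmup handling are illustrative, not taken from the training script):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: optional linear warmup, then linear decay to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

For example, halfway through training the learning rate is half the base value, and it reaches 0 at the final step.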

Training results

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
No log 0.0235 2 4.6898 -0.0103 4.6898 2.1656
No log 0.0471 4 3.3269 0.0153 3.3269 1.8240
No log 0.0706 6 1.6454 0.0504 1.6454 1.2827
No log 0.0941 8 1.5209 0.0082 1.5209 1.2332
No log 0.1176 10 1.3812 0.0346 1.3812 1.1752
No log 0.1412 12 1.2636 0.0977 1.2636 1.1241
No log 0.1647 14 1.2286 0.1211 1.2286 1.1084
No log 0.1882 16 1.2351 0.2038 1.2351 1.1113
No log 0.2118 18 1.2270 0.2886 1.2270 1.1077
No log 0.2353 20 1.1764 0.0682 1.1764 1.0846
No log 0.2588 22 1.1496 0.1814 1.1496 1.0722
No log 0.2824 24 1.1945 0.0954 1.1945 1.0929
No log 0.3059 26 1.2503 0.0353 1.2503 1.1182
No log 0.3294 28 1.4778 -0.0657 1.4778 1.2156
No log 0.3529 30 1.5027 0.0522 1.5027 1.2258
No log 0.3765 32 1.2792 0.1388 1.2792 1.1310
No log 0.4 34 1.0844 0.3902 1.0844 1.0414
No log 0.4235 36 0.9709 0.4363 0.9709 0.9854
No log 0.4471 38 0.9235 0.4012 0.9235 0.9610
No log 0.4706 40 0.8901 0.3974 0.8901 0.9434
No log 0.4941 42 0.8701 0.3830 0.8701 0.9328
No log 0.5176 44 0.8839 0.4993 0.8839 0.9401
No log 0.5412 46 0.9551 0.5209 0.9551 0.9773
No log 0.5647 48 0.9931 0.5325 0.9931 0.9966
No log 0.5882 50 0.9354 0.5618 0.9354 0.9672
No log 0.6118 52 0.8622 0.5153 0.8622 0.9285
No log 0.6353 54 0.9304 0.4962 0.9304 0.9646
No log 0.6588 56 1.0445 0.4954 1.0445 1.0220
No log 0.6824 58 0.8990 0.5293 0.8990 0.9482
No log 0.7059 60 0.7944 0.5802 0.7944 0.8913
No log 0.7294 62 0.8550 0.5683 0.8550 0.9247
No log 0.7529 64 0.8475 0.5683 0.8475 0.9206
No log 0.7765 66 0.7500 0.6142 0.7500 0.8660
No log 0.8 68 0.7899 0.5390 0.7899 0.8888
No log 0.8235 70 0.7943 0.5601 0.7943 0.8913
No log 0.8471 72 0.7754 0.5580 0.7754 0.8806
No log 0.8706 74 1.0626 0.4292 1.0626 1.0308
No log 0.8941 76 1.0935 0.4642 1.0935 1.0457
No log 0.9176 78 0.9747 0.5279 0.9747 0.9873
No log 0.9412 80 0.7625 0.5749 0.7625 0.8732
No log 0.9647 82 0.7785 0.4966 0.7785 0.8823
No log 0.9882 84 0.8307 0.5227 0.8307 0.9114
No log 1.0118 86 0.9018 0.5405 0.9018 0.9496
No log 1.0353 88 0.7740 0.5624 0.7740 0.8798
No log 1.0588 90 0.7241 0.6429 0.7241 0.8510
No log 1.0824 92 0.7518 0.6459 0.7518 0.8671
No log 1.1059 94 0.9017 0.5304 0.9017 0.9496
No log 1.1294 96 1.0043 0.4976 1.0043 1.0022
No log 1.1529 98 1.0582 0.5040 1.0582 1.0287
No log 1.1765 100 0.8102 0.6066 0.8102 0.9001
No log 1.2 102 0.7383 0.4947 0.7383 0.8592
No log 1.2235 104 0.7725 0.4794 0.7725 0.8789
No log 1.2471 106 0.7504 0.4993 0.7504 0.8663
No log 1.2706 108 0.7316 0.5526 0.7316 0.8553
No log 1.2941 110 0.7718 0.5787 0.7718 0.8785
No log 1.3176 112 0.8665 0.5385 0.8665 0.9309
No log 1.3412 114 0.8331 0.4572 0.8331 0.9127
No log 1.3647 116 0.8075 0.5125 0.8075 0.8986
No log 1.3882 118 0.8295 0.5300 0.8295 0.9108
No log 1.4118 120 0.9091 0.4291 0.9091 0.9535
No log 1.4353 122 0.8920 0.4907 0.8920 0.9444
No log 1.4588 124 0.8540 0.5072 0.8540 0.9241
No log 1.4824 126 0.8425 0.5596 0.8425 0.9179
No log 1.5059 128 0.9725 0.5145 0.9725 0.9862
No log 1.5294 130 1.0291 0.4836 1.0291 1.0144
No log 1.5529 132 0.8768 0.5416 0.8768 0.9364
No log 1.5765 134 0.8949 0.5251 0.8949 0.9460
No log 1.6 136 0.8869 0.5469 0.8869 0.9417
No log 1.6235 138 1.0013 0.4805 1.0013 1.0006
No log 1.6471 140 1.4164 0.3919 1.4164 1.1901
No log 1.6706 142 1.5376 0.2992 1.5376 1.2400
No log 1.6941 144 1.3194 0.3490 1.3194 1.1486
No log 1.7176 146 1.0227 0.4130 1.0227 1.0113
No log 1.7412 148 0.8654 0.3938 0.8654 0.9303
No log 1.7647 150 0.8589 0.4879 0.8589 0.9268
No log 1.7882 152 0.8633 0.4853 0.8633 0.9291
No log 1.8118 154 0.9240 0.4331 0.9240 0.9613
No log 1.8353 156 0.9525 0.3950 0.9525 0.9759
No log 1.8588 158 0.9864 0.4468 0.9864 0.9932
No log 1.8824 160 0.9390 0.3945 0.9390 0.9690
No log 1.9059 162 0.8731 0.5997 0.8731 0.9344
No log 1.9294 164 0.8539 0.5164 0.8539 0.9241
No log 1.9529 166 0.8464 0.5495 0.8464 0.9200
No log 1.9765 168 0.8426 0.6061 0.8426 0.9179
No log 2.0 170 0.8676 0.5477 0.8676 0.9315
No log 2.0235 172 0.8998 0.5192 0.8998 0.9486
No log 2.0471 174 0.9604 0.4166 0.9604 0.9800
No log 2.0706 176 1.1180 0.4217 1.1180 1.0574
No log 2.0941 178 1.1504 0.4108 1.1504 1.0726
No log 2.1176 180 1.0493 0.4218 1.0493 1.0243
No log 2.1412 182 0.9547 0.4254 0.9547 0.9771
No log 2.1647 184 0.8527 0.5501 0.8527 0.9234
No log 2.1882 186 0.8467 0.5167 0.8467 0.9201
No log 2.2118 188 0.8532 0.4676 0.8532 0.9237
No log 2.2353 190 0.8565 0.4671 0.8565 0.9255
No log 2.2588 192 0.8611 0.4947 0.8611 0.9280
No log 2.2824 194 0.8676 0.4884 0.8676 0.9314
No log 2.3059 196 0.8964 0.5013 0.8964 0.9468
No log 2.3294 198 0.9285 0.4289 0.9285 0.9636
No log 2.3529 200 0.8781 0.4998 0.8781 0.9371
No log 2.3765 202 0.8279 0.5239 0.8279 0.9099
No log 2.4 204 0.8274 0.5220 0.8274 0.9096
No log 2.4235 206 0.9186 0.5029 0.9186 0.9584
No log 2.4471 208 0.9702 0.4507 0.9702 0.9850
No log 2.4706 210 0.9181 0.4631 0.9181 0.9582
No log 2.4941 212 0.8722 0.5245 0.8722 0.9339
No log 2.5176 214 0.7874 0.5311 0.7874 0.8874
No log 2.5412 216 0.7832 0.5059 0.7832 0.8850
No log 2.5647 218 0.7990 0.4931 0.7990 0.8939
No log 2.5882 220 0.8777 0.4952 0.8777 0.9368
No log 2.6118 222 1.0399 0.4521 1.0399 1.0198
No log 2.6353 224 1.0269 0.4714 1.0269 1.0134
No log 2.6588 226 0.8384 0.5164 0.8384 0.9157
No log 2.6824 228 0.7848 0.5072 0.7848 0.8859
No log 2.7059 230 0.8037 0.5085 0.8037 0.8965
No log 2.7294 232 0.8200 0.5781 0.8200 0.9056
No log 2.7529 234 0.8683 0.5636 0.8683 0.9318
No log 2.7765 236 0.8304 0.5011 0.8304 0.9113
No log 2.8 238 0.8440 0.4699 0.8440 0.9187
No log 2.8235 240 0.8291 0.4800 0.8291 0.9105
No log 2.8471 242 0.8638 0.5086 0.8638 0.9294
No log 2.8706 244 0.9498 0.4521 0.9498 0.9746
No log 2.8941 246 0.9304 0.4316 0.9304 0.9646
No log 2.9176 248 0.8352 0.5055 0.8352 0.9139
No log 2.9412 250 0.8184 0.5082 0.8184 0.9047
No log 2.9647 252 0.8211 0.4343 0.8211 0.9062
No log 2.9882 254 0.8450 0.4983 0.8450 0.9193
No log 3.0118 256 0.9351 0.4222 0.9351 0.9670
No log 3.0353 258 0.9164 0.4565 0.9164 0.9573
No log 3.0588 260 0.8989 0.5046 0.8989 0.9481
No log 3.0824 262 0.8117 0.4373 0.8117 0.9010
No log 3.1059 264 0.8116 0.4780 0.8116 0.9009
No log 3.1294 266 0.8303 0.3908 0.8303 0.9112
No log 3.1529 268 0.9295 0.4622 0.9295 0.9641
No log 3.1765 270 1.1173 0.4508 1.1173 1.0570
No log 3.2 272 1.1825 0.3912 1.1825 1.0874
No log 3.2235 274 1.0487 0.4426 1.0487 1.0241
No log 3.2471 276 0.8613 0.5352 0.8613 0.9281
No log 3.2706 278 0.8475 0.5190 0.8475 0.9206
No log 3.2941 280 0.8481 0.5181 0.8481 0.9209
No log 3.3176 282 0.8615 0.4499 0.8615 0.9282
No log 3.3412 284 0.9005 0.4902 0.9005 0.9489
No log 3.3647 286 0.8937 0.4595 0.8937 0.9454
No log 3.3882 288 0.8735 0.4792 0.8735 0.9346
No log 3.4118 290 0.8639 0.4691 0.8639 0.9295
No log 3.4353 292 0.8509 0.4724 0.8509 0.9224
No log 3.4588 294 0.9031 0.4924 0.9031 0.9503
No log 3.4824 296 0.9276 0.5071 0.9276 0.9631
No log 3.5059 298 0.8388 0.5587 0.8388 0.9159
No log 3.5294 300 0.7718 0.5833 0.7718 0.8785
No log 3.5529 302 0.7487 0.6029 0.7487 0.8653
No log 3.5765 304 0.7568 0.5713 0.7568 0.8699
No log 3.6 306 0.7627 0.5315 0.7627 0.8733
No log 3.6235 308 0.8104 0.5521 0.8104 0.9002
No log 3.6471 310 0.8221 0.5610 0.8221 0.9067
No log 3.6706 312 0.7963 0.5868 0.7963 0.8924
No log 3.6941 314 0.7884 0.5606 0.7884 0.8879
No log 3.7176 316 0.7899 0.5606 0.7899 0.8888
No log 3.7412 318 0.8134 0.5636 0.8134 0.9019
No log 3.7647 320 0.8401 0.5380 0.8401 0.9165
No log 3.7882 322 0.8377 0.5291 0.8377 0.9153
No log 3.8118 324 0.8532 0.5501 0.8532 0.9237
No log 3.8353 326 0.9271 0.4744 0.9271 0.9629
No log 3.8588 328 0.9633 0.4623 0.9633 0.9815
No log 3.8824 330 0.9236 0.4335 0.9236 0.9610
No log 3.9059 332 0.9291 0.4889 0.9291 0.9639
No log 3.9294 334 0.9583 0.4120 0.9583 0.9789
No log 3.9529 336 0.9528 0.4032 0.9528 0.9761
No log 3.9765 338 0.9011 0.4634 0.9011 0.9493
No log 4.0 340 0.8991 0.3643 0.8991 0.9482
No log 4.0235 342 0.9301 0.4128 0.9301 0.9644
No log 4.0471 344 0.9593 0.3770 0.9593 0.9794
No log 4.0706 346 0.9832 0.3770 0.9832 0.9916
No log 4.0941 348 1.1211 0.4487 1.1211 1.0588
No log 4.1176 350 1.1719 0.4065 1.1719 1.0825
No log 4.1412 352 1.0983 0.4397 1.0983 1.0480
No log 4.1647 354 0.9677 0.3046 0.9677 0.9837
No log 4.1882 356 0.8774 0.3938 0.8774 0.9367
No log 4.2118 358 0.8880 0.4681 0.8880 0.9423
No log 4.2353 360 0.8795 0.4828 0.8795 0.9378
No log 4.2588 362 0.8561 0.5089 0.8561 0.9253
No log 4.2824 364 0.8989 0.4401 0.8989 0.9481
No log 4.3059 366 0.9422 0.5151 0.9422 0.9707
No log 4.3294 368 0.9209 0.5161 0.9209 0.9596
No log 4.3529 370 0.8756 0.4449 0.8756 0.9357
No log 4.3765 372 0.8402 0.5251 0.8402 0.9166
No log 4.4 374 0.8449 0.5275 0.8449 0.9192
No log 4.4235 376 0.8658 0.5072 0.8658 0.9305
No log 4.4471 378 0.8549 0.5491 0.8549 0.9246
No log 4.4706 380 0.8247 0.5755 0.8247 0.9081
No log 4.4941 382 0.8708 0.4277 0.8708 0.9332
No log 4.5176 384 0.9244 0.3802 0.9244 0.9615
No log 4.5412 386 0.8922 0.4144 0.8922 0.9446
No log 4.5647 388 0.8348 0.5011 0.8348 0.9137
No log 4.5882 390 0.8140 0.5755 0.8140 0.9022
No log 4.6118 392 0.8424 0.5566 0.8424 0.9178
No log 4.6353 394 0.8337 0.5566 0.8337 0.9131
No log 4.6588 396 0.8150 0.5607 0.8150 0.9028
No log 4.6824 398 0.8192 0.4540 0.8192 0.9051
No log 4.7059 400 0.8385 0.4505 0.8385 0.9157
No log 4.7294 402 0.8419 0.4241 0.8419 0.9176
No log 4.7529 404 0.8475 0.4241 0.8475 0.9206
No log 4.7765 406 0.8699 0.3874 0.8699 0.9327
No log 4.8 408 0.8918 0.4871 0.8918 0.9443
No log 4.8235 410 0.9307 0.5171 0.9307 0.9647
No log 4.8471 412 0.9062 0.4838 0.9062 0.9520
No log 4.8706 414 0.8698 0.4202 0.8698 0.9326
No log 4.8941 416 0.8546 0.3861 0.8546 0.9245
No log 4.9176 418 0.8606 0.3861 0.8606 0.9277
No log 4.9412 420 0.8805 0.3970 0.8805 0.9383
No log 4.9647 422 0.8631 0.4009 0.8631 0.9290
No log 4.9882 424 0.8325 0.4009 0.8325 0.9124
No log 5.0118 426 0.8229 0.4181 0.8229 0.9072
No log 5.0353 428 0.8168 0.4086 0.8168 0.9038
No log 5.0588 430 0.8308 0.5119 0.8308 0.9115
No log 5.0824 432 0.9160 0.4732 0.9160 0.9571
No log 5.1059 434 1.0341 0.4134 1.0341 1.0169
No log 5.1294 436 1.0143 0.4545 1.0143 1.0071
No log 5.1529 438 0.9069 0.4732 0.9069 0.9523
No log 5.1765 440 0.8649 0.4444 0.8649 0.9300
No log 5.2 442 0.8754 0.4418 0.8754 0.9357
No log 5.2235 444 0.8807 0.3753 0.8807 0.9385
No log 5.2471 446 0.9127 0.3931 0.9127 0.9553
No log 5.2706 448 0.9902 0.4093 0.9902 0.9951
No log 5.2941 450 0.9821 0.4091 0.9821 0.9910
No log 5.3176 452 0.9280 0.4614 0.9280 0.9634
No log 5.3412 454 0.8810 0.3827 0.8810 0.9386
No log 5.3647 456 0.8740 0.3632 0.8740 0.9349
No log 5.3882 458 0.8771 0.3609 0.8771 0.9365
No log 5.4118 460 0.8772 0.3609 0.8772 0.9366
No log 5.4353 462 0.8956 0.3861 0.8956 0.9464
No log 5.4588 464 0.9146 0.3577 0.9146 0.9563
No log 5.4824 466 0.9120 0.3577 0.9120 0.9550
No log 5.5059 468 0.8976 0.3861 0.8976 0.9474
No log 5.5294 470 0.8887 0.4104 0.8887 0.9427
No log 5.5529 472 0.8645 0.4104 0.8645 0.9298
No log 5.5765 474 0.8391 0.5136 0.8391 0.9160
No log 5.6 476 0.8191 0.5136 0.8191 0.9050
No log 5.6235 478 0.8101 0.5136 0.8101 0.9001
No log 5.6471 480 0.8086 0.5136 0.8086 0.8992
No log 5.6706 482 0.8067 0.4181 0.8067 0.8982
No log 5.6941 484 0.8326 0.4575 0.8326 0.9125
No log 5.7176 486 0.9422 0.5061 0.9422 0.9707
No log 5.7412 488 1.0299 0.4730 1.0299 1.0148
No log 5.7647 490 1.0226 0.4526 1.0226 1.0112
No log 5.7882 492 0.9739 0.3857 0.9739 0.9868
No log 5.8118 494 0.9361 0.4224 0.9361 0.9675
No log 5.8353 496 0.8648 0.3463 0.8648 0.9299
No log 5.8588 498 0.8504 0.4119 0.8504 0.9222
0.3391 5.8824 500 0.8475 0.3780 0.8475 0.9206
0.3391 5.9059 502 0.8451 0.4540 0.8451 0.9193
0.3391 5.9294 504 0.8539 0.4401 0.8539 0.9241
0.3391 5.9529 506 0.8580 0.4401 0.8580 0.9263
0.3391 5.9765 508 0.8510 0.4401 0.8510 0.9225
0.3391 6.0 510 0.8362 0.4352 0.8362 0.9144
0.3391 6.0235 512 0.8246 0.4388 0.8246 0.9081
0.3391 6.0471 514 0.8095 0.4715 0.8095 0.8997
0.3391 6.0706 516 0.7974 0.4671 0.7974 0.8930
0.3391 6.0941 518 0.8021 0.5595 0.8021 0.8956
0.3391 6.1176 520 0.8569 0.5058 0.8569 0.9257
0.3391 6.1412 522 0.8872 0.5236 0.8872 0.9419
0.3391 6.1647 524 0.8432 0.5073 0.8432 0.9183
0.3391 6.1882 526 0.8051 0.4801 0.8051 0.8973
0.3391 6.2118 528 0.8056 0.4257 0.8056 0.8975
0.3391 6.2353 530 0.8036 0.4485 0.8036 0.8965
0.3391 6.2588 532 0.8209 0.5570 0.8209 0.9060
0.3391 6.2824 534 0.8619 0.4845 0.8619 0.9284
0.3391 6.3059 536 0.9433 0.4781 0.9433 0.9712
0.3391 6.3294 538 0.9644 0.5000 0.9644 0.9821
0.3391 6.3529 540 0.9078 0.4533 0.9078 0.9528
0.3391 6.3765 542 0.8622 0.4440 0.8622 0.9285
0.3391 6.4 544 0.8592 0.4160 0.8592 0.9269
0.3391 6.4235 546 0.8549 0.4352 0.8549 0.9246
0.3391 6.4471 548 0.8498 0.4352 0.8498 0.9219
0.3391 6.4706 550 0.8386 0.4081 0.8386 0.9157
0.3391 6.4941 552 0.8313 0.3943 0.8313 0.9118
0.3391 6.5176 554 0.8137 0.4548 0.8137 0.9020
0.3391 6.5412 556 0.8042 0.4840 0.8042 0.8968
0.3391 6.5647 558 0.8177 0.5239 0.8177 0.9042
0.3391 6.5882 560 0.8173 0.5239 0.8173 0.9040
0.3391 6.6118 562 0.8056 0.5239 0.8056 0.8976
0.3391 6.6353 564 0.7969 0.5279 0.7969 0.8927
0.3391 6.6588 566 0.8006 0.4826 0.8006 0.8948
0.3391 6.6824 568 0.8126 0.4599 0.8126 0.9015
0.3391 6.7059 570 0.8196 0.3943 0.8196 0.9053
0.3391 6.7294 572 0.8496 0.5201 0.8496 0.9217
0.3391 6.7529 574 0.8774 0.4845 0.8774 0.9367
0.3391 6.7765 576 0.8555 0.5201 0.8555 0.9249
0.3391 6.8 578 0.8516 0.4278 0.8516 0.9228
0.3391 6.8235 580 0.8613 0.4540 0.8613 0.9281
0.3391 6.8471 582 0.8770 0.3663 0.8770 0.9365
0.3391 6.8706 584 0.8840 0.3738 0.8840 0.9402
0.3391 6.8941 586 0.9023 0.3301 0.9023 0.9499
0.3391 6.9176 588 0.9047 0.3190 0.9047 0.9512
0.3391 6.9412 590 0.8983 0.3196 0.8983 0.9478
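Validation QWK peaks well before the end of the log (0.6459 at step 92, versus 0.3196 at the final step 590), so keeping the checkpoint with the best QWK rather than the last one would likely improve this model. A minimal sketch of that selection over (step, qwk) log entries (the shortened log list here is illustrative, drawn from the table above):

```python
def best_checkpoint(log):
    """Pick the (step, qwk) entry with the highest validation QWK."""
    return max(log, key=lambda entry: entry[1])

# A few (step, qwk) pairs from the training log above
log = [(92, 0.6459), (306, 0.6029), (590, 0.3196)]
```

In the Trainer API this corresponds to enabling best-model loading keyed on the QWK metric, which this run apparently did not use.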

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1