ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k17_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8322
  • Qwk: 0.6412
  • Mse: 0.8322
  • Rmse: 0.9123
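Qwk here is the quadratic weighted kappa between predicted and gold scores, and Rmse is simply the square root of Mse (0.9123 ≈ √0.8322). A minimal, dependency-free sketch of the kappa computation follows; the integer label encoding and class count are illustrative assumptions, not details taken from this run:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa for integer labels 0..n_classes-1."""
    n = len(y_true)
    # Observed confusion counts and marginal histograms.
    observed = [[0] * n_classes for _ in range(n_classes)]
    hist_true = [0] * n_classes
    hist_pred = [0] * n_classes
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
        hist_true[t] += 1
        hist_pred[p] += 1
    # Quadratic disagreement weights: observed vs. chance-expected.
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

# Rmse is the square root of Mse, matching the reported 0.8322 / 0.9123 pair.
assert abs(math.sqrt(0.8322) - 0.9123) < 1e-3
```

Perfect agreement gives a kappa of 1.0; disagreements far from the diagonal are penalized quadratically, which is why QWK is a common choice for ordinal essay scores.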

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
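With lr_scheduler_type: linear, the learning rate decays linearly from 2e-05 toward zero over the scheduled steps. A small sketch of that schedule (the warmup_steps parameter is included only for generality; no warmup is reported for this run):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: optional warmup, then linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = total_steps - step
    return base_lr * max(0.0, remaining / max(1, total_steps - warmup_steps))

# Full rate at the start, halfway at the midpoint, zero at the end.
print(linear_lr(0, 510))    # 2e-05
print(linear_lr(255, 510))  # 1e-05
print(linear_lr(510, 510))  # 0.0
```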

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0157 2 6.7883 0.0358 6.7883 2.6054
No log 0.0315 4 4.8413 0.0682 4.8413 2.2003
No log 0.0472 6 3.0603 0.0848 3.0603 1.7494
No log 0.0630 8 2.7090 0.1060 2.7090 1.6459
No log 0.0787 10 2.1067 0.1515 2.1067 1.4515
No log 0.0945 12 1.5891 0.1154 1.5891 1.2606
No log 0.1102 14 1.6815 0.1538 1.6815 1.2967
No log 0.1260 16 1.7015 0.2018 1.7015 1.3044
No log 0.1417 18 1.7046 0.1495 1.7046 1.3056
No log 0.1575 20 1.7080 0.2000 1.7080 1.3069
No log 0.1732 22 1.5865 0.3509 1.5865 1.2596
No log 0.1890 24 1.3117 0.4762 1.3117 1.1453
No log 0.2047 26 1.3973 0.4203 1.3973 1.1821
No log 0.2205 28 2.1031 0.3509 2.1031 1.4502
No log 0.2362 30 1.7712 0.4255 1.7712 1.3309
No log 0.2520 32 1.3228 0.3898 1.3228 1.1501
No log 0.2677 34 1.4996 0.3306 1.4996 1.2246
No log 0.2835 36 1.6272 0.1880 1.6272 1.2756
No log 0.2992 38 1.6205 0.1754 1.6205 1.2730
No log 0.3150 40 1.4026 0.3717 1.4026 1.1843
No log 0.3307 42 1.2039 0.5203 1.2039 1.0972
No log 0.3465 44 1.0494 0.5854 1.0494 1.0244
No log 0.3622 46 0.9865 0.5210 0.9865 0.9932
No log 0.3780 48 0.9417 0.5410 0.9417 0.9704
No log 0.3937 50 0.9620 0.576 0.9620 0.9808
No log 0.4094 52 1.0730 0.5714 1.0730 1.0359
No log 0.4252 54 1.1320 0.5581 1.1320 1.0639
No log 0.4409 56 1.1305 0.6 1.1305 1.0633
No log 0.4567 58 1.0333 0.6712 1.0333 1.0165
No log 0.4724 60 0.8163 0.6897 0.8163 0.9035
No log 0.4882 62 0.7448 0.7162 0.7448 0.8630
No log 0.5039 64 0.7328 0.7083 0.7328 0.8561
No log 0.5197 66 0.7433 0.6849 0.7433 0.8622
No log 0.5354 68 0.7711 0.6986 0.7711 0.8781
No log 0.5512 70 0.8266 0.6812 0.8266 0.9092
No log 0.5669 72 1.0364 0.6015 1.0364 1.0180
No log 0.5827 74 1.3931 0.4720 1.3931 1.1803
No log 0.5984 76 1.4228 0.5029 1.4228 1.1928
No log 0.6142 78 1.0851 0.6832 1.0851 1.0417
No log 0.6299 80 0.7439 0.7027 0.7439 0.8625
No log 0.6457 82 0.7365 0.6980 0.7365 0.8582
No log 0.6614 84 0.8055 0.6883 0.8055 0.8975
No log 0.6772 86 0.7863 0.7467 0.7863 0.8868
No log 0.6929 88 0.6867 0.7397 0.6867 0.8287
No log 0.7087 90 0.7795 0.6806 0.7795 0.8829
No log 0.7244 92 0.9020 0.7105 0.9020 0.9497
No log 0.7402 94 1.0885 0.6333 1.0885 1.0433
No log 0.7559 96 1.2807 0.6264 1.2807 1.1317
No log 0.7717 98 1.1752 0.5890 1.1752 1.0840
No log 0.7874 100 0.9266 0.6294 0.9266 0.9626
No log 0.8031 102 0.8045 0.6620 0.8045 0.8969
No log 0.8189 104 0.7592 0.7114 0.7592 0.8713
No log 0.8346 106 0.7171 0.7355 0.7171 0.8468
No log 0.8504 108 0.6534 0.7516 0.6534 0.8083
No log 0.8661 110 0.7070 0.7515 0.7070 0.8408
No log 0.8819 112 1.1652 0.6222 1.1652 1.0794
No log 0.8976 114 1.5356 0.5172 1.5356 1.2392
No log 0.9134 116 1.5614 0.4342 1.5614 1.2495
No log 0.9291 118 1.4097 0.4626 1.4097 1.1873
No log 0.9449 120 1.0525 0.6165 1.0525 1.0259
No log 0.9606 122 0.8510 0.6857 0.8510 0.9225
No log 0.9764 124 0.8067 0.7143 0.8067 0.8982
No log 0.9921 126 0.8334 0.7050 0.8334 0.9129
No log 1.0079 128 0.8988 0.6087 0.8988 0.9480
No log 1.0236 130 0.9679 0.5874 0.9679 0.9838
No log 1.0394 132 0.9865 0.6144 0.9865 0.9932
No log 1.0551 134 0.8222 0.6573 0.8222 0.9068
No log 1.0709 136 0.6885 0.6809 0.6885 0.8298
No log 1.0866 138 0.7159 0.6763 0.7159 0.8461
No log 1.1024 140 0.7610 0.6957 0.7610 0.8723
No log 1.1181 142 0.8494 0.6620 0.8494 0.9216
No log 1.1339 144 1.0654 0.6144 1.0654 1.0322
No log 1.1496 146 1.2582 0.5542 1.2582 1.1217
No log 1.1654 148 1.2743 0.4906 1.2743 1.1289
No log 1.1811 150 1.2891 0.5031 1.2891 1.1354
No log 1.1969 152 1.2260 0.5270 1.2260 1.1073
No log 1.2126 154 1.1048 0.5294 1.1048 1.0511
No log 1.2283 156 1.0799 0.5612 1.0799 1.0392
No log 1.2441 158 0.9890 0.5714 0.9890 0.9945
No log 1.2598 160 0.8318 0.6438 0.8318 0.9120
No log 1.2756 162 0.7045 0.7368 0.7045 0.8393
No log 1.2913 164 0.6566 0.7448 0.6566 0.8103
No log 1.3071 166 0.6554 0.7211 0.6554 0.8096
No log 1.3228 168 0.6968 0.7075 0.6968 0.8347
No log 1.3386 170 0.7736 0.7020 0.7736 0.8795
No log 1.3543 172 0.9850 0.6076 0.9850 0.9925
No log 1.3701 174 1.1904 0.6199 1.1904 1.0910
No log 1.3858 176 1.2922 0.6082 1.2922 1.1367
No log 1.4016 178 1.1966 0.5939 1.1966 1.0939
No log 1.4173 180 1.0237 0.6289 1.0237 1.0118
No log 1.4331 182 0.9033 0.6294 0.9033 0.9504
No log 1.4488 184 0.8090 0.6950 0.8090 0.8994
No log 1.4646 186 0.8016 0.7042 0.8016 0.8953
No log 1.4803 188 0.8022 0.7 0.8022 0.8957
No log 1.4961 190 0.8267 0.6525 0.8267 0.9092
No log 1.5118 192 0.8759 0.6216 0.8759 0.9359
No log 1.5276 194 0.8712 0.6620 0.8712 0.9334
No log 1.5433 196 0.9673 0.5714 0.9673 0.9835
No log 1.5591 198 1.0765 0.5414 1.0765 1.0375
No log 1.5748 200 1.0594 0.5414 1.0594 1.0293
No log 1.5906 202 0.8837 0.6176 0.8837 0.9400
No log 1.6063 204 0.7319 0.7234 0.7319 0.8555
No log 1.6220 206 0.7000 0.7234 0.7000 0.8366
No log 1.6378 208 0.7127 0.7172 0.7127 0.8442
No log 1.6535 210 0.7995 0.6667 0.7995 0.8942
No log 1.6693 212 0.8171 0.6712 0.8171 0.9039
No log 1.6850 214 0.8432 0.6577 0.8432 0.9182
No log 1.7008 216 0.8680 0.6483 0.8680 0.9317
No log 1.7165 218 0.8630 0.6475 0.8630 0.9290
No log 1.7323 220 0.8618 0.6475 0.8618 0.9283
No log 1.7480 222 0.8799 0.6370 0.8799 0.9381
No log 1.7638 224 0.8898 0.6165 0.8898 0.9433
No log 1.7795 226 0.9304 0.5802 0.9304 0.9646
No log 1.7953 228 0.9096 0.6316 0.9096 0.9537
No log 1.8110 230 0.8310 0.6812 0.8310 0.9116
No log 1.8268 232 0.7485 0.7273 0.7485 0.8651
No log 1.8425 234 0.7160 0.7682 0.7160 0.8462
No log 1.8583 236 0.7355 0.7564 0.7355 0.8576
No log 1.8740 238 0.8414 0.6988 0.8414 0.9173
No log 1.8898 240 1.0296 0.6889 1.0296 1.0147
No log 1.9055 242 1.1241 0.6667 1.1241 1.0602
No log 1.9213 244 1.0324 0.6667 1.0324 1.0160
No log 1.9370 246 0.8550 0.7020 0.8550 0.9247
No log 1.9528 248 0.7741 0.7211 0.7741 0.8798
No log 1.9685 250 0.7635 0.7183 0.7635 0.8738
No log 1.9843 252 0.7662 0.7324 0.7662 0.8753
No log 2.0 254 0.7572 0.7324 0.7572 0.8702
No log 2.0157 256 0.8347 0.6525 0.8347 0.9136
No log 2.0315 258 0.9941 0.5833 0.9941 0.9971
No log 2.0472 260 1.0738 0.5867 1.0738 1.0362
No log 2.0630 262 0.9629 0.6241 0.9629 0.9813
No log 2.0787 264 0.8544 0.6462 0.8544 0.9243
No log 2.0945 266 0.8538 0.6406 0.8538 0.9240
No log 2.1102 268 0.8970 0.6562 0.8970 0.9471
No log 2.1260 270 0.8858 0.6457 0.8858 0.9412
No log 2.1417 272 0.9280 0.6308 0.9280 0.9633
No log 2.1575 274 1.0649 0.5775 1.0649 1.0319
No log 2.1732 276 1.2150 0.5679 1.2150 1.1023
No log 2.1890 278 1.2350 0.5988 1.2350 1.1113
No log 2.2047 280 1.0419 0.6460 1.0419 1.0207
No log 2.2205 282 0.8691 0.6667 0.8691 0.9322
No log 2.2362 284 0.8199 0.6165 0.8199 0.9055
No log 2.2520 286 0.8654 0.6269 0.8654 0.9303
No log 2.2677 288 0.9133 0.6370 0.9133 0.9556
No log 2.2835 290 0.9509 0.5672 0.9509 0.9752
No log 2.2992 292 1.0237 0.6 1.0237 1.0118
No log 2.3150 294 1.1310 0.5655 1.1310 1.0635
No log 2.3307 296 1.0935 0.5860 1.0935 1.0457
No log 2.3465 298 0.9171 0.7 0.9171 0.9577
No log 2.3622 300 0.7525 0.7333 0.7525 0.8675
No log 2.3780 302 0.6545 0.7448 0.6545 0.8090
No log 2.3937 304 0.6977 0.7092 0.6977 0.8353
No log 2.4094 306 0.7401 0.6944 0.7401 0.8603
No log 2.4252 308 0.6977 0.7273 0.6977 0.8353
No log 2.4409 310 0.7241 0.6861 0.7241 0.8510
No log 2.4567 312 0.8410 0.6370 0.8410 0.9171
No log 2.4724 314 0.9976 0.5942 0.9976 0.9988
No log 2.4882 316 1.1640 0.5217 1.1640 1.0789
No log 2.5039 318 1.2758 0.5180 1.2758 1.1295
No log 2.5197 320 1.2503 0.5217 1.2503 1.1182
No log 2.5354 322 1.1017 0.5263 1.1017 1.0496
No log 2.5512 324 0.9879 0.5758 0.9879 0.9939
No log 2.5669 326 0.9232 0.5802 0.9232 0.9608
No log 2.5827 328 0.8684 0.6212 0.8684 0.9319
No log 2.5984 330 0.8940 0.6331 0.8940 0.9455
No log 2.6142 332 1.0072 0.56 1.0072 1.0036
No log 2.6299 334 1.1227 0.5655 1.1227 1.0596
No log 2.6457 336 1.1617 0.5109 1.1617 1.0778
No log 2.6614 338 1.1084 0.5294 1.1084 1.0528
No log 2.6772 340 0.9846 0.6029 0.9846 0.9923
No log 2.6929 342 0.8796 0.6519 0.8796 0.9379
No log 2.7087 344 0.8284 0.6569 0.8284 0.9102
No log 2.7244 346 0.8616 0.6519 0.8616 0.9282
No log 2.7402 348 0.8292 0.6866 0.8292 0.9106
No log 2.7559 350 0.8039 0.6866 0.8039 0.8966
No log 2.7717 352 0.7827 0.6866 0.7827 0.8847
No log 2.7874 354 0.7879 0.6901 0.7879 0.8876
No log 2.8031 356 0.7928 0.7152 0.7928 0.8904
No log 2.8189 358 0.8114 0.6957 0.8114 0.9008
No log 2.8346 360 0.8070 0.7337 0.8070 0.8983
No log 2.8504 362 0.7488 0.6939 0.7488 0.8654
No log 2.8661 364 0.7206 0.7222 0.7206 0.8489
No log 2.8819 366 0.7044 0.7465 0.7044 0.8393
No log 2.8976 368 0.7462 0.7183 0.7462 0.8638
No log 2.9134 370 0.8933 0.6286 0.8933 0.9451
No log 2.9291 372 1.0770 0.5369 1.0770 1.0378
No log 2.9449 374 1.0882 0.5548 1.0882 1.0432
No log 2.9606 376 1.0530 0.5556 1.0530 1.0261
No log 2.9764 378 0.9137 0.6423 0.9137 0.9559
No log 2.9921 380 0.8360 0.6906 0.8360 0.9143
No log 3.0079 382 0.8257 0.6957 0.8257 0.9087
No log 3.0236 384 0.7622 0.7143 0.7622 0.8730
No log 3.0394 386 0.7316 0.7286 0.7316 0.8554
No log 3.0551 388 0.8324 0.6923 0.8324 0.9123
No log 3.0709 390 0.9353 0.7030 0.9353 0.9671
No log 3.0866 392 0.9516 0.6626 0.9516 0.9755
No log 3.1024 394 0.8743 0.625 0.8743 0.9350
No log 3.1181 396 0.7965 0.6815 0.7965 0.8924
No log 3.1339 398 0.7930 0.6963 0.7930 0.8905
No log 3.1496 400 0.8154 0.6963 0.8154 0.9030
No log 3.1654 402 0.8817 0.6716 0.8817 0.9390
No log 3.1811 404 0.9866 0.6043 0.9866 0.9933
No log 3.1969 406 1.0203 0.5915 1.0203 1.0101
No log 3.2126 408 0.9699 0.6176 0.9699 0.9849
No log 3.2283 410 0.8889 0.6567 0.8889 0.9428
No log 3.2441 412 0.8714 0.6667 0.8714 0.9335
No log 3.2598 414 0.9392 0.6490 0.9392 0.9691
No log 3.2756 416 1.0751 0.6125 1.0751 1.0369
No log 3.2913 418 1.0576 0.6040 1.0576 1.0284
No log 3.3071 420 0.9453 0.6567 0.9453 0.9723
No log 3.3228 422 0.8648 0.6667 0.8648 0.9300
No log 3.3386 424 0.8193 0.6912 0.8193 0.9052
No log 3.3543 426 0.7927 0.6912 0.7927 0.8904
No log 3.3701 428 0.7872 0.7172 0.7872 0.8873
No log 3.3858 430 0.7832 0.7143 0.7832 0.8850
No log 3.4016 432 0.7644 0.7375 0.7644 0.8743
No log 3.4173 434 0.6831 0.7403 0.6831 0.8265
No log 3.4331 436 0.6639 0.7194 0.6639 0.8148
No log 3.4488 438 0.7031 0.7194 0.7031 0.8385
No log 3.4646 440 0.7679 0.6866 0.7679 0.8763
No log 3.4803 442 0.8364 0.6567 0.8364 0.9146
No log 3.4961 444 0.8766 0.6412 0.8766 0.9363
No log 3.5118 446 0.8422 0.6617 0.8422 0.9177
No log 3.5276 448 0.8323 0.6617 0.8323 0.9123
No log 3.5433 450 0.8242 0.6866 0.8242 0.9079
No log 3.5591 452 0.8644 0.6715 0.8644 0.9298
No log 3.5748 454 0.9000 0.6569 0.9000 0.9487
No log 3.5906 456 0.9527 0.5970 0.9527 0.9761
No log 3.6063 458 0.9681 0.6165 0.9681 0.9839
No log 3.6220 460 0.9238 0.6466 0.9238 0.9611
No log 3.6378 462 0.9674 0.6107 0.9674 0.9836
No log 3.6535 464 0.9909 0.6107 0.9909 0.9954
No log 3.6693 466 0.9741 0.6107 0.9741 0.9870
No log 3.6850 468 0.9336 0.6165 0.9336 0.9662
No log 3.7008 470 0.9062 0.6309 0.9062 0.9520
No log 3.7165 472 1.0247 0.6932 1.0247 1.0123
No log 3.7323 474 1.0486 0.6780 1.0486 1.0240
No log 3.7480 476 0.9794 0.6506 0.9794 0.9897
No log 3.7638 478 0.8504 0.6857 0.8504 0.9222
No log 3.7795 480 0.7795 0.6912 0.7795 0.8829
No log 3.7953 482 0.7996 0.7111 0.7996 0.8942
No log 3.8110 484 0.8337 0.6515 0.8337 0.9131
No log 3.8268 486 0.8644 0.6370 0.8644 0.9297
No log 3.8425 488 0.8662 0.6571 0.8662 0.9307
No log 3.8583 490 0.8280 0.6619 0.8280 0.9100
No log 3.8740 492 0.7695 0.6963 0.7695 0.8772
No log 3.8898 494 0.7183 0.7111 0.7183 0.8475
No log 3.9055 496 0.6885 0.7429 0.6885 0.8298
No log 3.9213 498 0.7038 0.7518 0.7038 0.8389
0.4568 3.9370 500 0.8139 0.725 0.8139 0.9022
0.4568 3.9528 502 0.8635 0.6795 0.8635 0.9293
0.4568 3.9685 504 0.8259 0.6715 0.8259 0.9088
0.4568 3.9843 506 0.7923 0.6462 0.7923 0.8901
0.4568 4.0 508 0.8023 0.6615 0.8023 0.8957
0.4568 4.0157 510 0.8322 0.6412 0.8322 0.9123
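Note that the final checkpoint (Qwk 0.6412 at step 510) is not the best in the log; validation Qwk peaks earlier (0.7682 around step 234). If intermediate checkpoints were saved, selecting the one with the highest Qwk can be sketched as follows (the rows below are a hand-copied subset of the table, not the complete log):

```python
# (step, validation Qwk) pairs copied from a few rows of the table above.
eval_log = [
    (234, 0.7682),
    (500, 0.7250),
    (502, 0.6795),
    (510, 0.6412),
]

# Pick the checkpoint whose validation Qwk is highest.
best_step, best_qwk = max(eval_log, key=lambda row: row[1])
print(best_step, best_qwk)  # 234 0.7682
```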

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors

  • Model size: 0.1B params
  • Tensor type: F32