ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k20_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a short code sketch of these metrics follows the list):

  • Loss: 1.0633
  • Qwk (quadratic weighted kappa): -0.0151
  • Mse (mean squared error): 1.0633
  • Rmse (root mean squared error): 1.0312
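
The card does not document how these metrics are computed. The sketch below shows the usual definitions for essay scoring, with Qwk taken to be Cohen's kappa with quadratic weights over rounded integer scores; the scikit-learn calls and the example labels are assumptions, not the author's evaluation code.

```python
# Minimal metric sketch: assumes Qwk = Cohen's kappa with quadratic weights
# computed on rounded integer scores; the labels below are purely hypothetical.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 1, 4, 2])            # hypothetical gold scores
y_pred = np.array([2.4, 2.8, 1.9, 3.6, 2.1])  # hypothetical model outputs

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
qwk = cohen_kappa_score(y_true.astype(int), np.rint(y_pred).astype(int),
                        weights="quadratic")
print(f"Mse={mse:.4f}  Rmse={rmse:.4f}  Qwk={qwk:.4f}")
```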

Model description

More information needed

Intended uses & limitations

More information needed
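
While the author has not filled this section in, the repository name suggests automated scoring of the "organization" trait of Arabic essays. Below is a minimal inference sketch, assuming the checkpoint carries a single-label regression head (consistent with the MSE/RMSE metrics above); the essay text is a placeholder.

```python
# Hypothetical usage sketch: assumes a single-label regression head,
# consistent with the MSE/RMSE evaluation metrics reported in this card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k20_task3_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

essay = "..."  # an Arabic essay to score (placeholder)
inputs = tokenizer(essay, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(f"predicted organization score: {score:.2f}")
```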

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
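
As a sketch, these settings map onto transformers' TrainingArguments as shown below; the output directory is a placeholder, and the rest of the training loop (model, datasets, metric function) is omitted.

```python
# Sketch: the hyperparameters above expressed as transformers TrainingArguments.
# output_dir is a hypothetical path; dataset and model wiring are not shown.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert-task3-organization",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```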

Training results

Training loss appears to be logged only every 500 steps (the Trainer default), so earlier rows show "No log". The Validation Loss and Mse columns are identical, which indicates a mean-squared-error objective; Rmse is its square root. A plotting sketch follows the table.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0194 2 3.5265 -0.0047 3.5265 1.8779
No log 0.0388 4 1.6708 0.0172 1.6708 1.2926
No log 0.0583 6 1.3410 -0.1773 1.3410 1.1580
No log 0.0777 8 1.2064 -0.1553 1.2064 1.0984
No log 0.0971 10 0.8236 -0.1233 0.8236 0.9075
No log 0.1165 12 0.7739 -0.1227 0.7739 0.8797
No log 0.1359 14 0.7746 -0.1230 0.7746 0.8801
No log 0.1553 16 0.8874 -0.0812 0.8874 0.9420
No log 0.1748 18 1.2765 0.0176 1.2765 1.1298
No log 0.1942 20 1.8728 0.0229 1.8728 1.3685
No log 0.2136 22 1.5998 -0.0258 1.5998 1.2648
No log 0.2330 24 1.2042 -0.0468 1.2042 1.0973
No log 0.2524 26 1.1642 0.0217 1.1642 1.0790
No log 0.2718 28 1.1174 0.0176 1.1174 1.0571
No log 0.2913 30 0.9235 -0.0545 0.9235 0.9610
No log 0.3107 32 0.7841 -0.0778 0.7841 0.8855
No log 0.3301 34 0.7377 0.1318 0.7377 0.8589
No log 0.3495 36 0.7239 0.1512 0.7239 0.8508
No log 0.3689 38 0.8910 -0.0788 0.8910 0.9439
No log 0.3883 40 1.3396 0.0840 1.3396 1.1574
No log 0.4078 42 0.9333 -0.0518 0.9333 0.9661
No log 0.4272 44 0.8074 0.0409 0.8074 0.8986
No log 0.4466 46 0.7257 0.1021 0.7257 0.8519
No log 0.4660 48 0.8130 -0.0571 0.8130 0.9017
No log 0.4854 50 0.7903 -0.1223 0.7903 0.8890
No log 0.5049 52 0.7964 -0.0753 0.7964 0.8924
No log 0.5243 54 0.8118 -0.0309 0.8118 0.9010
No log 0.5437 56 0.9243 -0.0442 0.9243 0.9614
No log 0.5631 58 1.2141 0.0319 1.2141 1.1018
No log 0.5825 60 1.1903 -0.0012 1.1903 1.0910
No log 0.6019 62 0.8804 0.1107 0.8804 0.9383
No log 0.6214 64 0.7714 0.1097 0.7714 0.8783
No log 0.6408 66 0.7566 0.0191 0.7566 0.8698
No log 0.6602 68 0.9088 0.1064 0.9088 0.9533
No log 0.6796 70 1.0210 0.0576 1.0210 1.0105
No log 0.6990 72 1.0128 0.0984 1.0128 1.0064
No log 0.7184 74 0.8680 -0.0008 0.8680 0.9317
No log 0.7379 76 0.9246 0.0793 0.9246 0.9616
No log 0.7573 78 0.9969 0.1437 0.9969 0.9984
No log 0.7767 80 0.9108 -0.0284 0.9108 0.9544
No log 0.7961 82 0.9852 -0.0055 0.9852 0.9926
No log 0.8155 84 1.0618 0.0366 1.0618 1.0304
No log 0.8350 86 1.1008 0.0134 1.1008 1.0492
No log 0.8544 88 0.9082 0.1410 0.9082 0.9530
No log 0.8738 90 0.9328 0.0632 0.9328 0.9658
No log 0.8932 92 1.0161 0.1032 1.0161 1.0080
No log 0.9126 94 0.9869 0.0479 0.9869 0.9934
No log 0.9320 96 0.9766 0.1099 0.9766 0.9882
No log 0.9515 98 0.9502 0.1099 0.9502 0.9748
No log 0.9709 100 0.8934 0.1775 0.8934 0.9452
No log 0.9903 102 0.9646 0.0975 0.9646 0.9822
No log 1.0097 104 0.8314 0.0 0.8314 0.9118
No log 1.0291 106 0.8298 0.0272 0.8298 0.9109
No log 1.0485 108 0.8835 0.1386 0.8835 0.9399
No log 1.0680 110 0.8376 0.0547 0.8376 0.9152
No log 1.0874 112 0.8111 0.0888 0.8111 0.9006
No log 1.1068 114 1.1950 0.0915 1.1950 1.0932
No log 1.1262 116 1.2433 0.1086 1.2433 1.1150
No log 1.1456 118 0.9405 0.0479 0.9405 0.9698
No log 1.1650 120 0.9819 0.2193 0.9819 0.9909
No log 1.1845 122 1.0377 0.0943 1.0377 1.0187
No log 1.2039 124 0.9765 0.2107 0.9765 0.9882
No log 1.2233 126 1.1755 0.0557 1.1755 1.0842
No log 1.2427 128 1.3725 0.0866 1.3725 1.1716
No log 1.2621 130 1.0884 0.0091 1.0884 1.0433
No log 1.2816 132 1.0036 0.1631 1.0036 1.0018
No log 1.3010 134 0.9190 0.1606 0.9190 0.9586
No log 1.3204 136 1.0110 0.0241 1.0110 1.0055
No log 1.3398 138 1.0559 0.0783 1.0559 1.0276
No log 1.3592 140 0.8992 0.0490 0.8992 0.9483
No log 1.3786 142 0.8848 0.1716 0.8848 0.9407
No log 1.3981 144 0.8565 0.1857 0.8565 0.9255
No log 1.4175 146 0.8069 0.0652 0.8069 0.8983
No log 1.4369 148 0.7980 0.1096 0.7980 0.8933
No log 1.4563 150 0.7268 0.1882 0.7268 0.8525
No log 1.4757 152 0.7448 0.1080 0.7448 0.8630
No log 1.4951 154 0.7722 0.1365 0.7722 0.8788
No log 1.5146 156 0.8566 0.0283 0.8566 0.9255
No log 1.5340 158 0.9573 0.0798 0.9573 0.9784
No log 1.5534 160 0.9072 0.1251 0.9072 0.9525
No log 1.5728 162 1.0249 0.1077 1.0249 1.0124
No log 1.5922 164 0.9795 0.0772 0.9795 0.9897
No log 1.6117 166 0.8930 0.0407 0.8930 0.9450
No log 1.6311 168 0.9101 -0.0149 0.9101 0.9540
No log 1.6505 170 0.9477 -0.0393 0.9477 0.9735
No log 1.6699 172 1.0066 0.0084 1.0066 1.0033
No log 1.6893 174 1.0307 -0.0302 1.0307 1.0152
No log 1.7087 176 1.0657 -0.0440 1.0657 1.0323
No log 1.7282 178 1.0781 -0.0395 1.0781 1.0383
No log 1.7476 180 1.1011 0.0238 1.1011 1.0493
No log 1.7670 182 1.1291 0.0355 1.1291 1.0626
No log 1.7864 184 1.0135 0.0262 1.0135 1.0067
No log 1.8058 186 0.9329 0.1131 0.9329 0.9659
No log 1.8252 188 0.9000 0.1051 0.9000 0.9487
No log 1.8447 190 0.8657 0.1132 0.8657 0.9304
No log 1.8641 192 0.8811 0.0868 0.8811 0.9387
No log 1.8835 194 0.8684 0.1209 0.8684 0.9319
No log 1.9029 196 0.8603 0.0362 0.8603 0.9275
No log 1.9223 198 0.8981 0.0334 0.8981 0.9477
No log 1.9417 200 0.9136 0.0392 0.9136 0.9558
No log 1.9612 202 0.9895 0.0255 0.9895 0.9947
No log 1.9806 204 1.0523 0.0286 1.0523 1.0258
No log 2.0 206 0.9624 0.0609 0.9624 0.9810
No log 2.0194 208 0.8693 0.0 0.8693 0.9324
No log 2.0388 210 0.8613 0.0426 0.8613 0.9281
No log 2.0583 212 0.8929 0.0913 0.8929 0.9449
No log 2.0777 214 0.9627 0.0134 0.9627 0.9812
No log 2.0971 216 1.0743 0.0506 1.0743 1.0365
No log 2.1165 218 1.0235 0.1141 1.0235 1.0117
No log 2.1359 220 1.0308 0.0934 1.0308 1.0153
No log 2.1553 222 1.1147 0.0729 1.1147 1.0558
No log 2.1748 224 0.9188 0.0406 0.9188 0.9585
No log 2.1942 226 1.0778 0.0576 1.0778 1.0382
No log 2.2136 228 1.0276 -0.0052 1.0276 1.0137
No log 2.2330 230 0.9288 0.0705 0.9288 0.9637
No log 2.2524 232 0.9443 -0.1304 0.9443 0.9718
No log 2.2718 234 0.9314 0.0581 0.9314 0.9651
No log 2.2913 236 1.0840 0.0692 1.0840 1.0412
No log 2.3107 238 1.2868 0.0786 1.2868 1.1344
No log 2.3301 240 1.0251 -0.0007 1.0251 1.0125
No log 2.3495 242 0.9382 0.0833 0.9382 0.9686
No log 2.3689 244 0.9744 -0.0079 0.9744 0.9871
No log 2.3883 246 0.8626 0.0827 0.8626 0.9287
No log 2.4078 248 0.8240 0.1228 0.8240 0.9078
No log 2.4272 250 0.8363 -0.0295 0.8363 0.9145
No log 2.4466 252 0.8031 -0.0262 0.8031 0.8962
No log 2.4660 254 0.8367 0.0209 0.8367 0.9147
No log 2.4854 256 0.8494 0.0179 0.8494 0.9216
No log 2.5049 258 0.8556 -0.0127 0.8556 0.9250
No log 2.5243 260 0.9226 0.0109 0.9226 0.9605
No log 2.5437 262 1.0050 0.0350 1.0050 1.0025
No log 2.5631 264 0.9361 -0.0515 0.9361 0.9675
No log 2.5825 266 0.9944 0.0649 0.9944 0.9972
No log 2.6019 268 1.0505 -0.0557 1.0505 1.0249
No log 2.6214 270 0.9364 0.0153 0.9364 0.9677
No log 2.6408 272 0.8903 0.0697 0.8903 0.9436
No log 2.6602 274 0.8866 -0.0230 0.8866 0.9416
No log 2.6796 276 0.8850 0.0670 0.8850 0.9407
No log 2.6990 278 0.8811 0.0679 0.8811 0.9387
No log 2.7184 280 0.8617 0.0688 0.8617 0.9283
No log 2.7379 282 0.8534 0.0822 0.8534 0.9238
No log 2.7573 284 0.8355 0.0633 0.8355 0.9140
No log 2.7767 286 0.8239 0.0764 0.8239 0.9077
No log 2.7961 288 0.8125 0.0804 0.8125 0.9014
No log 2.8155 290 0.8393 0.0226 0.8393 0.9161
No log 2.8350 292 0.8986 0.0065 0.8986 0.9479
No log 2.8544 294 0.8503 -0.0149 0.8503 0.9221
No log 2.8738 296 0.8375 -0.0025 0.8375 0.9151
No log 2.8932 298 0.8914 0.1339 0.8914 0.9441
No log 2.9126 300 0.8393 0.0495 0.8393 0.9162
No log 2.9320 302 0.8846 -0.0271 0.8846 0.9405
No log 2.9515 304 0.9814 -0.0828 0.9814 0.9907
No log 2.9709 306 0.8945 -0.0112 0.8945 0.9458
No log 2.9903 308 0.9033 0.0172 0.9033 0.9504
No log 3.0097 310 0.9867 0.0050 0.9867 0.9934
No log 3.0291 312 1.0140 0.0351 1.0140 1.0070
No log 3.0485 314 0.9830 -0.0108 0.9830 0.9915
No log 3.0680 316 0.9828 -0.0455 0.9828 0.9914
No log 3.0874 318 0.9011 -0.0753 0.9011 0.9492
No log 3.1068 320 0.8494 -0.0859 0.8494 0.9216
No log 3.1262 322 0.8314 -0.1072 0.8314 0.9118
No log 3.1456 324 0.9299 -0.0788 0.9299 0.9643
No log 3.1650 326 0.8815 -0.0316 0.8815 0.9389
No log 3.1845 328 0.8009 -0.1047 0.8009 0.8949
No log 3.2039 330 0.8206 -0.0488 0.8206 0.9059
No log 3.2233 332 0.8634 -0.0541 0.8634 0.9292
No log 3.2427 334 0.8799 -0.0898 0.8799 0.9380
No log 3.2621 336 0.8888 -0.0238 0.8888 0.9427
No log 3.2816 338 0.9457 -0.0341 0.9457 0.9724
No log 3.3010 340 0.9811 -0.0008 0.9811 0.9905
No log 3.3204 342 0.9201 0.0049 0.9201 0.9592
No log 3.3398 344 0.9499 -0.0187 0.9499 0.9746
No log 3.3592 346 0.9832 -0.0440 0.9832 0.9916
No log 3.3786 348 0.8750 -0.0407 0.8750 0.9354
No log 3.3981 350 0.9171 0.0095 0.9171 0.9577
No log 3.4175 352 0.9264 -0.0316 0.9264 0.9625
No log 3.4369 354 0.8664 -0.0614 0.8664 0.9308
No log 3.4563 356 0.9059 -0.0634 0.9059 0.9518
No log 3.4757 358 0.9095 -0.0595 0.9095 0.9537
No log 3.4951 360 0.8686 -0.0588 0.8686 0.9320
No log 3.5146 362 0.9088 0.0068 0.9088 0.9533
No log 3.5340 364 0.9066 0.0068 0.9066 0.9522
No log 3.5534 366 0.8926 0.0159 0.8926 0.9448
No log 3.5728 368 0.9241 -0.0912 0.9241 0.9613
No log 3.5922 370 0.8508 -0.0049 0.8508 0.9224
No log 3.6117 372 0.8819 0.0562 0.8819 0.9391
No log 3.6311 374 0.9062 0.0786 0.9062 0.9520
No log 3.6505 376 0.8416 -0.0240 0.8416 0.9174
No log 3.6699 378 0.8598 -0.0300 0.8598 0.9272
No log 3.6893 380 1.0662 -0.0137 1.0662 1.0326
No log 3.7087 382 1.1403 0.0219 1.1403 1.0679
No log 3.7282 384 0.9837 0.0326 0.9837 0.9918
No log 3.7476 386 0.8909 -0.0230 0.8909 0.9439
No log 3.7670 388 0.8509 0.0097 0.8509 0.9224
No log 3.7864 390 0.8073 -0.0186 0.8073 0.8985
No log 3.8058 392 0.7993 0.0247 0.7993 0.8941
No log 3.8252 394 0.8026 0.0821 0.8026 0.8959
No log 3.8447 396 0.8201 -0.0967 0.8201 0.9056
No log 3.8641 398 0.8252 -0.0056 0.8252 0.9084
No log 3.8835 400 0.8800 0.0562 0.8800 0.9381
No log 3.9029 402 1.0602 0.0224 1.0602 1.0297
No log 3.9223 404 1.0146 -0.0490 1.0146 1.0072
No log 3.9417 406 0.8921 -0.0178 0.8921 0.9445
No log 3.9612 408 0.8988 -0.0734 0.8988 0.9481
No log 3.9806 410 0.8810 -0.0870 0.8810 0.9386
No log 4.0 412 0.9801 0.0333 0.9801 0.9900
No log 4.0194 414 0.9824 -0.0056 0.9824 0.9911
No log 4.0388 416 0.8278 -0.0228 0.8278 0.9098
No log 4.0583 418 0.8191 -0.0785 0.8191 0.9051
No log 4.0777 420 0.8355 -0.0334 0.8355 0.9140
No log 4.0971 422 0.8236 -0.0170 0.8236 0.9075
No log 4.1165 424 0.9303 -0.0031 0.9303 0.9645
No log 4.1359 426 0.9264 0.1329 0.9264 0.9625
No log 4.1553 428 0.9050 -0.0391 0.9050 0.9513
No log 4.1748 430 0.9018 -0.0121 0.9018 0.9496
No log 4.1942 432 0.9291 0.0920 0.9291 0.9639
No log 4.2136 434 0.9030 0.0961 0.9030 0.9502
No log 4.2330 436 0.8432 0.1004 0.8432 0.9183
No log 4.2524 438 0.8531 0.1387 0.8531 0.9236
No log 4.2718 440 0.8129 0.0611 0.8129 0.9016
No log 4.2913 442 0.8169 0.0277 0.8169 0.9038
No log 4.3107 444 0.8296 -0.0462 0.8296 0.9108
No log 4.3301 446 0.8432 -0.0103 0.8432 0.9183
No log 4.3495 448 0.8853 0.1379 0.8853 0.9409
No log 4.3689 450 0.8181 -0.0156 0.8181 0.9045
No log 4.3883 452 0.8408 0.0547 0.8408 0.9170
No log 4.4078 454 0.8450 -0.0138 0.8450 0.9192
No log 4.4272 456 0.7790 -0.0056 0.7790 0.8826
No log 4.4466 458 0.8667 0.0831 0.8667 0.9310
No log 4.4660 460 1.0358 -0.0143 1.0358 1.0178
No log 4.4854 462 0.9653 -0.0076 0.9653 0.9825
No log 4.5049 464 0.8406 0.1001 0.8406 0.9169
No log 4.5243 466 0.8112 0.0027 0.8112 0.9007
No log 4.5437 468 0.8151 -0.0387 0.8151 0.9028
No log 4.5631 470 0.7725 -0.0030 0.7725 0.8789
No log 4.5825 472 0.8571 0.1342 0.8571 0.9258
No log 4.6019 474 0.9549 0.0287 0.9549 0.9772
No log 4.6214 476 0.9177 0.0711 0.9177 0.9580
No log 4.6408 478 0.7868 0.1097 0.7868 0.8870
No log 4.6602 480 0.7545 -0.0488 0.7545 0.8686
No log 4.6796 482 0.7795 -0.0892 0.7795 0.8829
No log 4.6990 484 0.7719 -0.0446 0.7719 0.8786
No log 4.7184 486 0.7953 0.0289 0.7953 0.8918
No log 4.7379 488 0.8865 0.0909 0.8865 0.9415
No log 4.7573 490 1.0218 -0.0545 1.0218 1.0108
No log 4.7767 492 0.9687 0.0333 0.9687 0.9842
No log 4.7961 494 0.8504 0.0650 0.8504 0.9222
No log 4.8155 496 0.8424 -0.0103 0.8424 0.9178
No log 4.8350 498 0.8351 0.0690 0.8351 0.9138
0.359 4.8544 500 0.8735 0.0409 0.8735 0.9346
0.359 4.8738 502 0.8381 -0.0295 0.8381 0.9155
0.359 4.8932 504 0.8161 0.0732 0.8161 0.9034
0.359 4.9126 506 0.8383 -0.0389 0.8383 0.9156
0.359 4.9320 508 0.8468 -0.0132 0.8468 0.9202
0.359 4.9515 510 0.8628 -0.0209 0.8628 0.9289
0.359 4.9709 512 0.9578 0.0016 0.9578 0.9787
0.359 4.9903 514 1.0633 -0.0151 1.0633 1.0312
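
Rather than reading the raw log, the same history can be recovered from the Trainer's trainer_state.json and plotted; the checkpoint path below is an assumption, not from this card.

```python
# Sketch: plot the evaluation-loss curve from a saved Trainer state.
# The checkpoint path is hypothetical; adjust it to the actual output dir.
import json
import matplotlib.pyplot as plt

with open("checkpoint-514/trainer_state.json") as f:
    state = json.load(f)

evals = [e for e in state["log_history"] if "eval_loss" in e]
plt.plot([e["step"] for e in evals], [e["eval_loss"] for e in evals])
plt.xlabel("step")
plt.ylabel("validation loss (MSE)")
plt.title("Evaluation loss over training")
plt.show()
```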

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1