ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k16_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the card's metadata does not name it). It achieves the following results on the evaluation set:

  • Loss: 0.6828
  • Qwk: 0.5727
  • Mse: 0.6828
  • Rmse: 0.8263

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
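With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate decays linearly from the initial value to zero over the total number of training steps. A sketch of that schedule (assuming zero warmup, since none is listed above):

```python
# Linear learning-rate decay implied by lr_scheduler_type: linear.
# Zero warmup steps is an assumption; the hyperparameter list above
# does not mention any warmup.
LEARNING_RATE = 2e-5

def linear_lr(step, total_steps, base_lr=LEARNING_RATE):
    """Learning rate decays linearly from base_lr at step 0 to 0 at total_steps."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)
```

For example, halfway through training the learning rate would be half the initial value (1e-05).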

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse

("No log" means the training loss had not yet been logged; it is first reported at step 500.)
No log 0.0222 2 4.4190 0.0074 4.4190 2.1021
No log 0.0444 4 3.3703 0.0110 3.3703 1.8358
No log 0.0667 6 1.5892 0.0142 1.5892 1.2606
No log 0.0889 8 1.2261 0.1468 1.2261 1.1073
No log 0.1111 10 1.1820 0.1857 1.1820 1.0872
No log 0.1333 12 1.1741 0.1417 1.1741 1.0836
No log 0.1556 14 1.1535 0.1417 1.1535 1.0740
No log 0.1778 16 1.1692 0.1852 1.1692 1.0813
No log 0.2 18 1.2379 0.0992 1.2379 1.1126
No log 0.2222 20 1.2614 0.1196 1.2614 1.1231
No log 0.2444 22 1.1552 0.1865 1.1552 1.0748
No log 0.2667 24 1.1063 0.1984 1.1063 1.0518
No log 0.2889 26 1.0525 0.3119 1.0525 1.0259
No log 0.3111 28 1.0529 0.2621 1.0529 1.0261
No log 0.3333 30 1.0257 0.3066 1.0257 1.0128
No log 0.3556 32 1.0519 0.3039 1.0519 1.0256
No log 0.3778 34 1.1857 0.2815 1.1857 1.0889
No log 0.4 36 1.3220 0.2925 1.3220 1.1498
No log 0.4222 38 1.4109 0.2927 1.4109 1.1878
No log 0.4444 40 1.3176 0.3442 1.3176 1.1479
No log 0.4667 42 1.3797 0.3624 1.3797 1.1746
No log 0.4889 44 1.2141 0.3689 1.2141 1.1019
No log 0.5111 46 1.0321 0.5159 1.0321 1.0159
No log 0.5333 48 0.8357 0.5779 0.8357 0.9142
No log 0.5556 50 0.7647 0.5447 0.7647 0.8745
No log 0.5778 52 0.7141 0.6356 0.7141 0.8451
No log 0.6 54 0.7734 0.5426 0.7734 0.8794
No log 0.6222 56 0.7335 0.5700 0.7335 0.8565
No log 0.6444 58 0.7380 0.5569 0.7380 0.8591
No log 0.6667 60 0.7502 0.4879 0.7502 0.8662
No log 0.6889 62 0.7564 0.5226 0.7564 0.8697
No log 0.7111 64 0.7203 0.6476 0.7203 0.8487
No log 0.7333 66 0.8219 0.5427 0.8219 0.9066
No log 0.7556 68 0.8639 0.5208 0.8639 0.9295
No log 0.7778 70 0.7540 0.6008 0.7540 0.8683
No log 0.8 72 0.6460 0.7217 0.6460 0.8038
No log 0.8222 74 0.6670 0.6410 0.6670 0.8167
No log 0.8444 76 0.6326 0.7232 0.6326 0.7954
No log 0.8667 78 0.6386 0.6791 0.6386 0.7991
No log 0.8889 80 0.6423 0.6588 0.6423 0.8014
No log 0.9111 82 0.6476 0.6368 0.6476 0.8048
No log 0.9333 84 0.7034 0.5905 0.7034 0.8387
No log 0.9556 86 0.6789 0.6317 0.6789 0.8240
No log 0.9778 88 0.6507 0.7066 0.6507 0.8067
No log 1.0 90 0.7936 0.5962 0.7936 0.8909
No log 1.0222 92 0.9145 0.5384 0.9145 0.9563
No log 1.0444 94 0.8781 0.5548 0.8781 0.9371
No log 1.0667 96 0.7339 0.5975 0.7339 0.8567
No log 1.0889 98 0.6821 0.5327 0.6821 0.8259
No log 1.1111 100 0.6844 0.5746 0.6844 0.8273
No log 1.1333 102 0.6546 0.6809 0.6546 0.8091
No log 1.1556 104 0.8604 0.6007 0.8604 0.9276
No log 1.1778 106 1.0673 0.5963 1.0673 1.0331
No log 1.2 108 0.8574 0.5836 0.8574 0.9260
No log 1.2222 110 0.6188 0.7088 0.6188 0.7866
No log 1.2444 112 0.7067 0.5881 0.7067 0.8407
No log 1.2667 114 0.6727 0.5678 0.6727 0.8202
No log 1.2889 116 0.6278 0.6908 0.6278 0.7923
No log 1.3111 118 0.7104 0.6330 0.7104 0.8428
No log 1.3333 120 0.9479 0.5028 0.9479 0.9736
No log 1.3556 122 1.0724 0.4741 1.0724 1.0356
No log 1.3778 124 0.9706 0.5425 0.9706 0.9852
No log 1.4 126 0.7284 0.6385 0.7284 0.8534
No log 1.4222 128 0.6674 0.6651 0.6674 0.8170
No log 1.4444 130 0.6977 0.6385 0.6977 0.8353
No log 1.4667 132 0.8796 0.6073 0.8796 0.9379
No log 1.4889 134 1.0217 0.5779 1.0217 1.0108
No log 1.5111 136 0.9597 0.6179 0.9597 0.9797
No log 1.5333 138 0.7597 0.6365 0.7597 0.8716
No log 1.5556 140 0.6667 0.7243 0.6667 0.8165
No log 1.5778 142 0.6491 0.7029 0.6491 0.8056
No log 1.6 144 0.7584 0.6474 0.7584 0.8708
No log 1.6222 146 1.1018 0.5437 1.1018 1.0497
No log 1.6444 148 1.1365 0.5331 1.1365 1.0661
No log 1.6667 150 0.8769 0.6045 0.8769 0.9364
No log 1.6889 152 0.6631 0.6481 0.6631 0.8143
No log 1.7111 154 0.6527 0.6369 0.6527 0.8079
No log 1.7333 156 0.7453 0.6481 0.7453 0.8633
No log 1.7556 158 0.7695 0.6568 0.7695 0.8772
No log 1.7778 160 0.7274 0.6755 0.7274 0.8529
No log 1.8 162 0.7980 0.6568 0.7980 0.8933
No log 1.8222 164 0.9144 0.5626 0.9144 0.9563
No log 1.8444 166 0.8779 0.5420 0.8779 0.9370
No log 1.8667 168 0.7640 0.6109 0.7640 0.8741
No log 1.8889 170 0.7575 0.5204 0.7575 0.8704
No log 1.9111 172 0.8076 0.5318 0.8076 0.8987
No log 1.9333 174 0.9245 0.5455 0.9245 0.9615
No log 1.9556 176 0.9137 0.5614 0.9137 0.9559
No log 1.9778 178 0.9302 0.6083 0.9302 0.9645
No log 2.0 180 0.8706 0.6062 0.8706 0.9331
No log 2.0222 182 0.8067 0.6159 0.8067 0.8981
No log 2.0444 184 0.9044 0.6219 0.9044 0.9510
No log 2.0667 186 1.0556 0.5467 1.0556 1.0274
No log 2.0889 188 0.9795 0.4872 0.9795 0.9897
No log 2.1111 190 0.9317 0.4714 0.9317 0.9652
No log 2.1333 192 0.9217 0.4632 0.9217 0.9601
No log 2.1556 194 0.8560 0.5736 0.8560 0.9252
No log 2.1778 196 0.7688 0.6026 0.7688 0.8768
No log 2.2 198 0.7808 0.6287 0.7808 0.8836
No log 2.2222 200 0.9055 0.5962 0.9055 0.9516
No log 2.2444 202 1.0061 0.6018 1.0061 1.0030
No log 2.2667 204 0.9056 0.5962 0.9056 0.9516
No log 2.2889 206 0.7856 0.6139 0.7856 0.8863
No log 2.3111 208 0.8475 0.5791 0.8475 0.9206
No log 2.3333 210 0.9197 0.5893 0.9197 0.9590
No log 2.3556 212 0.7868 0.5685 0.7868 0.8870
No log 2.3778 214 0.7234 0.5606 0.7234 0.8505
No log 2.4 216 0.7108 0.6084 0.7108 0.8431
No log 2.4222 218 0.7301 0.5957 0.7301 0.8545
No log 2.4444 220 0.7361 0.5766 0.7361 0.8579
No log 2.4667 222 0.7745 0.5741 0.7745 0.8801
No log 2.4889 224 0.8091 0.5716 0.8091 0.8995
No log 2.5111 226 0.8455 0.5614 0.8455 0.9195
No log 2.5333 228 0.8543 0.6207 0.8543 0.9243
No log 2.5556 230 0.8304 0.6026 0.8304 0.9112
No log 2.5778 232 0.7653 0.6239 0.7653 0.8748
No log 2.6 234 0.7583 0.6338 0.7583 0.8708
No log 2.6222 236 0.7554 0.6611 0.7554 0.8691
No log 2.6444 238 0.8150 0.5965 0.8150 0.9028
No log 2.6667 240 1.0086 0.5040 1.0086 1.0043
No log 2.6889 242 1.0630 0.4387 1.0630 1.0310
No log 2.7111 244 0.9852 0.5094 0.9852 0.9926
No log 2.7333 246 0.9759 0.4218 0.9759 0.9879
No log 2.7556 248 1.0294 0.4219 1.0294 1.0146
No log 2.7778 250 0.9407 0.4222 0.9407 0.9699
No log 2.8 252 0.8621 0.5295 0.8621 0.9285
No log 2.8222 254 0.8220 0.5303 0.8220 0.9066
No log 2.8444 256 0.8489 0.5661 0.8489 0.9214
No log 2.8667 258 0.8724 0.5427 0.8724 0.9340
No log 2.8889 260 0.8401 0.5719 0.8401 0.9166
No log 2.9111 262 0.7944 0.5661 0.7944 0.8913
No log 2.9333 264 0.7678 0.6132 0.7678 0.8762
No log 2.9556 266 0.7819 0.5675 0.7819 0.8842
No log 2.9778 268 0.8088 0.5536 0.8088 0.8993
No log 3.0 270 0.8714 0.5379 0.8714 0.9335
No log 3.0222 272 0.9005 0.5591 0.9005 0.9490
No log 3.0444 274 0.7931 0.6280 0.7931 0.8905
No log 3.0667 276 0.6829 0.6635 0.6829 0.8264
No log 3.0889 278 0.6473 0.6669 0.6473 0.8045
No log 3.1111 280 0.6525 0.6809 0.6525 0.8078
No log 3.1333 282 0.6772 0.6849 0.6772 0.8229
No log 3.1556 284 0.6859 0.6849 0.6859 0.8282
No log 3.1778 286 0.7696 0.6101 0.7696 0.8772
No log 3.2 288 0.8116 0.5719 0.8116 0.9009
No log 3.2222 290 0.7688 0.6013 0.7688 0.8768
No log 3.2444 292 0.7063 0.5898 0.7063 0.8404
No log 3.2667 294 0.6923 0.6144 0.6923 0.8320
No log 3.2889 296 0.6867 0.5727 0.6867 0.8287
No log 3.3111 298 0.7469 0.5844 0.7469 0.8642
No log 3.3333 300 0.7518 0.5844 0.7518 0.8670
No log 3.3556 302 0.6773 0.5806 0.6773 0.8230
No log 3.3778 304 0.6692 0.5983 0.6692 0.8180
No log 3.4 306 0.7203 0.5727 0.7203 0.8487
No log 3.4222 308 0.8582 0.5419 0.8582 0.9264
No log 3.4444 310 0.9401 0.4796 0.9401 0.9696
No log 3.4667 312 0.9305 0.5002 0.9305 0.9646
No log 3.4889 314 0.8486 0.4772 0.8486 0.9212
No log 3.5111 316 0.7885 0.5303 0.7885 0.8880
No log 3.5333 318 0.7946 0.5303 0.7946 0.8914
No log 3.5556 320 0.8009 0.5447 0.8009 0.8949
No log 3.5778 322 0.8802 0.5135 0.8802 0.9382
No log 3.6 324 0.9246 0.5286 0.9246 0.9616
No log 3.6222 326 0.9749 0.5237 0.9749 0.9874
No log 3.6444 328 0.8434 0.5365 0.8434 0.9184
No log 3.6667 330 0.7526 0.6518 0.7526 0.8675
No log 3.6889 332 0.7190 0.6324 0.7190 0.8480
No log 3.7111 334 0.7283 0.6455 0.7283 0.8534
No log 3.7333 336 0.7777 0.6076 0.7777 0.8819
No log 3.7556 338 0.9210 0.4921 0.9210 0.9597
No log 3.7778 340 0.9580 0.4487 0.9580 0.9788
No log 3.8 342 0.8616 0.4874 0.8616 0.9282
No log 3.8222 344 0.7685 0.6110 0.7685 0.8766
No log 3.8444 346 0.7303 0.6110 0.7303 0.8546
No log 3.8667 348 0.7866 0.5661 0.7866 0.8869
No log 3.8889 350 0.9428 0.5342 0.9428 0.9710
No log 3.9111 352 0.9978 0.5134 0.9978 0.9989
No log 3.9333 354 0.9154 0.5190 0.9154 0.9568
No log 3.9556 356 0.7906 0.5996 0.7906 0.8892
No log 3.9778 358 0.7442 0.5555 0.7442 0.8627
No log 4.0 360 0.7244 0.5392 0.7244 0.8511
No log 4.0222 362 0.7216 0.5647 0.7216 0.8495
No log 4.0444 364 0.7253 0.5854 0.7253 0.8516
No log 4.0667 366 0.7139 0.5969 0.7139 0.8449
No log 4.0889 368 0.7057 0.6272 0.7057 0.8401
No log 4.1111 370 0.7499 0.6089 0.7499 0.8660
No log 4.1333 372 0.9075 0.5130 0.9075 0.9526
No log 4.1556 374 0.9595 0.5177 0.9595 0.9795
No log 4.1778 376 0.8502 0.5451 0.8502 0.9221
No log 4.2 378 0.7661 0.5766 0.7661 0.8752
No log 4.2222 380 0.6797 0.6412 0.6797 0.8244
No log 4.2444 382 0.6717 0.5864 0.6717 0.8196
No log 4.2667 384 0.6907 0.6007 0.6907 0.8311
No log 4.2889 386 0.6411 0.6495 0.6411 0.8007
No log 4.3111 388 0.6027 0.7053 0.6027 0.7763
No log 4.3333 390 0.7559 0.6825 0.7559 0.8694
No log 4.3556 392 0.9875 0.5376 0.9875 0.9937
No log 4.3778 394 0.9974 0.6066 0.9974 0.9987
No log 4.4 396 0.9047 0.5626 0.9047 0.9512
No log 4.4222 398 0.7454 0.5875 0.7454 0.8633
No log 4.4444 400 0.7155 0.6110 0.7155 0.8459
No log 4.4667 402 0.7032 0.5959 0.7032 0.8386
No log 4.4889 404 0.6946 0.6263 0.6946 0.8334
No log 4.5111 406 0.6776 0.6262 0.6776 0.8232
No log 4.5333 408 0.6736 0.5866 0.6736 0.8207
No log 4.5556 410 0.6696 0.6281 0.6696 0.8183
No log 4.5778 412 0.6874 0.5847 0.6874 0.8291
No log 4.6 414 0.7798 0.6122 0.7798 0.8830
No log 4.6222 416 0.9494 0.5956 0.9494 0.9743
No log 4.6444 418 0.9273 0.5956 0.9273 0.9629
No log 4.6667 420 0.7686 0.6201 0.7686 0.8767
No log 4.6889 422 0.7088 0.6400 0.7088 0.8419
No log 4.7111 424 0.7028 0.6162 0.7028 0.8383
No log 4.7333 426 0.7390 0.5447 0.7390 0.8596
No log 4.7556 428 0.8187 0.5638 0.8187 0.9048
No log 4.7778 430 0.9177 0.5445 0.9177 0.9580
No log 4.8 432 0.9581 0.5425 0.9581 0.9788
No log 4.8222 434 0.8870 0.5705 0.8870 0.9418
No log 4.8444 436 0.7605 0.6398 0.7605 0.8720
No log 4.8667 438 0.6735 0.5766 0.6735 0.8207
No log 4.8889 440 0.6631 0.6085 0.6631 0.8143
No log 4.9111 442 0.6654 0.6085 0.6654 0.8157
No log 4.9333 444 0.6724 0.6217 0.6724 0.8200
No log 4.9556 446 0.6699 0.6269 0.6699 0.8185
No log 4.9778 448 0.6743 0.6384 0.6743 0.8211
No log 5.0 450 0.6731 0.5806 0.6731 0.8204
No log 5.0222 452 0.7121 0.6091 0.7121 0.8438
No log 5.0444 454 0.7152 0.5814 0.7152 0.8457
No log 5.0667 456 0.6804 0.5416 0.6804 0.8249
No log 5.0889 458 0.6798 0.5753 0.6798 0.8245
No log 5.1111 460 0.7087 0.6189 0.7087 0.8418
No log 5.1333 462 0.7317 0.6282 0.7317 0.8554
No log 5.1556 464 0.7154 0.6189 0.7154 0.8458
No log 5.1778 466 0.6671 0.6120 0.6671 0.8167
No log 5.2 468 0.6328 0.6343 0.6328 0.7955
No log 5.2222 470 0.6375 0.6203 0.6375 0.7985
No log 5.2444 472 0.6956 0.6580 0.6956 0.8340
No log 5.2667 474 0.8525 0.6073 0.8525 0.9233
No log 5.2889 476 0.9501 0.5486 0.9501 0.9747
No log 5.3111 478 0.8881 0.5984 0.8881 0.9424
No log 5.3333 480 0.7677 0.6398 0.7677 0.8762
No log 5.3556 482 0.6699 0.6718 0.6699 0.8185
No log 5.3778 484 0.6524 0.6406 0.6524 0.8077
No log 5.4 486 0.6383 0.6791 0.6383 0.7989
No log 5.4222 488 0.6460 0.6791 0.6460 0.8037
No log 5.4444 490 0.6661 0.6384 0.6661 0.8162
No log 5.4667 492 0.7074 0.6464 0.7074 0.8411
No log 5.4889 494 0.7565 0.6269 0.7565 0.8697
No log 5.5111 496 0.7837 0.6182 0.7837 0.8852
No log 5.5333 498 0.7718 0.6182 0.7718 0.8785
0.3447 5.5556 500 0.7422 0.6131 0.7422 0.8615
0.3447 5.5778 502 0.7026 0.6040 0.7026 0.8382
0.3447 5.6 504 0.7003 0.6320 0.7003 0.8369
0.3447 5.6222 506 0.7227 0.5839 0.7227 0.8501
0.3447 5.6444 508 0.6905 0.6320 0.6905 0.8310
0.3447 5.6667 510 0.6731 0.5963 0.6731 0.8204
0.3447 5.6889 512 0.6849 0.5753 0.6849 0.8276
0.3447 5.7111 514 0.6580 0.6255 0.6580 0.8112
0.3447 5.7333 516 0.6777 0.6460 0.6777 0.8232
0.3447 5.7556 518 0.6902 0.6154 0.6902 0.8308
0.3447 5.7778 520 0.6848 0.5964 0.6848 0.8275
0.3447 5.8 522 0.7033 0.5650 0.7033 0.8386
0.3447 5.8222 524 0.7564 0.6101 0.7564 0.8697
0.3447 5.8444 526 0.7433 0.5962 0.7433 0.8622
0.3447 5.8667 528 0.6828 0.5727 0.6828 0.8263
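The Qwk column above is the quadratic weighted kappa, an agreement metric commonly used for ordinal scoring tasks such as essay grading. An illustrative pure-Python implementation (assuming integer class labels `0..n_classes-1`; the card does not state the actual label set):

```python
# Illustrative quadratic weighted kappa (QWK). Labels are assumed to be
# integers 0..n_classes-1; this is a generic sketch, not the exact metric
# code used during this training run.
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    n = len(y_true)
    # Observed confusion matrix
    O = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Marginal histograms and expected matrix under chance agreement
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    E = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)]
         for i in range(n_classes)]
    # Quadratic disagreement weights: 0 on the diagonal, growing with distance
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    num = sum(w[i][j] * O[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * E[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den
```

Perfect agreement yields 1.0, chance-level agreement yields 0.0, and systematic disagreement is negative; the final Qwk of 0.5727 indicates moderate agreement with the reference scores.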

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (F32 tensors, Safetensors format)
