ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k10_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (training dataset unspecified). It achieves the following results on the evaluation set:

  • Loss: 0.5725
  • QWK (quadratic weighted kappa): 0.5752
  • MSE: 0.5725
  • RMSE: 0.7566

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
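The hyperparameters above map directly onto Hugging Face `TrainingArguments`. This is a minimal sketch under that assumption; `output_dir` and the argument names for Adam's betas/epsilon are standard Transformers parameters, but the directory name is hypothetical and not from the original run.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the listed hyperparameters;
# output_dir is illustrative only.
training_args = TrainingArguments(
    output_dir="arabert-task5-organization",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```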

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.04 2 3.9274 -0.0062 3.9274 1.9818
No log 0.08 4 2.3575 0.0203 2.3575 1.5354
No log 0.12 6 1.4572 0.0 1.4572 1.2072
No log 0.16 8 1.1583 0.0053 1.1583 1.0763
No log 0.2 10 1.1266 0.2268 1.1266 1.0614
No log 0.24 12 1.0855 0.2517 1.0855 1.0419
No log 0.28 14 1.0739 0.2340 1.0739 1.0363
No log 0.32 16 1.0917 0.1589 1.0917 1.0448
No log 0.36 18 1.0658 0.2015 1.0658 1.0324
No log 0.4 20 1.0086 0.2897 1.0086 1.0043
No log 0.44 22 0.9765 0.2192 0.9765 0.9882
No log 0.48 24 0.9708 0.3396 0.9708 0.9853
No log 0.52 26 0.8859 0.4022 0.8859 0.9412
No log 0.56 28 0.8238 0.4530 0.8238 0.9076
No log 0.6 30 0.7944 0.4101 0.7944 0.8913
No log 0.64 32 0.8115 0.4041 0.8115 0.9009
No log 0.68 34 0.7798 0.3780 0.7798 0.8831
No log 0.72 36 0.7506 0.5127 0.7506 0.8664
No log 0.76 38 0.7206 0.5156 0.7206 0.8489
No log 0.8 40 0.7070 0.5883 0.7070 0.8408
No log 0.84 42 0.7230 0.5860 0.7230 0.8503
No log 0.88 44 0.7235 0.5630 0.7235 0.8506
No log 0.92 46 0.7234 0.5630 0.7234 0.8505
No log 0.96 48 0.7530 0.5504 0.7530 0.8677
No log 1.0 50 0.7794 0.5629 0.7794 0.8828
No log 1.04 52 0.7883 0.4345 0.7883 0.8879
No log 1.08 54 0.7001 0.5863 0.7001 0.8367
No log 1.12 56 0.6327 0.6066 0.6327 0.7954
No log 1.16 58 0.6561 0.6295 0.6561 0.8100
No log 1.2 60 0.6341 0.5653 0.6341 0.7963
No log 1.24 62 0.6464 0.4938 0.6464 0.8040
No log 1.28 64 0.6384 0.5771 0.6384 0.7990
No log 1.32 66 0.6540 0.6167 0.6540 0.8087
No log 1.36 68 0.6696 0.6554 0.6696 0.8183
No log 1.4 70 0.6267 0.5911 0.6267 0.7916
No log 1.44 72 0.6592 0.6195 0.6592 0.8119
No log 1.48 74 0.6591 0.6495 0.6591 0.8119
No log 1.52 76 0.9135 0.5875 0.9135 0.9557
No log 1.56 78 0.7767 0.6123 0.7767 0.8813
No log 1.6 80 0.6410 0.5478 0.6410 0.8006
No log 1.64 82 0.7834 0.5951 0.7834 0.8851
No log 1.68 84 0.7591 0.5928 0.7591 0.8713
No log 1.72 86 0.6321 0.6435 0.6321 0.7951
No log 1.76 88 0.6591 0.5819 0.6591 0.8119
No log 1.8 90 0.7636 0.5810 0.7636 0.8738
No log 1.84 92 0.7053 0.5948 0.7053 0.8398
No log 1.88 94 0.6415 0.5921 0.6415 0.8010
No log 1.92 96 0.8079 0.5172 0.8079 0.8989
No log 1.96 98 0.8106 0.5367 0.8106 0.9003
No log 2.0 100 0.7044 0.5547 0.7044 0.8393
No log 2.04 102 0.6167 0.6148 0.6167 0.7853
No log 2.08 104 0.6347 0.6511 0.6347 0.7967
No log 2.12 106 0.7316 0.6190 0.7316 0.8553
No log 2.16 108 0.7941 0.5985 0.7941 0.8911
No log 2.2 110 0.7275 0.6019 0.7275 0.8529
No log 2.24 112 0.6239 0.6235 0.6239 0.7899
No log 2.28 114 0.7105 0.6126 0.7105 0.8429
No log 2.32 116 0.7256 0.5857 0.7256 0.8518
No log 2.36 118 0.6150 0.6272 0.6150 0.7842
No log 2.4 120 0.5812 0.6412 0.5812 0.7624
No log 2.44 122 0.6901 0.6209 0.6901 0.8307
No log 2.48 124 0.6487 0.6313 0.6487 0.8054
No log 2.52 126 0.5442 0.6775 0.5442 0.7377
No log 2.56 128 0.5527 0.6677 0.5527 0.7435
No log 2.6 130 0.5221 0.6601 0.5221 0.7226
No log 2.64 132 0.5309 0.6528 0.5309 0.7286
No log 2.68 134 0.6350 0.6304 0.6350 0.7969
No log 2.72 136 0.6288 0.5978 0.6288 0.7930
No log 2.76 138 0.5773 0.6035 0.5773 0.7598
No log 2.8 140 0.5672 0.6198 0.5672 0.7531
No log 2.84 142 0.5633 0.6188 0.5633 0.7505
No log 2.88 144 0.5781 0.6503 0.5781 0.7603
No log 2.92 146 0.5762 0.6760 0.5762 0.7591
No log 2.96 148 0.6226 0.6197 0.6226 0.7890
No log 3.0 150 0.7195 0.5895 0.7195 0.8483
No log 3.04 152 0.7435 0.5250 0.7435 0.8622
No log 3.08 154 0.7836 0.4568 0.7836 0.8852
No log 3.12 156 0.7419 0.5176 0.7419 0.8613
No log 3.16 158 0.7368 0.5566 0.7368 0.8584
No log 3.2 160 0.7437 0.6014 0.7437 0.8624
No log 3.24 162 0.6467 0.6328 0.6467 0.8042
No log 3.28 164 0.6728 0.6305 0.6728 0.8203
No log 3.32 166 0.8053 0.6283 0.8053 0.8974
No log 3.36 168 0.7807 0.6114 0.7807 0.8835
No log 3.4 170 0.6360 0.6664 0.6360 0.7975
No log 3.44 172 0.6219 0.5622 0.6219 0.7886
No log 3.48 174 0.5980 0.5737 0.5980 0.7733
No log 3.52 176 0.5751 0.6239 0.5751 0.7584
No log 3.56 178 0.6225 0.6748 0.6225 0.7890
No log 3.6 180 0.5983 0.6865 0.5983 0.7735
No log 3.64 182 0.5692 0.7049 0.5692 0.7544
No log 3.68 184 0.5973 0.6204 0.5973 0.7728
No log 3.72 186 0.5930 0.6465 0.5930 0.7701
No log 3.76 188 0.6228 0.6728 0.6228 0.7892
No log 3.8 190 0.7903 0.6199 0.7903 0.8890
No log 3.84 192 0.8823 0.6119 0.8823 0.9393
No log 3.88 194 0.7519 0.5725 0.7519 0.8672
No log 3.92 196 0.6101 0.5880 0.6101 0.7811
No log 3.96 198 0.6020 0.5701 0.6020 0.7759
No log 4.0 200 0.6139 0.5125 0.6139 0.7835
No log 4.04 202 0.6281 0.4988 0.6281 0.7925
No log 4.08 204 0.6365 0.4883 0.6365 0.7978
No log 4.12 206 0.6496 0.5534 0.6496 0.8060
No log 4.16 208 0.6129 0.5986 0.6129 0.7829
No log 4.2 210 0.5777 0.6740 0.5777 0.7601
No log 4.24 212 0.5747 0.6998 0.5747 0.7581
No log 4.28 214 0.5913 0.6732 0.5913 0.7690
No log 4.32 216 0.6202 0.6314 0.6202 0.7875
No log 4.36 218 0.6399 0.5635 0.6399 0.7999
No log 4.4 220 0.7038 0.5005 0.7038 0.8390
No log 4.44 222 0.6420 0.4962 0.6420 0.8012
No log 4.48 224 0.5779 0.6500 0.5779 0.7602
No log 4.52 226 0.5657 0.6944 0.5657 0.7521
No log 4.56 228 0.5580 0.6473 0.5580 0.7470
No log 4.6 230 0.5617 0.6846 0.5617 0.7495
No log 4.64 232 0.5597 0.6297 0.5597 0.7482
No log 4.68 234 0.6023 0.6395 0.6023 0.7761
No log 4.72 236 0.6510 0.6151 0.6510 0.8068
No log 4.76 238 0.6300 0.5898 0.6300 0.7938
No log 4.8 240 0.5907 0.6314 0.5907 0.7686
No log 4.84 242 0.5780 0.6114 0.5780 0.7603
No log 4.88 244 0.5859 0.6306 0.5859 0.7655
No log 4.92 246 0.5888 0.6536 0.5888 0.7673
No log 4.96 248 0.6818 0.6301 0.6818 0.8257
No log 5.0 250 0.8329 0.5916 0.8329 0.9126
No log 5.04 252 0.7840 0.6489 0.7840 0.8854
No log 5.08 254 0.6680 0.6503 0.6680 0.8173
No log 5.12 256 0.5945 0.6297 0.5945 0.7711
No log 5.16 258 0.6461 0.5215 0.6461 0.8038
No log 5.2 260 0.6703 0.4944 0.6703 0.8187
No log 5.24 262 0.6719 0.5040 0.6719 0.8197
No log 5.28 264 0.6589 0.5146 0.6589 0.8117
No log 5.32 266 0.6432 0.5357 0.6432 0.8020
No log 5.36 268 0.6444 0.5730 0.6444 0.8028
No log 5.4 270 0.6326 0.6435 0.6326 0.7954
No log 5.44 272 0.6340 0.6511 0.6340 0.7962
No log 5.48 274 0.6311 0.6435 0.6311 0.7944
No log 5.52 276 0.6244 0.6729 0.6244 0.7902
No log 5.56 278 0.6231 0.5692 0.6231 0.7893
No log 5.6 280 0.6163 0.5471 0.6163 0.7850
No log 5.64 282 0.6124 0.6407 0.6124 0.7825
No log 5.68 284 0.6062 0.6589 0.6062 0.7786
No log 5.72 286 0.6105 0.6256 0.6105 0.7814
No log 5.76 288 0.6282 0.5855 0.6282 0.7926
No log 5.8 290 0.6194 0.6177 0.6194 0.7871
No log 5.84 292 0.6046 0.6853 0.6046 0.7776
No log 5.88 294 0.6987 0.6316 0.6987 0.8359
No log 5.92 296 0.6982 0.6482 0.6982 0.8356
No log 5.96 298 0.6042 0.6561 0.6042 0.7773
No log 6.0 300 0.6794 0.6029 0.6794 0.8242
No log 6.04 302 0.6953 0.6019 0.6953 0.8339
No log 6.08 304 0.6139 0.5969 0.6139 0.7835
No log 6.12 306 0.6059 0.6134 0.6059 0.7784
No log 6.16 308 0.5995 0.6426 0.5995 0.7743
No log 6.2 310 0.5978 0.6561 0.5978 0.7732
No log 6.24 312 0.5985 0.6916 0.5985 0.7736
No log 6.28 314 0.5990 0.6748 0.5990 0.7740
No log 6.32 316 0.5977 0.6602 0.5977 0.7731
No log 6.36 318 0.5838 0.6602 0.5838 0.7641
No log 6.4 320 0.5967 0.6092 0.5967 0.7725
No log 6.44 322 0.5941 0.5986 0.5941 0.7708
No log 6.48 324 0.5854 0.6672 0.5854 0.7651
No log 6.52 326 0.5558 0.7171 0.5558 0.7455
No log 6.56 328 0.5506 0.7380 0.5506 0.7420
No log 6.6 330 0.5898 0.7094 0.5898 0.7680
No log 6.64 332 0.6443 0.6310 0.6443 0.8027
No log 6.68 334 0.6407 0.6455 0.6407 0.8005
No log 6.72 336 0.6231 0.6092 0.6231 0.7894
No log 6.76 338 0.6463 0.5425 0.6463 0.8040
No log 6.8 340 0.6186 0.6014 0.6186 0.7865
No log 6.84 342 0.6005 0.6584 0.6005 0.7749
No log 6.88 344 0.6208 0.6380 0.6208 0.7879
No log 6.92 346 0.5921 0.6554 0.5921 0.7695
No log 6.96 348 0.5770 0.7404 0.5770 0.7596
No log 7.0 350 0.5714 0.6368 0.5714 0.7559
No log 7.04 352 0.5738 0.6392 0.5738 0.7575
No log 7.08 354 0.5789 0.6814 0.5789 0.7608
No log 7.12 356 0.6529 0.6230 0.6529 0.8081
No log 7.16 358 0.6664 0.5835 0.6664 0.8163
No log 7.2 360 0.6341 0.5964 0.6341 0.7963
No log 7.24 362 0.6274 0.5442 0.6274 0.7921
No log 7.28 364 0.6244 0.5536 0.6244 0.7902
No log 7.32 366 0.6116 0.5536 0.6116 0.7820
No log 7.36 368 0.5978 0.6380 0.5978 0.7732
No log 7.4 370 0.6112 0.6617 0.6112 0.7818
No log 7.44 372 0.5928 0.6894 0.5928 0.7699
No log 7.48 374 0.5820 0.6650 0.5820 0.7629
No log 7.52 376 0.6038 0.6468 0.6038 0.7770
No log 7.56 378 0.6075 0.6544 0.6075 0.7794
No log 7.6 380 0.6092 0.5902 0.6092 0.7805
No log 7.64 382 0.6007 0.6545 0.6007 0.7751
No log 7.68 384 0.6030 0.6085 0.6030 0.7766
No log 7.72 386 0.6109 0.5774 0.6109 0.7816
No log 7.76 388 0.6559 0.6272 0.6559 0.8099
No log 7.8 390 0.7324 0.5766 0.7324 0.8558
No log 7.84 392 0.7048 0.5572 0.7048 0.8395
No log 7.88 394 0.6337 0.6291 0.6337 0.7960
No log 7.92 396 0.6072 0.6275 0.6072 0.7793
No log 7.96 398 0.5905 0.6284 0.5905 0.7684
No log 8.0 400 0.5877 0.6545 0.5877 0.7666
No log 8.04 402 0.6100 0.6335 0.6100 0.7810
No log 8.08 404 0.6369 0.6640 0.6369 0.7981
No log 8.12 406 0.6566 0.6151 0.6566 0.8103
No log 8.16 408 0.6073 0.6656 0.6073 0.7793
No log 8.2 410 0.6074 0.6297 0.6074 0.7794
No log 8.24 412 0.6067 0.5898 0.6067 0.7789
No log 8.28 414 0.6064 0.6032 0.6064 0.7787
No log 8.32 416 0.6013 0.6407 0.6013 0.7754
No log 8.36 418 0.6515 0.6240 0.6515 0.8072
No log 8.4 420 0.6775 0.5864 0.6775 0.8231
No log 8.44 422 0.6477 0.6282 0.6477 0.8048
No log 8.48 424 0.6258 0.6068 0.6258 0.7911
No log 8.52 426 0.6318 0.6068 0.6318 0.7948
No log 8.56 428 0.6401 0.6491 0.6401 0.8000
No log 8.6 430 0.6482 0.6392 0.6482 0.8051
No log 8.64 432 0.6698 0.6050 0.6698 0.8184
No log 8.68 434 0.7438 0.5830 0.7438 0.8625
No log 8.72 436 0.7641 0.5915 0.7641 0.8741
No log 8.76 438 0.6929 0.6167 0.6929 0.8324
No log 8.8 440 0.6535 0.5475 0.6535 0.8084
No log 8.84 442 0.6517 0.5415 0.6517 0.8073
No log 8.88 444 0.6293 0.6001 0.6293 0.7933
No log 8.92 446 0.6060 0.6500 0.6060 0.7785
No log 8.96 448 0.6192 0.6872 0.6192 0.7869
No log 9.0 450 0.6245 0.6732 0.6245 0.7902
No log 9.04 452 0.6063 0.6672 0.6063 0.7786
No log 9.08 454 0.5999 0.6383 0.5999 0.7745
No log 9.12 456 0.6055 0.6383 0.6055 0.7781
No log 9.16 458 0.6367 0.6269 0.6367 0.7979
No log 9.2 460 0.6675 0.6063 0.6675 0.8170
No log 9.24 462 0.6900 0.5206 0.6900 0.8306
No log 9.28 464 0.6893 0.4494 0.6893 0.8303
No log 9.32 466 0.7056 0.4975 0.7056 0.8400
No log 9.36 468 0.7250 0.4505 0.7250 0.8515
No log 9.4 470 0.7068 0.4210 0.7068 0.8407
No log 9.44 472 0.6791 0.4804 0.6791 0.8241
No log 9.48 474 0.6952 0.5425 0.6952 0.8338
No log 9.52 476 0.7844 0.5756 0.7844 0.8857
No log 9.56 478 0.7957 0.5210 0.7957 0.8920
No log 9.6 480 0.7617 0.5527 0.7617 0.8728
No log 9.64 482 0.6856 0.6356 0.6856 0.8280
No log 9.68 484 0.6689 0.6356 0.6689 0.8179
No log 9.72 486 0.6709 0.6028 0.6709 0.8191
No log 9.76 488 0.6941 0.6160 0.6941 0.8331
No log 9.8 490 0.6648 0.6228 0.6648 0.8154
No log 9.84 492 0.6353 0.6347 0.6353 0.7971
No log 9.88 494 0.6149 0.6085 0.6149 0.7841
No log 9.92 496 0.6021 0.6095 0.6021 0.7760
No log 9.96 498 0.6174 0.6284 0.6174 0.7858
0.2665 10.0 500 0.6192 0.6380 0.6192 0.7869
0.2665 10.04 502 0.6758 0.6097 0.6758 0.8221
0.2665 10.08 504 0.6792 0.6333 0.6792 0.8241
0.2665 10.12 506 0.5996 0.6536 0.5996 0.7743
0.2665 10.16 508 0.5715 0.6614 0.5715 0.7560
0.2665 10.2 510 0.5743 0.6659 0.5743 0.7578
0.2665 10.24 512 0.5758 0.6406 0.5758 0.7588
0.2665 10.28 514 0.6404 0.6564 0.6404 0.8002
0.2665 10.32 516 0.6959 0.6546 0.6959 0.8342
0.2665 10.36 518 0.7138 0.6446 0.7138 0.8448
0.2665 10.4 520 0.6062 0.6838 0.6062 0.7786
0.2665 10.44 522 0.5579 0.6553 0.5579 0.7469
0.2665 10.48 524 0.5630 0.6115 0.5630 0.7503
0.2665 10.52 526 0.5536 0.6499 0.5536 0.7441
0.2665 10.56 528 0.5529 0.6732 0.5529 0.7436
0.2665 10.6 530 0.5599 0.6396 0.5599 0.7483
0.2665 10.64 532 0.5574 0.6687 0.5574 0.7466
0.2665 10.68 534 0.5396 0.6545 0.5396 0.7346
0.2665 10.72 536 0.5409 0.6389 0.5409 0.7355
0.2665 10.76 538 0.5472 0.6704 0.5472 0.7397
0.2665 10.8 540 0.5730 0.6167 0.5730 0.7570
0.2665 10.84 542 0.5695 0.6199 0.5695 0.7547
0.2665 10.88 544 0.5710 0.6748 0.5710 0.7556
0.2665 10.92 546 0.5597 0.6973 0.5597 0.7481
0.2665 10.96 548 0.5377 0.6464 0.5377 0.7333
0.2665 11.0 550 0.5357 0.6634 0.5357 0.7319
0.2665 11.04 552 0.5255 0.6581 0.5255 0.7249
0.2665 11.08 554 0.5267 0.6589 0.5267 0.7258
0.2665 11.12 556 0.5313 0.6770 0.5313 0.7289
0.2665 11.16 558 0.5506 0.6655 0.5506 0.7420
0.2665 11.2 560 0.6005 0.6485 0.6005 0.7749
0.2665 11.24 562 0.6184 0.6485 0.6184 0.7864
0.2665 11.28 564 0.6421 0.6377 0.6421 0.8013
0.2665 11.32 566 0.6060 0.6485 0.6060 0.7784
0.2665 11.36 568 0.5441 0.6291 0.5441 0.7376
0.2665 11.4 570 0.5359 0.6880 0.5359 0.7321
0.2665 11.44 572 0.5445 0.6602 0.5445 0.7379
0.2665 11.48 574 0.5736 0.5678 0.5736 0.7573
0.2665 11.52 576 0.5952 0.5654 0.5952 0.7715
0.2665 11.56 578 0.5962 0.5654 0.5962 0.7721
0.2665 11.6 580 0.5821 0.5540 0.5821 0.7629
0.2665 11.64 582 0.5725 0.5752 0.5725 0.7566

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params
  • Tensor type: F32 (Safetensors)