ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k11_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8306
  • Qwk: 0.6002
  • Mse: 0.8306
  • Rmse: 0.9114
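
The Qwk, Mse, and Rmse figures above follow standard definitions: Qwk is Cohen's kappa with quadratic weights, and Rmse is simply the square root of Mse (which is why 0.9114 ≈ √0.8306). A minimal stdlib-only sketch of these metrics — the function names are illustrative and not part of the training code:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the Qwk metric above)."""
    # Observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    # Marginal histograms; expected counts assume independent raters
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic weight
            expected = hist_t[i] * hist_p[j] / n
            num += w * O[i][j]
            den += w * expected
    return 1.0 - num / den

def mse(y_true, y_pred):
    """Mean squared error; Rmse is math.sqrt(mse(...))."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
```

Perfect agreement yields a kappa of 1.0, and disagreements on distant labels are penalized quadratically, which is why Qwk is a common choice for ordinal essay-scoring labels like these.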

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
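
With lr_scheduler_type: linear, the learning rate decays linearly from 2e-05 toward zero over the course of training. A minimal sketch of that schedule, assuming zero warmup steps (the card does not report a warmup setting); `linear_lr` is an illustrative name, not part of the Trainer API:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear LR schedule: ramp up over warmup, then decay linearly to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = total_steps - step
    return base_lr * max(0.0, remaining / max(1, total_steps - warmup_steps))
```

For example, halfway through training the learning rate has fallen to half its initial value, reaching zero at the final step.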

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0364 2 5.4526 -0.0250 5.4526 2.3351
No log 0.0727 4 3.4332 0.0300 3.4332 1.8529
No log 0.1091 6 2.4817 -0.0914 2.4817 1.5753
No log 0.1455 8 1.9628 0.0349 1.9628 1.4010
No log 0.1818 10 1.5892 -0.0169 1.5892 1.2606
No log 0.2182 12 1.2132 0.1785 1.2132 1.1015
No log 0.2545 14 1.1825 0.2182 1.1825 1.0875
No log 0.2909 16 1.1768 0.2216 1.1768 1.0848
No log 0.3273 18 1.1631 0.2365 1.1631 1.0785
No log 0.3636 20 1.4031 0.0527 1.4031 1.1845
No log 0.4 22 1.5171 0.0281 1.5171 1.2317
No log 0.4364 24 1.2336 0.2501 1.2336 1.1107
No log 0.4727 26 1.1800 0.2585 1.1800 1.0863
No log 0.5091 28 1.1498 0.2712 1.1498 1.0723
No log 0.5455 30 1.1630 0.2182 1.1630 1.0784
No log 0.5818 32 1.1120 0.2322 1.1120 1.0545
No log 0.6182 34 1.0448 0.2983 1.0448 1.0222
No log 0.6545 36 1.3351 0.2275 1.3351 1.1555
No log 0.6909 38 1.1332 0.3466 1.1332 1.0645
No log 0.7273 40 0.9597 0.4225 0.9597 0.9796
No log 0.7636 42 1.5358 0.0610 1.5358 1.2393
No log 0.8 44 1.8603 -0.1190 1.8603 1.3639
No log 0.8364 46 1.4880 0.0316 1.4880 1.2198
No log 0.8727 48 0.9650 0.4183 0.9650 0.9824
No log 0.9091 50 0.9858 0.4431 0.9858 0.9929
No log 0.9455 52 1.1354 0.3751 1.1354 1.0655
No log 0.9818 54 1.1133 0.2950 1.1133 1.0551
No log 1.0182 56 1.0591 0.3370 1.0591 1.0291
No log 1.0545 58 0.9840 0.2754 0.9840 0.9920
No log 1.0909 60 1.0252 0.4185 1.0252 1.0125
No log 1.1273 62 1.1053 0.3612 1.1053 1.0513
No log 1.1636 64 1.1350 0.3534 1.1350 1.0654
No log 1.2 66 1.1119 0.3290 1.1119 1.0545
No log 1.2364 68 1.2953 0.2607 1.2953 1.1381
No log 1.2727 70 1.6135 0.0875 1.6135 1.2702
No log 1.3091 72 1.4682 0.2944 1.4682 1.2117
No log 1.3455 74 1.2895 0.3170 1.2895 1.1355
No log 1.3818 76 1.1689 0.3991 1.1689 1.0812
No log 1.4182 78 0.9613 0.5455 0.9613 0.9805
No log 1.4545 80 0.8631 0.5970 0.8631 0.9290
No log 1.4909 82 0.8851 0.5753 0.8851 0.9408
No log 1.5273 84 0.8093 0.6060 0.8093 0.8996
No log 1.5636 86 0.9060 0.6063 0.9060 0.9519
No log 1.6 88 1.1588 0.4653 1.1588 1.0765
No log 1.6364 90 1.0835 0.5167 1.0835 1.0409
No log 1.6727 92 0.9317 0.5906 0.9317 0.9652
No log 1.7091 94 0.8976 0.6117 0.8976 0.9474
No log 1.7455 96 0.8364 0.6557 0.8364 0.9145
No log 1.7818 98 0.8376 0.6377 0.8376 0.9152
No log 1.8182 100 1.0385 0.5172 1.0385 1.0191
No log 1.8545 102 1.0957 0.4743 1.0957 1.0468
No log 1.8909 104 0.9951 0.5473 0.9951 0.9975
No log 1.9273 106 0.9039 0.5559 0.9039 0.9507
No log 1.9636 108 0.8801 0.5863 0.8801 0.9381
No log 2.0 110 0.7997 0.6174 0.7997 0.8942
No log 2.0364 112 0.7854 0.5997 0.7854 0.8862
No log 2.0727 114 0.9044 0.5336 0.9044 0.9510
No log 2.1091 116 1.0398 0.4580 1.0398 1.0197
No log 2.1455 118 0.8671 0.5815 0.8671 0.9312
No log 2.1818 120 0.7716 0.6693 0.7716 0.8784
No log 2.2182 122 0.7286 0.6959 0.7286 0.8536
No log 2.2545 124 0.7029 0.6919 0.7029 0.8384
No log 2.2909 126 0.7390 0.6437 0.7390 0.8597
No log 2.3273 128 0.7704 0.5966 0.7704 0.8777
No log 2.3636 130 0.6980 0.6101 0.6980 0.8355
No log 2.4 132 0.8238 0.5590 0.8238 0.9076
No log 2.4364 134 0.8174 0.5561 0.8174 0.9041
No log 2.4727 136 0.7093 0.6020 0.7093 0.8422
No log 2.5091 138 0.7615 0.6034 0.7615 0.8727
No log 2.5455 140 0.7019 0.6451 0.7019 0.8378
No log 2.5818 142 0.6917 0.6658 0.6917 0.8317
No log 2.6182 144 0.7012 0.6745 0.7012 0.8374
No log 2.6545 146 0.7695 0.6453 0.7695 0.8772
No log 2.6909 148 0.9079 0.6088 0.9079 0.9529
No log 2.7273 150 0.8018 0.6288 0.8018 0.8954
No log 2.7636 152 0.7748 0.6249 0.7748 0.8802
No log 2.8 154 0.8039 0.6080 0.8039 0.8966
No log 2.8364 156 0.7083 0.6556 0.7083 0.8416
No log 2.8727 158 0.6964 0.6616 0.6964 0.8345
No log 2.9091 160 0.7658 0.6079 0.7658 0.8751
No log 2.9455 162 0.8798 0.6125 0.8798 0.9380
No log 2.9818 164 0.9593 0.5834 0.9593 0.9794
No log 3.0182 166 1.0230 0.5397 1.0230 1.0114
No log 3.0545 168 0.9646 0.5666 0.9646 0.9821
No log 3.0909 170 0.8157 0.5452 0.8157 0.9032
No log 3.1273 172 0.7974 0.5175 0.7974 0.8930
No log 3.1636 174 0.7864 0.5602 0.7864 0.8868
No log 3.2 176 0.7780 0.6407 0.7780 0.8820
No log 3.2364 178 0.8295 0.6075 0.8295 0.9108
No log 3.2727 180 0.9753 0.5515 0.9753 0.9876
No log 3.3091 182 1.0080 0.5366 1.0080 1.0040
No log 3.3455 184 0.8457 0.5757 0.8457 0.9196
No log 3.3818 186 0.7457 0.5437 0.7457 0.8635
No log 3.4182 188 0.7624 0.5682 0.7624 0.8732
No log 3.4545 190 0.7844 0.5985 0.7844 0.8857
No log 3.4909 192 0.8372 0.5810 0.8372 0.9150
No log 3.5273 194 0.8011 0.6122 0.8011 0.8951
No log 3.5636 196 0.7286 0.6268 0.7286 0.8536
No log 3.6 198 0.7249 0.6165 0.7249 0.8514
No log 3.6364 200 0.7316 0.6239 0.7316 0.8553
No log 3.6727 202 0.7218 0.6527 0.7218 0.8496
No log 3.7091 204 0.7421 0.5463 0.7421 0.8615
No log 3.7455 206 0.7281 0.5424 0.7281 0.8533
No log 3.7818 208 0.7937 0.6150 0.7937 0.8909
No log 3.8182 210 1.0188 0.5452 1.0188 1.0094
No log 3.8545 212 1.1021 0.5428 1.1021 1.0498
No log 3.8909 214 0.9050 0.6156 0.9050 0.9513
No log 3.9273 216 0.8135 0.6369 0.8135 0.9020
No log 3.9636 218 0.8106 0.6565 0.8106 0.9003
No log 4.0 220 0.8952 0.6255 0.8952 0.9462
No log 4.0364 222 0.9232 0.6093 0.9232 0.9608
No log 4.0727 224 0.9317 0.5990 0.9317 0.9653
No log 4.1091 226 0.7829 0.6132 0.7829 0.8848
No log 4.1455 228 0.7032 0.5836 0.7032 0.8386
No log 4.1818 230 0.7378 0.6003 0.7378 0.8589
No log 4.2182 232 0.7445 0.5805 0.7445 0.8628
No log 4.2545 234 0.7128 0.5751 0.7128 0.8442
No log 4.2909 236 0.7295 0.6092 0.7295 0.8541
No log 4.3273 238 0.7315 0.6380 0.7315 0.8553
No log 4.3636 240 0.7060 0.6550 0.7060 0.8403
No log 4.4 242 0.6832 0.7099 0.6832 0.8266
No log 4.4364 244 0.6529 0.7276 0.6529 0.8080
No log 4.4727 246 0.6506 0.6608 0.6506 0.8066
No log 4.5091 248 0.6371 0.7067 0.6371 0.7982
No log 4.5455 250 0.7411 0.6267 0.7411 0.8609
No log 4.5818 252 0.7917 0.6007 0.7917 0.8898
No log 4.6182 254 0.7380 0.6012 0.7380 0.8591
No log 4.6545 256 0.6804 0.6684 0.6804 0.8249
No log 4.6909 258 0.6615 0.6778 0.6615 0.8133
No log 4.7273 260 0.6732 0.6838 0.6732 0.8205
No log 4.7636 262 0.7485 0.6398 0.7485 0.8652
No log 4.8 264 0.9439 0.5871 0.9439 0.9716
No log 4.8364 266 0.9755 0.5945 0.9755 0.9877
No log 4.8727 268 0.8457 0.6076 0.8457 0.9196
No log 4.9091 270 0.8321 0.6306 0.8321 0.9122
No log 4.9455 272 0.8569 0.6496 0.8569 0.9257
No log 4.9818 274 0.9467 0.6274 0.9467 0.9730
No log 5.0182 276 0.8704 0.6179 0.8704 0.9329
No log 5.0545 278 0.8210 0.6174 0.8210 0.9061
No log 5.0909 280 0.7858 0.6244 0.7858 0.8864
No log 5.1273 282 0.8197 0.6134 0.8197 0.9053
No log 5.1636 284 0.8044 0.6212 0.8044 0.8969
No log 5.2 286 0.8715 0.6099 0.8715 0.9335
No log 5.2364 288 0.9636 0.5852 0.9636 0.9816
No log 5.2727 290 0.9111 0.6298 0.9111 0.9545
No log 5.3091 292 0.7699 0.6833 0.7699 0.8774
No log 5.3455 294 0.7255 0.6870 0.7255 0.8517
No log 5.3818 296 0.7124 0.6955 0.7124 0.8441
No log 5.4182 298 0.7159 0.7187 0.7159 0.8461
No log 5.4545 300 0.7632 0.6494 0.7632 0.8736
No log 5.4909 302 0.7957 0.6244 0.7957 0.8920
No log 5.5273 304 0.8553 0.6099 0.8553 0.9248
No log 5.5636 306 0.8024 0.6374 0.8024 0.8958
No log 5.6 308 0.7760 0.6410 0.7760 0.8809
No log 5.6364 310 0.7400 0.6501 0.7400 0.8603
No log 5.6727 312 0.7677 0.6655 0.7677 0.8762
No log 5.7091 314 0.7531 0.6743 0.7531 0.8678
No log 5.7455 316 0.7443 0.6636 0.7443 0.8627
No log 5.7818 318 0.6842 0.7076 0.6842 0.8272
No log 5.8182 320 0.6805 0.6634 0.6805 0.8249
No log 5.8545 322 0.7199 0.6636 0.7199 0.8485
No log 5.8909 324 0.8161 0.6842 0.8161 0.9034
No log 5.9273 326 0.7891 0.7010 0.7891 0.8883
No log 5.9636 328 0.7126 0.7344 0.7126 0.8442
No log 6.0 330 0.6722 0.7204 0.6722 0.8199
No log 6.0364 332 0.6529 0.7297 0.6529 0.8080
No log 6.0727 334 0.6471 0.7428 0.6471 0.8044
No log 6.1091 336 0.6640 0.6797 0.6640 0.8149
No log 6.1455 338 0.6931 0.6250 0.6931 0.8325
No log 6.1818 340 0.6686 0.6280 0.6686 0.8177
No log 6.2182 342 0.6554 0.6057 0.6554 0.8096
No log 6.2545 344 0.6537 0.6282 0.6537 0.8085
No log 6.2909 346 0.6606 0.5875 0.6606 0.8127
No log 6.3273 348 0.6994 0.6095 0.6994 0.8363
No log 6.3636 350 0.7072 0.6042 0.7072 0.8410
No log 6.4 352 0.6846 0.6645 0.6846 0.8274
No log 6.4364 354 0.6376 0.7479 0.6376 0.7985
No log 6.4727 356 0.6589 0.7357 0.6589 0.8117
No log 6.5091 358 0.7681 0.6757 0.7681 0.8764
No log 6.5455 360 0.8799 0.5881 0.8799 0.9380
No log 6.5818 362 0.9480 0.5632 0.9480 0.9737
No log 6.6182 364 0.8857 0.5938 0.8857 0.9411
No log 6.6545 366 0.7666 0.6137 0.7666 0.8756
No log 6.6909 368 0.7625 0.6107 0.7625 0.8732
No log 6.7273 370 0.8526 0.5947 0.8526 0.9234
No log 6.7636 372 0.9270 0.5743 0.9270 0.9628
No log 6.8 374 0.8359 0.5922 0.8359 0.9143
No log 6.8364 376 0.8034 0.5794 0.8034 0.8963
No log 6.8727 378 0.7987 0.5807 0.7987 0.8937
No log 6.9091 380 0.7309 0.6024 0.7309 0.8549
No log 6.9455 382 0.6839 0.6412 0.6839 0.8270
No log 6.9818 384 0.6777 0.6388 0.6777 0.8232
No log 7.0182 386 0.6871 0.6927 0.6871 0.8289
No log 7.0545 388 0.7102 0.6702 0.7102 0.8427
No log 7.0909 390 0.7155 0.6556 0.7155 0.8459
No log 7.1273 392 0.7475 0.6487 0.7475 0.8646
No log 7.1636 394 0.7299 0.6195 0.7299 0.8543
No log 7.2 396 0.6846 0.5991 0.6846 0.8274
No log 7.2364 398 0.6966 0.6411 0.6966 0.8346
No log 7.2727 400 0.6882 0.6061 0.6882 0.8296
No log 7.3091 402 0.7127 0.6409 0.7127 0.8442
No log 7.3455 404 0.9093 0.5833 0.9093 0.9536
No log 7.3818 406 0.9984 0.5891 0.9984 0.9992
No log 7.4182 408 0.8944 0.6069 0.8944 0.9457
No log 7.4545 410 0.7415 0.6476 0.7415 0.8611
No log 7.4909 412 0.7103 0.6470 0.7103 0.8428
No log 7.5273 414 0.7207 0.6244 0.7207 0.8489
No log 7.5636 416 0.7520 0.6241 0.7520 0.8672
No log 7.6 418 0.8868 0.5783 0.8868 0.9417
No log 7.6364 420 0.9889 0.5744 0.9889 0.9944
No log 7.6727 422 0.9180 0.5718 0.9180 0.9581
No log 7.7091 424 0.7466 0.6341 0.7466 0.8641
No log 7.7455 426 0.6990 0.6264 0.6990 0.8360
No log 7.7818 428 0.6951 0.6152 0.6951 0.8337
No log 7.8182 430 0.6988 0.6306 0.6988 0.8360
No log 7.8545 432 0.7137 0.6434 0.7137 0.8448
No log 7.8909 434 0.7093 0.6268 0.7093 0.8422
No log 7.9273 436 0.7095 0.6292 0.7095 0.8423
No log 7.9636 438 0.7099 0.6400 0.7099 0.8426
No log 8.0 440 0.7257 0.6157 0.7257 0.8519
No log 8.0364 442 0.7918 0.5913 0.7918 0.8898
No log 8.0727 444 0.7803 0.5849 0.7803 0.8833
No log 8.1091 446 0.6955 0.6332 0.6955 0.8340
No log 8.1455 448 0.6477 0.7073 0.6477 0.8048
No log 8.1818 450 0.6526 0.6834 0.6526 0.8078
No log 8.2182 452 0.6952 0.6634 0.6952 0.8338
No log 8.2545 454 0.8723 0.5924 0.8723 0.9340
No log 8.2909 456 1.0645 0.5450 1.0645 1.0318
No log 8.3273 458 1.0680 0.5777 1.0680 1.0334
No log 8.3636 460 0.9219 0.5941 0.9219 0.9602
No log 8.4 462 0.7392 0.6985 0.7392 0.8598
No log 8.4364 464 0.6493 0.7164 0.6493 0.8058
No log 8.4727 466 0.6457 0.7315 0.6457 0.8036
No log 8.5091 468 0.7191 0.6352 0.7191 0.8480
No log 8.5455 470 0.8763 0.5952 0.8763 0.9361
No log 8.5818 472 0.9416 0.5681 0.9416 0.9704
No log 8.6182 474 0.8797 0.5617 0.8797 0.9379
No log 8.6545 476 0.7453 0.6378 0.7453 0.8633
No log 8.6909 478 0.6833 0.7040 0.6833 0.8266
No log 8.7273 480 0.6607 0.7069 0.6607 0.8129
No log 8.7636 482 0.6663 0.7056 0.6663 0.8163
No log 8.8 484 0.6357 0.7231 0.6357 0.7973
No log 8.8364 486 0.6179 0.7450 0.6179 0.7861
No log 8.8727 488 0.6239 0.7135 0.6239 0.7899
No log 8.9091 490 0.6227 0.7070 0.6227 0.7891
No log 8.9455 492 0.6291 0.7238 0.6291 0.7932
No log 8.9818 494 0.6259 0.7253 0.6259 0.7911
No log 9.0182 496 0.6217 0.7370 0.6217 0.7885
No log 9.0545 498 0.6301 0.7261 0.6301 0.7938
0.3988 9.0909 500 0.6836 0.6833 0.6836 0.8268
0.3988 9.1273 502 0.7463 0.7093 0.7463 0.8639
0.3988 9.1636 504 0.8809 0.6371 0.8809 0.9385
0.3988 9.2 506 0.8637 0.6304 0.8637 0.9293
0.3988 9.2364 508 0.7450 0.6881 0.7450 0.8631
0.3988 9.2727 510 0.6424 0.6992 0.6424 0.8015
0.3988 9.3091 512 0.6249 0.7309 0.6249 0.7905
0.3988 9.3455 514 0.6253 0.6841 0.6253 0.7907
0.3988 9.3818 516 0.6905 0.6655 0.6905 0.8309
0.3988 9.4182 518 0.8158 0.6121 0.8158 0.9032
0.3988 9.4545 520 0.9368 0.6125 0.9368 0.9679
0.3988 9.4909 522 0.9349 0.5898 0.9349 0.9669
0.3988 9.5273 524 0.8306 0.6002 0.8306 0.9114

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (Safetensors, tensor type F32)

Model tree for MayBashendy/ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k11_task1_organization

  • Finetuned from aubmindlab/bert-base-arabertv02