ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k20_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6479
  • Qwk (quadratic weighted kappa): 0.6323
  • Mse (mean squared error): 0.6479
  • Rmse (root mean squared error): 0.8049
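
The metrics above are standard for ordinal scoring tasks; Rmse is simply the square root of Mse, and Loss equals Mse here, which suggests the model is trained with an MSE regression objective. A minimal sketch of how these metrics could be computed, with a hand-rolled QWK to avoid extra dependencies (the label vectors and score range below are illustrative, not taken from this run):

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa (QWK) between two integer label vectors."""
    # Observed confusion matrix
    O = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        O[t, p] += 1
    # Quadratic disagreement weights: (i - j)^2, normalized
    idx = np.arange(n_classes)
    W = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    # Expected matrix under chance agreement, scaled to the same total count
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / O.sum()
    return 1.0 - (W * O).sum() / (W * E).sum()

# Illustrative gold and predicted scores on a 0-3 scale
y_true = np.array([0, 1, 2, 3, 2, 1])
y_pred = np.array([0, 1, 1, 3, 2, 2])

mse = np.mean((y_true - y_pred) ** 2)
rmse = np.sqrt(mse)
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=4)
```

Note that QWK penalizes predictions quadratically by their distance from the gold score, so it rewards being "close" on the ordinal scale rather than only exact matches.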

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
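
With lr_scheduler_type: linear and no warmup steps reported, the learning rate presumably decays linearly from 2e-05 to 0 over the total number of training steps. A hand-rolled sketch of that schedule (the 98 steps per epoch matches the log below, where epoch 1.0 falls at step 98; the warmup parameter is included only for generality):

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Linear decay to zero after an optional warmup, mirroring the
    shape of a 'linear' learning-rate schedule."""
    if step < warmup_steps:
        # Linear ramp up from 0 to base_lr
        return base_lr * step / max(1, warmup_steps)
    # Linear decay from base_lr down to 0 at total_steps
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

total = 98 * 100  # 98 steps/epoch for 100 epochs
lr_start = linear_lr(0, total)        # full base LR at the start (no warmup)
lr_mid = linear_lr(total // 2, total)  # half the base LR at the midpoint
lr_end = linear_lr(total, total)       # decayed to zero at the end
```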

Training results

In the table below, "No log" in the Training Loss column means the training loss had not yet been logged at that step; the first logged value (0.4013) appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0204 2 5.2559 -0.0274 5.2559 2.2926
No log 0.0408 4 3.2102 0.0762 3.2102 1.7917
No log 0.0612 6 2.1451 0.0114 2.1451 1.4646
No log 0.0816 8 1.4039 0.1805 1.4039 1.1849
No log 0.1020 10 1.2265 0.2767 1.2265 1.1075
No log 0.1224 12 1.5295 0.1463 1.5295 1.2367
No log 0.1429 14 1.6264 0.0515 1.6264 1.2753
No log 0.1633 16 1.3547 0.1544 1.3547 1.1639
No log 0.1837 18 1.3151 0.3083 1.3151 1.1468
No log 0.2041 20 1.1000 0.2589 1.1000 1.0488
No log 0.2245 22 1.0680 0.2546 1.0680 1.0334
No log 0.2449 24 1.1532 0.3797 1.1532 1.0739
No log 0.2653 26 1.1610 0.4111 1.1610 1.0775
No log 0.2857 28 1.0538 0.3574 1.0538 1.0265
No log 0.3061 30 1.2734 0.2481 1.2734 1.1285
No log 0.3265 32 1.3424 0.2341 1.3424 1.1586
No log 0.3469 34 1.6475 0.1962 1.6475 1.2835
No log 0.3673 36 1.8051 0.2309 1.8051 1.3435
No log 0.3878 38 1.3821 0.2960 1.3821 1.1756
No log 0.4082 40 0.9229 0.4198 0.9229 0.9607
No log 0.4286 42 1.1127 0.2284 1.1127 1.0548
No log 0.4490 44 1.4391 -0.0014 1.4391 1.1996
No log 0.4694 46 1.0672 0.1684 1.0672 1.0331
No log 0.4898 48 0.9480 0.3335 0.9480 0.9737
No log 0.5102 50 1.0186 0.3798 1.0186 1.0092
No log 0.5306 52 1.0169 0.3827 1.0169 1.0084
No log 0.5510 54 0.8753 0.4829 0.8753 0.9356
No log 0.5714 56 1.0371 0.4808 1.0371 1.0184
No log 0.5918 58 0.9920 0.5337 0.9920 0.9960
No log 0.6122 60 0.8890 0.5671 0.8890 0.9428
No log 0.6327 62 0.8410 0.5416 0.8410 0.9170
No log 0.6531 64 0.8737 0.5358 0.8737 0.9347
No log 0.6735 66 0.7432 0.6344 0.7432 0.8621
No log 0.6939 68 0.8063 0.6177 0.8063 0.8979
No log 0.7143 70 0.8135 0.6103 0.8135 0.9020
No log 0.7347 72 0.6978 0.6012 0.6978 0.8353
No log 0.7551 74 0.8397 0.4639 0.8397 0.9163
No log 0.7755 76 0.9734 0.4828 0.9734 0.9866
No log 0.7959 78 0.7991 0.5041 0.7991 0.8939
No log 0.8163 80 0.7194 0.6158 0.7194 0.8482
No log 0.8367 82 0.8923 0.5975 0.8923 0.9446
No log 0.8571 84 0.8916 0.5886 0.8916 0.9443
No log 0.8776 86 0.7499 0.6449 0.7499 0.8659
No log 0.8980 88 0.7703 0.6589 0.7703 0.8777
No log 0.9184 90 0.7476 0.6699 0.7476 0.8646
No log 0.9388 92 0.8011 0.5769 0.8011 0.8950
No log 0.9592 94 0.9257 0.5480 0.9257 0.9621
No log 0.9796 96 0.9932 0.5174 0.9932 0.9966
No log 1.0 98 0.8540 0.5801 0.8540 0.9241
No log 1.0204 100 0.7720 0.5993 0.7720 0.8786
No log 1.0408 102 0.7821 0.5703 0.7821 0.8844
No log 1.0612 104 0.8081 0.6109 0.8081 0.8990
No log 1.0816 106 1.0426 0.4954 1.0426 1.0211
No log 1.1020 108 1.2313 0.4929 1.2313 1.1096
No log 1.1224 110 1.2398 0.4887 1.2398 1.1135
No log 1.1429 112 1.0459 0.5036 1.0459 1.0227
No log 1.1633 114 0.8386 0.5620 0.8386 0.9157
No log 1.1837 116 0.7726 0.5981 0.7726 0.8790
No log 1.2041 118 0.7975 0.5797 0.7975 0.8930
No log 1.2245 120 0.9830 0.5347 0.9830 0.9915
No log 1.2449 122 1.0153 0.5158 1.0153 1.0076
No log 1.2653 124 0.8714 0.5258 0.8714 0.9335
No log 1.2857 126 0.8193 0.5461 0.8193 0.9052
No log 1.3061 128 0.7816 0.5772 0.7816 0.8841
No log 1.3265 130 0.7817 0.5991 0.7817 0.8842
No log 1.3469 132 1.0355 0.5523 1.0355 1.0176
No log 1.3673 134 1.2113 0.5217 1.2113 1.1006
No log 1.3878 136 1.1522 0.5151 1.1522 1.0734
No log 1.4082 138 1.0126 0.5796 1.0126 1.0063
No log 1.4286 140 1.1053 0.5221 1.1053 1.0514
No log 1.4490 142 1.1269 0.5014 1.1269 1.0615
No log 1.4694 144 1.0328 0.5604 1.0328 1.0162
No log 1.4898 146 1.1024 0.5264 1.1024 1.0499
No log 1.5102 148 1.0948 0.5149 1.0948 1.0463
No log 1.5306 150 1.0711 0.5273 1.0711 1.0349
No log 1.5510 152 0.8560 0.5660 0.8560 0.9252
No log 1.5714 154 0.8207 0.5768 0.8207 0.9059
No log 1.5918 156 0.8195 0.5849 0.8195 0.9052
No log 1.6122 158 0.7659 0.5835 0.7659 0.8751
No log 1.6327 160 0.7827 0.6078 0.7827 0.8847
No log 1.6531 162 0.8578 0.6087 0.8578 0.9262
No log 1.6735 164 0.7732 0.6396 0.7732 0.8793
No log 1.6939 166 0.7456 0.6504 0.7456 0.8635
No log 1.7143 168 0.7404 0.6593 0.7404 0.8605
No log 1.7347 170 0.8414 0.6200 0.8414 0.9173
No log 1.7551 172 0.8502 0.5879 0.8502 0.9221
No log 1.7755 174 0.7681 0.5927 0.7681 0.8764
No log 1.7959 176 0.7708 0.5755 0.7708 0.8779
No log 1.8163 178 0.7972 0.5600 0.7972 0.8928
No log 1.8367 180 0.7193 0.6015 0.7193 0.8481
No log 1.8571 182 0.7016 0.6394 0.7016 0.8376
No log 1.8776 184 0.6880 0.6371 0.6880 0.8294
No log 1.8980 186 0.6942 0.6403 0.6942 0.8332
No log 1.9184 188 0.6829 0.6533 0.6829 0.8264
No log 1.9388 190 0.7050 0.6756 0.7050 0.8396
No log 1.9592 192 0.7154 0.6776 0.7154 0.8458
No log 1.9796 194 0.6996 0.6622 0.6996 0.8364
No log 2.0 196 0.7304 0.6075 0.7304 0.8546
No log 2.0204 198 0.7138 0.6285 0.7138 0.8449
No log 2.0408 200 0.7039 0.6391 0.7039 0.8390
No log 2.0612 202 0.7115 0.6358 0.7115 0.8435
No log 2.0816 204 0.6902 0.6400 0.6902 0.8308
No log 2.1020 206 0.7258 0.6134 0.7258 0.8520
No log 2.1224 208 0.7187 0.6134 0.7187 0.8477
No log 2.1429 210 0.7004 0.6488 0.7004 0.8369
No log 2.1633 212 0.6806 0.6513 0.6806 0.8250
No log 2.1837 214 0.6785 0.6455 0.6785 0.8237
No log 2.2041 216 0.6746 0.6728 0.6746 0.8213
No log 2.2245 218 0.7819 0.6574 0.7819 0.8842
No log 2.2449 220 0.7583 0.6207 0.7583 0.8708
No log 2.2653 222 0.6792 0.6474 0.6792 0.8241
No log 2.2857 224 0.6903 0.6421 0.6903 0.8308
No log 2.3061 226 0.7032 0.6253 0.7032 0.8386
No log 2.3265 228 0.7577 0.5669 0.7577 0.8705
No log 2.3469 230 0.8124 0.5919 0.8124 0.9013
No log 2.3673 232 0.7187 0.6036 0.7187 0.8478
No log 2.3878 234 0.6805 0.6439 0.6805 0.8249
No log 2.4082 236 0.6922 0.6303 0.6922 0.8320
No log 2.4286 238 0.6807 0.6346 0.6807 0.8250
No log 2.4490 240 0.6841 0.6420 0.6841 0.8271
No log 2.4694 242 0.7222 0.6498 0.7222 0.8498
No log 2.4898 244 0.7392 0.6612 0.7392 0.8598
No log 2.5102 246 0.6667 0.6636 0.6667 0.8165
No log 2.5306 248 0.6815 0.6580 0.6815 0.8255
No log 2.5510 250 0.6667 0.6545 0.6667 0.8165
No log 2.5714 252 0.6691 0.6458 0.6691 0.8180
No log 2.5918 254 0.6605 0.6451 0.6605 0.8127
No log 2.6122 256 0.6609 0.6619 0.6609 0.8130
No log 2.6327 258 0.6725 0.6812 0.6725 0.8200
No log 2.6531 260 0.6919 0.6179 0.6919 0.8318
No log 2.6735 262 0.6665 0.6865 0.6665 0.8164
No log 2.6939 264 0.6528 0.6871 0.6528 0.8080
No log 2.7143 266 0.6520 0.6726 0.6520 0.8075
No log 2.7347 268 0.6611 0.6847 0.6611 0.8131
No log 2.7551 270 0.6594 0.6883 0.6594 0.8120
No log 2.7755 272 0.6688 0.7059 0.6688 0.8178
No log 2.7959 274 0.7155 0.6204 0.7155 0.8459
No log 2.8163 276 0.7704 0.6255 0.7704 0.8777
No log 2.8367 278 0.7157 0.6284 0.7157 0.8460
No log 2.8571 280 0.6696 0.6891 0.6696 0.8183
No log 2.8776 282 0.6484 0.7251 0.6484 0.8052
No log 2.8980 284 0.6419 0.7172 0.6419 0.8012
No log 2.9184 286 0.6422 0.7219 0.6422 0.8014
No log 2.9388 288 0.6387 0.7172 0.6387 0.7992
No log 2.9592 290 0.6408 0.7067 0.6408 0.8005
No log 2.9796 292 0.7026 0.6487 0.7026 0.8382
No log 3.0 294 0.7376 0.6239 0.7376 0.8588
No log 3.0204 296 0.6813 0.6287 0.6813 0.8254
No log 3.0408 298 0.6552 0.6824 0.6552 0.8094
No log 3.0612 300 0.6649 0.6780 0.6649 0.8154
No log 3.0816 302 0.6431 0.7153 0.6431 0.8020
No log 3.1020 304 0.6483 0.7177 0.6483 0.8052
No log 3.1224 306 0.6502 0.7164 0.6502 0.8063
No log 3.1429 308 0.7300 0.6580 0.7300 0.8544
No log 3.1633 310 0.7816 0.6527 0.7816 0.8841
No log 3.1837 312 0.6959 0.6756 0.6959 0.8342
No log 3.2041 314 0.6472 0.7053 0.6472 0.8045
No log 3.2245 316 0.6679 0.6939 0.6679 0.8173
No log 3.2449 318 0.6675 0.6785 0.6675 0.8170
No log 3.2653 320 0.6687 0.6747 0.6687 0.8177
No log 3.2857 322 0.6836 0.6449 0.6836 0.8268
No log 3.3061 324 0.6831 0.6509 0.6831 0.8265
No log 3.3265 326 0.6496 0.6533 0.6496 0.8060
No log 3.3469 328 0.6309 0.6537 0.6309 0.7943
No log 3.3673 330 0.6349 0.6423 0.6349 0.7968
No log 3.3878 332 0.6275 0.6652 0.6275 0.7921
No log 3.4082 334 0.6073 0.7232 0.6073 0.7793
No log 3.4286 336 0.6071 0.6882 0.6071 0.7791
No log 3.4490 338 0.6265 0.6610 0.6265 0.7915
No log 3.4694 340 0.6075 0.6951 0.6075 0.7794
No log 3.4898 342 0.6029 0.7026 0.6029 0.7765
No log 3.5102 344 0.6047 0.7219 0.6047 0.7776
No log 3.5306 346 0.6378 0.6692 0.6378 0.7986
No log 3.5510 348 0.6296 0.7017 0.6296 0.7935
No log 3.5714 350 0.6587 0.6910 0.6587 0.8116
No log 3.5918 352 0.6804 0.6605 0.6804 0.8249
No log 3.6122 354 0.6444 0.7200 0.6444 0.8028
No log 3.6327 356 0.6428 0.7196 0.6428 0.8018
No log 3.6531 358 0.6332 0.7187 0.6332 0.7957
No log 3.6735 360 0.6256 0.6788 0.6256 0.7909
No log 3.6939 362 0.6500 0.6399 0.6500 0.8062
No log 3.7143 364 0.6718 0.6438 0.6718 0.8196
No log 3.7347 366 0.6694 0.6435 0.6694 0.8181
No log 3.7551 368 0.6941 0.6279 0.6941 0.8331
No log 3.7755 370 0.7403 0.6174 0.7403 0.8604
No log 3.7959 372 0.7523 0.6212 0.7523 0.8674
No log 3.8163 374 0.7824 0.6202 0.7824 0.8845
No log 3.8367 376 0.8079 0.6395 0.8079 0.8988
No log 3.8571 378 0.8515 0.6335 0.8515 0.9228
No log 3.8776 380 0.8358 0.6415 0.8358 0.9142
No log 3.8980 382 0.7931 0.6332 0.7931 0.8906
No log 3.9184 384 0.8174 0.6348 0.8174 0.9041
No log 3.9388 386 0.8988 0.6034 0.8988 0.9480
No log 3.9592 388 1.0775 0.5460 1.0775 1.0380
No log 3.9796 390 1.0879 0.5366 1.0879 1.0430
No log 4.0 392 0.9929 0.5718 0.9929 0.9964
No log 4.0204 394 0.8872 0.5835 0.8872 0.9419
No log 4.0408 396 0.8760 0.6088 0.8760 0.9359
No log 4.0612 398 0.8420 0.6344 0.8420 0.9176
No log 4.0816 400 0.8430 0.6306 0.8430 0.9182
No log 4.1020 402 0.7347 0.6345 0.7347 0.8572
No log 4.1224 404 0.7135 0.6465 0.7135 0.8447
No log 4.1429 406 0.7299 0.6579 0.7299 0.8543
No log 4.1633 408 0.7338 0.6503 0.7338 0.8566
No log 4.1837 410 0.7267 0.6528 0.7267 0.8525
No log 4.2041 412 0.6734 0.6575 0.6734 0.8206
No log 4.2245 414 0.6377 0.6770 0.6377 0.7985
No log 4.2449 416 0.6177 0.7094 0.6177 0.7859
No log 4.2653 418 0.6161 0.6903 0.6161 0.7849
No log 4.2857 420 0.6250 0.6883 0.6250 0.7905
No log 4.3061 422 0.6070 0.6856 0.6070 0.7791
No log 4.3265 424 0.6472 0.7110 0.6472 0.8045
No log 4.3469 426 0.6533 0.6908 0.6533 0.8083
No log 4.3673 428 0.6601 0.6788 0.6601 0.8125
No log 4.3878 430 0.6325 0.6649 0.6325 0.7953
No log 4.4082 432 0.6555 0.6544 0.6555 0.8096
No log 4.4286 434 0.6556 0.6438 0.6556 0.8097
No log 4.4490 436 0.6545 0.6660 0.6545 0.8090
No log 4.4694 438 0.6702 0.6583 0.6702 0.8187
No log 4.4898 440 0.6928 0.6477 0.6928 0.8323
No log 4.5102 442 0.6735 0.6653 0.6735 0.8207
No log 4.5306 444 0.6656 0.6520 0.6656 0.8159
No log 4.5510 446 0.6450 0.6968 0.6450 0.8031
No log 4.5714 448 0.6800 0.6963 0.6800 0.8246
No log 4.5918 450 0.6995 0.6906 0.6995 0.8364
No log 4.6122 452 0.6319 0.6721 0.6319 0.7949
No log 4.6327 454 0.6222 0.6441 0.6222 0.7888
No log 4.6531 456 0.6155 0.6499 0.6155 0.7846
No log 4.6735 458 0.6119 0.6957 0.6119 0.7822
No log 4.6939 460 0.6430 0.6916 0.6430 0.8018
No log 4.7143 462 0.6648 0.6838 0.6648 0.8154
No log 4.7347 464 0.6098 0.7217 0.6098 0.7809
No log 4.7551 466 0.6516 0.6788 0.6516 0.8072
No log 4.7755 468 0.7147 0.6119 0.7147 0.8454
No log 4.7959 470 0.6809 0.6332 0.6809 0.8251
No log 4.8163 472 0.6468 0.6628 0.6468 0.8042
No log 4.8367 474 0.7395 0.6024 0.7395 0.8600
No log 4.8571 476 0.7583 0.5778 0.7583 0.8708
No log 4.8776 478 0.6736 0.6199 0.6736 0.8207
No log 4.8980 480 0.6593 0.6434 0.6593 0.8120
No log 4.9184 482 0.6580 0.6588 0.6580 0.8112
No log 4.9388 484 0.6423 0.6672 0.6423 0.8014
No log 4.9592 486 0.6748 0.6530 0.6748 0.8214
No log 4.9796 488 0.7035 0.6406 0.7035 0.8388
No log 5.0 490 0.6873 0.6427 0.6873 0.8290
No log 5.0204 492 0.6591 0.6643 0.6591 0.8119
No log 5.0408 494 0.6251 0.6933 0.6251 0.7906
No log 5.0612 496 0.6332 0.7001 0.6332 0.7957
No log 5.0816 498 0.6199 0.7188 0.6199 0.7874
0.4013 5.1020 500 0.6341 0.6877 0.6341 0.7963
0.4013 5.1224 502 0.6725 0.6623 0.6725 0.8200
0.4013 5.1429 504 0.6482 0.6817 0.6482 0.8051
0.4013 5.1633 506 0.6506 0.6704 0.6506 0.8066
0.4013 5.1837 508 0.6536 0.6723 0.6536 0.8085
0.4013 5.2041 510 0.6172 0.7080 0.6172 0.7856
0.4013 5.2245 512 0.6431 0.6826 0.6431 0.8019
0.4013 5.2449 514 0.6565 0.6686 0.6565 0.8102
0.4013 5.2653 516 0.6679 0.6665 0.6679 0.8172
0.4013 5.2857 518 0.6517 0.6604 0.6517 0.8073
0.4013 5.3061 520 0.6479 0.6323 0.6479 0.8049
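
The final reported checkpoint (loss 0.6479, Qwk 0.6323 at epoch 5.3061) is not the best point in the log: validation Qwk peaks at 0.7251 around epoch 2.88, and validation loss bottoms out at 0.6029 around epoch 3.49. When choosing a checkpoint from a log like this, it can help to scan the rows programmatically; a sketch over a few representative rows copied from the table above:

```python
# (epoch, validation_loss, qwk) triples taken from rows of the log above
rows = [
    (2.8776, 0.6484, 0.7251),
    (3.4898, 0.6029, 0.7026),
    (4.7347, 0.6098, 0.7217),
    (5.3061, 0.6479, 0.6323),  # final row reported above
]

best_by_qwk = max(rows, key=lambda r: r[2])   # highest agreement metric
best_by_loss = min(rows, key=lambda r: r[1])  # lowest validation loss
```

Which criterion to prefer depends on the downstream use: for ordinal essay scoring, selecting by Qwk is common, since it is the metric the task is ultimately judged on.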

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • Parameters: 0.1B
  • Tensor type: F32 (safetensors)
