ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k11_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the dataset name was not recorded in the training configuration). It achieves the following results on the evaluation set:

  • Loss: 0.6457
  • Qwk: 0.6187
  • Mse: 0.6457
  • Rmse: 0.8036
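The reported Loss equals the Mse, which suggests the model was trained as a regressor with an MSE objective (Rmse is simply its square root: √0.6457 ≈ 0.8036). Qwk (quadratic weighted kappa) then measures ordinal agreement between predicted and gold scores. As a reference, here is a minimal QWK sketch in plain NumPy; the function name and the explicit class count are illustrative, not part of this model's code:

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa for integer labels in [0, n_classes)."""
    # Observed confusion matrix
    O = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        O[t, p] += 1
    # Quadratic disagreement weights: 0 on the diagonal, growing with distance
    idx = np.arange(n_classes)
    W = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    # Expected confusion matrix under chance agreement (outer product of marginals)
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / O.sum()
    return 1.0 - (W * O).sum() / (W * E).sum()
```

Perfect agreement yields 1.0, chance-level agreement yields 0.0; for example, `quadratic_weighted_kappa([0, 0, 1, 1], [0, 0, 1, 0], n_classes=2)` evaluates to 0.5.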

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
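Assuming the standard Hugging Face Trainer API was used (the Adam betas/epsilon listed above are its defaults), these hyperparameters correspond to a configuration like the following; `output_dir` is a placeholder, not the actual path used:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="arabert_task5_organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```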

Training results

The training loss appears to have been logged only every 500 steps, so rows before step 500 show "No log" in the first column.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0364 2 3.8951 -0.0092 3.8951 1.9736
No log 0.0727 4 1.8543 0.0081 1.8543 1.3617
No log 0.1091 6 1.2311 0.0496 1.2311 1.1095
No log 0.1455 8 1.0546 0.2042 1.0546 1.0269
No log 0.1818 10 1.5501 -0.0743 1.5501 1.2450
No log 0.2182 12 1.9077 0.1358 1.9077 1.3812
No log 0.2545 14 1.5173 0.0256 1.5173 1.2318
No log 0.2909 16 1.0669 0.1591 1.0669 1.0329
No log 0.3273 18 0.8844 0.3498 0.8844 0.9404
No log 0.3636 20 0.9463 0.3394 0.9463 0.9728
No log 0.4 22 0.8300 0.3958 0.8300 0.9110
No log 0.4364 24 0.9006 0.2969 0.9006 0.9490
No log 0.4727 26 1.1639 0.2229 1.1639 1.0788
No log 0.5091 28 1.0103 0.2547 1.0103 1.0051
No log 0.5455 30 0.7288 0.5176 0.7288 0.8537
No log 0.5818 32 0.6642 0.5480 0.6642 0.8150
No log 0.6182 34 0.7194 0.5050 0.7194 0.8482
No log 0.6545 36 0.8941 0.4784 0.8941 0.9456
No log 0.6909 38 1.2044 0.3558 1.2044 1.0974
No log 0.7273 40 1.1432 0.3715 1.1432 1.0692
No log 0.7636 42 0.9024 0.4784 0.9024 0.9500
No log 0.8 44 0.7182 0.4834 0.7182 0.8475
No log 0.8364 46 0.7067 0.4828 0.7067 0.8407
No log 0.8727 48 0.7794 0.5427 0.7794 0.8829
No log 0.9091 50 0.7392 0.5915 0.7392 0.8597
No log 0.9455 52 0.6936 0.6010 0.6936 0.8328
No log 0.9818 54 0.6734 0.6196 0.6734 0.8206
No log 1.0182 56 0.8726 0.5763 0.8726 0.9341
No log 1.0545 58 0.9227 0.5672 0.9227 0.9606
No log 1.0909 60 0.7296 0.6465 0.7296 0.8541
No log 1.1273 62 0.7114 0.6499 0.7114 0.8434
No log 1.1636 64 0.8177 0.6048 0.8177 0.9043
No log 1.2 66 0.8770 0.5365 0.8770 0.9365
No log 1.2364 68 0.6529 0.6174 0.6529 0.8080
No log 1.2727 70 0.6085 0.6206 0.6085 0.7801
No log 1.3091 72 0.6096 0.6206 0.6096 0.7808
No log 1.3455 74 0.6875 0.6482 0.6875 0.8291
No log 1.3818 76 0.6667 0.6543 0.6667 0.8165
No log 1.4182 78 0.7652 0.5895 0.7652 0.8748
No log 1.4545 80 0.7056 0.6444 0.7056 0.8400
No log 1.4909 82 0.6593 0.6263 0.6593 0.8120
No log 1.5273 84 0.8790 0.5155 0.8790 0.9375
No log 1.5636 86 0.8218 0.5495 0.8218 0.9065
No log 1.6 88 0.6531 0.5577 0.6531 0.8081
No log 1.6364 90 0.7774 0.5794 0.7774 0.8817
No log 1.6727 92 0.9330 0.5272 0.9330 0.9659
No log 1.7091 94 0.8401 0.5565 0.8401 0.9165
No log 1.7455 96 0.6998 0.5955 0.6998 0.8365
No log 1.7818 98 0.6704 0.5851 0.6704 0.8188
No log 1.8182 100 0.6915 0.6179 0.6915 0.8316
No log 1.8545 102 0.6529 0.6104 0.6529 0.8080
No log 1.8909 104 0.7189 0.6174 0.7189 0.8479
No log 1.9273 106 0.8133 0.5451 0.8133 0.9018
No log 1.9636 108 0.7552 0.5566 0.7552 0.8690
No log 2.0 110 0.6728 0.5736 0.6728 0.8203
No log 2.0364 112 0.6664 0.6217 0.6664 0.8163
No log 2.0727 114 0.6740 0.6374 0.6740 0.8210
No log 2.1091 116 0.6918 0.6254 0.6918 0.8318
No log 2.1455 118 0.7165 0.6001 0.7165 0.8465
No log 2.1818 120 0.7156 0.5685 0.7156 0.8460
No log 2.2182 122 0.6925 0.6187 0.6925 0.8322
No log 2.2545 124 0.7457 0.6299 0.7457 0.8635
No log 2.2909 126 0.7706 0.5678 0.7706 0.8778
No log 2.3273 128 0.7618 0.5483 0.7618 0.8728
No log 2.3636 130 0.7431 0.5125 0.7431 0.8620
No log 2.4 132 0.6878 0.5950 0.6878 0.8294
No log 2.4364 134 0.7068 0.5374 0.7068 0.8407
No log 2.4727 136 0.7749 0.5334 0.7749 0.8803
No log 2.5091 138 0.7770 0.5683 0.7770 0.8815
No log 2.5455 140 0.7338 0.5911 0.7338 0.8566
No log 2.5818 142 0.6857 0.6322 0.6857 0.8281
No log 2.6182 144 0.6771 0.6692 0.6771 0.8229
No log 2.6545 146 0.6554 0.6357 0.6554 0.8096
No log 2.6909 148 0.6359 0.6528 0.6359 0.7974
No log 2.7273 150 0.6637 0.6288 0.6637 0.8147
No log 2.7636 152 0.7431 0.6120 0.7431 0.8620
No log 2.8 154 0.6594 0.6154 0.6594 0.8120
No log 2.8364 156 0.6356 0.6555 0.6356 0.7972
No log 2.8727 158 0.6508 0.6581 0.6508 0.8067
No log 2.9091 160 0.6157 0.6796 0.6157 0.7847
No log 2.9455 162 0.6283 0.6164 0.6283 0.7927
No log 2.9818 164 0.6242 0.5487 0.6242 0.7901
No log 3.0182 166 0.6438 0.5261 0.6438 0.8024
No log 3.0545 168 0.6535 0.5054 0.6535 0.8084
No log 3.0909 170 0.6600 0.5902 0.6600 0.8124
No log 3.1273 172 0.6183 0.6393 0.6183 0.7863
No log 3.1636 174 0.6072 0.6804 0.6072 0.7792
No log 3.2 176 0.6171 0.6774 0.6171 0.7855
No log 3.2364 178 0.6333 0.6544 0.6333 0.7958
No log 3.2727 180 0.6211 0.6781 0.6211 0.7881
No log 3.3091 182 0.6138 0.6875 0.6138 0.7834
No log 3.3455 184 0.6112 0.6766 0.6112 0.7818
No log 3.3818 186 0.6061 0.7103 0.6061 0.7785
No log 3.4182 188 0.6056 0.6729 0.6056 0.7782
No log 3.4545 190 0.6344 0.6302 0.6344 0.7965
No log 3.4909 192 0.6541 0.6035 0.6541 0.8087
No log 3.5273 194 0.6234 0.6526 0.6234 0.7896
No log 3.5636 196 0.6418 0.6564 0.6418 0.8011
No log 3.6 198 0.6361 0.6389 0.6361 0.7975
No log 3.6364 200 0.6654 0.6035 0.6654 0.8157
No log 3.6727 202 0.9707 0.5263 0.9707 0.9853
No log 3.7091 204 1.1334 0.5273 1.1334 1.0646
No log 3.7455 206 0.9041 0.5590 0.9041 0.9509
No log 3.7818 208 0.6542 0.6938 0.6542 0.8088
No log 3.8182 210 0.6920 0.6064 0.6920 0.8318
No log 3.8545 212 0.6649 0.5688 0.6649 0.8154
No log 3.8909 214 0.6211 0.6537 0.6211 0.7881
No log 3.9273 216 0.6571 0.6184 0.6571 0.8106
No log 3.9636 218 0.6633 0.6377 0.6633 0.8144
No log 4.0 220 0.6410 0.6812 0.6410 0.8006
No log 4.0364 222 0.6560 0.6664 0.6560 0.8099
No log 4.0727 224 0.6656 0.6365 0.6656 0.8158
No log 4.1091 226 0.6427 0.6689 0.6427 0.8017
No log 4.1455 228 0.6566 0.6256 0.6566 0.8103
No log 4.1818 230 0.6394 0.6705 0.6394 0.7996
No log 4.2182 232 0.6279 0.6256 0.6279 0.7924
No log 4.2545 234 0.6749 0.6404 0.6749 0.8215
No log 4.2909 236 0.6339 0.6337 0.6339 0.7962
No log 4.3273 238 0.5830 0.6564 0.5830 0.7636
No log 4.3636 240 0.6491 0.6226 0.6491 0.8057
No log 4.4 242 0.6617 0.6163 0.6617 0.8135
No log 4.4364 244 0.6006 0.6812 0.6006 0.7750
No log 4.4727 246 0.6142 0.7213 0.6142 0.7837
No log 4.5091 248 0.6752 0.6520 0.6752 0.8217
No log 4.5455 250 0.6342 0.6589 0.6342 0.7963
No log 4.5818 252 0.6013 0.6804 0.6013 0.7755
No log 4.6182 254 0.6157 0.6089 0.6157 0.7847
No log 4.6545 256 0.6245 0.5879 0.6245 0.7903
No log 4.6909 258 0.6205 0.5879 0.6205 0.7877
No log 4.7273 260 0.6127 0.6712 0.6127 0.7828
No log 4.7636 262 0.6401 0.6693 0.6401 0.8000
No log 4.8 264 0.6507 0.6969 0.6507 0.8067
No log 4.8364 266 0.6611 0.6509 0.6611 0.8131
No log 4.8727 268 0.6695 0.6341 0.6695 0.8183
No log 4.9091 270 0.6364 0.6089 0.6364 0.7977
No log 4.9455 272 0.6049 0.6426 0.6049 0.7778
No log 4.9818 274 0.6251 0.6353 0.6251 0.7906
No log 5.0182 276 0.6080 0.6195 0.6080 0.7798
No log 5.0545 278 0.6027 0.5807 0.6027 0.7763
No log 5.0909 280 0.5947 0.5921 0.5947 0.7712
No log 5.1273 282 0.5778 0.6186 0.5778 0.7601
No log 5.1636 284 0.5695 0.6537 0.5695 0.7547
No log 5.2 286 0.5743 0.6017 0.5743 0.7579
No log 5.2364 288 0.6135 0.6129 0.6135 0.7832
No log 5.2727 290 0.6352 0.6004 0.6352 0.7970
No log 5.3091 292 0.5851 0.6358 0.5851 0.7649
No log 5.3455 294 0.5728 0.6866 0.5728 0.7568
No log 5.3818 296 0.5800 0.6857 0.5800 0.7616
No log 5.4182 298 0.5737 0.6866 0.5737 0.7575
No log 5.4545 300 0.5805 0.6460 0.5805 0.7619
No log 5.4909 302 0.5898 0.6302 0.5898 0.7680
No log 5.5273 304 0.6020 0.6302 0.6020 0.7759
No log 5.5636 306 0.6016 0.6488 0.6016 0.7757
No log 5.6 308 0.5997 0.6932 0.5997 0.7744
No log 5.6364 310 0.6053 0.6452 0.6053 0.7780
No log 5.6727 312 0.6122 0.6488 0.6122 0.7824
No log 5.7091 314 0.6088 0.6498 0.6088 0.7802
No log 5.7455 316 0.6204 0.6102 0.6204 0.7876
No log 5.7818 318 0.6031 0.6417 0.6031 0.7766
No log 5.8182 320 0.6129 0.5971 0.6129 0.7829
No log 5.8545 322 0.6079 0.6164 0.6079 0.7797
No log 5.8909 324 0.6018 0.6134 0.6018 0.7758
No log 5.9273 326 0.6357 0.6555 0.6357 0.7973
No log 5.9636 328 0.6583 0.6008 0.6583 0.8113
No log 6.0 330 0.6446 0.6209 0.6446 0.8029
No log 6.0364 332 0.6396 0.5516 0.6396 0.7998
No log 6.0727 334 0.6360 0.5994 0.6360 0.7975
No log 6.1091 336 0.6342 0.6175 0.6342 0.7964
No log 6.1455 338 0.6499 0.5869 0.6499 0.8062
No log 6.1818 340 0.6681 0.6118 0.6681 0.8174
No log 6.2182 342 0.6392 0.6380 0.6392 0.7995
No log 6.2545 344 0.6292 0.6659 0.6292 0.7932
No log 6.2909 346 0.6342 0.6705 0.6342 0.7964
No log 6.3273 348 0.6283 0.6546 0.6283 0.7927
No log 6.3636 350 0.6555 0.6084 0.6555 0.8096
No log 6.4 352 0.6789 0.6198 0.6789 0.8239
No log 6.4364 354 0.6844 0.6198 0.6844 0.8273
No log 6.4727 356 0.6386 0.6822 0.6386 0.7991
No log 6.5091 358 0.6319 0.6636 0.6319 0.7949
No log 6.5455 360 0.6313 0.6246 0.6313 0.7945
No log 6.5818 362 0.6201 0.6697 0.6201 0.7875
No log 6.6182 364 0.6284 0.6932 0.6284 0.7927
No log 6.6545 366 0.6399 0.6584 0.6399 0.7999
No log 6.6909 368 0.6415 0.6292 0.6415 0.8009
No log 6.7273 370 0.6453 0.6667 0.6453 0.8033
No log 6.7636 372 0.6470 0.6518 0.6470 0.8044
No log 6.8 374 0.6575 0.6147 0.6575 0.8109
No log 6.8364 376 0.6831 0.5833 0.6831 0.8265
No log 6.8727 378 0.6793 0.5948 0.6793 0.8242
No log 6.9091 380 0.6638 0.5356 0.6638 0.8147
No log 6.9455 382 0.6707 0.6185 0.6707 0.8189
No log 6.9818 384 0.7228 0.5665 0.7228 0.8502
No log 7.0182 386 0.7327 0.5560 0.7327 0.8560
No log 7.0545 388 0.6933 0.5770 0.6933 0.8327
No log 7.0909 390 0.6606 0.6175 0.6606 0.8128
No log 7.1273 392 0.6783 0.6226 0.6783 0.8236
No log 7.1636 394 0.6819 0.6226 0.6819 0.8257
No log 7.2 396 0.6910 0.6324 0.6910 0.8313
No log 7.2364 398 0.6744 0.6324 0.6744 0.8212
No log 7.2727 400 0.6621 0.5939 0.6621 0.8137
No log 7.3091 402 0.6635 0.5939 0.6635 0.8145
No log 7.3455 404 0.6585 0.6374 0.6585 0.8115
No log 7.3818 406 0.6548 0.5930 0.6548 0.8092
No log 7.4182 408 0.6554 0.6105 0.6554 0.8095
No log 7.4545 410 0.6573 0.6622 0.6573 0.8107
No log 7.4909 412 0.6502 0.6969 0.6502 0.8063
No log 7.5273 414 0.6365 0.6853 0.6365 0.7978
No log 7.5636 416 0.6265 0.6105 0.6265 0.7915
No log 7.6 418 0.6251 0.6374 0.6251 0.7906
No log 7.6364 420 0.6061 0.6175 0.6061 0.7785
No log 7.6727 422 0.5909 0.6398 0.5909 0.7687
No log 7.7091 424 0.5821 0.6415 0.5821 0.7629
No log 7.7455 426 0.5609 0.6537 0.5609 0.7490
No log 7.7818 428 0.5550 0.6649 0.5550 0.7450
No log 7.8182 430 0.5611 0.6237 0.5611 0.7491
No log 7.8545 432 0.5954 0.6479 0.5954 0.7716
No log 7.8909 434 0.6066 0.6661 0.6066 0.7789
No log 7.9273 436 0.5957 0.6675 0.5957 0.7718
No log 7.9636 438 0.5874 0.6923 0.5874 0.7664
No log 8.0 440 0.6074 0.6838 0.6074 0.7793
No log 8.0364 442 0.6706 0.6485 0.6706 0.8189
No log 8.0727 444 0.6753 0.6314 0.6753 0.8217
No log 8.1091 446 0.6346 0.6455 0.6346 0.7966
No log 8.1455 448 0.6294 0.6175 0.6294 0.7933
No log 8.1818 450 0.6343 0.6175 0.6343 0.7964
No log 8.2182 452 0.6411 0.5724 0.6411 0.8007
No log 8.2545 454 0.6523 0.5822 0.6523 0.8077
No log 8.2909 456 0.6369 0.6154 0.6369 0.7981
No log 8.3273 458 0.6208 0.6278 0.6208 0.7879
No log 8.3636 460 0.6039 0.6733 0.6039 0.7771
No log 8.4 462 0.6241 0.6184 0.6241 0.7900
No log 8.4364 464 0.6509 0.6184 0.6509 0.8068
No log 8.4727 466 0.6416 0.6675 0.6416 0.8010
No log 8.5091 468 0.6470 0.6389 0.6470 0.8044
No log 8.5455 470 0.6891 0.5618 0.6891 0.8301
No log 8.5818 472 0.6968 0.5467 0.6968 0.8347
No log 8.6182 474 0.6996 0.4903 0.6996 0.8364
No log 8.6545 476 0.6904 0.4807 0.6904 0.8309
No log 8.6909 478 0.6771 0.5046 0.6771 0.8228
No log 8.7273 480 0.6978 0.5809 0.6978 0.8353
No log 8.7636 482 0.7506 0.5875 0.7506 0.8663
No log 8.8 484 0.7447 0.5566 0.7447 0.8629
No log 8.8364 486 0.6795 0.5708 0.6795 0.8243
No log 8.8727 488 0.6221 0.6262 0.6221 0.7888
No log 8.9091 490 0.6224 0.5807 0.6224 0.7889
No log 8.9455 492 0.6156 0.5807 0.6156 0.7846
No log 8.9818 494 0.5968 0.6517 0.5968 0.7725
No log 9.0182 496 0.5901 0.6712 0.5901 0.7682
No log 9.0545 498 0.6109 0.6371 0.6109 0.7816
0.248 9.0909 500 0.5984 0.6555 0.5984 0.7736
0.248 9.1273 502 0.5668 0.6788 0.5668 0.7529
0.248 9.1636 504 0.5639 0.6697 0.5639 0.7509
0.248 9.2 506 0.5685 0.6602 0.5685 0.7540
0.248 9.2364 508 0.5725 0.6517 0.5725 0.7566
0.248 9.2727 510 0.5883 0.6517 0.5883 0.7670
0.248 9.3091 512 0.6191 0.6025 0.6191 0.7869
0.248 9.3455 514 0.6467 0.5880 0.6467 0.8042
0.248 9.3818 516 0.6457 0.6187 0.6457 0.8036
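Note that the best validation Qwk (0.7213 at epoch ≈ 4.47) occurs well before the final checkpoint (0.6187 at epoch ≈ 9.38), so checkpoint selection matters here. A sketch of picking the best row from such a log, with a few (epoch, Qwk) pairs copied from the table above:

```python
# (epoch, validation Qwk) pairs copied from a few rows of the table above
log = [
    (3.3818, 0.7103),
    (4.4727, 0.7213),
    (9.3818, 0.6187),  # final row
]
# Select the checkpoint with the highest validation Qwk
best_epoch, best_qwk = max(log, key=lambda row: row[1])
print(best_epoch, best_qwk)  # → 4.4727 0.7213
```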

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params (Safetensors, tensor type F32)
