ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k20_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6567
  • Qwk: 0.5712
  • Mse: 0.6567
  • Rmse: 0.8104
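Here Qwk is quadratic weighted kappa (chance-corrected agreement between integer scores, with disagreements penalized by squared label distance), and Rmse is simply the square root of the Mse above (√0.6567 ≈ 0.8104). A minimal, dependency-free sketch of how these metrics are computed from integer ratings (the example labels below are hypothetical, not taken from the evaluation set):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, num_labels):
    """Quadratic weighted kappa: 1 - (weighted observed disagreement /
    weighted expected disagreement under independent marginals)."""
    n = len(y_true)
    # Observed confusion matrix
    O = [[0] * num_labels for _ in range(num_labels)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Marginal histograms for the expected matrix
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(num_labels)) for j in range(num_labels)]
    num = den = 0.0
    for i in range(num_labels):
        for j in range(num_labels):
            w = ((i - j) ** 2) / ((num_labels - 1) ** 2)  # quadratic weight
            num += w * O[i][j]
            den += w * hist_t[i] * hist_p[j] / n
    return 1.0 - num / den

def rmse(y_true, y_pred):
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

y_true = [0, 1, 2, 2, 1, 0]  # hypothetical gold organization scores
y_pred = [0, 1, 1, 2, 2, 0]  # hypothetical model predictions
print(round(quadratic_weighted_kappa(y_true, y_pred, 3), 4))  # → 0.75
print(round(rmse(y_true, y_pred), 4))                          # → 0.5774
```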

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
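With a linear scheduler and (by the usual transformers default) no warmup, the learning rate decays from 2e-05 toward zero over the full run. The results table below logs 100 steps per epoch, so 100 epochs implies roughly 10,000 optimizer steps; a sketch of the schedule under those assumptions:

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Linear schedule: ramp up over warmup_steps (here assumed 0),
    then decay linearly to zero at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# Assuming ~10,000 total steps (100 epochs x 100 steps/epoch per the table below)
print(linear_lr(0, 10000))      # 2e-05 (start of training)
print(linear_lr(5000, 10000))   # 1e-05 (halfway)
print(linear_lr(10000, 10000))  # 0.0   (end of training)
```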

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.02 2 3.8643 0.0012 3.8643 1.9658
No log 0.04 4 1.8361 0.0749 1.8361 1.3550
No log 0.06 6 1.2820 0.1140 1.2820 1.1323
No log 0.08 8 1.3093 0.1140 1.3093 1.1443
No log 0.1 10 1.0373 0.1764 1.0373 1.0185
No log 0.12 12 1.0464 0.2004 1.0464 1.0229
No log 0.14 14 0.9777 0.4148 0.9777 0.9888
No log 0.16 16 0.9279 0.2314 0.9279 0.9633
No log 0.18 18 0.8598 0.3795 0.8598 0.9272
No log 0.2 20 0.9864 0.3849 0.9864 0.9932
No log 0.22 22 1.1458 0.3783 1.1458 1.0704
No log 0.24 24 0.8523 0.4152 0.8523 0.9232
No log 0.26 26 0.8373 0.3922 0.8373 0.9150
No log 0.28 28 0.8749 0.3107 0.8749 0.9353
No log 0.3 30 0.7485 0.4628 0.7485 0.8651
No log 0.32 32 0.7646 0.4888 0.7646 0.8744
No log 0.34 34 0.8300 0.4844 0.8300 0.9110
No log 0.36 36 0.6520 0.5692 0.6520 0.8074
No log 0.38 38 0.6219 0.6117 0.6219 0.7886
No log 0.4 40 0.6203 0.5786 0.6203 0.7876
No log 0.42 42 0.5918 0.6282 0.5918 0.7693
No log 0.44 44 0.6069 0.6282 0.6069 0.7790
No log 0.46 46 0.6372 0.6429 0.6372 0.7982
No log 0.48 48 0.6146 0.6789 0.6146 0.7840
No log 0.5 50 0.5989 0.6695 0.5989 0.7739
No log 0.52 52 0.6147 0.5770 0.6147 0.7840
No log 0.54 54 0.9851 0.4820 0.9851 0.9925
No log 0.56 56 0.8639 0.5061 0.8639 0.9294
No log 0.58 58 0.6215 0.5483 0.6215 0.7884
No log 0.6 60 0.7612 0.5799 0.7612 0.8725
No log 0.62 62 0.8235 0.5110 0.8235 0.9075
No log 0.64 64 0.7185 0.5833 0.7185 0.8476
No log 0.66 66 0.7259 0.5202 0.7259 0.8520
No log 0.68 68 0.8135 0.5474 0.8135 0.9019
No log 0.7 70 0.7532 0.5783 0.7532 0.8679
No log 0.72 72 0.6690 0.6459 0.6690 0.8179
No log 0.74 74 0.7348 0.5688 0.7348 0.8572
No log 0.76 76 0.7533 0.5779 0.7533 0.8679
No log 0.78 78 0.6669 0.5681 0.6669 0.8166
No log 0.8 80 0.6536 0.5905 0.6536 0.8085
No log 0.82 82 0.6483 0.5905 0.6483 0.8052
No log 0.84 84 0.7571 0.6047 0.7571 0.8701
No log 0.86 86 0.9602 0.5048 0.9602 0.9799
No log 0.88 88 0.9224 0.4864 0.9224 0.9604
No log 0.9 90 0.6924 0.5777 0.6924 0.8321
No log 0.92 92 0.7359 0.5635 0.7359 0.8578
No log 0.94 94 0.8947 0.5385 0.8947 0.9459
No log 0.96 96 0.7290 0.5347 0.7290 0.8538
No log 0.98 98 0.6546 0.6007 0.6546 0.8091
No log 1.0 100 0.6847 0.5710 0.6847 0.8275
No log 1.02 102 0.6691 0.5905 0.6691 0.8180
No log 1.04 104 0.6581 0.5692 0.6581 0.8112
No log 1.06 106 0.6661 0.6028 0.6661 0.8161
No log 1.08 108 0.7246 0.5366 0.7246 0.8513
No log 1.1 110 0.7176 0.5366 0.7176 0.8471
No log 1.12 112 0.6640 0.6528 0.6640 0.8149
No log 1.14 114 0.6573 0.6207 0.6573 0.8107
No log 1.16 116 0.6588 0.6314 0.6588 0.8116
No log 1.18 118 0.7409 0.5735 0.7409 0.8608
No log 1.2 120 0.7112 0.5924 0.7112 0.8433
No log 1.22 122 0.6240 0.6732 0.6240 0.7899
No log 1.24 124 0.6307 0.6256 0.6307 0.7942
No log 1.26 126 0.6502 0.6035 0.6502 0.8063
No log 1.28 128 0.6055 0.6433 0.6055 0.7781
No log 1.3 130 0.6495 0.6008 0.6495 0.8059
No log 1.32 132 0.7218 0.5631 0.7218 0.8496
No log 1.34 134 0.6751 0.6783 0.6751 0.8216
No log 1.36 136 0.6835 0.6526 0.6835 0.8268
No log 1.38 138 0.6950 0.6695 0.6950 0.8337
No log 1.4 140 0.6959 0.6695 0.6959 0.8342
No log 1.42 142 0.6978 0.6495 0.6978 0.8354
No log 1.44 144 0.7484 0.5607 0.7484 0.8651
No log 1.46 146 0.6730 0.6547 0.6730 0.8204
No log 1.48 148 0.6596 0.6219 0.6596 0.8122
No log 1.5 150 0.6476 0.6424 0.6476 0.8047
No log 1.52 152 0.6699 0.6105 0.6699 0.8185
No log 1.54 154 0.7078 0.5960 0.7078 0.8413
No log 1.56 156 0.6775 0.6439 0.6775 0.8231
No log 1.58 158 0.8119 0.5575 0.8119 0.9011
No log 1.6 160 0.8803 0.4705 0.8803 0.9382
No log 1.62 162 0.7466 0.5983 0.7466 0.8641
No log 1.64 164 0.6945 0.5590 0.6945 0.8334
No log 1.66 166 0.6792 0.5160 0.6792 0.8241
No log 1.68 168 0.6738 0.5399 0.6738 0.8209
No log 1.7 170 0.7170 0.5554 0.7170 0.8467
No log 1.72 172 0.6914 0.4941 0.6914 0.8315
No log 1.74 174 0.7214 0.5064 0.7214 0.8494
No log 1.76 176 0.8207 0.5318 0.8207 0.9059
No log 1.78 178 0.8017 0.5538 0.8017 0.8954
No log 1.8 180 0.7272 0.4962 0.7272 0.8528
No log 1.82 182 0.7272 0.4739 0.7272 0.8527
No log 1.84 184 0.7173 0.4865 0.7173 0.8469
No log 1.86 186 0.7733 0.5877 0.7733 0.8794
No log 1.88 188 0.7664 0.5487 0.7664 0.8755
No log 1.9 190 0.7477 0.5504 0.7477 0.8647
No log 1.92 192 0.7530 0.5385 0.7530 0.8677
No log 1.94 194 0.7845 0.5254 0.7845 0.8857
No log 1.96 196 0.7402 0.5370 0.7402 0.8604
No log 1.98 198 0.6715 0.5618 0.6715 0.8194
No log 2.0 200 0.7226 0.6109 0.7226 0.8501
No log 2.02 202 0.6886 0.5618 0.6886 0.8298
No log 2.04 204 0.6814 0.5720 0.6814 0.8255
No log 2.06 206 0.8482 0.6088 0.8482 0.9210
No log 2.08 208 0.8429 0.5515 0.8429 0.9181
No log 2.1 210 0.7177 0.5626 0.7177 0.8472
No log 2.12 212 0.6846 0.4544 0.6846 0.8274
No log 2.14 214 0.7054 0.5328 0.7054 0.8399
No log 2.16 216 0.7021 0.4962 0.7021 0.8379
No log 2.18 218 0.6911 0.5074 0.6911 0.8313
No log 2.2 220 0.7086 0.5798 0.7086 0.8418
No log 2.22 222 0.7691 0.5566 0.7691 0.8770
No log 2.24 224 0.7524 0.5677 0.7524 0.8674
No log 2.26 226 0.7252 0.5819 0.7252 0.8516
No log 2.28 228 0.8752 0.3833 0.8752 0.9355
No log 2.3 230 0.8564 0.4479 0.8564 0.9254
No log 2.32 232 0.7324 0.5999 0.7324 0.8558
No log 2.34 234 0.7188 0.5657 0.7188 0.8478
No log 2.36 236 0.7398 0.5588 0.7398 0.8601
No log 2.38 238 0.6920 0.5534 0.6920 0.8318
No log 2.4 240 0.6676 0.5386 0.6676 0.8171
No log 2.42 242 0.6757 0.5248 0.6757 0.8220
No log 2.44 244 0.6980 0.5654 0.6980 0.8354
No log 2.46 246 0.7562 0.5572 0.7562 0.8696
No log 2.48 248 0.7395 0.5572 0.7395 0.8600
No log 2.5 250 0.7117 0.5634 0.7117 0.8436
No log 2.52 252 0.7710 0.5128 0.7710 0.8781
No log 2.54 254 0.7422 0.5777 0.7422 0.8615
No log 2.56 256 0.7219 0.5888 0.7219 0.8496
No log 2.58 258 0.6728 0.5797 0.6728 0.8202
No log 2.6 260 0.6701 0.5763 0.6701 0.8186
No log 2.62 262 0.6965 0.5797 0.6965 0.8346
No log 2.64 264 0.7933 0.5745 0.7933 0.8907
No log 2.66 266 0.8037 0.5443 0.8037 0.8965
No log 2.68 268 0.7442 0.4946 0.7442 0.8627
No log 2.7 270 0.7191 0.4507 0.7191 0.8480
No log 2.72 272 0.7214 0.4494 0.7214 0.8494
No log 2.74 274 0.7259 0.4597 0.7259 0.8520
No log 2.76 276 0.6937 0.4854 0.6937 0.8329
No log 2.78 278 0.7137 0.5697 0.7137 0.8448
No log 2.8 280 0.6996 0.5862 0.6996 0.8364
No log 2.82 282 0.6712 0.6301 0.6712 0.8193
No log 2.84 284 0.6801 0.6301 0.6801 0.8247
No log 2.86 286 0.6638 0.5874 0.6638 0.8147
No log 2.88 288 0.6751 0.5752 0.6751 0.8216
No log 2.9 290 0.7698 0.5666 0.7698 0.8774
No log 2.92 292 0.9409 0.4761 0.9409 0.9700
No log 2.94 294 0.9121 0.4333 0.9121 0.9550
No log 2.96 296 0.8049 0.4169 0.8049 0.8972
No log 2.98 298 0.7403 0.4873 0.7403 0.8604
No log 3.0 300 0.6902 0.5149 0.6902 0.8308
No log 3.02 302 0.7090 0.5797 0.7090 0.8420
No log 3.04 304 0.8228 0.5102 0.8228 0.9071
No log 3.06 306 0.8134 0.5555 0.8134 0.9019
No log 3.08 308 0.7239 0.5626 0.7239 0.8508
No log 3.1 310 0.6936 0.5944 0.6936 0.8328
No log 3.12 312 0.6755 0.5939 0.6755 0.8219
No log 3.14 314 0.6643 0.5939 0.6643 0.8150
No log 3.16 316 0.6685 0.6038 0.6685 0.8176
No log 3.18 318 0.7193 0.5708 0.7193 0.8481
No log 3.2 320 0.6959 0.5098 0.6959 0.8342
No log 3.22 322 0.6491 0.5135 0.6491 0.8057
No log 3.24 324 0.6420 0.6154 0.6420 0.8013
No log 3.26 326 0.7094 0.5798 0.7094 0.8423
No log 3.28 328 0.7346 0.5966 0.7346 0.8571
No log 3.3 330 0.7147 0.5798 0.7147 0.8454
No log 3.32 332 0.6923 0.5688 0.6923 0.8320
No log 3.34 334 0.6875 0.5528 0.6875 0.8292
No log 3.36 336 0.6705 0.5546 0.6705 0.8188
No log 3.38 338 0.6507 0.5274 0.6507 0.8066
No log 3.4 340 0.6442 0.5388 0.6442 0.8026
No log 3.42 342 0.6895 0.5634 0.6895 0.8303
No log 3.44 344 0.7224 0.6035 0.7224 0.8500
No log 3.46 346 0.6399 0.5909 0.6399 0.8000
No log 3.48 348 0.5997 0.6850 0.5997 0.7744
No log 3.5 350 0.5918 0.6507 0.5918 0.7693
No log 3.52 352 0.6219 0.5954 0.6219 0.7886
No log 3.54 354 0.6925 0.5891 0.6925 0.8322
No log 3.56 356 0.6771 0.5811 0.6771 0.8229
No log 3.58 358 0.6698 0.5846 0.6698 0.8184
No log 3.6 360 0.6355 0.5121 0.6355 0.7972
No log 3.62 362 0.6734 0.5697 0.6734 0.8206
No log 3.64 364 0.7285 0.5938 0.7285 0.8535
No log 3.66 366 0.6738 0.6083 0.6738 0.8208
No log 3.68 368 0.6435 0.5830 0.6435 0.8022
No log 3.7 370 0.6114 0.6610 0.6114 0.7819
No log 3.72 372 0.5958 0.6028 0.5958 0.7719
No log 3.74 374 0.5942 0.6433 0.5942 0.7709
No log 3.76 376 0.6186 0.5996 0.6186 0.7865
No log 3.78 378 0.6044 0.6197 0.6044 0.7775
No log 3.8 380 0.6158 0.6087 0.6158 0.7847
No log 3.82 382 0.6446 0.6207 0.6446 0.8029
No log 3.84 384 0.6280 0.5622 0.6280 0.7924
No log 3.86 386 0.7010 0.5181 0.7010 0.8372
No log 3.88 388 0.7904 0.5856 0.7904 0.8891
No log 3.9 390 0.7803 0.5675 0.7803 0.8834
No log 3.92 392 0.6960 0.5195 0.6960 0.8342
No log 3.94 394 0.6256 0.5724 0.6256 0.7909
No log 3.96 396 0.6073 0.6317 0.6073 0.7793
No log 3.98 398 0.6022 0.6537 0.6022 0.7760
No log 4.0 400 0.6576 0.6419 0.6576 0.8109
No log 4.02 402 0.6695 0.6124 0.6695 0.8182
No log 4.04 404 0.6579 0.6128 0.6579 0.8111
No log 4.06 406 0.6992 0.5862 0.6992 0.8362
No log 4.08 408 0.7007 0.5788 0.7007 0.8371
No log 4.1 410 0.6931 0.5798 0.6931 0.8325
No log 4.12 412 0.6534 0.6301 0.6534 0.8084
No log 4.14 414 0.6401 0.6460 0.6401 0.8001
No log 4.16 416 0.6757 0.6138 0.6757 0.8220
No log 4.18 418 0.7349 0.5850 0.7349 0.8572
No log 4.2 420 0.6782 0.6455 0.6782 0.8235
No log 4.22 422 0.6637 0.6498 0.6637 0.8147
No log 4.24 424 0.6686 0.6498 0.6686 0.8177
No log 4.26 426 0.6932 0.6113 0.6932 0.8326
No log 4.28 428 0.8239 0.5521 0.8239 0.9077
No log 4.3 430 0.8932 0.4279 0.8932 0.9451
No log 4.32 432 0.8222 0.5254 0.8222 0.9067
No log 4.34 434 0.7338 0.5010 0.7338 0.8566
No log 4.36 436 0.7235 0.5248 0.7235 0.8506
No log 4.38 438 0.7419 0.4875 0.7419 0.8613
No log 4.4 440 0.8078 0.5173 0.8078 0.8988
No log 4.42 442 0.8541 0.5549 0.8541 0.9242
No log 4.44 444 0.8059 0.5173 0.8059 0.8977
No log 4.46 446 0.7410 0.4875 0.7410 0.8608
No log 4.48 448 0.7134 0.5249 0.7134 0.8446
No log 4.5 450 0.7181 0.5237 0.7181 0.8474
No log 4.52 452 0.7487 0.5291 0.7487 0.8653
No log 4.54 454 0.8152 0.5769 0.8152 0.9029
No log 4.56 456 0.7902 0.5631 0.7902 0.8889
No log 4.58 458 0.7684 0.5304 0.7684 0.8766
No log 4.6 460 0.7591 0.4352 0.7591 0.8712
No log 4.62 462 0.7162 0.5146 0.7162 0.8463
No log 4.64 464 0.6976 0.5032 0.6976 0.8352
No log 4.66 466 0.6756 0.5261 0.6756 0.8219
No log 4.68 468 0.6751 0.5237 0.6751 0.8217
No log 4.7 470 0.7272 0.5953 0.7272 0.8527
No log 4.72 472 0.7446 0.6124 0.7446 0.8629
No log 4.74 474 0.6843 0.5731 0.6843 0.8272
No log 4.76 476 0.6706 0.5843 0.6706 0.8189
No log 4.78 478 0.6735 0.5843 0.6735 0.8207
No log 4.8 480 0.6650 0.5368 0.6650 0.8155
No log 4.82 482 0.6695 0.5112 0.6695 0.8182
No log 4.84 484 0.7096 0.5494 0.7096 0.8424
No log 4.86 486 0.7091 0.5494 0.7091 0.8421
No log 4.88 488 0.6745 0.5719 0.6745 0.8213
No log 4.9 490 0.6414 0.5112 0.6414 0.8009
No log 4.92 492 0.6272 0.5135 0.6272 0.7920
No log 4.94 494 0.6303 0.5121 0.6303 0.7939
No log 4.96 496 0.6713 0.5494 0.6713 0.8194
No log 4.98 498 0.7116 0.5494 0.7116 0.8436
0.2761 5.0 500 0.7207 0.5973 0.7207 0.8490
0.2761 5.02 502 0.6866 0.5494 0.6866 0.8286
0.2761 5.04 504 0.6744 0.5919 0.6744 0.8212
0.2761 5.06 506 0.6856 0.6605 0.6856 0.8280
0.2761 5.08 508 0.6757 0.6443 0.6757 0.8220
0.2761 5.1 510 0.6492 0.6138 0.6492 0.8057
0.2761 5.12 512 0.6484 0.6167 0.6484 0.8052
0.2761 5.14 514 0.6524 0.5701 0.6524 0.8077
0.2761 5.16 516 0.6632 0.5590 0.6632 0.8144
0.2761 5.18 518 0.6805 0.5210 0.6805 0.8249
0.2761 5.2 520 0.6719 0.5467 0.6719 0.8197
0.2761 5.22 522 0.6567 0.5712 0.6567 0.8104

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors

  • Model size: 0.1B params
  • Tensor type: F32