ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run3_AugV5_k3_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7057
  • Qwk: 0.5626
  • Mse: 0.7057
  • Rmse: 0.8401
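Here Qwk denotes Quadratic Weighted Kappa, and Rmse is the square root of Mse (√0.7057 ≈ 0.8401). A minimal pure-Python sketch of these two metrics, using an illustrative toy label set rather than the actual evaluation data:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, num_labels):
    """Cohen's kappa with quadratic weights over integer labels 0..num_labels-1."""
    # Observed confusion matrix
    obs = [[0.0] * num_labels for _ in range(num_labels)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    n = len(y_true)
    # Expected matrix from the marginal label histograms
    hist_t = [sum(row) for row in obs]
    hist_p = [sum(obs[i][j] for i in range(num_labels)) for j in range(num_labels)]
    exp = [[hist_t[i] * hist_p[j] / n for j in range(num_labels)]
           for i in range(num_labels)]
    # Quadratic disagreement weights: distant label confusions cost more
    w = [[((i - j) ** 2) / ((num_labels - 1) ** 2) for j in range(num_labels)]
         for i in range(num_labels)]
    num = sum(w[i][j] * obs[i][j] for i in range(num_labels) for j in range(num_labels))
    den = sum(w[i][j] * exp[i][j] for i in range(num_labels) for j in range(num_labels))
    return 1.0 - num / den

def rmse(y_true, y_pred):
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

# Toy 4-point scale (hypothetical scores, not from the real eval set)
gold = [0, 1, 2, 3, 2, 1]
pred = [0, 1, 2, 2, 2, 1]
print(round(quadratic_weighted_kappa(gold, pred, 4), 4))  # → 0.8889
print(round(rmse(gold, pred), 4))                         # → 0.4082
```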

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1818 2 3.8797 -0.0151 3.8797 1.9697
No log 0.3636 4 2.2542 -0.0161 2.2542 1.5014
No log 0.5455 6 1.5640 0.0408 1.5640 1.2506
No log 0.7273 8 1.1553 0.0760 1.1553 1.0748
No log 0.9091 10 1.0787 0.2343 1.0787 1.0386
No log 1.0909 12 1.2778 0.0085 1.2778 1.1304
No log 1.2727 14 1.3674 -0.0148 1.3674 1.1694
No log 1.4545 16 1.1710 0.1261 1.1710 1.0821
No log 1.6364 18 1.0584 0.2074 1.0584 1.0288
No log 1.8182 20 1.1532 0.1910 1.1532 1.0739
No log 2.0 22 1.0715 0.2611 1.0715 1.0352
No log 2.1818 24 1.1021 0.2659 1.1021 1.0498
No log 2.3636 26 1.1801 0.2295 1.1801 1.0863
No log 2.5455 28 1.1904 0.1910 1.1904 1.0910
No log 2.7273 30 1.0786 0.2857 1.0786 1.0385
No log 2.9091 32 1.0629 0.2566 1.0629 1.0310
No log 3.0909 34 1.0899 0.2770 1.0899 1.0440
No log 3.2727 36 1.1022 0.1398 1.1022 1.0498
No log 3.4545 38 1.1317 0.1591 1.1317 1.0638
No log 3.6364 40 1.1801 0.0820 1.1801 1.0863
No log 3.8182 42 1.1056 0.1805 1.1056 1.0515
No log 4.0 44 0.9069 0.2448 0.9069 0.9523
No log 4.1818 46 0.8893 0.3112 0.8893 0.9430
No log 4.3636 48 0.8987 0.3011 0.8987 0.9480
No log 4.5455 50 1.0456 0.3361 1.0456 1.0226
No log 4.7273 52 1.0315 0.3772 1.0315 1.0156
No log 4.9091 54 0.8472 0.3661 0.8472 0.9205
No log 5.0909 56 0.8577 0.3721 0.8577 0.9261
No log 5.2727 58 0.9184 0.3301 0.9184 0.9584
No log 5.4545 60 0.9835 0.1961 0.9835 0.9917
No log 5.6364 62 0.9938 0.1987 0.9938 0.9969
No log 5.8182 64 0.9835 0.2850 0.9835 0.9917
No log 6.0 66 0.9564 0.3411 0.9564 0.9780
No log 6.1818 68 0.8398 0.4313 0.8398 0.9164
No log 6.3636 70 0.8642 0.4342 0.8642 0.9296
No log 6.5455 72 0.8220 0.4487 0.8220 0.9066
No log 6.7273 74 0.8496 0.4879 0.8496 0.9217
No log 6.9091 76 0.9117 0.4128 0.9117 0.9548
No log 7.0909 78 0.8032 0.5396 0.8032 0.8962
No log 7.2727 80 0.8479 0.4843 0.8479 0.9208
No log 7.4545 82 0.8160 0.5522 0.8160 0.9033
No log 7.6364 84 0.7757 0.5969 0.7757 0.8808
No log 7.8182 86 0.8290 0.5144 0.8290 0.9105
No log 8.0 88 1.0519 0.4281 1.0519 1.0256
No log 8.1818 90 1.0822 0.4273 1.0822 1.0403
No log 8.3636 92 0.8511 0.5266 0.8511 0.9226
No log 8.5455 94 0.8403 0.4346 0.8403 0.9167
No log 8.7273 96 0.9118 0.3468 0.9118 0.9549
No log 8.9091 98 0.8302 0.3908 0.8302 0.9111
No log 9.0909 100 0.7711 0.5640 0.7711 0.8781
No log 9.2727 102 0.9022 0.5428 0.9022 0.9499
No log 9.4545 104 0.9870 0.5179 0.9870 0.9935
No log 9.6364 106 0.7824 0.5543 0.7824 0.8845
No log 9.8182 108 0.6832 0.5557 0.6832 0.8266
No log 10.0 110 0.7102 0.4747 0.7102 0.8427
No log 10.1818 112 0.7176 0.5492 0.7176 0.8471
No log 10.3636 114 0.8335 0.5041 0.8335 0.9130
No log 10.5455 116 0.9802 0.3982 0.9802 0.9900
No log 10.7273 118 0.9685 0.4123 0.9685 0.9841
No log 10.9091 120 0.7593 0.6118 0.7593 0.8714
No log 11.0909 122 0.7110 0.5274 0.7110 0.8432
No log 11.2727 124 0.7332 0.4723 0.7332 0.8563
No log 11.4545 126 0.7153 0.5412 0.7153 0.8457
No log 11.6364 128 0.7427 0.5343 0.7427 0.8618
No log 11.8182 130 0.7929 0.4616 0.7929 0.8904
No log 12.0 132 0.7914 0.4645 0.7914 0.8896
No log 12.1818 134 0.8182 0.4845 0.8182 0.9046
No log 12.3636 136 0.7783 0.4963 0.7783 0.8822
No log 12.5455 138 0.7750 0.5690 0.7750 0.8803
No log 12.7273 140 0.9079 0.5436 0.9079 0.9528
No log 12.9091 142 0.8590 0.5668 0.8590 0.9268
No log 13.0909 144 0.7516 0.5236 0.7516 0.8670
No log 13.2727 146 0.8058 0.3908 0.8058 0.8977
No log 13.4545 148 0.8077 0.4282 0.8077 0.8987
No log 13.6364 150 0.7553 0.4676 0.7553 0.8691
No log 13.8182 152 0.8525 0.4350 0.8525 0.9233
No log 14.0 154 0.9381 0.3391 0.9381 0.9686
No log 14.1818 156 0.8628 0.4224 0.8628 0.9289
No log 14.3636 158 0.7848 0.4888 0.7848 0.8859
No log 14.5455 160 0.7833 0.5174 0.7833 0.8850
No log 14.7273 162 0.7567 0.5370 0.7567 0.8699
No log 14.9091 164 0.7343 0.5357 0.7343 0.8569
No log 15.0909 166 0.7409 0.4875 0.7409 0.8608
No log 15.2727 168 0.7796 0.5368 0.7796 0.8829
No log 15.4545 170 0.7266 0.4975 0.7266 0.8524
No log 15.6364 172 0.7216 0.4969 0.7216 0.8495
No log 15.8182 174 0.7066 0.5563 0.7066 0.8406
No log 16.0 176 0.7402 0.5166 0.7402 0.8603
No log 16.1818 178 0.7466 0.5166 0.7466 0.8641
No log 16.3636 180 0.7648 0.5186 0.7648 0.8745
No log 16.5455 182 0.8640 0.4799 0.8640 0.9295
No log 16.7273 184 0.8457 0.5254 0.8457 0.9196
No log 16.9091 186 0.7407 0.6025 0.7407 0.8606
No log 17.0909 188 0.6988 0.5142 0.6988 0.8359
No log 17.2727 190 0.7029 0.5044 0.7029 0.8384
No log 17.4545 192 0.7113 0.5125 0.7113 0.8434
No log 17.6364 194 0.7241 0.5234 0.7241 0.8510
No log 17.8182 196 0.7539 0.5442 0.7539 0.8683
No log 18.0 198 0.7618 0.4531 0.7618 0.8728
No log 18.1818 200 0.7480 0.4417 0.7480 0.8649
No log 18.3636 202 0.7198 0.5331 0.7198 0.8484
No log 18.5455 204 0.7221 0.5751 0.7221 0.8497
No log 18.7273 206 0.7536 0.5938 0.7536 0.8681
No log 18.9091 208 0.7248 0.5410 0.7248 0.8514
No log 19.0909 210 0.7369 0.5678 0.7369 0.8584
No log 19.2727 212 0.7787 0.4329 0.7787 0.8824
No log 19.4545 214 0.8557 0.4006 0.8557 0.9251
No log 19.6364 216 0.8762 0.4006 0.8762 0.9361
No log 19.8182 218 0.8313 0.3656 0.8313 0.9117
No log 20.0 220 0.8193 0.4728 0.8193 0.9052
No log 20.1818 222 0.8794 0.4696 0.8794 0.9377
No log 20.3636 224 0.9045 0.4043 0.9045 0.9511
No log 20.5455 226 0.8053 0.5516 0.8053 0.8974
No log 20.7273 228 0.7293 0.5475 0.7293 0.8540
No log 20.9091 230 0.7272 0.5660 0.7272 0.8528
No log 21.0909 232 0.7927 0.5279 0.7927 0.8903
No log 21.2727 234 0.8809 0.4796 0.8809 0.9386
No log 21.4545 236 0.8392 0.5846 0.8392 0.9161
No log 21.6364 238 0.7700 0.5121 0.7700 0.8775
No log 21.8182 240 0.7756 0.4048 0.7756 0.8807
No log 22.0 242 0.7812 0.4163 0.7812 0.8839
No log 22.1818 244 0.7794 0.4395 0.7794 0.8828
No log 22.3636 246 0.7699 0.4163 0.7699 0.8775
No log 22.5455 248 0.7538 0.4428 0.7538 0.8682
No log 22.7273 250 0.7333 0.4789 0.7333 0.8563
No log 22.9091 252 0.7270 0.5223 0.7270 0.8527
No log 23.0909 254 0.7233 0.5330 0.7233 0.8505
No log 23.2727 256 0.7286 0.4661 0.7286 0.8536
No log 23.4545 258 0.7514 0.4645 0.7514 0.8668
No log 23.6364 260 0.7760 0.4613 0.7760 0.8809
No log 23.8182 262 0.7857 0.4728 0.7857 0.8864
No log 24.0 264 0.7772 0.4269 0.7772 0.8816
No log 24.1818 266 0.7522 0.4540 0.7522 0.8673
No log 24.3636 268 0.7253 0.5038 0.7253 0.8517
No log 24.5455 270 0.7081 0.5630 0.7081 0.8415
No log 24.7273 272 0.6745 0.5618 0.6745 0.8213
No log 24.9091 274 0.6633 0.5785 0.6633 0.8144
No log 25.0909 276 0.6680 0.5356 0.6680 0.8173
No log 25.2727 278 0.6806 0.5356 0.6806 0.8250
No log 25.4545 280 0.6902 0.5356 0.6902 0.8308
No log 25.6364 282 0.6937 0.5463 0.6937 0.8329
No log 25.8182 284 0.6963 0.5356 0.6963 0.8345
No log 26.0 286 0.6998 0.5139 0.6998 0.8366
No log 26.1818 288 0.7054 0.5139 0.7054 0.8399
No log 26.3636 290 0.7047 0.5356 0.7047 0.8395
No log 26.5455 292 0.7158 0.5016 0.7158 0.8461
No log 26.7273 294 0.7342 0.4646 0.7342 0.8569
No log 26.9091 296 0.7422 0.5208 0.7422 0.8615
No log 27.0909 298 0.7282 0.5654 0.7282 0.8534
No log 27.2727 300 0.6916 0.4995 0.6916 0.8316
No log 27.4545 302 0.6937 0.4995 0.6937 0.8329
No log 27.6364 304 0.7054 0.4219 0.7054 0.8399
No log 27.8182 306 0.7096 0.4691 0.7096 0.8424
No log 28.0 308 0.7108 0.4691 0.7108 0.8431
No log 28.1818 310 0.7164 0.4893 0.7164 0.8464
No log 28.3636 312 0.7319 0.5210 0.7319 0.8555
No log 28.5455 314 0.7304 0.4757 0.7304 0.8546
No log 28.7273 316 0.7253 0.5199 0.7253 0.8517
No log 28.9091 318 0.6978 0.4922 0.6978 0.8353
No log 29.0909 320 0.6922 0.5287 0.6922 0.8320
No log 29.2727 322 0.7047 0.4962 0.7047 0.8395
No log 29.4545 324 0.6979 0.4706 0.6979 0.8354
No log 29.6364 326 0.6968 0.5552 0.6968 0.8348
No log 29.8182 328 0.7151 0.5304 0.7151 0.8456
No log 30.0 330 0.7049 0.5552 0.7049 0.8396
No log 30.1818 332 0.7119 0.4225 0.7119 0.8437
No log 30.3636 334 0.7130 0.4609 0.7130 0.8444
No log 30.5455 336 0.6906 0.5831 0.6906 0.8310
No log 30.7273 338 0.6836 0.5832 0.6836 0.8268
No log 30.9091 340 0.6728 0.6147 0.6728 0.8203
No log 31.0909 342 0.6753 0.6087 0.6753 0.8218
No log 31.2727 344 0.6846 0.5974 0.6846 0.8274
No log 31.4545 346 0.6970 0.5999 0.6970 0.8349
No log 31.6364 348 0.7010 0.5748 0.7010 0.8372
No log 31.8182 350 0.6663 0.5884 0.6663 0.8163
No log 32.0 352 0.6919 0.5855 0.6919 0.8318
No log 32.1818 354 0.7155 0.5969 0.7155 0.8459
No log 32.3636 356 0.7336 0.4645 0.7336 0.8565
No log 32.5455 358 0.7506 0.4778 0.7506 0.8663
No log 32.7273 360 0.7450 0.4645 0.7450 0.8631
No log 32.9091 362 0.7439 0.5232 0.7439 0.8625
No log 33.0909 364 0.7289 0.5654 0.7289 0.8537
No log 33.2727 366 0.6908 0.5450 0.6908 0.8311
No log 33.4545 368 0.6768 0.5648 0.6768 0.8227
No log 33.6364 370 0.6770 0.6147 0.6770 0.8228
No log 33.8182 372 0.6628 0.5853 0.6628 0.8141
No log 34.0 374 0.6625 0.5763 0.6625 0.8140
No log 34.1818 376 0.6756 0.5819 0.6756 0.8220
No log 34.3636 378 0.6856 0.5176 0.6856 0.8280
No log 34.5455 380 0.7042 0.5223 0.7042 0.8392
No log 34.7273 382 0.7274 0.5331 0.7274 0.8529
No log 34.9091 384 0.7419 0.5546 0.7419 0.8613
No log 35.0909 386 0.7334 0.4757 0.7334 0.8564
No log 35.2727 388 0.7439 0.5060 0.7439 0.8625
No log 35.4545 390 0.7532 0.4834 0.7532 0.8678
No log 35.6364 392 0.7247 0.4834 0.7247 0.8513
No log 35.8182 394 0.7045 0.4772 0.7045 0.8393
No log 36.0 396 0.6957 0.5657 0.6957 0.8341
No log 36.1818 398 0.6766 0.5492 0.6766 0.8226
No log 36.3636 400 0.6669 0.5224 0.6669 0.8166
No log 36.5455 402 0.6704 0.5752 0.6704 0.8188
No log 36.7273 404 0.6749 0.5120 0.6749 0.8215
No log 36.9091 406 0.6819 0.5030 0.6819 0.8258
No log 37.0909 408 0.6871 0.5139 0.6871 0.8289
No log 37.2727 410 0.6892 0.5224 0.6892 0.8302
No log 37.4545 412 0.6938 0.5316 0.6938 0.8330
No log 37.6364 414 0.6935 0.5832 0.6935 0.8328
No log 37.8182 416 0.6985 0.6305 0.6985 0.8358
No log 38.0 418 0.6726 0.5621 0.6726 0.8201
No log 38.1818 420 0.6671 0.5671 0.6671 0.8168
No log 38.3636 422 0.6759 0.5466 0.6759 0.8221
No log 38.5455 424 0.6809 0.5671 0.6809 0.8251
No log 38.7273 426 0.6996 0.5648 0.6996 0.8364
No log 38.9091 428 0.7003 0.5434 0.7003 0.8368
No log 39.0909 430 0.6992 0.5304 0.6992 0.8362
No log 39.2727 432 0.6838 0.6119 0.6838 0.8269
No log 39.4545 434 0.6889 0.5987 0.6889 0.8300
No log 39.6364 436 0.6827 0.5416 0.6827 0.8263
No log 39.8182 438 0.6822 0.4996 0.6822 0.8260
No log 40.0 440 0.6981 0.4813 0.6981 0.8356
No log 40.1818 442 0.7384 0.5421 0.7384 0.8593
No log 40.3636 444 0.7365 0.4962 0.7365 0.8582
No log 40.5455 446 0.7105 0.5023 0.7105 0.8429
No log 40.7273 448 0.7365 0.5304 0.7365 0.8582
No log 40.9091 450 0.7613 0.5888 0.7613 0.8725
No log 41.0909 452 0.7494 0.5888 0.7494 0.8657
No log 41.2727 454 0.7105 0.5699 0.7105 0.8429
No log 41.4545 456 0.6659 0.5752 0.6659 0.8160
No log 41.6364 458 0.6803 0.5391 0.6803 0.8248
No log 41.8182 460 0.6804 0.5726 0.6804 0.8248
No log 42.0 462 0.6567 0.5796 0.6567 0.8104
No log 42.1818 464 0.6596 0.5853 0.6596 0.8122
No log 42.3636 466 0.6805 0.5637 0.6805 0.8249
No log 42.5455 468 0.7280 0.5938 0.7280 0.8532
No log 42.7273 470 0.7271 0.5774 0.7271 0.8527
No log 42.9091 472 0.7169 0.4879 0.7169 0.8467
No log 43.0909 474 0.7080 0.4772 0.7080 0.8414
No log 43.2727 476 0.6956 0.4787 0.6956 0.8341
No log 43.4545 478 0.6894 0.4787 0.6894 0.8303
No log 43.6364 480 0.6913 0.4770 0.6913 0.8314
No log 43.8182 482 0.7056 0.5858 0.7056 0.8400
No log 44.0 484 0.6900 0.5546 0.6900 0.8307
No log 44.1818 486 0.6708 0.5368 0.6708 0.8190
No log 44.3636 488 0.6946 0.5204 0.6946 0.8334
No log 44.5455 490 0.7024 0.5204 0.7024 0.8381
No log 44.7273 492 0.6899 0.5066 0.6899 0.8306
No log 44.9091 494 0.6991 0.5234 0.6991 0.8361
No log 45.0909 496 0.7290 0.5740 0.7290 0.8538
No log 45.2727 498 0.7293 0.5740 0.7293 0.8540
0.2408 45.4545 500 0.7114 0.5975 0.7114 0.8434
0.2408 45.6364 502 0.7020 0.5438 0.7020 0.8379
0.2408 45.8182 504 0.6992 0.5869 0.6992 0.8362
0.2408 46.0 506 0.7041 0.5869 0.7041 0.8391
0.2408 46.1818 508 0.7141 0.5626 0.7141 0.8450
0.2408 46.3636 510 0.7057 0.5626 0.7057 0.8401

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run3_AugV5_k3_task5_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.