ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k20_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8349
  • Qwk: 0.5229
  • Mse: 0.8349
  • Rmse: 0.9137
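
Qwk here is quadratic weighted kappa, a chance-corrected agreement score for ordinal labels, alongside the usual MSE and RMSE. The sketch below shows how the three metrics are computed; it is illustrative only (the toy labels are made up and this is not the card's evaluation script):

```python
from math import sqrt

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..n_classes-1."""
    n = len(y_true)
    # Observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Marginal histograms give the expected matrix under chance agreement
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = ((i - j) ** 2) / ((n_classes - 1) ** 2)  # quadratic penalty
            expected = hist_t[i] * hist_p[j] / n
            num += w * O[i][j]
            den += w * expected
    return 1.0 - num / den

# Toy example with 4 ordinal classes
y_true = [0, 1, 2, 3, 2, 1]
y_pred = [0, 2, 2, 3, 1, 1]
mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
rmse = sqrt(mse)
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=4)
```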

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
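
With `lr_scheduler_type: linear`, the learning rate decays linearly from 2e-05 toward zero over the course of training. A minimal sketch of that schedule, assuming no warmup and a hypothetical `total_steps` (the real value depends on the dataset size and batch size above):

```python
def linear_lr(step, base_lr=2e-05, total_steps=10100, warmup_steps=0):
    """Linearly warm up (if any) then decay the learning rate to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```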

Training results

Validation metrics were computed every 2 steps; the training loss was logged only every 500 steps, which is why earlier rows show "No log".

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0198 2 4.0009 -0.0091 4.0009 2.0002
No log 0.0396 4 2.1903 0.0721 2.1903 1.4800
No log 0.0594 6 1.3125 0.0672 1.3125 1.1456
No log 0.0792 8 1.2610 0.1832 1.2610 1.1229
No log 0.0990 10 1.3852 0.0861 1.3852 1.1769
No log 0.1188 12 1.1289 0.1261 1.1289 1.0625
No log 0.1386 14 1.1519 0.0941 1.1519 1.0732
No log 0.1584 16 1.2868 -0.1419 1.2868 1.1344
No log 0.1782 18 1.5415 -0.0444 1.5415 1.2416
No log 0.1980 20 1.3838 -0.0858 1.3838 1.1763
No log 0.2178 22 1.2170 0.0281 1.2170 1.1032
No log 0.2376 24 1.1489 0.1767 1.1489 1.0719
No log 0.2574 26 1.1329 0.1671 1.1329 1.0644
No log 0.2772 28 1.1363 0.1046 1.1363 1.0660
No log 0.2970 30 1.1881 0.0888 1.1881 1.0900
No log 0.3168 32 1.2206 0.0445 1.2206 1.1048
No log 0.3366 34 1.1918 0.0445 1.1918 1.0917
No log 0.3564 36 1.1295 0.2081 1.1295 1.0628
No log 0.3762 38 1.1050 0.1532 1.1050 1.0512
No log 0.3960 40 1.0940 0.2316 1.0940 1.0459
No log 0.4158 42 1.0965 0.2366 1.0965 1.0471
No log 0.4356 44 1.1843 0.0849 1.1843 1.0883
No log 0.4554 46 1.2023 0.0849 1.2023 1.0965
No log 0.4752 48 1.0677 0.1482 1.0677 1.0333
No log 0.4950 50 1.0114 0.2771 1.0114 1.0057
No log 0.5149 52 1.0826 0.2192 1.0826 1.0405
No log 0.5347 54 1.0416 0.2678 1.0416 1.0206
No log 0.5545 56 0.9316 0.3151 0.9316 0.9652
No log 0.5743 58 1.0042 0.2588 1.0042 1.0021
No log 0.5941 60 1.0554 0.2659 1.0554 1.0273
No log 0.6139 62 1.0204 0.2513 1.0204 1.0102
No log 0.6337 64 0.9734 0.2611 0.9734 0.9866
No log 0.6535 66 0.9347 0.3856 0.9347 0.9668
No log 0.6733 68 0.9175 0.1883 0.9175 0.9579
No log 0.6931 70 0.9454 0.2865 0.9454 0.9723
No log 0.7129 72 0.9392 0.2865 0.9392 0.9691
No log 0.7327 74 0.9097 0.2674 0.9097 0.9538
No log 0.7525 76 0.8553 0.3393 0.8553 0.9248
No log 0.7723 78 0.8351 0.3942 0.8351 0.9139
No log 0.7921 80 0.8516 0.4473 0.8516 0.9228
No log 0.8119 82 0.9186 0.3062 0.9186 0.9585
No log 0.8317 84 0.9518 0.3310 0.9518 0.9756
No log 0.8515 86 0.8994 0.3223 0.8994 0.9484
No log 0.8713 88 0.8467 0.3467 0.8467 0.9202
No log 0.8911 90 0.8280 0.4388 0.8280 0.9099
No log 0.9109 92 0.9591 0.3672 0.9591 0.9793
No log 0.9307 94 1.1886 0.2555 1.1886 1.0902
No log 0.9505 96 1.2278 0.1911 1.2278 1.1081
No log 0.9703 98 1.0654 0.0011 1.0654 1.0322
No log 0.9901 100 0.9096 0.3862 0.9096 0.9537
No log 1.0099 102 0.8361 0.4571 0.8361 0.9144
No log 1.0297 104 0.8984 0.2981 0.8984 0.9478
No log 1.0495 106 1.0921 0.2934 1.0921 1.0450
No log 1.0693 108 1.0997 0.3190 1.0997 1.0487
No log 1.0891 110 1.1994 0.3346 1.1994 1.0952
No log 1.1089 112 1.1555 0.3897 1.1555 1.0750
No log 1.1287 114 1.0354 0.4100 1.0354 1.0176
No log 1.1485 116 1.0943 0.4220 1.0943 1.0461
No log 1.1683 118 1.3350 0.2386 1.3350 1.1554
No log 1.1881 120 1.4780 0.1955 1.4780 1.2157
No log 1.2079 122 1.3022 0.1790 1.3022 1.1411
No log 1.2277 124 1.0711 0.2960 1.0711 1.0349
No log 1.2475 126 1.0712 0.3135 1.0712 1.0350
No log 1.2673 128 1.1122 0.3527 1.1122 1.0546
No log 1.2871 130 1.0903 0.3527 1.0903 1.0442
No log 1.3069 132 1.0229 0.3659 1.0229 1.0114
No log 1.3267 134 0.9299 0.4144 0.9299 0.9643
No log 1.3465 136 0.8419 0.4337 0.8419 0.9176
No log 1.3663 138 0.8390 0.4337 0.8390 0.9160
No log 1.3861 140 0.9330 0.4411 0.9330 0.9659
No log 1.4059 142 1.0496 0.3681 1.0496 1.0245
No log 1.4257 144 0.9545 0.4318 0.9545 0.9770
No log 1.4455 146 0.8454 0.4345 0.8454 0.9194
No log 1.4653 148 0.8393 0.4371 0.8393 0.9161
No log 1.4851 150 0.8292 0.4613 0.8292 0.9106
No log 1.5050 152 0.8689 0.5192 0.8689 0.9321
No log 1.5248 154 0.9179 0.4815 0.9179 0.9580
No log 1.5446 156 0.8928 0.5160 0.8928 0.9449
No log 1.5644 158 0.8924 0.5319 0.8924 0.9447
No log 1.5842 160 0.8452 0.4966 0.8452 0.9193
No log 1.6040 162 0.8109 0.4244 0.8109 0.9005
No log 1.6238 164 0.8103 0.4628 0.8103 0.9002
No log 1.6436 166 0.8254 0.4398 0.8254 0.9085
No log 1.6634 168 0.8029 0.4762 0.8029 0.8961
No log 1.6832 170 0.8558 0.4198 0.8558 0.9251
No log 1.7030 172 0.9267 0.4681 0.9267 0.9626
No log 1.7228 174 0.8400 0.3939 0.8400 0.9165
No log 1.7426 176 0.7984 0.4473 0.7984 0.8935
No log 1.7624 178 0.7788 0.4234 0.7788 0.8825
No log 1.7822 180 0.7867 0.4595 0.7867 0.8870
No log 1.8020 182 0.8710 0.4334 0.8710 0.9333
No log 1.8218 184 0.8358 0.4568 0.8358 0.9142
No log 1.8416 186 0.8159 0.4455 0.8159 0.9033
No log 1.8614 188 0.7616 0.4858 0.7616 0.8727
No log 1.8812 190 0.7716 0.4862 0.7716 0.8784
No log 1.9010 192 0.8682 0.4423 0.8682 0.9318
No log 1.9208 194 1.1088 0.3590 1.1088 1.0530
No log 1.9406 196 1.1006 0.3913 1.1006 1.0491
No log 1.9604 198 0.8815 0.4326 0.8815 0.9389
No log 1.9802 200 0.8147 0.3976 0.8147 0.9026
No log 2.0 202 0.8332 0.4075 0.8332 0.9128
No log 2.0198 204 0.9550 0.3519 0.9550 0.9772
No log 2.0396 206 0.9918 0.3492 0.9918 0.9959
No log 2.0594 208 0.8777 0.3989 0.8777 0.9369
No log 2.0792 210 0.8476 0.3435 0.8476 0.9206
No log 2.0990 212 0.8707 0.3037 0.8707 0.9331
No log 2.1188 214 0.8393 0.3836 0.8393 0.9161
No log 2.1386 216 0.8807 0.3883 0.8807 0.9384
No log 2.1584 218 0.9319 0.4777 0.9319 0.9653
No log 2.1782 220 0.8632 0.4681 0.8632 0.9291
No log 2.1980 222 0.8364 0.4342 0.8364 0.9145
No log 2.2178 224 0.7873 0.4471 0.7873 0.8873
No log 2.2376 226 0.8078 0.4450 0.8078 0.8988
No log 2.2574 228 0.8617 0.4434 0.8617 0.9283
No log 2.2772 230 0.7751 0.3939 0.7751 0.8804
No log 2.2970 232 0.7114 0.4355 0.7114 0.8435
No log 2.3168 234 0.7102 0.4498 0.7102 0.8427
No log 2.3366 236 0.7044 0.4498 0.7044 0.8393
No log 2.3564 238 0.7197 0.4468 0.7197 0.8484
No log 2.3762 240 0.7584 0.3760 0.7584 0.8709
No log 2.3960 242 0.7602 0.3760 0.7602 0.8719
No log 2.4158 244 0.7340 0.4355 0.7340 0.8567
No log 2.4356 246 0.7403 0.5328 0.7403 0.8604
No log 2.4554 248 0.7497 0.4870 0.7497 0.8658
No log 2.4752 250 0.7128 0.5638 0.7128 0.8443
No log 2.4950 252 0.7096 0.5357 0.7096 0.8424
No log 2.5149 254 0.7306 0.5677 0.7306 0.8547
No log 2.5347 256 0.6922 0.5480 0.6922 0.8320
No log 2.5545 258 0.6905 0.5287 0.6905 0.8309
No log 2.5743 260 0.7023 0.4645 0.7023 0.8380
No log 2.5941 262 0.7505 0.4247 0.7505 0.8663
No log 2.6139 264 0.7759 0.4696 0.7759 0.8808
No log 2.6337 266 0.8223 0.5013 0.8223 0.9068
No log 2.6535 268 0.8281 0.5013 0.8281 0.9100
No log 2.6733 270 0.9107 0.4885 0.9107 0.9543
No log 2.6931 272 0.8304 0.4681 0.8304 0.9113
No log 2.7129 274 0.7219 0.3821 0.7219 0.8496
No log 2.7327 276 0.6853 0.4748 0.6853 0.8278
No log 2.7525 278 0.7006 0.3706 0.7006 0.8370
No log 2.7723 280 0.8327 0.5111 0.8327 0.9125
No log 2.7921 282 0.8741 0.5086 0.8741 0.9350
No log 2.8119 284 0.7684 0.5431 0.7684 0.8766
No log 2.8317 286 0.7590 0.4460 0.7590 0.8712
No log 2.8515 288 0.8575 0.5196 0.8575 0.9260
No log 2.8713 290 0.8574 0.4785 0.8574 0.9260
No log 2.8911 292 0.7430 0.4089 0.7430 0.8620
No log 2.9109 294 0.7114 0.4988 0.7114 0.8434
No log 2.9307 296 0.7549 0.4214 0.7549 0.8688
No log 2.9505 298 0.8837 0.4885 0.8837 0.9401
No log 2.9703 300 0.8432 0.5111 0.8432 0.9183
No log 2.9901 302 0.7306 0.4343 0.7306 0.8547
No log 3.0099 304 0.7110 0.5220 0.7110 0.8432
No log 3.0297 306 0.7305 0.4727 0.7305 0.8547
No log 3.0495 308 0.7551 0.4343 0.7551 0.8690
No log 3.0693 310 0.7558 0.4455 0.7558 0.8694
No log 3.0891 312 0.7241 0.4106 0.7241 0.8509
No log 3.1089 314 0.6898 0.4241 0.6898 0.8305
No log 3.1287 316 0.7099 0.4106 0.7099 0.8425
No log 3.1485 318 0.7426 0.4089 0.7426 0.8617
No log 3.1683 320 0.7343 0.4220 0.7343 0.8569
No log 3.1881 322 0.6972 0.4106 0.6972 0.8350
No log 3.2079 324 0.7251 0.4106 0.7251 0.8515
No log 3.2277 326 0.8208 0.5027 0.8208 0.9060
No log 3.2475 328 0.8074 0.5027 0.8074 0.8986
No log 3.2673 330 0.7140 0.4106 0.7140 0.8450
No log 3.2871 332 0.6932 0.4789 0.6932 0.8326
No log 3.3069 334 0.6935 0.5373 0.6935 0.8328
No log 3.3267 336 0.7140 0.4974 0.7140 0.8450
No log 3.3465 338 0.8434 0.5 0.8434 0.9183
No log 3.3663 340 1.0270 0.4226 1.0270 1.0134
No log 3.3861 342 1.0126 0.4540 1.0126 1.0063
No log 3.4059 344 0.8616 0.4898 0.8616 0.9282
No log 3.4257 346 0.7890 0.3939 0.7890 0.8882
No log 3.4455 348 0.7722 0.4071 0.7722 0.8787
No log 3.4653 350 0.8096 0.4889 0.8096 0.8998
No log 3.4851 352 0.9005 0.4885 0.9005 0.9489
No log 3.5050 354 0.8989 0.4667 0.8989 0.9481
No log 3.5248 356 0.7967 0.5007 0.7967 0.8926
No log 3.5446 358 0.7110 0.5062 0.7110 0.8432
No log 3.5644 360 0.7078 0.4093 0.7078 0.8413
No log 3.5842 362 0.7439 0.4310 0.7439 0.8625
No log 3.6040 364 0.8158 0.4898 0.8158 0.9032
No log 3.6238 366 0.7787 0.4792 0.7787 0.8824
No log 3.6436 368 0.6865 0.4301 0.6865 0.8285
No log 3.6634 370 0.6362 0.4625 0.6362 0.7976
No log 3.6832 372 0.6373 0.5402 0.6373 0.7983
No log 3.7030 374 0.6813 0.5070 0.6813 0.8254
No log 3.7228 376 0.7843 0.4994 0.7843 0.8856
No log 3.7426 378 0.8111 0.4885 0.8111 0.9006
No log 3.7624 380 0.7261 0.5292 0.7261 0.8521
No log 3.7822 382 0.7049 0.4336 0.7049 0.8396
No log 3.8020 384 0.7162 0.4336 0.7162 0.8463
No log 3.8218 386 0.6926 0.4336 0.6926 0.8322
No log 3.8416 388 0.6655 0.4730 0.6655 0.8158
No log 3.8614 390 0.6506 0.4730 0.6506 0.8066
No log 3.8812 392 0.6432 0.4776 0.6432 0.8020
No log 3.9010 394 0.6746 0.4490 0.6746 0.8213
No log 3.9208 396 0.7129 0.4616 0.7129 0.8443
No log 3.9406 398 0.7236 0.4411 0.7236 0.8507
No log 3.9604 400 0.7472 0.4264 0.7472 0.8644
No log 3.9802 402 0.7541 0.4010 0.7541 0.8684
No log 4.0 404 0.7287 0.4014 0.7287 0.8537
No log 4.0198 406 0.7168 0.4405 0.7168 0.8466
No log 4.0396 408 0.7370 0.4847 0.7370 0.8585
No log 4.0594 410 0.7512 0.5292 0.7512 0.8667
No log 4.0792 412 0.7458 0.5292 0.7458 0.8636
No log 4.0990 414 0.7530 0.5255 0.7530 0.8677
No log 4.1188 416 0.8489 0.5098 0.8489 0.9214
No log 4.1386 418 0.8455 0.5098 0.8455 0.9195
No log 4.1584 420 0.9500 0.5283 0.9500 0.9747
No log 4.1782 422 0.9076 0.5394 0.9076 0.9527
No log 4.1980 424 0.7689 0.5219 0.7689 0.8769
No log 4.2178 426 0.6876 0.4630 0.6876 0.8292
No log 4.2376 428 0.7002 0.5186 0.7002 0.8368
No log 4.2574 430 0.7874 0.5447 0.7874 0.8874
No log 4.2772 432 0.8222 0.5447 0.8222 0.9067
No log 4.2970 434 0.7680 0.5035 0.7680 0.8763
No log 4.3168 436 0.7151 0.4082 0.7151 0.8457
No log 4.3366 438 0.7197 0.4373 0.7197 0.8483
No log 4.3564 440 0.7345 0.3652 0.7345 0.8570
No log 4.3762 442 0.7971 0.4928 0.7971 0.8928
No log 4.3960 444 0.9100 0.5240 0.9100 0.9539
No log 4.4158 446 0.8876 0.5134 0.8876 0.9421
No log 4.4356 448 0.8312 0.4327 0.8312 0.9117
No log 4.4554 450 0.8607 0.4203 0.8607 0.9278
No log 4.4752 452 0.8513 0.4584 0.8513 0.9227
No log 4.4950 454 0.8350 0.4581 0.8350 0.9138
No log 4.5149 456 0.8262 0.4824 0.8262 0.9090
No log 4.5347 458 0.7859 0.4836 0.7859 0.8865
No log 4.5545 460 0.7778 0.5150 0.7778 0.8819
No log 4.5743 462 0.7010 0.4613 0.7010 0.8373
No log 4.5941 464 0.6772 0.5002 0.6772 0.8229
No log 4.6139 466 0.7143 0.5618 0.7143 0.8452
No log 4.6337 468 0.8196 0.5356 0.8196 0.9053
No log 4.6535 470 0.8992 0.4667 0.8992 0.9483
No log 4.6733 472 0.9110 0.4667 0.9110 0.9544
No log 4.6931 474 0.8517 0.4885 0.8517 0.9229
No log 4.7129 476 0.7685 0.5378 0.7685 0.8767
No log 4.7327 478 0.7298 0.4768 0.7298 0.8543
No log 4.7525 480 0.7260 0.4660 0.7260 0.8520
No log 4.7723 482 0.7460 0.4168 0.7460 0.8637
No log 4.7921 484 0.7676 0.4186 0.7676 0.8762
No log 4.8119 486 0.8043 0.4893 0.8043 0.8968
No log 4.8317 488 0.7834 0.5134 0.7834 0.8851
No log 4.8515 490 0.7291 0.5407 0.7291 0.8539
No log 4.8713 492 0.7182 0.5407 0.7182 0.8475
No log 4.8911 494 0.7212 0.5291 0.7212 0.8492
No log 4.9109 496 0.8045 0.5229 0.8045 0.8970
No log 4.9307 498 0.8336 0.5227 0.8336 0.9130
0.3485 4.9505 500 0.7940 0.5362 0.7940 0.8910
0.3485 4.9703 502 0.7613 0.5362 0.7613 0.8725
0.3485 4.9901 504 0.7045 0.5475 0.7045 0.8394
0.3485 5.0099 506 0.6184 0.5856 0.6184 0.7864
0.3485 5.0297 508 0.6174 0.6195 0.6174 0.7858
0.3485 5.0495 510 0.6098 0.6316 0.6098 0.7809
0.3485 5.0693 512 0.6254 0.5752 0.6254 0.7908
0.3485 5.0891 514 0.7585 0.5844 0.7585 0.8709
0.3485 5.1089 516 0.8167 0.5755 0.8167 0.9037
0.3485 5.1287 518 0.8231 0.5755 0.8231 0.9073
0.3485 5.1485 520 0.7708 0.5556 0.7708 0.8779
0.3485 5.1683 522 0.6796 0.5035 0.6796 0.8244
0.3485 5.1881 524 0.6479 0.5212 0.6479 0.8049
0.3485 5.2079 526 0.6721 0.5291 0.6721 0.8198
0.3485 5.2277 528 0.7648 0.5229 0.7648 0.8745
0.3485 5.2475 530 0.8349 0.5229 0.8349 0.9137
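
In every row the Validation Loss equals the Mse, which suggests the model was trained as a regressor with a mean-squared-error objective; the Rmse column is simply its square root. A quick check on the final evaluation row:

```python
from math import sqrt, isclose

mse = 0.8349      # final-row Mse (equal to the Validation Loss)
rmse = sqrt(mse)  # matches the reported Rmse of 0.9137
assert isclose(rmse, 0.9137, abs_tol=5e-5)
```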

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
