ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k20_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6245
  • Qwk (Quadratic Weighted Kappa): 0.4652
  • Mse (Mean Squared Error): 0.6245
  • Rmse (Root Mean Squared Error): 0.7902
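The metrics above are related: the reported loss equals the MSE (suggesting a regression head trained with a mean-squared-error objective, though the card does not state this explicitly), and RMSE is its square root. A minimal plain-Python sketch of how these metrics can be computed — this is illustrative, not the card's actual evaluation code:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Agreement between two lists of integer labels, penalizing
    disagreements by squared distance on the label scale."""
    n = len(y_true)
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true = [y_true.count(k) for k in range(n_classes)]
    hist_pred = [y_pred.count(k) for k in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            num += w * observed[i][j]                    # observed disagreement
            den += w * hist_true[i] * hist_pred[j] / n   # chance disagreement
    return 1.0 - num / den

def mse_and_rmse(y_true, y_pred):
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)

# The reported RMSE is consistent with the reported MSE (up to rounding):
print(math.sqrt(0.6245))  # close to the reported 0.7902
```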

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
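With a linear scheduler and no warmup listed, the learning rate decays from 2e-05 toward 0 over the full run. A sketch of the decay rule, assuming zero warmup steps and roughly 101 optimizer steps per epoch × 100 epochs (both inferred from the training log, not stated explicitly):

```python
def linear_lr(step, base_lr=2e-05, warmup_steps=0, total_steps=101 * 100):
    """Learning rate at a given optimizer step under a linear schedule:
    ramp up over warmup_steps, then decay linearly to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

print(linear_lr(0))     # 2e-05 at the start
print(linear_lr(5050))  # 1e-05 halfway through the schedule
```

In the Transformers Trainer this behavior corresponds to `lr_scheduler_type: linear` with `warmup_steps=0`.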

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0198 2 2.5599 -0.0924 2.5599 1.6000
No log 0.0396 4 1.2536 0.0986 1.2536 1.1196
No log 0.0594 6 0.9178 0.0421 0.9178 0.9580
No log 0.0792 8 0.8143 0.1775 0.8143 0.9024
No log 0.0990 10 0.7493 0.3158 0.7493 0.8656
No log 0.1188 12 0.6974 0.4253 0.6974 0.8351
No log 0.1386 14 0.6983 0.3664 0.6983 0.8356
No log 0.1584 16 0.8826 0.1882 0.8826 0.9394
No log 0.1782 18 0.7464 0.2772 0.7464 0.8639
No log 0.1980 20 0.6547 0.2847 0.6547 0.8092
No log 0.2178 22 0.6475 0.2878 0.6475 0.8047
No log 0.2376 24 0.6780 0.2776 0.6780 0.8234
No log 0.2574 26 1.1213 0.1775 1.1213 1.0589
No log 0.2772 28 1.4274 0.0578 1.4274 1.1948
No log 0.2970 30 1.3032 0.1453 1.3032 1.1416
No log 0.3168 32 0.9787 0.2702 0.9787 0.9893
No log 0.3366 34 0.6738 0.3348 0.6738 0.8208
No log 0.3564 36 0.6622 0.1846 0.6622 0.8138
No log 0.3762 38 0.6597 0.1962 0.6597 0.8122
No log 0.3960 40 0.6846 0.0937 0.6846 0.8274
No log 0.4158 42 0.6999 0.0481 0.6999 0.8366
No log 0.4356 44 0.6615 0.0798 0.6615 0.8133
No log 0.4554 46 0.6635 0.0757 0.6635 0.8146
No log 0.4752 48 0.6688 0.1138 0.6688 0.8178
No log 0.4950 50 0.6651 0.1489 0.6651 0.8156
No log 0.5149 52 0.6819 0.2454 0.6819 0.8257
No log 0.5347 54 0.7275 0.2673 0.7275 0.8530
No log 0.5545 56 0.8758 0.2817 0.8758 0.9359
No log 0.5743 58 0.8863 0.3027 0.8863 0.9414
No log 0.5941 60 0.6852 0.3360 0.6852 0.8278
No log 0.6139 62 0.7091 0.4175 0.7091 0.8421
No log 0.6337 64 1.1026 0.2589 1.1026 1.0500
No log 0.6535 66 1.1815 0.2199 1.1815 1.0869
No log 0.6733 68 0.9741 0.2732 0.9741 0.9870
No log 0.6931 70 0.7118 0.1790 0.7118 0.8437
No log 0.7129 72 0.7972 0.2804 0.7972 0.8929
No log 0.7327 74 0.8775 0.3354 0.8775 0.9368
No log 0.7525 76 0.8050 0.2841 0.8050 0.8972
No log 0.7723 78 0.8958 0.3085 0.8958 0.9465
No log 0.7921 80 1.0926 0.2559 1.0926 1.0453
No log 0.8119 82 1.2862 0.1448 1.2862 1.1341
No log 0.8317 84 1.1744 0.1935 1.1744 1.0837
No log 0.8515 86 0.9129 0.2437 0.9129 0.9554
No log 0.8713 88 0.8224 0.2942 0.8224 0.9068
No log 0.8911 90 0.7935 0.3022 0.7935 0.8908
No log 0.9109 92 0.8176 0.2703 0.8176 0.9042
No log 0.9307 94 0.8105 0.2555 0.8105 0.9003
No log 0.9505 96 0.7608 0.2867 0.7608 0.8722
No log 0.9703 98 0.7437 0.3964 0.7437 0.8624
No log 0.9901 100 0.7593 0.3607 0.7593 0.8714
No log 1.0099 102 0.8683 0.2920 0.8683 0.9318
No log 1.0297 104 0.9349 0.1935 0.9349 0.9669
No log 1.0495 106 0.8419 0.4169 0.8419 0.9176
No log 1.0693 108 0.8010 0.3350 0.8010 0.8950
No log 1.0891 110 0.8084 0.2888 0.8084 0.8991
No log 1.1089 112 0.8820 0.2832 0.8820 0.9392
No log 1.1287 114 0.8200 0.2975 0.8200 0.9055
No log 1.1485 116 0.7221 0.4036 0.7221 0.8498
No log 1.1683 118 0.7081 0.4158 0.7081 0.8415
No log 1.1881 120 0.7050 0.4146 0.7050 0.8396
No log 1.2079 122 0.6717 0.5118 0.6717 0.8196
No log 1.2277 124 0.7140 0.4253 0.7140 0.8450
No log 1.2475 126 0.7297 0.4224 0.7297 0.8542
No log 1.2673 128 0.6590 0.4406 0.6590 0.8118
No log 1.2871 130 0.6341 0.4808 0.6341 0.7963
No log 1.3069 132 0.7003 0.3770 0.7003 0.8368
No log 1.3267 134 0.6486 0.3843 0.6486 0.8054
No log 1.3465 136 0.6408 0.4096 0.6408 0.8005
No log 1.3663 138 0.6590 0.4118 0.6590 0.8118
No log 1.3861 140 0.6511 0.4066 0.6511 0.8069
No log 1.4059 142 0.6281 0.5444 0.6281 0.7925
No log 1.4257 144 0.6935 0.4574 0.6935 0.8328
No log 1.4455 146 0.6294 0.5225 0.6294 0.7933
No log 1.4653 148 0.8079 0.4096 0.8079 0.8988
No log 1.4851 150 0.9983 0.2949 0.9983 0.9992
No log 1.5050 152 0.8827 0.3389 0.8827 0.9395
No log 1.5248 154 0.6433 0.4518 0.6433 0.8021
No log 1.5446 156 0.6052 0.5357 0.6052 0.7779
No log 1.5644 158 0.6072 0.6271 0.6072 0.7792
No log 1.5842 160 0.7019 0.3718 0.7019 0.8378
No log 1.6040 162 0.9194 0.3620 0.9194 0.9588
No log 1.6238 164 1.0232 0.2199 1.0232 1.0116
No log 1.6436 166 0.8720 0.3368 0.8720 0.9338
No log 1.6634 168 0.6216 0.5014 0.6216 0.7884
No log 1.6832 170 0.5843 0.5672 0.5843 0.7644
No log 1.7030 172 0.6244 0.5058 0.6244 0.7902
No log 1.7228 174 0.6361 0.4911 0.6361 0.7976
No log 1.7426 176 0.6220 0.4622 0.6220 0.7887
No log 1.7624 178 0.6262 0.4692 0.6262 0.7913
No log 1.7822 180 0.6182 0.4692 0.6182 0.7862
No log 1.8020 182 0.6940 0.4670 0.6940 0.8331
No log 1.8218 184 0.7109 0.4670 0.7109 0.8431
No log 1.8416 186 0.6191 0.4562 0.6191 0.7868
No log 1.8614 188 0.5927 0.4534 0.5927 0.7699
No log 1.8812 190 0.6411 0.4218 0.6411 0.8007
No log 1.9010 192 0.6305 0.4165 0.6305 0.7941
No log 1.9208 194 0.6012 0.5324 0.6012 0.7754
No log 1.9406 196 0.5829 0.5003 0.5829 0.7635
No log 1.9604 198 0.5954 0.5308 0.5954 0.7716
No log 1.9802 200 0.5914 0.5308 0.5914 0.7690
No log 2.0000 202 0.5732 0.5450 0.5732 0.7571
No log 2.0198 204 0.5788 0.5655 0.5788 0.7608
No log 2.0396 206 0.5841 0.6046 0.5841 0.7642
No log 2.0594 208 0.5915 0.5628 0.5915 0.7691
No log 2.0792 210 0.5947 0.5434 0.5947 0.7711
No log 2.0990 212 0.5825 0.5798 0.5825 0.7632
No log 2.1188 214 0.5751 0.4948 0.5751 0.7583
No log 2.1386 216 0.5765 0.5533 0.5765 0.7593
No log 2.1584 218 0.6495 0.4409 0.6495 0.8059
No log 2.1782 220 0.6111 0.4753 0.6111 0.7817
No log 2.1980 222 0.5854 0.5160 0.5854 0.7651
No log 2.2178 224 0.6769 0.3551 0.6769 0.8227
No log 2.2376 226 0.6394 0.3783 0.6394 0.7996
No log 2.2574 228 0.6264 0.4935 0.6264 0.7915
No log 2.2772 230 0.5607 0.4734 0.5607 0.7488
No log 2.2970 232 0.5362 0.5286 0.5362 0.7323
No log 2.3168 234 0.5537 0.5250 0.5537 0.7441
No log 2.3366 236 0.6731 0.4142 0.6731 0.8205
No log 2.3564 238 0.7709 0.3579 0.7709 0.8780
No log 2.3762 240 0.9463 0.3747 0.9463 0.9728
No log 2.3960 242 0.8771 0.3942 0.8771 0.9366
No log 2.4158 244 0.7525 0.4250 0.7525 0.8675
No log 2.4356 246 0.6154 0.4518 0.6154 0.7845
No log 2.4554 248 0.5678 0.5252 0.5678 0.7535
No log 2.4752 250 0.5527 0.5609 0.5527 0.7434
No log 2.4950 252 0.5515 0.5517 0.5515 0.7426
No log 2.5149 254 0.5792 0.5016 0.5792 0.7611
No log 2.5347 256 0.6850 0.4175 0.6850 0.8277
No log 2.5545 258 0.8241 0.3970 0.8241 0.9078
No log 2.5743 260 0.7292 0.4382 0.7292 0.8539
No log 2.5941 262 0.5560 0.5671 0.5560 0.7456
No log 2.6139 264 0.5546 0.4674 0.5546 0.7447
No log 2.6337 266 0.5796 0.4789 0.5796 0.7613
No log 2.6535 268 0.5801 0.4694 0.5801 0.7617
No log 2.6733 270 0.5538 0.5009 0.5538 0.7442
No log 2.6931 272 0.5485 0.5304 0.5485 0.7406
No log 2.7129 274 0.5375 0.5231 0.5375 0.7331
No log 2.7327 276 0.5426 0.5248 0.5426 0.7366
No log 2.7525 278 0.5888 0.4690 0.5888 0.7674
No log 2.7723 280 0.5986 0.4690 0.5986 0.7737
No log 2.7921 282 0.5653 0.5751 0.5653 0.7518
No log 2.8119 284 0.5568 0.5405 0.5568 0.7462
No log 2.8317 286 0.5706 0.5479 0.5706 0.7554
No log 2.8515 288 0.5680 0.5697 0.5680 0.7537
No log 2.8713 290 0.5572 0.5517 0.5572 0.7465
No log 2.8911 292 0.5816 0.5452 0.5816 0.7627
No log 2.9109 294 0.5429 0.5271 0.5429 0.7368
No log 2.9307 296 0.5134 0.5440 0.5134 0.7165
No log 2.9505 298 0.6187 0.5065 0.6187 0.7866
No log 2.9703 300 0.6982 0.3933 0.6982 0.8356
No log 2.9901 302 0.6415 0.4669 0.6415 0.8010
No log 3.0099 304 0.5315 0.4444 0.5315 0.7290
No log 3.0297 306 0.5237 0.5521 0.5237 0.7237
No log 3.0495 308 0.5336 0.5267 0.5336 0.7305
No log 3.0693 310 0.5541 0.4634 0.5541 0.7444
No log 3.0891 312 0.6404 0.4825 0.6404 0.8002
No log 3.1089 314 0.6550 0.4410 0.6550 0.8093
No log 3.1287 316 0.5981 0.3982 0.5981 0.7734
No log 3.1485 318 0.5698 0.4136 0.5698 0.7548
No log 3.1683 320 0.5785 0.4136 0.5785 0.7606
No log 3.1881 322 0.5892 0.4705 0.5892 0.7676
No log 3.2079 324 0.5863 0.5214 0.5863 0.7657
No log 3.2277 326 0.5797 0.5151 0.5797 0.7614
No log 3.2475 328 0.5932 0.4299 0.5932 0.7702
No log 3.2673 330 0.6439 0.4556 0.6439 0.8024
No log 3.2871 332 0.6486 0.4761 0.6486 0.8053
No log 3.3069 334 0.5696 0.5234 0.5696 0.7547
No log 3.3267 336 0.5331 0.6542 0.5331 0.7302
No log 3.3465 338 0.5294 0.6201 0.5294 0.7276
No log 3.3663 340 0.5247 0.6303 0.5247 0.7243
No log 3.3861 342 0.5445 0.5812 0.5445 0.7379
No log 3.4059 344 0.6109 0.4892 0.6109 0.7816
No log 3.4257 346 0.7319 0.4615 0.7319 0.8555
No log 3.4455 348 0.7520 0.4615 0.7520 0.8672
No log 3.4653 350 0.6767 0.4448 0.6767 0.8226
No log 3.4851 352 0.6006 0.4271 0.6006 0.7750
No log 3.5050 354 0.5591 0.2955 0.5591 0.7477
No log 3.5248 356 0.5617 0.3809 0.5617 0.7494
No log 3.5446 358 0.5534 0.3809 0.5534 0.7439
No log 3.5644 360 0.5564 0.4186 0.5564 0.7459
No log 3.5842 362 0.5546 0.4186 0.5546 0.7447
No log 3.6040 364 0.5335 0.4448 0.5335 0.7304
No log 3.6238 366 0.5112 0.4614 0.5112 0.7150
No log 3.6436 368 0.5467 0.4052 0.5467 0.7394
No log 3.6634 370 0.5606 0.4307 0.5606 0.7488
No log 3.6832 372 0.5104 0.5111 0.5104 0.7144
No log 3.7030 374 0.4850 0.5493 0.4850 0.6964
No log 3.7228 376 0.5292 0.5065 0.5292 0.7275
No log 3.7426 378 0.5255 0.5061 0.5255 0.7249
No log 3.7624 380 0.4865 0.5523 0.4865 0.6975
No log 3.7822 382 0.4661 0.6542 0.4661 0.6827
No log 3.8020 384 0.5367 0.5345 0.5367 0.7326
No log 3.8218 386 0.5557 0.5345 0.5557 0.7454
No log 3.8416 388 0.5012 0.5995 0.5012 0.7079
No log 3.8614 390 0.5295 0.5234 0.5295 0.7277
No log 3.8812 392 0.6392 0.4594 0.6392 0.7995
No log 3.9010 394 0.7132 0.3913 0.7132 0.8445
No log 3.9208 396 0.6608 0.4100 0.6608 0.8129
No log 3.9406 398 0.5836 0.3958 0.5836 0.7639
No log 3.9604 400 0.5755 0.4707 0.5755 0.7586
No log 3.9802 402 0.6042 0.4905 0.6042 0.7773
No log 4.0000 404 0.5781 0.4663 0.5781 0.7603
No log 4.0198 406 0.5408 0.4276 0.5408 0.7354
No log 4.0396 408 0.6220 0.4909 0.6220 0.7886
No log 4.0594 410 0.7011 0.4142 0.7011 0.8373
No log 4.0792 412 0.7011 0.3978 0.7011 0.8373
No log 4.0990 414 0.6119 0.3999 0.6119 0.7822
No log 4.1188 416 0.5483 0.4044 0.5483 0.7405
No log 4.1386 418 0.5516 0.4707 0.5516 0.7427
No log 4.1584 420 0.5482 0.3474 0.5482 0.7404
No log 4.1782 422 0.5657 0.3541 0.5657 0.7521
No log 4.1980 424 0.6820 0.4142 0.6820 0.8258
No log 4.2178 426 0.8095 0.4222 0.8095 0.8997
No log 4.2376 428 0.7464 0.4396 0.7464 0.8640
No log 4.2574 430 0.6006 0.4391 0.6006 0.7750
No log 4.2772 432 0.5473 0.4945 0.5473 0.7398
No log 4.2970 434 0.5441 0.4595 0.5441 0.7376
No log 4.3168 436 0.5542 0.3754 0.5542 0.7445
No log 4.3366 438 0.5698 0.3445 0.5698 0.7548
No log 4.3564 440 0.5861 0.3258 0.5861 0.7655
No log 4.3762 442 0.6075 0.3718 0.6075 0.7794
No log 4.3960 444 0.6078 0.3718 0.6078 0.7796
No log 4.4158 446 0.5993 0.3446 0.5993 0.7742
No log 4.4356 448 0.5999 0.3446 0.5999 0.7746
No log 4.4554 450 0.6157 0.3093 0.6157 0.7847
No log 4.4752 452 0.6072 0.3137 0.6072 0.7792
No log 4.4950 454 0.5854 0.3151 0.5854 0.7651
No log 4.5149 456 0.5687 0.3803 0.5687 0.7541
No log 4.5347 458 0.5597 0.3289 0.5597 0.7481
No log 4.5545 460 0.5593 0.3729 0.5593 0.7479
No log 4.5743 462 0.5902 0.3808 0.5902 0.7682
No log 4.5941 464 0.6317 0.3875 0.6317 0.7948
No log 4.6139 466 0.6378 0.3875 0.6378 0.7986
No log 4.6337 468 0.5885 0.4221 0.5885 0.7671
No log 4.6535 470 0.5787 0.5460 0.5787 0.7607
No log 4.6733 472 0.5744 0.5846 0.5744 0.7579
No log 4.6931 474 0.5680 0.3950 0.5680 0.7536
No log 4.7129 476 0.5838 0.4315 0.5838 0.7640
No log 4.7327 478 0.5959 0.4087 0.5959 0.7719
No log 4.7525 480 0.5953 0.3610 0.5953 0.7716
No log 4.7723 482 0.6096 0.4215 0.6096 0.7808
No log 4.7921 484 0.5727 0.4249 0.5727 0.7568
No log 4.8119 486 0.5409 0.3703 0.5409 0.7354
No log 4.8317 488 0.5258 0.4747 0.5258 0.7251
No log 4.8515 490 0.5154 0.4984 0.5154 0.7179
No log 4.8713 492 0.5042 0.5195 0.5042 0.7101
No log 4.8911 494 0.5312 0.5212 0.5312 0.7288
No log 4.9109 496 0.5675 0.4690 0.5675 0.7533
No log 4.9307 498 0.6600 0.5205 0.6600 0.8124
0.3374 4.9505 500 0.6723 0.5205 0.6723 0.8199
0.3374 4.9703 502 0.6908 0.4961 0.6908 0.8312
0.3374 4.9901 504 0.5985 0.5003 0.5985 0.7736
0.3374 5.0099 506 0.5480 0.4892 0.5480 0.7403
0.3374 5.0297 508 0.5931 0.4634 0.5931 0.7701
0.3374 5.0495 510 0.6245 0.4652 0.6245 0.7902
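Training was configured for 100 epochs, but the log ends at epoch ~5.05 (step 510), whose metrics match the reported evaluation results; the lowest validation loss in the log (0.4661) occurs earlier, at epoch ~3.78. A small sketch of selecting the best checkpoint from (epoch, validation loss) pairs, using a few rows copied from the table:

```python
# (epoch, validation_loss) pairs copied from the training log above
log = [
    (2.2970, 0.5362),
    (3.7030, 0.4850),
    (3.7822, 0.4661),
    (4.8713, 0.5042),
    (5.0495, 0.6245),  # final logged step, matching the reported eval loss
]

# pick the checkpoint with the lowest validation loss
best_epoch, best_loss = min(log, key=lambda row: row[1])
print(best_epoch, best_loss)  # 3.7822 0.4661
```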

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k20_task7_organization: fine-tuned from aubmindlab/bert-base-arabertv02.