ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k15_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the auto-generated card does not name it). It achieves the following results on the evaluation set:

  • Loss: 0.4776
  • Qwk (quadratic weighted kappa): 0.4997
  • Mse (mean squared error): 0.4776
  • Rmse (root mean squared error): 0.6911
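
Note that the reported loss equals the Mse, which is consistent with a mean-squared-error training objective on numeric organization scores. Below is a minimal sketch of how these metrics could be reproduced with scikit-learn; the label and prediction arrays are hypothetical placeholders, not values from this model.

```python
# Sketch: reproducing Qwk / Mse / Rmse with scikit-learn.
# y_true and y_pred are hypothetical integer organization scores.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 1, 3])  # hypothetical gold scores
y_pred = np.array([3, 2, 3, 2, 4])  # hypothetical (rounded) model outputs

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = float(np.sqrt(mse))                                    # Rmse
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```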

Model description

More information needed

Intended uses & limitations

More information needed
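
Since no usage example is documented upstream, the following is a minimal, unverified inference sketch. It assumes the checkpoint carries a single-logit regression head (consistent with the MSE/RMSE evaluation metrics above); the head layout and the score scale are assumptions, not documented facts.

```python
# Hypothetical inference sketch -- assumes a single-logit regression head
# that scores the "organization" quality of an Arabic essay.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = ("MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_"
           "FineTuningAraBERT_run1_AugV5_k15_task7_organization")
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

# Placeholder Arabic essay text goes here.
inputs = tokenizer("نص المقال العربي هنا...", return_tensors="pt",
                   truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits
# Under the regression-head assumption, the raw logit is the predicted score.
print(logits.squeeze().item())
```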

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
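
For reference, here is a sketch of how the settings above map onto transformers.TrainingArguments. output_dir is a placeholder, and since the card lists no warmup or weight decay, none is set here.

```python
# Sketch: the listed hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",          # placeholder, not from the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```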

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0267 | 2 | 2.3972 | 0.0052 | 2.3972 | 1.5483 |
| No log | 0.0533 | 4 | 1.4952 | 0.1453 | 1.4952 | 1.2228 |
| No log | 0.0800 | 6 | 0.7615 | 0.1786 | 0.7615 | 0.8726 |
| No log | 0.1067 | 8 | 0.9268 | -0.0578 | 0.9268 | 0.9627 |
| No log | 0.1333 | 10 | 1.0856 | 0.1497 | 1.0856 | 1.0419 |
| No log | 0.1600 | 12 | 0.9720 | 0.2363 | 0.9720 | 0.9859 |
| No log | 0.1867 | 14 | 0.7444 | 0.1604 | 0.7444 | 0.8628 |
| No log | 0.2133 | 16 | 1.1475 | 0.1727 | 1.1475 | 1.0712 |
| No log | 0.2400 | 18 | 1.0237 | 0.2516 | 1.0237 | 1.0118 |
| No log | 0.2667 | 20 | 0.6609 | 0.1365 | 0.6609 | 0.8130 |
| No log | 0.2933 | 22 | 0.6996 | 0.5311 | 0.6996 | 0.8364 |
| No log | 0.3200 | 24 | 1.0021 | 0.2757 | 1.0021 | 1.0010 |
| No log | 0.3467 | 26 | 1.0773 | 0.2757 | 1.0773 | 1.0379 |
| No log | 0.3733 | 28 | 0.8681 | 0.3615 | 0.8681 | 0.9317 |
| No log | 0.4000 | 30 | 0.6617 | 0.3712 | 0.6617 | 0.8134 |
| No log | 0.4267 | 32 | 0.5734 | 0.3980 | 0.5734 | 0.7573 |
| No log | 0.4533 | 34 | 0.5486 | 0.2290 | 0.5486 | 0.7407 |
| No log | 0.4800 | 36 | 0.5311 | 0.3318 | 0.5311 | 0.7288 |
| No log | 0.5067 | 38 | 0.5766 | 0.4351 | 0.5766 | 0.7594 |
| No log | 0.5333 | 40 | 0.6746 | 0.4466 | 0.6746 | 0.8214 |
| No log | 0.5600 | 42 | 0.6452 | 0.5280 | 0.6452 | 0.8033 |
| No log | 0.5867 | 44 | 0.5237 | 0.6434 | 0.5237 | 0.7237 |
| No log | 0.6133 | 46 | 0.5394 | 0.6303 | 0.5394 | 0.7344 |
| No log | 0.6400 | 48 | 0.5891 | 0.5408 | 0.5891 | 0.7675 |
| No log | 0.6667 | 50 | 0.6271 | 0.5343 | 0.6271 | 0.7919 |
| No log | 0.6933 | 52 | 0.5115 | 0.5895 | 0.5115 | 0.7152 |
| No log | 0.7200 | 54 | 0.4658 | 0.6970 | 0.4658 | 0.6825 |
| No log | 0.7467 | 56 | 0.5924 | 0.5351 | 0.5924 | 0.7697 |
| No log | 0.7733 | 58 | 0.5874 | 0.4898 | 0.5874 | 0.7664 |
| No log | 0.8000 | 60 | 0.6669 | 0.4728 | 0.6669 | 0.8166 |
| No log | 0.8267 | 62 | 0.5769 | 0.4751 | 0.5769 | 0.7596 |
| No log | 0.8533 | 64 | 0.5023 | 0.6701 | 0.5023 | 0.7087 |
| No log | 0.8800 | 66 | 0.4773 | 0.6918 | 0.4773 | 0.6909 |
| No log | 0.9067 | 68 | 0.6577 | 0.5018 | 0.6577 | 0.8110 |
| No log | 0.9333 | 70 | 0.6941 | 0.4719 | 0.6941 | 0.8331 |
| No log | 0.9600 | 72 | 0.5173 | 0.5919 | 0.5173 | 0.7192 |
| No log | 0.9867 | 74 | 0.5202 | 0.7102 | 0.5202 | 0.7213 |
| No log | 1.0133 | 76 | 0.6504 | 0.6013 | 0.6504 | 0.8065 |
| No log | 1.0400 | 78 | 0.5448 | 0.6469 | 0.5448 | 0.7381 |
| No log | 1.0667 | 80 | 0.5431 | 0.5546 | 0.5431 | 0.7369 |
| No log | 1.0933 | 82 | 0.6751 | 0.4462 | 0.6751 | 0.8217 |
| No log | 1.1200 | 84 | 0.6324 | 0.4615 | 0.6324 | 0.7952 |
| No log | 1.1467 | 86 | 0.4483 | 0.6933 | 0.4483 | 0.6696 |
| No log | 1.1733 | 88 | 0.4508 | 0.6434 | 0.4508 | 0.6714 |
| No log | 1.2000 | 90 | 0.4480 | 0.6817 | 0.4480 | 0.6693 |
| No log | 1.2267 | 92 | 0.4666 | 0.6773 | 0.4666 | 0.6831 |
| No log | 1.2533 | 94 | 0.5055 | 0.6712 | 0.5055 | 0.7110 |
| No log | 1.2800 | 96 | 0.4883 | 0.6343 | 0.4883 | 0.6988 |
| No log | 1.3067 | 98 | 0.4919 | 0.6676 | 0.4919 | 0.7013 |
| No log | 1.3333 | 100 | 0.6083 | 0.5254 | 0.6083 | 0.7799 |
| No log | 1.3600 | 102 | 0.6667 | 0.4733 | 0.6667 | 0.8165 |
| No log | 1.3867 | 104 | 0.5513 | 0.6630 | 0.5513 | 0.7425 |
| No log | 1.4133 | 106 | 0.5425 | 0.6632 | 0.5425 | 0.7366 |
| No log | 1.4400 | 108 | 0.6629 | 0.5914 | 0.6629 | 0.8142 |
| No log | 1.4667 | 110 | 0.6411 | 0.5730 | 0.6411 | 0.8007 |
| No log | 1.4933 | 112 | 0.5300 | 0.5426 | 0.5300 | 0.7280 |
| No log | 1.5200 | 114 | 0.4480 | 0.6229 | 0.4480 | 0.6693 |
| No log | 1.5467 | 116 | 0.5551 | 0.5560 | 0.5551 | 0.7450 |
| No log | 1.5733 | 118 | 0.6357 | 0.4939 | 0.6357 | 0.7973 |
| No log | 1.6000 | 120 | 0.5678 | 0.5778 | 0.5678 | 0.7535 |
| No log | 1.6267 | 122 | 0.4475 | 0.5731 | 0.4475 | 0.6689 |
| No log | 1.6533 | 124 | 0.4839 | 0.5468 | 0.4839 | 0.6957 |
| No log | 1.6800 | 126 | 0.4945 | 0.5468 | 0.4945 | 0.7032 |
| No log | 1.7067 | 128 | 0.4452 | 0.5874 | 0.4452 | 0.6672 |
| No log | 1.7333 | 130 | 0.5321 | 0.6549 | 0.5321 | 0.7294 |
| No log | 1.7600 | 132 | 0.7013 | 0.5032 | 0.7013 | 0.8374 |
| No log | 1.7867 | 134 | 0.7414 | 0.4874 | 0.7414 | 0.8610 |
| No log | 1.8133 | 136 | 0.6236 | 0.5340 | 0.6236 | 0.7897 |
| No log | 1.8400 | 138 | 0.4678 | 0.6923 | 0.4678 | 0.6840 |
| No log | 1.8667 | 140 | 0.4820 | 0.6188 | 0.4820 | 0.6943 |
| No log | 1.8933 | 142 | 0.4745 | 0.5991 | 0.4745 | 0.6889 |
| No log | 1.9200 | 144 | 0.4250 | 0.7022 | 0.4250 | 0.6519 |
| No log | 1.9467 | 146 | 0.4374 | 0.7062 | 0.4374 | 0.6613 |
| No log | 1.9733 | 148 | 0.4269 | 0.6827 | 0.4269 | 0.6534 |
| No log | 2.0000 | 150 | 0.4426 | 0.6735 | 0.4426 | 0.6652 |
| No log | 2.0267 | 152 | 0.4676 | 0.6434 | 0.4676 | 0.6838 |
| No log | 2.0533 | 154 | 0.5176 | 0.6620 | 0.5176 | 0.7194 |
| No log | 2.0800 | 156 | 0.5141 | 0.6607 | 0.5141 | 0.7170 |
| No log | 2.1067 | 158 | 0.4840 | 0.6434 | 0.4840 | 0.6957 |
| No log | 2.1333 | 160 | 0.4890 | 0.6643 | 0.4890 | 0.6993 |
| No log | 2.1600 | 162 | 0.4683 | 0.7004 | 0.4683 | 0.6843 |
| No log | 2.1867 | 164 | 0.4825 | 0.7192 | 0.4825 | 0.6946 |
| No log | 2.2133 | 166 | 0.4647 | 0.6943 | 0.4647 | 0.6817 |
| No log | 2.2400 | 168 | 0.4515 | 0.6830 | 0.4515 | 0.6720 |
| No log | 2.2667 | 170 | 0.4508 | 0.6830 | 0.4508 | 0.6714 |
| No log | 2.2933 | 172 | 0.4593 | 0.6678 | 0.4593 | 0.6777 |
| No log | 2.3200 | 174 | 0.4963 | 0.6537 | 0.4963 | 0.7045 |
| No log | 2.3467 | 176 | 0.4729 | 0.7041 | 0.4729 | 0.6876 |
| No log | 2.3733 | 178 | 0.4659 | 0.6806 | 0.4659 | 0.6825 |
| No log | 2.4000 | 180 | 0.4659 | 0.6806 | 0.4659 | 0.6825 |
| No log | 2.4267 | 182 | 0.4795 | 0.6960 | 0.4795 | 0.6925 |
| No log | 2.4533 | 184 | 0.4928 | 0.6526 | 0.4928 | 0.7020 |
| No log | 2.4800 | 186 | 0.5679 | 0.4756 | 0.5679 | 0.7536 |
| No log | 2.5067 | 188 | 0.5027 | 0.6804 | 0.5027 | 0.7090 |
| No log | 2.5333 | 190 | 0.4570 | 0.6443 | 0.4570 | 0.6760 |
| No log | 2.5600 | 192 | 0.5010 | 0.5157 | 0.5010 | 0.7078 |
| No log | 2.5867 | 194 | 0.4762 | 0.5157 | 0.4762 | 0.6900 |
| No log | 2.6133 | 196 | 0.4392 | 0.7218 | 0.4392 | 0.6628 |
| No log | 2.6400 | 198 | 0.4916 | 0.6379 | 0.4916 | 0.7012 |
| No log | 2.6667 | 200 | 0.4630 | 0.6613 | 0.4630 | 0.6804 |
| No log | 2.6933 | 202 | 0.4752 | 0.6806 | 0.4752 | 0.6893 |
| No log | 2.7200 | 204 | 0.5170 | 0.6633 | 0.5170 | 0.7190 |
| No log | 2.7467 | 206 | 0.5298 | 0.6805 | 0.5298 | 0.7279 |
| No log | 2.7733 | 208 | 0.5476 | 0.6796 | 0.5476 | 0.7400 |
| No log | 2.8000 | 210 | 0.6337 | 0.5988 | 0.6337 | 0.7961 |
| No log | 2.8267 | 212 | 0.6463 | 0.5448 | 0.6463 | 0.8040 |
| No log | 2.8533 | 214 | 0.5315 | 0.6063 | 0.5315 | 0.7291 |
| No log | 2.8800 | 216 | 0.4840 | 0.6279 | 0.4840 | 0.6957 |
| No log | 2.9067 | 218 | 0.5190 | 0.5895 | 0.5190 | 0.7204 |
| No log | 2.9333 | 220 | 0.4825 | 0.6770 | 0.4825 | 0.6946 |
| No log | 2.9600 | 222 | 0.5183 | 0.5801 | 0.5183 | 0.7199 |
| No log | 2.9867 | 224 | 0.5727 | 0.5484 | 0.5727 | 0.7568 |
| No log | 3.0133 | 226 | 0.5051 | 0.6150 | 0.5051 | 0.7107 |
| No log | 3.0400 | 228 | 0.4749 | 0.6282 | 0.4749 | 0.6891 |
| No log | 3.0667 | 230 | 0.5360 | 0.5513 | 0.5360 | 0.7321 |
| No log | 3.0933 | 232 | 0.5280 | 0.5947 | 0.5280 | 0.7267 |
| No log | 3.1200 | 234 | 0.4658 | 0.6053 | 0.4658 | 0.6825 |
| No log | 3.1467 | 236 | 0.5020 | 0.5712 | 0.5020 | 0.7085 |
| No log | 3.1733 | 238 | 0.6409 | 0.4382 | 0.6409 | 0.8006 |
| No log | 3.2000 | 240 | 0.6435 | 0.4102 | 0.6435 | 0.8022 |
| No log | 3.2267 | 242 | 0.5299 | 0.4664 | 0.5299 | 0.7280 |
| No log | 3.2533 | 244 | 0.4357 | 0.6395 | 0.4357 | 0.6601 |
| No log | 3.2800 | 246 | 0.4300 | 0.7164 | 0.4300 | 0.6558 |
| No log | 3.3067 | 248 | 0.4367 | 0.7328 | 0.4367 | 0.6608 |
| No log | 3.3333 | 250 | 0.4437 | 0.7245 | 0.4437 | 0.6661 |
| No log | 3.3600 | 252 | 0.4765 | 0.7330 | 0.4765 | 0.6903 |
| No log | 3.3867 | 254 | 0.4636 | 0.7593 | 0.4636 | 0.6809 |
| No log | 3.4133 | 256 | 0.4993 | 0.5931 | 0.4993 | 0.7066 |
| No log | 3.4400 | 258 | 0.5618 | 0.5139 | 0.5618 | 0.7495 |
| No log | 3.4667 | 260 | 0.4914 | 0.6303 | 0.4914 | 0.7010 |
| No log | 3.4933 | 262 | 0.4198 | 0.7227 | 0.4198 | 0.6479 |
| No log | 3.5200 | 264 | 0.3918 | 0.7033 | 0.3918 | 0.6259 |
| No log | 3.5467 | 266 | 0.4505 | 0.5741 | 0.4505 | 0.6712 |
| No log | 3.5733 | 268 | 0.4524 | 0.5528 | 0.4524 | 0.6726 |
| No log | 3.6000 | 270 | 0.4088 | 0.6624 | 0.4088 | 0.6394 |
| No log | 3.6267 | 272 | 0.4198 | 0.6344 | 0.4198 | 0.6479 |
| No log | 3.6533 | 274 | 0.4165 | 0.6843 | 0.4165 | 0.6454 |
| No log | 3.6800 | 276 | 0.4078 | 0.6736 | 0.4078 | 0.6386 |
| No log | 3.7067 | 278 | 0.4130 | 0.6935 | 0.4130 | 0.6426 |
| No log | 3.7333 | 280 | 0.4297 | 0.7273 | 0.4297 | 0.6555 |
| No log | 3.7600 | 282 | 0.4757 | 0.6712 | 0.4757 | 0.6897 |
| No log | 3.7867 | 284 | 0.5829 | 0.6462 | 0.5829 | 0.7635 |
| No log | 3.8133 | 286 | 0.5860 | 0.6106 | 0.5860 | 0.7655 |
| No log | 3.8400 | 288 | 0.5182 | 0.6855 | 0.5182 | 0.7199 |
| No log | 3.8667 | 290 | 0.4829 | 0.7139 | 0.4829 | 0.6949 |
| No log | 3.8933 | 292 | 0.4605 | 0.7254 | 0.4605 | 0.6786 |
| No log | 3.9200 | 294 | 0.4480 | 0.6443 | 0.4480 | 0.6694 |
| No log | 3.9467 | 296 | 0.4522 | 0.6326 | 0.4522 | 0.6725 |
| No log | 3.9733 | 298 | 0.4411 | 0.6326 | 0.4411 | 0.6641 |
| No log | 4.0000 | 300 | 0.4550 | 0.6018 | 0.4550 | 0.6746 |
| No log | 4.0267 | 302 | 0.4239 | 0.6639 | 0.4239 | 0.6511 |
| No log | 4.0533 | 304 | 0.4207 | 0.6837 | 0.4207 | 0.6486 |
| No log | 4.0800 | 306 | 0.4605 | 0.6888 | 0.4605 | 0.6786 |
| No log | 4.1067 | 308 | 0.5528 | 0.6116 | 0.5528 | 0.7435 |
| No log | 4.1333 | 310 | 0.6557 | 0.6085 | 0.6557 | 0.8098 |
| No log | 4.1600 | 312 | 0.6225 | 0.6056 | 0.6225 | 0.7890 |
| No log | 4.1867 | 314 | 0.5368 | 0.6015 | 0.5368 | 0.7327 |
| No log | 4.2133 | 316 | 0.4767 | 0.6275 | 0.4767 | 0.6905 |
| No log | 4.2400 | 318 | 0.4494 | 0.6753 | 0.4494 | 0.6703 |
| No log | 4.2667 | 320 | 0.4408 | 0.6741 | 0.4408 | 0.6640 |
| No log | 4.2933 | 322 | 0.4619 | 0.6228 | 0.4619 | 0.6796 |
| No log | 4.3200 | 324 | 0.5418 | 0.5014 | 0.5418 | 0.7361 |
| No log | 4.3467 | 326 | 0.5379 | 0.5574 | 0.5379 | 0.7334 |
| No log | 4.3733 | 328 | 0.4749 | 0.5886 | 0.4749 | 0.6892 |
| No log | 4.4000 | 330 | 0.4526 | 0.6443 | 0.4526 | 0.6727 |
| No log | 4.4267 | 332 | 0.4590 | 0.6326 | 0.4590 | 0.6775 |
| No log | 4.4533 | 334 | 0.4554 | 0.6115 | 0.4554 | 0.6748 |
| No log | 4.4800 | 336 | 0.4474 | 0.6115 | 0.4474 | 0.6689 |
| No log | 4.5067 | 338 | 0.4506 | 0.6957 | 0.4506 | 0.6713 |
| No log | 4.5333 | 340 | 0.4426 | 0.7032 | 0.4426 | 0.6653 |
| No log | 4.5600 | 342 | 0.4370 | 0.7032 | 0.4370 | 0.6610 |
| No log | 4.5867 | 344 | 0.4307 | 0.6854 | 0.4307 | 0.6563 |
| No log | 4.6133 | 346 | 0.4284 | 0.6745 | 0.4284 | 0.6545 |
| No log | 4.6400 | 348 | 0.4188 | 0.6953 | 0.4188 | 0.6472 |
| No log | 4.6667 | 350 | 0.4122 | 0.6843 | 0.4122 | 0.6420 |
| No log | 4.6933 | 352 | 0.4156 | 0.6632 | 0.4156 | 0.6447 |
| No log | 4.7200 | 354 | 0.4168 | 0.6807 | 0.4168 | 0.6456 |
| No log | 4.7467 | 356 | 0.4044 | 0.7173 | 0.4044 | 0.6359 |
| No log | 4.7733 | 358 | 0.4035 | 0.6919 | 0.4035 | 0.6352 |
| No log | 4.8000 | 360 | 0.3962 | 0.7389 | 0.3962 | 0.6295 |
| No log | 4.8267 | 362 | 0.4041 | 0.7236 | 0.4041 | 0.6357 |
| No log | 4.8533 | 364 | 0.3999 | 0.7433 | 0.3999 | 0.6324 |
| No log | 4.8800 | 366 | 0.3972 | 0.7424 | 0.3972 | 0.6303 |
| No log | 4.9067 | 368 | 0.3999 | 0.7536 | 0.3999 | 0.6324 |
| No log | 4.9333 | 370 | 0.4099 | 0.6873 | 0.4099 | 0.6402 |
| No log | 4.9600 | 372 | 0.4053 | 0.7064 | 0.4053 | 0.6366 |
| No log | 4.9867 | 374 | 0.4060 | 0.7064 | 0.4060 | 0.6371 |
| No log | 5.0133 | 376 | 0.4148 | 0.7064 | 0.4148 | 0.6441 |
| No log | 5.0400 | 378 | 0.4115 | 0.6448 | 0.4115 | 0.6415 |
| No log | 5.0667 | 380 | 0.4185 | 0.6184 | 0.4185 | 0.6469 |
| No log | 5.0933 | 382 | 0.4226 | 0.6937 | 0.4226 | 0.6500 |
| No log | 5.1200 | 384 | 0.4215 | 0.6708 | 0.4215 | 0.6493 |
| No log | 5.1467 | 386 | 0.4079 | 0.7033 | 0.4079 | 0.6387 |
| No log | 5.1733 | 388 | 0.3970 | 0.7033 | 0.3970 | 0.6301 |
| No log | 5.2000 | 390 | 0.3942 | 0.7033 | 0.3942 | 0.6279 |
| No log | 5.2267 | 392 | 0.4011 | 0.7033 | 0.4011 | 0.6333 |
| No log | 5.2533 | 394 | 0.4232 | 0.6914 | 0.4232 | 0.6506 |
| No log | 5.2800 | 396 | 0.4057 | 0.7033 | 0.4057 | 0.6370 |
| No log | 5.3067 | 398 | 0.4006 | 0.7033 | 0.4006 | 0.6329 |
| No log | 5.3333 | 400 | 0.4015 | 0.7479 | 0.4015 | 0.6337 |
| No log | 5.3600 | 402 | 0.3940 | 0.7389 | 0.3940 | 0.6277 |
| No log | 5.3867 | 404 | 0.3944 | 0.7467 | 0.3944 | 0.6281 |
| No log | 5.4133 | 406 | 0.3872 | 0.7227 | 0.3872 | 0.6223 |
| No log | 5.4400 | 408 | 0.3737 | 0.7424 | 0.3737 | 0.6113 |
| No log | 5.4667 | 410 | 0.3708 | 0.7424 | 0.3708 | 0.6090 |
| No log | 5.4933 | 412 | 0.3675 | 0.7339 | 0.3675 | 0.6062 |
| No log | 5.5200 | 414 | 0.3808 | 0.6832 | 0.3808 | 0.6171 |
| No log | 5.5467 | 416 | 0.4047 | 0.5980 | 0.4047 | 0.6361 |
| No log | 5.5733 | 418 | 0.4202 | 0.5882 | 0.4202 | 0.6482 |
| No log | 5.6000 | 420 | 0.4150 | 0.6101 | 0.4150 | 0.6442 |
| No log | 5.6267 | 422 | 0.3928 | 0.6326 | 0.3928 | 0.6268 |
| No log | 5.6533 | 424 | 0.3703 | 0.7339 | 0.3703 | 0.6085 |
| No log | 5.6800 | 426 | 0.4073 | 0.6880 | 0.4073 | 0.6382 |
| No log | 5.7067 | 428 | 0.4115 | 0.7227 | 0.4115 | 0.6415 |
| No log | 5.7333 | 430 | 0.3840 | 0.7218 | 0.3840 | 0.6197 |
| No log | 5.7600 | 432 | 0.4030 | 0.6993 | 0.4030 | 0.6348 |
| No log | 5.7867 | 434 | 0.3914 | 0.6727 | 0.3914 | 0.6256 |
| No log | 5.8133 | 436 | 0.3817 | 0.7118 | 0.3817 | 0.6178 |
| No log | 5.8400 | 438 | 0.3803 | 0.7237 | 0.3803 | 0.6167 |
| No log | 5.8667 | 440 | 0.4114 | 0.6690 | 0.4114 | 0.6414 |
| No log | 5.8933 | 442 | 0.4493 | 0.6701 | 0.4493 | 0.6703 |
| No log | 5.9200 | 444 | 0.4221 | 0.6257 | 0.4221 | 0.6497 |
| No log | 5.9467 | 446 | 0.4021 | 0.6140 | 0.4021 | 0.6341 |
| No log | 5.9733 | 448 | 0.4139 | 0.6243 | 0.4139 | 0.6433 |
| No log | 6.0000 | 450 | 0.4166 | 0.5782 | 0.4166 | 0.6455 |
| No log | 6.0267 | 452 | 0.4240 | 0.5782 | 0.4240 | 0.6512 |
| No log | 6.0533 | 454 | 0.4315 | 0.5556 | 0.4315 | 0.6569 |
| No log | 6.0800 | 456 | 0.4170 | 0.6422 | 0.4170 | 0.6458 |
| No log | 6.1067 | 458 | 0.4195 | 0.6517 | 0.4195 | 0.6477 |
| No log | 6.1333 | 460 | 0.4130 | 0.6632 | 0.4130 | 0.6426 |
| No log | 6.1600 | 462 | 0.4134 | 0.6939 | 0.4134 | 0.6430 |
| No log | 6.1867 | 464 | 0.4158 | 0.6672 | 0.4158 | 0.6448 |
| No log | 6.2133 | 466 | 0.4071 | 0.6929 | 0.4071 | 0.6381 |
| No log | 6.2400 | 468 | 0.4354 | 0.6414 | 0.4354 | 0.6599 |
| No log | 6.2667 | 470 | 0.5120 | 0.5802 | 0.5120 | 0.7155 |
| No log | 6.2933 | 472 | 0.5912 | 0.5065 | 0.5912 | 0.7689 |
| No log | 6.3200 | 474 | 0.6070 | 0.5131 | 0.6070 | 0.7791 |
| No log | 6.3467 | 476 | 0.5309 | 0.5131 | 0.5309 | 0.7286 |
| No log | 6.3733 | 478 | 0.4097 | 0.6414 | 0.4097 | 0.6401 |
| No log | 6.4000 | 480 | 0.3857 | 0.6950 | 0.3857 | 0.6211 |
| No log | 6.4267 | 482 | 0.4210 | 0.6970 | 0.4210 | 0.6488 |
| No log | 6.4533 | 484 | 0.4114 | 0.6970 | 0.4114 | 0.6414 |
| No log | 6.4800 | 486 | 0.3924 | 0.7432 | 0.3924 | 0.6264 |
| No log | 6.5067 | 488 | 0.4129 | 0.7424 | 0.4129 | 0.6425 |
| No log | 6.5333 | 490 | 0.4323 | 0.6968 | 0.4323 | 0.6575 |
| No log | 6.5600 | 492 | 0.4735 | 0.6441 | 0.4735 | 0.6881 |
| No log | 6.5867 | 494 | 0.4581 | 0.6434 | 0.4581 | 0.6768 |
| No log | 6.6133 | 496 | 0.4327 | 0.6530 | 0.4327 | 0.6578 |
| No log | 6.6400 | 498 | 0.3847 | 0.7306 | 0.3847 | 0.6202 |
| 0.2988 | 6.6667 | 500 | 0.3984 | 0.6667 | 0.3984 | 0.6312 |
| 0.2988 | 6.6933 | 502 | 0.4303 | 0.6295 | 0.4303 | 0.6560 |
| 0.2988 | 6.7200 | 504 | 0.4354 | 0.6082 | 0.4354 | 0.6598 |
| 0.2988 | 6.7467 | 506 | 0.4145 | 0.6229 | 0.4145 | 0.6438 |
| 0.2988 | 6.7733 | 508 | 0.4117 | 0.6045 | 0.4117 | 0.6416 |
| 0.2988 | 6.8000 | 510 | 0.4776 | 0.4997 | 0.4776 | 0.6911 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1