ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k19_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4251
  • Qwk: 0.6265
  • Mse: 0.4251
  • Rmse: 0.6520
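
The card ships without a usage snippet. Below is a minimal inference sketch, assuming the checkpoint carries a single-output regression head (consistent with the MSE/RMSE metrics above) and is published under the hub id shown in this card; adjust the repository id and the score interpretation to your setup:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hub id as listed on this card; treat it as an assumption if you host a copy elsewhere.
model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k19_task7_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # an Arabic essay to score for organization

inputs = tokenizer(essay, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 1) for a single-output regression head

score = logits.squeeze().item()
print(f"Predicted organization score: {score:.3f}")
```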

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
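
As a rough reconstruction, these settings map onto Hugging Face TrainingArguments as sketched below. The output directory and the evaluation cadence are assumptions (the log in the next section evaluates every 2 steps and first reports a training loss at step 500); dataset loading and the metric function are not specified by the card:

```python
from transformers import TrainingArguments

# Sketch of the reported hyperparameters; values not listed on the card are marked as assumptions.
training_args = TrainingArguments(
    output_dir="arabert-task7-organization",  # hypothetical name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",  # assumed from the per-2-step evaluation log
    eval_steps=2,
    logging_steps=500,      # assumed from where the training loss first appears
)
```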

Training results

In the table below, "No log" means the running training loss had not yet been reported at that step; the first logged value appears at step 500.
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|:----:|
| No log | 0.0211 | 2 | 2.6098 | -0.0545 | 2.6098 | 1.6155 |
| No log | 0.0421 | 4 | 1.2358 | 0.0745 | 1.2358 | 1.1117 |
| No log | 0.0632 | 6 | 0.7507 | 0.0893 | 0.7507 | 0.8664 |
| No log | 0.0842 | 8 | 0.6557 | 0.4352 | 0.6557 | 0.8097 |
| No log | 0.1053 | 10 | 0.5681 | 0.4397 | 0.5681 | 0.7537 |
| No log | 0.1263 | 12 | 0.5479 | 0.4966 | 0.5479 | 0.7402 |
| No log | 0.1474 | 14 | 0.5272 | 0.5816 | 0.5272 | 0.7261 |
| No log | 0.1684 | 16 | 0.5156 | 0.6053 | 0.5156 | 0.7181 |
| No log | 0.1895 | 18 | 0.5329 | 0.5846 | 0.5329 | 0.7300 |
| No log | 0.2105 | 20 | 0.5309 | 0.5930 | 0.5309 | 0.7286 |
| No log | 0.2316 | 22 | 0.6218 | 0.5378 | 0.6218 | 0.7886 |
| No log | 0.2526 | 24 | 0.7674 | 0.4200 | 0.7674 | 0.8760 |
| No log | 0.2737 | 26 | 0.7485 | 0.4859 | 0.7485 | 0.8651 |
| No log | 0.2947 | 28 | 1.0127 | 0.3092 | 1.0127 | 1.0063 |
| No log | 0.3158 | 30 | 1.5040 | 0.2467 | 1.5040 | 1.2264 |
| No log | 0.3368 | 32 | 1.4568 | 0.2316 | 1.4568 | 1.2070 |
| No log | 0.3579 | 34 | 0.8104 | 0.4883 | 0.8104 | 0.9002 |
| No log | 0.3789 | 36 | 0.5270 | 0.5125 | 0.5270 | 0.7259 |
| No log | 0.4 | 38 | 0.6294 | 0.4314 | 0.6294 | 0.7934 |
| No log | 0.4211 | 40 | 0.5376 | 0.4555 | 0.5376 | 0.7332 |
| No log | 0.4421 | 42 | 0.4367 | 0.6330 | 0.4367 | 0.6609 |
| No log | 0.4632 | 44 | 0.6202 | 0.4844 | 0.6202 | 0.7875 |
| No log | 0.4842 | 46 | 0.6902 | 0.4700 | 0.6902 | 0.8308 |
| No log | 0.5053 | 48 | 0.4517 | 0.5796 | 0.4517 | 0.6721 |
| No log | 0.5263 | 50 | 0.4755 | 0.6109 | 0.4755 | 0.6896 |
| No log | 0.5474 | 52 | 0.5427 | 0.6310 | 0.5427 | 0.7367 |
| No log | 0.5684 | 54 | 0.4516 | 0.5845 | 0.4516 | 0.6720 |
| No log | 0.5895 | 56 | 0.4436 | 0.6514 | 0.4436 | 0.6660 |
| No log | 0.6105 | 58 | 0.5197 | 0.5408 | 0.5197 | 0.7209 |
| No log | 0.6316 | 60 | 0.4719 | 0.6537 | 0.4719 | 0.6869 |
| No log | 0.6526 | 62 | 0.4410 | 0.5800 | 0.4410 | 0.6641 |
| No log | 0.6737 | 64 | 0.4872 | 0.5577 | 0.4872 | 0.6980 |
| No log | 0.6947 | 66 | 0.4830 | 0.5593 | 0.4830 | 0.6950 |
| No log | 0.7158 | 68 | 0.4785 | 0.6449 | 0.4785 | 0.6917 |
| No log | 0.7368 | 70 | 0.7570 | 0.4321 | 0.7570 | 0.8700 |
| No log | 0.7579 | 72 | 1.0736 | 0.3861 | 1.0736 | 1.0361 |
| No log | 0.7789 | 74 | 1.0001 | 0.3897 | 1.0001 | 1.0000 |
| No log | 0.8 | 76 | 0.7110 | 0.5626 | 0.7110 | 0.8432 |
| No log | 0.8211 | 78 | 0.5540 | 0.6663 | 0.5540 | 0.7443 |
| No log | 0.8421 | 80 | 0.6842 | 0.5380 | 0.6842 | 0.8272 |
| No log | 0.8632 | 82 | 0.7430 | 0.5257 | 0.7430 | 0.8620 |
| No log | 0.8842 | 84 | 0.6420 | 0.6044 | 0.6420 | 0.8012 |
| No log | 0.9053 | 86 | 0.5024 | 0.6604 | 0.5024 | 0.7088 |
| No log | 0.9263 | 88 | 0.4636 | 0.5475 | 0.4636 | 0.6809 |
| No log | 0.9474 | 90 | 0.4892 | 0.5246 | 0.4892 | 0.6994 |
| No log | 0.9684 | 92 | 0.4393 | 0.6001 | 0.4393 | 0.6628 |
| No log | 0.9895 | 94 | 0.4765 | 0.6588 | 0.4765 | 0.6903 |
| No log | 1.0105 | 96 | 0.4939 | 0.6589 | 0.4939 | 0.7028 |
| No log | 1.0316 | 98 | 0.4721 | 0.6769 | 0.4721 | 0.6871 |
| No log | 1.0526 | 100 | 0.4677 | 0.6383 | 0.4677 | 0.6839 |
| No log | 1.0737 | 102 | 0.4834 | 0.5819 | 0.4834 | 0.6953 |
| No log | 1.0947 | 104 | 0.5139 | 0.6182 | 0.5139 | 0.7169 |
| No log | 1.1158 | 106 | 0.5233 | 0.6182 | 0.5233 | 0.7234 |
| No log | 1.1368 | 108 | 0.5267 | 0.6144 | 0.5267 | 0.7257 |
| No log | 1.1579 | 110 | 0.5637 | 0.5781 | 0.5637 | 0.7508 |
| No log | 1.1789 | 112 | 0.6258 | 0.5672 | 0.6258 | 0.7911 |
| No log | 1.2 | 114 | 0.5994 | 0.5938 | 0.5994 | 0.7742 |
| No log | 1.2211 | 116 | 0.5642 | 0.6086 | 0.5642 | 0.7511 |
| No log | 1.2421 | 118 | 0.5418 | 0.6755 | 0.5418 | 0.7361 |
| No log | 1.2632 | 120 | 0.5310 | 0.6391 | 0.5310 | 0.7287 |
| No log | 1.2842 | 122 | 0.5152 | 0.6387 | 0.5152 | 0.7178 |
| No log | 1.3053 | 124 | 0.5238 | 0.6466 | 0.5238 | 0.7237 |
| No log | 1.3263 | 126 | 0.5219 | 0.6596 | 0.5219 | 0.7224 |
| No log | 1.3474 | 128 | 0.5268 | 0.6562 | 0.5268 | 0.7258 |
| No log | 1.3684 | 130 | 0.5683 | 0.5631 | 0.5683 | 0.7539 |
| No log | 1.3895 | 132 | 0.6065 | 0.5455 | 0.6065 | 0.7788 |
| No log | 1.4105 | 134 | 0.5769 | 0.5387 | 0.5769 | 0.7595 |
| No log | 1.4316 | 136 | 0.4650 | 0.5422 | 0.4650 | 0.6819 |
| No log | 1.4526 | 138 | 0.4568 | 0.6129 | 0.4568 | 0.6758 |
| No log | 1.4737 | 140 | 0.4846 | 0.5357 | 0.4846 | 0.6961 |
| No log | 1.4947 | 142 | 0.5011 | 0.5738 | 0.5011 | 0.7078 |
| No log | 1.5158 | 144 | 0.5321 | 0.5859 | 0.5321 | 0.7294 |
| No log | 1.5368 | 146 | 0.6267 | 0.6210 | 0.6267 | 0.7917 |
| No log | 1.5579 | 148 | 0.6894 | 0.5789 | 0.6894 | 0.8303 |
| No log | 1.5789 | 150 | 0.5787 | 0.6099 | 0.5787 | 0.7607 |
| No log | 1.6 | 152 | 0.4933 | 0.6198 | 0.4933 | 0.7024 |
| No log | 1.6211 | 154 | 0.4939 | 0.6133 | 0.4939 | 0.7028 |
| No log | 1.6421 | 156 | 0.4857 | 0.6286 | 0.4857 | 0.6969 |
| No log | 1.6632 | 158 | 0.4957 | 0.6452 | 0.4957 | 0.7040 |
| No log | 1.6842 | 160 | 0.5062 | 0.6210 | 0.5062 | 0.7115 |
| No log | 1.7053 | 162 | 0.5600 | 0.5743 | 0.5600 | 0.7483 |
| No log | 1.7263 | 164 | 0.6635 | 0.5670 | 0.6635 | 0.8145 |
| No log | 1.7474 | 166 | 0.7418 | 0.5677 | 0.7418 | 0.8613 |
| No log | 1.7684 | 168 | 0.6376 | 0.5800 | 0.6376 | 0.7985 |
| No log | 1.7895 | 170 | 0.5512 | 0.5909 | 0.5512 | 0.7424 |
| No log | 1.8105 | 172 | 0.5361 | 0.6175 | 0.5361 | 0.7322 |
| No log | 1.8316 | 174 | 0.5441 | 0.6010 | 0.5441 | 0.7376 |
| No log | 1.8526 | 176 | 0.5201 | 0.6137 | 0.5201 | 0.7212 |
| No log | 1.8737 | 178 | 0.4970 | 0.6173 | 0.4970 | 0.7050 |
| No log | 1.8947 | 180 | 0.4931 | 0.6233 | 0.4931 | 0.7022 |
| No log | 1.9158 | 182 | 0.5231 | 0.6034 | 0.5231 | 0.7233 |
| No log | 1.9368 | 184 | 0.5872 | 0.5835 | 0.5872 | 0.7663 |
| No log | 1.9579 | 186 | 0.5711 | 0.6135 | 0.5711 | 0.7557 |
| No log | 1.9789 | 188 | 0.4993 | 0.5729 | 0.4993 | 0.7066 |
| No log | 2.0 | 190 | 0.4707 | 0.6254 | 0.4707 | 0.6861 |
| No log | 2.0211 | 192 | 0.4718 | 0.6279 | 0.4718 | 0.6869 |
| No log | 2.0421 | 194 | 0.4619 | 0.6455 | 0.4619 | 0.6796 |
| No log | 2.0632 | 196 | 0.4611 | 0.6455 | 0.4611 | 0.6790 |
| No log | 2.0842 | 198 | 0.4733 | 0.6411 | 0.4733 | 0.6879 |
| No log | 2.1053 | 200 | 0.4776 | 0.6347 | 0.4776 | 0.6911 |
| No log | 2.1263 | 202 | 0.5424 | 0.5854 | 0.5424 | 0.7365 |
| No log | 2.1474 | 204 | 0.5326 | 0.6752 | 0.5326 | 0.7298 |
| No log | 2.1684 | 206 | 0.5018 | 0.6509 | 0.5018 | 0.7084 |
| No log | 2.1895 | 208 | 0.4867 | 0.6641 | 0.4867 | 0.6976 |
| No log | 2.2105 | 210 | 0.4520 | 0.6879 | 0.4520 | 0.6723 |
| No log | 2.2316 | 212 | 0.4420 | 0.7254 | 0.4420 | 0.6649 |
| No log | 2.2526 | 214 | 0.4354 | 0.7305 | 0.4354 | 0.6599 |
| No log | 2.2737 | 216 | 0.5073 | 0.6295 | 0.5073 | 0.7122 |
| No log | 2.2947 | 218 | 0.5469 | 0.6200 | 0.5469 | 0.7395 |
| No log | 2.3158 | 220 | 0.5157 | 0.5970 | 0.5157 | 0.7181 |
| No log | 2.3368 | 222 | 0.4611 | 0.6158 | 0.4611 | 0.6791 |
| No log | 2.3579 | 224 | 0.4764 | 0.6154 | 0.4764 | 0.6902 |
| No log | 2.3789 | 226 | 0.4929 | 0.6370 | 0.4929 | 0.7021 |
| No log | 2.4 | 228 | 0.4978 | 0.6772 | 0.4978 | 0.7056 |
| No log | 2.4211 | 230 | 0.4916 | 0.6763 | 0.4916 | 0.7011 |
| No log | 2.4421 | 232 | 0.4893 | 0.6021 | 0.4893 | 0.6995 |
| No log | 2.4632 | 234 | 0.4936 | 0.6301 | 0.4936 | 0.7026 |
| No log | 2.4842 | 236 | 0.5310 | 0.6003 | 0.5310 | 0.7287 |
| No log | 2.5053 | 238 | 0.5495 | 0.5081 | 0.5495 | 0.7413 |
| No log | 2.5263 | 240 | 0.6149 | 0.4728 | 0.6149 | 0.7842 |
| No log | 2.5474 | 242 | 0.7086 | 0.5342 | 0.7086 | 0.8418 |
| No log | 2.5684 | 244 | 0.8100 | 0.4831 | 0.8100 | 0.9000 |
| No log | 2.5895 | 246 | 0.7589 | 0.5430 | 0.7589 | 0.8712 |
| No log | 2.6105 | 248 | 0.6534 | 0.5953 | 0.6534 | 0.8083 |
| No log | 2.6316 | 250 | 0.6300 | 0.5979 | 0.6300 | 0.7937 |
| No log | 2.6526 | 252 | 0.6033 | 0.5897 | 0.6033 | 0.7767 |
| No log | 2.6737 | 254 | 0.5484 | 0.5812 | 0.5484 | 0.7405 |
| No log | 2.6947 | 256 | 0.4705 | 0.6145 | 0.4705 | 0.6859 |
| No log | 2.7158 | 258 | 0.4583 | 0.5413 | 0.4583 | 0.6770 |
| No log | 2.7368 | 260 | 0.4663 | 0.5926 | 0.4663 | 0.6829 |
| No log | 2.7579 | 262 | 0.4821 | 0.6334 | 0.4821 | 0.6943 |
| No log | 2.7789 | 264 | 0.5113 | 0.6003 | 0.5113 | 0.7150 |
| No log | 2.8 | 266 | 0.5049 | 0.6361 | 0.5049 | 0.7106 |
| No log | 2.8211 | 268 | 0.4958 | 0.6361 | 0.4958 | 0.7041 |
| No log | 2.8421 | 270 | 0.4770 | 0.5949 | 0.4770 | 0.6906 |
| No log | 2.8632 | 272 | 0.4776 | 0.6405 | 0.4776 | 0.6911 |
| No log | 2.8842 | 274 | 0.4901 | 0.6479 | 0.4901 | 0.7001 |
| No log | 2.9053 | 276 | 0.4993 | 0.6548 | 0.4993 | 0.7066 |
| No log | 2.9263 | 278 | 0.4900 | 0.6479 | 0.4900 | 0.7000 |
| No log | 2.9474 | 280 | 0.4678 | 0.7166 | 0.4678 | 0.6840 |
| No log | 2.9684 | 282 | 0.4625 | 0.7071 | 0.4625 | 0.6801 |
| No log | 2.9895 | 284 | 0.4605 | 0.6550 | 0.4605 | 0.6786 |
| No log | 3.0105 | 286 | 0.4461 | 0.6839 | 0.4461 | 0.6679 |
| No log | 3.0316 | 288 | 0.4392 | 0.6479 | 0.4392 | 0.6627 |
| No log | 3.0526 | 290 | 0.4538 | 0.5796 | 0.4538 | 0.6736 |
| No log | 3.0737 | 292 | 0.4645 | 0.6013 | 0.4645 | 0.6816 |
| No log | 3.0947 | 294 | 0.4382 | 0.6481 | 0.4382 | 0.6620 |
| No log | 3.1158 | 296 | 0.4971 | 0.6460 | 0.4971 | 0.7051 |
| No log | 3.1368 | 298 | 0.5260 | 0.6460 | 0.5260 | 0.7252 |
| No log | 3.1579 | 300 | 0.4828 | 0.6455 | 0.4828 | 0.6948 |
| No log | 3.1789 | 302 | 0.4391 | 0.6553 | 0.4391 | 0.6626 |
| No log | 3.2 | 304 | 0.4426 | 0.6579 | 0.4426 | 0.6653 |
| No log | 3.2211 | 306 | 0.4459 | 0.6269 | 0.4459 | 0.6677 |
| No log | 3.2421 | 308 | 0.4492 | 0.6269 | 0.4492 | 0.6702 |
| No log | 3.2632 | 310 | 0.4527 | 0.6542 | 0.4527 | 0.6728 |
| No log | 3.2842 | 312 | 0.4523 | 0.6727 | 0.4523 | 0.6726 |
| No log | 3.3053 | 314 | 0.4375 | 0.5797 | 0.4375 | 0.6614 |
| No log | 3.3263 | 316 | 0.4441 | 0.6389 | 0.4441 | 0.6664 |
| No log | 3.3474 | 318 | 0.4652 | 0.6620 | 0.4652 | 0.6821 |
| No log | 3.3684 | 320 | 0.4728 | 0.6784 | 0.4728 | 0.6876 |
| No log | 3.3895 | 322 | 0.4491 | 0.6773 | 0.4491 | 0.6701 |
| No log | 3.4105 | 324 | 0.4487 | 0.6641 | 0.4487 | 0.6699 |
| No log | 3.4316 | 326 | 0.5080 | 0.6210 | 0.5080 | 0.7127 |
| No log | 3.4526 | 328 | 0.5316 | 0.6200 | 0.5316 | 0.7291 |
| No log | 3.4737 | 330 | 0.5686 | 0.6178 | 0.5686 | 0.7540 |
| No log | 3.4947 | 332 | 0.5358 | 0.6135 | 0.5358 | 0.7320 |
| No log | 3.5158 | 334 | 0.4954 | 0.5647 | 0.4954 | 0.7039 |
| No log | 3.5368 | 336 | 0.4797 | 0.5533 | 0.4797 | 0.6926 |
| No log | 3.5579 | 338 | 0.4780 | 0.6132 | 0.4780 | 0.6914 |
| No log | 3.5789 | 340 | 0.4786 | 0.6147 | 0.4786 | 0.6918 |
| No log | 3.6 | 342 | 0.4907 | 0.6279 | 0.4907 | 0.7005 |
| No log | 3.6211 | 344 | 0.4844 | 0.6199 | 0.4844 | 0.6960 |
| No log | 3.6421 | 346 | 0.4765 | 0.6160 | 0.4765 | 0.6903 |
| No log | 3.6632 | 348 | 0.4963 | 0.6045 | 0.4963 | 0.7045 |
| No log | 3.6842 | 350 | 0.5165 | 0.6011 | 0.5165 | 0.7187 |
| No log | 3.7053 | 352 | 0.4907 | 0.6025 | 0.4907 | 0.7005 |
| No log | 3.7263 | 354 | 0.4565 | 0.6426 | 0.4565 | 0.6756 |
| No log | 3.7474 | 356 | 0.4431 | 0.6198 | 0.4431 | 0.6657 |
| No log | 3.7684 | 358 | 0.4473 | 0.6244 | 0.4473 | 0.6688 |
| No log | 3.7895 | 360 | 0.4520 | 0.6932 | 0.4520 | 0.6723 |
| No log | 3.8105 | 362 | 0.4621 | 0.6280 | 0.4621 | 0.6798 |
| No log | 3.8316 | 364 | 0.5104 | 0.6088 | 0.5104 | 0.7144 |
| No log | 3.8526 | 366 | 0.4830 | 0.6536 | 0.4830 | 0.6950 |
| No log | 3.8737 | 368 | 0.4576 | 0.6424 | 0.4576 | 0.6765 |
| No log | 3.8947 | 370 | 0.4513 | 0.6788 | 0.4513 | 0.6718 |
| No log | 3.9158 | 372 | 0.4195 | 0.6481 | 0.4195 | 0.6477 |
| No log | 3.9368 | 374 | 0.4302 | 0.6210 | 0.4302 | 0.6559 |
| No log | 3.9579 | 376 | 0.4528 | 0.5714 | 0.4528 | 0.6729 |
| No log | 3.9789 | 378 | 0.4804 | 0.5687 | 0.4804 | 0.6931 |
| No log | 4.0 | 380 | 0.5081 | 0.5859 | 0.5081 | 0.7128 |
| No log | 4.0211 | 382 | 0.5544 | 0.5918 | 0.5544 | 0.7446 |
| No log | 4.0421 | 384 | 0.5080 | 0.6092 | 0.5080 | 0.7127 |
| No log | 4.0632 | 386 | 0.4990 | 0.6022 | 0.4990 | 0.7064 |
| No log | 4.0842 | 388 | 0.4770 | 0.6443 | 0.4770 | 0.6907 |
| No log | 4.1053 | 390 | 0.4733 | 0.6039 | 0.4733 | 0.6880 |
| No log | 4.1263 | 392 | 0.4814 | 0.6082 | 0.4814 | 0.6938 |
| No log | 4.1474 | 394 | 0.4670 | 0.6567 | 0.4670 | 0.6834 |
| No log | 4.1684 | 396 | 0.4516 | 0.6770 | 0.4516 | 0.6720 |
| No log | 4.1895 | 398 | 0.4451 | 0.6040 | 0.4451 | 0.6672 |
| No log | 4.2105 | 400 | 0.4442 | 0.5854 | 0.4442 | 0.6665 |
| No log | 4.2316 | 402 | 0.4454 | 0.6645 | 0.4454 | 0.6673 |
| No log | 4.2526 | 404 | 0.4463 | 0.6645 | 0.4463 | 0.6680 |
| No log | 4.2737 | 406 | 0.4466 | 0.6462 | 0.4466 | 0.6683 |
| No log | 4.2947 | 408 | 0.4531 | 0.5945 | 0.4531 | 0.6732 |
| No log | 4.3158 | 410 | 0.4439 | 0.5731 | 0.4439 | 0.6663 |
| No log | 4.3368 | 412 | 0.4305 | 0.6462 | 0.4305 | 0.6561 |
| No log | 4.3579 | 414 | 0.4256 | 0.6475 | 0.4256 | 0.6523 |
| No log | 4.3789 | 416 | 0.4289 | 0.6257 | 0.4289 | 0.6549 |
| No log | 4.4 | 418 | 0.4375 | 0.6047 | 0.4375 | 0.6614 |
| No log | 4.4211 | 420 | 0.4370 | 0.6047 | 0.4370 | 0.6611 |
| No log | 4.4421 | 422 | 0.4344 | 0.6047 | 0.4344 | 0.6591 |
| No log | 4.4632 | 424 | 0.4382 | 0.6841 | 0.4382 | 0.6620 |
| No log | 4.4842 | 426 | 0.4496 | 0.7012 | 0.4496 | 0.6705 |
| No log | 4.5053 | 428 | 0.4520 | 0.6716 | 0.4520 | 0.6723 |
| No log | 4.5263 | 430 | 0.4293 | 0.6935 | 0.4293 | 0.6552 |
| No log | 4.5474 | 432 | 0.4122 | 0.6667 | 0.4122 | 0.6420 |
| No log | 4.5684 | 434 | 0.4096 | 0.6933 | 0.4096 | 0.6400 |
| No log | 4.5895 | 436 | 0.4208 | 0.6964 | 0.4208 | 0.6487 |
| No log | 4.6105 | 438 | 0.4442 | 0.6596 | 0.4442 | 0.6665 |
| No log | 4.6316 | 440 | 0.4647 | 0.6750 | 0.4647 | 0.6817 |
| No log | 4.6526 | 442 | 0.4891 | 0.6384 | 0.4891 | 0.6994 |
| No log | 4.6737 | 444 | 0.4701 | 0.6096 | 0.4701 | 0.6856 |
| No log | 4.6947 | 446 | 0.4566 | 0.6235 | 0.4566 | 0.6757 |
| No log | 4.7158 | 448 | 0.4251 | 0.7042 | 0.4251 | 0.6520 |
| No log | 4.7368 | 450 | 0.4447 | 0.6807 | 0.4447 | 0.6669 |
| No log | 4.7579 | 452 | 0.4435 | 0.6797 | 0.4435 | 0.6659 |
| No log | 4.7789 | 454 | 0.4190 | 0.6923 | 0.4190 | 0.6473 |
| No log | 4.8 | 456 | 0.4142 | 0.7032 | 0.4142 | 0.6436 |
| No log | 4.8211 | 458 | 0.4131 | 0.6673 | 0.4131 | 0.6427 |
| No log | 4.8421 | 460 | 0.4200 | 0.6448 | 0.4200 | 0.6481 |
| No log | 4.8632 | 462 | 0.4463 | 0.6228 | 0.4463 | 0.6681 |
| No log | 4.8842 | 464 | 0.4465 | 0.6228 | 0.4465 | 0.6682 |
| No log | 4.9053 | 466 | 0.4478 | 0.6334 | 0.4478 | 0.6692 |
| No log | 4.9263 | 468 | 0.4425 | 0.6140 | 0.4425 | 0.6652 |
| No log | 4.9474 | 470 | 0.4517 | 0.6462 | 0.4517 | 0.6721 |
| No log | 4.9684 | 472 | 0.4763 | 0.5010 | 0.4763 | 0.6901 |
| No log | 4.9895 | 474 | 0.4961 | 0.5010 | 0.4961 | 0.7043 |
| No log | 5.0105 | 476 | 0.4664 | 0.5475 | 0.4664 | 0.6830 |
| No log | 5.0316 | 478 | 0.4543 | 0.6201 | 0.4543 | 0.6740 |
| No log | 5.0526 | 480 | 0.5393 | 0.6056 | 0.5393 | 0.7344 |
| No log | 5.0737 | 482 | 0.5478 | 0.6022 | 0.5478 | 0.7402 |
| No log | 5.0947 | 484 | 0.4808 | 0.6150 | 0.4808 | 0.6934 |
| No log | 5.1158 | 486 | 0.4285 | 0.6443 | 0.4285 | 0.6546 |
| No log | 5.1368 | 488 | 0.4309 | 0.6761 | 0.4309 | 0.6564 |
| No log | 5.1579 | 490 | 0.4328 | 0.6282 | 0.4328 | 0.6578 |
| No log | 5.1789 | 492 | 0.4237 | 0.6282 | 0.4237 | 0.6509 |
| No log | 5.2 | 494 | 0.4046 | 0.6477 | 0.4046 | 0.6361 |
| No log | 5.2211 | 496 | 0.3995 | 0.6477 | 0.3995 | 0.6321 |
| No log | 5.2421 | 498 | 0.4021 | 0.6282 | 0.4021 | 0.6341 |
| 0.3246 | 5.2632 | 500 | 0.4142 | 0.6514 | 0.4142 | 0.6436 |
| 0.3246 | 5.2842 | 502 | 0.4123 | 0.6096 | 0.4123 | 0.6421 |
| 0.3246 | 5.3053 | 504 | 0.4079 | 0.6295 | 0.4079 | 0.6386 |
| 0.3246 | 5.3263 | 506 | 0.3959 | 0.6757 | 0.3959 | 0.6292 |
| 0.3246 | 5.3474 | 508 | 0.4086 | 0.7022 | 0.4086 | 0.6392 |
| 0.3246 | 5.3684 | 510 | 0.4240 | 0.6904 | 0.4240 | 0.6512 |
| 0.3246 | 5.3895 | 512 | 0.4284 | 0.6720 | 0.4284 | 0.6545 |
| 0.3246 | 5.4105 | 514 | 0.4243 | 0.6736 | 0.4243 | 0.6514 |
| 0.3246 | 5.4316 | 516 | 0.4332 | 0.6145 | 0.4332 | 0.6582 |
| 0.3246 | 5.4526 | 518 | 0.4463 | 0.6174 | 0.4463 | 0.6681 |
| 0.3246 | 5.4737 | 520 | 0.4376 | 0.6065 | 0.4376 | 0.6615 |
| 0.3246 | 5.4947 | 522 | 0.4223 | 0.6265 | 0.4223 | 0.6498 |
| 0.3246 | 5.5158 | 524 | 0.4251 | 0.6265 | 0.4251 | 0.6520 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1