ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k10_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the fine-tuning dataset is not specified in the card (listed as "None" in the training metadata). It achieves the following results on the evaluation set (a minimal loading and inference sketch follows the metrics below):

  • Loss: 0.5006
  • QWK (Quadratic Weighted Kappa): 0.5444
  • MSE: 0.5006
  • RMSE: 0.7075
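
For reference, a minimal inference sketch is shown below. It assumes the checkpoint exposes a single-logit regression head producing a continuous organization score (consistent with the MSE/RMSE metrics above); this is an assumption about the training setup, not something stated in the card.

```python
# Minimal inference sketch (assumption: single-logit regression head that
# outputs a continuous organization score; verify against the checkpoint).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k10_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # an Arabic essay to score for organization
inputs = tokenizer(essay, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```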

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch expressing them as TrainingArguments follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
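
The hyperparameters above can be expressed as Hugging Face TrainingArguments. The sketch below is an approximation built only from the listed values; the output directory, evaluation cadence, and logging cadence are assumptions inferred from the results table, and the rest of the Trainer wiring (model, datasets, metrics) is omitted.

```python
# Sketch of the reported hyperparameters as TrainingArguments
# (output_dir, eval_steps, and logging_steps are assumptions, not from the card).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task7_organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",  # the results table logs an evaluation every 2 steps
    eval_steps=2,
    logging_steps=500,      # matches the "No log" training loss before step 500
)
```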

Training results

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
No log 0.04 2 2.4911 -0.0449 2.4911 1.5783
No log 0.08 4 1.1580 0.0991 1.1580 1.0761
No log 0.12 6 0.9096 -0.1166 0.9096 0.9537
No log 0.16 8 0.9367 0.2164 0.9367 0.9678
No log 0.2 10 0.8205 0.3251 0.8205 0.9058
No log 0.24 12 0.6557 0.4076 0.6557 0.8097
No log 0.28 14 0.6618 0.4927 0.6618 0.8135
No log 0.32 16 0.8737 0.3368 0.8737 0.9347
No log 0.36 18 0.6192 0.4674 0.6192 0.7869
No log 0.4 20 0.5794 0.4663 0.5794 0.7612
No log 0.44 22 0.5545 0.4614 0.5545 0.7446
No log 0.48 24 0.5794 0.4908 0.5794 0.7612
No log 0.52 26 0.5769 0.4594 0.5769 0.7595
No log 0.56 28 0.6125 0.4346 0.6125 0.7826
No log 0.6 30 0.8259 0.4240 0.8259 0.9088
No log 0.64 32 0.6888 0.4667 0.6888 0.8299
No log 0.68 34 0.4446 0.6111 0.4446 0.6668
No log 0.72 36 0.5213 0.5538 0.5213 0.7220
No log 0.76 38 0.4612 0.5845 0.4612 0.6791
No log 0.8 40 0.4538 0.6852 0.4538 0.6736
No log 0.84 42 0.5798 0.5093 0.5798 0.7615
No log 0.88 44 0.5263 0.5347 0.5263 0.7254
No log 0.92 46 0.4364 0.6587 0.4364 0.6606
No log 0.96 48 0.4877 0.5997 0.4877 0.6983
No log 1.0 50 0.4823 0.6793 0.4823 0.6945
No log 1.04 52 0.5488 0.5804 0.5488 0.7408
No log 1.08 54 0.6016 0.5900 0.6016 0.7756
No log 1.12 56 0.6180 0.6105 0.6180 0.7862
No log 1.16 58 0.6418 0.5825 0.6418 0.8012
No log 1.2 60 0.5967 0.6310 0.5967 0.7725
No log 1.24 62 0.5183 0.7130 0.5183 0.7199
No log 1.28 64 0.7494 0.4888 0.7494 0.8657
No log 1.32 66 0.8813 0.4354 0.8813 0.9388
No log 1.36 68 0.6693 0.4580 0.6693 0.8181
No log 1.4 70 0.4471 0.6395 0.4471 0.6687
No log 1.44 72 0.4435 0.6418 0.4435 0.6660
No log 1.48 74 0.4418 0.6228 0.4418 0.6647
No log 1.52 76 0.4701 0.5302 0.4701 0.6856
No log 1.56 78 0.4800 0.5523 0.4800 0.6928
No log 1.6 80 0.4961 0.5563 0.4961 0.7043
No log 1.64 82 0.4654 0.5799 0.4654 0.6822
No log 1.68 84 0.4570 0.6472 0.4570 0.6760
No log 1.72 86 0.5189 0.6459 0.5189 0.7203
No log 1.76 88 0.4842 0.6269 0.4842 0.6959
No log 1.8 90 0.4656 0.6047 0.4656 0.6823
No log 1.84 92 0.5881 0.5393 0.5881 0.7669
No log 1.88 94 0.5084 0.5794 0.5084 0.7130
No log 1.92 96 0.5029 0.5445 0.5029 0.7091
No log 1.96 98 0.4842 0.5445 0.4842 0.6958
No log 2.0 100 0.5336 0.4986 0.5336 0.7305
No log 2.04 102 0.5529 0.4926 0.5529 0.7436
No log 2.08 104 0.4315 0.6673 0.4315 0.6569
No log 2.12 106 0.4519 0.6082 0.4519 0.6722
No log 2.16 108 0.4823 0.5283 0.4823 0.6945
No log 2.2 110 0.6416 0.5103 0.6416 0.8010
No log 2.24 112 0.5364 0.5090 0.5364 0.7324
No log 2.28 114 0.4683 0.6617 0.4683 0.6843
No log 2.32 116 0.4608 0.6894 0.4608 0.6788
No log 2.36 118 0.4657 0.6519 0.4657 0.6824
No log 2.4 120 0.4768 0.5512 0.4768 0.6905
No log 2.44 122 0.4674 0.5682 0.4674 0.6837
No log 2.48 124 0.4462 0.7054 0.4462 0.6680
No log 2.52 126 0.4456 0.7114 0.4456 0.6675
No log 2.56 128 0.4385 0.7114 0.4385 0.6622
No log 2.6 130 0.4715 0.5569 0.4715 0.6867
No log 2.64 132 0.4221 0.7123 0.4221 0.6497
No log 2.68 134 0.4177 0.7123 0.4177 0.6463
No log 2.72 136 0.4110 0.7133 0.4110 0.6411
No log 2.76 138 0.4038 0.6489 0.4038 0.6355
No log 2.8 140 0.4224 0.5933 0.4224 0.6500
No log 2.84 142 0.4188 0.6210 0.4188 0.6472
No log 2.88 144 0.4022 0.6370 0.4022 0.6342
No log 2.92 146 0.4082 0.6370 0.4082 0.6389
No log 2.96 148 0.3880 0.6946 0.3880 0.6229
No log 3.0 150 0.4118 0.7414 0.4118 0.6417
No log 3.04 152 0.4198 0.7491 0.4198 0.6479
No log 3.08 154 0.5076 0.5450 0.5076 0.7124
No log 3.12 156 0.4491 0.7137 0.4491 0.6702
No log 3.16 158 0.4203 0.7053 0.4203 0.6483
No log 3.2 160 0.4045 0.7532 0.4045 0.6360
No log 3.24 162 0.4054 0.7173 0.4054 0.6367
No log 3.28 164 0.3903 0.6946 0.3903 0.6247
No log 3.32 166 0.4321 0.6047 0.4321 0.6573
No log 3.36 168 0.4142 0.5633 0.4142 0.6436
No log 3.4 170 0.4182 0.6620 0.4182 0.6466
No log 3.44 172 0.4705 0.6165 0.4705 0.6860
No log 3.48 174 0.3967 0.7118 0.3967 0.6299
No log 3.52 176 0.5074 0.5460 0.5074 0.7123
No log 3.56 178 0.6428 0.5018 0.6428 0.8018
No log 3.6 180 0.5242 0.5265 0.5242 0.7240
No log 3.64 182 0.4007 0.7022 0.4007 0.6330
No log 3.68 184 0.4007 0.7118 0.4007 0.6330
No log 3.72 186 0.4324 0.6701 0.4324 0.6576
No log 3.76 188 0.4283 0.6501 0.4283 0.6545
No log 3.8 190 0.4123 0.6924 0.4123 0.6421
No log 3.84 192 0.5044 0.5863 0.5044 0.7102
No log 3.88 194 0.4596 0.6259 0.4596 0.6780
No log 3.92 196 0.4016 0.7324 0.4016 0.6337
No log 3.96 198 0.5575 0.5408 0.5575 0.7467
No log 4.0 200 0.5816 0.5609 0.5816 0.7626
No log 4.04 202 0.4382 0.6537 0.4382 0.6620
No log 4.08 204 0.4202 0.6439 0.4202 0.6482
No log 4.12 206 0.5315 0.5471 0.5315 0.7291
No log 4.16 208 0.5250 0.5024 0.5250 0.7245
No log 4.2 210 0.4222 0.6422 0.4222 0.6498
No log 4.24 212 0.4724 0.5226 0.4724 0.6873
No log 4.28 214 0.4652 0.5698 0.4652 0.6820
No log 4.32 216 0.4317 0.6308 0.4317 0.6570
No log 4.36 218 0.4267 0.7326 0.4267 0.6533
No log 4.4 220 0.5341 0.5586 0.5341 0.7308
No log 4.44 222 0.5836 0.5672 0.5836 0.7640
No log 4.48 224 0.4798 0.5852 0.4798 0.6927
No log 4.52 226 0.4493 0.6427 0.4493 0.6703
No log 4.56 228 0.4775 0.6342 0.4775 0.6910
No log 4.6 230 0.4337 0.6543 0.4337 0.6586
No log 4.64 232 0.4827 0.5468 0.4827 0.6948
No log 4.68 234 0.4808 0.5420 0.4808 0.6934
No log 4.72 236 0.4462 0.5836 0.4462 0.6680
No log 4.76 238 0.4441 0.6321 0.4441 0.6664
No log 4.8 240 0.4742 0.5892 0.4742 0.6886
No log 4.84 242 0.4887 0.6321 0.4887 0.6991
No log 4.88 244 0.4388 0.7450 0.4388 0.6624
No log 4.92 246 0.4352 0.6877 0.4352 0.6597
No log 4.96 248 0.4250 0.7205 0.4250 0.6519
No log 5.0 250 0.4191 0.7588 0.4191 0.6474
No log 5.04 252 0.4513 0.7306 0.4513 0.6718
No log 5.08 254 0.4246 0.7579 0.4246 0.6516
No log 5.12 256 0.4096 0.7149 0.4096 0.6400
No log 5.16 258 0.4122 0.6993 0.4122 0.6421
No log 5.2 260 0.3938 0.7043 0.3938 0.6276
No log 5.24 262 0.4238 0.6487 0.4238 0.6510
No log 5.28 264 0.4397 0.6196 0.4397 0.6631
No log 5.32 266 0.4285 0.6592 0.4285 0.6546
No log 5.36 268 0.4575 0.6110 0.4575 0.6764
No log 5.4 270 0.4270 0.6462 0.4270 0.6534
No log 5.44 272 0.4170 0.6636 0.4170 0.6457
No log 5.48 274 0.4498 0.5601 0.4498 0.6707
No log 5.52 276 0.4252 0.6200 0.4252 0.6521
No log 5.56 278 0.4225 0.6243 0.4225 0.6500
No log 5.6 280 0.4318 0.6257 0.4318 0.6571
No log 5.64 282 0.4322 0.6257 0.4322 0.6574
No log 5.68 284 0.4291 0.6257 0.4291 0.6551
No log 5.72 286 0.4222 0.6661 0.4222 0.6498
No log 5.76 288 0.4522 0.6182 0.4522 0.6724
No log 5.8 290 0.4750 0.6330 0.4750 0.6892
No log 5.84 292 0.4451 0.6301 0.4451 0.6672
No log 5.88 294 0.4381 0.7034 0.4381 0.6619
No log 5.92 296 0.4351 0.7073 0.4351 0.6596
No log 5.96 298 0.4321 0.5875 0.4321 0.6574
No log 6.0 300 0.4436 0.5527 0.4436 0.6661
No log 6.04 302 0.4408 0.6295 0.4408 0.6639
No log 6.08 304 0.4097 0.7104 0.4097 0.6401
No log 6.12 306 0.4234 0.7238 0.4234 0.6507
No log 6.16 308 0.4143 0.6904 0.4143 0.6437
No log 6.2 310 0.4293 0.5747 0.4293 0.6552
No log 6.24 312 0.4344 0.5527 0.4344 0.6591
No log 6.28 314 0.4290 0.5747 0.4290 0.6549
No log 6.32 316 0.4166 0.6750 0.4166 0.6454
No log 6.36 318 0.4158 0.6761 0.4158 0.6448
No log 6.4 320 0.4215 0.6592 0.4215 0.6492
No log 6.44 322 0.4139 0.6383 0.4139 0.6433
No log 6.48 324 0.4163 0.6830 0.4163 0.6452
No log 6.52 326 0.4645 0.6239 0.4645 0.6815
No log 6.56 328 0.4221 0.6727 0.4221 0.6497
No log 6.6 330 0.4165 0.6579 0.4165 0.6454
No log 6.64 332 0.5081 0.5619 0.5081 0.7128
No log 6.68 334 0.4839 0.5466 0.4839 0.6956
No log 6.72 336 0.4701 0.5466 0.4701 0.6856
No log 6.76 338 0.4112 0.6462 0.4112 0.6412
No log 6.8 340 0.4258 0.6632 0.4258 0.6525
No log 6.84 342 0.4272 0.7149 0.4272 0.6536
No log 6.88 344 0.4670 0.6061 0.4670 0.6834
No log 6.92 346 0.5146 0.5997 0.5146 0.7174
No log 6.96 348 0.4622 0.6223 0.4622 0.6798
No log 7.0 350 0.4181 0.6661 0.4181 0.6466
No log 7.04 352 0.4286 0.6636 0.4286 0.6547
No log 7.08 354 0.4416 0.5815 0.4416 0.6645
No log 7.12 356 0.5059 0.5050 0.5059 0.7113
No log 7.16 358 0.5021 0.5050 0.5021 0.7086
No log 7.2 360 0.4537 0.5831 0.4537 0.6736
No log 7.24 362 0.4350 0.6993 0.4350 0.6595
No log 7.28 364 0.4293 0.7247 0.4293 0.6552
No log 7.32 366 0.4308 0.6047 0.4308 0.6564
No log 7.36 368 0.4395 0.6062 0.4395 0.6630
No log 7.4 370 0.4389 0.6396 0.4389 0.6625
No log 7.44 372 0.4178 0.7022 0.4178 0.6463
No log 7.48 374 0.4294 0.7043 0.4294 0.6553
No log 7.52 376 0.4239 0.6462 0.4239 0.6511
No log 7.56 378 0.4307 0.6257 0.4307 0.6563
No log 7.6 380 0.4532 0.6422 0.4532 0.6732
No log 7.64 382 0.4748 0.5631 0.4748 0.6891
No log 7.68 384 0.4492 0.6636 0.4492 0.6702
No log 7.72 386 0.4397 0.6243 0.4397 0.6631
No log 7.76 388 0.4435 0.5731 0.4435 0.6660
No log 7.8 390 0.4403 0.5731 0.4403 0.6636
No log 7.84 392 0.4343 0.6462 0.4343 0.6590
No log 7.88 394 0.4435 0.6819 0.4435 0.6660
No log 7.92 396 0.4551 0.6612 0.4551 0.6746
No log 7.96 398 0.4429 0.6819 0.4429 0.6655
No log 8.0 400 0.4396 0.6832 0.4396 0.6630
No log 8.04 402 0.4324 0.6832 0.4324 0.6576
No log 8.08 404 0.4507 0.6228 0.4507 0.6713
No log 8.12 406 0.5148 0.4703 0.5148 0.7175
No log 8.16 408 0.4850 0.5373 0.4850 0.6964
No log 8.2 410 0.4102 0.6953 0.4102 0.6405
No log 8.24 412 0.4861 0.5603 0.4861 0.6972
No log 8.28 414 0.5305 0.5672 0.5305 0.7283
No log 8.32 416 0.4745 0.5528 0.4745 0.6889
No log 8.36 418 0.4065 0.6964 0.4065 0.6376
No log 8.4 420 0.4346 0.6423 0.4346 0.6592
No log 8.44 422 0.4368 0.6115 0.4368 0.6609
No log 8.48 424 0.4142 0.6435 0.4142 0.6436
No log 8.52 426 0.4892 0.5544 0.4892 0.6994
No log 8.56 428 0.5660 0.5407 0.5660 0.7523
No log 8.6 430 0.5918 0.5033 0.5918 0.7693
No log 8.64 432 0.5192 0.5390 0.5192 0.7206
No log 8.68 434 0.4479 0.5527 0.4479 0.6693
No log 8.72 436 0.4306 0.6661 0.4306 0.6562
No log 8.76 438 0.4318 0.6448 0.4318 0.6571
No log 8.8 440 0.4379 0.6567 0.4379 0.6618
No log 8.84 442 0.4664 0.5603 0.4664 0.6830
No log 8.88 444 0.4624 0.5892 0.4624 0.6800
No log 8.92 446 0.4476 0.5555 0.4476 0.6690
No log 8.96 448 0.4596 0.6414 0.4596 0.6779
No log 9.0 450 0.5060 0.5677 0.5060 0.7114
No log 9.04 452 0.5499 0.5345 0.5499 0.7416
No log 9.08 454 0.5299 0.5067 0.5299 0.7279
No log 9.12 456 0.4693 0.5819 0.4693 0.6851
No log 9.16 458 0.4450 0.6321 0.4450 0.6671
No log 9.2 460 0.4340 0.5945 0.4340 0.6588
No log 9.24 462 0.4226 0.6661 0.4226 0.6501
No log 9.28 464 0.4162 0.7043 0.4162 0.6452
No log 9.32 466 0.4154 0.6946 0.4154 0.6445
No log 9.36 468 0.4182 0.7138 0.4182 0.6467
No log 9.4 470 0.4211 0.7138 0.4211 0.6489
No log 9.44 472 0.4278 0.7275 0.4278 0.6541
No log 9.48 474 0.4227 0.7138 0.4227 0.6502
No log 9.52 476 0.4364 0.5877 0.4364 0.6606
No log 9.56 478 0.4464 0.5666 0.4464 0.6682
No log 9.6 480 0.4271 0.6555 0.4271 0.6535
No log 9.64 482 0.4402 0.6843 0.4402 0.6634
No log 9.68 484 0.4462 0.6636 0.4462 0.6679
No log 9.72 486 0.4376 0.6344 0.4376 0.6615
No log 9.76 488 0.4347 0.6555 0.4347 0.6593
No log 9.8 490 0.4339 0.6555 0.4339 0.6587
No log 9.84 492 0.4407 0.7118 0.4407 0.6638
No log 9.88 494 0.4386 0.6841 0.4386 0.6623
No log 9.92 496 0.4430 0.5846 0.4430 0.6656
No log 9.96 498 0.4644 0.6013 0.4644 0.6815
0.2672 10.0 500 0.4605 0.5555 0.4605 0.6786
0.2672 10.04 502 0.4522 0.5633 0.4522 0.6725
0.2672 10.08 504 0.4513 0.5846 0.4513 0.6718
0.2672 10.12 506 0.4583 0.6724 0.4583 0.6770
0.2672 10.16 508 0.5293 0.5215 0.5293 0.7275
0.2672 10.2 510 0.5341 0.5107 0.5341 0.7308
0.2672 10.24 512 0.4759 0.5715 0.4759 0.6898
0.2672 10.28 514 0.4581 0.5846 0.4581 0.6768
0.2672 10.32 516 0.4597 0.5861 0.4597 0.6780
0.2672 10.36 518 0.4592 0.5781 0.4592 0.6776
0.2672 10.4 520 0.4492 0.6068 0.4492 0.6702
0.2672 10.44 522 0.4429 0.6555 0.4429 0.6655
0.2672 10.48 524 0.4385 0.7306 0.4385 0.6622
0.2672 10.52 526 0.4386 0.7275 0.4386 0.6622
0.2672 10.56 528 0.4382 0.7247 0.4382 0.6620
0.2672 10.6 530 0.4384 0.6364 0.4384 0.6622
0.2672 10.64 532 0.4634 0.5960 0.4634 0.6807
0.2672 10.68 534 0.4698 0.5960 0.4698 0.6854
0.2672 10.72 536 0.4583 0.5731 0.4583 0.6769
0.2672 10.76 538 0.4717 0.5960 0.4717 0.6868
0.2672 10.8 540 0.4816 0.5747 0.4816 0.6940
0.2672 10.84 542 0.4608 0.5731 0.4608 0.6788
0.2672 10.88 544 0.4508 0.5784 0.4508 0.6714
0.2672 10.92 546 0.4909 0.5452 0.4909 0.7006
0.2672 10.96 548 0.4848 0.5597 0.4848 0.6962
0.2672 11.0 550 0.4770 0.5939 0.4770 0.6906
0.2672 11.04 552 0.4597 0.5248 0.4597 0.6780
0.2672 11.08 554 0.4610 0.5819 0.4610 0.6790
0.2672 11.12 556 0.4558 0.5503 0.4558 0.6751
0.2672 11.16 558 0.4510 0.5189 0.4510 0.6716
0.2672 11.2 560 0.4431 0.5414 0.4431 0.6657
0.2672 11.24 562 0.4621 0.5373 0.4621 0.6798
0.2672 11.28 564 0.4741 0.5826 0.4741 0.6886
0.2672 11.32 566 0.4625 0.5796 0.4625 0.6801
0.2672 11.36 568 0.4805 0.5268 0.4805 0.6932
0.2672 11.4 570 0.5119 0.5272 0.5119 0.7155
0.2672 11.44 572 0.5321 0.5256 0.5321 0.7295
0.2672 11.48 574 0.5240 0.5272 0.5240 0.7239
0.2672 11.52 576 0.5006 0.5444 0.5006 0.7075
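
In the table, the validation loss equals the MSE at every step, which is consistent with an MSE regression objective. Below is a hedged sketch of a compute_metrics function that would produce the QWK/MSE/RMSE columns; rounding the continuous predictions to integer labels before the quadratically weighted kappa is an assumption about how the scores were discretized, not something stated in the card.

```python
# Hedged sketch: metrics matching the table columns (QWK, MSE, RMSE).
# Assumption: continuous predictions are rounded to integer score labels
# before computing quadratically weighted Cohen's kappa.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(eval_pred):
    predictions, labels = eval_pred
    predictions = np.asarray(predictions).squeeze()
    labels = np.asarray(labels).squeeze()
    mse = mean_squared_error(labels, predictions)
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(predictions).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```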

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

  • ~0.1B parameters (F32, Safetensors)
