ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k4_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the training dataset is not documented in this card. It achieves the following results on the evaluation set (a brief inference sketch follows the metrics):

  • Loss: 0.4802
  • QWK (quadratic weighted kappa): 0.4637
  • MSE: 0.4802
  • RMSE: 0.6930
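
The card does not document the task head or the label scale, but the metric mix (Loss equal to MSE, plus QWK) suggests a single-output regression head for essay-organization scoring. Below is a minimal loading sketch under that assumption; the full Hub ID is taken from the model page.

```python
# A minimal inference sketch, assuming a single-output regression head
# (num_labels=1); the card itself does not document the head or label scale.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k4_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

essay = "..."  # an Arabic essay to score for organization
inputs = tokenizer(essay, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```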

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
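
As a sketch (not taken from the card's training script), these settings map onto transformers.TrainingArguments roughly as follows. The output directory and the eval/logging cadence are assumptions, the latter inferred from the results table below; the optimizer is assumed to be the Trainer's default AdamW with the listed betas and epsilon.

```python
# A sketch mapping the listed hyperparameters onto TrainingArguments
# (Transformers 4.44 API); output_dir and the eval/logging cadence are
# assumptions, not documented by the card.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="arabert_task7_organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",  # the table below evaluates every 2 steps
    eval_steps=2,
    logging_steps=500,      # consistent with "No log" before step 500
)
```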

Training results

Validation metrics were computed every 2 steps (0.1 epoch). "No log" in the training-loss column means the running training loss had not yet been reported, since it is logged every 500 steps. Validation loss and MSE are identical throughout, consistent with a mean-squared-error (regression) objective. Although 100 epochs were configured, the log ends at epoch 27.2, whose row matches the evaluation results reported above (a metric-computation sketch follows the table).

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.1 2 2.5769 -0.0729 2.5769 1.6053
No log 0.2 4 1.3658 0.0289 1.3658 1.1687
No log 0.3 6 0.7953 0.2181 0.7953 0.8918
No log 0.4 8 0.9714 0.0851 0.9714 0.9856
No log 0.5 10 1.0358 0.2324 1.0358 1.0178
No log 0.6 12 0.7605 0.3294 0.7605 0.8721
No log 0.7 14 0.6255 0.2711 0.6255 0.7909
No log 0.8 16 0.6723 0.2494 0.6723 0.8199
No log 0.9 18 0.5952 0.4174 0.5952 0.7715
No log 1.0 20 0.6164 0.4732 0.6164 0.7851
No log 1.1 22 0.5245 0.5065 0.5245 0.7242
No log 1.2 24 0.5315 0.4211 0.5315 0.7290
No log 1.3 26 0.5295 0.4471 0.5295 0.7276
No log 1.4 28 0.5565 0.5587 0.5565 0.7460
No log 1.5 30 0.6869 0.3940 0.6869 0.8288
No log 1.6 32 0.5493 0.6154 0.5493 0.7411
No log 1.7 34 0.5173 0.5422 0.5173 0.7192
No log 1.8 36 0.6106 0.5003 0.6106 0.7814
No log 1.9 38 0.5009 0.5841 0.5009 0.7077
No log 2.0 40 0.5119 0.5647 0.5119 0.7155
No log 2.1 42 0.5952 0.5246 0.5952 0.7715
No log 2.2 44 0.4869 0.6643 0.4869 0.6978
No log 2.3 46 0.6058 0.5048 0.6058 0.7783
No log 2.4 48 0.6039 0.4765 0.6039 0.7771
No log 2.5 50 0.5126 0.6389 0.5126 0.7159
No log 2.6 52 0.5284 0.5729 0.5284 0.7269
No log 2.7 54 0.4935 0.6826 0.4935 0.7025
No log 2.8 56 0.5244 0.5621 0.5244 0.7241
No log 2.9 58 0.4880 0.6170 0.4880 0.6986
No log 3.0 60 0.4978 0.5966 0.4978 0.7056
No log 3.1 62 0.5129 0.5908 0.5129 0.7162
No log 3.2 64 0.4401 0.6672 0.4401 0.6634
No log 3.3 66 0.5745 0.5486 0.5745 0.7580
No log 3.4 68 0.4982 0.6169 0.4982 0.7058
No log 3.5 70 0.4421 0.5902 0.4421 0.6649
No log 3.6 72 0.7193 0.5295 0.7193 0.8481
No log 3.7 74 1.0020 0.4978 1.0020 1.0010
No log 3.8 76 0.7590 0.5449 0.7590 0.8712
No log 3.9 78 0.4649 0.6853 0.4649 0.6819
No log 4.0 80 0.5497 0.6564 0.5497 0.7414
No log 4.1 82 0.7755 0.5186 0.7755 0.8806
No log 4.2 84 0.6788 0.5223 0.6788 0.8239
No log 4.3 86 0.4657 0.6322 0.4657 0.6824
No log 4.4 88 0.4135 0.7191 0.4135 0.6430
No log 4.5 90 0.4006 0.7032 0.4006 0.6329
No log 4.6 92 0.4215 0.6962 0.4215 0.6492
No log 4.7 94 0.4053 0.6414 0.4053 0.6366
No log 4.8 96 0.4124 0.6289 0.4124 0.6422
No log 4.9 98 0.4329 0.6863 0.4329 0.6579
No log 5.0 100 0.4864 0.6361 0.4864 0.6974
No log 5.1 102 0.4711 0.6445 0.4711 0.6864
No log 5.2 104 0.4480 0.6541 0.4480 0.6693
No log 5.3 106 0.5539 0.5521 0.5539 0.7443
No log 5.4 108 0.5571 0.5484 0.5571 0.7464
No log 5.5 110 0.4346 0.7080 0.4346 0.6593
No log 5.6 112 0.4815 0.6706 0.4815 0.6939
No log 5.7 114 0.5096 0.6395 0.5096 0.7139
No log 5.8 116 0.4318 0.7101 0.4318 0.6571
No log 5.9 118 0.4611 0.6227 0.4611 0.6791
No log 6.0 120 0.4758 0.6118 0.4758 0.6898
No log 6.1 122 0.4477 0.5949 0.4477 0.6691
No log 6.2 124 0.4407 0.6184 0.4407 0.6638
No log 6.3 126 0.4578 0.5909 0.4578 0.6766
No log 6.4 128 0.4850 0.5770 0.4850 0.6964
No log 6.5 130 0.4519 0.5909 0.4519 0.6722
No log 6.6 132 0.4369 0.6616 0.4369 0.6610
No log 6.7 134 0.4270 0.7022 0.4270 0.6535
No log 6.8 136 0.4425 0.6459 0.4425 0.6652
No log 6.9 138 0.4562 0.6459 0.4562 0.6755
No log 7.0 140 0.4624 0.6379 0.4624 0.6800
No log 7.1 142 0.4293 0.6698 0.4293 0.6552
No log 7.2 144 0.4127 0.6598 0.4127 0.6424
No log 7.3 146 0.4341 0.6793 0.4341 0.6589
No log 7.4 148 0.4479 0.6547 0.4479 0.6693
No log 7.5 150 0.5167 0.5957 0.5167 0.7188
No log 7.6 152 0.4718 0.6470 0.4718 0.6869
No log 7.7 154 0.4848 0.6476 0.4848 0.6963
No log 7.8 156 0.5029 0.6943 0.5029 0.7092
No log 7.9 158 0.5312 0.6943 0.5312 0.7288
No log 8.0 160 0.5302 0.6275 0.5302 0.7282
No log 8.1 162 0.5310 0.6481 0.5310 0.7287
No log 8.2 164 0.4549 0.6709 0.4549 0.6745
No log 8.3 166 0.4224 0.6201 0.4224 0.6500
No log 8.4 168 0.4624 0.5692 0.4624 0.6800
No log 8.5 170 0.4298 0.5995 0.4298 0.6556
No log 8.6 172 0.4384 0.6782 0.4384 0.6621
No log 8.7 174 0.5225 0.6390 0.5225 0.7229
No log 8.8 176 0.6251 0.4963 0.6251 0.7906
No log 8.9 178 0.5603 0.6042 0.5603 0.7485
No log 9.0 180 0.4435 0.6292 0.4435 0.6660
No log 9.1 182 0.4785 0.6188 0.4785 0.6917
No log 9.2 184 0.4717 0.6188 0.4717 0.6868
No log 9.3 186 0.4316 0.6400 0.4316 0.6570
No log 9.4 188 0.4943 0.6568 0.4943 0.7030
No log 9.5 190 0.5490 0.6412 0.5490 0.7409
No log 9.6 192 0.4732 0.6568 0.4732 0.6879
No log 9.7 194 0.4228 0.6197 0.4228 0.6502
No log 9.8 196 0.4566 0.5970 0.4566 0.6757
No log 9.9 198 0.4854 0.6498 0.4854 0.6967
No log 10.0 200 0.4406 0.5756 0.4406 0.6638
No log 10.1 202 0.4313 0.5550 0.4313 0.6567
No log 10.2 204 0.4482 0.6067 0.4482 0.6695
No log 10.3 206 0.4300 0.6082 0.4300 0.6557
No log 10.4 208 0.4161 0.5826 0.4161 0.6450
No log 10.5 210 0.4448 0.6616 0.4448 0.6669
No log 10.6 212 0.4414 0.6763 0.4414 0.6644
No log 10.7 214 0.4559 0.6826 0.4559 0.6752
No log 10.8 216 0.4674 0.6263 0.4674 0.6837
No log 10.9 218 0.4489 0.6334 0.4489 0.6700
No log 11.0 220 0.4273 0.6255 0.4273 0.6537
No log 11.1 222 0.4488 0.6589 0.4488 0.6699
No log 11.2 224 0.4435 0.6667 0.4435 0.6660
No log 11.3 226 0.4348 0.6555 0.4348 0.6594
No log 11.4 228 0.4544 0.5860 0.4544 0.6741
No log 11.5 230 0.5635 0.5787 0.5635 0.7507
No log 11.6 232 0.6100 0.5219 0.6100 0.7810
No log 11.7 234 0.5287 0.5687 0.5287 0.7271
No log 11.8 236 0.4611 0.5988 0.4611 0.6790
No log 11.9 238 0.4732 0.6609 0.4732 0.6879
No log 12.0 240 0.4781 0.6688 0.4781 0.6914
No log 12.1 242 0.4745 0.6688 0.4745 0.6889
No log 12.2 244 0.4512 0.6364 0.4512 0.6717
No log 12.3 246 0.4492 0.5396 0.4492 0.6702
No log 12.4 248 0.4743 0.5723 0.4743 0.6887
No log 12.5 250 0.4610 0.5444 0.4610 0.6789
No log 12.6 252 0.4578 0.6849 0.4578 0.6766
No log 12.7 254 0.5177 0.5721 0.5177 0.7195
No log 12.8 256 0.5312 0.5581 0.5312 0.7289
No log 12.9 258 0.4610 0.6127 0.4610 0.6789
No log 13.0 260 0.4493 0.6147 0.4493 0.6703
No log 13.1 262 0.4921 0.6032 0.4921 0.7015
No log 13.2 264 0.4933 0.5568 0.4933 0.7023
No log 13.3 266 0.4560 0.5732 0.4560 0.6753
No log 13.4 268 0.4439 0.5142 0.4439 0.6663
No log 13.5 270 0.4447 0.5405 0.4447 0.6668
No log 13.6 272 0.4380 0.5305 0.4380 0.6618
No log 13.7 274 0.4287 0.5883 0.4287 0.6548
No log 13.8 276 0.4657 0.6419 0.4657 0.6824
No log 13.9 278 0.5097 0.6340 0.5097 0.7139
No log 14.0 280 0.5205 0.6121 0.5205 0.7214
No log 14.1 282 0.4917 0.6355 0.4917 0.7012
No log 14.2 284 0.4554 0.6240 0.4554 0.6748
No log 14.3 286 0.4810 0.6087 0.4810 0.6935
No log 14.4 288 0.4559 0.6201 0.4559 0.6752
No log 14.5 290 0.4342 0.5846 0.4342 0.6589
No log 14.6 292 0.4416 0.5784 0.4416 0.6645
No log 14.7 294 0.4698 0.6227 0.4698 0.6854
No log 14.8 296 0.4994 0.5895 0.4994 0.7067
No log 14.9 298 0.4854 0.6045 0.4854 0.6967
No log 15.0 300 0.4531 0.5682 0.4531 0.6731
No log 15.1 302 0.4443 0.5812 0.4443 0.6666
No log 15.2 304 0.4696 0.5855 0.4696 0.6853
No log 15.3 306 0.4595 0.5877 0.4595 0.6779
No log 15.4 308 0.4445 0.5665 0.4445 0.6667
No log 15.5 310 0.4548 0.5768 0.4548 0.6744
No log 15.6 312 0.4485 0.5703 0.4485 0.6697
No log 15.7 314 0.4437 0.5926 0.4437 0.6661
No log 15.8 316 0.4481 0.5875 0.4481 0.6694
No log 15.9 318 0.4628 0.6047 0.4628 0.6803
No log 16.0 320 0.4718 0.5584 0.4718 0.6869
No log 16.1 322 0.4524 0.6047 0.4524 0.6726
No log 16.2 324 0.4301 0.5397 0.4301 0.6558
No log 16.3 326 0.4189 0.5869 0.4189 0.6473
No log 16.4 328 0.4118 0.6422 0.4118 0.6417
No log 16.5 330 0.4031 0.6624 0.4031 0.6349
No log 16.6 332 0.3973 0.6641 0.3973 0.6303
No log 16.7 334 0.3980 0.6845 0.3980 0.6309
No log 16.8 336 0.4028 0.6154 0.4028 0.6347
No log 16.9 338 0.4160 0.6114 0.4160 0.6450
No log 17.0 340 0.4219 0.6118 0.4219 0.6495
No log 17.1 342 0.4478 0.6474 0.4478 0.6692
No log 17.2 344 0.4862 0.6529 0.4862 0.6973
No log 17.3 346 0.4742 0.6355 0.4742 0.6886
No log 17.4 348 0.4252 0.6716 0.4252 0.6521
No log 17.5 350 0.4398 0.6154 0.4398 0.6632
No log 17.6 352 0.4858 0.5718 0.4858 0.6970
No log 17.7 354 0.4866 0.5571 0.4866 0.6975
No log 17.8 356 0.4632 0.5414 0.4632 0.6806
No log 17.9 358 0.4367 0.5681 0.4367 0.6608
No log 18.0 360 0.4488 0.5455 0.4488 0.6700
No log 18.1 362 0.4534 0.5671 0.4534 0.6734
No log 18.2 364 0.4354 0.5831 0.4354 0.6599
No log 18.3 366 0.4326 0.5930 0.4326 0.6577
No log 18.4 368 0.4348 0.5846 0.4348 0.6594
No log 18.5 370 0.4305 0.5831 0.4305 0.6561
No log 18.6 372 0.4534 0.6187 0.4534 0.6733
No log 18.7 374 0.4879 0.5677 0.4879 0.6985
No log 18.8 376 0.4896 0.5639 0.4896 0.6997
No log 18.9 378 0.4439 0.6171 0.4439 0.6663
No log 19.0 380 0.4243 0.5831 0.4243 0.6513
No log 19.1 382 0.4505 0.6248 0.4505 0.6712
No log 19.2 384 0.4914 0.6042 0.4914 0.7010
No log 19.3 386 0.4742 0.5997 0.4742 0.6886
No log 19.4 388 0.4291 0.6154 0.4291 0.6551
No log 19.5 390 0.4558 0.6276 0.4558 0.6751
No log 19.6 392 0.4816 0.6052 0.4816 0.6940
No log 19.7 394 0.4577 0.5533 0.4577 0.6765
No log 19.8 396 0.4381 0.5440 0.4381 0.6619
No log 19.9 398 0.4725 0.6061 0.4725 0.6874
No log 20.0 400 0.5133 0.6401 0.5133 0.7165
No log 20.1 402 0.5122 0.6072 0.5122 0.7157
No log 20.2 404 0.4800 0.6345 0.4800 0.6928
No log 20.3 406 0.4580 0.6301 0.4580 0.6768
No log 20.4 408 0.4466 0.6351 0.4466 0.6683
No log 20.5 410 0.4436 0.5831 0.4436 0.6660
No log 20.6 412 0.4565 0.5633 0.4565 0.6756
No log 20.7 414 0.4580 0.5414 0.4580 0.6767
No log 20.8 416 0.4448 0.5714 0.4448 0.6670
No log 20.9 418 0.4325 0.5782 0.4325 0.6576
No log 21.0 420 0.4540 0.5816 0.4540 0.6738
No log 21.1 422 0.5494 0.6056 0.5494 0.7412
No log 21.2 424 0.6275 0.5574 0.6275 0.7921
No log 21.3 426 0.6335 0.5506 0.6335 0.7959
No log 21.4 428 0.5523 0.5117 0.5523 0.7432
No log 21.5 430 0.4983 0.6004 0.4983 0.7059
No log 21.6 432 0.4504 0.6032 0.4504 0.6711
No log 21.7 434 0.4447 0.6027 0.4447 0.6668
No log 21.8 436 0.4593 0.5966 0.4593 0.6777
No log 21.9 438 0.4522 0.5966 0.4522 0.6725
No log 22.0 440 0.4342 0.6481 0.4342 0.6590
No log 22.1 442 0.4235 0.7104 0.4235 0.6507
No log 22.2 444 0.4200 0.6317 0.4200 0.6481
No log 22.3 446 0.4247 0.6408 0.4247 0.6517
No log 22.4 448 0.4326 0.6186 0.4326 0.6577
No log 22.5 450 0.4580 0.5894 0.4580 0.6767
No log 22.6 452 0.4745 0.5894 0.4745 0.6888
No log 22.7 454 0.4850 0.5894 0.4850 0.6964
No log 22.8 456 0.4966 0.5867 0.4966 0.7047
No log 22.9 458 0.5057 0.5867 0.5057 0.7112
No log 23.0 460 0.4730 0.5056 0.4730 0.6877
No log 23.1 462 0.4489 0.5361 0.4489 0.6700
No log 23.2 464 0.4505 0.5522 0.4505 0.6712
No log 23.3 466 0.4634 0.5549 0.4634 0.6807
No log 23.4 468 0.4625 0.6046 0.4625 0.6800
No log 23.5 470 0.4427 0.5912 0.4427 0.6654
No log 23.6 472 0.4377 0.6408 0.4377 0.6616
No log 23.7 474 0.4420 0.6517 0.4420 0.6648
No log 23.8 476 0.4491 0.5986 0.4491 0.6702
No log 23.9 478 0.4517 0.5831 0.4517 0.6721
No log 24.0 480 0.4576 0.5956 0.4576 0.6765
No log 24.1 482 0.4666 0.5979 0.4666 0.6831
No log 24.2 484 0.4674 0.5768 0.4674 0.6837
No log 24.3 486 0.4640 0.5768 0.4640 0.6812
No log 24.4 488 0.4624 0.6241 0.4624 0.6800
No log 24.5 490 0.4633 0.6334 0.4633 0.6807
No log 24.6 492 0.4566 0.6334 0.4566 0.6757
No log 24.7 494 0.4563 0.5860 0.4563 0.6755
No log 24.8 496 0.4573 0.5768 0.4573 0.6763
No log 24.9 498 0.4626 0.5768 0.4626 0.6801
0.2385 25.0 500 0.4595 0.5831 0.4595 0.6779
0.2385 25.1 502 0.4555 0.6068 0.4555 0.6749
0.2385 25.2 504 0.4481 0.6068 0.4481 0.6694
0.2385 25.3 506 0.4470 0.6068 0.4470 0.6686
0.2385 25.4 508 0.4417 0.6068 0.4417 0.6646
0.2385 25.5 510 0.4368 0.6053 0.4368 0.6609
0.2385 25.6 512 0.4479 0.6346 0.4479 0.6693
0.2385 25.7 514 0.4674 0.6423 0.4674 0.6837
0.2385 25.8 516 0.4649 0.6612 0.4649 0.6818
0.2385 25.9 518 0.4514 0.6530 0.4514 0.6719
0.2385 26.0 520 0.4373 0.6449 0.4373 0.6613
0.2385 26.1 522 0.4234 0.7003 0.4234 0.6507
0.2385 26.2 524 0.4224 0.6564 0.4224 0.6499
0.2385 26.3 526 0.4242 0.6359 0.4242 0.6513
0.2385 26.4 528 0.4322 0.6446 0.4322 0.6574
0.2385 26.5 530 0.4332 0.5926 0.4332 0.6582
0.2385 26.6 532 0.4285 0.6184 0.4285 0.6546
0.2385 26.7 534 0.4292 0.5899 0.4292 0.6551
0.2385 26.8 536 0.4545 0.4984 0.4545 0.6741
0.2385 26.9 538 0.4817 0.4964 0.4817 0.6940
0.2385 27.0 540 0.5053 0.5597 0.5053 0.7108
0.2385 27.1 542 0.5034 0.4330 0.5034 0.7095
0.2385 27.2 544 0.4802 0.4637 0.4802 0.6930
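
The reported QWK, MSE, and RMSE can be reproduced from raw predictions along the following lines. The discretization step for QWK (rounding and clipping to the observed label range) is an assumption, since the card does not state the label scale.

```python
# A sketch of the evaluation metrics; the rounding/clipping used to
# discretize predictions for QWK is an assumption.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    mse = mean_squared_error(labels, preds)
    rmse = float(np.sqrt(mse))
    # QWK compares discrete ratings: round predictions and clip them
    # to the range observed in the gold labels.
    lo, hi = int(labels.min()), int(labels.max())
    discrete = np.clip(np.rint(preds), lo, hi).astype(int)
    qwk = cohen_kappa_score(labels.astype(int), discrete, weights="quadratic")
    return {"qwk": float(qwk), "mse": float(mse), "rmse": rmse}
```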

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Model size: 0.1B params (safetensors)
  • Tensor type: F32
