ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k1_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch for recomputing these metrics follows the list):

  • Loss: 0.4324
  • Qwk: 0.6727
  • Mse: 0.4324
  • Rmse: 0.6576
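
Loss and Mse coincide in every row of the results below, which suggests an MSE training objective (a regression-style head), though the card does not confirm it. As a minimal sketch of how these metrics can be recomputed, assuming Qwk denotes quadratic weighted kappa and using scikit-learn; the `preds` and `labels` arrays are hypothetical stand-ins for the undocumented evaluation set:

```python
# Sketch: recompute Qwk, Mse and Rmse from predictions and gold labels.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

labels = np.array([3, 3, 4, 2, 1])  # hypothetical gold organization ratings
preds = np.array([3, 2, 4, 3, 1])   # hypothetical model predictions

qwk = cohen_kappa_score(labels, preds, weights="quadratic")  # Qwk
mse = mean_squared_error(labels, preds)                      # Mse
rmse = float(np.sqrt(mse))                                   # Rmse
print(qwk, mse, rmse)
```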

Model description

More information needed

Intended uses & limitations

More information needed
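
Since usage is not documented, the following is only a minimal inference sketch. It assumes the checkpoint loads as a standard `AutoModelForSequenceClassification` with a single regression-style output (suggested, but not confirmed, by the identical Loss and Mse values above). The AraBERT authors also recommend cleaning input text with the `arabert` package's `ArabertPreprocessor`, omitted here for brevity.

```python
# A minimal inference sketch under the assumptions stated above.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = (
    "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_"
    "FineTuningAraBERT_run1_AugV5_k1_task7_organization"
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # an Arabic essay to score for organization
inputs = tokenizer(essay, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)  # predicted organization score on the (undocumented) scale
```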

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a matching TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
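
Below is a minimal sketch of the corresponding Hugging Face `TrainingArguments`, assuming the standard `Trainer` API from Transformers 4.44.2 (per the framework versions below); it is not the author's verified training script, and `output_dir` is hypothetical. The Adam betas and epsilon listed above are the `Trainer` defaults, so they need no explicit arguments. Note that although 100 epochs were configured, the log below stops at epoch 74.57 (step 522), which would be consistent with early stopping, though the card does not say so.

```python
# A sketch mapping the listed hyperparameters onto TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task7_organization",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",  # the results table shows an eval every 2 steps
    eval_steps=2,
    logging_steps=500,      # "No log" before step 500 matches this default
)
```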

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-------:|:----:|:---------------:|:------:|:------:|:------:|
| No log | 0.2857 | 2 | 1.7259 | 0.0403 | 1.7259 | 1.3137 |
| No log | 0.5714 | 4 | 0.9688 | 0.1334 | 0.9688 | 0.9843 |
| No log | 0.8571 | 6 | 0.7157 | 0.1287 | 0.7157 | 0.8460 |
| No log | 1.1429 | 8 | 0.7825 | 0.1822 | 0.7825 | 0.8846 |
| No log | 1.4286 | 10 | 0.8707 | 0.3555 | 0.8707 | 0.9331 |
| No log | 1.7143 | 12 | 0.7426 | 0.4255 | 0.7426 | 0.8617 |
| No log | 2.0 | 14 | 0.5326 | 0.5289 | 0.5326 | 0.7298 |
| No log | 2.2857 | 16 | 0.5203 | 0.4237 | 0.5203 | 0.7213 |
| No log | 2.5714 | 18 | 0.4688 | 0.5627 | 0.4688 | 0.6847 |
| No log | 2.8571 | 20 | 0.4870 | 0.5442 | 0.4870 | 0.6978 |
| No log | 3.1429 | 22 | 0.4729 | 0.5786 | 0.4729 | 0.6877 |
| No log | 3.4286 | 24 | 0.4769 | 0.6389 | 0.4769 | 0.6905 |
| No log | 3.7143 | 26 | 0.7054 | 0.5933 | 0.7054 | 0.8399 |
| No log | 4.0 | 28 | 0.9402 | 0.5009 | 0.9402 | 0.9697 |
| No log | 4.2857 | 30 | 0.5216 | 0.6568 | 0.5216 | 0.7222 |
| No log | 4.5714 | 32 | 0.4416 | 0.6341 | 0.4416 | 0.6645 |
| No log | 4.8571 | 34 | 0.4219 | 0.7003 | 0.4219 | 0.6495 |
| No log | 5.1429 | 36 | 0.4101 | 0.6849 | 0.4101 | 0.6404 |
| No log | 5.4286 | 38 | 0.4494 | 0.6367 | 0.4494 | 0.6704 |
| No log | 5.7143 | 40 | 0.4341 | 0.7184 | 0.4341 | 0.6589 |
| No log | 6.0 | 42 | 0.4606 | 0.6073 | 0.4606 | 0.6787 |
| No log | 6.2857 | 44 | 0.5245 | 0.6290 | 0.5245 | 0.7242 |
| No log | 6.5714 | 46 | 0.5707 | 0.6529 | 0.5707 | 0.7555 |
| No log | 6.8571 | 48 | 0.4873 | 0.6301 | 0.4873 | 0.6981 |
| No log | 7.1429 | 50 | 0.4524 | 0.6495 | 0.4524 | 0.6726 |
| No log | 7.4286 | 52 | 0.4561 | 0.6347 | 0.4561 | 0.6754 |
| No log | 7.7143 | 54 | 0.4707 | 0.6139 | 0.4707 | 0.6861 |
| No log | 8.0 | 56 | 0.4730 | 0.6657 | 0.4730 | 0.6878 |
| No log | 8.2857 | 58 | 0.5832 | 0.6420 | 0.5832 | 0.7637 |
| No log | 8.5714 | 60 | 0.6253 | 0.5943 | 0.6253 | 0.7908 |
| No log | 8.8571 | 62 | 0.7011 | 0.5551 | 0.7011 | 0.8373 |
| No log | 9.1429 | 64 | 0.7444 | 0.4914 | 0.7444 | 0.8628 |
| No log | 9.4286 | 66 | 0.6391 | 0.5407 | 0.6391 | 0.7995 |
| No log | 9.7143 | 68 | 0.6380 | 0.5716 | 0.6380 | 0.7987 |
| No log | 10.0 | 70 | 0.5487 | 0.5547 | 0.5487 | 0.7407 |
| No log | 10.2857 | 72 | 0.4994 | 0.6541 | 0.4994 | 0.7067 |
| No log | 10.5714 | 74 | 0.5861 | 0.5871 | 0.5861 | 0.7656 |
| No log | 10.8571 | 76 | 0.5882 | 0.5510 | 0.5882 | 0.7669 |
| No log | 11.1429 | 78 | 0.5327 | 0.5568 | 0.5327 | 0.7298 |
| No log | 11.4286 | 80 | 0.4755 | 0.5874 | 0.4755 | 0.6896 |
| No log | 11.7143 | 82 | 0.4667 | 0.6065 | 0.4667 | 0.6831 |
| No log | 12.0 | 84 | 0.4670 | 0.6222 | 0.4670 | 0.6834 |
| No log | 12.2857 | 86 | 0.4778 | 0.5810 | 0.4778 | 0.6913 |
| No log | 12.5714 | 88 | 0.5014 | 0.6449 | 0.5014 | 0.7081 |
| No log | 12.8571 | 90 | 0.5113 | 0.6449 | 0.5113 | 0.7150 |
| No log | 13.1429 | 92 | 0.4986 | 0.5878 | 0.4986 | 0.7061 |
| No log | 13.4286 | 94 | 0.5413 | 0.5836 | 0.5413 | 0.7357 |
| No log | 13.7143 | 96 | 0.8087 | 0.3790 | 0.8087 | 0.8993 |
| No log | 14.0 | 98 | 0.7856 | 0.4321 | 0.7856 | 0.8864 |
| No log | 14.2857 | 100 | 0.5434 | 0.5738 | 0.5434 | 0.7372 |
| No log | 14.5714 | 102 | 0.4816 | 0.6160 | 0.4816 | 0.6939 |
| No log | 14.8571 | 104 | 0.5119 | 0.6427 | 0.5119 | 0.7155 |
| No log | 15.1429 | 106 | 0.4754 | 0.6541 | 0.4754 | 0.6895 |
| No log | 15.4286 | 108 | 0.4717 | 0.6867 | 0.4717 | 0.6868 |
| No log | 15.7143 | 110 | 0.5186 | 0.6292 | 0.5186 | 0.7201 |
| No log | 16.0 | 112 | 0.5209 | 0.6292 | 0.5209 | 0.7217 |
| No log | 16.2857 | 114 | 0.4535 | 0.6598 | 0.4535 | 0.6734 |
| No log | 16.5714 | 116 | 0.4308 | 0.6899 | 0.4308 | 0.6563 |
| No log | 16.8571 | 118 | 0.4808 | 0.6441 | 0.4808 | 0.6934 |
| No log | 17.1429 | 120 | 0.5140 | 0.5568 | 0.5140 | 0.7169 |
| No log | 17.4286 | 122 | 0.4600 | 0.6173 | 0.4600 | 0.6783 |
| No log | 17.7143 | 124 | 0.4478 | 0.6575 | 0.4478 | 0.6692 |
| No log | 18.0 | 126 | 0.5281 | 0.6719 | 0.5281 | 0.7267 |
| No log | 18.2857 | 128 | 0.5447 | 0.6294 | 0.5447 | 0.7381 |
| No log | 18.5714 | 130 | 0.5249 | 0.6092 | 0.5249 | 0.7245 |
| No log | 18.8571 | 132 | 0.4571 | 0.5891 | 0.4571 | 0.6761 |
| No log | 19.1429 | 134 | 0.4550 | 0.6275 | 0.4550 | 0.6745 |
| No log | 19.4286 | 136 | 0.4801 | 0.6160 | 0.4801 | 0.6929 |
| No log | 19.7143 | 138 | 0.4697 | 0.6083 | 0.4697 | 0.6854 |
| No log | 20.0 | 140 | 0.4567 | 0.6210 | 0.4567 | 0.6758 |
| No log | 20.2857 | 142 | 0.4503 | 0.5961 | 0.4503 | 0.6710 |
| No log | 20.5714 | 144 | 0.4591 | 0.6427 | 0.4591 | 0.6776 |
| No log | 20.8571 | 146 | 0.4532 | 0.6515 | 0.4532 | 0.6732 |
| No log | 21.1429 | 148 | 0.4622 | 0.6867 | 0.4622 | 0.6799 |
| No log | 21.4286 | 150 | 0.4271 | 0.6849 | 0.4271 | 0.6535 |
| No log | 21.7143 | 152 | 0.4283 | 0.5974 | 0.4283 | 0.6545 |
| No log | 22.0 | 154 | 0.4580 | 0.5961 | 0.4580 | 0.6768 |
| No log | 22.2857 | 156 | 0.4424 | 0.5559 | 0.4424 | 0.6651 |
| No log | 22.5714 | 158 | 0.4234 | 0.6254 | 0.4234 | 0.6507 |
| No log | 22.8571 | 160 | 0.4259 | 0.6455 | 0.4259 | 0.6526 |
| No log | 23.1429 | 162 | 0.4273 | 0.6359 | 0.4273 | 0.6537 |
| No log | 23.4286 | 164 | 0.4261 | 0.6564 | 0.4261 | 0.6527 |
| No log | 23.7143 | 166 | 0.4369 | 0.6762 | 0.4369 | 0.6610 |
| No log | 24.0 | 168 | 0.4465 | 0.6595 | 0.4465 | 0.6682 |
| No log | 24.2857 | 170 | 0.4348 | 0.6322 | 0.4348 | 0.6594 |
| No log | 24.5714 | 172 | 0.4238 | 0.6469 | 0.4238 | 0.6510 |
| No log | 24.8571 | 174 | 0.4178 | 0.6750 | 0.4178 | 0.6464 |
| No log | 25.1429 | 176 | 0.4122 | 0.6277 | 0.4122 | 0.6421 |
| No log | 25.4286 | 178 | 0.4186 | 0.6158 | 0.4186 | 0.6470 |
| No log | 25.7143 | 180 | 0.4152 | 0.6359 | 0.4152 | 0.6443 |
| No log | 26.0 | 182 | 0.4200 | 0.6310 | 0.4200 | 0.6481 |
| No log | 26.2857 | 184 | 0.4282 | 0.6458 | 0.4282 | 0.6544 |
| No log | 26.5714 | 186 | 0.4854 | 0.6341 | 0.4854 | 0.6967 |
| No log | 26.8571 | 188 | 0.5149 | 0.6768 | 0.5149 | 0.7176 |
| No log | 27.1429 | 190 | 0.5456 | 0.6508 | 0.5456 | 0.7386 |
| No log | 27.4286 | 192 | 0.4929 | 0.6147 | 0.4929 | 0.7021 |
| No log | 27.7143 | 194 | 0.4386 | 0.6359 | 0.4386 | 0.6623 |
| No log | 28.0 | 196 | 0.4568 | 0.6950 | 0.4568 | 0.6759 |
| No log | 28.2857 | 198 | 0.4687 | 0.6683 | 0.4687 | 0.6846 |
| No log | 28.5714 | 200 | 0.4661 | 0.6688 | 0.4661 | 0.6827 |
| No log | 28.8571 | 202 | 0.4494 | 0.6667 | 0.4494 | 0.6704 |
| No log | 29.1429 | 204 | 0.4595 | 0.6124 | 0.4595 | 0.6779 |
| No log | 29.4286 | 206 | 0.4716 | 0.6243 | 0.4716 | 0.6868 |
| No log | 29.7143 | 208 | 0.4610 | 0.6570 | 0.4610 | 0.6790 |
| No log | 30.0 | 210 | 0.4470 | 0.7071 | 0.4470 | 0.6686 |
| No log | 30.2857 | 212 | 0.4407 | 0.7104 | 0.4407 | 0.6639 |
| No log | 30.5714 | 214 | 0.4551 | 0.5886 | 0.4551 | 0.6746 |
| No log | 30.8571 | 216 | 0.4702 | 0.5133 | 0.4702 | 0.6857 |
| No log | 31.1429 | 218 | 0.4717 | 0.4212 | 0.4717 | 0.6868 |
| No log | 31.4286 | 220 | 0.4430 | 0.6643 | 0.4430 | 0.6655 |
| No log | 31.7143 | 222 | 0.4463 | 0.6957 | 0.4463 | 0.6681 |
| No log | 32.0 | 224 | 0.4589 | 0.6515 | 0.4589 | 0.6774 |
| No log | 32.2857 | 226 | 0.4796 | 0.6526 | 0.4796 | 0.6925 |
| No log | 32.5714 | 228 | 0.5057 | 0.6289 | 0.5057 | 0.7111 |
| No log | 32.8571 | 230 | 0.4724 | 0.6351 | 0.4724 | 0.6873 |
| No log | 33.1429 | 232 | 0.4601 | 0.6351 | 0.4601 | 0.6783 |
| No log | 33.4286 | 234 | 0.4550 | 0.6515 | 0.4550 | 0.6746 |
| No log | 33.7143 | 236 | 0.4352 | 0.6849 | 0.4352 | 0.6597 |
| No log | 34.0 | 238 | 0.4298 | 0.6841 | 0.4298 | 0.6556 |
| No log | 34.2857 | 240 | 0.4619 | 0.5996 | 0.4619 | 0.6796 |
| No log | 34.5714 | 242 | 0.4714 | 0.6272 | 0.4714 | 0.6866 |
| No log | 34.8571 | 244 | 0.4493 | 0.6886 | 0.4493 | 0.6703 |
| No log | 35.1429 | 246 | 0.4573 | 0.7060 | 0.4573 | 0.6762 |
| No log | 35.4286 | 248 | 0.4726 | 0.6170 | 0.4726 | 0.6875 |
| No log | 35.7143 | 250 | 0.4676 | 0.6843 | 0.4676 | 0.6838 |
| No log | 36.0 | 252 | 0.4488 | 0.7122 | 0.4488 | 0.6699 |
| No log | 36.2857 | 254 | 0.4505 | 0.6210 | 0.4505 | 0.6712 |
| No log | 36.5714 | 256 | 0.4351 | 0.7062 | 0.4351 | 0.6596 |
| No log | 36.8571 | 258 | 0.4299 | 0.6837 | 0.4299 | 0.6557 |
| No log | 37.1429 | 260 | 0.4324 | 0.6771 | 0.4324 | 0.6576 |
| No log | 37.4286 | 262 | 0.4175 | 0.6667 | 0.4175 | 0.6462 |
| No log | 37.7143 | 264 | 0.4252 | 0.6530 | 0.4252 | 0.6521 |
| No log | 38.0 | 266 | 0.4733 | 0.5845 | 0.4733 | 0.6880 |
| No log | 38.2857 | 268 | 0.4858 | 0.5624 | 0.4858 | 0.6970 |
| No log | 38.5714 | 270 | 0.4469 | 0.5983 | 0.4469 | 0.6685 |
| No log | 38.8571 | 272 | 0.4145 | 0.6929 | 0.4145 | 0.6438 |
| No log | 39.1429 | 274 | 0.4825 | 0.6367 | 0.4825 | 0.6946 |
| No log | 39.4286 | 276 | 0.5528 | 0.6241 | 0.5528 | 0.7435 |
| No log | 39.7143 | 278 | 0.5239 | 0.6240 | 0.5239 | 0.7238 |
| No log | 40.0 | 280 | 0.4552 | 0.6682 | 0.4552 | 0.6747 |
| No log | 40.2857 | 282 | 0.4363 | 0.6552 | 0.4363 | 0.6606 |
| No log | 40.5714 | 284 | 0.4326 | 0.6530 | 0.4326 | 0.6578 |
| No log | 40.8571 | 286 | 0.4312 | 0.6530 | 0.4312 | 0.6567 |
| No log | 41.1429 | 288 | 0.4166 | 0.7415 | 0.4166 | 0.6454 |
| No log | 41.4286 | 290 | 0.4246 | 0.6852 | 0.4246 | 0.6516 |
| No log | 41.7143 | 292 | 0.4212 | 0.6852 | 0.4212 | 0.6490 |
| No log | 42.0 | 294 | 0.4173 | 0.6946 | 0.4173 | 0.6460 |
| No log | 42.2857 | 296 | 0.4211 | 0.7012 | 0.4211 | 0.6489 |
| No log | 42.5714 | 298 | 0.4491 | 0.5756 | 0.4491 | 0.6702 |
| No log | 42.8571 | 300 | 0.4829 | 0.5395 | 0.4829 | 0.6949 |
| No log | 43.1429 | 302 | 0.4794 | 0.5756 | 0.4794 | 0.6924 |
| No log | 43.4286 | 304 | 0.4456 | 0.6530 | 0.4456 | 0.6675 |
| No log | 43.7143 | 306 | 0.4227 | 0.6730 | 0.4227 | 0.6502 |
| No log | 44.0 | 308 | 0.4208 | 0.7104 | 0.4208 | 0.6487 |
| No log | 44.2857 | 310 | 0.4236 | 0.7104 | 0.4236 | 0.6508 |
| No log | 44.5714 | 312 | 0.4270 | 0.6929 | 0.4270 | 0.6535 |
| No log | 44.8571 | 314 | 0.4323 | 0.7199 | 0.4323 | 0.6575 |
| No log | 45.1429 | 316 | 0.4451 | 0.6334 | 0.4451 | 0.6671 |
| No log | 45.4286 | 318 | 0.4514 | 0.6132 | 0.4514 | 0.6719 |
| No log | 45.7143 | 320 | 0.4349 | 0.6554 | 0.4349 | 0.6594 |
| No log | 46.0 | 322 | 0.4309 | 0.6866 | 0.4309 | 0.6564 |
| No log | 46.2857 | 324 | 0.4402 | 0.6967 | 0.4402 | 0.6635 |
| No log | 46.5714 | 326 | 0.4519 | 0.6620 | 0.4519 | 0.6722 |
| No log | 46.8571 | 328 | 0.4351 | 0.6598 | 0.4351 | 0.6596 |
| No log | 47.1429 | 330 | 0.4215 | 0.6828 | 0.4215 | 0.6492 |
| No log | 47.4286 | 332 | 0.4290 | 0.6720 | 0.4290 | 0.6550 |
| No log | 47.7143 | 334 | 0.4292 | 0.6530 | 0.4292 | 0.6552 |
| No log | 48.0 | 336 | 0.4201 | 0.6530 | 0.4201 | 0.6481 |
| No log | 48.2857 | 338 | 0.4144 | 0.6919 | 0.4144 | 0.6437 |
| No log | 48.5714 | 340 | 0.4195 | 0.6254 | 0.4195 | 0.6477 |
| No log | 48.8571 | 342 | 0.4380 | 0.6228 | 0.4380 | 0.6618 |
| No log | 49.1429 | 344 | 0.4783 | 0.6004 | 0.4783 | 0.6916 |
| No log | 49.4286 | 346 | 0.5450 | 0.5553 | 0.5450 | 0.7382 |
| No log | 49.7143 | 348 | 0.5988 | 0.5538 | 0.5988 | 0.7738 |
| No log | 50.0 | 350 | 0.5643 | 0.5553 | 0.5643 | 0.7512 |
| No log | 50.2857 | 352 | 0.4873 | 0.6118 | 0.4873 | 0.6980 |
| No log | 50.5714 | 354 | 0.4347 | 0.6530 | 0.4347 | 0.6593 |
| No log | 50.8571 | 356 | 0.4272 | 0.6060 | 0.4272 | 0.6536 |
| No log | 51.1429 | 358 | 0.4261 | 0.6542 | 0.4261 | 0.6528 |
| No log | 51.4286 | 360 | 0.4311 | 0.6727 | 0.4311 | 0.6566 |
| No log | 51.7143 | 362 | 0.4424 | 0.6145 | 0.4424 | 0.6651 |
| No log | 52.0 | 364 | 0.4626 | 0.6334 | 0.4626 | 0.6801 |
| No log | 52.2857 | 366 | 0.4685 | 0.6334 | 0.4685 | 0.6844 |
| No log | 52.5714 | 368 | 0.4567 | 0.6145 | 0.4567 | 0.6758 |
| No log | 52.8571 | 370 | 0.4439 | 0.6554 | 0.4439 | 0.6662 |
| No log | 53.1429 | 372 | 0.4431 | 0.6383 | 0.4431 | 0.6657 |
| No log | 53.4286 | 374 | 0.4449 | 0.6210 | 0.4449 | 0.6670 |
| No log | 53.7143 | 376 | 0.4372 | 0.6383 | 0.4372 | 0.6612 |
| No log | 54.0 | 378 | 0.4309 | 0.6839 | 0.4309 | 0.6564 |
| No log | 54.2857 | 380 | 0.4310 | 0.6720 | 0.4310 | 0.6565 |
| No log | 54.5714 | 382 | 0.4482 | 0.6334 | 0.4482 | 0.6694 |
| No log | 54.8571 | 384 | 0.4477 | 0.6334 | 0.4477 | 0.6691 |
| No log | 55.1429 | 386 | 0.4299 | 0.6720 | 0.4299 | 0.6557 |
| No log | 55.4286 | 388 | 0.4264 | 0.6552 | 0.4264 | 0.6530 |
| No log | 55.7143 | 390 | 0.4305 | 0.6394 | 0.4305 | 0.6561 |
| No log | 56.0 | 392 | 0.4347 | 0.6552 | 0.4347 | 0.6593 |
| No log | 56.2857 | 394 | 0.4348 | 0.6552 | 0.4348 | 0.6594 |
| No log | 56.5714 | 396 | 0.4311 | 0.6552 | 0.4311 | 0.6565 |
| No log | 56.8571 | 398 | 0.4270 | 0.7003 | 0.4270 | 0.6535 |
| No log | 57.1429 | 400 | 0.4221 | 0.6639 | 0.4221 | 0.6497 |
| No log | 57.4286 | 402 | 0.4168 | 0.6828 | 0.4168 | 0.6456 |
| No log | 57.7143 | 404 | 0.4170 | 0.6828 | 0.4170 | 0.6458 |
| No log | 58.0 | 406 | 0.4188 | 0.6919 | 0.4188 | 0.6472 |
| No log | 58.2857 | 408 | 0.4236 | 0.7022 | 0.4236 | 0.6508 |
| No log | 58.5714 | 410 | 0.4299 | 0.7246 | 0.4299 | 0.6557 |
| No log | 58.8571 | 412 | 0.4314 | 0.7104 | 0.4314 | 0.6568 |
| No log | 59.1429 | 414 | 0.4304 | 0.7003 | 0.4304 | 0.6560 |
| No log | 59.4286 | 416 | 0.4339 | 0.6720 | 0.4339 | 0.6587 |
| No log | 59.7143 | 418 | 0.4375 | 0.6720 | 0.4375 | 0.6614 |
| No log | 60.0 | 420 | 0.4332 | 0.6919 | 0.4332 | 0.6581 |
| No log | 60.2857 | 422 | 0.4245 | 0.6828 | 0.4245 | 0.6515 |
| No log | 60.5714 | 424 | 0.4330 | 0.6953 | 0.4330 | 0.6580 |
| No log | 60.8571 | 426 | 0.4504 | 0.6867 | 0.4504 | 0.6711 |
| No log | 61.1429 | 428 | 0.4594 | 0.6434 | 0.4594 | 0.6778 |
| No log | 61.4286 | 430 | 0.4463 | 0.6867 | 0.4463 | 0.6681 |
| No log | 61.7143 | 432 | 0.4287 | 0.7041 | 0.4286 | 0.6547 |
| No log | 62.0 | 434 | 0.4250 | 0.6929 | 0.4250 | 0.6520 |
| No log | 62.2857 | 436 | 0.4343 | 0.6458 | 0.4343 | 0.6590 |
| No log | 62.5714 | 438 | 0.4408 | 0.6145 | 0.4408 | 0.6639 |
| No log | 62.8571 | 440 | 0.4394 | 0.6632 | 0.4394 | 0.6629 |
| No log | 63.1429 | 442 | 0.4365 | 0.7022 | 0.4365 | 0.6607 |
| No log | 63.4286 | 444 | 0.4338 | 0.7138 | 0.4338 | 0.6586 |
| No log | 63.7143 | 446 | 0.4371 | 0.7138 | 0.4371 | 0.6611 |
| No log | 64.0 | 448 | 0.4362 | 0.7138 | 0.4362 | 0.6604 |
| No log | 64.2857 | 450 | 0.4340 | 0.7237 | 0.4340 | 0.6588 |
| No log | 64.5714 | 452 | 0.4323 | 0.7237 | 0.4323 | 0.6575 |
| No log | 64.8571 | 454 | 0.4283 | 0.7415 | 0.4283 | 0.6544 |
| No log | 65.1429 | 456 | 0.4300 | 0.6632 | 0.4300 | 0.6558 |
| No log | 65.4286 | 458 | 0.4310 | 0.6530 | 0.4310 | 0.6565 |
| No log | 65.7143 | 460 | 0.4237 | 0.7022 | 0.4237 | 0.6509 |
| No log | 66.0 | 462 | 0.4226 | 0.7022 | 0.4226 | 0.6501 |
| No log | 66.2857 | 464 | 0.4234 | 0.6929 | 0.4234 | 0.6507 |
| No log | 66.5714 | 466 | 0.4252 | 0.7218 | 0.4252 | 0.6521 |
| No log | 66.8571 | 468 | 0.4256 | 0.6839 | 0.4256 | 0.6524 |
| No log | 67.1429 | 470 | 0.4235 | 0.6929 | 0.4235 | 0.6507 |
| No log | 67.4286 | 472 | 0.4220 | 0.6553 | 0.4220 | 0.6496 |
| No log | 67.7143 | 474 | 0.4185 | 0.6828 | 0.4185 | 0.6469 |
| No log | 68.0 | 476 | 0.4154 | 0.6828 | 0.4154 | 0.6445 |
| No log | 68.2857 | 478 | 0.4154 | 0.6828 | 0.4154 | 0.6445 |
| No log | 68.5714 | 480 | 0.4187 | 0.6919 | 0.4187 | 0.6471 |
| No log | 68.8571 | 482 | 0.4206 | 0.6727 | 0.4206 | 0.6485 |
| No log | 69.1429 | 484 | 0.4199 | 0.6727 | 0.4199 | 0.6480 |
| No log | 69.4286 | 486 | 0.4225 | 0.6830 | 0.4225 | 0.6500 |
| No log | 69.7143 | 488 | 0.4251 | 0.6830 | 0.4251 | 0.6520 |
| No log | 70.0 | 490 | 0.4294 | 0.6730 | 0.4294 | 0.6553 |
| No log | 70.2857 | 492 | 0.4334 | 0.6730 | 0.4334 | 0.6583 |
| No log | 70.5714 | 494 | 0.4363 | 0.6730 | 0.4363 | 0.6605 |
| No log | 70.8571 | 496 | 0.4386 | 0.6899 | 0.4386 | 0.6622 |
| No log | 71.1429 | 498 | 0.4377 | 0.6730 | 0.4377 | 0.6616 |
| 0.1999 | 71.4286 | 500 | 0.4344 | 0.6730 | 0.4344 | 0.6591 |
| 0.1999 | 71.7143 | 502 | 0.4291 | 0.6739 | 0.4291 | 0.6551 |
| 0.1999 | 72.0 | 504 | 0.4248 | 0.6929 | 0.4248 | 0.6518 |
| 0.1999 | 72.2857 | 506 | 0.4247 | 0.6751 | 0.4247 | 0.6517 |
| 0.1999 | 72.5714 | 508 | 0.4245 | 0.6860 | 0.4245 | 0.6515 |
| 0.1999 | 72.8571 | 510 | 0.4222 | 0.6751 | 0.4222 | 0.6497 |
| 0.1999 | 73.1429 | 512 | 0.4218 | 0.6830 | 0.4218 | 0.6495 |
| 0.1999 | 73.4286 | 514 | 0.4265 | 0.6727 | 0.4265 | 0.6531 |
| 0.1999 | 73.7143 | 516 | 0.4338 | 0.6346 | 0.4338 | 0.6586 |
| 0.1999 | 74.0 | 518 | 0.4368 | 0.6145 | 0.4368 | 0.6609 |
| 0.1999 | 74.2857 | 520 | 0.4363 | 0.6530 | 0.4363 | 0.6605 |
| 0.1999 | 74.5714 | 522 | 0.4324 | 0.6727 | 0.4324 | 0.6576 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1