ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k7_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4840
  • Qwk: 0.5808
  • Mse: 0.4840
  • Rmse: 0.6957
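For reference, Qwk is the quadratic weighted kappa between gold and predicted scores, and Rmse is simply the square root of the reported Mse (0.6957² ≈ 0.4840). A minimal sketch of how these metrics can be computed with scikit-learn; the labels below are made up purely for illustration:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold scores and rounded model predictions on a 0-4 scale.
y_true = np.array([0, 1, 2, 2, 3, 4, 1, 3])
y_pred = np.array([0, 1, 2, 3, 3, 4, 2, 2])

# Quadratic weighted kappa: disagreements are penalized by squared distance.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

# MSE on the discrete scores; RMSE is its square root.
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)

print(round(qwk, 4), round(mse, 4), round(rmse, 4))
```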

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
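With lr_scheduler_type: linear, the learning rate decays linearly from 2e-05 toward 0 over the training steps. A minimal sketch of that schedule, assuming zero warmup steps (warmup is not listed among the hyperparameters above) and a hypothetical step count:

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-05) -> float:
    """Linearly decay the learning rate from base_lr to 0 over total_steps."""
    remaining = max(0, total_steps - step)
    return base_lr * remaining / total_steps

# The training log suggests ~35 optimizer steps per epoch, so 100 epochs
# would give roughly 3500 scheduled steps (an assumption for illustration).
total = 3500
print(linear_lr(0, total))       # start of training: 2e-05
print(linear_lr(total // 2, total))  # halfway: 1e-05
print(linear_lr(total, total))   # end: 0.0
```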

Training results

Training loss was only logged every 500 steps, so most rows below show "No log" in the first column; the first logged value (0.2786) appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0571 2 2.5083 -0.0297 2.5083 1.5838
No log 0.1143 4 1.1375 0.1251 1.1375 1.0665
No log 0.1714 6 0.6806 0.1702 0.6806 0.8250
No log 0.2286 8 0.6107 0.3197 0.6107 0.7814
No log 0.2857 10 0.5701 0.3947 0.5701 0.7551
No log 0.3429 12 0.7036 0.3851 0.7036 0.8388
No log 0.4 14 0.8789 0.3425 0.8789 0.9375
No log 0.4571 16 0.6982 0.5146 0.6982 0.8356
No log 0.5143 18 0.5109 0.4802 0.5109 0.7148
No log 0.5714 20 0.5849 0.5323 0.5849 0.7648
No log 0.6286 22 0.6859 0.5024 0.6859 0.8282
No log 0.6857 24 0.5465 0.6330 0.5465 0.7392
No log 0.7429 26 0.6818 0.4783 0.6818 0.8257
No log 0.8 28 1.2708 0.2171 1.2708 1.1273
No log 0.8571 30 1.2707 0.2171 1.2707 1.1273
No log 0.9143 32 0.8421 0.5007 0.8421 0.9176
No log 0.9714 34 0.4135 0.7128 0.4135 0.6430
No log 1.0286 36 0.5817 0.5190 0.5817 0.7627
No log 1.0857 38 0.5813 0.4961 0.5813 0.7625
No log 1.1429 40 0.5230 0.5362 0.5230 0.7232
No log 1.2 42 0.4358 0.6187 0.4358 0.6602
No log 1.2571 44 0.4049 0.6946 0.4049 0.6363
No log 1.3143 46 0.4076 0.6599 0.4076 0.6384
No log 1.3714 48 0.3935 0.6530 0.3935 0.6273
No log 1.4286 50 0.3898 0.6530 0.3898 0.6243
No log 1.4857 52 0.3810 0.6542 0.3810 0.6173
No log 1.5429 54 0.3917 0.7004 0.3917 0.6259
No log 1.6 56 0.5087 0.5616 0.5087 0.7132
No log 1.6571 58 0.4878 0.5679 0.4878 0.6984
No log 1.7143 60 0.3989 0.7364 0.3989 0.6315
No log 1.7714 62 0.4431 0.6394 0.4431 0.6656
No log 1.8286 64 0.4975 0.6032 0.4975 0.7053
No log 1.8857 66 0.4032 0.6651 0.4032 0.6350
No log 1.9429 68 0.4423 0.7031 0.4423 0.6651
No log 2.0 70 0.5239 0.6658 0.5239 0.7238
No log 2.0571 72 0.4331 0.7031 0.4331 0.6581
No log 2.1143 74 0.4368 0.6329 0.4368 0.6609
No log 2.1714 76 0.4485 0.6516 0.4485 0.6697
No log 2.2286 78 0.5080 0.6303 0.5080 0.7128
No log 2.2857 80 0.5891 0.6014 0.5891 0.7675
No log 2.3429 82 0.5251 0.6333 0.5251 0.7246
No log 2.4 84 0.5208 0.6583 0.5208 0.7217
No log 2.4571 86 0.5495 0.6007 0.5495 0.7413
No log 2.5143 88 0.4856 0.6617 0.4856 0.6969
No log 2.5714 90 0.5338 0.5643 0.5338 0.7306
No log 2.6286 92 0.6682 0.5700 0.6682 0.8174
No log 2.6857 94 0.6767 0.5516 0.6767 0.8226
No log 2.7429 96 0.6453 0.5516 0.6453 0.8033
No log 2.8 98 0.5883 0.6030 0.5883 0.7670
No log 2.8571 100 0.5172 0.6156 0.5172 0.7192
No log 2.9143 102 0.4748 0.6247 0.4748 0.6891
No log 2.9714 104 0.4339 0.6307 0.4339 0.6587
No log 3.0286 106 0.4129 0.6455 0.4129 0.6425
No log 3.0857 108 0.4112 0.5956 0.4112 0.6413
No log 3.1429 110 0.4799 0.6092 0.4799 0.6928
No log 3.2 112 0.5101 0.6206 0.5101 0.7142
No log 3.2571 114 0.4300 0.5692 0.4300 0.6557
No log 3.3143 116 0.4071 0.6720 0.4071 0.6380
No log 3.3714 118 0.4937 0.6038 0.4937 0.7026
No log 3.4286 120 0.5243 0.5639 0.5243 0.7241
No log 3.4857 122 0.4459 0.6254 0.4459 0.6677
No log 3.5429 124 0.5066 0.5460 0.5066 0.7117
No log 3.6 126 0.5317 0.6117 0.5317 0.7292
No log 3.6571 128 0.5154 0.5706 0.5154 0.7179
No log 3.7143 130 0.4775 0.5750 0.4775 0.6910
No log 3.7714 132 0.5484 0.6011 0.5484 0.7405
No log 3.8286 134 0.7197 0.4521 0.7197 0.8483
No log 3.8857 136 0.6891 0.4741 0.6891 0.8301
No log 3.9429 138 0.5006 0.5786 0.5006 0.7075
No log 4.0 140 0.4766 0.5970 0.4766 0.6904
No log 4.0571 142 0.5524 0.5792 0.5524 0.7432
No log 4.1143 144 0.5303 0.6218 0.5303 0.7282
No log 4.1714 146 0.4687 0.6793 0.4687 0.6846
No log 4.2286 148 0.4320 0.6263 0.4320 0.6573
No log 4.2857 150 0.4727 0.5652 0.4727 0.6875
No log 4.3429 152 0.4316 0.5983 0.4316 0.6570
No log 4.4 154 0.3975 0.6841 0.3975 0.6305
No log 4.4571 156 0.3996 0.6841 0.3996 0.6322
No log 4.5143 158 0.4051 0.7548 0.4051 0.6365
No log 4.5714 160 0.4478 0.7374 0.4478 0.6692
No log 4.6286 162 0.4517 0.6935 0.4517 0.6721
No log 4.6857 164 0.4393 0.7025 0.4393 0.6628
No log 4.7429 166 0.4552 0.6409 0.4552 0.6747
No log 4.8 168 0.4259 0.6712 0.4259 0.6526
No log 4.8571 170 0.4120 0.6667 0.4120 0.6419
No log 4.9143 172 0.4162 0.6344 0.4162 0.6452
No log 4.9714 174 0.4683 0.6174 0.4683 0.6843
No log 5.0286 176 0.6372 0.5481 0.6372 0.7982
No log 5.0857 178 0.6515 0.5481 0.6515 0.8071
No log 5.1429 180 0.4894 0.6105 0.4894 0.6995
No log 5.2 182 0.4669 0.5627 0.4669 0.6833
No log 5.2571 184 0.5640 0.6273 0.5640 0.7510
No log 5.3143 186 0.5994 0.6010 0.5994 0.7742
No log 5.3714 188 0.5117 0.5869 0.5117 0.7153
No log 5.4286 190 0.4244 0.6140 0.4244 0.6515
No log 5.4857 192 0.4911 0.6582 0.4911 0.7008
No log 5.5429 194 0.5878 0.6059 0.5878 0.7667
No log 5.6 196 0.5162 0.6529 0.5162 0.7185
No log 5.6571 198 0.4468 0.6735 0.4468 0.6684
No log 5.7143 200 0.4699 0.6247 0.4699 0.6855
No log 5.7714 202 0.4506 0.6582 0.4506 0.6712
No log 5.8286 204 0.4536 0.6568 0.4536 0.6735
No log 5.8857 206 0.4731 0.6322 0.4731 0.6878
No log 5.9429 208 0.4603 0.6127 0.4603 0.6784
No log 6.0 210 0.4211 0.6418 0.4211 0.6489
No log 6.0571 212 0.3895 0.6200 0.3895 0.6241
No log 6.1143 214 0.3860 0.6255 0.3860 0.6213
No log 6.1714 216 0.3883 0.6771 0.3883 0.6232
No log 6.2286 218 0.3806 0.6479 0.3806 0.6170
No log 6.2857 220 0.3903 0.6721 0.3903 0.6247
No log 6.3429 222 0.4095 0.6506 0.4095 0.6399
No log 6.4 224 0.3987 0.5995 0.3987 0.6314
No log 6.4571 226 0.3977 0.6214 0.3977 0.6307
No log 6.5143 228 0.3957 0.6577 0.3957 0.6290
No log 6.5714 230 0.3960 0.6577 0.3960 0.6293
No log 6.6286 232 0.4037 0.6477 0.4037 0.6354
No log 6.6857 234 0.4073 0.6567 0.4073 0.6382
No log 6.7429 236 0.4126 0.6821 0.4126 0.6423
No log 6.8 238 0.4634 0.5677 0.4634 0.6807
No log 6.8571 240 0.4728 0.6189 0.4728 0.6876
No log 6.9143 242 0.4050 0.6541 0.4050 0.6364
No log 6.9714 244 0.4630 0.6537 0.4630 0.6805
No log 7.0286 246 0.6565 0.5017 0.6565 0.8103
No log 7.0857 248 0.7420 0.5098 0.7420 0.8614
No log 7.1429 250 0.7605 0.5045 0.7605 0.8721
No log 7.2 252 0.5703 0.5565 0.5703 0.7552
No log 7.2571 254 0.4399 0.6395 0.4399 0.6632
No log 7.3143 256 0.4774 0.5998 0.4774 0.6910
No log 7.3714 258 0.5177 0.5622 0.5177 0.7195
No log 7.4286 260 0.4851 0.5698 0.4851 0.6965
No log 7.4857 262 0.4381 0.6184 0.4381 0.6619
No log 7.5429 264 0.4362 0.5815 0.4362 0.6604
No log 7.6 266 0.4791 0.5855 0.4791 0.6922
No log 7.6571 268 0.5040 0.6294 0.5040 0.7099
No log 7.7143 270 0.5059 0.5957 0.5059 0.7113
No log 7.7714 272 0.4622 0.5983 0.4622 0.6799
No log 7.8286 274 0.4127 0.6770 0.4127 0.6424
No log 7.8857 276 0.4531 0.6394 0.4531 0.6732
No log 7.9429 278 0.4893 0.5379 0.4893 0.6995
No log 8.0 280 0.4495 0.6490 0.4495 0.6704
No log 8.0571 282 0.4075 0.6543 0.4075 0.6384
No log 8.1143 284 0.3960 0.6335 0.3960 0.6293
No log 8.1714 286 0.3997 0.5633 0.3997 0.6322
No log 8.2286 288 0.4447 0.6096 0.4447 0.6668
No log 8.2857 290 0.4862 0.6356 0.4862 0.6973
No log 8.3429 292 0.4796 0.6357 0.4796 0.6925
No log 8.4 294 0.4911 0.6363 0.4911 0.7008
No log 8.4571 296 0.5033 0.6182 0.5033 0.7094
No log 8.5143 298 0.4653 0.6170 0.4653 0.6821
No log 8.5714 300 0.4261 0.6092 0.4261 0.6528
No log 8.6286 302 0.4222 0.6244 0.4222 0.6498
No log 8.6857 304 0.4278 0.6158 0.4278 0.6541
No log 8.7429 306 0.4184 0.6247 0.4184 0.6468
No log 8.8 308 0.4204 0.5841 0.4204 0.6484
No log 8.8571 310 0.4060 0.6683 0.4060 0.6371
No log 8.9143 312 0.4031 0.6745 0.4031 0.6349
No log 8.9714 314 0.3988 0.6651 0.3988 0.6315
No log 9.0286 316 0.4113 0.6613 0.4113 0.6413
No log 9.0857 318 0.3937 0.6377 0.3937 0.6274
No log 9.1429 320 0.3735 0.7042 0.3735 0.6111
No log 9.2 322 0.3897 0.7051 0.3897 0.6243
No log 9.2571 324 0.3905 0.7051 0.3905 0.6249
No log 9.3143 326 0.3726 0.7133 0.3726 0.6104
No log 9.3714 328 0.3781 0.7138 0.3781 0.6149
No log 9.4286 330 0.3952 0.6736 0.3952 0.6287
No log 9.4857 332 0.4000 0.7033 0.4000 0.6325
No log 9.5429 334 0.4232 0.6721 0.4232 0.6506
No log 9.6 336 0.4344 0.6705 0.4344 0.6591
No log 9.6571 338 0.4300 0.5414 0.4300 0.6558
No log 9.7143 340 0.4417 0.5666 0.4417 0.6646
No log 9.7714 342 0.4402 0.5649 0.4402 0.6635
No log 9.8286 344 0.4357 0.6068 0.4357 0.6601
No log 9.8857 346 0.4349 0.6914 0.4349 0.6595
No log 9.9429 348 0.4311 0.6721 0.4311 0.6566
No log 10.0 350 0.4281 0.6200 0.4281 0.6543
No log 10.0571 352 0.4224 0.6001 0.4224 0.6500
No log 10.1143 354 0.4209 0.5915 0.4209 0.6488
No log 10.1714 356 0.4271 0.5714 0.4271 0.6535
No log 10.2286 358 0.4266 0.5714 0.4266 0.6531
No log 10.2857 360 0.4303 0.5800 0.4303 0.6559
No log 10.3429 362 0.4538 0.6118 0.4538 0.6737
No log 10.4 364 0.5064 0.5639 0.5064 0.7116
No log 10.4571 366 0.4752 0.6087 0.4752 0.6893
No log 10.5143 368 0.4135 0.6307 0.4135 0.6430
No log 10.5714 370 0.4161 0.6101 0.4161 0.6451
No log 10.6286 372 0.4448 0.6047 0.4448 0.6670
No log 10.6857 374 0.4710 0.5983 0.4710 0.6863
No log 10.7429 376 0.4479 0.6061 0.4479 0.6692
No log 10.8 378 0.4197 0.6305 0.4197 0.6479
No log 10.8571 380 0.4051 0.6577 0.4051 0.6365
No log 10.9143 382 0.4120 0.6721 0.4120 0.6419
No log 10.9714 384 0.4165 0.6721 0.4165 0.6454
No log 11.0286 386 0.4016 0.6242 0.4016 0.6337
No log 11.0857 388 0.4137 0.6683 0.4137 0.6432
No log 11.1429 390 0.4251 0.6870 0.4251 0.6520
No log 11.2 392 0.4403 0.6337 0.4403 0.6635
No log 11.2571 394 0.4350 0.6419 0.4350 0.6596
No log 11.3143 396 0.4248 0.6223 0.4248 0.6517
No log 11.3714 398 0.4168 0.5970 0.4168 0.6456
No log 11.4286 400 0.4099 0.6344 0.4099 0.6402
No log 11.4857 402 0.4360 0.6082 0.4360 0.6603
No log 11.5429 404 0.4648 0.6214 0.4648 0.6818
No log 11.6 406 0.4628 0.6579 0.4628 0.6803
No log 11.6571 408 0.4442 0.6169 0.4442 0.6665
No log 11.7143 410 0.4414 0.6169 0.4414 0.6644
No log 11.7714 412 0.4395 0.5927 0.4395 0.6630
No log 11.8286 414 0.4318 0.5675 0.4318 0.6571
No log 11.8857 416 0.4416 0.5516 0.4416 0.6645
No log 11.9429 418 0.4217 0.5995 0.4217 0.6494
No log 12.0 420 0.4027 0.6443 0.4027 0.6346
No log 12.0571 422 0.4179 0.6357 0.4179 0.6465
No log 12.1143 424 0.4256 0.6370 0.4256 0.6524
No log 12.1714 426 0.4258 0.5945 0.4258 0.6525
No log 12.2286 428 0.4407 0.5432 0.4407 0.6638
No log 12.2857 430 0.4963 0.5599 0.4963 0.7045
No log 12.3429 432 0.5392 0.5808 0.5392 0.7343
No log 12.4 434 0.5255 0.5836 0.5255 0.7249
No log 12.4571 436 0.4932 0.6024 0.4932 0.7023
No log 12.5143 438 0.4513 0.6020 0.4513 0.6718
No log 12.5714 440 0.4089 0.6255 0.4089 0.6394
No log 12.6286 442 0.3959 0.6833 0.3959 0.6292
No log 12.6857 444 0.3866 0.6628 0.3866 0.6218
No log 12.7429 446 0.4012 0.5798 0.4012 0.6334
No log 12.8 448 0.4630 0.5135 0.4630 0.6805
No log 12.8571 450 0.4809 0.5603 0.4809 0.6935
No log 12.9143 452 0.4579 0.5373 0.4579 0.6766
No log 12.9714 454 0.4510 0.5578 0.4510 0.6715
No log 13.0286 456 0.4609 0.6318 0.4609 0.6789
No log 13.0857 458 0.4654 0.6318 0.4654 0.6822
No log 13.1429 460 0.4629 0.5728 0.4629 0.6804
No log 13.2 462 0.4661 0.5528 0.4661 0.6827
No log 13.2571 464 0.4756 0.5528 0.4756 0.6896
No log 13.3143 466 0.4597 0.5373 0.4597 0.6780
No log 13.3714 468 0.4276 0.5356 0.4276 0.6539
No log 13.4286 470 0.4092 0.6477 0.4092 0.6397
No log 13.4857 472 0.4125 0.7166 0.4125 0.6422
No log 13.5429 474 0.4231 0.7004 0.4231 0.6504
No log 13.6 476 0.4384 0.6280 0.4384 0.6621
No log 13.6571 478 0.4286 0.6590 0.4286 0.6546
No log 13.7143 480 0.4302 0.6590 0.4302 0.6559
No log 13.7714 482 0.4443 0.6449 0.4443 0.6666
No log 13.8286 484 0.4472 0.6182 0.4472 0.6687
No log 13.8857 486 0.4361 0.7004 0.4361 0.6604
No log 13.9429 488 0.4354 0.7175 0.4354 0.6598
No log 14.0 490 0.4271 0.6863 0.4271 0.6535
No log 14.0571 492 0.4355 0.5702 0.4355 0.6599
No log 14.1143 494 0.4576 0.5794 0.4576 0.6765
No log 14.1714 496 0.4574 0.5869 0.4574 0.6763
No log 14.2286 498 0.4347 0.5642 0.4347 0.6593
0.2786 14.2857 500 0.4145 0.6678 0.4145 0.6438
0.2786 14.3429 502 0.4149 0.7013 0.4149 0.6441
0.2786 14.4 504 0.4273 0.6420 0.4273 0.6537
0.2786 14.4571 506 0.4403 0.6420 0.4403 0.6635
0.2786 14.5143 508 0.4575 0.6536 0.4575 0.6764
0.2786 14.5714 510 0.4718 0.6181 0.4718 0.6868
0.2786 14.6286 512 0.4840 0.5808 0.4840 0.6957
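Note that the final row above (epoch 14.6286, the checkpoint whose metrics are reported at the top of this card) is not the best by validation Qwk; the peak Qwk of 0.7548 occurs much earlier, at epoch 4.5143 (step 158). A small sketch of selecting the best checkpoint from a metrics log like this one; only a handful of rows from the table are reproduced:

```python
# A few (epoch, step, validation_loss, qwk) rows copied from the log above.
log = [
    (0.9714, 34, 0.4135, 0.7128),
    (1.7143, 60, 0.3989, 0.7364),
    (4.5143, 158, 0.4051, 0.7548),
    (9.3143, 326, 0.3726, 0.7133),
    (14.6286, 512, 0.4840, 0.5808),  # final checkpoint reported above
]

# Select the checkpoint with the highest validation Qwk.
best = max(log, key=lambda row: row[3])
print(best)  # (4.5143, 158, 0.4051, 0.7548)
```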

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Model size: 0.1B params
  • Tensor type: F32 (Safetensors)
