ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k7_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6213
  • Qwk: 0.4964
  • Mse: 0.6213
  • Rmse: 0.7882
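
Here Qwk denotes quadratic weighted Cohen's kappa between predicted and gold scores, and Mse/Rmse are the (root) mean squared error; the fact that Loss and Mse coincide suggests an MSE training objective. A minimal sketch of how such metrics can be computed with scikit-learn (the score arrays below are illustrative placeholders, not from the original evaluation):

```python
# Hypothetical sketch: computing Qwk, Mse and Rmse for integer essay scores.
# The example labels/predictions are made up for illustration only.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 1, 3])  # gold organization scores (illustrative)
y_pred = np.array([3, 3, 4, 2, 2])  # model predictions rounded to integers

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```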

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
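
A minimal sketch of how these hyperparameters would typically map onto the Hugging Face Trainer (argument names follow transformers 4.44; the output directory, regression head, and evaluation interval are assumptions, not taken from the original training script):

```python
# Hypothetical sketch mapping the listed hyperparameters onto TrainingArguments.
# Dataset loading, preprocessing, and the metric function are omitted.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base_model = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(
    base_model, num_labels=1  # num_labels=1 assumes a regression (scoring) head
)

args = TrainingArguments(
    output_dir="arabert-task7-organization",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",  # the results table below evaluates every 2 steps
    eval_steps=2,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the optimizer defaults.
)

trainer = Trainer(model=model, args=args)  # train_dataset / eval_dataset would be supplied here
```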

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0541 2 2.5107 -0.0646 2.5107 1.5845
No log 0.1081 4 1.1341 0.1271 1.1341 1.0649
No log 0.1622 6 0.7839 0.0944 0.7839 0.8854
No log 0.2162 8 0.7216 0.0376 0.7216 0.8495
No log 0.2703 10 0.7619 0.2754 0.7619 0.8728
No log 0.3243 12 0.9481 0.2683 0.9481 0.9737
No log 0.3784 14 0.9589 0.2457 0.9589 0.9792
No log 0.4324 16 0.7008 0.3782 0.7008 0.8371
No log 0.4865 18 0.7265 0.0898 0.7265 0.8524
No log 0.5405 20 0.6803 0.0804 0.6803 0.8248
No log 0.5946 22 0.8092 0.3312 0.8092 0.8996
No log 0.6486 24 0.9586 0.0816 0.9586 0.9791
No log 0.7027 26 0.9525 -0.0149 0.9525 0.9760
No log 0.7568 28 0.9267 0.0478 0.9267 0.9627
No log 0.8108 30 0.9178 0.1313 0.9178 0.9580
No log 0.8649 32 0.8887 0.1807 0.8887 0.9427
No log 0.9189 34 0.8718 0.1094 0.8718 0.9337
No log 0.9730 36 0.8832 -0.0851 0.8832 0.9398
No log 1.0270 38 0.8905 0.0977 0.8905 0.9436
No log 1.0811 40 0.8438 0.3723 0.8438 0.9186
No log 1.1351 42 0.8377 0.2460 0.8377 0.9153
No log 1.1892 44 0.8706 0.2363 0.8706 0.9331
No log 1.2432 46 1.0194 0.1599 1.0194 1.0097
No log 1.2973 48 1.0375 0.1827 1.0375 1.0186
No log 1.3514 50 1.0865 -0.0436 1.0865 1.0423
No log 1.4054 52 1.1060 -0.0137 1.1060 1.0517
No log 1.4595 54 1.2029 0.1428 1.2029 1.0968
No log 1.5135 56 1.2832 0.0921 1.2832 1.1328
No log 1.5676 58 1.1627 0.1235 1.1627 1.0783
No log 1.6216 60 0.9488 0.2703 0.9488 0.9741
No log 1.6757 62 0.8260 0.3473 0.8260 0.9089
No log 1.7297 64 0.7239 0.4247 0.7239 0.8508
No log 1.7838 66 0.6460 0.4182 0.6460 0.8037
No log 1.8378 68 0.6090 0.4206 0.6090 0.7804
No log 1.8919 70 0.5972 0.5324 0.5972 0.7728
No log 1.9459 72 0.6202 0.3229 0.6202 0.7875
No log 2.0 74 0.6916 0.1244 0.6916 0.8316
No log 2.0541 76 0.7415 0.1218 0.7415 0.8611
No log 2.1081 78 0.7429 0.0822 0.7429 0.8619
No log 2.1622 80 0.7097 0.1723 0.7097 0.8424
No log 2.2162 82 0.7395 0.3843 0.7395 0.8600
No log 2.2703 84 0.9096 0.3461 0.9096 0.9537
No log 2.3243 86 1.0084 0.3707 1.0084 1.0042
No log 2.3784 88 0.9923 0.3501 0.9923 0.9961
No log 2.4324 90 0.8898 0.2781 0.8898 0.9433
No log 2.4865 92 0.7949 0.2899 0.7949 0.8916
No log 2.5405 94 0.7232 0.3224 0.7232 0.8504
No log 2.5946 96 0.6974 0.4437 0.6974 0.8351
No log 2.6486 98 0.6498 0.4614 0.6498 0.8061
No log 2.7027 100 0.6906 0.3919 0.6906 0.8310
No log 2.7568 102 0.7327 0.4769 0.7327 0.8560
No log 2.8108 104 0.6890 0.4769 0.6890 0.8300
No log 2.8649 106 0.6553 0.4769 0.6553 0.8095
No log 2.9189 108 0.6775 0.5275 0.6775 0.8231
No log 2.9730 110 0.7467 0.4721 0.7467 0.8641
No log 3.0270 112 0.6689 0.4366 0.6689 0.8178
No log 3.0811 114 0.5323 0.4380 0.5323 0.7296
No log 3.1351 116 0.5293 0.4345 0.5293 0.7275
No log 3.1892 118 0.5352 0.4380 0.5352 0.7316
No log 3.2432 120 0.5791 0.4555 0.5791 0.7610
No log 3.2973 122 0.7003 0.4366 0.7003 0.8369
No log 3.3514 124 0.7302 0.4328 0.7302 0.8545
No log 3.4054 126 0.7075 0.4707 0.7075 0.8411
No log 3.4595 128 0.6248 0.3814 0.6248 0.7904
No log 3.5135 130 0.6000 0.3763 0.6000 0.7746
No log 3.5676 132 0.5965 0.4100 0.5965 0.7723
No log 3.6216 134 0.6172 0.4076 0.6172 0.7856
No log 3.6757 136 0.5965 0.3919 0.5965 0.7723
No log 3.7297 138 0.5774 0.3919 0.5774 0.7599
No log 3.7838 140 0.5730 0.4171 0.5730 0.7569
No log 3.8378 142 0.5773 0.3939 0.5773 0.7598
No log 3.8919 144 0.5719 0.4194 0.5719 0.7563
No log 3.9459 146 0.5960 0.4135 0.5960 0.7720
No log 4.0 148 0.7322 0.4085 0.7322 0.8557
No log 4.0541 150 0.8073 0.3767 0.8073 0.8985
No log 4.1081 152 0.7539 0.3623 0.7539 0.8683
No log 4.1622 154 0.6995 0.4664 0.6995 0.8364
No log 4.2162 156 0.6604 0.4182 0.6604 0.8126
No log 4.2703 158 0.6050 0.4875 0.6050 0.7778
No log 4.3243 160 0.5482 0.4276 0.5482 0.7404
No log 4.3784 162 0.5312 0.4809 0.5312 0.7289
No log 4.4324 164 0.5297 0.5475 0.5297 0.7278
No log 4.4865 166 0.5408 0.4972 0.5408 0.7354
No log 4.5405 168 0.5624 0.4484 0.5624 0.7499
No log 4.5946 170 0.5938 0.4285 0.5938 0.7706
No log 4.6486 172 0.6204 0.4610 0.6204 0.7877
No log 4.7027 174 0.5520 0.5649 0.5520 0.7430
No log 4.7568 176 0.5466 0.4847 0.5466 0.7393
No log 4.8108 178 0.6412 0.4717 0.6412 0.8007
No log 4.8649 180 0.6588 0.4717 0.6588 0.8117
No log 4.9189 182 0.5656 0.4754 0.5656 0.7521
No log 4.9730 184 0.5437 0.6154 0.5437 0.7373
No log 5.0270 186 0.5541 0.5892 0.5541 0.7444
No log 5.0811 188 0.5330 0.5892 0.5330 0.7301
No log 5.1351 190 0.5249 0.6020 0.5249 0.7245
No log 5.1892 192 0.5322 0.5999 0.5322 0.7295
No log 5.2432 194 0.5670 0.5657 0.5670 0.7530
No log 5.2973 196 0.5796 0.5230 0.5796 0.7613
No log 5.3514 198 0.5760 0.4698 0.5760 0.7589
No log 5.4054 200 0.6258 0.4596 0.6258 0.7911
No log 5.4595 202 0.7498 0.3996 0.7498 0.8659
No log 5.5135 204 0.9218 0.3217 0.9218 0.9601
No log 5.5676 206 1.1763 0.2359 1.1763 1.0846
No log 5.6216 208 1.2899 0.2416 1.2899 1.1358
No log 5.6757 210 1.2449 0.2059 1.2449 1.1158
No log 5.7297 212 1.0770 0.2215 1.0770 1.0378
No log 5.7838 214 0.9360 0.2124 0.9360 0.9675
No log 5.8378 216 0.9411 0.2132 0.9411 0.9701
No log 5.8919 218 0.8698 0.2502 0.8698 0.9326
No log 5.9459 220 0.7740 0.3817 0.7740 0.8798
No log 6.0 222 0.7320 0.3963 0.7320 0.8556
No log 6.0541 224 0.6931 0.3963 0.6931 0.8325
No log 6.1081 226 0.6627 0.5362 0.6627 0.8140
No log 6.1622 228 0.5825 0.5428 0.5825 0.7632
No log 6.2162 230 0.5449 0.5647 0.5449 0.7382
No log 6.2703 232 0.5505 0.5647 0.5505 0.7420
No log 6.3243 234 0.6073 0.5098 0.6073 0.7793
No log 6.3784 236 0.6755 0.4892 0.6755 0.8219
No log 6.4324 238 0.7369 0.4272 0.7369 0.8585
No log 6.4865 240 0.7219 0.4190 0.7219 0.8497
No log 6.5405 242 0.7235 0.4270 0.7235 0.8506
No log 6.5946 244 0.7640 0.4036 0.7640 0.8741
No log 6.6486 246 0.7111 0.4190 0.7111 0.8432
No log 6.7027 248 0.6844 0.4134 0.6844 0.8273
No log 6.7568 250 0.6914 0.4371 0.6914 0.8315
No log 6.8108 252 0.6929 0.4606 0.6929 0.8324
No log 6.8649 254 0.6122 0.4704 0.6122 0.7824
No log 6.9189 256 0.5658 0.5472 0.5658 0.7522
No log 6.9730 258 0.5609 0.5703 0.5609 0.7489
No log 7.0270 260 0.5516 0.5323 0.5516 0.7427
No log 7.0811 262 0.5551 0.5647 0.5551 0.7451
No log 7.1351 264 0.5641 0.5897 0.5641 0.7511
No log 7.1892 266 0.5684 0.5897 0.5684 0.7539
No log 7.2432 268 0.5446 0.5897 0.5446 0.7380
No log 7.2973 270 0.5188 0.5054 0.5188 0.7202
No log 7.3514 272 0.5279 0.5368 0.5279 0.7266
No log 7.4054 274 0.5251 0.5118 0.5251 0.7247
No log 7.4595 276 0.5607 0.5593 0.5607 0.7488
No log 7.5135 278 0.6199 0.5442 0.6199 0.7873
No log 7.5676 280 0.6322 0.4451 0.6322 0.7951
No log 7.6216 282 0.6055 0.4375 0.6055 0.7781
No log 7.6757 284 0.5906 0.4660 0.5906 0.7685
No log 7.7297 286 0.5914 0.4763 0.5914 0.7690
No log 7.7838 288 0.6267 0.5463 0.6267 0.7916
No log 7.8378 290 0.6559 0.4895 0.6559 0.8099
No log 7.8919 292 0.6132 0.5143 0.6132 0.7831
No log 7.9459 294 0.6890 0.4495 0.6890 0.8301
No log 8.0 296 0.8246 0.3437 0.8246 0.9081
No log 8.0541 298 0.7850 0.3204 0.7850 0.8860
No log 8.1081 300 0.7122 0.3319 0.7122 0.8439
No log 8.1622 302 0.6380 0.4371 0.6380 0.7988
No log 8.2162 304 0.6014 0.5345 0.6014 0.7755
No log 8.2703 306 0.5758 0.5411 0.5758 0.7588
No log 8.3243 308 0.5209 0.5357 0.5209 0.7217
No log 8.3784 310 0.4898 0.5941 0.4898 0.6999
No log 8.4324 312 0.5267 0.6342 0.5267 0.7258
No log 8.4865 314 0.5084 0.6598 0.5084 0.7130
No log 8.5405 316 0.5196 0.5425 0.5196 0.7208
No log 8.5946 318 0.5345 0.5383 0.5345 0.7311
No log 8.6486 320 0.5217 0.5574 0.5217 0.7223
No log 8.7027 322 0.5209 0.5619 0.5209 0.7217
No log 8.7568 324 0.5351 0.5421 0.5351 0.7315
No log 8.8108 326 0.5397 0.5476 0.5397 0.7346
No log 8.8649 328 0.5384 0.5549 0.5384 0.7338
No log 8.9189 330 0.5556 0.6214 0.5556 0.7454
No log 8.9730 332 0.5902 0.5741 0.5902 0.7683
No log 9.0270 334 0.6346 0.5528 0.6346 0.7966
No log 9.0811 336 0.6534 0.5140 0.6534 0.8083
No log 9.1351 338 0.6058 0.5308 0.6058 0.7783
No log 9.1892 340 0.6223 0.5223 0.6223 0.7888
No log 9.2432 342 0.6438 0.5195 0.6438 0.8024
No log 9.2973 344 0.5940 0.5442 0.5940 0.7707
No log 9.3514 346 0.5198 0.5831 0.5198 0.7210
No log 9.4054 348 0.5074 0.5555 0.5074 0.7123
No log 9.4595 350 0.5152 0.5479 0.5152 0.7178
No log 9.5135 352 0.5242 0.5479 0.5242 0.7240
No log 9.5676 354 0.5344 0.5107 0.5344 0.7310
No log 9.6216 356 0.5441 0.5323 0.5441 0.7376
No log 9.6757 358 0.5391 0.5039 0.5391 0.7342
No log 9.7297 360 0.5251 0.5201 0.5251 0.7247
No log 9.7838 362 0.5150 0.5039 0.5150 0.7176
No log 9.8378 364 0.5173 0.5625 0.5173 0.7193
No log 9.8919 366 0.5024 0.5107 0.5024 0.7088
No log 9.9459 368 0.5002 0.5340 0.5002 0.7073
No log 10.0 370 0.5079 0.5672 0.5079 0.7127
No log 10.0541 372 0.5112 0.5125 0.5112 0.7150
No log 10.1081 374 0.5364 0.5323 0.5364 0.7324
No log 10.1622 376 0.5665 0.6228 0.5665 0.7527
No log 10.2162 378 0.5614 0.5738 0.5614 0.7493
No log 10.2703 380 0.5483 0.5323 0.5483 0.7405
No log 10.3243 382 0.5436 0.5125 0.5436 0.7373
No log 10.3784 384 0.5466 0.4928 0.5466 0.7393
No log 10.4324 386 0.5432 0.5336 0.5432 0.7370
No log 10.4865 388 0.5377 0.5674 0.5377 0.7333
No log 10.5405 390 0.5424 0.6115 0.5424 0.7365
No log 10.5946 392 0.5965 0.6092 0.5965 0.7724
No log 10.6486 394 0.5648 0.6175 0.5648 0.7515
No log 10.7027 396 0.5095 0.6158 0.5095 0.7138
No log 10.7568 398 0.5389 0.5966 0.5389 0.7341
No log 10.8108 400 0.5864 0.5283 0.5864 0.7658
No log 10.8649 402 0.5692 0.5497 0.5692 0.7545
No log 10.9189 404 0.5183 0.5590 0.5183 0.7200
No log 10.9730 406 0.5524 0.6282 0.5524 0.7432
No log 11.0270 408 0.5814 0.5761 0.5814 0.7625
No log 11.0811 410 0.5431 0.6423 0.5431 0.7370
No log 11.1351 412 0.4962 0.5965 0.4962 0.7044
No log 11.1892 414 0.4922 0.5768 0.4922 0.7016
No log 11.2432 416 0.4974 0.5201 0.4974 0.7053
No log 11.2973 418 0.5158 0.5965 0.5158 0.7182
No log 11.3514 420 0.5538 0.6321 0.5538 0.7441
No log 11.4054 422 0.6230 0.5970 0.6230 0.7893
No log 11.4595 424 0.6472 0.5306 0.6472 0.8045
No log 11.5135 426 0.6338 0.5047 0.6338 0.7961
No log 11.5676 428 0.6104 0.5206 0.6104 0.7813
No log 11.6216 430 0.5726 0.5549 0.5726 0.7567
No log 11.6757 432 0.5715 0.5323 0.5715 0.7560
No log 11.7297 434 0.6234 0.5034 0.6234 0.7895
No log 11.7838 436 0.6873 0.5201 0.6873 0.8290
No log 11.8378 438 0.6938 0.5201 0.6938 0.8330
No log 11.8919 440 0.6876 0.5435 0.6876 0.8292
No log 11.9459 442 0.6267 0.5237 0.6267 0.7917
No log 12.0 444 0.5683 0.5609 0.5683 0.7538
No log 12.0541 446 0.5212 0.5609 0.5212 0.7219
No log 12.1081 448 0.5094 0.5609 0.5094 0.7137
No log 12.1622 450 0.5310 0.5609 0.5310 0.7287
No log 12.2162 452 0.5371 0.5609 0.5371 0.7329
No log 12.2703 454 0.5490 0.5609 0.5490 0.7409
No log 12.3243 456 0.5774 0.5593 0.5774 0.7598
No log 12.3784 458 0.5834 0.5593 0.5834 0.7638
No log 12.4324 460 0.5886 0.5687 0.5886 0.7672
No log 12.4865 462 0.5730 0.5703 0.5730 0.7570
No log 12.5405 464 0.5813 0.5912 0.5813 0.7624
No log 12.5946 466 0.6004 0.5242 0.6004 0.7749
No log 12.6486 468 0.5952 0.5701 0.5952 0.7715
No log 12.7027 470 0.5595 0.5697 0.5595 0.7480
No log 12.7568 472 0.5430 0.5336 0.5430 0.7369
No log 12.8108 474 0.5526 0.5135 0.5526 0.7433
No log 12.8649 476 0.5469 0.5135 0.5469 0.7395
No log 12.9189 478 0.5485 0.6115 0.5485 0.7406
No log 12.9730 480 0.5814 0.6010 0.5814 0.7625
No log 13.0270 482 0.6028 0.5998 0.6028 0.7764
No log 13.0811 484 0.5758 0.6506 0.5758 0.7588
No log 13.1351 486 0.5787 0.6401 0.5787 0.7607
No log 13.1892 488 0.6163 0.6105 0.6163 0.7851
No log 13.2432 490 0.6961 0.5455 0.6961 0.8343
No log 13.2973 492 0.7118 0.5278 0.7118 0.8437
No log 13.3514 494 0.7120 0.5076 0.7120 0.8438
No log 13.4054 496 0.6459 0.5373 0.6459 0.8037
No log 13.4595 498 0.5845 0.5089 0.5845 0.7645
0.3763 13.5135 500 0.5686 0.5396 0.5686 0.7541
0.3763 13.5676 502 0.5755 0.5671 0.5755 0.7586
0.3763 13.6216 504 0.5898 0.5671 0.5898 0.7680
0.3763 13.6757 506 0.5827 0.5671 0.5827 0.7633
0.3763 13.7297 508 0.5659 0.5784 0.5659 0.7523
0.3763 13.7838 510 0.5734 0.4866 0.5734 0.7573
0.3763 13.8378 512 0.5967 0.4866 0.5967 0.7724
0.3763 13.8919 514 0.6137 0.5559 0.6137 0.7834
0.3763 13.9459 516 0.6483 0.5568 0.6483 0.8052
0.3763 14.0 518 0.6503 0.5858 0.6503 0.8064
0.3763 14.0541 520 0.6306 0.5373 0.6306 0.7941
0.3763 14.1081 522 0.5799 0.5460 0.5799 0.7615
0.3763 14.1622 524 0.5690 0.5269 0.5690 0.7543
0.3763 14.2162 526 0.5708 0.5269 0.5708 0.7555
0.3763 14.2703 528 0.5919 0.5974 0.5919 0.7694
0.3763 14.3243 530 0.6202 0.6147 0.6202 0.7875
0.3763 14.3784 532 0.6404 0.5758 0.6404 0.8003
0.3763 14.4324 534 0.6238 0.5617 0.6238 0.7898
0.3763 14.4865 536 0.5761 0.5089 0.5761 0.7590
0.3763 14.5405 538 0.5630 0.4866 0.5630 0.7503
0.3763 14.5946 540 0.5654 0.4866 0.5654 0.7520
0.3763 14.6486 542 0.5846 0.4866 0.5846 0.7646
0.3763 14.7027 544 0.6213 0.4964 0.6213 0.7882

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
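
For completeness, a hedged sketch of loading this checkpoint from the Hub; the repository id is taken from the model name above, and since the head configuration is not documented in this card, the use of AutoModelForSequenceClassification and the score interpretation are assumptions:

```python
# Hypothetical usage sketch; the task head and label setup are not documented in this card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k7_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

inputs = tokenizer("نص مقال تجريبي", return_tensors="pt")  # placeholder Arabic essay text
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.logits)  # interpreted as an organization score (assumption)
```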