ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k12_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5617
  • Qwk (quadratic weighted kappa): 0.3925
  • Mse (mean squared error): 0.5617
  • Rmse (root mean squared error): 0.7495
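For reference, these metrics can be computed from gold and predicted scores as below. This is a minimal pure-Python sketch of the standard definitions, not the exact evaluation code used for this model; the label range (`n_classes`) is an assumption for illustration.

```python
import math

def mse(y_true, y_pred):
    """Mean squared error between gold and predicted scores."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa for integer labels in [0, n_classes)."""
    n = len(y_true)
    # Observed confusion matrix
    obs = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    # Expected matrix under chance agreement, from the marginal histograms
    hist_t = [y_true.count(c) for c in range(n_classes)]
    hist_p = [y_pred.count(c) for c in range(n_classes)]
    exp = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)]
           for i in range(n_classes)]
    # Quadratic disagreement weights: distant labels are penalized more
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    num = sum(w[i][j] * obs[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * exp[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

# Hypothetical gold/predicted organization scores on a 0-2 scale
gold = [0, 1, 2, 2, 1]
pred = [0, 1, 1, 2, 1]
kappa = quadratic_weighted_kappa(gold, pred, 3)
error = mse(gold, pred)
rmse = math.sqrt(error)
```

Note that Loss and Mse coincide here (0.5617), which is consistent with an MSE regression objective, and Rmse = sqrt(0.5617) ≈ 0.7495.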

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
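With lr_scheduler_type: linear and no warmup steps listed, the learning rate decays linearly from 2e-05 toward 0 over the total number of optimizer steps. A minimal sketch of that schedule follows; the step count is inferred from the log below (epoch 1.0 falls at step 62), and the exact Trainer internals may differ slightly.

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Linear warmup (here 0 steps) followed by linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# ~62 optimizer steps per epoch x 100 epochs -> about 6200 total steps
total = 62 * 100
print(linear_lr(0, total))           # start of training: 2e-05
print(linear_lr(total // 2, total))  # halfway: 1e-05
print(linear_lr(total, total))       # end: 0.0
```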

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0323 2 2.5973 -0.0262 2.5973 1.6116
No log 0.0645 4 1.3390 0.0726 1.3390 1.1572
No log 0.0968 6 1.1960 -0.1255 1.1960 1.0936
No log 0.1290 8 0.9288 0.0168 0.9288 0.9637
No log 0.1613 10 0.7624 0.3008 0.7624 0.8731
No log 0.1935 12 0.6502 0.2506 0.6502 0.8064
No log 0.2258 14 0.6415 0.2506 0.6415 0.8010
No log 0.2581 16 0.6657 0.2080 0.6657 0.8159
No log 0.2903 18 0.6947 0.2490 0.6947 0.8335
No log 0.3226 20 0.6256 0.1561 0.6256 0.7910
No log 0.3548 22 0.6301 0.1094 0.6301 0.7938
No log 0.3871 24 0.6785 0.2142 0.6785 0.8237
No log 0.4194 26 0.6729 0.2892 0.6729 0.8203
No log 0.4516 28 0.6593 0.2024 0.6593 0.8120
No log 0.4839 30 0.6674 0.2308 0.6674 0.8170
No log 0.5161 32 0.7159 0.2063 0.7159 0.8461
No log 0.5484 34 0.7204 0.2412 0.7204 0.8487
No log 0.5806 36 0.7101 0.2181 0.7101 0.8427
No log 0.6129 38 0.7217 0.2641 0.7217 0.8495
No log 0.6452 40 0.8925 0.1628 0.8925 0.9447
No log 0.6774 42 0.9431 0.2202 0.9431 0.9711
No log 0.7097 44 0.7637 0.1372 0.7637 0.8739
No log 0.7419 46 0.7443 0.2692 0.7443 0.8627
No log 0.7742 48 1.1542 0.2543 1.1542 1.0743
No log 0.8065 50 1.2314 0.1654 1.2314 1.1097
No log 0.8387 52 1.0676 0.2683 1.0676 1.0333
No log 0.8710 54 0.8377 0.2063 0.8377 0.9153
No log 0.9032 56 0.7436 0.0327 0.7436 0.8623
No log 0.9355 58 0.7566 0.0 0.7566 0.8699
No log 0.9677 60 0.8087 0.1372 0.8087 0.8993
No log 1.0 62 0.8468 0.1739 0.8468 0.9202
No log 1.0323 64 0.8207 0.1786 0.8207 0.9059
No log 1.0645 66 0.7372 0.1327 0.7372 0.8586
No log 1.0968 68 0.7048 0.1407 0.7048 0.8395
No log 1.1290 70 0.7582 0.3372 0.7582 0.8708
No log 1.1613 72 0.9905 0.3231 0.9905 0.9952
No log 1.1935 74 1.1077 0.2600 1.1077 1.0525
No log 1.2258 76 1.0273 0.2806 1.0273 1.0135
No log 1.2581 78 0.8267 0.3494 0.8267 0.9092
No log 1.2903 80 0.6893 0.1598 0.6893 0.8303
No log 1.3226 82 0.6701 0.1903 0.6701 0.8186
No log 1.3548 84 0.6580 0.1942 0.6580 0.8111
No log 1.3871 86 0.6325 0.2243 0.6325 0.7953
No log 1.4194 88 0.6267 0.2913 0.6267 0.7916
No log 1.4516 90 0.6514 0.3425 0.6514 0.8071
No log 1.4839 92 0.6956 0.3590 0.6956 0.8340
No log 1.5161 94 0.7719 0.3384 0.7719 0.8786
No log 1.5484 96 0.8984 0.3731 0.8984 0.9478
No log 1.5806 98 1.1361 0.2518 1.1361 1.0659
No log 1.6129 100 1.0829 0.2665 1.0829 1.0406
No log 1.6452 102 0.7200 0.3261 0.7200 0.8485
No log 1.6774 104 0.6699 0.4126 0.6699 0.8185
No log 1.7097 106 0.8307 0.2704 0.8307 0.9114
No log 1.7419 108 1.1699 0.2214 1.1699 1.0816
No log 1.7742 110 1.0801 0.2474 1.0801 1.0393
No log 1.8065 112 0.7544 0.3556 0.7544 0.8686
No log 1.8387 114 0.6425 0.4157 0.6425 0.8016
No log 1.8710 116 0.9570 0.3305 0.9570 0.9783
No log 1.9032 118 0.9542 0.2999 0.9542 0.9768
No log 1.9355 120 0.8011 0.3538 0.8011 0.8951
No log 1.9677 122 0.6511 0.2652 0.6511 0.8069
No log 2.0 124 0.6266 0.2987 0.6266 0.7916
No log 2.0323 126 0.6301 0.3289 0.6301 0.7938
No log 2.0645 128 0.6382 0.3836 0.6382 0.7989
No log 2.0968 130 0.7564 0.3131 0.7564 0.8697
No log 2.1290 132 0.8039 0.3754 0.8039 0.8966
No log 2.1613 134 0.7122 0.2871 0.7122 0.8439
No log 2.1935 136 0.7241 0.2871 0.7241 0.8510
No log 2.2258 138 0.7073 0.3519 0.7073 0.8410
No log 2.2581 140 0.6731 0.3918 0.6731 0.8204
No log 2.2903 142 0.6004 0.3919 0.6004 0.7749
No log 2.3226 144 0.5854 0.3865 0.5854 0.7651
No log 2.3548 146 0.6330 0.4167 0.6330 0.7956
No log 2.3871 148 0.6169 0.4513 0.6169 0.7854
No log 2.4194 150 0.5444 0.4726 0.5444 0.7379
No log 2.4516 152 0.5332 0.4345 0.5332 0.7302
No log 2.4839 154 0.5156 0.5133 0.5156 0.7180
No log 2.5161 156 0.5310 0.5501 0.5310 0.7287
No log 2.5484 158 0.5731 0.4948 0.5731 0.7570
No log 2.5806 160 0.5350 0.5159 0.5350 0.7314
No log 2.6129 162 0.6012 0.4985 0.6012 0.7754
No log 2.6452 164 0.6942 0.4818 0.6942 0.8332
No log 2.6774 166 0.5966 0.5195 0.5966 0.7724
No log 2.7097 168 0.5574 0.4934 0.5574 0.7466
No log 2.7419 170 0.5679 0.4934 0.5679 0.7536
No log 2.7742 172 0.5867 0.4412 0.5867 0.7660
No log 2.8065 174 0.6948 0.4214 0.6948 0.8335
No log 2.8387 176 0.6472 0.4301 0.6472 0.8045
No log 2.8710 178 0.5681 0.5003 0.5681 0.7537
No log 2.9032 180 0.7063 0.3894 0.7063 0.8404
No log 2.9355 182 0.8182 0.4008 0.8182 0.9046
No log 2.9677 184 0.7271 0.4144 0.7271 0.8527
No log 3.0 186 0.6171 0.3637 0.6171 0.7856
No log 3.0323 188 0.5784 0.3939 0.5784 0.7605
No log 3.0645 190 0.5916 0.3780 0.5916 0.7692
No log 3.0968 192 0.5813 0.3787 0.5813 0.7624
No log 3.1290 194 0.6612 0.4112 0.6612 0.8131
No log 3.1613 196 0.8292 0.4228 0.8292 0.9106
No log 3.1935 198 0.8419 0.4228 0.8419 0.9175
No log 3.2258 200 0.6636 0.4464 0.6636 0.8146
No log 3.2581 202 0.5596 0.5184 0.5596 0.7481
No log 3.2903 204 0.5963 0.4875 0.5963 0.7722
No log 3.3226 206 0.6468 0.4986 0.6468 0.8042
No log 3.3548 208 0.6163 0.4807 0.6163 0.7851
No log 3.3871 210 0.5804 0.4555 0.5804 0.7618
No log 3.4194 212 0.5569 0.5177 0.5569 0.7463
No log 3.4516 214 0.5482 0.4934 0.5482 0.7404
No log 3.4839 216 0.5511 0.4953 0.5511 0.7424
No log 3.5161 218 0.5750 0.4174 0.5750 0.7583
No log 3.5484 220 0.6513 0.4097 0.6513 0.8070
No log 3.5806 222 0.6196 0.3693 0.6196 0.7871
No log 3.6129 224 0.5383 0.4681 0.5383 0.7337
No log 3.6452 226 0.5543 0.4705 0.5543 0.7445
No log 3.6774 228 0.6007 0.4430 0.6007 0.7750
No log 3.7097 230 0.5549 0.4705 0.5549 0.7449
No log 3.7419 232 0.5407 0.3426 0.5407 0.7353
No log 3.7742 234 0.5427 0.3426 0.5427 0.7367
No log 3.8065 236 0.5491 0.3863 0.5491 0.7410
No log 3.8387 238 0.5808 0.4375 0.5808 0.7621
No log 3.8710 240 0.6088 0.4134 0.6088 0.7803
No log 3.9032 242 0.5536 0.3813 0.5536 0.7440
No log 3.9355 244 0.5448 0.3813 0.5448 0.7381
No log 3.9677 246 0.5353 0.4504 0.5353 0.7316
No log 4.0 248 0.5359 0.4504 0.5359 0.7320
No log 4.0323 250 0.5530 0.3813 0.5530 0.7436
No log 4.0645 252 0.5629 0.3813 0.5629 0.7503
No log 4.0968 254 0.5950 0.3919 0.5950 0.7714
No log 4.1290 256 0.6390 0.3444 0.6390 0.7994
No log 4.1613 258 0.6228 0.3518 0.6228 0.7892
No log 4.1935 260 0.6720 0.3972 0.6720 0.8197
No log 4.2258 262 0.7033 0.3770 0.7033 0.8386
No log 4.2581 264 0.6181 0.3814 0.6181 0.7862
No log 4.2903 266 0.6022 0.3865 0.6022 0.7760
No log 4.3226 268 0.6181 0.3814 0.6181 0.7862
No log 4.3548 270 0.6502 0.4247 0.6502 0.8063
No log 4.3871 272 0.6628 0.4036 0.6628 0.8141
No log 4.4194 274 0.6511 0.4036 0.6511 0.8069
No log 4.4516 276 0.5992 0.4044 0.5992 0.7741
No log 4.4839 278 0.5996 0.3577 0.5996 0.7743
No log 4.5161 280 0.6350 0.2947 0.6350 0.7968
No log 4.5484 282 0.6312 0.2947 0.6312 0.7945
No log 4.5806 284 0.6104 0.3170 0.6104 0.7813
No log 4.6129 286 0.5892 0.3170 0.5892 0.7676
No log 4.6452 288 0.5680 0.3551 0.5680 0.7537
No log 4.6774 290 0.5440 0.4124 0.5440 0.7376
No log 4.7097 292 0.5660 0.4769 0.5660 0.7523
No log 4.7419 294 0.5668 0.4769 0.5668 0.7528
No log 4.7742 296 0.5699 0.4769 0.5699 0.7549
No log 4.8065 298 0.5540 0.4769 0.5540 0.7443
No log 4.8387 300 0.4861 0.5056 0.4861 0.6972
No log 4.8710 302 0.4634 0.5413 0.4634 0.6808
No log 4.9032 304 0.4684 0.5571 0.4684 0.6844
No log 4.9355 306 0.4585 0.6170 0.4585 0.6771
No log 4.9677 308 0.4535 0.6267 0.4535 0.6734
No log 5.0 310 0.4600 0.5993 0.4600 0.6782
No log 5.0323 312 0.4833 0.5368 0.4833 0.6952
No log 5.0645 314 0.5475 0.5650 0.5475 0.7399
No log 5.0968 316 0.5362 0.5724 0.5362 0.7323
No log 5.1290 318 0.4730 0.6383 0.4730 0.6877
No log 5.1613 320 0.5257 0.4929 0.5257 0.7250
No log 5.1935 322 0.5619 0.4997 0.5619 0.7496
No log 5.2258 324 0.5106 0.5086 0.5106 0.7146
No log 5.2581 326 0.4856 0.4747 0.4856 0.6969
No log 5.2903 328 0.5537 0.4474 0.5537 0.7441
No log 5.3226 330 0.6045 0.4814 0.6045 0.7775
No log 5.3548 332 0.5549 0.4373 0.5549 0.7449
No log 5.3871 334 0.4703 0.4929 0.4703 0.6858
No log 5.4194 336 0.5594 0.4978 0.5594 0.7479
No log 5.4516 338 0.6697 0.4707 0.6697 0.8184
No log 5.4839 340 0.6332 0.4562 0.6332 0.7957
No log 5.5161 342 0.5007 0.4724 0.5007 0.7076
No log 5.5484 344 0.4817 0.6383 0.4817 0.6940
No log 5.5806 346 0.5927 0.5216 0.5927 0.7698
No log 5.6129 348 0.7063 0.4992 0.7063 0.8404
No log 5.6452 350 0.6355 0.5504 0.6355 0.7972
No log 5.6774 352 0.4805 0.5846 0.4805 0.6932
No log 5.7097 354 0.5034 0.5104 0.5034 0.7095
No log 5.7419 356 0.6088 0.4827 0.6088 0.7802
No log 5.7742 358 0.6280 0.4827 0.6280 0.7924
No log 5.8065 360 0.5781 0.4911 0.5781 0.7604
No log 5.8387 362 0.5528 0.4997 0.5528 0.7435
No log 5.8710 364 0.6070 0.4911 0.6070 0.7791
No log 5.9032 366 0.6565 0.4462 0.6565 0.8102
No log 5.9355 368 0.6950 0.4462 0.6950 0.8337
No log 5.9677 370 0.6439 0.4795 0.6439 0.8024
No log 6.0 372 0.5498 0.4705 0.5498 0.7415
No log 6.0323 374 0.5341 0.3947 0.5341 0.7308
No log 6.0645 376 0.5609 0.3202 0.5609 0.7490
No log 6.0968 378 0.5708 0.3803 0.5708 0.7555
No log 6.1290 380 0.5404 0.3061 0.5404 0.7351
No log 6.1613 382 0.5325 0.4027 0.5325 0.7297
No log 6.1935 384 0.5518 0.4941 0.5518 0.7429
No log 6.2258 386 0.5612 0.4502 0.5612 0.7491
No log 6.2581 388 0.5579 0.4502 0.5579 0.7469
No log 6.2903 390 0.5364 0.4437 0.5364 0.7324
No log 6.3226 392 0.4966 0.5307 0.4966 0.7047
No log 6.3548 394 0.4958 0.4820 0.4958 0.7041
No log 6.3871 396 0.4995 0.4423 0.4995 0.7068
No log 6.4194 398 0.5106 0.4724 0.5106 0.7145
No log 6.4516 400 0.5181 0.4681 0.5181 0.7198
No log 6.4839 402 0.5184 0.4929 0.5184 0.7200
No log 6.5161 404 0.5214 0.3916 0.5214 0.7221
No log 6.5484 406 0.5509 0.3693 0.5509 0.7422
No log 6.5806 408 0.6336 0.4015 0.6336 0.7960
No log 6.6129 410 0.6474 0.4015 0.6474 0.8046
No log 6.6452 412 0.5613 0.4209 0.5613 0.7492
No log 6.6774 414 0.5036 0.4338 0.5036 0.7097
No log 6.7097 416 0.5168 0.5307 0.5168 0.7189
No log 6.7419 418 0.5498 0.4684 0.5498 0.7415
No log 6.7742 420 0.5359 0.4147 0.5359 0.7320
No log 6.8065 422 0.5356 0.3889 0.5356 0.7319
No log 6.8387 424 0.5780 0.3719 0.5780 0.7603
No log 6.8710 426 0.6296 0.4280 0.6296 0.7935
No log 6.9032 428 0.6026 0.4280 0.6026 0.7763
No log 6.9355 430 0.5451 0.4767 0.5451 0.7383
No log 6.9677 432 0.5159 0.4634 0.5159 0.7183
No log 7.0 434 0.5176 0.4504 0.5176 0.7194
No log 7.0323 436 0.5239 0.4504 0.5239 0.7238
No log 7.0645 438 0.5250 0.4504 0.5250 0.7245
No log 7.0968 440 0.5305 0.4361 0.5305 0.7284
No log 7.1290 442 0.5382 0.3558 0.5382 0.7336
No log 7.1613 444 0.5432 0.3273 0.5432 0.7370
No log 7.1935 446 0.5498 0.3243 0.5498 0.7415
No log 7.2258 448 0.5741 0.2979 0.5741 0.7577
No log 7.2581 450 0.5966 0.2776 0.5966 0.7724
No log 7.2903 452 0.5981 0.3112 0.5981 0.7733
No log 7.3226 454 0.5828 0.3082 0.5828 0.7634
No log 7.3548 456 0.5596 0.2947 0.5596 0.7481
No log 7.3871 458 0.5414 0.3608 0.5414 0.7358
No log 7.4194 460 0.5392 0.3608 0.5392 0.7343
No log 7.4516 462 0.5507 0.2947 0.5507 0.7421
No log 7.4839 464 0.5506 0.2947 0.5506 0.7420
No log 7.5161 466 0.5739 0.3936 0.5739 0.7575
No log 7.5484 468 0.6070 0.4021 0.6070 0.7791
No log 7.5806 470 0.6050 0.4021 0.6050 0.7778
No log 7.6129 472 0.5496 0.4059 0.5496 0.7413
No log 7.6452 474 0.5242 0.4161 0.5242 0.7240
No log 7.6774 476 0.5237 0.3889 0.5237 0.7237
No log 7.7097 478 0.5261 0.3889 0.5261 0.7253
No log 7.7419 480 0.5308 0.4068 0.5308 0.7286
No log 7.7742 482 0.5417 0.4036 0.5417 0.7360
No log 7.8065 484 0.5394 0.4345 0.5394 0.7345
No log 7.8387 486 0.5330 0.4703 0.5330 0.7301
No log 7.8710 488 0.5439 0.4161 0.5439 0.7375
No log 7.9032 490 0.5627 0.2947 0.5627 0.7501
No log 7.9355 492 0.5769 0.2641 0.5769 0.7595
No log 7.9677 494 0.5764 0.4264 0.5764 0.7592
No log 8.0 496 0.5791 0.3814 0.5791 0.7610
No log 8.0323 498 0.5773 0.4414 0.5773 0.7598
0.3994 8.0645 500 0.5493 0.4182 0.5493 0.7411
0.3994 8.0968 502 0.5421 0.4136 0.5421 0.7362
0.3994 8.1290 504 0.5713 0.3563 0.5713 0.7558
0.3994 8.1613 506 0.6046 0.3970 0.6046 0.7776
0.3994 8.1935 508 0.5956 0.3970 0.5956 0.7717
0.3994 8.2258 510 0.5617 0.3925 0.5617 0.7495
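Note that the final row above (step 510, Qwk 0.3925) is not the strongest checkpoint in the log: validation Qwk peaks at 0.6383 around epoch 5.1, with a lower validation loss (0.4730) as well. Since the listed training arguments do not mention load_best_model_at_end, picking the best checkpoint from the log history could be sketched as follows (rows abridged from the table above):

```python
# (epoch, validation_loss, qwk) rows copied from the training log (abridged)
history = [
    (4.9677, 0.4535, 0.6267),
    (5.1290, 0.4730, 0.6383),
    (5.5484, 0.4817, 0.6383),
    (8.2258, 0.5617, 0.3925),  # final checkpoint reported at the top of the card
]

# Pick the row with the highest Qwk, breaking ties by lower validation loss.
best = max(history, key=lambda r: (r[2], -r[1]))
print(best)  # (5.1290, 0.4730, 0.6383)
```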

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k12_task7_organization

  • Finetuned from: aubmindlab/bert-base-arabertv02