ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k6_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5277
  • Qwk: 0.5619
  • Mse: 0.5277
  • Rmse: 0.7264
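The reported metrics are related: the loss is the mean squared error (Loss and Mse are identical), Rmse is its square root (√0.5277 ≈ 0.7264), and Qwk is Cohen's kappa with quadratic weights, the usual agreement metric for ordinal essay scores. A minimal from-scratch sketch of these metrics (the labels below are hypothetical and only illustrate how the metrics relate; this is not the exact evaluation code used for this run):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (QWK)."""
    # Observed confusion matrix of true vs. predicted labels
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    hist_true = [sum(row) for row in O]
    hist_pred = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2   # quadratic disagreement weight
            num += w * O[i][j]                         # observed weighted disagreement
            den += w * hist_true[i] * hist_pred[j] / n # expected under independence
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical ordinal scores on a 0-3 scale, one near-miss prediction
y_true = [0, 1, 2, 2, 3]
y_pred = [0, 1, 1, 2, 3]
print(round(quadratic_weighted_kappa(y_true, y_pred, 4), 4))
print(round(math.sqrt(mse(y_true, y_pred)), 4))  # RMSE is always sqrt(MSE)
```

Note that QWK rewards near-miss predictions: a prediction one class away is penalized far less than one three classes away, which is why it is preferred over plain accuracy for ordinal rubric scores like essay organization.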

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
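With lr_scheduler_type: linear and no warmup, the learning rate decays linearly from 2e-05 at the first step to 0 at the last. A minimal sketch of that schedule (the total step count is inferred from the results table below, where epoch 1.0 corresponds to step 32, so 100 epochs ≈ 3200 steps; this mirrors the behavior of the linear schedule, not the training code itself):

```python
BASE_LR = 2e-5
TOTAL_STEPS = 32 * 100  # 32 steps/epoch (epoch 1.0 = step 32 in the table) x 100 epochs

def linear_lr(step, base_lr=BASE_LR, total_steps=TOTAL_STEPS):
    """Learning rate at a given step under linear decay to zero, no warmup."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)

print(linear_lr(0))     # base rate at the start of training
print(linear_lr(1600))  # half the base rate midway through
print(linear_lr(3200))  # zero at the final step
```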

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0625 2 2.5046 -0.0924 2.5046 1.5826
No log 0.125 4 1.1263 0.0651 1.1263 1.0613
No log 0.1875 6 0.9728 -0.0091 0.9728 0.9863
No log 0.25 8 0.9129 0.2358 0.9129 0.9555
No log 0.3125 10 0.8343 0.1461 0.8343 0.9134
No log 0.375 12 0.7858 0.1620 0.7858 0.8864
No log 0.4375 14 0.7665 0.1998 0.7665 0.8755
No log 0.5 16 0.7031 0.2135 0.7031 0.8385
No log 0.5625 18 0.6796 0.2193 0.6796 0.8244
No log 0.625 20 0.6528 0.2158 0.6528 0.8080
No log 0.6875 22 0.6353 0.2413 0.6353 0.7971
No log 0.75 24 0.6317 0.4044 0.6317 0.7948
No log 0.8125 26 0.6721 0.4841 0.6721 0.8198
No log 0.875 28 0.8685 0.3825 0.8685 0.9319
No log 0.9375 30 0.8321 0.3870 0.8321 0.9122
No log 1.0 32 0.6836 0.3635 0.6836 0.8268
No log 1.0625 34 0.6266 0.2815 0.6266 0.7916
No log 1.125 36 0.7764 0.3894 0.7764 0.8811
No log 1.1875 38 0.8865 0.3869 0.8865 0.9415
No log 1.25 40 0.6823 0.3518 0.6823 0.8260
No log 1.3125 42 0.6258 0.2027 0.6258 0.7911
No log 1.375 44 0.6717 0.1700 0.6717 0.8196
No log 1.4375 46 0.6343 0.3258 0.6343 0.7964
No log 1.5 48 0.6553 0.2950 0.6553 0.8095
No log 1.5625 50 0.6526 0.3050 0.6526 0.8078
No log 1.625 52 0.6674 0.3050 0.6674 0.8169
No log 1.6875 54 0.6600 0.3100 0.6600 0.8124
No log 1.75 56 0.7263 0.4444 0.7263 0.8523
No log 1.8125 58 0.7048 0.3138 0.7048 0.8395
No log 1.875 60 0.8805 0.4133 0.8805 0.9383
No log 1.9375 62 1.0159 0.3574 1.0159 1.0079
No log 2.0 64 0.9379 0.3827 0.9379 0.9685
No log 2.0625 66 0.8093 0.4943 0.8093 0.8996
No log 2.125 68 0.7472 0.3656 0.7472 0.8644
No log 2.1875 70 0.7415 0.3235 0.7415 0.8611
No log 2.25 72 0.6761 0.3724 0.6761 0.8223
No log 2.3125 74 0.6869 0.3609 0.6869 0.8288
No log 2.375 76 0.7564 0.3981 0.7564 0.8697
No log 2.4375 78 0.7865 0.4684 0.7865 0.8869
No log 2.5 80 0.7369 0.4298 0.7369 0.8584
No log 2.5625 82 0.7307 0.3774 0.7307 0.8548
No log 2.625 84 0.7627 0.4068 0.7627 0.8733
No log 2.6875 86 0.7855 0.4271 0.7855 0.8863
No log 2.75 88 0.8274 0.4711 0.8274 0.9096
No log 2.8125 90 0.7926 0.4732 0.7926 0.8903
No log 2.875 92 0.7217 0.4972 0.7217 0.8496
No log 2.9375 94 0.6982 0.4301 0.6982 0.8356
No log 3.0 96 0.6931 0.4227 0.6931 0.8325
No log 3.0625 98 0.7219 0.4315 0.7219 0.8496
No log 3.125 100 0.7828 0.4726 0.7828 0.8848
No log 3.1875 102 0.6770 0.4336 0.6770 0.8228
No log 3.25 104 0.7282 0.3364 0.7282 0.8534
No log 3.3125 106 0.8377 0.3636 0.8377 0.9152
No log 3.375 108 0.7725 0.4030 0.7725 0.8789
No log 3.4375 110 0.7567 0.4030 0.7567 0.8699
No log 3.5 112 0.7748 0.3873 0.7748 0.8802
No log 3.5625 114 0.6919 0.3505 0.6919 0.8318
No log 3.625 116 0.6594 0.4029 0.6594 0.8120
No log 3.6875 118 0.7699 0.4592 0.7699 0.8775
No log 3.75 120 0.7623 0.4512 0.7623 0.8731
No log 3.8125 122 0.6539 0.4513 0.6539 0.8086
No log 3.875 124 0.6318 0.4020 0.6318 0.7948
No log 3.9375 126 0.6260 0.4029 0.6260 0.7912
No log 4.0 128 0.6154 0.4029 0.6154 0.7845
No log 4.0625 130 0.5634 0.5095 0.5634 0.7506
No log 4.125 132 0.6882 0.4189 0.6882 0.8296
No log 4.1875 134 0.7741 0.4333 0.7741 0.8798
No log 4.25 136 0.6106 0.4315 0.6106 0.7814
No log 4.3125 138 0.5586 0.4984 0.5586 0.7474
No log 4.375 140 0.6437 0.3985 0.6437 0.8023
No log 4.4375 142 0.6227 0.3985 0.6227 0.7891
No log 4.5 144 0.6075 0.3942 0.6075 0.7794
No log 4.5625 146 0.5937 0.4397 0.5937 0.7705
No log 4.625 148 0.5805 0.4514 0.5805 0.7619
No log 4.6875 150 0.6078 0.4681 0.6078 0.7796
No log 4.75 152 0.7160 0.4967 0.7160 0.8462
No log 4.8125 154 0.6903 0.5161 0.6903 0.8308
No log 4.875 156 0.6339 0.4450 0.6339 0.7962
No log 4.9375 158 0.7218 0.5340 0.7218 0.8496
No log 5.0 160 0.7100 0.5146 0.7100 0.8426
No log 5.0625 162 0.6051 0.4303 0.6051 0.7779
No log 5.125 164 0.5814 0.4881 0.5814 0.7625
No log 5.1875 166 0.5701 0.5003 0.5701 0.7550
No log 5.25 168 0.6011 0.4282 0.6011 0.7753
No log 5.3125 170 0.6645 0.4631 0.6645 0.8152
No log 5.375 172 0.6701 0.4538 0.6701 0.8186
No log 5.4375 174 0.5900 0.4795 0.5900 0.7681
No log 5.5 176 0.5482 0.4201 0.5482 0.7404
No log 5.5625 178 0.6848 0.5455 0.6848 0.8275
No log 5.625 180 0.7219 0.5155 0.7219 0.8496
No log 5.6875 182 0.6045 0.5426 0.6045 0.7775
No log 5.75 184 0.6027 0.5081 0.6027 0.7763
No log 5.8125 186 0.5698 0.4901 0.5698 0.7548
No log 5.875 188 0.5721 0.4656 0.5721 0.7564
No log 5.9375 190 0.6161 0.4359 0.6161 0.7849
No log 6.0 192 0.6195 0.4309 0.6195 0.7871
No log 6.0625 194 0.5604 0.5286 0.5604 0.7486
No log 6.125 196 0.5869 0.4523 0.5869 0.7661
No log 6.1875 198 0.5908 0.4901 0.5908 0.7686
No log 6.25 200 0.5988 0.5151 0.5988 0.7738
No log 6.3125 202 0.7057 0.4122 0.7057 0.8401
No log 6.375 204 0.6932 0.4122 0.6932 0.8326
No log 6.4375 206 0.6018 0.5442 0.6018 0.7758
No log 6.5 208 0.5804 0.5111 0.5804 0.7618
No log 6.5625 210 0.5689 0.5600 0.5689 0.7543
No log 6.625 212 0.6035 0.5074 0.6035 0.7769
No log 6.6875 214 0.5895 0.5190 0.5895 0.7678
No log 6.75 216 0.5659 0.5221 0.5659 0.7522
No log 6.8125 218 0.5617 0.5190 0.5617 0.7495
No log 6.875 220 0.5678 0.5008 0.5678 0.7535
No log 6.9375 222 0.5576 0.4975 0.5576 0.7467
No log 7.0 224 0.5475 0.5133 0.5475 0.7399
No log 7.0625 226 0.5438 0.5133 0.5438 0.7375
No log 7.125 228 0.5423 0.5398 0.5423 0.7364
No log 7.1875 230 0.5498 0.5241 0.5498 0.7415
No log 7.25 232 0.5616 0.5627 0.5616 0.7494
No log 7.3125 234 0.5504 0.5363 0.5504 0.7419
No log 7.375 236 0.5799 0.5205 0.5799 0.7615
No log 7.4375 238 0.5963 0.5283 0.5963 0.7722
No log 7.5 240 0.5444 0.5781 0.5444 0.7378
No log 7.5625 242 0.5603 0.4932 0.5603 0.7485
No log 7.625 244 0.6460 0.5336 0.6460 0.8037
No log 7.6875 246 0.6333 0.5160 0.6333 0.7958
No log 7.75 248 0.5940 0.4973 0.5940 0.7707
No log 7.8125 250 0.6170 0.5105 0.6170 0.7855
No log 7.875 252 0.6508 0.5416 0.6508 0.8067
No log 7.9375 254 0.6379 0.5163 0.6379 0.7987
No log 8.0 256 0.5623 0.5081 0.5623 0.7499
No log 8.0625 258 0.5135 0.5549 0.5135 0.7166
No log 8.125 260 0.5277 0.5868 0.5277 0.7264
No log 8.1875 262 0.5595 0.5414 0.5595 0.7480
No log 8.25 264 0.5479 0.5483 0.5479 0.7402
No log 8.3125 266 0.5691 0.5348 0.5691 0.7544
No log 8.375 268 0.6242 0.5106 0.6242 0.7900
No log 8.4375 270 0.6356 0.5106 0.6356 0.7972
No log 8.5 272 0.5560 0.4963 0.5560 0.7457
No log 8.5625 274 0.5530 0.5218 0.5530 0.7437
No log 8.625 276 0.6214 0.4877 0.6214 0.7883
No log 8.6875 278 0.5828 0.5308 0.5828 0.7634
No log 8.75 280 0.5648 0.5345 0.5648 0.7515
No log 8.8125 282 0.6540 0.4502 0.6540 0.8087
No log 8.875 284 0.6834 0.4424 0.6834 0.8267
No log 8.9375 286 0.6354 0.4582 0.6354 0.7971
No log 9.0 288 0.6806 0.4123 0.6806 0.8250
No log 9.0625 290 0.6379 0.5067 0.6379 0.7987
No log 9.125 292 0.6370 0.5560 0.6370 0.7981
No log 9.1875 294 0.6163 0.5786 0.6163 0.7851
No log 9.25 296 0.6180 0.5528 0.6180 0.7861
No log 9.3125 298 0.6075 0.5003 0.6075 0.7794
No log 9.375 300 0.6169 0.4981 0.6169 0.7855
No log 9.4375 302 0.6381 0.4527 0.6381 0.7988
No log 9.5 304 0.5988 0.4824 0.5988 0.7738
No log 9.5625 306 0.6517 0.4980 0.6517 0.8073
No log 9.625 308 0.6650 0.4980 0.6650 0.8155
No log 9.6875 310 0.6410 0.5059 0.6410 0.8006
No log 9.75 312 0.5958 0.4422 0.5958 0.7719
No log 9.8125 314 0.6344 0.4091 0.6344 0.7965
No log 9.875 316 0.6417 0.4544 0.6417 0.8011
No log 9.9375 318 0.6180 0.4783 0.6180 0.7861
No log 10.0 320 0.5894 0.4107 0.5894 0.7677
No log 10.0625 322 0.5895 0.5016 0.5895 0.7678
No log 10.125 324 0.6039 0.4929 0.6039 0.7771
No log 10.1875 326 0.5649 0.5195 0.5649 0.7516
No log 10.25 328 0.5611 0.5003 0.5611 0.7491
No log 10.3125 330 0.5674 0.4828 0.5674 0.7532
No log 10.375 332 0.5923 0.5403 0.5923 0.7696
No log 10.4375 334 0.6650 0.5219 0.6650 0.8154
No log 10.5 336 0.6525 0.5219 0.6525 0.8078
No log 10.5625 338 0.5898 0.5056 0.5898 0.7680
No log 10.625 340 0.5386 0.5133 0.5386 0.7339
No log 10.6875 342 0.5179 0.5389 0.5179 0.7196
No log 10.75 344 0.5175 0.5389 0.5175 0.7194
No log 10.8125 346 0.5332 0.5813 0.5332 0.7302
No log 10.875 348 0.5309 0.5784 0.5309 0.7286
No log 10.9375 350 0.5228 0.6867 0.5228 0.7231
No log 11.0 352 0.5478 0.5363 0.5478 0.7401
No log 11.0625 354 0.5350 0.5621 0.5350 0.7315
No log 11.125 356 0.5121 0.5725 0.5121 0.7156
No log 11.1875 358 0.5261 0.5373 0.5261 0.7253
No log 11.25 360 0.5307 0.5450 0.5307 0.7285
No log 11.3125 362 0.5424 0.4975 0.5424 0.7364
No log 11.375 364 0.5594 0.4968 0.5594 0.7479
No log 11.4375 366 0.5630 0.5060 0.5630 0.7503
No log 11.5 368 0.5635 0.4944 0.5635 0.7506
No log 11.5625 370 0.5733 0.5683 0.5733 0.7572
No log 11.625 372 0.5707 0.5893 0.5707 0.7554
No log 11.6875 374 0.5613 0.5710 0.5613 0.7492
No log 11.75 376 0.5399 0.6222 0.5399 0.7348
No log 11.8125 378 0.5327 0.6027 0.5327 0.7299
No log 11.875 380 0.5227 0.5725 0.5227 0.7230
No log 11.9375 382 0.5225 0.5874 0.5225 0.7228
No log 12.0 384 0.5269 0.5272 0.5269 0.7259
No log 12.0625 386 0.5272 0.5875 0.5272 0.7261
No log 12.125 388 0.5485 0.4391 0.5485 0.7406
No log 12.1875 390 0.5387 0.4895 0.5387 0.7339
No log 12.25 392 0.5297 0.5681 0.5297 0.7278
No log 12.3125 394 0.5419 0.5177 0.5419 0.7362
No log 12.375 396 0.5408 0.5177 0.5408 0.7354
No log 12.4375 398 0.5654 0.5158 0.5654 0.7519
No log 12.5 400 0.6339 0.5394 0.6339 0.7962
No log 12.5625 402 0.7550 0.4735 0.7550 0.8689
No log 12.625 404 0.7773 0.4667 0.7773 0.8817
No log 12.6875 406 0.6647 0.5163 0.6647 0.8153
No log 12.75 408 0.5888 0.5149 0.5888 0.7673
No log 12.8125 410 0.5494 0.4876 0.5494 0.7412
No log 12.875 412 0.5626 0.4789 0.5626 0.7500
No log 12.9375 414 0.5520 0.5335 0.5520 0.7430
No log 13.0 416 0.5404 0.5241 0.5404 0.7351
No log 13.0625 418 0.5449 0.4605 0.5449 0.7382
No log 13.125 420 0.5381 0.5421 0.5381 0.7336
No log 13.1875 422 0.5511 0.5127 0.5511 0.7423
No log 13.25 424 0.5441 0.5390 0.5441 0.7377
No log 13.3125 426 0.5559 0.5411 0.5559 0.7456
No log 13.375 428 0.5950 0.4979 0.5950 0.7713
No log 13.4375 430 0.6086 0.5326 0.6086 0.7801
No log 13.5 432 0.5890 0.5553 0.5890 0.7675
No log 13.5625 434 0.5666 0.4887 0.5666 0.7527
No log 13.625 436 0.5730 0.5195 0.5730 0.7570
No log 13.6875 438 0.5740 0.5306 0.5740 0.7576
No log 13.75 440 0.5856 0.5362 0.5856 0.7653
No log 13.8125 442 0.5835 0.5362 0.5835 0.7639
No log 13.875 444 0.5496 0.5586 0.5496 0.7413
No log 13.9375 446 0.5292 0.4919 0.5292 0.7274
No log 14.0 448 0.5263 0.5020 0.5263 0.7255
No log 14.0625 450 0.5558 0.4913 0.5558 0.7455
No log 14.125 452 0.5934 0.4729 0.5934 0.7703
No log 14.1875 454 0.5666 0.5061 0.5666 0.7527
No log 14.25 456 0.5488 0.4883 0.5488 0.7408
No log 14.3125 458 0.5369 0.5320 0.5369 0.7327
No log 14.375 460 0.5434 0.4562 0.5434 0.7371
No log 14.4375 462 0.5490 0.4620 0.5490 0.7410
No log 14.5 464 0.5543 0.4535 0.5543 0.7445
No log 14.5625 466 0.5567 0.4147 0.5567 0.7462
No log 14.625 468 0.5685 0.5184 0.5685 0.7540
No log 14.6875 470 0.5777 0.5254 0.5777 0.7601
No log 14.75 472 0.5710 0.5272 0.5710 0.7556
No log 14.8125 474 0.5685 0.4543 0.5685 0.7540
No log 14.875 476 0.5640 0.4543 0.5640 0.7510
No log 14.9375 478 0.5579 0.4562 0.5579 0.7469
No log 15.0 480 0.5529 0.4562 0.5529 0.7436
No log 15.0625 482 0.5509 0.4222 0.5509 0.7422
No log 15.125 484 0.5855 0.5403 0.5855 0.7652
No log 15.1875 486 0.7194 0.5113 0.7194 0.8482
No log 15.25 488 0.8097 0.5295 0.8097 0.8998
No log 15.3125 490 0.7583 0.5065 0.7583 0.8708
No log 15.375 492 0.6358 0.5345 0.6358 0.7974
No log 15.4375 494 0.5724 0.5349 0.5724 0.7566
No log 15.5 496 0.5593 0.5437 0.5593 0.7478
No log 15.5625 498 0.5554 0.5437 0.5554 0.7453
0.2898 15.625 500 0.5442 0.5923 0.5442 0.7377
0.2898 15.6875 502 0.5304 0.4179 0.5304 0.7283
0.2898 15.75 504 0.5357 0.4614 0.5357 0.7319
0.2898 15.8125 506 0.5381 0.4614 0.5381 0.7336
0.2898 15.875 508 0.5379 0.4614 0.5379 0.7334
0.2898 15.9375 510 0.5277 0.5619 0.5277 0.7264

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k6_task7_organization

Finetuned from aubmindlab/bert-base-arabertv02