ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k1_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the fine-tuning dataset is not specified in this card. It achieves the following results on the evaluation set:

  • Loss: 0.5591
  • QWK: 0.6479
  • MSE: 0.5591
  • RMSE: 0.7477

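The reported metrics (quadratic weighted kappa, MSE, RMSE) can be reproduced on any pair of integer score vectors. A minimal pure-Python sketch, using hypothetical labels that are not from this model's evaluation set:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    # Observed confusion matrix of raw counts.
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Expected matrix from the marginal histograms, scaled to n samples.
    hist_t = [y_true.count(c) for c in range(n_classes)]
    hist_p = [y_pred.count(c) for c in range(n_classes)]
    n = len(y_true)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic weight
            num += w * O[i][j]
            den += w * hist_t[i] * hist_p[j] / n
    return 1.0 - num / den

y_true = [0, 1, 2, 2, 3, 1]   # hypothetical gold scores
y_pred = [0, 1, 1, 2, 3, 2]   # hypothetical model predictions

mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
rmse = math.sqrt(mse)
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=4)
```

Note that the card's Loss and MSE are identical (0.5591), consistent with the model being trained with an MSE regression objective.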
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
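The hyperparameters above map directly onto `transformers.TrainingArguments` fields. A sketch collecting them in a plain dict (the `output_dir` name is a placeholder; the Adam betas/epsilon shown are also the library defaults):

```python
# Hyperparameters as reported in the card, keyed by their
# transformers.TrainingArguments names; pass as TrainingArguments(**hparams).
hparams = {
    "output_dir": "arabert-task5-organization",  # hypothetical name
    "learning_rate": 2e-5,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 100,
}
```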

Training results

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.2857 2 3.9029 -0.0153 3.9029 1.9756
No log 0.5714 4 2.4790 0.0541 2.4790 1.5745
No log 0.8571 6 1.2095 0.0760 1.2095 1.0998
No log 1.1429 8 0.9903 0.2897 0.9903 0.9951
No log 1.4286 10 1.0652 0.0888 1.0652 1.0321
No log 1.7143 12 1.1097 0.0888 1.1097 1.0534
No log 2.0 14 1.1143 0.1713 1.1143 1.0556
No log 2.2857 16 1.4437 0.0196 1.4437 1.2016
No log 2.5714 18 1.2948 0.0196 1.2948 1.1379
No log 2.8571 20 1.1823 0.1058 1.1823 1.0873
No log 3.1429 22 0.9909 0.2616 0.9909 0.9955
No log 3.4286 24 0.9075 0.3041 0.9075 0.9526
No log 3.7143 26 0.7882 0.4416 0.7882 0.8878
No log 4.0 28 0.7233 0.5054 0.7233 0.8505
No log 4.2857 30 0.7493 0.4604 0.7493 0.8656
No log 4.5714 32 0.9637 0.4149 0.9637 0.9817
No log 4.8571 34 1.3570 0.2142 1.3570 1.1649
No log 5.1429 36 1.3356 0.2424 1.3356 1.1557
No log 5.4286 38 1.1192 0.3289 1.1192 1.0579
No log 5.7143 40 0.7064 0.5288 0.7064 0.8405
No log 6.0 42 0.8131 0.5006 0.8131 0.9017
No log 6.2857 44 0.7458 0.5243 0.7458 0.8636
No log 6.5714 46 0.7205 0.5434 0.7205 0.8488
No log 6.8571 48 1.2061 0.2727 1.2061 1.0982
No log 7.1429 50 1.3307 0.3050 1.3307 1.1536
No log 7.4286 52 0.9919 0.2618 0.9919 0.9959
No log 7.7143 54 0.6741 0.5396 0.6741 0.8210
No log 8.0 56 0.6682 0.6227 0.6682 0.8174
No log 8.2857 58 0.6560 0.6513 0.6560 0.8099
No log 8.5714 60 0.5671 0.6328 0.5671 0.7530
No log 8.8571 62 0.5550 0.5977 0.5550 0.7450
No log 9.1429 64 0.6480 0.5794 0.6480 0.8050
No log 9.4286 66 0.6524 0.5936 0.6524 0.8077
No log 9.7143 68 0.5621 0.6722 0.5621 0.7498
No log 10.0 70 0.5057 0.6899 0.5057 0.7111
No log 10.2857 72 0.5187 0.7128 0.5187 0.7202
No log 10.5714 74 0.5950 0.6957 0.5950 0.7713
No log 10.8571 76 0.8691 0.5470 0.8691 0.9322
No log 11.1429 78 1.1508 0.4954 1.1508 1.0728
No log 11.4286 80 1.0493 0.4767 1.0493 1.0243
No log 11.7143 82 0.6987 0.6715 0.6987 0.8359
No log 12.0 84 0.6875 0.6791 0.6875 0.8292
No log 12.2857 86 0.7229 0.6394 0.7229 0.8502
No log 12.5714 88 0.6364 0.6426 0.6364 0.7978
No log 12.8571 90 0.5690 0.6006 0.5690 0.7543
No log 13.1429 92 0.6737 0.6421 0.6737 0.8208
No log 13.4286 94 0.7395 0.5891 0.7395 0.8599
No log 13.7143 96 0.6816 0.6949 0.6816 0.8256
No log 14.0 98 0.5847 0.6306 0.5847 0.7646
No log 14.2857 100 0.5886 0.6537 0.5886 0.7672
No log 14.5714 102 0.6045 0.6257 0.6045 0.7775
No log 14.8571 104 0.5691 0.6326 0.5691 0.7544
No log 15.1429 106 0.5734 0.6177 0.5734 0.7572
No log 15.4286 108 0.5709 0.6773 0.5709 0.7556
No log 15.7143 110 0.5465 0.6544 0.5465 0.7392
No log 16.0 112 0.5479 0.7057 0.5479 0.7402
No log 16.2857 114 0.5467 0.6899 0.5467 0.7394
No log 16.5714 116 0.5212 0.7253 0.5212 0.7219
No log 16.8571 118 0.5222 0.6594 0.5222 0.7227
No log 17.1429 120 0.5628 0.6751 0.5628 0.7502
No log 17.4286 122 0.5819 0.6669 0.5819 0.7628
No log 17.7143 124 0.5327 0.6740 0.5327 0.7299
No log 18.0 126 0.5085 0.7036 0.5085 0.7131
No log 18.2857 128 0.5399 0.7187 0.5399 0.7348
No log 18.5714 130 0.5069 0.6845 0.5069 0.7119
No log 18.8571 132 0.5412 0.6302 0.5412 0.7357
No log 19.1429 134 0.5299 0.6606 0.5299 0.7280
No log 19.4286 136 0.4998 0.6896 0.4998 0.7070
No log 19.7143 138 0.5113 0.6903 0.5113 0.7150
No log 20.0 140 0.5230 0.6675 0.5230 0.7232
No log 20.2857 142 0.5062 0.6976 0.5062 0.7115
No log 20.5714 144 0.4973 0.7351 0.4973 0.7052
No log 20.8571 146 0.5255 0.6841 0.5255 0.7249
No log 21.1429 148 0.5456 0.6796 0.5456 0.7386
No log 21.4286 150 0.5440 0.7042 0.5440 0.7375
No log 21.7143 152 0.5137 0.7017 0.5137 0.7167
No log 22.0 154 0.5125 0.6622 0.5125 0.7159
No log 22.2857 156 0.5294 0.6743 0.5294 0.7276
No log 22.5714 158 0.5456 0.6601 0.5456 0.7387
No log 22.8571 160 0.5573 0.6187 0.5573 0.7465
No log 23.1429 162 0.5707 0.6207 0.5707 0.7555
No log 23.4286 164 0.5644 0.6593 0.5644 0.7513
No log 23.7143 166 0.5703 0.6667 0.5703 0.7552
No log 24.0 168 0.6173 0.6298 0.6173 0.7857
No log 24.2857 170 0.6510 0.6298 0.6510 0.8069
No log 24.5714 172 0.6150 0.6508 0.6150 0.7842
No log 24.8571 174 0.5896 0.6493 0.5896 0.7679
No log 25.1429 176 0.5948 0.6583 0.5948 0.7712
No log 25.4286 178 0.5975 0.6501 0.5975 0.7730
No log 25.7143 180 0.6674 0.6470 0.6674 0.8169
No log 26.0 182 0.7250 0.6074 0.7250 0.8515
No log 26.2857 184 0.7195 0.6074 0.7195 0.8482
No log 26.5714 186 0.6046 0.6516 0.6046 0.7776
No log 26.8571 188 0.5599 0.7071 0.5599 0.7483
No log 27.1429 190 0.5511 0.6518 0.5511 0.7424
No log 27.4286 192 0.5463 0.6518 0.5463 0.7391
No log 27.7143 194 0.5441 0.6966 0.5441 0.7377
No log 28.0 196 0.5522 0.7165 0.5522 0.7431
No log 28.2857 198 0.5499 0.7165 0.5499 0.7416
No log 28.5714 200 0.5430 0.6858 0.5430 0.7369
No log 28.8571 202 0.5372 0.7033 0.5372 0.7329
No log 29.1429 204 0.5309 0.6772 0.5309 0.7287
No log 29.4286 206 0.5324 0.6388 0.5324 0.7297
No log 29.7143 208 0.5328 0.6430 0.5328 0.7299
No log 30.0 210 0.5325 0.6796 0.5325 0.7297
No log 30.2857 212 0.5370 0.6452 0.5370 0.7328
No log 30.5714 214 0.5369 0.6858 0.5369 0.7328
No log 30.8571 216 0.5359 0.6466 0.5359 0.7320
No log 31.1429 218 0.5348 0.6332 0.5348 0.7313
No log 31.4286 220 0.5258 0.7009 0.5258 0.7251
No log 31.7143 222 0.5169 0.6464 0.5169 0.7189
No log 32.0 224 0.5286 0.6484 0.5286 0.7271
No log 32.2857 226 0.5441 0.6374 0.5441 0.7376
No log 32.5714 228 0.5690 0.7093 0.5690 0.7543
No log 32.8571 230 0.5530 0.7093 0.5530 0.7436
No log 33.1429 232 0.5289 0.7027 0.5289 0.7273
No log 33.4286 234 0.5140 0.6389 0.5140 0.7170
No log 33.7143 236 0.5194 0.6634 0.5194 0.7207
No log 34.0 238 0.5179 0.6114 0.5179 0.7197
No log 34.2857 240 0.5293 0.7035 0.5293 0.7275
No log 34.5714 242 0.5484 0.6845 0.5484 0.7405
No log 34.8571 244 0.5455 0.6838 0.5455 0.7386
No log 35.1429 246 0.5286 0.7080 0.5286 0.7271
No log 35.4286 248 0.5237 0.6647 0.5237 0.7236
No log 35.7143 250 0.5174 0.7222 0.5174 0.7193
No log 36.0 252 0.5141 0.7222 0.5141 0.7170
No log 36.2857 254 0.5105 0.7402 0.5105 0.7145
No log 36.5714 256 0.5051 0.6398 0.5051 0.7107
No log 36.8571 258 0.5046 0.6473 0.5046 0.7104
No log 37.1429 260 0.5217 0.6594 0.5217 0.7223
No log 37.4286 262 0.5488 0.6695 0.5488 0.7408
No log 37.7143 264 0.5675 0.6610 0.5675 0.7533
No log 38.0 266 0.5989 0.6452 0.5989 0.7739
No log 38.2857 268 0.5871 0.6725 0.5871 0.7662
No log 38.5714 270 0.5583 0.6676 0.5583 0.7472
No log 38.8571 272 0.5510 0.6681 0.5510 0.7423
No log 39.1429 274 0.5663 0.6687 0.5663 0.7525
No log 39.4286 276 0.5555 0.6451 0.5555 0.7453
No log 39.7143 278 0.5460 0.6482 0.5460 0.7389
No log 40.0 280 0.5649 0.6392 0.5649 0.7516
No log 40.2857 282 0.5600 0.6561 0.5600 0.7483
No log 40.5714 284 0.5712 0.6392 0.5712 0.7558
No log 40.8571 286 0.5601 0.6725 0.5601 0.7484
No log 41.1429 288 0.5475 0.6725 0.5475 0.7399
No log 41.4286 290 0.5313 0.6545 0.5313 0.7289
No log 41.7143 292 0.5237 0.6680 0.5237 0.7237
No log 42.0 294 0.5224 0.7141 0.5224 0.7228
No log 42.2857 296 0.5125 0.7034 0.5125 0.7159
No log 42.5714 298 0.5173 0.6858 0.5173 0.7193
No log 42.8571 300 0.5310 0.7041 0.5310 0.7287
No log 43.1429 302 0.5475 0.6716 0.5475 0.7399
No log 43.4286 304 0.5453 0.7001 0.5453 0.7385
No log 43.7143 306 0.5616 0.6644 0.5616 0.7494
No log 44.0 308 0.5511 0.6644 0.5511 0.7424
No log 44.2857 310 0.5296 0.6650 0.5296 0.7278
No log 44.5714 312 0.5447 0.6554 0.5447 0.7380
No log 44.8571 314 0.5772 0.6705 0.5772 0.7597
No log 45.1429 316 0.5719 0.6245 0.5719 0.7563
No log 45.4286 318 0.5389 0.7050 0.5389 0.7341
No log 45.7143 320 0.5304 0.6756 0.5304 0.7283
No log 46.0 322 0.5360 0.6733 0.5360 0.7322
No log 46.2857 324 0.5287 0.6725 0.5287 0.7271
No log 46.5714 326 0.5288 0.6835 0.5288 0.7272
No log 46.8571 328 0.5334 0.7206 0.5334 0.7304
No log 47.1429 330 0.5334 0.7106 0.5334 0.7303
No log 47.4286 332 0.5342 0.7070 0.5342 0.7309
No log 47.7143 334 0.5275 0.7501 0.5275 0.7263
No log 48.0 336 0.5292 0.6733 0.5292 0.7274
No log 48.2857 338 0.5367 0.6667 0.5367 0.7326
No log 48.5714 340 0.5370 0.6966 0.5370 0.7328
No log 48.8571 342 0.5328 0.6850 0.5328 0.7299
No log 49.1429 344 0.5382 0.7019 0.5382 0.7336
No log 49.4286 346 0.5352 0.6901 0.5352 0.7316
No log 49.7143 348 0.5311 0.6820 0.5311 0.7287
No log 50.0 350 0.5342 0.6789 0.5342 0.7309
No log 50.2857 352 0.5349 0.6781 0.5349 0.7314
No log 50.5714 354 0.5359 0.6923 0.5359 0.7321
No log 50.8571 356 0.5363 0.6614 0.5363 0.7323
No log 51.1429 358 0.5398 0.6986 0.5398 0.7347
No log 51.4286 360 0.5384 0.6882 0.5384 0.7338
No log 51.7143 362 0.5369 0.6955 0.5369 0.7327
No log 52.0 364 0.5372 0.6822 0.5372 0.7329
No log 52.2857 366 0.5329 0.6737 0.5329 0.7300
No log 52.5714 368 0.5312 0.7071 0.5312 0.7289
No log 52.8571 370 0.5309 0.7071 0.5309 0.7286
No log 53.1429 372 0.5301 0.6911 0.5301 0.7281
No log 53.4286 374 0.5363 0.6854 0.5363 0.7323
No log 53.7143 376 0.5586 0.6898 0.5586 0.7474
No log 54.0 378 0.5733 0.6322 0.5733 0.7572
No log 54.2857 380 0.5585 0.6536 0.5585 0.7473
No log 54.5714 382 0.5324 0.6858 0.5324 0.7296
No log 54.8571 384 0.5392 0.7179 0.5392 0.7343
No log 55.1429 386 0.5670 0.6743 0.5670 0.7530
No log 55.4286 388 0.5724 0.6512 0.5724 0.7565
No log 55.7143 390 0.5538 0.6845 0.5538 0.7442
No log 56.0 392 0.5311 0.6687 0.5311 0.7287
No log 56.2857 394 0.5285 0.6869 0.5285 0.7269
No log 56.5714 396 0.5322 0.6854 0.5322 0.7295
No log 56.8571 398 0.5273 0.6745 0.5273 0.7262
No log 57.1429 400 0.5207 0.6966 0.5207 0.7216
No log 57.4286 402 0.5276 0.6846 0.5276 0.7264
No log 57.7143 404 0.5526 0.6573 0.5526 0.7434
No log 58.0 406 0.5617 0.6340 0.5617 0.7495
No log 58.2857 408 0.5518 0.6677 0.5518 0.7429
No log 58.5714 410 0.5340 0.6830 0.5340 0.7308
No log 58.8571 412 0.5202 0.6924 0.5202 0.7212
No log 59.1429 414 0.5156 0.6841 0.5156 0.7180
No log 59.4286 416 0.5175 0.6507 0.5175 0.7194
No log 59.7143 418 0.5200 0.6547 0.5200 0.7211
No log 60.0 420 0.5225 0.6742 0.5225 0.7228
No log 60.2857 422 0.5203 0.6742 0.5203 0.7213
No log 60.5714 424 0.5190 0.7025 0.5190 0.7204
No log 60.8571 426 0.5226 0.6970 0.5226 0.7229
No log 61.1429 428 0.5250 0.6880 0.5250 0.7245
No log 61.4286 430 0.5284 0.6880 0.5284 0.7269
No log 61.7143 432 0.5234 0.7049 0.5234 0.7235
No log 62.0 434 0.5181 0.7193 0.5181 0.7198
No log 62.2857 436 0.5160 0.7025 0.5160 0.7183
No log 62.5714 438 0.5147 0.7025 0.5147 0.7174
No log 62.8571 440 0.5162 0.7025 0.5162 0.7184
No log 63.1429 442 0.5168 0.7025 0.5168 0.7189
No log 63.4286 444 0.5141 0.7025 0.5141 0.7170
No log 63.7143 446 0.5163 0.7005 0.5163 0.7186
No log 64.0 448 0.5249 0.6901 0.5249 0.7245
No log 64.2857 450 0.5334 0.6589 0.5334 0.7303
No log 64.5714 452 0.5408 0.6589 0.5408 0.7354
No log 64.8571 454 0.5379 0.6732 0.5379 0.7334
No log 65.1429 456 0.5274 0.6888 0.5274 0.7262
No log 65.4286 458 0.5272 0.6796 0.5272 0.7261
No log 65.7143 460 0.5385 0.6325 0.5385 0.7338
No log 66.0 462 0.5594 0.6581 0.5594 0.7480
No log 66.2857 464 0.5959 0.6590 0.5959 0.7719
No log 66.5714 466 0.6068 0.6590 0.6068 0.7790
No log 66.8571 468 0.5811 0.6479 0.5811 0.7623
No log 67.1429 470 0.5550 0.6581 0.5550 0.7450
No log 67.4286 472 0.5382 0.6581 0.5382 0.7336
No log 67.7143 474 0.5247 0.6745 0.5247 0.7244
No log 68.0 476 0.5214 0.6737 0.5214 0.7221
No log 68.2857 478 0.5225 0.6737 0.5225 0.7228
No log 68.5714 480 0.5242 0.6737 0.5242 0.7240
No log 68.8571 482 0.5285 0.6778 0.5285 0.7270
No log 69.1429 484 0.5312 0.6644 0.5312 0.7288
No log 69.4286 486 0.5378 0.6644 0.5378 0.7334
No log 69.7143 488 0.5430 0.6581 0.5430 0.7369
No log 70.0 490 0.5397 0.6644 0.5397 0.7347
No log 70.2857 492 0.5347 0.6499 0.5347 0.7313
No log 70.5714 494 0.5343 0.6499 0.5343 0.7310
No log 70.8571 496 0.5343 0.6667 0.5343 0.7310
No log 71.1429 498 0.5334 0.6667 0.5334 0.7303
0.2218 71.4286 500 0.5331 0.6667 0.5331 0.7301
0.2218 71.7143 502 0.5342 0.6667 0.5342 0.7309
0.2218 72.0 504 0.5387 0.6499 0.5387 0.7340
0.2218 72.2857 506 0.5468 0.6581 0.5468 0.7395
0.2218 72.5714 508 0.5547 0.6581 0.5547 0.7448
0.2218 72.8571 510 0.5591 0.6479 0.5591 0.7477

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Model size: 0.1B params
  • Tensor type: F32 (stored as safetensors)

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k1_task5_organization

  • Base model: aubmindlab/bert-base-arabertv02