ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k10_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7062
  • Qwk (quadratic weighted kappa): 0.5461
  • Mse (mean squared error): 0.7062
  • Rmse (root mean squared error): 0.8403
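Qwk here is Cohen's kappa with quadratic weights, the standard agreement metric for ordinal scoring tasks such as essay grading. A minimal pure-Python sketch of the metric (the run itself likely used a library implementation such as scikit-learn's cohen_kappa_score with weights="quadratic"; this is an illustrative re-derivation, not the training code):

```python
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred):
    """Cohen's kappa with quadratic weights for integer ratings."""
    lo = min(min(y_true), min(y_pred))
    hi = max(max(y_true), max(y_pred))
    k = hi - lo + 1          # number of rating categories
    n = len(y_true)
    # Observed confusion matrix.
    obs = [[0] * k for _ in range(k)]
    for t, p in zip(y_true, y_pred):
        obs[t - lo][p - lo] += 1
    # Marginal histograms, used to build the chance-agreement matrix.
    ht = Counter(t - lo for t in y_true)
    hp = Counter(p - lo for p in y_pred)
    num = den = 0.0
    for i in range(k):
        for j in range(k):
            w = (i - j) ** 2 / (k - 1) ** 2   # quadratic disagreement weight
            num += w * obs[i][j]
            den += w * ht[i] * hp[j] / n      # expected count under independence
    return 1.0 - num / den
```

Perfect agreement yields 1.0, chance-level agreement 0.0, and systematic disagreement goes negative, so the final Qwk of 0.5461 indicates moderate agreement with the reference scores.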

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
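With lr_scheduler_type: linear and no warmup listed, the learning rate decays linearly from 2e-05 to zero over the full run (520 optimizer steps: 10 epochs at 52 steps per epoch, per the results table). A sketch of that schedule, assuming zero warmup steps since none are reported:

```python
LEARNING_RATE = 2e-05
TOTAL_STEPS = 520  # 10 epochs x 52 optimizer steps/epoch (from the table)

def linear_lr(step, base_lr=LEARNING_RATE, total=TOTAL_STEPS, warmup=0):
    """Linear warmup (zero steps here) followed by linear decay to 0."""
    if step < warmup:
        return base_lr * step / max(1, warmup)
    return base_lr * max(0.0, (total - step) / max(1, total - warmup))

# Learning rate at the start, midpoint, and end of training:
print(linear_lr(0), linear_lr(260), linear_lr(520))
```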

Training results

Evaluation ran every 2 steps, while the training loss was logged only every 500 steps, so the "Training Loss" column reads "No log" before step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0385 2 4.1913 -0.0292 4.1913 2.0473
No log 0.0769 4 2.4680 0.0776 2.4680 1.5710
No log 0.1154 6 1.1726 0.0773 1.1726 1.0828
No log 0.1538 8 1.3667 -0.0462 1.3667 1.1690
No log 0.1923 10 1.7225 -0.0687 1.7225 1.3124
No log 0.2308 12 1.1512 -0.0773 1.1512 1.0730
No log 0.2692 14 0.8162 0.1668 0.8162 0.9034
No log 0.3077 16 0.9294 0.0283 0.9294 0.9641
No log 0.3462 18 1.5199 0.1040 1.5199 1.2328
No log 0.3846 20 1.3749 0.0855 1.3749 1.1726
No log 0.4231 22 0.9032 0.2488 0.9032 0.9504
No log 0.4615 24 0.7607 0.2013 0.7607 0.8722
No log 0.5 26 0.7794 0.1786 0.7794 0.8829
No log 0.5385 28 0.9015 0.0789 0.9015 0.9495
No log 0.5769 30 0.9775 0.0283 0.9775 0.9887
No log 0.6154 32 1.0155 0.0338 1.0155 1.0077
No log 0.6538 34 1.0135 0.0735 1.0135 1.0067
No log 0.6923 36 0.8395 0.1604 0.8395 0.9163
No log 0.7308 38 0.7352 0.2880 0.7352 0.8574
No log 0.7692 40 0.6210 0.4430 0.6210 0.7880
No log 0.8077 42 0.6607 0.4027 0.6607 0.8128
No log 0.8462 44 0.7122 0.2961 0.7122 0.8439
No log 0.8846 46 0.6799 0.3241 0.6799 0.8245
No log 0.9231 48 0.7625 0.2327 0.7625 0.8732
No log 0.9615 50 0.7218 0.2837 0.7218 0.8496
No log 1.0 52 0.5959 0.4400 0.5959 0.7720
No log 1.0385 54 0.5774 0.4866 0.5774 0.7598
No log 1.0769 56 0.5735 0.4749 0.5735 0.7573
No log 1.1154 58 0.6058 0.4849 0.6058 0.7783
No log 1.1538 60 0.7641 0.3197 0.7641 0.8741
No log 1.1923 62 0.8770 0.3397 0.8770 0.9365
No log 1.2308 64 1.0557 0.3686 1.0557 1.0275
No log 1.2692 66 0.9970 0.3549 0.9970 0.9985
No log 1.3077 68 1.0157 0.3501 1.0157 1.0078
No log 1.3462 70 1.0112 0.3637 1.0112 1.0056
No log 1.3846 72 0.7897 0.4017 0.7897 0.8886
No log 1.4231 74 0.6837 0.4084 0.6837 0.8269
No log 1.4615 76 0.8053 0.3325 0.8053 0.8974
No log 1.5 78 0.8755 0.3230 0.8755 0.9357
No log 1.5385 80 0.8635 0.3539 0.8635 0.9293
No log 1.5769 82 0.8393 0.3080 0.8393 0.9162
No log 1.6154 84 0.7375 0.3749 0.7375 0.8588
No log 1.6538 86 0.6987 0.3349 0.6987 0.8359
No log 1.6923 88 0.6545 0.3766 0.6545 0.8090
No log 1.7308 90 0.6128 0.3736 0.6128 0.7828
No log 1.7692 92 0.6079 0.4272 0.6079 0.7797
No log 1.8077 94 0.5463 0.5004 0.5463 0.7391
No log 1.8462 96 0.5140 0.5375 0.5140 0.7170
No log 1.8846 98 0.5189 0.5366 0.5189 0.7203
No log 1.9231 100 0.5714 0.5154 0.5714 0.7559
No log 1.9615 102 0.6927 0.4535 0.6927 0.8323
No log 2.0 104 0.6370 0.4755 0.6370 0.7981
No log 2.0385 106 0.6462 0.4689 0.6462 0.8039
No log 2.0769 108 0.5908 0.5018 0.5908 0.7686
No log 2.1154 110 0.6342 0.4882 0.6342 0.7964
No log 2.1538 112 0.7638 0.4490 0.7638 0.8739
No log 2.1923 114 0.9716 0.3959 0.9716 0.9857
No log 2.2308 116 0.9222 0.3957 0.9222 0.9603
No log 2.2692 118 0.7549 0.4713 0.7549 0.8689
No log 2.3077 120 0.6996 0.5378 0.6996 0.8364
No log 2.3462 122 0.6792 0.5109 0.6792 0.8242
No log 2.3846 124 0.6644 0.4717 0.6644 0.8151
No log 2.4231 126 0.6891 0.4592 0.6891 0.8301
No log 2.4615 128 0.6367 0.5385 0.6367 0.7979
No log 2.5 130 0.6508 0.5287 0.6508 0.8067
No log 2.5385 132 0.6651 0.5415 0.6651 0.8156
No log 2.5769 134 0.6431 0.5287 0.6431 0.8019
No log 2.6154 136 0.6637 0.5450 0.6637 0.8147
No log 2.6538 138 0.6519 0.5580 0.6519 0.8074
No log 2.6923 140 0.6773 0.5240 0.6773 0.8230
No log 2.7308 142 0.6999 0.4954 0.6999 0.8366
No log 2.7692 144 0.7140 0.4863 0.7140 0.8450
No log 2.8077 146 0.7222 0.5478 0.7222 0.8498
No log 2.8462 148 0.7226 0.5253 0.7226 0.8500
No log 2.8846 150 0.7308 0.5139 0.7308 0.8549
No log 2.9231 152 0.7681 0.4423 0.7681 0.8764
No log 2.9615 154 0.8164 0.4286 0.8164 0.9035
No log 3.0 156 0.8138 0.4622 0.8138 0.9021
No log 3.0385 158 0.7890 0.5142 0.7890 0.8883
No log 3.0769 160 0.7855 0.5206 0.7855 0.8863
No log 3.1154 162 0.7793 0.5248 0.7793 0.8828
No log 3.1538 164 0.8855 0.4655 0.8855 0.9410
No log 3.1923 166 0.9053 0.4899 0.9053 0.9514
No log 3.2308 168 0.7773 0.4913 0.7773 0.8817
No log 3.2692 170 0.7000 0.5593 0.7000 0.8366
No log 3.3077 172 0.6777 0.5532 0.6777 0.8232
No log 3.3462 174 0.6725 0.5281 0.6725 0.8200
No log 3.3846 176 0.6930 0.5102 0.6930 0.8325
No log 3.4231 178 0.6887 0.5427 0.6887 0.8299
No log 3.4615 180 0.6940 0.5574 0.6940 0.8330
No log 3.5 182 0.6859 0.5171 0.6859 0.8282
No log 3.5385 184 0.7156 0.4770 0.7156 0.8459
No log 3.5769 186 0.7340 0.4953 0.7340 0.8568
No log 3.6154 188 0.7214 0.4776 0.7214 0.8494
No log 3.6538 190 0.7093 0.5120 0.7093 0.8422
No log 3.6923 192 0.8413 0.4747 0.8413 0.9172
No log 3.7308 194 0.8751 0.4730 0.8751 0.9355
No log 3.7692 196 0.7362 0.5256 0.7362 0.8580
No log 3.8077 198 0.6817 0.4646 0.6817 0.8256
No log 3.8462 200 0.8345 0.4104 0.8345 0.9135
No log 3.8846 202 0.8865 0.3971 0.8865 0.9415
No log 3.9231 204 0.7414 0.4627 0.7414 0.8610
No log 3.9615 206 0.7085 0.5006 0.7085 0.8417
No log 4.0 208 0.7832 0.5125 0.7832 0.8850
No log 4.0385 210 0.7324 0.5114 0.7324 0.8558
No log 4.0769 212 0.6537 0.4773 0.6537 0.8085
No log 4.1154 214 0.6688 0.5291 0.6688 0.8178
No log 4.1538 216 0.7233 0.4370 0.7233 0.8505
No log 4.1923 218 0.6773 0.5309 0.6773 0.8230
No log 4.2308 220 0.6591 0.4944 0.6591 0.8119
No log 4.2692 222 0.7168 0.5003 0.7168 0.8466
No log 4.3077 224 0.6923 0.5030 0.6923 0.8321
No log 4.3462 226 0.6623 0.4855 0.6623 0.8138
No log 4.3846 228 0.7078 0.4544 0.7078 0.8413
No log 4.4231 230 0.6953 0.4847 0.6953 0.8338
No log 4.4615 232 0.6615 0.5274 0.6615 0.8133
No log 4.5 234 0.7035 0.5256 0.7035 0.8388
No log 4.5385 236 0.6963 0.5345 0.6963 0.8344
No log 4.5769 238 0.6632 0.5245 0.6632 0.8144
No log 4.6154 240 0.6508 0.4985 0.6508 0.8067
No log 4.6538 242 0.6821 0.4835 0.6821 0.8259
No log 4.6923 244 0.7347 0.4742 0.7347 0.8571
No log 4.7308 246 0.7147 0.5049 0.7147 0.8454
No log 4.7692 248 0.7030 0.5416 0.7030 0.8384
No log 4.8077 250 0.7281 0.5289 0.7281 0.8533
No log 4.8462 252 0.7125 0.5258 0.7125 0.8441
No log 4.8846 254 0.7093 0.5143 0.7093 0.8422
No log 4.9231 256 0.7110 0.5200 0.7110 0.8432
No log 4.9615 258 0.7115 0.5200 0.7115 0.8435
No log 5.0 260 0.7227 0.5309 0.7227 0.8501
No log 5.0385 262 0.7350 0.5346 0.7350 0.8573
No log 5.0769 264 0.7167 0.5227 0.7167 0.8466
No log 5.1154 266 0.7289 0.5294 0.7289 0.8537
No log 5.1538 268 0.7408 0.5324 0.7408 0.8607
No log 5.1923 270 0.7140 0.5290 0.7140 0.8450
No log 5.2308 272 0.6816 0.5255 0.6816 0.8256
No log 5.2692 274 0.6919 0.5336 0.6919 0.8318
No log 5.3077 276 0.7054 0.5222 0.7054 0.8399
No log 5.3462 278 0.6825 0.5264 0.6825 0.8261
No log 5.3846 280 0.6595 0.5419 0.6595 0.8121
No log 5.4231 282 0.6591 0.5047 0.6591 0.8119
No log 5.4615 284 0.6522 0.5595 0.6522 0.8076
No log 5.5 286 0.6663 0.5768 0.6663 0.8163
No log 5.5385 288 0.6477 0.5709 0.6477 0.8048
No log 5.5769 290 0.6312 0.5105 0.6312 0.7945
No log 5.6154 292 0.6464 0.5301 0.6464 0.8040
No log 5.6538 294 0.6426 0.5360 0.6426 0.8016
No log 5.6923 296 0.6474 0.5177 0.6474 0.8046
No log 5.7308 298 0.6312 0.5098 0.6312 0.7945
No log 5.7692 300 0.6243 0.5294 0.6243 0.7901
No log 5.8077 302 0.6397 0.5294 0.6397 0.7998
No log 5.8462 304 0.6454 0.5277 0.6454 0.8034
No log 5.8846 306 0.6458 0.5294 0.6458 0.8036
No log 5.9231 308 0.6653 0.5118 0.6653 0.8156
No log 5.9615 310 0.7119 0.4636 0.7119 0.8438
No log 6.0 312 0.7056 0.4680 0.7056 0.8400
No log 6.0385 314 0.6584 0.5326 0.6584 0.8114
No log 6.0769 316 0.6370 0.5265 0.6370 0.7981
No log 6.1154 318 0.6446 0.5189 0.6446 0.8028
No log 6.1538 320 0.6555 0.5341 0.6555 0.8096
No log 6.1923 322 0.6718 0.5174 0.6718 0.8197
No log 6.2308 324 0.6687 0.5386 0.6687 0.8177
No log 6.2692 326 0.6618 0.4895 0.6618 0.8135
No log 6.3077 328 0.6547 0.5174 0.6547 0.8092
No log 6.3462 330 0.6521 0.5174 0.6521 0.8076
No log 6.3846 332 0.6604 0.5174 0.6604 0.8126
No log 6.4231 334 0.6680 0.5461 0.6680 0.8173
No log 6.4615 336 0.6884 0.5309 0.6884 0.8297
No log 6.5 338 0.6993 0.5368 0.6993 0.8362
No log 6.5385 340 0.6853 0.5400 0.6853 0.8278
No log 6.5769 342 0.6849 0.5655 0.6849 0.8276
No log 6.6154 344 0.6905 0.5654 0.6905 0.8310
No log 6.6538 346 0.6951 0.5279 0.6951 0.8337
No log 6.6923 348 0.7180 0.5351 0.7180 0.8473
No log 6.7308 350 0.7504 0.5649 0.7504 0.8663
No log 6.7692 352 0.7700 0.5501 0.7700 0.8775
No log 6.8077 354 0.7484 0.5456 0.7484 0.8651
No log 6.8462 356 0.7287 0.5339 0.7287 0.8536
No log 6.8846 358 0.6941 0.5519 0.6941 0.8331
No log 6.9231 360 0.6802 0.5595 0.6802 0.8248
No log 6.9615 362 0.6762 0.5477 0.6762 0.8223
No log 7.0 364 0.6725 0.5753 0.6725 0.8200
No log 7.0385 366 0.6784 0.5797 0.6784 0.8237
No log 7.0769 368 0.6738 0.5592 0.6738 0.8209
No log 7.1154 370 0.6723 0.5592 0.6723 0.8200
No log 7.1538 372 0.6760 0.5313 0.6760 0.8222
No log 7.1923 374 0.6873 0.5297 0.6873 0.8290
No log 7.2308 376 0.6838 0.5297 0.6838 0.8270
No log 7.2692 378 0.6721 0.5239 0.6721 0.8198
No log 7.3077 380 0.6696 0.5309 0.6696 0.8183
No log 7.3462 382 0.6610 0.5313 0.6610 0.8130
No log 7.3846 384 0.6710 0.5313 0.6710 0.8191
No log 7.4231 386 0.6893 0.5288 0.6893 0.8303
No log 7.4615 388 0.6837 0.5292 0.6837 0.8268
No log 7.5 390 0.6706 0.5403 0.6706 0.8189
No log 7.5385 392 0.6648 0.5639 0.6648 0.8153
No log 7.5769 394 0.6606 0.5461 0.6606 0.8128
No log 7.6154 396 0.6583 0.5520 0.6583 0.8114
No log 7.6538 398 0.6597 0.5755 0.6597 0.8122
No log 7.6923 400 0.6574 0.5639 0.6574 0.8108
No log 7.7308 402 0.6523 0.5713 0.6523 0.8077
No log 7.7692 404 0.6498 0.5713 0.6498 0.8061
No log 7.8077 406 0.6491 0.5400 0.6491 0.8056
No log 7.8462 408 0.6547 0.5400 0.6547 0.8091
No log 7.8846 410 0.6677 0.5713 0.6677 0.8171
No log 7.9231 412 0.6826 0.5713 0.6826 0.8262
No log 7.9615 414 0.6937 0.5490 0.6937 0.8329
No log 8.0 416 0.7008 0.5475 0.7008 0.8371
No log 8.0385 418 0.7020 0.5371 0.7020 0.8379
No log 8.0769 420 0.7033 0.5227 0.7033 0.8386
No log 8.1154 422 0.6938 0.5400 0.6938 0.8329
No log 8.1538 424 0.6782 0.5295 0.6782 0.8235
No log 8.1923 426 0.6708 0.5277 0.6708 0.8190
No log 8.2308 428 0.6684 0.5157 0.6684 0.8175
No log 8.2692 430 0.6622 0.5157 0.6622 0.8137
No log 8.3077 432 0.6576 0.5277 0.6576 0.8109
No log 8.3462 434 0.6549 0.5397 0.6549 0.8093
No log 8.3846 436 0.6489 0.5397 0.6489 0.8056
No log 8.4231 438 0.6465 0.5397 0.6465 0.8040
No log 8.4615 440 0.6438 0.5200 0.6438 0.8024
No log 8.5 442 0.6449 0.5200 0.6449 0.8031
No log 8.5385 444 0.6451 0.5200 0.6451 0.8032
No log 8.5769 446 0.6452 0.5519 0.6452 0.8033
No log 8.6154 448 0.6473 0.5639 0.6473 0.8046
No log 8.6538 450 0.6598 0.5317 0.6598 0.8123
No log 8.6923 452 0.6690 0.5240 0.6690 0.8179
No log 8.7308 454 0.6672 0.5317 0.6672 0.8169
No log 8.7692 456 0.6670 0.5317 0.6670 0.8167
No log 8.8077 458 0.6621 0.5400 0.6621 0.8137
No log 8.8462 460 0.6564 0.5341 0.6564 0.8102
No log 8.8846 462 0.6570 0.5416 0.6570 0.8105
No log 8.9231 464 0.6574 0.5416 0.6574 0.8108
No log 8.9615 466 0.6631 0.5341 0.6631 0.8143
No log 9.0 468 0.6645 0.5341 0.6645 0.8152
No log 9.0385 470 0.6683 0.5341 0.6683 0.8175
No log 9.0769 472 0.6737 0.5341 0.6737 0.8208
No log 9.1154 474 0.6811 0.5341 0.6811 0.8253
No log 9.1538 476 0.6912 0.5445 0.6912 0.8314
No log 9.1923 478 0.6945 0.5445 0.6945 0.8334
No log 9.2308 480 0.6961 0.5443 0.6961 0.8343
No log 9.2692 482 0.6978 0.5443 0.6978 0.8353
No log 9.3077 484 0.6954 0.5461 0.6954 0.8339
No log 9.3462 486 0.6918 0.5341 0.6918 0.8317
No log 9.3846 488 0.6909 0.5416 0.6909 0.8312
No log 9.4231 490 0.6941 0.5416 0.6941 0.8331
No log 9.4615 492 0.6993 0.5416 0.6993 0.8362
No log 9.5 494 0.7009 0.5416 0.7009 0.8372
No log 9.5385 496 0.7023 0.5461 0.7023 0.8381
No log 9.5769 498 0.7048 0.5461 0.7048 0.8395
0.4043 9.6154 500 0.7040 0.5461 0.7040 0.8390
0.4043 9.6538 502 0.7069 0.5461 0.7069 0.8408
0.4043 9.6923 504 0.7099 0.5461 0.7099 0.8425
0.4043 9.7308 506 0.7094 0.5387 0.7094 0.8422
0.4043 9.7692 508 0.7097 0.5387 0.7097 0.8424
0.4043 9.8077 510 0.7089 0.5387 0.7089 0.8420
0.4043 9.8462 512 0.7083 0.5387 0.7083 0.8416
0.4043 9.8846 514 0.7075 0.5387 0.7075 0.8411
0.4043 9.9231 516 0.7070 0.5387 0.7070 0.8408
0.4043 9.9615 518 0.7065 0.5461 0.7065 0.8405
0.4043 10.0 520 0.7062 0.5461 0.7062 0.8403
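Throughout the table the validation loss equals the Mse column, which suggests a mean-squared-error regression objective, and the Rmse column is simply its square root (up to rounding of the displayed 4-decimal values):

```python
import math

mse, rmse = 0.7062, 0.8403  # final-epoch values from the last row above
assert math.isclose(math.sqrt(mse), rmse, abs_tol=1e-4)
```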

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
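To reproduce this environment, the pinned versions above can be installed roughly as follows (assuming pip and a CUDA 11.8 build of PyTorch, matching the 2.4.0+cu118 tag):

```shell
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
pip install "torch==2.4.0" --index-url https://download.pytorch.org/whl/cu118
```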
Model size: 0.1B params (F32, Safetensors)
