ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k6_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a metric-computation sketch follows the list):

  • Loss: 0.5122
  • Qwk: 0.4591
  • Mse: 0.5122
  • Rmse: 0.7157
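
Qwk is Quadratic Weighted Kappa and Rmse is the square root of the reported Mse. The exact metric code used for this run is not documented in the card; the snippet below is a minimal sketch, assuming scikit-learn and integer-valued ordinal scores.

```python
# Minimal sketch of computing the reported metrics (assumes scikit-learn and
# integer ordinal labels; the card does not document the actual metric code).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold scores and (rounded) model predictions.
y_true = np.array([0, 1, 2, 3, 4, 2, 1])
y_pred = np.array([0, 1, 2, 2, 4, 3, 1])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Quadratic Weighted Kappa
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```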

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
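
A minimal sketch of how these settings map onto transformers.TrainingArguments is shown below; the dataset, preprocessing, and task head are not described in this card, so those parts are left as labeled assumptions.

```python
# Sketch only: reproduces the listed hyperparameters with the Hugging Face
# Trainer. Dataset loading, preprocessing, and the number of labels are
# assumptions, since the card does not document them.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base_model = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(
    base_model,
    num_labels=1,  # assumes a regression head (Loss == MSE in this card); not confirmed
)

training_args = TrainingArguments(
    output_dir="./outputs",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",
    eval_steps=2,  # the results table reports validation metrics every 2 steps
)

# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_ds, eval_dataset=eval_ds)  # datasets not described in the card
# trainer.train()
```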

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0909 2 2.5566 -0.0449 2.5566 1.5989
No log 0.1818 4 1.1821 0.1256 1.1821 1.0873
No log 0.2727 6 0.8398 0.0535 0.8398 0.9164
No log 0.3636 8 1.0951 -0.0816 1.0951 1.0465
No log 0.4545 10 1.0413 0.0265 1.0413 1.0204
No log 0.5455 12 0.8719 0.1459 0.8719 0.9337
No log 0.6364 14 0.7654 0.0 0.7654 0.8749
No log 0.7273 16 0.8511 0.0481 0.8511 0.9226
No log 0.8182 18 1.0056 0.0952 1.0056 1.0028
No log 0.9091 20 0.9189 0.0522 0.9189 0.9586
No log 1.0 22 0.8089 0.0481 0.8089 0.8994
No log 1.0909 24 0.7774 0.0481 0.7774 0.8817
No log 1.1818 26 0.7989 0.1365 0.7989 0.8938
No log 1.2727 28 0.7551 0.0 0.7551 0.8690
No log 1.3636 30 0.7371 0.0 0.7371 0.8586
No log 1.4545 32 0.7645 0.0481 0.7645 0.8744
No log 1.5455 34 0.8305 0.0937 0.8305 0.9113
No log 1.6364 36 1.0003 0.1334 1.0003 1.0002
No log 1.7273 38 1.1492 0.1284 1.1492 1.0720
No log 1.8182 40 1.1545 0.1573 1.1545 1.0745
No log 1.9091 42 1.0584 0.1609 1.0584 1.0288
No log 2.0 44 0.9449 0.2308 0.9449 0.9720
No log 2.0909 46 0.8966 0.2046 0.8966 0.9469
No log 2.1818 48 0.7541 0.0937 0.7541 0.8684
No log 2.2727 50 0.7117 0.1674 0.7117 0.8436
No log 2.3636 52 0.7365 0.1770 0.7365 0.8582
No log 2.4545 54 0.7808 0.2066 0.7808 0.8836
No log 2.5455 56 0.9224 0.1910 0.9224 0.9604
No log 2.6364 58 1.0274 0.1856 1.0274 1.0136
No log 2.7273 60 0.9538 0.2183 0.9538 0.9766
No log 2.8182 62 0.7277 0.3271 0.7277 0.8530
No log 2.9091 64 0.6610 0.2621 0.6610 0.8130
No log 3.0 66 0.6696 0.2319 0.6696 0.8183
No log 3.0909 68 0.6978 0.2299 0.6978 0.8353
No log 3.1818 70 0.9242 0.2559 0.9242 0.9614
No log 3.2727 72 1.0056 0.2113 1.0056 1.0028
No log 3.3636 74 0.8768 0.2977 0.8768 0.9364
No log 3.4545 76 0.7001 0.0460 0.7001 0.8367
No log 3.5455 78 0.6534 0.3465 0.6534 0.8083
No log 3.6364 80 0.6655 0.3465 0.6655 0.8158
No log 3.7273 82 0.6409 0.2857 0.6409 0.8006
No log 3.8182 84 0.6422 0.3661 0.6422 0.8014
No log 3.9091 86 0.7446 0.2435 0.7447 0.8629
No log 4.0 88 0.9938 0.2820 0.9938 0.9969
No log 4.0909 90 1.0561 0.2230 1.0561 1.0277
No log 4.1818 92 1.1015 0.2230 1.1015 1.0495
No log 4.2727 94 1.0220 0.2496 1.0220 1.0109
No log 4.3636 96 0.8396 0.2995 0.8396 0.9163
No log 4.4545 98 0.6033 0.3273 0.6033 0.7767
No log 4.5455 100 0.5839 0.3976 0.5839 0.7642
No log 4.6364 102 0.7260 0.3562 0.7260 0.8521
No log 4.7273 104 0.7506 0.3562 0.7506 0.8664
No log 4.8182 106 0.6946 0.3963 0.6946 0.8334
No log 4.9091 108 0.5684 0.4502 0.5684 0.7539
No log 5.0 110 0.5721 0.4763 0.5721 0.7563
No log 5.0909 112 0.6124 0.5184 0.6124 0.7826
No log 5.1818 114 0.6763 0.4272 0.6763 0.8223
No log 5.2727 116 0.7458 0.3236 0.7458 0.8636
No log 5.3636 118 0.6254 0.5210 0.6254 0.7909
No log 5.4545 120 0.6379 0.4609 0.6379 0.7987
No log 5.5455 122 0.6090 0.5543 0.6090 0.7804
No log 5.6364 124 0.6510 0.3546 0.6510 0.8068
No log 5.7273 126 0.6915 0.3570 0.6915 0.8316
No log 5.8182 128 0.6137 0.3953 0.6137 0.7834
No log 5.9091 130 0.5799 0.3890 0.5799 0.7615
No log 6.0 132 0.5992 0.5101 0.5992 0.7741
No log 6.0909 134 0.6057 0.4759 0.6057 0.7782
No log 6.1818 136 0.5591 0.4618 0.5591 0.7477
No log 6.2727 138 0.5789 0.3183 0.5789 0.7608
No log 6.3636 140 0.5919 0.3213 0.5919 0.7694
No log 6.4545 142 0.5771 0.4086 0.5771 0.7596
No log 6.5455 144 0.5917 0.4941 0.5917 0.7693
No log 6.6364 146 0.6167 0.3665 0.6167 0.7853
No log 6.7273 148 0.7548 0.4152 0.7548 0.8688
No log 6.8182 150 0.7678 0.4057 0.7678 0.8763
No log 6.9091 152 0.6602 0.4107 0.6602 0.8126
No log 7.0 154 0.7976 0.3346 0.7976 0.8931
No log 7.0909 156 0.7931 0.3579 0.7931 0.8905
No log 7.1818 158 0.6821 0.3808 0.6821 0.8259
No log 7.2727 160 0.6011 0.4343 0.6011 0.7753
No log 7.3636 162 0.7135 0.4369 0.7135 0.8447
No log 7.4545 164 0.7603 0.4295 0.7603 0.8720
No log 7.5455 166 0.7058 0.4522 0.7058 0.8401
No log 7.6364 168 0.6478 0.4925 0.6478 0.8049
No log 7.7273 170 0.6188 0.4774 0.6188 0.7867
No log 7.8182 172 0.6052 0.4747 0.6052 0.7779
No log 7.9091 174 0.6298 0.3345 0.6298 0.7936
No log 8.0 176 0.6788 0.4594 0.6788 0.8239
No log 8.0909 178 0.7820 0.3563 0.7820 0.8843
No log 8.1818 180 0.6423 0.4778 0.6423 0.8014
No log 8.2727 182 0.5429 0.5768 0.5429 0.7368
No log 8.3636 184 0.5737 0.4883 0.5737 0.7574
No log 8.4545 186 0.6322 0.4522 0.6322 0.7951
No log 8.5455 188 0.5811 0.3841 0.5811 0.7623
No log 8.6364 190 0.5602 0.4569 0.5602 0.7485
No log 8.7273 192 0.5816 0.4124 0.5816 0.7626
No log 8.8182 194 0.6776 0.3372 0.6776 0.8231
No log 8.9091 196 0.7568 0.3940 0.7568 0.8700
No log 9.0 198 0.7706 0.3520 0.7706 0.8778
No log 9.0909 200 0.6708 0.3867 0.6708 0.8190
No log 9.1818 202 0.6325 0.3809 0.6325 0.7953
No log 9.2727 204 0.6098 0.3183 0.6098 0.7809
No log 9.3636 206 0.5779 0.4136 0.5779 0.7602
No log 9.4545 208 0.5493 0.4027 0.5493 0.7412
No log 9.5455 210 0.5453 0.4634 0.5453 0.7385
No log 9.6364 212 0.5393 0.3995 0.5393 0.7344
No log 9.7273 214 0.5684 0.4182 0.5684 0.7540
No log 9.8182 216 0.5817 0.4502 0.5817 0.7627
No log 9.9091 218 0.5940 0.3471 0.5940 0.7707
No log 10.0 220 0.5604 0.4314 0.5604 0.7486
No log 10.0909 222 0.5673 0.3454 0.5673 0.7532
No log 10.1818 224 0.5687 0.4206 0.5687 0.7541
No log 10.2727 226 0.6155 0.4522 0.6155 0.7845
No log 10.3636 228 0.5758 0.4726 0.5758 0.7588
No log 10.4545 230 0.5819 0.4127 0.5819 0.7628
No log 10.5455 232 0.6110 0.4595 0.6110 0.7816
No log 10.6364 234 0.5891 0.5110 0.5891 0.7675
No log 10.7273 236 0.5726 0.5252 0.5726 0.7567
No log 10.8182 238 0.5810 0.5184 0.5810 0.7622
No log 10.9091 240 0.6389 0.4283 0.6389 0.7993
No log 11.0 242 0.7071 0.4261 0.7071 0.8409
No log 11.0909 244 0.6460 0.4091 0.6460 0.8037
No log 11.1818 246 0.6002 0.4719 0.6002 0.7747
No log 11.2727 248 0.6942 0.4444 0.6942 0.8332
No log 11.3636 250 0.7640 0.4385 0.7640 0.8741
No log 11.4545 252 0.6867 0.4444 0.6867 0.8287
No log 11.5455 254 0.6166 0.4186 0.6166 0.7852
No log 11.6364 256 0.6147 0.3972 0.6147 0.7840
No log 11.7273 258 0.5945 0.4386 0.5945 0.7711
No log 11.8182 260 0.6034 0.4020 0.6034 0.7768
No log 11.9091 262 0.6129 0.3060 0.6129 0.7829
No log 12.0 264 0.6002 0.2973 0.6002 0.7747
No log 12.0909 266 0.6104 0.3806 0.6104 0.7813
No log 12.1818 268 0.5987 0.4590 0.5987 0.7738
No log 12.2727 270 0.6015 0.5032 0.6015 0.7755
No log 12.3636 272 0.7017 0.5155 0.7017 0.8377
No log 12.4545 274 0.6813 0.5155 0.6813 0.8254
No log 12.5455 276 0.5697 0.5845 0.5697 0.7548
No log 12.6364 278 0.5398 0.5826 0.5398 0.7347
No log 12.7273 280 0.5277 0.6333 0.5277 0.7264
No log 12.8182 282 0.5216 0.6247 0.5216 0.7222
No log 12.9091 284 0.5085 0.6443 0.5085 0.7131
No log 13.0 286 0.5182 0.5923 0.5182 0.7198
No log 13.0909 288 0.5159 0.5941 0.5159 0.7183
No log 13.1818 290 0.5754 0.4711 0.5754 0.7586
No log 13.2727 292 0.5771 0.4497 0.5771 0.7597
No log 13.3636 294 0.5771 0.5320 0.5771 0.7597
No log 13.4545 296 0.5871 0.4789 0.5871 0.7662
No log 13.5455 298 0.6122 0.4169 0.6122 0.7824
No log 13.6364 300 0.6078 0.3673 0.6078 0.7796
No log 13.7273 302 0.5634 0.4444 0.5634 0.7506
No log 13.8182 304 0.5373 0.5559 0.5373 0.7330
No log 13.9091 306 0.5464 0.5957 0.5464 0.7392
No log 14.0 308 0.5295 0.6118 0.5295 0.7277
No log 14.0909 310 0.5139 0.6104 0.5139 0.7169
No log 14.1818 312 0.5218 0.5991 0.5218 0.7224
No log 14.2727 314 0.5119 0.6201 0.5119 0.7155
No log 14.3636 316 0.5066 0.5593 0.5066 0.7117
No log 14.4545 318 0.5168 0.4526 0.5168 0.7189
No log 14.5455 320 0.5356 0.4060 0.5356 0.7318
No log 14.6364 322 0.5586 0.4182 0.5586 0.7474
No log 14.7273 324 0.5765 0.4684 0.5765 0.7593
No log 14.8182 326 0.5739 0.4473 0.5739 0.7576
No log 14.9091 328 0.5794 0.3426 0.5794 0.7612
No log 15.0 330 0.5879 0.3523 0.5879 0.7668
No log 15.0909 332 0.6100 0.3919 0.6100 0.7810
No log 15.1818 334 0.6746 0.4628 0.6746 0.8213
No log 15.2727 336 0.6679 0.4495 0.6679 0.8172
No log 15.3636 338 0.6495 0.4281 0.6495 0.8059
No log 15.4545 340 0.6609 0.3972 0.6609 0.8129
No log 15.5455 342 0.7295 0.3305 0.7295 0.8541
No log 15.6364 344 0.6950 0.3247 0.6950 0.8337
No log 15.7273 346 0.6487 0.4285 0.6487 0.8054
No log 15.8182 348 0.6316 0.4196 0.6316 0.7947
No log 15.9091 350 0.6039 0.4484 0.6039 0.7771
No log 16.0 352 0.5772 0.5386 0.5772 0.7597
No log 16.0909 354 0.5539 0.4378 0.5539 0.7443
No log 16.1818 356 0.5442 0.4111 0.5442 0.7377
No log 16.2727 358 0.5384 0.4111 0.5384 0.7337
No log 16.3636 360 0.5313 0.4314 0.5313 0.7289
No log 16.4545 362 0.5289 0.4768 0.5289 0.7272
No log 16.5455 364 0.5327 0.5022 0.5327 0.7298
No log 16.6364 366 0.5517 0.5201 0.5517 0.7428
No log 16.7273 368 0.5982 0.5117 0.5982 0.7734
No log 16.8182 370 0.5749 0.4895 0.5749 0.7582
No log 16.9091 372 0.5676 0.5008 0.5676 0.7534
No log 17.0 374 0.6249 0.4843 0.6249 0.7905
No log 17.0909 376 0.6566 0.5516 0.6566 0.8103
No log 17.1818 378 0.6312 0.5970 0.6312 0.7945
No log 17.2727 380 0.5428 0.4858 0.5428 0.7368
No log 17.3636 382 0.5208 0.4569 0.5208 0.7217
No log 17.4545 384 0.5395 0.4513 0.5395 0.7345
No log 17.5455 386 0.5408 0.4663 0.5408 0.7354
No log 17.6364 388 0.5138 0.4291 0.5138 0.7168
No log 17.7273 390 0.5011 0.4837 0.5011 0.7079
No log 17.8182 392 0.4970 0.5076 0.4970 0.7050
No log 17.9091 394 0.4926 0.4837 0.4926 0.7018
No log 18.0 396 0.4857 0.4837 0.4857 0.6969
No log 18.0909 398 0.4823 0.5076 0.4823 0.6945
No log 18.1818 400 0.4897 0.5681 0.4897 0.6998
No log 18.2727 402 0.5282 0.5527 0.5282 0.7268
No log 18.3636 404 0.5190 0.5283 0.5190 0.7204
No log 18.4545 406 0.5137 0.5010 0.5137 0.7167
No log 18.5455 408 0.5075 0.4569 0.5075 0.7124
No log 18.6364 410 0.5144 0.4569 0.5144 0.7172
No log 18.7273 412 0.5347 0.4808 0.5347 0.7312
No log 18.8182 414 0.5451 0.5034 0.5451 0.7383
No log 18.9091 416 0.5274 0.4569 0.5274 0.7262
No log 19.0 418 0.5352 0.4724 0.5352 0.7316
No log 19.0909 420 0.5222 0.4788 0.5222 0.7226
No log 19.1818 422 0.5190 0.4724 0.5190 0.7204
No log 19.2727 424 0.5641 0.5342 0.5641 0.7510
No log 19.3636 426 0.5495 0.5342 0.5495 0.7413
No log 19.4545 428 0.5374 0.5639 0.5374 0.7331
No log 19.5455 430 0.5066 0.5708 0.5066 0.7117
No log 19.6364 432 0.5023 0.5042 0.5023 0.7088
No log 19.7273 434 0.5095 0.4677 0.5095 0.7138
No log 19.8182 436 0.5049 0.4703 0.5049 0.7105
No log 19.9091 438 0.5071 0.4908 0.5071 0.7121
No log 20.0 440 0.5235 0.5379 0.5235 0.7235
No log 20.0909 442 0.5232 0.4661 0.5232 0.7233
No log 20.1818 444 0.5198 0.4136 0.5198 0.7210
No log 20.2727 446 0.5448 0.4364 0.5448 0.7381
No log 20.3636 448 0.5479 0.4061 0.5479 0.7402
No log 20.4545 450 0.5246 0.4402 0.5246 0.7243
No log 20.5455 452 0.5688 0.5063 0.5688 0.7542
No log 20.6364 454 0.6531 0.5155 0.6531 0.8081
No log 20.7273 456 0.6176 0.5495 0.6176 0.7859
No log 20.8182 458 0.5256 0.5485 0.5256 0.7250
No log 20.9091 460 0.5529 0.5078 0.5529 0.7436
No log 21.0 462 0.6722 0.4667 0.6722 0.8198
No log 21.0909 464 0.6917 0.4667 0.6917 0.8317
No log 21.1818 466 0.6129 0.4961 0.6129 0.7829
No log 21.2727 468 0.5369 0.4693 0.5369 0.7328
No log 21.3636 470 0.5053 0.6406 0.5053 0.7108
No log 21.4545 472 0.5343 0.5510 0.5343 0.7310
No log 21.5455 474 0.5292 0.5470 0.5292 0.7274
No log 21.6364 476 0.5230 0.5342 0.5230 0.7232
No log 21.7273 478 0.5276 0.5342 0.5276 0.7264
No log 21.8182 480 0.5338 0.5553 0.5338 0.7306
No log 21.9091 482 0.5363 0.5553 0.5363 0.7323
No log 22.0 484 0.5563 0.5291 0.5563 0.7459
No log 22.0909 486 0.5313 0.5166 0.5313 0.7289
No log 22.1818 488 0.4945 0.5307 0.4945 0.7032
No log 22.2727 490 0.4907 0.5307 0.4907 0.7005
No log 22.3636 492 0.4963 0.5386 0.4963 0.7045
No log 22.4545 494 0.5138 0.4964 0.5138 0.7168
No log 22.5455 496 0.5166 0.4964 0.5166 0.7188
No log 22.6364 498 0.5131 0.5386 0.5131 0.7163
0.3025 22.7273 500 0.5106 0.5326 0.5106 0.7146
0.3025 22.8182 502 0.5135 0.5567 0.5135 0.7166
0.3025 22.9091 504 0.5143 0.5386 0.5143 0.7171
0.3025 23.0 506 0.5304 0.5655 0.5304 0.7283
0.3025 23.0909 508 0.5546 0.5544 0.5546 0.7447
0.3025 23.1818 510 0.5719 0.5787 0.5719 0.7562
0.3025 23.2727 512 0.5780 0.5403 0.5780 0.7603
0.3025 23.3636 514 0.5473 0.5448 0.5473 0.7398
0.3025 23.4545 516 0.5231 0.4875 0.5231 0.7233
0.3025 23.5455 518 0.5100 0.5123 0.5100 0.7141
0.3025 23.6364 520 0.5046 0.4504 0.5046 0.7103
0.3025 23.7273 522 0.5080 0.4591 0.5080 0.7127
0.3025 23.8182 524 0.5122 0.4591 0.5122 0.7157

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
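
Below is a minimal sketch of loading this checkpoint for inference under the versions listed above; the head type and score scale are assumptions, since the card does not document the label scheme.

```python
# Minimal loading/inference sketch (assumes a sequence-classification head;
# the label scheme and score scale are not documented in this card).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k6_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

inputs = tokenizer("ضع نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    scores = model(**inputs).logits
print(scores)
```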