ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k14_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the dataset is not documented in this card). It achieves the following results on the evaluation set:

  • Loss: 0.5034
  • Qwk: 0.4816
  • Mse: 0.5034
  • Rmse: 0.7095
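
Qwk here is quadratic weighted kappa, which penalizes disagreements between ordinal labels by their squared distance, and Rmse is simply the square root of the reported Mse. A minimal, dependency-free sketch of how both metrics are computed (equivalent to scikit-learn's `cohen_kappa_score(..., weights="quadratic")`; the number of labels is an assumption, as the card does not state it):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, num_labels):
    """Quadratic weighted kappa over integer labels 0..num_labels-1 (num_labels >= 2)."""
    # Observed confusion matrix
    obs = [[0] * num_labels for _ in range(num_labels)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    n = len(y_true)
    hist_true = [sum(row) for row in obs]
    hist_pred = [sum(obs[i][j] for i in range(num_labels)) for j in range(num_labels)]
    num = den = 0.0
    for i in range(num_labels):
        for j in range(num_labels):
            w = (i - j) ** 2 / (num_labels - 1) ** 2   # quadratic disagreement weight
            num += w * obs[i][j]                        # observed disagreement
            den += w * hist_true[i] * hist_pred[j] / n  # disagreement expected by chance
    return 1.0 - num / den

def rmse(y_true, y_pred):
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

Note that Loss and Mse coincide in every row of the results below, which suggests the model was trained with an MSE (regression) objective.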

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
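
These settings map directly onto Hugging Face `TrainingArguments`. A minimal sketch under stated assumptions (the `output_dir` and the dataset wiring are placeholders, since the card does not document the data):

```python
# Hyperparameters from the card, keyed by their TrainingArguments names.
hyperparams = {
    "learning_rate": 2e-05,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-08,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 100,
}

# Sketch only -- requires `transformers` and a prepared dataset:
# from transformers import TrainingArguments, Trainer
# args = TrainingArguments(output_dir="out", **hyperparams)
# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()
```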

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0556 2 2.5519 -0.1213 2.5519 1.5975
No log 0.1111 4 1.3000 0.0412 1.3000 1.1402
No log 0.1667 6 1.1046 0.0476 1.1046 1.0510
No log 0.2222 8 0.8966 0.0408 0.8966 0.9469
No log 0.2778 10 0.8234 0.1093 0.8234 0.9074
No log 0.3333 12 0.9243 0.0058 0.9243 0.9614
No log 0.3889 14 0.9262 0.0478 0.9262 0.9624
No log 0.4444 16 0.8842 0.0717 0.8842 0.9403
No log 0.5 18 0.8177 0.0643 0.8177 0.9043
No log 0.5556 20 0.7688 0.1187 0.7688 0.8768
No log 0.6111 22 0.7551 0.0428 0.7551 0.8690
No log 0.6667 24 0.7611 0.0428 0.7611 0.8724
No log 0.7222 26 0.7364 0.0840 0.7364 0.8581
No log 0.7778 28 0.7772 0.2883 0.7772 0.8816
No log 0.8333 30 0.8087 0.3105 0.8087 0.8993
No log 0.8889 32 0.7655 0.2692 0.7655 0.8749
No log 0.9444 34 0.7013 0.3575 0.7013 0.8374
No log 1.0 36 0.7438 0.2317 0.7438 0.8625
No log 1.0556 38 0.7077 0.2783 0.7077 0.8413
No log 1.1111 40 0.7392 0.3637 0.7392 0.8598
No log 1.1667 42 0.8616 0.2843 0.8616 0.9282
No log 1.2222 44 0.8756 0.1962 0.8756 0.9357
No log 1.2778 46 0.8959 0.2923 0.8959 0.9465
No log 1.3333 48 0.8698 0.3425 0.8698 0.9326
No log 1.3889 50 0.7803 0.3238 0.7803 0.8834
No log 1.4444 52 0.7288 0.2285 0.7288 0.8537
No log 1.5 54 0.6840 0.2718 0.6840 0.8270
No log 1.5556 56 0.6796 0.2652 0.6796 0.8244
No log 1.6111 58 0.6982 0.2227 0.6982 0.8356
No log 1.6667 60 0.7410 0.2527 0.7410 0.8608
No log 1.7222 62 0.7601 0.2817 0.7601 0.8718
No log 1.7778 64 0.7568 0.2817 0.7568 0.8699
No log 1.8333 66 0.8656 0.2615 0.8656 0.9304
No log 1.8889 68 0.8813 0.2615 0.8813 0.9388
No log 1.9444 70 0.8784 0.2995 0.8784 0.9373
No log 2.0 72 0.7779 0.1866 0.7779 0.8820
No log 2.0556 74 0.7609 0.2171 0.7609 0.8723
No log 2.1111 76 0.7895 0.2518 0.7895 0.8886
No log 2.1667 78 0.7907 0.3231 0.7907 0.8892
No log 2.2222 80 0.7271 0.1962 0.7271 0.8527
No log 2.2778 82 0.7466 0.3069 0.7466 0.8641
No log 2.3333 84 0.6716 0.3399 0.6716 0.8195
No log 2.3889 86 0.6348 0.3426 0.6348 0.7967
No log 2.4444 88 0.6295 0.4068 0.6295 0.7934
No log 2.5 90 0.6111 0.4068 0.6111 0.7817
No log 2.5556 92 0.6535 0.3919 0.6535 0.8084
No log 2.6111 94 0.6644 0.4437 0.6644 0.8151
No log 2.6667 96 0.6157 0.3523 0.6157 0.7847
No log 2.7222 98 0.6406 0.3728 0.6406 0.8004
No log 2.7778 100 0.8311 0.2467 0.8311 0.9116
No log 2.8333 102 0.7052 0.3224 0.7052 0.8397
No log 2.8889 104 0.6526 0.2218 0.6526 0.8078
No log 2.9444 106 0.6957 0.2193 0.6957 0.8341
No log 3.0 108 0.9129 0.1672 0.9129 0.9555
No log 3.0556 110 0.8553 0.2443 0.8553 0.9248
No log 3.1111 112 0.5990 0.4618 0.5990 0.7739
No log 3.1667 114 0.5549 0.5266 0.5549 0.7449
No log 3.2222 116 0.5431 0.5248 0.5431 0.7370
No log 3.2778 118 0.5386 0.5079 0.5386 0.7339
No log 3.3333 120 0.5479 0.4832 0.5479 0.7402
No log 3.3889 122 0.5402 0.4555 0.5402 0.7350
No log 3.4444 124 0.5467 0.5430 0.5467 0.7394
No log 3.5 126 0.6176 0.4704 0.6176 0.7859
No log 3.5556 128 0.8769 0.3044 0.8769 0.9364
No log 3.6111 130 1.1187 0.1967 1.1187 1.0577
No log 3.6667 132 0.9356 0.2988 0.9356 0.9673
No log 3.7222 134 0.5724 0.5022 0.5724 0.7566
No log 3.7778 136 0.6008 0.4171 0.6008 0.7751
No log 3.8333 138 0.5747 0.4746 0.5747 0.7581
No log 3.8889 140 0.6474 0.4100 0.6474 0.8046
No log 3.9444 142 0.7428 0.3653 0.7428 0.8618
No log 4.0 144 0.6191 0.3894 0.6191 0.7868
No log 4.0556 146 0.5420 0.5521 0.5420 0.7362
No log 4.1111 148 0.5526 0.4888 0.5526 0.7434
No log 4.1667 150 0.5432 0.5326 0.5432 0.7370
No log 4.2222 152 0.5460 0.5095 0.5460 0.7389
No log 4.2778 154 0.5236 0.4857 0.5236 0.7236
No log 4.3333 156 0.5161 0.5714 0.5161 0.7184
No log 4.3889 158 0.5102 0.5798 0.5102 0.7143
No log 4.4444 160 0.4983 0.6154 0.4983 0.7059
No log 4.5 162 0.5311 0.4858 0.5311 0.7287
No log 4.5556 164 0.5836 0.5175 0.5836 0.7639
No log 4.6111 166 0.5146 0.4918 0.5146 0.7174
No log 4.6667 168 0.4932 0.5812 0.4932 0.7023
No log 4.7222 170 0.5115 0.6359 0.5115 0.7152
No log 4.7778 172 0.5124 0.5782 0.5124 0.7158
No log 4.8333 174 0.5335 0.5286 0.5335 0.7304
No log 4.8889 176 0.5402 0.5707 0.5402 0.7350
No log 4.9444 178 0.5484 0.6009 0.5484 0.7405
No log 5.0 180 0.6731 0.5863 0.6731 0.8204
No log 5.0556 182 0.6024 0.5457 0.6024 0.7761
No log 5.1111 184 0.5437 0.5860 0.5437 0.7374
No log 5.1667 186 0.5600 0.4942 0.5600 0.7484
No log 5.2222 188 0.5425 0.5860 0.5425 0.7366
No log 5.2778 190 0.7394 0.4133 0.7394 0.8599
No log 5.3333 192 0.8306 0.4204 0.8306 0.9114
No log 5.3889 194 0.7498 0.4153 0.7498 0.8659
No log 5.4444 196 0.7021 0.4089 0.7021 0.8379
No log 5.5 198 0.6146 0.4167 0.6146 0.7840
No log 5.5556 200 0.5709 0.4684 0.5709 0.7556
No log 5.6111 202 0.6948 0.4089 0.6948 0.8335
No log 5.6667 204 0.7416 0.4723 0.7416 0.8612
No log 5.7222 206 0.8027 0.3786 0.8027 0.8959
No log 5.7778 208 0.6332 0.5131 0.6332 0.7958
No log 5.8333 210 0.5242 0.4774 0.5242 0.7240
No log 5.8889 212 0.5120 0.5550 0.5120 0.7155
No log 5.9444 214 0.5088 0.5386 0.5088 0.7133
No log 6.0 216 0.5618 0.4576 0.5618 0.7495
No log 6.0556 218 0.5840 0.4491 0.5840 0.7642
No log 6.1111 220 0.5160 0.4855 0.5160 0.7183
No log 6.1667 222 0.4969 0.5831 0.4969 0.7049
No log 6.2222 224 0.4966 0.6267 0.4966 0.7047
No log 6.2778 226 0.4836 0.5912 0.4836 0.6954
No log 6.3333 228 0.4812 0.5995 0.4812 0.6937
No log 6.3889 230 0.4949 0.5768 0.4949 0.7035
No log 6.4444 232 0.5104 0.5768 0.5104 0.7144
No log 6.5 234 0.5052 0.5201 0.5052 0.7108
No log 6.5556 236 0.5238 0.4808 0.5238 0.7237
No log 6.6111 238 0.6872 0.4614 0.6872 0.8290
No log 6.6667 240 0.8044 0.4601 0.8044 0.8969
No log 6.7222 242 0.6604 0.4576 0.6604 0.8127
No log 6.7778 244 0.6220 0.4753 0.6220 0.7887
No log 6.8333 246 0.6052 0.3622 0.6052 0.7779
No log 6.8889 248 0.5584 0.4027 0.5584 0.7473
No log 6.9444 250 0.5585 0.4463 0.5585 0.7473
No log 7.0 252 0.5615 0.4463 0.5615 0.7493
No log 7.0556 254 0.5780 0.4027 0.5780 0.7603
No log 7.1111 256 0.6547 0.4835 0.6547 0.8091
No log 7.1667 258 0.6544 0.4753 0.6544 0.8089
No log 7.2222 260 0.5801 0.4086 0.5801 0.7616
No log 7.2778 262 0.5672 0.4631 0.5672 0.7532
No log 7.3333 264 0.5740 0.4958 0.5740 0.7576
No log 7.3889 266 0.5341 0.4857 0.5341 0.7308
No log 7.4444 268 0.5904 0.4684 0.5904 0.7684
No log 7.5 270 0.5701 0.5086 0.5701 0.7551
No log 7.5556 272 0.5145 0.5970 0.5145 0.7173
No log 7.6111 274 0.5674 0.5283 0.5674 0.7532
No log 7.6667 276 0.6280 0.4723 0.6280 0.7924
No log 7.7222 278 0.5308 0.5234 0.5308 0.7286
No log 7.7778 280 0.5165 0.5970 0.5165 0.7187
No log 7.8333 282 0.5131 0.5569 0.5131 0.7163
No log 7.8889 284 0.5126 0.5687 0.5126 0.7160
No log 7.9444 286 0.5080 0.5765 0.5080 0.7127
No log 8.0 288 0.4972 0.5189 0.4972 0.7051
No log 8.0556 290 0.5478 0.5039 0.5478 0.7402
No log 8.1111 292 0.5626 0.5039 0.5626 0.7500
No log 8.1667 294 0.5099 0.4086 0.5099 0.7141
No log 8.2222 296 0.5203 0.5266 0.5203 0.7213
No log 8.2778 298 0.5349 0.5065 0.5349 0.7313
No log 8.3333 300 0.5411 0.4561 0.5411 0.7356
No log 8.3889 302 0.5513 0.3893 0.5513 0.7425
No log 8.4444 304 0.6717 0.4247 0.6717 0.8196
No log 8.5 306 0.7275 0.3991 0.7275 0.8529
No log 8.5556 308 0.6510 0.4414 0.6510 0.8068
No log 8.6111 310 0.5679 0.4086 0.5679 0.7536
No log 8.6667 312 0.5499 0.4345 0.5499 0.7415
No log 8.7222 314 0.5391 0.4809 0.5391 0.7342
No log 8.7778 316 0.5928 0.4292 0.5928 0.7700
No log 8.8333 318 0.8040 0.4844 0.8040 0.8967
No log 8.8889 320 0.8604 0.4611 0.8604 0.9276
No log 8.9444 322 0.6983 0.5325 0.6983 0.8356
No log 9.0 324 0.5452 0.5050 0.5452 0.7384
No log 9.0556 326 0.5162 0.5075 0.5162 0.7185
No log 9.1111 328 0.5104 0.5075 0.5104 0.7145
No log 9.1667 330 0.5344 0.4576 0.5344 0.7310
No log 9.2222 332 0.5794 0.4997 0.5794 0.7612
No log 9.2778 334 0.5443 0.4639 0.5443 0.7378
No log 9.3333 336 0.5218 0.5160 0.5218 0.7224
No log 9.3889 338 0.5405 0.4832 0.5405 0.7352
No log 9.4444 340 0.5262 0.5213 0.5262 0.7254
No log 9.5 342 0.5312 0.5379 0.5312 0.7288
No log 9.5556 344 0.5302 0.5614 0.5302 0.7282
No log 9.6111 346 0.5210 0.5587 0.5210 0.7218
No log 9.6667 348 0.5188 0.5003 0.5188 0.7203
No log 9.7222 350 0.5258 0.4888 0.5258 0.7251
No log 9.7778 352 0.5271 0.4888 0.5271 0.7260
No log 9.8333 354 0.5236 0.5036 0.5236 0.7236
No log 9.8889 356 0.5292 0.5289 0.5292 0.7275
No log 9.9444 358 0.5510 0.5289 0.5510 0.7423
No log 10.0 360 0.5593 0.4819 0.5593 0.7478
No log 10.0556 362 0.5553 0.4901 0.5553 0.7452
No log 10.1111 364 0.5644 0.4895 0.5644 0.7513
No log 10.1667 366 0.5794 0.3809 0.5794 0.7612
No log 10.2222 368 0.5908 0.3728 0.5908 0.7686
No log 10.2778 370 0.6191 0.3701 0.6191 0.7868
No log 10.3333 372 0.5789 0.4437 0.5789 0.7609
No log 10.3889 374 0.5354 0.4888 0.5354 0.7317
No log 10.4444 376 0.5240 0.5076 0.5240 0.7239
No log 10.5 378 0.5167 0.5141 0.5167 0.7188
No log 10.5556 380 0.5262 0.5882 0.5262 0.7254
No log 10.6111 382 0.5195 0.6101 0.5195 0.7207
No log 10.6667 384 0.5380 0.5708 0.5380 0.7335
No log 10.7222 386 0.5392 0.5708 0.5392 0.7343
No log 10.7778 388 0.5477 0.5708 0.5477 0.7400
No log 10.8333 390 0.5207 0.5995 0.5207 0.7216
No log 10.8889 392 0.5104 0.5549 0.5104 0.7144
No log 10.9444 394 0.5157 0.5472 0.5157 0.7181
No log 11.0 396 0.5265 0.5472 0.5265 0.7256
No log 11.0556 398 0.5106 0.5707 0.5106 0.7146
No log 11.1111 400 0.5425 0.5795 0.5425 0.7365
No log 11.1667 402 0.5326 0.5795 0.5326 0.7298
No log 11.2222 404 0.5148 0.5234 0.5148 0.7175
No log 11.2778 406 0.5110 0.5563 0.5110 0.7148
No log 11.3333 408 0.5200 0.5058 0.5200 0.7211
No log 11.3889 410 0.5123 0.5120 0.5123 0.7157
No log 11.4444 412 0.4878 0.5797 0.4878 0.6984
No log 11.5 414 0.4981 0.5056 0.4981 0.7058
No log 11.5556 416 0.4871 0.5056 0.4871 0.6980
No log 11.6111 418 0.4728 0.5617 0.4728 0.6876
No log 11.6667 420 0.5051 0.5892 0.5051 0.7107
No log 11.7222 422 0.4785 0.5731 0.4785 0.6917
No log 11.7778 424 0.4683 0.4659 0.4683 0.6843
No log 11.8333 426 0.4906 0.4945 0.4906 0.7004
No log 11.8889 428 0.4787 0.5437 0.4787 0.6919
No log 11.9444 430 0.4784 0.6082 0.4784 0.6917
No log 12.0 432 0.6016 0.5460 0.6016 0.7757
No log 12.0556 434 0.6858 0.4789 0.6858 0.8281
No log 12.1111 436 0.6136 0.5664 0.6136 0.7833
No log 12.1667 438 0.5033 0.6082 0.5033 0.7094
No log 12.2222 440 0.5379 0.5091 0.5379 0.7334
No log 12.2778 442 0.8394 0.3939 0.8394 0.9162
No log 12.3333 444 0.9590 0.3476 0.9590 0.9793
No log 12.3889 446 0.8467 0.4230 0.8467 0.9201
No log 12.4444 448 0.6374 0.4892 0.6374 0.7984
No log 12.5 450 0.5044 0.4206 0.5044 0.7102
No log 12.5556 452 0.4979 0.5061 0.4979 0.7056
No log 12.6111 454 0.5038 0.5765 0.5038 0.7098
No log 12.6667 456 0.4976 0.5985 0.4976 0.7054
No log 12.7222 458 0.4943 0.5826 0.4943 0.7031
No log 12.7778 460 0.4924 0.5826 0.4924 0.7017
No log 12.8333 462 0.4762 0.5430 0.4762 0.6900
No log 12.8889 464 0.5103 0.5111 0.5103 0.7143
No log 12.9444 466 0.5441 0.5111 0.5441 0.7376
No log 13.0 468 0.5482 0.5177 0.5482 0.7404
No log 13.0556 470 0.4969 0.4888 0.4969 0.7049
No log 13.1111 472 0.4833 0.5596 0.4833 0.6952
No log 13.1667 474 0.5277 0.5498 0.5277 0.7264
No log 13.2222 476 0.5603 0.5046 0.5603 0.7485
No log 13.2778 478 0.5147 0.5999 0.5147 0.7174
No log 13.3333 480 0.5025 0.5357 0.5025 0.7089
No log 13.3889 482 0.5081 0.4701 0.5081 0.7128
No log 13.4444 484 0.5208 0.4459 0.5208 0.7217
No log 13.5 486 0.5116 0.4229 0.5116 0.7153
No log 13.5556 488 0.5030 0.4809 0.5030 0.7092
No log 13.6111 490 0.5147 0.5960 0.5147 0.7174
No log 13.6667 492 0.5218 0.6182 0.5218 0.7224
No log 13.7222 494 0.5064 0.5463 0.5064 0.7117
No log 13.7778 496 0.5131 0.4681 0.5131 0.7163
No log 13.8333 498 0.5670 0.5403 0.5670 0.7530
0.3319 13.8889 500 0.5617 0.5252 0.5617 0.7494
0.3319 13.9444 502 0.5212 0.4747 0.5212 0.7220
0.3319 14.0 504 0.5021 0.5522 0.5021 0.7086
0.3319 14.0556 506 0.5131 0.5319 0.5131 0.7163
0.3319 14.1111 508 0.5202 0.5627 0.5202 0.7213
0.3319 14.1667 510 0.5087 0.5406 0.5087 0.7132
0.3319 14.2222 512 0.5002 0.5361 0.5002 0.7073
0.3319 14.2778 514 0.5485 0.5485 0.5485 0.7406
0.3319 14.3333 516 0.5897 0.5310 0.5897 0.7679
0.3319 14.3889 518 0.5479 0.5254 0.5479 0.7402
0.3319 14.4444 520 0.5053 0.5114 0.5053 0.7109
0.3319 14.5 522 0.5005 0.5555 0.5005 0.7075
0.3319 14.5556 524 0.4983 0.5555 0.4983 0.7059
0.3319 14.6111 526 0.4937 0.5457 0.4937 0.7026
0.3319 14.6667 528 0.5295 0.4726 0.5295 0.7277
0.3319 14.7222 530 0.6029 0.5481 0.6029 0.7765
0.3319 14.7778 532 0.6370 0.4815 0.6370 0.7981
0.3319 14.8333 534 0.5832 0.5481 0.5832 0.7637
0.3319 14.8889 536 0.5168 0.4684 0.5168 0.7189
0.3319 14.9444 538 0.4853 0.4929 0.4853 0.6966
0.3319 15.0 540 0.4808 0.5024 0.4808 0.6934
0.3319 15.0556 542 0.4956 0.4569 0.4956 0.7040
0.3319 15.1111 544 0.5034 0.4816 0.5034 0.7095
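
The table implies 36 optimizer steps per epoch (step 36 at epoch 1.0). With a train batch size of 8, and assuming no gradient accumulation (not stated in the card), that points to roughly 288 training examples. Note also that the lowest validation loss (0.4683, at epoch 11.78 / step 424) occurs well before the final reported step. A quick sanity check of the epoch/step arithmetic:

```python
# Sanity-check the epoch/step arithmetic implied by the table above.
steps_per_epoch = 36    # step 36 corresponds to epoch 1.0
train_batch_size = 8    # from the hyperparameters

# Assuming no gradient accumulation (an assumption; the card does not say):
approx_train_examples = steps_per_epoch * train_batch_size  # ~288

final_step, final_epoch = 544, 15.1111  # last row of the table
assert abs(final_step / steps_per_epoch - final_epoch) < 1e-3
```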

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
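
For completeness, loading this checkpoint with the versions above would look roughly like the sketch below. The repo id is the one this card belongs to; treating the head as a single-output regressor is an assumption, based only on Loss equaling Mse in the results table.

```python
# Sketch: load the fine-tuned checkpoint (requires `transformers`).
MODEL_ID = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k14_task7_organization"

# from transformers import AutoTokenizer, AutoModelForSequenceClassification
# tok = AutoTokenizer.from_pretrained(MODEL_ID)
# model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
# inputs = tok("نص عربي للتقييم", return_tensors="pt")
# score = model(**inputs).logits.item()  # single score, if the head is a regressor
```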
Model size: 0.1B params (F32, Safetensors)

Model tree for MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k14_task7_organization

Fine-tuned from aubmindlab/bert-base-arabertv02 (one of 4023 fine-tunes of that base model).