ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k11_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics are conventionally computed follows the list):

  • Loss: 0.7414
  • QWK: 0.7533
  • MSE: 0.7414
  • RMSE: 0.8611
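
The exact evaluation code is not included in this card, so the following is a minimal sketch of how these metrics are conventionally computed with scikit-learn: quadratic weighted kappa (QWK) on rounded scores, plus MSE and its square root. The example values are hypothetical.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(y_true, y_pred):
    # QWK: Cohen's kappa with quadratic weights; predictions are rounded to
    # discrete score levels first (an assumption, since QWK needs labels).
    qwk = cohen_kappa_score(np.round(y_true), np.round(y_pred), weights="quadratic")
    mse = mean_squared_error(y_true, y_pred)  # MSE on the raw predictions
    rmse = np.sqrt(mse)                       # RMSE = sqrt(MSE)
    return {"qwk": qwk, "mse": mse, "rmse": rmse}

# Hypothetical gold scores and model predictions:
print(compute_metrics([1, 2, 3, 4], [1.2, 2.1, 2.8, 3.9]))
```

Note that the reported Loss and MSE are identical (0.7414), which is consistent with the model being trained under a mean-squared-error objective.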

Model description

More information needed

Intended uses & limitations

More information needed
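
As a starting point, here is a minimal inference sketch, assuming the checkpoint carries a single-output regression head (consistent with the MSE/RMSE metrics above); verify the head configuration before relying on this. The input string is a hypothetical example.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

name = "MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k11_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)

inputs = tokenizer("نص عربي للتقييم", return_tensors="pt")  # hypothetical Arabic input
with torch.no_grad():
    # .item() assumes num_labels == 1 (a scalar organization score).
    score = model(**inputs).logits.squeeze().item()
print(score)
```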

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a Trainer sketch reproducing them follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
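
A sketch reproducing these hyperparameters with the Hugging Face Trainer API (Transformers 4.44.2, per the framework versions below). The regression head (num_labels=1), the eval cadence, and the dataset variables are assumptions: the training data is not specified, so train_ds and eval_ds are placeholders.

```python
from transformers import (AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

# Assumed single-output regression head, consistent with the MSE/RMSE metrics.
model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02", num_labels=1)

args = TrainingArguments(
    output_dir="out",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8,
    eval_strategy="steps", eval_steps=2,  # the log below evaluates every 2 steps
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_ds, eval_dataset=eval_ds)  # placeholders
# trainer.train()
```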

Training results

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
No log 0.0455 2 2.4235 0.0129 2.4235 1.5568
No log 0.0909 4 1.5997 0.1191 1.5997 1.2648
No log 0.1364 6 1.5535 0.0517 1.5535 1.2464
No log 0.1818 8 1.5423 0.3012 1.5423 1.2419
No log 0.2273 10 1.8133 0.2803 1.8133 1.3466
No log 0.2727 12 2.2812 0.1820 2.2812 1.5104
No log 0.3182 14 2.0136 0.2635 2.0136 1.4190
No log 0.3636 16 1.6119 0.3577 1.6119 1.2696
No log 0.4091 18 1.3136 0.2003 1.3136 1.1461
No log 0.4545 20 1.2711 0.1872 1.2711 1.1274
No log 0.5 22 1.3285 0.1544 1.3285 1.1526
No log 0.5455 24 1.3137 0.2958 1.3137 1.1462
No log 0.5909 26 1.2948 0.3790 1.2948 1.1379
No log 0.6364 28 1.3489 0.3611 1.3489 1.1614
No log 0.6818 30 1.2497 0.4073 1.2497 1.1179
No log 0.7273 32 1.0738 0.4173 1.0738 1.0362
No log 0.7727 34 1.0546 0.4356 1.0546 1.0269
No log 0.8182 36 1.0719 0.4509 1.0719 1.0353
No log 0.8636 38 0.9685 0.5208 0.9685 0.9841
No log 0.9091 40 1.0427 0.4517 1.0427 1.0211
No log 0.9545 42 0.9759 0.4767 0.9759 0.9879
No log 1.0 44 1.1108 0.5022 1.1108 1.0539
No log 1.0455 46 1.5966 0.3933 1.5966 1.2636
No log 1.0909 48 1.5819 0.4100 1.5819 1.2577
No log 1.1364 50 1.2877 0.4036 1.2877 1.1348
No log 1.1818 52 1.0957 0.4316 1.0957 1.0468
No log 1.2273 54 1.0269 0.4434 1.0269 1.0134
No log 1.2727 56 1.0040 0.4518 1.0040 1.0020
No log 1.3182 58 0.9708 0.5022 0.9708 0.9853
No log 1.3636 60 1.0347 0.4876 1.0347 1.0172
No log 1.4091 62 1.1684 0.4798 1.1684 1.0809
No log 1.4545 64 1.5555 0.4082 1.5555 1.2472
No log 1.5 66 1.5287 0.4173 1.5287 1.2364
No log 1.5455 68 1.2076 0.4725 1.2076 1.0989
No log 1.5909 70 0.9193 0.5581 0.9193 0.9588
No log 1.6364 72 0.9118 0.5411 0.9118 0.9549
No log 1.6818 74 0.9196 0.5352 0.9196 0.9589
No log 1.7273 76 0.8911 0.5936 0.8911 0.9440
No log 1.7727 78 0.8538 0.5992 0.8538 0.9240
No log 1.8182 80 0.8878 0.5739 0.8878 0.9422
No log 1.8636 82 0.8629 0.5838 0.8629 0.9289
No log 1.9091 84 0.8278 0.6286 0.8278 0.9098
No log 1.9545 86 0.8471 0.6553 0.8471 0.9204
No log 2.0 88 0.8206 0.6744 0.8206 0.9059
No log 2.0455 90 0.8262 0.6778 0.8262 0.9089
No log 2.0909 92 0.8271 0.6851 0.8271 0.9094
No log 2.1364 94 0.8356 0.6690 0.8356 0.9141
No log 2.1818 96 0.8068 0.6764 0.8068 0.8982
No log 2.2273 98 0.7933 0.6662 0.7933 0.8907
No log 2.2727 100 0.8580 0.6841 0.8580 0.9263
No log 2.3182 102 0.8631 0.6710 0.8631 0.9290
No log 2.3636 104 0.7610 0.6652 0.7610 0.8723
No log 2.4091 106 0.7504 0.6671 0.7504 0.8663
No log 2.4545 108 0.8087 0.6386 0.8087 0.8993
No log 2.5 110 1.0401 0.6333 1.0401 1.0199
No log 2.5455 112 1.1046 0.6285 1.1046 1.0510
No log 2.5909 114 0.9168 0.6567 0.9168 0.9575
No log 2.6364 116 0.7293 0.6767 0.7293 0.8540
No log 2.6818 118 0.7165 0.6552 0.7165 0.8465
No log 2.7273 120 0.7107 0.6789 0.7107 0.8430
No log 2.7727 122 0.8017 0.6600 0.8017 0.8954
No log 2.8182 124 0.9640 0.6833 0.9640 0.9818
No log 2.8636 126 1.0900 0.6356 1.0900 1.0440
No log 2.9091 128 0.9408 0.6800 0.9408 0.9699
No log 2.9545 130 0.7822 0.6573 0.7822 0.8844
No log 3.0 132 0.7224 0.6465 0.7224 0.8500
No log 3.0455 134 0.7496 0.6618 0.7496 0.8658
No log 3.0909 136 0.9203 0.6849 0.9203 0.9593
No log 3.1364 138 1.2636 0.5766 1.2636 1.1241
No log 3.1818 140 1.2605 0.5816 1.2605 1.1227
No log 3.2273 142 1.0099 0.6305 1.0099 1.0049
No log 3.2727 144 0.7635 0.6645 0.7635 0.8738
No log 3.3182 146 0.7044 0.6559 0.7044 0.8393
No log 3.3636 148 0.7041 0.6645 0.7041 0.8391
No log 3.4091 150 0.7251 0.6730 0.7251 0.8516
No log 3.4545 152 0.9042 0.6697 0.9042 0.9509
No log 3.5 154 1.0753 0.5944 1.0753 1.0370
No log 3.5455 156 1.0259 0.6120 1.0259 1.0129
No log 3.5909 158 0.8264 0.7071 0.8264 0.9091
No log 3.6364 160 0.7287 0.7217 0.7287 0.8536
No log 3.6818 162 0.7077 0.7199 0.7077 0.8413
No log 3.7273 164 0.7619 0.6936 0.7619 0.8729
No log 3.7727 166 0.8659 0.6804 0.8659 0.9306
No log 3.8182 168 1.0342 0.6065 1.0342 1.0169
No log 3.8636 170 1.2171 0.5706 1.2171 1.1032
No log 3.9091 172 1.2441 0.5626 1.2441 1.1154
No log 3.9545 174 1.1486 0.5868 1.1486 1.0717
No log 4.0 176 0.9494 0.6811 0.9494 0.9744
No log 4.0455 178 0.8638 0.6803 0.8638 0.9294
No log 4.0909 180 0.8483 0.6803 0.8483 0.9211
No log 4.1364 182 0.8538 0.6803 0.8538 0.9240
No log 4.1818 184 0.9276 0.6748 0.9276 0.9631
No log 4.2273 186 1.0027 0.6461 1.0027 1.0013
No log 4.2727 188 1.0344 0.6383 1.0344 1.0171
No log 4.3182 190 1.0665 0.6183 1.0665 1.0327
No log 4.3636 192 0.9579 0.6654 0.9579 0.9787
No log 4.4091 194 0.9103 0.6565 0.9103 0.9541
No log 4.4545 196 0.9957 0.6589 0.9957 0.9978
No log 4.5 198 0.9932 0.6503 0.9932 0.9966
No log 4.5455 200 0.9576 0.6788 0.9576 0.9785
No log 4.5909 202 0.8118 0.7037 0.8118 0.9010
No log 4.6364 204 0.6894 0.7011 0.6894 0.8303
No log 4.6818 206 0.6733 0.6892 0.6733 0.8205
No log 4.7273 208 0.6734 0.6892 0.6734 0.8206
No log 4.7727 210 0.6664 0.6805 0.6664 0.8163
No log 4.8182 212 0.7298 0.7481 0.7298 0.8543
No log 4.8636 214 0.8696 0.6972 0.8696 0.9325
No log 4.9091 216 0.9060 0.7029 0.9060 0.9518
No log 4.9545 218 0.8285 0.7108 0.8285 0.9102
No log 5.0 220 0.8161 0.7183 0.8161 0.9034
No log 5.0455 222 0.7617 0.7361 0.7617 0.8728
No log 5.0909 224 0.7151 0.7403 0.7151 0.8456
No log 5.1364 226 0.6737 0.7049 0.6737 0.8208
No log 5.1818 228 0.6753 0.7154 0.6753 0.8218
No log 5.2273 230 0.7324 0.7539 0.7324 0.8558
No log 5.2727 232 0.8477 0.7044 0.8477 0.9207
No log 5.3182 234 0.8819 0.6762 0.8819 0.9391
No log 5.3636 236 0.9205 0.6614 0.9205 0.9594
No log 5.4091 238 0.8901 0.6793 0.8901 0.9435
No log 5.4545 240 0.8779 0.6862 0.8779 0.9370
No log 5.5 242 0.8970 0.6695 0.8970 0.9471
No log 5.5455 244 0.8858 0.6942 0.8858 0.9412
No log 5.5909 246 0.7865 0.7209 0.7865 0.8868
No log 5.6364 248 0.7023 0.7485 0.7023 0.8380
No log 5.6818 250 0.6879 0.7415 0.6879 0.8294
No log 5.7273 252 0.6846 0.7332 0.6846 0.8274
No log 5.7727 254 0.6982 0.7533 0.6982 0.8356
No log 5.8182 256 0.6879 0.7269 0.6879 0.8294
No log 5.8636 258 0.7005 0.7256 0.7005 0.8370
No log 5.9091 260 0.7609 0.7358 0.7609 0.8723
No log 5.9545 262 0.8631 0.6912 0.8631 0.9290
No log 6.0 264 0.9200 0.6581 0.9200 0.9592
No log 6.0455 266 0.8823 0.6912 0.8823 0.9393
No log 6.0909 268 0.7957 0.7195 0.7957 0.8920
No log 6.1364 270 0.7187 0.7286 0.7187 0.8478
No log 6.1818 272 0.7036 0.7227 0.7036 0.8388
No log 6.2273 274 0.7231 0.7359 0.7231 0.8504
No log 6.2727 276 0.7629 0.7244 0.7629 0.8734
No log 6.3182 278 0.8097 0.7175 0.8097 0.8998
No log 6.3636 280 0.8142 0.6967 0.8142 0.9023
No log 6.4091 282 0.8015 0.7175 0.8015 0.8953
No log 6.4545 284 0.8470 0.6756 0.8470 0.9203
No log 6.5 286 0.9055 0.6510 0.9055 0.9516
No log 6.5455 288 0.9057 0.6510 0.9057 0.9517
No log 6.5909 290 0.8353 0.6925 0.8353 0.9140
No log 6.6364 292 0.7509 0.7309 0.7509 0.8665
No log 6.6818 294 0.7242 0.7316 0.7242 0.8510
No log 6.7273 296 0.6749 0.7472 0.6749 0.8215
No log 6.7727 298 0.6392 0.7439 0.6392 0.7995
No log 6.8182 300 0.6362 0.7380 0.6362 0.7976
No log 6.8636 302 0.6606 0.7507 0.6606 0.8127
No log 6.9091 304 0.6752 0.7512 0.6752 0.8217
No log 6.9545 306 0.6951 0.7549 0.6951 0.8337
No log 7.0 308 0.6749 0.7684 0.6749 0.8215
No log 7.0455 310 0.6619 0.7569 0.6619 0.8136
No log 7.0909 312 0.6695 0.7480 0.6695 0.8182
No log 7.1364 314 0.7030 0.7591 0.7030 0.8384
No log 7.1818 316 0.7552 0.7224 0.7552 0.8690
No log 7.2273 318 0.7904 0.6932 0.7904 0.8890
No log 7.2727 320 0.8330 0.6941 0.8330 0.9127
No log 7.3182 322 0.8957 0.6507 0.8957 0.9464
No log 7.3636 324 0.9081 0.6463 0.9081 0.9530
No log 7.4091 326 0.8566 0.6822 0.8566 0.9255
No log 7.4545 328 0.7820 0.7099 0.7820 0.8843
No log 7.5 330 0.7216 0.7368 0.7216 0.8494
No log 7.5455 332 0.7092 0.7442 0.7092 0.8421
No log 7.5909 334 0.7259 0.7367 0.7259 0.8520
No log 7.6364 336 0.7525 0.7361 0.7525 0.8675
No log 7.6818 338 0.7710 0.7150 0.7710 0.8781
No log 7.7273 340 0.8002 0.7099 0.8002 0.8945
No log 7.7727 342 0.8472 0.7035 0.8472 0.9204
No log 7.8182 344 0.8637 0.6840 0.8637 0.9294
No log 7.8636 346 0.8505 0.6768 0.8505 0.9222
No log 7.9091 348 0.8419 0.6892 0.8419 0.9176
No log 7.9545 350 0.8315 0.7059 0.8315 0.9119
No log 8.0 352 0.8072 0.7175 0.8072 0.8985
No log 8.0455 354 0.7819 0.7382 0.7819 0.8842
No log 8.0909 356 0.7740 0.7324 0.7740 0.8798
No log 8.1364 358 0.7583 0.7367 0.7583 0.8708
No log 8.1818 360 0.7644 0.7367 0.7644 0.8743
No log 8.2273 362 0.7844 0.7324 0.7844 0.8857
No log 8.2727 364 0.7734 0.7324 0.7734 0.8794
No log 8.3182 366 0.7486 0.7410 0.7486 0.8652
No log 8.3636 368 0.7328 0.7496 0.7328 0.8560
No log 8.4091 370 0.7177 0.7506 0.7177 0.8472
No log 8.4545 372 0.7260 0.7539 0.7260 0.8520
No log 8.5 374 0.7504 0.7496 0.7504 0.8663
No log 8.5455 376 0.7607 0.7410 0.7607 0.8722
No log 8.5909 378 0.7827 0.7237 0.7827 0.8847
No log 8.6364 380 0.7917 0.7116 0.7917 0.8898
No log 8.6818 382 0.7872 0.7194 0.7872 0.8872
No log 8.7273 384 0.7738 0.7361 0.7738 0.8796
No log 8.7727 386 0.7621 0.7361 0.7621 0.8730
No log 8.8182 388 0.7743 0.7361 0.7743 0.8800
No log 8.8636 390 0.7924 0.7073 0.7924 0.8902
No log 8.9091 392 0.7910 0.7116 0.7910 0.8894
No log 8.9545 394 0.7847 0.7281 0.7847 0.8858
No log 9.0 396 0.7611 0.7361 0.7611 0.8724
No log 9.0455 398 0.7410 0.7490 0.7410 0.8608
No log 9.0909 400 0.7383 0.7533 0.7383 0.8593
No log 9.1364 402 0.7362 0.7496 0.7362 0.8580
No log 9.1818 404 0.7454 0.7490 0.7454 0.8634
No log 9.2273 406 0.7626 0.7409 0.7626 0.8733
No log 9.2727 408 0.7699 0.7287 0.7699 0.8774
No log 9.3182 410 0.7630 0.7409 0.7630 0.8735
No log 9.3636 412 0.7576 0.7409 0.7576 0.8704
No log 9.4091 414 0.7567 0.7490 0.7567 0.8699
No log 9.4545 416 0.7603 0.7409 0.7603 0.8720
No log 9.5 418 0.7572 0.7409 0.7572 0.8702
No log 9.5455 420 0.7562 0.7409 0.7562 0.8696
No log 9.5909 422 0.7505 0.7533 0.7505 0.8663
No log 9.6364 424 0.7447 0.7533 0.7447 0.8630
No log 9.6818 426 0.7430 0.7496 0.7430 0.8620
No log 9.7273 428 0.7388 0.7496 0.7388 0.8595
No log 9.7727 430 0.7368 0.7496 0.7368 0.8584
No log 9.8182 432 0.7353 0.7496 0.7353 0.8575
No log 9.8636 434 0.7364 0.7496 0.7364 0.8582
No log 9.9091 436 0.7386 0.7496 0.7386 0.8594
No log 9.9545 438 0.7407 0.7533 0.7407 0.8606
No log 10.0 440 0.7414 0.7533 0.7414 0.8611

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1